WorldWideScience

Sample records for stratified random sampling

  1. Monitoring oil persistence on beaches : SCAT versus stratified random sampling designs

    International Nuclear Information System (INIS)

    Short, J.W.; Lindeberg, M.R.; Harris, P.M.; Maselko, J.M.; Pella, J.J.; Rice, S.D.

    2003-01-01

In the event of a coastal oil spill, shoreline clean-up assessment teams (SCAT) commonly rely on visual inspection of the entire affected area to monitor the persistence of the oil on beaches. Occasionally, pits are excavated to evaluate the persistence of subsurface oil. This approach is practical for directing clean-up efforts directly following a spill. However, sampling of the 1989 Exxon Valdez oil spill in Prince William Sound 12 years later has shown that visual inspection combined with pit excavation does not provide estimates of contaminated beach area or stranded oil volumes. This information is needed to statistically evaluate the significance of change with time. Assumptions regarding the correlation of visually-evident surface oil and cryptic subsurface oil are usually not evaluated as part of the SCAT mandate. Stratified random sampling can avoid such problems and can produce precise estimates of oiled area and volume that allow for statistical assessment of major temporal trends and the extent of the impact. The 2001 sampling of the shoreline of Prince William Sound showed that 15 per cent of surface oil occurrences were associated with subsurface oil. This study demonstrates the usefulness of the stratified random sampling method and shows how sampling design parameters affect statistical outcomes. Power analysis based on the study results indicates that optimum power is achieved when unnecessary stratification is avoided. It was emphasized that sampling effort should be balanced between choosing sufficient beaches for sampling and the intensity of sampling.

  2. Fixed-location hydroacoustic monitoring designs for estimating fish passage using stratified random and systematic sampling

    International Nuclear Information System (INIS)

    Skalski, J.R.; Hoffman, A.; Ransom, B.H.; Steig, T.W.

    1993-01-01

Five alternative sampling designs are compared using 15 d of 24-h continuous hydroacoustic data to identify the most favorable approach to fixed-location hydroacoustic monitoring of salmonid outmigrants. Four alternative approaches to systematic sampling are compared among themselves and with stratified random sampling (STRS). Stratifying systematic sampling (STSYS) on a daily basis is found to reduce sampling error in multiday monitoring studies. Although sampling precision was predictable with varying levels of effort in STRS, neither the magnitude nor the direction of change in precision was predictable when effort was varied in systematic sampling (SYS). Furthermore, modifying systematic sampling to include replicated (e.g., nested) sampling (RSYS) is shown to provide unbiased point and variance estimates, as does STRS. Numerous short sampling intervals (e.g., 12 samples of 1-min duration per hour) must be monitored hourly using RSYS to provide efficient, unbiased point and interval estimates. For equal levels of effort, STRS outperformed all variations of SYS examined. Parametric approaches to confidence interval estimation are found to be superior to nonparametric interval estimates (i.e., bootstrap and jackknife) in estimating total fish passage. 10 refs., 1 fig., 8 tabs.

  3. MUP, CEC-DES, STRADE. Codes for uncertainty propagation, experimental design and stratified random sampling techniques

    International Nuclear Information System (INIS)

    Amendola, A.; Astolfi, M.; Lisanti, B.

    1983-01-01

The report describes how to use the codes: MUP (Monte Carlo Uncertainty Propagation) for uncertainty analysis by Monte Carlo simulation, including correlation analysis, extreme value identification and study of selected ranges of the variable space; CEC-DES (Central Composite Design) for building experimental matrices according to the requirements of Central Composite and Factorial Experimental Designs; and STRADE (Stratified Random Design) for experimental designs based on Latin Hypercube Sampling techniques. Application fields of the codes are probabilistic risk assessment, experimental design, sensitivity analysis and system identification problems.

  4. Application of a stratified random sampling technique to the estimation and minimization of respirable quartz exposure to underground miners

    International Nuclear Information System (INIS)

    Makepeace, C.E.; Horvath, F.J.; Stocker, H.

    1981-11-01

The aim of a stratified random sampling plan is to provide the best estimate (in the absence of full-shift personal gravimetric sampling) of personal exposure to respirable quartz among underground miners. One also gains information about the exposure distribution of all the miners at the same time. Three variables (or strata) are considered in the present scheme: locations, occupations and times of sampling. Random sampling within each stratum ensures that each location, occupation and time of sampling has an equal opportunity of being selected without bias. Following implementation of the plan and analysis of the collected data, one can determine the individual exposures and the mean. This information can then be used to identify those groups whose exposure contributes significantly to the collective exposure. In turn, this identification, along with other considerations, allows the mine operator to carry out a cost-benefit optimization and eventual implementation of engineering controls for these groups. This optimization and engineering control procedure, together with the random sampling plan, can then be used in an iterative manner to minimize the mean value of the distribution and the collective exposures.
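The estimator this record describes — sample at random within each stratum, then weight the stratum means by stratum size — can be sketched in a few lines of Python. The strata and exposure values below are invented for illustration and are not taken from the report:

```python
import random

def stratified_mean(strata, n_per_stratum, seed=0):
    """Weighted estimate of the population mean from equal-size
    random samples drawn within each stratum."""
    rng = random.Random(seed)
    total = sum(len(units) for units in strata.values())
    estimate = 0.0
    for units in strata.values():
        sample = rng.sample(units, min(n_per_stratum, len(units)))
        estimate += (len(units) / total) * (sum(sample) / len(sample))
    return estimate

# Hypothetical respirable-quartz exposures (mg/m3) by location stratum
exposures = {
    "stope":   [0.12, 0.15, 0.11, 0.14, 0.13, 0.16],
    "drift":   [0.05, 0.07, 0.06, 0.08],
    "crusher": [0.20, 0.22, 0.19],
}
print(round(stratified_mean(exposures, n_per_stratum=2), 3))
```

Because every unit in a stratum has the same selection probability, the weighted estimator is unbiased for the population mean, which is the property the sampling plan relies on.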

  5. BWIP-RANDOM-SAMPLING, Random Sample Generation for Nuclear Waste Disposal

    International Nuclear Information System (INIS)

    Sagar, B.

    1989-01-01

1 - Description of program or function: Random samples for different distribution types are generated. Distribution types as required for performance assessment modeling of geologic nuclear waste disposal are provided. These are: - Uniform, - Log-uniform (base 10 or natural), - Normal, - Lognormal (base 10 or natural), - Exponential, - Bernoulli, - User-defined continuous distribution. 2 - Method of solution: A linear congruential generator is used for uniform random numbers. A set of functions is used to transform the uniform distribution to the other distributions. Stratified, rather than purely random, sampling can be chosen. Truncation limits can be specified on many distributions whose usual definition has infinite support. 3 - Restrictions on the complexity of the problem: Generation of correlated random variables is not included.
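The method of solution described here — a linear congruential generator for uniforms, plus transformation functions for the other distributions — can be illustrated with a minimal sketch. The LCG constants (from Numerical Recipes) and the exponential transform are standard textbook choices, not necessarily the ones BWIP-RANDOM-SAMPLING uses:

```python
import math

class LCG:
    """Minimal linear congruential generator (Numerical Recipes constants),
    sketching the kind of uniform generator the description mentions."""
    def __init__(self, seed=12345):
        self.state = seed

    def uniform(self):
        # x_{n+1} = (a * x_n + c) mod m, scaled into [0, 1)
        self.state = (1664525 * self.state + 1013904223) % 2**32
        return self.state / 2**32

def exponential(rng, rate):
    """Inverse-transform sampling: -ln(1 - U) / rate is Exp(rate)."""
    return -math.log(1.0 - rng.uniform()) / rate

rng = LCG(seed=42)
draws = [exponential(rng, rate=2.0) for _ in range(10_000)]
print(round(sum(draws) / len(draws), 2))  # sample mean should sit near 1/rate = 0.5
```

The same inverse-transform idea extends to the other listed distributions whenever the inverse CDF is available; truncation can be implemented by rescaling the uniform draw into the CDF interval of the truncated support.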

  6. Evaluating effectiveness of down-sampling for stratified designs and unbalanced prevalence in Random Forest models of tree species distributions in Nevada

    Science.gov (United States)

    Elizabeth A. Freeman; Gretchen G. Moisen; Tracy S. Frescino

    2012-01-01

    Random Forests is frequently used to model species distributions over large geographic areas. Complications arise when data used to train the models have been collected in stratified designs that involve different sampling intensity per stratum. The modeling process is further complicated if some of the target species are relatively rare on the landscape leading to an...

  7. Stratified random sampling plans designed to assist in the determination of radon and radon daughter concentrations in underground uranium mine atmosphere

    International Nuclear Information System (INIS)

    Makepeace, C.E.

    1981-01-01

    Sampling strategies for the monitoring of deleterious agents present in uranium mine air in underground and surface mining areas are described. These methods are designed to prevent overexposure of the lining of the respiratory system of uranium miners to ionizing radiation from radon and radon daughters, and whole body overexposure to external gamma radiation. A detailed description is provided of stratified random sampling monitoring methodology for obtaining baseline data to be used as a reference for subsequent compliance assessment

  8. Estimation of Finite Population Mean in Multivariate Stratified Sampling under Cost Function Using Goal Programming

    Directory of Open Access Journals (Sweden)

    Atta Ullah

    2014-01-01

In practical use of the stratified random sampling scheme, the investigator faces the problem of selecting a sample that maximizes the precision of a finite population mean under a cost constraint. Allocation of sample sizes becomes complicated when more than one characteristic is observed from each selected unit in a sample. In many real-life situations, a linear cost function of the sample size n_h is not a good approximation to the actual cost of a sample survey when the traveling cost between selected units in a stratum is significant. In this paper, the sample allocation problem in multivariate stratified random sampling with the proposed cost function is formulated as an integer nonlinear multiobjective mathematical programming problem. A solution procedure is proposed using an extended lexicographic goal programming approach. A numerical example is presented to illustrate the computational details and to compare the efficiency of the proposed compromise allocation.
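For context, the baseline that this kind of multiobjective allocation generalizes is classical Neyman allocation for a single characteristic, where the stratum sample size n_h is made proportional to N_h * S_h. The sketch below shows only that textbook baseline, with invented strata, not the paper's goal-programming method:

```python
def neyman_allocation(n_total, strata):
    """Classical Neyman allocation for one characteristic:
    n_h proportional to N_h * S_h, where N_h is the stratum size and
    S_h its standard deviation. (The paper extends this baseline to
    several characteristics and a nonlinear travel-cost function.)"""
    weights = {h: N * S for h, (N, S) in strata.items()}
    total = sum(weights.values())
    return {h: round(n_total * w / total) for h, w in weights.items()}

# Hypothetical strata: (population size N_h, standard deviation S_h)
strata = {"urban": (1000, 4.0), "rural": (500, 10.0), "remote": (2000, 1.0)}
print(neyman_allocation(100, strata))
```

Note how the small but highly variable "rural" stratum receives the largest allocation; under a nonlinear cost function this proportionality breaks down, which is what motivates the integer programming formulation.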

  9. Effects of unstratified and centre-stratified randomization in multi-centre clinical trials.

    Science.gov (United States)

    Anisimov, Vladimir V

    2011-01-01

    This paper deals with the analysis of randomization effects in multi-centre clinical trials. The two randomization schemes most often used in clinical trials are considered: unstratified and centre-stratified block-permuted randomization. The prediction of the number of patients randomized to different treatment arms in different regions during the recruitment period accounting for the stochastic nature of the recruitment and effects of multiple centres is investigated. A new analytic approach using a Poisson-gamma patient recruitment model (patients arrive at different centres according to Poisson processes with rates sampled from a gamma distributed population) and its further extensions is proposed. Closed-form expressions for corresponding distributions of the predicted number of the patients randomized in different regions are derived. In the case of two treatments, the properties of the total imbalance in the number of patients on treatment arms caused by using centre-stratified randomization are investigated and for a large number of centres a normal approximation of imbalance is proved. The impact of imbalance on the power of the study is considered. It is shown that the loss of statistical power is practically negligible and can be compensated by a minor increase in sample size. The influence of patient dropout is also investigated. The impact of randomization on predicted drug supply overage is discussed. Copyright © 2010 John Wiley & Sons, Ltd.
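The Poisson-gamma recruitment model described here (centre arrival rates sampled from a gamma population, patients arriving as Poisson processes) is easy to simulate. The sketch below uses illustrative parameter values and a simple Knuth-style Poisson draw; it mimics the model's structure, not the paper's closed-form results:

```python
import math
import random

def simulate_recruitment(n_centres, alpha, beta, t, seed=1):
    """Poisson-gamma sketch: each centre's recruitment rate is drawn from
    Gamma(alpha, beta); patients then arrive as Poisson(rate * t).
    All parameter values here are illustrative."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_centres):
        rate = rng.gammavariate(alpha, beta)
        # Poisson draw by Knuth's method (adequate for small means)
        limit, k, p = math.exp(-rate * t), 0, 1.0
        while p > limit:
            k += 1
            p *= rng.random()
        total += k - 1
    return total

# Expected total is n_centres * alpha * beta * t = 150 patients
print(simulate_recruitment(n_centres=50, alpha=2.0, beta=0.5, t=3.0))
```

Repeating the simulation many times reproduces the over-dispersion relative to a plain Poisson model that the gamma-distributed rates introduce, which is what the paper's closed-form predictive distributions capture analytically.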

  10. Stereo imaging and random array stratified imaging for cargo radiation inspecting

    International Nuclear Information System (INIS)

    Wang Jingjin; Zeng Yu

    2003-01-01

This paper presents stereo imaging and random array stratified imaging for cargo container radiation inspection. By using a dual-line vertical detector array scan, a stereo image of the inspected cargo can be obtained and viewed in virtual reality. The random detector array has only one row of detectors, distributed randomly over a certain horizontal extent. Scanning a cargo container with this random detector array yields a 'defocused' image. By using 'anti-random focusing', one layer of the image can be focused against the background of the defocused images from all other layers. A stratified X-ray image of overlapped bike wheels is presented.

  11. A random spatial sampling method in a rural developing nation

    Science.gov (United States)

    Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas

    2014-01-01

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...

  12. Data splitting for artificial neural networks using SOM-based stratified sampling.

    Science.gov (United States)

    May, R J; Maier, H R; Dandy, G C

    2010-03-01

    Data splitting is an important consideration during artificial neural network (ANN) development where hold-out cross-validation is commonly employed to ensure generalization. Even for a moderate sample size, the sampling methodology used for data splitting can have a significant effect on the quality of the subsets used for training, testing and validating an ANN. Poor data splitting can result in inaccurate and highly variable model performance; however, the choice of sampling methodology is rarely given due consideration by ANN modellers. Increased confidence in the sampling is of paramount importance, since the hold-out sampling is generally performed only once during ANN development. This paper considers the variability in the quality of subsets that are obtained using different data splitting approaches. A novel approach to stratified sampling, based on Neyman sampling of the self-organizing map (SOM), is developed, with several guidelines identified for setting the SOM size and sample allocation in order to minimize the bias and variance in the datasets. Using an example ANN function approximation task, the SOM-based approach is evaluated in comparison to random sampling, DUPLEX, systematic stratified sampling, and trial-and-error sampling to minimize the statistical differences between data sets. Of these approaches, DUPLEX is found to provide benchmark performance with good model performance, with no variability. The results show that the SOM-based approach also reliably generates high-quality samples and can therefore be used with greater confidence than other approaches, especially in the case of non-uniform datasets, with the benefit of scalability to perform data splitting on large datasets. Copyright 2009 Elsevier Ltd. All rights reserved.
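The paper's point is that how the hold-out split is drawn matters. A plain proportional stratified split, far simpler than the SOM/Neyman approach developed in the paper, already illustrates the core idea of sampling within strata so the subsets mirror the data's distribution; the data and stratum key below are invented:

```python
import random
from collections import defaultdict

def stratified_split(data, key, test_frac=0.25, seed=0):
    """Hold-out split that samples within each stratum so the test set
    mirrors the data's distribution (a plain proportional scheme, not
    the SOM-based Neyman sampling the paper develops)."""
    rng = random.Random(seed)
    by_stratum = defaultdict(list)
    for x in data:
        by_stratum[key(x)].append(x)
    train, test = [], []
    for _, items in sorted(by_stratum.items()):
        rng.shuffle(items)
        n_test = max(1, int(round(test_frac * len(items))))
        test.extend(items[:n_test])
        train.extend(items[n_test:])
    return train, test

# Illustrative data: (value, band) pairs stratified by band
data = [(i, "low" if i < 40 else "high") for i in range(100)]
train, test = stratified_split(data, key=lambda x: x[1])
print(len(train), len(test))
```

Because each stratum contributes the same fraction to the test set, the low/high proportions in the test set match the full data set exactly, which is the bias-reduction property the paper generalizes to continuous, non-uniform data via the SOM.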

  13. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis

    International Nuclear Information System (INIS)

    Mohamed, A.

    1998-01-01

In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo ''Eigenvalue of the World'' problem. Argonne presented a paper at that session in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely-coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result relative to conventional source-sampling methods. However, this gain in reliability is substantially less than that observed in the model-problem results.

  14. [Study of spatial stratified sampling strategy of Oncomelania hupensis snail survey based on plant abundance].

    Science.gov (United States)

    Xun-Ping, W; An, Z

    2017-07-27

Objective: To optimize and simplify the survey method for Oncomelania hupensis snails in marshland endemic regions of schistosomiasis, so as to improve the precision, efficiency and economy of the snail survey. Methods: A snail sampling strategy (Spatial Sampling Scenario of Oncomelania based on Plant Abundance, SOPA), which took plant abundance as an auxiliary variable, was explored in an experimental study in a 50 m × 50 m plot in a marshland in the Poyang Lake region. Firstly, the push-broom survey data were stratified into 5 layers by the plant abundance data; then, the required number of optimal sampling points for each layer was calculated through the Hammond-McCullagh equation; thirdly, every sample point was pinpointed in line with the Multiple Directional Interpolation (MDI) placement scheme; and finally, a comparison was performed among the outcomes of the spatial random sampling strategy, the traditional systematic sampling method, the spatial stratified sampling method, Sandwich spatial sampling and inference, and SOPA. Results: The method proposed in this study (SOPA) had the minimal absolute error of 0.2138; the traditional systematic sampling method had the largest estimate, with an absolute error of 0.9244. Conclusion: The snail sampling strategy proposed in this study (SOPA) achieves higher estimation accuracy than the other four methods.

  15. Random forcing of geostrophic motion in rotating stratified turbulence

    Science.gov (United States)

    Waite, Michael L.

    2017-12-01

    Random forcing of geostrophic motion is a common approach in idealized simulations of rotating stratified turbulence. Such forcing represents the injection of energy into large-scale balanced motion, and the resulting breakdown of quasi-geostrophic turbulence into inertia-gravity waves and stratified turbulence can shed light on the turbulent cascade processes of the atmospheric mesoscale. White noise forcing is commonly employed, which excites all frequencies equally, including frequencies much higher than the natural frequencies of large-scale vortices. In this paper, the effects of these high frequencies in the forcing are investigated. Geostrophic motion is randomly forced with red noise over a range of decorrelation time scales τ, from a few time steps to twice the large-scale vortex time scale. It is found that short τ (i.e., nearly white noise) results in about 46% more gravity wave energy than longer τ, despite the fact that waves are not directly forced. We argue that this effect is due to wave-vortex interactions, through which the high frequencies in the forcing are able to excite waves at their natural frequencies. It is concluded that white noise forcing should be avoided, even if it is only applied to the geostrophic motion, when a careful investigation of spontaneous wave generation is needed.

  16. Stratified sampling design based on data mining.

    Science.gov (United States)

    Kim, Yeonkook J; Oh, Yoonhwan; Park, Sunghoon; Cho, Sungzoon; Park, Hayoung

    2013-09-01

To explore classification rules based on data mining methodologies which are to be used in defining strata in stratified sampling of healthcare providers with improved sampling efficiency. We performed k-means clustering to group providers with similar characteristics, then constructed decision trees on cluster labels to generate stratification rules. We assessed the variance explained by the stratification proposed in this study and by conventional stratification to evaluate the performance of the sampling design. We constructed a study database from health insurance claims data and providers' profile data made available to this study by the Health Insurance Review and Assessment Service of South Korea, and population data from Statistics Korea. From our database, we used the data for single specialty clinics or hospitals in two specialties, general surgery and ophthalmology, for the year 2011 in this study. Data mining resulted in five strata in general surgery with two stratification variables, the number of inpatients per specialist and population density of provider location, and five strata in ophthalmology with two stratification variables, the number of inpatients per specialist and number of beds. The percentages of variance in annual changes in the productivity of specialists explained by the stratification in general surgery and ophthalmology were 22% and 8%, respectively, whereas conventional stratification by the type of provider location and number of beds explained 2% and 0.2% of variance, respectively. This study demonstrated that data mining methods can be used in designing efficient stratified sampling with variables readily available to the insurer and government; it offers an alternative to the existing stratification method that is widely used in healthcare provider surveys in South Korea.

  17. Bayesian stratified sampling to assess corpus utility

    Energy Technology Data Exchange (ETDEWEB)

    Hochberg, J.; Scovel, C.; Thomas, T.; Hall, S.

    1998-12-01

This paper describes a method for asking statistical questions about a large text corpus. The authors exemplify the method by addressing the question, "What percentage of Federal Register documents are real documents, of possible interest to a text researcher or analyst?" They estimate an answer to this question by evaluating 200 documents selected from a corpus of 45,820 Federal Register documents. Bayesian analysis and stratified sampling are used to reduce the sampling uncertainty of the estimate from over 3,100 documents to fewer than 1,000. A possible application of the method is to establish baseline statistics used to estimate recall rates for information retrieval systems.
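The combination of Bayesian analysis with stratified sampling can be sketched by giving each stratum a Beta posterior for the proportion of "real" documents and combining the posterior means with stratum-size weights. The stratum split and hit counts below are invented for illustration (only the 45,820 corpus total comes from the abstract):

```python
def stratified_beta_posterior(strata, prior=(1.0, 1.0)):
    """Combine per-stratum Beta posteriors into a corpus-wide estimate
    of a proportion. Each stratum is (corpus size N, docs evaluated,
    docs judged positive); a uniform Beta(1, 1) prior is assumed."""
    a0, b0 = prior
    total = sum(N for N, _, _ in strata)
    estimate = 0.0
    for N, n_sampled, n_hits in strata:
        post_mean = (a0 + n_hits) / (a0 + b0 + n_sampled)  # Beta posterior mean
        estimate += (N / total) * post_mean
    return estimate

# Hypothetical strata summing to the corpus size of 45,820 documents
strata = [(30000, 120, 96), (15820, 80, 24)]
print(round(stratified_beta_posterior(strata), 3))
```

Stratifying pays off exactly when the within-stratum proportions differ sharply, as in this toy example: each stratum's posterior is tight around a different value, so the weighted combination is far less uncertain than a single pooled estimate would be.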

  18. Monte Carlo stratified source-sampling

    International Nuclear Information System (INIS)

    Blomquist, R.N.; Gelbard, E.M.

    1997-01-01

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo open-quotes eigenvalue of the worldclose quotes problem. Argonne presented a paper, at that session, in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. The original test-problem was treated by a special code designed specifically for that purpose. Recently ANL started work on a method for dealing with more realistic eigenvalue of the world configurations, and has been incorporating this method into VIM. The original method has been modified to take into account real-world statistical noise sources not included in the model problem. This paper constitutes a status report on work still in progress

  19. Unit Stratified Sampling as a Tool for Approximation of Stochastic Optimization Problems

    Czech Academy of Sciences Publication Activity Database

    Šmíd, Martin

    2012-01-01

    Roč. 19, č. 30 (2012), s. 153-169 ISSN 1212-074X R&D Projects: GA ČR GAP402/11/0150; GA ČR GAP402/10/0956; GA ČR GA402/09/0965 Institutional research plan: CEZ:AV0Z10750506 Institutional support: RVO:67985556 Keywords : Stochastic programming * approximation * stratified sampling Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2013/E/smid-unit stratified sampling as a tool for approximation of stochastic optimization problems.pdf

  20. The study of combining Latin Hypercube Sampling method and LU decomposition method (LULHS method) for constructing spatial random field

    Science.gov (United States)

    WANG, P. T.

    2015-12-01

Groundwater modeling requires assigning hydrogeological properties to every numerical grid cell. Due to the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. A hydrogeological property is assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be completed. Therefore, statistical sampling plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure of LHS with simulation using LU decomposition to form LULHS. Both conditional and unconditional simulations of LULHS were developed. The simulation efficiency and spatial correlation of LULHS are compared to those of three other simulation methods. The results show that for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort. Fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
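The stratified random procedure of LHS itself is compact: each dimension's range is cut into n equal strata, one point is drawn per stratum, and the strata are assigned to the n points in a shuffled order per dimension. A minimal sketch for the unit hypercube (the LU-decomposition correlation step of LULHS is not included):

```python
import random

def latin_hypercube(n, dims, seed=0):
    """n points in [0, 1)^dims: each dimension's range is cut into n
    equal strata, with exactly one point per stratum; strata are
    assigned to points in a shuffled order per dimension."""
    rng = random.Random(seed)
    samples = [[0.0] * dims for _ in range(n)]
    for d in range(dims):
        cells = list(range(n))
        rng.shuffle(cells)
        for i, cell in enumerate(cells):
            samples[i][d] = (cell + rng.random()) / n
    return samples

pts = latin_hypercube(n=5, dims=2)
for d in range(2):
    # Each dimension occupies every fifth of its range exactly once
    print(sorted(int(p[d] * 5) for p in pts))
```

Mapping each coordinate through an inverse marginal CDF then yields stratified draws from arbitrary distributions, which is why far fewer realizations are needed than with plain Monte Carlo for the same statistical accuracy.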

  1. A direct observation method for auditing large urban centers using stratified sampling, mobile GIS technology and virtual environments.

    Science.gov (United States)

    Lafontaine, Sean J V; Sawada, M; Kristjansson, Elizabeth

    2017-02-16

With the expansion and growth of research on neighbourhood characteristics, there is an increased need for direct observational field audits. Herein, we introduce a novel direct observational audit method and systematic social observation instrument (SSOI) for efficiently assessing neighbourhood aesthetics over large urban areas. Our audit method uses spatial random sampling stratified by residential zoning and incorporates both mobile geographic information systems technology and virtual environments. The reliability of our method was tested in two ways: first, in 15 Ottawa neighbourhoods, we compared results at audited locations over two subsequent years; and second, we audited every residential block (167 blocks) in one neighbourhood and compared the distribution of SSOI aesthetics index scores with results from the randomly audited locations. Finally, we present interrater reliability and consistency results on all observed items. The observed neighbourhood average aesthetics index score estimated from four or five stratified random audit locations is sufficient to characterize the average neighbourhood aesthetics. The SSOI was internally consistent and demonstrated good to excellent interrater reliability. At the neighbourhood level, aesthetics is positively related to SES and physical activity and negatively correlated with BMI. The proposed approach to direct neighbourhood auditing performs sufficiently well and has the advantage of financial and temporal efficiency when auditing a large city.

  2. Occupational position and its relation to mental distress in a random sample of Danish residents

    DEFF Research Database (Denmark)

    Rugulies, Reiner Ernst; Madsen, Ida E H; Nielsen, Maj Britt D

    2010-01-01

PURPOSE: To analyze the distribution of depressive, anxiety, and somatization symptoms across different occupational positions in a random sample of Danish residents. METHODS: The study sample consisted of 591 Danish residents (50% women), aged 20-65, drawn from an age- and gender-stratified random sample of the Danish population. Participants filled out a survey that included the 92-item version of the Hopkins Symptom Checklist (SCL-92). We categorized occupational position into seven groups: high- and low-grade non-manual workers, skilled and unskilled manual workers, high- and low-grade self...

  3. Distribution-Preserving Stratified Sampling for Learning Problems.

    Science.gov (United States)

    Cervellera, Cristiano; Maccio, Danilo

    2017-06-09

    The need for extracting a small sample from a large amount of real data, possibly streaming, arises routinely in learning problems, e.g., for storage, to cope with computational limitations, obtain good training/test/validation sets, and select minibatches for stochastic gradient neural network training. Unless we have reasons to select the samples in an active way dictated by the specific task and/or model at hand, it is important that the distribution of the selected points is as similar as possible to the original data. This is obvious for unsupervised learning problems, where the goal is to gain insights on the distribution of the data, but it is also relevant for supervised problems, where the theory explains how the training set distribution influences the generalization error. In this paper, we analyze the technique of stratified sampling from the point of view of distances between probabilities. This allows us to introduce an algorithm, based on recursive binary partition of the input space, aimed at obtaining samples that are distributed as much as possible as the original data. A theoretical analysis is proposed, proving the (greedy) optimality of the procedure together with explicit error bounds. An adaptive version of the algorithm is also introduced to cope with streaming data. Simulation tests on various data sets and different learning tasks are also provided.

  4. A stratified two-stage sampling design for digital soil mapping in a Mediterranean basin

    Science.gov (United States)

    Blaschek, Michael; Duttmann, Rainer

    2015-04-01

The quality of environmental modelling results often depends on reliable soil information. In order to obtain soil data in an efficient manner, several sampling strategies are at hand, depending on the level of prior knowledge and the overall objective of the planned survey. This study focuses on the collection of soil samples considering available continuous secondary information in an undulating, 16 km² river catchment near Ussana in southern Sardinia (Italy). A design-based, stratified, two-stage sampling design has been applied, aiming at the spatial prediction of soil property values at individual locations. The stratification was based on quantiles from density functions of two land-surface parameters, topographic wetness index and potential incoming solar radiation, derived from a digital elevation model. Combined with four main geological units, the applied procedure led to 30 different classes in the given test site. Up to six polygons of each available class were selected randomly, excluding areas smaller than 1 ha to avoid incorrect location of the points in the field. Further exclusion rules were applied before polygon selection, masking out roads and buildings using a 20 m buffer. The selection procedure was repeated ten times, and the set of polygons with the best geographical spread was chosen. Finally, exact point locations were selected randomly from inside the chosen polygon features. A second selection based on the same stratification and following the same methodology (selecting one polygon instead of six) was made in order to create an appropriate validation set. Supplementary samples were obtained during a second survey focusing on polygons that had either not been considered during the first phase at all or were not adequately represented with respect to feature size. In total, both field campaigns produced an interpolation set of 156 samples and a validation set of 41 points.

  5. Reliability of impingement sampling designs: An example from the Indian Point station

    International Nuclear Information System (INIS)

    Mattson, M.T.; Waxman, J.B.; Watson, D.A.

    1988-01-01

A 4-year database (1976-1979) of daily fish impingement counts at the Indian Point electric power station on the Hudson River was used to compare the precision and reliability of three random-sampling designs: (1) simple random, (2) seasonally stratified, and (3) empirically stratified. The precision of daily impingement estimates improved logarithmically for each design as more days in the year were sampled. Simple random sampling was the least precise design and empirically stratified sampling the most precise, and the difference in precision between the two stratified designs was small. Computer-simulated sampling was used to estimate the reliability of the two stratified random-sampling designs. A seasonally stratified sampling design was selected as the most appropriate reduced-sampling program for the Indian Point station because: (1) reasonably precise and reliable impingement estimates were obtained using this design for all species combined and for eight common Hudson River fish by sampling only 30% of the days in a year (110 d); and (2) seasonal strata may be more precise and reliable than empirical strata if future changes in annual impingement patterns occur. The seasonally stratified design applied to the 1976-1983 Indian Point impingement data showed that selection of sampling dates based on daily species-specific impingement variability gave results that were more precise, but not more consistently reliable, than sampling allocations based on the variability of all fish species combined. 14 refs., 1 fig., 6 tabs.
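The computer-simulated sampling comparison used in this record can be sketched generically: repeatedly draw a sample under each design, estimate the annual mean, and compare the spread of the estimates. The seasonal count levels below are invented, not Indian Point data:

```python
import random
import statistics

def estimate_spread(design, seasons, n, reps=2000, seed=0):
    """Monte Carlo comparison of design precision: repeatedly draw a
    sample of n days, estimate the annual mean daily count, and return
    the standard deviation of the estimates across replications."""
    rng = random.Random(seed)
    pooled = [x for season in seasons for x in season]
    total = len(pooled)
    estimates = []
    for _ in range(reps):
        if design == "simple":
            sample = rng.sample(pooled, n)
            estimates.append(sum(sample) / n)
        else:  # seasonally stratified: equal effort per season
            per = n // len(seasons)
            est = 0.0
            for season in seasons:
                sub = rng.sample(season, per)
                est += (len(season) / total) * (sum(sub) / per)
            estimates.append(est)
    return statistics.stdev(estimates)

# Four "seasons" of hypothetical daily impingement counts, 90 d each,
# with strongly different seasonal levels
rng = random.Random(7)
seasons = [[max(0.0, rng.gauss(mu, 5)) for _ in range(90)]
           for mu in (10, 40, 80, 25)]
print(estimate_spread("simple", seasons, 20) >
      estimate_spread("stratified", seasons, 20))
```

Because stratification removes the large between-season component of variance from the estimator, the stratified design's estimates cluster much more tightly, mirroring the precision ordering the study reports.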

  6. National Coral Reef Monitoring Program: Benthic Images Collected from Stratified Random Sites (StRS) across American Samoa in 2015

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here are benthic habitat imagery that result from benthic photo-quadrat surveys conducted along transects at stratified random sites across...

  7. Prototypic Features of Loneliness in a Stratified Sample of Adolescents

    Directory of Open Access Journals (Sweden)

    Mathias Lasgaard

    2009-06-01

    Dominant theoretical approaches in loneliness research emphasize the value of personality characteristics in explaining loneliness. The present study examines whether dysfunctional social strategies and attributions in lonely adolescents can be explained by personality characteristics. A questionnaire survey was conducted with 379 Danish Grade 8 students (M = 14.1 years, SD = 0.4) from 22 geographically stratified and randomly selected schools. Hierarchical linear regression analysis showed that network orientation, success expectation and avoidance in affiliative situations predicted loneliness independent of personality characteristics, demographics and social desirability. The study indicates that dysfunctional strategies and attributions in affiliative situations are directly related to loneliness in adolescence. These strategies and attributions may preclude lonely adolescents from guidance and intervention. Thus, professionals need to be knowledgeable about prototypic features of loneliness, in addition to employing a pro-active approach, when assisting adolescents who display prototypic features.

  8. Sampling designs and methods for estimating fish-impingement losses at cooling-water intakes

    International Nuclear Information System (INIS)

    Murarka, I.P.; Bodeau, D.J.

    1977-01-01

    Several systems for estimating fish impingement at power plant cooling-water intakes are compared to determine the most statistically efficient sampling designs and methods. Compared to a simple random sampling scheme, the stratified systematic random sampling scheme, the systematic random sampling scheme, and the stratified random sampling scheme yield higher efficiencies and better estimators for the parameters in two models of fish impingement as a time-series process. Mathematical results and illustrative examples of the application of the sampling schemes to simulated and real data are given. Some sampling designs applicable to fish-impingement studies are presented in appendices.

  9. Brachytherapy dose-volume histogram computations using optimized stratified sampling methods

    International Nuclear Information System (INIS)

    Karouzakis, K.; Lahanas, M.; Milickovic, N.; Giannouli, S.; Baltas, D.; Zamboglou, N.

    2002-01-01

    A stratified sampling method for the efficient repeated computation of dose-volume histograms (DVHs) in brachytherapy is presented, as used in anatomy-based brachytherapy optimization. The aim of the method is to reduce the number of sampling points required for the calculation of DVHs for the body and the PTV; from the DVHs, quantities such as the conformity index (COIN) and COIN integrals are derived. This is achieved by using partially uniform distributed sampling points, with a density in each region obtained from a survey of the gradients or the variance of the dose distribution in that region. The shape of the sampling regions is adapted to the patient anatomy and to the shape and size of the implant. Applying this method requires a single preprocessing step, which takes only a few seconds. Ten clinical implants were used to study the appropriate number of sampling points, given a required accuracy for quantities such as cumulative DVHs, COIN indices and COIN integrals. We found that DVHs of very large tissue volumes surrounding the PTV, and also COIN distributions, can be obtained using 5-10 times fewer sampling points than with uniformly distributed points.
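The core allocation idea, more sampling points where the dose varies more, can be sketched as a Neyman-style allocation of a fixed point budget across regions. The region names, volumes and dose spreads below are invented for illustration, not the study's geometry.

```python
def allocate_points(regions, total_points):
    """Neyman-style allocation: n_i proportional to volume_i * sigma_i."""
    weights = [r["volume"] * r["dose_sd"] for r in regions]
    total_w = sum(weights)
    # Round each share, keeping at least one point per region.
    return [max(1, round(total_points * w / total_w)) for w in weights]

# Hypothetical regions: high-gradient PTV, moderate near-field, uniform far body.
regions = [
    {"name": "PTV",       "volume": 100.0,  "dose_sd": 30.0},
    {"name": "near_body", "volume": 400.0,  "dose_sd": 10.0},
    {"name": "far_body",  "volume": 2000.0, "dose_sd": 1.0},
]
alloc = allocate_points(regions, 1000)
print(alloc)  # -> [333, 444, 222]
```

The large, nearly uniform far-field region receives far fewer points per unit volume than the PTV, which is why the overall point count can drop by a large factor without losing DVH accuracy.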

  10. A stratified random survey of the proportion of poor quality oral artesunate sold at medicine outlets in the Lao PDR – implications for therapeutic failure and drug resistance

    Directory of Open Access Journals (Sweden)

    Vongsack Latsamy

    2009-07-01

    Background: Counterfeit oral artesunate has been a major public health problem in mainland SE Asia, impeding malaria control. A countrywide stratified random survey was performed to determine the availability and quality of oral artesunate in pharmacies and outlets (shops selling medicines) in the Lao PDR (Laos). Methods: In 2003, 'mystery' shoppers were asked to buy artesunate tablets from 180 outlets in 12 of the 18 Lao provinces. Outlets were selected using stratified random sampling by investigators not involved in sampling. Samples were analysed for packaging characteristics and by the Fast Red Dye test, high-performance liquid chromatography (HPLC), mass spectrometry (MS), X-ray diffractometry and pollen analysis. Results: Of 180 outlets sampled, 25 (13.9%) sold oral artesunate. Outlets selling artesunate were more commonly found in the more malarious southern Laos. Of the 25 outlets, 22 (88%; 95% CI 68–97%) sold counterfeit artesunate, as defined by packaging and chemistry. No artesunate was detected in the counterfeits by any of the chemical analysis techniques, and analysis of the packaging demonstrated seven different counterfeit types. There was complete agreement between the Fast Red Dye test, HPLC and MS analysis. A wide variety of wrong active ingredients was found by MS. Of great concern, 4/27 (14.8%) fakes contained detectable amounts of artemisinin (0.26–115.7 mg/tablet). Conclusion: This random survey confirms results from previous convenience surveys that counterfeit artesunate is a severe public health problem. The presence of artemisinin in counterfeits may encourage malaria resistance to artemisinin derivatives. With increasing accessibility of artemisinin-derivative combination therapy (ACT) in Laos, the removal of artesunate monotherapy from pharmacies may be an effective intervention.

  11. A Bayesian Justification for Random Sampling in Sample Survey

    Directory of Open Access Journals (Sweden)

    Glen Meeden

    2012-07-01

    In the usual Bayesian approach to survey sampling, the sampling design plays a minimal role, at best. Although a close relationship between exchangeable prior distributions and simple random sampling has been noted, how to formally integrate simple random sampling into the Bayesian paradigm is not clear. Recently it has been argued that the sampling design can be thought of as part of a Bayesian's prior distribution. We show here that, under this scenario, simple random sampling can be given a Bayesian justification in survey sampling.

  12. Stratified Sampling to Define Levels of Petrographic Variation in Coal Beds: Examples from Indonesia and New Zealand

    Directory of Open Access Journals (Sweden)

    Tim A. Moore

    2016-01-01

    DOI: 10.17014/ijog.3.1.29-51. Stratified sampling of coal seams for petrographic analysis using block samples is a viable alternative to standard methods of channel sampling and particulate pellet mounts. Although petrographic analysis of particulate pellets is widely employed, it is time consuming and does not allow variation within sampling units to be assessed, an important measure in any study, whether for paleoenvironmental reconstruction or for estimating industrial attributes. Also, samples taken as intact blocks provide additional information, such as texture and botanical affinity, that cannot be gained using particulate pellets. Stratified sampling can be employed on both 'fine'- and 'coarse'-grained coal units. Fine-grained coals are defined as those coal intervals that do not contain vitrain bands greater than approximately 1 mm in thickness (as measured perpendicular to bedding). In fine-grained coal seams, a reasonably sized block sample (with a polished surface area of ~3 cm²) can be taken that encapsulates the macroscopic variability. However, for coarse-grained coals (vitrain bands >1 mm), a different system has to be employed in order to accurately account for the larger particles. Macroscopic point counting can accurately account for vitrain bands >1 mm within a coal interval. This point counting can be conducted using something as simple as string on a coal face, with marked intervals greater than the largest particle expected to be encountered (although new technologies are being developed to capture this type of information digitally). Comparative analyses of particulate pellets and blocks on the same interval show less than 6% variation between the two sample types when blocks are recalculated to include macroscopic counts of vitrain. Therefore, even in coarse-grained coals, stratified sampling can be used effectively and representatively.

  13. Estimating HIES Data through Ratio and Regression Methods for Different Sampling Designs

    Directory of Open Access Journals (Sweden)

    Faqir Muhammad

    2007-01-01

    In this study, a comparison has been made of different sampling designs, using the HIES data of North West Frontier Province (NWFP) for 2001-02 and 1998-99 collected from the Federal Bureau of Statistics, Statistical Division, Government of Pakistan, Islamabad. The performance of the estimators has also been assessed using the bootstrap and jackknife. A two-stage stratified random sample design is adopted by HIES: in the first stage, enumeration blocks and villages are treated as first-stage Primary Sampling Units (PSUs), selected with probability proportional to size; Secondary Sampling Units (SSUs), i.e. households, are selected by systematic sampling with a random start. HIES used a single study variable. We have compared the HIES technique with some other designs: stratified simple random sampling, stratified systematic sampling, stratified ranked set sampling, and stratified two-phase sampling. Ratio and regression methods were applied with two study variables, income (y) and household size (x), and jackknife and bootstrap were used for variance estimation. Simple random sampling with sample sizes 462 to 561 gave moderate variances by both jackknife and bootstrap. Applying systematic sampling, we obtained moderate variance with sample size 467. With the jackknife under systematic sampling, the variance of the regression estimator was greater than that of the ratio estimator for sample sizes 467 to 631; at sample size 952, the variance of the ratio estimator became greater than that of the regression estimator. The most efficient design turned out to be ranked set sampling: with jackknife and bootstrap it gave the minimum variance even at the smallest sample size (467). Two-phase sampling performed poorly. The multi-stage sampling applied by HIES gave large variances, especially when used with a single study variable.
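The ratio and regression estimators compared in this record can be sketched with a synthetic population in which income (y) is roughly proportional to household size (x). All population parameters and sizes below are illustrative assumptions.

```python
import random
import statistics

random.seed(7)
N = 5000
x_pop = [random.randint(1, 10) for _ in range(N)]           # household size
y_pop = [300 * x + random.gauss(0, 200) for x in x_pop]     # synthetic income
X_bar = statistics.mean(x_pop)                              # known population mean of x

# Simple random sample of 500 households.
idx = random.sample(range(N), 500)
xs = [x_pop[i] for i in idx]
ys = [y_pop[i] for i in idx]
x_bar, y_bar = statistics.mean(xs), statistics.mean(ys)

# Ratio estimator: scale the sample mean of y by the known X_bar / x_bar.
ratio_est = y_bar / x_bar * X_bar

# Regression estimator: adjust y_bar by the fitted slope times (X_bar - x_bar).
b = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(xs, ys))
     / sum((xi - x_bar) ** 2 for xi in xs))
regression_est = y_bar + b * (X_bar - x_bar)

true_mean = statistics.mean(y_pop)
print(round(true_mean), round(ratio_est), round(regression_est))
```

Both estimators exploit the auxiliary variable x, so their errors are driven by the residual spread of y around the x-relationship rather than by the full spread of y.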

  14. Sampling Strategies for Evaluating the Rate of Adventitious Transgene Presence in Non-Genetically Modified Crop Fields.

    Science.gov (United States)

    Makowski, David; Bancal, Rémi; Bensadoun, Arnaud; Monod, Hervé; Messéan, Antoine

    2017-09-01

    According to E.U. regulations, the maximum allowable rate of adventitious transgene presence in non-genetically modified (GM) crops is 0.9%. We compared four sampling methods for the detection of transgenic material in agricultural non-GM maize fields: random sampling, stratified sampling, random sampling + ratio reweighting, random sampling + regression reweighting. Random sampling involves simply sampling maize grains from different locations selected at random from the field concerned. The stratified and reweighting sampling methods make use of an auxiliary variable corresponding to the output of a gene-flow model (a zero-inflated Poisson model) simulating cross-pollination as a function of wind speed, wind direction, and distance to the closest GM maize field. With the stratified sampling method, an auxiliary variable is used to define several strata with contrasting transgene presence rates, and grains are then sampled at random from each stratum. With the two methods involving reweighting, grains are first sampled at random from various locations within the field, and the observations are then reweighted according to the auxiliary variable. Data collected from three maize fields were used to compare the four sampling methods, and the results were used to determine the extent to which transgene presence rate estimation was improved by the use of stratified and reweighting sampling methods. We found that transgene rate estimates were more accurate and that substantially smaller samples could be used with sampling strategies based on an auxiliary variable derived from a gene-flow model. © 2017 Society for Risk Analysis.

  15. An unbiased estimator of the variance of simple random sampling using mixed random-systematic sampling

    OpenAIRE

    Padilla, Alberto

    2009-01-01

    Systematic sampling is a commonly used technique due to its simplicity and ease of implementation. The drawback of this simplicity is that it is not possible to estimate the design variance without bias. There are several ways to circumvent this problem. One method is to suppose that the variable of interest has a random order in the population, so the sample variance of simple random sampling without replacement is used. By means of a mixed random - systematic sample, an unbiased estimator o...

  16. Temporally stratified sampling programs for estimation of fish impingement

    International Nuclear Information System (INIS)

    Kumar, K.D.; Griffith, J.S.

    1977-01-01

    Impingement monitoring programs often expend valuable and limited resources yet fail to provide a dependable estimate of either total annual impingement or the biological and physicochemical factors affecting impingement. In situations where initial monitoring has identified "problem" fish species and the periodicity of their impingement, intensive sampling during periods of high impingement will maximize the information obtained. We use data gathered at two nuclear generating facilities in the southeastern United States to discuss techniques for designing such temporally stratified monitoring programs and their benefits and drawbacks. Of the possible temporal patterns in environmental factors within a calendar year, differences among seasons are most influential in the impingement of freshwater fishes in the Southeast. Data on the threadfin shad (Dorosoma petenense) and the role of seasonal temperature changes are used as an example to demonstrate ways of most efficiently and accurately estimating impingement of the species.

  17. [Comparison study on sampling methods of Oncomelania hupensis snail survey in marshland schistosomiasis epidemic areas in China].

    Science.gov (United States)

    An, Zhao; Wen-Xin, Zhang; Zhong, Yao; Yu-Kuan, Ma; Qing, Liu; Hou-Lang, Duan; Yi-di, Shang

    2016-06-29

    To optimize and simplify the method of Oncomelania hupensis snail surveys in marshland schistosomiasis-endemic regions, and to increase the precision, efficiency and economy of the snail survey, a 50 m × 50 m experimental plot was selected in the Chayegang marshland near Henghu farm in the Poyang Lake region, and a whole-coverage survey was adopted as the reference. The simple random sampling, systematic sampling and stratified random sampling methods were applied to calculate the minimum sample size, relative sampling error and absolute sampling error. The minimum sample sizes of the simple random, systematic and stratified random sampling methods were 300, 300 and 225, respectively. The relative sampling errors of the three methods were all less than 15%. The absolute sampling errors were 0.221 7, 0.302 4 and 0.047 8, respectively. Spatially stratified sampling with altitude as the stratification variable is an efficient approach, with lower cost and higher precision, for snail surveys.

  18. National Coral Reef Monitoring Program: Benthic Images Collected from Stratified Random Sites (StRS) across American Samoa in 2015 (NCEI Accession 0159168)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here are benthic habitat imagery that result from benthic photo-quadrat surveys conducted along transects at stratified random sites across...

  19. Modern survey sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Contents: Exposure to Sampling (concepts of population, sample, and sampling); Initial Ramifications (sampling design and sampling scheme; random numbers and their uses in simple random sampling (SRS); drawing simple random samples with and without replacement; estimation of mean, total, ratio of totals/means; variance and variance estimation; determination of sample sizes; appendix: more on equal probability sampling, the Horvitz-Thompson estimator, sufficiency, likelihood, a non-existence theorem); More Intricacies (unequal probability sampling strategies; PPS sampling); Exploring Improved Ways (stratified sampling; cluster sampling; multi-stage sampling; multi-phase sampling: ratio and regression estimation; controlled sampling); Modeling (super-population modeling; prediction approach; model-assisted approach; Bayesian methods; spatial smoothing); Sampling on Successive Occasions: Panel Rotation; Non-Response and Not-at-Homes; Weighting Adj...

  20. Systematic versus random sampling in stereological studies.

    Science.gov (United States)

    West, Mark J

    2012-12-01

    The sampling that takes place at all levels of an experimental design must be random if the estimate is to be unbiased in a statistical sense. There are two fundamental ways by which one can make a random sample of the sections and positions to be probed on the sections. Using a card-sampling analogy, one can pick any card at all out of a deck of cards. This is referred to as independent random sampling because the sampling of any one card is made without reference to the position of the other cards. The other approach to obtaining a random sample would be to pick a card within a set number of cards and others at equal intervals within the deck. Systematic sampling along one axis of many biological structures is more efficient than random sampling, because most biological structures are not randomly organized. This article discusses the merits of systematic versus random sampling in stereological studies.

  1. k-Means: Random Sampling Procedure

    Indian Academy of Sciences (India)

    k-Means: Random Sampling Procedure. The optimal 1-mean is approximated by the centroid of a random sample (Inaba et al.): if S is a random sample of size O(1/ε), then the centroid of S is a (1+ε)-approximate centroid of P with constant probability.
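The claim on this slide can be checked numerically: the centroid of a small random sample gives a 1-mean cost close to the optimum. The point cloud and sample size below are illustrative, with ε implicit in the sample size.

```python
import random

def centroid(points):
    """Coordinate-wise mean of a list of points."""
    n = len(points)
    return tuple(sum(c) / n for c in zip(*points))

def cost(points, center):
    """1-mean cost: sum of squared distances to the center."""
    return sum(sum((a - b) ** 2 for a, b in zip(p, center)) for p in points)

random.seed(3)
P = [(random.gauss(5, 2), random.gauss(-1, 2)) for _ in range(10_000)]
S = random.sample(P, 25)          # small sample, size playing the role of O(1/epsilon)

opt = cost(P, centroid(P))        # the centroid of P is the optimal 1-mean center
approx = cost(P, centroid(S))     # cost when using the sampled centroid instead
print(approx / opt)               # close to 1 for a good sample
```

Since the centroid of P minimizes the 1-mean cost, the ratio is always at least 1; the sampling result says it exceeds 1 only slightly, with constant probability, even for a sample whose size does not grow with |P|.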

  2. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, the confidence level, the expected proportion of the outcome variable (for categorical variables) or the standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) of the study. The greater the required precision, the larger the required sample size. Sampling Techniques: The probability sampling techniques applied in health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are recommended over the nonprobability sampling techniques, because the results of the study can then be generalized to the target population.
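The sample-size calculation described here for a categorical outcome can be sketched with the standard z-based formula, plus an optional finite population correction; the numbers in the usage example are the conventional worst-case inputs.

```python
import math

def sample_size_proportion(p, d, z=1.96, population=None):
    """Sample size for estimating a proportion p with margin of error d.

    z defaults to 1.96 for a 95% confidence level. If the study population
    is finite, the estimate is adjusted downward with the finite population
    correction.
    """
    n = (z ** 2) * p * (1 - p) / (d ** 2)        # infinite-population formula
    if population is not None:
        n = n / (1 + (n - 1) / population)       # finite population correction
    return math.ceil(n)

print(sample_size_proportion(0.5, 0.05))                   # -> 385 (classic worst case)
print(sample_size_proportion(0.5, 0.05, population=2000))  # -> 323
```

As the text notes, tightening the margin d or raising the confidence level (larger z) increases the required n, while a small finite population reduces it.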

  3. Generation and Analysis of Constrained Random Sampling Patterns

    DEFF Research Database (Denmark)

    Pierzchlewski, Jacek; Arildsen, Thomas

    2016-01-01

    Random sampling is a technique for signal acquisition which is gaining popularity in practical signal processing systems. Nowadays, event-driven analog-to-digital converters make random sampling feasible in practical applications. A process of random sampling is defined by a sampling pattern, which indicates signal sampling points in time. Practical random sampling patterns are constrained by ADC characteristics and application requirements. In this paper, we introduce statistical methods which evaluate random sampling pattern generators, with emphasis on practical applications. Furthermore, we propose an algorithm that generates random sampling patterns dedicated to event-driven ADCs better than existing sampling pattern generators. Finally, implementation issues of random sampling patterns are discussed.

  4. Design of dry sand soil stratified sampler

    Science.gov (United States)

    Li, Erkang; Chen, Wei; Feng, Xiao; Liao, Hongbo; Liang, Xiaodong

    2018-04-01

    This paper presents the design of a stratified sampler for dry sandy soil, which can be used for stratified sampling of loose sand under certain conditions. Our group designed the mechanical structure of a portable, single-person, dry sandy soil stratified sampler and set up a mathematical model for it, laying the foundation for further design research.

  5. Sampling problems for randomly broken sticks

    Energy Technology Data Exchange (ETDEWEB)

    Huillet, Thierry [Laboratoire de Physique Theorique et Modelisation, CNRS-UMR 8089 et Universite de Cergy-Pontoise, 5 mail Gay-Lussac, 95031, Neuville sur Oise (France)

    2003-04-11

    Consider the random partitioning model of a population (represented by a stick of length 1) into n species (fragments) with identically distributed random weights (sizes). Upon ranking the fragments' weights according to ascending size, let S_{m:n} be the size of the mth smallest fragment. Assume that some observer is sampling such populations as follows: drop at random k points (the sample size) onto this stick and record the corresponding numbers of visited fragments. We shall investigate the following sampling problems: (1) what is the sample size if sampling is carried out until the first visit to the smallest fragment (of size S_{1:n})? (2) For a given sample size, have all the fragments of the stick been visited at least once or not? This question is related to Feller's random coupon collector problem. (3) In what order are new fragments discovered, and what is the random number of samples separating the discovery of consecutive new fragments until the list is exhausted? For this problem, the distribution of the size-biased permutation of the species' weights, i.e. the sequence of their weights in their order of appearance, is needed and studied.

  6. Properties of the endogenous post-stratified estimator using a random forests model

    Science.gov (United States)

    John Tipton; Jean Opsomer; Gretchen G. Moisen

    2012-01-01

    Post-stratification is used in survey statistics as a method to improve variance estimates. In traditional post-stratification methods, the variable on which the data is being stratified must be known at the population level. In many cases this is not possible, but it is possible to use a model to predict values using covariates, and then stratify on these predicted...

  7. National Coral Reef Monitoring Program: Benthic Cover Derived from Analysis of Benthic Images Collected during Stratified Random Surveys (StRS) across American Samoa in 2015

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here result from benthic photo-quadrat surveys conducted along transects at stratified random sites across American Samoa in 2015 as a part of...

  8. Independent random sampling methods

    CERN Document Server

    Martino, Luca; Míguez, Joaquín

    2018-01-01

    This book systematically addresses the design and analysis of efficient techniques for independent random sampling. Both general-purpose approaches, which can be used to generate samples from arbitrary probability distributions, and tailored techniques, designed to efficiently address common real-world practical problems, are introduced and discussed in detail. In turn, the monograph presents fundamental results and methodologies in the field, elaborating and developing them into the latest techniques. The theory and methods are illustrated with a varied collection of examples, which are discussed in detail in the text and supplemented with ready-to-run computer code. The main problem addressed in the book is how to generate independent random samples from an arbitrary probability distribution with the weakest possible constraints or assumptions in a form suitable for practical implementation. The authors review the fundamental results and methods in the field, address the latest methods, and emphasize the li...

  9. Mahalanobis' Contributions to Sample Surveys

    Indian Academy of Sciences (India)

    Sample Survey started its operations in October 1950 under the ... and adopted random cuts for estimating the acreage under jute ... demographic factors relating to indebtedness, unemployment, ... traffic surveys, demand for currency coins and average life of ... Mahalanobis derived the optimum allocation in stratified sampling.

  10. Gambling problems in the family – A stratified probability sample study of prevalence and reported consequences

    Directory of Open Access Journals (Sweden)

    Øren Anita

    2008-12-01

    Background: Prior studies on the impact of problem gambling in the family mainly include help-seeking populations with small numbers of participants. The objective of the present stratified probability sample study was to explore the epidemiology of problem gambling in the family in the general population. Methods: Men and women aged 16–74 years, randomly selected from the Norwegian national population database, received an invitation to participate in this postal questionnaire study. The response rate was 36.1% (3,483/9,638). Given the lack of validated criteria, two survey questions ("Have you ever noticed that a close relative spent more and more money on gambling?" and "Have you ever experienced that a close relative lied to you about how much he/she gambles?") were extrapolated from the Lie/Bet Screen for pathological gambling. Respondents answering "yes" to both questions were defined as Concerned Significant Others (CSOs). Results: Overall, 2.0% of the study population was defined as CSOs. Young age, female gender, and divorced marital status were factors positively associated with being a CSO. CSOs often reported having experienced conflicts in the family related to gambling, worsening of the family's financial situation, and impaired mental and physical health. Conclusion: Problematic gambling behaviour not only affects the gambling individual but also has a strong impact on the quality of life of family members.

  11. Health service accreditation as a predictor of clinical and organisational performance: a blinded, random, stratified study.

    Science.gov (United States)

    Braithwaite, Jeffrey; Greenfield, David; Westbrook, Johanna; Pawsey, Marjorie; Westbrook, Mary; Gibberd, Robert; Naylor, Justine; Nathan, Sally; Robinson, Maureen; Runciman, Bill; Jackson, Margaret; Travaglia, Joanne; Johnston, Brian; Yen, Desmond; McDonald, Heather; Low, Lena; Redman, Sally; Johnson, Betty; Corbett, Angus; Hennessy, Darlene; Clark, John; Lancaster, Judie

    2010-02-01

    Background: Despite the widespread use of accreditation in many countries, and prevailing beliefs that accreditation is associated with variables contributing to clinical care and organisational outcomes, little systematic research has been conducted to examine its validity as a predictor of healthcare performance. Objective: To determine whether accreditation performance is associated with self-reported clinical performance and independent ratings of four aspects of organisational performance. Design: Independent blinded assessment of these variables in a random, stratified sample of health service organisations. Setting: Acute care: large, medium and small health-service organisations in Australia. Study participants: Nineteen health service organisations employing 16 448 staff, treating 321 289 inpatients and providing 1 971 087 non-inpatient services annually, representing approximately 5% of the Australian acute care health system. Main outcome measures: Correlations of accreditation performance with organisational culture, organisational climate, consumer involvement, leadership and clinical performance. Results: Accreditation performance was significantly positively correlated with organisational culture (rho=0.618, p=0.005) and leadership (rho=0.616, p=0.005). There was a trend between accreditation and clinical performance (rho=0.450, p=0.080). Accreditation was unrelated to organisational climate (rho=0.378, p=0.110) and consumer involvement (rho=0.215, p=0.377). Conclusion: Accreditation results predict leadership behaviours and cultural characteristics of healthcare organisations, but not organisational climate or consumer participation; a positive trend between accreditation and clinical performance is noted.

  12. Towards Cost-efficient Sampling Methods

    OpenAIRE

    Peng, Luo; Yongli, Li; Chong, Wu

    2014-01-01

    The sampling method has been paid much attention in the field of complex network in general and statistical physics in particular. This paper presents two new sampling methods based on the perspective that a small part of vertices with high node degree can possess the most structure information of a network. The two proposed sampling methods are efficient in sampling the nodes with high degree. The first new sampling method is improved on the basis of the stratified random sampling method and...

  13. Impacts of Sample Design for Validation Data on the Accuracy of Feedforward Neural Network Classification

    Directory of Open Access Journals (Sweden)

    Giles M. Foody

    2017-08-01

    Validation data are often used to evaluate the performance of a trained neural network and in the selection of a network deemed optimal for the task at hand. Optimality is commonly assessed with a measure such as overall classification accuracy. The latter is often calculated directly from a confusion matrix showing the counts of cases in the validation set with particular labelling properties. The sample design used to form the validation set can, however, influence the estimated magnitude of the accuracy. Commonly, the validation set is formed with a stratified sample to give balanced classes, but also via random sampling, which reflects class abundance. It is suggested that if the ultimate aim is to accurately classify a dataset in which the classes do vary in abundance, a validation set formed via random, rather than stratified, sampling is preferred. This is illustrated with the classification of simulated and remotely-sensed datasets. With both datasets, statistically significant differences in the accuracy with which the data could be classified arose from the use of validation sets formed via random and stratified sampling (z = 2.7 and 1.9 for the simulated and real datasets respectively, p < 0.05 for both). The accuracies of the classifications that used a stratified sample in validation were smaller, a result of cases of an abundant class being commissioned into a rarer class. Simple means to address the issue are suggested.
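The paper's central point can be reproduced with a toy experiment: when classes are imbalanced, overall accuracy estimated on a class-balanced (stratified) validation set differs from accuracy on a random sample that reflects class abundance. The class rates and the per-class accuracies of the hypothetical classifier below are assumptions.

```python
import random

random.seed(11)
# True labels: abundant class 0 (90%), rare class 1 (10%).
population = [0] * 9000 + [1] * 1000

def classify(label, rng):
    """Hypothetical classifier: 95% correct on class 0, 60% on class 1."""
    correct = rng.random() < (0.95 if label == 0 else 0.60)
    return label if correct else 1 - label

def accuracy(sample, rng):
    return sum(classify(y, rng) == y for y in sample) / len(sample)

rng = random.Random(2)
random_set = rng.sample(population, 1000)     # reflects class abundance
stratified_set = [0] * 500 + [1] * 500        # balanced classes

acc_random = accuracy(random_set, rng)
acc_stratified = accuracy(stratified_set, rng)
print(acc_random > acc_stratified)            # balanced set over-weights the weak rare class
```

The stratified validation set over-represents the rare, harder class, so its overall accuracy estimate is pessimistic relative to what the classifier achieves on abundance-weighted data.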

  14. National Coral Reef Monitoring Program: Benthic Images Collected from Stratified Random Sites (StRS) across Wake Island from 2014-03-16 to 2014-03-20 (NCEI Accession 0159157)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here are benthic habitat imagery that result from benthic photo-quadrat surveys conducted along transects at stratified random sites across Wake...

  15. A census-weighted, spatially-stratified household sampling strategy for urban malaria epidemiology

    Directory of Open Access Journals (Sweden)

    Slutsker Laurence

    2008-02-01

    Full Text Available Abstract. Background: Urban malaria is likely to become increasingly important as a consequence of the growing proportion of Africans living in cities. A novel sampling strategy was developed for urban areas to generate a sample simultaneously representative of population and inhabited environments. Such a strategy should facilitate analysis of important epidemiological relationships in this ecological context. Methods: Census maps and summary data for Kisumu, Kenya, were used to create a pseudo-sampling frame using the geographic coordinates of census-sampled structures. For every enumeration area (EA) designated as urban by the census (n = 535), a sample of structures equal to one-tenth the number of households was selected. In EAs designated as rural (n = 32), a geographically random sample totalling one-tenth the number of households was selected from a grid of points at 100 m intervals. The selected samples were cross-referenced to a geographic information system, and coordinates transferred to handheld global positioning units. Interviewers found the closest eligible household to the sampling point and interviewed the caregiver of a child aged ... Results: 4,336 interviews were completed in 473 of the 567 study area EAs from June 2002 through February 2003. EAs without completed interviews were randomly distributed, and non-response was approximately 2%. Mean distance from the assigned sampling point to the completed interview was 74.6 m, and was significantly less in urban than in rural EAs, even when controlling for the number of households. The selected sample had significantly more children and females of childbearing age than the general population, and fewer older individuals. Conclusion: This method selected a sample that was simultaneously population-representative and inclusive of important environmental variation. The use of a pseudo-sampling frame and pre-programmed handheld GPS units is more efficient and may yield a more complete sample than

  16. National Coral Reef Monitoring Program: Benthic Cover Derived from Analysis of Benthic Images Collected during Stratified Random Surveys (StRS) across American Samoa in 2015 (NCEI Accession 0157752)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here result from benthic photo-quadrat surveys conducted along transects at stratified random sites across American Samoa in 2015 as a part of...

  17. Methodology Series Module 5: Sampling Strategies.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'sampling method'. There are essentially two types of sampling methods: 1) probability sampling, based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling, based on the researcher's choice and the population that is accessible and available. Some of the non-probability sampling methods are purposive sampling, convenience sampling, and quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the chosen method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when a convenience sample was actually used). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
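
The distinction between the two probability-sampling designs mentioned here can be sketched in a few lines of Python (the strata and sizes are invented for illustration):

```python
import random

random.seed(1)

# Hypothetical sampling frame: 700 urban and 300 rural households.
frame = [{"id": i, "stratum": "urban" if i < 700 else "rural"}
         for i in range(1000)]

# Simple random sample: every unit has the same chance of selection,
# so the urban/rural split of the sample varies from draw to draw.
srs = random.sample(frame, 100)

# Stratified random sample: draw within each stratum, here in
# proportion to stratum size, so the split is fixed by design.
urban = [u for u in frame if u["stratum"] == "urban"]
rural = [u for u in frame if u["stratum"] == "rural"]
stratified = random.sample(urban, 70) + random.sample(rural, 30)

print(sum(u["stratum"] == "urban" for u in srs))         # varies around 70
print(sum(u["stratum"] == "urban" for u in stratified))  # exactly 70
```

Both designs are probability sampling, since every unit's selection probability is known; the stratified design simply removes the stratum-composition variability by construction.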

  18. Methodology series module 5: Sampling strategies

    Directory of Open Access Journals (Sweden)

    Maninder Singh Setia

    2016-01-01

    Full Text Available Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'sampling method'. There are essentially two types of sampling methods: 1) probability sampling, based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling, based on the researcher's choice and the population that is accessible and available. Some of the non-probability sampling methods are purposive sampling, convenience sampling, and quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the chosen method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when a convenience sample was actually used). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.

  19. Methodology Series Module 5: Sampling Strategies

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'sampling method'. There are essentially two types of sampling methods: 1) probability sampling, based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling, based on the researcher's choice and the population that is accessible and available. Some of the non-probability sampling methods are purposive sampling, convenience sampling, and quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the chosen method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when a convenience sample was actually used). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study. PMID:27688438

  20. National Coral Reef Monitoring Program: benthic images collected from stratified random sites (StRS) across the Hawaiian Archipelago from 2016-07-13 to 2016-09-27 (NCEI Accession 0164293)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here are benthic habitat imagery that result from benthic photo-quadrat surveys conducted along transects at stratified random sites across the...

  1. National Coral Reef Monitoring Program: Benthic Images Collected from Stratified Random Sites (StRS) across the Hawaiian Archipelago from 2013-05-01 to 2013-10-31 (NCEI Accession 0159144)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here are benthic habitat imagery that result from benthic photo-quadrat surveys conducted along transects at stratified random sites across the...

  2. National Coral Reef Monitoring Program: Benthic Images Collected from Stratified Random Sites (StRS) across the Mariana Archipelago from 2014-03-25 to 2014-05-07 (NCEI Accession 0159142)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here are benthic habitat imagery that result from benthic photo-quadrat surveys conducted along transects at stratified random sites across the...

  3. Prevalence and correlates of problematic smartphone use in a large random sample of Chinese undergraduates.

    Science.gov (United States)

    Long, Jiang; Liu, Tie-Qiao; Liao, Yan-Hui; Qi, Chang; He, Hao-Yu; Chen, Shu-Bao; Billieux, Joël

    2016-11-17

    Smartphones are becoming a daily necessity for most undergraduates in Mainland China. Because the present scenario of problematic smartphone use (PSU) is largely unexplored, in the current study we aimed to estimate the prevalence of PSU and to screen suitable predictors for PSU among Chinese undergraduates in the framework of the stress-coping theory. A sample of 1062 undergraduate smartphone users was recruited by means of a stratified cluster random sampling strategy between April and May 2015. The Problematic Cellular Phone Use Questionnaire was used to identify PSU. We evaluated five candidate risk factors for PSU by using logistic regression analysis while controlling for demographic characteristics and specific features of smartphone use. The prevalence of PSU among Chinese undergraduates was estimated to be 21.3%. The risk factors for PSU were majoring in the humanities, high monthly income from the family (≥1500 RMB), serious emotional symptoms, high perceived stress, and perfectionism-related factors (high doubts about actions, high parental expectations). PSU among undergraduates appears to be ubiquitous and thus constitutes a public health issue in Mainland China. Although further longitudinal studies are required to test whether PSU is a transient phenomenon or a chronic and progressive condition, our study successfully identified socio-demographic and psychological risk factors for PSU. These results, obtained from a random and thus representative sample of undergraduates, open up new avenues in terms of prevention and regulation policies.

  4. Assessment of fracture risk: value of random population-based samples--the Geelong Osteoporosis Study.

    Science.gov (United States)

    Henry, M J; Pasco, J A; Seeman, E; Nicholson, G C; Sanders, K M; Kotowicz, M A

    2001-01-01

    Fracture risk is determined by bone mineral density (BMD). The T-score, a measure of fracture risk, is the position of an individual's BMD in relation to a reference range. The aim of this study was to determine the magnitude of change in the T-score when different sampling techniques were used to produce the reference range. Reference ranges were derived from three samples, drawn from the same region: (1) an age-stratified population-based random sample, (2) unselected volunteers, and (3) a selected healthy subset of the population-based sample with no diseases or drugs known to affect bone. T-scores were calculated using the three reference ranges for a cohort of women who had sustained a fracture and as a group had a low mean BMD (ages 35-72 yr; n = 484). For most comparisons, the T-scores for the fracture cohort were more negative using the population reference range. The difference in T-scores reached 1.0 SD. The proportion of the fracture cohort classified as having osteoporosis at the spine was 26, 14, and 23% when the population, volunteer, and healthy reference ranges were applied, respectively. The use of inappropriate reference ranges results in substantial changes to T-scores and may lead to inappropriate management.
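
The dependence of the T-score on the chosen reference range is easy to make concrete. In the sketch below the reference means and SDs are hypothetical (the abstract does not report the study's actual values); the point is only that the same BMD can fall on either side of the usual osteoporosis threshold of T ≤ -2.5 depending on the reference range used:

```python
# Hypothetical young-adult reference ranges (mean BMD and SD, g/cm^2);
# the actual Geelong Osteoporosis Study values are not given in the abstract.
REFERENCES = {
    "population-based": {"mean": 1.05, "sd": 0.10},
    "volunteer":        {"mean": 0.95, "sd": 0.12},
}

def t_score(bmd, ref):
    """Position of an individual's BMD within a reference range, in SDs."""
    return (bmd - ref["mean"]) / ref["sd"]

bmd = 0.75
for name, ref in REFERENCES.items():
    t = t_score(bmd, ref)
    status = "osteoporosis" if t <= -2.5 else "not osteoporotic"
    print(f"{name}: T = {t:.2f} ({status})")
```

With these illustrative numbers the population-based range gives T = -3.0 (osteoporosis) while the volunteer range gives T ≈ -1.67 for the same measurement, mirroring the classification shifts the study reports.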

  5. National Coral Reef Monitoring Program: Benthic Cover Derived from Analysis of Benthic Images Collected during Stratified Random Surveys (StRS) across the Hawaiian Archipelago in 2013 (NCEI Accession 0159140)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here result from benthic photo-quadrat surveys conducted along transects at stratified random sites across the Hawaiian archipelago in 2013 as a...

  6. Spatial Sampling of Weather Data for Regional Crop Yield Simulations

    Science.gov (United States)

    Van Bussel, Lenny G. J.; Ewert, Frank; Zhao, Gang; Hoffmann, Holger; Enders, Andreas; Wallach, Daniel; Asseng, Senthold; Baigorria, Guillermo A.; Basso, Bruno; Biernath, Christian; hide

    2016-01-01

    Field-scale crop models are increasingly applied at spatio-temporal scales that range from regions to the globe and from decades up to 100 years. Sufficiently detailed data to capture the prevailing spatio-temporal heterogeneity in weather, soil, and management conditions, as needed by crop models, are rarely available. Effective sampling may overcome the problem of missing data but has rarely been investigated. In this study the effect of sampling weather data was evaluated for simulating yields of winter wheat in a region of Germany over a 30-year period (1982-2011) using 12 process-based crop models. A stratified sampling was applied to compare the effect of different sizes of spatially sampled weather data (10, 30, 50, 100, 500, 1000, and full coverage of 34,078 sampling points) on simulated wheat yields. Stratified sampling was further compared with random sampling. Possible interactions between sample size and crop model were evaluated. The results showed differences in simulated yields among crop models, but all models reproduced the stratification pattern well. Importantly, the regional mean of simulated yields based on full coverage could already be reproduced by a small sample of 10 points. This was also true for reproducing the temporal variability in simulated yields, but more sampling points (about 100) were required to accurately reproduce spatial yield variability. The number of sampling points can be smaller when stratified sampling is applied compared with random sampling. However, differences between crop models were observed, including some interaction between the effect of sampling on simulated yields and the model used. We concluded that stratified sampling can considerably reduce the number of required simulations, but differences between crop models must be considered, as the choice of a specific model can have larger effects on simulated yields than the sampling strategy.
Assessing the impact of sampling soil and crop management
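
The variance-reduction argument behind the stratified design can be illustrated with a toy version of the weather-sampling problem (three invented yield zones stand in for real weather strata; all numbers are illustrative):

```python
import random
import statistics

random.seed(2)

# Hypothetical region: three yield zones (strata) of different size and
# mean yield (t/ha).
zones = [(4000, 6.0), (4000, 8.0), (2000, 10.0)]  # (number of cells, mean)
cells = [(z, random.gauss(mu, 0.5))
         for z, (n, mu) in enumerate(zones) for _ in range(n)]
total = len(cells)
by_zone = {z: [y for s, y in cells if s == z] for z in range(len(zones))}

def srs_estimate(k):
    """Regional mean yield from k cells drawn by simple random sampling."""
    return statistics.mean(y for _, y in random.sample(cells, k))

def stratified_estimate(k):
    """Regional mean yield from k cells allocated to strata proportionally."""
    est = 0.0
    for z, (n, _) in enumerate(zones):
        kz = max(1, round(k * n / total))
        est += (n / total) * statistics.mean(random.sample(by_zone[z], kz))
    return est

# Spread of the two estimators over repeated 10-point samples.
srs_sd = statistics.stdev(srs_estimate(10) for _ in range(200))
strat_sd = statistics.stdev(stratified_estimate(10) for _ in range(200))
print(srs_sd, strat_sd)  # stratified spread is markedly smaller
```

Because stratification removes the between-zone component of the sampling variance, fewer points are needed for a given precision, which is the effect the study reports for its weather strata.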

  7. Sampling large random knots in a confined space

    International Nuclear Information System (INIS)

    Arsuaga, J; Blackstone, T; Diao, Y; Hinson, K; Karadayi, E; Saito, M

    2007-01-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  8. Sampling large random knots in a confined space

    Science.gov (United States)

    Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.

    2007-09-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.
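
A sketch of the uniform random polygon model is simple to write down: draw n vertices independently and uniformly from the unit cube, join them in order, and close the loop. Counting the crossings of a 2D projection (degenerate collinear cases are ignored below, which is safe since they occur with probability zero) shows the roughly quadratic growth in n described above:

```python
import random

random.seed(7)

def uniform_random_polygon(n):
    """Closed polygon whose n vertices are i.i.d. uniform in the unit cube."""
    return [(random.random(), random.random(), random.random())
            for _ in range(n)]

def crossings_2d(poly):
    """Number of crossings in the xy-projection, over non-adjacent edges."""
    n = len(poly)
    pts = [(x, y) for x, y, _ in poly]
    edges = [(pts[i], pts[(i + 1) % n]) for i in range(n)]

    def orient(p, q, r):
        v = (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])
        return (v > 0) - (v < 0)

    def cross(a, b, c, d):
        return (orient(a, b, c) != orient(a, b, d)
                and orient(c, d, a) != orient(c, d, b))

    return sum(cross(*edges[i], *edges[j])
               for i in range(n)
               for j in range(i + 2, n)
               if not (i == 0 and j == n - 1))  # skip the adjacent pair (0, n-1)

print(crossings_2d(uniform_random_polygon(10)))
print(crossings_2d(uniform_random_polygon(100)))  # far larger: crossings grow like n^2
```

Each pair of non-adjacent projected edges crosses with a fixed probability, so the expected crossing count scales with the number of pairs, i.e. O(n^2), consistent with the result stated in the abstract.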

  9. Sampling large random knots in a confined space

    Energy Technology Data Exchange (ETDEWEB)

    Arsuaga, J [Department of Mathematics, San Francisco State University, 1600 Holloway Ave, San Francisco, CA 94132 (United States); Blackstone, T [Department of Computer Science, San Francisco State University, 1600 Holloway Ave., San Francisco, CA 94132 (United States); Diao, Y [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Hinson, K [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Karadayi, E [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States); Saito, M [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States)

    2007-09-28

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  10. A user's guide to LHS: Sandia's Latin Hypercube Sampling Software

    Energy Technology Data Exchange (ETDEWEB)

    Wyss, G.D.; Jorgensen, K.H. [Sandia National Labs., Albuquerque, NM (United States). Risk Assessment and Systems Modeling Dept.

    1998-02-01

    This document is a reference guide for LHS, Sandia's Latin Hypercube Sampling Software. This software has been developed to generate either Latin hypercube or random multivariate samples. The Latin hypercube technique employs a constrained sampling scheme, whereas random sampling corresponds to a simple Monte Carlo technique. The present program replaces the previous Latin hypercube sampling program developed at Sandia National Laboratories (SAND83-2365). This manual covers the theory behind stratified sampling as well as use of the LHS code both with the Windows graphical user interface and in the stand-alone mode.
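
The constrained scheme that distinguishes Latin hypercube sampling from simple Monte Carlo can be sketched in a few lines (this is the generic algorithm, not the LHS code's own implementation):

```python
import random

random.seed(3)

def latin_hypercube(n, dims):
    """n points in [0, 1)^dims: each axis is cut into n equal strata,
    each stratum is used exactly once per axis, and strata are paired
    across axes by independent random permutations."""
    pts = [[0.0] * dims for _ in range(n)]
    for d in range(dims):
        order = list(range(n))
        random.shuffle(order)
        for i in range(n):
            # One uniform draw inside stratum order[i] of axis d.
            pts[i][d] = (order[i] + random.random()) / n
    return pts

pts = latin_hypercube(5, 2)
for d in range(2):
    # Every point occupies a distinct fifth of each axis.
    print(sorted(int(p[d] * 5) for p in pts))  # [0, 1, 2, 3, 4]
```

Plain Monte Carlo would draw each coordinate independently, so some strata would usually be hit twice and others missed; the Latin hypercube constraint guarantees full marginal coverage with the same n.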

  11. Stratified B-trees and versioning dictionaries

    OpenAIRE

    Twigg, Andy; Byde, Andrew; Milos, Grzegorz; Moreton, Tim; Wilkes, John; Wilkie, Tom

    2011-01-01

    A classic versioned data structure in storage and computer science is the copy-on-write (CoW) B-tree -- it underlies many of today's file systems and databases, including WAFL, ZFS, Btrfs and more. Unfortunately, it doesn't inherit the B-tree's optimality properties; it has poor space utilization, cannot offer fast updates, and relies on random IO to scale. Yet, nothing better has been developed since. We describe the 'stratified B-tree', which beats all known semi-external memory versioned B...

  12. Efficient sampling of complex network with modified random walk strategies

    Science.gov (United States)

    Xie, Yunya; Chang, Shuhua; Zhang, Zhipeng; Zhang, Mi; Yang, Lei

    2018-02-01

    We present two novel random walk strategies, the choosing seed node (CSN) random walk and the no-retracing (NR) random walk. Different from classical random walk sampling, the CSN and NR strategies focus on the influence of the seed node choice and of path overlap, respectively. The three random walk samplings are applied in the Erdős-Rényi (ER), Barabási-Albert (BA), Watts-Strogatz (WS), and weighted USAir networks, and the major properties of the sampled subnets, such as sampling efficiency, degree distribution, average degree, and average clustering coefficient, are studied. Similar conclusions are reached with all three random walk strategies. Firstly, networks with small scales and simple structures are conducive to sampling. Secondly, the average degree and the average clustering coefficient of the sampled subnet tend to the corresponding values of the original networks within limited steps. Thirdly, all the degree distributions of the subnets are slightly biased toward the high-degree side. However, the NR strategy performs better for the average clustering coefficient of the subnet. In the real weighted USAir network, salient characteristics such as the larger clustering coefficient and the fluctuation of the degree distribution are reproduced well by these random walk strategies.
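
The no-retracing idea is easy to sketch: the walker never immediately returns along the edge it just used, falling back to retracing only at a dead end. The exact NR rule in the paper may differ from this sketch; the toy graph here is a 5-cycle:

```python
import random

random.seed(4)

def nr_random_walk(adj, seed_node, steps):
    """Random walk that avoids stepping straight back to the previous
    node, reducing path overlap relative to a classical random walk."""
    walk = [seed_node]
    prev = None
    for _ in range(steps):
        cur = walk[-1]
        # Exclude the previous node unless it is the only neighbour.
        options = [v for v in adj[cur] if v != prev] or adj[cur]
        prev = cur
        walk.append(random.choice(options))
    return walk

# Toy network: a cycle of 5 nodes, each adjacent to its two neighbours.
adj = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}
walk = nr_random_walk(adj, 0, 10)
print(len(set(walk)))  # 5: on a cycle the walk keeps moving forward,
                       # so it visits every node
```

A classical random walk on the same cycle would often oscillate between two nodes; the no-retracing rule converts those wasted steps into coverage of new nodes.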

  13. RandomSpot: A web-based tool for systematic random sampling of virtual slides.

    Science.gov (United States)

    Wright, Alexander I; Grabsch, Heike I; Treanor, Darren E

    2015-01-01

    This paper describes work presented at the Nordic Symposium on Digital Pathology 2014, Linköping, Sweden. Systematic random sampling (SRS) is a stereological tool, which provides a framework to quickly build an accurate estimation of the distribution of objects or classes within an image, whilst minimizing the number of observations required. RandomSpot is a web-based tool for SRS in stereology, which systematically places equidistant points within a given region of interest on a virtual slide. Each point can then be visually inspected by a pathologist in order to generate an unbiased sample of the distribution of classes within the tissue. Further measurements can then be derived from the distribution, such as the ratio of tumor to stroma. RandomSpot replicates the fundamental principle of traditional light microscope grid-shaped graticules, with the added benefits associated with virtual slides, such as facilitated collaboration and automated navigation between points. Once the sample points have been added to the region(s) of interest, users can download the annotations and view them locally using their virtual slide viewing software. Since its introduction, RandomSpot has been used extensively for international collaborative projects, clinical trials and independent research projects. So far, the system has been used to generate over 21,000 sample sets, and has been used to generate data for use in multiple publications, identifying significant new prognostic markers in colorectal, upper gastro-intestinal and breast cancer. Data generated using RandomSpot also has significant value for training image analysis algorithms using sample point coordinates and pathologist classifications.
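
The core of the grid placement described above can be sketched as follows (a generic systematic-random grid, not RandomSpot's actual code): the spacing is fixed, and a single uniform random offset positions the whole grid, so every location in the region of interest has the same chance of receiving a point:

```python
import random

random.seed(5)

def systematic_random_points(x0, y0, width, height, spacing):
    """Equidistant sample points over a rectangular region of interest,
    shifted by one uniform random offset (systematic random sampling)."""
    ox = random.uniform(0, spacing)
    oy = random.uniform(0, spacing)
    points = []
    y = y0 + oy
    while y < y0 + height:
        x = x0 + ox
        while x < x0 + width:
            points.append((x, y))
            x += spacing
        y += spacing
    return points

# E.g. a 1000 x 800 unit region of interest at 100-unit spacing.
pts = systematic_random_points(0, 0, 1000, 800, 100)
print(len(pts))  # 10 columns x 8 rows = 80 points
```

Each point would then be classified by a pathologist, and ratios such as tumor to stroma estimated from the class counts, exactly as with a physical graticule.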

  14. Effects of a random spatial variation of the plasma density on the mode conversion in cold, unmagnetized, and stratified plasmas

    Energy Technology Data Exchange (ETDEWEB)

    Jung Yu, Dae [School of Space Research, Kyung Hee University, Yongin 446-701 (Korea, Republic of); Kim, Kihong [Department of Energy Systems Research, Ajou University, Suwon 443-749 (Korea, Republic of)

    2013-12-15

    We study the effects of a random spatial variation of the plasma density on the mode conversion of electromagnetic waves into electrostatic oscillations in cold, unmagnetized, and stratified plasmas. Using the invariant imbedding method, we calculate precisely the electromagnetic field distribution and the mode conversion coefficient, which is defined to be the fraction of the incident wave power converted into electrostatic oscillations, for the configuration where a numerically generated random density variation is added to the background linear density profile. We repeat similar calculations for a large number of random configurations and take an average of the results. We obtain a peculiar nonmonotonic dependence of the mode conversion coefficient on the strength of randomness. As the disorder increases from zero, the maximum value of the mode conversion coefficient decreases initially, then increases to a maximum, and finally decreases towards zero. The range of the incident angle in which mode conversion occurs increases monotonically as the disorder increases. We present numerical results suggesting that the decrease of mode conversion mainly results from the increased reflection due to the Anderson localization effect originating from disorder, whereas the increase of mode conversion in the intermediate disorder regime comes from the appearance of many resonance points and the enhanced tunneling between the resonance points and the cutoff point. We also find a very large local enhancement of the magnetic field intensity for particular random configurations. In order to obtain high mode conversion efficiency, it is desirable to restrict the randomness close to the resonance region.

  15. Effects of a random spatial variation of the plasma density on the mode conversion in cold, unmagnetized, and stratified plasmas

    International Nuclear Information System (INIS)

    Jung Yu, Dae; Kim, Kihong

    2013-01-01

    We study the effects of a random spatial variation of the plasma density on the mode conversion of electromagnetic waves into electrostatic oscillations in cold, unmagnetized, and stratified plasmas. Using the invariant imbedding method, we calculate precisely the electromagnetic field distribution and the mode conversion coefficient, which is defined to be the fraction of the incident wave power converted into electrostatic oscillations, for the configuration where a numerically generated random density variation is added to the background linear density profile. We repeat similar calculations for a large number of random configurations and take an average of the results. We obtain a peculiar nonmonotonic dependence of the mode conversion coefficient on the strength of randomness. As the disorder increases from zero, the maximum value of the mode conversion coefficient decreases initially, then increases to a maximum, and finally decreases towards zero. The range of the incident angle in which mode conversion occurs increases monotonically as the disorder increases. We present numerical results suggesting that the decrease of mode conversion mainly results from the increased reflection due to the Anderson localization effect originating from disorder, whereas the increase of mode conversion in the intermediate disorder regime comes from the appearance of many resonance points and the enhanced tunneling between the resonance points and the cutoff point. We also find a very large local enhancement of the magnetic field intensity for particular random configurations. In order to obtain high mode conversion efficiency, it is desirable to restrict the randomness close to the resonance region.

  16. Acceptance sampling using judgmental and randomly selected samples

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

    We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high- and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail and discuss the conditions necessary to ensure that the required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
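
A minimal Bayesian version of the zero-failure inference (a single homogeneous group with a uniform Beta(1, 1) prior, far simpler than the paper's two-group judgmental model) makes the flavour of the calculation concrete:

```python
# With a Beta(1, 1) prior on the unacceptable fraction and n randomly
# sampled items all observed acceptable, the posterior is Beta(1, 1 + n),
# whose CDF at f is 1 - (1 - f)^(n + 1).

def prob_unacceptable_below(f, n_all_acceptable):
    """Posterior probability that the unacceptable fraction is below f."""
    return 1.0 - (1.0 - f) ** (n_all_acceptable + 1)

# Smallest all-acceptable sample giving 95% credibility that fewer
# than 1% of the items are unacceptable:
n = 0
while prob_unacceptable_below(0.01, n) < 0.95:
    n += 1
print(n)  # 298
```

The paper's two-group model refines this by treating expert-identified high-risk items separately from the randomly sampled low-risk remainder, rather than pooling everything into one homogeneous population.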

  17. Development of a sampling strategy and sample size calculation to estimate the distribution of mammographic breast density in Korean women.

    Science.gov (United States)

    Jun, Jae Kwan; Kim, Mi Jin; Choi, Kui Son; Suh, Mina; Jung, Kyu-Won

    2012-01-01

    Mammographic breast density is a known risk factor for breast cancer. To conduct a survey to estimate the distribution of mammographic breast density in Korean women, appropriate sampling strategies for representative and efficient sampling design were evaluated through simulation. Using the target population from the National Cancer Screening Programme (NCSP) for breast cancer in 2009, we verified the distribution estimate by repeating the simulation 1,000 times using stratified random sampling to investigate the distribution of breast density of 1,340,362 women. According to the simulation results, using a sampling design stratifying the nation into three groups (metropolitan, urban, and rural), with a total sample size of 4,000, we estimated the distribution of breast density in Korean women at a level of 0.01% tolerance. Based on the results of our study, a nationwide survey for estimating the distribution of mammographic breast density among Korean women can be conducted efficiently.

  18. Detection and monitoring of invasive exotic plants: a comparison of four sampling methods

    Science.gov (United States)

    Cynthia D. Huebner

    2007-01-01

    The ability to detect and monitor exotic invasive plants is likely to vary depending on the sampling method employed. Methods with strong qualitative thoroughness for species detection often lack the intensity necessary to monitor vegetation change. Four sampling methods (systematic plot, stratified-random plot, modified Whittaker, and timed meander) in hemlock and red...

  19. The Risk-Stratified Osteoporosis Strategy Evaluation study (ROSE)

    DEFF Research Database (Denmark)

    Rubin, Katrine Hass; Holmberg, Teresa; Rothmann, Mette Juel

    2015-01-01

    The risk-stratified osteoporosis strategy evaluation study (ROSE) is a randomized prospective population-based study investigating the effectiveness of a two-step screening program for osteoporosis in women. This paper reports the study design and baseline characteristics of the study population. 35,000 women aged 65-80 years were selected at random from the population in the Region of Southern Denmark and, before inclusion, randomized to either a screening group or a control group. As the first step, a self-administered questionnaire regarding risk factors for osteoporosis, based on FRAX(®), was issued to both groups. As the second step, subjects in the screening group with a 10-year probability of major osteoporotic fractures ≥15% were offered a DXA scan. Patients diagnosed with osteoporosis from the DXA scan were advised to see their GP and discuss pharmaceutical treatment according to Danish national guidelines.

  20. Toward cost-efficient sampling methods

    Science.gov (United States)

    Luo, Peng; Li, Yongli; Wu, Chong; Zhang, Guijie

    2015-09-01

    Sampling methods have received much attention in the field of complex networks in general and statistical physics in particular. This paper proposes two new sampling methods based on the idea that a small subset of high-degree vertices can carry most of the structural information of a complex network. The two proposed methods are efficient at sampling high-degree nodes, so they remain useful even at low sampling rates, making them cost-efficient. The first new method builds on the widely used stratified random sampling (SRS) method, and the second improves the well-known snowball sampling (SBS) method. To demonstrate the validity and accuracy of the two new methods, we compare them with existing sampling methods on three commonly used simulated networks (a scale-free network, a random network, and a small-world network) as well as on two real networks. The experimental results illustrate that the two proposed methods perform much better than existing methods at recovering true network structure characteristics, as reflected by the clustering coefficient, Bonacich centrality and average path length, especially when the sampling rate is low.
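
    The core idea of favouring high-degree nodes can be illustrated with degree-weighted sampling. This is a rough sketch on a hypothetical star graph, not the authors' algorithms:

```python
import random

def degree_weighted_sample(adj, k, rng):
    """Draw k distinct nodes with probability proportional to node degree,
    so high-degree hubs are heavily over-represented in the sample."""
    nodes = list(adj)
    weights = [len(adj[v]) for v in nodes]
    chosen = set()
    while len(chosen) < k:
        chosen.add(rng.choices(nodes, weights=weights)[0])
    return chosen

# hypothetical star graph: one hub connected to ten leaves
adj = {"hub": set(range(10))}
for leaf in range(10):
    adj[leaf] = {"hub"}
sample = degree_weighted_sample(adj, 3, random.Random(0))
```

    Over repeated draws the hub appears in almost every sample while any given leaf appears rarely, which is the behaviour that lets such samplers capture network structure at low sampling rates.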

  1. National Coral Reef Monitoring Program: Benthic Cover Derived from Analysis of Benthic Images Collected during Stratified Random Surveys (StRS) across the Mariana Archipelago from 2014-03-25 to 2014-05-07 (NCEI Accession 0159148)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here result from benthic photo-quadrat surveys conducted along transects at stratified random sites across the Mariana archipelago in 2014 as a...

  2. National Coral Reef Monitoring Program: benthic cover derived from analysis of benthic images collected during stratified random surveys (StRS) across the Hawaiian Archipelago from 2016-07-13 to 2016-09-27 (NCEI Accession 0164295)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here result from benthic photo-quadrat surveys conducted along transects at stratified random sites across the Hawaiian archipelago in 2016 as a...

  3. Sampling Methods in Cardiovascular Nursing Research: An Overview.

    Science.gov (United States)

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie

    2014-01-01

    Cardiovascular nursing research covers a wide array of topics, from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling: simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion in the research study, and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences in sampling techniques may aid nurses in effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
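
    Three of the probability designs named above can be sketched in a few lines. The population, strata, and sample sizes here are invented for illustration:

```python
import random

rng = random.Random(42)
population = list(range(1, 101))      # 100 hypothetical participant IDs

# simple random sampling: every element equally likely, drawn without replacement
srs = rng.sample(population, 10)

# systematic sampling: a random start, then every k-th element
k = len(population) // 10
systematic = population[rng.randrange(k)::k]

# stratified sampling: independent simple random samples within each stratum
strata = {"under_65": population[:60], "65_plus": population[60:]}
stratified = {name: rng.sample(members, 5) for name, members in strata.items()}
```

    Cluster and multi-stage sampling follow the same pattern, except that whole groups (e.g. clinics) are drawn first and individuals are then sampled within the selected groups.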

  4. Rationale and Design of Khuzestan Vitamin D Deficiency Screening Program in Pregnancy: A Stratified Randomized Vitamin D Supplementation Controlled Trial.

    Science.gov (United States)

    Rostami, Maryam; Ramezani Tehrani, Fahimeh; Simbar, Masoumeh; Hosseinpanah, Farhad; Alavi Majd, Hamid

    2017-04-07

    Although there have been marked improvements in our understanding of vitamin D functions in different diseases, gaps on its role during pregnancy remain. Due to the lack of consensus on the most accurate marker of vitamin D deficiency during pregnancy and on the optimal level of 25-hydroxyvitamin D, 25(OH)D, for its definition, vitamin D deficiency assessment during pregnancy is a complicated process. Besides, the optimal protocol for treatment of hypovitaminosis D and its effect on maternal and neonatal outcomes are still unclear. The aim of our study was to estimate the prevalence of vitamin D deficiency in the first trimester of pregnancy and to compare a vitamin D screening strategy with no screening. Also, we intended to compare the effectiveness of various treatment regimens on maternal and neonatal outcomes in the Masjed-Soleyman and Shushtar cities of Khuzestan province, Iran. This was a two-phase study. First, a population-based cross-sectional study was conducted, recruiting 1600 and 900 first-trimester pregnant women from health centers of Masjed-Soleyman and Shushtar, respectively, using stratified multistage cluster sampling with the probability-proportional-to-size (PPS) method. Second, to assess the effect of the screening strategy on maternal and neonatal outcomes, Masjed-Soleyman participants were assigned to a screening program while Shushtar participants became the non-screening arm. Within the framework of the screening regimen, an 8-arm blind randomized clinical trial was undertaken to compare the effects of various treatment protocols. A total of 800 pregnant women with vitamin D deficiency were selected using simple random sampling from the 1600 individuals of Masjed-Soleyman as the interventional groups. Serum concentrations of 25(OH)D were classified as severely deficient, moderately deficient, or normal. Those with severe and moderate deficiency were randomly divided into 4 subgroups, received vitamin D3 according to protocol, and were followed until delivery. Data were analyzed...

  5. Unwilling or Unable to Cheat? Evidence from a Randomized Tax Audit Experiment in Denmark

    OpenAIRE

    Henrik J. Kleven; Martin B. Knudsen; Claus T. Kreiner; Søren Pedersen; Emmanuel Saez

    2010-01-01

    This paper analyzes a randomized tax enforcement experiment in Denmark. In the base year, a stratified and representative sample of over 40,000 individual income tax filers was selected for the experiment. Half of the tax filers were randomly selected to be thoroughly audited, while the rest were deliberately not audited. The following year, "threat-of-audit" letters were randomly assigned and sent to tax filers in both groups. Using comprehensive administrative tax data, we present four main...

  6. SOME SYSTEMATIC SAMPLING STRATEGIES USING MULTIPLE RANDOM STARTS

    OpenAIRE

    Sampath Sundaram; Ammani Sivaraman

    2010-01-01

    In this paper an attempt is made to extend linear systematic sampling using multiple random starts due to Gautschi (1957) for various types of systematic sampling schemes available in the literature, namely (i) Balanced Systematic Sampling (BSS) of Sethi (1965) and (ii) Modified Systematic Sampling (MSS) of Singh, Jindal, and Garg (1968). Further, the proposed methods were compared with the Yates corrected estimator developed with reference to Gautschi's linear systematic sampling...

  7. National Coral Reef Monitoring Program: Benthic Cover Derived from Analysis of Benthic Images Collected during Stratified Random Surveys (StRS) across the Pacific Remote Island Areas from 2015-01-26 to 2015-04-28 (NCEI Accession 0159165)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here result from benthic photo-quadrat surveys conducted along transects at stratified random sites across the Pacific Remote Island Areas since...

  8. Assessment of the effect of population and diary sampling methods on estimation of school-age children exposure to fine particles.

    Science.gov (United States)

    Che, W W; Frey, H Christopher; Lau, Alexis K H

    2014-12-01

    Population and diary sampling methods are employed in exposure models to sample simulated individuals and their daily activity on each simulation day. Different sampling methods may lead to variations in estimated human exposure. In this study, two population sampling methods (stratified-random and random-random) and three diary sampling methods (random resampling, diversity and autocorrelation, and Markov-chain cluster [MCC]) are evaluated. Their impacts on estimated children's exposure to ambient fine particulate matter (PM2.5) are quantified via case studies for children in Wake County, NC, for July 2002. The estimated mean daily average exposure is 12.9 μg/m(3) for simulated children using the stratified population sampling method, and 12.2 μg/m(3) using the random sampling method. These minor differences are caused by the random sampling among ages within census tracts. Among the three diary sampling methods, there are differences in the estimated number of individuals with multiple days of exposures exceeding a benchmark of concern of 25 μg/m(3) due to differences in how multiday longitudinal diaries are estimated. The MCC method is relatively more conservative. In the case studies evaluated here, the MCC method led to a 10% higher estimate of the number of individuals with repeated exposures exceeding the benchmark. The comparisons help to identify and contrast the capabilities of each method and to offer insight regarding implications of method choice. Exposure simulation results are robust to the two population sampling methods evaluated, and are sensitive to the choice of method for simulating longitudinal diaries, particularly when analyzing results for specific microenvironments or for exposures exceeding a benchmark of concern. © 2014 Society for Risk Analysis.

  9. Stratifying empiric risk of schizophrenia among first degree relatives using multiple predictors in two independent Indian samples.

    Science.gov (United States)

    Bhatia, Triptish; Gettig, Elizabeth A; Gottesman, Irving I; Berliner, Jonathan; Mishra, N N; Nimgaonkar, Vishwajit L; Deshpande, Smita N

    2016-12-01

    Schizophrenia (SZ) has an estimated heritability of 64-88%, with the higher values based on twin studies. Conventionally, family history of psychosis is the best individual-level predictor of risk, but reliable risk estimates are unavailable for Indian populations. Genetic, environmental, and epigenetic factors are equally important and should be considered when predicting risk in 'at risk' individuals. We aimed to estimate risk based on an Indian schizophrenia participant's family history combined with selected demographic factors. To incorporate variables in addition to family history, and to stratify risk, we constructed a regression equation that included demographic variables in addition to family history. The equation was tested in two independent Indian samples: (i) an initial sample of SZ participants (N=128) with one sibling or offspring; (ii) a second, independent sample consisting of multiply affected families (N=138 families, with two or more sibs/offspring affected with SZ). The overall estimated risk was 4.31±0.27 (mean±standard deviation). In the initial sample, there were 19 (14.8%) individuals in the high-risk group, 75 (58.6%) in the moderate-risk group and 34 (26.6%) in the above-average-risk group. In the validation sample, risks were distributed as: high (45%), moderate (38%) and above average (17%). Consistent risk estimates were obtained from both samples using the regression equation. Familial risk can be combined with demographic factors to estimate risk for SZ in India. If replicated, the proposed stratification of risk may be easier and more realistic for family members. Copyright © 2016. Published by Elsevier B.V.

  10. Exploring pseudo- and chaotic random Monte Carlo simulations

    Science.gov (United States)

    Blais, J. A. Rod; Zhang, Zhan

    2011-07-01

    Computer simulations are an increasingly important area of geoscience research and development. At the core of stochastic or Monte Carlo simulations are the random number sequences that are assumed to be distributed with specific characteristics. Computer-generated random numbers, uniformly distributed on (0, 1), can be very different depending on the selection of pseudo-random number (PRN) or chaotic random number (CRN) generators. In the evaluation of some definite integrals, the resulting error variances can even be of different orders of magnitude. Furthermore, practical techniques for variance reduction such as importance sampling and stratified sampling can be applied in most Monte Carlo simulations and significantly improve the results. A comparative analysis of these strategies has been carried out for computational applications in planar and spatial contexts. Based on these experiments, and on some practical examples of geodetic direct and inverse problems, conclusions and recommendations concerning their performance and general applicability are included.
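
    The variance-reduction effect of stratified sampling mentioned above is easy to demonstrate on a one-dimensional definite integral. This is a minimal illustration, not the authors' geodetic computations:

```python
import random

def mc_plain(f, n, rng):
    """Plain Monte Carlo estimate of the integral of f over [0, 1)."""
    return sum(f(rng.random()) for _ in range(n)) / n

def mc_stratified(f, n, rng):
    """Stratified estimate: one uniform draw inside each of n equal-width strata."""
    return sum(f((i + rng.random()) / n) for i in range(n)) / n

f = lambda x: x * x                   # true integral over [0, 1] is 1/3
rng = random.Random(1)
plain = mc_plain(f, 1_000, rng)
strat = mc_stratified(f, 1_000, rng)
```

    For a smooth integrand the stratified estimator's error shrinks far faster with n than the O(n^-1/2) rate of plain Monte Carlo, which is the "different orders of magnitude" effect the abstract alludes to.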

  11. Effectiveness of a two-step population-based osteoporosis screening program using FRAX: the randomized Risk-stratified Osteoporosis Strategy Evaluation (ROSE) study.

    Science.gov (United States)

    Rubin, K H; Rothmann, M J; Holmberg, T; Høiberg, M; Möller, S; Barkmann, R; Glüer, C C; Hermann, A P; Bech, M; Gram, J; Brixen, K

    2018-03-01

    The Risk-stratified Osteoporosis Strategy Evaluation (ROSE) study investigated the effectiveness of a two-step screening program for osteoporosis in women. We found no overall reduction in fractures from systematic screening compared to the current case-finding strategy. The group of moderate- to high-risk women, who accepted the invitation to DXA, seemed to benefit from the program. The purpose of the ROSE study was to investigate the effectiveness of a two-step population-based osteoporosis screening program using the Fracture Risk Assessment Tool (FRAX) derived from a self-administered questionnaire to select women for DXA scan. After the scanning, standard osteoporosis management according to Danish national guidelines was followed. Participants were randomized to either screening or control group, and randomization was stratified according to age and area of residence. Inclusion took place from February 2010 to November 2011. Participants received a self-administered questionnaire, and women in the screening group with a FRAX score ≥ 15% (major osteoporotic fractures) were invited to a DXA scan. Primary outcome was incident clinical fractures. Intention-to-treat analysis and two per-protocol analyses were performed. A total of 3416 fractures were observed during a median follow-up of 5 years. No significant differences were found in the intention-to-treat analyses with 34,229 women included aged 65-80 years. The per-protocol analyses showed a risk reduction in the group that underwent DXA scanning compared to women in the control group with a FRAX ≥ 15%, in regard to major osteoporotic fractures, hip fractures, and all fractures. The risk reduction was most pronounced for hip fractures (adjusted SHR 0.741, p = 0.007). Compared to an office-based case-finding strategy, the two-step systematic screening strategy had no overall effect on fracture incidence. The two-step strategy seemed, however, to be beneficial in the group of women who were

  12. Interactive Fuzzy Goal Programming approach in multi-response stratified sample surveys

    Directory of Open Access Journals (Sweden)

    Gupta Neha

    2016-01-01

    In this paper, we apply an Interactive Fuzzy Goal Programming (IFGP) approach with linear, exponential and hyperbolic membership functions, which focuses on maximizing the minimum membership values, to determine the preferred compromise solution for the multi-response stratified survey problem, formulated as a Multi-Objective Non-Linear Programming Problem (MONLPP). By linearizing the nonlinear objective functions at their individual optimum solutions, the problem is approximated by an Integer Linear Programming Problem (ILPP). A numerical example based on real data is given, and a comparison with some existing allocations, viz. Cochran's, Chatterjee's and Khowaja's compromise allocations, is made to demonstrate the utility of the approach.

  13. Comparison of sampling strategies for object-based classification of urban vegetation from Very High Resolution satellite images

    Science.gov (United States)

    Rougier, Simon; Puissant, Anne; Stumpf, André; Lachiche, Nicolas

    2016-09-01

    Vegetation monitoring is becoming a major issue in the urban environment due to the services vegetation provides, and it necessitates accurate and up-to-date mapping. Very High Resolution satellite images enable detailed mapping of urban tree and herbaceous vegetation. Several supervised classifications with statistical learning techniques have provided good results for the detection of urban vegetation, but they require a large amount of training data. In this context, this study investigates the performance of different sampling strategies in order to reduce the number of examples needed. Two window-based active learning algorithms from the state of the art are compared to a classical stratified random sampling, and a third strategy combining active learning and stratification is proposed. The efficiency of these strategies is evaluated on two medium-sized French cities, Strasbourg and Rennes, associated with different datasets. Results demonstrate that classical stratified random sampling can in some cases be just as effective as active learning methods, and that it should be used more frequently to evaluate new active learning methods. Moreover, the active learning strategies proposed in this work reduce the computational runtime by selecting multiple windows at each iteration without increasing the number of windows needed.

  14. Health plan auditing: 100-percent-of-claims vs. random-sample audits.

    Science.gov (United States)

    Sillup, George P; Klimberg, Ronald K

    2011-01-01

    The objective of this study was to examine the relative efficacy of two different methodologies for auditing self-funded medical claim expenses: 100-percent-of-claims auditing versus random-sampling auditing. Multiple data sets of claim errors or 'exceptions' from two Fortune-100 corporations were analysed and compared to 100 simulated audits of 300- and 400-claim random samples. Random-sample simulations failed to identify a significant number and amount of the errors that ranged from $200,000 to $750,000. These results suggest that health plan expenses of corporations could be significantly reduced if they audited 100% of claims and embraced a zero-defect approach.
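
    The intuition behind the audit finding, that a few-hundred-claim random sample misses most rare errors, can be reproduced with a toy simulation. All figures below (population size, error rate, error amounts) are hypothetical, not the corporations' data:

```python
import random

rng = random.Random(7)

# hypothetical claim population: 50,000 claims, 1% of which contain an error
errors = [0.0] * 50_000
for i in rng.sample(range(50_000), 500):
    errors[i] = rng.uniform(100, 2_000)        # error amount in dollars

full_audit_total = sum(errors)                 # what a 100%-of-claims audit finds

# a 300-claim random-sample audit sees only ~3 erroneous claims on average
audit_sample = rng.sample(errors, 300)
errors_found = sum(1 for e in audit_sample if e > 0)
```

    With an expected three erroneous claims in the sample, the extrapolated error total has enormous variance, which is why the simulated random-sample audits in the study failed to identify large error amounts.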

  15. Study of the therapeutic effects of a hippotherapy simulator in children with cerebral palsy: a stratified single-blind randomized controlled trial.

    Science.gov (United States)

    Herrero, Pablo; Gómez-Trullén, Eva M; Asensio, Angel; García, Elena; Casas, Roberto; Monserrat, Esther; Pandyan, Anand

    2012-12-01

    To investigate whether hippotherapy (when applied by a simulator) improves postural control and balance in children with cerebral palsy. Stratified single-blind randomized controlled trial with an independent assessor. Stratification was made by Gross Motor Function Classification System level, and allocation was concealed. Children between 4 and 18 years old with cerebral palsy. Participants were randomized to an intervention (simulator ON) or control (simulator OFF) group after informed consent was obtained. Treatment was provided once a week (15 minutes) for 10 weeks. The Gross Motor Function Measure (dimension B for balance and the Total Score) and the Sitting Assessment Scale were carried out at baseline (prior to randomization), end of intervention and 12 weeks after completing the intervention. Thirty-eight children participated. The groups were balanced at baseline. Sitting balance (measured by dimension B of the Gross Motor Function Measure) improved significantly in the treatment group (effect size = 0.36; 95% CI 0.01-0.71), and the effect size was greater in the severely disabled group (effect size = 0.80; 95% CI 0.13-1.47). The improvements in sitting balance were not maintained over the follow-up period. Changes in the total score of the Gross Motor Function Measure and the Sitting Assessment Scale were not significant. Hippotherapy with a simulator can improve sitting balance in children with cerebral palsy who have higher levels of disability (Gross Motor Function Classification System level V). However, this did not lead to a change in the overall function of these children.

  16. Information content of household-stratified epidemics

    Directory of Open Access Journals (Sweden)

    T.M. Kinyanjui

    2016-09-01

    Household structure is a key driver of many infectious diseases, as well as a natural target for interventions such as vaccination programs. Many theoretical and conceptual advances on household-stratified epidemic models are relatively recent, but have successfully managed to increase the applicability of such models to practical problems. To be of maximum realism and hence benefit, they require parameterisation from epidemiological data, and while household-stratified final size data has been the traditional source, increasingly time-series infection data from households are becoming available. This paper is concerned with the design of studies aimed at collecting time-series epidemic data in order to maximize the amount of information available to calibrate household models. A design decision involves a trade-off between the number of households to enrol and the sampling frequency. Two commonly used epidemiological study designs are considered: cross-sectional, where different households are sampled at every time point, and cohort, where the same households are followed over the course of the study period. The search for an optimal design uses Bayesian computationally intensive methods to explore the joint parameter-design space combined with the Shannon entropy of the posteriors to estimate the amount of information in each design. For the cross-sectional design, the amount of information increases with the sampling intensity, i.e., the designs with the highest number of time points have the most information. On the other hand, the cohort design often exhibits a trade-off between the number of households sampled and the intensity of follow-up. Our results broadly support the choices made in existing epidemiological data collection studies. Prospective problem-specific use of our computational methods can bring significant benefits in guiding future study designs.

  17. Information content of household-stratified epidemics.

    Science.gov (United States)

    Kinyanjui, T M; Pellis, L; House, T

    2016-09-01

    Household structure is a key driver of many infectious diseases, as well as a natural target for interventions such as vaccination programs. Many theoretical and conceptual advances on household-stratified epidemic models are relatively recent, but have successfully managed to increase the applicability of such models to practical problems. To be of maximum realism and hence benefit, they require parameterisation from epidemiological data, and while household-stratified final size data has been the traditional source, increasingly time-series infection data from households are becoming available. This paper is concerned with the design of studies aimed at collecting time-series epidemic data in order to maximize the amount of information available to calibrate household models. A design decision involves a trade-off between the number of households to enrol and the sampling frequency. Two commonly used epidemiological study designs are considered: cross-sectional, where different households are sampled at every time point, and cohort, where the same households are followed over the course of the study period. The search for an optimal design uses Bayesian computationally intensive methods to explore the joint parameter-design space combined with the Shannon entropy of the posteriors to estimate the amount of information in each design. For the cross-sectional design, the amount of information increases with the sampling intensity, i.e., the designs with the highest number of time points have the most information. On the other hand, the cohort design often exhibits a trade-off between the number of households sampled and the intensity of follow-up. Our results broadly support the choices made in existing epidemiological data collection studies. Prospective problem-specific use of our computational methods can bring significant benefits in guiding future study designs. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  18. Comparison of sampling strategies for tobacco retailer inspections to maximize coverage in vulnerable areas and minimize cost.

    Science.gov (United States)

    Lee, Joseph G L; Shook-Sa, Bonnie E; Bowling, J Michael; Ribisl, Kurt M

    2017-06-23

    In the United States, tens of thousands of inspections of tobacco retailers are conducted each year. Various sampling choices can reduce travel costs, emphasize enforcement in areas with greater non-compliance, and allow for comparability between states and over time. We sought to develop a model sampling strategy for state tobacco retailer inspections. Using a 2014 list of 10,161 North Carolina tobacco retailers, we compared results from simple random sampling; stratified sampling clustered at the ZIP-code level; and stratified sampling clustered at the census-tract level. We conducted a simulation of repeated sampling and compared the approaches on precision, coverage, and retailer dispersion. While maintaining an adequate design effect and statistical precision appropriate for a public health enforcement program, both the stratified, clustered ZIP- and tract-based approaches were feasible. Both ZIP and tract strategies yielded improvements over simple random sampling, with relative improvements, respectively, in average distance between retailers (reduced 5.0% and 1.9%), percent Black residents in sampled neighborhoods (increased 17.2% and 32.6%), percent Hispanic residents in sampled neighborhoods (reduced 2.2% and increased 18.3%), percentage of sampled retailers located near schools (increased 61.3% and 37.5%), and poverty rate in sampled neighborhoods (increased 14.0% and 38.2%). States can make retailer inspections more efficient and targeted with stratified, clustered sampling. Statistically appropriate sampling strategies like these should be considered by states, researchers, and the Food and Drug Administration to improve program impact and allow for comparisons over time and across states. The authors present a model tobacco retailer sampling strategy for promoting compliance and reducing costs that could be used by U.S. states and the Food and Drug Administration (FDA). The design is feasible to implement in North Carolina. Use of
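
    A two-stage stratified, clustered design like the one compared above can be sketched as follows. The retailer frame, strata, and cluster counts are hypothetical:

```python
import random

def stratified_cluster_sample(retailers, clusters_per_stratum, rng):
    """Stage 1: randomly pick clusters (e.g. ZIP codes) within each stratum;
    stage 2: take every retailer in the selected clusters."""
    by_stratum = {}
    for r in retailers:
        by_stratum.setdefault(r["stratum"], {}).setdefault(r["zip"], []).append(r)
    sample = []
    for clusters in by_stratum.values():
        for z in rng.sample(sorted(clusters), clusters_per_stratum):
            sample.extend(clusters[z])
    return sample

# hypothetical frame: 2 strata x 5 ZIP codes x 4 retailers each
frame = [{"stratum": s, "zip": f"{s}-{z}", "id": (s, z, i)}
         for s in ("urban", "rural") for z in range(5) for i in range(4)]
sample = stratified_cluster_sample(frame, 2, random.Random(0))
```

    Clustering keeps inspectors' travel within a few ZIP codes per stratum, which is where the cost savings in the study come from; stratification then controls which kinds of neighborhoods those clusters are drawn from.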

  19. Estimates of Inequality Indices Based on Simple Random, Ranked Set, and Systematic Sampling

    OpenAIRE

    Bansal, Pooja; Arora, Sangeeta; Mahajan, Kalpana K.

    2013-01-01

    The Gini index, Bonferroni index, and Absolute Lorenz index are popular indices of inequality, each highlighting different features of inequality measurement. In general, a simple random sampling procedure is used to estimate these inequality indices and to draw the related inference. Although the requirement that samples be drawn via simple random sampling makes calculations much simpler, this assumption is often violated in practice, as the data do not always yield simple random ...
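
    For reference, the Gini index under simple random sampling can be computed directly from its mean-absolute-difference definition. The Pareto income population below is invented for illustration:

```python
import random

def gini(x):
    """Gini index via the mean absolute difference: G = MAD / (2 * mean)."""
    n, mean = len(x), sum(x) / len(x)
    mad = sum(abs(a - b) for a in x for b in x) / (n * n)
    return mad / (2 * mean)

# estimating the index from a simple random sample of a skewed population
rng = random.Random(3)
incomes = [rng.paretovariate(2.0) for _ in range(10_000)]   # Pareto(2): Gini = 1/3
sample_gini = gini(rng.sample(incomes, 500))
```

    Ranked-set and systematic designs, as in the paper, replace `rng.sample` with their own selection schemes while the index computation itself is unchanged.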

  20. Employment status, inflation and suicidal behaviour: an analysis of a stratified sample in Italy.

    Science.gov (United States)

    Solano, Paola; Pizzorno, Enrico; Gallina, Anna M; Mattei, Chiara; Gabrielli, Filippo; Kayman, Joshua

    2012-09-01

    There is abundant empirical evidence of an excess risk of suicide among the unemployed, although few studies have investigated the influence of economic downturns on suicidal behaviours in an employment-status-stratified sample. We investigated how economic inflation affected suicidal behaviours according to employment status in Italy from 2001 to 2008. Data concerning economically active people were provided by the Italian Institute for Statistical Analysis and by the International Monetary Fund. The association between inflation and completed versus attempted suicide with respect to employment status was investigated in every year and quarter-year of the study time frame. We considered three occupational categories: the employed, the unemployed who were previously employed (PE), and the unemployed who had never worked (NE). The unemployed are at higher suicide risk than the employed. Among the PE, a significant association between inflation and suicide attempts was found, whereas no association was reported for completed suicides. No association with inflation was found for either completed or attempted suicides among the employed and the NE. Completed suicide in females is significantly associated with unemployment in every quarter-year. The reported vulnerability to suicidal behaviours among the PE as inflation rises underlines the need for effective support strategies for both genders in times of economic downturn.

  1. An alternative procedure for estimating the population mean in simple random sampling

    Directory of Open Access Journals (Sweden)

    Housila P. Singh

    2012-03-01

    This paper deals with the problem of estimating the finite population mean using auxiliary information in simple random sampling. First, we suggest a correction to the mean squared error of the estimator proposed by Gupta and Shabbir [On improvement in estimating the population mean in simple random sampling. Jour. Appl. Statist. 35(5) (2008), pp. 559-566]. We then propose a ratio-type estimator and study its properties in simple random sampling. Numerically, we show that the proposed class of estimators is more efficient than several known estimators, including the Gupta and Shabbir (2008) estimator.

  2. Systematic random sampling of the comet assay.

    Science.gov (United States)

    McArt, Darragh G; Wasson, Gillian R; McKerr, George; Saetzler, Kurt; Reed, Matt; Howard, C Vyvyan

    2009-07-01

    The comet assay is a technique used to quantify DNA damage and repair at a cellular level. In the assay, cells are embedded in agarose and the cellular content is stripped away leaving only the DNA trapped in an agarose cavity which can then be electrophoresed. The damaged DNA can enter the agarose and migrate while the undamaged DNA cannot and is retained. DNA damage is measured as the proportion of the migratory 'tail' DNA compared to the total DNA in the cell. The fundamental basis of these arbitrary values is obtained in the comet acquisition phase using fluorescence microscopy with a stoichiometric stain in tandem with image analysis software. Current methods deployed in such an acquisition are expected to be both objectively and randomly obtained. In this paper we examine the 'randomness' of the acquisition phase and suggest an alternative method that offers both objective and unbiased comet selection. In order to achieve this, we have adopted a survey sampling approach widely used in stereology, which offers a method of systematic random sampling (SRS). This is desirable as it offers an impartial and reproducible method of comet analysis that can be used both manually or automated. By making use of an unbiased sampling frame and using microscope verniers, we are able to increase the precision of estimates of DNA damage. Results obtained from a multiple-user pooled variation experiment showed that the SRS technique attained a lower variability than that of the traditional approach. The analysis of a single user with repetition experiment showed greater individual variances while not being detrimental to overall averages. This would suggest that the SRS method offers a better reflection of DNA damage for a given slide and also offers better user reproducibility.
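
    The core of systematic random sampling, one random start followed by a fixed stride through the sampling frame, can be sketched as follows. This is a generic illustration of SRS over an indexed frame, not the authors' microscope-vernier protocol; the frame size and sample size are invented.

```python
import random

def systematic_sample(n_items, n_wanted, rng=random):
    """Systematic random sampling: a single random start in the first
    interval, then a fixed stride across the whole frame."""
    step = n_items / n_wanted
    start = rng.uniform(0, step)
    return [int(start + k * step) for k in range(n_wanted)]

random.seed(42)
picks = systematic_sample(1000, 50)  # 50 positions out of a frame of 1000
```

    Every unit has the same inclusion probability, but unlike simple random sampling the selected positions are evenly spread over the frame, which is what makes the scheme attractive for slide scanning.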

  3. Correlated random sampling for multivariate normal and log-normal distributions

    International Nuclear Information System (INIS)

    Žerovnik, Gašper; Trkov, Andrej; Kodeli, Ivan A.

    2012-01-01

    A method for correlated random sampling is presented. Representative samples for multivariate normal or log-normal distribution can be produced. Furthermore, any combination of normally and log-normally distributed correlated variables may be sampled to any requested accuracy. Possible applications of the method include sampling of resonance parameters which are used for reactor calculations.
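
    The standard construction behind correlated normal and log-normal sampling, transforming iid standard normals by a Cholesky factor of the covariance and exponentiating for the log-normal case, can be sketched for two variables. This is a generic textbook sketch under an assumed 2x2 covariance, not the authors' algorithm.

```python
import math
import random

def cholesky2(c11, c12, c22):
    """Cholesky factor of a 2x2 covariance matrix (c11, c12, c22)."""
    l11 = math.sqrt(c11)
    l21 = c12 / l11
    l22 = math.sqrt(c22 - l21 * l21)
    return l11, l21, l22

def correlated_pair(mu1, mu2, cov, rng, lognormal=False):
    """Draw one correlated pair: transform two iid standard normals
    by the Cholesky factor; exponentiate for log-normal variables."""
    l11, l21, l22 = cholesky2(*cov)
    z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
    x1 = mu1 + l11 * z1
    x2 = mu2 + l21 * z1 + l22 * z2
    if lognormal:
        return math.exp(x1), math.exp(x2)
    return x1, x2

rng = random.Random(0)
# Unit variances with correlation 0.8
pairs = [correlated_pair(0.0, 0.0, (1.0, 0.8, 1.0), rng) for _ in range(20000)]
```

    The sample correlation of the generated pairs converges to the requested 0.8 as the number of draws grows.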

  4. Prevalence of Childhood Physical Abuse in a Representative Sample of College Students in Samsun, Turkey

    Science.gov (United States)

    Turla, Ahmet; Dundar, Cihad; Ozkanli, Caglar

    2010-01-01

    The main objective of this article is to obtain the prevalence of childhood physical abuse experiences in college students. This cross-sectional study was performed on a gender-stratified random sample of 988 participants studying at Ondokuz Mayis University, with self-reported anonymous questionnaires. It included questions on physical abuse in…

  5. Development and Assessment of an E-Learning Course on Breast Imaging for Radiographers: A Stratified Randomized Controlled Trial

    Science.gov (United States)

    Ventura, Sandra Rua; Ramos, Isabel; Rodrigues, Pedro Pereira

    2015-01-01

    Background: Mammography is considered the best imaging technique for breast cancer screening, and the radiographer plays an important role in its performance. Therefore, continuing education is critical to improving the performance of these professionals and thus providing better health care services. Objective: Our goal was to develop an e-learning course on breast imaging for radiographers, assessing its efficacy, effectiveness, and user satisfaction. Methods: A stratified randomized controlled trial was performed with radiographers and radiology students who already had mammography training, using pre- and post-knowledge tests and satisfaction questionnaires. The primary outcome was the improvement in test results (percentage of correct answers), using intention-to-treat and per-protocol analysis. Results: A total of 54 participants were assigned to the intervention (20 students plus 34 radiographers), with 53 controls (19+34). The intervention was completed by 40 participants (11+29), with 4 (2+2) discontinued interventions and 10 (7+3) lost to follow-up. Differences in the primary outcome were found between intervention and control: 21 versus 4 percentage points (pp). Conclusions: The e-learning course is effective, especially for radiographers, which highlights the need for continuing education. PMID:25560547

  6. Random sampling of evolution time space and Fourier transform processing

    International Nuclear Information System (INIS)

    Kazimierczuk, Krzysztof; Zawadzka, Anna; Kozminski, Wiktor; Zhukov, Igor

    2006-01-01

    Application of the Fourier transform for processing 3D NMR spectra with random sampling of the evolution time space is presented. The 2D FT is calculated for pairs of frequencies, instead of the conventional sequence of one-dimensional transforms. Signal-to-noise ratios and linewidths for different random distributions were investigated by simulations and experiments. The experimental examples include 3D HNCA, HNCACB and 15N-edited NOESY-HSQC spectra of a 13C,15N-labeled ubiquitin sample. The obtained results reveal the general applicability of the proposed method and a significant improvement of resolution in comparison with conventional spectra recorded in the same time.

  7. A random sampling procedure for anisotropic distributions

    International Nuclear Information System (INIS)

    Nagrajan, P.S.; Sethulakshmi, P.; Raghavendran, C.P.; Bhatia, D.P.

    1975-01-01

    A procedure is described for sampling the scattering angle of neutrons according to specified angular distribution data. The cosine of the scattering angle is written as a double Legendre expansion in the incident neutron energy and a random number. The coefficients of the expansion are given for C, N, O, Si, Ca, Fe and Pb; these elements are of interest in dosimetry and shielding. (author)

  8. Diagnostic accuracy of the STRATIFY clinical prediction rule for falls: A systematic review and meta-analysis

    LENUS (Irish Health Repository)

    Billington, Jennifer

    2012-08-07

    Background: The STRATIFY score is a clinical prediction rule (CPR) derived to assist clinicians to identify patients at risk of falling. The purpose of this systematic review and meta-analysis is to determine the overall diagnostic accuracy of the STRATIFY rule across a variety of clinical settings. Methods: A literature search was performed to identify all studies that validated the STRATIFY rule. The methodological quality of the studies was assessed using the Quality Assessment of Diagnostic Accuracy Studies tool. A STRATIFY score of ≥2 points was used to identify individuals at higher risk of falling. All included studies were combined using a bivariate random effects model to generate pooled sensitivity and specificity of STRATIFY at ≥2 points. Heterogeneity was assessed using the variance of logit transformed sensitivity and specificity. Results: Seventeen studies were included in our meta-analysis, incorporating 11,378 patients. At a score ≥2 points, the STRATIFY rule is more useful at ruling out falls in those classified as low risk, with a greater pooled sensitivity estimate (0.67, 95% CI 0.52–0.80) than specificity (0.57, 95% CI 0.45–0.69). The sensitivity analysis, which examined the performance of the rule in different settings and subgroups, also showed broadly comparable results, indicating that the STRATIFY rule performs in a similar manner across a variety of different 'at risk' patient groups in different clinical settings. Conclusion: This systematic review shows that the diagnostic accuracy of the STRATIFY rule is limited and it should not be used in isolation for identifying individuals at high risk of falls in clinical practice.

  9. The status of dental caries and related factors in a sample of Iranian adolescents

    DEFF Research Database (Denmark)

    Pakpour, Amir H.; Hidarnia, Alireza; Hajizadeh, Ebrahim

    2011-01-01

    Objective: To describe the status of dental caries in a sample of Iranian adolescents aged 14 to 18 years in Qazvin, and to identify caries-related factors affecting this group. Study design: Qazvin was divided into three zones according to socio-economic status. The sampling procedure used...... was a stratified cluster sampling technique, incorporating 3 stratified zones, for each of which a cluster of school children was recruited from randomly selected high schools. The adolescents agreed to participate in the study and to complete a questionnaire. Dental caries status was assessed in terms of decayed...... their teeth on a regular basis. Although the incidence of caries was found to be moderate, it was influenced by demographic factors such as age and gender in addition to socio-behavioral variables such as family income, the level of education attained by parents, and the frequency of dental brushing and flossing...

  10. Comparison of randomization techniques for clinical trials with data from the HOMERUS-trial

    NARCIS (Netherlands)

    Verberk, W. J.; Kroon, A. A.; Kessels, A. G. H.; Nelemans, P. J.; van Ree, J. W.; Lenders, J. W. M.; Thien, T.; Bakx, J. C.; van Montfrans, G. A.; Smit, A. J.; Beltman, F. W.; de Leeuw, P. W.

    2005-01-01

    Background. Several methods of randomization are available to create comparable intervention groups in a study. In the HOMERUS-trial, we compared the minimization procedure with a stratified and a non-stratified method of randomization in order to test which one is most appropriate for use in

  12. A study on the representative sampling survey for the inspection of the clearance level for the radioisotope waste

    International Nuclear Information System (INIS)

    Hong Joo Ahn; Se Chul Sohn; Kwang Yong Jee; Ju Youl Kim; In Koo Lee

    2007-01-01

    Utilization facilities for radioisotopes (RI) are increasing annually in South Korea; the total number was 2,723 as of December 31, 2005. The inspection of a clearance level is a very important problem for ensuring public confidence when releasing radioactive materials to the environment. Korean regulations for such a clearance are described in Notice No. 2001-30 of the Ministry of Science and Technology (MOST) and Notice No. 2002-67 of the Ministry of Commerce, Industry and Energy (MOCIE). Most unsealed sources in RI waste drums at a storage facility are low-level beta-emitters with short half-lives, so it is impossible to measure their inventories by a nondestructive analysis. Furthermore, RI wastes generated by hospitals, educational and research institutes, and industry form a heterogeneous, varied, irregular, and small-quantity waste stream. This study addresses a representative (master) sampling survey and analysis plan for RI wastes, because a complete enumeration of waste drums is impossible and not desirable in terms of cost and efficiency. The existing approaches to representative sampling include judgmental, simple random, stratified random, systematic grid, systematic random, composite, and adaptive sampling. A representative sampling plan may combine two or more of the above approaches depending on the type and distribution of a waste stream. Stratified random sampling (constrained randomization) proves adequate for the sampling design of RI waste with regard to half-life, surface dose, time of transfer to the storage facility, and type of waste. The developed sampling protocol includes estimating the number of drums within a waste stream, estimating the number of samples, and confirming the required number of samples. The statistical process control for a quality assurance plan includes control charts and a 95% upper control limit (UCL) to determine whether a clearance level is met. (authors)
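
    Two of the plan's building blocks can be sketched generically: allocating a total sample across strata in proportion to stratum size, and a 95% upper control limit on a measured mean. This is a hedged illustration using proportional allocation and a normal-approximation UCL; the stratum sizes are invented, and the protocol's exact formulas may differ.

```python
import math

def proportional_allocation(stratum_sizes, n_total):
    """Allocate a total sample across strata in proportion to size,
    with largest-remainder rounding so the counts sum to n_total."""
    total = sum(stratum_sizes)
    raw = [n_total * s / total for s in stratum_sizes]
    alloc = [int(r) for r in raw]
    order = sorted(range(len(raw)), key=lambda i: raw[i] - alloc[i], reverse=True)
    for i in order[: n_total - sum(alloc)]:
        alloc[i] += 1
    return alloc

def ucl95(values):
    """One-sided 95% upper confidence limit on the mean (normal approx.)."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return mean + 1.645 * math.sqrt(var / n)

# Hypothetical drum counts in three waste streams, 20 drums to sample
alloc = proportional_allocation([120, 60, 20], 20)
```

    A stream's clearance decision would then compare its 95% UCL against the clearance level rather than the raw sample mean.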

  13. Random sampling of elementary flux modes in large-scale metabolic networks.

    Science.gov (United States)

    Machado, Daniel; Soons, Zita; Patil, Kiran Raosaheb; Ferreira, Eugénio C; Rocha, Isabel

    2012-09-15

    The description of a metabolic network in terms of elementary (flux) modes (EMs) provides an important framework for metabolic pathway analysis. However, their application to large networks has been hampered by the combinatorial explosion in the number of modes. In this work, we develop a method for generating random samples of EMs without computing the whole set. Our algorithm is an adaptation of the canonical basis approach, where we add an additional filtering step which, at each iteration, selects a random subset of the new combinations of modes. In order to obtain an unbiased sample, all candidates are assigned the same probability of getting selected. This approach avoids the exponential growth of the number of modes during computation, thus generating a random sample of the complete set of EMs within reasonable time. We generated samples of different sizes for a metabolic network of Escherichia coli, and observed that they preserve several properties of the full EM set. It is also shown that EM sampling can be used for rational strain design. A well distributed sample, that is representative of the complete set of EMs, should be suitable to most EM-based methods for analysis and optimization of metabolic networks. Source code for a cross-platform implementation in Python is freely available at http://code.google.com/p/emsampler. dmachado@deb.uminho.pt Supplementary data are available at Bioinformatics online.

  14. Is a 'convenience' sample useful for estimating immunization coverage in a small population?

    Science.gov (United States)

    Weir, Jean E; Jones, Carrie

    2008-01-01

    Rapid survey methodologies are widely used for assessing immunization coverage in developing countries, approximating true stratified random sampling. Non-random ('convenience') sampling is not considered appropriate for estimating immunization coverage rates, but has the advantages of low cost and expediency. We assessed the validity of a convenience sample of children presenting to a travelling clinic by comparing the coverage rate in the convenience sample to the true coverage established by surveying each child in three villages in rural Papua New Guinea. The rate of DTP immunization coverage as estimated by the convenience sample was within 10% of the true coverage when the proportion of children in the sample was two-thirds or when only children over the age of one year were counted, but differed by 11% when the sample included only 53% of the children and when all eligible children were included. The convenience sample may be sufficiently accurate for reporting purposes and is useful for identifying areas of low coverage.

  15. Fast egg collection method greatly improves randomness of egg sampling in Drosophila melanogaster

    DEFF Research Database (Denmark)

    Schou, Mads Fristrup

    2013-01-01

    When obtaining samples for population genetic studies, it is essential that the sampling is random. For Drosophila, one of the crucial steps in sampling experimental flies is the collection of eggs. Here an egg collection method is presented, which randomizes the eggs in a water column...... and diminishes environmental variance. This method was compared with a traditional egg collection method where eggs are collected directly from the medium. Within each method the observed and expected standard deviations of egg-to-adult viability were compared, whereby the difference in the randomness...... and to obtain a representative collection of genotypes, the method presented here is strongly recommended when collecting eggs from Drosophila....

  16. Design and simulation of stratified probability digital receiver with application to the multipath communication

    Science.gov (United States)

    Deal, J. H.

    1975-01-01

    One approach to the problem of simplifying complex nonlinear filtering algorithms is through using stratified probability approximations where the continuous probability density functions of certain random variables are represented by discrete mass approximations. This technique is developed in this paper and used to simplify the filtering algorithms developed for the optimum receiver for signals corrupted by both additive and multiplicative noise.

  17. STATISTICAL LANDMARKS AND PRACTICAL ISSUES REGARDING THE USE OF SIMPLE RANDOM SAMPLING IN MARKET RESEARCHES

    Directory of Open Access Journals (Sweden)

    CODRUŢA DURA

    2010-01-01

    Full Text Available The sample represents a particular segment of the statistical population chosen to represent it as a whole. The representativeness of the sample determines the accuracy of estimations made on the basis of calculating the research indicators and the inferential statistics. The method of random sampling is part of the probabilistic methods which can be used within marketing research, and it is characterized by the fact that it imposes the requirement that each unit belonging to the statistical population should have an equal chance of being selected for the sampling process. When simple random sampling is to be rigorously put into practice, it is recommended to use the technique of random number tables in order to configure the sample which will provide the information that the marketer needs. The paper also details the practical procedure implemented in order to create a sample for a marketing research study by generating random numbers using the facilities offered by Microsoft Excel.
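
    The random-number technique the paper describes for Excel can be reproduced directly in Python: attach a random number to each unit of the frame, sort on it, and keep the first n. This is an illustrative sketch with a synthetic frame, not the paper's Excel procedure.

```python
import random

def simple_random_sample(frame, n, seed=None):
    """Simple random sample without replacement: give every unit a
    random key (as a random-number table or Excel's RAND() would),
    sort on the keys, and keep the first n units."""
    rng = random.Random(seed)
    keyed = sorted(frame, key=lambda _: rng.random())
    return keyed[:n]

respondents = [f"unit-{i:03d}" for i in range(500)]
sample = simple_random_sample(respondents, 40, seed=7)
```

    Every unit gets the same chance of selection, which is exactly the requirement the paper states for probabilistic sampling in marketing research.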

  18. CONSISTENCY UNDER SAMPLING OF EXPONENTIAL RANDOM GRAPH MODELS.

    Science.gov (United States)

    Shalizi, Cosma Rohilla; Rinaldo, Alessandro

    2013-04-01

    The growing availability of network data and of scientific interest in distributed systems has led to the rapid development of statistical models of network structure. Typically, however, these are models for the entire network, while the data consists only of a sampled sub-network. Parameters for the whole network, which is what is of interest, are estimated by applying the model to the sub-network. This assumes that the model is consistent under sampling , or, in terms of the theory of stochastic processes, that it defines a projective family. Focusing on the popular class of exponential random graph models (ERGMs), we show that this apparently trivial condition is in fact violated by many popular and scientifically appealing models, and that satisfying it drastically limits ERGM's expressive power. These results are actually special cases of more general results about exponential families of dependent random variables, which we also prove. Using such results, we offer easily checked conditions for the consistency of maximum likelihood estimation in ERGMs, and discuss some possible constructive responses.

  19. Sampling strategy for a large scale indoor radiation survey - a pilot project

    International Nuclear Information System (INIS)

    Strand, T.; Stranden, E.

    1986-01-01

    Optimisation of a stratified random sampling strategy for large scale indoor radiation surveys is discussed. It is based on the results from a small scale pilot project where variances in dose rates within different categories of houses were assessed. By selecting a predetermined precision level for the mean dose rate in a given region, the number of measurements needed can be optimised. The results of a pilot project in Norway are presented together with the development of the final sampling strategy for a planned large scale survey. (author)
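
    The optimisation the authors describe, fixing a precision level for the mean dose rate and deriving the number of measurements per house category, reduces under a normal approximation to n = (z·σ/d)² per stratum. A sketch with invented per-category standard deviations; the pilot project's actual variances and precision target are not given here.

```python
import math

def n_required(sigma, margin, z=1.96):
    """Measurements needed so the estimated mean lies within
    +/- margin of the true mean with ~95% confidence."""
    return math.ceil((z * sigma / margin) ** 2)

# Hypothetical dose-rate standard deviations (nGy/h) per house category
strata_sigma = {"wood": 15.0, "concrete": 25.0, "brick": 20.0}
plan = {cat: n_required(s, margin=5.0) for cat, s in strata_sigma.items()}
```

    Categories with larger within-stratum variance demand proportionally more measurements, which is what makes the pilot-project variance estimates the key input to the final survey design.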

  20. Using maximum entropy modeling for optimal selection of sampling sites for monitoring networks

    Science.gov (United States)

    Stohlgren, Thomas J.; Kumar, Sunil; Barnett, David T.; Evangelista, Paul H.

    2011-01-01

    Environmental monitoring programs must efficiently describe state shifts. We propose using maximum entropy modeling to select dissimilar sampling sites to capture environmental variability at low cost, and demonstrate a specific application: sample site selection for the Central Plains domain (453,490 km2) of the National Ecological Observatory Network (NEON). We relied on four environmental factors: mean annual temperature and precipitation, elevation, and vegetation type. A “sample site” was defined as a 20 km × 20 km area (equal to NEON’s airborne observation platform [AOP] footprint), within which each 1 km2 cell was evaluated for each environmental factor. After each model run, the most environmentally dissimilar site was selected from all potential sample sites. The iterative selection of eight sites captured approximately 80% of the environmental envelope of the domain, an improvement over stratified random sampling and simple random designs for sample site selection. This approach can be widely used for cost-efficient selection of survey and monitoring sites.
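
    The iterative "select the most environmentally dissimilar site next" loop can be approximated with a greedy farthest-point heuristic in standardized environment space. This is a simplification of the maximum-entropy machinery the paper uses; the sites and environmental vectors below are invented.

```python
def farthest_point_sites(sites, k):
    """Greedy 'most dissimilar next' selection: start from the first
    site, then repeatedly add the candidate whose minimum squared
    Euclidean distance to the already-chosen sites is largest."""
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    chosen = [0]
    while len(chosen) < k:
        best = max((i for i in range(len(sites)) if i not in chosen),
                   key=lambda i: min(d2(sites[i], sites[j]) for j in chosen))
        chosen.append(best)
    return chosen

# Toy standardized (temperature, precipitation, elevation) vectors
sites = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.1), (1.0, 1.0, 1.0),
         (0.9, 1.1, 1.0), (0.0, 1.0, 0.2)]
picked = farthest_point_sites(sites, 3)
```

    Each added site maximizes coverage of the environmental envelope, mirroring the paper's finding that a handful of deliberately dissimilar sites can outperform stratified random selection.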

  1. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    Science.gov (United States)

    Yashchuk, Valeriy V.; Conley, Raymond; Anderson, Erik H.; Barber, Samuel K.; Bouet, Nathalie; McKinney, Wayne R.; Takacs, Peter Z.; Voronov, Dmitriy L.

    2011-09-01

    Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested [1,2] and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.

  3. LOD score exclusion analyses for candidate genes using random population samples.

    Science.gov (United States)

    Deng, H W; Li, J; Recker, R R

    2001-05-01

    While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes with random population samples. We develop a LOD score approach for exclusion analyses of candidate genes with random population samples. Under this approach, specific genetic effects and inheritance models at candidate genes can be analysed and, if a LOD score is ≤ -2.0, the locus can be excluded from having an effect larger than that specified. Computer simulations show that, with sample sizes often employed in association studies, this approach has high power to exclude a gene from having moderate genetic effects. In contrast to regular association analyses, population admixture will not affect the robustness of our analyses; in fact, it renders our analyses more conservative and thus any significant exclusion result is robust. Our exclusion analysis complements association analysis for candidate genes in random population samples and is parallel to the exclusion mapping analyses that may be conducted in linkage analyses with pedigrees or relative pairs. The usefulness of the approach is demonstrated by an application to test the importance of vitamin D receptor and estrogen receptor genes underlying the differential risk to osteoporotic fractures.

  4. Monitoring and identification of spatiotemporal landscape changes in multiple remote sensing images by using a stratified conditional Latin hypercube sampling approach and geostatistical simulation.

    Science.gov (United States)

    Lin, Yu-Pin; Chu, Hone-Jay; Huang, Yu-Long; Tang, Chia-Hsi; Rouhani, Shahrokh

    2011-06-01

    This study develops a stratified conditional Latin hypercube sampling (scLHS) approach for multiple, remotely sensed, normalized difference vegetation index (NDVI) images. The objective is to sample, monitor, and delineate spatiotemporal landscape changes, including spatial heterogeneity and variability, in a given area. The scLHS approach, which is based on the variance quadtree technique (VQT) and the conditional Latin hypercube sampling (cLHS) method, selects samples in order to delineate landscape changes from multiple NDVI images. The images are then mapped for calibration and validation by using sequential Gaussian simulation (SGS) with the scLHS selected samples. Spatial statistical results indicate that in terms of their statistical distribution, spatial distribution, and spatial variation, the statistics and variograms of the scLHS samples resemble those of multiple NDVI images more closely than those of cLHS and VQT samples. Moreover, the accuracy of simulated NDVI images based on SGS with scLHS samples is significantly better than that of simulated NDVI images based on SGS with cLHS samples and VQT samples, respectively. However, the proposed approach efficiently monitors the spatial characteristics of landscape changes, including the statistics, spatial variability, and heterogeneity of NDVI images. In addition, SGS with the scLHS samples effectively reproduces spatial patterns and landscape changes in multiple NDVI images.
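
    The Latin hypercube idea at the core of cLHS and scLHS, cutting each axis into n equal strata and drawing exactly one point per stratum per axis, can be sketched in its plain form. This is unconditioned and not variance-stratified, unlike the paper's scLHS; it only illustrates the basic space-filling mechanism.

```python
import random

def latin_hypercube(n, dims, rng=random):
    """Plain Latin hypercube sample on [0,1)^dims: each axis is cut
    into n equal strata and every stratum is used exactly once."""
    cols = []
    for _ in range(dims):
        perm = list(range(n))
        rng.shuffle(perm)                       # random stratum order per axis
        cols.append([(p + rng.random()) / n for p in perm])
    return list(zip(*cols))

random.seed(3)
pts = latin_hypercube(10, 2)  # 10 points in 2D, one per marginal stratum
```

    The marginal stratification guarantees coverage of each variable's full range with far fewer points than a simple random design, which is why the cLHS family is attractive for sampling NDVI variability.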

  5. Generating Random Samples of a Given Size Using Social Security Numbers.

    Science.gov (United States)

    Erickson, Richard C.; Brauchle, Paul E.

    1984-01-01

    The purposes of this article are (1) to present a method by which social security numbers may be used to draw cluster samples of a predetermined size and (2) to describe procedures used to validate this method of drawing random samples. (JOW)
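
    A hedged sketch of the idea: key selection to the terminal digits of an identification number, so that a predetermined expected sample size falls out of a roughly uniform frame. The IDs below are synthetic six-digit numbers, not real Social Security numbers, and this is an illustration of the principle rather than the article's exact procedure.

```python
import random

def terminal_digit_sample(ids, sample_fraction, seed=None):
    """Keep units whose ID's last two digits fall in a randomly chosen
    set of endings; choosing 100 * sample_fraction of the 00-99 endings
    yields a predictable sample size when endings are uniform."""
    rng = random.Random(seed)
    k = round(100 * sample_fraction)
    kept = set(rng.sample(range(100), k))
    return [i for i in ids if i % 100 in kept]

ids = list(range(100000, 110000))   # 10,000 synthetic IDs
sample = terminal_digit_sample(ids, 0.10, seed=5)
```

    Because the terminal digits of such identifiers are effectively uniformly distributed, every unit has the same inclusion probability and the cluster sample behaves like a random one of predetermined size.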

  6. Random On-Board Pixel Sampling (ROPS) X-Ray Camera

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Zhehui [Los Alamos; Iaroshenko, O. [Los Alamos; Li, S. [Los Alamos; Liu, T. [Fermilab; Parab, N. [Argonne (main); Chen, W. W. [Purdue U.; Chu, P. [Los Alamos; Kenyon, G. [Los Alamos; Lipton, R. [Fermilab; Sun, K.-X. [Nevada U., Las Vegas

    2017-09-25

    Recent advances in compressed sensing theory and algorithms offer new possibilities for high-speed X-ray camera design. In many CMOS cameras, each pixel has an independent on-board circuit that includes an amplifier, noise rejection, signal shaper, an analog-to-digital converter (ADC), and optional in-pixel storage. When X-ray images are sparse, i.e., when one of the following cases is true: (a.) The number of pixels with true X-ray hits is much smaller than the total number of pixels; (b.) The X-ray information is redundant; or (c.) Some prior knowledge about the X-ray images exists, sparse sampling may be allowed. Here we first illustrate the feasibility of random on-board pixel sampling (ROPS) using an existing set of X-ray images, followed by a discussion about signal to noise as a function of pixel size. Next, we describe a possible circuit architecture to achieve random pixel access and in-pixel storage. The combination of a multilayer architecture, sparse on-chip sampling, and computational image techniques, is expected to facilitate the development and applications of high-speed X-ray camera technology.

  7. Efficient proportion ratio estimators for the population mean in stratified random sampling

    OpenAIRE

    Maulana, Devri; Adnan, Arisman; Sirait, Haposan

    2014-01-01

    In this article we review three proportion ratio estimators for the population mean in stratified random sampling, i.e. the traditional proportion ratio estimator, the proportion ratio estimator using the coefficient of regression, and the proportion ratio estimator using the coefficients of regression and kurtosis, as discussed by Singh and Audu [5]. The three estimators are biased, so the mean square error of each estimator is determined. Furthermore, these mean square errors are compared.

  8. Universal shift of the Brewster angle and disorder-enhanced delocalization of p waves in stratified random media.

    Science.gov (United States)

    Lee, Kwang Jin; Kim, Kihong

    2011-10-10

    We study theoretically the propagation and the Anderson localization of p-polarized electromagnetic waves incident obliquely on randomly stratified dielectric media with weak uncorrelated Gaussian disorder. Using the invariant imbedding method, we calculate the localization length and the disorder-averaged transmittance in a numerically precise manner. We find that the localization length takes an extremely large maximum value at some critical incident angle, which we call the generalized Brewster angle. The disorder-averaged transmittance also takes a maximum very close to one at the same incident angle. Even in the presence of an arbitrarily weak disorder, the generalized Brewster angle is found to be substantially different from the ordinary Brewster angle in uniform media. It is a rapidly increasing function of the average dielectric permittivity and approaches 90° when the average relative dielectric permittivity is slightly larger than two. We make a remarkable observation that the dependence of the generalized Brewster angle on the average dielectric permittivity is universal in the sense that it is independent of the strength of disorder. We also find, surprisingly, that when the average relative dielectric permittivity is less than one and the incident angle is larger than the generalized Brewster angle, both the localization length and the disorder-averaged transmittance increase substantially as the strength of disorder increases in a wide range of the disorder parameter. In other words, the Anderson localization of incident p waves can be weakened by disorder in a certain parameter regime.

  9. Misrepresenting random sampling? A systematic review of research papers in the Journal of Advanced Nursing.

    Science.gov (United States)

    Williamson, Graham R

    2003-11-01

    This paper discusses the theoretical limitations of the use of random sampling and probability theory in the production of a significance level (or P-value) in nursing research. Potential alternatives, in the form of randomization tests, are proposed. Research papers in nursing, medicine and psychology frequently misrepresent their statistical findings, as the P-values reported assume random sampling. In this systematic review of studies published between January 1995 and June 2002 in the Journal of Advanced Nursing, 89 (68%) studies broke this assumption because they used convenience samples or entire populations. As a result, some of the findings may be questionable. The key ideas of random sampling and probability theory for statistical testing (for generating a P-value) are outlined. The result of a systematic review of research papers published in the Journal of Advanced Nursing is then presented, showing how frequently random sampling appears to have been misrepresented. Useful alternative techniques that might overcome these limitations are then discussed. REVIEW LIMITATIONS: This review is limited in scope because it is applied to one journal, and so the findings cannot be generalized to other nursing journals or to nursing research in general. However, it is possible that other nursing journals are also publishing research articles based on the misrepresentation of random sampling. The review is also limited because in several of the articles the sampling method was not completely clearly stated, and in this circumstance a judgment has been made as to the sampling method employed, based on the indications given by author(s). Quantitative researchers in nursing should be very careful that the statistical techniques they use are appropriate for the design and sampling methods of their studies. If the techniques they employ are not appropriate, they run the risk of misinterpreting findings by using inappropriate, unrepresentative and biased samples.

  10. Smoothing the redshift distributions of random samples for the baryon acoustic oscillations: applications to the SDSS-III BOSS DR12 and QPM mock samples

    Science.gov (United States)

    Wang, Shao-Jiang; Guo, Qi; Cai, Rong-Gen

    2017-12-01

    We investigate the impact of different redshift distributions of random samples on the baryon acoustic oscillation (BAO) measurements of D_V(z) r_d^{fid}/r_d from the two-point correlation functions of galaxies in Data Release 12 of the Baryon Oscillation Spectroscopic Survey (BOSS). Large surveys, such as BOSS, usually assign redshifts to the random samples by randomly drawing values from the measured redshift distributions of the data, which necessarily introduces fiducial fluctuation signals into the random samples and weakens the BAO signal if the cosmic variance cannot be ignored. We propose a smooth function of redshift distribution that fits the data well to populate the random galaxy samples. The resulting cosmological parameters match the input parameters of the mock catalogue very well. The significance of the BAO signal improved by 0.33σ for a low-redshift sample and by 0.03σ for a constant-stellar-mass sample, though the absolute values do not change significantly. Given the precision of current cosmological parameter measurements, this approach should prove valuable for future improvements in galaxy clustering measurements.

  11. National-scale vegetation change across Britain; an analysis of sample-based surveillance data from the Countryside Surveys of 1990 and 1998

    NARCIS (Netherlands)

    Smart, S.M.; Clarke, R.T.; Poll, van de H.M.; Robertson, E.J.; Shield, E.R.; Bunce, R.G.H.; Maskell, L.C.

    2003-01-01

    Patterns of vegetation across Great Britain (GB) between 1990 and 1998 were quantified based on an analysis of plant species data from a total of 9596 fixed plots. Plots were established on a stratified random basis within 501 1 km sample squares located as part of the Countryside Survey of GB.

  12. The effect of surfactant on stratified and stratifying gas-liquid flows

    Science.gov (United States)

    Heiles, Baptiste; Zadrazil, Ivan; Matar, Omar

    2013-11-01

    We consider the dynamics of a stratified/stratifying gas-liquid flow in horizontal tubes. This flow regime is characterised by the thin liquid films that drain under gravity along the pipe interior, forming a pool at the bottom of the tube, and the formation of large-amplitude waves at the gas-liquid interface. This regime is also accompanied by the detachment of droplets from the interface and their entrainment into the gas phase. We carry out an experimental study involving axial- and radial-view photography of the flow, in the presence and absence of surfactant. We show that the effect of surfactant is to reduce significantly the average diameter of the entrained droplets, through a tip-streaming mechanism. We also highlight the influence of surfactant on the characteristics of the interfacial waves, and the pressure gradient that drives the flow. EPSRC Programme Grant EP/K003976/1.

  13. Electromagnetic waves in stratified media

    CERN Document Server

    Wait, James R; Fock, V A; Wait, J R

    2013-01-01

    International Series of Monographs in Electromagnetic Waves, Volume 3: Electromagnetic Waves in Stratified Media provides information pertinent to the electromagnetic waves in media whose properties differ in one particular direction. This book discusses the important feature of the waves that enables communications at global distances. Organized into 13 chapters, this volume begins with an overview of the general analysis for the electromagnetic response of a plane stratified medium comprising of any number of parallel homogeneous layers. This text then explains the reflection of electromagne

  14. Geostatistical Sampling Methods for Efficient Uncertainty Analysis in Flow and Transport Problems

    Science.gov (United States)

    Liodakis, Stylianos; Kyriakidis, Phaedon; Gaganis, Petros

    2015-04-01

    In hydrogeological applications involving flow and transport in heterogeneous porous media, the spatial distribution of hydraulic conductivity is often parameterized in terms of a lognormal random field based on a histogram and variogram model inferred from data and/or synthesized from relevant knowledge. Realizations of simulated conductivity fields are then generated using geostatistical simulation involving simple random (SR) sampling and are subsequently used as inputs to physically-based simulators of flow and transport in a Monte Carlo framework for evaluating the uncertainty in the spatial distribution of solute concentration due to the uncertainty in the spatial distribution of hydraulic conductivity [1]. Realistic uncertainty analysis, however, calls for a large number of simulated concentration fields and hence can become expensive in terms of both time and computer resources. A more efficient alternative to SR sampling is Latin hypercube (LH) sampling, a special case of stratified random sampling, which yields a more representative distribution of simulated attribute values with fewer realizations [2]. Here, the term "representative" implies realizations spanning efficiently the range of possible conductivity values corresponding to the lognormal random field. In this work we investigate the efficiency of alternative methods to classical LH sampling within the context of simulation of flow and transport in a heterogeneous porous medium. More precisely, we consider the stratified likelihood (SL) sampling method of [3], in which attribute realizations are generated using the polar simulation method by exploring the geometrical properties of the multivariate Gaussian distribution function. In addition, we propose a more efficient version of the above method, here termed minimum energy (ME) sampling, whereby a set of N representative conductivity realizations at M locations is constructed by: (i) generating a representative set of N points distributed on the
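For a single lognormal parameter, the stratification idea behind LH sampling can be illustrated in a few lines: split (0, 1) into n equal-probability strata, draw one uniform value per stratum, and push each through the inverse CDF. This is a generic sketch, not the authors' implementation:

```python
import math
import random
from statistics import NormalDist

def lh_lognormal(n, mu=0.0, sigma=1.0, seed=1):
    """Latin hypercube sample of size n from a lognormal distribution:
    one uniform draw inside each of the n equal-probability strata of
    (0, 1), mapped through the inverse CDF of the underlying normal."""
    rng = random.Random(seed)
    u = [(i + rng.random()) / n for i in range(n)]  # one point per stratum
    rng.shuffle(u)  # decouple stratum order from realization order
    nd = NormalDist(mu, sigma)
    return [math.exp(nd.inv_cdf(p)) for p in u]

sample = lh_lognormal(10)
```

Unlike simple random sampling, every probability decile is guaranteed to be represented exactly once, which is what makes the sample "representative" with few realizations.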

  15. Randomized branch sampling to estimate fruit production in Pecan trees cv. ‘Barton’

    Directory of Open Access Journals (Sweden)

    Filemom Manoel Mokochinski

    ABSTRACT: Sampling techniques to quantify the production of fruits are still very scarce and create a gap in crop development research. This study was conducted on a rural property in the county of Cachoeira do Sul - RS to estimate the efficiency of randomized branch sampling (RBS) in quantifying the production of pecan fruit at three different ages (5, 7, and 10 years). Two selection techniques were tested: the probability proportional to diameter (PPD) and the uniform probability (UP) techniques, which were performed on nine trees, three from each age, randomly chosen. The RBS underestimated fruit production for all ages, and its main drawback was the high sampling error (125.17% for PPD and 111.04% for UP). The UP was regarded as more efficient than the PPD, though both techniques estimated similar production and similar experimental errors. In conclusion, branch sampling was inaccurate for this case study, and new studies are required to produce estimates with smaller sampling errors.
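The mechanics of RBS can be sketched as follows: at each node a daughter branch is selected with a known probability (proportional to diameter for PPD, uniform for UP), and the terminal count is divided by the product of the selection probabilities along the sampled path. The branch names and diameters below are invented for illustration:

```python
import random

def select_branch(branches, rng, ppd=True):
    """One stage of randomized branch sampling: pick a daughter branch with
    probability proportional to diameter (PPD) or with uniform probability
    (UP). Returns the chosen branch name and its selection probability."""
    if ppd:
        total = sum(d for _, d in branches)
        r = rng.uniform(0.0, total)
        acc = 0.0
        for name, d in branches:
            acc += d
            if r <= acc:
                return name, d / total
        return branches[-1][0], branches[-1][1] / total  # floating-point guard
    i = rng.randrange(len(branches))
    return branches[i][0], 1.0 / len(branches)

def rbs_estimate(terminal_count, path_probs):
    """Unbiased whole-tree estimate: the observed count on the terminal
    branch divided by the product of selection probabilities along the path."""
    p = 1.0
    for q in path_probs:
        p *= q
    return terminal_count / p
```

For example, a branch reached with probabilities 0.75 and 0.5 that carries 12 fruits contributes an estimate of 12 / 0.375 = 32 fruits for the tree; the large sampling errors reported above reflect the variance of exactly this inflation step.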

  16. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sample selection by random number... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square... area created in accordance with paragraph (a) of this section, select two random numbers: one each for...

  17. Development and enrolee satisfaction with basic medical insurance in China: A systematic review and stratified cluster sampling survey.

    Science.gov (United States)

    Jing, Limei; Chen, Ru; Jing, Lisa; Qiao, Yun; Lou, Jiquan; Xu, Jing; Wang, Junwei; Chen, Wen; Sun, Xiaoming

    2017-07-01

    Basic Medical Insurance (BMI) has changed remarkably over time in China because of health reforms that aim to achieve universal coverage and better health care with adequate efforts by increasing subsidies, reimbursement, and benefits. In this paper, we present the development of BMI, including financing and operation, with a systematic review. Meanwhile, Pudong New Area in Shanghai was chosen as a typical BMI sample for its coverage and management; a stratified cluster sampling survey together with an ordinary logistic regression model was used for the analysis. Enrolee satisfaction and the factors associated with enrolee satisfaction with BMI were analysed. We found that the reenrolling rate superficially improved the BMI coverage and nearly achieved universal coverage. However, BMI funds still faced the dual problems of fund deficits and under-compensation of the insured, and a long-term strategy is needed to realize the integration of BMI schemes with more homogeneous coverage and benefits. Moreover, Urban Resident Basic Medical Insurance participants reported a higher rate of dissatisfaction than other participants. The key predictors of the enrolees' satisfaction were awareness of the premium and compensation, affordability of out-of-pocket costs, and the proportion of reimbursement. These results highlight the importance of the Chinese government taking measures, such as strengthening BMI fund management, exploring mixed payment methods, and regulating sequential medical orders, to develop an integrated medical insurance system of universal coverage and vertical equity while simultaneously improving enrolee satisfaction. Copyright © 2017 John Wiley & Sons, Ltd.

  18. Analysis of Turbulent Combustion in Simplified Stratified Charge Conditions

    Science.gov (United States)

    Moriyoshi, Yasuo; Morikawa, Hideaki; Komatsu, Eiji

    The stratified charge combustion system has been widely studied due to its significant potential for low fuel consumption and low exhaust gas emissions. The fuel-air mixture formation process in a direct-injection stratified charge engine is influenced by various parameters, such as atomization, evaporation, and in-cylinder gas motion at high temperature and high pressure. It is difficult to observe the in-cylinder phenomena under such conditions and also challenging to analyze the subsequent stratified charge combustion. Therefore, combustion phenomena under simplified stratified charge conditions were examined, with the aim of analyzing fundamental stratified charge combustion. That is, an experimental apparatus which can control the mixture distribution and the gas motion at ignition timing was developed, and the effects of turbulence intensity, mixture concentration distribution, and mixture composition on stratified charge combustion were examined. As a result, the effects of fuel, charge stratification, and turbulence on combustion characteristics were clarified.

  19. Development and assessment of an e-learning course on breast imaging for radiographers: a stratified randomized controlled trial.

    Science.gov (United States)

    Moreira, Inês C; Ventura, Sandra Rua; Ramos, Isabel; Rodrigues, Pedro Pereira

    2015-01-05

    Mammography is considered the best imaging technique for breast cancer screening, and the radiographer plays an important role in its performance. Therefore, continuing education is critical to improving the performance of these professionals and thus providing better health care services. Our goal was to develop an e-learning course on breast imaging for radiographers, assessing its efficacy, effectiveness, and user satisfaction. A stratified randomized controlled trial was performed with radiographers and radiology students who already had mammography training, using pre- and post-knowledge tests, and satisfaction questionnaires. The primary outcome was the improvement in test results (percentage of correct answers), using intention-to-treat and per-protocol analysis. A total of 54 participants were assigned to the intervention (20 students plus 34 radiographers) with 53 controls (19+34). The intervention was completed by 40 participants (11+29), with 4 (2+2) discontinued interventions, and 10 (7+3) lost to follow-up. Differences in the primary outcome were found between intervention and control: 21 versus 4 percentage points (pp). The effect was present in radiographers (23 pp vs 4 pp; P=.004) but was unclear in students (18 pp vs 5 pp; P=.098). Nonetheless, differences in students' posttest results were found (88% vs 63%; P=.003), which were absent in pretest (63% vs 63%; P=.106). The per-protocol analysis showed a higher effect (26 pp vs 2 pp). The e-learning course is effective, especially for radiographers, which highlights the need for continuing education.

  20. Are most samples of animals systematically biased? Consistent individual trait differences bias samples despite random sampling.

    Science.gov (United States)

    Biro, Peter A

    2013-02-01

    Sampling animals from the wild for study is something nearly every biologist has done, but despite our best efforts to obtain random samples of animals, 'hidden' trait biases may still exist. For example, consistent behavioral traits can affect trappability/catchability, independent of obvious factors such as size and gender, and these traits are often correlated with other repeatable physiological and/or life history traits. If so, systematic sampling bias may exist for any of these traits. The extent to which this is a problem, of course, depends on the magnitude of bias, which is presently unknown because the underlying trait distributions in populations are usually unknown, or unknowable. Indeed, our present knowledge about sampling bias comes from samples (not complete population censuses), which can possess bias to begin with. I had the unique opportunity to create naturalized populations of fish by seeding each of four small fishless lakes with equal densities of slow-, intermediate-, and fast-growing fish. Using sampling methods that are not size-selective, I observed that fast-growing fish were up to two times more likely to be sampled than slower-growing fish. This indicates substantial and systematic bias with respect to an important life history trait (growth rate). If correlations between behavioral, physiological and life-history traits are as widespread as the literature suggests, then many animal samples may be systematically biased with respect to these traits (e.g., when collecting animals for laboratory use), and this may affect our inferences about population structure and abundance. I conclude with a discussion of ways to minimize sampling bias for particular physiological/behavioral/life-history types within animal populations.

  1. The relationships between sixteen perfluorinated compound concentrations in blood serum and food, and other parameters, in the general population of South Korea with proportionate stratified sampling method.

    Science.gov (United States)

    Kim, Hee-Young; Kim, Seung-Kyu; Kang, Dong-Mug; Hwang, Yong-Sik; Oh, Jeong-Eun

    2014-02-01

    Serum samples were collected from volunteers of various ages and both genders using a proportionate stratified sampling method, to assess the exposure of the general population in Busan, South Korea, to perfluorinated compounds (PFCs). Sixteen PFCs were investigated in serum samples from 306 adults (124 males and 182 females) and in one-day composite diet samples (breakfast, lunch, and dinner) from 20 of the serum donors, to investigate the relationship between food and serum PFC concentrations. Perfluorooctanoic acid and perfluorooctanesulfonic acid were the dominant PFCs in the serum samples, with mean concentrations of 8.4 and 13 ng/mL, respectively. Perfluorotridecanoic acid was the dominant PFC in the composite food samples. We confirmed from the relationships between questionnaire results and the PFC concentrations in the serum samples that food is one of the important contributing factors to human exposure to PFCs. However, there were no correlations between the PFC concentrations in the one-day composite diet samples and the serum samples, because a one-day composite diet sample is not necessarily representative of a person's long-term diet and because of the small number of samples taken. Copyright © 2013 Elsevier B.V. All rights reserved.

  2. A simple and efficient alternative to implementing systematic random sampling in stereological designs without a motorized microscope stage.

    Science.gov (United States)

    Melvin, Neal R; Poda, Daniel; Sutherland, Robert J

    2007-10-01

    When properly applied, stereology is a very robust and efficient method to quantify a variety of parameters from biological material. A common sampling strategy in stereology is systematic random sampling, which involves choosing a random start point outside the structure of interest and sampling relevant objects at sites placed at pre-determined, equidistant intervals. This has proven to be a very efficient sampling strategy and is used widely in stereological designs. At the microscopic level, this is most often achieved through the use of a motorized stage that facilitates the systematic random stepping across the structure of interest. Here, we report a simple, precise and cost-effective software-based alternative for accomplishing systematic random sampling under the microscope. We believe that this approach will facilitate the use of stereological designs that employ systematic random sampling in laboratories that lack the resources to acquire costly, fully automated systems.
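The placement rule behind systematic random sampling is simple enough to state in code: one random start in [0, step), then equidistant sites across the extent; a 2D stage grid is the Cartesian product of two such axes. A generic sketch (the extent and step values are illustrative, not taken from the authors' software):

```python
import random

def systematic_random_sites(extent, step, seed=None):
    """Systematic random sampling along one axis: a single random start
    inside [0, step), then sites at equidistant intervals over the extent."""
    rng = random.Random(seed)
    start = rng.uniform(0.0, step)
    sites = []
    x = start
    while x < extent:
        sites.append(x)
        x += step
    return sites

# e.g. sampling sites across a 100-unit-wide section with a 10-unit step
sites = systematic_random_sites(100.0, 10.0, seed=0)
```

The random start is what makes every location in the structure equally likely to be sampled, while the fixed step keeps the sites evenly spread.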

  3. Accounting for Sampling Error in Genetic Eigenvalues Using Random Matrix Theory.

    Science.gov (United States)

    Sztepanacz, Jacqueline L; Blows, Mark W

    2017-07-01

    The distribution of genetic variance in multivariate phenotypes is characterized by the empirical spectral distribution of the eigenvalues of the genetic covariance matrix. Empirical estimates of genetic eigenvalues from random effects linear models are known to be overdispersed by sampling error, where large eigenvalues are biased upward, and small eigenvalues are biased downward. The overdispersion of the leading eigenvalues of sample covariance matrices has been demonstrated to conform to the Tracy-Widom (TW) distribution. Here we show that genetic eigenvalues estimated using restricted maximum likelihood (REML) in a multivariate random effects model with an unconstrained genetic covariance structure will also conform to the TW distribution after empirical scaling and centering. However, where estimation procedures using either REML or MCMC impose boundary constraints, the resulting genetic eigenvalues tend not to be TW distributed. We show how using confidence intervals from sampling distributions of genetic eigenvalues without reference to the TW distribution is insufficient protection against mistaking sampling error as genetic variance, particularly when eigenvalues are small. By scaling such sampling distributions to the appropriate TW distribution, the critical value of the TW statistic can be used to determine if the magnitude of a genetic eigenvalue exceeds the sampling error for each eigenvalue in the spectral distribution of a given genetic covariance matrix. Copyright © 2017 by the Genetics Society of America.

  4. Grain distinct stratified nanolayers in aluminium alloys

    Energy Technology Data Exchange (ETDEWEB)

    Donatus, U., E-mail: uyimedonatus@yahoo.com [School of Materials, The University of Manchester, Manchester, M13 9PL, England (United Kingdom); Thompson, G.E.; Zhou, X.; Alias, J. [School of Materials, The University of Manchester, Manchester, M13 9PL, England (United Kingdom); Tsai, I.-L. [Oxford Instruments NanoAnalysis, HP12 2SE, High Wycombe (United Kingdom)

    2017-02-15

    The grains of aluminium alloys have stratified nanolayers which determine their mechanical and chemical responses. In this study, the nanolayers were revealed in the grains of AA6082 (T6 and T7 conditions), AA5083-O and AA2024-T3 alloys by etching the alloys in a solution comprising 20 g Cr₂O₃ + 30 ml HPO₃ in 1 L H₂O. Microstructural examination was conducted on selected grains of interest using scanning electron microscopy and the electron backscatter diffraction technique. It was observed that the nanolayers are orientation dependent and are parallel to the {100} planes. They have ordered and repeated tunnel squares that are flawed at the sides and aligned in the <100> directions. These flawed tunnel squares dictate the tunnelling corrosion morphology and appear to affect the arrangement and sizes of the precipitation hardening particles. The inclination of the stratified nanolayers, their interspacing, and the groove sizes have a significant influence on the corrosion behaviour and an apparent influence on the strengthening mechanism of the investigated aluminium alloys. - Highlights: • Stratified nanolayers in aluminium alloy grains. • Relationship of the stratified nanolayers with grain orientation. • Influence of the inclination of the stratified nanolayers on corrosion. • Influence of the nanolayer interspacing and groove sizes on hardness and corrosion.

  5. PREDOMINANTLY LOW METALLICITIES MEASURED IN A STRATIFIED SAMPLE OF LYMAN LIMIT SYSTEMS AT Z  = 3.7

    Energy Technology Data Exchange (ETDEWEB)

    Glidden, Ana; Cooper, Thomas J.; Simcoe, Robert A. [Massachusetts Institute of Technology, 77 Massachusetts Ave, Cambridge, MA 02139 (United States); Cooksey, Kathy L. [Department of Physics and Astronomy, University of Hawai‘i at Hilo, 200 West Kāwili Street, Hilo, HI 96720 (United States); O’Meara, John M., E-mail: aglidden@mit.edu, E-mail: tjcooper@mit.edu, E-mail: simcoe@space.mit.edu, E-mail: kcooksey@hawaii.edu, E-mail: jomeara@smcvt.edu [Department of Physics, Saint Michael’s College, One Winooski Park, Colchester, VT 05439 (United States)

    2016-12-20

    We measured metallicities for 33 z = 3.4–4.2 absorption line systems drawn from a sample of H I-selected Lyman limit systems (LLSs) identified in Sloan Digital Sky Survey (SDSS) quasar spectra and stratified based on metal line features. We obtained higher-resolution spectra with the Keck Echellette Spectrograph and Imager, selecting targets according to our stratification scheme in an effort to fully sample the LLS population metallicity distribution. We established a plausible range of H I column densities and measured column densities (or limits) for ions of carbon, silicon, and aluminum, finding ionization-corrected metallicities or upper limits. Interestingly, our ionization models were better constrained with enhanced α-to-aluminum abundances, with a median abundance ratio of [α/Al] = 0.3. Measured metallicities were generally low, ranging from [M/H] = −3 to −1.68, with even lower metallicities likely for some systems with upper limits. Using survival statistics to incorporate limits, we constructed the cumulative distribution function (CDF) for LLS metallicities. Recent models of galaxy evolution propose that galaxies replenish their gas from the low-metallicity intergalactic medium (IGM) via high-density H I “flows” and eject enriched interstellar gas via outflows. Thus, there has been some expectation that LLSs at the peak of cosmic star formation (z ≈ 3) might have a bimodal metallicity distribution. We modeled our CDF as a mix of two Gaussian distributions, one reflecting the metallicity of the IGM and the other representative of the interstellar medium of star-forming galaxies. This bimodal distribution yielded a poor fit. A single Gaussian distribution better represented the sample with a low mean metallicity of [M/H] ≈ −2.5.

  6. A random sampling approach for robust estimation of tissue-to-plasma ratio from extremely sparse data.

    Science.gov (United States)

    Chu, Hui-May; Ette, Ene I

    2005-09-02

    This study was performed to develop a new nonparametric approach for the estimation of robust tissue-to-plasma ratio from extremely sparsely sampled paired data (i.e., one sample each from plasma and tissue per subject). Tissue-to-plasma ratio was estimated from paired/unpaired experimental data using the independent time points approach, area under the curve (AUC) values calculated with the naïve data averaging approach, and AUC values calculated using sampling-based approaches (e.g., the pseudoprofile-based bootstrap [PpbB] approach and the random sampling approach [our proposed approach]). The random sampling approach involves the use of a 2-phase algorithm. The convergence of the sampling/resampling approaches was investigated, as well as the robustness of the estimates produced by different approaches. To evaluate the latter, new data sets were generated by introducing outlier(s) into the real data set. One to two concentration values were inflated by 10% to 40% from their original values to produce the outliers. Tissue-to-plasma ratios computed using the independent time points approach varied between 0 and 50 across time points. The ratio obtained from AUC values acquired using the naïve data averaging approach was not associated with any measure of uncertainty or variability. Calculating the ratio without regard to pairing yielded poorer estimates. The random sampling and pseudoprofile-based bootstrap approaches yielded tissue-to-plasma ratios with uncertainty and variability. However, the random sampling approach, because of the 2-phase nature of its algorithm, yielded more robust estimates and required fewer replications. Therefore, a 2-phase random sampling approach is proposed for the robust estimation of tissue-to-plasma ratio from extremely sparsely sampled data.

  7. SCRAED - Simple and Complex Random Assignment in Experimental Designs

    OpenAIRE

    Alferes, Valentim R.

    2009-01-01

    SCRAED is a package of 37 self-contained SPSS syntax files that perform simple and complex random assignment in experimental designs. For between-subjects designs, SCRAED includes simple random assignment (no restrictions, forced equal sizes, forced unequal sizes, and unequal probabilities), block random assignment (simple and generalized blocks), and stratified random assignment (no restrictions, forced equal sizes, forced unequal sizes, and unequal probabilities). For within-subject...

  8. The stratified H-index makes scientific impact transparent

    DEFF Research Database (Denmark)

    Würtz, Morten; Schmidt, Morten

    2017-01-01

    The H-index is widely used to quantify and standardize researchers' scientific impact. However, the H-index does not account for the fact that co-authors rarely contribute equally to a paper. Accordingly, we propose the use of a stratified H-index to measure scientific impact. The stratified H-index supplements the conventional H-index with three separate H-indices: one for first authorships, one for second authorships and one for last authorships. The stratified H-index takes scientific output, quality and individual author contribution into account.

  9. Random Sampling of Correlated Parameters – a Consistent Solution for Unfavourable Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Žerovnik, G., E-mail: gasper.zerovnik@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Trkov, A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Kodeli, I.A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Capote, R. [International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Smith, D.L. [Argonne National Laboratory, 1710 Avenida del Mundo, Coronado, CA 92118-3073 (United States)

    2015-01-15

    Two methods for random sampling according to a multivariate lognormal distribution – the correlated sampling method and the method of transformation of correlation coefficients – are briefly presented. The methods are mathematically exact and enable consistent sampling of correlated inherently positive parameters with given information on the first two distribution moments. Furthermore, a weighted sampling method to accelerate the convergence of parameters with extremely large relative uncertainties is described. However, the method is efficient only for a limited number of correlated parameters.
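The transformation mentioned above has a standard closed form: for lognormal variables with means m_i, relative uncertainties δ_i, and correlations ρ_ij, the underlying normal field has covariance s_ij = ln(1 + ρ_ij δ_i δ_j) and means ln(m_i) − s_ii/2. Below is a generic stdlib-only sketch of that mapping, not the authors' code; the numerical inputs are invented:

```python
import math
import random

def cholesky(a):
    """Lower-triangular Cholesky factor of a symmetric positive-definite matrix."""
    n = len(a)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(a[i][i] - s)
            else:
                L[i][j] = (a[i][j] - s) / L[j][j]
    return L

def sample_correlated_lognormal(means, cv, corr, n, seed=0):
    """Draw n vectors from a multivariate lognormal with the given means,
    relative uncertainties (cv) and correlations, via the exact mapping
    to the underlying normal space (transformed correlation coefficients)."""
    rng = random.Random(seed)
    d = len(means)
    # normal-space covariance: s_ij = ln(1 + rho_ij * cv_i * cv_j)
    cov = [[math.log(1.0 + corr[i][j] * cv[i] * cv[j]) for j in range(d)]
           for i in range(d)]
    mu = [math.log(means[i]) - 0.5 * cov[i][i] for i in range(d)]
    L = cholesky(cov)
    out = []
    for _ in range(n):
        z = [rng.gauss(0.0, 1.0) for _ in range(d)]
        out.append([math.exp(mu[i] + sum(L[i][k] * z[k] for k in range(d)))
                    for i in range(d)])
    return out
```

Because the mapping is exact, the sampled vectors are inherently positive and reproduce the prescribed means and covariances in the limit of many samples.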

  10. The Bootstrap, the Jackknife, and the Randomization Test: A Sampling Taxonomy.

    Science.gov (United States)

    Rodgers, J L

    1999-10-01

    A simple sampling taxonomy is defined that shows the differences between and relationships among the bootstrap, the jackknife, and the randomization test. Each method has as its goal the creation of an empirical sampling distribution that can be used to test statistical hypotheses, estimate standard errors, and/or create confidence intervals. Distinctions between the methods can be made based on the sampling approach (with replacement versus without replacement) and the sample size (replacing the whole original sample versus replacing a subset of the original sample). The taxonomy is useful for teaching the goals and purposes of resampling schemes. An extension of the taxonomy implies other possible resampling approaches that have not previously been considered. Univariate and multivariate examples are presented.
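The taxonomy's two axes (with vs. without replacement; whole sample vs. subset) map onto three tiny resamplers; a generic sketch, not tied to any particular study:

```python
import random

def bootstrap(data, rng):
    """With replacement, whole original size: the bootstrap."""
    return [rng.choice(data) for _ in data]

def jackknife(data, i):
    """Without replacement, a subset of size n-1: the delete-one jackknife."""
    return data[:i] + data[i + 1:]

def randomization_split(pooled, n1, rng):
    """Without replacement, the whole pooled sample: one random relabelling
    of group membership, as used in a randomization (permutation) test."""
    shuffled = pooled[:]
    rng.shuffle(shuffled)
    return shuffled[:n1], shuffled[n1:]
```

Repeating any of the three over many iterations and recomputing the statistic of interest builds the empirical sampling distribution the abstract describes.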

  11. Evaluation of sampling strategies to estimate crown biomass

    Directory of Open Access Journals (Sweden)

    Krishna P Poudel

    2015-01-01

    Background Depending on tree and site characteristics, crown biomass accounts for a significant portion of the total aboveground biomass of a tree. Crown biomass estimation is useful for different purposes, including evaluating the economic feasibility of crown utilization for energy production or forest products, fuel load assessments and fire management strategies, and wildfire modeling. However, crown biomass is difficult to predict because of the variability within and among species and sites. Thus the allometric equations used for predicting crown biomass should be based on data collected with precise and unbiased sampling strategies. In this study, we evaluate the performance of different sampling strategies for estimating crown biomass and the effect of sample size on those estimates. Methods Using data collected from 20 destructively sampled trees, we evaluated 11 different sampling strategies using six evaluation statistics: bias, relative bias, root mean square error (RMSE), relative RMSE, amount of biomass sampled, and relative biomass sampled. We also evaluated the performance of the selected sampling strategies when different numbers of branches (3, 6, 9, and 12) are selected from each tree. A tree-specific log-linear model with branch diameter and branch length as covariates was used to obtain individual branch biomass. Results Compared to all other methods, stratified sampling with the probability proportional to size estimation technique produced better results when three or six branches per tree were sampled. However, systematic sampling with the ratio estimation technique was the best when at least nine branches per tree were sampled. Under the stratified sampling strategy, selecting an unequal number of branches per stratum produced approximately similar results to simple random sampling, but it further decreased RMSE when information on branch diameter was used in the design and estimation phases. Conclusions Use of

  12. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality

    Science.gov (United States)

    Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Introduction Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in

  13. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Science.gov (United States)

    L'Engle, Kelly; Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. 
Random digit dialing of mobile

  14. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Directory of Open Access Journals (Sweden)

    Kelly L'Engle

    Full Text Available Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample.The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census.The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample.The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. 
Random digit

  15. Differences in Mathematics Teachers' Perceived Preparedness to Demonstrate Competence in Secondary School Mathematics Content by Teacher Characteristics

    Science.gov (United States)

    Ng'eno, J. K.; Chesimet, M. C.

    2016-01-01

    A sample of 300 mathematics teachers drawn from a population of 1500 participated in this study. The participants were selected using systematic random sampling and stratified random sampling (stratified by qualification and gender). The data was collected using self-report questionnaires for mathematics teachers. One tool was used to collect…
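
    A stratified random draw of the kind described (300 teachers from 1,500, stratified by qualification and gender) can be sketched with proportional allocation in stdlib Python. The record fields and the proportional-allocation choice are assumptions for illustration.

```python
import random

def stratified_sample(frame, strata_key, n_total, seed=42):
    """Proportional-allocation stratified random sample from a list of records."""
    rng = random.Random(seed)
    strata = {}
    for rec in frame:                      # group the frame into strata
        strata.setdefault(strata_key(rec), []).append(rec)
    N = len(frame)
    sample = []
    for members in strata.values():
        n_h = round(n_total * len(members) / N)   # proportional allocation
        sample.extend(rng.sample(members, min(n_h, len(members))))
    return sample
```

    With six equally sized strata of 250 (two genders by three qualifications), each stratum contributes 50 teachers to the sample of 300.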

  16. Designing Wood Supply Scenarios from Forest Inventories with Stratified Predictions

    Directory of Open Access Journals (Sweden)

    Philipp Kilham

    2018-02-01

    Full Text Available Forest growth and wood supply projections are increasingly used to estimate the future availability of woody biomass and the associated effects on forests and climate. This research parameterizes an inventory-based business-as-usual wood supply scenario with a stratified prediction, focusing on southwest Germany and the period 2002–2012. First, the Classification and Regression Trees algorithm groups the inventory plots into strata with corresponding harvest probabilities. Second, Random Forest algorithms generate individual harvest probabilities for the plots of each stratum. Third, the plots with the highest individual probabilities are selected as harvested until the harvest probability of the stratum is fulfilled. Fourth, the harvested volume of these plots is predicted with a linear regression model trained on harvested plots only. To illustrate the pros and cons of this method, it is compared to a direct harvested volume prediction with linear regression, and a combination of logistic regression and linear regression. Direct harvested volume regression predicts comparable volume figures, but generates these volumes in a way that differs from business-as-usual. The logistic model achieves higher overall classification accuracies, but results in underestimations or overestimations of harvest shares for several subsets of the data. The stratified prediction method balances this shortcoming, and can be of general use for forest growth and timber supply projections from large-scale forest inventories.

  17. Theory of sampling and its application in tissue based diagnosis

    Directory of Open Access Journals (Sweden)

    Kayser Gian

    2009-02-01

    Full Text Available Abstract Background A general theory of sampling and its application in tissue based diagnosis is presented. Sampling is defined as extraction of information from certain limited spaces and its transformation into a statement or measure that is valid for the entire (reference) space. The procedure should be reproducible in time and space, i.e. give the same results when applied under similar circumstances. Sampling includes two different aspects, the procedure of sample selection and the efficiency of its performance. The practical performance of sample selection focuses on the search for the localization of specific compartments within the basic space, and the search for the presence of specific compartments. Methods When a sampling procedure is applied in diagnostic processes, two different procedures can be distinguished: (I) the evaluation of the diagnostic significance of a certain object, which is the probability that the object can be grouped into a certain diagnosis, and (II) the probability to detect these basic units. Sampling can be performed without or with external knowledge, such as the size of the searched objects, neighbourhood conditions, spatial distribution of objects, etc. If the sample size is much larger than the object size, the application of a translation invariant transformation results in Krige's formula, which is widely used in the search for ores. Usually, sampling is performed in a series of area (space) selections of identical size. The size can be defined in relation to the reference space or according to interspatial relationship. The first method is called random sampling, the second stratified sampling. Results Random sampling does not require knowledge about the reference space, and is used to estimate the number and size of objects. Estimated features include area (volume) fraction and numerical, boundary and surface densities. Stratified sampling requires knowledge of the objects (and their features) and evaluates spatial features in relation to
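
    The random-versus-stratified contrast described above can be illustrated with a small area-fraction estimator in Python. The disc-shaped "object" inside a unit-square reference space is a toy stand-in for a tissue compartment; all names are illustrative.

```python
import random

def area_fraction_random(in_object, n, rng):
    """Random sampling: the fraction of uniformly random points landing in
    the object estimates its area fraction of the unit-square space."""
    hits = sum(in_object(rng.random(), rng.random()) for _ in range(n))
    return hits / n

def area_fraction_stratified(in_object, grid, rng):
    """Stratified sampling: one random point per cell of a grid x grid
    partition of the reference space."""
    hits = 0
    for i in range(grid):
        for j in range(grid):
            x = (i + rng.random()) / grid
            y = (j + rng.random()) / grid
            hits += in_object(x, y)
    return hits / (grid * grid)

# Toy object: a disc of radius 0.3 centred at (0.5, 0.5);
# its true area fraction is pi * 0.3**2, roughly 0.2827
disc = lambda x, y: (x - 0.5) ** 2 + (y - 0.5) ** 2 <= 0.09
```

    Both estimators are unbiased; stratification typically lowers the variance because every region of the reference space is forced to contribute points.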

  18. Estimation of Sensitive Proportion by Randomized Response Data in Successive Sampling

    Directory of Open Access Journals (Sweden)

    Bo Yu

    2015-01-01

    Full Text Available This paper considers the problem of estimating binomial proportions of sensitive or stigmatizing attributes in the population of interest. Randomized response techniques are suggested for protecting the privacy of respondents and reducing the response bias while eliciting information on sensitive attributes. In many sensitive question surveys, the same population is sampled repeatedly on each occasion. In this paper, we apply a successive sampling scheme to improve the estimation of the sensitive proportion on the current occasion.
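
    The abstract does not specify which randomized response design is used; as a hedged illustration, Warner's classic scheme and its unbiased proportion estimator can be sketched as follows (the scheme choice is an assumption, not the paper's design).

```python
import random

def warner_respond(truth, p, rng):
    """One randomized response under Warner's design: with probability p the
    respondent answers the sensitive question truthfully, otherwise answers
    its complement. truth=True means the respondent has the attribute."""
    return truth if rng.random() < p else (not truth)

def warner_estimate(answers, p):
    """Unbiased estimator of the sensitive proportion pi (requires p != 0.5).
    P(yes) = p*pi + (1-p)*(1-pi), so pi = (lambda - (1-p)) / (2p - 1)."""
    lam = sum(answers) / len(answers)   # observed 'yes' proportion
    return (lam - (1 - p)) / (2 * p - 1)
```

    Because each individual answer is masked by the randomization device, the interviewer learns nothing certain about any one respondent, yet the aggregate proportion remains estimable.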

  19. Adaptive importance sampling of random walks on continuous state spaces

    International Nuclear Information System (INIS)

    Baggerly, K.; Cox, D.; Picard, R.

    1998-01-01

    The authors consider adaptive importance sampling for a random walk with scoring in a general state space. Conditions under which exponential convergence occurs to the zero-variance solution are reviewed. These results generalize previous work for finite, discrete state spaces in Kollman (1993) and in Kollman, Baggerly, Cox, and Picard (1996). This paper is intended for nonstatisticians and includes considerable explanatory material

  20. Executive control resources and frequency of fatty food consumption: findings from an age-stratified community sample.

    Science.gov (United States)

    Hall, Peter A

    2012-03-01

    Fatty foods are regarded as highly appetitive, and self-control is often required to resist consumption. Executive control resources (ECRs) are potentially facilitative of self-control efforts, and therefore could predict success in the domain of dietary self-restraint. It is not currently known whether stronger ECRs facilitate resistance to fatty food consumption, and moreover, it is unknown whether such an effect would be stronger in some age groups than others. The purpose of the present study was to examine the association between ECRs and consumption of fatty foods among healthy community-dwelling adults across the adult life span. An age-stratified sample of individuals between 18 and 89 years of age attended two laboratory sessions. During the first session they completed two computer-administered tests of ECRs (Stroop and Go-NoGo) and a test of general cognitive function (Wechsler Abbreviated Scale of Intelligence); participants completed two consecutive 1-week recall measures to assess frequency of fatty and nonfatty food consumption. Regression analyses revealed that stronger ECRs were associated with lower frequency of fatty food consumption over the 2-week interval. This association was observed for both measures of ECR and a composite measure. The effect remained significant after adjustment for demographic variables (age, gender, socioeconomic status), general cognitive function, and body mass index. The observed effect of ECRs on fatty food consumption frequency was invariant across age group, and did not generalize to nonfatty food consumption. ECRs may be potentially important, though understudied, determinants of dietary behavior in adults across the life span.

  1. Precision of systematic and random sampling in clustered populations: habitat patches and aggregating organisms.

    Science.gov (United States)

    McGarvey, Richard; Burch, Paul; Matthews, Janet M

    2016-01-01

    Natural populations of plants and animals spatially cluster because (1) suitable habitat is patchy, and (2) within suitable habitat, individuals aggregate further into clusters of higher density. We compare the precision of random and systematic field sampling survey designs under these two processes of species clustering. Second, we evaluate the performance of 13 estimators for the variance of the sample mean from a systematic survey. Replicated simulated surveys, as counts from 100 transects, allocated either randomly or systematically within the study region, were used to estimate population density in six spatial point populations including habitat patches and Matérn circular clustered aggregations of organisms, together and in combination. The standard one-start aligned systematic survey design, a uniform 10 x 10 grid of transects, was much more precise. Variances of the 10 000 replicated systematic survey mean densities were one-third to one-fifth of those from randomly allocated transects, implying transect sample sizes giving equivalent precision by random survey would need to be three to five times larger. Organisms being restricted to patches of habitat was alone sufficient to yield this precision advantage for the systematic design. But this improved precision for systematic sampling in clustered populations is underestimated by standard variance estimators used to compute confidence intervals. True variance for the survey sample mean was computed from the variance of 10 000 simulated survey mean estimates. Testing 10 published and three newly proposed variance estimators, the two variance estimators (v) that corrected for inter-transect correlation (ν₈ and ν(W)) were the most accurate and also the most precise in clustered populations. These greatly outperformed the two "post-stratification" variance estimators (ν₂ and ν₃) that are now more commonly applied in systematic surveys. Similar variance estimator performance rankings were found with
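
    A toy stdlib-Python simulation (patch layout and parameters invented for illustration, much simpler than the paper's Matérn setup) reproduces the qualitative finding that one-start aligned systematic transects are more precise than simple random transects in patchy populations.

```python
import random
import statistics

def make_patchy_population(n=1000, seed=7):
    """Density along a 1-D line: organisms confined to three habitat patches."""
    rng = random.Random(seed)
    pop = [0.0] * n
    for start in (100, 400, 700):          # three habitat patches
        for i in range(start, start + 80):
            pop[i] = rng.expovariate(1.0)  # aggregated densities inside patches
    return pop

def srs_mean(pop, m, rng):
    """Mean density from m transects allocated by simple random sampling."""
    return statistics.fmean(rng.sample(pop, m))

def systematic_mean(pop, m, rng):
    """Mean density from a one-start aligned systematic grid of m transects."""
    step = len(pop) // m
    start = rng.randrange(step)
    return statistics.fmean(pop[start::step][:m])

pop = make_patchy_population()
rng = random.Random(1)
v_srs = statistics.pvariance([srs_mean(pop, 50, rng) for _ in range(2000)])
v_sys = statistics.pvariance([systematic_mean(pop, 50, rng) for _ in range(2000)])
# the systematic grid spreads transects across every patch, so its survey
# means vary far less between replicates than random allocation does
```

    The intuition matches the abstract: restricting organisms to patches alone gives the systematic design its precision advantage, because every grid placement samples each patch the same number of times.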

  2. Exploring the role of wave drag in the stable stratified oceanic and atmospheric bottom boundary layer in the cnrs-toulouse (cnrm-game) large stratified water flume

    NARCIS (Netherlands)

    Kleczek, M.; Steeneveld, G.J.; Paci, A.; Calmer, R.; Belleudy, A.; Canonici, J.C.; Murguet, F.; Valette, V.

    2014-01-01

    This paper reports on a laboratory experiment in the CNRM-GAME (Toulouse) stratified water flume of a stably stratified boundary layer, in order to quantify the momentum transfer due to orographically induced gravity waves by gently undulating hills in a boundary layer flow. In a stratified fluid, a

  3. Unwilling or Unable to Cheat? Evidence from a Randomized Tax Audit Experiment in Denmark

    DEFF Research Database (Denmark)

    Kleven, Henrik Jacobsen; Knudsen, Martin B.; Kreiner, Claus Thustrup

    2010-01-01

    This paper analyzes a randomized tax enforcement experiment in Denmark. In the base year, a stratified and representative sample of over 40,000 individual income tax filers was selected for the experiment. Half of the tax filers were randomly selected to be thoroughly audited, while the rest were… deliberately not audited. The following year, "threat-of-audit" letters were randomly assigned and sent to tax filers in both groups. Using comprehensive administrative tax data, we present four main findings. First, we find that the tax evasion rate is very small (0.3%) for income subject to third… impact on tax evasion, but that this effect is small in comparison to avoidance responses. Third, we find that prior audits substantially increase self-reported income, implying that individuals update their beliefs about detection probability based on experiencing an audit. Fourth, threat-of-audit…

  4. A Table-Based Random Sampling Simulation for Bioluminescence Tomography

    Directory of Open Access Journals (Sweden)

    Xiaomeng Zhang

    2006-01-01

    Full Text Available As a popular simulation of photon propagation in turbid media, the Monte Carlo (MC) method suffers mainly from its cumbersome computation. In this work a table-based random sampling simulation (TBRS) is proposed. The key idea of TBRS is to collapse multiple scattering steps into a single-step process through random table querying, thus greatly reducing the computational complexity of the conventional MC algorithm and expediting the computation. The TBRS simulation is a fast counterpart of the conventional MC simulation of photon propagation. It retains the merits of flexibility and accuracy of the conventional MC method and adapts well to complex geometric media and various source shapes. Both MC simulations were conducted in a homogeneous medium in our work. Also, we present a trial-and-error reconstruction approach to estimate the position of the fluorescent source, as a validation of the TBRS algorithm. Good agreement is found between the conventional MC simulation and the TBRS simulation.
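
    The table-querying idea can be sketched as an inverse-CDF lookup table: expensive transcendental evaluations are precomputed once on a grid, and each draw becomes a single indexed lookup. The exponential free-path example below is an assumption for illustration, not the paper's scattering model.

```python
import math
import random

def build_table(inv_cdf, size=4096):
    """Precompute inverse-CDF values on a uniform grid of midpoints, once."""
    return [inv_cdf((i + 0.5) / size) for i in range(size)]

def table_sample(table, rng):
    """Draw a variate by table lookup instead of evaluating the inverse CDF."""
    return table[rng.randrange(len(table))]

# Illustrative example: exponential free-path lengths between scattering
# events, with total attenuation mu_t = 2.0 per unit length
mu_t = 2.0
table = build_table(lambda u: -math.log(1.0 - u) / mu_t)
rng = random.Random(3)
paths = [table_sample(table, rng) for _ in range(100000)]
```

    The trade-off is a small, controllable discretization error in exchange for replacing a logarithm per draw with an array index, which is the essence of the speed-up claimed for the table-based simulation.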

  5. Correcting Classifiers for Sample Selection Bias in Two-Phase Case-Control Studies

    Science.gov (United States)

    Theis, Fabian J.

    2017-01-01

    Epidemiological studies often utilize stratified data in which rare outcomes or exposures are artificially enriched. This design can increase precision in association tests but distorts predictions when applying classifiers on nonstratified data. Several methods correct for this so-called sample selection bias, but their performance remains unclear especially for machine learning classifiers. With an emphasis on two-phase case-control studies, we aim to assess which corrections to perform in which setting and to obtain methods suitable for machine learning techniques, especially the random forest. We propose two new resampling-based methods to resemble the original data and covariance structure: stochastic inverse-probability oversampling and parametric inverse-probability bagging. We compare all techniques for the random forest and other classifiers, both theoretically and on simulated and real data. Empirical results show that the random forest profits from only the parametric inverse-probability bagging proposed by us. For other classifiers, correction is mostly advantageous, and methods perform uniformly. We discuss consequences of inappropriate distribution assumptions and reason for different behaviors between the random forest and other classifiers. In conclusion, we provide guidance for choosing correction methods when training classifiers on biased samples. For random forests, our method outperforms state-of-the-art procedures if distribution assumptions are roughly fulfilled. We provide our implementation in the R package sambia. PMID:29312464
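
    The correction idea can be illustrated with a simplified inverse-probability resampling sketch: records drawn under a biased (e.g. case-enriched) design are resampled with weights 1/pi_i, where pi_i is each record's inclusion probability, so the result mimics the source population. This is a stand-in for the paper's stochastic inverse-probability oversampling and bagging; the sambia R package implements the actual methods.

```python
import random

def ip_resample(records, incl_prob, size, seed=0):
    """Resample with weights 1/pi_i so the resampled data approximate the
    unbiased source population.

    records: list of (features, label) pairs from the biased sample;
    incl_prob: per-record inclusion probability used in the biased design.
    """
    rng = random.Random(seed)
    weights = [1.0 / p for p in incl_prob]
    return rng.choices(records, weights=weights, k=size)
```

    For example, if cases were sampled with probability 1.0 and controls with probability 0.1, a 50/50 case-control sample resamples to roughly the 1:10 population ratio, and a classifier trained on the resampled data no longer inherits the enrichment bias.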

  6. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Science.gov (United States)

    2010-07-01

    § 761.306 Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  7. A Nationwide Random Sampling Survey of Potential Complicated Grief in Japan

    Science.gov (United States)

    Mizuno, Yasunao; Kishimoto, Junji; Asukai, Nozomu

    2012-01-01

    To investigate the prevalence of significant loss, potential complicated grief (CG), and its contributing factors, we conducted a nationwide random sampling survey of Japanese adults aged 18 or older (N = 1,343) using a self-rating Japanese-language version of the Complicated Grief Brief Screen. Among them, 37.0% experienced their most significant…

  8. Reinforcing Sampling Distributions through a Randomization-Based Activity for Introducing ANOVA

    Science.gov (United States)

    Taylor, Laura; Doehler, Kirsten

    2015-01-01

    This paper examines the use of a randomization-based activity to introduce the ANOVA F-test to students. The two main goals of this activity are to successfully teach students to comprehend ANOVA F-tests and to increase student comprehension of sampling distributions. Four sections of students in an advanced introductory statistics course…

  9. Aligning the Economic Value of Companion Diagnostics and Stratified Medicines

    Directory of Open Access Journals (Sweden)

    Edward D. Blair

    2012-11-01

    Full Text Available The twin forces of payors seeking fair pricing and the rising costs of developing new medicines have driven a closer relationship between pharmaceutical companies and diagnostics companies, because stratified medicines, guided by companion diagnostics, offer better commercial, as well as clinical, outcomes. Stratified medicines have created clinical success and provided rapid product approvals, particularly in oncology, and indeed have changed the dynamic between drug and diagnostic developers. The commercial payback for such partnerships offered by stratified medicines has been less well articulated, but this has shifted as the benefits in risk management, pricing and value creation for all stakeholders become clearer. In this larger healthcare setting, stratified medicine provides both physicians and patients with greater insight on the disease and provides rationale for providers to understand cost-effectiveness of treatment. This article considers how the economic value of stratified medicine relationships can be recognized and translated into better outcomes for all healthcare stakeholders.

  10. Randomization tests

    CERN Document Server

    Edgington, Eugene

    2007-01-01

    Statistical Tests That Do Not Require Random Sampling Randomization Tests Numerical Examples Randomization Tests and Nonrandom Samples The Prevalence of Nonrandom Samples in Experiments The Irrelevance of Random Samples for the Typical Experiment Generalizing from Nonrandom Samples Intelligibility Respect for the Validity of Randomization Tests Versatility Practicality Precursors of Randomization Tests Other Applications of Permutation Tests Questions and Exercises Notes References Randomized Experiments Unique Benefits of Experiments Experimentation without Mani

  11. Large eddy simulation of stably stratified turbulence

    International Nuclear Information System (INIS)

    Shen Zhi; Zhang Zhaoshun; Cui Guixiang; Xu Chunxiao

    2011-01-01

    Stably stratified turbulence is a common phenomenon in the atmosphere and ocean. In this paper, large eddy simulation is utilized to investigate homogeneous stably stratified turbulence numerically at Reynolds number Re = uL/ν = 10²∼10³ and Froude number Fr = u/NL = 10⁻²∼10⁰, in which u is the root mean square of the velocity fluctuations, L is the integral scale and N is the Brunt-Väisälä frequency. Three sets of computation cases are designed with different initial conditions, namely isotropic turbulence, Taylor-Green vortex and internal waves, to investigate the statistical properties arising from different origins. The computed horizontal and vertical energy spectra are consistent with observations in the atmosphere and ocean when the composite parameter ReFr² is greater than O(1). It has also been found that stably stratified turbulence can develop under different initial velocity conditions and that internal wave energy dominates in the developed stably stratified turbulence.

  12. Yield and quality of ground water from stratified-drift aquifers, Taunton River basin, Massachusetts : executive summary

    Science.gov (United States)

    Lapham, Wayne W.; Olimpio, Julio C.

    1989-01-01

    Water shortages are a chronic problem in parts of the Taunton River basin and are caused by a combination of factors. Water use in this part of the Boston metropolitan area is likely to increase during the next decade. The Massachusetts Division of Water Resources projects that about 50% of the cities and towns within and on the perimeter of the basin may have water supply deficits by 1990 if water management projects are not pursued throughout the 1980s. Estimates of the long-term yield of the 26 regional aquifers indicate that the yields of the two most productive aquifers equal or exceed 11.9 and 11.3 cu ft/sec, 90% of the time, respectively, if minimum stream discharge is maintained at 99.5% flow duration. Eighteen of the 26 aquifers were pumped for public water supply during 1983. Further analysis of the yield characteristics of these 18 aquifers indicates that the 1983 pumping rate of each of these 18 aquifers can be sustained at least 70% of the time. Selected physical properties and concentrations of major chemical constituents in groundwater from the stratified-drift aquifers at 80 sampling sites were used to characterize general water quality in aquifers throughout the basin. The pH of the groundwater ranged from 5.4 to 7.0. Natural elevated concentrations of Fe and Mn in water in the stratified-drift aquifers are present locally in the basin. Natural concentrations of these two metals commonly exceed the limits of 0.3 mg/L for Fe and 0.05 mg/L for Mn recommended for drinking water. Fifty-one analyses of selected trace metals in groundwater samples from stratified-drift aquifers throughout the basin were used to characterize trace metal concentrations in the groundwater. Of the 10 constituents sampled that have US EPA limits recommended for drinking water, only the Pb concentration in water at one site (60 micrograms/L) exceeded the recommended limit of 50 micrograms/L. 
Analyses of selected organic compounds in water in the stratified-drift aquifers at 74

  13. Geospatial techniques for developing a sampling frame of watersheds across a region

    Science.gov (United States)

    Gresswell, Robert E.; Bateman, Douglas S.; Lienkaemper, George; Guy, T.J.

    2004-01-01

    Current land-management decisions that affect the persistence of native salmonids are often influenced by studies of individual sites that are selected based on judgment and convenience. Although this approach is useful for some purposes, extrapolating results to areas that were not sampled is statistically inappropriate because the sampling design is usually biased. Therefore, in recent investigations of coastal cutthroat trout (Oncorhynchus clarki clarki) located above natural barriers to anadromous salmonids, we used a methodology for extending the statistical scope of inference. The purpose of this paper is to apply geospatial tools to identify a population of watersheds and develop a probability-based sampling design for coastal cutthroat trout in western Oregon, USA. The population of mid-size watersheds (500-5800 ha) west of the Cascade Range divide was derived from watershed delineations based on digital elevation models. Because a database with locations of isolated populations of coastal cutthroat trout did not exist, a sampling frame of isolated watersheds containing cutthroat trout had to be developed. After the sampling frame of watersheds was established, isolated watersheds with coastal cutthroat trout were stratified by ecoregion and erosion potential based on dominant bedrock lithology (i.e., sedimentary and igneous). A stratified random sample of 60 watersheds was selected with proportional allocation in each stratum. By comparing watershed drainage areas of streams in the general population to those in the sampling frame and the resulting sample (n = 60), we were able to evaluate how representative the subset of watersheds was relative to the overall population of watersheds. Geospatial tools provided a relatively inexpensive means to generate the information necessary to develop a statistically robust, probability-based sampling design.

  14. Stratified medicine and reimbursement issues

    NARCIS (Netherlands)

    Fugel, Hans-Joerg; Nuijten, Mark; Postma, Maarten

    2012-01-01

    Stratified Medicine (SM) has the potential to target patient populations who will most benefit from a therapy while reducing unnecessary health interventions associated with side effects. The link between clinical biomarkers/diagnostics and therapies provides new opportunities for value creation to

  15. LOD score exclusion analyses for candidate QTLs using random population samples.

    Science.gov (United States)

    Deng, Hong-Wen

    2003-11-01

    While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes as putative QTLs using random population samples. Previously, we developed an LOD score exclusion mapping approach for candidate genes for complex diseases. Here, we extend this LOD score approach for exclusion analyses of candidate genes for quantitative traits. Under this approach, specific genetic effects (as reflected by heritability) and inheritance models at candidate QTLs can be analyzed and if an LOD score is < or = -2.0, the locus can be excluded from having a heritability larger than that specified. Simulations show that this approach has high power to exclude a candidate gene from having moderate genetic effects if it is not a QTL and is robust to population admixture. Our exclusion analysis complements association analysis for candidate genes as putative QTLs in random population samples. The approach is applied to test the importance of Vitamin D receptor (VDR) gene as a potential QTL underlying the variation of bone mass, an important determinant of osteoporosis.

  16. The Dirichlet-Multinomial model for multivariate randomized response data and small samples

    NARCIS (Netherlands)

    Avetisyan, Marianna; Fox, Gerardus J.A.

    2012-01-01

    In survey sampling the randomized response (RR) technique can be used to obtain truthful answers to sensitive questions. Although the individual answers are masked due to the RR technique, individual (sensitive) response rates can be estimated when observing multivariate response data. The beta-binomial model for binary RR data will be generalized…

  17. Optimization of refueling-shuffling scheme in PWR core by random search strategy

    International Nuclear Information System (INIS)

    Wu Yuan

    1991-11-01

    A random method for optimizing refueling management in a pressurized water reactor (PWR) core is described. The main purpose of the optimization was to select the 'best' refueling arrangement scheme, the one that would produce maximum economic benefit under certain imposed conditions. To fulfill this goal, an effective optimization strategy, the two-stage random search method, was developed. First, the search is made in a manner similar to the stratified sampling technique, and a local optimum is reached by comparison of successive results. Further random searches are then carried out between different strata to try to find the global optimum. In general, the method can be used as a practical tool for conventional fuel management schemes, and it can also be used in studies on the optimization of low-leakage fuel management. Some calculations were done for a typical PWR core on a CYBER-180/830 computer. The results show that the proposed method can reach a satisfactory solution at reasonably low computational cost.
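
    A minimal sketch of the two-stage strategy described in this record — stratified local random search followed by cross-stratum perturbations — on a toy one-dimensional objective. The actual refueling optimization operates on discrete loading patterns, which this sketch does not attempt to model:

```python
import random

def two_stage_random_search(objective, strata, n_local=200, n_cross=200, seed=1):
    """Two-stage random search: stage 1 samples uniformly within each
    stratum (akin to stratified sampling); stage 2 perturbs the best
    point found so that the search can cross stratum boundaries."""
    rng = random.Random(seed)
    best_x, best_f = None, float("-inf")
    # Stage 1: stratified local search
    for lo, hi in strata:
        for _ in range(n_local):
            x = rng.uniform(lo, hi)
            f = objective(x)
            if f > best_f:
                best_x, best_f = x, f
    # Stage 2: Gaussian perturbations around the incumbent, across strata
    lo_all = min(lo for lo, _ in strata)
    hi_all = max(hi for _, hi in strata)
    for _ in range(n_cross):
        x = min(max(best_x + rng.gauss(0, (hi_all - lo_all) / 10), lo_all), hi_all)
        f = objective(x)
        if f > best_f:
            best_x, best_f = x, f
    return best_x, best_f

# Toy objective with a global maximum at x = 2
x, f = two_stage_random_search(lambda x: -(x - 2) ** 2, strata=[(0, 1.5), (1.5, 3)])
```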

  18. Statistical sampling applied to the radiological characterization of historical waste

    Directory of Open Access Journals (Sweden)

    Zaffora Biagio

    2016-01-01

    Full Text Available The evaluation of the activity of radionuclides in radioactive waste is required for its disposal in final repositories. Easy-to-measure nuclides, like γ-emitters and high-energy X-rays, can be measured via non-destructive nuclear techniques from outside a waste package. Some radionuclides are difficult-to-measure (DTM from outside a package because they are α- or β-emitters. The present article discusses the application of linear regression, scaling factors (SF and the so-called “mean activity method” to estimate the activity of DTM nuclides on metallic waste produced at the European Organization for Nuclear Research (CERN. Various statistical sampling techniques including simple random sampling, systematic sampling, stratified and authoritative sampling are described and applied to 2 waste populations of activated copper cables. The bootstrap is introduced as a tool to estimate average activities and standard errors in waste characterization. The analysis of the DTM Ni-63 is used as an example. Experimental and theoretical values of SFs are calculated and compared. Guidelines for sampling historical waste using probabilistic and non-probabilistic sampling are finally given.
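
    The bootstrap step mentioned in this record can be illustrated in a few lines; the Ni-63 specific activities below are invented values for illustration, not CERN data:

```python
import random
import statistics

def bootstrap_mean_se(data, n_boot=2000, seed=0):
    """Bootstrap estimate of the mean and its standard error:
    resample the data with replacement many times, and take the
    mean and standard deviation of the resampled means."""
    rng = random.Random(seed)
    means = [
        statistics.fmean(rng.choices(data, k=len(data)))  # resample with replacement
        for _ in range(n_boot)
    ]
    return statistics.fmean(means), statistics.stdev(means)

# Illustrative Ni-63 specific activities (Bq/g) for measured cable samples
activities = [1.2, 0.8, 2.1, 1.5, 0.9, 1.7, 1.1, 2.4, 1.3, 1.0]
mean, se = bootstrap_mean_se(activities)
```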

  19. The Stratified Legitimacy of Abortions.

    Science.gov (United States)

    Kimport, Katrina; Weitz, Tracy A; Freedman, Lori

    2016-12-01

    Roe v. Wade was heralded as an end to unequal access to abortion care in the United States. However, today, despite being common and safe, abortion is performed only selectively in hospitals and private practices. Drawing on 61 interviews with obstetrician-gynecologists in these settings, we examine how they determine which abortions to perform. We find that they distinguish between more and less legitimate abortions, producing a narrative of stratified legitimacy that privileges abortions for intended pregnancies, when the fetus is unhealthy, and when women perform normative gendered sexuality, including distress about the abortion, guilt about failure to contracept, and desire for motherhood. This stratified legitimacy can perpetuate socially-inflected inequality of access and normative gendered sexuality. Additionally, we argue that the practice by physicians of distinguishing among abortions can legitimate legislative practices that regulate and restrict some kinds of abortion, further constraining abortion access. © American Sociological Association 2016.

  20. Stratified charge rotary engine for general aviation

    Science.gov (United States)

    Mount, R. E.; Parente, A. M.; Hady, W. F.

    1986-01-01

    A development history, a current development status assessment, and a design feature and performance capabilities account are given for stratified-charge rotary engines applicable to aircraft propulsion. Such engines are capable of operating on Jet-A fuel with substantial cost savings, improved altitude capability, and lower fuel consumption by comparison with gas turbine powerplants. Attention is given to the current development program of a 400-hp engine scheduled for initial operations in early 1990. Stratified charge rotary engines are also applicable to ground power units, airborne APUs, shipboard generators, and vehicular engines.

  1. Additive non-uniform random sampling in superimposed fiber Bragg grating strain gauge

    Science.gov (United States)

    Ma, Y. C.; Liu, H. Y.; Yan, S. B.; Yang, Y. H.; Yang, M. W.; Li, J. M.; Tang, J.

    2013-05-01

    This paper demonstrates an additive non-uniform random sampling and interrogation method for dynamic and/or static strain gauge using a reflection spectrum from two superimposed fiber Bragg gratings (FBGs). The superimposed FBGs are designed to generate non-equidistant space of a sensing pulse train in the time domain during dynamic strain gauge. By combining centroid finding with smooth filtering methods, both the interrogation speed and accuracy are improved. A 1.9 kHz dynamic strain is measured by generating an additive non-uniform randomly distributed 2 kHz optical sensing pulse train from a mean 500 Hz triangular periodically changing scanning frequency.

  2. Nitrogen transformations in stratified aquatic microbial ecosystems

    DEFF Research Database (Denmark)

    Revsbech, Niels Peter; Risgaard-Petersen, N.; Schramm, Andreas

    2006-01-01

    New analytical methods such as advanced molecular techniques and microsensors have resulted in new insights about how nitrogen transformations in stratified microbial systems such as sediments and biofilms are regulated at a µm-mm scale. A large and ever-expanding knowledge base about nitrogen fixation, nitrification, denitrification, and dissimilatory reduction of nitrate to ammonium, and about the microorganisms performing these processes, has been produced by use of these techniques. During the last decade the discovery of anammox bacteria and of migrating, nitrate-accumulating bacteria performing dissimilatory reduction of nitrate to ammonium has given new dimensions to the understanding of nitrogen cycling in nature, and the occurrence of these organisms and processes in stratified microbial communities will be described in detail.

  3. Path integral methods for primordial density perturbations - sampling of constrained Gaussian random fields

    International Nuclear Information System (INIS)

    Bertschinger, E.

    1987-01-01

    Path integrals may be used to describe the statistical properties of a random field such as the primordial density perturbation field. In this framework the probability distribution is given for a Gaussian random field subjected to constraints such as the presence of a protovoid or supercluster at a specific location in the initial conditions. An algorithm has been constructed for generating samples of a constrained Gaussian random field on a lattice using Monte Carlo techniques. The method makes possible a systematic study of the density field around peaks or other constrained regions in the biased galaxy formation scenario, and it is effective for generating initial conditions for N-body simulations with rare objects in the computational volume. 21 references
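
    On a finite lattice, conditioning a Gaussian random field on fixed values at chosen sites reduces to standard multivariate-normal conditioning; the sketch below uses an assumed exponential covariance on a small 1-D lattice (the paper's path-integral Monte Carlo formulation is more general than this direct linear-algebra route):

```python
import numpy as np

def sample_constrained_gaussian(cov, fixed_idx, fixed_vals, n=1, seed=0):
    """Sample a zero-mean Gaussian random field (lattice flattened to a
    vector with covariance `cov`) conditioned on exact values at the
    sites in `fixed_idx`, via Gaussian conditioning."""
    rng = np.random.default_rng(seed)
    idx = np.asarray(fixed_idx)
    free = np.setdiff1d(np.arange(cov.shape[0]), idx)
    # Conditional mean and covariance of the free sites given the fixed ones
    K = cov[np.ix_(free, idx)] @ np.linalg.inv(cov[np.ix_(idx, idx)])
    cond_mean = K @ fixed_vals
    cond_cov = cov[np.ix_(free, free)] - K @ cov[np.ix_(idx, free)]
    out = np.empty((n, cov.shape[0]))
    out[:, free] = rng.multivariate_normal(cond_mean, cond_cov, size=n)
    out[:, idx] = fixed_vals
    return out

# Illustrative exponential covariance on a 1-D lattice of 10 sites,
# with a "peak" constraint of 2.0 imposed at site 5
sites = np.arange(10)
cov = np.exp(-np.abs(sites[:, None] - sites[None, :]) / 3.0)
fields = sample_constrained_gaussian(cov, [5], np.array([2.0]), n=4)
```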

  4. An efficient method of randomly sampling the coherent angular scatter distribution

    International Nuclear Information System (INIS)

    Williamson, J.F.; Morin, R.L.

    1983-01-01

    Monte Carlo simulations of photon transport phenomena require random selection of an interaction process at each collision site along the photon track. Possible choices are usually limited to photoelectric absorption and incoherent scatter as approximated by the Klein-Nishina distribution. A technique is described for sampling the coherent angular scatter distribution, for the benefit of workers in medical physics. (U.K.)

  5. Random vs. systematic sampling from administrative databases involving human subjects.

    Science.gov (United States)

    Hagino, C; Lo, R J

    1998-09-01

    Two sampling techniques, simple random sampling (SRS) and systematic sampling (SS), were compared to determine whether they yield similar and accurate distributions for the following four factors: age, gender, geographic location and years in practice. Any point estimate within 7 yr or 7 percentage points of its reference standard (SRS or the entire data set, i.e., the target population) was considered "acceptably similar" to the reference standard. The sampling frame was the entire membership database of the Canadian Chiropractic Association. The two sampling methods were tested using eight different sample sizes of n (50, 100, 150, 200, 250, 300, 500, 800). From the profile/characteristics, summaries of four known factors [gender, average age, number (%) of chiropractors in each province and years in practice], between- and within-method chi-squared tests and unpaired t-tests were performed to determine whether any of the differences [descriptively greater than 7% or 7 yr] were also statistically significant. The strengths of the agreements between the provincial distributions were quantified by calculating the percent agreements for each (provincial pairwise-comparison methods). Any percent agreement less than 70% was judged to be unacceptable. Our assessments of the two sampling methods (SRS and SS) for the different sample sizes tested suggest that SRS and SS yielded acceptably similar results. Both methods started to yield "correct" sample profiles at approximately the same sample size (n > 200). SS is not only convenient, it can be recommended for sampling from large databases in which the data are listed without any inherent order biases other than alphabetical listing by surname.
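
    The two designs compared in this record can be sketched directly; the membership frame below is synthetic, with random ages standing in for the database fields:

```python
import random

def simple_random_sample(frame, n, seed=0):
    """SRS: every subset of size n is equally likely."""
    return random.Random(seed).sample(frame, n)

def systematic_sample(frame, n, seed=0):
    """SS: every k-th unit after a random start; appropriate when the
    list order carries no hidden periodicity (e.g., alphabetical)."""
    k = len(frame) // n
    start = random.Random(seed).randrange(k)
    return frame[start::k][:n]

# Synthetic membership database of 5000 records with an age attribute
rng = random.Random(42)
frame = [{"id": i, "age": rng.randint(25, 64)} for i in range(5000)]

estimates = {}
for method in (simple_random_sample, systematic_sample):
    s = method(frame, 250)
    estimates[method.__name__] = sum(m["age"] for m in s) / len(s)
```

    Both estimates land close to the frame's true mean age, illustrating the paper's finding that SS performs comparably to SRS on an order-neutral list.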

  6. Probability sampling design in ethnobotanical surveys of medicinal plants

    Directory of Open Access Journals (Sweden)

    Mariano Martinez Espinosa

    2012-07-01

    Full Text Available Non-probability sampling design can be used in ethnobotanical surveys of medicinal plants. However, this method does not allow statistical inferences to be made from the data generated. The aim of this paper is to present a probability sampling design that is applicable in ethnobotanical studies of medicinal plants. The sampling design employed in the research titled "Ethnobotanical knowledge of medicinal plants used by traditional communities of Nossa Senhora Aparecida do Chumbo district (NSACD, Poconé, Mato Grosso, Brazil" was used as a case study. Probability sampling methods (simple random and stratified sampling were used in this study. In order to determine the sample size, the following data were considered: population size (N of 1179 families; confidence coefficient, 95%; sample error (d, 0.05; and a proportion (p, 0.5. The application of this sampling method resulted in a sample size (n of at least 290 families in the district. The present study concludes that probability sampling methods necessarily have to be employed in ethnobotanical studies of medicinal plants, particularly where statistical inferences have to be made using data obtained. This can be achieved by applying different existing probability sampling methods, or better still, a combination of such methods.
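
    The sample-size computation reported here follows the standard formula for estimating a proportion with a finite population correction, n0 = z^2 p(1-p)/d^2 and n = n0 / (1 + (n0-1)/N); plugging in the record's values reproduces the minimum of 290 families:

```python
import math

def sample_size(N, p=0.5, d=0.05, z=1.96):
    """Sample size for estimating a proportion: infinite-population
    size n0, then the finite population correction for frame size N
    (z = 1.96 corresponds to 95% confidence)."""
    n0 = z ** 2 * p * (1 - p) / d ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / N))

print(sample_size(N=1179))  # → 290, the study's minimum sample of families
```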

  7. Additive non-uniform random sampling in superimposed fiber Bragg grating strain gauge

    International Nuclear Information System (INIS)

    Ma, Y C; Liu, H Y; Yan, S B; Li, J M; Tang, J; Yang, Y H; Yang, M W

    2013-01-01

    This paper demonstrates an additive non-uniform random sampling and interrogation method for dynamic and/or static strain gauge using a reflection spectrum from two superimposed fiber Bragg gratings (FBGs). The superimposed FBGs are designed to generate non-equidistant space of a sensing pulse train in the time domain during dynamic strain gauge. By combining centroid finding with smooth filtering methods, both the interrogation speed and accuracy are improved. A 1.9 kHz dynamic strain is measured by generating an additive non-uniform randomly distributed 2 kHz optical sensing pulse train from a mean 500 Hz triangular periodically changing scanning frequency. (paper)

  8. The Dirichlet-Multinomial Model for Multivariate Randomized Response Data and Small Samples

    Science.gov (United States)

    Avetisyan, Marianna; Fox, Jean-Paul

    2012-01-01

    In survey sampling the randomized response (RR) technique can be used to obtain truthful answers to sensitive questions. Although the individual answers are masked due to the RR technique, individual (sensitive) response rates can be estimated when observing multivariate response data. The beta-binomial model for binary RR data will be generalized…

  9. Simulation model of stratified thermal energy storage tank using finite difference method

    Science.gov (United States)

    Waluyo, Joko

    2016-06-01

    Stratified TES tanks are normally used in cogeneration plants; they are simple, low cost, and equal or superior in thermal performance. The advantage of a TES tank is that it enables shifting of energy usage from off-peak to on-peak demand periods. To increase energy utilization in a stratified TES tank, a simulation model is required that can simulate the charging phenomenon in the tank precisely. This paper aims to develop a novel model addressing this problem. The model incorporates a chiller into the charging of the stratified TES tank in a closed system. The model is one-dimensional and accounts for heat transfer, covering the main factors that degrade the temperature distribution, namely conduction through the tank wall, conduction between cool and warm water, mixing effects in the initial flow of the charging, and heat loss to the surroundings. The simulation model is developed with the finite difference method, using buffer concept theory, and solved with an explicit scheme. Validation of the simulation model is carried out using observed data obtained from an operating stratified TES tank in a cogeneration plant. The temperature distribution of the model reproduces the S-curve pattern and simulates the decreased charging temperature after the tank reaches the full condition. The coefficients of determination between the observed data and the model were higher than 0.88, meaning that the model is capable of simulating the charging phenomenon in the stratified TES tank. The model not only generates temperature distributions but can also be enhanced to represent transient conditions during charging. This model can be applied to address the temperature limitation that occurs when charging the stratified TES tank with an absorption chiller. Further, the stratified TES tank can be…
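
    The conduction terms in such a one-dimensional tank model can be illustrated with a minimal explicit finite-difference (FTCS) step; this toy sketch covers only layer-to-layer conduction with adiabatic ends, not the paper's full charging model with wall losses and inlet mixing:

```python
def diffuse_step(T, r):
    """One explicit FTCS step of 1-D conduction between tank layers.
    T is the list of layer temperatures (top to bottom);
    r = alpha * dt / dz**2 must be <= 0.5 for stability."""
    assert r <= 0.5, "explicit scheme unstable for r > 0.5"
    return (
        [T[0] + r * (T[1] - T[0])]                                        # adiabatic top
        + [T[i] + r * (T[i - 1] - 2 * T[i] + T[i + 1]) for i in range(1, len(T) - 1)]
        + [T[-1] + r * (T[-2] - T[-1])]                                   # adiabatic bottom
    )

# Sharp thermocline between a warm (12 C) and a cool (6 C) layer,
# gradually smeared out by conduction
T = [12.0] * 5 + [6.0] * 5
for _ in range(100):
    T = diffuse_step(T, r=0.25)
```

    With adiabatic ends the scheme conserves total energy (the sum of layer temperatures), while the initially sharp thermocline degrades, which is exactly the stratification-decay mechanism the record's model quantifies.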

  10. Free Falling in Stratified Fluids

    Science.gov (United States)

    Lam, Try; Vincent, Lionel; Kanso, Eva

    2017-11-01

    Leaves falling in air and discs falling in water are examples of unsteady descents due to complex interaction between gravitational and aerodynamic forces. Understanding these descent modes is relevant to many branches of engineering and science such as estimating the behavior of re-entry space vehicles to studying biomechanics of seed dispersion. For regularly shaped objects falling in homogenous fluids, the motion is relatively well understood. However, less is known about how density stratification of the fluid medium affects the falling behavior. Here, we experimentally investigate the descent of discs in both pure water and in stable linearly stratified fluids for Froude numbers Fr 1 and Reynolds numbers Re between 1000 -2000. We found that stable stratification (1) enhances the radial dispersion of the disc at landing, (2) increases the descent time, (3) decreases the inclination (or nutation) angle, and (4) decreases the fluttering amplitude while falling. We conclude by commenting on how the corresponding information can be used as a predictive model for objects free falling in stratified fluids.

  11. Landslide Susceptibility Assessment Using Frequency Ratio Technique with Iterative Random Sampling

    Directory of Open Access Journals (Sweden)

    Hyun-Joo Oh

    2017-01-01

    Full Text Available This paper assesses the performance of landslide susceptibility analysis using the frequency ratio (FR) with iterative random sampling. A pair of before-and-after digital aerial photographs with 50 cm spatial resolution was used to detect landslide occurrences in the Yongin area, Korea. Iterative random sampling was run ten times in total, and each time it was applied to the training and validation datasets. Thirteen landslide causative factors were derived from the topographic, soil, forest, and geological maps. The FR scores were calculated from the causative factors and training occurrences, repeated ten times. The ten landslide susceptibility maps were obtained from the integration of the causative factors with their assigned FR scores, and each map was validated using its validation dataset. The FR method achieved susceptibility accuracies from 89.48% to 93.21%, higher than 89% in every run. Moreover, the ten iterations of FR modeling may contribute to a better understanding of the regularized relationship between the causative factors and landslide susceptibility. This makes it possible to incorporate knowledge-driven considerations of the causative factors into the landslide susceptibility analysis, and the approach can be extended to other areas.
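
    The frequency ratio itself is a simple class-wise statistic: the share of landslide cells falling in a factor class divided by that class's share of total area. A sketch with invented slope-class cell counts:

```python
def frequency_ratio(landslide_counts, area_counts):
    """FR per class: (landslide cells in class / all landslide cells)
    divided by (area cells in class / all area cells).
    FR > 1 marks classes more landslide-prone than average."""
    total_ls = sum(landslide_counts.values())
    total_area = sum(area_counts.values())
    return {
        cls: (landslide_counts.get(cls, 0) / total_ls)
             / (area_counts[cls] / total_area)
        for cls in area_counts
    }

# Illustrative slope classes: cell counts for landslides vs the whole area
fr = frequency_ratio(
    landslide_counts={"0-15deg": 10, "15-30deg": 60, ">30deg": 30},
    area_counts={"0-15deg": 5000, "15-30deg": 3000, ">30deg": 2000},
)
```

    Summing the FR scores of a cell's classes over all causative factors gives its susceptibility index, which is how the per-factor scores are integrated into a map.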

  12. Impressions of the turbulence variability in a weakly stratified, flat-bottom deep-sea ‘boundary layer’

    NARCIS (Netherlands)

    van Haren, H.

    2015-01-01

    The character of turbulent overturns in a weakly stratified deep-sea is investigated in some detail using 144 high-resolution temperature sensors at 0.7 m intervals, starting 5 m above the bottom. A 9-day, 1 Hz sampled record from the 912 m depth flat-bottom (<0.5% bottom-slope) mooring site in the…

  13. Prevalence and Risk Factors of Dengue Infection in Khanh Hoa Province, Viet Nam: A Stratified Cluster Sampling Survey.

    Science.gov (United States)

    Mai, Vien Quang; Mai, Trịnh Thị Xuan; Tam, Ngo Le Minh; Nghia, Le Trung; Komada, Kenichi; Murakami, Hitoshi

    2018-05-19

    Dengue is a clinically important arthropod-borne viral disease with increasing global incidence. Here we aimed to estimate the prevalence of dengue infections in Khanh Hoa Province, central Viet Nam, and to identify risk factors for infection. We performed a stratified cluster sampling survey including residents of 3-60 years of age in Nha Trang City, Ninh Hoa District and Dien Khanh District, Khanh Hoa Province, in October 2011. Immunoglobulin G (IgG) and immunoglobulin M (IgM) against dengue were analyzed using a rapid test kit. Participants completed a questionnaire exploring clinical dengue incidence, socio-economic status, and individual behavior. A household checklist was used to examine environment, mosquito larvae presence, and exposure to public health interventions. IgG positivity was 20.5% (urban, 16.3%; rural, 23.0%), IgM positivity was 6.7% (urban, 6.4%; rural, 6.9%), and incidence of clinically compatible dengue during the prior 3 months was 2.8 per 1,000 persons (urban, 1.7; rural, 3.4). For IgG positivity, the adjusted odds ratio (AOR) was 2.68 (95% confidence interval [CI], 1.24-5.81) for mosquito larvae presence in water pooled in old tires and was 3.09 (95% CI, 1.75-5.46) for proximity to a densely inhabited area. For IgM positivity, the AOR was 3.06 (95% CI, 1.50-6.23) for proximity to a densely inhabited area. Our results indicated rural penetration of dengue infections. Control measures should target densely inhabited areas, and may include clean-up of discarded tires and water-collecting waste.

  14. Thermal stratification built up in hot water tank with different inlet stratifiers

    DEFF Research Database (Denmark)

    Dragsted, Janne; Furbo, Simon; Dannemand, Mark

    2017-01-01

    Thermal stratification in a water storage tank can strongly increase the thermal performance of solar heating systems. Thermal stratification can be built up in a storage tank during charge if the heated water enters through an inlet stratifier. Experiments with a test tank have been carried out in order to elucidate how well thermal stratification is established in the tank with differently designed inlet stratifiers under different controlled laboratory conditions. The investigated inlet stratifiers are from Solvis GmbH & Co KG and EyeCular Technologies ApS. The stratifier from Solvis GmbH & Co KG had a better performance at 4 l/min. In the intermediate charge test the stratifier from EyeCular Technologies ApS had a better performance in terms of maintaining the thermal stratification in the storage tank while charging at a relatively low temperature.

  15. Large eddy simulation of turbulent and stably-stratified flows

    International Nuclear Information System (INIS)

    Fallon, Benoit

    1994-01-01

    The unsteady turbulent flow over a backward-facing step is studied by means of Large Eddy Simulation with a structure-function subgrid model, in both isothermal and stably-stratified configurations. Without stratification, the flow develops highly-distorted Kelvin-Helmholtz billows, undergoing helical pairing, with A-shaped vortices shed downstream. We show that forcing injected by recirculation fluctuations governs the development of these oblique-mode instabilities. The statistical results show good agreement with the experimental measurements. For stably-stratified configurations, the flow remains more two-dimensional. We show how, with increasing stratification, the shear layer growth is frozen by inhibition first of the pairing process and then of the Kelvin-Helmholtz instabilities, and by the development of gravity waves or stable density interfaces. Eddy structures of the flow present striking analogies with the stratified mixing layer. Additional computations show the development of secondary Kelvin-Helmholtz instabilities on the vorticity layers between two primary structures. This important mechanism, based on baroclinic effects (horizontal density gradients), constitutes an additional part of the turbulent mixing process. Finally, the feasibility of Large Eddy Simulation for industrial flows is demonstrated by studying a complex stratified cavity. Temperature fluctuations are compared to experimental measurements. We also develop three-dimensional unsteady animations in order to understand and visualize turbulent interactions. (author) [fr

  16. A simple sample size formula for analysis of covariance in cluster randomized trials.

    NARCIS (Netherlands)

    Teerenstra, S.; Eldridge, S.; Graff, M.J.; Hoop, E. de; Borm, G.F.

    2012-01-01

    For cluster randomized trials with a continuous outcome, the sample size is often calculated as if an analysis of the outcomes at the end of the treatment period (follow-up scores) would be performed. However, often a baseline measurement of the outcome is available or feasible to obtain. An…

  17. Association between Spouse/Child Separation and Migration-Related Stress among a Random Sample of Rural-to-Urban Migrants in Wuhan, China.

    Directory of Open Access Journals (Sweden)

    Yan Guo

    Full Text Available Millions of people move from rural areas to urban areas in China to pursue new opportunities while leaving their spouses and children at rural homes. Little is known about the impact of migration-related separation on the mental health of these rural migrants in urban China. Survey data from a random sample of rural-to-urban migrants (n = 1113, aged 18-45) from Wuhan were analyzed. The Domestic Migration Stress Questionnaire (DMSQ), an instrument with four subconstructs, was used to measure migration-related stress. The relationship between spouse/child separation and stress was assessed using survey estimation methods to account for the multi-level sampling design. 16.46% of couples were separated from their spouses (spouse separation only), and 25.81% of parents were separated from their children (child separation only). Among the participants who were married and had children, 5.97% were separated from both their spouses and children (double separation). Participants with spouse separation only or double separation did not score significantly higher on the DMSQ than those with no separation. Compared to parents without child separation, parents with child separation scored significantly higher on the DMSQ (mean score = 2.88, 95% CI: [2.81, 2.95] vs. 2.60 [2.53, 2.67], p < .05). Stratified analysis by separation type and by gender indicated that the association was stronger for child separation only and for female participants. Child separation is an important source of migration-related stress, and the effect is particularly strong for migrant women. Public policies and intervention programs should consider these factors to encourage and facilitate the co-migration of parents with their children to mitigate migration-related stress.

  18. Rationale, design, methodology and sample characteristics for the Vietnam pre-conceptual micronutrient supplementation trial (PRECONCEPT): a randomized controlled study

    Directory of Open Access Journals (Sweden)

    Nguyen Phuong H

    2012-10-01

    Full Text Available Abstract Background Low birth weight and maternal anemia remain intractable problems in many developing countries. The adequacy of the current strategy of providing iron-folic acid (IFA) supplements only during pregnancy has been questioned given many women enter pregnancy with poor iron stores, the substantial micronutrient demand by maternal and fetal tissues, and programmatic issues related to timing and coverage of prenatal care. Weekly IFA supplementation for women of reproductive age (WRA) improves iron status and reduces the burden of anemia in the short term, but few studies have evaluated subsequent pregnancy and birth outcomes. The PRECONCEPT trial aims to determine whether pre-pregnancy weekly IFA or multiple micronutrient (MM) supplementation will improve birth outcomes and maternal and infant iron status compared to the current practice of prenatal IFA supplementation only. This paper provides an overview of study design, methodology and sample characteristics from baseline survey data and key lessons learned. Methods/design We have recruited 5011 WRA in a double-blind stratified randomized controlled trial in rural Vietnam and randomly assigned them to receive weekly supplements containing either: (1) 2800 μg folic acid; (2) 60 mg iron and 2800 μg folic acid; or (3) MM. Women who become pregnant receive daily IFA, and are being followed through pregnancy, delivery, and up to three months post-partum. Study outcomes include birth outcomes and maternal and infant iron status. Data are being collected on household characteristics, maternal diet and mental health, anthropometry, infant feeding practices, morbidity and compliance. Discussion The study is timely and responds to the WHO Global Expert Consultation which identified the need to evaluate the long term benefits of weekly IFA and MM supplementation in WRA. Findings will generate new information to help guide policy and programs designed to reduce the burden of anemia in women and…

  19. Presence of psychoactive substances in oral fluid from randomly selected drivers in Denmark

    DEFF Research Database (Denmark)

    Simonsen, K. Wiese; Steentoft, A.; Hels, Tove

    2012-01-01

    This roadside study is the Danish part of the EU-project DRUID (Driving under the Influence of Drugs, Alcohol, and Medicines) and included three representative regions in Denmark. Oral fluid samples (n = 3002) were collected randomly from drivers using a sampling scheme stratified by time, season, and road type. The oral fluid samples were screened for 29 illegal and legal psychoactive substances and metabolites as well as ethanol. Fourteen (0.5%) drivers were positive for ethanol (alone or in combination with drugs) at concentrations above 0.53 g/l (0.5 mg/g), which is the Danish legal limit. The percentage of drivers positive for medicinal drugs above the Danish legal concentration limit was 0.4%, while 0.3% of the drivers tested positive for one or more illicit drugs at concentrations exceeding the Danish legal limit. Tetrahydrocannabinol, cocaine, and amphetamine were the most frequent illicit…

  20. Event-triggered synchronization for reaction-diffusion complex networks via random sampling

    Science.gov (United States)

    Dong, Tao; Wang, Aijuan; Zhu, Huiyun; Liao, Xiaofeng

    2018-04-01

    In this paper, the synchronization problem of reaction-diffusion complex networks (RDCNs) with Dirichlet boundary conditions is considered, where the data are sampled randomly. An event-triggered controller based on the sampled data is proposed, which can reduce the number of controller updates and the communication load. Under this strategy, the synchronization problem of the diffusion complex network is equivalently converted to the stability of a reaction-diffusion complex dynamical system with time delay. By using the matrix inequality technique and the Lyapunov method, synchronization conditions for the RDCNs are derived which are dependent on the diffusion term. Moreover, it is found that the proposed control strategy naturally excludes Zeno behavior. Finally, a numerical example is given to verify the obtained results.

  1. Comparison of sampling designs for estimating deforestation from landsat TM and MODIS imagery: a case study in Mato Grosso, Brazil.

    Science.gov (United States)

    Zhu, Shanyou; Zhang, Hailong; Liu, Ronggao; Cao, Yun; Zhang, Guixin

    2014-01-01

    Sampling designs are commonly used to estimate deforestation over large areas, but comparisons between different sampling strategies are required. Using PRODES deforestation data as a reference, deforestation in the state of Mato Grosso in Brazil from 2005 to 2006 is evaluated using Landsat imagery and a nearly synchronous MODIS dataset. The MODIS-derived deforestation is used to assist in sampling and extrapolation. Three sampling designs are compared according to the estimated deforestation of the entire study area based on simple extrapolation and linear regression models. The results show that stratified sampling for strata construction and sample allocation using the MODIS-derived deforestation hotspots provided more precise estimations than simple random and systematic sampling. Moreover, the relationship between the MODIS-derived and TM-derived deforestation provides a precise estimate of the total deforestation area as well as the distribution of deforestation in each block.
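
    The stratified extrapolation described in this record amounts to the standard stratified estimator of a population total, Σ_h N_h·ȳ_h; a sketch with invented hotspot/non-hotspot strata standing in for the MODIS-derived strata:

```python
import statistics

def stratified_total_estimate(strata):
    """Estimate a population total from a stratified sample:
    for each stratum, multiply the number of blocks N_h by the mean
    of the sampled per-block values, then sum over strata."""
    return sum(N_h * statistics.fmean(ys) for N_h, ys in strata)

# Illustrative strata: a small hotspot stratum of heavily deforested
# blocks and a large non-hotspot stratum (km^2 deforested per block)
strata = [
    (50, [12.0, 15.5, 9.8, 14.2]),     # hotspot blocks
    (450, [0.5, 0.0, 1.2, 0.3, 0.8]),  # non-hotspot blocks
]
est = stratified_total_estimate(strata)
print(est)  # → 895.75 km^2 estimated total deforestation
```

    Concentrating sample effort in the hotspot stratum, as the paper does, shrinks the variance of exactly this estimator relative to simple random or systematic designs.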

  2. Stratified Medicine and Reimbursement Issues

    Directory of Open Access Journals (Sweden)

    Hans-Joerg Fugel

    2012-10-01

    Full Text Available Stratified Medicine (SM) has the potential to target the patient populations who will benefit most from a therapy while reducing unnecessary health interventions associated with side effects. The link between clinical biomarkers/diagnostics and therapies provides new opportunities for value creation and strengthens the value proposition to pricing and reimbursement (P&R) authorities. However, the introduction of SM challenges current reimbursement schemes in many EU countries and the US, as different P&R policies have been adopted for drugs and diagnostics. There is also no consistent process for the value assessment of more complex diagnostics in these markets. New, innovative approaches and more flexible P&R systems are needed to reflect the added value of diagnostic tests and to stimulate investment in new technologies. Yet the framework for access to diagnostic-based therapies still requires further development, setting the right incentives and appropriately aligning stakeholder interests to realize long-term patient benefits. This article addresses the reimbursement challenges of SM approaches in several EU countries and the US, outlining some options to overcome existing reimbursement barriers for stratified medicine.

  3. Evaluation of a Class of Simple and Effective Uncertainty Methods for Sparse Samples of Random Variables and Functions

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bonney, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schroeder, Benjamin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Weirs, V. Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-11-01

    When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: the central 95% of the response, and a 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depend on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large database and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
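A minimal illustration of the under-estimation problem the report targets: with only a handful of samples, the sample min/max rarely brackets the true central 95% of the distribution, which is what motivates deliberately conservative estimators. The sample sizes and the standard-normal source below are assumptions for the sketch, not the report's test distributions:

```python
import random

random.seed(0)

def range_covers_central95(n, trials=2000):
    """Fraction of trials in which the min/max of n standard-normal
    draws brackets the true central-95% interval (-1.96, 1.96)."""
    hits = 0
    for _ in range(trials):
        xs = [random.gauss(0.0, 1.0) for _ in range(n)]
        if min(xs) <= -1.96 and max(xs) >= 1.96:
            hits += 1
    return hits / trials

coverage_sparse = range_covers_central95(5)     # very few samples
coverage_dense = range_covers_central95(100)    # plenty of samples
```

With five samples the naive range covers the true central 95% only a few percent of the time, so any defensible sparse-sample method must inflate its estimate well beyond the observed spread.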

  4. An improved algorithm of image processing technique for film thickness measurement in a horizontal stratified gas-liquid two-phase flow

    Energy Technology Data Exchange (ETDEWEB)

    Kuntoro, Hadiyan Yusuf, E-mail: hadiyan.y.kuntoro@mail.ugm.ac.id; Majid, Akmal Irfan; Deendarlianto, E-mail: deendarlianto@ugm.ac.id [Center for Energy Studies, Gadjah Mada University, Sekip K-1A Kampus UGM, Yogyakarta 55281 (Indonesia); Department of Mechanical and Industrial Engineering, Faculty of Engineering, Gadjah Mada University, Jalan Grafika 2, Yogyakarta 55281 (Indonesia); Hudaya, Akhmad Zidni; Dinaryanto, Okto [Department of Mechanical and Industrial Engineering, Faculty of Engineering, Gadjah Mada University, Jalan Grafika 2, Yogyakarta 55281 (Indonesia)

    2016-06-03

    Due to the importance of two-phase flow research for industrial safety analysis, many researchers have developed methods and techniques to study two-phase flow phenomena in industrial settings, such as the chemical, petroleum and nuclear industries. One developing technique is image processing. This technique is widely used in two-phase flow research because of its non-intrusive capability to process large amounts of visualization data containing many complexities. Moreover, it can capture direct visual information about the flow that is difficult to obtain by other methods. The main objective of this paper is to present an improved image processing algorithm, building on a preceding algorithm, for stratified flow cases. The present algorithm can measure the film thickness (h{sub L}) of stratified flow as well as the geometrical properties of the interfacial waves with lower processing time and random-access memory (RAM) usage than the preceding algorithm. The measurement results are also intended to build a high-quality database of stratified flow, which is currently scanty. In the present work, the measurement results showed satisfactory agreement with previous works.

  5. Numerical simulation of stratified flows with different k-ε turbulence models

    International Nuclear Information System (INIS)

    Dagestad, S.

    1991-01-01

    The thesis comprises the numerical simulation of stratified flows with different k-ε models. When using the k-ε model, two equations are solved to describe the turbulence: the k-equation represents the turbulent kinetic energy and the ε-equation the turbulent dissipation. Different k-ε models predict stratified flows differently. The standard k-ε model leads to higher turbulent mixing than the low-Reynolds model does; for lower Froude numbers, F_0, this effect is enhanced. Buoyancy extension of the k-ε model also leads to less vertical mixing in cases with strong stratification, and as the stratification increases, buoyancy extension has a larger influence. The turbulent Prandtl number effects have a large impact on the transport of heat and the development of the flow; two different formulae expressing these effects have been tested. For unstably stratified flows, the rapid mixing and three-dimensionality of the flow can in fact be computed with a k-ε model when buoyancy extension is employed. The turbulent heat transfer, and thus the turbulent production in unstably stratified flows, depends strongly on the turbulent Prandtl number effect. The main conclusions are: stably stratified flows should be computed with a buoyancy-extended low-Reynolds k-ε model; unstably stratified flows should be computed with a buoyancy-extended standard k-ε model; the turbulent Prandtl number effects should be included in the computations; and buoyancy extension has led to a more correct description of the physics for all of the investigated flows. 78 refs., 128 figs., 17 tabs

  6. MC3D modelling of stratified explosion

    International Nuclear Information System (INIS)

    Picchi, S.; Berthoud, G.

    1999-01-01

    It is known that a steam explosion can occur in a stratified geometry and that the observed yields are lower than in the case of explosion in a premixture configuration. However, very few models are available to quantify the amount of melt which can be involved and the pressure peak that can be developed. In the stratified application of the MC3D code, mixing and fragmentation of the melt are explained by the growth of Kelvin Helmholtz instabilities due to the shear flow of the two phase coolant above the melt. Such a model is then used to recalculate the Frost-Ciccarelli tin-water experiment. Pressure peak, speed of propagation, bubble shape and erosion height are well reproduced as well as the influence of the inertial constraint (height of the water pool). (author)

  7. MC3D modelling of stratified explosion

    Energy Technology Data Exchange (ETDEWEB)

    Picchi, S.; Berthoud, G. [DTP/SMTH/LM2, CEA, 38 - Grenoble (France)

    1999-07-01

    It is known that a steam explosion can occur in a stratified geometry and that the observed yields are lower than in the case of explosion in a premixture configuration. However, very few models are available to quantify the amount of melt which can be involved and the pressure peak that can be developed. In the stratified application of the MC3D code, mixing and fragmentation of the melt are explained by the growth of Kelvin Helmholtz instabilities due to the shear flow of the two phase coolant above the melt. Such a model is then used to recalculate the Frost-Ciccarelli tin-water experiment. Pressure peak, speed of propagation, bubble shape and erosion height are well reproduced as well as the influence of the inertial constraint (height of the water pool). (author)

  8. Discrete element method (DEM) simulations of stratified sampling during solid dosage form manufacturing.

    Science.gov (United States)

    Hancock, Bruno C; Ketterhagen, William R

    2011-10-14

    Discrete element model (DEM) simulations of the discharge of powders from hoppers under gravity were analyzed to provide estimates of dosage form content uniformity during the manufacture of solid dosage forms (tablets and capsules). For a system that exhibits moderate segregation the effects of sample size, number, and location within the batch were determined. The various sampling approaches were compared to current best-practices for sampling described in the Product Quality Research Institute (PQRI) Blend Uniformity Working Group (BUWG) guidelines. Sampling uniformly across the discharge process gave the most accurate results with respect to identifying segregation trends. Sigmoidal sampling (as recommended in the PQRI BUWG guidelines) tended to overestimate potential segregation issues, whereas truncated sampling (common in industrial practice) tended to underestimate them. The size of the sample had a major effect on the absolute potency RSD. The number of sampling locations (10 vs. 20) had very little effect on the trends in the data, and the number of samples analyzed at each location (1 vs. 3 vs. 7) had only a small effect for the sampling conditions examined. The results of this work provide greater understanding of the effect of different sampling approaches on the measured content uniformity of real dosage forms, and can help to guide the choice of appropriate sampling protocols. Copyright © 2011 Elsevier B.V. All rights reserved.
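The contrast between uniform and truncated sampling reported here can be illustrated with a toy potency profile: if a blend segregates so that potency drifts across the discharge, sampling only the middle of the batch understates the variability. The linear profile and all numbers are hypothetical, not the DEM results:

```python
def potency(t):
    """Toy potency profile (% of label claim) during discharge of a
    mildly segregating blend: super-potent at the start, sub-potent
    at the end (t runs from 0 to 1)."""
    return 104.0 - 8.0 * t

def rsd(values):
    """Relative standard deviation, in percent."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / (len(values) - 1)
    return 100.0 * var ** 0.5 / mean

n = 10
uniform = [potency(i / (n - 1)) for i in range(n)]                 # whole discharge
truncated = [potency(0.2 + 0.6 * i / (n - 1)) for i in range(n)]  # middle 60% only

rsd_uniform = rsd(uniform)
rsd_truncated = rsd(truncated)
```

The truncated schedule misses the extreme early and late material, so its RSD is noticeably lower than the uniform schedule's, the direction of bias the simulations found for industrial truncated sampling.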

  9. The effect of existing turbulence on stratified shear instability

    Science.gov (United States)

    Kaminski, Alexis; Smyth, William

    2017-11-01

    Ocean turbulence is an essential process governing, for example, heat uptake by the ocean. In the stably-stratified ocean interior, this turbulence occurs in discrete events driven by vertical variations of the horizontal velocity. Typically, these events have been modelled by assuming an initially laminar stratified shear flow which develops wavelike instabilities, becomes fully turbulent, and then relaminarizes into a stable state. However, in the real ocean there is always some level of turbulence left over from previous events, and it is not yet understood how this turbulence impacts the evolution of future mixing events. Here, we perform a series of direct numerical simulations of turbulent events developing in stratified shear flows that are already at least weakly turbulent. We do so by varying the amplitude of the initial perturbations, and examine the subsequent development of the instability and the impact on the resulting turbulent fluxes. This work is supported by NSF Grant OCE1537173.

  10. A tale of two "forests": random forest machine learning AIDS tropical forest carbon mapping.

    Science.gov (United States)

    Mascaro, Joseph; Asner, Gregory P; Knapp, David E; Kennedy-Bowdoin, Ty; Martin, Roberta E; Anderson, Christopher; Higgins, Mark; Chadwick, K Dana

    2014-01-01

    Accurate and spatially-explicit maps of tropical forest carbon stocks are needed to implement carbon offset mechanisms such as REDD+ (Reduced Deforestation and Degradation Plus). The Random Forest machine learning algorithm may aid carbon mapping applications using remotely-sensed data. However, Random Forest has never been compared to traditional and potentially more reliable techniques such as regionally stratified sampling and upscaling, and it has rarely been employed with spatial data. Here, we evaluated the performance of Random Forest in upscaling airborne LiDAR (Light Detection and Ranging)-based carbon estimates compared to the stratification approach over a 16-million hectare focal area of the Western Amazon. We considered two runs of Random Forest, both with and without spatial contextual modeling by including--in the latter case--x and y position directly in the model. In each case, we set aside 8 million hectares (i.e., half of the focal area) for validation; this rigorous test of Random Forest went above and beyond the internal validation normally compiled by the algorithm (i.e., called "out-of-bag"), which proved insufficient for this spatial application. In this heterogeneous region of Northern Peru, the model with spatial context was the best performing run of Random Forest, and explained 59% of LiDAR-based carbon estimates within the validation area, compared to 37% for stratification or 43% for Random Forest without spatial context. With the 60% improvement in explained variation, RMSE against validation LiDAR samples improved from 33 to 26 Mg C ha(-1) when using Random Forest with spatial context. Our results suggest that spatial context should be considered when using Random Forest, and that doing so may result in substantially improved carbon stock modeling for purposes of climate change mitigation.
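The "stratification approach" used as the baseline in this record is essentially stratify-and-multiply upscaling: average the LiDAR-calibrated carbon density within each stratum and multiply by the stratum's mapped area. A minimal sketch, with hypothetical strata, plot densities, and areas (not the study's data):

```python
# Hypothetical LiDAR calibration plots grouped by a vegetation stratum,
# and the mapped area (ha) of each stratum across the region.
lidar_plots = {
    "lowland_forest": [95.0, 102.0, 88.0, 110.0],   # Mg C / ha
    "terrace_forest": [70.0, 64.0, 75.0],
    "swamp_forest":   [45.0, 52.0, 48.0],
}
stratum_area_ha = {
    "lowland_forest": 9_000_000,
    "terrace_forest": 5_000_000,
    "swamp_forest":   2_000_000,
}

def stratified_total(plots, areas):
    """Stratify-and-multiply upscaling: stratum mean density x stratum area."""
    return sum(areas[s] * sum(v) / len(v) for s, v in plots.items())

total_mg_c = stratified_total(lidar_plots, stratum_area_ha)
```

Random Forest replaces the single per-stratum mean with a per-location prediction from covariates (here, optionally including x and y position), which is why it can capture heterogeneity within strata that this baseline averages away.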

  11. A tale of two "forests": random forest machine learning AIDS tropical forest carbon mapping.

    Directory of Open Access Journals (Sweden)

    Joseph Mascaro

    Full Text Available Accurate and spatially-explicit maps of tropical forest carbon stocks are needed to implement carbon offset mechanisms such as REDD+ (Reduced Deforestation and Degradation Plus). The Random Forest machine learning algorithm may aid carbon mapping applications using remotely-sensed data. However, Random Forest has never been compared to traditional and potentially more reliable techniques such as regionally stratified sampling and upscaling, and it has rarely been employed with spatial data. Here, we evaluated the performance of Random Forest in upscaling airborne LiDAR (Light Detection and Ranging)-based carbon estimates compared to the stratification approach over a 16-million hectare focal area of the Western Amazon. We considered two runs of Random Forest, both with and without spatial contextual modeling by including--in the latter case--x and y position directly in the model. In each case, we set aside 8 million hectares (i.e., half of the focal area) for validation; this rigorous test of Random Forest went above and beyond the internal validation normally compiled by the algorithm (i.e., called "out-of-bag"), which proved insufficient for this spatial application. In this heterogeneous region of Northern Peru, the model with spatial context was the best performing run of Random Forest, and explained 59% of LiDAR-based carbon estimates within the validation area, compared to 37% for stratification or 43% for Random Forest without spatial context. With the 60% improvement in explained variation, RMSE against validation LiDAR samples improved from 33 to 26 Mg C ha(-1) when using Random Forest with spatial context. Our results suggest that spatial context should be considered when using Random Forest, and that doing so may result in substantially improved carbon stock modeling for purposes of climate change mitigation.

  12. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek; Verma, Mahendra K.; Sukhatme, Jai

    2017-01-01

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small scale turbulence. The VSHF consists of internal gravity waves and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k−3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k−3 power-law and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k−11/5 form at large scales, to a steeper approximate k−3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k−1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k−2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k−1.6 power-law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.

  13. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek

    2017-01-11

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small scale turbulence. The VSHF consists of internal gravity waves and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k−3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k−3 power-law and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k−11/5 form at large scales, to a steeper approximate k−3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k−1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k−2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k−1.6 power-law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.

  14. Visualization of mole fraction distribution of slow jet forming stably stratified field

    International Nuclear Information System (INIS)

    Fumizawa, Motoo; Hishida, Makoto

    1990-01-01

    An experimental study has been performed to investigate the behavior of flow and mass transfer in a gaseous slow jet in which the buoyancy force opposed the flow, forming a stably stratified field. The study was performed to understand the basic features of air ingress phenomena in a pipe rupture accident of the high temperature gas-cooled reactor. A displacement fringe technique was adopted in a Mach-Zehnder interferometer to visualize the mole fraction distribution. As a result, the following were obtained: (1) Stably stratified fields were formed in the vicinity of the outlet of the slow jet, and their penetration distance increased with Froude number. (2) Mass fraction distributions in the stably stratified fields were well correlated with the present model using the ramp mole velocity profile. (author)

  15. Randomization of grab-sampling strategies for estimating the annual exposure of U miners to Rn daughters.

    Science.gov (United States)

    Borak, T B

    1986-04-01

    Periodic grab sampling in combination with time-of-occupancy surveys has been the accepted procedure for estimating the annual exposure of underground U miners to Rn daughters. Temporal variations in the concentration of potential alpha energy in the mine generate uncertainties in this process. A system to randomize the selection of locations for measurement is described which can reduce uncertainties and eliminate systematic biases in the data. In general, a sample frequency of 50 measurements per year is sufficient to satisfy the criteria that the annual exposure be determined in working level months to within +/- 50% of the true value with a 95% level of confidence. Suggestions for implementing this randomization scheme are presented.
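The paper's criterion (roughly 50 random grab samples per year sufficing to estimate annual exposure to within +/- 50% at 95% confidence) can be checked on a toy mine with lognormal day-to-day variation in Rn-daughter concentration. The variability parameters below are assumptions for the sketch, not measured mine data:

```python
import random

random.seed(1)

# Hypothetical daily Rn-daughter concentrations (working levels, WL) over a
# 250-day work year, with lognormal day-to-day variation.
days = [random.lognormvariate(0.0, 0.6) for _ in range(250)]
true_annual = sum(days) / len(days)   # average WL, proportional to annual WLM

def grab_estimate(n_samples):
    """Estimate the annual average WL from randomly chosen sampling days."""
    sample = random.sample(range(len(days)), n_samples)
    return sum(days[i] for i in sample) / n_samples

trials = 1000
within_50pct = sum(
    abs(grab_estimate(50) - true_annual) <= 0.5 * true_annual
    for _ in range(trials)
) / trials
```

With this level of temporal variability, 50 randomly placed grab samples land within 50% of the true annual average in well over 95% of trials; randomizing the sampling days is what removes the systematic bias a fixed schedule could introduce.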

  16. A systematic examination of a random sampling strategy for source apportionment calculations.

    Science.gov (United States)

    Andersson, August

    2011-12-15

    Estimating the relative contributions from multiple potential sources of a specific component in a mixed environmental matrix is a general challenge in diverse fields such as atmospheric, environmental and earth sciences. Perhaps the most common strategy for tackling such problems is by setting up a system of linear equations for the fractional influence of different sources. Even though an algebraic solution of this approach is possible for the common situation with N+1 sources and N source markers, such methodology introduces a bias, since it is implicitly assumed that the calculated fractions and the corresponding uncertainties are independent of the variability of the source distributions. Here, a random sampling (RS) strategy for accounting for such statistical bias is examined by investigating rationally designed synthetic data sets. This random sampling methodology is found to be robust and accurate with respect to reproducibility and predictability. This method is also compared to a numerical integration solution for a two-source situation where source variability also is included. A general observation from this examination is that the variability of the source profiles not only affects the calculated precision but also the mean/median source contributions. Copyright © 2011 Elsevier B.V. All rights reserved.
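For the common N+1 sources with N markers case, the random sampling (RS) strategy amounts to repeatedly drawing source signatures from their distributions and solving the mass balance for each draw, so that the variability of the source profiles propagates into the estimated fractions. A two-source sketch with hypothetical marker values (not from the paper's synthetic data sets):

```python
import random
import statistics

random.seed(7)

# Hypothetical two-source problem: a measured mixture marker value and
# endmember marker distributions (mean, sd), e.g. isotopic signatures.
mix_marker = -25.0
src_a = (-28.0, 1.0)   # e.g. a fossil source
src_b = (-21.0, 1.5)   # e.g. a biomass source

def random_sampling_fractions(n=10000):
    """Propagate endmember variability by randomly sampling the source
    signatures and solving the two-source mass balance for each draw."""
    fractions = []
    while len(fractions) < n:
        a = random.gauss(*src_a)
        b = random.gauss(*src_b)
        f = (mix_marker - b) / (a - b)     # fraction from source A
        if 0.0 <= f <= 1.0:               # keep physically meaningful draws
            fractions.append(f)
    return fractions

fracs = random_sampling_fractions()
f_mean = statistics.mean(fracs)
f_sd = statistics.stdev(fracs)
```

As the abstract notes, the source variability affects not only the spread `f_sd` but also the central estimate `f_mean`, which can differ from the value obtained by plugging the mean signatures into the algebraic solution.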

  17. Simulation of steam explosion in stratified melt-coolant configuration

    International Nuclear Information System (INIS)

    Leskovar, Matjaž; Centrih, Vasilij; Uršič, Mitja

    2016-01-01

    Highlights: • Strong steam explosions may develop spontaneously in stratified configurations. • Considerable melt-coolant premixed layer formed in subcooled water with hot melts. • Analysis with MC3D code provided insight into stratified steam explosion phenomenon. • Up to 25% of poured melt was mixed with water and available for steam explosion. • Better instrumented experiments needed to determine dominant mixing process. - Abstract: A steam explosion is an energetic fuel-coolant interaction process, which may occur during a severe reactor accident when the molten core comes into contact with the coolant water. In nuclear reactor safety analyses, steam explosions are primarily considered in melt jet-coolant pool configurations, where sufficiently deep coolant pool conditions provide complete jet breakup and efficient premixture formation. Stratified melt-coolant configurations, i.e. a molten melt layer below a coolant layer, were until now believed to be unable to generate strong explosive interactions. Based on the hypothesis that there are no interfacial instabilities in a stratified configuration, it was assumed that the amount of melt in the premixture is insufficient to produce strong explosions. However, recently performed experiments in the PULiMS and SES (KTH, Sweden) facilities with oxidic corium simulants revealed that strong steam explosions may develop spontaneously also in stratified melt-coolant configurations, where with high temperature melts and subcooled water a considerable melt-coolant premixed layer is formed. In this article, a study of steam explosions in a stratified melt-coolant configuration under PULiMS-like conditions is presented. The goal of this analytical work is to supplement the experimental activities within the PULiMS research program by addressing the key questions, especially regarding the explosivity of the formed premixed layer and the mechanisms responsible for the melt-water mixing.

  18. Comparison of Sampling Designs for Estimating Deforestation from Landsat TM and MODIS Imagery: A Case Study in Mato Grosso, Brazil

    Directory of Open Access Journals (Sweden)

    Shanyou Zhu

    2014-01-01

    Full Text Available Sampling designs are commonly used to estimate deforestation over large areas, but comparisons between different sampling strategies are required. Using PRODES deforestation data as a reference, deforestation in the state of Mato Grosso in Brazil from 2005 to 2006 is evaluated using Landsat imagery and a nearly synchronous MODIS dataset. The MODIS-derived deforestation is used to assist in sampling and extrapolation. Three sampling designs are compared according to the estimated deforestation of the entire study area based on simple extrapolation and linear regression models. The results show that stratified sampling for strata construction and sample allocation using the MODIS-derived deforestation hotspots provided more precise estimations than simple random and systematic sampling. Moreover, the relationship between the MODIS-derived and TM-derived deforestation provides a precise estimate of the total deforestation area as well as the distribution of deforestation in each block.

  19. RADIAL STABILITY IN STRATIFIED STARS

    International Nuclear Information System (INIS)

    Pereira, Jonas P.; Rueda, Jorge A.

    2015-01-01

    We formulate within a generalized distributional approach the treatment of the stability against radial perturbations for both neutral and charged stratified stars in Newtonian and Einstein's gravity. We obtain from this approach the boundary conditions connecting any two phases within a star and underline its relevance for realistic models of compact stars with phase transitions, owing to the modification of the star's set of eigenmodes with respect to the continuous case

  20. A Systematic Review of Surgical Randomized Controlled Trials: Part 2. Funding Source, Conflict of Interest, and Sample Size in Plastic Surgery.

    Science.gov (United States)

    Voineskos, Sophocles H; Coroneos, Christopher J; Ziolkowski, Natalia I; Kaur, Manraj N; Banfield, Laura; Meade, Maureen O; Chung, Kevin C; Thoma, Achilleas; Bhandari, Mohit

    2016-02-01

    The authors examined industry support, conflict of interest, and sample size in plastic surgery randomized controlled trials that compared surgical interventions. They hypothesized that industry-funded trials demonstrate statistically significant outcomes more often, and randomized controlled trials with small sample sizes report statistically significant results more frequently. An electronic search identified randomized controlled trials published between 2000 and 2013. Independent reviewers assessed manuscripts and performed data extraction. Funding source, conflict of interest, primary outcome direction, and sample size were examined. Chi-squared and independent-samples t tests were used in the analysis. The search identified 173 randomized controlled trials, of which 100 (58 percent) did not acknowledge funding status. A relationship between funding source and trial outcome direction was not observed. Both funding status and conflict of interest reporting improved over time. Only 24 percent (six of 25) of industry-funded randomized controlled trials reported authors to have independent control of data and manuscript contents. The mean number of patients randomized was 73 per trial (median, 43; minimum, 3; maximum, 936). Small trials were not found to be positive more often than large trials (p = 0.87). Randomized controlled trials with small sample size were common; however, this provides great opportunity for the field to engage in further collaboration and produce larger, more definitive trials. Reporting of trial funding and conflict of interest is historically poor, but it greatly improved over the study period. Underreporting at author and journal levels remains a limitation when assessing the relationship between funding source and trial outcomes. Improved reporting and manuscript control should be goals that both authors and journals can actively achieve.

  1. Distributed fiber sparse-wideband vibration sensing by sub-Nyquist additive random sampling

    Science.gov (United States)

    Zhang, Jingdong; Zheng, Hua; Zhu, Tao; Yin, Guolu; Liu, Min; Bai, Yongzhong; Qu, Dingrong; Qiu, Feng; Huang, Xianbing

    2018-05-01

    The round-trip time of the light pulse limits the maximum detectable vibration frequency response range of phase-sensitive optical time domain reflectometry (φ-OTDR). Unlike the uniform laser pulse interval in conventional φ-OTDR, we randomly modulate the pulse interval, so that an equivalent sub-Nyquist additive random sampling (sNARS) is realized for every sensing point of the long interrogation fiber. For a φ-OTDR system with 10 km sensing length, the sNARS method is optimized by theoretical analysis and Monte Carlo simulation, and the experimental results verify that a wideband sparse signal can be identified and reconstructed. Such a method can broaden the vibration frequency response range of φ-OTDR, which is of great significance in sparse-wideband-frequency vibration signal detection, such as rail track monitoring and metal defect detection.
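The core of sub-Nyquist additive random sampling is that randomizing the intervals between samples breaks aliasing, so a sparse tone above the mean-rate Nyquist frequency remains identifiable. A toy sketch: the signal frequency, jitter law, and frequency grid are assumptions, and the naive correlation search below merely stands in for the paper's reconstruction method:

```python
import math
import random

random.seed(3)

f_sig = 80.0                  # true vibration frequency (Hz)
mean_dt = 1 / 100.0           # mean sampling interval -> 50 Hz "Nyquist"
n = 400

# Additive random sampling: each interval is the mean interval scaled by
# an independent random factor in [0.5, 1.5).
times = []
t = 0.0
for _ in range(n):
    t += mean_dt * (0.5 + random.random())
    times.append(t)
samples = [math.sin(2 * math.pi * f_sig * tk) for tk in times]

def spectrum_peak(times, samples, freqs):
    """Return the candidate frequency with the largest correlation power."""
    best_f, best_p = None, -1.0
    for f in freqs:
        re = sum(x * math.cos(2 * math.pi * f * tk) for tk, x in zip(times, samples))
        im = sum(x * math.sin(2 * math.pi * f * tk) for tk, x in zip(times, samples))
        p = re * re + im * im
        if p > best_p:
            best_f, best_p = f, p
    return best_f

freqs = [0.5 * k for k in range(1, 241)]     # 0.5 ... 120 Hz search grid
detected = spectrum_peak(times, samples, freqs)
```

Uniform sampling at 100 Hz would fold the 80 Hz tone down to 20 Hz; with the randomized intervals, the correlation peak sits at the true frequency even though the mean sampling rate is only 100 Hz.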

  2. Soil mixing of stratified contaminated sands.

    Science.gov (United States)

    Al-Tabba, A; Ayotamuno, M J; Martin, R J

    2000-02-01

    Validation of soil mixing for the treatment of contaminated ground is needed in a wide range of site conditions to widen the application of the technology and to understand the mechanisms involved. Since very limited work has been carried out in heterogeneous ground conditions, this paper investigates the effectiveness of soil mixing in stratified sands using laboratory-scale augers. This enabled a low cost investigation of factors such as grout type and form, auger design, installation procedure, mixing mode, curing period, thickness of soil layers and natural moisture content on the unconfined compressive strength, leachability and leachate pH of the soil-grout mixes. The results showed that the auger design plays a very important part in the mixing process in heterogeneous sands. The variability of the properties measured in the stratified soils and the measurable variations caused by the various factors considered, highlighted the importance of duplicating appropriate in situ conditions, the usefulness of laboratory-scale modelling of in situ conditions and the importance of modelling soil and contaminant heterogeneities at the treatability study stage.

  3. Stratified charge rotary aircraft engine technology enablement program

    Science.gov (United States)

    Badgley, P. R.; Irion, C. E.; Myers, D. M.

    1985-01-01

    The multifuel stratified charge rotary engine is discussed. A single-rotor, 0.7 L/40 cu in displacement, research rig engine was tested. The research rig engine was designed for operation at high speeds and pressures, with combustion chamber peak pressure providing margin for speed and load excursions above the design requirement for a highly advanced aircraft engine. It is indicated that the single-rotor research rig engine is capable of meeting the established design requirements of 120 kW, 8,000 rpm, 1,379 kPa BMEP. The research rig engine, when fully developed, will be a valuable tool for investigating advanced and highly advanced technology components, and will provide an understanding of the stratified charge rotary engine combustion process.

  4. A review of recent developments on turbulent entrainment in stratified flows

    International Nuclear Information System (INIS)

    Cotel, Aline J

    2010-01-01

    Stratified interfaces are present in many geophysical flow situations, and transport across such an interface is an essential factor for correctly evaluating the physical processes taking place at many spatial and temporal scales in such flows. In order to accurately evaluate vertical and lateral transport occurring when a turbulent flow impinges on a stratified interface, the turbulent entrainment and vorticity generation mechanisms near the interface must be understood and quantified. Laboratory experiments were performed for three flow configurations: a vertical thermal, a sloping gravity current and a vertical turbulent jet with various tilt angles and precession speeds. All three flows impinged on an interface separating a two-layer stably stratified environment. The entrainment rate is quantified for each flow using laser-induced fluorescence and compared to predictions of Cotel and Breidenthal (1997 Appl. Sci. Res. 57 349-66). The possible applications of transport across stratified interfaces include the contribution of hydrothermal plumes to the global ocean energy budget, turbidity currents on the ocean floor, the design of lake de-stratification systems, modeling gas leaks from storage reservoirs, weather forecasting and global climate change.

  5. Stratified turbulent Bunsen flames : flame surface analysis and flame surface density modelling

    NARCIS (Netherlands)

    Ramaekers, W.J.S.; Oijen, van J.A.; Goey, de L.P.H.

    2012-01-01

    In this paper it is investigated whether the Flame Surface Density (FSD) model, developed for turbulent premixed combustion, is also applicable to stratified flames. Direct Numerical Simulations (DNS) of turbulent stratified Bunsen flames have been carried out, using the Flamelet Generated Manifold

  6. Tobacco smoking surveillance: is quota sampling an efficient tool for monitoring national trends? A comparison with a random cross-sectional survey.

    Directory of Open Access Journals (Sweden)

    Romain Guignard

    Full Text Available OBJECTIVES: It is crucial for policy makers to monitor the evolution of tobacco smoking prevalence. In France, this monitoring is based on a series of cross-sectional general population surveys, the Health Barometers, conducted every five years and based on random samples. A methodological study has been carried out to assess the reliability of a monitoring system based on regular quota sampling surveys for smoking prevalence. DESIGN / OUTCOME MEASURES: In 2010, current and daily tobacco smoking prevalences obtained in a quota survey on 8,018 people were compared with those of the 2010 Health Barometer carried out on 27,653 people. Prevalences were assessed separately according to the telephone equipment of the interviewee (landline phone owner vs "mobile-only"), and logistic regressions were conducted in the pooled database to assess the impact of the telephone equipment and of the survey mode on the prevalences found. Finally, logistic regressions adjusted for sociodemographic characteristics were conducted in the random sample in order to determine the impact of the number of calls needed to interview "hard-to-reach" people on the prevalence found. RESULTS: Current and daily prevalences were higher in the random sample (respectively 33.9% and 27.5% in 15-75 year-olds) than in the quota sample (respectively 30.2% and 25.3%). In both surveys, current and daily prevalences were lower among landline phone owners (respectively 31.8% and 25.5% in the random sample and 28.9% and 24.0% in the quota survey). The required number of calls was slightly related to the smoking status after adjustment for sociodemographic characteristics. CONCLUSION: Random sampling appears to be more effective than quota sampling, mainly by making it possible to interview hard-to-reach populations.

  7. Stratified Entomological Sampling in Preparation for an Area-Wide Integrated Pest Management Program: The Example of Glossina palpalis gambiensis (Diptera: Glossinidae) in the Niayes of Senegal

    International Nuclear Information System (INIS)

    Bouyer, Jeremy; Seck, Momar Talla; Guerrini, Laure; Sall, Baba; Ndiaye, Elhadji Youssou; Vreysen, Marc J.B.

    2010-01-01

    The riverine tsetse species Glossina palpalis gambiensis Vanderplank 1949 (Diptera: Glossinidae) inhabits riparian forests along river systems in West Africa. The government of Senegal has embarked on a project to eliminate this tsetse species, and African animal trypanosomoses, from the Niayes area using an area-wide integrated pest management approach. A stratified entomological sampling strategy was therefore developed using spatial analytical tools and mathematical modeling. A preliminary phytosociological census identified eight types of suitable habitat, which could be discriminated from LandSat 7ETM satellite images and denominated wet areas. At the end of March 2009, 683 unbaited Vavoua traps had been deployed, and the observed infested area in the Niayes was 525 km2. In the remaining area, a mathematical model was used to assess the risk that flies were present despite a sequence of zero catches. The analysis showed that this risk was above 0.05 in 19% of this area, which will be considered as infested during the control operations. The remote sensing analysis that identified the wet areas allowed a restriction of the area to be surveyed to 4% of the total surface area (7,150 km2), whereas the mathematical model provided an efficient method to improve the accuracy and the robustness of the sampling protocol. The final size of the control area will be decided based on the entomological collection data. This entomological sampling procedure might be used for other vector or pest control scenarios. (Authors)
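
    The risk that flies are present despite a sequence of zero catches can be illustrated with a simple Bayesian sketch. This is not the authors' model; the per-check detection probability and the prior below are invented for illustration.

```python
# Hypothetical illustration: posterior probability that tsetse flies are
# present at a site after a run of zero catches.  The detection
# probability p_detect and the prior are assumed values, not the study's.
def presence_risk(n_zero_catches, p_detect=0.2, prior=0.5):
    """P(present | all n trap checks caught nothing), by Bayes' rule."""
    p_zeros_if_present = (1.0 - p_detect) ** n_zero_catches
    num = p_zeros_if_present * prior
    return num / (num + (1.0 - prior))

# The risk decays with every additional empty trap check; an area would
# be declared infested while the risk stays above a threshold like 0.05.
for n in (0, 5, 10, 20):
    print(n, round(presence_risk(n), 4))
```

    A threshold such as 0.05 then translates a sequence of zero catches into a declaration: sites with too few checks keep a risk above the threshold and remain classified as infested.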

  8. Two-phase air-water stratified flow measurement using ultrasonic techniques

    International Nuclear Information System (INIS)

    Fan, Shiwei; Yan, Tinghu; Yeung, Hoi

    2014-01-01

    In this paper, a time resolved ultrasound system was developed for investigating two-phase air-water stratified flow. The hardware of the system includes a pulsed wave transducer, a pulser/receiver, and a digital oscilloscope. The time domain cross correlation method is used to calculate the velocity profile along the ultrasonic beam. The system is able to provide velocities with a spatial resolution of around 1 mm and a temporal resolution of 200 μs. Experiments were carried out on single-phase water flow and two-phase air-water stratified flow. For single-phase water flow, the flow rates from the ultrasound system were compared with those from an electromagnetic (EM) flow meter, which showed good agreement. Experiments were then conducted on two-phase air-water stratified flow and the results are given. Comparison with liquid height measurements from a conductance probe indicated that the measured velocities were explainable.
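
    The time-domain cross-correlation step can be sketched in a few lines: scatterers moving with the flow shift the echo pattern between two successive pulses, the lag of the correlation peak gives the displacement, and displacement over the pulse interval gives velocity. All numbers below (sampling rate, pulse interval, sound speed) are assumed for illustration, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two successive echoes: the second is the first pattern displaced by
# a known number of samples (speckle modeled as white noise).
n = 2000
echo1 = rng.standard_normal(n)
true_shift = 37
echo2 = np.roll(echo1, true_shift)

# Time-domain cross correlation; the peak lag is the displacement.
corr = np.correlate(echo2, echo1, mode="full")
lag = np.argmax(corr) - (n - 1)

fs = 50e6                        # assumed digitizer rate, Hz
pulse_interval = 200e-6          # assumed pulse repetition period, s
c = 1480.0                       # speed of sound in water, m/s
displacement = lag / fs * c / 2.0    # two-way travel -> divide by 2
velocity = displacement / pulse_interval
print(lag, velocity)
```

    Repeating this over successive range gates along the beam yields the velocity profile mentioned in the abstract.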

  9. Spinning phenomena and energetics of spherically pulsating patterns in stratified fluids

    International Nuclear Information System (INIS)

    Ibragimov, Ranis N; Dameron, Michael

    2011-01-01

    The nonlinear solutions of the two-dimensional Boussinesq equations describing internal waves in rotating stratified fluids were obtained as group invariant solutions. These nonlinear solutions correspond to the rotation transformation preserving the form of the original nonlinear equations of motion. It is shown that the obtained class of exact solutions can be associated with the spherically pulsating patterns observed in uniformly stratified fluids, and that the obtained rotationally symmetric solutions are bounded functions that can be visualized as spinning patterns in stratified fluids. Moreover, the rotational transformation provides the energy conservation law, together with other conservation laws, for which the spinning phenomenon is observed. The effects of nonlinearity and the Earth's rotation on this phenomenon are also discussed.

  10. Hydrogeology and water quality of the stratified-drift aquifer in the Pony Hollow Creek Valley, Tompkins County, New York

    Science.gov (United States)

    Bugliosi, Edward F.; Miller, Todd S.; Reynolds, Richard J.

    2014-01-01

    The lithology, areal extent, and the water-table configuration in stratified-drift aquifers in the northern part of the Pony Hollow Creek valley in the Town of Newfield, New York, were mapped as part of an ongoing aquifer mapping program in Tompkins County. Surficial geologic and soil maps, well and test-boring records, light detection and ranging (lidar) data, water-level measurements, and passive-seismic surveys were used to map the aquifer geometry, construct geologic sections, and determine the depth to bedrock at selected locations throughout the valley. Additionally, water-quality samples were collected from selected streams and wells to characterize the quality of surface and groundwater in the study area. Sedimentary bedrock underlies the study area and is overlain by unstratified drift (till), stratified drift (glaciolacustrine and glaciofluvial deposits), and recent post glacial alluvium. The major type of unconsolidated, water-yielding material in the study area is stratified drift, which consists of glaciofluvial sand and gravel, and is present in sufficient amounts in most places to form an extensive unconfined aquifer throughout the study area, which is the source of water for most residents, farms, and businesses in the valleys. A map of the water table in the unconfined aquifer was constructed by using (1) measurements made between the mid-1960s through 2010, (2) control on the altitudes of perennial streams at 10-foot contour intervals from lidar data collected by Tompkins County, and (3) water surfaces of ponds and wetlands that are hydraulically connected to the unconfined aquifer. Water-table contours indicate that the direction of groundwater flow within the stratified-drift aquifer is predominantly from the valley walls toward the streams and ponds in the central part of the valley where groundwater then flows southwestward (down valley) toward the confluence with the Cayuta Creek valley. 
Locally, the direction of groundwater flow is radially

  11. Iterative algorithm of discrete Fourier transform for processing randomly sampled NMR data sets

    International Nuclear Information System (INIS)

    Stanek, Jan; Kozminski, Wiktor

    2010-01-01

    Spectra obtained by application of multidimensional Fourier Transformation (MFT) to sparsely sampled nD NMR signals are usually corrupted due to missing data. In the present paper this phenomenon is investigated on simulations and experiments. An effective iterative algorithm for artifact suppression for sparse on-grid NMR data sets is discussed in detail. It includes automated peak recognition based on statistical methods. The results enable one to study NMR spectra with a high dynamic range of peak intensities while preserving the benefits of random sampling, namely the superior resolution in indirectly measured dimensions. Experimental examples include 3D 15N- and 13C-edited NOESY-HSQC spectra of human ubiquitin.
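
    The iterative artifact-suppression idea can be sketched as a CLEAN-style loop: transform the zero-filled data, pick the strongest peak, subtract that component's contribution at the sampled points only, and repeat. This is a simplified illustration with invented parameters, not the authors' algorithm (which adds statistical peak recognition).

```python
import numpy as np

rng = np.random.default_rng(2)

# Sparse-spectrum test signal on an N-point grid: two complex tones.
N = 256
true_bins = {30: 1.0, 75: 0.6}
t = np.arange(N)
x = sum(a * np.exp(2j * np.pi * k * t / N) for k, a in true_bins.items())

# Random on-grid subsampling: keep 25% of the points, zero-fill the rest.
keep = rng.choice(N, size=64, replace=False)
mask = np.zeros(N, bool)
mask[keep] = True
residual = np.where(mask, x, 0)

recovered = {}
for _ in range(2):                        # one peak per iteration
    spec = np.fft.fft(residual) / mask.sum()
    k = int(np.argmax(np.abs(spec)))      # strongest remaining peak
    a = spec[k]
    recovered[k] = recovered.get(k, 0) + a
    # Subtract this component at the sampled points only, so its
    # sampling artifacts vanish along with it.
    residual = residual - np.where(mask, a * np.exp(2j * np.pi * k * t / N), 0)

print(sorted(recovered))
```

    Each subtraction removes not just the peak but also the artifact pattern that the sparse sampling spreads across the spectrum, which is what lets weaker peaks surface in later iterations.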

  12. Modeling the Conducting Stably-Stratified Layer of the Earth's Core

    Science.gov (United States)

    Petitdemange, L.; Philidet, J.; Gissinger, C.

    2017-12-01

    Observations of the Earth's magnetic field as well as recent theoretical works tend to show that the Earth's outer liquid core is mostly comprised of a convective zone, in which the Earth's magnetic field is generated, likely by dynamo action, but also features a thin, stably stratified layer at the top of the core. We carry out direct numerical simulations by modeling this thin layer as an axisymmetric spherical Couette flow for a stably stratified fluid embedded in a dipolar magnetic field. The dynamo region is modeled by a conducting inner core rotating slightly faster than the insulating mantle due to magnetic torques acting on it, such that a weak differential rotation (low Rossby limit) can develop in the stably stratified layer. In the case of a non-stratified fluid, the combined action of the differential rotation and the magnetic field leads to the well known regime of 'super-rotation', in which the fluid rotates faster than the inner core. Whereas in the classical case this super-rotation is known to vanish in the magnetostrophic limit, we show here that the fluid stratification significantly extends the magnitude of the super-rotation, keeping this phenomenon relevant for the Earth's core. Finally, we study how the shear layers generated by this new state might give birth to magnetohydrodynamic instabilities or waves impacting the secular variations or jerks of the Earth's magnetic field.

  13. Women’s perspectives and experiences on screening for osteoporosis (Risk-stratified Osteoporosis Strategy Evaluation, ROSE)

    DEFF Research Database (Denmark)

    Rothmann, Mette Juel; Huniche, Lotte; Ammentorp, Jette

    2014-01-01

    This study aimed to investigate women's perspectives and experiences with screening for osteoporosis. Focus groups and individual interviews were conducted. Three main themes emerged: knowledge about osteoporosis, psychological aspects of screening, and moral duty. The women viewed the program in the context of their everyday life and life trajectories. Age, lifestyle, and knowledge about osteoporosis were important to how women ascribed meaning to the program. Generally, screening was accepted due to life experiences, self-perceived risk, and the preventive nature of screening. PURPOSE: The risk-stratified osteoporosis strategy evaluation (ROSE) study is a randomized prospective population-based trial investigating the efficacy of a screening program to prevent fractures in women aged 65...

  14. Background stratified Poisson regression analysis of cohort data.

    Science.gov (United States)

    Richardson, David B; Langholz, Bryan

    2012-03-01

    Background stratified Poisson regression is an approach that has been used in the analysis of data derived from a variety of epidemiologically important studies of radiation-exposed populations, including uranium miners, nuclear industry workers, and atomic bomb survivors. We describe a novel approach to fit Poisson regression models that adjust for a set of covariates through background stratification while directly estimating the radiation-disease association of primary interest. The approach makes use of an expression for the Poisson likelihood that treats the coefficients for stratum-specific indicator variables as 'nuisance' variables and avoids the need to explicitly estimate the coefficients for these stratum-specific parameters. Log-linear models, as well as other general relative rate models, are accommodated. This approach is illustrated using data from the Life Span Study of Japanese atomic bomb survivors and data from a study of underground uranium miners. The point estimate and confidence interval obtained from this 'conditional' regression approach are identical to the values obtained using unconditional Poisson regression with model terms for each background stratum. Moreover, it is shown that the proposed approach allows estimation of background stratified Poisson regression models of non-standard form, such as models that parameterize latency effects, as well as regression models in which the number of strata is large, thereby overcoming the limitations of previously available statistical software for fitting background stratified Poisson regression models.
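
    The background-stratified fit described here can be mimicked on a small synthetic example. The sketch below is not the authors' conditional estimator or software; it fits the equivalent unconditional model, with an explicit indicator coefficient per background stratum, by iteratively reweighted least squares. All data and parameter values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic cohort: 6 background strata with different baseline rates,
# plus a dose term whose true log rate ratio is 0.3.
n, n_strata = 3000, 6
stratum = rng.integers(0, n_strata, n)
dose = rng.uniform(0.0, 2.0, n)
beta_true = 0.3
log_base = np.linspace(-1.0, 0.5, n_strata)[stratum]
y = rng.poisson(np.exp(log_base + beta_true * dose))

# Design matrix: one indicator column per stratum plus the dose column.
X = np.zeros((n, n_strata + 1))
X[np.arange(n), stratum] = 1.0
X[:, -1] = dose

# Iteratively reweighted least squares for the Poisson log-linear model.
beta = np.zeros(n_strata + 1)
for _ in range(25):
    mu = np.exp(X @ beta)                 # fitted rates
    z = X @ beta + (y - mu) / mu          # working response
    beta = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))

print(beta[-1])                           # dose coefficient, near 0.3
```

    The conditional approach in the paper recovers the same dose coefficient without ever solving for the stratum intercepts, which is what makes it practical when the number of strata is very large.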

  15. Invited Review. Combustion instability in spray-guided stratified-charge engines. A review

    Energy Technology Data Exchange (ETDEWEB)

    Fansler, Todd D. [Univ. of Wisconsin, Madison, WI (United States); Reuss, D. L. [Univ. of Michigan, Ann Arbor, MI (United States); Sandia National Lab. (SNL-CA), Livermore, CA (United States); Sick, V. [Univ. of Michigan, Ann Arbor, MI (United States); Dahms, R. N. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2015-02-02

    Our article reviews systematic research on combustion instabilities (principally rare, random misfires and partial burns) in spray-guided stratified-charge (SGSC) engines operated at part load with highly stratified fuel-air-residual mixtures. Results from high-speed optical imaging diagnostics and numerical simulation provide a conceptual framework and quantify the sensitivity of ignition and flame propagation to strong, cyclically varying temporal and spatial gradients in the flow field and in the fuel-air-residual distribution. For SGSC engines using multi-hole injectors, spark stretching and locally rich ignition are beneficial. Moreover, combustion instability is dominated by convective flow fluctuations that impede motion of the spark or flame kernel toward the bulk of the fuel, coupled with low flame speeds due to locally lean mixtures surrounding the kernel. In SGSC engines using outwardly opening piezo-electric injectors, ignition and early flame growth are strongly influenced by the spray's characteristic recirculation vortex. For both injection systems, the spray and the intake/compression-generated flow field influence each other. Factors underlying the benefits of multi-pulse injection are identified. Finally, some unresolved questions include (1) the extent to which piezo-SGSC misfires are caused by failure to form a flame kernel rather than by flame-kernel extinction (as in multi-hole SGSC engines); (2) the relative contributions of partially premixed flame propagation and mixing-controlled combustion under the exceptionally late-injection conditions that permit SGSC operation on E85-like fuels with very low NOx and soot emissions; and (3) the effects of flow-field variability on later combustion, where fuel-air-residual mixing within the piston bowl becomes important.

  16. Nitrogen transformations in stratified aquatic microbial ecosystems

    DEFF Research Database (Denmark)

    Revsbech, N. P.; Risgaard-Petersen, N.; Schramm, A.

    2006-01-01

    New analytical methods such as advanced molecular techniques and microsensors have resulted in new insights about how nitrogen transformations in stratified microbial systems such as sediments and biofilms are regulated at a µm-mm scale. A large and ever-expanding knowledge base about n...

  17. Computer code ENDSAM for random sampling and validation of the resonance parameters covariance matrices of some major nuclear data libraries

    International Nuclear Information System (INIS)

    Plevnik, Lucijan; Žerovnik, Gašper

    2016-01-01

    Highlights: • Methods for random sampling of correlated parameters. • Link to open-source code for sampling of resonance parameters in ENDF-6 format. • Validation of the code on realistic and artificial data. • Validation of covariances in three major contemporary nuclear data libraries. - Abstract: Methods for random sampling of correlated parameters are presented. The methods are implemented for sampling of resonance parameters in ENDF-6 format and a link to the open-source code ENDSAM is given. The code has been validated on realistic data. Additionally, consistency of covariances of resonance parameters of three major contemporary nuclear data libraries (JEFF-3.2, ENDF/B-VII.1 and JENDL-4.0u2) has been checked.
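
    The core operation behind random sampling of correlated parameters, as performed by codes like ENDSAM, can be sketched with the standard Cholesky construction: draw independent standard normals and color them with the Cholesky factor of the covariance matrix. The mean vector and covariance matrix below are invented for illustration, not actual resonance parameters.

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative parameter means and covariance matrix (not nuclear data).
mean = np.array([2.0, -1.0, 0.5])
cov = np.array([[1.0, 0.6, 0.2],
                [0.6, 1.0, 0.4],
                [0.2, 0.4, 1.0]])

# cov = L @ L.T, so L maps independent N(0,1) draws to N(mean, cov).
L = np.linalg.cholesky(cov)
z = rng.standard_normal((3, 100_000))
samples = mean[:, None] + L @ z

# The empirical covariance of the samples reproduces the target matrix.
emp_cov = np.cov(samples)
print(np.round(emp_cov, 2))
```

    Checking that the sampled parameters reproduce the target covariance is also one way to validate the consistency of a library's covariance data, in the spirit of the validation described above.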

  18. [Causes of emergency dizziness stratified by etiology].

    Science.gov (United States)

    Qiao, Wenying; Liu, Jianguo; Zeng, Hong; Liu, Yugeng; Jia, Weihua; Wang, Honghong; Liu, Bo; Tan, Jing; Li, Changqing

    2014-06-03

    To explore the causes of emergency dizziness, stratified by etiology, in order to improve diagnostic efficiency. A total of 1,857 cases of dizziness seen at our emergency department were collected and their etiologies stratified by age and gender. The top three diagnoses were benign paroxysmal positional vertigo (BPPV, 31.7%), hypertension (24.0%) and posterior circulation ischemia (PCI, 20.5%). Stratified by age, the main causes of dizziness were BPPV (n = 6), migraine-associated vertigo (n = 2) and unknown cause (n = 1) within the vertigo group (14.5%), together with neurosis (7.3%), for 18-44 years; BPPV (36.8%), hypertension (22.4%) and migraine-associated vertigo (11.2%) for 45-59 years; hypertension (30.8%), PCI (29.8%) and BPPV (22.9%) for 60-74 years; and PCI (30.7%), hypertension (28.6%) and BPPV (25.5%) for 75-92 years. BPPV, migraine and neurosis were more common in females, while hypertension and PCI predominated in males (all P < 0.05). BPPV, PCI, hypertension, neurosis and migraine may be the main causes of dizziness. BPPV should be considered first when vertigo is triggered repeatedly by positional change, especially in young and middle-aged women; other common causes of dizziness include migraine-associated vertigo, neurosis and Meniere's disease. Hypertension should be screened for first in middle-aged and elderly patients presenting mainly with head heaviness and stretching. In elders with dizziness, BPPV is second in constituent ratio to PCI and hypertension. In middle-aged and elderly patients with dizziness, psychological factors should also be considered, and diagnosis and treatment should be offered in a timely manner.

  19. Random sampling of quantum states: a survey of methods and some issues regarding the Overparametrized Method

    International Nuclear Information System (INIS)

    Maziero, Jonas

    2015-01-01

    The numerical generation of random quantum states (RQS) is an important procedure for investigations in quantum information science. Here, we review some methods that may be used for performing that task. We start by presenting a simple procedure for generating random state vectors, for which the main tool is the random sampling of unbiased discrete probability distributions (DPD). Afterwards, the creation of random density matrices is addressed. In this context, we first present the standard method, which consists in using the spectral decomposition of a quantum state for getting RQS from random DPDs and random unitary matrices. In the sequence, the Bloch vector parametrization method is described. This approach, despite being useful in several instances, is not in general convenient for RQS generation. In the last part of the article, we regard the overparametrized method (OPM) and the related Ginibre and Bures techniques. The OPM can be used to create random positive semidefinite matrices with unit trace from randomly produced general complex matrices in a simple way that is friendly for numerical implementations. We consider a physically relevant issue related to the possible domains that may be used for the real and imaginary parts of the elements of such general complex matrices. Subsequently, we note an excessively fast concentration of measure in the quantum state space that appears in this parametrization. (author)
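
    The Ginibre technique mentioned above admits a very short sketch: a random density matrix is obtained by normalizing G G† for a general complex matrix G with independent Gaussian entries. A minimal illustration (the dimension is chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(5)

# Ginibre construction: rho = G G† / Tr(G G†) is Hermitian, positive
# semidefinite, and unit trace by construction.
d = 4                                     # Hilbert-space dimension
G = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
rho = G @ G.conj().T
rho = rho / np.trace(rho).real

print(np.trace(rho).real)                 # unit trace (up to rounding)
eigvals = np.linalg.eigvalsh(rho)
print(bool(np.all(eigvals >= -1e-12)))    # positive semidefinite
```

    This is what makes the overparametrized approach convenient numerically: positivity and normalization come for free, with no need to sample eigenvalues and unitaries separately.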

  20. Improving ambulatory saliva-sampling compliance in pregnant women: a randomized controlled study.

    Directory of Open Access Journals (Sweden)

    Julian Moeller

    Full Text Available OBJECTIVE: Noncompliance with scheduled ambulatory saliva sampling is common and has been associated with biased cortisol estimates in nonpregnant subjects. This study is the first to investigate in pregnant women strategies to improve ambulatory saliva-sampling compliance, and the association between sampling noncompliance and saliva cortisol estimates. METHODS: We instructed 64 pregnant women to collect eight scheduled saliva samples on two consecutive days each. Objective compliance with scheduled sampling times was assessed with a Medication Event Monitoring System and self-reported compliance with a paper-and-pencil diary. In a randomized controlled study, we estimated whether a disclosure intervention (informing women about objective compliance monitoring) and a reminder intervention (use of acoustical reminders) improved compliance. A mixed model analysis was used to estimate associations between women's objective compliance and their diurnal cortisol profiles, and between deviation from scheduled sampling and the cortisol concentration measured in the related sample. RESULTS: Self-reported compliance with the saliva-sampling protocol was 91%, and objective compliance was 70%. The disclosure intervention was associated with improved objective compliance (informed: 81%, noninformed: 60%; F(1,60) = 17.64, p<0.001), but not the reminder intervention (reminders: 68%, without reminders: 72%; F(1,60) = 0.78, p = 0.379). Furthermore, a woman's increased objective compliance was associated with a higher diurnal cortisol profile, F(2,64) = 8.22, p<0.001. Altered cortisol levels were observed in less objectively compliant samples, F(1,705) = 7.38, p = 0.007, with delayed sampling associated with lower cortisol levels. CONCLUSIONS: The results suggest that in pregnant women, objective noncompliance with scheduled ambulatory saliva sampling is common and is associated with biased cortisol estimates. To improve sampling compliance, results suggest

  1. Field-based random sampling without a sampling frame: control selection for a case-control study in rural Africa.

    Science.gov (United States)

    Crampin, A C; Mwinuka, V; Malema, S S; Glynn, J R; Fine, P E

    2001-01-01

    Selection bias, particularly of controls, is common in case-control studies and may materially affect the results. Methods of control selection should be tailored both for the risk factors and disease under investigation and for the population being studied. We present here a control selection method devised for a case-control study of tuberculosis in rural Africa (Karonga, northern Malawi) that selects an age/sex frequency-matched random sample of the population, with a geographical distribution in proportion to the population density. We also present an audit of the selection process, and discuss the potential of this method in other settings.
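
    One building block of such a design, selecting sampling locations with probability proportional to population density when no person-level sampling frame exists, can be sketched as follows. The area names and densities are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)

# Candidate areas and illustrative population densities (people per km^2).
areas = ["A", "B", "C", "D"]
density = np.array([10.0, 40.0, 30.0, 20.0])
p = density / density.sum()               # selection probabilities

# Draw a control location for each control, proportional to density.
picks = rng.choice(areas, size=50_000, p=p)
share_B = np.mean(picks == "B")
print(round(float(share_B), 2))           # close to 0.40
```

    In the field design described above, this density-proportional draw would be combined with age/sex frequency matching to the cases, so that controls reflect where (and who) the source population actually is.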

  2. Sampling high-altitude and stratified mating flights of red imported fire ant.

    Science.gov (United States)

    Fritz, Gary N; Fritz, Ann H; Vander Meer, Robert K

    2011-05-01

    With the exception of an airplane equipped with nets, no method has been developed that successfully samples red imported fire ant, Solenopsis invicta Buren, sexuals in mating/dispersal flights throughout their potential altitudinal trajectories. We developed and tested a method for sampling queens and males during mating flights at altitudinal intervals reaching as high as ~140 m. Our trapping system uses an electric winch and a 1.2-m spindle bolted to a swiveling platform. The winch dispenses up to 183 m of Kevlar-core nylon rope and the spindle stores 10 panels (0.9 by 4.6 m each) of nylon tulle impregnated with Tangle-Trap. The panels can be attached to the rope at various intervals and hoisted into the air by using a 3-m-diameter, helium-filled balloon. Raising or lowering all 10 panels takes approximately 15-20 min. This trap also should be useful for altitudinal sampling of other insects of medical importance.

  3. Molecular polymorphism of a cell surface proteoglycan: distinct structures on simple and stratified epithelia.

    Science.gov (United States)

    Sanderson, R D; Bernfield, M

    1988-12-01

    Epithelial cells are organized into either a single layer (simple epithelia) or multiple layers (stratified epithelia). Maintenance of these cellular organizations requires distinct adhesive mechanisms involving many cell surface molecules. One such molecule is a cell surface proteoglycan, named syndecan, that contains both heparan sulfate and chondroitin sulfate chains. This proteoglycan binds cells to fibrillar collagens and fibronectin and thus acts as a receptor for interstitial matrix. The proteoglycan is restricted to the basolateral surface of simple epithelial cells, but is located over the entire surface of stratified epithelial cells, even those surfaces not contacting matrix. We now show that the distinct localization in simple and stratified epithelia correlates with a distinct proteoglycan structure. The proteoglycan from simple epithelia (modal molecular size, 160 kDa) is larger than that from stratified epithelia (modal molecular size, 92 kDa), but their core proteins are identical in size and immunoreactivity. The proteoglycan from simple epithelia has more and larger heparan sulfate and chondroitin sulfate chains than the proteoglycan from stratified epithelia. Thus, the cell surface proteoglycan shows a tissue-specific structural polymorphism due to distinct posttranslational modifications. This polymorphism likely reflects distinct proteoglycan functions in simple and stratified epithelia, potentially meeting the different adhesive requirements of the cells in these different organizations.

  4. Methylmercury speciation in the dissolved phase of a stratified lake using the diffusive gradient in thin film technique

    Energy Technology Data Exchange (ETDEWEB)

    Clarisse, Olivier [Trent University, Department of Chemistry, 1600 West Bank Drive, Peterborough, Ontario K9J 7B8 (Canada)], E-mail: olivier.clarisse@umoncton.ca; Foucher, Delphine; Hintelmann, Holger [Trent University, Department of Chemistry, 1600 West Bank Drive, Peterborough, Ontario K9J 7B8 (Canada)

    2009-03-15

    The diffusive gradient in thin film (DGT) technique was successfully used to monitor methylmercury (MeHg) speciation in the dissolved phase of a stratified boreal lake, Lake 658 of the Experimental Lakes Area (ELA) in Ontario, Canada. Water samples were conventionally analysed for MeHg, sulfides, and dissolved organic matter (DOM). MeHg accumulated by DGT devices was compared to MeHg concentration measured conventionally in water samples to establish MeHg speciation. In the epilimnion, MeHg was almost entirely bound to DOM. In the top of the hypolimnion an additional labile fraction was identified, and at the bottom of the lake a significant fraction of MeHg was potentially associated to colloidal material. As part of the METAALICUS project, isotope enriched inorganic mercury was applied to Lake 658 and its watershed for several years to establish the relationship between atmospheric Hg deposition and Hg in fish. Little or no difference in MeHg speciation in the dissolved phase was detected between ambient and spike MeHg. - Methylmercury speciation was determined in the dissolved phase of a stratified lake using the diffusive gradient in thin film technique.

  6. Background stratified Poisson regression analysis of cohort data

    International Nuclear Information System (INIS)

    Richardson, David B.; Langholz, Bryan

    2012-01-01

    Background stratified Poisson regression is an approach that has been used in the analysis of data derived from a variety of epidemiologically important studies of radiation-exposed populations, including uranium miners, nuclear industry workers, and atomic bomb survivors. We describe a novel approach to fit Poisson regression models that adjust for a set of covariates through background stratification while directly estimating the radiation-disease association of primary interest. The approach makes use of an expression for the Poisson likelihood that treats the coefficients for stratum-specific indicator variables as 'nuisance' variables and avoids the need to explicitly estimate the coefficients for these stratum-specific parameters. Log-linear models, as well as other general relative rate models, are accommodated. This approach is illustrated using data from the Life Span Study of Japanese atomic bomb survivors and data from a study of underground uranium miners. The point estimate and confidence interval obtained from this 'conditional' regression approach are identical to the values obtained using unconditional Poisson regression with model terms for each background stratum. Moreover, it is shown that the proposed approach allows estimation of background stratified Poisson regression models of non-standard form, such as models that parameterize latency effects, as well as regression models in which the number of strata is large, thereby overcoming the limitations of previously available statistical software for fitting background stratified Poisson regression models. (orig.)
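The profiling trick the abstract describes can be sketched numerically: for a log-linear rate model with stratum-specific background rates, the stratum rates have closed-form MLEs given the dose coefficient, so they can be eliminated and only the dose coefficient estimated directly. A minimal simulation sketch (all variable names and parameter values are illustrative assumptions, not the study's data):

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
n, S = 2000, 50
stratum = rng.integers(0, S, n)          # background strata (e.g. age, sex, city)
dose = rng.gamma(2.0, 1.0, n)            # exposure of primary interest
pt = rng.uniform(1.0, 5.0, n)            # person-time at risk
alpha = rng.uniform(0.01, 0.05, S)       # stratum-specific background rates
beta_true = 0.3
y = rng.poisson(pt * alpha[stratum] * np.exp(beta_true * dose))

def neg_profile_loglik(beta):
    # lambda_i = alpha_s(i) * mu_i with mu_i = pt_i * exp(beta * dose_i);
    # given beta, the MLE of each alpha_s is Y_s / M_s.  Substituting it back
    # leaves (up to constants) sum_i y_i log mu_i - sum_s Y_s log M_s,
    # a likelihood in beta alone -- no stratum coefficients are fitted.
    mu = pt * np.exp(beta * dose)
    M = np.bincount(stratum, weights=mu, minlength=S)
    Y = np.bincount(stratum, weights=y.astype(float), minlength=S)
    return -(y @ np.log(mu) - Y @ np.log(M))

res = minimize_scalar(neg_profile_loglik, bounds=(-2.0, 2.0), method="bounded")
beta_hat = res.x   # recovers beta_true without estimating the 50 stratum intercepts
```

As the abstract notes, this "conditional" fit gives the same point estimate as unconditional Poisson regression with one indicator term per background stratum, while scaling to a large number of strata.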

  7. Classification of archaeologically stratified pumice by INAA

    International Nuclear Information System (INIS)

    Peltz, C.; Bichler, M.

    2001-01-01

    In the framework of the research program 'Synchronization of Civilization in the Eastern Mediterranean Region in the 2nd Millennium B.C.', instrumental neutron activation analysis (INAA) was used to determine 30 elements in pumice from archaeological excavations to reveal their specific volcanic origin. The widespread pumiceous products of several eruptions in the Aegean region were used as abrasive tools and were therefore popular trade objects. A remarkable quantity of pumice and pumiceous tephra (several km³) was produced by the 'Minoan eruption' of Thera (Santorini), which is assumed to have happened between 1450 and 1650 B.C. Thus the discovery of the primary fallout of 'Minoan' tephra in archaeologically stratified locations can be used as a relative time mark. Additionally, pumice lumps used as abrasives can serve for dating by first appearance. Essential to an identification of the primary volcanic source is the knowledge that pumices from the Aegean region can easily be distinguished by their trace element distribution patterns, as previous work has shown. The elements Al, Ba, Ca, Ce, Co, Cr, Cs, Dy, Eu, Fe, Hf, K, La, Lu, Mn, Na, Nd, Rb, Sb, Sc, Sm, Ta, Tb, Th, Ti, U, V, Yb, Zn and Zr were determined in 16 samples of pumice lumps from excavations in Tell-el-Dab'a and Tell-el-Herr (Egypt). Two irradiation cycles and five measurement runs were applied. A reliable identification of the samples is achieved by comparing these results to the database compiled in previous studies. (author)

  8. Theoretical study of evaporation heat transfer in horizontal microfin tubes: stratified flow model

    Energy Technology Data Exchange (ETDEWEB)

    Honda, H; Wang, Y S [Kyushu Univ., Inst. for Materials Chemistry and Engineering, Kasuga, Fukuoka (Japan)

    2004-08-01

    A stratified flow model of evaporation heat transfer in helically grooved, horizontal microfin tubes has been developed. The profile of the stratified liquid was determined by a theoretical model previously developed for condensation in horizontal microfin tubes. For the region above the stratified liquid, the meniscus profile in the groove between adjacent fins was determined by a force balance between the gravity and surface tension forces. The thin film evaporation model was applied to predict heat transfer in the thin film region of the meniscus. Heat transfer through the stratified liquid was estimated using an empirical correlation proposed by Mori et al. The theoretical predictions of the circumferentially averaged heat transfer coefficient were compared with available experimental data for four tubes and three refrigerants. Good agreement was obtained in the region Fr₀ < 2.5 as long as partial dryout of the tube surface did not occur. (Author)

  9. Improving Precision and Reducing Runtime of Microscopic Traffic Simulators through Stratified Sampling

    Directory of Open Access Journals (Sweden)

    Khewal Bhupendra Kesur

    2013-01-01

    This paper examines the application of Latin Hypercube Sampling (LHS) and Antithetic Variates (AV) to reduce the variance of estimated performance measures from microscopic traffic simulators. LHS and AV allow for a more representative coverage of input probability distributions through stratification, reducing the standard error of simulation outputs. Two methods of implementation are examined: one where stratification is applied to the headways and routing decisions of individual vehicles, and another where vehicle counts and entry times are more evenly sampled. The proposed methods have wider applicability in general queueing systems. LHS is found to outperform AV, and reductions of up to 71% in the standard error of estimates of traffic network performance relative to independent sampling are obtained. LHS allows for a reduction in the execution time of computationally expensive microscopic traffic simulators, as fewer simulations are required to achieve a fixed level of precision, with reductions of up to 84% in computing time noted on the test cases considered. The benefits of LHS are amplified for more congested networks and as the required level of precision increases.
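In one dimension, LHS reduces to stratified sampling of the input uniforms: exactly one draw from each of n equal-probability strata. A toy sketch of the variance reduction (the "simulator" here is a stand-in inverse-CDF transform, not a traffic model):

```python
import numpy as np

rng = np.random.default_rng(42)

def lhs_uniform(n, rng):
    # Latin hypercube sample on [0, 1): one draw per stratum of width 1/n,
    # in randomly permuted order.
    return (rng.permutation(n) + rng.random(n)) / n

def sim_output(u):
    # stand-in simulator: mean headway with Exp(1) inter-arrivals,
    # generated from uniforms by the inverse-CDF method
    return (-np.log1p(-u)).mean()

n, reps = 64, 500
std_indep = np.std([sim_output(rng.random(n)) for _ in range(reps)])
std_lhs = np.std([sim_output(lhs_uniform(n, rng)) for _ in range(reps)])
# std_lhs is markedly smaller: stratification removes most between-run variance
```

Because each input stratum is hit exactly once per run, run-to-run variability of the output mean collapses, which is the mechanism behind the standard-error reductions the paper reports.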

  10. Dual Spark Plugs For Stratified-Charge Rotary Engine

    Science.gov (United States)

    Abraham, John; Bracco, Frediano V.

    1996-01-01

    Fuel efficiency of a stratified-charge, rotary, internal-combustion engine is increased by an improved design featuring dual spark plugs. The second spark plug ignites fuel on the upstream side of the main fuel injector, enabling faster burning and more nearly complete utilization of fuel.

  11. The Correlation between Obsessive Compulsive Features and Dimensions of Pathological Eating Attitudes in Non-clinical Samples

    Directory of Open Access Journals (Sweden)

    Ali Mohammadzadeh

    2017-01-01

    Background and Objectives: Obsessive compulsive symptoms are prevalent at the clinical level in individuals with eating disorders. The purpose of this study was to investigate the correlation between obsessive compulsive features and pathological eating attitudes. Methods: This research is a correlational study. A sample of 790 university students was selected using the stratified random sampling method and investigated with the Obsessive Compulsive Inventory-Revised (OCI-R) and Eating Attitudes Test (EAT-26) questionnaires. Data were analyzed using multivariate regression analysis. Results: There was a correlation between obsessive-compulsive features and pathological eating attitudes (p<0.001, r=0.38); obsessive-compulsive features explained 15% of the variance in pathological eating attitudes (p<0.001, r²=0.15). Conclusion: The identified correlation is possibly related to components common to obsessive compulsive and eating disorders.

  12. Implementing content constraints in alpha-stratified adaptive testing using a shadow test approach

    NARCIS (Netherlands)

    van der Linden, Willem J.; Chang, Hua-Hua

    2001-01-01

    The methods of alpha-stratified adaptive testing and constrained adaptive testing with shadow tests are combined in this study. The advantages are twofold. First, application of the shadow test allows the researcher to implement any type of constraint on item selection in alpha-stratified adaptive testing.

  13. Stratified coastal ocean interactions with tropical cyclones

    Science.gov (United States)

    Glenn, S. M.; Miles, T. N.; Seroka, G. N.; Xu, Y.; Forney, R. K.; Yu, F.; Roarty, H.; Schofield, O.; Kohut, J.

    2016-01-01

    Hurricane-intensity forecast improvements currently lag the progress achieved for hurricane tracks. Integrated ocean observations and simulations during hurricane Irene (2011) reveal that the wind-forced two-layer circulation of the stratified coastal ocean, and resultant shear-induced mixing, led to significant and rapid ahead-of-eye-centre cooling (at least 6 °C and up to 11 °C) over a wide swath of the continental shelf. Atmospheric simulations establish this cooling as the missing contribution required to reproduce Irene's accelerated intensity reduction. Historical buoys from 1985 to 2015 show that ahead-of-eye-centre cooling occurred beneath all 11 tropical cyclones that traversed the Mid-Atlantic Bight continental shelf during stratified summer conditions. A Yellow Sea buoy similarly revealed significant and rapid ahead-of-eye-centre cooling during Typhoon Muifa (2011). These findings establish that including realistic coastal baroclinic processes in forecasts of storm intensity and impacts will be increasingly critical to mid-latitude population centres as sea levels rise and tropical cyclone maximum intensities migrate poleward. PMID:26953963

  14. Effect of optimum stratification on sampling with varying probabilities under proportional allocation

    Directory of Open Access Journals (Sweden)

    Syed Ejaz Husain Rizvi

    2007-10-01

    The problem of optimum stratification on an auxiliary variable, when the units from different strata are selected with probability proportional to the value of the auxiliary variable (PPSWR), was considered by Singh (1975) for the univariate case. In this paper we extend the same problem, for proportional allocation, to the case where two variates are under study. A cum. ∛R₃(x) rule for obtaining approximately optimum strata boundaries is provided. It is shown, theoretically as well as empirically, that stratification has an inverse effect on the relative efficiency of PPSWR compared with unstratified PPSWR when proportional allocation is used. Further comparison showed that, as the number of strata increases, stratified simple random sampling becomes as efficient as PPSWR.
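For context, PPSWR estimation can be sketched with the Hansen-Hurwitz estimator: draw with replacement with probabilities proportional to the auxiliary variable, and average the probability-weighted observations. The population below is simulated for illustration; its near-proportionality between study and auxiliary variables is an assumption, chosen because it is the regime where PPSWR shines:

```python
import numpy as np

rng = np.random.default_rng(1)
N, n = 1000, 50
x = rng.gamma(3.0, 10.0, N)              # auxiliary (size) variable
y = 3.0 * x + rng.normal(0.0, 1.0, N)    # study variable, correlated with x
p = x / x.sum()                          # selection probabilities prop. to x

idx = rng.choice(N, size=n, replace=True, p=p)   # PPSWR draw
t_hh = (y[idx] / p[idx]).mean()                  # Hansen-Hurwitz estimate of the total
# y_i / p_i is nearly constant when y is roughly proportional to x,
# so the estimator has very small variance for this population
```

The estimate `t_hh` should land very close to the true total `y.sum()`; with a weakly related auxiliary variable the same estimator would be far noisier, which is the trade-off the paper's efficiency comparison quantifies.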

  15. Tumour vasculature immaturity, oxidative damage and systemic inflammation stratify survival of colorectal cancer patients on bevacizumab treatment

    Science.gov (United States)

    Martin, Petra; Biniecka, Monika; Ó'Meachair, Shane; Maguire, Aoife; Tosetto, Miriam; Nolan, Blathnaid; Hyland, John; Sheahan, Kieran; O'Donoghue, Diarmuid; Mulcahy, Hugh; Fennelly, David; O'Sullivan, Jacintha

    2018-01-01

    Despite treatment of patients with metastatic colorectal cancer (mCRC) with bevacizumab plus chemotherapy, response rates are modest and there are no biomarkers available that will predict response. The aim of this study was to assess if markers associated with three interconnected cancer-associated biological processes, specifically angiogenesis, inflammation and oxidative damage, could stratify the survival outcome of this cohort. Levels of angiogenesis, inflammation and oxidative damage markers were assessed in pre-bevacizumab resected tumour and serum samples of mCRC patients by dual immunofluorescence, immunohistochemistry and ELISA. This study identified that specific markers of angiogenesis, inflammation and oxidative damage stratify survival of patients on this anti-angiogenic treatment. Biomarkers of immature tumour vasculature (% IMM, p=0.026, n=80), high levels of oxidative damage in the tumour epithelium (intensity of 8-oxo-dG in nuclear and cytoplasmic compartments, p=0.042 and 0.038 respectively, n=75) and lower systemic pro-inflammatory cytokines (IL6 and IL8, p=0.053 and 0.049 respectively, n=61) significantly stratified median overall survival (OS). In summary, screening for a panel of biomarkers for high levels of immature tumour vasculature, high levels of oxidative DNA damage and low levels of systemic pro-inflammatory cytokines may be beneficial in predicting enhanced survival outcome following bevacizumab treatment for mCRC. PMID:29535825

  16. Stratifying Parkinson's Patients With STN-DBS Into High-Frequency or 60 Hz-Frequency Modulation Using a Computational Model.

    Science.gov (United States)

    Khojandi, Anahita; Shylo, Oleg; Mannini, Lucia; Kopell, Brian H; Ramdhani, Ritesh A

    2017-07-01

    High frequency stimulation (HFS) of the subthalamic nucleus (STN) is a well-established therapy for Parkinson's disease (PD), particularly for the cardinal motor symptoms and levodopa-induced motor complications. Recent studies have suggested a possible role for 60 Hz stimulation in STN-deep brain stimulation (DBS) for patients with gait disorder. The objective of this study was to develop a computational model which stratifies patients a priori, based on symptomatology, into different frequency settings (i.e., high frequency or 60 Hz). We retrospectively analyzed preoperative MDS-Unified Parkinson's Disease Rating Scale III scores (32 indicators) collected from 20 PD patients implanted with STN-DBS at Mount Sinai Medical Center on either 60 Hz stimulation (ten patients) or HFS (130-185 Hz) (ten patients) for an average of 12 months. Predictive models using the Random Forest classification algorithm were built to associate patient/disease characteristics at surgery with the stimulation frequency. These models were evaluated objectively using a leave-one-out cross-validation approach. The computational models produced stratified patients into 60 Hz or HFS (130-185 Hz) with 95% accuracy. The best models relied on two or three predictors out of the 32 analyzed for classification. Across all predictors, gait and rest tremor of the right hand were consistently the most important. Computational models were developed using preoperative clinical indicators in PD patients treated with STN-DBS. These models were able to accurately stratify PD patients into 60 Hz stimulation or HFS (130-185 Hz) groups a priori, offering a unique potential to enhance the utilization of this therapy based on clinical subtypes. © 2017 International Neuromodulation Society.
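The evaluation pipeline the abstract describes (Random Forest classification scored by leave-one-out cross-validation, appropriate for only 20 patients) can be sketched with scikit-learn on synthetic stand-in data. The feature shifts and "informative" columns below are illustrative assumptions, not the study's data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
# Synthetic stand-in: 20 "patients" x 32 preoperative indicators.
X = rng.random((20, 32))
y = np.array([0] * 10 + [1] * 10)   # 0 = HFS (130-185 Hz), 1 = 60 Hz
# make two indicators informative, loosely mimicking gait and rest tremor
X[:, 0] += 1.0 * y
X[:, 1] += 1.0 * y

clf = RandomForestClassifier(n_estimators=200, random_state=0)
# LeaveOneOut: each of the 20 samples is held out once; the mean of the
# per-fold accuracies is the LOO accuracy reported for the model
loo_acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
```

With n = 20, leave-one-out is the natural choice: every model is trained on 19 patients and tested on the remaining one, so no data are wasted on a held-out split.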

  17. Revisiting random walk based sampling in networks: evasion of burn-in period and frequent regenerations.

    Science.gov (United States)

    Avrachenkov, Konstantin; Borkar, Vivek S; Kadavankandy, Arun; Sreedharan, Jithin K

    2018-01-01

    In the framework of network sampling, random walk (RW) based estimation techniques provide many pragmatic solutions while uncovering the unknown network as little as possible. Despite several theoretical advances in this area, RW based sampling techniques usually make a strong assumption that the samples are in the stationary regime, and hence are impelled to leave out the samples collected during the burn-in period. This work proposes two sampling schemes without a burn-in time constraint to estimate the average of an arbitrary function defined on the network nodes, for example, the average age of users in a social network. The central idea of the algorithms lies in exploiting regeneration of RWs at revisits to an aggregated super-node or to a set of nodes, and in strategies to enhance the frequency of such regenerations either by contracting the graph or by making the hitting set larger. Our first algorithm, which is based on reinforcement learning (RL), uses stochastic approximation to derive an estimator. This method can be seen as intermediate between purely stochastic Markov chain Monte Carlo iterations and deterministic relative value iterations. The second algorithm, which we call the Ratio with Tours (RT)-estimator, is a modified form of respondent-driven sampling (RDS) that accommodates the idea of regeneration. We study the methods via simulations on real networks. We observe that the trajectories of the RL-estimator are much more stable than those of standard random walk based estimation procedures, and its error performance is comparable to that of respondent-driven sampling (RDS), which has a smaller asymptotic variance than many other estimators. Simulation studies also show that the mean squared error of the RT-estimator decays much faster than that of RDS with time. The newly developed RW based estimators (RL- and RT-estimators) make it possible to avoid the burn-in period, provide better control of stability along the sample path, and overall reduce the estimation time.
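The regeneration idea behind tour-based estimation can be illustrated on a toy graph: the walk's accounting restarts at every return to an anchor node, so no burn-in is needed. The graph and the per-node "age" attribute below are made up for illustration (this is the generic tour estimator, not the paper's RL- or RT-specific machinery); the 1/deg reweighting corrects for the fact that a simple random walk visits node v with stationary probability proportional to deg(v):

```python
import random

random.seed(1)

# Toy undirected "social network" as adjacency lists, with a node attribute.
graph = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1, 3, 4],
         3: [0, 2, 4], 4: [2, 3, 5], 5: [4]}
age = {0: 25, 1: 31, 2: 40, 3: 22, 4: 58, 5: 19}

anchor, target_tours = 0, 20000
num, den, tours, v = 0.0, 0.0, 0, anchor
while tours < target_tours:
    num += age[v] / len(graph[v])   # f(v) / deg(v)
    den += 1.0 / len(graph[v])      # 1 / deg(v)
    v = random.choice(graph[v])     # one random-walk step
    if v == anchor:                 # regeneration: a tour is completed
        tours += 1

avg_age_est = num / den   # consistent for the plain average, sum(age)/6 = 32.5
```

Because each return to the anchor starts a fresh, independent tour, the tours behave like i.i.d. blocks, which is what lets such estimators skip the burn-in period entirely.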

  18. Study of MRI in stratified viscous plasma configuration

    Science.gov (United States)

    Carlevaro, Nakia; Montani, Giovanni; Renzi, Fabrizio

    2017-02-01

    We analyze the morphology of the magneto-rotational instability (MRI) for a stratified viscous plasma disk configuration in differential rotation, taking into account the so-called corotation theorem for the background profile. In order to select the intrinsic Alfvénic nature of MRI, we deal with an incompressible plasma and we adopt a formulation of the local perturbation analysis based on the use of the magnetic flux function as a dynamical variable. Our study outlines, as a consequence of the corotation condition, a marked asymmetry of the MRI with respect to the equatorial plane, particularly evident in a complete damping of the instability above a critical height over the equatorial plane. We also emphasize how such a feature is already present (although less pronounced) even in the ideal case, restoring a dependence of the MRI on the stratified morphology of the gravitational field.

  19. Replication Variance Estimation under Two-phase Sampling in the Presence of Non-response

    Directory of Open Access Journals (Sweden)

    Muqaddas Javed

    2014-09-01

    Kim and Yu (2011) discussed a replication variance estimator for two-phase stratified sampling. In this paper, estimators for the mean are proposed in two-phase stratified sampling for different situations of non-response at the first and second phases. The expressions for the variances of these estimators are derived. Furthermore, replication-based jackknife variance estimators of these variances are also derived. A simulation study was conducted to investigate the performance of the suggested estimators.
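The delete-1 jackknife underlying such replication variance estimators can be illustrated in the simplest single-phase case, where it reproduces the textbook variance of the sample mean exactly (toy data; the paper's two-phase, non-response setting adds weighting layers on top of this mechanism):

```python
import numpy as np

rng = np.random.default_rng(3)
y = rng.normal(50.0, 10.0, 200)   # toy single-phase sample
n = y.size

# Delete-1 jackknife: recompute the estimator n times, leaving out one
# unit each time, then scale the spread of the replicates.
reps = np.array([np.delete(y, i).mean() for i in range(n)])
v_jack = (n - 1) / n * ((reps - reps.mean()) ** 2).sum()

# For the sample mean this equals the textbook estimator s^2 / n exactly.
v_text = y.var(ddof=1) / n
```

The appeal of replication methods is that the same recipe (recompute, then scale the spread) extends to estimators, like two-phase ratio or non-response-adjusted means, whose analytic variance formulas are cumbersome.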

  20. At convenience and systematic random sampling: effects on the prognostic value of nuclear area assessments in breast cancer patients.

    Science.gov (United States)

    Jannink, I; Bennen, J N; Blaauw, J; van Diest, P J; Baak, J P

    1995-01-01

    This study compares the influence of two different nuclear sampling methods on the prognostic value of assessments of the mean and standard deviation of nuclear area (MNA, SDNA) in 191 consecutive invasive breast cancer patients with long term follow up. The first sampling method used was 'at convenience' sampling (ACS); the second was systematic random sampling (SRS). Both sampling methods were tested with a sample size of 50 nuclei (ACS-50 and SRS-50). To determine whether, besides the sampling method, sample size had an impact on prognostic value as well, the SRS method was also tested using a sample size of 100 nuclei (SRS-100). SDNA values were systematically lower for ACS, evidently because small and large nuclei were (unconsciously) not included. Testing the prognostic value of a series of cut-off points, MNA and SDNA values assessed by the SRS method were prognostically significantly stronger than the values obtained by the ACS method. This was confirmed in Cox regression analysis. For the MNA, the Mantel-Cox p-values from SRS-50 and SRS-100 measurements were not significantly different. However, for the SDNA, SRS-100 yielded significantly lower p-values than SRS-50. In conclusion, compared with the 'at convenience' nuclear sampling method, systematic random sampling of nuclei is not only superior with respect to reproducibility of results, but also provides better prognostic value in patients with invasive breast cancer.
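Systematic random sampling itself is simple to implement: pick a random start within the first sampling interval, then take every k-th position. A sketch (the list of "nuclei" is a stand-in; field selection in the cited microscopy setting is more involved):

```python
import random

random.seed(0)

def systematic_sample(items, n):
    """Systematic random sampling: random start, then every k-th position."""
    k = len(items) / n                      # sampling interval (may be fractional)
    start = random.random() * k             # uniform start within the first interval
    return [items[int(start + i * k)] for i in range(n)]

nuclei = list(range(500))                   # stand-in for an ordered list of nuclei
sample = systematic_sample(nuclei, 50)
```

Unlike "at convenience" selection, every position in the ordered list has the same inclusion probability n/N, which is why the method avoids the unconscious exclusion of extreme nuclei the study observed.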

  1. On Internal Waves in a Density-Stratified Estuary

    NARCIS (Netherlands)

    Kranenburg, C.

    1991-01-01

    In this article some field observations, made in recent years, of internal wave motions in a density-stratified estuary are presented. In order to facilitate the appreciation of the results, and to make some quantitative comparisons, the relevant theory is also summarized. Furthermore, the origins

  2. Prognosis research strategy (PROGRESS) 4: Stratified medicine research

    NARCIS (Netherlands)

    A. Hingorani (Aroon); D.A.W.M. van der Windt (Daniëlle); R.D. Riley (Richard); D. Abrams; K.G.M. Moons (Karel); E.W. Steyerberg (Ewout); S. Schroter (Sara); W. Sauerbrei (Willi); D.G. Altman (Douglas); H. Hemingway; A. Briggs (Andrew); N. Brunner; P. Croft (Peter); J. Hayden (Jill); P.A. Kyzas (Panayiotis); N. Malats (Núria); G. Peat; P. Perel (Pablo); I. Roberts (Ian); A. Timmis (Adam)

    2013-01-01

    In patients with a particular disease or health condition, stratified medicine seeks to identify those who will have the most clinical benefit or least harm from a specific treatment. In this article, the fourth in the PROGRESS series, the authors discuss why prognosis research should form

  3. Geohydrology and water quality of stratified-drift aquifers in the lower Merrimack and coastal river basins, southeastern New Hampshire

    Science.gov (United States)

    Stekl, Peter J.; Flanagan, Sarah M.

    1992-01-01

    -water-level measurements and collect ground-water-quality samples. Surface-water-discharge measurements were made at 16 sites during low flow, when the surface water is primarily ground-water discharge. These low-flow measurements indicate the quantities of ground water potentially available from aquifers. Hydraulic conductivities of aquifer materials were estimated from grain-size-distribution data from 61 samples of stratified drift. Transmissivity was estimated from well logs by assigning a hydraulic conductivity to specific well-log intervals, multiplying by the saturated thickness of the interval, and summing the results. Additional transmissivity values were obtained from an analysis of specific-capacity and aquifer-test data. Long-term aquifer yields and contributing areas to hypothetical supply wells were estimated by application of a method that is analogous to superposition and incorporates a ground-water-flow model developed by McDonald and Harbaugh (1988). This method was applied to the two aquifers judged to have the best potential for providing additional ground-water supplies. Samples of ground water from 26 test wells and 4 municipal wells were collected in March and August 1987 for analysis of common inorganic, organic, and volatile organic constituents. Methods for collecting and analyzing the samples are described by Fishman and Friedman (1989). The water-quality results from the well samples were used to characterize background water quality in the stratified-drift aquifers.

  4. Random selection of items. Selection of n1 samples among N items composing a stratum

    International Nuclear Information System (INIS)

    Jaech, J.L.; Lemaire, R.J.

    1987-02-01

    STR-224 provides generalized procedures to determine required sample sizes, for instance in the course of a Physical Inventory Verification at Bulk Handling Facilities. The present report describes procedures to generate random numbers and select groups of items to be verified in a given stratum through each of the measurement methods involved in the verification. (author). 3 refs
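A minimal sketch of the selection step the report describes, drawing n1 items out of an N-item stratum without replacement; the stdlib `random.sample` stands in for the report's own random-number-generation procedures:

```python
import random

random.seed(7)

def select_items(N, n1):
    """Select n1 of N items (labelled 1..N) for verification, SRS without replacement."""
    return sorted(random.sample(range(1, N + 1), n1))

# e.g. 12 items to verify out of a 200-item stratum (illustrative sizes)
chosen = select_items(N=200, n1=12)
```

In practice this draw would be repeated per stratum and per measurement method, with n1 coming from the sample-size formulas of STR-224.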

  5. Social network diversity and risks of ischemic heart disease and total mortality

    DEFF Research Database (Denmark)

    Barefoot, John C; Grønbaek, Morten; Jensen, Gorm

    2005-01-01

    Measures of various types of social contacts were used as predictors of ischemic heart disease events and total mortality in an age-stratified random sample of 9,573 adults enrolled in the Copenhagen City Heart Study (Copenhagen, Denmark). Baseline examinations were conducted in 1991-1994, and pa…

  6. The optimism trap: Migrants' educational choices in stratified education systems.

    Science.gov (United States)

    Tjaden, Jasper Dag; Hunkler, Christian

    2017-09-01

    Immigrant children's ambitious educational choices have often been linked to their families' high level of optimism and motivation for upward mobility. However, previous research has mostly neglected alternative explanations such as information asymmetries or anticipated discrimination. Moreover, immigrant children's higher dropout rates at the higher secondary and university level suggest that low performing migrant students could have benefitted more from pursuing less ambitious tracks, especially in countries that offer viable vocational alternatives. We examine ethnic minority students' educational choices using a sample of academically low performing, lower secondary school students in Germany's highly stratified education system. We find that their families' optimism diverts migrant students from viable vocational alternatives. Information asymmetries and anticipated discrimination do not explain their high educational ambitions. While our findings further support the immigrant optimism hypothesis, we discuss how its effect may have different implications depending on the education system. Copyright © 2017. Published by Elsevier Inc.

  7. Stratified Simulations of Collisionless Accretion Disks

    Energy Technology Data Exchange (ETDEWEB)

    Hirabayashi, Kota; Hoshino, Masahiro, E-mail: hirabayashi-k@eps.s.u-tokyo.ac.jp [Department of Earth and Planetary Science, The University of Tokyo, Tokyo, 113-0033 (Japan)

    2017-06-10

    This paper presents a series of stratified-shearing-box simulations of collisionless accretion disks in the recently developed framework of kinetic magnetohydrodynamics (MHD), which can handle finite non-gyrotropy of a pressure tensor. Although a fully kinetic simulation predicted a more efficient angular-momentum transport in collisionless disks than in the standard MHD regime, the enhanced transport has not been observed in past kinetic-MHD approaches to gyrotropic pressure anisotropy. For the purpose of investigating this missing link between the fully kinetic and MHD treatments, this paper explores the role of non-gyrotropic pressure and makes the first attempt to incorporate certain collisionless effects into disk-scale, stratified disk simulations. When the timescale of gyrotropization was longer than, or comparable to, the disk-rotation period, we found that the finite non-gyrotropy selectively remaining in the vicinity of current sheets contributes to suppressing magnetic reconnection in the shearing-box system. This leads to increases both in the saturated amplitude of the MHD turbulence driven by magnetorotational instabilities and in the resultant efficiency of angular-momentum transport. Our results seem to favor the fast advection of magnetic fields toward the rotation axis of a central object, which is required to launch an ultra-relativistic jet from a black hole accretion system in, for example, a magnetically arrested disk state.

  8. Stratification in Business and Agriculture Surveys with R

    Directory of Open Access Journals (Sweden)

    Marco Ballin

    2016-06-01

    Usually, sample surveys on enterprises and farms adopt a one-stage stratified sampling design. In practice, the sampling frame is divided into non-overlapping strata and simple random sampling is carried out independently in each stratum. Stratification allows for reduction of the sampling error and permits accurate estimates to be derived. Stratified sampling requires a number of closely related decisions: (i) how to stratify the population and how many strata to consider; (ii) the size of the whole sample and its partitioning among the strata (the so-called allocation). This paper deals mainly with problem (i) and shows how to tackle it in the R environment using packages already available on CRAN.
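Decision (ii), allocation, is commonly handled by Neyman allocation, which assigns n_h proportionally to N_h·S_h (stratum size times stratum standard deviation). A minimal sketch, shown here in Python rather than the paper's R; the stratum sizes and standard deviations are illustrative:

```python
def neyman_allocation(N_h, S_h, n):
    """Allocate total sample size n to strata proportionally to N_h * S_h.

    Note: simple rounding may miss n by a unit or two in general and would
    need a final adjustment step in production use.
    """
    w = [N * S for N, S in zip(N_h, S_h)]
    total = sum(w)
    return [round(n * wi / total) for wi in w]

# three strata: few large, highly variable units down to many small, stable ones
alloc = neyman_allocation(N_h=[100, 200, 700], S_h=[10.0, 5.0, 1.0], n=100)
```

Variable strata get sampled disproportionately heavily, which is exactly the mechanism by which stratification reduces the sampling error the abstract refers to.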

  9. Use of psychotherapy in a representative adult community sample in São Paulo, Brazil

    Science.gov (United States)

    Blay, Sergio L.; Fillenbaum, Gerda G.; da Silva, Paula Freitas R.; Peluso, Erica T.

    2014-01-01

    Little is known about the use of psychotherapy to treat common mental disorders in a major city in a middle-income country. Data come from in-home interviews with a stratified random sample of 2,000 community residents aged 18–65 in the city of São Paulo, Brazil. The information obtained included sociodemographic characteristics; psychotropic drug use; mental status; and lifetime, previous 12 months, and current use of psychotherapy. Logistic regression was used to examine determinants of use of psychotherapy. Of the sample, 22.7% met General Health Questionnaire-12 criteria for common mental disorders. Lifetime, previous 12 months, and current use of psychotherapy were reported by 14.6%, 4.6%, and 2.3% of the sample, respectively. Users were typically women, more educated, with higher income, not married, unemployed, and with common mental disorders. Further analysis found that 47% (with higher education and income) paid out-of-pocket, and 53% used psychotropic medication. Psychotherapy does not appear to be the preferred treatment for common mental disorders. PMID:25118139

  10. Dispersion of (light) inertial particles in stratified turbulence

    NARCIS (Netherlands)

    van Aartrijk, M.; Clercx, H.J.H.; Armenio, Vincenzo; Geurts, Bernardus J.; Fröhlich, Jochen

    2010-01-01

    We present a brief overview of a numerical study of the dispersion of particles in stably stratified turbulence. Three types of particles are examined: fluid particles, light inertial particles ($\rho_p/\rho_f = \mathcal{O}(1)$) and heavy inertial particles ($\rho_p/\rho_f \gg 1$). Stratification

  11. Randomized comparison of vaginal self-sampling by standard vs. dry swabs for Human papillomavirus testing

    International Nuclear Information System (INIS)

    Eperon, Isabelle; Vassilakos, Pierre; Navarria, Isabelle; Menoud, Pierre-Alain; Gauthier, Aude; Pache, Jean-Claude; Boulvain, Michel; Untiet, Sarah; Petignat, Patrick

    2013-01-01

    To evaluate if human papillomavirus (HPV) self-sampling (Self-HPV) using a dry vaginal swab is a valid alternative for HPV testing. Women attending colposcopy clinic were recruited to collect two consecutive Self-HPV samples: a Self-HPV using a dry swab (S-DRY) and a Self-HPV using a standard wet transport medium (S-WET). These samples were analyzed for HPV using real time PCR (Roche Cobas). Participants were randomized to determine the order of the tests. Questionnaires assessing preferences and acceptability for both tests were conducted. Subsequently, women were invited for colposcopic examination; a physician collected a cervical sample (physician-sampling) with a broom-type device and placed it into a liquid-based cytology medium. Specimens were then processed for the production of cytology slides and a Hybrid Capture HPV DNA test (Qiagen) was performed from the residual liquid. Biopsies were performed if indicated. Unweighted kappa statistics (κ) and McNemar tests were used to measure the agreement among the sampling methods. A total of 120 women were randomized. Overall HPV prevalence was 68.7% (95% Confidence Interval (CI) 59.3–77.2) by S-WET, 54.4% (95% CI 44.8–63.9) by S-DRY and 53.8% (95% CI 43.8–63.7) by HC. Among paired samples (S-WET and S-DRY), the overall agreement was good (85.7%; 95% CI 77.8–91.6) and the κ was substantial (0.70; 95% CI 0.57-0.70). The proportion of positive type-specific HPV agreement was also good (77.3%; 95% CI 68.2-84.9). No differences in sensitivity for cervical intraepithelial neoplasia grade one (CIN1) or worse between the two Self-HPV tests were observed. Women reported the two Self-HPV tests as highly acceptable. Self-HPV using dry swab transfer does not appear to compromise specimen integrity. Further study in a large screening population is needed. ClinicalTrials.gov: http://clinicaltrials.gov/show/NCT01316120
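
    The unweighted kappa statistic used above to measure agreement between paired sampling methods can be computed directly from the paired results. This is a generic sketch for paired binary outcomes, not the study's own analysis code:

```python
def unweighted_kappa(pairs):
    """Cohen's unweighted kappa for paired binary test results.

    `pairs` is a list of (test_a, test_b) outcomes, each 0 or 1.
    Kappa compares observed agreement with the agreement expected
    if the two tests were independent.
    """
    n = len(pairs)
    observed = sum(a == b for a, b in pairs) / n
    p_a = sum(a for a, _ in pairs) / n
    p_b = sum(b for _, b in pairs) / n
    # Expected chance agreement: both positive or both negative
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    return (observed - expected) / (1 - expected)
```

    For example, 80 concordant and 20 discordant pairs with balanced marginals give kappa = (0.8 - 0.5) / (1 - 0.5) = 0.6, in the "substantial" range by the usual benchmarks.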

  12. Internal circle uplifts, transversality and stratified G-structures

    Energy Technology Data Exchange (ETDEWEB)

    Babalic, Elena Mirela [Department of Theoretical Physics, National Institute of Physics and Nuclear Engineering,Str. Reactorului no.30, P.O.BOX MG-6, Postcode 077125, Bucharest-Magurele (Romania); Department of Physics, University of Craiova,13 Al. I. Cuza Str., Craiova 200585 (Romania); Lazaroiu, Calin Iuliu [Center for Geometry and Physics, Institute for Basic Science,Pohang 790-784 (Korea, Republic of)

    2015-11-24

    We study stratified G-structures in N=2 compactifications of M-theory on eight-manifolds M using the uplift to the auxiliary nine-manifold M̂=M×S^1. We show that the cosmooth generalized distribution D̂ on M̂ which arises in this formalism may have pointwise transverse or non-transverse intersection with the pull-back of the tangent bundle of M, a fact which is responsible for the subtle relation between the spinor stabilizers arising on M and M̂ and for the complicated stratified G-structure on M which we uncovered in previous work. We give a direct explanation of the latter in terms of the former and relate explicitly the defining forms of the SU(2) structure which exists on the generic locus U of M to the defining forms of the SU(3) structure which exists on an open subset Û of M̂, thus providing a dictionary between the eight- and nine-dimensional formalisms.

  13. Dyadic Green's function of an eccentrically stratified sphere.

    Science.gov (United States)

    Moneda, Angela P; Chrissoulidis, Dimitrios P

    2014-03-01

    The electric dyadic Green's function (dGf) of an eccentrically stratified sphere is built by use of the superposition principle, dyadic algebra, and the addition theorem of vector spherical harmonics. The end result of the analytical formulation is a set of linear equations for the unknown vector wave amplitudes of the dGf. The unknowns are calculated by truncation of the infinite sums and matrix inversion. The theory is exact, as no simplifying assumptions are required in any one of the analytical steps leading to the dGf, and it is general in the sense that any number, position, size, and electrical properties can be considered for the layers of the sphere. The point source can be placed outside of or in any lossless part of the sphere. Energy conservation, reciprocity, and other checks verify that the dGf is correct. A numerical application is made to a stratified sphere made of gold and glass, which operates as a lens.

  14. Determination of Initial Conditions for the Safety Analysis by Random Sampling of Operating Parameters

    International Nuclear Information System (INIS)

    Jeong, Hae-Yong; Park, Moon-Ghu

    2015-01-01

    In most existing evaluation methodologies, which follow a conservative approach, the most conservative initial conditions are searched for each transient scenario through extensive assessment over wide operating windows or the limiting conditions for operation (LCO) allowed by the operating guidelines. In this procedure, a user effect could be introduced, and considerable time and human resources are consumed. In the present study, we investigated a more effective statistical method for selecting the most conservative initial condition through random sampling of the operating parameters affecting the initial conditions. A method for determining initial conditions based on random sampling of plant design parameters is proposed. This method is expected to be applied to the selection of the most conservative initial plant conditions in safety analysis using a conservative evaluation methodology. In the method, it is suggested that the initial conditions of reactor coolant flow rate, pressurizer level, pressurizer pressure, and SG level are adjusted by controlling the pump rated flow and the setpoints of the PLCS, PPCS, and FWCS, respectively. The proposed technique is expected to help eliminate the human factors introduced in the conventional safety analysis procedure and to reduce the human resources invested in the safety evaluation of nuclear power plants.
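
    The sampling step can be sketched as drawing candidate initial-condition sets uniformly from the operating windows, each candidate then being screened by the safety analysis code. The parameter names and ranges below are purely illustrative assumptions, not actual plant LCO limits:

```python
import random

# Hypothetical operating windows: parameter -> (low, high).
# These names and ranges are invented for illustration only.
windows = {
    "coolant_flow": (0.95, 1.05),          # fraction of rated flow
    "pressurizer_level": (0.50, 0.60),     # fraction of span
    "pressurizer_pressure": (15.2, 15.8),  # MPa
    "sg_level": (0.44, 0.56),              # fraction of span
}

def sample_initial_conditions(n, seed=0):
    """Draw n candidate initial-condition sets, each parameter sampled
    uniformly and independently within its operating window."""
    rng = random.Random(seed)
    return [{p: rng.uniform(lo, hi) for p, (lo, hi) in windows.items()}
            for _ in range(n)]

candidates = sample_initial_conditions(100)
```

    Each candidate set would then be fed to the transient analysis, and the case yielding the most limiting result retained, replacing the manual search over operating windows.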

  15. A Combined Weighting Method Based on Hybrid of Interval Evidence Fusion and Random Sampling

    Directory of Open Access Journals (Sweden)

    Ying Yan

    2017-01-01

    Full Text Available Due to the complexity of systems and lack of expertise, epistemic uncertainties may be present in the experts' judgments on the importance of certain indices during group decision-making. A novel combination weighting method is proposed to solve the index weighting problem when various uncertainties are present in expert comments. Based on the ideas of evidence theory, various types of uncertain evaluation information are uniformly expressed through interval evidence structures. A similarity matrix between interval evidences is constructed, and the experts' information is fused. Comment grades are quantified using interval numbers, and a cumulative probability function for evaluating the importance of indices is constructed based on the fused information. Finally, index weights are obtained by Monte Carlo random sampling. The method can process expert information with varying degrees of uncertainty, which gives it good compatibility. It avoids the difficulty of effectively fusing high-conflict group decision-making information and the large information loss after fusion. Original expert judgments are retained rather objectively throughout the processing procedure. The cumulative probability function construction and random sampling processes do not require any human intervention or judgment. The method can be implemented easily by computer programs, and thus has an apparent advantage in evaluation practices for fairly large index systems.
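
    The final step, obtaining weights by Monte Carlo sampling from a cumulative probability function, can be sketched with standard inverse-transform sampling. The tabulated CDF below is a made-up example, not data from the paper:

```python
import bisect
import random

def sample_from_cdf(values, cdf, n, seed=0):
    """Inverse-transform Monte Carlo sampling from a tabulated CDF.

    `values` are candidate weight levels and `cdf[i]` is the cumulative
    probability of drawing a value <= values[i]; cdf[-1] must be 1.0.
    """
    rng = random.Random(seed)
    # Map each uniform draw u in [0, 1) to the first level whose
    # cumulative probability reaches u.
    return [values[bisect.bisect_left(cdf, rng.random())] for _ in range(n)]

# Hypothetical fused distribution over weight levels for one index
draws = sample_from_cdf([0.2, 0.4, 0.6, 0.8], [0.10, 0.45, 0.85, 1.00], 1000)
weight = sum(draws) / len(draws)  # Monte Carlo estimate of the index weight
```

    Averaging the draws gives a point weight for the index; repeating the procedure per index and normalizing yields the full weight vector.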

  16. Analysis of photonic band-gap structures in stratified medium

    DEFF Research Database (Denmark)

    Tong, Ming-Sze; Yinchao, Chen; Lu, Yilong

    2005-01-01

    Purpose - To demonstrate the flexibility and advantages of a non-uniform pseudo-spectral time domain (nu-PSTD) method through studies of the wave propagation characteristics of photonic band-gap (PBG) structures in stratified medium. Design/methodology/approach - A nu-PSTD method is proposed ... in solving the Maxwell's equations numerically. It expands the temporal derivatives using finite differences, while it adopts the Fourier transform (FT) properties to expand the spatial derivatives in Maxwell's equations. In addition, the method makes use of the chain-rule property in calculus together ... in electromagnetic and microwave applications once the Maxwell's equations are appropriately modeled. Originality/value - The method validates its values and properties through extensive studies on regular and defective 1D PBG structures in stratified medium, and it can be further extended to solving more ...

  17. The Values of Combined and Sub-Stratified Imaging Scores with Ultrasonography and Mammography in Breast Cancer Subtypes.

    Directory of Open Access Journals (Sweden)

    Tsun-Hou Chang

    Full Text Available The Breast Imaging Reporting and Data System (BI-RADS) categories of Mammography (MG) and Ultrasonography (US) were mapped to an equivalent "5-point score" and applied for combined and sub-stratified imaging assessments. This study evaluated the value of combined and sub-stratified imaging assessments with MG and US across breast cancer subtypes (BCS). Medical records of 5,037 cases having imaging-guided core biopsy, performed from 2009 to 2012, were retrospectively reviewed. This study selected 1,995 cases (1,457 benign and 538 invasive cancers) having both MG and US before biopsy. These cases were categorized with the "5-point score" for their MG and US, and applied for combined and sub-stratified imaging assessments. Invasive cancers were classified on the basis of BCS and correlated with combined and sub-stratified imaging assessments. These selected cases were evaluated by the "5-point score." MG, US, and combined and sub-stratified imaging assessments all revealed statistically significant (P < 0.001) incidence of malignancy. Sensitivity was increased in the combined imaging score (99.8%), and specificity was increased in the sub-stratified combined score (75.4%). In the sub-stratified combined imaging assessment, all BCS could be classified with higher scores (abnormality hierarchy), and the luminal B subtype showed the most salient result (hierarchy: higher, 95%; lower, 5%). Combined and sub-stratified imaging assessments can increase the sensitivity and specificity of breast cancer diagnosis, respectively, and the luminal B subtype shows the best identification by sub-stratified combined imaging scoring.

  18. Adjunctive Mitomycin C or Amniotic Membrane Transplantation for Ahmed Glaucoma Valve Implantation: A Randomized Clinical Trial.

    Science.gov (United States)

    Yazdani, Shahin; Mahboobipour, Hassan; Pakravan, Mohammad; Doozandeh, Azadeh; Ghahari, Elham

    2016-05-01

    To determine whether adjunctive mitomycin C (MMC) or amniotic membrane transplantation (AMT) improve the outcomes of Ahmed glaucoma valve (AGV) implantation. This double-blind, stratified, 3-armed randomized clinical trial includes 75 eyes of 75 patients aged 7 to 75 years with refractory glaucoma. Eligible subjects underwent stratified block randomization; eyes were first stratified to surgery in the superior or inferior quadrants based on feasibility; in each subgroup, eyes were randomly assigned to the study arms using random blocks: conventional AGV implantation (group A, 25 eyes), AGV with MMC (group B, 25 eyes), and AGV with AMT (group C, 25 eyes). The 3 study groups were comparable regarding baseline characteristics and mean follow-up (P=0.288). A total of 68 patients including 23 eyes in group A, 25 eyes in group B, and 20 eyes in group C completed the follow-up period and were analyzed. Intraocular pressure was lower in the MMC group only 3 weeks postoperatively (P=0.04) but comparable at other time intervals. Overall success rate was comparable in the 3 groups at 12 months (P=0.217). The number of eyes requiring medications (P=0.30), time to initiation of medications (P=0.13), and number of medications (P=0.22) were comparable. Hypertensive phase was slightly but insignificantly more common with standard surgery (82%) as compared with MMC-augmented (60%) and AMT-augmented (70%) procedures (P=0.23). Complications were comparable over 1 year (P=0.28). Although adjunctive MMC and AMT were safe during AGV implantation, they did not influence success rates or intraocular pressure outcomes. Complications, including hypertensive phase, were also comparable.

  19. Sampling and analysis validates acceptable knowledge on LANL transuranic, heterogeneous, debris waste, or "Cutting the Gordian knot that binds WIPP"

    International Nuclear Information System (INIS)

    Kosiewicz, S.T.; Triay, I.R.; Souza, L.A.

    1999-01-01

    Through sampling and toxicity characteristic leaching procedure (TCLP) analyses, LANL and the DOE validated that a LANL transuranic (TRU) waste (TA-55-43, Lot No. 01) was not a Resource Conservation and Recovery Act (RCRA) hazardous waste. This paper describes the sampling and analysis project as well as the statistical assessment of the analytical results. The analyses were conducted according to the requirements and procedures in the sampling and analysis plan approved by the New Mexico Environment Department. The plan used a statistical approach that was consistent with the stratified, random sampling requirements of SW-846. LANL adhered to the plan during sampling and chemical analysis of randomly selected items of the five major types of materials in this heterogeneous, radioactive, debris waste. To generate portions of the plan, LANL analyzed a number of non-radioactive items that were representative of the mix of items present in the waste stream. Data from these cold surrogates were used to generate the means and variances needed to optimize the design. Based on statistical arguments alone, only two samples from the entire waste stream were deemed necessary; however, a decision was made to analyze at least two samples of each of the five major waste types. To obtain these samples, nine TRU waste drums were opened. Sixty-six radioactively contaminated and four non-radioactive grab samples were collected. Portions of the samples were composited for chemical analyses. In addition, a radioactively contaminated sample of rust-colored powder of interest to the New Mexico Environment Department (NMED) was collected and qualitatively identified as rust.

  20. Self-reference and random sampling approach for label-free identification of DNA composition using plasmonic nanomaterials.

    Science.gov (United States)

    Freeman, Lindsay M; Pang, Lin; Fainman, Yeshaiahu

    2018-05-09

    The analysis of DNA has led to revolutionary advancements in the fields of medical diagnostics, genomics, prenatal screening, and forensic science, with the global DNA testing market expected to reach revenues of USD 10.04 billion per year by 2020. However, the current methods for DNA analysis remain dependent on the necessity for fluorophores or conjugated proteins, leading to high costs associated with consumable materials and manual labor. Here, we demonstrate a potential label-free DNA composition detection method using surface-enhanced Raman spectroscopy (SERS) in which we identify the composition of cytosine and adenine within single strands of DNA. This approach depends on the fact that there is one phosphate backbone per nucleotide, which we use as a reference to compensate for systematic measurement variations. We utilize plasmonic nanomaterials with random Raman sampling to perform label-free detection of the nucleotide composition within DNA strands, generating a calibration curve from standard samples of DNA and demonstrating the capability of resolving the nucleotide composition. The work represents an innovative way for detection of the DNA composition within DNA strands without the necessity of attached labels, offering a highly sensitive and reproducible method that factors in random sampling to minimize error.

  1. Comparing cluster-level dynamic treatment regimens using sequential, multiple assignment, randomized trials: Regression estimation and sample size considerations.

    Science.gov (United States)

    NeCamp, Timothy; Kilbourne, Amy; Almirall, Daniel

    2017-08-01

    Cluster-level dynamic treatment regimens can be used to guide sequential treatment decision-making at the cluster level in order to improve outcomes at the individual or patient-level. In a cluster-level dynamic treatment regimen, the treatment is potentially adapted and re-adapted over time based on changes in the cluster that could be impacted by prior intervention, including aggregate measures of the individuals or patients that compose it. Cluster-randomized sequential multiple assignment randomized trials can be used to answer multiple open questions preventing scientists from developing high-quality cluster-level dynamic treatment regimens. In a cluster-randomized sequential multiple assignment randomized trial, sequential randomizations occur at the cluster level and outcomes are observed at the individual level. This manuscript makes two contributions to the design and analysis of cluster-randomized sequential multiple assignment randomized trials. First, a weighted least squares regression approach is proposed for comparing the mean of a patient-level outcome between the cluster-level dynamic treatment regimens embedded in a sequential multiple assignment randomized trial. The regression approach facilitates the use of baseline covariates which is often critical in the analysis of cluster-level trials. Second, sample size calculators are derived for two common cluster-randomized sequential multiple assignment randomized trial designs for use when the primary aim is a between-dynamic treatment regimen comparison of the mean of a continuous patient-level outcome. The methods are motivated by the Adaptive Implementation of Effective Programs Trial which is, to our knowledge, the first-ever cluster-randomized sequential multiple assignment randomized trial in psychiatry.

  2. Representativeness-based sampling network design for the State of Alaska

    Science.gov (United States)

    Forrest M. Hoffman; Jitendra Kumar; Richard T. Mills; William W. Hargrove

    2013-01-01

    Resource and logistical constraints limit the frequency and extent of environmental observations, particularly in the Arctic, necessitating the development of a systematic sampling strategy to maximize coverage and objectively represent environmental variability at desired scales. A quantitative methodology for stratifying sampling domains, informing site selection,...

  3. General Practitioners' and patients' perceptions towards stratified care: a theory informed investigation.

    Science.gov (United States)

    Saunders, Benjamin; Bartlam, Bernadette; Foster, Nadine E; Hill, Jonathan C; Cooper, Vince; Protheroe, Joanne

    2016-08-31

    Stratified primary care involves changing General Practitioners' (GPs) clinical behaviour in treating patients, away from the current stepped care approach to instead identifying early treatment options that are matched to patients' risk of persistent disabling pain. This article explores the perspectives of UK-based GPs and patients about a prognostic stratified care model being developed for patients with the five most common primary care musculoskeletal pain presentations. The focus was on views about acceptability, and anticipated barriers and facilitators to the use of stratified care in routine practice. Four focus groups and six semi-structured telephone interviews were conducted with GPs (n = 23), and three focus groups with patients (n = 20). Data were analysed thematically; and identified themes examined in relation to the Theoretical Domains Framework (TDF), which facilitates comprehensive identification of behaviour change determinants. A critical approach was taken in using the TDF, examining the nuanced interrelationships between theoretical domains. Four key themes were identified: Acceptability of clinical decision-making guided by stratified care; impact on the therapeutic relationship; embedding a prognostic approach within a biomedical model; and practical issues in using stratified care. Whilst within each theme specific findings are reported, common across themes was the identified relationships between the theoretical domains of knowledge, skills, professional role and identity, environmental context and resources, and goals. Through analysis of these identified relationships it was found that, for GPs and patients to perceive stratified care as being acceptable, it must be seen to enhance GPs' knowledge and skills, not undermine GPs' and patients' respective identities and be integrated within the environmental context of the consultation with minimal disruption. Findings highlight the importance of taking into account the context of

  4. Gray bootstrap method for estimating frequency-varying random vibration signals with small samples

    Directory of Open Access Journals (Sweden)

    Wang Yanqing

    2014-04-01

    Full Text Available During environmental testing, the estimation of random vibration signals (RVS) is an important technique for airborne platform safety and reliability. However, the available methods, including the extreme value envelope method (EVEM), the statistical tolerances method (STM) and the improved statistical tolerance method (ISTM), require large samples and a typical probability distribution. Moreover, the frequency-varying characteristic of RVS is usually not taken into account. The gray bootstrap method (GBM) is proposed to solve the problem of estimating frequency-varying RVS with small samples. Firstly, the estimation indexes are obtained, including the estimated interval, the estimated uncertainty, the estimated value, the estimated error and the estimated reliability. In addition, GBM is applied to estimating data from a single flight test of a certain aircraft. At last, in order to evaluate the estimation performance, GBM is compared with the bootstrap method (BM) and the gray method (GM) in testing analysis. The result shows that GBM is superior for estimating dynamic signals with small samples, and the estimated reliability is proved to be 100% at the given confidence level.
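
    For reference, the plain bootstrap method (BM) that GBM is compared against can be sketched as a percentile interval estimated from resamples of a small sample; the data values below are invented for illustration:

```python
import random

def bootstrap_interval(data, stat, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap interval for a statistic of a small sample.

    Resamples `data` with replacement n_boot times, computes `stat` on
    each resample, and returns the (alpha/2, 1 - alpha/2) percentiles.
    """
    rng = random.Random(seed)
    stats = sorted(
        stat([rng.choice(data) for _ in range(len(data))])
        for _ in range(n_boot)
    )
    lo = stats[int((alpha / 2) * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

mean = lambda xs: sum(xs) / len(xs)
# Five invented vibration-level measurements (small sample)
lo, hi = bootstrap_interval([9.8, 10.1, 10.3, 9.9, 10.0], mean)
```

    The gray variant in the abstract additionally models the frequency-varying trend before resampling; the plain version above makes no such adjustment.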

  5. Ethanol dehydration to ethylene in a stratified autothermal millisecond reactor.

    Science.gov (United States)

    Skinner, Michael J; Michor, Edward L; Fan, Wei; Tsapatsis, Michael; Bhan, Aditya; Schmidt, Lanny D

    2011-08-22

    The concurrent decomposition and deoxygenation of ethanol was accomplished in a stratified reactor with 50-80 ms contact times. The stratified reactor comprised an upstream oxidation zone that contained Pt-coated Al(2)O(3) beads and a downstream dehydration zone consisting of H-ZSM-5 zeolite films deposited on Al(2)O(3) monoliths. Ethanol conversion, product selectivity, and reactor temperature profiles were measured for a range of fuel:oxygen ratios for two autothermal reactor configurations using two different sacrificial fuel mixtures: a parallel hydrogen-ethanol feed system and a series methane-ethanol feed system. Increasing the amount of oxygen relative to the fuel resulted in a monotonic increase in ethanol conversion in both reaction zones. The majority of the converted carbon was in the form of ethylene, where the ethanol carbon-carbon bonds stayed intact while the oxygen was removed. Over 90% yield of ethylene was achieved by using methane as a sacrificial fuel. These results demonstrate that noble metals can be successfully paired with zeolites to create a stratified autothermal reactor capable of removing oxygen from biomass model compounds in a compact, continuous flow system that can be configured to have multiple feed inputs, depending on process restrictions. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Analysing stratified medicine business models and value systems: innovation-regulation interactions.

    Science.gov (United States)

    Mittra, James; Tait, Joyce

    2012-09-15

    Stratified medicine offers both opportunities and challenges to the conventional business models that drive pharmaceutical R&D. Given the increasingly unsustainable blockbuster model of drug development, due in part to maturing product pipelines, alongside increasing demands from regulators, healthcare providers and patients for higher standards of safety, efficacy and cost-effectiveness of new therapies, stratified medicine promises a range of benefits to pharmaceutical and diagnostic firms as well as healthcare providers and patients. However, the transition from 'blockbusters' to what might now be termed 'niche-busters' will require the adoption of new, innovative business models, the identification of different and perhaps novel types of value along the R&D pathway, and a smarter approach to regulation to facilitate innovation in this area. In this paper we apply the Innogen Centre's interdisciplinary ALSIS methodology, which we have developed for the analysis of life science innovation systems in contexts where the value creation process is lengthy, expensive and highly uncertain, to this emerging field of stratified medicine. In doing so, we consider the complex collaboration, timing, coordination and regulatory interactions that shape business models, value chains and value systems relevant to stratified medicine. More specifically, we explore in some depth two convergence models for co-development of a therapy and diagnostic before market authorisation, highlighting the regulatory requirements and policy initiatives within the broader value system environment that have a key role in determining the probable success and sustainability of these models. Copyright © 2012 Elsevier B.V. All rights reserved.

  7. Magnitude and determinants of physical inactivity in Ethiopia ...

    African Journals Online (AJOL)

    user

    A mix of sampling approaches, namely stratified, three-stage cluster sampling, simple random sampling and Kish ... provide a concrete picture of the state of NCDs in Ethiopia. .... Measurement and Operational definitions. Assessing physical ...

  8. Inviscid incompressible limits of strongly stratified fluids

    Czech Academy of Sciences Publication Activity Database

    Feireisl, Eduard; Jin, B.J.; Novotný, A.

    2014-01-01

    Roč. 89, 3-4 (2014), s. 307-329 ISSN 0921-7134 R&D Projects: GA ČR GA201/09/0917 Institutional support: RVO:67985840 Keywords : compressible Navier-Stokes system * anelastic approximation * stratified fluid Subject RIV: BA - General Mathematics Impact factor: 0.528, year: 2014 http://iospress.metapress.com/content/d71255745tl50125/?p=969b60ae82634854ab8bd25505ce1f71&pi=3

  9. Acute stress symptoms during the second Lebanon war in a random sample of Israeli citizens.

    Science.gov (United States)

    Cohen, Miri; Yahav, Rivka

    2008-02-01

    The aims of this study were to assess prevalence of acute stress disorder (ASD) and acute stress symptoms (ASS) in Israel during the second Lebanon war. A telephone survey was conducted in July 2006 of a random sample of 235 residents of northern Israel, who were subjected to missile attacks, and of central Israel, who were not subjected to missile attacks. Results indicate that ASS scores were higher in the northern respondents; 6.8% of the northern sample and 3.9% of the central sample met ASD criteria. Appearance of each symptom ranged from 15.4% for dissociative to 88.4% for reexperiencing, with significant differences between northern and central respondents only for reexperiencing and arousal. A low ASD rate and a moderate difference between areas subjected and not subjected to attack were found.

  10. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge

    OpenAIRE

    Catarino, Rosa; Vassilakos, Pierre; Bilancioni, Aline; Vanden Eynde, Mathieu; Meyer-Hamme, Ulrike; Menoud, Pierre-Alain; Guerry, Frédéric; Petignat, Patrick

    2015-01-01

    Background Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium. Dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and swab. Methods A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed f...

  11. Random Sampling with Interspike-Intervals of the Exponential Integrate and Fire Neuron: A Computational Interpretation of UP-States.

    Directory of Open Access Journals (Sweden)

    Andreas Steimer

    Full Text Available Oscillations between high and low values of the membrane potential (UP and DOWN states, respectively) are a ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, only a surprisingly small number of quantitative studies have been conducted that deal with this phenomenon's implications for computation. Here we present a novel theory that explains on a detailed mathematical level the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate and fire (EIF) model neuron, such that each spike is considered a sample whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, which kicks in when balancing a noisy membrane potential around values close to the firing threshold, leads to a particularly simple, approximative relationship between the neuron's ISI distribution and input current. Approximation quality depends on the frequency spectrum of the current and is improved upon increasing the voltage baseline towards threshold. Thus, the conceptually simpler leaky integrate and fire neuron, which is missing such an additional current boost, performs consistently worse than the EIF and does not improve when the voltage baseline is increased. For the EIF, in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP-states. Our theoretical results are confirmed by accompanying simulations, which were conducted for input currents of varying spectral composition. Moreover, we provide analytical estimations of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be considered by any algorithmic procedure that is based on random sampling, such as Markov Chain Monte Carlo or message-passing methods. Finally, we explain how spike-based random sampling relates to existing

  12. Random Sampling with Interspike-Intervals of the Exponential Integrate and Fire Neuron: A Computational Interpretation of UP-States.

    Science.gov (United States)

    Steimer, Andreas; Schindler, Kaspar

    2015-01-01

    Oscillations between high and low values of the membrane potential (UP and DOWN states, respectively) are a ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, only a surprisingly small number of quantitative studies have been conducted that deal with this phenomenon's implications for computation. Here we present a novel theory that explains on a detailed mathematical level the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate and fire (EIF) model neuron, such that each spike is considered a sample whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, which kicks in when balancing a noisy membrane potential around values close to the firing threshold, leads to a particularly simple, approximative relationship between the neuron's ISI distribution and input current. Approximation quality depends on the frequency spectrum of the current and is improved upon increasing the voltage baseline towards threshold. Thus, the conceptually simpler leaky integrate and fire neuron, which is missing such an additional current boost, performs consistently worse than the EIF and does not improve when the voltage baseline is increased. For the EIF, in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP-states. Our theoretical results are confirmed by accompanying simulations, which were conducted for input currents of varying spectral composition. Moreover, we provide analytical estimations of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be considered by any algorithmic procedure that is based on random sampling, such as Markov Chain Monte Carlo or message-passing methods. Finally, we explain how spike-based random sampling relates to existing computational

  13. Analysis Of Career Aspirations Of Agricultural Science Graduates ...

    African Journals Online (AJOL)

    The objective of this study was to identify the career aspirations of agricultural science graduates from Nigerian Universities of Agriculture. A random sample of 215 graduating students of agriculture was selected using a stratified random sampling method. Data were collected with the aid of a structured questionnaire and the ...
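
    A stratified selection of this kind can be sketched with proportional allocation (the strata, sizes, and allocation rule below are hypothetical; the abstract does not describe the authors' procedure):

```python
import random

def stratified_sample(population, strata_key, n_total, seed=42):
    """Proportional-allocation stratified sampling sketch: group units by
    stratum, allocate n_total draws proportionally, then sample without
    replacement within each stratum.  Rounding may shift the total slightly."""
    rng = random.Random(seed)
    strata = {}
    for unit in population:
        strata.setdefault(strata_key(unit), []).append(unit)
    n_pop = len(population)
    sample = []
    for members in strata.values():
        n_h = round(n_total * len(members) / n_pop)  # proportional share
        sample.extend(rng.sample(members, min(n_h, len(members))))
    return sample
```

    With strata of 50, 30, and 20 students and `n_total=20`, the allocation is 10, 6, and 4 draws, preserving the population proportions in the sample.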

  14. Discriminative motif discovery via simulated evolution and random under-sampling.

    Directory of Open Access Journals (Sweden)

    Tao Song

    Full Text Available Conserved motifs in biological sequences are closely related to their structure and functions. Recently, discriminative motif discovery methods have attracted more and more attention. However, little attention has been devoted to the data imbalance problem, which is one of the main reasons affecting the performance of discriminative models. In this article, a simulated evolution method is applied to solve the multi-class imbalance problem at the stage of data preprocessing, and at the stage of Hidden Markov Models (HMMs) training, a random under-sampling method is introduced to address the imbalance between the positive and negative datasets. It is shown that, in the task of discovering targeting motifs of nine subcellular compartments, the motifs found by our method are more conserved than those found by methods that do not consider the data imbalance problem, and recover most of the known targeting motifs from Minimotif Miner and InterPro. Meanwhile, we use the found motifs to predict protein subcellular localization and achieve higher prediction precision and recall for the minority classes.

  15. Discriminative motif discovery via simulated evolution and random under-sampling.

    Science.gov (United States)

    Song, Tao; Gu, Hong

    2014-01-01

    Conserved motifs in biological sequences are closely related to their structure and functions. Recently, discriminative motif discovery methods have attracted more and more attention. However, little attention has been devoted to the data imbalance problem, which is one of the main reasons affecting the performance of discriminative models. In this article, a simulated evolution method is applied to solve the multi-class imbalance problem at the stage of data preprocessing, and at the stage of Hidden Markov Models (HMMs) training, a random under-sampling method is introduced to address the imbalance between the positive and negative datasets. It is shown that, in the task of discovering targeting motifs of nine subcellular compartments, the motifs found by our method are more conserved than those found by methods that do not consider the data imbalance problem, and recover most of the known targeting motifs from Minimotif Miner and InterPro. Meanwhile, we use the found motifs to predict protein subcellular localization and achieve higher prediction precision and recall for the minority classes.
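
    The random under-sampling step for balancing the positive and negative sets can be sketched as a generic class-balancing routine (this is not the authors' implementation, just the standard idea):

```python
import random
from collections import Counter

def random_undersample(X, y, seed=0):
    """Balance a labeled dataset by randomly discarding examples from the
    larger classes until every class matches the minority-class size."""
    rng = random.Random(seed)
    minority_size = min(Counter(y).values())
    by_class = {}
    for i, label in enumerate(y):
        by_class.setdefault(label, []).append(i)
    kept = []
    for idx in by_class.values():
        kept.extend(rng.sample(idx, minority_size))  # keep a random subset
    kept.sort()
    return [X[i] for i in kept], [y[i] for i in kept]
```

    The discarded majority examples are lost to training, which is the usual trade-off of under-sampling versus over-sampling approaches.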

  16. RANS Modeling of Stably Stratified Turbulent Boundary Layer Flows in OpenFOAM®

    Directory of Open Access Journals (Sweden)

    Wilson Jordan M.

    2015-01-01

    Full Text Available Quantifying mixing processes relating to the transport of heat, momentum, and scalar quantities of stably stratified turbulent geophysical flows remains a substantial task. In a stably stratified flow, such as the stable atmospheric boundary layer (SABL), buoyancy forces have a significant impact on the flow characteristics. This study investigates constant and stability-dependent turbulent Prandtl number (Prt) formulations linking the turbulent viscosity (νt) and diffusivity (κt) for modeling applications of boundary layer flows. Numerical simulations of plane Couette flow and pressure-driven channel flow are performed using the Reynolds-averaged Navier-Stokes (RANS) framework with the standard k-ε turbulence model. Results are compared with DNS data to evaluate model efficacy for predicting mean velocity and density fields. In channel flow simulations, a Prandtl number formulation for wall-bounded flows is introduced to alleviate overmixing of the mean density field. This research reveals that appropriate specification of Prt can improve predictions of stably stratified turbulent boundary layer flows.
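
    The linking relation the abstract refers to can be written compactly; the stability-dependent form is shown only schematically, since the abstract does not give the authors' exact formulation:

```latex
\kappa_t = \frac{\nu_t}{\mathrm{Pr}_t},
\qquad
\mathrm{Pr}_t =
\begin{cases}
\mathrm{Pr}_{t0}, & \text{constant formulation,}\\
\mathrm{Pr}_{t0}\, f(\mathrm{Ri}_g),\ f \text{ increasing}, & \text{stability-dependent (schematic).}
\end{cases}
```

    Here Ri_g is the gradient Richardson number; letting Prt grow with stability damps the turbulent diffusivity κt of the density field relative to the viscosity νt, which is what alleviates the overmixing noted in the channel flow simulations.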

  17. Improving depression and enhancing resilience in family dementia caregivers: a pilot randomized placebo-controlled trial of escitalopram.

    Science.gov (United States)

    Lavretsky, Helen; Siddarth, Prabha; Irwin, Michael R

    2010-02-01

    This study examined the potential of an antidepressant drug, escitalopram, to improve depression, resilience to stress, and quality of life in family dementia caregivers in a randomized placebo-controlled double-blinded trial. Forty family caregivers (43-91 years of age, 25 children and 15 spouses; 26 women) who were taking care of their relatives with Alzheimer disease were randomized to receive either escitalopram 10 mg/day or placebo for 12 weeks. Severity of depression, resilience, burden, distress, quality of life, and severity of care-recipient's cognitive and behavioral disturbances were assessed at baseline and over the course of the study. The Hamilton Depression Rating Scale scores at baseline ranged between 10 and 28. The groups were stratified by the diagnosis of major and minor depression. Most outcomes favored escitalopram over placebo. The severity of depression improved, and the remission rate was greater with the drug compared with placebo. Measures of anxiety, resilience, burden, and distress improved on escitalopram compared with placebo. Among caregivers, this small randomized controlled trial found that escitalopram use resulted in improvement in depression, resilience, burden and distress, and quality of life. Our results need to be confirmed in a larger sample.

  18. Stability of Miscible Displacements Across Stratified Porous Media

    Energy Technology Data Exchange (ETDEWEB)

    Shariati, Maryam; Yortsos, Yanis C.

    2000-09-11

    This report studied macro-scale heterogeneity effects. Reflecting their importance, current simulation practices of flow and displacement in porous media are invariably based on heterogeneous permeability fields. Here, the focus was on a specific aspect of such problems, namely the stability of miscible displacements in stratified porous media, where the displacement is perpendicular to the direction of stratification.

  19. Mechanisms and Variability of Salt Transport in Partially-Stratified Estuaries

    National Research Council Canada - National Science Library

    Bowen, Melissa

    2000-01-01

    .... Analysis of salt transport from observations in the Hudson Estuary show that stratified periods with elevated estuarine salt transport occur in five-day intervals once a month during apogean neap tides...

  20. Stratified randomization controls better for batch effects in 450K methylation analysis: A cautionary tale

    Directory of Open Access Journals (Sweden)

    Olive D. Buhule

    2014-10-01

    Full Text Available Background: Batch effects in DNA methylation microarray experiments can lead to spurious results if not properly handled during the plating of samples. Methods: Two pilot studies examining the association of DNA methylation patterns across the genome with obesity in Samoan men were investigated for chip- and row-specific batch effects. For each study, the DNA of 46 obese men and 46 lean men were assayed using Illumina's Infinium HumanMethylation450 BeadChip. In the first study (Sample One), samples from obese and lean subjects were examined on separate chips. In the second study (Sample Two), the samples were balanced on the chips by lean/obese status, age group, and census region. We used the methylumi, watermelon, and limma R packages, as well as ComBat, to analyze the data. Principal component analysis and linear regression were respectively employed to identify the top principal components and to test for their association with the batches and lean/obese status. To identify differentially methylated positions (DMPs) between obese and lean males at each locus, we used a moderated t-test. Results: Chip effects were effectively removed from Sample Two but not Sample One. In addition, dramatic differences were observed between the two sets of DMP results. After removing batch effects with ComBat, Sample One had 94,191 probes differentially methylated at a q-value threshold of 0.05 while Sample Two had zero differentially methylated probes. The disparate results from Sample One and Sample Two likely arise due to the confounding of lean/obese status with chip and row batch effects. Conclusion: Even the best possible statistical adjustments for batch effects may not completely remove them. Proper study design is vital for guarding against spurious findings due to such effects.
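
    The balanced plating used for Sample Two can be sketched as a stratified randomization of samples to chips. Group labels and chip size below are illustrative, and real 450K plating also balances row position within chips:

```python
import random

def assign_to_chips(samples, chip_size=12, seed=7):
    """Stratified randomization of samples to chips: shuffle within each
    stratum, then deal the strata round-robin so every chip receives a
    balanced mix of groups (sketch only)."""
    rng = random.Random(seed)
    strata = {}
    for s in samples:
        strata.setdefault(s["group"], []).append(s)
    pools = list(strata.values())
    for pool in pools:
        rng.shuffle(pool)  # randomize order within each stratum
    interleaved = []
    while any(pools):
        for pool in pools:  # round-robin deal across strata
            if pool:
                interleaved.append(pool.pop())
    return [interleaved[i:i + chip_size]
            for i in range(0, len(interleaved), chip_size)]
```

    With 12 obese and 12 lean samples and 12-sample chips, each chip ends up with 6 of each group, so group status cannot be confounded with chip.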

  1. Adjusting for multiple prognostic factors in the analysis of randomised trials

    Science.gov (United States)

    2013-01-01

    Background: When multiple prognostic factors are adjusted for in the analysis of a randomised trial, it is unclear (1) whether it is necessary to account for each of the strata formed by all combinations of the prognostic factors (stratified analysis) when randomisation has been balanced within each stratum (stratified randomisation), or whether adjusting for the main effects alone will suffice, and (2) the best method of adjustment in terms of type I error rate and power, irrespective of the randomisation method. Methods: We used simulation to (1) determine if a stratified analysis is necessary after stratified randomisation, and (2) compare different methods of adjustment in terms of power and type I error rate. We considered the following methods of analysis: adjusting for covariates in a regression model, adjusting for each stratum using either fixed or random effects, and Mantel-Haenszel or a stratified Cox model depending on outcome. Results: Stratified analysis is required after stratified randomisation to maintain correct type I error rates when (a) there are strong interactions between prognostic factors, and (b) there are approximately equal numbers of patients in each stratum. However, simulations based on real trial data found that type I error rates were unaffected by the method of analysis (stratified vs unstratified), indicating these conditions were not met in real datasets. Comparison of different analysis methods found that with small sample sizes and a binary or time-to-event outcome, most analysis methods lead to either inflated type I error rates or a reduction in power; the lone exception was a stratified analysis using random effects for strata, which gave nominal type I error rates and adequate power. Conclusions: It is unlikely that a stratified analysis is necessary after stratified randomisation except in extreme scenarios. Therefore, the method of analysis (accounting for the strata, or adjusting only for the covariates) will not ...
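
    Stratified randomisation itself, which these simulations take as given, is commonly implemented with permuted blocks within each stratum. A minimal sketch (two arms and a block size of 4 are assumptions for illustration):

```python
import random

def stratified_block_randomize(subjects, stratum_of, block_size=4, seed=3):
    """Permuted-block randomisation within each stratum, keeping the two
    treatment arms balanced per stratum (illustrative sketch)."""
    rng = random.Random(seed)
    buffers, assignment = {}, {}
    for subj in subjects:
        buf = buffers.setdefault(stratum_of(subj), [])
        if not buf:  # refill with a freshly shuffled balanced block
            buf.extend(["A"] * (block_size // 2) + ["B"] * (block_size // 2))
            rng.shuffle(buf)
        assignment[subj] = buf.pop()
    return assignment
```

    Because each block contains equal numbers of A and B, every stratum whose size is a multiple of the block size is exactly balanced between arms.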

  2. Inhibition of Rho-associated kinases disturbs the collective cell migration of stratified TE-10 cells

    Directory of Open Access Journals (Sweden)

    Taro Mikami

    2015-01-01

    Full Text Available BACKGROUND: The collective cell migration of stratified epithelial cells is considered to be an important phenomenon in wound healing, development, and cancer invasion; however, little is known about the mechanisms involved. Furthermore, whereas Rho family proteins, including RhoA, play important roles in cell migration, the exact role of Rho-associated coiled coil-containing protein kinases (ROCKs) in cell migration is controversial and might be cell-type dependent. Here, we report the development of a novel modified scratch assay that was used to observe the collective cell migration of stratified TE-10 cells derived from a human esophageal cancer specimen. RESULTS: Desmosomes were found between the TE-10 cells and microvilli of the surface of the cell sheet. The leading edge of cells in the cell sheet formed a simple layer and moved forward regularly; these rows were followed by the stratified epithelium. ROCK inhibitors and ROCK small interfering RNAs (siRNAs) disturbed not only the collective migration of the leading edge of this cell sheet, but also the stratified layer in the rear. In contrast, RhoA siRNA treatment resulted in more rapid migration of the leading rows and disturbed movement of the stratified portion. CONCLUSIONS: The data presented in this study suggest that ROCKs play an important role in mediating the collective migration of TE-10 cell sheets. In addition, differences between the effects of siRNAs targeting either RhoA or ROCKs suggested that distinct mechanisms regulate the collective cell migration in the simple epithelium of the wound edge versus the stratified layer of the epithelium.

  3. Stratified charge rotary engine combustion studies

    Science.gov (United States)

    Shock, H.; Hamady, F.; Somerton, C.; Stuecken, T.; Chouinard, E.; Rachal, T.; Kosterman, J.; Lambeth, M.; Olbrich, C.

    1989-07-01

    Analytical and experimental studies of the combustion process in a stratified charge rotary engine (SCRE) have continued to be the subject of active research in recent years. Specifically, to meet the demand for more sophisticated products, a detailed understanding of the engine system of interest is warranted. With this in mind, the objective of this work is to develop an understanding of the controlling factors that affect the SCRE combustion process so that an efficient, power-dense rotary engine can be designed. The induction-exhaust systems and the rotor geometry are believed to have a significant effect on combustion chamber flow characteristics. In this report, emphasis is centered on Laser Doppler Velocimetry (LDV) measurements and on qualitative flow visualizations in the combustion chamber of the motored rotary engine assembly. This will provide a basic understanding of the flow process in the RCE and serve as a database for verification of numerical simulations. Understanding fuel injection provisions is also important to the successful operation of the stratified charge rotary engine. Toward this end, flow visualizations depicting the development of high speed, high pressure fuel jets are described. Friction is an important consideration in an engine from the standpoint of lost work, durability and reliability. MSU Engine Research Laboratory efforts in assessing the frictional losses associated with the rotary engine are described. This includes work which describes losses in bearing, seal and auxiliary components. Finally, a computer controlled mapping system under development is described. This system can be used to accurately map shapes such as combustion chambers, intake manifolds or turbine blades.

  4. Accounting for randomness in measurement and sampling in studying cancer cell population dynamics.

    Science.gov (United States)

    Ghavami, Siavash; Wolkenhauer, Olaf; Lahouti, Farshad; Ullah, Mukhtar; Linnebacher, Michael

    2014-10-01

    Knowing the expected temporal evolution of the proportion of different cell types in sample tissues gives an indication about the progression of the disease and its possible response to drugs. Such systems have been modelled using Markov processes. We here consider an experimentally realistic scenario in which transition probabilities are estimated from noisy cell population size measurements. Using aggregated data of FACS measurements, we develop MMSE and ML estimators and formulate two problems to find the minimum number of required samples and measurements to guarantee the accuracy of predicted population sizes. Our numerical results show that the convergence mechanism of transition probabilities and steady states differ widely from the real values if one uses the standard deterministic approach for noisy measurements. This provides support for our argument that for the analysis of FACS data one should consider the observed state as a random variable. The second problem we address is about the consequences of estimating the probability of a cell being in a particular state from measurements of small population of cells. We show how the uncertainty arising from small sample sizes can be captured by a distribution for the state probability.
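
    In the noise-free idealization, maximum-likelihood estimation of a Markov chain's transition probabilities from observed state sequences reduces to row-normalized transition counts; a minimal sketch follows (the paper's MMSE/ML estimators for noisy FACS population counts are more involved):

```python
def ml_transition_matrix(state_sequences, n_states):
    """Maximum-likelihood estimate of a Markov transition matrix from
    fully observed state sequences: count transitions, normalize rows."""
    counts = [[0] * n_states for _ in range(n_states)]
    for seq in state_sequences:
        for a, b in zip(seq, seq[1:]):  # consecutive state pairs
            counts[a][b] += 1
    matrix = []
    for row in counts:
        total = sum(row)
        # unvisited states get a uniform row rather than dividing by zero
        matrix.append([c / total if total else 1.0 / n_states for c in row])
    return matrix
```

    When the observed state is itself a noisy random variable, as argued above, these raw-count estimates become biased, which motivates treating the measurement process explicitly.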

  5. RANdom SAmple Consensus (RANSAC) algorithm for material-informatics: application to photovoltaic solar cells.

    Science.gov (United States)

    Kaspi, Omer; Yosipof, Abraham; Senderowitz, Hanoch

    2017-06-06

    An important aspect of chemoinformatics and material-informatics is the usage of machine learning algorithms to build Quantitative Structure Activity Relationship (QSAR) models. The RANdom SAmple Consensus (RANSAC) algorithm is a predictive modeling tool widely used in the image processing field for cleaning datasets of noise. RANSAC can be used as a "one stop shop" algorithm for developing and validating QSAR models, performing outlier removal, descriptor selection, model development, and prediction for test set samples using an applicability domain. For "future" predictions (i.e., for samples not included in the original test set) RANSAC provides a statistical estimate of the probability of obtaining reliable predictions, i.e., predictions within a pre-defined number of standard deviations from the true values. In this work we describe the first application of RANSAC in material informatics, focusing on the analysis of solar cells. We demonstrate that for three datasets representing different metal oxide (MO) based solar cell libraries, RANSAC-derived models select descriptors previously shown to correlate with key photovoltaic properties and lead to good predictive statistics for these properties. These models were subsequently used to predict the properties of virtual solar cell libraries, highlighting interesting dependencies of PV properties on MO compositions.
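
    The core RANSAC loop can be sketched for the simplest case of fitting a line, with outlier removal emerging from the consensus step (tolerance and iteration count are illustrative choices):

```python
import random

def ransac_line(points, n_iters=200, inlier_tol=0.5, seed=0):
    """Minimal RANSAC sketch for fitting y = a*x + b: repeatedly fit a
    2-point model and keep the one with the largest inlier consensus."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(n_iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)  # minimal subset
        if x1 == x2:
            continue
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = [(x, y) for x, y in points
                   if abs(y - (a * x + b)) < inlier_tol]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (a, b), inliers
    return best_model, best_inliers
```

    Roughly speaking, in the QSAR setting the 2-point line fit would be replaced by a regression model over a random sample subset, with the final inlier set defining the applicability domain; that mapping is an interpretation, not a description of the authors' pipeline.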

  6. Relationship between accuracy and number of samples on statistical quantity and contour map of environmental gamma-ray dose rate. Example of random sampling

    International Nuclear Information System (INIS)

    Matsuda, Hideharu; Minato, Susumu

    2002-01-01

    The accuracy of statistical quantities such as the mean value and the contour map obtained by measurement of the environmental gamma-ray dose rate was evaluated by random sampling of 5 different model distribution maps made with the mean slope, -1.3, of power spectra calculated from actually measured values. The values were derived from 58 natural gamma dose rate data sets reported worldwide, with means ranging over 10-100 nGy/h and areas over 10^-3 to 10^7 km^2. The accuracy of the mean value was found to be around ±7% even for 60 or 80 samplings (the most frequent numbers), and the standard deviation had an accuracy less than 1/4-1/3 of the means. The correlation coefficient of the frequency distribution was found to be 0.860 or more for 200-400 samplings (the most frequent number), but that of the contour map only 0.502-0.770. (K.H.)
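
    The dependence of mean-value accuracy on the number of random samplings can be reproduced in miniature. The synthetic "field" and trial count below are arbitrary choices, not the paper's model distribution maps:

```python
import random
import statistics

def sampling_error_of_mean(field, n_samples, n_trials=2000, seed=5):
    """Mean relative error of the sample mean over repeated random draws
    of n_samples values from a fixed field of dose-rate readings."""
    rng = random.Random(seed)
    true_mean = statistics.fmean(field)
    rel_errors = []
    for _ in range(n_trials):
        draw = rng.sample(field, n_samples)  # random sampling of locations
        rel_errors.append(abs(statistics.fmean(draw) - true_mean) / true_mean)
    return statistics.fmean(rel_errors)
```

    The error shrinks roughly as 1/sqrt(n_samples); the absolute level reached at a given sample count depends on the spatial variability of the field, which is why the power-spectrum slope matters in the study.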

  7. Health-related quality of life predictors during medical residency in a random, stratified sample of residents

    Directory of Open Access Journals (Sweden)

    Paula Costa Mosca Macedo

    2009-06-01

    Full Text Available OBJECTIVE: To evaluate the quality of life of medical residents during the first three years of training and identify its association with sociodemographic-occupational characteristics, leisure time and health habits. METHOD: A cross-sectional study was conducted with a random sample of 128 residents stratified by year of training. The Medical Outcome Study Short Form-36 was administered. Mann-Whitney tests were carried out to compare percentile distributions of the eight quality of life domains according to sociodemographic variables, and a multiple linear regression analysis was performed, followed by a validity check of the resulting models. RESULTS: The physical component presented higher quality of life medians than the mental component. Comparisons between the three years showed that in almost all domains the quality of life scores of the second-year residents were higher than those of the first-year residents (p < 0.01); for the mental component, scores were higher in the third year than in the other years (p < 0.01). Predictors of higher quality of life were: being in the second or ...

  8. Equipment for extracting and conveying stratified minerals

    Energy Technology Data Exchange (ETDEWEB)

    Blumenthal, G.; Kunzer, H.; Plaga, K.

    1991-08-14

    This invention relates to equipment for extracting stratified minerals and conveying the said minerals along the working face, comprising a trough shaped conveyor run assembled from lengths, a troughed extraction run in lengths matching the lengths of conveyor troughing, which is linked to the top edge of the working face side of the conveyor troughing with freedom to swivel vertically, and a positively guided chain carrying extraction tools and scrapers along the conveyor and extraction runs.

  9. A mechanically enhanced hybrid nano-stratified barrier with a defect suppression mechanism for highly reliable flexible OLEDs.

    Science.gov (United States)

    Jeong, Eun Gyo; Kwon, Seonil; Han, Jun Hee; Im, Hyeon-Gyun; Bae, Byeong-Soo; Choi, Kyung Cheol

    2017-05-18

    Understanding the mechanical behaviors of encapsulation barriers under bending stress is important when fabricating flexible organic light-emitting diodes (FOLEDs). The enhanced mechanical characteristics of a nano-stratified barrier were analyzed based on a defect suppression mechanism, and then experimentally demonstrated. Following the Griffith model, naturally-occurring cracks, which were caused by Zn etching at the interface of the nano-stratified structure, can curb the propagation of defects. Cross-section images after bending tests provided remarkable evidence to support the existence of a defect suppression mechanism. Many visible cracks were found in a single Al2O3 layer, but not in the nano-stratified structure, due to the mechanism. The nano-stratified structure also enhanced the barrier's physical properties by changing the crystalline phase of ZnO. In addition, experimental results demonstrated the effect of the mechanism in various ways. The nano-stratified barrier maintained a low water vapor transmission rate after 1000 iterations of a 1 cm bending radius test. Using this mechanically enhanced hybrid nano-stratified barrier, FOLEDs were successfully encapsulated without losing mechanical or electrical performance. Finally, comparative lifetime measurements were conducted to determine reliability. After 2000 hours of constant current driving and 1000 iterations with a 1 cm bending radius, the FOLEDs retained 52.37% of their initial luminance, which is comparable to glass-lid encapsulation, with 55.96% retention. Herein, we report a mechanically enhanced encapsulation technology for FOLEDs using a nano-stratified structure with a defect suppression mechanism.

  10. Evaluation of a Stratified National Breast Screening Program in the United Kingdom : An Early Model-Based Cost-Effectiveness Analysis

    NARCIS (Netherlands)

    Gray, Ewan; Donten, Anna; Karssemeijer, Nico; van Gils, Carla; Evans, D. Gareth R.; Astley, Sue; Payne, Katherine

    Objectives: To identify the incremental costs and consequences of stratified national breast screening programs (stratified NBSPs) and drivers of relative cost-effectiveness. Methods: A decision-analytic model (discrete event simulation) was conceptualized to represent four stratified NBSPs (risk 1,

  11. Evaluation of a Stratified National Breast Screening Program in the United Kingdom: An Early Model-Based Cost-Effectiveness Analysis

    NARCIS (Netherlands)

    Gray, E.; Donten, A.; Karssemeijer, N.; Gils, C. van; Evans, D.G.; Astley, S.; Payne, K.

    2017-01-01

    OBJECTIVES: To identify the incremental costs and consequences of stratified national breast screening programs (stratified NBSPs) and drivers of relative cost-effectiveness. METHODS: A decision-analytic model (discrete event simulation) was conceptualized to represent four stratified NBSPs (risk 1,

  12. Choosing a suitable sample size in descriptive sampling

    International Nuclear Information System (INIS)

    Lee, Yong Kyun; Choi, Dong Hoon; Cha, Kyung Joon

    2010-01-01

    Descriptive sampling (DS) is an alternative to crude Monte Carlo sampling (CMCS) in finding solutions to structural reliability problems. It is known to be an effective sampling method in approximating the distribution of a random variable because it uses the deterministic selection of sample values and their random permutation. However, because this method is difficult to apply to complex simulations, the sample size is occasionally determined without thorough consideration. Input sample variability may cause the sample size to change between runs, leading to poor simulation results. This paper proposes a numerical method for choosing a suitable sample size for use in DS. Using this method, one can estimate a more accurate probability of failure in a reliability problem while running a minimal number of simulations. The method is then applied to several examples and compared with CMCS and conventional DS to validate its usefulness and efficiency.
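
    A minimal sketch of the DS idea for a standard normal input variable: the sample values are deterministic equiprobable quantiles, and only their order is randomized (the distribution and sample size here are illustrative):

```python
import random
from statistics import NormalDist

def descriptive_sample(n, dist=NormalDist(0.0, 1.0), seed=9):
    """Descriptive sampling sketch: n deterministic, equiprobable quantile
    values of `dist` (interval midpoints), returned in random order."""
    rng = random.Random(seed)
    values = [dist.inv_cdf((i + 0.5) / n) for i in range(n)]  # midpoints
    rng.shuffle(values)  # the random permutation step of DS
    return values
```

    Because the set of values is fixed for a given n, the sample mean and variance have no sampling fluctuation at all, unlike a crude Monte Carlo draw of the same size; only the pairing with other input variables is random.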

  13. Crenothrix are major methane consumers in stratified lakes.

    Science.gov (United States)

    Oswald, Kirsten; Graf, Jon S; Littmann, Sten; Tienken, Daniela; Brand, Andreas; Wehrli, Bernhard; Albertsen, Mads; Daims, Holger; Wagner, Michael; Kuypers, Marcel Mm; Schubert, Carsten J; Milucka, Jana

    2017-09-01

    Methane-oxidizing bacteria represent a major biological sink for methane and are thus Earth's natural protection against this potent greenhouse gas. Here we show that in two stratified freshwater lakes a substantial part of upward-diffusing methane was oxidized by filamentous gamma-proteobacteria related to Crenothrix polyspora. These filamentous bacteria have been known as contaminants of drinking water supplies since 1870, but their role in environmental methane removal has remained unclear. While oxidizing methane, these organisms were assigned an 'unusual' methane monooxygenase (MMO), which was only distantly related to the 'classical' MMO of gamma-proteobacterial methanotrophs. We now correct this assignment and show that Crenothrix encode a typical gamma-proteobacterial PmoA. Stable isotope labeling in combination with single-cell imaging mass spectrometry revealed methane-dependent growth of the lacustrine Crenothrix with oxygen as well as under oxygen-deficient conditions. Crenothrix genomes encoded pathways for the respiration of oxygen as well as for the reduction of nitrate to N2O. The observed abundance and planktonic growth of Crenothrix suggest that these methanotrophs can act as a relevant biological sink for methane in stratified lakes and should be considered in the context of environmental removal of methane.

  14. Crystallization of a compositionally stratified basal magma ocean

    Science.gov (United States)

    Laneuville, Matthieu; Hernlund, John; Labrosse, Stéphane; Guttenberg, Nicholas

    2018-03-01

    Earth's ∼3.45 billion year old magnetic field is regenerated by dynamo action in its convecting liquid metal outer core. However, convection induces an isentropic thermal gradient which, coupled with a high core thermal conductivity, results in rapid conducted heat loss. In the absence of implausibly high radioactivity or alternate sources of motion to drive the geodynamo, the Earth's early core had to be significantly hotter than the melting point of the lower mantle. While the existence of a dense convecting basal magma ocean (BMO) has been proposed to account for high early core temperatures, the requisite physical and chemical properties for a BMO remain controversial. Here we relax the assumption of a well-mixed convecting BMO and instead consider a BMO that is initially gravitationally stratified owing to processes such as mixing between metals and silicates at high temperatures in the core-mantle boundary region during Earth's accretion. Using coupled models of crystallization and heat transfer through a stratified BMO, we show that very high temperatures could have been trapped inside the early core, sequestering enough heat energy to run an ancient geodynamo on cooling power alone.

  15. Turbulent circulation above the surface heat source in stably stratified atmosphere

    Science.gov (United States)

    Kurbatskii, A. F.; Kurbatskaya, L. I.

    2016-10-01

    The 3-level RANS approach for simulating a turbulent circulation over a heat island in a stably stratified environment under nearly calm conditions is formulated. The turbulent kinetic energy, its spectral consumption (dissipation), and the dispersion of turbulent temperature fluctuations are found from differential equations, so that the correct modeling of transport processes in the interface layer with the counter-gradient heat flux is assured. The three-parameter RANS turbulence approach minimizes difficulties in simulating turbulent transport in a stably stratified environment and reduces the effort needed for the numerical implementation of the 3-level RANS approach. Numerical simulation of the turbulent structure of penetrative convection over the heat island under conditions of a stably stratified atmosphere demonstrates that the three-equation model is able to predict the thermal circulation induced by the heat island. The temperature distribution, root-mean-square fluctuations of the turbulent velocity and temperature fields, and the spectral turbulent kinetic energy flux are in good agreement with the experimental data. The model also captures subtle physical effects, such as the crossing of vertical temperature profiles of a thermal plume and the formation of a negative-buoyancy region, testifying to the development of a dome-shaped ("hat") structure at the top of the plume.

  16. Investigations of contaminated fluvial sediment deposits: merging of statistical and geomorphic approaches.

    Science.gov (United States)

    Ryti, Randall T; Reneau, Steven L; Katzman, Danny

    2005-05-01

    Concentrations of contaminants in sediment deposits can have large spatial variability resulting from geomorphic processes acting over long time periods. Thus, systematic (e.g., regularly spaced sample locations) or random sampling approaches might be inefficient and/or lead to highly biased results. We demonstrate the bias associated with systematic sampling and compare these results to those achieved by methods that merge a geomorphic approach to evaluating the physical system and stratified random sampling concepts. By combining these approaches, we achieve a more efficient and less biased characterization of sediment contamination in fluvial systems. These methods are applied using a phased sampling approach to characterize radiological contamination in sediment deposits in two semiarid canyons that have received historical releases from the Los Alamos National Laboratory. Uncertainty in contaminant inventory was used as a metric to evaluate the adequacy of sampling during these phased investigations. Simple, one-dimensional Monte Carlo simulations were used to estimate uncertainty in contaminant inventory. We also show how one can use stratified random sampling theory to help estimate uncertainty in mean contaminant concentrations.
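
    The closing idea, propagating stratum-level sampling into inventory uncertainty with a simple one-dimensional Monte Carlo, can be sketched as follows (the data structure and within-stratum resampling rule are assumptions for illustration, not the authors' exact procedure):

```python
import random
import statistics

def stratified_inventory(strata, n_sims=5000, seed=11):
    """One-dimensional Monte Carlo for total contaminant inventory: in
    each simulation draw one concentration per stratum, scale it by the
    stratum area, and sum across strata.
    `strata` maps name -> (area, list_of_concentration_samples)."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_sims):
        totals.append(sum(area * rng.choice(conc)
                          for area, conc in strata.values()))
    return statistics.fmean(totals), statistics.pstdev(totals)
```

    The spread of the simulated totals serves as the uncertainty metric: if adding a sampling phase no longer narrows it appreciably, the phased characterization can stop.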

  17. Degradation of organic dyes using spray deposited nanocrystalline stratified WO3/TiO2 photoelectrodes under sunlight illumination

    Science.gov (United States)

    Hunge, Y. M.; Yadav, A. A.; Mahadik, M. A.; Bulakhe, R. N.; Shim, J. J.; Mathe, V. L.; Bhosale, C. H.

    2018-02-01

    This work addresses the need to utilize TiO2-based metal oxide hetero-nanostructures for the degradation of environmental pollutants such as Rhodamine B and reactive red 152 in wastewater, using a stratified WO3/TiO2 catalyst under sunlight illumination. WO3, TiO2 and stratified WO3/TiO2 catalysts were prepared by a spray pyrolysis method. The stratified WO3/TiO2 heterostructure was found to have high crystallinity with no mixed-phase formation, strong optical absorption in the visible region of the solar spectrum, and a large surface area. The photocatalytic activity was tested for degradation of Rhodamine B (Rh B) and reactive red 152 in an aqueous medium. The TiO2 layer in the stratified WO3/TiO2 catalyst helps extend its absorption spectrum into the solar light region. Rh B and reactive red 152 are eliminated up to 98% and 94% within 30 and 40 min, respectively, at optimum experimental conditions by stratified WO3/TiO2. Moreover, the stratified WO3/TiO2 photoelectrode has better stability and reusability than the individual TiO2 and WO3 thin films in the degradation of Rh B and reactive red 152. The photoelectrocatalytic results indicate that the stratified WO3/TiO2 photoelectrode is a promising material for dye removal.

  18. Identification of major planktonic sulfur oxidizers in stratified freshwater lake.

    Directory of Open Access Journals (Sweden)

    Hisaya Kojima

    Full Text Available Planktonic sulfur oxidizers are important constituents of ecosystems in stratified water bodies and contribute to sulfide detoxification. In contrast to marine environments, the taxonomic identities of the major planktonic sulfur oxidizers in freshwater lakes remain largely unknown. Bacterioplankton community structure was analyzed in a stratified freshwater lake, Lake Mizugaki in Japan. In the 16S rRNA gene clone libraries, clones very closely related to a sulfur oxidizer isolated from this lake, Sulfuritalea hydrogenivorans, were detected in the deep anoxic water and accounted for up to 12.5% of each library from the different water depths. Assemblages of planktonic sulfur oxidizers were specifically analyzed by constructing clone libraries of genes involved in sulfur oxidation: aprA, dsrA, soxB and sqr. In these libraries, clones related to betaproteobacteria were detected at high frequencies, including close relatives of Sulfuritalea hydrogenivorans.

  19. Mixing of stratified flow around bridge piers in steady current

    DEFF Research Database (Denmark)

    Jensen, Bjarne; Carstensen, Stefan; Christensen, Erik Damgaard

    2018-01-01

    This paper presents the results of an experimental and numerical investigation of the mixing of stratified flow around bridge pier structures. In this study, which was carried out in connection with the Fehmarnbelt Fixed Link environmental impact assessment, the mixing processes of a two-layer stratification were studied in which the lower layer had a higher salinity than the upper layer. The physical experiments investigated two different pier designs. A general study was made regarding forces on the piers, in which the effect of the current angle relative to the structure was also included...

  20. Quantum image pseudocolor coding based on the density-stratified method

    Science.gov (United States)

    Jiang, Nan; Wu, Wenya; Wang, Luo; Zhao, Na

    2015-05-01

    Pseudocolor processing is a branch of image enhancement. It dyes grayscale images into color images to make them more visually appealing or to highlight certain parts of the image. This paper proposes a quantum image pseudocolor coding scheme based on the density-stratified method, which defines a colormap and changes the density values from gray to color in parallel according to that colormap. Firstly, two data structures, the quantum image representation GQIR and the quantum colormap QCR, are reviewed or proposed. Then, the quantum density-stratified algorithm is presented. Based on them, the quantum realization in the form of circuits is given. The main advantages of the quantum version of pseudocolor processing over the classical approach are that it needs less memory and can speed up the computation. Two kinds of examples help describe the scheme further. Finally, future work is discussed.
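    The density-stratified idea has a simple classical analogue, density slicing: each gray value is mapped to the color of the intensity band (stratum) it falls in. A minimal classical sketch with an illustrative four-band colormap; the quantum circuits themselves are beyond a short example:

```python
def density_slice(gray, colormap):
    """Classical density slicing: map each gray value to the color of the
    intensity stratum it falls in. `colormap` is a list of
    (upper_bound, (r, g, b)) pairs sorted by ascending bound."""
    return [
        [next(color for bound, color in colormap if g <= bound) for g in row]
        for row in gray
    ]

# four strata covering the 0-255 gray range (illustrative colormap)
colormap = [(63, (0, 0, 255)), (127, (0, 255, 0)),
            (191, (255, 255, 0)), (255, (255, 0, 0))]
img = [[0, 64, 200],
       [130, 255, 10]]
rgb = density_slice(img, colormap)
```

The quantum scheme performs the same gray-to-color lookup, but on all pixels in parallel via superposition, which is the source of the claimed speedup.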

  1. Relationships of the phase velocity with the microarchitectural parameters in bovine trabecular bone in vitro: Application of a stratified model

    Science.gov (United States)

    Lee, Kang Il

    2012-08-01

    The present study aims to provide insight into the relationships of the phase velocity with the microarchitectural parameters in bovine trabecular bone in vitro. The frequency-dependent phase velocity was measured in 22 bovine femoral trabecular bone samples by using a pair of transducers with a diameter of 25.4 mm and a center frequency of 0.5 MHz. The phase velocity exhibited positive correlation coefficients of 0.48 and 0.32 with the ratio of bone volume to total volume and the trabecular thickness, respectively, but a negative correlation coefficient of -0.62 with the trabecular separation. The best univariate predictor of the phase velocity was the trabecular separation, yielding an adjusted squared correlation coefficient of 0.36. The multivariate regression models yielded adjusted squared correlation coefficients of 0.21-0.36. The theoretical phase velocity predicted by using a stratified model for wave propagation in periodically stratified media consisting of alternating parallel solid-fluid layers showed reasonable agreements with the experimental measurements.

  2. Suppression of stratified explosive interactions

    Energy Technology Data Exchange (ETDEWEB)

    Meeks, M.K.; Shamoun, B.I.; Bonazza, R.; Corradini, M.L. [Wisconsin Univ., Madison, WI (United States). Dept. of Nuclear Engineering and Engineering Physics

    1998-01-01

    Stratified Fuel-Coolant Interaction (FCI) experiments with Refrigerant-134a and water were performed in a large-scale system. Air was uniformly injected into the coolant pool to establish a pre-existing void which could suppress the explosion. Two competing effects due to the variation of the air flow rate seem to influence the intensity of the explosion in this geometrical configuration. At low flow rates, although the injected air increases the void fraction, the concurrent agitation and mixing increases the intensity of the interaction. At higher flow rates, the increase in void fraction tends to attenuate the propagated pressure wave generated by the explosion. Experimental results show a complete suppression of the vapor explosion at high rates of air injection, corresponding to an average void fraction of larger than 30%. (author)

  3. Location and multi-depot vehicle routing for emergency vehicles using tour coverage and random sampling

    Directory of Open Access Journals (Sweden)

    Alireza Goli

    2015-09-01

    Full Text Available Distribution and optimum allocation of emergency resources are among the most important tasks that need to be accomplished during a crisis. When a natural disaster such as an earthquake or flood takes place, it is necessary to deliver rescue efforts as quickly as possible; it is therefore important to find the optimum location and distribution of emergency relief resources. When a natural disaster occurs, it is not possible to reach some damaged areas. In this paper, location and multi-depot vehicle routing for emergency vehicles using tour coverage and random sampling is investigated. In this approach, there is no need to visit all the demand points, and some demand points receive their needs from the nearest possible location. The proposed method is implemented on randomly generated instances of different sizes. The preliminary results indicate that the proposed method is capable of reaching desirable solutions in a reasonable amount of time.

  4. Sampling Large Graphs for Anticipatory Analytics

    Science.gov (United States)

    2015-05-15

    Sampling Large Graphs for Anticipatory Analytics. Lauren Edwards, Luke Johnson, Maja Milosavljevic, Vijay Gadepally, Benjamin A. Miller, Lincoln... C. Random Area Sampling: Random area sampling [8] is a "snowball" sampling method in which a set of random seed vertices are selected and areas... systems, greater human-in-the-loop involvement, or through complex algorithms. We are investigating the use of sampling to mitigate these challenges
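    The "snowball" random area sampling mentioned in the snippet can be sketched classically: pick random seed vertices, then include everything within a fixed number of hops of each seed. A toy illustration under assumed data structures (an adjacency dict and a hop count); this is not the paper's implementation:

```python
import random

def random_area_sample(adj, num_seeds, hops, rng=None):
    """'Snowball' (random area) sampling sketch: choose random seed
    vertices, then grow the sample by breadth-first expansion for a
    fixed number of hops around the seeds."""
    rng = rng or random.Random()
    seeds = rng.sample(sorted(adj), num_seeds)
    sampled, frontier = set(seeds), set(seeds)
    for _ in range(hops):
        frontier = {v for u in frontier for v in adj[u]} - sampled
        sampled |= frontier
    return sampled

# tiny example graph as an adjacency dict (illustrative)
adj = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1, 4], 4: [3], 5: []}
sub = random_area_sample(adj, num_seeds=2, hops=1, rng=random.Random(0))
```

The resulting subgraph preserves local neighborhood structure around the seeds, which is why area sampling is favored when community-level properties must survive the sampling step.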

  5. Multiyear Synthesis of the Macroinvertebrate Component From 1992 to 2002 for the Long Term Resource Monitoring Program

    National Research Council Canada - National Science Library

    Sauer, Jennifer

    2004-01-01

    ...) were added to the sampling design in 1993 and zebra mussels (Dreissena polymorpha) were added in 1995. Sampling was based on a stratified random design and conducted at approximately 125 sites per study area...

  6. Department of Geography, Ibra

    African Journals Online (AJOL)

    USER

    2016-09-07

    Sep 7, 2016 ... the sample unit was determined using a stratified sampling method and a simple random technique ... demands of the population (WWAP, ... the national average of 1.8%. ... Optimal Access ... an updated figure was used to calculate.

  7. Prevalence of alcohol-impaired drivers based on random breath tests in a roadside survey in Catalonia (Spain).

    Science.gov (United States)

    Alcañiz, Manuela; Guillén, Montserrat; Santolino, Miguel; Sánchez-Moscona, Daniel; Llatje, Oscar; Ramon, Lluís

    2014-04-01

    Sobriety checkpoints are not usually randomly located by traffic authorities. As such, information provided by non-random alcohol tests cannot be used to infer the characteristics of the general driving population. In this paper a case study is presented in which the prevalence of alcohol-impaired driving is estimated for the general population of drivers. A stratified probabilistic sample was designed to represent vehicles circulating in non-urban areas of Catalonia (Spain), a region characterized by its complex transportation network and dense traffic around the metropolis of Barcelona. Random breath alcohol concentration tests were performed during spring 2012 on 7596 drivers. The estimated prevalence of alcohol-impaired drivers was 1.29%, which is roughly a third of the rate obtained in non-random tests. Higher rates were found on weekends (1.90% on Saturdays and 4.29% on Sundays) and especially at night. The rate is higher for men (1.45%) than for women (0.64%) and it shows an increasing pattern with age. In vehicles with two occupants, the proportion of alcohol-impaired drivers is estimated at 2.62%, but when the driver was alone the rate drops to 0.84%, which might reflect the socialization of drinking habits. The results are compared with outcomes in previous surveys, showing a decreasing trend in the prevalence of alcohol-impaired drivers over time. Copyright © 2014 Elsevier Ltd. All rights reserved.
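    The stratified probabilistic design above amounts to weighting each stratum's positive rate by its known share of circulating traffic. A minimal sketch with invented strata and counts; the survey's actual weights and test results are not reproduced here:

```python
def stratified_prevalence(strata):
    """Estimate overall prevalence from a stratified probabilistic survey:
    a weighted average of per-stratum positive rates, weighted by each
    stratum's known share of the target population (weights sum to 1)."""
    assert abs(sum(w for w, _, _ in strata.values()) - 1.0) < 1e-9
    return sum(w * pos / n for w, n, pos in strata.values())

# illustrative strata: (traffic share, drivers tested, positive tests)
strata = {
    "weekday_day":   (0.55, 3000, 18),
    "weekend_day":   (0.25, 2500, 45),
    "weekend_night": (0.20, 2096, 90),
}
prevalence = stratified_prevalence(strata)
```

Weighting by traffic share rather than by tests performed is what lets a random roadside survey generalize to the driving population, unlike checkpoints placed where police expect offenders.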

  8. Parameters, test criteria and fault assessment in random sampling of waste barrels from non-qualified processes

    International Nuclear Information System (INIS)

    Martens, B.R.

    1989-01-01

    In the context of random sampling tests, parameters are checked on the waste barrels and criteria are given on which these tests are based. Also, it is shown how faulty data on the properties of the waste or faulty waste barrels should be treated. To decide the extent of testing, the properties of the waste relevant to final storage are determined based on the conditioning process used. (DG) [de

  9. Yoga for generalized anxiety disorder: design of a randomized controlled clinical trial.

    Science.gov (United States)

    Hofmann, Stefan G; Curtiss, Joshua; Khalsa, Sat Bir S; Hoge, Elizabeth; Rosenfield, David; Bui, Eric; Keshaviah, Aparna; Simon, Naomi

    2015-09-01

    Generalized anxiety disorder (GAD) is a common disorder associated with significant distress and interference. Although cognitive behavioral therapy (CBT) has been shown to be the most effective form of psychotherapy, few patients receive or have access to this intervention. Yoga therapy offers another promising, yet under-researched, intervention that is gaining increasing popularity in the general public as an anxiety-reduction intervention. The purpose of this innovative clinical trial protocol is to investigate the efficacy of a Kundalini Yoga intervention, relative to CBT and a control condition. Kundalini yoga and CBT are compared with each other in a noninferiority test, and both treatments are compared to stress education training, an attention control intervention, in superiority tests. The sample will consist of 230 individuals with a primary DSM-5 diagnosis of GAD. This randomized controlled trial will compare yoga (N=95) to both CBT for GAD (N=95) and stress education (N=40), a commonly used control condition. All three treatments will be administered by two instructors in a group format over 12 weekly sessions with four to six patients per group. Groups will be randomized using permuted block randomization, which will be stratified by site. Treatment outcome will be evaluated bi-weekly and at 6-month follow-up. Furthermore, potential mediators of treatment outcome will be investigated. Given the individual and economic burden associated with GAD, identifying accessible alternative behavioral treatments will have substantive public health implications. Copyright © 2015 Elsevier Inc. All rights reserved.
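    The protocol's permuted block randomization, stratified by site, can be sketched as follows. Equal allocation per block is assumed here for simplicity, whereas the trial itself allocates 95/95/40 across arms:

```python
import random
from collections import Counter

def permuted_block_schedule(n, arms, block_size, rng):
    """Permuted block randomization within one stratum (e.g., one site):
    arms are shuffled inside fixed-size blocks so that allocation stays
    balanced throughout enrollment."""
    assert block_size % len(arms) == 0
    schedule = []
    while len(schedule) < n:
        block = arms * (block_size // len(arms))  # equal copies of each arm
        rng.shuffle(block)                        # permute within the block
        schedule.extend(block)
    return schedule[:n]

# stratified by site: each stratum gets its own independent schedule
rng = random.Random(42)
site_a = permuted_block_schedule(24, ["yoga", "CBT", "stress_ed"], 6, rng)
counts = Counter(site_a)
```

Because every completed block contains each arm equally often, the arms can never drift far out of balance within a site, even if enrollment stops early.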

  10. Bacterial production, protozoan grazing, and mineralization in stratified Lake Vechten

    NARCIS (Netherlands)

    Bloem, J.

    1989-01-01

    The role of heterotrophic nanoflagellates (HNAN, size 2-20 μm) in grazing on bacteria and mineralization of organic matter in stratified Lake Vechten was studied.

    Quantitative effects of manipulation and fixation on HNAN were checked. Considerable losses were caused by

  11. Internal wave patterns in enclosed density-stratified and rotating fluids

    NARCIS (Netherlands)

    Manders, A.M.A.

    2003-01-01

    Stratified fluids support internal waves, which propagate obliquely through the fluid. The angle with respect to the stratification direction is constrained: it is determined purely by the wave frequency and the strength of the density stratification (internal gravity waves) or the rotation rate

  12. Computational Fluid Dynamics model of stratified atmospheric boundary-layer flow

    DEFF Research Database (Denmark)

    Koblitz, Tilman; Bechmann, Andreas; Sogachev, Andrey

    2015-01-01

    For wind resource assessment, the wind industry is increasingly relying on computational fluid dynamics models of the neutrally stratified surface-layer. So far, physical processes that are important to the whole atmospheric boundary-layer, such as the Coriolis effect, buoyancy forces and heat...

  13. A criterion for the onset of slugging in horizontal stratified air-water countercurrent flow

    International Nuclear Information System (INIS)

    Chun, Moon-Hyun; Lee, Byung-Ryung; Kim, Yang-Seok

    1995-01-01

    This paper presents an experimental and theoretical investigation of wave height and transition criterion from wavy to slug flow in horizontal air-water countercurrent stratified flow conditions. A theoretical formula for the wave height in a stratified wavy flow regime has been developed using the concept of total energy balance over a wave crest to consider the shear stress acting on the interface of two fluids. From the limiting condition of the formula for the wave height, a necessary criterion for transition from a stratified wavy flow to a slug flow has been derived. A series of experiments have been conducted changing the non-dimensional water depth and the flow rates of air in a horizontal pipe and a duct. Comparisons between the measured data and the predictions of the present theory show that the agreement is within ±8%

  14. A criterion for the onset of slugging in horizontal stratified air-water countercurrent flow

    Energy Technology Data Exchange (ETDEWEB)

    Chun, Moon-Hyun; Lee, Byung-Ryung; Kim, Yang-Seok [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)] [and others]

    1995-09-01

    This paper presents an experimental and theoretical investigation of wave height and transition criterion from wavy to slug flow in horizontal air-water countercurrent stratified flow conditions. A theoretical formula for the wave height in a stratified wavy flow regime has been developed using the concept of total energy balance over a wave crest to consider the shear stress acting on the interface of two fluids. From the limiting condition of the formula for the wave height, a necessary criterion for transition from a stratified wavy flow to a slug flow has been derived. A series of experiments have been conducted changing the non-dimensional water depth and the flow rates of air in a horizontal pipe and a duct. Comparisons between the measured data and the predictions of the present theory show that the agreement is within ±8%.

  15. Coordination of Conditional Poisson Samples

    Directory of Open Access Journals (Sweden)

    Grafström Anton

    2015-12-01

    Full Text Available Sample coordination seeks to maximize or to minimize the overlap of two or more samples. The former is known as positive coordination, and the latter as negative coordination. Positive coordination is mainly used for estimation purposes and to reduce data collection costs. Negative coordination is mainly performed to diminish the response burden of the sampled units. Poisson sampling design with permanent random numbers provides an optimum coordination degree of two or more samples. The size of a Poisson sample is, however, random. Conditional Poisson (CP sampling is a modification of the classical Poisson sampling that produces a fixed-size πps sample. We introduce two methods to coordinate Conditional Poisson samples over time or simultaneously. The first one uses permanent random numbers and the list-sequential implementation of CP sampling. The second method uses a CP sample in the first selection and provides an approximate one in the second selection because the prescribed inclusion probabilities are not respected exactly. The methods are evaluated using the size of the expected sample overlap, and are compared with their competitors using Monte Carlo simulation. The new methods provide a good coordination degree of two samples, close to the performance of Poisson sampling with permanent random numbers.
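    The key property of permanent random numbers (PRNs) is easy to demonstrate for ordinary Poisson sampling: with the PRNs held fixed across occasions, any unit selected under smaller inclusion probabilities remains selected under larger ones, which maximizes the sample overlap. A minimal sketch; the uniform inclusion probabilities are an illustrative assumption, and the fixed-size CP modification is not shown:

```python
import random

def poisson_sample(prn, pik):
    """Poisson sampling with permanent random numbers: unit k is selected
    iff its PRN falls below its inclusion probability pik[k]."""
    return {k for k in pik if prn[k] < pik[k]}

rng = random.Random(7)
units = range(100)
prn = {k: rng.random() for k in units}   # permanent: reused on every occasion
occasion1 = poisson_sample(prn, {k: 0.3 for k in units})
occasion2 = poisson_sample(prn, {k: 0.5 for k in units})
```

Here `occasion1` is guaranteed to be a subset of `occasion2`: perfect positive coordination. Negative coordination is obtained analogously by transforming the PRNs (e.g., using 1 - u or a shifted value) on the second occasion.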

  16. Generation of stratified squamous epithelial progenitor cells from mouse induced pluripotent stem cells.

    Directory of Open Access Journals (Sweden)

    Satoru Yoshida

    Full Text Available BACKGROUND: Application of induced pluripotent stem (iPS cells in regenerative medicine will bypass ethical issues associated with use of embryonic stem cells. In addition, patient-specific iPS cells can be useful to elucidate the pathophysiology of genetic disorders, drug screening, and tailor-made medicine. However, in order to apply iPS cells to mitotic tissue, induction of tissue stem cells that give rise to progeny of the target organ is required. METHODOLOGY/PRINCIPAL FINDINGS: We induced stratified epithelial cells from mouse iPS cells by co-culture with PA6 feeder cells (SDIA-method with use of BMP4. Clusters of cells positive for the differentiation markers KRT1 or KRT12 were observed in KRT14-positive colonies. We successfully cloned KRT14 and p63 double-positive stratified epithelial progenitor cells from iPS-derived epithelial cells, which formed stratified epithelial sheets consisting of five to six layers of polarized epithelial cells in vitro. When these clonal cells were cultured on denuded mouse corneas, a robust stratified epithelial layer was observed with physiological cell polarity, including high levels of E-cadherin, p63 and K15 expression in the basal layer and ZO-1 in the superficial layer, recapitulating the apico-basal polarity of the epithelium in vivo. CONCLUSIONS/SIGNIFICANCE: These results suggest that KRT14 and p63 double-positive epithelial progenitor cells can be cloned from iPS cells in order to produce polarized multilayer epithelial cell sheets.

  17. Experimental Validation of a Domestic Stratified Hot Water Tank Model in Modelica for Annual Performance Assessment

    DEFF Research Database (Denmark)

    Carmo, Carolina; Dumont, Olivier; Nielsen, Mads Pagh

    2015-01-01

    The use of stratified hot water tanks in solar energy systems - including ORC systems - as well as heat pump systems is paramount for a better performance of these systems. However, the availability of effective and reliable models to predict the annual performance of stratified hot water tanks...

  18. Rotary Mode Core Sample System availability improvement

    International Nuclear Information System (INIS)

    Jenkins, W.W.; Bennett, K.L.; Potter, J.D.; Cross, B.T.; Burkes, J.M.; Rogers, A.C.

    1995-01-01

    The Rotary Mode Core Sample System (RMCSS) is used to obtain stratified samples of the waste deposits in single-shell and double-shell waste tanks at the Hanford Site. The samples are used to characterize the waste in support of ongoing and future waste remediation efforts. Four sampling trucks have been developed to obtain these samples. Truck 1 was the first in operation and is currently being used to obtain samples where the push mode is appropriate (i.e., no rotation of the drill). Truck 2 is similar to truck 1, except for added safety features, and is in operation to obtain samples using either a push mode or rotary drill mode. Trucks 3 and 4 are now being fabricated to be essentially identical to truck 2

  19. Theory of hyperbolic stratified nanostructures for surface-enhanced Raman scattering

    Science.gov (United States)

    Wong, Herman M. K.; Dezfouli, Mohsen Kamandar; Axelrod, Simon; Hughes, Stephen; Helmy, Amr S.

    2017-11-01

    We theoretically investigate the enhancement of surface enhanced Raman spectroscopy (SERS) using hyperbolic stratified nanostructures and compare to metal nanoresonators. The photon Green function of each nanostructure within its environment is first obtained from a semianalytical modal theory, which is used in a quantum optics formalism of the molecule-nanostructure interaction to model the SERS spectrum. An intuitive methodology is presented for calculating the single-molecule enhancement factor (SMEF), which is also able to predict known experimental SERS enhancement factors of a gold nanodimer. We elucidate the important figures-of-merit of the enhancement and explore these for different designs. We find that the use of hyperbolic stratified materials can enhance the photonic local density of states (LDOS) by close to two times in comparison to pure metal nanostructures, when both are designed to work at the same operating wavelengths. However, the increased LDOS is accompanied by higher electric field concentration within the lossy hyperbolic material, which leads to increased quenching that serves to reduce the overall detected SERS enhancement in the far field. For nanoresonators with resonant localized surface plasmon wavelengths in the near-infrared, the SMEF for the hyperbolic stratified nanostructure is approximately one order of magnitude lower than the pure metal counterpart. Conversely, we show that by detecting the Raman signal using a near-field probe, hyperbolic materials can provide an improvement in SERS enhancement compared to using pure metal nanostructures when the probe is sufficiently close (<50 nm) to the Raman active molecule at the plasmonic hotspot.

  20. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge.

    Science.gov (United States)

    Catarino, Rosa; Vassilakos, Pierre; Bilancioni, Aline; Vanden Eynde, Mathieu; Meyer-Hamme, Ulrike; Menoud, Pierre-Alain; Guerry, Frédéric; Petignat, Patrick

    2015-01-01

    Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium. Dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and swab. A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed first: self-HPV using dry swabs (s-DRY) or vaginal specimen collection using a cytobrush applied to an FTA cartridge (s-FTA). After self-HPV, a physician collected a cervical sample using liquid-based medium (Dr-WET). HPV types were identified by real-time PCR. Agreement between collection methods was measured using the kappa statistic. HPV prevalence for high-risk types was 62.3% (95%CI: 53.7-70.2) detected by s-DRY, 56.2% (95%CI: 47.6-64.4) by Dr-WET, and 54.6% (95%CI: 46.1-62.9) by s-FTA. There was overall agreement of 70.8% between s-FTA and s-DRY samples (kappa = 0.34), and of 82.3% between self-HPV and Dr-WET samples (kappa = 0.56). Detection sensitivities for low-grade squamous intraepithelial lesion or worse (LSIL+) were: 64.0% (95%CI: 44.5-79.8) for s-FTA, 84.6% (95%CI: 66.5-93.9) for s-DRY, and 76.9% (95%CI: 58.0-89.0) for Dr-WET. The preferred self-collection method among patients was s-DRY (40.8% vs. 15.4%). Regarding costs, the FTA card was five times more expensive than the swab (~5 US dollars (USD) per card vs. ~1 USD per swab). Self-HPV using dry swabs is sensitive for detecting LSIL+ and less expensive than s-FTA. International Standard Randomized Controlled Trial Number (ISRCTN): 43310942.
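    Agreement between collection methods in this study is measured with the kappa statistic, which discounts the agreement expected by chance. A minimal sketch of Cohen's kappa for paired binary test results, using invented counts rather than the study's data:

```python
def cohens_kappa(pairs):
    """Cohen's kappa for two binary test results; `pairs` is a list of
    (result_a, result_b) tuples with values 0 (negative) / 1 (positive)."""
    n = len(pairs)
    p_obs = sum(a == b for a, b in pairs) / n          # observed agreement
    pa = sum(a for a, _ in pairs) / n                  # positive rate, test A
    pb = sum(b for _, b in pairs) / n                  # positive rate, test B
    p_exp = pa * pb + (1 - pa) * (1 - pb)              # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

# illustrative counts: 50 both-positive, 40 both-negative, 10 discordant
pairs = [(1, 1)] * 50 + [(0, 0)] * 40 + [(1, 0)] * 6 + [(0, 1)] * 4
kappa = cohens_kappa(pairs)
```

A kappa near 0 means the raw agreement is no better than chance, which is why the study's 70.8% agreement translates to only kappa = 0.34.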

  1. Maximizing the Diversity of Ensemble Random Forests for Tree Genera Classification Using High Density LiDAR Data

    Directory of Open Access Journals (Sweden)

    Connie Ko

    2016-08-01

    selected randomly (with stratified sample size) to 93.8% when samples were selected with additional criteria, and from 88.4% to 93.8% when an ensemble method was used.

  2. A facility specialist model for improving retention of nursing home staff: results from a randomized, controlled study.

    Science.gov (United States)

    Pillemer, Karl; Meador, Rhoda; Henderson, Charles; Robison, Julie; Hegeman, Carol; Graham, Edwin; Schultz, Leslie

    2008-07-01

    This article reports on a randomized, controlled intervention study designed to reduce employee turnover by creating a retention specialist position in nursing homes. We collected data three times over a 1-year period in 30 nursing homes, sampled in a stratified random manner from facilities in New York State and Connecticut and randomly assigned to treatment and control conditions. Staff outcomes were measured through certified nursing assistant interviews, and turnover rates were measured over the course of the year. In the intervention condition, a staff member was selected to be the facility retention specialist, who would advocate for and implement programs to improve staff retention and commitment throughout the facility. Retention specialists received an intensive 3-day training in retention leadership and in a number of evidence-based retention programs. Ongoing support was provided throughout the project. Treatment facilities experienced significant declines in turnover rates compared to control facilities. As predicted, we found positive effects on certified nursing assistant assessments of the quality of retention efforts and of care provided in the facility; we did not find effects for job satisfaction or stress. The study provides evidence for the effectiveness of the retention specialist model. Findings from a detailed process evaluation suggest modifications of the program that may increase program effects.

  3. Investigating the Randomness of Numbers

    Science.gov (United States)

    Pendleton, Kenn L.

    2009-01-01

    The use of random numbers is pervasive in today's world. Random numbers have practical applications in such far-flung arenas as computer simulations, cryptography, gambling, the legal system, statistical sampling, and even the war on terrorism. Evaluating the randomness of extremely large samples is a complex, intricate process. However, the…
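    One standard way to evaluate the randomness of a large sample of digits, in the spirit of the article, is a chi-square test of uniformity. A minimal sketch; the 16.92 critical value mentioned in the comment is the standard chi-square quantile for 9 degrees of freedom at the 5% level:

```python
import random
from collections import Counter

def chi_square_uniform(digits):
    """Chi-square statistic for the hypothesis that decimal digits 0-9 are
    equally likely; compare the result against the chi-square distribution
    with 9 degrees of freedom (critical value 16.92 at the 5% level)."""
    expected = len(digits) / 10
    counts = Counter(digits)
    return sum((counts.get(d, 0) - expected) ** 2 / expected
               for d in range(10))

rng = random.Random(1)
digits = [rng.randrange(10) for _ in range(10000)]
stat = chi_square_uniform(digits)
```

A single frequency test like this is necessary but far from sufficient: serial, runs, and spectral tests probe patterns that digit frequencies alone cannot detect.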

  4. Great Zimbabwe University Psychology Students' Perceptions of ...

    African Journals Online (AJOL)

    A quantitative approach was adopted, particularly making use of descriptive survey design. A sample of 38 students was selected through stratified random sampling and data was analysed using SPSS version 19 and Stata version 11.0.

  5. Large Eddy Simulation of stratified flows over structures

    OpenAIRE

    Brechler J.; Fuka V.

    2013-01-01

    We tested the ability of the LES model CLMM (Charles University Large-Eddy Microscale Model) to model the stratified flow around three-dimensional hills. We compared quantities such as the height of the dividing streamline, the recirculation zone length, and the length of the lee waves with experiments by Hunt and Snyder [3] and numerical computations by Ding, Calhoun and Street [5]. The results mostly agreed with the references, but some important differences are present.

  6. Large Eddy Simulation of stratified flows over structures

    Science.gov (United States)

    Fuka, V.; Brechler, J.

    2013-04-01

    We tested the ability of the LES model CLMM (Charles University Large-Eddy Microscale Model) to model the stratified flow around three-dimensional hills. We compared quantities such as the height of the dividing streamline, the recirculation zone length, and the length of the lee waves with experiments by Hunt and Snyder [3] and numerical computations by Ding, Calhoun and Street [5]. The results mostly agreed with the references, but some important differences are present.

  7. Types, Magnitude, Predictors and Controlling Mechanisms of ...

    African Journals Online (AJOL)

    Multi-stage sampling that involves simple random and stratified sampling techniques was used to select student participants. Accidental sampling techniques were employed to select teacher participants. Questionnaire that contained items on socio-demographic variables, scales on aggression, scales on parenting styles ...

  8. Ecosystem metabolism in a stratified lake

    DEFF Research Database (Denmark)

    Stæhr, Peter Anton; Christensen, Jesper Philip Aagaard; Batt, Ryan D.

    2012-01-01

    … that integrates rates across the entire depth profile and includes DO exchange between depth layers driven by mixed-layer deepening and eddy diffusivity. During full mixing, NEP was close to zero throughout the water column, and GPP and R were reduced 2-10 times compared to stratified periods. When present …, differences were not significant. During stratification, daily variability in epilimnetic DO was dominated by metabolism (46%) and air-water gas exchange (44%). Fluxes related to mixed-layer deepening dominated in meta- and hypolimnic waters (49% and 64%), while eddy diffusion (1% and 14%) was less important. Although air-water gas exchange rates differed among the three formulations of gas-transfer velocity, this had no significant effect on metabolic rates.

  9. Relationships of the phase velocity with the microarchitectural parameters in bovine trabecular bone in vitro: application of a stratified model

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kang Il [Kangwon National University, Chuncheon (Korea, Republic of)

    2012-08-15

    The present study aims to provide insight into the relationships of the phase velocity with the microarchitectural parameters in bovine trabecular bone in vitro. The frequency-dependent phase velocity was measured in 22 bovine femoral trabecular bone samples by using a pair of transducers with a diameter of 25.4 mm and a center frequency of 0.5 MHz. The phase velocity exhibited positive correlation coefficients of 0.48 and 0.32 with the ratio of bone volume to total volume and the trabecular thickness, respectively, but a negative correlation coefficient of -0.62 with the trabecular separation. The best univariate predictor of the phase velocity was the trabecular separation, yielding an adjusted squared correlation coefficient of 0.36. The multivariate regression models yielded adjusted squared correlation coefficients of 0.21 - 0.36. The theoretical phase velocity predicted by using a stratified model for wave propagation in periodically stratified media consisting of alternating parallel solid-fluid layers showed reasonable agreement with the experimental measurements.

  11. Economic viability of Stratified Medicine concepts : An investor perspective on drivers and conditions that favour using Stratified Medicine approaches in a cost-contained healthcare environment

    NARCIS (Netherlands)

    Fugel, Hans-Joerg; Nuijten, Mark; Postma, Maarten

    2016-01-01

    RATIONALE: Stratified Medicine (SM) is becoming a natural result of advances in biomedical science and a promising path for the innovation-based biopharmaceutical industry to create new investment opportunities. While the use of biomarkers to improve R&D efficiency and productivity is very much

  12. A study of stratified gas-liquid pipe flow

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, George W.

    2005-07-01

    This work includes both theoretical modelling and experimental observations which are relevant to the design of gas condensate transport lines. Multicomponent hydrocarbon gas mixtures are transported in pipes over long distances and at various inclinations. Under certain circumstances, the heavier hydrocarbon components and/or water vapour condense to form one or more liquid phases. Near the desired capacity, the liquid condensate and water are efficiently transported in the form of a stratified flow with a droplet field. During operating conditions, however, the flow rate may be reduced, allowing liquid accumulation which can create serious operational problems due to large amounts of excess liquid being expelled into the receiving facilities during production ramp-up, or even in steady production in severe cases. In particular, liquid tends to accumulate in upward inclined sections due to insufficient drag on the liquid from the gas. To optimize the transport of gas condensates, pipe diameters should be carefully chosen to account for varying flow rates and pressure levels, which are determined through knowledge of the multiphase flow present. It is desirable to have a reliable numerical simulation tool to predict liquid accumulation for various flow rates, pipe diameters and pressure levels, which is not presently accounted for by industrial flow codes. A critical feature of such a simulation code would be the ability to predict the transition from small liquid accumulation at high flow rates to large liquid accumulation at low flow rates. A semi-intermittent flow regime of roll waves alternating with a partly backward flowing liquid film has been observed experimentally to occur for a range of gas flow rates. Most of the liquid is transported in the roll waves. The roll wave regime is not well understood and requires fundamental modelling and experimental research. The lack of reliable models for this regime leads to inaccurate prediction of the onset of

  13. Characteristics of men with substance use disorder consequent to illicit drug use: comparison of a random sample and volunteers.

    Science.gov (United States)

    Reynolds, Maureen D; Tarter, Ralph E; Kirisci, Levent

    2004-09-06

    Men qualifying for substance use disorder (SUD) consequent to consumption of an illicit drug were compared according to recruitment method. It was hypothesized that volunteers would be more self-disclosing and exhibit more severe disturbances compared to randomly recruited subjects. Personal, demographic, family, social, substance use, psychiatric, and SUD characteristics of volunteers (N = 146) were compared to randomly recruited (N = 102) subjects. Volunteers had lower socioeconomic status, were more likely to be African American, and had lower IQ than randomly recruited subjects. Volunteers also evidenced greater social and family maladjustment and more frequently had received treatment for substance abuse. In addition, lower social desirability response bias was observed in the volunteers. SUD was not more severe in the volunteers; however, they reported a higher lifetime rate of opiate, diet, depressant, and analgesic drug use. Volunteers and randomly recruited subjects qualifying for SUD consequent to illicit drug use are similar in SUD severity but differ in terms of severity of psychosocial disturbance and history of drug involvement. The factors discriminating volunteers and randomly recruited subjects are well known to impact on outcome; hence they need to be considered in research design, especially when selecting a sampling strategy in treatment research.

  14. Experimental analysis of an oblique turbulent flame front propagating in a stratified flow

    Energy Technology Data Exchange (ETDEWEB)

    Galizzi, C.; Escudie, D. [Universite de Lyon, CNRS, CETHIL, INSA-Lyon, UMR5008, F-69621 Cedex (France)

    2010-12-15

    This paper details the experimental study of a turbulent V-shaped flame expanding in a nonhomogeneous premixed flow. Its aim is to characterize the effects of stratification on turbulent flame characteristics. The setup consists of a stationary V-shaped flame stabilized on a rod and expanding freely in a lean premixed methane-air flow. One of the two oblique fronts interacts with a stratified slice, which has an equivalence ratio close to one and a thickness greater than that of the flame front. Several techniques such as PIV and CH{sup *} chemiluminescence are used to investigate the instantaneous fields, while laser Doppler anemometry and thermocouples are combined with a concentration probe to provide information on the mean fields. First, in order to provide a reference, the homogeneous turbulent case is studied. Next, the stratified turbulent premixed flame is investigated. Results show significant modifications of the whole flame and of the velocity field upstream of the flame front. The analysis of the geometric properties of the stratified flame indicates an increase in flame brush thickness, closely related to the local equivalence ratio. (author)

  15. Improvements to TRAC models of condensing stratified flow. Pt. 1

    International Nuclear Information System (INIS)

    Zhang, Q.; Leslie, D.C.

    1991-12-01

    Direct contact condensation in stratified flow is an important phenomenon in LOCA analyses. In this report, the TRAC interfacial heat transfer model for stratified condensing flow has been assessed against the Bankoff experiments. A rectangular channel option has been added to the code to represent the experimental geometry. In almost all cases the TRAC heat transfer coefficient (HTC) over-predicts the condensation rates and in some cases it is so high that the predicted steam is sucked in from the normal outlet in order to conserve mass. Based on their cocurrent and countercurrent condensing flow experiments, Bankoff and his students (Lim 1981, Kim 1985) developed HTC models from the two cases. The replacement of the TRAC HTC with either of Bankoff's models greatly improves the predictions of condensation rates in the experiment with cocurrent condensing flow. However, the Bankoff HTC for countercurrent flow is preferable because it is based only on the local quantities rather than on the quantities averaged from the inlet. (author)

  16. The effect of a high-protein, high-sodium diet on calcium and bone metabolism in postmenopausal women stratified by hormone replacement therapy use

    DEFF Research Database (Denmark)

    Harrington, M.; Bennett, T.; Jakobsen, Jette

    2004-01-01

    The objective of this study was to investigate the influence of a high-sodium, high-protein diet on bone metabolism in postmenopausal women (aged 49-60 y) stratified by hormone replacement therapy (HRT) use. In a crossover trial, 18 women (n = 8 HRT users (+HRT) and n = 10 nonusers (-HRT)) were randomly assigned to a diet high in protein (90 g/day) and sodium (180 mmol/day) (calciuric diet) or a diet moderate in protein (70 g/day) and low in sodium (65 mmol/day) for 4 weeks, followed by crossover to the alternative dietary regimen for a further 4 weeks. The calciuric diet significantly (P...

  17. Patent foramen ovale, ischemic stroke and migraine: systematic review and stratified meta-analysis of association studies.

    Science.gov (United States)

    Davis, Daniel; Gregson, John; Willeit, Peter; Stephan, Blossom; Al-Shahi Salman, Rustam; Brayne, Carol

    2013-01-01

    Observational data have reported associations between patent foramen ovale (PFO), cryptogenic stroke and migraine. However, randomized trials of PFO closure do not demonstrate a clear benefit, either because the underlying association is weaker than previously suggested or because the trials were underpowered. In order to resolve the apparent discrepancy between observational data and randomized trials, we investigated associations between (1) migraine and ischemic stroke, (2) PFO and ischemic stroke, and (3) PFO and migraine. Eligibility criteria were consistent, including all studies with specifically defined exposures and outcomes, unrestricted by language. We focused on studies at lowest risk of bias by stratifying analyses based on methodological design and quantified associations using fixed-effects meta-analysis models. We included 37 studies of 7,686 identified. Compared to reports in the literature as a whole, studies with population-based comparators showed weaker associations between migraine with aura and cryptogenic ischemic stroke in younger women (OR 1.4; 95% CI 0.9-2.0; 1 study), PFO and ischemic stroke (HR 1.6; 95% CI 1.0-2.5; 2 studies; OR 1.3; 95% CI 0.9-1.9; 3 studies), or PFO and migraine (OR 1.0; 95% CI 0.6-1.6; 1 study). It was not possible to look for interactions or effect modifiers. These results are limited by sources of bias within individual studies. The overall pairwise associations between PFO, cryptogenic ischemic stroke and migraine do not strongly suggest a causal role for PFO. Ongoing randomized trials of PFO closure may need larger numbers of participants to detect an overall beneficial effect. Copyright © 2012 S. Karger AG, Basel.

  18. Using the Internet to Support Exercise and Diet: A Stratified Norwegian Survey.

    Science.gov (United States)

    Wangberg, Silje C; Sørensen, Tove; Andreassen, Hege K

    2015-08-26

    The Internet is used for a variety of health-related purposes. Use differs and has differential effects on health according to socioeconomic status. We investigated to what extent the Norwegian population uses the Internet to support exercise and diet, what kind of services they use, and whether there are social disparities in use. We expected to find differences according to educational attainment. In November 2013 we surveyed a stratified sample of 2196 persons drawn from a Web panel of about 50,000 Norwegians over 15 years of age. The questionnaire included questions about using the Internet, including social network sites (SNS), or mobile apps in relation to exercise or diet, as well as background information about education, body image, and health. The survey email was opened by 1187 respondents (54%). Of these, 89 did not click on the survey hyperlink (declined to participate), while another 70 did not complete the survey. The final sample size is thus 1028 (87% response rate). Compared to the Norwegian census the sample had a slight under-representation of respondents under the age of 30 and with low education. The data was weighted accordingly before analyses. Sixty-nine percent of women and 53% of men had read about exercise or diet on the Internet (χ² = 25.6, P…) … social disparities in health, and continue to monitor population use. For Internet- and mobile-based interventions to support health behaviors, this study provides information relevant to tailoring of delivery media and components to user.

  19. Effect of study design on the reported effect of cardiac resynchronization therapy (CRT) on quantitative physiological measures: stratified meta-analysis in narrow-QRS heart failure and implications for planning future studies.

    Science.gov (United States)

    Jabbour, Richard J; Shun-Shin, Matthew J; Finegold, Judith A; Afzal Sohaib, S M; Cook, Christopher; Nijjer, Sukhjinder S; Whinnett, Zachary I; Manisty, Charlotte H; Brugada, Josep; Francis, Darrel P

    2015-01-06

    Biventricular pacing (CRT) shows clear benefits in heart failure with wide QRS, but results in narrow QRS have appeared conflicting. We tested the hypothesis that study design might have influenced findings. We identified all reports of CRT-P/D therapy in subjects with narrow QRS reporting effects on continuous physiological variables. Twelve studies (2074 patients) met these criteria. Studies were stratified by presence of bias-resistance steps: the presence of a randomized control arm over a single arm, and blinded outcome measurement. Change in each endpoint was quantified using a standardized effect size (Cohen's d). We conducted separate meta-analyses for each variable in turn, stratified by trial quality. In non-randomized, non-blinded studies, the majority of variables (10 of 12, 83%) showed significant improvement, ranging from a standardized mean effect size of +1.57 (95%CI +0.43 to +2.7) for ejection fraction to +2.87 (+1.78 to +3.95) for NYHA class. In the randomized, non-blinded study, only 3 out of 6 variables (50%) showed improvement. For the randomized blinded studies, 0 out of 9 variables (0%) showed benefit, ranging from -0.04 (-0.31 to +0.22) for ejection fraction to -0.1 (-0.73 to +0.53) for 6-minute walk test. Differences in degrees of resistance to bias, rather than choice of endpoint, explain the variation between studies of CRT in narrow-QRS heart failure addressing physiological variables. When bias-resistance features are implemented, it becomes clear that these patients do not improve in any tested physiological variable. Guidance from studies without careful planning to resist bias may be far less useful than commonly perceived. © 2015 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
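The standardized effect sizes (Cohen's d) and fixed-effects pooling used in stratified meta-analyses like the one above can be sketched as follows; the per-study effects and variances below are invented for illustration, not taken from the trial data.

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_var = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    return (mean1 - mean2) / math.sqrt(pooled_var)

def fixed_effect_pool(effects, variances):
    """Inverse-variance weighted fixed-effects meta-analysis.
    Returns the pooled effect and its variance."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return pooled, 1.0 / sum(weights)

# Hypothetical per-study standardized effects and their variances
effects = [0.4, 0.1, -0.05]
variances = [0.04, 0.02, 0.03]
pooled, var = fixed_effect_pool(effects, variances)
print(round(pooled, 3), round(math.sqrt(var), 3))
```

Pooling stratified by trial quality, as in the study above, simply means running this calculation separately within each stratum of studies.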

  20. Long-term vegetation monitoring for different habitats in floodplains

    Directory of Open Access Journals (Sweden)

    LANG Petra

    2014-03-01

    Full Text Available A floodplain-restoration project along the Danube between Neuburg and Ingolstadt (Germany) aims to bring back water and sediment dynamics to the floodplain. The accompanying long-term monitoring has to document the changes in biodiversity related to these new dynamics. Considerations on, and results of, the vegetation monitoring concept are documented in this paper. In a habitat-rich ecosystem like a floodplain, different habitats (alluvial forest, semi-aquatic/aquatic sites) have different demands on the sampling methods. Therefore, different monitoring designs (preferential, random, systematic, stratified random and transect sampling) are discussed and tested for their use in different habitat types of the floodplain. A stratified random sampling is chosen for the alluvial forest stands, as it guarantees an equal distribution of the monitoring plots along the main driving factors, i.e. influence of water. The parameters distance to barrage, ecological flooding, height above thalweg and distance to the new floodplain river are used for stratifying, and the plots are placed randomly into these strata, resulting in 117 permanent plots. Due to small changes at the semi-aquatic/aquatic sites a transect sampling was chosen. Further, a rough stratification (channel bed, river bank, adjacent floodplain) was implemented, which was only possible after the start of the restoration project. To capture the small-scale changes due to the restoration measures on the vegetation, 99 additional plots completed the transect sampling. We conclude that heterogeneous study areas need different monitoring approaches, but, later on, a joint analysis must be possible.
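The stratified random placement of plots described above can be sketched as follows. The strata names, candidate plots, and per-stratum sample size are invented for illustration (the actual study stratified on four parameters and ended up with 117 plots).

```python
import random

def stratified_random_sample(population, strata_key, n_per_stratum, seed=42):
    """Group candidate plots by stratum, then draw a simple random
    sample of fixed size within each stratum."""
    rng = random.Random(seed)
    strata = {}
    for plot in population:
        strata.setdefault(strata_key(plot), []).append(plot)
    sample = {}
    for name, plots in strata.items():
        sample[name] = rng.sample(plots, min(n_per_stratum, len(plots)))
    return sample

# Hypothetical candidate plots: (plot id, flooding class, elevation class)
plots = [(i, flood, height)
         for i in range(200)
         for flood in ("flooded", "dry")
         for height in ("low", "high")]
sample = stratified_random_sample(plots, lambda p: (p[1], p[2]), 5)
print({k: len(v) for k, v in sample.items()})
```

Each stratum contributes exactly `n_per_stratum` plots, which is what guarantees the equal distribution of plots along the driving factors mentioned in the abstract.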

  1. Large Eddy Simulation of stratified flows over structures

    Directory of Open Access Journals (Sweden)

    Brechler J.

    2013-04-01

    Full Text Available We tested the ability of the LES model CLMM (Charles University Large-Eddy Microscale Model) to model the stratified flow around three-dimensional hills. We compared quantities such as the height of the dividing streamline, the length of the recirculation zone, and the length of the lee waves with experiments by Hunt and Snyder [3] and numerical computations by Ding, Calhoun and Street [5]. The results mostly agreed with the references, but some important differences are present.

  2. Sample Size Calculation: Inaccurate A Priori Assumptions for Nuisance Parameters Can Greatly Affect the Power of a Randomized Controlled Trial.

    Directory of Open Access Journals (Sweden)

    Elsa Tavernier

    Full Text Available We aimed to examine the extent to which inaccurate assumptions for nuisance parameters used to calculate sample size can affect the power of a randomized controlled trial (RCT). In a simulation study, we separately considered an RCT with continuous, dichotomous or time-to-event outcomes, with associated nuisance parameters of standard deviation, success rate in the control group and survival rate in the control group at some time point, respectively. For each type of outcome, we calculated a required sample size N for a hypothesized treatment effect, an assumed nuisance parameter and a nominal power of 80%. We then assumed a nuisance parameter associated with a relative error at the design stage. For each type of outcome, we randomly drew 10,000 relative errors of the associated nuisance parameter (from empirical distributions derived from a previously published review). Then, retro-fitting the sample size formula, we derived, for the pre-calculated sample size N, the real power of the RCT, taking into account the relative error for the nuisance parameter. In total, 23%, 0% and 18% of RCTs with continuous, binary and time-to-event outcomes, respectively, were underpowered (i.e., the real power fell below the nominal 80%); others were overpowered (real power above 90%). Even with proper calculation of sample size, a substantial number of trials are underpowered or overpowered because of imprecise knowledge of nuisance parameters. Such findings raise questions about how sample size for RCTs should be determined.
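The retro-fitting step can be sketched for the continuous-outcome case under a normal approximation: plan the sample size with an assumed standard deviation, then recompute the power achieved if the true standard deviation differs. The effect size and error values below are illustrative, not those used in the study.

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def sample_size_per_arm(delta, sd, alpha=0.05, power=0.80):
    """Required n per arm for a two-sample z-test (normal approximation)."""
    z_a, z_b = 1.959964, 0.841621   # z_{1-alpha/2}, z_{1-beta}
    return math.ceil(2.0 * ((z_a + z_b) * sd / delta) ** 2)

def real_power(n, delta, true_sd, alpha=0.05):
    """Power actually achieved when the true SD differs from the assumed one."""
    z_a = 1.959964
    return phi(math.sqrt(n * delta**2 / (2.0 * true_sd**2)) - z_a)

n = sample_size_per_arm(delta=0.5, sd=1.0)   # planned assuming SD = 1.0
print(n, round(real_power(n, 0.5, 1.0), 3), round(real_power(n, 0.5, 1.2), 3))
```

Underestimating the standard deviation by 20% at the design stage drops the real power well below the nominal 80%, which is the mechanism the abstract describes.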

  3. Mixed Convection Flow along a Stretching Cylinder in a Thermally Stratified Medium

    Directory of Open Access Journals (Sweden)

    Swati Mukhopadhyay

    2012-01-01

    Full Text Available An analysis for the axisymmetric laminar boundary layer mixed convection flow of a viscous and incompressible fluid towards a stretching cylinder immersed in a thermally stratified medium is presented in this paper. Similarity transformation is employed to convert the governing partial differential equations into highly nonlinear ordinary differential equations. Numerical solutions of these equations are obtained by a shooting method. It is found that the heat transfer rate at the surface is lower for flow in a thermally stratified medium compared to that of an unstratified medium. Moreover, both the skin friction coefficient and the heat transfer rate at the surface are larger for a cylinder compared to that for a flat plate.

  4. Plane Stratified Flow in a Room Ventilated by Displacement Ventilation

    DEFF Research Database (Denmark)

    Nielsen, Peter Vilhelm; Nickel, J.; Baron, D. J. G.

    2004-01-01

    The air movement in the occupied zone of a room ventilated by displacement ventilation exists as a stratified flow along the floor. This flow can be radial or plane according to the number of wall-mounted diffusers and the room geometry. The paper addresses the situations where plane flow...

  5. The Role of Strategic Leadership during Change | Riwo-Abudho ...

    African Journals Online (AJOL)

    The modern business environment is highly dynamic due to numerous forces that interact with each other to affect organizations. ... with a sample of 173 respondents (executive directors, senior managers and managers) selected through stratified sampling, with individuals picked by simple random sampling from airlines.

  6. Ibuprofen Versus Fennel for the Relief of Postpartum Pain: A Randomized Controlled Trial

    Directory of Open Access Journals (Sweden)

    Parvin Asti

    2011-06-01

    Full Text Available Objective: The present study aimed to compare the value of ibuprofen and fennel for postpartum pain relief in women with normal vaginal delivery. Materials and methods: In this randomized clinical trial we studied 90 women referred to the obstetrics ward for normal vaginal delivery (NVD) in Assali hospital in Khoramabad. Women were randomly allocated to receive either oral ibuprofen or oral fennel by a stratified random sampling technique. All women were asked to give a pain score on a visual analogue scale before and at 1, 2, 3 and 4 hours after treatment. Results: The difference between the fennel and ibuprofen groups in severity of pain before treatment was not significant (P=0.22), nor was the difference in mean pain severity one hour after treatment (P=0.57). However, mean pain severity differed significantly between the two groups at two (p<0.023), three (p<0.001) and four (p<0.001) hours after treatment. Conclusion: Ibuprofen and fennel were both effective for relief of postpartum pain without any notable side effects, but in general ibuprofen was more effective than fennel. More studies are needed to confirm the efficacy of fennel in pain relief, especially in postpartum women, which must be compared to a no-treatment control group.

  7. Is the Role of Teacher Performance Appraisal in Ethiopia Rhetoric or ...

    African Journals Online (AJOL)

    The study was conducted on eight randomly selected full cycle primary schools selected using urban and rural stratification. Teachers, school administrative committee, students, and parents were participants of the study. Proportionate stratified random sampling technique was employed to select teachers. On the other ...

  8. The Effect of Socio-Economic Factors on Pearl Millet ( Pennisetum ...

    African Journals Online (AJOL)

    The study investigated farmers' socio-economic factors affecting pearl millet production in randomly selected villages in Magumeri Local Government Area of Borno State. A total of 80 farmers were selected through stratified random sampling and were administered questionnaires. The results revealed that educational ...

  9. White dwarf stars with chemically stratified atmospheres

    Science.gov (United States)

    Muchmore, D.

    1982-01-01

    Recent observations and theory suggest that some white dwarfs may have chemically stratified atmospheres - thin layers of hydrogen lying above helium-rich envelopes. Models of such atmospheres show that a discontinuous temperature inversion can occur at the boundary between the layers. Model spectra for layered atmospheres at 30,000 K and 50,000 K tend to have smaller decrements at 912 A, 504 A, and 228 A than uniform atmospheres would have. On the basis of their continuous extreme ultraviolet spectra, it is possible to distinguish observationally between uniform and layered atmospheres for hot white dwarfs.

  10. Measuring mixing efficiency in experiments of strongly stratified turbulence

    Science.gov (United States)

    Augier, P.; Campagne, A.; Valran, T.; Calpe Linares, M.; Mohanan, A. V.; Micard, D.; Viboud, S.; Segalini, A.; Mordant, N.; Sommeria, J.; Lindborg, E.

    2017-12-01

    Oceanic and atmospheric models need better parameterization of the mixing efficiency. Therefore, we need to measure this quantity for flows representative of geophysical flows, both in terms of types of flows (with vortices and/or waves) and of dynamical regimes. In order to reach sufficiently large Reynolds numbers for strongly stratified flows, experiments in which salt is used to produce the stratification have to be carried out in a large rotating platform of at least 10-meter diameter. We present new experiments done in summer 2017 to study experimentally strongly stratified turbulence and mixing efficiency in the Coriolis platform. The flow is forced by a slow periodic movement of an array of large vertical or horizontal cylinders. The velocity field is measured by 3D-2C scanned horizontal particle image velocimetry (PIV) and 2D vertical PIV. Six density-temperature probes are used to measure vertical and horizontal profiles and signals at fixed positions. We will show how we rely heavily on open-science methods for this study. Our new results on the mixing efficiency will be presented and discussed in terms of mixing parameterization.

  11. Spatial Distribution and Sampling Plans for Grapevine Plant Canopy-Inhabiting Scaphoideus titanus (Hemiptera: Cicadellidae) Nymphs.

    Science.gov (United States)

    Rigamonti, Ivo E; Brambilla, Carla; Colleoni, Emanuele; Jermini, Mauro; Trivellone, Valeria; Baumgärtner, Johann

    2016-04-01

    The paper deals with the study of the spatial distribution and the design of sampling plans for estimating nymph densities of the grape leafhopper Scaphoideus titanus Ball in vine plant canopies. In a reference vineyard sampled for model parameterization, leaf samples were repeatedly taken according to a multistage, stratified, random sampling procedure, and the data were subjected to an ANOVA. There were no significant differences in density either among the strata within the vineyard or between the two strata with basal and apical leaves. The significant differences between densities on trunk and productive shoots led to the adoption of two-stage (leaves and plants) and three-stage (leaves, shoots, and plants) sampling plans for trunk shoot- and productive shoot-inhabiting individuals, respectively. The mean crowding to mean relationship used to analyze the nymphs' spatial distribution revealed aggregated distributions. In both the enumerative and the sequential enumerative sampling plans, the number of leaves of trunk shoots, and of leaves and shoots of productive shoots, was kept constant while the number of plants varied. In additional vineyards, data were collected and used to test the applicability of the distribution model and the sampling plans. The tests confirmed the applicability 1) of the mean crowding to mean regression model on the plant and leaf stages for representing trunk shoot-inhabiting distributions, and on the plant, shoot, and leaf stages for productive shoot-inhabiting nymphs, 2) of the enumerative sampling plan, and 3) of the sequential enumerative sampling plan. In general, sequential enumerative sampling was more cost efficient than enumerative sampling.
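The "mean crowding to mean" relationship mentioned above is commonly formalized as Lloyd's mean crowding, m* = m + (s²/m − 1), regressed on the mean density m (Iwao's patchiness regression, m* = α + βm), with slope β > 1 indicating aggregation. A sketch on hypothetical per-leaf nymph counts:

```python
def mean_crowding(counts):
    """Lloyd's mean crowding m* = m + (s^2/m - 1) for one set of counts."""
    n = len(counts)
    m = sum(counts) / n
    var = sum((c - m) ** 2 for c in counts) / (n - 1)
    return m + var / m - 1.0

def iwao_regression(samples):
    """Least-squares fit of mean crowding on mean density across samples.
    A slope beta > 1 indicates an aggregated spatial distribution."""
    xs = [sum(s) / len(s) for s in samples]
    ys = [mean_crowding(s) for s in samples]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    beta = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
           sum((x - xbar) ** 2 for x in xs)
    alpha = ybar - beta * xbar
    return alpha, beta

# Hypothetical nymph counts per leaf, one list per sampling occasion
samples = [[0, 1, 0, 3, 6], [1, 2, 0, 5, 12], [0, 0, 1, 2, 4]]
alpha, beta = iwao_regression(samples)
print(round(alpha, 2), round(beta, 2))
```

With these invented counts the fitted slope exceeds 1, i.e. the kind of aggregated distribution the abstract reports.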

  12. The dynamics of small inertial particles in weakly stratified turbulence

    NARCIS (Netherlands)

    van Aartrijk, M.; Clercx, H.J.H.

    We present an overview of a numerical study on the small-scale dynamics and the large-scale dispersion of small inertial particles in stably stratified turbulence. Three types of particles are examined: fluid particles, light inertial particles (with particle-to-fluid density ratio 1 ≤ ρp/ρf ≤ 25) and

  13. Dipole formation by two interacting shielded monopoles in a stratified fluid

    NARCIS (Netherlands)

    Beckers, M.; Clercx, H.J.H.; Heijst, van G.J.F.; Verzicco, R.

    2002-01-01

    The interaction between two shielded monopolar vortices has been investigated experimentally in a nonrotating linearly stratified fluid and by full three-dimensional (3D) numerical simulations. The characteristic Reynolds and Froude numbers in the experiments are approximately

  14. Many multicenter trials had few events per center, requiring analysis via random-effects models or GEEs.

    Science.gov (United States)

    Kahan, Brennan C; Harhay, Michael O

    2015-12-01

    Adjustment for center in multicenter trials is recommended when there are between-center differences or when randomization has been stratified by center. However, common methods of analysis (such as fixed-effects, Mantel-Haenszel, or stratified Cox models) often require a large number of patients or events per center to perform well. We reviewed 206 multicenter randomized trials published in four general medical journals to assess the average number of patients and events per center and determine whether appropriate methods of analysis were used in trials with few patients or events per center. The median number of events per center/treatment arm combination for trials using a binary or survival outcome was 3 (interquartile range, 1-10). Sixteen percent of trials had less than 1 event per center/treatment combination, 50% fewer than 3, and 63% fewer than 5. Of the trials which adjusted for center using a method of analysis which requires a large number of events per center, 6% had less than 1 event per center-treatment combination, 25% fewer than 3, and 50% fewer than 5. Methods of analysis that allow for few events per center, such as random-effects models or generalized estimating equations (GEEs), were rarely used. Many multicenter trials contain few events per center. Adjustment for center using random-effects models or GEE with model-based (non-robust) standard errors may be beneficial in these scenarios. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Numerical simulations of the stratified oceanic bottom boundary layer

    Science.gov (United States)

    Taylor, John R.

    Numerical simulations are used to consider several problems relevant to the turbulent oceanic bottom boundary layer. In the first study, stratified open channel flow is considered with thermal boundary conditions chosen to approximate a shallow sea. Specifically, a constant heat flux is applied at the free surface and the lower wall is assumed to be adiabatic. When the surface heat flux is strong, turbulent upwellings of low speed fluid from near the lower wall are inhibited by the stable stratification. Subsequent studies consider a stratified bottom Ekman layer over a non-sloping lower wall. The influence of the free surface is removed by using an open boundary condition at the top of the computational domain. Particular attention is paid to the influence of the outer layer stratification on the boundary layer structure. When the density field is initialized with a linear profile, a turbulent mixed layer forms near the wall, which is separated from the outer layer by a strongly stable pycnocline. It is found that the bottom stress is not strongly affected by the outer layer stratification. However, stratification reduces turbulent transport to the outer layer and strongly limits the boundary layer height. The mean shear at the top of the boundary layer is enhanced when the outer layer is stratified, and this shear is strong enough to cause intermittent instabilities above the pycnocline. Turbulence-generated internal gravity waves are observed in the outer layer with a relatively narrow frequency range. An explanation for frequency content of these waves is proposed, starting with an observed broad-banded turbulent spectrum and invoking linear viscous decay to explain the preferential damping of low and high frequency waves. During the course of this work, an open-source computational fluid dynamics code has been developed with a number of advanced features including scalar advection, subgrid-scale models for large-eddy simulation, and distributed memory

  16. Importance sampling of heavy-tailed iterated random functions

    NARCIS (Netherlands)

    B. Chen (Bohan); C.H. Rhee (Chang-Han); A.P. Zwart (Bert)

    2016-01-01

    We consider a stochastic recurrence equation of the form $Z_{n+1} = A_{n+1} Z_n + B_{n+1}$, where $\mathbb{E}[\log A_1]<0$, $\mathbb{E}[\log^+ B_1]<\infty$ and $\{(A_n,B_n)\}_{n\in\mathbb{N}}$ is an i.i.d. sequence of positive random vectors. The stationary distribution of this Markov
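    The recurrence above is straightforward to simulate directly. A minimal Monte Carlo sketch (assuming, for illustration, lognormal $A_n$ with $\mathbb{E}[\log A_1] < 0$ and constant $B_n$; this is a naive estimator, not the paper's importance-sampling scheme):

```python
import math
import random

def simulate_recurrence(n_steps, mu_a=-0.5, sigma_a=1.0, b=1.0, seed=1):
    """Iterate Z_{n+1} = A_{n+1} Z_n + B_{n+1} with lognormal A_n
    (so E[log A_1] = mu_a < 0) and constant B_n = b."""
    rng = random.Random(seed)
    z = 0.0
    for _ in range(n_steps):
        a = math.exp(rng.gauss(mu_a, sigma_a))
        z = a * z + b
    return z

def tail_estimate(u, n_samples=2000, burn_in=500):
    """Crude (non-importance-sampled) Monte Carlo estimate of the
    stationary tail probability P(Z > u)."""
    hits = sum(1 for i in range(n_samples)
               if simulate_recurrence(burn_in, seed=i) > u)
    return hits / n_samples
```

    For heavy-tailed Z the naive estimator above needs enormous sample sizes for large u, which is precisely the inefficiency that importance sampling addresses.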

  17. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge.

    Directory of Open Access Journals (Sweden)

    Rosa Catarino

    Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium. Dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and the swab. A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed first: self-HPV using dry swabs (s-DRY) or vaginal specimen collection using a cytobrush applied to an FTA cartridge (s-FTA). After self-HPV, a physician collected a cervical sample using liquid-based medium (Dr-WET). HPV types were identified by real-time PCR. Agreement between collection methods was measured using the kappa statistic. HPV prevalence for high-risk types was 62.3% (95% CI: 53.7-70.2) as detected by s-DRY, 56.2% (95% CI: 47.6-64.4) by Dr-WET, and 54.6% (95% CI: 46.1-62.9) by s-FTA. There was overall agreement of 70.8% between s-FTA and s-DRY samples (kappa = 0.34), and of 82.3% between self-HPV and Dr-WET samples (kappa = 0.56). Detection sensitivities for low-grade squamous intraepithelial lesion or worse (LSIL+) were: 64.0% (95% CI: 44.5-79.8) for s-FTA, 84.6% (95% CI: 66.5-93.9) for s-DRY, and 76.9% (95% CI: 58.0-89.0) for Dr-WET. The preferred self-collection method among patients was s-DRY (40.8% vs. 15.4%). Regarding costs, the FTA card was five times more expensive than the swab (~5 US dollars (USD) per card vs. ~1 USD per swab). Self-HPV using dry swabs is sensitive for detecting LSIL+ and less expensive than s-FTA. International Standard Randomized Controlled Trial Number (ISRCTN): 43310942.
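    The kappa statistic used above to quantify agreement between collection methods is a chance-corrected agreement measure. A small self-contained illustration for paired binary results (not the study's code):

```python
def cohens_kappa(x, y):
    """Cohen's kappa for two paired binary (0/1) result lists:
    chance-corrected agreement (po - pe) / (1 - pe)."""
    n = len(x)
    po = sum(1 for a, b in zip(x, y) if a == b) / n   # observed agreement
    p1x, p1y = sum(x) / n, sum(y) / n                 # positive rates of each test
    pe = p1x * p1y + (1 - p1x) * (1 - p1y)            # expected chance agreement
    return (po - pe) / (1 - pe)
```

    Perfect agreement gives kappa = 1 and purely chance-level agreement gives kappa = 0; values around 0.34 and 0.56, as reported above, are conventionally read as fair and moderate agreement, respectively.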

  18. Direct contact condensation induced transition from stratified to slug flow

    International Nuclear Information System (INIS)

    Strubelj, Luka; Ezsoel, Gyoergy; Tiselj, Iztok

    2010-01-01

    Selected condensation-induced water hammer experiments performed on the PMK-2 device were numerically modelled with three-dimensional two-fluid models of the computer codes NEPTUNE_CFD and CFX. The experimental setup consists of a horizontal pipe filled with hot steam that is slowly flooded with cold water. In most of the experimental cases, slow flooding of the pipe was abruptly interrupted by strong slugging and water hammer, while in the selected experimental runs performed at higher initial pressures and temperatures that are analysed in the present work, the transition from the stratified into the slug flow was not accompanied by the water hammer pressure peak. That makes these cases more suitable tests for evaluation of the various condensation models in horizontally stratified flows and puts them in the range of the available CFD (Computational Fluid Dynamics) codes. The key models for successful simulation appear to be the condensation model of the hot vapour on the cold liquid and the interfacial momentum transfer model. The surface renewal types of condensation correlations, developed for condensation in stratified flows, were used in the simulations and were applied also in the regions of slug flow. The 'large interface' model for inter-phase momentum transfer was compared to the bubble drag model. The CFD simulations quantitatively captured the main phenomena of the experiments, while the stochastic nature of the particular condensation-induced water hammer experiments did not allow detailed prediction of the time and position of the slug formation in the pipe. We have clearly shown that even the selected experiments without water hammer present a tough test for the applied CFD codes, while modelling of the water hammer pressure peaks in two-phase flow, being a strongly compressible flow phenomenon, is beyond the capability of the current CFD codes.

  19. Relative efficiency and sample size for cluster randomized trials with variable cluster sizes.

    Science.gov (United States)

    You, Zhiying; Williams, O Dale; Aban, Inmaculada; Kabagambe, Edmond Kato; Tiwari, Hemant K; Cutter, Gary

    2011-02-01

    The statistical power of cluster randomized trials depends on two sample size components, the number of clusters per group and the numbers of individuals within clusters (cluster size). Variable cluster sizes are common and this variation alone may have significant impact on study power. Previous approaches have taken this into account by either adjusting total sample size using a designated design effect or adjusting the number of clusters according to an assessment of the relative efficiency of unequal versus equal cluster sizes. This article defines a relative efficiency of unequal versus equal cluster sizes using noncentrality parameters, investigates properties of this measure, and proposes an approach for adjusting the required sample size accordingly. We focus on comparing two groups with normally distributed outcomes using the t-test, and use the noncentrality parameter to define the relative efficiency of unequal versus equal cluster sizes and show that statistical power depends only on this parameter for a given number of clusters. We calculate the sample size required for an unequal cluster sizes trial to have the same power as one with equal cluster sizes. Relative efficiency based on the noncentrality parameter is straightforward to calculate and easy to interpret. It connects the required mean cluster size directly to the required sample size with equal cluster sizes. Consequently, our approach first determines the sample size requirements with equal cluster sizes for a pre-specified study power and then calculates the required mean cluster size while keeping the number of clusters unchanged. Our approach allows adjustment in mean cluster size alone or simultaneous adjustment in mean cluster size and number of clusters, and is a flexible alternative to and a useful complement to existing methods. Comparison indicated that we have defined a relative efficiency that is greater than the relative efficiency in the literature under some conditions. Our measure
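    The efficiency loss from unequal cluster sizes can be sketched with the common effective-sample-size approximation under a shared intraclass correlation (a textbook design-effect formula, not the article's noncentrality-based measure):

```python
def effective_sample_size(cluster_sizes, icc):
    """Effective number of independent observations for clusters of the
    given sizes under a common intraclass correlation (ICC)."""
    return sum(m / (1.0 + (m - 1) * icc) for m in cluster_sizes)

def relative_efficiency(cluster_sizes, icc):
    """Efficiency of unequal vs. equal cluster sizes with the same total
    sample size and number of clusters (design-effect approximation)."""
    k = len(cluster_sizes)
    mbar = sum(cluster_sizes) / k
    equal = k * mbar / (1.0 + (mbar - 1) * icc)
    return effective_sample_size(cluster_sizes, icc) / equal
```

    With equal cluster sizes the ratio is exactly 1; any size variation at a positive ICC pushes it below 1, which is why the required sample size must be inflated.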

  20. Proposed catalog of the neuroanatomy and the stratified anatomy for the 361 acupuncture points of 14 channels.

    Science.gov (United States)

    Chapple, Will

    2013-10-01

    In spite of the extensive research on acupuncture mechanisms, no comprehensive and systematic peer-reviewed reference list of the stratified anatomical and the neuroanatomical features of all 361 acupuncture points exists. This study creates a reference list of the neuroanatomy and the stratified anatomy for each of the 361 acupuncture points on the 14 classical channels and for 34 extra points. Each acupuncture point was individually assessed to relate the point's location to anatomical and neuroanatomical features. The design of the catalogue is intended to be useful for any style of acupuncture or Oriental medicine treatment modality. The stratified anatomy was divided into shallow, intermediate and deep insertion. A separate stratified anatomy was presented for different needle angles and directions. The following are identified for each point: additional specifications for point location, the stratified anatomy, motor innervation, cutaneous nerve and sensory innervation, dermatomes, Langer's lines, and somatotopic organization in the primary sensory and motor cortices. Acupuncture points for each muscle, dermatome and myotome are also reported. This reference list can aid clinicians, practitioners and researchers in furthering the understanding and accurate practice of acupuncture. Additional research on the anatomical variability around acupuncture points, the frequency of needle contact with an anatomical structure in a clinical setting, and conformational imaging should be done to verify this catalogue. Copyright © 2013. Published by Elsevier B.V.

  1. Stratified turbulent Bunsen flames: flame surface analysis and flame surface density modelling

    Science.gov (United States)

    Ramaekers, W. J. S.; van Oijen, J. A.; de Goey, L. P. H.

    2012-12-01

    In this paper it is investigated whether the Flame Surface Density (FSD) model, developed for turbulent premixed combustion, is also applicable to stratified flames. Direct Numerical Simulations (DNS) of turbulent stratified Bunsen flames have been carried out, using the Flamelet Generated Manifold (FGM) reduction method for reaction kinetics. Before examining the suitability of the FSD model, flame surfaces are characterized in terms of thickness, curvature and stratification. All flames are in the Thin Reaction Zones regime, and the maximum equivalence ratio range covers 0.1⩽φ⩽1.3. For all flames, local flame thicknesses correspond very well to those observed in stretchless, steady premixed flamelets. Extracted curvature radii and mixing length scales are significantly larger than the flame thickness, implying that the stratified flames all burn in a premixed mode. The remaining challenge is accounting for the large variation in (subfilter) mass burning rate. In this contribution, the FSD model is proven to be applicable for Large Eddy Simulations (LES) of stratified flames for the equivalence ratio range 0.1⩽φ⩽1.3. Subfilter mass burning rate variations are taken into account by a subfilter Probability Density Function (PDF) for the mixture fraction, on which the mass burning rate directly depends. A priori analysis points out that for small stratifications (0.4⩽φ⩽1.0), the replacement of the subfilter PDF (obtained from DNS data) by the corresponding Dirac function is appropriate. Integration of the Dirac function with the mass burning rate m = m(φ) can then adequately model the filtered mass burning rate obtained from filtered DNS data. For a larger stratification (0.1⩽φ⩽1.3), and filter widths up to ten flame thicknesses, a β-function for the subfilter PDF yields substantially better predictions than a Dirac function. Finally, inclusion of a simple algebraic model for the FSD resulted only in small additional deviations from DNS data.

  2. Power Spectrum Estimation of Randomly Sampled Signals

    DEFF Research Database (Denmark)

    Velte, C. M.; Buchhave, P.; K. George, W.

    algorithms: sample-and-hold and the direct spectral estimator without residence time weighting. The computer-generated signal is a Poisson process with a sample rate proportional to velocity magnitude and consists of well-defined frequency content, which makes bias easy to spot. The idea...
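    The sample-and-hold estimator mentioned above first resamples the randomly (e.g., Poisson-) sampled signal onto a uniform grid before a standard FFT-based spectral estimate is applied. A minimal sketch of the resampling step (illustrative, not the authors' implementation):

```python
def sample_and_hold(times, values, dt, t_end):
    """Resample an irregularly sampled signal onto a uniform grid of
    spacing dt by holding the most recent observed value."""
    grid, out, j = [], [], 0
    t = 0.0
    while t < t_end:
        # advance to the latest sample taken at or before time t
        while j + 1 < len(times) and times[j + 1] <= t:
            j += 1
        grid.append(t)
        out.append(values[j])
        t += dt
    return grid, out
```

    The step-like interpolation acts as a low-pass filter and adds "step noise", which is exactly the kind of bias the computer-generated test signal is designed to expose.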

  3. Interfacial transport characteristics in a gas-liquid or an immiscible liquid-liquid stratified flow

    International Nuclear Information System (INIS)

    Inoue, A.; Aoki, S.; Aritomi, M.; Kozawa, Y.

    1982-01-01

    This paper reviews the interfacial transport characteristics of mass, momentum and energy in a gas-liquid or an immiscible liquid-liquid stratified flow with a wavy interface, which have been studied in our division. In the experiments, the characteristics of wave motion and its effect on the turbulence near the interface, as well as overall flow characteristics such as pressure drop and position of the interface, were investigated in air-water, air-mercury and water-liquid metal stratified flows. On the other hand, several models based on the mixing-length model and a two-equation model of turbulence, with special interfacial boundary conditions in which the wavy surface was regarded as a rough surface corresponding to the wave height, a source of turbulent energy equal to the wave energy, and a damped turbulence due to the surface tension, were proposed to predict the flow characteristics and the interfacial heat transfer in fully developed and undeveloped stratified flows, and were examined against the experimental data. (author)

  4. Vegetation structure and composition across different land use in a semi-arid savanna of southern Zimbabwe

    NARCIS (Netherlands)

    Zisadza-Gandiwa, P.; Mango, L.; Gandiwa, E.; Goza, D.; Parakasingwa, C.; Chinoitezvi, E.; Shimbani, J.; Muvengwi, J.

    2013-01-01

    We compared the structure and composition of vegetation communities across different land uses in the northern Gonarezhou National Park and adjacent areas, southeast Zimbabwe. Vegetation data were collected from 60 sample plots using a stratified random sampling technique from April to May 2012.
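    A stratified random sampling of plots like the one described above can be sketched in a few lines (stratum names and plot IDs here are hypothetical; plots are drawn without replacement within each stratum):

```python
import random

def stratified_sample(strata, n_per_stratum, seed=42):
    """Draw a simple random sample of plot IDs within each stratum
    (e.g., land-use class), without replacement."""
    rng = random.Random(seed)
    return {name: rng.sample(plots, min(n_per_stratum, len(plots)))
            for name, plots in strata.items()}
```

    Sampling within strata guarantees that every land-use class is represented, which a single simple random sample over all plots does not.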

  5. LONGITUDINAL OSCILLATIONS IN DENSITY STRATIFIED AND EXPANDING SOLAR WAVEGUIDES

    Energy Technology Data Exchange (ETDEWEB)

    Luna-Cardozo, M. [Instituto de Astronomia y Fisica del Espacio, CONICET-UBA, CC. 67, Suc. 28, 1428 Buenos Aires (Argentina); Verth, G. [School of Computing, Engineering and Information Sciences, Northumbria University, Newcastle Upon Tyne NE1 8ST (United Kingdom); Erdelyi, R., E-mail: mluna@iafe.uba.ar, E-mail: robertus@sheffield.ac.uk, E-mail: gary.verth@northumbria.ac.uk [Solar Physics and Space Plasma Research Centre (SP2RC), University of Sheffield, Hicks Building, Hounsfield Road, Sheffield S3 7RH (United Kingdom)

    2012-04-01

    Waves and oscillations can provide vital information about the internal structure of waveguides in which they propagate. Here, we analytically investigate the effects of density and magnetic stratification on linear longitudinal magnetohydrodynamic (MHD) waves. The focus of this paper is to study the eigenmodes of these oscillations. It is our specific aim to understand what happens to these MHD waves generated in flux tubes with non-constant (e.g., expanding or magnetic bottle) cross-sectional area and density variations. The governing equation of the longitudinal mode is derived and solved analytically and numerically. In particular, the limit of the thin flux tube approximation is examined. The general solution describing the slow longitudinal MHD waves in an expanding magnetic flux tube with constant density is found. Longitudinal MHD waves in density stratified loops with constant magnetic field are also analyzed. From analytical solutions, the frequency ratio of the first overtone and fundamental mode is investigated in stratified waveguides. For small expansion, a linear dependence between the frequency ratio and the expansion factor is found. From numerical calculations it was found that the frequency ratio strongly depends on the density profile chosen and, in general, the numerical results are in agreement with the analytical results. The relevance of these results for solar magneto-seismology is discussed.

  6. Optimal energy growth in a stably stratified shear flow

    Science.gov (United States)

    Jose, Sharath; Roy, Anubhab; Bale, Rahul; Iyer, Krithika; Govindarajan, Rama

    2018-02-01

    Transient growth of perturbations by a linear non-modal evolution is studied here in a stably stratified bounded Couette flow. The density stratification is linear. Classical inviscid stability theory states that a parallel shear flow is stable to exponentially growing disturbances if the Richardson number (Ri) is greater than 1/4 everywhere in the flow. Experiments and numerical simulations at higher Ri show however that algebraically growing disturbances can lead to transient amplification. The complexity of a stably stratified shear flow stems from its ability to combine this transient amplification with propagating internal gravity waves (IGWs). The optimal perturbations associated with maximum energy amplification are numerically obtained at intermediate Reynolds numbers. It is shown that in this wall-bounded flow, the three-dimensional optimal perturbations are oblique, unlike in unstratified flow. A partitioning of energy into kinetic and potential helps in understanding the exchange of energies and how it modifies the transient growth. We show that the apportionment between potential and kinetic energy depends, in an interesting manner, on the Richardson number, and on time, as the transient growth proceeds from an optimal perturbation. The oft-quoted stabilizing role of stratification is also probed in the non-diffusive limit in the context of disturbance energy amplification.

  7. Teachers' Attitude towards Implementation of Learner-Centered Methodology in Science Education in Kenya

    Science.gov (United States)

    Ndirangu, Caroline

    2017-01-01

    This study aims to evaluate teachers' attitude towards implementation of learner-centered methodology in science education in Kenya. The study used a survey design methodology, adopting the purposive, stratified random and simple random sampling procedures and hypothesised that there was no significant relationship between the head teachers'…

  8. Robust non-parametric one-sample tests for the analysis of recurrent events.

    Science.gov (United States)

    Rebora, Paola; Galimberti, Stefania; Valsecchi, Maria Grazia

    2010-12-30

    One-sample non-parametric tests are proposed here for inference on recurring events. The focus is on the marginal mean function of events and the basis for inference is the standardized distance between the observed and the expected number of events under a specified reference rate. Different weights are considered in order to account for various types of alternative hypotheses on the mean function of the recurrent events process. A robust version and a stratified version of the test are also proposed. The performance of these tests was investigated through simulation studies under various underlying event generation processes, such as homogeneous and nonhomogeneous Poisson processes, autoregressive and renewal processes, with and without frailty effects. The robust versions of the test have been shown to be suitable in a wide variety of event generating processes. The motivating context is a study on gene therapy in a very rare immunodeficiency in children, where a major end-point is the recurrence of severe infections. Robust non-parametric one-sample tests for recurrent events can be useful to assess efficacy and especially safety in non-randomized studies or in epidemiological studies for comparison with a standard population. Copyright © 2010 John Wiley & Sons, Ltd.
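    The core of the statistic described above, a standardized distance between observed and expected event counts under a reference rate, can be illustrated in simplified form (unweighted, with a plain Poisson variance rather than the robust variance of the proposed tests):

```python
import math

def one_sample_event_test(observed_counts, expected_rate, followup_times):
    """Standardized distance between observed and expected event counts
    under a reference Poisson rate: z = (O - E) / sqrt(E)."""
    obs = sum(observed_counts)                       # total observed events
    exp = expected_rate * sum(followup_times)        # expected events
    return (obs - exp) / math.sqrt(exp)              # Poisson variance = mean
```

    Large positive values indicate more recurrences than the reference population would predict; the robust versions replace the Poisson variance with an empirical one to stay valid under frailty and non-Poisson event processes.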

  9. Exploring the salivary microbiome of children stratified by the oral hygiene index

    Science.gov (United States)

    Mashima, Izumi; Theodorea, Citra F.; Thaweboon, Boonyanit; Thaweboon, Sroisiri; Scannapieco, Frank A.

    2017-01-01

    Poor oral hygiene often leads to chronic diseases such as periodontitis and dental caries, resulting in substantial economic costs and diminished quality of life in not only adults but also children. In this study, the salivary microbiome was characterized in a group of children stratified by the Simplified Oral Hygiene Index (OHI-S). Illumina MiSeq high-throughput sequencing based on the 16S rRNA was utilized to analyze 90 salivary samples (24 Good, 31 Moderate and 35 Poor oral hygiene) from a cohort of Thai children. A total of 38,521 OTUs (Operational Taxonomic Units) with a 97% similarity were characterized in all of the salivary samples. Twenty taxonomic groups (seventeen genera, two families and one class: Streptococcus, Veillonella, Gemellaceae, Prevotella, Rothia, Porphyromonas, Granulicatella, Actinomyces, TM-7-3, Leptotrichia, Haemophilus, Selenomonas, Neisseria, Megasphaera, Capnocytophaga, Oribacterium, Abiotrophia, Lachnospiraceae, Peptostreptococcus, and Atopobium) were found in all subjects and constituted 94.5-96.5% of the microbiome. Of these twenty genera, the proportion of Streptococcus decreased while that of Veillonella increased with poor oral hygiene status (P < 0.05 in the Poor oral hygiene group). This is the first study demonstrating an important association between an increase of Veillonella and poor oral hygiene status in children. However, further studies are required to identify the majority of Veillonella at the species level in the salivary microbiome of the Poor oral hygiene group. PMID:28934367

  10. Technetium reduction and removal in a stratified fjord

    International Nuclear Information System (INIS)

    Keith-Roach, M.; Roos, P.

    2002-01-01

    The distribution of Tc in the water column of a stratified fjord has been measured to investigate the behaviour and fate of Tc on reaching reducing waters. Slow mixing in the water column of the fjord results in vertical transport of the dissolved Tc to the oxic/anoxic interface. Tc is reduced just below the interface and at 21 m 60% is sorbed to particulate and colloidal material. Tc is carried to the sediments sorbed to the particulate material, where there is a current inventory of approximately 3 Bq m⁻². (LN)

  11. Technetium reduction and removal in a stratified fjord

    Energy Technology Data Exchange (ETDEWEB)

    Keith-Roach, M.; Roos, P. [Risoe National Lab., Roskilde (Denmark)

    2002-04-01

    The distribution of Tc in the water column of a stratified fjord has been measured to investigate the behaviour and fate of Tc on reaching reducing waters. Slow mixing in the water column of the fjord results in vertical transport of the dissolved Tc to the oxic/anoxic interface. Tc is reduced just below the interface and at 21 m 60% is sorbed to particulate and colloidal material. Tc is carried to the sediments sorbed to the particulate material, where there is a current inventory of approximately 3 Bq m⁻². (LN)

  12. Sensitivity of the Geomagnetic Octupole to a Stably Stratified Layer in the Earth's Core

    Science.gov (United States)

    Yan, C.; Stanley, S.

    2017-12-01

    The presence of a stably stratified layer at the top of the core has long been proposed for Earth, based on evidence from seismology and geomagnetic secular variation. Geodynamo modeling offers a unique window to inspect the properties and dynamics in Earth's core. For example, numerical simulations have shown that magnetic field morphology is sensitive to the presence of stably stratified layers in a planet's core. Here we use the mMoSST numerical dynamo model to investigate the effects of a thin stably stratified layer at the top of the fluid outer core in Earth on the resulting large-scale geomagnetic field morphology. We find that the existence of a stable layer has significant influence on the octupolar component of the magnetic field in our models, whereas the quadrupole doesn't show an obvious trend. This suggests that observations of the geomagnetic field can be applied to provide information of the properties of this plausible stable layer, such as how thick and how stable this layer could be. Furthermore, we have examined whether the dominant thermal signature from mantle tomography at the core-mantle boundary (CMB) (a degree & order 2 spherical harmonic) can influence our results. We found that this heat flux pattern at the CMB has no outstanding effects on the quadrupole and octupole magnetic field components. Our studies suggest that if there is a stably stratified layer at the top of the Earth's core, it must be limited in terms of stability and thickness, in order to be compatible with the observed paleomagnetic record.

  13. Propagation of acoustic waves in a stratified atmosphere, 1

    Science.gov (United States)

    Kalkofen, W.; Rossi, P.; Bodo, G.; Massaglia, S.

    1994-01-01

    This work is motivated by the chromospheric 3-minute oscillations observed in the K_2v bright points. We study acoustic-gravity waves in a one-dimensional, gravitationally stratified, isothermal atmosphere. The oscillations are excited either by a velocity pulse imparted to a layer in an atmosphere of infinite vertical extent, or by a piston forming the lower boundary of a semi-infinite medium. We consider both linear and non-linear waves.

  14. The Fokker-Planck equation for ray dispersion in gyrotropic stratified media

    NARCIS (Netherlands)

    Golynski, S.M.

    1984-01-01

    The Hamilton equations of geometrical optics determine the rays of the relevant wave field in the short-wavelength limit. We give a systematic derivation of the Fokker-Planck equation for the joint probability density of the position and unit direction vector of rays propagating in a gyrotropic stratified medium.

  15. Stratified polymer brushes from microcontact printing of polydopamine initiator on polymer brush surfaces.

    Science.gov (United States)

    Wei, Qiangbing; Yu, Bo; Wang, Xiaolong; Zhou, Feng

    2014-06-01

    Stratified polymer brushes are fabricated using microcontact printing (μCP) of initiator-integrated polydopamine (PDOPBr) on polymer brush surfaces, followed by surface-initiated atom transfer radical polymerization (SI-ATRP). It is found that the surface energy, chemically active groups, and the antifouling ability of the polymer brushes affect the transfer efficiency and adhesive stability of the polydopamine film. The stickiness of the PDOPBr pattern on polymer brush surfaces is stable enough to perform continuous μCP and SI-ATRP to prepare stratified polymer brushes with a 3D topography, which have broad applications in cell and protein patterning, biosensors, and hybrid surfaces. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Emerging Techniques in Stratified Designs and Continuous Gradients for Tissue Engineering of Interfaces

    Science.gov (United States)

    Dormer, Nathan H.; Berkland, Cory J.; Detamore, Michael S.

    2013-01-01

    Interfacial tissue engineering is an emerging branch of regenerative medicine, where engineers are faced with developing methods for the repair of one or many functional tissue systems simultaneously. Early and recent solutions for complex tissue formation have utilized stratified designs, where scaffold formulations are segregated into two or more layers, with discrete changes in physical or chemical properties, mimicking a corresponding number of interfacing tissue types. This method has brought forth promising results, along with a myriad of regenerative techniques. The latest designs, however, are employing “continuous gradients” in properties, where there is no discrete segregation between scaffold layers. This review compares the methods and applications of recent stratified approaches to emerging continuously graded methods. PMID:20411333

  17. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    Science.gov (United States)

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

    In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often-limited number of available animals of these models, establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time point of sampling, represents a meaningful approach to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of the sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimens for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for generation of representative, multi-purpose biobank specimens from porcine tissues is presented. The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical
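    The systematic random sampling mentioned above rests on the classic scheme of a random start followed by every k-th unit. A minimal sketch (illustrative only; the guidelines' point-counting procedure adds volume weighting on top of this):

```python
import random

def systematic_random_sample(items, n, seed=0):
    """Systematic sampling with a random start: pick every k-th item
    after a random offset in [0, k), giving every item equal inclusion
    probability."""
    k = len(items) // n                      # sampling interval
    start = random.Random(seed).randrange(k) # uniform random start
    return [items[start + i * k] for i in range(n)]
```

    Compared with an independent random draw, the fixed interval spreads the sample evenly through the specimen, which is what makes the scheme attractive for stereology.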

  18. Multi-Instrument Observations of Prolonged Stratified Wind Layers at Iqaluit, Nunavut

    Science.gov (United States)

    Mariani, Zen; Dehghan, Armin; Gascon, Gabrielle; Joe, Paul; Hudak, David; Strawbridge, Kevin; Corriveau, Julien

    2018-02-01

    Data collected between October 2015 and May 2016 at Environment and Climate Change Canada's Iqaluit research site (64°N, 69°W) have revealed a high frequency (40% of all days for which observations were available) of stratified wind layer events that occur from near the surface up to about 7.2 km above sea level. These stratified wind layers are clearly visible as wind shifts (90 to 180°) with height in range-height indicator scans from the Doppler lidar and Ka-band radar and in wind direction profiles from the Doppler lidar and radiosonde. During these events, the vertical structure of the flow appears to be a stack of 4 to 10 layers ranging in vertical width from 0.1 to 4.4 km. The stratification events that were observed occurred predominantly (81%) during light precipitation and lasted up to 27.5 h. The integrated measurement platforms at Iqaluit permitted continuous observations of the evolution of stratification events in different meteorological conditions.

  19. Assessing differences in groups randomized by recruitment chain in a respondent-driven sample of Seattle-area injection drug users.

    Science.gov (United States)

    Burt, Richard D; Thiede, Hanne

    2014-11-01

    Respondent-driven sampling (RDS) is a form of peer-based study recruitment and analysis that incorporates features designed to limit and adjust for biases in traditional snowball sampling. It is being widely used in studies of hidden populations. We report an empirical evaluation of RDS's consistency and variability, comparing groups recruited contemporaneously, by identical methods and using identical survey instruments. We randomized recruitment chains from the RDS-based 2012 National HIV Behavioral Surveillance survey of injection drug users in the Seattle area into two groups and compared them in terms of sociodemographic characteristics, drug-associated risk behaviors, sexual risk behaviors, human immunodeficiency virus (HIV) status and HIV testing frequency. The two groups differed in five of the 18 variables examined (P ≤ .001): race (e.g., 60% white vs. 47%), gender (52% male vs. 67%), area of residence (32% downtown Seattle vs. 44%), an HIV test in the previous 12 months (51% vs. 38%). The difference in serologic HIV status was particularly pronounced (4% positive vs. 18%). In four further randomizations, differences in one to five variables attained this level of significance, although the specific variables involved differed. We found some material differences between the randomized groups. Although the variability of the present study was less than has been reported in serial RDS surveys, these findings indicate caution in the interpretation of RDS results. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. The stratified Boycott effect

    Science.gov (United States)

    Peacock, Tom; Blanchette, Francois; Bush, John W. M.

    2005-04-01

    We present the results of an experimental investigation of the flows generated by monodisperse particles settling at low Reynolds number in a stably stratified ambient with an inclined sidewall. In this configuration, upwelling beneath the inclined wall associated with the Boycott effect is opposed by the ambient density stratification. The evolution of the system is determined by the relative magnitudes of the container depth, h, and the neutral buoyancy height, hn = c0(ρp-ρf)/|dρ/dz|, where c0 is the particle concentration, ρp the particle density, ρf the mean fluid density and dρ/dz the ambient density gradient. For weak stratification, h < hn, the Boycott layer transports dense fluid from the bottom to the top of the system; subsequently, the upper clear layer of dense saline fluid is mixed by convection. For sufficiently strong stratification, h > hn, layering occurs. The lowermost layer is created by clear fluid transported from the base to its neutral buoyancy height, and has a vertical extent hn; subsequently, smaller overlying layers develop. Within each layer, convection erodes the initially linear density gradient, generating a step-like density profile throughout the system that persists after all the particles have settled. Particles are transported across the discrete density jumps between layers by plumes of particle-laden fluid.
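
    The neutral buoyancy height defined above is a simple ratio and can be evaluated directly. The parameter values in this sketch are illustrative assumptions, not data from the experiments:

```python
# Illustrative evaluation of the neutral buoyancy height
#   h_n = c0 * (rho_p - rho_f) / |d(rho)/dz|
# All numbers below are assumed for illustration only.
c0 = 0.01          # particle volume concentration
rho_p = 2500.0     # particle density [kg/m^3]
rho_f = 1020.0     # mean fluid density [kg/m^3]
drho_dz = -50.0    # ambient density gradient [kg/m^4] (density decreasing upward)

h_n = c0 * (rho_p - rho_f) / abs(drho_dz)
print(f"h_n = {h_n:.3f} m")
```

    Comparing h_n with the container depth h then selects the regime: convective mixing for h < h_n, layering for h > h_n.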

  1. Multiple sensitive estimation and optimal sample size allocation in the item sum technique.

    Science.gov (United States)

    Perri, Pier Francesco; Rueda García, María Del Mar; Cobo Rodríguez, Beatriz

    2018-01-01

    For surveys of sensitive issues in life sciences, statistical procedures can be used to reduce nonresponse and social desirability response bias. Both of these phenomena provoke nonsampling errors that are difficult to deal with and can seriously flaw the validity of the analyses. The item sum technique (IST) is a very recent indirect questioning method derived from the item count technique that seeks to procure more reliable responses on quantitative items than direct questioning while preserving respondents' anonymity. This article addresses two important questions concerning the IST: (i) its implementation when two or more sensitive variables are investigated and efficient estimates of their unknown population means are required; (ii) the determination of the optimal sample size to achieve minimum variance estimates. These aspects are of great relevance for survey practitioners engaged in sensitive research and, to the best of our knowledge, have not been studied before. In this article, theoretical results for multiple estimation and optimal allocation are obtained under a generic sampling design and then particularized to simple random sampling and stratified sampling designs. Theoretical considerations are integrated with a number of simulation studies based on data from two real surveys and conducted to ascertain the efficiency gain derived from optimal allocation in different situations. One of the surveys concerns cannabis consumption among university students. Our findings highlight some methodological advances that can be obtained in life sciences IST surveys when optimal allocation is achieved. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
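
    In the simplest stratified setting, minimum-variance allocation of the kind particularized here is classical Neyman allocation, which assigns sample sizes in proportion to N_h * S_h. A minimal sketch; the stratum names, sizes, and standard deviations are hypothetical, not data from the IST surveys:

```python
# Neyman (minimum-variance) allocation for stratified random sampling.
# Strata are given as {name: (population size N_h, std. dev. S_h)}.

def neyman_allocation(strata, n_total):
    weights = {h: N * S for h, (N, S) in strata.items()}
    total = sum(weights.values())
    # Rounding may leave the total slightly off n_total; real designs adjust.
    return {h: round(n_total * w / total) for h, w in weights.items()}

strata = {
    "first_year": (3000, 4.0),   # hypothetical stratum
    "senior": (2000, 9.0),       # hypothetical stratum
}
print(neyman_allocation(strata, n_total=300))  # → {'first_year': 120, 'senior': 180}
```

    The more variable (and larger) senior stratum receives the larger share of the sample, which is the source of the efficiency gain over proportional allocation.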

  2. A random cluster survey and a convenience sample give comparable estimates of immunity to vaccine preventable diseases in children of school age in Victoria, Australia.

    Science.gov (United States)

    Kelly, Heath; Riddell, Michaela A; Gidding, Heather F; Nolan, Terry; Gilbert, Gwendolyn L

    2002-08-19

    We compared estimates of the age-specific population immunity to measles, mumps, rubella, hepatitis B and varicella zoster viruses in Victorian school children obtained by a national sero-survey, using a convenience sample of residual sera from diagnostic laboratories throughout Australia, with those from a three-stage random cluster survey. When grouped according to school age (primary or secondary school) there was no significant difference in the estimates of immunity to measles, mumps, hepatitis B or varicella. Compared with the convenience sample, the random cluster survey estimated higher immunity to rubella in samples from both primary (98.7% versus 93.6%, P = 0.002) and secondary school students (98.4% versus 93.2%, P = 0.03). Despite some limitations, this study suggests that the collection of a convenience sample of sera from diagnostic laboratories is an appropriate sampling strategy to provide population immunity data that will inform Australia's current and future immunisation policies. Copyright 2002 Elsevier Science Ltd.
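
    The rubella comparisons above are two-proportion tests. A hedged sketch of such a test follows; the abstract reports percentages and P-values only, so the counts and denominators here are hypothetical:

```python
import math
from statistics import NormalDist

# Two-proportion z-test with a pooled standard error (a standard approximation;
# the original study may have used a different procedure).
def two_prop_z(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, 2 * (1 - NormalDist().cdf(abs(z)))    # two-sided p-value

z, p = two_prop_z(x1=296, n1=300, x2=281, n2=300)   # ~98.7% vs ~93.7%, hypothetical n
print(f"z = {z:.2f}, p = {p:.4f}")
```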

  3. The contribution of simple random sampling to observed variations in faecal egg counts.

    Science.gov (United States)

    Torgerson, Paul R; Paul, Michaela; Lewis, Fraser I

    2012-09-10

    It has been over 100 years since the classical paper published by Gosset in 1907, under the pseudonym "Student", demonstrated that yeast cells suspended in a fluid and measured by a haemocytometer conformed to a Poisson process. Similarly, parasite eggs in a faecal suspension also conform to a Poisson process. Despite this, there are common misconceptions about how to analyse and interpret observations from the McMaster or similar quantitative parasitic diagnostic techniques, widely used for evaluating parasite eggs in faeces. The McMaster technique can easily be shown from a theoretical perspective to give variable results that inevitably arise from the random distribution of parasite eggs in a well mixed faecal sample. The Poisson processes that lead to this variability are described, and illustrative examples are given of the potentially large confidence intervals that can arise from faecal egg counts calculated from the observations on a McMaster slide. Attempts to modify the McMaster technique, or indeed other quantitative techniques, to ensure uniform egg counts are doomed to failure and betray ignorance of Poisson processes. A simple method to immediately identify excess variation/poor sampling from replicate counts is provided. Copyright © 2012 Elsevier B.V. All rights reserved.
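
    The width of those Poisson confidence intervals is easy to illustrate. A stdlib-only sketch using the normal approximation (exact intervals would use chi-square quantiles); the multiplication factor of 50 is a common McMaster convention assumed here:

```python
import math

# If k eggs are counted on the slide and each counted egg represents
# `multiplier` eggs per gram, the count is Poisson, so Var(k) = k and an
# approximate 95% interval is k ± 1.96*sqrt(k), scaled by the multiplier.
def mcmaster_ci(k, multiplier=50, z=1.96):
    half = z * math.sqrt(k)
    lo, hi = max(0.0, k - half), k + half
    return k * multiplier, lo * multiplier, hi * multiplier

epg, lo, hi = mcmaster_ci(k=12)  # 12 eggs observed on the slide
print(f"EPG estimate {epg}, approximate 95% CI ({lo:.0f}, {hi:.0f})")
```

    Even with 12 eggs counted, the interval spans several hundred eggs per gram, which is the variability the authors attribute to the Poisson process itself rather than to poor technique.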

  4. FDTD scattered field formulation for scatterers in stratified dispersive media.

    Science.gov (United States)

    Olkkonen, Juuso

    2010-03-01

    We introduce a simple scattered field (SF) technique that enables finite difference time domain (FDTD) modeling of light scattering from dispersive objects residing in stratified dispersive media. The introduced SF technique is verified against the total field scattered field (TFSF) technique. As an application example, we study surface plasmon polariton enhanced light transmission through a 100 nm wide slit in a silver film.

  5. 1998 Annual Status Report: Submersed and Floating-Leaf Vegetation in Pools 4, 8, 13, and 26 and La Grange Pool of the Upper Mississippi River System

    National Research Council Canada - National Science Library

    Yin, Yao

    2001-01-01

    Aquatic vegetation was investigated in five navigation pools in the Upper Mississippi River System using a new protocol named 'stratified random sampling' or SRS protocol for the first time in 1998...

  6. Implementing risk-stratified screening for common cancers: a review of potential ethical, legal and social issues.

    Science.gov (United States)

    Hall, A E; Chowdhury, S; Hallowell, N; Pashayan, N; Dent, T; Pharoah, P; Burton, H

    2014-06-01

    The identification of common genetic variants associated with common cancers including breast, prostate and ovarian cancers would allow population stratification by genotype to effectively target screening and treatment. As scientific, clinical and economic evidence mounts there will be increasing pressure for risk-stratified screening programmes to be implemented. This paper reviews some of the main ethical, legal and social issues (ELSI) raised by the introduction of genotyping into risk-stratified screening programmes, in terms of Beauchamp and Childress's four principles of biomedical ethics--respect for autonomy, non-maleficence, beneficence and justice. Two alternative approaches to data collection, storage, communication and consent are used to exemplify the ELSI issues that are likely to be raised. Ultimately, the provision of risk-stratified screening using genotyping raises fundamental questions about respective roles of individuals, healthcare providers and the state in organizing or mandating such programmes, and the principles, which underpin their provision, particularly the requirement for distributive justice. The scope and breadth of these issues suggest that ELSI relating to risk-stratified screening will become increasingly important for policy-makers, healthcare professionals and a wide diversity of stakeholders. © The Author 2013. Published by Oxford University Press on behalf of Faculty of Public Health.

  7. The Effect of Service Quality on Customer Satisfaction at PT. Bank Central Asia (BCA) Tbk, Undaan Branch, Surabaya

    Directory of Open Access Journals (Sweden)

    Yulian Belinda Rahmawati

    2014-10-01

    Full Text Available This study aims to determine the effect of service quality characteristics on customer satisfaction at PT. Bank Central Asia (BCA) Tbk, Undaan Branch, Surabaya. Sampling was carried out by stratified random sampling, and the analysis technique used was multiple linear regression. The results show that the service quality dimensions simultaneously affect customer satisfaction at PT. Bank Central Asia (BCA), Undaan Branch, Surabaya, and that the dimensions Responsiveness, Tangibles, Empathy, Assurance, and Reliability each also have a partial effect on customer satisfaction.

  8. Evaluation of a Stratified National Breast Screening Program in the United Kingdom: An Early Model-Based Cost-Effectiveness Analysis.

    Science.gov (United States)

    Gray, Ewan; Donten, Anna; Karssemeijer, Nico; van Gils, Carla; Evans, D Gareth; Astley, Sue; Payne, Katherine

    2017-09-01

    To identify the incremental costs and consequences of stratified national breast screening programs (stratified NBSPs) and drivers of relative cost-effectiveness. A decision-analytic model (discrete event simulation) was conceptualized to represent four stratified NBSPs (risk 1, risk 2, masking [supplemental screening for women with higher breast density], and masking and risk 1) compared with the current UK NBSP and no screening. The model assumed a lifetime horizon, the health service perspective to identify costs (£, 2015), and measured consequences in quality-adjusted life-years (QALYs). Multiple data sources were used: systematic reviews of effectiveness and utility, published studies reporting costs, and cohort studies embedded in existing NBSPs. Model parameter uncertainty was assessed using probabilistic sensitivity analysis and one-way sensitivity analysis. The base-case analysis, supported by probabilistic sensitivity analysis, suggested that the risk-stratified NBSPs (risk 1 and risk 2) were relatively cost-effective when compared with the current UK NBSP, with incremental cost-effectiveness ratios of £16,689 per QALY and £23,924 per QALY, respectively. Stratified NBSPs including masking approaches (supplemental screening for women with higher breast density) were not cost-effective alternatives, with incremental cost-effectiveness ratios of £212,947 per QALY (masking) and £75,254 per QALY (risk 1 and masking). When compared with no screening, all stratified NBSPs could be considered cost-effective. Key drivers of cost-effectiveness were the discount rate, natural history model parameters, mammographic sensitivity, and biopsy rates for recalled cases. A key assumption was that the risk model used in the stratification process was perfectly calibrated to the population. This early model-based cost-effectiveness analysis provides indicative evidence for decision makers to understand the key drivers of costs and QALYs for exemplar stratified NBSPs.

  9. Comparative Effects of Circuit Training Programme on Speed and ...

    African Journals Online (AJOL)

    Stratified random sampling technique was used to select 40 premenarcheal and 40 postmenarcheal girls who were later randomly assigned to experimental and control groups. At the end of the training programme, 40 subjects completed the post-training measurements, so there were 10 subjects in each of the four study ...
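
    The design described above (stratified selection, then random assignment within each stratum) can be sketched as follows; the sampling frame and identifiers are invented for illustration:

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible

# Hypothetical sampling frame, stratified by menarcheal status.
population = {
    "premenarcheal": [f"P{i:03d}" for i in range(200)],
    "postmenarcheal": [f"Q{i:03d}" for i in range(200)],
}

# Stratified random sampling: 40 girls drawn independently from each stratum.
sample = {s: random.sample(frame, 40) for s, frame in population.items()}

# Random assignment within each stratum: 20 experimental, 20 control.
assignment = {}
for s, ids in sample.items():
    shuffled = random.sample(ids, len(ids))
    assignment[s] = {"experimental": shuffled[:20], "control": shuffled[20:]}

print({s: (len(a["experimental"]), len(a["control"])) for s, a in assignment.items()})
```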

  10. Stratified growth in Pseudomonas aeruginosa biofilms

    DEFF Research Database (Denmark)

    Werner, E.; Roe, F.; Bugnicourt, A.

    2004-01-01

    In this study, stratified patterns of protein synthesis and growth were demonstrated in Pseudomonas aeruginosa biofilms. Spatial patterns of protein synthetic activity inside biofilms were characterized by the use of two green fluorescent protein (GFP) reporter gene constructs. One construct...... synthesis was restricted to a narrow band in the part of the biofilm adjacent to the source of oxygen. The zone of active GFP expression was approximately 60 μm wide in colony biofilms and 30 μm wide in flow cell biofilms. The region of the biofilm in which cells were capable of elongation was mapped...... by treating colony biofilms with carbenicillin, which blocks cell division, and then measuring individual cell lengths by transmission electron microscopy. Cell elongation was localized at the air interface of the biofilm. The heterogeneous anabolic patterns measured inside these biofilms were likely a result...

  11. Thermal instability in a stratified plasma

    International Nuclear Information System (INIS)

    Hermanns, D.F.M.; Priest, E.R.

    1989-01-01

    The thermal instability mechanism has been studied in connection with observed coronal features such as prominences or cool cores in loops. Although these features show a lot of structure, most studies concern the thermal instability in a uniform medium. In this paper, we investigate the thermal instability and the interaction between thermal modes and the slow magneto-acoustic subspectrum for a stratified plasma slab. We formulate the relevant system of equations and give some straightforward properties of the linear spectrum of a non-uniform plasma slab, i.e. the existence of continuous parts in the spectrum. We present a numerical scheme with which we can investigate the linear spectrum for equilibrium states with stratification. The slow and thermal subspectra of a crude coronal model are given as a preliminary result. (author). 6 refs.; 1 fig

  12. Total regional and global number of synapses in the human brain neocortex

    NARCIS (Netherlands)

    Tang, Y.; Nyengaard, J.R.; Groot, D.M.G. de; Jorgen, H.; Gundersen, G.

    2001-01-01

    An estimator of the total number of synapses in neocortex of human autopsy brains based on unbiased stereological principles is described. Each randomly chosen cerebral hemisphere was stratified into the four major neocortical regions. Uniform sampling with a varying sampling fraction in each region

  13. The effect of sediments on turbulent plume dynamics in a stratified fluid

    Science.gov (United States)

    Stenberg, Erik; Ezhova, Ekaterina; Brandt, Luca

    2017-11-01

    We report large eddy simulation results of sediment-loaded turbulent plumes in a stratified fluid. The configuration, where the plume is discharged from a round source, provides an idealized model of subglacial discharge from a submarine tidewater glacier and is a starting point for understanding the effect of sediments on the dynamics of the rising plume. The transport of sediments is modeled by means of an advection-diffusion equation where sediment settling velocity is taken into account. We initially follow the experimental setup of Sutherland (Phys. Rev. Fluids, 2016), considering uniformly stratified ambients and further extend the work to pycnocline-type stratifications typical of Greenland fjords. Apart from examining the rise height, radial spread and intrusion of the rising plume, we gain further insights of the plume dynamics by extracting turbulent characteristics and the distribution of the sediments inside the plume.

  14. Mobile access to virtual randomization for investigator-initiated trials.

    Science.gov (United States)

    Deserno, Thomas M; Keszei, András P

    2017-08-01

    Background/aims Randomization is indispensable in clinical trials in order to provide unbiased treatment allocation and a valid statistical inference. Improper handling of allocation lists can be avoided using central systems, for example, human-based services. However, central systems are unaffordable for investigator-initiated trials and might be inaccessible from some places, where study subjects need allocations. We propose mobile access to virtual randomization, where the randomization lists are non-existent and the appropriate allocation is computed on demand. Methods The core of the system architecture is an electronic data capture system or a clinical trial management system, which is extended by an R interface connecting the R server using the Java R Interface. Mobile devices communicate via the representational state transfer web services. Furthermore, a simple web-based setup allows configuring the appropriate statistics by non-statisticians. Our comprehensive R script supports simple randomization, restricted randomization using a random allocation rule, block randomization, and stratified randomization for un-blinded, single-blinded, and double-blinded trials. For each trial, the electronic data capture system or the clinical trial management system stores the randomization parameters and the subject assignments. Results Apps are provided for iOS and Android and subjects are randomized using smartphones. After logging onto the system, the user selects the trial and the subject, and the allocation number and treatment arm are displayed instantaneously and stored in the core system. So far, 156 subjects have been allocated from mobile devices serving five investigator-initiated trials. Conclusion Transforming pre-printed allocation lists into virtual ones ensures the correct conduct of trials and guarantees a strictly sequential processing in all trial sites. 
Covering 88% of all randomization models that are used in recent trials, virtual randomization
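
    The allocation methods listed (simple, random allocation rule, block, and stratified randomization) can all be computed on demand rather than read from a pre-printed list. A minimal sketch of virtual block randomization under assumed conventions (string-based seeding, block size 4); this is not the R-based implementation the paper describes:

```python
import random

def block_allocation(trial_seed, stratum, subject_index,
                     arms=("A", "B"), block_size=4):
    """Return the arm for the subject_index-th subject in a stratum.

    The allocation is recomputed deterministically from the seed, so no
    allocation list needs to exist: every site derives the same sequence.
    """
    block_no, pos = divmod(subject_index, block_size)
    rng = random.Random(f"{trial_seed}:{stratum}:{block_no}")  # per-block RNG
    block = list(arms) * (block_size // len(arms))  # balanced block, e.g. A A B B
    rng.shuffle(block)
    return block[pos]

# Stratified randomization: each stratum gets its own independent sequence.
seq = [block_allocation(42, "site1:male", i) for i in range(8)]
print(seq)  # every block of 4 contains exactly two "A" and two "B"
```

    Because the sequence is a pure function of (trial seed, stratum, subject index), allocations are strictly sequential and reproducible at every site, which is the property the abstract emphasizes.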

  15. A statistical mechanics approach to mixing in stratified fluids

    OpenAIRE

    Venaille , Antoine; Gostiaux , Louis; Sommeria , Joël

    2016-01-01

    Accepted for the Journal of Fluid Mechanics; Predicting how much mixing occurs when a given amount of energy is injected into a Boussinesq fluid is a longstanding problem in stratified turbulence. The huge number of degrees of freedom involved in these processes renders a deterministic approach to the problem extremely difficult. Here we present a statistical mechanics approach yielding a prediction for a cumulative, global mixing efficiency as a function of a global Richardson number and th...

  16. Study on exchange flow under the unstably stratified field

    OpenAIRE

    文沢, 元雄

    2005-01-01

    This paper deals with the exchange flow under the unstably stratified field. The author developed an effective measurement system as well as a numerical analysis program. The system and the program are applied to the helium-air exchange flow in a rectangular channel with inclination. The following main features of the exchange flow were discussed based on the calculated results: (1) the time required for establishing a quasi-steady state exchange flow; (2) the relationship between the inclination an...

  17. A study on the instability criterion for the stratified flow in horizontal pipe at cocurrent flow conditions

    Energy Technology Data Exchange (ETDEWEB)

    Sung, Chang Kyung [Korea Electric Power Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    This paper presents a theoretical approach to the instability criterion for the transition from stratified to nonstratified flow in a horizontal pipe at cocurrent flow conditions. A new theoretical instability criterion for the stratified/nonstratified flow transition in a horizontal pipe has been developed from the hyperbolic equations of two-phase flow. The critical flow condition criterion and the onset of slugging at cocurrent flow conditions correspond to zero and imaginary characteristics, respectively, which occur when the hyperbolicity of a stratified two-phase flow is broken. Comparison between results predicted by the present theory and the experimental data of Kukita et al. [1] for pipes shows that they are in good agreement. 4 refs., 2 figs. (Author)

  19. Analyze of Predictive Model of Innovation Management in Processing and Complementary Industries of Livestock Products

    OpenAIRE

    Ahmad Reza Ommani

    2015-01-01

    The purpose of this study was to design a predictive model for innovation management in processing and complementary industries of livestock products. The research method was descriptive-correlational. The population of this research was managers in processing and complementary industries of livestock products of Khouzestan Province (N=486). By stratified random sampling, a random sample (n=125) was selected for participation in the study. A questionnaire was developed to ...

  20. 42 CFR 431.814 - Sampling plan and procedures.

    Science.gov (United States)

    2010-10-01

    ... reliability of the reduced sample. (4) The sample selection procedure. Systematic random sampling is ... sampling, and yield estimates with the same or better precision than achieved in systematic random sampling ...
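
    Systematic random sampling as referenced in the regulation can be sketched as a random start followed by a fixed interval; the frame size and sample size here are illustrative:

```python
import random

def systematic_sample(frame, n, rng=None):
    """Draw a systematic random sample of n units: random start, interval k = N/n."""
    rng = rng or random.Random()
    k = len(frame) / n               # sampling interval
    start = rng.uniform(0, k)        # random start within the first interval
    return [frame[int(start + i * k)] for i in range(n)]

frame = list(range(1000))            # e.g. 1000 case records, in frame order
picked = systematic_sample(frame, n=50, rng=random.Random(7))
print(len(picked), picked[:3])
```

    When the frame order is unrelated to the outcome this behaves like simple random sampling; when the frame is sorted on a relevant variable it acts as implicit stratification, which is the "same or better precision" point in the snippet.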

  1. Gender Stratified Monopoly: Why Do I Earn Less and Pay More?

    Science.gov (United States)

    Smith, Stacy L.

    2017-01-01

    A modified version of Monopoly has long been used as a simulation exercise to teach inequality. Versions of Modified Monopoly (MM) have touched on minority status relative to inequality but without an exploration of the complex interaction between minority status and class. This article introduces Gender Stratified Monopoly (GSM), an adaptation…

  2. Longevity of Compositionally Stratified Layers in Ice Giants

    Science.gov (United States)

    Friedson, A. J.

    2017-12-01

    In the hydrogen-rich atmospheres of gas giants, a decrease with radius in the mixing ratio of a heavy species (e.g. He, CH4, H2O) has the potential to produce a density stratification that is convectively stable if the heavy species is sufficiently abundant. Formation of stable layers in the interiors of these planets has important implications for their internal structure, chemical mixing, dynamics, and thermal evolution, since vertical transport of heat and constituents in such layers is greatly reduced in comparison to that in convecting layers. Various processes have been suggested for creating compositionally stratified layers. In the interiors of Jupiter and Saturn, these include phase separation of He from metallic hydrogen and dissolution of dense core material into the surrounding metallic-H envelope. Condensation of methane and water has been proposed as a mechanism for producing stable zones in the atmospheres of Saturn and the ice giants. However, if a stably stratified layer is formed adjacent to an active region of convection, it may be susceptible to progressive erosion as the convection intrudes and entrains fluid into the unstable envelope. We discuss the principal factors that control the rate of entrainment and associated erosion and present a specific example concerning the longevity of stable layers formed by condensation of methane and water in Uranus and Neptune. We also consider whether the temporal variability of such layers may engender episodic behavior in the release of the internal heat of these planets. This research is supported by a grant from the NASA Solar System Workings Program.

  3. Effect of stratified inequality of blood flow on gas exchange in liquid-filled lungs.

    Science.gov (United States)

    West, J. B.; Maloney, J. E.; Castle, B. L.

    1972-01-01

    This investigation set out to answer two questions: (1) are the distal alveoli in the terminal lung units less well perfused than the proximal alveoli, i.e., is there stratification of blood flow; and (2) if so, does this enhance gas exchange in the presence of stratified inequality of ventilation. Excised dog lungs were ventilated with saline and perfused with blood. Following single inspirations of xenon 133 in saline and various periods of breath holding, the expired xenon concentration against volume was measured and it confirmed marked stratified inequality of ventilation under these conditions. By measuring the rate of depletion of xenon from alveoli during a period of blood flow, we showed that the alveoli which emptied at the end of expiration had 16% less blood flow than those exhaling earlier. However, by measuring the xenon concentration in pulmonary venous blood, we found that about 10% less tracer was transferred from the alveoli into the blood when the inspired xenon was stratified within the respiratory zone. Thus while stratification of blood flow was confirmed, it was shown to impair rather than enhance the efficiency of gas transfer.

  4. STRESS DISTRIBUTION IN THE STRATIFIED MASS CONTAINING VERTICAL ALVEOLE

    Directory of Open Access Journals (Sweden)

    Bobileva Tatiana Nikolaevna

    2017-08-01

    Full Text Available Almost all subsurface rocks used as foundations for various types of structures are stratified. Such heterogeneity may cause specific behaviour of the materials under strain. The differential equations describing such materials contain rapidly fluctuating coefficients, so solving them directly is time-consuming even on today's computers. The method of asymptotic averaging replaces the heterogeneous medium under study with a homogeneous one governed by averaged equations with constant coefficients. The present article is concerned with a stratified soil mass consisting of pairwise alternating isotropic elastic layers. After averaging the elastic moduli, the soil mass with horizontal rock stratification is modelled as a homogeneous transversely isotropic half-space whose plane of isotropy is perpendicular to the vertical axis. The half-space is weakened by a vertical alveole of circular cross-section, and the virgin ground is loaded by its own weight. On the horizontal parting planes between layers, two types of contact conditions are considered: ideal contact and slip without separation. For the homogeneous transversely isotropic half-space with a vertical alveole, the well-known analytical solution of S.G. Lekhnitsky is used. The author gives expressions for the stress components and displacements in the soil mass for different boundary conditions on the alveole surface. Such problems arise in the construction and maintenance of buildings and in the use of composite materials.

  5. The quality of the reported sample size calculations in randomized controlled trials indexed in PubMed.

    Science.gov (United States)

    Lee, Paul H; Tse, Andy C Y

    2017-05-01

    There are limited data on the quality of reporting of information essential for replication of sample size calculations, as well as on the accuracy of those calculations. We examined the current quality of reporting of sample size calculations in randomized controlled trials (RCTs) published in PubMed, the variation in reporting across study design, study characteristics, and journal impact factor, and the targeted sample sizes reported in trial registries. We reviewed and analyzed all RCTs published in December 2014 in journals indexed in PubMed. The 2014 Impact Factors for the journals were used as proxies for their quality. Of the 451 analyzed papers, 58.1% reported an a priori sample size calculation. Nearly all papers provided the level of significance (97.7%) and desired power (96.6%), and most of the papers reported the minimum clinically important effect size (73.3%). The median percentage difference between the reported and recalculated sample sizes was 0.0% (IQR −4.6% to 3.0%). The accuracy of the reported sample size was better for studies published in journals that endorsed the CONSORT statement and journals with an impact factor. A total of 98 papers provided a targeted sample size in trial registries; about two-thirds of these papers (n=62) reported a sample size calculation, but only 25 (40.3%) showed no discrepancy with the number reported in the trial registries. The reporting of sample size calculations in RCTs published in PubMed-indexed journals and trial registries was poor. The CONSORT statement should be more widely endorsed. Copyright © 2016 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.
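
    The three quantities the review finds most often reported (significance level, power, and minimum clinically important effect size) are exactly the inputs of the standard calculation. A sketch for two parallel groups with a continuous outcome, using the normal approximation (the exact t-based formula gives slightly larger n):

```python
import math
from statistics import NormalDist

def n_per_group(alpha=0.05, power=0.80, d=0.5):
    """Sample size per group for a two-sided, two-sample comparison of means,
    where d is the standardized effect size (normal approximation)."""
    z = NormalDist().inv_cdf
    z_alpha, z_beta = z(1 - alpha / 2), z(power)
    return math.ceil(2 * (z_alpha + z_beta) ** 2 / d ** 2)

print(n_per_group())  # alpha=0.05, power=0.80, d=0.5 → 63 per group
```

    Recomputing the sample size from the three reported inputs in this way is how a reviewer can check the reported figure, which is the replication the abstract finds often impossible.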

  6. Linear and nonlinear stability of a thermally stratified magnetically driven rotating flow in a cylinder.

    Science.gov (United States)

    Grants, Ilmars; Gerbeth, Gunter

    2010-07-01

    The stability of a thermally stratified liquid metal flow is considered numerically. The flow is driven by a rotating magnetic field in a cylinder heated from above and cooled from below. The stable thermal stratification turns out to destabilize the flow. This is explained by the fact that a stable stratification suppresses the secondary meridional flow, thus indirectly enhancing the primary rotation. The instability in the form of Taylor-Görtler rolls is consequently promoted. These rolls can only be excited by finite disturbances in the isothermal flow. A sufficiently strong thermal stratification transforms this nonlinear bypass instability into a linear one reducing, thus, the critical value of the magnetic driving force. A weaker temperature gradient delays the linear instability but makes the bypass transition more likely. We quantify the non-normal and nonlinear components of this transition by direct numerical simulation of the flow response to noise. It is observed that the flow sensitivity to finite disturbances increases considerably under the action of a stable thermal stratification. The capabilities of the random forcing approach to identify disconnected coherent states in a general case are discussed.

  7. Randomized controlled trial of attention bias modification in a racially diverse, socially anxious, alcohol dependent sample.

    Science.gov (United States)

    Clerkin, Elise M; Magee, Joshua C; Wells, Tony T; Beard, Courtney; Barnett, Nancy P

    2016-12-01

    Attention biases may be an important treatment target for both alcohol dependence and social anxiety. This is the first ABM trial to investigate two (vs. one) targets of attention bias within a sample with co-occurring symptoms of social anxiety and alcohol dependence. Additionally, we used trial-level bias scores (TL-BS) to capture the phenomena of attention bias in a more ecologically valid, dynamic way compared to traditional attention bias scores. Adult participants (N = 86; 41% Female; 52% African American; 40% White) with elevated social anxiety symptoms and alcohol dependence were randomly assigned to an 8-session training condition in this 2 (Social Anxiety ABM vs. Social Anxiety Control) by 2 (Alcohol ABM vs. Alcohol Control) design. Symptoms of social anxiety, alcohol dependence, and attention bias were assessed across time. Multilevel models estimated the trajectories for each measure within individuals, and tested whether these trajectories differed according to the randomized training conditions. Across time, there were significant or trending decreases in all attention TL-BS parameters (but not traditional attention bias scores) and most symptom measures. However, there were not significant differences in the trajectories of change between any ABM and control conditions for any symptom measures. These findings add to previous evidence questioning the robustness of ABM and point to the need to extend the effects of ABM to samples that are racially diverse and/or have co-occurring psychopathology. The results also illustrate the potential importance of calculating trial-level attention bias scores rather than only including traditional bias scores. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Randomized Controlled Trial of Attention Bias Modification in a Racially Diverse, Socially Anxious, Alcohol Dependent Sample

    Science.gov (United States)

    Clerkin, Elise M.; Magee, Joshua C.; Wells, Tony T.; Beard, Courtney; Barnett, Nancy P.

    2016-01-01

    Objective Attention biases may be an important treatment target for both alcohol dependence and social anxiety. This is the first ABM trial to investigate two (vs. one) targets of attention bias within a sample with co-occurring symptoms of social anxiety and alcohol dependence. Additionally, we used trial-level bias scores (TL-BS) to capture the phenomena of attention bias in a more ecologically valid, dynamic way compared to traditional attention bias scores. Method Adult participants (N=86; 41% Female; 52% African American; 40% White) with elevated social anxiety symptoms and alcohol dependence were randomly assigned to an 8-session training condition in this 2 (Social Anxiety ABM vs. Social Anxiety Control) by 2 (Alcohol ABM vs. Alcohol Control) design. Symptoms of social anxiety, alcohol dependence, and attention bias were assessed across time. Results Multilevel models estimated the trajectories for each measure within individuals, and tested whether these trajectories differed according to the randomized training conditions. Across time, there were significant or trending decreases in all attention TL-BS parameters (but not traditional attention bias scores) and most symptom measures. However, there were not significant differences in the trajectories of change between any ABM and control conditions for any symptom measures. Conclusions These findings add to previous evidence questioning the robustness of ABM and point to the need to extend the effects of ABM to samples that are racially diverse and/or have co-occurring psychopathology. The results also illustrate the potential importance of calculating trial-level attention bias scores rather than only including traditional bias scores. PMID:27591918

  9. E25 stratified torch ignition engine emissions and combustion analysis

    International Nuclear Information System (INIS)

    Rodrigues Filho, Fernando Antonio; Baêta, José Guilherme Coelho; Teixeira, Alysson Fernandes; Valle, Ramón Molina; Fonseca de Souza, José Leôncio

    2016-01-01

    Highlights: • A stratified torch ignition (STI) engine was built and tested. • The STI engine was tested over a wide range of loads and speeds. • A significant reduction in emissions was achieved by means of the STI system. • Low cyclic variability characterized the lean combustion process of the torch ignition engine. • HC emission is the main drawback of the stratified torch ignition engine. - Abstract: Vehicular emissions significantly increase atmospheric air pollution and greenhouse gases (GHG). This fact, associated with the fast growth of the global vehicle fleet, calls for prompt technological solutions from the scientific community in order to promote a significant reduction in vehicle fuel consumption and emissions, especially of fossil fuels, to comply with future legislation. To meet this goal, a prototype stratified torch ignition (STI) engine was built from an existing commercial baseline engine. In this system, combustion starts in a pre-combustion chamber, where the pressure increase pushes the combustion jet flames through calibrated nozzles to be precisely targeted into the main chamber. These combustion jet flames are endowed with high thermal and kinetic energy, and are able to generate a stable lean combustion process. The high kinetic and thermal energy of the combustion jet flame results from the load stratification, which is achieved through direct fuel injection into the pre-combustion chamber by means of a prototype gasoline direct injector (GDI) developed for a very low fuel flow rate. In this work the engine-out emissions of CO, NOx, HC and CO_2 of the STI engine are presented and a detailed analysis supported by the combustion parameters is conducted. The results show a significant decrease in the specific emissions of CO, NOx and CO_2 of the STI engine in comparison with the baseline engine. On the other hand, HC specific emissions increased due to wall wetting from fuel impinging on the pre-combustion chamber wall.

  10. The Malleability of Spatial Ability under Treatment of a FIRST LEGO League-Based Robotics Simulation

    Science.gov (United States)

    Coxon, Steve V.

    2012-01-01

    A stratified random sample of volunteer participants (N = 75) aged 9 to 14 was drawn from 16 public school districts' gifted programs, including as many females (n = 28) and children from groups traditionally underrepresented in gifted programs (n = 18) as available. Participants were randomly divided into an experimental (n = 38) and a control…

  11. Natural history of HIV/AIDS in a major goldmining centre in South Africa: results of a biomedical and social survey

    CSIR Research Space (South Africa)

    Gilgen, D

    2001-09-01

    Full Text Available sample comprised a stratified random group of migrant mineworkers and of the resident adult population living in the community close to the mines and a small convenience sample of sex workers. In total, 2231 people between 13 and 59 years of age were...

  12. Perceptions of Preservice Teachers regarding the Integration of Information and Communication Technologies in Turkish Education Faculties

    Science.gov (United States)

    Akbulut, Yavuz; Odabasi, H. Ferhan; Kuzu, Abdullah

    2011-01-01

    This study explored the views of pre-service teachers regarding the indicators of information and communication technologies (ICT) at Turkish education faculties. A cross-sectional survey design was implemented with graduating students enrolled in Turkish education faculties. A combination of stratified random sampling and systematic sampling was…
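The combination this survey describes, stratified random sampling followed by systematic sampling within strata, can be sketched briefly. In the sketch below, the stratum names and sizes are invented for illustration; the systematic step draws a random start and then every k-th unit (it assumes n is no larger than the stratum size):

```python
import random

def systematic_sample(frame, n):
    """Systematic sample of n units: random start, then every k-th unit."""
    k = len(frame) // n                      # sampling interval
    start = random.randrange(k)              # random start in [0, k)
    return [frame[start + i * k] for i in range(n)]

def stratified_systematic(strata, n_per_stratum):
    """Apply systematic sampling independently within each stratum."""
    return {name: systematic_sample(units, n_per_stratum)
            for name, units in strata.items()}

random.seed(1)
# hypothetical sampling frame: student IDs grouped by faculty
strata = {"faculty_A": list(range(100)), "faculty_B": list(range(100, 250))}
sample = stratified_systematic(strata, 10)
print({name: len(units) for name, units in sample.items()})  # 10 drawn per stratum
```

Stratifying first guarantees every faculty is represented; the systematic step spreads the draw evenly across each stratum's list order.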

  13. Prevalence and factors affecting work-related injury among workers ...

    African Journals Online (AJOL)

    Administrator

    EPI INFO version 6.04 statistical software was used to calculate sample size. ... Stratified random sampling technique was applied to get the desired ... The quality of data was ensured through the training of data collectors and .... The bivariate analysis indicated that more .... Multivariate Logistic regression analysis: Stepwise.

  14. Family Cohesion and Level of Communication Between Parents and ...

    African Journals Online (AJOL)

    This study investigated the level of communication between parents and their adolescent children and how such communication affects family cohesion. A sample of 200 subjects made up of adolescents and parents were selected through cluster, stratified and random sampling techniques from ten Local Government Areas ...

  15. 1994 Annual Status Report. A Summary of Fish Data in Six Reaches of the Upper Mississippi River System

    National Research Council Canada - National Science Library

    Gutreuter, Steve

    1997-01-01

    The Long Term Resource Monitoring Program (LTRMP) completed 2,653 collections of fishes from stratified random and permanently fixed sampling locations in six study reaches of the Upper Mississippi River System during 1994...

  16. 1991 Annual Status Report. A Summary of Fish Data in Six Reaches of the Upper Mississippi River System

    National Research Council Canada - National Science Library

    Gutreuter, Steve

    1998-01-01

    The Long Term Resource Monitoring Program (LTRMP) completed 2,653 collections of fishes from stratified random and permanently fixed sampling locations in six study reaches of the Upper Mississippi River System during 1991...

  17. 1996 Annual Status Report. A Summary of Fish Data in Six Reaches of the Upper Mississippi River System

    National Research Council Canada - National Science Library

    Burkhardt, Randy

    1997-01-01

    The Long Term Resource Monitoring Program (LTRMP) completed 2,378 collections of fishes from stratified random and permanently fixed sampling locations in six study reaches of the Upper Mississippi River System during 1996...

  18. 1997 Annual Status Report A Summary of Fish Data in Six Reaches of The Upper Mississippi River System

    National Research Council Canada - National Science Library

    Burkhardt, Randy

    1998-01-01

    The Long Term Resource Monitoring Program (LTRMP) completed 2,797 collections of fishes from stratified random and permanently fixed sampling locations in six study reaches of the Upper Mississippi River System during 1997...

  19. 1998 Annual Status Report: A Summary of Fish Data in Six Reaches of the Upper Mississippi River System

    National Research Council Canada - National Science Library

    Burkhardt, Randy

    2000-01-01

    The Long Term Resource Monitoring Program (LTRMP) completed 2,664 collections of fishes from stratified random and permanently fixed sampling locations in six study reaches of the Upper Mississippi River System during 1998...

  20. A Summary of Fish Data in Six Reaches of The Upper Mississippi River System

    National Research Council Canada - National Science Library

    Gutreuter, Steve

    1997-01-01

    The Long Term Resource Monitoring Program (LTRMP) completed 1,994 collections of fishes from stratified random and permanently fixed sampling locations in six study reaches of the Upper Mississippi River System during 1993...

  1. 1995 Annual Status Report. A Summary of Fish Data in Six Reaches of the Upper Mississippi River System

    National Research Council Canada - National Science Library

    Gutreuter, Steve

    1997-01-01

    The Long Term Resource Monitoring Program (LTRMP) completed 2,723 collections of fishes from stratified random and permanently fixed sampling locations in six study reaches of the Upper Mississippi River System during 1995...

  2. Farmers` satisfaction with the performance of the Mooi River ...

    African Journals Online (AJOL)

    2013-08-15

    Aug 15, 2013 ... Key performance issues in SHI schemes range from technical, agronomic .... selected through stratified random sampling during the. 2010/11 irrigation season. ..... fied with the irrigation service endeavour to achieve optimum.

  3. Doubly stratified mixed convection flow of Maxwell nanofluid with heat generation/absorption

    Energy Technology Data Exchange (ETDEWEB)

    Abbasi, F.M., E-mail: abbasisarkar@gmail.com [Department of Mathematics, Comsats Institute of Information Technology, Islamabad 44000 (Pakistan); Shehzad, S.A. [Department of Mathematics, Comsats Institute of Information Technology, Sahiwal 57000 (Pakistan); Hayat, T. [Department of Mathematics, Quaid-i-Azam University, 45320, Islamabad 44000 (Pakistan); NAAM Research Group, Department of Mathematics, Faculty of Science, King Abdulaziz University, Jeddah 21589 (Saudi Arabia); Ahmad, B. [NAAM Research Group, Department of Mathematics, Faculty of Science, King Abdulaziz University, Jeddah 21589 (Saudi Arabia)

    2016-04-15

    Magnetohydrodynamic (MHD) doubly stratified flow of Maxwell nanofluid in presence of mixed convection is analyzed in this article. Effects of thermophoresis, Brownian motion and heat generation/absorption are present. The flow is induced due to linear stretching of sheet. Mathematical formulation is made under boundary layer approach. Expressions of velocity, temperature and nanoparticles concentration are developed. The obtained results are plotted and discussed to examine the variations in temperature and nanoparticles concentration due to different physical parameters. Numerical computations are made to obtain the values of local Nusselt and Sherwood numbers. Impact of sundry parameters on the flow quantities is analyzed graphically. - Highlights: • Double stratified flow of Maxwell nanofluid with mixed convection is modeled. • Thermophoresis and Brownian motion effects are encountered. • Computations are made to obtain the solution expressions. • Numerical values of local Nusselt and Sherwood numbers are computed and examined.

  4. A comparison of dental caries levels in two communities with different oral health prevention strategies stratified in different social classes.

    Science.gov (United States)

    Sagheri, Darius; McLoughlin, Jacinta; Clarkson, John J

    2007-01-01

    To compare dental caries levels of schoolchildren stratified in different social classes whose domestic water supply had been fluoridated since birth (Dublin) with those living in an area where fluoridated salt was available (Freiburg). A representative, random sample of twelve-year-old children was examined and dental caries was recorded using World Health Organization criteria. A total of 699 twelve-year-old children were examined, 377 were children in Dublin and 322 in Freiburg. In Dublin the mean decayed, missing, and filled permanent teeth (DMFT) was 0.80 and in Freiburg it was 0.69. An examination of the distribution of the DMFT score revealed that its distribution is highly positively skewed. For this reason this study provides summary analyses based on medians and inter-quartile range and nonparametric rank sum tests. In both cities caries levels of children in social class 1 (highest) were considerably lower when compared with the other social classes regardless of the fluoride intervention model used. The caries levels showed a reduced disparity between children in social class 2 (medium) and 3 (lowest) in Dublin compared with those in social class 2 and 3 in Freiburg. The evidence from this study confirmed that water fluoridation has reduced the gap in dental caries experience between medium and lower social classes in Dublin compared with the greater difference in caries experience between the equivalent social classes in Freiburg. The results from this study established the important role of salt fluoridation where water fluoridation is not feasible.

  5. 1992 Annual Status Report: A Summary of Fish Data in Six Reaches of the Upper Mississippi River System

    National Research Council Canada - National Science Library

    Gutreuter, Steve

    1997-01-01

    The Long Term Resource Monitoring Program (LTRMP) completed 2,221 collections of fishes from stratified random and permanently fixed sampling locations in six study reaches of the Upper Mississippi River System during 1992...

  6. Natural Resources Management on Corps of Engineers Water Resources Development Projects: Practices, Challenges, and Perspectives on the Future

    National Research Council Canada - National Science Library

    Kasual, Richard

    1998-01-01

    Natural resources management on U.S. Army Corps of Engineers water resources development projects was documented from the responses of management personnel to a detailed questionnaire mailed to a stratified random sample of projects...

  7. Community genomics among stratified microbial assemblages in the ocean's interior

    DEFF Research Database (Denmark)

    DeLong, Edward F; Preston, Christina M; Mincer, Tracy

    2006-01-01

    Microbial life predominates in the ocean, yet little is known about its genomic variability, especially along the depth continuum. We report here genomic analyses of planktonic microbial communities in the North Pacific Subtropical Gyre, from the ocean's surface to near-sea floor depths. Sequence......, and host-viral interactions. Comparative genomic analyses of stratified microbial communities have the potential to provide significant insight into higher-order community organization and dynamics....

  8. Mathematical modeling of turbulent stratified flows. Application of liquid metal fast breeders

    Energy Technology Data Exchange (ETDEWEB)

    Villand, M; Grand, D [CEA-Service des Transferts Thermiques, Grenoble (France)

    1983-07-01

    Mathematical model of turbulent stratified flow was proposed under the following assumptions: Newtonian fluid; incompressible fluid; coupling between temperature and momentum fields according to Boussinesq approximation; two-dimensional invariance for translation or rotation; coordinates cartesian or curvilinear. Solutions obtained by the proposed method are presented.

  9. Experimental and numerical investigation of stratified gas-liquid flow in inclined circular pipes

    International Nuclear Information System (INIS)

    Faccini, J.L.H.; Sampaio, P.A.B. de; Botelho, M.H.D.S.; Cunha, M.V.; Cunha Filho, J.S.; Su, J.

    2012-01-01

    In this paper, a stratified gas-liquid flow is experimentally and numerically investigated. Two measurement techniques, namely an ultrasonic technique and a visualization technique, are applied on an inclined circular test section using a fast single transducer pulse-echo technique and a high-speed camera. A numerical model is employed to simulate the stratified gas-liquid flow, formed by a system of non-linear differential equations consisting of the Reynolds averaged Navier-Stokes equations with the κ-ω turbulence model. The test section used in this work is comprised mainly of a transparent circular pipe with inner diameter 1 inch, and inclination angles varying from -2.5 to -10.0 degrees. Numerical solutions are obtained for the liquid height as a function of inclination angles, and compared with our own experimental data. (author)

  10. Spatial scan statistics to assess sampling strategy of antimicrobial resistance monitoring programme

    DEFF Research Database (Denmark)

    Vieira, Antonio; Houe, Hans; Wegener, Henrik Caspar

    2009-01-01

    The collection and analysis of data on antimicrobial resistance in human and animal populations are important for establishing a baseline of the occurrence of resistance and for determining trends over time. In animals, targeted monitoring with a stratified sampling plan is normally used. However... sampled by the Danish Integrated Antimicrobial Resistance Monitoring and Research Programme (DANMAP), by identifying spatial clusters of samples and detecting areas with significantly high or low sampling rates. These analyses were performed for each year and for the total 5-year study period for all... by an antimicrobial monitoring program.

  11. Characterisation of the suspended particulate matter in a stratified estuarine environment employing complementary techniques

    Science.gov (United States)

    Thomas, Luis P.; Marino, Beatriz M.; Szupiany, Ricardo N.; Gallo, Marcos N.

    2017-09-01

    The ability to predict the sediment and nutrient circulation within estuarine waters is of significant economic and ecological importance. In these complex systems, flocculation is a dynamically active process that is directly affected by the prevalent environmental conditions. Consequently, the floc properties continuously change, which greatly complicates the characterisation of the suspended particulate matter (SPM). In the present study, three different techniques are combined in a stratified estuary under quiet weather conditions and with a low river discharge to search for a solution to this problem. The challenge is to obtain the concentration, size and flux of suspended elements through selected cross-sections using the method based on the simultaneous backscatter records of 1200 and 600 kHz ADCPs, isokinetic sampling data and LISST-25X measurements. The two-ADCP method is highly effective for determining the SPM size distributions in a non-intrusive way. The isokinetic sampling and the LISST-25X diffractometer offer point measurements at specific depths, which are especially useful for calibrating the ADCP backscatter intensity as a function of the SPM concentration and size, and for providing complementary information at sites where acoustic records are not available. Limitations and potentials of the techniques applied are discussed.

  12. Analysis of flame propagation phenomenon in simplified stratified charge conditions; Tanjunkasareta sojo kyukiba ni okeru kaen denpa gensho no kansatsu

    Energy Technology Data Exchange (ETDEWEB)

    Moriyoshi, Y; Morikawa, H [Chiba University, Chiba (Japan); Kamimoto, T [Tokyo Institute of Technology, Tokyo (Japan)

    1997-10-01

    Since the local inhomogeneity of mixture concentration inside the cylinder affects the combustion characteristics, basic research on combustion phenomena in stratified charge conditions is required. The authors have conducted experiments with a constant-volume chamber, which can simulate an idealized stratified charge field by using a removable partition, to obtain the combustion characteristics. Numerical calculations are also made using some combustion models. As a result, the important feature that the combustion speed is faster in the stratified condition than in the homogeneous condition can be predicted by the two-step reaction model. 4 refs., 8 figs.

  13. The assessment of female students' perceptions, practices and ...

    African Journals Online (AJOL)

    The main purpose of this study was to assess perceptions, practices and challenges of ... Stratified sampling followed by Simple random sampling (lottery method) technique ... The gathered information was analyzed using both quantitative and ... By Country · List All Titles · Free To Read Titles This Journal is Open Access.

  14. An Observational Study of Honey Bee Colony Winter Losses and Their Association with Varroa destructor, Neonicotinoids and Other Risk Factors

    NARCIS (Netherlands)

    Zee, van der R.; Gray, A.; Rijk, de T.C.

    2015-01-01

    This article presents results of an analysis of honey bee losses over the winter of 2011-2012 in the Netherlands, from a sample of 86 colonies, located at 43 apiaries. The apiaries were selected using spatially stratified random sampling. Colony winter loss data were collected and related to various
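Stratified random selection of apiaries, as used in this survey, first allocates the sample across strata and then draws at random within each. A minimal sketch of proportional allocation with simple random draws per stratum (the region names and counts below are invented for illustration):

```python
import random

def proportional_allocation(strata_sizes, n_total):
    """Allocate n_total across strata proportionally to stratum size,
    with largest-remainder rounding so allocations sum to n_total."""
    N = sum(strata_sizes.values())
    raw = {h: n_total * size / N for h, size in strata_sizes.items()}
    alloc = {h: int(r) for h, r in raw.items()}
    # hand leftover units to the strata with the largest fractional parts
    leftover = n_total - sum(alloc.values())
    for h in sorted(raw, key=lambda h: raw[h] - alloc[h], reverse=True)[:leftover]:
        alloc[h] += 1
    return alloc

def stratified_sample(strata, alloc):
    """Simple random sample of the allocated size within each stratum."""
    return {h: random.sample(units, alloc[h]) for h, units in strata.items()}

sizes = {"north": 500, "centre": 300, "south": 200}
alloc = proportional_allocation(sizes, 50)
print(alloc)  # → {'north': 25, 'centre': 15, 'south': 10}
```

Spatial stratification of this kind keeps the sample geographically representative, which matters when the outcome (here, winter loss) may vary by region.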

  15. Utililization of water

    African Journals Online (AJOL)

    User

    Abstract. This study was conducted to investigate the level of water resources utilization for small scale irrigation agriculture and to examine the food security of households of Seka woreda. A sample of two hundred-ten households were taken using stratified random sampling method. Questionnaire and observation were ...

  16. The Relationship between Happiness, Subjective Well-Being, Creativity and Job Performance of Primary School Teachers in Ramhormoz City

    Science.gov (United States)

    Jalali, Zohreh; Heidari, Alireza

    2016-01-01

    The research aimed to investigate the relationship between happiness, subjective well-being, creativity and job performance of primary school teachers in Ramhormoz City. Hence, a sample of 330 individuals was selected through random stratified sampling. The research tools included Oxford Happiness Inventory, Subjective Well-being Scale by Keyes…

  17. The Importance of Phonological Awareness for the Development of Early English Reading Skills among Bilingual Singaporean Kindergartners

    Science.gov (United States)

    Dixon, L. Quentin

    2010-01-01

    To examine the relationship between phonological awareness (PA) and English word-level reading among a multilingual sample, a random sample of 297 Singaporean kindergartners, stratified by ethnicity (169 Chinese, 65 Malay, and 63 Indian), were tested on their PA, receptive vocabulary, and word-level reading skills. Singaporean kindergartners are…

  18. Resource Utilisation and Curriculum Implementation in Community Colleges in Kenya

    Science.gov (United States)

    Kigwilu, Peter Changilwa; Akala, Winston Jumba

    2017-01-01

    The study investigated how Catholic-sponsored community colleges in Nairobi utilise the existing physical facilities and teaching and learning resources for effective implementation of Artisan and Craft curricula. The study adopted a mixed methods research design. Proportional stratified random sampling was used to sample 172 students and 18…

  19. The prevalence of body dysmorphic disorder among South African ...

    African Journals Online (AJOL)

    14 items ... The prevalence of BDD among student populations ranges from. 2.3% in Australia ... since students tend to have a younger mean age than participants ... Proportionate stratified random cluster sampling was used to select the sample. ... In order to ensure optimal ... variables was determined by calculating Pearson.

  20. Revisiting sample size: are big trials the answer?

    Science.gov (United States)

    Lurati Buse, Giovanna A L; Botto, Fernando; Devereaux, P J

    2012-07-18

    The superiority of the evidence generated in randomized controlled trials over observational data is not only conditional to randomization. Randomized controlled trials require proper design and implementation to provide a reliable effect estimate. Adequate random sequence generation, allocation implementation, analyses based on the intention-to-treat principle, and sufficient power are crucial to the quality of a randomized controlled trial. Power, or the probability of the trial to detect a difference when a real difference between treatments exists, strongly depends on sample size. The quality of orthopaedic randomized controlled trials is frequently threatened by a limited sample size. This paper reviews basic concepts and pitfalls in sample-size estimation and focuses on the importance of large trials in the generation of valid evidence.
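The dependence of power on sample size that this review emphasizes can be made concrete. Under the same two-group normal approximation used in standard sample-size formulas, power for a given per-group n is Φ(δ√(n/2) − z₁₋α/₂) for standardized effect δ. A short sketch with illustrative numbers (not taken from the paper):

```python
import math
from statistics import NormalDist

def power_two_sample(effect_size, n_per_group, alpha=0.05):
    """Approximate power of a two-sided two-sample z-test
    for a standardized effect size (Cohen's d)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    shift = effect_size * math.sqrt(n_per_group / 2)
    return NormalDist().cdf(shift - z_alpha)

# Power climbs steeply with n, then flattens: an underpowered small trial
# has a low chance of detecting a real medium-sized effect (d = 0.5).
for n in (20, 63, 200):
    print(n, round(power_two_sample(0.5, n), 2))
```

The flattening of this curve is why very large trials mainly buy precision for small effects; the penalty for a too-small trial is far steeper.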

  1. 1999 Survey of Active Duty Personnel: Administration, Datasets, and Codebook. Appendix G: Frequency and Percentage Distributions for Variables in the Survey Analysis Files

    National Research Council Canada - National Science Library

    Wright, Laverne

    2000-01-01

    ... A) was administered from August 1999 through December 1999. It was fielded to a nonproportional stratified, single-stage random sample of 66,040 Service members of DoD and the Coast Guard. The (weighted...

  2. Distribution of peak expiratory flow variability by age, gender and smoking habits in a random population sample aged 20-70 yrs

    NARCIS (Netherlands)

    Boezen, H M; Schouten, J. P.; Postma, D S; Rijcken, B

    1994-01-01

    Peak expiratory flow (PEF) variability can be considered as an index of bronchial lability. Population studies on PEF variability are few. The purpose of the current paper is to describe the distribution of PEF variability in a random population sample of adults with a wide age range (20-70 yrs),

  3. Dominance of a clonal green sulfur bacterial population in a stratified lake

    DEFF Research Database (Denmark)

    Gregersen, Lea H; Habicht, Kirsten S; Peduzzi, Sandro

    2009-01-01

    surveys using FISH cell counting and population multilocus sequence typing [clone library sequence analysis of the small subunit (SSU) rRNA locus and two loci involved in photosynthesis in GSB: fmoA and csmCA]. All bacterial populations clearly stratified according to water column chemistry. The GSB...

  4. LES of stratified-wavy flows using novel near-interface treatment

    Science.gov (United States)

    Karnik, Aditya; Kahouadji, Lyes; Chergui, Jalel; Juric, Damir; Shin, Seungwon; Matar, Omar K.

    2017-11-01

    The pressure drop in horizontal stratified wavy flows is influenced by interfacial shear stress. The near-interface behavior of the lighter phase is akin to that near a moving wall. We employ a front-tracking code, Blue, to simulate and capture the near-interface behaviour of both phases. Blue uses a modified Smagorinsky LES model incorporating a novel near-interface treatment for the sub-grid viscosity, which is influenced by damping due to the wall-like interface, and enhancement of the turbulent kinetic energy (TKE) due to the interfacial waves. Simulations are carried out for both air-water and oil-water stratified configurations to demonstrate the applicability of the present method. The mean velocities and tangential Reynolds stresses are compared with experiments for both configurations. At the higher Re, the waves penetrate well into the buffer region of the boundary layer above the interface thus altering its dynamics. Previous attempts to capture the secondary structures associated with such flows using RANS or standard LES methodologies have been unsuccessful. The ability of the present method to reproduce these structures is due to the correct estimation of the near-interface TKE governing energy transfer from the normal to tangential directions. EPSRC, UK, MEMPHIS program Grant (EP/K003976/1), RAEng Research Chair (OKM).

  5. Actual distribution of Cronobacter spp. in industrial batches of powdered infant formula and consequences for performance of sampling strategies.

    Science.gov (United States)

    Jongenburger, I; Reij, M W; Boer, E P J; Gorris, L G M; Zwietering, M H

    2011-11-15

    , stratified random sampling improved the probability of detecting the heterogeneous contamination. Copyright © 2011 Elsevier B.V. All rights reserved.
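The advantage reported here, that stratified random sampling detects a heterogeneous (spatially clustered) contamination more often than simple random sampling, can be illustrated with a small Monte Carlo sketch. The batch layout and contamination numbers below are invented for illustration, not taken from the study:

```python
import random

random.seed(42)

N, N_STRATA = 1000, 10            # hypothetical batch of 1000 units, 10 equal strata
STRATUM = N // N_STRATA
contaminated = set(range(50))     # contamination clustered inside the first stratum

def detects(sample):
    return any(u in contaminated for u in sample)

def simple_random(n):
    return random.sample(range(N), n)

def stratified(n_per_stratum=1):
    # one random unit drawn from every stratum: same total n, full spatial coverage
    return [u for s in range(N_STRATA)
            for u in random.sample(range(s * STRATUM, (s + 1) * STRATUM), n_per_stratum)]

trials = 20000
srs_rate = sum(detects(simple_random(10)) for _ in range(trials)) / trials
strat_rate = sum(detects(stratified()) for _ in range(trials)) / trials
print(f"SRS detection: {srs_rate:.2f}  stratified detection: {strat_rate:.2f}")
```

Because stratification forces a draw from every region of the batch, the contaminated region is always probed, which is exactly why it outperforms simple random sampling when contamination is localized rather than homogeneous.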

  6. Modification of Measures of Acute Kidney Injury to Risk Stratify Combat Casualties

    Science.gov (United States)

    2017-08-26

    Report type: Poster, 08/26/2017. Title and subtitle: Modification of Measures of Acute Kidney Injury to Risk Stratify Combat Casualties. ... profiles and potential future conflicts, identifying acute kidney injury (AKI) early can help us determine the need for rapidity of evacuation

  7. ORIGINAL ARTICLES

    African Journals Online (AJOL)

    means simple. People ... items, and asked respondents to estimate answers if such data were not ... hospital units in each province serving this population. A stratified random sample of hospital units and clinics was then .... Optimal hospital.

  8. Dynamical implications of sample shape for avalanches in 2-dimensional random-field Ising model with saw-tooth domain wall

    Science.gov (United States)

    Tadić, Bosiljka

    2018-03-01

    We study the dynamics of a built-in domain wall (DW) in 2-dimensional disordered ferromagnets with different sample shapes using a random-field Ising model on a square lattice rotated by 45 degrees. The saw-tooth DW of length Lx is created along one side and swept through the sample by slow ramping of the external field until the complete magnetisation reversal and the wall annihilation at the open top boundary at a distance Ly. By fixing the number of spins N = Lx × Ly = 10^6 and the random-field distribution at a value above the critical disorder, we vary the ratio of the DW length to the annihilation distance in the range Lx/Ly ∈ [1/16, 16]. Periodic boundary conditions are applied in the y-direction, so that these ratios comprise different samples, i.e., surfaces of cylinders with changing perimeter Lx and height Ly. We analyse the avalanches of the DW slips between successive field updates, and the multifractal structure of the magnetisation fluctuation time series. Our main findings are that the domain-wall lengths materialised in different sample shapes have an impact on the dynamics at all scales. Moreover, the domain-wall motion at the beginning of the hysteresis loop (HLB) probes the disorder effects, resulting in fluctuations that are significantly different from the large avalanches in the central part of the loop (HLC), where the strong fields dominate. Specifically, the fluctuations in the HLB exhibit a wide multifractal spectrum, which shifts towards higher values of the exponents when the DW length is reduced. The distributions of the avalanches in these segments of the loops obey power-law decay with exponential cutoffs, with exponents firmly in the mean-field universality class for long DW. In contrast, the avalanches in the HLC obey a Tsallis density distribution with power-law tails, which indicate new categories of scale-invariant behaviour for different ratios Lx/Ly. The large fluctuations in the HLC, on the other

  9. Rapid shelf-wide cooling response of a stratified coastal ocean to hurricanes.

    Science.gov (United States)

    Seroka, Greg; Miles, Travis; Xu, Yi; Kohut, Josh; Schofield, Oscar; Glenn, Scott

    2017-06-01

    Large uncertainty in the predicted intensity of tropical cyclones (TCs) persists compared to the steadily improving skill in the predicted TC tracks. This intensity uncertainty has its most significant implications in the coastal zone, where TC impacts to populated shorelines are greatest. Recent studies have demonstrated that rapid ahead-of-eye-center cooling of a stratified coastal ocean can have a significant impact on hurricane intensity forecasts. Using observation-validated, high-resolution ocean modeling, the stratified coastal ocean cooling processes observed in two U.S. Mid-Atlantic hurricanes were investigated: Hurricane Irene (2011)-with an inshore Mid-Atlantic Bight (MAB) track during the late summer stratified coastal ocean season-and Tropical Storm Barry (2007)-with an offshore track during early summer. For both storms, the critical ahead-of-eye-center depth-averaged force balance across the entire MAB shelf included an onshore wind stress balanced by an offshore pressure gradient. This resulted in onshore surface currents opposing offshore bottom currents that enhanced surface to bottom current shear and turbulent mixing across the thermocline, resulting in the rapid cooling of the surface layer ahead-of-eye-center. Because the same baroclinic and mixing processes occurred for two storms on opposite ends of the track and seasonal stratification envelope, the response appears robust. It will be critical to forecast these processes and their implications for a wide range of future storms using realistic 3-D coupled atmosphere-ocean models to lower the uncertainty in predictions of TC intensities and impacts and enable coastal populations to better respond to increasing rapid intensification threats in an era of rising sea levels.

  10. Rapid shelf‐wide cooling response of a stratified coastal ocean to hurricanes

    Science.gov (United States)

    Miles, Travis; Xu, Yi; Kohut, Josh; Schofield, Oscar; Glenn, Scott

    2017-01-01

    Abstract Large uncertainty in the predicted intensity of tropical cyclones (TCs) persists compared to the steadily improving skill in the predicted TC tracks. This intensity uncertainty has its most significant implications in the coastal zone, where TC impacts to populated shorelines are greatest. Recent studies have demonstrated that rapid ahead‐of‐eye‐center cooling of a stratified coastal ocean can have a significant impact on hurricane intensity forecasts. Using observation‐validated, high‐resolution ocean modeling, the stratified coastal ocean cooling processes observed in two U.S. Mid‐Atlantic hurricanes were investigated: Hurricane Irene (2011)—with an inshore Mid‐Atlantic Bight (MAB) track during the late summer stratified coastal ocean season—and Tropical Storm Barry (2007)—with an offshore track during early summer. For both storms, the critical ahead‐of‐eye‐center depth‐averaged force balance across the entire MAB shelf included an onshore wind stress balanced by an offshore pressure gradient. This resulted in onshore surface currents opposing offshore bottom currents that enhanced surface to bottom current shear and turbulent mixing across the thermocline, resulting in the rapid cooling of the surface layer ahead‐of‐eye‐center. Because the same baroclinic and mixing processes occurred for two storms on opposite ends of the track and seasonal stratification envelope, the response appears robust. It will be critical to forecast these processes and their implications for a wide range of future storms using realistic 3‐D coupled atmosphere‐ocean models to lower the uncertainty in predictions of TC intensities and impacts and enable coastal populations to better respond to increasing rapid intensification threats in an era of rising sea levels. PMID:28944132

  11. Stratified prevention: opportunities and limitations. Report on the 1st interdisciplinary cardiovascular workshop in Augsburg.

    Science.gov (United States)

    Kirchhof, Gregor; Lindner, Josef Franz; Achenbach, Stephan; Berger, Klaus; Blankenberg, Stefan; Fangerau, Heiner; Gimpel, Henner; Gassner, Ulrich M; Kersten, Jens; Magnus, Dorothea; Rebscher, Herbert; Schunkert, Heribert; Rixen, Stephan; Kirchhof, Paulus

    2018-03-01

    Sufficient exercise and sleep, a balanced diet, moderate alcohol consumption and a good approach to handling stress have been known since the Middle Ages as lifestyles that protect health and longevity. This traditional prevention quintet, turned into a sextet by smoking cessation, has been the basis of the "preventive personality" that formed in the twentieth century. Recent analyses of big data sets including genomic and physiological measurements have unleashed novel opportunities to estimate individual health risks with unprecedented accuracy, making it possible to target preventive interventions to persons at high risk while sparing those in whom preventive measures may not be needed or could even be harmful. To fully grasp these opportunities for modern preventive medicine, the established healthy lifestyles require supplementation by stratified prevention. The opportunities of these developments for life and health contrast with justified concerns: a "surveillance society", able to predict individual behaviour based on big data, threatens individual freedom and jeopardises equality. Social insurance law and the new German Disease Prevention Act (Präventionsgesetz) rightly stress the need for research to underpin stratified prevention which is accessible to all, ethical, effective, and evidence based. An ethical and acceptable development of stratified prevention needs to start with autonomous individuals who control and understand all information pertaining to their health. This creates a mandate for lifelong health education, enabled in an individualised form by digital technology. Stratified prevention furthermore requires the evidence-based development of a new taxonomy of cardiovascular diseases that reflects disease mechanisms. Such interdisciplinary research needs broad support from society and a better use of biosamples and data sets within an updated research governance framework.

  12. Simultaneous escaping of explicit and hidden free energy barriers: application of the orthogonal space random walk strategy in generalized ensemble based conformational sampling.

    Science.gov (United States)

    Zheng, Lianqing; Chen, Mengen; Yang, Wei

    2009-06-21

    To overcome the pseudoergodicity problem, conformational sampling can be accelerated via generalized ensemble methods, e.g., through the realization of random walks along prechosen collective variables, such as spatial order parameters, energy scaling parameters, or even system temperatures or pressures, etc. As usually observed in generalized ensemble simulations, hidden barriers are likely to exist in the space perpendicular to the collective variable direction, and these residual free energy barriers can greatly degrade the sampling efficiency. This sampling issue is particularly severe when the collective variable is defined in a low-dimensional subset of the target system; then the "Hamiltonian lagging" problem, which reflects the fact that necessary structural relaxation falls behind the move of the collective variable, is likely to occur. To overcome this problem in equilibrium conformational sampling, we adopted the orthogonal space random walk (OSRW) strategy, which was originally developed in the context of free energy simulation [L. Zheng, M. Chen, and W. Yang, Proc. Natl. Acad. Sci. U.S.A. 105, 20227 (2008)]. Thereby, generalized ensemble simulations can simultaneously escape both the explicit barriers along the collective variable direction and the hidden barriers that are strongly coupled with the collective variable move. As demonstrated in our model studies, the present OSRW-based generalized ensemble treatments show improved sampling capability over the corresponding classical generalized ensemble treatments.

  13. Compressive sampling of polynomial chaos expansions: Convergence analysis and sampling strategies

    International Nuclear Information System (INIS)

    Hampton, Jerrad; Doostan, Alireza

    2015-01-01

    Sampling orthogonal polynomial bases via Monte Carlo is of interest for uncertainty quantification of models with random inputs, using Polynomial Chaos (PC) expansions. It is known that bounding a probabilistic parameter, referred to as coherence, yields a bound on the number of samples necessary to identify coefficients in a sparse PC expansion via solution to an ℓ1-minimization problem. Utilizing results for orthogonal polynomials, we bound the coherence parameter for polynomials of Hermite and Legendre type under their respective natural sampling distribution. In both polynomial bases we identify an importance sampling distribution which yields a bound with weaker dependence on the order of the approximation. For more general orthonormal bases, we propose the coherence-optimal sampling: a Markov Chain Monte Carlo sampling, which directly uses the basis functions under consideration to achieve a statistical optimality among all sampling schemes with identical support. We demonstrate these different sampling strategies numerically in both high-order and high-dimensional, manufactured PC expansions. In addition, the quality of each sampling method is compared in the identification of solutions to two differential equations, one with a high-dimensional random input and the other with a high-order PC expansion. In both cases, the coherence-optimal sampling scheme leads to similar or considerably improved accuracy.

  14. Compressive sampling of polynomial chaos expansions: Convergence analysis and sampling strategies

    Science.gov (United States)

    Hampton, Jerrad; Doostan, Alireza

    2015-01-01

    Sampling orthogonal polynomial bases via Monte Carlo is of interest for uncertainty quantification of models with random inputs, using Polynomial Chaos (PC) expansions. It is known that bounding a probabilistic parameter, referred to as coherence, yields a bound on the number of samples necessary to identify coefficients in a sparse PC expansion via solution to an ℓ1-minimization problem. Utilizing results for orthogonal polynomials, we bound the coherence parameter for polynomials of Hermite and Legendre type under their respective natural sampling distribution. In both polynomial bases we identify an importance sampling distribution which yields a bound with weaker dependence on the order of the approximation. For more general orthonormal bases, we propose the coherence-optimal sampling: a Markov Chain Monte Carlo sampling, which directly uses the basis functions under consideration to achieve a statistical optimality among all sampling schemes with identical support. We demonstrate these different sampling strategies numerically in both high-order and high-dimensional, manufactured PC expansions. In addition, the quality of each sampling method is compared in the identification of solutions to two differential equations, one with a high-dimensional random input and the other with a high-order PC expansion. In both cases, the coherence-optimal sampling scheme leads to similar or considerably improved accuracy.

  15. Knowledge of HIV/AIDS and Risk Behaviour among Students of ...

    African Journals Online (AJOL)

    This study examined the knowledge and risk behaviours on HIV/AIDS of students in colleges of Education in Osun State. The study sampled 1600 students (male and female) from two colleges of Education. A descriptive survey was adopted for the study using stratified random sampling techniques. A self-developed ...
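    A stratified random sample with proportional allocation, of the kind used in studies like this one, can be sketched as follows (the strata names and sizes are hypothetical, not the study's actual sampling frame):

```python
import random

def stratified_sample(frame, n, seed=42):
    """Stratified random sampling with proportional allocation:
    stratum h receives n * N_h / N of the n sample slots."""
    rng = random.Random(seed)
    N = sum(len(units) for units in frame.values())
    sample, remainders, taken = {}, [], 0
    for stratum, units in frame.items():
        quota = n * len(units) / N
        k = int(quota)                       # whole-number part of the quota
        sample[stratum] = rng.sample(units, k)
        remainders.append((quota - k, stratum))
        taken += k
    # hand any leftover slots to the largest fractional remainders
    for _, stratum in sorted(remainders, reverse=True)[: n - taken]:
        pool = [u for u in frame[stratum] if u not in sample[stratum]]
        sample[stratum].append(rng.choice(pool))
    return sample

frame = {"college_A": list(range(1000)), "college_B": list(range(1000, 1600))}
sample = stratified_sample(frame, 160)
```

Proportional allocation keeps each stratum's share of the sample equal to its share of the population, which is what lets stratum-level estimates be combined without reweighting.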

  16. Soil map disaggregation improved by soil-landscape relationships, area-proportional sampling and random forest implementation

    DEFF Research Database (Denmark)

    Møller, Anders Bjørn; Malone, Brendan P.; Odgers, Nathan

    Detailed soil information is often needed to support agricultural practices, environmental protection and policy decisions. Several digital approaches can be used to map soil properties based on field observations. When soil observations are sparse or missing, an alternative approach is to disaggregate existing conventional soil maps. At present, the DSMART algorithm represents the most sophisticated approach for disaggregating conventional soil maps (Odgers et al., 2014). The algorithm relies on classification trees trained from resampled points, which are assigned classes according... implementation generally improved the algorithm's ability to predict the correct soil class. The implementation of soil-landscape relationships and area-proportional sampling generally increased the calculation time, while the random forest implementation reduced the calculation time. In the most successful...

  17. An analysis of direct-contact condensation in horizontal cocurrent stratified flow of steam and cold water

    International Nuclear Information System (INIS)

    Lee, Suk Ho; Kim, Hho Jung

    1992-01-01

    The physical benchmark problem of direct-contact condensation in horizontal cocurrent stratified flow was analyzed using the RELAP5/MOD2 and /MOD3 one-dimensional models. The analysis was performed for the Northwestern experiments, which involved condensing steam/water flow in a rectangular channel. The study showed that the RELAP5 interfacial heat transfer model, under the horizontal stratified flow regime, predicted the condensation rate well, though the interfacial heat transfer area was underpredicted. However, some discrepancies in water layer thickness and local heat transfer coefficient relative to the experimental results were found, especially when there was a wavy interface, and agreement was obtained only within a limited range. (Author)

  18. Comparison of Address-based Sampling and Random-digit Dialing Methods for Recruiting Young Men as Controls in a Case-Control Study of Testicular Cancer Susceptibility

    OpenAIRE

    Clagett, Bartholt; Nathanson, Katherine L.; Ciosek, Stephanie L.; McDermoth, Monique; Vaughn, David J.; Mitra, Nandita; Weiss, Andrew; Martonik, Rachel; Kanetsky, Peter A.

    2013-01-01

    Random-digit dialing (RDD) using landline telephone numbers is the historical gold standard for control recruitment in population-based epidemiologic research. However, increasing cell-phone usage and diminishing response rates suggest that the effectiveness of RDD in recruiting a random sample of the general population, particularly for younger target populations, is decreasing. In this study, we compared landline RDD with alternative methods of control recruitment, including RDD using cell-...

  19. On the Exploitation of Sensitivity Derivatives for Improving Sampling Methods

    Science.gov (United States)

    Cao, Yanzhao; Hussaini, M. Yousuff; Zang, Thomas A.

    2003-01-01

    Many application codes, such as finite-element structural analyses and computational fluid dynamics codes, are capable of producing many sensitivity derivatives at a small fraction of the cost of the underlying analysis. This paper describes a simple variance reduction method that exploits such inexpensive sensitivity derivatives to increase the accuracy of sampling methods. Three examples, including a finite-element structural analysis of an aircraft wing, are provided that illustrate an order of magnitude improvement in accuracy for both Monte Carlo and stratified sampling schemes.
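    The idea of using a cheap sensitivity derivative to reduce sampling variance can be sketched as a first-order Taylor control variate (an illustration with assumed toy quantities, not the paper's finite-element wing example): the linear surrogate built from the derivative has a known mean, so its sampled deviation can be subtracted from the Monte Carlo estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Model output f with random input X ~ N(mu, sigma^2).  The sensitivity
# derivative df/dx at mu gives a cheap linear surrogate whose mean is known.
f = np.exp
mu, sigma = 0.0, 0.3
dfdx_at_mu = np.exp(mu)                      # sensitivity derivative (exact here)

x = rng.normal(mu, sigma, 20000)
fx = f(x)
surrogate = f(mu) + dfdx_at_mu * (x - mu)    # E[surrogate] = f(mu) exactly

plain_mc = fx.mean()                         # plain Monte Carlo estimate
cv = (fx - surrogate).mean() + f(mu)         # control-variate estimate

var_plain = fx.var()
var_cv = (fx - surrogate).var()              # variance actually sampled by cv
```

Because the residual f(x) minus its tangent line is second order in (x - mu), its variance is far smaller than that of f itself, which is the mechanism behind the order-of-magnitude accuracy gains reported; the same subtraction works inside each stratum of a stratified scheme.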

  20. National Coral Reef Monitoring Program: Socioeconomic surveys of human use, knowledge, attitudes, and perceptions in South Florida from 2014-01-20 to 2014-07-03 (NCEI Accession 0161541)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data in this file comes from a survey of adult residents in South Florida. The survey results were obtained from a random stratified sample of households in...

  1. The Impact of Supervision and Mentorship Practices on Perceived ...

    African Journals Online (AJOL)

    trainees and beginning teachers acquired as the impact that practicum supervision and mentorship have had on them. Stratified and simple random sampling procedures were used to select 446-second year and third year teacher trainees and ...

  2. Ekhosuehi. et al

    African Journals Online (AJOL)

    respondents from a total population of 37,083 persons in the University of Benin, which consists of academic ... stratified random sampling. As part of .... model λ. By optimal state sequence, we mean ..... Maximum Likelihood Estimator for Gen-.

  3. Risk Factors for Emergency Department Short Time Readmission in Stratified Population

    Directory of Open Access Journals (Sweden)

    Ariadna Besga

    2015-01-01

    Full Text Available Background. Emergency department (ED) readmissions are considered an indicator of healthcare quality that is particularly relevant in older adults. The primary objective of this study was to identify key factors for predicting patients returning to the ED within 30 days of being discharged. Methods. We analysed patients who attended our ED in June 2014, stratified into four groups based on the Kaiser pyramid. We collected data on more than 100 variables per case, including demographic and clinical characteristics and drug treatments. We identified the variables with the highest discriminating power to predict ED readmission and constructed classifiers using machine learning methods to provide predictions. Results. We assessed classifier performance in distinguishing between patients who were and were not readmitted within 30 days, in terms of average accuracy (AC). The variables with the greatest discriminating power were age, comorbidity, reasons for consultation, social factors, and drug treatments. Conclusions. It is possible to predict readmissions in stratified groups with high accuracy and to identify the most important factors influencing the event. Therefore, it will be possible to develop interventions to improve the quality of care provided to ED patients.

  4. Decision tree analysis to stratify risk of de novo non-melanoma skin cancer following liver transplantation.

    Science.gov (United States)

    Tanaka, Tomohiro; Voigt, Michael D

    2018-03-01

    Non-melanoma skin cancer (NMSC) is the most common de novo malignancy in liver transplant (LT) recipients; it behaves more aggressively and it increases mortality. We used decision tree analysis to develop a tool to stratify and quantify risk of NMSC in LT recipients. We performed Cox regression analysis to identify which predictive variables to enter into the decision tree analysis. Data were from the Organ Procurement Transplant Network (OPTN) STAR files of September 2016 (n = 102984). NMSC developed in 4556 of the 105984 recipients, a mean of 5.6 years after transplant. The 5/10/20-year rates of NMSC were 2.9/6.3/13.5%, respectively. Cox regression identified male gender, Caucasian race, age, body mass index (BMI) at LT, and sirolimus use as key predictive or protective factors for NMSC. These factors were entered into a decision tree analysis. The final tree stratified non-Caucasians as low risk (0.8%), and Caucasian males > 47 years, BMI ... The decision tree model accurately stratifies the risk of developing NMSC in the long-term after LT.
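    The splitting step behind such a decision tree can be sketched as a single CART-style threshold search minimizing Gini impurity. The cohort below is synthetic, with an assumed risk jump at age 47 purely for illustration; it is not the OPTN data or the authors' model:

```python
import numpy as np

def best_gini_split(x, y):
    """One CART-style split: threshold on x minimising the weighted Gini
    impurity of a binary outcome y (1 = event, 0 = no event)."""
    def gini(labels):
        if labels.size == 0:
            return 0.0
        p = labels.mean()
        return 2.0 * p * (1.0 - p)
    best_thr, best_score = None, np.inf
    for thr in np.unique(np.round(x)):        # 1-unit threshold grid
        left, right = y[x <= thr], y[x > thr]
        score = (left.size * gini(left) + right.size * gini(right)) / y.size
        if score < best_score:
            best_thr, best_score = thr, score
    return best_thr

rng = np.random.default_rng(3)
age = rng.uniform(20, 75, 2000)
risk = np.where(age > 47, 0.40, 0.05)         # assumed risk jump at age 47
event = (rng.random(2000) < risk).astype(int)
thr = best_gini_split(age, event)
```

A full tree applies this search recursively over all candidate variables, which is how cut-points such as an age threshold end up defining the risk strata.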

  5. A Sample-Based Forest Monitoring Strategy Using Landsat, AVHRR and MODIS Data to Estimate Gross Forest Cover Loss in Malaysia between 1990 and 2005

    Directory of Open Access Journals (Sweden)

    Peter Potapov

    2013-04-01

    Full Text Available Insular Southeast Asia is a hotspot of humid tropical forest cover loss. A sample-based monitoring approach quantifying forest cover loss from Landsat imagery was implemented to estimate gross forest cover loss for two eras, 1990–2000 and 2000–2005. For each time interval, a probability sample of 18.5 km × 18.5 km blocks was selected, and pairs of Landsat images acquired per sample block were interpreted to quantify forest cover area and gross forest cover loss. Stratified random sampling was implemented for 2000–2005 with MODIS-derived forest cover loss used to define the strata. A probability proportional to x (πpx) design was implemented for 1990–2000 with AVHRR-derived forest cover loss used as the x variable to increase the likelihood of including forest loss area in the sample. The estimated annual gross forest cover loss for Malaysia was 0.43 Mha/yr (SE = 0.04) during 1990–2000 and 0.64 Mha/yr (SE = 0.055) during 2000–2005. Our use of the πpx sampling design represents a first practical trial of this design for sampling satellite imagery. Although the design performed adequately in this study, a thorough comparative investigation of the πpx design relative to other sampling strategies is needed before general design recommendations can be put forth.
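    The probability-proportional-to-x idea can be sketched with a Hansen-Hurwitz estimator under with-replacement pps draws. This is a simplified stand-in for the paper's πpx design (which samples without replacement); the population, the auxiliary variable, and all sizes are assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

# Population of sample blocks: x is an AVHRR-style auxiliary signal and the
# target y (e.g. forest loss area) is roughly proportional to it.
n_blocks = 500
x = rng.gamma(2.0, 50.0, n_blocks) + 1.0
y = x * (0.8 + 0.1 * rng.random(n_blocks))
true_total = y.sum()

p = x / x.sum()                     # selection probability proportional to x

def pps_total(n_draws):
    """Hansen-Hurwitz estimator of the population total under
    with-replacement pps sampling."""
    idx = rng.choice(n_blocks, size=n_draws, p=p)
    return (y[idx] / p[idx]).mean()

est = pps_total(400)
```

Because y/p is nearly constant when y tracks x, the estimator's variance is small, which is exactly why oversampling high-x blocks pays off when the auxiliary signal correlates with forest loss.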

  6. Shade Sails and Passive Recreation in Public Parks of Melbourne and Denver: A Randomized Intervention

    Science.gov (United States)

    English, Dallas R.; Buller, Mary Klein; Simmons, Jody; Chamberlain, James A.; Wakefield, Melanie; Dobbinson, Suzanne

    2017-01-01

    Objectives. To test whether shade sails will increase the use of passive recreation areas (PRAs). Methods. We conducted a stratified randomized pretest–posttest controlled design study in Melbourne, Australia, and Denver, Colorado, in 2010 to 2014. We randomized a sample of 144 public parks with 2 PRAs in full sun in a 1:3 ratio to treatment or control. Shade sails were built at 1 PRA per treatment park. The outcome was any use of the study PRA (n = 576 pretest and n = 576 posttest observations; 100% follow-up). Results. Compared with control PRAs (adjusted probability of use: pretest = 0.14, posttest = 0.17), use of treatment PRAs (pretest = 0.10, posttest = 0.32) was higher at posttest (odds ratio [OR] = 3.91; 95% confidence interval [CI] = 1.71, 8.94). Shade increased use of PRAs in Denver (control: pretest = 0.18, posttest = 0.19; treatment: pretest = 0.16, posttest = 0.47) more than Melbourne (control: pretest = 0.11, posttest = 0.14; shaded: pretest = 0.06, posttest = 0.19; OR = 2.98; 95% CI = 1.09, 8.14). Conclusions. Public investment in shade is warranted for skin cancer prevention and may be especially useful in the United States. Trial Registration. Clinicaltrials.gov identifier NCT02971709. PMID:29048958

  7. Shade Sails and Passive Recreation in Public Parks of Melbourne and Denver: A Randomized Intervention.

    Science.gov (United States)

    Buller, David B; English, Dallas R; Buller, Mary Klein; Simmons, Jody; Chamberlain, James A; Wakefield, Melanie; Dobbinson, Suzanne

    2017-12-01

    To test whether shade sails will increase the use of passive recreation areas (PRAs). We conducted a stratified randomized pretest-posttest controlled design study in Melbourne, Australia, and Denver, Colorado, in 2010 to 2014. We randomized a sample of 144 public parks with 2 PRAs in full sun in a 1:3 ratio to treatment or control. Shade sails were built at 1 PRA per treatment park. The outcome was any use of the study PRA (n = 576 pretest and n = 576 posttest observations; 100% follow-up). Compared with control PRAs (adjusted probability of use: pretest = 0.14, posttest = 0.17), use of treatment PRAs (pretest = 0.10, posttest = 0.32) was higher at posttest (odds ratio [OR] = 3.91; 95% confidence interval [CI] = 1.71, 8.94). Shade increased use of PRAs in Denver (control: pretest = 0.18, posttest = 0.19; treatment: pretest = 0.16, posttest = 0.47) more than Melbourne (control: pretest = 0.11, posttest = 0.14; shaded: pretest = 0.06, posttest = 0.19; OR = 2.98; 95% CI = 1.09, 8.14). Public investment in shade is warranted for skin cancer prevention and may be especially useful in the United States. Clinicaltrials.gov identifier NCT02971709.

  8. Using Random Forest to Improve the Downscaling of Global Livestock Census Data

    Science.gov (United States)

    Nicolas, Gaëlle; Robinson, Timothy P.; Wint, G. R. William; Conchedda, Giulia; Cinardi, Giuseppina; Gilbert, Marius

    2016-01-01

    Large scale, high-resolution global data on farm animal distributions are essential for spatially explicit assessments of the epidemiological, environmental and socio-economic impacts of the livestock sector. This has been the major motivation behind the development of the Gridded Livestock of the World (GLW) database, which has been extensively used since its first publication in 2007. The database relies on a downscaling methodology whereby census counts of animals in sub-national administrative units are redistributed at the level of grid cells as a function of a series of spatial covariates. The recent upgrade of GLW1 to GLW2 involved automating the processing, improvement of input data, and downscaling at a spatial resolution of 1 km per cell (5 km per cell in the earlier version). The underlying statistical methodology, however, remained unchanged. In this paper, we evaluate new methods to downscale census data with a higher accuracy and increased processing efficiency. Two main factors were evaluated, based on sample census datasets of cattle in Africa and chickens in Asia. First, we implemented and evaluated Random Forest models (RF) instead of stratified regressions. Second, we investigated whether models that predicted the number of animals per rural person (per capita) could provide better downscaled estimates than the previous approach that predicted absolute densities (animals per km2). RF models consistently provided better predictions than the stratified regressions for both continents and species. The benefit of per capita over absolute density models varied according to the species and continent. In addition, different technical options were evaluated to reduce the processing time while maintaining their predictive power. Future GLW runs (GLW 3.0) will apply the new RF methodology with optimized modelling options. 
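    The downscaling step that GLW-style methods share, redistributing each administrative unit's census total over its grid cells in proportion to a model-predicted density surface, can be sketched as follows (toy units and densities; not the GLW code, its covariates, or its RF models):

```python
import numpy as np

def downscale(census_totals, unit_of_cell, predicted_density, cell_area):
    """Redistribute each administrative unit's census count over its grid
    cells in proportion to a predicted density surface, preserving totals."""
    weights = predicted_density * cell_area          # expected count per cell
    counts = np.zeros_like(weights, dtype=float)
    for u, total in enumerate(census_totals):
        in_unit = unit_of_cell == u
        w = weights[in_unit]
        counts[in_unit] = total * w / w.sum()
    return counts

# toy example: two admin units covering six grid cells
unit_of_cell = np.array([0, 0, 0, 1, 1, 1])
density = np.array([2.0, 1.0, 1.0, 5.0, 3.0, 2.0])   # model output (head/km2)
area = np.ones(6)                                    # km2 per cell
cells = downscale([400.0, 1000.0], unit_of_cell, density, area)
```

Whatever model produces the density surface (stratified regression, RF, or a per-capita model multiplied by rural population), this normalisation step guarantees the gridded counts always sum back to the reported census totals.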
The potential benefit of per capita models will need to be further investigated with a better distinction between rural and agricultural

  9. Using Random Forest to Improve the Downscaling of Global Livestock Census Data.

    Directory of Open Access Journals (Sweden)

    Gaëlle Nicolas

    Full Text Available Large scale, high-resolution global data on farm animal distributions are essential for spatially explicit assessments of the epidemiological, environmental and socio-economic impacts of the livestock sector. This has been the major motivation behind the development of the Gridded Livestock of the World (GLW database, which has been extensively used since its first publication in 2007. The database relies on a downscaling methodology whereby census counts of animals in sub-national administrative units are redistributed at the level of grid cells as a function of a series of spatial covariates. The recent upgrade of GLW1 to GLW2 involved automating the processing, improvement of input data, and downscaling at a spatial resolution of 1 km per cell (5 km per cell in the earlier version. The underlying statistical methodology, however, remained unchanged. In this paper, we evaluate new methods to downscale census data with a higher accuracy and increased processing efficiency. Two main factors were evaluated, based on sample census datasets of cattle in Africa and chickens in Asia. First, we implemented and evaluated Random Forest models (RF instead of stratified regressions. Second, we investigated whether models that predicted the number of animals per rural person (per capita could provide better downscaled estimates than the previous approach that predicted absolute densities (animals per km2. RF models consistently provided better predictions than the stratified regressions for both continents and species. The benefit of per capita over absolute density models varied according to the species and continent. In addition, different technical options were evaluated to reduce the processing time while maintaining their predictive power. Future GLW runs (GLW 3.0 will apply the new RF methodology with optimized modelling options. The potential benefit of per capita models will need to be further investigated with a better distinction between rural

  10. Direct numerical simulation of homogeneous stratified rotating turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Iida, O.; Tsujimura, S.; Nagano, Y. [Nagoya Institute of Technology, Department of Mech. Eng., Nagoya (Japan)

    2005-12-01

    The effects of the Prandtl number on stratified rotating turbulence have been studied in homogeneous turbulence by using direct numerical simulations and a rapid distortion theory. Fluctuations under strong stable-density stratification can be theoretically divided into the WAVE and the potential vorticity (PV) modes. In low-Prandtl-number fluids, the WAVE mode deteriorates, while the PV mode remains. Imposing rotation on a low-Prandtl-number fluid makes turbulence two-dimensional as well as geostrophic; it is found from the instantaneous turbulent structure that the vortices merge to form a few vertically-elongated vortex columns. During the period toward two-dimensionalization, the vertical vortices become asymmetric in the sense of rotation. (orig.)

  11. Four to seven random casual urine specimens are sufficient to estimate 24-h urinary sodium/potassium ratio in individuals with high blood pressure.

    Science.gov (United States)

    Iwahori, T; Ueshima, H; Torii, S; Saito, Y; Fujiyoshi, A; Ohkubo, T; Miura, K

    2016-05-01

    This study was done to clarify the optimal number and type of casual urine specimens required to estimate the urinary sodium/potassium (Na/K) ratio in individuals with high blood pressure. A total of 74 individuals with high blood pressure, 43 treated and 31 untreated, were recruited from the Japanese general population. Urinary sodium, potassium and the Na/K ratio were measured in both casual urine samples and 7-day 24-h urine samples and then analyzed by correlation and Bland-Altman analyses. The mean Na/K ratio from random casual urine samples on four or more days strongly correlated with the Na/K ratio of 7-day 24-h urine (r=0.80-0.87), which was similar to the correlation between 1- and 2-day 24-h urine and 7-day 24-h urine (r=0.75-0.89). The agreement quality for the Na/K ratio of seven random casual urine specimens for estimating the Na/K ratio of 7-day 24-h urine was good (bias: -0.26, limits of agreement: -1.53 to 1.01), and it was similar to that of 2-day 24-h urine for estimating 7-day 24-h values (bias: 0.07, limits of agreement: -1.03 to 1.18). Stratified analyses comparing individuals using antihypertensive medication and individuals not using antihypertensive medication showed similar results. Correlations of the means of casual urine sodium or potassium concentrations with 7-day 24-h sodium or potassium excretions were relatively weaker than those for the Na/K ratio. The mean Na/K ratio of 4-7 random casual urine specimens on different days provides a good substitute for the 1- to 2-day 24-h urinary Na/K ratio in individuals with high blood pressure.
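    The Bland-Altman agreement statistics used in this comparison are simple to compute; a sketch with hypothetical Na/K values (not the study's data):

```python
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two measurement methods."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# hypothetical Na/K ratios: mean of casual spot urines vs 7-day 24-h urine
casual = np.array([3.1, 4.2, 2.8, 5.0, 3.6, 4.4, 2.9, 3.8])
full24 = np.array([3.4, 4.1, 3.1, 5.3, 3.5, 4.9, 3.2, 4.0])
bias, (lo, hi) = bland_altman(casual, full24)
r = np.corrcoef(casual, full24)[0, 1]        # companion correlation check
```

Bland-Altman complements correlation because two methods can correlate strongly yet disagree systematically; the bias and the width of the limits of agreement capture that disagreement directly.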

  12. New numerical approaches for modeling thermochemical convection in a compositionally stratified fluid

    Science.gov (United States)

    Puckett, Elbridge Gerry; Turcotte, Donald L.; He, Ying; Lokavarapu, Harsha; Robey, Jonathan M.; Kellogg, Louise H.

    2018-03-01

    Geochemical observations of mantle-derived rocks favor a nearly homogeneous upper mantle, the source of mid-ocean ridge basalts (MORB), and heterogeneous lower mantle regions. Plumes that generate ocean island basalts are thought to sample the lower mantle regions and exhibit more heterogeneity than MORB. These regions have been associated with lower mantle structures known as large low shear velocity provinces (LLSVPs) below Africa and the South Pacific. The isolation of these regions is attributed to compositional differences and density stratification that, consequently, have been the subject of computational and laboratory modeling designed to determine the parameter regime in which layering is stable and to understand how layering evolves. Mathematical models of persistent compositional interfaces in the Earth's mantle may be inherently unstable, at least in some regions of the parameter space relevant to the mantle. Computing approximations to solutions of such problems presents severe challenges, even to state-of-the-art numerical methods. Some numerical algorithms for modeling the interface between distinct compositions smear the interface at the boundary between compositions, such as methods that add numerical diffusion or 'artificial viscosity' in order to stabilize the algorithm. We present two new algorithms for maintaining high-resolution and sharp computational boundaries in computations of these types of problems: a discontinuous Galerkin method with a bound preserving limiter and a Volume-of-Fluid interface tracking algorithm. We compare these new methods with two approaches widely used for modeling the advection of two distinct thermally driven compositional fields in mantle convection computations: a high-order accurate finite element advection algorithm with entropy viscosity and a particle method that carries a scalar quantity representing the location of each compositional field. All four algorithms are implemented in the open source finite
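    The numerical-diffusion failure mode that motivates these interface-tracking methods can be demonstrated with first-order upwind advection in one dimension: a sharp compositional step smears as it is advected. This is a 1-D toy, not the paper's DG or VOF solvers, and all grid parameters are assumptions:

```python
import numpy as np

# First-order upwind advection of a sharp compositional step (u > 0,
# periodic domain).  The scheme's numerical diffusion smears the interface.
nx, cfl, steps = 200, 0.5, 200
c = np.where(np.arange(nx) < nx // 2, 1.0, 0.0)   # sharp interface
for _ in range(steps):
    c = c - cfl * (c - np.roll(c, 1))             # upwind update

# cells now sitting in the smeared transition zone 0.05 < c < 0.95
width = int(np.count_nonzero((c > 0.05) & (c < 0.95)))
```

The scheme conserves the total composition and keeps it in [0, 1], but the initially zero-width interface spreads over tens of cells; bound-preserving DG limiters and VOF reconstruction are designed to retain exactly this sharpness while keeping those conservation and boundedness properties.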

  13. Randomized Trial of a Broad Preventive Intervention for Mexican American Adolescents

    Science.gov (United States)

    Gonzales, N.A.; Dumka, L.E.; Millsap, R.E.; Gottschall, A.; McClain, D.B.; Wong, J.J.; Germán, M.; Mauricio, A.M.; Wheeler, L.; Carpentier, F.D.; Kim, S.Y.

    2012-01-01

    Objective: This randomized trial of a family-focused preventive intervention for Mexican American (MA) adolescents evaluated intervention effects on adolescent substance use, internalizing and externalizing symptoms, and school discipline and grade records in 8th grade, one year after completion of the intervention. The study also examined hypothesized mediators and moderators of intervention effects. Method: Stratified by language of program delivery (English vs. Spanish), the trial included a sample of 516 MA adolescents (50.8% female; M = 12.3 years, SD = 0.54), each with at least one caregiver, who were randomized to receive a low-dosage control group workshop or the 9-week group intervention that included parenting, adolescent coping, and conjoint family sessions. Results: Positive program effects were found on all five outcomes at one-year posttest, but varied depending on whether adolescents, parents, or teachers reported on the outcome. Intervention effects were mediated by posttest changes in effective parenting, adolescent coping efficacy, adolescent school engagement, and family cohesion. The majority of direct and mediated effects were moderated by language, with a larger number of significant effects for families that participated in Spanish. Intervention effects were also moderated by baseline levels of mediators and outcomes, with the majority showing stronger effects for families with poorer functioning at baseline. Conclusion: Findings support the efficacy of the intervention in decreasing multiple problem outcomes for MA adolescents, but also demonstrate differential effects for parents and adolescents receiving the intervention in Spanish vs. English, and depending on their baseline levels of functioning. PMID:22103956
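Stratified randomization of the kind used in this trial can be sketched as follows (a hypothetical minimal implementation; the arm names, seed, and alternate-after-shuffle allocation rule are illustrative assumptions, not details reported by the study):

```python
import random

def stratified_randomize(participants, arms=("intervention", "control"), seed=0):
    """Randomly assign participants to arms separately within each stratum.

    `participants` is a list of (id, stratum) pairs. Randomizing within
    strata (here, language of program delivery) keeps each arm balanced
    on the stratifying variable.
    """
    rng = random.Random(seed)
    by_stratum = {}
    for pid, stratum in participants:
        by_stratum.setdefault(stratum, []).append(pid)
    assignment = {}
    for pids in by_stratum.values():
        rng.shuffle(pids)  # random order within the stratum
        for i, pid in enumerate(pids):
            # alternate through the arms -> near-equal arm sizes per stratum
            assignment[pid] = arms[i % len(arms)]
    return assignment

# Hypothetical roster: 6 Spanish-delivery families, 4 English-delivery families.
families = [(f"fam{i}", "Spanish" if i < 6 else "English") for i in range(10)]
groups = stratified_randomize(families)
# each stratum splits evenly: Spanish 3/3, English 2/2
```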

  14. National Coral Reef Monitoring Program: Socioeconomic surveys of human use, knowledge, attitudes, and perceptions in Puerto Rico from 2014-12-08 to 2015-02-20 (NCEI Accession 0161544)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data in this file comes from a survey of adult residents in Puerto Rico. The survey was conducted for a random stratified sample of households in the nine...

  15. Transition of Gas-Liquid Stratified Flow in Oil Transport Pipes

    Directory of Open Access Journals (Sweden)

    D. Lakehal

    2011-12-01

    Large-scale simulation results are presented for the transition of a gas-liquid stratified flow to the slug flow regime in circular 3D oil transport pipes under turbulent flow conditions. Free-surface flow in the pipe is treated using the Level Set method. Turbulence is approached via the LES and VLES methodologies extended to interfacial two-phase flows. It is shown that only with the Level Set method can the flow transition be accurately predicted, better than with the two-fluid phase-average model. The transition from stratified to slug flow is found to follow the merging of the secondary wave modes created by the action of gas shear (short waves) with the first wave mode (high-amplitude long wave). The model is capable of predicting global flow features such as the onset of slugging and slug speed. In the second test case, the model predicts different kinds of slugs: the so-called operating slugs formed upstream, which fill the pipe entirely with water slugs of length scales of the order of 2-4 D, and smaller (1-1.5 D) disturbance slugs featuring lower hold-up (0.8-0.9). The model predicts the frequency of slugs well. The simulations revealed important parameter effects on the results, such as two-dimensionality, pipe length, and water holdup.

  16. A Study of the Effects of an Altered Workweek.

    Science.gov (United States)

    Wood Educational Consultants, Edmonton (Alberta).

    The purpose of this study was to examine the effects of organizational change arising from alterations in the structuring of the workweek. Data were collected from a stratified random sample of management and nonmanagement personnel employed within the various branches of the Alberta Department of Education. The sample consisted of 132 standard…

  17. Community of Inquiry Method and Language Skills Acquisition: Empirical Evidence

    Science.gov (United States)

    Preece, Abdul Shakhour Duncan

    2015-01-01

    The study investigates the effectiveness of community of inquiry method in preparing students to develop listening and speaking skills in a sample of junior secondary school students in Borno state, Nigeria. A sample of 100 students in standard classes was drawn in one secondary school in Maiduguri metropolis through stratified random sampling…

  18. Students' Academic Achievement in Mathematics as Correlate of ...

    African Journals Online (AJOL)

    The main purpose of this study was to assess the relationship between students' achievement in mathematics and teacher factors. The study was conducted in Lesotho, Southern Africa. Stratified random sampling based on school ownership was used to draw a sample of 40 teachers from the population of Grade 10 ...
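Stratified random sampling with proportional allocation can be sketched as follows (the stratum names, sizes, seed, and allocation rule are illustrative assumptions; the record states only that stratification was by school ownership):

```python
import random

def stratified_sample(population, sample_size, seed=0):
    """Draw a stratified random sample with proportional allocation.

    `population` maps each stratum (e.g. a school-ownership type) to its
    list of units; each stratum contributes units in proportion to its
    share of the population.
    """
    rng = random.Random(seed)
    total = sum(len(units) for units in population.values())
    sample = {}
    for stratum, units in population.items():
        n_h = round(sample_size * len(units) / total)  # proportional allocation
        sample[stratum] = rng.sample(units, n_h)       # simple random sample within stratum
    return sample

# Hypothetical population of 100 teachers across three ownership strata.
teachers_by_ownership = {
    "government": [f"G{i}" for i in range(60)],
    "church":     [f"C{i}" for i in range(30)],
    "private":    [f"P{i}" for i in range(10)],
}
sample = stratified_sample(teachers_by_ownership, sample_size=40)
# proportional shares: 24 government, 12 church, 4 private
```

With proportional allocation, each stratum's sampling fraction is the same (here 40%), so stratum totals can be combined without reweighting.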

  19. Experimental observation of the stratified electrothermal instability on aluminum with thickness greater than a skin depth

    Science.gov (United States)

    Hutchinson, T. M.; Awe, T. J.; Bauer, B. S.; Yates, K. C.; Yu, E. P.; Yelton, W. G.; Fuelling, S.

    2018-05-01

    A direct observation of the stratified electrothermal instability on the surface of thick metal is reported. Aluminum rods coated with 70 μm Parylene-N were driven to 1 MA in 100 ns, with the metal thicker than the skin depth. The dielectric coating suppressed plasma formation, enabling persistent observation of discrete, azimuthally correlated stratified thermal perturbations perpendicular to the current, whose amplitudes grew exponentially with wavenumber-dependent rate γ(k) = 0.06 ns⁻¹ − (0.4 ns⁻¹ μm² rad⁻²) k² in ~1 g/cm³, ~7000 K aluminum.
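The measured dispersion relation implies a cutoff wavenumber above which perturbations are damped; a short calculation (variable names are illustrative) recovers it from the reported coefficients:

```python
import math

# Reported growth rate: gamma(k) = a - b * k**2,
# with a = 0.06 ns^-1 and b = 0.4 ns^-1 um^2 rad^-2.
a = 0.06  # ns^-1
b = 0.4   # ns^-1 * um^2 / rad^2

def gamma(k):
    """Growth rate (ns^-1) at wavenumber k (rad/um)."""
    return a - b * k**2

# Perturbations grow only where gamma(k) > 0, i.e. for k below the cutoff:
k_c = math.sqrt(a / b)        # cutoff wavenumber, rad/um
lambda_c = 2 * math.pi / k_c  # shortest unstable wavelength, um

print(round(k_c, 3))       # prints 0.387
print(round(lambda_c, 1))  # prints 16.2
```

So, under the quoted fit, only structures with wavelengths longer than roughly 16 μm grow, consistent with a spectrum that shifts toward longer wavelengths as short-wavelength modes are damped.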

  20. Experimental Observation of the Stratified Electrothermal Instability on Aluminum with Thickness Greater than a Skin Depth

    Energy Technology Data Exchange (ETDEWEB)

    Hutchinson, Trevor M. [Univ. of Nevada, Reno, NV (United States); Hutchinson, Trevor M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Awe, Thomas James [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bauer, Bruno S. [Univ. of Nevada, Reno, NV (United States); Yates, Kevin [Univ. of New Mexico, Albuquerque, NM (United States); Yu, Edmund P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Yelton, William G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Fuelling, Stephan [Univ. of Nevada, Reno, NV (United States)

    2017-07-01

    The first direct observation of the stratified electrothermal instability on the surface of thick metal is reported. Aluminum rods coated with 70 μm Parylene-N were driven to 1 MA in approximately 100 ns, with the metal thicker than the skin depth. The dielectric coating suppressed plasma formation, enabling persistent observation of discrete, azimuthally correlated stratified structures perpendicular to the current. Strata amplitudes grow rapidly, while their Fourier spectrum shifts toward longer wavelength. Assuming blackbody emission, radiometric calculations indicate the strata are temperature perturbations that grow exponentially with rate γ = 0.04 ns⁻¹ in 3000-10,000 K aluminum.