WorldWideScience

Sample records for random sample study

  1. Systematic versus random sampling in stereological studies.

    Science.gov (United States)

    West, Mark J

    2012-12-01

    The sampling that takes place at all levels of an experimental design must be random if the estimate is to be unbiased in a statistical sense. There are two fundamental ways to make a random sample of the sections and of the positions to be probed on the sections. Using a card-sampling analogy, one can pick any card at all out of a deck of cards. This is referred to as independent random sampling, because the sampling of any one card is made without reference to the position of the other cards. The other approach is to pick one card at random from within a fixed interval at the top of the deck and then take the remaining cards at equal intervals through the rest of the deck; this is referred to as systematic random sampling. Systematic sampling along one axis of many biological structures is more efficient than independent random sampling, because most biological structures are not randomly organized. This article discusses the merits of systematic versus random sampling in stereological studies.
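
    To make the two card-deck strategies concrete, the following sketch (illustrative Python, not from the article; population size, sample size, and function names are assumptions) draws section indices either by independent random sampling or by systematic sampling with a single random start:

    ```python
    import random

    def independent_random_sample(n_sections, n_samples, rng=random):
        """Pick n_samples section indices completely at random (any 'card' in the deck)."""
        return sorted(rng.sample(range(n_sections), n_samples))

    def systematic_random_sample(n_sections, n_samples, rng=random):
        """Pick one random start within the first interval, then take equally spaced sections."""
        interval = n_sections // n_samples
        start = rng.randrange(interval)              # random position within the first interval
        return [start + i * interval for i in range(n_samples)]

    if __name__ == "__main__":
        print(independent_random_sample(120, 10))    # scattered, possibly clustered indices
        print(systematic_random_sample(120, 10))     # evenly spaced indices after a random start
    ```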

  2. Independent random sampling methods

    CERN Document Server

    Martino, Luca; Míguez, Joaquín

    2018-01-01

    This book systematically addresses the design and analysis of efficient techniques for independent random sampling. Both general-purpose approaches, which can be used to generate samples from arbitrary probability distributions, and tailored techniques, designed to efficiently address common real-world practical problems, are introduced and discussed in detail. In turn, the monograph presents fundamental results and methodologies in the field, elaborating and developing them into the latest techniques. The theory and methods are illustrated with a varied collection of examples, which are discussed in detail in the text and supplemented with ready-to-run computer code. The main problem addressed in the book is how to generate independent random samples from an arbitrary probability distribution with the weakest possible constraints or assumptions in a form suitable for practical implementation. The authors review the fundamental results and methods in the field, address the latest methods, and emphasize the li...
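
    A classical general-purpose technique of the kind this monograph covers is rejection sampling. The sketch below is a minimal illustration, not taken from the book; the target density (a Beta(2,2) shape), its bound, and the function names are assumptions chosen only to show the idea of drawing independent samples from an arbitrary bounded density with a uniform proposal.

    ```python
    import random

    def rejection_sample(target_pdf, bound, n, rng=random):
        """Draw n independent samples from target_pdf on [0, 1] (assumed <= bound there)."""
        samples = []
        while len(samples) < n:
            x = rng.random()                      # uniform proposal on [0, 1]
            if rng.random() * bound <= target_pdf(x):
                samples.append(x)                 # accept with probability target_pdf(x)/bound
        return samples

    # Illustrative target: a Beta(2, 2)-shaped density with maximum 1.5 at x = 0.5.
    beta22 = lambda x: 6.0 * x * (1.0 - x)
    draws = rejection_sample(beta22, 1.5, 1000)
    print(sum(draws) / len(draws))                # should be close to the true mean of 0.5
    ```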

  3. Field-based random sampling without a sampling frame: control selection for a case-control study in rural Africa.

    Science.gov (United States)

    Crampin, A C; Mwinuka, V; Malema, S S; Glynn, J R; Fine, P E

    2001-01-01

    Selection bias, particularly of controls, is common in case-control studies and may materially affect the results. Methods of control selection should be tailored both for the risk factors and disease under investigation and for the population being studied. We present here a control selection method devised for a case-control study of tuberculosis in rural Africa (Karonga, northern Malawi) that selects an age/sex frequency-matched random sample of the population, with a geographical distribution in proportion to the population density. We also present an audit of the selection process, and discuss the potential of this method in other settings.

  4. Improving ambulatory saliva-sampling compliance in pregnant women: a randomized controlled study.

    Directory of Open Access Journals (Sweden)

    Julian Moeller

    Full Text Available OBJECTIVE: Noncompliance with scheduled ambulatory saliva sampling is common and has been associated with biased cortisol estimates in nonpregnant subjects. This study is the first to investigate, in pregnant women, strategies to improve ambulatory saliva-sampling compliance and the association between sampling noncompliance and saliva cortisol estimates. METHODS: We instructed 64 pregnant women to collect eight scheduled saliva samples on two consecutive days each. Objective compliance with scheduled sampling times was assessed with a Medication Event Monitoring System, and self-reported compliance with a paper-and-pencil diary. In a randomized controlled study, we estimated whether a disclosure intervention (informing women about objective compliance monitoring) and a reminder intervention (use of acoustical reminders) improved compliance. A mixed model analysis was used to estimate associations between women's objective compliance and their diurnal cortisol profiles, and between deviation from scheduled sampling and the cortisol concentration measured in the related sample. RESULTS: Self-reported compliance with the saliva-sampling protocol was 91%, and objective compliance was 70%. The disclosure intervention was associated with improved objective compliance (informed: 81%, noninformed: 60%; F(1,60) = 17.64, p < 0.001), but the reminder intervention was not (reminders: 68%, without reminders: 72%; F(1,60) = 0.78, p = 0.379). Furthermore, a woman's increased objective compliance was associated with a higher diurnal cortisol profile, F(2,64) = 8.22, p < 0.001. Altered cortisol levels were observed in less objectively compliant samples, F(1,705) = 7.38, p = 0.007, with delayed sampling associated with lower cortisol levels. CONCLUSIONS: The results suggest that in pregnant women, objective noncompliance with scheduled ambulatory saliva sampling is common and is associated with biased cortisol estimates. To improve sampling compliance, results suggest

  5. Assessment of fracture risk: value of random population-based samples--the Geelong Osteoporosis Study.

    Science.gov (United States)

    Henry, M J; Pasco, J A; Seeman, E; Nicholson, G C; Sanders, K M; Kotowicz, M A

    2001-01-01

    Fracture risk is determined by bone mineral density (BMD). The T-score, a measure of fracture risk, is the position of an individual's BMD in relation to a reference range. The aim of this study was to determine the magnitude of change in the T-score when different sampling techniques were used to produce the reference range. Reference ranges were derived from three samples, drawn from the same region: (1) an age-stratified population-based random sample, (2) unselected volunteers, and (3) a selected healthy subset of the population-based sample with no diseases or drugs known to affect bone. T-scores were calculated using the three reference ranges for a cohort of women who had sustained a fracture and as a group had a low mean BMD (ages 35-72 yr; n = 484). For most comparisons, the T-scores for the fracture cohort were more negative using the population reference range. The difference in T-scores reached 1.0 SD. The proportion of the fracture cohort classified as having osteoporosis at the spine was 26, 14, and 23% when the population, volunteer, and healthy reference ranges were applied, respectively. The use of inappropriate reference ranges results in substantial changes to T-scores and may lead to inappropriate management.
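
    For reference, the T-score referred to here is conventionally the individual's BMD expressed in standard-deviation units relative to the chosen reference range, which is why changing the reference sample shifts the score:

    ```latex
    T\text{-score} = \frac{\mathrm{BMD}_{\mathrm{individual}} - \overline{\mathrm{BMD}}_{\mathrm{reference}}}{\mathrm{SD}_{\mathrm{reference}}}
    ```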

  6. The study of combining Latin Hypercube Sampling method and LU decomposition method (LULHS method) for constructing spatial random field

    Science.gov (United States)

    WANG, P. T.

    2015-12-01

    Groundwater modeling requires assigning hydrogeological properties to every numerical grid cell. Due to the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. The hydrogeological property is assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be constructed. Statistical sampling therefore plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure from LHS with simulation by LU decomposition to form LULHS. Both conditional and unconditional simulations of LULHS were developed. The simulation efficiency and spatial correlation of LULHS are compared with three other simulation methods. The results show that for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort: fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
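
    A minimal sketch of the LULHS idea, assuming a Gaussian random field with an exponential covariance on a 1-D grid: Latin Hypercube Sampling supplies stratified standard-normal deviates, and the lower-triangular factor of the covariance matrix (obtained here by Cholesky decomposition, a special case of LU for symmetric positive-definite matrices) imposes the spatial correlation. Grid size, correlation length, and variable names are illustrative assumptions, not the authors' code.

    ```python
    import numpy as np
    from scipy.stats import norm, qmc

    def lulhs_realizations(n_cells=50, n_real=100, corr_length=10.0, seed=0):
        """Unconditional LULHS-style realizations of a 1-D Gaussian random field."""
        x = np.arange(n_cells)
        cov = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_length)  # exponential covariance
        L = np.linalg.cholesky(cov)                                   # lower-triangular factor

        sampler = qmc.LatinHypercube(d=n_cells, seed=seed)            # stratified uniforms per cell
        u = sampler.random(n=n_real)
        z = norm.ppf(u)                                               # standard-normal LHS deviates
        return (L @ z.T).T                                            # correlated fields, shape (n_real, n_cells)

    fields = lulhs_realizations()
    print(fields.shape, fields.std())
    ```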

  7. Accounting for randomness in measurement and sampling in studying cancer cell population dynamics.

    Science.gov (United States)

    Ghavami, Siavash; Wolkenhauer, Olaf; Lahouti, Farshad; Ullah, Mukhtar; Linnebacher, Michael

    2014-10-01

    Knowing the expected temporal evolution of the proportion of different cell types in sample tissues gives an indication about the progression of the disease and its possible response to drugs. Such systems have been modelled using Markov processes. We here consider an experimentally realistic scenario in which transition probabilities are estimated from noisy cell population size measurements. Using aggregated data from FACS measurements, we develop minimum mean squared error (MMSE) and maximum likelihood (ML) estimators and formulate two problems to find the minimum number of required samples and measurements to guarantee the accuracy of predicted population sizes. Our numerical results show that the convergence behaviour of the transition probabilities and steady states differs widely from the real values if one uses the standard deterministic approach on noisy measurements. This provides support for our argument that for the analysis of FACS data one should consider the observed state as a random variable. The second problem we address concerns the consequences of estimating the probability of a cell being in a particular state from measurements of a small population of cells. We show how the uncertainty arising from small sample sizes can be captured by a distribution for the state probability.

  8. Sampling problems for randomly broken sticks

    Energy Technology Data Exchange (ETDEWEB)

    Huillet, Thierry [Laboratoire de Physique Theorique et Modelisation, CNRS-UMR 8089 et Universite de Cergy-Pontoise, 5 mail Gay-Lussac, 95031, Neuville sur Oise (France)

    2003-04-11

    Consider the random partitioning model of a population (represented by a stick of length 1) into n species (fragments) with identically distributed random weights (sizes). Upon ranking the fragments' weights according to ascending size, let S_{m:n} be the size of the mth smallest fragment. Assume that some observer samples such populations as follows: drop k points (the sample size) at random onto the stick and record the corresponding numbers of visited fragments. We investigate the following sampling problems: (1) What is the sample size if sampling is carried out until the first visit of the smallest fragment (size S_{1:n})? (2) For a given sample size, have all the fragments of the stick been visited at least once or not? This question is related to Feller's random coupon collector problem. (3) In what order are new fragments being discovered, and what is the random number of samples separating the discovery of consecutive new fragments until the list is exhausted? For this problem, the distribution of the size-biased permutation of the species' weights (that is, the sequence of weights in their order of appearance) is needed and studied.
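
    The first sampling problem above is easy to explore by simulation. The sketch below is an illustrative reconstruction, not the paper's code: it breaks a unit stick into n fragments with i.i.d. weights and counts how many points must be dropped before the smallest fragment is first visited.

    ```python
    import random

    def waiting_time_for_smallest(n=10, rng=random):
        """Drop points uniformly on a broken stick until the smallest fragment is hit."""
        weights = [rng.expovariate(1.0) for _ in range(n)]   # i.i.d. fragment weights
        total = sum(weights)
        probs = [w / total for w in weights]                 # normalized fragment sizes
        smallest = min(range(n), key=lambda i: probs[i])

        draws = 0
        while True:
            draws += 1
            u, cum = rng.random(), 0.0
            for i, p in enumerate(probs):                    # locate the fragment hit by the point
                cum += p
                if u < cum:
                    break
            if i == smallest:
                return draws

    print(sum(waiting_time_for_smallest() for _ in range(1000)) / 1000)  # mean sample size
    ```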

  9. A Bayesian Justification for Random Sampling in Sample Survey

    Directory of Open Access Journals (Sweden)

    Glen Meeden

    2012-07-01

    Full Text Available In the usual Bayesian approach to survey sampling, the sampling design plays a minimal role, at best. Although a close relationship between exchangeable prior distributions and simple random sampling has been noted, how to formally integrate simple random sampling into the Bayesian paradigm is not clear. Recently it has been argued that the sampling design can be thought of as part of a Bayesian's prior distribution. We will show here that under this scenario simple random sampling can be given a Bayesian justification in survey sampling.

  10. Investigating causal associations between use of nicotine, alcohol, caffeine and cannabis: a two-sample bidirectional Mendelian randomization study.

    Science.gov (United States)

    Verweij, Karin J H; Treur, Jorien L; Vink, Jacqueline M

    2018-07-01

    Epidemiological studies consistently show co-occurrence of use of different addictive substances. Whether these associations are causal or due to overlapping underlying influences remains an important question in addiction research. Methodological advances have made it possible to use published genetic associations to infer causal relationships between phenotypes. In this exploratory study, we used Mendelian randomization (MR) to examine the causality of well-established associations between nicotine, alcohol, caffeine and cannabis use. Two-sample MR was employed to estimate bidirectional causal effects between four addictive substances: nicotine (smoking initiation and cigarettes smoked per day), caffeine (cups of coffee per day), alcohol (units per week) and cannabis (initiation). Based on existing genome-wide association results we selected genetic variants associated with the exposure measure as an instrument to estimate causal effects. Where possible we applied sensitivity analyses (MR-Egger and weighted median) more robust to horizontal pleiotropy. Most MR tests did not reveal causal associations. There was some weak evidence for a causal positive effect of genetically instrumented alcohol use on smoking initiation and of cigarettes per day on caffeine use, but these were not supported by the sensitivity analyses. There was also some suggestive evidence for a positive effect of alcohol use on caffeine use (only with MR-Egger) and smoking initiation on cannabis initiation (only with weighted median). None of the suggestive causal associations survived corrections for multiple testing. Two-sample Mendelian randomization analyses found little evidence for causal relationships between nicotine, alcohol, caffeine and cannabis use. © 2018 Society for the Study of Addiction.
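
    For orientation (not spelled out in the abstract itself), the basic two-sample MR estimate for a single genetic instrument j is the Wald ratio of the published outcome and exposure associations, and multiple instruments are typically pooled by inverse-variance weighting, with w_j the inverse variance of the j-th ratio estimate:

    ```latex
    \hat{\beta}_{\mathrm{MR},j} = \frac{\hat{\beta}_{\mathrm{outcome},j}}{\hat{\beta}_{\mathrm{exposure},j}},
    \qquad
    \hat{\beta}_{\mathrm{IVW}} = \frac{\sum_j w_j\,\hat{\beta}_{\mathrm{MR},j}}{\sum_j w_j},
    \qquad w_j = \hat{\sigma}_j^{-2}
    ```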

  11. k-Means: Random Sampling Procedure

    Indian Academy of Sciences (India)

    k-Means: Random Sampling Procedure. The optimal 1-mean is approximated by the centroid of a random sample (Inaba et al.): if S is a random sample of size O(1/ε), then the centroid of S is a (1+ε)-approximate centroid of P with constant probability.
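
    A minimal sketch of the quoted result (illustrative Python; the sample-size constant and the synthetic data are assumptions): the centroid of a small random sample S of the point set P serves as an approximate 1-mean.

    ```python
    import random

    def approx_centroid(points, eps, rng=random):
        """Centroid of a random sample of size ~O(1/eps), a (1+eps)-style approximate 1-mean."""
        m = max(1, int(1.0 / eps))
        sample = rng.sample(points, min(m, len(points)))
        dim = len(points[0])
        return [sum(p[d] for p in sample) / len(sample) for d in range(dim)]

    points = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(10_000)]
    print(approx_centroid(points, eps=0.05))   # close to the true centroid (~(0, 0)) with good probability
    ```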

  12. Rationale, design, methodology and sample characteristics for the Vietnam pre-conceptual micronutrient supplementation trial (PRECONCEPT): a randomized controlled study

    Directory of Open Access Journals (Sweden)

    Nguyen Phuong H

    2012-10-01

    Full Text Available Abstract Background Low birth weight and maternal anemia remain intractable problems in many developing countries. The adequacy of the current strategy of providing iron-folic acid (IFA) supplements only during pregnancy has been questioned, given that many women enter pregnancy with poor iron stores, that maternal and fetal tissues have a substantial micronutrient demand, and that there are programmatic issues related to the timing and coverage of prenatal care. Weekly IFA supplementation for women of reproductive age (WRA) improves iron status and reduces the burden of anemia in the short term, but few studies have evaluated subsequent pregnancy and birth outcomes. The PRECONCEPT trial aims to determine whether pre-pregnancy weekly IFA or multiple micronutrient (MM) supplementation will improve birth outcomes and maternal and infant iron status compared to the current practice of prenatal IFA supplementation only. This paper provides an overview of study design, methodology and sample characteristics from baseline survey data and key lessons learned. Methods/design We have recruited 5011 WRA in a double-blind stratified randomized controlled trial in rural Vietnam and randomly assigned them to receive weekly supplements containing either (1) 2800 μg folic acid, (2) 60 mg iron and 2800 μg folic acid, or (3) MM. Women who become pregnant receive daily IFA and are being followed through pregnancy, delivery, and up to three months post-partum. Study outcomes include birth outcomes and maternal and infant iron status. Data are being collected on household characteristics, maternal diet and mental health, anthropometry, infant feeding practices, morbidity and compliance. Discussion The study is timely and responds to the WHO Global Expert Consultation which identified the need to evaluate the long-term benefits of weekly IFA and MM supplementation in WRA. Findings will generate new information to help guide policy and programs designed to reduce the burden of anemia in women and

  13. An unbiased estimator of the variance of simple random sampling using mixed random-systematic sampling

    OpenAIRE

    Padilla, Alberto

    2009-01-01

    Systematic sampling is a commonly used technique due to its simplicity and ease of implementation. The drawback of this simplicity is that it is not possible to estimate the design variance without bias. There are several ways to circumvent this problem. One method is to suppose that the variable of interest has a random order in the population, so the sample variance of simple random sampling without replacement is used. By means of a mixed random - systematic sample, an unbiased estimator o...

  14. Comparison of Address-based Sampling and Random-digit Dialing Methods for Recruiting Young Men as Controls in a Case-Control Study of Testicular Cancer Susceptibility

    OpenAIRE

    Clagett, Bartholt; Nathanson, Katherine L.; Ciosek, Stephanie L.; McDermoth, Monique; Vaughn, David J.; Mitra, Nandita; Weiss, Andrew; Martonik, Rachel; Kanetsky, Peter A.

    2013-01-01

    Random-digit dialing (RDD) using landline telephone numbers is the historical gold standard for control recruitment in population-based epidemiologic research. However, increasing cell-phone usage and diminishing response rates suggest that the effectiveness of RDD in recruiting a random sample of the general population, particularly for younger target populations, is decreasing. In this study, we compared landline RDD with alternative methods of control recruitment, including RDD using cell-...

  15. Nutritional status and falls in community-dwelling older people: a longitudinal study of a population-based random sample.

    Directory of Open Access Journals (Sweden)

    Ming-Hung Chien

    Full Text Available Falls are common in older people and may lead to functional decline, disability, and death. Many risk factors have been identified, but studies evaluating the effects of nutritional status are limited. To determine whether nutritional status is a predictor of falls in older people living in the community, we analyzed data collected through the Survey of Health and Living Status of the Elderly in Taiwan (SHLSET). SHLSET comprises a series of interview surveys conducted by the government on a random sample of people living in community dwellings in the nation. We included participants who received a nutritional status assessment using the Mini Nutritional Assessment Taiwan Version 2 (MNA-T2) in the 1999 survey, when they were 53 years or older, and followed up on the cumulative incidence of falls in the one-year period before the interview in the 2003 survey. At the beginning of follow-up, the 4440 participants had a mean age of 69.5 years (standard deviation = 9.1), and 467 participants were "not well-nourished," defined as having an MNA-T2 score of 23 or less. In the one-year study period, 659 participants reported having at least one fall. After adjusting for other risk factors, we found the associated odds ratio for falls was 1.73 (95% confidence interval, 1.23, 2.42) for "not well-nourished," 1.57 (1.30, 1.90) for female gender, 1.03 (1.02, 1.04) per one-year increase in age, 1.55 (1.22, 1.98) for history of falls, 1.34 (1.05, 1.72) for hospital stay during the past 12 months, 1.66 (1.07, 2.58) for difficulties in activities of daily living, and 1.53 (1.23, 1.91) for difficulties in instrumental activities of daily living. Nutritional status is an independent predictor of falls in older people living in the community. Further studies are warranted to identify nutritional interventions that can help prevent falls in the elderly.

  16. Generation and Analysis of Constrained Random Sampling Patterns

    DEFF Research Database (Denmark)

    Pierzchlewski, Jacek; Arildsen, Thomas

    2016-01-01

    Random sampling is a technique for signal acquisition which is gaining popularity in practical signal processing systems. Nowadays, event-driven analog-to-digital converters make random sampling feasible in practical applications. A process of random sampling is defined by a sampling pattern, which indicates the signal sampling points in time. Practical random sampling patterns are constrained by ADC characteristics and application requirements. In this paper, we introduce statistical methods which evaluate random sampling pattern generators with emphasis on practical applications. Furthermore, we propose an algorithm that generates random sampling patterns dedicated to event-driven ADCs better than existing sampling pattern generators. Finally, implementation issues of random sampling patterns are discussed.

  17. Acceptance sampling using judgmental and randomly selected samples

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

    We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high- and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail, and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems, and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
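
    The core calculation described above, the probability that a large fraction of the unsampled items is acceptable when every observed item was acceptable, can be sketched with a simple Beta-Binomial model. This is a simplified single-group illustration under an exchangeability assumption, not the authors' two-group model; the prior parameters and counts are placeholders.

    ```python
    import math
    from scipy.stats import betabinom

    def prob_unsampled_fraction_acceptable(n_sampled, n_unsampled, frac=0.99, a=1.0, b=1.0):
        """Probability that at least `frac` of the unsampled items are acceptable, given that
        all n_sampled observed items were acceptable, under a Beta(a, b) prior on the rate."""
        k = math.ceil(frac * n_unsampled)
        dist = betabinom(n_unsampled, a + n_sampled, b)   # posterior predictive for the acceptable count
        return dist.sf(k - 1)                             # P(count >= k)

    print(prob_unsampled_fraction_acceptable(n_sampled=59, n_unsampled=941))
    ```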

  18. A random sampling procedure for anisotropic distributions

    International Nuclear Information System (INIS)

    Nagrajan, P.S.; Sethulakshmi, P.; Raghavendran, C.P.; Bhatia, D.P.

    1975-01-01

    A procedure is described for sampling the scattering angle of neutrons according to specified angular distribution data. The cosine of the scattering angle is written as a double Legendre expansion in the incident neutron energy and a random number. The coefficients of the expansion are given for C, N, O, Si, Ca, Fe and Pb; these elements are of interest in dosimetry and shielding. (author)
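
    A small sketch of the recipe described above, assuming the scattering cosine is tabulated as a double Legendre expansion evaluated at a scaled energy and a random number; the coefficient array, its size, and the energy scaling are placeholders, since the paper's actual coefficients are not reproduced here.

    ```python
    import random
    import numpy as np
    from numpy.polynomial import legendre

    def sample_scattering_cosine(energy, coeffs, e_max, rng=random):
        """Evaluate mu = sum_{l,m} c[l][m] P_l(E_scaled) P_m(xi) for a random number xi."""
        xi = 2.0 * rng.random() - 1.0                 # random number mapped to [-1, 1]
        e_scaled = 2.0 * energy / e_max - 1.0         # incident energy mapped to [-1, 1]
        mu = legendre.legval2d(e_scaled, xi, np.asarray(coeffs))
        return max(-1.0, min(1.0, mu))                # clamp to a valid cosine

    coeffs = [[0.0, 0.3], [0.2, 0.1]]                 # placeholder expansion coefficients
    print(sample_scattering_cosine(energy=2.0, coeffs=coeffs, e_max=14.0))
    ```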

  19. Efficient sampling of complex network with modified random walk strategies

    Science.gov (United States)

    Xie, Yunya; Chang, Shuhua; Zhang, Zhipeng; Zhang, Mi; Yang, Lei

    2018-02-01

    We present two novel random walk strategies: choosing seed node (CSN) random walk and no-retracing (NR) random walk. Unlike classical random walk sampling, the CSN and NR strategies focus on the influence of the seed node choice and of path overlap, respectively. The three random walk samplings are applied to the Erdős-Rényi (ER), Barabási-Albert (BA), Watts-Strogatz (WS), and weighted USAir networks. The major properties of the sampled subnets, such as sampling efficiency, degree distributions, average degree and average clustering coefficient, are then studied. Similar conclusions are reached with all three random walk strategies. First, networks of small scale and simple structure are more conducive to sampling. Second, the average degree and the average clustering coefficient of the sampled subnet approach the corresponding values of the original networks within a limited number of steps. Third, the degree distributions of the subnets are all slightly biased toward the high-degree side. The NR strategy, however, performs better for the average clustering coefficient of the subnet. In the real weighted USAir network, salient characteristics such as the larger clustering coefficient and the fluctuation of the degree distribution are reproduced well by these random walk strategies.
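
    Based only on the description above, a no-retracing (NR) walk can be sketched as a walk that avoids stepping straight back along the edge it just used unless it has no other choice. The snippet below is an illustrative reconstruction using networkx; the graph size, edge probability, and function names are assumptions (the seed node is assumed to have at least one neighbor).

    ```python
    import random
    import networkx as nx

    def nr_random_walk(G, seed_node, steps, rng=random):
        """No-retracing random walk: avoid stepping straight back to the previous node."""
        visited, prev, current = [seed_node], None, seed_node
        for _ in range(steps):
            neighbors = list(G.neighbors(current))
            choices = [n for n in neighbors if n != prev] or neighbors   # retrace only if forced
            prev, current = current, rng.choice(choices)
            visited.append(current)
        return visited

    G = nx.erdos_renyi_graph(1000, 0.01, seed=42)
    subnet_nodes = set(nr_random_walk(G, seed_node=0, steps=200))
    print(len(subnet_nodes))
    ```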

  20. Systematic random sampling of the comet assay.

    Science.gov (United States)

    McArt, Darragh G; Wasson, Gillian R; McKerr, George; Saetzler, Kurt; Reed, Matt; Howard, C Vyvyan

    2009-07-01

    The comet assay is a technique used to quantify DNA damage and repair at a cellular level. In the assay, cells are embedded in agarose and the cellular content is stripped away leaving only the DNA trapped in an agarose cavity which can then be electrophoresed. The damaged DNA can enter the agarose and migrate while the undamaged DNA cannot and is retained. DNA damage is measured as the proportion of the migratory 'tail' DNA compared to the total DNA in the cell. The fundamental basis of these arbitrary values is obtained in the comet acquisition phase using fluorescence microscopy with a stoichiometric stain in tandem with image analysis software. Current methods deployed in such an acquisition are expected to be both objectively and randomly obtained. In this paper we examine the 'randomness' of the acquisition phase and suggest an alternative method that offers both objective and unbiased comet selection. In order to achieve this, we have adopted a survey sampling approach widely used in stereology, which offers a method of systematic random sampling (SRS). This is desirable as it offers an impartial and reproducible method of comet analysis that can be used either manually or in an automated workflow. By making use of an unbiased sampling frame and using microscope verniers, we are able to increase the precision of estimates of DNA damage. Results obtained from a multiple-user pooled variation experiment showed that the SRS technique attained a lower variability than the traditional approach. The single-user repetition experiment showed greater individual variances while not being detrimental to overall averages. This would suggest that the SRS method offers a better reflection of DNA damage for a given slide and also offers better user reproducibility.

  1. Job strain and resting heart rate: a cross-sectional study in a Swedish random working sample

    Directory of Open Access Journals (Sweden)

    Peter Eriksson

    2016-03-01

    Full Text Available Abstract Background Numerous studies have reported an association between stressful working conditions and cardiovascular disease. However, more evidence is needed, and the etiological mechanisms are unknown. Elevated resting heart rate has emerged as a possible risk factor for cardiovascular disease, but little is known about its relation to work-related stress. This study therefore investigated the association between job strain, job control, and job demands and resting heart rate. Methods We conducted a cross-sectional survey of randomly selected men and women in Västra Götalandsregionen (the western county of Sweden) (n = 1552). Information about job strain, job demands, job control, heart rate and covariates was collected during the period 2001–2004 as part of the INTERGENE/ADONIX research project. Six different linear regression models were used, with adjustments for gender, age, BMI, smoking, education, and physical activity in the fully adjusted model. Job strain was operationalized as the log-transformed ratio of job demands over job control in the statistical analyses. Results No associations were seen between resting heart rate and job demands. Job strain was associated with elevated resting heart rate in the unadjusted model (linear regression coefficient 1.26, 95% CI 0.14 to 2.38), but not in any of the extended models. Low job control was associated with elevated resting heart rate after adjustments for gender, age, BMI, and smoking (linear regression coefficient −0.18, 95% CI −0.30 to −0.02). However, there were no significant associations in the fully adjusted model. Conclusions Low job control and job strain, but not job demands, were associated with elevated resting heart rate. However, the observed associations were modest and may be explained by confounding effects.

  2. BWIP-RANDOM-SAMPLING, Random Sample Generation for Nuclear Waste Disposal

    International Nuclear Information System (INIS)

    Sagar, B.

    1989-01-01

    1 - Description of program or function: Random samples for different distribution types are generated. Distribution types as required for performance assessment modeling of geologic nuclear waste disposal are provided. These are: - Uniform, - Log-uniform (base 10 or natural), - Normal, - Lognormal (base 10 or natural), - Exponential, - Bernoulli, - User defined continuous distribution. 2 - Method of solution: A linear congruential generator is used for uniform random numbers. A set of functions is used to transform the uniform distribution to the other distributions. Stratified, rather than random, sampling can be chosen. Truncated limits can be specified on many distributions, whose usual definition has an infinite support. 3 - Restrictions on the complexity of the problem: Generation of correlated random variables is not included
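
    A minimal sketch of the stated method of solution: a linear congruential generator supplies uniform deviates, and an inverse-transform function maps them to another distribution, here an exponential with optional truncation limits. The LCG constants shown are the common Park-Miller "minimal standard" values, an assumption rather than the program's actual parameters.

    ```python
    import math

    class LCG:
        """Linear congruential generator (Park-Miller 'minimal standard' constants assumed)."""
        def __init__(self, seed=12345):
            self.state = seed
            self.a, self.m = 16807, 2**31 - 1

        def uniform(self):
            self.state = (self.a * self.state) % self.m
            return self.state / self.m

    def truncated_exponential(u, rate, lo=0.0, hi=float("inf")):
        """Inverse-transform sample from an exponential distribution, truncated to [lo, hi]."""
        F_lo = 1.0 - math.exp(-rate * lo)
        F_hi = 1.0 - math.exp(-rate * hi) if math.isfinite(hi) else 1.0
        return -math.log(1.0 - (F_lo + u * (F_hi - F_lo))) / rate

    rng = LCG(seed=987654321)
    samples = [truncated_exponential(rng.uniform(), rate=0.5, lo=1.0, hi=10.0) for _ in range(5)]
    print(samples)
    ```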

  3. Novel approach to systematic random sampling in population surveys: Lessons from the United Arab Emirates National Diabetes Study (UAEDIAB).

    Science.gov (United States)

    Sulaiman, Nabil; Albadawi, Salah; Abusnana, Salah; Fikri, Mahmoud; Madani, Abdulrazzag; Mairghani, Maisoon; Alawadi, Fatheya; Zimmet, Paul; Shaw, Jonathan

    2015-09-01

    The prevalence of diabetes has risen rapidly in the Middle East, particularly in the Gulf Region. However, some prevalence estimates have not fully accounted for large migrant worker populations and have focused on minority indigenous populations. The objectives of the UAE National Diabetes and Lifestyle Study are to: (i) define the prevalence of, and risk factors for, T2DM; (ii) describe the distribution and determinants of T2DM risk factors; (iii) study health knowledge and attitudes; (iv) identify gene-environment interactions; and (v) develop baseline data for evaluation of future intervention programs. Given the high burden of diabetes in the region and the absence of accurate data on non-UAE nationals in the UAE, a representative sample of the non-UAE nationals was essential. We used an innovative methodology in which non-UAE nationals were sampled when attending the mandatory biannual health check that is required for visa renewal. Such an approach could also be used in other countries in the region. Complete data were available for 2719 eligible non-UAE nationals (25.9% Arabs, 70.7% Asian non-Arabs, 1.1% African non-Arabs, and 2.3% Westerners). Most were men < 65 years of age. The response rate was 68%, and non-response was greater among women than men; 26.9% earned less than UAE Dirham (AED) 24 000 (US$6500), and the most common areas of employment were as managers or professionals, in service and sales, and in unskilled occupations. Most (37.4%) had completed high school, and 4.1% had a postgraduate degree. This novel methodology could provide insights for epidemiological studies in the UAE and other Gulf States, particularly for expatriates. © 2015 Ruijin Hospital, Shanghai Jiaotong University School of Medicine and Wiley Publishing Asia Pty Ltd.

  4. Are most samples of animals systematically biased? Consistent individual trait differences bias samples despite random sampling.

    Science.gov (United States)

    Biro, Peter A

    2013-02-01

    Sampling animals from the wild for study is something nearly every biologist has done, but despite our best efforts to obtain random samples of animals, 'hidden' trait biases may still exist. For example, consistent behavioral traits can affect trappability/catchability, independent of obvious factors such as size and gender, and these traits are often correlated with other repeatable physiological and/or life history traits. If so, systematic sampling bias may exist for any of these traits. The extent to which this is a problem, of course, depends on the magnitude of bias, which is presently unknown because the underlying trait distributions in populations are usually unknown, or unknowable. Indeed, our present knowledge about sampling bias comes from samples (not complete population censuses), which can possess bias to begin with. I had the unique opportunity to create naturalized populations of fish by seeding each of four small fishless lakes with equal densities of slow-, intermediate-, and fast-growing fish. Using sampling methods that are not size-selective, I observed that fast-growing fish were up to two-times more likely to be sampled than slower-growing fish. This indicates substantial and systematic bias with respect to an important life history trait (growth rate). If correlations between behavioral, physiological and life-history traits are as widespread as the literature suggests, then many animal samples may be systematically biased with respect to these traits (e.g., when collecting animals for laboratory use), and affect our inferences about population structure and abundance. I conclude with a discussion on ways to minimize sampling bias for particular physiological/behavioral/life-history types within animal populations.

  5. Open-Label Randomized Trial of Titrated Disease Management for Patients with Hypertension: Study Design and Baseline Sample Characteristics

    Science.gov (United States)

    Jackson, George L.; Weinberger, Morris; Kirshner, Miriam A.; Stechuchak, Karen M.; Melnyk, Stephanie D.; Bosworth, Hayden B.; Coffman, Cynthia J.; Neelon, Brian; Van Houtven, Courtney; Gentry, Pamela W.; Morris, Isis J.; Rose, Cynthia M.; Taylor, Jennifer P.; May, Carrie L.; Han, Byungjoo; Wainwright, Christi; Alkon, Aviel; Powell, Lesa; Edelman, David

    2016-01-01

    Despite the availability of efficacious treatments, only half of patients with hypertension achieve adequate blood pressure (BP) control. This paper describes the protocol and baseline subject characteristics of a 2-arm, 18-month randomized clinical trial of titrated disease management (TDM) for patients with pharmaceutically-treated hypertension for whom systolic blood pressure (SBP) is not controlled (≥140 mmHg for non-diabetic or ≥130 mmHg for diabetic patients). The trial is being conducted among patients of four clinic locations associated with a Veterans Affairs Medical Center. An intervention arm has a TDM strategy in which patients' hypertension control at baseline, 6, and 12 months determines the resource intensity of disease management. Intensity levels include: a low-intensity strategy utilizing a licensed practical nurse to provide bi-monthly, non-tailored behavioral support calls to patients whose SBP comes under control; a medium-intensity strategy utilizing a registered nurse to provide monthly tailored behavioral support telephone calls plus home BP monitoring; and a high-intensity strategy utilizing a pharmacist to provide monthly tailored behavioral support telephone calls, home BP monitoring, and pharmacist-directed medication management. Control arm patients receive the low-intensity strategy regardless of BP control. The primary outcome is SBP. The 385 randomized veterans (192 intervention; 193 control) are predominantly older (mean age 63.5 years) men (92.5%); 61.8% are African American, and the mean baseline SBP for all subjects is 143.6 mmHg. This trial will determine if a disease management program that is titrated by matching the intensity of resources to patients' BP control leads to superior outcomes compared to a low-intensity management strategy. PMID:27417982

  6. Biological Sampling Variability Study

    Energy Technology Data Exchange (ETDEWEB)

    Amidan, Brett G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hutchison, Janine R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-11-08

    There are many sources of variability in the sample collection and analysis process. This paper addresses many, but not all, of these sources. The main focus of this paper was to better understand and estimate the variability due to differences between samplers. Variability between days was also studied, as well as random variability within each sampler. Experiments were performed using multiple surface materials (ceramic and stainless steel), multiple contaminant concentrations (10 spores and 100 spores), and with and without the presence of interfering material. All testing was done with sponge sticks using 10-inch by 10-inch coupons. Bacillus atrophaeus was used as the BA surrogate. Spores were deposited using wet deposition. Grime was coated on the coupons designated to include the interfering material (Section 3.3). Samples were prepared and analyzed at PNNL using the CDC protocol (Section 3.4) and then cultured and counted. Five samplers were trained so that samples were taken using the same protocol. Each sampler randomly sampled eight coupons each day, four coupons with 10 spores deposited and four coupons with 100 spores deposited. Each day consisted of one material being tested. The clean samples (no interfering materials) were run first, followed by the dirty samples (coated with interfering material). There was a significant difference in recovery efficiency between the coupons with 10 spores deposited (mean of 48.9%) and those with 100 spores deposited (mean of 59.8%). There was no general significant difference between the clean and dirty (containing interfering material) coupons or between the two surface materials; however, there was a significant interaction between concentration amount and presence of interfering material. The recovery efficiency was close to the same for coupons with 10 spores deposited, but for the coupons with 100 spores deposited, the recovery efficiency for the dirty samples was significantly larger (65

  7. Comparison of address-based sampling and random-digit dialing methods for recruiting young men as controls in a case-control study of testicular cancer susceptibility.

    Science.gov (United States)

    Clagett, Bartholt; Nathanson, Katherine L; Ciosek, Stephanie L; McDermoth, Monique; Vaughn, David J; Mitra, Nandita; Weiss, Andrew; Martonik, Rachel; Kanetsky, Peter A

    2013-12-01

    Random-digit dialing (RDD) using landline telephone numbers is the historical gold standard for control recruitment in population-based epidemiologic research. However, increasing cell-phone usage and diminishing response rates suggest that the effectiveness of RDD in recruiting a random sample of the general population, particularly for younger target populations, is decreasing. In this study, we compared landline RDD with alternative methods of control recruitment, including RDD using cell-phone numbers and address-based sampling (ABS), to recruit primarily white men aged 18-55 years into a study of testicular cancer susceptibility conducted in the Philadelphia, Pennsylvania, metropolitan area between 2009 and 2012. With few exceptions, eligible and enrolled controls recruited by means of RDD and ABS were similar with regard to characteristics for which data were collected on the screening survey. While we find ABS to be a comparably effective method of recruiting young males compared with landline RDD, we acknowledge the potential impact that selection bias may have had on our results because of poor overall response rates, which ranged from 11.4% for landline RDD to 1.7% for ABS.

  8. Nutritional profile and obesity: results from a random-sample population-based study in Córdoba, Argentina.

    Science.gov (United States)

    Aballay, Laura R; Osella, Alberto R; De La Quintana, Ana G; Diaz, María Del Pilar

    2016-03-01

    Obesity is a chronic, heterogeneous, multifactorial disease whose prevalence has increased sharply in both developed and developing countries. This study aimed to estimate the prevalence of obesity and to identify socio-demographic risk factors associated with it, with special emphasis on diet. Nutritional status, demographic characteristics, lifestyle habits, and food consumption patterns derived from a Food Frequency Questionnaire were investigated. Exhaustive exploratory analyses were performed in order to describe dietary patterns, and logistic regression models were used for odds ratio estimation. The study included 4328 subjects, over 18 years old and resident in Córdoba city. The prevalence of overweight and obesity was 34% and 17%, respectively; 60% of men and 45% of women had a BMI ≥ 25. Obesity risk factors were high intake of sodium, refined grains, starchy vegetables, and snacks. A lower risk of overweight and obesity was associated with an adequate, moderate intake of meats, eggs, alcoholic beverages, sugar and sweets, milk, yogurt, and pulses. A high intake of snacks, refined grains, starchy vegetables and sodium and a low intake of yogurt, milk, pulses, and whole grains seem to be associated with the emergence and high prevalence of obesity in Córdoba, Argentina.

  9. A random spatial sampling method in a rural developing nation

    Science.gov (United States)

    Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas

    2014-01-01

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...

  10. Power Spectrum Estimation of Randomly Sampled Signals

    DEFF Research Database (Denmark)

    Velte, C. M.; Buchhave, P.; K. George, W.

    ...algorithms: sample-and-hold and the direct spectral estimator without residence-time weighting. The computer-generated signal is a Poisson process with a sample rate proportional to velocity magnitude that consists of well-defined frequency content, which makes bias easy to spot. The idea...

  11. Sampling large random knots in a confined space

    International Nuclear Information System (INIS)

    Arsuaga, J; Blackstone, T; Diao, Y; Hinson, K; Karadayi, E; Saito, M

    2007-01-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is of the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications
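
    The uniform random polygon model referred to above is straightforward to generate: n vertices are drawn independently and uniformly in the unit cube and joined consecutively, with the last vertex connected back to the first. The sketch below is an illustrative reconstruction only; computing knot determinants, colorings, or crossing numbers is beyond this snippet.

    ```python
    import numpy as np

    def uniform_random_polygon(n, rng=None):
        """n vertices i.i.d. uniform in the unit cube, joined consecutively into a closed polygon."""
        rng = rng or np.random.default_rng()
        return rng.random((n, 3))

    def edges(vertices):
        """Consecutive edges of the closed polygon, including the closing edge."""
        n = len(vertices)
        return [(vertices[i], vertices[(i + 1) % n]) for i in range(n)]

    poly = uniform_random_polygon(50)
    print(len(edges(poly)), "edges; xy-projection shape:", poly[:, :2].shape)
    ```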

  12. Sampling large random knots in a confined space

    Science.gov (United States)

    Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.

    2007-09-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is of the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  13. Sampling large random knots in a confined space

    Energy Technology Data Exchange (ETDEWEB)

    Arsuaga, J [Department of Mathematics, San Francisco State University, 1600 Holloway Ave, San Francisco, CA 94132 (United States); Blackstone, T [Department of Computer Science, San Francisco State University, 1600 Holloway Ave., San Francisco, CA 94132 (United States); Diao, Y [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Hinson, K [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Karadayi, E [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States); Saito, M [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States)

    2007-09-28

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is of the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  14. SOME SYSTEMATIC SAMPLING STRATEGIES USING MULTIPLE RANDOM STARTS

    OpenAIRE

    Sampath Sundaram; Ammani Sivaraman

    2010-01-01

    In this paper an attempt is made to extend linear systematic sampling using multiple random starts, due to Gautschi (1957), for various types of systematic sampling schemes available in the literature, namely (i) Balanced Systematic Sampling (BSS) of Sethi (1965) and (ii) Modified Systematic Sampling (MSS) of Singh, Jindal, and Garg (1968). Further, the proposed methods were compared with the Yates corrected estimator developed with reference to Gautschi's linear systematic sampling...
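
    Linear systematic sampling with multiple random starts, the Gautschi (1957) device being extended here, can be sketched as follows (illustrative reconstruction, not the paper's code): several distinct random starts are drawn within one sampling interval, and each start generates its own systematic subsample, so the variability between subsample means can be used for variance estimation.

    ```python
    import random

    def multi_start_systematic_sample(population, n_starts, interval, rng=random):
        """Draw n_starts independent systematic subsamples, each with its own random start."""
        starts = rng.sample(range(interval), n_starts)          # distinct random starts in one interval
        return [population[s::interval] for s in starts]        # one systematic subsample per start

    population = list(range(1, 1001))                            # illustrative sampling frame
    subsamples = multi_start_systematic_sample(population, n_starts=4, interval=50)
    means = [sum(s) / len(s) for s in subsamples]
    print(means, sum(means) / len(means))                        # per-start means and their average
    ```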

  15. Random sampling of evolution time space and Fourier transform processing

    International Nuclear Information System (INIS)

    Kazimierczuk, Krzysztof; Zawadzka, Anna; Kozminski, Wiktor; Zhukov, Igor

    2006-01-01

    Application of the Fourier transform for processing 3D NMR spectra with random sampling of the evolution time space is presented. The 2D FT is calculated for pairs of frequencies, instead of the conventional sequence of one-dimensional transforms. Signal-to-noise ratios and linewidths for different random distributions were investigated by simulations and experiments. The experimental examples include 3D HNCA, HNCACB and 15N-edited NOESY-HSQC spectra of a 13C,15N-labeled ubiquitin sample. The results revealed the general applicability of the proposed method and a significant improvement of resolution in comparison with conventional spectra recorded in the same time.

  16. A combinatorial and probabilistic study of initial and end heights of descents in samples of geometrically distributed random variables and in permutations

    Directory of Open Access Journals (Sweden)

    Helmut Prodinger

    2007-01-01

    Full Text Available In words generated by independent geometrically distributed random variables, we study the l-th descent, which is, roughly speaking, the l-th occurrence of a neighbouring pair ab with a > b. The value a is called the initial height, and b the end height. We study these two random variables (and some similar ones) by combinatorial and probabilistic tools. We find in all instances a generating function Ψ(v,u), where the coefficient of v^j u^i refers to the j-th descent (ascent), and i to the initial (end) height. From this, various conclusions can be drawn, in particular expected values. In the probabilistic part, a Markov chain model is used, which allows us to obtain explicit expressions for the heights of the second descent. In principle, one could go further, but the complexity of the results forbids it. This is extended to permutations of a large number of elements. Methods from q-analysis are used to simplify the expressions. This is the reason why we confine ourselves to the geometric distribution only. For general discrete distributions, no such tools are available.

  17. Adaptive importance sampling of random walks on continuous state spaces

    International Nuclear Information System (INIS)

    Baggerly, K.; Cox, D.; Picard, R.

    1998-01-01

    The authors consider adaptive importance sampling for a random walk with scoring in a general state space. Conditions under which exponential convergence occurs to the zero-variance solution are reviewed. These results generalize previous work for finite, discrete state spaces in Kollman (1993) and in Kollman, Baggerly, Cox, and Picard (1996). This paper is intended for nonstatisticians and includes considerable explanatory material

  18. Random-effects linear modeling and sample size tables for two special crossover designs of average bioequivalence studies: the four-period, two-sequence, two-formulation and six-period, three-sequence, three-formulation designs.

    Science.gov (United States)

    Diaz, Francisco J; Berg, Michel J; Krebill, Ron; Welty, Timothy; Gidal, Barry E; Alloway, Rita; Privitera, Michael

    2013-12-01

    Due to concern and debate in the epilepsy medical community and to the current interest of the US Food and Drug Administration (FDA) in revising approaches to the approval of generic drugs, the FDA is currently supporting ongoing bioequivalence studies of antiepileptic drugs, the EQUIGEN studies. During the design of these crossover studies, the researchers could not find commercial or non-commercial statistical software that quickly allowed computation of sample sizes for their designs, particularly software implementing the FDA requirement of using random-effects linear models for the analyses of bioequivalence studies. This article presents tables for sample-size evaluations of average bioequivalence studies based on the two crossover designs used in the EQUIGEN studies: the four-period, two-sequence, two-formulation design, and the six-period, three-sequence, three-formulation design. Sample-size computations assume that random-effects linear models are used in bioequivalence analyses with crossover designs. Random-effects linear models have been traditionally viewed by many pharmacologists and clinical researchers as just mathematical devices to analyze repeated-measures data. In contrast, a modern view of these models attributes an important mathematical role in theoretical formulations in personalized medicine to them, because these models not only have parameters that represent average patients, but also have parameters that represent individual patients. Moreover, the notation and language of random-effects linear models have evolved over the years. Thus, another goal of this article is to provide a presentation of the statistical modeling of data from bioequivalence studies that highlights the modern view of these models, with special emphasis on power analyses and sample-size computations.

  19. An alternative procedure for estimating the population mean in simple random sampling

    Directory of Open Access Journals (Sweden)

    Housila P. Singh

    2012-03-01

    Full Text Available This paper deals with the problem of estimating the finite population mean using auxiliary information in simple random sampling. Firstly, we suggest a correction to the mean squared error of the estimator proposed by Gupta and Shabbir [On improvement in estimating the population mean in simple random sampling. Jour. Appl. Statist. 35(5) (2008), pp. 559-566]. We then propose a ratio-type estimator and study its properties in simple random sampling. Numerically, we show that the proposed class of estimators is more efficient than different known estimators, including the Gupta and Shabbir (2008) estimator.
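
    For orientation, the classical ratio estimator of the population mean, on which such classes of estimators build, combines the sample means of the study and auxiliary variables with the known population mean of the auxiliary variable:

    ```latex
    \hat{\bar{Y}}_{R} = \bar{y}\,\frac{\bar{X}}{\bar{x}}
    ```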

  20. Health plan auditing: 100-percent-of-claims vs. random-sample audits.

    Science.gov (United States)

    Sillup, George P; Klimberg, Ronald K

    2011-01-01

    The objective of this study was to examine the relative efficacy of two different methodologies for auditing self-funded medical claim expenses: 100-percent-of-claims auditing versus random-sampling auditing. Multiple data sets of claim errors or 'exceptions' from two Fortune-100 corporations were analysed and compared to 100 simulated audits of 300- and 400-claim random samples. Random-sample simulations failed to identify a significant number and amount of the errors that ranged from $200,000 to $750,000. These results suggest that health plan expenses of corporations could be significantly reduced if they audited 100% of claims and embraced a zero-defect approach.

  1. Fast egg collection method greatly improves randomness of egg sampling in Drosophila melanogaster

    DEFF Research Database (Denmark)

    Schou, Mads Fristrup

    2013-01-01

    When obtaining samples for population genetic studies, it is essential that the sampling is random. For Drosophila, one of the crucial steps in sampling experimental flies is the collection of eggs. Here an egg collection method is presented which randomizes the eggs in a water column and diminishes environmental variance. This method was compared with a traditional egg collection method where eggs are collected directly from the medium. Within each method the observed and expected standard deviations of egg-to-adult viability were compared, whereby the difference in the randomness ... and to obtain a representative collection of genotypes, the method presented here is strongly recommended when collecting eggs from Drosophila.

  2. The potential of Virtual Reality as anxiety management tool: a randomized controlled study in a sample of patients affected by Generalized Anxiety Disorder

    Directory of Open Access Journals (Sweden)

    Gorini Alessandra

    2008-05-01

    Full Text Available Abstract Background Generalized anxiety disorder (GAD) is a psychiatric disorder characterized by a constant and unspecific anxiety that interferes with daily-life activities. Its high prevalence in the general population and the severe limitations it causes point out the necessity to find new efficient strategies to treat it. Together with cognitive-behavioural treatments, relaxation represents a useful approach for the treatment of GAD, but it has the limitation that it is hard to learn. To overcome this limitation we propose the use of virtual reality (VR) to facilitate the relaxation process by visually presenting key relaxing images to the subjects. The visual presentation of a virtual calm scenario can facilitate patients' practice and mastery of relaxation, making the experience more vivid and real than the one that most subjects can create using their own imagination and memory, and triggering a broad empowerment process within the experience induced by a high sense of presence. According to these premises, the aim of the present study is to investigate the advantages of using a VR-based relaxation protocol in reducing anxiety in patients affected by GAD. Methods/Design The trial is based on a randomized controlled study, including three groups of 25 patients each (for a total of 75 patients): (1) the VR group, (2) the non-VR group and (3) the waiting list (WL) group. Patients in the VR group will be taught to relax using a VR relaxing environment and audio-visual mobile narratives; patients in the non-VR group will be taught to relax using the same relaxing narratives proposed to the VR group, but without the VR support; and patients in the WL group will not receive any kind of relaxation training. Psychometric and psychophysiological outcomes will serve as quantitative dependent variables, while subjective reports of participants will be used as qualitative dependent variables. Conclusion We argue that the use of VR for relaxation

  3. Lunar sample studies

    International Nuclear Information System (INIS)

    1977-01-01

    Lunar samples discussed and the nature of their analyses are: (1) an Apollo 15 breccia which is thoroughly analyzed as to the nature of the mature regolith from which it derived and the time and nature of the lithification process, (2) two Apollo 11 and one Apollo 12 basalts analyzed in terms of chemistry, Cross-Iddings-Pirsson-Washington norms, mineralogy, and petrography, (3) eight Apollo 17 mare basalts, also analyzed in terms of chemistry, Cross-Iddings-Pirsson-Washington norms, mineralogy, and petrography. The first seven are shown to be chemically similar although of two main textural groups; the eighth is seen to be distinct in both chemistry and mineralogy, (4) a troctolitic clast from a Fra Mauro breccia, analyzed and contrasted with other high-temperature lunar mineral assemblages. Two basaltic clasts from the same breccia are shown to have affinities with rock 14053, and (5) the uranium-thorium-lead systematics of three Apollo 16 samples are determined; serious terrestrial-lead contamination of the first two samples is attributed to bandsaw cutting in the lunar curatorial facility

  4. Random vs. systematic sampling from administrative databases involving human subjects.

    Science.gov (United States)

    Hagino, C; Lo, R J

    1998-09-01

    Two sampling techniques, simple random sampling (SRS) and systematic sampling (SS), were compared to determine whether they yield similar and accurate distributions for the following four factors: age, gender, geographic location and years in practice. Any point estimate within 7 yr or 7 percentage points of its reference standard (SRS or the entire data set, i.e., the target population) was considered "acceptably similar" to the reference standard. The sampling frame was from the entire membership database of the Canadian Chiropractic Association. The two sampling methods were tested using eight different sample sizes of n (50, 100, 150, 200, 250, 300, 500, 800). From the profile/characteristics, summaries of four known factors [gender, average age, number (%) of chiropractors in each province and years in practice], between- and within-methods chi-square (χ²) tests and unpaired t tests were performed to determine whether any of the differences [descriptively greater than 7% or 7 yr] were also statistically significant. The strengths of the agreements between the provincial distributions were quantified by calculating the percent agreements for each (provincial pairwise-comparison methods). Any percent agreement less than 70% was judged to be unacceptable. Our assessments of the two sampling methods (SRS and SS) for the different sample sizes tested suggest that SRS and SS yielded acceptably similar results. Both methods started to yield "correct" sample profiles at approximately the same sample size (n > 200). SS is not only convenient, it can be recommended for sampling from large databases in which the data are listed without any inherent order biases other than alphabetical listing by surname.

  5. RandomSpot: A web-based tool for systematic random sampling of virtual slides.

    Science.gov (United States)

    Wright, Alexander I; Grabsch, Heike I; Treanor, Darren E

    2015-01-01

    This paper describes work presented at the Nordic Symposium on Digital Pathology 2014, Linköping, Sweden. Systematic random sampling (SRS) is a stereological tool, which provides a framework to quickly build an accurate estimation of the distribution of objects or classes within an image, whilst minimizing the number of observations required. RandomSpot is a web-based tool for SRS in stereology, which systematically places equidistant points within a given region of interest on a virtual slide. Each point can then be visually inspected by a pathologist in order to generate an unbiased sample of the distribution of classes within the tissue. Further measurements can then be derived from the distribution, such as the ratio of tumor to stroma. RandomSpot replicates the fundamental principle of traditional light microscope grid-shaped graticules, with the added benefits associated with virtual slides, such as facilitated collaboration and automated navigation between points. Once the sample points have been added to the region(s) of interest, users can download the annotations and view them locally using their virtual slide viewing software. Since its introduction, RandomSpot has been used extensively for international collaborative projects, clinical trials and independent research projects. So far, the system has been used to generate over 21,000 sample sets, and has been used to generate data for use in multiple publications, identifying significant new prognostic markers in colorectal, upper gastro-intestinal and breast cancer. Data generated using RandomSpot also has significant value for training image analysis algorithms using sample point coordinates and pathologist classifications.

  6. Misrepresenting random sampling? A systematic review of research papers in the Journal of Advanced Nursing.

    Science.gov (United States)

    Williamson, Graham R

    2003-11-01

    This paper discusses the theoretical limitations of the use of random sampling and probability theory in the production of a significance level (or P-value) in nursing research. Potential alternatives, in the form of randomization tests, are proposed. Research papers in nursing, medicine and psychology frequently misrepresent their statistical findings, as the P-values reported assume random sampling. In this systematic review of studies published between January 1995 and June 2002 in the Journal of Advanced Nursing, 89 (68%) studies broke this assumption because they used convenience samples or entire populations. As a result, some of the findings may be questionable. The key ideas of random sampling and probability theory for statistical testing (for generating a P-value) are outlined. The result of a systematic review of research papers published in the Journal of Advanced Nursing is then presented, showing how frequently random sampling appears to have been misrepresented. Useful alternative techniques that might overcome these limitations are then discussed. REVIEW LIMITATIONS: This review is limited in scope because it is applied to one journal, and so the findings cannot be generalized to other nursing journals or to nursing research in general. However, it is possible that other nursing journals are also publishing research articles based on the misrepresentation of random sampling. The review is also limited because in several of the articles the sampling method was not completely clearly stated, and in this circumstance a judgment has been made as to the sampling method employed, based on the indications given by author(s). Quantitative researchers in nursing should be very careful that the statistical techniques they use are appropriate for the design and sampling methods of their studies. If the techniques they employ are not appropriate, they run the risk of misinterpreting findings by using inappropriate, unrepresentative and biased samples.

  7. CONSISTENCY UNDER SAMPLING OF EXPONENTIAL RANDOM GRAPH MODELS.

    Science.gov (United States)

    Shalizi, Cosma Rohilla; Rinaldo, Alessandro

    2013-04-01

    The growing availability of network data and of scientific interest in distributed systems has led to the rapid development of statistical models of network structure. Typically, however, these are models for the entire network, while the data consists only of a sampled sub-network. Parameters for the whole network, which is what is of interest, are estimated by applying the model to the sub-network. This assumes that the model is consistent under sampling, or, in terms of the theory of stochastic processes, that it defines a projective family. Focusing on the popular class of exponential random graph models (ERGMs), we show that this apparently trivial condition is in fact violated by many popular and scientifically appealing models, and that satisfying it drastically limits ERGM's expressive power. These results are actually special cases of more general results about exponential families of dependent random variables, which we also prove. Using such results, we offer easily checked conditions for the consistency of maximum likelihood estimation in ERGMs, and discuss some possible constructive responses.

  8. Occupational position and its relation to mental distress in a random sample of Danish residents

    DEFF Research Database (Denmark)

    Rugulies, Reiner Ernst; Madsen, Ida E H; Nielsen, Maj Britt D

    2010-01-01

    PURPOSE: To analyze the distribution of depressive, anxiety, and somatization symptoms across different occupational positions in a random sample of Danish residents. METHODS: The study sample consisted of 591 Danish residents (50% women), aged 20-65, drawn from an age- and gender-stratified random...... sample of the Danish population. Participants filled out a survey that included the 92 item version of the Hopkins Symptom Checklist (SCL-92). We categorized occupational position into seven groups: high- and low-grade non-manual workers, skilled and unskilled manual workers, high- and low-grade self...

  9. A Table-Based Random Sampling Simulation for Bioluminescence Tomography

    Directory of Open Access Journals (Sweden)

    Xiaomeng Zhang

    2006-01-01

    Full Text Available As a popular simulation of photon propagation in turbid media, the main problem of the Monte Carlo (MC) method is its cumbersome computation. In this work a table-based random sampling simulation (TBRS) is proposed. The key idea of TBRS is to simplify multisteps of scattering to a single-step process, through randomly table querying, thus greatly reducing the computing complexity of the conventional MC algorithm and expediting the computation. The TBRS simulation is a fast algorithm of the conventional MC simulation of photon propagation. It retained the merits of flexibility and accuracy of conventional MC method and adapted well to complex geometric media and various source shapes. Both MC simulations were conducted in a homogeneous medium in our work. Also, we present a reconstructing approach to estimate the position of the fluorescent source based on the trial-and-error theory as a validation of the TBRS algorithm. Good agreement is found between the conventional MC simulation and the TBRS simulation.
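
    As an illustration of the table-querying idea (a toy sketch only, not the algorithm of the paper; the step count, scattering coefficient and isotropic phase function are assumptions), a conventional MC run can be used once to tabulate net displacements after a fixed number of scattering steps, after which photons advance by a single random table lookup:

```python
import numpy as np

rng = np.random.default_rng(0)

def build_displacement_table(mu_s=10.0, steps_per_entry=20, n_entries=100_000):
    """One-off conventional MC run: tabulate the net radial displacement
    produced by a fixed number of isotropic scattering steps with
    exponentially distributed free paths (scattering coefficient mu_s)."""
    pos = np.zeros((n_entries, 3))
    for _ in range(steps_per_entry):
        step = rng.exponential(1.0 / mu_s, size=n_entries)
        cos_t = rng.uniform(-1.0, 1.0, size=n_entries)       # isotropic scattering
        phi = rng.uniform(0.0, 2.0 * np.pi, size=n_entries)
        sin_t = np.sqrt(1.0 - cos_t ** 2)
        pos += step[:, None] * np.column_stack((sin_t * np.cos(phi),
                                                sin_t * np.sin(phi),
                                                cos_t))
    return np.linalg.norm(pos, axis=1)

table = build_displacement_table()

def sample_displacements(n):
    """TBRS-style query: a single random table lookup replaces the whole
    multi-step scattering sequence."""
    return table[rng.integers(0, len(table), size=n)]

print(sample_displacements(5))
```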

  10. A randomized controlled trial of inhaled corticosteroids (ICS) on markers of epithelial–mesenchymal transition (EMT) in large airway samples in COPD: an exploratory proof of concept study

    Directory of Open Access Journals (Sweden)

    Sohal SS

    2014-05-01

    Full Text Available Sukhwinder Singh Sohal,1,* Amir Soltani,1,* David Reid,1,2 Chris Ward,1,3 Karen E Wills,1,4 H Konrad Muller,1 Eugene Haydn Walters1 1National Health and Medical Research Council Centre of Research Excellence for Chronic Respiratory Disease, School of Medicine, University of Tasmania, Hobart, Tasmania, Australia; 2Iron Metabolism Laboratory, Queensland Institute of Medical Research, Brisbane, Queensland, Australia; 3Institute of Cellular Medicine, Newcastle University, Newcastle upon Tyne, Tyne and Wear, UK; 4Department of Biostatistics, Menzies Research Institute Tasmania, University of Tasmania, Hobart, Tasmania, Australia *These authors contributed equally to this work. Background: We recently reported that epithelial–mesenchymal transition (EMT) is active in the airways in chronic obstructive pulmonary disease (COPD), suggesting presence of an active profibrotic and promalignant stroma. With no data available on potential treatment effects, we undertook a blinded analysis of inhaled corticosteroids (ICS) effects versus placebo on EMT markers in previously obtained endobronchial biopsies in COPD patients, as a “proof of concept” study. Methods: Assessment of the effects of inhaled fluticasone propionate (FP; 500 µg twice daily) for 6 months versus placebo in 34 COPD patients (23 on fluticasone propionate and eleven on placebo). The end points were epidermal growth factor receptor (EGFR; marker of epithelial activation) and the biomarkers of EMT: reticular basement membrane (Rbm) fragmentation (“hallmark” structural marker), matrix metalloproteinase-9 (MMP-9) cell expression, and S100A4 expression in basal epithelial and Rbm cells (mesenchymal transition markers). Results: Epithelial activation, “clefts/fragmentation” in the Rbm, and changes in the other biomarkers all regressed on ICS, at or close to conventional levels of statistical significance. From these data, we have been able to nominate primary and secondary end points and develop

  11. A randomized, placebo-controlled study

    OpenAIRE

    Hall, Franziska van

    2012-01-01

    Introduction: Repetitive transcranial magnetic stimulation (rTMS) is a well-tolerated, non-invasive method which has been shown to have mild antidepressant effects and is used as an "add-on" therapy in treating pharmaco-resistant major depression. Objective: The efficacy of an escitalopram plus rTMS combination treatment was evaluated and compared to escitalopram plus sham rTMS. Methods: We designed a four-week, randomized, rater-blinded, controlled add-on study with two trea...

  12. Treatability study sample exemption: update

    International Nuclear Information System (INIS)

    1997-01-01

    This document is a RCRA Information Brief intended to update the information in the 1991 Small-Scale Treatability Study Information Brief, and to address questions about the waste and treatability study sample exemptions that have arisen since References 3 and 5 were published

  13. A Comparison of the Number of Men Who Have Sex with Men among Rural-To-Urban Migrants with Non-Migrant Rural and Urban Residents in Wuhan, China: A GIS/GPS-Assisted Random Sample Survey Study

    Science.gov (United States)

    Chen, Xinguang; Yu, Bin; Zhou, Dunjin; Zhou, Wang; Gong, Jie; Li, Shiyue; Stanton, Bonita

    2015-01-01

    Background Mobile populations and men who have sex with men (MSM) play an increasing role in the current HIV epidemic in China and across the globe. While considerable research has addressed both of these at-risk populations, more effective HIV control requires accurate data on the number of MSM at the population level, particularly MSM among migrant populations. Methods Survey data from a random sample of male rural-to-urban migrants (aged 18-45, n=572) in Wuhan, China were analyzed and compared with those of randomly selected non-migrant urban (n=566) and rural counterparts (n=580). The GIS/GPS technologies were used for sampling and the survey estimation method was used for data analysis. Results HIV-related risk behaviors among rural-to-urban migrants were similar to those among the two comparison groups. The estimated proportion of MSM among migrants [95% CI] was 5.8% [4.7, 6.8], higher than 2.8% [1.2, 4.5] for rural residents and 1.0% [0.0, 2.4] for urban residents, respectively. Among these migrants, the MSM were more likely than non-MSM to be older, to be married, and to have migrated to more cities. They were also more likely to co-habit with others in rental properties located in new towns and in neighborhoods with fewer old acquaintances and more entertainment establishments. In addition, they were more likely to engage in commercial sex and less likely to consistently use condoms. Conclusion Findings of this study indicate that compared to rural and urban populations, the migrant population in Wuhan consists of a higher proportion of MSM who also exhibit higher levels of HIV-related risk behaviors. More effective interventions should target this population with a focus on neighborhood factors, social capital and collective efficacy for risk reduction. PMID:26241900

  14. Monitoring oil persistence on beaches : SCAT versus stratified random sampling designs

    International Nuclear Information System (INIS)

    Short, J.W.; Lindeberg, M.R.; Harris, P.M.; Maselko, J.M.; Pella, J.J.; Rice, S.D.

    2003-01-01

    In the event of a coastal oil spill, shoreline clean-up assessment teams (SCAT) commonly rely on visual inspection of the entire affected area to monitor the persistence of the oil on beaches. Occasionally, pits are excavated to evaluate the persistence of subsurface oil. This approach is practical for directing clean-up efforts directly following a spill. However, sampling of the 1989 Exxon Valdez oil spill in Prince William Sound 12 years later has shown that visual inspection combined with pit excavation does not offer estimates of contaminated beach area or of stranded oil volumes. This information is needed to statistically evaluate the significance of change with time. Assumptions regarding the correlation of visually-evident surface oil and cryptic subsurface oil are usually not evaluated as part of the SCAT mandate. Stratified random sampling can avoid such problems and could produce precise estimates of oiled area and volume that allow for statistical assessment of major temporal trends and the extent of the impact. The 2001 sampling of the shoreline of Prince William Sound showed that 15 per cent of surface oil occurrences were associated with subsurface oil. This study demonstrates the usefulness of the stratified random sampling method and shows how sampling design parameters impact statistical outcome. Power analysis based on the study results indicates that optimum power is derived when unnecessary stratification is avoided. It was emphasized that sampling effort should be balanced between choosing sufficient beaches for sampling and the intensity of sampling
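
    To make the estimator concrete, a minimal sketch of a stratified random sampling estimate of a shoreline total and its variance is given below; the strata, segment counts and oiled-area values are hypothetical and only illustrate the kind of calculation the design supports:

```python
import numpy as np

def stratified_estimate(strata):
    """Stratified random sampling estimator of a shoreline total: for each
    stratum supply (N_h, sample_values); returns the estimated total and its
    variance (with finite population correction). Values could be, e.g.,
    oiled area per sampled beach segment."""
    total, var = 0.0, 0.0
    for N_h, y in strata:
        y = np.asarray(y, dtype=float)
        total += N_h * y.mean()
        var += N_h * (N_h - len(y)) * y.var(ddof=1) / len(y)
    return total, var

# Hypothetical strata: (number of segments, oiled m^2 in the sampled segments).
strata = [(120, [0, 0, 3.2, 0, 1.1, 0, 0, 4.0]),        # lightly oiled stratum
          (35,  [12.5, 8.0, 15.2, 9.9, 20.1, 11.3])]     # heavily oiled stratum
total, var = stratified_estimate(strata)
print(f"estimated total oiled area: {total:.0f} m^2, SE = {var ** 0.5:.0f} m^2")
```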

  15. Sampling Polya-Gamma random variates: alternate and approximate techniques

    OpenAIRE

    Windle, Jesse; Polson, Nicholas G.; Scott, James G.

    2014-01-01

    Efficiently sampling from the Pólya-Gamma distribution, ${PG}(b,z)$, is an essential element of Pólya-Gamma data augmentation. Polson et al. (2013) show how to efficiently sample from the ${PG}(1,z)$ distribution. We build two new samplers that offer improved performance when sampling from the ${PG}(b,z)$ distribution and $b$ is not unity.
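
    A hedged sketch of one way to draw approximate PG(b, z) variates, using the truncated infinite-sum (Gamma) representation from Polson, Scott and Windle (2013); the truncation length is an assumption and introduces a small bias, which is exactly the kind of limitation the alternate samplers above aim to avoid:

```python
import numpy as np

def sample_pg_truncated(b, z, n_terms=200, size=1, rng=None):
    """Approximate draws from the Polya-Gamma PG(b, z) distribution via the
    truncated sum representation:
        omega = (1 / (2*pi^2)) * sum_k g_k / ((k - 1/2)^2 + z^2 / (4*pi^2)),
    with g_k ~ Gamma(b, 1) independent. Truncation introduces a small bias."""
    rng = np.random.default_rng() if rng is None else rng
    k = np.arange(1, n_terms + 1)
    denom = (k - 0.5) ** 2 + (z ** 2) / (4.0 * np.pi ** 2)    # shape (n_terms,)
    g = rng.gamma(shape=b, scale=1.0, size=(size, n_terms))   # Gamma(b, 1) draws
    return (g / denom).sum(axis=1) / (2.0 * np.pi ** 2)

# Sanity check: the mean of PG(b, z) is (b / (2*z)) * tanh(z / 2).
draws = sample_pg_truncated(b=2.0, z=1.5, size=50_000)
print(draws.mean(), 2.0 / (2 * 1.5) * np.tanh(1.5 / 2))
```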

  16. [Saarland Growth Study: sampling design].

    Science.gov (United States)

    Danker-Hopfe, H; Zabransky, S

    2000-01-01

    The use of reference data to evaluate the physical development of children and adolescents is part of the daily routine in the paediatric outpatient clinic. The construction of such references requires the collection of extensive reference data. There are different kinds of reference data: cross-sectional references, which are based on data collected from a large representative cross-sectional sample of the population; longitudinal references, which are based on follow-up surveys of usually smaller samples of individuals from birth to maturity; and mixed longitudinal references, which are a combination of longitudinal and cross-sectional reference data. The advantages and disadvantages of the different methods of data collection and the resulting reference data are discussed. The Saarland Growth Study was conducted for several reasons: growth processes are subject to secular changes, there are no specific reference data for children and adolescents from this part of the country, and the growth charts in use in paediatric practice are possibly no longer appropriate. Therefore, the Saarland Growth Study served two purposes: a) to create up-to-date regional reference data and b) to create a database for future studies on secular trends in growth processes of children and adolescents from Saarland. The present contribution focusses on general remarks on the sampling design of (cross-sectional) growth surveys and its inferences for the design of the present study.

  17. Estimates of Inequality Indices Based on Simple Random, Ranked Set, and Systematic Sampling

    OpenAIRE

    Bansal, Pooja; Arora, Sangeeta; Mahajan, Kalpana K.

    2013-01-01

    Gini index, Bonferroni index, and Absolute Lorenz index are some popular indices of inequality showing different features of inequality measurement. In general, the simple random sampling procedure is commonly used to estimate the inequality indices and their related inference. The key condition that the samples must be drawn via a simple random sampling procedure, though it makes calculations much simpler, is often violated in practice as the data does not always yield simple random ...

  18. Fixed-location hydroacoustic monitoring designs for estimating fish passage using stratified random and systematic sampling

    International Nuclear Information System (INIS)

    Skalski, J.R.; Hoffman, A.; Ransom, B.H.; Steig, T.W.

    1993-01-01

    Five alternate sampling designs are compared using 15 d of 24-h continuous hydroacoustic data to identify the most favorable approach to fixed-location hydroacoustic monitoring of salmonid outmigrants. Four alternative approaches to systematic sampling are compared among themselves and with stratified random sampling (STRS). Stratifying systematic sampling (STSYS) on a daily basis is found to reduce sampling error in multiday monitoring studies. Although sampling precision was predictable with varying levels of effort in STRS, neither magnitude nor direction of change in precision was predictable when effort was varied in systematic sampling (SYS). Furthermore, modifying systematic sampling to include replicated (e.g., nested) sampling (RSYS) is further shown to provide unbiased point and variance estimates as does STRS. Numerous short sampling intervals (e.g., 12 samples of 1-min duration per hour) must be monitored hourly using RSYS to provide efficient, unbiased point and interval estimates. For equal levels of effort, STRS outperformed all variations of SYS examined. Parametric approaches to confidence interval estimates are found to be superior to nonparametric interval estimates (i.e., bootstrap and jackknife) in estimating total fish passage. 10 refs., 1 fig., 8 tabs

  19. Path integral methods for primordial density perturbations - sampling of constrained Gaussian random fields

    International Nuclear Information System (INIS)

    Bertschinger, E.

    1987-01-01

    Path integrals may be used to describe the statistical properties of a random field such as the primordial density perturbation field. In this framework the probability distribution is given for a Gaussian random field subjected to constraints such as the presence of a protovoid or supercluster at a specific location in the initial conditions. An algorithm has been constructed for generating samples of a constrained Gaussian random field on a lattice using Monte Carlo techniques. The method makes possible a systematic study of the density field around peaks or other constrained regions in the biased galaxy formation scenario, and it is effective for generating initial conditions for N-body simulations with rare objects in the computational volume. 21 references
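
    A minimal sketch of the constrained-sampling idea on a small 1D lattice (not the paper's path-integral formulation): draw an unconstrained Gaussian realisation and add the standard conditional correction so that linear constraints, such as a prescribed peak height at a given site, are satisfied exactly. The kernel, lattice size and constraint are illustrative assumptions.

```python
import numpy as np

def sample_constrained_gaussian(C, K, d, rng=None):
    """Draw one sample of a zero-mean Gaussian field with covariance C,
    conditioned on the linear constraints K @ x = d. C is (n, n), K is
    (m, n), d is (m,); the correction term maps an unconstrained draw to a
    draw from the conditional distribution."""
    rng = np.random.default_rng() if rng is None else rng
    n = C.shape[0]
    L = np.linalg.cholesky(C + 1e-10 * np.eye(n))   # jitter for stability
    x = L @ rng.standard_normal(n)                  # unconstrained realisation
    correction = C @ K.T @ np.linalg.solve(K @ C @ K.T, d - K @ x)
    return x + correction

# Toy example: 1D lattice with squared-exponential covariance, constrained so
# that the field equals +3 at lattice site 50 (a prescribed "peak").
n = 100
t = np.arange(n)
C = np.exp(-0.5 * ((t[:, None] - t[None, :]) / 10.0) ** 2)
K = np.zeros((1, n)); K[0, 50] = 1.0
sample = sample_constrained_gaussian(C, K, d=np.array([3.0]))
print(sample[50])   # ~3.0, up to numerical precision
```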

  20. Importance sampling of heavy-tailed iterated random functions

    NARCIS (Netherlands)

    B. Chen (Bohan); C.H. Rhee (Chang-Han); A.P. Zwart (Bert)

    2016-01-01

    We consider a stochastic recurrence equation of the form $Z_{n+1} = A_{n+1} Z_n + B_{n+1}$, where $\mathbb{E}[\log A_1]<0$, $\mathbb{E}[\log^+ B_1]<\infty$ and $\{(A_n,B_n)\}_{n\in\mathbb{N}}$ is an i.i.d. sequence of positive random vectors. The stationary distribution of this Markov
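
    For orientation, a crude Monte Carlo sketch of the recurrence is shown below with illustrative choices of A and B satisfying the stated moment conditions; the naive tail-probability estimate it produces is precisely what becomes unreliable for rare events, motivating the importance sampling studied in the paper.

```python
import numpy as np

def simulate_recurrence(n_steps, n_paths, rng=None):
    """Simulate Z_{n+1} = A_{n+1} Z_n + B_{n+1} with illustrative positive
    i.i.d. coefficients satisfying E[log A_1] < 0 (here A is lognormal with
    log-mean -0.5, B is Exponential(1)); returns Z_n for each path."""
    rng = np.random.default_rng() if rng is None else rng
    z = np.zeros(n_paths)
    for _ in range(n_steps):
        a = rng.lognormal(mean=-0.5, sigma=1.0, size=n_paths)  # E[log A] = -0.5 < 0
        b = rng.exponential(scale=1.0, size=n_paths)
        z = a * z + b
    return z

# Crude Monte Carlo estimate of the stationary tail P(Z > u); for heavy-tailed
# cases this is exactly where importance sampling becomes necessary.
z = simulate_recurrence(n_steps=200, n_paths=100_000)
u = 20.0
print("P(Z > %.0f) ~ %.2e" % (u, (z > u).mean()))
```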

  1. LOD score exclusion analyses for candidate genes using random population samples.

    Science.gov (United States)

    Deng, H W; Li, J; Recker, R R

    2001-05-01

    While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes with random population samples. We develop a LOD score approach for exclusion analyses of candidate genes with random population samples. Under this approach, specific genetic effects and inheritance models at candidate genes can be analysed and if a LOD score is ≤ -2.0, the locus can be excluded from having an effect larger than that specified. Computer simulations show that, with sample sizes often employed in association studies, this approach has high power to exclude a gene from having moderate genetic effects. In contrast to regular association analyses, population admixture will not affect the robustness of our analyses; in fact, it renders our analyses more conservative and thus any significant exclusion result is robust. Our exclusion analysis complements association analysis for candidate genes in random population samples and is parallel to the exclusion mapping analyses that may be conducted in linkage analyses with pedigrees or relative pairs. The usefulness of the approach is demonstrated by an application to test the importance of vitamin D receptor and estrogen receptor genes underlying the differential risk to osteoporotic fractures.

  2. Randomized branch sampling to estimate fruit production in Pecan trees cv. ‘Barton’

    Directory of Open Access Journals (Sweden)

    Filemom Manoel Mokochinski

    Full Text Available ABSTRACT: Sampling techniques to quantify the production of fruits are still very scarce and create a gap in crop development research. This study was conducted in a rural property in the county of Cachoeira do Sul - RS to estimate the efficiency of randomized branch sampling (RBS) in quantifying the production of pecan fruit at three different ages (5, 7 and 10 years). Two selection techniques were tested: the probability proportional to the diameter (PPD) and the uniform probability (UP) techniques, which were performed on nine trees, three from each age and randomly chosen. The RBS underestimated fruit production for all ages, and its main drawback was the high sampling error (125.17% for PPD and 111.04% for UP). The UP was regarded as more efficient than the PPD, though both techniques estimated similar production and similar experimental errors. In conclusion, we reported that branch sampling was inaccurate for this case study, requiring new studies to produce estimates with smaller sampling error.
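
    A hedged sketch of the probability-proportional-to-diameter (PPD) variant of randomized branch sampling on a toy tree is given below; the tree structure and fruit counts are invented, and the path estimator shown is the standard Horvitz-Thompson expansion rather than the exact protocol used in the study.

```python
import random

def rbs_estimate(tree, rng=random):
    """One randomized branch sampling (RBS) estimate of total fruit on a tree.
    `tree` is a nested dict {"diameter": ..., "fruit": ..., "children": [...]};
    "fruit" is the count borne directly on that segment. At each fork one
    branch is chosen with probability proportional to diameter (the PPD
    variant); fruit counted along the path is expanded by the inverse of the
    cumulative selection probability, giving an unbiased estimator."""
    node, prob_path, estimate = tree, 1.0, 0.0
    while True:
        estimate += node["fruit"] / prob_path
        if not node["children"]:
            return estimate
        weights = [c["diameter"] for c in node["children"]]
        idx = rng.choices(range(len(weights)), weights=weights, k=1)[0]
        prob_path *= weights[idx] / sum(weights)
        node = node["children"][idx]

# Toy tree: trunk with two scaffold limbs, each carrying terminal shoots.
tree = {"diameter": 30, "fruit": 0, "children": [
    {"diameter": 18, "fruit": 5, "children": [
        {"diameter": 9, "fruit": 40, "children": []},
        {"diameter": 7, "fruit": 25, "children": []}]},
    {"diameter": 12, "fruit": 3, "children": [
        {"diameter": 6, "fruit": 30, "children": []}]}]}
estimates = [rbs_estimate(tree) for _ in range(10_000)]
print(sum(estimates) / len(estimates))   # close to the true total of 103
```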

  3. Developing Sampling Frame for Case Study: Challenges and Conditions

    Science.gov (United States)

    Ishak, Noriah Mohd; Abu Bakar, Abu Yazid

    2014-01-01

    Because of the requirements of statistical analysis, the issue of random sampling is pertinent to any quantitative study. Unlike in quantitative studies, the elimination of inferential statistical analysis allows qualitative researchers to be more creative in dealing with the sampling issue. Since results from a qualitative study cannot be generalized to the bigger population,…

  4. Iterative algorithm of discrete Fourier transform for processing randomly sampled NMR data sets

    International Nuclear Information System (INIS)

    Stanek, Jan; Kozminski, Wiktor

    2010-01-01

    Spectra obtained by application of multidimensional Fourier Transformation (MFT) to sparsely sampled nD NMR signals are usually corrupted due to missing data. In the present paper this phenomenon is investigated on simulations and experiments. An effective iterative algorithm for artifact suppression for sparse on-grid NMR data sets is discussed in detail. It includes automated peak recognition based on statistical methods. The results enable one to study NMR spectra of high dynamic range of peak intensities preserving benefits of random sampling, namely the superior resolution in indirectly measured dimensions. Experimental examples include 3D 15N- and 13C-edited NOESY-HSQC spectra of human ubiquitin.
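
    The flavour of such artifact suppression can be conveyed by a simple 1D iterative thresholding sketch (a generic scheme, not the authors' algorithm; the signal, sampling density and threshold are assumptions): dominant spectral components are kept, the measured points are re-enforced, and the sparse-sampling artifacts shrink with each pass.

```python
import numpy as np

def ist_reconstruct(measured, mask, n_iter=50, thresh_factor=0.3):
    """Iterative thresholding sketch for a 1D randomly (sparsely) sampled
    signal: keep the dominant spectral components, transform back, re-enforce
    the measured time-domain points, and repeat."""
    x = measured.astype(complex).copy()
    for _ in range(n_iter):
        spec = np.fft.fft(x)
        spec[np.abs(spec) < thresh_factor * np.abs(spec).max()] = 0.0
        x = np.fft.ifft(spec)
        x[mask] = measured[mask]          # data-consistency step
    return np.fft.fft(x)

# Synthetic example: two on-grid frequencies, only 25% of points measured.
rng = np.random.default_rng(0)
n = 256
t = np.arange(n)
signal = np.exp(2j * np.pi * 28 * t / n) + 0.5 * np.exp(2j * np.pi * 80 * t / n)
mask = rng.random(n) < 0.25
measured = np.where(mask, signal, 0.0)
spectrum = ist_reconstruct(measured, mask)
print(sorted(np.argsort(np.abs(spectrum))[-2:]))   # expected: [28, 80]
```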

  5. STATISTICAL LANDMARKS AND PRACTICAL ISSUES REGARDING THE USE OF SIMPLE RANDOM SAMPLING IN MARKET RESEARCHES

    Directory of Open Access Journals (Sweden)

    CODRUŢA DURA

    2010-01-01

    Full Text Available The sample represents a particular segment of the statistical population chosen to represent it as a whole. The representativeness of the sample determines the accuracy for estimations made on the basis of calculating the research indicators and the inferential statistics. The method of random sampling is part of probabilistic methods which can be used within marketing research and it is characterized by the fact that it imposes the requirement that each unit belonging to the statistical population should have an equal chance of being selected for the sampling process. When the simple random sampling is meant to be rigorously put into practice, it is recommended to use the technique of random number tables in order to configure the sample which will provide information that the marketer needs. The paper also details the practical procedure implemented in order to create a sample for a marketing research by generating random numbers using the facilities offered by Microsoft Excel.
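
    For readers who prefer a scripted equivalent of the random number table or spreadsheet procedure described above, a minimal sketch is shown below (the frame of customer IDs is invented):

```python
import random

def simple_random_sample(frame, n, seed=None):
    """Draw a simple random sample of size n (without replacement) from a
    sampling frame, the programmatic analogue of using a random number table
    or a spreadsheet column of random numbers."""
    rng = random.Random(seed)
    return rng.sample(frame, n)

# Illustrative frame of 5,000 customer IDs; every unit has the same
# selection probability n/N, as simple random sampling requires.
frame = [f"customer_{i:05d}" for i in range(5000)]
print(simple_random_sample(frame, n=10, seed=42))
```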

  6. Revisiting random walk based sampling in networks: evasion of burn-in period and frequent regenerations.

    Science.gov (United States)

    Avrachenkov, Konstantin; Borkar, Vivek S; Kadavankandy, Arun; Sreedharan, Jithin K

    2018-01-01

    In the framework of network sampling, random walk (RW) based estimation techniques provide many pragmatic solutions while uncovering the unknown network as little as possible. Despite several theoretical advances in this area, RW based sampling techniques usually make a strong assumption that the samples are in stationary regime, and hence are impelled to leave out the samples collected during the burn-in period. This work proposes two sampling schemes without burn-in time constraint to estimate the average of an arbitrary function defined on the network nodes, for example, the average age of users in a social network. The central idea of the algorithms lies in exploiting regeneration of RWs at revisits to an aggregated super-node or to a set of nodes, and in strategies to enhance the frequency of such regenerations either by contracting the graph or by making the hitting set larger. Our first algorithm, which is based on reinforcement learning (RL), uses stochastic approximation to derive an estimator. This method can be seen as intermediate between purely stochastic Markov chain Monte Carlo iterations and deterministic relative value iterations. The second algorithm, which we call the Ratio with Tours (RT)-estimator, is a modified form of respondent-driven sampling (RDS) that accommodates the idea of regeneration. We study the methods via simulations on real networks. We observe that the trajectories of RL-estimator are much more stable than those of standard random walk based estimation procedures, and its error performance is comparable to that of respondent-driven sampling (RDS) which has a smaller asymptotic variance than many other estimators. Simulation studies also show that the mean squared error of RT-estimator decays much faster than that of RDS with time. The newly developed RW based estimators (RL- and RT-estimators) allow to avoid burn-in period, provide better control of stability along the sample path, and overall reduce the estimation time. Our
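
    A compact sketch in the spirit of the tour/regeneration idea (not the paper's RL- or RT-estimators themselves; the graph, anchor node and target function are illustrative): a simple random walk is split into tours at returns to an anchor node, and a degree-reweighted ratio estimator corrects for the walk's bias toward high-degree nodes.

```python
import random

def tour_estimate(graph, f, anchor, n_tours=500, rng=random):
    """Estimate the average of f over the nodes of an undirected graph with a
    simple random walk split into regeneration tours at returns to `anchor`.
    `graph` maps node -> list of neighbours; the 1/degree weights undo the
    stationary bias of the walk toward high-degree nodes."""
    num, den = 0.0, 0.0
    node, completed = anchor, 0
    while completed < n_tours:
        num += f(node) / len(graph[node])
        den += 1.0 / len(graph[node])
        node = rng.choice(graph[node])
        if node == anchor:
            completed += 1        # a tour closes on every return to the anchor
    return num / den

# Small illustrative graph (a path attached to a triangle); true average is 2.5.
graph = {1: [2], 2: [1, 3, 4], 3: [2, 4], 4: [2, 3]}
print(tour_estimate(graph, f=lambda v: v, anchor=2, n_tours=20_000))
```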

  7. Precision of systematic and random sampling in clustered populations: habitat patches and aggregating organisms.

    Science.gov (United States)

    McGarvey, Richard; Burch, Paul; Matthews, Janet M

    2016-01-01

    Natural populations of plants and animals spatially cluster because (1) suitable habitat is patchy, and (2) within suitable habitat, individuals aggregate further into clusters of higher density. We compare the precision of random and systematic field sampling survey designs under these two processes of species clustering. Second, we evaluate the performance of 13 estimators for the variance of the sample mean from a systematic survey. Replicated simulated surveys, as counts from 100 transects, allocated either randomly or systematically within the study region, were used to estimate population density in six spatial point populations including habitat patches and Matérn circular clustered aggregations of organisms, together and in combination. The standard one-start aligned systematic survey design, a uniform 10 x 10 grid of transects, was much more precise. Variances of the 10 000 replicated systematic survey mean densities were one-third to one-fifth of those from randomly allocated transects, implying transect sample sizes giving equivalent precision by random survey would need to be three to five times larger. Organisms being restricted to patches of habitat was alone sufficient to yield this precision advantage for the systematic design. But this improved precision for systematic sampling in clustered populations is underestimated by standard variance estimators used to compute confidence intervals. True variance for the survey sample mean was computed from the variance of 10 000 simulated survey mean estimates. Testing 10 published and three newly proposed variance estimators, the two variance estimators (v) that corrected for inter-transect correlation (ν₈ and ν(W)) were the most accurate and also the most precise in clustered populations. These greatly outperformed the two "post-stratification" variance estimators (ν₂ and ν₃) that are now more commonly applied in systematic surveys. Similar variance estimator performance rankings were found with
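
    The precision advantage reported above can be reproduced qualitatively with a small simulation (a sketch under assumed cluster and transect parameters, not the paper's design): a crude Matern-like clustered point population is surveyed with either randomly placed or one-start aligned systematic transects, and the between-replicate variance of the mean count is compared.

```python
import numpy as np

rng = np.random.default_rng(1)

# Clustered point population on a 100 x 100 region: parent patches with
# Poisson numbers of offspring scattered around them, mimicking organisms
# aggregated inside habitat patches.
parents = rng.uniform(0, 100, size=(30, 2))
points = np.vstack([p + rng.normal(0, 2.0, size=(rng.poisson(40), 2))
                    for p in parents])

def transect_counts(x_positions):
    """Count points falling in unit-width vertical transects at x_positions."""
    return np.array([np.sum((points[:, 0] >= x) & (points[:, 0] < x + 1))
                     for x in x_positions])

n_reps, n_transects = 2000, 10
random_means, systematic_means = [], []
for _ in range(n_reps):
    xr = rng.uniform(0, 99, size=n_transects)        # random allocation
    start = rng.uniform(0, 10)
    xs = start + 10 * np.arange(n_transects)         # one random start, equal spacing
    random_means.append(transect_counts(xr).mean())
    systematic_means.append(transect_counts(xs).mean())

print("var(random):    ", np.var(random_means))
print("var(systematic):", np.var(systematic_means))   # typically much smaller
```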

  8. Correlated random sampling for multivariate normal and log-normal distributions

    International Nuclear Information System (INIS)

    Žerovnik, Gašper; Trkov, Andrej; Kodeli, Ivan A.

    2012-01-01

    A method for correlated random sampling is presented. Representative samples for multivariate normal or log-normal distribution can be produced. Furthermore, any combination of normally and log-normally distributed correlated variables may be sampled to any requested accuracy. Possible applications of the method include sampling of resonance parameters which are used for reactor calculations.
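
    A minimal sketch of correlated sampling via Cholesky factorisation is given below (the means, variances and correlation are illustrative; the paper's method of transformation of correlation coefficients is not reproduced here). For the log-normal case, the normal draws are simply exponentiated, which is why the correlation of the resulting variables differs slightly from the one imposed on the underlying normals.

```python
import numpy as np

def correlated_samples(mean, cov, n, lognormal=False, rng=None):
    """Draw n correlated samples from a multivariate normal distribution with
    the given mean vector and covariance matrix via Cholesky factorisation;
    if lognormal=True the normal draws are exponentiated, i.e. `mean` and
    `cov` parameterise the underlying normal distribution."""
    rng = np.random.default_rng() if rng is None else rng
    L = np.linalg.cholesky(np.asarray(cov))
    z = rng.standard_normal((n, len(mean)))
    x = np.asarray(mean) + z @ L.T
    return np.exp(x) if lognormal else x

# Two resonance-parameter-like quantities with a 0.8 correlation on the normals.
mean = [1.0, 2.0]
cov = [[0.04, 0.8 * 0.2 * 0.3],
       [0.8 * 0.2 * 0.3, 0.09]]
draws = correlated_samples(mean, cov, n=100_000, lognormal=True)
print(np.corrcoef(draws.T))   # correlation of the lognormal variables
```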

  9. Randomized, interventional, prospective, comparative study to ...

    African Journals Online (AJOL)

    Randomized, interventional, prospective, comparative study to evaluate the antihypertensive efficacy and tolerability of ramipril versus telmisartan in stage 1 hypertensive patients with diabetes mellitus.

  10. Location and multi-depot vehicle routing for emergency vehicles using tour coverage and random sampling

    Directory of Open Access Journals (Sweden)

    Alireza Goli

    2015-09-01

    Full Text Available Distribution and optimum allocation of emergency resources are the most important tasks, which need to be accomplished during crisis. When a natural disaster such as earthquake, flood, etc. takes place, it is necessary to deliver rescue efforts as quickly as possible. Therefore, it is important to find optimum location and distribution of emergency relief resources. When a natural disaster occurs, it is not possible to reach some damaged areas. In this paper, location and multi-depot vehicle routing for emergency vehicles using tour coverage and random sampling is investigated. In this study, there is no need to visit all the places and some demand points receive their needs from the nearest possible location. The proposed method is tested on randomly generated problem instances of different sizes. The preliminary results indicate that the proposed method was capable of reaching desirable solutions in a reasonable amount of time.

  11. Prevalence and correlates of problematic smartphone use in a large random sample of Chinese undergraduates.

    Science.gov (United States)

    Long, Jiang; Liu, Tie-Qiao; Liao, Yan-Hui; Qi, Chang; He, Hao-Yu; Chen, Shu-Bao; Billieux, Joël

    2016-11-17

    Smartphones are becoming a daily necessity for most undergraduates in Mainland China. Because the present scenario of problematic smartphone use (PSU) is largely unexplored, in the current study we aimed to estimate the prevalence of PSU and to screen suitable predictors for PSU among Chinese undergraduates in the framework of the stress-coping theory. A sample of 1062 undergraduate smartphone users was recruited by means of the stratified cluster random sampling strategy between April and May 2015. The Problematic Cellular Phone Use Questionnaire was used to identify PSU. We evaluated five candidate risk factors for PSU by using logistic regression analysis while controlling for demographic characteristics and specific features of smartphone use. The prevalence of PSU among Chinese undergraduates was estimated to be 21.3%. The risk factors for PSU were majoring in the humanities, high monthly income from the family (≥1500 RMB), serious emotional symptoms, high perceived stress, and perfectionism-related factors (high doubts about actions, high parental expectations). PSU among undergraduates appears to be ubiquitous and thus constitutes a public health issue in Mainland China. Although further longitudinal studies are required to test whether PSU is a transient phenomenon or a chronic and progressive condition, our study successfully identified socio-demographic and psychological risk factors for PSU. These results, obtained from a random and thus representative sample of undergraduates, opens up new avenues in terms of prevention and regulation policies.

  12. Acute stress symptoms during the second Lebanon war in a random sample of Israeli citizens.

    Science.gov (United States)

    Cohen, Miri; Yahav, Rivka

    2008-02-01

    The aims of this study were to assess prevalence of acute stress disorder (ASD) and acute stress symptoms (ASS) in Israel during the second Lebanon war. A telephone survey was conducted in July 2006 of a random sample of 235 residents of northern Israel, who were subjected to missile attacks, and of central Israel, who were not subjected to missile attacks. Results indicate that ASS scores were higher in the northern respondents; 6.8% of the northern sample and 3.9% of the central sample met ASD criteria. Appearance of each symptom ranged from 15.4% for dissociative to 88.4% for reexperiencing, with significant differences between northern and central respondents only for reexperiencing and arousal. A low ASD rate and a moderate difference between areas subjected and not subjected to attack were found.

  13. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Science.gov (United States)

    2010-07-01

    40 CFR Protection of Environment, § 761.306, Sampling 1 meter square surfaces by random selection of halves. (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as...

  14. MUP, CEC-DES, STRADE. Codes for uncertainty propagation, experimental design and stratified random sampling techniques

    International Nuclear Information System (INIS)

    Amendola, A.; Astolfi, M.; Lisanti, B.

    1983-01-01

    The report describes how to use the codes: MUP (Monte Carlo Uncertainty Propagation) for uncertainty analysis by Monte Carlo simulation, including correlation analysis, extreme value identification and study of selected ranges of the variable space; CEC-DES (Central Composite Design) for building experimental matrices according to the requirements of Central Composite and Factorial Experimental Designs; and STRADE (Stratified Random Design) for experimental designs based on the Latin Hypercube Sampling techniques. Application fields of the codes are probabilistic risk assessment, experimental design, sensitivity analysis and system identification problems
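
    Since STRADE builds designs on Latin Hypercube Sampling, a minimal sketch of the technique is shown below (the variable ranges are invented): each input range is split into equally probable strata and exactly one point is drawn per stratum in every dimension.

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng=None):
    """Generate a Latin hypercube design on the unit hypercube: each of the
    n_samples rows falls in a distinct one-of-n strata in every dimension,
    giving better space coverage than plain random sampling."""
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform(size=(n_samples, n_dims))
    design = np.empty_like(u)
    for d in range(n_dims):
        perm = rng.permutation(n_samples)            # random stratum order per dimension
        design[:, d] = (perm + u[:, d]) / n_samples  # one point per stratum
    return design

# Example: a 10-run design in 3 input variables, scaled to physical ranges.
design = latin_hypercube(10, 3, rng=np.random.default_rng(0))
lower, upper = np.array([0.0, 10.0, 1e-3]), np.array([1.0, 50.0, 1e-1])
print(lower + design * (upper - lower))
```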

  15. Understanding infidelity: correlates in a national random sample.

    Science.gov (United States)

    Atkins, D C; Baucom, D H; Jacobson, N S

    2001-12-01

    Infidelity is a common phenomenon in marriages but is poorly understood. The current study examined variables related to extramarital sex using data from the 1991-1996 General Social Surveys. Predictor variables were entered into a logistic regression with presence of extramarital sex as the dependent variable. Results demonstrated that divorce, education, age when first married, and 2 "opportunity" variables--respondent's income and work status--significantly affected the likelihood of having engaged in infidelity. Also, there were 3 significant interactions related to infidelity: (a) between age and gender, (b) between marital satisfaction and religious behavior, and (c) between past divorce and educational level. Implications of these findings and directions for future research are discussed.

  16. Influence of population versus convenience sampling on sample characteristics in studies of cognitive aging.

    Science.gov (United States)

    Brodaty, Henry; Mothakunnel, Annu; de Vel-Palumbo, Melissa; Ames, David; Ellis, Kathryn A; Reppermund, Simone; Kochan, Nicole A; Savage, Greg; Trollor, Julian N; Crawford, John; Sachdev, Perminder S

    2014-01-01

    We examined whether differences in findings of studies examining mild cognitive impairment (MCI) were associated with recruitment methods by comparing sample characteristics in two contemporaneous Australian studies, using population-based and convenience sampling. The Sydney Memory and Aging Study invited participants randomly from the electoral roll in defined geographic areas in Sydney. The Australian Imaging, Biomarkers and Lifestyle Study of Ageing recruited cognitively normal (CN) individuals via media appeals and MCI participants via referrals from clinicians in Melbourne and Perth. Demographic and cognitive variables were harmonized, and similar diagnostic criteria were applied to both samples retrospectively. CN participants recruited via convenience sampling were younger, better educated, more likely to be married and have a family history of dementia, and performed better cognitively than those recruited via population-based sampling. MCI participants recruited via population-based sampling had better memory performance and were less likely to carry the apolipoprotein E ε4 allele than clinically referred participants but did not differ on other demographic variables. A convenience sample of normal controls is likely to be younger and better functioning and that of an MCI group likely to perform worse than a purportedly random sample. Sampling bias should be considered when interpreting findings. Copyright © 2014 Elsevier Inc. All rights reserved.

  17. Random Sampling of Correlated Parameters – a Consistent Solution for Unfavourable Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Žerovnik, G., E-mail: gasper.zerovnik@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Trkov, A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Kodeli, I.A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Capote, R. [International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Smith, D.L. [Argonne National Laboratory, 1710 Avenida del Mundo, Coronado, CA 92118-3073 (United States)

    2015-01-15

    Two methods for random sampling according to a multivariate lognormal distribution – the correlated sampling method and the method of transformation of correlation coefficients – are briefly presented. The methods are mathematically exact and enable consistent sampling of correlated inherently positive parameters with given information on the first two distribution moments. Furthermore, a weighted sampling method to accelerate the convergence of parameters with extremely large relative uncertainties is described. However, the method is efficient only for a limited number of correlated parameters.

  18. A random sampling approach for robust estimation of tissue-to-plasma ratio from extremely sparse data.

    Science.gov (United States)

    Chu, Hui-May; Ette, Ene I

    2005-09-02

    This study was performed to develop a new nonparametric approach for the estimation of robust tissue-to-plasma ratio from extremely sparsely sampled paired data (ie, one sample each from plasma and tissue per subject). Tissue-to-plasma ratio was estimated from paired/unpaired experimental data using independent time points approach, area under the curve (AUC) values calculated with the naïve data averaging approach, and AUC values calculated using sampling based approaches (eg, the pseudoprofile-based bootstrap [PpbB] approach and the random sampling approach [our proposed approach]). The random sampling approach involves the use of a 2-phase algorithm. The convergence of the sampling/resampling approaches was investigated, as well as the robustness of the estimates produced by different approaches. To evaluate the latter, new data sets were generated by introducing outlier(s) into the real data set. One to 2 concentration values were inflated by 10% to 40% from their original values to produce the outliers. Tissue-to-plasma ratios computed using the independent time points approach varied between 0 and 50 across time points. The ratio obtained from AUC values acquired using the naive data averaging approach was not associated with any measure of uncertainty or variability. Calculating the ratio without regard to pairing yielded poorer estimates. The random sampling and pseudoprofile-based bootstrap approaches yielded tissue-to-plasma ratios with uncertainty and variability. However, the random sampling approach, because of the 2-phase nature of its algorithm, yielded more robust estimates and required fewer replications. Therefore, a 2-phase random sampling approach is proposed for the robust estimation of tissue-to-plasma ratio from extremely sparsely sampled data.
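
    As a rough illustration of resampling-based uncertainty for the tissue-to-plasma ratio (a simplified single-phase bootstrap sketch with invented data, not the paper's 2-phase algorithm), subject values can be resampled within each time point and the ratio of trapezoidal AUCs recomputed:

```python
import numpy as np

def auc_trapezoid(times, conc):
    """Area under the mean concentration-time curve by the trapezoidal rule."""
    return np.trapz(conc, times)

def tp_ratio_resampled(times, tissue, plasma, n_boot=2000, rng=None):
    """Resampling sketch for the tissue-to-plasma AUC ratio from sparse data:
    subject values (one per subject at each time point) are resampled with
    replacement within each time point, mean profiles are formed, and the
    ratio of AUCs is recomputed, yielding a distribution of ratio estimates."""
    rng = np.random.default_rng() if rng is None else rng
    ratios = np.empty(n_boot)
    for b in range(n_boot):
        t_mean = [np.mean(rng.choice(v, size=len(v), replace=True)) for v in tissue]
        p_mean = [np.mean(rng.choice(v, size=len(v), replace=True)) for v in plasma]
        ratios[b] = auc_trapezoid(times, t_mean) / auc_trapezoid(times, p_mean)
    return ratios

# Hypothetical sparse design: 4 time points, 5 subjects sampled per time point.
times = np.array([0.5, 1.0, 2.0, 4.0])
tissue = [np.array([4.1, 3.8, 4.5, 4.0, 3.9]), np.array([6.0, 5.5, 6.2, 5.8, 6.1]),
          np.array([5.0, 4.7, 5.3, 4.9, 5.1]), np.array([2.1, 2.4, 2.0, 2.2, 2.3])]
plasma = [np.array([2.0, 1.9, 2.2, 2.1, 2.0]), np.array([2.8, 2.7, 3.0, 2.9, 2.8]),
          np.array([2.2, 2.1, 2.3, 2.2, 2.1]), np.array([1.0, 1.1, 0.9, 1.0, 1.0])]
ratios = tp_ratio_resampled(times, tissue, plasma)
print(np.percentile(ratios, [2.5, 50, 97.5]))
```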

  19. Determination of Initial Conditions for the Safety Analysis by Random Sampling of Operating Parameters

    International Nuclear Information System (INIS)

    Jeong, Hae-Yong; Park, Moon-Ghu

    2015-01-01

    In most existing evaluation methodologies, which follow a conservative approach, the most conservative initial conditions are searched for each transient scenario through extensive assessment over wide operating windows or limiting conditions for operation (LCO) allowed by the operating guidelines. In this procedure, a user effect could be involved and considerable time and human resources are consumed. In the present study, we investigated a more effective statistical method for the selection of the most conservative initial condition by the use of random sampling of operating parameters affecting the initial conditions. A method for the determination of initial conditions based on random sampling of plant design parameters is proposed. This method is expected to be applied for the selection of the most conservative initial plant conditions in the safety analysis using a conservative evaluation methodology. In the method, it is suggested that the initial conditions of reactor coolant flow rate, pressurizer level, pressurizer pressure, and SG level are adjusted by controlling the pump rated flow and the setpoints of the PLCS, PPCS, and FWCS, respectively. The proposed technique is expected to contribute to eliminating the human factors introduced in the conventional safety analysis procedure and also to reducing the human resources invested in the safety evaluation of nuclear power plants

  20. Sample size in usability studies

    NARCIS (Netherlands)

    Schmettow, Martin

    2012-01-01

    Usability studies are important for developing usable, enjoyable products, identifying design flaws (usability problems) likely to compromise the user experience. Usability testing is recommended for improving interactive design, but discovery of usability problems depends on the number of users

  1. Generating Random Samples of a Given Size Using Social Security Numbers.

    Science.gov (United States)

    Erickson, Richard C.; Brauchle, Paul E.

    1984-01-01

    The purposes of this article are (1) to present a method by which social security numbers may be used to draw cluster samples of a predetermined size and (2) to describe procedures used to validate this method of drawing random samples. (JOW)

  2. Randomized comparison of vaginal self-sampling by standard vs. dry swabs for Human papillomavirus testing

    International Nuclear Information System (INIS)

    Eperon, Isabelle; Vassilakos, Pierre; Navarria, Isabelle; Menoud, Pierre-Alain; Gauthier, Aude; Pache, Jean-Claude; Boulvain, Michel; Untiet, Sarah; Petignat, Patrick

    2013-01-01

    To evaluate if human papillomavirus (HPV) self-sampling (Self-HPV) using a dry vaginal swab is a valid alternative for HPV testing. Women attending colposcopy clinic were recruited to collect two consecutive Self-HPV samples: a Self-HPV using a dry swab (S-DRY) and a Self-HPV using a standard wet transport medium (S-WET). These samples were analyzed for HPV using real time PCR (Roche Cobas). Participants were randomized to determine the order of the tests. Questionnaires assessing preferences and acceptability for both tests were conducted. Subsequently, women were invited for colposcopic examination; a physician collected a cervical sample (physician-sampling) with a broom-type device and placed it into a liquid-based cytology medium. Specimens were then processed for the production of cytology slides and a Hybrid Capture HPV DNA test (Qiagen) was performed from the residual liquid. Biopsies were performed if indicated. Unweighted kappa statistics (κ) and McNemar tests were used to measure the agreement among the sampling methods. A total of 120 women were randomized. Overall HPV prevalence was 68.7% (95% Confidence Interval (CI) 59.3–77.2) by S-WET, 54.4% (95% CI 44.8–63.9) by S-DRY and 53.8% (95% CI 43.8–63.7) by HC. Among paired samples (S-WET and S-DRY), the overall agreement was good (85.7%; 95% CI 77.8–91.6) and the κ was substantial (0.70; 95% CI 0.57-0.70). The proportion of positive type-specific HPV agreement was also good (77.3%; 95% CI 68.2-84.9). No differences in sensitivity for cervical intraepithelial neoplasia grade one (CIN1) or worse between the two Self-HPV tests were observed. Women reported the two Self-HPV tests as highly acceptable. Self-HPV using dry swab transfer does not appear to compromise specimen integrity. Further study in a large screening population is needed. ClinicalTrials.gov: http://clinicaltrials.gov/show/NCT01316120

  3. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality

    Science.gov (United States)

    Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Introduction Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in

  4. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Science.gov (United States)

    L'Engle, Kelly; Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit dialing of mobile

  5. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Directory of Open Access Journals (Sweden)

    Kelly L'Engle

    Full Text Available Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample.The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census.The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample.The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit

  6. Sample size of the reference sample in a case-augmented study.

    Science.gov (United States)

    Ghosh, Palash; Dewanji, Anup

    2017-05-01

    The case-augmented study, in which a case sample is augmented with a reference (random) sample from the source population with only covariates information known, is becoming popular in different areas of applied science such as pharmacovigilance, ecology, and econometrics. In general, the case sample is available from some source (for example, hospital database, case registry, etc.); however, the reference sample is required to be drawn from the corresponding source population. The required minimum size of the reference sample is an important issue in this regard. In this work, we address the minimum sample size calculation and discuss related issues. Copyright © 2017 John Wiley & Sons, Ltd.

  7. Sample size in qualitative interview studies

    DEFF Research Database (Denmark)

    Malterud, Kirsti; Siersma, Volkert Dirk; Guassora, Ann Dorrit Kristiane

    2016-01-01

    Sample sizes must be ascertained in qualitative studies like in quantitative studies but not by the same means. The prevailing concept for sample size in qualitative studies is “saturation.” Saturation is closely tied to a specific methodology, and the term is inconsistently applied. We propose… the concept “information power” to guide adequate sample size for qualitative studies. Information power indicates that the more information the sample holds, relevant for the actual study, the lower amount of participants is needed. We suggest that the size of a sample with sufficient information power… and during data collection of a qualitative study is discussed…

  8. At convenience and systematic random sampling: effects on the prognostic value of nuclear area assessments in breast cancer patients.

    Science.gov (United States)

    Jannink, I; Bennen, J N; Blaauw, J; van Diest, P J; Baak, J P

    1995-01-01

    This study compares the influence of two different nuclear sampling methods on the prognostic value of assessments of mean and standard deviation of nuclear area (MNA, SDNA) in 191 consecutive invasive breast cancer patients with long term follow up. The first sampling method used was 'at convenience' sampling (ACS); the second, systematic random sampling (SRS). Both sampling methods were tested with a sample size of 50 nuclei (ACS-50 and SRS-50). To determine whether, besides the sampling methods, sample size had impact on prognostic value as well, the SRS method was also tested using a sample size of 100 nuclei (SRS-100). SDNA values were systematically lower for ACS, obviously due to (unconsciously) not including small and large nuclei. Testing prognostic value of a series of cut off points, MNA and SDNA values assessed by the SRS method were prognostically significantly stronger than the values obtained by the ACS method. This was confirmed in Cox regression analysis. For the MNA, the Mantel-Cox p-values from SRS-50 and SRS-100 measurements were not significantly different. However, for the SDNA, SRS-100 yielded significantly lower p-values than SRS-50. In conclusion, compared with the 'at convenience' nuclear sampling method, systematic random sampling of nuclei is not only superior with respect to reproducibility of results, but also provides a better prognostic value in patients with invasive breast cancer.

  9. Random sampling of quantum states: a survey of methods and some issues regarding the Overparametrized Method

    International Nuclear Information System (INIS)

    Maziero, Jonas

    2015-01-01

    The numerical generation of random quantum states (RQS) is an important procedure for investigations in quantum information science. Here, we review some methods that may be used for performing that task. We start by presenting a simple procedure for generating random state vectors, for which the main tool is the random sampling of unbiased discrete probability distributions (DPD). Afterwards, the creation of random density matrices is addressed. In this context, we first present the standard method, which consists in using the spectral decomposition of a quantum state for getting RQS from random DPDs and random unitary matrices. In the sequence, the Bloch vector parametrization method is described. This approach, despite being useful in several instances, is not in general convenient for RQS generation. In the last part of the article, we regard the overparametrized method (OPM) and the related Ginibre and Bures techniques. The OPM can be used to create random positive semidefinite matrices with unit trace from randomly produced general complex matrices in a simple way that is friendly for numerical implementations. We consider a physically relevant issue related to the possible domains that may be used for the real and imaginary parts of the elements of such general complex matrices. Subsequently, a too fast concentration of measure in the quantum state space that appears in this parametrization is noticed. (author)
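
    Two of the constructions mentioned above are easy to sketch in a few lines. The following is a minimal illustration, assuming NumPy: a random pure state obtained by normalizing a complex Gaussian vector, and a random density matrix from the Ginibre construction (rho = G G† / Tr(G G†)). It is not the article's own code and omits the Bures and Bloch-vector variants.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_state_vector(d):
    """Haar-random pure state: normalize a complex Gaussian vector."""
    v = rng.normal(size=d) + 1j * rng.normal(size=d)
    return v / np.linalg.norm(v)

def random_density_matrix_ginibre(d):
    """Random density matrix from a Ginibre matrix G: rho = G G^dagger / Tr(G G^dagger)."""
    G = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = G @ G.conj().T
    return rho / np.trace(rho)

rho = random_density_matrix_ginibre(4)
# Unit trace and positive semidefiniteness hold by construction.
print(np.trace(rho).real, np.all(np.linalg.eigvalsh(rho) >= -1e-12))
```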

  10. Relative efficiency and sample size for cluster randomized trials with variable cluster sizes.

    Science.gov (United States)

    You, Zhiying; Williams, O Dale; Aban, Inmaculada; Kabagambe, Edmond Kato; Tiwari, Hemant K; Cutter, Gary

    2011-02-01

    The statistical power of cluster randomized trials depends on two sample size components, the number of clusters per group and the numbers of individuals within clusters (cluster size). Variable cluster sizes are common and this variation alone may have significant impact on study power. Previous approaches have taken this into account by either adjusting total sample size using a designated design effect or adjusting the number of clusters according to an assessment of the relative efficiency of unequal versus equal cluster sizes. This article defines a relative efficiency of unequal versus equal cluster sizes using noncentrality parameters, investigates properties of this measure, and proposes an approach for adjusting the required sample size accordingly. We focus on comparing two groups with normally distributed outcomes using t-test, and use the noncentrality parameter to define the relative efficiency of unequal versus equal cluster sizes and show that statistical power depends only on this parameter for a given number of clusters. We calculate the sample size required for an unequal cluster sizes trial to have the same power as one with equal cluster sizes. Relative efficiency based on the noncentrality parameter is straightforward to calculate and easy to interpret. It connects the required mean cluster size directly to the required sample size with equal cluster sizes. Consequently, our approach first determines the sample size requirements with equal cluster sizes for a pre-specified study power and then calculates the required mean cluster size while keeping the number of clusters unchanged. Our approach allows adjustment in mean cluster size alone or simultaneous adjustment in mean cluster size and number of clusters, and is a flexible alternative to and a useful complement to existing methods. Comparison indicated that we have defined a relative efficiency that is greater than the relative efficiency in the literature under some conditions. Our measure
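
    The article defines relative efficiency through noncentrality parameters; the sketch below instead uses the widely quoted design-effect approximation for variable cluster sizes, 1 + ((1 + CV^2) * mean cluster size - 1) * ICC, purely as a stand-in illustration of how cluster-size variation inflates the required sample size. The formula choice and all numbers are assumptions, not the paper's method.

```python
from scipy.stats import norm

def n_per_arm_equal_clusters(delta, sigma, m, icc, alpha=0.05, power=0.8):
    """Individuals per arm for a two-arm cluster RCT with equal cluster size m."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    n_individual = 2 * (z * sigma / delta) ** 2        # unadjusted per-arm size
    return n_individual * (1 + (m - 1) * icc)          # classic design effect

def extra_inflation_unequal_clusters(m_bar, cv, icc):
    """Approximate further inflation when cluster sizes vary (coefficient of variation cv)."""
    de_equal = 1 + (m_bar - 1) * icc
    de_unequal = 1 + ((1 + cv ** 2) * m_bar - 1) * icc
    return de_unequal / de_equal

n_equal = n_per_arm_equal_clusters(delta=0.3, sigma=1.0, m=20, icc=0.05)
print(round(n_equal), round(n_equal * extra_inflation_unequal_clusters(20, 0.6, 0.05)))
```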

  11. Estimation of Sensitive Proportion by Randomized Response Data in Successive Sampling

    Directory of Open Access Journals (Sweden)

    Bo Yu

    2015-01-01

    Full Text Available This paper considers the problem of estimation for binomial proportions of sensitive or stigmatizing attributes in the population of interest. Randomized response techniques are suggested for protecting the privacy of respondents and reducing the response bias while eliciting information on sensitive attributes. In many sensitive question surveys, the same population is often sampled repeatedly on each occasion. In this paper, we apply a successive sampling scheme to improve the estimation of the sensitive proportion on the current occasion.
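
    The abstract does not specify which randomized response design is used; as an illustration of the general idea, here is a sketch of Warner's classic scheme and its standard estimator. The design choice and the numbers are assumptions, not the paper's successive-sampling estimator.

```python
def warner_estimate(n_yes, n_total, p):
    """Warner randomized response: with probability p the respondent answers the
    sensitive question, with probability 1-p its complement (p must differ from 0.5)."""
    lam = n_yes / n_total                      # observed proportion of 'yes' answers
    pi_hat = (lam - (1 - p)) / (2 * p - 1)     # estimated sensitive proportion
    var = lam * (1 - lam) / (n_total * (2 * p - 1) ** 2)
    return pi_hat, var

# Illustrative counts: 320 'yes' answers out of 1,000 respondents, p = 0.7.
print(warner_estimate(n_yes=320, n_total=1000, p=0.7))
```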

  12. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Sample selection by random number... § 761.79(b)(3) § 761.308 Sample selection by random number generation on any two-dimensional square... area created in accordance with paragraph (a) of this section, select two random numbers: one each for...
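
    The regulatory text boils down to drawing one random number per axis to pick a sampling position on a square grid. A minimal sketch of that idea follows; the grid dimensions, seed, and units are placeholders, not values from § 761.308.

```python
import random

def random_grid_points(n_points, grid_cells_x, grid_cells_y, seed=42):
    """Pick sampling positions by drawing one random cell index per axis."""
    rng = random.Random(seed)
    return [(rng.randrange(grid_cells_x), rng.randrange(grid_cells_y))
            for _ in range(n_points)]

print(random_grid_points(3, grid_cells_x=10, grid_cells_y=10))
```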

  13. Random sampling of elementary flux modes in large-scale metabolic networks.

    Science.gov (United States)

    Machado, Daniel; Soons, Zita; Patil, Kiran Raosaheb; Ferreira, Eugénio C; Rocha, Isabel

    2012-09-15

    The description of a metabolic network in terms of elementary (flux) modes (EMs) provides an important framework for metabolic pathway analysis. However, their application to large networks has been hampered by the combinatorial explosion in the number of modes. In this work, we develop a method for generating random samples of EMs without computing the whole set. Our algorithm is an adaptation of the canonical basis approach, where we add an additional filtering step which, at each iteration, selects a random subset of the new combinations of modes. In order to obtain an unbiased sample, all candidates are assigned the same probability of getting selected. This approach avoids the exponential growth of the number of modes during computation, thus generating a random sample of the complete set of EMs within reasonable time. We generated samples of different sizes for a metabolic network of Escherichia coli, and observed that they preserve several properties of the full EM set. It is also shown that EM sampling can be used for rational strain design. A well distributed sample, that is representative of the complete set of EMs, should be suitable to most EM-based methods for analysis and optimization of metabolic networks. Source code for a cross-platform implementation in Python is freely available at http://code.google.com/p/emsampler. dmachado@deb.uminho.pt Supplementary data are available at Bioinformatics online.

  14. The Dirichlet-Multinomial model for multivariate randomized response data and small samples

    NARCIS (Netherlands)

    Avetisyan, Marianna; Fox, Gerardus J.A.

    2012-01-01

    In survey sampling the randomized response (RR) technique can be used to obtain truthful answers to sensitive questions. Although the individual answers are masked due to the RR technique, individual (sensitive) response rates can be estimated when observing multivariate response data. The

  15. The Dirichlet-Multinomial Model for Multivariate Randomized Response Data and Small Samples

    Science.gov (United States)

    Avetisyan, Marianna; Fox, Jean-Paul

    2012-01-01

    In survey sampling the randomized response (RR) technique can be used to obtain truthful answers to sensitive questions. Although the individual answers are masked due to the RR technique, individual (sensitive) response rates can be estimated when observing multivariate response data. The beta-binomial model for binary RR data will be generalized…

  16. A Nationwide Random Sampling Survey of Potential Complicated Grief in Japan

    Science.gov (United States)

    Mizuno, Yasunao; Kishimoto, Junji; Asukai, Nozomu

    2012-01-01

    To investigate the prevalence of significant loss, potential complicated grief (CG), and its contributing factors, we conducted a nationwide random sampling survey of Japanese adults aged 18 or older (N = 1,343) using a self-rating Japanese-language version of the Complicated Grief Brief Screen. Among them, 37.0% experienced their most significant…

  17. A simple sample size formula for analysis of covariance in cluster randomized trials.

    NARCIS (Netherlands)

    Teerenstra, S.; Eldridge, S.; Graff, M.J.; Hoop, E. de; Borm, G.F.

    2012-01-01

    For cluster randomized trials with a continuous outcome, the sample size is often calculated as if an analysis of the outcomes at the end of the treatment period (follow-up scores) would be performed. However, often a baseline measurement of the outcome is available or feasible to obtain. An

  18. Random selection of items. Selection of n1 samples among N items composing a stratum

    International Nuclear Information System (INIS)

    Jaech, J.L.; Lemaire, R.J.

    1987-02-01

    STR-224 provides generalized procedures to determine required sample sizes, for instance in the course of a Physical Inventory Verification at Bulk Handling Facilities. The present report describes procedures to generate random numbers and select groups of items to be verified in a given stratum through each of the measurement methods involved in the verification. (author). 3 refs
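
    In its simplest form, selecting n1 items out of the N items composing a stratum is a simple random sample without replacement; a minimal sketch is given below. The item labelling and seed are illustrative and this is not the STR-224 procedure itself.

```python
import random

def select_items(N, n1, seed=2023):
    """Simple random sample without replacement of n1 item labels from 1..N."""
    rng = random.Random(seed)
    return sorted(rng.sample(range(1, N + 1), n1))

# Example: verify 12 items out of a stratum of 250.
print(select_items(N=250, n1=12))
```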

  19. Reinforcing Sampling Distributions through a Randomization-Based Activity for Introducing ANOVA

    Science.gov (United States)

    Taylor, Laura; Doehler, Kirsten

    2015-01-01

    This paper examines the use of a randomization-based activity to introduce the ANOVA F-test to students. The two main goals of this activity are to successfully teach students to comprehend ANOVA F-tests and to increase student comprehension of sampling distributions. Four sections of students in an advanced introductory statistics course…
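
    The randomization logic behind such an activity can be sketched directly: compute the observed F statistic, repeatedly shuffle the group labels, and see how often the shuffled F is at least as large. A minimal illustration assuming NumPy and SciPy; the data are simulated, not the activity's class data.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)

def permutation_f_test(groups, n_perm=2000):
    """Randomization-based p-value for the one-way ANOVA F statistic."""
    f_obs = f_oneway(*groups).statistic
    pooled = np.concatenate(groups)
    sizes = [len(g) for g in groups]
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                      # permute group labels
        shuffled, start = [], 0
        for s in sizes:
            shuffled.append(pooled[start:start + s])
            start += s
        if f_oneway(*shuffled).statistic >= f_obs:
            count += 1
    return f_obs, count / n_perm

groups = [rng.normal(0.0, 1, 15), rng.normal(0.5, 1, 15), rng.normal(1.0, 1, 15)]
print(permutation_f_test(groups))
```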

  20. An efficient method of randomly sampling the coherent angular scatter distribution

    International Nuclear Information System (INIS)

    Williamson, J.F.; Morin, R.L.

    1983-01-01

    Monte Carlo simulations of photon transport phenomena require random selection of an interaction process at each collision site along the photon track. Possible choices are usually limited to photoelectric absorption and incoherent scatter as approximated by the Klein-Nishina distribution. A technique is described for sampling the coherent angular scatter distribution, for the benefit of workers in medical physics. (U.K.)
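
    The paper's specific efficient technique is not reproduced here; as a generic illustration of randomly sampling an angular distribution in a Monte Carlo code, the sketch below uses plain rejection sampling against a Thomson-like shape proportional to (1 + cos^2(theta)) * sin(theta), deliberately ignoring the atomic form factor of real coherent scatter.

```python
import math
import random

rng = random.Random(9)

def pdf_unnorm(theta):
    """Illustrative angular density ~ (1 + cos^2(theta)) * sin(theta) on [0, pi]."""
    return (1 + math.cos(theta) ** 2) * math.sin(theta)

def sample_theta(n):
    """Plain rejection sampling with a constant envelope (bound 2.0 exceeds the max ~1.09)."""
    envelope = 2.0
    out = []
    while len(out) < n:
        theta = rng.uniform(0.0, math.pi)
        if rng.uniform(0.0, envelope) < pdf_unnorm(theta):
            out.append(theta)
    return out

print([round(a, 3) for a in sample_theta(5)])
```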

  1. Rationale, study design and sample characteristics of a randomized controlled trial of directly administered antiretroviral therapy for HIV-infected prisoners transitioning to the community - a potential conduit to improved HIV treatment outcomes.

    Science.gov (United States)

    Saber-Tehrani, Ali Shabahang; Springer, Sandra A; Qiu, Jingjun; Herme, Maua; Wickersham, Jeffrey; Altice, Frederick L

    2012-03-01

    HIV-infected prisoners experience poor HIV treatment outcomes post-release. Directly administered antiretroviral therapy (DAART) is a CDC-designated, evidence-based adherence intervention for drug users, yet untested among released prisoners. Sentenced HIV-infected prisoners on antiretroviral therapy (ART) and returning to New Haven or Hartford, Connecticut were recruited and randomized 2:1 to a prospective controlled trial (RCT) of 6 months of DAART versus self-administered therapy (SAT); all subjects received case management services. Subjects meeting DSM-IV criteria for opioid dependence were offered immediate medication-assisted treatment. Trained outreach workers provided DAART once-daily, seven days per week, including behavioral skills training during the last intervention month. Both study groups were assessed for 6 months after the intervention period. Assessments occurred within 90 days pre-release (baseline), day of release, and then monthly for 12 months. Viral load (VL) and CD4 testing was conducted baseline and quarterly; genotypic resistance testing was conducted at baseline, 6 and 12 months. The primary outcome was pre-defined as viral suppression (VLHIV treatment outcomes after release from prison, a period associated with adverse HIV and other medical consequences. Copyright © 2011 Elsevier Inc. All rights reserved.

  2. The Bootstrap, the Jackknife, and the Randomization Test: A Sampling Taxonomy.

    Science.gov (United States)

    Rodgers, J L

    1999-10-01

    A simple sampling taxonomy is defined that shows the differences between and relationships among the bootstrap, the jackknife, and the randomization test. Each method has as its goal the creation of an empirical sampling distribution that can be used to test statistical hypotheses, estimate standard errors, and/or create confidence intervals. Distinctions between the methods can be made based on the sampling approach (with replacement versus without replacement) and the sample size (replacing the whole original sample versus replacing a subset of the original sample). The taxonomy is useful for teaching the goals and purposes of resampling schemes. An extension of the taxonomy implies other possible resampling approaches that have not previously been considered. Univariate and multivariate examples are presented.
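
    The taxonomy contrasts resampling with or without replacement and whole-sample versus subset replacement. A compact illustration of the three schemes on simulated data follows; it is a sketch, not the article's examples.

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.normal(loc=5.0, scale=2.0, size=30)

# Bootstrap: resample n out of n WITH replacement.
boot_means = [rng.choice(x, size=x.size, replace=True).mean() for _ in range(2000)]
boot_se = np.std(boot_means)

# Jackknife: resample n-1 out of n WITHOUT replacement (leave one out).
jack_means = [np.delete(x, i).mean() for i in range(x.size)]
jack_se = np.sqrt(x.size - 1) * np.std(jack_means)

# Randomization test: permute the whole pooled sample (without replacement)
# to test an observed difference between two groups.
y = rng.normal(loc=6.0, scale=2.0, size=30)
obs = y.mean() - x.mean()
pooled = np.concatenate([x, y])
perm_diffs = [rng.permutation(pooled)[30:].mean() - rng.permutation(pooled)[:30].mean()
              for _ in range(2000)]
p_value = np.mean(np.abs(perm_diffs) >= abs(obs))

print(round(boot_se, 3), round(jack_se, 3), p_value)
```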

  3. Evaluation of a Class of Simple and Effective Uncertainty Methods for Sparse Samples of Random Variables and Functions

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bonney, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schroeder, Benjamin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Weirs, V. Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-11-01

    When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: central 95% of response; and 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depends on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large data base and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.

  4. Additive non-uniform random sampling in superimposed fiber Bragg grating strain gauge

    International Nuclear Information System (INIS)

    Ma, Y C; Liu, H Y; Yan, S B; Li, J M; Tang, J; Yang, Y H; Yang, M W

    2013-01-01

    This paper demonstrates an additive non-uniform random sampling and interrogation method for dynamic and/or static strain gauge using a reflection spectrum from two superimposed fiber Bragg gratings (FBGs). The superimposed FBGs are designed to generate non-equidistant space of a sensing pulse train in the time domain during dynamic strain gauge. By combining centroid finding with smooth filtering methods, both the interrogation speed and accuracy are improved. A 1.9 kHz dynamic strain is measured by generating an additive non-uniform randomly distributed 2 kHz optical sensing pulse train from a mean 500 Hz triangular periodically changing scanning frequency. (paper)

  5. Additive non-uniform random sampling in superimposed fiber Bragg grating strain gauge

    Science.gov (United States)

    Ma, Y. C.; Liu, H. Y.; Yan, S. B.; Yang, Y. H.; Yang, M. W.; Li, J. M.; Tang, J.

    2013-05-01

    This paper demonstrates an additive non-uniform random sampling and interrogation method for dynamic and/or static strain gauge using a reflection spectrum from two superimposed fiber Bragg gratings (FBGs). The superimposed FBGs are designed to generate non-equidistant space of a sensing pulse train in the time domain during dynamic strain gauge. By combining centroid finding with smooth filtering methods, both the interrogation speed and accuracy are improved. A 1.9 kHz dynamic strain is measured by generating an additive non-uniform randomly distributed 2 kHz optical sensing pulse train from a mean 500 Hz triangular periodically changing scanning frequency.
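
    The core idea of additive random sampling is that each sampling instant equals the previous one plus an independent random interval, which breaks the aliasing of a uniform grid. A small sketch of generating such sampling instants and evaluating a 1.9 kHz tone on them; the uniform interval law and the numbers are assumptions, and the papers realize the non-uniform pulse train optically with superimposed FBGs rather than in software.

```python
import numpy as np

rng = np.random.default_rng(3)

def additive_random_sample_times(mean_rate_hz, duration_s):
    """Additive random sampling: each instant is the previous one plus an
    i.i.d. positive random interval (here uniform around 1/mean_rate)."""
    mean_dt = 1.0 / mean_rate_hz
    times, t = [], 0.0
    while t < duration_s:
        t += rng.uniform(0.5 * mean_dt, 1.5 * mean_dt)   # assumed interval law
        times.append(t)
    return np.array(times)

t = additive_random_sample_times(mean_rate_hz=2000, duration_s=0.1)
x = np.sin(2 * np.pi * 1900 * t)       # a 1.9 kHz tone observed on the random grid
print(len(t), np.round(t[:5], 6))
```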

  6. An R package for spatial coverage sampling and random sampling from compact geographical strata by k-means

    NARCIS (Netherlands)

    Walvoort, D.J.J.; Brus, D.J.; Gruijter, de J.J.

    2010-01-01

    Both for mapping and for estimating spatial means of an environmental variable, the accuracy of the result will usually be increased by dispersing the sample locations so that they cover the study area as uniformly as possible. We developed a new R package for designing spatial coverage samples for
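
    The record describes an R package; a rough Python analogue of the same idea, assuming scikit-learn, is sketched below. K-means applied to a fine grid of candidate locations yields compact geographical strata: the cluster centres form a spatial coverage sample, and drawing one random candidate per cluster gives random sampling from compact strata.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)

# Candidate locations: a fine grid discretizing the study area (unit square here).
xx, yy = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
candidates = np.column_stack([xx.ravel(), yy.ravel()])

k = 25
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(candidates)

coverage_sample = km.cluster_centers_          # spatial coverage sample
labels = km.labels_
stratified_sample = np.array([candidates[rng.choice(np.where(labels == c)[0])]
                              for c in range(k)])   # one random point per stratum

print(coverage_sample[:3])
print(stratified_sample[:3])
```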

  7. Mars Sample Return Architecture Assessment Study

    Science.gov (United States)

    Centuori, S.; Hermosín, P.; Martín, J.; De Zaiacomo, G.; Colin, S.; Godfrey, A.; Myles, J.; Johnson, H.; Sachdev, T.; Ahmed, R.

    2018-04-01

    Current paper presents the results of ESA funded activity "Mars Sample Return Architecture Assessment Study" carried-out by DEIMOS Space, Lockheed Martin UK Ampthill, and MDA Corporation, where more than 500 mission design options have been studied.

  8. Random-Number Generator Validity in Simulation Studies: An Investigation of Normality.

    Science.gov (United States)

    Bang, Jung W.; Schumacker, Randall E.; Schlieve, Paul L.

    1998-01-01

    The normality of number distributions generated by various random-number generators was studied, focusing on when the random-number generator reached a normal distribution and at what sample size. Findings suggest the steps that should be followed when using a random-number generator in a Monte Carlo simulation. (SLD)

  9. The effect of a mindfulness-based intervention in cognitive functions and psychological well-being applied as an early intervention in schizophrenia and high-risk mental state in a Chilean sample: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Langer, Álvaro I; Schmidt, Carlos; Mayol, Rocío; Díaz, Marcela; Lecaros, Javiera; Krogh, Edwin; Pardow, Aída; Vergara, Carolina; Vergara, Guillermo; Pérez-Herrera, Bernardita; Villar, María José; Maturana, Alejandro; Gaspar, Pablo A

    2017-05-25

    According to the projections of the World Health Organization, 15% of all disabilities will be associated with mental illnesses by 2020. One of the mental disorders with the largest social impacts due to high personal and family costs is psychosis. Among the most effective psychological approaches to treat schizophrenia and other psychotic disorders at the world level is cognitive behavioral therapy. Recently, cognitive behavioral therapy has introduced several tools and strategies that promote psychological processes based on acceptance and mindfulness. A large number of studies support the effectiveness of mindfulness in dealing with various mental health problems, including psychosis. This study is aimed at determining the efficiency of a mindfulness-based program in increasing cognitive function and psychological well-being in patients with a first episode of schizophrenia and a high risk mental state (those at risk of developing an episode of psychosis). This is an experimentally designed, multi-center randomized controlled trial, with a 3-month follow-up period. The study participants will be 48 patients diagnosed with schizophrenia (first episode) and 48 with a high-risk mental state, from Santiago, Chile, aged between 15 and 35 years. Participants will be submitted to a mindfulness-based intervention (MBI), which will involve taking part in eight mindfulness workshops adapted for people with psychosis. Workshops will last approximately 1.5 hours and take place once a week, over 8 weeks. The primary outcome will be the cognitive function through Measurement and Treatment Research to Improve Cognition in Schizophrenia (MATRICS) and the secondary outcome will be psychological well-being measured by self-reporting questionnaires. The outcomes of this trial will add empirical evidence to the benefits and feasibility of MBIs for the psychotherapeutic treatment of patients with schizophrenia and high-risk mental states in reducing cognitive impairment in

  10. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    Science.gov (United States)

    Yashchuk, Valeriy V.; Conley, Raymond; Anderson, Erik H.; Barber, Samuel K.; Bouet, Nathalie; McKinney, Wayne R.; Takacs, Peter Z.; Voronov, Dmitriy L.

    2011-09-01

    Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested [1,2] and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.

  11. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    International Nuclear Information System (INIS)

    Yashchuk, Valeriy V.; Conley, Raymond; Anderson, Erik H.; Barber, Samuel K.; Bouet, Nathalie; McKinney, Wayne R.; Takacs, Peter Z.; Voronov, Dmitriy L.

    2011-01-01

    Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.

  12. Accounting for Sampling Error in Genetic Eigenvalues Using Random Matrix Theory.

    Science.gov (United States)

    Sztepanacz, Jacqueline L; Blows, Mark W

    2017-07-01

    The distribution of genetic variance in multivariate phenotypes is characterized by the empirical spectral distribution of the eigenvalues of the genetic covariance matrix. Empirical estimates of genetic eigenvalues from random effects linear models are known to be overdispersed by sampling error, where large eigenvalues are biased upward, and small eigenvalues are biased downward. The overdispersion of the leading eigenvalues of sample covariance matrices have been demonstrated to conform to the Tracy-Widom (TW) distribution. Here we show that genetic eigenvalues estimated using restricted maximum likelihood (REML) in a multivariate random effects model with an unconstrained genetic covariance structure will also conform to the TW distribution after empirical scaling and centering. However, where estimation procedures using either REML or MCMC impose boundary constraints, the resulting genetic eigenvalues tend not be TW distributed. We show how using confidence intervals from sampling distributions of genetic eigenvalues without reference to the TW distribution is insufficient protection against mistaking sampling error as genetic variance, particularly when eigenvalues are small. By scaling such sampling distributions to the appropriate TW distribution, the critical value of the TW statistic can be used to determine if the magnitude of a genetic eigenvalue exceeds the sampling error for each eigenvalue in the spectral distribution of a given genetic covariance matrix. Copyright © 2017 by the Genetics Society of America.
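
    The REML and Tracy-Widom machinery of the paper is not reproduced here, but the overdispersion phenomenon itself is easy to demonstrate: even when all true eigenvalues are equal, the leading eigenvalue of a sample covariance matrix is biased upward and the smallest downward. A minimal NumPy illustration with assumed dimensions:

```python
import numpy as np

rng = np.random.default_rng(11)

p, n, reps = 10, 50, 200          # traits, individuals, replicate samples
true_cov = np.eye(p)              # all true eigenvalues equal 1

largest, smallest = [], []
for _ in range(reps):
    X = rng.multivariate_normal(np.zeros(p), true_cov, size=n)
    eig = np.linalg.eigvalsh(np.cov(X, rowvar=False))   # ascending order
    largest.append(eig[-1])
    smallest.append(eig[0])

# Sampling error alone pushes the leading eigenvalue up and the last one down.
print(np.mean(largest), np.mean(smallest))
```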

  13. A simple and efficient alternative to implementing systematic random sampling in stereological designs without a motorized microscope stage.

    Science.gov (United States)

    Melvin, Neal R; Poda, Daniel; Sutherland, Robert J

    2007-10-01

    When properly applied, stereology is a very robust and efficient method to quantify a variety of parameters from biological material. A common sampling strategy in stereology is systematic random sampling, which involves choosing a random sampling start point outside the structure of interest, and sampling relevant objects at sites that are placed at pre-determined, equidistant intervals. This has proven to be a very efficient sampling strategy, and is used widely in stereological designs. At the microscopic level, this is most often achieved through the use of a motorized stage that facilitates the systematic random stepping across the structure of interest. Here, we report a simple, precise and cost-effective software-based alternative to accomplishing systematic random sampling under the microscope. We believe that this approach will facilitate the use of stereological designs that employ systematic random sampling in laboratories that lack the resources to acquire costly, fully automated systems.
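
    The arithmetic behind systematic random sampling of microscope fields is simply a random offset within one step followed by equidistant steps across the bounding box of the structure. The sketch below illustrates this; the coordinates and step sizes are placeholders, and it is not the authors' software.

```python
import random

def systematic_random_fields(x_min, x_max, y_min, y_max, step_x, step_y, seed=None):
    """Sampling-site coordinates: random start within one step, then equidistant steps."""
    rng = random.Random(seed)
    x0 = x_min + rng.uniform(0, step_x)
    y0 = y_min + rng.uniform(0, step_y)
    sites = []
    y = y0
    while y <= y_max:
        x = x0
        while x <= x_max:
            sites.append((round(x, 1), round(y, 1)))
            x += step_x
        y += step_y
    return sites

# Example: a 1000 x 800 (arbitrary units) region stepped at 250 x 200.
print(systematic_random_fields(0, 1000, 0, 800, step_x=250, step_y=200, seed=5))
```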

  14. Sampling pig farms at the abattoir in a cross-sectional study - Evaluation of a sampling method.

    Science.gov (United States)

    Birkegård, Anna Camilla; Halasa, Tariq; Toft, Nils

    2017-09-15

    A cross-sectional study design is relatively inexpensive, fast and easy to conduct when compared to other study designs. Careful planning is essential to obtaining a representative sample of the population, and the recommended approach is to use simple random sampling from an exhaustive list of units in the target population. This approach is rarely feasible in practice, and other sampling procedures must often be adopted. For example, when slaughter pigs are the target population, sampling the pigs on the slaughter line may be an alternative to on-site sampling at a list of farms. However, it is difficult to sample a large number of farms from an exact predefined list, due to the logistics and workflow of an abattoir. Therefore, it is necessary to have a systematic sampling procedure and to evaluate the obtained sample with respect to the study objective. We propose a method for 1) planning, 2) conducting, and 3) evaluating the representativeness and reproducibility of a cross-sectional study when simple random sampling is not possible. We used an example of a cross-sectional study with the aim of quantifying the association of antimicrobial resistance and antimicrobial consumption in Danish slaughter pigs. It was not possible to visit farms within the designated timeframe. Therefore, it was decided to use convenience sampling at the abattoir. Our approach was carried out in three steps: 1) planning: using data from meat inspection to plan at which abattoirs and how many farms to sample; 2) conducting: sampling was carried out at five abattoirs; 3) evaluation: representativeness was evaluated by comparing sampled and non-sampled farms, and the reproducibility of the study was assessed through simulated sampling based on meat inspection data from the period where the actual data collection was carried out. In the cross-sectional study samples were taken from 681 Danish pig farms, during five weeks from February to March 2015. The evaluation showed that the sampling

  15. Study of phosphorus determination in biological samples

    International Nuclear Information System (INIS)

    Oliveira, Rosangela Magda de.

    1994-01-01

    In this paper, phosphorus determination by neutron activation analysis in milk and bone samples was studied employing both instrumental and radiochemical separation methods. The analysis with radiochemical separation consisted of the simultaneous irradiation of the samples and standards during 30 minutes, dissolution of the samples, addition of a hold-back carrier, precipitation of phosphorus with ammonium phosphomolybdate (A.M.P.), and counting of phosphorus-32 with a Geiger-Mueller detector. The instrumental analysis consisted of the simultaneous irradiation of the samples and standards during 30 minutes, transfer of the samples into a counting planchet and measurement of the beta radiation emitted by phosphorus-32, after a suitable decay period. After the phosphorus analysis methods were established they were applied to both commercial milk and animal bone samples, and the data obtained by the instrumental and radiochemical separation methods for each sample were compared. In this work, it became possible to obtain analysis methods for phosphorus that can be applied independently of the sample quantity available, the phosphorus content in the samples, or the interferences that may be present in them. (author). 51 refs., 7 figs., 4 tabs

  16. Accelerating Markov chain Monte Carlo simulation by differential evolution with self-adaptive randomized subspace sampling

    Energy Technology Data Exchange (ETDEWEB)

    Vrugt, Jasper A [Los Alamos National Laboratory; Hyman, James M [Los Alamos National Laboratory; Robinson, Bruce A [Los Alamos National Laboratory; Higdon, Dave [Los Alamos National Laboratory; Ter Braak, Cajo J F [NETHERLANDS; Diks, Cees G H [UNIV OF AMSTERDAM

    2008-01-01

    Markov chain Monte Carlo (MCMC) methods have found widespread use in many fields of study to estimate the average properties of complex systems, and for posterior inference in a Bayesian framework. Existing theory and experiments prove convergence of well constructed MCMC schemes to the appropriate limiting distribution under a variety of different conditions. In practice, however this convergence is often observed to be disturbingly slow. This is frequently caused by an inappropriate selection of the proposal distribution used to generate trial moves in the Markov Chain. Here we show that significant improvements to the efficiency of MCMC simulation can be made by using a self-adaptive Differential Evolution learning strategy within a population-based evolutionary framework. This scheme, entitled DiffeRential Evolution Adaptive Metropolis or DREAM, runs multiple different chains simultaneously for global exploration, and automatically tunes the scale and orientation of the proposal distribution in randomized subspaces during the search. Ergodicity of the algorithm is proved, and various examples involving nonlinearity, high-dimensionality, and multimodality show that DREAM is generally superior to other adaptive MCMC sampling approaches. The DREAM scheme significantly enhances the applicability of MCMC simulation to complex, multi-modal search problems.
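
    DREAM itself adds self-adaptive randomized subspace sampling and further safeguards; as a minimal illustration of the underlying population-based idea only, the sketch below implements the simpler Differential Evolution Markov Chain proposal (each chain jumps along the difference of two other randomly chosen chains) on a toy Gaussian target. It is an assumption-laden sketch, not the DREAM algorithm.

```python
import numpy as np

rng = np.random.default_rng(8)

def log_target(x):
    """Toy target: standard bivariate normal log-density (up to a constant)."""
    return -0.5 * np.sum(x ** 2)

def de_mc(n_chains=8, dim=2, n_iter=3000):
    gamma = 2.38 / np.sqrt(2 * dim)                # standard DE-MC jump scale
    chains = rng.normal(size=(n_chains, dim))      # initial population
    samples = []
    for _ in range(n_iter):
        for i in range(n_chains):
            r1, r2 = rng.choice([j for j in range(n_chains) if j != i], 2, replace=False)
            prop = chains[i] + gamma * (chains[r1] - chains[r2]) + rng.normal(0, 1e-4, dim)
            if np.log(rng.uniform()) < log_target(prop) - log_target(chains[i]):
                chains[i] = prop
        samples.append(chains.copy())
    return np.concatenate(samples[n_iter // 2:])   # discard the first half as burn-in

draws = de_mc()
print(draws.mean(axis=0), draws.std(axis=0))       # should approach mean 0, std 1
```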

  17. Neurofeedback Against Binge Eating: A Randomized Controlled Trial in a Female Subclinical Threshold Sample.

    Science.gov (United States)

    Schmidt, Jennifer; Martin, Alexandra

    2016-09-01

    Brain-directed treatment techniques, such as neurofeedback, have recently been proposed as adjuncts in the treatment of eating disorders to improve therapeutic outcomes. In line with this recommendation, a cue exposure EEG-neurofeedback protocol was developed. The present study aimed at the evaluation of the specific efficacy of neurofeedback to reduce subjective binge eating in a female subthreshold sample. A total of 75 subjects were randomized to EEG-neurofeedback, mental imagery with a comparable treatment set-up or a waitlist group. At post-treatment, only EEG-neurofeedback led to a reduced frequency of binge eating (p = .015, g = 0.65). The effects remained stable to a 3-month follow-up. EEG-neurofeedback further showed particular beneficial effects on perceived stress and dietary self-efficacy. Differences in outcomes did not arise from divergent treatment expectations. Because EEG-neurofeedback showed a specific efficacy, it may be a promising brain-directed approach that should be tested as a treatment adjunct in clinical groups with binge eating. Copyright © 2016 John Wiley & Sons, Ltd and Eating Disorders Association.

  18. Sample Size Calculation: Inaccurate A Priori Assumptions for Nuisance Parameters Can Greatly Affect the Power of a Randomized Controlled Trial.

    Directory of Open Access Journals (Sweden)

    Elsa Tavernier

    Full Text Available We aimed to examine the extent to which inaccurate assumptions for nuisance parameters used to calculate sample size can affect the power of a randomized controlled trial (RCT). In a simulation study, we separately considered an RCT with continuous, dichotomous or time-to-event outcomes, with associated nuisance parameters of standard deviation, success rate in the control group and survival rate in the control group at some time point, respectively. For each type of outcome, we calculated a required sample size N for a hypothesized treatment effect, an assumed nuisance parameter and a nominal power of 80%. We then assumed a nuisance parameter associated with a relative error at the design stage. For each type of outcome, we randomly drew 10,000 relative errors of the associated nuisance parameter (from empirical distributions derived from a previously published review). Then, retro-fitting the sample size formula, we derived, for the pre-calculated sample size N, the real power of the RCT, taking into account the relative error for the nuisance parameter. In total, 23%, 0% and 18% of RCTs with continuous, binary and time-to-event outcomes, respectively, were underpowered (i.e., the real power was 90%. Even with proper calculation of sample size, a substantial number of trials are underpowered or overpowered because of imprecise knowledge of nuisance parameters. Such findings raise questions about how sample size for RCTs should be determined.
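
    The sensitivity described for the continuous-outcome case can be illustrated with a normal-approximation sketch: plan the sample size under an assumed standard deviation, then recompute the power under the true one. The formulas are textbook approximations and the numbers are assumptions, not the authors' simulation code.

```python
from scipy.stats import norm

def n_per_group(delta, sd_assumed, alpha=0.05, power=0.80):
    """Two-sample comparison of means (normal approximation): sample size per group."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return 2 * (z * sd_assumed / delta) ** 2

def real_power(n, delta, sd_true, alpha=0.05):
    """Power actually achieved when the true SD differs from the assumed one."""
    z_alpha = norm.ppf(1 - alpha / 2)
    ncp = delta * (n / 2) ** 0.5 / sd_true
    return norm.cdf(ncp - z_alpha)

n = n_per_group(delta=0.5, sd_assumed=1.0)             # planned assuming SD = 1.0
print(round(n), round(real_power(n, delta=0.5, sd_true=1.3), 2))  # true SD 30% larger
```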

  19. Random Sampling with Interspike-Intervals of the Exponential Integrate and Fire Neuron: A Computational Interpretation of UP-States.

    Directory of Open Access Journals (Sweden)

    Andreas Steimer

    Full Text Available Oscillations between high and low values of the membrane potential (UP and DOWN states respectively are an ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, a surprisingly small number of quantitative studies have been conducted only that deal with this phenomenon's implications for computation. Here we present a novel theory that explains on a detailed mathematical level the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs of the exponential integrate and fire (EIF model neuron, such that each spike is considered a sample, whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, that kicks in when balancing a noisy membrane potential around values close to the firing threshold, leads to a particularly simple, approximative relationship between the neuron's ISI distribution and input current. Approximation quality depends on the frequency spectrum of the current and is improved upon increasing the voltage baseline towards threshold. Thus, the conceptually simpler leaky integrate and fire neuron that is missing such an additional current boost performs consistently worse than the EIF and does not improve when voltage baseline is increased. For the EIF in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP-states. Our theoretical results are confirmed by accompanying simulations, which were conducted for input currents of varying spectral composition. Moreover, we provide analytical estimations of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be considered by any algorithmic procedure that is based on random sampling, such as Markov Chain Monte Carlo or message-passing methods. Finally, we explain how spike-based random sampling relates to existing

  20. Random Sampling with Interspike-Intervals of the Exponential Integrate and Fire Neuron: A Computational Interpretation of UP-States.

    Science.gov (United States)

    Steimer, Andreas; Schindler, Kaspar

    2015-01-01

    Oscillations between high and low values of the membrane potential (UP and DOWN states respectively) are an ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, a surprisingly small number of quantitative studies have been conducted only that deal with this phenomenon's implications for computation. Here we present a novel theory that explains on a detailed mathematical level the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate and fire (EIF) model neuron, such that each spike is considered a sample, whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, that kicks in when balancing a noisy membrane potential around values close to the firing threshold, leads to a particularly simple, approximative relationship between the neuron's ISI distribution and input current. Approximation quality depends on the frequency spectrum of the current and is improved upon increasing the voltage baseline towards threshold. Thus, the conceptually simpler leaky integrate and fire neuron that is missing such an additional current boost performs consistently worse than the EIF and does not improve when voltage baseline is increased. For the EIF in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP-states. Our theoretical results are confirmed by accompanying simulations, which were conducted for input currents of varying spectral composition. Moreover, we provide analytical estimations of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be considered by any algorithmic procedure that is based on random sampling, such as Markov Chain Monte Carlo or message-passing methods. Finally, we explain how spike-based random sampling relates to existing computational

  1. Distributed fiber sparse-wideband vibration sensing by sub-Nyquist additive random sampling

    Science.gov (United States)

    Zhang, Jingdong; Zheng, Hua; Zhu, Tao; Yin, Guolu; Liu, Min; Bai, Yongzhong; Qu, Dingrong; Qiu, Feng; Huang, Xianbing

    2018-05-01

    The round trip time of the light pulse limits the maximum detectable vibration frequency response range of phase-sensitive optical time domain reflectometry (φ-OTDR). Unlike the uniform laser pulse interval in conventional φ-OTDR, we randomly modulate the pulse interval, so that an equivalent sub-Nyquist additive random sampling (sNARS) is realized for every sensing point of the long interrogation fiber. For a φ-OTDR system with 10 km sensing length, the sNARS method is optimized by theoretical analysis and Monte Carlo simulation, and the experimental results verify that a wide-band sparse signal can be identified and reconstructed. Such a method can broaden the vibration frequency response range of φ-OTDR, which is of great significance in sparse-wideband-frequency vibration signal detection, such as rail track monitoring and metal defect detection.

  2. Random On-Board Pixel Sampling (ROPS) X-Ray Camera

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Zhehui [Los Alamos; Iaroshenko, O. [Los Alamos; Li, S. [Los Alamos; Liu, T. [Fermilab; Parab, N. [Argonne (main); Chen, W. W. [Purdue U.; Chu, P. [Los Alamos; Kenyon, G. [Los Alamos; Lipton, R. [Fermilab; Sun, K.-X. [Nevada U., Las Vegas

    2017-09-25

    Recent advances in compressed sensing theory and algorithms offer new possibilities for high-speed X-ray camera design. In many CMOS cameras, each pixel has an independent on-board circuit that includes an amplifier, noise rejection, signal shaper, an analog-to-digital converter (ADC), and optional in-pixel storage. When X-ray images are sparse, i.e., when one of the following cases is true: (a.) The number of pixels with true X-ray hits is much smaller than the total number of pixels; (b.) The X-ray information is redundant; or (c.) Some prior knowledge about the X-ray images exists, sparse sampling may be allowed. Here we first illustrate the feasibility of random on-board pixel sampling (ROPS) using an existing set of X-ray images, followed by a discussion about signal to noise as a function of pixel size. Next, we describe a possible circuit architecture to achieve random pixel access and in-pixel storage. The combination of a multilayer architecture, sparse on-chip sampling, and computational image techniques, is expected to facilitate the development and applications of high-speed X-ray camera technology.

  3. Sample size calculation in metabolic phenotyping studies.

    Science.gov (United States)

    Billoir, Elise; Navratil, Vincent; Blaise, Benjamin J

    2015-09-01

    The number of samples needed to identify significant effects is a key question in biomedical studies, with consequences on experimental designs, costs and potential discoveries. In metabolic phenotyping studies, sample size determination remains a complex step. This is due particularly to the multiple hypothesis-testing framework and the top-down hypothesis-free approach, with no a priori known metabolic target. Until now, there was no standard procedure available to address this purpose. In this review, we discuss sample size estimation procedures for metabolic phenotyping studies. We release an automated implementation of the Data-driven Sample size Determination (DSD) algorithm for MATLAB and GNU Octave. Original research concerning DSD was published elsewhere. DSD allows the determination of an optimized sample size in metabolic phenotyping studies. The procedure uses analytical data only from a small pilot cohort to generate an expanded data set. The statistical recoupling of variables procedure is used to identify metabolic variables, and their intensity distributions are estimated by Kernel smoothing or log-normal density fitting. Statistically significant metabolic variations are evaluated using the Benjamini-Yekutieli correction and processed for data sets of various sizes. Optimal sample size determination is achieved in a context of biomarker discovery (at least one statistically significant variation) or metabolic exploration (a maximum of statistically significant variations). DSD toolbox is encoded in MATLAB R2008A (Mathworks, Natick, MA) for Kernel and log-normal estimates, and in GNU Octave for log-normal estimates (Kernel density estimates are not robust enough in GNU octave). It is available at http://www.prabi.fr/redmine/projects/dsd/repository, with a tutorial at http://www.prabi.fr/redmine/projects/dsd/wiki. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  4. Benefits of expressive writing in reducing test anxiety: A randomized controlled trial in Chinese samples.

    Science.gov (United States)

    Shen, Lujun; Yang, Lei; Zhang, Jing; Zhang, Meng

    2018-01-01

    To explore the effect of expressive writing of positive emotions on test anxiety among senior-high-school students. The Test Anxiety Scale (TAS) was used to assess the anxiety level of 200 senior-high-school students. Seventy-five students with high anxiety were recruited and divided randomly into experimental and control groups. Each day for 30 days, the experimental group engaged in 20 minutes of expressive writing of positive emotions, while the control group was asked to merely write down their daily events. A second test was given after the month-long experiment to analyze whether there had been a reduction in anxiety among the sample. Quantitative data was obtained from TAS scores. The NVivo10.0 software program was used to examine the frequency of particular word categories used in participants' writing manuscripts. Senior-high-school students indicated moderate to high test anxiety. There was a significant difference in post-test results (P 0.05). Students' writing manuscripts were mainly encoded on five code categories: cause, anxiety manifestation, positive emotion, insight and evaluation. There was a negative relation between positive emotion, insight codes and test anxiety. There were significant differences in the positive emotion, anxiety manifestation, and insight code categories between the first 10 days' manuscripts and the last 10 days' ones. Long-term expressive writing of positive emotions appears to help reduce test anxiety by using insight and positive emotion words for Chinese students. Efficient and effective intervention programs to ease test anxiety can be designed based on this study.

  5. Benefits of expressive writing in reducing test anxiety: A randomized controlled trial in Chinese samples.

    Directory of Open Access Journals (Sweden)

    Lujun Shen

    Full Text Available To explore the effect of expressive writing of positive emotions on test anxiety among senior-high-school students.The Test Anxiety Scale (TAS was used to assess the anxiety level of 200 senior-high-school students. Seventy-five students with high anxiety were recruited and divided randomly into experimental and control groups. Each day for 30 days, the experimental group engaged in 20 minutes of expressive writing of positive emotions, while the control group was asked to merely write down their daily events. A second test was given after the month-long experiment to analyze whether there had been a reduction in anxiety among the sample. Quantitative data was obtained from TAS scores. The NVivo10.0 software program was used to examine the frequency of particular word categories used in participants' writing manuscripts.Senior-high-school students indicated moderate to high test anxiety. There was a significant difference in post-test results (P 0.05. Students' writing manuscripts were mainly encoded on five code categories: cause, anxiety manifestation, positive emotion, insight and evaluation. There was a negative relation between positive emotion, insight codes and test anxiety. There were significant differences in the positive emotion, anxiety manifestation, and insight code categories between the first 10 days' manuscripts and the last 10 days' ones.Long-term expressive writing of positive emotions appears to help reduce test anxiety by using insight and positive emotion words for Chinese students. Efficient and effective intervention programs to ease test anxiety can be designed based on this study.

  6. Benefits of expressive writing in reducing test anxiety: A randomized controlled trial in Chinese samples

    Science.gov (United States)

    Zhang, Jing; Zhang, Meng

    2018-01-01

    Purpose To explore the effect of expressive writing of positive emotions on test anxiety among senior-high-school students. Methods The Test Anxiety Scale (TAS) was used to assess the anxiety level of 200 senior-high-school students. Seventy-five students with high anxiety were recruited and divided randomly into experimental and control groups. Each day for 30 days, the experimental group engaged in 20 minutes of expressive writing of positive emotions, while the control group was asked to merely write down their daily events. A second test was given after the month-long experiment to analyze whether there had been a reduction in anxiety among the sample. Quantitative data was obtained from TAS scores. The NVivo10.0 software program was used to examine the frequency of particular word categories used in participants’ writing manuscripts. Results Senior-high-school students indicated moderate to high test anxiety. There was a significant difference in post-test results (P 0.05). Students’ writing manuscripts were mainly encoded on five code categories: cause, anxiety manifestation, positive emotion, insight and evaluation. There was a negative relation between positive emotion, insight codes and test anxiety. There were significant differences in the positive emotion, anxiety manifestation, and insight code categories between the first 10 days’ manuscripts and the last 10 days’ ones. Conclusions Long-term expressive writing of positive emotions appears to help reduce test anxiety by using insight and positive emotion words for Chinese students. Efficient and effective intervention programs to ease test anxiety can be designed based on this study. PMID:29401473

  7. Work Sampling Study of an Engineering Professor during a Regular Contract Period

    Science.gov (United States)

    Brink, Jan; McDonald, Dale B.

    2015-01-01

    Work sampling is a technique that has been employed in industry and fields such as healthcare for some time. It is a powerful technique, and an alternative to conventional stop watch time studies, used by industrial engineers to focus upon random work sampling observations. This study applies work sampling to the duties performed by an individual…

  8. Event-triggered synchronization for reaction-diffusion complex networks via random sampling

    Science.gov (United States)

    Dong, Tao; Wang, Aijuan; Zhu, Huiyun; Liao, Xiaofeng

    2018-04-01

    In this paper, the synchronization problem of the reaction-diffusion complex networks (RDCNs) with Dirichlet boundary conditions is considered, where the data is sampled randomly. An event-triggered controller based on the sampled data is proposed, which can reduce the number of controller updates and the communication load. Under this strategy, the synchronization problem of the diffusion complex network is equivalently converted to the stability of a class of reaction-diffusion complex dynamical systems with time delay. By using the matrix inequality technique and Lyapunov method, the synchronization conditions of the RDCNs are derived, which are dependent on the diffusion term. Moreover, it is found that the proposed control strategy can get rid of the Zeno behavior naturally. Finally, a numerical example is given to verify the obtained results.

  9. Mobile Variable Depth Sampling System Design Study

    International Nuclear Information System (INIS)

    BOGER, R.M.

    2000-01-01

    A design study is presented for a mobile, variable depth sampling system (MVDSS) that will support the treatment and immobilization of Hanford LAW and HLW. The sampler can be deployed in a 4-inch tank riser and has a design that is based on requirements identified in the Level 2 Specification (latest revision). The waste feed sequence for the MVDSS is based on Phase 1, Case 3S6 waste feed sequence. Technical information is also presented that supports the design study

  10. Mobile Variable Depth Sampling System Design Study

    Energy Technology Data Exchange (ETDEWEB)

    BOGER, R.M.

    2000-08-25

    A design study is presented for a mobile, variable depth sampling system (MVDSS) that will support the treatment and immobilization of Hanford LAW and HLW. The sampler can be deployed in a 4-inch tank riser and has a design that is based on requirements identified in the Level 2 Specification (latest revision). The waste feed sequence for the MVDSS is based on Phase 1, Case 3S6 waste feed sequence. Technical information is also presented that supports the design study.

  11. Statistics and sampling in transuranic studies

    International Nuclear Information System (INIS)

    Eberhardt, L.L.; Gilbert, R.O.

    1980-01-01

    The existing data on transuranics in the environment exhibit a remarkably high variability from sample to sample (coefficients of variation of 100% or greater). This chapter stresses the necessity of adequate sample size and suggests various ways to increase sampling efficiency. Objectives in sampling are regarded as being of great importance in making decisions as to sampling methodology. Four different classes of sampling methods are described: (1) descriptive sampling, (2) sampling for spatial pattern, (3) analytical sampling, and (4) sampling for modeling. A number of research needs are identified in the various sampling categories along with several problems that appear to be common to two or more such areas

  12. A systematic examination of a random sampling strategy for source apportionment calculations.

    Science.gov (United States)

    Andersson, August

    2011-12-15

    Estimating the relative contributions from multiple potential sources of a specific component in a mixed environmental matrix is a general challenge in diverse fields such as atmospheric, environmental and earth sciences. Perhaps the most common strategy for tackling such problems is by setting up a system of linear equations for the fractional influence of different sources. Even though an algebraic solution of this approach is possible for the common situation with N+1 sources and N source markers, such methodology introduces a bias, since it is implicitly assumed that the calculated fractions and the corresponding uncertainties are independent of the variability of the source distributions. Here, a random sampling (RS) strategy for accounting for such statistical bias is examined by investigating rationally designed synthetic data sets. This random sampling methodology is found to be robust and accurate with respect to reproducibility and predictability. This method is also compared to a numerical integration solution for a two-source situation where source variability also is included. A general observation from this examination is that the variability of the source profiles not only affects the calculated precision but also the mean/median source contributions. Copyright © 2011 Elsevier B.V. All rights reserved.
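
    A minimal sketch (not the paper's code) of the random-sampling idea for a two-source, one-marker mixing problem: the source end-member values are drawn repeatedly from assumed distributions, the mixing equation is solved for each draw, and the spread of the resulting fractions reflects the source variability. The end-member means and standard deviations below are hypothetical.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical marker values (e.g., an isotope ratio) for two sources and the mixture
        src_a_mean, src_a_sd = -27.0, 1.0   # assumed distribution of source A end-member
        src_b_mean, src_b_sd = -19.0, 1.5   # assumed distribution of source B end-member
        mixture = -24.0                     # measured value in the mixed sample

        n_draws = 100_000
        a = rng.normal(src_a_mean, src_a_sd, n_draws)
        b = rng.normal(src_b_mean, src_b_sd, n_draws)

        # Solve mixture = f*a + (1-f)*b for the fraction f of source A in each draw
        f = (mixture - b) / (a - b)
        f = f[(f >= 0) & (f <= 1)]          # keep physically meaningful solutions

        print(f"fraction of source A: median {np.median(f):.2f}, "
              f"95% interval {np.percentile(f, 2.5):.2f}-{np.percentile(f, 97.5):.2f}")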

  13. LOD score exclusion analyses for candidate QTLs using random population samples.

    Science.gov (United States)

    Deng, Hong-Wen

    2003-11-01

    While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes as putative QTLs using random population samples. Previously, we developed an LOD score exclusion mapping approach for candidate genes for complex diseases. Here, we extend this LOD score approach for exclusion analyses of candidate genes for quantitative traits. Under this approach, specific genetic effects (as reflected by heritability) and inheritance models at candidate QTLs can be analyzed and, if an LOD score is ≤ -2.0, the locus can be excluded from having a heritability larger than that specified. Simulations show that this approach has high power to exclude a candidate gene from having moderate genetic effects if it is not a QTL and is robust to population admixture. Our exclusion analysis complements association analysis for candidate genes as putative QTLs in random population samples. The approach is applied to test the importance of the Vitamin D receptor (VDR) gene as a potential QTL underlying the variation of bone mass, an important determinant of osteoporosis.

  14. Landslide Susceptibility Assessment Using Frequency Ratio Technique with Iterative Random Sampling

    Directory of Open Access Journals (Sweden)

    Hyun-Joo Oh

    2017-01-01

    Full Text Available This paper assesses the performance of landslide susceptibility analysis using the frequency ratio (FR) method with iterative random sampling. A pair of before-and-after digital aerial photographs with 50 cm spatial resolution was used to detect landslide occurrences in the Yongin area, Korea. Iterative random sampling was run ten times in total, and each time it was applied to the training and validation datasets. Thirteen landslide causative factors were derived from the topographic, soil, forest, and geological maps. The FR scores were calculated from the causative factors and the training occurrences, repeated ten times. Ten landslide susceptibility maps were obtained by integrating the causative factors weighted by their FR scores. Each landslide susceptibility map was validated with the corresponding validation dataset. The FR method achieved susceptibility accuracies from 89.48% to 93.21%, i.e. consistently higher than 89%. Moreover, the ten-times iterative FR modeling may contribute to a better understanding of a regularized relationship between the causative factors and landslide susceptibility. This makes it possible to incorporate knowledge-driven considerations of the causative factors into the landslide susceptibility analysis, and the approach can also be applied extensively to other areas.
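
    As a rough illustration of the frequency-ratio scoring described above (not the authors' implementation), the FR of each class of a causative factor can be computed as the ratio of the landslide percentage to the area percentage in that class, and a susceptibility index obtained by summing FR scores over factors. Column names and data below are hypothetical.

        import pandas as pd

        def frequency_ratio(df, factor, landslide_col="landslide"):
            """FR per class = (% of landslide cells in class) / (% of all cells in class)."""
            total_cells = len(df)
            total_slides = df[landslide_col].sum()
            grouped = df.groupby(factor)[landslide_col].agg(["sum", "count"])
            pct_slides = grouped["sum"] / total_slides
            pct_area = grouped["count"] / total_cells
            return (pct_slides / pct_area).rename(f"FR_{factor}")

        # Hypothetical raster cells with a slope class, a soil class and a landslide flag (0/1)
        cells = pd.DataFrame({
            "slope_class": [1, 1, 2, 2, 3, 3, 3, 1],
            "soil_class":  ["a", "b", "a", "b", "a", "a", "b", "b"],
            "landslide":   [0, 0, 0, 1, 1, 1, 0, 0],
        })

        susceptibility = sum(
            cells[factor].map(frequency_ratio(cells, factor))
            for factor in ["slope_class", "soil_class"]
        )
        print(susceptibility)  # higher values indicate higher relative susceptibility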

  15. A Combined Weighting Method Based on Hybrid of Interval Evidence Fusion and Random Sampling

    Directory of Open Access Journals (Sweden)

    Ying Yan

    2017-01-01

    Full Text Available Due to the complexity of the system and lack of expertise, epistemic uncertainties may be present in the experts’ judgment on the importance of certain indices during group decision-making. A novel combination weighting method is proposed to solve the index weighting problem when various uncertainties are present in expert comments. Based on the idea of evidence theory, various types of uncertain evaluation information are uniformly expressed through interval evidence structures. A similarity matrix between interval evidences is constructed, and the experts’ information is fused. Comment grades are quantified using interval numbers, and a cumulative probability function for evaluating the importance of indices is constructed based on the fused information. Finally, index weights are obtained by Monte Carlo random sampling. The method can process expert information with varying degrees of uncertainty and has good compatibility. It avoids the difficulty of effectively fusing high-conflict group decision-making information and the large information loss that can follow fusion. Original expert judgments are retained rather objectively throughout the processing procedure. Construction of the cumulative probability function and the random sampling process do not require any human intervention or judgment. The method can be implemented easily by computer programs and thus has an apparent advantage in evaluation practice for fairly large index systems.
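
    A toy sketch of only the final Monte Carlo weighting step described above (the interval-evidence fusion itself is not reproduced): for each index, an importance score is drawn uniformly from its fused interval, each draw is normalized to weights, and averaging over many draws gives the combined weights. The intervals are hypothetical.

        import numpy as np

        rng = np.random.default_rng(42)

        # Hypothetical fused importance intervals [low, high] for four indices
        intervals = np.array([[0.6, 0.9],
                              [0.3, 0.7],
                              [0.5, 0.8],
                              [0.1, 0.4]])

        n_draws = 50_000
        low, high = intervals[:, 0], intervals[:, 1]
        scores = rng.uniform(low, high, size=(n_draws, len(intervals)))

        # Normalize each draw so the weights sum to one, then average over draws
        weights = (scores / scores.sum(axis=1, keepdims=True)).mean(axis=0)
        print(np.round(weights, 3), weights.sum())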

  16. Smoothing the redshift distributions of random samples for the baryon acoustic oscillations: applications to the SDSS-III BOSS DR12 and QPM mock samples

    Science.gov (United States)

    Wang, Shao-Jiang; Guo, Qi; Cai, Rong-Gen

    2017-12-01

    We investigate the impact of different redshift distributions of random samples on the baryon acoustic oscillations (BAO) measurements of D_V(z)r_d^fid/r_d from the two-point correlation functions of galaxies in the Data Release 12 of the Baryon Oscillation Spectroscopic Survey (BOSS). Large surveys, such as BOSS, usually assign redshifts to the random samples by randomly drawing values from the measured redshift distributions of the data; if the cosmic variance cannot be ignored, this necessarily introduces fiducial fluctuation signals into the random samples and weakens the BAO signals. We propose populating the random galaxy samples using a smooth function fitted to the measured redshift distribution. The resulting cosmological parameters match the input parameters of the mock catalogue very well. The significance of the BAO signals is improved by 0.33σ for a low-redshift sample and by 0.03σ for a constant-stellar-mass sample, though the absolute values do not change significantly. Given the precision of current cosmological parameter measurements, such an improvement would be valuable for future measurements of galaxy clustering.
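
    The idea of populating randoms from a smoothed redshift distribution rather than from the raw data histogram can be sketched as follows (a simplified stand-in for the authors' procedure, on synthetic data): fit a smooth curve to the binned n(z) of the galaxies, build its cumulative distribution, and draw random redshifts by inverse-transform sampling.

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic galaxy redshifts standing in for a survey catalogue
        z_gal = rng.normal(0.5, 0.1, 20_000).clip(0.2, 0.8)

        # Bin the redshift distribution, then smooth it with a low-order polynomial fit
        bins = np.linspace(0.2, 0.8, 61)
        counts, edges = np.histogram(z_gal, bins=bins)
        centers = 0.5 * (edges[:-1] + edges[1:])
        coeffs = np.polyfit(centers, counts, deg=5)
        smooth_nz = np.clip(np.polyval(coeffs, centers), 0, None)

        # Inverse-transform sampling from the smoothed n(z) to assign redshifts to randoms
        cdf = np.cumsum(smooth_nz)
        cdf /= cdf[-1]
        u = rng.uniform(size=50 * len(z_gal))          # randoms are typically many times the data
        z_rand = np.interp(u, cdf, centers)

        print(z_rand.min(), z_rand.max(), z_rand.mean())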

  17. The quality of the reported sample size calculations in randomized controlled trials indexed in PubMed.

    Science.gov (United States)

    Lee, Paul H; Tse, Andy C Y

    2017-05-01

    There are limited data on the quality of reporting of information essential for replication of sample size calculations, as well as on the accuracy of these calculations. We examined the quality of reporting of the sample size calculation in randomized controlled trials (RCTs) indexed in PubMed, and the variation in reporting across study design, study characteristics, and journal impact factor. We also reviewed the targeted sample sizes reported in trial registries. We reviewed and analyzed all RCTs published in December 2014 in journals indexed in PubMed. The 2014 Impact Factors for the journals were used as proxies for their quality. Of the 451 analyzed papers, 58.1% reported an a priori sample size calculation. Nearly all papers provided the level of significance (97.7%) and desired power (96.6%), and most of the papers reported the minimum clinically important effect size (73.3%). The median percentage difference between the reported and recalculated sample sizes was 0.0% (inter-quartile range -4.6% to 3.0%). The accuracy of the reported sample size was better for studies published in journals that endorsed the CONSORT statement and in journals with an impact factor. A total of 98 papers provided a targeted sample size in trial registries, and about two-thirds of these papers (n=62) reported a sample size calculation, but only 25 (40.3%) showed no discrepancy with the number reported in the trial registries. The reporting of the sample size calculation in RCTs published in PubMed-indexed journals and trial registries was poor. The CONSORT statement should be more widely endorsed. Copyright © 2016 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.
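
    For reference, the standard two-arm formula such reported calculations can be checked against (continuous outcome, equal allocation) is n per group = 2*(z_{1-alpha/2} + z_{1-beta})^2 * sigma^2 / delta^2. A small sketch, not tied to any specific trial in the review:

        from math import ceil
        from scipy.stats import norm

        def n_per_group(delta, sigma, alpha=0.05, power=0.80):
            """Two-sided, two-sample comparison of means with equal group sizes."""
            z_alpha = norm.ppf(1 - alpha / 2)
            z_beta = norm.ppf(power)
            return ceil(2 * ((z_alpha + z_beta) * sigma / delta) ** 2)

        # Example: detect a 5-point difference, SD of 12, alpha 0.05, power 80%
        print(n_per_group(delta=5, sigma=12))   # ~91 per group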

  18. Sampling pig farms at the abattoir in a cross-sectional study − Evaluation of a sampling method

    DEFF Research Database (Denmark)

    Birkegård, Anna Camilla; Hisham Beshara Halasa, Tariq; Toft, Nils

    2017-01-01

    A cross-sectional study design is relatively inexpensive, fast and easy to conduct when compared to other study designs. Careful planning is essential to obtaining a representative sample of the population, and the recommended approach is to use simple random sampling from an exhaustive list of units in the target population. This approach is rarely feasible in practice, and other sampling procedures must often be adopted. For example, when slaughter pigs are the target population, sampling the pigs on the slaughter line may be an alternative to on-site sampling at a list of farms. However ... it was not possible to visit farms within the designated timeframe. Therefore, it was decided to use convenience sampling at the abattoir. Our approach was carried out in three steps: 1) planning: using data from meat inspection to plan at which abattoirs and how many farms to sample; 2) ...

  19. Inferences about Variance Components and Reliability-Generalizability Coefficients in the Absence of Random Sampling.

    Science.gov (United States)

    Kane, Michael

    2002-01-01

    Reviews the criticisms of sampling assumptions in generalizability theory (and in reliability theory) and examines the feasibility of using representative sampling, stratification, homogeneity assumptions, and replications to address these criticisms. Suggests some general outlines for the conduct of generalizability theory studies. (SLD)

  20. Sample size allocation in multiregional equivalence studies.

    Science.gov (United States)

    Liao, Jason J Z; Yu, Ziji; Li, Yulan

    2018-06-17

    With the increasing globalization of drug development, the multiregional clinical trial (MRCT) has gained extensive use. The data from MRCTs could be accepted by regulatory authorities across regions and countries as the primary sources of evidence to support global marketing drug approval simultaneously. The MRCT can speed up patient enrollment and drug approval, and it makes effective therapies available to patients all over the world simultaneously. However, there are many challenges, both operational and scientific, in conducting drug development globally. One of the many important questions to answer in the design of a multiregional study is how to partition the sample size across the individual regions. In this paper, two systematic approaches are proposed for the sample size allocation in a multiregional equivalence trial. A numerical evaluation and a biosimilar trial are used to illustrate the characteristics of the proposed approaches. Copyright © 2018 John Wiley & Sons, Ltd.

  1. RANdom SAmple Consensus (RANSAC) algorithm for material-informatics: application to photovoltaic solar cells.

    Science.gov (United States)

    Kaspi, Omer; Yosipof, Abraham; Senderowitz, Hanoch

    2017-06-06

    An important aspect of chemoinformatics and material-informatics is the usage of machine learning algorithms to build Quantitative Structure Activity Relationship (QSAR) models. The RANdom SAmple Consensus (RANSAC) algorithm is a predictive modeling tool widely used in the image processing field for cleaning datasets from noise. RANSAC could be used as a "one stop shop" algorithm for developing and validating QSAR models, performing outlier removal, descriptor selection, model development and predictions for test set samples using an applicability domain. For "future" predictions (i.e., for samples not included in the original test set) RANSAC provides a statistical estimate for the probability of obtaining reliable predictions, i.e., predictions within a pre-defined number of standard deviations from the true values. In this work we describe the first application of RANSAC in material-informatics, focusing on the analysis of solar cells. We demonstrate that for three datasets representing different metal oxide (MO) based solar cell libraries, RANSAC-derived models select descriptors previously shown to correlate with key photovoltaic properties and lead to good predictive statistics for these properties. These models were subsequently used to predict the properties of virtual solar cell libraries, highlighting interesting dependencies of PV properties on MO compositions.
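
    RANSAC itself is widely available; below is a minimal sketch of using it as a robust regression/outlier-removal step on a small synthetic dataset (standing in for a solar-cell library, which is not reproduced here) with scikit-learn.

        import numpy as np
        from sklearn.linear_model import RANSACRegressor

        rng = np.random.default_rng(7)

        # Synthetic descriptors (X) and property (y) with a linear trend plus gross outliers
        X = rng.uniform(0, 1, size=(200, 3))
        y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(0, 0.05, 200)
        y[:20] += 5.0                                  # simulated outlier samples

        ransac = RANSACRegressor(residual_threshold=0.5, random_state=0)
        ransac.fit(X, y)

        print("inlier fraction:", ransac.inlier_mask_.mean())
        print("coefficients:", ransac.estimator_.coef_)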

  2. Gray bootstrap method for estimating frequency-varying random vibration signals with small samples

    Directory of Open Access Journals (Sweden)

    Wang Yanqing

    2014-04-01

    Full Text Available During environment testing, the estimation of random vibration signals (RVS) is an important technique for airborne platform safety and reliability. However, the available methods, including the extreme value envelope method (EVEM), the statistical tolerances method (STM) and the improved statistical tolerance method (ISTM), require large samples and a typical probability distribution. Moreover, the frequency-varying characteristic of RVS is usually not taken into account. The gray bootstrap method (GBM) is proposed to solve the problem of estimating frequency-varying RVS with small samples. Firstly, the estimated indexes are obtained, including the estimated interval, the estimated uncertainty, the estimated value, the estimated error and the estimated reliability. In addition, GBM is applied to estimating data from a single flight test of a certain aircraft. Finally, in order to evaluate the estimation performance, GBM is compared with the bootstrap method (BM) and the gray method (GM) in testing analysis. The results show that GBM is superior for estimating dynamic signals with small samples, and the estimated reliability is proved to be 100% at the given confidence level.
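
    The gray-model part of GBM is specific to the paper, but the bootstrap component it builds on can be sketched simply: resample a small sample with replacement many times and take percentiles of the resampled statistic as the estimated interval. Data and confidence level below are illustrative.

        import numpy as np

        rng = np.random.default_rng(3)

        # A small sample of, say, peak vibration amplitudes (illustrative values)
        x = np.array([4.1, 3.8, 5.0, 4.6, 4.3, 3.9, 4.8])

        boot_means = np.array([
            rng.choice(x, size=x.size, replace=True).mean()
            for _ in range(10_000)
        ])

        low, high = np.percentile(boot_means, [2.5, 97.5])
        print(f"bootstrap 95% interval for the mean: [{low:.2f}, {high:.2f}]")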

  3. Randomized controlled trial of attention bias modification in a racially diverse, socially anxious, alcohol dependent sample.

    Science.gov (United States)

    Clerkin, Elise M; Magee, Joshua C; Wells, Tony T; Beard, Courtney; Barnett, Nancy P

    2016-12-01

    Attention biases may be an important treatment target for both alcohol dependence and social anxiety. This is the first ABM trial to investigate two (vs. one) targets of attention bias within a sample with co-occurring symptoms of social anxiety and alcohol dependence. Additionally, we used trial-level bias scores (TL-BS) to capture the phenomena of attention bias in a more ecologically valid, dynamic way compared to traditional attention bias scores. Adult participants (N = 86; 41% Female; 52% African American; 40% White) with elevated social anxiety symptoms and alcohol dependence were randomly assigned to an 8-session training condition in this 2 (Social Anxiety ABM vs. Social Anxiety Control) by 2 (Alcohol ABM vs. Alcohol Control) design. Symptoms of social anxiety, alcohol dependence, and attention bias were assessed across time. Multilevel models estimated the trajectories for each measure within individuals, and tested whether these trajectories differed according to the randomized training conditions. Across time, there were significant or trending decreases in all attention TL-BS parameters (but not traditional attention bias scores) and most symptom measures. However, there were not significant differences in the trajectories of change between any ABM and control conditions for any symptom measures. These findings add to previous evidence questioning the robustness of ABM and point to the need to extend the effects of ABM to samples that are racially diverse and/or have co-occurring psychopathology. The results also illustrate the potential importance of calculating trial-level attention bias scores rather than only including traditional bias scores. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Randomized Controlled Trial of Attention Bias Modification in a Racially Diverse, Socially Anxious, Alcohol Dependent Sample

    Science.gov (United States)

    Clerkin, Elise M.; Magee, Joshua C.; Wells, Tony T.; Beard, Courtney; Barnett, Nancy P.

    2016-01-01

    Objective Attention biases may be an important treatment target for both alcohol dependence and social anxiety. This is the first ABM trial to investigate two (vs. one) targets of attention bias within a sample with co-occurring symptoms of social anxiety and alcohol dependence. Additionally, we used trial-level bias scores (TL-BS) to capture the phenomena of attention bias in a more ecologically valid, dynamic way compared to traditional attention bias scores. Method Adult participants (N=86; 41% Female; 52% African American; 40% White) with elevated social anxiety symptoms and alcohol dependence were randomly assigned to an 8-session training condition in this 2 (Social Anxiety ABM vs. Social Anxiety Control) by 2 (Alcohol ABM vs. Alcohol Control) design. Symptoms of social anxiety, alcohol dependence, and attention bias were assessed across time. Results Multilevel models estimated the trajectories for each measure within individuals, and tested whether these trajectories differed according to the randomized training conditions. Across time, there were significant or trending decreases in all attention TL-BS parameters (but not traditional attention bias scores) and most symptom measures. However, there were not significant differences in the trajectories of change between any ABM and control conditions for any symptom measures. Conclusions These findings add to previous evidence questioning the robustness of ABM and point to the need to extend the effects of ABM to samples that are racially diverse and/or have co-occurring psychopathology. The results also illustrate the potential importance of calculating trial-level attention bias scores rather than only including traditional bias scores. PMID:27591918

  5. The Study on Mental Health at Work: Design and sampling.

    Science.gov (United States)

    Rose, Uwe; Schiel, Stefan; Schröder, Helmut; Kleudgen, Martin; Tophoven, Silke; Rauch, Angela; Freude, Gabriele; Müller, Grit

    2017-08-01

    The Study on Mental Health at Work (S-MGA) generates the first nationwide representative survey enabling the exploration of the relationship between working conditions, mental health and functioning. This paper describes the study design, sampling procedures and data collection, and presents a summary of the sample characteristics. S-MGA is a representative study of German employees aged 31-60 years subject to social security contributions. The sample was drawn from the employment register based on a two-stage cluster sampling procedure. Firstly, 206 municipalities were randomly selected from a pool of 12,227 municipalities in Germany. Secondly, 13,590 addresses were drawn from the selected municipalities for the purpose of conducting 4500 face-to-face interviews. The questionnaire covers psychosocial working and employment conditions, measures of mental health, work ability and functioning. Data from personal interviews were combined with employment histories from register data. Descriptive statistics of socio-demographic characteristics and logistic regression analyses were used for comparing population, gross sample and respondents. In total, 4511 face-to-face interviews were conducted. A test for sampling bias revealed that individuals in older cohorts participated more often, while individuals with an unknown educational level, residing in major cities or with a non-German ethnic background were slightly underrepresented. There is no indication of major deviations in characteristics between the basic population and the sample of respondents. Hence, S-MGA provides representative data for research on work and health, designed as a cohort study with plans to rerun the survey 5 years after the first assessment.

  6. The Study on Mental Health at Work: Design and sampling

    Science.gov (United States)

    Rose, Uwe; Schiel, Stefan; Schröder, Helmut; Kleudgen, Martin; Tophoven, Silke; Rauch, Angela; Freude, Gabriele; Müller, Grit

    2017-01-01

    Aims: The Study on Mental Health at Work (S-MGA) generates the first nationwide representative survey enabling the exploration of the relationship between working conditions, mental health and functioning. This paper describes the study design, sampling procedures and data collection, and presents a summary of the sample characteristics. Methods: S-MGA is a representative study of German employees aged 31–60 years subject to social security contributions. The sample was drawn from the employment register based on a two-stage cluster sampling procedure. Firstly, 206 municipalities were randomly selected from a pool of 12,227 municipalities in Germany. Secondly, 13,590 addresses were drawn from the selected municipalities for the purpose of conducting 4500 face-to-face interviews. The questionnaire covers psychosocial working and employment conditions, measures of mental health, work ability and functioning. Data from personal interviews were combined with employment histories from register data. Descriptive statistics of socio-demographic characteristics and logistic regression analyses were used for comparing population, gross sample and respondents. Results: In total, 4511 face-to-face interviews were conducted. A test for sampling bias revealed that individuals in older cohorts participated more often, while individuals with an unknown educational level, residing in major cities or with a non-German ethnic background were slightly underrepresented. Conclusions: There is no indication of major deviations in characteristics between the basic population and the sample of respondents. Hence, S-MGA provides representative data for research on work and health, designed as a cohort study with plans to rerun the survey 5 years after the first assessment. PMID:28673202

  7. Surface studies of plasma processed Nb samples

    International Nuclear Information System (INIS)

    Tyagi, Puneet V.; Doleans, Marc; Hannah, Brian S.; Afanador, Ralph; Stewart, Stephen; Mammosser, John; Howell, Matthew P; Saunders, Jeffrey W; Degraff, Brian D; Kim, Sang-Ho

    2015-01-01

    Contaminants present at the top surface of superconducting radio frequency (SRF) cavities can act as field emitters and restrict the cavity accelerating gradient. A room-temperature in-situ plasma processing technology for SRF cavities, aiming to clean hydrocarbons from the inner surface of cavities, has recently been developed at the Spallation Neutron Source (SNS). Surface studies of the plasma-processed Nb samples by secondary ion mass spectrometry (SIMS) and scanning Kelvin probe (SKP) showed that the Ne-O_2 plasma processing is very effective at removing carbonaceous contaminants from the top surface and increases the surface work function by 0.5 to 1.0 eV.

  8. Experimental percolation studies of random networks

    Science.gov (United States)

    Feinerman, A.; Weddell, J.

    2017-06-01

    This report establishes an experimental method of studying electrically percolating networks at a higher resolution than previously implemented. This method measures the current across a conductive sheet as a function of time as elliptical pores are cut into the sheet. This is done utilizing a Universal Laser System X2-600 100 W CO2 laser system with a 76 × 46 cm^2 field and 394 dpc (dots/cm) resolution. The laser cuts a random system of elliptical pores into a conductive sheet with a voltage applied across it, while the current is measured versus time. This allows experimental verification of the percolation threshold as a function of the ellipses' aspect ratio (minor/major diameter). We show that as the aspect ratio approaches zero, the percolation threshold approaches one. The benefit of this method is that it can experimentally measure the effect of removing small pores, as well as pores with complex geometries such as an asterisk, from a conductive sheet.

  9. Random Photon Absorption Model Elucidates How Early Gain Control in Fly Photoreceptors Arises from Quantal Sampling

    Science.gov (United States)

    Song, Zhuoyi; Zhou, Yu; Juusola, Mikko

    2016-01-01

    Many diurnal photoreceptors encode vast real-world light changes effectively, but how this performance originates from photon sampling is unclear. A 4-module biophysically-realistic fly photoreceptor model, in which information capture is limited by the number of its sampling units (microvilli) and their photon-hit recovery time (refractoriness), can accurately simulate real recordings and their information content. However, sublinear summation in quantum bump production (quantum-gain-nonlinearity) may also cause adaptation by reducing the bump/photon gain when multiple photons hit the same microvillus simultaneously. Here, we use a Random Photon Absorption Model (RandPAM), which is the 1st module of the 4-module fly photoreceptor model, to quantify the contribution of quantum-gain-nonlinearity in light adaptation. We show how quantum-gain-nonlinearity already results from photon sampling alone. In the extreme case, when two or more simultaneous photon-hits reduce to a single sublinear value, quantum-gain-nonlinearity is preset before the phototransduction reactions adapt the quantum bump waveform. However, the contribution of quantum-gain-nonlinearity in light adaptation depends upon the likelihood of multi-photon-hits, which is strictly determined by the number of microvilli and light intensity. Specifically, its contribution to light-adaptation is marginal (≤ 1%) in fly photoreceptors with many thousands of microvilli, because the probability of simultaneous multi-photon-hits on any one microvillus is low even during daylight conditions. However, in cells with fewer sampling units, the impact of quantum-gain-nonlinearity increases with brightening light. PMID:27445779
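
    A stripped-down sketch of the photon-sampling idea described above (not the published 4-module model): photons are assigned to microvilli at random, and because a microvillus hit by several photons in the same window still yields a single bump, the bump/photon gain falls as intensity rises while remaining close to one when the microvillus count is large. Numbers are illustrative.

        import numpy as np

        rng = np.random.default_rng(0)

        def bump_gain(n_photons, n_microvilli, n_trials=200):
            """Mean bumps per photon when simultaneous hits on a microvillus give one bump."""
            gains = []
            for _ in range(n_trials):
                hits = rng.integers(0, n_microvilli, size=n_photons)   # random microvillus per photon
                bumps = np.unique(hits).size                            # multi-hits collapse to one bump
                gains.append(bumps / n_photons)
            return np.mean(gains)

        for photons in (100, 1_000, 10_000, 100_000):
            print(photons, round(bump_gain(photons, n_microvilli=30_000), 3))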

  10. Numerical study of microphase separation in gels and random media

    International Nuclear Information System (INIS)

    Uchida, Nariya

    2004-01-01

    Microphase separation in gels and random media is numerically studied using a Ginzburg-Landau model. A random field destroys long-range orientational (lamellar) order and gives rise to a disordered bicontinuous morphology. The dependence of the correlation length on the field strength is distinct from that of random-field magnets

  11. Random sampling or geostatistical modelling? Choosing between design-based and model-based sampling strategies for soil (with discussion)

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    1997-01-01

    Classical sampling theory has been repeatedly identified with classical statistics which assumes that data are identically and independently distributed. This explains the switch of many soil scientists from design-based sampling strategies, based on classical sampling theory, to the model-based

  12. Cluster-randomized Studies in Educational Research: Principles and Methodological Aspects.

    Science.gov (United States)

    Dreyhaupt, Jens; Mayer, Benjamin; Keis, Oliver; Öchsner, Wolfgang; Muche, Rainer

    2017-01-01

    An increasing number of studies are being performed in educational research to evaluate new teaching methods and approaches. These studies could be performed more efficiently and deliver more convincing results if they more strictly applied and complied with recognized standards of scientific studies. Such an approach could substantially increase the quality in particular of prospective, two-arm (intervention) studies that aim to compare two different teaching methods. A key standard in such studies is randomization, which can minimize systematic bias in study findings; such bias may result if the two study arms are not structurally equivalent. If possible, educational research studies should also achieve this standard, although this is not yet generally the case. Some difficulties and concerns exist, particularly regarding organizational and methodological aspects. An important point to consider in educational research studies is that usually individuals cannot be randomized, because of the teaching situation, and instead whole groups have to be randomized (so-called "cluster randomization"). Compared with studies with individual randomization, studies with cluster randomization normally require (significantly) larger sample sizes and more complex methods for calculating sample size. Furthermore, cluster-randomized studies require more complex methods for statistical analysis. The consequence of the above is that a competent expert with respective special knowledge needs to be involved in all phases of cluster-randomized studies. Studies to evaluate new teaching methods need to make greater use of randomization in order to achieve scientifically convincing results. Therefore, in this article we describe the general principles of cluster randomization and how to implement these principles, and we also outline practical aspects of using cluster randomization in prospective, two-arm comparative educational research studies.
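
    One reason cluster-randomized studies need larger samples is the design effect; a common back-of-the-envelope inflation (a generic formula, not taken from this article) multiplies the individually randomized sample size by 1 + (m - 1) * ICC, where m is the cluster size and ICC the intracluster correlation. A small sketch with hypothetical numbers:

        from math import ceil

        def cluster_sample_size(n_individual, cluster_size, icc):
            """Inflate an individually randomized sample size by the design effect."""
            design_effect = 1 + (cluster_size - 1) * icc
            n_total = ceil(n_individual * design_effect)
            n_clusters = ceil(n_total / cluster_size)
            return n_total, n_clusters

        # Example: 128 students needed under individual randomization,
        # taught in classes of 25 with an assumed ICC of 0.05
        print(cluster_sample_size(n_individual=128, cluster_size=25, icc=0.05))  # (282, 12)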

  13. The contribution of simple random sampling to observed variations in faecal egg counts.

    Science.gov (United States)

    Torgerson, Paul R; Paul, Michaela; Lewis, Fraser I

    2012-09-10

    It has been over 100 years since the classical paper published by Gosset in 1907, under the pseudonym "Student", demonstrated that yeast cells suspended in a fluid and measured by a haemocytometer conformed to a Poisson process. Similarly, parasite eggs in a faecal suspension also conform to a Poisson process. Despite this, there are common misconceptions about how to analyse or interpret observations from the McMaster or similar quantitative parasitic diagnostic techniques, widely used for evaluating parasite eggs in faeces. The McMaster technique can easily be shown from a theoretical perspective to give variable results that inevitably arise from the random distribution of parasite eggs in a well mixed faecal sample. The Poisson processes that lead to this variability are described, and illustrative examples are given of the potentially large confidence intervals that can arise from observed faecal egg counts calculated from the observations on a McMaster slide. Attempts to modify the McMaster technique, or indeed other quantitative techniques, to ensure uniform egg counts are doomed to failure and belie ignorance of Poisson processes. A simple method to immediately identify excess variation/poor sampling from replicate counts is provided. Copyright © 2012 Elsevier B.V. All rights reserved.
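
    The Poisson variability described above translates directly into wide confidence intervals for McMaster counts. A sketch using generic exact Poisson limits, with a hypothetical multiplication factor of 50 eggs per gram for each egg counted:

        from scipy.stats import chi2

        def epg_confidence_interval(eggs_counted, factor=50, conf=0.95):
            """Exact Poisson CI for eggs per gram from a McMaster count."""
            a = 1 - conf
            lower = 0.0 if eggs_counted == 0 else chi2.ppf(a / 2, 2 * eggs_counted) / 2
            upper = chi2.ppf(1 - a / 2, 2 * (eggs_counted + 1)) / 2
            return eggs_counted * factor, lower * factor, upper * factor

        # Example: 8 eggs seen on the slide -> point estimate 400 epg, CI roughly 170-790 epg
        print(epg_confidence_interval(8))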

  14. Paleomagnetic Studies of Returned Samples from Mars

    Science.gov (United States)

    Weiss, B. P.; Beaty, D. W.; McSween, H. Y.; Carrier, B. L.; Czaja, A. D.; Goreva, Y. S.; Hausrath, E.; Herd, C. D. K.; Humayun, M.; McCubbin, F. M.; McLennan, S. M.; Pratt, L. M.; Sephton, M. A.; Steele, A.

    2018-04-01

    Magnetic measurements of returned samples could transform our understanding of the martian dynamo and its connection to climatic and planetary thermal evolution and provide powerful constraints on the preservation state of sample biosignatures.

  15. Study on graphite samples for nuclear usage

    International Nuclear Information System (INIS)

    Suarez, J.C.M.; Silva Roseira, M. da

    1994-01-01

    Available as short communication only. Graphite, due to its properties (mechanical strength, thermal conductivity, high-temperature stability, machinability, etc.), has many industrial applications and, consequently, an important strategic value. In the nuclear area, it has been used as a moderator and reflector of neutrons in the fission of uranium. Graphite can be produced from many types of carbonaceous materials by a variety of processes controlled by the manufacturers. This is why a large number of graphite types with different physical and mechanical properties are available on the world market. The present investigation studies some physical characteristics of graphite samples destined for use in a nuclear reactor. (author). 8 refs, 1 fig, 1 tab

  16. GASOLINE VEHICLE EXHAUST PARTICLE SAMPLING STUDY

    Energy Technology Data Exchange (ETDEWEB)

    Kittelson, D; Watts, W; Johnson, J; Zarling, D; Schauer, J; Kasper, K; Baltensperger, U; Burtscher, H

    2003-08-24

    The University of Minnesota collaborated with the Paul Scherrer Institute, the University of Wisconsin (UWI) and Ricardo, Inc. to physically and chemically characterize the exhaust plume from recruited gasoline spark ignition (SI) vehicles. The project objectives were: (1) Measure representative particle size distributions from a set of on-road SI vehicles and compare these data to similar data collected on a small subset of light-duty gasoline vehicles tested on a chassis dynamometer with a dilution tunnel using the Unified Drive Cycle, at both room temperature (cold start) and 0 °C (cold-cold start). (2) Compare data collected from SI vehicles to similar data collected from Diesel engines during the Coordinating Research Council E-43 project. (3) Characterize on-road aerosol during mixed midweek traffic and Sunday midday periods and determine fleet-specific emission rates. (4) Characterize bulk- and size-segregated chemical composition of the particulate matter (PM) emitted in the exhaust from the gasoline vehicles. Particle number concentrations and size distributions are strongly influenced by dilution and sampling conditions. Laboratory methods were evaluated to dilute SI exhaust in a way that would produce size distributions that were similar to those measured during laboratory experiments. Size-fractionated samples were collected for chemical analysis using a nano-microorifice uniform deposit impactor (nano-MOUDI). In addition, bulk samples were collected and analyzed. A mixture of low, mid and high mileage vehicles was recruited for testing during the study. Under steady highway cruise conditions, a significant particle signature above background was not measured, but during hard accelerations number size distributions for the test fleet were similar to those of modern heavy-duty Diesel vehicles. Number emissions were much higher at high speed and during cold-cold starts. Fuel-specific number emissions range from 10^12 to 3 x 10^16 particles/kg fuel. A simple

  17. Optimal Subset Selection of Time-Series MODIS Images and Sample Data Transfer with Random Forests for Supervised Classification Modelling.

    Science.gov (United States)

    Zhou, Fuqun; Zhang, Aining

    2016-10-25

    Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets including 8-day composites from NASA, and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to efficiently use these time-series MODIS datasets for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2-3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle the two important issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of Random Forests' features: variable importance, outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about a half of the variables we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data.
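
    The variable-importance step described above can be sketched with scikit-learn (a generic illustration on synthetic data, not the CCRS MODIS workflow): train a Random Forest on all time-series bands, rank them by importance, and keep only the top-ranked subset.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(5)

        # Synthetic stand-in for per-pixel time-series features (e.g., 36 composite bands)
        n_pixels, n_bands = 2_000, 36
        X = rng.normal(size=(n_pixels, n_bands))
        y = (X[:, 3] + 0.8 * X[:, 17] - 0.5 * X[:, 30] + rng.normal(0, 0.5, n_pixels) > 0).astype(int)

        rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)

        # Rank bands by importance and keep roughly the top half
        order = np.argsort(rf.feature_importances_)[::-1]
        selected = order[: n_bands // 2]
        print("top 5 bands:", order[:5])
        print("retained subset size:", selected.size)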

  18. Cluster-randomized Studies in Educational Research: Principles and Methodological Aspects

    Directory of Open Access Journals (Sweden)

    Dreyhaupt, Jens

    2017-05-01

    Full Text Available An increasing number of studies are being performed in educational research to evaluate new teaching methods and approaches. These studies could be performed more efficiently and deliver more convincing results if they more strictly applied and complied with recognized standards of scientific studies. Such an approach could substantially increase the quality in particular of prospective, two-arm (intervention) studies that aim to compare two different teaching methods. A key standard in such studies is randomization, which can minimize systematic bias in study findings; such bias may result if the two study arms are not structurally equivalent. If possible, educational research studies should also achieve this standard, although this is not yet generally the case. Some difficulties and concerns exist, particularly regarding organizational and methodological aspects. An important point to consider in educational research studies is that usually individuals cannot be randomized, because of the teaching situation, and instead whole groups have to be randomized (so-called “cluster randomization”). Compared with studies with individual randomization, studies with cluster randomization normally require (significantly) larger sample sizes and more complex methods for calculating sample size. Furthermore, cluster-randomized studies require more complex methods for statistical analysis. The consequence of the above is that a competent expert with respective special knowledge needs to be involved in all phases of cluster-randomized studies. Studies to evaluate new teaching methods need to make greater use of randomization in order to achieve scientifically convincing results. Therefore, in this article we describe the general principles of cluster randomization and how to implement these principles, and we also outline practical aspects of using cluster randomization in prospective, two-arm comparative educational research studies.

  19. A Systematic Review of Surgical Randomized Controlled Trials: Part 2. Funding Source, Conflict of Interest, and Sample Size in Plastic Surgery.

    Science.gov (United States)

    Voineskos, Sophocles H; Coroneos, Christopher J; Ziolkowski, Natalia I; Kaur, Manraj N; Banfield, Laura; Meade, Maureen O; Chung, Kevin C; Thoma, Achilleas; Bhandari, Mohit

    2016-02-01

    The authors examined industry support, conflict of interest, and sample size in plastic surgery randomized controlled trials that compared surgical interventions. They hypothesized that industry-funded trials demonstrate statistically significant outcomes more often, and randomized controlled trials with small sample sizes report statistically significant results more frequently. An electronic search identified randomized controlled trials published between 2000 and 2013. Independent reviewers assessed manuscripts and performed data extraction. Funding source, conflict of interest, primary outcome direction, and sample size were examined. Chi-squared and independent-samples t tests were used in the analysis. The search identified 173 randomized controlled trials, of which 100 (58 percent) did not acknowledge funding status. A relationship between funding source and trial outcome direction was not observed. Both funding status and conflict of interest reporting improved over time. Only 24 percent (six of 25) of industry-funded randomized controlled trials reported authors to have independent control of data and manuscript contents. The mean number of patients randomized was 73 per trial (median, 43, minimum, 3, maximum, 936). Small trials were not found to be positive more often than large trials (p = 0.87). Randomized controlled trials with small sample size were common; however, this provides great opportunity for the field to engage in further collaboration and produce larger, more definitive trials. Reporting of trial funding and conflict of interest is historically poor, but it greatly improved over the study period. Underreporting at author and journal levels remains a limitation when assessing the relationship between funding source and trial outcomes. Improved reporting and manuscript control should be goals that both authors and journals can actively achieve.

  20. Comparing cluster-level dynamic treatment regimens using sequential, multiple assignment, randomized trials: Regression estimation and sample size considerations.

    Science.gov (United States)

    NeCamp, Timothy; Kilbourne, Amy; Almirall, Daniel

    2017-08-01

    Cluster-level dynamic treatment regimens can be used to guide sequential treatment decision-making at the cluster level in order to improve outcomes at the individual or patient-level. In a cluster-level dynamic treatment regimen, the treatment is potentially adapted and re-adapted over time based on changes in the cluster that could be impacted by prior intervention, including aggregate measures of the individuals or patients that compose it. Cluster-randomized sequential multiple assignment randomized trials can be used to answer multiple open questions preventing scientists from developing high-quality cluster-level dynamic treatment regimens. In a cluster-randomized sequential multiple assignment randomized trial, sequential randomizations occur at the cluster level and outcomes are observed at the individual level. This manuscript makes two contributions to the design and analysis of cluster-randomized sequential multiple assignment randomized trials. First, a weighted least squares regression approach is proposed for comparing the mean of a patient-level outcome between the cluster-level dynamic treatment regimens embedded in a sequential multiple assignment randomized trial. The regression approach facilitates the use of baseline covariates which is often critical in the analysis of cluster-level trials. Second, sample size calculators are derived for two common cluster-randomized sequential multiple assignment randomized trial designs for use when the primary aim is a between-dynamic treatment regimen comparison of the mean of a continuous patient-level outcome. The methods are motivated by the Adaptive Implementation of Effective Programs Trial which is, to our knowledge, the first-ever cluster-randomized sequential multiple assignment randomized trial in psychiatry.

  1. Hardware architecture for projective model calculation and false match refining using random sample consensus algorithm

    Science.gov (United States)

    Azimi, Ehsan; Behrad, Alireza; Ghaznavi-Ghoushchi, Mohammad Bagher; Shanbehzadeh, Jamshid

    2016-11-01

    The projective model is an important mapping function for the calculation of global transformation between two images. However, its hardware implementation is challenging because of a large number of coefficients with different required precisions for fixed point representation. A VLSI hardware architecture is proposed for the calculation of a global projective model between input and reference images and refining false matches using random sample consensus (RANSAC) algorithm. To make the hardware implementation feasible, it is proved that the calculation of the projective model can be divided into four submodels comprising two translations, an affine model and a simpler projective mapping. This approach makes the hardware implementation feasible and considerably reduces the required number of bits for fixed point representation of model coefficients and intermediate variables. The proposed hardware architecture for the calculation of a global projective model using the RANSAC algorithm was implemented using Verilog hardware description language and the functionality of the design was validated through several experiments. The proposed architecture was synthesized by using an application-specific integrated circuit digital design flow utilizing 180-nm CMOS technology as well as a Virtex-6 field programmable gate array. Experimental results confirm the efficiency of the proposed hardware architecture in comparison with software implementation.

  2. Discriminative motif discovery via simulated evolution and random under-sampling.

    Directory of Open Access Journals (Sweden)

    Tao Song

    Full Text Available Conserved motifs in biological sequences are closely related to their structure and functions. Recently, discriminative motif discovery methods have attracted more and more attention. However, little attention has been devoted to the data imbalance problem, which is one of the main reasons affecting the performance of the discriminative models. In this article, a simulated evolution method is applied to solve the multi-class imbalance problem at the stage of data preprocessing, and at the stage of Hidden Markov Models (HMMs) training, a random under-sampling method is introduced for the imbalance between the positive and negative datasets. It is shown that, in the task of discovering targeting motifs of nine subcellular compartments, the motifs found by our method are more conserved than those found by methods that do not consider the data imbalance problem, and recover most of the known targeting motifs from Minimotif Miner and InterPro. Meanwhile, we use the found motifs to predict protein subcellular localization and achieve higher prediction precision and recall for the minority classes.

  3. Discriminative motif discovery via simulated evolution and random under-sampling.

    Science.gov (United States)

    Song, Tao; Gu, Hong

    2014-01-01

    Conserved motifs in biological sequences are closely related to their structure and functions. Recently, discriminative motif discovery methods have attracted more and more attention. However, little attention has been devoted to the data imbalance problem, which is one of the main reasons affecting the performance of the discriminative models. In this article, a simulated evolution method is applied to solve the multi-class imbalance problem at the stage of data preprocessing, and at the stage of Hidden Markov Models (HMMs) training, a random under-sampling method is introduced for the imbalance between the positive and negative datasets. It is shown that, in the task of discovering targeting motifs of nine subcellular compartments, the motifs found by our method are more conserved than those found by methods that do not consider the data imbalance problem, and recover most of the known targeting motifs from Minimotif Miner and InterPro. Meanwhile, we use the found motifs to predict protein subcellular localization and achieve higher prediction precision and recall for the minority classes.
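
    Random under-sampling of the majority class, as used above for balancing the HMM training sets, can be sketched in a few lines of generic Python (not the authors' pipeline):

        import numpy as np

        def random_under_sample(X, y, rng=None):
            """Down-sample every class to the size of the smallest class."""
            rng = np.random.default_rng(rng)
            classes, counts = np.unique(y, return_counts=True)
            n_min = counts.min()
            keep = np.concatenate([
                rng.choice(np.flatnonzero(y == c), size=n_min, replace=False)
                for c in classes
            ])
            rng.shuffle(keep)
            return X[keep], y[keep]

        # Example with a 10:1 imbalance between negative and positive sequences (synthetic)
        rng = np.random.default_rng(0)
        X = rng.normal(size=(1_100, 20))
        y = np.array([0] * 1_000 + [1] * 100)
        Xb, yb = random_under_sample(X, y, rng=0)
        print(np.bincount(yb))   # [100 100]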

  4. Tobacco smoking surveillance: is quota sampling an efficient tool for monitoring national trends? A comparison with a random cross-sectional survey.

    Directory of Open Access Journals (Sweden)

    Romain Guignard

    Full Text Available OBJECTIVES: It is crucial for policy makers to monitor the evolution of tobacco smoking prevalence. In France, this monitoring is based on a series of cross-sectional general population surveys, the Health Barometers, conducted every five years and based on random samples. A methodological study has been carried out to assess the reliability of a monitoring system based on regular quota sampling surveys for smoking prevalence. DESIGN / OUTCOME MEASURES: In 2010, current and daily tobacco smoking prevalences obtained in a quota survey on 8,018 people were compared with those of the 2010 Health Barometer carried out on 27,653 people. Prevalences were assessed separately according to the telephone equipment of the interviewee (landline phone owner vs "mobile-only"), and logistic regressions were conducted in the pooled database to assess the impact of the telephone equipment and of the survey mode on the prevalences found. Finally, logistic regressions adjusted for sociodemographic characteristics were conducted in the random sample in order to determine the impact of the needed number of calls to interview "hard-to-reach" people on the prevalence found. RESULTS: Current and daily prevalences were higher in the random sample (respectively 33.9% and 27.5% in 15-75 year-olds) than in the quota sample (respectively 30.2% and 25.3%). In both surveys, current and daily prevalences were lower among landline phone owners (respectively 31.8% and 25.5% in the random sample and 28.9% and 24.0% in the quota survey). The required number of calls was slightly related to the smoking status after adjustment for sociodemographic characteristics. CONCLUSION: Random sampling appears to be more effective than quota sampling, mainly by making it possible to interview hard-to-reach populations.

  5. Distribution of peak expiratory flow variability by age, gender and smoking habits in a random population sample aged 20-70 yrs

    NARCIS (Netherlands)

    Boezen, H M; Schouten, J. P.; Postma, D S; Rijcken, B

    1994-01-01

    Peak expiratory flow (PEF) variability can be considered as an index of bronchial lability. Population studies on PEF variability are few. The purpose of the current paper is to describe the distribution of PEF variability in a random population sample of adults with a wide age range (20-70 yrs),

  6. Water evaporation: a transition path sampling study.

    Science.gov (United States)

    Varilly, Patrick; Chandler, David

    2013-02-07

    We use transition path sampling to study evaporation in the SPC/E model of liquid water. On the basis of thousands of evaporation trajectories, we characterize the members of the transition state ensemble (TSE), which exhibit a liquid-vapor interface with predominantly negative mean curvature at the site of evaporation. We also find that after evaporation is complete, the distributions of translational and angular momenta of the evaporated water are Maxwellian with a temperature equal to that of the liquid. To characterize the evaporation trajectories in their entirety, we find that it suffices to project them onto just two coordinates: the distance of the evaporating molecule to the instantaneous liquid-vapor interface and the velocity of the water along the average interface normal. In this projected space, we find that the TSE is well-captured by a simple model of ballistic escape from a deep potential well, with no additional barrier to evaporation beyond the cohesive strength of the liquid. Equivalently, they are consistent with a near-unity probability for a water molecule impinging upon a liquid droplet to condense. These results agree with previous simulations and with some, but not all, recent experiments.

  7. Randomization-Based Inference about Latent Variables from Complex Samples: The Case of Two-Stage Sampling

    Science.gov (United States)

    Li, Tiandong

    2012-01-01

    In large-scale assessments, such as the National Assessment of Educational Progress (NAEP), plausible values based on Multiple Imputations (MI) have been used to estimate population characteristics for latent constructs under complex sample designs. Mislevy (1991) derived a closed-form analytic solution for a fixed-effect model in creating…

  8. Experimental study of glass sampling devices

    International Nuclear Information System (INIS)

    Jouan, A.; Moncouyoux, J.P.; Meyere, A.

    1992-01-01

    Two high-level liquid waste containment glass sampling systems have been designed and built. The first device fits entirely inside a standard glass storage canister, and may thus be used in facilities not initially designed for this function. It has been tested successfully in the nonradioactive prototype unit at Marcoule. The work primarily covered the design and construction of an articulated arm supporting the sampling vessel, and the mechanisms necessary for filling the vessel and recovering the sample. System actuation and operation are fully automatic, and the resulting sample is representative of the glass melt. Implementation of the device is delicate however, and its reliability is estimated at about 75%. A second device was designed specifically for new vitrification facilities. It is installed directly on the glass melting furnace, and meets process operating and quality control requirements. Tests conducted at the Marcoule prototype vitrification facility demonstrated the feasibility of the system. Special attention was given to the sampling vessel transfer mechanisms, with two filling and controlled sample cooling options

  9. Random sampling of the Central European bat fauna reveals the existence of numerous hitherto unknown adenoviruses.

    Science.gov (United States)

    Vidovszky, Márton; Kohl, Claudia; Boldogh, Sándor; Görföl, Tamás; Wibbelt, Gudrun; Kurth, Andreas; Harrach, Balázs

    2015-12-01

    From over 1250 extant species of the order Chiroptera, 25 and 28 are known to occur in Germany and Hungary, respectively. Close to 350 samples originating from 28 bat species (17 from Germany, 27 from Hungary) were screened for the presence of adenoviruses (AdVs) using a nested PCR that targets the DNA polymerase gene of AdVs. An additional PCR was designed and applied to amplify a fragment from the gene encoding the IVa2 protein of mastadenoviruses. All German samples originated from organs of bats found moribund or dead. The Hungarian samples were excrements collected from colonies of known bat species, throat or rectal swab samples, taken from live individuals that had been captured for faunistic surveys and migration studies, as well as internal organs of dead specimens. Overall, 51 samples (14.73%) were found positive. We detected 28 seemingly novel and six previously described bat AdVs by sequencing the PCR products. The positivity rate was the highest among the guano samples of bat colonies. In phylogeny reconstructions, the AdVs detected in bats clustered roughly, but not perfectly, according to the hosts' families (Vespertilionidae, Rhinolophidae, Hipposideridae, Phyllostomidae and Pteropodidae). In a few cases, identical sequences were derived from animals of closely related species. On the other hand, some bat species proved to harbour more than one type of AdV. The high prevalence of infection and the large number of chiropteran species worldwide make us hypothesise that hundreds of different yet unknown AdV types might circulate in bats.

  10. The relationship between blood viscosity and blood pressure in a random sample of the population aged 55 to 74 years.

    Science.gov (United States)

    Fowkes, F G; Lowe, G D; Rumley, A; Lennie, S E; Smith, F B; Donnan, P T

    1993-05-01

    Blood viscosity is elevated in hypertensive subjects, but the association of viscosity with arterial blood pressure in the general population, and the influence of social, lifestyle and disease characteristics on this association, are not established. In the Edinburgh Artery Study, 1592 men and women aged 55-74 years selected randomly from the general population attended a university clinic. A fasting blood sample was taken for the measurement of blood viscosity and its major determinants (haematocrit, plasma viscosity and fibrinogen). Systolic pressure was related univariately to blood viscosity, plasma viscosity and body mass index, and diastolic pressure was likewise related univariately to blood viscosity and plasma viscosity; the univariate association between blood viscosity and systolic pressure was confined to males. Blood viscosity was associated equally with systolic and diastolic pressures in males, and remained independently related on multivariate analysis adjusting for age, sex, body mass index, social class, smoking, alcohol intake, exercise, angina, HDL and non-HDL cholesterol, diabetes mellitus, plasma viscosity, fibrinogen, and haematocrit.

  11. Application of bias factor method using random sampling technique for prediction accuracy improvement of critical eigenvalue of BWR

    International Nuclear Information System (INIS)

    Ito, Motohiro; Endo, Tomohiro; Yamamoto, Akio; Kuroda, Yusuke; Yoshii, Takashi

    2017-01-01

    The bias factor method based on the random sampling technique is applied to the benchmark problem of Peach Bottom Unit 2. Validity and availability of the present method, i.e. correction of calculation results and reduction of uncertainty, are confirmed in addition to features and performance of the present method. In the present study, core characteristics in cycle 3 are corrected with the proposed method using predicted and 'measured' critical eigenvalues in cycles 1 and 2. As the source of uncertainty, variance-covariance of cross sections is considered. The calculation results indicate that bias between predicted and measured results, and uncertainty owing to cross section can be reduced. Extension to other uncertainties such as thermal hydraulics properties will be a future task. (author)

  12. Seroincidence of non-typhoid Salmonella infections: convenience vs. random community-based sampling.

    Science.gov (United States)

    Emborg, H-D; Simonsen, J; Jørgensen, C S; Harritshøj, L H; Krogfelt, K A; Linneberg, A; Mølbak, K

    2016-01-01

    The incidence of reported infections of non-typhoid Salmonella is affected by biases inherent to passive laboratory surveillance, whereas analysis of blood sera may provide a less biased alternative to estimate the force of Salmonella transmission in humans. We developed a mathematical model that enabled a back-calculation of the annual seroincidence of Salmonella based on measurements of specific antibodies. The aim of the present study was to determine the seroincidence in two convenience samples from 2012 (Danish blood donors, n = 500, and pregnant women, n = 637) and a community-based sample of healthy individuals from 2006 to 2007 (n = 1780). The lowest antibody levels were measured in the samples from the community cohort and the highest in pregnant women. The annual Salmonella seroincidences were 319 infections/1000 pregnant women [90% credibility interval (CrI) 210-441], 182/1000 in blood donors (90% CrI 85-298) and 77/1000 in the community cohort (90% CrI 45-114). Although the differences between study populations decreased when accounting for different age distributions, the estimates still depend on the study population. It is important to be aware of this issue and define a certain population under surveillance in order to obtain consistent results in an application of serological measures for public health purposes.

  13. Sample Size Calculations for Population Size Estimation Studies Using Multiplier Methods With Respondent-Driven Sampling Surveys.

    Science.gov (United States)

    Fearon, Elizabeth; Chabata, Sungai T; Thompson, Jennifer A; Cowan, Frances M; Hargreaves, James R

    2017-09-14

    While guidance exists for obtaining population size estimates using multiplier methods with respondent-driven sampling surveys, we lack specific guidance for making sample size decisions. Our objective was to guide the design of multiplier-method population size estimation studies that use respondent-driven sampling surveys, so as to reduce the random error around the estimate obtained. The population size estimate is obtained by dividing the number of individuals receiving a service or the number of unique objects distributed (M) by the proportion of individuals in a representative survey who report receipt of the service or object (P). We have developed an approach to sample size calculation, interpreting methods to estimate the variance around estimates obtained using multiplier methods in conjunction with research into design effects and respondent-driven sampling. We describe an application to estimate the number of female sex workers in Harare, Zimbabwe. There is high variance in estimates. Random error around the size estimate reflects uncertainty from M and P, particularly when the estimate of P in the respondent-driven sampling survey is low. As expected, sample size requirements are higher when the design effect of the survey is assumed to be greater. We suggest a method for investigating the effects of sample size on the precision of a population size estimate obtained using multiplier methods and respondent-driven sampling. Uncertainty in the size estimate is high, particularly when P is small, so, balancing against other potential sources of bias, we advise researchers to consider longer service attendance reference periods and to distribute more unique objects, which is likely to result in a higher estimate of P in the respondent-driven sampling survey. ©Elizabeth Fearon, Sungai T Chabata, Jennifer A Thompson, Frances M Cowan, James R Hargreaves. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 14.09.2017.
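
    As a back-of-the-envelope illustration of the multiplier-method point estimate described above, the sketch below divides a hypothetical service count M by a survey proportion P and propagates the uncertainty in P with a delta-method approximation. The numeric inputs, the design-effect value and the function name are illustrative assumptions, not values from the study.

    ```python
    import math

    def multiplier_estimate(M, p_hat, n_survey, design_effect=2.0):
        """Population size estimate N = M / P with a delta-method standard error.

        M             -- number of unique objects distributed (assumed known exactly)
        p_hat         -- proportion of survey respondents reporting receipt
        n_survey      -- size of the respondent-driven sampling survey
        design_effect -- inflation of the binomial variance due to the survey design
        """
        N_hat = M / p_hat
        var_p = design_effect * p_hat * (1 - p_hat) / n_survey  # variance of p_hat
        se_N = M / p_hat ** 2 * math.sqrt(var_p)                # delta method: |dN/dp| * se(p)
        return N_hat, (N_hat - 1.96 * se_N, N_hat + 1.96 * se_N)

    # Hypothetical inputs: 600 unique objects distributed, 15% of 400 respondents report receipt.
    print(multiplier_estimate(M=600, p_hat=0.15, n_survey=400))
    ```

    Re-running the sketch with a smaller p_hat widens the interval sharply, which matches the advice above to favour designs that raise P.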

  14. Comparing the performance of cluster random sampling and integrated threshold mapping for targeting trachoma control, using computer simulation.

    Directory of Open Access Journals (Sweden)

    Jennifer L Smith

    Full Text Available Implementation of trachoma control strategies requires reliable district-level estimates of trachomatous inflammation-follicular (TF), generally collected using the recommended gold-standard cluster randomized surveys (CRS). Integrated Threshold Mapping (ITM) has been proposed as an integrated and cost-effective means of rapidly surveying trachoma in order to classify districts according to treatment thresholds. ITM differs from CRS in a number of important ways, including the use of a school-based sampling platform for children aged 1-9 and a different age distribution of participants. This study uses computerised sampling simulations to compare the performance of these survey designs and evaluate the impact of varying key parameters. Realistic pseudo gold standard data for 100 districts were generated that maintained the relative risk of disease between important sub-groups and incorporated empirical estimates of disease clustering at the household, village and district level. To simulate the different sampling approaches, 20 clusters were selected from each district, with individuals sampled according to the protocol for ITM and CRS. Results showed that ITM generally under-estimated the true prevalence of TF over a range of epidemiological settings and introduced more district misclassification according to treatment thresholds than did CRS. However, the extent of underestimation and resulting misclassification was found to be dependent on three main factors: (i) the district prevalence of TF; (ii) the relative risk of TF between enrolled and non-enrolled children within clusters; and (iii) the enrollment rate in schools. Although in some contexts the two methodologies may be equivalent, ITM can introduce a bias-dependent shift as prevalence of TF increases, resulting in a greater risk of misclassification around treatment thresholds. In addition to strengthening the evidence base around choice of trachoma survey methodologies, this study illustrates

  15. Computer code ENDSAM for random sampling and validation of the resonance parameters covariance matrices of some major nuclear data libraries

    International Nuclear Information System (INIS)

    Plevnik, Lucijan; Žerovnik, Gašper

    2016-01-01

    Highlights: • Methods for random sampling of correlated parameters. • Link to open-source code for sampling of resonance parameters in ENDF-6 format. • Validation of the code on realistic and artificial data. • Validation of covariances in three major contemporary nuclear data libraries. - Abstract: Methods for random sampling of correlated parameters are presented. The methods are implemented for sampling of resonance parameters in ENDF-6 format and a link to the open-source code ENDSAM is given. The code has been validated on realistic data. Additionally, consistency of covariances of resonance parameters of three major contemporary nuclear data libraries (JEFF-3.2, ENDF/B-VII.1 and JENDL-4.0u2) has been checked.
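
    The abstract does not spell out ENDSAM's algorithm, but the textbook way to draw random samples of correlated parameters from a variance-covariance matrix, which is the kind of operation described here, is a Cholesky factorisation of that matrix. The sketch below is a generic illustration with invented numbers, not the ENDSAM code or an ENDF-6 reader.

    ```python
    import numpy as np

    def sample_correlated(mean, cov, n_samples, rng=None):
        """Draw multivariate normal samples of correlated parameters.

        mean -- vector of best-estimate parameter values
        cov  -- parameter variance-covariance matrix (symmetric, positive definite)
        """
        rng = np.random.default_rng() if rng is None else rng
        L = np.linalg.cholesky(cov)            # cov = L @ L.T
        z = rng.standard_normal((n_samples, len(mean)))
        return mean + z @ L.T                  # each row is one perturbed parameter set

    # Two hypothetical resonance parameters with a modest positive correlation.
    mean = np.array([1.0, 2.0])
    cov = np.array([[0.01, 0.003],
                    [0.003, 0.04]])
    samples = sample_correlated(mean, cov, n_samples=1000, rng=np.random.default_rng(0))
    print(np.cov(samples, rowvar=False))       # should approximately reproduce cov
    ```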

  16. Active Learning Not Associated with Student Learning in a Random Sample of College Biology Courses

    Science.gov (United States)

    Andrews, T. M.; Leonard, M. J.; Colgrove, C. A.; Kalinowski, S. T.

    2011-01-01

    Previous research has suggested that adding active learning to traditional college science lectures substantially improves student learning. However, this research predominantly studied courses taught by science education researchers, who are likely to have exceptional teaching expertise. The present study investigated introductory biology courses randomly selected from a list of prominent colleges and universities to include instructors representing a broader population. We examined the relationship between active learning and student learning in the subject area of natural selection. We found no association between student learning gains and the use of active-learning instruction. Although active learning has the potential to substantially improve student learning, this research suggests that active learning, as used by typical college biology instructors, is not associated with greater learning gains. We contend that most instructors lack the rich and nuanced understanding of teaching and learning that science education researchers have developed. Therefore, active learning as designed and implemented by typical college biology instructors may superficially resemble active learning used by education researchers, but lacks the constructivist elements necessary for improving learning. PMID:22135373

  17. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge

    OpenAIRE

    Catarino, Rosa; Vassilakos, Pierre; Bilancioni, Aline; Vanden Eynde, Mathieu; Meyer-Hamme, Ulrike; Menoud, Pierre-Alain; Guerry, Frédéric; Petignat, Patrick

    2015-01-01

    Background Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium. Dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and swab. Methods A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed f...

  18. Application of a stratified random sampling technique to the estimation and minimization of respirable quartz exposure to underground miners

    International Nuclear Information System (INIS)

    Makepeace, C.E.; Horvath, F.J.; Stocker, H.

    1981-11-01

    The aim of a stratified random sampling plan is to provide the best estimate (in the absence of full-shift personal gravimetric sampling) of personal exposure to respirable quartz among underground miners. One also gains information of the exposure distribution of all the miners at the same time. Three variables (or strata) are considered in the present scheme: locations, occupations and times of sampling. Random sampling within each stratum ensures that each location, occupation and time of sampling has equal opportunity of being selected without bias. Following implementation of the plan and analysis of collected data, one can determine the individual exposures and the mean. This information can then be used to identify those groups whose exposure contributes significantly to the collective exposure. In turn, this identification, along with other considerations, allows the mine operator to carry out a cost-benefit optimization and eventual implementation of engineering controls for these groups. This optimization and engineering control procedure, together with the random sampling plan, can then be used in an iterative manner to minimize the mean value of the distribution and collective exposures
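
    To make the three-stratum scheme concrete, a sketch like the one below draws a fixed number of sampling assignments at random within every location-occupation-shift cell, so that each combination has an equal, unbiased chance of selection. The stratum labels, the per-cell allocation and the weekly time base are hypothetical, not taken from the report.

    ```python
    import itertools
    import random

    locations = ["stope A", "stope B", "haulage drift", "crusher station"]
    occupations = ["driller", "mucker", "maintenance"]
    shifts = ["day", "night"]

    def stratified_plan(samples_per_cell=2, seed=42):
        """Return a randomized list of (location, occupation, shift, week) sampling assignments."""
        rng = random.Random(seed)
        plan = []
        for cell in itertools.product(locations, occupations, shifts):
            # Within each stratum, pick the sampling weeks at random and without bias.
            weeks = rng.sample(range(1, 53), samples_per_cell)
            plan.extend((*cell, week) for week in weeks)
        rng.shuffle(plan)
        return plan

    for assignment in stratified_plan()[:5]:
        print(assignment)
    ```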

  19. Albumin to creatinine ratio in a random urine sample: Correlation with severity of preeclampsia

    Directory of Open Access Journals (Sweden)

    Fady S. Moiety

    2014-06-01

    Conclusions: Random urine ACR may be a reliable method for prediction and assessment of severity of preeclampsia. Using the estimated cut-off may add to the predictive value of such a simple quick test.

  20. Celiac Patients: A Randomized, Controlled Clinical Study

    Directory of Open Access Journals (Sweden)

    Giuseppe Mazzarella

    2012-01-01

    Full Text Available A lifelong gluten-free diet (GFD) is mandatory for celiac disease (CD) but has poor compliance, justifying novel strategies. We found that wheat flour transamidation inhibited IFN-γ secretion by intestinal T cells from CD patients. Herein, the primary endpoint was to evaluate the ability of transamidated gluten to maintain GFD CD patients in clinical remission. Secondary endpoints were efficacy in prevention of the inflammatory response and safety at the kidney level, where reaction products are metabolized. In a randomized, single-blinded, controlled 90-day trial, 47 GFD CD patients received 3.7 g/day of gluten from nontransamidated (n = 12) or transamidated (n = 35) flour. On day 15, 75% and 37% of patients in the control and experimental groups, respectively, showed clinical relapse (P = 0.04), whereas intestinal permeability was mainly altered in the control group (50% versus 20%, P = 0.06). On day 90, 0 controls and 14 patients in the experimental group completed the challenge with no variation of antitransglutaminase IgA (P = 0.63), Marsh-Oberhuber grading (P = 0.08), or intestinal IFN-γ mRNA (P > 0.05). Creatinine clearance did not vary after 90 days of treatment (P = 0.46). In conclusion, transamidated gluten reduced the number of clinical relapses in challenged patients with no changes of baseline values for serological/mucosal CD markers and an unaltered kidney function.

  1. Randomization tests

    CERN Document Server

    Edgington, Eugene

    2007-01-01

    Statistical Tests That Do Not Require Random Sampling: Randomization Tests; Numerical Examples; Randomization Tests and Nonrandom Samples; The Prevalence of Nonrandom Samples in Experiments; The Irrelevance of Random Samples for the Typical Experiment; Generalizing from Nonrandom Samples; Intelligibility; Respect for the Validity of Randomization Tests; Versatility; Practicality; Precursors of Randomization Tests; Other Applications of Permutation Tests; Questions and Exercises; Notes; References. Randomized Experiments: Unique Benefits of Experiments; Experimentation without Mani

  2. High Field In Vivo 13C Magnetic Resonance Spectroscopy of Brain by Random Radiofrequency Heteronuclear Decoupling and Data Sampling

    Science.gov (United States)

    Li, Ningzhi; Li, Shizhe; Shen, Jun

    2017-06-01

    In vivo 13C magnetic resonance spectroscopy (MRS) is a unique and effective tool for studying dynamic human brain metabolism and the cycling of neurotransmitters. One of the major technical challenges for in vivo 13C-MRS is the high radio frequency (RF) power necessary for heteronuclear decoupling. In the common practice of in vivo 13C-MRS, alkanyl carbons are detected in the spectral range of 10-65 ppm. The amplitude of decoupling pulses has to be significantly greater than the large one-bond 1H-13C scalar coupling (1JCH=125-145 Hz). Two main proton decoupling methods have been developed: broadband stochastic decoupling and coherent composite or adiabatic pulse decoupling (e.g., WALTZ); the latter is widely used because of its efficiency and superb performance under inhomogeneous B1 field. Because the RF power required for proton decoupling increases quadratically with field strength, in vivo 13C-MRS using coherent decoupling is often limited to low magnetic fields. Carboxylic and amide carbons, by contrast, are coupled to protons only via weak long-range 1H-13C scalar couplings, which can be decoupled using low RF power broadband stochastic decoupling. Recently, the carboxylic/amide 13C-MRS technique using low power random RF heteronuclear decoupling was safely applied to human brain studies at 7T. Here, we review the two major decoupling methods and the carboxylic/amide 13C-MRS with low power decoupling strategy. Further decreases in RF power deposition by frequency-domain windowing and time-domain random under-sampling are also discussed. Low RF power decoupling opens the possibility of performing in vivo 13C experiments of human brain at very high magnetic fields (such as 11.7T), where signal-to-noise ratio as well as spatial and temporal spectral resolution are more favorable than at lower fields.

  3. Comparing attitudes about legal sanctions and teratogenic effects for cocaine, alcohol, tobacco and caffeine: A randomized, independent samples design

    Directory of Open Access Journals (Sweden)

    Alanis Kelly L

    2006-02-01

    Full Text Available Abstract Background Establishing more sensible measures to treat cocaine-addicted mothers and their children is essential for improving U.S. drug policy. Favorable post-natal environments have moderated potential deleterious prenatal effects. However, since cocaine is an illicit substance having long been demonized, we hypothesized that attitudes toward prenatal cocaine exposure would be more negative than for licit substances, alcohol, nicotine and caffeine. Further, media portrayals about long-term outcomes were hypothesized to influence viewers' attitudes, measured immediately post-viewing. Reducing popular crack baby stigmas could influence future policy decisions by legislators. In Study 1, 336 participants were randomly assigned to 1 of 4 conditions describing hypothetical legal sanction scenarios for pregnant women using cocaine, alcohol, nicotine or caffeine. Participants rated legal sanctions against pregnant women who used one of these substances and risk potential for developing children. In Study 2, 139 participants were randomly assigned to positive, neutral and negative media conditions. Immediately post-viewing, participants rated prenatal cocaine-exposed or non-exposed teens for their academic performance and risk for problems at age 18. Results Participants in Study 1 imposed significantly greater legal sanctions for cocaine, perceiving prenatal cocaine exposure as more harmful than alcohol, nicotine or caffeine. A one-way ANOVA for independent samples showed significant differences beyond the .0001 level. A post-hoc Scheffé test illustrated that cocaine was rated differently from other substances. In Study 2, a one-way ANOVA for independent samples was performed on difference scores for the positive, neutral or negative media conditions about prenatal cocaine exposure. Participants in the neutral and negative media conditions estimated significantly lower grade point averages and more problems for the teen with prenatal cocaine exposure

  4. Relationship between accuracy and number of samples on statistical quantity and contour map of environmental gamma-ray dose rate. Example of random sampling

    International Nuclear Information System (INIS)

    Matsuda, Hideharu; Minato, Susumu

    2002-01-01

    The accuracy of statistical quantities such as the mean value and contour map obtained by measurement of the environmental gamma-ray dose rate was evaluated by random sampling of 5 different model distribution maps generated using the mean slope, -1.3, of power spectra calculated from the actually measured values. The values were derived from 58 natural gamma dose rate data sets reported worldwide, with means in the range of 10-100 nGy/h and areas of 10⁻³-10⁷ km². The accuracy of the mean value was found to be around ±7% even for 60 or 80 samplings (the most frequent number), and the standard deviation had an accuracy of less than 1/4-1/3 of the means. The correlation coefficient of the frequency distribution was found to be 0.860 or more for 200-400 samplings (the most frequent number), but that of the contour map was 0.502-0.770. (K.H.)
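
    The dependence of sample-mean accuracy on the number of random sampling points can be mimicked with a toy experiment: repeatedly draw N points at random from a synthetic dose-rate map and look at the spread of the resulting means. The lognormal "map" below is an arbitrary stand-in, not the power-spectrum model used in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Toy dose-rate "map": a 200 x 200 grid of values with a right-skewed distribution.
    dose_map = rng.lognormal(mean=np.log(60.0), sigma=0.4, size=(200, 200))
    true_mean = dose_map.mean()

    def relative_error_of_mean(n_points, n_repeats=2000):
        """Typical relative error (%) of the mean from n_points random grid samples."""
        flat = dose_map.ravel()
        idx = rng.integers(0, flat.size, size=(n_repeats, n_points))
        sample_means = flat[idx].mean(axis=1)
        return 100.0 * np.std(sample_means) / true_mean

    for n in (20, 60, 80, 200, 400):
        print(f"{n:4d} samples -> ~{relative_error_of_mean(n):.1f}% relative error")
    ```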

  5. Studies in astronomical time series analysis: Modeling random processes in the time domain

    Science.gov (United States)

    Scargle, J. D.

    1979-01-01

    Random process models phrased in the time domain are used to analyze astrophysical time series data produced by random processes. A moving average (MA) model represents the data as a sequence of pulses occurring randomly in time, with random amplitudes. An autoregressive (AR) model represents the correlations in the process in terms of a linear function of past values. The best AR model is determined from sampled data and transformed to an MA for interpretation. The randomness of the pulse amplitudes is maximized by a FORTRAN algorithm which is relatively stable numerically. Results of test cases are given to study the effects of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the optical light curve of the quasar 3C 273 is given.
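
    The MA/AR workflow sketched in the abstract can be illustrated in a few lines: fit an autoregressive model to a sampled series by least squares, then unroll it into its moving-average (impulse-response) form for interpretation. This is a generic illustration on synthetic data, not Scargle's FORTRAN algorithm.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic series from a known AR(2) process.
    true_a = [0.7, -0.2]
    x = np.zeros(2000)
    for t in range(2, len(x)):
        x[t] = true_a[0] * x[t - 1] + true_a[1] * x[t - 2] + rng.standard_normal()

    def fit_ar(series, order):
        """Least-squares estimate of AR coefficients a_1..a_p."""
        X = np.array([series[t - order:t][::-1] for t in range(order, len(series))])
        y = series[order:]
        a, *_ = np.linalg.lstsq(X, y, rcond=None)
        return a

    def ar_to_ma(a, n_terms=10):
        """Impulse response psi_j of the AR model, i.e. its MA(infinity) representation."""
        psi = [1.0]
        for j in range(1, n_terms):
            psi.append(sum(a[i] * psi[j - 1 - i] for i in range(min(len(a), j))))
        return psi

    a_hat = fit_ar(x, order=2)
    print("AR coefficients:", a_hat)          # should be close to [0.7, -0.2]
    print("MA weights:", ar_to_ma(a_hat))
    ```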

  6. Assessing sample representativeness in randomized controlled trials: application to the National Institute of Drug Abuse Clinical Trials Network.

    Science.gov (United States)

    Susukida, Ryoko; Crum, Rosa M; Stuart, Elizabeth A; Ebnesajjad, Cyrus; Mojtabai, Ramin

    2016-07-01

    To compare the characteristics of individuals participating in randomized controlled trials (RCTs) of treatments of substance use disorder (SUD) with individuals receiving treatment in usual care settings, and to provide a summary quantitative measure of differences between characteristics of these two groups of individuals using propensity score methods. Design Analyses using data from RCT samples from the National Institute of Drug Abuse Clinical Trials Network (CTN) and target populations of patients drawn from the Treatment Episodes Data Set-Admissions (TEDS-A). Settings Multiple clinical trial sites and nation-wide usual SUD treatment settings in the United States. A total of 3592 individuals from 10 CTN samples and 1 602 226 individuals selected from TEDS-A between 2001 and 2009. Measurements The propensity scores for enrolling in the RCTs were computed based on the following nine observable characteristics: sex, race/ethnicity, age, education, employment status, marital status, admission to treatment through criminal justice, intravenous drug use and the number of prior treatments. Findings The proportion of those with ≥ 12 years of education and the proportion of those who had full-time jobs were significantly higher among RCT samples than among target populations (in seven and nine trials, respectively). The difference in the mean propensity scores between the RCTs and the target population was 1.54 standard deviations and was statistically significant, indicating that individuals participating in these RCTs were systematically different from individuals receiving treatment in usual care settings. Notably, RCT participants tend to have more years of education and a greater likelihood of full-time work compared with people receiving care in usual care settings. © 2016 Society for the Study of Addiction.
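
    A minimal sketch of the summary measure described above, assuming the usual recipe of fitting a logistic regression for trial membership and expressing the gap in mean propensity scores in standard-deviation units; the simulated covariates and group sizes are placeholders, not CTN or TEDS-A variables.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Placeholder covariates (e.g., years of education, full-time employment) for
    # a small "RCT" group and a large "usual care" group with shifted distributions.
    rct = np.column_stack([rng.normal(13.0, 2.0, 300), rng.binomial(1, 0.45, 300)])
    usual = np.column_stack([rng.normal(11.5, 2.0, 3000), rng.binomial(1, 0.25, 3000)])

    X = np.vstack([rct, usual])
    in_rct = np.concatenate([np.ones(len(rct)), np.zeros(len(usual))])

    # Propensity score: estimated probability of being an RCT participant.
    model = LogisticRegression(max_iter=1000).fit(X, in_rct)
    ps = model.predict_proba(X)[:, 1]

    # Standardized difference in mean propensity scores between the two groups.
    diff = ps[in_rct == 1].mean() - ps[in_rct == 0].mean()
    pooled_sd = np.sqrt((ps[in_rct == 1].var(ddof=1) + ps[in_rct == 0].var(ddof=1)) / 2)
    print(f"standardized mean difference: {diff / pooled_sd:.2f}")
    ```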

  7. Mental Health Impact of Hosting Disaster Refugees: Analyses from a Random Sample Survey Among Haitians Living in Miami.

    Science.gov (United States)

    Messiah, Antoine; Lacoste, Jérôme; Gokalsing, Erick; Shultz, James M; Rodríguez de la Vega, Pura; Castro, Grettel; Acuna, Juan M

    2016-08-01

    Studies on the mental health of families hosting disaster refugees are lacking. This study compares participants in households that hosted 2010 Haitian earthquake disaster refugees with their nonhost counterparts. A random sample survey was conducted from October 2011 through December 2012 in Miami-Dade County, Florida. Haitian participants were assessed regarding their 2010 earthquake exposure and impact on family and friends and whether they hosted earthquake refugees. Using standardized scores and thresholds, they were evaluated for symptoms of three common mental disorders (CMDs): posttraumatic stress disorder, generalized anxiety disorder, and major depressive disorder (MDD). Participants who hosted refugees (n = 51) had significantly higher percentages of scores beyond thresholds for MDD than those who did not host refugees (n = 365) and for at least one CMD, after adjusting for participants' earthquake exposures and effects on family and friends. Hosting refugees from a natural disaster appears to elevate the risk for MDD and possibly other CMDs, independent of risks posed by exposure to the disaster itself. Families hosting refugees deserve special attention.

  8. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data. Sampling provides an up-to-date treat

  9. Characteristics of men with substance use disorder consequent to illicit drug use: comparison of a random sample and volunteers.

    Science.gov (United States)

    Reynolds, Maureen D; Tarter, Ralph E; Kirisci, Levent

    2004-09-06

    Men qualifying for substance use disorder (SUD) consequent to consumption of an illicit drug were compared according to recruitment method. It was hypothesized that volunteers would be more self-disclosing and exhibit more severe disturbances compared to randomly recruited subjects. Personal, demographic, family, social, substance use, psychiatric, and SUD characteristics of volunteers (N = 146) were compared to randomly recruited (N = 102) subjects. Volunteers had lower socioeconomic status, were more likely to be African American, and had lower IQ than randomly recruited subjects. Volunteers also evidenced greater social and family maladjustment and more frequently had received treatment for substance abuse. In addition, lower social desirability response bias was observed in the volunteers. SUD was not more severe in the volunteers; however, they reported a higher lifetime rate of opiate, diet, depressant, and analgesic drug use. Volunteers and randomly recruited subjects qualifying for SUD consequent to illicit drug use are similar in SUD severity but differ in terms of severity of psychosocial disturbance and history of drug involvement. The factors discriminating volunteers and randomly recruited subjects are well known to impact on outcome, hence they need to be considered in research design, especially when selecting a sampling strategy in treatment research.

  10. Randomization of grab-sampling strategies for estimating the annual exposure of U miners to Rn daughters.

    Science.gov (United States)

    Borak, T B

    1986-04-01

    Periodic grab sampling in combination with time-of-occupancy surveys has been the accepted procedure for estimating the annual exposure of underground U miners to Rn daughters. Temporal variations in the concentration of potential alpha energy in the mine generate uncertainties in this process. A system to randomize the selection of locations for measurement is described which can reduce uncertainties and eliminate systematic biases in the data. In general, a sample frequency of 50 measurements per year is sufficient to satisfy the criteria that the annual exposure be determined in working level months to within +/- 50% of the true value with a 95% level of confidence. Suggestions for implementing this randomization scheme are presented.
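
    A schedule satisfying the "50 measurements per year at randomized locations" criterion can be produced with ordinary pseudo-random draws, as in the sketch below; the 50-sample target comes from the abstract, while the location names and the day-level granularity are illustrative assumptions.

    ```python
    import random
    from datetime import date, timedelta

    locations = ["stope 1", "stope 2", "ramp", "shop", "shaft station"]  # hypothetical mine areas

    def annual_schedule(year=2024, n_samples=50, seed=7):
        """Randomize the day and location of each grab sample for one year."""
        rng = random.Random(seed)
        start = date(year, 1, 1)
        n_days = (date(year + 1, 1, 1) - start).days
        days = rng.sample(range(n_days), n_samples)     # distinct sampling days, chosen without bias
        return sorted((start + timedelta(days=d), rng.choice(locations)) for d in days)

    for when, where in annual_schedule()[:5]:
        print(when, where)
    ```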

  11. Parameters, test criteria and fault assessment in random sampling of waste barrels from non-qualified processes

    International Nuclear Information System (INIS)

    Martens, B.R.

    1989-01-01

    In the context of random sampling tests, parameters are checked on the waste barrels and criteria are given on which these tests are based. Also, it is shown how faulty data on the properties of the waste or faulty waste barrels should be treated. To decide the extent of testing, the properties of the waste relevant to final storage are determined based on the conditioning process used. (DG) [de

  12. Random or systematic sampling to detect a localised microbial contamination within a batch of food

    NARCIS (Netherlands)

    Jongenburger, I.; Reij, M.W.; Boer, E.P.J.; Gorris, L.G.M.; Zwietering, M.H.

    2011-01-01

    Pathogenic microorganisms are known to be distributed heterogeneously in food products that are solid, semi-solid or powdered, like for instance peanut butter, cereals, or powdered milk. This complicates effective detection of the pathogens by sampling. Two-class sampling plans, which are deployed

  13. Conditional estimation of exponential random graph models from snowball sampling designs

    NARCIS (Netherlands)

    Pattison, Philippa E.; Robins, Garry L.; Snijders, Tom A. B.; Wang, Peng

    2013-01-01

    A complete survey of a network in a large population may be prohibitively difficult and costly. So it is important to estimate models for networks using data from various network sampling designs, such as link-tracing designs. We focus here on snowball sampling designs, designs in which the members

  14. Evidentials and advertising: a sample study

    Directory of Open Access Journals (Sweden)

    Laura Cruz García

    2017-07-01

    Full Text Available This paper explores the use of evidential devices in press adverts in English in a compilation of original advertisements. Due to the appellative nature of advertising discourse, I think that these texts are likely to convey source of knowledge through evidentials as an advertising strategy in order to pragmatically manifest a higher level of credibility and reliability of the information presented concerning the products and the brands. The selected corpus of adverts will allow us to focus special attention on this particular genre and on how evidentials are used, in the fashion of other works carried out in other textual genres (cf. Fox, 2001; Kaplan, 2007; Marín-Arrese, 2004, 2007; Ortega-Barrera and Torres-Ramírez, 2010). Evidentials are studied as part of a set of persuasion strategies used by different linguistic communities in the discourse of advertising (Block de Behar, 1992; Cook, 1992; Cortés de los Ríos, 2001; Pavitt, 2000; Rein, 1982). Conclusions will report on how evidentials are used in print adverts, and whether a type of evidential device prevails over the rest.

  15. A cross-sectional, randomized cluster sample survey of household vulnerability to extreme heat among slum dwellers in ahmedabad, india.

    Science.gov (United States)

    Tran, Kathy V; Azhar, Gulrez S; Nair, Rajesh; Knowlton, Kim; Jaiswal, Anjali; Sheffield, Perry; Mavalankar, Dileep; Hess, Jeremy

    2013-06-18

    Extreme heat is a significant public health concern in India; extreme heat hazards are projected to increase in frequency and severity with climate change. Few of the factors driving population heat vulnerability are documented, though poverty is a presumed risk factor. To facilitate public health preparedness, an assessment of factors affecting vulnerability among slum dwellers was conducted in summer 2011 in Ahmedabad, Gujarat, India. Indicators of heat exposure, susceptibility to heat illness, and adaptive capacity, all of which feed into heat vulnerability, were assessed through a cross-sectional household survey using randomized multistage cluster sampling. Associations between heat-related morbidity and vulnerability factors were identified using multivariate logistic regression with generalized estimating equations to account for clustering effects. Age, preexisting medical conditions, work location, and access to health information and resources were associated with self-reported heat illness. Several of these variables were unique to this study. As sociodemographics, occupational heat exposure, and access to resources were shown to increase vulnerability, future interventions (e.g., health education) might target specific populations among Ahmedabad urban slum dwellers to reduce vulnerability to extreme heat. Surveillance and evaluations of future interventions may also be worthwhile.

  16. Global Stratigraphy of Venus: Analysis of a Random Sample of Thirty-Six Test Areas

    Science.gov (United States)

    Basilevsky, Alexander T.; Head, James W., III

    1995-01-01

    The age relations between 36 impact craters with dark paraboloids and other geologic units and structures at these localities have been studied through photogeologic analysis of Magellan SAR images of the surface of Venus. Geologic settings in all 36 sites, about 1000 x 1000 km each, could be characterized using only 10 different terrain units and six types of structures. These units and structures form a major stratigraphic and geologic sequence (from oldest to youngest): (1) tessera terrain; (2) densely fractured terrains associated with coronae and in the form of remnants among plains; (3) fractured and ridged plains and ridge belts; (4) plains with wrinkle ridges; (5) ridges associated with coronae annulae and ridges of arachnoid annulae which are contemporary with wrinkle ridges of the ridged plains; (6) smooth and lobate plains; (7) fractures of coronae annulae, and fractures not related to coronae annulae, which disrupt ridged and smooth plains; (8) rift-associated fractures; and (9) craters with associated dark paraboloids, which represent the youngest 10% of the Venus impact crater population (Campbell et al.), and are on top of all volcanic and tectonic units except the youngest episodes of rift-associated fracturing and volcanism; surficial streaks and patches are approximately contemporary with dark-paraboloid craters. Mapping of such units and structures in 36 randomly distributed large regions (each approximately 10⁶ sq km) shows evidence for a distinctive regional and global stratigraphic and geologic sequence. On the basis of this sequence we have developed a model that illustrates several major themes in the history of Venus. Most of the history of Venus (that of its first 80% or so) is not preserved in the surface geomorphological record. The major deformation associated with tessera formation in the period sometime between 0.5-1.0 b.y. ago (Ivanov and Basilevsky) is the earliest event detected. In the terminal stages of tessera fon

  17. A Randomized Control Study of Responsive Teaching with Young Turkish Children and Their Mothers

    Science.gov (United States)

    Karaaslan, Ozcan; Diken, Ibrahim H.; Mahoney, Gerald

    2013-01-01

    A randomized control study was conducted to evaluate the effectiveness of responsive teaching (RT) with a sample of 19 Turkish preschool-age children with disabilities and their mothers over a 6-month period. RT is an early intervention curriculum that attempts to promote children's development by encouraging parents to engage in highly…

  18. Sampling study in milk storage tanks by INAA

    International Nuclear Information System (INIS)

    Santos, L.G.C.; Nadai Fernandes de, E.A.; Bacchi, M.A.; Tagliaferro, F.S.

    2008-01-01

    This study investigated the representativeness of samples for assessing chemical elements in milk bulk tanks. Milk samples were collected from a closed tank in a dairy plant and from an open top tank in a dairy farm. Samples were analyzed for chemical elements by instrumental neutron activation analysis (INAA). For both experiments, Br, Ca, Cs, K, Na, Rb and Zn did not present significant differences between samples thereby indicating the appropriateness of the sampling procedure adopted to evaluate the analytes of interest. (author)

  19. Active learning for clinical text classification: is it better than random sampling?

    Science.gov (United States)

    Figueroa, Rosa L; Ngo, Long H; Goryachev, Sergey; Wiechmann, Eduardo P

    2012-01-01

    Objective This study explores active learning algorithms as a way to reduce the requirements for large training sets in medical text classification tasks. Design Three existing active learning algorithms (distance-based (DIST), diversity-based (DIV), and a combination of both (CMB)) were used to classify text from five datasets. The performance of these algorithms was compared to that of passive learning on the five datasets. We then conducted a novel investigation of the interaction between dataset characteristics and the performance results. Measurements Classification accuracy and area under receiver operating characteristics (ROC) curves for each algorithm at different sample sizes were generated. The performance of active learning algorithms was compared with that of passive learning using a weighted mean of paired differences. To determine why the performance varies on different datasets, we measured the diversity and uncertainty of each dataset using relative entropy and correlated the results with the performance differences. Results The DIST and CMB algorithms performed better than passive learning. With a statistical significance level set at 0.05, DIST outperformed passive learning in all five datasets, while CMB was found to be better than passive learning in four datasets. We found strong correlations between the dataset diversity and the DIV performance, as well as the dataset uncertainty and the performance of the DIST algorithm. Conclusion For medical text classification, appropriate active learning algorithms can yield performance comparable to that of passive learning with considerably smaller training sets. In particular, our results suggest that DIV performs better on data with higher diversity and DIST on data with lower uncertainty. PMID:22707743
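
    The distance-based (DIST) idea, querying the unlabeled documents that lie closest to the current decision boundary, can be sketched with a generic scikit-learn loop like the one below. The synthetic data, query budget and classifier choice are illustrative assumptions, not the clinical datasets or exact algorithms of the study.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

    labeled = list(rng.choice(len(X), size=20, replace=False))   # small labeled seed set
    pool = [i for i in range(len(X)) if i not in set(labeled)]

    clf = LogisticRegression(max_iter=1000)
    for _ in range(10):                       # 10 query rounds, 10 documents per round
        clf.fit(X[labeled], y[labeled])
        # DIST-style query: smallest absolute distance to the decision boundary.
        dist = np.abs(clf.decision_function(X[pool]))
        query = [pool[i] for i in np.argsort(dist)[:10]]
        labeled.extend(query)
        pool = [i for i in pool if i not in set(query)]

    print("labeled set size:", len(labeled))
    print("accuracy on the remaining unlabeled pool:", clf.score(X[pool], y[pool]))
    ```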

  20. Random sampling technique for ultra-fast computations of molecular opacities for exoplanet atmospheres

    Science.gov (United States)

    Min, M.

    2017-10-01

    Context. Opacities of molecules in exoplanet atmospheres rely on increasingly detailed line-lists for these molecules. The line lists available today contain for many species up to several billions of lines. Computation of the spectral line profile created by pressure and temperature broadening, the Voigt profile, of all of these lines is becoming a computational challenge. Aims: We aim to create a method to compute the Voigt profile in a way that automatically focusses the computation time into the strongest lines, while still maintaining the continuum contribution of the high number of weaker lines. Methods: Here, we outline a statistical line sampling technique that samples the Voigt profile quickly and with high accuracy. The number of samples is adjusted to the strength of the line and the local spectral line density. This automatically provides high accuracy line shapes for strong lines or lines that are spectrally isolated. The line sampling technique automatically preserves the integrated line opacity for all lines, thereby also providing the continuum opacity created by the large number of weak lines at very low computational cost. Results: The line sampling technique is tested for accuracy when computing line spectra and correlated-k tables. Extremely fast computations (≈3.5 × 10⁵ lines per second per core on a standard current-day desktop computer) with high accuracy (≤1% almost everywhere) are obtained. A detailed recipe on how to perform the computations is given.
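
    The core of the statistical line-sampling idea, drawing more Monte Carlo samples for strong lines while conserving every line's integrated opacity, can be illustrated with the toy Lorentzian example below. The line positions, strengths, widths and the sample-allocation rule are invented for the sketch and are not the paper's recipe (which samples full Voigt profiles).

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    grid = np.linspace(0.0, 100.0, 2001)       # wavenumber grid (arbitrary units)
    dnu = grid[1] - grid[0]

    # Hypothetical line list: positions, integrated strengths and Lorentzian half-widths.
    pos = rng.uniform(5.0, 95.0, 500)
    strength = 10.0 ** rng.uniform(-4, 0, 500)
    gamma = np.full(500, 0.1)

    def sampled_opacity(min_samples=4, samples_per_strength=4000):
        """Monte Carlo opacity: sample counts scale with line strength, integrals are preserved."""
        kappa = np.zeros_like(grid)
        for nu0, s, g in zip(pos, strength, gamma):
            n = max(min_samples, int(samples_per_strength * s / strength.max()))
            nu = nu0 + g * rng.standard_cauchy(n)      # draws from the Lorentzian line shape
            idx = np.searchsorted(grid, nu)
            idx = idx[(idx > 0) & (idx < len(grid))]
            # Each sample deposits an equal share of the line's integrated strength.
            np.add.at(kappa, idx, s / (n * dnu))
        return kappa

    kappa = sampled_opacity()
    print("total opacity integral:", kappa.sum() * dnu)  # ~ sum of in-range line strengths
    ```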

  1. Random Walks on Directed Networks: Inference and Respondent-Driven Sampling

    Directory of Open Access Journals (Sweden)

    Malmros Jens

    2016-06-01

    Full Text Available Respondent-driven sampling (RDS) is often used to estimate population properties (e.g., sexual risk behavior) in hard-to-reach populations. In RDS, already sampled individuals recruit population members to the sample from their social contacts in an efficient snowball-like sampling procedure. By assuming a Markov model for the recruitment of individuals, asymptotically unbiased estimates of population characteristics can be obtained. Current RDS estimation methodology assumes that the social network is undirected, that is, all edges are reciprocal. However, empirical social networks in general also include a substantial number of nonreciprocal edges. In this article, we develop an estimation method for RDS in populations connected by social networks that include reciprocal and nonreciprocal edges. We derive estimators of the selection probabilities of individuals as a function of the number of outgoing edges of sampled individuals. The proposed estimators are evaluated on artificial and empirical networks and are shown to generally perform better than existing estimators. This is the case in particular when the fraction of directed edges in the network is large.
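
    A simplified member of the estimator family discussed here, weighting each sampled individual by the inverse of the number of outgoing edges they report, in the spirit of the Volz-Heckathorn estimator, is sketched below. The data are fabricated and the formula is a generic illustration rather than the authors' exact estimator.

    ```python
    def rds_proportion(out_degrees, has_trait):
        """Estimate a population proportion from an RDS sample.

        Each respondent is weighted by 1/out_degree to compensate for the higher
        selection probability of well-connected individuals.
        """
        weights = [1.0 / d for d in out_degrees]
        numerator = sum(w for w, t in zip(weights, has_trait) if t)
        return numerator / sum(weights)

    # Hypothetical sample: reported numbers of outgoing contacts and a binary trait.
    out_degrees = [2, 5, 3, 10, 4, 8, 1, 6, 2, 7]
    has_trait =   [1, 0, 1, 0, 1, 0, 1, 0, 1, 0]

    print("naive proportion:", sum(has_trait) / len(has_trait))
    print("degree-weighted estimate:", round(rds_proportion(out_degrees, has_trait), 3))
    ```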

  2. Recovery from work-related stress: a randomized controlled trial of a stress management intervention in a clinical sample.

    Science.gov (United States)

    Glasscock, David J; Carstensen, Ole; Dalgaard, Vita Ligaya

    2018-05-28

    Randomized controlled trials (RCTs) of interventions aimed at reducing work-related stress indicate that cognitive behavioural therapy (CBT) is more effective than other interventions. However, definitions of study populations are often unclear and there is a lack of interventions targeting both the individual and the workplace. The aim of this study was to determine whether a stress management intervention combining individual CBT and a workplace focus is superior to no treatment in the reduction of perceived stress and stress symptoms and time to lasting return to work (RTW) in a clinical sample. Patients with work-related stress reactions or adjustment disorders were randomly assigned to an intervention group (n = 57, 84.2% female) or a control group (n = 80, 83.8% female). Subjects were followed via questionnaires and register data. The intervention contained individual CBT and the offer of a workplace meeting. We examined intervention effects by analysing group differences in score changes on the Perceived Stress Scale (PSS-10) and the General Health Questionnaire (GHQ-30). We also tested if intervention led to faster lasting RTW. Mean baseline values of PSS were 24.79 in the intervention group and 23.26 in the control group while the corresponding values for GHQ were 21.3 and 20.27, respectively. There was a significant effect of time. 10 months after baseline, both groups reported less perceived stress and improved mental health. 4 months after baseline, we found significant treatment effects for both perceived stress and mental health. The difference in mean change in PSS after 4 months was - 3.09 (- 5.47, - 0.72), while for GHQ it was - 3.91 (- 7.15, - 0.68). There were no group differences in RTW. The intervention led to faster reductions in perceived stress and stress symptoms amongst patients with work-related stress reactions and adjustment disorders. 6 months after the intervention ended there were no longer differences between

  3. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    International Nuclear Information System (INIS)

    Yashchuk, Valeriy V.; Conley, Raymond; Anderson, Erik H.; Barber, Samuel K.; Bouet, Nathalie; McKinney, Wayne R.; Takacs, Peter Z.; Voronov, Dmitriy L.

    2010-01-01

    We discuss the results of SEM and TEM measurements with the BPRML test samples fabricated from a BPRML (WSi2/Si with fundamental layer thickness of 3 nm) with a Dual Beam FIB (focused ion beam)/SEM technique. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize x-ray microscopes. Corresponding work with x-ray microscopes is in progress.

  4. Application of random effects to the study of resource selection by animals.

    Science.gov (United States)

    Gillies, Cameron S; Hebblewhite, Mark; Nielsen, Scott E; Krawchuk, Meg A; Aldridge, Cameron L; Frair, Jacqueline L; Saher, D Joanne; Stevens, Cameron E; Jerde, Christopher L

    2006-07-01

    1. Resource selection estimated by logistic regression is used increasingly in studies to identify critical resources for animal populations and to predict species occurrence. 2. Most frequently, individual animals are monitored and pooled to estimate population-level effects without regard to group or individual-level variation. Pooling assumes that both observations and their errors are independent, and resource selection is constant given individual variation in resource availability. 3. Although researchers have identified ways to minimize autocorrelation, variation between individuals caused by differences in selection or available resources, including functional responses in resource selection, have not been well addressed. 4. Here we review random-effects models and their application to resource selection modelling to overcome these common limitations. We present a simple case study of an analysis of resource selection by grizzly bears in the foothills of the Canadian Rocky Mountains with and without random effects. 5. Both categorical and continuous variables in the grizzly bear model differed in interpretation, both in statistical significance and coefficient sign, depending on how a random effect was included. We used a simulation approach to clarify the application of random effects under three common situations for telemetry studies: (a) discrepancies in sample sizes among individuals; (b) differences among individuals in selection where availability is constant; and (c) differences in availability with and without a functional response in resource selection. 6. We found that random intercepts accounted for unbalanced sample designs, and models with random intercepts and coefficients improved model fit given the variation in selection among individuals and functional responses in selection. Our empirical example and simulations demonstrate how including random effects in resource selection models can aid interpretation and address difficult assumptions

  5. Associations Among Religiousness and Community Volunteerism in National Random Samples of American Adults.

    Science.gov (United States)

    Haggard, Megan C; Kang, Linda L; Rowatt, Wade C; Shen, Megan Johnson

    2015-01-01

    The connection between religiousness and volunteering for the community can be explained through two distinct features of religion. First, religious organizations are social groups that encourage members to help others through planned opportunities. Second, helping others is regarded as an important value for members in religious organizations to uphold. We examined the relationship between religiousness and self-reported community volunteering in two independent national random surveys of American adults (i.e., the 2005 and 2007 waves of the Baylor Religion Survey). In both waves, frequency of religious service attendance was associated with an increase in likelihood that individuals would volunteer, whether through their religious organization or not, whereas frequency of reading sacred texts outside of religious services was associated with an increase in likelihood of volunteering only for or through their religious organization. The role of religion in community volunteering is discussed in light of these findings.

  6. Re-estimating sample size in cluster randomized trials with active recruitment within clusters

    NARCIS (Netherlands)

    van Schie, Sander; Moerbeek, Mirjam

    2014-01-01

    Often only a limited number of clusters can be obtained in cluster randomised trials, although many potential participants can be recruited within each cluster. Thus, active recruitment is feasible within the clusters. To obtain an efficient sample size in a cluster randomised trial, the cluster

  7. A systematic random sampling scheme optimized to detect the proportion of rare synapses in the neuropil.

    Science.gov (United States)

    da Costa, Nuno Maçarico; Hepp, Klaus; Martin, Kevan A C

    2009-05-30

    Synapses can only be morphologically identified by electron microscopy and this is often a very labor-intensive and time-consuming task. When quantitative estimates are required for pathways that contribute a small proportion of synapses to the neuropil, the problems of accurate sampling are particularly severe and the total time required may become prohibitive. Here we present a sampling method devised to count the percentage of rarely occurring synapses in the neuropil using a large sample (approximately 1000 sampling sites), with the strong constraint of doing it in reasonable time. The strategy, which uses the unbiased physical disector technique, resembles that used in particle physics to detect rare events. We validated our method in the primary visual cortex of the cat, where we used biotinylated dextran amine to label thalamic afferents and measured the density of their synapses using the physical disector method. Our results show that we could obtain accurate counts of the labeled synapses, even when they represented only 0.2% of all the synapses in the neuropil.
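
    To see why on the order of 1000 disector sites are needed when the target pathway contributes only a fraction of a percent of all synapses, a simple binomial precision calculation like the one below is instructive; the 0.2% figure comes from the abstract, and the remaining numbers are illustrative.

    ```python
    import math

    def binomial_ci_halfwidth(p, n):
        """Half-width of an approximate 95% confidence interval for a proportion."""
        return 1.96 * math.sqrt(p * (1 - p) / n)

    p = 0.002                      # labeled synapses as a fraction of all synapses (from the abstract)
    for n in (100, 500, 1000, 5000):
        expected_hits = p * n
        hw = binomial_ci_halfwidth(p, n)
        print(f"n={n:5d}: expected labeled synapses ~{expected_hits:.1f}, "
              f"95% CI roughly {p:.3%} +/- {hw:.3%}")
    ```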

  8. Effects of psychological therapies in randomized trials and practice-based studies.

    Science.gov (United States)

    Barkham, Michael; Stiles, William B; Connell, Janice; Twigg, Elspeth; Leach, Chris; Lucock, Mike; Mellor-Clark, John; Bower, Peter; King, Michael; Shapiro, David A; Hardy, Gillian E; Greenberg, Leslie; Angus, Lynne

    2008-11-01

    Randomized trials of the effects of psychological therapies seek internal validity via homogeneous samples and standardized treatment protocols. In contrast, practice-based studies aim for clinical realism and external validity via heterogeneous samples of clients treated under routine practice conditions. We compared indices of treatment effects in these two types of studies. Using published transformation formulas, the Beck Depression Inventory (BDI) scores from five randomized trials of depression (N = 477 clients) were transformed into Clinical Outcomes in Routine Evaluation-Outcome Measure (CORE-OM) scores and compared with CORE-OM data collected in four practice-based studies (N = 4,196 clients). Conversely, the practice-based studies' CORE-OM scores were transformed into BDI scores and compared with randomized trial data. Randomized trials showed a modest advantage over practice-based studies in amount of pre-post improvement. This difference was compressed or exaggerated depending on the direction of the transformation but averaged about 12%. There was a similarly sized advantage to randomized trials in rates of reliable and clinically significant improvement (RCSI). The largest difference was yielded by comparisons of effect sizes which suggested an advantage more than twice as large, reflecting narrower pre-treatment distributions in the randomized trials. Outcomes of completed treatments for depression in randomized trials appeared to be modestly greater than those in routine care settings. The size of the difference may be distorted depending on the method for calculating degree of change. Transforming BDI scores into CORE-OM scores and vice versa may be a preferable alternative to effect sizes for comparisons of studies using these measures.

  9. A random cluster survey and a convenience sample give comparable estimates of immunity to vaccine preventable diseases in children of school age in Victoria, Australia.

    Science.gov (United States)

    Kelly, Heath; Riddell, Michaela A; Gidding, Heather F; Nolan, Terry; Gilbert, Gwendolyn L

    2002-08-19

    We compared estimates of the age-specific population immunity to measles, mumps, rubella, hepatitis B and varicella zoster viruses in Victorian school children obtained by a national sero-survey, using a convenience sample of residual sera from diagnostic laboratories throughout Australia, with those from a three-stage random cluster survey. When grouped according to school age (primary or secondary school) there was no significant difference in the estimates of immunity to measles, mumps, hepatitis B or varicella. Compared with the convenience sample, the random cluster survey estimated higher immunity to rubella in samples from both primary (98.7% versus 93.6%, P = 0.002) and secondary school students (98.4% versus 93.2%, P = 0.03). Despite some limitations, this study suggests that the collection of a convenience sample of sera from diagnostic laboratories is an appropriate sampling strategy to provide population immunity data that will inform Australia's current and future immunisation policies. Copyright 2002 Elsevier Science Ltd.

  10. Simultaneous escaping of explicit and hidden free energy barriers: application of the orthogonal space random walk strategy in generalized ensemble based conformational sampling.

    Science.gov (United States)

    Zheng, Lianqing; Chen, Mengen; Yang, Wei

    2009-06-21

    To overcome the pseudoergodicity problem, conformational sampling can be accelerated via generalized ensemble methods, e.g., through the realization of random walks along prechosen collective variables, such as spatial order parameters, energy scaling parameters, or even system temperatures or pressures, etc. As usually observed, in generalized ensemble simulations, hidden barriers are likely to exist in the space perpendicular to the collective variable direction and these residual free energy barriers could greatly abolish the sampling efficiency. This sampling issue is particularly severe when the collective variable is defined in a low-dimension subset of the target system; then the "Hamiltonian lagging" problem, which reveals the fact that necessary structural relaxation falls behind the move of the collective variable, may be likely to occur. To overcome this problem in equilibrium conformational sampling, we adopted the orthogonal space random walk (OSRW) strategy, which was originally developed in the context of free energy simulation [L. Zheng, M. Chen, and W. Yang, Proc. Natl. Acad. Sci. U.S.A. 105, 20227 (2008)]. Thereby, generalized ensemble simulations can simultaneously escape both the explicit barriers along the collective variable direction and the hidden barriers that are strongly coupled with the collective variable move. As demonstrated in our model studies, the present OSRW based generalized ensemble treatments show improved sampling capability over the corresponding classical generalized ensemble treatments.

  11. A Monte Carlo study of adsorption of random copolymers on random surfaces

    CERN Document Server

    Moghaddam, M S

    2003-01-01

    We study the adsorption problem of a random copolymer on a random surface in which a self-avoiding walk in three dimensions interacts with a plane defining a half-space to which the walk is confined. Each vertex of the walk is randomly labelled A with probability p_p or B with probability 1 - p_p, and only vertices labelled A are attracted to the surface plane. Each lattice site on the plane is also labelled either A with probability p_s or B with probability 1 - p_s, and only lattice sites labelled A interact with the walk. We study two variations of this model: in the first case the A-vertices of the walk interact only with the A-sites on the surface. In the second case the constraint of selective binding is removed; that is, any contact between the walk and the surface that involves an A-labelling, either from the surface or from the walk, is counted as a visit to the surface. The system is quenched in both cases, i.e. the labellings of the walk and of the surface are fixed as thermodynam...

  12. Sample Size in Qualitative Interview Studies: Guided by Information Power.

    Science.gov (United States)

    Malterud, Kirsti; Siersma, Volkert Dirk; Guassora, Ann Dorrit

    2015-11-27

    Sample sizes must be ascertained in qualitative studies just as in quantitative studies, but not by the same means. The prevailing concept for sample size in qualitative studies is "saturation." Saturation is closely tied to a specific methodology, and the term is inconsistently applied. We propose the concept "information power" to guide adequate sample size for qualitative studies. Information power indicates that the more information the sample holds, relevant for the actual study, the fewer participants are needed. We suggest that the size of a sample with sufficient information power depends on (a) the aim of the study, (b) sample specificity, (c) use of established theory, (d) quality of dialogue, and (e) analysis strategy. We present a model where these elements of information and their relevant dimensions are related to information power. Application of this model in the planning and during data collection of a qualitative study is discussed. © The Author(s) 2015.

  13. Assessing differences in groups randomized by recruitment chain in a respondent-driven sample of Seattle-area injection drug users.

    Science.gov (United States)

    Burt, Richard D; Thiede, Hanne

    2014-11-01

    Respondent-driven sampling (RDS) is a form of peer-based study recruitment and analysis that incorporates features designed to limit and adjust for biases in traditional snowball sampling. It is being widely used in studies of hidden populations. We report an empirical evaluation of RDS's consistency and variability, comparing groups recruited contemporaneously, by identical methods and using identical survey instruments. We randomized recruitment chains from the RDS-based 2012 National HIV Behavioral Surveillance survey of injection drug users in the Seattle area into two groups and compared them in terms of sociodemographic characteristics, drug-associated risk behaviors, sexual risk behaviors, human immunodeficiency virus (HIV) status and HIV testing frequency. The two groups differed in five of the 18 variables examined (P ≤ .001): race (e.g., 60% white vs. 47%), gender (52% male vs. 67%), area of residence (32% downtown Seattle vs. 44%), an HIV test in the previous 12 months (51% vs. 38%). The difference in serologic HIV status was particularly pronounced (4% positive vs. 18%). In four further randomizations, differences in one to five variables attained this level of significance, although the specific variables involved differed. We found some material differences between the randomized groups. Although the variability of the present study was less than has been reported in serial RDS surveys, these findings indicate caution in the interpretation of RDS results. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. Random Evolutionary Dynamics Driven by Fitness and House-of-Cards Mutations: Sampling Formulae

    Science.gov (United States)

    Huillet, Thierry E.

    2017-07-01

    We first revisit the multi-allelic mutation-fitness balance problem, especially when mutations obey a house of cards condition, where the discrete-time deterministic evolutionary dynamics of the allelic frequencies derives from a Shahshahani potential. We then consider multi-allelic Wright-Fisher stochastic models whose deviation from neutrality derives from the Shahshahani mutation/selection potential. We next focus on the weak selection, weak mutation cases and, making use of a Gamma calculus, we compute the normalizing partition functions of the invariant probability densities appearing in their Wright-Fisher diffusive approximations. Using these results, generalized Ewens sampling formulae (ESF) from the equilibrium distributions are derived. We start by treating the ESF in the mixed mutation/selection potential case and then restrict ourselves to the ESF in the simpler house-of-cards-mutations-only situation. We also address some issues concerning sampling problems from infinitely-many-alleles weak limits.
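
    For reference, the classical neutral-alleles Ewens sampling formula, which the generalized formulae above extend, gives the probability of observing the allelic partition (a_1, ..., a_n), where a_j counts the allelic types represented exactly j times in a sample of n genes. In LaTeX notation (the standard result, not the generalized formula derived in the paper):

      P(a_1,\dots,a_n) \;=\; \frac{n!}{\theta_{(n)}} \prod_{j=1}^{n} \frac{(\theta/j)^{a_j}}{a_j!},
      \qquad \theta_{(n)} = \theta(\theta+1)\cdots(\theta+n-1),
      \qquad \sum_{j=1}^{n} j\,a_j = n .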

  15. Self-reference and random sampling approach for label-free identification of DNA composition using plasmonic nanomaterials.

    Science.gov (United States)

    Freeman, Lindsay M; Pang, Lin; Fainman, Yeshaiahu

    2018-05-09

    The analysis of DNA has led to revolutionary advancements in the fields of medical diagnostics, genomics, prenatal screening, and forensic science, with the global DNA testing market expected to reach revenues of USD 10.04 billion per year by 2020. However, the current methods for DNA analysis remain dependent on the necessity for fluorophores or conjugated proteins, leading to high costs associated with consumable materials and manual labor. Here, we demonstrate a potential label-free DNA composition detection method using surface-enhanced Raman spectroscopy (SERS) in which we identify the composition of cytosine and adenine within single strands of DNA. This approach depends on the fact that there is one phosphate backbone per nucleotide, which we use as a reference to compensate for systematic measurement variations. We utilize plasmonic nanomaterials with random Raman sampling to perform label-free detection of the nucleotide composition within DNA strands, generating a calibration curve from standard samples of DNA and demonstrating the capability of resolving the nucleotide composition. The work represents an innovative way for detection of the DNA composition within DNA strands without the necessity of attached labels, offering a highly sensitive and reproducible method that factors in random sampling to minimize error.

  16. Study on random number generator in Monte Carlo code

    International Nuclear Information System (INIS)

    Oya, Kentaro; Kitada, Takanori; Tanaka, Shinichi

    2011-01-01

    A Monte Carlo code uses a sequence of pseudo-random numbers from a random number generator (RNG) to simulate particle histories. A pseudo-random number sequence has a period that depends on its generation method, and the period should be long enough that it is not exhausted during one Monte Carlo calculation, in order to ensure correctness, especially of the standard deviation of results. The linear congruential generator (LCG) is widely used as a Monte Carlo RNG, but its period is not long enough given the increase in the number of simulation histories that has accompanied the remarkable enhancement of computer performance. Recently, many kinds of RNG have been developed, and some of their features are better than those of the LCG. In this study, we investigate an appropriate RNG for a Monte Carlo code as an alternative to the LCG, especially for the case of enormous numbers of histories. It is found that xorshift has desirable features compared with the LCG: a larger period, a comparable speed of random number generation, better randomness, and good applicability to parallel calculation. (author)
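
    As a rough illustration of the kind of generator the study favors, a minimal 64-bit xorshift step (Marsaglia's xorshift64 with the common shift triple 13, 7, 17, not necessarily the exact variant the authors evaluated) can be written in a few lines:

      class Xorshift64:
          """Minimal xorshift64 PRNG (Marsaglia 2003); period 2**64 - 1 for nonzero seeds."""

          MASK = (1 << 64) - 1

          def __init__(self, seed=88172645463325252):
              if seed == 0:
                  raise ValueError("seed must be nonzero")
              self.state = seed & self.MASK

          def next_u64(self):
              x = self.state
              x ^= (x << 13) & self.MASK
              x ^= x >> 7
              x ^= (x << 17) & self.MASK
              self.state = x
              return x

          def random(self):
              # uniform float in [0, 1), as a Monte Carlo code would consume it
              return self.next_u64() / 2.0 ** 64

      rng = Xorshift64(seed=123456789)
      print([round(rng.random(), 6) for _ in range(3)])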

  17. Randomized controlled study of CBT in bronchial asthma

    Directory of Open Access Journals (Sweden)

    Grover Naveen

    2007-01-01

    Full Text Available The aim of the present study was to find out the efficacy of cognitive behavior therapy, as an adjunct to standard pharmacotherapy, in bronchial asthma. In a randomized two-group design with pre- and post-assessments, forty asthma patients were randomly allotted to two groups: a self-management group and a cognitive behavior therapy group. Both groups were exposed to 6-8 weeks of intervention, an asthma self-management program and cognitive behavior therapy respectively. Assessment measures used were a semi-structured interview schedule, the Asthma Symptom Checklist, an asthma diary, the Asthma Bother Profile, the Hospital Anxiety & Depression Scale, the AQLQ and the Peak Expiratory Flow Rate. Within-group comparison showed significant improvement in both groups at the post-assessment. Between-group comparison showed that the CBT group reported significantly greater change than the SM group. Cognitive behavior therapy helps in improving the management of asthma.

  18. Thromboembolism and mechanical heart valves : A randomized study revisited

    NARCIS (Netherlands)

    Kuntze, CEE; Blackstone, EH; Ebels, T

    Background. This study was designed to revise and substantiate previous inferences, based on short-term follow-up, about differences in the incidence of anticoagulant-related events after heart valve replacement among patients who had been randomly assigned to receive either a Bjork-Shiley,

  19. CT-Guided Transgluteal Biopsy for Systematic Random Sampling of the Prostate in Patients Without Rectal Access.

    Science.gov (United States)

    Goenka, Ajit H; Remer, Erick M; Veniero, Joseph C; Thupili, Chakradhar R; Klein, Eric A

    2015-09-01

    The objective of our study was to review our experience with CT-guided transgluteal prostate biopsy in patients without rectal access. Twenty-one CT-guided transgluteal prostate biopsy procedures were performed in 16 men (mean age, 68 years; age range, 60-78 years) who were under conscious sedation. The mean prostate-specific antigen (PSA) value was 11.4 ng/mL (range, 2.3-39.4 ng/mL). Six had seven prior unsuccessful transperineal or transurethral biopsies. Biopsy results, complications, sedation time, and radiation dose were recorded. The mean PSA values and number of core specimens were compared between patients with malignant results and patients with nonmalignant results using the Student t test. The average procedural sedation time was 50.6 minutes (range, 15-90 minutes) (n = 20), and the mean effective radiation dose was 8.2 mSv (median, 6.6 mSv; range 3.6-19.3 mSv) (n = 13). Twenty of the 21 (95%) procedures were technically successful. The only complication was a single episode of gross hematuria and penile pain in one patient, which resolved spontaneously. Of 20 successful biopsies, 8 (40%) yielded adenocarcinoma (Gleason score: mean, 8; range, 7-9). Twelve biopsies yielded nonmalignant results (60%): high-grade prostatic intraepithelial neoplasia (n = 3) or benign prostatic tissue with or without inflammation (n = 9). Three patients had carcinoma diagnosed on subsequent biopsies (second biopsy, n = 2 patients; third biopsy, n = 1 patient). A malignant biopsy result was not significantly associated with the number of core specimens (p = 0.3) or the mean PSA value (p = 0.1). CT-guided transgluteal prostate biopsy is a safe and reliable technique for the systematic random sampling of the prostate in patients without a rectal access. In patients with initial negative biopsy results, repeat biopsy should be considered if there is a persistent rise in the PSA value.

  20. Colorization-Based RGB-White Color Interpolation using Color Filter Array with Randomly Sampled Pattern.

    Science.gov (United States)

    Oh, Paul; Lee, Sukho; Kang, Moon Gi

    2017-06-28

    Recently, several RGB-White (RGBW) color filter arrays (CFAs) have been proposed, which have extra white (W) pixels in the filter array that are highly sensitive. Due to the high sensitivity, the W pixels have better SNR (Signal to Noise Ratio) characteristics than other color pixels in the filter array, especially in low light conditions. However, most of the RGBW CFAs are designed so that the acquired RGBW pattern image can be converted into the conventional Bayer pattern image, which is then again converted into the final color image by using conventional demosaicing methods, i.e., color interpolation techniques. In this paper, we propose a new RGBW color filter array based on a totally different color interpolation technique, the colorization algorithm. The colorization algorithm was initially proposed for colorizing a gray image into a color image using a small number of color seeds. Here, we adopt this algorithm as a color interpolation technique, so that the RGBW color filter array can be designed with a very large number of W pixels to make the most of the highly sensitive characteristics of the W channel. The resulting RGBW color filter array has a pattern with a large proportion of W pixels, while the small-numbered RGB pixels are randomly distributed over the array. The colorization algorithm makes it possible to reconstruct the colors from such a small number of RGB values. Due to the large proportion of W pixels, the reconstructed color image has a high SNR value, especially higher than those of conventional CFAs in low light conditions. Experimental results show that much important information that is not perceptible in color images reconstructed with conventional CFAs is perceptible in the images reconstructed with the proposed method.

  1. Incorporating covariance estimation uncertainty in spatial sampling design for prediction with trans-Gaussian random fields

    Directory of Open Access Journals (Sweden)

    Gunter eSpöck

    2015-05-01

    Full Text Available Recently, Spock and Pilz [38] demonstrated that the spatial sampling design problem for the Bayesian linear kriging predictor can be transformed to an equivalent experimental design problem for a linear regression model with stochastic regression coefficients and uncorrelated errors. The stochastic regression coefficients derive from the polar spectral approximation of the residual process. Thus, standard optimal convex experimental design theory can be used to calculate optimal spatial sampling designs. The design functionals considered in Spock and Pilz [38] did not take into account the fact that kriging is actually a plug-in predictor which uses the estimated covariance function. The resulting optimal designs were close to space-filling configurations, because the design criterion did not consider the uncertainty of the covariance function. In this paper we also assume that the covariance function is estimated, e.g., by restricted maximum likelihood (REML). We then develop a design criterion that fully takes account of the covariance uncertainty. The resulting designs are less regular and space-filling compared to those ignoring covariance uncertainty. The new designs, however, also require some closely spaced samples in order to improve the estimate of the covariance function. We also relax the assumption of Gaussian observations and assume that the data is transformed to Gaussianity by means of the Box-Cox transformation. The resulting prediction method is known as trans-Gaussian kriging. We apply the Smith and Zhu [37] approach to this kriging method and show that the resulting optimal designs also depend on the available data. We illustrate our results with a data set of monthly rainfall measurements from Upper Austria.

  2. Correcting Classifiers for Sample Selection Bias in Two-Phase Case-Control Studies

    Science.gov (United States)

    Theis, Fabian J.

    2017-01-01

    Epidemiological studies often utilize stratified data in which rare outcomes or exposures are artificially enriched. This design can increase precision in association tests but distorts predictions when applying classifiers on nonstratified data. Several methods correct for this so-called sample selection bias, but their performance remains unclear especially for machine learning classifiers. With an emphasis on two-phase case-control studies, we aim to assess which corrections to perform in which setting and to obtain methods suitable for machine learning techniques, especially the random forest. We propose two new resampling-based methods to resemble the original data and covariance structure: stochastic inverse-probability oversampling and parametric inverse-probability bagging. We compare all techniques for the random forest and other classifiers, both theoretically and on simulated and real data. Empirical results show that the random forest profits from only the parametric inverse-probability bagging proposed by us. For other classifiers, correction is mostly advantageous, and methods perform uniformly. We discuss consequences of inappropriate distribution assumptions and reason for different behaviors between the random forest and other classifiers. In conclusion, we provide guidance for choosing correction methods when training classifiers on biased samples. For random forests, our method outperforms state-of-the-art procedures if distribution assumptions are roughly fulfilled. We provide our implementation in the R package sambia. PMID:29312464
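
    A generic sketch of the inverse-probability idea, growing each tree of the ensemble on a bootstrap sample drawn with probabilities proportional to the inverse of the known selection probabilities so that bootstrap samples resemble the source population, is shown below. The data, the selection probabilities and the helper names are hypothetical, and this is a simplified stand-in rather than the stochastic inverse-probability oversampling or parametric inverse-probability bagging procedures implemented in sambia.

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier

      # Hypothetical sketch of inverse-probability bagging for a two-phase (biased)
      # sample: each tree is grown on a bootstrap sample drawn with probabilities
      # proportional to 1/pi_i, where pi_i is the known selection probability of
      # observation i. Simplified stand-in; not the sambia package's algorithms.

      def ip_bagging(X, y, selection_prob, n_trees=200, seed=0):
          rng = np.random.default_rng(seed)
          w = 1.0 / np.asarray(selection_prob, dtype=float)
          w /= w.sum()                              # resampling distribution over the sample
          n = len(y)
          trees = []
          for _ in range(n_trees):
              idx = rng.choice(n, size=n, replace=True, p=w)
              tree = DecisionTreeClassifier(max_features="sqrt",
                                            random_state=int(rng.integers(1 << 31)))
              trees.append(tree.fit(X[idx], y[idx]))
          return trees

      def predict_proba(trees, X):
          return np.mean([t.predict_proba(X)[:, 1] for t in trees], axis=0)

      # Toy two-phase sample: from a population, keep cases with probability 0.9 and
      # controls with probability 0.2, so cases are over-represented in training data.
      rng = np.random.default_rng(1)
      N = 5000
      X_pop = rng.normal(size=(N, 3))
      y_pop = (X_pop[:, 0] + rng.normal(scale=1.0, size=N) > 1.0).astype(int)
      pi = np.where(y_pop == 1, 0.9, 0.2)
      keep = rng.random(N) < pi
      trees = ip_bagging(X_pop[keep], y_pop[keep], pi[keep])
      print(predict_proba(trees, X_pop[:5]).round(3))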

  3. Sample Size and Saturation in PhD Studies Using Qualitative Interviews

    Directory of Open Access Journals (Sweden)

    Mark Mason

    2010-08-01

    Full Text Available A number of issues can affect sample size in qualitative research; however, the guiding principle should be the concept of saturation. This has been explored in detail by a number of authors but is still hotly debated, and some say little understood. A sample of PhD studies using qualitative approaches, and qualitative interviews as the method of data collection, was taken from theses.com and content analysed for their sample sizes. Five hundred and sixty studies were identified that fitted the inclusion criteria. Results showed that the mean sample size was 31; however, the distribution was non-random, with a statistically significant proportion of studies presenting sample sizes that were multiples of ten. These results are discussed in relation to saturation. They suggest a pre-meditated approach that is not wholly congruent with the principles of qualitative research. URN: urn:nbn:de:0114-fqs100387

  4. New complete sample of identified radio sources. Part 2. Statistical study

    International Nuclear Information System (INIS)

    Soltan, A.

    1978-01-01

    Complete sample of radio sources with known redshifts selected in Paper I is studied. Source counts in the sample and the luminosity-volume test show that both quasars and galaxies are subject to evolution. Luminosity functions for different ranges of redshifts are obtained. Due to many uncertainties only simplified models of the evolution are tested. Exponential decline of the luminosity with time of all the bright sources is in good agreement both with the luminosity-volume test and the N(S) relation in the entire range of observed flux densities. It is shown that sources in the sample are randomly distributed in scales greater than about 17 Mpc. (author)

  5. A randomized trial of a DWI intervention program for first offenders: intervention outcomes and interactions with antisocial personality disorder among a primarily American-Indian sample.

    Science.gov (United States)

    Woodall, W Gill; Delaney, Harold D; Kunitz, Stephen J; Westerberg, Verner S; Zhao, Hongwei

    2007-06-01

    Randomized trial evidence on the effectiveness of incarceration and treatment of first-time driving while intoxicated (DWI) offenders who are primarily American Indian has yet to be reported in the literature on DWI prevention. Further, research has confirmed the association of antisocial personality disorder (ASPD) with problems with alcohol including DWI. A randomized clinical trial was conducted, in conjunction with 28 days of incarceration, of a treatment program incorporating motivational interviewing principles for first-time DWI offenders. The sample of 305 offenders including 52 diagnosed as ASPD by the Diagnostic Interview Schedule were assessed before assignment to conditions and at 6, 12, and 24 months after discharge. Self-reported frequency of drinking and driving as well as various measures of drinking over the preceding 90 days were available at all assessments for 244 participants. Further, DWI rearrest data for 274 participants were available for analysis. Participants randomized to receive the first offender incarceration and treatment program reported greater reductions in alcohol consumption from baseline levels when compared with participants who were only incarcerated. Antisocial personality disorder participants reported heavier and more frequent drinking but showed significantly greater declines in drinking from intake to posttreatment assessments. Further, the treatment resulted in larger effects relative to the control on ASPD than non-ASPD participants. Nonconfrontational treatment may significantly enhance outcomes for DWI offenders with ASPD when delivered in an incarcerated setting, and in the present study, such effects were found in a primarily American-Indian sample.

  6. [Study of spatial stratified sampling strategy of Oncomelania hupensis snail survey based on plant abundance].

    Science.gov (United States)

    Xun-Ping, W; An, Z

    2017-07-27

    Objective To optimize and simplify the survey method for Oncomelania hupensis snails in marshland endemic regions of schistosomiasis, so as to improve the precision, efficiency and economy of the snail survey. Methods A snail sampling strategy (Spatial Sampling Scenario of Oncomelania based on Plant Abundance, SOPA), which took plant abundance as the auxiliary variable, was explored in an experimental study in a 50 m×50 m plot in a marshland in the Poyang Lake region. Firstly, the push-broom survey data were stratified into 5 layers by the plant abundance data; then, the required numbers of optimal sampling points for each layer were calculated through the Hammond-McCullagh equation; thirdly, every sample point was pinpointed in line with the Multiple Directional Interpolation (MDI) placement scheme; and finally, a comparison study was performed among the outcomes of the spatial random sampling strategy, the traditional systematic sampling method, the spatial stratified sampling method, Sandwich spatial sampling and inference, and SOPA. Results The method (SOPA) proposed in this study had the smallest absolute error, 0.2138; the traditional systematic sampling method had the largest estimate, with an absolute error of 0.9244. Conclusion The snail sampling strategy (SOPA) proposed in this study obtains higher estimation accuracy than the other four methods.

  7. Generalizability of causal inference in observational studies under retrospective convenience sampling.

    Science.gov (United States)

    Hu, Zonghui; Qin, Jing

    2018-05-20

    Many observational studies adopt what we call retrospective convenience sampling (RCS). With the sample size in each arm prespecified, RCS randomly selects subjects from the treatment-inclined subpopulation into the treatment arm and those from the control-inclined into the control arm. Samples in each arm are representative of the respective subpopulation, but the proportion of the 2 subpopulations is usually not preserved in the sample data. We show in this work that, under RCS, existing causal effect estimators actually estimate the treatment effect over the sample population instead of the underlying study population. We investigate how to correct existing methods for consistent estimation of the treatment effect over the underlying population. Although RCS is adopted in medical studies for ethical and cost-effective purposes, it also has a big advantage for statistical inference: When the tendency to receive treatment is low in a study population, treatment effect estimators under RCS, with proper correction, are more efficient than their parallels under random sampling. These properties are investigated both theoretically and through numerical demonstration. Published 2018. This article is a U.S. Government work and is in the public domain in the USA.

  8. Estimation of Daily Proteinuria in Patients with Amyloidosis by Using the Protein-To-Creatinine ratio in Random Urine Samples.

    Science.gov (United States)

    Talamo, Giampaolo; Mir Muhammad, A; Pandey, Manoj K; Zhu, Junjia; Creer, Michael H; Malysz, Jozef

    2015-02-11

    Measurement of daily proteinuria in patients with amyloidosis is recommended at the time of diagnosis for assessing renal involvement, and for monitoring disease activity. Renal involvement is usually defined by proteinuria >500 mg/day. We evaluated the accuracy of the random urine protein-to-creatinine ratio (Pr/Cr) in predicting 24 hour proteinuria in patients with amyloidosis. We compared results of random urine Pr/Cr ratio and concomitant 24-hour urine collections in 44 patients with amyloidosis. We found a strong correlation (Spearman's ρ=0.874) between the Pr/Cr ratio and the 24 hour urine protein excretion. For predicting renal involvement, the optimal cut-off point of the Pr/Cr ratio was 715 mg/g. The sensitivity and specificity for this point were 91.8% and 95.5%, respectively, and the area under the curve value was 97.4%. We conclude that the random urine Pr/Cr ratio could be useful in the screening of renal involvement in patients with amyloidosis. If validated in a prospective study, the random urine Pr/Cr ratio could replace the 24 hour urine collection for the assessment of daily proteinuria and presence of nephrotic syndrome in patients with amyloidosis.
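
    As a small worked illustration (with invented numbers, not the study's data), screening by the spot-urine Pr/Cr ratio against the 24-hour collection amounts to applying the 715 mg/g cutoff and tabulating sensitivity and specificity:

      # Illustrative sketch (synthetic values, not the study's data): classify renal
      # involvement from a random-urine protein-to-creatinine ratio using the 715 mg/g
      # cutoff reported above, and compare against the 24-hour collection (>500 mg/day).

      patients = [
          # (urine protein mg/dL, urine creatinine mg/dL, 24-hour protein mg/day)
          (18.0, 40.0, 310.0),
          (95.0, 60.0, 1450.0),
          (52.0, 110.0, 480.0),
          (210.0, 85.0, 2600.0),
          (30.0, 75.0, 260.0),
      ]

      CUTOFF_MG_PER_G = 715.0
      tp = fp = tn = fn = 0
      for prot, creat, daily in patients:
          pr_cr = prot / creat * 1000.0          # mg protein per g creatinine
          predicted = pr_cr > CUTOFF_MG_PER_G
          actual = daily > 500.0                 # renal involvement by the 24-h definition
          if predicted and actual:
              tp += 1
          elif predicted:
              fp += 1
          elif actual:
              fn += 1
          else:
              tn += 1

      sensitivity = tp / (tp + fn)
      specificity = tn / (tn + fp)
      print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f}")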

  9. Estimation of daily proteinuria in patients with amyloidosis by using the protein-to-creatinine ratio in random urine sample

    Directory of Open Access Journals (Sweden)

    Giampaolo Talamo

    2015-02-01

    Full Text Available Measurement of daily proteinuria in patients with amyloidosis is recommended at the time of diagnosis for assessing renal involvement, and for monitoring disease activity. Renal involvement is usually defined by proteinuria >500 mg/day. We evaluated the accuracy of the random urine protein-to-creatinine ratio (Pr/Cr) in predicting 24 hour proteinuria in patients with amyloidosis. We compared results of random urine Pr/Cr ratio and concomitant 24-hour urine collections in 44 patients with amyloidosis. We found a strong correlation (Spearman's ρ=0.874) between the Pr/Cr ratio and the 24 hour urine protein excretion. For predicting renal involvement, the optimal cut-off point of the Pr/Cr ratio was 715 mg/g. The sensitivity and specificity for this point were 91.8% and 95.5%, respectively, and the area under the curve value was 97.4%. We conclude that the random urine Pr/Cr ratio could be useful in the screening of renal involvement in patients with amyloidosis. If validated in a prospective study, the random urine Pr/Cr ratio could replace the 24 hour urine collection for the assessment of daily proteinuria and presence of nephrotic syndrome in patients with amyloidosis.

  10. Sample size calculations for case-control studies

    Science.gov (United States)

    This R package can be used to calculate the required sample size for unconditional multivariate analyses of unmatched case-control studies. The sample sizes are for a scalar exposure effect, such as binary, ordinal or continuous exposures. The sample sizes can also be computed for scalar interaction effects. The analyses account for the effects of potential confounder variables that are also included in the multivariate logistic model.
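
    The package itself is not reproduced here, but the kind of calculation it performs can be sketched with the standard two-proportion sample size formula for an unmatched case-control study (normal approximation, no continuity correction); the function name and example values are illustrative assumptions.

      from math import sqrt
      from scipy.stats import norm

      # Hedged sketch of a standard unmatched case-control sample size calculation
      # (two-proportion formula). It illustrates the kind of computation such a
      # package performs; it is not the package's own code.

      def case_control_n(p0, odds_ratio, alpha=0.05, power=0.80, ratio=1.0):
          """n cases (with ratio*n controls) to detect `odds_ratio` for an exposure
          with prevalence `p0` among controls."""
          p1 = odds_ratio * p0 / (1.0 + p0 * (odds_ratio - 1.0))   # exposure prevalence in cases
          z_a = norm.ppf(1.0 - alpha / 2.0)
          z_b = norm.ppf(power)
          p_bar = (p1 + ratio * p0) / (1.0 + ratio)
          q_bar = 1.0 - p_bar
          num = (z_a * sqrt((1.0 + 1.0 / ratio) * p_bar * q_bar)
                 + z_b * sqrt(p1 * (1.0 - p1) + p0 * (1.0 - p0) / ratio)) ** 2
          return num / (p1 - p0) ** 2

      # e.g. 30% exposure among controls, target odds ratio of 2, 1:1 case-control ratio
      print(round(case_control_n(p0=0.30, odds_ratio=2.0)))   # roughly 140 cases per group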

  11. Cluster designs to assess the prevalence of acute malnutrition by lot quality assurance sampling: a validation study by computer simulation.

    Science.gov (United States)

    Olives, Casey; Pagano, Marcello; Deitchler, Megan; Hedt, Bethany L; Egge, Kari; Valadez, Joseph J

    2009-04-01

    Traditional lot quality assurance sampling (LQAS) methods require simple random sampling to guarantee valid results. However, cluster sampling has been proposed to reduce the number of random starting points. This study uses simulations to examine the classification error of two such designs, a 67x3 (67 clusters of three observations) and a 33x6 (33 clusters of six observations) sampling scheme to assess the prevalence of global acute malnutrition (GAM). Further, we explore the use of a 67x3 sequential sampling scheme for LQAS classification of GAM prevalence. Results indicate that, for independent clusters with moderate intracluster correlation for the GAM outcome, the three sampling designs maintain approximate validity for LQAS analysis. Sequential sampling can substantially reduce the average sample size that is required for data collection. The presence of intercluster correlation can impact dramatically the classification error that is associated with LQAS analysis.
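
    Such simulations can be sketched as follows: cluster-level prevalences are drawn from a beta distribution calibrated to a target mean and intracluster correlation, children are sampled within clusters, and an LQAS-style decision rule is applied. The decision threshold, ICC and prevalences below are illustrative choices, not the parameters used in the study.

      import numpy as np

      # Illustrative simulation of LQAS classification under cluster sampling
      # (67 clusters x 3 children). The decision rule, ICC and prevalences are
      # made-up parameters for the sketch, not those used in the study.

      rng = np.random.default_rng(42)

      def prob_classified_high(true_prev, icc, n_clusters=67, per_cluster=3,
                               decision_threshold=25, n_sims=5000):
          """Fraction of simulated surveys whose count of malnourished children
          reaches the LQAS decision threshold (classified as 'high prevalence')."""
          a = true_prev * (1.0 - icc) / icc          # beta-binomial parameterization
          b = (1.0 - true_prev) * (1.0 - icc) / icc
          cluster_prev = rng.beta(a, b, size=(n_sims, n_clusters))
          positives = rng.binomial(per_cluster, cluster_prev).sum(axis=1)
          return np.mean(positives >= decision_threshold)

      # e.g. probability of a 'high' classification at 10% vs. 15% true GAM prevalence
      print("at 10%:", prob_classified_high(0.10, icc=0.05))
      print("at 15%:", prob_classified_high(0.15, icc=0.05))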

  12. Fasting time and lipid parameters: association with hepatic steatosis — data from a random population sample

    Science.gov (United States)

    2014-01-01

    Background Current guidelines recommend measuring plasma lipids in fasting patients. Recent studies, however, suggest that variation in plasma lipid concentrations secondary to fasting time may be minimal. Objective of the present study was to investigate the impact of fasting time on plasma lipid concentrations (total cholesterol, HDL and LDL cholesterol, triglycerides). A second objective was to determine the effect of non-alcoholic fatty liver disease exerted on the above-mentioned lipid levels. Method Subjects participating in a population-based cross-sectional study (2,445 subjects; 51.7% females) were questioned at time of phlebotomy regarding duration of pre-phlebotomy fasting. Total cholesterol, LDL and HDL cholesterol, and triglycerides were determined and correlated with length of fasting. An upper abdominal ultrasonographic examination was performed and body-mass index (BMI) and waist-to-hip ratio (WHR) were calculated. Subjects were divided into three groups based on their reported fasting periods of 1–4 h, 4–8 h and > 8 h. After application of the exclusion criteria, a total of 1,195 subjects (52.4% females) were included in the study collective. The Kruskal-Wallis test was used for continuous variables and the chi-square test for categorical variables. The effects of age, BMI, WHR, alcohol consumption, fasting time and hepatic steatosis on the respective lipid variables were analyzed using multivariate logistic regression. Results At multivariate analysis, fasting time was associated with elevated triglycerides (p = 0.0047 for 1–4 h and p = 0.0147 for 4–8 h among females; p fasting period. LDL cholesterol and triglycerides exhibit highly significant variability; the greatest impact is seen with the triglycerides. Fasting time represents an independent factor for reduced LDL cholesterol and elevated triglyceride concentrations. There is a close association between elevated lipids and hepatic steatosis. PMID:24447492

  13. COPD, Body Mass, Fat Free Body Mass and prognosis in Patients from a Random Population Sample

    DEFF Research Database (Denmark)

    Vestbo, Jørgen; Prescott, E; Almdal, Thomas Peter

    2006-01-01

    distribution of low FFMI and its association with prognosis in a population-based cohort of patients with COPD. METHODS: We used data on 1,898 patients with COPD identified in a population-based epidemiologic study in Copenhagen. FFM was measured using bioelectrical impedance analysis. Patients were followed...... mortality and 2.4 (1.4-4.0) for COPD-related mortality. FFMI was also a predictor of overall mortality when analyses were restricted to subjects with normal BMI. CONCLUSIONS: FFMI provides information in addition to BMI and assessment of FFM should be considered in the routine assessment of COPD....

  14. Condom and other contraceptive use among a random sample of female adolescents: a snapshot in time.

    Science.gov (United States)

    Grimley, D M; Lee, P A

    1997-01-01

    This study examined the sexual practices of 235 females aged 15 to 19 years and their readiness to use specific contraceptive methods for birth control and sexually transmitted disease (STD) prevention. The investigation was based on the stages-of-change construct from the Transtheoretical Model (Prochaska & DiClemente, 1983, 1984). Results demonstrated that despite the availability of newer contraceptive methods (e.g., Depo-Provera), most sexually active adolescents were least resistant to using condoms and were further along in the stages of change for condom use as compared with other contraceptive methods. Moreover, the females perceived the male condom as an acceptable method for prevention of both pregnancy and STDs. These findings suggest that interventions designed to target consistent and correct condom use may result in better compliance, reducing the number of unintended pregnancies and STD cases among this population.

  15. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge.

    Science.gov (United States)

    Catarino, Rosa; Vassilakos, Pierre; Bilancioni, Aline; Vanden Eynde, Mathieu; Meyer-Hamme, Ulrike; Menoud, Pierre-Alain; Guerry, Frédéric; Petignat, Patrick

    2015-01-01

    Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium. Dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and swab. A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed first: self-HPV using dry swabs (s-DRY) or vaginal specimen collection using a cytobrush applied to an FTA cartridge (s-FTA). After self-HPV, a physician collected a cervical sample using liquid-based medium (Dr-WET). HPV types were identified by real-time PCR. Agreement between collection methods was measured using the kappa statistic. HPV prevalence for high-risk types was 62.3% (95%CI: 53.7-70.2) detected by s-DRY, 56.2% (95%CI: 47.6-64.4) by Dr-WET, and 54.6% (95%CI: 46.1-62.9) by s-FTA. There was overall agreement of 70.8% between s-FTA and s-DRY samples (kappa = 0.34), and of 82.3% between self-HPV and Dr-WET samples (kappa = 0.56). Detection sensitivities for low-grade squamous intraepithelial lesion or worse (LSIL+) were: 64.0% (95%CI: 44.5-79.8) for s-FTA, 84.6% (95%CI: 66.5-93.9) for s-DRY, and 76.9% (95%CI: 58.0-89.0) for Dr-WET. The preferred self-collection method among patients was s-DRY (40.8% vs. 15.4%). Regarding costs, FTA card was five times more expensive than the swab (~5 US dollars (USD)/per card vs. ~1 USD/per swab). Self-HPV using dry swabs is sensitive for detecting LSIL+ and less expensive than s-FTA. International Standard Randomized Controlled Trial Number (ISRCTN): 43310942.
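
    Percent agreement and Cohen's kappa, as reported above for pairs of collection methods, can be computed directly from the paired positive/negative calls; the counts in this sketch are invented for illustration.

      from collections import Counter

      # Cohen's kappa for agreement between two HPV collection methods on the same
      # women (positive/negative calls). The counts are invented for illustration.

      pairs = [("pos", "pos")] * 60 + [("neg", "neg")] * 32 + \
              [("pos", "neg")] * 21 + [("neg", "pos")] * 17   # method A vs. method B

      n = len(pairs)
      observed = sum(a == b for a, b in pairs) / n

      a_counts = Counter(a for a, _ in pairs)
      b_counts = Counter(b for _, b in pairs)
      expected = sum(a_counts[c] * b_counts[c] for c in ("pos", "neg")) / n ** 2

      kappa = (observed - expected) / (1 - expected)
      print(f"observed agreement = {observed:.3f}, kappa = {kappa:.3f}")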

  16. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge.

    Directory of Open Access Journals (Sweden)

    Rosa Catarino

    Full Text Available Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium. Dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and swab. A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed first: self-HPV using dry swabs (s-DRY) or vaginal specimen collection using a cytobrush applied to an FTA cartridge (s-FTA). After self-HPV, a physician collected a cervical sample using liquid-based medium (Dr-WET). HPV types were identified by real-time PCR. Agreement between collection methods was measured using the kappa statistic. HPV prevalence for high-risk types was 62.3% (95%CI: 53.7-70.2) detected by s-DRY, 56.2% (95%CI: 47.6-64.4) by Dr-WET, and 54.6% (95%CI: 46.1-62.9) by s-FTA. There was overall agreement of 70.8% between s-FTA and s-DRY samples (kappa = 0.34), and of 82.3% between self-HPV and Dr-WET samples (kappa = 0.56). Detection sensitivities for low-grade squamous intraepithelial lesion or worse (LSIL+) were: 64.0% (95%CI: 44.5-79.8) for s-FTA, 84.6% (95%CI: 66.5-93.9) for s-DRY, and 76.9% (95%CI: 58.0-89.0) for Dr-WET. The preferred self-collection method among patients was s-DRY (40.8% vs. 15.4%). Regarding costs, the FTA card was five times more expensive than the swab (~5 US dollars (USD) per card vs. ~1 USD per swab). Self-HPV using dry swabs is sensitive for detecting LSIL+ and less expensive than s-FTA. International Standard Randomized Controlled Trial Number (ISRCTN): 43310942.

  17. Aerobic physical activity and resistance training: an application of the theory of planned behavior among adults with type 2 diabetes in a random, national sample of Canadians

    Directory of Open Access Journals (Sweden)

    Karunamuni Nandini

    2008-12-01

    Full Text Available Abstract Background Aerobic physical activity (PA) and resistance training are paramount in the treatment and management of type 2 diabetes (T2D), but few studies have examined the determinants of both types of exercise in the same sample. Objective The primary purpose was to investigate the utility of the Theory of Planned Behavior (TPB) in explaining aerobic PA and resistance training in a population sample of T2D adults. Methods A total of 244 individuals were recruited through a random national sample which was created by generating a random list of household phone numbers. The list was proportionate to the actual number of household telephone numbers for each Canadian province (with the exception of Quebec). These individuals completed self-report TPB constructs of attitude, subjective norm, perceived behavioral control and intention, and a 3-month follow-up that assessed aerobic PA and resistance training. Results TPB explained 10% and 8% of the variance respectively for aerobic PA and resistance training; and accounted for 39% and 45% of the variance respectively for aerobic PA and resistance training intentions. Conclusion These results may guide the development of appropriate PA interventions for aerobic PA and resistance training based on the TPB.

  18. Monolith Chromatography as Sample Preparation Step in Virome Studies of Water Samples.

    Science.gov (United States)

    Gutiérrez-Aguirre, Ion; Kutnjak, Denis; Rački, Nejc; Rupar, Matevž; Ravnikar, Maja

    2018-01-01

    Viruses exist in aquatic media and many of them use this media as transmission route. Next-generation sequencing (NGS) technologies have opened new doors in virus research, allowing also to reveal a hidden diversity of viral species in aquatic environments. Not surprisingly, many of the newly discovered viruses are found in environmental fresh and marine waters. One of the problems in virome research can be the low amount of viral nucleic acids present in the sample in contrast to the background ones (host, eukaryotic, prokaryotic, environmental). Therefore, virus enrichment prior to NGS is necessary in many cases. In water samples, an added problem resides in the low concentration of viruses typically present in aquatic media. Different concentration strategies have been used to overcome such limitations. CIM monoliths are a new generation of chromatographic supports that due to their particular structural characteristics are very efficient in concentration and purification of viruses. In this chapter, we describe the use of CIM monolithic chromatography for sample preparation step in NGS studies targeting viruses in fresh or marine water. The step-by-step protocol will include a case study where CIM concentration was used to study the virome of a wastewater sample using NGS.

  19. Acoustically levitated droplets: a contactless sampling method for fluorescence studies.

    Science.gov (United States)

    Leiterer, Jork; Grabolle, Markus; Rurack, Knut; Resch-Genger, Ute; Ziegler, Jan; Nann, Thomas; Panne, Ulrich

    2008-01-01

    Acoustic levitation is used as a new tool to study concentration-dependent processes in fluorescence spectroscopy. With this technique, small amounts of liquid and solid samples can be measured without the need for sample supports or containers, which often limits signal acquisition and can even alter sample properties due to interactions with the support material. We demonstrate that, because of the small sample volume, fluorescence measurements at high concentrations of an organic dye are possible without the limitation of inner-filter effects, which hamper such experiments in conventional, cuvette-based measurements. Furthermore, we show that acoustic levitation of liquid samples provides an experimentally simple way to study distance-dependent fluorescence modulations in semiconductor nanocrystals. The evaporation of the solvent during levitation leads to a continuous increase of solute concentration and can easily be monitored by laser-induced fluorescence.

  20. Media Use and Source Trust among Muslims in Seven Countries: Results of a Large Random Sample Survey

    Directory of Open Access Journals (Sweden)

    Steven R. Corman

    2013-12-01

    Full Text Available Despite the perceived importance of media in the spread of and resistance against Islamist extremism, little is known about how Muslims use different kinds of media to get information about religious issues, and what sources they trust when doing so. This paper reports the results of a large, random sample survey among Muslims in seven countries in Southeast Asia, West Africa and Western Europe, which helps fill this gap. Results show a diverse set of profiles of media use and source trust that differ by country, with overall low trust in mediated sources of information. Based on these findings, we conclude that mass media is still the most common source of religious information for Muslims, but that trust in mediated information is low overall. This suggests that media are probably best used to persuade opinion leaders, who will then carry anti-extremist messages through more personal means.

  1. Soil map disaggregation improved by soil-landscape relationships, area-proportional sampling and random forest implementation

    DEFF Research Database (Denmark)

    Møller, Anders Bjørn; Malone, Brendan P.; Odgers, Nathan

    implementation generally improved the algorithm’s ability to predict the correct soil class. The implementation of soil-landscape relationships and area-proportional sampling generally increased the calculation time, while the random forest implementation reduced the calculation time. In the most successful......Detailed soil information is often needed to support agricultural practices, environmental protection and policy decisions. Several digital approaches can be used to map soil properties based on field observations. When soil observations are sparse or missing, an alternative approach...... is to disaggregate existing conventional soil maps. At present, the DSMART algorithm represents the most sophisticated approach for disaggregating conventional soil maps (Odgers et al., 2014). The algorithm relies on classification trees trained from resampled points, which are assigned classes according...

  2. Astronaut Neil Armstrong studies rock samples during geological field trip

    Science.gov (United States)

    1969-01-01

    Astronaut Neil Armstrong, commander of the Apollo 11 lunar landing mission, studies rock samples during a geological field trip to the Quitman Mountains area near the Fort Quitman ruins in far west Texas.

  3. Sampling strategies for tropical forest nutrient cycling studies: a case study in São Paulo, Brazil

    Directory of Open Access Journals (Sweden)

    G. Sparovek

    1997-12-01

    Full Text Available The precise sampling of soil, biological or micro climatic attributes in tropical forests, which are characterized by a high diversity of species and complex spatial variability, is a difficult task. We found few basic studies to guide sampling procedures. The objective of this study was to define a sampling strategy and data analysis for some parameters frequently used in nutrient cycling studies, i. e., litter amount, total nutrient amounts in litter and its composition (Ca, Mg, K, N and P), and soil attributes at three depths (organic matter, P content, cation exchange capacity and base saturation). A natural remnant forest in the West of São Paulo State (Brazil) was selected as study area and samples were collected in July, 1989. The total amount of litter and its total nutrient amounts had a high spatial independent variance. Conversely, the variance of litter composition was lower and the spatial dependency was peculiar to each nutrient. The sampling strategy for the estimation of litter amounts and the amount of nutrient in litter should be different than the sampling strategy for nutrient composition. For the estimation of litter amounts and the amount of nutrients in litter (related to quantity) a large number of randomly distributed determinations are needed. Otherwise, for the estimation of litter nutrient composition (related to quality) a smaller amount of spatially located samples should be analyzed. The determination of sampling for soil attributes differed according to the depth. Overall, surface samples (0-5 cm) showed high short distance spatial dependent variance, whereas, subsurface samples exhibited spatial dependency in longer distances. Short transects with sampling interval of 5-10 m are recommended for surface sampling. Subsurface samples must also be spatially located, but with transects or grids with longer distances between sampling points over the entire area. Composite soil samples would not provide a complete

  4. MiDAS ENCORE: Randomized Controlled Study Design and Protocol.

    Science.gov (United States)

    Benyamin, Ramsin M; Staats, Peter S

    2015-01-01

    Epidural steroid injections (ESIs) are commonly used for treatment of symptomatic lumbar spinal stenosis (LSS). ESIs are generally administered after failure of conservative therapy. For LSS patients suffering from neurogenic claudication, the mild® procedure provides an alternative to ESIs via minimally invasive lumbar decompression. Both ESIs and mild offer interventional pain treatment options for LSS patients experiencing neurogenic claudication refractory to more conservative therapies. Prospective, multi-center, randomized controlled, clinical study. Twenty-six interventional pain management centers throughout the United States. To compare patient outcomes following treatment with either mild or ESIs in LSS patients with neurogenic claudication and having verified ligamentum flavum hypertrophy. Study participants include Medicare beneficiaries who meet study inclusion/exclusion criteria. Eligible patients will be randomized in a 1:1 ratio to one of 2 treatment arms, mild (treatment group) or ESI (control group). Each study group will include approximately 150 patients who have experienced neurogenic claudication symptoms for ≥ 3 months duration who have failed to respond to physical therapy, home exercise programs, and oral analgesics. Those randomized to mild are prohibited from receiving lumbar ESIs during the study period, while those randomized to ESI may receive ESIs up to 4 times per year. Patient assessments will occur at baseline, 6 months, and one year. An additional assessment will be conducted for the mild patient group at 2 years. The primary efficacy outcome measure is the proportion of Oswestry Disability Index (ODI) responders from baseline to one year follow-up in the treatment group (mild) versus the control group (ESI). ODI responders are defined as those patients achieving the validated Minimal Important Change (MIC) of ≥ 10 point improvement in ODI from baseline to follow-up as a clinically significant efficacy threshold. Secondary

  5. The UK Biobank sample handling and storage validation studies.

    Science.gov (United States)

    Peakman, Tim C; Elliott, Paul

    2008-04-01

    Background and aims: UK Biobank is a large prospective study in the United Kingdom to investigate the role of genetic factors, environmental exposures and lifestyle in the causes of major diseases of late and middle age. It involves the collection of blood and urine from 500 000 individuals aged between 40 and 69 years. How the samples are collected, processed and stored will have a major impact on the future scientific usefulness of the UK Biobank resource. A series of validation studies was recommended to test the robustness of the draft sample handling and storage protocol. Samples of blood and urine were collected from 40 healthy volunteers and either processed immediately according to the protocol or maintained at specified temperatures (4 degrees C for all tubes with the exception of vacutainers containing acid citrate dextrose that were maintained at 18 degrees C) for 12, 24 or 36 h prior to processing. A further sample was maintained for 24 h at 4 degrees C, processed and the aliquots frozen at -80 degrees C for 20 days and then thawed under controlled conditions. The stability of the samples was compared for the different times in a wide variety of assays. The samples maintained at 4 degrees C were stable for at least 24 h after collection for a wide range of assays. Small but significant changes were observed in metabonomic studies in samples maintained at 4 degrees C for 36 h. There was no degradation of the samples for a range of biochemical assays after short-term freezing and thawing under controlled conditions. Whole blood maintained at 18 degrees C for 24 h in vacutainers containing acid citrate dextrose is suitable for viral immortalization techniques. The validation studies reported in this supplement provide justification for the sample handling and storage procedures adopted in the UK Biobank project.

  6. A sero-survey of rinderpest in nomadic pastoral systems in central and southern Somalia from 2002 to 2003, using a spatially integrated random sampling approach.

    Science.gov (United States)

    Tempia, S; Salman, M D; Keefe, T; Morley, P; Freier, J E; DeMartini, J C; Wamwayi, H M; Njeumi, F; Soumaré, B; Abdi, A M

    2010-12-01

    A cross-sectional sero-survey, using a two-stage cluster sampling design, was conducted between 2002 and 2003 in ten administrative regions of central and southern Somalia, to estimate the seroprevalence and geographic distribution of rinderpest (RP) in the study area, as well as to identify potential risk factors for the observed seroprevalence distribution. The study was also used to test the feasibility of the spatially integrated investigation technique in nomadic and semi-nomadic pastoral systems. In the absence of a systematic list of livestock holdings, the primary sampling units were selected by generating random map coordinates. A total of 9,216 serum samples were collected from cattle aged 12 to 36 months at 562 sampling sites. Two apparent clusters of RP seroprevalence were detected. Four potential risk factors associated with the observed seroprevalence were identified: the mobility of cattle herds, the cattle population density, the proximity of cattle herds to cattle trade routes and cattle herd size. Risk maps were then generated to assist in designing more targeted surveillance strategies. The observed seroprevalence in these areas declined over time. In subsequent years, similar seroprevalence studies in neighbouring areas of Kenya and Ethiopia also showed a very low seroprevalence of RP or the absence of antibodies against RP. The progressive decline in RP antibody prevalence is consistent with virus extinction. Verification of freedom from RP infection in the Somali ecosystem is currently in progress.
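
    The frame-free selection of primary sampling units by random map coordinates can be sketched as below; the bounding box, point count and rounding are illustrative, and a real survey would clip points to the study-area polygon and correct for the area distortion of latitude-longitude rectangles.

      import random

      # Sketch of selecting primary sampling units by random map coordinates within
      # a bounding box (illustrative box and counts, not the survey's actual frame).

      random.seed(7)

      def random_sites(n, lat_min, lat_max, lon_min, lon_max):
          return [(round(random.uniform(lat_min, lat_max), 4),
                   round(random.uniform(lon_min, lon_max), 4)) for _ in range(n)]

      # e.g. 562 candidate sampling sites over an approximate bounding box
      sites = random_sites(562, lat_min=0.0, lat_max=5.0, lon_min=42.0, lon_max=48.0)
      print(sites[:3])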

  7. Missing citations due to exact reference matching: Analysis of a random sample from WoS. Are publications from peripheral countries disadvantaged?

    Energy Technology Data Exchange (ETDEWEB)

    Donner, P.

    2016-07-01

    Citation counts of scientific research contributions are fundamental data in scientometrics. Accuracy and completeness of citation links are therefore crucial data quality issues (Moed, 2005, Ch. 13). However, despite the known flaws of reference matching algorithms, usually no attempts are made to incorporate uncertainty about citation counts into indicators. This study is a step towards that goal. Particular attention is paid to the question of whether publications from countries not using basic Latin script are differently affected by missed citations. The proprietary reference matching procedure of Web of Science (WoS) is based on (near) exact agreement of cited reference data (normalized during processing) to the target paper's bibliographical data. Consequently, the procedure has near-optimal precision but incomplete recall - it is known to miss some slightly inaccurate reference links (Olensky, 2015). However, there has been no attempt so far to estimate the rate of missed citations by a principled method for a random sample. For this study a simple random sample of WoS source papers was drawn and it was attempted to find all reference strings of WoS indexed documents that refer to them, in particular inexact matches. The objective is to give a statistical estimate of the proportion of missed citations and to describe the relationship of the number of found citations to the number of missed citations, i.e. the conditional error distribution. The empirical error distribution is statistically analyzed and modelled. (Author)
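
    The mechanism being measured here, a citation lost because the cited string does not match the target record exactly, can be illustrated with a toy matcher that first tries normalized exact matching and then a fuzzy fallback. The normalization rules and the similarity threshold are arbitrary choices for this sketch; the actual WoS matching procedure is proprietary.

      import re
      from difflib import SequenceMatcher

      # Toy illustration of exact vs. fuzzy reference matching. The normalization and
      # the 0.9 similarity threshold are arbitrary; WoS's matching is proprietary.

      def normalize(ref):
          ref = ref.lower()
          ref = re.sub(r"[^a-z0-9 ]", " ", ref)      # crude punctuation stripping
          return re.sub(r"\s+", " ", ref).strip()

      def match(cited_ref, target_records, threshold=0.9):
          cited = normalize(cited_ref)
          for rec in target_records:                 # pass 1: normalized exact match
              if cited == normalize(rec):
                  return rec, "exact"
          best = max(target_records,                 # pass 2: fuzzy fallback
                     key=lambda rec: SequenceMatcher(None, cited, normalize(rec)).ratio())
          score = SequenceMatcher(None, cited, normalize(best)).ratio()
          return (best, "fuzzy") if score >= threshold else (None, "missed")

      records = ["Moed H F, 2005, Citation Analysis in Research Evaluation"]
      print(match("Moed HF (2005) Citation analysis in research evaluation", records))
      print(match("Moed, 2005, Citation analyses in res. evaluation", records))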

  8. A Mixed Methods Sampling Methodology for a Multisite Case Study

    Science.gov (United States)

    Sharp, Julia L.; Mobley, Catherine; Hammond, Cathy; Withington, Cairen; Drew, Sam; Stringfield, Sam; Stipanovic, Natalie

    2012-01-01

    The flexibility of mixed methods research strategies makes such approaches especially suitable for multisite case studies. Yet the utilization of mixed methods to select sites for these studies is rarely reported. The authors describe their pragmatic mixed methods approach to select a sample for their multisite mixed methods case study of a…

  9. Sampling Studies at an Air Force Live-Fire Bombing Range Impact Area

    National Research Council Canada - National Science Library

    Jenkins, Thomas F; Hewitt, Alan D; Ramsey, Charles A; Bjella, Kevin L; Bigl, Susan R; Lambert, Dennis J

    2006-01-01

    .... The main objective was to assess the effectiveness of using a systematic-random, multi-increment sampling strategy for the collection of representative surface soil samples in areas where bombing...

  10. Investigative studies on water contamination in Bangladesh. Primary treatment of water samples at the sampling site

    International Nuclear Information System (INIS)

    Sera, K.; Islam, Md. Shafiqul; Takatsuji, T.; Nakamura, T.; Goto, S.; Takahashi, C.; Saitoh, Y.

    2010-01-01

    Arsenic concentration in 13 well waters, 9 pond waters, 10 agricultural waters and a coconut juice taken in Comilla district, Bangladesh, where the problem of arsenic pollution is the most severe, was investigated. High-level arsenic is detected even in the well water which has been kept drinking by the people. Relatively high arsenic concentration was detected for some pond and farm waters even though the sampling was performed just after the rainy season and the waters were expected to be highly diluted. Clear relationship was observed in elemental compositions between the pond water and the coconut juice collected at the edge of the water. These results are expected to become the basic information for evaluating the risk of individual food such as cultured fishes, shrimps and farm products, and for controlling total intakes of arsenic. In order to solve the problem of transportation of water samples internationally, a simple method of target preparation performed at the sampling site was established and its validity was confirmed. All targets were prepared at the sampling sites in this study on the basis of this method. (author)

  11. Neuromuscular dose-response studies: determining sample size.

    Science.gov (United States)

    Kopman, A F; Lien, C A; Naguib, M

    2011-02-01

    Investigators planning dose-response studies of neuromuscular blockers have rarely used a priori power analysis to determine the minimal sample size their protocols require. Institutional Review Boards and peer-reviewed journals now generally ask for this information. This study outlines a proposed method for meeting these requirements. The slopes of the dose-response relationships of eight neuromuscular blocking agents were determined using regression analysis. These values were substituted for γ in the Hill equation. When this is done, the coefficient of variation (COV) around the mean value of the ED₅₀ for each drug is easily calculated. Using these values, we performed an a priori one-sample two-tailed t-test of the means to determine the required sample size when the allowable error in the ED₅₀ was varied from ±10-20%. The COV averaged 22% (range 15-27%). We used a COV value of 25% in determining the sample size. If the allowable error in finding the mean ED₅₀ is ±15%, a sample size of 24 is needed to achieve a power of 80%. Increasing 'accuracy' beyond this point requires increasingly greater sample sizes (e.g. an 'n' of 37 for a ±12% error). On the basis of the results of this retrospective analysis, a total sample size of not less than 24 subjects should be adequate for determining a neuromuscular blocking drug's clinical potency with a reasonable degree of assurance.
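
    The calculation described, the sample size for a one-sample two-tailed t-test given a coefficient of variation and an allowable error in the mean ED50, can be reproduced approximately as follows; statsmodels is used here as a convenient stand-in for whatever software the authors employed.

      import math
      from statsmodels.stats.power import TTestPower

      # A priori sample size for estimating a drug's ED50 within a given allowable
      # error, framed as a one-sample two-tailed t-test. The effect size is the
      # allowable relative error divided by the coefficient of variation; the default
      # values mirror the abstract (COV 25%, allowable error +/-15%, power 80%).

      def n_for_ed50(cov=0.25, allowable_error=0.15, alpha=0.05, power=0.80):
          effect_size = allowable_error / cov
          n = TTestPower().solve_power(effect_size=effect_size, alpha=alpha,
                                       power=power, alternative="two-sided")
          return math.ceil(n)

      print(n_for_ed50())                          # close to the 24 subjects reported above
      print(n_for_ed50(allowable_error=0.12))      # tighter error -> noticeably larger n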

  12. Adult health study reference papers. Selection of the sample. Characteristics of the sample

    Energy Technology Data Exchange (ETDEWEB)

    Beebe, G W; Fujisawa, Hideo; Yamasaki, Mitsuru

    1960-12-14

    The characteristics and selection of the clinical sample have been described in some detail to provide information on the comparability of the exposure groups with respect to factors excluded from the matching criteria and to provide basic descriptive information potentially relevant to individual studies that may be done within the framework of the Adult Health Study. The characteristics under review here are age, sex, many different aspects of residence, marital status, occupation and industry, details of location and shielding ATB, acute radiation signs and symptoms, and prior ABCC medical or pathology examinations. 5 references, 57 tables.

  13. Sample size adjustments for varying cluster sizes in cluster randomized trials with binary outcomes analyzed with second-order PQL mixed logistic regression.

    Science.gov (United States)

    Candel, Math J J M; Van Breukelen, Gerard J P

    2010-06-30

    Adjustments of sample size formulas are given for varying cluster sizes in cluster randomized trials with a binary outcome when testing the treatment effect with mixed effects logistic regression using second-order penalized quasi-likelihood estimation (PQL). Starting from first-order marginal quasi-likelihood (MQL) estimation of the treatment effect, the asymptotic relative efficiency of unequal versus equal cluster sizes is derived. A Monte Carlo simulation study shows this asymptotic relative efficiency to be rather accurate for realistic sample sizes, when employing second-order PQL. An approximate, simpler formula is presented to estimate the efficiency loss due to varying cluster sizes when planning a trial. In many cases sampling 14 per cent more clusters is sufficient to repair the efficiency loss due to varying cluster sizes. Since current closed-form formulas for sample size calculation are based on first-order MQL, planning a trial also requires a conversion factor to obtain the variance of the second-order PQL estimator. In a second Monte Carlo study, this conversion factor turned out to be 1.25 at most. (c) 2010 John Wiley & Sons, Ltd.
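
    The abstract quotes two practical correction factors: roughly 14 per cent more clusters to repair the efficiency loss from varying cluster sizes, and a variance conversion factor of at most 1.25 when moving from first-order MQL to second-order PQL estimation. The sketch below shows one plausible way to stack those corrections on top of a conventional equal-cluster-size calculation; the base formula for the number of clusters is a textbook approximation, not the authors' derivation, and the input values are invented.

      # Rough planning sketch (not the authors' formulas): start from a standard
      # number-of-clusters calculation for equal cluster sizes, then apply the
      # worst-case corrections quoted in the abstract.
      import math
      from scipy.stats import norm

      def clusters_per_arm(p0, p1, m, icc, alpha=0.05, power=0.80,
                           varying_sizes=True, pql_conversion=1.25):
          z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
          pbar = (p0 + p1) / 2
          # Individually randomized sample size per arm for two proportions.
          n_ind = z**2 * 2 * pbar * (1 - pbar) / (p1 - p0)**2
          deff = 1 + (m - 1) * icc      # design effect for equal cluster sizes
          k = n_ind * deff / m          # clusters per arm under equal sizes (MQL-based)
          k *= pql_conversion           # assumed MQL -> second-order PQL variance conversion
          if varying_sizes:
              k *= 1.14                 # repair efficiency loss from varying cluster sizes
          return math.ceil(k)

      print(clusters_per_arm(p0=0.30, p1=0.45, m=20, icc=0.05))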

  14. Randomly and Non-Randomly Missing Renal Function Data in the Strong Heart Study: A Comparison of Imputation Methods.

    Directory of Open Access Journals (Sweden)

    Nawar Shara

    Full Text Available Kidney and cardiovascular disease are widespread among populations with high prevalence of diabetes, such as American Indians participating in the Strong Heart Study (SHS). Studying these conditions simultaneously in longitudinal studies is challenging, because the morbidity and mortality associated with these diseases result in missing data, and these data are likely not missing at random. When such data are merely excluded, study findings may be compromised. In this article, a subset of 2264 participants with complete renal function data from Strong Heart Exams 1 (1989-1991), 2 (1993-1995), and 3 (1998-1999) was used to examine the performance of five methods used to impute missing data: listwise deletion, mean of serial measures, adjacent value, multiple imputation, and pattern-mixture. Three missing at random models and one non-missing at random model were used to compare the performance of the imputation techniques on randomly and non-randomly missing data. The pattern-mixture method was found to perform best for imputing renal function data that were not missing at random. Determining whether data are missing at random or not can help in choosing the imputation method that will provide the most accurate results.
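
    To make the missing-data problem concrete, the toy sketch below generates synthetic follow-up values in which lower (worse) measurements are more likely to be missing, i.e. not missing at random, and shows how listwise deletion and a simple adjacent-value substitution shift the estimated mean. The variable names and distributions are invented for illustration; this is not the SHS data and it does not implement the pattern-mixture model.

      # Toy illustration of non-random missingness (not the Strong Heart Study data).
      import numpy as np

      rng = np.random.default_rng(0)
      n = 2264
      exam1 = rng.normal(90, 15, n)                 # baseline renal function measure
      exam2 = exam1 - 5 + rng.normal(0, 8, n)       # true follow-up values

      # Non-random missingness: lower exam2 values are more likely to be missing.
      p_miss = 1 / (1 + np.exp((exam2 - 70) / 5))
      missing = rng.random(n) < p_miss
      observed = np.where(missing, np.nan, exam2)

      print("true mean        :", exam2.mean().round(2))
      print("listwise deletion:", np.nanmean(observed).round(2))   # biased upward
      adjacent = np.where(missing, exam1, observed)                 # adjacent-value imputation
      print("adjacent value   :", adjacent.mean().round(2))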

  15. Serum Dried Samples to Detect Dengue Antibodies: A Field Study

    Directory of Open Access Journals (Sweden)

    Angelica Maldonado-Rodríguez

    2017-01-01

    Full Text Available Background. Dried blood and serum samples are useful resources for detecting antiviral antibodies. The conditions for elution of the sample need to be optimized for each disease. Dengue is a widespread disease in Mexico which requires continuous surveillance. In this study, we standardized and validated a protocol for the specific detection of dengue antibodies from dried serum spots (DSSs). Methods. Paired serum and DSS samples from 66 suspected cases of dengue were collected in a clinic in Veracruz, Mexico. Samples were sent to our laboratory, where the conditions for optimal elution of DSSs were established. The presence of anti-dengue antibodies was determined in the paired samples. Results. DSS elution conditions were standardized as follows: 1 h at 4°C in 200 µl of DNase-, RNase-, and protease-free PBS (1×). The optimal volume of DSS eluate to be used in the IgG assay was 40 µl. Sensitivity of 94%, specificity of 93.3%, and kappa concordance of 0.87 were obtained when comparing the anti-dengue reactivity between DSSs and serum samples. Conclusion. DSS samples are useful for detecting anti-dengue IgG antibodies in the field.
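
    The agreement statistics reported above (sensitivity 94%, specificity 93.3%, kappa 0.87) can be recomputed from a 2×2 table of paired results. The sketch below does so for a hypothetical table of 66 pairs constructed to be consistent with those figures; the counts are illustrative, not the study's actual data.

      # Recompute sensitivity, specificity and Cohen's kappa from a hypothetical 2x2 table.
      import numpy as np
      from sklearn.metrics import cohen_kappa_score, confusion_matrix

      serum = np.array([1] * 36 + [0] * 30)                       # reference results (1 = IgG positive)
      dss   = np.array([1] * 34 + [0] * 2 + [0] * 28 + [1] * 2)   # dried-serum-spot results

      tn, fp, fn, tp = confusion_matrix(serum, dss).ravel()
      print("sensitivity:", round(tp / (tp + fn), 3))
      print("specificity:", round(tn / (tn + fp), 3))
      print("kappa      :", round(cohen_kappa_score(serum, dss), 3))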

  16. Iterative random vs. Kennard-Stone sampling for IR spectrum-based classification task using PLS2-DA

    Science.gov (United States)

    Lee, Loong Chuen; Liong, Choong-Yeun; Jemain, Abdul Aziz

    2018-04-01

    External testing (ET) is preferred over auto-prediction (AP) or k-fold-cross-validation in estimating more realistic predictive ability of a statistical model. With IR spectra, the Kennard-Stone (KS) sampling algorithm is often used to split the data into training and test sets, i.e. respectively for model construction and for model testing. On the other hand, iterative random sampling (IRS) has not been the favored choice though it is theoretically more likely to produce reliable estimation. The aim of this preliminary work is to compare performances of KS and IRS in sampling a representative training set from an attenuated total reflectance - Fourier transform infrared spectral dataset (of four varieties of blue gel pen inks) for PLS2-DA modeling. The 'best' performance achievable from the dataset is estimated with AP on the full dataset (APF, error). Both IRS (n = 200) and KS were used to split the dataset in the ratio of 7:3. The classic decision rule (i.e. maximum value-based) is employed for new sample prediction via partial least squares - discriminant analysis (PLS2-DA). The error rate of each model was estimated repeatedly via: (a) AP on full data (APF, error); (b) AP on training set (APS, error); and (c) ET on the respective test set (ETS, error). A good PLS2-DA model is expected to produce APS, error and ETS, error that are similar to the APF, error. Bearing that in mind, the similarities between (a) APS, error vs. APF, error; (b) ETS, error vs. APF, error and; (c) APS, error vs. ETS, error were evaluated using correlation tests (i.e. Pearson and Spearman's rank test), using series of PLS2-DA models computed from KS-set and IRS-set, respectively. Overall, models constructed from the IRS-set exhibit more similarities between the internal and external error rates than those from the respective KS-set, i.e. less risk of overfitting. In conclusion, IRS is more reliable than KS in sampling representative training set.
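
    For readers unfamiliar with the two splitting strategies, the sketch below implements the standard Kennard-Stone selection (a farthest-point procedure) on a synthetic spectral matrix and contrasts it with repeated random splitting; the data, dimensions and split-ratio handling are our own assumptions, not the authors' implementation.

      # Minimal Kennard-Stone selection sketch on a hypothetical spectral matrix.
      import numpy as np

      def kennard_stone(X, n_select):
          d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)   # pairwise distances
          selected = list(np.unravel_index(np.argmax(d), d.shape))     # two most distant samples
          remaining = [i for i in range(len(X)) if i not in selected]
          while len(selected) < n_select:
              # Distance of each remaining sample to its nearest already-selected sample.
              min_d = d[np.ix_(remaining, selected)].min(axis=1)
              selected.append(remaining.pop(int(np.argmax(min_d))))
          return np.array(selected)

      X = np.random.default_rng(1).normal(size=(100, 600))   # rows = spectra (synthetic)
      train_idx = kennard_stone(X, n_select=70)               # 7:3 split, as in the study
      test_idx = np.setdiff1d(np.arange(len(X)), train_idx)
      # Iterative random sampling would instead repeat a random 7:3 split many times
      # (e.g. n = 200) and average the resulting error estimates.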

  17. Two to five repeated measurements per patient reduced the required sample size considerably in a randomized clinical trial for patients with inflammatory rheumatic diseases

    Directory of Open Access Journals (Sweden)

    Smedslund Geir

    2013-02-01

    Full Text Available Abstract Background Patient reported outcomes are accepted as important outcome measures in rheumatology. The fluctuating symptoms in patients with rheumatic diseases have serious implications for sample size in clinical trials. We estimated the effects of measuring the outcome 1-5 times on the sample size required in a two-armed trial. Findings In a randomized controlled trial that evaluated the effects of a mindfulness-based group intervention for patients with inflammatory arthritis (n=71), the outcome variables Numerical Rating Scales (NRS for pain, fatigue, disease activity, self-care ability, and emotional wellbeing) and the General Health Questionnaire (GHQ-20) were measured five times before and after the intervention. For each variable we calculated the necessary sample sizes for obtaining 80% power (α=.05) for one up to five measurements. Two, three, and four measures reduced the required sample sizes by 15%, 21%, and 24%, respectively. With three (and five) measures, the required sample size per group was reduced from 56 to 39 (32) for the GHQ-20, from 71 to 60 (55) for pain, from 96 to 71 (73) for fatigue, from 57 to 51 (48) for disease activity, from 59 to 44 (45) for self-care, and from 47 to 37 (33) for emotional wellbeing. Conclusions Measuring the outcomes five times rather than once reduced the necessary sample size by an average of 27%. When planning a study, researchers should carefully compare the advantages and disadvantages of increasing sample size versus employing three to five repeated measurements in order to obtain the required statistical power.
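
    One simple way to see where reductions of this size come from is the variance of the mean of k correlated repeated measurements: averaging k measures with between-measure correlation ρ shrinks the variance, and hence the required sample size, by roughly (1 + (k-1)ρ)/k. The value of ρ below is assumed purely for illustration and is not taken from the trial.

      # Back-of-the-envelope sketch (not the authors' calculation).
      rho = 0.70          # assumed correlation between repeated measurements
      n_single = 56       # example sample size per group with a single measurement

      for k in range(1, 6):
          factor = (1 + (k - 1) * rho) / k
          print(f"k = {k}: relative sample size {factor:.2f} "
                f"(n per group ≈ {n_single * factor:.0f})")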

  18. Sampling strategies to measure the prevalence of common recurrent infections in longitudinal studies

    Directory of Open Access Journals (Sweden)

    Luby Stephen P

    2010-08-01

    Full Text Available Abstract Background Measuring recurrent infections such as diarrhoea or respiratory infections in epidemiological studies is a methodological challenge. Problems in measuring the incidence of recurrent infections include the episode definition, recall error, and the logistics of close follow up. Longitudinal prevalence (LP), the proportion-of-time-ill estimated by repeated prevalence measurements, is an alternative measure to incidence of recurrent infections. In contrast to incidence, which usually requires continuous sampling, LP can be measured at intervals. This study explored how many more participants are needed for infrequent sampling to achieve the same study power as frequent sampling. Methods We developed a set of four empirical simulation models representing low and high risk settings with short or long episode durations. The model was used to evaluate different sampling strategies with different assumptions on recall period and recall error. Results The model identified three major factors that influence sampling strategies: (1) the clustering of episodes in individuals; (2) the duration of episodes; (3) the positive correlation between an individual's disease incidence and episode duration. Intermittent sampling (e.g. 12 times per year) often requires only a slightly larger sample size compared to continuous sampling, especially in cluster-randomized trials. The collection of period prevalence data can lead to highly biased effect estimates if the exposure variable is associated with episode duration. To maximize study power, recall periods of 3 to 7 days may be preferable over shorter periods, even if this leads to inaccuracy in the prevalence estimates. Conclusion Choosing the optimal approach to measure recurrent infections in epidemiological studies depends on the setting, the study objectives, study design and budget constraints. Sampling at intervals can contribute to making epidemiological studies and trials more efficient, valid
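
    As a toy illustration of the intermittent-sampling idea (not the paper's four empirical simulation models), the sketch below gives each child a fixed illness propensity, so that episodes cluster within individuals, and compares longitudinal prevalence computed from daily follow-up with the estimate from roughly monthly sampling days.

      # Toy comparison of continuous versus intermittent prevalence sampling.
      import numpy as np

      rng = np.random.default_rng(2)
      n_children, n_days = 300, 365
      p_child = rng.beta(2, 18, n_children)                  # child-specific propensity, mean ~10%
      ill = rng.random((n_children, n_days)) < p_child[:, None]

      lp_continuous = ill.mean(axis=1)                       # proportion of days ill per child
      monthly_days = np.arange(15, 365, 30)                  # 12 roughly monthly sampling days
      lp_monthly = ill[:, monthly_days].mean(axis=1)

      print("mean LP, continuous:", lp_continuous.mean().round(3))
      print("mean LP, 12 visits :", lp_monthly.mean().round(3))
      print("correlation        :", np.corrcoef(lp_continuous, lp_monthly)[0, 1].round(2))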

  19. Prospective randomized clinical studies involving reirradiation. Lessons learned

    International Nuclear Information System (INIS)

    Nieder, Carsten; Langendijk, Johannes A.; Guckenberger, Matthias; Grosu, Anca L.

    2016-01-01

    Reirradiation is a potentially useful option for many patients with recurrent cancer. The purpose of this study was to review all recently published randomized trials in order to identify methodological strengths and weaknesses, comment on the results, clinical implications and open questions, and give advice for the planning of future trials. Systematic review of trials published between 2000 and 2015 (databases searched were PubMed, Scopus and Web of Science). We reviewed 9 trials, most of which addressed reirradiation of head and neck tumours. The median number of patients was 69. Trial design, primary endpoint and statistical hypotheses varied widely. The results contribute mainly to decision making for reirradiation of nasopharynx cancer and bone metastases. The trials with relatively long median follow-up confirm that serious toxicity remains a concern after high cumulative total doses. Multi-institutional collaboration is encouraged to complete sufficiently large trials. Despite a paucity of large randomized studies, reirradiation has been adopted in different clinical scenarios by many institutions. Typically, the patients have been assessed by multidisciplinary tumour boards and advanced technologies are used to create highly conformal dose distributions. (orig.)

  20. Selective decontamination in pediatric liver transplants. A randomized prospective study.

    Science.gov (United States)

    Smith, S D; Jackson, R J; Hannakan, C J; Wadowsky, R M; Tzakis, A G; Rowe, M I

    1993-06-01

    Although it has been suggested that selective decontamination of the digestive tract (SDD) decreases postoperative aerobic Gram-negative and fungal infections in orthotopic liver transplantation (OLT), no controlled trials exist in pediatric patients. This prospective, randomized controlled study of 36 pediatric OLT patients examines the effect of short-term SDD on postoperative infection and digestive tract flora. Patients were randomized into two groups. The control group received perioperative parenteral antibiotics only. The SDD group received in addition polymyxin E, tobramycin, and amphotericin B enterally and by oropharyngeal swab postoperatively until oral intake was tolerated (6 +/- 4 days). Indications for operation, preoperative status, age, and intensive care unit and hospital length of stay were no different in SDD (n = 18) and control (n = 18) groups. A total of 14 Gram-negative infections (intraabdominal abscess 7, septicemia 5, pneumonia 1, urinary tract 1) developed in the 36 patients studied. Mortality was not significantly different in the two groups. However, there were significantly fewer patients with Gram-negative infections in the SDD group: 3/18 patients (11%) vs. 11/18 patients (50%) in the control group, P < 0.001. There was also significant reduction in aerobic Gram-negative flora in the stool and pharynx in patients receiving SDD. Gram-positive and anaerobic organisms were unaffected. We conclude that short-term postoperative SDD significantly reduces Gram-negative infections in pediatric OLT patients.

  1. Levels of dioxin (PCDD/F) and PCBs in a random sample of Australian aquaculture-produced Southern Bluefin Tuna (Thunnus maccoyii)

    Energy Technology Data Exchange (ETDEWEB)

    Padula, D.; Madigan, T.; Kiermeier, A.; Daughtry, B.; Pointon, A. [South Australian Research and Development Inst. (Australia)

    2004-09-15

    To date there has been no published information available on the levels of dioxin (PCDD/F) and PCBs in Australian aquaculture-produced Southern Bluefin Tuna (Thunnus maccoyii). Southern Bluefin Tuna are commercially farmed off the coast of Port Lincoln in the state of South Australia, Australia. This paper reports the levels of dioxin (PCDD/F) and PCBs in muscle tissue samples from 11 randomly sampled aquaculture-produced Southern Bluefin Tuna collected in 2003. Little published data exists on the levels of dioxin (PCDD/F) and PCBs in Australian aquaculture-produced seafood. Wild tuna are first caught in the Great Australian Bight in South Australian waters, and are then brought back to Port Lincoln where they are ranched in sea-cages before being harvested and exported to Japan. The aim of the study was to identify pathways whereby contaminants such as dioxin (PCDD/F) and PCBs may enter the aquaculture production system. This involved undertaking a through-chain analysis of the levels of dioxin (PCDD/F) and PCBs in wild-caught tuna, seafloor sediment samples from the marine environment, levels in feeds, and final harvested exported product. Detailed study was also undertaken on the variation of dioxin (PCDD/F) and PCBs across individual tuna carcases. This paper addresses the levels found in final harvested product. Details on levels found in other studies will be published elsewhere shortly.

  2. Electron Spin Resonance (ESR) studies of returned comet nucleus samples

    International Nuclear Information System (INIS)

    Tsay, Fundow; Kim, S.S.; Liang, R.H.

    1989-01-01

    The most important objective of the Comet Nucleus Sample Return Mission is to return samples which could reflect formation conditions and evolutionary processes in the early solar nebula. It is expected that the returned samples will consist of fine-grained silicate materials mixed with ices composed of simple molecules such as H₂O, NH₃, and CH₄, as well as organics and/or more complex compounds. Because of the exposure to ionizing radiation from cosmic-ray, gamma-ray, and solar wind protons at low temperature, free radicals are expected to be formed and trapped in the solid ice matrices. The kind of trapped radical species, together with their concentration and thermal stability, can be used as a dosimeter as well as a geothermometer to determine thermal and radiation histories as well as outgassing and other possible alteration effects since the nucleus material was formed. Since free radicals, which contain unpaired electrons, are all paramagnetic in nature, they can be readily detected and characterized in their native form by the Electron Spin Resonance (ESR) method. In fact, ESR has been shown to be a non-destructive, highly sensitive tool for the detection and characterization of paramagnetic, ferromagnetic, and radiation damage centers in terrestrial and extraterrestrial geological samples. The potential use of ESR as an effective method in the study of returned comet nucleus samples, in particular in the analysis of fine-grained solid-state icy samples, is discussed.

  3. Balneotherapy in fibromyalgia: a single blind randomized controlled clinical study.

    Science.gov (United States)

    Ozkurt, Seçil; Dönmez, Arif; Zeki Karagülle, M; Uzunoğlu, Emel; Turan, Mustafa; Erdoğan, Nergis

    2012-07-01

    We aimed to evaluate the effectiveness of balneotherapy in fibromyalgia management. Fifty women with fibromyalgia under pharmacological treatment were randomly assigned to either the balneotherapy (25) or the control (25) group. Four patients from the balneotherapy group and one patient from the control group left the study after randomization. The patients in the balneotherapy group (21) had 2 thermomineral water baths daily for 2 weeks in Tuzla Spa Center. The patients in the control group (24) continued to have their medical treatment and routine daily life. An investigator who was blinded to the study arms assessed the patients. All patients were assessed four times; at the beginning of the study, at the end of the 2nd week, the 1st month, and the 3rd month after balneotherapy. Outcome measures of the study were pain intensity, Fibromyalgia Impact Questionnaire (FIQ), Beck Depression Inventory (BDI), patient's global assessment, investigator's global assessment, SF-36 scores, and tender point count. Balneotherapy was found to be superior at the end of the cure period in terms of pain intensity, FIQ, Beck Depression Inventory, patient's global assessment, investigator's global assessment scores, and tender point count as compared to the control group. The superiority of balneotherapy lasted up to the end of the 3rd month, except for the Beck Depression Inventory score and the investigator's global assessment score. Significant improvements were observed in PF, GH, and MH subscales of SF-36 during the study period in the balneotherapy group; however, no such improvement was observed in the control group. Balneotherapy was superior only in VT subscale at the end of therapy and at the end of the third month after the therapy as compared to the controls. It was concluded that balneotherapy provides beneficial effects in patients with fibromyalgia.

  4. WRAP Module 1 sampling strategy and waste characterization alternatives study

    Energy Technology Data Exchange (ETDEWEB)

    Bergeson, C.L.

    1994-09-30

    The Waste Receiving and Processing Module 1 Facility is designed to examine, process, certify, and ship drums and boxes of solid wastes that have a surface dose equivalent of less than 200 mrem/h. These wastes will include low-level and transuranic wastes that are retrievably stored in the 200 Area burial grounds and facilities in addition to newly generated wastes. Certification of retrievably stored wastes processed in WRAP 1 is required to meet the waste acceptance criteria for onsite treatment and disposal of low-level waste and mixed low-level waste and the Waste Isolation Pilot Plant Waste Acceptance Criteria for the disposal of TRU waste. In addition, these wastes will need to be certified for packaging in TRUPACT-II shipping containers. Characterization of the retrievably stored waste is needed to support the certification process. Characterization data will be obtained from historical records, process knowledge, nondestructive examination, nondestructive assay, visual inspection of the waste, head-gas sampling, and analysis of samples taken from the waste containers. Sample characterization refers to the method or methods that are used to test waste samples for specific analytes. The focus of this study is the sample characterization needed to accurately identify the hazardous and radioactive constituents present in the retrieved wastes that will be processed in WRAP 1. In addition, some sampling and characterization will be required to support NDA calculations and to provide an over-check for the characterization of newly generated wastes. This study results in the baseline definition of WRAP 1 sampling and analysis requirements and identifies alternative methods to meet these requirements in an efficient and economical manner.

  5. WRAP Module 1 sampling strategy and waste characterization alternatives study

    International Nuclear Information System (INIS)

    Bergeson, C.L.

    1994-01-01

    The Waste Receiving and Processing Module 1 Facility is designed to examine, process, certify, and ship drums and boxes of solid wastes that have a surface dose equivalent of less than 200 mrem/h. These wastes will include low-level and transuranic wastes that are retrievably stored in the 200 Area burial grounds and facilities in addition to newly generated wastes. Certification of retrievably stored wastes processed in WRAP 1 is required to meet the waste acceptance criteria for onsite treatment and disposal of low-level waste and mixed low-level waste and the Waste Isolation Pilot Plant Waste Acceptance Criteria for the disposal of TRU waste. In addition, these wastes will need to be certified for packaging in TRUPACT-II shipping containers. Characterization of the retrievably stored waste is needed to support the certification process. Characterization data will be obtained from historical records, process knowledge, nondestructive examination, nondestructive assay, visual inspection of the waste, head-gas sampling, and analysis of samples taken from the waste containers. Sample characterization refers to the method or methods that are used to test waste samples for specific analytes. The focus of this study is the sample characterization needed to accurately identify the hazardous and radioactive constituents present in the retrieved wastes that will be processed in WRAP 1. In addition, some sampling and characterization will be required to support NDA calculations and to provide an over-check for the characterization of newly generated wastes. This study results in the baseline definition of WRAP 1 sampling and analysis requirements and identifies alternative methods to meet these requirements in an efficient and economical manner.

  6. A study on the representative sampling survey for the inspection of the clearance level for the radioisotope waste

    International Nuclear Information System (INIS)

    Hong Joo Ahn; Se Chul Sohn; Kwang Yong Jee; Ju Youl Kim; In Koo Lee

    2007-01-01

    The number of facilities using radioisotopes (RI) is increasing annually in South Korea; the total was 2,723 as of December 31, 2005. Inspection against the clearance level is a very important problem for ensuring public confidence when radioactive materials are released to the environment. Korean regulations for such clearance are described in Notice No. 2001-30 of the Ministry of Science and Technology (MOST) and Notice No. 2002-67 of the Ministry of Commerce, Industry and Energy (MOCIE). Most unsealed sources in RI waste drums at a storage facility are low-level beta-emitters with short half-lives, so it is impossible to measure their inventories by nondestructive analysis. Furthermore, RI wastes generated by hospitals, educational and research institutes, and industry form heterogeneous, varied, irregular, and small-quantity waste streams. This study addresses a representative (master) sampling survey and analysis plan for RI wastes, because a complete enumeration of waste drums is impossible and undesirable in terms of cost and efficiency. Existing approaches to representative sampling include judgmental, simple random, stratified random, systematic grid, systematic random, composite, and adaptive sampling. A representative sampling plan may combine two or more of these approaches, depending on the type and distribution of the waste stream. Stratified random sampling (constrained randomization) proved adequate for designing the RI waste sampling plan with respect to half-life, surface dose, time of transfer to the storage facility, and type of waste. The developed sampling protocol includes estimating the number of drums within a waste stream, estimating the number of samples, and confirming the required number of samples. Statistical process control for the quality assurance plan includes control charts and a 95% upper control limit (UCL) to determine whether the clearance level is met. (authors)
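
    The sketch below illustrates the two computational pieces mentioned above: proportional allocation of a stratified random sample across waste streams, and a 95% upper limit on the mean activity compared against a clearance level. The stratum sizes and activity values are invented, and the UCL is read here as a one-sided upper confidence limit of the mean, which may differ from the control-chart limit used in the paper.

      # Illustrative stratified allocation and 95% UCL check (not the study's protocol).
      import numpy as np
      from scipy import stats

      strata_drums = {"hospital": 120, "research": 300, "industry": 80}   # hypothetical counts
      n_total = 50
      total = sum(strata_drums.values())
      allocation = {k: round(n_total * v / total) for k, v in strata_drums.items()}
      print("proportional allocation:", allocation)

      # 95% upper confidence limit of the mean activity in one stratum (Bq/g, made up).
      x = np.array([0.06, 0.04, 0.09, 0.05, 0.07, 0.03, 0.08, 0.05])
      ucl95 = x.mean() + stats.t.ppf(0.95, df=len(x) - 1) * x.std(ddof=1) / np.sqrt(len(x))
      clearance_level = 0.1
      print("UCL95 =", round(ucl95, 3),
            "-> clears" if ucl95 < clearance_level else "-> does not clear")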

  7. Multiple-image authentication with a cascaded multilevel architecture based on amplitude field random sampling and phase information multiplexing.

    Science.gov (United States)

    Fan, Desheng; Meng, Xiangfeng; Wang, Yurong; Yang, Xiulun; Pan, Xuemei; Peng, Xiang; He, Wenqi; Dong, Guoyan; Chen, Hongyi

    2015-04-10

    A multiple-image authentication method with a cascaded multilevel architecture in the Fresnel domain is proposed, in which a synthetic encoded complex amplitude is first fabricated, and its real amplitude component is generated by iterative amplitude encoding, random sampling, and space multiplexing for the low-level certification images, while the phase component of the synthetic encoded complex amplitude is constructed by iterative phase information encoding and multiplexing for the high-level certification images. Then the synthetic encoded complex amplitude is iteratively encoded into two phase-type ciphertexts located in two different transform planes. During high-level authentication, when the two phase-type ciphertexts and the high-level decryption key are presented to the system and then the Fresnel transform is carried out, a meaningful image with good quality and a high correlation coefficient with the original certification image can be recovered in the output plane. Similar to the procedure of high-level authentication, in the case of low-level authentication with the aid of a low-level decryption key, no significant or meaningful information is retrieved, but it can result in a remarkable peak output in the nonlinear correlation coefficient of the output image and the corresponding original certification image. Therefore, the method realizes different levels of accessibility to the original certification image for different authority levels with the same cascaded multilevel architecture.

  8. Dynamical implications of sample shape for avalanches in 2-dimensional random-field Ising model with saw-tooth domain wall

    Science.gov (United States)

    Tadić, Bosiljka

    2018-03-01

    We study the dynamics of a built-in domain wall (DW) in 2-dimensional disordered ferromagnets with different sample shapes, using the random-field Ising model on a square lattice rotated by 45 degrees. The saw-tooth DW of length Lx is created along one side and swept through the sample by slow ramping of the external field until complete magnetisation reversal and annihilation of the wall at the open top boundary, at a distance Ly. Fixing the number of spins N = Lx × Ly = 10⁶ and the random-field distribution at a value above the critical disorder, we vary the ratio of the DW length to the annihilation distance in the range Lx/Ly ∈ [1/16, 16]. Periodic boundary conditions are applied in the y-direction, so these ratios correspond to different sample shapes, i.e., surfaces of cylinders with changing perimeter Lx and height Ly. We analyse the avalanches of DW slips between successive field updates, and the multifractal structure of the magnetisation fluctuation time series. Our main finding is that the domain-wall lengths realised in different sample shapes affect the dynamics at all scales. Moreover, the domain-wall motion at the beginning of the hysteresis loop (HLB) probes the disorder, resulting in fluctuations that are significantly different from the large avalanches in the central part of the loop (HLC), where strong fields dominate. Specifically, the fluctuations in the HLB exhibit a wide multifractal spectrum, which shifts towards higher values of the exponents when the DW length is reduced. The avalanche distributions in these segments of the loop obey power-law decay with exponential cutoffs, with exponents firmly in the mean-field universality class for long DWs. In contrast, the avalanches in the HLC obey a Tsallis density distribution with power-law tails, which indicates new categories of scale-invariant behaviour for different ratios Lx/Ly. The large fluctuations in the HLC, on the other

  9. Random glucose is useful for individual prediction of type 2 diabetes: results of the Study of Health in Pomerania (SHIP).

    Science.gov (United States)

    Kowall, Bernd; Rathmann, Wolfgang; Giani, Guido; Schipf, Sabine; Baumeister, Sebastian; Wallaschofski, Henri; Nauck, Matthias; Völzke, Henry

    2013-04-01

    Random glucose is widely used in routine clinical practice. We investigated whether this non-standardized glycemic measure is useful for individual diabetes prediction. The Study of Health in Pomerania (SHIP), a population-based cohort study in north-east Germany, included 3107 diabetes-free persons aged 31-81 years at baseline in 1997-2001. 2475 persons participated at 5-year follow-up and gave self-reports of incident diabetes. For the total sample and for subjects aged ≥50 years, statistical properties of prediction models with and without random glucose were compared. A basic model (including age, sex, diabetes of parents, hypertension and waist circumference) and a comprehensive model (additionally including various lifestyle variables and blood parameters, but not HbA1c) performed statistically significantly better after adding random glucose (e.g., the area under the receiver-operating curve (AROC) increased from 0.824 to 0.856 after adding random glucose to the comprehensive model in the total sample). Likewise, adding random glucose to prediction models which included HbA1c led to significant improvements of predictive ability (e.g., for subjects ≥50 years, AROC increased from 0.824 to 0.849 after adding random glucose to the comprehensive model+HbA1c). Random glucose is useful for individual diabetes prediction, and improves prediction models including HbA1c. Copyright © 2012 Primary Care Diabetes Europe. Published by Elsevier Ltd. All rights reserved.
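
    Schematically, the comparison above amounts to fitting a diabetes prediction model with and without random glucose and comparing the areas under the ROC curve. The sketch below does this on synthetic data; the predictors, coefficients and resulting AROC values are invented and do not reproduce the SHIP models.

      # Schematic AROC comparison of a basic model with and without random glucose.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(3)
      n = 2475
      age = rng.normal(55, 12, n)
      waist = rng.normal(95, 12, n)
      glucose = rng.normal(5.8, 1.2, n) + 0.01 * (waist - 95)   # random glucose, mmol/l
      logit = -11 + 0.04 * age + 0.03 * waist + 0.6 * glucose
      diabetes = rng.random(n) < 1 / (1 + np.exp(-logit))       # simulated incident diabetes

      basic = np.column_stack([age, waist])
      full = np.column_stack([age, waist, glucose])
      for name, X in [("basic", basic), ("basic + random glucose", full)]:
          p = LogisticRegression(max_iter=1000).fit(X, diabetes).predict_proba(X)[:, 1]
          print(name, "AROC =", round(roc_auc_score(diabetes, p), 3))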

  10. Cognitive training in Alzheimer's disease: a controlled randomized study.

    Science.gov (United States)

    Giovagnoli, A R; Manfredi, V; Parente, A; Schifano, L; Oliveri, S; Avanzini, G

    2017-08-01

    This controlled randomized single-blind study evaluated the effects of cognitive training (CT), compared to active music therapy (AMT) and neuroeducation (NE), on initiative in patients with mild to moderate Alzheimer's disease (AD). Secondarily, we explored the effects of CT on episodic memory, mood, and social relationships. Thirty-nine AD patients were randomly assigned to CT, AMT, or NE. Each treatment lasted 3 months. Before, at the end, and 3 months after treatment, neuropsychological tests and self-rated scales assessed initiative, episodic memory, depression, anxiety, and social relationships. At the end of the CT, initiative significantly improved, whereas, at the end of AMT and NE, it was unchanged. Episodic memory showed no changes at the end of CT or AMT and a worsening after NE. The rates of the patients with clinically significant improvement of initiative were greater after CT (about 62%) than after AMT (about 8%) or NE (none). At the 3-month follow-up, initiative and episodic memory declined in all patients. Mood and social relationships improved in the three groups, with greater changes after AMT or NE. In patients with mild to moderate AD, CT can improve initiative and stabilize memory, while the non-cognitive treatments can ameliorate the psychosocial aspects. The combining of CT and non-cognitive treatments may have useful clinical implications.

  11. A prospective, randomized multicenter study comparing APD and CAPD treatment

    DEFF Research Database (Denmark)

    Bro, S; Bjorner, J B; Tofte-Jensen, P

    2000-01-01

    OBJECTIVE: To compare APD and CAPD treatment with respect to quality of life and clinical outcomes in relation to therapy costs. DESIGN: A prospective, randomized multicenter study. SETTING: Three Danish CAPD units. PATIENTS: Thirty-four adequately dialyzed patients with high or high-average peritoneal transport characteristics were included... MAIN OUTCOME MEASURES: Quality-of-life parameters, dialysis-related complications, dialysis-related expenses. Quality-of-life parameters were assessed at baseline and after 6 months by the self-administered short-form SF-36 generic health survey questionnaire supplemented with disease- and treatment-specific questions. Therapy costs were compared by evaluating dialysis-related expenses. RESULTS: The quality-of-life studies showed that significantly more time for work, family, and social activities was available to patients on APD compared to those on CAPD (p ...

  12. Empirical evidence of study design biases in randomized trials

    DEFF Research Database (Denmark)

    Page, Matthew J.; Higgins, Julian P. T.; Clayton, Gemma

    2016-01-01

    search September 2012), and searched Ovid MEDLINE and Ovid EMBASE for studies indexed from Jan 2012-May 2015. Data were extracted by one author and verified by another. We combined estimates of average bias (e.g. ratio of odds ratios (ROR) or difference in standardised mean differences (dSMD)) in meta-analyses using the random-effects model. Analyses were stratified by type of outcome ("mortality" versus "other objective" versus "subjective"). Direction of effect was standardised so that ROR ... SMD ... studies). For these characteristics, the average bias appeared to be larger in trials of subjective outcomes compared with other objective outcomes. Also, intervention effects for subjective outcomes appear to be exaggerated in trials with lack of/unclear blinding of participants (versus blinding) (dSMD...

  13. Systematic studies of small scintillators for new sampling calorimeter

    Indian Academy of Sciences (India)

    A new sampling calorimeter using very thin scintillators and the multi-pixel photon counter (MPPC) has been proposed to produce better position resolution for the International Linear Collider (ILC) experiment. As part of this R&D study, small plastic scintillators of different sizes, thickness and wrapping reflectors are ...

  14. Astronauts Armstrong and Aldrin study rock samples during field trip

    Science.gov (United States)

    1969-01-01

    Astronaut Neil Armstrong, commander of the Apollo 11 lunar landing mission, and Astronaut Edwin Aldrin, Lunar module pilot for Apollo 11, study rock samples during a geological field trip to the Quitman Mountains area near the Fort Quitman ruins in far west Texas.

  15. Sampling challenges in a study examining refugee resettlement.

    Science.gov (United States)

    Sulaiman-Hill, Cheryl Mr; Thompson, Sandra C

    2011-03-15

    As almost half of all refugees currently under United Nations protection are from Afghanistan or Iraq and significant numbers have already been resettled outside the region of origin, it is likely that future research will examine their resettlement needs. A number of methodological challenges confront researchers working with culturally and linguistically diverse groups; however, few detailed articles are available to inform other studies. The aim of this paper is to outline challenges with sampling and recruitment of socially invisible refugee groups, describing the method adopted for a mixed methods exploratory study assessing mental health, subjective wellbeing and resettlement perspectives of Afghan and Kurdish refugees living in New Zealand and Australia. Sampling strategies used in previous studies with similar refugee groups were considered before determining the approach to recruitment. A snowball approach was adopted for the study, with multiple entry points into the communities being used to choose as wide a range of people as possible to provide further contacts and reduce selection bias. Census data was used to assess the representativeness of the sample. A sample of 193 former refugee participants was recruited in Christchurch (n = 98) and Perth (n = 95); 47% were of Afghan and 53% of Kurdish ethnicity. A good gender balance (males 52%, females 48%) was achieved overall, mainly as a result of the sampling method used. Differences in the demographic composition of groups in each location were observed, especially in relation to the length of time spent in a refugee situation and time since arrival, reflecting variations in national humanitarian quota intakes. Although some measures were problematic, Census data comparison to assess reasonable representativeness of the study sample was generally reassuring. Snowball sampling, with multiple initiation points to reduce selection bias, was necessary to locate and identify participants, provide reassurance and

  16. Sampling challenges in a study examining refugee resettlement

    Directory of Open Access Journals (Sweden)

    Thompson Sandra C

    2011-03-01

    Full Text Available Abstract Background As almost half of all refugees currently under United Nations protection are from Afghanistan or Iraq and significant numbers have already been resettled outside the region of origin, it is likely that future research will examine their resettlement needs. A number of methodological challenges confront researchers working with culturally and linguistically diverse groups; however, few detailed articles are available to inform other studies. The aim of this paper is to outline challenges with sampling and recruitment of socially invisible refugee groups, describing the method adopted for a mixed methods exploratory study assessing mental health, subjective wellbeing and resettlement perspectives of Afghan and Kurdish refugees living in New Zealand and Australia. Sampling strategies used in previous studies with similar refugee groups were considered before determining the approach to recruitment. Methods A snowball approach was adopted for the study, with multiple entry points into the communities being used to choose as wide a range of people as possible to provide further contacts and reduce selection bias. Census data was used to assess the representativeness of the sample. Results A sample of 193 former refugee participants was recruited in Christchurch (n = 98) and Perth (n = 95); 47% were of Afghan and 53% of Kurdish ethnicity. A good gender balance (males 52%, females 48%) was achieved overall, mainly as a result of the sampling method used. Differences in the demographic composition of groups in each location were observed, especially in relation to the length of time spent in a refugee situation and time since arrival, reflecting variations in national humanitarian quota intakes. Although some measures were problematic, Census data comparison to assess reasonable representativeness of the study sample was generally reassuring. Conclusions Snowball sampling, with multiple initiation points to reduce selection bias, was

  17. The Move from Accuracy Studies to Randomized Trials in PET

    DEFF Research Database (Denmark)

    Siepe, Bettina; Hoilund-Carlsen, Poul Flemming; Gerke, Oke

    2014-01-01

    an important role in informing guideline developers and policy makers. Our aim was to investigate how far the nuclear medicine community has come on its way from accuracy studies to RCTs and which issues we have to take into account in planning future studies. METHODS: We conducted a systematic review...... evaluation. Choice of patient-important outcomes and sufficient sample sizes are crucial issues in planning RCTs to demonstrate the clinical benefit of using PET....

  18. Prevalence and predictors of Video Game Addiction: A study based on a national sample of Gamers.

    OpenAIRE

    Wittek, Charlotte Thoresen; Finserås, Turi Reiten; Pallesen, Ståle; Mentzoni, Rune; Hanss, Daniel; Griffiths, Mark D.; Molde, Helge

    2015-01-01

    Video gaming has become a popular leisure activity in many parts of the world, and an increasing number of empirical studies examine the small minority that appears to develop problems as a result of excessive gaming. This study investigated prevalence rates and predictors of video game addiction in a sample of gamers, randomly selected from the National Population Registry of Norway (N = 3389). Results showed there were 1.4 % addicted gamers, 7.3 % problem gamers, 3.9 % engaged gamers, and 8...

  19. Study of β-NMR for Liquid Biological Samples

    CERN Document Server

    Beattie, Caitlin

    2017-01-01

    β-NMR is an exotic form of NMR spectroscopy that allows for the characterization of matter based on the anisotropic β-decay of radioactive probe nuclei. This has been shown to be an effective spectroscopic technique for many different compounds, but its use for liquid biological samples is relatively unexplored. The work at the VITO line of ISOLDE seeks to employ this technique to study such samples. Currently, preparations are being made for an experiment to characterize DNA G-quadruplexes and their interactions with stabilizing cations. More specifically, the work in which I engaged as a summer student focused on the experiment’s liquid handling system and the stability of the relevant biological samples under vacuum.

  20. Why choose Random Forest to predict rare species distribution with few samples in large undersampled areas? Three Asian crane species models provide supporting evidence

    Directory of Open Access Journals (Sweden)

    Chunrong Mi

    2017-01-01

    Full Text Available Species distribution models (SDMs) have become an essential tool in ecology, biogeography, evolution and, more recently, in conservation biology. How to generalize species distributions in large undersampled areas, especially with few samples, is a fundamental issue of SDMs. In order to explore this issue, we used the best available presence records for the Hooded Crane (Grus monacha, n = 33), White-naped Crane (Grus vipio, n = 40), and Black-necked Crane (Grus nigricollis, n = 75) in China as three case studies, employing four powerful and commonly used machine learning algorithms to map the breeding distributions of the three species: TreeNet (Stochastic Gradient Boosting, Boosted Regression Tree Model), Random Forest, CART (Classification and Regression Tree) and Maxent (Maximum Entropy Models). In addition, we developed an ensemble forecast by averaging the predicted probabilities of the above four models. Commonly used model performance metrics (Area under ROC (AUC) and true skill statistic (TSS)) were employed to evaluate model accuracy. The latest satellite tracking data and compiled literature data were used as two independent testing datasets to confront model predictions. We found Random Forest demonstrated the best performance for most assessment methods, provided a better model fit to the testing data, and achieved better species range maps for each crane species in undersampled areas. Random Forest has been generally available for more than 20 years and has been known to perform extremely well in ecological predictions. However, while increasingly on the rise, its potential is still widely underused in conservation, (spatial) ecological applications and for inference. Our results show that it informs ecological and biogeographical theories as well as being suitable for conservation applications, specifically when the study area is undersampled. This method helps to save model-selection time and effort, and allows robust and rapid
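
    A minimal presence/background workflow in the spirit of the comparison above is sketched below: a Random Forest is trained on a small number of presence points plus random background points drawn over synthetic "environmental" predictors, and evaluated with AUC on held-out data. The data are simulated; this is not the crane dataset or the authors' modelling pipeline.

      # Sketch of a Random Forest species distribution model with few presence records.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(4)
      n_presence, n_background = 40, 400          # few presences, many background points
      presence = rng.normal(loc=[2.0, 0.5, -1.0], scale=0.7, size=(n_presence, 3))
      background = rng.normal(loc=0.0, scale=1.5, size=(n_background, 3))
      X = np.vstack([presence, background])
      y = np.r_[np.ones(n_presence), np.zeros(n_background)]

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
      rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
      print("held-out AUC:", round(roc_auc_score(y_te, rf.predict_proba(X_te)[:, 1]), 3))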

  1. Why choose Random Forest to predict rare species distribution with few samples in large undersampled areas? Three Asian crane species models provide supporting evidence.

    Science.gov (United States)

    Mi, Chunrong; Huettmann, Falk; Guo, Yumin; Han, Xuesong; Wen, Lijia

    2017-01-01

    Species distribution models (SDMs) have become an essential tool in ecology, biogeography, evolution and, more recently, in conservation biology. How to generalize species distributions in large undersampled areas, especially with few samples, is a fundamental issue of SDMs. In order to explore this issue, we used the best available presence records for the Hooded Crane (Grus monacha, n = 33), White-naped Crane (Grus vipio, n = 40), and Black-necked Crane (Grus nigricollis, n = 75) in China as three case studies, employing four powerful and commonly used machine learning algorithms to map the breeding distributions of the three species: TreeNet (Stochastic Gradient Boosting, Boosted Regression Tree Model), Random Forest, CART (Classification and Regression Tree) and Maxent (Maximum Entropy Models). In addition, we developed an ensemble forecast by averaging the predicted probabilities of the above four models. Commonly used model performance metrics (Area under ROC (AUC) and true skill statistic (TSS)) were employed to evaluate model accuracy. The latest satellite tracking data and compiled literature data were used as two independent testing datasets to confront model predictions. We found Random Forest demonstrated the best performance for most assessment methods, provided a better model fit to the testing data, and achieved better species range maps for each crane species in undersampled areas. Random Forest has been generally available for more than 20 years and has been known to perform extremely well in ecological predictions. However, while increasingly on the rise, its potential is still widely underused in conservation, (spatial) ecological applications and for inference. Our results show that it informs ecological and biogeographical theories as well as being suitable for conservation applications, specifically when the study area is undersampled. This method helps to save model-selection time and effort, and allows robust and rapid

  2. Building Kindergartners' Number Sense: A Randomized Controlled Study.

    Science.gov (United States)

    Jordan, Nancy C; Glutting, Joseph; Dyson, Nancy; Hassinger-Das, Brenna; Irwin, Casey

    2012-08-01

    Math achievement in elementary school is mediated by performance and growth in number sense during kindergarten. The aim of the present study was to test the effectiveness of a targeted small group number sense intervention for high-risk kindergartners from low-income communities. Children were randomly assigned to one of three groups ( n = 44 in each group): a number sense intervention group, a language intervention group, or a business as usual control group. Accounting for initial skill level in mathematical knowledge, children who received the number sense intervention performed better than controls at immediate post test, with meaningful effects on measures of number competencies and general math achievement. Many of the effects held eight weeks after the intervention was completed, suggesting that children internalized what they had learned. There were no differences between the language and control groups on any math-related measures.

  3. Building Kindergartners’ Number Sense: A Randomized Controlled Study

    Science.gov (United States)

    Jordan, Nancy C.; Glutting, Joseph; Dyson, Nancy; Hassinger-Das, Brenna; Irwin, Casey

    2015-01-01

    Math achievement in elementary school is mediated by performance and growth in number sense during kindergarten. The aim of the present study was to test the effectiveness of a targeted small group number sense intervention for high-risk kindergartners from low-income communities. Children were randomly assigned to one of three groups (n = 44 in each group): a number sense intervention group, a language intervention group, or a business as usual control group. Accounting for initial skill level in mathematical knowledge, children who received the number sense intervention performed better than controls at immediate post test, with meaningful effects on measures of number competencies and general math achievement. Many of the effects held eight weeks after the intervention was completed, suggesting that children internalized what they had learned. There were no differences between the language and control groups on any math-related measures. PMID:25866417

  4. Logistic Regression with Multiple Random Effects: A Simulation Study of Estimation Methods and Statistical Packages

    Science.gov (United States)

    Kim, Yoonsang; Emery, Sherry

    2013-01-01

    Several statistical packages are capable of estimating generalized linear mixed models and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods’ performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages—SAS GLIMMIX Laplace and SuperMix Gaussian quadrature—perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes. PMID:24288415

  5. Logistic Regression with Multiple Random Effects: A Simulation Study of Estimation Methods and Statistical Packages.

    Science.gov (United States)

    Kim, Yoonsang; Choi, Young-Ku; Emery, Sherry

    2013-08-01

    Several statistical packages are capable of estimating generalized linear mixed models and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods' performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages, SAS GLIMMIX Laplace and SuperMix Gaussian quadrature, perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes.
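
    The kind of data-generating model used in such simulations can be sketched as a two-level logistic model in which each cluster carries three correlated random effects (a random intercept and two random slopes) drawn from a common covariance matrix. The covariance values, fixed effects and sample sizes below are our own choices, not the authors' simulation design, and the fitting step (e.g. in SAS GLIMMIX or SuperMix) is omitted.

      # Simulate two-level binary data with three correlated random effects per cluster.
      import numpy as np

      rng = np.random.default_rng(5)
      n_clusters, n_per_cluster = 100, 30
      # Covariance of the correlated random effects (intercept, slope1, slope2).
      G = np.array([[1.0, 0.3, 0.2],
                    [0.3, 0.5, 0.1],
                    [0.2, 0.1, 0.5]])
      b = rng.multivariate_normal(np.zeros(3), G, size=n_clusters)

      rows = []
      for j in range(n_clusters):
          x1 = rng.normal(size=n_per_cluster)
          x2 = rng.binomial(1, 0.5, size=n_per_cluster)
          eta = (-0.5 + b[j, 0]) + (0.8 + b[j, 1]) * x1 + (0.4 + b[j, 2]) * x2
          y = rng.random(n_per_cluster) < 1 / (1 + np.exp(-eta))
          rows.append(np.column_stack([np.full(n_per_cluster, j), x1, x2, y]))
      data = np.vstack(rows)   # columns: cluster id, x1, x2, y; ready to export to a fitting package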

  6. Sample size determinations for group-based randomized clinical trials with different levels of data hierarchy between experimental and control arms.

    Science.gov (United States)

    Heo, Moonseong; Litwin, Alain H; Blackstock, Oni; Kim, Namhee; Arnsten, Julia H

    2017-02-01

    We derived sample size formulae for detecting main effects in group-based randomized clinical trials with different levels of data hierarchy between experimental and control arms. Such designs are necessary when experimental interventions need to be administered to groups of subjects whereas control conditions need to be administered to individual subjects. This type of trial, often referred to as a partially nested or partially clustered design, has been implemented for management of chronic diseases such as diabetes and is beginning to emerge more commonly in wider clinical settings. Depending on the research setting, the level of hierarchy of data structure for the experimental arm can be three or two, whereas that for the control arm is two or one. Such different levels of data hierarchy assume correlation structures of outcomes that are different between arms, regardless of whether research settings require two or three level data structure for the experimental arm. Therefore, the different correlations should be taken into account for statistical modeling and for sample size determinations. To this end, we considered mixed-effects linear models with different correlation structures between experimental and control arms to theoretically derive and empirically validate the sample size formulae with simulation studies.
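
    One commonly cited way to reason about such designs, which may differ from the authors' exact derivations, is to let only the experimental (clustered) arm carry a design effect of 1 + (m - 1)ρ and to combine the two arms in an unequal-variance two-sample comparison. The sketch below implements that illustrative calculation with invented inputs.

      # Illustrative sample size for a partially nested trial (not the authors' formulae).
      import math
      from scipy.stats import norm

      def n_per_arm_partially_nested(delta, sd, m, icc, alpha=0.05, power=0.80):
          z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
          deff = 1 + (m - 1) * icc          # design effect in the clustered (experimental) arm only
          # Variance of the difference in means: sd^2/n (control) + deff*sd^2/n (experimental),
          # assuming the same number of subjects per arm.
          n = z**2 * sd**2 * (1 + deff) / delta**2
          return math.ceil(n)

      n = n_per_arm_partially_nested(delta=0.4, sd=1.0, m=10, icc=0.05)
      print("subjects per arm ≈", n, "->", math.ceil(n / 10), "groups in the experimental arm")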

  7. Partner randomized controlled trial: study protocol and coaching intervention

    Directory of Open Access Journals (Sweden)

    Garbutt Jane M

    2012-04-01

    Full Text Available Abstract Background Many children with asthma live with frequent symptoms and activity limitations, and visits for urgent care are common. Many pediatricians do not regularly meet with families to monitor asthma control, identify concerns or problems with management, or provide self-management education. Effective interventions to improve asthma care such as small group training and care redesign have been difficult to disseminate into office practice. Methods and design This paper describes the protocol for a randomized controlled trial (RCT) to evaluate a 12-month telephone-coaching program designed to support primary care management of children with persistent asthma and subsequently to improve asthma control and disease-related quality of life and reduce urgent care events for asthma care. Randomization occurred at the practice level with eligible families within a practice having access to the coaching program or to usual care. The coaching intervention was based on the transtheoretical model of behavior change. Targeted behaviors included (1) effective use of controller medications, (2) effective use of rescue medications and (3) monitoring to ensure optimal control. Trained lay coaches provided parents with education and support for asthma care, tailoring the information provided and frequency of contact to the parent's readiness to change their child's day-to-day asthma management. Coaching calls varied in frequency from weekly to monthly. For each participating family, follow-up measurements were obtained at 12 and 24 months after enrollment in the study during a telephone interview. The primary outcomes were the mean change in (1) the child's asthma control score, (2) the parent's quality of life score, and (3) the number of urgent care events assessed at 12 and 24 months. Secondary outcomes reflected adherence to guideline recommendations by the primary care pediatricians and included the proportion of children prescribed controller medications

  8. Feasibility of exercising adults with asthma: a randomized pilot study

    Directory of Open Access Journals (Sweden)

    Boyd Amy

    2012-08-01

    Full Text Available Abstract Background Aerobic exercise appears to have clinical benefits for many asthmatics, yet a complete understanding of the mechanisms underlying these benefits has not been elucidated at this time. Purpose The objective of this study was to determine feasibility for a larger, future study that will define the effect of aerobic exercise on cellular, molecular, and functional measures in adults with mild-moderate asthma. Design Recruited subjects were randomized into usual care (sedentary) or usual care with moderate-intensity aerobic exercise treatment groups. Setting / Participants Nineteen adults with mild-moderate asthma but without a recent history of exercise were recruited at the UAB Lung Health Center, Birmingham, AL. Intervention The exercise group underwent a 12 week walking program exercising at 60 – 75% of maximum heart rate (HRmax). Subjects self-monitored HRmax levels using heart rate monitors; exercise diaries and recreation center sign-in logs were also used. Main outcome measures Functional measures, including lung function and asthma control scores, were evaluated for all subjects at pre- and post-study time-points; fitness measures were also assessed for subjects in the exercise group. Peripheral blood and nasal lavage fluid were collected from all subjects at pre- and post-study visits in order to evaluate cellular and molecular measures, including cell differentials and eosinophilic cationic protein (ECP). Results Sixteen subjects completed the prescribed protocol. Results show that subjects randomized to the exercise group adhered well (80%) to the exercise prescription and exhibited a trend toward improved fitness levels upon study completion. Both groups exhibited improvements in ACQ scores. No changes were observed in lung function (FEV1, FEV1/FVC), cell differentials, or ECP between groups. Conclusions Results indicate that a moderate intensity aerobic exercise training program may improve asthma control and fitness

  9. Experimental Study of Impregnation Birch and Aspen Samples

    Directory of Open Access Journals (Sweden)

    Igor Vladislavovich Grigorev

    2014-10-01

    Full Text Available An experimental study of wood impregnation was implemented by applying centrifugal methods. The impregnants were a 10% aqueous solution of potassium chloride and a 2% aqueous solution of borax. Birch (Betula pendula) and aspen (Populus tremula) wood samples with different moisture contents were tested. The impregnation time in the centrifugal device was 30 seconds, repeated 21 times, and the samples were measured after every 30 seconds. The experimental results were fitted to a nonlinear filtration law, which indicated that the centrifugal wood impregnation was dependent on wood species, wood moisture, rotational speed, and radius. The rotational speed and centrifuge radius for impregnating aspen and birch of varying lengths and moisture contents under the nonlinear impregnant filtration law can be determined using the example charts developed and presented in this study.

  10. Microanalysis study of archaeological mural samples containing Maya blue pigment

    International Nuclear Information System (INIS)

    Sanchez del Rio, M.; Martinetto, P.; Somogyi, A.; Reyes-Valerio, C.; Dooryhee, E.; Peltier, N.; Alianelli, L.; Moignard, B.; Pichon, L.; Calligaro, T.; Dran, J.-C.

    2004-01-01

    Elemental analysis by X-ray fluorescence and particle induced X-ray emission is applied to the study of several Mesoamerican mural samples containing blue pigments. The most characteristic blue pigment is Maya blue, a very stable organo-clay complex original from Maya culture and widely used in murals, pottery and sculptures in a vast region of Mesoamerica during the pre-hispanic time (from VIII century) and during the colonization until 1580. The mural samples come from six different archaeological sites (four pre-hispanic and two from XVI century colonial convents). The correlation between the presence of some elements and the pigment colour is discussed. From the comparative study of the elemental concentration, some conclusions are drawn on the nature of the pigments and the technology used

  11. Microanalysis study of archaeological mural samples containing Maya blue pigment

    Science.gov (United States)

    Sánchez del Río, M.; Martinetto, P.; Somogyi, A.; Reyes-Valerio, C.; Dooryhée, E.; Peltier, N.; Alianelli, L.; Moignard, B.; Pichon, L.; Calligaro, T.; Dran, J.-C.

    2004-10-01

    Elemental analysis by X-ray fluorescence and particle induced X-ray emission is applied to the study of several Mesoamerican mural samples containing blue pigments. The most characteristic blue pigment is Maya blue, a very stable organo-clay complex original from Maya culture and widely used in murals, pottery and sculptures in a vast region of Mesoamerica during the pre-hispanic time (from VIII century) and during the colonization until 1580. The mural samples come from six different archaeological sites (four pre-hispanic and two from XVI century colonial convents). The correlation between the presence of some elements and the pigment colour is discussed. From the comparative study of the elemental concentration, some conclusions are drawn on the nature of the pigments and the technology used.

  12. Microanalysis study of archaeological mural samples containing Maya blue pigment

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez del Rio, M. [ESRF, BP220, F-38043 Grenoble (France)]. E-mail: srio@esrf.fr; Martinetto, P. [Laboratoire de Cristallographie, CNRS, BP166 F-30842 Grenoble (France); Somogyi, A. [ESRF, BP220, F-38043 Grenoble (France); Reyes-Valerio, C. [INAH, Mexico DF (Mexico); Dooryhee, E. [Laboratoire de Cristallographie, CNRS, BP166 F-30842 Grenoble (France); Peltier, N. [Laboratoire de Cristallographie, CNRS, BP166 F-30842 Grenoble (France); Alianelli, L. [INFM-OGG c/o ESRF, BP220, F-38043 Grenoble Cedex (France); Moignard, B. [C2RMF, 6 Rue des Pyramides, F-75041 Paris Cedex 01 (France); Pichon, L. [C2RMF, 6 Rue des Pyramides, F-75041 Paris Cedex 01 (France); Calligaro, T. [C2RMF, 6 Rue des Pyramides, F-75041 Paris Cedex 01 (France); Dran, J.-C. [C2RMF, 6 Rue des Pyramides, F-75041 Paris Cedex 01 (France)

    2004-10-08

    Elemental analysis by X-ray fluorescence and particle induced X-ray emission is applied to the study of several Mesoamerican mural samples containing blue pigments. The most characteristic blue pigment is Maya blue, a very stable organo-clay complex original from Maya culture and widely used in murals, pottery and sculptures in a vast region of Mesoamerica during the pre-hispanic time (from VIII century) and during the colonization until 1580. The mural samples come from six different archaeological sites (four pre-hispanic and two from XVI century colonial convents). The correlation between the presence of some elements and the pigment colour is discussed. From the comparative study of the elemental concentration, some conclusions are drawn on the nature of the pigments and the technology used.

  13. Yoga in Correctional Settings: A Randomized Controlled Study

    Directory of Open Access Journals (Sweden)

    Nóra Kerekes

    2017-10-01

    Full Text Available Background The effect of yoga in the reduction of depressive symptoms, anxiety, stress, anger as well as in the increased ability of behavioral control has been shown. These effects of yoga are highly relevant for prison inmates who often have poor mental health and low impulse control. While it has been shown that yoga and meditation can be effective in improving subjective well-being, mental health, and executive functioning within prison populations, only a limited number of studies have proved this, using randomized controlled settings. Methods A total of 152 participants from nine Swedish correctional facilities were randomly assigned to a 10-week yoga group (one class a week; N = 77) or a control group (N = 75). Before and after the intervention period, participants answered questionnaires measuring stress, aggression, affective states, sleep quality, and psychological well-being and completed a computerized test measuring attention and impulsivity. Results After the intervention period, significant improvements were found on 13 of the 16 variables within the yoga group (e.g., less perceived stress, better sleep quality, an increased psychological and emotional well-being, less aggressive and antisocial behavior) and on two within the control group. Compared to the control group, yoga class participants reported significantly improved emotional well-being and less antisocial behavior after 10 weeks of yoga. They also showed improved performance on the computerized test that measures attention and impulse control. Conclusion It can be concluded that the yoga practiced in Swedish correctional facilities has positive effects on inmates’ well-being and on considerable risk factors associated with recidivism, such as impulsivity and antisocial behavior. Accordingly, the results show that yoga practice can play an important part in the rehabilitation of prison inmates.

  14. People's Intuitions about Randomness and Probability: An Empirical Study

    Science.gov (United States)

    Lecoutre, Marie-Paule; Rovira, Katia; Lecoutre, Bruno; Poitevineau, Jacques

    2006-01-01

    What people mean by randomness should be taken into account when teaching statistical inference. This experiment explored subjective beliefs about randomness and probability through two successive tasks. Subjects were asked to categorize 16 familiar items: 8 real items from everyday life experiences, and 8 stochastic items involving a repeatable…

  15. Manual and Electroacupuncture for Labour Pain: Study Design of a Longitudinal Randomized Controlled Trial

    Directory of Open Access Journals (Sweden)

    Linda Vixner

    2012-01-01

    Full Text Available Introduction. Results from previous studies on acupuncture for labour pain are contradictory and lack important information on methodology. However, studies indicate that acupuncture has a positive effect on women’s experiences of labour pain. The aim of the present study was to evaluate the efficacy of two different acupuncture stimulations, manual or electrical stimulation, compared with standard care in the relief of labour pain as the primary outcome. This paper will present in-depth information on the design of the study, following the CONSORT and STRICTA recommendations. Methods. The study was designed as a randomized controlled trial based on western medical theories. Nulliparous women with normal pregnancies admitted to the delivery ward after a spontaneous onset of labour were randomly allocated into one of three groups: manual acupuncture, electroacupuncture, or standard care. Sample size calculation gave 101 women in each group, including a total of 303 women. A Visual Analogue Scale was used for assessing pain every 30 minutes for five hours and thereafter every hour until birth. Questionnaires were distributed before treatment, directly after the birth, and at one day and two months postpartum. Blood samples were collected before and after the first treatment. This trial is registered at ClinicalTrials.gov: NCT01197950.

  16. Systematic screening with information and home sampling for genital Chlamydia trachomatis infections in young men and women in Norway: a randomized controlled trial.

    Science.gov (United States)

    Kløvstad, Hilde; Natås, Olav; Tverdal, Aage; Aavitsland, Preben

    2013-01-23

    As most genital Chlamydia trachomatis infections are asymptomatic, many patients do not seek health care for testing. Infections remain undiagnosed and untreated. We studied whether screening with information and home sampling resulted in more young people getting tested, diagnosed and treated for chlamydia in the three months following the intervention compared to the current strategy of testing in the health care system. We conducted a population based randomized controlled trial among all persons aged 18-25 years in one Norwegian county (41 519 persons). 10 000 persons (intervention) received an invitation by mail with chlamydia information and a mail-back urine sampling kit. 31 519 persons received no intervention and continued with usual care (control). All samples from both groups were analysed in the same laboratory. Information on treatment was obtained from the Norwegian Prescription Database (NorPD). We estimated risk ratios and risk differences of being tested, diagnosed and treated in the intervention group compared to the control group. In the intervention group 16.5% got tested and in the control group 3.4%, risk ratio 4.9 (95% CI 4.5-5.2). The intervention led to 2.6 (95% CI 2.0-3.4) times as many individuals being diagnosed and 2.5 (95% CI 1.9-3.4) times as many individuals receiving treatment for chlamydia compared to no intervention in the three months following the intervention. In Norway, systematic screening with information and home sampling results in more young people being tested, diagnosed and treated for chlamydia in the three months following the intervention than the current strategy of testing in the health care system. However, the study has not established that the intervention will reduce the chlamydia prevalence or the risk of complications from chlamydia.
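    For illustration only (not the trial's analysis code), the reported risk ratio can be reproduced approximately from the published percentages and denominators; the event counts below are reconstructed from those percentages and are therefore approximate.

        import math

        def risk_ratio(events_a, n_a, events_b, n_b, z=1.96):
            """Risk ratio of group A vs group B with a Wald 95% CI on the log scale."""
            rr = (events_a / n_a) / (events_b / n_b)
            se_log = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
            lo = math.exp(math.log(rr) - z * se_log)
            hi = math.exp(math.log(rr) + z * se_log)
            return rr, lo, hi

        # Event counts reconstructed from the reported proportions (approximate).
        print(risk_ratio(events_a=round(0.165 * 10_000), n_a=10_000,
                         events_b=round(0.034 * 31_519), n_b=31_519))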

  17. Effects of errorless skill learning in people with mild-to-moderate or severe dementia: a randomized controlled pilot study.

    NARCIS (Netherlands)

    Kessels, R.P.C.; Hensken, L.M.

    2009-01-01

    This pilot study examines whether learning without errors is advantageous compared to trial-and-error learning in people with dementia using a procedural task and a randomized case-control design. A sample of 60 people was recruited, consisting of 20 patients with severe dementia, 20 patients with

  18. Effects of errorless skill learning in people with mild-to-moderate or severe dementia: A randomized controlled pilot study

    NARCIS (Netherlands)

    Kessels, R.P.C.; Olde Hensken, L.M.G.

    2009-01-01

    This pilot study examines whether learning without errors is advantageous compared to trial-and-error learning in people with dementia using a procedural task and a randomized case-control design. A sample of 60 people was recruited, consisting of 20 patients with severe dementia, 20 patients with

  19. Effects of smartphone diaries and personal dosimeters on behavior in a randomized study of methods to document sunlight exposure

    DEFF Research Database (Denmark)

    Køster, Brian; Søndergaard, Jens; Nielsen, Jesper Bo

    2016-01-01

    study. We examined the effects of wearing dosimeters and filling out diaries, measurement period and recall effect on the sun-related behavior in Denmark in 2012. Our sample included 240 participants eligible by smartphone status and who took a vacation during weeks 26-32 in 2012, randomized by gender...

  20. Empirically simulated study to compare and validate sampling methods used in aerial surveys of wildlife populations

    NARCIS (Netherlands)

    Khaemba, W.M.; Stein, A.; Rasch, D.; Leeuw, de J.; Georgiadis, N.

    2001-01-01

    This paper compares the distribution, sampling and estimation of abundance for two animal species in an African ecosystem by means of an intensive simulation of the sampling process under a geographical information system (GIS) environment. It focuses on systematic and random sampling designs,

  1. Study of polyconsumption in a 2005 Bogota driver's sample

    Directory of Open Access Journals (Sweden)

    Óscar Armando Sánchez Cardozo

    2007-01-01

    Full Text Available Background. We know of no studies in our city that indicate the degree of polydrug consumption among motor vehicle drivers. Use of illicit drugs can in some cases explain the lack of agreement among the breath alcohol test, the blood alcohol concentration, and the clinical examination in drivers suspected of alcohol intoxication. In this report we focus on the use of illicit drugs and their clinical manifestations in a sample of motor vehicle drivers. Objective. To determine the consumption of illicit drugs in a sample of motor vehicle drivers suspected of alcohol intoxication. Materials and methods. We took a representative sample of 68 motor vehicle drivers. All of them underwent clinical examination, a breath alcohol test and measurement of blood alcohol concentration; urine samples were also collected. Five (5) substances were investigated according to their high prevalence in our population: opiates, marijuana, cocaine, amphetamines and benzodiazepines. Results. There were seven cases of illicit drug consumption; the most frequent combination found (5 of the 7 cases) was alcohol plus marijuana. In three cases we found consumption of three substances: two cases of alcohol, marijuana and cocaine, and one case of alcohol, benzodiazepine and cocaine. Conclusions. In the sample analyzed, the use of illicit drugs was 10.14%. The most common pattern of use was a depressant with a stimulant; when recent, it diminishes the neurological manifestations but does not affect the presentation or severity of rotational nystagmus. Alterations of higher mental functions were associated with consumption of alcohol plus benzodiazepines. Combining two depressants increases motor alterations and seems to alter higher mental functions. In the negative clinical examinations it was concluded that there was no alcohol intoxication and no

  2. Random and systematic sampling error when hooking fish to monitor skin fluke (Benedenia seriolae) and gill fluke (Zeuxapta seriolae) burden in Australian farmed yellowtail kingfish (Seriola lalandi).

    Science.gov (United States)

    Fensham, J R; Bubner, E; D'Antignana, T; Landos, M; Caraguel, C G B

    2018-05-01

    The Australian farmed yellowtail kingfish (Seriola lalandi, YTK) industry monitors skin fluke (Benedenia seriolae) and gill fluke (Zeuxapta seriolae) burden by pooling the fluke count of 10 hooked YTK. The random and systematic error of this sampling strategy was evaluated to assess potential impact on treatment decisions. Fluke abundance (fluke count per fish) in a study cage (estimated 30,502 fish) was assessed five times using the current sampling protocol, and its repeatability was estimated using the repeatability coefficient (CR) and the coefficient of variation (CV). Individual body weight, fork length, fluke abundance, prevalence, intensity (fluke count per infested fish) and density (fluke count per kg of fish) were compared between 100 hooked and 100 seined YTK (assumed representative of the entire population) to estimate potential selection bias. Depending on the fluke species and age category, CR (the expected difference in parasite count between two sampling iterations) ranged from 0.78 to 114 flukes per fish. Capturing YTK by hooking increased the selection of fish of a weight and length in the lowest 5th percentile of the cage (RR = 5.75, 95% CI: 2.06-16.03, P-value = 0.0001). These lower-end YTK carried on average an extra 31 juvenile and 6 adult Z. seriolae per kg of fish and an extra 3 juvenile and 0.4 adult B. seriolae per kg of fish, compared to the rest of the cage population. Hooking therefore biased sampling towards the smallest and most heavily infested fish in the population, resulting in poor repeatability (more variability amongst sampled fish) and an overestimation of parasite burden in the population. In this particular commercial situation these findings supported the health management program, whereas an underestimation of parasite burden could have had a production impact on the study population. In instances where fish populations and parasite burdens are more homogeneous, sampling error may be less severe. Sampling error when capturing fish
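    As a rough sketch, under assumptions, of how the repeatability coefficient (CR) and coefficient of variation (CV) described above could be computed from repeated pooled counts: the five counts below are invented, and the 2.77 factor is the usual Bland-Altman constant (1.96 times the square root of 2).

        import statistics as st

        counts = [212, 198, 240, 187, 225]   # hypothetical pooled counts (10 fish), 5 repeats

        mean = st.mean(counts)
        sd = st.stdev(counts)                # spread across repeated samplings of the cage
        cr = 2.77 * sd                       # 95% limit for the difference between 2 repeats
        cv = sd / mean

        print(f"mean={mean:.1f}  CR={cr:.1f}  CV={cv:.1%}")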

  3. Prospective randomized clinical studies involving reirradiation. Lessons learned

    Energy Technology Data Exchange (ETDEWEB)

    Nieder, Carsten [Nordland Hospital, Department of Oncology and Palliative Medicine, Bodoe (Norway); University of Tromsoe, Department of Clinical Medicine, Faculty of Health Sciences, Tromsoe (Norway); Langendijk, Johannes A. [University Medical Centre Groningen, Department of Radiation Oncology, Groningen (Netherlands); Guckenberger, Matthias [University Hospital Zuerich, Department of Radiation Oncology, Zuerich (Switzerland); Grosu, Anca L. [University Hospital Freiburg, Department of Radiation Oncology, Freiburg (Germany)

    2016-10-15

    Reirradiation is a potentially useful option for many patients with recurrent cancer. The purpose of this study was to review all recently published randomized trials in order to identify methodological strengths and weaknesses, comment on the results, clinical implications and open questions, and give advice for the planning of future trials. Systematic review of trials published between 2000 and 2015 (databases searched were PubMed, Scopus and Web of Science). We reviewed 9 trials, most of which addressed reirradiation of head and neck tumours. The median number of patients was 69. Trial design, primary endpoint and statistical hypotheses varied widely. The results contribute mainly to decision making for reirradiation of nasopharynx cancer and bone metastases. The trials with relatively long median follow-up confirm that serious toxicity remains a concern after high cumulative total doses. Multi-institutional collaboration is encouraged to complete sufficiently large trials. Despite a paucity of large randomized studies, reirradiation has been adopted in different clinical scenarios by many institutions. Typically, the patients have been assessed by multidisciplinary tumour boards and advanced technologies are used to create highly conformal dose distributions. (orig.) [German] Reirradiation can offer a useful option for many patients with recurrent malignancies. The purpose of this study was to assess all randomized trials published in the recent past, since their methodological strengths and weaknesses, results and the resulting implications and open questions can substantially influence the planning of future trials. Systematic review of all studies published between 2000 and 2015 (literature search via PubMed, Scopus and Web of Science). Nine studies were evaluated, which mainly enrolled patients with head and neck tumours. The median number of participants was 69. The

  4. Stratified random sampling plans designed to assist in the determination of radon and radon daughter concentrations in underground uranium mine atmosphere

    International Nuclear Information System (INIS)

    Makepeace, C.E.

    1981-01-01

    Sampling strategies for the monitoring of deleterious agents present in uranium mine air in underground and surface mining areas are described. These methods are designed to prevent overexposure of the lining of the respiratory system of uranium miners to ionizing radiation from radon and radon daughters, and whole body overexposure to external gamma radiation. A detailed description is provided of stratified random sampling monitoring methodology for obtaining baseline data to be used as a reference for subsequent compliance assessment

  5. [Identification and sampling of people with migration background for epidemiological studies in Germany].

    Science.gov (United States)

    Reiss, K; Makarova, N; Spallek, J; Zeeb, H; Razum, O

    2013-06-01

    In 2009, 19.6% of the population of Germany either had migrated themselves or were the offspring of people with migration experience. Migrants differ from the autochthonous German population in terms of health status, health awareness and health behaviour. To further investigate the health situation of migrants in Germany, epidemiological studies are needed. Such studies can employ existing databases which provide detailed information on migration status. Otherwise, onomastic or toponomastic procedures can be applied to identify people with migration background. If migrants have to be recruited into an epidemiological study, this can be done register-based (e. g., data from registration offices or telephone lists), based on residential location (random-route or random-walk procedure), via snowball sampling (e. g., through key persons) or via settings (e. g., school entry examination). An oversampling of people with migration background is not sufficient to avoid systematic bias in the sample due to non-participation. Additional measures have to be taken to increase access and raise participation rates. Personal contacting, multilingual instruments, multilingual interviewers and extensive public relations increase access and willingness to participate. Empirical evidence on 'successful' recruitment strategies for studies with migrants is still lacking in epidemiology and health sciences in Germany. The choice of the recruitment strategy as well as the measures to raise accessibility and willingness to participate depend on the available resources, the research question and the specific migrant target group. © Georg Thieme Verlag KG Stuttgart · New York.

  6. Glove failure in elective thyroid surgery: A prospective randomized study

    Directory of Open Access Journals (Sweden)

    Dariusz Timler

    2015-06-01

    Full Text Available Objectives: To analyze the perforation rate in sterile gloves used by surgeons in the operating theatre of the Department of Endocrinological and General Surgery of the Medical University of Lodz. Material and Methods: Randomized and controlled trial. This study analyses the incidence of tears in sterile surgical gloves used by surgeons during operations for 3 types of thyroid disease according to the 10th revision of the International Statistical Classification of Diseases and Related Health Problems (ICD-10) codes. Nine hundred seventy-two pairs (sets) of gloves were collected from 321 surgical procedures. All gloves were tested immediately following surgery using the water leak test (EN455-1) to detect leakage. Results: Glove perforation was detected in 89 of 972 glove sets (9.2%). Glove tears occurred significantly more often in the operator than in the 1st assistant (p < 0.001). The sites of perforation were localized mostly on the middle finger of the non-dominant hand (22.5%) and the non-dominant ring finger (17.9%). Conclusions: This study has shown that the role performed by the surgeon during the procedure (operator, 1st assistant) has a significant influence on the risk of glove perforation. Nearly 90% of glove perforations remain unnoticed during surgery.

  7. Evaluating effectiveness of down-sampling for stratified designs and unbalanced prevalence in Random Forest models of tree species distributions in Nevada

    Science.gov (United States)

    Elizabeth A. Freeman; Gretchen G. Moisen; Tracy S. Frescino

    2012-01-01

    Random Forests is frequently used to model species distributions over large geographic areas. Complications arise when data used to train the models have been collected in stratified designs that involve different sampling intensity per stratum. The modeling process is further complicated if some of the target species are relatively rare on the landscape leading to an...

  8. DNA methylation analysis from saliva samples for epidemiological studies.

    Science.gov (United States)

    Nishitani, Shota; Parets, Sasha E; Haas, Brian W; Smith, Alicia K

    2018-06-18

    Saliva is a non-invasive, easily accessible tissue, which is regularly collected in large epidemiological studies to examine genetic questions. Recently, it is becoming more common to use saliva to assess DNA methylation. However, DNA extracted from saliva is a mixture of both bacterial and human DNA derived from epithelial and immune cells in the mouth. Thus, there are unique challenges to using salivary DNA in methylation studies that can influence data quality. This study assesses: (1) quantification of human DNA after extraction; (2) delineation of human and bacterial DNA; (3) bisulfite conversion (BSC); (4) quantification of BSC DNA; (5) PCR amplification of BSC DNA from saliva; and (6) quantitation of DNA methylation with a targeted assay. The framework proposed will allow saliva samples to be more widely used in targeted epigenetic studies.

  9. Convergence analysis for Latin-hypercube lattice-sample selection strategies for 3D correlated random hydraulic-conductivity fields

    OpenAIRE

    Simuta-Champo, R.; Herrera-Zamarrón, G. S.

    2010-01-01

    The Monte Carlo technique provides a natural method for evaluating uncertainties. The uncertainty is represented by a probability distribution or by related quantities such as statistical moments. When the groundwater flow and transport governing equations are solved and the hydraulic conductivity field is treated as a random spatial function, the hydraulic head, velocities and concentrations also become random spatial functions. When that is the case, for the stochastic simulation of groundw...
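    As a hedged illustration of the sampling step only (not the cited study's flow model), Latin-hypercube and simple random draws of a log-normally distributed hydraulic conductivity can be generated as follows; all distribution parameters are placeholders.

        import numpy as np
        from scipy.stats import qmc, norm

        rng = np.random.default_rng(0)
        n, mu_lnK, sd_lnK = 50, -11.0, 1.5          # hypothetical ln-conductivity parameters

        # Latin hypercube: one stratum per draw along the unit interval, then transform
        u_lhs = qmc.LatinHypercube(d=1, seed=0).random(n).ravel()
        k_lhs = np.exp(norm.ppf(u_lhs, loc=mu_lnK, scale=sd_lnK))

        # Simple (independent) random sampling for comparison
        k_srs = np.exp(rng.normal(mu_lnK, sd_lnK, n))

        print("LHS mean K:", k_lhs.mean(), " SRS mean K:", k_srs.mean())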

  10. A novel approach to non-biased systematic random sampling: a stereologic estimate of Purkinje cells in the human cerebellum.

    Science.gov (United States)

    Agashiwala, Rajiv M; Louis, Elan D; Hof, Patrick R; Perl, Daniel P

    2008-10-21

    Non-biased systematic sampling using the principles of stereology provides accurate quantitative estimates of objects within neuroanatomic structures. However, the basic principles of stereology are not optimally suited for counting objects that selectively exist within a limited but complex and convoluted portion of the sample, such as occurs when counting cerebellar Purkinje cells. In an effort to quantify Purkinje cells in association with certain neurodegenerative disorders, we developed a new method for stereologic sampling of the cerebellar cortex, involving calculating the volume of the cerebellar tissues, identifying and isolating the Purkinje cell layer and using this information to extrapolate non-biased systematic sampling data to estimate the total number of Purkinje cells in the tissues. Using this approach, we counted Purkinje cells in the right cerebella of four human male control specimens, aged 41, 67, 70 and 84 years, and estimated the total Purkinje cell number for the four entire cerebella to be 27.03, 19.74, 20.44 and 22.03 million cells, respectively. The precision of the method is seen when comparing the density of the cells within the tissue: 266,274, 173,166, 167,603 and 183,575 cells/cm3, respectively. Prior literature documents Purkinje cell counts ranging from 14.8 to 30.5 million cells. These data demonstrate the accuracy of our approach. Our novel approach, which offers an improvement over previous methodologies, is of value for quantitative work of this nature. This approach could be applied to morphometric studies of other similarly complex tissues as well.
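    A generic fractionator-style sketch of the extrapolation idea (assumed, not the authors' exact protocol): the raw count from the sampled probes is scaled by the reciprocals of the sampling fractions. All counts and fractions below are hypothetical.

        q_counted = 900        # Purkinje cells counted in the sampled probes (hypothetical)
        ssf = 1 / 60           # section sampling fraction
        asf = 1 / 80           # area sampling fraction (counting-frame area / grid-step area)
        tsf = 1 / 5            # thickness sampling fraction (disector height / section thickness)

        total_estimate = q_counted * (1 / ssf) * (1 / asf) * (1 / tsf)
        print(f"estimated total Purkinje cells ~ {total_estimate:,.0f}")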

  11. Genome Wide Association Study to predict severe asthma exacerbations in children using random forests classifiers

    Directory of Open Access Journals (Sweden)

    Litonjua Augusto A

    2011-06-01

    Full Text Available Abstract Background Personalized health-care promises tailored health-care solutions to individual patients based on their genetic background and/or environmental exposure history. To date, disease prediction has been based on a few environmental factors and/or single nucleotide polymorphisms (SNPs), while complex diseases are usually affected by many genetic and environmental factors with each factor contributing a small portion to the outcome. We hypothesized that the use of random forests classifiers to select SNPs would result in an improved predictive model of asthma exacerbations. We tested this hypothesis in a population of childhood asthmatics. Methods In this study, using emergency room visits or hospitalizations as the definition of a severe asthma exacerbation, we first identified a list of top Genome Wide Association Study (GWAS) SNPs ranked by Random Forests (RF) importance score for the CAMP (Childhood Asthma Management Program) population of 127 exacerbation cases and 290 non-exacerbation controls. We predict severe asthma exacerbations using the top 10 to 320 SNPs together with age, sex, pre-bronchodilator FEV1 percentage predicted, and treatment group. Results Testing in an independent set of the CAMP population shows that severe asthma exacerbations can be predicted with an Area Under the Curve (AUC) of 0.66 with 160-320 SNPs, in comparison to an AUC score of 0.57 with 10 SNPs. Using the clinical traits alone yielded an AUC score of 0.54, suggesting the phenotype is affected by genetic as well as environmental factors. Conclusions Our study shows that a random forests algorithm can effectively extract and use the information contained in a small number of samples. Random forests, and other machine learning tools, can be used with GWAS studies to integrate large numbers of predictors simultaneously.
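    A schematic sketch of the two-stage idea described above, feature ranking by random-forest importance followed by prediction with the top-ranked SNPs plus clinical covariates, using simulated stand-in data rather than the CAMP cohort; the column layout, parameter choices and outcome model are assumptions.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(1)
        n, p = 400, 1000
        X_snps = rng.integers(0, 3, size=(n, p)).astype(float)   # 0/1/2 genotype codes
        clinical = rng.normal(size=(n, 3))                       # e.g. age, FEV1%, sex
        y = (0.6 * X_snps[:, 0] + 0.5 * X_snps[:, 1] + clinical[:, 0]
             + rng.normal(size=n)) > 1.5                         # synthetic outcome

        X = np.hstack([X_snps, clinical])
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        # Rank SNPs by importance on the training split, then refit on the top k
        rank_rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
        top_k = np.argsort(rank_rf.feature_importances_[:p])[::-1][:160]
        keep = np.concatenate([top_k, np.arange(p, p + 3)])      # top SNPs + clinical

        pred_rf = RandomForestClassifier(n_estimators=300, random_state=0)
        pred_rf.fit(X_tr[:, keep], y_tr)
        auc = roc_auc_score(y_te, pred_rf.predict_proba(X_te[:, keep])[:, 1])
        print(f"hold-out AUC ~ {auc:.2f}")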

  12. Transport studies in p-type double quantum well samples

    International Nuclear Information System (INIS)

    Hyndman, R.J.

    2000-01-01

    The motivation for the study of double quantum well samples is that the extra spatial degree of freedom can modify the ground state energies of the system, leading to new and interesting many body effects. Electron bi-layers have been widely studied, but the work presented here is the first systematic study of the transport properties of a p-type double quantum well system. The samples, grown on the (311) plane, consisted of two 100 Å GaAs wells separated by a 30 Å AlAs barrier. The thin barrier in our structures gives rise to very strong inter-layer Coulombic interactions but, in contrast to electron double quantum well samples, tunnelling between the two wells is very weak. This is due to the large effective mass of holes compared with electrons. It is possible to accurately control the total density of a sample and the relative occupancy of each well using front and back gates. A systematic study of the magnetoresistance properties of the p-type bi-layers was carried out at low temperatures and in high magnetic fields, for samples covering a range of densities. Considerable care was required to obtain reliable results as the samples were extremely susceptible to electrical shock and were prone to drift in density slowly over time. With balanced wells, the very low tunnelling in the p-type bi-layer leads to a complete absence of all odd integers in both resistance and thermopower except for the v = 1 state (v = 1/2 in each layer), where v is the total Landau level filling factor. Unlike other FQHE features, the v = 1 state strengthens with increased density as inter-layer interactions increase in strength over intra-layer interactions. The state is also destroyed at a critical temperature, which is much lower than the measured activation temperature. This is taken as evidence for a finite temperature phase transition predicted for the bi-layer v = 1. From the experimental observations, we construct a phase diagram for the state, which agrees closely with theoretical predictions

  13. Salivary gland tumours in a Mexican sample. A retrospective study.

    Science.gov (United States)

    Ledesma-Montes, C; Garces-Ortiz, M

    2002-01-01

    Salivary gland tumours are an important part of Oral and Maxillofacial Pathology; unfortunately, only few studies on these tumours have been done in Latin-American populations. The aim of this study was to compare demographic data on salivary gland tumours in a Mexican sample with those previously published from Latin American and non-Latin American countries. All cases of salivary gland tumours or lesions diagnosed in our service were reviewed. Of the reviewed cases, 67 were confirmed as salivary gland tumours. Of these, 64.2% were benign neoplasms, 35.8% were malignant, and a slight female predominance (56.7%) was found. The most common location was the palate, followed by the lips and the floor of the mouth. The mean age for benign tumours was 40.6 years, with a female predominance (60.5%). The mean age for malignant tumours was 41 years, and a female predominance was found again. The palate, followed by the retromolar area, were the usual locations. Pleomorphic adenoma (58.2%), mucoepidermoid carcinoma (17.9%) and adenoid cystic carcinoma (11.9%) were the more frequent neoplasms. All retromolar cases were malignant and all submandibular gland tumours were benign. We found a high proportion of salivary gland neoplasms in children. Our results showed that differences exist between our sample and previously reported series. These differences can be related to race and geographical location.

  14. Balneotherapy for chronic low back pain: a randomized, controlled study.

    Science.gov (United States)

    Kesiktas, Nur; Karakas, Sinem; Gun, Kerem; Gun, Nuran; Murat, Sadiye; Uludag, Murat

    2012-10-01

    A large number of treatments have been used for patients with chronic low back pain, and frequent recurrent episodes have been reported. Although balneotherapy has been found effective in this disease, well-designed studies are scarce. We aimed to determine the effectiveness of balneotherapy versus physical therapy in patients with chronic low back pain. Exercise was added to both treatment programs. Sixty patients with chronic low back pain were randomly divided into two groups. Physical modalities plus exercise were applied to group 1, and group 2 received balneotherapy plus exercise for ten sessions. The following parameters were measured: visual analogue scale at rest and on movement for pain, paracetamol dose, manual muscle test for lumbar muscles, modified Schober's test, Oswestry disability index, and Short-Form 36, at the beginning and end of the therapies and at the 3-month follow-up. The statistical analyses were performed using the SPSS 10.0 program. Both groups achieved significant improvements within themselves, but the balneotherapy group improved more on the back extensor muscle test. Balneotherapy combined with exercise therapy had advantages over therapy with physical modalities plus exercise in improving quality of life and flexibility of patients with chronic low back pain.

  15. Application of the random coil index to studying protein flexibility

    Energy Technology Data Exchange (ETDEWEB)

    Berjanskii, Mark V.; Wishart, David S. [University of Alberta, Department of Computing Science (Canada)], E-mail: david.wishart@ualberta.ca

    2008-01-15

    Protein flexibility lies at the heart of many protein-ligand binding events and enzymatic activities. However, the experimental measurement of protein motions is often difficult, tedious and error-prone. As a result, there is a considerable interest in developing simpler and faster ways of quantifying protein flexibility. Recently, we described a method, called Random Coil Index (RCI), which appears to be able to quantitatively estimate model-free order parameters and flexibility in protein structural ensembles using only backbone chemical shifts. Because of its potential utility, we have undertaken a more detailed investigation of the RCI method in an attempt to ascertain its underlying principles, its general utility, its sensitivity to chemical shift errors, its sensitivity to data completeness, its applicability to other proteins, and its general strengths and weaknesses. Overall, we find that the RCI method is very robust and that it represents a useful addition to traditional methods of studying protein flexibility. We have implemented many of the findings and refinements reported here into a web server that allows facile, automated predictions of model-free order parameters, MD RMSF and NMR RMSD values directly from backbone 1H, 13C and 15N chemical shift assignments. The server is available at http://wishart.biology.ualberta.ca/rci.

  16. Microbiota-based Signature of Gingivitis Treatments: A Randomized Study.

    Science.gov (United States)

    Huang, Shi; Li, Zhen; He, Tao; Bo, Cunpei; Chang, Jinlan; Li, Lin; He, Yanyan; Liu, Jiquan; Charbonneau, Duane; Li, Rui; Xu, Jian

    2016-04-20

    Plaque-induced gingivitis can be alleviated by various treatment regimens. To probe the impacts of various anti-gingivitis treatments on plaque microflora, here a double blinded, randomized controlled trial of 91 adults with moderate gingivitis was designed with two anti-gingivitis regimens: the brush-alone treatment and the brush-plus-rinse treatment. In the latter group, more reduction in both Plaque Index (TMQHI) and Gingival Index (mean MGI) at Day 3, Day 11 and Day 27 was evident, and more dramatic changes were found between baseline and other time points for both supragingival plaque microbiota structure and salivary metabonomic profiles. A comparison of plaque microbiota changes was also performed between these two treatments and a third dataset where 50 subjects received a regimen of dental scaling. Only Actinobaculum, TM7 and Leptotrichia were consistently reduced by all three treatments, whereas the different microbial signatures of the three treatments during gingivitis relief indicate distinct mechanisms of action. Our study suggests that microbiota-based signatures can serve as a valuable approach for understanding and potentially comparing the modes of action for clinical treatments and oral-care products in the future.

  17. TVT versus TOT, 2-year prospective randomized study.

    Science.gov (United States)

    Wadie, Bassem S; El-Hefnawy, Ahmed S; Elhefnawy, Ahmed S

    2013-06-01

    To evaluate TVT in comparison with TOT in a comprehensive way, the results of a single-center RCT are presented. Many studies have addressed the efficacy and safety of TVT and TOT. Women included were adults having predominant SUI with a positive stress test. They were randomized to receive either TVT (Gynecare(®)) or TOT (Aris(®)). All women were seen at 1 week, 3, 6, 12, 18, and 24 months. Seventy-one women completed the 2-year follow-up. Median age was 47 (range 33-60 years). Mean ± SD BMI in the TVT group was 34 ± 5, while in the TOT group it was 32 ± 5 kg/m(2). POP of any degree was seen in 50 % (35 women). At 1 year, pad test-negative women numbered 31 and 29 for TVT and TOT, respectively. At 2 years, the figures became 28 in the TVT group and 27 in TOT. At 1 year, UDI 6 and IIQ 7 decreased by 78.5 and 81 % for TVT and by 69 % and 75 % for the TOT group. At 2 years, the comparable percentages were 73 and 79 % for TVT and 69 and 82 % for TOT. Fifteen unique patients had adverse events, 10 of them with TOT. Both tapes have similar efficacy regarding cure of incontinence. TVT is more effective, albeit not significantly, than TOT at 2 years. However, serious adverse events were more frequent with TVT, yet TOT has more unique adverse events.

  18. Efficacy of Arthroscopic Teaching Methods: A Prospective Randomized Controlled Study.

    Science.gov (United States)

    Robinson, Luke; Spanyer, Jonathon; Yenna, Zachary; Burchell, Patrick; Garber, Andrew; Riehl, John

    Arthroscopic education research recently has been focused on the use of skills labs to facilitate resident education and objective measure development to gauge technical skill. This study evaluates the effectiveness of three different teaching methods. Medical students were randomized into three groups. The first group received only classroom-based lecture. The second group received the same lecture and 28 minutes of lab-based hands-off arthroscopy instruction using a cadaver and arthroscopy setup. The final group received the same lecture and 7 minutes of hands-on arthroscopy instruction in the lab on a cadaver knee. The arthroscopic knee exam that followed simulated a diagnostic knee exam and subjects were measured on task completion and by the number of look downs. The number of look downs and the number of tasks completed did not achieve statistical significance between groups. Posttest survey results revealed that the hands-on group placed significantly more value on their educational experience as compared with the other two groups. (Journal of Surgical Orthopaedic Advances)

  19. Employing a Multi-level Approach to Recruit a Representative Sample of Women with Recent Gestational Diabetes Mellitus into a Randomized Lifestyle Intervention Trial.

    Science.gov (United States)

    Nicklas, Jacinda M; Skurnik, Geraldine; Zera, Chloe A; Reforma, Liberty G; Levkoff, Sue E; Seely, Ellen W

    2016-02-01

    The postpartum period is a window of opportunity for diabetes prevention in women with recent gestational diabetes (GDM), but recruitment for clinical trials during this period of life is a major challenge. We adapted a social-ecologic model to develop a multi-level recruitment strategy at the macro (high or institutional level), meso (mid or provider level), and micro (individual) levels. Our goal was to recruit 100 women with recent GDM into the Balance after Baby randomized controlled trial over a 17-month period. Participants were asked to attend three in-person study visits at 6 weeks, 6, and 12 months postpartum. They were randomized into a control arm or a web-based intervention arm at the end of the baseline visit at six weeks postpartum. At the end of the recruitment period, we compared population characteristics of our enrolled subjects to the entire population of women with GDM delivering at Brigham and Women's Hospital (BWH). We successfully recruited 107 of 156 (69 %) women assessed for eligibility, with the majority (92) recruited during pregnancy at a mean 30 (SD ± 5) weeks of gestation, and 15 recruited postpartum, at a mean 2 (SD ± 3) weeks postpartum. 78 subjects attended the initial baseline visit, and 75 subjects were randomized into the trial at a mean 7 (SD ± 2) weeks postpartum. The recruited subjects were similar in age and race/ethnicity to the total population of 538 GDM deliveries at BWH over the 17-month recruitment period. Our multilevel approach allowed us to successfully meet our recruitment goal and recruit a representative sample of women with recent GDM. We believe that our most successful strategies included using a dedicated in-person recruiter, integrating recruitment into clinical flow, allowing for flexibility in recruitment, minimizing barriers to participation, and using an opt-out strategy with providers. Although the majority of women were recruited while pregnant, women recruited in the early postpartum period were

  20. Comparison of Sampling Designs for Estimating Deforestation from Landsat TM and MODIS Imagery: A Case Study in Mato Grosso, Brazil

    Directory of Open Access Journals (Sweden)

    Shanyou Zhu

    2014-01-01

    Full Text Available Sampling designs are commonly used to estimate deforestation over large areas, but comparisons between different sampling strategies are required. Using PRODES deforestation data as a reference, deforestation in the state of Mato Grosso in Brazil from 2005 to 2006 is evaluated using Landsat imagery and a nearly synchronous MODIS dataset. The MODIS-derived deforestation is used to assist in sampling and extrapolation. Three sampling designs are compared according to the estimated deforestation of the entire study area based on simple extrapolation and linear regression models. The results show that stratified sampling for strata construction and sample allocation using the MODIS-derived deforestation hotspots provided more precise estimations than simple random and systematic sampling. Moreover, the relationship between the MODIS-derived and TM-derived deforestation provides a precise estimate of the total deforestation area as well as the distribution of deforestation in each block.
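    A condensed sketch of stratified estimation with strata built from a coarse deforestation layer (not the authors' workflow): block totals are estimated per stratum from a small sample and summed. The simulated MODIS-like and TM-like values and the sampling rate are placeholders.

        import numpy as np

        rng = np.random.default_rng(7)
        N = 600                                   # total blocks in the study area
        modis = rng.gamma(2.0, 1.5, N)            # coarse deforestation per block (known everywhere)
        tm = np.clip(0.8 * modis + rng.normal(0, 0.4, N), 0, None)  # fine truth (known only if sampled)

        # Strata from the coarse layer: low / medium / high "hotspots"
        edges = np.quantile(modis, [0.5, 0.85])
        stratum = np.digitize(modis, edges)

        est_total = 0.0
        for h in range(3):
            idx = np.flatnonzero(stratum == h)
            n_h = max(5, int(0.1 * idx.size))            # ~10% sample per stratum
            sample = rng.choice(idx, n_h, replace=False)
            est_total += idx.size * tm[sample].mean()    # N_h * sampled mean

        print(f"stratified estimate {est_total:.0f}  vs  true total {tm.sum():.0f}")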

  1. Comparison of sampling designs for estimating deforestation from landsat TM and MODIS imagery: a case study in Mato Grosso, Brazil.

    Science.gov (United States)

    Zhu, Shanyou; Zhang, Hailong; Liu, Ronggao; Cao, Yun; Zhang, Guixin

    2014-01-01

    Sampling designs are commonly used to estimate deforestation over large areas, but comparisons between different sampling strategies are required. Using PRODES deforestation data as a reference, deforestation in the state of Mato Grosso in Brazil from 2005 to 2006 is evaluated using Landsat imagery and a nearly synchronous MODIS dataset. The MODIS-derived deforestation is used to assist in sampling and extrapolation. Three sampling designs are compared according to the estimated deforestation of the entire study area based on simple extrapolation and linear regression models. The results show that stratified sampling for strata construction and sample allocation using the MODIS-derived deforestation hotspots provided more precise estimations than simple random and systematic sampling. Moreover, the relationship between the MODIS-derived and TM-derived deforestation provides a precise estimate of the total deforestation area as well as the distribution of deforestation in each block.

  2. Undergraduate student drinking and related harms at an Australian university: web-based survey of a large random sample

    Directory of Open Access Journals (Sweden)

    Hallett Jonathan

    2012-01-01

    Full Text Available Abstract Background There is considerable interest in university student hazardous drinking among the media and policy makers. However there have been no population-based studies in Australia to date. We sought to estimate the prevalence and correlates of hazardous drinking and secondhand effects among undergraduates at a Western Australian university. Method We invited 13,000 randomly selected undergraduate students from a commuter university in Australia to participate in an online survey of university drinking. Responses were received from 7,237 students (56%, who served as participants in this study. Results Ninety percent had consumed alcohol in the last 12 months and 34% met criteria for hazardous drinking (AUDIT score ≥ 8 and greater than 6 standard drinks in one sitting in the previous month. Men and Australian/New Zealand residents had significantly increased odds (OR: 2.1; 95% CI: 1.9-2.3; OR: 5.2; 95% CI: 4.4-6.2 of being categorised as dependent (AUDIT score 20 or over than women and non-residents. In the previous 4 weeks, 13% of students had been insulted or humiliated and 6% had been pushed, hit or otherwise assaulted by others who were drinking. One percent of respondents had experienced sexual assault in this time period. Conclusions Half of men and over a third of women were drinking at hazardous levels and a relatively large proportion of students were negatively affected by their own and other students' drinking. There is a need for intervention to reduce hazardous drinking early in university participation. Trial registration ACTRN12608000104358

  3. Childhood adiposity and risk of type 1 diabetes: A Mendelian randomization study.

    Directory of Open Access Journals (Sweden)

    J C Censin

    2017-08-01

    Full Text Available The incidence of type 1 diabetes (T1D) is increasing globally. One hypothesis is that increasing childhood obesity rates may explain part of this increase, but, as T1D is rare, intervention studies are challenging to perform. The aim of this study was to assess this hypothesis with a Mendelian randomization approach that uses genetic variants as instrumental variables to test for causal associations. We created a genetic instrument of 23 single nucleotide polymorphisms (SNPs) associated with childhood adiposity in children aged 2-10 years. Summary-level association results for these 23 SNPs with childhood-onset (<17 years) T1D were extracted from a meta-analysis of genome-wide association study with 5,913 T1D cases and 8,828 reference samples. Using inverse-variance weighted Mendelian randomization analysis, we found support for an effect of childhood adiposity on T1D risk (odds ratio 1.32, 95% CI 1.06-1.64 per standard deviation score in body mass index [SDS-BMI]). A sensitivity analysis provided evidence of horizontal pleiotropy bias (p = 0.04) diluting the estimates towards the null. We therefore applied Egger regression and multivariable Mendelian randomization methods to control for this type of bias and found evidence in support of a role of childhood adiposity in T1D (odds ratio in Egger regression, 2.76, 95% CI 1.40-5.44). Limitations of our study include that underlying genes and their mechanisms for most of the genetic variants included in the score are not known. Mendelian randomization requires large sample sizes, and power was limited to provide precise estimates. This research has been conducted using data from the Early Growth Genetics (EGG) Consortium, the Genetic Investigation of Anthropometric Traits (GIANT) Consortium, the Tobacco and Genetics (TAG) Consortium, and the Social Science Genetic Association Consortium (SSGAC), as well as meta-analysis results from a T1D genome-wide association study. This study provides genetic support for a
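    A compact sketch of the inverse-variance weighted (IVW) estimator used in such analyses, computed from per-SNP summary statistics; the three SNPs and their effect estimates below are fabricated for illustration and do not come from the cited consortia.

        import numpy as np

        beta_exposure = np.array([0.10, 0.08, 0.12])      # SNP -> childhood BMI (SD units)
        beta_outcome = np.array([0.030, 0.020, 0.041])    # SNP -> log-odds of T1D
        se_outcome = np.array([0.010, 0.012, 0.015])

        ratio = beta_outcome / beta_exposure               # per-SNP Wald ratios
        weights = (beta_exposure / se_outcome) ** 2        # inverse-variance weights
        ivw = np.sum(weights * ratio) / np.sum(weights)
        se_ivw = 1 / np.sqrt(np.sum(weights))

        odds_ratio = np.exp(ivw)
        ci = np.exp(ivw + np.array([-1.96, 1.96]) * se_ivw)
        print(f"IVW OR per SD-BMI ~ {odds_ratio:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")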

  4. The High/Scope Perry Preschool Study: A Case Study in Random Assignment.

    Science.gov (United States)

    Schweinhart, Lawrence J.

    2000-01-01

    Studied the long-term benefits of preschool programs for young children living in poverty in the High/Scope Perry Preschool Study, which examined the lives of 123 African Americans randomly divided into a preschool treatment group and a no-preschool comparison group. Cost-benefit analyses of data on these students to age 27 show beneficial effects…

  5. Diagnostic-test evaluation of immunoassays for anti-Toxoplasma gondii IgG antibodies in a random sample of Mexican population.

    Science.gov (United States)

    Caballero-Ortega, Heriberto; Castillo-Cruz, Rocío; Murieta, Sandra; Ortíz-Alegría, Luz Belinda; Calderón-Segura, Esther; Conde-Glez, Carlos J; Cañedo-Solares, Irma; Correa, Dolores

    2014-05-14

    There are few articles on the evaluation of Toxoplasma gondii serological tests. Besides, commercially available tests are not always useful and are expensive for studies in the open population. The aim of this study was to evaluate an in-house ELISA and western blot for IgG antibodies in a representative sample of people living in Mexico. Three hundred and five serum samples were randomly selected from two national seroepidemiological survey banks; they were taken from men and women of all ages and from all areas of the country. The ELISA cut-off was established using the mean plus three standard deviations of negative samples. Western blots were analysed by two experienced technicians and positivity was established according to the presence of at least three diagnostic bands. A commercial ELISA kit was used as a third test. Two reference standards were built up: one using concordant results of two assays leaving the evaluated test out (OUT) and the other in which the evaluated test was included (IN), with at least two concordant results to define diagnosis. The lowest values of diagnostic parameters were obtained with the OUT reference standards: the in-house ELISA had 96.9% sensitivity, 62.1% specificity, 49.6% PPV, 98.1% NPV and 71.8% accuracy, while western blot presented 81.8%, 89.7%, 84.0%, 88.2% and 86.6% values and the best kappa coefficient (0.72-0.82). The in-house ELISA is useful for screening people of Mexico, due to its high sensitivity, while western blot may be used to confirm diagnosis. These techniques might prove useful in other Latin American countries.
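    A small helper sketch (not the authors' code) for the diagnostic parameters quoted above, computed from a 2x2 table against a reference standard; the counts are invented for illustration.

        def diagnostic_metrics(tp, fp, fn, tn):
            """Sensitivity, specificity, PPV, NPV and accuracy from a 2x2 table."""
            sens = tp / (tp + fn)
            spec = tn / (tn + fp)
            ppv = tp / (tp + fp)
            npv = tn / (tn + fn)
            acc = (tp + tn) / (tp + fp + fn + tn)
            return dict(sensitivity=sens, specificity=spec, PPV=ppv, NPV=npv, accuracy=acc)

        print(diagnostic_metrics(tp=95, fp=60, fn=3, tn=147))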

  6. Timing of food intake impacts daily rhythms of human salivary microbiota: a randomized, crossover study.

    Science.gov (United States)

    Collado, María Carmen; Engen, Phillip A; Bandín, Cristina; Cabrera-Rubio, Raúl; Voigt, Robin M; Green, Stefan J; Naqib, Ankur; Keshavarzian, Ali; Scheer, Frank A J L; Garaulet, Marta

    2018-04-01

    The composition of the diet (what we eat) has been widely related to the microbiota profile. However, whether the timing of food consumption (when we eat) influences microbiota in humans is unknown. A randomized, crossover study was performed in 10 healthy normal-weight young women to test the effect of the timing of food intake on the human microbiota in the saliva and fecal samples. More specifically, to determine whether eating late alters daily rhythms of human salivary microbiota, we interrogated salivary microbiota in samples obtained at 4 specific time points over 24 h, to achieve a better understanding of the relationship between food timing and metabolic alterations in humans. Results revealed significant diurnal rhythms in salivary diversity and bacterial relative abundance ( i.e., TM7 and Fusobacteria) across both early and late eating conditions. More importantly, meal timing affected diurnal rhythms in diversity of salivary microbiota toward an inverted rhythm between the eating conditions, and eating late increased the number of putative proinflammatory taxa, showing a diurnal rhythm in the saliva. In a randomized, crossover study, we showed for the first time the impact of the timing of food intake on human salivary microbiota. Eating the main meal late inverts the daily rhythm of salivary microbiota diversity which may have a deleterious effect on the metabolism of the host.-Collado, M. C., Engen, P. A., Bandín, C., Cabrera-Rubio, R., Voigt, R. M., Green, S. J., Naqib, A., Keshavarzian, A., Scheer, F. A. J. L., Garaulet, M. Timing of food intake impacts daily rhythms of human salivary microbiota: a randomized, crossover study.

  7. Experimental study on the particles deposition in the sampling duct

    Energy Technology Data Exchange (ETDEWEB)

    Vendel, J.; Charuau, J. [Institut de Protection et de Surete Nucleaire, Yvette (France)

    1995-02-01

    A high standard of protection against the harmful effects of radioactive aerosol dissemination requires a measurement, as representative as possible, of their concentration. This measurement depends on the techniques used for aerosol sampling and transfer to the detector, as well as on the location of the latter with respect to the potential sources. The aeraulic design of the apparatus is also an important factor. Once collected, the aerosol particles often have to travel through a variably shaped duct to the measurement apparatus. This transport is responsible for losses due to particle deposition on the walls, leading to a distortion of the concentration measurements and a change in the particle size distribution. To estimate and minimize measurement errors it is important to determine the optimal transport conditions when designing a duct; in particular, its diameter and material, the radius of curvature of the bends and the flow conditions must be defined. This paper presents an experimental study to determine, for each deposition mechanism, the retained fraction or the deposition velocity for different flow regimes. The study has shown that a favourable flow regime exists for particle transport through sampling ducts (2,500 < Re < 5,000). Equations predicting aerosol penetration in smooth-walled cylindrical metal ducts have been established for all particle diameters.
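
    A quick way to check whether a given sampling line sits in that favourable regime is to compute the duct Reynolds number. The sketch below does so for assumed flow-rate and diameter values that are not taken from the paper.

```python
import math

# Illustrative sketch (assumed values, not from the paper): check whether a
# sampling-duct flow falls in the favourable regime 2,500 < Re < 5,000.
def reynolds_number(flow_rate_m3s, diameter_m, kinematic_viscosity=1.5e-5):
    """Re = U * D / nu for a circular duct; nu defaults to air at ~20 degC."""
    area = math.pi * diameter_m**2 / 4.0
    velocity = flow_rate_m3s / area
    return velocity * diameter_m / kinematic_viscosity

re = reynolds_number(flow_rate_m3s=6.0e-4, diameter_m=0.016)  # ~36 L/min, 16 mm duct
print(f"Re = {re:.0f}, favourable regime: {2500 < re < 5000}")
```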

  8. A Randomized Study of a Mobile Behavioral Parent Training Application.

    Science.gov (United States)

    Feil, Edward G; Sprengelmeyer, Peter G; Leve, Craig

    2018-06-01

    Background/Introduction: Never before have parents had such immediate access to parenting support. The extension of the Internet to smartphones offers the opportunity to provide families with the highest-quality information at the time and place where it can be most useful. However, considerable barriers remain to getting the right information to the right people at the right time. This study comprises the initial feasibility testing of a smartphone application, "ParentNet," that attempts to deliver on the potential of empirically supported therapy by connecting family members with specific behavioral goals and outcomes in real time. Participation was solicited from community parenting support groups and through online social media. Data were collected from 73 parents and 88 children on child behavior (adult only) and satisfaction. Data analyses showed positive satisfaction and utilization results: (1) users rated the ParentNet app very positively (i.e., 85% of caregivers and 88% of youth would recommend the app to others), and (2) parenting behavior improved with a small-to-moderate effect size. Findings from this initial testing are reviewed along with future development possibilities. Limitations, including the small pilot sample and brief administration period, could have reduced the observed effects. Further study should include a more robust sample.

  9. Sugammadex versus neostigmine in pediatric patients: a prospective randomized study

    Directory of Open Access Journals (Sweden)

    Turhan Kara

    2014-12-01

    Full Text Available Background and objectives: Acetylcholinesterase inhibitors may cause postoperative residual curarization when they are used for reversal of neuromuscular blockade. Sugammadex reverses neuromuscular blockade by chemical encapsulation and is not associated with the side effects that may occur with the use of anticholinesterase agents. Because of the increase in outpatient surgical procedures, postoperative residual curarization and rapid postoperative recovery have a greater importance in the pediatric patient population. The aim of this study was to compare the efficacy of sugammadex and neostigmine in reversing neuromuscular blockade in pediatric patients undergoing outpatient surgical procedures. Methods: 80 patients, aged 2-12 years, scheduled for outpatient surgery were enrolled in this randomized prospective study. Neuromuscular blockade was achieved with 0.6 mg kg−1 rocuronium and monitored with train-of-four. Group RN (n = 40) received 0.03 mg kg−1 neostigmine and Group RS (n = 40) received 2 mg kg−1 sugammadex for reversal of rocuronium. Extubation time (time from the reversal of neuromuscular blockade to extubation), the train-of-four ratio during this time, the time to reach a train-of-four ratio > 0.9, and probable complications were recorded. Results: There was no significant difference in the patients' characteristics. Extubation time and time to reach a train-of-four ratio > 0.9 were significantly longer in Group RN (p = 0.001, p = 0.002). The train-of-four ratio at the time of neostigmine/sugammadex injection was significantly higher in Group RN than in Group RS (p = 0.020). The extubation train-of-four ratio was significantly lower in Group RN (p = 0.002). Conclusion: Sugammadex provides safer extubation with a shorter recovery time than neostigmine in pediatric patients undergoing outpatient surgical procedures.

  10. Using Environmental Variables for Studying of the Quality of Sampling in Soil Mapping

    Directory of Open Access Journals (Sweden)

    A. Jafari

    2016-02-01

    Full Text Available Introduction: Methods of soil survey are generally empirical and based on the mental development of the surveyor, correlating soil with underlying geology, landforms, vegetation and air-photo interpretation. Since there are no statistical criteria for traditional soil sampling, this may lead to bias in the areas being sampled. In digital soil mapping, soil samples may be used to elaborate quantitative relationships or models between soil attributes and soil covariates. Because the relationships are based on the soil observations, the quality of the resulting soil map also depends on the quality of the soil observations. An appropriate sampling design for digital soil mapping depends on how much data is available and where the data is located. Some statistical methods have been developed for optimizing data sampling for soil surveys. Some of these methods deal with the use of ancillary information. The purpose of this study was to evaluate the quality of sampling of existing data. Materials and Methods: The study area is located in the central basin of the Iranian plateau (Figure 1). The geologic infrastructure of the area is mainly Cretaceous limestone, Mesozoic shale and sandstone. Air photo interpretation (API) was used to differentiate geomorphic patterns based on their formation processes, general structure and morphometry. The patterns were differentiated through a nested geomorphic hierarchy (Fig. 2). A four-level geomorphic hierarchy is used to break down the complexity of different landscapes of the study area. At the lowest level of the hierarchy, the geomorphic surfaces, which were formed by a unique process during a specific geologic time, were defined. A stratified sampling scheme was designed based on the geomorphic mapping. In stratified simple random sampling, the area was divided into sub-areas referred to as strata, based on the geomorphic surfaces, and within each stratum sampling locations were randomly selected (Figure 2). This resulted in 191
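
    The stratified design described above can be illustrated with a minimal sketch: sampling locations are drawn at random within each stratum so that every geomorphic unit is represented. The stratum names and candidate points are hypothetical, not the study's map units.

```python
import random

# Sketch of stratified simple random sampling over geomorphic strata
# (stratum names and candidate locations are made up for illustration).
def stratified_sample(strata, n_per_stratum, seed=0):
    """strata: dict mapping stratum name -> list of candidate locations."""
    rng = random.Random(seed)
    sample = {}
    for name, locations in strata.items():
        k = min(n_per_stratum, len(locations))
        sample[name] = rng.sample(locations, k)
    return sample

strata = {
    "alluvial_fan": [(x, 0) for x in range(100)],
    "rock_pediment": [(x, 1) for x in range(60)],
    "playa": [(x, 2) for x in range(40)],
}
picked = stratified_sample(strata, n_per_stratum=5)
print({name: len(points) for name, points in picked.items()})
```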

  11. Effective permittivity of random composite media: A comparative study

    International Nuclear Information System (INIS)

    Prasad, Ashutosh; Prasad, K.

    2007-01-01

    In the present study, experimental data for the effective permittivity of amorphous samples, polycrystalline thick films, and ceramics, taken from the literature, were compared with the values yielded by different mixture equations. In order to test the acceptability of dielectric mixture equations at high volume fractions of the inclusion material, eleven such equations were chosen. It is found that the equations given by Cuming, Maxwell-Wagner, Webmann, Skipetrov and the modified Cule-Torquato model show coherence and minimal deviation from the experimental permittivity results for all the chosen test materials over almost the entire measured range of volume fractions. It is further found that the Maxwell-Wagner, Webmann, and Skipetrov equations yield equivalent results; consequently they were combined and reckoned as a single equation, named MWWS. The study revealed that the Cuming equation had the highest degree of acceptability (errors <±1-5%) in all the cases
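
    To make the idea of a mixture equation concrete, the sketch below implements one classical two-phase mixing rule (Maxwell Garnett for spherical inclusions). It is offered only as an illustration of the general form such formulas take, not as one of the eleven equations evaluated in the paper, and the permittivity values are arbitrary.

```python
# Sketch of a classical two-phase mixing rule (Maxwell Garnett): effective
# permittivity of spherical inclusions dispersed in a host matrix.
def maxwell_garnett(eps_matrix, eps_inclusion, volume_fraction):
    em, ei, f = eps_matrix, eps_inclusion, volume_fraction
    num = ei + 2 * em + 2 * f * (ei - em)
    den = ei + 2 * em - f * (ei - em)
    return em * num / den

# Arbitrary host/inclusion permittivities, swept over volume fraction:
for f in (0.1, 0.3, 0.5):
    print(f, maxwell_garnett(eps_matrix=2.2, eps_inclusion=30.0, volume_fraction=f))
```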

  12. Studies of U in the blood of two population samples

    International Nuclear Information System (INIS)

    Segovia, N.; Olguin, M.E.; Romero, M.

    1986-01-01

    The present work attempts to establish the statistical distribution of blood uranium in a population from the same community, similar in age and in living patterns. U traces were evaluated by a fission track technique in both whole blood and plasma samples. Dried samples were compressed into pellets and irradiated in a nuclear reactor using the external detector method. Standard U samples were used for U quantification. A comparative sampling of the U content in blood samples from a group of radiation-exposed workers and another of leukemia patients was also carried out. Results from the sampled groups are reported and discussed. (author)

  13. A large-scale study of the random variability of a coding sequence: a study on the CFTR gene.

    Science.gov (United States)

    Modiano, Guido; Bombieri, Cristina; Ciminelli, Bianca Maria; Belpinati, Francesca; Giorgi, Silvia; Georges, Marie des; Scotet, Virginie; Pompei, Fiorenza; Ciccacci, Cinzia; Guittard, Caroline; Audrézet, Marie Pierre; Begnini, Angela; Toepfer, Michael; Macek, Milan; Ferec, Claude; Claustres, Mireille; Pignatti, Pier Franco

    2005-02-01

    Coding single nucleotide substitutions (cSNSs) have been studied on hundreds of genes using small samples (n_g ≈ 100-150 genes). In the present investigation, a large random European population sample (average n_g ≈ 1500) was studied for a single gene, the CFTR (Cystic Fibrosis Transmembrane conductance Regulator). The nonsynonymous (NS) substitutions exhibited, in accordance with previous reports, a mean probability of being polymorphic (q > 0.005) much lower than that of the synonymous (S) substitutions, but they showed a similar rate of subpolymorphic (q < 0.005) variability. This indicates that, in autosomal genes that may have harmful recessive alleles (nonduplicated genes with important functions), genetic drift overwhelms selection in the subpolymorphic range of variability, making disadvantageous alleles behave as neutral. These results imply that the majority of the subpolymorphic nonsynonymous alleles of these genes are selectively negative or even pathogenic.

  14. Statistical Power and Optimum Sample Allocation Ratio for Treatment and Control Having Unequal Costs Per Unit of Randomization

    Science.gov (United States)

    Liu, Xiaofeng

    2003-01-01

    This article considers optimal sample allocation between the treatment and control condition in multilevel designs when the costs per sampling unit vary due to treatment assignment. Optimal unequal allocation may reduce the cost from that of a balanced design without sacrificing any power. The optimum sample allocation ratio depends only on the…
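
    The abstract is truncated here, but the flavour of such results can be conveyed with the textbook cost-constrained allocation problem: for comparing two means with equal outcome variances and per-unit costs c_t and c_c, minimizing the variance of the estimated difference under a fixed budget gives n_t/n_c = sqrt(c_c/c_t). The sketch below applies that standard single-level formula with made-up costs; it is not necessarily the article's exact multilevel result.

```python
import math

# Sketch (textbook result, assumed here, not quoted from the article):
# with equal outcome variances and per-unit costs c_t (treatment) and
# c_c (control), minimizing Var(mean difference) for a fixed budget gives
#     n_t / n_c = sqrt(c_c / c_t).
def optimal_allocation(budget, cost_treatment, cost_control):
    ratio = math.sqrt(cost_control / cost_treatment)        # n_t / n_c
    n_control = budget / (cost_control + cost_treatment * ratio)
    n_treatment = ratio * n_control
    return n_treatment, n_control

# Hypothetical budget and costs: treatment units cost four times control units.
print(optimal_allocation(budget=10000, cost_treatment=40, cost_control=10))
```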

  15. Acupuncture intervention in ischemic stroke: a randomized controlled prospective study.

    Science.gov (United States)

    Shen, Peng-Fei; Kong, Li; Ni, Li-Wei; Guo, Hai-Long; Yang, Sha; Zhang, Li-Li; Zhang, Zhi-Long; Guo, Jia-Kui; Xiong, Jie; Zhen, Zhong; Shi, Xue-Min

    2012-01-01

    Stroke is one of the most common causes of death, and few pharmacological therapies show benefits in ischemic stroke. In this study, 290 patients aged 40-75 years with a first onset of acute ischemic stroke (more than 24 hours but within 14 days) were given standard treatments and then randomly allocated to an intervention group (treated with resuscitating acupuncture) or a control group (treated using sham acupoints). Primary outcome measures included the Barthel Index (BI), relapse and death up to six months. Of the 290 patients, one case in the intervention group and two cases in the control group died from the disease (p = 0.558). Six of the 144 patients in the intervention group had a relapse, whereas 34 of 143 patients relapsed in the control group. There were significant differences between the two groups on the National Institutes of Health Stroke Scale (NIHSS), not at two weeks (7.03 ± 3.201 vs. 8.13 ± 3.634; p = 0.067) but at four weeks (4.15 ± 2.032 vs. 6.35 ± 3.131). The CSS score at four weeks showed more improvement in the intervention group than in the control group (9.40 ± 4.51 vs. 13.09 ± 5.80). The Stroke Specific Quality of Life Scale (SS-QOL) score at six months was higher in the intervention group (166.63 ± 45.70) than in the control group (143.60 ± 50.24; p < 0.01). The results of this clinical trial showed a clinically relevant decrease in relapse in patients treated with the resuscitating acupuncture intervention by the end of six months, compared with needling at the sham acupoints. The resuscitating acupuncture intervention could also improve self-care ability and quality of life, as evaluated with the BI, NIHSS, CSS, Oxford Handicap Scale (OHS), and SS-QOL.

  16. Sample holder for studying temperature dependent particle guiding

    International Nuclear Information System (INIS)

    Bereczky, R.J.; Toekesi, K.; Kowarik, G.; Aumayr, F.

    2011-01-01

    Complete text of publication follows. The so-called guiding effect is a complex process involving the interplay of a large number of charged particles with a solid. Although many research groups have joined this field and carried out various experiments with insulator capillaries, many details of the interactions are still unknown. We investigated the temperature dependence of the guiding, since it opens new possibilities both for a fundamental understanding of the guiding phenomenon and for applications. For the temperature-dependent guiding experiments a completely new heatable sample holder was designed. We developed and built such a heatable sample holder to make accurate and reproducible studies of the temperature dependence of the ion guiding effect possible. The target holder (for an exploded view see Fig. 1) consists of two main parts, the front and the back plates. The two plates of the sample holder, which function as an oven, are made of copper. These parts surround the capillary in order to guarantee a uniform temperature along the whole tube. The temperature of the copper parts is monitored by a K-type thermocouple. Stainless steel coaxial heaters surrounding the oven are used for heating. The heating power, up to a few watts, is regulated by a PID controller. Cooling of the capillary is achieved by a copper feed-through connected to a liquid nitrogen bath outside the UHV chamber. This solution allows us to change the temperature of the sample from -30 deg C up to 90 deg C. Our experiments with this newly developed temperature-regulated capillary holder show that the glass temperature (i.e., conductivity) can be used to control the guiding properties of the glass capillary and to adjust the conditions from guiding at room temperature to simple geometrical transmission at elevated temperatures. This holds promise for investigating the effect of conductivity on particle transport (build-up and removal of charge patches) through capillaries in more detail.
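
    For readers unfamiliar with how such a temperature loop behaves, the following sketch shows a generic PID step of the kind such a controller might execute; the gains, heater limit and plant response are invented for illustration and are unrelated to the actual apparatus firmware.

```python
# Generic illustration (not the apparatus firmware): one step of a PID loop
# regulating heater power against a thermocouple reading.
def pid_step(setpoint, measured, state, kp=2.0, ki=0.1, kd=0.5, dt=1.0):
    error = setpoint - measured
    state["integral"] += error * dt
    derivative = (error - state["previous"]) / dt
    state["previous"] = error
    power = kp * error + ki * state["integral"] + kd * derivative
    return max(0.0, min(power, 5.0)), state     # clamp to an assumed 0-5 W range

state = {"integral": 0.0, "previous": 0.0}
temperature = 25.0
for _ in range(5):
    power, state = pid_step(setpoint=60.0, measured=temperature, state=state)
    temperature += 0.5 * power                  # crude stand-in for the real plant
    print(round(temperature, 1), round(power, 2))
```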

  17. A topological analysis of large-scale structure, studied using the CMASS sample of SDSS-III

    International Nuclear Information System (INIS)

    Parihar, Prachi; Gott, J. Richard III; Vogeley, Michael S.; Choi, Yun-Young; Kim, Juhan; Kim, Sungsoo S.; Speare, Robert; Brownstein, Joel R.; Brinkmann, J.

    2014-01-01

    We study the three-dimensional genus topology of large-scale structure using the northern region of the CMASS Data Release 10 (DR10) sample of the SDSS-III Baryon Oscillation Spectroscopic Survey. We select galaxies with redshift 0.452 < z < 0.625 and with a stellar mass M_stellar > 10^11.56 M_☉. We study the topology at two smoothing lengths: R_G = 21 h^-1 Mpc and R_G = 34 h^-1 Mpc. The genus topology studied at the R_G = 21 h^-1 Mpc scale results in the highest genus amplitude observed to date. The CMASS sample yields a genus curve that is characteristic of one produced by Gaussian random-phase initial conditions. The data thus support the standard model of inflation, where random quantum fluctuations in the early universe produced Gaussian random-phase initial conditions. Modest deviations in the observed genus from random phase are as expected from shot-noise effects and the nonlinear evolution of structure. We suggest the use of a fitting formula motivated by perturbation theory to characterize the shift and asymmetries in the observed genus curve with a single parameter. We construct 54 mock SDSS CMASS surveys along the past light cone from the Horizon Run 3 (HR3) N-body simulations, where gravitationally bound dark matter subhalos are identified as the sites of galaxy formation. We study the genus topology of the HR3 mock surveys with the same geometry and sampling density as the observational sample and find the observed genus topology to be consistent with ΛCDM as simulated by the HR3 mock samples. We conclude that the topology of the large-scale structure in the SDSS CMASS sample is consistent with cosmological models having primordial Gaussian density fluctuations growing in accordance with general relativity to form galaxies in massive dark matter halos.
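
    The comparison with Gaussian random-phase initial conditions rests on the analytic genus curve of a Gaussian random field, g(nu) ∝ (1 − nu²) exp(−nu²/2). The sketch below evaluates that standard curve; the amplitude is left arbitrary, since in practice it depends on the power spectrum and the smoothing length.

```python
import numpy as np

# Sketch of the genus curve expected for Gaussian random-phase initial
# conditions; the amplitude here is arbitrary (it depends on the power
# spectrum and smoothing length in a real analysis).
def gaussian_genus(nu, amplitude=1.0):
    return amplitude * (1.0 - nu**2) * np.exp(-nu**2 / 2.0)

nu = np.linspace(-3, 3, 7)
print(gaussian_genus(nu))  # positive ("spongelike") near nu = 0, negative in the tails
```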

  18. Conventional versus computer-navigated TKA: a prospective randomized study.

    Science.gov (United States)

    Todesca, Alessandro; Garro, Luca; Penna, Massimo; Bejui-Hugues, Jacques

    2017-06-01

    The purpose of this study was to assess the midterm results of total knee arthroplasty (TKA) implanted with a specific computer navigation system in one group of patients (NAV) and to assess the same prosthesis implanted with the conventional technique in another group (CON); we hypothesized that computer-navigated surgery would improve implant alignment, functional scores and survival of the implant compared with the conventional technique. From 2008 to 2009, 225 patients were enrolled in the study and randomly assigned to the CON and NAV groups; 240 consecutive mobile-bearing ultra-congruent score (Amplitude, Valence, France) TKAs were performed by a single surgeon, 117 using the conventional method and 123 using the computer-navigated approach. Clinical outcome assessment was based on the Knee Society Score (KSS), the Hospital for Special Surgery Knee Score and the Western Ontario and McMaster Universities Index score. Component survival was calculated by Kaplan-Meier analysis. Median follow-up was 6.4 years (range 6-7 years). Two patients were lost to follow-up. No differences were seen between the two groups in age, sex, BMI or side of implantation. Three patients in the CON group reported feelings of instability during walking, but clinical tests were all negative. The NAV group showed statistically significantly better KSS scores, wider ROM, and fewer outliers from the neutral mechanical axis, lateral distal femoral angle, medial proximal tibial angle and tibial slope in the post-operative radiographic assessment. There was one case of early post-operative superficial infection (caused by Staph. aureus) successfully treated with antibiotics. No mechanical loosening, mobile-bearing dislocation or patellofemoral complication was seen. At 7 years of follow-up, component survival in relation to the risk of aseptic loosening or other complications was 100%. There were no implant revisions. This study demonstrates superior accuracy in implant positioning and statistically significantly better clinical scores with computer navigation.

  19. Study on sampling conditions for the monitoring of waste air

    International Nuclear Information System (INIS)

    Moeller, T.J.; Buetefisch, K.A.

    1998-01-01

    The technical codes for radiological monitoring of the waste air released from a radwaste repository demand that sampling for determination of aerosol-borne radioactivity be made with a screener equipped with a suitable number of measuring probes extending over the entire cross-sectional surface of the vent. Another requirement is to ensure that the waste air stream passing through the measuring channel is representative, containing the typical, operation-induced distribution of aerosols across the surface to be scanned. The reported study was intended to determine, in a scaled-down (1:10) model of a repository ventilation duct, the typical spatial distribution of aerosols (3D particulate density), in order to establish what typical aerosol distributions look like and to use this information for optimisation of the measuring site and monitoring instruments. (orig./CB) [de

  20. Importance of participation rate in sampling of data in population based studies, with special reference to bone mass in Sweden.

    OpenAIRE

    Düppe, H; Gärdsell, P; Hanson, B S; Johnell, O; Nilsson, B E

    1996-01-01

    OBJECTIVE: To study the effects of participation rate in sampling on "normative" bone mass data. DESIGN: This was a comparison between two randomly selected samples from the same population. The participation rates in the two samples were 61.9% and 83.6%. Measurements were made of bone mass at different skeletal sites and of muscle strength, as well as an assessment of physical activity. SETTING: Malmö, Sweden. SUBJECTS: There were 230 subjects (117 men, 113 women), aged 21 to 42 years. RESUL...

  1. Randomized, Controlled Study of Adderall XR in ADHD

    Directory of Open Access Journals (Sweden)

    J Gordon Millichap

    2002-08-01

    Full Text Available The efficacy and safety of Adderall XR in the treatment of attention deficit/hyperactivity disorder and diurnal variation in responses were assessed by a multicenter, randomized, double-blind, parallel group, placebo-controlled trial at 47 sites, and reported from the Massachusetts General Hospital, Boston, MA.

  2. Cluster randomized trial in the general practice research database: 2. Secondary prevention after first stroke (eCRT study): study protocol for a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Dregan Alex

    2012-10-01

    Full Text Available Abstract Background The purpose of this research is to develop and evaluate methods for conducting pragmatic cluster randomized trials in a primary care electronic database. The proposal describes one application, in a less frequent chronic condition of public health importance, secondary prevention of stroke. A related protocol in antibiotic prescribing was reported previously. Methods/Design The study aims to implement a cluster randomized trial (CRT) using the electronic patient records of the General Practice Research Database (GPRD) as a sampling frame and data source. The specific objective of the trial is to evaluate the effectiveness of a computer-delivered intervention at enhancing the delivery of stroke secondary prevention in primary care. GPRD family practices will be allocated to the intervention or usual care. The intervention promotes the use of electronic prompts to support adherence with the recommendations of the UK Intercollegiate Stroke Working Party and NICE guidelines for the secondary prevention of stroke in primary care. The primary outcome measure will be the difference in systolic blood pressure between intervention and control trial arms at 12-month follow-up. Secondary outcomes will be differences in serum cholesterol, prescribing of antihypertensive drugs, statins, and antiplatelet therapy. The intervention will continue for 12 months. Information on the utilization of the decision-support tools will also be analyzed. Discussion The CRT will investigate the effectiveness of using a computer-delivered intervention to reduce the risk of stroke recurrence following a first stroke event. The study will provide methodological guidance on the implementation of CRTs in electronic databases in primary care. Trial registration Current Controlled Trials ISRCTN35701810

  3. FIT for FUNCTION: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Richardson, Julie; Tang, Ada; Guyatt, Gordon; Thabane, Lehana; Xie, Feng; Sahlas, Demetrios; Hart, Robert; Fleck, Rebecca; Hladysh, Genevieve; Macrae, Louise

    2018-01-15

    The current state of evidence suggests that community-based exercise programs are beneficial in improving impairment, function, and health status, and are greatly needed for persons with stroke. However, limitations of these studies include risk of bias, feasibility, and cost issues. This single-blinded, randomized controlled trial (RCT) of 216 participants with stroke will compare the effectiveness of a 12-week YMCA community-based wellness program (FIT for FUNCTION), specifically designed for community-dwelling persons with stroke, with that of a standard YMCA membership. The primary outcome will be community reintegration using the Reintegration to Normal Living Index at 12 and 24 weeks. Secondary outcomes include measurement of physical activity level using the Rapid Assessment of Physical Activity and accelerometry; balance using the Berg Balance Scale; lower extremity function using the Short Physical Performance Battery; exercise capacity using the 6-min walk test; grip strength and isometric knee extension strength using hand-held dynamometry; and health-related quality of life using the European Quality of Life 5-Dimension Questionnaire. We are also assessing cardiovascular health; lipid, glucose and inflammatory markers will be collected following a 12-h fast, including total cholesterol, insulin, glucose, and glycated hemoglobin. Self-efficacy for physical activity will be assessed with a single question and self-efficacy for managing chronic disease will be assessed using the Stanford 6-item Scale. The Patient Activation Measure will be used to assess the patient's level of knowledge, skill, and confidence for self-management. Healthcare utilization and costs will be evaluated. Group, time, and group × time interaction effects will be estimated using generalized linear models for continuous variables, including as covariates relevant baseline variables that differ appreciably between groups at baseline. Cost data will be treated

  4. Study population, questionnaire, data management and sample description

    Directory of Open Access Journals (Sweden)

    Chiara de Waure

    2015-06-01

    Full Text Available INTRODUCTION: This article describes methodological issues of the "Sportello Salute Giovani" project ("Youth Health Information Desk"), a multicenter study aimed at assessing the health status, attitudes and behaviours of university students in Italy. MATERIALS AND METHODS: The questionnaire used to carry out the study was adapted from the Italian health behaviours in school-aged children (HBSC) project and consisted of 93 items addressing: demographics; nutritional habits and status; physical activity; lifestyles; reproductive and preconception health; health and satisfaction with life; attitudes and behaviours toward academic study and new technologies. The questionnaire was administered to a pool of 12,000 students from 18 to 30 years of age who voluntarily decided to participate, during classes held at different Italian faculties or at the three "Sportello Salute Giovani" centers established in the three sites of the Università Cattolica del Sacro Cuore (Catholic University of the Sacred Heart) of Rome. RESULTS: The final study sample was composed of 8,516 university students. The mean age of respondents was 22.2 (standard deviation 2.0) and 5,702 (67.0%) were females. According to the distribution in age classes, 3,601 (43.3%) belonged to the 18-21 class, 3,796 (44.5%) to the 22-24 class and 1,019 (12.2%) to the 25-30 class. With respect to socio-economic status, data were available for 8,410 respondents and showed that 50.3% of students belonged to the middle class. DISCUSSION: The project took into consideration a large number of individuals from different regions of the country and may therefore be considered representative of the general population of Italian university students. Furthermore, it is the first to address, at the same time, several issues, in particular attitudes and behaviours toward health, in Italian university students. CONCLUSION: The analysis of data from such a large sample of university students sets the basis for

  5. Studying the sampling representativeness in the NPP ventilation ducts

    International Nuclear Information System (INIS)

    Sosnovskij, R.I.; Fedchenko, T.K.; Minin, S.A.

    2000-01-01

    Measurements of gas and aerosol volumetric activity in NPP ventilation ducts are an important source of information on the ingress of radioactive contaminants into the environment. These measurements include sampling, sample transport and the measurement itself. The work is devoted to calculating the metrological characteristics of sampling systems for NPP gas-aerosol releases for different parameters of these systems and of the ventilation ducts. The results are intended for use in the design of such systems and in their metrological certification [ru

  6. Feasibility of self-sampled dried blood spot and saliva samples sent by mail in a population-based study

    International Nuclear Information System (INIS)

    Sakhi, Amrit Kaur; Bastani, Nasser Ezzatkhah; Ellingjord-Dale, Merete; Gundersen, Thomas Erik; Blomhoff, Rune; Ursin, Giske

    2015-01-01

    In large epidemiological studies it is often challenging to obtain biological samples. Self-sampling by study participants using the dried blood spot (DBS) technique has been suggested to overcome this challenge. DBS is a type of biosampling where blood samples are obtained by a finger-prick lancet, blotted and dried on filter paper. However, the feasibility and efficacy of collecting DBS samples from study participants in large-scale epidemiological studies is not known. The aim of the present study was to test the feasibility and response rate of collecting self-sampled DBS and saliva samples in a population-based study of women above 50 years of age. We determined response proportions, number of phone calls to the study center with questions about sampling, and quality of the DBS. We recruited women through a study conducted within the Norwegian Breast Cancer Screening Program. Invitations, instructions and materials were sent to 4,597 women. The data collection took place over a 3-month period in the spring of 2009. Response proportions for the collection of DBS and saliva samples were 71.0% (3,263) and 70.9% (3,258), respectively. We received 312 phone calls (7% of the 4,597 women) with questions regarding sampling. Of the 3,263 individuals that returned DBS cards, 3,038 (93.1%) had been packaged and shipped according to instructions. A total of 3,032 DBS samples were sufficient for at least one biomarker analysis (i.e. 92.9% of DBS samples received by the laboratory). 2,418 (74.1%) of the DBS cards received by the laboratory were filled with blood according to the instructions (i.e. 10 completely filled spots with up to 7 punches per spot for up to 70 separate analyses). To assess the quality of the samples, we selected and measured two biomarkers (carotenoids and vitamin D). The biomarker levels were consistent with previous reports. Collecting self-sampled DBS and saliva samples through the postal services provides a low-cost, effective and feasible way of obtaining biological samples in population-based studies.

  7. Feasibility of self-sampled dried blood spot and saliva samples sent by mail in a population-based study.

    Science.gov (United States)

    Sakhi, Amrit Kaur; Bastani, Nasser Ezzatkhah; Ellingjord-Dale, Merete; Gundersen, Thomas Erik; Blomhoff, Rune; Ursin, Giske

    2015-04-11

    In large epidemiological studies it is often challenging to obtain biological samples. Self-sampling by study participants using the dried blood spot (DBS) technique has been suggested to overcome this challenge. DBS is a type of biosampling where blood samples are obtained by a finger-prick lancet, blotted and dried on filter paper. However, the feasibility and efficacy of collecting DBS samples from study participants in large-scale epidemiological studies is not known. The aim of the present study was to test the feasibility and response rate of collecting self-sampled DBS and saliva samples in a population-based study of women above 50 years of age. We determined response proportions, number of phone calls to the study center with questions about sampling, and quality of the DBS. We recruited women through a study conducted within the Norwegian Breast Cancer Screening Program. Invitations, instructions and materials were sent to 4,597 women. The data collection took place over a 3-month period in the spring of 2009. Response proportions for the collection of DBS and saliva samples were 71.0% (3,263) and 70.9% (3,258), respectively. We received 312 phone calls (7% of the 4,597 women) with questions regarding sampling. Of the 3,263 individuals that returned DBS cards, 3,038 (93.1%) had been packaged and shipped according to instructions. A total of 3,032 DBS samples were sufficient for at least one biomarker analysis (i.e. 92.9% of DBS samples received by the laboratory). 2,418 (74.1%) of the DBS cards received by the laboratory were filled with blood according to the instructions (i.e. 10 completely filled spots with up to 7 punches per spot for up to 70 separate analyses). To assess the quality of the samples, we selected and measured two biomarkers (carotenoids and vitamin D). The biomarker levels were consistent with previous reports. Collecting self-sampled DBS and saliva samples through the postal services provides a low-cost, effective and feasible way of obtaining biological samples in population-based studies.

  8. Evaluating the optimal timing of surgical antimicrobial prophylaxis: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Mujagic, Edin; Zwimpfer, Tibor; Marti, Walter R; Zwahlen, Marcel; Hoffmann, Henry; Kindler, Christoph; Fux, Christoph; Misteli, Heidi; Iselin, Lukas; Lugli, Andrea Kopp; Nebiker, Christian A; von Holzen, Urs; Vinzens, Fabrizio; von Strauss, Marco; Reck, Stefan; Kraljević, Marko; Widmer, Andreas F; Oertli, Daniel; Rosenthal, Rachel; Weber, Walter P

    2014-05-24

    Surgical site infections are the most common hospital-acquired infections among surgical patients. The administration of surgical antimicrobial prophylaxis reduces the risk of surgical site infections. The optimal timing of this procedure is still a matter of debate. While most studies suggest that it should be given as close to the incision time as possible, others conclude that this may be too late for optimal prevention of surgical site infections. A large observational study suggests that surgical antimicrobial prophylaxis should be administered 74 to 30 minutes before surgery. The aim of this article is to report the design and protocol of a randomized controlled trial investigating the optimal timing of surgical antimicrobial prophylaxis. In this bi-center randomized controlled trial conducted at two tertiary referral centers in Switzerland, we plan to include 5,000 patients undergoing general, oncologic, vascular and orthopedic trauma procedures. Patients are randomized in a 1:1 ratio into two groups: one receiving surgical antimicrobial prophylaxis in the anesthesia room (75 to 30 minutes before incision) and the other receiving surgical antimicrobial prophylaxis in the operating room (less than 30 minutes before incision). We expect a significantly lower rate of surgical site infections with surgical antimicrobial prophylaxis administered more than 30 minutes before the scheduled incision. The primary outcome is the occurrence of surgical site infections during a 30-day follow-up period (one year with an implant in place). When assuming a 5% surgical site infection risk with administration of surgical antimicrobial prophylaxis in the operating room, the planned sample size has an 80% power to detect a relative risk reduction for surgical site infections of 33% when administering surgical antimicrobial prophylaxis in the anesthesia room (with a two-sided type I error of 5%). We expect the study to be completed within three years. The results of this
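
    The stated power claim can be checked with a standard normal-approximation calculation for comparing two proportions. The sketch below reproduces it under the protocol's assumptions (5% baseline risk, 33% relative risk reduction, 2,500 patients per arm, two-sided alpha of 5%) and returns a power of roughly 0.83.

```python
from math import sqrt
from statistics import NormalDist

# Normal-approximation power for comparing two independent proportions.
def power_two_proportions(p1, p2, n_per_arm, alpha=0.05):
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    se = sqrt(p1 * (1 - p1) / n_per_arm + p2 * (1 - p2) / n_per_arm)
    z = abs(p1 - p2) / se - z_alpha
    return NormalDist().cdf(z)

# 5% baseline SSI risk, 33% relative risk reduction, 2,500 patients per arm:
print(power_two_proportions(p1=0.05, p2=0.05 * (1 - 0.33), n_per_arm=2500))  # ~0.83
```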

  9. A randomized controlled trial of smartphone-based mindfulness training for smoking cessation: a study protocol.

    Science.gov (United States)

    Garrison, Kathleen A; Pal, Prasanta; Rojiani, Rahil; Dallery, Jesse; O'Malley, Stephanie S; Brewer, Judson A

    2015-04-14

    Tobacco use is responsible for the death of about 1 in 10 individuals worldwide. Mindfulness training has shown preliminary efficacy as a behavioral treatment for smoking cessation. Recent advances in mobile health suggest advantages to smartphone-based smoking cessation treatment, including smartphone-based mindfulness training. This study evaluates the efficacy of a smartphone app-based mindfulness training program for improving smoking cessation rates at 6-month follow-up. A two-group parallel-randomized clinical trial with allocation concealment will be conducted. Group assignment will be concealed from study researchers through to follow-up. The study will be conducted by smartphone and online. Daily smokers who are interested in quitting smoking and own a smartphone (n = 140) will be recruited through study advertisements posted online. After completion of a baseline survey, participants will be allocated randomly to the control or intervention group. Participants in both groups will receive a 22-day smartphone-based treatment program for smoking. Participants in the intervention group will receive mobile mindfulness training plus experience sampling. Participants in the control group will receive experience sampling only. The primary outcome measure will be one-week point prevalence abstinence from smoking (at 6-month follow-up) assessed using carbon monoxide breath monitoring, which will be validated through smartphone-based video chat. This is the first intervention study to evaluate smartphone-based delivery of mindfulness training for smoking cessation. Such an intervention may provide treatment in-hand, in real-world contexts, to help individuals quit smoking. Clinicaltrials.gov NCT02134509. Registered 7 May 2014.

  10. Approximating the variance of estimated means for systematic random sampling, illustrated with data of the French Soil Monitoring Network

    NARCIS (Netherlands)

    Brus, D.J.; Saby, N.P.A.

    2016-01-01

    In France like in many other countries, the soil is monitored at the locations of a regular, square grid thus forming a systematic sample (SY). This sampling design leads to good spatial coverage, enhancing the precision of design-based estimates of spatial means and totals. Design-based
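
    Although the record is truncated, the simplest design-based option in this setting can be sketched: treat the systematic sample as if it were a simple random sample and approximate the variance of the estimated mean by s²/n. This is only a first-order approximation of the kind the paper examines, and the values below are invented.

```python
from statistics import mean, variance

# Sketch of the simplest approximation: treat the systematic (grid) sample as
# if it were simple random, so var(mean) ≈ s^2 / n. More refined approximations
# exist; this is only the baseline approach.
def srs_variance_of_mean(values):
    return variance(values) / len(values)   # sample variance / n

y = [2.1, 1.8, 2.4, 2.0, 2.6, 1.9, 2.3, 2.2]   # hypothetical soil measurements
print(mean(y), srs_variance_of_mean(y))
```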

  11. The relationship between external and internal validity of randomized controlled trials: A sample of hypertension trials from China.

    Science.gov (United States)

    Zhang, Xin; Wu, Yuxia; Ren, Pengwei; Liu, Xueting; Kang, Deying

    2015-10-30

    To explore the relationship between the external validity and the internal validity of hypertension RCTs conducted in China. Comprehensive literature searches were performed in Medline, Embase, the Cochrane Central Register of Controlled Trials (CCTR), CBMdisc (Chinese biomedical literature database), CNKI (China National Knowledge Infrastructure/China Academic Journals Full-text Database) and VIP (Chinese scientific journals database), and advanced search strategies were used to locate hypertension RCTs. The risk of bias in the RCTs was assessed with a modified Jadad scale, and studies scoring 3 or more were included for the evaluation of external validity. A data extraction form covering 4 domains and 25 items was used to explore the relationship between external and internal validity. Statistical analyses were performed using SPSS software, version 21.0 (SPSS, Chicago, IL). A total of 226 hypertension RCTs were included in the final analysis. RCTs conducted in university-affiliated hospitals were significantly associated with higher internal validity. Multi-center studies (median = 4.0, IQR = 2.0) scored higher on internal validity than single-center studies (median = 3.0, IQR = 1.0), and a further study characteristic was also significantly associated with internal validity (P = 0.004). Multivariate regression indicated that sample size, industry funding, quality of life (QOL) as an outcome measure, and a university-affiliated hospital setting were statistically significant. The external validity of these RCTs is associated with their internal validity, although the two do not stand in an easy relationship to each other. Given the poor reporting, other possible links between the two variables need to be traced in future methodological research.

  12. The concentration of heavy metals: zinc, cadmium, lead, copper, mercury, iron and calcium in head hair of a randomly selected sample of Kenyan people

    International Nuclear Information System (INIS)

    Wandiga, S.O.; Jumba, I.O.

    1982-01-01

    An intercomparative analysis of the concentration of heavy metals (zinc, cadmium, lead, copper, mercury, iron and calcium) in the head hair of a randomly selected sample of Kenyan people was undertaken using the techniques of atomic absorption spectrophotometry (AAS) and differential pulse anodic stripping voltammetry (DPAS). The percent relative standard deviation for each sample analysed using either technique shows good sensitivity and correlation between the techniques. The DPAS was found to be slightly more sensitive than the AAS instrument used. The recalculated body burden ratios of Cd to Zn and Pb to Fe reveal no unusual health impairment symptoms and suggest a relatively clean environment in Kenya. (author)

  13. The influence of psychoeducation on regulating biological rhythm in a sample of patients with bipolar II disorder: a randomized clinical trial

    Directory of Open Access Journals (Sweden)

    Faria AD

    2014-06-01

    Full Text Available Augusto Duarte Faria,1 Luciano Dias de Mattos Souza,2 Taiane de Azevedo Cardoso,2 Karen Amaral Tavares Pinheiro,2 Ricardo Tavares Pinheiro,2 Ricardo Azevedo da Silva,2 Karen Jansen2. 1Department of Clinical and Health Psychology, Universidade Federal do Rio Grande – FURG, Rio Grande, RS, Brazil; 2Health and Behavior Postgraduate Program, Universidade Católica de Pelotas – UCPEL, Pelotas, RS, Brazil. Introduction: Changes in biological rhythm are among the various characteristics of bipolar disorder, and have long been associated with the functional impairment of the disease. There are only a few viable options of psychosocial interventions that deal with this specific topic; one of them is psychoeducation, a model that, although it has been used by practitioners for some time, only recently have studies shown its efficacy in clinical practice. Aim: To assess if patients undergoing psychosocial intervention in addition to a pharmacological treatment have better regulation of their biological rhythm than those only using medication. Method: This study is a randomized clinical trial that compares a standard medication intervention to an intervention combined with drugs and psychoeducation. The evaluation of the biological rhythm was made using the Biological Rhythm Interview of Assessment in Neuropsychiatry, an 18-item scale divided in four areas (sleep, activity, social rhythm, and eating pattern). The combined intervention consisted of medication and a short-term psychoeducation model summarized in a protocol of six individual sessions of 1 hour each. Results: The sample consisted of 61 patients with bipolar II disorder, but during the study, there were 14 losses to follow-up. Therefore, the final sample consisted of 45 individuals (26 for standard intervention and 19 for combined). The results showed that, in this sample and time period evaluated, the combined treatment of medication and psychoeducation had no statistically significant impact on the regulation of biological rhythm.

  14. Piroxicam immediate release formulations: A fasting randomized open-label crossover bioequivalence study in healthy volunteers.

    Science.gov (United States)

    Helmy, Sally A; El-Bedaiwy, Heba M

    2014-11-01

    Piroxicam is an NSAID with analgesic and antipyretic properties, used for the treatment of rheumatoid diseases. The aim of this study was to evaluate the bioequivalence of two brands of piroxicam capsules (20 mg) in 24 Egyptian volunteers. The in vivo study was designed as a single-center, randomized, single-dose, laboratory-blinded, 2-period, 2-sequence, crossover study with a washout period of 3 weeks. Under fasting conditions, 24 healthy male volunteers were randomly selected to receive a single oral dose of one capsule (20 mg) of either the test or the reference product. Plasma samples were obtained over a 144-hour interval and analyzed for piroxicam by HPLC with UV detection. The pharmacokinetic parameters Cmax, tmax, AUC0-t, AUC0-∞, Vd/F, Cl/F, and t1/2 were determined from the plasma concentration-time profiles. The 90% confidence intervals for the ratios of the log-transformed values of Cmax, AUC0-t, and AUC0-∞ of the two treatments were within the acceptable range (0.80-1.25) for bioequivalence. From a PK perspective, the two piroxicam formulations were considered bioequivalent, based on the rate and extent of absorption. No adverse events occurred or were reported after a single 20-mg dose of piroxicam, and both formulations were well tolerated. © 2014, The American College of Clinical Pharmacology.
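
    The bioequivalence criterion can be illustrated with a simplified calculation: a 90% confidence interval for the geometric mean test/reference ratio obtained from paired, log-transformed AUC values. This sketch ignores the period and sequence terms a full crossover ANOVA would include, and the data are simulated, not the study's.

```python
import numpy as np
from scipy import stats

# Simplified sketch: 90% CI for the geometric mean test/reference ratio from
# paired log-differences (a full crossover analysis also models period and
# sequence effects).
def be_90ci(test, ref):
    d = np.log(test) - np.log(ref)               # within-subject log differences
    n = len(d)
    se = d.std(ddof=1) / np.sqrt(n)
    t90 = stats.t.ppf(0.95, df=n - 1)
    lo, hi = d.mean() - t90 * se, d.mean() + t90 * se
    return np.exp(lo), np.exp(hi)                 # back-transform to a ratio

rng = np.random.default_rng(1)
ref = rng.lognormal(mean=np.log(50), sigma=0.2, size=24)    # hypothetical AUCs
test = ref * rng.lognormal(mean=0.0, sigma=0.1, size=24)
print(be_90ci(test, ref))   # declared bioequivalent if inside (0.80, 1.25)
```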

  15. A comparative study of sampling techniques for monitoring carcass contamination

    NARCIS (Netherlands)

    Snijders, J.M.A.; Janssen, M.H.W.; Gerats, G.E.; Corstiaensen, G.P.

    1984-01-01

    Four bacteriological sampling techniques i.e. the excision, double swab, agar contract and modified agar contact techniques were compared by sampling pig carcasses before and after chilling. As well as assessing the advantages and disadvantages of the techniques particular attention was paid to

  16. Imitation and luck: an experimental study on social sampling

    NARCIS (Netherlands)

    Offerman, T.; Schotter, A.

    2007-01-01

    In this paper, we present the results of two experiments on social sampling. In both experiments, people are asked to make a risky decision in a situation where an idiosyncratic luck term affects their performance. Before they make their decision, people have the opportunity to sample others who have

  17. Imitation and luck: an experimental study on social sampling

    NARCIS (Netherlands)

    Offerman, T.; Schotter, A.

    2008-01-01

    In this paper, we present the results of two experiments on social sampling, where people make a risky decision after they have sampled the behavior of others who have done exactly the same problem before them. In an individual decision making problem as well as in the take-over game, the simple

  18. Managing mobility outcomes in vulnerable seniors (MMOVeS): a randomized controlled pilot study.

    Science.gov (United States)

    Figueiredo, Sabrina; Morais, Jose A; Mayo, Nancy

    2017-12-01

    To estimate the feasibility and potential for efficacy of an individualized, exercise-focused, self-management program (i.e. Managing Mobility Outcomes in Vulnerable Seniors (MMOVeS)), in comparison with exercise information, in improving mobility after six months among seniors recently discharged from hospital. Randomized pilot study. Two McGill University teaching hospitals. Community-dwelling seniors, aged 70 years and older, recently discharged from either participating hospital. The physiotherapy-facilitated intervention consisted of (1) evaluation of mobility capacity, (2) setting short- and long-term goals, (3) delineation of an exercise treatment plan, (4) an educational booklet to enhance mobility self-management skills, and (5) six monthly telephone calls. The control group received a booklet with information on exercises targeting mobility limitations in seniors. Mobility, pain, and health status were assessed at baseline and at six months using multiple indicators drawn from the Disabilities of the Arm, Shoulder, and Hand (DASH) score, the Lower Extremity Functional Scale (LEFS) and the Short-Form (SF)-36. In all, 26 people were randomized to the intervention (mean age: 81 ± 8; 39% women), and 23 were randomized to the control (mean age: 79 ± 7; 33% women). The odds ratio for the mobility outcomes combined was 3.08, and the 95% confidence interval excluded 1 (1.65-5.77). The odds ratio for pain and health perception favored the MMOVeS group, but the 95% confidence interval included the null value. This feasibility study highlights the potential for efficacy of an individualized, exercise-focused, self-management program, in comparison with exercise information, in improving mobility outcomes for seniors. Furthermore, a home program combining self-management skills and exercise taught with minimal supervision proved to be feasible. Finally, data from this study can be used to estimate the sample size for a confirmatory trial.

  19. Thermal discomfort with cold extremities in relation to age, gender, and body mass index in a random sample of a Swiss urban population

    Directory of Open Access Journals (Sweden)

    Orgül Selim

    2010-06-01

    Full Text Available Abstract Background The aim of this epidemiological study was to investigate the relationship of thermal discomfort with cold extremities (TDCE) to age, gender, and body mass index (BMI) in a Swiss urban population. Methods In a random population sample of Basel city, 2,800 subjects aged 20-40 years were asked to complete a questionnaire evaluating the extent of cold extremities. Values of cold extremities were based on questionnaire-derived scores. The correlation of age, gender, and BMI to TDCE was analyzed using multiple regression analysis. Results A total of 1,001 women (72.3% response rate) and 809 men (60% response rate) returned a completed questionnaire. Statistical analyses revealed the following findings: younger subjects suffered more intensely from cold extremities than the elderly, and women suffered more than men (particularly younger women). Slimmer subjects suffered significantly more often from cold extremities than subjects with higher BMIs. Conclusions Thermal discomfort with cold extremities (a relevant symptom of primary vascular dysregulation) occurs at highest intensity in younger, slimmer women and at lowest intensity in elderly, stouter men.

  20. Assessing the suitability of summary data for two-sample Mendelian randomization analyses using MR-Egger regression: the role of the I2 statistic.

    Science.gov (United States)

    Bowden, Jack; Del Greco M, Fabiola; Minelli, Cosetta; Davey Smith, George; Sheehan, Nuala A; Thompson, John R

    2016-12-01

    MR-Egger regression has recently been proposed as a method for Mendelian randomization (MR) analyses incorporating summary data estimates of causal effect from multiple individual variants, which is robust to invalid instruments. It can be used to test for directional pleiotropy and provides an estimate of the causal effect adjusted for its presence. MR-Egger regression provides a useful additional sensitivity analysis to the standard inverse variance weighted (IVW) approach that assumes all variants are valid instruments. Both methods use weights that consider the single nucleotide polymorphism (SNP)-exposure associations to be known, rather than estimated. We call this the 'NO Measurement Error' (NOME) assumption. Causal effect estimates from the IVW approach exhibit weak instrument bias whenever the genetic variants utilized violate the NOME assumption, which can be reliably measured using the F-statistic. The effect of NOME violation on MR-Egger regression has yet to be studied. An adaptation of the I² statistic from the field of meta-analysis is proposed to quantify the strength of NOME violation for MR-Egger. It lies between 0 and 1, and indicates the expected relative bias (or dilution) of the MR-Egger causal estimate in the two-sample MR context. We call it I²_GX. The method of simulation extrapolation is also explored to counteract the dilution. Their joint utility is evaluated using simulated data and applied to a real MR example. In simulated two-sample MR analyses we show that, when a causal effect exists, the MR-Egger estimate of causal effect is biased towards the null when NOME is violated, and the stronger the violation (as indicated by lower values of I²_GX), the stronger the dilution. When additionally all genetic variants are valid instruments, the type I error rate of the MR-Egger test for pleiotropy is inflated and the causal effect underestimated. Simulation extrapolation is shown to substantially mitigate these adverse effects. We
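
    The proposed statistic is an adaptation of the familiar meta-analysis I² formula applied to the SNP-exposure estimates. The sketch below computes it that way from hypothetical effect estimates and standard errors; it is a plausible reading of the construction rather than the authors' published code.

```python
import numpy as np

# Sketch: the meta-analysis I^2 statistic applied to SNP-exposure estimates
# (gamma_hat) and their standard errors (se_gamma); values are hypothetical.
def i_squared_gx(gamma_hat, se_gamma):
    w = 1.0 / se_gamma**2
    gamma_bar = np.sum(w * gamma_hat) / np.sum(w)        # inverse-variance-weighted mean
    q = np.sum(w * (gamma_hat - gamma_bar)**2)            # Cochran's Q
    k = len(gamma_hat)
    return max(0.0, (q - (k - 1)) / q)

gamma_hat = np.array([0.12, 0.08, 0.15, 0.10, 0.09, 0.14])
se_gamma = np.array([0.01, 0.02, 0.015, 0.012, 0.02, 0.01])
print(i_squared_gx(gamma_hat, se_gamma))   # values near 1 suggest little dilution
```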

  1. Effects of sample size and sampling frequency on studies of brown bear home ranges and habitat use

    Science.gov (United States)

    Arthur, Steve M.; Schwartz, Charles C.

    1999-01-01

    We equipped 9 brown bears (Ursus arctos) on the Kenai Peninsula, Alaska, with collars containing both conventional very-high-frequency (VHF) transmitters and global positioning system (GPS) receivers programmed to determine an animal's position at 5.75-hr intervals. We calculated minimum convex polygon (MCP) and fixed and adaptive kernel home ranges for randomly selected subsets of the GPS data to examine the effects of sample size on accuracy and precision of home range estimates. We also compared results obtained by weekly aerial radiotracking versus more frequent GPS locations to test for biases in conventional radiotracking data. Home ranges based on the MCP were 20-606 km2 (mean = 201) for aerial radiotracking data (n = 12-16 locations/bear) and 116-1,505 km2 (mean = 522) for the complete GPS data sets (n = 245-466 locations/bear). Fixed kernel home ranges were 34-955 km2 (mean = 224) for radiotracking data and 16-130 km2 (mean = 60) for the GPS data. Differences between means for radiotracking and GPS data were due primarily to the larger samples provided by the GPS data. Means did not differ between radiotracking data and equivalent-sized subsets of GPS data (P > 0.10). For the MCP, home range area increased and variability decreased asymptotically with number of locations. For the kernel models, both area and variability decreased with increasing sample size. Simulations suggested that the MCP and kernel models required >60 and >80 locations, respectively, for estimates to be both accurate and precise. Our results suggest that the usefulness of conventional radiotracking data may be limited by potential biases and variability due to small samples. Investigators that use home range estimates in statistical tests should consider the effects of variability of those estimates. Use of GPS-equipped collars can facilitate obtaining larger samples of unbiased data and improve accuracy and precision of home range estimates.
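
    A minimum convex polygon estimate of the kind compared here is straightforward to compute from location fixes. The sketch below does so with a convex hull; the coordinates are simulated and are not the study's telemetry data.

```python
import numpy as np
from scipy.spatial import ConvexHull

# Sketch of a minimum convex polygon (MCP) home-range estimate from location
# fixes expressed in projected kilometre coordinates.
def mcp_area_km2(xy_km):
    """xy_km: (n, 2) array of projected locations in kilometres."""
    hull = ConvexHull(xy_km)
    return hull.volume        # for 2-D input, .volume is the enclosed area

rng = np.random.default_rng(7)
fixes = rng.normal(loc=0.0, scale=5.0, size=(300, 2))   # ~300 simulated GPS fixes
print(mcp_area_km2(fixes))
```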

  2. Differentiating emotions across contexts: comparing adults with and without social anxiety disorder using random, social interaction, and daily experience sampling.

    Science.gov (United States)

    Kashdan, Todd B; Farmer, Antonina S

    2014-06-01

    The ability to recognize and label emotional experiences has been associated with well-being and adaptive functioning. This skill is particularly important in social situations, as emotions provide information about the state of relationships and help guide interpersonal decisions, such as whether to disclose personal information. Given the interpersonal difficulties linked to social anxiety disorder (SAD), deficient negative emotion differentiation may contribute to impairment in this population. We hypothesized that people with SAD would exhibit less negative emotion differentiation in daily life, and these differences would translate to impairment in social functioning. We recruited 43 people diagnosed with generalized SAD and 43 healthy adults to describe the emotions they experienced over 14 days. Participants received palmtop computers for responding to random prompts and describing naturalistic social interactions; to complete end-of-day diary entries, they used a secure online website. We calculated intraclass correlation coefficients to capture the degree of differentiation of negative and positive emotions for each context (random moments, face-to-face social interactions, and end-of-day reflections). Compared to healthy controls, the SAD group exhibited less negative (but not positive) emotion differentiation during random prompts, social interactions, and (at trend level) end-of-day assessments. These differences could not be explained by emotion intensity or variability over the 14 days, or to comorbid depression or anxiety disorders. Our findings suggest that people with generalized SAD have deficits in clarifying specific negative emotions felt at a given point of time. These deficits may contribute to difficulties with effective emotion regulation and healthy social relationship functioning.

  3. Differentiating Emotions Across Contexts: Comparing Adults with and without Social Anxiety Disorder Using Random, Social Interaction, and Daily Experience Sampling

    Science.gov (United States)

    Kashdan, Todd B.; Farmer, Antonina S.

    2014-01-01

    The ability to recognize and label emotional experiences has been associated with well-being and adaptive functioning. This skill is particularly important in social situations, as emotions provide information about the state of relationships and help guide interpersonal decisions, such as whether to disclose personal information. Given the interpersonal difficulties linked to social anxiety disorder (SAD), deficient negative emotion differentiation may contribute to impairment in this population. We hypothesized that people with SAD would exhibit less negative emotion differentiation in daily life, and these differences would translate to impairment in social functioning. We recruited 43 people diagnosed with generalized SAD and 43 healthy adults to describe the emotions they experienced over 14 days. Participants received palmtop computers for responding to random prompts and describing naturalistic social interactions; to complete end-of-day diary entries, they used a secure online website. We calculated intraclass correlation coefficients to capture the degree of differentiation of negative and positive emotions for each context (random moments, face-to-face social interactions, and end-of-day reflections). Compared to healthy controls, the SAD group exhibited less negative (but not positive) emotion differentiation during random prompts, social interactions, and (at trend level) end-of-day assessments. These differences could not be explained by emotion intensity or variability over the 14 days, or by comorbid depression or anxiety disorders. Our findings suggest that people with generalized SAD have deficits in clarifying specific negative emotions felt at a given point in time. These deficits may contribute to difficulties with effective emotion regulation and healthy social relationship functioning. PMID:24512246

  4. Systematic studies of small scintillators for new sampling calorimeter

    International Nuclear Information System (INIS)

    Jacosalem, E.P.; Sanchez, A.L.C.; Bacala, A.M.; Iba, S.; Nakajima, N.; Ono, H.; Miyata, H.

    2007-01-01

    A new sampling calorimeter using very thin scintillators and the multi-pixel photon counter (MPPC) has been proposed to produce better position resolution for the international linear collider (ILC) experiment. As part of this R and D study, small plastic scintillators of different sizes, thicknesses and wrapping reflectors are systematically studied. The scintillation light due to beta rays from a collimated 90Sr source is collected from the scintillator by wavelength-shifting (WLS) fiber and converted into electrical signals at the PMT. The wrapped scintillator that gives the best light yield is determined by comparing the measured pulse height of each 10 x 40 x 2 mm strip scintillator covered with 3M reflective mirror film, teflon, white paint, black tape, gold, aluminum and white paint+teflon. The pulse height dependence on position, length and thickness of the 3M reflective mirror film and teflon wrapped scintillators is measured. Results show that the 3M radiant mirror film-wrapped scintillator has the greatest light yield with an average of 9.2 photoelectrons. It is observed that light yield slightly increases with scintillator length, but increases by about 100% when the WLS fiber diameter is increased from 1.0 mm to 1.6 mm. The position dependence measurement along the strip scintillator showed the uniformity of light transmission from the sensor to the PMT. A dip, which is 40% of the maximum pulse height, is observed across the strip. The block type scintillator pulse height, on the other hand, is found to be almost proportional to scintillator thickness. (author)

  5. A Study on the Amount of Random Graph Groupies

    OpenAIRE

    Lu, Daodi

    2013-01-01

    In 1980, Ajtai, Komlós and Szemerédi defined "groupie": Let $G=(V,E)$ be a simple graph, $|V|=n$, $|E|=e$. For a vertex $v\in V$, let $r(v)$ denote the sum of the degrees of the vertices adjacent to $v$. We say $v\in V$ is a groupie if $\frac{r(v)}{\deg(v)}\geq\frac{e}{n}$. In this paper, we prove that in the random graph $B(n,p)$, $0
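
    A quick way to see the definition in action is to count groupies in a simulated binomial random graph; the sketch below does this with networkx (assumed available), using an Erdős–Rényi G(n, p) graph as a stand-in for B(n, p).

```python
# Minimal sketch of the definition above: count "groupies" in a random graph
# G(n, p), i.e. vertices v with (sum of neighbours' degrees)/deg(v) >= e/n.
import networkx as nx

n, p = 200, 0.1
G = nx.gnp_random_graph(n, p, seed=1)
e = G.number_of_edges()

groupies = 0
for v in G.nodes:
    d = G.degree(v)
    if d == 0:
        continue  # the ratio is undefined for isolated vertices
    r = sum(G.degree(u) for u in G.neighbors(v))
    if r / d >= e / n:
        groupies += 1

print(f"{groupies} of {n} vertices are groupies (e/n = {e / n:.1f})")
```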

  6. Simulation and study of small numbers of random events

    Science.gov (United States)

    Shelton, R. D.

    1986-01-01

    Random events were simulated by computer and subjected to various statistical methods to extract important parameters. Various forms of curve fitting were explored, such as least squares, least distance from a line, and maximum likelihood. Problems considered were dead time, exponential decay, and spectrum extraction from cosmic ray data using binned data and data from individual events. Computer programs, mostly of an iterative nature, were developed to do these simulations and extractions and are partially listed as appendices. The mathematical basis for the computer programs is given.
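
    For example, fitting an exponential decay to a handful of individual event times by maximum likelihood reduces to taking the sample mean; the sketch below (not the original programs, values hypothetical) illustrates how large the small-sample uncertainty is.

```python
# Minimal sketch (not the original programs): maximum-likelihood estimation of
# an exponential decay constant from a small number of individual event times,
# the kind of small-sample problem discussed above.
import numpy as np

rng = np.random.default_rng(42)
true_lifetime = 2.0                                 # hypothetical mean lifetime
events = rng.exponential(true_lifetime, size=20)    # only 20 recorded decays

# For the exponential density (1/tau) * exp(-t/tau), the MLE of tau is the
# sample mean; its relative uncertainty is roughly 1/sqrt(N) for N events.
tau_hat = events.mean()
rel_err = 1.0 / np.sqrt(len(events))
print(f"tau_hat = {tau_hat:.2f}  (approx. +/- {tau_hat * rel_err:.2f})")
```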

  7. Cooling tower wood sampling and analyses: A case study

    International Nuclear Information System (INIS)

    Haymore, J.L.

    1985-01-01

    Extensive wood sampling and analyses programs were initiated on crossflow and counterflow cooling towers that have been in service since 1951 and 1955, respectively. Wood samples were taken from all areas of the towers and were subjected to biological, chemical and physical tests. The tests and results for the analyses are discussed. The results indicate the degree of wood deterioration, and areas of the towers which experience the most advanced degree of degradation

  8. Financial ties of principal investigators and randomized controlled trial outcomes: cross sectional study.

    Science.gov (United States)

    Ahn, Rosa; Woodbridge, Alexandra; Abraham, Ann; Saba, Susan; Korenstein, Deborah; Madden, Erin; Boscardin, W John; Keyhani, Salomeh

    2017-01-17

     To examine the association between the presence of individual principal investigators' financial ties to the manufacturer of the study drug and the trial's outcomes after accounting for source of research funding.  Cross sectional study of randomized controlled trials (RCTs).  Studies published in "core clinical" journals, as identified by Medline, between 1 January 2013 and 31 December 2013.  Random sample of RCTs focused on drug efficacy.  Association between financial ties of principal investigators and study outcome.  A total of 190 papers describing 195 studies met inclusion criteria. Financial ties between principal investigators and the pharmaceutical industry were present in 132 (67.7%) studies. Of 397 principal investigators, 231 (58%) had financial ties and 166 (42%) did not. Of all principal investigators, 156 (39%) reported advisor/consultancy payments, 81 (20%) reported speakers' fees, 81 (20%) reported unspecified financial ties, 52 (13%) reported honorariums, 52 (13%) reported employee relationships, 52 (13%) reported travel fees, 41 (10%) reported stock ownership, and 20 (5%) reported having a patent related to the study drug. The prevalence of financial ties of principal investigators was 76% (103/136) among positive studies and 49% (29/59) among negative studies. In unadjusted analyses, the presence of a financial tie was associated with a positive study outcome (odds ratio 3.23, 95% confidence interval 1.7 to 6.1). In the primary multivariate analysis, a financial tie was significantly associated with positive RCT outcome after adjustment for the study funding source (odds ratio 3.57 (1.7 to 7.7). The secondary analysis controlled for additional RCT characteristics such as study phase, sample size, country of first authors, specialty, trial registration, study design, type of analysis, comparator, and outcome measure. These characteristics did not appreciably affect the relation between financial ties and study outcomes (odds ratio 3.37, 1

  9. Inverse probability weighting for covariate adjustment in randomized studies.

    Science.gov (United States)

    Shen, Changyu; Li, Xiaochun; Li, Lingling

    2014-02-20

    Covariate adjustment in randomized clinical trials has the potential benefit of precision gain. It also has the potential pitfall of reduced objectivity, as it opens the possibility of selecting a 'favorable' model that yields a strong treatment benefit estimate. Although there is a large volume of statistical literature targeting the first aspect, realistic solutions to enforce objective inference and improve precision are rare. As a typical randomized trial needs to accommodate many implementation issues beyond statistical considerations, maintaining objectivity is at least as important as precision gain, if not more so, particularly from the perspective of the regulatory agencies. In this article, we propose a two-stage estimation procedure based on inverse probability weighting to achieve better precision without compromising objectivity. The procedure is designed in a way such that the covariate adjustment is performed before seeing the outcome, effectively reducing the possibility of selecting a 'favorable' model that yields a strong intervention effect. Both theoretical and numerical properties of the estimation procedure are presented. Application of the proposed method to a real data example is presented. Copyright © 2013 John Wiley & Sons, Ltd.
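
    A minimal sketch of the general idea, under simplifying assumptions and simulated data (this is not the authors' two-stage procedure in detail): fit an outcome-blind weight model for treatment assignment given baseline covariates, then compare weighted outcome means.

```python
# Minimal sketch: inverse probability weighting for covariate adjustment in a
# randomized trial.  The weight model uses only treatment assignment and
# baseline covariates, so it can be fitted before the outcome is looked at.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 500
x = rng.normal(size=(n, 3))                      # baseline covariates
t = rng.integers(0, 2, size=n)                   # 1:1 randomized treatment
y = 1.0 * t + x @ np.array([0.5, -0.3, 0.2]) + rng.normal(size=n)  # outcome

# Stage 1 (outcome-blind): model P(T=1 | X) and form stabilized weights.
ps = LogisticRegression().fit(x, t).predict_proba(x)[:, 1]
w = np.where(t == 1, t.mean() / ps, (1 - t.mean()) / (1 - ps))

# Stage 2: weighted difference in mean outcomes.
effect = (np.average(y[t == 1], weights=w[t == 1])
          - np.average(y[t == 0], weights=w[t == 0]))
print(f"IPW-adjusted treatment effect: {effect:.2f} (true effect 1.0)")
```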

  10. Methicillin-sensitive and methicillin-resistant Staphylococcus aureus nasal carriage in a random sample of non-hospitalized adult population in northern Germany.

    Directory of Open Access Journals (Sweden)

    Jaishri Mehraj

    Full Text Available OBJECTIVE: The findings from truly randomized community-based studies on Staphylococcus aureus nasal colonization are scarce. Therefore we have examined point prevalence and risk factors of S. aureus nasal carriage in a non-hospitalized population of Braunschweig, northern Germany. METHODS: A total of 2026 potential participants were randomly selected through the resident's registration office and invited by mail. They were requested to collect a nasal swab at home and return it by mail. S. aureus was identified by culture and PCR. Logistic regression was used to determine risk factors of S. aureus carriage. RESULTS: Among the invitees, 405 individuals agreed to participate and 389 provided complete data which was included in the analysis. The median age of the participants was 49 years (IQR: 39-61) and 61% were females. S. aureus was isolated in 85 (21.9%; 95% CI: 18.0-26.2%) of the samples, five of which were MRSA (1.29%; 95% CI: 0.55-2.98%). In multiple logistic regression, male sex (OR = 3.50; 95% CI: 2.01-6.11) and presence of allergies (OR = 2.43; 95% CI: 1.39-4.24) were found to be associated with S. aureus nasal carriage. Fifty-five different spa types were found that clustered into nine distinct groups. MRSA belonged to the hospital-associated spa types t032 and t025 (corresponds to MLST CC 22), whereas MSSA spa types varied and mostly belonged to spa-CC 012 (corresponds to MLST CC 30), and spa-CC 084 (corresponds to MLST CC 15). CONCLUSION: This first point prevalence study of S. aureus in a non-hospitalized population of Germany revealed a prevalence consistent with other European countries and supports previous findings on male sex and allergies as risk factors of S. aureus carriage. The detection of hospital-associated MRSA spa types in the community indicates possible spread of these strains from hospitals into the community.

  11. Study of probe-sample distance for biomedical spectra measurement

    Directory of Open Access Journals (Sweden)

    Li Lei

    2011-11-01

    Full Text Available Abstract Background Fiber-based optical spectroscopy has been widely used for biomedical applications. However, the effect of probe-sample distance on the collection efficiency has not been well investigated. Method In this paper, we presented a theoretical model to maximize the illumination and collection efficiency in designing fiber optic probes for biomedical spectra measurement. This model was in general applicable to probes with single or multiple fibers at an arbitrary incident angle. In order to demonstrate the theory, a fluorescence spectrometer was used to measure the fluorescence of human finger skin at various probe-sample distances. The fluorescence spectrum and the total fluorescence intensity were recorded. Results The theoretical results show that for single fiber probes, contact measurement always provides the best results. While for multi-fiber probes, there is an optimal probe distance. When a 400- μm excitation fiber is used to deliver the light to the skin and another six 400- μm fibers surrounding the excitation fiber are used to collect the fluorescence signal, the experimental results show that human finger skin has very strong fluorescence between 475 nm and 700 nm under 450 nm excitation. The fluorescence intensity is heavily dependent on the probe-sample distance and there is an optimal probe distance. Conclusions We investigated a number of probe-sample configurations and found that contact measurement could be the primary choice for single-fiber probes, but was very inefficient for multi-fiber probes. There was an optimal probe-sample distance for multi-fiber probes. By carefully choosing the probe-sample distance, the collection efficiency could be enhanced by 5-10 times. Our experiments demonstrated that the experimental results of the probe-sample distance dependence of collection efficiency in multi-fiber probes were in general agreement with our theory.

  12. Modeling of Residential Water Demand Using Random Effect Model,Case Study: Arak City

    Directory of Open Access Journals (Sweden)

    Seyed Hossein Sajadifar

    2011-10-01

    Full Text Available The present study tries to apply the “Partial Adjustment Model” and “Random Effect Model” techniques to the Stone-Geary linear expenditure system, in order to estimate the "Residential Seasonal Demand" for water in Arak city. Per capita water consumption of family residences is regressed on marginal price, per capita income, price of other goods, average temperature and average rainfall. Panel data approaches are based on a sample of 152 observations from Arak city covering 1993-2003. From the estimated price elasticity of residential water demand, we want to know how a policy of responsive pricing can lead to more efficient household water consumption in Arak city. Results also indicated that the summer price elasticity was twice the winter value, and that price and income elasticities were less than 1 in all cases.
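
    A random-effects specification of this kind can be sketched as follows (hypothetical column names and file; this is not the authors' dataset or exact model), using the statsmodels mixed-effects API with a household-level random intercept.

```python
# Minimal sketch (hypothetical column names, not the authors' dataset): a
# random-effects panel regression of per-capita water use on price, income
# and weather, in the spirit of the model described above.
import pandas as pd
import statsmodels.formula.api as smf

# panel.csv is assumed to hold one row per household-season with these columns.
df = pd.read_csv("panel.csv")   # household, water_pc, price, income_pc, temp, rain

model = smf.mixedlm(
    "water_pc ~ price + income_pc + temp + rain",   # fixed effects
    data=df,
    groups=df["household"],                         # household random intercept
)
result = model.fit()
print(result.summary())

# Elasticities can then be evaluated at the sample means, e.g.
# price_elasticity = result.params["price"] * df["price"].mean() / df["water_pc"].mean()
```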

  13. Study of Bacterial Samples Using Laser Induced Breakdown Spectroscopy

    International Nuclear Information System (INIS)

    Farooq W A; Atif M; Tawfik W; Alsalhi M S; Alahmed Z A; Sarfraz M; Singh J P

    2014-01-01

    Laser-induced breakdown spectroscopy (LIBS) technique has been applied to investigate two different types of bacteria, Escherichia coli (B1) and Micrococcus luteus (B2) deposited on glass slides using Spectrolaser 7000. LIBS spectra were analyzed using spectrolaser software. LIBS spectrum of glass substrate was compared with bacteria spectra. Ca, Mg, Na, K, P, S, Cl, Fe, Al, Mn, Cu, C, H and CN-band appeared in bacterial samples in air. Two carbon lines at 193.02 nm, 247.88 nm and one hydrogen line at 656.28 nm with intensity ratios of 1.9, 1.83 and 1.53 appeared in bacterial samples B1 and B2 respectively. Carbon and hydrogen are the important components of the bio-samples like bacteria and other cancer cells. Investigation on LIBS spectra of the samples in He and Ar atmospheres is also presented. Ni lines appeared only in B2 sample in Ar atmosphere. From the present experimental results we are able to show that LIBS technique has a potential in the identification and discrimination of different types of bacteria. (plasma technology)

  14. Studies on the radiocarbon sample from the shroud of turin

    Energy Technology Data Exchange (ETDEWEB)

    Rogers, Raymond N. [Los Alamos National Laboratory, University of California, 1961 Cumbres Patio, Los Alamos, NM 87544 (United States)]. E-mail: rnrogers@att.net

    2005-01-20

    In 1988, radiocarbon laboratories at Arizona, Cambridge, and Zurich determined the age of a sample from the Shroud of Turin. They reported that the date of the cloth's production lay between A.D. 1260 and 1390 with 95% confidence. This came as a surprise in view of the technology used to produce the cloth, its chemical composition, and the lack of vanillin in its lignin. The results prompted questions about the validity of the sample. Preliminary estimates of the kinetics constants for the loss of vanillin from lignin indicate a much older age for the cloth than the radiocarbon analyses. The radiocarbon sampling area is uniquely coated with a yellow-brown plant gum containing dye lakes. Pyrolysis-mass-spectrometry results from the sample area coupled with microscopic and microchemical observations prove that the radiocarbon sample was not part of the original cloth of the Shroud of Turin. The radiocarbon date was thus not valid for determining the true age of the shroud.

  15. Studies on the radiocarbon sample from the shroud of turin

    International Nuclear Information System (INIS)

    Rogers, Raymond N.

    2005-01-01

    In 1988, radiocarbon laboratories at Arizona, Cambridge, and Zurich determined the age of a sample from the Shroud of Turin. They reported that the date of the cloth's production lay between A.D. 1260 and 1390 with 95% confidence. This came as a surprise in view of the technology used to produce the cloth, its chemical composition, and the lack of vanillin in its lignin. The results prompted questions about the validity of the sample. Preliminary estimates of the kinetics constants for the loss of vanillin from lignin indicate a much older age for the cloth than the radiocarbon analyses. The radiocarbon sampling area is uniquely coated with a yellow-brown plant gum containing dye lakes. Pyrolysis-mass-spectrometry results from the sample area coupled with microscopic and microchemical observations prove that the radiocarbon sample was not part of the original cloth of the Shroud of Turin. The radiocarbon date was thus not valid for determining the true age of the shroud

  16. Network meta-analysis incorporating randomized controlled trials and non-randomized comparative cohort studies for assessing the safety and effectiveness of medical treatments: challenges and opportunities

    OpenAIRE

    Cameron, Chris; Fireman, Bruce; Hutton, Brian; Clifford, Tammy; Coyle, Doug; Wells, George; Dormuth, Colin R.; Platt, Robert; Toh, Sengwee

    2015-01-01

    Network meta-analysis is increasingly used to allow comparison of multiple treatment alternatives simultaneously, some of which may not have been compared directly in primary research studies. The majority of network meta-analyses published to date have incorporated data from randomized controlled trials (RCTs) only; however, inclusion of non-randomized studies may sometimes be considered. Non-randomized studies can complement RCTs or address some of their limitations, such as short follow-up...

  17. Computed tomography of the brain, hepatotoxic drugs and high alcohol consumption in male alcoholic patients and a random sample from the general male population

    Energy Technology Data Exchange (ETDEWEB)

    Muetzell, S. (Univ. Hospital of Uppsala (Sweden). Dept. of Family Medicine)

    1992-01-01

    Computed tomography (CT) of the brain was performed in a random sample of a total of 195 men and 211 male alcoholic patients admitted for the first time during a period of two years from the same geographically limited area of Greater Stockholm as the sample. Laboratory tests were performed, including liver and pancreatic tests. Toxicological screening was performed and the consumption of hepatotoxic drugs was also investigated. The groups were then subdivided with respect to alcohol consumption and use of hepatotoxic drugs: group IA, men from the random sample with low or moderate alcohol consumption and no use of hepatotoxic drugs; IB, men from the random sample with low or moderate alcohol consumption with use of hepatotoxic drugs; IIA, alcoholic inpatients with use of alcohol and no drugs; and IIB, alcoholic inpatients with use of alcohol and drugs. Group IIB was found to have a higher incidence of cortical and subcortical changes than group IA. Group IB had a higher incidence of subcortical changes than group IA, and they differed only in drug use. Groups IIB and IIA only differed in drug use, and IIB had a higher incidence of brain damage except for anterior horn index and wide cerebellar sulci indicating vermian atrophy. Significantly higher serum levels of bilirubin, GGT, ASAT, ALAT, CK, LD, and amylase were found in IIB. The results indicate that drug use influences the incidence of cortical and subcortical aberrations, except anterior horn index. It is concluded that the groups with alcohol abuse who used hepatotoxic drugs showed a picture of cortical changes (wide transport sulci and clear-cut high-grade cortical changes) and also of subcortical aberrations, expressed as increased widening of the third ventricle.

  18. Computed tomography of the brain, hepatotoxic drugs and high alcohol consumption in male alcoholic patients and a random sample from the general male population

    International Nuclear Information System (INIS)

    Muetzell, S.

    1992-01-01

    Computed tomography (CT) of the brain was performed in a random sample of a total of 195 men and 211 male alcoholic patients admitted for the first time during a period of two years from the same geographically limited area of Greater Stockholm as the sample. Laboratory tests were performed, including liver and pancreatic tests. Toxicological screening was performed and the consumption of hepatotoxic drugs was also investigated. The groups were then subdivided with respect to alcohol consumption and use of hepatotoxic drugs: group IA, men from the random sample with low or moderate alcohol consumption and no use of hepatotoxic drugs; IB, men from the random sample with low or moderate alcohol consumption with use of hepatotoxic drugs; IIA, alcoholic inpatients with use of alcohol and no drugs; and IIB, alcoholic inpatients with use of alcohol and drugs. Group IIB was found to have a higher incidence of cortical and subcortical changes than group IA. Group IB had a higher incidence of subcortical changes than group IA, and they differed only in drug use. Groups IIB and IIA only differed in drug use, and IIB had a higher incidence of brain damage except for anterior horn index and wide cerebellar sulci indicating vermian atrophy. Significantly higher serum levels of bilirubin, GGT, ASAT, ALAT, CK, LD, and amylase were found in IIB. The results indicate that drug use influences the incidence of cortical and subcortical aberrations, except anterior horn index. It is concluded that the groups with alcohol abuse who used hepatotoxic drugs showed a picture of cortical changes (wide transport sulci and clear-cut high-grade cortical changes) and also of subcortical aberrations, expressed as increased widening of the third ventricle.

  19. Sugammadex versus neostigmine in pediatric patients: a prospective randomized study

    Directory of Open Access Journals (Sweden)

    Turhan Kara

    2014-11-01

    Full Text Available Background and objectives: Acetylcholinesterase inhibitors may cause postoperative residual curarization when they are used for reversal of neuromuscular blockade. Sugammadex reverses neuromuscular blockade by chemical encapsulation and is not associated with the side effects that may occur with the use of anticholinesterase agents. Because of increased outpatient surgical procedures, postoperative residual curarization and rapid postoperative recovery have greater importance in the pediatric patient population. The aim of this study was to compare the efficacy of sugammadex and neostigmine on reversing neuromuscular blockade in pediatric patients undergoing outpatient surgical procedures. Methods: 80 patients, aged 2–12 years, scheduled for outpatient surgery were enrolled in this randomized prospective study. Neuromuscular blockade was achieved with 0.6 mg kg−1 rocuronium and monitored with train-of-four. Group RN (n = 40) received 0.03 mg kg−1 neostigmine and Group RS (n = 40) received 2 mg kg−1 sugammadex for reversal of rocuronium. Extubation time (time from the reversal of neuromuscular blockade to extubation), train-of-four ratio during this time, time to reach train-of-four > 0.9, and probable complications were recorded. Results: There was no significant difference between the patients’ characteristics. Extubation time and time to reach train-of-four > 0.9 were significantly higher in Group RN (p = 0.001, p = 0.002). Train-of-four at the time of neostigmine/sugammadex injection was significantly higher in Group RN than in Group RS (p = 0.020). Extubation train-of-four ratio was significantly lower in Group RN (p = 0.002). Conclusion: Sugammadex provides safer extubation with a shorter recovery time than neostigmine in pediatric patients undergoing outpatient surgical procedures.

  20. Predictors of poor retention on antiretroviral therapy as a major HIV drug resistance early warning indicator in Cameroon: results from a nationwide systematic random sampling.

    Science.gov (United States)

    Billong, Serge Clotaire; Fokam, Joseph; Penda, Calixte Ida; Amadou, Salmon; Kob, David Same; Billong, Edson-Joan; Colizzi, Vittorio; Ndjolo, Alexis; Bisseck, Anne-Cecile Zoung-Kani; Elat, Jean-Bosco Nfetam

    2016-11-15

    Retention on lifelong antiretroviral therapy (ART) is essential in sustaining treatment success while preventing HIV drug resistance (HIVDR), especially in resource-limited settings (RLS). In an era of rising numbers of patients on ART, mastering patients in care is becoming more strategic for programmatic interventions. Due to lapses and uncertainty with the current WHO sampling approach in Cameroon, we aimed to ascertain the national performance of, and determinants in, retention on ART at 12 months. Using a systematic random sampling, a survey was conducted in the ten regions (56 sites) of Cameroon, within the "reporting period" of October 2013-November 2014, enrolling 5005 eligible adults and children. Performance in retention on ART at 12 months was interpreted following the definition of HIVDR early warning indicator: excellent (>85%), fair (85-75%), poor (<75%). ... The sampling strategy could be further strengthened for informed ART monitoring and HIVDR prevention perspectives.

  1. Study of gastric cancer samples using terahertz techniques

    Science.gov (United States)

    Wahaia, Faustino; Kasalynas, Irmantas; Seliuta, Dalius; Molis, Gediminas; Urbanowicz, Andrzej; Carvalho Silva, Catia D.; Carneiro, Fatima; Valusis, Gintaras; Granja, Pedro L.

    2014-08-01

    In the present work, samples of healthy and adenocarcinoma-affected human gastric tissue were analyzed using transmission time-domain THz spectroscopy (THz-TDS) and spectroscopic THz imaging at 201 and 590 GHz. The work shows that it is possible to distinguish between normal and cancerous regions in dried and paraffin-embedded samples. Plots of absorption coefficient α and refractive index n of normal and cancer affected tissues, as well as 2-D transmission THz images are presented and the conditions for discrimination between normal and affected tissues are discussed.

  2. An econometric method for estimating population parameters from non-random samples: An application to clinical case finding.

    Science.gov (United States)

    Burger, Rulof P; McLaren, Zoë M

    2017-09-01

    The problem of sample selection complicates the process of drawing inference about populations. Selective sampling arises in many real world situations when agents such as doctors and customs officials search for targets with high values of a characteristic. We propose a new method for estimating population characteristics from these types of selected samples. We develop a model that captures key features of the agent's sampling decision. We use a generalized method of moments with instrumental variables and maximum likelihood to estimate the population prevalence of the characteristic of interest and the agents' accuracy in identifying targets. We apply this method to tuberculosis (TB), which is the leading infectious disease cause of death worldwide. We use a national database of TB test data from South Africa to examine testing for multidrug resistant TB (MDR-TB). Approximately one quarter of MDR-TB cases was undiagnosed between 2004 and 2010. The official estimate of 2.5% is therefore too low, and MDR-TB prevalence is as high as 3.5%. Signal-to-noise ratios are estimated to be between 0.5 and 1. Our approach is widely applicable because of the availability of routinely collected data and abundance of potential instruments. Using routinely collected data to monitor population prevalence can guide evidence-based policy making. Copyright © 2017 John Wiley & Sons, Ltd.

  3. Randomized controlled trial of endoscopic ultrasound-guided fine-needle sampling with or without suction for better cytological diagnosis

    DEFF Research Database (Denmark)

    Puri, Rajesh; Vilmann, Peter; Saftoiu, Adrian

    2009-01-01

    ). The samples were characterized for cellularity and bloodiness, with a final cytology diagnosis established blindly. The final diagnosis was reached either by EUS-FNA if malignancy was definite, or by surgery and/or clinical follow-up of a minimum of 6 months in the cases of non-specific benign lesions...

  4. Safety and feasibility of transcranial direct current stimulation in pediatric hemiparesis: randomized controlled preliminary study.

    Science.gov (United States)

    Gillick, Bernadette T; Feyma, Tim; Menk, Jeremiah; Usset, Michelle; Vaith, Amy; Wood, Teddi Jean; Worthington, Rebecca; Krach, Linda E

    2015-03-01

    Transcranial direct current stimulation (tDCS) is a form of noninvasive brain stimulation that has shown improved adult stroke outcomes. Applying tDCS in children with congenital hemiparesis has not yet been explored. The primary objective of this study was to explore the safety and feasibility of single-session tDCS through an adverse events profile and symptom assessment within a double-blind, randomized placebo-controlled preliminary study in children with congenital hemiparesis. A secondary objective was to assess the stability of hand and cognitive function. A double-blind, randomized placebo-controlled pretest/posttest/follow-up study was conducted. The study was conducted in a university pediatric research laboratory. Thirteen children, ages 7 to 18 years, with congenital hemiparesis participated. Adverse events/safety assessment and hand function were measured. Participants were randomly assigned to either an intervention group or a control group, with safety and functional assessments at pretest, at posttest on the same day, and at a 1-week follow-up session. An intervention of 10 minutes of 0.7 mA tDCS was applied to bilateral primary motor cortices. The tDCS intervention was considered safe if there was no individual decline of 25% or group decline of 2 standard deviations for motor evoked potentials (MEPs) and behavioral data and no report of adverse events. No major adverse events were found, including no seizures. Two participants did not complete the study due to lack of MEP and discomfort. For the 11 participants who completed the study, group differences in MEPs and behavioral data did not exceed 2 standard deviations in those who received the tDCS (n=5) and those in the control group (n=6). The study was completed without the need for stopping per medical monitor and biostatisticial analysis. A limitation of the study was the small sample size, with data available for 11 participants. Based on the results of this study, tDCS appears to be safe

  5. Prenatal emotion management improves obstetric outcomes: a randomized control study.

    Science.gov (United States)

    Huang, Jian; Li, He-Jiang; Wang, Jue; Mao, Hong-Jing; Jiang, Wen-Ying; Zhou, Hong; Chen, Shu-Lin

    2015-01-01

    Negative emotions can cause a number of prenatal problems and disturb obstetric outcomes. We determined the effectiveness of prenatal emotional management on obstetric outcomes in nulliparas. All participants completed the PHQ-9 at the baseline assessment. Then, the participants were randomly assigned to the emotional management (EM) and usual care (UC) groups. The baseline evaluation began at 31 weeks gestation and the participants were followed up to 42 days postpartum. Each subject in the EM group received an extra EM program while the participants in the UC groups received routine prenatal care and education only. The PHQ-9 and Edinburgh Postnatal Depression scale (EPDS) were used for assessment. The EM group had a lower PHQ-9 score at 36 weeks gestation, and 7 and 42 days after delivery (P Prenatal EM intervention could control anxiety and depressive feelings in nulliparas, and improve obstetric outcomes. It may serve as an innovative approach to reduce the cesarean section rate in China.

  6. Best (but oft-forgotten) practices: the design, analysis, and interpretation of Mendelian randomization studies1

    Science.gov (United States)

    Bowden, Jack; Relton, Caroline; Davey Smith, George

    2016-01-01

    Mendelian randomization (MR) is an increasingly important tool for appraising causality in observational epidemiology. The technique exploits the principle that genotypes are not generally susceptible to reverse causation bias and confounding, reflecting their fixed nature and Mendel’s first and second laws of inheritance. The approach is, however, subject to important limitations and assumptions that, if unaddressed or compounded by poor study design, can lead to erroneous conclusions. Nevertheless, the advent of 2-sample approaches (in which exposure and outcome are measured in separate samples) and the increasing availability of open-access data from large consortia of genome-wide association studies and population biobanks mean that the approach is likely to become routine practice in evidence synthesis and causal inference research. In this article we provide an overview of the design, analysis, and interpretation of MR studies, with a special emphasis on assumptions and limitations. We also consider different analytic strategies for strengthening causal inference. Although impossible to prove causality with any single approach, MR is a highly cost-effective strategy for prioritizing intervention targets for disease prevention and for strengthening the evidence base for public health policy. PMID:26961927
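
    As a concrete example of one 2-sample MR analysis step, the inverse-variance weighted (IVW) estimator combines per-variant ratio estimates from separate exposure and outcome samples; the sketch below uses hypothetical summary statistics and is only one of several estimators discussed in this literature.

```python
# Minimal sketch (hypothetical summary statistics): the inverse-variance
# weighted (IVW) estimator often used in two-sample Mendelian randomization,
# combining per-SNP ratio estimates beta_Y / beta_X.
import numpy as np

# SNP-exposure and SNP-outcome associations from two separate samples.
beta_x = np.array([0.12, 0.08, 0.15, 0.10])      # SNP -> exposure
beta_y = np.array([0.024, 0.018, 0.027, 0.019])  # SNP -> outcome
se_y = np.array([0.010, 0.009, 0.012, 0.008])    # SE of SNP -> outcome

weights = beta_x**2 / se_y**2                    # inverse-variance weights
beta_ivw = np.sum(weights * beta_y / beta_x) / np.sum(weights)
se_ivw = np.sqrt(1.0 / np.sum(weights))
print(f"IVW causal estimate: {beta_ivw:.3f} (SE {se_ivw:.3f})")
```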

  7. Ecological momentary assessment for chronic pain in fibromyalgia using a smartphone: a randomized crossover study.

    Science.gov (United States)

    Garcia-Palacios, A; Herrero, R; Belmonte, M A; Castilla, D; Guixeres, J; Molinari, G; Baños, R M

    2014-07-01

    Daily diaries are a useful way of measuring fluctuations in pain-related symptoms. However, traditional diaries do not assure the gathering of data in real time, not solving the problem of retrospective assessment. Ecological momentary assessment (EMA) by means of electronic diaries helps to improve repeated assessment. However, it is important to test its feasibility in specific populations in order to reach a wider number of people who could benefit from these procedures. The present study compares the compliance and acceptability of an electronic diary running on a smartphone using a crossover design for a sample with a specific pain condition, fibromyalgia and low familiarity with technology. Forty-seven participants were randomly assigned to one of two conditions: (1) paper diary - smartphone diary and (2) smartphone diary - paper diary, using each assessment method for 1 week. The findings of this study showed that the smartphone diary made it possible to gather more accurate and complete ratings. Besides, this method was well accepted by a sample of patients with fibromyalgia referred by a public hospital, with an important proportion of participants with low level of education and low familiarity with technology. The findings of this study support the use of smartphones for EMA even in specific populations with a specific pain condition, fibromyalgia and with low familiarity with technology. These methods could help clinicians and researchers to gather more accurate ratings of relevant pain-related variables even in populations with low familiarity with technology.

  8. GY SAMPLING THEORY IN ENVIRONMENTAL STUDIES 2: SUBSAMPLING ERROR MEASUREMENTS

    Science.gov (United States)

    Sampling can be a significant source of error in the measurement process. The characterization and cleanup of hazardous waste sites require data that meet site-specific levels of acceptable quality if scientifically supportable decisions are to be made. In support of this effort,...

  9. Sample size computation for association studies using case–parents ...

    Indian Academy of Sciences (India)

    …sample size needed to reach a given power (Knapp 1999; Schaid 1999; Chen and Deng 2001; Brown 2004). In their seminal paper, Risch and Merikangas (1996) showed that for a multiplicative mode of inheritance (MOI) for the susceptibility gene, sample size depends on two parameters: the frequency of the risk allele at the ...

  10. Paracetamol sharpens reflection and spatial memory: a double-blind randomized controlled study in healthy volunteers.

    Science.gov (United States)

    Pickering, Gisèle; Macian, Nicolas; Dubray, Claude; Pereira, Bruno

    2016-01-01

    The mechanism of acetaminophen (APAP, paracetamol) for analgesic and antipyretic outcomes has been largely addressed, but APAP action on cognitive function has not been studied in humans. Animal studies have suggested improved cognitive performance, but the link with analgesic and antipyretic modes of action is incomplete. This study aims at exploring cognitive tests in healthy volunteers in the context of antinociception and temperature regulation. A double-blind randomized controlled study (NCT01390467) was carried out from May 30, 2011 to July 12, 2011. Forty healthy volunteers were included and analyzed. Nociceptive thresholds, core temperature (body temperature), and a battery of cognitive tests were recorded before and after oral APAP (2 g) or placebo: Information sampling task for predecisional processing, Stockings of Cambridge for spatial memory, reaction time, delayed matching of sample, and pattern recognition memory tests. Analysis of variance for repeated measures adapted to crossover design was performed and a two-tailed type I error was fixed at 5%. APAP improved information sampling task (diminution of the number of errors, latency to open boxes, and increased number of opened boxes; all P memory initial thinking time were decreased (P = 0.04). All other tests were not modified by APAP. APAP had an antinociceptive effect ( P body temperature did not change. This study shows for the first time that APAP sharpens decision making and planning strategy in healthy volunteers and that cognitive performance and antinociception are independent of the APAP effect on thermogenesis. We suggest that cognitive performance mirrors the analgesic rather than thermic cascade of events, with possibly a central role for serotonergic and cannabinoid systems that need to be explored further in the context of pain and cognition.

  11. A shared frailty model for case-cohort samples: parent and offspring relations in an adoption study

    DEFF Research Database (Denmark)

    Petersen, Liselotte; Sørensen, Thorkild I A; Andersen, Per Kragh

    2010-01-01

    The Danish adoption register contains data on the 12 301 Danish nonfamilial adoptions during 1924-1947. From that register a case-cohort sample was selected consisting of all case adoptees, that is those adoptees dying before age 70 years, and a random sample of 1683 adoptees. The survival data of their biological and adoptive parents were collected with the purpose of studying the association of survival between the adoptee and his/her biological or adoptive parents. Motivated by this study, we explored how to make inference in a shared frailty model for case-cohort data. Our approach was to use inverse probability weighting to account for the sampling in a conditional, shared frailty Poisson model and to use the robust variance estimator proposed by Moger et al. (Statist. Med. 2008; 27:1062-1074). To explore the performance of the estimation procedure, a simulation study was conducted. We studied situations...

  12. Testing links between childhood positive peer relations and externalizing outcomes through a randomized controlled intervention study

    NARCIS (Netherlands)

    Witvliet, M.; van Lier, P.A.C.; Cuijpers, P.; Koot, H.M.

    2009-01-01

    In this study, the authors used a randomized controlled trial to explore the link between having positive peer relations and externalizing outcomes in 758 children followed from kindergarten to the end of 2nd grade. Children were randomly assigned to the Good Behavior Game (GBG), a universal

  13. Sampling atmospheric pesticides with SPME: Laboratory developments and field study

    International Nuclear Information System (INIS)

    Wang Junxia; Tuduri, Ludovic; Mercury, Maud; Millet, Maurice; Briand, Olivier; Montury, Michel

    2009-01-01

    To estimate the atmospheric exposure of greenhouse workers to pesticides, solid phase microextraction (SPME) was used under non-equilibrium conditions. Using Fick's law of diffusion, the concentrations of pesticides in the greenhouse can be calculated using pre-determined sampling rates (SRs). Thus the SRs of two modes of SPME in the lab and in the field were determined and compared. The SRs for six pesticides in the lab were 20.4-48.3 mL min-1 for the exposed fiber and 0.166-0.929 mL min-1 for the retracted fiber. In field sampling, two pesticides, dichlorvos and cyprodinil, were detected with exposed SPME. The SR with exposed SPME for dichlorvos in the field (32.4 mL min-1) was consistent with that in the lab (34.5 mL min-1). The trends of temporal concentration and the inhalation exposure were also obtained. - SPME was proved to be a powerful and simple tool for determining pesticides' atmospheric concentration
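
    The concentration calculation implied by this approach is simple: with a pre-determined sampling rate, the mass collected on the fibre over the exposure time gives a time-weighted average air concentration. A minimal sketch (hypothetical numbers) follows.

```python
# Minimal sketch of the calculation implied above: with a pre-determined
# sampling rate SR (mL/min), the time-weighted average air concentration is
# obtained from the mass collected on the SPME fibre during the exposure time.
def twa_concentration(mass_ng, sr_ml_per_min, minutes):
    """Return the air concentration in ng/L (numerically equal to ug/m3)."""
    sampled_volume_l = sr_ml_per_min * minutes / 1000.0   # mL -> L
    return mass_ng / sampled_volume_l

# Hypothetical example: 45 ng of dichlorvos collected in 60 min at SR = 32.4 mL/min.
print(f"{twa_concentration(45.0, 32.4, 60.0):.1f} ng/L")
```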

  14. Are Video Games a Gateway to Gambling? A Longitudinal Study Based on a Representative Norwegian Sample.

    Science.gov (United States)

    Molde, Helge; Holmøy, Bjørn; Merkesdal, Aleksander Garvik; Torsheim, Torbjørn; Mentzoni, Rune Aune; Hanns, Daniel; Sagoe, Dominic; Pallesen, Ståle

    2018-06-05

    The scope and variety of video games and monetary gambling opportunities are expanding rapidly. In many ways, these forms of entertainment are converging on digital and online video games and gambling sites. However, little is known about the relationship between video gaming and gambling. The present study explored the possibility of a directional relationship between measures of problem gaming and problem gambling, while also controlling for the influence of sex and age. In contrast to most previous investigations which are based on cross-sectional designs and non-representative samples, the present study utilized a longitudinal design conducted over 2 years (2013, 2015) and comprising 4601 participants (males 47.2%, age range 16-74) drawn from a random sample from the general population. Video gaming and gambling were assessed using the Gaming Addiction Scale for Adolescents and the Canadian Problem Gambling Index, respectively. Using an autoregressive cross-lagged structural equation model, we found a positive relationship between scores on problematic gaming and later scores on problematic gambling, whereas we found no evidence of the reverse relationship. Hence, video gaming problems appear to be a gateway behavior to problematic gambling behavior. In future research, one should continue to monitor the possible reciprocal behavioral influences between gambling and video gaming.

  15. Ventilatory Function in Relation to Mining Experience and Smoking in a Random Sample of Miners and Non-miners in a Witwatersrand Town1

    Science.gov (United States)

    Sluis-Cremer, G. K.; Walters, L. G.; Sichel, H. S.

    1967-01-01

    The ventilatory capacity of a random sample of men over the age of 35 years in the town of Carletonville was estimated by the forced expiratory volume and the peak expiratory flow rate. Five hundred and sixty-two persons were working or had worked in gold-mines and 265 had never worked in gold-mines. No difference in ventilatory function was found between the miners and non-miners other than that due to the excess of chronic bronchitis in miners. PMID:6017134

  16. On-Demand Treatment of Premature Ejaculation with Citalopram: A Randomized Double-Blind Study

    Directory of Open Access Journals (Sweden)

    Ghafuri Zahra

    2009-10-01

    Full Text Available As the most common male sexual disorder, premature ejaculation (PE), also referred to as early ejaculation (EE) or rapid ejaculation (RE), affects 30%-40% of sexually active men. Despite the limited number of available studies comparing the efficacy of selective serotonin re-uptake inhibitors (SSRIs), they have been thought to have beneficial effects for the treatment of patients with PE. In the present study, we assessed the efficacy of on-demand use of citalopram in the treatment of premature ejaculation. A randomized double-blind study of fixed-dose on-demand use of citalopram was performed at Roozbeh Psychiatry Hospital, Tehran University of Medical Sciences. The sample consisted of 80 married patients diagnosed with PE according to the Diagnostic and Statistical Manual of Mental Disorders. The patients were randomly assigned to two groups: group 1, consisting of 42 patients, received 20 mg citalopram, and group 2, consisting of 38 patients, received placebo, four hours before intercourse for a 4-week treatment course. The effects of the drug on ejaculatory function in each group were assessed by the intravaginal ejaculation latency time (IELT) and the Chinese Index of Premature Ejaculation (CIPE) before and at the end of the treatment course. The mean IELT increased from 66.78±36.94 to 80.85±43.05 seconds in group 1 and from 63.44±33.16 to 65.71±34.26 seconds in group 2 (P = 0.000). Mean CIPE score increased 1.14±1.04 and 0.52±0.50 in groups 1 and 2, respectively (P = 0.002). The patients treated with on-demand citalopram showed significantly greater improvement in IELT and CIPE score compared to the patients receiving placebo. It seems that citalopram may be an effective treatment of premature ejaculation with on-demand usage. However, further studies are warranted.

  17. Random walk on random walks

    NARCIS (Netherlands)

    Hilário, M.; Hollander, den W.Th.F.; Sidoravicius, V.; Soares dos Santos, R.; Teixeira, A.

    2014-01-01

    In this paper we study a random walk in a one-dimensional dynamic random environment consisting of a collection of independent particles performing simple symmetric random walks in a Poisson equilibrium with density ρ ∈ (0, ∞). At each step the random walk performs a nearest-neighbour jump, moving to

  18. Reducing Eating Disorder Onset in a Very High Risk Sample with Significant Comorbid Depression: A Randomized Controlled Trial

    Science.gov (United States)

    Taylor, C. Barr; Kass, Andrea E.; Trockel, Mickey; Cunning, Darby; Weisman, Hannah; Bailey, Jakki; Sinton, Meghan; Aspen, Vandana; Schecthman, Kenneth; Jacobi, Corinna; Wilfley, Denise E.

    2015-01-01

    Objective Eating disorders (EDs) are serious problems among college-age women and may be preventable. An indicated on-line eating disorder (ED) intervention, designed to reduce ED and comorbid pathology, was evaluated. Method 206 women (M age = 20 ± 1.8 years; 51% White/Caucasian, 11% African American, 10% Hispanic, 21% Asian/Asian American, 7% other) at very high risk for ED onset (i.e., with high weight/shape concerns plus a history of being teased, current or lifetime depression, and/or non-clinical levels of compensatory behaviors) were randomized to a 10-week, Internet-based, cognitive-behavioral intervention or wait-list control. Assessments included the Eating Disorder Examination (EDE to assess ED onset), EDE-Questionnaire, Structured Clinical Interview for DSM Disorders, and Beck Depression Inventory-II. Results ED attitudes and behaviors improved more in the intervention than control group (p = 0.02, d = 0.31); although ED onset rate was 27% lower, this difference was not significant (p = 0.28, NNT = 15). In the subgroup with highest shape concerns, ED onset rate was significantly lower in the intervention than control group (20% versus 42%, p = 0.025, NNT = 5). For the 27 individuals with depression at baseline, depressive symptomatology improved more in the intervention than control group (p = 0.016, d = 0.96); although ED onset rate was lower in the intervention than control group, this difference was not significant (25% versus 57%, NNT = 4). Conclusions An inexpensive, easily disseminated intervention might reduce ED onset among those at highest risk. Low adoption rates need to be addressed in future research. PMID:26795936

  19. Reducing eating disorder onset in a very high risk sample with significant comorbid depression: A randomized controlled trial.

    Science.gov (United States)

    Taylor, C Barr; Kass, Andrea E; Trockel, Mickey; Cunning, Darby; Weisman, Hannah; Bailey, Jakki; Sinton, Meghan; Aspen, Vandana; Schecthman, Kenneth; Jacobi, Corinna; Wilfley, Denise E

    2016-05-01

    Eating disorders (EDs) are serious problems among college-age women and may be preventable. An indicated online eating disorder (ED) intervention, designed to reduce ED and comorbid pathology, was evaluated. 206 women (M age = 20 ± 1.8 years; 51% White/Caucasian, 11% African American, 10% Hispanic, 21% Asian/Asian American, 7% other) at very high risk for ED onset (i.e., with high weight/shape concerns plus a history of being teased, current or lifetime depression, and/or nonclinical levels of compensatory behaviors) were randomized to a 10-week, Internet-based, cognitive-behavioral intervention or waitlist control. Assessments included the Eating Disorder Examination (EDE, to assess ED onset), EDE-Questionnaire, Structured Clinical Interview for DSM Disorders, and Beck Depression Inventory-II. ED attitudes and behaviors improved more in the intervention than control group (p = .02, d = 0.31); although ED onset rate was 27% lower, this difference was not significant (p = .28, NNT = 15). In the subgroup with highest shape concerns, ED onset rate was significantly lower in the intervention than control group (20% vs. 42%, p = .025, NNT = 5). For the 27 individuals with depression at baseline, depressive symptomatology improved more in the intervention than control group (p = .016, d = 0.96); although ED onset rate was lower in the intervention than control group, this difference was not significant (25% vs. 57%, NNT = 4). An inexpensive, easily disseminated intervention might reduce ED onset among those at highest risk. Low adoption rates need to be addressed in future research. (c) 2016 APA, all rights reserved).

  20. A study of western pharmaceuticals contained within samples of Chinese herbal/patent medicines collected from New York City's Chinatown.

    Science.gov (United States)

    Miller, Gretchen M; Stripp, Richard

    2007-09-01

    In America, recent growth in the popularity of Chinese herbal/patent medicines (CHM/CPM) has generated concerns as to the safety of these and other herbal remedies. Lack of strict federal regulations has led to the possibility of improper labeling and even adulteration of these products with western drugs or other chemical contaminants. Our laboratory has conducted an analytical study to determine the presence of undeclared pharmaceuticals and therapeutic substances within CHM/CPM sold in New York City's Chinatown. Ninety representative samples randomly purchased in the form of pills, tablets, creams and teas were screened by appropriate analytical techniques including TLC, GC/MS and HPLC. Five samples contained nine different western pharmaceuticals. Two of these samples contained undeclared or mislabeled substances. One sample contained two pharmaceuticals contraindicated in people for whom the product was intended. Drugs identified include promethazine, chlormethiazole, chlorpheniramine, diclofenac, chlordiazepoxide, hydrochlorothiazide, triamterene, diphenhydramine and sildenafil citrate (Viagra).

  1. Sampling and assessment accuracy in mate choice: a random-walk model of information processing in mating decision.

    Science.gov (United States)

    Castellano, Sergio; Cermelli, Paolo

    2011-04-07

    Mate choice depends on mating preferences and on the manner in which mate-quality information is acquired and used to make decisions. We present a model that describes how these two components of mating decision interact with each other during a comparative evaluation of prospective mates. The model, with its well-explored precedents in psychology and neurophysiology, assumes that decisions are made by the integration over time of noisy information until a stopping-rule criterion is reached. Due to this informational approach, the model builds a coherent theoretical framework for developing an integrated view of functions and mechanisms of mating decisions. From a functional point of view, the model allows us to investigate speed-accuracy tradeoffs in mating decision at both population and individual levels. It shows that, under strong time constraints, decision makers are expected to make fast and frugal decisions and to optimally trade off population-sampling accuracy (i.e. the number of sampled males) against individual-assessment accuracy (i.e. the time spent for evaluating each mate). From the proximate-mechanism point of view, the model makes testable predictions on the interactions of mating preferences and choosiness in different contexts and it might be of compelling empirical utility for a context-independent description of mating preference strength. Copyright © 2011 Elsevier Ltd. All rights reserved.
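
    The accumulation-to-threshold idea can be sketched in a few lines: evidence about a prospective mate is integrated with noise until an acceptance or rejection boundary is crossed, and the boundary height sets the speed-accuracy tradeoff. The parameters below are illustrative and are not taken from the paper.

```python
# Minimal sketch (illustrative parameters): a noisy-accumulation ("random walk")
# decision rule of the kind described above, in which evidence about a
# prospective mate is integrated over time until a stopping criterion is hit.
import numpy as np

def assess_mate(quality, threshold=10.0, noise=1.0, max_steps=10_000, rng=None):
    """Accumulate noisy evidence; return (accept?, number of sampling steps)."""
    rng = rng or np.random.default_rng()
    evidence = 0.0
    for step in range(1, max_steps + 1):
        evidence += quality + noise * rng.normal()
        if evidence >= threshold:
            return True, step          # accept the male
        if evidence <= -threshold:
            return False, step         # reject and resume sampling
    return False, max_steps

rng = np.random.default_rng(3)
decisions = [assess_mate(quality=0.3, rng=rng) for _ in range(1000)]
accept_rate = np.mean([d[0] for d in decisions])
mean_time = np.mean([d[1] for d in decisions])
print(f"accept rate = {accept_rate:.2f}, mean assessment time = {mean_time:.0f} steps")
# Lowering the threshold trades individual-assessment accuracy for faster decisions.
```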

  2. Rapid, easy, and cheap randomization: prospective evaluation in a study cohort

    Directory of Open Access Journals (Sweden)

    Parker Melissa J

    2012-06-01

    Full Text Available Abstract Background When planning a randomized controlled trial (RCT), investigators must select randomization and allocation procedures based upon a variety of factors. While third party randomization is cited as being among the most desirable randomization processes, many third party randomization procedures are neither feasible nor cost-effective for small RCTs, including pilot RCTs. In this study we present our experience with a third party randomization and allocation procedure that utilizes current technology to achieve randomization in a rapid, reliable, and cost-effective manner. Methods This method was developed by the investigators for use in a small 48-participant parallel group RCT with four study arms. As a nested study, the reliability of this randomization procedure was prospectively evaluated in this cohort. The primary outcome of this nested study was the proportion of subjects for whom allocation information was obtained by the Research Assistant within 15 min of the initial participant randomization request. A secondary outcome was the average time for communicating participant group assignment back to the Research Assistant. Descriptive information regarding any failed attempts at participant randomization as well as costs attributable to use of this method were also recorded. Statistical analyses included the calculation of simple proportions and descriptive statistics. Results Forty-eight participants were successfully randomized and group allocation instruction was received for 46 (96%) within 15 min of the Research Assistant placing the initial randomization request. Time elapsed in minutes until receipt of participant allocation instruction was Mean (SD) 3.1 +/− 3.6; Median (IQR) 2 (2, 3); Range (1–20) for the entire cohort of 48. For the two participants for whom group allocation information was not received by the Research Assistant within the 15-min pass threshold, this information was obtained following a second

  3. Multiscale study on stochastic reconstructions of shale samples

    Science.gov (United States)

    Lili, J.; Lin, M.; Jiang, W. B.

    2016-12-01

    Shales are known to have multiscale pore systems, composed of macroscale fractures, micropores, and nanoscale pores within gas or oil-producing organic material. Shales are also fissile and laminated, and their heterogeneity in the horizontal direction is quite different from that in the vertical. Stochastic reconstructions are extremely useful in situations where three-dimensional information is costly and time-consuming to acquire. Thus the purpose of our paper is to stochastically reconstruct equiprobable 3D models containing information from several scales. In this paper, macroscale and microscale images of shale structure in the Lower Silurian Longmaxi are obtained by X-ray microtomography and nanoscale images are obtained by scanning electron microscopy. Each image is representative for all given scales and phases. In particular, the macroscale image is four times coarser than the microscale image, which in turn has four times lower resolution than the nanoscale image. Secondly, the cross correlation-based simulation method (CCSIM) and the three-step sampling method are combined together to generate stochastic reconstructions for each scale. It is important to point out that the boundary points of pore and matrix are selected based on a multiple-point connectivity function in the sampling process, and thus the characteristics of the reconstructed image can be controlled indirectly. Thirdly, all images with the same resolution are developed through downscaling and upscaling by interpolation, and then we merge the multiscale categorical spatial data into a single 3D image with a predefined resolution (the microscale image). Thirty realizations are generated using the given images and the proposed method. The result reveals that the proposed method is capable of preserving the multiscale pore structure, both vertically and horizontally, which is necessary for accurate permeability prediction. The variogram curves and pore-size distribution for both the original 3D sample and the generated 3D realizations are compared
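
    The CCSIM reconstruction pipeline is too involved for a short example, but the comparison step mentioned at the end, matching variogram curves of the original and reconstructed pore structures, can be sketched as follows. The directional variogram estimator and the synthetic binary volume below are illustrative assumptions, not the authors' code.

      import numpy as np

      def directional_variogram(image, axis=0, max_lag=20):
          # Empirical variogram of a binary (pore=1, matrix=0) image along one axis:
          # gamma(h) = 0.5 * mean[(Z(x) - Z(x+h))^2] for lag h in voxels.
          gammas = []
          for h in range(1, max_lag + 1):
              a = np.take(image, range(0, image.shape[axis] - h), axis=axis)
              b = np.take(image, range(h, image.shape[axis]), axis=axis)
              gammas.append(0.5 * np.mean((a.astype(float) - b.astype(float)) ** 2))
          return np.array(gammas)

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          # Stand-in for a segmented CT volume with ~20% porosity.
          sample = (rng.random((64, 64, 64)) < 0.2).astype(np.uint8)
          print("horizontal variogram:", directional_variogram(sample, axis=1, max_lag=5))
          print("vertical variogram:  ", directional_variogram(sample, axis=0, max_lag=5))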

  4. Children's Quality of Life Based on the KIDSCREEN-27: Child Self-Report, Parent Ratings and Child-Parent Agreement in a Swedish Random Population Sample.

    Directory of Open Access Journals (Sweden)

    Anne H Berman

    Full Text Available The KIDSCREEN-27 is a measure of child and adolescent quality of life (QoL), with excellent psychometric properties, available in child-report and parent-rating versions in 38 languages. This study provides child-reported and parent-rated norms for the KIDSCREEN-27 among Swedish 11-16 year-olds, as well as child-parent agreement. Sociodemographic correlates of self-reported wellbeing and parent-rated wellbeing were also measured. A random population sample consisting of 600 children aged 11-16 (100 per age group) and one of their parents (N = 1200) was approached for response to self-reported and parent-rated versions of the KIDSCREEN-27. Parents were also asked about their education, employment status and their own QoL based on the 26-item WHOQOL-Bref. Based on the final sampling pool of 1158 persons, a 34.8% response rate of 403 individuals was obtained, including 175 child-parent pairs, 27 child singleton responders and 26 parent singletons. Gender and age differences for parent ratings and child-reported data were analyzed using t-tests and the Mann-Whitney U-test. Post-hoc Dunn tests were conducted for pairwise comparisons when the p-value for specific subscales was 0.05 or lower. Child-parent agreement was tested item-by-item, using the Prevalence- and Bias-Adjusted Kappa (PABAK) coefficient for ordinal data (PABAK-OS); dimensional and total score agreement was evaluated based on dichotomous cut-offs for lower well-being, using the PABAK, and total, continuous scores were evaluated using Bland-Altman plots. Compared to European norms, Swedish children in this sample scored lower on Physical wellbeing (48.8 SE/49.94 EU) but higher on the other KIDSCREEN-27 dimensions: Psychological wellbeing (53.4/49.77), Parent relations and autonomy (55.1/49.99), Social Support and peers (54.1/49.94) and School (55.8/50.01). Older children self-reported lower wellbeing than younger children. No significant self-reported gender differences occurred and parent ratings
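
    For reference, the prevalence- and bias-adjusted kappa used for the item-by-item agreement reduces, for dichotomous ratings, to PABAK = 2*Po - 1, where Po is the observed proportion of agreement. The short sketch below illustrates the calculation on invented child-parent ratings.

      def pabak(ratings_a, ratings_b):
          # Prevalence- and Bias-Adjusted Kappa for two dichotomous rating vectors:
          # PABAK = 2 * observed agreement - 1.
          assert len(ratings_a) == len(ratings_b)
          observed_agreement = sum(a == b for a, b in zip(ratings_a, ratings_b)) / len(ratings_a)
          return 2 * observed_agreement - 1

      if __name__ == "__main__":
          # 1 = lower wellbeing, 0 = not, for ten hypothetical child-parent pairs
          child  = [0, 1, 0, 0, 1, 0, 0, 1, 0, 0]
          parent = [0, 1, 0, 1, 1, 0, 0, 0, 0, 0]
          print(f"PABAK = {pabak(child, parent):.2f}")   # 0.60 for this invented example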

  5. Acupuncture, Counseling, and Usual care for Depression (ACUDep: study protocol for a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    MacPherson Hugh

    2012-11-01

    Full Text Available Abstract Background The evidence on the effect of acupuncture or counseling for depression is not conclusive yet is sufficient to warrant further research. Our aim is to conduct a full-scale RCT to determine the clinical and cost effectiveness of acupuncture and counseling compared to usual care alone. We will explore the experiences and perspectives of patients and practitioners. Methods/Design Randomized controlled trial with three parallel arms: acupuncture plus usual care, counseling plus usual care, and usual care alone, in conjunction with a nested qualitative study using in-depth interviews with purposive samples of trial participants. Participants: Patients aged over 18 years diagnosed with depression or mood disorder by their GP and with a score of 20 or above on the Beck Depression Inventory (BDI-II). Randomization: Computer randomization by York Trials Unit to acupuncture, counseling, and usual care alone in proportions of 2:2:1, respectively, with secure allocation concealment. Interventions: Patients allocated to acupuncture and counseling groups receive the offer of up to 12 weekly sessions. Both interventions allow flexibility to address patient variation, yet are constrained within defined protocols. Acupuncture is based on traditional Chinese medicine and counseling is non-directive within the humanistic tradition. Outcome: The PHQ-9 is the primary outcome measure, collected at baseline, 3, 6, 9, and 12 months. Also measured are the BDI-II, SF-36 Bodily Pain subscale, and EQ-5D. Texted mood scores are collected weekly over the first 15 weeks. Health-related resource use is collected over 12 months. Analysis: The sample size target was for 640 participants, calculated for an effect size of 0.32 on the PHQ-9 when comparing acupuncture with counseling given 90% power, 5% significance, and 20% loss to follow-up. Analysis of covariance will be used on an intention-to-treat basis. Thematic analysis will be used for qualitative data. We will

  6. Removable samples for ITER—a feasibility and conceptual study

    International Nuclear Information System (INIS)

    Mertens, Ph; Neubauer, O; Philipps, V; Huber, A; Kirschner, A; Knaup, M; Borodin, D; Samm, U; Ciattaglia, S; Choi, C H; Gicquel, S; Hirai, T; Mitteau, R; Pitts, R A; Sadakov, S; Shimada, M; Veshchev, E

    2014-01-01

    The control of the radioactive inventory in the vacuum vessel of ITER is a main safety issue. Erosion of activated plasma-facing components (PFC) and co-deposition of tritiated dust on PFC and in areas below the divertor constitute the main sources of in-vessel radioactive inventory mobilizable in the case of an accident and also during venting of the vessel. To trace the dust and tritium inventory in the machine, the use of collectors in the form of removable samples was evaluated, besides other techniques, since it provides a reliable way to follow the history of the deposits and check critical areas. Four types of removable probes and two optional active diagnostics were selected out of about 30 different options. For all four probes, a conceptual design was worked out and the feasibility was checked with preliminary estimations of thermal and electromagnetic loads, as well as remote handling paths. The highest temperature estimated for the front face of all probes lies in the range 300–500 °C, which is tolerable. Installed in representative places, such removable samples may provide information about the dust and tritium distribution inside the vacuum vessel. (paper)

  7. Study of the Gamma Radiation Effect on Tannins Samples

    International Nuclear Information System (INIS)

    Coto Hernandez, I.; Barroso Solares, S.; Martinez Luzardo, F.; Guzman Martinez, F.; Diaz Rizo, O.; Arado Lopez, J.O.; Santana Romero, J.L.; Baeza Fonte, A.; Rapado Paneque, M.; Garcia, F.

    2011-01-01

    Vegetable tannins are polyphenolic substances occurring as different chemical mixtures, in correspondence with the characteristics of their polyphenol groups. Depending on their composition, different types of flavonoids can be found, mainly in the so-called condensed tannins. Many applications have been explored, including medical ones, owing to their proven antiviral and antibacterial activity and to other characteristics derived from their reactions with metal ions and with the amino acids of protein components. It is therefore promising to examine the effects of gamma radiation on the structure of tannins, looking for possible modification of their biological activity. To this end, samples of tannins were irradiated at different doses (maximum dose 35 kGy) using a cobalt-60 irradiator. Scanning electron microscopy (SEM) was used to characterize the morphology and composition of the samples. The changes were analyzed using Fourier-transform infrared spectroscopy (FT-IR) and high-performance liquid chromatography (HPLC). Finally, we discuss the implications of the results for doses above 5 kGy. (Author)

  8. Scanning Ion Conductance Microscopy for Studying Biological Samples

    Directory of Open Access Journals (Sweden)

    Irmgard D. Dietzel

    2012-11-01

    Full Text Available Scanning ion conductance microscopy (SICM) is a scanning probe technique that utilizes the increase in access resistance that occurs when an electrolyte-filled glass micropipette approaches a poorly conducting surface. Since the increase in resistance can be monitored before physical contact between the scanning probe tip and the sample, this technique is particularly useful for investigating the topography of delicate samples such as living cells. SICM has shown its potential in various applications such as high-resolution and long-term imaging of living cells or the determination of local changes in cellular volume. Furthermore, SICM has been combined with various techniques such as fluorescence microscopy or patch clamping to reveal localized information about proteins or protein functions. This review details the various advantages and pitfalls of SICM and provides an overview of the recent developments and applications of SICM in biological imaging. Furthermore, we show that, in principle, a combination of SICM and ion-selective microelectrodes enables one to monitor the local ion activity surrounding a living cell.

  9. Sampling Key Populations for HIV Surveillance: Results From Eight Cross-Sectional Studies Using Respondent-Driven Sampling and Venue-Based Snowball Sampling.

    Science.gov (United States)

    Rao, Amrita; Stahlman, Shauna; Hargreaves, James; Weir, Sharon; Edwards, Jessie; Rice, Brian; Kochelani, Duncan; Mavimbela, Mpumelelo; Baral, Stefan

    2017-10-20

    In using regularly collected or existing surveillance data to characterize engagement in human immunodeficiency virus (HIV) services among marginalized populations, differences in sampling methods may produce different pictures of the target population and may therefore result in different priorities for response. The objective of this study was to use existing data to evaluate the sample distribution of eight studies of female sex workers (FSW) and men who have sex with men (MSM), who were recruited using different sampling approaches in two locations within Sub-Saharan Africa: Manzini, Swaziland and Yaoundé, Cameroon. MSM and FSW participants were recruited using either respondent-driven sampling (RDS) or venue-based snowball sampling. Recruitment took place between 2011 and 2016. Participants at each study site were administered a face-to-face survey to assess sociodemographics, along with the prevalence of self-reported HIV status, frequency of HIV testing, stigma, and other HIV-related characteristics. Crude and RDS-adjusted prevalence estimates were calculated. Crude prevalence estimates from the venue-based snowball samples were compared with the overlap of the RDS-adjusted prevalence estimates, between both FSW and MSM in Cameroon and Swaziland. RDS samples tended to be younger (MSM aged 18-21 years in Swaziland: 47.6% [139/310] in RDS vs 24.3% [42/173] in Snowball, in Cameroon: 47.9% [99/306] in RDS vs 20.1% [52/259] in Snowball; FSW aged 18-21 years in Swaziland 42.5% [82/325] in RDS vs 8.0% [20/249] in Snowball; in Cameroon 15.6% [75/576] in RDS vs 8.1% [25/306] in Snowball). They were less educated (MSM: primary school completed or less in Swaziland 42.6% [109/310] in RDS vs 4.0% [7/173] in Snowball, in Cameroon 46.2% [138/306] in RDS vs 14.3% [37/259] in Snowball; FSW: primary school completed or less in Swaziland 86.6% [281/325] in RDS vs 23.9% [59/247] in Snowball, in Cameroon 87.4% [520/576] in RDS vs 77.5% [238/307] in Snowball) than the snowball
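
    The specific RDS adjustment is not named in this record; one widely used choice is the Volz-Heckathorn (RDS-II) estimator, which weights each respondent inversely to their reported network size. The sketch below applies that estimator to invented data; the outcomes and degree values are illustrative only.

      def rds_ii_prevalence(outcomes, degrees):
          # Volz-Heckathorn (RDS-II) estimator: each respondent i is weighted by 1/d_i,
          # where d_i is the reported personal network size (degree).
          weights = [1.0 / d for d in degrees]
          weighted_positive = sum(w for w, y in zip(weights, outcomes) if y == 1)
          return weighted_positive / sum(weights)

      if __name__ == "__main__":
          # Hypothetical respondents: outcome 1 = tested for HIV in the last year
          outcomes = [1, 0, 1, 1, 0, 0, 1, 0]
          degrees  = [30, 5, 12, 50, 8, 4, 20, 10]   # self-reported network sizes
          crude = sum(outcomes) / len(outcomes)
          print(f"crude prevalence:         {crude:.3f}")
          print(f"RDS-II adjusted estimate: {rds_ii_prevalence(outcomes, degrees):.3f}")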

  10. A randomized study of multimedia informational aids for research on medical practices: implications for informed consent

    Science.gov (United States)

    Kraft, Stephanie A; Constantine, Melissa; Magnus, David; Porter, Kathryn M.; Lee, Sandra Soo-Jin; Green, Michael; Kass, Nancy E; Wilfond, Benjamin S.; Cho, Mildred K

    2016-01-01

    Background/aims Participant understanding is a key element of informed consent for enrollment in research. However, participants often do not understand the nature, risks, benefits, or design of the studies in which they take part. Research on medical practices, which studies standard interventions rather than new treatments, has the potential to be especially confusing to participants because it is embedded within usual clinical care. Our objective in this randomized study was to compare the ability of a range of multimedia informational aids to improve participant understanding in the context of research on medical practices. Methods We administered a Web-based survey to members of a proprietary online panel sample selected to match national U.S. demographics. Respondents were randomized to one of five arms: four content-equivalent informational aids (animated videos, slideshows with voiceover, comics, and text), and one no-intervention control. We measured knowledge of research on medical practices using a summary knowledge score from 10 questions based on the content of the informational aids. We used ANOVA and paired t-tests to compare knowledge scores between arms. Results There were 1500 completed surveys (300 in each arm). Mean knowledge scores were highest for the slideshows with voiceover (65.7%), followed by the animated videos (62.7%), comics (60.7%), text (57.2%), and control (50.3%). Differences between arms were statistically significant except between the slideshows with voiceover and animated videos and between the animated videos and comics. Informational aids that included an audio component (animated videos and slideshows with voiceover) had higher knowledge scores than those without an audio component (64.2% versus 59.0%). These findings indicate that multimedia informational aids can communicate such information more effectively than text alone. However, the relatively low knowledge scores suggest that targeted informational aids may be needed to teach some particularly challenging concepts. Nonetheless, our results demonstrate the

  11. Effectiveness of Housing First with Intensive Case Management in an Ethnically Diverse Sample of Homeless Adults with Mental Illness: A Randomized Controlled Trial.

    Directory of Open Access Journals (Sweden)

    Vicky Stergiopoulos

    Full Text Available Housing First (HF) is being widely disseminated in efforts to end homelessness among homeless adults with psychiatric disabilities. This study evaluates the effectiveness of HF with Intensive Case Management (ICM) among ethnically diverse homeless adults in an urban setting. 378 participants were randomized to HF with ICM or treatment-as-usual (TAU) in Toronto (Canada), and followed for 24 months. Measures of effectiveness included housing stability, physical (EQ5D-VAS) and mental (CSI, GAIN-SS) health, social functioning (MCAS), quality of life (QoLI20), and health service use. Two-thirds of the sample (63%) was from racialized groups and half (50%) were born outside Canada. Over the 24 months of follow-up, HF participants spent a significantly greater percentage of time in stable residences compared to TAU participants (75.1%, 95% CI 70.5 to 79.7, vs. 39.3%, 95% CI 34.3 to 44.2, respectively). Similarly, community functioning (MCAS) improved significantly from baseline in HF compared to TAU participants (change in mean difference = +1.67, 95% CI 0.04 to 3.30). There was a significant reduction in the number of days spent experiencing alcohol problems among the HF compared to TAU participants at 24 months (ratio of rate ratios = 0.47, 95% CI 0.22 to 0.99) relative to baseline, a reduction of 53%. Although the number of emergency department visits and days in hospital over 24 months did not differ significantly between HF and TAU participants, fewer HF participants compared to TAU participants had 1 or more hospitalizations during this period (70.4% vs. 81.1%, respectively; P=0.044). Compared to non-racialized HF participants, racialized HF participants saw an increase in the amount of money spent on alcohol (change in mean difference = $112.90, 95% CI 5.84 to 219.96) and a reduction in physical community integration (ratio of rate ratios = 0.67, 95% CI 0.47 to 0.96) from baseline to 24 months. Secondary analyses found a significant reduction in the number of days

  12. A Randomized trial of an Asthma Internet Self-management Intervention (RAISIN): study protocol for a randomized controlled trial.

    Science.gov (United States)

    Morrison, Deborah; Wyke, Sally; Thomson, Neil C; McConnachie, Alex; Agur, Karolina; Saunderson, Kathryn; Chaudhuri, Rekha; Mair, Frances S

    2014-05-24

    The financial costs associated with asthma care continue to increase while care remains suboptimal. Promoting optimal self-management, including the use of asthma action plans, along with regular health professional review has been shown to be an effective strategy and is recommended in asthma guidelines internationally. Despite evidence of benefit, guided self-management remains underused; however, the potential for online resources to promote self-management behaviors is gaining increasing recognition. The aim of this paper is to describe the protocol for a pilot evaluation of a website 'Living well with asthma' which has been developed with the aim of promoting self-management behaviors shown to improve outcomes. The study is a parallel randomized controlled trial, where adults with asthma are randomly assigned to either access to the website for 12 weeks, or usual asthma care for 12 weeks (followed by access to the website if desired). Individuals are included if they are over 16 years old, have a diagnosis of asthma with an Asthma Control Questionnaire (ACQ) score of greater than or equal to 1, and have access to the internet. Primary outcomes for this evaluation include recruitment and retention rates, changes at 12 weeks from baseline for both ACQ and Asthma Quality of Life Questionnaire (AQLQ) scores, and quantitative data describing website usage (number of times logged on, length of time logged on, number of times individual pages looked at, and for how long). Secondary outcomes include clinical outcomes (medication use, health services use, lung function) and patient reported outcomes (including adherence, patient activation measures, and health status). Piloting of complex interventions is considered best practice and will maximise the potential of any future large-scale randomized controlled trial to successfully recruit and be able to report on necessary outcomes. Here we will provide results across a range of outcomes which will provide estimates of

  13. Feasibility Study of Commercial Markets for New Sample Acquisition Devices

    Science.gov (United States)

    Brady, Collin; Coyne, Jim; Bilen, Sven G.; Kisenwether, Liz; Miller, Garry; Mueller, Robert P.; Zacny, Kris

    2010-01-01

    The NASA Exploration Systems Mission Directorate (ESMD) and Penn State technology commercialization project was designed to assist in the maturation of a NASA SBIR Phase III technology. The project was funded by NASA's ESMD Education group with oversight from the Surface Systems Office at NASA Kennedy Space Center in the Engineering Directorate. Two Penn State engineering student interns managed the project with support from Honeybee Robotics and NASA Kennedy Space Center. The objective was to find an opportunity to integrate SBIR-developed Regolith Extractor and Sampling Technology as the payload for the future Lunar Lander or Rover missions. The team was able to identify two potential Google Lunar X Prize organizations with considerable interest in utilizing regolith acquisition and transfer technology.

  14. Positive random variables with a discrete probability mass at the origin: Parameter estimation for left-censored samples with application to air quality monitoring data

    International Nuclear Information System (INIS)

    Gogolak, C.V.

    1986-11-01

    The concentration of a contaminant measured in a particular medium might be distributed as a positive random variable when it is present, but it may not always be present. If there is a level below which the concentration cannot be distinguished from zero by the analytical apparatus, a sample from such a population will be censored on the left. The presence of both zeros and positive values in the censored portion of such samples complicates the problem of estimating the parameters of the underlying positive random variable and the probability of a zero observation. Using the method of maximum likelihood, it is shown that the solution to this estimation problem reduces largely to that of estimating the parameters of the distribution truncated at the point of censorship. The maximum likelihood estimate of the proportion of zero values follows directly. The derivation of the maximum likelihood estimates for a lognormal population with zeros is given in detail, and the asymptotic properties of the estimates are examined. The estimation method was used to fit several different distributions to a set of severely censored 85Kr monitoring data from six locations at the Savannah River Plant chemical separations facilities.
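
    As a rough illustration of the estimation problem described above, the sketch below fits, by maximum likelihood, a point mass at zero plus a lognormal distribution for positive values to data that are left-censored at a detection limit. The simulated data, the detection limit, and the use of scipy are assumptions for illustration, not the report's original procedure.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      def negative_log_likelihood(params, observed, detection_limit):
          # params: p0 = P(true zero), mu and log(sigma) of log-concentration for positive values.
          p0, mu, log_sigma = params
          sigma = np.exp(log_sigma)
          if not (0.0 < p0 < 1.0):
              return np.inf
          censored = observed < detection_limit        # zeros and undetectable positives
          # Censored observations: true zero, or positive but below the detection limit.
          p_censored = p0 + (1 - p0) * norm.cdf(np.log(detection_limit), mu, sigma)
          ll = np.sum(censored) * np.log(p_censored)
          # Detected observations: lognormal density times P(not a true zero).
          x = observed[~censored]
          ll += np.sum(np.log(1 - p0) - np.log(x) - np.log(sigma)
                       + norm.logpdf((np.log(x) - mu) / sigma))
          return -ll

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          true = np.where(rng.random(500) < 0.2, 0.0,
                          rng.lognormal(mean=1.0, sigma=0.8, size=500))
          data = np.where(true < 1.5, 0.0, true)       # values below the detection limit read as zero
          fit = minimize(negative_log_likelihood, x0=[0.3, 0.5, 0.0],
                         args=(data, 1.5), method="Nelder-Mead")
          p0_hat, mu_hat, sigma_hat = fit.x[0], fit.x[1], np.exp(fit.x[2])
          print(f"estimated P(zero) = {p0_hat:.2f}, mu = {mu_hat:.2f}, sigma = {sigma_hat:.2f}")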

  15. Study on uranium leaching behavior from coal fly ash samples

    International Nuclear Information System (INIS)

    Police, S.; Maity, S.; Chaudhary, D.K.; Sahu, S.K.; Pandit, G.G.

    2017-01-01

    Leachability of trace and toxic metals from coal fly ash (FA) poses significant environmental problems, especially ground and surface water contamination. In the present study, the leachability of U was studied using batch leaching tests (i.e., at various leachate pH values) and using the TCLP. Results of the pH variation study indicate that U has higher leachability in an acidic medium as compared to a slightly alkaline medium. The leachable U concentrations observed in the pH variation study are well below the WHO safety limits. In the TCLP leachates, the leachable U concentrations are found to be higher than those observed in the pH variation study. (author)

  16. The Work-It Study for people with arthritis: Study protocol and baseline sample characteristics.

    Science.gov (United States)

    Keysor, Julie J; AlHeresh, Rawan; Vaughan, Molly; LaValley, Michael P; Allaire, Saralynn

    2016-06-14

    People with arthritis are at risk of work disability. Job accommodation and educational programs delivered before imminent work loss can minimize work disability, yet are not currently being widely implemented. The Work-It Study is a randomized controlled trial testing the efficacy of a problem solving program delivered by physical and occupational therapy practitioners to prevent work loss over a two-year period among people with arthritis and rheumatological conditions. The purpose of this paper is to describe the protocol of the randomized controlled trial, and describe the baseline characteristics of the subjects and their work outcomes. 287 participants were recruited from the Boston area in Massachusetts, USA. Eligible participants were aged between 21 and 65, self-reported a physician's diagnosis of arthritis, rheumatic condition, or chronic back pain, reported a concern about working now or in the near future due to their health, worked at least 15 hours a week, had plans to continue working, and worked or lived in Massachusetts. Subjects were recruited through community sources and rheumatology offices. Participants in the experimental group received a structured interview and an education and resource packet, while participants in the control group received the resource packet only. The baseline characteristics and work related outcomes of the participants were analyzed. To our knowledge, the Work-It Study is the largest and most diverse randomized controlled trial to date aiming to identify and problem solve work-related barriers, promote advocacy, and foster work disability knowledge among people with chronic disabling musculoskeletal conditions. Despite advances in medical management of arthritis and other rheumatological and musculoskeletal conditions, many people still have concerns about their ability to remain employed and are seeking strategies to help them sustain employment.

  17. Benfotiamine in treatment of alcoholic polyneuropathy: an 8-week randomized controlled study (BAP I Study).

    Science.gov (United States)

    Woelk, H; Lehrl, S; Bitsch, R; Köpcke, W

    1998-01-01

    A three-armed, randomized, multicentre, placebo-controlled double-blind study was used to examine the efficacy of benfotiamine vs a combination containing benfotiamine and vitamins B6 and B12 in out-patients with severe symptoms of alcoholic polyneuropathy (Benfotiamine in treatment of Alcoholic Polyneuropathy, BAP I). The study period was 8 weeks and 84 patients fulfilled all the prerequisite criteria and completed the study as planned. Benfotiamine led to significant improvement of alcoholic polyneuropathy. Vibration perception (measured at the tip of the great toe) significantly improved in the course of the study, as did motor function and the overall score reflecting the entire range of symptoms of alcoholic polyneuropathy. A tendency toward improvement was evident for pain and co-ordination; no therapy-specific adverse effects were seen.

  18. Random Telegraph Signal Amplitudes in Sub 100 nm (Decanano) MOSFETs: A 3D 'Atomistic' Simulation Study

    Science.gov (United States)

    Asenov, Asen; Balasubramaniam, R.; Brown, A. R.; Davies, J. H.; Saini, Subhash

    2000-01-01

    In this paper we use 3D simulations to study the amplitudes of random telegraph signals (RTS) associated with the trapping of a single carrier in interface states in the channel of sub 100 nm (decanano) MOSFETs. Both simulations using continuous doping charge and random discrete dopants in the active region of the MOSFETs are presented. We have studied the dependence of the RTS amplitudes on the position of the trapped charge in the channel and on the device design parameters. We have observed a significant increase in the maximum RTS amplitude when discrete random dopants are employed in the simulations.

  19. Assessing the precision of a time-sampling-based study among GPs: balancing sample size and measurement frequency.

    Science.gov (United States)

    van Hassel, Daniël; van der Velden, Lud; de Bakker, Dinny; van der Hoek, Lucas; Batenburg, Ronald

    2017-12-04

    Our research is based on a technique for time sampling, an innovative method for measuring the working hours of Dutch general practitioners (GPs), which was deployed in an earlier study. In this study, 1051 GPs were questioned about their activities in real time by sending them one SMS text message every 3 h during 1 week. Knowledge of the required sample size is important for health workforce planners who want to apply this method to target groups that are hard to reach or when fewer resources are available. In this time-sampling method, however, standard power analysis is not sufficient for calculating the required sample size, as it accounts only for sample fluctuation and not for the fluctuation of measurements taken from every participant. We investigated the impact of the number of participants and the frequency of measurements per participant upon the confidence intervals (CIs) for the hours worked per week. Statistical analyses of the time-use data we obtained from GPs were performed. Ninety-five percent CIs were calculated, using equations and simulation techniques, for various numbers of GPs included in the dataset and for various frequencies of measurements per participant. Our results showed that the one-tailed CI, including sample and measurement fluctuation, decreased from 21 h to 3 h as the number of GPs increased from one to 50. Beyond this point, precision continued to improve with additional GPs, but the gain per added GP became smaller. Likewise, the analyses showed how the number of participants required decreased if more measurements per participant were taken. For example, one measurement per 3-h time slot during the week requires 300 GPs to achieve a CI of 1 h, while one measurement per hour requires 100 GPs to obtain the same result. The sample size needed for time-use research based on a time-sampling technique depends on the design and aim of the study. In this paper, we showed how the precision of the
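
    A minimal simulation in the spirit of this analysis is sketched below: it approximates how wide the 95% confidence interval around mean weekly working hours would be for a given number of GPs and number of prompts per GP. Treating each prompt as a noisy reading of weekly hours, and the variance values used, are simplifying assumptions rather than the study's actual model.

      import numpy as np

      def ci_half_width(n_gps, n_measurements, true_mean=44.0,
                        between_sd=8.0, within_sd=12.0, n_sim=2000, seed=0):
          # Simulate a time-sampling study: each GP has an individual mean working time,
          # and each SMS prompt yields a noisy measurement around that mean.
          rng = np.random.default_rng(seed)
          estimates = []
          for _ in range(n_sim):
              gp_means = rng.normal(true_mean, between_sd, size=n_gps)
              prompts = rng.normal(gp_means[:, None], within_sd, size=(n_gps, n_measurements))
              estimates.append(prompts.mean())
          return 1.96 * np.std(estimates)          # ~95% CI half-width of the overall mean

      if __name__ == "__main__":
          # 56 prompts ~ one per 3-h slot for a week; 168 ~ one per hour.
          for n_gps, n_meas in [(50, 56), (100, 56), (100, 168), (300, 56)]:
              hw = ci_half_width(n_gps, n_meas)
              print(f"{n_gps:3d} GPs x {n_meas:3d} prompts -> CI half-width ~ {hw:.2f} h")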

  20. Electrophysiological effects of desflurane in children with Wolff-Parkinson-White syndrome: a randomized crossover study.

    Science.gov (United States)

    Hino, H; Oda, Y; Yoshida, Y; Suzuki, T; Shimada, M; Nishikawa, K

    2018-02-01

    We hypothesized that, compared with propofol, desflurane prolongs the antegrade accessory pathway effective refractory period (APERP) in children undergoing radiofrequency catheter ablation for Wolff-Parkinson-White (WPW) syndrome. In this randomized crossover study, children aged 4.1-16.1 years undergoing radiofrequency catheter ablation for WPW syndrome were randomly divided into four groups according to the concentration of desflurane and anesthetics used in the first and the second electrophysiological studies (EPS). After induction of general anesthesia with propofol and tracheal intubation, they received one of the following regimens: 0.5 minimum alveolar concentration (MAC) desflurane (first EPS) and propofol (second EPS) (Des0.5-Prop group, n = 8); propofol (first EPS) and 0.5 MAC desflurane (second EPS) (Prop-Des0.5 group, n = 9); 1 MAC desflurane (first EPS) and propofol (second EPS) (Des1.0-Prop group, n = 10); propofol (first EPS) and 1 MAC desflurane (second EPS) (Prop-Des1.0 group, n = 9). Radiofrequency catheter ablation was performed upon completion of EPS. Sample size was determined to detect a difference in the APERP. Desflurane at 1.0 MAC significantly prolonged the APERP compared with propofol, but did not affect the sinoatrial conduction time, atrio-His interval or atrioventricular node effective refractory period. Supraventricular tachycardia was induced in all children receiving propofol, but not induced in 1 and 4 children receiving 0.5 MAC and 1.0 MAC desflurane, respectively. Desflurane enhances the refractoriness and may block the electrical conduction of the atrioventricular accessory pathway, and is therefore not suitable for use in children undergoing radiofrequency catheter ablation for WPW syndrome. © 2017 The Acta Anaesthesiologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  1. The Effects of Experimentally Manipulated Social Status on Acute Eating Behavior: A Randomized, Crossover Pilot Study

    Science.gov (United States)

    Cardel, MI; Johnson, SL; Beck, J; Dhurandhar, E; Keita, AD; Tomczik, AC; Pavela, G; Huo, T; Janicke, DM; Muller, K; Piff, PK; Peters, JC; Hill, JO; Allison, DB

    2016-01-01

    Both subjective and objectively measured social status has been associated with multiple health outcomes, including weight status, but the mechanism for this relationship remains unclear. Experimental studies may help identify the causal mechanisms underlying low social standing as a pathway for obesity. Our objective was to investigate the effects of experimentally manipulated social status on ad libitum acute dietary intakes and stress-related outcomes as potential mechanisms relating social status and weight. This was a pilot feasibility, randomized, crossover study in Hispanic young adults (n=9; age 19–25; 67% female; BMI ≥18.5 and ≤30 kg/m2). At visit 1, participants consumed a standardized breakfast and were randomized to a high social status position (HIGH) or low social status position (LOW) in a rigged game of Monopoly™. The rules for the game differed substantially in terms of degree of ‘privilege’ depending on randomization to HIGH or LOW. Following Monopoly™, participants were given an ad libitum buffet meal and energy intakes (kcal) were estimated by pre- and post-weighing foods consumed. Stress-related markers were measured at baseline, after the game of Monopoly™, and after lunch. Visit 2 used the same standardized protocol; however, participants were exposed to the opposite social status condition. When compared to HIGH, participants in LOW consumed 130 more calories (p=0.07) and a significantly higher proportion of their daily calorie needs in the ad libitum buffet meal (39% in LOW versus 31% in HIGH; p=0.04). In LOW, participants reported decreased feelings of pride and powerfulness following Monopoly™ (p=0.05) and after their lunch meal (p=0.08). Relative to HIGH, participants in LOW demonstrated higher heart rates following Monopoly™ (p=0.06), but this relationship was not significant once lunch was consumed (p=0.31). Our pilot data suggest a possible causal relationship between experimentally manipulated low social status

  2. The effects of experimentally manipulated social status on acute eating behavior: A randomized, crossover pilot study.

    Science.gov (United States)

    Cardel, M I; Johnson, S L; Beck, J; Dhurandhar, E; Keita, A D; Tomczik, A C; Pavela, G; Huo, T; Janicke, D M; Muller, K; Piff, P K; Peters, J C; Hill, J O; Allison, D B

    2016-08-01

    Both subjective and objectively measured social status has been associated with multiple health outcomes, including weight status, but the mechanism for this relationship remains unclear. Experimental studies may help identify the causal mechanisms underlying low social standing as a pathway for obesity. Our objective was to investigate the effects of experimentally manipulated social status on ad libitum acute dietary intakes and stress-related outcomes as potential mechanisms relating social status and weight. This was a pilot feasibility, randomized, crossover study in Hispanic young adults (n=9; age 19-25; 67% female; BMI ≥18.5 and ≤30kg/m(2)). At visit 1, participants consumed a standardized breakfast and were randomized to a high social status position (HIGH) or low social status position (LOW) in a rigged game of Monopoly™. The rules for the game differed substantially in terms of degree of 'privilege' depending on randomization to HIGH or LOW. Following Monopoly™, participants were given an ad libitum buffet meal and energy intakes (kcal) were estimated by pre- and post-weighing foods consumed. Stress-related markers were measured at baseline, after the game of Monopoly™, and after lunch. Visit 2 used the same standardized protocol; however, participants were exposed to the opposite social status condition. When compared to HIGH, participants in LOW consumed 130 more calories (p=0.07) and a significantly higher proportion of their daily calorie needs in the ad libitum buffet meal (39% in LOW versus 31% in HIGH; p=0.04). In LOW, participants reported decreased feelings of pride and powerfulness following Monopoly™ (p=0.05) and after their lunch meal (p=0.08). Relative to HIGH, participants in LOW demonstrated higher heart rates following Monopoly™ (p=0.06), but this relationship was not significant once lunch was consumed (p=0.31). Our pilot data suggest a possible causal relationship between experimentally manipulated low social status and

  3. Prednisolone and acupuncture in Bell's palsy: study protocol for a randomized, controlled trial

    Directory of Open Access Journals (Sweden)

    Wang Kangjun

    2011-06-01

    Full Text Available Abstract Background There are a variety of treatment options for Bell's palsy. Evidence from randomized controlled trials indicates corticosteroids can be used as a proven therapy for Bell's palsy. Acupuncture is one of the most commonly used methods to treat Bell's palsy in China. Recent studies suggest that staging treatment is more suitable for Bell's palsy, according to different path-stages of this disease. The aim of this study is to compare the effects of prednisolone and staging acupuncture in the recovery of the affected facial nerve, and to verify whether prednisolone in combination with staging acupuncture is more effective than prednisolone alone for Bell's palsy in a large number of patients. Methods/Design In this article, we report the design and protocol of a large sample multi-center randomized controlled trial to treat Bell's palsy with prednisolone and/or acupuncture. In total, 1200 patients aged 18 to 75 years within 72 h of onset of acute, unilateral, peripheral facial palsy will be assessed. There are six treatment groups, with four treated according to different path-stages and two not. These patients are randomly assigned to be in one of the following six treatment groups, i.e. (1) placebo prednisolone group, (2) prednisolone group, (3) placebo prednisolone plus acute stage acupuncture group, (4) prednisolone plus acute stage acupuncture group, (5) placebo prednisolone plus resting stage acupuncture group, (6) prednisolone plus resting stage acupuncture group. The primary outcome is the time to complete recovery of facial function, assessed by Sunnybrook system and House-Brackmann scale. The secondary outcomes include the incidence of ipsilateral pain in the early stage of palsy (and the duration of this pain), the proportion of patients with severe pain, the occurrence of synkinesis, facial spasm or contracture, and the severity of residual facial symptoms during the study period. Discussion The result of this trial will assess the

  4. Gambling problems in the family – A stratified probability sample study of prevalence and reported consequences

    Directory of Open Access Journals (Sweden)

    Øren Anita

    2008-12-01

    Full Text Available Abstract Background Prior studies on the impact of problem gambling in the family mainly include help-seeking populations with small numbers of participants. The objective of the present stratified probability sample study was to explore the epidemiology of problem gambling in the family in the general population. Methods Men and women 16–74 years old, randomly selected from the Norwegian national population database, received an invitation to participate in this postal questionnaire study. The response rate was 36.1% (3,483/9,638). Given the lack of validated criteria, two survey questions ("Have you ever noticed that a close relative spent more and more money on gambling?" and "Have you ever experienced that a close relative lied to you about how much he/she gambles?") were extrapolated from the Lie/Bet Screen for pathological gambling. Respondents answering "yes" to both questions were defined as Concerned Significant Others (CSOs). Results Overall, 2.0% of the study population was defined as CSOs. Young age, female gender, and divorced marital status were factors positively associated with being a CSO. CSOs often reported having experienced conflicts in the family related to gambling, worsening of the family's financial situation, and impaired mental and physical health. Conclusion Problematic gambling behaviour not only affects the gambling individual but also has a strong impact on the quality of life of family members.

  5. An alternative approach to treating lateral epicondylitis. A randomized, placebo-controlled, double-blinded study

    NARCIS (Netherlands)

    Nourbakhsh, Mohammad Reza; Fearon, Frank J.

    Objective: To investigate the effect of noxious level electrical stimulation on pain, grip strength and functional abilities in subjects with chronic lateral epicondylitis. Design: Randomized, placebo-controlled, double-blinded study. Setting: Physical Therapy Department, North Georgia College and

  6. Evaluating sampling strategy for DNA barcoding study of coastal and inland halo-tolerant Poaceae and Chenopodiaceae: A case study for increased sample size.

    Directory of Open Access Journals (Sweden)

    Peng-Cheng Yao

    Full Text Available Environmental conditions in coastal salt marsh habitats have led to the development of specialist genetic adaptations. We evaluated six DNA barcode loci in 53 species of Poaceae and 15 species of Chenopodiaceae from China's coastal salt marsh area and inland area. Our results indicate that the optimum DNA barcode was ITS for coastal salt-tolerant Poaceae and matK for the Chenopodiaceae. Sampling strategies for ten common species of Poaceae and Chenopodiaceae were analyzed according to the optimum barcode. We found that by increasing the number of samples collected from the coastal salt marsh area on the basis of inland samples, the number of haplotypes of Arundinella hirta, Digitaria ciliaris, Eleusine indica, Imperata cylindrica, Setaria viridis, and Chenopodium glaucum increased, with a principal coordinate plot clearly showing increased distribution points. The results of a Mann-Whitney test showed that for Digitaria ciliaris, Eleusine indica, Imperata cylindrica, and Setaria viridis, the distribution of intraspecific genetic distances was significantly different when samples from the coastal salt marsh area were included (P < 0.01). These results suggest that increasing the sample size in specialist habitats can improve measurements of intraspecific genetic diversity, and will have a positive effect on the application of DNA barcodes in widely distributed species. The results of random sampling showed that when sample size reached 11 for Chloris virgata, Chenopodium glaucum, and Dysphania ambrosioides, 13 for Setaria viridis, and 15 for Eleusine indica, Imperata cylindrica and Chenopodium album, the average intraspecific distance tended to reach stability. These results indicate that the sample size for DNA barcoding of globally distributed species should be increased to 11-15.
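
    The random-sampling check described at the end, drawing progressively larger subsamples and watching the average intraspecific distance stabilize, can be sketched as a simple rarefaction loop over a pairwise distance matrix. The random matrix below stands in for real barcode distances and is purely illustrative.

      import numpy as np

      def mean_intraspecific_distance(dist_matrix, sample_idx):
          # Average pairwise distance among the sampled individuals of one species.
          sub = dist_matrix[np.ix_(sample_idx, sample_idx)]
          n = len(sample_idx)
          return sub[np.triu_indices(n, k=1)].mean()

      def rarefaction_curve(dist_matrix, max_n, n_rep=200, seed=0):
          rng = np.random.default_rng(seed)
          n_total = dist_matrix.shape[0]
          curve = []
          for n in range(2, max_n + 1):
              reps = [mean_intraspecific_distance(dist_matrix,
                                                  rng.choice(n_total, n, replace=False))
                      for _ in range(n_rep)]
              curve.append((n, float(np.mean(reps))))
          return curve

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          d = rng.random((30, 30)) * 0.02              # stand-in for pairwise barcode distances
          d = (d + d.T) / 2
          np.fill_diagonal(d, 0.0)
          for n, mean_d in rarefaction_curve(d, max_n=15):
              print(f"sample size {n:2d}: mean intraspecific distance = {mean_d:.4f}")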

  7. Study of C-MYC amplification and expression in Iranian gastric cancer samples using CISH and IHC methods.

    Science.gov (United States)

    Khaleghian, Malihea; Jahanzad, Issa; Shakoori, Abbas; Ardalan, Farid Azmoudeh; Azimi, Cyrus

    2015-01-01

    Gastric cancer is the fourth most frequent malignancy and the second cause of cancer-related mortality worldwide. It has been suggested that in gastric carcinogenesis, the C-MYC gene has an important function. The objective of this study is to establish whether chromogenic in situ hybridization (CISH) or immunohistochemistry (IHC) is preferable in the diagnosis and prognosis of gastric cancer. Samples comprised 50 randomly selected patients, of whom 40 were male and 10 female. To evaluate the MYC copy number and its protein expression, CISH and IHC analyses were performed for 50 gastric adenocarcinomas, in Iran. The location of the tumor in 64% of the patients was the fundus, and in 72% of patients, the tumors were of a diffuse type; 22 samples showed no amplification, and 28 samples showed amplification. MYC immunoreactivity was observed in 13 samples. Twelve samples showed both MYC amplification and MYC immunoreactivity. In addition, among the 28 CISH+ samples, 12 samples had positive signals for IHC and 16 samples had negative signals for IHC. A majority of the IHC-negative patients had no amplification, but only one IHC-positive patient had no amplification. We conclude that, in regard to sensitivity and specificity, CISH was a better and more feasible test than IHC for the management and treatment of gastric cancer, and deserves clinicians' attention for prognosis and tumor progression.
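
    Reading the reported counts as a two-by-two table, with CISH treated as the reference standard (an interpretive assumption), the sensitivity and specificity of IHC implied by the abstract can be reproduced as follows.

      # Counts as reported in the abstract: 28 CISH+ (12 also IHC+, 16 IHC-)
      # and 22 CISH- (1 IHC+, 21 IHC-); CISH is treated here as the reference standard.
      true_positive, false_negative = 12, 16
      false_positive, true_negative = 1, 21

      sensitivity = true_positive / (true_positive + false_negative)   # 12/28, about 0.43
      specificity = true_negative / (true_negative + false_positive)   # 21/22, about 0.95

      print(f"IHC sensitivity vs CISH: {sensitivity:.1%}")
      print(f"IHC specificity vs CISH: {specificity:.1%}")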

  8. Study of C-MYC amplification and expression in Iranian gastric cancer samples using CISH and IHC methods

    Directory of Open Access Journals (Sweden)

    Malihea Khaleghian

    2015-01-01

    Full Text Available Background: Gastric cancer is the fourth most frequent malignancy and the second cause of cancer-related mortality worldwide. It has been suggested that in gastric carcinogenesis, the C-MYC gene has an important function. The objective of this study is to establish whether Chromogenic in situ hybridization (CISH) or Immunohistochemistry (IHC) is preferable in the diagnosis and prognosis of gastric cancer. Materials and Methods: Samples comprised 50 randomly selected patients, of whom 40 were male and 10 female. To evaluate the MYC copy number and its protein expression, CISH and IHC analyses were performed for 50 gastric adenocarcinomas, in Iran. Results: The location of the tumor in 64% of the patients was the fundus, and in 72% of patients, the tumors were of a diffuse type; 22 samples showed no amplification, and 28 samples showed amplification. MYC immunoreactivity was observed in 13 samples. Twelve samples showed both MYC amplification and MYC immunoreactivity. In addition, among the 28 CISH+ samples, 12 samples had positive signals for IHC and 16 samples had negative signals for IHC. A majority of the IHC-negative patients had no amplification, but only one IHC-positive patient had no amplification. Conclusion: In regard to sensitivity and specificity, CISH was a better and more feasible test than IHC for the management and treatment of gastric cancer, and deserves clinicians' attention for prognosis and tumor progression.

  9. Effectiveness of Wii-based rehabilitation in stroke: A randomized controlled study

    OpenAIRE

    Ayça Utkan Karasu; Elif Balevi Batur; Gülçin Kaymak Karataş

    2018-01-01

    Objective: To investigate the efficacy of Nintendo Wii Fit®-based balance rehabilitation as an adjunctive therapy to conventional rehabilitation in stroke patients. Methods: During the study period, 70 stroke patients were evaluated. Of these, 23 who met the study criteria were randomly assigned to either the experimental group (n = 12) or the control group (n = 11) by block randomization. Primary outcome measures were Berg Balance Scale, Functional Reach Test, Postural Asses...

  10. Northeast Cooperative Research Study Fleet (SF) Program Biological Sampling Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Northeast Cooperative Research Study Fleet (SF) Program partners with a subset of commercial fishermen to collect high quality, high resolution, haul by haul...

  11. Study of Kissing Molars in Turkish Population Sample

    African Journals Online (AJOL)

    2017-06-28

    Jun 28, 2017 ... unerupted teeth; retention of the mandibular second molar is ... and November 2014 for surgical treatment retrospectively were evaluated. The cases of ... and treatment. Results: Of the 6570 radiographs included in the study,.

  12. Aragonite-Calcite Inversion During Biogenic Carbonate Sampling: Considerations for Interpreting Isotopic Measurements in Paleoclimate Studies

    Science.gov (United States)

    Waite, A. J.; Swart, P. K.

    2011-12-01

    , suggests that the isotopic depletion is tied to the polymorphic inversion of aragonite to calcite, and not just random chance based on natural isotopic variability in the skeleton. There appears to be no relationship between the percent inversion and carbon isotopic composition. Elemental ratios also appear to remain stable during the heating and inversion process. The findings of this and published studies present, in many cases, conflicting views of the isotopic fractionation associated with inversion of aragonite to calcite. Discrepancies such as this likely result from subtle differences in sampling protocol related to instruments, drill bits, skeletal density, and possibly even laboratory conditions like temperature and humidity, further complicating our understanding and interpretation of such observations. Preliminary investigation suggests that altering milling conditions or wet milling may reduce the extent of alteration. Unfortunately, milling/drilling remains one of the only practical methods of sampling biogenic carbonates at a high resolution for paleoclimate work and, as such, caution should be taken in the interpretation of oxygen isotopic measurements from specimens of this nature.

  13. Feasibility and effects of the semirecumbent position to prevent ventilator-associated pneumonia: a randomized study.

    Science.gov (United States)

    van Nieuwenhoven, Christianne A; Vandenbroucke-Grauls, Christine; van Tiel, Frank H; Joore, Hans C A; van Schijndel, Rob J M Strack; van der Tweel, Ingeborg; Ramsay, Graham; Bonten, Marc J M

    2006-02-01

    Reducing aspiration of gastric contents by placing mechanically ventilated patients in a semirecumbent position has been associated with lower incidences of ventilator-associated pneumonia (VAP). The feasibility and efficacy of this intervention in a larger patient population, however, are unknown. Assessment of the feasibility of the semirecumbent position for intensive care unit patients and its influence on development of VAP. In a prospective multicentered trial, critically ill patients undergoing mechanical ventilation were randomly assigned to the semirecumbent position, with a target backrest elevation of 45 degrees, or standard care (i.e., supine position) with a backrest elevation of 10 degrees. Backrest elevation was measured continuously during the first week of ventilation with a monitor-linked device. A deviation of position was defined as a change of the randomized position >5 degrees. Diagnosis of VAP was made by quantitative cultures of samples obtained by bronchoscopic techniques. One hundred nine patients were assigned to the supine group and 112 to the semirecumbent group. Baseline characteristics were comparable for both groups. Average elevations were 9.8 degrees and 16.1 degrees at day 1 and day 7, respectively, for the supine group and 28.1 degrees and 22.6 degrees at day 1 and day 7, respectively, for the semirecumbent group. The target semirecumbent position of 45 degrees was not achieved for 85% of the study time, and these patients more frequently changed position than supine-positioned patients. VAP was diagnosed in eight patients (6.5%) in the supine group and in 13 (10.7%) in the semirecumbent group (NS), after a mean of 6 (range, 3-9) and 7 (range, 3-12) days, respectively. There were no differences in numbers of patients undergoing enteral feeding, receiving stress ulcer prophylaxis, or developing pressure sores or in mortality rates or duration of ventilation and intensive care unit stay between the groups. The targeted backrest elevation of 45 degrees

  14. Effects of Herbal vigRX on Premature Ejaculation: A randomized, double-blind study

    Directory of Open Access Journals (Sweden)

    Z Ghafuri

    2010-05-01

    Full Text Available Objective: We conducted a double-blind, placebo-controlled study to determine the efficacy of an herbal sexual supplement (vigRX) on premature ejaculation (PE). Method: A randomized double-blind study was conducted on a fixed dose of herbal vigRX at Roozbeh Psychiatry Hospital, Tehran University of Medical Sciences. The sample consisted of 85 married patients diagnosed with primary PE according to the Diagnostic and Statistical Manual of Mental Disorders. Each patient underwent diagnostic evaluation by one trained psychiatrist, using the Structured Clinical Interview for DSM-IV-TR. Each patient was evaluated by researchers to exclude organic sexual dysfunctions. The patients were randomly assigned into two groups: group 1 consisting of 42 patients receiving placebo, and group 2 consisting of 43 patients receiving 540 mg herbal vigRX for a 4-week treatment course. The effects of the drug on ejaculatory function in each group were assessed by the intravaginal ejaculation latency time (IELT) and the Chinese Index of Premature Ejaculation (CIPE) before and at the end of the treatment course. Statistical analysis was performed using SPSS software (version 15). Results: The mean IELT increased 22.4 and 32.0 seconds in the placebo and the vigRX group, respectively, after the treatment course. The mean IELT difference between the two groups was not significant. The mean CIPE score increased 2.40 and 4.37 in the placebo and the vigRX group, respectively. The mean CIPE score difference between the two groups was not significant. No side effects were reported by the subjects in either group during the treatment course. Conclusion: Although the improvement in IELT and CIPE scores in the herbal vigRX group was greater than in the placebo group, the difference was not statistically significant. The increase of IELT and CIPE scores in the placebo group may be due to placebo effects. Further studies with higher vigRX doses, greater sample size

  15. Quantification of damage in DNA recovered from highly degraded samples – a case study on DNA in faeces

    Directory of Open Access Journals (Sweden)

    Eveson J Paige

    2006-08-01

    Full Text Available Abstract Background Poorly preserved biological tissues have become an important source of DNA for a wide range of zoological studies. Measuring the quality of DNA obtained from these samples is often desired; however, there are no widely used techniques available for quantifying damage in highly degraded DNA samples. We present a general method that can be used to determine the frequency of polymerase blocking DNA damage in specific gene-regions in such samples. The approach uses quantitative PCR to measure the amount of DNA present at several fragment sizes within a sample. According to a model of random degradation, the amount of available template will decline exponentially with increasing fragment size in damaged samples, and the frequency of DNA damage (λ) can be estimated by determining the rate of decline. Results The method is illustrated through the analysis of DNA extracted from sea lion faecal samples. Faeces contain a complex mixture of DNA from several sources and different components are expected to be differentially degraded. We estimated the frequency of DNA damage in both predator and prey DNA within individual faecal samples. The distribution of fragment lengths for each target fit well with the assumption of a random degradation process and, in keeping with our expectations, the estimated frequency of damage was always less in predator DNA than in prey DNA within the same sample (mean λpredator = 0.0106 per nucleotide; mean λprey = 0.0176 per nucleotide). This study is the first to explicitly define the amount of template damage in any DNA extracted from faeces and the first to quantify the amount of predator and prey DNA present within individual faecal samples. Conclusion We present an approach for characterizing mixed, highly degraded PCR templates such as those often encountered in ecological studies using non-invasive samples as a source of DNA, wildlife forensics investigations and ancient DNA research. This method will
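
    The random-degradation model implies that the amount of amplifiable template declines roughly as N(L) = N0*exp(-λL) with fragment length L, so λ can be estimated from qPCR quantities at several fragment sizes by a log-linear fit. The sketch below does this with numpy; the example quantities are invented and only loosely echo the reported λ values.

      import numpy as np

      def estimate_damage_frequency(fragment_lengths, template_amounts):
          # Under random degradation, amount(L) = N0 * exp(-lambda * L), so a linear fit of
          # log(amount) against fragment length L gives slope = -lambda (per nucleotide).
          slope, intercept = np.polyfit(fragment_lengths, np.log(template_amounts), deg=1)
          return -slope

      if __name__ == "__main__":
          lengths = np.array([80, 150, 250, 400])                 # qPCR amplicon sizes (bp)
          prey_amounts = np.array([950.0, 280.0, 48.0, 3.5])      # hypothetical template copies
          predator_amounts = np.array([980.0, 470.0, 165.0, 17.0])
          print(f"lambda (prey)     ~ {estimate_damage_frequency(lengths, prey_amounts):.4f} per nucleotide")
          print(f"lambda (predator) ~ {estimate_damage_frequency(lengths, predator_amounts):.4f} per nucleotide")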

  16. Localization in random bipartite graphs: Numerical and empirical study

    Science.gov (United States)

    Slanina, František

    2017-05-01

    We investigate adjacency matrices of bipartite graphs with a power-law degree distribution. Motivation for this study is twofold: first, vibrational states in granular matter and jammed sphere packings; second, graphs encoding social interaction, especially electronic commerce. We establish the position of the mobility edge and show that it strongly depends on the power in the degree distribution and on the ratio of the sizes of the two parts of the bipartite graph. At the jamming threshold, where the two parts have the same size, localization vanishes. We found that the multifractal spectrum is nontrivial in the delocalized phase, but still near the mobility edge. We also study an empirical bipartite graph, namely, the Amazon reviewer-item network. We found that in this specific graph the mobility edge disappears, and we draw a conclusion from this fact regarding earlier empirical studies of the Amazon network.
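
    A numerical experiment along these lines, diagonalizing the adjacency matrix of a random bipartite graph with heavy-tailed degrees and using the inverse participation ratio (IPR) to separate localized from delocalized eigenvectors, can be sketched as follows. The graph generator and all parameter values are simplified assumptions, not the paper's exact model.

      import numpy as np

      def random_bipartite_adjacency(n_left, n_right, gamma=2.5, seed=0):
          # Bipartite graph with approximately power-law expected degrees on both sides
          # (a simplified "hidden variable" construction for illustration only).
          rng = np.random.default_rng(seed)
          w_left = rng.pareto(gamma - 1, n_left) + 1
          w_right = rng.pareto(gamma - 1, n_right) + 1
          p = np.minimum(1.0, np.outer(w_left, w_right) / (w_left.sum() + w_right.sum()) * 4)
          b = (rng.random((n_left, n_right)) < p).astype(float)
          a = np.zeros((n_left + n_right, n_left + n_right))
          a[:n_left, n_left:] = b
          a[n_left:, :n_left] = b.T
          return a

      if __name__ == "__main__":
          a = random_bipartite_adjacency(300, 150)
          eigenvalues, eigenvectors = np.linalg.eigh(a)
          ipr = np.sum(eigenvectors ** 4, axis=0)      # large IPR -> localized eigenvector
          for k in range(0, len(eigenvalues), 90):
              print(f"eigenvalue {eigenvalues[k]:7.3f}   IPR {ipr[k]:.4f}")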

  17. Fission-track studies of uranium distribution in geological samples

    International Nuclear Information System (INIS)

    Brynard, H.J.

    1983-01-01

    The standard method of studying uranium distribution in geological material by registration of fission tracks from the thermal neutron-induced fission of 235 U has been adapted for utilisation in the SAFARI-1 reactor at Pelindaba. The theory of fission-track registration as well as practical problems are discussed. The method has been applied to study uranium distribution in a variety of rock types, and the results are discussed in this paper. The method is very sensitive, and uranium present in quantities far below the detection limit of the microprobe has been detected.

  18. Genetic Influences on Pulmonary Function: A Large Sample Twin Study

    DEFF Research Database (Denmark)

    Ingebrigtsen, Truls S; Thomsen, Simon F; van der Sluis, Sophie

    2011-01-01

    Heritability of forced expiratory volume in one second (FEV(1)), forced vital capacity (FVC), and peak expiratory flow (PEF) has not been previously addressed in large twin studies. We evaluated the genetic contribution to individual differences observed in FEV(1), FVC, and PEF using data from...... the largest population-based twin study on spirometry. Specially trained lay interviewers with previous experience in spirometric measurements tested 4,314 Danish twins (individuals), 46-68 years of age, in their homes using a hand-held spirometer, and their flow-volume curves were evaluated. Modern variance...

  19. Pediatric Basic Life Support Self-training is Comparable to Instructor-led Training: A randomized manikin study

    DEFF Research Database (Denmark)

    Vestergaard, L. D.; Løfgren, Bo; Jessen, C.

    2011-01-01

    Pediatric Basic Life Support Self-training is comparable to Instructor-led Training: A randomized manikin study.

  20. Transfusion strategy in hematological intensive care unit: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Chantepie, Sylvain P; Mear, Jean-Baptiste; Guittet, Lydia; Dervaux, Benoît; Marolleau, Jean-Pierre; Jardin, Fabrice; Dutheil, Jean-Jacques; Parienti, Jean-Jacques; Vilque, Jean-Pierre; Reman, Oumedaly

    2015-11-23

    Packed red blood cell (PRBC) transfusion is required in hematology patients treated with chemotherapy for acute leukemia, autologous (auto) or allogeneic (allo) hematopoietic stem cell transplantation (HSCT). In certain situations like septic shock, hip surgery, coronary disease or gastrointestinal hemorrhage, a restrictive transfusion strategy is associated with a reduction of infection and death. A transfusion strategy using a single PRBC unit has been retrospectively investigated and showed a safe reduction of PRBC consumption and costs. We therefore designed a study to prospectively demonstrate that the transfusion of a single PRBC unit is safe and not inferior to standard care. The 1versus2 trial is a randomized trial which will determine if a single-unit transfusion policy is not inferior to a double-unit transfusion policy. The primary endpoint is the incidence of severe complication (grade ≥ 3) defined as stroke, transient ischemic attack, acute coronary syndrome, heart failure, elevated troponin level, intensive care unit transfer, death, new pulmonary infiltrates, and transfusion-related infections during hospital stays. The secondary endpoint is the number of PRBC units transfused per patient per hospital stay. Two hundred and thirty patients will be randomized to receive a single unit or double unit every time the hemoglobin level is less than 8 g/dL. All patients admitted for induction remission chemotherapy, auto-HSCT or allo-HSCT in hematology intensive care units will be eligible for inclusion. Sample size calculation has determined that a patient population of 230 will be required to prove that the 1-unit PRBC strategy is non-inferior to the 2-unit PRBC strategy. Hemoglobin threshold for transfusion is below 8 g/dL. Estimated percentage of complication-free hospital stays is 93 %. In a non-inferiority hypothesis, the number of patients to include is 230 with a power of 90 % and an alpha risk of 5 %. 14-128; Clinicaltrials.gov NCT02461264
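
    The sample-size reasoning quoted above (complication-free rates around 93 %, 90 % power, 5 % alpha, non-inferiority design) can be reproduced approximately with the usual normal-approximation formula for two proportions. The sketch below assumes a non-inferiority margin of 10 percentage points, which is not stated in this record, so it should be read as an illustration rather than the trial's actual calculation:

```python
from math import ceil
from scipy.stats import norm

def ni_sample_size(p_control, p_treat, margin, alpha=0.05, power=0.90):
    """Per-arm n for a non-inferiority comparison of two proportions
    (normal approximation, one-sided alpha)."""
    z_a, z_b = norm.ppf(1 - alpha), norm.ppf(power)
    var = p_control * (1 - p_control) + p_treat * (1 - p_treat)
    return ceil((z_a + z_b)**2 * var / (margin - (p_treat - p_control))**2)

# 93 % complication-free stays assumed in both arms, 90 % power, one-sided alpha 5 %,
# and an assumed non-inferiority margin of 0.10.
n_per_arm = ni_sample_size(0.93, 0.93, margin=0.10)
print(n_per_arm, "patients per arm,", 2 * n_per_arm, "in total")  # ~112 per arm, ~224 total
```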

  1. Arterial puncture using insulin needle is less painful than with standard needle: a randomized crossover study.

    Science.gov (United States)

    Ibrahim, Irwani; Yau, Ying Wei; Ong, Lizhen; Chan, Yiong Huak; Kuan, Win Sen

    2015-03-01

    Arterial punctures are important procedures performed by emergency physicians in the assessment of ill patients. However, arterial punctures are painful and can create anxiety and needle phobia in patients. The pain scores of radial arterial punctures were compared between the insulin needle and the standard 23-gauge hypodermic needle. In a randomized controlled crossover design, healthy volunteers were recruited to undergo bilateral radial arterial punctures. They were assigned to receive either the insulin or the standard needle as the first puncture, using blocked randomization. The primary outcome was the pain score measured on a 100-mm visual analogue scale (VAS) for pain, and secondary outcomes were rate of hemolysis, mean potassium values, and procedural complications immediately and 24 hours postprocedure. Fifty healthy volunteers were included in the study. The mean (±standard deviation) VAS score for punctures with the insulin needle was lower than with the standard needle (23 ± 22 mm vs. 39 ± 24 mm; mean difference = -15 mm; 95% confidence interval = -22 mm to -7 mm). The rate of hemolysis and the mean potassium value were higher with the insulin needle than with the standard needle (31.3% vs. 11.6%, p = 0.035; and 4.6 ± 0.7 mmol/L vs. 4.2 ± 0.5 mmol/L, p = 0.002). Procedural complications were lower in punctures with the insulin needle immediately postprocedure (0% vs. 24%) compared with standard needles. However, due to the higher rate of hemolysis, its use should be limited to conditions that do not require a concurrent potassium value in the same blood sample. © 2015 by the Society for Academic Emergency Medicine.

  2. A Prospective, Randomized Study Comparing 7-day and 14-day ...

    African Journals Online (AJOL)

    2018-02-07

    ... and 14-day quadruple therapies as first-line treatments for Helicobacter pylori infection in ... Furthermore, in a large-scale, multicenter, Japanese study ...

  3. Localization of optical excitations on random surfaces: SNOM studies

    DEFF Research Database (Denmark)

    Bozhevolnyi, Sergey I.

    1999-01-01

    Localization of optical excitations on nanostructured metal surfaces and fractal colloid silver aggregates are studied by using a scanning near-field optical microscope (SNOM). The SNOM images obtained in both configurations exhibit spatially localized (within 150 to 250 nm) light intensity...

  4. Methodology Series Module 5: Sampling Strategies.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling - based on the researcher's choice or on the population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. A random sampling method (such as a simple random sample or a stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
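
    As a quick illustration of the probability sampling methods named in this module, the following Python sketch draws a simple random, a systematic, and a stratified random sample of 100 subjects from a hypothetical frame of 1,000 IDs split across two clinics; the frame and strata are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(42)
frame = np.arange(1, 1001)                       # a sampling frame of 1000 subject IDs
strata = np.repeat(["clinic_A", "clinic_B"], 500)
n = 100

# 1) Simple random sample: every subject has the same chance of selection.
srs = rng.choice(frame, size=n, replace=False)

# 2) Systematic random sample: random start, then every k-th subject.
k = len(frame) // n
start = rng.integers(k)
systematic = frame[start::k][:n]

# 3) Stratified random sample: draw proportionally within each stratum.
stratified = np.concatenate([
    rng.choice(frame[strata == s], size=n // 2, replace=False)
    for s in np.unique(strata)
])
print(len(srs), len(systematic), len(stratified))
```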

  5. Methodology series module 5: Sampling strategies

    Directory of Open Access Journals (Sweden)

    Maninder Singh Setia

    2016-01-01

    Full Text Available Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling - based on the researcher's choice or on the population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. A random sampling method (such as a simple random sample or a stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.

  6. Fetal response to abbreviated relaxation techniques. A randomized controlled study.

    Science.gov (United States)

    Fink, Nadine S; Urech, Corinne; Isabel, Fornaro; Meyer, Andrea; Hoesli, Irène; Bitzer, Johannes; Alder, Judith

    2011-02-01

    Stress during pregnancy can have adverse effects on the course of pregnancy and on fetal development. There are few studies investigating the outcome of stress reduction interventions on maternal well-being and obstetric outcome. This study aims (1) to obtain fetal behavioral states (quiet/active sleep, quiet/active wakefulness), (2) to investigate the effects of maternal relaxation on fetal behavior as well as on uterine activity, and (3) to investigate maternal physiological and endocrine parameters as potential underlying mechanisms for maternal-fetal relaxation transferral. The behavior of 33 fetuses was analyzed during laboratory relaxation/quiet rest (control group, CG) and controlled for baseline fetal behavior. Potential associations between relaxation/quiet rest and fetal behavior (fetal heart rate (FHR), FHR variation, FHR acceleration, and body movements) and uterine activity were studied, using a computerized cardiotocogram (CTG) system. Maternal heart rate, blood pressure, cortisol, and norepinephrine were measured. Intervention (progressive muscle relaxation, PMR, and guided imagery, GI) showed changes in fetal behavior. The intervention groups had higher long-term variation during and after relaxation compared to the CG (p=.039). CG fetuses had more FHR acceleration, especially during and after quiet rest (p=.027). Women in the PMR group had significantly more uterine activity than women in the GI group (p=.011) and than CG women. Maternal heart rate, blood pressure, and stress hormones were not associated with fetal behavior. This study indicates that the fetus might participate in maternal relaxation and suggests that GI is superior to PMR. This could especially be true for women who tend to direct their attention to body sensations such as abdominal activity. 2010 Elsevier Ltd. All rights reserved.

  7. Schoolyard upgrade in a randomized controlled study design

    DEFF Research Database (Denmark)

    Christiansen, Lars Breum Skov; Toftager, Mette; Pawlowski, Charlotte Skau

    2017-01-01

    of student perceptions across the intervention schools, and that a one unit increase in the Schoolyard index (SYi) led to a 12% increase in recess PA. This study shows that adolescent PA during recess can be increased through a multicomponent intervention. The prospect for making an impact is low...... and according to the process analysis dependent on direct involvement; active and supportive adults; and varied, connected and well located facilities....

  8. Safety of Flibanserin in Women Treated With Antidepressants: A Randomized, Placebo-Controlled Study.

    Science.gov (United States)

    Clayton, Anita H; Croft, Harry A; Yuan, James; Brown, Louise; Kissling, Robert

    2018-01-01

    receiving placebo; remission of anxiety based on the Beck Anxiety Inventory was noted in 16.4% and 2.7% of patients, respectively. The results of this study support the safety of flibanserin in premenopausal women being treated with a serotonergic antidepressant. No increased risks were observed when adding flibanserin to a stable selective serotonin reuptake inhibitor or serotonin and norepinephrine reuptake inhibitor treatment regimen. This was a well-designed, randomized, placebo-controlled trial. The primary limitation was the early study discontinuation by the sponsor, which decreased the sample size and duration of treatment. In this small trial, flibanserin 100 mg qhs was generally safe and well tolerated in premenopausal women with mild or remitted depression taking a serotonergic antidepressant. Clayton AH, Croft HA, Yuan J, et al. Safety of Flibanserin in Women Treated With Antidepressants: A Randomized, Placebo-Controlled Study. J Sex Med 2018;15:43-51. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  9. Random noise attenuation of non-uniformly sampled 3D seismic data along two spatial coordinates using non-equispaced curvelet transform

    Science.gov (United States)

    Zhang, Hua; Yang, Hui; Li, Hongxing; Huang, Guangnan; Ding, Zheyi

    2018-04-01

    The attenuation of random noise is important for improving the signal to noise ratio (SNR). However, the precondition for most conventional denoising methods is that the noisy data must be sampled on a uniform grid, making the conventional methods unsuitable for non-uniformly sampled data. In this paper, a denoising method capable of regularizing the noisy data from a non-uniform grid to a specified uniform grid is proposed. Firstly, the denoising method is performed for every time slice extracted from the 3D noisy data along the source and receiver directions, then the 2D non-equispaced fast Fourier transform (NFFT) is introduced in the conventional fast discrete curvelet transform (FDCT). The non-equispaced fast discrete curvelet transform (NFDCT) can be achieved based on the regularized inversion of an operator that links the uniformly sampled curvelet coefficients to the non-uniformly sampled noisy data. The uniform curvelet coefficients can be calculated by using the inversion algorithm of the spectral projected-gradient for ℓ1-norm problems. Then local threshold factors are chosen for the uniform curvelet coefficients for each decomposition scale, and effective curvelet coefficients are obtained respectively for each scale. Finally, the conventional inverse FDCT is applied to the effective curvelet coefficients. This completes the proposed 3D denoising method using the non-equispaced curvelet transform in the source-receiver domain. The examples for synthetic data and real data reveal the effectiveness of the proposed approach in applications to noise attenuation for non-uniformly sampled data compared with the conventional FDCT method and wavelet transformation.
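
    The core idea, recovering coefficients on a uniform grid from non-uniformly sampled noisy data by regularized inversion and then thresholding them, can be illustrated in one dimension with plain Fourier coefficients standing in for curvelets and Tikhonov regularization standing in for the SPGL1 solver used in the paper. The sketch below is therefore a simplified analogue of the workflow, not the authors' NFDCT implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
n_grid = 128                                    # uniform spectral grid size
x_nonuni = np.sort(rng.uniform(0, 1, 200))      # non-uniform sample positions
clean = np.sin(2*np.pi*5*x_nonuni) + 0.5*np.sin(2*np.pi*12*x_nonuni)
noisy = clean + 0.3 * rng.standard_normal(x_nonuni.size)

# Forward operator: evaluates a uniform-grid Fourier expansion at the
# non-uniform positions (a dense matrix here; an NFFT would do this fast).
k = np.arange(-n_grid // 2, n_grid // 2)
A = np.exp(2j * np.pi * np.outer(x_nonuni, k)) / np.sqrt(n_grid)

# Regularized inversion onto the uniform grid (Tikhonov in place of SPGL1).
lam = 1.0
coeffs = np.linalg.solve(A.conj().T @ A + lam * np.eye(n_grid), A.conj().T @ noisy)

# Hard-threshold small coefficients, then synthesize the denoised model on a uniform grid.
thresh = 0.1 * np.max(np.abs(coeffs))
coeffs[np.abs(coeffs) < thresh] = 0.0
x_uni = np.linspace(0, 1, n_grid, endpoint=False)
denoised = (np.exp(2j * np.pi * np.outer(x_uni, k)) / np.sqrt(n_grid)) @ coeffs

clean_uni = np.sin(2*np.pi*5*x_uni) + 0.5*np.sin(2*np.pi*12*x_uni)
print("rms error of denoised reconstruction:",
      np.sqrt(np.mean((denoised.real - clean_uni)**2)))
```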

  10. Percutaneous CT-guided lung biopsy: sequential versus spiral scanning. A randomized prospective study

    International Nuclear Information System (INIS)

    Ghaye, B.; Dondelinger, R.F.; Dewe, W.

    1999-01-01

    The aim of this prospective, randomized study was to evaluate spiral versus sequential scanning in the guidance of percutaneous lung biopsy. Fifty thoracic lesions occurring in 48 patients were biopsied by a senior and a junior operator. Six different time segments of the procedure were measured. Scanning mode was evaluated against length of procedure, pathological results, irradiation and complications. Total duration of the procedure and of the first sampling was significantly longer with spiral CT for the senior operator (p < 0.004). No significant time difference was observed for the junior operator. Diameter of the lesion, depth of location, position of the patient and needle entry site did not influence the results. The sensitivity was 90.9%, specificity 100%, positive predictive value 100% and negative predictive value 60% for spiral CT, and 94.7%, 100%, 100% and 85.7% for sequential CT, respectively. Eleven pneumothoraces and ten perinodular hemorrhages were seen with spiral CT, and six and ten, respectively, with sequential CT. The mean dose of irradiation was 4027 mAs for spiral CT and 2358 mAs for conventional CT. Spiral CT reduces neither procedure time nor the rate of complications. Pathological results do not differ compared with sequential CT, and the total dose of irradiation is higher with spiral scanning. (orig.)

  11. Effect of Sugammadex on Postoperative Bleeding and Coagulation Parameters After Septoplasty: A Randomized Prospective Study

    Science.gov (United States)

    Taş, Nilay; Korkmaz, Hakan; Yağan, Özgür; Korkmaz, Mukadder

    2015-01-01

    Background: Sugammadex is a reversal agent with well-known advantages, but its effects on haemostasis and bleeding have been a topic of interest. Septoplasty is a common surgical procedure with postoperative respiratory complications and bleeding. The aim of this study is to investigate the effects of sugammadex on postoperative coagulation parameters and bleeding after the septoplasty procedure. Material/Methods: In this randomized controlled study, fifty patients were divided into two groups: neostigmine (Group N) vs. sugammadex (Group S). For the evaluation of PT, aPTT and INR, blood samples were taken at 120 minutes postoperatively, and the changes in these values with respect to preoperative values were documented. Postoperative bleeding was measured by evaluating the amount of blood absorbed on the nasal tip dressing during the first 3 hours postoperatively. Results: The postoperative bleeding amount was significantly higher in Group S compared to Group N (p=0.013). No significant difference was observed between the two groups in coagulation parameters (PT: p=0.953; aPTT: p=0.734; INR: p=0.612). Conclusions: Sugammadex was associated with a higher amount of postoperative bleeding than neostigmine in septoplasty patients. In surgical procedures with a high risk of bleeding, the safety of sugammadex needs to be verified. PMID:26271275

  12. Randomized pharmacokinetic study comparing subcutaneous and intravenous palonosetron in cancer patients treated with platinum based chemotherapy.

    Directory of Open Access Journals (Sweden)

    Belen Sadaba

    Full Text Available Palonosetron is a potent second-generation 5-hydroxytryptamine-3 selective antagonist which can be administered by either the intravenous (IV) or the oral route, but subcutaneous (SC) administration of palonosetron has never been studied, even though it could have useful clinical applications. In this study, we evaluate the bioavailability of SC palonosetron. Patients treated with platinum-based chemotherapy were randomized to receive SC or IV palonosetron, followed by the alternative route in a crossover manner, during the first two cycles of chemotherapy. Blood samples were collected at baseline and at 10, 15, 30, 45, 60 and 90 minutes and 2, 3, 4, 6, 8, 12 and 24 h after palonosetron administration. Urine was collected during the 12 hours following palonosetron. We compared pharmacokinetic parameters, including AUC0-24h, t1/2 and Cmax, observed with each route of administration by analysis of variance (ANOVA). From October 2009 to July 2010, 25 evaluable patients were included. AUC0-24h for IV and SC palonosetron was 14.1 and 12.7 ng × h/ml, respectively (p=0.160). Bioavailability of SC palonosetron was 118% (95% CI: 69-168). Cmax was lower with the SC than with the IV route and was reached 15 minutes following SC administration. Palonosetron bioavailability was similar when administered by either the SC or IV route. This new route of administration might be especially useful for outpatient management of emesis and for administration of oral chemotherapy. ClinicalTrials.gov NCT01046240.
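
    The pharmacokinetic endpoints compared here (AUC0-24h by the trapezoidal rule, Cmax, and the SC/IV bioavailability ratio) are straightforward to compute from a concentration-time profile. The sketch below uses invented concentrations at the study's sampling times purely to show the arithmetic, not the trial's data:

```python
import numpy as np

# Hypothetical concentration-time profiles (ng/mL) at the study's sampling times.
t = np.array([0, 0.17, 0.25, 0.5, 0.75, 1, 1.5, 2, 3, 4, 6, 8, 12, 24])  # hours
c_iv = np.array([0, 2.9, 2.6, 2.2, 1.9, 1.7, 1.4, 1.2, 1.0, 0.9, 0.7, 0.6, 0.4, 0.2])
c_sc = np.array([0, 1.6, 1.9, 1.8, 1.7, 1.6, 1.4, 1.2, 1.0, 0.9, 0.7, 0.6, 0.4, 0.2])

def auc_trapezoid(time, conc):
    """Linear trapezoidal AUC over the sampled interval."""
    return float(np.sum(np.diff(time) * (conc[1:] + conc[:-1]) / 2.0))

auc_iv, auc_sc = auc_trapezoid(t, c_iv), auc_trapezoid(t, c_sc)
print(f"AUC0-24h: IV = {auc_iv:.1f}, SC = {auc_sc:.1f} ng*h/mL")
print(f"relative bioavailability of the SC route: {auc_sc / auc_iv:.0%}")
print(f"SC Cmax = {c_sc.max():.1f} ng/mL at t = {t[c_sc.argmax()]:.2f} h")
```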

  13. Educational data mining: a sample of review and study case

    Directory of Open Access Journals (Sweden)

    Alejandro Pena, Rafael Domínguez, Jose de Jesus Medel

    2009-12-01

    Full Text Available The aim of this work is to encourage research in a novel merged field: educational data mining (EDM). Thereby, two subjects are outlined: the first one corresponds to a review of data mining (DM) methods and EDM applications. The second topic represents an EDM study case. As a result of the application of DM in Web-based Education Systems (WBES), stratified groups of students were found during a trial. Such groups reveal key attributes of volunteers that deserted or remained during a WBES experiment. This kind of discovered knowledge inspires the statement of correlational hypotheses to set relations between attributes and behavioral patterns of WBES users. We concluded that when EDM findings are taken into account for designing and managing WBES, the learning objectives are improved

  14. Paracetamol sharpens reflection and spatial memory: a double-blind randomized controlled study in healthy volunteers

    Directory of Open Access Journals (Sweden)

    Pickering G

    2016-12-01

    Full Text Available Gisèle Pickering, Nicolas Macian, Claude Dubray, Bruno Pereira (University Hospital, CHU Clermont-Ferrand, Centre de Pharmacologie Clinique; Inserm, CIC 1405, UMR Neurodol 1107; Clermont Université, Laboratoire de Pharmacologie, Faculté de médecine; CHU de Clermont-Ferrand, Délégation Recherche Clinique Innovation, Clermont-Ferrand, France) Background: The mechanism of acetaminophen (APAP, paracetamol) for analgesic and antipyretic outcomes has been largely addressed, but APAP action on cognitive function has not been studied in humans. Animal studies have suggested an improved cognitive performance, but the link with the analgesic and antipyretic modes of action is incomplete. This study aims at exploring cognitive tests in healthy volunteers in the context of antinociception and temperature regulation. A double-blind randomized controlled study (NCT01390467) was carried out from May 30, 2011 to July 12, 2011. Methods: Forty healthy volunteers were included and analyzed. Nociceptive thresholds, core (body) temperature, and a battery of cognitive tests were recorded before and after oral APAP (2 g) or placebo: the information sampling task for predecisional processing, Stockings of Cambridge for spatial memory, reaction time, delayed matching to sample, and pattern recognition memory tests. Analysis of variance for repeated measures adapted to the crossover design was performed and a two-tailed type I error was fixed at 5%. Results: APAP improved the information sampling task (reduced number of errors, reduced latency to open boxes, and increased number of opened boxes; all p<0.05). Spatial planning and working memory initial thinking time were decreased (p=0.04). All other tests were not modified by APAP. APAP had an antinociceptive effect (p<0.01) and body temperature did not change. Conclusion: This study shows for the first time that APAP sharpens decision making and planning strategy in healthy volunteers and that cognitive performance

  15. [Comparison study on sampling methods of Oncomelania hupensis snail survey in marshland schistosomiasis epidemic areas in China].

    Science.gov (United States)

    An, Zhao; Wen-Xin, Zhang; Zhong, Yao; Yu-Kuan, Ma; Qing, Liu; Hou-Lang, Duan; Yi-di, Shang

    2016-06-29

    To optimize and simplify the survey method for Oncomelania hupensis snails in marshland endemic regions of schistosomiasis, and to increase the precision, efficiency and economy of the snail survey. A 50 m × 50 m experimental field was selected in the Chayegang marshland near Henghu farm in the Poyang Lake region, and a whole-coverage method was adopted to survey the snails. The simple random sampling, systematic sampling and stratified random sampling methods were applied to calculate the minimum sample size, relative sampling error and absolute sampling error. The minimum sample sizes for the simple random sampling, systematic sampling and stratified random sampling methods were 300, 300 and 225, respectively. The relative sampling errors of the three methods were all less than 15%. The absolute sampling errors were 0.221 7, 0.302 4 and 0.047 8, respectively. Spatial stratified sampling with altitude as the stratum variable is an efficient approach of lower cost and higher precision for the snail survey.
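
    The message of this comparison, that stratifying by altitude buys precision at a smaller sample size, can be reproduced on simulated data. The field layout, snail densities and sample sizes in the sketch below are invented for illustration and are not the survey's data:

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated field split into a low-altitude and a high-altitude half, with snail
# counts per sampling frame depending on altitude (numbers are illustrative only).
low = rng.poisson(3.0, 1250)     # snails per frame, low-altitude stratum
high = rng.poisson(0.5, 1250)    # snails per frame, high-altitude stratum
field = np.concatenate([low, high])
true_mean = field.mean()

def srs_estimate(n):
    # simple random sample of the whole field
    return rng.choice(field, n, replace=False).mean()

def stratified_estimate(n):
    # proportional allocation: half the frames drawn from each altitude stratum
    return 0.5 * rng.choice(low, n // 2, replace=False).mean() \
         + 0.5 * rng.choice(high, n // 2, replace=False).mean()

reps = 2000
err_srs = np.std([srs_estimate(300) - true_mean for _ in range(reps)])
err_strat = np.std([stratified_estimate(224) - true_mean for _ in range(reps)])
print(f"sampling error, simple random sample (n=300):        {err_srs:.3f}")
print(f"sampling error, stratified by altitude (n=224 total): {err_strat:.3f}")
```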

  16. Study protocol: a randomized controlled trial investigating the effects of a psychosexual training program for adolescents with autism spectrum disorder.

    Science.gov (United States)

    Visser, Kirsten; Greaves-Lord, Kirstin; Tick, Nouchka T; Verhulst, Frank C; Maras, Athanasios; van der Vegt, Esther J M

    2015-08-28

    Previous research shows that adolescents with autism spectrum disorder (ASD) run several risks in their psychosexual development and that these adolescents can have limited access to reliable information on puberty and sexuality, emphasizing the need for specific guidance of adolescents with ASD in their psychosexual development. Few studies have investigated the effects of psychosexual training programs for adolescents with ASD and to date no randomized controlled trials are available to study the effects of psychosexual interventions for this target group. The randomized controlled trial (RCT) described in this study protocol aims to investigate the effects of the Tackling Teenage Training (TTT) program on the psychosexual development of adolescents with ASD. This parallel clinical trial, conducted in the South-West of the Netherlands, has a simple equal randomization design with an intervention and a waiting-list control condition. Two hundred adolescents and their parents participate in this study. We assess the participants in both conditions using self-report as well as parent-report questionnaires at three time points during 1 year: at baseline (T1), post-treatment (T2), and for follow-up (T3). To our knowledge, the current study is the first that uses a randomized controlled design to study the effects of a psychosexual training program for adolescents with ASD. It has a number of methodological strengths, namely a large sample size, a wide range of functionally relevant outcome measures, the use of multiple informants, and a standardized research and intervention protocol. Also some limitations of the described study are identified, for instance not making a comparison between two treatment conditions, and no use of blinded observational measures to investigate the ecological validity of the research results. Dutch Trial Register NTR2860. Registered on 20 April 2011.

  17. Reducing sample size by combining superiority and non-inferiority for two primary endpoints in the Social Fitness study.

    Science.gov (United States)

    Donkers, Hanneke; Graff, Maud; Vernooij-Dassen, Myrra; Nijhuis-van der Sanden, Maria; Teerenstra, Steven

    2017-01-01

    In randomized controlled trials, two endpoints may be necessary to capture the multidimensional concept of the intervention and the objectives of the study adequately. We show how to calculate sample size when defining success of a trial by combinations of superiority and/or non-inferiority aims for the endpoints. The randomized controlled trial design of the Social Fitness study uses two primary endpoints, which can be combined into five different scenarios for defining success of the trial. We show how to calculate power and sample size for each scenario and compare these for different settings of power of each endpoint and correlation between them. Compared to a single primary endpoint, using two primary endpoints often gives more power when success is defined as: improvement in one of the two endpoints and no deterioration in the other. This also gives better power than when success is defined as: improvement in one prespecified endpoint and no deterioration in the remaining endpoint. When two primary endpoints are equally important, but a positive effect in both simultaneously is not per se required, the objective of having one superior and the other (at least) non-inferior could make sense and reduce sample size. Copyright © 2016 Elsevier Inc. All rights reserved.
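
    One way to see how combining a superiority aim on one endpoint with a non-inferiority aim on the other affects power is to simulate two correlated standardized endpoints and count how often both criteria are met. The effect sizes, margin, correlation and group size below are illustrative assumptions, not the Social Fitness study's values:

```python
import numpy as np
from scipy.stats import norm

def power_one_superior_other_noninferior(delta1, delta2, margin2, rho,
                                         n_per_arm, alpha=0.025, n_sim=20000, seed=0):
    """Simulated power for 'endpoint 1 superior AND endpoint 2 non-inferior',
    with standardized effects delta1/delta2 and correlation rho between endpoints."""
    rng = np.random.default_rng(seed)
    z_crit = norm.ppf(1 - alpha)
    se = np.sqrt(2.0 / n_per_arm)                 # SE of a standardized mean difference
    cov = np.array([[1.0, rho], [rho, 1.0]]) * se**2
    est = rng.multivariate_normal([delta1, delta2], cov, size=n_sim)
    z1 = est[:, 0] / se                           # superiority test for endpoint 1
    z2 = (est[:, 1] + margin2) / se               # non-inferiority test for endpoint 2
    return np.mean((z1 > z_crit) & (z2 > z_crit))

# Illustrative numbers: moderate effect on endpoint 1, no true effect on endpoint 2,
# non-inferiority margin of 0.3 SD, correlation 0.4, 90 patients per arm.
print(power_one_superior_other_noninferior(0.5, 0.0, 0.3, 0.4, 90))
```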

  18. How much will afforestation of former cropland influence soil C stocks? A synthesis of paired sampling, chronosequence sampling and repeated sampling studies

    Science.gov (United States)

    Vesterdal, Lars; Hansen, K.; Stupak, I.; Don, Axel; Poeplau, C.; Leifeld, Jens; van Wesemael, Bas

    2010-05-01

    The need for documentation of land-use change effects on soil C is high on the agenda in most signatory countries to the Kyoto Protocol. Large land areas in Europe have experienced land-use change from cropland to forest since 1990 by direct afforestation as well as abandonment and regrowth of marginally productive cropland. Soil C dynamics following land-use change remain highly uncertain due to a limited number of available studies and due to influence of interacting factors such as land use history, soil type, and climate. Common approaches for estimation of potential soil C changes following land-use change are i) paired sampling of plots with a long legacy of different land uses, ii) chronosequence studies of land-use change, and lastly iii) repeated sampling of plots subject to changed land use. This paper will synthesize the quantitative effects of cropland afforestation on soil C sequestration based on all three approaches and will report on related work within Cost 639. Paired plots of forest and cropland were used to study the general differences between soil C stocks in the two land uses. At 27 sites in Denmark distributed among different regions and soil types forest floor and mineral soil were sampled in and around soil pits. Soil C stocks were higher in forest than cropland (mean difference 22 Mg C ha-1 to 1 m depth). This difference was caused solely by the presence of a forest floor in forests; mineral soil C stocks were similar (108 vs. 109 Mg C ha-1) in the two land uses regardless of soil type and the soil layers considered. The chronosequence approach was employed in the AFFOREST project for evaluation of C sequestration in biomass and soils following afforestation of cropland. Two oak (Quercus robur) and four Norway spruce (Picea abies) afforestation chronosequences (age range 1 to 90 years) were studied in Denmark, Sweden and the Netherlands. Forest floor and mineral soil (0-25 cm) C contents were as a minimum unchanged and in most cases there

  19. Predictors of Suicide Ideation in a Random Digit Dial Study: Exposure to Suicide Matters.

    Science.gov (United States)

    van de Venne, Judy; Cerel, Julie; Moore, Melinda; Maple, Myfanwy

    2017-07-03

    Suicide is an important public health concern requiring ongoing research to understand risk factors for suicide ideation. A dual-frame, random digit dial survey was utilized to identify demographic and suicide-related factors associated with suicide ideation in a statewide sample of 1,736 adults. The PHQ-9 depression scale suicide ideation question was used to assess current suicide ideation in both the full sample and the suicide-exposed sub-sample. Being non-married and having previous suicide exposure were separately associated with higher risks of suicide ideation in the full sample. Being male, having increased suicide exposures, and having increased perceptions of closeness to the decedent increased risks, while older age decreased risks for the suicide exposed. Implications for future screening and research are discussed.

  20. Brief intervention to reduce risky drinking in pregnancy: study protocol for a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Wilson Graeme B

    2012-09-01

    Full Text Available Abstract Background: Risky drinking in pregnancy by UK women is likely to result in many alcohol-exposed pregnancies. Studies from the USA suggest that brief intervention has promise for alcohol risk reduction in antenatal care. However, further research is needed to establish whether this evidence from the USA is applicable to the UK. This pilot study aims to investigate whether pregnant women can be recruited and retained in a randomized controlled trial of brief intervention aimed at reducing risky drinking in women receiving antenatal care. Methods: The trial will rehearse the parallel-group, non-blinded design and procedures of a subsequent definitive trial. Over 8 months, women aged 18 years and over (target number 2,742) attending their booking appointment with a community midwife (n = 31) in north-east England will be screened for alcohol consumption using the consumption questions of the Alcohol Use Disorders Identification Test (AUDIT-C). Those screening positive, without a history of substance use or alcohol dependence, with no pregnancy complication, and able to give informed consent, will be invited to participate in the trial (target number 120). Midwives will be randomized in a 1:1 ratio to deliver either treatment as usual (control) or structured brief advice and referral for a 20-minute motivational interviewing session with an alcohol health worker (intervention). As well as demographic and health information, baseline measures will include two 7-day timeline follow-back questionnaires and the EuroQoL EQ-5D-3L questionnaire. Measures will be repeated in telephone follow-ups in the third trimester and at 6 months post-partum, when a questionnaire on use of National Health Service and social care resources will also be completed. Information on pregnancy outcomes and stillbirths will be accessed from central health service records before the follow-ups. Primary outcomes will be rates of eligibility, recruitment, intervention

  1. A randomized waitlist control community study of Social Cognition and Interaction Training for people with schizophrenia.

    Science.gov (United States)

    Gordon, Anne; Davis, Penelope J; Patterson, Susan; Pepping, Christopher A; Scott, James G; Salter, Kerri; Connell, Melissa

    2018-03-01

    Social Cognition and Interaction Training (SCIT) has demonstrated effectiveness in improving social cognition and functioning of people with schizophrenia. This pilot study examines the acceptability, feasibility, and effectiveness of SCIT with individuals who have schizophrenia-spectrum disorders and are receiving care through a public mental health service. In a pragmatic randomized waitlist controlled trial, 36 participants (aged 19-55 years) with a schizophrenia spectrum disorder were randomly allocated to SCIT or treatment as usual (TAU). Measures of theory of mind, emotion perception, attributional bias, social skills, quality of life, life skills, depression, anxiety, and stress were administered pre- and post-intervention with follow-up conducted 4 months later. All wait-list controls subsequently received the intervention and a secondary within-group analysis was conducted including these participants. While no significant differences were found between groups on any outcomes, there was strong engagement with the SCIT intervention. Of the 21 participants in the intervention group, the completion rate was 85.71% with a median attendance rate of 17 sessions. Within subject analyses of SCIT participants over time showed significant improvements in quality of life, emotion recognition, social skills, and a trend towards better life skills from pre- to post-intervention. These gains were sustained at the 4-month follow-up time. Although this study showed limited benefits in outcomes associated with SCIT compared with TAU, it demonstrated the acceptability of SCIT to participants in a real world public health setting shown by high retention, attendance, and positive feedback. This pilot shows SCIT can be implemented in routine clinical practice and lays the foundation for a larger pragmatic study. SCIT can be implemented successfully in a real-world community mental health setting. SCIT had high levels of acceptability to these participants. Limitations The

  2. Lifetime Prevalence of Suicide Attempts Among Sexual Minority Adults by Study Sampling Strategies: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Hottes, Travis Salway; Bogaert, Laura; Rhodes, Anne E; Brennan, David J; Gesink, Dionne

    2016-05-01

    Previous reviews have demonstrated a higher risk of suicide attempts for lesbian, gay, and bisexual (LGB) persons (sexual minorities), compared with heterosexual groups, but these were restricted to general population studies, thereby excluding individuals sampled through LGB community venues. Each sampling strategy, however, has particular methodological strengths and limitations. For instance, general population probability studies have defined sampling frames but are prone to information bias associated with underreporting of LGB identities. By contrast, LGB community surveys may support disclosure of sexuality but overrepresent individuals with strong LGB community attachment. To reassess the burden of suicide-related behavior among LGB adults, directly comparing estimates derived from population- versus LGB community-based samples. In 2014, we searched MEDLINE, EMBASE, PsycInfo, CINAHL, and Scopus databases for articles addressing suicide-related behavior (ideation, attempts) among sexual minorities. We selected quantitative studies of sexual minority adults conducted in nonclinical settings in the United States, Canada, Europe, Australia, and New Zealand. Random effects meta-analysis and meta-regression assessed for a difference in prevalence of suicide-related behavior by sample type, adjusted for study or sample-level variables, including context (year, country), methods (medium, response rate), and subgroup characteristics (age, gender, sexual minority construct). We examined residual heterogeneity by using τ². We pooled 30 cross-sectional studies, including 21,201 sexual minority adults, generating the following lifetime prevalence estimates of suicide attempts: 4% (95% confidence interval [CI] = 3%, 5%) for heterosexual respondents to population surveys, 11% (95% CI = 8%, 15%) for LGB respondents to population surveys, and 20% (95% CI = 18%, 22%) for LGB respondents to community surveys. The difference in LGB estimates by sample
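
    A random-effects pooling of prevalence estimates of the kind reported here can be sketched with the DerSimonian-Laird estimator on the logit scale. The five study proportions and sample sizes below are invented placeholders, not the review's data:

```python
import numpy as np

def dersimonian_laird(p, n):
    """Random-effects pooling of prevalence estimates (logit scale), returning
    the pooled proportion, its 95% CI and the between-study variance tau^2."""
    p, n = np.asarray(p, float), np.asarray(n, float)
    y = np.log(p / (1 - p))                      # logit-transformed prevalences
    v = 1.0 / (n * p * (1 - p))                  # approximate within-study variances
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)           # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)      # DerSimonian-Laird tau^2
    w_re = 1.0 / (v + tau2)
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    inv = lambda x: 1.0 / (1.0 + np.exp(-x))     # back-transform to a proportion
    return inv(y_re), (inv(y_re - 1.96 * se), inv(y_re + 1.96 * se)), tau2

# Made-up lifetime prevalences of suicide attempts from five hypothetical community surveys.
pooled, ci, tau2 = dersimonian_laird([0.18, 0.22, 0.17, 0.25, 0.20], [400, 650, 300, 500, 800])
print(f"pooled prevalence {pooled:.1%}, 95% CI {ci[0]:.1%}-{ci[1]:.1%}, tau^2 = {tau2:.3f}")
```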

  3. Serum iron levels and the risk of Parkinson disease: a Mendelian randomization study.

    Directory of Open Access Journals (Sweden)

    Irene Pichler

    Full Text Available Although levels of iron are known to be increased in the brains of patients with Parkinson disease (PD), epidemiological evidence on a possible effect of iron blood levels on PD risk is inconclusive, with effects reported in opposite directions. Epidemiological studies suffer from problems of confounding and reverse causation, and Mendelian randomization (MR) represents an alternative approach to provide unconfounded estimates of the effects of biomarkers on disease. We performed a MR study where genes known to modify iron levels were used as instruments to estimate the effect of iron on PD risk, based on estimates of the genetic effects on both iron and PD obtained from the largest samples meta-analyzed to date. We used as instrumental variables three genetic variants influencing iron levels: HFE rs1800562, HFE rs1799945, and TMPRSS6 rs855791. Estimates of their effect on serum iron were based on a recent genome-wide meta-analysis of 21,567 individuals, while estimates of their effect on PD risk were obtained through meta-analysis of genome-wide and candidate gene studies with 20,809 PD cases and 88,892 controls. Separate MR estimates of the effect of iron on PD were obtained for each variant and pooled by meta-analysis. We investigated heterogeneity across the three estimates as an indication of possible pleiotropy and found no evidence of it. The combined MR estimate showed a statistically significant protective effect of iron, with a relative risk reduction for PD of 3% (95% CI 1%-6%; p = 0.001) per 10 µg/dl increase in serum iron. Our study suggests that increased iron levels are causally associated with a decreased risk of developing PD. Further studies are needed to understand the pathophysiological mechanism of action of serum iron on PD risk before recommendations can be made.
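
    The per-variant MR estimates described above are Wald ratios (each variant's effect on PD divided by its effect on serum iron), which are then pooled by inverse-variance weighting. The numbers in the sketch below are placeholders chosen only to show the calculation, not the published effect estimates:

```python
import numpy as np

# Hypothetical per-variant effects for three instruments (placeholder values).
beta_iron = np.array([0.35, 0.20, -0.18])       # SD change in serum iron per allele
se_iron = np.array([0.02, 0.03, 0.02])
beta_pd = np.array([-0.040, -0.025, 0.020])     # log odds of PD per allele
se_pd = np.array([0.015, 0.018, 0.012])

# Wald ratio per instrument and its first-order standard error.
ratio = beta_pd / beta_iron
se_ratio = se_pd / np.abs(beta_iron)

# Fixed-effect (inverse-variance weighted) pooling across the variants.
w = 1.0 / se_ratio**2
mr_est = np.sum(w * ratio) / np.sum(w)
mr_se = np.sqrt(1.0 / np.sum(w))
print(f"pooled MR estimate (log OR of PD per SD serum iron): {mr_est:.3f} +/- {mr_se:.3f}")
print(f"odds ratio: {np.exp(mr_est):.3f}")
```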

  4. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Full Text Available Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider for estimating the sample size include the size of the study population, the confidence level, the expected proportion of the outcome variable (for categorical variables) or the standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) of the study. The greater the precision required, the larger is the required sample size. Sampling Techniques: The probability sampling techniques applied in health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are recommended over the nonprobability sampling techniques, because the results of the study can then be generalized to the target population.
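
    For the precision-based sample-size logic described here, the usual normal-approximation formulas are easy to code. The prevalence, standard deviation and margins below are arbitrary example values, not recommendations:

```python
from math import ceil
from scipy.stats import norm

def n_for_proportion(p, margin, confidence=0.95):
    """Sample size to estimate a proportion p within +/- margin."""
    z = norm.ppf(1 - (1 - confidence) / 2)
    return ceil(z**2 * p * (1 - p) / margin**2)

def n_for_mean(sd, margin, confidence=0.95):
    """Sample size to estimate a mean within +/- margin, given the outcome SD."""
    z = norm.ppf(1 - (1 - confidence) / 2)
    return ceil((z * sd / margin)**2)

print(n_for_proportion(0.30, 0.05))  # expected prevalence 30 %, precision +/-5 %  -> 323
print(n_for_mean(12.0, 2.0))         # outcome SD 12, precision +/-2 units         -> 139
```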

  5. Reducing procrastination using a smartphone-based treatment program: A randomized controlled pilot study

    Directory of Open Access Journals (Sweden)

    Christian Aljoscha Lukas

    2018-06-01

    Full Text Available Background: Procrastination affects a large number of individuals and is associated with significant mental health problems. Despite the deleterious consequences individuals afflicted with procrastination have to bear, there is a surprising paucity of well-researched treatments for procrastination. To fill this gap, this study evaluated the efficacy of an easy-to-use smartphone-based treatment for procrastination. Method: N = 31 individuals with heightened procrastination scores were randomly assigned to a blended smartphone-based intervention including two brief group counseling sessions and 14 days of training with the mindtastic procrastination app (MT-PRO), or to a waitlist condition. MT-PRO fosters the approach of functional and the avoidance of dysfunctional behavior by systematically utilizing techniques derived from cognitive bias modification approaches, gamification principles, and operant conditioning. The primary outcome was the course of procrastination symptom severity as assessed with the General Procrastination Questionnaire. Results: Participating in the smartphone-based treatment was associated with a significantly greater reduction of procrastination than participating in the control condition (η² = .15). Conclusion: A smartphone-based intervention may be an effective treatment for procrastination. Future research should use larger samples and directly compare the efficacy of smartphone-based interventions and traditional interventions for procrastination. Keywords: Procrastination, Intervention, Treatment, Smartphone, Mobile health

  6. Health service accreditation as a predictor of clinical and organisational performance: a blinded, random, stratified study.

    Science.gov (United States)

    Braithwaite, Jeffrey; Greenfield, David; Westbrook, Johanna; Pawsey, Marjorie; Westbrook, Mary; Gibberd, Robert; Naylor, Justine; Nathan, Sally; Robinson, Maureen; Runciman, Bill; Jackson, Margaret; Travaglia, Joanne; Johnston, Brian; Yen, Desmond; McDonald, Heather; Low, Lena; Redman, Sally; Johnson, Betty; Corbett, Angus; Hennessy, Darlene; Clark, John; Lancaster, Judie

    2010-02-01

    Despite the widespread use of accreditation in many countries, and prevailing beliefs that accreditation is associated with variables contributing to clinical care and organisational outcomes, little systematic research has been conducted to examine its validity as a predictor of healthcare performance. To determine whether accreditation performance is associated with self-reported clinical performance and independent ratings of four aspects of organisational performance. Independent blinded assessment of these variables in a random, stratified sample of health service organisations. Acute care: large, medium and small health-service organisations in Australia. Study participants: Nineteen health service organisations employing 16 448 staff treating 321 289 inpatients and 1 971 087 non-inpatient services annually, representing approximately 5% of the Australian acute care health system. Correlations of accreditation performance with organisational culture, organisational climate, consumer involvement, leadership and clinical performance. Results: Accreditation performance was significantly positively correlated with organisational culture (rho=0.618, p=0.005) and leadership (rho=0.616, p=0.005). There was a trend between accreditation and clinical performance (rho=0.450, p=0.080). Accreditation was unrelated to organisational climate (rho=0.378, p=0.110) and consumer involvement (rho=0.215, p=0.377). Accreditation results predict leadership behaviours and cultural characteristics of healthcare organisations but not organisational climate or consumer participation, and a positive trend between accreditation and clinical performance is noted.

  7. Tablet computers and forensic and correctional psychological assessment: A randomized controlled study.

    Science.gov (United States)

    King, Christopher M; Heilbrun, Kirk; Kim, Na Young; McWilliams, Kellie; Phillips, Sarah; Barbera, Jessie; Fretz, Ralph

    2017-10-01

    Mobile computing technology presents various possibilities and challenges for psychological assessment. Within forensic and correctional psychology, assessment of justice-involved persons facilitated by such technology has not been empirically examined. Accordingly, this randomized controlled experiment involved administering questionnaires about risk-needs, treatment readiness, and computerized technology opinions to a large (N = 212) and diverse sample of individuals under custodial correctional supervision using either a tablet computer or traditional paper-and-pencil materials. Results revealed that participants in the paper-and-pencil condition completed the packet of questionnaires faster but omitted items more frequently. Older participants and those with lower levels of education tended to take longer to complete the tablet-administered measures. The tablet format was rated as more usable irrespective of demographic and personal characteristics, and most participants across the 2 conditions indicated that they would prefer to use computerized technology to complete psychological testing. Administration format did not have a clear effect on attitudes toward correctional rehabilitation services. Noteworthy for researchers is the substantial time saved and absence of practical problems with the tablet condition. Implications for practitioners include the general usability of the devices, their appeal to incarcerated persons, and the potential for tablets to facilitate clinical and administrative tasks with corrections clients. Considering the novel nature of this study, its promising results, and its limitations, future research in this area is warranted. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  8. Neurofeedback reduces overeating episodes in female restrained eaters: a randomized controlled pilot-study.

    Science.gov (United States)

    Schmidt, Jennifer; Martin, Alexandra

    2015-12-01

    Overeating episodes, despite intentions to control weight, are a common problem among women. Recurring episodes of overeating and dietary failure have been reported to result in higher body mass indexes and to induce severe distress even in non-clinical groups. Based on findings from physiological research on eating behavior and craving, as well as previous biofeedback studies, we derived a cue-exposure-based EEG neurofeedback protocol to target overeating episodes. The treatment was evaluated in a randomized controlled trial, comparing a neurofeedback group (NFG; n = 14) with a waiting-list control group (WLG; n = 13) in a sub-clinical sample of female restrained eaters. At post-treatment, the number of weekly overeating episodes and subsequent distress were significantly reduced in the NFG compared to the WLG (p < .05, effect sizes > .50). In a 3-month follow-up, effects in the NFG remained stable. As secondary outcomes, perceived dieting success was enhanced after the treatment. At follow-up, additional beneficial effects on trait food craving were observed. Altogether, we found preliminary evidence for the efficacy of cue exposure neurofeedback against overeating episodes in female restrained eaters, although specific effects and underlying mechanisms still have to be explored in future research.

  9. A Study on the Representative Sampling Survey for Radionuclide Analysis of RI Waste

    Energy Technology Data Exchange (ETDEWEB)

    Jee, K. Y. [KAERI, Daejeon (Korea, Republic of); Kim, Juyoul; Jung, Gunhyo [FNC Tech. Co., Daejeon (Korea, Republic of)

    2007-07-15

    We developed a quantitative method for obtaining a representative sample during a sampling survey of RI waste. Considering the source, process, and type of RI waste, the method computes the number of samples, the confidence interval, the variance, and the coefficient of variation. We also systematized the method of the sampling survey logically and quantitatively. The results of this study can be applied to sampling surveys of low- and intermediate-level waste generated from nuclear power plants during the transfer process to a disposal facility.

  10. [Utilization of self-sampling kits for HPV testing in cervical cancer screening - pilot study].

    Science.gov (United States)

    Ondryášová, H; Koudeláková, V; Drábek, J; Vaněk, P; Slavkovský, R; Hajdúch, M

    2015-12-01

    To get initial experience with alternative sampling (self-sampling) for HPV testing as the means of cervical cancer screening program. Original work. Institute of Molecular and Translational Medicine, Faculty of Medicine and Dentistry, Palacky University in Olomouc. Based on expression of interest, 215 self-sampling kits were posted to women. Evalyn(®) Brush Vaginal swabs obtained by self-sampling were analyzed for the presence of HPV infection by Cobas 4800 HPV (Roche) followed by genotyping using PapilloCheck(®) HPV-Screening (Greiner Bio-One). Sixty women randomly chosen from our sample were sent a questionnaire focused on their experience with self-sampling. One hundred seventy-four of 215 (81%) distributed self-sampling devices have been delivered to analysis. All cervicovaginal swabs were sampled correctly and it was possible to analyze them by Cobas 4800 HPV test. Similarly, 98% (171/174) samples were analyzable by PapilloCheck(®) HPV-Screening.One hundred twenty-five (72%) of 174 tested samples were HPV negative. Low risk HPV infection was detected only in 7 samples (4%), and high risk HPV (hrHPV) infection was present in 42 samples (24%). The most frequently detected hrHPV genotypes were HPV16 (11/42; 26%) and HPV53 (6/42; 14%). HrHPV co-infection was detected in 10 cases, in 5 of them lrHPV infection was find also.Of the 60 questionnaires, 48 (80%) were returned. From this group, 47 (98%) women rated their experience with self-sampling device as good to excellent. User manual of self-sampling device was considered good to excellent by all women (100%). All women also rated the convenience of self-sampling device using as good to excellent. As expected, most of the women (n = 42 [88%]) preferred self-sampling to physician sampling. Cervicovaginal self-sampling leads to valid results of HPV screening using two molecular genetics methods and was accepted by Czech women very well. The self-sampling as an opportunity to participate in cervical cancer

  11. Methodology Series Module 5: Sampling Strategies

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the ‘Sampling Method’. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin etc.); and 2) non-probability sampling – based on the researcher's choice or on the population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. A random sampling method (such as a simple random sample or a stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention this method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term ‘random sample’ when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the ‘generalizability’ of these results. In such a scenario, the researcher may want to use ‘purposive sampling’ for the study. PMID:27688438

  12. Improving Sleep for Hospitalized Antepartum Patients: A Non-Randomized Controlled Pilot Study.

    Science.gov (United States)

    Lee, Kathryn A; Gay, Caryl L

    2017-12-15

    To evaluate the feasibility and efficacy of a hospital-based protocol for improving sleep in high-risk antepartum patients. Sleep measures were compared during 1 week of hospitalization before and after implementing a Sleep Improvement Protocol for Antepartum Patients (SIP-AP). A non-randomized convenience sample of usual care controls was compared to a subsequent intervention sample after the protocol was implemented. Women were eligible if they spoke English, were medically stable, pregnant for at least 20 weeks, and hospitalized at least 24 hours; 25 pregnant women had sufficient data for analyses (11 controls, 14 intervention). Sleep was assessed in 3 ways: the Pittsburgh Sleep Quality Index was completed after obtaining consent, to estimate sleep quality prior to hospital admission; a sleep diary was completed each hospital day; and the General Sleep Disturbance Scale was completed at 7 days or prior to hospital discharge. Symptoms that could affect sleep were assessed with the Memorial Symptom Assessment Scale. Both groups recorded similar sleep duration (7 hours), but the intervention group had fewer symptoms and significantly (P = .015) lower sleep disturbance scores (53.1 ± 14.5) than controls (71.9 ± 18.8). Participant feedback about the intervention was positive, although adherence to components of the intervention protocol was variable. This pilot study provides evidence of the feasibility and preliminary efficacy of the SIP-AP intervention for reducing symptoms and improving sleep of antepartum patients during hospitalization. Further detailed evaluation of specific components of this protocol is warranted, and other types of hospitalized patients may benefit from unit-based modifications to this SIP-AP protocol. © 2017 American Academy of Sleep Medicine

  13. High-dose estradiol improves cognition for women with AD: results of a randomized study.

    Science.gov (United States)

    Asthana, S; Baker, L D; Craft, S; Stanczyk, F Z; Veith, R C; Raskind, M A; Plymate, S R

    2001-08-28

    To characterize the cognitive and neuroendocrine response to treatment with a high dose of estrogen for postmenopausal women with AD. Twenty postmenopausal women with AD were randomized to receive either 0.10 mg/day of 17 beta-estradiol by skin patch or a placebo patch for 8 weeks. Subjects were evaluated at baseline, at weeks 3, 5, and 8 during treatment, and again 8 weeks after treatment termination. During each visit, cognition was assessed with a battery of neuropsychological tests, and blood samples were collected to measure plasma estradiol as well as several other neuroendocrine markers of interest. Significant effects of estrogen treatment were observed on attention (Stroop Color Word Interference Test), verbal memory (Buschke Selective Reminding Test), and visual memory (Figure Copy/Memory). In addition, women treated with estrogen demonstrated improved performance on a test of semantic memory (Boston Naming Test) compared with subjects who received a placebo. Estrogen appeared to have a suppressive effect on the insulin-like growth factor (IGF) system, such that the plasma concentration of IGF binding protein-3 was significantly reduced and plasma levels of estradiol and IGF-I were negatively correlated during estrogen treatment. Administration of a higher dose of estrogen may enhance attention and memory for postmenopausal women with AD. Although these findings provide further clinical evidence to support a cognitive benefit of estrogen for women with AD, studies evaluating the effect of estradiol administration in particular, with larger sample sizes and longer treatment durations, are warranted before the therapeutic potential of estrogen replacement for women with AD can be firmly established.

  14. Using a partially randomized patient preference study design to evaluate the therapeutic effect of acupuncture and cupping therapy for fibromyalgia: study protocol for a partially randomized controlled trial.

    Science.gov (United States)

    Cao, Hui-Juan; Liu, Jian-Ping; Hu, Hui; Wang, Nissi S

    2014-07-10

    Conducting randomized controlled trials of traditional Chinese non-drug therapies has been limited by factors such as patient preference for a specific treatment modality. The aim of this study is to investigate the feasibility of applying a partially randomized patient preference (PRPP) trial model to evaluate the efficacy of two types of traditional Chinese medicine therapies, acupuncture and cupping, for fibromyalgia while accounting for patients' preference for either therapeutic modality. This protocol was approved by the Institutional Ethics Committee of the affiliated Dongfang Hospital, Beijing University of Chinese Medicine (approval number: 2013052104-2). One hundred participants with fibromyalgia will be included in this study. Diagnosis of fibromyalgia will be based on the American College of Rheumatology criteria. Before treatment, participants will be interviewed about their preference for acupuncture or cupping therapy. Fifty participants with no preference will be randomly assigned to one of the two groups, and another 50 participants with a strong preference for either acupuncture or cupping will receive the therapy they choose. For both acupuncture and cupping therapy, the main acupoints used will be tender points (Ashi). Treatment will be given three times a week for 5 consecutive weeks, with a follow-up period of 12 weeks. Outcome measures will be qualitative (patient expectation and satisfaction) and quantitative (pain intensity, quality of life, depression assessment). Trial registration: NCT01869712 (ClinicalTrials.gov, 22 May 2013).
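
    To make the PRPP allocation concrete, here is a minimal Python sketch of the scheme described above: participants with a strong preference receive their chosen therapy without randomization, while those with no preference are split 1:1 between the two arms. The participant data, the seed, and the alternating 1:1 assignment are illustrative assumptions, not the trial's actual randomization procedure.

```python
import random

# Minimal sketch of partially randomized patient preference (PRPP) allocation.
# Hypothetical participants each report "acupuncture", "cupping", or None.
def allocate_prpp(participants, seed=2013):
    rng = random.Random(seed)
    arms = {"acupuncture": [], "cupping": []}

    # Participants with a strong preference get their chosen therapy directly.
    for p in participants:
        if p["preference"] in arms:
            arms[p["preference"]].append(p)

    # Participants with no preference are randomized 1:1 between the arms.
    undecided = [p for p in participants if p["preference"] is None]
    rng.shuffle(undecided)
    for i, p in enumerate(undecided):
        arms["acupuncture" if i % 2 == 0 else "cupping"].append(p)

    return arms

# Illustrative cohort of 100 participants: 50 with no preference (as in the
# protocol); the 25/25 split of preference participants is assumed here.
participants = (
    [{"id": i, "preference": None} for i in range(50)]
    + [{"id": 50 + i, "preference": "acupuncture"} for i in range(25)]
    + [{"id": 75 + i, "preference": "cupping"} for i in range(25)]
)
groups = allocate_prpp(participants)
print({arm: len(members) for arm, members in groups.items()})
```

    The feature captured here is that only the no-preference subgroup contributes randomized comparisons, which is what distinguishes a PRPP design from a fully randomized trial.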

  15. A randomized study of multimedia informational aids for research on medical practices: Implications for informed consent.

    Science.gov (United States)

    Kraft, Stephanie A; Constantine, Melissa; Magnus, David; Porter, Kathryn M; Lee, Sandra Soo-Jin; Green, Michael; Kass, Nancy E; Wilfond, Benjamin S; Cho, Mildred K

    2017-02-01

    Participant understanding is a key element of informed consent for enrollment in research. However, participants often do not understand the nature, risks, benefits, or design of the studies in which they take part. Research on medical practices, which studies standard interventions rather than new treatments, has the potential to be es