WorldWideScience

Sample records for random sampling methodology

  1. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria

  2. Rationale, design, methodology and sample characteristics for the Vietnam pre-conceptual micronutrient supplementation trial (PRECONCEPT): a randomized controlled study

    Directory of Open Access Journals (Sweden)

    Nguyen Phuong H

    2012-10-01

    Full Text Available Abstract Background Low birth weight and maternal anemia remain intractable problems in many developing countries. The adequacy of the current strategy of providing iron-folic acid (IFA) supplements only during pregnancy has been questioned, given that many women enter pregnancy with poor iron stores, the substantial micronutrient demand by maternal and fetal tissues, and programmatic issues related to timing and coverage of prenatal care. Weekly IFA supplementation for women of reproductive age (WRA) improves iron status and reduces the burden of anemia in the short term, but few studies have evaluated subsequent pregnancy and birth outcomes. The PRECONCEPT trial aims to determine whether pre-pregnancy weekly IFA or multiple micronutrient (MM) supplementation will improve birth outcomes and maternal and infant iron status compared to the current practice of prenatal IFA supplementation only. This paper provides an overview of the study design, methodology and sample characteristics from baseline survey data, and key lessons learned. Methods/design We have recruited 5011 WRA in a double-blind stratified randomized controlled trial in rural Vietnam and randomly assigned them to receive weekly supplements containing either: (1) 2800 μg folic acid; (2) 60 mg iron and 2800 μg folic acid; or (3) MM. Women who become pregnant receive daily IFA, and are being followed through pregnancy, delivery, and up to three months post-partum. Study outcomes include birth outcomes and maternal and infant iron status. Data are being collected on household characteristics, maternal diet and mental health, anthropometry, infant feeding practices, morbidity and compliance. Discussion The study is timely and responds to the WHO Global Expert Consultation, which identified the need to evaluate the long-term benefits of weekly IFA and MM supplementation in WRA. Findings will generate new information to help guide policy and programs designed to reduce the burden of anemia in women and

  3. Methodology Series Module 5: Sampling Strategies.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin etc.); and 2) non-probability sampling - based on the researcher's choice, populations that are accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. A random sampling method (such as a simple random sample or a stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.

  4. Methodology series module 5: Sampling strategies

    Directory of Open Access Journals (Sweden)

    Maninder Singh Setia

    2016-01-01

    Full Text Available Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin etc.); and 2) non-probability sampling – based on the researcher's choice, populations that are accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. A random sampling method (such as a simple random sample or a stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
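    The contrast between the two probability designs named above (simple random vs. stratified random sampling) can be made concrete with a short sketch. The population, the "clinic" strata, and the sample sizes below are illustrative assumptions, not data from the study.

```python
import random

random.seed(42)

# Hypothetical sampling frame: 300 subjects, each tagged with a clinic stratum.
population = [(i, "urban" if i % 3 else "rural") for i in range(300)]

# Simple random sample: every subject has the same selection probability.
srs = random.sample(population, 30)

# Stratified random sample: draw proportionally within each stratum, so
# both clinics are represented by design rather than by chance.
strata = {}
for subject in population:
    strata.setdefault(subject[1], []).append(subject)

stratified = []
for units in strata.values():
    k = round(30 * len(units) / len(population))  # proportional allocation
    stratified.extend(random.sample(units, k))
```

    Both draws are probability samples; the stratified draw simply constrains where the randomness acts, which is what makes it preferable when subgroup representation matters.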

  5. METHODOLOGICAL ASPECTS OF STRATIFICATION OF AUDIT SAMPLING

    OpenAIRE

    Vilena A. Yakimova

    2013-01-01

    The article presents the methodological foundations for constructing a stratified audit sample for attribute-based sampling. Sampling techniques from Russian and foreign practice are studied and stratified. The role of stratification in the audit is described. Approaches to constructing the stratification are revealed on the basis of professional judgment (qualitative methods), statistical groupings (quantitative methods) and combinatory ones (complex qualitative stratifications). Gro...

  6. Methodology series module 5: Sampling strategies

    OpenAIRE

    Maninder Singh Setia

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin etc.); and 2) non-probability sampling – based on the researcher's choice, populations that are accessible and available. Some of the non-probabilit...

  7. Methodological Choices in Rating Speech Samples

    Science.gov (United States)

    O'Brien, Mary Grantham

    2016-01-01

    Much pronunciation research critically relies upon listeners' judgments of speech samples, but researchers have rarely examined the impact of methodological choices. In the current study, 30 German native listeners and 42 German L2 learners (L1 English) rated speech samples produced by English-German L2 learners along three continua: accentedness,…

  8. METHODOLOGICAL ASPECTS OF STRATIFICATION OF AUDIT SAMPLING

    Directory of Open Access Journals (Sweden)

    Vilena A. Yakimova

    2013-01-01

    Full Text Available The article presents the methodological foundations for constructing a stratified audit sample for attribute-based sampling. Sampling techniques from Russian and foreign practice are studied and stratified. The role of stratification in the audit is described. Approaches to constructing the stratification are revealed on the basis of professional judgment (qualitative methods), statistical groupings (quantitative methods) and combinatory ones (complex qualitative stratifications). Grouping of accounting information for the purpose of constructing an optimal stratification and its criteria are proposed. The stratification methods are worked out and tested on the example of ABC-analysis.

  9. Incremental Sampling Methodology (ISM) for Metallic Residues

    Science.gov (United States)

    2013-08-01

    result in improved precision for Cu or if other changes, such as increasing the digestion aliquot mass or digestion interval or increasing the number...200 g of material. The soil samples were air-dried at ambient temperature, sieved to remove the greater-than-2-mm fraction, and the less-than-2-mm...yielding a 25-kg sample. The incremental sample was air-dried at ambient temperature and passed through a 2-mm sieve. A rotary splitter was

  10. Methods for sample size determination in cluster randomized trials.

    Science.gov (United States)

    Rutterford, Clare; Copas, Andrew; Eldridge, Sandra

    2015-06-01

    The use of cluster randomized trials (CRTs) is increasing, along with the variety in their design and analysis. The simplest approach for their sample size calculation is to calculate the sample size assuming individual randomization and inflate this by a design effect to account for randomization by cluster. The assumptions of a simple design effect may not always be met; alternative or more complicated approaches are required. We summarise a wide range of sample size methods available for cluster randomized trials. For those familiar with sample size calculations for individually randomized trials but with less experience in the clustered case, this manuscript provides formulae for a wide range of scenarios with associated explanation and recommendations. For those with more experience, comprehensive summaries are provided that allow quick identification of methods for a given design, outcome and analysis method. We present first those methods applicable to the simplest two-arm, parallel group, completely randomized design followed by methods that incorporate deviations from this design such as: variability in cluster sizes; attrition; non-compliance; or the inclusion of baseline covariates or repeated measures. The paper concludes with methods for alternative designs. There is a large amount of methodology available for sample size calculations in CRTs. This paper gives the most comprehensive description of published methodology for sample size calculation and provides an important resource for those designing these trials. © The Author 2015. Published by Oxford University Press on behalf of the International Epidemiological Association.
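    The inflation step described above, for the simplest equal-cluster-size case, uses the standard design effect 1 + (m - 1)ρ, where m is the cluster size and ρ the intracluster correlation. A minimal sketch, with illustrative numbers (the cluster size, ICC, and individually randomized sample size are assumptions, not values from the paper):

```python
import math

def design_effect(cluster_size, icc):
    """Design effect for equal cluster sizes: 1 + (m - 1) * rho."""
    return 1 + (cluster_size - 1) * icc

def crt_sample_size(n_individual, cluster_size, icc):
    """Inflate a sample size computed under individual randomization."""
    return math.ceil(n_individual * design_effect(cluster_size, icc))

# e.g. 128 subjects under individual randomization, clusters of 20, ICC 0.05:
n = crt_sample_size(128, 20, 0.05)  # design effect 1.95, so 250 subjects
```

    As the paper stresses, this simple design effect assumes equal cluster sizes and a straightforward analysis; varying cluster sizes, attrition, or repeated measures call for the more elaborate formulae it summarises.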

  11. k-Means: Random Sampling Procedure

    Indian Academy of Sciences (India)

    k-Means: Random Sampling Procedure. Optimal 1-Mean as an approximation of the centroid (Inaba et al.): S = random sample of size O(1/ε); the centroid of S is a (1+ε)-approx centroid of P with constant probability.
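    The Inaba et al. guarantee quoted above — the centroid of a small random sample is, with constant probability, a (1+ε)-approximate 1-mean of the whole set — can be illustrated with a toy experiment. The point cloud, sample size, and tolerance below are illustrative assumptions.

```python
import random

def centroid(points):
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(2))

def one_mean_cost(points, c):
    # 1-mean objective: sum of squared distances to the candidate center c.
    return sum((p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 for p in points)

random.seed(0)
P = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(1000)]

opt = one_mean_cost(P, centroid(P))    # the centroid of P is the exact 1-mean
S = random.sample(P, 25)               # small sample, size O(1/eps)
approx = one_mean_cost(P, centroid(S)) # typically within a (1 + eps) factor
```

    Since the full centroid minimizes the objective, `approx >= opt` always holds; the content of the result is that the gap is small for a sample whose size depends only on ε, not on |P|.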

  12. Experimental Design and Validation of an Accelerated Random Vibration Fatigue Testing Methodology

    OpenAIRE

    Yu Jiang (Center for Statistical and Theoretical Condensed Matter Physics, Zhejiang Normal University, Jinhua City, Zhejiang Province 321004, China); Gun Jin Yun; Li Zhao; Junyong Tao

    2015-01-01

    A novel accelerated random vibration fatigue test methodology and strategy are proposed, which can generate an experimental test plan that significantly reduces the test time and the sample size. Based on theoretical analysis and a fatigue damage model, several groups of random vibration fatigue tests were designed and conducted with the aim of investigating the effects of both Gaussian and non-Gaussian random excitation on vibration fatigue. First, stress responses at a weak point of a ...

  13. K-Median: Random Sampling Procedure

    Indian Academy of Sciences (India)

    K-Median: Random Sampling Procedure. Sample a set of 1/ε + 1 points from P. Let Q = first 1/ε points, p = last point. Let T = avg. 1-Median cost of P, c = 1-Median. Let B1 = B(c, T/ε²), B2 = B(p, T). Let P' = points in B1.

  14. A random spatial sampling method in a rural developing nation.

    Science.gov (United States)

    Kondo, Michelle C; Bream, Kent D W; Barg, Frances K; Branas, Charles C

    2014-04-10

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method using geographical information system (GIS) software and global positioning system (GPS) technology for application in a health survey in a rural region of Guatemala, as well as a qualitative study of the enumeration process. This method offers an alternative sampling technique that could reduce opportunities for bias in household selection compared to cluster methods. However, its use is subject to issues surrounding survey preparation, technological limitations and in-the-field household selection. Application of this method in remote areas will raise challenges surrounding the boundary delineation process, use and translation of satellite imagery between GIS and GPS, and household selection at each survey point in varying field conditions. This method favors household selection in denser urban areas and in new residential developments. Random spatial sampling methodology can be used to survey a random sample of population in a remote region of a developing nation. Although this method should be further validated and compared with more established methods to determine its utility in social survey applications, it shows promise for use in developing nations with resource-challenged environments where detailed geographic and human census data are less available.

  15. Methodological reporting quality of randomized controlled trials in three spine journals from 2010 to 2012.

    Science.gov (United States)

    Chen, Xiao; Zhai, Xiao; Wang, Xue; Su, Jiacan; Li, Ming

    2014-08-01

    To elucidate the methodological reporting quality of randomized controlled trials (RCTs) in three spine journals from 2010 to 2012. In this study, we summarized the methodological report of RCTs in three major spine journals, including the Spine Journal, Spine and the European Spine Journal from 2010 to 2012. The methodological reporting quality, including the allocation sequence generation, allocation concealment, blinding and sample size calculation, was revealed. Number of patients, funding source, type of intervention and country were also retrieved from each trial. The methodological reporting quality was descriptively reported. Ninety trials were involved and 57.8% (52/90) reported adequate allocation sequence generation, 46.7% (42/90) reported adequate allocation concealment, 34.4% (31/90) reported adequate blinding and 37.8% (34/90) reported adequate sample size calculation. This study shows that the methodological reporting quality of RCTs in the spine field needs further improvement.

  16. Acceptance sampling using judgmental and randomly selected samples

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

    We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high- and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail, and discuss the conditions necessary to ensure that the required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.
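    The traditional (frequentist) case that the abstract treats as a special case of the Bayesian model has a compact closed form worth recalling: if n randomly sampled items are all acceptable, the confidence that the population defect rate is below q is 1 - (1 - q)^n (assuming simple random sampling from a large population). A sketch with illustrative numbers, not taken from the paper:

```python
def confidence_all_acceptable(n, q):
    """Confidence that the defect rate is below q after n clean samples,
    assuming simple random sampling from a large population."""
    return 1 - (1 - q) ** n

# Classic rule of thumb: 59 clean samples give roughly 95% confidence
# that fewer than 5% of items in the population are unacceptable.
c = confidence_all_acceptable(59, 0.05)
```

    The Bayesian model in the paper refines this by splitting the population into judgmentally sampled high-risk items and randomly sampled low-risk items, rather than treating all items as exchangeable.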

  17. Optimal Design and Purposeful Sampling: Complementary Methodologies for Implementation Research.

    Science.gov (United States)

    Duan, Naihua; Bhaumik, Dulal K; Palinkas, Lawrence A; Hoagwood, Kimberly

    2015-09-01

    Optimal design has been an under-utilized methodology. However, it has significant real-world applications, particularly in mixed methods implementation research. We review the concept and demonstrate how it can be used to assess the sensitivity of design decisions and balance competing needs. For observational studies, this methodology enables selection of the most informative study units. For experimental studies, it entails selecting and assigning study units to intervention conditions in the most informative manner. We blend optimal design methods with purposeful sampling to show how these two concepts balance competing needs when there are multiple study aims, a common situation in implementation research.

  18. Published methodological quality of randomized controlled trials does not reflect the actual quality assessed in protocols

    Science.gov (United States)

    Mhaskar, Rahul; Djulbegovic, Benjamin; Magazin, Anja; Soares, Heloisa P.; Kumar, Ambuj

    2011-01-01

    Objectives: To assess whether the reported methodological quality of randomized controlled trials (RCTs) reflects their actual methodological quality, and to evaluate the association of effect size (ES) and sample size with methodological quality. Study design: Systematic review. Setting: Retrospective analysis of all consecutive phase III RCTs published by 8 National Cancer Institute Cooperative Groups through 2006. Data were extracted from protocols (actual quality) and publications (reported quality) for each study. Results: 429 RCTs met the inclusion criteria. Overall reporting of methodological quality was poor and did not reflect the actual high methodological quality of the RCTs. The results showed no association between sample size and the actual methodological quality of a trial. Poor reporting of allocation concealment and blinding exaggerated the ES by 6% (ratio of hazard ratios [RHR]: 0.94, 95% CI: 0.88, 0.99) and 24% (RHR: 1.24, 95% CI: 1.05, 1.43), respectively. However, actual quality assessment showed no association between ES and methodological quality. Conclusion: The largest study to date shows that poor quality of reporting does not reflect the actual high methodological quality. Assessment of the impact of quality on the ES based on reported quality can produce misleading results. PMID:22424985

  19. GSAMPLE: Stata module to draw a random sample

    OpenAIRE

    Jann, Ben

    2006-01-01

    gsample draws a random sample from the data in memory. Simple random sampling (SRS) is supported, as well as unequal probability sampling (UPS), of which sampling with probabilities proportional to size (PPS) is a special case. Both methods, SRS and UPS/PPS, provide sampling with replacement and sampling without replacement. Furthermore, stratified sampling and cluster sampling are supported.
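    gsample itself is a Stata module, but the PPS idea it implements — selection probability proportional to a size measure — is easy to sketch. The sketch below uses Python's standard library and draws with replacement; the unit names and size measures are illustrative assumptions, not part of gsample.

```python
import random

random.seed(7)

# Measure of size for each unit; larger units should be drawn more often.
units = {"A": 10, "B": 30, "C": 60}

names = list(units)
weights = [units[u] for u in names]

# PPS with replacement: each of the k draws selects a unit with
# probability proportional to its size measure.
sample = random.choices(names, weights=weights, k=5)
```

    Without-replacement UPS designs (which gsample also offers) are more delicate, because removing a selected unit changes the remaining inclusion probabilities.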

  20. Generation and Analysis of Constrained Random Sampling Patterns

    DEFF Research Database (Denmark)

    Pierzchlewski, Jacek; Arildsen, Thomas

    2016-01-01

    Random sampling is a technique for signal acquisition which is gaining popularity in practical signal processing systems. Nowadays, event-driven analog-to-digital converters make random sampling feasible in practical applications. A process of random sampling is defined by a sampling pattern, whi...

  1. Can we apply the Mendelian randomization methodology without considering epigenetic effects?

    Directory of Open Access Journals (Sweden)

    Karmaus Wilfried

    2009-05-01

    Full Text Available Abstract Introduction Instrumental variable (IV) methods have been used in econometrics for several decades now, but have only recently been introduced into epidemiologic research frameworks. Similarly, Mendelian randomization studies, which use the IV methodology for analysis and inference in epidemiology, were introduced into the epidemiologist's toolbox only in the last decade. Analysis Mendelian randomization studies using instrumental variables (IVs) have the potential to avoid some of the limitations of observational epidemiology (confounding, reverse causality, regression dilution bias) for making causal inferences. Certain limitations of randomized controlled trials, such as problems with generalizability, feasibility and ethics for some exposures, and high costs, also make the use of Mendelian randomization in observational studies attractive. Unlike conventional randomized controlled trials (RCTs), Mendelian randomization studies can be conducted in a representative sample without imposing any exclusion criteria or requiring volunteers to be amenable to random treatment allocation. Within the last decade, epigenetics has gained recognition as an independent field of study, and appears to be the new direction for future research into the genetics of complex diseases. Although previous articles have addressed some of the limitations of Mendelian randomization (such as the lack of suitable genetic variants, unreliable associations, population stratification, linkage disequilibrium (LD), pleiotropy, developmental canalization, the need for large sample sizes and some potential problems with binary outcomes), none has directly characterized the impact of epigenetics on Mendelian randomization. The possibility of epigenetic effects (non-Mendelian, heritable changes in gene expression not accompanied by alterations in DNA sequence) could alter the core instrumental variable assumptions of Mendelian randomization. This paper applies conceptual

  2. Experience-Sampling Methodology with a Mobile Device in Fibromyalgia

    Directory of Open Access Journals (Sweden)

    Castilla Diana

    2012-01-01

    Full Text Available This work describes the usability studies conducted in the development of an experience-sampling methodology (ESM) system running on a mobile device. The goal of the system is to improve the accuracy and ecology of gathering daily self-report data in individuals suffering from a chronic pain condition, fibromyalgia. The usability studies showed that the software developed to conduct ESM with mobile devices (smartphones, cell phones) can be successfully used by individuals with fibromyalgia of different ages and with a low level of expertise in the use of information and communication technologies. 100% of users completed the tasks successfully, although some were completely illiterate. There also seems to be a clear difference in the mode of interaction observed in the two studies carried out.

  3. Cluster-randomized Studies in Educational Research: Principles and Methodological Aspects

    Directory of Open Access Journals (Sweden)

    Dreyhaupt, Jens

    2017-05-01

    Full Text Available An increasing number of studies are being performed in educational research to evaluate new teaching methods and approaches. These studies could be performed more efficiently and deliver more convincing results if they more strictly applied and complied with recognized standards of scientific studies. Such an approach could substantially increase the quality in particular of prospective, two-arm (intervention) studies that aim to compare two different teaching methods. A key standard in such studies is randomization, which can minimize systematic bias in study findings; such bias may result if the two study arms are not structurally equivalent. If possible, educational research studies should also achieve this standard, although this is not yet generally the case. Some difficulties and concerns exist, particularly regarding organizational and methodological aspects. An important point to consider in educational research studies is that usually individuals cannot be randomized, because of the teaching situation, and instead whole groups have to be randomized (so-called “cluster randomization”). Compared with studies with individual randomization, studies with cluster randomization normally require (significantly) larger sample sizes and more complex methods for calculating sample size. Furthermore, cluster-randomized studies require more complex methods for statistical analysis. The consequence of the above is that a competent expert with respective special knowledge needs to be involved in all phases of cluster-randomized studies. Studies to evaluate new teaching methods need to make greater use of randomization in order to achieve scientifically convincing results. Therefore, in this article we describe the general principles of cluster randomization and how to implement these principles, and we also outline practical aspects of using cluster randomization in prospective, two-arm comparative educational research studies.

  4. A random spatial sampling method in a rural developing nation

    Science.gov (United States)

    Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas

    2014-01-01

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...

  5. Power Spectrum Estimation of Randomly Sampled Signals

    DEFF Research Database (Denmark)

    Velte, Clara M.; Buchhave, Preben; K. George, William

    2014-01-01

    with high data rate and low inherent bias, respectively, while residence time weighting provides non-biased estimates regardless of setting. The free-running processor was also tested and compared to residence time weighting using actual LDA measurements in a turbulent round jet. Power spectra from... of alternative methods attempting to produce correct power spectra have been invented and tested. The objective of the current study is to create a simple computer generated signal for baseline testing of residence time weighting and some of the most commonly proposed algorithms (or algorithms which most... modern algorithms ultimately are based on), sample-and-hold and the direct spectral estimator without residence time weighting, and compare how they perform in relation to power spectra based on the equidistantly sampled reference signal. The computer generated signal is a Poisson process with a sample rate...

  6. Random constraint sampling and duality for convex optimization

    OpenAIRE

    Haskell, William B.; Pengqian, Yu

    2016-01-01

    We are interested in solving convex optimization problems with large numbers of constraints. Randomized algorithms, such as random constraint sampling, have been very successful in giving nearly optimal solutions to such problems. In this paper, we combine random constraint sampling with the classical primal-dual algorithm for convex optimization problems with large numbers of constraints, and we give a convergence rate analysis. We then report numerical experiments that verify the effectiven...

  7. Random Walks on Directed Networks: Inference and Respondent-driven Sampling

    CERN Document Server

    Malmros, Jens; Britton, Tom

    2013-01-01

    Respondent-driven sampling (RDS) is a method often used to estimate population properties (e.g. sexual risk behavior) in hard-to-reach populations. It combines an effective modified snowball sampling methodology with an estimation procedure that yields unbiased population estimates under the assumption that the sampling process behaves like a random walk on the social network of the population. Current RDS estimation methodology assumes that the social network is undirected, i.e. that all edges are reciprocal. However, empirical social networks in general also have non-reciprocated edges. To account for this fact, we develop a new estimation method for RDS in the presence of directed edges on the basis of random walks on directed networks. We distinguish directed and undirected edges and consider the possibility that the random walk returns to its current position in two steps through an undirected edge. We derive estimators of the selection probabilities of individuals as a function of the number of outgoing...

  8. Random number datasets generated from statistical analysis of randomly sampled GSM recharge cards.

    Science.gov (United States)

    Okagbue, Hilary I; Opanuga, Abiodun A; Oguntunde, Pelumi E; Ugwoke, Paulinus O

    2017-02-01

    In this article, random number datasets were generated from random samples of used GSM (Global System for Mobile Communications) recharge cards. Statistical analyses were performed to refine the raw data into random number datasets arranged in tables. A detailed description of the method and relevant tests of randomness were also discussed.

  9. Power Spectrum Estimation of Randomly Sampled Signals

    DEFF Research Database (Denmark)

    Velte, C. M.; Buchhave, P.; K. George, W.

    Residence time weighting provides non-biased estimates regardless of setting. The free-running processor was also tested and compared to residence time weighting using actual LDA measurements in a turbulent round jet. Power spectra from measurements on the jet centerline and the outer part of the jet... sine waves. The primary signal and the corresponding power spectrum are shown in Figure 1. The conventional spectrum shows multiple erroneous mixing frequencies and the peak values are too low. The residence time weighted spectrum is correct. The sample-and-hold spectrum has lower power than... the correct spectrum, and the f^-2 filtering effect appearing for low data densities is evident (Adrian and Yao 1987). The remaining tests also show that sample-and-hold and the free-running processor perform well only under very particular circumstances with high data rate and low inherent bias, respectively...

  10. Are most samples of animals systematically biased? Consistent individual trait differences bias samples despite random sampling.

    Science.gov (United States)

    Biro, Peter A

    2013-02-01

    Sampling animals from the wild for study is something nearly every biologist has done, but despite our best efforts to obtain random samples of animals, 'hidden' trait biases may still exist. For example, consistent behavioral traits can affect trappability/catchability, independent of obvious factors such as size and gender, and these traits are often correlated with other repeatable physiological and/or life history traits. If so, systematic sampling bias may exist for any of these traits. The extent to which this is a problem, of course, depends on the magnitude of bias, which is presently unknown because the underlying trait distributions in populations are usually unknown, or unknowable. Indeed, our present knowledge about sampling bias comes from samples (not complete population censuses), which can possess bias to begin with. I had the unique opportunity to create naturalized populations of fish by seeding each of four small fishless lakes with equal densities of slow-, intermediate-, and fast-growing fish. Using sampling methods that are not size-selective, I observed that fast-growing fish were up to two times more likely to be sampled than slower-growing fish. This indicates substantial and systematic bias with respect to an important life history trait (growth rate). If correlations between behavioral, physiological and life-history traits are as widespread as the literature suggests, then many animal samples may be systematically biased with respect to these traits (e.g., when collecting animals for laboratory use), which would affect our inferences about population structure and abundance. I conclude with a discussion on ways to minimize sampling bias for particular physiological/behavioral/life-history types within animal populations.
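    Biro's point — that trait-correlated catchability biases even nominally random sampling — can be illustrated with a toy simulation. The population size, trait distribution, and catch model below are assumptions for illustration only, not parameters from the study.

```python
import random

random.seed(1)

# Hypothetical fish population: each fish has a growth rate, and its
# probability of being caught is proportional to that rate.
growth = [random.uniform(0.5, 1.5) for _ in range(10000)]
caught = [g for g in growth if random.random() < g / 2]

pop_mean = sum(growth) / len(growth)
sample_mean = sum(caught) / len(caught)
# The catch overrepresents fast growers, so sample_mean exceeds pop_mean
# even though every fish had a nonzero chance of capture.
```

    The bias arises from unequal inclusion probabilities, not from any flaw in the randomization of the sampling effort itself, which is exactly why it stays hidden in field samples.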

  11. Towards Standardization of Sampling Methodology for Evaluation of ...

    African Journals Online (AJOL)

    This article proposes the procedure that may be adopted for comparable, representative and cost effective, soil sampling, and thereafter explores the policy issues regarding standardization of sampling activities and analytical process as it relates to soil pollution in Nigeria. Standardized sampling and analytical data for soil ...

  13. 21 CFR 118.7 - Sampling methodology for Salmonella Enteritidis (SE).

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Sampling methodology for Salmonella Enteritidis (SE). 118.7 Section 118.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN....7 Sampling methodology for Salmonella Enteritidis (SE). (a) Environmental sampling. An environmental...

  14. SOME SYSTEMATIC SAMPLING STRATEGIES USING MULTIPLE RANDOM STARTS

    Directory of Open Access Journals (Sweden)

    Sampath Sundaram

    2010-09-01

    In this paper an attempt is made to extend linear systematic sampling using multiple random starts due to Gautschi (1957) for various types of systematic sampling schemes available in literature, namely (i) Balanced Systematic Sampling (BSS) of Sethi (1965) and (ii) Modified Systematic Sampling (MSS) of Singh, Jindal, and Garg (1968). Further, the proposed methods were compared with the Yates corrected estimator developed with reference to Gautschi's Linear Systematic Sampling (LSS) with two random starts, using appropriate super-population models with the help of the R package for statistical computing.
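
The underlying device, m distinct random starts in 1..k followed by every k-th unit, can be sketched as follows (a generic illustration of multiple-random-start linear systematic sampling, not the BSS/MSS estimators compared in the paper):

```python
import random

def lss_multiple_starts(N, k, m, seed=0):
    """Linear systematic sampling with m random starts: draw m distinct
    starts from 1..k, then take every k-th unit; units are labelled 1..N."""
    rng = random.Random(seed)
    starts = rng.sample(range(1, k + 1), m)
    return sorted(u for r in starts for u in range(r, N + 1, k))

# N = 20 units, interval k = 5, two random starts -> 2 * (20 / 5) = 8 units.
units = lss_multiple_starts(N=20, k=5, m=2, seed=1)
print(units)
```

Each start contributes N/k units, and distinct starts never select the same unit, so the sample size is exactly m·N/k whenever k divides N.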

  15. Sampling Methodologies and Approaches for Ballast Water Management Compliance Monitoring

    OpenAIRE

    Stephan Gollasch; Matej David

    2011-01-01

    The human-mediated transfer of harmful organisms via shipping, especially via ballast water transport, has raised considerable attention especially in the last decade due to the negative associated impacts. Ballast water sampling is important to assess the compliance with ballast water management requirements (i.e. compliance monitoring). The complexity of ballast water sampling is a result of organism diversity and behaviour which may require different sampling strategies, as well as ship de...

  16. Efficient sampling of complex network with modified random walk strategies

    Science.gov (United States)

    Xie, Yunya; Chang, Shuhua; Zhang, Zhipeng; Zhang, Mi; Yang, Lei

    2018-02-01

    We present two novel random walk strategies: choosing-seed-node (CSN) random walk and no-retracing (NR) random walk. Unlike classical random walk sampling, the CSN and NR strategies focus on the influence of the seed node choice and of path overlap, respectively. The three random walk samplings are applied to the Erdős-Rényi (ER), Barabási-Albert (BA), Watts-Strogatz (WS), and weighted USAir networks, and the major properties of the sampled subnets, such as sampling efficiency, degree distributions, average degree and average clustering coefficient, are studied. Similar conclusions can be reached with all three random walk strategies. First, networks with small scales and simple structures are conducive to sampling. Second, the average degree and the average clustering coefficient of the sampled subnet converge to the corresponding values of the original networks within a limited number of steps. Third, all the degree distributions of the subnets are slightly biased toward the high-degree side; however, the NR strategy performs better for the average clustering coefficient of the subnet. In the real weighted USAir networks, salient features such as the larger clustering coefficient and the fluctuation of the degree distribution are reproduced well by these random walk strategies.
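
A minimal sketch of the no-retracing (NR) idea, as read from the abstract (not the authors' implementation): at each step the walker chooses uniformly among neighbours other than the node it just left.

```python
import random

def nr_random_walk(adj, start, steps, seed=0):
    """No-retracing random walk on an adjacency-dict graph: never step back
    to the previous node unless it is the only neighbour."""
    rng = random.Random(seed)
    walk, prev = [start], None
    for _ in range(steps):
        here = walk[-1]
        options = [v for v in adj[here] if v != prev] or adj[here]
        walk.append(rng.choice(options))
        prev = here
    return walk

# A 4-cycle: with retracing forbidden, the walk can never immediately backtrack.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
print(nr_random_walk(adj, start=0, steps=10))
```

On this degree-2 graph, excluding the previous node leaves exactly one option, so the walk circulates around the cycle rather than oscillating between two nodes.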

  17. Journal impact factor and methodological quality of surgical randomized controlled trials: an empirical study.

    Science.gov (United States)

    Ahmed Ali, Usama; Reiber, Beata M M; Ten Hove, Joren R; van der Sluis, Pieter C; Gooszen, Hein G; Boermeester, Marja A; Besselink, Marc G

    2017-11-01

    The journal impact factor (IF) is often used as a surrogate marker for methodological quality. The objective of this study is to evaluate the relation between the journal IF and the methodological quality of surgical randomized controlled trials (RCTs). Surgical RCTs published in PubMed in 1999 and 2009 were identified. According to IF, RCTs were divided into low, medium and high IF groups, as well as into top-10 vs all other journals. Methodological quality characteristics and factors concerning funding, ethical approval and statistical significance of outcomes were extracted and compared between the IF groups. Additionally, a multivariate regression was performed. The median IF was 2.2 (IQR 2.37). The percentage of 'low-risk of bias' RCTs was 13% for top-10 journals vs 4% for other journals in 1999 (P < 0.02), and 30 vs 12% in 2009 (P < 0.02). Similar results were observed for high vs low IF groups. The presence of sample-size calculation, adequate generation of allocation and intention-to-treat analysis were independently associated with publication in higher IF journals, as were multicentre trials and multiple authors. Publication of RCTs in high IF journals is associated with moderate improvement in methodological quality compared to RCTs published in lower IF journals. RCTs with adequate sample-size calculation, generation of allocation or intention-to-treat analysis were associated with publication in a high IF journal. On the other hand, reporting a statistically significant outcome and being industry funded were not independently associated with publication in a higher IF journal.

  18. Impact of Incremental Sampling Methodology (ISM) on Metals Bioavailability

    Science.gov (United States)

    2016-05-01

    Certification Program (ESTCP) funded ER-0918 project, which developed new sampling and sample preparation procedures falling under the ISM umbrella...2007; Rodriguez et al. 1999; Ruby et al. 1996, 1999) were conducted to assess the relative bioavailability (RBA) of metals in soil or...differences in digestion efficiencies (Rodriguez et al. 1999; Ruby et al. 1999, 1996). The bioavailable metal is typically only a fraction of the total

  19. Spatial Random Sampling: A Structure-Preserving Data Sketching Tool

    Science.gov (United States)

    Rahmani, Mostafa; Atia, George K.

    2017-09-01

    Random column sampling is not guaranteed to yield data sketches that preserve the underlying structures of the data and may not sample sufficiently from less-populated data clusters. Also, adaptive sampling can often provide accurate low rank approximations, yet may fall short of producing descriptive data sketches, especially when the cluster centers are linearly dependent. Motivated by that, this paper introduces a novel randomized column sampling tool dubbed Spatial Random Sampling (SRS), in which data points are sampled based on their proximity to randomly sampled points on the unit sphere. The most compelling feature of SRS is that the corresponding probability of sampling from a given data cluster is proportional to the surface area the cluster occupies on the unit sphere, independently from the size of the cluster population. Although it is fully randomized, SRS is shown to provide descriptive and balanced data representations. The proposed idea addresses a pressing need in data science and holds potential to inspire many novel approaches for analysis of big data.
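
My reading of the core SRS step, sketched in plain Python with hypothetical helper names: normalize the data to the unit sphere, draw random directions, and keep the point with the largest inner product against each direction. The severely unbalanced clusters below are still both represented.

```python
import math
import random

def spatial_random_sample(points, k, seed=0):
    """Sketch of the SRS idea: draw k random directions on the unit sphere
    and, for each, keep the data point whose normalized version has the
    largest inner product with that direction. An illustrative reading of
    the abstract, not the authors' algorithm."""
    rng = random.Random(seed)
    dim = len(points[0])

    def unit(v):
        n = math.sqrt(sum(x * x for x in v)) or 1.0
        return [x / n for x in v]

    normed = [unit(p) for p in points]
    picked = []
    for _ in range(k):
        d = unit([rng.gauss(0.0, 1.0) for _ in range(dim)])  # random direction
        best = max(range(len(points)),
                   key=lambda i: sum(a * b for a, b in zip(normed[i], d)))
        picked.append(points[best])
    return picked

# One huge cluster near direction (1, 0) and one tiny cluster near (0, 1):
big = [(1.0 + 0.01 * i, 0.0) for i in range(100)]
small = [(0.0, 1.0), (0.0, 1.01)]
sample = spatial_random_sample(big + small, k=20, seed=3)
print(sample)
```

Each random direction is "won" by whichever cluster occupies that part of the sphere, so the two-point cluster is sampled at a rate set by its angular extent, not its population size, which is the key property the abstract claims.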

  20. Evaluating the statistical methodology of randomized trials on dentin hypersensitivity management.

    Science.gov (United States)

    Matranga, Domenica; Matera, Federico; Pizzo, Giuseppe

    2017-12-27

    The present study aimed to evaluate the characteristics and quality of the statistical methodology used in clinical studies on dentin hypersensitivity management. An electronic search was performed for data published from 2009 to 2014 by using the PubMed, Ovid/MEDLINE, and Cochrane Library databases. The primary search terms were used in combination. Eligibility criteria included randomized clinical trials that evaluated the efficacy of desensitizing agents in terms of reducing dentin hypersensitivity. A total of 40 studies were considered eligible for assessment of the quality of their statistical methodology. The four main concerns identified were i) use of nonparametric tests in the presence of large samples, coupled with lack of information about normality and equality of variances of the response; ii) lack of P-value adjustment for multiple comparisons; iii) failure to account for interactions between treatment and follow-up time; and iv) no information about the number of teeth examined per patient and the consequent lack of a cluster-specific approach in data analysis. Owing to these concerns, the statistical methodology was judged to be inappropriate in 77.1% of the 35 studies that used parametric methods. Additional studies with appropriate statistical analysis are required to obtain an appropriate assessment of the efficacy of desensitizing agents.

  1. Incremental Sampling Methodology: Applications for Background Screening Assessments.

    Science.gov (United States)

    Pooler, Penelope S; Goodrum, Philip E; Crumbling, Deana; Stuchal, Leah D; Roberts, Stephen M

    2018-01-01

    This article presents the findings from a numerical simulation study that was conducted to evaluate the performance of alternative statistical analysis methods for background screening assessments when data sets are generated with incremental sampling methods (ISMs). A wide range of background and site conditions are represented in order to test different ISM sampling designs. Both hypothesis tests and upper tolerance limit (UTL) screening methods were implemented following U.S. Environmental Protection Agency (USEPA) guidance for specifying error rates. The simulations show that hypothesis testing using two-sample t-tests can meet standard performance criteria under a wide range of conditions, even with relatively small sample sizes. Key factors that affect the performance include unequal population variances and small absolute differences in population means. UTL methods are generally not recommended due to conceptual limitations in the technique when applied to ISM data sets from single decision units and due to insufficient power given standard statistical sample sizes from ISM. © 2017 Society for Risk Analysis.
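
The two-sample comparison at the heart of these simulations can be sketched with the standard library alone (Welch's t statistic with Satterthwaite degrees of freedom; the concentrations below are invented toy numbers, not study data):

```python
import math
import statistics

def welch_t(sample_a, sample_b):
    """Welch's two-sample t statistic and Satterthwaite degrees of freedom:
    the kind of background-vs-site comparison the simulations evaluate.
    A plain-stdlib sketch; a real assessment would follow USEPA guidance."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = statistics.variance(sample_a), statistics.variance(sample_b)
    se2 = va / na + vb / nb
    t = (statistics.mean(sample_a) - statistics.mean(sample_b)) / math.sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

site = [12.1, 14.3, 13.8, 15.0, 13.2]        # hypothetical ISM replicates
background = [9.8, 10.4, 11.1, 10.0, 10.9]   # hypothetical background ISM
t, df = welch_t(site, background)
print(f"t = {t:.2f}, df = {df:.1f}")
```

Unequal population variances, flagged above as a key factor, are handled by Welch's form, which does not pool the two sample variances.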

  2. SOME SYSTEMATIC SAMPLING STRATEGIES USING MULTIPLE RANDOM STARTS

    OpenAIRE

    Sampath Sundaram; Ammani Sivaraman

    2010-01-01

    In this paper an attempt is made to extend linear systematic sampling using multiple random starts due to Gautschi(1957)for various types of systematic sampling schemes available in literature, namely(i)  Balanced Systematic Sampling (BSS) of  Sethi (1965) and (ii) Modified Systematic Sampling (MSS) of Singh, Jindal, and Garg  (1968). Further, the proposed methods were compared with Yates corrected estimator developed with reference to Gautschi’s Linear systematic samplin...

  3. Sampling large random knots in a confined space

    Energy Technology Data Exchange (ETDEWEB)

    Arsuaga, J [Department of Mathematics, San Francisco State University, 1600 Holloway Ave, San Francisco, CA 94132 (United States); Blackstone, T [Department of Computer Science, San Francisco State University, 1600 Holloway Ave., San Francisco, CA 94132 (United States); Diao, Y [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Hinson, K [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Karadayi, E [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States); Saito, M [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States)

    2007-09-28

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^(n^2)). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is on the order of O(n^2). Therefore, two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.
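
A toy version of the 2D uniform random polygon and its diagram complexity (crossing count) can be written directly; this is an illustrative sketch, not the paper's code, and it ignores knot invariants entirely.

```python
import random
from itertools import combinations

def uniform_random_polygon_2d(n, seed=0):
    """n vertices i.i.d. uniform in the unit square, joined in order and
    closed up: a 2D uniform random polygon in the spirit of the model."""
    rng = random.Random(seed)
    return [(rng.random(), rng.random()) for _ in range(n)]

def count_crossings(poly):
    """Count proper crossings between non-adjacent edges of the closed polygon."""
    def ccw(a, b, c):
        return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

    n = len(poly)
    edges = [(poly[i], poly[(i + 1) % n]) for i in range(n)]
    crossings = 0
    for (i, (p1, p2)), (j, (q1, q2)) in combinations(enumerate(edges), 2):
        if (j - i) % n in (1, n - 1):
            continue  # adjacent edges share a vertex, not a crossing
        # Proper intersection: each segment's endpoints straddle the other.
        if ccw(p1, p2, q1) * ccw(p1, p2, q2) < 0 and ccw(q1, q2, p1) * ccw(q1, q2, p2) < 0:
            crossings += 1
    return crossings

poly = uniform_random_polygon_2d(50, seed=7)
print(count_crossings(poly))
```

For n = 50 there are roughly n²/2 non-adjacent edge pairs and a constant fraction of random segment pairs cross, which is the O(n²) growth in diagram crossings the abstract establishes.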

  4. Sequential time interleaved random equivalent sampling for repetitive signal

    Science.gov (United States)

    Zhao, Yijiu; Liu, Jingjing

    2016-12-01

    Compressed sensing (CS) based sampling techniques exhibit many advantages over other existing approaches for sparse signal spectrum sensing; they are also incorporated into non-uniform sampling signal reconstruction to improve efficiency, as in random equivalent sampling (RES). However, in CS based RES, only one sample of each acquisition is considered in the signal reconstruction stage, which results in more acquisition runs and longer sampling time. In this paper, a sampling sequence is taken in each RES acquisition run, and the corresponding block measurement matrix is constructed using a Whittaker-Shannon interpolation formula. All the block matrices are combined into an equivalent measurement matrix with respect to all sampling sequences. We implemented the proposed approach with a multi-core analog-to-digital converter (ADC) whose cores are time-interleaved. A prototype realization of the proposed CS based sequential random equivalent sampling method has been developed. It is able to capture an analog waveform at an equivalent sampling rate of 40 GHz while physically sampling at 1 GHz. Experiments indicate that, for a sparse signal, the proposed CS based sequential random equivalent sampling exhibits high efficiency.

  5. Analytical methodologies for the determination of benzodiazepines in biological samples.

    Science.gov (United States)

    Persona, Karolina; Madej, Katarzyna; Knihnicki, Paweł; Piekoszewski, Wojciech

    2015-09-10

    Benzodiazepine drugs belong to the most important and most widely used medicaments. They demonstrate therapeutic properties such as anxiolytic, sedative, somnifacient, anticonvulsant, diastolic and muscle relaxant effects. However, despite the fact that benzodiazepines possess a high therapeutic index and are considered relatively safe, their use can be dangerous when: (1) co-administered with alcohol; (2) co-administered with other medicaments such as sedatives, antidepressants, neuroleptics or morphine-like substances; (3) driving under their influence; or (4) used non-therapeutically as drugs of abuse or in drug-facilitated crimes. For these reasons benzodiazepines are still studied and determined in a variety of biological materials. In this article, sample preparation techniques which have been applied in the analysis of benzodiazepine drugs in biological samples are reviewed and presented. The next part of the article focuses on a review of analytical methods which have been employed for pharmacological, toxicological or forensic study of this group of drugs in biological matrices. The review is preceded by a description of the physicochemical properties of the selected benzodiazepines and of two sedative-hypnotic drugs that very often coexist in the same analyzed samples. Copyright © 2015. Published by Elsevier B.V.

  6. Optimum allocation in multivariate stratified random sampling: Stochastic matrix optimisation

    OpenAIRE

    Diaz-Garcia, Jose A.; Ramos-Quiroga, Rogelio

    2011-01-01

    The allocation problem for multivariate stratified random sampling as a problem of stochastic matrix integer mathematical programming is considered. With these aims the asymptotic normality of sample covariance matrices for each strata is established. Some alternative approaches are suggested for its solution. An example is solved by applying the proposed techniques.

  7. Performance of Random Effects Model Estimators under Complex Sampling Designs

    Science.gov (United States)

    Jia, Yue; Stokes, Lynne; Harris, Ian; Wang, Yan

    2011-01-01

    In this article, we consider estimation of parameters of random effects models from samples collected via complex multistage designs. Incorporation of sampling weights is one way to reduce estimation bias due to unequal probabilities of selection. Several weighting methods have been proposed in the literature for estimating the parameters of…

  8. Sample Selection in Randomized Experiments: A New Method Using Propensity Score Stratified Sampling

    Science.gov (United States)

    Tipton, Elizabeth; Hedges, Larry; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Caverly, Sarah

    2014-01-01

    Randomized experiments are often seen as the "gold standard" for causal research. Despite the fact that experiments use random assignment to treatment conditions, units are seldom selected into the experiment using probability sampling. Very little research on experimental design has focused on how to make generalizations to well-defined…

  9. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality

    Science.gov (United States)

    Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Introduction Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in
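
The standardized outcome rates this record reports (response, cooperation, contact, refusal) follow from simple disposition counts. Below is a simplified sketch of AAPOR-style formulas; the disposition numbers are invented and scaled only to echo the reported percentages, not the survey's actual call outcomes.

```python
def aapor_rates(completes, partials, refusals, non_contacts, unknown, e=1.0):
    """Simplified AAPOR-style outcome rates (RR2-like: partial interviews
    count as respondents). `e` is the assumed eligibility share of
    unknown-status numbers. A sketch of the definitions, not the survey's code."""
    eligible = completes + partials + refusals + non_contacts + e * unknown
    response = (completes + partials) / eligible
    cooperation = (completes + partials) / (completes + partials + refusals)
    contact = (completes + partials + refusals) / eligible
    refusal = refusals / eligible
    return response, cooperation, contact, refusal

# Invented dispositions chosen to roughly echo the reported 31%/81%/39%/7%.
rr, coop, con, ref = aapor_rates(completes=310, partials=0, refusals=70,
                                 non_contacts=220, unknown=400)
print(f"response {rr:.0%}, cooperation {coop:.0%}, contact {con:.0%}, refusal {ref:.0%}")
```

In a mobile random-digit-dial frame, the large `unknown` pool (unassigned or non-working numbers) is what drives the low contact and response rates despite high cooperation once someone answers.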

  10. Random sampling and validation of covariance matrices of resonance parameters

    Science.gov (United States)

    Plevnik, Lucijan; Zerovnik, Gašper

    2017-09-01

    Analytically exact methods for random sampling of arbitrary correlated parameters are presented. Emphasis is given on one hand on the possible inconsistencies in the covariance data, concentrating on the positive semi-definiteness and consistent sampling of correlated inherently positive parameters, and on the other hand on optimization of the implementation of the methods itself. The methods have been applied in the program ENDSAM, written in the Fortran language, which from a file from a nuclear data library of a chosen isotope in ENDF-6 format produces an arbitrary number of new files in ENDF-6 format which contain values of random samples of resonance parameters (in accordance with corresponding covariance matrices) in places of original values. The source code for the program ENDSAM is available from the OECD/NEA Data Bank. The program works in the following steps: reads resonance parameters and their covariance data from nuclear data library, checks whether the covariance data is consistent, and produces random samples of resonance parameters. The code has been validated with both realistic and artificial data to show that the produced samples are statistically consistent. Additionally, the code was used to validate covariance data in existing nuclear data libraries. A list of inconsistencies, observed in covariance data of resonance parameters in ENDF-VII.1, JEFF-3.2 and JENDL-4.0 is presented. For now, the work has been limited to resonance parameters, however the methods presented are general and can in principle be extended to sampling and validation of any nuclear data.
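
The core requirement here, that a covariance matrix be positive (semi-)definite before correlated samples are drawn from it, can be checked and used via a plain Cholesky factorization. This is a generic sketch; ENDSAM's actual procedure is more involved and handles semi-definite and inconsistent cases.

```python
import math
import random

def cholesky(cov):
    """Lower-triangular Cholesky factor; raises ValueError if the matrix is
    not positive definite (the kind of consistency check described above)."""
    n = len(cov)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = cov[i][j] - sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                if s <= 0:
                    raise ValueError("covariance matrix is not positive definite")
                L[i][i] = math.sqrt(s)
            else:
                L[i][j] = s / L[j][j]
    return L

def sample_correlated(mean, cov, seed=0):
    """One multivariate-normal draw: mean + L @ z with z standard normal."""
    rng = random.Random(seed)
    L = cholesky(cov)
    z = [rng.gauss(0.0, 1.0) for _ in mean]
    return [m + sum(L[i][k] * z[k] for k in range(len(z)))
            for i, m in enumerate(mean)]

cov = [[4.0, 2.0], [2.0, 3.0]]
print(sample_correlated([1.0, 2.0], cov, seed=5))
```

A matrix such as [[1, 2], [2, 1]] fails the factorization, which is exactly the kind of inconsistency the validation step would flag in a nuclear data library.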

  11. Generalized and synthetic regression estimators for randomized branch sampling

    Science.gov (United States)

    David L. R. Affleck; Timothy G. Gregoire

    2015-01-01

    In felled-tree studies, ratio and regression estimators are commonly used to convert more readily measured branch characteristics to dry crown mass estimates. In some cases, data from multiple trees are pooled to form these estimates. This research evaluates the utility of both tactics in the estimation of crown biomass following randomized branch sampling (...

  12. Effective sampling of random surfaces by baby universe surgery

    NARCIS (Netherlands)

    Ambjørn, J.; Białas, P.; Jurkiewicz, J.; Burda, Z.; Petersson, B.

    1994-01-01

    We propose a new, very efficient algorithm for sampling of random surfaces in the Monte Carlo simulations, based on so-called baby universe surgery, i.e. cutting and pasting of baby universe. It drastically reduces slowing down as compared to the standard local flip algorithm, thereby allowing

  13. Are marketed topical metronidazole creams bioequivalent? Evaluation by in vivo microdialysis sampling and tape stripping methodology

    DEFF Research Database (Denmark)

    Garcia Ortiz, Patricia Elodia; Hansen, S H; Shah, Surendra P.

    2011-01-01

    To evaluate the bioequivalence of 3 marketed topical metronidazole formulations by simultaneous dermal microdialysis and stratum corneum sampling by the tape stripping methodology, and to compare the techniques as tools for the determination of bioequivalence....

  15. Flexible sampling large-scale social networks by self-adjustable random walk

    Science.gov (United States)

    Xu, Xiao-Ke; Zhu, Jonathan J. H.

    2016-12-01

    Online social networks (OSNs) have become an increasingly attractive gold mine for academic and commercial researchers. However, research on OSNs faces a number of difficult challenges. One bottleneck lies in the massive quantity and frequent unavailability of OSN population data. Sampling is perhaps the only feasible solution to these problems. How to draw samples that can represent the underlying OSNs has remained a formidable task for a number of conceptual and methodological reasons. In particular, most of the empirically-driven studies on network sampling are confined to simulated data or sub-graph data, which are fundamentally different from real and complete-graph OSNs. In the current study, we propose a flexible sampling method, called Self-Adjustable Random Walk (SARW), and test it against the population data of a real large-scale OSN. We evaluate the strengths of the sampling method in comparison with four prevailing methods: uniform, breadth-first search (BFS), random walk (RW), and revised RW (i.e., MHRW) sampling. We mix both induced-edge and external-edge information of sampled nodes together in the same sampling process. Our results show that the SARW sampling method is able to generate unbiased samples of OSNs with maximal precision and minimal cost. The study is helpful for the practice of OSN research by providing a highly needed sampling tool, for the methodological development of large-scale network sampling through comparative evaluation of existing sampling methods, and for the theoretical understanding of human networks by highlighting discrepancies and contradictions between existing knowledge/assumptions and large-scale real OSN data.

  16. Randomized clinical trials in dentistry: Risks of bias, risks of random errors, reporting quality, and methodologic quality over the years 1955-2013.

    Science.gov (United States)

    Saltaji, Humam; Armijo-Olivo, Susan; Cummings, Greta G; Amin, Maryam; Flores-Mir, Carlos

    2017-01-01

    To examine the risks of bias, risks of random errors, reporting quality, and methodological quality of randomized clinical trials of oral health interventions and the development of these aspects over time. We included 540 randomized clinical trials from 64 selected systematic reviews. We extracted, in duplicate, details from each of the selected randomized clinical trials with respect to publication and trial characteristics, reporting and methodologic characteristics, and Cochrane risk of bias domains. We analyzed data using logistic regression and Chi-square statistics. Sequence generation was assessed to be inadequate (at unclear or high risk of bias) in 68% (n = 367) of the trials, while allocation concealment was inadequate in the majority of trials (n = 464; 85.9%). Blinding of participants and blinding of the outcome assessment were judged to be inadequate in 28.5% (n = 154) and 40.5% (n = 219) of the trials, respectively. A sample size calculation before the initiation of the study was not performed/reported in 79.1% (n = 427) of the trials, while the sample size was assessed as adequate in only 17.6% (n = 95) of the trials. Two thirds of the trials were not described as double blinded (n = 358; 66.3%), while the method of blinding was appropriate in 53% (n = 286) of the trials. We identified a significant decrease over time (1955-2013) in the proportion of trials assessed as having inadequately addressed methodological quality items (P < 0.05) in 30 out of the 40 quality criteria, or as being inadequate (at high or unclear risk of bias) in five domains of the Cochrane risk of bias tool: sequence generation, allocation concealment, incomplete outcome data, other sources of bias, and overall risk of bias. The risks of bias, risks of random errors, reporting quality, and methodological quality of randomized clinical trials of oral health interventions have improved over time; however, further efforts that contribute to the development of more stringent

  17. Random sampling and validation of covariance matrices of resonance parameters

    Directory of Open Access Journals (Sweden)

    Plevnik Lucijan

    2017-01-01

    Analytically exact methods for random sampling of arbitrary correlated parameters are presented. Emphasis is given on one hand on the possible inconsistencies in the covariance data, concentrating on the positive semi-definiteness and consistent sampling of correlated inherently positive parameters, and on the other hand on optimization of the implementation of the methods itself. The methods have been applied in the program ENDSAM, written in the Fortran language, which from a file from a nuclear data library of a chosen isotope in ENDF-6 format produces an arbitrary number of new files in ENDF-6 format which contain values of random samples of resonance parameters (in accordance with corresponding covariance matrices) in places of original values. The source code for the program ENDSAM is available from the OECD/NEA Data Bank. The program works in the following steps: reads resonance parameters and their covariance data from nuclear data library, checks whether the covariance data is consistent, and produces random samples of resonance parameters. The code has been validated with both realistic and artificial data to show that the produced samples are statistically consistent. Additionally, the code was used to validate covariance data in existing nuclear data libraries. A list of inconsistencies, observed in covariance data of resonance parameters in ENDF-VII.1, JEFF-3.2 and JENDL-4.0 is presented. For now, the work has been limited to resonance parameters, however the methods presented are general and can in principle be extended to sampling and validation of any nuclear data.

  18. Cluster randomized trials utilizing primary care electronic health records : methodological issues in design, conduct, and analysis (eCRT Study)

    NARCIS (Netherlands)

    Gulliford, Martin C; van Staa, Tjeerd P; McDermott, Lisa; McCann, Gerard; Charlton, Judith; Dregan, Alex

    2014-01-01

    BACKGROUND: There is growing interest in conducting clinical and cluster randomized trials through electronic health records. This paper reports on the methodological issues identified during the implementation of two cluster randomized trials using the electronic health records of the Clinical

  19. Sampling Polymorphs of Ionic Solids using Random Superlattices.

    Science.gov (United States)

    Stevanović, Vladan

    2016-02-19

    Polymorphism offers a rich and virtually unexplored space for discovering novel functional materials. To harness this potential, approaches capable of both exploring the space of polymorphs and assessing their realizability are needed. One such approach, devised for partially ionic solids, is presented. The structure prediction part is carried out by performing local density functional theory relaxations on a large set of random superlattices (RSLs) with atoms distributed randomly over different planes in a way that favors cation-anion coordination. Applying RSL sampling to MgO, ZnO, and SnO2 reveals that the resulting probability of occurrence of a given structure offers a measure of its realizability, fully explaining the experimentally observed metastable polymorphs in these three systems.

  20. Applying Incremental Sampling Methodology to Soils Containing Heterogeneously Distributed Metallic Residues to Improve Risk Analysis.

    Science.gov (United States)

    Clausen, J L; Georgian, T; Gardner, K H; Douglas, T A

    2018-01-01

    This study compares conventional grab sampling to incremental sampling methodology (ISM) for characterizing metal contamination at a military small-arms range. Grab sample results had large variances, positively skewed non-normal distributions, extreme outliers, and poor agreement between duplicate samples, even when samples were co-located within tens of centimeters of each other. The extreme outliers strongly influenced the grab sample means for the primary contaminants lead (Pb) and antimony (Sb). In contrast, median and mean metal concentrations were similar for the ISM samples. ISM significantly reduced the measurement uncertainty of estimates of the mean, increasing data quality (e.g., for environmental risk assessments) with fewer samples (e.g., decreasing total project costs). Based on Monte Carlo resampling simulations, grab sampling resulted in highly variable means and upper confidence limits of the mean relative to ISM.
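    The Monte Carlo resampling comparison described above can be mimicked on synthetic data: each ISM result averages many increments, which damps the influence of skewed outliers relative to small grab samples. The field distribution, increment counts, and replicate counts below are illustrative assumptions, not the study's actual data or design.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    # Hypothetical field: lognormal concentrations, mimicking the positively
    # skewed, outlier-prone grab-sample data described in the abstract.
    field = rng.lognormal(mean=3.0, sigma=1.5, size=100_000)

    def grab_mean(rng, n=10):
        # A small set of discrete grab samples, averaged.
        return rng.choice(field, size=n).mean()

    def ism_mean(rng, increments=30, replicates=3):
        # Each ISM result is the mean of many physically combined increments;
        # replicate ISM samples are then averaged, further damping outliers.
        return np.mean([rng.choice(field, size=increments).mean()
                        for _ in range(replicates)])

    grab = np.array([grab_mean(rng) for _ in range(2000)])
    ism = np.array([ism_mean(rng) for _ in range(2000)])
    ```

    On this synthetic field the resampled ISM means show markedly less spread than the grab-sample means, which is the qualitative effect the study reports.
    
    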

  1. Analysis of a global random stratified sample of nurse legislation.

    Science.gov (United States)

    Benton, D C; Fernández-Fernández, M P; González-Jurado, M A; Beneit-Montesinos, J V

    2015-06-01

    To identify, compare and contrast the major component parts of a heterogeneous stratified sample of nursing legislation. Nursing legislation varies from one jurisdiction to another, and until now no research existed into whether such variations are random or related to a set of key attributes. This mixed-method study used a random stratified sample of legislation to map, through documentary analysis, the content of 14 nursing acts and then explored, using quantitative techniques, whether the material contained relates to a number of key attributes. These attributes include: the legal tradition of the jurisdiction; the model of regulation; the administrative approach; the area of the world; and the economic status of the jurisdiction. Twelve component parts of nursing legislation were identified. These were remarkably similar irrespective of the attributes of interest. However, not all component parts were specified at the same level of detail, and the manner in which the elements were addressed did vary. A number of potential relationships between the structure of the legislation and the key attributes of interest were identified. This study generated a comprehensive and integrated map of a global sample of nursing legislation. It provides a set of descriptors that can be used to undertake further quantitative work and an important policy tool to facilitate dialogue between regulatory bodies. At the individual nurse level, it offers insights that can help nurses pursue recognition of their credentials across jurisdictions. © 2015 International Council of Nurses.

  2. Effectiveness of the Comprehensive Approach to Rehabilitation (CARe) methodology: design of a cluster randomized controlled trial.

    Science.gov (United States)

    Bitter, Neis A; Roeg, Diana P K; van Nieuwenhuizen, Chijs; van Weeghel, Jaap

    2015-07-22

    There is an increasing amount of evidence for the effectiveness of rehabilitation interventions for people with severe mental illness (SMI). In the Netherlands, a rehabilitation methodology that is well known and often applied is the Comprehensive Approach to Rehabilitation (CARe) methodology. The overall goal of the CARe methodology is to improve the client's quality of life by supporting the client in realizing his/her goals and wishes, handling his/her vulnerability and improving the quality of his/her social environment. The methodology is strongly influenced by the concept of 'personal recovery' and the 'strengths case management model'. No controlled effect studies have hitherto been conducted on the CARe methodology. This study is a two-armed cluster randomized controlled trial (RCT) that will be executed in teams from three organizations for sheltered and supported housing, which provide services to people with long-term severe mental illness. Teams in the intervention group will receive the multiple-day CARe methodology training from a specialized institute and start working according to the CARe methodology guideline. Teams in the control group will continue working in their usual way. Standardized questionnaires will be completed at baseline (T0) and at 10 (T1) and 20 months (T2) post-baseline. Primary outcomes are recovery, social functioning and quality of life. The model fidelity of the CARe methodology will be assessed at T1 and T2. This study is the first controlled effect study on the CARe methodology and one of the few RCTs on a broad rehabilitation method or strengths-based approach. It is relevant because mental health care organizations have become increasingly interested in recovery and rehabilitation-oriented care. The trial registration number is ISRCTN77355880.

  3. Methodology for Speech Assessment in the Scandcleft Project-An International Randomized Clinical Trial on Palatal Surgery

    DEFF Research Database (Denmark)

    Willadsen, Elisabeth

    2009-01-01

    Objective: To present the methodology for speech assessment in the Scandcleft project and discuss issues from a pilot study. Design: Description of methodology and blinded test for speech assessment. Speech samples and instructions for data collection and analysis for comparisons of speech outcomes across five included languages were developed and tested. Participants and Materials: Randomly selected video recordings of 10 5-year-old children from each language (n = 50) were included in the project. Speech material consisted of test consonants in single words, connected speech, and syllable chains... -sum and the overall rating of VPC was 78%. Conclusions: Pooling data of speakers of different languages in the same trial and comparing speech outcome across trials seems possible if the assessment of speech concerns consonants and is confined to speech units that are phonetically similar across languages. Agreed...

  4. Municipal solid waste composition: Sampling methodology, statistical analyses, and case study evaluation

    DEFF Research Database (Denmark)

    Edjabou, Vincent Maklawe Essonanawe; Jensen, Morten Bang; Götze, Ramona

    2015-01-01

    Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in literature. This limits both comparability and applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub... stratification parameter. Separating food leftovers from food packaging during manual sorting of the sampled waste did not have significant influence on the proportions of food waste and packaging materials, indicating that this step may not be required. (C) 2014 Elsevier Ltd. All rights reserved.

  5. Randomly Sampled-Data Control Systems. Ph.D. Thesis

    Science.gov (United States)

    Han, Kuoruey

    1990-01-01

    The purpose is to solve the Linear Quadratic Regulator (LQR) problem with random time sampling. Such a sampling scheme may arise from imperfect instrumentation, as in the case of sampling jitter; it can also model stochastic information exchange among decentralized controllers, to name just a few applications. A practical suboptimal controller is proposed with the desirable property of mean-square stability. The proposed controller is suboptimal in the sense that the control structure is limited to be linear; because of the i.i.d. sampling assumption, this restriction does not seem unreasonable. Once the control structure is fixed, the stochastic discrete optimal control problem is transformed into an equivalent deterministic optimal control problem with dynamics described by a matrix difference equation. The N-horizon control problem is solved using the Lagrange multiplier method, and the infinite-horizon control problem is formulated as a classical minimization problem. Assuming a solution to the minimization problem exists, the total system is shown to be mean-square stable under certain observability conditions. Computer simulations are performed to illustrate these conditions.
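    The deterministic backbone of the approach, an N-horizon discrete LQR solved by backward recursion, can be sketched as follows. This is the standard Riccati recursion for a fixed sampling period, not the thesis's randomly sampled extension; the system matrices below are an assumed toy double integrator.

    ```python
    import numpy as np

    def finite_horizon_lqr(A, B, Q, R, N):
        """Backward Riccati recursion for the N-horizon discrete LQR.
        Returns the time-varying gains K_0..K_{N-1} and the terminal-to-initial
        cost matrix P (illustrative sketch only)."""
        P = Q.copy()
        gains = []
        for _ in range(N):
            # K minimizing the stage cost-to-go given P at the next step
            K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
            P = Q + A.T @ P @ (A - B @ K)
            gains.append(K)
        return gains[::-1], P   # reorder so gains[0] applies at time 0

    # Toy double-integrator plant sampled at dt = 0.1 (assumed example data)
    A = np.array([[1.0, 0.1], [0.0, 1.0]])
    B = np.array([[0.0], [0.1]])
    Q = np.eye(2)
    R = np.array([[1.0]])
    gains, P = finite_horizon_lqr(A, B, Q, R, 200)
    ```

    For a long horizon the first gain approaches the stationary LQR gain, and the closed loop A - B K is stable, which is the property the randomly sampled variant must preserve in the mean-square sense.
    
    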

  6. A Table-Based Random Sampling Simulation for Bioluminescence Tomography

    Directory of Open Access Journals (Sweden)

    Xiaomeng Zhang

    2006-01-01

    Full Text Available As a popular simulation of photon propagation in turbid media, the main problem of the Monte Carlo (MC) method is its cumbersome computation. In this work, a table-based random sampling simulation (TBRS) is proposed. The key idea of TBRS is to simplify multiple scattering steps into a single-step process through random table querying, thus greatly reducing the computational complexity of the conventional MC algorithm and expediting the computation. The TBRS simulation is a fast alternative to the conventional MC simulation of photon propagation. It retains the merits of flexibility and accuracy of the conventional MC method and adapts well to complex geometric media and various source shapes. Both MC simulations were conducted in a homogeneous medium in our work. We also present a reconstruction approach to estimate the position of the fluorescent source, based on trial and error, as a validation of the TBRS algorithm. Good agreement is found between the conventional MC simulation and the TBRS simulation.
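    The general table-querying idea, precomputing an inverse-CDF lookup table so each random draw is a single array index instead of a per-step numerical inversion, can be sketched generically. The exponential step-length distribution below is an assumed stand-in, not the paper's multi-step scattering table.

    ```python
    import numpy as np

    def build_sampling_table(pdf, x_grid, table_size=4096):
        """Precompute an inverse-CDF lookup table for the given (unnormalized)
        pdf evaluated on x_grid."""
        p = pdf(x_grid)
        cdf = np.cumsum(p)
        cdf /= cdf[-1]
        # Table entry k holds the quantile at probability (k + 0.5)/table_size
        u = (np.arange(table_size) + 0.5) / table_size
        return np.interp(u, cdf, x_grid)

    def table_sample(table, rng, n):
        # One uniform integer draw indexes the table directly: O(1) per sample.
        idx = rng.integers(0, len(table), size=n)
        return table[idx]

    rng = np.random.default_rng(1)
    x = np.linspace(0.0, 10.0, 2001)
    table = build_sampling_table(lambda t: np.exp(-t), x)  # exponential-like step lengths
    steps = table_sample(table, rng, 100_000)
    ```

    The table is built once; afterwards every "scattering step" costs only a uniform draw and an array lookup, which is the source of the speed-up the abstract claims.
    
    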

  7. Improvement of sampling strategies for randomly distributed hotspots in soil applying a computerized simulation considering the concept of uncertainty.

    Science.gov (United States)

    Hildebrandt, Thomas; Pick, Denis; Einax, Jürgen W

    2012-02-01

    The pollution of soil and the environment as a result of human activity is a major problem, and the determination of local contaminations is of interest for environmental remediation. These hotspots can have various toxic effects on plants, animals, humans, and the whole ecological system, and economic and juridical consequences are also possible, e.g., high costs for remediation measures. In this study, three sampling strategies (simple random sampling, stratified sampling, and systematic sampling) were applied to randomly distributed hotspot contaminations to prove their efficiency in terms of finding hotspots. The results were used for the validation of a computerized simulation, which can simulate the contamination on a field, the sampling pattern, and a virtual sampling. A constant hit rate showed that none of the sampling patterns reached better results than the others. Furthermore, the uncertainty associated with the results is described by confidence intervals. It has to be considered that the uncertainty during sampling is enormous and decreases only slightly, even when the number of samples is increased to an unreasonable amount. It is hardly possible to identify the exact number of randomly distributed hotspot contaminations by statistical sampling, but a range of possible results can be calculated. Depending on various parameters, such as the shape and size of the area, the number of hotspots, and the sample quantity, optimal sampling strategies can be derived. Furthermore, an estimation of the bias arising from the sampling methodology is possible. The developed computerized simulation is an innovative tool for optimizing sampling strategies in terrestrial compartments for hotspot distributions.
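    A minimal version of such a hit-rate simulation, random hotspot placement, a sampling pattern, and a virtual sampling over many trials, can be sketched as below. All field dimensions, hotspot counts, and radii are assumed toy values, not the study's configuration, and only two of the three strategies are shown.

    ```python
    import numpy as np

    def simulate_hit_rate(strategy, n_samples, n_hotspots=5, size=100.0,
                          radius=2.0, trials=500, seed=0):
        """Fraction of trials in which at least one sampling point falls
        inside any randomly placed circular hotspot."""
        rng = np.random.default_rng(seed)
        hits = 0
        for _ in range(trials):
            spots = rng.uniform(0, size, (n_hotspots, 2))   # random hotspot centers
            if strategy == "random":
                pts = rng.uniform(0, size, (n_samples, 2))  # simple random sampling
            else:  # "systematic": regular grid with the same point count
                k = int(np.sqrt(n_samples))
                g = (np.arange(k) + 0.5) * size / k
                pts = np.array([(x, y) for x in g for y in g])
            d = np.linalg.norm(pts[:, None, :] - spots[None, :, :], axis=2)
            hits += (d.min() <= radius)
        return hits / trials

    rand_rate = simulate_hit_rate("random", 49)
    grid_rate = simulate_hit_rate("systematic", 49)
    ```

    With randomly placed hotspots the two patterns yield broadly similar hit rates, illustrating the abstract's finding that no pattern clearly outperforms the others for this target type.
    
    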

  8. Autoradiography of geological fluorite samples for determination of uranium and thorium distribution using nuclear track methodology

    Energy Technology Data Exchange (ETDEWEB)

    Pi, T.; Sole, J. [Instituto de Geologia, UNAM, Cd. Universitaria, Coyoacan, 04510 Mexico DF (Mexico); Golzarri, J.I; Rickards, J.; Espinosa, G. [IFUNAM, AP 20-364, 01000 Mexico DF (Mexico)]. e-mail: espinosa@fisica.unam.mx

    2007-07-01

    In this paper we present the uranium and thorium distribution analysis of several samples from 'La Azul', an epithermal fluorspar deposit in southern Mexico, using nuclear track methodology (NTM) in the alpha-autoradiography mode, by placing the mineral sample in contact with a polycarbonate detector. This constitutes a non-destructive analysis with sufficient sensitivity to provide valuable information about the textural and paragenetic characteristics of the geological samples. The selected nuclear track detector was CR-39 (Landauer). The region of interest of the geological samples was polished and put in contact with the detector material surface for 45 days in a vacuum chamber (10⁻³ torr). After this period of time, the detectors were chemically etched, revealing the autoradiograph of the radioactive material. The results show a clear distribution of uranium and thorium bands in the fluorite samples. This is valuable information for genetic and geochronological studies of the ore deposits. (Author)

  9. A comparison of methods for representing sparsely sampled random quantities.

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente Jose; Swiler, Laura Painton; Urbina, Angel; Mullins, Joshua

    2013-09-01

    This report discusses the treatment of uncertainties stemming from relatively few samples of random quantities. The importance of this topic extends beyond experimental data uncertainty to situations involving uncertainty in model calibration, validation, and prediction. With very sparse data samples it is not practical to aim at accurately estimating the underlying probability density function (PDF). Rather, a pragmatic goal is that the uncertainty representation should be conservative, so as to bound a specified percentile range of the actual PDF, say the range between the 0.025 and 0.975 percentiles, with reasonable reliability. A second, opposing objective is that the representation not be overly conservative, i.e., that it minimally over-estimates the desired percentile range of the actual PDF. The presence of these two opposing objectives makes the sparse-data uncertainty representation problem interesting and difficult. In this report, five uncertainty representation techniques are characterized for their performance on twenty-one test problems (over thousands of trials for each problem) according to these two opposing objectives and other performance measures. Two of the methods, statistical tolerance intervals and a kernel density approach specifically developed for handling sparse data, exhibit significantly better overall performance than the others.
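    For the normal case, the two-sided tolerance interval mentioned above has a standard approximate form, x̄ ± k·s, where k is chosen so the interval bounds a target coverage of the population with a stated confidence. The sketch below uses Howe's approximation with a Wilson-Hilferty chi-square quantile, a common textbook route and an assumption here, not the report's implementation.

    ```python
    import math
    from statistics import NormalDist

    def two_sided_tolerance_factor(n, coverage=0.95, confidence=0.95):
        """Howe's approximation for the two-sided normal tolerance factor k:
        xbar +/- k*s bounds at least `coverage` of the population with the
        given confidence. The chi-square quantile is approximated via
        Wilson-Hilferty."""
        nd = NormalDist()
        z_cov = nd.inv_cdf((1 + coverage) / 2)
        df = n - 1
        z_tail = nd.inv_cdf(1 - confidence)   # lower chi-square tail probability
        chi2 = df * (1 - 2 / (9 * df) + z_tail * math.sqrt(2 / (9 * df))) ** 3
        return z_cov * math.sqrt(df * (1 + 1 / n) / chi2)
    ```

    For n = 10 the factor is about 3.4, far wider than the naive 1.96 standard deviations; the factor shrinks toward z as n grows, which is exactly the conservative-but-not-too-conservative trade-off the report studies.
    
    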

  10. Methodological reporting of randomized controlled trials in major hepato-gastroenterology journals in 2008 and 1998: a comparative study

    Science.gov (United States)

    2011-01-01

    Background It was still unclear whether the methodological reporting quality of randomized controlled trials (RCTs) in major hepato-gastroenterology journals improved after the Consolidated Standards of Reporting Trials (CONSORT) Statement was revised in 2001. Methods RCTs published in five major hepato-gastroenterology journals in 1998 or 2008 were retrieved from MEDLINE using a high-sensitivity search method, and the reporting quality of their methodological details was evaluated based on the CONSORT Statement and the Cochrane Handbook for Systematic Reviews of Interventions. Changes in methodological reporting quality between 2008 and 1998 were calculated as risk ratios with 95% confidence intervals. Results A total of 107 RCTs published in 2008 and 99 RCTs published in 1998 were found. Compared to 1998, the proportion of RCTs that reported sequence generation (RR, 5.70; 95%CI 3.11-10.42), allocation concealment (RR, 4.08; 95%CI 2.25-7.39), sample size calculation (RR, 3.83; 95%CI 2.10-6.98), incomplete outcome data addressed (RR, 1.81; 95%CI 1.03-3.17), and intention-to-treat analyses (RR, 3.04; 95%CI 1.72-5.39) increased in 2008. Blinding and intention-to-treat analysis were reported better in multi-center trials than in single-center trials. The reporting of allocation concealment and blinding was better in industry-sponsored trials than in publicly funded trials. Compared with historical studies, the methodological reporting quality improved with time. Conclusion Although the reporting of several important methodological aspects improved in 2008 compared with 1998, which may indicate that researchers had increased awareness of and compliance with the revised CONSORT Statement, some items were still reported poorly. There is much room for future improvement. PMID:21801429
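    The risk ratios with 95% confidence intervals reported above follow the standard log-scale formula for comparing two proportions. A minimal sketch, with hypothetical counts chosen for illustration (the paper's underlying 2x2 counts are not given in the abstract):

    ```python
    import math

    def risk_ratio_ci(a1, n1, a0, n0, z=1.96):
        """Risk ratio (a1/n1)/(a0/n0) with a 95% CI via the log method,
        e.g. for the proportion of RCTs reporting an item in 2008 vs 1998."""
        rr = (a1 / n1) / (a0 / n0)
        se = math.sqrt(1 / a1 - 1 / n1 + 1 / a0 - 1 / n0)  # SE of log(RR)
        lo = math.exp(math.log(rr) - z * se)
        hi = math.exp(math.log(rr) + z * se)
        return rr, lo, hi

    # Hypothetical counts: 60/107 trials reporting an item in 2008 vs 10/99 in 1998
    rr, lo, hi = risk_ratio_ci(60, 107, 10, 99)
    ```

    The interval is asymmetric around the point estimate because it is symmetric on the log scale, which matches the shape of the intervals quoted in the abstract.
    
    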

  11. Cluster randomized trials utilizing primary care electronic health records: methodological issues in design, conduct, and analysis (eCRT Study).

    Science.gov (United States)

    Gulliford, Martin C; van Staa, Tjeerd P; McDermott, Lisa; McCann, Gerard; Charlton, Judith; Dregan, Alex

    2014-06-11

    There is growing interest in conducting clinical and cluster randomized trials through electronic health records. This paper reports on the methodological issues identified during the implementation of two cluster randomized trials using the electronic health records of the Clinical Practice Research Datalink (CPRD). Two trials were completed in primary care: one aimed to reduce inappropriate antibiotic prescribing for acute respiratory infection; the other aimed to increase physician adherence with secondary prevention interventions after first stroke. The paper draws on documentary records and trial datasets to report on the methodological experience with respect to research ethics and research governance approval, general practice recruitment and allocation, sample size calculation and power, intervention implementation, and trial analysis. We obtained research governance approvals from more than 150 primary care organizations in England, Wales, and Scotland. There were 104 CPRD general practices recruited to the antibiotic trial and 106 to the stroke trial, with the target number of practices being recruited within six months. Interventions were installed into practice information systems remotely over the internet. The mean number of participants per practice was 5,588 in the antibiotic trial and 110 in the stroke trial, with the coefficient of variation of practice sizes being 0.53 and 0.56, respectively. Outcome measures showed substantial correlations between the 12 months before and after intervention, with coefficients ranging from 0.42 for diastolic blood pressure to 0.91 for the proportion of consultations with antibiotics prescribed. Defining practice and participant eligibility for analysis requires careful consideration. Cluster randomized trials may be performed efficiently in large samples from UK general practices using the electronic health records of a primary care database. The geographical dispersal of trial sites presents a difficulty for

  12. Comparison of sampling methodologies for nutrient monitoring in streams: uncertainties, costs and implications for mitigation

    Science.gov (United States)

    Audet, J.; Martinsen, L.; Hasler, B.; de Jonge, H.; Karydi, E.; Ovesen, N. B.; Kronvang, B.

    2014-11-01

    Eutrophication of aquatic ecosystems caused by excess concentrations of nitrogen and phosphorus may have harmful consequences for biodiversity and poses a health risk to humans via water supplies. Reducing nitrogen and phosphorus losses to aquatic ecosystems involves implementing costly measures, and reliable monitoring methods are therefore essential for selecting appropriate mitigation strategies and evaluating their effects. Here, we compare the performance and costs of three methodologies for monitoring nutrients in rivers: grab sampling; time-proportional sampling; and passive sampling using flow-proportional samplers. Assuming hourly time-proportional sampling to be the best estimate of the "true" nutrient load, our results showed that the risk of obtaining wrong total nutrient load estimates with passive samplers is high, despite costs similar to those of time-proportional sampling. Our conclusion is that further development is needed before passive samplers can provide a reliable monitoring alternative. Grab sampling was the cheapest of the three methods and was more precise and accurate than passive sampling. We conclude that although monitoring employing time-proportional sampling is costly, its reliability precludes unnecessarily high implementation expenses.
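    The core comparison, an annual load estimated from dense time-proportional sampling versus sparse grab samples, can be illustrated on a synthetic concentration/flow series. All of the flow and concentration models below are assumptions for illustration, not the study's monitoring data.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    hours = 365 * 24
    # Hypothetical hourly river flow (m3/s): seasonal cycle plus stochastic runoff
    flow = (1.0
            + 0.5 * np.sin(2 * np.pi * np.arange(hours) / (24 * 90))
            + rng.gamma(2.0, 0.1, hours))
    # Hypothetical nutrient concentration (mg/L), roughly flow-correlated
    conc = np.clip(2.0 + 1.5 * (flow - flow.mean()), 0.1, None)

    # kg per hour = mg/L * m3/s * 3600 s * 1000 L/m3 / 1e6 mg/kg = conc*flow*3.6
    true_load = float(np.sum(conc * flow) * 3.6)      # hourly "time-proportional" reference

    # Fortnightly grab sampling: mean of sparse concentrations times mean flow
    grab_idx = np.arange(0, hours, 24 * 14)
    grab_load = float(conc[grab_idx].mean() * flow.mean() * 3.6 * hours)
    ```

    Because the grab estimate ignores the concentration-flow covariance, it is biased relative to the dense reference even before sparse-sampling noise is considered, which is why sampling methodology matters for mitigation accounting.
    
    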

  13. Sampling methodology and site selection in the National Eye Health Survey: an Australian population-based prevalence study.

    Science.gov (United States)

    Foreman, Joshua; Keel, Stuart; Dunn, Ross; van Wijngaarden, Peter; Taylor, Hugh R; Dirani, Mohamed

    2017-05-01

    This paper presents the sampling methodology of the National Eye Health Survey, which aimed to determine the prevalence of vision impairment and blindness in Australia. The National Eye Health Survey is a cross-sectional population-based survey of Indigenous Australians aged 40 years and older and non-Indigenous Australians aged 50 years and older residing in all levels of geographic remoteness in Australia. Using multistage, random-cluster sampling, 30 geographic areas were selected to provide samples of 3000 non-Indigenous Australians and 1400 Indigenous Australians. Sampling involved (i) selecting Statistical Area Level 2 (SA2) sites, stratified by remoteness; (ii) selecting Statistical Area Level 1 (SA1) sites within SA2 sites to provide targeted samples; and (iii) grouping contiguous SA1 sites or replacing SA1 sites to provide sufficient samples. The main outcome measures were the sites selected and the participants sampled in the survey. Thirty sites were generated, including 12 Major City sites, 6 Inner Regional sites, 6 Outer Regional sites, 4 Remote sites and 2 Very Remote sites. Three thousand and ninety-eight non-Indigenous participants and 1738 Indigenous participants were recruited. Selection of SA1 sites overestimated the number of eligible residents in all sites. Twenty percent (6/30) of SA1 sites were situated in non-residential bushland, and 26.67% (8/30) of SA1 populations had low eligibility or accessibility, requiring replacement. Representative samples of Indigenous and non-Indigenous Australians were selected, recruited and tested, providing the first national data on the prevalence of vision impairment and blindness in Australia. © 2016 Royal Australian and New Zealand College of Ophthalmologists.
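    The first stage of the design, drawing a fixed quota of sites per remoteness stratum at random from a geographic frame, can be sketched as below. The frame contents are hypothetical placeholders; only the per-stratum quotas (12/6/6/4/2, totalling 30 sites) come from the abstract.

    ```python
    import random

    random.seed(3)
    # Hypothetical sampling frame of (SA2 id, remoteness stratum) pairs;
    # the real survey drew on the Australian statistical geography frame.
    frame = [(f"SA2-{i:03d}", s)
             for i, s in enumerate(["Major City"] * 40 + ["Inner Regional"] * 20
                                   + ["Outer Regional"] * 20 + ["Remote"] * 10
                                   + ["Very Remote"] * 10)]

    # Per-stratum site quotas, as reported in the abstract (30 sites total)
    quota = {"Major City": 12, "Inner Regional": 6, "Outer Regional": 6,
             "Remote": 4, "Very Remote": 2}

    sites = []
    for stratum, k in quota.items():
        pool = [sa2 for sa2, s in frame if s == stratum]
        sites += random.sample(pool, k)   # simple random selection within stratum
    ```

    Later stages (SA1 selection within each chosen SA2, then grouping or replacing low-eligibility SA1s) would repeat the same stratified draw-and-check pattern one level down.
    
    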

  14. Methodological and ethical aspects of randomized controlled clinical trials in minors with malignant diseases.

    Science.gov (United States)

    Rothenberger, Lillian G; Henschel, Andreas Dirk; Schrey, Dominik; Becker, Andreas; Boos, Joachim

    2011-10-01

    Due to the new European regulations for pediatric medications, future clinical trials will include an increasing number of minors. It is therefore important to reconsider and evaluate recent methodological and ethical aspects of clinical trials in minors. The following questions were investigated: How are randomized controlled clinical trials (RCTs) performed in practice? Do investigators take into consideration biomedical ethical principles, as explicated for example by Beauchamp and Childress, when planning and conducting a trial? The study was conducted in a descriptive manner. A systematic, algorithm-guided search focusing on RCTs in minors with malignant diseases was carried out in PubMed. A total of 1,962 publications from 2001 to 2005 were placed in random sequence, and the first 1,000 were screened according to a priori defined inclusion criteria. One hundred seventy-five publications met the criteria and were reviewed using the SIGN methodological checklist (2004), the CONSORT Statement (2001, Methods section, items 3-12) and indicators for ethical aspects. Seventeen publications were checked by two raters. Information on randomization and blinding was often equivocal. The publications were mainly rated positive for the criteria of the SIGN checklist and mostly rated negative for the additional items of the CONSORT Statement. Regarding the ethical principles, only a few contributions were found in the publications. Inter-rater reliability was good. In the publications analyzed, we found only limited information concerning the methods of the trials and reflections on their ethical principles. Improvements are thus necessary and possible, and we suggest how such trials and their respective publications can be optimized in these respects. Copyright © 2011 Wiley-Liss, Inc.

  15. The project ENABLE II randomized controlled trial to improve palliative care for rural patients with advanced cancer: baseline findings, methodological challenges, and solutions.

    Science.gov (United States)

    Bakitas, Marie; Lyons, Kathleen Doyle; Hegel, Mark T; Balan, Stefan; Barnett, Kathleen N; Brokaw, Frances C; Byock, Ira R; Hull, Jay G; Li, Zhongze; McKinstry, Elizabeth; Seville, Janette L; Ahles, Tim A

    2009-03-01

    There is a paucity of randomized controlled trials (RCTs) evaluating models of palliative care. Although interventions vary, all have faced a variety of methodological challenges, including adequate recruitment, missing data, and contamination of the control group. We describe the ENABLE II intervention, methods, and sample baseline characteristics to increase intervention and methodological transparency, and to describe our solutions to selected methodological issues. Half of the participants recruited from our rural U.S. comprehensive cancer center and affiliated clinics were randomly assigned to a phone-based, nurse-led, educational care coordination palliative care intervention model. Intervention services were provided weekly for the first month and then monthly until death, including a bereavement follow-up call to the caregiver. The other half of the participants were assigned to care as usual. Symptoms, quality of life, mood, and functional status were assessed every 3 months until death. Baseline data for the 279 participants were similar to those of normative samples. Solutions to the methodological challenges of recruitment, missing data, and "usual care" control-group contamination are described. It is feasible to overcome many of the methodological challenges to conducting a rigorous palliative care RCT.

  16. Analytical Methodologies for the Determination of Endocrine Disrupting Compounds in Biological and Environmental Samples

    Directory of Open Access Journals (Sweden)

    Zoraida Sosa-Ferrera

    2013-01-01

    Full Text Available Endocrine-disruptor compounds (EDCs can mimic natural hormones and produce adverse effects in the endocrine functions by interacting with estrogen receptors. EDCs include both natural and synthetic chemicals, such as hormones, personal care products, surfactants, and flame retardants, among others. EDCs are characterised by their ubiquitous presence at trace-level concentrations and their wide diversity. Since the discovery of the adverse effects of these pollutants on wildlife and human health, analytical methods have been developed for their qualitative and quantitative determination. In particular, mass-based analytical methods show excellent sensitivity and precision for their quantification. This paper reviews recently published analytical methodologies for the sample preparation and for the determination of these compounds in different environmental and biological matrices by liquid chromatography coupled with mass spectrometry. The various sample preparation techniques are compared and discussed. In addition, recent developments and advances in this field are presented.

  17. How Does Sampling Methodology Influence Molecular Detection and Isolation Success in Influenza A Virus Field Studies?

    Science.gov (United States)

    Latorre-Margalef, Neus; Avril, Alexis; Tolf, Conny; Olsen, Björn; Waldenström, Jonas

    2015-12-11

    Wild waterfowl are important reservoir hosts for influenza A virus (IAV) and a potential source of spillover infections in other hosts, including poultry and swine. The emergence of highly pathogenic avian influenza (HPAI) viruses, such as H5N1 and H5N8, and subsequent spread along migratory flyways prompted the initiation of several programs in Europe, North America, and Africa to monitor circulation of HPAI and low-pathogenicity precursor viruses (low-pathogenicity avian influenza [LPAI] viruses). Given the costs of maintaining such programs, it is essential to establish best practice for field methodologies to provide robust data for epidemiological interpretation. Here, we use long-term surveillance data from a single site to evaluate the influence of a number of parameters on virus detection and isolation of LPAI viruses. A total of 26,586 samples (oropharyngeal, fecal, and cloacal) collected from wild mallards were screened by real-time PCR, and positive samples were subjected to isolation in embryonated chicken eggs. The LPAI virus detection rate was influenced by the sample type: cloacal/fecal samples showed a consistently higher detection rate and lower cycle threshold (Ct) value than oropharyngeal samples. Molecular detection was more sensitive than isolation, and virus isolation success was proportional to the number of RNA copies in the sample. Interestingly, for a given Ct value, the isolation success was lower in samples from adult birds than in those from juveniles. Comparing the results of specific real-time reverse transcriptase (RRT)-PCRs and of isolation, it was clear that coinfections were common in the investigated birds. The effects of sample type and detection methods warrant some caution in interpretation of the surveillance data. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  18. Methodology for randomized trials of patients with nonvariceal upper gastrointestinal bleeding: recommendations from an international consensus conference.

    Science.gov (United States)

    Laine, Loren; Spiegel, Brennan; Rostom, Alaa; Moayyedi, Paul; Kuipers, Ernst J; Bardou, Marc; Sung, Joseph; Barkun, Alan N

    2010-03-01

    The aim of this document is to provide a methodological framework for the design, performance, analysis, interpretation, and communication of randomized trials that assess management of patients with nonvariceal upper gastrointestinal bleeding. Literature searches were performed and an iterative process with electronic and face-to-face meetings was used to achieve consensus among panel members as part of an International Consensus Conference on Nonvariceal Upper Gastrointestinal Bleeding. Recommendations of the panel include the following. Randomized trials must explicitly state their primary hypothesis. A nonmanipulable randomization schedule with concealed allocation should be used. Stratification (e.g., for age and stigmata of hemorrhage) may be considered, especially in smaller studies. The patient and personnel providing care or recording information should be blinded. Inclusion criteria should be overt bleeding with endoscopy performed within 24 h or less. One type of lesion (e.g., ulcer) should be studied with stigmata to be included predefined. Use of placebo/no therapy vs. active controls depends on current standard practice. Standardizing study and key non-study interventions should ensure uniform provision of interventions. Criteria for repeat endoscopy and subsequent interventions should be predefined. The primary end point should be further bleeding (persistent and recurrent bleeding) with primary assessment at 7 days; mortality, with primary assessment at 30 days, would be appropriate in very large trials. Sample size calculation based on assumptions regarding primary end point results with regard to study intervention and control must be provided, and all patients enrolled must be accounted for. In general, the primary population for analysis is all patients randomized, although a per-protocol population may be used if this is the more conservative approach (e.g., equivalence study).

  19. Bee (Hymenoptera: Apoidea) Diversity and Sampling Methodology in a Midwestern USA Deciduous Forest.

    Science.gov (United States)

    McCravy, Kenneth W; Ruholl, Jared D

    2017-08-04

    Forests provide potentially important bee habitat, but little research has been done on forest bee diversity and the relative effectiveness of bee sampling methods in this environment. Bee diversity and sampling methodology were studied in an Illinois, USA upland oak-hickory forest using elevated and ground-level pan traps, malaise traps, and vane traps. A total of 854 bees representing 55 species were collected. Elevated pan traps collected the greatest number of bees (473), but ground-level pan traps collected greater species diversity (based on Simpson's diversity index) than did elevated pan traps. Elevated and ground-level pan traps collected the greatest bee species richness, with 43 and 39 species, respectively. An estimated sample size increase of over 18-fold would be required to approach minimum asymptotic richness using ground-level pan traps. Among pan trap colors/elevations, elevated yellow pan traps collected the greatest number of bees (266) but the lowest diversity. Malaise traps were relatively ineffective, collecting only 17 bees. Vane traps collected relatively low species richness (14 species), and Chao1 and abundance coverage estimators suggested that minimum asymptotic species richness was approached for that method. Bee species composition differed significantly between elevated pan traps, ground-level pan traps, and vane traps. Indicator species were significantly associated with each of these trap types, as well as with particular pan trap colors/elevations. These results indicate that Midwestern deciduous forests provide important bee habitat, and that the performance of common bee sampling methods varies substantially in this environment.
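
As a quick illustration of the diversity measures named in this abstract, the sketch below computes the inverse Simpson's diversity index (1/D) and the Chao1 minimum asymptotic richness estimator from per-species catch counts. The catch numbers are hypothetical, not taken from the study, and note that some papers report Simpson's diversity as 1 − D rather than 1/D.

```python
def simpsons_diversity(counts):
    """Inverse Simpson's index 1/D, where D = sum of squared proportions.
    Higher values indicate greater diversity."""
    n = sum(counts)
    d = sum((c / n) ** 2 for c in counts)
    return 1.0 / d

def chao1(counts):
    """Chao1 estimator of minimum asymptotic species richness:
    S_obs + f1^2 / (2*f2), with f1 singletons and f2 doubletons."""
    s_obs = len(counts)
    f1 = sum(1 for c in counts if c == 1)
    f2 = sum(1 for c in counts if c == 2)
    if f2 == 0:
        return s_obs + f1 * (f1 - 1) / 2.0  # bias-corrected variant
    return s_obs + f1 ** 2 / (2.0 * f2)

# hypothetical per-species catch counts for one trap type
catch = [120, 60, 30, 10, 5, 2, 2, 1, 1, 1]
print(round(simpsons_diversity(catch), 2))  # 2.83
print(chao1(catch))                         # 12.25
```

A Chao1 value well above the observed richness, as here (12.25 vs. 10 observed species), signals that many rare species remain undetected at the current sample size.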

  20. Effect of passive acoustic sampling methodology on detecting bats after declines from white nose syndrome

    Science.gov (United States)

    Coleman, Laci S.; Ford, W. Mark; Dobony, Christopher A.; Britzke, Eric R.

    2014-01-01

    Concomitant with the emergence and spread of white-nose syndrome (WNS) and precipitous decline of many bat species in North America, natural resource managers need modified and/or new techniques for bat inventory and monitoring that provide robust occupancy estimates. We used Anabat acoustic detectors to determine the most efficient passive acoustic sampling design for optimizing detection probabilities of multiple bat species in a WNS-impacted environment in New York, USA. Our sampling protocol included: six acoustic stations deployed for the entire duration of monitoring as well as a 4 x 4 grid and five transects of 5-10 acoustic units that were deployed for 6-8 night sample durations surveyed during the summers of 2011-2012. We used Program PRESENCE to determine detection probability and site occupancy estimates. Overall, the grid produced the highest detection probabilities for most species because it contained the most detectors and intercepted the greatest spatial area. However, big brown bats (Eptesicus fuscus) and species not impacted by WNS were detected easily regardless of sampling array. Endangered Indiana (Myotis sodalis) and little brown (Myotis lucifugus) and tri-colored bats (Perimyotis subflavus) showed declines in detection probabilities over our study, potentially indicative of continued WNS-associated declines. Identification of species presence through efficient methodologies is vital for future conservation efforts as bat populations decline further due to WNS and other factors.   
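
A useful back-of-envelope companion to the occupancy modeling described above is the cumulative detection probability over repeated survey nights. The sketch below uses hypothetical per-night probabilities (Program PRESENCE estimates these quantities formally from the data) and assumes independent nights with constant detection probability:

```python
import math

def cumulative_detection(p_night, nights):
    """P(detect at least once | site occupied) over independent survey
    nights, each with per-night detection probability p_night."""
    return 1.0 - (1.0 - p_night) ** nights

def nights_needed(p_night, target=0.95):
    """Smallest number of nights giving cumulative detection >= target."""
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p_night))

# hypothetical per-night detection probability of 0.3
print(round(cumulative_detection(0.3, 6), 3))  # 0.882 over a 6-night deployment
print(nights_needed(0.3))                      # 9 nights for 95% cumulative detection
```

This is why the 6-8 night deployments in the study can still yield high detection probabilities for common species while rare, WNS-impacted species with low per-night probabilities may go undetected.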

  1. A descriptive analysis of a representative sample of pediatric randomized controlled trials published in 2007

    Directory of Open Access Journals (Sweden)

    Thomson Denise

    2010-12-01

    Full Text Available Abstract Background Randomized controlled trials (RCTs) are the gold standard for trials assessing the effects of therapeutic interventions; therefore it is important to understand how they are conducted. Our objectives were to provide an overview of a representative sample of pediatric RCTs published in 2007 and assess the validity of their results. Methods We searched the Cochrane Central Register of Controlled Trials using a pediatric filter and randomly selected 300 RCTs published in 2007. We extracted data on trial characteristics; outcomes; methodological quality; reporting; and registration and protocol characteristics. Trial registration and protocol availability were determined for each study based on the publication, an Internet search and an author survey. Results Most studies (83%) were efficacy trials, 40% evaluated drugs, and 30% were placebo-controlled. Primary outcomes were specified in 41%; 43% reported on adverse events. At least one statistically significant outcome was reported in 77% of trials; 63% favored the treatment group. Trial registration was declared in 12% of publications and 23% were found through an Internet search. Risk of bias (ROB) was high in 59% of trials, unclear in 33%, and low in 8%. Registered trials were more likely to have low ROB than non-registered trials (16% vs. 5%; p = 0.008). Effect sizes tended to be larger for trials at high vs. low ROB (0.28, 95% CI 0.21, 0.35 vs. 0.16, 95% CI 0.07, 0.25). Among survey respondents (50% response rate), the most common reason for trial registration was a publication requirement and for non-registration, a lack of familiarity with the process. Conclusions More than half of this random sample of pediatric RCTs published in 2007 was at high ROB and three quarters of trials were not registered. There is an urgent need to improve the design, conduct, and reporting of child health research.

  2. [Methodological Aspects of the Sampling Design for the 2015 National Mental Health Survey].

    Science.gov (United States)

    Rodríguez, Nelcy; Rodríguez, Viviana Alejandra; Ramírez, Eugenia; Cediel, Sandra; Gil, Fabián; Rondón, Martín Alonso

    2016-12-01

    The WHO has encouraged the development, implementation and evaluation of policies related to mental health all over the world. In Colombia, within this framework and promoted by the Ministry of Health and Social Protection, as well as being supported by Colciencias, the fourth National Mental Health Survey (NMHST) was conducted using an observational cross-sectional study. According to the context and following the guidelines and sampling design, a summary of the methodology used for this sampling process is presented. The fourth NMHST used the Homes Master Sample for Studies in Health from the National System of Studies and Population Surveys for Health to calculate its sample. This Master Sample was developed and implemented in 2013 by the Ministry of Social Protection. This study included the non-institutionalised civilian population divided into four age groups: children aged 7-11 years, adolescents aged 12-17 years, adults aged 18-44 years, and adults aged 45 years or older. The sample size calculation was based on the prevalences reported in other studies for the outcomes of mental disorders, depression, suicide, associated morbidity, and alcohol use. A probabilistic, cluster, stratified and multistage selection process was used. Expansion factors to the total population were calculated. A total of 15,351 completed surveys were collected, distributed according to the age groups as follows: 2727 for ages 7-11 years, 1754 for ages 12-17 years, 5889 for ages 18-44 years, and 4981 for ages ≥45 years. All the surveys were distributed across five regions: Atlantic, Oriental, Bogotá, Central and Pacific. A sufficient number of surveys were collected in this study to obtain a more precise approximation of the mental problems and disorders at the regional and national level. Copyright © 2016 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
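
Expansion factors in a multistage design such as this are conventionally the inverse of the product of the stage-wise selection probabilities, so each respondent "stands for" that many people in the population. The sketch below uses made-up probabilities (not values from the survey) and a Horvitz-Thompson style total:

```python
def expansion_factor(stage_probs):
    """Design weight for a respondent in a multistage sample: the inverse
    of the product of the selection probabilities at each sampling stage."""
    p = 1.0
    for prob in stage_probs:
        p *= prob
    return 1.0 / p

def estimated_total(values, weights):
    """Horvitz-Thompson style estimate of a population total."""
    return sum(v * w for v, w in zip(values, weights))

# hypothetical stages: segment, household, person within household
w = expansion_factor([0.01, 0.05, 0.5])
print(round(w, 3))                                  # each respondent represents ~4000 people
print(round(estimated_total([1, 0, 1], [w, w, w]), 3))
```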

  3. Fourier based methodology for simulating 2D-random shapes in heterogeneous materials

    Science.gov (United States)

    Mattrand, C.; Béakou, A.; Charlet, K.

    2015-08-01

    Gaining insights into the effects of microstructural details on materials behavior may be achieved by incorporating their attributes into numerical modeling. This requires considerable effort to characterize the morphology distributions of heterogeneities and their spatial arrangement. This paper focuses on modeling the scatter observed in the geometry of material heterogeneities. The proposed strategy is based on the decomposition of a 1D shape-signature function, representing the 2D section of a given shape, on Fourier basis functions. The Fourier coefficients are then considered as random variables. This methodology has been applied to flax fibers, which are gradually being introduced into composite materials as a potential alternative to synthetic reinforcements. In this contribution, the influence of some underlying assumptions regarding the choice of the 1D shape-signature function, its discretization scheme and truncation level, and the best way of modeling the associated random variables is also investigated. Some configurations arising from the combination of these tuning parameters are found to reproduce the morphometric factors of the observed fibers with sufficient statistical fidelity.
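
A minimal sketch of the general idea (not the authors' implementation): sample a centroid-distance signature r(θ) of a 2D section at equally spaced angles, take its Fourier transform, and truncate to a few harmonics. The ellipse below stands in for a fiber cross-section; treating the retained coefficients as random variables would then generate new, statistically similar shapes.

```python
import numpy as np

def truncated_signature(r, n_keep):
    """Truncate the Fourier series of a sampled 1D shape signature r(theta)
    (centroid distance at equally spaced angles) to the first n_keep
    harmonics and reconstruct it."""
    coeffs = np.fft.rfft(r)
    kept = np.zeros_like(coeffs)
    kept[:n_keep + 1] = coeffs[:n_keep + 1]  # DC term + n_keep harmonics
    return np.fft.irfft(kept, n=len(r))

# stand-in 2D section: an ellipse's centroid-distance signature
theta = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)
a, b = 2.0, 1.0
r = a * b / np.sqrt((b * np.cos(theta)) ** 2 + (a * np.sin(theta)) ** 2)

r_hat = truncated_signature(r, n_keep=8)
print(float(np.max(np.abs(r - r_hat))))  # small truncation error
```

The truncation level trades compactness of the random-variable model against geometric fidelity, which is exactly the tuning parameter the abstract discusses.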

  4. Sample size calculations for 3-level cluster randomized trials

    NARCIS (Netherlands)

    Teerenstra, S.; Moerbeek, M.; Achterberg, T. van; Pelzer, B.J.; Borm, G.F.

    2008-01-01

    BACKGROUND: The first applications of cluster randomized trials with three instead of two levels are beginning to appear in health research, for instance, in trials where different strategies to implement best-practice guidelines are compared. In such trials, the strategy is implemented in health
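
For context, sample-size calculations for cluster randomized trials typically inflate an individually randomized sample size by a design effect. A commonly cited three-level form is sketched below; this is a generic textbook expression under equal cluster sizes, not necessarily the exact formula derived in this paper.

```python
import math

def design_effect_3level(n_per_cluster, clusters_per_site, icc2, icc3):
    """Generic design effect for a trial randomized at the top of a
    3-level hierarchy (subjects in clusters in sites):
        DEFF = 1 + (n - 1)*icc2 + n*(k - 1)*icc3
    where icc2 is the within-cluster and icc3 the within-site,
    between-cluster intraclass correlation."""
    n, k = n_per_cluster, clusters_per_site
    return 1.0 + (n - 1) * icc2 + n * (k - 1) * icc3

def inflate_sample_size(n_individual, n_per_cluster, clusters_per_site, icc2, icc3):
    """Multiply an individually randomized sample size by the design effect
    (rounded slightly before ceiling to avoid float artifacts)."""
    deff = design_effect_3level(n_per_cluster, clusters_per_site, icc2, icc3)
    return math.ceil(round(n_individual * deff, 8))

print(round(design_effect_3level(20, 5, 0.05, 0.01), 4))  # 2.75
print(inflate_sample_size(400, 20, 5, 0.05, 0.01))        # 1100
```

Even small intraclass correlations more than double the required sample size here, which is why three-level designs need dedicated sample-size methodology.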

  6. Improved estimator of finite population mean using auxiliary attribute in stratified random sampling

    OpenAIRE

    Verma, Hemant K.; Sharma, Prayas; Singh, Rajesh

    2014-01-01

    The present study discusses the problem of estimating the finite population mean using an auxiliary attribute in stratified random sampling. In this paper, taking advantage of the point bi-serial correlation between the study variable and the auxiliary attribute, we have improved the estimation of the population mean in stratified random sampling. The expressions for bias and mean square error have been derived under stratified random sampling. In addition, an empirical study has been carried out to examin...
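
For reference, the classical stratified estimator that such proposals build on combines stratum sample means with weights W_h = N_h/N. A minimal sketch with made-up strata (the auxiliary-attribute refinement of the paper is not shown):

```python
def stratified_mean(strata):
    """y_st = sum(W_h * ybar_h), with stratum weight W_h = N_h / N."""
    N = sum(s["N"] for s in strata)
    return sum(s["N"] / N * s["mean"] for s in strata)

def stratified_variance(strata):
    """Variance of y_st under stratified SRS without replacement:
    sum(W_h^2 * (1 - n_h/N_h) * s_h^2 / n_h)."""
    N = sum(s["N"] for s in strata)
    return sum((s["N"] / N) ** 2 * (1 - s["n"] / s["N"]) * s["var"] / s["n"]
               for s in strata)

# made-up two-stratum population: sizes N, sample sizes n, sample means/variances
strata = [
    {"N": 600, "n": 60, "mean": 10.0, "var": 4.0},
    {"N": 400, "n": 40, "mean": 20.0, "var": 9.0},
]
print(stratified_mean(strata))                # 14.0
print(round(stratified_variance(strata), 4))  # 0.054
```

Improved estimators like the one proposed aim to shrink this variance further by exploiting correlation with auxiliary information within each stratum.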

  7. A Family of Estimators of a Sensitive Variable Using Auxiliary Information in Stratified Random Sampling

    Directory of Open Access Journals (Sweden)

    Nadia Mushtaq

    2017-03-01

    Full Text Available In this article, a combined general family of estimators is proposed for estimating the finite population mean of a sensitive variable in stratified random sampling with a non-sensitive auxiliary variable, based on the randomized response technique. Under the stratified random sampling without replacement scheme, the expressions for bias and mean square error (MSE) up to first-order approximation are derived. Theoretical and empirical results through a simulation study show that the proposed class of estimators is more efficient than the existing estimators, i.e., the usual stratified random sample mean estimator and the Sousa et al. (2014) ratio and regression estimators of the sensitive variable in stratified sampling.
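
Randomized response techniques for quantitative sensitive variables often use an additive scrambling model: respondents report y + s, with s drawn from a distribution whose mean is known, so the sensitive mean is recovered by subtracting E[s]. The sketch below uses simulated data and is a generic illustration of that idea, not the estimator family proposed in the article:

```python
import random
import statistics

def scrambled_mean_estimate(reports, scramble_mean):
    """Additive randomized-response model: each respondent reports
    z = y + s with s from a known scrambling distribution, so
    E[z] - E[s] estimates the sensitive mean E[y]."""
    return statistics.mean(reports) - scramble_mean

random.seed(42)
true_values = [random.gauss(50.0, 10.0) for _ in range(2000)]  # sensitive y (unobserved)
reports = [y + random.gauss(5.0, 2.0) for y in true_values]    # y plus known noise
est = scrambled_mean_estimate(reports, scramble_mean=5.0)
print(round(est, 1))  # close to the true mean of 50
```

The privacy comes from the noise making any single report non-incriminating; the statistical cost is extra variance, which the auxiliary-variable estimators in the article are designed to offset.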

  8. Municipal solid waste composition: Sampling methodology, statistical analyses, and case study evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Edjabou, Maklawe Essonanawe, E-mail: vine@env.dtu.dk [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark); Jensen, Morten Bang; Götze, Ramona; Pivnenko, Kostyantyn [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark); Petersen, Claus [Econet AS, Omøgade 8, 2.sal, 2100 Copenhagen (Denmark); Scheutz, Charlotte; Astrup, Thomas Fruergaard [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark)

    2015-02-15

    Highlights: • Tiered approach to waste sorting ensures flexibility and facilitates comparison of solid waste composition data. • Food and miscellaneous wastes are the main fractions contributing to the residual household waste. • Separation of food packaging from food leftovers during sorting is not critical for determination of the solid waste composition. - Abstract: Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in literature. This limits both comparability and applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single and multi-family house areas). In total 17 tonnes of waste were sorted into 10–50 waste fractions, organised according to a three-level (tiered) approach, facilitating comparison of the waste data between individual sub-areas with different fractionation (waste from one municipality was sorted at “Level III”, i.e. in detail, while the two others were sorted only at “Level I”). The results showed that residual household waste mainly contained food waste (42 ± 5%, mass per wet basis) and miscellaneous combustibles (18 ± 3%, mass per wet basis). The residual household waste generation rate in the study areas was 3–4 kg per person per week. Statistical analyses revealed that the waste composition was independent of variations in the waste generation rate. Both waste composition and waste generation rates were statistically similar for each of the three municipalities. While the waste generation rates were similar for each of the two housing types (single

  9. English Language Teaching in Spain: Do Textbooks Comply with the Official Methodological Regulations? A Sample Analysis

    Directory of Open Access Journals (Sweden)

    Aquilino Sánchez

    2009-06-01

    Full Text Available The goal of this paper is to verify up to what point ELT textbooks used in Spanish educational settings comply with the official regulations prescribed, which fully advocate the Communicative Language Teaching Method (CLT). For that purpose, seven representative coursebooks of different educational levels and modalities in Spain – secondary, upper secondary, teenager and adult textbooks – were selected to be analysed. A full unit randomly selected from each coursebook was examined through the parameters of the communicative potential of the activities – measured on a scale from 0 to 10 – and the communicative nature of the methodological strategies implemented – measured on a dichotomous scale (yes/no). Global results per educational level point to the prevailing communicative nature of all the materials, which was shown to be above 50%. The remaining non-communicative block was covered by activities focused on the formal features of language (grammar and vocabulary). This resulting degree of dissociation between official regulations and what is really found in teaching materials may be positive, since the learning of languages is complex and results from the intervention of multiple factors and learning styles, as is evidenced by the professional experience of teachers from different backgrounds and beliefs.

  10. Validation of the 2008 Landsat Burned Area Ecv Product for North America Using Stratified Random Sampling

    Science.gov (United States)

    Brunner, N. M.; Mladinich, C. S.; Caldwell, M. K.; Beal, Y. J. G.

    2014-12-01

    The U.S. Geological Survey is generating a suite of Essential Climate Variables (ECVs) products, as defined by the Global Climate Observing System, from the Landsat data archive. Validation protocols for these products are being established, incorporating the Committee on Earth Observing Satellites Land Product Validation Subgroup's best practice guidelines and validation hierarchy stages. The sampling design and accuracy measures follow the methodology developed by the European Space Agency's Climate Change Initiative Fire Disturbance (fire_cci) project (Padilla and others, 2014). A rigorous validation was performed on the 2008 Burned Area ECV (BAECV) prototype product, using a stratified random sample of 48 Thiessen scene areas overlaying Landsat path/rows distributed across several terrestrial biomes throughout North America. The validation reference data consisted of fourteen sample sites acquired from the fire_cci project, with the remaining new sample sites generated from a densification of the stratified sampling for North America. The reference burned area polygons were generated using the ABAMS (Automatic Burned Area Mapping) software (Bastarrika and others, 2011; Izagirre, 2014). Accuracy results will be presented indicating strengths and weaknesses of the BAECV algorithm.
    References:
    Bastarrika, A., Chuvieco, E., and Martín, M.P., 2011, Mapping burned areas from Landsat TM/ETM+ data with a two-phase algorithm: Balancing omission and commission errors: Remote Sensing of Environment, v. 115, no. 4, p. 1003-1012.
    Izagirre, A.B., 2014, Automatic Burned Area Mapping Software (ABAMS), Preliminary Documentation, Version 10 v4: Vitoria-Gasteiz, Spain, University of Basque Country, p. 27.
    Padilla, M., Chuvieco, E., Hantson, S., Theis, R., and Sandow, C., 2014, D2.1 - Product Validation Plan: UAH - University of Alcalá de Henares (Spain), 37 p.

  11. Nonlinear Methodologies for Identifying Seismic Event and Nuclear Explosion Using Random Forest, Support Vector Machine, and Naive Bayes Classification

    Directory of Open Access Journals (Sweden)

    Longjun Dong

    2014-01-01

    Full Text Available The discrimination of seismic events and nuclear explosions is a complex and nonlinear problem. The nonlinear methodologies including Random Forests (RF), Support Vector Machines (SVM), and the Naïve Bayes Classifier (NBC) were applied to discriminate seismic events. Twenty earthquakes and twenty-seven explosions, with nine ratios of the energies contained within predetermined “velocity windows” and the calculated distance, are used as discriminators. Based on leave-one-out cross-validation, ROC curves, and the calculated accuracies for training and test samples, the discriminating performances of RF, SVM, and NBC were discussed and compared. The RF method clearly shows the best predictive power, with a maximum area under the ROC curve of 0.975 among RF, SVM, and NBC. The discriminant accuracies of RF, SVM, and NBC for test samples are 92.86%, 85.71%, and 92.86%, respectively. It has been demonstrated that the presented RF model can not only identify seismic events automatically with high accuracy, but can also rank the discriminant indicators according to their calculated weights.
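
To make the evaluation procedure concrete, here is a self-contained sketch of one of the three classifiers (a Gaussian Naive Bayes) scored with leave-one-out cross-validation on simulated two-class "energy ratio" features. The data, feature count, and class separation are hypothetical, not the study's 47 events or nine ratios:

```python
import math
import random

class GaussianNB:
    """Minimal Gaussian Naive Bayes: per-class feature means and variances."""
    def fit(self, X, y):
        self.classes = sorted(set(y))
        self.priors, self.stats = {}, {}
        for c in self.classes:
            rows = [x for x, lab in zip(X, y) if lab == c]
            self.priors[c] = len(rows) / len(X)
            params = []
            for col in zip(*rows):
                mu = sum(col) / len(col)
                var = sum((v - mu) ** 2 for v in col) / len(col)
                params.append((mu, max(var, 1e-9)))  # guard against zero variance
            self.stats[c] = params
        return self

    def predict(self, x):
        def log_posterior(c):
            lp = math.log(self.priors[c])
            for v, (mu, var) in zip(x, self.stats[c]):
                lp += -0.5 * math.log(2.0 * math.pi * var) - (v - mu) ** 2 / (2.0 * var)
            return lp
        return max(self.classes, key=log_posterior)

def loo_accuracy(X, y):
    """Leave-one-out cross-validation: train on all but one point, test on it."""
    hits = 0
    for i in range(len(X)):
        model = GaussianNB().fit(X[:i] + X[i + 1:], y[:i] + y[i + 1:])
        hits += model.predict(X[i]) == y[i]
    return hits / len(X)

# simulated two-feature data: class 0 earthquakes, class 1 explosions
random.seed(1)
X = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(20)] + \
    [[random.gauss(2, 1), random.gauss(2, 1)] for _ in range(20)]
y = [0] * 20 + [1] * 20
print(loo_accuracy(X, y))
```

Leave-one-out is attractive for small datasets like this one because every event serves as a test case exactly once, at the cost of refitting the model N times.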

  12. Computer Corner: A Note on Pascal's Triangle and Simple Random Sampling.

    Science.gov (United States)

    Wright, Tommy

    1989-01-01

    Describes the algorithm used to select a simple random sample of a given size without having to list all possible samples, along with a justification based on Pascal's triangle. Provides testing results from various computers. (YP)
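
One standard algorithm fitting this description is selection sampling (Knuth's Algorithm S): scan the population once and take each item with probability (still needed)/(still remaining). Its correctness rests on the Pascal's-triangle identity C(N, n) = C(N−1, n−1) + C(N−1, n): a sample either contains the current item or it does not. Whether this is the exact algorithm of the article is an assumption; the sketch below illustrates the technique:

```python
import random

def selection_sample(population, n, rng=random):
    """Simple random sample of size n without enumerating all C(N, n)
    possible samples (Knuth's 'selection sampling', Algorithm S): take
    each remaining item with probability (still needed)/(still remaining).
    The recursion mirrors Pascal's triangle:
        C(N, n) = C(N-1, n-1) + C(N-1, n)."""
    sample, needed = [], n
    remaining = len(population)
    for item in population:
        if rng.random() * remaining < needed:  # select this item
            sample.append(item)
            needed -= 1
            if needed == 0:
                break
        remaining -= 1
    return sample

random.seed(7)
print(selection_sample(list(range(1, 101)), 5))  # five distinct values, in order
```

Because the selection probability reaches 1 when the remaining items exactly equal the number still needed, the algorithm always returns exactly n items, each possible sample being equally likely.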

  13. What about N? A methodological study of sample-size reporting in focus group studies

    Science.gov (United States)

    2011-01-01

    Background Focus group studies are increasingly published in health-related journals, but we know little about how researchers use this method, particularly how they determine the number of focus groups to conduct. The methodological literature commonly advises researchers to follow principles of data saturation, although practical advice on how to do this is lacking. Our objectives were firstly, to describe the current status of sample size in focus group studies reported in health journals. Secondly, to assess whether and how researchers explain the number of focus groups they carry out. Methods We searched PubMed for studies that had used focus groups and that had been published in open access journals during 2008, and extracted data on the number of focus groups and on any explanation authors gave for this number. We also did a qualitative assessment of the papers with regard to how the number of groups was explained and discussed. Results We identified 220 papers published in 117 journals. In these papers insufficient reporting of sample sizes was common. The number of focus groups conducted varied greatly (mean 8.4, median 5, range 1 to 96). Thirty-seven (17%) studies attempted to explain the number of groups. Six studies referred to rules of thumb in the literature, three stated that they were unable to organize more groups for practical reasons, while 28 studies stated that they had reached a point of saturation. Among those stating that they had reached a point of saturation, several appeared not to have followed principles from grounded theory, where data collection and analysis is an iterative process until saturation is reached. Studies with high numbers of focus groups did not offer explanations for the number of groups. Too much data as a study weakness was not an issue discussed in any of the reviewed papers. Conclusions Based on these findings we suggest that journals adopt more stringent requirements for focus group method reporting. The often poor and

  16. Post-trial follow-up methodology in large randomized controlled trials: a systematic review protocol.

    Science.gov (United States)

    Llewellyn-Bennett, Rebecca; Bowman, Louise; Bulbulia, Richard

    2016-12-15

    Clinical trials typically have a relatively short follow-up period, and may both underestimate potential benefits of treatments investigated, and fail to detect hazards, which can take much longer to emerge. Prolonged follow-up of trial participants after the end of the scheduled trial period can provide important information on both efficacy and safety outcomes. This protocol describes a systematic review to qualitatively compare methods of post-trial follow-up used in large randomized controlled trials. A systematic search of electronic databases and clinical trial registries will use a predefined search strategy. All large (more than 1000 adult participants) randomized controlled trials will be evaluated. Two reviewers will screen and extract data according to this protocol with the aim of 95% concordance of papers checked and discrepancies will be resolved by a third reviewer. Trial methods, participant retention rates and prevalence of missing data will be recorded and compared. The potential for bias will be evaluated using the Cochrane Risk of Bias tool (applied to the methods used during the in-trial period) with the aim of investigating whether the quality of the post-trial follow-up methodology might be predicted by the quality of the methods used for the original trial. Post-trial follow-up can provide valuable information about the long-term benefits and hazards of medical interventions. However, it can be logistically challenging and costly. The aim of this systematic review is to describe how trial participants have been followed-up post-trial in order to inform future post-trial follow-up designs. Not applicable for PROSPERO registration.

  17. Effective speed and agility conditioning methodology for random intermittent dynamic type sports.

    Science.gov (United States)

    Bloomfield, Jonathan; Polman, Remco; O'Donoghue, Peter; McNaughton, Lars

    2007-11-01

    Different coaching methods are often used to improve performance. This study compared the effectiveness of 2 methodologies for speed and agility conditioning for random, intermittent, and dynamic activity sports (e.g., soccer, tennis, hockey, basketball, rugby, and netball) and the necessity for specialized coaching equipment. Two groups received either a programmed method (PC) or a random method (RC) of conditioning, with a third group receiving no conditioning (NC). PC participants used the speed, agility, quickness (SAQ) conditioning method, and RC participants played supervised small-sided soccer games. PC was also subdivided into 2 groups where participants either used specialized SAQ equipment or no equipment. A total of 46 (25 males and 21 females) untrained participants received (mean ± SD) 12.2 ± 2.1 hours of physical conditioning over 6 weeks between a battery of speed and agility parameter field tests. Two-way analysis of variance results indicated that both conditioning groups showed a significant decrease in body mass and body mass index, although PC achieved significantly greater improvements in acceleration, deceleration, leg power, dynamic balance, and the overall summation of % increases when compared to RC and NC (p < 0.05). PC in the form of SAQ exercises appears to be a superior method for improving speed and agility parameters; however, this study found that specialized SAQ equipment was not a requirement to observe significant improvements. Further research is required to establish whether these benefits transfer to sport-specific tasks as well as to the underlying mechanisms resulting in improved performance.

  18. Methodologies for the Extraction of Phenolic Compounds from Environmental Samples: New Approaches

    Directory of Open Access Journals (Sweden)

    Cristina Mahugo Santana

    2009-01-01

    Full Text Available Phenolic derivatives are among the most important contaminants present in the environment. These compounds are used in several industrial processes to manufacture chemicals such as pesticides, explosives, drugs and dyes. They are also used in the bleaching process of paper manufacturing. Apart from these sources, phenolic compounds have substantial applications in agriculture as herbicides, insecticides and fungicides. However, phenolic compounds are not only generated by human activity; they are also formed naturally, e.g., during the decomposition of leaves or wood. As a result of these applications, they are found in soils and sediments and this often leads to wastewater and ground water contamination. Owing to their high toxicity and persistence in the environment, both the US Environmental Protection Agency (EPA) and the European Union have included some of them in their lists of priority pollutants. Current standard methods of phenolic compound analysis in water samples are based on liquid–liquid extraction (LLE), while Soxhlet extraction is the most used technique for isolating phenols from solid matrices. However, these techniques require extensive cleanup procedures that are time-intensive and involve expensive and hazardous organic solvents, which are undesirable for health and disposal reasons. In recent years, the use of new methodologies such as solid-phase extraction (SPE) and solid-phase microextraction (SPME) has increased for the extraction of phenolic compounds from liquid samples. In the case of solid samples, microwave assisted extraction (MAE) has been demonstrated to be an efficient technique for the extraction of these compounds. In this work we review the methods developed for the extraction and determination of phenolic derivatives in different types of environmental matrices such as water, sediments and soils. Moreover, we present the new approach in the use of micellar media coupled with SPME process for the

  19. Methodologies for the extraction of phenolic compounds from environmental samples: new approaches.

    Science.gov (United States)

    Mahugo Santana, Cristina; Sosa Ferrera, Zoraida; Esther Torres Padrón, M; Juan Santana Rodríguez, José

    2009-01-09

    Phenolic derivatives are among the most important contaminants present in the environment. These compounds are used in several industrial processes to manufacture chemicals such as pesticides, explosives, drugs and dyes. They are also used in the bleaching process of paper manufacturing. Apart from these sources, phenolic compounds have substantial applications in agriculture as herbicides, insecticides and fungicides. However, phenolic compounds are not only generated by human activity; they are also formed naturally, e.g., during the decomposition of leaves or wood. As a result of these applications, they are found in soils and sediments, and this often leads to wastewater and ground water contamination. Owing to their high toxicity and persistence in the environment, both the US Environmental Protection Agency (EPA) and the European Union have included some of them in their lists of priority pollutants. Current standard methods for the analysis of phenolic compounds in water samples are based on liquid-liquid extraction (LLE), while Soxhlet extraction is the most used technique for isolating phenols from solid matrices. However, these techniques require extensive cleanup procedures that are time-intensive and involve expensive and hazardous organic solvents, which are undesirable for health and disposal reasons. In recent years, the use of new methodologies such as solid-phase extraction (SPE) and solid-phase microextraction (SPME) has increased for the extraction of phenolic compounds from liquid samples. In the case of solid samples, microwave-assisted extraction (MAE) has been demonstrated to be an efficient technique for the extraction of these compounds. In this work we review the methods developed for the extraction and determination of phenolic derivatives in different types of environmental matrices such as water, sediments and soils. Moreover, we present the new approach of using micellar media coupled with the SPME process for the extraction of phenolic

  20. Exponential ratio-product type estimators under second order approximation in stratified random sampling

    OpenAIRE

    Singh, Rajesh; Sharma, Prayas; Smarandache, Florentin

    2014-01-01

    Singh et al. (2009) introduced a family of exponential ratio and product type estimators in stratified random sampling. Under a stratified random sampling without replacement scheme, the expressions for the bias and mean square error (MSE) of the Singh et al. (2009) estimators and some other estimators are derived up to the first- and second-order approximations. The theoretical findings are also supported by a numerical example.
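As context for the estimators in this record, a minimal sketch of the classical combined ratio estimator under stratified simple random sampling without replacement, the baseline that exponential ratio-product type estimators refine. The strata data below are invented toy values (y roughly proportional to the auxiliary x), not from the paper:

```python
import random

def stratified_ratio_estimate(strata, n_per_stratum, seed=0):
    """Combined ratio estimator of the population mean of y under
    stratified SRSWOR, using an auxiliary variable x whose population
    mean is known. `strata` is a list of (y_list, x_list) pairs."""
    rng = random.Random(seed)
    N = sum(len(y) for y, _ in strata)
    x_bar_pop = sum(sum(x) for _, x in strata) / N   # known auxiliary mean
    y_st = x_st = 0.0
    for (y, x), n in zip(strata, n_per_stratum):
        W = len(y) / N                               # stratum weight N_h / N
        idx = rng.sample(range(len(y)), n)           # SRS without replacement
        y_st += W * sum(y[i] for i in idx) / n
        x_st += W * sum(x[i] for i in idx) / n
    return y_st * x_bar_pop / x_st                   # combined ratio estimate

# toy population: two strata where y is roughly proportional to x
strata = [([2.0, 4.1, 6.2, 8.1], [1, 2, 3, 4]),
          ([10.3, 12.0, 14.2], [5, 6, 7])]
est = stratified_ratio_estimate(strata, [2, 2])
```

The ratio adjustment y_st * (X̄ / x_st) exploits the y–x correlation; the exponential variants in the record replace this simple ratio with exponential functions of (X̄ − x_st).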

  1. Query-Based Sampling: Can we do Better than Random?

    NARCIS (Netherlands)

    Tigelaar, A.S.; Hiemstra, Djoerd

    2010-01-01

    Many servers on the web offer content that is only accessible via a search interface. These are part of the deep web. Using conventional crawling to index the content of these remote servers is impossible without some form of cooperation. Query-based sampling provides an alternative to crawling

  2. Stratified random sampling plan for an irrigation customer telephone survey

    Energy Technology Data Exchange (ETDEWEB)

    Johnston, J.W.; Davis, L.J.

    1986-05-01

    This report describes the procedures used to design and select a sample for a telephone survey of individuals who use electricity in irrigating agricultural cropland in the Pacific Northwest. The survey is intended to gather information on the irrigated agricultural sector that will be useful for conservation assessment, load forecasting, rate design, and other regional power planning activities.
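Survey designs like this one must decide how many interviews to allocate to each stratum. A minimal sketch of Neyman (optimal) allocation, a standard choice for stratified telephone surveys; the stratum sizes and standard deviations below are hypothetical, not taken from the report:

```python
def neyman_allocation(stratum_sizes, stratum_sds, total_n):
    """Neyman allocation: n_h proportional to N_h * S_h.
    Returns integer allocations summing to total_n (largest-remainder rounding)."""
    weights = [N * S for N, S in zip(stratum_sizes, stratum_sds)]
    total = sum(weights)
    raw = [total_n * w / total for w in weights]
    alloc = [int(r) for r in raw]
    # hand out the remaining units by largest fractional part
    remainder = total_n - sum(alloc)
    order = sorted(range(len(raw)), key=lambda i: raw[i] - alloc[i], reverse=True)
    for i in order[:remainder]:
        alloc[i] += 1
    return alloc

# hypothetical irrigation strata: customer counts and SDs of electricity use
alloc = neyman_allocation([1200, 800, 400], [50.0, 30.0, 10.0], 300)
```

Strata with more customers or more variable usage receive proportionally more of the 300 interviews, which minimizes the variance of the overall estimate for a fixed sample size.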

  3. The Hubble Space Telescope Medium Deep Survey Cluster Sample: Methodology and Data

    Science.gov (United States)

    Ostrander, E. J.; Nichol, R. C.; Ratnatunga, K. U.; Griffiths, R. E.

    1998-12-01

    We present a new, objectively selected, sample of galaxy overdensities detected in the Hubble Space Telescope Medium Deep Survey (MDS). These clusters/groups were found using an automated procedure that involved searching for statistically significant galaxy overdensities. The contrast of the clusters against the field galaxy population is increased when morphological data are used to search around bulge-dominated galaxies. In total, we present 92 overdensities above a probability threshold of 99.5%. We show, via extensive Monte Carlo simulations, that at least 60% of these overdensities are likely to be real clusters and groups and not random line-of-sight superpositions of galaxies. For each overdensity in the MDS cluster sample, we provide a richness and the average of the bulge-to-total ratio of galaxies within each system. This MDS cluster sample potentially contains some of the most distant clusters/groups ever detected, with about 25% of the overdensities having estimated redshifts z > ~0.9. We have made this sample publicly available to facilitate spectroscopic confirmation of these clusters and to aid more detailed studies of cluster and galaxy evolution. We also report the serendipitous discovery of a new cluster close on the sky to the rich optical cluster Cl 0016+16 at z = 0.546. This new overdensity, HST 001831+16208, may be coincident with both an X-ray source and a radio source. HST 001831+16208 is the third cluster/group discovered near to Cl 0016+16 and appears to strengthen the claims of Connolly et al. of superclustering at high redshift.

  4. Methodological reporting quality of randomized controlled trials: A survey of seven core journals of orthopaedics from Mainland China over 5 years following the CONSORT statement.

    Science.gov (United States)

    Zhang, J; Chen, X; Zhu, Q; Cui, J; Cao, L; Su, J

    2016-11-01

    In recent years, the number of randomized controlled trials (RCTs) in the field of orthopaedics has been increasing in Mainland China. However, RCTs are prone to bias if they lack methodological quality. Therefore, we performed a survey of RCTs to assess: (1) What is the quality of RCTs in the field of orthopaedics in Mainland China? (2) Is there a difference between the core journals of the Chinese department of orthopaedics and Orthopaedics Traumatology Surgery & Research (OTSR)? This research aimed to evaluate the methodological reporting quality, according to the CONSORT statement, of RCTs in seven key orthopaedic journals published in Mainland China over the 5 years from 2010 to 2014. All of the articles were hand-searched in the Chongqing VIP database between 2010 and 2014. Studies were considered eligible if the words "random", "randomly", "randomization" or "randomized" were employed to describe the allocation method. Trials including animals or cadavers, trials published as abstracts or case reports, trials dealing with subgroup analyses, and trials without outcomes were excluded. In addition, eight articles selected from Orthopaedics Traumatology Surgery & Research (OTSR) between 2010 and 2014 were included in this study for comparison. The identified RCTs were analyzed using a modified version of the Consolidated Standards of Reporting Trials (CONSORT) statement, including sample size calculation, allocation sequence generation, allocation concealment, blinding and handling of dropouts. A total of 222 RCTs were identified in the seven core orthopaedic journals. No trials reported an adequate sample size calculation, 74 (33.4%) reported adequate allocation generation, 8 (3.7%) trials reported adequate allocation concealment, 18 (8.1%) trials reported adequate blinding and 16 (7.2%) trials reported handling of dropouts. In OTSR, 1 (12.5%) trial reported an adequate sample size calculation, 4 (50.0%) reported adequate

  5. A systematic review of trends in the methodological quality of randomized controlled trials in various research fields.

    Science.gov (United States)

    Falagas, Matthew E; Grigori, Tatiana; Ioannidou, Eleni

    2009-03-01

    We sought to evaluate the trends in the methodological quality of randomized controlled trials in various medical fields. Relevant studies were retrieved from the PubMed and ISI Web of Science databases. Thirty-five out of 457 retrieved studies met the inclusion criteria. Twenty-one out of 35 selected studies reported significant improvement in at least one methodological quality factor. Overall quality scores increased in 13 out of 26 studies providing relevant data. The most commonly separately examined key quality factors were allocation concealment and blinding, in 13 out of 21 studies that reported relevant data. Allocation concealment was the quality characteristic most commonly reported as significantly improving during the reviewed period (in five out of eight studies reporting relevant comparative data). Certain aspects of methodological quality have improved significantly over time, but others remain stagnant. Further efforts to improve study design, conduct, and reporting of randomized controlled trials are warranted.

  6. Definitions of love in a sample of British women: an empirical study using Q methodology.

    Science.gov (United States)

    Watts, Simon; Stenner, Paul

    2014-09-01

    Social psychological research has increasingly acknowledged that any pretensions to a singular theory of love should be replaced with a concern about its affirmation and what people actually say and do in love's name. Lee's (1977) love styles research and Sternberg's (1995) theory of love as a story are prime examples. Despite traditional definitions of love in western cultures being dominated by feminine images and tales of gender difference, however, the personal definitions and experiences of women have received comparatively little empirical attention, particularly in recent years and despite some well-documented changes in their cultural circumstances. This study remedies that situation through presentation of a Q methodological study in which a convenience sample of 59 British women were asked to Q sort 54 single-word descriptors of love to define love as they had experienced it. Factor analysis of the resulting Q sorts revealed six distinct definitions of love, interpreted as 'attraction, passion & romance', 'unconditional love', 'sex & fun', 'friendship & spirituality', 'a permanent commitment', and 'separate people, separate lives'. The six definitions are then discussed in terms of their allegiance to traditionally feminine and/or masculine values and as a means of highlighting the changing face of Britain's relational culture. © 2013 The British Psychological Society.

  7. METHODOLOGICAL ASPECTS OF MOLASSES SAMPLE PREPARATION IN SULFUR DIOXIDE CONTENT DETERMINING

    Directory of Open Access Journals (Sweden)

    M. I. Egorova

    2015-01-01

    Full Text Available Summary. Molasses is a by-product of sugar production from primary or secondary sacchariferous raw materials. The appearance, chemical composition and uses of molasses are described, depending on the type of production in which it is formed. The value of molasses is demonstrated by its overall composition as well as by its directions of use. Statistics on beet molasses output in Russia are presented. The consumer market for molasses in Russia and abroad, including exports, is described. Regulations containing requirements for the quality and safety of molasses, including its sulfur dioxide content, are reviewed. Data on the allergenic properties of sulfur are presented. The sources of residual sulfur dioxide in molasses among processing aids are identified, as is the impact of its level on the value of molasses as a raw material for biotechnological processes and fodder production. The need to develop a methodology for determining the sulfur dioxide content of molasses, in order to control its safety, is substantiated. The iodometric method used in practice for the determination of sulfur dioxide in foods is characterized. Molasses and sugar differ as objects for the iodometric determination of sulfur dioxide, and in molasses the equivalence point cannot be ascertained directly. Approaches common in analytical chemistry for eliminating the interfering background of dark-colored products are considered, together with the advantages and disadvantages of background masking and stripping in the determination of sulfur dioxide in dark-colored products. The clarification of sugar solutions used in optical control methods is characterized, and the hypothesis that it is preferable in molasses sample preparation for fixing the equivalence point in iodometric titration is suggested. The tasks of experimental research for developing a molasses sample-preparation algorithm for determining sulfurous acid content are formulated.

  8. Fast egg collection method greatly improves randomness of egg sampling in Drosophila melanogaster

    DEFF Research Database (Denmark)

    Schou, Mads Fristrup

    2013-01-01

    When obtaining samples for population genetic studies, it is essential that the sampling is random. For Drosophila, one of the crucial steps in sampling experimental flies is the collection of eggs. Here an egg collection method is presented, which randomizes the eggs in a water column and dimini… To obtain a representative collection of genotypes, the method presented here is strongly recommended when collecting eggs from Drosophila.

  9. Systematic evaluation of the methodology of randomized controlled trials of anticoagulation in patients with cancer

    Directory of Open Access Journals (Sweden)

    Rada Gabriel

    2013-02-01

    Full Text Available Abstract Background Randomized controlled trials (RCTs) that are inappropriately designed or executed may provide biased findings and mislead clinical practice. In view of recent interest in the treatment and prevention of thrombotic complications in cancer patients, we evaluated the characteristics, risk of bias and time trends in RCTs of anticoagulation in patients with cancer. Methods We conducted a comprehensive search, including a search of four electronic databases (MEDLINE, EMBASE, ISI Web of Science, and CENTRAL), up to February 2010. We included RCTs in which the intervention and/or comparison consisted of: vitamin K antagonists, unfractionated heparin (UFH), low molecular weight heparin (LMWH), direct thrombin inhibitors or fondaparinux. We performed descriptive analyses and assessed the association between the variables of interest and the year of publication. Results We included 67 RCTs with 24,071 participants. In twenty-one trials (31%), DVT diagnosis was triggered by clinical suspicion; the remaining trials either screened for DVT or were unclear about their approach. 41 (61%), 22 (33%), and 11 (16%) trials respectively reported on major bleeding, minor bleeding, and thrombocytopenia. The percentages of trials satisfying risk of bias criteria were: adequate sequence generation (85%), adequate allocation concealment (61%), participants' blinding (39%), data collectors' blinding (44%), providers' blinding (41%), outcome assessors' blinding (75%), data analysts' blinding (15%), intention to treat analysis (57%), no selective outcome reporting (12%), and no stopping early for benefit (97%). The mean follow-up rate was 96%. Adequate allocation concealment and the reporting of intention to treat analysis were the only two quality criteria that improved over time. Conclusions Many RCTs of anticoagulation in patients with cancer appear to use insufficiently rigorous outcome assessment methods and to have deficiencies in key methodological

  10. Parameter Estimation in Stratified Cluster Sampling under Randomized Response Models for Sensitive Question Survey.

    Science.gov (United States)

    Pu, Xiangke; Gao, Ge; Fan, Yubo; Wang, Mian

    2016-01-01

    Randomized response is a research method for getting accurate answers to sensitive questions in structured sample surveys. Simple random sampling is widely used in surveys of sensitive questions but is hard to apply to large targeted populations. On the other hand, more sophisticated sampling regimes and the corresponding formulas are seldom employed in sensitive question surveys. In this work, we developed a series of formulas for parameter estimation in cluster sampling and stratified cluster sampling under two kinds of randomized response models, using classic sampling theories and total probability formulas. The performance of the sampling methods and formulas in surveys of premarital sex and cheating on exams at Soochow University is also reported. The reliability of the survey methods and formulas for sensitive question surveys was found to be high.
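To make the randomized-response idea concrete, here is a minimal sketch of the classic Warner (1965) model under simple random sampling, the building block that the record's cluster-sampling formulas generalize. The true prevalence, spinner probability and sample size below are illustrative values, not the paper's:

```python
import random

def warner_estimate(responses, p):
    """Warner (1965) randomized-response estimator of the proportion pi
    with the sensitive trait. Each respondent is directed (privately, with
    probability p) to answer the sensitive question, otherwise its
    complement; `responses` are the observed yes (1) / no (0) answers."""
    lam = sum(responses) / len(responses)       # observed 'yes' rate
    return (lam - (1 - p)) / (2 * p - 1)        # unbiased for pi when p != 0.5

def simulate_warner(true_pi, p, n, seed=1):
    """Simulate n respondents answering truthfully under the Warner design."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        has_trait = rng.random() < true_pi
        asked_sensitive = rng.random() < p      # private randomizing device
        # 'yes' iff the (privately selected) statement is true for them
        out.append(int(has_trait == asked_sensitive))
    return out

est = warner_estimate(simulate_warner(true_pi=0.2, p=0.7, n=20000), p=0.7)
```

Because the interviewer never learns which question was answered, respondents can answer honestly; the estimator inverts the known mixing probability p to recover the prevalence.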

  11. Parameter Estimation in Stratified Cluster Sampling under Randomized Response Models for Sensitive Question Survey.

    Directory of Open Access Journals (Sweden)

    Xiangke Pu

    Full Text Available Randomized response is a research method for getting accurate answers to sensitive questions in structured sample surveys. Simple random sampling is widely used in surveys of sensitive questions but is hard to apply to large targeted populations. On the other hand, more sophisticated sampling regimes and the corresponding formulas are seldom employed in sensitive question surveys. In this work, we developed a series of formulas for parameter estimation in cluster sampling and stratified cluster sampling under two kinds of randomized response models, using classic sampling theories and total probability formulas. The performance of the sampling methods and formulas in surveys of premarital sex and cheating on exams at Soochow University is also reported. The reliability of the survey methods and formulas for sensitive question surveys was found to be high.

  12. THE RHETORICAL USE OF RANDOM SAMPLING: CRAFTING AND COMMUNICATING THE PUBLIC IMAGE OF POLLS AS A SCIENCE (1935-1948).

    Science.gov (United States)

    Lusinchi, Dominic

    2017-03-01

    The scientific pollsters (Archibald Crossley, George H. Gallup, and Elmo Roper) emerged onto the American news media scene in 1935. Much of what they did in the following years (1935-1948) was to promote both the political and scientific legitimacy of their enterprise. They sought to be recognized as the sole legitimate producers of public opinion. In this essay I examine the mostly overlooked rhetorical work deployed by the pollsters to publicize the scientific credentials of their polling activities, and the central role the concept of sampling has had in that pursuit. First, they distanced themselves from the failed straw poll by claiming that their sampling methodology based on quotas was informed by science. Second, although in practice they did not use random sampling, they relied on it rhetorically to derive the symbolic benefits of being associated with the "laws of probability." © 2017 Wiley Periodicals, Inc.

  13. A Unified Approach to Power Calculation and Sample Size Determination for Random Regression Models

    Science.gov (United States)

    Shieh, Gwowen

    2007-01-01

    The underlying statistical models for multiple regression analysis are typically attributed to two types of modeling: fixed and random. The procedures for calculating power and sample size under the fixed regression models are well known. However, the literature on random regression models is limited and has been confined to the case of all…

  14. A Family of Estimators of a Sensitive Variable Using Auxiliary Information in Stratified Random Sampling

    National Research Council Canada - National Science Library

    Nadia Mushtaq; Noor Ul Amin; Muhammad Hanif

    2017-01-01

    In this article, a combined general family of estimators is proposed for estimating finite population mean of a sensitive variable in stratified random sampling with non-sensitive auxiliary variable...

  15. A New Estimator For Population Mean Using Two Auxiliary Variables in Stratified random Sampling

    OpenAIRE

    Singh, Rajesh; Malik, Sachin

    2014-01-01

    In this paper, we suggest an estimator using two auxiliary variables in stratified random sampling. The proposed estimator improves upon the mean per unit estimator as well as some other considered estimators. Expressions for the bias and MSE of the estimator are derived up to the first degree of approximation. Moreover, these theoretical findings are supported by a numerical example with original data. Key words: Study variable, auxiliary variable, stratified random sampling, bias and mean squa...

  16. Conflict-cost based random sampling design for parallel MRI with low rank constraints

    Science.gov (United States)

    Kim, Wan; Zhou, Yihang; Lyu, Jingyuan; Ying, Leslie

    2015-05-01

    In compressed sensing MRI, it is very important to design the sampling pattern for random undersampling. For example, SAKE (simultaneous auto-calibrating and k-space estimation) is a parallel MRI reconstruction method using random undersampling. It formulates image reconstruction as a structured low-rank matrix completion problem. Variable density (VD) Poisson discs are typically adopted for 2D random sampling. The basic concept of Poisson disc generation is to guarantee that samples are neither too close to nor too far away from each other. However, it is difficult to meet such a condition, especially in the high density region, so the sampling becomes inefficient. In this paper, we present an improved random sampling pattern for SAKE reconstruction. The pattern is generated based on a conflict cost with a probability model. The conflict cost measures how many dense samples already assigned are around a target location, while the probability model adopts the generalized Gaussian distribution, which includes uniform and Gaussian-like distributions as special cases. Our method preferentially assigns a sample to the k-space location with the least conflict cost on the circle of the highest probability. To evaluate the effectiveness of the proposed random pattern, we compare the performance of SAKE using both VD Poisson discs and the proposed pattern. Experimental results for brain data show that the proposed pattern yields a lower normalized mean square error (NMSE) than VD Poisson discs.
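To illustrate the variable-density idea behind both patterns discussed above, here is a deliberately simplified sketch: a 1D random undersampling mask whose sampling probability is high at the k-space centre and decays toward the periphery. It is a stand-in for (not an implementation of) the VD Poisson-disc or conflict-cost generators; `power` and the density shape are our own illustrative choices:

```python
import random

def vd_random_mask(n, accel, power=2.0, seed=0):
    """Variable-density random undersampling mask along one phase-encode
    axis: densely sampled k-space centre, sparse periphery. `accel` is the
    target acceleration factor; `power` shapes the density falloff."""
    rng = random.Random(seed)
    centre = (n - 1) / 2.0
    # unnormalised density: ~1 at the centre, near 0 at the edges
    dens = [(1.0 - abs(k - centre) / (centre + 1)) ** power for k in range(n)]
    scale = (n / accel) / sum(dens)          # aim for n/accel samples overall
    mask = [1 if rng.random() < min(1.0, scale * d) else 0 for d in dens]
    return mask

mask = vd_random_mask(256, accel=4)          # ~64 of 256 lines sampled
```

Poisson-disc generation adds a minimum-distance constraint on top of such a density, which is exactly where the paper's conflict cost intervenes in dense regions.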

  17. A new methodology for sampling blackflies for the entomological surveillance of onchocerciasis in Brazil.

    Directory of Open Access Journals (Sweden)

    Érika S do Nascimento-Carvalho

    Full Text Available The effectiveness of the MosqTent® trap was evaluated in an area of Brazil endemic for onchocerciasis. This study seeks to provide support for the monitoring of onchocerciasis transmission in the country. The study was carried out at the Homoxi and Thirei villages, located in the Yanomami Indigenous Land, in the state of Roraima. This area presents hyperendemicity, high blackfly densities, large population migrations and mining activities. The Homoxi and Thirei villages are assisted by the Brazilian Ministry of Health. To conduct the present study, the village leader, health leaders and the Brazilian Ethics Committee were consulted. Blackfly captures were carried out simultaneously at Homoxi and Thirei, using systematized methods to allow for comparisons between the traditional Human Landing Catch (HLC) and HLC protected by the MosqTent®. The female blackflies were captured at two equidistant capture stations per locality, by two collectors per station, for five consecutive days. Individuals captured per interval/station/day were counted, identified and maintained at -20°C. The underlying probability distributions and the differences between the methods for the independent sample data were verified in a comparative statistical analysis between the use of the MosqTent® and the HLC. A total of 10,855 anthropophilic blackflies were captured by both methodologies. A total of 7,367 (67.87%) blackflies belonging to seven species were captured by the MosqTent®: Simulium incrustatum s.l. (99.06%), S. guianense s.l. (0.74%), S. oyapockense s.l. (0.01%), S. exiguum (0.10%), S. metallicum (0.05%), S. ochraceum (0.03%) and S. minusculum s.l. (0.01%). Moreover, 3,488 (32.14%) blackflies belonging to four species were captured by HLC: S. incrustatum s.l. (98.33%), S. guianense s.l. (1.38%), S. oyapockense s.l. (0.26%) and S. metallicum (0.03%). The MosqTent® was more effective and efficient when compared to HLC. When comparing total blackflies captured/day, the Mosq

  18. DEVELOPMENT OF METHODOLOGY AND FIELD DEPLOYABLE SAMPLING TOOLS FOR SPENT NUCLEAR FUEL INTERROGATION IN LIQUID STORAGE

    Energy Technology Data Exchange (ETDEWEB)

    Berry, T.; Milliken, C.; Martinez-Rodriguez, M.; Hathcock, D.; Heitkamp, M.

    2012-06-04

    This project developed methodology and field deployable tools (test kits) to analyze the chemical and microbiological condition of the fuel storage medium and determine the oxide thickness on the spent fuel basin materials. The overall objective of this project was to determine the amount of time fuel has spent in a storage basin, in order to determine whether the operation of the reactor and storage basin is consistent with safeguard declarations or expectations. This project developed and validated forensic tools that can be used to predict the age and condition of spent nuclear fuels stored in liquid basins based on key physical, chemical and microbiological basin characteristics. Key parameters were identified based on a literature review, the parameters were used to design test cells for corrosion analyses, tools were purchased to analyze the key parameters, and these were used to characterize an active spent fuel basin, the Savannah River Site (SRS) L-Area basin. The key parameters identified in the literature review included chloride concentration, conductivity, and total organic carbon level. Focus was also placed on aluminum-based cladding because of its application to weapons production. The literature review was helpful in identifying important parameters, but relationships between these parameters and corrosion rates were not available. Bench scale test systems were designed, operated, harvested, and analyzed to determine relationships between corrosion rates and water parameters, chemistry and microbiological conditions. The data from the bench scale system indicated that corrosion rates were dependent on total organic carbon levels and chloride concentrations. The highest corrosion rates were observed in test cells amended with sediment, a large microbial inoculum and an organic carbon source. A complete characterization test kit was field tested to characterize the SRS L-Area spent fuel basin. The sampling kit consisted of a TOC analyzer, a YSI

  19. Abundance, distribution and diversity of gelatinous predators along the northern Mid-Atlantic Ridge: A comparison of different sampling methodologies.

    Science.gov (United States)

    Hosia, Aino; Falkenhaug, Tone; Baxter, Emily J; Pagès, Francesc

    2017-01-01

    The diversity and distribution of gelatinous zooplankton were investigated along the northern Mid-Atlantic Ridge (MAR) from June to August 2004. Here, we present results from macrozooplankton trawl sampling, as well as comparisons made between five different methodologies that were employed during the MAR-ECO survey. In total, 16 species of hydromedusae, 31 species of siphonophores and four species of scyphozoans were identified to species level from macrozooplankton trawl samples. Additional taxa were identified to higher taxonomic levels and a single ctenophore genus was observed. Samples were collected at 17 stations along the MAR between the Azores and Iceland. A divergence in the species assemblages was observed at the southern limit of the Subpolar Frontal Zone. The catch composition of gelatinous zooplankton is compared between different sampling methodologies including: a macrozooplankton trawl; a Multinet; a ringnet attached to a bottom trawl; and optical platforms (Underwater Video Profiler (UVP) & Remotely Operated Vehicle (ROV)). Different sampling methodologies are shown to exhibit selectivity towards different groups of gelatinous zooplankton. Only ~21% of taxa caught during the survey were caught by both the macrozooplankton trawl and the Multinet when deployed at the same station. The estimates of gelatinous zooplankton abundance calculated using these two gear types also varied widely (1.4 ± 0.9 individuals 1000 m-3 estimated by the macrozooplankton trawl vs. 468.3 ± 315.4 individuals 1000 m-3 estimated by the Multinet; mean ± s.d.) when used at the same stations (n = 6). While it appears that traditional net sampling can generate useful data on pelagic cnidarians, comparisons with results from the optical platforms suggest that ctenophore diversity and abundance are consistently underestimated, particularly when net sampling is conducted in combination with formalin fixation. The results emphasise the importance of considering sampling methodology

  20. Abundance, distribution and diversity of gelatinous predators along the northern Mid-Atlantic Ridge: A comparison of different sampling methodologies.

    Directory of Open Access Journals (Sweden)

    Aino Hosia

    Full Text Available The diversity and distribution of gelatinous zooplankton were investigated along the northern Mid-Atlantic Ridge (MAR) from June to August 2004. Here, we present results from macrozooplankton trawl sampling, as well as comparisons made between five different methodologies that were employed during the MAR-ECO survey. In total, 16 species of hydromedusae, 31 species of siphonophores and four species of scyphozoans were identified to species level from macrozooplankton trawl samples. Additional taxa were identified to higher taxonomic levels and a single ctenophore genus was observed. Samples were collected at 17 stations along the MAR between the Azores and Iceland. A divergence in the species assemblages was observed at the southern limit of the Subpolar Frontal Zone. The catch composition of gelatinous zooplankton is compared between different sampling methodologies including: a macrozooplankton trawl; a Multinet; a ringnet attached to a bottom trawl; and optical platforms (Underwater Video Profiler (UVP) & Remotely Operated Vehicle (ROV)). Different sampling methodologies are shown to exhibit selectivity towards different groups of gelatinous zooplankton. Only ~21% of taxa caught during the survey were caught by both the macrozooplankton trawl and the Multinet when deployed at the same station. The estimates of gelatinous zooplankton abundance calculated using these two gear types also varied widely (1.4 ± 0.9 individuals 1000 m-3 estimated by the macrozooplankton trawl vs. 468.3 ± 315.4 individuals 1000 m-3 estimated by the Multinet; mean ± s.d.) when used at the same stations (n = 6). While it appears that traditional net sampling can generate useful data on pelagic cnidarians, comparisons with results from the optical platforms suggest that ctenophore diversity and abundance are consistently underestimated, particularly when net sampling is conducted in combination with formalin fixation. The results emphasise the importance of considering

  1. Accuracy or precision: Implications of sample design and methodology on abundance estimation

    Science.gov (United States)

    Kowalewski, Lucas K.; Chizinski, Christopher J.; Powell, Larkin A.; Pope, Kevin L.; Pegg, Mark A.

    2015-01-01

    Sampling by spatially replicated counts (point counts) is an increasingly popular method of estimating the population size of organisms. Challenges exist when sampling by the point-count method: it is often impractical to sample the entire area of interest and impossible to detect every individual present. Ecologists encounter logistical limitations that force them to sample either few large sample units or many small sample units, introducing biases to sample counts. We generated a computer environment and simulated sampling scenarios to test the role of the number of samples, sample unit area, number of organisms, and distribution of organisms in the estimation of population sizes using N-mixture models. Many sample units of small area provided estimates that were consistently closer to true abundance than sample scenarios with few sample units of large area. However, sample scenarios with few sample units of large area provided more precise abundance estimates than those derived from scenarios with many sample units of small area. It is important to consider the accuracy and precision of abundance estimates during the sample design process, with study goals and objectives fully recognized; in practice, however, and with consequence, this consideration is often an afterthought that occurs during data analysis.
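A much-reduced sketch of the simulation machinery described above: organisms scattered uniformly on a square region, counted in randomly placed square sample units, with the mean density expanded to a population estimate. This toy version omits imperfect detection and the N-mixture model itself, and uses invented parameter values; it only illustrates the many-small vs. few-large design trade-off being tested:

```python
import random

def simulate_survey(n_units, unit_area, n_org=400, world=100.0, reps=200, seed=0):
    """Repeatedly survey a world x world region with n_units randomly placed
    square units of area unit_area; return (mean, sd) of the expanded
    population estimates over `reps` simulated surveys."""
    rng = random.Random(seed)
    side = unit_area ** 0.5
    ests = []
    for _ in range(reps):
        pts = [(rng.uniform(0, world), rng.uniform(0, world)) for _ in range(n_org)]
        counts = []
        for _ in range(n_units):
            x0 = rng.uniform(0, world - side)
            y0 = rng.uniform(0, world - side)
            counts.append(sum(1 for x, y in pts
                              if x0 <= x < x0 + side and y0 <= y < y0 + side))
        density = sum(counts) / (n_units * unit_area)
        ests.append(density * world * world)     # expand density to the region
    m = sum(ests) / reps
    sd = (sum((e - m) ** 2 for e in ests) / reps) ** 0.5
    return m, sd

# same total sampled area, different designs
many_small = simulate_survey(n_units=25, unit_area=4.0)   # 25 units of 2x2
few_large = simulate_survey(n_units=4, unit_area=25.0)    # 4 units of 5x5
```

With uniformly distributed organisms both designs are unbiased; the paper's accuracy/precision divergence emerges once clustered distributions and detection probability (the N-mixture layer) are added.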

  2. Misrepresenting random sampling? A systematic review of research papers in the Journal of Advanced Nursing.

    Science.gov (United States)

    Williamson, Graham R

    2003-11-01

    This paper discusses the theoretical limitations of the use of random sampling and probability theory in the production of a significance level (or P-value) in nursing research. Potential alternatives, in the form of randomization tests, are proposed. Research papers in nursing, medicine and psychology frequently misrepresent their statistical findings, as the P-values reported assume random sampling. In this systematic review of studies published between January 1995 and June 2002 in the Journal of Advanced Nursing, 89 (68%) studies broke this assumption because they used convenience samples or entire populations. As a result, some of the findings may be questionable. The key ideas of random sampling and probability theory for statistical testing (for generating a P-value) are outlined. The results of a systematic review of research papers published in the Journal of Advanced Nursing are then presented, showing how frequently random sampling appears to have been misrepresented. Useful alternative techniques that might overcome these limitations are then discussed. Review limitations: this review is limited in scope because it is applied to one journal, and so the findings cannot be generalized to other nursing journals or to nursing research in general. However, it is possible that other nursing journals are also publishing research articles based on the misrepresentation of random sampling. The review is also limited because in several of the articles the sampling method was not clearly stated, and in these circumstances a judgement was made as to the sampling method employed, based on the indications given by the author(s). Quantitative researchers in nursing should be very careful that the statistical techniques they use are appropriate for the design and sampling methods of their studies. If the techniques they employ are not appropriate, they run the risk of misinterpreting findings by using inappropriate, unrepresentative and biased samples.
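
    A randomization (permutation) test of the kind proposed as an alternative can be sketched as follows. The two "ward" samples are hypothetical convenience-sample data; the P-value comes from shuffling group labels rather than assuming random sampling from a wider population.

```python
import random
import statistics

def randomization_test(group_a, group_b, n_perm=10_000, seed=42):
    """Two-sided randomization test for a difference in means.

    The P-value is the share of label shuffles whose absolute mean
    difference is at least as extreme as the observed one; no random
    sampling from a wider population is assumed.
    """
    rng = random.Random(seed)
    observed = abs(statistics.mean(group_a) - statistics.mean(group_b))
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(statistics.mean(pooled[:n_a]) - statistics.mean(pooled[n_a:]))
        if diff >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one correction keeps P > 0

# Hypothetical pain scores from a convenience sample of two wards:
ward_a = [4, 5, 6, 7, 5, 6, 8]
ward_b = [3, 4, 4, 5, 3, 4, 5]
p_val = randomization_test(ward_a, ward_b)
print("randomization P-value:", p_val)
```

Because the reference distribution is generated by re-randomizing the observed data themselves, the inference applies to the sample at hand, which is exactly the property the review argues convenience samples require.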

  3. Comparison of kriging interpolation precision between grid sampling scheme and simple random sampling scheme for precision agriculture

    Directory of Open Access Journals (Sweden)

    Jiang Houlong

    2016-01-01

    Sampling methods are important factors that can potentially limit the accuracy of predictions of spatial distribution patterns. A 10 ha tobacco-planted field was selected to compare the accuracy of predicting the spatial distribution of soil properties between a grid sampling scheme and a simple random sampling scheme (SRS), using ordinary kriging and cross-validation methods. To achieve this objective, we collected soil samples from the topsoil (0-20 cm) in March 2012. Grid sampling and SRS each comprised 115 sampling points. Accuracies of spatial interpolation using the two sampling schemes were then evaluated based on validation samples (36 points) and deviations of the estimates. The results suggested that soil pH and nitrate-N (NO3-N) had low variation, whereas all other soil properties exhibited medium variation. Soil pH, organic matter (OM), total nitrogen (TN), cation exchange capacity (CEC), total phosphorus (TP) and available phosphorus (AP) matched the spherical model, whereas the remaining variables fit an exponential model under both sampling methods. The interpolation errors of soil pH, TP and AP were lowest under SRS, while those for OM, CEC, TN, available potassium (AK) and total potassium (TK) were lowest under grid sampling. The interpolation precision for soil NO3-N showed no significant difference between the two sampling schemes. Considering our data on interpolation precision and the importance of minerals for the cultivation of flue-cured tobacco, the grid sampling scheme should be used in tobacco-planted fields to determine the spatial distribution of soil properties. The grid sampling method can be applied in a practical and cost-effective manner to facilitate soil sampling in tobacco-planted fields.
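
    The comparison machinery can be sketched with a minimal ordinary-kriging implementation (spherical variogram) evaluated by leave-one-out cross-validation. The synthetic surface, variogram parameters and 36-point designs below are invented for illustration and are not the study's soil data.

```python
import numpy as np

def spherical(h, nugget=0.0, sill=1.0, a=0.4):
    """Spherical semivariogram model with range a."""
    h = np.asarray(h, dtype=float)
    g = nugget + sill * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h < a, g, nugget + sill)

def ok_predict(xy, z, x0):
    """Ordinary kriging prediction at x0 from sampled points (xy, z)."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = spherical(d)
    A[n, n] = 0.0                      # Lagrange-multiplier row/column
    b = np.ones(n + 1)
    b[:n] = spherical(np.linalg.norm(xy - x0, axis=1))
    w = np.linalg.solve(A, b)[:n]      # kriging weights (sum to 1)
    return float(w @ z)

def loocv_rmse(xy, z):
    """Leave-one-out cross-validation RMSE for one sampling scheme."""
    errs = []
    for i in range(len(z)):
        keep = np.arange(len(z)) != i
        errs.append(ok_predict(xy[keep], z[keep], xy[i]) - z[i])
    return float(np.sqrt(np.mean(np.square(errs))))

def surface(p):
    """Smooth synthetic 'soil property' over a unit square."""
    return np.sin(3 * p[:, 0]) + np.cos(2 * p[:, 1])

rng = np.random.default_rng(0)
g = np.linspace(0.05, 0.95, 6)
grid_xy = np.array([(x, y) for x in g for y in g])   # 36-point grid scheme
srs_xy = rng.random((36, 2))                         # 36-point SRS scheme

rmse_grid = loocv_rmse(grid_xy, surface(grid_xy))
rmse_srs = loocv_rmse(srs_xy, surface(srs_xy))
print("grid LOOCV RMSE: %.3f" % rmse_grid)
print("SRS  LOOCV RMSE: %.3f" % rmse_srs)
```

The same skeleton — fit a variogram model, krige, score by cross-validation error — is what allows the two sampling schemes to be compared on equal footing.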

  4. Determination of rare earth elements in natural water samples – A review of sample separation, preconcentration and direct methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Fisher, Andrew, E-mail: afisher@plymouth.ac.uk [School of Geography, Earth and Environmental Sciences, Plymouth University, Drake Circus, Plymouth, Devon, PL4 8AA (United Kingdom); Kara, Derya [Department of Chemistry, Art and Science Faculty, Balikesir University, 10100, Balikesir (Turkey)

    2016-09-07

    This review discusses and compares the methods given for the determination of rare earth elements (REE) in natural water samples, including sea, river, lake, tap, ground and waste waters as well as Antarctic ice. Since REE are at very low concentrations in natural waters, numerous different preconcentration methods have been proposed to enable their measurement. These include liquid liquid extraction, dispersive liquid-liquid micro-extraction and solidified floating drop micro-extraction. In addition to liquid-liquid extraction methods, solid phase extraction using commercial resins, resins made in-house, silica-based exchange materials and other solid media is also discussed. These and other techniques such as precipitation/co-precipitation and flotation are compared in terms of speed, preconcentration factors achieved, precision, accuracy and limits of detection (LOD). Some papers have discussed the direct determination of REE in these sample types. Some have used specialised sample introduction systems such as ultrasonic nebulization whereas others have used a standard sample introduction system coupled with inductively coupled plasma mass spectrometry (ICP-MS) detection. These direct methods have also been discussed and compared. - Highlights: • The determination of rare earth elements in waters is reviewed. • Assorted preconcentration techniques are discussed and evaluated. • Detection techniques include atomic spectrometry, potentiometry and spectrophotometry. • Special nebulisers and electrothermal vaporization approaches are reviewed.

  5. Design and methodology of the LA Sprouts nutrition, cooking and gardening program for Latino youth: A randomized controlled intervention.

    Science.gov (United States)

    Martinez, Lauren C; Gatto, Nicole M; Spruijt-Metz, Donna; Davis, Jaimie N

    2015-05-01

    The LA Sprouts 12-week nutrition, cooking and gardening intervention targets obesity reduction in Latino children. While other gardening and nutrition programs are shown to improve dietary intake, LA Sprouts is unique in that it utilized a curriculum demonstrated to decrease obesity. This methodology paper outlines the design and processes of the LA Sprouts study, and discusses key strategies employed to foster successful implementation of the program. Setting: after-school program in four Los Angeles elementary schools. Subjects: 3rd-5th grade students. Design: randomized controlled trial. Gardens were built on two of four school campuses, and the 90-minute weekly lessons focused on strategies to increase fruit and vegetable consumption, gardening at school and home, and cooking healthy meals/snacks. Data collection was conducted pre- and post-intervention and included basic clinical and anthropometric measures, dietary intake and psychosocial constructs measured by questionnaire, and an optional fasting blood draw. Baseline data were collected from 364 children, and 320 (88%) completed follow-up. No participants withdrew from the program (data were missing for other reasons). Intervention students attended 9.7 ± 2.3 lessons. Fasting blood samples were collected from 169 children at baseline, and 113 (67%) at follow-up. Questionnaire scales had good internal consistency (IC) and intra-rater reliability (IRR; in child scales: 88% of items with IC > 0.7 and 70% of items with IRR > 0.50; in parent scales: 75% of items with IC > 0.7). The intervention was successfully implemented in the schools and the scales appear appropriate to evaluate psychosocial constructs relevant to a gardening intervention. Copyright © 2015 Elsevier Inc. All rights reserved.
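
    Internal-consistency figures like those above are typically produced with Cronbach's alpha. A small sketch with a hypothetical 3-item scale (the scores are invented, not the study's questionnaire data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha: items is a list of per-item score lists, all
    over the same respondents, in the same respondent order."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # each respondent's total score across the k items
    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))

# Hypothetical 3-item fruit-and-vegetable attitude scale, 6 children:
scale = [
    [4, 5, 3, 4, 2, 5],
    [4, 4, 3, 5, 2, 4],
    [5, 5, 2, 4, 3, 5],
]
alpha = cronbach_alpha(scale)
print("Cronbach's alpha: %.2f" % alpha)
```

An alpha above 0.7 is the conventional threshold the abstract's "IC > 0.7" figures refer to.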

  6. Design and methodology of the LA Sprouts nutrition, cooking and gardening program for Latino youth: a randomized controlled intervention

    Science.gov (United States)

    Martinez, Lauren C.; Gatto, Nicole M.; Spruijt-Metz, Donna; Davis, Jaimie N.

    2015-01-01

    Objective The LA Sprouts 12-week nutrition, cooking and gardening intervention targets obesity reduction in Latino children. While other gardening and nutrition programs are shown to improve dietary intake, LA Sprouts is unique in that it utilized a curriculum demonstrated to decrease obesity. This methodology paper outlines the design and processes of the LA Sprouts study, and discusses key strategies employed to foster successful implementation of the program. Setting After-school program in four Los Angeles elementary schools. Subjects 3rd–5th grade students. Design Randomized controlled trial. Gardens were built on two of four school campuses, and the 90-minute weekly lessons focused on strategies to increase fruit and vegetable consumption, gardening at school and home, and cooking healthy meals/snacks. Data collection was conducted pre- and post-intervention and included basic clinical and anthropometric measures, dietary intake and psychosocial constructs measured by questionnaire, and an optional fasting blood draw. Results Baseline data was collected from 364 children, and 320 (88%) completed follow-up. No participants withdrew from the program (data were missing for other reasons). Intervention students attended 9.7 ± 2.3 lessons. Fasting blood samples were collected on 169 children at baseline, and 113 (67%) at follow-up. Questionnaire scales had good internal consistency (IC) and intra-rater reliability (IRR; in child scales: 88% items with IC >0.7 and 70% items with IRR > 0.50; in parent scales: 75% items with IC > 0.7). Conclusions The intervention was successfully implemented in the schools and scales appear appropriate to evaluate psychosocial constructs relevant to a gardening intervention. PMID:25896115

  7. Calculating sample sizes for cluster randomized trials: we can keep it simple and efficient !

    NARCIS (Netherlands)

    van Breukelen, Gerard J.P.; Candel, Math J.J.M.

    2012-01-01

    Objective: Simple guidelines for efficient sample sizes in cluster randomized trials with unknown intraclass correlation and varying cluster sizes. Methods: A simple equation is given for the optimal number of clusters and sample size per cluster. Here, optimal means maximizing power for a given

  8. Towards a methodology for large-sample prompt-gamma neutron-activation analysis

    NARCIS (Netherlands)

    Degenaar, I.H.

    2004-01-01

    Large-sample prompt-gamma neutron-activation analysis (LS PGNAA) is a method by which mass fractions of elements can be determined in large samples with a mass over 1 kg. In this method the large sample is irradiated with neutrons. Directly (prompt) after absorption of the neutrons

  9. Stratified random sampling for estimating billing accuracy in health care systems.

    Science.gov (United States)

    Buddhakulsomsiri, Jirachai; Parthanadee, Parthana

    2008-03-01

    This paper presents a stratified random sampling plan for estimating the accuracy of bill processing performance for health care bills submitted to third-party payers in health care systems. Bill processing accuracy is estimated with two measures: percent accuracy and total dollar accuracy. Difficulties in constructing a sampling plan arise when the population strata structure is unknown, and when the two measures require different sampling schemes. To use sampling resources efficiently, the plan is designed to estimate both measures effectively from the same sample. It features a simple but efficient strata construction method, called the rectangular method, and two accuracy estimation methods, one for each measure. The sampling plan was tested on actual populations from an insurance company. The accuracy estimates obtained were then used to compare the rectangular method with other potential clustering methods for strata construction, and to compare the accuracy estimation methods with other eligible methods. Computational results show the effectiveness of the proposed sampling plan.

  10. SNP selection and classification of genome-wide SNP data using stratified sampling random forests.

    Science.gov (United States)

    Wu, Qingyao; Ye, Yunming; Liu, Yang; Ng, Michael K

    2012-09-01

    For high-dimensional genome-wide association (GWA) case-control data on complex disease, there is usually a large portion of single-nucleotide polymorphisms (SNPs) that are irrelevant to the disease. A simple random sampling method in random forests, using the default mtry parameter to choose the feature subspace, will select too many subspaces without informative SNPs. An exhaustive search for an optimal mtry is often required in order to include useful and relevant SNPs and exclude the vast number of non-informative SNPs; however, this is too time-consuming and is not favorable for high-dimensional GWA data. The main aim of this paper is to propose a stratified sampling method for feature-subspace selection to generate decision trees in a random forest for high-dimensional GWA data. Our idea is to design an equal-width discretization scheme for informativeness to divide SNPs into multiple groups. In feature-subspace selection, we randomly select the same number of SNPs from each group and combine them to form a subspace to generate a decision tree. This stratified sampling procedure ensures that each subspace contains enough useful SNPs, avoids the very high computational cost of an exhaustive search for an optimal mtry, and maintains the randomness of a random forest. We employ two genome-wide SNP data sets (Parkinson case-control data comprising 408 803 SNPs and Alzheimer case-control data comprising 380 157 SNPs) to demonstrate that the proposed stratified sampling method is effective, and that it can generate a better random forest with higher accuracy and a lower error bound than Breiman's random forest generation method. For the Parkinson data, we also show some interesting genes identified by the method, which may be associated with neurological disorders and merit further biological investigation.
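
    The stratified subspace-selection step described above can be sketched as follows; the informativeness scores (e.g. per-SNP chi-square values), group count and per-group subspace size are hypothetical.

```python
import random

def stratified_subspace(scores, n_groups=5, per_group=3, seed=0):
    """Pick a feature subspace by stratified sampling on informativeness.

    SNP indices are binned into n_groups equal-width intervals of their
    informativeness score; the subspace takes per_group SNPs at random
    from every non-empty bin, so each tree is guaranteed to see some
    informative SNPs without an exhaustive mtry search.
    """
    rng = random.Random(seed)
    lo, hi = min(scores), max(scores)
    width = (hi - lo) / n_groups or 1.0   # guard against all-equal scores
    groups = [[] for _ in range(n_groups)]
    for idx, s in enumerate(scores):
        k = min(int((s - lo) / width), n_groups - 1)
        groups[k].append(idx)
    subspace = []
    for grp in groups:
        if grp:
            subspace.extend(rng.sample(grp, min(per_group, len(grp))))
    return sorted(subspace)

# Hypothetical informativeness scores for 40 SNPs:
rng = random.Random(3)
snp_scores = [rng.expovariate(1.0) for _ in range(40)]
subspace = stratified_subspace(snp_scores)
print("subspace SNP indices:", subspace)
```

Each decision tree in the forest would call this once with a fresh seed, keeping the randomness of the forest while stratifying on informativeness.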

  11. A simple and reliable methodology to detect egg white in art samples

    Indian Academy of Sciences (India)

    2013-04-26

    A protocol for a simple and reliable dot-blot immunoassay was developed and optimized to test work of art samples for the presence of specific proteinaceous material (i.e. ovalbumin-based).

  12. Analysis of Sampling Methodologies for Noise Pollution Assessment and the Impact on the Population

    OpenAIRE

    Guillermo Rey Gozalo; Juan Miguel Barrigón Morillas

    2016-01-01

    Today, noise pollution is an increasing environmental stressor. Noise maps are recognised as the main tool for assessing and managing environmental noise, but their accuracy largely depends on the sampling method used. The sampling methods most commonly used by different researchers (grid, legislative road types and categorisation methods) were analysed and compared using the city of Talca (Chile) as a test case. The results show that the stratification of sound values in road categories has ...

  13. A simple and reliable methodology to detect egg white in art samples.

    Science.gov (United States)

    Gambino, Michela; Cappitelli, Francesca; Cattò, Cristina; Carpen, Aristodemo; Principi, Pamela; Ghezzi, Lisa; Bonaduce, Ilaria; Galano, Eugenio; Pucci, Pietro; Birolo, Leila; Villa, Federica; Forlani, Fabio

    2013-06-01

    A protocol for a simple and reliable dot-blot immunoassay was developed and optimized to test work of art samples for the presence of specific proteinaceous material (i.e. ovalbumin-based). The analytical protocol has been extensively set up with respect, among others, to protein extraction conditions, densitometric analysis and colorimetric reaction conditions. A feasibility evaluation demonstrated that a commercial scanner and free image-analysis software can be used for data acquisition and elaboration, thus facilitating the application of the proposed protocol to commonly equipped laboratories and to the laboratories of museums and conservation centres. The introduction of the method of standard additions in the analysis of fresh and artificially aged laboratory-prepared samples, containing egg white and various pigments, allowed us to evaluate the matrix effect and the effect of sample aging and to generate threshold density values useful for the detection of ovalbumin in samples from ancient works of art. The efficacy of the developed dot-blot immunoassay was proven by testing microsamples from 13th-16th century mural paintings of the Saint Francesco Church in Lodi (Italy). Despite the aging, the altered conditions of conservation, the complex matrix, and the micro-size of the samples, the presence of ovalbumin was detected in all those mural painting samples where mass-spectrometry-based proteomic analysis unambiguously detected ovalbumin peptides.
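
    The method of standard additions mentioned above fits a line to the signal measured after spiking known amounts of analyte, then extrapolates back to the unspiked sample: the original content is the negative x-intercept, i.e. intercept divided by slope. The densitometric readings below are invented for illustration.

```python
def standard_additions(added, signal):
    """Least-squares line through (added amount, signal) points; the
    analyte already present in the sample is intercept / slope."""
    n = len(added)
    mx = sum(added) / n
    my = sum(signal) / n
    sxx = sum((x - mx) ** 2 for x in added)
    sxy = sum((x - mx) * (y - my) for x, y in zip(added, signal))
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept / slope

# Hypothetical densitometric readings after spiking 0..3 units of ovalbumin:
spikes = [0.0, 1.0, 2.0, 3.0]
density = [0.40, 0.60, 0.80, 1.00]   # linear response
estimate = standard_additions(spikes, density)
print("estimated ovalbumin in sample:", estimate)
```

Because the calibration is built inside the sample's own matrix, the matrix effect the authors evaluate is automatically accounted for in the slope.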

  14. Technical Note: New methodology for measuring viscosities in small volumes characteristic of environmental chamber particle samples

    Directory of Open Access Journals (Sweden)

    L. Renbaum-Wolff

    2013-01-01

    Herein, a method for the determination of viscosities of small sample volumes is introduced, with important implications for the viscosity determination of particle samples from environmental chambers (used to simulate atmospheric conditions). The amount of sample needed is < 1 μl, and the technique is capable of determining viscosities (η) ranging between 10⁻³ and 10³ Pascal seconds (Pa s) in samples that cover a range of chemical properties, with real-time relative humidity and temperature control; hence, the technique should be well suited for determining the viscosities, under atmospherically relevant conditions, of particles collected from environmental chambers. In this technique, supermicron particles are first deposited on an inert hydrophobic substrate. Then, insoluble beads (~1 μm in diameter) are embedded in the particles. Next, a flow of gas is introduced over the particles, which generates a shear stress on the particle surfaces. The sample responds to this shear stress by generating internal circulations, which are quantified with an optical microscope by monitoring the movement of the beads. The rate of internal circulation is shown to be a function of particle viscosity but independent of the particle material for a wide range of organic and organic-water samples. A calibration curve is constructed from the experimental data that relates the rate of internal circulation to particle viscosity, and this calibration curve is successfully used to predict viscosities in multicomponent organic mixtures.
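
    The calibration-curve step can be sketched as a log-log fit relating bead circulation rate to viscosity. The rates and viscosities below are invented and assume an ideal inverse relationship, not the paper's measured curve.

```python
import math

def fit_calibration(rates, viscosities):
    """Fit log10(viscosity) = a + b*log10(rate) by least squares,
    the kind of calibration curve the technique relies on."""
    xs = [math.log10(r) for r in rates]
    ys = [math.log10(v) for v in viscosities]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

def predict_viscosity(rate, a, b):
    """Viscosity (Pa s) predicted from an observed circulation rate."""
    return 10 ** (a + b * math.log10(rate))

# Hypothetical calibration: faster bead circulation at lower viscosity
rates = [100.0, 10.0, 1.0, 0.1]     # bead circulation rate, arbitrary units
etas = [1e-2, 1e-1, 1e0, 1e1]       # viscosity, Pa s
a, b = fit_calibration(rates, etas)
eta_pred = predict_viscosity(3.0, a, b)
print("predicted viscosity at rate 3.0:", eta_pred)
```

Once fitted on standards of known viscosity, the same curve is applied to unknown multicomponent samples, as the abstract describes.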

  15. An automated laboratory-scale methodology for the generation of sheared mammalian cell culture samples.

    Science.gov (United States)

    Joseph, Adrian; Goldrick, Stephen; Mollet, Michael; Turner, Richard; Bender, Jean; Gruber, David; Farid, Suzanne S; Titchener-Hooker, Nigel

    2017-05-01

    Continuous disk-stack centrifugation is typically used for the removal of cells and cellular debris from mammalian cell culture broths at manufacturing scale. The use of scale-down methods to characterise disk-stack centrifugation performance enables substantial reductions in material requirements and allows a much wider design space to be tested than is currently possible at pilot scale. Scaling down centrifugation has historically been challenging due to the difficulties in mimicking the energy dissipation rates (EDRs) of typical machines. This paper describes an alternative, easy-to-assemble, automated capillary-based methodology to generate levels of EDR consistent with those found in a continuous disk-stack centrifuge. Variations in EDR were achieved through changes in capillary internal diameter and the flow rate of operation through the capillary. The EDRs found to match the levels of shear in the feed zone of a pilot-scale centrifuge using the experimental method developed in this paper (2.4×10⁵ W/kg) are consistent with those obtained through previously published computational fluid dynamics (CFD) studies (2.0×10⁵ W/kg). Furthermore, this methodology can be incorporated into existing scale-down methods to model the process performance of continuous disk-stack centrifuges. This was demonstrated through the characterisation of the effects of culture hold time, culture temperature and EDR on centrate quality. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Assessing Impact and Bridging Methodological Divides: Randomized Trials in Countries Affected by Conflict

    Science.gov (United States)

    Burde, Dana

    2012-01-01

    Randomized trials have experienced a marked surge in endorsement and popularity in education research in the past decade. This surge reignited paradigm debates and spurred qualitative critics to accuse these experimental designs of eclipsing qualitative research. This article reviews a current iteration of this debate and examines two randomized…

  17. Surface wipe sampling for antineoplastic (chemotherapy) and other hazardous drug residue in healthcare settings: Methodology and recommendations.

    Science.gov (United States)

    Connor, Thomas H; Zock, Matthew D; Snow, Amy H

    2016-09-01

    Surface wipe sampling for various hazardous agents has been employed in many occupational settings over the years for purposes such as evaluating potential dermal exposure and health risk, source determination, quality or cleanliness, and compliance. Wipe sampling for surface residue of antineoplastic and other hazardous drugs in healthcare settings is currently the method of choice to determine surface contamination of the workplace with these drugs. The purpose of this article is to review published studies of wipe sampling for antineoplastic and other hazardous drugs, to summarize the methods in use by various organizations and researchers, and to provide some basic guidance for conducting surface wipe sampling for these drugs in healthcare settings. Recommendations on wipe sampling methodology from several government agencies and organizations were reviewed. Published reports on wipe sampling for hazardous drugs in numerous studies were also examined. The critical elements of a wipe sampling program and related limitations were reviewed and summarized. Recommendations and guidance are presented concerning the purposes of wipe sampling for antineoplastic and other hazardous drugs in the healthcare setting, technical factors and variables, sampling strategy, materials required, and limitations. The reporting and interpretation of wipe sample results are also discussed. It is recommended that all healthcare settings where antineoplastic and other hazardous drugs are handled consider wipe sampling as part of a comprehensive hazardous drug "safe handling" program. Although no standards exist for acceptable or allowable surface concentrations of these drugs in the healthcare setting, wipe sampling may be used to characterize potential occupational dermal exposure risk and to evaluate the effectiveness of implemented controls and the overall safety program. A comprehensive safe-handling program for antineoplastic drugs may

  18. Estimation of Sensitive Proportion by Randomized Response Data in Successive Sampling

    Directory of Open Access Journals (Sweden)

    Bo Yu

    2015-01-01

    This paper considers the problem of estimating binomial proportions of sensitive or stigmatizing attributes in the population of interest. Randomized response techniques are suggested for protecting the privacy of respondents and reducing response bias while eliciting information on sensitive attributes. In many sensitive-question surveys, the same population is sampled repeatedly on each occasion. In this paper, we apply a successive sampling scheme to improve the estimation of the sensitive proportion on the current occasion.
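
    Warner's design is the classic instance of the randomized response techniques discussed: each respondent answers either the sensitive question (with probability p) or its negation, so a "yes" no longer reveals the trait, yet the proportion remains estimable. The true proportion, spinner probability and sample size below are made up.

```python
import random

def warner_estimate(responses, p):
    """Warner's randomized-response estimator of a sensitive proportion.

    With lam the observed 'yes' share and p the chance of being asked the
    direct question, the estimate is (lam - (1 - p)) / (2p - 1); p must
    differ from 0.5 for the design to be identifiable.
    """
    lam = sum(responses) / len(responses)
    return (lam - (1 - p)) / (2 * p - 1)

def simulate_survey(pi_true=0.3, p=0.7, n=20_000, seed=5):
    """Simulate one occasion of a randomized-response survey."""
    rng = random.Random(seed)
    responses = []
    for _ in range(n):
        carrier = rng.random() < pi_true
        asked_direct = rng.random() < p   # the spinner picks the question
        responses.append(carrier if asked_direct else not carrier)
    return warner_estimate(responses, p)

est = simulate_survey()
print("estimated sensitive proportion:", est)
```

In a successive-sampling setting like the paper's, estimates of this form from the matched and fresh parts of the current sample would be combined with weights chosen to minimise variance.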

  19. A sampling and metagenomic sequencing-based methodology for monitoring antimicrobial resistance in swine herds.

    Science.gov (United States)

    Munk, Patrick; Andersen, Vibe Dalhoff; de Knegt, Leonardo; Jensen, Marie Stengaard; Knudsen, Berith Elkær; Lukjancenko, Oksana; Mordhorst, Hanne; Clasen, Julie; Agersø, Yvonne; Folkesson, Anders; Pamp, Sünje Johanna; Vigre, Håkan; Aarestrup, Frank Møller

    2017-02-01

    Reliable methods for monitoring antimicrobial resistance (AMR) in livestock and other reservoirs are essential to understand the trends, transmission and importance of agricultural resistance. Quantification of AMR is mostly done using culture-based techniques, but metagenomic read mapping shows promise for quantitative resistance monitoring. We evaluated the ability of: (i) MIC determination for Escherichia coli; (ii) cfu counting of E. coli; (iii) cfu counting of aerobic bacteria; and (iv) metagenomic shotgun sequencing to predict expected tetracycline resistance based on known antimicrobial consumption in 10 Danish integrated slaughter pig herds. In addition, we evaluated whether fresh or manure floor samples constitute suitable proxies for intestinal sampling, using cfu counting, qPCR and metagenomic shotgun sequencing. Metagenomic read-mapping outperformed cultivation-based techniques in terms of predicting expected tetracycline resistance based on antimicrobial consumption. Our metagenomic approach had sufficient resolution to detect antimicrobial-induced changes to individual resistance gene abundances. Pen floor manure samples were found to represent rectal samples well when analysed using metagenomics, as they contain the same DNA with the exception of a few contaminating taxa that proliferate in the extraintestinal environment. We present a workflow, from sampling to interpretation, showing how resistance monitoring can be carried out in swine herds using a metagenomic approach. We propose metagenomic sequencing should be part of routine livestock resistance monitoring programmes and potentially of integrated One Health monitoring in all reservoirs. © The Author 2016. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy.
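
    Metagenomic read counts are usually normalised by reference-gene length and sequencing depth before resistance abundances are compared across samples. An RPKM-style sketch with invented counts and gene names; this mirrors common practice, not necessarily the authors' exact pipeline.

```python
def gene_abundance(read_counts, gene_lengths_bp, total_mapped_reads):
    """Length- and depth-normalised gene abundances (RPKM-style):
    reads per kilobase of reference gene per million mapped reads."""
    out = {}
    for gene, count in read_counts.items():
        kb = gene_lengths_bp[gene] / 1000.0
        out[gene] = count / kb / (total_mapped_reads / 1e6)
    return out

# Hypothetical counts of reads mapped to two tetracycline-gene references:
counts = {"tetM": 480, "tetW": 1200}
lengths = {"tetM": 1920, "tetW": 1920}    # reference lengths in bp
abund = gene_abundance(counts, lengths, total_mapped_reads=8_000_000)
print(abund)
```

Normalising this way is what gives the approach the per-gene resolution the abstract highlights: a change in one gene's abundance is visible independently of sequencing depth.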

  20. Development of a SPME-GC-ECD methodology for selected pesticides in must and wine samples.

    Science.gov (United States)

    Correia, M; Delerue-Matos, C; Alves, A

    2001-04-01

    A method for the determination of some pesticide residues in must and wine samples was developed using solid-phase microextraction (SPME) and gas chromatography with electron-capture detection (GC-ECD). The procedure needs only dilution as sample pre-treatment and is therefore simple, fast and solvent-free. Eight fungicides (vinclozolin, procymidone, iprodione, penconazole, fenarimol, folpet, nuarimol and hexaconazole), one insecticide (chlorpyriphos) and two acaricides (bromopropylate and tetradifon) can be quantified. Good linearity was observed for all the compounds in the range 5-100 microg/L. The reproducibility of the measurements was found to be acceptable (with RSDs below 20%). Detection limits of 11 microg/L, on average, are sufficiently below the proposed maximum residue limits (MRLs) for these compounds in wine. The analytical method was applied to the determination of these compounds in Portuguese must and wine samples from the Demarcated Region of Alentejo, where no residues could be detected.
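
    Detection limits like those quoted are commonly estimated from the calibration line itself, e.g. the ICH-style approximation LOD = 3.3 × (residual standard deviation) / slope. The calibration points below are invented, not the paper's data.

```python
def lod_from_calibration(concs, signals):
    """Estimate a detection limit as 3.3 * sd(calibration residuals)
    divided by the calibration slope (an ICH-style approximation)."""
    n = len(concs)
    mx = sum(concs) / n
    my = sum(signals) / n
    sxx = sum((x - mx) ** 2 for x in concs)
    slope = sum((x - mx) * (y - my) for x, y in zip(concs, signals)) / sxx
    intercept = my - slope * mx
    resid = [y - (intercept + slope * x) for x, y in zip(concs, signals)]
    s = (sum(r * r for r in resid) / (n - 2)) ** 0.5
    return 3.3 * s / slope

# Hypothetical calibration over the paper's 5-100 microg/L working range:
concs = [5, 10, 25, 50, 100]           # microg/L
signals = [510, 1030, 2480, 5050, 9970]  # detector response, arbitrary units
lod = lod_from_calibration(concs, signals)
print("LOD (microg/L): %.2f" % lod)
```

The same regression also yields the linearity check reported in the abstract, so both figures come from one calibration series.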

  1. Random sampling of elementary flux modes in large-scale metabolic networks.

    Science.gov (United States)

    Machado, Daniel; Soons, Zita; Patil, Kiran Raosaheb; Ferreira, Eugénio C; Rocha, Isabel

    2012-09-15

    The description of a metabolic network in terms of elementary (flux) modes (EMs) provides an important framework for metabolic pathway analysis. However, their application to large networks has been hampered by the combinatorial explosion in the number of modes. In this work, we develop a method for generating random samples of EMs without computing the whole set. Our algorithm is an adaptation of the canonical basis approach, where we add an additional filtering step which, at each iteration, selects a random subset of the new combinations of modes. In order to obtain an unbiased sample, all candidates are assigned the same probability of getting selected. This approach avoids the exponential growth of the number of modes during computation, thus generating a random sample of the complete set of EMs within reasonable time. We generated samples of different sizes for a metabolic network of Escherichia coli, and observed that they preserve several properties of the full EM set. It is also shown that EM sampling can be used for rational strain design. A well distributed sample, that is representative of the complete set of EMs, should be suitable to most EM-based methods for analysis and optimization of metabolic networks. Source code for a cross-platform implementation in Python is freely available at http://code.google.com/p/emsampler. Contact: dmachado@deb.uminho.pt. Supplementary data are available at Bioinformatics online.
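
    The filtering step described above — at each iteration, keep a uniform random subset of the new candidate combinations so every candidate has the same survival probability and growth stays capped — can be illustrated with a toy stand-in for the combination step. The frozenset unions below are not elementary-mode arithmetic, and the cap and sizes are arbitrary; this is not the emsampler implementation.

```python
import random
from itertools import combinations

def capped_iteration(current_modes, combine, cap, rng):
    """One iteration of a canonical-basis-style computation with the
    unbiased filtering step: form all new candidate combinations, then
    keep a uniform random subset of at most cap of them."""
    candidates = combine(current_modes)
    if len(candidates) <= cap:
        return candidates
    return rng.sample(candidates, cap)   # every candidate equally likely

def toy_combine(modes):
    """Toy stand-in: every unordered pair becomes one new candidate."""
    return [a | b for a, b in combinations(modes, 2)]

rng = random.Random(0)
modes = [frozenset([i]) for i in range(12)]
for step in range(3):
    modes = capped_iteration(modes, toy_combine, cap=50, rng=rng)
    print("step", step, "kept", len(modes))
```

Without the cap, the candidate count here would grow roughly quadratically per iteration, which is the combinatorial explosion the sampling approach sidesteps.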

  2. Power and sample size calculations for Mendelian randomization studies using one genetic instrument.

    Science.gov (United States)

    Freeman, Guy; Cowling, Benjamin J; Schooling, C Mary

    2013-08-01

    Mendelian randomization, which is instrumental variable analysis using genetic variants as instruments, is an increasingly popular method of making causal inferences from observational studies. In order to design efficient Mendelian randomization studies, it is essential to calculate the sample sizes required. We present formulas for calculating the power of a Mendelian randomization study using one genetic instrument to detect an effect of a given size, and the minimum sample size required to detect effects for given levels of significance and power, using asymptotic statistical theory. We apply the formulas to some example data and compare the results with those from simulation methods. Power and sample size calculations using these formulas should be more straightforward to carry out than simulation approaches. These formulas make explicit that the sample size needed for a Mendelian randomization study is inversely proportional to the square of the correlation between the genetic instrument and the exposure and proportional to the residual variance of the outcome after removing the effect of the exposure, as well as inversely proportional to the square of the effect size.
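
    The stated proportionalities translate into an approximate sample-size calculation under the usual two-sided normal approximation (exposure scaled to unit variance). The constants and parameter values below are illustrative and are not claimed to reproduce the paper's exact formulas.

```python
from statistics import NormalDist

def mr_sample_size(beta1, rho_gx, resid_var, alpha=0.05, power=0.8):
    """Approximate sample size for a one-instrument Mendelian
    randomization study: proportional to the outcome's residual variance
    and inversely proportional to the squared causal effect and the
    squared gene-exposure correlation (normal approximation)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return (z_a + z_b) ** 2 * resid_var / (beta1 ** 2 * rho_gx ** 2)

n_weak = mr_sample_size(beta1=0.2, rho_gx=0.1, resid_var=1.0)
n_strong = mr_sample_size(beta1=0.2, rho_gx=0.2, resid_var=1.0)
print("weaker instrument:  n = %.0f" % n_weak)
print("stronger instrument: n = %.0f" % n_strong)
```

Doubling the gene-exposure correlation cuts the required sample size by a factor of four, which is exactly the inverse-square dependence the abstract emphasises.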

  3. Sampling versus Random Binning for Multiple Descriptions of a Bandlimited Source

    DEFF Research Database (Denmark)

    Mashiach, Adam; Østergaard, Jan; Zamir, Ram

    2013-01-01

    Random binning is an efficient, yet complex, coding technique for the symmetric L-description source coding problem. We propose an alternative approach, that uses the quantized samples of a bandlimited source as "descriptions". By the Nyquist condition, the source can be reconstructed if enough s...

  4. Recidivism among Child Sexual Abusers: Initial Results of a 13-Year Longitudinal Random Sample

    Science.gov (United States)

    Patrick, Steven; Marsh, Robert

    2009-01-01

    In the initial analysis of data from a random sample of all those charged with child sexual abuse in Idaho over a 13-year period, only one predictive variable was found that related to recidivism of those convicted. Variables such as ethnicity, relationship, gender, and age differences did not show a significant or even large association with…

  5. HABITAT ASSESSMENT USING A RANDOM PROBABILITY BASED SAMPLING DESIGN: ESCAMBIA RIVER DELTA, FLORIDA

    Science.gov (United States)

    Smith, Lisa M., Darrin D. Dantin and Steve Jordan. In press. Habitat Assessment Using a Random Probability Based Sampling Design: Escambia River Delta, Florida (Abstract). To be presented at the SWS/GERS Fall Joint Society Meeting: Communication and Collaboration: Coastal Systems...

  6. Reinforcing Sampling Distributions through a Randomization-Based Activity for Introducing ANOVA

    Science.gov (United States)

    Taylor, Laura; Doehler, Kirsten

    2015-01-01

    This paper examines the use of a randomization-based activity to introduce the ANOVA F-test to students. The two main goals of this activity are to successfully teach students to comprehend ANOVA F-tests and to increase student comprehension of sampling distributions. Four sections of students in an advanced introductory statistics course…

  7. A simple and reliable methodology to detect egg white in art samples

    Indian Academy of Sciences (India)

    A protocol for a simple and reliable dot-blot immunoassay was developed and optimized to test work of art samples for the presence of specific proteinaceus material (i.e. ovalbumin-based). The analytical protocol has been extensively set up with respect, among the other, to protein extraction conditions, to densitometric ...

  8. Adaption of G-TAG Software for Validating Touch and Go Asteroid Sample Return Design Methodology

    Science.gov (United States)

    Blackmore, Lars James C.; Acikmese, Behcet; Mandic, Milan

    2012-01-01

    A software tool is used to demonstrate the feasibility of Touch and Go (TAG) sampling for Asteroid Sample Return missions. TAG is a concept whereby a spacecraft is in contact with the surface of a small body, such as a comet or asteroid, for a few seconds or less before ascending to a safe location away from the small body. Previous work at JPL developed the G-TAG simulation tool, which provides a software environment for fast, multi-body simulations of the TAG event. G-TAG is described in Multibody Simulation Software Testbed for Small-Body Exploration and Sampling, (NPO-47196) NASA Tech Briefs, Vol. 35, No. 11 (November 2011), p.54. This current innovation adapts this tool to a mission that intends to return a sample from the surface of an asteroid. In order to demonstrate the feasibility of the TAG concept, the new software tool was used to generate extensive simulations that demonstrate the designed spacecraft meets key requirements. These requirements state that contact force and duration must be sufficient to ensure that enough material from the surface is collected in the brushwheel sampler (BWS), and that the spacecraft must survive the contact and must be able to recover and ascend to a safe position, and maintain velocity and orientation after the contact.

  9. Salmonella Recovery Following Immersion Chilling for Matched Neck Skin and Whole Carcass Enrichment Sampling Methodologies

    Science.gov (United States)

    The prevalence and serogroups of Salmonella recovered following immersion chilling were determined for both neck skin and the matching whole carcass enriched samples. Commercially processed and eviscerated broiler carcasses were immersion chilled in ice and tap water for 40 min. Following immersio...

  10. Assessment of Psychopathic Traits in an Incarcerated Adolescent Sample: A Methodological Comparison

    Science.gov (United States)

    Fink, Brandi C.; Tant, Adam S.; Tremba, Katherine; Kiehl, Kent A.

    2012-01-01

    Analyses of convergent validity and group assignment using self-report, caregiver-report and interview-based measures of adolescent psychopathy were conducted in a sample of 160 incarcerated adolescents. Results reveal significant convergent validity between caregiver-report measures of adolescent psychopathy, significant convergent validity…

  11. Methodology for the Validation of Collection, Handling and Preservation of Water and Soil Samples.

    Science.gov (United States)

    1977-05-01

and interstitial water samples by volatilization, adsorption, hydrolysis, biodegradation and other mechanisms. The main objective of the literature...Organophosphorus Pesticide Residues in Crop Extracts. J. AOAC Vol. 52, 522-526 (1969). 11. D.E. Coffin and G. Savary. Procedure for Extraction and

  12. Sample dilution : A methodological pitfall in the measurement of tissue but not serum ACE-activity

    NARCIS (Netherlands)

    Koiter, J; Navis, G; de Jong, PE; van Gilst, WH; de Zeeuw, D

    Many tissue ACE-assays suffer from underestimation of the ACE-activity at low sample dilutions. However, measurement of ACE-activity as the amount of hippuric acid produced by cleavage of the commonly used substrate hippuryl-histidyl-leucine might circumvent this problem. In this study, we

  13. Adaptation of image cytometry methodology for DNA ploidy analysis of cervical epithelium samples: A pilot study

    Directory of Open Access Journals (Sweden)

    Christiane Eliza Motta Duarte

    2014-06-01

    Conclusion: Image cytometry of the cervical specimens revealed DNA aneuploidy, most probably resulting from chromosomal alterations and appearing as precancerous lesions in 65% of the cases. The adaptations implemented in this study, enabled the DNA-image cytometry to become more accessible, enhancing its extended use as an adjuvant strategy for the early screening of the cervical epithelium samples during routine analyses.

  14. An evaluation of sampling methodology for assessing settlement of temperate fish in seagrass meadows

    Directory of Open Access Journals (Sweden)

    I. A. CATALAN

    2014-03-01

Full Text Available All demersal fish with planktonic larvae settle at some point early in life, generally around the transformation from larvae to juveniles or soon after. Sampling pre-settlement or very young, settled fish is challenging due to spatial concretions within the habitat and the pulsed, rapid nature of the settlement process. There is a lack of robust methods that enable this sampling, but information on settlement, which represents a mortality bottleneck, is crucial for the follow-up of populations for fisheries and conservation purposes. An empirical evaluation of sampling methods has not been conducted in temperate habitats. Here, we compare six different sampling methods to collect pre- and post-settlement stages of fish to determine the best combination of techniques to utilise in Posidonia oceanica, an endemic Mediterranean seagrass that provides a key nursery habitat for coastal fish. We considered three types of pelagic nets (bongo net, neuston net and ring net), two types of light-traps (Quatrefoil and Ecocean CARE®) to sample pre-settled stages, and a low-impact epibenthic trawl for recent settlers. Our results show a significantly different size-spectrum for each method, with a continuous range of sizes from 2 mm to 200 mm. The smallest sizes were collected by the bongo net, followed by the ring net, the neuston net, the Quatrefoil, the Ecocean and finally the epibenthic trawl. Our results suggest that an appropriate strategy for collecting and estimating the abundance of key littoral fish species around settlement size is the combination of the Ecocean light trap and the epibenthic trawl.

  15. Combined micro-XRF and TXRF methodology for quantitative elemental imaging of tissue samples.

    Science.gov (United States)

    Wróbel, Paweł M; Bała, Sławomir; Czyzycki, Mateusz; Golasik, Magdalena; Librowski, Tadeusz; Ostachowicz, Beata; Piekoszewski, Wojciech; Surówka, Artur; Lankosz, Marek

    2017-01-01

Local differences in structural properties of biological specimens pose a major limitation to quantitative X-ray fluorescence imaging. This is because tissue compartments of different density, together with variation in sample thickness introduced by the commonly used freeze-drying step, yield different values of the sample mass per unit area that must be taken into account. Although several solutions to this problem have been proposed, based on home-made standards for quantification in terms of thickness- and density-independent elemental mass fractions, the issue is not adequately addressed for heterogeneous samples. In our recent study, we propose a calculation scheme based on combined external-standard micro X-ray fluorescence (micro-XRF) imaging and internal-standard total reflection X-ray fluorescence (TXRF) analysis to determine corrected elemental mass fraction distributions in commonly analysed rat tissues: kidney, liver and spleen. The results of TXRF analysis of digested large tissue sections, together with the mean values of elemental masses per unit area obtained with micro-XRF, were employed to determine the average masses per unit area of the samples. The correction for variation in tissue thickness and density was done using Compton scatter intensities. Importantly, by virtue of its versatility, our approach can be used to produce elemental contrast in a variety of biological specimens where local variations in either sample density or thickness are no longer an issue. Copyright © 2016 Elsevier B.V. All rights reserved.
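The Compton-based thickness/density correction lends itself to a simple per-pixel sketch. The code below only illustrates the normalisation idea and is not the authors' implementation; the function name, arguments, and the scaling convention are assumptions. The Compton scatter map serves as a proxy for local mass per unit area, and the TXRF-derived average rescales the result.

```python
import numpy as np

def compton_corrected_map(fluor_counts, compton_counts, mean_mass_per_area):
    """Illustrative correction of a micro-XRF elemental map for local
    thickness/density variation: normalise each pixel's fluorescence
    signal by its Compton scatter signal (a proxy for local mass per
    unit area), then scale by the sample-averaged mass per unit area
    so the result behaves like a mass-fraction contrast."""
    scale = compton_counts / compton_counts.mean()  # local-mass proxy, mean 1
    return fluor_counts / (scale * mean_mass_per_area)
```

A pixel that is twice as thick scatters roughly twice as much Compton signal, so its fluorescence excess from sheer mass is divided out rather than mistaken for elemental enrichment.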

  16. A new coprecipitation methodology with lutetium hydroxide for preconcentration of heavy metal ions in herbal plant samples.

    Science.gov (United States)

    Soylak, Mustafa; Murat, Ipek

    2014-01-01

A new coprecipitation methodology that uses lutetium hydroxide as a precipitant for Cu(II), Pb(II), Mn(II), Co(II), Cd(II), Fe(III), and Ni(II) ions in herbal plant and water samples, for analysis by atomic absorption spectrometry, has been investigated. Parameters such as pH, amount of lutetium, and volume of aqueous sample were optimized for the recovery of these seven metals. The effects of concomitant ions on the separation-preconcentration of the analytes were also checked. The procedure was validated with addition-recovery tests and analysis of Standard Reference Material 1570a (Trace Elements in Spinach Leaves) and TMDA-70 fortified lake water Certified Reference Material. The LODs for the analyte ions were in the range of 1.7-7.2 μg/L. The procedure was successfully applied to the analysis of the analyte contents of herbal plant samples from Turkey.

  17. Analysis of Sampling Methodologies for Noise Pollution Assessment and the Impact on the Population

    Science.gov (United States)

    Rey Gozalo, Guillermo; Barrigón Morillas, Juan Miguel

    2016-01-01

    Today, noise pollution is an increasing environmental stressor. Noise maps are recognised as the main tool for assessing and managing environmental noise, but their accuracy largely depends on the sampling method used. The sampling methods most commonly used by different researchers (grid, legislative road types and categorisation methods) were analysed and compared using the city of Talca (Chile) as a test case. The results show that the stratification of sound values in road categories has a significantly lower prediction error and a higher capacity for discrimination and prediction than in the legislative road types used by the Ministry of Transport and Telecommunications in Chile. Also, the use of one or another method implies significant differences in the assessment of population exposure to noise pollution. Thus, the selection of a suitable method for performing noise maps through measurements is essential to achieve an accurate assessment of the impact of noise pollution on the population. PMID:27187429

  18. Analysis of Sampling Methodologies for Noise Pollution Assessment and the Impact on the Population

    Directory of Open Access Journals (Sweden)

    Guillermo Rey Gozalo

    2016-05-01

    Full Text Available Today, noise pollution is an increasing environmental stressor. Noise maps are recognised as the main tool for assessing and managing environmental noise, but their accuracy largely depends on the sampling method used. The sampling methods most commonly used by different researchers (grid, legislative road types and categorisation methods were analysed and compared using the city of Talca (Chile as a test case. The results show that the stratification of sound values in road categories has a significantly lower prediction error and a higher capacity for discrimination and prediction than in the legislative road types used by the Ministry of Transport and Telecommunications in Chile. Also, the use of one or another method implies significant differences in the assessment of population exposure to noise pollution. Thus, the selection of a suitable method for performing noise maps through measurements is essential to achieve an accurate assessment of the impact of noise pollution on the population.

  19. Validation of an analytical methodology for the quantitative analysis of petroleum hydrocarbons in marine sediment samples

    Directory of Open Access Journals (Sweden)

    Eloy Yordad Companioni Damas

    2009-01-01

Full Text Available This work describes the validation of an analytical procedure for the analysis of petroleum hydrocarbons in marine sediment samples. The proposed protocol is able to measure n-alkanes and polycyclic aromatic hydrocarbons (PAH) in samples at concentrations as low as 30 ng/g, with a precision better than 15% for most analytes. The extraction efficiency of fortified sediments varied from 65.1 to 105.6% and 59.7 to 97.8% for n-alkanes and PAH in the ranges C16 - C32 and fluoranthene - benzo(a)pyrene, respectively. The analytical protocol was applied to determine petroleum hydrocarbons in sediments collected from a marine coastal zone.

  20. A sampling and metagenomic sequencing-based methodology for monitoring antimicrobial resistance in swine herds

    DEFF Research Database (Denmark)

    Munk, Patrick; Dalhoff Andersen, Vibe; de Knegt, Leonardo

    2016-01-01

Objectives Reliable methods for monitoring antimicrobial resistance (AMR) in livestock and other reservoirs are essential to understand the trends, transmission and importance of agricultural resistance. Quantification of AMR is mostly done using culture-based techniques, but metagenomic read mapping shows promise for quantitative resistance monitoring. Methods We evaluated the ability of: (i) MIC determination for Escherichia coli; (ii) cfu counting of E. coli; (iii) cfu counting of aerobic bacteria; and (iv) metagenomic shotgun sequencing to predict expected tetracycline resistance based on known antimicrobial consumption in 10 Danish integrated slaughter pig herds. In addition, we evaluated whether fresh or manure floor samples constitute suitable proxies for intestinal sampling, using cfu counting, qPCR and metagenomic shotgun sequencing. Results Metagenomic read-mapping outperformed...

  1. Cost and Performance Report of Incremental Sampling Methodology for Soil Containing Metallic Residues

    Science.gov (United States)

    2013-09-01

distributed over training ranges as particles of various sizes, shapes, and compositions. To obtain representative samples (i.e., to ensure mean con...distributed over active training ranges as particles of various sizes, shapes, and compositions. To address the compositional and distributional...much as 5 m of silt. Gravel consists mostly of quartz and metamorphic rock with clasts ranging from 0.6 to 7 cm in diameter. The gravel is 3 to

  2. Sample size calculations for micro-randomized trials in mHealth.

    Science.gov (United States)

    Liao, Peng; Klasnja, Predrag; Tewari, Ambuj; Murphy, Susan A

    2016-05-30

The use and development of mobile interventions are experiencing rapid growth. In "just-in-time" mobile interventions, treatments are provided via a mobile device, and they are intended to help an individual make healthy decisions 'in the moment,' and thus have a proximal, near future impact. Currently, the development of mobile interventions is proceeding at a much faster pace than that of associated data science methods. A first step toward developing data-based methods is to provide an experimental design for testing the proximal effects of these just-in-time treatments. In this paper, we propose a 'micro-randomized' trial design for this purpose. In a micro-randomized trial, treatments are sequentially randomized throughout the conduct of the study, with the result that each participant may be randomized at the 100s or 1000s of occasions at which a treatment might be provided. Further, we develop a test statistic for assessing the proximal effect of a treatment as well as an associated sample size calculator. We conduct simulation evaluations of the sample size calculator in various settings. Rules of thumb that might be used in designing a micro-randomized trial are discussed. This work is motivated by our collaboration on the HeartSteps mobile application designed to increase physical activity. Copyright © 2015 John Wiley & Sons, Ltd.

  3. Occupational position and its relation to mental distress in a random sample of Danish residents

    DEFF Research Database (Denmark)

    Rugulies, Reiner Ernst; Madsen, Ida E H; Nielsen, Maj Britt D

    2010-01-01

PURPOSE: To analyze the distribution of depressive, anxiety, and somatization symptoms across different occupational positions in a random sample of Danish residents. METHODS: The study sample consisted of 591 Danish residents (50% women), aged 20-65, drawn from an age- and gender-stratified random sample of the Danish population. Participants filled out a survey that included the 92-item version of the Hopkins Symptom Checklist (SCL-92). We categorized occupational position into seven groups: high- and low-grade non-manual workers, skilled and unskilled manual workers, high- and low-grade self-employed, and the unemployed. RESULTS: ...somatization symptoms (OR = 6.28, 95% CI = 1.39-28.46). CONCLUSIONS: Unskilled manual workers, the unemployed, and, to a lesser extent, the low-grade self-employed showed an increased level of mental distress. Activities to promote mental health in the Danish population should be directed toward these groups.

  4. Random sampling for a mental health survey in a deprived multi-ethnic area of Berlin.

    Science.gov (United States)

    Mundt, Adrian P; Aichberger, Marion C; Kliewe, Thomas; Ignatyev, Yuriy; Yayla, Seda; Heimann, Hannah; Schouler-Ocak, Meryam; Busch, Markus; Rapp, Michael; Heinz, Andreas; Ströhle, Andreas

    2012-12-01

The aim of the study was to assess the response to random sampling for a mental health survey in a deprived multi-ethnic area of Berlin, Germany, with a large Turkish-speaking population. A random list of 1,000 persons stratified by age and gender was retrieved from the population registry, and these persons were contacted using a three-stage design including written information, telephone calls and personal contact at home. A female bilingual interviewer contacted persons with Turkish names. Of the persons on the list, 202 were not living in the area, one was deceased, and 502 did not respond. Of the 295 responders, 152 (51.5%) explicitly refused to participate. We retained a sample of 143 participants (48.5%), representing the rate of multi-ethnicity in the area (52.1% migrants in the sample vs. 53.5% in the population). Turkish migrants were over-represented (28.9% in the sample vs. 18.6% in the population). Polish migrants (2.1 vs. 5.3% in the population) and persons from the former Yugoslavia (1.4 vs. 4.8% in the population) were under-represented. Bilingual contact procedures can improve the response rates of the most common migrant populations to random sampling if migrants of the same origin gate the contact. High non-contact and non-response rates for migrant and non-migrant populations in deprived urban areas remain a challenge for obtaining representative random samples.

  5. Assessment of proteinuria by using protein: creatinine index in random urine sample.

    Science.gov (United States)

    Khan, Dilshad Ahmed; Ahmad, Tariq Mahmood; Qureshil, Ayaz Hussain; Halim, Abdul; Ahmad, Mumtaz; Afzal, Saeed

    2005-10-01

To assess the quantitative measurement of proteinuria by using the random urine protein:creatinine index/ratio in comparison with 24-hour urinary protein excretion in patients with renal diseases having normal glomerular filtration rate. One hundred and thirty patients, 94 males and 36 females, with an age range of 5 to 60 years and proteinuria of more than 150 mg/day were included in this study. Qualitative urinary protein estimation was done on random urine specimens by dipstick. Quantitative measurement of protein in the random and 24-hour urine specimens was carried out by a method based on the formation of a red complex of protein with pyrogallol red in acid medium on a Microlab 200 (Merck). Estimation of creatinine was done on a Selectra-2 (Merck) by Jaffe's reaction. The urine protein:creatinine index and ratio were calculated by dividing the urine protein concentration (mg/L) by the urine creatinine concentration (mmol/L) multiplied by 10, and as mg/mg, respectively. A protein:creatinine index and ratio of more than 140 and 0.18, respectively, in a random urine sample indicated pathological proteinuria. An excellent correlation (r = 0.96) was found between the random urine protein:creatinine index/ratio and standard 24-hour urinary protein excretion in these patients. The protein:creatinine index in random urine is a convenient, quick and reliable method of estimating proteinuria, as compared to 24-hour urinary protein excretion, for the diagnosis and monitoring of renal diseases in our medical setup.
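The index and ratio described in the abstract are simple arithmetic; a minimal sketch using the abstract's definitions and cut-offs (function names are illustrative):

```python
def protein_creatinine_index(protein_mg_per_l, creatinine_mmol_per_l):
    """Protein:creatinine index as defined in the abstract: urine
    protein (mg/L) divided by urine creatinine (mmol/L), multiplied
    by 10. Values above 140 suggest pathological proteinuria."""
    return protein_mg_per_l / creatinine_mmol_per_l * 10

def protein_creatinine_ratio(protein_mg, creatinine_mg):
    """Dimensionless mg/mg ratio; values above 0.18 suggest
    pathological proteinuria (cut-off from the abstract)."""
    return protein_mg / creatinine_mg
```

Because both formulas use a single spot specimen, they sidestep the collection errors that make 24-hour urine protein measurements unreliable in practice.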

  6. Preventive home visits to older people in Denmark: methodology of a randomized controlled study

    DEFF Research Database (Denmark)

    Vass, Mikkel; Avlund, Kirsten; Hendriksen, Carsten

    2002-01-01

    BACKGROUND AND AIMS: Preventive home visits were introduced by legislation in Denmark in 1998. This ongoing randomized controlled intervention study introduces a model where preventive home visits to elderly people are carried out in a standardized way, focusing on early signs of disability...... a preventive program completely on their own. RESULTS: Baseline characteristics were similar in the intervention and control communities with regard to size, rural and urban characteristics and geriatric services, and the individual characteristics of participating persons living in these communities were also...... comparable. The intervention was well accepted and feasible. CONCLUSIONS: When results become available, the study should reveal the most important factors for preventive home visits to elderly people....

  7. Stature in archeological samples from central Italy: methodological issues and diachronic changes.

    Science.gov (United States)

    Giannecchini, Monica; Moggi-Cecchi, Jacopo

    2008-03-01

    Stature reconstructions from skeletal remains are usually obtained through regression equations based on the relationship between height and limb bone length. Different equations have been employed to reconstruct stature in skeletal samples, but this is the first study to provide a systematic analysis of the reliability of the different methods for Italian historical samples. Aims of this article are: 1) to analyze the reliability of different regression methods to estimate stature for populations living in Central Italy from the Iron Age to Medieval times; 2) to search for trends in stature over this time period by applying the most reliable regression method. Long bone measurements were collected from 1,021 individuals (560 males, 461 females), from 66 archeological sites for males and 54 for females. Three time periods were identified: Iron Age, Roman period, and Medieval period. To determine the most appropriate equation to reconstruct stature the Delta parameter of Gini (Memorie di metodologia statistica. Milano: Giuffre A. 1939), in which stature estimates derived from different limb bones are compared, was employed. The equations proposed by Pearson (Philos Trans R Soc London 192 (1899) 169-244) and Trotter and Gleser for Afro-Americans (Am J Phys Anthropol 10 (1952) 463-514; Am J Phys Anthropol 47 (1977) 355-356) provided the most consistent estimates when applied to our sample. We then used the equation by Pearson for further analyses. Results indicate a reduction in stature in the transition from the Iron Age to the Roman period, and a subsequent increase in the transition from the Roman period to the Medieval period. Changes of limb lengths over time were more pronounced in the distal than in the proximal elements in both limbs. 2007 Wiley-Liss, Inc.
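For readers wanting to apply the preferred method, a Pearson-type stature regression is a single line of arithmetic. The sketch below uses the commonly cited coefficients of Pearson's (1899) male femur equation; treat the numbers as an assumption quoted from secondary sources and verify against the original tables before research use.

```python
def stature_pearson_male_femur(femur_cm):
    """Estimate living stature (cm) from femur length (cm) using the
    commonly cited Pearson (1899) male femur regression. Coefficients
    are quoted from secondary sources (illustrative only)."""
    return 81.306 + 1.880 * femur_cm
```

Comparing estimates from several bones of the same skeleton, as the Delta parameter of Gini does, exposes methods whose per-bone equations disagree for a given population.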

  8. Generalized essential energy space random walks to more effectively accelerate solute sampling in aqueous environment.

    Science.gov (United States)

    Lv, Chao; Zheng, Lianqing; Yang, Wei

    2012-01-28

Molecular dynamics sampling can be enhanced by promoting potential energy fluctuations, for instance based on a Hamiltonian modified by the addition of a potential-energy-dependent biasing term. To overcome the diffusion sampling issue, namely that enlarging event-irrelevant energy fluctuations can abolish sampling efficiency, the essential energy space random walk (EESRW) approach was proposed earlier. To more effectively accelerate the sampling of solute conformations in aqueous environments, in the current work we generalized the EESRW method to a two-dimension-EESRW (2D-EESRW) strategy. Specifically, the essential internal energy component of a focused region and the essential interaction energy component between the focused region and the environmental region are employed to define the two-dimensional essential energy space. This proposal is motivated by the general observation that the two essential energy components have distinctive interplays in different conformational events. Model studies on the alanine dipeptide and the aspartate-arginine peptide demonstrate sampling improvement over the original one-dimension-EESRW strategy; at the same biasing level, the present generalization allows more effective acceleration of the sampling of conformational transitions in aqueous solution. The 2D-EESRW generalization is readily extended to higher-dimension schemes and can be employed in more advanced enhanced-sampling schemes, such as the recent orthogonal space random walk method. © 2012 American Institute of Physics.
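The core idea, adding a potential-energy-dependent bias so the system walks more freely in energy space, can be illustrated with a toy Metropolis simulation. This is a deliberately simplified sketch (a fixed linear bias on a 1-D double well, not the adaptive EESRW recursion): flattening the effective barrier increases the rate of conformational-like crossings.

```python
import math
import random

def metropolis_energy_bias(steps=50000, beta=1.0, k=0.9, seed=1):
    """Toy energy-space-biased Metropolis walk on a 1-D double well
    U(x) = (x^2 - 1)^2. The modified Hamiltonian U + f(U) with a
    fixed linear bias f(U) = -k*U lowers the effective barrier by the
    factor (1 - k), so barrier crossings become more frequent.
    Returns the number of sign changes of x (barrier crossings)."""
    random.seed(seed)
    U = lambda x: (x * x - 1.0) ** 2
    biased = lambda x: (1.0 - k) * U(x)   # U + f(U) with f(U) = -k*U
    x, side, crossings = -1.0, -1, 0
    for _ in range(steps):
        y = x + random.uniform(-0.5, 0.5)
        if random.random() < math.exp(-beta * (biased(y) - biased(x))):
            x = y
        if x * side < 0:          # sign change: crossed the barrier
            crossings += 1
            side = -side
    return crossings
```

With k = 0 the walk samples the unbiased double well and crosses less often; the EESRW papers learn the bias adaptively rather than fixing it, precisely so that only event-relevant energy fluctuations are enlarged.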

  9. Multilayer pixel super-resolution lensless in-line holographic microscope with random sample movement.

    Science.gov (United States)

    Wang, Mingjun; Feng, Shaodong; Wu, Jigang

    2017-10-06

    We report a multilayer lensless in-line holographic microscope (LIHM) with improved imaging resolution by using the pixel super-resolution technique and random sample movement. In our imaging system, a laser beam illuminated the sample and a CMOS imaging sensor located behind the sample recorded the in-line hologram for image reconstruction. During the imaging process, the sample was moved by hand randomly and the in-line holograms were acquired sequentially. Then the sample image was reconstructed from an enhanced-resolution hologram obtained from multiple low-resolution in-line holograms by applying the pixel super-resolution (PSR) technique. We studied the resolution enhancement effects by using the U.S. Air Force (USAF) target as the sample in numerical simulation and experiment. We also showed that multilayer pixel super-resolution images can be obtained by imaging a triple-layer sample made with the filamentous algae on the middle layer and microspheres with diameter of 2 μm on the top and bottom layers. Our pixel super-resolution LIHM provides a compact and low-cost solution for microscopic imaging and is promising for many biomedical applications.
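The pixel super-resolution step can be pictured as placing sub-pixel-shifted low-resolution frames onto a finer grid. The sketch below is only the grid-filling idea: it assumes the sub-pixel shifts are already known (in the paper they result from the random hand movement and are estimated from the data) and omits the hologram reconstruction and pixel-aperture deconvolution of a real PSR pipeline.

```python
import numpy as np

def shift_and_add_psr(low_res_frames, shifts, factor):
    """Toy pixel super-resolution: deposit each low-resolution frame,
    offset by its known sub-pixel shift (in low-res pixel units,
    within [0, 1)), onto a grid `factor` times finer, then average
    overlapping contributions."""
    h, w = low_res_frames[0].shape
    acc = np.zeros((h * factor, w * factor))
    cnt = np.zeros_like(acc)
    for frame, (dy, dx) in zip(low_res_frames, shifts):
        oy = int(round(dy * factor))    # shift in high-res pixels
        ox = int(round(dx * factor))
        acc[oy::factor, ox::factor] += frame
        cnt[oy::factor, ox::factor] += 1
    cnt[cnt == 0] = 1                   # leave unvisited pixels at zero
    return acc / cnt
```

Four frames shifted by half a pixel in each direction fill every site of a 2x grid, which is the ideal case; random hand movement supplies irregular shifts, so more frames are needed to cover the fine grid.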

  10. Randomized controlled trials 5: Determining the sample size and power for clinical trials and cohort studies.

    Science.gov (United States)

    Greene, Tom

    2015-01-01

    Performing well-powered randomized controlled trials is of fundamental importance in clinical research. The goal of sample size calculations is to assure that statistical power is acceptable while maintaining a small probability of a type I error. This chapter overviews the fundamentals of sample size calculation for standard types of outcomes for two-group studies. It considers (1) the problems of determining the size of the treatment effect that the studies will be designed to detect, (2) the modifications to sample size calculations to account for loss to follow-up and nonadherence, (3) the options when initial calculations indicate that the feasible sample size is insufficient to provide adequate power, and (4) the implication of using multiple primary endpoints. Sample size estimates for longitudinal cohort studies must take account of confounding by baseline factors.
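For the standard two-group comparison of means with a continuous endpoint, the calculation the chapter describes reduces to a familiar closed form. The dropout inflation shown is the simple 1/(1 - d) adjustment, a common simplified convention rather than necessarily the chapter's exact method.

```python
import math
from statistics import NormalDist

def n_per_group(delta, sd, alpha=0.05, power=0.8, dropout=0.0):
    """Per-group sample size for a two-group comparison of means:
    n = 2 * sd^2 * (z_{1-alpha/2} + z_{power})^2 / delta^2,
    inflated by 1/(1 - dropout) for expected loss to follow-up."""
    z = NormalDist().inv_cdf(1 - alpha / 2) + NormalDist().inv_cdf(power)
    n = 2 * (sd * z / delta) ** 2
    return math.ceil(n / (1 - dropout))

# Detecting a 0.5-SD difference at 80% power, two-sided alpha = 0.05:
# 63 per group, rising to 70 if 10% dropout is anticipated.
```

The formula makes the chapter's trade-offs concrete: halving the detectable effect quadruples n, which is why overstating the plausible treatment effect is the most common cause of underpowered trials.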

  11. [The methodology and sample description of the National Survey on Addiction Problems in Hungary 2015 (NSAPH 2015)].

    Science.gov (United States)

    Paksi, Borbala; Demetrovics, Zsolt; Magi, Anna; Felvinczi, Katalin

    2017-06-01

This paper introduces the methods and methodological findings of the National Survey on Addiction Problems in Hungary (NSAPH 2015). Use patterns of smoking, alcohol use and other psychoactive substances were measured, as well as those of certain behavioural addictions (problematic gambling - PGSI, DSM-V; eating disorders - SCOFF; problematic internet use - PIUQ; problematic on-line gaming - POGO; problematic social media use - FAS; exercise addiction - EAI-HU; work addiction - BWAS; compulsive buying - CBS). The paper also describes the applied measurement techniques, sample selection, recruitment of respondents and the data collection strategy. Methodological results of the survey, including reliability and validity of the measures, are reported. The NSAPH 2015 research was carried out on a nationally representative sample of the Hungarian adult population aged 16-64 years (gross sample 2477, net sample 2274 persons), with the age group of 18-34 being overrepresented. Statistical analysis of the weight distribution suggests that weighting did not create any artificial distortion in the database, leaving the representativeness of the sample unaffected. The size of the weighted sample of the 18-64-year-old adult population is 1490 persons. The theoretical margin of error in the weighted sample is ±2.5% at a reliability level of 95%, which is in line with the original data collection plans. Based on the analysis of reliability and of errors beyond sampling, we conclude that inconsistencies create relatively minor distortions in cumulative prevalence rates; consequently, the database allows reliable estimation of risk factors related to different substance use behaviours. The reliability indexes of measurements used for prevalence estimates of behavioural addictions proved to be appropriate, though the psychometric features in some cases suggest the presence of redundant items. The comparison of
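The quoted ±2.5% theoretical margin of error is reproducible from the weighted sample size under a simple-random-sampling assumption (worst case p = 0.5; the design effect of stratification and weighting is ignored in this sketch):

```python
import math
from statistics import NormalDist

def margin_of_error(n, conf=0.95, p=0.5):
    """Worst-case margin of error for a simple random sample of size n
    at the given confidence level (design effects ignored)."""
    z = NormalDist().inv_cdf((1 + conf) / 2)
    return z * math.sqrt(p * (1 - p) / n)

# The weighted 18-64 sample of 1490 persons gives roughly +/- 2.5%.
moe = margin_of_error(1490)
```

The same calculation shows why the gross sample was drawn larger: at the net sample of 2274 the margin would be about ±2.1%, leaving headroom for the weighting down to 1490.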

  12. Characterization of Electron Microscopes with Binary Pseudo-random Multilayer Test Samples

    Energy Technology Data Exchange (ETDEWEB)

    V Yashchuk; R Conley; E Anderson; S Barber; N Bouet; W McKinney; P Takacs; D Voronov

    2011-12-31

Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested [1,2] and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with BPRML test samples fabricated from a WiSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.
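The MTF-calibration idea behind BPR test samples admits a compact sketch: because an ideal binary pseudo-random sequence has an essentially flat (white) power spectrum, the ratio of the measured to the ideal power spectral density estimates the squared MTF at each spatial frequency. The function below illustrates that principle only, not the authors' pipeline; real data require windowing, spectral averaging, and care at bins where the ideal PSD vanishes.

```python
import numpy as np

def mtf_from_bpr(measured_profile, ideal_profile):
    """Estimate the instrument MTF from a 1-D binary pseudo-random
    test-pattern profile as sqrt(PSD_measured / PSD_ideal), per
    frequency bin. Bins where the ideal PSD is zero come out as
    non-finite and should be masked by the caller."""
    psd_meas = np.abs(np.fft.rfft(measured_profile)) ** 2
    psd_ideal = np.abs(np.fft.rfft(ideal_profile)) ** 2
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.sqrt(psd_meas / psd_ideal)
```

An instrument that blurs the pattern attenuates the high-frequency bins of the measured PSD, so the estimated MTF rolls off toward the Nyquist frequency exactly as the calibration method expects.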

  13. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    Energy Technology Data Exchange (ETDEWEB)

    Yashchuk, Valeriy V., E-mail: VVYashchuk@lbl.gov [Advanced Light Source, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Conley, Raymond [NSLS-II, Brookhaven National Laboratory, Upton, NY 11973 (United States); Anderson, Erik H. [Center for X-ray Optics, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Barber, Samuel K. [Advanced Light Source, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Bouet, Nathalie [NSLS-II, Brookhaven National Laboratory, Upton, NY 11973 (United States); McKinney, Wayne R. [Advanced Light Source, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Takacs, Peter Z. [Brookhaven National Laboratory, Upton, NY 11973 (United States); Voronov, Dmitriy L. [Advanced Light Source, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)

    2011-09-01

    Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi{sub 2}/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.

  14. Accounting for Sampling Error in Genetic Eigenvalues Using Random Matrix Theory.

    Science.gov (United States)

    Sztepanacz, Jacqueline L; Blows, Mark W

    2017-07-01

    The distribution of genetic variance in multivariate phenotypes is characterized by the empirical spectral distribution of the eigenvalues of the genetic covariance matrix. Empirical estimates of genetic eigenvalues from random effects linear models are known to be overdispersed by sampling error, where large eigenvalues are biased upward, and small eigenvalues are biased downward. The overdispersion of the leading eigenvalues of sample covariance matrices has been demonstrated to conform to the Tracy-Widom (TW) distribution. Here we show that genetic eigenvalues estimated using restricted maximum likelihood (REML) in a multivariate random effects model with an unconstrained genetic covariance structure will also conform to the TW distribution after empirical scaling and centering. However, where estimation procedures using either REML or MCMC impose boundary constraints, the resulting genetic eigenvalues tend not to be TW distributed. We show how using confidence intervals from sampling distributions of genetic eigenvalues without reference to the TW distribution is insufficient protection against mistaking sampling error for genetic variance, particularly when eigenvalues are small. By scaling such sampling distributions to the appropriate TW distribution, the critical value of the TW statistic can be used to determine if the magnitude of a genetic eigenvalue exceeds the sampling error for each eigenvalue in the spectral distribution of a given genetic covariance matrix. Copyright © 2017 by the Genetics Society of America.

  15. Application of response surface methodology for determination of methyl red in water samples by spectrophotometry method.

    Science.gov (United States)

    Khodadoust, Saeid; Ghaedi, Mehrorang

    2014-12-10

    In this study a rapid and effective method (dispersive liquid-liquid microextraction (DLLME)) was developed for extraction of methyl red (MR) prior to its determination by UV-Vis spectrophotometry. Variables influencing DLLME, such as the volume of chloroform (as extractant solvent) and methanol (as dispersive solvent), pH, ionic strength and extraction time, were investigated. The significant variables were then optimized by using a Box-Behnken design (BBD) and a desirability function (DF). The optimized conditions (100 μL of chloroform, 1.3 mL of ethanol, pH 4 and 4% (w/v) NaCl) resulted in a linear calibration graph in the range of 0.015-10.0 mg mL(-1) of MR in initial solution with R(2) = 0.995 (n = 5). The limit of detection (LOD) and limit of quantification (LOQ) were 0.005 and 0.015 mg mL(-1), respectively. Finally, the DLLME method was applied for determination of MR in different water samples with relative standard deviation (RSD) less than 5% (n = 5). Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Pumping tests in networks of multilevel sampling wells: Motivation and methodology

    Science.gov (United States)

    Butler, J.J.; McElwee, C.D.; Bohling, G.C.

    1999-01-01

    The identification of spatial variations in hydraulic conductivity (K) on a scale of relevance for transport investigations has proven to be a considerable challenge. Recently, a new field method for the estimation of interwell variations in K has been proposed. This method, hydraulic tomography, essentially consists of a series of short-term pumping tests performed in a tomographic-like arrangement. In order to fully realize the potential of this approach, information about lateral and vertical variations in pumping-induced head changes (drawdown) is required with detail that has previously been unobtainable in the field. Pumping tests performed in networks of multilevel sampling (MLS) wells can provide data of the needed density if drawdown can accurately and rapidly be measured in the small-diameter tubing used in such wells. Field and laboratory experiments show that accurate transient drawdown data can be obtained in the small-diameter MLS tubing either directly with miniature fiber-optic pressure sensors or indirectly using air-pressure transducers. As with data from many types of hydraulic tests, the quality of drawdown measurements from MLS tubing is quite dependent on the effectiveness of well development activities. Since MLS ports of the standard design are prone to clogging and are difficult to develop, alternate designs are necessary to ensure accurate drawdown measurements. Initial field experiments indicate that drawdown measurements obtained from pumping tests performed in MLS networks have considerable potential for providing valuable information about spatial variations in hydraulic conductivity.

  17. Methodologies and perspectives of proteomics applied to filamentous fungi: from sample preparation to secretome analysis.

    Science.gov (United States)

    Bianco, Linda; Perrotta, Gaetano

    2015-03-12

    Filamentous fungi possess the extraordinary ability to digest complex biomasses and mineralize numerous xenobiotics, as a consequence of their aptitude for sensing the environment and regulating their intra- and extracellular proteins, producing drastic changes in proteome and secretome composition. Recent advancement in proteomic technologies offers an exciting opportunity to reveal the fluctuations of fungal proteins and enzymes responsible for their metabolic adaptation to a large variety of environmental conditions. Here, an overview of the most commonly used proteomic strategies will be provided; this paper will range from sample preparation to gel-free and gel-based proteomics, discussing the pros and cons of each mentioned state-of-the-art technique. The main focus will be on filamentous fungi. Due to the biotechnological relevance of lignocellulose-degrading fungi, special attention will finally be given to their extracellular proteome, or secretome. Secreted proteins and enzymes will be discussed in relation to their involvement in bio-based processes, such as biomass deconstruction and mycoremediation.

  18. A Methodology to Estimate Ores Work Index Values, Using Miduk Copper Mine Sample

    Directory of Open Access Journals (Sweden)

    Mohammad Noaparast

    2012-12-01

    Full Text Available It is always attempted to reduce the costs of comminution in mineral processing plants. One of the difficulties in the size reduction section is improper design. The key factor in designing size reduction units such as crushers and grinding mills is the ore's work index. The work index, wi, represents the ore grindability, and is used in Bond's formula to calculate the required energy. Bond defined a specific relationship between the parameters applied to calculate wi: the control screen, the fine particles produced, and the feed and product d80. In this research work, a high grade copper sample from the Miduk copper concentrator was prepared, and its work index values were experimentally estimated using different control screens: 600, 425, 212, 150, 106 and 75 microns. The results obtained from the tests showed two different behaviors in fine production. According to these two trends, models were then defined to calculate the fine mass from the control screen. In the next step, an equation was presented in order to calculate the Miduk copper ore work index for any size. In addition, to verify the model's credibility, a test using a 300 micron control screen was performed and its result was compared with the value calculated from the defined model, which showed a good fit. Finally the experimental and calculated values were compared and their relative error was equal to 4.11%, which is an indication of a good fit for the results.
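    The Bond formula mentioned above relates the work index to the specific comminution energy via the feed and product d80 sizes; a minimal sketch with illustrative values (not the Miduk test data):

```python
import math

def bond_energy(wi, f80, p80):
    """Specific comminution energy (kWh/t) from Bond's law:
    W = 10 * wi * (1/sqrt(P80) - 1/sqrt(F80)).

    wi  : work index, kWh/t
    f80 : 80% passing size of the feed, microns
    p80 : 80% passing size of the product, microns
    """
    return 10.0 * wi * (1.0 / math.sqrt(p80) - 1.0 / math.sqrt(f80))

# Hypothetical values: wi = 12 kWh/t, grinding from F80 = 1000 um to P80 = 106 um
print(round(bond_energy(12.0, 1000.0, 106.0), 2))  # ~7.86 kWh/t
```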

  19. Methodology For Reduction Of Sampling On The Visual Inspection Of Developed And Etched Wafers

    Science.gov (United States)

    van de Ven, Jamie S.; Khorasani, Fred

    1989-07-01

    There is a lot of inspection in the manufacturing of semiconductor devices. Generally, the more important a manufacturing step, the higher the level of inspection. In some cases 100% of the wafers are inspected after certain steps. Inspection is a non-value-added and expensive activity. It requires an army of "inspectors," often expensive equipment, and becomes a "bottleneck" when the level of inspection is high. Although inspection helps identify quality problems, it hurts productivity. The new management, quality and productivity philosophies recommend against over-inspection [Point #3 in Dr. Deming's 14 Points for Management (1)]; 100% inspection is quite unnecessary. Often the nature of a process allows us to reduce inspection drastically and still maintain a high level of confidence in quality. In section 2, we discuss such situations and show that some elementary probability theory allows us to determine sample sizes and measure the chances of catching a bad "lot" and accepting a good lot. In section 3, we provide an example and application of the theory, and make a few comments on the money and time saved because of this work. Finally, in section 4, we draw some conclusions about the new quality and productivity philosophies and how applied statisticians and engineers should study every situation individually and avoid blindly using methods and tables given in books.
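    The elementary probability theory referred to in section 2 can be sketched with a zero-acceptance-number plan: the chance that a random sample misses every defective wafer follows the hypergeometric distribution. The lot and sample numbers below are hypothetical, for illustration only:

```python
from math import comb

def p_accept(N, D, n):
    """Probability that a random sample of n wafers from a lot of N,
    containing D defective wafers, contains no defectives at all
    (zero-acceptance-number plan, hypergeometric model)."""
    if n > N - D:
        return 0.0
    return comb(N - D, n) / comb(N, n)

# Hypothetical lot of 50 wafers with 5 defective; inspecting 10 at random,
# the chance of missing the problem entirely:
print(round(p_accept(50, 5, 10), 3))  # ~0.311
```

Tables of such probabilities for candidate sample sizes are exactly what lets one trade inspection effort against the risk of accepting a bad lot.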

  20. Trace-element characterization of evidential Cannabis sativa samples using k{sub 0}-standardization methodology

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, D.P. Jr.; Vernetson, W.G.; Ratner, R.T. [Univ. of Florida, Gainesville, FL (United States)] [and others]

    1995-12-31

    The University of Florida Training Reactor (UFTR) facilities, including the analytical laboratory, are used for a wide range of educational, research, training, and service functions. The UFTR is a 100-kW light-water-cooled, graphite-and-water-moderated modified Argonaut-type reactor. The UFTR utilizes highly enriched plate-type fuel in a two-slab arrangement and operates at a 100-kW power level. Since first licensed to operate at 10 kW in 1959, this nonpower reactor facility has had an active but evolving record of continuous service to a wide range of academic, utility, and community users. The services of the UFTR have also been used by various state authorities in criminal investigations. Because of its relatively low power and careful laboratory analyses, the UFTR neutron flux characteristics in several ports are not only well characterized but also quite invariant with time. As a result, such a facility is well suited to multielement analysis using the k{sub 0}-standardization method of neutron activation analysis. The analysis of untreated evidential botanical samples presented a unique opportunity to demonstrate implementation of this method at the UFTR facilities.

  1. Tobacco smoking surveillance: is quota sampling an efficient tool for monitoring national trends? A comparison with a random cross-sectional survey.

    Directory of Open Access Journals (Sweden)

    Romain Guignard

    Full Text Available OBJECTIVES: It is crucial for policy makers to monitor the evolution of tobacco smoking prevalence. In France, this monitoring is based on a series of cross-sectional general population surveys, the Health Barometers, conducted every five years and based on random samples. A methodological study has been carried out to assess the reliability of a monitoring system based on regular quota sampling surveys for smoking prevalence. DESIGN / OUTCOME MEASURES: In 2010, current and daily tobacco smoking prevalences obtained in a quota survey on 8,018 people were compared with those of the 2010 Health Barometer carried out on 27,653 people. Prevalences were assessed separately according to the telephone equipment of the interviewee (landline phone owner vs "mobile-only"), and logistic regressions were conducted in the pooled database to assess the impact of the telephone equipment and of the survey mode on the prevalences found. Finally, logistic regressions adjusted for sociodemographic characteristics were conducted in the random sample in order to determine the impact of the number of calls needed to interview "hard-to-reach" people on the prevalence found. RESULTS: Current and daily prevalences were higher in the random sample (respectively 33.9% and 27.5% in 15-75 year-olds) than in the quota sample (respectively 30.2% and 25.3%). In both surveys, current and daily prevalences were lower among landline phone owners (respectively 31.8% and 25.5% in the random sample and 28.9% and 24.0% in the quota survey). The required number of calls was slightly related to the smoking status after adjustment for sociodemographic characteristics. CONCLUSION: Random sampling appears to be more effective than quota sampling, mainly by making it possible to interview hard-to-reach populations.

  2. Methodologies for measurement of transuranic elements in environmental samples and migration behavior of transuranic elements in paddy fields

    Energy Technology Data Exchange (ETDEWEB)

    Yamamoto, Masayoshi; Ueno, Kaori [Kanazawa Univ., Tatsunokuchi, Ishikawa (Japan). Low Level Radioactivity Lab.; Amano, Hikaru

    1996-02-01

    Methodologies for the measurement of transuranic elements in environmental samples and the migration behavior of transuranic elements in paddy fields are reviewed in this report. Long-lived transuranic elements in our environment are quite important because their effect on humans is prolonged. Analysis of the migration of long-lived transuranic elements in paddy fields is also quite important because rice is a staple food. For the measurement of transuranic elements in environmental samples, traditional chemical separation and purification plus alpha-ray spectrometric methods are reviewed along with mass spectrometric methods. For the estimation of migration behavior of transuranic elements in paddy fields, experimental results from 1957 to 1989 in Japan are reviewed. Important findings are as follows. (1) The detection limit of transuranic elements for traditional chemical separation and purification plus alpha-ray spectrometric methods is about 0.2 mBq/sample (10,000 min counting). In contrast, the detection limit for mass spectrometric methods using high-resolution ICP-MS is 0.02 mBq/sample for {sup 237}Np. (2) Integrated deposits of {sup 239,240}Pu and {sup 137}Cs in paddy field soils are 2-3 times higher on the Pacific Ocean side than on the Japan Sea side of Japan. (3) The apparent residence time of {sup 237}Np in paddy field soils was estimated to be in the range of 50-70 years, which is shorter than those of {sup 239,240}Pu and {sup 137}Cs (100-140 years). (author) 54 refs.

  3. On analysis-based two-step interpolation methods for randomly sampled seismic data

    Science.gov (United States)

    Yang, Pengliang; Gao, Jinghuai; Chen, Wenchao

    2013-02-01

    Interpolating the missing traces of regularly or irregularly sampled seismic record is an exceedingly important issue in the geophysical community. Many modern acquisition and reconstruction methods are designed to exploit the transform domain sparsity of the few randomly recorded but informative seismic data using thresholding techniques. In this paper, to regularize randomly sampled seismic data, we introduce two accelerated, analysis-based two-step interpolation algorithms, the analysis-based FISTA (fast iterative shrinkage-thresholding algorithm) and the FPOCS (fast projection onto convex sets) algorithm from the IST (iterative shrinkage-thresholding) algorithm and the POCS (projection onto convex sets) algorithm. A MATLAB package is developed for the implementation of these thresholding-related interpolation methods. Based on this package, we compare the reconstruction performance of these algorithms, using synthetic and real seismic data. Combined with several thresholding strategies, the accelerated convergence of the proposed methods is also highlighted.
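    The projection-onto-convex-sets recovery idea described above can be sketched on a toy 1-D signal: alternate thresholding in a sparsifying (here Fourier) domain with reinsertion of the observed samples. The signal, mask, threshold schedule and iteration count below are all illustrative choices, not the authors' parameterization:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "trace": sparse in the Fourier domain (two harmonics)
n = 256
t = np.arange(n)
x_true = np.sin(2 * np.pi * 5 * t / n) + 0.5 * np.sin(2 * np.pi * 23 * t / n)

# Randomly observe about 50% of the samples; zero-fill the rest
mask = rng.random(n) < 0.5
y = np.where(mask, x_true, 0.0)

# POCS with a linearly decaying threshold: keep the strongest Fourier
# coefficients, transform back, then re-impose the observed data.
z = y.copy()
for k in range(100):
    spec = np.fft.fft(z)
    tau = np.abs(spec).max() * (1 - k / 100)   # decaying threshold
    spec[np.abs(spec) < tau] = 0.0
    z = np.real(np.fft.ifft(spec))
    z[mask] = x_true[mask]                     # honor observed samples

err = np.linalg.norm(z - x_true) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {err:.3f}")
```

FISTA differs mainly in adding a momentum (extrapolation) step to accelerate the same thresholding iteration.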

  4. Hemodynamic and glucometabolic factors fail to predict renal function in a random population sample

    DEFF Research Database (Denmark)

    Pareek, M.; Nielsen, M.; Olesen, Thomas Bastholm

    2015-01-01

    Objective: To determine whether baseline hemodynamic and/or glucometabolic risk factors could predict renal function at follow-up, independently of baseline serum creatinine, in survivors from a random population sample. Design and method: We examined associations between baseline serum creatinine...... indices of beta-cell function (HOMA-2B), insulin sensitivity (HOMA-2S), and insulin resistance (HOMA-2IR)), traditional cardiovascular risk factors (age, sex, smoking status, body mass index, diabetes mellitus, total serum cholesterol), and later renal function determined as serum cystatin C in 238 men...... and 7 women aged 38 to 49 years at the time of inclusion, using multivariable linear regression analysis (p-entry 0.05, p-removal 0.20). Study subjects came from a random population based sample and were included 1974-1992, whilst the follow-up with cystatin C measurement was performed 2002...

  5. An inversion method based on random sampling for real-time MEG neuroimaging

    CERN Document Server

    Pascarella, Annalisa

    2016-01-01

    MagnetoEncephaloGraphy (MEG) has gained great interest in neurorehabilitation training due to its high temporal resolution. The challenge is to localize the active regions of the brain in a fast and accurate way. In this paper we use an inversion method based on random spatial sampling to solve the real-time MEG inverse problem. Several numerical tests on synthetic but realistic data show that the method takes just a few hundredths of a second on a laptop to produce an accurate map of the electric activity inside the brain. Moreover, it requires very little memory storage. For these reasons the random sampling method is particularly attractive in real-time MEG applications.

  6. Methodological interference of biochar in the determination of extracellular enzyme activities in composting samples

    Science.gov (United States)

    Jindo, K.; Matsumoto, K.; García Izquierdo, C.; Sonoki, T.; Sanchez-Monedero, M. A.

    2014-07-01

    Biochar application has received increasing attention as a means to trap recalcitrant carbon and enhance soil fertility. Hydrolytic enzymatic assays, such as β-glucosidase and phosphatase activities, are used for the assessment of soil quality and the composting process, and are based on the use of p-nitrophenol (PNP) derivatives as substrate. However, the sorption capacity of biochar can interfere with colorimetric determination of the hydrolysed PNP, either by sorption of the substrate or of the reaction product of hydrolysis onto the biochar surface. The aim of the present work is to study the biochar sorption capacity for PNP in biochar-blended composting mixtures in order to assess its impact on the estimation of the colorimetric-based enzymatic assays. A retention test was conducted by adding a solution of known amounts of PNP in universal buffer solution (pH = 5, 6.5 and 11, corresponding to the β-glucosidase, acid and alkaline phosphatase activity assays, respectively) to samples taken at the initial stage and after the maturation stage from four different composting piles (two manure composting piles, PM: poultry manure, CM: cow manure, and two similar piles containing 10% of additional biochar, PM + B and CM + B). The results show that biochar-blended composts (PM + B, CM + B) generally exhibited lower enzymatic activities compared to manure composts without biochar (PM, CM). In terms of the difference between the initial and maturation stages of the composting process, PNP retention on biochar was higher at the maturation stage, caused most probably by an enlarged proportion of biochar in the compost mixture after the selective degradation of easily decomposable organic matter. The retention of PNP on biochar was influenced by the pH dependency of the sorption capacity of biochar and/or PNP solubility, since PNP was more efficiently retained by biochar at low pH values (5 and 6.5) than at a high pH value (11).

  7. Modified Exponential Type Estimator for Population Mean Using Auxiliary Variables in Stratified Random Sampling

    OpenAIRE

    Özel, Gamze

    2015-01-01

    In this paper, a new exponential type estimator is developed in the stratified random sampling for the population mean using auxiliary variable information. In order to evaluate efficiency of the introduced estimator, we first review some estimators and study the optimum property of the suggested strategy. To judge the merits of the suggested class of estimators over others under the optimal condition, simulation study and real data applications are conducted. The results show that the introduc...

  8. Effectiveness of hand hygiene education among a random sample of women from the community

    OpenAIRE

    Ubheeram, J.; Biranjia-Hurdoyal, S.D.

    2017-01-01

    Summary Objective. The effectiveness of hand hygiene education was investigated by studying the hand hygiene awareness and bacterial hand contamination among a random sample of 170 women in the community. Methods. Questionnaire was used to assess the hand hygiene awareness score, followed by swabbing of the dominant hand. Bacterial identification was done by conventional biochemical tests. Results. Better hand hygiene awareness score was significantly associated with age, scarce bacterial gro...

  9. Control Capacity and A Random Sampling Method in Exploring Controllability of Complex Networks

    OpenAIRE

    Jia, Tao; Barabási, Albert-László

    2013-01-01

    Controlling complex systems is a fundamental challenge of network science. Recent advances indicate that control over the system can be achieved through a minimum driver node set (MDS). The existence of multiple MDS's suggests that nodes do not participate in control equally, prompting us to quantify their participations. Here we introduce control capacity quantifying the likelihood that a node is a driver node. To efficiently measure this quantity, we develop a random sampling algorithm. Thi...

  10. Classification of epileptic EEG signals based on simple random sampling and sequential feature selection

    OpenAIRE

    Ghayab, Hadi Ratham Al; Li, Yan; Abdulla, Shahab; Diykh, Mohammed; Wan, Xiangkui

    2016-01-01

    Electroencephalogram (EEG) signals are used broadly in the medical fields. The main applications of EEG signals are the diagnosis and treatment of diseases such as epilepsy, Alzheimer, sleep problems and so on. This paper presents a new method which extracts and selects features from multi-channel EEG signals. This research focuses on three main points. Firstly, simple random sampling (SRS) technique is used to extract features from the time domain of EEG signals. Secondly, the sequential fea...

  11. Systematic review of randomized controlled trials of candidate treatments for cognitive impairment in depression and methodological challenges in the field

    DEFF Research Database (Denmark)

    Miskowiak, K. W.; Ott, C. V.; Petersen, Jeff Zarp

    2016-01-01

    Cognitive impairment is a core feature of Major Depressive Disorder (MDD) but treatments targeting cognition are lacking. Numerous pre-clinical and clinical studies have investigated potential cognition treatments, but overall the evidence is conflicting. We conducted a systematic search following...... the PRISMA guidelines on PubMed and PsychInfo to evaluate the extant evidence and methodological challenges in randomized controlled trials (RCTs) of biological, psychological and behavioural candidate treatments targeting cognition in MDD. Inclusion criteria were RCTs with a placebo control assessing...... potential pro-cognitive effects of candidate treatments in MDD. Two independent authors reviewed the studies and assessed their risk of bias with the Cochrane Collaboration's Risk of Bias tool. Twenty-eight eligible studies (24 biological and four psychological or behavioural studies) were identified

  12. Uncertainty Of Stream Nutrient Transport Estimates Using Random Sampling Of Storm Events From High Resolution Water Quality And Discharge Data

    Science.gov (United States)

    Scholefield, P. A.; Arnscheidt, J.; Jordan, P.; Beven, K.; Heathwaite, L.

    2007-12-01

    The uncertainties associated with stream nutrient transport estimates are frequently overlooked and the sampling strategy is rarely if ever investigated. Indeed, the impact of sampling strategy and estimation method on the bias and precision of stream phosphorus (P) transport calculations is little understood despite the use of such values in the calibration and testing of models of phosphorus transport. The objectives of this research were to investigate the variability and uncertainty in the estimates of total phosphorus transfers at an intensively monitored agricultural catchment. The Oona Water, which is located in the Irish border region, is part of a long term monitoring program focusing on water quality. The Oona Water is a rural river catchment with grassland agriculture and scattered dwelling houses and has been monitored for total phosphorus (TP) at 10 min resolution for several years (Jordan et al., 2007). Concurrent sensitive measurements of discharge are also collected. The water quality and discharge data were provided at 1 hour resolution (averaged), which meant that a robust estimate of the annual flow-weighted concentration could be obtained by simple interpolation between points. A two-strata approach (Kronvang and Bruhn, 1996) was used to estimate flow-weighted concentrations using randomly sampled storm events from the 400 identified within the time series, together with base flow concentrations. Using a random stratified sampling approach for the selection of events, series ranging from 10 up to the full 400 were used, each time generating a flow-weighted mean using a load-discharge relationship identified through log-log regression and Monte Carlo simulation. These values were then compared to the observed total phosphorus concentration for the catchment. Analysis of these results shows the impact of sampling strategy, the inherent bias in any estimate of phosphorus concentrations and the uncertainty associated with such estimates. The
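    The flow-weighted mean concentration underlying such load estimates is simply total load divided by total discharge; a minimal sketch with hypothetical hourly values:

```python
import numpy as np

def flow_weighted_mean(conc, discharge):
    """Flow-weighted mean concentration: sum(C * Q) / sum(Q)."""
    conc = np.asarray(conc, dtype=float)
    discharge = np.asarray(discharge, dtype=float)
    return (conc * discharge).sum() / discharge.sum()

# Hypothetical hourly record: TP concentration (mg/L) and discharge (m^3/s),
# with a storm peak in the middle
c = [0.02, 0.05, 0.30, 0.12, 0.03]
q = [0.5, 0.8, 4.0, 1.5, 0.6]
print(round(flow_weighted_mean(c, q), 4))  # ~0.1957 mg/L
```

Because the storm hours dominate both C and Q, skipping storm events biases this estimate low, which is why the sampling of storm events matters so much here.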

  13. Precision of systematic and random sampling in clustered populations: habitat patches and aggregating organisms.

    Science.gov (United States)

    McGarvey, Richard; Burch, Paul; Matthews, Janet M

    2016-01-01

    Natural populations of plants and animals spatially cluster because (1) suitable habitat is patchy, and (2) within suitable habitat, individuals aggregate further into clusters of higher density. We compare the precision of random and systematic field sampling survey designs under these two processes of species clustering. Second, we evaluate the performance of 13 estimators for the variance of the sample mean from a systematic survey. Replicated simulated surveys, as counts from 100 transects allocated either randomly or systematically within the study region, were used to estimate population density in six spatial point populations including habitat patches and Matérn circular clustered aggregations of organisms, together and in combination. The standard one-start aligned systematic survey design, a uniform 10 x 10 grid of transects, was much more precise. Variances of the 10 000 replicated systematic survey mean densities were one-third to one-fifth of those from randomly allocated transects, implying that transect sample sizes giving equivalent precision by random survey would need to be three to five times larger. Organisms being restricted to patches of habitat was alone sufficient to yield this precision advantage for the systematic design. But this improved precision for systematic sampling in clustered populations is underestimated by the standard variance estimators used to compute confidence intervals. True variance for the survey sample mean was computed from the variance of 10 000 simulated survey mean estimates. Testing 10 published and three newly proposed variance estimators, the two variance estimators that corrected for inter-transect correlation (ν₈ and ν(W)) were the most accurate and also the most precise in clustered populations. These greatly outperformed the two "post-stratification" variance estimators (ν₂ and ν₃) that are now more commonly applied in systematic surveys. Similar variance estimator performance rankings were found with
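    The precision advantage of a one-start aligned systematic grid in a patchy population can be reproduced in a toy simulation; the field size, patch geometry and densities below are illustrative, not the paper's populations:

```python
import numpy as np

rng = np.random.default_rng(1)

# Patchy population on a 100 x 100 grid: a dense "habitat patch"
# on a near-empty background (illustrative parameters)
field = rng.poisson(0.1, size=(100, 100)).astype(float)
field[20:60, 30:70] += rng.poisson(5.0, size=(40, 40))
true_mean = field.mean()

def survey_random(n=100):
    """Mean count over n uniformly random cells ('transects')."""
    idx = rng.integers(0, 100, size=(n, 2))
    return field[idx[:, 0], idx[:, 1]].mean()

def survey_systematic():
    """Mean count over a one-start aligned 10 x 10 grid of cells."""
    start = rng.integers(0, 10, size=2)
    return field[start[0]::10, start[1]::10].mean()

reps = 2000
var_rand = np.var([survey_random() for _ in range(reps)])
var_sys = np.var([survey_systematic() for _ in range(reps)])
print(var_sys < var_rand)  # the systematic grid is more precise here
```

The grid always intersects the patch with the same number of transects, so between-survey variance comes only from within-patch variation; random allocation also varies how many transects hit the patch at all.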

  14. Methodological Reporting Quality of Randomized Controlled Trials in 3 Leading Diabetes Journals From 2011 to 2013 Following CONSORT Statement: A System Review.

    Science.gov (United States)

    Zhai, Xiao; Wang, Yiran; Mu, Qingchun; Chen, Xiao; Huang, Qin; Wang, Qijin; Li, Ming

    2015-07-01

    To appraise the current methodological reporting quality of randomized clinical trials (RCTs) in 3 leading diabetes journals. We systematically searched the literature for RCTs in Diabetes Care, Diabetes and Diabetologia from 2011 to 2013. Characteristics were extracted based on the Consolidated Standards of Reporting Trials (CONSORT) statement. Generation of allocation, concealment of allocation, intention-to-treat (ITT) analysis and handling of dropouts were defined as primary outcomes indicating "low risk of bias." Sample size calculation, type of intervention, country, number of patients, and funding source were also extracted and descriptively reported. Trials were compared among journals, study years, and other characteristics. A total of 305 RCTs were enrolled in this study. One hundred eight (35.4%) trials reported adequate generation of allocation, 87 (28.5%) trials reported adequate concealment of allocation, 53 (23.8%) trials used ITT analysis, and 130 (58.3%) trials were adequate in handling of dropouts. Only 15 (4.9%) were "low risk of bias" trials. Studies at a large scale (n > 100) or from Europe presented more "low risk of bias" trials than those at a small scale (n ≤ 100) or from other regions. No improvements were found over these 3 years. This study shows that the methodological reporting quality of RCTs in the major diabetes journals remains suboptimal. It can be further improved to meet and keep up with the standards of the CONSORT statement.

  15. Determining optimal sample sizes for multi-stage randomized clinical trials using value of information methods.

    Science.gov (United States)

    Willan, Andrew; Kowgier, Matthew

    2008-01-01

    Traditional sample size calculations for randomized clinical trials depend on somewhat arbitrarily chosen factors, such as Type I and II error rates. An effectiveness trial (otherwise known as a pragmatic or management trial) is essentially an effort to inform decision-making, i.e., should the treatment be adopted over standard care? Taking a societal perspective and using Bayesian decision theory, Willan and Pinto (Stat. Med. 2005; 24:1791-1806 and Stat. Med. 2006; 25:720) show how to determine the sample size that maximizes the expected net gain, i.e., the difference between the value of the information gained from the results and the cost of doing the trial. These methods are extended here to include multi-stage adaptive designs, with a solution given for a two-stage design. The methods are applied to two examples. As the two examples demonstrate, substantial increases in the expected net gain (ENG) can be realized by using multi-stage adaptive designs based on expected value of information methods. In addition, the expected sample size and total cost may be reduced. Exact solutions have been provided for the two-stage design. Solutions for higher-order designs may prove to be prohibitively complex, and approximate solutions may be required. The use of multi-stage adaptive designs for randomized clinical trials based on expected value of sample information methods leads to substantial gains in the ENG and reductions in the expected sample size and total cost.
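The idea of choosing the sample size to maximize expected net gain can be illustrated with a toy Monte Carlo value-of-information model. All parameters and the simple normal prior/likelihood below are assumptions for illustration, not the Willan and Pinto model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy single-stage value-of-information sample-size search: incremental net
# benefit per patient has a normal prior; a two-arm trial with n patients
# per arm yields a normal estimate; adopt the treatment if the posterior
# mean is positive. All numbers are made up.
mu0, sd0 = 50.0, 400.0         # prior mean / sd of incremental net benefit
sigma = 2000.0                 # per-patient outcome sd
N_pop = 20_000                 # future patients affected by the decision
c_fixed, c_per = 100_000, 500  # fixed trial cost, cost per enrolled patient
draws = 20_000

def expected_net_gain(n):
    mu = rng.normal(mu0, sd0, draws)                 # draws of the true effect
    xbar = rng.normal(mu, sigma * np.sqrt(2.0 / n))  # trial estimate
    w = sd0**2 / (sd0**2 + 2 * sigma**2 / n)         # posterior weight on data
    post = w * xbar + (1 - w) * mu0                  # posterior mean
    # value of sampling = gain from deciding with data vs. with prior alone
    evsi = N_pop * (np.mean(np.where(post > 0, mu, 0.0)) - max(mu0, 0.0))
    return evsi - (c_fixed + c_per * 2 * n)

grid = [25, 50, 100, 200, 400, 800, 1600]
eng = {n: expected_net_gain(n) for n in grid}
best = max(eng, key=eng.get)
print("ENG-maximizing per-arm n:", best)
```

The multi-stage extension the abstract describes would re-run this kind of calculation at each interim look; only the single-stage trade-off (information value versus trial cost) is sketched here.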

  16. Sample size calculations for pilot randomized trials: a confidence interval approach.

    Science.gov (United States)

    Cocks, Kim; Torgerson, David J

    2013-02-01

    To describe a method using confidence intervals (CIs) to estimate the sample size for a pilot randomized trial. Using one-sided CIs and the estimated effect size that would be sought in a large trial, we calculated the sample size needed for pilot trials. Using an 80% one-sided CI, we estimated that a pilot trial should have at least 9% of the sample size of the main planned trial. Using the estimated effect size difference for the main trial and using a one-sided CI, this allows us to calculate a sample size for a pilot trial, which will make its results more useful than at present. Copyright © 2013 Elsevier Inc. All rights reserved.
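The rule of thumb reported above can be turned into a small calculator. The normal-approximation main-trial formula and the 9% fraction (the paper's figure for an 80% one-sided CI) are the only ingredients; the function names are ours:

```python
import math

def main_trial_n_per_arm(d, z_alpha=1.959964, z_beta=0.841621):
    """Normal-approximation per-arm size for a two-arm trial detecting a
    standardized effect d (two-sided alpha = 0.05, 80% power)."""
    return math.ceil(2 * ((z_alpha + z_beta) / d) ** 2)

def pilot_n_total(d, fraction=0.09):
    """Pilot size as a fraction of the main trial's TOTAL size; 9% is the
    abstract's figure for an 80% one-sided CI."""
    return math.ceil(fraction * 2 * main_trial_n_per_arm(d))

print(main_trial_n_per_arm(0.3), pilot_n_total(0.3))  # 175 per arm, 32 pilot
```

So a main trial powered for a standardized effect of 0.3 (350 patients total) would call for a pilot of roughly 32 participants under this rule.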

  17. The cyclical lower extremity exercise for Parkinson's trial (CYCLE): methodology for a randomized controlled trial.

    Science.gov (United States)

    Rosenfeldt, Anson B; Rasanow, Matthew; Penko, Amanda L; Beall, Erik B; Alberts, Jay L

    2015-04-24

    Motor and non-motor impairments affect quality of life in individuals with Parkinson's disease. Our preliminary research indicates that forced exercise cycling, a mode of exercise in which a participant's voluntary rate of exercise is augmented on a stationary cycle, results in global improvements in the cardinal symptoms of Parkinson's disease. The objective of the Cyclical Lower Extremity Exercise (CYCLE) trial for Parkinson's disease is to determine the effects of forced exercise cycling on motor and non-motor performance when compared to voluntary rate cycling and a non-exercise control group. Additionally, we plan to identify any associated changes in neural activity determined by functional magnetic resonance imaging. A total of 100 individuals with mild to moderate idiopathic Parkinson's disease will participate in a single-center, parallel-group, rater-blind study. Participants will be randomized 2:2:1 into a forced exercise, voluntary exercise, or no-exercise control group, respectively. Both exercise groups will cycle 3 times per week for 8 weeks at identical aerobic intensities for 40 minutes, but participants in the forced exercise group will cycle 30% faster than their voluntary rate by means of an augmented motorized bicycle. Neuroimaging, clinical, and biomechanical assessments of motor and non-motor performance will be made at baseline both 'on' and 'off' medication, after 4 weeks of exercise (midpoint), at end of treatment, and 4 and 8 weeks after end of treatment. The CYCLE trial will play a critical role in determining the effectiveness of two different types of aerobic exercise, forced and voluntary, on motor and non-motor performance in individuals with Parkinson's disease. Additionally, the coupling of clinical, biomechanical, and neuroimaging outcomes has the potential to provide insight into mechanisms underlying change in function as a result of exercise. Clinicaltrials.gov registration number NCT01636297.

  18. Toward a Galactic Distribution of Planets. I. Methodology and Planet Sensitivities of the 2015 High-cadence Spitzer Microlens Sample

    Science.gov (United States)

    Zhu, Wei; Udalski, A.; Calchi Novati, S.; Chung, S.-J.; Jung, Y. K.; Ryu, Y.-H.; Shin, I.-G.; Gould, A.; Lee, C.-U.; Albrow, M. D.; Yee, J. C.; Han, C.; Hwang, K.-H.; Cha, S.-M.; Kim, D.-J.; Kim, H.-W.; Kim, S.-L.; Kim, Y.-H.; Lee, Y.; Park, B.-G.; Pogge, R. W.; KMTNet Collaboration; Poleski, R.; Mróz, P.; Pietrukowicz, P.; Skowron, J.; Szymański, M. K.; Kozłowski, S.; Ulaczyk, K.; Pawlak, M.; OGLE Collaboration; Beichman, C.; Bryden, G.; Carey, S.; Fausnaugh, M.; Gaudi, B. S.; Henderson, C. B.; Shvartzvald, Y.; Wibking, B.; Spitzer Team

    2017-11-01

    We analyze an ensemble of microlensing events from the 2015 Spitzer microlensing campaign, all of which were densely monitored by ground-based high-cadence survey teams. The simultaneous observations from Spitzer and the ground yield measurements of the microlensing parallax vector π_E, from which compact constraints on the microlens properties are derived, including ≲25% uncertainties on the lens mass and distance. With the current sample, we demonstrate that the majority of microlenses are indeed in the mass range of M dwarfs. The planet sensitivities of all 41 events in the sample are calculated, from which we provide constraints on the planet distribution function. In particular, assuming a planet distribution function that is uniform in log q, where q is the planet-to-star mass ratio, we find a 95% upper limit on the fraction of stars that host typical microlensing planets of 49%, which is consistent with previous studies. Based on this planet-free sample, we develop the methodology to statistically study the Galactic distribution of planets using microlensing parallax measurements. Under the assumption that the planet distributions are the same in the bulge as in the disk, we predict that ∼1/3 of all planet detections from the microlensing campaigns with Spitzer should be in the bulge. This prediction will be tested with a much larger sample, and deviations from it can be used to constrain the abundance of planets in the bulge relative to the disk.

  19. Estimating the Size of a Large Network and its Communities from a Random Sample.

    Science.gov (United States)

    Chen, Lin; Karbasi, Amin; Crawford, Forrest W

    2016-01-01

    Most real-world networks are too large to be measured or studied directly and there is substantial interest in estimating global network properties from smaller sub-samples. One of the most important global properties is the number of vertices/nodes in the network. Estimating the number of vertices in a large network is a major challenge in computer science, epidemiology, demography, and intelligence analysis. In this paper we consider a population random graph G = (V, E) from the stochastic block model (SBM) with K communities/blocks. A sample is obtained by randomly choosing a subset W ⊆ V and letting G(W) be the induced subgraph in G of the vertices in W. In addition to G(W), we observe the total degree of each sampled vertex and its block membership. Given this partial information, we propose an efficient PopULation Size Estimation algorithm, called PULSE, that accurately estimates the size of the whole population as well as the size of each community. To support our theoretical analysis, we perform an exhaustive set of experiments to study the effects of sample size, K, and SBM model parameters on the accuracy of the estimates. The experimental results also demonstrate that PULSE significantly outperforms a widely-used method called the network scale-up estimator in a wide variety of scenarios.
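The sampling design described above can be illustrated with a much simpler moment estimator than PULSE (this is not the PULSE algorithm, just the same observation model): for a uniform sample W of size n, a vertex of total degree d has expected within-sample degree d(n-1)/(N-1), which can be inverted for N.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simple moment estimator under the abstract's observation model:
#   N_hat = (n - 1) * sum(d_total) / sum(d_within) + 1.
# An Erdos-Renyi graph stands in for the SBM (i.e., K = 1 block).
N, p = 2000, 0.01
upper = np.triu(rng.random((N, N)) < p, 1)
A = (upper | upper.T).astype(np.int8)       # symmetric adjacency, no self-loops

n = 200
W = rng.choice(N, size=n, replace=False)    # uniform vertex sample
d_total = A[W].sum(axis=1)                  # observed total degree of each sampled vertex
d_within = A[np.ix_(W, W)].sum(axis=1)      # degree within the induced subgraph

N_hat = (n - 1) * d_total.sum() / d_within.sum() + 1
print(f"true N = {N}, N_hat = {N_hat:.0f}")
```

PULSE additionally uses block memberships to estimate each community's size; this sketch only recovers the total vertex count.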

  20. Investigating causal associations between use of nicotine, alcohol, caffeine, and cannabis: A two-sample bidirectional Mendelian randomization study.

    Science.gov (United States)

    Verweij, Karin J H; Treur, Jorien L; Vink, Jacqueline M

    2018-01-15

    Epidemiological studies consistently show co-occurrence of use of different addictive substances. Whether these associations are causal or due to overlapping underlying influences remains an important question in addiction research. Methodological advances have made it possible to use published genetic associations to infer causal relationships between phenotypes. In this exploratory study, we used Mendelian randomization (MR) to examine the causality of well-established associations between nicotine, alcohol, caffeine, and cannabis use. Two-sample MR was employed to estimate bi-directional causal effects between four addictive substances: nicotine (smoking initiation and cigarettes smoked per day), caffeine (cups of coffee per day), alcohol (units per week), and cannabis (initiation). Based on existing genome-wide association results we selected genetic variants associated with the exposure measure as an instrument to estimate causal effects. Where possible we applied sensitivity analyses (MR-Egger and weighted median) more robust to horizontal pleiotropy. Most MR tests did not reveal causal associations. There was some weak evidence for a causal positive effect of genetically instrumented alcohol use on smoking initiation and of cigarettes per day on caffeine use, but these did not hold up with the sensitivity analyses. There was also some suggestive evidence for a positive effect of alcohol use on caffeine use (only with MR-Egger) and smoking initiation on cannabis initiation (only with weighted median). None of the suggestive causal associations survived corrections for multiple testing. Two-sample Mendelian randomization analyses found little evidence for causal relationships between nicotine, alcohol, caffeine, and cannabis use. This article is protected by copyright. All rights reserved.
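A minimal sketch of the core two-sample MR computation, the inverse-variance-weighted (IVW) combination of per-SNP Wald ratios, on made-up summary statistics (the sensitivity estimators mentioned above, MR-Egger and weighted median, are not shown):

```python
import numpy as np

# IVW two-sample MR: combine per-SNP Wald ratios (beta_outcome / beta_exposure)
# weighted by the precision of the outcome associations. Toy numbers only.
bx = np.array([0.12, 0.08, 0.15, 0.10])      # SNP -> exposure effects
by = np.array([0.030, 0.018, 0.039, 0.024])  # SNP -> outcome effects
se_by = np.array([0.010, 0.012, 0.011, 0.009])

w = bx**2 / se_by**2
beta_ivw = np.sum(w * (by / bx)) / np.sum(w)  # = sum(bx*by/se^2) / sum(bx^2/se^2)
se_ivw = np.sqrt(1.0 / np.sum(w))
print(f"IVW causal estimate: {beta_ivw:.3f} (SE {se_ivw:.3f})")
```

Each instrument's ratio by/bx estimates the causal effect under the standard MR assumptions (relevance, independence, no horizontal pleiotropy); IVW is their precision-weighted average.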

  1. COMPARATIVE ANALYSIS OF AVIONICS SAMPLES AND COMPONENTS DUE TO DEVELOPING A METHODOLOGY OF THE UAV INTEGRATED AVIONICS SYNTHESIS

    Directory of Open Access Journals (Sweden)

    Dmytro Prusov

    2017-03-01

    The article analyzes samples of avionics components with a view to developing an integrated avionics synthesis methodology for remotely piloted and unmanned aerospace dynamic objects (ADO), providing a comprehensive solution to the problems of ADO navigation and synergistic control in air navigation space. Methods: The analysis and the objectives set out here underpin the development of modern techniques for combining and processing primary data, methods for solving navigation problems, and methods for solving control problems in complex integrated ADO avionics: in particular, an invariant compensation method for combining primary information sources, complete correlation-extreme navigation methods, and a synergistic ADO control method. Results: Applying these techniques will increase the effectiveness of navigation and control for both civilian and military ADO under accidental and intentional interference and avionics failures. Discussion: Based on these provisions, a prototype of integrated avionics for ADO navigation and synergistic control is being developed in line with current approaches.

  2. Nicotine therapy sampling to induce quit attempts among smokers unmotivated to quit: a randomized clinical trial.

    Science.gov (United States)

    Carpenter, Matthew J; Hughes, John R; Gray, Kevin M; Wahlquist, Amy E; Saladin, Michael E; Alberg, Anthony J

    2011-11-28

    Rates of smoking cessation have not changed in a decade, accentuating the need for novel approaches to prompt quit attempts. Within a nationwide randomized clinical trial (N = 849) to induce further quit attempts and cessation, smokers currently unmotivated to quit were randomized to a practice quit attempt (PQA) alone or to nicotine replacement therapy (hereafter referred to as nicotine therapy) sampling within the context of a PQA. Following a 6-week intervention period, participants were followed up for 6 months to assess outcomes. The PQA intervention was designed to increase motivation, confidence, and coping skills. The combination of a PQA plus nicotine therapy sampling added samples of nicotine lozenges to enhance attitudes toward pharmacotherapy and to promote the use of additional cessation resources. Primary outcomes included the incidence of any ever-occurring self-defined quit attempt and 24-hour quit attempt. Secondary measures included 7-day point prevalence abstinence at any time during the study (ie, floating abstinence) and at the final follow-up assessment. Compared with the PQA intervention alone, nicotine therapy sampling was associated with a significantly higher incidence of any quit attempt (49% vs 40%; relative risk [RR], 1.2; 95% CI, 1.1-1.4) and any 24-hour quit attempt (43% vs 34%; 1.3; 1.1-1.5). Nicotine therapy sampling was marginally more likely to promote floating abstinence (19% vs 15%; RR, 1.3; 95% CI, 1.0-1.7); 6-month point prevalence abstinence rates were no different between groups (16% vs 14%; 1.2; 0.9-1.6). Nicotine therapy sampling during a PQA represents a novel strategy to motivate smokers to make a quit attempt. clinicaltrials.gov Identifier: NCT00706979.
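The effect estimates above (relative risks with 95% CIs) can be recomputed from the reported proportions; the sketch below assumes roughly equal arm sizes, which the abstract does not state:

```python
import math

def rr_ci(p1, n1, p2, n2, z=1.959964):
    """Relative risk with a log-scale Wald 95% CI."""
    rr = p1 / p2
    se = math.sqrt((1 - p1) / (n1 * p1) + (1 - p2) / (n2 * p2))
    lo, hi = (rr * math.exp(s * z * se) for s in (-1, 1))
    return rr, lo, hi

# 49% vs 40% made any quit attempt; arm sizes assumed ~N/2 of the 849 enrolled
rr, lo, hi = rr_ci(0.49, 425, 0.40, 424)
print(f"RR {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

With these assumed denominators the result reproduces the reported RR of 1.2 with a CI of roughly 1.1 to 1.4.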

  3. Location and multi-depot vehicle routing for emergency vehicles using tour coverage and random sampling

    Directory of Open Access Journals (Sweden)

    Alireza Goli

    2015-09-01

    Full Text Available Distribution and optimum allocation of emergency resources are the most important tasks, which need to be accomplished during crisis. When a natural disaster such as earthquake, flood, etc. takes place, it is necessary to deliver rescue efforts as quickly as possible. Therefore, it is important to find optimum location and distribution of emergency relief resources. When a natural disaster occurs, it is not possible to reach some damaged areas. In this paper, location and multi-depot vehicle routing for emergency vehicles using tour coverage and random sampling is investigated. In this study, there is no need to visit all the places and some demand points receive their needs from the nearest possible location. The proposed study is implemented for some randomly generated numbers in different sizes. The preliminary results indicate that the proposed method was capable of reaching desirable solutions in reasonable amount of time.

  4. ESTIMATION OF FINITE POPULATION MEAN USING RANDOM NON–RESPONSE IN SURVEY SAMPLING

    Directory of Open Access Journals (Sweden)

    Housila P. Singh

    2010-12-01

    Full Text Available This paper consider the problem of estimating the population mean under three different situations of random non–response envisaged by Singh et al (2000. Some ratio and product type estimators have been proposed and their properties are studied under an assumption that the number of sampling units on which information can not be obtained owing to random non–response follows some distribution. The suggested estimators are compared with the usual ratio and product estimators. An empirical study is carried out to show the performance of the suggested estimators over usual unbiased estimator, ratio and product estimators. A generalized version of the proposed ratio and product estimators is also given.

  5. Classification of small lesions on dynamic breast MRI: Integrating dimension reduction and out-of-sample extension into CADx methodology.

    Science.gov (United States)

    Nagarajan, Mahesh B; Huber, Markus B; Schlossbauer, Thomas; Leinsinger, Gerda; Krol, Andrzej; Wismüller, Axel

    2014-01-01

    While dimension reduction has been previously explored in computer aided diagnosis (CADx) as an alternative to feature selection, previous implementations of its integration into CADx do not ensure strict separation between training and test data required for the machine learning task. This compromises the integrity of the independent test set, which serves as the basis for evaluating classifier performance. We propose, implement and evaluate an improved CADx methodology where strict separation is maintained. This is achieved by subjecting the training data alone to dimension reduction; the test data is subsequently processed with out-of-sample extension methods. Our approach is demonstrated in the research context of classifying small diagnostically challenging lesions annotated on dynamic breast magnetic resonance imaging (MRI) studies. The lesions were dynamically characterized through topological feature vectors derived from Minkowski functionals. These feature vectors were then subject to dimension reduction with different linear and non-linear algorithms applied in conjunction with out-of-sample extension techniques. This was followed by classification through supervised learning with support vector regression. Area under the receiver-operating characteristic curve (AUC) was evaluated as the metric of classifier performance. Of the feature vectors investigated, the best performance was observed with Minkowski functional 'perimeter' while comparable performance was observed with 'area'. Of the dimension reduction algorithms tested with 'perimeter', the best performance was observed with Sammon's mapping (0.84±0.10) while comparable performance was achieved with exploratory observation machine (0.82±0.09) and principal component analysis (0.80±0.10). The results reported in this study with the proposed CADx methodology present a significant improvement over previous results reported with such small lesions on dynamic breast MRI. In particular, non
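The leakage-free evaluation protocol argued for above can be shown in a few lines: fit the dimension-reduction map on the training split only, then project test points through the fitted map (the linear, PCA analogue of an out-of-sample extension). Data and classifier below are synthetic toys, not the paper's Minkowski-functional features or support vector regression:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic data: 3 high-variance informative features among 30
X = rng.normal(size=(200, 30))
X[:, :3] *= 3.0
y = (X[:, :3].sum(axis=1) + rng.normal(0, 1.5, 200) > 0).astype(int)
X_train, X_test = X[:150], X[150:]
y_train, y_test = y[:150], y[150:]

# PCA fitted on the TRAINING split only (strict train/test separation)
mu = X_train.mean(axis=0)
_, _, Vt = np.linalg.svd(X_train - mu, full_matrices=False)
k = 5
Z_train = (X_train - mu) @ Vt[:k].T
Z_test = (X_test - mu) @ Vt[:k].T   # out-of-sample projection of test points

# Nearest-centroid classifier in the reduced space
c0 = Z_train[y_train == 0].mean(axis=0)
c1 = Z_train[y_train == 1].mean(axis=0)
pred = (np.linalg.norm(Z_test - c1, axis=1) <
        np.linalg.norm(Z_test - c0, axis=1)).astype(int)
acc = (pred == y_test).mean()
print("held-out accuracy:", acc)
```

The nonlinear methods in the abstract (e.g. Sammon's mapping) have no closed-form projection, which is exactly why explicit out-of-sample extension techniques are needed there; for PCA the projection is just the fitted linear map.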

  6. Chi-Squared Test of Fit and Sample Size-A Comparison between a Random Sample Approach and a Chi-Square Value Adjustment Method.

    Science.gov (United States)

    Bergh, Daniel

    2015-01-01

    Chi-square statistics are commonly used for tests of fit of measurement models. Chi-square is also sensitive to sample size, which is why several approaches to handling large samples in test-of-fit analysis have been developed. One strategy to handle the sample size problem is to adjust the sample size in the analysis of fit. An alternative is to adopt a random sample approach. The purpose of this study was to analyze and compare these two strategies using simulated data. Given an original sample size of 21,000, for reductions of sample size down to the order of 5,000 the adjusted sample size function works as well as the random sample approach. In contrast, when applying adjustments to sample sizes of lower order, the adjustment function is less effective at approximating the chi-square value for an actual random sample of the relevant size. Hence, the fit is exaggerated and misfit underestimated when using the adjusted sample size function. Although there are big differences in chi-square values between the two approaches at lower sample sizes, the inferences based on the p-values may be the same.
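The two strategies compared above can be mimicked on a toy goodness-of-fit problem: rescale the full-sample chi-square to a target sample size versus recomputing it on an actual random subsample. The simple n_target/n rescaling below is an assumed stand-in for the paper's adjustment function:

```python
import numpy as np

rng = np.random.default_rng(4)

# A 5-category variable that deviates slightly from the hypothesized
# uniform distribution, so the chi-square grows with sample size.
true_p = np.array([0.22, 0.21, 0.20, 0.19, 0.18])
h0_p = np.full(5, 0.2)

def gof_chi2(sample):
    obs = np.bincount(sample, minlength=5)
    exp = h0_p * len(sample)
    return np.sum((obs - exp) ** 2 / exp)

full = rng.choice(5, size=21_000, p=true_p)
chi2_full = gof_chi2(full)

n_target = 5_000
chi2_adjusted = chi2_full * n_target / len(full)                  # rescaling strategy
chi2_subsample = gof_chi2(rng.choice(full, n_target, replace=False))  # random sample strategy

print(f"full: {chi2_full:.1f}  adjusted: {chi2_adjusted:.1f}  "
      f"subsample: {chi2_subsample:.1f}")
```

The rescaled value tracks the subsample value only in expectation; the subsample additionally carries its own sampling noise, which is part of what the study compares.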

  7. Randomized controlled trial on timing and number of sampling for bile aspiration cytology.

    Science.gov (United States)

    Tsuchiya, Tomonori; Yokoyama, Yukihiro; Ebata, Tomoki; Igami, Tsuyoshi; Sugawara, Gen; Kato, Katsuyuki; Shimoyama, Yoshie; Nagino, Masato

    2014-06-01

    The issue of the timing and number of bile samplings for exfoliative bile cytology is still unsettled. A total of 100 patients with cholangiocarcinoma undergoing resection after external biliary drainage were randomized into two groups: a 2-day group, where bile was sampled five times per day for 2 days; and a 10-day group, where bile was sampled once per day for 10 days (registered University Hospital Medical Information Network/ID 000005983). The outcome of 87 patients who underwent laparotomy was analyzed, 44 in the 2-day group and 43 in the 10-day group. There were no significant differences in patient characteristics between the two groups. Positivity after one sampling session was significantly lower in the 2-day group than in the 10-day group (17.0 ± 3.7% vs. 20.7 ± 3.5%, P = 0.034). However, the cumulative positivity curves were similar and overlapped each other. The final cumulative positivity by the 10th sampling session was 52.3% in the 2-day group and 51.2% in the 10-day group. We observed a small increase in cumulative positivity after the 5th or 6th session in both groups. Bile cytology positivity is thus unlikely to be affected by sampling timing. © 2013 Japanese Society of Hepato-Biliary-Pancreatic Surgery.

  8. Estimating the Size of a Large Network and its Communities from a Random Sample

    CERN Document Server

    Chen, Lin; Crawford, Forrest W

    2016-01-01

    Most real-world networks are too large to be measured or studied directly and there is substantial interest in estimating global network properties from smaller sub-samples. One of the most important global properties is the number of vertices/nodes in the network. Estimating the number of vertices in a large network is a major challenge in computer science, epidemiology, demography, and intelligence analysis. In this paper we consider a population random graph G = (V, E) from the stochastic block model (SBM) with K communities/blocks. A sample is obtained by randomly choosing a subset W and letting G(W) be the induced subgraph in G of the vertices in W. In addition to G(W), we observe the total degree of each sampled vertex and its block membership. Given this partial information, we propose an efficient PopULation Size Estimation algorithm, called PULSE, that correctly estimates the size of the whole population as well as the size of each community. To support our theoretical analysis, we perform an exhausti...

  9. Studies on spectral analysis of randomly sampled signals: Application to laser velocimetry data

    Science.gov (United States)

    Sree, David

    1992-01-01

    Spectral analysis is very useful in determining the frequency characteristics of many turbulent flows, for example, vortex flows, tail buffeting, and other pulsating flows. It is also used for obtaining turbulence spectra, from which the time and length scales associated with the turbulence structure can be estimated. These estimates, in turn, can be helpful for validation of theoretical/numerical flow turbulence models. Laser velocimetry (LV) is being extensively used in the experimental investigation of different types of flows because of its inherent advantages: nonintrusive probing, high frequency response, no calibration requirements, etc. Typically, the output of an individual-realization laser velocimeter is a set of randomly sampled velocity data. Spectral analysis of such data requires special techniques to obtain reliable estimates of the correlation and power spectral density functions that describe the flow characteristics. FORTRAN codes for obtaining the autocorrelation and power spectral density estimates using the correlation-based slotting technique were developed. Extensive studies have been conducted on simulated first-order-spectrum and sine signals to improve the spectral estimates. A first-order spectrum was chosen because it represents the characteristics of a typical one-dimensional turbulence spectrum. Digital prefiltering techniques to improve the spectral estimates from randomly sampled data were applied. Studies show that reliable spectral estimates can be obtained at frequencies up to about five times the mean sampling rate.
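The correlation-based slotting technique mentioned above can be sketched directly: products of all sample pairs are binned ("slotted") by their time lag and averaged within each slot, giving an autocorrelation estimate from randomly sampled data. Here the signal is a noisy 5 Hz sine observed at Poisson-random times (parameters assumed; the original work used FORTRAN and turbulence-like spectra):

```python
import numpy as np

rng = np.random.default_rng(5)

T, rate, f = 10.0, 200.0, 5.0          # record length (s), mean sample rate (Hz), sine freq
n = rng.poisson(rate * T)
t = np.sort(rng.uniform(0, T, n))      # random (Poisson-process-like) sample times
u = np.sin(2 * np.pi * f * t) + 0.1 * rng.normal(size=n)

nslots, dt_slot = 250, 0.002           # slot width 2 ms, lags up to 0.5 s
lags = t[None, :] - t[:, None]         # all pairwise lags
prods = u[None, :] * u[:, None]        # all pairwise products
mask = (lags > 0) & (lags < nslots * dt_slot)
bins = (lags[mask] / dt_slot).astype(int)
num = np.bincount(bins, weights=prods[mask], minlength=nslots)
cnt = np.bincount(bins, minlength=nslots)
R = np.where(cnt > 0, num / np.maximum(cnt, 1), 0.0)  # slotted autocorrelation

# For a 5 Hz sine, R(tau) ~ 0.5 cos(2*pi*5*tau): trough near 0.1 s, peak near 0.2 s
print(R[50], R[100])
```

The power spectral density estimate the abstract refers to would then follow by Fourier-transforming R; only the slotted correlation step is shown here.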

  10. Prevalence and correlates of problematic smartphone use in a large random sample of Chinese undergraduates.

    Science.gov (United States)

    Long, Jiang; Liu, Tie-Qiao; Liao, Yan-Hui; Qi, Chang; He, Hao-Yu; Chen, Shu-Bao; Billieux, Joël

    2016-11-17

    Smartphones are becoming a daily necessity for most undergraduates in Mainland China. Because the present scenario of problematic smartphone use (PSU) is largely unexplored, in the current study we aimed to estimate the prevalence of PSU and to screen suitable predictors for PSU among Chinese undergraduates in the framework of the stress-coping theory. A sample of 1062 undergraduate smartphone users was recruited by means of a stratified cluster random sampling strategy between April and May 2015. The Problematic Cellular Phone Use Questionnaire was used to identify PSU. We evaluated five candidate risk factors for PSU using logistic regression analysis while controlling for demographic characteristics and specific features of smartphone use. The prevalence of PSU among Chinese undergraduates was estimated to be 21.3%. The risk factors for PSU were majoring in the humanities, high monthly income from the family (≥1500 RMB), serious emotional symptoms, high perceived stress, and perfectionism-related factors (high doubts about actions, high parental expectations). PSU among undergraduates appears to be ubiquitous and thus constitutes a public health issue in Mainland China. Although further longitudinal studies are required to test whether PSU is a transient phenomenon or a chronic and progressive condition, our study successfully identified socio-demographic and psychological risk factors for PSU. These results, obtained from a random and thus representative sample of undergraduates, open up new avenues in terms of prevention and regulation policies.

  11. Methodological Quality of Randomized Clinical Trials of Respiratory Physiotherapy in Coronary Artery Bypass Grafting Patients in the Intensive Care Unit: a Systematic Review.

    Science.gov (United States)

    Lorscheitter, Jaqueline; Stein, Cinara; Plentz, Rodrigo Della Méa

    2017-01-01

    To assess the methodological quality of randomized controlled trials of physiotherapy in patients undergoing coronary artery bypass grafting in the intensive care unit. Studies published up to May 2015 in MEDLINE, Cochrane, and PEDro were included. The primary outcome extracted was proper completion of the items of the Cochrane Collaboration's tool, and the secondary outcome was conformity with the requirements of the CONSORT Statement and its extension. From 807 studies identified, 39 were included. Most CONSORT items showed better adequacy after the statement's publication. Studies with positive outcomes presented better methodological quality. The methodological quality of the studies has been improving over the years. However, many aspects can still be better designed.

  12. Protein/creatinine ratio on random urine samples for prediction of proteinuria in preeclampsia.

    Science.gov (United States)

    Roudsari, F Vahid; Ayati, S; Ayatollahi, H; Shakeri, M T

    2012-01-01

    To evaluate the protein/creatinine ratio in random urine samples for prediction of proteinuria in preeclampsia. This study was performed on 150 pregnant women hospitalized with preeclampsia in Ghaem Hospital during 2006. First, a random urine sample was obtained from each patient to determine the protein/creatinine ratio. Then, a 24-hour urine collection was analyzed for the evaluation of proteinuria. Statistical analysis was performed with SPSS software. A total of 150 patients entered the study. There was a significant correlation between 24-hour urine protein and the protein/creatinine ratio (r = 0.659, P < 0.001). Since measurement of the protein/creatinine ratio is more accurate, reliable, and cost-effective, it can replace measurement of the 24-hour urine protein.
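The validation analysis boils down to correlating the spot ratio with the 24-hour measurement. A sketch on synthetic data (the abstract's actual data gave r = 0.659 on 150 patients; the generating model below is made up):

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic paired measurements: 24-hour protein excretion and a noisy
# spot protein/creatinine ratio acting as its proxy.
n = 150
protein_24h = rng.lognormal(mean=-0.5, sigma=0.9, size=n)  # g per 24 h
pc_ratio = 0.9 * protein_24h + rng.normal(0, 0.6, n)       # spot-sample proxy

r = np.corrcoef(pc_ratio, protein_24h)[0, 1]
t_stat = r * np.sqrt((n - 2) / (1 - r**2))  # t statistic for H0: rho = 0
print(f"r = {r:.3f}, t = {t_stat:.1f}")
```

With n = 150, even moderate correlations like the reported 0.659 give a t statistic far beyond conventional significance thresholds, consistent with the abstract's P < 0.001.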

  13. LOD score exclusion analyses for candidate QTLs using random population samples.

    Science.gov (United States)

    Deng, Hong-Wen

    2003-11-01

    While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes as putative QTLs using random population samples. Previously, we developed an LOD score exclusion mapping approach for candidate genes for complex diseases. Here, we extend this LOD score approach to exclusion analyses of candidate genes for quantitative traits. Under this approach, specific genetic effects (as reflected by heritability) and inheritance models at candidate QTLs can be analyzed, and if the LOD score is ≤ -2.0, the locus can be excluded from having a heritability larger than that specified. Simulations show that this approach has high power to exclude a candidate gene from having moderate genetic effects if it is not a QTL and is robust to population admixture. Our exclusion analysis complements association analysis for candidate genes as putative QTLs in random population samples. The approach is applied to test the importance of the vitamin D receptor (VDR) gene as a potential QTL underlying the variation of bone mass, an important determinant of osteoporosis.

  14. A Combined Weighting Method Based on Hybrid of Interval Evidence Fusion and Random Sampling

    Directory of Open Access Journals (Sweden)

    Ying Yan

    2017-01-01

    Due to the complexity of the system and lack of expertise, epistemic uncertainties may be present in experts' judgments on the importance of certain indices during group decision-making. A novel combination weighting method is proposed to solve the index weighting problem when various uncertainties are present in expert comments. Based on the ideas of evidence theory, various types of uncertain evaluation information are uniformly expressed through interval evidence structures. A similarity matrix between interval evidences is constructed, and the experts' information is fused. Comment grades are quantified using interval numbers, and a cumulative probability function for evaluating the importance of indices is constructed based on the fused information. Finally, index weights are obtained by Monte Carlo random sampling. The method can process expert information with varying degrees of uncertainty and possesses good compatibility. The difficulty of effectively fusing high-conflict group decision-making information, and the large information loss after fusion, are thereby avoided. Original expert judgments are retained rather objectively throughout the processing procedure. The construction of the cumulative probability function and the random sampling process do not require any human intervention or judgment, and can be implemented easily by computer programs, giving the method an apparent advantage in evaluation practice for fairly large index systems.
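The final Monte Carlo step described above can be sketched in miniature: each index weight is left as an interval by the (fused) expert judgments, and random draws from the intervals are normalized and averaged. The intervals below are made up, and the evidence-fusion stage is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(7)

# Interval-valued importance judgments for four indices (assumed values)
intervals = np.array([[0.20, 0.35],   # index 1
                      [0.10, 0.25],   # index 2
                      [0.30, 0.50],   # index 3
                      [0.05, 0.15]])  # index 4

draws = 100_000
w = rng.uniform(intervals[:, 0], intervals[:, 1], size=(draws, 4))
w /= w.sum(axis=1, keepdims=True)   # each draw becomes a valid weight vector
weights = w.mean(axis=0)            # Monte Carlo combined weights
print(np.round(weights, 3))
```

The averaging over normalized draws yields a single weight vector summing to one, with wider intervals contributing proportionally more spread before the average.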

  15. Mindfulness-based intervention for prodromal sleep disturbances in older adults: design and methodology of a randomized controlled trial.

    Science.gov (United States)

    Black, David S; O'Reilly, Gillian A; Olmstead, Richard; Breen, Elizabeth C; Irwin, Michael R

    2014-09-01

    Sleep problems are prevalent among older adults, often persist untreated, and are predictive of health detriments. Given the limitations of conventional treatments, non-pharmacological treatments such as mindfulness-based interventions (MBIs) are gaining popularity for sleep ailments. However, nothing is yet known about the impact of MBIs on sleep in older adults with prodromal sleep disturbances. This article details the design and methodology of a 6-week parallel-group RCT calibrated to test the treatment effect of the Mindful Awareness Practices (MAPs) program versus sleep hygiene education for improving sleep quality, as the main outcome, in older adults with prodromal sleep disturbances. Older adults with current sleep disturbances will be recruited from the urban Los Angeles community. Participants will be randomized into two standardized treatment conditions, MAPs and sleep hygiene education. Each condition will consist of weekly 2-hour group-based classes over the course of the 6-week intervention. The primary objective of this study is to determine if mindfulness meditation practice as engaged through the MAPs program leads to improved sleep quality relative to sleep hygiene education in older adults with prodromal sleep disturbances. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Optimization of Phenolic Antioxidant Extraction from Wuweizi (Schisandra chinensis) Pulp Using Random-Centroid Optimization Methodology

    Directory of Open Access Journals (Sweden)

    Xiong Yu

    2011-09-01

    Full Text Available The extraction optimization and composition analysis of polyphenols in the fresh pulp of Wuweizi (Schisandra chinensis) have been investigated in this study. The extraction of polyphenols from Wuweizi pulp was optimized using the Random-Centroid Optimization (RCO) methodology. Six factors (liquid-to-solid ratio, ethanol concentration, pH, temperature, heating time and number of extractions) and three extraction targets (polyphenol content, antioxidant activity and extract yield) were considered in the RCO program. Three sets of optimum factor values were obtained, one for each extraction target. The set proposed for polyphenol extraction was chosen for further experiments, as follows: liquid-to-solid ratio (v/w) 8, ethanol 67.3% (v/v), initial pH 1.75, temperature 55 °C for 4 h, with extraction repeated 4 times. The Wuweizi polyphenol extract (WPE) was obtained with a yield of 16.37 mg/g and a composition of polyphenols 1.847 mg/g, anthocyanins 0.179 mg/g, sugar 9.573 mg/g and protein 0.327 mg/g. The WPE demonstrated high scavenging activity against DPPH radicals.

  17. Inflammatory Biomarkers and Risk of Schizophrenia: A 2-Sample Mendelian Randomization Study.

    Science.gov (United States)

    Hartwig, Fernando Pires; Borges, Maria Carolina; Horta, Bernardo Lessa; Bowden, Jack; Davey Smith, George

    2017-12-01

    Positive associations between inflammatory biomarkers and risk of psychiatric disorders, including schizophrenia, have been reported in observational studies. However, conventional observational studies are prone to bias, such as reverse causation and residual confounding, thus limiting our understanding of the effect (if any) of inflammatory biomarkers on schizophrenia risk. To evaluate whether inflammatory biomarkers have an effect on the risk of developing schizophrenia. Two-sample mendelian randomization study using genetic variants associated with inflammatory biomarkers as instrumental variables to improve inference. Summary association results from large consortia of candidate gene or genome-wide association studies, including several epidemiologic studies with different designs, were used. Gene-inflammatory biomarker associations were estimated in pooled samples ranging from 1645 to more than 80 000 individuals, while gene-schizophrenia associations were estimated in more than 30 000 cases and more than 45 000 ancestry-matched controls. In most studies included in the consortia, participants were of European ancestry, and the prevalence of men was approximately 50%. All studies were conducted in adults, with a wide age range (18 to 80 years). Genetically elevated circulating levels of C-reactive protein (CRP), interleukin-1 receptor antagonist (IL-1Ra), and soluble interleukin-6 receptor (sIL-6R). Risk of developing schizophrenia. Individuals with schizophrenia or schizoaffective disorders were included as cases. Given that many studies contributed to the analyses, different diagnostic procedures were used. The pooled odds ratio estimate using 18 CRP genetic instruments was 0.90 (random effects 95% CI, 0.84-0.97; P = .005) per 2-fold increment in CRP levels; consistent results were obtained using different mendelian randomization methods and a more conservative set of instruments. The odds ratio for sIL-6R was 1.06 (95% CI, 1.01-1.12; P = .02

  18. Sensitive determination of pesticides residues in wine samples with the aid of single-drop microextraction and response surface methodology.

    Science.gov (United States)

    Garbi, Aimilia; Sakkas, Vasilios; Fiamegos, Yiannis C; Stalikas, Constantine D; Albanis, Triantafyllos

    2010-09-15

    The multi-residue trace-level determination of six pesticides (diazinon, dimethoate, chlorpyrifos, vinclozolin, fenthion and quinalphos) in wine samples, after their single-drop microextraction (SDME), is presented herein. The extraction procedure was optimized using a multivariate optimization approach following a two-stage process. A first screening experimental design brought out the significant parameters and was followed by a central composite design (CCD) experiment, which revealed the simultaneous effects of the significant factors on the SDME process. A high level of linearity for all target analytes was recorded, with r² ranging between 0.9978 and 0.9999, while repeatability (intra-day) and reproducibility (inter-day) varied from 5.6% to 7.4% and 4.9% to 12.5%, respectively. Limits of detection (LODs) and limits of quantification (LOQs) were found to lie in the low μg L⁻¹ range. In general, the developed methodology is simple and highly sensitive, rendering it appropriate for routine wine screening purposes. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  19. Conic sampling: an efficient method for solving linear and quadratic programming by randomly linking constraints within the interior.

    Science.gov (United States)

    Serang, Oliver

    2012-01-01

    Linear programming (LP) problems are commonly used in analysis and resource allocation, frequently surfacing as approximations to more difficult problems. Existing approaches to LP have been dominated by a small group of methods, and randomized algorithms have not enjoyed popularity in practice. This paper introduces a novel randomized method of solving LP problems by moving along the facets and within the interior of the polytope along rays randomly sampled from the polyhedral cones defined by the bounding constraints. This conic sampling method is then applied to randomly sampled LPs, and its runtime performance is shown to compare favorably to the simplex and primal affine-scaling algorithms, especially on polytopes with certain characteristics. The conic sampling method is then adapted and applied to solve a certain quadratic program, which computes a projection onto a polytope; the proposed method is shown to outperform the proprietary software Mathematica on large, sparse QP problems constructed from mass spectrometry-based proteomics.

  20. RANdom SAmple Consensus (RANSAC) algorithm for material-informatics: application to photovoltaic solar cells.

    Science.gov (United States)

    Kaspi, Omer; Yosipof, Abraham; Senderowitz, Hanoch

    2017-06-06

    An important aspect of chemoinformatics and material-informatics is the use of machine learning algorithms to build Quantitative Structure Activity Relationship (QSAR) models. The RANdom SAmple Consensus (RANSAC) algorithm is a predictive modeling tool widely used in the image processing field for cleaning datasets of noise. RANSAC can be used as a "one stop shop" algorithm for developing and validating QSAR models, performing outlier removal, descriptor selection, model development and predictions for test set samples using an applicability domain. For "future" predictions (i.e., for samples not included in the original test set) RANSAC provides a statistical estimate of the probability of obtaining reliable predictions, i.e., predictions within a pre-defined number of standard deviations from the true values. In this work we describe the first application of RANSAC in material informatics, focusing on the analysis of solar cells. We demonstrate that for three datasets representing different metal oxide (MO) based solar cell libraries, RANSAC-derived models select descriptors previously shown to correlate with key photovoltaic properties and lead to good predictive statistics for these properties. These models were subsequently used to predict the properties of virtual solar cell libraries, highlighting interesting dependencies of PV properties on MO compositions.
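The consensus loop that RANSAC is built on can be sketched in a few lines: fit a model to a random minimal sample, count the points that agree within a threshold, keep the best consensus set, and refit on it. This toy line-fitting version (synthetic data; the threshold and iteration count are illustrative choices) is not the authors' QSAR pipeline.

```python
import numpy as np

rng = np.random.default_rng(42)

def ransac_line(x, y, n_iter=200, threshold=0.5, rng=rng):
    """Minimal RANSAC for 1-D linear regression: repeatedly fit a line to a
    random minimal sample (2 points), count inliers within `threshold`,
    then least-squares refit on the largest consensus set found."""
    best_inliers = np.zeros(len(x), dtype=bool)
    for _ in range(n_iter):
        i, j = rng.choice(len(x), size=2, replace=False)
        if x[i] == x[j]:
            continue  # degenerate minimal sample
        slope = (y[j] - y[i]) / (x[j] - x[i])
        intercept = y[i] - slope * x[i]
        inliers = np.abs(y - (slope * x + intercept)) < threshold
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    slope, intercept = np.polyfit(x[best_inliers], y[best_inliers], 1)
    return slope, intercept, best_inliers

# Noisy line y = 2x + 1 with a few gross outliers injected.
x = np.linspace(0, 10, 50)
y = 2 * x + 1 + rng.normal(0, 0.1, 50)
y[::10] += 20
slope, intercept, inliers = ransac_line(x, y)
print(round(slope, 2), round(intercept, 2))  # close to 2 and 1
```

Because the refit uses only the consensus set, the gross outliers have no influence on the final fit, which is the property the abstract exploits for outlier removal in QSAR modeling.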

  1. Application of the SmartSampling Methodology to the Evaluation of Contaminated Landscape Soils at Brookhaven National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    RAUTMAN,CHRISTOPHER A.

    2000-08-01

    Portions of the SmartSampling™ analysis methodology have been applied to the evaluation of radioactively contaminated landscape soils at Brookhaven National Laboratory. Specifically, the spatial, volumetric distribution of cesium-137 (¹³⁷Cs) contamination within Area of Concern 16E-1 has been modeled probabilistically using a geostatistical methodology, with the purpose of identifying the likelihood of reducing, relative to a pre-existing baseline remediation plan, the volume of soil that must be disposed of offsite during clean-up. The principal objective of the analysis was to evaluate the likelihood of successful deployment of the Segmented Gate System (SGS), a novel remediation approach that emphasizes real-time separation of clean from contaminated materials during remediation operations. One primary requirement for successful application of the segmented gate technology is that a variety of contaminant levels exist at the deployment site, which would enable the SGS to discriminate material above and below a specified remediation threshold value. The results of this analysis indicate significant potential for volume reduction with respect to the baseline remediation plan at a threshold excavation level of 23 pCi/g ¹³⁷Cs. A reduction of approximately 50%, from a baseline volume of approximately 1,064.7 yd³ to less than 550 yd³, is possible with acceptance of only a very small level of engineering risk. The vast majority of this volume reduction is obtained by not excavating almost all of levels 3 and 4 (from 12 to 24 inches in depth), which appear to be virtually uncontaminated, based on the available data. Additional volume reductions related to soil materials on levels 1 (depths of 0-6 inches) and 2 (6-12 inches) may be possible, specifically through use of the SGS technology.
Level-by-level evaluation of simulation results suggests that as much as 26 percent of level 1 and as

  2. Sample size and power for a stratified doubly randomized preference design.

    Science.gov (United States)

    Cameron, Briana; Esserman, Denise A

    2016-11-21

    The two-stage (or doubly) randomized preference trial design is an important tool for researchers seeking to disentangle the role of patient treatment preference on treatment response through estimation of selection and preference effects. Up until now, these designs have been limited by their assumption of equal preference rates and effect sizes across the entire study population. We propose a stratified two-stage randomized trial design that addresses this limitation. We begin by deriving stratified test statistics for the treatment, preference, and selection effects. Next, we develop a sample size formula for the number of patients required to detect each effect. The properties of the model and the efficiency of the design are established using a series of simulation studies. We demonstrate the applicability of the design using a study of Hepatitis C treatment modality, specialty clinic versus mobile medical clinic. In this example, a stratified preference design (stratified by alcohol/drug use) may more closely capture the true distribution of patient preferences and allow for a more efficient design than a design which ignores these differences (unstratified version). © The Author(s) 2016.
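The generic calculation underlying such sample size derivations can be sketched with the textbook two-sample formula; the stratified preference-design formula in the paper additionally involves preference rates and stratum weights, which are omitted here as they are specific to that derivation.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.80):
    """Sample size per arm for a two-sided two-sample z-test detecting a
    mean difference `delta` with common SD `sigma`:
        n = 2 * (z_{1-alpha/2} + z_{power})^2 * sigma^2 / delta^2
    (the standard building block; preference/selection effects in the
    stratified doubly randomized design require extra terms)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return ceil(2 * (z_a + z_b) ** 2 * (sigma / delta) ** 2)

print(n_per_group(delta=0.5, sigma=1.0))  # classic result: 63 per arm
```

Halving the detectable effect size quadruples the required sample, which is why preference and selection effects (typically smaller than treatment effects) drive the sample size in these designs.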

  3. Notes on interval estimation of the generalized odds ratio under stratified random sampling.

    Science.gov (United States)

    Lui, Kung-Jong; Chang, Kuang-Chao

    2013-05-01

    It is not rare to encounter the patient response on the ordinal scale in a randomized clinical trial (RCT). Under the assumption that the generalized odds ratio (GOR) is homogeneous across strata, we consider four asymptotic interval estimators for the GOR under stratified random sampling. These include the interval estimator using the weighted-least-squares (WLS) approach with the logarithmic transformation (WLSL), the interval estimator using the Mantel-Haenszel (MH) type of estimator with the logarithmic transformation (MHL), the interval estimator using Fieller's theorem with the MH weights (FTMH) and the interval estimator using Fieller's theorem with the WLS weights (FTWLS). We employ Monte Carlo simulation to evaluate the performance of these interval estimators by calculating the coverage probability and the average length. To study the bias of these interval estimators, we also calculate and compare the noncoverage probabilities in the two tails of the resulting confidence intervals. We find that WLSL and MHL can generally perform well, while FTMH and FTWLS can lose either precision or accuracy. We further find that MHL is likely the least biased. Finally, we use the data taken from a study of smoking status and breathing test among workers in certain industrial plants in Houston, Texas, during 1974 to 1975 to illustrate the use of these interval estimators.

  4. Control capacity and a random sampling method in exploring controllability of complex networks.

    Science.gov (United States)

    Jia, Tao; Barabási, Albert-László

    2013-01-01

    Controlling complex systems is a fundamental challenge of network science. Recent advances indicate that control over the system can be achieved through a minimum driver node set (MDS). The existence of multiple MDS's suggests that nodes do not participate in control equally, prompting us to quantify their participations. Here we introduce control capacity quantifying the likelihood that a node is a driver node. To efficiently measure this quantity, we develop a random sampling algorithm. This algorithm not only provides a statistical estimate of the control capacity, but also bridges the gap between multiple microscopic control configurations and macroscopic properties of the network under control. We demonstrate that the possibility of being a driver node decreases with a node's in-degree and is independent of its out-degree. Given the inherent multiplicity of MDS's, our findings offer tools to explore control in various complex systems.

  5. Classification of epileptic EEG signals based on simple random sampling and sequential feature selection.

    Science.gov (United States)

    Ghayab, Hadi Ratham Al; Li, Yan; Abdulla, Shahab; Diykh, Mohammed; Wan, Xiangkui

    2016-06-01

    Electroencephalogram (EEG) signals are used broadly in the medical field, mainly in the diagnosis and treatment of diseases such as epilepsy, Alzheimer's disease and sleep disorders. This paper presents a new method that extracts and selects features from multi-channel EEG signals. The research focuses on three main points. First, a simple random sampling (SRS) technique is used to extract features from the time domain of the EEG signals. Second, the sequential feature selection (SFS) algorithm is applied to select the key features and to reduce the dimensionality of the data. Finally, the features extracted and selected by SRS and SFS are forwarded to a least squares support vector machine (LS-SVM) classifier to classify the EEG signals. The experimental results show that the method achieves 99.90%, 99.80% and 100% classification accuracy, sensitivity and specificity, respectively.
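As a rough sketch of the SRS stage, one can draw random subsets of a long epoch and summarize each subset with simple statistics, turning a long signal into a compact feature vector. The subset sizes and the particular statistics below are illustrative assumptions, not the authors' exact feature set.

```python
import numpy as np

rng = np.random.default_rng(7)

def srs_features(signal, n_samples=64, n_sets=4, rng=rng):
    """Simple-random-sampling feature extraction sketch: draw `n_sets`
    random subsets of `n_samples` time-domain points and summarize each
    with basic statistics (mean, std, min, max)."""
    feats = []
    for _ in range(n_sets):
        idx = rng.choice(len(signal), size=n_samples, replace=False)
        chunk = signal[idx]
        feats.extend([chunk.mean(), chunk.std(), chunk.min(), chunk.max()])
    return np.array(feats)

epoch = rng.normal(0, 1, 4096)  # stand-in for one EEG epoch
fv = srs_features(epoch)
print(fv.shape)  # compact feature vector: 4 sets x 4 statistics
```

A selection step such as SFS would then be run on these features before classification; here the point is only that SRS reduces a 4096-point epoch to a handful of numbers.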

  6. Randomized Controlled Trial of Attention Bias Modification in a Racially Diverse, Socially Anxious, Alcohol Dependent Sample

    Science.gov (United States)

    Clerkin, Elise M.; Magee, Joshua C.; Wells, Tony T.; Beard, Courtney; Barnett, Nancy P.

    2016-01-01

    Objective Attention biases may be an important treatment target for both alcohol dependence and social anxiety. This is the first attention bias modification (ABM) trial to investigate two (vs. one) targets of attention bias within a sample with co-occurring symptoms of social anxiety and alcohol dependence. Additionally, we used trial-level bias scores (TL-BS) to capture the phenomenon of attention bias in a more ecologically valid, dynamic way than traditional attention bias scores. Method Adult participants (N=86; 41% female; 52% African American; 40% White) with elevated social anxiety symptoms and alcohol dependence were randomly assigned to an 8-session training condition in this 2 (Social Anxiety ABM vs. Social Anxiety Control) by 2 (Alcohol ABM vs. Alcohol Control) design. Symptoms of social anxiety, alcohol dependence, and attention bias were assessed across time. Results Multilevel models estimated the trajectories for each measure within individuals and tested whether these trajectories differed according to the randomized training conditions. Across time, there were significant or trending decreases in all attention TL-BS parameters (but not traditional attention bias scores) and most symptom measures. However, there were no significant differences in the trajectories of change between any ABM and control conditions for any symptom measures. Conclusions These findings add to previous evidence questioning the robustness of ABM and point to the need to extend the effects of ABM to samples that are racially diverse and/or have co-occurring psychopathology. The results also illustrate the potential importance of calculating trial-level attention bias scores rather than only including traditional bias scores. PMID:27591918

  7. Random sampling or geostatistical modelling? Choosing between design-based and model-based sampling strategies for soil (with discussion)

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    1997-01-01

    Classical sampling theory has been repeatedly identified with classical statistics which assumes that data are identically and independently distributed. This explains the switch of many soil scientists from design-based sampling strategies, based on classical sampling theory, to the model-based

  8. Random Photon Absorption Model Elucidates How Early Gain Control in Fly Photoreceptors Arises from Quantal Sampling

    Science.gov (United States)

    Song, Zhuoyi; Zhou, Yu; Juusola, Mikko

    2016-01-01

    Many diurnal photoreceptors encode vast real-world light changes effectively, but how this performance originates from photon sampling is unclear. A 4-module biophysically-realistic fly photoreceptor model, in which information capture is limited by the number of its sampling units (microvilli) and their photon-hit recovery time (refractoriness), can accurately simulate real recordings and their information content. However, sublinear summation in quantum bump production (quantum-gain-nonlinearity) may also cause adaptation by reducing the bump/photon gain when multiple photons hit the same microvillus simultaneously. Here, we use a Random Photon Absorption Model (RandPAM), which is the 1st module of the 4-module fly photoreceptor model, to quantify the contribution of quantum-gain-nonlinearity in light adaptation. We show how quantum-gain-nonlinearity already results from photon sampling alone. In the extreme case, when two or more simultaneous photon-hits reduce to a single sublinear value, quantum-gain-nonlinearity is preset before the phototransduction reactions adapt the quantum bump waveform. However, the contribution of quantum-gain-nonlinearity in light adaptation depends upon the likelihood of multi-photon-hits, which is strictly determined by the number of microvilli and light intensity. Specifically, its contribution to light-adaptation is marginal (≤ 1%) in fly photoreceptors with many thousands of microvilli, because the probability of simultaneous multi-photon-hits on any one microvillus is low even during daylight conditions. However, in cells with fewer sampling units, the impact of quantum-gain-nonlinearity increases with brightening light. PMID:27445779
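The key combinatorial point, that simultaneous multi-photon hits on any one microvillus are rare when microvilli are numerous, follows from a simple binomial tail. The sketch below uses illustrative counts (the microvillus and photon numbers are assumptions, not values from the paper):

```python
def p_multi_hit(n_microvilli, n_photons):
    """Probability that a given microvillus receives two or more of
    `n_photons` photons landing uniformly at random on `n_microvilli`
    sampling units: 1 - P(0 hits) - P(exactly 1 hit)."""
    p = 1.0 / n_microvilli
    p0 = (1 - p) ** n_photons
    p1 = n_photons * p * (1 - p) ** (n_photons - 1)
    return 1 - p0 - p1

# With tens of thousands of microvilli, a few hundred simultaneous
# photons almost never double-hit any given microvillus:
print(p_multi_hit(30_000, 500))    # small probability (~1e-4 scale)
print(p_multi_hit(30_000, 5_000))  # grows as light gets brighter
```

This is exactly the dependence the abstract describes: the contribution of quantum-gain-nonlinearity is governed by the number of sampling units and the light intensity, growing only as multi-hits become likely.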

  9. Notes on interval estimation of the gamma correlation under stratified random sampling.

    Science.gov (United States)

    Lui, Kung-Jong; Chang, Kuang-Chao

    2012-07-01

    We have developed four asymptotic interval estimators in closed forms for the gamma correlation under stratified random sampling, including the confidence interval based on the most commonly used weighted-least-squares (WLS) approach (CIWLS), the confidence interval calculated from the Mantel-Haenszel (MH) type estimator with the Fisher-type transformation (CIMHT), the confidence interval using the fundamental idea of Fieller's Theorem (CIFT) and the confidence interval derived from a monotonic function of the WLS estimator of Agresti's α with the logarithmic transformation (MWLSLR). To evaluate the finite-sample performance of these four interval estimators and note the possible loss of accuracy in application of both Wald's confidence interval and MWLSLR using pooled data without accounting for stratification, we employ Monte Carlo simulation. We use the data taken from a general social survey studying the association between the income level and job satisfaction with strata formed by genders in black Americans published elsewhere to illustrate the practical use of these interval estimators. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
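The Monte Carlo evaluation strategy the authors describe (simulate many samples, build the interval each time, record coverage probability and average length) can be sketched on a simpler estimator. Here the Wald interval for a binomial proportion stands in for the paper's gamma-correlation intervals, purely as an illustration of the evaluation machinery:

```python
import numpy as np

rng = np.random.default_rng(1)

def wald_ci_coverage(p_true=0.3, n=100, n_sim=20_000, z=1.96, rng=rng):
    """Monte Carlo evaluation of an interval estimator: simulate `n_sim`
    binomial samples, form the Wald interval for each, and return the
    empirical coverage probability and the average interval length."""
    x = rng.binomial(n, p_true, size=n_sim)
    phat = x / n
    half = z * np.sqrt(phat * (1 - phat) / n)
    covered = (phat - half <= p_true) & (p_true <= phat + half)
    return covered.mean(), (2 * half).mean()

coverage, avg_len = wald_ci_coverage()
print(coverage, avg_len)  # Wald tends to cover slightly below nominal 95%
```

Comparing estimators then reduces to comparing these two numbers (plus tail noncoverage, as in the abstract): an interval can be short but undercover, or conservative but wide.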

  10. Predictive value of testing random urine sample to detect microalbuminuria in diabetic subjects during outpatient visit.

    Science.gov (United States)

    Bouhanick, B; Berrut, G; Chameau, A M; Hallar, M; Bled, F; Chevet, B; Vergely, J; Rohmer, V; Fressinaud, P; Marre, M

    1992-01-01

    The value of a random urine sample obtained during an outpatient visit for predicting persistent microalbuminuria was studied in 76 Type 1, insulin-dependent diabetic subjects, 61 Type 2, non-insulin-dependent diabetic subjects, and 72 Type 2, insulin-treated diabetic subjects. Seventy-six patients attended the outpatient clinic in the morning, and 133 in the afternoon. Microalbuminuria was suspected if urinary albumin excretion (UAE) exceeded 20 mg/l. All patients were hospitalized within 6 months of the outpatient visit, and persistent microalbuminuria was diagnosed if UAE was between 30 and 300 mg/24 h on 2-3 occasions in 3 urine samples. Of these 209 subjects, 83 were also screened with the Microbumintest (Ames-Bayer), a semi-quantitative method. Among the 209 subjects, 71 were positive both for microalbuminuria during the outpatient visit and for persistent microalbuminuria during hospitalization: sensitivity 91.0%, specificity 83.2%, concordance 86.1%, and positive predictive value 76.3% (chi-squared test: 191; p < 10⁻⁴). Results did not differ between subjects examined in the morning and in the afternoon. Among the 83 subjects also screened with the Microbumintest, 22 displayed both a positive reaction and persistent microalbuminuria: sensitivity 76%, specificity 81%, concordance 80%, and positive predictive value 69% (chi-squared test: 126; p < 10⁻⁴). Both types of screening appeared equally effective during an outpatient visit. Hence, persistent microalbuminuria can be predicted during an outpatient visit in a diabetic clinic.
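The reported screening statistics all follow from a standard 2x2 table. The counts below are back-solved from the published figures (209 subjects, 71 true positives, sensitivity 91.0%, PPV 76.3%) and reproduce the four reported percentages; they are a reconstruction for illustration, not the study's raw tabulation.

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, concordance (overall agreement) and
    positive predictive value from a 2x2 screening table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / (tp + fp + fn + tn)
    ppv = tp / (tp + fp)
    return sens, spec, acc, ppv

# Counts consistent with the abstract's 209-subject cohort:
sens, spec, acc, ppv = screening_metrics(tp=71, fp=22, fn=7, tn=109)
print(sens, spec, acc, ppv)  # ~0.910, 0.832, 0.861, 0.763
```

Note that PPV, unlike sensitivity and specificity, depends on the prevalence of persistent microalbuminuria in the cohort, which is why it is the most clinic-specific of the four numbers.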

  11. Effectiveness of hand hygiene education among a random sample of women from the community.

    Science.gov (United States)

    Ubheeram, J; Biranjia-Hurdoyal, S D

    2017-03-01

    The effectiveness of hand hygiene education was investigated by studying hand hygiene awareness and bacterial hand contamination in a random sample of 170 women in the community. A questionnaire was used to assess the hand hygiene awareness score, followed by swabbing of the dominant hand. Bacterial identification was done by conventional biochemical tests. A better hand hygiene awareness score was significantly associated with age, scarce bacterial growth and absence of potential pathogens. Of the hand samples, bacterial growth was noted in 155 (91.2%), comprising 91 (53.5%) with heavy growth, 53 (31.2%) with moderate growth and 11 (6.47%) with scanty growth. The presence of enteric bacteria was associated with long nails (49.4% vs 29.2%; p = 0.007; OR = 2.3; 95% CI: 1.25-4.44), while finger rings were associated with a higher bacterial load (p = 0.003). Coliforms were significantly more common among women who had a lower hand hygiene awareness score, washed their hands less frequently (59.0% vs 32.8%; p = 0.003; OR = 2.9; 95% CI: 1.41-6.13) and used plain soap rather than antiseptic soap (69.7% vs 30.3%; p = 0.000; OR = 4.11; 95% CI: 1.67-10.12). The level of hand hygiene awareness among the participants was satisfactory, but compliance with hand-washing practice was not, especially among the elders.

  12. Association between stalking victimisation and psychiatric morbidity in a random community sample.

    Science.gov (United States)

    Purcell, Rosemary; Pathé, Michele; Mullen, Paul E

    2005-11-01

    No studies have assessed psychopathology among victims of stalking who have not sought specialist help. To examine the associations between stalking victimisation and psychiatric morbidity in a representative community sample. A random community sample (n=1844) completed surveys examining the experience of harassment and current mental health. The 28-item General Health Questionnaire (GHQ-28) and the Impact of Event Scale were used to assess symptomatology in those reporting brief harassment (n=196) or protracted stalking (n=236) and a matched control group reporting no harassment (n=432). Rates of caseness on the GHQ-28 were higher among stalking victims (36.4%) than among controls (19.3%) and victims of brief harassment (21.9%). Psychiatric morbidity did not differ according to the recency of victimisation, with 34.1% of victims meeting caseness criteria 1 year after stalking had ended. In a significant minority of victims, stalking victimisation is associated with psychiatric morbidity that may persist long after it has ceased. Recognition of the immediate and long-term impacts of stalking is necessary to assist victims and help alleviate distress and long-term disability.

  13. Random sample community-based health surveys: does the effort to reach participants matter?

    Science.gov (United States)

    Messiah, Antoine; Castro, Grettel; Rodríguez de la Vega, Pura; Acuna, Juan M

    2014-12-15

    Conducting health surveys with community-based random samples is essential to capture an otherwise unreachable population, but these surveys can be biased if the effort to reach participants is insufficient. This study determines the desirable amount of effort to minimise such bias. A household-based health survey with random sampling and face-to-face interviews was conducted; up to 11 visits, organised by canvassing rounds, were made to obtain an interview. The setting was single-family homes in an underserved and understudied population in North Miami-Dade County, Florida, USA. Of a probabilistic sample of 2200 household addresses, 30 corresponded to empty lots, 74 were abandoned houses, 625 households declined to participate and 265 could not be reached and interviewed within 11 attempts. Analyses were performed on the 1206 remaining households. Each household was asked if any of their members had been told by a doctor that they had high blood pressure, heart disease including heart attack, cancer, diabetes, anxiety/depression, obesity or asthma. Responses to these questions were analysed by the number of visit attempts needed to obtain the interview. Return per visit fell below 10% after four attempts, below 5% after six attempts and below 2% after eight attempts. As the effort increased, household size decreased, while household income and the percentage of interviewees active and employed increased; the proportions of the seven health conditions decreased, four of them significantly: heart disease from 20.4% to 9.2%, high blood pressure from 63.5% to 58.1%, anxiety/depression from 24.4% to 9.2% and obesity from 21.8% to 12.6%. Beyond the fifth attempt, however, cumulative percentages varied by less than 1% and precision varied by less than 0.1%. In spite of the early and steep drop, sustaining at least five attempts to reach participants is necessary to reduce selection bias. Published by the BMJ Publishing Group Limited.

  14. Sample-to-sample fluctuations of power spectrum of a random motion in a periodic Sinai model

    Science.gov (United States)

    Dean, David S.; Iorio, Antonio; Marinari, Enzo; Oshanin, Gleb

    2016-09-01

    The Sinai model of a tracer diffusing in a quenched Brownian potential is a much-studied problem exhibiting logarithmically slow anomalous diffusion due to the growth of energy barriers with the system size. However, if the potential is random but periodic, the regime of anomalous diffusion crosses over to one of normal diffusion once a tracer has diffused over a few periods of the system. Here we consider a system in which the potential is given by a Brownian bridge on a finite interval (0, L) and then periodically repeated over the whole real line, and study the power spectrum S(f) of the diffusive process x(t) in such a potential. We show that for most realizations of x(t) in a given realization of the potential, the low-frequency behavior is S(f) ~ A/f², i.e., the same as for standard Brownian motion, where the amplitude A is a disorder-dependent random variable with finite support. Focusing on the statistical properties of this random variable, we determine the moments of A of arbitrary, negative or positive, order k and demonstrate that they exhibit a multifractal dependence on k and a rather unusual dependence on the temperature and on the periodicity L, which are supported by atypical realizations of the periodic disorder. We finally show that the distribution of A has a log-normal left tail and exhibits an essential singularity close to the right edge of the support, which is related to the Lifshitz singularity. Our findings are based both on analytic results and on extensive numerical simulations of the process x(t).
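The S(f) ~ A/f² baseline for ordinary Brownian motion is easy to check numerically: simulate a discrete random walk, take its periodogram, and fit the low-frequency slope on log-log axes. This is a sketch of the reference behavior only, not the paper's periodic-potential simulation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Discrete Brownian motion and its periodogram.
n = 2 ** 16
x = np.cumsum(rng.normal(0, 1, n))
spec = np.abs(np.fft.rfft(x)) ** 2 / n   # periodogram estimate of S(f)
f = np.fft.rfftfreq(n, d=1.0)

# Fit log S(f) vs log f over low frequencies; slope should be near -2.
mask = (f > 0) & (f < 0.01)
slope = np.polyfit(np.log(f[mask]), np.log(spec[mask]), 1)[0]
print(round(slope, 1))
```

For the periodic Sinai potential, the paper's point is that this exponent survives while the prefactor A becomes a disorder-dependent random variable; the sketch recovers only the exponent.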

  15. [Systematic review on methodology of randomized controlled trials of post-marketing Chinese patent drugs for treatment of type 2 diabetes].

    Science.gov (United States)

    Ma, Li-xin; Wang, Yu-yi; Li, Xin-xue; Liu, Jian-ping

    2012-03-01

    Randomized controlled trial (RCT) is considered as the gold standard for the efficacy assessment of medicines. With the increasing number of Chinese patent drugs for treatment of type 2 diabetes, the methodology of post-marketing RCTs evaluating the efficacy and specific effect has become more important. To investigate post-marketing Chinese patent drugs for treatment of type 2 diabetes, as well as the methodological quality of post-marketing RCTs. Literature was searched from the books of Newly Compiled Traditional Chinese Patent Medicine and Chinese Pharmacopeia, the websites of the State Food and Drug Administration and the Ministry of Human Resources and Social Security of the People's Republic of China, China National Knowledge Infrastructure Database, Chongqing VIP Chinese Science and Technology Periodical Database, Chinese Biomedical Database (SinoMed) and Wanfang Data. The time period for searching ran from the commencement of each database to August 2011. RCTs of post-marketing Chinese patent drugs for treatment of type 2 diabetes with intervention course no less than 3 months. Two authors independently evaluated the research quality of the RCTs by the checklist of risk bias assessment and the data collection forms based on the CONSORT Statement. Independent double data-extraction was performed. The authors identified a total of 149 Chinese patent drugs for treatment of type 2 diabetes. According to different indicative syndromes, the Chinese patent drugs can be divided into the following types, namely, yin deficiency and interior heat (n=48, 32%), dual deficiency of qi and yin (n=58, 39%) and dual deficiency of qi and yin combined with blood stasis (n=22, 15%). A total of 41 RCTs meeting the inclusion criteria were included. Neither multicenter RCTs nor endpoint outcome reports were found. Risk bias analysis showed that 81% of the included studies reported randomization for grouping without sequence generation, 98% of these studies did not report

  16. Random Model Sampling: Making Craig Interpolation Work When It Should Not

    Directory of Open Access Journals (Sweden)

    Marat Akhin

    2014-01-01

    Full Text Available One of the most serious problems in program analysis is dealing with function calls. While function inlining is the traditional approach to this problem, it suffers from increased analysis complexity due to state space explosion. Craig interpolation has been successfully used in recent years in the context of bounded model checking to do function summarization, which allows one to replace a complete function body with a succinct summary and thereby reduce the complexity. Unfortunately, this technique can be applied only to a pair of unsatisfiable formulae. In this work-in-progress paper we present an approach to function summarization based on Craig interpolation that overcomes this limitation by using random model sampling. It captures interesting input/output relations, strengthening satisfiable formulae into unsatisfiable ones and thus allowing the use of Craig interpolation. Preliminary experiments show the applicability of this approach; in our future work we plan to do a full evaluation on real-world examples.

  17. Discriminative motif discovery via simulated evolution and random under-sampling.

    Directory of Open Access Journals (Sweden)

    Tao Song

    Full Text Available Conserved motifs in biological sequences are closely related to their structure and functions. Recently, discriminative motif discovery methods have attracted increasing attention. However, little attention has been devoted to the data imbalance problem, which is one of the main factors degrading the performance of discriminative models. In this article, a simulated evolution method is applied to solve the multi-class imbalance problem at the data preprocessing stage, and at the Hidden Markov Model (HMM) training stage, a random under-sampling method is introduced to address the imbalance between the positive and negative datasets. We show that, in the task of discovering targeting motifs of nine subcellular compartments, the motifs found by our method are more conserved than those found by methods that ignore the data imbalance problem, and recover most of the known targeting motifs from Minimotif Miner and InterPro. Meanwhile, we use the found motifs to predict protein subcellular localization and achieve higher prediction precision and recall for the minority classes.
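The under-sampling step described in this record is generic enough to sketch: the majority class is randomly thinned until both classes contribute equally to training. The snippet below is a minimal illustration of that idea only; the names and data are hypothetical, and the paper's HMM-specific training details are not reproduced.

```python
import random

def under_sample(positives, negatives, seed=0):
    """Randomly under-sample the majority class so that both classes
    end up the same size (a generic sketch of random under-sampling)."""
    rng = random.Random(seed)
    n = min(len(positives), len(negatives))
    return rng.sample(positives, n), rng.sample(negatives, n)

pos = ["seq%d" % i for i in range(40)]    # minority class (hypothetical ids)
neg = ["seq%d" % i for i in range(400)]   # majority class (hypothetical ids)
bal_pos, bal_neg = under_sample(pos, neg)
print(len(bal_pos), len(bal_neg))  # 40 40
```

Balancing before training keeps the model from trivially favoring the majority class, at the cost of discarding some majority-class examples.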

  18. Neurofeedback Against Binge Eating: A Randomized Controlled Trial in a Female Subclinical Threshold Sample.

    Science.gov (United States)

    Schmidt, Jennifer; Martin, Alexandra

    2016-09-01

    Brain-directed treatment techniques, such as neurofeedback, have recently been proposed as adjuncts in the treatment of eating disorders to improve therapeutic outcomes. In line with this recommendation, a cue exposure EEG-neurofeedback protocol was developed. The present study aimed at the evaluation of the specific efficacy of neurofeedback to reduce subjective binge eating in a female subthreshold sample. A total of 75 subjects were randomized to EEG-neurofeedback, mental imagery with a comparable treatment set-up or a waitlist group. At post-treatment, only EEG-neurofeedback led to a reduced frequency of binge eating (p = .015, g = 0.65). The effects remained stable to a 3-month follow-up. EEG-neurofeedback further showed particular beneficial effects on perceived stress and dietary self-efficacy. Differences in outcomes did not arise from divergent treatment expectations. Because EEG-neurofeedback showed a specific efficacy, it may be a promising brain-directed approach that should be tested as a treatment adjunct in clinical groups with binge eating. Copyright © 2016 John Wiley & Sons, Ltd and Eating Disorders Association.

  19. A coupled well-balanced and random sampling scheme for computing bubble oscillations*

    Directory of Open Access Journals (Sweden)

    Jung Jonathan

    2012-04-01

    Full Text Available We propose a finite volume scheme to study the oscillations of a spherical bubble of gas in a liquid phase. Spherical symmetry implies a geometric source term in the Euler equations. Our scheme satisfies the well-balanced property. It is based on the VFRoe approach. In order to avoid spurious pressure oscillations, the well-balanced approach is coupled with an ALE (Arbitrary Lagrangian Eulerian) technique at the interface and a random sampling remap.

  20. Accelerating Markov chain Monte Carlo simulation by differential evolution with self-adaptive randomized subspace sampling

    Energy Technology Data Exchange (ETDEWEB)

    Vrugt, Jasper A [Los Alamos National Laboratory; Hyman, James M [Los Alamos National Laboratory; Robinson, Bruce A [Los Alamos National Laboratory; Higdon, Dave [Los Alamos National Laboratory; Ter Braak, Cajo J F [NETHERLANDS; Diks, Cees G H [UNIV OF AMSTERDAM

    2008-01-01

    Markov chain Monte Carlo (MCMC) methods have found widespread use in many fields of study to estimate the average properties of complex systems, and for posterior inference in a Bayesian framework. Existing theory and experiments prove convergence of well-constructed MCMC schemes to the appropriate limiting distribution under a variety of different conditions. In practice, however, this convergence is often observed to be disturbingly slow. This is frequently caused by an inappropriate selection of the proposal distribution used to generate trial moves in the Markov chain. Here we show that significant improvements to the efficiency of MCMC simulation can be made by using a self-adaptive Differential Evolution learning strategy within a population-based evolutionary framework. This scheme, entitled DiffeRential Evolution Adaptive Metropolis or DREAM, runs multiple different chains simultaneously for global exploration, and automatically tunes the scale and orientation of the proposal distribution in randomized subspaces during the search. Ergodicity of the algorithm is proved, and various examples involving nonlinearity, high-dimensionality, and multimodality show that DREAM is generally superior to other adaptive MCMC sampling approaches. The DREAM scheme significantly enhances the applicability of MCMC simulation to complex, multi-modal search problems.
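The differential-evolution proposal that DREAM builds on can be sketched in a few lines: each chain proposes a jump along the difference of two other randomly chosen chains, followed by a Metropolis accept/reject step. The sketch below targets a standard normal density and deliberately omits DREAM's randomized-subspace sampling and crossover adaptation; it illustrates the underlying DE-MC move, not the full algorithm.

```python
import math
import random

random.seed(1)
log_p = lambda x: -0.5 * x * x          # log-density of N(0,1), up to a constant

n_chains, n_iter = 8, 4000
chains = [random.uniform(-3, 3) for _ in range(n_chains)]
gamma = 2.38 / math.sqrt(2.0)           # classic DE-MC scale for dimension d=1
samples = []

for _ in range(n_iter):
    for i in range(n_chains):
        # propose along the difference of two other randomly chosen chains
        a, b = random.sample([j for j in range(n_chains) if j != i], 2)
        prop = chains[i] + gamma * (chains[a] - chains[b]) + random.gauss(0, 1e-6)
        # Metropolis accept/reject
        if math.log(random.random()) < log_p(prop) - log_p(chains[i]):
            chains[i] = prop
        samples.append(chains[i])

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(mean, var)  # should be near 0 and 1 for the N(0,1) target
```

Running multiple chains lets the proposal scale itself to the target: in well-explored regions the chain differences are small, in poorly explored regions they are large.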

  1. Comparing cluster-level dynamic treatment regimens using sequential, multiple assignment, randomized trials: Regression estimation and sample size considerations.

    Science.gov (United States)

    NeCamp, Timothy; Kilbourne, Amy; Almirall, Daniel

    2017-08-01

    Cluster-level dynamic treatment regimens can be used to guide sequential treatment decision-making at the cluster level in order to improve outcomes at the individual or patient-level. In a cluster-level dynamic treatment regimen, the treatment is potentially adapted and re-adapted over time based on changes in the cluster that could be impacted by prior intervention, including aggregate measures of the individuals or patients that compose it. Cluster-randomized sequential multiple assignment randomized trials can be used to answer multiple open questions preventing scientists from developing high-quality cluster-level dynamic treatment regimens. In a cluster-randomized sequential multiple assignment randomized trial, sequential randomizations occur at the cluster level and outcomes are observed at the individual level. This manuscript makes two contributions to the design and analysis of cluster-randomized sequential multiple assignment randomized trials. First, a weighted least squares regression approach is proposed for comparing the mean of a patient-level outcome between the cluster-level dynamic treatment regimens embedded in a sequential multiple assignment randomized trial. The regression approach facilitates the use of baseline covariates which is often critical in the analysis of cluster-level trials. Second, sample size calculators are derived for two common cluster-randomized sequential multiple assignment randomized trial designs for use when the primary aim is a between-dynamic treatment regimen comparison of the mean of a continuous patient-level outcome. The methods are motivated by the Adaptive Implementation of Effective Programs Trial which is, to our knowledge, the first-ever cluster-randomized sequential multiple assignment randomized trial in psychiatry.

  2. Prevalence and Serogroup Diversity of Salmonella for Broiler Neck Skin, Whole Carcass Rinse, and Whole Carcass Enrichment Sampling Methodologies following Air or Immersion Chilling.

    Science.gov (United States)

    Bourassa, D V; Holmes, J M; Cason, J A; Cox, N A; Rigsby, L L; Buhr, R J

    2015-11-01

    The purpose of this study was to evaluate neck skin (NS), whole carcass rinse (WCR), and whole carcass enrichment (WCE) sampling procedures for Salmonella isolation and serogroup identification from the same broiler chicken carcass treated with air or immersion chilling. Commercially processed and eviscerated broiler carcasses were collected from a commercial processing plant, individually bagged, and transported to the pilot processing plant. In experiment 1, carcasses were air chilled to 4°C. In experiment 2, carcasses were immersion chilled with or without chlorine. After air chilling, Salmonella was detected on 78% of NS and 89% of WCE samples. Only one Salmonella serogroup was detected from each of 13 Salmonella-positive NS samples, and two serogroups were detected on 1 Salmonella-positive NS sample. Only one Salmonella serogroup was detected from each of 13 Salmonella-positive WCE samples, and two serogroups were detected from 3 Salmonella-positive WCE samples. After immersion chilling without chlorine, Salmonella was detected on 38% of NS, 45% of WCR, and 100% of WCE samples. Without chlorine, the 15 Salmonella-positive NS samples included 14 samples with one serogroup and 1 sample with two serogroups. Only one Salmonella serogroup was detected from WCR samples after immersion chilling. Of 40 Salmonella-positive WCE samples, 23 had one, 14 had two, and 3 had three Salmonella serogroups. After immersion chilling with chlorine, Salmonella was detected on 35% of NS, 0% of WCR, and 90% of WCE samples. With chlorine, the 14 Salmonella-positive NS samples included 11 samples with one serogroup and 3 samples with two serogroups. No Salmonella serogroups were detected from WCR samples after immersion chilling with 20 mg/liter free chlorine. The 36 Salmonella-positive WCE samples included 21 samples with one serogroup and 15 samples with two serogroups. NS and WCE sampling methodologies yielded similar prevalence and serogroup diversity after air chilling.

  3. New classification of natural breeding habitats for Neotropical anophelines in the Yanomami Indian Reserve, Amazon Region, Brazil and a new larval sampling methodology

    Directory of Open Access Journals (Sweden)

    Jordi Sánchez-Ribas

    2015-09-01

    Full Text Available Here we present the first in a series of articles about the ecology of immature stages of anophelines in the Brazilian Yanomami area. We propose a new larval habitat classification and a new larval sampling methodology. We also report some preliminary results illustrating the applicability of the methodology based on data collected in the Brazilian Amazon rainforest in a longitudinal study of two remote Yanomami communities, Parafuri and Toototobi. In these areas, we mapped and classified 112 natural breeding habitats located in low-order river systems based on their association with river flood pulses, seasonality and exposure to sun. Our classification rendered seven types of larval habitats: lakes associated with the river, which are subdivided into oxbow lakes and nonoxbow lakes, flooded areas associated with the river, flooded areas not associated with the river, rainfall pools, small forest streams, medium forest streams and rivers. The methodology for larval sampling was based on the accurate quantification of the effective breeding area, taking into account the area of the perimeter and subtypes of microenvironments present per larval habitat type using a laser range finder and a small portable inflatable boat. The new classification and new sampling methodology proposed herein may be useful in vector control programs.

  4. Proposal of the Methodology for Analysing the Structural Relationship in the System of Random Process Using the Data Mining Methods

    National Research Council Canada - National Science Library

    German Michaľčonok; Michaela Horalová Kalinová; Martin Németh

    2014-01-01

    … In this paper, we approach the area of random processes, present the process of structural analysis, and select a suitable circuit of data mining methods applicable to the area of structural analysis …

  5. Experiments with central-limit properties of spatial samples from locally covariant random fields

    Science.gov (United States)

    Barringer, T.H.; Smith, T.E.

    1992-01-01

    When spatial samples are statistically dependent, the classical estimator of sample-mean standard deviation is well known to be inconsistent. For locally dependent samples, however, consistent estimators of sample-mean standard deviation can be constructed. The present paper investigates the sampling properties of one such estimator, designated as the tau estimator of sample-mean standard deviation. In particular, the asymptotic normality properties of standardized sample means based on tau estimators are studied in terms of computer experiments with simulated sample-mean distributions. The effects of both sample size and dependency levels among samples are examined for various values of tau (denoting the size of the spatial kernel for the estimator). The results suggest that even for small degrees of spatial dependency, the tau estimator exhibits significantly stronger normality properties than does the classical estimator of standardized sample means.
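A one-dimensional analogue of such a kernel estimator is easy to sketch: covariance terms are included only for pairs of observations within lag tau, so tau = 0 recovers the classical independent-sample estimate (up to the n vs. n-1 convention). The code below is an illustrative simplification, not the spatial estimator studied in the paper, and the data are made-up values.

```python
import math

def tau_se(x, tau):
    """Kernel estimate of the standard deviation of the sample mean for
    locally dependent data: sum covariance terms only over pairs of
    indices within lag tau (a 1-D simplification of a spatial kernel)."""
    n = len(x)
    xbar = sum(x) / n
    d = [xi - xbar for xi in x]
    s2 = sum(d[i] * d[j]
             for i in range(n)
             for j in range(max(0, i - tau), min(n, i + tau + 1)))
    return math.sqrt(max(s2, 0.0)) / n

x = [1.0, 2.0, 2.5, 3.5, 3.0, 4.0, 5.0, 4.5]  # hypothetical transect values
print(tau_se(x, 0))  # tau = 0: only variance terms, the i.i.d. case
print(tau_se(x, 2))  # tau = 2: allows dependence up to lag 2
```

For positively autocorrelated data the tau > 0 estimate is larger than the tau = 0 one, which is exactly the inconsistency of the classical estimator that the record describes.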

  6. LONG-TERM VARIABILITY OF BRONCHIAL RESPONSIVENESS TO HISTAMINE IN A RANDOM-POPULATION SAMPLE OF ADULTS

    NARCIS (Netherlands)

    RIJCKEN, B; SCHOUTEN, JP; WEISS, ST; ROSNER, B; DEVRIES, K; VANDERLENDE, R

    1993-01-01

    Long-term variability of bronchial responsiveness has been studied in a random population sample of adults. During a follow-up period of 18 yr, 2,216 subjects contributed 5,012 observations to the analyses. Each subject could have as many as seven observations. Bronchial responsiveness was assessed

  7. Albumin to creatinine ratio in a random urine sample: Correlation with severity of preeclampsia

    Directory of Open Access Journals (Sweden)

    Fady S. Moiety

    2014-06-01

    Conclusions: Random urine ACR may be a reliable method for prediction and assessment of severity of preeclampsia. Using the estimated cut-off may add to the predictive value of such a simple quick test.

  8. FACE Analysis as a Fast and Reliable Methodology to Monitor the Sulfation and Total Amount of Chondroitin Sulfate in Biological Samples of Clinical Importance

    Directory of Open Access Journals (Sweden)

    Evgenia Karousou

    2014-06-01

    Full Text Available Glycosaminoglycans (GAGs due to their hydrophilic character and high anionic charge densities play important roles in various (pathophysiological processes. The identification and quantification of GAGs in biological samples and tissues could be useful prognostic and diagnostic tools in pathological conditions. Despite the noteworthy progress in the development of sensitive and accurate methodologies for the determination of GAGs, there is a significant lack in methodologies regarding sample preparation and reliable fast analysis methods enabling the simultaneous analysis of several biological samples. In this report, developed protocols for the isolation of GAGs in biological samples were applied to analyze various sulfated chondroitin sulfate- and hyaluronan-derived disaccharides using fluorophore-assisted carbohydrate electrophoresis (FACE. Applications to biologic samples of clinical importance include blood serum, lens capsule tissue and urine. The sample preparation protocol followed by FACE analysis allows quantification with an optimal linearity over the concentration range 1.0–220.0 µg/mL, affording a limit of quantitation of 50 ng of disaccharides. Validation of FACE results was performed by capillary electrophoresis and high performance liquid chromatography techniques.
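The linearity claim behind a statement like "optimal linearity over the concentration range 1.0–220.0 µg/mL" is ordinarily checked with a least-squares calibration fit and its R². A minimal sketch of that check follows; the concentrations and responses are made-up illustration values, not data from the paper.

```python
def linear_fit(x, y):
    """Ordinary least-squares calibration line with R^2 -- the usual
    check that instrument response is linear over the working range."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1.0 - ss_res / ss_tot

conc = [1.0, 10.0, 50.0, 100.0, 220.0]   # ug/mL, spanning the stated range
resp = [2.1, 20.4, 101.0, 199.5, 441.0]  # hypothetical fluorescence readings
slope, intercept, r2 = linear_fit(conc, resp)
print(slope, intercept, r2)
```

An R² close to 1 over the full range, together with an intercept near zero, supports quantification by the fitted line down to the limit of quantitation.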

  9. Evaluation of a Class of Simple and Effective Uncertainty Methods for Sparse Samples of Random Variables and Functions

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bonney, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schroeder, Benjamin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Weirs, V. Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-11-01

    When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: central 95% of response; and 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depends on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large database and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
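One simple example of the kind of conservative sparse-sample device this record discusses is a Clopper-Pearson-style upper confidence bound on a tail probability: with n samples and none exceeding the threshold, the 95% upper bound on the exceedance probability is 1 − 0.05^(1/n). The sketch below illustrates conservative bounding in general; it is not one of the specific methods evaluated in the report.

```python
from math import comb

def upper_bound_exceedance(n_samples, n_exceed, conf=0.95, grid=100000):
    """Conservative (Clopper-Pearson-style) upper confidence bound on the
    probability of exceeding a threshold, from a small sample.
    For n_exceed == 0 this reduces to 1 - (1 - conf)**(1/n)."""
    if n_exceed == 0:
        return 1.0 - (1.0 - conf) ** (1.0 / n_samples)
    # generic case: smallest p whose binomial lower tail drops to 1 - conf
    for k in range(1, grid + 1):
        p = k / grid
        tail = sum(comb(n_samples, j) * p**j * (1 - p)**(n_samples - j)
                   for j in range(n_exceed + 1))
        if tail <= 1.0 - conf:
            return p
    return 1.0

print(upper_bound_exceedance(10, 0))  # ~0.259: ten clean samples cannot
                                      # rule out a ~26% exceedance rate
```

The bound shrinks slowly with n, which quantifies the report's point: very few samples support only weak, deliberately conservative statements about tail probabilities.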

  10. "What’s so special about studying old people?". The ethical, methodological, and sampling issues surrounding the study of older adults and ICTs

    OpenAIRE

    Birkland, Johanna L.

    2010-01-01

    Generational research surrounding technology use and trends is beginning to receive more attention. Most of this research has focused on the younger generations, while ignoring those generations at the older end of the age spectrum. As research of younger generations becomes more popular, it is likely that researching older generations will also grow in popularity. There are, however, several methodological, ethical, and sampling challenges that researchers must consider whe...

  11. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data Sampling provides an up-to-date treat

  12. Beyond Random Walk and Metropolis-Hastings Samplers: Why You Should Not Backtrack for Unbiased Graph Sampling

    CERN Document Server

    Lee, Chul-Ho; Eun, Do Young

    2012-01-01

    Graph sampling via crawling has been actively considered as a generic and important tool for collecting uniform node samples so as to consistently estimate and uncover various characteristics of complex networks. The so-called simple random walk with re-weighting (SRW-rw) and Metropolis-Hastings (MH) algorithm have been popular in the literature for such unbiased graph sampling. However, an unavoidable downside of their core random walks -- slow diffusion over the space, can cause poor estimation accuracy. In this paper, we propose non-backtracking random walk with re-weighting (NBRW-rw) and MH algorithm with delayed acceptance (MHDA) which are theoretically guaranteed to achieve, at almost no additional cost, not only unbiased graph sampling but also higher efficiency (smaller asymptotic variance of the resulting unbiased estimators) than the SRW-rw and the MH algorithm, respectively. In particular, a remarkable feature of the MHDA is its applicability for any non-uniform node sampling like the MH algorithm,...
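The re-weighting idea underlying SRW-rw and NBRW-rw is straightforward to sketch: a random walk on an undirected graph visits nodes in proportion to their degree, so weighting each visit by 1/degree recovers an unbiased estimate of a uniform node average. The toy below implements a non-backtracking walk with 1/degree re-weighting on a small hand-built graph; the delayed-acceptance machinery of MHDA is not reproduced.

```python
import random

# Undirected toy graph as adjacency lists (hypothetical example).
G = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2, 4], 4: [3]}

def nbrw_estimate(G, f, steps=20000, seed=0):
    """Estimate the uniform average of f over nodes from a
    non-backtracking random walk, re-weighting each visit by 1/degree
    to undo the walk's degree-proportional visiting bias."""
    rng = random.Random(seed)
    prev, cur = None, 0
    num = den = 0.0
    for _ in range(steps):
        w = 1.0 / len(G[cur])
        num += w * f(cur)
        den += w
        # avoid returning to the previous node; backtrack only if forced
        nbrs = [v for v in G[cur] if v != prev] or G[cur]
        prev, cur = cur, rng.choice(nbrs)
    return num / den

# true uniform average of node ids: (0+1+2+3+4)/5 = 2.0
print(nbrw_estimate(G, lambda v: v))
```

The non-backtracking rule does not change the walk's degree-proportional stationary visiting frequencies on an undirected graph, which is why the same 1/degree re-weighting applies while the walk diffuses faster.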

  13. Identifying the origin of groundwater samples in a multi-layer aquifer system with Random Forest classification

    Science.gov (United States)

    Baudron, Paul; Alonso-Sarría, Francisco; García-Aróstegui, José Luís; Cánovas-García, Fulgencio; Martínez-Vicente, David; Moreno-Brotóns, Jesús

    2013-08-01

    Accurate identification of the origin of groundwater samples is not always possible in complex multilayered aquifers. This poses a major difficulty for a reliable interpretation of geochemical results. The problem is especially severe when information on the tubewell design is hard to obtain. This paper shows a supervised classification method based on the Random Forest (RF) machine learning technique to identify the layer from which groundwater samples were extracted. The classification rules were based on the major ion composition of the samples. We applied this method to the Campo de Cartagena multi-layer aquifer system in southeastern Spain. A large amount of hydrogeochemical data was available, but only a limited fraction of the sampled tubewells included a reliable determination of the borehole design and, consequently, of the aquifer layer being exploited. An added difficulty was the very similar composition of water samples extracted from different aquifer layers. Moreover, not all groundwater samples included the same geochemical variables. Despite this challenging background, the Random Forest classification reached accuracies over 90%. These results were much better than those of the Linear Discriminant Analysis (LDA) and Decision Trees (CART) supervised classification methods. From a total of 1549 samples, 805 proceeded from one unique identified aquifer, 409 proceeded from a possible blend of waters from several aquifers and 335 were of unknown origin. Only 468 of the 805 unique-aquifer samples included all the chemical variables needed to calibrate and validate the models. Finally, 107 of the groundwater samples of unknown origin could be classified; most of the remainder did not feature a complete dataset. The uncertainty in the identification of training samples was taken into account to enhance the model.

  14. Soil map disaggregation improved by soil-landscape relationships, area-proportional sampling and random forest implementation

    DEFF Research Database (Denmark)

    Møller, Anders Bjørn; Malone, Brendan P.; Odgers, Nathan

    … of European Communities (CEC, 1985), respectively, both using the FAO 1974 classification. Furthermore, the effects of implementing soil-landscape relationships, using area-proportional sampling instead of per-polygon sampling, and replacing the default C5.0 classification tree algorithm with a random forest algorithm were evaluated. The resulting maps were validated on 777 soil profiles situated in a grid covering Denmark. The experiments showed that the results obtained with Jacobsen’s map were more accurate than the results obtained with the CEC map, despite a nominally coarser scale of 1:2,000,000 vs. 1:1,000,000. This finding is probably related to the fact that Jacobsen’s map was more detailed, with a larger number of polygons, soil map units and soil types, despite its coarser scale. The results showed that the implementation of soil-landscape relationships, area-proportional sampling and the random forest …

  15. Implementing the PAIN RelieveIt Randomized Controlled Trial in Hospice Care: Mechanisms for Success and Meeting PCORI Methodology Standards.

    Science.gov (United States)

    Ezenwa, Miriam O; Suarez, Marie L; Carrasco, Jesus D; Hipp, Theresa; Gill, Anayza; Miller, Jacob; Shea, Robert; Shuey, David; Zhao, Zhongsheng; Angulo, Veronica; McCurry, Timothy; Martin, Joanna; Yao, Yingwei; Molokie, Robert E; Wang, Zaijie Jim; Wilkie, Diana J

    2017-07-01

    The purpose of this article is to describe how we adhere to the Patient-Centered Outcomes Research Institute's (PCORI) methodology standards relevant to the design and implementation of our PCORI-funded study, the PAIN RelieveIt Trial. We present details of the PAIN RelieveIt Trial organized by the PCORI methodology standards and components that are relevant to our study. The PAIN RelieveIt Trial adheres to four PCORI standards and 21 subsumed components. The four standards include standards for formulating research questions, standards associated with patient centeredness, standards for data integrity and rigorous analyses, and standards for preventing and handling missing data. In the past 24 months, we screened 2,837 cancer patients and their caregivers; 874 dyads were eligible; 223.5 dyads consented and provided baseline data. Only 55 patients were lost to follow-up, a 25% attrition rate. The design and implementation of the PAIN RelieveIt Trial adhered to PCORI's methodology standards for research rigor.

  16. Adaptation of G-TAG Software for Validating Touch-and-Go Comet Surface Sampling Design Methodology

    Science.gov (United States)

    Mandic, Milan; Acikmese, Behcet; Blackmore, Lars

    2011-01-01

    The G-TAG software tool was developed under the R&TD on Integrated Autonomous Guidance, Navigation, and Control for Comet Sample Return, and represents a novel, multi-body dynamics simulation software tool for studying TAG sampling. The G-TAG multi-body simulation tool provides a simulation environment in which a Touch-and-Go (TAG) sampling event can be extensively tested. TAG sampling requires the spacecraft to descend to the surface, contact the surface with a sampling collection device, and then to ascend to a safe altitude. The TAG event lasts only a few seconds but is mission-critical with potentially high risk. Consequently, there is a need for the TAG event to be well characterized and studied by simulation and analysis in order for the proposal teams to converge on a reliable spacecraft design. This adaptation of the G-TAG tool was developed to support the Comet Odyssey proposal effort, and is specifically focused to address comet sample return missions. In this application, the spacecraft descends to and samples from the surface of a comet. Performance of the spacecraft during TAG is assessed based on survivability and sample collection performance. For the adaptation of the G-TAG simulation tool to comet scenarios, models are developed that accurately describe the properties of the spacecraft, approach trajectories, and descent velocities, as well as the models of the external forces and torques acting on the spacecraft. The adapted models of the spacecraft, descent profiles, and external sampling forces/torques were more sophisticated and customized for comets than those available in the basic G-TAG simulation tool. Scenarios implemented include the study of variations in requirements, spacecraft design (size, locations, etc. of the spacecraft components), and the environment (surface properties, slope, disturbances, etc.). The simulations, along with their visual representations using G-View, contributed to the Comet Odyssey New Frontiers proposal

  17. Randomization tests

    CERN Document Server

    Edgington, Eugene

    2007-01-01

    Statistical Tests That Do Not Require Random Sampling Randomization Tests Numerical Examples Randomization Tests and Nonrandom Samples The Prevalence of Nonrandom Samples in Experiments The Irrelevance of Random Samples for the Typical Experiment Generalizing from Nonrandom Samples Intelligibility Respect for the Validity of Randomization Tests Versatility Practicality Precursors of Randomization Tests Other Applications of Permutation Tests Questions and Exercises Notes References Randomized Experiments Unique Benefits of Experiments Experimentation without Mani

  18. Analytical methodology for determination of the sulfate in vinasse samples; Metodologia analitica para a determinacao de sulfato em vinhoto

    Energy Technology Data Exchange (ETDEWEB)

    Prada, Silvio Miranda; Guekezian, Marcia; Suarez-Ilha, Maria Encarnacion V. [Sao Paulo Univ., SP (Brazil). Inst. de Quimica

    1998-05-01

    When present in high concentrations, sulfate acts as an inhibitor of methane production (biogas formation) in anaerobic biodigestion processes. It is therefore very important to know the sulfate concentration in vinasse samples before designing the biodigester. A previously developed indirect method (Anal. Chim. Acta 1996, 329, 197) was used to determine sulfate in vinasse samples after pretreatment under heating with a 3:1 mixture of 30% hydrogen peroxide and concentrated nitric acid to eliminate organic matter. Interfering cations were isolated using ion exchange columns. The results obtained for samples from Araraquara and Penapolis are presented here. The phosphate concentration was also determined. (author) 23 refs., 3 tabs.

  19. A Methodology of Image Segmentation for High Resolution Remote Sensing Image Based on Visual System and Markov Random Field

    Directory of Open Access Journals (Sweden)

    XU Miaozhong

    2015-02-01

    Full Text Available Taking into consideration the visual system's remarkable ability to perceive and identify information, a new image segmentation method is presented that simulates the mechanism of the visual system for high resolution remote sensing image segmentation with a Markov random field model. Firstly, the characteristics of the visual system are summarized as: hierarchy, learning ability, feature detection capability and sparse coding. Secondly, the working mechanism of the visual system is simulated by wavelet transform, an unsupervised clustering algorithm, feature analysis and the Laplace distribution. Then, the segmentation is achieved by combining the visual mechanism with the Markov random field. Remote sensing images from different satellites are adopted as the experimental data, and the segmentation results demonstrate that the proposed method performs well on high resolution remote sensing images.

  20. Finite-sample corrected generalized estimating equation of population average treatment effects in stepped wedge cluster randomized trials.

    Science.gov (United States)

    Scott, JoAnna M; deCamp, Allan; Juraska, Michal; Fay, Michael P; Gilbert, Peter B

    2017-04-01

    Stepped wedge designs are increasingly commonplace and advantageous for cluster randomized trials when it is both unethical to assign placebo and logistically difficult to allocate an intervention simultaneously to many clusters. We study marginal mean models fit with generalized estimating equations for assessing treatment effectiveness in stepped wedge cluster randomized trials. This approach has advantages over the more commonly used mixed models: (1) the population-average parameters have an important interpretation for public health applications, and (2) it avoids untestable assumptions on latent variable distributions and parametric assumptions about error distributions, therefore providing more robust evidence on treatment effects. However, cluster randomized trials typically have a small number of clusters, rendering the standard generalized estimating equation sandwich variance estimator biased and highly variable, and hence yielding incorrect inferences. We study the usual asymptotic generalized estimating equation inferences (i.e., using sandwich variance estimators and asymptotic normality) and four small-sample corrections to generalized estimating equations for stepped wedge cluster randomized trials, with parallel cluster randomized trials as a comparison. We show by simulation that the small-sample corrections provide improvement, with one correction appearing to provide at least nominal coverage even with only 10 clusters per group. These results demonstrate the viability of the marginal mean approach for both stepped wedge and parallel cluster randomized trials. We also study the comparative performance of the corrected methods for stepped wedge and parallel designs, and describe how the methods can accommodate interval censoring of individual failure times and incorporate semiparametric efficient estimators.
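The small-sample problem with the sandwich estimator can be sketched concretely. Below is a minimal illustration using one well-known correction, the Mancl-DeRouen bias-corrected sandwich (not necessarily among the four the abstract refers to), for a linear marginal mean model under working independence; the trial design and parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy parallel cluster-randomized trial: 10 clusters per arm, 20 subjects
# each, a cluster-level random effect, and a true treatment effect of 0.5.
n_clusters, m = 20, 20
treat = np.repeat([0, 1], n_clusters // 2)
clusters = []
for g in range(n_clusters):
    b = rng.normal(0, 0.3)                         # cluster random effect
    y = 0.5 * treat[g] + b + rng.normal(0, 1, m)
    X = np.column_stack([np.ones(m), np.full(m, treat[g])])
    clusters.append((X, y))

# GEE with identity link and working independence reduces to OLS for beta.
XtX = sum(X.T @ X for X, _ in clusters)
Xty = sum(X.T @ y for X, y in clusters)
beta = np.linalg.solve(XtX, Xty)
bread = np.linalg.inv(XtX)

# Standard sandwich "meat" vs. the Mancl-DeRouen correction, which inflates
# each cluster's residual by (I - H_g)^{-1}, where H_g is cluster leverage.
meat_std = np.zeros((2, 2))
meat_md = np.zeros((2, 2))
for X, y in clusters:
    r = y - X @ beta
    meat_std += X.T @ np.outer(r, r) @ X
    H = X @ bread @ X.T
    r_md = np.linalg.solve(np.eye(m) - H, r)
    meat_md += X.T @ np.outer(r_md, r_md) @ X

se_std = np.sqrt((bread @ meat_std @ bread)[1, 1])
se_md = np.sqrt((bread @ meat_md @ bread)[1, 1])
print(se_std, se_md)  # corrected SE is larger, offsetting small-sample bias
```

With few clusters the uncorrected sandwich SE is biased downward, which is why naive Wald intervals undercover; the correction widens them toward nominal coverage.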

  1. Methodological advances

    Directory of Open Access Journals (Sweden)

    Lebreton, J.-D.

    2004-06-01

    approaches are the usual ultra-structural model and a new one based on a random effects model. This paper can be viewed as part of a revival of studies of the dynamics of exploited populations, in the broad sense, including the study of man-induced mortality in the framework of conservation biology (Lebreton, in press). This revival is a direct consequence of the increasing impact of man on the biosphere and of continuing methodological progress (Ferson & Burgman, 2000). The use of random effects models (see also Schaub & Lebreton, 2004) builds directly upon the seminal work of Anderson and Burnham (1976). Stauffer presented a WinBUGS implementation of the Cormack-Jolly-Seber model that complemented other presentations in the conference and the short course. Finally, Morgan, Besbeas, Thomas, Buckland, Harwood, Duck and Pomery proposed a thorough and timely review of integrated modelling, i.e., in our context, of models considering simultaneously capture-recapture demographic information and census information. These methods were covered in other sessions, in relation to Bayesian methodology. Integrated modelling appears indeed to be the logical way of combining all pieces of information arising from integrated monitoring, and stands as one of the great methodological challenges for our community in the years to come (Besbeas et al., 2002).

  2. Prevalence of salmonella following immersion chilling for matched neck skin, whole carcass rinse, and whole carcass enrichment sampling methodologies

    Science.gov (United States)

    Salmonella prevalence and the serogroups recovered following immersion chilling were determined for matched enriched neck skin, whole carcass rinse, and whole carcass samples. Commercially processed and eviscerated broiler carcasses were chilled in ice/tap water 40 min with or without 20 ppm free c...

  3. Simple methodology to directly genotype Trypanosoma cruzi discrete typing units in single and mixed infections from human blood samples.

    Science.gov (United States)

    Bontempi, Iván A; Bizai, María L; Ortiz, Sylvia; Manattini, Silvia; Fabbro, Diana; Solari, Aldo; Diez, Cristina

    2016-09-01

    Different DNA markers to genotype Trypanosoma cruzi are now available. However, due to the low quantity of parasites present in biological samples, DNA markers with high copy number, like kinetoplast minicircles, are needed. The aim of this study was to extend a DNA assay called minicircle lineage specific-PCR (MLS-PCR), previously developed to genotype the T. cruzi DTUs TcV and TcVI, in order to genotype DTUs TcI and TcII and to improve TcVI detection. We screened kinetoplast minicircle hypervariable sequences from cloned PCR products from reference strains belonging to the mentioned DTUs using specific kDNA probes. With the four highly specific sequences selected, we designed primers to be used in the MLS-PCR to directly genotype T. cruzi from biological samples. High specificity and sensitivity were obtained when we evaluated the new approach for TcI, TcII, TcV and TcVI genotyping in twenty-two T. cruzi reference strains. Afterward, we compared it with hybridization tests using specific kDNA probes in 32 blood samples from chronic chagasic patients from North Eastern Argentina. With both tests we were able to genotype 94% of the samples, and the concordance between them was very good (kappa = 0.855). The most frequent T. cruzi DTUs detected were TcV and TcVI, followed by TcII and, to a much lesser extent, TcI. A single T. cruzi DTU was detected in 18 samples and more than one in the remainder, with TcV plus TcVI the most frequent association. A high percentage of mixed detections was obtained with both assays, and their impact is discussed. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Securing image information using double random phase encoding and parallel compressive sensing with updated sampling processes

    Science.gov (United States)

    Hu, Guiqiang; Xiao, Di; Wang, Yong; Xiang, Tao; Zhou, Qing

    2017-11-01

    Recently, a new kind of image encryption approach using compressive sensing (CS) and double random phase encoding has received much attention due to advantages such as compressibility and robustness. However, this approach is vulnerable to chosen-plaintext attack (CPA) if the CS measurement matrix is re-used. Therefore, designing an efficient measurement-matrix updating mechanism that ensures resistance to CPA is of practical significance. In this paper, we provide a novel solution that updates the CS measurement matrix by altering the secret sparse basis with the help of counter-mode operation. In particular, the secret sparse basis is implemented by a reality-preserving fractional cosine transform matrix. Compared with a conventional CS-based cryptosystem that regenerates all the random entries of the measurement matrix, our scheme offers superior efficiency while guaranteeing resistance to CPA. Experimental and analysis results show that the proposed scheme has good security performance and is robust against noise and occlusion.
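The counter-mode updating idea can be illustrated independently of the paper's specific construction (which alters a reality-preserving fractional cosine basis). The sketch below, with a hypothetical key-derivation scheme and toy sizes, shows only the core point: each plaintext is measured with a fresh matrix derived from a shared key and a counter, so a chosen-plaintext attacker never sees the same matrix twice.

```python
import hashlib

import numpy as np


def measurement_matrix(key: bytes, counter: int, m: int, n: int) -> np.ndarray:
    """Derive a Gaussian CS measurement matrix from (key, counter).

    Both parties can regenerate the same matrix from the shared key and the
    message counter, but each counter value yields a different matrix.
    """
    digest = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
    seed = int.from_bytes(digest[:8], "big")
    rng = np.random.default_rng(seed)
    return rng.normal(0, 1 / np.sqrt(m), (m, n))


key = b"shared-secret"                                # hypothetical key
n, m = 64, 32                                         # signal length, measurements
x = np.zeros(n)
x[[3, 17, 40]] = [1.5, -2.0, 0.7]                     # sparse test signal

# Two encryptions with different counters use different matrices.
y1 = measurement_matrix(key, 1, m, n) @ x
y2 = measurement_matrix(key, 2, m, n) @ x
print(np.allclose(y1, y2))  # False: measurements differ across counters
```

The receiver, holding the same key and counter, regenerates the matrix deterministically, so no matrix material needs to be transmitted.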

  5. Alcohol and marijuana use in adolescents' daily lives: a random sample of experiences.

    Science.gov (United States)

    Larson, R; Csikszentmihalyi, M; Freeman, M

    1984-07-01

    High school students filled out reports on their experiences at random times during their daily lives, including 48 occasions when they were using alcohol or marijuana. Alcohol use was reported primarily in the context of Friday and Saturday night social gatherings and was associated with a happy and gregarious subjective state. Marijuana use was reported across a wider range of situations and was associated with an average state that differed much less from ordinary experience.

  6. Stemflow estimation in a redwood forest using model-based stratified random sampling

    Science.gov (United States)

    Jack Lewis

    2003-01-01

    Model-based stratified sampling is illustrated by a case study of stemflow volume in a redwood forest. The approach is actually a model-assisted sampling design in which auxiliary information (tree diameter) is utilized in the design of stratum boundaries to optimize the efficiency of a regression or ratio estimator. The auxiliary information is utilized in both the...
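The design idea above (stratify on an auxiliary variable, then apply a ratio estimator within strata) can be sketched with simulated data. All population values below are hypothetical stand-ins for tree diameters and stemflow volumes.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical population: tree diameters x (auxiliary variable) and
# stemflow volumes y, roughly proportional to diameter with noise.
N = 1000
x = rng.lognormal(3.0, 0.4, N)
y = 0.8 * x * rng.lognormal(0, 0.15, N)

# Stratify on the auxiliary variable into three equal-count strata,
# then draw a simple random sample of 30 trees within each stratum.
edges = np.quantile(x, [1 / 3, 2 / 3])
strata = np.digitize(x, edges)            # stratum label 0, 1, or 2
total_est = 0.0
for s in range(3):
    idx = np.where(strata == s)[0]
    sample = rng.choice(idx, size=30, replace=False)
    # Ratio estimator: scale the known stratum total of x by the sample ratio
    ratio = y[sample].sum() / x[sample].sum()
    total_est += ratio * x[idx].sum()

print(total_est / y.sum())  # close to 1: the estimate tracks the true total
```

Because y is nearly proportional to x, the ratio estimator exploits the auxiliary totals and achieves far lower variance than expanding the sample mean of y alone.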

  7. Random or systematic sampling to detect a localised microbial contamination within a batch of food

    NARCIS (Netherlands)

    Jongenburger, I.; Reij, M.W.; Boer, E.P.J.; Gorris, L.G.M.; Zwietering, M.H.

    2011-01-01

    Pathogenic microorganisms are known to be distributed heterogeneously in food products that are solid, semi-solid or powdered, like for instance peanut butter, cereals, or powdered milk. This complicates effective detection of the pathogens by sampling. Two-class sampling plans, which are deployed
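The contrast between random and systematic sampling for a localized contamination can be shown with a small simulation. The batch size, hot-spot size, and sample size below are hypothetical, chosen only to make the effect visible: when the sampling interval matches the hot-spot width, systematic sampling cannot miss it, while simple random sampling often does.

```python
import numpy as np

rng = np.random.default_rng(11)

# A batch of 1000 units with contamination localized in one contiguous
# hot spot of 50 units; draw n = 20 sample units per plan and check how
# often at least one contaminated unit is detected.
N, spot, n, trials = 1000, 50, 20, 5000
step = N // n  # systematic sampling interval (50, equal to the spot width)


def detected(indices: np.ndarray, start: int) -> bool:
    """True if any sampled index falls inside the hot spot."""
    return bool(np.any((indices >= start) & (indices < start + spot)))


hits_random = hits_system = 0
for _ in range(trials):
    start = int(rng.integers(0, N - spot))
    # Simple random sampling without replacement
    if detected(rng.choice(N, n, replace=False), start):
        hits_random += 1
    # Systematic sampling: every step-th unit from a random offset
    offset = int(rng.integers(0, step))
    if detected(np.arange(offset, N, step), start):
        hits_system += 1

print(hits_random / trials, hits_system / trials)  # ~0.64 vs exactly 1.0
```

Here systematic sampling detects the hot spot in every trial, because any window of width 50 contains exactly one point of a 50-unit grid, whereas random sampling misses it roughly a third of the time.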

  8. Multistage point relascope and randomized branch sampling for downed coarse woody debris estimation

    Science.gov (United States)

    Jeffrey H. Gove; Mark J. Ducey; Harry T. Valentine

    2002-01-01

    New sampling methods have recently been introduced that allow estimation of downed coarse woody debris using an angle gauge, or relascope. The theory behind these methods is based on sampling straight pieces of downed coarse woody debris. When pieces deviate from this ideal situation, auxiliary methods must be employed. We describe a two-stage procedure where the...

  9. Development of standardized methodology for identifying toxins in clinical samples and fish species associated with tetrodotoxin-borne poisoning incidents

    Directory of Open Access Journals (Sweden)

    Tai-Yuan Chen

    2016-01-01

    Full Text Available Tetrodotoxin (TTX is a naturally occurring toxin in food, especially in puffer fish. TTX poisoning is observed frequently in South East Asian regions. In TTX-derived food poisoning outbreaks, the amount of TTX recovered from suspicious fish samples or leftovers, and the residual levels in biological fluids of victims, are typically trace. However, liquid chromatography–mass spectrometry and liquid chromatography–tandem mass spectrometry methods have been demonstrated to qualitatively and quantitatively determine TTX in clinical samples from victims. Identification and validation of the TTX-originating seafood species responsible for a food poisoning incident is needed. A polymerase chain reaction-based method built on mitochondrial DNA analysis is useful for identification of fish species. This review aims to collect pertinent information available on TTX-borne food poisoning incidents with a special emphasis on the analytical methods employed for TTX detection in clinical laboratories as well as for the identification of TTX-bearing species.

  10. Testing of toxicology and emissions-sampling methodology for ocean incineration of hazardous wastes. Final report, January 1985-January 1986

    Energy Technology Data Exchange (ETDEWEB)

    Boehm, P.; Cooke, M.; Carr, S.; Piispanen, W.; Werme, C.

    1988-05-01

    This report addresses the development and testing of a system to expose marine organisms to hazardous-waste emissions in order to assess the potential toxicity of incinerator plumes at sea as they contact the marine environment through air-sea exchange and initial mixing. A sampling train was designed and tested at EPA's land-based hazardous-waste incinerator, using transformer oil as a waste feed. The incinerator was operated under conditions which would be appropriate for at-sea incinerators. The sampling train (Marine Incineration Biological Assessment Sampler--MIBAS) provides a sea-water sample containing a plume emission for the marine organisms testing. Five toxicity-test protocols were refined and/or developed for use in the program: (1) a sea-urchin fertilization test; (2) a chronic test using macroalgae Champia parvula; (3) a 7-day chronic test using growth and reproduction of the crustacean Mysidopsis bahia; (4) a 7-day growth and survival test with the fish Menidia beryllina; and (5) a 7-day life-cycle test using the archiannelid worm Dinophilus gyrocilatus. The results of applying these tests during a hazardous-waste burn are given.

  11. Random Assignment of Schools to Groups in the Drug Resistance Strategies Rural Project: Some New Methodological Twists

    Science.gov (United States)

    Pettigrew, Jonathan; Miller-Day, Michelle; Krieger, Janice L.; Zhou, Jiangxiu; Hecht, Michael L.

    2014-01-01

    Random assignment to groups is the foundation for scientifically rigorous clinical trials. But assignment is challenging in group randomized trials when only a few units (schools) are assigned to each condition. In the DRSR project, we assigned 39 rural Pennsylvania and Ohio schools to three conditions (rural, classic, control). Even with 13 schools per condition, achieving pretest equivalence on important variables is not guaranteed. We collected data on six important school-level variables: rurality, number of grades in the school, enrollment per grade, percent white, percent receiving free/assisted lunch, and test scores. Key to our procedure was the inclusion of school-level drug use data, available for a subset of the schools; we handled the partial data with modern missing-data techniques. We chose to create one composite stratifying variable based on the seven school-level variables available. Principal components analysis of the seven variables yielded two factors, which were averaged to form the composite inflate-suppress (CIS) score, the basis of stratification. The CIS score was broken into three strata within each state; schools were assigned at random to the three program conditions from within each stratum, within each state. Results showed that program group membership was unrelated to the CIS score, the two factors making up the CIS score, and the seven items making up the factors. Program group membership was not significantly related to pretest measures of drug use (alcohol, cigarettes, marijuana, chewing tobacco; smallest p > .15), verifying that pretest equivalence was achieved. PMID:23722619
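The stratification-and-assignment procedure (composite PCA score, three strata, random assignment within each stratum) can be sketched as follows. The covariate matrix below is simulated stand-in data for the seven school-level variables, and the two-component composite is a simplified analogue of the CIS score.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical standardized school-level covariates (e.g. rurality,
# enrollment per grade, test scores, drug-use measures...).
n_schools, n_vars = 39, 7
Z = rng.normal(size=(n_schools, n_vars))
Z = (Z - Z.mean(0)) / Z.std(0)

# Principal components via SVD; average the first two component scores
# into a single composite stratifying score.
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = U[:, :2] * s[:2]
composite = scores.mean(axis=1)

# Break the composite into three strata of 13 schools, then assign
# schools at random to the 3 conditions from within each stratum
# (rotating the starting condition so the arms balance exactly).
order = np.argsort(composite)
conditions = np.full(n_schools, -1)
for k, stratum in enumerate(np.array_split(order, 3)):
    for i, school in enumerate(rng.permutation(stratum)):
        conditions[school] = (i + k) % 3

print(np.bincount(conditions))  # [13 13 13]: balanced arms across strata
```

Because each condition draws schools from every stratum of the composite score, the arms are equivalent on the stratifying variable by construction, which is exactly the pretest balance the abstract verifies.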

  12. JOINT STUDY OF IMPROVED SAFEGUARDS METHODOLOGY USING NO-NOTICE RANDOMIZED INSPECTION AT JNC'S Pu HANDLING FACILITIES

    Energy Technology Data Exchange (ETDEWEB)

    LU,M.S.; SANBORN,J.B.

    2000-04-01

    After the Iraq war, the International Atomic Energy Agency (IAEA) 93+2 Program was developed to strengthen and improve the cost-effectiveness of the existing safeguards system. In particular, the Program aims to enhance the IAEA's ability to detect undeclared nuclear activities and materials. The IAEA 93+2 Program includes: (1) increased access to information and its effective use; (2) increased physical access; (3) optimum use of the existing system. The measures considered are divided into two parts: measures in Part 1 are those which may be implemented within the existing IAEA authority; Part 2 measures require complementary legal authority, in the form of an Additional Protocol, INFCIRC/540. A description of the status of its implementation can be found in "Implementation of the Additional Protocol" (Cooley, 1999). In particular, increased physical access includes access beyond declared locations (requiring additional authority derived from INFCIRC/540) and no-notice randomized inspections. No-notice randomized inspections could enhance inspection effectiveness and efficiency by increasing the coverage of the material involved, providing better confirmation of the operational status of the facilities, and giving a higher degree of confidence that no undeclared activities or materials exist at the facilities, including the detection of possible measures to conceal diversions.

  13. Random sampling technique for ultra-fast computations of molecular opacities for exoplanet atmospheres

    Science.gov (United States)

    Min, M.

    2017-10-01

    Context. Opacities of molecules in exoplanet atmospheres rely on increasingly detailed line lists for these molecules. The line lists available today contain, for many species, up to several billion lines. Computing the spectral line profile created by pressure and temperature broadening, the Voigt profile, for all of these lines is becoming a computational challenge. Aims: We aim to create a method to compute the Voigt profile in a way that automatically focuses the computation time on the strongest lines, while still maintaining the continuum contribution of the large number of weaker lines. Methods: Here, we outline a statistical line sampling technique that samples the Voigt profile quickly and with high accuracy. The number of samples is adjusted to the strength of the line and the local spectral line density. This automatically provides high accuracy line shapes for strong lines or lines that are spectrally isolated. The line sampling technique automatically preserves the integrated line opacity for all lines, thereby also providing the continuum opacity created by the large number of weak lines at very low computational cost. Results: The line sampling technique is tested for accuracy when computing line spectra and correlated-k tables. Extremely fast computations (about 3.5 × 10⁵ lines per second per core on a standard current-day desktop computer) with high accuracy (≤1% almost everywhere) are obtained. A detailed recipe on how to perform the computations is given.
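The core of such a sampling scheme can be sketched in a few lines. A draw from a Voigt distribution is simply a Gaussian draw plus a Cauchy (Lorentzian) draw, since the Voigt profile is their convolution; depositing equal-weight opacity packets then conserves the integrated line strength no matter how few samples a weak line receives. All line parameters and grid sizes below are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.special import wofz


def voigt(x, sigma, gamma):
    """Area-normalized Voigt profile via the Faddeeva function."""
    z = (x + 1j * gamma) / (sigma * np.sqrt(2))
    return wofz(z).real / (sigma * np.sqrt(2 * np.pi))


rng = np.random.default_rng(1)
grid = np.linspace(-50, 50, 2001)
dx = grid[1] - grid[0]


def sample_line(center, strength, sigma, gamma, n):
    """Deposit one line by Monte Carlo sampling of its Voigt profile.

    Each of the n samples carries strength/n, so the integrated opacity is
    conserved exactly; far-wing Cauchy draws are clipped to the grid edges.
    """
    draws = center + rng.normal(0, sigma, n) + gamma * rng.standard_cauchy(n)
    idx = np.clip(np.round((draws - grid[0]) / dx).astype(int), 0, len(grid) - 1)
    opac = np.zeros_like(grid)
    np.add.at(opac, idx, strength / (n * dx))
    return opac


strong = sample_line(0.0, 100.0, 1.0, 0.5, 20000)  # many samples: accurate shape
weak = sample_line(10.0, 0.1, 1.0, 0.5, 50)        # few samples: continuum only
total = strong + weak
print(total.sum() * dx)  # 100.1: the summed line strengths, conserved exactly
```

Adjusting n per line (large for strong or isolated lines, tiny for the billions of weak ones) is what concentrates the computation where line shape accuracy matters while the weak lines still contribute their correct total continuum opacity.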

  14. Hybrid random walk-linear discriminant analysis method for unwrapping quantitative phase microscopy images of biological samples

    Science.gov (United States)

    Kim, Diane N. H.; Teitell, Michael A.; Reed, Jason; Zangle, Thomas A.

    2015-11-01

    Standard algorithms for phase unwrapping often fail for interferometric quantitative phase imaging (QPI) of biological samples due to the variable morphology of these samples and the requirement to image at low light intensities to avoid phototoxicity. We describe a new algorithm combining random walk-based image segmentation with linear discriminant analysis (LDA)-based feature detection, using assumptions about the morphology of biological samples to account for phase ambiguities when standard methods have failed. We present three versions of our method: first, a method for LDA image segmentation based on a manually compiled training dataset; second, a method using a random walker (RW) algorithm informed by the assumed properties of a biological phase image; and third, an algorithm which combines LDA-based edge detection with an efficient RW algorithm. We show that the combination of LDA plus the RW algorithm gives the best overall performance with little speed penalty compared to LDA alone, and that this algorithm can be further optimized using a genetic algorithm to yield superior performance for phase unwrapping of QPI data from biological samples.

  15. Use of protein: creatinine ratio in a random spot urine sample for predicting significant proteinuria in diabetes mellitus.

    Science.gov (United States)

    Yadav, B K; Adhikari, S; Gyawali, P; Shrestha, R; Poudel, B; Khanal, M

    2010-06-01

    The present study was undertaken over a period of 6 months (September 2008-February 2009) to assess the correlation of 24-hour urine protein estimation with the random spot protein:creatinine (P:C) ratio among diabetic patients. The study comprised 144 patients aged 30-70 years, recruited from Kantipur hospital, Kathmandu. The 24-hr urine sample was collected, followed by a spot random urine sample. Both samples were analyzed for protein and creatinine excretion. Informed consent was taken from all participants. Sixteen inadequately collected urine samples, as defined by (predicted creatinine - measured creatinine)/predicted creatinine > 0.2, were excluded from analysis. Spearman's rank correlation between the spot urine P:C ratio and 24-hr total protein was computed using the Statistical Package for the Social Sciences. At the P:C ratio cutoff of 0.15 and the reference method (24-hr urine protein) cutoff of 150 mg/day, the correlation coefficient was found to be 0.892, suggesting that the spot P:C ratio can substitute for 24-hr urine collection, but the cutoff should be carefully selected for different patient groups under different laboratory procedures and settings.
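The study's analysis pipeline (adequacy filter, Spearman correlation, dual cutoffs) is straightforward to sketch on simulated data. All cohort values below are hypothetical; only the exclusion rule and cutoffs mirror the abstract.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)

# Simulated cohort: 24-hr urine protein (mg/day) and a spot protein:
# creatinine ratio that tracks it with multiplicative measurement noise.
n = 144
protein_24h = rng.lognormal(np.log(200), 0.9, n)
pc_ratio = protein_24h / 1000 * rng.lognormal(0, 0.25, n)

# Exclude inadequately collected samples, as in the study:
# (predicted creatinine - measured creatinine) / predicted creatinine > 0.2
predicted_cr = np.full(n, 1200.0)                     # mg/day, hypothetical
measured_cr = predicted_cr * rng.lognormal(0, 0.12, n)
adequate = (predicted_cr - measured_cr) / predicted_cr <= 0.2

rho, p = spearmanr(pc_ratio[adequate], protein_24h[adequate])
print(round(rho, 3))  # strong monotone agreement, of the same order as 0.892

# Classify each patient against the two cutoffs used in the paper.
significant_24h = protein_24h >= 150                  # mg/day
significant_spot = pc_ratio >= 0.15
agreement = (significant_24h == significant_spot)[adequate].mean()
print(round(agreement, 2))
```

The adequacy filter matters: under-collected 24-hr samples bias the reference measurement downward, which would attenuate both the correlation and the classification agreement if left in.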

  16. Thermal Protection for Mars Sample Return Earth Entry Vehicle: A Grand Challenge for Design Methodology and Reliability Verification

    Science.gov (United States)

    Venkatapathy, Ethiraj; Gage, Peter; Wright, Michael J.

    2017-01-01

    Mars Sample Return is our Grand Challenge for the coming decade. TPS (Thermal Protection System) nominal performance is not the key challenge. The main difficulty for designers is the need to verify unprecedented reliability for the entry system: current guidelines for prevention of backward contamination require that the probability of spores larger than 1 micron diameter escaping into the Earth environment be lower than 1 in a million for the entire system, and the allocation to TPS would be more stringent than that. For reference, the reliability allocation for Orion TPS is closer to 1 in 1,000, and the demonstrated reliability for previous human Earth return systems was closer to 1 in 100. Improving reliability by more than 3 orders of magnitude is a grand challenge indeed. The TPS community must embrace the possibility of new architectures that are focused on reliability above thermal performance and mass efficiency. The MSR (Mars Sample Return) EEV (Earth Entry Vehicle) will be hit by MMOD (Micrometeoroid and Orbital Debris) prior to reentry. A chute-less aero-shell design that allows for a self-righting shape was baselined in prior MSR studies, on the assumption that a passive system will maximize EEV robustness. Hence the aero-shell, along with the TPS, has to survive ground impact without breaking apart. System verification will require testing to establish ablative performance and thermal failure, but also testing of damage from MMOD and of structural performance at ground impact. Mission requirements will demand analysis, testing and verification that are focused on establishing reliability of the design. In this proposed talk, we will focus on the grand challenge of MSR EEV TPS and the need for innovative approaches to address challenges in modeling, testing, manufacturing and verification.

  17. Intravenous magnesium for pediatric sickle cell vaso-occlusive crisis: methodological issues of a randomized controlled trial.

    Science.gov (United States)

    Badaki-Makun, Oluwakemi; Scott, J Paul; Panepinto, Julie A; Casper, T Charles; Hillery, Cheryl A; Dean, J Michael; Brousseau, David C

    2014-06-01

    Multiple recent Sickle Cell Disease studies have been terminated due to poor enrollment. We developed methods to overcome past barriers and utilized these to study the efficacy and safety of intravenous magnesium for vaso-occlusive crisis (VOC). We describe the methods of the Intravenous Magnesium in Sickle Vaso-occlusive Crisis (MAGiC) trial and discuss methods used to overcome past barriers. MAGiC was a multi-center randomized double-blind placebo-controlled trial of intravenous magnesium versus normal saline for treatment of VOC. The study was a collaboration between Pediatric Hematologists and Emergency Physicians in the Pediatric Emergency Care Applied Research Network (PECARN). Eligible patients were randomized within 12 hours of receiving intravenous opioids in the Emergency Department (ED) and administered study medication every 8 hours. The primary outcome was hospital length of stay. Associated plasma studies elucidated magnesium's mechanism of action and the pathophysiology of VOC. Health-related quality of life was measured. Site-, protocol-, and patient-related barriers from prior studies were identified and addressed. Limited study staff availability, lack of collaboration with the ED, and difficulty obtaining consent were previously identified barriers. Leveraging PECARN resources, forging close collaborations between Sickle Cell Centers and EDs of participating sites, and approaching eligible patients for prior consent helped overcome these barriers. Participation in the PECARN network and establishment of collaborative arrangements between Sickle Cell Centers and their affiliated EDs are major innovative features of the MAGiC study that allowed improved subject capture. These methods could serve as a model for future studies of VOCs. © 2014 Wiley Periodicals, Inc.

  18. The Family Communication Study: A randomized trial of prospective pediatric palliative care consultation, study methodology and perceptions of participation burden.

    Science.gov (United States)

    Starks, Helene; Doorenbos, Ardith; Lindhorst, Taryn; Bourget, Erica; Aisenberg, Eugene; Oman, Natalie; Rue, Tessa; Curtis, J Randall; Hays, Ross

    2016-07-01

    To describe the study methods, baseline characteristics and burden of study procedures of an intervention designed to reduce family stress symptoms through early support from the palliative care team. Length of stay of ≥8 days was the trigger for early palliative care involvement. Cluster-randomized trial with children as the unit of randomization. Up to 3 family members per child were recruited. Family stress symptoms were recorded at baseline, discharge from the ICU, and 3 months post-enrollment. Questionnaire burden was assessed on a 1-10 point scale at each time point and open-ended comments were analyzed to describe the participants' experience in the study. 380 family members of 220 children (control=115 children and 204 family members; intervention=105 children and 176 family members) were recruited, which represented 50% of all eligible families. Most family participants were parents (86% control; 92% intervention) and female (66% both groups). Retention rates were high through the 3-month follow-up: 93% and 90% for the control and intervention groups respectively. Questionnaire burden was very low: mean (sd) scores were 1.1 (1.6), 0.7 (1.5), and 0.9 (1.6) for the baseline, discharge and follow-up questionnaires, respectively. Comments suggest that participation was beneficial by promoting reflection and self-awareness about stress, coping and resilience, and feeling cared for because the intervention and questionnaires focused on their own well-being. The participants' comments regarding the focus on them as the point of intervention reflects the value of conducting research with family members of seriously ill children during ICU stays. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Multivariate Multi-Objective Allocation in Stratified Random Sampling: A Game Theoretic Approach.

    Science.gov (United States)

    Muhammad, Yousaf Shad; Hussain, Ijaz; Shoukry, Alaa Mohamd

    2016-01-01

    We consider the problem of multivariate multi-objective allocation where no or only limited information on the stratum variances is available. Results show that a game theoretic approach (based on weighted goal programming) can be applied to sample size allocation problems. We use a simulation technique to determine the payoff matrix and to solve a minimax game.
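Once a payoff matrix is in hand, extracting the minimax solution is a standard computation. The sketch below uses the classical linear-programming formulation of a zero-sum game, which is a generic stand-in for, not a reproduction of, the paper's weighted-goal-programming model; the example matrix is the familiar matching-pennies game.

```python
import numpy as np
from scipy.optimize import linprog


def solve_zero_sum(A):
    """Row player's minimax mixed strategy and game value for payoff matrix A.

    Shift A to be strictly positive, then solve the standard LP:
    minimize sum(x) subject to B^T x >= 1, x >= 0, where B = A + shift.
    The game value is 1/sum(x) - shift and the strategy is x normalized.
    """
    A = np.asarray(A, dtype=float)
    shift = 1 - A.min() if A.min() <= 0 else 0.0
    B = A + shift
    m, n = B.shape
    res = linprog(c=np.ones(m), A_ub=-B.T, b_ub=-np.ones(n),
                  bounds=[(0, None)] * m)
    x = res.x
    return x / x.sum(), 1 / x.sum() - shift


# Matching pennies: the value is 0 and the optimal strategy is uniform.
strategy, value = solve_zero_sum([[1, -1], [-1, 1]])
print(strategy, round(value, 6))  # ~[0.5 0.5], value ~0
```

In the allocation setting, the rows would index candidate allocations and the columns the competing objectives, with the payoff matrix filled in by simulation as the abstract describes.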

  20. The effect of dead time on randomly sampled power spectral estimates

    DEFF Research Database (Denmark)

    Buchhave, Preben; Velte, Clara Marika; George, William K.

    2014-01-01

    consider both the effect on the measured spectrum of a finite sampling time, i.e., a finite time during which the signal is acquired, and a finite dead time, that is a time in which the signal processor is busy evaluating a data point and therefore unable to measure a subsequent data point arriving within...... the dead time delay....

  1. Phase microscopy of technical and biological samples through random phase modulation with a difuser

    DEFF Research Database (Denmark)

    Almoro, Percival; Pedrini, Giancarlo; Gundu, Phanindra Narayan

    2010-01-01

    A technique for phase microscopy using a phase diffuser and a reconstruction algorithm is proposed. A magnified specimen wavefront is projected on the diffuser plane that modulates the wavefront into a speckle field. The speckle patterns at axially displaced planes are sampled and used in an iter...

  2. Determination of radium isotopes in environmental samples by gamma spectrometry, liquid scintillation counting and alpha spectrometry: a review of analytical methodology.

    Science.gov (United States)

    Jia, Guogang; Jia, Jing

    2012-04-01

    Radium (Ra) isotopes are important from the viewpoints of radiation protection and environmental protection. Their high toxicity has stimulated the continuing interest in methodology research for determination of Ra isotopes in various media. In this paper, the three most routinely used analytical techniques for Ra isotope determination in biological and environmental samples, i.e. low-background γ-spectrometry, liquid scintillation counting and α-spectrometry, were reviewed, with emphasis on new methodological developments in sample preparation, preconcentration, separation, purification, source preparation and measurement techniques. The accuracy, selectivity, traceability, applicability and minimum detectable activity (MDA) of the three techniques were discussed. It was concluded that the MDA (0.1 mBq L⁻¹) of the α-spectrometry technique coupled with chemical separation is about two orders of magnitude lower than that of low-background HPGe γ-spectrometry and LSC techniques. Therefore, when maximum sensitivity is required, the α-spectrometry technique remains the first choice. Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. Design and methodology of a community-based cluster-randomized controlled trial for dietary behaviour change in rural Kerala.

    Science.gov (United States)

    Daivadanam, Meena; Wahlstrom, Rolf; Sundari Ravindran, T K; Sarma, P S; Sivasankaran, S; Thankappan, K R

    2013-07-17

    Interventions targeting lifestyle-related risk factors and non-communicable diseases have contributed to the mainstream knowledge necessary for action. However, there are gaps in how this knowledge can be translated for practical day-to-day use in complex multicultural settings like that in India. Here, we describe the design of the Behavioural Intervention for Diet study, which was developed as a community-based intervention to change dietary behaviour among middle-income households in rural Kerala. This was a cluster-randomized controlled trial to assess the effectiveness of a sequential stage-matched intervention to bring about dietary behaviour change by targeting the procurement and consumption of five dietary components: fruits, vegetables, salt, sugar, and oil. Following a step-wise process of pairing and exclusion of outliers, six out of 22 administrative units in the northern part of Trivandrum district, Kerala state were randomly selected and allocated to intervention or control arms. Trained community volunteers carried out the data collection and intervention delivery. An innovative tool was developed to assess household readiness-to-change, and a household measurement kit and easy formulas were introduced to facilitate the practical side of behaviour change. The 1-year intervention included a household component with sequential stage-matched intervention strategies at 0, 6, and 12 months along with counselling sessions, telephonic reminders, and home visits and a community component with general awareness sessions in the intervention arm. Households in the control arm received information on recommended levels of intake of the five dietary components and general dietary information leaflets. 
Formative research provided the knowledge to contextualise the design of the study in accordance with socio-cultural aspects, felt needs of the community, and the ground realities associated with existing dietary procurement, preparation, and consumption patterns

  4. Design and methodology of a community-based cluster-randomized controlled trial for dietary behaviour change in rural Kerala

    Directory of Open Access Journals (Sweden)

    Meena Daivadanam

    2013-07-01

    Full Text Available Background: Interventions targeting lifestyle-related risk factors and non-communicable diseases have contributed to the mainstream knowledge necessary for action. However, there are gaps in how this knowledge can be translated for practical day-to-day use in complex multicultural settings like that in India. Here, we describe the design of the Behavioural Intervention for Diet study, which was developed as a community-based intervention to change dietary behaviour among middle-income households in rural Kerala. Methods: This was a cluster-randomized controlled trial to assess the effectiveness of a sequential stage-matched intervention to bring about dietary behaviour change by targeting the procurement and consumption of five dietary components: fruits, vegetables, salt, sugar, and oil. Following a step-wise process of pairing and exclusion of outliers, six of the 22 administrative units in the northern part of Trivandrum district, Kerala state, were randomly selected and allocated to intervention or control arms. Trained community volunteers carried out the data collection and intervention delivery. An innovative tool was developed to assess household readiness-to-change, and a household measurement kit and easy formulas were introduced to facilitate the practical side of behaviour change. The 1-year intervention in the intervention arm included a household component, with sequential stage-matched intervention strategies at 0, 6, and 12 months along with counselling sessions, telephone reminders, and home visits, and a community component with general awareness sessions. Households in the control arm received information on recommended levels of intake of the five dietary components and general dietary information leaflets. 
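
    The pairing-and-allocation step described above can be sketched as follows. This is only a minimal illustration of within-pair cluster randomization under stated assumptions; the pairing and the cluster names are hypothetical, not the trial's actual units or procedure.

```python
import random

def allocate_pairs(cluster_pairs, rng=None):
    """Within each pre-matched pair of clusters, randomly assign one
    cluster to the intervention arm and the other to control."""
    rng = rng or random.Random(2013)  # fixed seed for a reproducible sketch
    arms = {}
    for a, b in cluster_pairs:
        if rng.random() < 0.5:
            arms[a], arms[b] = "intervention", "control"
        else:
            arms[a], arms[b] = "control", "intervention"
    return arms

# hypothetical pre-matched pairs of administrative units
pairs = [("P1", "P2"), ("P3", "P4"), ("P5", "P6")]
arms = allocate_pairs(pairs)  # 3 intervention and 3 control clusters
```

    By construction, the design stays balanced: every pair contributes exactly one cluster to each arm.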
Discussion: Formative research provided the knowledge to contextualise the design of the study in accordance with socio-cultural aspects, felt needs of the community, and the ground realities associated with existing dietary procurement, preparation, and consumption patterns.

  5. Building research capacity in Botswana: a randomized trial comparing training methodologies in the Botswana ethics training initiative

    Science.gov (United States)

    2013-01-01

    Background Little empirical data are available on the extent to which capacity-building programs in research ethics prepare trainees to apply ethical reasoning skills to the design, conduct, or review of research. A randomized controlled trial was conducted in Botswana in 2010 to assess the effectiveness of a case-based intervention using email to augment in-person seminars. Methods University faculty and current and prospective IRB/REC members took part in a semester-long training program in research ethics. Participants attended two 2-day seminars and were assigned at random to one of two on-line arms of the trial. Participants in both arms completed on-line international modules from the Collaborative Institutional Training Initiative. Between seminars, intervention-arm participants were also emailed a weekly case to analyze in response to set questions; responses and individualized faculty feedback were exchanged via email. Tests assessing ethics knowledge were administered at the start of each seminar. The post-test included an additional section in which participants were asked to identify the ethical issues highlighted in five case studies from a list of multiple-choice responses. Results were analyzed using regression and ANOVA. Results Of the 71 participants (36 control, 35 intervention) enrolled at the first seminar, 41 (57.7%) attended the second seminar (19 control, 22 intervention). In the intervention arm, 19 (54.3%) participants fully completed the six weekly cases and 8 (22.9%) partially completed them. The mean score was higher on the post-test (30.3/40) than on the pre-test (28.0/40), and individual post- and pre-test scores were highly correlated (r = 0.65). Overall post-test performance did not differ significantly between arms (p > 0.84), but intervention-arm subjects who completed all assigned cases answered an average of 3.2 more questions correctly on the post-test than others, controlling for pre-test scores (p = 0.003). Conclusions Completion of the case-based intervention improved respondents’ test

  6. A Monte-Carlo simulation analysis for evaluating the severity distribution functions (SDFs) calibration methodology and determining the minimum sample-size requirements.

    Science.gov (United States)

    Shirazi, Mohammadali; Reddy Geedipally, Srinivas; Lord, Dominique

    2017-01-01

    Severity distribution functions (SDFs) are used in highway safety to estimate the severity of crashes and to conduct different types of safety evaluations and analyses. Developing a new SDF is a difficult task that demands significant time and resources. To simplify the process, the Highway Safety Manual (HSM) has started to document SDF models for different types of facilities; SDF models have recently been introduced for freeways and ramps in an HSM addendum. However, because these models are fitted and validated using data from a small number of selected states, they must be calibrated to local conditions when applied to a new jurisdiction. The HSM provides a methodology to calibrate the models through a scalar calibration factor, but this methodology has never been validated through research, and there are no concrete guidelines for selecting a reliable sample size. Using extensive simulation, this paper documents an analysis that examined the bias between the 'true' and 'estimated' calibration factors. The analysis showed that as the true calibration factor deviates further from '1', more bias is observed between the 'true' and 'estimated' calibration factors. In addition, simulation studies were performed to determine the calibration sample size for various conditions. It was found that, as the average coefficient of variation (CV) of the 'KAB' and 'C' crashes increases, the analyst needs to collect a larger sample size to calibrate SDF models. Taking this observation into account, sample-size guidelines are proposed based on the average CV of the crash severities used in the calibration process. Copyright © 2016 Elsevier Ltd. All rights reserved.
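
    A minimal sketch of the kind of simulation the paper describes, assuming the scalar calibration factor is estimated as the ratio of total observed to total model-predicted crashes. The site counts, the Poisson observation model, and all parameter values below are illustrative assumptions, not the paper's actual setup.

```python
import math
import random

def estimate_calibration_factor(observed, predicted):
    """Scalar calibration factor: total observed over total predicted."""
    return sum(observed) / sum(predicted)

def poisson(lam, rng):
    """Poisson draw via Knuth's algorithm (stdlib-only, fine for small lam)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while p > threshold:
        k += 1
        p *= rng.random()
    return k - 1

rng = random.Random(42)
true_C = 1.4                                             # assumed 'true' local factor
predicted = [rng.uniform(1.0, 5.0) for _ in range(200)]  # model predictions per site
observed = [poisson(true_C * mu, rng) for mu in predicted]
C_hat = estimate_calibration_factor(observed, predicted)  # close to 1.4 in expectation
```

    Repeating this draw many times for different site counts is one way to study the bias and variability of the estimated factor as a function of sample size.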

  7. Random Evolutionary Dynamics Driven by Fitness and House-of-Cards Mutations: Sampling Formulae

    Science.gov (United States)

    Huillet, Thierry E.

    2017-07-01

    We first revisit the multi-allelic mutation-fitness balance problem, especially when mutations obey a house-of-cards condition, where the discrete-time deterministic evolutionary dynamics of the allelic frequencies derives from a Shahshahani potential. We then consider multi-allelic Wright-Fisher stochastic models whose deviation from neutrality derives from the Shahshahani mutation/selection potential. We next focus on the weak-selection, weak-mutation cases and, making use of a Gamma calculus, compute the normalizing partition functions of the invariant probability densities appearing in their Wright-Fisher diffusive approximations. Using these results, generalized Ewens sampling formulae (ESF) are derived from the equilibrium distributions. We first treat the ESF in the mixed mutation/selection potential case and then restrict ourselves to the simpler situation with house-of-cards mutations only. We also address some issues concerning sampling problems from infinitely-many-alleles weak limits.

  8. Dual to Ratio-Cum-Product Estimator in Simple and Stratified Random Sampling

    OpenAIRE

    Yunusa Olufadi

    2013-01-01

    New estimators for estimating the finite population mean using two auxiliary variables under simple and stratified sampling designs are proposed. Their properties (e.g., mean square error) are studied to the first order of approximation. Moreover, some existing estimators are shown to be particular members of the proposed estimator. Furthermore, comparison of the proposed estimator with the usual unbiased estimator and other estimators considered in this paper reveals interesting results. These results are fur...
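
    For context, the classical ratio and product estimators that such proposals generalize can be sketched as follows. These are the textbook forms, not the paper's "dual to ratio-cum-product" estimator itself, and the toy data are illustrative.

```python
def ratio_estimator(y, x, X_bar):
    """Classical ratio estimator of the population mean of y, using an
    auxiliary variable x with known population mean X_bar (best when
    y and x are positively correlated)."""
    y_bar = sum(y) / len(y)
    x_bar = sum(x) / len(x)
    return y_bar * (X_bar / x_bar)

def product_estimator(y, x, X_bar):
    """Classical product estimator (useful when y and x are negatively
    correlated)."""
    y_bar = sum(y) / len(y)
    x_bar = sum(x) / len(x)
    return y_bar * (x_bar / X_bar)

# toy sample in which y is roughly proportional to x
y = [10.0, 20.0, 30.0]
x = [1.0, 2.0, 3.0]
est = ratio_estimator(y, x, X_bar=2.5)  # -> 20 * (2.5 / 2) = 25.0
```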

  9. The psychometric properties of the AUDIT: a survey from a random sample of elderly Swedish adults.

    Science.gov (United States)

    Källmén, Håkan; Wennberg, Peter; Ramstedt, Mats; Hallgren, Mats

    2014-07-01

    Increasing alcohol consumption and related harms have been reported in the elderly population of Europe. Consequently, it is important to monitor patterns of alcohol use and to use a valid and reliable tool when screening for risky consumption in this age group. The aim was to evaluate the internal consistency reliability and construct validity of the Alcohol Use Disorders Identification Test (AUDIT) in elderly Swedish adults, and to compare the results with the general Swedish population. Another aim was to calculate the level of alcohol consumption (AUDIT-C) to be used for comparison in future studies. The questionnaire was sent to 1459 Swedish adults aged 79-80 years, with a response rate of 73.3%. Internal consistency reliability was assessed using Cronbach's alpha, and confirmatory factor analysis assessed the construct validity of the AUDIT in the elderly population as compared to a Swedish general population sample. The results showed that the AUDIT was more reliable and valid in the Swedish general population sample than among the elderly, and that Items 1 and 4 were less reliable and valid among the elderly. While the AUDIT showed acceptable psychometric properties in the general population sample, its performance was poorer among the elderly respondents. Further psychometric assessments of the AUDIT in elderly populations are required before it is implemented more widely.
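
    The internal-consistency measure used here, Cronbach's alpha, can be computed directly from item-level scores via alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A stdlib-only sketch; the toy scores are illustrative, not AUDIT data.

```python
def cronbach_alpha(items):
    """Cronbach's alpha. `items` is a list of k item-score columns,
    each of length n (one score per respondent)."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance, as in the usual alpha formula
        m = sum(xs) / len(xs)
        return sum((v - m) ** 2 for v in xs) / len(xs)

    item_vars = sum(var(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_vars / var(totals))

# three toy items scored by four respondents; items move together,
# so internal consistency is high
alpha = cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 3], [2, 2, 4, 4]])
```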

  10. The quality of the reported sample size calculations in randomized controlled trials indexed in PubMed.

    Science.gov (United States)

    Lee, Paul H; Tse, Andy C Y

    2017-05-01

    There are limited data on the quality of reporting of the information essential for replicating sample size calculations, as well as on the accuracy of the calculations themselves. We examined the current quality of reporting of sample size calculations in randomized controlled trials (RCTs) published in PubMed, the variation in reporting across study design, study characteristics, and journal impact factor, and the targeted sample sizes reported in trial registries. We reviewed and analyzed all RCTs published in December 2014 in journals indexed in PubMed. The 2014 Impact Factors of the journals were used as proxies for their quality. Of the 451 analyzed papers, 58.1% reported an a priori sample size calculation. Nearly all papers provided the level of significance (97.7%) and desired power (96.6%), and most reported the minimum clinically important effect size (73.3%). The median percentage difference between the reported and recalculated sample sizes was 0.0% (IQR: -4.6% to 3.0%). The accuracy of the reported sample size was better for studies published in journals that endorsed the CONSORT statement and in journals with an impact factor. A total of 98 papers provided a targeted sample size in trial registries; about two-thirds of these (n = 62) reported a sample size calculation, but only 25 (40.3%) showed no discrepancy with the number reported in the trial registries. The reporting of sample size calculations in RCTs published in PubMed-indexed journals and trial registries was poor. The CONSORT statement should be more widely endorsed. Copyright © 2016 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.
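
    The a priori calculation these trials should report is, for a two-arm comparison of means, the standard formula n = 2(z_{1-α/2} + z_{power})² (σ/δ)² per group, where δ is the minimum clinically important difference and σ the outcome SD. A stdlib-only sketch; the δ and σ values are illustrative.

```python
import math
from statistics import NormalDist

def n_per_group(delta, sd, alpha=0.05, power=0.80):
    """Per-group sample size for a two-arm trial comparing means,
    two-sided significance level `alpha` and desired `power`."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # e.g. 1.96 for alpha = 0.05
    z_b = NormalDist().inv_cdf(power)          # e.g. 0.84 for 80% power
    return math.ceil(2 * (z_a + z_b) ** 2 * (sd / delta) ** 2)

# minimum clinically important difference of 5 units, SD of 10
n = n_per_group(delta=5.0, sd=10.0)  # -> 63 per group
```

    Checking a reported sample size then amounts to recomputing n from the reported α, power, δ, and σ, which is exactly the replication the paper found to be under-supported.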

  11. Development of a methodology utilizing gas chromatography ion-trap tandem mass spectrometry for the determination of low levels of caffeine in surface marine and freshwater samples.

    Science.gov (United States)

    Verenitch, Sergei S; Mazumder, Asit

    2008-08-01

    A methodology for monitoring low levels of caffeine in aqueous samples via gas chromatography coupled with an ion-trap tandem mass spectrometry detection system (IT-MS/MS) was developed. Four IT-MS/MS operating parameters, including the collision-induced dissociation (CID) voltage, the excitation time (ET), the isolation time (IT), and the maximum ionization time (MIT), were optimized in order to maximize the sensitivity of the IT-MS/MS technique towards the analyte and its isotope-labeled standard. After optimization, a limit of detection of 500 fg microl(-1) at S/N = 3 was achieved. Taking into account blank values and the matrix background, a method detection limit of 1.0-2.0 ng l(-1) was derived and applied to all of the samples analyzed in the study. Various mass spectrometric conditions were applied to caffeine and its trimethyl-(13)C-labeled standard to elucidate fragmentation pathways for new and commonly occurring product ions observed in the CID spectra produced by the ion trap. Ion structures and fragmentation pathway mechanisms have been proposed and compared with previously published data. An isotope dilution method using (13)C-labeled caffeine as a surrogate internal standard was employed to determine and correct for the recovery of native caffeine in water samples. The developed methodology was applied to the determination of caffeine in surface marine and freshwater samples collected on the west coast of Vancouver Island in British Columbia, Canada. The results for the marine water samples indicated a wide variation in the level of caffeine, ranging from 4.5 to 149 ng l(-1), depending on the location of the sampling site within the inlet. The concentrations of caffeine in samples from lakes associated with various residential densities ranged from ND to 6.5, 1.8 to 10.4, and 6.1 to 21.7 ng l(-1) for low, moderate, and high residential densities, respectively.
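
    In its simplest form, the isotope-dilution quantification step scales the known spike concentration by the native-to-labeled peak-area ratio. The sketch below assumes a relative response factor of 1 between the native and (13)C-labeled analyte; the function name and the numbers are illustrative, not the paper's calibration.

```python
def isotope_dilution_conc(area_native, area_labeled, spike_conc, rrf=1.0):
    """Native analyte concentration from the native/labeled peak-area
    ratio and the known concentration of the labeled spike.
    `rrf` is the relative response factor (assumed 1.0 here)."""
    return (area_native / area_labeled) * spike_conc / rrf

# hypothetical peak areas and a 10 ng/l labeled spike
conc = isotope_dilution_conc(area_native=2.0, area_labeled=1.0,
                             spike_conc=10.0)  # -> 20.0 ng/l
```

    Because the labeled standard suffers the same losses during extraction as the native compound, this ratio automatically corrects for recovery.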

  12. Active Learning Not Associated with Student Learning in a Random Sample of College Biology Courses

    Science.gov (United States)

    Andrews, T. M.; Leonard, M. J.; Colgrove, C. A.; Kalinowski, S. T.

    2011-01-01

    Previous research has suggested that adding active learning to traditional college science lectures substantially improves student learning. However, this research predominantly studied courses taught by science education researchers, who are likely to have exceptional teaching expertise. The present study investigated introductory biology courses randomly selected from a list of prominent colleges and universities to include instructors representing a broader population. We examined the relationship between active learning and student learning in the subject area of natural selection. We found no association between student learning gains and the use of active-learning instruction. Although active learning has the potential to substantially improve student learning, this research suggests that active learning, as used by typical college biology instructors, is not associated with greater learning gains. We contend that most instructors lack the rich and nuanced understanding of teaching and learning that science education researchers have developed. Therefore, active learning as designed and implemented by typical college biology instructors may superficially resemble active learning used by education researchers, but lacks the constructivist elements necessary for improving learning. PMID:22135373

  13. Protocol adherence for continuously titrated interventions in randomized trials: an overview of the current methodology and case study

    Directory of Open Access Journals (Sweden)

    F. Lauzier

    2017-07-01

    Full Text Available Abstract Background The standard definition of protocol adherence is the proportion of all scheduled doses that are delivered. In clinical research, this definition has several limitations when evaluating protocol adherence in trials that study interventions requiring continuous titration. Discussion Building upon a specific case study, we analyzed a recent trial of a continuously titrated intervention to assess the impact of different definitions of protocol deviations on the interpretation of protocol adherence. The OVATION pilot trial was an open-label randomized controlled trial of higher (75–80 mmHg) versus lower (60–65 mmHg) mean arterial pressure (MAP) targets for vasopressor therapy in shock. In this trial, potential protocol deviations were defined as MAP values outside the targeted range for >4 consecutive hours during vasopressor therapy without synchronous and consistent adjustments of vasopressor doses. An adjudication committee reviewed each potential deviation to determine whether or not it was clinically justified. There are four reasons for this contextual measurement and reporting of protocol adherence. First, between-arm separation is a robust measure of adherence to complex protocols. Second, adherence assessed by protocol deviations varies as a function of the definition of deviations and the frequency of measurements. Third, distinguishing clinically justified from non-justified protocol deviations acknowledges clinically sensible bedside decision-making and offers a clear terminology before the trial begins. Finally, multiple metrics exist to report protocol deviations, which provide different but complementary information on protocol adherence. Conclusions In trials of interventions requiring continuous titration, the metrics used for defining protocol deviations have a considerable impact on the interpretation of protocol adherence. Definitions for protocol deviations should be prespecified and correlated

  14. Effect of the Mediterranean diet on heart failure biomarkers: a randomized sample from the PREDIMED trial.

    Science.gov (United States)

    Fitó, Montserrat; Estruch, Ramón; Salas-Salvadó, Jordi; Martínez-Gonzalez, Miguel Angel; Arós, Fernando; Vila, Joan; Corella, Dolores; Díaz, Oscar; Sáez, Guillermo; de la Torre, Rafael; Mitjavila, María-Teresa; Muñoz, Miguel Angel; Lamuela-Raventós, Rosa-María; Ruiz-Gutierrez, Valentina; Fiol, Miquel; Gómez-Gracia, Enrique; Lapetra, José; Ros, Emilio; Serra-Majem, Lluis; Covas, María-Isabel

    2014-05-01

    Scarce data are available on the effect of the traditional Mediterranean diet (TMD) on heart failure biomarkers. We assessed the effect of the TMD on biomarkers related to heart failure in a population at high cardiovascular risk. A total of 930 subjects at high cardiovascular risk (420 men and 510 women) were recruited in the framework of a multicentre, randomized, controlled, parallel-group clinical trial directed at testing the efficacy of the TMD on the primary prevention of cardiovascular disease (the PREDIMED Study). Participants were assigned to a low-fat diet (control, n = 310) or one of two TMDs [TMD + virgin olive oil (VOO) or TMD + nuts]. Depending on group assignment, participants received free provision of extra-virgin olive oil, mixed nuts, or small non-food gifts. After 1 year of intervention, both TMDs decreased plasma N-terminal pro-brain natriuretic peptide, with changes reaching significance versus the control group. Participants at high risk of cardiovascular disease (CVD) who improved their diet toward a TMD pattern reduced their N-terminal pro-brain natriuretic peptide compared with those assigned to a low-fat diet. The same was found for in vivo oxidized low-density lipoprotein and lipoprotein(a) plasma concentrations after the TMD + VOO diet. From our results, the TMD could be a useful tool to mitigate risk factors for heart failure, modifying markers of heart failure towards a more protective mode. © 2014 The Authors. European Journal of Heart Failure © 2014 European Society of Cardiology.

  15. Sequential sampling model for multiattribute choice alternatives with random attention time and processing order.

    Science.gov (United States)

    Diederich, Adele; Oswald, Peter

    2014-01-01

    A sequential sampling model for multiattribute binary choice options, called the multiattribute attention switching (MAAS) model, assumes a separate sampling process for each attribute. During the deliberation process, attention switches from one attribute consideration to the next. The order in which attributes are considered, as well as how long each attribute is considered (the attention time), influences the predicted choice probabilities and choice response times. Several probability distributions for the attention time, with different variances, are investigated. Depending on the time and order schedule, the model predicts a rich choice probability/choice response time pattern, including preference reversals and fast errors. Furthermore, the difference between finite and infinite decision horizons for the attribute considered last is investigated. For the former case, the model predicts a probability p0 > 0 of not deciding within the available time. The underlying stochastic process for each attribute is an Ornstein-Uhlenbeck process approximated by a discrete birth-death process. All predictions also hold for the widely applied Wiener process.
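
    The per-attribute accumulator named here, an Ornstein-Uhlenbeck (OU) process dX = θ(μ − X)dt + σ dW, can be sketched with a simple Euler-Maruyama discretisation. The parameter values are illustrative, and the MAAS model itself uses a discrete birth-death approximation rather than this scheme.

```python
import random

def simulate_ou(x0=0.0, mu=0.0, theta=1.0, sigma=0.5,
                dt=0.01, steps=1000, rng=None):
    """Euler-Maruyama path of an Ornstein-Uhlenbeck process:
    mean-reverting toward `mu` at rate `theta`, noise scale `sigma`."""
    rng = rng or random.Random(0)  # fixed seed for a reproducible sketch
    x = x0
    path = [x]
    for _ in range(steps):
        x += theta * (mu - x) * dt + sigma * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

# start away from the mean and watch the accumulator revert toward mu = 0
path = simulate_ou(x0=2.0)
```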

  16. Sequential sampling model for multiattribute choice alternatives with random attention time and processing order

    Directory of Open Access Journals (Sweden)

    Adele Diederich

    2014-09-01

    Full Text Available A sequential sampling model for multiattribute binary choice options, called the multiattribute attention switching (MAAS) model, assumes a separate sampling process for each attribute. During the deliberation process, attention switches from one attribute consideration to the next. The order in which attributes are considered, as well as how long each attribute is considered (the attention time), influences the predicted choice probabilities and choice response times. Several probability distributions for the attention time, including deterministic, Poisson, binomial, geometric, and uniform distributions with different variances, are investigated. Depending on the time and order schedule, the model predicts a rich choice probability/choice response time pattern, including preference reversals and fast errors. Furthermore, the difference between finite and infinite decision horizons for the attribute considered last is investigated. For the former case, the model predicts a probability $p_0 > 0$ of not deciding within the available time. The underlying stochastic process for each attribute is an Ornstein-Uhlenbeck process approximated by a discrete birth-death process. All predictions also hold for the widely applied Wiener process.

  17. Incorporating covariance estimation uncertainty in spatial sampling design for prediction with trans-Gaussian random fields

    Directory of Open Access Journals (Sweden)

    Gunter Spöck

    2015-05-01

    Full Text Available Recently, Spöck and Pilz [38] demonstrated that the spatial sampling design problem for the Bayesian linear kriging predictor can be transformed to an equivalent experimental design problem for a linear regression model with stochastic regression coefficients and uncorrelated errors. The stochastic regression coefficients derive from the polar spectral approximation of the residual process. Thus, standard optimal convex experimental design theory can be used to calculate optimal spatial sampling designs. The design functionals considered in Spöck and Pilz [38] did not take into account the fact that kriging is actually a plug-in predictor which uses the estimated covariance function. The resulting optimal designs were close to space-filling configurations, because the design criterion did not consider the uncertainty of the covariance function. In this paper we also assume that the covariance function is estimated, e.g., by restricted maximum likelihood (REML). We then develop a design criterion that fully takes account of the covariance uncertainty. The resulting designs are less regular and space-filling compared to those ignoring covariance uncertainty. The new designs, however, also require some closely spaced samples in order to improve the estimate of the covariance function. We also relax the assumption of Gaussian observations and assume that the data is transformed to Gaussianity by means of the Box-Cox transformation. The resulting prediction method is known as trans-Gaussian kriging. We apply the Smith and Zhu [37] approach to this kriging method and show that the resulting optimal designs also depend on the available data. We illustrate our results with a data set of monthly rainfall measurements from Upper Austria.
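
    The Box-Cox step used to bring the data to Gaussianity, and the back-transform that trans-Gaussian kriging needs to map predictions onto the original scale, take the standard forms below. A stdlib-only sketch; in practice λ would be estimated from the data, e.g. jointly with the covariance parameters.

```python
import math

def box_cox(x, lam):
    """Box-Cox transformation toward Gaussianity (requires x > 0):
    (x**lam - 1)/lam for lam != 0, log(x) for lam == 0."""
    if lam == 0:
        return math.log(x)
    return (x ** lam - 1.0) / lam

def box_cox_inverse(y, lam):
    """Back-transform from the Gaussian scale to the original scale."""
    if lam == 0:
        return math.exp(y)
    return (lam * y + 1.0) ** (1.0 / lam)
```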

  18. Generalized SAMPLE SIZE Determination Formulas for Investigating Contextual Effects by a Three-Level Random Intercept Model.

    Science.gov (United States)

    Usami, Satoshi

    2017-03-01

    Behavioral and psychological researchers have shown strong interest in investigating contextual effects (i.e., the influences of combinations of individual- and group-level predictors on individual-level outcomes). The present research provides generalized formulas for determining the sample size needed to investigate contextual effects at a desired level of statistical power and width of confidence interval. These formulas are derived within a three-level random intercept model that includes one predictor/contextual variable at each level, so as to simultaneously cover the various kinds of contextual effects in which researchers may be interested. The relative influences of the indices included in the formulas on the standard errors of contextual-effect estimates are investigated with the aim of further simplifying sample size determination procedures. In addition, simulation studies are performed to investigate the finite-sample behavior of the calculated statistical power, showing that sample sizes estimated from the derived formulas can be both positively and negatively biased, due to the complex effects of unreliability of the contextual variables, multicollinearity, and violation of the assumption of known variances. Thus, it is advisable to compare estimated sample sizes under various specifications of the indices and to evaluate their potential bias, as illustrated in the example.
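
    In the simpler two-level case, multilevel sample size requirements are often summarized by the design effect 1 + (m − 1)ρ, where m is the cluster size and ρ the intraclass correlation; it inflates the sample size a simple random sample would need. A sketch with illustrative numbers; the paper's three-level formulas are more general than this.

```python
import math

def design_effect(cluster_size, icc):
    """Classical two-level design effect: DE = 1 + (m - 1) * rho."""
    return 1 + (cluster_size - 1) * icc

def clustered_sample_size(n_srs, cluster_size, icc):
    """Total sample size needed under clustering, given the sample
    size `n_srs` that simple random sampling would require."""
    return math.ceil(n_srs * design_effect(cluster_size, icc))

# e.g. 126 subjects under SRS, clusters of 20, ICC of 0.05
n_needed = clustered_sample_size(n_srs=126, cluster_size=20, icc=0.05)  # -> 246
```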

  19. Comparison of Detrusor Muscle Sampling Rate in Monopolar and Bipolar Transurethral Resection of Bladder Tumor: A Randomized Trial.

    Science.gov (United States)

    Teoh, Jeremy Yuen-Chun; Chan, Eddie Shu-Yin; Yip, Siu-Ying; Tam, Ho-Man; Chiu, Peter Ka-Fung; Yee, Chi-Hang; Wong, Hon-Ming; Chan, Chi-Kwok; Hou, Simon See-Ming; Ng, Chi-Fai

    2017-05-01

    Our aim was to investigate the detrusor muscle sampling rate after monopolar versus bipolar transurethral resection of bladder tumor (TURBT). This was a single-center, prospective, randomized, phase III trial on monopolar versus bipolar TURBT. Baseline patient characteristics, disease characteristics and perioperative outcomes were compared, with the primary outcome being the detrusor muscle sampling rate in the TURBT specimen. Multivariate logistic regression analyses on detrusor muscle sampling were performed. From May 2012 to December 2015, a total of 160 patients with similar baseline characteristics were randomized to receive monopolar or bipolar TURBT. Fewer patients in the bipolar TURBT group required postoperative irrigation than patients in the monopolar TURBT group (18.7 vs. 43%; p = 0.001). In the whole cohort, no significant difference in the detrusor muscle sampling rates was observed between the bipolar and monopolar TURBT groups (77.3 vs. 63.3%; p = 0.057). In patients with urothelial carcinoma, bipolar TURBT achieved a higher detrusor muscle sampling rate than monopolar TURBT (84.6 vs. 67.7%; p = 0.025). On multivariate analyses, bipolar TURBT (odds ratio [OR] 2.23, 95% confidence interval [CI] 1.03-4.81; p = 0.042) and larger tumor size (OR 1.04, 95% CI 1.01-1.08; p = 0.022) were significantly associated with detrusor muscle sampling in the whole cohort. In addition, bipolar TURBT (OR 2.88, 95% CI 1.10-7.53; p = 0.031), larger tumor size (OR 1.05, 95% CI 1.01-1.10; p = 0.035), and female sex (OR 3.25, 95% CI 1.10-9.59; p = 0.033) were significantly associated with detrusor muscle sampling in patients with urothelial carcinoma. There was a trend towards a superior detrusor muscle sampling rate after bipolar TURBT. Further studies are needed to determine its implications on disease recurrence and progression.

  20. Rosiglitazone: can meta-analysis accurately estimate excess cardiovascular risk given the available data? Re-analysis of randomized trials using various methodologic approaches

    Directory of Open Access Journals (Sweden)

    Friedrich Jan O

    2009-01-01

    Full Text Available Abstract Background A recent and provocative meta-analysis, based on few outcome events, suggested that rosiglitazone increased cardiovascular mortality and myocardial infarction. However, results of meta-analyses of trials with sparse events, often performed when examining uncommon adverse effects of common therapies, can vary substantially depending on methodologic decisions. The objective of this study was to assess the robustness of the rosiglitazone results by using alternative reasonable methodologic approaches and by analyzing additional related outcomes. Findings In duplicate and independently, we abstracted all myocardial and cerebrovascular ischemic events from all randomized controlled trials listed on the manufacturer's web site meeting the inclusion criteria of the original meta-analysis (at least 24 weeks of rosiglitazone exposure in the intervention group and any control group without rosiglitazone). We performed meta-analyses of these data under different methodologic conditions. An unconfounded comparison that includes only trials (or arms of trials) in which medications apart from rosiglitazone are identical suggests higher risks than previously reported, making even the risk of cardiovascular death statistically significant. Alternatively, a meta-analysis that includes all trials comparing a treatment arm receiving rosiglitazone to any control arm without rosiglitazone (as in the original meta-analysis), but also including trials with no events in both the rosiglitazone and control arms (not incorporated in the original meta-analysis), shows adverse but non-statistically significant effects of rosiglitazone on myocardial infarction and cardiovascular mortality. Rosiglitazone appears to have inconsistent effects on a wider range of cardiovascular outcomes. It increases the risk of a broad range of myocardial ischemic events (not just myocardial infarction). However, its effect on cerebrovascular ischemic events suggests benefit

  1. Intraarticular Facet Injections for Low Back Pain: Design Considerations, Consensus Methodology to Develop the Protocol for a Randomized Controlled Trial.

    Science.gov (United States)

    Mars, Tom; Ellard, David R; Antrobus, James H L; Cairns, Melinda; Underwood, Martin; Haywood, Kirstie; Keohane, Susie; Sandhu, Harbinder; Griffiths, Frances

    2015-01-01

    Since the publication of the UK National Institute for Health and Care Excellence (NICE) guidelines and the American Pain Society guidelines for low back pain in 2009, there have been deep divisions in the pain treatment community about the use of therapeutic intraarticular facet joint injections. While evidence for the effectiveness or otherwise of intraarticular facet joint injections remains sparse, uncertainty will remain. The Warwick feasibility study, along with a concurrent study with a different design led by another group, aims to provide a stable platform from which the effectiveness and cost-effectiveness of intraarticular facet joint injections added to normal care could be evaluated in randomized controlled trials (RCTs). To reach consensus on key design considerations for the Warwick facet feasibility study, from which the study protocol and working manuals will be developed, we held a consensus conference involving expert professionals and lay members. Preliminary work identified 5 key design considerations for deliberation at our consensus conference. Three concerned patient assessment and treatment: diagnosis of possible facet joint pain, intraarticular facet joint injection technique, and best usual care. Two concerned trial analysis: a priori sub-groups and minimally important difference; these are reported elsewhere. We did systematic evidence reviews of the design considerations and summarized the evidence. Our design questions and evidence summaries were distributed to all delegates and formed the basis for discussions on the day. Clinical experts in all aspects of facet joint injection from across the UK, along with lay people, were invited via relevant organizations. The nominal group technique was used in 15 facilitated initial small-group discussions. Further discussion and ranking were undertaken in plenary. All small-group and plenary results were recorded, checked, and verified after the conference. Where necessary, participants were contacted via email to

  2. Determining optimal sample sizes for multistage adaptive randomized clinical trials from an industry perspective using value of information methods.

    Science.gov (United States)

    Chen, Maggie H; Willan, Andrew R

    2013-02-01

    Most often, sample size determinations for randomized clinical trials are based on frequentist approaches that depend on somewhat arbitrarily chosen factors, such as type I and II error probabilities and the smallest clinically important difference. As an alternative, many authors have proposed decision-theoretic (full Bayesian) approaches, often referred to as value of information methods that attempt to determine the sample size that maximizes the difference between the trial's expected utility and its expected cost, referred to as the expected net gain. Taking an industry perspective, Willan proposes a solution in which the trial's utility is the increase in expected profit. Furthermore, Willan and Kowgier, taking a societal perspective, show that multistage designs can increase expected net gain. The purpose of this article is to determine the optimal sample size using value of information methods for industry-based, multistage adaptive randomized clinical trials, and to demonstrate the increase in expected net gain realized. At the end of each stage, the trial's sponsor must decide between three actions: continue to the next stage, stop the trial and seek regulatory approval, or stop the trial and abandon the drug. A model for expected total profit is proposed that includes consideration of per-patient profit, disease incidence, time horizon, trial duration, market share, and the relationship between trial results and probability of regulatory approval. The proposed method is extended to include multistage designs with a solution provided for a two-stage design. An example is given. Significant increases in the expected net gain are realized by using multistage designs. The complexity of the solutions increases with the number of stages, although far simpler near-optimal solutions exist. The method relies on the central limit theorem, assuming that the sample size is sufficiently large so that the relevant statistics are normally distributed. 
From a value of
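The central trade-off in value of information methods, expected utility rising with sample size while trial cost grows linearly, can be sketched with a toy single-stage model. Everything below is an illustrative simplification (all parameter values are invented, and the probability of regulatory approval is approximated by the trial's frequentist power at an assumed true effect size), not Willan's actual industry model:

```python
from statistics import NormalDist

Z = NormalDist()

def expected_net_gain(n_per_arm, delta=0.5, sigma=1.0,
                      profit=50e6, cost_per_patient=20e3):
    """Expected net gain of a hypothetical single-stage, two-arm trial.

    Simplifying assumption: P(regulatory approval) equals the trial's
    power at the true effect size delta.  Expected utility is approval
    probability times total expected profit, minus trial cost.
    """
    se = sigma * (2.0 / n_per_arm) ** 0.5          # SE of the mean difference
    p_approval = Z.cdf(delta / se - Z.inv_cdf(0.975))
    return p_approval * profit - cost_per_patient * 2 * n_per_arm

# The optimal sample size maximizes expected net gain over a grid.
best_n = max(range(10, 1001), key=expected_net_gain)
```

Under these assumptions the expected net gain peaks at a moderate sample size: smaller trials forgo approval probability, while larger ones pay for information that no longer changes the decision.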

  3. Pilot Testing of a Sampling Methodology for Assessing Seed Attachment Propensity and Transport Rate in a Soil Matrix Carried on Boot Soles and Bike Tires

    Science.gov (United States)

    Hardiman, Nigel; Dietz, Kristina Charlotte; Bride, Ian; Passfield, Louis

    2017-01-01

    Land managers of natural areas are under pressure to balance demands for increased recreation access with protection of the natural resource. Unintended dispersal of seeds by visitors to natural areas has high potential for weedy plant invasions, with initial seed attachment an important step in the dispersal process. Although walking and mountain biking are popular nature-based recreation activities, there are few studies quantifying propensity for seed attachment and transport rate on boot soles and none for bike tires. Attachment and transport rate can potentially be affected by a wide range of factors for which field testing can be time-consuming and expensive. We pilot tested a sampling methodology for measuring seed attachment and transport rate in a soil matrix carried on boot soles and bike tires traversing a known quantity and density of a seed analog (beads) over different distances and soil conditions. We found that the percentage attachment rate on boot soles was much lower overall than previously reported, but that boot soles had a higher propensity for seed attachment than bike tires in almost all conditions. We believe our methodology offers a cost-effective option for researchers seeking to manipulate and test effects of different influencing factors on these two dispersal vectors.

  4. An optimized SPE-LC-MS/MS method for antibiotics residue analysis in ground, surface and treated water samples by response surface methodology- central composite design.

    Science.gov (United States)

    Mirzaei, Roya; Yunesian, Masoud; Nasseri, Simin; Gholami, Mitra; Jalilzadeh, Esfandiyar; Shoeibi, Shahram; Bidshahi, Hooshang Shafieyan; Mesdaghinia, Alireza

    2017-01-01

    Antibiotic residues are constantly being identified in environmental waters at low concentrations. Growing concern has been expressed over their adverse environmental and human health effects, even at low concentrations. Hence, it is crucial to develop a multi-residue analytical method for antibiotics to generate the considerable datasets needed to assess the aquatic toxicity of environmental waters for aquatic organisms and human health. This work aimed to develop a reliable and sensitive multi-residue method based on high performance liquid chromatography coupled with quadrupole-linear ion trap tandem mass spectrometry (HPLC-MS/MS). The method was optimized and validated for simultaneous determination of four classes of antibiotics, namely β-lactams, macrolides, fluoroquinolones and nitroimidazoles, in treated, ground and surface water matrices. To optimize the solid phase extraction process, the main parameters influencing extraction, including pH, the volume of elution solvent and the amount of Na4EDTA, were evaluated. The optimization of the extraction process was carried out by response surface methodology using central composite design. Analysis of variance was performed for nine target antibiotics using response surface methodology. The extraction recoveries were found to be sensitive to the independent variables of pH, the volume of elution solvent and the amount of Na4EDTA. The extraction process was pH-dependent, and pH was a significant model term in the extraction process for all target antibiotics. Method validation was performed under optimum operating conditions, in which recoveries were obtained in the range of 50-117% for seven antibiotics in spiked treated and ground water samples and for six antibiotics in spiked river water samples. Method detection limits were in the range of 1-10 ng/L in treated water, 0.8-10 ng/L in ground water and 0.8-25 ng/L in river water
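The central composite design used for the extraction optimization above can be illustrated in coded units: 2^k factorial corners, 2k axial (star) points at ±alpha, and replicated center points. The factor labels (pH, eluent volume, Na4EDTA amount) come from the abstract; the generator below is a generic sketch, not the authors' design matrix:

```python
from itertools import product

def central_composite_design(k, alpha=None, n_center=4):
    """Coded design points for a k-factor central composite design.

    alpha defaults to the rotatable value (2**k)**0.25.
    Returns a list of k-tuples in coded units.
    """
    if alpha is None:
        alpha = (2 ** k) ** 0.25                  # rotatable design
    corners = list(product([-1.0, 1.0], repeat=k))
    axial = []
    for i in range(k):
        for a in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = a
            axial.append(tuple(pt))
    center = [(0.0,) * k] * n_center
    return corners + axial + center

# Three factors (e.g. pH, eluent volume, Na4EDTA): 8 + 6 + 4 = 18 runs
design = central_composite_design(3)
```

The coded levels are then mapped back to physical ranges for each factor, and a quadratic response surface is fitted to the measured recoveries.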

  5. Global Stratigraphy of Venus: Analysis of a Random Sample of Thirty-Six Test Areas

    Science.gov (United States)

    Basilevsky, Alexander T.; Head, James W., III

    1995-01-01

    The age relations between 36 impact craters with dark paraboloids and other geologic units and structures at these localities have been studied through photogeologic analysis of Magellan SAR images of the surface of Venus. Geologic settings in all 36 sites, about 1000 x 1000 km each, could be characterized using only 10 different terrain units and six types of structures. These units and structures form a major stratigraphic and geologic sequence (from oldest to youngest): (1) tessera terrain; (2) densely fractured terrains associated with coronae and in the form of remnants among plains; (3) fractured and ridged plains and ridge belts; (4) plains with wrinkle ridges; (5) ridges associated with coronae annulae and ridges of arachnoid annulae which are contemporary with wrinkle ridges of the ridged plains; (6) smooth and lobate plains; (7) fractures of coronae annulae, and fractures not related to coronae annulae, which disrupt ridged and smooth plains; (8) rift-associated fractures; and (9) craters with associated dark paraboloids, which represent the youngest 10% of the Venus impact crater population (Campbell et al.), and are on top of all volcanic and tectonic units except the youngest episodes of rift-associated fracturing and volcanism; surficial streaks and patches are approximately contemporary with dark-paraboloid craters. Mapping of such units and structures in 36 randomly distributed large regions (each approximately 10(exp 6) sq km) shows evidence for a distinctive regional and global stratigraphic and geologic sequence. On the basis of this sequence we have developed a model that illustrates several major themes in the history of Venus. Most of the history of Venus (that of its first 80% or so) is not preserved in the surface geomorphological record. The major deformation associated with tessera formation in the period sometime between 0.5-1.0 b.y. ago (Ivanov and Basilevsky) is the earliest event detected. In the terminal stages of tessera fon

  6. Use of pornography in a random sample of Norwegian heterosexual couples.

    Science.gov (United States)

    Daneback, Kristian; Traeen, Bente; Månsson, Sven-Axel

    2009-10-01

    This study examined the use of pornography in couple relationships to enhance the sex life. The study contained a representative sample of 398 heterosexual couples aged 22-67 years. Data collection was carried out by self-administered postal questionnaires. The majority (77%) of the couples did not report any kind of pornography use to enhance their sex life. In 15% of the couples, both partners had used pornography; in 3% of the couples, only the female partner had used pornography; and, in 5% of the couples, only the male partner had used pornography for this purpose. Based on the results of a discriminant function analysis, it is suggested that couples in which one or both partners used pornography had a more permissive erotic climate compared to couples who did not use pornography. In couples where only one partner used pornography, we found more problems related to arousal (in the male partner) and to negative self-perception (in the female partner). These findings could be of importance for clinicians who work with couples.

  7. Novel approach to systematic random sampling in population surveys: Lessons from the United Arab Emirates National Diabetes Study (UAEDIAB).

    Science.gov (United States)

    Sulaiman, Nabil; Albadawi, Salah; Abusnana, Salah; Fikri, Mahmoud; Madani, Abdulrazzag; Mairghani, Maisoon; Alawadi, Fatheya; Zimmet, Paul; Shaw, Jonathan

    2015-09-01

    The prevalence of diabetes has risen rapidly in the Middle East, particularly in the Gulf Region. However, some prevalence estimates have not fully accounted for large migrant worker populations and have focused on minority indigenous populations. The objectives of the UAE National Diabetes and Lifestyle Study are to: (i) define the prevalence of, and risk factors for, T2DM; (ii) describe the distribution and determinants of T2DM risk factors; (iii) study health knowledge and attitudes; (iv) identify gene-environment interactions; and (v) develop baseline data for evaluation of future intervention programs. Given the high burden of diabetes in the region and the absence of accurate data on non-UAE nationals in the UAE, a representative sample of non-UAE nationals was essential. We used an innovative methodology in which non-UAE nationals were sampled when attending the mandatory biannual health check required for visa renewal. Such an approach could also be used in other countries in the region. Complete data were available for 2719 eligible non-UAE nationals (25.9% Arabs, 70.7% Asian non-Arabs, 1.1% African non-Arabs, and 2.3% Westerners). Most were men working in service, sales, and unskilled occupations. Overall, 37.4% had completed high school and 4.1% had a postgraduate degree. This novel methodology could provide insights for epidemiological studies in the UAE and other Gulf States, particularly for expatriates. © 2015 Ruijin Hospital, Shanghai Jiaotong University School of Medicine and Wiley Publishing Asia Pty Ltd.

  8. Estimating screening-mammography receiver operating characteristic (ROC) curves from stratified random samples of screening mammograms: a simulation study.

    Science.gov (United States)

    Zur, Richard M; Pesce, Lorenzo L; Jiang, Yulei

    2015-05-01

    To evaluate stratified random sampling (SRS) of screening mammograms by (1) Breast Imaging Reporting and Data System (BI-RADS) assessment categories, and (2) the presence of breast cancer in mammograms, for estimation of screening-mammography receiver operating characteristic (ROC) curves in retrospective observer studies. We compared observer study case sets constructed by (1) random sampling (RS); (2) SRS with proportional allocation (SRS-P) with BI-RADS 1 and 2 noncancer cases accounting for 90.6% of all noncancer cases; (3) SRS with disproportional allocation (SRS-D) with BI-RADS 1 and 2 noncancer cases accounting for 10%-80%; and (4) SRS-D and multiple imputation (SRS-D + MI) with missing BI-RADS 1 and 2 noncancer cases imputed to recover the 90.6% proportion. Monte Carlo simulated case sets were drawn from a large case population modeled after published Digital Mammography Imaging Screening Trial data. We compared the bias, root-mean-square error, and coverage of 95% confidence intervals of area under the ROC curve (AUC) estimates from the sampling methods (200-2000 cases, of which 25% were cancer cases) versus from the large case population. AUC estimates were unbiased from RS, SRS-P, and SRS-D + MI, but biased from SRS-D. AUC estimates from SRS-P and SRS-D + MI had 10% smaller root-mean-square error than RS. Both SRS-P and SRS-D + MI can be used to obtain unbiased and 10% more efficient estimates of screening-mammography ROC curves. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
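Stratified random sampling with proportional allocation (the SRS-P scheme above) draws from each stratum in proportion to its share of the population. The sketch below is generic; the record format and stratum key are illustrative, not the study's code:

```python
import random

def stratified_sample(cases, key, n_total, rng=random):
    """Stratified random sample with proportional allocation.

    cases: list of records; key(case) -> stratum label
    (e.g. a BI-RADS assessment category).  Each stratum contributes
    round(n_total * stratum_share) draws; because of rounding, the
    realized total can differ slightly from n_total.
    """
    strata = {}
    for c in cases:
        strata.setdefault(key(c), []).append(c)
    sample = []
    for members in strata.values():
        n_h = round(n_total * len(members) / len(cases))
        sample.extend(rng.sample(members, min(n_h, len(members))))
    return sample
```

Disproportional allocation (SRS-D) would instead fix each stratum's share by design, which is why it biases AUC estimates unless the missing strata are recovered by imputation.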

  9. Analytical and sampling methodology used for determining the impact of a MHD test facility. [Al, As, Ca, Cd, Fe, K, Mn, Pb, Zn, B, Be, Bi, Co, Cr, Hg, Mg, Mo, Ni, Sb, Se, Sn, V

    Energy Technology Data Exchange (ETDEWEB)

    Jenke, D.R.

    1984-01-01

    The suitability of various analytical and sampling methodologies for determining the environmental impact of a test-scale MHD (magnetohydrodynamic) power generation facility is evaluated. Media sampled included airborne particulates, soils, vegetation and natural waters. Analytical techniques examined include plasma emission and atomic absorption spectrometry and ion chromatography. Except in the case of the sampling of particulates with moss ball samplers, the analytical and sampling methodologies can be integrated so as to maximize precision and thereby facilitate the statistical evaluation of impact effects. 12 references, 1 figure, 7 tables.

  10. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge.

    Directory of Open Access Journals (Sweden)

    Rosa Catarino

    Full Text Available Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium. Dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and swab. A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed first: self-HPV using dry swabs (s-DRY) or vaginal specimen collection using a cytobrush applied to an FTA cartridge (s-FTA). After self-HPV, a physician collected a cervical sample using liquid-based medium (Dr-WET). HPV types were identified by real-time PCR. Agreement between collection methods was measured using the kappa statistic. HPV prevalence for high-risk types was 62.3% (95% CI: 53.7-70.2) detected by s-DRY, 56.2% (95% CI: 47.6-64.4) by Dr-WET, and 54.6% (95% CI: 46.1-62.9) by s-FTA. There was overall agreement of 70.8% between s-FTA and s-DRY samples (kappa = 0.34), and of 82.3% between self-HPV and Dr-WET samples (kappa = 0.56). Detection sensitivities for low-grade squamous intraepithelial lesion or worse (LSIL+) were: 64.0% (95% CI: 44.5-79.8) for s-FTA, 84.6% (95% CI: 66.5-93.9) for s-DRY, and 76.9% (95% CI: 58.0-89.0) for Dr-WET. The preferred self-collection method among patients was s-DRY (40.8% vs. 15.4%). Regarding costs, the FTA card was five times more expensive than the swab (~5 US dollars (USD) per card vs. ~1 USD per swab). Self-HPV using dry swabs is sensitive for detecting LSIL+ and less expensive than s-FTA. International Standard Randomized Controlled Trial Number (ISRCTN): 43310942.
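The kappa statistic used above corrects observed agreement between two collection methods for the agreement expected by chance. A generic sketch of Cohen's kappa for paired categorical results (not the study's analysis code):

```python
def cohens_kappa(x, y):
    """Cohen's kappa for two paired categorical (e.g. binary HPV) results."""
    assert len(x) == len(y) and len(x) > 0
    n = len(x)
    labels = sorted(set(x) | set(y))
    # Observed agreement: fraction of pairs where both methods agree.
    p_obs = sum(a == b for a, b in zip(x, y)) / n
    # Chance agreement: product of each rater's marginal label frequencies.
    p_exp = sum((x.count(lab) / n) * (y.count(lab) / n) for lab in labels)
    return (p_obs - p_exp) / (1 - p_exp)
```

Kappa of 0.34 (s-FTA vs. s-DRY) thus indicates only fair chance-corrected agreement even though raw agreement was 70.8%.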

  11. Evaluation of the methodologies used to generate random pavement profiles based on the power spectral density: An approach based on the International Roughness Index

    Directory of Open Access Journals (Sweden)

    Boris Jesús Goenaga

    2017-01-01

    Full Text Available The pavement roughness is the main variable that produces the vertical excitation in vehicles. Pavement profiles are the main determinant of (i) discomfort perception in users and (ii) dynamic loads generated at the tire-pavement interface; hence their evaluation constitutes an essential step in a Pavement Management System. The present document evaluates two specific techniques used to simulate pavement profiles: the shaping filter and the sinusoidal approach, both based on the Power Spectral Density. Pavement roughness was evaluated using the International Roughness Index (IRI), which is the most widely used index for characterizing longitudinal road profiles. Appropriate parameters were defined in the simulation process to obtain pavement profiles with specific ranges of IRI values using both simulation techniques. The results suggest that using a sinusoidal approach one can generate random profiles with IRI values that are representative of different road types; therefore, one could generate a profile for a paved or an unpaved road, representing all the categories defined by the ISO 8608 standard. On the other hand, to obtain similar results using the shaping filter approximation, a modification of the simulation parameters is necessary. The new proposed values allow one to generate pavement profiles with high levels of roughness, covering a wider range of surface types. Finally, the results of the current investigation could be used to further improve our understanding of the effect of pavement roughness on tire-pavement interaction. The evaluated methodologies could be used to generate random profiles with specific levels of roughness to assess their effect on dynamic loads generated at the tire-pavement interface and users' perception of road condition.
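The sinusoidal approach can be sketched for a displacement PSD of the ISO 8608 form Gd(n) = Gd(n0)·(n/n0)^-2: the PSD is discretized into frequency bands, each band contributes a cosine with amplitude sqrt(2·Gd(n_i)·Δn) and a random phase, and the cosines are summed. All parameter values below (roughness coefficient roughly ISO class A, band limits, step size) are illustrative assumptions, not the paper's settings:

```python
import math
import random

def road_profile(length=250.0, dx=0.05, Gd_n0=16e-6, n0=0.1,
                 n_min=0.011, n_max=2.83, N=200, seed=0):
    """Random road elevation profile via the sinusoidal approach.

    PSD form: Gd(n) = Gd_n0 * (n / n0) ** -2, n in cycles/m.
    Gd_n0 = 16e-6 m^3 roughly corresponds to an ISO 8608 class A road.
    Returns elevations z(x) in metres at spacing dx.
    """
    rng = random.Random(seed)
    dn = (n_max - n_min) / N
    # Band-center spatial frequencies with independent random phases.
    bands = [(n_min + (i + 0.5) * dn, rng.uniform(0.0, 2.0 * math.pi))
             for i in range(N)]
    amps = [(math.sqrt(2.0 * Gd_n0 * (n / n0) ** -2 * dn), n, phi)
            for n, phi in bands]
    xs = [i * dx for i in range(round(length / dx))]
    return [sum(A * math.cos(2.0 * math.pi * n * x + phi)
                for A, n, phi in amps) for x in xs]
```

The resulting profile can then be fed to a quarter-car model to compute the IRI, which is how simulated profiles are mapped to the road classes discussed above.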

  12. Optimal Subset Selection of Time-Series MODIS Images and Sample Data Transfer with Random Forests for Supervised Classification Modelling

    Directory of Open Access Journals (Sweden)

    Fuqun Zhou

    2016-10-01

    Full Text Available Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets including 8-day composites from NASA, and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to efficiently use these time-series MODIS datasets for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2–3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle these two important issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of two Random Forests features: variable importance and outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about half of the variables we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data.

  13. Optimal Subset Selection of Time-Series MODIS Images and Sample Data Transfer with Random Forests for Supervised Classification Modelling.

    Science.gov (United States)

    Zhou, Fuqun; Zhang, Aining

    2016-10-25

    Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets including 8-day composites from NASA, and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to efficiently use these time-series MODIS datasets for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2-3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle the two important issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of Random Forests' features: variable importance, outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about a half of the variables we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data.
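The variable-importance selection step described in this record can be sketched with scikit-learn on synthetic data standing in for MODIS bands. The library choice, parameter values, and data are all assumptions for illustration, not the authors' code:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def select_top_features(X, y, keep_fraction=0.5, random_state=0):
    """Rank features by Random Forest impurity importance and keep the
    top fraction -- a sketch of using variable importance to pick a
    reduced subset of time-series bands for classification.
    """
    rf = RandomForestClassifier(n_estimators=200, random_state=random_state)
    rf.fit(X, y)
    order = np.argsort(rf.feature_importances_)[::-1]
    n_keep = max(1, int(len(order) * keep_fraction))
    return order[:n_keep]

# Synthetic stand-in for MODIS composites: only bands 0 and 1 carry signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
top = select_top_features(X, y, keep_fraction=0.2)   # keep 2 of 10 bands
```

A classifier retrained on only the retained bands can then be compared against the full-band model, mirroring the study's finding that roughly half the variables suffice.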

  14. Media Use and Source Trust among Muslims in Seven Countries: Results of a Large Random Sample Survey

    Directory of Open Access Journals (Sweden)

    Steven R. Corman

    2013-12-01

    Full Text Available Despite the perceived importance of media in the spread of and resistance against Islamist extremism, little is known about how Muslims use different kinds of media to get information about religious issues, and what sources they trust when doing so. This paper reports the results of a large, random sample survey among Muslims in seven countries in Southeast Asia, West Africa and Western Europe, which helps fill this gap. Results show a diverse set of profiles of media use and source trust that differ by country, with overall low trust in mediated sources of information. Based on these findings, we conclude that mass media are still the most common source of religious information for Muslims, but that trust in mediated information is low overall. This suggests that media are probably best used to persuade opinion leaders, who will then carry anti-extremist messages through more personal means.

  15. Enhancing positive parent-child interactions and family functioning in a poverty sample: a randomized control trial.

    Science.gov (United States)

    Negrão, Mariana; Pereira, Mariana; Soares, Isabel; Mesman, Judi

    2014-01-01

    This study tested the attachment-based intervention program Video-feedback Intervention to promote Positive Parenting and Sensitive Discipline (VIPP-SD) in a randomized controlled trial with poor families of toddlers screened for professionals' concerns about the child's caregiving environment. The VIPP-SD is an evidence-based intervention, but it had not yet been tested in the context of poverty. The sample included 43 families with 1- to 4-year-old children: mean age at pretest was 29 months, and 51% were boys. At pretest and posttest, mother-child interactions were observed at home, and mothers reported on family functioning. The VIPP-SD proved effective in enhancing positive parent-child interactions and positive family relations in a severely deprived context. Results are discussed in terms of implications for support services provided to such poor families in order to reduce intergenerational risk transmission.

  16. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples

    Energy Technology Data Exchange (ETDEWEB)

    Laborda, Francisco, E-mail: flaborda@unizar.es; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T.; Jiménez, María S.; Pérez-Arantegui, Josefina; Castillo, Juan R.

    2016-01-21

    dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. - Highlights: • The challenge to analyze inorganic nanomaterials is described. • Techniques for detection, characterization and quantification of inorganic nanomaterials are presented. • Sample preparation methods for the analysis of nanomaterials in complex samples are presented. • Methodological approaches posed by stakeholders for solving nanometrological problems are discussed.

  17. Rationale, Design, Samples, and Baseline Sun Protection in a Randomized Trial on a Skin Cancer Prevention Intervention in Resort Environments

    Science.gov (United States)

    Buller, David B.; Andersen, Peter A.; Walkosz, Barbara J.; Scott, Michael D.; Beck, Larry; Cutter, Gary R.

    2016-01-01

    Introduction Exposure to solar ultraviolet radiation during recreation is a risk factor for skin cancer. A trial evaluated an intervention to promote advanced sun protection (sunscreen pre-application/reapplication; protective hats and clothing; use of shade) during vacations. Materials and Methods Adult visitors to hotels/resorts with outdoor recreation (i.e., vacationers) participated in a group-randomized pretest-posttest controlled quasi-experimental design in 2012–14. Hotels/resorts were pair-matched and randomly assigned to the intervention or untreated control group. Sun protection (e.g., clothing, hats, shade and sunscreen) was measured in cross-sectional samples by observation and a face-to-face intercept survey during two-day visits. Results Initially, 41 hotels/resorts (11%) participated, but 4 dropped out before posttest. Hotels/resorts were diverse (employees = 30 to 900; latitude = 24° 78′ N to 50° 52′ N; elevation = 2 ft. to 9,726 ft. above sea level), and had a variety of outdoor venues (beaches/pools, court/lawn games, golf courses, common areas, and chairlifts). At pretest, 4,347 vacationers were observed and 3,531 surveyed. More females were surveyed (61%) than observed (50%). Vacationers were mostly 35–60 years old, highly educated (college education = 68%) and non-Hispanic white (93%), with high-risk skin types (22%). Vacationers reported covering 60% of their skin with clothing. Also, 40% of vacationers used shade; 60% applied sunscreen; and 42% had been sunburned. Conclusions The trial faced challenges recruiting resorts, but results show that the large, multi-state sample of vacationers was at high risk for solar UV exposure. PMID:26593781

  18. Do health care institutions value research? A mixed methods study of barriers and facilitators to methodological rigor in pediatric randomized trials.

    Science.gov (United States)

    Hamm, Michele P; Scott, Shannon D; Klassen, Terry P; Moher, David; Hartling, Lisa

    2012-10-18

    Pediatric randomized controlled trials (RCTs) are susceptible to a high risk of bias. We examined the barriers and facilitators that pediatric trialists face in the design and conduct of unbiased trials. We used a mixed methods design, with semi-structured interviews building upon the results of a quantitative survey. We surveyed Canadian (n=253) and international (n=600) pediatric trialists regarding their knowledge and awareness of bias and their perceived barriers and facilitators in conducting clinical trials. We then interviewed 13 participants from different subspecialties and geographic locations to gain a more detailed description of how their experiences and attitudes towards research interacted with trial design and conduct. The survey response rate was 23.0% (186/807). 68.1% of respondents agreed that bias is a problem in pediatric RCTs and 72.0% felt that there is sufficient evidence to support changing some aspects of how trials are conducted. Knowledge related to bias was variable, with inconsistent awareness of study design features that may introduce bias into a study. Interview participants highlighted a lack of formal training in research methods, a negative research culture, and the pragmatics of trial conduct as barriers. Facilitators included contact with knowledgeable and supportive colleagues and infrastructure for research. A lack of awareness of bias and negative attitudes towards research present significant barriers in terms of conducting methodologically rigorous pediatric RCTs. Knowledge translation efforts must focus on these issues to ensure the relevance and validity of trial results.

  19. A pilot double-blind, randomized, placebo-controlled trial of the efficacy of trace elements in the treatment of endometriosis-related pain: study design and methodology

    Directory of Open Access Journals (Sweden)

    Oberweis D

    2016-02-01

    Full Text Available Didier Oberweis,1 Patrick Madelenat,2 Michelle Nisolle,3 Etienne Demanet4 1Department of Gynecology and Obstetrics, CHU de Charleroi, Hôpital André Vésale, Montigny-le-Tilleul, Belgium; 2Private Consultation, Paris, France; 3Department of Gynecology and Obstetrics, CHR Citadelle, Liège, 4Clinical Research Unit, Charleroi, Belgium Abstract: Endometriosis is one of the most common benign gynecological disorders, affecting almost 10%–15% of all women of reproductive age and >30% of infertile women. The pathology is associated with various distressing symptoms, particularly pelvic pain, which adversely affect patients' quality of life. It is an estrogen-dependent disease. There is evidence both in animals and in humans that metal ions can activate estrogen receptors. They are defined as a variety of xenoestrogens, called metalloestrogens, which could act as endocrine disruptors. Therefore, one could consider acting on this gynecological disorder using food supplements containing trace elements (i.e., nutripuncture). The assumption is that they could modulate estrogen receptors and thus influence the tropism and survival of cells involved in endometriosis. By modulating the antioxidant system, they might also interact with various parameters influencing tissue biochemistry. The objective of this article is to describe and discuss the design and methodology of an ongoing double-blind, randomized, placebo-controlled study aiming to evaluate the efficacy of metal trace elements in reducing pain and improving quality of life in patients with revised American Fertility Society Score Stage II–IV endometriosis, combined or not with adenomyosis, during a treatment period of 4 months. Trace elements or placebo are proposed in the absence of any other treatment or as an add-on to current therapies, such as sexual hormones, nonsteroidal anti-inflammatory drugs, and surgery. A placebo run-in period of one menstrual cycle or

  20. The French national survey on food consumption of children under 3 years of age - Nutri-Bébé 2013: design, methodology, population sampling and feeding practices.

    Science.gov (United States)

    Chouraqui, Jean-Pierre; Tavoularis, Gabriel; Emery, Yves; Francou, Aurée; Hébel, Pascale; Bocquet, Magali; Hankard, Régis; Turck, Dominique

    2017-11-02

    To update the data on food consumption and practices in children under 3 years of age in metropolitan France. The Nutri-Bébé 2013 cross-sectional study selected a random sample according to the quota sampling method. After giving their informed consent, parents had to record food consumption during three non-consecutive days framed by two face-to-face interviews, using different portion-size measurement aids for quantitative information. One thousand one hundred and eighty-four children were enrolled. Mothers' mean age was 30·8 (sd 5·4) years; 38% were primiparous; 89% lived with a partner; 60% had an occupation. Of the infants younger than 4 months, 31% were breast-fed. One thousand and thirty-five children consumed infant formula, followed by growing-up milk in 63% of them; solid foods were introduced at a mean age of 5·4 (sd 2·13) months. From 8 months onwards, 25% of children consumed the same foods as their parents on a more or less regular basis; 29% ate in front of a screen, with a daily average screen time of 43·0 (sd 40·4) min. This robust survey highlights the low prevalence and duration of breast-feeding in France and shows a modest improvement since the previous survey of 2005 in adherence to recommendations concerning other feeding practices. The frequent consumption of adult foods and the screen time are of concern.

  1. Contributions to sampling statistics

    CERN Document Server

    Conti, Pier; Ranalli, Maria

    2014-01-01

    This book contains a selection of the papers presented at the ITACOSM 2013 Conference, held in Milan in June 2013. ITACOSM is the bi-annual meeting of the Survey Sampling Group S2G of the Italian Statistical Society, intended as an international forum of scientific discussion on the developments of theory and application of survey sampling methodologies and applications in human and natural sciences. The book gathers research papers carefully selected from both invited and contributed sessions of the conference. The whole book appears to be a relevant contribution to various key aspects of sampling methodology and techniques; it deals with some hot topics in sampling theory, such as calibration, quantile-regression and multiple frame surveys, and with innovative methodologies in important topics of both sampling theory and applications. Contributions cut across current sampling methodologies such as interval estimation for complex samples, randomized responses, bootstrap, weighting, modeling, imputati...

  2. New methodology for the determination of phthalate esters, bisphenol A, bisphenol A diglycidyl ether, and nonylphenol in commercial whole milk samples.

    Science.gov (United States)

    Casajuana, Neus; Lacorte, Sílvia

    2004-06-16

    This paper reports a new methodology aimed at determining dimethyl phthalate, diethyl phthalate, di-n-butyl phthalate, butylbenzyl phthalate, bis(2-ethylhexyl) phthalate, nonylphenol, bisphenol A, and bisphenol A diglycidyl ether in commercial whole milk. These compounds are used as plastic additives, lacquers, resins, or surfactants and can be found in milk due to contact with plastic materials during food processing and storage. They are all suspected endocrine disrupters or mutagens. A multiresidue method based on solid-phase extraction with C-18 cartridges followed by a cleanup step using disposable cartridges was developed. Detection and quantification were performed by gas chromatography coupled to mass spectrometric (GC-MS) detection using an appropriate surrogate (4-n-nonylphenol) and internal standard [deuterated bis(2-ethylhexyl) phthalate]. Limits of detection were from 0.06 to 0.36 microg/kg and intraday variation between 3 and 27%, with recoveries between 73 and 119%. Five brands of commercial whole milk processed and packed in different ways were analyzed. All samples contained target compounds at concentrations between 0.28 and 85.3 microg/kg, and the total concentration ranged between 79.3 and 187.4 microg/kg, the levels being higher in sterilized milks. Nonylphenol, diethyl phthalate, dibutyl phthalate, and bis(2-ethylhexyl) phthalate were the major contributors.

  3. Development of garlic bioactive compounds analytical methodology based on liquid phase microextraction using response surface design. Implications for dual analysis: Cooked and biological fluids samples.

    Science.gov (United States)

    Ramirez, Daniela Andrea; Locatelli, Daniela Ana; Torres-Palazzolo, Carolina Andrea; Altamirano, Jorgelina Cecilia; Camargo, Alejandra Beatriz

    2017-01-15

    Organosulphur compounds (OSCs) present in garlic (Allium sativum L.) are responsible for several biological properties. Functional food research indicates the importance of quantifying these compounds in food matrices and biological fluids. For this purpose, this paper introduces a novel methodology based on dispersive liquid-liquid microextraction (DLLME) coupled to high performance liquid chromatography with ultraviolet detection (HPLC-UV) for the extraction and determination of organosulphur compounds in different matrices. The target analytes were allicin, (E)- and (Z)-ajoene, 2-vinyl-4H-1,2-dithiin (2-VD), diallyl sulphide (DAS) and diallyl disulphide (DADS). The microextraction technique was optimized using an experimental design, and the analytical performance was evaluated under optimum conditions. The desirability function presented an optimal value for 600μL of chloroform as extraction solvent using acetonitrile as dispersant. The method proved to be reliable, precise and accurate. It was successfully applied to determine OSCs in cooked garlic samples as well as blood plasma and digestive fluids. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. High Field In Vivo 13C Magnetic Resonance Spectroscopy of Brain by Random Radiofrequency Heteronuclear Decoupling and Data Sampling

    Science.gov (United States)

    Li, Ningzhi; Li, Shizhe; Shen, Jun

    2017-06-01

    In vivo 13C magnetic resonance spectroscopy (MRS) is a unique and effective tool for studying dynamic human brain metabolism and the cycling of neurotransmitters. One of the major technical challenges for in vivo 13C-MRS is the high radio frequency (RF) power necessary for heteronuclear decoupling. In the common practice of in vivo 13C-MRS, alkanyl carbons are detected in the spectral range of 10-65 ppm. The amplitude of decoupling pulses has to be significantly greater than the large one-bond 1H-13C scalar coupling (1JCH = 125-145 Hz). Two main proton decoupling methods have been developed: broadband stochastic decoupling and coherent composite or adiabatic pulse decoupling (e.g., WALTZ); the latter is widely used because of its efficiency and superb performance under inhomogeneous B1 field. Because the RF power required for proton decoupling increases quadratically with field strength, in vivo 13C-MRS using coherent decoupling is often limited to low magnetic fields by the RF safety limits set by the US Food and Drug Administration (FDA). Alternatively, carboxylic/amide carbons are coupled to protons via weak long-range 1H-13C scalar couplings, which can be decoupled using low-power broadband stochastic decoupling. Recently, the carboxylic/amide 13C-MRS technique using low-power random RF heteronuclear decoupling was safely applied to human brain studies at 7T. Here, we review the two major decoupling methods and the carboxylic/amide 13C-MRS with low-power decoupling strategy. Further decreases in RF power deposition by frequency-domain windowing and time-domain random under-sampling are also discussed. Low RF power decoupling opens the possibility of performing in vivo 13C experiments of the human brain at very high magnetic fields (such as 11.7T), where signal-to-noise ratio as well as spatial and temporal spectral resolution are more favorable than at lower fields.
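
The time-domain random under-sampling mentioned at the end of this record can be illustrated with a toy signal: a random subset of acquisition points is retained, reducing the duty cycle. The decay constant, frequency, and 25% sampling fraction below are invented for illustration, not values from the review.

```python
import math
import random

# Toy illustration of time-domain random under-sampling: retain a random 25%
# of acquisition points from a synthetic decaying signal. All parameters
# (dwell time, decay constant, frequency) are hypothetical.
random.seed(1)
n_points = 1024
dwell = 1.0 / 4000.0                      # hypothetical 4 kHz sampling
t = [i * dwell for i in range(n_points)]
fid = [math.exp(-ti / 0.05) * math.cos(2 * math.pi * 200.0 * ti) for ti in t]

# choose which acquisition indices to keep, in time order
keep = sorted(random.sample(range(n_points), n_points // 4))
undersampled = [(t[i], fid[i]) for i in keep]
print(len(undersampled))  # → 256
```

Reconstruction of a spectrum from such non-uniform samples would require an additional step (e.g., least-squares or compressed-sensing fitting), which is beyond this sketch.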

  5. Electronic symptom reporting between patient and provider for improved health care service quality: a systematic review of randomized controlled trials. part 2: methodological quality and effects.

    Science.gov (United States)

    Johansen, Monika Alise; Berntsen, Gro K Rosvold; Schuster, Tibor; Henriksen, Eva; Horsch, Alexander

    2012-10-03

    We conducted in two parts a systematic review of randomized controlled trials (RCTs) on electronic symptom reporting between patients and providers to improve health care service quality. Part 1 reviewed the typology of patient groups, health service innovations, and research targets. Four innovation categories were identified: consultation support, monitoring with clinician support, self-management with clinician support, and therapy. To assess the methodological quality of the RCTs, and summarize effects and benefits from the methodologically best studies. We searched Medline, EMBASE, PsycINFO, Cochrane Central Register of Controlled Trials, and IEEE Xplore for original studies presented in English-language articles between 1990 and November 2011. Risk of bias and feasibility were judged according to the Cochrane recommendation, and theoretical evidence and preclinical testing were evaluated according to the Framework for Design and Evaluation of Complex Interventions to Improve Health. Three authors assessed the risk of bias and two authors extracted the effect data independently. Disagreement regarding bias assessment, extraction, and interpretation of results were resolved by consensus discussions. Of 642 records identified, we included 32 articles representing 29 studies. No articles fulfilled all quality requirements. All interventions were feasible to implement in a real-life setting, and theoretical evidence was provided for almost all studies. However, preclinical testing was reported in only a third of the articles. We judged three-quarters of the articles to have low risk for random sequence allocation and approximately half of the articles to have low risk for the following biases: allocation concealment, incomplete outcome data, and selective reporting. Slightly more than one fifth of the articles were judged as low risk for blinding of outcome assessment. Only 1 article had low risk of bias for blinding of participants and personnel. 
We excluded 12

  6. Comparing attitudes about legal sanctions and teratogenic effects for cocaine, alcohol, tobacco and caffeine: A randomized, independent samples design

    Directory of Open Access Journals (Sweden)

    Alanis Kelly L

    2006-02-01

    Full Text Available Abstract Background Establishing more sensible measures to treat cocaine-addicted mothers and their children is essential for improving U.S. drug policy. Favorable post-natal environments have moderated potential deleterious prenatal effects. However, since cocaine is an illicit substance having long been demonized, we hypothesized that attitudes toward prenatal cocaine exposure would be more negative than for the licit substances alcohol, nicotine and caffeine. Further, media portrayals about long-term outcomes were hypothesized to influence viewers' attitudes, measured immediately post-viewing. Reducing popular crack baby stigmas could influence future policy decisions by legislators. In Study 1, 336 participants were randomly assigned to 1 of 4 conditions describing hypothetical legal sanction scenarios for pregnant women using cocaine, alcohol, nicotine or caffeine. Participants rated legal sanctions against pregnant women who used one of these substances and risk potential for developing children. In Study 2, 139 participants were randomly assigned to positive, neutral and negative media conditions. Immediately post-viewing, participants rated prenatal cocaine-exposed or non-exposed teens for their academic performance and risk for problems at age 18. Results Participants in Study 1 imposed significantly greater legal sanctions for cocaine, perceiving prenatal cocaine exposure as more harmful than alcohol, nicotine or caffeine. A one-way ANOVA for independent samples showed significant differences (p < .0001). A post-hoc Scheffé test showed that cocaine was rated differently from the other substances. In Study 2, a one-way ANOVA for independent samples was performed on difference scores for the positive, neutral or negative media conditions about prenatal cocaine exposure. Participants in the neutral and negative media conditions estimated significantly lower grade point averages and more problems for the teen with prenatal cocaine exposure

  7. Comparing attitudes about legal sanctions and teratogenic effects for cocaine, alcohol, tobacco and caffeine: A randomized, independent samples design

    Science.gov (United States)

    Ginsburg, Harvey J; Raffeld, Paul; Alanis, Kelly L; Boyce, Angela S

    2006-01-01

    Background Establishing more sensible measures to treat cocaine-addicted mothers and their children is essential for improving U.S. drug policy. Favorable post-natal environments have moderated potential deleterious prenatal effects. However, since cocaine is an illicit substance having long been demonized, we hypothesized that attitudes toward prenatal cocaine exposure would be more negative than for the licit substances alcohol, nicotine and caffeine. Further, media portrayals about long-term outcomes were hypothesized to influence viewers' attitudes, measured immediately post-viewing. Reducing popular crack baby stigmas could influence future policy decisions by legislators. In Study 1, 336 participants were randomly assigned to 1 of 4 conditions describing hypothetical legal sanction scenarios for pregnant women using cocaine, alcohol, nicotine or caffeine. Participants rated legal sanctions against pregnant women who used one of these substances and risk potential for developing children. In Study 2, 139 participants were randomly assigned to positive, neutral and negative media conditions. Immediately post-viewing, participants rated prenatal cocaine-exposed or non-exposed teens for their academic performance and risk for problems at age 18. Results Participants in Study 1 imposed significantly greater legal sanctions for cocaine, perceiving prenatal cocaine exposure as more harmful than alcohol, nicotine or caffeine. A one-way ANOVA for independent samples showed significant differences (p < .0001). A post-hoc Scheffé test showed that cocaine was rated differently from the other substances. In Study 2, a one-way ANOVA for independent samples was performed on difference scores for the positive, neutral or negative media conditions about prenatal cocaine exposure. Participants in the neutral and negative media conditions estimated significantly lower grade point averages and more problems for the teen with prenatal cocaine exposure than for the non-exposed teen

  8. Association of macronutrient intake patterns with being overweight in a population-based random sample of men in France.

    Science.gov (United States)

    Ahluwalia, N; Ferrières, J; Dallongeville, J; Simon, C; Ducimetière, P; Amouyel, P; Arveiler, D; Ruidavets, J-B

    2009-04-01

    Diet is considered an important modifiable factor in overweight. The role of macronutrients in obesity has generally been examined in selected populations, but the results of these studies are mixed, depending on the potential confounders and adjustments for other macronutrients. For this reason, we examined the association between macronutrient intake patterns and being overweight in a population-based representative sample of middle-aged (55.1+/-6.1 years) men (n=966), using various adjustment modalities. The study subjects kept 3-day food-intake records, and the standard cardiovascular risk factors were assessed. Weight, height and waist circumference (WC) were also measured. Carbohydrate intake was negatively associated and fat intake was positively associated with body mass index (BMI) and WC in regression models adjusted for energy intake and other factors, including age, smoking and physical activity. However, with mutual adjustments for other energy-yielding nutrients, the negative association of carbohydrate intake with WC remained significant, whereas the associations between fat intake and measures of obesity did not. Adjusted odds ratios (95% confidence interval) comparing the highest and lowest quartiles of carbohydrate intake were 0.50 (0.25-0.97) for obesity (BMI>29.9) and 0.41 (0.23-0.73) for abdominal obesity (WC>101.9 cm). Consistent negative associations between carbohydrate intake and BMI and WC were seen in this random representative sample of the general male population. The associations between fat intake and these measures of being overweight were attenuated on adjusting for carbohydrate intake. Thus, the balance of carbohydrate-to-fat intake is an important element in obesity in a general male population, and should be highlighted in dietary guidelines.

  9. Assessing causality in associations between cannabis use and schizophrenia risk: a two-sample Mendelian randomization study.

    Science.gov (United States)

    Gage, S H; Jones, H J; Burgess, S; Bowden, J; Davey Smith, G; Zammit, S; Munafò, M R

    2017-04-01

    Observational associations between cannabis and schizophrenia are well documented, but ascertaining causation is more challenging. We used Mendelian randomization (MR), utilizing publicly available data, as a method for ascertaining causation from observational data. We performed bi-directional two-sample MR using summary-level genome-wide data from the International Cannabis Consortium (ICC) and the Psychiatric Genomics Consortium (PGC2). Single nucleotide polymorphisms (SNPs) associated with cannabis initiation and with schizophrenia were used as genetic instruments. There was weak evidence consistent with a causal effect of cannabis initiation on risk of schizophrenia [odds ratio (OR) 1.04 per doubling odds of cannabis initiation, 95% confidence interval (CI) 1.01-1.07, p = 0.019]. There was strong evidence consistent with a causal effect of schizophrenia risk on likelihood of cannabis initiation (OR 1.10 per doubling of the odds of schizophrenia, 95% CI 1.05-1.14, p = 2.64 × 10-5). Findings were as predicted for the negative control (height: OR 1.00, 95% CI 0.99-1.01, p = 0.90) but weaker than predicted for the positive control (years in education: OR 0.99, 95% CI 0.97-1.00, p = 0.066) analyses. Our results provide some evidence that cannabis initiation increases the risk of schizophrenia, although the size of the causal estimate is small. We find stronger evidence that schizophrenia risk predicts cannabis initiation, possibly because the genetic instruments for schizophrenia are stronger than those for cannabis initiation.
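
A standard two-sample MR estimator of the kind used in such studies is the inverse-variance-weighted (IVW) average of per-SNP Wald ratios. The sketch below uses invented summary statistics, not ICC/PGC2 values.

```python
import math

# Illustrative two-sample MR: IVW average of per-SNP Wald ratios
# (beta_outcome / beta_exposure). The summary statistics are made up.
snps = [  # (beta_exposure, beta_outcome, se_outcome)
    (0.10, 0.004, 0.002),
    (0.08, 0.003, 0.002),
    (0.12, 0.005, 0.003),
]

def ivw_estimate(snps):
    num = den = 0.0
    for b_x, b_y, se_y in snps:
        wald = b_y / b_x              # per-SNP causal estimate
        se_wald = se_y / abs(b_x)     # first-order SE of the ratio
        weight = 1.0 / se_wald ** 2   # inverse-variance weight
        num += weight * wald
        den += weight
    return num / den, math.sqrt(1.0 / den)

est, se = ivw_estimate(snps)
print(round(est, 4), round(se, 4))
```

Bi-directional MR, as in the cited study, simply repeats this with the exposure and outcome roles swapped, using instruments for the other trait.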

  10. Multiple-image authentication with a cascaded multilevel architecture based on amplitude field random sampling and phase information multiplexing.

    Science.gov (United States)

    Fan, Desheng; Meng, Xiangfeng; Wang, Yurong; Yang, Xiulun; Pan, Xuemei; Peng, Xiang; He, Wenqi; Dong, Guoyan; Chen, Hongyi

    2015-04-10

    A multiple-image authentication method with a cascaded multilevel architecture in the Fresnel domain is proposed, in which a synthetic encoded complex amplitude is first fabricated, and its real amplitude component is generated by iterative amplitude encoding, random sampling, and space multiplexing for the low-level certification images, while the phase component of the synthetic encoded complex amplitude is constructed by iterative phase information encoding and multiplexing for the high-level certification images. Then the synthetic encoded complex amplitude is iteratively encoded into two phase-type ciphertexts located in two different transform planes. During high-level authentication, when the two phase-type ciphertexts and the high-level decryption key are presented to the system and then the Fresnel transform is carried out, a meaningful image with good quality and a high correlation coefficient with the original certification image can be recovered in the output plane. Similar to the procedure of high-level authentication, in the case of low-level authentication with the aid of a low-level decryption key, no significant or meaningful information is retrieved, but it can result in a remarkable peak output in the nonlinear correlation coefficient of the output image and the corresponding original certification image. Therefore, the method realizes different levels of accessibility to the original certification image for different authority levels with the same cascaded multilevel architecture.

  11. Mental health impact of the 2010 Haiti earthquake on the Miami Haitian population: A random-sample survey.

    Science.gov (United States)

    Messiah, Antoine; Acuna, Juan M; Castro, Grettel; de la Vega, Pura Rodríguez; Vaiva, Guillaume; Shultz, James; Neria, Yuval; De La Rosa, Mario

    2014-07-01

    This study examined the mental health consequences of the January 2010 Haiti earthquake on Haitians living in Miami-Dade County, Florida, 2-3 years following the event. A random-sample household survey was conducted from October 2011 through December 2012 in Miami-Dade County, Florida. Haitian participants (N = 421) were assessed for their earthquake exposure and its impact on family, friends, and household finances; and for symptoms of posttraumatic stress disorder (PTSD), anxiety, and major depression; using standardized screening measures and thresholds. Exposure was considered as "direct" if the interviewee was in Haiti during the earthquake. Exposure was classified as "indirect" if the interviewee was not in Haiti during the earthquake but (1) family members or close friends were victims of the earthquake, and/or (2) family members were hosted in the respondent's household, and/or (3) assets or jobs were lost because of the earthquake. Interviewees who did not qualify for either direct or indirect exposure were designated as "lower" exposure. Eight percent of respondents qualified for direct exposure, and 63% qualified for indirect exposure. Among those with direct exposure, 19% exceeded threshold for PTSD, 36% for anxiety, and 45% for depression. Corresponding percentages were 9%, 22% and 24% for respondents with indirect exposure, and 6%, 14%, and 10% for those with lower exposure. A majority of Miami Haitians were directly or indirectly exposed to the earthquake. Mental health distress among them remains considerable two to three years post-earthquake.
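
The three-level exposure classification described above amounts to a short decision rule. A minimal sketch, with hypothetical field names standing in for the survey variables:

```python
# Hedged sketch of the exposure classification: "direct" if the respondent
# was in Haiti during the earthquake, "indirect" if any of the three listed
# criteria hold, otherwise "lower". Field names are hypothetical.
def classify_exposure(r):
    if r["in_haiti_during_quake"]:
        return "direct"
    if (r["family_or_friends_victims"]
            or r["hosted_family_member"]
            or r["lost_assets_or_job"]):
        return "indirect"
    return "lower"

respondent = {"in_haiti_during_quake": False,
              "family_or_friends_victims": True,
              "hosted_family_member": False,
              "lost_assets_or_job": False}
print(classify_exposure(respondent))  # → indirect
```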

  12. Mental Health Impact of Hosting Disaster Refugees: Analyses from a Random Sample Survey Among Haitians Living in Miami.

    Science.gov (United States)

    Messiah, Antoine; Lacoste, Jérôme; Gokalsing, Erick; Shultz, James M; Rodríguez de la Vega, Pura; Castro, Grettel; Acuna, Juan M

    2016-08-01

    Studies on the mental health of families hosting disaster refugees are lacking. This study compares participants in households that hosted 2010 Haitian earthquake disaster refugees with their nonhost counterparts. A random sample survey was conducted from October 2011 through December 2012 in Miami-Dade County, Florida. Haitian participants were assessed regarding their 2010 earthquake exposure and impact on family and friends and whether they hosted earthquake refugees. Using standardized scores and thresholds, they were evaluated for symptoms of three common mental disorders (CMDs): posttraumatic stress disorder, generalized anxiety disorder, and major depressive disorder (MDD). Participants who hosted refugees (n = 51) had significantly higher percentages of scores beyond thresholds for MDD than those who did not host refugees (n = 365) and for at least one CMD, after adjusting for participants' earthquake exposures and effects on family and friends. Hosting refugees from a natural disaster appears to elevate the risk for MDD and possibly other CMDs, independent of risks posed by exposure to the disaster itself. Families hosting refugees deserve special attention.

  13. Randomization modeling to ascertain clustering patterns of human papillomavirus types detected in cervicovaginal samples in the United States.

    Directory of Open Access Journals (Sweden)

    Troy David Querec

    Full Text Available Detection of multiple human papillomavirus (HPV) types in the genital tract is common. Associations among HPV types may impact HPV vaccination modeling and type replacement. The objectives were to determine the distribution of concurrent HPV type infections in cervicovaginal samples and examine type-specific associations. We analyzed HPV genotyping results from 32,245 cervicovaginal specimens collected from women aged 11 to 83 years in the United States from 2001 through 2011. Statistical power was enhanced by combining 6 separate studies. Expected concurrent infection frequencies from a series of permutation models, each with increasing fidelity to the real data, were compared with the observed data. Statistics were computed based on the distributional properties of the randomized data. Concurrent detection occurred more than expected with 0 or ≥3 HPV types and less than expected with 1 and 2 types. Some women bear a disproportionate burden of HPV type prevalence. Type associations were observed that exceeded multiple-hypothesis-corrected significance. Multiple HPV types were detected more frequently than expected by chance, and associations among particular HPV types were detected. However, vaccine-targeted types were not specifically affected, supporting the expectation that current bivalent/quadrivalent HPV vaccination will not result in type replacement with other high-risk types.
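
The permutation logic described here can be sketched by shuffling detected types across samples while preserving each sample's number of detected types, then comparing observed pairwise co-detections with the permuted expectation. The data below are toy, and this naive shuffle can place the same type twice in one permuted sample, which a production model would prevent.

```python
import random
from collections import Counter

# Toy permutation model for co-detection of HPV types across samples.
random.seed(2)
TYPES = ["HPV16", "HPV18", "HPV31", "HPV52"]
samples = [random.sample(TYPES, random.choice([0, 1, 1, 2, 3]))
           for _ in range(300)]

def pair_counts(samples):
    counts = Counter()
    for s in samples:
        for i in range(len(s)):
            for j in range(i + 1, len(s)):
                counts[tuple(sorted((s[i], s[j])))] += 1
    return counts

observed = pair_counts(samples)

def permuted_expectation(samples, n_perm=50):
    pool = [t for s in samples for t in s]   # all detections, sample labels dropped
    sizes = [len(s) for s in samples]        # per-sample type counts are preserved
    total = Counter()
    for _ in range(n_perm):
        random.shuffle(pool)
        it = iter(pool)
        total += pair_counts([[next(it) for _ in range(k)] for k in sizes])
    return {pair: n / n_perm for pair, n in total.items()}

expected = permuted_expectation(samples)
pair = ("HPV16", "HPV18")
print(observed[pair], round(expected.get(pair, 0.0), 1))
```

An observed count far above its permuted expectation would suggest a positive association between the two types.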

  14. The use of a lot quality assurance sampling methodology to assess and manage primary health interventions in conflict-affected West Darfur, Sudan.

    Science.gov (United States)

    Pham, Kiemanh; Sharpe, Emily Chambers; Weiss, William M; Vu, Alexander

    2016-01-01

    Organizations working in conflict-affected areas have a need to monitor and evaluate their programs; however, this is often difficult due to the logistical challenges of conflict areas. Lot quality assurance sampling may be a suitable method of assessing programs in these situations. We conducted a secondary data analysis of information collected during Medair's routine program management functions. Medair's service area in West Darfur, Sudan was divided into seven supervisory areas. Using the available population information, a sampling frame was developed and interviews were conducted with randomly selected caretakers of children in each supervisory area every six months over 19 months. A survey instrument with questions related to key indicators for immunizations and maternal, newborn, and child health was used for the interviews. Based on Medair's goals for each indicator, decision rules were calculated for the indicators; these decision rules determined which supervisory areas and indicators performed adequately in each assessment period. Pearson's chi-squared tests, adjusted for the survey design using STATA "svy: tab" commands, were used to detect overall differences in coverage in this analysis. The coverage of tetanus toxoid vaccination among pregnant women increased from 47.2 to 69.7 % (p value = 0.046), and births attended by a skilled health professional increased from 35.7 to 52.7 % (p value = 0.025) from the first to last assessment periods. Measles vaccinations declined from 72.0 to 54.1 % (p value = 0.046). The estimated coverage for the proportion of women receiving a postpartum dose of vitamin A (54.7 to 61.3 %, p value = 0.44); pregnant women receiving a clean delivery kit (54.6 to 47.1 %, p value = 0.49); and pentavalent vaccinations (49.7 to 42.1 %, p value = 0.28) did not significantly change. Lot quality assurance sampling was a feasible method for Medair staff to evaluate and optimize primary health programs in
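
An LQAS decision rule of the kind mentioned above can be derived from the binomial distribution: for lot sample size n and coverage target p, pick the largest decision value d such that an area truly meeting the target fails (has ≤ d successes) with probability at most alpha. The n = 19 lot size and 80% target below are illustrative, not Medair's actual parameters.

```python
from math import comb

# Hedged sketch of deriving an LQAS decision rule from the binomial CDF.
def binom_cdf(k, n, p):
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def choose_d(n, target, alpha=0.10):
    d = -1
    for cand in range(n + 1):
        if binom_cdf(cand, n, target) <= alpha:
            d = cand  # CDF is increasing, so the last hit is the largest d
    return d

d = choose_d(19, target=0.80)
print(d)  # an area "passes" if more than d of the 19 respondents are covered
```

With these illustrative numbers the rule comes out to "pass if at least 13 of 19 respondents report coverage", matching the shape of standard LQAS tables.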

  15. [Acupuncture and moxibustion for peripheral facial palsy at different stages: multi-central large-sample randomized controlled trial].

    Science.gov (United States)

    Li, Ying; Li, Yan; Liu, Li-an; Zhao, Ling; Hu, Ka-ming; Wu, Xi; Chen, Xiao-qin; Li, Gui-ping; Mang, Ling-ling; Qi, Qi-hua

    2011-04-01

    To explore the best intervention time of acupuncture and moxibustion for peripheral facial palsy (Bell's palsy) and the clinically advantageous program of selective treatment with acupuncture and moxibustion. A multi-central large-sample randomized controlled trial was carried out. Nine hundred cases of Bell's palsy were randomized into 5 treatment groups, named selective filiform needle group (group A), selective acupuncture + moxibustion group (group B), selective acupuncture + electroacupuncture group (group C), selective acupuncture + line-up needling on muscle region of meridian group (group D) and non-selective filiform needle group (group E). Four sessions of treatment were required in each group. Separately, at enrollment, after 4 sessions of treatment, and at 1 and 3 months of follow-up after treatment, the House-Brackmann Scale, Facial Disability Index Scale and Degree of Facial Nerve Paralysis (NFNP) were adopted for efficacy assessment. Efficacy was also analyzed systematically in view of the intervention time and the nerve localization of the disease. The curative rates of intervention in acute stage and resting stage were 50.1% (223/445) and 52.1% (162/311), respectively, which were superior to recovery stage (25.9%, 35/135). There were no statistically significant differences in efficacy among the 5 treatment programs at the same stage (all P > 0.05). The efficacy of intervention of group A and group E in acute stage was superior to that in recovery stage (both P < 0.01). The difference was statistically significant between the efficacy on the localization above chorda tympani nerve and that on the localization below the nerve in group D (P < 0.01). The efficacy on the localization below chorda tympani nerve was superior to that on the localization above the nerve. The best intervention time for the treatment of Bell's palsy is in the acute stage and resting stage, meaning 1 to 3 weeks after occurrence.
All of the 5 treatment programs are advantageous

  16. Differentiating intraprofessional attitudes toward paradigms in health care delivery among chiropractic factions: results from a randomly sampled survey

    Science.gov (United States)

    2014-01-01

    Background As health care has increased in complexity and health care teams have been offered as a solution, so too is there an increased need for stronger interprofessional collaboration. However the intraprofessional factions that exist within every profession challenge interprofessional communication through contrary paradigms. As a contender in the conservative spinal health care market, factions within chiropractic that result in unorthodox practice behaviours may compromise interprofessional relations and that profession’s progress toward institutionalization. The purpose of this investigation was to quantify the professional stratification among Canadian chiropractic practitioners and evaluate the practice perceptions of those factions. Methods A stratified random sample of 740 Canadian chiropractors was surveyed to determine faction membership and how professional stratification could be related to views that could be considered unorthodox to current evidence-based care and guidelines. Stratification in practice behaviours is a stated concern of mainstream medicine when considering interprofessional referrals. Results Of 740 deliverable questionnaires, 503 were returned for a response rate of 68%. Less than 20% of chiropractors (18.8%) were aligned with a predefined unorthodox perspective of the conditions they treat. Prediction models suggest that unorthodox perceptions of health practice related to treatment choices, x-ray use and vaccinations were strongly associated with unorthodox group membership (χ2 = 13.4, p = 0.0002). Conclusion Chiropractors holding unorthodox views may be identified based on response to specific beliefs that appear to align with unorthodox health practices. Despite continued concerns by mainstream medicine, only a minority of the profession has retained a perspective in contrast to current scientific paradigms. Understanding the profession’s factions is important to the anticipation of care delivery when considering
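
The chi-squared statistic reported above is a Pearson test of independence on a contingency table. A from-scratch sketch, using invented 2x2 cell counts rather than the survey's actual data:

```python
# Hedged sketch of a Pearson chi-squared test of independence. The cell
# counts (unorthodox belief yes/no by hypothetical faction) are invented.
def chi_squared(table):
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    n = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / n  # expected count under independence
            stat += (obs - exp) ** 2 / exp
    return stat

table = [[60, 340],
         [35, 68]]
print(round(chi_squared(table), 2))
```

With 1 degree of freedom for a 2x2 table, a statistic of this size corresponds to a very small p-value, in the same spirit as the reported result.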

  17. Methodology for the development of a drug library based upon collision-induced fragmentation for the identification of toxicologically relevant drugs in plasma samples.

    Science.gov (United States)

    Lips, A G; Lameijer, W; Fokkens, R H; Nibbering, N M

    2001-08-15

    The possibility of creating a robust mass spectral library with use of high-performance liquid chromatography-atmospheric pressure-electrospray ionization (HPLC-AP-ESI) for the identification of drugs misused in cases of clinical toxicology has been examined. Factors reported as influencing the fragmentation induced by "source transport region collision induced dissociation" (CID) have been tested in this study (i.e. solvent, pH, different acids or buffer salts and their concentration, different organic modifiers and the modifier concentration). The tests performed on a few "model drugs" were analysed with use of two different single quadrupole instruments. The large number of mass spectra obtained appears to be affected by the mobile phase conditions to only a minor extent. This also holds for the mass spectra obtained at two different instruments (laboratories). Subsequently breakdown curves have been measured for about 20 randomly chosen drugs by variation of the kinetic energy of their ions in the CID zone through changing the fragmenter voltage. These breakdown curves were used to optimize the fragmenter voltage for each drug. The optimized fragmenter voltages were then applied by use of a variably ramped fragmenter voltage to acquire mass spectra for the library. The chromatographic separations were run on a Zorbax Stable bond column using a 10-mM ammonium formate-acetonitrile gradient method. Spiked blank serum and patient samples with a total of 40 different drugs were extracted with use of a standard basic liquid-liquid extraction (LLE) method. A search of significant peaks in the chromatogram by application of the developed mass spectral library is shown to result in a more than 95% positive identification.

  18. Distribution of peak expiratory flow variability by age, gender and smoking habits in a random population sample aged 20-70 yrs

    NARCIS (Netherlands)

    Boezen, H M; Schouten, J. P.; Postma, D S; Rijcken, B

    1994-01-01

    Peak expiratory flow (PEF) variability can be considered as an index of bronchial lability. Population studies on PEF variability are few. The purpose of the current paper is to describe the distribution of PEF variability in a random population sample of adults with a wide age range (20-70 yrs),

  19. From Planning to Implementation: An Examination of Changes in the Research Design, Sample Size, and Precision of Group Randomized Trials Launched by the Institute of Education Sciences

    Science.gov (United States)

    Spybrook, Jessaca; Puente, Anne Cullen; Lininger, Monica

    2013-01-01

    This article examines changes in the research design, sample size, and precision between the planning phase and implementation phase of group randomized trials (GRTs) funded by the Institute of Education Sciences. Thirty-eight GRTs funded between 2002 and 2006 were examined. Three studies revealed changes in the experimental design. Ten studies…

  20. Evaluating effectiveness of down-sampling for stratified designs and unbalanced prevalence in Random Forest models of tree species distributions in Nevada

    Science.gov (United States)

    Elizabeth A. Freeman; Gretchen G. Moisen; Tracy S. Frescino

    2012-01-01

    Random Forests is frequently used to model species distributions over large geographic areas. Complications arise when data used to train the models have been collected in stratified designs that involve different sampling intensity per stratum. The modeling process is further complicated if some of the target species are relatively rare on the landscape leading to an...
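The down-sampling idea evaluated in this record can be sketched in a few lines: before training, the majority class (usually absences of a rare species) is randomly thinned to match the minority class. This is a minimal illustration, not the authors' code; `downsample_majority` and the plot data are hypothetical:

```python
import random

def downsample_majority(records, label_key="present", seed=42):
    """Balance a binary training set by down-sampling the majority class."""
    rng = random.Random(seed)
    present = [r for r in records if r[label_key]]
    absent = [r for r in records if not r[label_key]]
    majority, minority = (absent, present) if len(absent) > len(present) else (present, absent)
    balanced = minority + rng.sample(majority, len(minority))
    rng.shuffle(balanced)
    return balanced

# A rare species observed on 3 of 12 field plots:
plots = [{"plot": i, "present": i < 3} for i in range(12)]
balanced = downsample_majority(plots)
print(len(balanced))  # 6 (3 presences + 3 down-sampled absences)
```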

  1. Changes in prevalence of, and risk factors for, lameness in random samples of English sheep flocks: 2004-2013.

    Science.gov (United States)

    Winter, Joanne R; Kaler, Jasmeet; Ferguson, Eamonn; KilBride, Amy L; Green, Laura E

    2015-11-01

    The aims of this study were to update the prevalence of lameness in sheep in England and identify novel risk factors. A total of 1260 sheep farmers responded to a postal survey. The survey captured detailed information on the period prevalence of lameness from May 2012-April 2013 and the prevalence and farmer naming of lesions attributable to interdigital dermatitis (ID), severe footrot (SFR), contagious ovine digital dermatitis (CODD) and shelly hoof (SH), management and treatment of lameness, and farm and flock details. The global mean period prevalence of lameness fell between 2004 and 2013 from 10.6% to 4.9% and the geometric mean period prevalence of lameness fell from 5.4% (95% CI: 4.7%-6.0%) to 3.5% (95% CI: 3.3%-3.7%). In 2013, more farmers were using vaccination and antibiotic treatment for ID and SFR and fewer farmers were using foot trimming as a routine or therapeutic treatment than in 2004. Two over-dispersed Poisson regression models were developed with the outcome the period prevalence of lameness, one investigated associations with farmer estimates of prevalence of the four foot lesions and one investigated associations with management practices to control and treat lameness and footrot. Prevalences of ID>10%, SFR>2.5% and CODD>2.5% were associated with a higher prevalence of lameness compared with those lesions being absent; the prevalence of SH, however, was not associated with a change in risk of lameness. A key novel management risk associated with higher prevalence of lameness was the rate of feet bleeding/100 ewes trimmed/year. In addition, vaccination of ewes once per year and selecting breeding replacements from never-lame ewes were associated with a decreased risk of lameness. 
Other factors associated with a lower risk of lameness for the first time in a random sample of farmers and a full risk model were: recognising lameness in sheep at locomotion score 1 compared with higher scores, treatment of the first lame sheep in a group compared
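Geometric mean prevalences with confidence intervals, as reported above, are conventionally computed on the log scale and back-transformed. A minimal sketch, using illustrative flock prevalences rather than the survey data:

```python
import math
import statistics

def geometric_mean_ci(prevalences, z=1.96):
    """Geometric mean of flock-level prevalences (%) with a normal-theory
    95% CI computed on the log scale and back-transformed."""
    logs = [math.log(p) for p in prevalences]
    mean_log = statistics.mean(logs)
    se = statistics.stdev(logs) / math.sqrt(len(logs))
    return (math.exp(mean_log),
            math.exp(mean_log - z * se),
            math.exp(mean_log + z * se))

gm, lo, hi = geometric_mean_ci([2.0, 3.5, 4.1, 5.0, 2.8, 3.9])
print(round(gm, 2), round(lo, 2), round(hi, 2))
```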

  2. Nutritional status and falls in community-dwelling older people: a longitudinal study of a population-based random sample.

    Science.gov (United States)

    Chien, Ming-Hung; Guo, How-Ran

    2014-01-01

    Falls are common in older people and may lead to functional decline, disability, and death. Many risk factors have been identified, but studies evaluating effects of nutritional status are limited. To determine whether nutritional status is a predictor of falls in older people living in the community, we analyzed data collected through the Survey of Health and Living Status of the Elderly in Taiwan (SHLSET). SHLSET includes a series of interview surveys conducted by the government on a random sample of people living in community dwellings in the nation. We included participants who received nutritional status assessment using the Mini Nutritional Assessment Taiwan Version 2 (MNA-T2) in the 1999 survey when they were 53 years or older and followed up on the cumulative incidence of falls in the one-year period before the interview in the 2003 survey. At the beginning of follow-up, the 4440 participants had a mean age of 69.5 (standard deviation = 9.1) years, and 467 participants were "not well-nourished," which was defined as having an MNA-T2 score of 23 or less. In the one-year study period, 659 participants reported having at least one fall. After adjusting for other risk factors, we found the associated odds ratio for falls was 1.73 (95% confidence interval, 1.23, 2.42) for "not well-nourished," 1.57 (1.30, 1.90) for female gender, 1.03 (1.02, 1.04) per one-year increase in age, 1.55 (1.22, 1.98) for history of falls, 1.34 (1.05, 1.72) for hospital stay during the past 12 months, 1.66 (1.07, 2.58) for difficulties in activities of daily living, and 1.53 (1.23, 1.91) for difficulties in instrumental activities of daily living. Nutritional status is an independent predictor of falls in older people living in the community. Further studies are warranted to identify nutritional interventions that can help prevent falls in the elderly.
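An unadjusted odds ratio of the kind reported above can be computed from a 2x2 table with a Woolf (log-scale) confidence interval. The event split below is hypothetical (only the group sizes and total faller count match the abstract), and the unadjusted estimate will differ from the adjusted 1.73:

```python
import math

def odds_ratio_ci(exp_events, exp_total, unexp_events, unexp_total, z=1.96):
    """Unadjusted odds ratio with a Woolf (log-scale) 95% CI from a 2x2 table."""
    a, b = exp_events, exp_total - exp_events
    c, d = unexp_events, unexp_total - unexp_events
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, math.exp(math.log(or_) - z * se), math.exp(math.log(or_) + z * se)

# Hypothetical fall counts among the 467 "not well-nourished" vs 3973 others:
or_, lo, hi = odds_ratio_ci(110, 467, 549, 3973)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```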

  3. Innovative Methodologies for 21st Century Learning, Teaching and Assessment: A Convenience Sampling Investigation into the Use of Social Media Technologies in Higher Education

    Science.gov (United States)

    Kivunja, Charles

    2015-01-01

    The advent of the Web as a social technology has created opportunities for the creation of informal learning environments, which have potential for innovative methodologies in learning, teaching and assessment. However, as Wolfe (2001) admonishes, "contrary to the rhetoric of cheerleaders, the Web places greater demands on students than…

  4. Stages of change of the readiness to quit smoking among a random sample of minority Arab male smokers in Israel.

    Science.gov (United States)

    Daoud, Nihaya; Hayek, Samah; Sheikh Muhammad, Ahmad; Abu-Saad, Kathleen; Osman, Amira; Thrasher, James F; Kalter-Leibovici, Ofra

    2015-07-16

    Despite advanced smoking prevention and cessation policies in many countries, the prevalence of cigarette smoking among indigenous and some ethnic minorities continues to be high. This study examined the stages of change (SOC) of the readiness to quit smoking among Arab men in Israel shortly after new regulations of free-of-charge smoking cessation workshops and subsidized medications were introduced through primary health care clinics. We conducted a countrywide study in Israel in 2012-2013. Participants (735 current smokers, 18-64 years old) were recruited from a stratified random sample and interviewed face-to-face using a structured questionnaire in Arabic. We used ordered regression to examine the contribution of socio-economic position (SEP), health status, psychosocial attributes, smoking-related factors, and physician advice to the SOC of the readiness to quit smoking (pre-contemplation, contemplation and preparation). Of the current smokers, 61.8% were at the pre-contemplation stage, 23.8% were at the contemplation stage, and only 14.4% were at the preparation stage. In the multinomial analysis, factors significantly (P < 0.05) associated with the contemplation stage compared to the pre-contemplation stage included [odds ratio (OR), 95% confidence interval (CI)]: chronic morbidity [0.52, (0.31-0.88)], social support [1.35, (1.07-1.70)], duration of smoking for 11-21 years [1.94, (1.07-3.50)], three or more previous attempts to quit [2.27, (1.26-4.01)], knowledge about smoking hazards [1.75, (1.29-2.35)], positive attitudes toward smoking prevention [1.44, (1.14-1.82)], and physician advice to quit smoking [1.88, (1.19-2.97)]. The factors significantly (P < 0.05) associated with the preparation stage compared to the pre-contemplation stage were [OR (95% CI)]: chronic morbidity [0.36, (0.20-0.67)], anxiety [1.07, (1.01-1.13)], social support [1.34, (1.01-1.78)], duration of smoking 5 years or less [2.93, (1.14-7.52)], three or more previous attempts to quit [3.16, (1.60-6.26)], knowledge about smoking hazards [1.57, (1.10-2.21)], and

  5. The Accuracy of Pass/Fail Decisions in Random and Difficulty-Balanced Domain-Sampling Tests.

    Science.gov (United States)

    Schnipke, Deborah L.

    A common practice in some certification fields (e.g., information technology) is to draw items from an item pool randomly and apply a common passing score, regardless of the items administered. Because these tests are commonly used, it is important to determine how accurate the pass/fail decisions are for such tests and whether fairly small,…

  6. Tourism Methodologies

    DEFF Research Database (Denmark)

    This volume offers methodological discussions within the multidisciplinary field of tourism and shows how tourism researchers develop and apply new tourism methodologies. The book is presented as an anthology, giving voice to many diverse researchers who reflect on tourism methodology in different...... codings and analysis, and tapping into the global network of social media...... in interview and field work situations, and how do we engage with the performative aspects of tourism as a field of study? The book acknowledges that research is also performance and that it constitutes an aspect of intervention in the situations and contexts it is trying to explore. This is an issue dealt...

  7. Tourism Methodologies

    DEFF Research Database (Denmark)

    This volume offers methodological discussions within the multidisciplinary field of tourism and shows how tourism researchers develop and apply new tourism methodologies. The book is presented as an anthology, giving voice to many diverse researchers who reflect on tourism methodology in different...... ways. Several contributions draw on a critical perspective that pushes the boundaries of traditional methods and techniques for studying tourists and their experiences. In particular, the traditional qualitative interview is challenged, not only regarding the typical questions asked, but also regarding...

  8. Comparing cluster-level dynamic treatment regimens using sequential, multiple assignment, randomized trials: Regression estimation and sample size considerations

    OpenAIRE

    NeCamp, Timothy; Kilbourne, Amy; Almirall, Daniel

    2016-01-01

    Cluster-level dynamic treatment regimens can be used to guide sequential intervention or treatment decision-making at the cluster level in order to improve outcomes at the individual or patient level. In a cluster-level DTR, the intervention or treatment is potentially adapted and re-adapted over time based on changes in the cluster that could be impacted by prior intervention, including aggregate measures of the individuals or patients that comprise it. Cluster-randomized sequentia...

  9. Approximating the variance of estimated means for systematic random sampling, illustrated with data of the French Soil Monitoring Network

    NARCIS (Netherlands)

    Brus, D.J.; Saby, N.P.A.

    2016-01-01

    In France like in many other countries, the soil is monitored at the locations of a regular, square grid thus forming a systematic sample (SY). This sampling design leads to good spatial coverage, enhancing the precision of design-based estimates of spatial means and totals. Design-based
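A common baseline for the problem this record addresses is to treat the systematic grid sample as if it were a simple random sample. The paper's actual approximations are more refined; this sketch, on hypothetical topsoil values, just shows the naive estimator:

```python
import statistics

def srs_variance_of_mean(values):
    """Approximate V(mean) for a systematic grid sample by treating it as
    simple random sampling: s^2 / n. This ignores spatial structure, so it
    is typically conservative (too large) for well-spread systematic samples."""
    return statistics.variance(values) / len(values)

soc = [12.1, 14.3, 11.8, 13.5, 12.9, 15.0, 13.2, 12.4]  # hypothetical values, g/kg
print(round(srs_variance_of_mean(soc), 4))
```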

  10. Endothelial nitric oxide synthase (G894T) gene polymorphism in a random sample of the Egyptian population: comparison with myocardial infarction patients.

    Science.gov (United States)

    Gad, Mohamed Z; Abdel Rahman, Mohamed F; Hashad, Ingy M; Abdel-Maksoud, Sahar M; Farag, Nabil M; Abou-Aisha, Khaled

    2012-07-01

    The aim of this study was to detect endothelial nitric oxide synthase (eNOS) Glu298Asp gene variants in a random sample of the Egyptian population, compare it with those from other populations, and attempt to correlate these variants with serum levels of nitric oxide (NO). The association of eNOS genotypes or serum NO levels with the incidence of acute myocardial infarction (AMI) was also examined. One hundred one unrelated healthy subjects and 104 unrelated AMI patients were recruited randomly from the 57357 Hospital and intensive care units of El Demerdash Hospital and National Heart Institute, Cairo, Egypt. eNOS genotypes were determined by polymerase chain reaction-restriction fragment length polymorphism. Serum NO was determined spectrophotometrically. The genotype distribution of eNOS Glu298Asp polymorphism determined for our sample was 58.42% GG (wild type), 33.66% GT, and 7.92% TT genotypes while allele frequencies were 75.25% and 24.75% for G and T alleles, respectively. No significant association between serum NO and specific eNOS genotype could be detected. No significant correlation between eNOS genotype distribution or allele frequencies and the incidence of AMI was observed. The present study demonstrated the predominance of the homozygous genotype GG over the heterozygous GT and homozygous TT in a random sample of the Egyptian population. It also showed the lack of association between eNOS genotypes and mean serum levels of NO, as well as the incidence of AMI.
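The reported allele frequencies follow from the genotype counts by gene counting (each subject contributes two alleles). Reconstructing the counts from the stated percentages of 101 subjects reproduces the reported 75.25% / 24.75%:

```python
def allele_frequencies(n_gg, n_gt, n_tt):
    """Allele frequencies for a biallelic polymorphism by gene counting:
    each subject contributes two alleles, heterozygotes one of each."""
    total = 2 * (n_gg + n_gt + n_tt)
    g = (2 * n_gg + n_gt) / total
    return g, 1.0 - g

# Genotype counts reconstructed from the reported percentages of 101 subjects:
g, t = allele_frequencies(59, 34, 8)   # 58.42% GG, 33.66% GT, 7.92% TT
print(round(g * 100, 2), round(t * 100, 2))  # 75.25 24.75, matching the abstract
```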

  11. Electroanalytical Methodology for the Direct Determination of 2,4-Dichlorophenoxyacetic Acid in Soil Samples Using a Graphite-Polyurethane Electrode

    Directory of Open Access Journals (Sweden)

    Fernanda Ramos de Andrade

    2014-01-01

    An electroanalytical methodology was developed for the direct determination of the herbicide 2,4-dichlorophenoxyacetic acid (2,4-D) using a graphite-polyurethane composite electrode and square wave voltammetry (SWV). 2,4-D exhibited one reduction peak with characteristics of an irreversible process at −0.54 V (versus Ag/AgCl), which is controlled by the diffusion of the reagent on the electrode surface. After optimization of the experimental parameters (pH 2.0, f=50 s−1, a=0.50 V, and ΔEi=0.03 V), analytical curves were constructed in the range of 0.66 mg L−1 to 2.62 mg L−1. Detection (LD) and quantification (LQ) limits were 17.6 μg L−1 and 58.6 μg L−1, respectively. The methodology was successfully applied to measure the percolation of the herbicide 2,4-D in undisturbed soil columns of different granulometric compositions.
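The abstract does not state how the LD and LQ values were derived; a common convention is the IUPAC-style 3.3σ/S and 10σ/S from the calibration curve. The sketch below uses that convention with hypothetical σ and slope values:

```python
def detection_limits(sigma, slope):
    """IUPAC-style limits from calibration data: LD = 3.3*sigma/S and
    LQ = 10*sigma/S, where sigma is the SD of the blank (or of calibration
    residuals) and S is the calibration slope."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Hypothetical calibration: sigma = 0.8 signal units, slope = 150 per mg/L
ld, lq = detection_limits(0.8, 150.0)
print(round(ld * 1000, 1), round(lq * 1000, 1))  # 17.6 53.3 (in ug/L)
```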

  12. Methodological ignorance : A comment on field experiments and methodological intolerance

    NARCIS (Netherlands)

    Boumans, Marcel|info:eu-repo/dai/nl/075268574

    2016-01-01

    Glenn Harrison [Journal of Economic Methodology, 2013, 20, 103–117] discusses four related forms of methodological intolerance with respect to field experiments: field experiments should rely on some form of randomization, should be disconnected from theory, the concept of causality should only be

  13. Methodology for the detection of contamination by hydrocarbons and further soil sampling for volatile and semi-volatile organic enrichment in former petrol stations, SE Spain

    Directory of Open Access Journals (Sweden)

    Rosa María Rosales Aranda

    2012-01-01

    The optimal detection and quantification of contamination plumes in soil and groundwater by petroleum organic compounds (gasoline and diesel) is critical for the reclamation of hydrocarbon-contaminated soil at petrol stations. This study optimized the sampling stage in such scenarios by locating potential contamination areas before sampling, applying the 2D electrical resistivity tomography method, a non-destructive geophysical technique based on resistivity measurements in soils. After detection of the hydrocarbon-contaminated areas, boreholes with continuous coring were drilled at a petrol station located in the Murcia Region (Spain). The drillholes reached depths down to 10 m, and soil samples were taken from each metre of the drilling. Handling and storage of the soil samples, for both volatile and semi-volatile organic compound determinations, were optimized by designing a soil sampler that minimizes volatilization losses and avoids manual contact with the environmental samples during sampling. The soil samples were preserved in two kinds of glass vials according to European regulations and US Environmental Protection Agency recommendations. The determination techniques used to quantify the hydrocarbon pollution were based on gas chromatography with different detectors, with the headspace technique used to reach a liquid-gas equilibrium for volatile analyses.

  14. Design and methodology of the NorthStar Study: NT-proBNP stratified follow-up in outpatient heart failure clinics -- a randomized Danish multicenter study

    DEFF Research Database (Denmark)

    Schou, Morten; Gustafsson, Finn; Videbaek, Lars

    2008-01-01

    from 2006 to 2009. At present (March 2008), 720 patients are randomized. Results are expected to be presented in the second half of 2010. CONCLUSIONS: This article outlines the design of the NorthStar study. If our hypotheses are confirmed, the results will help cardiologists and nurses in HFCs to identify...

  15. Rationale, design and methodology of a double-blind, randomized, placebo-controlled study of escitalopram in prevention of Depression in Acute Coronary Syndrome (DECARD)

    DEFF Research Database (Denmark)

    Hansen, Baiba Hedegaard; Hanash, Jamal Abed; Rasmussen, Alice

    2009-01-01

    with acute coronary syndrome. METHODS: Two hundred forty non-depressed patients with acute coronary syndrome are randomized to treatment with either escitalopram or placebo for 1 year. Psychiatric and cardiac assessment of patients is performed to evaluate the possibility of preventing depression. Diagnosis...

  16. Differentiating emotions across contexts: comparing adults with and without social anxiety disorder using random, social interaction, and daily experience sampling.

    Science.gov (United States)

    Kashdan, Todd B; Farmer, Antonina S

    2014-06-01

    The ability to recognize and label emotional experiences has been associated with well-being and adaptive functioning. This skill is particularly important in social situations, as emotions provide information about the state of relationships and help guide interpersonal decisions, such as whether to disclose personal information. Given the interpersonal difficulties linked to social anxiety disorder (SAD), deficient negative emotion differentiation may contribute to impairment in this population. We hypothesized that people with SAD would exhibit less negative emotion differentiation in daily life, and these differences would translate to impairment in social functioning. We recruited 43 people diagnosed with generalized SAD and 43 healthy adults to describe the emotions they experienced over 14 days. Participants received palmtop computers for responding to random prompts and describing naturalistic social interactions; to complete end-of-day diary entries, they used a secure online website. We calculated intraclass correlation coefficients to capture the degree of differentiation of negative and positive emotions for each context (random moments, face-to-face social interactions, and end-of-day reflections). Compared to healthy controls, the SAD group exhibited less negative (but not positive) emotion differentiation during random prompts, social interactions, and (at trend level) end-of-day assessments. These differences could not be explained by emotion intensity or variability over the 14 days, or by comorbid depression or anxiety disorders. Our findings suggest that people with generalized SAD have deficits in clarifying specific negative emotions felt at a given point in time. These deficits may contribute to difficulties with effective emotion regulation and healthy social relationship functioning.
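The intraclass correlation used to index (lack of) differentiation can be sketched as a consistency ICC over a moments-by-emotions rating matrix; a higher ICC means the emotions move together, i.e. less differentiation. Whether the authors used this exact ICC variant, ICC(3,1), is an assumption, and the data below are invented:

```python
import statistics

def icc_consistency(matrix):
    """ICC(3,1): consistency of ratings across columns (emotion items)
    over rows (sampling moments), from a two-way mean-squares decomposition."""
    n, k = len(matrix), len(matrix[0])
    grand = statistics.mean(v for row in matrix for v in row)
    row_means = [statistics.mean(row) for row in matrix]
    col_means = [statistics.mean(col) for col in zip(*matrix)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_err = sum((v - grand) ** 2 for row in matrix for v in row) - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# An undifferentiated responder: sad/angry/anxious ratings rise and fall together.
lockstep = [[1, 1, 2], [3, 3, 4], [5, 5, 6], [2, 2, 3]]
print(round(icc_consistency(lockstep), 3))  # 1.0 (no differentiation)
```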

  17. Differentiating Emotions Across Contexts: Comparing Adults with and without Social Anxiety Disorder Using Random, Social Interaction, and Daily Experience Sampling

    Science.gov (United States)

    Kashdan, Todd B.; Farmer, Antonina S.

    2014-01-01

    The ability to recognize and label emotional experiences has been associated with well-being and adaptive functioning. This skill is particularly important in social situations, as emotions provide information about the state of relationships and help guide interpersonal decisions, such as whether to disclose personal information. Given the interpersonal difficulties linked to social anxiety disorder (SAD), deficient negative emotion differentiation may contribute to impairment in this population. We hypothesized that people with SAD would exhibit less negative emotion differentiation in daily life, and these differences would translate to impairment in social functioning. We recruited 43 people diagnosed with generalized SAD and 43 healthy adults to describe the emotions they experienced over 14 days. Participants received palmtop computers for responding to random prompts and describing naturalistic social interactions; to complete end-of-day diary entries, they used a secure online website. We calculated intraclass correlation coefficients to capture the degree of differentiation of negative and positive emotions for each context (random moments, face-to-face social interactions, and end-of-day reflections). Compared to healthy controls, the SAD group exhibited less negative (but not positive) emotion differentiation during random prompts, social interactions, and (at trend level) end-of-day assessments. These differences could not be explained by emotion intensity or variability over the 14 days, or by comorbid depression or anxiety disorders. Our findings suggest that people with generalized SAD have deficits in clarifying specific negative emotions felt at a given point in time. These deficits may contribute to difficulties with effective emotion regulation and healthy social relationship functioning. PMID:24512246

  18. Tourism Methodologies

    DEFF Research Database (Denmark)

    This volume offers methodological discussions within the multidisciplinary field of tourism and shows how tourism researchers develop and apply new tourism methodologies. The book is presented as an anthology, giving voice to many diverse researchers who reflect on tourism methodology in different...... in interview and field work situations, and how do we engage with the performative aspects of tourism as a field of study? The book acknowledges that research is also performance and that it constitutes an aspect of intervention in the situations and contexts it is trying to explore. This is an issue dealt...

  19. Genetically predicted body mass index and Alzheimer's disease-related phenotypes in three large samples: Mendelian randomization analyses.

    Science.gov (United States)

    Mukherjee, Shubhabrata; Walter, Stefan; Kauwe, John S K; Saykin, Andrew J; Bennett, David A; Larson, Eric B; Crane, Paul K; Glymour, M Maria

    2015-12-01

    Observational research shows that higher body mass index (BMI) increases Alzheimer's disease (AD) risk, but it is unclear whether this association is causal. We applied genetic variants that predict BMI in Mendelian randomization analyses, an approach that is not biased by reverse causation or confounding, to evaluate whether higher BMI increases AD risk. We evaluated individual-level data from the AD Genetics Consortium (ADGC: 10,079 AD cases and 9613 controls), the Health and Retirement Study (HRS: 8403 participants with algorithm-predicted dementia status), and published associations from the Genetic and Environmental Risk for AD consortium (GERAD1: 3177 AD cases and 7277 controls). No evidence from individual single-nucleotide polymorphisms or polygenic scores indicated BMI increased AD risk. Mendelian randomization effect estimates per BMI point (95% confidence intervals) were as follows: ADGC, odds ratio (OR) = 0.95 (0.90-1.01); HRS, OR = 1.00 (0.75-1.32); GERAD1, OR = 0.96 (0.87-1.07). One subscore (cellular processes not otherwise specified) unexpectedly predicted lower AD risk. Copyright © 2015 The Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
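An inverse-variance weighted (IVW) estimator is the standard way to pool per-SNP Wald ratios in Mendelian randomization; whether it is the exact estimator used here is not stated in the abstract, and the per-SNP summary statistics below are hypothetical:

```python
def ivw_estimate(snp_stats):
    """Inverse-variance weighted MR estimate: combines per-SNP Wald ratios
    (beta_outcome / beta_exposure) with weights beta_exposure^2 / se_outcome^2,
    which simplifies to the ratio of weighted sums below."""
    num = sum(bx * by / se ** 2 for bx, by, se in snp_stats)
    den = sum(bx ** 2 / se ** 2 for bx, by, se in snp_stats)
    return num / den

# Hypothetical per-SNP statistics (BMI effect, AD log-odds effect, SE):
snps = [(0.10, 0.002, 0.010), (0.08, -0.001, 0.012), (0.05, 0.0005, 0.020)]
print(round(ivw_estimate(snps), 4))  # pooled log-odds of AD per BMI point
```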

  20. Family constellation seminars improve psychological functioning in a general population sample: results of a randomized controlled trial.

    Science.gov (United States)

    Weinhold, Jan; Hunger, Christina; Bornhäuser, Annette; Link, Leoni; Rochon, Justine; Wild, Beate; Schweitzer, Jochen

    2013-10-01

    The study examined the efficacy of nonrecurring family constellation seminars on psychological health. We conducted a monocentric, single-blind, stratified, and balanced randomized controlled trial (RCT). After choosing their roles for participating in a family constellation seminar as either active participant (AP) or observing participant (OP), 208 adults (M = 48 years, SD = 10; 79% women) from the general population were randomly allocated to the intervention group (IG; 3-day family constellation seminar; 64 AP, 40 OP) or a wait-list control group (WLG; 64 AP, 40 OP). It was predicted that family constellation seminars would improve psychological functioning (Outcome Questionnaire OQ-45.2) at 2-week and 4-month follow-ups. In addition, we assessed the impact of family constellation seminars on psychological distress and motivational incongruence. The IG showed significantly improved psychological functioning (d = 0.45 at 2-week follow-up, p = .003; d = 0.46 at 4-month follow-up, p = .003). Results were confirmed for psychological distress and motivational incongruence. No adverse events were reported. This RCT provides evidence for the efficacy of family constellation in a nonclinical population. The implications of the findings are discussed.

  1. Current practice in methodology and reporting of the sample size calculation in randomised trials of hip and knee osteoarthritis: a protocol for a systematic review.

    Science.gov (United States)

    Copsey, Bethan; Dutton, Susan; Fitzpatrick, Ray; Lamb, Sarah E; Cook, Jonathan A

    2017-10-10

    A key aspect of the design of randomised controlled trials (RCTs) is determining the sample size. It is important that the trial sample size is appropriately calculated. The required sample size will differ by clinical area, for instance, due to the prevalence of the condition and the choice of primary outcome. Additionally, it will depend upon the choice of target difference assumed in the calculation. Focussing upon the hip and knee osteoarthritis population, this study aims to systematically review how the trial size was determined for trials of osteoarthritis, on what basis, and how well these aspects are reported. Several electronic databases (Medline, Cochrane library, CINAHL, EMBASE, PsycINFO, PEDro and AMED) will be searched to identify articles on RCTs of hip and knee osteoarthritis published in 2016. Articles will be screened for eligibility and data extracted independently by two reviewers. Data will be extracted on study characteristics (design, population, intervention and control treatments), primary outcome, chosen sample size and justification, parameters used to calculate the sample size (including treatment effect in control arm, level of variability in primary outcome, loss to follow-up rates). Data will be summarised across the studies using appropriate summary statistics (e.g. n and %, median and interquartile range). The proportion of studies which report each key component of the sample size calculation will be presented. The reproducibility of the sample size calculation will be tested. The findings of this systematic review will summarise the current practice for sample size calculation in trials of hip and knee osteoarthritis. It will also provide evidence on the completeness of the reporting of the sample size calculation, reproducibility of the chosen sample size and the basis for the values used in the calculation. 
As this review was not eligible to be registered on PROSPERO, the summary information was uploaded to Figshare to make it
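The kind of calculation this review audits, for example a comparison of means between two parallel arms, follows a standard normal-approximation formula built from the target difference, the outcome SD, and the alpha and power levels:

```python
import math
from statistics import NormalDist

def n_per_arm(delta, sd, alpha=0.05, power=0.80):
    """Per-arm sample size for a two-arm parallel trial comparing means:
    n = 2 * (z_{1-alpha/2} + z_{power})^2 * sd^2 / delta^2, rounded up."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return math.ceil(2 * (z_a + z_b) ** 2 * (sd / delta) ** 2)

# The textbook case: detecting a 0.5 SD target difference at 80% power.
print(n_per_arm(delta=0.5, sd=1.0))  # 63 per arm
```

Reproducing a published calculation this way is one practical test of the reporting completeness the review assesses.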

  2. Effects of delayed compared with early umbilical cord clamping on maternal postpartum hemorrhage and cord blood gas sampling: a randomized trial.

    Science.gov (United States)

    Andersson, Ola; Hellström-Westas, Lena; Andersson, Dan; Clausen, Jesper; Domellöf, Magnus

    2013-05-01

    To investigate the effect of delayed cord clamping (DCC) compared with early cord clamping (ECC) on maternal postpartum hemorrhage (PPH) and umbilical cord blood gas sampling. Secondary analysis of a parallel-group, single-center, randomized controlled trial. Swedish county hospital. 382 term deliveries after a low-risk pregnancy. Deliveries were randomized to DCC (≥180 seconds, n = 193) or ECC (≤10 seconds, n = 189). Maternal blood loss was estimated by the midwife. Samples for blood gas analysis were taken from one umbilical artery and the umbilical vein, from the pulsating unclamped cord in the DCC group and from the double-clamped cord in the ECC group. Samples were classified as valid when the arterial-venous difference was -0.02 or less for pH and 0.5 kPa or more for pCO2. Main outcome measures were PPH and the proportion of valid blood gas samples. The differences between the DCC and ECC groups with regard to PPH (1.2%, p = 0.8) and severe PPH (-2.7%, p = 0.3) were small and non-significant. The proportion of valid blood gas samples was similar between the DCC (67%, n = 130) and ECC (74%, n = 139) groups, with 6% (95% confidence interval: -4%-16%, p = 0.2) fewer valid samples after DCC. Delayed cord clamping, compared with early, did not have a significant effect on maternal postpartum hemorrhage or on the proportion of valid blood gas samples. We conclude that delayed cord clamping is a feasible method from an obstetric perspective. © 2012 The Authors Acta Obstetricia et Gynecologica Scandinavica © 2012 Nordic Federation of Societies of Obstetrics and Gynecology.
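The validity rule stated in the abstract translates directly to code (the function name is ours; the thresholds are the trial's):

```python
def is_valid_pair(ph_art, ph_ven, pco2_art, pco2_ven):
    """Validity rule from the trial: a paired cord sample is accepted when
    the arterial-venous pH difference is -0.02 or less AND the
    arterial-venous pCO2 difference is 0.5 kPa or more."""
    return (ph_art - ph_ven) <= -0.02 and (pco2_art - pco2_ven) >= 0.5

print(is_valid_pair(7.21, 7.30, 7.9, 5.6))  # True: artery clearly more acidotic
print(is_valid_pair(7.29, 7.30, 6.0, 5.8))  # False: differences too small
```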

  3. Sample size determination for assessing equivalence based on proportion ratio under a randomized trial with non-compliance and missing outcomes.

    Science.gov (United States)

    Lui, Kung-Jong; Chang, Kuang-Chao

    2008-01-15

    When a generic drug is developed, it is important to assess the equivalence of therapeutic efficacy between the new and the standard drugs. Although the number of publications on testing equivalence and its relevant sample size determination is numerous, the discussion on sample size determination for a desired power of detecting equivalence under a randomized clinical trial (RCT) with non-compliance and missing outcomes is limited. In this paper, we derive under the compound exclusion restriction model the maximum likelihood estimator (MLE) for the ratio of probabilities of response among compliers between two treatments in a RCT with both non-compliance and missing outcomes. Using the MLE with the logarithmic transformation, we develop an asymptotic test procedure for assessing equivalence and find that this test procedure can perform well with respect to type I error based on Monte Carlo simulation. We further develop a sample size calculation formula for a desired power of detecting equivalence at a nominal alpha-level. To evaluate the accuracy of the sample size calculation formula, we apply Monte Carlo simulation again to calculate the simulated power of the proposed test procedure corresponding to the resulting sample size for a desired power of 80 per cent at the 0.05 level in a variety of situations. We also include a discussion on determining the optimal ratio of sample size allocation subject to a desired power to minimize a linear cost function and provide a sensitivity analysis of the sample size formula developed here under an alternative model with missing at random. Copyright (c) 2007 John Wiley & Sons, Ltd.

  4. Variance results for the second and third reduced sample moments in neutron multiplicity counting for randomly triggered or signal-triggered counting gates

    Energy Technology Data Exchange (ETDEWEB)

    Burr, T. [Statistical Sciences Group, Los Alamos National Laboratory, Mail Stop F600, Los Alamos, NM 87545 (United States)], E-mail: tburr@lanl.gov; Butterfield, K. [Advanced Nuclear Technology Group, Los Alamos National Laboratory, Mail Stop F600, Los Alamos, NM 87545 (United States)

    2008-09-01

    Neutron multiplicity counting is an established method to estimate the spontaneous fission rate, and therefore also the plutonium mass for example, in a sample that includes other neutron sources. The extent to which the sample and detector obey the 'point model' assumptions impacts the estimate's total measurement error; in nearly all cases, however, it is useful to evaluate the random error contribution via the variances of the second and third reduced sample moments of the neutron source strength. Therefore, this paper derives exact expressions for the variances and covariances of the second and third reduced sample moments for either randomly triggered or signal-triggered non-overlapping counting gates, and compares them to the corresponding variances in simulated data. Approximate expressions are also provided for the case of overlapping counting gates. These variances and covariances are useful in figure of merit calculations to predict assay performance prior to data collection. In addition, whenever real data are available, a bootstrap method is presented as an alternative, effective way to estimate these variances.
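The bootstrap alternative mentioned at the end can be sketched directly: resample counting gates with replacement and recompute the reduced moments each time. A minimal illustration, assuming per-gate neutron counts are available as a list (names and the resampling count are illustrative, not the paper's implementation):

```python
import random
from math import comb

def reduced_moments(gate_counts):
    """Second and third reduced sample moments of per-gate neutron counts:
    m_r = average of C(n, r) over gates (factorial moment divided by r!)."""
    N = len(gate_counts)
    m2 = sum(comb(n, 2) for n in gate_counts) / N
    m3 = sum(comb(n, 3) for n in gate_counts) / N
    return m2, m3

def bootstrap_moment_cov(gate_counts, B=1000, seed=7):
    """Bootstrap estimates of Var(m2), Var(m3), and Cov(m2, m3),
    obtained by resampling whole gates with replacement B times."""
    rng = random.Random(seed)
    N = len(gate_counts)
    m2s, m3s = [], []
    for _ in range(B):
        boot = [gate_counts[rng.randrange(N)] for _ in range(N)]
        m2, m3 = reduced_moments(boot)
        m2s.append(m2)
        m3s.append(m3)
    mean2, mean3 = sum(m2s) / B, sum(m3s) / B
    var2 = sum((x - mean2) ** 2 for x in m2s) / (B - 1)
    var3 = sum((x - mean3) ** 2 for x in m3s) / (B - 1)
    cov23 = sum((a - mean2) * (b - mean3) for a, b in zip(m2s, m3s)) / (B - 1)
    return var2, var3, cov23
```

Resampling at the gate level preserves the within-gate correlation structure, which is what makes the bootstrap a reasonable stand-in for the exact variance expressions when real data are in hand.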

  5. Long-term sampling of CO(2) from waste-to-energy plants: (14)C determination methodology, data variation and uncertainty.

    Science.gov (United States)

    Fuglsang, Karsten; Pedersen, Niels Hald; Larsen, Anna Warberg; Astrup, Thomas Fruergaard

    2014-02-01

    A dedicated sampling and measurement method was developed for long-term measurements of biogenic and fossil-derived CO(2) from thermal waste-to-energy processes. Based on long-term sampling of CO(2) and (14)C determination, plant-specific emission factors can be determined more accurately, and the annual emission of fossil CO(2) from waste-to-energy plants can be monitored according to carbon trading schemes and renewable energy certificates. Weekly and monthly measurements were performed at five Danish waste incinerators. Significant variations between fractions of biogenic CO(2) emitted were observed, not only over time, but also between plants. From the results of monthly samples at one plant, the annual mean fraction of biogenic CO(2) was found to be 69% of the total annual CO(2) emissions. From weekly samples, taken every 3 months at the five plants, significant seasonal variations in biogenic CO(2) emissions were observed (between 56% and 71% biogenic CO(2)). These variations confirmed that biomass fractions in the waste can vary considerably, not only from day to day but also from month to month. An uncertainty budget for the measurement method itself showed that the expanded uncertainty of the method was ± 4.0 pmC (95% confidence interval) at 62 pmC. The long-term sampling method was found to be useful for waste incinerators for determination of annual fossil and biogenic CO(2) emissions with relatively low uncertainty.
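The (14)C-based apportionment rests on simple arithmetic: fossil CO(2) carries essentially 0 pmC, so the biogenic share is the measured pmC divided by the pmC level assumed for the biomass fraction. A hedged sketch of that calculation (the 108 pmC reference value is an illustrative assumption, not a figure from the study; plant-specific reference values are needed in practice):

```python
def biogenic_fraction(pmc_sample, pmc_biogenic=108.0):
    """Biogenic share of emitted CO2 inferred from a 14C measurement in
    per cent modern carbon (pmC).  Fossil carbon contains no 14C (0 pmC);
    pmc_biogenic is the assumed 14C level of the biomass fraction
    (108 pmC here is an illustrative placeholder)."""
    return pmc_sample / pmc_biogenic

def fraction_uncertainty(pmc_uncertainty, pmc_biogenic=108.0):
    """Propagate an absolute pmC uncertainty to the biogenic fraction."""
    return pmc_uncertainty / pmc_biogenic
```

Under this assumed reference, a reading of 62 pmC maps to a biogenic fraction of roughly 0.57, and the quoted ± 4.0 pmC expanded uncertainty maps to about ± 0.04 on the fraction, both broadly in line with the 56-71% range reported above.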

  6. Inconsistency in the items included in tools used in general health research and physical therapy to evaluate the methodological quality of randomized controlled trials: a descriptive analysis

    OpenAIRE

    Armijo-Olivo, Susan; Fuentes, Jorge; Ospina, Maria; Saltaji, Humam; Hartling, Lisa

    2013-01-01

    Background Assessing the risk of bias of randomized controlled trials (RCTs) is crucial to understand how biases affect treatment effect estimates. A number of tools have been developed to evaluate risk of bias of RCTs; however, it is unknown how these tools compare to each other in the items included. The main objective of this study was to describe which individual items are included in RCT quality tools used in general health and physical therapy (PT) research, and how these items compare ...

  7. A double-blind randomized controlled trial to study the efficacy of topiramate in a civilian sample of PTSD.

    Science.gov (United States)

    Yeh, Mary S L; Mari, Jair Jesus; Costa, Mariana Caddrobi Pupo; Andreoli, Sergio Baxter; Bressan, Rodrigo Affonseca; Mello, Marcelo Feijó

    2011-10-01

    To evaluate the efficacy and tolerability of topiramate in patients with posttraumatic stress disorder (PTSD). We conducted a 12-week double-blind, randomized, placebo-controlled study comparing topiramate to placebo. Men and women aged 18-62 years with diagnosis of PTSD according to DSM-IV were recruited from the outpatient clinic of the violence program of Federal University of São Paulo Hospital (Prove-UNIFESP), São Paulo City, between April 2006 and December 2009. Subjects were assessed for the Clinician-Administered Posttraumatic Stress Scale (CAPS), Clinical Global Impression, and Beck Depression Inventory (BDI). After 1-week period of washout, 35 patients were randomized to either group. The primary outcome measure was the CAPS total score changes from baseline to the endpoint. 82.35% of patients in the topiramate group exhibited improvements in PTSD symptoms. The efficacy analysis demonstrated that patients in the topiramate group exhibited significant improvements in reexperiencing symptoms: flashbacks, intrusive memories, and nightmares of the trauma (CAPS-B; P= 0.04) and in avoidance/numbing symptoms associated with the trauma, social isolation, and emotional numbing (CAPS-C; P= 0.0001). Furthermore, the experimental group demonstrated a significant difference in decrease in CAPS total score (topiramate -57.78; placebo -32.41; P= 0.0076). Mean topiramate dose was 102.94 mg/d. Topiramate was generally well tolerated. Topiramate was effective in improving reexperiencing and avoidance/numbing symptom clusters in patients with PTSD. This study supports the use of anticonvulsants for the improvement of symptoms of PTSD. © 2010 Blackwell Publishing Ltd.

  8. An econometric method for estimating population parameters from non-random samples: An application to clinical case finding.

    Science.gov (United States)

    Burger, Rulof P; McLaren, Zoë M

    2017-09-01

    The problem of sample selection complicates the process of drawing inference about populations. Selective sampling arises in many real world situations when agents such as doctors and customs officials search for targets with high values of a characteristic. We propose a new method for estimating population characteristics from these types of selected samples. We develop a model that captures key features of the agent's sampling decision. We use a generalized method of moments with instrumental variables and maximum likelihood to estimate the population prevalence of the characteristic of interest and the agents' accuracy in identifying targets. We apply this method to tuberculosis (TB), which is the leading infectious disease cause of death worldwide. We use a national database of TB test data from South Africa to examine testing for multidrug resistant TB (MDR-TB). Approximately one quarter of MDR-TB cases was undiagnosed between 2004 and 2010. The official estimate of 2.5% is therefore too low, and MDR-TB prevalence is as high as 3.5%. Signal-to-noise ratios are estimated to be between 0.5 and 1. Our approach is widely applicable because of the availability of routinely collected data and abundance of potential instruments. Using routinely collected data to monitor population prevalence can guide evidence-based policy making. Copyright © 2017 John Wiley & Sons, Ltd.

  9. Purposive versus random sampling for map validation: a case study on ecotope maps of floodplains in the Netherlands

    NARCIS (Netherlands)

    Knotters, M.; Brus, D.J.

    2013-01-01

    The quality of ecotope maps of five districts of main water courses in the Netherlands was assessed on the basis of independent validation samples of field observations. The overall proportion of area correctly classified, and user's and producer's accuracy for each map unit were estimated. In four

  10. Estimation of the Box Correction for Degrees of Freedom from Sample Data in Randomized Block and Split-Plot Designs

    Science.gov (United States)

    Huynh, Huynh; Feldt, Leonard S.

    1976-01-01

    When the variance assumptions of a repeated measures ANOVA are not met, the F distribution of the mean square ratio should be adjusted by the sample estimate of the Box correction factor. An alternative is proposed which is shown by Monte Carlo methods to be less biased for a moderately large factor. (RC)
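The Box correction in question is conventionally estimated from the sample covariance matrix of the repeated measures as the Greenhouse-Geisser epsilon; Huynh and Feldt's contribution is a less biased alternative, which is not reproduced here. A sketch of the conventional sample estimate:

```python
def box_epsilon(S):
    """Greenhouse-Geisser sample estimate of the Box correction factor
    epsilon from a k x k covariance matrix S of the repeated measures:
    epsilon = tr(C)^2 / ((k - 1) * tr(C @ C)), where C is the
    double-centred S.  Ranges from 1/(k-1) (worst violation of
    sphericity) up to 1 (sphericity holds, no correction needed)."""
    k = len(S)
    row_means = [sum(row) / k for row in S]
    grand = sum(row_means) / k
    # double-centre: c_ij = s_ij - rowmean_i - rowmean_j + grand mean
    C = [[S[i][j] - row_means[i] - row_means[j] + grand for j in range(k)]
         for i in range(k)]
    tr = sum(C[i][i] for i in range(k))
    tr2 = sum(C[i][j] * C[j][i] for i in range(k) for j in range(k))
    return tr * tr / ((k - 1) * tr2)
```

The adjusted degrees of freedom for the F test are then epsilon * (k - 1) and epsilon * (k - 1) * (n - 1); a compound-symmetric covariance matrix returns exactly 1.0.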

  11. Randomized controlled trial of endoscopic ultrasound-guided fine-needle sampling with or without suction for better cytological diagnosis

    DEFF Research Database (Denmark)

    Puri, Rajesh; Vilmann, Peter; Saftoiu, Adrian

    2009-01-01

    ). The samples were characterized for cellularity and bloodiness, with a final cytology diagnosis established blindly. The final diagnosis was reached either by EUS-FNA if malignancy was definite, or by surgery and/or clinical follow-up of a minimum of 6 months in the cases of non-specific benign lesions...

  12. HPV self-sampling or the Pap-smear: a randomized study among cervical screening nonattenders from lower socioeconomic groups in France.

    Science.gov (United States)

    Sancho-Garnier, H; Tamalet, C; Halfon, P; Leandri, F X; Le Retraite, L; Djoufelkit, K; Heid, P; Davies, P; Piana, L

    2013-12-01

    Today in France, low attendance to cervical screening by Papanicolaou cytology (Pap-smear) is a major contributor to the 3,000 new cervical cancer cases and 1,000 deaths that occur from this disease every year. Nonattenders are mostly from lower socioeconomic groups and testing of self-obtained samples for high-risk Human Papilloma virus (HPV) types has been proposed as a method to increase screening participation in these groups. In 2011, we conducted a randomized study of women aged 35-69 from very low-income populations around Marseille who had not responded to an initial invitation for a free Pap-smear. After randomization, one group received a second invitation for a free Pap-smear and the other group was offered a free self-sampling kit for HPV testing. Participation rates were significantly different between the two groups with only 2.0% of women attending for a Pap-smear while 18.3% of women returned a self-sample for HPV testing (p ≤ 0.001). The detection rate of high-grade lesions (≥CIN2) was 0.2‰ in the Pap-smear group and 1.25‰ in the self-sampling group (p = 0.01). Offering self-sampling increased participation rates while the use of HPV testing increased the detection of cervical lesions (≥CIN2) in comparison to the group of women receiving a second invitation for a Pap-smear. However, low compliance with follow-up in the self-sampling group reduces the effectiveness of this screening approach in nonattending women and must be carefully managed. Copyright © 2013 UICC.
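The participation-rate comparison reported above is the kind of result usually checked with a two-proportion z-test. A hedged sketch of that test (the abstract does not give the exact group denominators, so the counts in the usage example are illustrative, not the trial's data):

```python
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided pooled z-test for a difference between two independent
    proportions (x1 successes of n1 vs. x2 of n2)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF
    p_two_sided = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_two_sided
```

With hypothetical groups of 1,000 women each and the reported rates (2.0% vs. 18.3%), the difference is overwhelmingly significant, consistent with the quoted p ≤ 0.001.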

  13. Prevalence and serogroup diversity of Salmonella for broiler neck skin, whole carcass rinse, and whole carcass enrichment sampling methodologies following air or immersion chilling

    Science.gov (United States)

    The purpose of this study was to evaluate neck skin (NS), whole carcass rinse (WCR), and whole carcass enrichment (WCE) sampling procedures for Salmonella isolation and serogroup from the same broiler carcass following either air or immersion chilling. Commercially processed and eviscerated broiler ...

  14. A novel statistical methodology to overcome sampling irregularities in the forest inventory data and to model forest changes under dynamic disturbance regimes

    Science.gov (United States)

    Nikolay Strigul; Jean. Lienard

    2015-01-01

    Forest inventory datasets offer unprecedented opportunities to model forest dynamics under evolving environmental conditions but they are analytically challenging due to irregular sampling time intervals of the same plot, across the years. We propose here a novel method to model dynamic changes in forest biomass and basal area using forest inventory data. Our...

  15. A school-based comprehensive lifestyle intervention among Chinese kids against obesity (CLICK-Obesity): rationale, design and methodology of a randomized controlled trial in Nanjing city, China

    Science.gov (United States)

    2012-01-01

    Background The prevalence of childhood obesity among adolescents has been rapidly rising in Mainland China in recent decades, especially in urban and rich areas. There is an urgent need to develop effective interventions to prevent childhood obesity. Limited data regarding adolescent overweight prevention in China are available. Thus, we developed a school-based intervention with the aim of reducing excess body weight in children. This report described the study design. Methods/design We designed a cluster randomized controlled trial in 8 randomly selected urban primary schools between May 2010 and December 2013. Each school was randomly assigned to either the intervention or control group (four schools in each group). Participants were the 4th graders in each participating school. The multi-component program was implemented within the intervention group, while students in the control group followed their usual health and physical education curriculum with no additional intervention program. The intervention consisted of four components: a) classroom curriculum (including physical education and healthy diet education), b) school environment support, c) family involvement, and d) fun programs/events. The primary study outcome was body composition, and secondary outcomes were behaviour and behavioural determinants. Discussion The intervention was designed with due consideration of Chinese cultural and familial tradition, social convention, and current primary education and exam system in Mainland China. We did our best to gain good support from educational authorities, school administrators, teachers and parents, and to integrate intervention components into schools’ regular academic programs. The results of, and lessons learned from, this study will help guide future school-based childhood obesity prevention programs in Mainland China. Trial registration Registration number: ChiCTR-ERC-11001819 PMID:22545755

  16. A school-based comprehensive lifestyle intervention among Chinese kids against obesity (CLICK-Obesity): rationale, design and methodology of a randomized controlled trial in Nanjing city, China

    Directory of Open Access Journals (Sweden)

    Xu Fei

    2012-06-01

    Background The prevalence of childhood obesity among adolescents has been rapidly rising in Mainland China in recent decades, especially in urban and rich areas. There is an urgent need to develop effective interventions to prevent childhood obesity. Limited data regarding adolescent overweight prevention in China are available. Thus, we developed a school-based intervention with the aim of reducing excess body weight in children. This report described the study design. Methods/design We designed a cluster randomized controlled trial in 8 randomly selected urban primary schools between May 2010 and December 2013. Each school was randomly assigned to either the intervention or control group (four schools in each group). Participants were the 4th graders in each participating school. The multi-component program was implemented within the intervention group, while students in the control group followed their usual health and physical education curriculum with no additional intervention program. The intervention consisted of four components: a) classroom curriculum (including physical education and healthy diet education), b) school environment support, c) family involvement, and d) fun programs/events. The primary study outcome was body composition, and secondary outcomes were behaviour and behavioural determinants. Discussion The intervention was designed with due consideration of Chinese cultural and familial tradition, social convention, and current primary education and exam system in Mainland China. We did our best to gain good support from educational authorities, school administrators, teachers and parents, and to integrate intervention components into schools’ regular academic programs. The results of, and lessons learned from, this study will help guide future school-based childhood obesity prevention programs in Mainland China. Trial registration Registration number: ChiCTR-ERC-11001819

  17. Assessing vaccine efficacy for the prevention of acute otitis media by pneumococcal vaccination in children: a methodological overview of statistical practice in randomized controlled clinical trials.

    Science.gov (United States)

    Jahn-Eimermacher, Antje; du Prel, Jean-Baptist; Schmitt, Heinz-Josef

    2007-08-14

    Acute otitis media (AOM) is the most common bacterial infectious disease among children. Vaccination is proposed to prevent otitis and several clinical trials were performed to assess the efficacy of pneumococcal vaccines. The way vaccine efficacy is analysed varies among trials. However, the clinical meaning of an estimate of vaccine effect and its statistical test depends on the applied statistical method. We aim to bring the meaning and validity of statistical trial results to the attention of researchers. We consider all methodological approaches for analysing vaccine efficacy applied in pneumococcal vaccination trials included in a recent Cochrane Review. We demonstrate how different methods address different scientific questions on the effect of vaccination, how they can complement each other and why some methods can produce misleading results.

  18. Aerobic physical activity and resistance training: an application of the theory of planned behavior among adults with type 2 diabetes in a random, national sample of Canadians

    Directory of Open Access Journals (Sweden)

    Karunamuni Nandini

    2008-12-01

    Background Aerobic physical activity (PA) and resistance training are paramount in the treatment and management of type 2 diabetes (T2D), but few studies have examined the determinants of both types of exercise in the same sample. Objective The primary purpose was to investigate the utility of the Theory of Planned Behavior (TPB) in explaining aerobic PA and resistance training in a population sample of T2D adults. Methods A total of 244 individuals were recruited through a random national sample which was created by generating a random list of household phone numbers. The list was proportionate to the actual number of household telephone numbers for each Canadian province (with the exception of Quebec). These individuals completed self-report TPB constructs of attitude, subjective norm, perceived behavioral control and intention, and a 3-month follow-up that assessed aerobic PA and resistance training. Results TPB explained 10% and 8% of the variance respectively for aerobic PA and resistance training; and accounted for 39% and 45% of the variance respectively for aerobic PA and resistance training intentions. Conclusion These results may guide the development of appropriate PA interventions for aerobic PA and resistance training based on the TPB.

  19. Approaches to Sampling Gay, Bisexual, and Other Men Who Have Sex with Men from Geosocial-Networking Smartphone Applications: A Methodological Note

    Directory of Open Access Journals (Sweden)

    William C. Goedel

    2016-09-01

    Geosocial-networking smartphone applications utilize global positioning system (GPS) technologies to connect users based on their physical proximity. Many gay, bisexual, and other men who have sex with men (MSM) have smartphones, and these new mobile technologies have generated quicker and easier modes for MSM to meet potential partners. In doing so, these technologies may facilitate a user’s ability to have multiple concurrent partners, thereby increasing their risk for acquiring HIV or other sexually transmitted infections. Researchers have sought to recruit users of these applications (e.g., Grindr, Jack’d, Scruff) into HIV prevention studies, primarily through advertising on the application. Given that these advertisements often broadly targeted large urban areas, these approaches have generated samples that are not representative of the population of users of the given application in a given area. As such, we propose a method to generate a spatially representative sample of MSM via direct messaging on a given application, using New York City and its geography as an example of this sampling and recruitment method. These methods can increase geographic representativeness and wider access to MSM who use geosocial-networking smartphone applications.
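One simple way to operationalize spatially representative recruitment of this kind is to allocate direct-message invitations across geographic zones in proportion to observed user counts. A hedged sketch of proportional allocation with largest-remainder rounding (zone names and counts are hypothetical; the authors' actual procedure may differ):

```python
def proportional_allocation(zone_counts, total_sample):
    """Allocate a recruitment target across geographic zones in
    proportion to observed user counts, using largest-remainder
    rounding so the allocations sum exactly to total_sample."""
    total = sum(zone_counts.values())
    raw = {z: total_sample * c / total for z, c in zone_counts.items()}
    alloc = {z: int(v) for z, v in raw.items()}          # floor each share
    leftover = total_sample - sum(alloc.values())
    # hand remaining slots to the zones with the largest fractional parts
    for zone in sorted(raw, key=lambda z: raw[z] - alloc[z], reverse=True)[:leftover]:
        alloc[zone] += 1
    return alloc
```

For instance, 10 invitations spread over hypothetical zones with 50, 30, and 20 visible users come out as 5, 3, and 2, keeping the sample's spatial profile matched to the user base.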

  20. A drink is a drink? Variation in the amount of alcohol contained in beer, wine and spirits drinks in a US methodological sample.

    Science.gov (United States)

    Kerr, William C; Greenfield, Thomas K; Tujague, Jennifer; Brown, Stephan E

    2005-11-01

    Empirically based estimates of the mean alcohol content of beer, wine and spirits drinks from a national sample of US drinkers are not currently available. A sample of 310 drinkers from the 2000 National Alcohol Survey were re-contacted to participate in a telephone survey with specific questions about the drinks they consume. Subjects were instructed to prepare their usual drink of each beverage at home and to measure each alcoholic beverage and other ingredients with a provided beaker. Information on the brand or type of each beverage was used to specify the percentage of alcohol. The weighted mean alcohol content of respondents' drinks was 0.67 ounces overall, 0.56 ounces for beer, 0.66 ounces for wine and 0.89 ounces for spirits. Spirits and wine drink contents were particularly variable with many high-alcohol drinks observed. While the 0.6-ounce of alcohol drink standard appears to be a reasonable single standard, it cannot capture the substantial variation evident in this sample and it underestimates average wine and spirits ethanol content. Direct measurement or beverage-specific mean ethanol content estimates would improve the precision of survey alcohol assessment.

  1. Methodological Comparison between a Novel Automatic Sampling System for Gas Chromatography versus Photoacoustic Spectroscopy for Measuring Greenhouse Gas Emissions under Field Conditions

    Directory of Open Access Journals (Sweden)

    Alexander J. Schmithausen

    2016-10-01

    Trace gases such as nitrous oxide (N2O), methane (CH4), and carbon dioxide (CO2) are climate-related gases, and their emissions from agricultural livestock barns are not negligible. Conventional measurement systems in the field (Fourier transform infrared spectroscopy (FTIR); photoacoustic system (PAS)) are not sufficiently sensitive to N2O. Laser-based measurement systems are highly accurate, but they are very expensive to purchase and maintain. One cost-effective alternative is gas chromatography (GC) with electron capture detection (ECD), but this is not suitable for field applications due to radiation. Measuring samples collected automatically under field conditions in the laboratory at a subsequent time presents many challenges. This study presents a sampling system designed to enable laboratory analysis of N2O concentrations in samples collected under field conditions. Analyses were carried out using PAS in the field (online system) and GC in the laboratory (offline system). Both measurement systems showed a good correlation for CH4 and CO2 concentrations. Measured N2O concentrations were near the detection limit for PAS. GC achieved more reliable results for N2O in very low concentration ranges.

  2. Methodological Comparison between a Novel Automatic Sampling System for Gas Chromatography versus Photoacoustic Spectroscopy for Measuring Greenhouse Gas Emissions under Field Conditions.

    Science.gov (United States)

    Schmithausen, Alexander J; Trimborn, Manfred; Büscher, Wolfgang

    2016-10-03

    Trace gases such as nitrous oxide (N2O), methane (CH4), and carbon dioxide (CO2) are climate-related gases, and their emissions from agricultural livestock barns are not negligible. Conventional measurement systems in the field (Fourier transform infrared spectroscopy (FTIR); photoacoustic system (PAS)) are not sufficiently sensitive to N2O. Laser-based measurement systems are highly accurate, but they are very expensive to purchase and maintain. One cost-effective alternative is gas chromatography (GC) with electron capture detection (ECD), but this is not suitable for field applications due to radiation. Measuring samples collected automatically under field conditions in the laboratory at a subsequent time presents many challenges. This study presents a sampling system designed to enable laboratory analysis of N2O concentrations in samples collected under field conditions. Analyses were carried out using PAS in the field (online system) and GC in the laboratory (offline system). Both measurement systems showed a good correlation for CH4 and CO2 concentrations. Measured N2O concentrations were near the detection limit for PAS. GC achieved more reliable results for N2O in very low concentration ranges.
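Method-comparison studies of this kind often supplement correlation with Bland-Altman limits of agreement, which expose systematic bias between paired readings. A hedged sketch of that complementary check (the paired readings in the usage example are illustrative, not data from the study):

```python
from statistics import mean, stdev

def bland_altman(method_a, method_b):
    """Bland-Altman agreement summary for paired readings from two
    measurement methods (e.g., online PAS vs. offline GC on the same
    gas samples): mean difference (bias) and 95% limits of agreement."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = mean(diffs)
    spread = stdev(diffs)
    return bias, (bias - 1.96 * spread, bias + 1.96 * spread)
```

A bias near zero with narrow limits supports interchangeability of the two systems; for N2O near the PAS detection limit, one would expect the limits to widen noticeably.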

  3. School-based mindfulness intervention for stress reduction in adolescents: Design and methodology of an open-label, parallel group, randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Jeanette M. Johnstone

    2016-12-01

    Adolescents are in a high-risk period developmentally, in terms of susceptibility to stress. A mindfulness intervention represents a potentially useful strategy for developing cognitive and emotion regulation skills associated with successful stress coping. Mindfulness strategies have been used successfully for emotional coping in adults, but are not as well studied in youth. This article details a novel proposal for the design of an 8-week randomized study to evaluate a high school-based mindfulness curriculum delivered as part of a two semester health class. A wellness education intervention is proposed as an active control, along with a waitlist control condition. All students enrolled in a sophomore (10th grade) health class at a private suburban high school will be invited to participate (n = 300). Pre-test assessments will be obtained by youth report, parent ratings, and on-site behavioral testing. The assessments will evaluate baseline stress, mood, emotional coping, controlled attention, and working memory. Participants, divided into 13 classrooms, will be randomized into one of three conditions, by classroom: A mindfulness intervention, an active control (wellness education), and a passive control (waitlist). Waitlisted participants will receive one of the interventions in the following term. Intervention groups will meet weekly for 8 weeks during regularly scheduled health classes. Immediate post-tests will be conducted, followed by a 60-day post-test. It is hypothesized that the mindfulness intervention will outperform the other conditions with regard to the adolescents' mood, attention and response to stress.

  4. Tourism Methodologies

    DEFF Research Database (Denmark)

    This volume offers methodological discussions within the multidisciplinary field of tourism and shows how tourism researchers develop and apply new tourism methodologies. The book is presented as an anthology, giving voice to many diverse researchers who reflect on tourism methodology in different...... in interview and field work situations, and how do we engage with the performative aspects of tourism as a field of study? The book acknowledges that research is also performance and that it constitutes an aspect of intervention in the situations and contexts it is trying to explore. This is an issue dealt...... with in different ways, depending on the ontological and epistemological stands of the researcher. The book suggests new methods and approaches, with innovative ways of collecting and creating empirical materials, by expanding the approaches to tried and tested methods, including digital innovations, digital...

  5. The yield and quality of cellular and bacterial DNA extracts from human oral rinse samples are variably affected by the cell lysis methodology.

    Science.gov (United States)

    Sohrabi, Mohsen; Nair, Raj G; Samaranayake, Lakshman P; Zhang, Li; Zulfiker, Abu Hasanat Md; Ahmetagic, Adnan; Good, David; Wei, Ming Q

    2016-03-01

    Recent culture-independent studies have enabled detailed mapping of the human microbiome that has not been hitherto achievable by culture-based methods. DNA extraction is a key element of bacterial culture-independent studies that critically impacts on the outcome of the detected microbial profile. Despite the variations in DNA extraction methods described in the literature, no standardized technique is available for the purpose of microbiome profiling. Hence, standardization of DNA extraction methods is urgently needed to yield comparable data from different studies. We examined the effect of eight different cell lysis protocols on the yield and quality of the extracted DNA from oral rinse samples. These samples were exposed to cell lysis techniques based on enzymatic, mechanical, and a combination of enzymatic-mechanical methods. The outcome measures evaluated were total bacterial population, Firmicutes levels and human DNA contamination (in terms of surrogate GAPDH levels). We noted that all three parameters were significantly affected by the method of cell lysis employed. Although the highest yield of gDNA was obtained using the lysozyme-achromopeptidase method, the lysozyme-zirconium beads method yielded the peak quantity of total bacterial DNA and Firmicutes with a lower degree of GAPDH contamination compared with the other methods. Taken together, our data clearly point to an urgent need for a consensus, standardized DNA extraction technique to evaluate the oral microbiome using oral rinse samples. Further, if Firmicutes levels are the focus of investigation in oral rinse microbiome analyses then the lysozyme-zirconium bead method would be the method of choice in preference to others. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. A cluster-randomized trial of a middle school gender violence prevention program: Design, rationale, and sample characteristics.

    Science.gov (United States)

    Abebe, Kaleab Z; Jones, Kelley A; Ciaravino, Samantha; Ripper, Lisa; Paglisotti, Taylor; Morrow, Sarah Elizabeth; Grafals, Melanie; Van Dusen, Courtney; Miller, Elizabeth

    2017-11-01

    High rates of adolescent relationship abuse (ARA) and sexual violence (SV) reported among adolescents point to the need for prevention among middle school-age youth. This is a cluster randomized controlled trial to test an athletic coach-delivered ARA/SV prevention program in 41 middle schools (38 clusters). Trained coaches talk to their male athletes about 1) what constitutes harmful vs. respectful relationship behaviors, 2) dispelling myths that glorify male sexual aggression and promoting more gender-equitable attitudes, and 3) positive bystander intervention when aggressive male behaviors toward females are witnessed. A total of 973 male athletes (ages 11-14, grades 6-8) are participating. Athletes complete surveys at the beginning and end of sports season (Time 2), and one year later (Time 3). The primary outcome is an increase in positive bystander behaviors (i.e., intervening in peers' disrespectful or harmful behaviors); secondary outcomes are changes in recognition of what constitutes abusive behavior, intentions to intervene, and gender equitable attitudes (Time 2 and 3) as well as reduction in abuse perpetration (Time 3). Participating schools have a greater proportion of non-White students and students on free/reduced lunch compared to schools that declined participation. Participants' self-reported ethnicities are 54.5% White, 29.0% Black, 1.4% Hispanic and the remainder, multi-racial, other, or not reported. This study will evaluate the effectiveness of a coach-delivered ARA/SV prevention program for middle school male athletes. Findings will add to the evidence base regarding developmentally appropriate violence prevention programs as well as the role of coaches in adolescent health promotion. Clinical Trials #: NCT02331238. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  7. Reducing eating disorder onset in a very high risk sample with significant comorbid depression: A randomized controlled trial.

    Science.gov (United States)

    Taylor, C Barr; Kass, Andrea E; Trockel, Mickey; Cunning, Darby; Weisman, Hannah; Bailey, Jakki; Sinton, Meghan; Aspen, Vandana; Schecthman, Kenneth; Jacobi, Corinna; Wilfley, Denise E

    2016-05-01

    Eating disorders (EDs) are serious problems among college-age women and may be preventable. An indicated online eating disorder (ED) intervention, designed to reduce ED and comorbid pathology, was evaluated. 206 women (M age = 20 ± 1.8 years; 51% White/Caucasian, 11% African American, 10% Hispanic, 21% Asian/Asian American, 7% other) at very high risk for ED onset (i.e., with high weight/shape concerns plus a history of being teased, current or lifetime depression, and/or nonclinical levels of compensatory behaviors) were randomized to a 10-week, Internet-based, cognitive-behavioral intervention or waitlist control. Assessments included the Eating Disorder Examination (EDE, to assess ED onset), EDE-Questionnaire, Structured Clinical Interview for DSM Disorders, and Beck Depression Inventory-II. ED attitudes and behaviors improved more in the intervention than control group (p = .02, d = 0.31); although ED onset rate was 27% lower, this difference was not significant (p = .28, NNT = 15). In the subgroup with highest shape concerns, ED onset rate was significantly lower in the intervention than control group (20% vs. 42%, p = .025, NNT = 5). For the 27 individuals with depression at baseline, depressive symptomatology improved more in the intervention than control group (p = .016, d = 0.96); although ED onset rate was lower in the intervention than control group, this difference was not significant (25% vs. 57%, NNT = 4). An inexpensive, easily disseminated intervention might reduce ED onset among those at highest risk. Low adoption rates need to be addressed in future research. (c) 2016 APA, all rights reserved.
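The NNT figures in this abstract follow directly from the reported onset rates: the number needed to treat is the reciprocal of the absolute risk reduction, rounded up to a whole person. A minimal sketch (the function name is illustrative, not from the study):

```python
import math

def number_needed_to_treat(control_rate: float, treatment_rate: float) -> int:
    """NNT = 1 / absolute risk reduction, rounded up to a whole person."""
    arr = control_rate - treatment_rate
    if arr <= 0:
        raise ValueError("intervention shows no risk reduction")
    return math.ceil(1.0 / arr)

# Highest-shape-concerns subgroup: onset 42% (control) vs. 20% (intervention)
print(number_needed_to_treat(0.42, 0.20))  # → 5, matching the reported NNT = 5

# Baseline-depression subgroup: 57% vs. 25%
print(number_needed_to_treat(0.57, 0.25))  # → 4, matching the reported NNT = 4
```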

  8. Long-term sampling of CO2 from waste-to-energy plants: 14C determination methodology, data variation and uncertainty

    DEFF Research Database (Denmark)

    Fuglsang, Karsten; Pedersen, Niels Hald; Larsen, Anna Warberg

    2014-01-01

    emissions were observed (between 56% and 71% biogenic CO2). These variations confirmed that biomass fractions in the waste can vary considerably, not only from day to day but also from month to month. An uncertainty budget for the measurement method itself showed that the expanded uncertainty of the method was ±4.0 pmC (95% confidence interval) at 62 pmC. The long-term sampling method was found to be useful for waste incinerators for determination of annual fossil and biogenic CO2 emissions with relatively low uncertainty.
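The "±4.0 pmC (95% confidence interval)" quoted above is an expanded uncertainty in the usual metrological sense: independent component standard uncertainties are combined in quadrature and multiplied by a coverage factor k ≈ 2 for roughly 95% coverage. A sketch with invented component values (the actual budget is in the paper):

```python
import math

def expanded_uncertainty(standard_uncertainties, k=2.0):
    """Combine independent standard uncertainties in quadrature,
    then expand with coverage factor k (k = 2 gives ~95% coverage)."""
    u_combined = math.sqrt(sum(u * u for u in standard_uncertainties))
    return k * u_combined

# Hypothetical component budget in pmC: sampling, 14C counting, calibration
components = [1.5, 1.0, 0.8]
print(round(expanded_uncertainty(components), 1))
```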

  9. Proteomic study of a model causative agent of harmful red tide, Prorocentrum triestinum I: Optimization of sample preparation methodologies for analyzing with two-dimensional electrophoresis.

    Science.gov (United States)

    Chan, Leo Lai; Lo, Samuel Chun-Lap; Hodgkiss, Ivor John

    2002-09-01

    A comprehensive study to find the optimal sample preparation conditions for two-dimensional electrophoresis (2-DE) analysis of Prorocentrum triestinum, a model causative agent of harmful algal blooms (HABs), was carried out. The four major sample preparation steps for 2-DE were studied: (a) cell disruption, i.e. sonication and homogenization with glass beads; (b) protein extraction, i.e. sequential and independent extraction procedures; (c) pre-electrophoretic treatment, including (i) treatment with RNase/DNase or benzonase; (ii) ultracentrifugation to sediment large macromolecules such as DNA; (iii) desalting and concentration by ultrafiltration through a Microcon centrifugal filter device (MWCO: 3000 daltons); and (iv) desalting by a micro BioSpin chromatography column (MWCO: 6000 daltons); and (d) rehydration buffers, reducing agents and sample application in the first-dimension isoelectric focusing. Our results showed that sonication is easy to perform and resulted in a higher protein yield. Among the four extraction buffers, the urea-containing buffers extracted the highest amount of protein, while tris(hydroxymethyl)aminomethane buffers and trichloroacetic acid (TCA)/acetone precipitation allowed detection of a higher number of protein species (i.e. protein spots). Desalting by BioSpin and ultrafiltration improved the 2-DE resolution of the water-soluble fraction but had less effect on urea-containing fractions. TCA/acetone precipitation was able to desalt all protein fractions independent of the extraction media; however, extended exposure to this low-pH medium caused protein modification. Introduction of either DNase/RNase or benzonase treatment did not improve the discriminatory power of the 2-DE, but this treatment did yield 2-DE with the clearest background. Proteolytic digestion was inhibited by addition of a protease inhibitor cocktail. Taken overall, a combination of sequential extraction and desalting by Bio

  10. On methodology

    DEFF Research Database (Denmark)

    Cheesman, Robin; Faraone, Roque

    2002-01-01

    This is an English version of the methodology chapter in the authors' book "El caso Berríos: Estudio sobre información errónea, desinformación y manipulación de la opinión pública".

  11. A dispersive liquid-liquid microextraction methodology for copper (II) in environmental samples prior to determination using microsample injection flame atomic absorption spectrometry.

    Science.gov (United States)

    Alothman, Zeid A; Habila, Mohamed; Yilmaz, Erkan; Soylak, Mustafa

    2013-01-01

    A simple, environmentally friendly, and efficient dispersive liquid-liquid microextraction method combined with microsample injection flame atomic absorption spectrometry was developed for the separation and preconcentration of Cu(II). 2-(5-Bromo-2-pyridylazo)-5-(diethylamino)phenol (5-Br-PADAP) was used to form a hydrophobic complex of Cu(II) ions in the aqueous phase before extraction. To extract the Cu(II)-5-Br-PADAP complex from the aqueous phase to the organic phase, 2.0 mL of acetone as a disperser solvent and 200 μL of chloroform as an extraction solvent were used. The influences of important analytical parameters, such as the pH, types and volumes of the extraction and disperser solvents, amount of chelating agent, sample volume, and matrix effects, on the microextraction procedure were evaluated and optimized. Using the optimal conditions, the LOD, LOQ, preconcentration factor, and RSD were determined to be 1.4 μg/L, 4.7 μg/L, 120, and 6.5%, respectively. The accuracy of the proposed method was investigated using standard addition/recovery tests. The analysis of certified reference materials produced satisfactory analytical results. The developed method was applied for the determination of Cu in real samples.

  12. Synthetic and non-synthetic anthropogenic fibers in a river under the impact of Paris Megacity: Sampling methodological aspects and flux estimations.

    Science.gov (United States)

    Dris, Rachid; Gasperi, Johnny; Rocher, Vincent; Tassin, Bruno

    2017-11-09

    Processed fibers are highly present in our daily life and can be natural, artificial (regenerated cellulose), or synthetic (made from petrochemicals). Their widespread use inevitably leads to a high contamination of the environment. Previous studies focused on plastic particles regardless of their type or shape, as long as they measured between 330 μm and 5 mm. In contrast, this study focuses exclusively on fibers, using a net with a smaller mesh size (80 μm) to sample freshwater. Moreover, all processed organic fibers are considered, irrespective of their nature. First, the short-term temporal variability of the fibers in the environment was assessed. When the sampling net was exposed for 1 min, a coefficient of variation of approx. 45% (n = 6) was determined; it was only 26% (n = 6) when the exposure lasted 3 min. The assessment of the distribution through the cross-section showed a possible difference in concentrations between the middle of the water surface and the river banks, which could be attributed to the intense river traffic within the Paris Megacity. The vertical variability seems negligible, as turbulence and current conditions homogenize the distribution of the fibers. A monthly monitoring showed concentrations of 100.6 ± 99.9 fibers·m⁻³ in the Marne River and of 48.5 ± 98.5, 27.9 ± 26.3, 27.9 ± 40.3 and 22.1 ± 25.3 fibers·m⁻³ from the upstream to downstream points in the Seine River. Once these concentrations are converted into fluxes, it seems that the impact generated by the Paris Megacity cannot be distinguished. Investigations on the role of sedimentation and deposition on the banks are required. This study helped fill some major knowledge gaps regarding fibers in rivers: their sampling, occurrence, spatial-temporal distribution and fluxes. It is encouraged that future studies include both synthetic and non-synthetic fibers. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Differing antidepressant maintenance methodologies.

    Science.gov (United States)

    Safer, Daniel J

    2017-10-01

    The principal evidence that antidepressant medication (ADM) is an effective maintenance treatment for adults with major depressive disorder (MDD) comes from placebo-substitution trials. These trials enter responders from ADM efficacy trials into randomized, double-blind placebo-controlled (RDBPC) effectiveness trials to measure the rate of MDD relapse over time. However, other randomized maintenance trial methodologies merit consideration and comparison. A systematic review of ADM randomized maintenance trials included research reports from multiple databases. Relapse rate was the main effectiveness outcome assessed. Five ADM randomized maintenance methodologies for MDD responders are described and compared for outcome. These effectiveness trials include: placebo substitution, ADM/placebo extension, ADM extension, ADM vs. psychotherapy, and treatment as usual. The placebo-substitution trials for those abruptly switched to placebo resulted in unusually high (46%) rates of relapse over 6-12 months, twice the continuing-ADM rate. These trials were characterized by selective screening, high attrition, an anxious anticipation of a switch to placebo, and a risk of drug withdrawal symptoms. Selectively screened ADM efficacy responders who entered into 4-12 month extension trials experienced relapse rates averaging ~10%, with a low attrition rate. Non-industry-sponsored randomized trials of adults with multiple prior MDD episodes who were treated with ADM maintenance for 1-2 years experienced relapse rates averaging 40%. Placebo-substitution trial methodology represents only one approach to assessing ADM maintenance. Antidepressant maintenance research for adults with MDD should be evaluated for industry sponsorship, attrition, the impact of the switch to placebo, and major relapse differences in MDD subpopulations. Copyright © 2017. Published by Elsevier Inc.

  14. Cognitive Interventions for Cognitively Healthy, Mildly Impaired, and Mixed Samples of Older Adults: A Systematic Review and Meta-Analysis of Randomized-Controlled Trials.

    Science.gov (United States)

    Mewborn, Catherine M; Lindbergh, Cutter A; Stephen Miller, L

    2017-12-01

    Cognitive interventions may improve cognition, delay age-related cognitive declines, and improve quality of life for older adults. The current meta-analysis was conducted to update and expand previous work on the efficacy of cognitive interventions for older adults and to examine the impact of key demographic and methodological variables. EBSCOhost and Embase online databases and reference lists were searched to identify relevant randomized-controlled trials (RCTs) of cognitive interventions for cognitively healthy or mildly impaired (MCI) older adults (60+ years). Interventions trained a single cognitive domain (e.g., memory) or were multi-domain training, and outcomes were assessed immediately post-intervention using standard neuropsychological tests. In total, 279 effects from 97 studies were pooled based on a random-effects model and expressed as Hedges' g (unbiased). Overall, results indicated that cognitive interventions produce a small, but significant, improvement in the cognitive functioning of older adults, relative to active and passive control groups (g = 0.298, p < .001, 95% CI = 0.248-0.347). These results were confirmed using multi-level analyses adjusting for nesting of effect sizes within studies (g = 0.362, p < .001, 95% CI = 0.275, 0.449). Age, education, and cognitive status (healthy vs. MCI) were not significant moderators. Working memory interventions proved most effective (g = 0.479), though memory, processing speed, and multi-domain interventions also significantly improved cognition. Effects were larger for directly trained outcomes but were also significant for non-trained outcomes (i.e., "transfer effects"). Implications for future research and clinical practice are discussed. This project was pre-registered with PROSPERO (#42016038386).
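The pooling described above (per-study effects expressed as Hedges' g, combined under a random-effects model) can be sketched with the DerSimonian-Laird estimator of between-study variance. The study-level effects and variances below are invented for illustration, not taken from the meta-analysis:

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooling with the DerSimonian-Laird tau^2 estimator.
    Returns the pooled effect and its 95% confidence interval."""
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)          # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]           # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# Invented per-study Hedges' g values and sampling variances (illustration only)
g, lo, hi = dersimonian_laird([0.10, 0.60, 0.20, 0.50],
                              [0.020, 0.030, 0.015, 0.025])
print(round(g, 3), round(lo, 3), round(hi, 3))
```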

  15. Two to five repeated measurements per patient reduced the required sample size considerably in a randomized clinical trial for patients with inflammatory rheumatic diseases

    Directory of Open Access Journals (Sweden)

    Smedslund Geir

    2013-02-01

    Abstract Background Patient-reported outcomes are accepted as important outcome measures in rheumatology. The fluctuating symptoms in patients with rheumatic diseases have serious implications for sample size in clinical trials. We estimated the effects of measuring the outcome 1-5 times on the sample size required in a two-armed trial. Findings In a randomized controlled trial that evaluated the effects of a mindfulness-based group intervention for patients with inflammatory arthritis (n = 71), the outcome variables Numerical Rating Scales (NRS; pain, fatigue, disease activity, self-care ability, and emotional wellbeing) and the General Health Questionnaire (GHQ-20) were measured five times before and after the intervention. For each variable we calculated the sample size necessary to obtain 80% power (α = .05) for one up to five measurements. Two, three, and four measures reduced the required sample sizes by 15%, 21%, and 24%, respectively. With three (and five) measures, the required sample size per group was reduced from 56 to 39 (32) for the GHQ-20, from 71 to 60 (55) for pain, from 96 to 71 (73) for fatigue, from 57 to 51 (48) for disease activity, from 59 to 44 (45) for self-care, and from 47 to 37 (33) for emotional wellbeing. Conclusions Measuring the outcomes five times rather than once reduced the necessary sample size by an average of 27%. When planning a study, researchers should carefully compare the advantages and disadvantages of increasing sample size versus employing three to five repeated measurements in order to obtain the required statistical power.
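The sample-size reduction reported above comes from averaging correlated repeated measurements: if m measurements have within-subject correlation ρ, the variance of their mean shrinks by (1 + (m - 1)ρ)/m, and the required n per group shrinks by the same factor. A sketch of this standard relationship (ρ = 0.66 is an illustrative value, not one estimated in the trial):

```python
import math

def n_per_group_with_repeats(n_single: int, m: int, rho: float) -> int:
    """Required n per group when the outcome is the mean of m repeated
    measurements with within-subject correlation rho."""
    shrinkage = (1 + (m - 1) * rho) / m   # variance of the mean, relative to m = 1
    return math.ceil(n_single * shrinkage)

# A variable needing 56 subjects per group with a single measurement:
for m in (1, 2, 3, 5):
    print(m, n_per_group_with_repeats(56, m, rho=0.66))
```

Note that with ρ = 1 (perfectly correlated measurements) repetition buys nothing, and with ρ = 0 the factor is 1/m; real patient-reported outcomes sit in between.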

  16. Dealing with daily challenges in dementia (deal-id study): effectiveness of the experience sampling method intervention 'Partner in Sight' for spousal caregivers of people with dementia: design of a randomized controlled trial.

    Science.gov (United States)

    van Knippenberg, Rosalia J M; de Vugt, Marjolein E; Ponds, Rudolf W; Myin-Germeys, Inez; Verhey, Frans R J

    2016-05-11

    There is an urgent need for psychosocial interventions that effectively support dementia caregivers in daily life. The Experience Sampling Methodology (ESM) offers the possibility to provide a more dynamic view of caregiver functioning. ESM-derived feedback may help to redirect caregivers' behavior towards situations that elicit positive emotions and to increase their feelings of competence in the caretaking process. This paper presents the design of a study that evaluates the process characteristics and effects of the ESM-based intervention 'Partner in Sight'. A randomized controlled trial with 90 spousal caregivers of people with dementia will be conducted. Participants will be randomly assigned to the experimental (6-week ESM intervention including feedback), pseudo-experimental (6-week ESM intervention without feedback), or control group (care as usual). Assessments will be performed pre- and post-intervention and at 2- and 6-month follow-up. Main outcomes will be sense of competence, perceived control, momentary positive affect, and psychological complaints (depressive symptoms, perceived stress, anxiety, momentary negative affect). In addition to the effect evaluation, a process and economic evaluation will be conducted to investigate the credibility and generalizability of the intervention, and its cost-effectiveness. The potential effects of the ESM intervention may help caregivers to endure their care responsibilities and prevent them from becoming overburdened. This is the first ESM intervention for caregivers of people with dementia. The results of this study, therefore, provide a valuable contribution to the growing knowledge on m-health interventions for dementia caregivers. Dutch Trial Register NTR4847; date registered Oct 9, 2014.

  17. Missing citations due to exact reference matching: Analysis of a random sample from WoS. Are publications from peripheral countries disadvantaged?

    Energy Technology Data Exchange (ETDEWEB)

    Donner, P.

    2016-07-01

    Citation counts of scientific research contributions are one of the fundamental data in scientometrics. Accuracy and completeness of citation links are therefore crucial data quality issues (Moed, 2005, Ch. 13). However, despite the known flaws of reference matching algorithms, usually no attempts are made to incorporate uncertainty about citation counts into indicators. This study is a step towards that goal. Particular attention is paid to the question whether publications from countries not using basic Latin script are differently affected by missed citations. The proprietary reference matching procedure of Web of Science (WoS) is based on (near) exact agreement of cited reference data (normalized during processing) to the target papers' bibliographic data. Consequently, the procedure has near-optimal precision but incomplete recall - it is known to miss some slightly inaccurate reference links (Olensky, 2015). However, there has been no attempt so far to estimate the rate of missed citations by a principled method for a random sample. For this study a simple random sample of WoS source papers was drawn and an attempt was made to find all reference strings of WoS-indexed documents that refer to them, in particular inexact matches. The objective is to give a statistical estimate of the proportion of missed citations and to describe the relationship of the number of found citations to the number of missed citations, i.e. the conditional error distribution. The empirical error distribution is statistically analyzed and modelled. (Author)
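The inexact matches the study looks for can be surfaced with a normalized string-similarity pass, which is exactly the step an exact matcher omits. A minimal sketch using Python's standard library; the normalization rules and the 0.9 threshold are assumptions for illustration, not the study's procedure:

```python
from difflib import SequenceMatcher

def normalize(ref: str) -> str:
    """Lowercase, strip punctuation, collapse whitespace."""
    return " ".join(ref.lower().replace(".", " ").replace(",", " ").split())

def is_probable_match(cited: str, target: str, threshold: float = 0.9) -> bool:
    """Flag slightly inaccurate reference strings that exact
    matching would miss (inexact-match candidates)."""
    a, b = normalize(cited), normalize(target)
    return a == b or SequenceMatcher(None, a, b).ratio() >= threshold

target = "Moed H F, Citation Analysis in Research Evaluation, 2005"
cited = "Moed HF. Citation analysis in research evaluation. 2005"
print(is_probable_match(cited, target))  # exact match fails; similarity succeeds
```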

  18. Development of a Virtual Reality Exposure Tool as Psychological Preparation for Elective Pediatric Day Care Surgery: Methodological Approach for a Randomized Controlled Trial.

    Science.gov (United States)

    Eijlers, Robin; Legerstee, Jeroen S; Dierckx, Bram; Staals, Lonneke M; Berghmans, Johan; van der Schroeff, Marc P; Wijnen, Rene Mh; Utens, Elisabeth Mwj

    2017-09-11

    Preoperative anxiety in children is highly prevalent and is associated with adverse outcomes. Existing psychosocial interventions to reduce preoperative anxiety are often aimed at distraction and are of limited efficacy. Gradual exposure is a far more effective way to reduce anxiety. Virtual reality (VR) provides a unique opportunity to gradually expose children to all aspects of the operating theater. The aims of our study are (1) to develop a virtual reality exposure (VRE) tool to prepare children psychologically for surgery; and (2) to examine the efficacy of the VRE tool in a randomized controlled trial (RCT), in which VRE will be compared to care as usual (CAU). The VRE tool is highly realistic and resembles the operating room environment accurately. With this tool, children will not only be able to explore the operating room environment, but also get accustomed to general anesthesia procedures. The PREoperative Virtual reality Intervention to Enhance Wellbeing (PREVIEW) study will be conducted. In this single-blinded RCT, 200 consecutive patients (aged 4 to 12 years) undergoing elective day care surgery for dental, oral, or ear-nose-throat problems, will be randomly allocated to the preoperative VRE intervention or CAU. The primary outcome is change in child state anxiety level between baseline and induction of anesthesia. Secondary outcome measures include child's postoperative anxiety, emergence delirium, postoperative pain, use of analgesics, health care use, and pre- and postoperative parental anxiety. The VRE tool has been developed. Participant recruitment began March 2017 and is expected to be completed by September 2018. To our knowledge, this is the first RCT evaluating the effect of a VRE tool to prepare children for surgery. The VRE intervention is expected to significantly diminish preoperative anxiety, postoperative pain, and the use of postoperative analgesics in pediatric patients. The tool could create a less stressful experience for both

  19. A facile and selective approach for enrichment of l-cysteine in human plasma sample based on zinc organic polymer: Optimization by response surface methodology.

    Science.gov (United States)

    Bahrani, Sonia; Ghaedi, Mehrorang; Ostovan, Abbas; Javadian, Hamedreza; Mansoorkhani, Mohammad Javad Khoshnood; Taghipour, Tahere

    2018-02-05

    In this research, a facile and selective method was described to extract l-cysteine (l-Cys), an essential α-amino acid for anti-ageing playing an important role in human health, from a human blood plasma sample. The importance of this research was the mild and time-consuming synthesis of zinc organic polymer (Zn-MOP) as an adsorbent and evaluation of its ability for efficient enrichment of l-Cys by the ultrasound-assisted dispersive micro solid-phase extraction (UA-DMSPE) method. The structure of Zn-MOP was investigated by FT-IR, XRD and SEM. Analysis of variance (ANOVA) was applied to the experimental data to reach the best optimum conditions. The quantification of l-Cys was carried out by high performance liquid chromatography with UV detection set at λ = 230 nm. The calibration graph showed reasonable linear responses towards l-Cys concentrations in the range of 4.0-1000 μg/L (r² = 0.999) with a low limit of detection (0.76 μg/L, S/N = 3) and RSD ≤ 2.18 (n = 3). The results revealed the applicability and high performance of this novel strategy in detecting trace l-Cys by Zn-MOP in complicated matrices. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Transport of p-aminohippuric acid (³H-PAH), inulin and dextran out of the cranial cavity: A methodological study using intraventricular injection and sample combustion

    Energy Technology Data Exchange (ETDEWEB)

    Jakobson, Aa. M.

    1987-01-01

    Material injected into the cerebral ventricles can leave the cerebrospinal fluid (CSF) but remain in the cranial cavity. To analyze the disappearance of ³H- and ¹⁴C-labelled material from the cranial cavity, such material was injected into the lateral ventricles together with a bulk flow marker labelled with the other radionuclide. In the present pilot study ³H-PAH and ¹⁴C-inulin were used. Five μl of a mixture was injected into each lateral cerebral ventricle in rats, which were killed at various intervals. The whole skull was analyzed without opening the CSF space, after homogenization in the deep-frozen state. The samples were combusted and analyzed by liquid scintillation counting. Probenecid, injected intraperitoneally, inhibited the removal of ³H-PAH from the skull cavity, as anticipated. Immediately after the intraventricular injection, however, ³H-PAH was transiently retained, probably by uptake into actively transporting tissue. After injection of probenecid, this delay in removal was reduced. The difference in disappearance rate between ³H-PAH and ¹⁴C-inulin was estimated by comparing the ³H/¹⁴C ratio in the skulls with that in the injected solution, which appeared to be a better method than comparing the recovery of each compound.

  1. Rationale, design and methodology of a double-blind, randomized, placebo-controlled study of escitalopram in prevention of Depression in Acute Coronary Syndrome (DECARD)

    Directory of Open Access Journals (Sweden)

    Hansen Jørgen

    2009-04-01

    Abstract Background The prevalence of depression in patients with acute coronary syndrome, i.e. myocardial infarction and unstable angina, is higher than in the general population. The prevalence of anxiety is higher as well. Both depression and anxiety are associated with poor cardiac outcomes and higher mortality. Comorbid depression in patients with acute coronary syndrome often goes undiagnosed, and it is therefore a challenging task to prevent this risk factor. The study of DEpression in Coronary ARtery Disease (DECARD) is designed to examine whether it is possible to prevent depression in patients with acute coronary syndrome. Methods Two hundred forty non-depressed patients with acute coronary syndrome are randomized to treatment with either escitalopram or placebo for 1 year. Psychiatric and cardiac assessment of patients is performed to evaluate the possibility of preventing depression. Diagnosis of depression and the Hamilton Depression Scale are the primary outcome measures. Discussion This is the first study of prevention of depression in patients after acute coronary syndrome with a selective serotonin reuptake inhibitor. Trial Registration http://www.ClinicalTrials.gov. Identifier: NCT00140257

  2. Occupational exposure to mineral oil metalworking fluid (MWFs) mist: Development of new methodologies for mist sampling and analysis. Results from an inter-laboratory comparison

    Energy Technology Data Exchange (ETDEWEB)

    Huynh, C Khanh; Herrera, H [Institut Universitaire Romand de Sante au Travail (IST), CH-1005 Lausanne (Switzerland); Parrat, J [Service des Arts et Metiers et du Travail (LIST), CH-2800 Delemont and Laboratoire intercantonal de sante au travail (LIST), CH-1034 Peseux (Switzerland); Wolf, R [SUVA, CH-6002 Luzern (Switzerland); Perret, V, E-mail: chhuynh@hospvd.c [Service cantonal de toxicologie industrielle et de protection contre les pollutions interieures (STIPI), CH-1211 Geneve (Switzerland)

    2009-02-01

    Metalworking fluids (MWFs) are widely used in the undercutting sector, a large professional activity in Switzerland, in particular in the fine mechanics and watchmaking industry. France proposes a Permissible Exposure Limit (PEL) of 1 mg·m⁻³ of aerosol. The American Conference of Governmental Industrial Hygienists (ACGIH) sets its value at 5 mg·m⁻³, but a proposal to lower the standard ('intended changes') to 0.2 mg·m⁻³ of aerosol has been pending since 2001; it has not become a recognized threshold limit value for exposure. Since 2003, the new Swiss PEL (MAK) recommendations would be 0.2 mg·m⁻³ of aerosol (oil with boiling point > 350 °C without additives) and/or 20 mg·m⁻³ of oil aerosol + vapour for medium or light oil. To evaluate evaporative losses of sampled oil, the German 'Berufsgenossenschaftliches Institut fuer Arbeitssicherheit' (BGIA) recommends the use of a XAD-2 cartridge behind the filter. The method seems to work perfectly for MWFs in a clean occupational atmosphere free from interference by light cleaning-solvent vapours such as White Spirit. But in a real situation, with the machine-shop atmosphere contaminated by traces of White Spirit, the BGIA method failed to estimate the MWF levels (over-estimation). In this paper, we propose a new approach meant to measure both oil vapours and aerosols. Five inter-laboratory comparisons are discussed, based on the production of oil mist in an experimental chamber under controlled conditions.

  3. Methodological guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-04-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objectives of the project have been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs.

  4. Community-based intermittent mass testing and treatment for malaria in an area of high transmission intensity, western Kenya: study design and methodology for a cluster randomized controlled trial.

    Science.gov (United States)

    Samuels, Aaron M; Awino, Nobert; Odongo, Wycliffe; Abong'o, Benard; Gimnig, John; Otieno, Kephas; Shi, Ya Ping; Were, Vincent; Allen, Denise Roth; Were, Florence; Sang, Tony; Obor, David; Williamson, John; Hamel, Mary J; Patrick Kachur, S; Slutsker, Laurence; Lindblade, Kim A; Kariuki, Simon; Desai, Meghna

    2017-06-07

    Most human Plasmodium infections in western Kenya are asymptomatic and are believed to contribute importantly to malaria transmission. Elimination of asymptomatic infections requires active treatment approaches, such as mass testing and treatment (MTaT) or mass drug administration (MDA), as infected persons do not seek care for their infection. Evaluations of community-based approaches that are designed to reduce malaria transmission require careful attention to study design to ensure that important effects can be measured accurately. This manuscript describes the study design and methodology of a cluster-randomized controlled trial to evaluate a MTaT approach for malaria transmission reduction in an area of high malaria transmission. Ten health facilities in western Kenya were purposively selected for inclusion. The communities within 3 km of each health facility were divided into three clusters of approximately equal population size. Two clusters around each health facility were randomly assigned to the control arm, and one to the intervention arm. Three times per year for 2 years, after the long and short rains, and again before the long rains, teams of community health volunteers visited every household within the intervention arm, tested all consenting individuals with malaria rapid diagnostic tests, and treated all positive individuals with an effective anti-malarial. The effect of mass testing and treatment on malaria transmission was measured through population-based longitudinal cohorts, outpatient visits for clinical malaria, periodic population-based cross-sectional surveys, and entomological indices.

  5. Validity of protein-osmolality versus protein-creatinine ratios in the estimation of quantitative proteinuria from random samples of urine in children.

    Science.gov (United States)

    Morgenstern, Bruce Z; Butani, Lavjay; Wollan, Peter; Wilson, David M; Larson, Timothy S

    2003-04-01

    Proteinuria is an important marker of kidney disease. Simple methods to determine the presence of proteinuria in a semiquantitative fashion require measurement of either a protein-creatinine or a protein-osmolality ratio. Urine samples from 134 healthy infants and children and 150 children from the pediatric nephrology practice were analyzed to develop normative data for protein-osmolality ratios on random urine samples and to compare the protein-osmolality ratio with the protein-creatinine ratio as a predictor of 24-hour urine protein excretion. Children were grouped according to age into three groups: infants (younger than 2 years), children between 2 and 8 years old, and children older than 8 years. For infants, the optimal cutoff value for predicting abnormal protein excretion was determined to be a protein-osmolality ratio of 0.15 mg·kg H2O/mOsm·L; for children between 2 and 8 years old, 0.14; and for children older than 8 years, 0.17 (P = not significant between age groups). The corresponding optimal cutoff value for the protein-creatinine ratio for the entire group of children older than 2 years is 0.20. Area under the curve analysis of receiver operator characteristic curves showed that the protein-creatinine ratio was superior to the protein-osmolality ratio for predicting abnormal amounts of proteinuria in children and adolescents (P < 0.05). Given the superior performance of the protein-creatinine ratio in children, it would be appropriate to screen urine samples for proteinuria using the protein-creatinine ratio rather than the protein-osmolality ratio.
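Optimal-cutoff analyses of this kind are commonly done by scanning candidate thresholds on the ROC curve and maximizing Youden's J (sensitivity + specificity − 1). A minimal stdlib sketch, with the `optimal_cutoff` helper and toy data as illustrative assumptions rather than the study's code:

```python
def optimal_cutoff(values, labels):
    """Return (threshold, J) maximizing Youden's J = sensitivity + specificity - 1.

    values: screening ratios (e.g. protein-creatinine ratios);
    labels: 1 if the 24-hour collection showed abnormal protein excretion, else 0.
    """
    pos = sum(labels)
    neg = len(labels) - pos
    best_t, best_j = None, -1.0
    for t in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v >= t and y == 1)
        fp = sum(1 for v, y in zip(values, labels) if v >= t and y == 0)
        j = tp / pos + (1.0 - fp / neg) - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j
```

On perfectly separable toy data the best threshold sits at the smallest abnormal value, with J = 1.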

  6. Changes in brain volume and cognition in a randomized trial of exercise and social interaction in a community-based sample of non-demented Chinese elders.

    Science.gov (United States)

    Mortimer, James A; Ding, Ding; Borenstein, Amy R; DeCarli, Charles; Guo, Qihao; Wu, Yougui; Zhao, Qianhua; Chu, Shugang

    2012-01-01

    Physical exercise has been shown to increase brain volume and improve cognition in randomized trials of non-demented elderly. Although greater social engagement was found to reduce dementia risk in observational studies, randomized trials of social interventions have not been reported. A representative sample of 120 elderly from Shanghai, China was randomized to four groups (Tai Chi, Walking, Social Interaction, No Intervention) for 40 weeks. Two MRIs were obtained, one before the intervention period, the other after. A neuropsychological battery was administered at baseline, 20 weeks, and 40 weeks. Changes in brain volume in each intervention group were compared with the No Intervention group using t-tests. Time-intervention group interactions for neuropsychological measures were evaluated with repeated-measures mixed models. Compared to the No Intervention group, significant increases in brain volume were seen in the Tai Chi and Social Intervention groups (p < 0.05). Improvements also were observed in several neuropsychological measures in the Tai Chi group, including the Mattis Dementia Rating Scale score (p = 0.004), the Trailmaking Test A (p = 0.002) and B (p = 0.0002), the Auditory Verbal Learning Test (p = 0.009), and verbal fluency for animals (p = 0.01). The Social Interaction group showed improvement on some, but fewer, neuropsychological indices. No differences were observed between the Walking and No Intervention groups. The findings differ from previous clinical trials in showing increases in brain volume and improvements in cognition with a largely non-aerobic exercise (Tai Chi). In addition, intellectual stimulation through social interaction was associated with increases in brain volume as well as with some cognitive improvements.

  7. Why choose Random Forest to predict rare species distribution with few samples in large undersampled areas? Three Asian crane species models provide supporting evidence

    Directory of Open Access Journals (Sweden)

    Chunrong Mi

    2017-01-01

    Species distribution models (SDMs) have become an essential tool in ecology, biogeography, evolution and, more recently, conservation biology. How to generalize species distributions in large undersampled areas, especially with few samples, is a fundamental issue for SDMs. To explore this issue, we used the best available presence records for the Hooded Crane (Grus monacha, n = 33), White-naped Crane (Grus vipio, n = 40), and Black-necked Crane (Grus nigricollis, n = 75) in China as three case studies, employing four powerful and commonly used machine learning algorithms to map the breeding distributions of the three species: TreeNet (Stochastic Gradient Boosting, Boosted Regression Tree Model), Random Forest, CART (Classification and Regression Tree) and Maxent (Maximum Entropy Models). In addition, we developed an ensemble forecast by averaging the predicted probabilities of the four models. Commonly used model performance metrics (area under the ROC curve (AUC) and true skill statistic (TSS)) were employed to evaluate model accuracy. The latest satellite tracking data and compiled literature data were used as two independent testing datasets to confront model predictions. We found that Random Forest demonstrated the best performance on most assessment methods, provided a better model fit to the testing data, and achieved better species range maps for each crane species in undersampled areas. Random Forest has been generally available for more than 20 years and is known to perform extremely well in ecological predictions. However, its potential is still widely underused in conservation, (spatial) ecological applications and for inference. Our results show that it informs ecological and biogeographical theories as well as being suitable for conservation applications, specifically when the study area is undersampled. This method helps to save model-selection time and effort, and allows robust and rapid
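The evaluation and ensemble steps mentioned above (TSS from a confusion matrix, and averaging per-site presence probabilities across models) can be sketched in a few lines. Function names are illustrative, not the paper's implementation:

```python
def confusion(y_true, y_pred):
    """Confusion-matrix counts for binary presence (1) / absence (0) data."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

def true_skill_statistic(y_true, y_pred):
    """TSS = sensitivity + specificity - 1; ranges from -1 to +1."""
    tp, tn, fp, fn = confusion(y_true, y_pred)
    return tp / (tp + fn) + tn / (tn + fp) - 1.0

def ensemble_probability(model_probs):
    """Average per-site presence probabilities over several models.

    model_probs: one list of site probabilities per model."""
    return [sum(site) / len(site) for site in zip(*model_probs)]
```

A perfect prediction yields TSS = 1, and random guessing yields TSS near 0.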

  8. In vivo photoacoustic mouse eye imaging of healing after chemical injury and automated eyeball surface estimation based on a random sample consensus algorithm

    Science.gov (United States)

    Jeon, Seungwan; Song, Hyun Beom; Kim, Jaewoo; Lee, Byung Joo; Kim, Jeong Hun; Kim, Chulhong

    2017-03-01

    Ocular chemical damage may induce limbal vessel ischemia and neovascularization, but the pathophysiology of the disease is not completely known. To observe changes in blood vessels after alkaline burn, we monitored the anterior segment and choroidal vasculature using an optical-resolution photoacoustic microscope (OR-PAM). We were able to observe not only the iris blood vessels but also the choroidal vessels under the sclera, which were difficult to observe with conventional photographs. After alkali burning, we observed neovascularization and limbal ischemia and successfully tracked changes in vasculature during the 7-day healing process. We also used the RANdom SAmple Consensus (RANSAC) method to segment the abnormally generated blood vessels in the cornea by detecting the eyeball surface, and successfully visualized the distance from each PA signal to the center of the eye. We believe that photoacoustic imaging has important potential to reveal the pathophysiology of limbal ischemia and neovascularization.
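RANSAC fits a geometric model to minimal random samples of the data and keeps the hypothesis with the most inliers, which makes it robust to points (here, vessel signals) that do not lie on the surface being estimated. A 2-D circle version of the idea is sketched below; the 3-D sphere fit for an eyeball surface is analogous, and all names and data are illustrative:

```python
import math
import random

def circle_from_3pts(p1, p2, p3):
    """Circumscribed circle through three points: (cx, cy, r), or None if collinear."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a = x1 * (y2 - y3) - y1 * (x2 - x3) + x2 * y3 - x3 * y2
    if abs(a) < 1e-12:
        return None
    b = ((x1**2 + y1**2) * (y3 - y2) + (x2**2 + y2**2) * (y1 - y3)
         + (x3**2 + y3**2) * (y2 - y1))
    c = ((x1**2 + y1**2) * (x2 - x3) + (x2**2 + y2**2) * (x3 - x1)
         + (x3**2 + y3**2) * (x1 - x2))
    cx, cy = -b / (2 * a), -c / (2 * a)
    return cx, cy, math.hypot(cx - x1, cy - y1)

def ransac_circle(points, n_iter=200, tol=0.2, seed=1):
    """Repeatedly fit a circle to a random 3-point sample and keep the
    model with the most inliers (|distance to center - r| <= tol)."""
    rng = random.Random(seed)
    best, best_inliers = None, -1
    for _ in range(n_iter):
        model = circle_from_3pts(*rng.sample(points, 3))
        if model is None:
            continue
        cx, cy, r = model
        inliers = sum(1 for x, y in points
                      if abs(math.hypot(x - cx, y - cy) - r) <= tol)
        if inliers > best_inliers:
            best, best_inliers = model, inliers
    return best, best_inliers

# 40 points on a circle of radius 5 centered at (1, 2), plus three outliers.
points = [(1 + 5 * math.cos(2 * math.pi * t / 40),
           2 + 5 * math.sin(2 * math.pi * t / 40)) for t in range(40)]
points += [(0.0, 0.0), (10.0, 10.0), (3.0, 3.0)]
(best_cx, best_cy, best_r), n_inliers = ransac_circle(points)
```

The outliers are ignored because no circle through them captures as many inliers as the true surface.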

  9. Access to medical care among persons with musculoskeletal conditions. A study using a random sample of households in San Mateo County, California.

    Science.gov (United States)

    Yelin, E; Bernhard, G; Pflugrad, D

    1995-08-01

    To study access to medical care services, including subspecialty care, among persons with musculoskeletal conditions. In early 1993, a random sample of households in San Mateo County, California, was screened for the presence of household members with musculoskeletal conditions, and a member of each household so identified was administered a structured survey about access to medical care and other related subjects. Eighty-six percent of all persons with a musculoskeletal condition had ever seen at least one physician for the condition, but only 6.5% had ever seen a rheumatologist. Those without health insurance were only 82% as likely as those with health insurance to have ever seen a physician. Most persons with a musculoskeletal condition have seen a physician for the condition, but lack of health insurance significantly reduces the proportion who have done so.

  10. [Qualitative research methodology in health care].

    Science.gov (United States)

    Bedregal, Paula; Besoain, Carolina; Reinoso, Alejandro; Zubarew, Tamara

    2017-03-01

    Health care research requires different methodological approaches, such as qualitative and quantitative analyses, to understand the phenomena under study. Qualitative research is usually the least considered. Central elements of the qualitative method are that the object of study is constituted by perceptions, emotions and beliefs; purposive (non-random) sampling; a circular process of knowledge construction; and methodological rigor throughout the research process, from quality of design to consistency of results. The objective of this work is to contribute to the methodological knowledge about qualitative research in health services, based on the implementation of the study, “The transition process from pediatric to adult services: perspectives from adolescents with chronic diseases, caregivers and health professionals”. The information gathered through the qualitative methodology facilitated the understanding of critical points, barriers and facilitators of the transition process of adolescents with chronic diseases, considering the perspective of users and the health team. This study allowed the design of a transition services model from pediatric to adult health services based on the needs of adolescents with chronic diseases, their caregivers and the health team.

  11. New Methodology of Block Cipher Analysis using Chaos Game

    Directory of Open Access Journals (Sweden)

    Budi Sulistyo

    2011-08-01

    Block cipher analysis covers randomness analysis and cryptanalysis. This paper proposes a new method potentially usable for both. The method uses the true random sequence concept as a reference for measuring the randomness level of a random sequence. Using this concept, the paper defines bias, which represents the deviation of a random sequence from a true random sequence. Here, a block cipher is treated as the mapping function of a discrete-time dynamical system; this framework makes it possible to apply the various analysis techniques developed in the dynamical systems field. There are three main parts to the methodology presented in this paper: the dynamical system framework for block cipher analysis, a new chaos game scheme, and an extended measure concept related to chaos games and fractal analysis. The paper also presents the general procedure of the proposed method: symbolic dynamics analysis of the discrete dynamical system whose mapping function is the block cipher, random sequence construction, use of the random sequence as input to a chaos game scheme, measurement of the chaos game output using the extended measure concept, and analysis of the measurement results. The analysis of a specific real or sample block cipher and the corresponding results are beyond the scope of this paper.
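The chaos-game step can be sketched for a binary sequence: consecutive bit pairs select corners of the unit square, the midpoint rule is iterated, and a box-occupancy count serves as a crude stand-in for the paper's extended measure. A truly random sequence fills the square evenly, while a biased one clusters. The helpers and parameters here are illustrative assumptions:

```python
import random

CORNERS = {(0, 0): (0.0, 0.0), (0, 1): (0.0, 1.0),
           (1, 0): (1.0, 0.0), (1, 1): (1.0, 1.0)}

def chaos_game(bits):
    """Map bit pairs to square corners and iterate the midpoint rule."""
    x, y = 0.5, 0.5
    points = []
    for i in range(0, len(bits) - 1, 2):
        cx, cy = CORNERS[(bits[i], bits[i + 1])]
        x, y = (x + cx) / 2.0, (y + cy) / 2.0
        points.append((x, y))
    return points

def box_occupancy(points, k=8):
    """Fraction of cells in a k x k grid visited at least once."""
    cells = {(min(int(x * k), k - 1), min(int(y * k), k - 1)) for x, y in points}
    return len(cells) / float(k * k)

rng = random.Random(0)
occ_random = box_occupancy(chaos_game([rng.randint(0, 1) for _ in range(4000)]))
occ_constant = box_occupancy(chaos_game([0] * 4000))
```

The all-zero sequence collapses onto one corner, so its occupancy stays near zero, whereas the pseudorandom sequence covers nearly the whole grid.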

  12. Random Sampling of Squamate Reptiles in Spanish Natural Reserves Reveals the Presence of Novel Adenoviruses in Lacertids (Family Lacertidae) and Worm Lizards (Amphisbaenia).

    Science.gov (United States)

    Szirovicza, Leonóra; López, Pilar; Kopena, Renáta; Benkő, Mária; Martín, José; Pénzes, Judit J

    2016-01-01

    Here, we report the results of a large-scale PCR survey on the prevalence and diversity of adenoviruses (AdVs) in samples collected randomly from free-living reptiles. On the territories of the Guadarrama Mountains National Park in Central Spain and of the Chafarinas Islands in North Africa, cloacal swabs were taken from 318 specimens of eight native species representing five squamate reptilian families. The healthy-looking animals had been captured temporarily for physiological and ethological examinations, after which they were released. We found 22 AdV-positive samples in representatives of three species, all from Central Spain. Sequence analysis of the PCR products revealed the existence of three hitherto unknown AdVs in 11 Carpetane rock lizards (Iberolacerta cyreni), nine Iberian worm lizards (Blanus cinereus), and two Iberian green lizards (Lacerta schreiberi), respectively. Phylogeny inference showed every novel putative virus to be a member of the genus Atadenovirus. This is the very first description of the occurrence of AdVs in amphisbaenian and lacertid hosts. Unlike all squamate atadenoviruses examined previously, two of the novel putative AdVs had A+T rich DNA, a feature generally deemed to mirror previous host switch events. Our results shed new light on the diversity and evolution of atadenoviruses.

  13. Impact of an educational intervention on women's knowledge and acceptability of human papillomavirus self-sampling: a randomized controlled trial in Cameroon.

    Science.gov (United States)

    Sossauer, Gaëtan; Zbinden, Michel; Tebeu, Pierre-Marie; Fosso, Gisèle K; Untiet, Sarah; Vassilakos, Pierre; Petignat, Patrick

    2014-01-01

    Human papillomavirus (HPV) self-sampling (Self-HPV) may be used as a primary cervical cancer screening method in a low resource setting. Our aim was to evaluate whether an educational intervention would improve women's knowledge of and confidence in the Self-HPV method. Women aged between 25 and 65 years old, eligible for cervical cancer screening, were randomly chosen to receive standard information (control group) or standard information followed by an educational intervention (intervention group). Standard information included explanations about what the test detects (HPV), the link between HPV and cervical cancer, and how to perform HPV self-sampling. The educational intervention consisted of a culturally tailored video about HPV, cervical cancer, Self-HPV and its relevance as a screening test. All participants completed a questionnaire that assessed sociodemographic data, women's knowledge about cervical cancer, and acceptability of Self-HPV. A total of 302 women were enrolled in 4 health care centers in Yaoundé and the surrounding countryside. 301 women (149 in the control group and 152 in the intervention group) completed the full process and were included in the analysis. Participants who received the educational intervention had significantly higher knowledge about HPV and cervical cancer than the control group (p < 0.05). Educational intervention promotes an increase in knowledge about HPV and cervical cancer. Further investigation should be conducted to determine if this intervention can be sustained beyond the short term and influences screening behavior. International Standard Randomised Controlled Trial Number (ISRCTN) Register ISRCTN78123709.

  14. What's in a name? The challenge of describing interventions in systematic reviews: analysis of a random sample of reviews of non-pharmacological stroke interventions.

    Science.gov (United States)

    Hoffmann, Tammy C; Walker, Marion F; Langhorne, Peter; Eames, Sally; Thomas, Emma; Glasziou, Paul

    2015-11-17

    To assess, in a sample of systematic reviews of non-pharmacological interventions, the completeness of intervention reporting, identify the most frequently missing elements, and assess review authors' use of and beliefs about providing intervention information. Analysis of a random sample of systematic reviews of non-pharmacological stroke interventions; online survey of review authors. The Cochrane Library and PubMed were searched for potentially eligible systematic reviews and a random sample of these assessed for eligibility until 60 (30 Cochrane, 30 non-Cochrane) eligible reviews were identified. In each review, the completeness of the intervention description in each eligible trial (n=568) was assessed by 2 independent raters using the Template for Intervention Description and Replication (TIDieR) checklist. All review authors (n=46) were invited to complete a survey. Most reviews were missing intervention information for the majority of items. The most incompletely described items were: modifications, fidelity, materials, procedure and tailoring (missing from all interventions in 97%, 90%, 88%, 83% and 83% of reviews, respectively). Items that scored better, but were still incomplete for the majority of reviews, were: 'when and how much' (in 31% of reviews, adequate for all trials; in 57% of reviews, adequate for some trials); intervention mode (in 22% of reviews, adequate for all trials; in 38%, adequate for some trials); and location (in 19% of reviews, adequate for all trials). Of the 33 (71%) authors who responded, 58% reported having further intervention information but not including it, and 70% tried to obtain information. Most focus on intervention reporting has been directed at trials. Poor intervention reporting in stroke systematic reviews is prevalent, compounded by poor trial reporting. Without adequate intervention descriptions, the conduct, usability and interpretation of reviews are restricted and therefore, require action by trialists

  15. Analytical decision model for sample size and effectiveness projections for use in planning a population-based randomized controlled trial of colorectal cancer screening.

    Science.gov (United States)

    Chiu, Sherry Y-H; Malila, Nea; Yen, Amy M-F; Anttila, Ahti; Hakama, Matti; Chen, H-H

    2011-02-01

    Population-based randomized controlled trials (RCTs) often involve enormous costs and long-term follow-up to evaluate primary end points. An analytical decision-simulation model for sample size and effectiveness projections based on primary and surrogate end points is therefore necessary before planning a population-based RCT. Based on a study design similar to two previous RCTs, transition rates were estimated using a five-state natural history model [normal, preclinical detection phase (PCDP) Dukes' A/B, PCDP Dukes' C/D, Clinical Dukes' A/B and Clinical Dukes' C/D]. The Markov cycle tree was assigned transition parameters and variables related to screening and survival rates to simulate the results of 10-year follow-up in the absence of screening for a hypothetical cohort aged 45-74 years. The corresponding screened arm simulated the results after the introduction of population-based screening for colorectal cancer with fecal occult blood test using a stop-screen design. The mean sojourn times in the natural course of the five-state Markov model were estimated as 2.75 years for preclinical Dukes' A/B and 1.38 years for preclinical Dukes' C/D. The expected reductions in mortality and Dukes' C/D were 13% (95% confidence intervals: 7-19%) and 26% (95% confidence intervals: 20-32%), respectively, given a 70% acceptance rate and a 90% colonoscopy referral rate. Sample sizes required were 86,150 and 65,592 subjects for the primary end point and the surrogate end point, respectively, given an incidence rate up to 0.0020 per year. The sample sizes required for primary and surrogate end points and the projection of effectiveness of fecal occult blood test for colorectal cancer screening were developed. Both are essential for planning a population-based RCT. © 2010 Blackwell Publishing Ltd.
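If sojourn times are assumed exponential, a mean sojourn time (MST) translates into a yearly probability of leaving the state, p = 1 − e^(−1/MST), which is the building block of such a five-state Markov chain. The sketch below uses the abstract's MST estimates but makes up the onset rate and the clinical-surfacing split purely for illustration:

```python
import math

def annual_transition_prob(mean_sojourn_years):
    """Exponential sojourn time -> yearly probability of leaving the state."""
    return 1.0 - math.exp(-1.0 / mean_sojourn_years)

# Five states of the natural history model described in the abstract.
STATES = ["normal", "PCDP_A_B", "PCDP_C_D", "clinical_A_B", "clinical_C_D"]

def transition_matrix(onset=0.002, mst_ab=2.75, mst_cd=1.38, p_surface=0.4):
    """Illustrative parameters: 'onset' is the yearly incidence of preclinical
    Dukes' A/B; p_surface is the assumed share of PCDP A/B exits that surface
    clinically rather than progress to PCDP C/D (not the paper's fitted value)."""
    p_ab = annual_transition_prob(mst_ab)
    p_cd = annual_transition_prob(mst_cd)
    return [
        [1 - onset, onset, 0.0, 0.0, 0.0],
        [0.0, 1 - p_ab, p_ab * (1 - p_surface), p_ab * p_surface, 0.0],
        [0.0, 0.0, 1 - p_cd, 0.0, p_cd],
        [0.0, 0.0, 0.0, 1.0, 0.0],
        [0.0, 0.0, 0.0, 0.0, 1.0],
    ]

def step(dist, m):
    """One yearly update of the state-occupancy distribution."""
    return [sum(dist[i] * m[i][j] for i in range(len(m))) for j in range(len(m))]

# 10-year follow-up of a cohort that starts entirely in the normal state.
dist = [1.0, 0.0, 0.0, 0.0, 0.0]
for _ in range(10):
    dist = step(dist, transition_matrix())
```

Each row of the matrix sums to one, and iterating it gives the expected state occupancy of the unscreened cohort year by year.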

  16. Characterizing stand-level forest canopy cover and height using Landsat time series, samples of airborne LiDAR, and the Random Forest algorithm

    Science.gov (United States)

    Ahmed, Oumer S.; Franklin, Steven E.; Wulder, Michael A.; White, Joanne C.

    2015-03-01

    Many forest management activities, including the development of forest inventories, require spatially detailed forest canopy cover and height data. Among the various remote sensing technologies, LiDAR (Light Detection and Ranging) offers the most accurate and consistent means of obtaining reliable canopy structure measurements. A potential solution to reduce the cost of LiDAR data is to integrate transects (samples) of LiDAR data with frequently acquired and spatially comprehensive optical remotely sensed data. Although multiple regression is commonly used for such modeling, it often does not fully capture the complex relationships between forest structure variables. This study investigates the potential of Random Forest (RF), a machine learning technique, to estimate LiDAR-measured canopy structure using a time series of Landsat imagery. The study is implemented over a 2600 ha area of industrially managed coastal temperate forests on Vancouver Island, British Columbia, Canada. We implemented a trajectory-based approach to time series analysis that generates time since disturbance (TSD) and disturbance intensity information for each pixel, and we used this information to stratify the forest land base into two strata: mature forests and young forests. Canopy cover and height for three forest classes (i.e., mature, young, and combined mature and young) were modeled separately using multiple regression and Random Forest (RF) techniques. For all forest classes, the RF models provided improved estimates relative to the multiple regression models. The lowest validation error was obtained for the mature forest stratum in a RF model (R2 = 0.88, RMSE = 2.39 m and bias = -0.16 for canopy height; R2 = 0.72, RMSE = 0.068% and bias = -0.0049 for canopy cover). This study demonstrates the value of using disturbance and successional history to inform estimates of canopy structure and obtain improved estimates of forest canopy cover and height using the RF algorithm.
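The validation statistics quoted above (R², RMSE, bias) are computed directly from observed (LiDAR) and predicted canopy values; a stdlib sketch with function names of our choosing, not the study's code:

```python
import math

def rmse(obs, pred):
    """Root-mean-square error of predictions against observations."""
    return math.sqrt(sum((p - o) ** 2 for o, p in zip(obs, pred)) / len(obs))

def bias(obs, pred):
    """Mean signed error; negative means the model under-predicts on average."""
    return sum(p - o for o, p in zip(obs, pred)) / len(obs)

def r_squared(obs, pred):
    """Coefficient of determination relative to the mean-only model."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot
```

Comparing these three statistics for the RF and multiple-regression predictions on a held-out validation set is exactly the comparison the abstract reports.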

  17. Employing a Multi-level Approach to Recruit a Representative Sample of Women with Recent Gestational Diabetes Mellitus into a Randomized Lifestyle Intervention Trial.

    Science.gov (United States)

    Nicklas, Jacinda M; Skurnik, Geraldine; Zera, Chloe A; Reforma, Liberty G; Levkoff, Sue E; Seely, Ellen W

    2016-02-01

    The postpartum period is a window of opportunity for diabetes prevention in women with recent gestational diabetes (GDM), but recruitment for clinical trials during this period of life is a major challenge. We adapted a social-ecologic model to develop a multi-level recruitment strategy at the macro (high or institutional level), meso (mid or provider level), and micro (individual) levels. Our goal was to recruit 100 women with recent GDM into the Balance after Baby randomized controlled trial over a 17-month period. Participants were asked to attend three in-person study visits at 6 weeks, 6, and 12 months postpartum. They were randomized into a control arm or a web-based intervention arm at the end of the baseline visit at six weeks postpartum. At the end of the recruitment period, we compared population characteristics of our enrolled subjects to the entire population of women with GDM delivering at Brigham and Women's Hospital (BWH). We successfully recruited 107 of 156 (69 %) women assessed for eligibility, with the majority (92) recruited during pregnancy at a mean 30 (SD ± 5) weeks of gestation, and 15 recruited postpartum, at a mean 2 (SD ± 3) weeks postpartum. 78 subjects attended the initial baseline visit, and 75 subjects were randomized into the trial at a mean 7 (SD ± 2) weeks postpartum. The recruited subjects were similar in age and race/ethnicity to the total population of 538 GDM deliveries at BWH over the 17-month recruitment period. Our multilevel approach allowed us to successfully meet our recruitment goal and recruit a representative sample of women with recent GDM. We believe that our most successful strategies included using a dedicated in-person recruiter, integrating recruitment into clinical flow, allowing for flexibility in recruitment, minimizing barriers to participation, and using an opt-out strategy with providers. Although the majority of women were recruited while pregnant, women recruited in the early postpartum period were

  18. Testing methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low-level radiation exposures lie mainly in the statistical "signal to noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes at the DNA level, while presently less well developed, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs.

  19. A description of the demographic characteristics of the New Zealand non-commercial horse population with data collected using a generalised random-tessellation stratified sampling design.

    Science.gov (United States)

    Rosanowski, S M; Cogger, N; Rogers, C W; Benschop, J; Stevenson, M A

    2012-12-01

    We conducted a cross-sectional survey to determine the demographic characteristics of non-commercial horses in New Zealand. A sampling frame of properties with non-commercial horses was derived from the national farms database, AgriBase™. Horse properties were stratified by property size and a generalised random-tessellated stratified (GRTS) sampling strategy was used to select properties (n=2912) to take part in the survey. The GRTS sampling design allowed for the selection of properties that were spatially balanced relative to the distribution of horse properties throughout the country. The registered decision maker of the property, as identified in AgriBase™, was sent a questionnaire asking them to describe the demographic characteristics of horses on the property, including the number and reason for keeping horses, as well as information about other animals kept on the property and the proximity of boundary neighbours with horses. The response rate to the survey was 38% (1044/2912) and the response rate was not associated with property size or region. A total of 5322 horses were kept for recreation, competition, racing, breeding, stock work, or as pets. The reasons for keeping horses and the number and class of horses varied significantly between regions and by property size. Of the properties sampled, less than half kept horses that could have been registered with Equestrian Sports New Zealand or either of the racing codes. Of the respondents that reported knowing whether their neighbours had horses, 58.6% (455/776) of properties had at least one boundary neighbour that kept horses. The results of this study have important implications for New Zealand, which has an equine population that is naïve to many equine diseases considered endemic worldwide. The ability to identify, and apply accurate knowledge of the population at risk to infectious disease control strategies would lead to more effective strategies to control and prevent disease spread during an

  20. Response surface methodology based on central composite design as a chemometric tool for optimization of dispersive-solidification liquid-liquid microextraction for speciation of inorganic arsenic in environmental water samples.

    Science.gov (United States)

    Asadollahzadeh, Mehdi; Tavakoli, Hamed; Torab-Mostaedi, Meisam; Hosseini, Ghaffar; Hemmati, Alireza

    2014-06-01

    Dispersive-solidification liquid-liquid microextraction (DSLLME) coupled with electrothermal atomic absorption spectrometry (ETAAS) was developed for preconcentration and determination of inorganic arsenic (III, V) in water samples. At pH=1, As(III) formed a complex with ammonium pyrrolidine dithiocarbamate (APDC) and was extracted into the fine droplets of 1-dodecanol (extraction solvent), which were dispersed with ethanol (disperser solvent) into the water sample solution. After extraction, the organic phase was separated by centrifugation and solidified by transferring into an ice bath. The solidified solvent was transferred to a conical vial and melted quickly at room temperature. As(III) was determined in the melted organic phase while As(V) remained in the aqueous layer. Total inorganic As was determined after the reduction of the pentavalent forms of arsenic with sodium thiosulphate and potassium iodide. As(V) was calculated from the difference between the concentration of total inorganic As and As(III). The variables of interest in the DSLLME method (the volumes of extraction and disperser solvents, pH, concentration of APDC (chelating agent), extraction time, and salt effect) were optimized with the aid of chemometric approaches. First, in screening experiments, fractional factorial design (FFD) was used to select the variables that significantly affected the extraction procedure. Afterwards, the significant variables were optimized using response surface methodology (RSM) based on central composite design (CCD). Under the optimum conditions, the proposed method was successfully applied to the determination of inorganic arsenic in different environmental water samples and certified reference material (NIST RSM 1643e). Copyright © 2014 Elsevier B.V. All rights reserved.
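A central composite design in coded units combines 2^k factorial corners, 2k axial (star) points at ±α, and replicated centre runs; with α = (2^k)^(1/4) the design is rotatable. A sketch of the design-matrix generation (the helper is illustrative; with the six DSLLME variables above, k = 6):

```python
import itertools

def central_composite_design(k, n_center=4):
    """Coded-unit CCD: 2^k factorial corners, 2k axial points at +/- alpha,
    and n_center centre runs; alpha = (2^k)^(1/4) for rotatability."""
    alpha = (2 ** k) ** 0.25
    design = [list(p) for p in itertools.product((-1.0, 1.0), repeat=k)]
    for i in range(k):
        for sign in (-alpha, alpha):
            axial = [0.0] * k
            axial[i] = sign
            design.append(axial)
    design.extend([0.0] * k for _ in range(n_center))
    return design
```

For k = 2 this yields 4 + 4 + 4 = 12 runs with α = √2; the design matrix is then fitted with a full quadratic response surface model.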

  1. Rat sperm motility analysis: methodologic considerations

    Science.gov (United States)

    The objective of these studies was to optimize conditions for computer-assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodologic issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample c...

  2. Application of bimodal distribution to the detection of changes in uranium concentration in drinking water collected by random daytime sampling method from a large water supply zone.

    Science.gov (United States)

    Garboś, Sławomir; Święcicka, Dorota

    2015-11-01

    The random daytime (RDT) sampling method was used for the first time in the assessment of average weekly exposure to uranium through drinking water in a large water supply zone. The data set of uranium concentrations determined in 106 RDT samples, collected in three runs from the water supply zone in Wroclaw (Poland), cannot be adequately described by a normal or log-normal distribution. Therefore, a numerical method designed for the detection and calculation of bimodal distributions was applied. The two extracted distributions, containing data from the summer season of 2011 and the winter season of 2012 (nI=72) and from the summer season of 2013 (nII=34), allowed the mean U concentrations in drinking water to be estimated as 0.947 μg/L and 1.23 μg/L, respectively. As the removal efficiency of uranium during the applied treatment process is negligible, the increase in uranium concentration can be explained by a higher U concentration in the surface-infiltration water used for the production of drinking water. During the summer season of 2013, heavy rains were observed in the Lower Silesia region, causing floods over the territory of the entire region. Fluctuations in uranium concentrations in surface-infiltration water can be attributed to releases of uranium from specific sources - migration from phosphate fertilizers and leaching from mineral deposits. Thus, exposure to uranium through drinking water may increase during extreme rainfall events. The average chronic weekly intakes of uranium through drinking water, estimated on the basis of the central values of the extracted normal distributions, accounted for 3.2% and 4.1% of the tolerable weekly intake. Copyright © 2015 Elsevier Ltd. All rights reserved.
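Decomposing a bimodal data set into two overlapping normal components amounts to fitting a two-component Gaussian mixture; a plain EM sketch on simulated concentrations (the data, seed, and `fit_two_normals` helper are illustrative assumptions, not the paper's numerical method):

```python
import math
import random

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def fit_two_normals(xs, n_iter=100):
    """Plain EM for a two-component Gaussian mixture, initialised at the
    sample extremes; returns (weight1, mu1, sigma1, mu2, sigma2)."""
    mu1, mu2 = min(xs), max(xs)
    s1 = s2 = (max(xs) - min(xs)) / 4.0
    w = 0.5
    for _ in range(n_iter):
        # E-step: responsibility of component 1 for each observation.
        resp = []
        for x in xs:
            a = w * normal_pdf(x, mu1, s1)
            b = (1 - w) * normal_pdf(x, mu2, s2)
            resp.append(a / (a + b))
        # M-step: weighted means, standard deviations, and mixing weight.
        n1 = sum(resp)
        n2 = len(xs) - n1
        mu1 = sum(r * x for r, x in zip(resp, xs)) / n1
        mu2 = sum((1 - r) * x for r, x in zip(resp, xs)) / n2
        s1 = max(math.sqrt(sum(r * (x - mu1) ** 2 for r, x in zip(resp, xs)) / n1), 1e-6)
        s2 = max(math.sqrt(sum((1 - r) * (x - mu2) ** 2 for r, x in zip(resp, xs)) / n2), 1e-6)
        w = n1 / len(xs)
    return w, mu1, s1, mu2, s2

# Simulated concentrations loosely echoing the two seasonal groups above.
rng = random.Random(3)
sample = ([rng.gauss(0.95, 0.05) for _ in range(72)]
          + [rng.gauss(1.23, 0.05) for _ in range(34)])
w, mu1, s1, mu2, s2 = fit_two_normals(sample)
```

With well-separated components, EM recovers the two seasonal means and their mixing proportion from the pooled sample alone.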

  3. Association between Spouse/Child Separation and Migration-Related Stress among a Random Sample of Rural-to-Urban Migrants in Wuhan, China.

    Science.gov (United States)

    Guo, Yan; Chen, Xinguang; Gong, Jie; Li, Fang; Zhu, Chaoyang; Yan, Yaqiong; Wang, Liang

    2016-01-01

    Millions of people move from rural to urban areas in China to pursue new opportunities while leaving their spouses and children at rural homes. Little is known about the impact of migration-related separation on the mental health of these rural migrants in urban China. Survey data from a random sample of rural-to-urban migrants (n = 1113, aged 18-45) from Wuhan were analyzed. The Domestic Migration Stress Questionnaire (DMSQ), an instrument with four subconstructs, was used to measure migration-related stress. The relationship between spouse/child separation and stress was assessed using survey estimation methods to account for the multi-level sampling design. 16.46% of couples were separated from their spouses (spouse separation only), and 25.81% of parents were separated from their children (child separation only). Among participants who were married and had children, 5.97% were separated from both their spouses and children (double separation). Participants with spouse separation only or double separation did not score significantly higher on the DMSQ than those with no separation. Compared to parents without child separation, parents with child separation scored significantly higher on the DMSQ (mean score = 2.88, 95% CI: [2.81, 2.95] vs. 2.60 [2.53, 2.67]); this difference held for child separation only and for female participants. Child separation is an important source of migration-related stress, and the effect is particularly strong for migrant women. Public policies and intervention programs should consider these factors to encourage and facilitate the co-migration of parents with their children to mitigate migration-related stress.

  4. Influence of cigarette smoking on coronary artery and aortic calcium among random samples from populations of middle-age Japanese and Korean men

    Science.gov (United States)

    Hirooka, Nobutaka; Kadowaki, Takashi; Sekikawa, Akira; Ueshima, Hirotsugu; Choo, Jina; Miura, Katsuyuki; Okamura, Tomonori; Fujiyoshi, Akira; Kadowaki, Sayaka; Kadota, Aya; Nakamura, Yasuyuki; Maegawa, Hiroshi; Kashiwagi, Atsunori; Masaki, Kamal; Sutton-Tyrrell, Kim; Kuller, Lewis H.; Curb, J. David; Shin, Chol

    2012-01-01

    Background Cigarette smoking is a risk factor for coronary heart disease (CHD). Vascular calcification, such as coronary artery calcium (CAC) and aortic calcium (AC), is associated with CHD. We hypothesized that cigarette smoking is associated with coronary artery and aortic calcifications in Japanese and Koreans, populations with high smoking prevalence. Methods Random samples from populations of 313 Japanese and 302 Korean men aged 40 to 49 were examined for calcification of the coronary artery and aorta using electron beam computed tomography. Coronary artery calcium (CAC) and aortic calcium (AC) were quantified using the Agatston score. We examined the associations of cigarette smoking with CAC and AC after adjusting for conventional risk factors and alcohol consumption. Current and past smokers were combined and categorized into two groups using median pack-years as a cutoff point in each of the Japanese and Korean samples. The never-smoker group was used as the reference for the multiple logistic regression analyses. Results The odds ratio of CAC (score ≥10) for smokers with higher pack-years was 2.9 in Japanese; the association was non-significant in Koreans, compared to never smokers. The odds ratio of AC (score ≥100) for smokers with higher pack-years was 10.4 in Japanese; the association was also significant in Koreans. Conclusions Cigarette smoking with higher pack-years is significantly associated with CAC and AC in Japanese men, while cigarette smoking with higher pack-years is significantly associated with AC but not significantly with CAC in Korean men. PMID:22844083

  5. Influence of cigarette smoking on coronary artery and aortic calcium among random samples from populations of middle-aged Japanese and Korean men.

    Science.gov (United States)

    Hirooka, Nobutaka; Kadowaki, Takashi; Sekikawa, Akira; Ueshima, Hirotsugu; Choo, Jina; Miura, Katsuyuki; Okamura, Tomonori; Fujiyoshi, Akira; Kadowaki, Sayaka; Kadota, Aya; Nakamura, Yasuyuki; Maegawa, Hiroshi; Kashiwagi, Atsunori; Masaki, Kamal; Sutton-Tyrrell, Kim; Kuller, Lewis H; Curb, J David; Shin, Chol

    2013-02-01

    Cigarette smoking is a risk factor for coronary heart disease. Vascular calcification, such as coronary artery calcium (CAC) and aortic calcium (AC), is associated with coronary heart disease. The authors hypothesised that cigarette smoking is associated with coronary artery and aortic calcifications in Japanese and Koreans, populations with high smoking prevalence. Random samples from populations of 313 Japanese and 302 Korean men aged 40-49 years were examined for calcification of the coronary artery and aorta using electron beam CT. CAC and AC were quantified using the Agatston score. The authors examined the associations of cigarette smoking with CAC and AC after adjusting for conventional risk factors and alcohol consumption. Current and past smokers were combined and categorised into two groups using median pack-years as a cut-off point in each of the Japanese and Korean samples. The never-smoker group was used as the reference for the multiple logistic regression analyses. The OR of CAC (score ≥10) for smokers with higher pack-years was 2.9 in Japanese; the association was non-significant in Koreans, compared with never-smokers. The OR of AC (score ≥100) for smokers with higher pack-years was 10.4 in Japanese; the association was also significant in Koreans. Cigarette smoking with higher pack-years is significantly associated with CAC and AC in Japanese men, while cigarette smoking with higher pack-years is significantly associated with AC but not significantly with CAC in Korean men.
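    The odds ratios above come from adjusted logistic regression, but the underlying quantity is simple: the odds of the outcome among the exposed divided by the odds among the unexposed. A sketch of the unadjusted version from a 2×2 table, with made-up counts chosen only to illustrate the arithmetic:

```python
def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """Unadjusted odds ratio: (a/b) / (c/d) for a 2x2 exposure-outcome table."""
    return (exposed_cases / exposed_noncases) / (unexposed_cases / unexposed_noncases)

# Hypothetical counts: CAC score >= 10 among high pack-year smokers vs never smokers.
or_cac = odds_ratio(29, 50, 10, 50)   # (29/50) / (10/50) = 2.9
```

    An adjusted OR, as in the study, is instead exp(beta) for the exposure coefficient of a fitted multiple logistic regression; the 2×2 form shows what that coefficient summarizes.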

  6. A profile of US-Mexico border mobility among a stratified random sample of Hispanics living in the El Paso-Juarez area.

    Science.gov (United States)

    Lapeyrouse, L M; Morera, O; Heyman, J M C; Amaya, M A; Pingitore, N E; Balcazar, H

    2012-04-01

    Examination of border-specific characteristics such as trans-border mobility and trans-border health service use illuminates the heterogeneity of border Hispanics and may provide greater insight toward understanding differential health behaviors and status among these populations. In this study, we create a descriptive profile of the concept of trans-border mobility by exploring the relationship between mobility status and a series of demographic, economic and socio-cultural characteristics among mobile and non-mobile Hispanics living in the El Paso-Juarez border region. Using a two-stage stratified random sampling design, bilingual interviewers collected survey data from border residents (n = 1,002). Findings show that significant economic, cultural, and behavioral differences exist between mobile and non-mobile respondents. While non-mobile respondents were found to have higher socioeconomic status than their mobile counterparts, mobility across the border was found to offer less acculturated and poorer Hispanics access to alternative sources of health care and other services.
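    A two-stage stratified design of the kind described draws clusters within each stratum first, then units within each selected cluster. A minimal sketch; the stratum names, cluster counts, and sizes below are illustrative, not the study's actual sampling frame:

```python
import random

def two_stage_stratified_sample(frame, n_clusters_per_stratum, n_units_per_cluster, rng):
    """Stage 1: sample clusters within each stratum.
    Stage 2: sample units within each selected cluster.
    `frame` maps stratum -> {cluster_id: [unit, ...]}."""
    chosen = []
    for stratum, clusters in frame.items():
        cluster_ids = rng.sample(sorted(clusters), n_clusters_per_stratum)
        for cid in cluster_ids:
            units = clusters[cid]
            k = min(n_units_per_cluster, len(units))
            chosen.extend((stratum, cid, u) for u in rng.sample(units, k))
    return chosen

# Illustrative frame: 2 strata x 5 clusters x 20 households each.
rng = random.Random(7)
frame = {
    s: {c: [f"{s}-{c}-{h}" for h in range(20)] for c in range(5)}
    for s in ("urban", "rural")
}
sample = two_stage_stratified_sample(frame, n_clusters_per_stratum=2,
                                     n_units_per_cluster=10, rng=rng)
```

    Stratifying first guarantees representation of every stratum; clustering keeps fieldwork (here, interviewer travel) concentrated in a manageable number of areas.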

  7. Physical abuse of older adults in nursing homes: a random sample survey of adults with an elderly family member in a nursing home.

    Science.gov (United States)

    Schiamberg, Lawrence B; Oehmke, James; Zhang, Zhenmei; Barboza, Gia E; Griffore, Robert J; Von Heydrich, Levente; Post, Lori A; Weatherill, Robin P; Mastin, Teresa

    2012-01-01

    Few empirical studies have focused on elder abuse in nursing home settings. The present study investigated the prevalence and risk factors of staff physical abuse among elderly individuals receiving nursing home care in Michigan. A random sample of 452 adults with an elderly relative (older than 65 years) in nursing home care completed a telephone survey regarding elder abuse and neglect experienced by this elder family member in the care setting. Some 24.3% of respondents reported at least one incident of physical abuse by nursing home staff. A logistic regression model was used to estimate the importance of various risk factors in nursing home abuse. Limitations in activities of daily living (ADLs), older adult behavioral difficulties, and previous victimization by nonstaff perpetrators were associated with a greater likelihood of physical abuse. Interventions that address these risk factors may be effective in reducing older adult physical abuse in nursing homes. Attention to the contextual or ecological character of nursing home abuse is essential, particularly in light of the findings of this study.

  8. Thermal discomfort with cold extremities in relation to age, gender, and body mass index in a random sample of a Swiss urban population

    Directory of Open Access Journals (Sweden)

    Orgül Selim

    2010-06-01

    Full Text Available Abstract Background The aim of this epidemiological study was to investigate the relationship of thermal discomfort with cold extremities (TDCE) to age, gender, and body mass index (BMI) in a Swiss urban population. Methods In a random population sample of Basel city, 2,800 subjects aged 20-40 years were asked to complete a questionnaire evaluating the extent of cold extremities. Values of cold extremities were based on questionnaire-derived scores. The correlation of age, gender, and BMI to TDCE was analyzed using multiple regression analysis. Results A total of 1,001 women (72.3% response rate) and 809 men (60% response rate) returned a completed questionnaire. Statistical analyses revealed the following findings: younger subjects suffered more intensely from cold extremities than the elderly, and women suffered more than men (particularly younger women). Slimmer subjects suffered significantly more often from cold extremities than subjects with higher BMIs. Conclusions Thermal discomfort with cold extremities (a relevant symptom of primary vascular dysregulation) occurs at highest intensity in younger, slimmer women and at lowest intensity in elderly, stouter men.

  9. Predictors of poor retention on antiretroviral therapy as a major HIV drug resistance early warning indicator in Cameroon: results from a nationwide systematic random sampling

    Directory of Open Access Journals (Sweden)

    Serge Clotaire Billong

    2016-11-01

    Full Text Available Abstract Background Retention on lifelong antiretroviral therapy (ART) is essential in sustaining treatment success while preventing HIV drug resistance (HIVDR), especially in resource-limited settings (RLS). In an era of rising numbers of patients on ART, keeping track of patients in care is becoming more strategic for programmatic interventions. Due to lapses and uncertainty with the current WHO sampling approach in Cameroon, we aimed to ascertain the national performance of, and determinants in, retention on ART at 12 months. Methods Using systematic random sampling, a survey was conducted in the ten regions (56 sites) of Cameroon, within the "reporting period" of October 2013-November 2014, enrolling 5005 eligible adults and children. Performance in retention on ART at 12 months was interpreted following the definition of the HIVDR early warning indicator: excellent (>85%), fair (75-85%), poor (<75%); factors with p-value < 0.01 were considered statistically significant. Results The majority (74.4%) of patients were in urban settings, and 50.9% were managed in reference treatment centres. Nationwide, retention on ART at 12 months was 60.4% (2023/3349); only six sites and one region achieved acceptable performances. Retention performance varied between reference treatment centres (54.2%) and management units (66.8%), p < 0.0001; men (57.1%) and women (62.0%), p = 0.007; and WHO clinical stage I (63.3%) and other stages (55.6%), p = 0.007; but neither by age (adults [60.3%] vs. children [58.8%], p = 0.730) nor by immune status (CD4 351-500 [65.9%] vs. other CD4 staging [59.86%], p = 0.077). Conclusions Poor retention in care within 12 months of ART initiation urges active search for those lost to follow-up, targeting preferentially male and symptomatic patients, especially within reference ART clinics. Such a sampling strategy could be further strengthened for informed ART monitoring and HIVDR prevention perspectives.
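    Systematic random sampling, as used in this survey, picks a random start in the sampling frame and then takes every k-th record. A minimal sketch; the sample size echoes the study's 5005 enrollees, but the frame size is invented for illustration:

```python
import random

def systematic_sample(population, n, rng):
    """Systematic random sample of size n: random start in [0, k),
    then every k-th element, where k = len(population) // n."""
    k = len(population) // n
    start = rng.randrange(k)
    return [population[start + i * k] for i in range(n)]

rng = random.Random(1)
patients = list(range(56_000))            # illustrative frame of patient records
survey = systematic_sample(patients, 5005, rng)
```

    The single random start makes the draw cheap to implement in the field (e.g. every 11th chart), at the cost that any periodicity in the frame ordering can bias the sample.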

  10. Modern survey sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Exposure to Sampling: Abstract; Introduction; Concepts of Population, Sample, and Sampling. Initial Ramifications: Abstract; Introduction; Sampling Design, Sampling Scheme; Random Numbers and Their Uses in Simple Random Sampling (SRS); Drawing Simple Random Samples with and without Replacement; Estimation of Mean, Total, Ratio of Totals/Means: Variance and Variance Estimation; Determination of Sample Sizes; A.2 Appendix to Chapter 2; A. More on Equal Probability Sampling; A. Horvitz-Thompson Estimator; A. Sufficiency; A. Likelihood; A. Non-Existence Theorem. More Intricacies: Abstract; Introduction; Unequal Probability Sampling Strategies; PPS Sampling. Exploring Improved Ways: Abstract; Introduction; Stratified Sampling; Cluster Sampling; Multi-Stage Sampling; Multi-Phase Sampling: Ratio and Regression Estimation; Controlled Sampling. Modeling: Introduction; Super-Population Modeling; Prediction Approach; Model-Assisted Approach; Bayesian Methods; Spatial Smoothing; Sampling on Successive Occasions: Panel Rotation; Non-Response and Not-at-Homes; Weighting Adj...
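    One item in the contents above, the Horvitz-Thompson estimator, is the workhorse of design-based estimation: it estimates a population total by weighting each sampled value by the inverse of its inclusion probability. A toy sketch (values and probabilities are illustrative):

```python
def horvitz_thompson_total(sample_values, inclusion_probs):
    """Horvitz-Thompson estimator of a population total:
    sum of y_i / pi_i over the sampled units."""
    return sum(y / p for y, p in zip(sample_values, inclusion_probs))

# Toy population y = (2, 4, 6), true total 12.  An SRS of 2 of the 3 units
# gives every unit inclusion probability 2/3.  Suppose units 0 and 2 are drawn:
est = horvitz_thompson_total([2, 6], [2 / 3, 2 / 3])
# Averaging the estimator over all three possible samples
# ((2+4)*1.5 = 9, (2+6)*1.5 = 12, (4+6)*1.5 = 15) gives 12: it is unbiased.
```

    Unequal inclusion probabilities (e.g. PPS sampling, also in the contents) plug into the same formula unchanged.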

  11. Optimization of solid-phase extraction using artificial neural networks and response surface methodology in combination with experimental design for determination of gold by atomic absorption spectrometry in industrial wastewater samples.

    Science.gov (United States)

    Ebrahimzadeh, H; Tavassoli, N; Sadeghi, O; Amini, M M

    2012-08-15

    Solid-phase extraction (SPE) is often used for preconcentration and determination of metal ions in industrial and natural samples. A traditional single-variable approach (SVA) is still often used for optimization in analytical chemistry. Since there is always a risk of not finding the real optimum with a single-variable method, more advanced optimization approaches such as the multivariable approach (MVA) should be applied. Applying MVA optimization can save both time and chemical materials, and consequently decrease analytical costs. Nowadays, the use of artificial neural networks (ANN) and response surface methodology (RSM) in combination with experimental design (MVA) is rapidly developing. After prediction of the model equation in RSM and training of the artificial neurons in the ANN, both models were used to estimate the responses of the 27 experimental runs. In the present work, optimization of SPE by the single-variable method is compared with optimization by ANN and RSM in combination with central composite design (CCD), and the latter approach is practically illustrated. Copyright © 2012 Elsevier B.V. All rights reserved.
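    A central composite design consists of 2^k factorial corners, 2k axial ("star") points, and replicated center points. The abstract's exact design is not stated, but one configuration consistent with its 27 runs is four factors with three center points (16 + 8 + 3 = 27). A sketch of generating CCD points in coded units:

```python
import itertools

def central_composite_design(k, n_center=1):
    """Points of a rotatable central composite design in coded units:
    2^k factorial corners, 2k axial points at +/- alpha, and replicated
    center points, with alpha = (2^k)^(1/4) for rotatability."""
    alpha = (2 ** k) ** 0.25
    corners = [list(p) for p in itertools.product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for sign in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = sign
            axial.append(pt)
    centers = [[0.0] * k for _ in range(n_center)]
    return corners + axial + centers

design = central_composite_design(k=3, n_center=6)   # 8 + 6 + 6 = 20 runs
```

    The responses measured at these points are then fitted with a second-order polynomial (the RSM model) or used as training data for an ANN.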

  12. An Abecedary of Sampling.

    Science.gov (United States)

    Doyle, Kenneth O., Jr.

    1979-01-01

    The vocabulary of sampling is examined in order to provide a clear understanding of basic sampling concepts. The basic vocabulary of sampling (population, probability sampling, precision and bias, stratification), the fundamental grammar of sampling (random sample), sample size and response rate, and cluster, multiphase, snowball, and panel…

  13. Self-help interventions for adjustment disorder problems: a randomized waiting-list controlled study in a sample of burglary victims.

    Science.gov (United States)

    Bachem, Rahel; Maercker, Andreas

    2016-09-01

    Adjustment disorders (AjD) are among the most frequent mental disorders yet often remain untreated. The high prevalence, comparatively mild symptom impairment, and transient nature make AjD a promising target for low-threshold self-help interventions. Bibliotherapy represents a potential treatment for AjD problems. This study investigates the effectiveness of a cognitive behavioral self-help manual specifically directed at alleviating AjD symptoms in a homogenous sample of burglary victims. Participants with clinical or subclinical AjD symptoms following experience of burglary were randomized to an intervention group (n = 30) or waiting-list control group (n = 24). The new explicit stress response syndrome model for diagnosing AjD was applied. Participants received no therapist support and assessments took place at baseline, after the one-month intervention, and at three-month follow-up. Based on completer analyses, group by time interactions indicated that the intervention group showed more improvement in AjD symptoms of preoccupation and in post-traumatic stress symptoms. Post-intervention between-group effect sizes ranged from Cohen's d = .17 to .67 and the proportion of participants showing reliable change was consistently higher in the intervention group than in the control group. Engagement with the self-help manual was high: 87% of participants had worked through at least half the manual. This is the first published RCT of a bibliotherapeutic self-help intervention for AjD problems. The findings provide evidence that a low-threshold self-help intervention without therapist contact is a feasible and effective treatment for symptoms of AjD.
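    The between-group effect sizes (Cohen's d) and "reliable change" classification reported above can be computed as follows; the groups and the reliability value in the example are illustrative, not the trial's data:

```python
import math
import statistics

def cohens_d(group_a, group_b):
    """Between-group effect size: mean difference over the pooled SD."""
    na, nb = len(group_a), len(group_b)
    va, vb = statistics.variance(group_a), statistics.variance(group_b)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled_sd

def reliable_change_index(pre, post, sd_baseline, reliability):
    """Jacobson-Truax RCI: score change divided by the standard error of
    the difference; |RCI| > 1.96 is conventionally 'reliable change'."""
    se_diff = sd_baseline * math.sqrt(2 * (1 - reliability))
    return (post - pre) / se_diff

d = cohens_d([4, 5, 6], [3, 4, 5])   # means 5 vs 4, pooled SD 1 -> d = 1.0
```

    By the usual rules of thumb, the trial's d = .17 to .67 spans small to medium-large effects.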

  14. 22G versus 25G biopsy needles for EUS-guided tissue sampling of solid pancreatic masses: a randomized controlled study.

    Science.gov (United States)

    Woo, Young Sik; Lee, Kwang Hyuck; Noh, Dong Hyo; Park, Joo Kyung; Lee, Kyu Taek; Lee, Jong Kyun; Jang, Kee-Taek

    2017-12-01

    No comparative study of 22-gauge biopsy needles (PC22) and 25-gauge biopsy needles (PC25) has been conducted. We prospectively compared the diagnostic accuracy of the PC22 and PC25 in patients with pancreatic and peripancreatic solid masses. We conducted a randomized noninferiority clinical study from January 2013 to May 2014 at Samsung Medical Center. A cytological and histological specimen from each pass was analyzed separately by an experienced pathologist. The primary outcome was the diagnostic accuracy of the PC22 versus the PC25. Secondary outcomes included the optimal number of passes for adequate diagnosis, core specimen yield, sample adequacy, and complication rates. Diagnostic accuracy combining cytology with histology in three cumulative passes was 97.1% (100/103) in the PC22 group and 91.3% (94/103) in the PC25 group. Thus, noninferiority of the PC25 to the PC22 was not shown with a 10% noninferiority margin (difference, -5.8%; 95% CI, -12.1 to -0.5%). In a pairwise comparison within each needle type, two passes were non-inferior to three passes in the PC22 group (96.1% vs. 97.1%; difference, -0.97%; 95% CI, -6.63 to 4.69%), but noninferiority of two passes to three passes was not shown in the PC25 group (87.4% vs. 91.3%; difference, -3.88%; 95% CI, -13.5 to 5.7%). Noninferiority of PC25 to PC22 diagnostic accuracy was not observed for solid pancreatic or peripancreatic masses without on-site cytology. The PC22 may be a more ideal device because only two PC22 needle passes were sufficient to establish an adequate diagnosis, whereas the PC25 required three or more needle passes.
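    The noninferiority conclusion rests on comparing the confidence interval for the accuracy difference to the -10% margin. A plain Wald interval is sketched below; the paper's exact interval method is not stated here, and score-based intervals (common in noninferiority analyses) give slightly different limits:

```python
import math

def diff_proportions_ci(x1, n1, x2, n2, z=1.96):
    """Wald 95% CI for p1 - p2 (a simple large-sample approximation)."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# PC25 minus PC22 accuracy (94/103 vs 100/103), -10% noninferiority margin:
diff, lo, hi = diff_proportions_ci(94, 103, 100, 103)
noninferior = lo > -0.10   # shown only if the whole CI stays above the margin
```

    With these counts the lower bound falls below -10%, so noninferiority is not demonstrated, matching the trial's conclusion.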

  15. Chronic kidney disease in the type 2 diabetic patients: prevalence and associated variables in a random sample of 2642 patients of a Mediterranean area

    Directory of Open Access Journals (Sweden)

    Coll-de-Tuero Gabriel

    2012-08-01

    Full Text Available Abstract Background Kidney disease is associated with increased total mortality and cardiovascular morbimortality in the general population and in patients with type 2 diabetes. The aim of this study is to determine the prevalence of kidney disease and of different types of renal disease in patients with type 2 diabetes (T2DM). Methods Cross-sectional study in a random sample of 2,642 T2DM patients cared for in primary care during 2007. Studied variables: demographic and clinical characteristics, pharmacological treatments and T2DM complications (diabetic foot, retinopathy, coronary heart disease and stroke). Variables of renal function were defined as follows: 1) Microalbuminuria: albumin excretion rate >30 mg/g or 3.5 mg/mmol; 2) Macroalbuminuria: albumin excretion rate >300 mg/g or 35 mg/mmol; 3) Kidney disease (KD): glomerular filtration rate <60 ml/min/1.73 m2 according to Modification of Diet in Renal Disease and/or the presence of albuminuria; 4) Renal impairment (RI): glomerular filtration rate <60 ml/min/1.73 m2; 5) Nonalbuminuric RI: glomerular filtration rate <60 ml/min/1.73 m2 without albuminuria; and 6) Diabetic nephropathy (DN): macroalbuminuria, or microalbuminuria plus diabetic retinopathy. Results The prevalence of the different types of renal disease was: 34.1% KD, 22.9% RI, 19.5% albuminuria and 16.4% diabetic nephropathy (DN). The prevalence of albuminuria without RI (13.5%) and of nonalbuminuric RI (14.7%) was similar. After adjusting for age, BMI, cholesterol, blood pressure and macrovascular disease, RI was significantly associated with the female gender (OR 2.20; 95% CI 1.86-2.59), microvascular disease (OR 2.14; 95% CI 1.8-2.54) and insulin treatment (OR 1.82; 95% CI 1.39-2.38), and inversely associated with HbA1c (OR 0.85 for every 1% increase; 95% CI 0.80-0.91). Albuminuria without RI was inversely associated with the female gender (OR 0.27; 95% CI 0.21-0.35) and duration of diabetes (OR 0.94 per year; 95% CI 0.91-0.97), and directly associated with HbA1c (OR 1.19 for every

  16. Children's Quality of Life Based on the KIDSCREEN-27: Child Self-Report, Parent Ratings and Child-Parent Agreement in a Swedish Random Population Sample.

    Directory of Open Access Journals (Sweden)

    Anne H Berman

    Full Text Available The KIDSCREEN-27 is a measure of child and adolescent quality of life (QoL), with excellent psychometric properties, available in child-report and parent-rating versions in 38 languages. This study provides child-reported and parent-rated norms for the KIDSCREEN-27 among Swedish 11-16 year-olds, as well as child-parent agreement. Sociodemographic correlates of self-reported wellbeing and parent-rated wellbeing were also measured. A random population sample consisting of 600 children aged 11-16, 100 per age group, and one of their parents (N = 1200) were approached for response to self-reported and parent-rated versions of the KIDSCREEN-27. Parents were also asked about their education, employment status and their own QoL based on the 26-item WHOQOL-Bref. Based on the final sampling pool of 1158 persons, a 34.8% response rate of 403 individuals was obtained, including 175 child-parent pairs, 27 child singleton responders and 26 parent singletons. Gender and age differences for parent ratings and child-reported data were analyzed using t-tests and the Mann-Whitney U-test. Post-hoc Dunn tests were conducted for pairwise comparisons when the p-value for specific subscales was 0.05 or lower. Child-parent agreement was tested item-by-item, using the Prevalence- and Bias-Adjusted Kappa (PABAK) coefficient for ordinal data (PABAK-OS); dimensional and total score agreement was evaluated based on dichotomous cut-offs for lower wellbeing using the PABAK, and total, continuous scores were evaluated using Bland-Altman plots. Compared to European norms, Swedish children in this sample scored lower on Physical wellbeing (48.8 SE/49.94 EU) but higher on the other KIDSCREEN-27 dimensions: Psychological wellbeing (53.4/49.77), Parent relations and autonomy (55.1/49.99), Social support and peers (54.1/49.94) and School (55.8/50.01). Older children self-reported lower wellbeing than younger children. No significant self-reported gender differences occurred and parent ratings

  17. Children’s Quality of Life Based on the KIDSCREEN-27: Child Self-Report, Parent Ratings and Child-Parent Agreement in a Swedish Random Population Sample

    Science.gov (United States)

    Berman, Anne H.; Liu, Bojing; Ullman, Sara; Jadbäck, Isabel; Engström, Karin

    2016-01-01

    Background The KIDSCREEN-27 is a measure of child and adolescent quality of life (QoL), with excellent psychometric properties, available in child-report and parent-rating versions in 38 languages. This study provides child-reported and parent-rated norms for the KIDSCREEN-27 among Swedish 11–16 year-olds, as well as child-parent agreement. Sociodemographic correlates of self-reported wellbeing and parent-rated wellbeing were also measured. Methods A random population sample consisting of 600 children aged 11–16, 100 per age group and one of their parents (N = 1200), were approached for response to self-reported and parent-rated versions of the KIDSCREEN-27. Parents were also asked about their education, employment status and their own QoL based on the 26-item WHOQOL-Bref. Based on the final sampling pool of 1158 persons, a 34.8% response rate of 403 individuals was obtained, including 175 child-parent pairs, 27 child singleton responders and 26 parent singletons. Gender and age differences for parent ratings and child-reported data were analyzed using t-tests and the Mann-Whitney U-test. Post-hoc Dunn tests were conducted for pairwise comparisons when the p-value for specific subscales was 0.05 or lower. Child-parent agreement was tested item-by-item, using the Prevalence- and Bias-Adjusted Kappa (PABAK) coefficient for ordinal data (PABAK-OS); dimensional and total score agreement was evaluated based on dichotomous cut-offs for lower well-being, using the PABAK and total, continuous scores were evaluated using Bland-Altman plots. Results Compared to European norms, Swedish children in this sample scored lower on Physical wellbeing (48.8 SE/49.94 EU) but higher on the other KIDSCREEN-27 dimensions: Psychological wellbeing (53.4/49.77), Parent relations and autonomy (55.1/49.99), Social Support and peers (54.1/49.94) and School (55.8/50.01). Older children self-reported lower wellbeing than younger children. No significant self-reported gender differences

  18. Children's Quality of Life Based on the KIDSCREEN-27: Child Self-Report, Parent Ratings and Child-Parent Agreement in a Swedish Random Population Sample.

    Science.gov (United States)

    Berman, Anne H; Liu, Bojing; Ullman, Sara; Jadbäck, Isabel; Engström, Karin

    2016-01-01

    The KIDSCREEN-27 is a measure of child and adolescent quality of life (QoL), with excellent psychometric properties, available in child-report and parent-rating versions in 38 languages. This study provides child-reported and parent-rated norms for the KIDSCREEN-27 among Swedish 11-16 year-olds, as well as child-parent agreement. Sociodemographic correlates of self-reported wellbeing and parent-rated wellbeing were also measured. A random population sample consisting of 600 children aged 11-16, 100 per age group and one of their parents (N = 1200), were approached for response to self-reported and parent-rated versions of the KIDSCREEN-27. Parents were also asked about their education, employment status and their own QoL based on the 26-item WHOQOL-Bref. Based on the final sampling pool of 1158 persons, a 34.8% response rate of 403 individuals was obtained, including 175 child-parent pairs, 27 child singleton responders and 26 parent singletons. Gender and age differences for parent ratings and child-reported data were analyzed using t-tests and the Mann-Whitney U-test. Post-hoc Dunn tests were conducted for pairwise comparisons when the p-value for specific subscales was 0.05 or lower. Child-parent agreement was tested item-by-item, using the Prevalence- and Bias-Adjusted Kappa (PABAK) coefficient for ordinal data (PABAK-OS); dimensional and total score agreement was evaluated based on dichotomous cut-offs for lower well-being, using the PABAK and total, continuous scores were evaluated using Bland-Altman plots. Compared to European norms, Swedish children in this sample scored lower on Physical wellbeing (48.8 SE/49.94 EU) but higher on the other KIDSCREEN-27 dimensions: Psychological wellbeing (53.4/49.77), Parent relations and autonomy (55.1/49.99), Social Support and peers (54.1/49.94) and School (55.8/50.01). Older children self-reported lower wellbeing than younger children. 
No significant self-reported gender differences occurred and parent ratings showed
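    The PABAK used for the dichotomous agreement analyses above has a very simple form: PABAK = 2·Po − 1, where Po is the observed proportion of agreement. A sketch of the dichotomous case (the ordinal PABAK-OS variant the study also uses generalizes this to weighted multi-category agreement); the ratings below are made up:

```python
def pabak(ratings_a, ratings_b):
    """Prevalence- and Bias-Adjusted Kappa for two raters on a
    dichotomous item: PABAK = 2 * Po - 1, with Po the observed
    proportion of agreement."""
    if len(ratings_a) != len(ratings_b):
        raise ValueError("rating lists must be the same length")
    po = sum(a == b for a, b in zip(ratings_a, ratings_b)) / len(ratings_a)
    return 2 * po - 1

# Hypothetical: 8 of 10 child/parent pairs agree on a dichotomized
# 'lower wellbeing' cutoff, so Po = 0.8 and PABAK = 0.6.
kappa = pabak([1, 1, 1, 1, 0, 0, 0, 0, 1, 0],
              [1, 1, 1, 1, 0, 0, 0, 0, 0, 1])
```

    Unlike Cohen's kappa, PABAK fixes the chance-agreement term at 0.5, so it is not distorted when one category is much more prevalent than the other.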

  19. Randomization-based adjustment of multiple treatment hazard ratios for covariates with missing data.

    Science.gov (United States)

    Lam, Diana; Koch, Gary G; Preisser, John S; Saville, Benjamin R; Hussey, Michael A

    2017-01-01

    Clinical trials are designed to compare treatment effects when applied to samples from the same population. Randomization is used so that the samples are not biased with respect to baseline covariates that may influence the efficacy of the treatment. We develop randomization-based covariance adjustment methodology to estimate the log hazard ratios, and their confidence intervals, of multiple treatments in a randomized clinical trial with time-to-event outcomes and missingness among the baseline covariates. The randomization-based covariance adjustment method is a computationally straightforward way of handling missing baseline covariate values.

  20. A Randomized Clinical Trial of Cogmed Working Memory Training in School-Age Children with ADHD: A Replication in a Diverse Sample Using a Control Condition

    Science.gov (United States)

    Chacko, A.; Bedard, A. C.; Marks, D. J.; Feirsen, N.; Uderman, J. Z.; Chimiklis, A.; Rajwan, E.; Cornwell, M.; Anderson, L.; Zwilling, A.; Ramon, M.

    2014-01-01

    Background: Cogmed Working Memory Training (CWMT) has received considerable attention as a promising intervention for the treatment of Attention-Deficit/Hyperactivity Disorder (ADHD) in children. At the same time, methodological weaknesses in previous clinical trials call into question reported efficacy of CWMT. In particular, lack of equivalence…

  1. The clinical and rehabilitative complexity in dementia with Lewy bodies (DLB): experience on a random sample of elderly patients dwelling in an RSA ("Residenza Sanitaria Assistita") of Catania.

    Science.gov (United States)

    Santangelo, Antonino; Testai', Manuela; Albani, Salvatore; Mamazza, Grazia; Pavano, Salvatore; Zuccaro, Carmela; Atteritano, Marco; Berretta, Massimiliano; Tomarchio, Marcello; Maugeri, Domenico

    2010-01-01

    This study was aimed at evaluating the occurrence of DLB in a sample of elderly patients residing in an assisted sanitary residence (RSA, from the Italian "Residenza Sanitaria Assistita") in the Province of Catania. We considered 126 patients from a randomized population admitted to the RSA of Viagrande (Catania) between 1st March 2005 and 31st March 2007. Those who proved to be demented according to the DSM-III diagnostic protocols, and who had a mini mental state examination (MMSE) score <24, were divided into 2 groups: Group A, all the demented patients without DLB; and Group B, the DLB patients, according to the diagnostic criteria of McKeith. All patients underwent the following psychometric and functional tests at admission, after 1 month, and at discharge: MMSE, geriatric depression scale (GDS) [Yesavage J.A., Brink T.L., Rose T.L., Adey M., 1983. The development and validation of geriatric depression screening scale: a preliminary report. J. Psych. Res. 17, 37], Barthel index (BI), activities of daily living (ADL) and instrumental ADL (IADL). Particular attention was dedicated to the presence of delirium during the last 15 days before admission and during the stay, to mortality, and to the prevalence of other complaints. The observed data confirm frailty in 20% of the DLB patients, a fluctuation of cognitive capacities, a better recovery of affectivity, and reduced functional autonomy and self-sufficiency. In addition, the DLB patients displayed a greater presence of prevalent delirium compared to the total population of demented patients, in whom only incidental delirium episodes occurred during the stay (31.6% vs. 16.6%; p<0.001). In the DLB population decubitus lesions (pressure ulcers) occurred more frequently, and were of more severe staging, compared to the controls (45% vs. 27%; p<0.001). Also, the mortality of the DLB patients was higher (about 30% vs. 17% over 12 months). These data confirm the

  2. The influence factors of medical professionalism: A stratified-random sampling study based on the physicians and patients in ambulatory care clinics of Chengdu, China.

    Science.gov (United States)

    Lin, Yifei; Yin, Senlin; Lai, Sike; Tang, Ji; Huang, Jin; Du, Liang

    2016-10-01

    As the relationship between physicians and patients has deteriorated in China recently, medical conflicts now occur more frequently. Physicians, to a certain extent, also bear some responsibility. Awareness of medical professionalism and its influence factors can help in taking targeted measures and alleviating the conflict. Through a combination of physicians' self-assessment and patients' assessment in ambulatory care clinics in Chengdu, this research aims to evaluate the importance of medical professionalism in hospitals and explore its influence factors, hoping to provide decision-making references to improve this grim situation. From February to March 2013, a cross-sectional study was conducted in 2 tier-3 hospitals, 5 tier-2 hospitals, and 10 community hospitals through a stratified-random sampling of physicians and patients, at a ratio of 1/5. Questionnaires were adapted from a pilot study. A total of 382 physicians and 1910 patients were matched and surveyed. Regarding medical professionalism, the scores of the physicians' self-assessment were 85.18 ± 7.267 out of 100 and the scores of the patient-assessment were 57.66 ± 7.043 out of 70. The influence factors of self-assessment were physicians' working years (P = 0.003) and patients' complaints (P = 0.006), whereas the influence factors of patient-assessment were patients' ages (P = 0.001), their physicians' working years (P < 0.01), and satisfaction with the payment mode (P = 0.006). Higher self-assessment of medical professionalism was associated with physicians with more working years and no complaint history; higher patient-assessment was associated with older patients, physicians with more working years, and higher satisfaction with the payment mode. The government should
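
    The stratified-random sampling design this record describes (strata of tier-3, tier-2, and community hospitals, each sampled at a fixed 1/5 fraction) can be sketched in a few lines. This is a minimal illustration, not the authors' code; the frame, the tier labels, and the per-stratum handling below are invented for the example:

```python
import random

def stratified_sample(population, strata_key, fraction, seed=0):
    """Draw a simple stratified random sample: within each stratum,
    sample the same fixed fraction of units without replacement."""
    rng = random.Random(seed)
    strata = {}
    for unit in population:
        strata.setdefault(strata_key(unit), []).append(unit)
    sample = []
    for units in strata.values():
        n = max(1, round(fraction * len(units)))  # at least one unit per stratum
        sample.extend(rng.sample(units, n))
    return sample

# Hypothetical frame: units tagged by hospital tier, sampled at a 1/5 ratio
frame = [{"id": i, "tier": t} for t in ("tier3", "tier2", "community")
         for i in range(100)]
picked = stratified_sample(frame, lambda u: u["tier"], 0.2)
print(len(picked))  # 20 per stratum -> 60
```

Sampling every stratum at the same fraction (proportionate allocation) keeps the sample self-weighting, which simplifies the downstream analysis.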

  3. The use of dedicated methodology and statistical reviewers for peer review: a content analysis of comments to authors made by methodology and regular reviewers.

    Science.gov (United States)

    Day, Frank C; Schriger, David L; Todd, Christopher; Wears, Robert L

    2002-09-01

    In 1997, Annals of Emergency Medicine initiated a protocol by which every original research article, in addition to each regular review, was concurrently evaluated by 1 of 2 methodology and statistical reviewers. We characterized and contrasted comments made by the methodology and regular peer reviewers. After pilot testing, interrater reliability assessment, and revision, we finalized a 99-item taxonomy of reviewer comments organized in 8 categories. Two authors, uninvolved in the writing of reviews, classified each comment from a random sample of methodology reviews from 1999. For 30 of these reviews (15 for each methodology reviewer), the 2 authors also scored all (range 2 to 5) regular reviews. Sixty-five reviews by methodologist A, 60 by methodologist B, and 68 by regular reviewers were analyzed. Comments by methodologist A most frequently concerned the presentation of results (33% of all comments) and methods (17%). Methodologist B commented most frequently on presentation of results (28%) and statistical methods (16%). Regular reviewers most frequently made non-methodology/statistical comments (45%) and comments on presentation of results (18%). Of note, comments made by methodology and regular reviewers about methods issues were often contradictory. The distributions of comments made by the 2 methodology and statistical reviewers were similar, although reviewer A emphasized presentation and reviewer B stressed statistical issues. The regular reviewers (most of whom were unaware that a dedicated methodology and statistical reviewer would be reviewing the article) paid much less attention to methodology issues. The 2 dedicated methodology and statistical reviewers created reviews that were similarly focused and emphasized methodology issues that were distinct from the issues raised by regular reviewers.

  4. New Methodology of ENSO Forecast

    Science.gov (United States)

    Feigin, A. M.; Gavrilov, A.; Mukhin, D.; Loskutov, E.; Seleznev, A.

    2016-12-01

    We describe a methodology for ENSO forecasting based on data-driven construction of the evolution operator of the underlying climate sub-system. The methodology is composed of two key algorithms: (i) preparation of spatially distributed data, aiming to reduce data dimensionality with minimal loss of information about the system's dynamics, and (ii) construction of an operator that reproduces the evolution of the system in the reduced data space. The first algorithm combines several known data preprocessing techniques: decomposition in an empirical orthogonal function basis, its spatiotemporal generalization, and singular value decomposition techniques. The second algorithm constructs the evolution operator in the form of a random dynamical system realized as a nonlinear random mapping, parameterized by artificial neural networks. A general Bayesian approach is applied to jointly search for the optimal parameters of both algorithms: the optimal dimensionality of the reduced data space and the optimal complexity of the evolution operator. The abilities of the suggested methodology are demonstrated via reproduction and forecast of different ENSO-related indexes, including comparison of the prediction skill of the new methodology with that of other existing techniques. This research was supported by the Government of the Russian Federation (Agreement No. 14.Z50.31.0033 with the Institute of Applied Physics RAS).

  5. Analytical procedures for urban transportation energy conservation: summary of findings and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Suhrbier, J.H.; Byrne, W.D.

    1979-10-01

    Analytical methodologies are described and illustrated for use by metropolitan planning organizations and other state and local transportation agencies in analyzing the energy conservation potential of candidate urban transportation measures. Quantitative methodologies oriented to carpooling, vanpooling, transit, pricing, traffic regulation and control, and auto ownership are provided, based on the use of disaggregate behavioral travel demand models. Changes are indicated in trip frequency and distribution as well as in travel mode, operating conditions, and vehicle miles of travel. Trip-based estimates of fuel consumption and vehicle emissions are included. The methodologies can be adapted to different levels and types of data availability, and can employ manual sketch-planning procedures, a programmable calculator, or a fully calibrated computer program utilizing a random-sample household enumeration forecasting technique. Application of the developed methodologies has been performed in cooperation with metropolitan planning organizations representing the Dallas-Fort Worth, San Francisco, and Denver urban areas.

  6. PREVALENCE OF ANTI- Toxocara ANTIBODIES IN A RANDOM SAMPLE OF INPATIENTS AT A CHILDREN'S HOSPITAL IN VITÓRIA, ESPÍRITO SANTO, BRAZIL

    Directory of Open Access Journals (Sweden)

    Sandra F. MOREIRA-SILVA

    1998-07-01

    Full Text Available In the streets of Vitória, in the State of Espírito Santo, Brazil, there are a large number of stray dogs, many of which are infected with Toxocara canis, suggesting a high risk for human infection. In order to investigate the prevalence of Toxocara infection in children in Espírito Santo, we studied the prevalence of anti-Toxocara antibodies in 100 random inpatients over one year of age at the Children's Hospital N.S. da Glória, the reference children's hospital for the State. All the sera were collected during the period between October 1996 and January 1997. The mean age was 6.6±4.1 yrs. (1 to 14 yrs., median 6 yrs.) and there were patients from all of the different wards of the hospital. Sixty-eight patients came from the metropolitan area of Vitória and the other 32 from 17 other municipalities. The anti-Toxocara antibodies were investigated by ELISA-IgG using a secretory-excretory antigen obtained from second stage larvae. All sera were adsorbed with Ascaris suum antigen before the test. Thirty-nine sera (39%) were positive, predominantly from boys, but the gender difference was not statistically significant (boys: 25/56 or 44.6%; girls: 14/44 or 31.8%; p=0.311). The prevalence of positive sera was higher, but not statistically significantly so, in children from the urban periphery of metropolitan Vitória (formed by the cities of Vitória, Cariacica, Vila Velha, Serra and Viana) than in children from the 17 other municipalities (44.1% and 28.1% respectively, p=0.190). Although the samples studied do not represent all children living in the State of Espírito Santo, since the Children's Hospital N.S. da Glória admits only patients from the state health system, it is probable that these results indicate a high frequency of Toxocara infection in children living in Espírito Santo. Further studies of population samples are necessary to ascertain the prevalence of Toxocara infection in our country.

  7. Measurement methodology for vegetation samples from an area affected by residual contamination due to uranium mining waste rock; Metodologia de medida de muestras vegetales procedentes de un terreno afectado por contaminacion residual debida a esteriles de mineria de uranio

    Energy Technology Data Exchange (ETDEWEB)

    Navarro, N.; Suarez, J. A.; Yague, L.; Ortiz Gandia, M. I.; Marijuan, M. J.; Garcia, E.; Ortiz, T.; Alvarez, A.

    2013-07-01

    This paper presents the methodology established for the radiological characterization of plant material generated during the first stage of earth-moving work in an area of land affected by residual contamination due to buried uranium mining waste rock. (Author)

  8. Random-effects linear modeling and sample size tables for two special crossover designs of average bioequivalence studies: the four-period, two-sequence, two-formulation and six-period, three-sequence, three-formulation designs.

    Science.gov (United States)

    Diaz, Francisco J; Berg, Michel J; Krebill, Ron; Welty, Timothy; Gidal, Barry E; Alloway, Rita; Privitera, Michael

    2013-12-01

    Due to concern and debate in the epilepsy medical community and to the current interest of the US Food and Drug Administration (FDA) in revising approaches to the approval of generic drugs, the FDA is currently supporting ongoing bioequivalence studies of antiepileptic drugs, the EQUIGEN studies. During the design of these crossover studies, the researchers could not find commercial or non-commercial statistical software that quickly allowed computation of sample sizes for their designs, particularly software implementing the FDA requirement of using random-effects linear models for the analyses of bioequivalence studies. This article presents tables for sample-size evaluations of average bioequivalence studies based on the two crossover designs used in the EQUIGEN studies: the four-period, two-sequence, two-formulation design, and the six-period, three-sequence, three-formulation design. Sample-size computations assume that random-effects linear models are used in bioequivalence analyses with crossover designs. Random-effects linear models have been traditionally viewed by many pharmacologists and clinical researchers as just mathematical devices to analyze repeated-measures data. In contrast, a modern view of these models attributes an important mathematical role in theoretical formulations in personalized medicine to them, because these models not only have parameters that represent average patients, but also have parameters that represent individual patients. Moreover, the notation and language of random-effects linear models have evolved over the years. Thus, another goal of this article is to provide a presentation of the statistical modeling of data from bioequivalence studies that highlights the modern view of these models, with special emphasis on power analyses and sample-size computations.
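
    The kind of sample-size evaluation the article tabulates can be roughly approximated with the normal approximation to the two one-sided tests (TOST) procedure. This is a hedged sketch under assumed inputs (within-subject CV, geometric mean ratio, 80% power), not the random-effects computation behind the EQUIGEN tables; exact t-based methods, as required in practice, give somewhat larger sizes:

```python
from math import ceil, log
from statistics import NormalDist

def abe_total_n(cv, gmr=0.95, alpha=0.05, power=0.80, limit=1.25):
    """Approximate total N for average bioequivalence in a standard
    two-period, two-sequence crossover, via the normal approximation
    to the TOST procedure (assumes gmr != 1, so beta is not split)."""
    z = NormalDist().inv_cdf
    sigma2 = log(1 + cv**2)              # within-subject log-scale variance
    margin = log(limit) - abs(log(gmr))  # distance to the nearer BE limit
    n = 2 * (z(1 - alpha) + z(power))**2 * sigma2 / margin**2
    n = ceil(n)
    return n + n % 2                     # round up to an even total

print(abe_total_n(cv=0.30))  # total N under these assumptions (exact t-based methods give more)
```

Higher within-subject variability inflates the required N quadratically through `sigma2`, which is why highly variable drugs need much larger crossover studies.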

  9. Network Sampling with Memory: A proposal for more efficient sampling from social networks

    Science.gov (United States)

    Mouw, Ted; Verdery, Ashton M.

    2013-01-01

    Techniques for sampling from networks have grown into an important area of research across several fields. For sociologists, the possibility of sampling from a network is appealing for two reasons: (1) A network sample can yield substantively interesting data about network structures and social interactions, and (2) it is useful in situations where study populations are difficult or impossible to survey with traditional sampling approaches because of the lack of a sampling frame. Despite its appeal, methodological concerns about the precision and accuracy of network-based sampling methods remain. In particular, recent research has shown that sampling from a network using a random walk based approach such as Respondent Driven Sampling (RDS) can result in high design effects (DE)—the ratio of the sampling variance to the sampling variance of simple random sampling (SRS). A high design effect means that more cases must be collected to achieve the same level of precision as SRS. In this paper we propose an alternative strategy, Network Sampling with Memory (NSM), which collects network data from respondents in order to reduce design effects and, correspondingly, the number of interviews needed to achieve a given level of statistical power. NSM combines a “List” mode, where all individuals on the revealed network list are sampled with the same cumulative probability, with a “Search” mode, which gives priority to bridge nodes connecting the current sample to unexplored parts of the network. We test the relative efficiency of NSM compared to RDS and SRS on 162 school and university networks from Add Health and Facebook that range in size from 110 to 16,278 nodes. The results show that the average design effect for NSM on these 162 networks is 1.16, which is very close to the efficiency of a simple random sample (DE=1), and 98.5% lower than the average DE we observed for RDS. PMID:24159246

  10. Network Sampling with Memory: A proposal for more efficient sampling from social networks.

    Science.gov (United States)

    Mouw, Ted; Verdery, Ashton M

    2012-08-01

    Techniques for sampling from networks have grown into an important area of research across several fields. For sociologists, the possibility of sampling from a network is appealing for two reasons: (1) A network sample can yield substantively interesting data about network structures and social interactions, and (2) it is useful in situations where study populations are difficult or impossible to survey with traditional sampling approaches because of the lack of a sampling frame. Despite its appeal, methodological concerns about the precision and accuracy of network-based sampling methods remain. In particular, recent research has shown that sampling from a network using a random walk based approach such as Respondent Driven Sampling (RDS) can result in high design effects (DE)-the ratio of the sampling variance to the sampling variance of simple random sampling (SRS). A high design effect means that more cases must be collected to achieve the same level of precision as SRS. In this paper we propose an alternative strategy, Network Sampling with Memory (NSM), which collects network data from respondents in order to reduce design effects and, correspondingly, the number of interviews needed to achieve a given level of statistical power. NSM combines a "List" mode, where all individuals on the revealed network list are sampled with the same cumulative probability, with a "Search" mode, which gives priority to bridge nodes connecting the current sample to unexplored parts of the network. We test the relative efficiency of NSM compared to RDS and SRS on 162 school and university networks from Add Health and Facebook that range in size from 110 to 16,278 nodes. The results show that the average design effect for NSM on these 162 networks is 1.16, which is very close to the efficiency of a simple random sample (DE=1), and 98.5% lower than the average DE we observed for RDS.
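
    The design effect defined in this abstract, the ratio of a design's sampling variance to the variance under simple random sampling, can be estimated by simulation. Everything below is an assumption for illustration: a toy two-community population and a crude "stay in one community" sampler standing in for a random-walk design, not the RDS or NSM procedures themselves:

```python
import random
import statistics

def design_effect(design_fn, srs_fn, reps=1000, seed=1):
    """Empirical design effect: Var(estimator under the design) divided by
    Var(estimator under simple random sampling), estimated by simulation."""
    rng = random.Random(seed)
    var_design = statistics.pvariance([design_fn(rng) for _ in range(reps)])
    var_srs = statistics.pvariance([srs_fn(rng) for _ in range(reps)])
    return var_design / var_srs

# Toy population: two communities with very different trait prevalence
communities = ([1] * 80 + [0] * 20, [1] * 20 + [0] * 80)
n = 25

def clustered_mean(rng):
    # crude stand-in for a random walk that never leaves its start community
    community = rng.choice(communities)
    return statistics.mean(rng.sample(community, n))

def srs_mean(rng):
    whole = communities[0] + communities[1]
    return statistics.mean(rng.sample(whole, n))

de = design_effect(clustered_mean, srs_mean)
print(round(de, 1))  # well above 1 for this strongly clustered toy design
```

A DE near 1, as reported for NSM, means the design wastes almost no precision relative to SRS; a DE of 10 would mean ten times as many interviews for the same precision.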

  11. Methods and analysis of realizing randomized grouping.

    Science.gov (United States)

    Hu, Liang-Ping; Bao, Xiao-Lei; Wang, Qi

    2011-07-01

    Randomization is one of the four basic principles of research design. The meaning of randomization includes two aspects: one is to randomly select samples from the population, which is known as random sampling; the other is to randomly group all the samples, which is called randomized grouping. Randomized grouping can be subdivided into three categories: completely, stratified and dynamically randomized grouping. This article mainly introduces the steps of complete randomization, the definition of dynamic randomization and the realization of random sampling and grouping by SAS software.
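
    The article realizes random sampling and grouping in SAS; as a hedged sketch in Python, complete randomization amounts to shuffling all subjects and then dealing them into balanced groups. The subject IDs, group count, and seed below are invented for the example:

```python
import random

def randomize_groups(subject_ids, n_groups, seed=2024):
    """Complete randomization: shuffle all subjects, then deal them out
    round-robin so group sizes stay balanced."""
    rng = random.Random(seed)
    shuffled = list(subject_ids)
    rng.shuffle(shuffled)
    return [shuffled[i::n_groups] for i in range(n_groups)]

subjects = [f"S{i:03d}" for i in range(1, 13)]  # 12 hypothetical subjects
for label, members in zip("ABC", randomize_groups(subjects, 3)):
    print(label, sorted(members))
```

Stratified randomization would apply the same dealing step separately within each stratum; dynamic (adaptive) methods instead adjust each assignment probability as imbalances accumulate.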

  12. Acceptability, effectiveness, and cost-effectiveness of internet-based exposure treatment for irritable bowel syndrome in a clinical sample: a randomized controlled trial

    OpenAIRE

    Andréewitch Sergej; Lindfors Perjohan; Hedman Erik; Andersson Erik; Andersson Gerhard; Ljótsson Brjánn; Rück Christian; Lindefors Nils

    2011-01-01

    Background: Internet-based cognitive behavior therapy (ICBT) has shown promising effects in the treatment of irritable bowel syndrome (IBS). However, to date no study has used a design where participants have been sampled solely from a clinical population. We aimed to investigate the acceptability, effectiveness, and cost-effectiveness of ICBT for IBS using a consecutively recruited sample from a gastroenterological clinic. Methods: Sixty-one pat...

  13. Power analysis for a linear regression model when regressors are matrix sampled

    OpenAIRE

    Kolenikov, Stanislav; Hammer, Heather

    2017-01-01

    Multiple matrix sampling is a survey methodology technique that randomly chooses a relatively small subset of items to be presented to survey respondents for the purpose of reducing respondent burden. The data produced are missing completely at random (MCAR), and special missing data techniques should be used in linear regression and other multivariate statistical analysis. We derive asymptotic variances of regression parameter estimates that allow us to conduct power analysis for linear regr...

  14. A randomized trial found online questionnaires supplemented by postal reminders generated a cost-effective and generalizable sample but don't forget the reminders.

    Science.gov (United States)

    Loban, Amanda; Mandefield, Laura; Hind, Daniel; Bradburn, Mike

    2017-12-01

    The objective of this study was to compare the response rates, data completeness, and representativeness of survey data produced by online and postal surveys. A randomized trial was nested within a cohort study in Yorkshire, United Kingdom. Participants were randomized to receive either an electronic (online) survey questionnaire with a paper reminder (N = 2,982) or a paper questionnaire with an electronic reminder (N = 2,855). Response rates were similar for electronic and postal contact (50.9% vs. 49.7%, difference = 1.2%, 95% confidence interval: -1.3% to 3.8%). The characteristics of the responders in the two groups were similar. Participants nevertheless demonstrated an overwhelming preference for postal questionnaires, with the majority responding by post in both groups. Online survey questionnaire systems need to be supplemented with a postal reminder to achieve acceptable uptake, but doing so provides a similar response rate and case mix when compared to postal questionnaires alone. For large surveys, online survey systems may be cost saving. Copyright © 2017 Elsevier Inc. All rights reserved.
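
    The reported difference in response rates can be approximately reproduced with a Wald interval for the difference of two proportions. The exact responder counts are not given in the abstract, so the rounded rates are used below; the lower limit therefore comes out at -1.4% rather than the reported -1.3%:

```python
import math

def diff_ci(p1, n1, p2, n2, z=1.96):
    """Wald 95% confidence interval for a difference of two proportions."""
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Rates and arm sizes as reported in the abstract (responder counts rounded)
diff, lo, hi = diff_ci(0.509, 2982, 0.497, 2855)
print(f"{diff:+.1%} (95% CI {lo:+.1%} to {hi:+.1%})")  # +1.2% (95% CI -1.4% to +3.8%)
```

Because the interval comfortably spans zero, the trial's conclusion of "similar response rates" follows directly.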

  15. Methodological challenges and solutions for leadership researchers

    OpenAIRE

    Hanges, Paul J.; Shteynberg, Gary

    2004-01-01

    Leadership scholars are faced with pressing methodological issues that challenge our current scholarship. In this paper, we discuss methodological concerns such as multilevel hypothesis testing, construct validity of group-level scales, sampling countries in cross-cultural research, self-report measurement - issues that are endemic to organizational leadership research. We point out the methodological challenges facing modern leadership researchers, and point to possible solutions that can pr...

  16. Optimization of a methodology for the simultaneous determination of deltamethrin, permethrin and malathion in stored wheat samples using dispersive liquid-liquid microextraction with solidification of floating organic drop and HPLC-UV.

    Science.gov (United States)

    Pirsaheb, Meghdad; Rezaei, Mansour; Fattahi, Nazir; Karami, Mohammad; Sharafi, Kiomars; Ghaffari, Hamid R

    2017-09-02

    The purpose of this study was to investigate common pesticides in wheat stored at Kermanshah province's silos in Iran. A simple, inexpensive, reliable and environmentally friendly method based on dispersive liquid-liquid microextraction with solidification of a floating organic drop was developed, and the analytical characteristics of the method were determined. The effects of various parameters, such as the construction materials of the silos, the types of ownership of the silos, the geographic orientation of silo locations and the climatic conditions of silo locations, on pesticide residues in the studied wheat samples were also investigated. Among all the studied parameters, the climatic conditions of silo locations showed the highest influence on pesticide residues in wheat samples. Overall, 61.2% of the samples had pesticide levels below the method detection limits and 38.8% of the samples had at least one of the studied pesticides: 13.9% of the samples had deltamethrin residues, 16.7% had permethrin, 22.2% had malathion, 11.1% had both permethrin and malathion, and 2.8% had both deltamethrin and malathion. The results revealed that the residues of deltamethrin and malathion were lower than the standard level set by European Union regulation, and only three samples contained permethrin above the European standard level.

  17. Language Policy and Methodology

    Science.gov (United States)

    Liddicoat, Antony J.

    2004-01-01

    The implementation of a language policy is crucially associated with questions of methodology. This paper explores approaches to language policy, approaches to methodology and the impact that these have on language teaching practice. Language policies can influence decisions about teaching methodologies either directly, by making explicit…

  18. Is mindfulness research methodology improving over time? A systematic review.

    Directory of Open Access Journals (Sweden)

    Simon B Goldberg

    Full Text Available Despite an exponential growth in research on mindfulness-based interventions, the body of scientific evidence supporting these treatments has been criticized for being of poor methodological quality. The current systematic review examined the extent to which mindfulness research demonstrated increased rigor over the past 16 years regarding six methodological features that have been highlighted as areas for improvement: using active control conditions, larger sample sizes, longer follow-up assessment, treatment fidelity assessment, and reporting of instructor training and intent-to-treat (ITT) analyses. We searched PubMed, PsychInfo, Scopus, and Web of Science, in addition to a publicly available repository of mindfulness studies, for randomized clinical trials of mindfulness-based interventions in samples with a clinical disorder, or elevated symptoms of a clinical disorder, listed on the American Psychological Association's list of disorders with recognized evidence-based treatment. Independent raters screened 9,067 titles and abstracts, with 303 full-text reviews. Of these, 171 studies were included, representing 142 non-overlapping samples. Across the 142 studies published between 2000 and 2016, there was no evidence of increases in any study quality indicator, although changes were generally in the direction of improved quality. When the sample was restricted to studies conducted in Europe and North America (the continents with the longest history of scientific research in this area), an increase in the reporting of ITT analyses was found. When an early, high-quality study was excluded, improvements were seen in sample size, treatment fidelity assessment, and reporting of ITT analyses. Taken together, the findings suggest modest adoption of the recommendations for methodological improvement voiced repeatedly in the literature. Possible explanations for this and implications for interpreting this body of research and conducting future studies are

  19. The potential of Virtual Reality as anxiety management tool: a randomized controlled study in a sample of patients affected by Generalized Anxiety Disorder

    Directory of Open Access Journals (Sweden)

    Gorini Alessandra

    2008-05-01

    Full Text Available Abstract Background Generalized anxiety disorder (GAD) is a psychiatric disorder characterized by a constant and unspecific anxiety that interferes with daily-life activities. Its high prevalence in the general population and the severe limitations it causes point out the necessity to find new efficient strategies to treat it. Together with cognitive-behavioural treatments, relaxation represents a useful approach for the treatment of GAD, but it has the limitation that it is hard to learn. To overcome this limitation we propose the use of virtual reality (VR) to facilitate the relaxation process by visually presenting key relaxing images to the subjects. The visual presentation of a virtual calm scenario can facilitate patients' practice and mastery of relaxation, making the experience more vivid and real than the one that most subjects can create using their own imagination and memory, and triggering a broad empowerment process within the experience induced by a high sense of presence. According to these premises, the aim of the present study is to investigate the advantages of using a VR-based relaxation protocol in reducing anxiety in patients affected by GAD. Methods/Design The trial is based on a randomized controlled study, including three groups of 25 patients each (for a total of 75 patients): (1) the VR group, (2) the non-VR group and (3) the waiting list (WL) group. Patients in the VR group will be taught to relax using a VR relaxing environment and audio-visual mobile narratives; patients in the non-VR group will be taught to relax using the same relaxing narratives proposed to the VR group, but without the VR support; and patients in the WL group will not receive any kind of relaxation training. Psychometric and psychophysiological outcomes will serve as quantitative dependent variables, while subjective reports of participants will be used as qualitative dependent variables. Conclusion We argue that the use of VR for relaxation

  20. Zooplankton Methodology, Collection & Identification - A field manual

    Digital Repository Service at National Institute of Oceanography (India)

    Goswami, S.C.

    and productivity would largely depend upon the use of correct methodology which involves collection of samples, fixation, preservation, analysis and computation of data. The detailed procedures on all these aspects are given in this manual....

  1. Psychosis Incident Cohort Outcome Study (PICOS). A multisite study of clinical, social and biological characteristics, patterns of care and predictors of outcome in first-episode psychosis. Background, methodology and overview of the patient sample.

    Science.gov (United States)

    Lasalvia, A; Tosato, S; Brambilla, P; Bertani, M; Bonetto, C; Cristofalo, D; Bissoli, S; De Santi, K; Lazzarotto, L; Zanatta, G; Marrella, G; Mazzoncini, R; Zanoni, M; Garzotto, N; Dolce, C; Nicolau, S; Ramon, L; Perlini, C; Rambaldelli, G; Bellani, M; Tansella, M; Ruggeri, M

    2012-09-01

    This paper aims at providing an overview of the background, design and initial findings of Psychosis Incident Cohort Outcome Study (PICOS). PICOS is a large multi-site population-based study on first-episode psychosis (FEP) patients attending public mental health services in the Veneto region (Italy) over a 3-year period. PICOS has a naturalistic longitudinal design and it includes three different modules addressing, respectively, clinical and social variables, genetics and brain imaging. Its primary aims are to characterize FEP patients in terms of clinical, psychological and social presentation, and to investigate the relative weight of clinical, environmental and biological factors (i.e. genetics and brain structure/functioning) in predicting the outcome of FEP. An in-depth description of the research methodology is given first. Details on recruitment phase and baseline and follow-up evaluations are then provided. Initial findings relating to patients' baseline assessments are also presented. Future planned analyses are outlined. Both strengths and limitations of PICOS are discussed in the light of issues not addressed in the current literature on FEP. This study aims at making a substantial contribution to research on FEP patients. It is hoped that the research strategies adopted in PICOS will enhance the convergence of methodologies in ongoing and future studies on FEP.

  2. Internet-based incentives increase blood glucose testing with a non-adherent, diverse sample of teens with type 1 diabetes mellitus: a randomized controlled Trial.

    Science.gov (United States)

    Raiff, Bethany R; Barry, Victoria B; Ridenour, Ty A; Jitnarin, Natinee

    2016-06-01

    Non-adherence with self-monitoring of blood glucose (SMBG) among teenagers with type 1 diabetes can be a problem. The purpose of this study was to investigate the feasibility, acceptability, and preliminary efficacy of using Internet-based incentives to improve adherence with SMBG in non-adherent teenagers. Participants were randomly assigned to a contingent group (CS; N = 23), where they had to meet web-camera-verified SMBG goals to earn incentives, or a non-contingent group (NS; N = 18), where they earned incentives independent of adherence. Brief motivational interviewing (MI) was given prior to the intervention. Attrition was 15% in the CS group. Participants and parents endorsed the intervention on all intervention dimensions. Daily SMBG increased after one MI session, and further increased when incentives were added, but significantly more so for older participants. SMBG declined slowly over time, but returned to baseline levels only for younger NS participants. Internet-based incentive interventions are feasible and acceptable, and show promise for improving adherence with SMBG.

  3. Parent-Child Associations in Pedometer-Determined Physical Activity and Sedentary Behaviour on Weekdays and Weekends in Random Samples of Families in the Czech Republic

    Directory of Open Access Journals (Sweden)

    Dagmar Sigmundová

    2014-07-01

    Full Text Available This study investigates whether more physically active parents bring up more physically active children and whether parents' level of physical activity helps children achieve step count recommendations on weekdays and weekends. The participants (388 parents aged 35–45 and their 485 children aged 9–12) were randomly recruited from 21 Czech government-funded primary schools. The participants recorded pedometer step counts for seven days (≥10 h a day) during April–May and September–October of 2013. Logistic regression (Enter method) was used to examine the achievement of the international recommendations of 11,000 steps/day for girls and 13,000 steps/day for boys. The children of fathers and mothers who met the weekend recommendation of 10,000 steps were 5.48 times (95% confidence interval: 1.65; 18.19; p < 0.01) and 3.60 times (95% confidence interval: 1.21; 10.74; p < 0.05), respectively, more likely to achieve the international weekend recommendation than the children of less active parents. The children of mothers who reached the weekday pedometer-based step count recommendation were 4.94 times (95% confidence interval: 1.45; 16.82; p < 0.05) more likely to fulfil the step count recommendation on weekdays than the children of less active mothers.
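
    The odds ratios reported here come from logistic regression; as a hedged illustration of the scale they live on, an unadjusted odds ratio with a Woolf (log-scale) confidence interval can be computed directly from a 2×2 table. The counts below are entirely hypothetical, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio from a 2x2 table (a,b = exposed yes/no outcomes,
    c,d = unexposed yes/no outcomes) with a Woolf 95% CI."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: children meeting the step goal by parental activity
or_, lo, hi = odds_ratio_ci(30, 20, 25, 60)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 3.6 1.73 7.5
```

Logistic regression generalizes this computation by adjusting the exposure effect for covariates, which is why the study's reported intervals differ from what any single table would give.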

  4. Dietary patterns of early childhood and maternal socioeconomic status in a unique prospective sample from a randomized controlled trial of Prenatal DHA Supplementation.

    Science.gov (United States)

    Hidaka, Brandon H; Kerling, Elizabeth H; Thodosoff, Jocelynn M; Sullivan, Debra K; Colombo, John; Carlson, Susan E

    2016-11-25

    Dietary habits established in early childhood and maternal socioeconomic status (SES) are important, complex, interrelated factors that influence a child's growth and development. The aim of this study was to define the major dietary patterns in a cohort of young US children, construct a maternal SES index, and evaluate their associations. The diets of 190 children from a randomized, controlled trial of prenatal supplementation of docosahexaenoic acid (DHA) were recorded at 6-mo intervals from 2 to 4.5 years by 24-h dietary recall. Hierarchical cluster analysis of age-adjusted, average daily intake of 24 food and beverage groups was used to categorize diet. Unrotated factor analysis generated an SES score from maternal race, ethnicity, age, education, and neighborhood income. We identified two major dietary patterns: "Prudent" and "Western." The 85 (45%) children with a Prudent diet consumed more whole grains, fruit, yogurt and low-fat milk, green and non-starchy vegetables, and nuts and seeds. Conversely, those with a Western diet had greater intake of red meat, discretionary fat and condiments, sweet beverages, refined grains, French fries and potato chips, eggs, starchy vegetables, processed meats, chicken and seafood, and whole-fat milk. Compared to a Western diet, a Prudent diet was associated with one standard deviation higher maternal SES (95% CI: 0.80 to 1.30). We found two major dietary patterns of young US children and defined a single, continuous axis of maternal SES that differed strongly between groups. This is an important first step to investigate how child diet, SES, and prenatal DHA supplementation interact to influence health outcomes. NCT00266825. Prospectively registered on December 15, 2005.
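
    The record above categorizes diets by hierarchical cluster analysis of average daily intakes. As a rough illustration of that step only, here is a minimal single-linkage agglomerative clusterer in plain Python; the points are hypothetical z-scored food-group intakes, not the study's data, and a real analysis would use a library implementation with an explicitly chosen linkage method.

```python
# Merge clusters pairwise until two remain, always joining the pair with
# the smallest single-linkage (minimum pairwise) Euclidean distance.
def single_linkage_two_clusters(points):
    clusters = [[i] for i in range(len(points))]  # one cluster per point

    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

    while len(clusters) > 2:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist(points[a], points[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)  # merge the closest pair (j > i)
    return clusters

# Two well-separated hypothetical diet profiles:
data = [(0.1, 0.2), (0.0, 0.3), (2.0, 2.1), (2.2, 1.9)]
groups = single_linkage_two_clusters(data)
```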

  5. Methodology of the fasting sub-sample from the Mexican Health Survey, 2000 Metodología de la submuestra de suero de la Encuesta Nacional de Salud 2000

    OpenAIRE

    Simón Barquera; Citlalli Carrión; Ismael Campos; Juan Espinosa; Juan Rivera; Gustavo Olaiz-Fernández

    2007-01-01

    OBJECTIVE: To report the comparative results of the sub-sample of fasting adults selected for the biochemical measurement of cardiovascular risk factors and the rest of the Mexican Health Survey (MHS) (2000) participants. MATERIAL AND METHODS: The nationally representative, cross-sectional Mexican Health Survey (2000) was analyzed. Survey participants reporting a fasting period of 9 to 12 h were included in a sub-sample (n = 2 535) and compared with all other participants (n = 41 126). P...

  6. Exploration of the methodological quality and clinical usefulness of a cross-sectional sample of published guidance about exercise training and physical activity for the secondary prevention of coronary heart disease.

    Science.gov (United States)

    Abell, Bridget; Glasziou, Paul; Hoffmann, Tammy

    2017-06-13

    Clinicians are encouraged to use guidelines to assist in providing evidence-based secondary prevention to patients with coronary heart disease. However, the expanding number of publications providing guidance about exercise training may confuse cardiac rehabilitation clinicians. We therefore sought to explore the number, scope, publication characteristics, methodological quality, and clinical usefulness of published exercise-based cardiac rehabilitation guidance. We included publications recommending physical activity, exercise or cardiac rehabilitation for patients with coronary heart disease. These included systematically developed clinical practice guidelines, as well as other publications intended to support clinician decision making, such as position papers or consensus statements. Publications were obtained via electronic searches of preventive cardiology societies, guideline databases and PubMed, to November 2016. Publication characteristics were extracted, and two independent assessors evaluated quality using the 23-item Appraisal of Guidelines for Research and Evaluation II (AGREE II) tool. Fifty-four international publications from 1994 to 2016 were identified. Most were found on preventive cardiology association websites (n = 35; 65%) and were freely accessible (n = 50; 93%). Thirty (56%) publications contained only broad recommendations for physical activity and cardiac rehabilitation referral, while 24 (44%) contained the necessary detailed exercise training recommendations. Many were labelled as "guidelines"; however, publications with other titles (e.g. scientific statements) were common (n = 24; 44%). This latter group of publications contained a significantly greater proportion of detailed exercise training recommendations than clinical guidelines (p = 0.017). Wide variation in quality also existed, with 'applicability' the worst scoring AGREE II domain for clinical guidelines (mean score 53%) and 'rigour of development' rated lowest for other

  7. Scenario development methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Eng, T. [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Hudson, J. [Rock Engineering Consultants, Welwyn Garden City, Herts (United Kingdom); Stephansson, O. [Royal Inst. of Tech., Stockholm (Sweden). Div. of Engineering Geology; Skagius, K.; Wiborgh, M. [Kemakta, Stockholm (Sweden)

    1994-11-01

    In the period 1981-1994, SKB studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are (a) Event tree analysis, (b) Influence diagrams and (c) Rock Engineering Systems (RES) matrices. Each of the methodologies is explained in this report, together with examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely Influence diagrams and the RES methodology. In conclusion, a combination of parts of the Influence diagram and the RES methodologies is likely to be a promising approach. 26 refs.

  8. Data fabrication and other reasons for non-random sampling in 5087 randomised, controlled trials in anaesthetic and general medical journals.

    Science.gov (United States)

    Carlisle, J B

    2017-08-01

    10^-10. The difference between the distributions of these two subgroups was confirmed by comparison of their overall distributions, p = 5.3 × 10^-15. Each journal exhibited the same abnormal distribution of baseline means. There was no difference in distributions of baseline means for 1453 trials in non-anaesthetic journals and 3634 trials in anaesthetic journals, p = 0.30. The rate of retractions from JAMA and NEJM, 6/1453 or 1 in 242, was one-quarter the rate from the six anaesthetic journals, 66/3634 or 1 in 55, relative risk (99%CI) 0.23 (0.08-0.68), p = 0.00022. A probability threshold of 1 in 10,000 identified 8/72 (11%) retracted trials (7 by Fujii et al.) and 82/5015 (1.6%) unretracted trials. Some p values were so extreme that the baseline data could not be correct: for instance, for 43/5015 unretracted trials the probability was less than 1 in 10^15 (equivalent to one drop of water in 20,000 Olympic-sized swimming pools). A probability threshold of 1 in 100 for two or more trials by the same author identified three authors of retracted trials (Boldt, Fujii and Reuben) and 21 first or corresponding authors of 65 unretracted trials. Fraud, unintentional error, correlation, stratified allocation and poor methodology might have contributed to the excess of randomised, controlled trials with similar or dissimilar means, a pattern that was common to all the surveyed journals. It is likely that this work will lead to the identification, correction and retraction of hitherto unretracted randomised, controlled trials. © 2017 The Association of Anaesthetists of Great Britain and Ireland.
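
    The underlying question in the record above is how surprising an observed baseline-mean difference is when both trial arms are truly drawn from one population. A simple Monte Carlo version of that check is sketched below; it is an illustrative simplification, not the actual method used in the paper, and all numbers are hypothetical.

```python
import random

def baseline_mean_pvalue(n1, n2, sd, observed_diff, sims=5000, seed=1):
    """Probability of a baseline-mean difference at least as large as
    observed_diff when both arms are sampled from one normal population
    (mean 0, standard deviation sd), estimated by simulation."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(sims):
        m1 = sum(rng.gauss(0, sd) for _ in range(n1)) / n1
        m2 = sum(rng.gauss(0, sd) for _ in range(n2)) / n2
        if abs(m1 - m2) >= observed_diff:
            hits += 1
    return hits / sims

# An age difference of 0.1 years between two arms of 50 is unremarkable;
# a difference of 8 years would be extraordinary (the SE of the difference is 2):
p_plausible = baseline_mean_pvalue(50, 50, sd=10.0, observed_diff=0.1)
p_extreme = baseline_mean_pvalue(50, 50, sd=10.0, observed_diff=8.0)
```

    Aggregating many such probabilities across a trial's baseline variables, and across many trials by one author, is what flags reported baselines that are too similar or too dissimilar to have arisen by chance.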

  9. Language policy and methodology

    OpenAIRE

    Liddicoat, Antony J.

    2004-01-01

    The implementation of a language policy is crucially associated with questions of methodology. This paper explores approaches to language policy, approaches to methodology and the impact that these have on language teaching practice. Language policies can influence decisions about teaching methodologies either directly, by making explicit recommendations about the methods to be used in classroom practice, or indirectly, through the conceptualisation of language learning which underlies the pol...

  10. Methodology to estimate parameters of an excitation system based on experimental conditions

    Energy Technology Data Exchange (ETDEWEB)

    Saavedra-Montes, A.J. [Carrera 80 No 65-223, Bloque M8 oficina 113, Escuela de Mecatronica, Universidad Nacional de Colombia, Medellin (Colombia); Calle 13 No 100-00, Escuela de Ingenieria Electrica y Electronica, Universidad del Valle, Cali, Valle (Colombia); Ramirez-Scarpetta, J.M. [Calle 13 No 100-00, Escuela de Ingenieria Electrica y Electronica, Universidad del Valle, Cali, Valle (Colombia); Malik, O.P. [2500 University Drive N.W., Electrical and Computer Engineering Department, University of Calgary, Calgary, Alberta (Canada)

    2011-01-15

    A methodology to estimate the parameters of a potential-source controlled rectifier excitation system model is presented in this paper. The proposed parameter estimation methodology is based on the characteristics of the excitation system. A comparison of two pseudo random binary signals, two sampling periods for each one, and three estimation algorithms is also presented. Simulation results from an excitation control system model and experimental results from an excitation system of a power laboratory setup are obtained. To apply the proposed methodology, the excitation system parameters are identified at two different levels of the generator saturation curve. The results show that it is possible to estimate the parameters of the standard model of an excitation system, recording two signals and the system operating in closed loop with the generator. The normalized sum of squared error obtained with experimental data is below 10%, and with simulation data is below 5%. (author)
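
    A minimal sketch of the identification idea above: excite a system with a pseudo random binary signal (PRBS) and recover model parameters by least squares. For illustration this assumes a simple first-order discrete model rather than the paper's potential-source controlled rectifier excitation-system model; the true parameters (a = 0.8, b = 0.5) are invented.

```python
# PRBS from a 7-bit linear-feedback shift register (polynomial x^7 + x^6 + 1).
def prbs(n, taps=(7, 6), seed=0b1010101):
    state, out = seed, []
    for _ in range(n):
        bit = ((state >> (taps[0] - 1)) ^ (state >> (taps[1] - 1))) & 1
        state = ((state << 1) | bit) & 0x7F
        out.append(1.0 if state & 1 else -1.0)
    return out

# Least-squares fit of y[k] = a*y[k-1] + b*u[k-1] via the 2x2 normal equations.
def estimate_first_order(u, y):
    s11 = s12 = s22 = r1 = r2 = 0.0
    for k in range(1, len(y)):
        p, q = y[k - 1], u[k - 1]
        s11 += p * p; s12 += p * q; s22 += q * q
        r1 += p * y[k]; r2 += q * y[k]
    det = s11 * s22 - s12 * s12
    return (s22 * r1 - s12 * r2) / det, (s11 * r2 - s12 * r1) / det

u = prbs(200)
y = [0.0]
for k in range(1, len(u)):
    y.append(0.8 * y[k - 1] + 0.5 * u[k - 1])  # simulated noiseless response
a_hat, b_hat = estimate_first_order(u, y)
```

    With noiseless data the estimates are exact; the paper's normalized sums of squared error (below 10% on experimental data, below 5% in simulation) reflect noise and model mismatch absent from this toy.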

  11. Ergodicity of Random Walks on Random DFA

    OpenAIRE

    Balle, Borja

    2013-01-01

    Given a DFA we consider the random walk that starts at the initial state and at each time step moves to a new state by taking a random transition from the current state. This paper shows that for typical DFA this random walk induces an ergodic Markov chain. The notion of typical DFA is formalized by showing that ergodicity holds with high probability when a DFA is sampled uniformly at random from the set of all automata with a fixed number of states. We also show the same result applies to DF...
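
    The construction above can be sketched directly: sample a DFA uniformly at random (each state maps each symbol to an independently uniform state) and iterate the distribution of the induced random walk, which takes a uniformly chosen transition at each step. This toy only propagates the distribution and reports its step-to-step drift; it is a numerical illustration, not a proof of ergodicity.

```python
import random

def random_dfa(n, alphabet, rng):
    # transition table: dfa[state][symbol] -> next state, sampled uniformly
    return [[rng.randrange(n) for _ in range(alphabet)] for _ in range(n)]

def walk_distribution(dfa, steps):
    n, alphabet = len(dfa), len(dfa[0])
    dist = [1.0] + [0.0] * (n - 1)  # the walk starts at the initial state 0
    for _ in range(steps):
        nxt = [0.0] * n
        for s, mass in enumerate(dist):
            if mass:
                for t in dfa[s]:  # each symbol is chosen w.p. 1/alphabet
                    nxt[t] += mass / alphabet
        dist = nxt
    return dist

rng = random.Random(42)
dfa = random_dfa(20, 2, rng)
d1 = walk_distribution(dfa, 500)
d2 = walk_distribution(dfa, 501)
drift = sum(abs(a - b) for a, b in zip(d1, d2))  # step-to-step change
```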

  12. A random sample survey program of traffic air pollution. Steekproefcontroleprogramma, onderzoek naar luchtverontreiniging door voertuigen in het verkeer; Jaarrapport 1987-1988

    Energy Technology Data Exchange (ETDEWEB)

    Rijkeboer, R.C.; Van Vloten, P.; Schmal, P.

    1988-01-01

    An in-use compliance test with regard to exhaust emissions was performed on a sample of 66 vehicles. Of these vehicles, 60 belonged to 12 selected vehicle types (5 vehicles per type) and the other 6 to different types (1 per type). All vehicles complied with EEC Directive 88/76/EEC concerning clean cars and had been sold under the tax incentive scheme for the stimulation of an early introduction of clean cars into the Dutch car fleet. Of the 12 types selected, 8 belonged to the class below 1.4 litre. Only 1 of these was fitted with an (open-loop) catalyst. The other 4 types belonged to the class 1.4-2.0 litre; they were all fitted with an open-loop catalyst system. The 6 individual vehicles were all fitted with a closed-loop three-way catalyst. Five belonged to the class 1.4-2.0 litre and one to the class above 2.0 litre. The exhaust emission of these vehicles was measured in the simulated cold-start urban cycle that forms the basis of the Directive. Additional measurements were also made in the 'extra-urban driving cycle' that is going to become part of the Directive, in order to gather data. In many cases (including all catalyst vehicles), a hot-start urban driving cycle was also performed. All vehicles were checked for a number of adjustment and maintenance aspects. The 12 'complete' types were measured both before and after tuning; the 6 individual vehicles were measured only after tuning. For 37 vehicles, noise emission was also determined on the test track of the Department of Road Transport in Lelystad.

  13. Increasing exercise capacity and quality of life of patients with heart failure through Wii gaming: the rationale, design and methodology of the HF-Wii study; a multicentre randomized controlled trial.

    Science.gov (United States)

    Jaarsma, Tiny; Klompstra, Leonie; Ben Gal, Tuvia; Boyne, Josiane; Vellone, Ercole; Bäck, Maria; Dickstein, Kenneth; Fridlund, Bengt; Hoes, Arno; Piepoli, Massimo F; Chialà, Oronzo; Mårtensson, Jan; Strömberg, Anna

    2015-07-01

    Exercise is known to be beneficial for patients with heart failure (HF), and these patients should therefore be routinely advised to exercise and to be or to become physically active. Despite the beneficial effects of exercise such as improved functional capacity and favourable clinical outcomes, the level of daily physical activity in most patients with HF is low. Exergaming may be a promising new approach to increase the physical activity of patients with HF at home. The aim of this study is to determine the effectiveness of the structured introduction of, and access to, a Wii game computer in patients with HF in improving exercise capacity and the level of daily physical activity, decreasing healthcare resource use, and improving self-care and health-related quality of life. A multicentre randomized controlled study with two treatment groups will include 600 patients with HF. In each centre, patients will be randomized to either motivational support only (control) or structured access to a Wii game computer (Wii). Patients in the control group will receive advice on physical activity and will be followed up with four telephone calls. Patients in the Wii group will also receive advice on physical activity, along with a Wii game computer with instructions and training. The primary endpoint will be exercise capacity at 3 months as measured by the 6 min walk test. Secondary endpoints include exercise capacity at 6 and 12 months, level of daily physical activity, muscle function, health-related quality of life, and hospitalization or death during the 12 months of follow-up. The HF-Wii study is a randomized study that will evaluate the effect of exergaming in patients with HF. The findings can be useful to healthcare professionals and improve our understanding of the potential role of exergaming in the treatment and management of patients with HF. NCT01785121. © 2015 The Authors. European Journal of Heart Failure © 2015 European Society of Cardiology.

  14. Narrative Visual Methodologies

    DEFF Research Database (Denmark)

    Frølunde, Lisbeth

    2017-01-01

    The paper has a methodological focus, discussed in relation to examples from an auto-ethnographic study with the working title Family Anecdotes. The main methodological question is how to map storytelling practice as a process for unfolding multiple cultural and social contexts of meaning using a...

  15. Introduction to LCA Methodology

    DEFF Research Database (Denmark)

    2017-01-01

    In order to offer the reader an overview of the LCA methodology in the preparation of the more detailed description of its different phases, a brief introduction is given to the methodological framework according to the ISO 14040 standard and the main elements of each of its phases. Emphasis is o...

  16. Data Centric Development Methodology

    Science.gov (United States)

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies could work on structural, procedural, or object-oriented applications, but fail to capture…

  17. The Methodology of Magpies

    Science.gov (United States)

    Carter, Susan

    2014-01-01

    Arts/Humanities researchers frequently do not explain methodology overtly; instead, they "perform" it through their use of language, textual and historic cross-reference, and theory. Here, methodologies from literary studies are shown to add to Higher Education (HE) an exegetical and critically pluralist approach. This includes…

  18. Archetype Modeling Methodology.

    Science.gov (United States)

    Moner, David; Alberto Maldonado, José; Robles, Montserrat

    2018-02-14

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although many experiences of using archetypes have been reported in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the participants and tools involved. It also describes possible strategies for organizing the modeling process. The proposed methodology is inspired by existing best practices of CIM, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. Its application provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can also serve as a reference for the development of CIMs using other formalisms. Copyright © 2018 Elsevier Inc. All rights reserved.

  19. Secular trends: a ten-year comparison of the amount and type of physical activity and inactivity of random samples of adolescents in the Czech Republic

    Directory of Open Access Journals (Sweden)

    Frömel Karel

    2011-09-01

    Full Text Available Abstract Background An optimal level of physical activity (PA) in adolescence influences the level of PA in adulthood. Although PA declines with age have been demonstrated repeatedly, few studies have been carried out on secular trends. The present study assessed levels, types and secular trends of PA and sedentary behaviour of a sample of adolescents in the Czech Republic. Methods The study comprised two cross-sectional cohorts of adolescents ten years apart. The analysis compared data collected through a week-long monitoring of adolescents' PA in 1998-2000 and 2008-2010. Adolescents wore either a Yamax SW-701 or an Omron HJ-105 pedometer continuously for 7 days (at least 10 hours per day, excluding sleeping, hygiene and bathing). They also recorded their number of steps per day and the type and duration of PA and sedentary behaviour (in minutes) on record sheets. In total, 902 adolescents (410 boys; 492 girls) aged 14-18 were eligible for analysis. Results Overweight and obesity in Czech adolescents participating in this study increased from 5.5% (older cohort, 1998-2000) to 10.4% (younger cohort, 2008-2010). There were no significant inter-cohort changes in the total amount of sedentary behaviour in boys. In girls, however, there was a significant increase on weekdays in the total duration of sedentary behaviour of the younger cohort (2008-2010) compared with the older one (1998-2000). Studying and screen time (television and computer) were among the main sedentary behaviours in Czech adolescents. The types of sedentary behaviour also changed: watching TV (1998-2000) was replaced by time spent on computers (2008-2010). Achievement of the Czech health-related criterion (11,000 steps per day) decreased in boys only, from 68% (1998-2000) to 55% (2008-2010). Across both genders, 55%-75% of Czech adolescents met the health-related criterion of recommended steps per day, however fewer participants in the younger cohort (2008-2010) met this criterion than in the older cohort

  20. Investigating the Randomness of Numbers

    Science.gov (United States)

    Pendleton, Kenn L.

    2009-01-01

    The use of random numbers is pervasive in today's world. Random numbers have practical applications in such far-flung arenas as computer simulations, cryptography, gambling, the legal system, statistical sampling, and even the war on terrorism. Evaluating the randomness of extremely large samples is a complex, intricate process. However, the…
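
    One classic evaluation mentioned above can be made concrete with a chi-square test of digit uniformity. This is a generic sketch; the 5% critical value for 9 degrees of freedom (about 16.9) is quoted from standard tables rather than computed here.

```python
import random

def chi_square_digits(digits):
    """Chi-square statistic for uniformity of decimal digits 0-9."""
    n = len(digits)
    expected = n / 10
    counts = [0] * 10
    for d in digits:
        counts[d] += 1
    return sum((c - expected) ** 2 / expected for c in counts)

rng = random.Random(0)
sample = [rng.randrange(10) for _ in range(10000)]
stat = chi_square_digits(sample)          # near 9 for uniform digits

biased = [7] * 10000                      # a blatantly non-random sequence
biased_stat = chi_square_digits(biased)   # enormous: 90000.0
```

    A single test like this is only one probe; practical suites (e.g. for cryptographic generators) combine many independent checks of different structure.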

  1. Methodological update in Medicina Intensiva.

    Science.gov (United States)

    García Garmendia, J L

    2018-02-06

    Research in the critically ill is complicated by the heterogeneity of patients, the difficulty of achieving representative sample sizes, and the number of variables simultaneously involved. However, both the quantity and the quality of records are high, as is the relevance of the variables recorded, such as survival. Methodological tools have evolved to offer new perspectives and analysis models that allow relevant information to be extracted from the data that accompany the critically ill patient. The need for training in methodology and in the interpretation of results is an important challenge for intensivists who wish to stay current with research developments and clinical advances in Intensive Medicine. Copyright © 2018 Elsevier España, S.L.U. y SEMICYUC. All rights reserved.

  2. High resolution satellite remote sensing used in a stratified random sampling scheme to quantify the constituent land cover components of the shifting cultivation mosaic of the Democratic Republic of Congo

    Science.gov (United States)

    Molinario, G.; Hansen, M.; Potapov, P.

    2016-12-01

    High resolution satellite imagery obtained from the National Geospatial-Intelligence Agency through NASA was used to photo-interpret sample areas within the DRC. The sampled area is a stratification of the forest cover loss from circa 2014 that occurred either completely within the previously mapped homogeneous area of the Rural Complex, at its interface with primary forest, or in isolated forest perforations. Previous research resulted in a map of these areas that contextualizes forest loss depending on where it occurs and with what spatial density, leading to a better understanding of the real impacts of livelihood shifting cultivation on forest degradation. The stratified random sampling approach of these areas allows the characterization of the constituent land cover types within the
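
    The stratified random sampling described above can be sketched generically: draw a fixed number of sample areas from each stratum so that every forest-loss context, however rare, is represented. The stratum names and unit ids below are hypothetical placeholders, not the study's actual classes.

```python
import random

def stratified_sample(population, per_stratum, seed=0):
    """population: dict mapping stratum name -> list of unit ids.
    Returns per_stratum units sampled without replacement from each stratum
    (fewer if a stratum is smaller than per_stratum)."""
    rng = random.Random(seed)
    return {name: rng.sample(units, min(per_stratum, len(units)))
            for name, units in population.items()}

# Hypothetical strata of candidate sample areas:
strata = {
    "rural_complex": list(range(100)),
    "forest_interface": list(range(100, 160)),
    "isolated_perforation": list(range(160, 180)),
}
picked = stratified_sample(strata, per_stratum=10)
```

    Estimates from such a design must then be weighted by each stratum's share of the mapped area before they describe the landscape as a whole.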