WorldWideScience

Sample records for random sampling methodology

  1. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria...

  2. Independent random sampling methods

    CERN Document Server

    Martino, Luca; Míguez, Joaquín

    2018-01-01

    This book systematically addresses the design and analysis of efficient techniques for independent random sampling. Both general-purpose approaches, which can be used to generate samples from arbitrary probability distributions, and tailored techniques, designed to efficiently address common real-world practical problems, are introduced and discussed in detail. In turn, the monograph presents fundamental results and methodologies in the field, elaborating and developing them into the latest techniques. The theory and methods are illustrated with a varied collection of examples, which are discussed in detail in the text and supplemented with ready-to-run computer code. The main problem addressed in the book is how to generate independent random samples from an arbitrary probability distribution with the weakest possible constraints or assumptions in a form suitable for practical implementation. The authors review the fundamental results and methods in the field, address the latest methods, and emphasize the li...

  3. Rationale, design, methodology and sample characteristics for the Vietnam pre-conceptual micronutrient supplementation trial (PRECONCEPT): a randomized controlled study

    Directory of Open Access Journals (Sweden)

    Nguyen Phuong H

    2012-10-01

    Full Text Available Abstract Background Low birth weight and maternal anemia remain intractable problems in many developing countries. The adequacy of the current strategy of providing iron-folic acid (IFA) supplements only during pregnancy has been questioned given many women enter pregnancy with poor iron stores, the substantial micronutrient demand by maternal and fetal tissues, and programmatic issues related to timing and coverage of prenatal care. Weekly IFA supplementation for women of reproductive age (WRA) improves iron status and reduces the burden of anemia in the short term, but few studies have evaluated subsequent pregnancy and birth outcomes. The PRECONCEPT trial aims to determine whether pre-pregnancy weekly IFA or multiple micronutrient (MM) supplementation will improve birth outcomes and maternal and infant iron status compared to the current practice of prenatal IFA supplementation only. This paper provides an overview of study design, methodology and sample characteristics from baseline survey data and key lessons learned. Methods/design We have recruited 5011 WRA in a double-blind stratified randomized controlled trial in rural Vietnam and randomly assigned them to receive weekly supplements containing either: (1) 2800 μg folic acid; (2) 60 mg iron and 2800 μg folic acid; or (3) MM. Women who become pregnant receive daily IFA, and are being followed through pregnancy, delivery, and up to three months post-partum. Study outcomes include birth outcomes and maternal and infant iron status. Data are being collected on household characteristics, maternal diet and mental health, anthropometry, infant feeding practices, morbidity and compliance. Discussion The study is timely and responds to the WHO Global Expert Consultation which identified the need to evaluate the long term benefits of weekly IFA and MM supplementation in WRA. Findings will generate new information to help guide policy and programs designed to reduce the burden of anemia in women and

  4. Sampling methodology and PCB analysis

    International Nuclear Information System (INIS)

    Dominelli, N.

    1995-01-01

    As a class of compounds, PCBs are extremely stable and resist chemical and biological decomposition. Diluted solutions exposed to a range of environmental conditions will undergo some preferential degradation, and the resulting mixture may differ considerably from the original PCB used as insulating fluid in electrical equipment. The structure of mixtures of PCBs (synthetic compounds prepared by direct chlorination of biphenyl with chlorine gas) is extremely complex and presents a formidable analytical problem, further complicated by the presence of PCBs as contaminants in matrices ranging from oils to soils and water. This paper provides some guidance on sampling and analytical procedures; it also points out various potential problems encountered during these processes. The guidelines provided deal with sample collection, storage and handling, sample stability, laboratory analysis (usually gas chromatography), determination of PCB concentration, calculation of total PCB content, and quality assurance. 1 fig.

  5. Methodology Series Module 5: Sampling Strategies.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling - based on the researcher's choice or on populations that are accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
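
    As an illustration of the two probability-sampling designs named above, here is a minimal Python sketch (ours, not the article's; the frame, strata, and sample sizes are invented) contrasting a simple random sample with a stratified random sample:

```python
import random

random.seed(42)

# Hypothetical sampling frame: 1,000 patient IDs, each tagged with a clinic.
frame = [(i, random.choice(["clinic_A", "clinic_B", "clinic_C"]))
         for i in range(1000)]

# Simple random sample: every unit has the same chance of selection.
simple = random.sample(frame, k=50)

# Stratified random sample: draw within each clinic so that every
# stratum is represented (proportional allocation).
strata = {}
for unit in frame:
    strata.setdefault(unit[1], []).append(unit)

stratified = []
for clinic, units in strata.items():
    k = round(50 * len(units) / len(frame))  # stratum share of the total n
    stratified.extend(random.sample(units, k))

print(len(simple), len(stratified))
```

    Rounding can make the stratified total drift a unit or two from the target n; real designs fix the allocation in advance.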

  6. Methodology series module 5: Sampling strategies

    Directory of Open Access Journals (Sweden)

    Maninder Singh Setia

    2016-01-01

    Full Text Available Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: (1) probability sampling – based on chance events (such as random numbers, flipping a coin, etc.); and (2) non-probability sampling – based on the researcher's choice or on populations that are accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.

  7. Methodology Series Module 5: Sampling Strategies

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling – based on the researcher's choice or on populations that are accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study. PMID:27688438

  8. Methodology Series Module 5: Sampling Strategies

    OpenAIRE

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling – based on the researcher's choice or on populations that are accessible and available. Some of the non-probabilit...

  9. Environmental sample banking-research and methodology

    International Nuclear Information System (INIS)

    Becker, D.A.

    1976-01-01

    The National Bureau of Standards (NBS), in cooperation with the Environmental Protection Agency and the National Science Foundation, is engaged in a research program establishing methodology for environmental sample banking. This program is aimed toward evaluating the feasibility of a National Environmental Specimen Bank (NESB). The capability for retrospective chemical analyses to evaluate changes in our environment would provide useful information. Much of this information could not be obtained using data from previously analyzed samples. However, to assure validity for these stored samples, they must be sampled, processed and stored under rigorously evaluated, controlled and documented conditions. The program currently under way in the NBS Analytical Chemistry Division has 3 main components. The first is an extensive survey of the available literature concerning problems of contamination, losses and storage. The components of interest include trace elements, pesticides, other trace organics (PCBs, plasticizers, etc.), radionuclides and microbiological species. The second component is an experimental evaluation of contamination and losses during sampling and sample handling. Of particular interest here is research into container cleaning methodology for trace elements, with respect to adsorption, desorption, leaching and partial dissolution by various sample matrices. The third component of this program is an evaluation of existing methodology for long-term sample storage

  10. Large sample NAA facility and methodology development

    International Nuclear Information System (INIS)

    Roth, C.; Gugiu, D.; Barbos, D.; Datcu, A.; Aioanei, L.; Dobrea, D.; Taroiu, I. E.; Bucsa, A.; Ghinescu, A.

    2013-01-01

    A Large Sample Neutron Activation Analysis (LSNAA) facility has been developed at the TRIGA Annular Core Pulsed Reactor (ACPR) operated by the Institute for Nuclear Research in Pitesti, Romania. The central irradiation cavity of the ACPR core can accommodate a large irradiation device. The ACPR neutron flux characteristics are well known, and spectrum adjustment techniques have been successfully applied to enhance the thermal component of the neutron flux in the central irradiation cavity. An analysis methodology was developed using the MCNP code in order to estimate counting efficiency and correction factors for the major perturbing phenomena. Test experiments, comparisons with classical instrumental neutron activation analysis (INAA) methods, and an international inter-comparison exercise have been performed to validate the new methodology. (authors)

  11. Systematic versus random sampling in stereological studies.

    Science.gov (United States)

    West, Mark J

    2012-12-01

    The sampling that takes place at all levels of an experimental design must be random if the estimate is to be unbiased in a statistical sense. There are two fundamental ways by which one can make a random sample of the sections and positions to be probed on the sections. Using a card-sampling analogy, one can pick any card at all out of a deck of cards. This is referred to as independent random sampling because the sampling of any one card is made without reference to the position of the other cards. The other approach to obtaining a random sample would be to pick a card within a set number of cards and others at equal intervals within the deck. Systematic sampling along one axis of many biological structures is more efficient than random sampling, because most biological structures are not randomly organized. This article discusses the merits of systematic versus random sampling in stereological studies.
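
    The card analogy translates directly into code. A minimal sketch (ours, not the article's) contrasting the two schemes on a "deck" of 52 section positions:

```python
import random

random.seed(1)
deck = list(range(52))  # positions of sections, in the card analogy

# Independent random sampling: each pick is made without reference
# to the positions of the other picks.
independent = random.sample(deck, k=4)

# Systematic random sampling: one random start, then equal intervals.
interval = len(deck) // 4
start = random.randrange(interval)
systematic = deck[start::interval]

print(sorted(independent), systematic)
```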

  12. A Bayesian Justification for Random Sampling in Sample Survey

    Directory of Open Access Journals (Sweden)

    Glen Meeden

    2012-07-01

    Full Text Available In the usual Bayesian approach to survey sampling, the sampling design plays a minimal role at best. Although a close relationship between exchangeable prior distributions and simple random sampling has been noted, how to formally integrate simple random sampling into the Bayesian paradigm is not clear. Recently it has been argued that the sampling design can be thought of as part of a Bayesian's prior distribution. We show here that under this scenario simple random sampling can be given a Bayesian justification in survey sampling.

  13. k-Means: Random Sampling Procedure

    Indian Academy of Sciences (India)

    k-Means: Random Sampling Procedure. The optimal 1-mean is approximated by the centroid of a random sample (Inaba et al.): if S is a random sample of size O(1/ε), then the centroid of S is a (1+ε)-approximate centroid of P with constant probability.
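
    A quick numerical check of this statement, as a hedged sketch (the point cloud and ε are invented for illustration):

```python
import random

random.seed(0)
P = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(10000)]

def centroid(pts):
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)

def cost(c, pts):
    # 1-mean cost: sum of squared distances to the candidate centre c.
    return sum((x - c[0]) ** 2 + (y - c[1]) ** 2 for x, y in pts)

eps = 0.1
S = random.sample(P, k=int(1 / eps))  # sample of size O(1/eps)

ratio = cost(centroid(S), P) / cost(centroid(P), P)
print(f"cost ratio: {ratio:.3f}")  # close to 1 with constant probability
```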

  14. Sampling problems for randomly broken sticks

    Energy Technology Data Exchange (ETDEWEB)

    Huillet, Thierry [Laboratoire de Physique Theorique et Modelisation, CNRS-UMR 8089 et Universite de Cergy-Pontoise, 5 mail Gay-Lussac, 95031, Neuville sur Oise (France)

    2003-04-11

    Consider the random partitioning model of a population (represented by a stick of length 1) into n species (fragments) with identically distributed random weights (sizes). Upon ranking the fragments' weights according to ascending sizes, let S_{m:n} be the size of the mth smallest fragment. Assume that some observer is sampling such populations as follows: drop at random k points (the sample size) onto this stick and record the corresponding numbers of visited fragments. We shall investigate the following sampling problems: (1) what is the sample size if the sampling is carried out until the first visit of the smallest fragment (size S_{1:n})? (2) For a given sample size, have all the fragments of the stick been visited at least once or not? This question is related to Feller's random coupon collector problem. (3) In what order are new fragments being discovered, and what is the random number of samples separating the discovery of consecutive new fragments until exhaustion of the list? For this problem, the distribution of the size-biased permutation of the species' weights (that is, the sequence of their weights in their order of appearance) is needed and studied.

  15. An unbiased estimator of the variance of simple random sampling using mixed random-systematic sampling

    OpenAIRE

    Padilla, Alberto

    2009-01-01

    Systematic sampling is a commonly used technique due to its simplicity and ease of implementation. The drawback of this simplicity is that it is not possible to estimate the design variance without bias. There are several ways to circumvent this problem. One method is to suppose that the variable of interest has a random order in the population, so the sample variance of simple random sampling without replacement is used. By means of a mixed random-systematic sample, an unbiased estimator o...

  16. Methodological aspects on microdialysis sampling and measurements

    OpenAIRE

    Abrahamsson, Pernilla

    2010-01-01

    Background: The microdialysis (MD) technique is widespread and used both experimentally and in clinical practice. The MD technique allows continuous collection of small molecules such as glucose, lactate, pyruvate and glycerol. Samples are often analysed using the CMA 600 analyser, an enzymatic and colorimetric analyser. Data evaluating the performance of the CMA 600 analysis system and associated sample handling are sparse. The aim of this work was to identify sources of variabilit...

  17. Generation and Analysis of Constrained Random Sampling Patterns

    DEFF Research Database (Denmark)

    Pierzchlewski, Jacek; Arildsen, Thomas

    2016-01-01

    Random sampling is a technique for signal acquisition which is gaining popularity in practical signal processing systems. Nowadays, event-driven analog-to-digital converters make random sampling feasible in practical applications. A process of random sampling is defined by a sampling pattern, which indicates signal sampling points in time. Practical random sampling patterns are constrained by ADC characteristics and application requirements. In this paper, we introduce statistical methods which evaluate random sampling pattern generators with emphasis on practical applications. Furthermore, we propose an algorithm that generates random sampling patterns dedicated to event-driven ADCs better than existing sampling pattern generators. Finally, implementation issues of random sampling patterns are discussed.
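
    A hedged sketch of what a constrained pattern generator might look like; the minimum-gap constraint is a stand-in for the ADC characteristics mentioned in the abstract, and the rejection loop is our illustration, not the authors' algorithm:

```python
import random

def constrained_pattern(n_points, grid_len, min_gap, seed=7):
    """Draw sampling instants on a time grid, at least min_gap apart.

    Rejection-style sketch: redraw until the pattern satisfies the
    spacing constraint (an assumed stand-in for ADC limits).
    """
    rng = random.Random(seed)
    while True:
        pts = sorted(rng.sample(range(grid_len), n_points))
        gaps = [b - a for a, b in zip(pts, pts[1:])]
        if all(g >= min_gap for g in gaps):
            return pts

print(constrained_pattern(8, 200, min_gap=10))
```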

  18. Acceptance sampling using judgmental and randomly selected samples

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Shulman, Stanley A.; Anderson, Kevin K.; Wilson, John E.; Pulsipher, Brent A.; Sieber, W. Karl

    2010-09-01

    We present a Bayesian model for acceptance sampling where the population consists of two groups, each with different levels of risk of containing unacceptable items. Expert opinion, or judgment, may be required to distinguish between the high and low-risk groups. Hence, high-risk items are likely to be identified (and sampled) using expert judgment, while the remaining low-risk items are sampled randomly. We focus on the situation where all observed samples must be acceptable. Consequently, the objective of the statistical inference is to quantify the probability that a large percentage of the unsampled items in the population are also acceptable. We demonstrate that traditional (frequentist) acceptance sampling and simpler Bayesian formulations of the problem are essentially special cases of the proposed model. We explore the properties of the model in detail, and discuss the conditions necessary to ensure that required sample sizes are a non-decreasing function of the population size. The method is applicable to a variety of acceptance sampling problems, and, in particular, to environmental sampling where the objective is to demonstrate the safety of reoccupying a remediated facility that has been contaminated with a lethal agent.

  19. A random sampling procedure for anisotropic distributions

    International Nuclear Information System (INIS)

    Nagrajan, P.S.; Sethulakshmi, P.; Raghavendran, C.P.; Bhatia, D.P.

    1975-01-01

    A procedure is described for sampling the scattering angle of neutrons according to specified angular distribution data. The cosine of the scattering angle is written as a double Legendre expansion in the incident neutron energy and a random number. The coefficients of the expansion are given for C, N, O, Si, Ca, Fe and Pb; these elements are of interest in dosimetry and shielding. (author)
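
    A sketch of the scheme the abstract describes, with invented coefficients (the paper tabulates them per element from evaluated angular-distribution data); numpy's 2D Legendre evaluator plays the role of the double expansion:

```python
import numpy as np
from numpy.polynomial.legendre import legval2d

rng = np.random.default_rng(3)

# Invented 3x3 coefficient table c[i, j]; in the paper these are fitted
# per element (C, N, O, Si, Ca, Fe, Pb).
c = np.array([[0.10, 0.50, 0.05],
              [0.20, 0.10, 0.02],
              [0.05, 0.02, 0.01]])

def sample_mu(E, emin=1e-5, emax=20.0):
    """Sample the scattering cosine for incident energy E (MeV, assumed range)."""
    x = 2 * (E - emin) / (emax - emin) - 1  # map energy onto [-1, 1]
    r = 2 * rng.random() - 1                # uniform random number on [-1, 1]
    return float(np.clip(legval2d(x, r, c), -1.0, 1.0))

print([round(sample_mu(2.0), 3) for _ in range(5)])
```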

  20. Systematic random sampling of the comet assay.

    Science.gov (United States)

    McArt, Darragh G; Wasson, Gillian R; McKerr, George; Saetzler, Kurt; Reed, Matt; Howard, C Vyvyan

    2009-07-01

    The comet assay is a technique used to quantify DNA damage and repair at a cellular level. In the assay, cells are embedded in agarose and the cellular content is stripped away leaving only the DNA trapped in an agarose cavity which can then be electrophoresed. The damaged DNA can enter the agarose and migrate while the undamaged DNA cannot and is retained. DNA damage is measured as the proportion of the migratory 'tail' DNA compared to the total DNA in the cell. The fundamental basis of these arbitrary values is obtained in the comet acquisition phase using fluorescence microscopy with a stoichiometric stain in tandem with image analysis software. Current methods deployed in such an acquisition are expected to be both objective and random. In this paper we examine the 'randomness' of the acquisition phase and suggest an alternative method that offers both objective and unbiased comet selection. In order to achieve this, we have adopted a survey sampling approach widely used in stereology, which offers a method of systematic random sampling (SRS). This is desirable as it offers an impartial and reproducible method of comet analysis that can be used either manually or in automated form. By making use of an unbiased sampling frame and using microscope verniers, we are able to increase the precision of estimates of DNA damage. Results obtained from a multiple-user pooled variation experiment showed that the SRS technique attained a lower variability than the traditional approach. A single-user repetition experiment showed greater individual variances while not being detrimental to overall averages. This suggests that the SRS method offers a better reflection of DNA damage for a given slide and also offers better user reproducibility.

  1. Novel methodology to isolate microplastics from vegetal-rich samples.

    Science.gov (United States)

    Herrera, Alicia; Garrido-Amador, Paloma; Martínez, Ico; Samper, María Dolores; López-Martínez, Juan; Gómez, May; Packard, Theodore T

    2018-04-01

    Microplastics are small plastic particles, globally distributed throughout the oceans. To properly study them, all the methodologies for their sampling, extraction, and measurement should be standardized. For heterogeneous samples containing sediments, animal tissues and zooplankton, several procedures have been described. However, definitive methodologies for samples rich in algae and plant material have not yet been developed. The aim of this study was to find the best extraction protocol for vegetal-rich samples by comparing the efficacies of five previously described digestion methods and a novel density separation method. A protocol using 96% ethanol for density separation performed better than the five digestion methods tested, even better than H2O2 digestion. As it was the most efficient, simple, safe and inexpensive method for isolating microplastics from vegetal-rich samples, we recommend it as a standard separation method. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. BWIP-RANDOM-SAMPLING, Random Sample Generation for Nuclear Waste Disposal

    International Nuclear Information System (INIS)

    Sagar, B.

    1989-01-01

    1 - Description of program or function: Random samples for different distribution types are generated. Distribution types as required for performance assessment modeling of geologic nuclear waste disposal are provided. These are: - Uniform, - Log-uniform (base 10 or natural), - Normal, - Lognormal (base 10 or natural), - Exponential, - Bernoulli, - User defined continuous distribution. 2 - Method of solution: A linear congruential generator is used for uniform random numbers. A set of functions is used to transform the uniform distribution to the other distributions. Stratified, rather than random, sampling can be chosen. Truncated limits can be specified on many distributions, whose usual definition has an infinite support. 3 - Restrictions on the complexity of the problem: Generation of correlated random variables is not included
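
    A minimal sketch of the solution method described in item 2: a linear congruential generator for uniform deviates plus transforms to a few of the listed distributions. The LCG constants and transform choices are illustrative, not the program's actual ones:

```python
import math

class LCG:
    """Toy linear congruential generator (constants are illustrative)."""
    def __init__(self, seed=12345):
        self.state = seed

    def uniform(self):
        self.state = (1103515245 * self.state + 12345) % 2**31
        return self.state / 2**31

rng = LCG()

def log_uniform(a, b):
    # Transform a uniform deviate to log-uniform on [a, b] (natural base).
    return math.exp(math.log(a) + (math.log(b) - math.log(a)) * rng.uniform())

def exponential(rate):
    # Inverse-transform sampling for the exponential distribution.
    return -math.log(1.0 - rng.uniform()) / rate

def normal(mu, sigma):
    # Box-Muller transform from two uniform deviates.
    u1, u2 = 1.0 - rng.uniform(), rng.uniform()
    return mu + sigma * math.sqrt(-2 * math.log(u1)) * math.cos(2 * math.pi * u2)

print(log_uniform(1e-6, 1e-2), exponential(0.5), normal(0.0, 1.0))
```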

  3. Biomass Thermogravimetric Analysis: Uncertainty Determination Methodology and Sampling Maps Generation

    Science.gov (United States)

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

    The objective of this study was to develop a methodology for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG), including moisture, volatile matter, fixed carbon and ash content. The sampling procedure of the TG analysis was of particular interest and was conducted with care. The results of the present study were compared to those of a prompt analysis, and a correlation between the mean values and maximum sampling errors of the two methods was not observed. In general, low and acceptable levels of uncertainty and error were obtained, demonstrating that the properties evaluated by TG analysis were representative of the overall fuel composition. The accurate determination of the thermal properties of biomass with precise confidence intervals is of particular interest in energetic biomass applications. PMID:20717532

  4. Cluster-randomized Studies in Educational Research: Principles and Methodological Aspects.

    Science.gov (United States)

    Dreyhaupt, Jens; Mayer, Benjamin; Keis, Oliver; Öchsner, Wolfgang; Muche, Rainer

    2017-01-01

    An increasing number of studies are being performed in educational research to evaluate new teaching methods and approaches. These studies could be performed more efficiently and deliver more convincing results if they more strictly applied and complied with recognized standards of scientific studies. Such an approach could substantially increase the quality, in particular, of prospective, two-arm (intervention) studies that aim to compare two different teaching methods. A key standard in such studies is randomization, which can minimize systematic bias in study findings; such bias may result if the two study arms are not structurally equivalent. If possible, educational research studies should also achieve this standard, although this is not yet generally the case. Some difficulties and concerns exist, particularly regarding organizational and methodological aspects. An important point to consider in educational research studies is that usually individuals cannot be randomized, because of the teaching situation, and instead whole groups have to be randomized (so-called "cluster randomization"). Compared with studies with individual randomization, studies with cluster randomization normally require (significantly) larger sample sizes and more complex methods for calculating sample size. Furthermore, cluster-randomized studies require more complex methods for statistical analysis. The consequence of the above is that a competent expert with the relevant specialist knowledge needs to be involved in all phases of cluster-randomized studies. Studies to evaluate new teaching methods need to make greater use of randomization in order to achieve scientifically convincing results. Therefore, in this article we describe the general principles of cluster randomization and how to implement these principles, and we also outline practical aspects of using cluster randomization in prospective, two-arm comparative educational research studies.
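
    The sample-size inflation mentioned above is usually expressed through the design effect. A hedged sketch of the standard formula (the numbers are invented for illustration):

```python
import math

def cluster_adjusted_n(n_individual, cluster_size, icc):
    """Inflate an individually randomized sample size by the design
    effect DEFF = 1 + (m - 1) * ICC for clusters of average size m."""
    deff = 1 + (cluster_size - 1) * icc
    return math.ceil(n_individual * deff)

# E.g. 128 students per arm under individual randomization, taught in
# classes of 25, with an assumed intraclass correlation of 0.05:
print(cluster_adjusted_n(128, 25, 0.05))  # -> 282 students per arm
```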

  5. Health plan auditing: 100-percent-of-claims vs. random-sample audits.

    Science.gov (United States)

    Sillup, George P; Klimberg, Ronald K

    2011-01-01

    The objective of this study was to examine the relative efficacy of two different methodologies for auditing self-funded medical claim expenses: 100-percent-of-claims auditing versus random-sampling auditing. Multiple data sets of claim errors or 'exceptions' from two Fortune-100 corporations were analysed and compared to 100 simulated audits of 300- and 400-claim random samples. Random-sample simulations failed to identify a significant number and amount of the errors that ranged from $200,000 to $750,000. These results suggest that health plan expenses of corporations could be significantly reduced if they audited 100% of claims and embraced a zero-defect approach.
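
    A toy simulation of the paper's point: with rare, high-value errors, a 300-claim random sample sees only a sliver of the error dollars in a large claims file (all figures invented for illustration):

```python
import random

random.seed(8)

# Hypothetical claims file: 40,000 claims, 0.5% contain an error,
# each error worth $100-$5,000; a zero means the claim is clean.
errors = [round(random.uniform(100, 5000), 2) if random.random() < 0.005 else 0.0
          for _ in range(40000)]

audit = random.sample(errors, k=300)  # a 300-claim random-sample audit

print(f"errors in file:  ${sum(errors):,.0f}")
print(f"found by sample: ${sum(audit):,.0f}")
```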

  6. Cluster-randomized Studies in Educational Research: Principles and Methodological Aspects

    Directory of Open Access Journals (Sweden)

    Dreyhaupt, Jens

    2017-05-01

    Full Text Available An increasing number of studies are being performed in educational research to evaluate new teaching methods and approaches. These studies could be performed more efficiently and deliver more convincing results if they more strictly applied and complied with recognized standards of scientific studies. Such an approach could substantially increase the quality, in particular, of prospective, two-arm (intervention) studies that aim to compare two different teaching methods. A key standard in such studies is randomization, which can minimize systematic bias in study findings; such bias may result if the two study arms are not structurally equivalent. If possible, educational research studies should also achieve this standard, although this is not yet generally the case. Some difficulties and concerns exist, particularly regarding organizational and methodological aspects. An important point to consider in educational research studies is that usually individuals cannot be randomized, because of the teaching situation, and instead whole groups have to be randomized (so-called “cluster randomization”). Compared with studies with individual randomization, studies with cluster randomization normally require (significantly) larger sample sizes and more complex methods for calculating sample size. Furthermore, cluster-randomized studies require more complex methods for statistical analysis. The consequence of the above is that a competent expert with the relevant specialist knowledge needs to be involved in all phases of cluster-randomized studies. Studies to evaluate new teaching methods need to make greater use of randomization in order to achieve scientifically convincing results. Therefore, in this article we describe the general principles of cluster randomization and how to implement these principles, and we also outline practical aspects of using cluster randomization in prospective, two-arm comparative educational research studies.

  7. A random spatial sampling method in a rural developing nation

    Science.gov (United States)

    Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas

    2014-01-01

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...

  8. Experience-Sampling Methodology with a Mobile Device in Fibromyalgia

    Directory of Open Access Journals (Sweden)

    Castilla Diana

    2012-01-01

    Full Text Available This work describes the usability studies conducted in the development of an experience-sampling methodology (ESM) system running on a mobile device. The goal of the system is to improve the accuracy and ecology of gathering daily self-report data in individuals suffering from a chronic pain condition, fibromyalgia. The usability studies showed that the software developed to conduct ESM with mobile devices (smartphones, cell phones) can be successfully used by individuals with fibromyalgia of different ages and with a low level of expertise in the use of information and communication technologies. 100% of users completed the tasks successfully, although some were completely illiterate. There also seems to be a clear difference in the mode of interaction observed in the two studies carried out.

  9. Transuranium analysis methodologies for biological and environmental samples

    International Nuclear Information System (INIS)

    Wessman, R.A.; Lee, K.D.; Curry, B.; Leventhal, L.

    1978-01-01

    Analytical procedures for the most abundant transuranium nuclides in the environment (i.e., plutonium and, to a lesser extent, americium) are available. There is a lack of procedures for doing sequential analysis for Np, Pu, Am, and Cm in environmental samples, primarily because of current emphasis on Pu and Am. Reprocessing requirements and waste disposal connected with the fuel cycle indicate that neptunium and curium must be considered in environmental radioactive assessments. Therefore it was necessary to develop procedures that determine all four of these radionuclides in the environment. The state of the art of transuranium analysis methodology as applied to environmental samples is discussed relative to different sample sources, such as soil, vegetation, air, water, and animals. Isotope-dilution analysis with 243Am (239Np) and 236Pu or 242Pu radionuclide tracers is used. Americium and curium are analyzed as a group, with 243Am as the tracer. Sequential extraction procedures employing bis(2-ethyl-hexyl)orthophosphoric acid (HDEHP) were found to result in lower yields and higher Am-Cm fractionation than ion-exchange methods.

  10. Power Spectrum Estimation of Randomly Sampled Signals

    DEFF Research Database (Denmark)

    Velte, C. M.; Buchhave, P.; K. George, W.

    ...algorithms: sample-and-hold and the direct spectral estimator without residence time weighting. The computer-generated signal is a Poisson process with a sample rate proportional to velocity magnitude that consists of well-defined frequency content, which makes bias easy to spot. The idea...

  11. Are most samples of animals systematically biased? Consistent individual trait differences bias samples despite random sampling.

    Science.gov (United States)

    Biro, Peter A

    2013-02-01

    Sampling animals from the wild for study is something nearly every biologist has done, but despite our best efforts to obtain random samples of animals, 'hidden' trait biases may still exist. For example, consistent behavioral traits can affect trappability/catchability, independent of obvious factors such as size and gender, and these traits are often correlated with other repeatable physiological and/or life history traits. If so, systematic sampling bias may exist for any of these traits. The extent to which this is a problem, of course, depends on the magnitude of bias, which is presently unknown because the underlying trait distributions in populations are usually unknown, or unknowable. Indeed, our present knowledge about sampling bias comes from samples (not complete population censuses), which can possess bias to begin with. I had the unique opportunity to create naturalized populations of fish by seeding each of four small fishless lakes with equal densities of slow-, intermediate-, and fast-growing fish. Using sampling methods that are not size-selective, I observed that fast-growing fish were up to two times more likely to be sampled than slower-growing fish. This indicates substantial and systematic bias with respect to an important life history trait (growth rate). If correlations between behavioral, physiological and life-history traits are as widespread as the literature suggests, then many animal samples may be systematically biased with respect to these traits (e.g., when collecting animals for laboratory use), and this may affect our inferences about population structure and abundance. I conclude with a discussion of ways to minimize sampling bias for particular physiological/behavioral/life-history types within animal populations.
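
    A hedged simulation of the bias described here: if catchability correlates with growth rate (the weights below are invented), nominally random gear still returns a biased sample:

```python
import random

random.seed(5)

# Equal thirds of slow, intermediate and fast growers, as in the lake seeding.
population = ["slow"] * 1000 + ["intermediate"] * 1000 + ["fast"] * 1000

# Assumed catchability: bolder, fast-growing fish encounter the gear more often.
weight = {"slow": 1.0, "intermediate": 1.5, "fast": 2.0}

sample = random.choices(population,
                        weights=[weight[t] for t in population], k=300)

for t in ("slow", "intermediate", "fast"):
    print(t, sample.count(t))  # fast growers ~2x as frequent as slow ones
```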

  12. Efficient sampling of complex network with modified random walk strategies

    Science.gov (United States)

    Xie, Yunya; Chang, Shuhua; Zhang, Zhipeng; Zhang, Mi; Yang, Lei

    2018-02-01

    We present two novel random walk strategies: choosing seed node (CSN) random walk and no-retracing (NR) random walk. Different from classical random walk sampling, the CSN and NR strategies focus on the influences of the seed node choice and path overlap, respectively. The three random walk samplings are applied to the Erdös-Rényi (ER), Barabási-Albert (BA), Watts-Strogatz (WS), and weighted USAir networks. Then, the major properties of the sampled subnets, such as sampling efficiency, degree distributions, average degree and average clustering coefficient, are studied. Similar conclusions can be reached with these three random walk strategies. Firstly, networks with small scales and simple structures are conducive to sampling. Secondly, the average degree and the average clustering coefficient of the sampled subnet tend to the corresponding values of the original networks within limited steps. Thirdly, all the degree distributions of the subnets are slightly biased to the high-degree side. However, the NR strategy performs better for the average clustering coefficient of the subnet. In the real weighted USAir networks, some salient characteristics, such as the larger clustering coefficient and the fluctuation of the degree distribution, are reproduced well by these random walk strategies.
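
    A minimal sketch of the no-retracing (NR) idea on a toy graph; the adjacency structure and the fallback rule (retrace only when it is the sole option) are our assumptions, not the paper's exact specification:

```python
import random

def nr_random_walk(adj, start, steps, seed=2):
    """No-retracing random walk: never step straight back to the node
    we just came from, unless it is the only neighbour."""
    rng = random.Random(seed)
    walk, prev = [start], None
    for _ in range(steps):
        here = walk[-1]
        options = [v for v in adj[here] if v != prev] or adj[here]
        prev = here
        walk.append(rng.choice(options))
    return walk

# Toy undirected graph as an adjacency dict.
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2]}
print(nr_random_walk(adj, start=0, steps=10))
```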

  13. Sampling large random knots in a confined space

    International Nuclear Information System (INIS)

    Arsuaga, J; Blackstone, T; Diao, Y; Hinson, K; Karadayi, E; Saito, M

    2007-01-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is on the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications

  14. Sampling large random knots in a confined space

    Science.gov (United States)

    Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.

    2007-09-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is on the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  15. Sampling large random knots in a confined space

    Energy Technology Data Exchange (ETDEWEB)

    Arsuaga, J [Department of Mathematics, San Francisco State University, 1600 Holloway Ave, San Francisco, CA 94132 (United States); Blackstone, T [Department of Computer Science, San Francisco State University, 1600 Holloway Ave., San Francisco, CA 94132 (United States); Diao, Y [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Hinson, K [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Karadayi, E [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States); Saito, M [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States)

    2007-09-28

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is on the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.

  16. SOME SYSTEMATIC SAMPLING STRATEGIES USING MULTIPLE RANDOM STARTS

    OpenAIRE

    Sampath Sundaram; Ammani Sivaraman

    2010-01-01

    In this paper an attempt is made to extend linear systematic sampling using multiple random starts due to Gautschi (1957) to various types of systematic sampling schemes available in the literature, namely (i) Balanced Systematic Sampling (BSS) of Sethi (1965) and (ii) Modified Systematic Sampling (MSS) of Singh, Jindal, and Garg (1968). Further, the proposed methods were compared with the Yates corrected estimator developed with reference to Gautschi’s Linear systematic samplin...

  17. Towards Standardization of Sampling Methodology for Evaluation of ...

    African Journals Online (AJOL)

    This article proposes the procedure that may be adopted for comparable, representative and cost effective, soil sampling, and thereafter explores the policy issues regarding standardization of sampling activities and analytical process as it relates to soil pollution in Nigeria. Standardized sampling and analytical data for soil ...

  18. The importance of sound methodology in environmental DNA sampling

    Science.gov (United States)

    T. M. Wilcox; K. J. Carim; M. K. Young; K. S. McKelvey; T. W. Franklin; M. K. Schwartz

    2018-01-01

    Environmental DNA (eDNA) sampling - which enables inferences of species’ presence from genetic material in the environment - is a powerful tool for sampling rare fishes. Numerous studies have demonstrated that eDNA sampling generally provides greater probabilities of detection than traditional techniques (e.g., Thomsen et al. 2012; McKelvey et al. 2016; Valentini et al...

  19. Evaluating the statistical methodology of randomized trials on dentin hypersensitivity management.

    Science.gov (United States)

    Matranga, Domenica; Matera, Federico; Pizzo, Giuseppe

    2017-12-27

    The present study aimed to evaluate the characteristics and quality of the statistical methodology used in clinical studies on dentin hypersensitivity management. An electronic search was performed for data published from 2009 to 2014 by using PubMed, Ovid/MEDLINE, and Cochrane Library databases. The primary search terms were used in combination. Eligibility criteria included randomized clinical trials that evaluated the efficacy of desensitizing agents in terms of reducing dentin hypersensitivity. A total of 40 studies were considered eligible for assessment of the quality of statistical methodology. The four main concerns identified were i) use of nonparametric tests in the presence of large samples, coupled with lack of information about normality and equality of variances of the response; ii) lack of P-value adjustment for multiple comparisons; iii) failure to account for interactions between treatment and follow-up time; and iv) no information about the number of teeth examined per patient and the consequent lack of a cluster-specific approach in data analysis. Owing to these concerns, statistical methodology was judged as inappropriate in 77.1% of the 35 studies that used parametric methods. Additional studies with appropriate statistical analysis are required to obtain appropriate assessment of the efficacy of desensitizing agents.

  20. Randomized controlled trials of simulation-based interventions in Emergency Medicine: a methodological review.

    Science.gov (United States)

    Chauvin, Anthony; Truchot, Jennifer; Bafeta, Aida; Pateron, Dominique; Plaisance, Patrick; Yordanov, Youri

    2018-04-01

    The number of trials assessing Simulation-Based Medical Education (SBME) interventions has rapidly expanded. Many studies show that potential flaws in the design, conduct and reporting of randomized controlled trials (RCTs) can bias their results. We conducted a methodological review of RCTs assessing SBME in Emergency Medicine (EM) and examined their methodological characteristics. We searched MEDLINE via PubMed for RCTs that assessed a simulation intervention in EM, published in 6 general and internal medicine journals and in the top 10 EM journals. The Cochrane Collaboration risk of bias tool was used to assess risk of bias, intervention reporting was evaluated based on the "template for intervention description and replication" checklist, and methodological quality was evaluated with the Medical Education Research Study Quality Instrument (MERSQI). Report selection and data extraction were done by 2 independent researchers. From 1394 RCTs screened, 68 trials assessed a SBME intervention; they represent one quarter of our sample. Cardiopulmonary resuscitation (CPR) is the most frequent topic (81%). Random sequence generation and allocation concealment were performed correctly in 66 and 49% of trials. Blinding of participants and assessors was performed correctly in 19 and 68%. Risk of attrition bias was low in three-quarters of the studies (n = 51). Risk of selective reporting bias was unclear in nearly all studies. The mean MERSQI score was 13.4/18; 4% of the reports provided a description allowing replication of the intervention. Trials assessing simulation represent one quarter of RCTs in EM. Their quality remains unclear, and reproducing the interventions appears challenging due to reporting issues.

  1. Random sampling of evolution time space and Fourier transform processing

    International Nuclear Information System (INIS)

    Kazimierczuk, Krzysztof; Zawadzka, Anna; Kozminski, Wiktor; Zhukov, Igor

    2006-01-01

    Application of the Fourier transform for processing 3D NMR spectra with random sampling of the evolution time space is presented. The 2D FT is calculated for pairs of frequencies, instead of the conventional sequence of one-dimensional transforms. Signal-to-noise ratios and linewidths for different random distributions were investigated by simulations and experiments. The experimental examples include 3D HNCA, HNCACB and 15N-edited NOESY-HSQC spectra of a 13C,15N-labeled ubiquitin sample. The results revealed the general applicability of the proposed method and a significant improvement of resolution in comparison with conventional spectra recorded in the same time.
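
    A one-dimensional sketch of the idea: with randomly placed evolution times, the Fourier transform is evaluated as a direct sum on a chosen frequency grid; in the 3D experiments the same sum runs over pairs of frequencies. Signal parameters below are invented:

```python
import numpy as np

rng = np.random.default_rng(11)

# Randomly sampled evolution times instead of a uniform grid.
t = np.sort(rng.uniform(0.0, 0.1, size=256))                # seconds
fid = np.exp(2j * np.pi * 180.0 * t) * np.exp(-t / 0.05)    # one 180 Hz line

# Direct (nonuniform) Fourier sum on an explicit frequency grid.
freqs = np.linspace(0.0, 500.0, 1024)
spectrum = np.exp(-2j * np.pi * np.outer(freqs, t)) @ fid

print(freqs[np.argmax(np.abs(spectrum))])  # peak near 180 Hz
```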

  2. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality

    Science.gov (United States)

    Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Introduction Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in
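
    The response, cooperation, refusal, and contact rates reported here follow AAPOR definitions. A hedged sketch of our reading of those formulas (rate variants RR2, COOP2, REF1, and CON1; the disposition counts below are invented, and e is the assumed eligibility rate of unknown-status numbers):

```python
def aapor_rates(I, P, R, NC, O, UH=0, UO=0, e=1.0):
    """I=completes, P=partials, R=refusals, NC=non-contacts,
    O=other non-interviews, UH/UO=unknown-eligibility cases."""
    denom = (I + P) + (R + NC + O) + e * (UH + UO)
    return {
        "response":    (I + P) / denom,              # RR2
        "cooperation": (I + P) / ((I + P) + R + O),  # COOP2
        "refusal":     R / denom,                    # REF1
        "contact":     ((I + P) + R + O) / denom,    # CON1
    }

print(aapor_rates(I=500, P=100, R=50, NC=300, O=10, UH=40))
```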

  3. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Science.gov (United States)

    L'Engle, Kelly; Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit dialing of mobile

  4. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Directory of Open Access Journals (Sweden)

    Kelly L'Engle

    Full Text Available Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit

  5. Adaptive importance sampling of random walks on continuous state spaces

    International Nuclear Information System (INIS)

    Baggerly, K.; Cox, D.; Picard, R.

    1998-01-01

    The authors consider adaptive importance sampling for a random walk with scoring in a general state space. Conditions under which exponential convergence occurs to the zero-variance solution are reviewed. These results generalize previous work for finite, discrete state spaces in Kollman (1993) and in Kollman, Baggerly, Cox, and Picard (1996). This paper is intended for nonstatisticians and includes considerable explanatory material.
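
    The identity underlying such schemes, E_p[f(X)] = E_q[f(X) p(X)/q(X)], can be illustrated with a deliberately simple, non-adaptive sketch: estimating a small threshold-crossing probability for a Gaussian random walk by drawing exponentially tilted steps. The walk length, threshold and tilt below are invented for illustration:

      import math, random

      # Estimate P(S_n > a) for S_n = sum of n N(0,1) steps by sampling the
      # steps from N(theta, 1) and reweighting with the likelihood ratio
      # p/q = exp(-theta * S_n + n * theta^2 / 2).
      def is_estimate(n=20, a=15.0, theta=0.75, m=100_000):
          total = 0.0
          for _ in range(m):
              s = sum(random.gauss(theta, 1.0) for _ in range(n))
              if s > a:
                  total += math.exp(-theta * s + n * theta * theta / 2.0)
          return total / m

      print(is_estimate())   # a roughly 3.4-sigma event, rare under naive sampling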

  6. A Mixed Methods Sampling Methodology for a Multisite Case Study

    Science.gov (United States)

    Sharp, Julia L.; Mobley, Catherine; Hammond, Cathy; Withington, Cairen; Drew, Sam; Stringfield, Sam; Stipanovic, Natalie

    2012-01-01

    The flexibility of mixed methods research strategies makes such approaches especially suitable for multisite case studies. Yet the utilization of mixed methods to select sites for these studies is rarely reported. The authors describe their pragmatic mixed methods approach to select a sample for their multisite mixed methods case study of a…

  7. Analytical methodologies for the determination of benzodiazepines in biological samples.

    Science.gov (United States)

    Persona, Karolina; Madej, Katarzyna; Knihnicki, Paweł; Piekoszewski, Wojciech

    2015-09-10

    Benzodiazepines are among the most important and most widely used medicaments. They demonstrate such therapeutic properties as anxiolytic, sedative, somnifacient, anticonvulsant, diastolic and muscle relaxant effects. However, despite the fact that benzodiazepines possess a high therapeutic index and are considered to be relatively safe, their use can be dangerous when they are: (1) co-administered with alcohol, (2) co-administered with other medicaments such as sedatives, antidepressants, neuroleptics or morphine-like substances, (3) taken while driving, or (4) used non-therapeutically as drugs of abuse or in drug-facilitated crimes. For these reasons benzodiazepines are still studied and determined in a variety of biological materials. In this article, sample preparation techniques which have been applied in the analysis of benzodiazepine drugs in biological samples are reviewed and presented. The next part of the article focuses on a review of analytical methods which have been employed for pharmacological, toxicological or forensic study of this group of drugs in biological matrices. The review is preceded by a description of the physicochemical properties of the selected benzodiazepines and of two sedative-hypnotic drugs that very often coexist in the same analyzed samples. Copyright © 2015. Published by Elsevier B.V.

  8. Quality of methodological reporting of randomized clinical trials of sodium-glucose cotransporter-2 (SGLT2) inhibitors

    Directory of Open Access Journals (Sweden)

    Hadeel Alfahmi

    2017-01-01

    Full Text Available Sodium-glucose cotransporter-2 (SGLT2) inhibitors are a new class of medicines approved recently for the treatment of type 2 diabetes. To improve the quality of randomized clinical trial (RCT) reports, the Consolidated Standards of Reporting Trials (CONSORT) statement for methodological features was created. In this study, we assessed the quality of methodological reporting of RCTs of SGLT2 inhibitors according to the 2010 CONSORT statement. We reviewed and analyzed the methodology of SGLT2 inhibitor RCTs that were approved by the Food & Drug Administration (FDA). Of the 27 trials, participants, eligibility criteria, and additional analyses were reported in 100% of the trials. In addition, trial design, interventions, and statistical methods were reported in 96.3% of the trials. Outcomes were reported in 93.6% of the trials. Settings were reported in 85.2% of the trials. Blinding and sample size were reported in 66.7% and 59.3% of the trials, respectively. Sequence allocation and the type of randomization were reported in 63% and 74.1% of the trials, respectively. Beyond those, a few methodological items were inadequately reported. Allocation concealment was inadequate in most of the trials, being reported in only 11.1% of them. The majority of RCTs showed high adherence to more than half of the methodological items of the 2010 CONSORT statement.

  9. Randomized clinical trials in dentistry: Risks of bias, risks of random errors, reporting quality, and methodologic quality over the years 1955-2013.

    Directory of Open Access Journals (Sweden)

    Humam Saltaji

    Full Text Available To examine the risks of bias, risks of random errors, reporting quality, and methodological quality of randomized clinical trials of oral health interventions and the development of these aspects over time. We included 540 randomized clinical trials from 64 selected systematic reviews. We extracted, in duplicate, details from each of the selected randomized clinical trials with respect to publication and trial characteristics, reporting and methodologic characteristics, and Cochrane risk of bias domains. We analyzed data using logistic regression and Chi-square statistics. Sequence generation was assessed to be inadequate (at unclear or high risk of bias) in 68% (n = 367) of the trials, while allocation concealment was inadequate in the majority of trials (n = 464; 85.9%). Blinding of participants and blinding of the outcome assessment were judged to be inadequate in 28.5% (n = 154) and 40.5% (n = 219) of the trials, respectively. A sample size calculation before the initiation of the study was not performed/reported in 79.1% (n = 427) of the trials, while the sample size was assessed as adequate in only 17.6% (n = 95) of the trials. Two thirds of the trials were not described as double blinded (n = 358; 66.3%), while the method of blinding was appropriate in 53% (n = 286) of the trials. We identified a significant decrease over time (1955-2013) in the proportion of trials assessed as having inadequately addressed methodological quality items (P < 0.05) in 30 out of the 40 quality criteria, or as being inadequate (at high or unclear risk of bias) in five domains of the Cochrane risk of bias tool: sequence generation, allocation concealment, incomplete outcome data, other sources of bias, and overall risk of bias. The risks of bias, risks of random errors, reporting quality, and methodological quality of randomized clinical trials of oral health interventions have improved over time; however, further efforts that contribute to the development of more stringent …

  10. Random vs. systematic sampling from administrative databases involving human subjects.

    Science.gov (United States)

    Hagino, C; Lo, R J

    1998-09-01

    Two sampling techniques, simple random sampling (SRS) and systematic sampling (SS), were compared to determine whether they yield similar and accurate distributions for the following four factors: age, gender, geographic location and years in practice. Any point estimate within 7 yr or 7 percentage points of its reference standard (SRS or the entire data set, i.e., the target population) was considered "acceptably similar" to the reference standard. The sampling frame was the entire membership database of the Canadian Chiropractic Association. The two sampling methods were tested using eight different sample sizes (n = 50, 100, 150, 200, 250, 300, 500, 800). From the profile/characteristics summaries of the four known factors [gender, average age, number (%) of chiropractors in each province and years in practice], between- and within-method chi-squared tests and unpaired t-tests were performed to determine whether any of the differences (descriptively greater than 7% or 7 yr) were also statistically significant. The strengths of the agreements between the provincial distributions were quantified by calculating the percent agreement for each (provincial pairwise comparison of methods). Any percent agreement less than 70% was judged to be unacceptable. Our assessments of the two sampling methods (SRS and SS) for the different sample sizes tested suggest that SRS and SS yielded acceptably similar results. Both methods started to yield "correct" sample profiles at approximately the same sample size (n > 200). SS is not only convenient, it can be recommended for sampling from large databases in which the data are listed without any inherent order biases other than alphabetical listing by surname.
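
    Both designs compared in the study take only a few lines each; the synthetic membership frame below (age, years in practice) merely stands in for an administrative database sorted by surname:

      import random

      def simple_random_sample(frame, n):
          return random.sample(frame, n)          # every unit equally likely

      def systematic_sample(frame, n):
          k = len(frame) // n                     # sampling interval
          start = random.randrange(k)             # random start in first interval
          return frame[start::k][:n]              # every k-th unit thereafter

      # Illustrative frame only: (age, years_in_practice) records.
      frame = [(random.randint(25, 70), random.randint(0, 40)) for _ in range(4000)]
      for draw in (simple_random_sample, systematic_sample):
          sample = draw(frame, 250)
          print(draw.__name__, sum(a for a, _ in sample) / len(sample))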

  11. Are marketed topical metronidazole creams bioequivalent? Evaluation by in vivo microdialysis sampling and tape stripping methodology

    DEFF Research Database (Denmark)

    Garcia Ortiz, Patricia Elodia; Hansen, S H; Shah, Surendra P.

    2011-01-01

    To evaluate the bioequivalence of 3 marketed topical metronidazole formulations by simultaneous dermal microdialysis and stratum corneum sampling by the tape stripping methodology, and to compare the techniques as tools for the determination of bioequivalence.

  12. RandomSpot: A web-based tool for systematic random sampling of virtual slides.

    Science.gov (United States)

    Wright, Alexander I; Grabsch, Heike I; Treanor, Darren E

    2015-01-01

    This paper describes work presented at the Nordic Symposium on Digital Pathology 2014, Linköping, Sweden. Systematic random sampling (SRS) is a stereological tool, which provides a framework to quickly build an accurate estimation of the distribution of objects or classes within an image, whilst minimizing the number of observations required. RandomSpot is a web-based tool for SRS in stereology, which systematically places equidistant points within a given region of interest on a virtual slide. Each point can then be visually inspected by a pathologist in order to generate an unbiased sample of the distribution of classes within the tissue. Further measurements can then be derived from the distribution, such as the ratio of tumor to stroma. RandomSpot replicates the fundamental principle of traditional light microscope grid-shaped graticules, with the added benefits associated with virtual slides, such as facilitated collaboration and automated navigation between points. Once the sample points have been added to the region(s) of interest, users can download the annotations and view them locally using their virtual slide viewing software. Since its introduction, RandomSpot has been used extensively for international collaborative projects, clinical trials and independent research projects. So far, the system has been used to generate over 21,000 sample sets, and has been used to generate data for use in multiple publications, identifying significant new prognostic markers in colorectal, upper gastro-intestinal and breast cancer. Data generated using RandomSpot also has significant value for training image analysis algorithms using sample point coordinates and pathologist classifications.
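
    The equidistant-grid-with-random-offset placement that underlies such SRS tools can be sketched in a few lines; the rectangular region and spacing here are arbitrary, and the code illustrates the principle rather than RandomSpot's actual implementation:

      import random

      def srs_points(x0, y0, width, height, spacing):
          """Equidistant sample points with one uniform random grid offset,
          which is what makes the resulting sample unbiased."""
          ox, oy = random.uniform(0, spacing), random.uniform(0, spacing)
          points, y = [], y0 + oy
          while y < y0 + height:
              x = x0 + ox
              while x < x0 + width:
                  points.append((x, y))
                  x += spacing
              y += spacing
          return points

      pts = srs_points(0, 0, 2000, 1000, 150)   # e.g. pixel coordinates on a slide
      print(len(pts), pts[:3])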

  13. [Methodological quality and reporting quality evaluation of randomized controlled trials published in China Journal of Chinese Materia Medica].

    Science.gov (United States)

    Yu, Dan-Dan; Xie, Yan-Ming; Liao, Xing; Zhi, Ying-Jie; Jiang, Jun-Jie; Chen, Wei

    2018-02-01

    To evaluate the methodological quality and reporting quality of randomized controlled trials (RCTs) published in China Journal of Chinese Materia Medica, we searched CNKI and the China Journal of Chinese Materia Medica webpage to collect RCTs published since the establishment of the journal. The Cochrane risk of bias assessment tool was used to evaluate the methodological quality of the RCTs, and the CONSORT 2010 checklist was adopted as the reporting quality evaluation tool. Finally, 184 RCTs were included and evaluated methodologically, of which 97 RCTs were also evaluated for reporting quality. For the methodological evaluation, 62 trials (33.70%) reported the random sequence generation; 9 (4.89%) trials reported allocation concealment; 25 (13.59%) trials adopted a method of blinding; 30 (16.30%) trials reported the number of patients withdrawing, dropping out and lost to follow-up; 2 trials (1.09%) reported trial registration and none of the trials reported a trial protocol; only 8 (4.35%) trials reported the sample size estimation in detail. For the reporting quality appraisal, 3 of the 25 reporting items were evaluated as high quality: abstract, participant eligibility criteria, and statistical methods; 4 items were of medium quality: purpose, intervention, random sequence method, and data collection sites and locations; 9 items were of low quality: title, background, random sequence type, allocation concealment, blinding, recruitment of subjects, baseline data, harms, and funding; the rest of the items were of extremely low quality (compliance rate below 10%). On the whole, the methodological and reporting quality of RCTs published in the journal is generally low. Further improvement in both the methodological and reporting quality of RCTs of traditional Chinese medicine is warranted. It is recommended that international standards and procedures for RCT design be strictly followed to conduct high-quality trials.

  14. Comparing U.S. Army suicide cases to a control sample: initial data and methodological lessons.

    Science.gov (United States)

    Alexander, Cynthia L; Reger, Mark A; Smolenski, Derek J; Fullerton, Nicole R

    2014-10-01

    Identification of risk and protective factors for suicide is a priority for the United States military, especially in light of the recent steady increase in military suicide rates. The Department of Defense Suicide Event Report contains comprehensive data on suicides for active duty military personnel, but no analogous control data is available to permit identification of factors that differentially determine suicide risk. This proof-of-concept study was conducted to determine the feasibility of collecting such control data. The study employed a prospective case-control design in which control cases were randomly selected from a large Army installation at a rate of four control participants for every qualifying Army suicide. Although 111 Army suicides were confirmed during the study period, just 27 control soldiers completed the study. Despite the small control sample, preliminary analyses comparing suicide cases to controls identified several factors more frequently reported for suicide cases, including recent failed intimate relationships, outpatient mental health history, mood disorder diagnosis, substance abuse history, and prior self-injury. No deployment-related risk factors were found. These data are consistent with existing literature and form a foundation for larger control studies. Methodological lessons learned regarding study design and recruitment are discussed to inform future studies. Reprint & Copyright © 2014 Association of Military Surgeons of the U.S.

  15. CONSISTENCY UNDER SAMPLING OF EXPONENTIAL RANDOM GRAPH MODELS.

    Science.gov (United States)

    Shalizi, Cosma Rohilla; Rinaldo, Alessandro

    2013-04-01

    The growing availability of network data and of scientific interest in distributed systems has led to the rapid development of statistical models of network structure. Typically, however, these are models for the entire network, while the data consist only of a sampled sub-network. Parameters for the whole network, which is what is of interest, are estimated by applying the model to the sub-network. This assumes that the model is consistent under sampling, or, in terms of the theory of stochastic processes, that it defines a projective family. Focusing on the popular class of exponential random graph models (ERGMs), we show that this apparently trivial condition is in fact violated by many popular and scientifically appealing models, and that satisfying it drastically limits ERGMs' expressive power. These results are actually special cases of more general results about exponential families of dependent random variables, which we also prove. Using such results, we offer easily checked conditions for the consistency of maximum likelihood estimation in ERGMs, and discuss some possible constructive responses.

  16. Review of Recent Methodological Developments in Group-Randomized Trials: Part 2-Analysis.

    Science.gov (United States)

    Turner, Elizabeth L; Prague, Melanie; Gallis, John A; Li, Fan; Murray, David M

    2017-07-01

    In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have updated that review with developments in analysis of the past 13 years, with a companion article to focus on developments in design. We discuss developments in the topics of the earlier review (e.g., methods for parallel-arm GRTs, individually randomized group-treatment trials, and missing data) and in new topics, including methods to account for multiple-level clustering and alternative estimation methods (e.g., augmented generalized estimating equations, targeted maximum likelihood, and quadratic inference functions). In addition, we describe developments in analysis of alternative group designs (including stepped-wedge GRTs, network-randomized trials, and pseudocluster randomized trials), which require clustering to be accounted for in their design and analysis.

  17. Methodology for Speech Assessment in the Scandcleft Project-An International Randomized Clinical Trial on Palatal Surgery

    DEFF Research Database (Denmark)

    Willadsen, Elisabeth

    2009-01-01

    Objective: To present the methodology for speech assessment in the Scandcleft project and discuss issues from a pilot study. Design: Description of methodology and blinded test for speech assessment. Speech samples and instructions for data collection and analysis for comparisons of speech outcomes across five included languages were developed and tested. Participants and Materials: Randomly selected video recordings of 10 5-year-old children from each language (n = 50) were included in the project. Speech material consisted of test consonants in single words, connected speech, and syllable chains. … -sum and the overall rating of VPC was 78%. Conclusions: Pooling data of speakers of different languages in the same trial and comparing speech outcome across trials seems possible if the assessment of speech concerns consonants and is confined to speech units that are phonetically similar across languages. Agreed …

  18. A Table-Based Random Sampling Simulation for Bioluminescence Tomography

    Directory of Open Access Journals (Sweden)

    Xiaomeng Zhang

    2006-01-01

    Full Text Available As a popular simulation of photon propagation in turbid media, the main problem of the Monte Carlo (MC) method is its cumbersome computation. In this work a table-based random sampling simulation (TBRS) is proposed. The key idea of TBRS is to simplify multiple steps of scattering to a single-step process through random table querying, thus greatly reducing the computing complexity of the conventional MC algorithm and expediting the computation. The TBRS simulation is a fast algorithm for the conventional MC simulation of photon propagation. It retains the flexibility and accuracy of the conventional MC method and adapts well to complex geometric media and various source shapes. Both MC simulations were conducted in a homogeneous medium in our work. Also, we present a reconstruction approach to estimate the position of the fluorescent source, based on trial and error, as a validation of the TBRS algorithm. Good agreement is found between the conventional MC simulation and the TBRS simulation.
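
    The table-querying idea can be shown generically with a precomputed inverse-CDF lookup table; the exponential step-length distribution below is only a stand-in for whatever single-step or collapsed multi-step distribution an actual TBRS implementation would tabulate:

      import math, random

      MU_T = 10.0   # illustrative total interaction coefficient (1/cm)
      N = 4096
      # Precompute step lengths s = -ln(1 - u) / mu_t at the midpoints of N bins.
      TABLE = [-math.log(1.0 - (i + 0.5) / N) / MU_T for i in range(N)]

      def sample_step():
          # One uniform draw and one table query; no per-sample log evaluation.
          return TABLE[int(random.random() * N)]

      steps = [sample_step() for _ in range(100_000)]
      print(sum(steps) / len(steps), "vs", 1.0 / MU_T)   # mean approaches 1/mu_t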

  19. Determination of Initial Conditions for the Safety Analysis by Random Sampling of Operating Parameters

    International Nuclear Information System (INIS)

    Jeong, Hae-Yong; Park, Moon-Ghu

    2015-01-01

    In most existing evaluation methodologies, which follow a conservative approach, the most conservative initial conditions are searched for each transient scenario through extensive assessment of wide operating windows or the limiting conditions for operation (LCO) allowed by the operating guidelines. In this procedure, a user effect could be involved and considerable time and human resources are consumed. In the present study, we investigated a more effective statistical method for the selection of the most conservative initial condition: random sampling of the operating parameters affecting the initial conditions. A method for the determination of initial conditions based on random sampling of plant design parameters is proposed. This method is expected to be applied to the selection of the most conservative initial plant conditions in safety analysis using a conservative evaluation methodology. In the method, it is suggested that the initial conditions of reactor coolant flow rate, pressurizer level, pressurizer pressure, and SG level are adjusted by controlling the pump rated flow and the setpoints of the PLCS, PPCS, and FWCS, respectively. The proposed technique is expected to help eliminate the human factors introduced in the conventional safety analysis procedure and also to reduce the human resources invested in the safety evaluation of nuclear power plants.
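
    Schematically, the proposed selection reduces to drawing candidate initial conditions uniformly within their allowed operating bands and retaining the most conservative candidate under some figure of merit. The bands and the merit function below are placeholders, not plant data or the authors' actual criteria:

      import random

      BANDS = {                       # illustrative operating windows only
          "rcs_flow":  (0.96, 1.04),  # fraction of rated flow
          "pzr_level": (0.50, 0.60),  # fraction of span
          "pzr_press": (15.2, 15.7),  # MPa
          "sg_level":  (0.45, 0.55),  # fraction of span
      }

      def draw_condition():
          return {k: random.uniform(lo, hi) for k, (lo, hi) in BANDS.items()}

      def merit(c):
          # Placeholder figure of merit: pretend low flow plus high pressure
          # is most limiting for the transient under consideration.
          return -c["rcs_flow"] + c["pzr_press"] / 15.7

      candidates = [draw_condition() for _ in range(10_000)]
      print(max(candidates, key=merit))   # most conservative sampled condition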

  20. A systematic examination of a random sampling strategy for source apportionment calculations.

    Science.gov (United States)

    Andersson, August

    2011-12-15

    Estimating the relative contributions from multiple potential sources of a specific component in a mixed environmental matrix is a general challenge in diverse fields such as atmospheric, environmental and earth sciences. Perhaps the most common strategy for tackling such problems is to set up a system of linear equations for the fractional influence of the different sources. Even though an algebraic solution of this approach is possible for the common situation with N+1 sources and N source markers, such methodology introduces a bias, since it is implicitly assumed that the calculated fractions and the corresponding uncertainties are independent of the variability of the source distributions. Here, a random sampling (RS) strategy for accounting for such statistical bias is examined by investigating rationally designed synthetic data sets. This random sampling methodology is found to be robust and accurate with respect to reproducibility and predictability. The method is also compared to a numerical integration solution for a two-source situation where source variability is included as well. A general observation from this examination is that the variability of the source profiles affects not only the calculated precision but also the mean/median source contributions. Copyright © 2011 Elsevier B.V. All rights reserved.
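
    In the N+1 sources with N markers situation, each random draw of the source end-member values gives a solvable linear system for the fractions, and repeating the draw propagates source variability into the fraction estimates. A two-source, single-marker sketch with invented end-member distributions:

      import random

      # Two sources, one marker: f1*m1 + (1 - f1)*m2 = m_mix.
      def rs_fractions(mix=0.45, n_draws=100_000):
          f1s = []
          for _ in range(n_draws):
              m1 = random.gauss(0.2, 0.03)   # source 1 marker value (invented)
              m2 = random.gauss(0.7, 0.05)   # source 2 marker value (invented)
              f1 = (mix - m2) / (m1 - m2)
              if 0.0 <= f1 <= 1.0:           # keep physically meaningful draws
                  f1s.append(f1)
          f1s.sort()
          n = len(f1s)
          return f1s[n // 2], (f1s[int(0.025 * n)], f1s[int(0.975 * n)])

      print(rs_fractions())   # median fraction of source 1 and a 95% interval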

  1. Methodological issues affecting the study of fish parasites. II. Sampling method affects ectoparasite studies

    Czech Academy of Sciences Publication Activity Database

    Kvach, Yuriy; Ondračková, Markéta; Janáč, Michal; Jurajda, Pavel

    2016-01-01

    Roč. 121, č. 1 (2016), s. 59-66 ISSN 0177-5103 R&D Projects: GA ČR GBP505/12/G112 Institutional support: RVO:68081766 Keywords : Parasite community * Fish sampling method * Methodology * Parasitological examination * Rutilus rutilus Subject RIV: EG - Zoology Impact factor: 1.549, year: 2016

  2. Radioimmunoassay of h-TSH - methodological suggestions for dealing with medium to large numbers of samples

    International Nuclear Information System (INIS)

    Mahlstedt, J.

    1977-01-01

    The article deals with practical aspects of establishing a TSH-RIA for patients, with particular regard to predetermined quality criteria. Methodological suggestions are made for medium to large numbers of samples, with the aim of reducing monotonous precision work steps by means of simple aids. The required quality criteria are well met, while the test procedure is well adapted to the rhythm of work and may be carried out without loss of precision even with large numbers of samples. (orig.) [de

  3. Different methodologies in neutron activation to approach the full analysis of environmental and nutritional samples

    International Nuclear Information System (INIS)

    Freitas, M.C.; Dionisio, I.; Dung, H.M.

    2008-01-01

    Different methodologies of neutron activation analysis (NAA) are now available at the Technological and Nuclear Institute (Sacavem, Portugal), namely Compton suppression, epithermal activation, replicate and cyclic activation, and low energy photon measurement. Prompt gamma activation analysis (PGAA) will be implemented soon. Results by instrumental NAA and PGAA on environmental and nutritional samples are discussed herein, showing that PGAA - carried out at the Institute of Isotope Research (Budapest, Hungary) - provides effective input for assessing the relevant elements. Sensitivity enhancement in NAA by Compton suppression is also illustrated. Through a judicious combination of methodologies, practically all elements of interest in pollution and nutrition terms can be determined. (author)

  4. The U-tube sampling methodology and real-time analysis of geofluids

    International Nuclear Information System (INIS)

    Freifeld, Barry; Perkins, Ernie; Underschultz, James; Boreham, Chris

    2009-01-01

    The U-tube geochemical sampling methodology, an extension of the porous cup technique proposed by Wood (1973), provides minimally contaminated aliquots of multiphase fluids from deep reservoirs and allows for accurate determination of dissolved gas composition. The initial deployment of the U-tube during the Frio Brine Pilot CO2 storage experiment, Liberty County, Texas, obtained representative samples of brine and supercritical CO2 from a depth of 1.5 km. A quadrupole mass spectrometer provided real-time analysis of dissolved gas composition. Since the initial demonstration, the U-tube has been deployed for (1) sampling of fluids down gradient of the proposed Yucca Mountain High-Level Waste Repository, Amargosa Valley, Nevada, (2) acquiring fluid samples beneath permafrost in Nunavut Territory, Canada, and (3) a CO2 storage demonstration project within a depleted gas reservoir, Otway Basin, Victoria, Australia. The addition of in-line high-pressure pH and EC sensors allows for continuous monitoring of fluid during sample collection. Difficulties have arisen during U-tube sampling, such as blockage of sample lines by naturally occurring waxes or by freezing conditions; however, workarounds such as solvent flushing or heating have been used to address these problems. The U-tube methodology has proven to be robust and, with careful consideration of the constraints and limitations, can provide high quality geochemical samples.

  5. Methodological reporting of randomized controlled trials in major hepato-gastroenterology journals in 2008 and 1998: a comparative study

    Science.gov (United States)

    2011-01-01

    Background It was still unclear whether the methodological reporting quality of randomized controlled trials (RCTs) in major hepato-gastroenterology journals improved after the Consolidated Standards of Reporting Trials (CONSORT) Statement was revised in 2001. Methods RCTs in five major hepato-gastroenterology journals published in 1998 or 2008 were retrieved from MEDLINE using a high-sensitivity search method, and the reporting quality of their methodological details was evaluated based on the CONSORT Statement and the Cochrane Handbook for Systematic Reviews of Interventions. Changes in methodological reporting quality between 2008 and 1998 were calculated as risk ratios with 95% confidence intervals. Results A total of 107 RCTs published in 2008 and 99 RCTs published in 1998 were found. Compared to 1998, the proportion of RCTs that reported sequence generation (RR, 5.70; 95% CI 3.11-10.42), allocation concealment (RR, 4.08; 95% CI 2.25-7.39), sample size calculation (RR, 3.83; 95% CI 2.10-6.98), incomplete outcome data addressed (RR, 1.81; 95% CI 1.03-3.17), and intention-to-treat analyses (RR, 3.04; 95% CI 1.72-5.39) increased in 2008. Blinding and intention-to-treat analysis were reported better in multi-center trials than in single-center trials. The reporting of allocation concealment and blinding was better in industry-sponsored trials than in publicly funded trials. Compared with historical studies, the methodological reporting quality improved with time. Conclusion Although the reporting of several important methodological aspects improved in 2008 compared with 1998, which may indicate that researchers had increased awareness of and compliance with the revised CONSORT statement, some items were still reported badly. There is much room for future improvement. PMID:21801429

  6. Municipal solid waste composition: Sampling methodology, statistical analyses, and case study evaluation

    DEFF Research Database (Denmark)

    Edjabou, Vincent Maklawe Essonanawe; Jensen, Morten Bang; Götze, Ramona

    2015-01-01

    Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in literature. This limits both comparability and applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single and multi-family house areas). In total 17 tonnes of waste were sorted into 10-50 waste fractions, organised according to a three-level (tiered approach) facilitating comparison of the waste data between individual sub-areas with different fractionation (waste …

  7. [Methodological quality evaluation of randomized controlled trials for traditional Chinese medicines for treatment of sub-health].

    Science.gov (United States)

    Zhao, Jun; Liao, Xing; Zhao, Hui; Li, Zhi-Geng; Wang, Nan-Yue; Wang, Li-Min

    2016-11-01

    To evaluate the methodological quality of randomized controlled trials (RCTs) of traditional Chinese medicines for the treatment of sub-health, in order to provide a scientific basis for the improvement of clinical trials and systematic reviews. Databases including CNKI, CBM, VIP, Wanfang, EMbase, Medline, Clinical Trials, Web of Science and the Cochrane Library were searched for RCTs of traditional Chinese medicines for the treatment of sub-health published between database establishment and February 29, 2016. The Cochrane Handbook 5.1 was used to screen literature and extract data, and the CONSORT statement and the CONSORT for traditional Chinese medicine statement were adopted as the basis for quality evaluation. Among the 72 RCTs included in this study, 67 (93.05%) trials described inter-group baseline data comparability, 39 (54.17%) trials described unified diagnostic criteria, 28 (38.89%) trials described unified standards of efficacy, 4 (5.55%) trials mentioned a multi-center study, 19 (26.38%) trials disclosed the random allocation method, 6 (8.33%) trials used random allocation concealment, 15 (20.83%) trials adopted a method of blinding, 3 (4.17%) trials reported the sample size estimation in detail, 5 (6.94%) trials had a sample size of more than two hundred, 19 (26.38%) trials reported the numbers of withdrawals, drop-outs and those lost to follow-up, but only 2 trials adopted ITT analysis, 10 (13.89%) trials reported follow-up results, none of the trials reported trial registration or a trial protocol, 48 (66.7%) trials reported all of the expected outcome indicators, 26 (36.11%) trials reported adverse reactions and adverse events, and 4 (5.56%) trials reported patient compliance. The overall quality of these randomized controlled trials of traditional Chinese medicines for the treatment of sub-health is low, with methodological defects of varying degrees. Therefore, it is still necessary to emphasize the correct application of principles …

  8. APPLICATION OF LOT QUALITY ASSURANCE SAMPLING FOR ASSESSING DISEASE CONTROL PROGRAMMES - EXAMINATION OF SOME METHODOLOGICAL ISSUES

    OpenAIRE

    T. R. RAMESH RAO

    2011-01-01

    Lot Quality Assurance Sampling (LQAS), a statistical tool from industrial settings, has been in use since 1980 for monitoring and evaluating programs on disease control, immunization status among children, and health workers' performance in health systems. When conducting LQAS in the field there are occasions, even after due care in design, where practical and methodological issues need to be addressed before it can be recommended for implementation and intervention. LQAS is applied under the assumpti...

  9. Methodological effects in Fourier transform infrared (FTIR) spectroscopy: Implications for structural analyses of biomacromolecular samples

    Science.gov (United States)

    Kamnev, Alexander A.; Tugarova, Anna V.; Dyatlova, Yulia A.; Tarantilis, Petros A.; Grigoryeva, Olga P.; Fainleib, Alexander M.; De Luca, Stefania

    2018-03-01

    A set of experimental data obtained by Fourier transform infrared (FTIR) spectroscopy (involving the use of samples ground and pressed with KBr, i.e. in a polar halide matrix) and by matrix-free transmission FTIR or diffuse reflectance infrared Fourier transform (DRIFT) spectroscopic methodologies (involving measurements of thin films or pure powdered samples, respectively) were compared for several different biomacromolecular substances. The samples under study included poly-3-hydroxybutyrate (PHB) isolated from cell biomass of the rhizobacterium Azospirillum brasilense; dry PHB-containing A. brasilense biomass; pectin (natural carboxylated heteropolysaccharide of plant origin; obtained from apple peel) as well as its chemically modified derivatives obtained by partial esterification of its galacturonide-chain hydroxyl moieties with palmitic, oleic and linoleic acids. Significant shifts of some FTIR vibrational bands related to polar functional groups of all the biomacromolecules under study, induced by the halide matrix used for preparing the samples for spectroscopic measurements, were shown and discussed. A polar halide matrix used for preparing samples for FTIR measurements was shown to be likely to affect band positions not only per se, by affecting band energies or via ion exchange (e.g., with carboxylate moieties), but also by inducing crystallisation of metastable amorphous biopolymers (e.g., PHB of microbial origin). The results obtained have important implications for correct structural analyses of polar, H-bonded and/or amphiphilic biomacromolecular systems using different methodologies of FTIR spectroscopy.

  10. How effective is the comprehensive approach to rehabilitation (CARe) methodology? A cluster randomized controlled trial.

    Science.gov (United States)

    Bitter, Neis; Roeg, Diana; van Assen, Marcel; van Nieuwenhuizen, Chijs; van Weeghel, Jaap

    2017-12-11

    The CARe methodology aims to improve the quality of life of people with severe mental illness by supporting them in realizing their goals, handling their vulnerability and improving the quality of their social environment. This study aims to investigate the effectiveness of the CARe methodology for people with severe mental illness on their quality of life, personal recovery, participation, hope, empowerment, self-efficacy beliefs and unmet needs. A cluster Randomized Controlled Trial (RCT) was conducted in 14 teams of three organizations for sheltered and supported housing in the Netherlands. Teams in the intervention group received training in the CARe methodology. Teams in the control group continued working according to care as usual. Questionnaires were filled out at baseline, after 10 months and after 20 months. A total of 263 clients participated in the study. Quality of life increased in both groups; however, no differences between the intervention and control group were found. Recovery and social functioning did not change over time. Regarding the secondary outcomes, the number of unmet needs decreased in both groups. All intervention teams received the complete training program. Model fidelity at T1 was 53.4% for the intervention group and 33.4% for the control group; at T2 it was 50.6% for the intervention group and 37.2% for the control group. All clients improved in quality of life, but we did not find significant differences between the clients in the two conditions on any outcome measure. Possible explanations for these results are: the difficulty of implementing rehabilitation-supporting practice, the content of the methodology, and the difficulty of improving the lives of a group of people with longstanding and severe impairments in a relatively short period. More research is needed on how to improve the effects of rehabilitation training in practice and at the outcome level. ISRCTN77355880, retrospectively registered (05/07/2013).

  11. Lumbar Sympathetic Plexus Block as a Treatment for Postamputation Pain: Methodology for a Randomized Controlled Trial.

    Science.gov (United States)

    McCormick, Zachary L; Hendrix, Andrew; Dayanim, David; Clay, Bryan; Kirsling, Amy; Harden, Norman

    2018-03-08

    We present a technical protocol for rigorous assessment of patient-reported outcomes and psychophysical testing relevant to lumbar sympathetic blocks for the treatment of postamputation pain (PAP). This description is intended to inform future prospective investigation. A series of four participants from a blinded, randomized, sham-controlled trial. Tertiary, urban, academic pain medicine center. Four participants with a single lower limb amputation and associated chronic PAP. Participants were randomized to receive a lumbar sympathetic block with 0.25% bupivacaine or sham needle placement. Patient-rated outcome measures included the numerical rating scale (NRS) for pain, the McGill Pain Questionnaire-Short Form, the Center for Epidemiological Studies Depression Scale, the Pain and Anxiety Symptoms Scale-short version, and the Pain Disability Index (PDI). Psychophysical and biometric testing was also performed, including vibration, pinprick, and brush sensation testing, Von Frey repeated weighted pinprick sensation, and thermal quantitative sensory testing. In the four described cases, treatment of PAP with a single lumbar sympathetic block, but not sham intervention, resulted in reduction of both residual limb pain and phantom limb pain as well as perceived disability on the PDI at three-month follow-up. An appropriately powered randomized controlled study using this methodology may not only aid in determining the possible clinical efficacy of lumbar sympathetic block in PAP, but could also improve our understanding of the underlying pathophysiologic mechanisms of PAP.

  12. Sampling Polya-Gamma random variates: alternate and approximate techniques

    OpenAIRE

    Windle, Jesse; Polson, Nicholas G.; Scott, James G.

    2014-01-01

    Efficiently sampling from the Pólya-Gamma distribution, PG(b, z), is an essential element of Pólya-Gamma data augmentation. Polson et al. (2013) show how to efficiently sample from the PG(1, z) distribution. We build two new samplers that offer improved performance when sampling from the PG(b, z) distribution and b is not unity.
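
    One crude route to an approximate PG(b, z) draw, distinct from the samplers proposed in the paper, is to truncate the distribution's infinite sum-of-gammas representation; the truncation level below is an arbitrary choice:

      import math, random

      def pg_approx(b, z, terms=200):
          """Truncated sum-of-gammas approximation to a PG(b, z) draw:
          X = (1 / (2 pi^2)) * sum_k g_k / ((k - 1/2)^2 + z^2 / (4 pi^2)),
          with g_k ~ Gamma(b, 1) independent."""
          c = z * z / (4.0 * math.pi ** 2)
          s = sum(random.gammavariate(b, 1.0) / ((k - 0.5) ** 2 + c)
                  for k in range(1, terms + 1))
          return s / (2.0 * math.pi ** 2)

      draws = [pg_approx(1.0, 0.0) for _ in range(20_000)]
      print(sum(draws) / len(draws))   # should be near E[PG(1, 0)] = 1/4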

  13. Methodological integrative review of the work sampling technique used in nursing workload research.

    Science.gov (United States)

    Blay, Nicole; Duffield, Christine M; Gallagher, Robyn; Roche, Michael

    2014-11-01

    To critically review the work sampling technique used in nursing workload research. Work sampling is a technique frequently used by researchers and managers to explore and measure nursing activities. However, work sampling methods used are diverse making comparisons of results between studies difficult. Methodological integrative review. Four electronic databases were systematically searched for peer-reviewed articles published between 2002-2012. Manual scanning of reference lists and Rich Site Summary feeds from contemporary nursing journals were other sources of data. Articles published in the English language between 2002-2012 reporting on research which used work sampling to examine nursing workload. Eighteen articles were reviewed. The review identified that the work sampling technique lacks a standardized approach, which may have an impact on the sharing or comparison of results. Specific areas needing a shared understanding included the training of observers and subjects who self-report, standardization of the techniques used to assess observer inter-rater reliability, sampling methods and reporting of outcomes. Work sampling is a technique that can be used to explore the many facets of nursing work. Standardized reporting measures would enable greater comparison between studies and contribute to knowledge more effectively. Author suggestions for the reporting of results may act as guidelines for researchers considering work sampling as a research method. © 2014 John Wiley & Sons Ltd.

  14. Trends in analytical methodologies for the determination of alkylphenols and bisphenol A in water samples.

    Science.gov (United States)

    Salgueiro-González, N; Muniategui-Lorenzo, S; López-Mahía, P; Prada-Rodríguez, D

    2017-04-15

    In the last decade, the impact of alkylphenols and bisphenol A in the aquatic environment has been widely evaluated because of their extensive use in industrial and household applications as well as their toxicological effects. These compounds are well-known endocrine disrupting compounds (EDCs) which can affect the hormonal systems of humans and wildlife, even at low concentrations. Because these pollutants enter the environment through water, which is the most affected compartment, analytical methods which allow the determination of these compounds in aqueous samples at low levels are mandatory. In this review, an overview of the most significant advances in the analytical methodologies for the determination of alkylphenols and bisphenol A in waters is presented (from 2002 to the present). Sample handling and instrumental detection strategies are critically discussed, including analytical parameters related to quality assurance and quality control (QA/QC). Special attention is paid to miniaturized sample preparation methodologies and to approaches proposed to reduce time and reagent consumption according to Green Chemistry principles, which have increased in the last five years. Finally, relevant applications of these methods to the analysis of water samples are examined, with wastewater and surface water being the most investigated. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Estimates of Inequality Indices Based on Simple Random, Ranked Set, and Systematic Sampling

    OpenAIRE

    Bansal, Pooja; Arora, Sangeeta; Mahajan, Kalpana K.

    2013-01-01

    Gini index, Bonferroni index, and Absolute Lorenz index are some popular indices of inequality showing different features of inequality measurement. In general, the simple random sampling procedure is commonly used to estimate the inequality indices and their related inference. Although the key condition that the samples must be drawn via a simple random sampling procedure makes calculations much simpler, this assumption is often violated in practice, as the data do not always yield a simple random ...
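
    For reference, the Gini index of a simple random sample can be computed with the sorted-data identity G = sum_i (2i - n - 1) x_(i) / (n * sum_i x_i); the synthetic income population below is purely illustrative:

      import random

      def gini(xs):
          """Gini index via the sorted-data identity (ascending order)."""
          xs = sorted(xs)
          n = len(xs)
          return sum((2 * i - n - 1) * x for i, x in enumerate(xs, 1)) / (n * sum(xs))

      population = [random.lognormvariate(10.0, 0.8) for _ in range(100_000)]
      srs = random.sample(population, 1000)
      print(gini(population), gini(srs))   # SRS estimate vs population value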

  16. Study of adolescents exposed in utero. Methodological evaluation of the Nagasaki sample

    Energy Technology Data Exchange (ETDEWEB)

    Hrubec, Zdenek; Noble, K.B.; Burrow, G.N.

    1962-09-12

    Fetal tissues have been shown to be extremely sensitive to ionizing radiation, and therefore a group of children who were exposed in utero are of special interest. When these children entered adolescence, an intensive study was undertaken to determine whether differences not otherwise apparent would be revealed during the stress of this period of rapid growth. The purpose of this report is to describe the sample used to study these adolescent children who were exposed in utero and to provide reference information. The problems of using ex post facto methods as employed in this study have been discussed in detail elsewhere. In summary, the extent to which findings of a retrospective study may be generalized to a larger population can be determined only from a careful and extensive study of the characteristics of the sample and an evaluation of the procedures used in its selection. It is generally recognized that even an extensive methodologic exploration of this kind offers no conclusive proof that a sample is useful for a specific study. In the sample, some variables which may have a considerable effect on the medical data, such as socioeconomic status, have been taken into account only superficially. There is always the possibility that some important, completely unsuspected variables may produce spurious associations. However there is an almost infinite number of such factors which might conceivably affect the data. Vast research resources could be committed to a methodologic evaluation without fulfilling the basic purpose of the study. An approach must be devised which is judged methodologically adequate but which will not tax the research resource to the detriment of the basic objectives. It is hoped that this report will satisfy the requirements of this compromise. 30 references, 36 tables.

  17. Importance sampling of heavy-tailed iterated random functions

    NARCIS (Netherlands)

    B. Chen (Bohan); C.H. Rhee (Chang-Han); A.P. Zwart (Bert)

    2016-01-01

    We consider a stochastic recurrence equation of the form $Z_{n+1} = A_{n+1} Z_n + B_{n+1}$, where $\mathbb{E}[\log A_1]<0$, $\mathbb{E}[\log^+ B_1]<\infty$ and $\{(A_n,B_n)\}_{n\in\mathbb{N}}$ is an i.i.d. sequence of positive random vectors. The stationary distribution of this Markov …
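
    The recursion itself is easy to simulate, and a plain Monte Carlo sketch (with invented A and B distributions satisfying the stated moment conditions) shows why importance sampling is attractive here: tail events {Z > u} are rarely hit by naive simulation:

      import math, random

      def simulate_z(n=500):
          z = 0.0
          for _ in range(n):
              a = math.exp(random.gauss(-0.3, 0.5))   # E[log A] = -0.3 < 0
              b = random.paretovariate(2.5)           # heavy-tailed positive B
              z = a * z + b
          return z   # approximately a draw from the stationary distribution

      u = 50.0
      draws = [simulate_z() for _ in range(10_000)]
      print(sum(z > u for z in draws) / len(draws))   # crude estimate of P(Z > u)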

  18. Analytical Methodologies for the Determination of Endocrine Disrupting Compounds in Biological and Environmental Samples

    Directory of Open Access Journals (Sweden)

    Zoraida Sosa-Ferrera

    2013-01-01

    Full Text Available Endocrine-disruptor compounds (EDCs) can mimic natural hormones and produce adverse effects in endocrine functions by interacting with estrogen receptors. EDCs include both natural and synthetic chemicals, such as hormones, personal care products, surfactants, and flame retardants, among others. EDCs are characterised by their ubiquitous presence at trace-level concentrations and their wide diversity. Since the discovery of the adverse effects of these pollutants on wildlife and human health, analytical methods have been developed for their qualitative and quantitative determination. In particular, mass-based analytical methods show excellent sensitivity and precision for their quantification. This paper reviews recently published analytical methodologies for the sample preparation and determination of these compounds in different environmental and biological matrices by liquid chromatography coupled with mass spectrometry. The various sample preparation techniques are compared and discussed. In addition, recent developments and advances in this field are presented.

  19. Methodology for sample preparation and size measurement of commercial ZnO nanoparticles

    Directory of Open Access Journals (Sweden)

    Pei-Jia Lu

    2018-04-01

    Full Text Available This study discusses strategies for sample preparation to acquire images with sufficient quality for size characterization by scanning electron microscopy (SEM), using two commercial ZnO nanoparticles of different surface properties as a demonstration. The central idea is that micrometer-sized aggregates of ZnO in powdered form need first to be broken down to nanosized particles through an appropriate process to generate a nanoparticle dispersion before being deposited on a flat surface for SEM observation. Analytical tools such as contact angle, dynamic light scattering and zeta potential measurements have been utilized to optimize the procedure for sample preparation and to check the quality of the results. Meanwhile, measurements of zeta potential values on flat surfaces also provide critical information and save considerable time and effort in the selection of a suitable substrate for particles of different properties to be attracted and kept on the surface without further aggregation. This simple, low-cost methodology can be applied generally to size characterization of commercial ZnO nanoparticles with limited information from vendors. Keywords: Zinc oxide, Nanoparticles, Methodology

  20. STATISTICAL LANDMARKS AND PRACTICAL ISSUES REGARDING THE USE OF SIMPLE RANDOM SAMPLING IN MARKET RESEARCHES

    Directory of Open Access Journals (Sweden)

    CODRUŢA DURA

    2010-01-01

    Full Text Available The sample represents a particular segment of the statistical population chosen to represent it as a whole. The representativeness of the sample determines the accuracy for estimations made on the basis of calculating the research indicators and the inferential statistics. The method of random sampling is part of the probabilistic methods which can be used within marketing research and it is characterized by the fact that it imposes the requirement that each unit belonging to the statistical population should have an equal chance of being selected for the sampling process. When simple random sampling is meant to be rigorously put into practice, it is recommended to use the technique of random number tables in order to configure the sample which will provide the information that the marketer needs. The paper also details the practical procedure implemented in order to create a sample for a marketing research by generating random numbers using the facilities offered by Microsoft Excel.
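
    Programmatically, the random-number-table technique amounts to drawing without replacement with equal inclusion probabilities; a compact sketch with a hypothetical sampling frame:

      import random

      frame = [f"respondent_{i:05d}" for i in range(1, 12001)]  # hypothetical frame
      random.seed(2010)                    # fix the draw so it can be documented
      sample = random.sample(frame, 400)   # each unit has an equal chance
      print(sample[:5])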

  1. Correlated random sampling for multivariate normal and log-normal distributions

    International Nuclear Information System (INIS)

    Žerovnik, Gašper; Trkov, Andrej; Kodeli, Ivan A.

    2012-01-01

    A method for correlated random sampling is presented. Representative samples for multivariate normal or log-normal distribution can be produced. Furthermore, any combination of normally and log-normally distributed correlated variables may be sampled to any requested accuracy. Possible applications of the method include sampling of resonance parameters which are used for reactor calculations.
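
    The standard construction behind such correlated sampling is a Cholesky factor of the covariance matrix: a vector z of independent standard normals is mapped to mu + L z, and correlated log-normal variables follow by exponentiating the normal ones. A sketch with an invented mean and covariance:

      import numpy as np

      rng = np.random.default_rng(1)
      mu = np.array([1.0, 2.0, 0.5])
      cov = np.array([[0.10, 0.04, 0.01],
                      [0.04, 0.20, 0.05],
                      [0.01, 0.05, 0.15]])
      L = np.linalg.cholesky(cov)           # cov = L @ L.T

      z = rng.standard_normal((100_000, 3))
      normal_samples = mu + z @ L.T         # correlated multivariate normal
      lognormal_samples = np.exp(normal_samples)  # correlated log-normal

      print(np.cov(normal_samples.T).round(3))    # sample covariance approaches cov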

  2. Effect of passive acoustic sampling methodology on detecting bats after declines from white nose syndrome

    Science.gov (United States)

    Coleman, Laci S.; Ford, W. Mark; Dobony, Christopher A.; Britzke, Eric R.

    2014-01-01

    Concomitant with the emergence and spread of white-nose syndrome (WNS) and the precipitous decline of many bat species in North America, natural resource managers need modified and/or new techniques for bat inventory and monitoring that provide robust occupancy estimates. We used Anabat acoustic detectors to determine the most efficient passive acoustic sampling design for optimizing detection probabilities of multiple bat species in a WNS-impacted environment in New York, USA. Our sampling protocol included six acoustic stations deployed for the entire duration of monitoring, as well as a 4 x 4 grid and five transects of 5-10 acoustic units that were deployed for 6-8-night sample durations during the summers of 2011-2012. We used Program PRESENCE to determine detection probability and site occupancy estimates. Overall, the grid produced the highest detection probabilities for most species because it contained the most detectors and intercepted the greatest spatial area. However, big brown bats (Eptesicus fuscus) and species not impacted by WNS were detected easily regardless of sampling array. Endangered Indiana (Myotis sodalis), little brown (Myotis lucifugus), and tri-colored bats (Perimyotis subflavus) showed declines in detection probabilities over our study, potentially indicative of continued WNS-associated declines. Identification of species presence through efficient methodologies is vital for future conservation efforts as bat populations decline further due to WNS and other factors.

  3. Nonlinear Methodologies for Identifying Seismic Event and Nuclear Explosion Using Random Forest, Support Vector Machine, and Naive Bayes Classification

    Directory of Open Access Journals (Sweden)

    Longjun Dong

    2014-01-01

    Full Text Available The discrimination of seismic events and nuclear explosions is a complex and nonlinear problem. The nonlinear methodologies Random Forests (RF), Support Vector Machines (SVM), and Naïve Bayes Classifier (NBC) were applied to discriminate seismic events. Twenty earthquakes and twenty-seven explosions, described by nine ratios of the energies contained within predetermined "velocity windows" and by calculated distance, were used in the discriminators. Based on leave-one-out cross-validation, ROC curves, and the calculated accuracy on training and test samples, the discriminating performances of RF, SVM, and NBC were discussed and compared. The RF method clearly shows the best predictive power, with a maximum area under the ROC curve of 0.975 among RF, SVM, and NBC. The discriminant accuracies of RF, SVM, and NBC for the test samples are 92.86%, 85.71%, and 92.86%, respectively. It has been demonstrated that the presented RF model can not only identify seismic events automatically with high accuracy, but can also rank the discriminant indicators according to their calculated weights.
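
    With scikit-learn, the paper's comparison reduces to a few lines of leave-one-out cross-validation. The feature matrix below is synthetic noise standing in for the nine energy ratios plus distance, so the printed accuracies will sit near chance rather than near the paper's results:

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import LeaveOneOut, cross_val_score
      from sklearn.naive_bayes import GaussianNB
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      X = rng.normal(size=(47, 10))        # synthetic stand-in features
      y = np.array([0] * 20 + [1] * 27)    # 20 earthquakes, 27 explosions

      for model in (RandomForestClassifier(n_estimators=200, random_state=0),
                    SVC(), GaussianNB()):
          acc = cross_val_score(model, X, y, cv=LeaveOneOut()).mean()
          print(type(model).__name__, round(acc, 3))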

  4. Analytical methodologies for aluminium speciation in environmental and biological samples--a review.

    Science.gov (United States)

    Bi, S P; Yang, X D; Zhang, F P; Wang, X L; Zou, G W

    2001-08-01

    It is recognized that aluminium (Al) is a potential environmental hazard. Acidic deposition has been linked to increased Al concentrations in natural waters. Elevated levels of Al might have serious consequences for biological communities. Of particular interest is the speciation of Al in aquatic environments, because Al toxicity depends on its forms and concentrations. In this paper, advances in analytical methodologies for Al speciation in environmental and biological samples during the past five years are reviewed. Concerns about the specific problems of Al speciation and highlights of some important methods are elucidated in sections devoted to hybrid techniques (HPLC or FPLC coupled with ET-AAS, ICP-AES, or ICP-MS), flow-injection analysis (FIA), nuclear magnetic resonance (27Al NMR), electrochemical analysis, and computer simulation. More than 130 references are cited.

  5. Sampling and analytical methodologies for energy dispersive X-ray fluorescence analysis of airborne particulate matter

    International Nuclear Information System (INIS)

    1993-01-01

    The present document represents an attempt to summarize the most important features of the different forms of ED-XRF as applied to the analysis of airborne particulate matter. It is intended to serve as a set of guidelines for use by participants in the IAEA's own programmes, and other scientists, who are not yet fully experienced in the application of ED-XRF to airborne particulate samples, and who wish either to make a start on using this technique or to improve their existing procedures. The methodologies for sampling described in this document are of rather general applicability. Emphasis is also placed on the sources of error affecting the sampling of airborne particulate matter. The analytical part of the document describes the different forms of ED-XRF and their potential applications. Spectrum evaluation, a key step in X-ray spectrometry, is covered in depth, including discussion of several calibration and peak fitting techniques and computer programs especially designed for this purpose. 148 refs, 25 figs, 13 tabs

  6. Multiresidue determination of pesticides in agricultural soil samples using the QuEChERS extraction methodology

    International Nuclear Information System (INIS)

    Castro Garcia, Consuelo del Pilar

    2011-01-01

    To achieve sustainable agricultural production, different organic and inorganic products are used, among them fertilizers and pesticides. When they are applied, most of the product falls to the ground, generating significant sources of pollution in the areas near the application; depending on the mobility of the pesticide, it can also reach more remote areas. That is why it is important to determine pesticide residues in soil after application, the selection of the extraction method being crucial for the subsequent detection of traces. In the present work the QuEChERS extraction technique was evaluated, a method used for food but modified here for a different and complex matrix like soil, in order to achieve acceptable multi-residue extraction efficiencies for 20 pesticides and their subsequent determination by gas chromatography with electron capture and mass detection. The method was applied to the determination of pesticides in three soil samples from an agricultural site with different slopes. The results indicated that 75% of the pesticides tested had acceptable efficiencies, thus meeting the objective of achieving multiresidue determination of pesticides in agricultural soil samples by the QuEChERS extraction methodology. Besides, the presence of the fungicide penconazole was detected in all three samples, with the highest concentration found in the area with the least slope (V_A_B_A_J_O). (author)

  7. English Language Teaching in Spain: Do Textbooks Comply with the Official Methodological Regulations? A Sample Analysis

    Directory of Open Access Journals (Sweden)

    Aquilino Sánchez

    2009-06-01

    Full Text Available The goal of this paper is to verify to what extent ELT textbooks used in Spanish educational settings comply with the official regulations prescribed, which fully advocate the Communicative Language Teaching Method (CLT). For that purpose, seven representative coursebooks of different educational levels and modalities in Spain – secondary, upper secondary, teenager and adult textbooks – were selected for analysis. A full unit randomly selected from each coursebook was examined through two parameters: the communicative potential of the activities – measured on a scale from 0 to 10 – and the communicative nature of the methodological strategies implemented – measured on a dichotomous scale (yes/no). Global results per educational level point to the prevailing communicative nature of all the materials, which was shown to be above 50%. The remaining non-communicative block was covered by activities focused on the formal features of language (grammar and vocabulary). The resulting degree of dissociation between official regulations and what is actually found in teaching materials may be positive, since the learning of languages is complex and results from the intervention of multiple factors and learning styles, as is evidenced by the professional experience of teachers from different backgrounds and beliefs.

  8. Municipal solid waste composition: Sampling methodology, statistical analyses, and case study evaluation

    International Nuclear Information System (INIS)

    Edjabou, Maklawe Essonanawe; Jensen, Morten Bang; Götze, Ramona; Pivnenko, Kostyantyn; Petersen, Claus; Scheutz, Charlotte; Astrup, Thomas Fruergaard

    2015-01-01

    Highlights: • Tiered approach to waste sorting ensures flexibility and facilitates comparison of solid waste composition data. • Food and miscellaneous wastes are the main fractions contributing to the residual household waste. • Separation of food packaging from food leftovers during sorting is not critical for determination of the solid waste composition. - Abstract: Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in the literature. This limits both the comparability and the applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single- and multi-family house areas). In total, 17 tonnes of waste were sorted into 10–50 waste fractions, organised according to a three-level tiered approach, facilitating comparison of the waste data between individual sub-areas with different fractionation (waste from one municipality was sorted at “Level III”, i.e. in detail, while the two others were sorted only at “Level I”). The results showed that residual household waste mainly contained food waste (42 ± 5%, mass per wet basis) and miscellaneous combustibles (18 ± 3%, mass per wet basis). The residual household waste generation rate in the study areas was 3–4 kg per person per week. Statistical analyses revealed that the waste composition was independent of variations in the waste generation rate. Both waste composition and waste generation rates were statistically similar for each of the three municipalities. While the waste generation rates were similar for each of the two housing types (single

  9. Municipal solid waste composition: Sampling methodology, statistical analyses, and case study evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Edjabou, Maklawe Essonanawe, E-mail: vine@env.dtu.dk [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark); Jensen, Morten Bang; Götze, Ramona; Pivnenko, Kostyantyn [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark); Petersen, Claus [Econet AS, Omøgade 8, 2.sal, 2100 Copenhagen (Denmark); Scheutz, Charlotte; Astrup, Thomas Fruergaard [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark)

    2015-02-15

    Highlights: • Tiered approach to waste sorting ensures flexibility and facilitates comparison of solid waste composition data. • Food and miscellaneous wastes are the main fractions contributing to the residual household waste. • Separation of food packaging from food leftovers during sorting is not critical for determination of the solid waste composition. - Abstract: Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in the literature. This limits both the comparability and the applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single- and multi-family house areas). In total, 17 tonnes of waste were sorted into 10–50 waste fractions, organised according to a three-level tiered approach, facilitating comparison of the waste data between individual sub-areas with different fractionation (waste from one municipality was sorted at “Level III”, i.e. in detail, while the two others were sorted only at “Level I”). The results showed that residual household waste mainly contained food waste (42 ± 5%, mass per wet basis) and miscellaneous combustibles (18 ± 3%, mass per wet basis). The residual household waste generation rate in the study areas was 3–4 kg per person per week. Statistical analyses revealed that the waste composition was independent of variations in the waste generation rate. Both waste composition and waste generation rates were statistically similar for each of the three municipalities. While the waste generation rates were similar for each of the two housing types (single

  10. An alternative procedure for estimating the population mean in simple random sampling

    Directory of Open Access Journals (Sweden)

    Housila P. Singh

    2012-03-01

    Full Text Available This paper deals with the problem of estimating the finite population mean using auxiliary information in simple random sampling. First, we suggest a correction to the mean squared error of the estimator proposed by Gupta and Shabbir [On improvement in estimating the population mean in simple random sampling. Jour. Appl. Statist. 35(5) (2008), pp. 559-566]. We then propose a ratio-type estimator and study its properties in simple random sampling. Numerically, we show that the proposed class of estimators is more efficient than various known estimators, including the Gupta and Shabbir (2008) estimator.
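
A minimal numerical sketch of the idea behind such estimators may help. The snippet below contrasts the plain sample mean with the classical ratio estimator, which exploits a known population mean of an auxiliary variable; the population, correlation structure and sample size are invented for illustration, and the snippet shows the classical estimator rather than the specific class proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical finite population: y is the study variable, x an auxiliary
# variable correlated with y; X_bar (the population mean of x) is assumed known.
N = 10_000
x = rng.gamma(shape=4.0, scale=25.0, size=N)
y = 3.0 * x + rng.normal(0.0, 40.0, size=N)
X_bar = x.mean()

# Simple random sample without replacement
n = 200
idx = rng.choice(N, size=n, replace=False)
y_bar, x_bar = y[idx].mean(), x[idx].mean()

mean_srs = y_bar                      # usual sample-mean estimator
mean_ratio = y_bar * (X_bar / x_bar)  # classical ratio estimator

print(f"true mean      : {y.mean():.2f}")
print(f"SRS estimate   : {mean_srs:.2f}")
print(f"ratio estimate : {mean_ratio:.2f}")
```

When x and y are strongly positively correlated, the ratio estimator typically lands closer to the true mean than the plain sample mean, which is the gain the paper's class of estimators builds on.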

  11. Fast egg collection method greatly improves randomness of egg sampling in Drosophila melanogaster

    DEFF Research Database (Denmark)

    Schou, Mads Fristrup

    2013-01-01

    When obtaining samples for population genetic studies, it is essential that the sampling is random. For Drosophila, one of the crucial steps in sampling experimental flies is the collection of eggs. Here an egg collection method is presented which randomizes the eggs in a water column … and diminishes environmental variance. This method was compared with a traditional egg collection method where eggs are collected directly from the medium. Within each method the observed and expected standard deviations of egg-to-adult viability were compared, whereby the difference in the randomness … and to obtain a representative collection of genotypes, the method presented here is strongly recommended when collecting eggs from Drosophila.

  12. 40 CFR 761.306 - Sampling 1 meter square surfaces by random selection of halves.

    Science.gov (United States)

    2010-07-01

    40 CFR Protection of Environment, § 761.306 – Sampling 1 meter square surfaces by random selection of halves (2010-07-01 edition). (a) Divide each 1 meter square portion where it is necessary to collect a surface wipe test sample into two equal (or as…

  13. What about N? A methodological study of sample-size reporting in focus group studies.

    Science.gov (United States)

    Carlsen, Benedicte; Glenton, Claire

    2011-03-11

    Focus group studies are increasingly published in health related journals, but we know little about how researchers use this method, particularly how they determine the number of focus groups to conduct. The methodological literature commonly advises researchers to follow principles of data saturation, although practical advice on how to do this is lacking. Our objectives were, firstly, to describe the current status of sample-size reporting in focus group studies published in health journals and, secondly, to assess whether and how researchers explain the number of focus groups they carry out. We searched PubMed for studies that had used focus groups and that had been published in open access journals during 2008, and extracted data on the number of focus groups and on any explanation authors gave for this number. We also made a qualitative assessment of the papers with regard to how the number of groups was explained and discussed. We identified 220 papers published in 117 journals. In these papers insufficient reporting of sample sizes was common. The number of focus groups conducted varied greatly (mean 8.4, median 5, range 1 to 96). Thirty-seven (17%) studies attempted to explain the number of groups. Six studies referred to rules of thumb in the literature, three stated that they were unable to organize more groups for practical reasons, while 28 studies stated that they had reached a point of saturation. Among those stating that they had reached a point of saturation, several appeared not to have followed principles from grounded theory, where data collection and analysis form an iterative process until saturation is reached. Studies with high numbers of focus groups did not offer explanations for the number of groups. Too much data as a study weakness was not an issue discussed in any of the reviewed papers. Based on these findings we suggest that journals adopt more stringent requirements for focus group method reporting. The often poor and inconsistent reporting seen in these

  14. What about N? A methodological study of sample-size reporting in focus group studies

    Directory of Open Access Journals (Sweden)

    Glenton Claire

    2011-03-01

    Full Text Available Abstract Background Focus group studies are increasingly published in health related journals, but we know little about how researchers use this method, particularly how they determine the number of focus groups to conduct. The methodological literature commonly advises researchers to follow principles of data saturation, although practical advice on how to do this is lacking. Our objectives were, firstly, to describe the current status of sample-size reporting in focus group studies published in health journals and, secondly, to assess whether and how researchers explain the number of focus groups they carry out. Methods We searched PubMed for studies that had used focus groups and that had been published in open access journals during 2008, and extracted data on the number of focus groups and on any explanation authors gave for this number. We also made a qualitative assessment of the papers with regard to how the number of groups was explained and discussed. Results We identified 220 papers published in 117 journals. In these papers insufficient reporting of sample sizes was common. The number of focus groups conducted varied greatly (mean 8.4, median 5, range 1 to 96). Thirty-seven (17%) studies attempted to explain the number of groups. Six studies referred to rules of thumb in the literature, three stated that they were unable to organize more groups for practical reasons, while 28 studies stated that they had reached a point of saturation. Among those stating that they had reached a point of saturation, several appeared not to have followed principles from grounded theory, where data collection and analysis form an iterative process until saturation is reached. Studies with high numbers of focus groups did not offer explanations for the number of groups. Too much data as a study weakness was not an issue discussed in any of the reviewed papers. Conclusions Based on these findings we suggest that journals adopt more stringent requirements for focus group method

  15. Stratified random sampling plans designed to assist in the determination of radon and radon daughter concentrations in underground uranium mine atmosphere

    International Nuclear Information System (INIS)

    Makepeace, C.E.

    1981-01-01

    Sampling strategies for the monitoring of deleterious agents present in uranium mine air in underground and surface mining areas are described. These methods are designed to prevent overexposure of the lining of the respiratory system of uranium miners to ionizing radiation from radon and radon daughters, and whole body overexposure to external gamma radiation. A detailed description is provided of stratified random sampling monitoring methodology for obtaining baseline data to be used as a reference for subsequent compliance assessment
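
As a rough illustration of the baseline-monitoring idea, the sketch below draws a stratified random sample with proportional allocation across hypothetical mine working areas. The stratum names and sizes are invented for the example and do not come from the report.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical strata: working areas of a mine, each with a number of
# candidate sampling locations (indexed 0..N_h-1). Names/sizes are illustrative.
strata = {"stope_A": 120, "stope_B": 80, "haulage": 50, "shaft": 30}
total = sum(strata.values())
n = 28  # overall monitoring sample size

# Proportional allocation: each stratum's share of n follows its share of
# the population of locations (at least one sample per stratum).
allocation = {name: max(1, round(n * N_h / total)) for name, N_h in strata.items()}

samples = {}
for name, N_h in strata.items():
    n_h = allocation[name]
    samples[name] = sorted(rng.choice(N_h, size=n_h, replace=False))

for name, picks in samples.items():
    print(f"{name:8s} n_h={len(picks):2d} locations={picks}")
```

Stratifying by working area guarantees that every area contributes to the baseline, which a single unstratified random draw cannot promise.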

  16. Finding Biomarker Signatures in Pooled Sample Designs: A Simulation Framework for Methodological Comparisons

    Directory of Open Access Journals (Sweden)

    Anna Telaar

    2010-01-01

    Full Text Available Detection of discriminating patterns in gene expression data can be accomplished by using various methods of statistical learning. It has been proposed that sample pooling in this context would have negative effects; however, pooling cannot always be avoided. We propose a simulation framework to explicitly investigate the parameters of patterns, experimental design, noise, and choice of method in order to find out which effects on classification performance are to be expected. We use a two-group classification task and simulated gene expression data with independent differentially expressed genes as well as bivariate linear patterns and the combination of both. Our results show a clear increase of prediction error with pool size. For pooled training sets, powered partial least squares discriminant analysis outperforms discriminant analysis, random forests, and support vector machines with linear or radial kernel for two of three simulated scenarios. The proposed simulation approach can be implemented to systematically investigate a number of additional scenarios of practical interest.
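
A toy version of such a simulation framework can be set up in a few lines. The sketch below generates two-group expression data with a handful of informative genes, forms non-overlapping pools of increasing size within each class, and tracks cross-validated accuracy. It uses a random forest for brevity rather than the powered PLS-DA favoured in the paper, and all dimensions and noise levels are illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def simulate(n_per_group=48, n_genes=200, n_informative=10, shift=1.0):
    """Two-group expression data with a few differentially expressed genes."""
    X = rng.normal(size=(2 * n_per_group, n_genes))
    X[n_per_group:, :n_informative] += shift
    y = np.repeat([0, 1], n_per_group)
    return X, y

def pool(X, y, pool_size):
    """Average non-overlapping groups of samples within each class."""
    Xp, yp = [], []
    for cls in (0, 1):
        Xc = X[y == cls]
        for i in range(0, len(Xc) - pool_size + 1, pool_size):
            Xp.append(Xc[i:i + pool_size].mean(axis=0))
            yp.append(cls)
    return np.array(Xp), np.array(yp)

X, y = simulate()
for k in (1, 2, 4, 8):
    Xp, yp = pool(X, y, k)
    acc = cross_val_score(RandomForestClassifier(n_estimators=200, random_state=0),
                          Xp, yp, cv=3).mean()
    print(f"pool size {k}: n={len(yp):3d}, CV accuracy={acc:.2f}")
```

Accuracy tends to fall as pool size grows, partly because averaging washes out per-sample signal and partly because fewer training examples remain, which is consistent with the paper's reported increase in prediction error.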

  17. Methodologies for the Extraction of Phenolic Compounds from Environmental Samples: New Approaches

    Directory of Open Access Journals (Sweden)

    Cristina Mahugo Santana

    2009-01-01

    Full Text Available Phenolic derivatives are among the most important contaminants present in the environment. These compounds are used in several industrial processes to manufacture chemicals such as pesticides, explosives, drugs and dyes. They are also used in the bleaching process of paper manufacturing. Apart from these sources, phenolic compounds have substantial applications in agriculture as herbicides, insecticides and fungicides. However, phenolic compounds are not only generated by human activity; they are also formed naturally, e.g., during the decomposition of leaves or wood. As a result of these applications, they are found in soils and sediments, and this often leads to wastewater and ground water contamination. Owing to their high toxicity and persistence in the environment, both the US Environmental Protection Agency (EPA) and the European Union have included some of them in their lists of priority pollutants. Current standard methods for the analysis of phenolic compounds in water samples are based on liquid–liquid extraction (LLE), while Soxhlet extraction is the most used technique for isolating phenols from solid matrices. However, these techniques require extensive clean-up procedures that are time-intensive and involve expensive and hazardous organic solvents, which are undesirable for health and disposal reasons. In recent years, the use of new methodologies such as solid-phase extraction (SPE) and solid-phase microextraction (SPME) has increased for the extraction of phenolic compounds from liquid samples. In the case of solid samples, microwave-assisted extraction (MAE) has been demonstrated to be an efficient technique for the extraction of these compounds. In this work we review the methods developed for the extraction and determination of phenolic derivatives in different types of environmental matrices such as water, sediments and soils. Moreover, we present the new approach in the use of micellar media coupled with the SPME process for the

  18. The Hubble Space Telescope Medium Deep Survey Cluster Sample: Methodology and Data

    Science.gov (United States)

    Ostrander, E. J.; Nichol, R. C.; Ratnatunga, K. U.; Griffiths, R. E.

    1998-12-01

    We present a new, objectively selected, sample of galaxy overdensities detected in the Hubble Space Telescope Medium Deep Survey (MDS). These clusters/groups were found using an automated procedure that involved searching for statistically significant galaxy overdensities. The contrast of the clusters against the field galaxy population is increased when morphological data are used to search around bulge-dominated galaxies. In total, we present 92 overdensities above a probability threshold of 99.5%. We show, via extensive Monte Carlo simulations, that at least 60% of these overdensities are likely to be real clusters and groups and not random line-of-sight superpositions of galaxies. For each overdensity in the MDS cluster sample, we provide a richness and the average of the bulge-to-total ratio of galaxies within each system. This MDS cluster sample potentially contains some of the most distant clusters/groups ever detected, with about 25% of the overdensities having estimated redshifts z > ~0.9. We have made this sample publicly available to facilitate spectroscopic confirmation of these clusters and help more detailed studies of cluster and galaxy evolution. We also report the serendipitous discovery of a new cluster close on the sky to the rich optical cluster Cl 0016+16 at z = 0.546. This new overdensity, HST 001831+16208, may be coincident with both an X-ray source and a radio source. HST 001831+16208 is the third cluster/group discovered near Cl 0016+16 and appears to strengthen the claims of Connolly et al. of superclustering at high redshift.
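
The significance of an overdensity against random line-of-sight superpositions can be illustrated with a far simpler Monte Carlo than the MDS pipeline itself. In the hedged sketch below, field galaxies are scattered uniformly over the field and we ask how often an aperture covering a given fraction of the area contains at least the observed count; all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

def overdensity_pvalue(n_field, k_obs, frac_area, n_sim=20_000):
    """Chance probability of an overdensity: for uniformly scattered
    galaxies, the count inside an aperture covering frac_area of the
    field is binomial, so we simulate it and count how often it
    reaches the observed value k_obs."""
    counts = rng.binomial(n_field, frac_area, size=n_sim)
    return (counts >= k_obs).mean()

# Hypothetical field: 300 galaxies, 12 of them inside an aperture
# that covers 1% of the field area.
p = overdensity_pvalue(n_field=300, k_obs=12, frac_area=0.01)
print(f"chance probability ~ {p:.4f}; confidence ~ {100 * (1 - p):.1f}%")
```

An overdensity would clear a 99.5% threshold like the one quoted above when this chance probability falls below 0.5%.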

  19. Field screening sampling and analysis strategy and methodology for the 183-H Solar Evaporation Basins: Phase 2, Soils

    International Nuclear Information System (INIS)

    Antipas, A.; Hopkins, A.M.; Wasemiller, M.A.; McCain, R.G.

    1996-01-01

    This document provides a sampling/analytical strategy and methodology for Resource Conservation and Recovery Act (RCRA) closure of the 183-H Solar Evaporation Basins within the boundaries and requirements identified in the initial Phase II Sampling and Analysis Plan for RCRA Closure of the 183-H Solar Evaporation Basins

  20. Random Sampling of Correlated Parameters – a Consistent Solution for Unfavourable Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Žerovnik, G., E-mail: gasper.zerovnik@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Trkov, A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Kodeli, I.A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Capote, R. [International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Smith, D.L. [Argonne National Laboratory, 1710 Avenida del Mundo, Coronado, CA 92118-3073 (United States)

    2015-01-15

    Two methods for random sampling according to a multivariate lognormal distribution – the correlated sampling method and the method of transformation of correlation coefficients – are briefly presented. The methods are mathematically exact and enable consistent sampling of correlated inherently positive parameters with given information on the first two distribution moments. Furthermore, a weighted sampling method to accelerate the convergence of parameters with extremely large relative uncertainties is described. However, the method is efficient only for a limited number of correlated parameters.
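
The transformation-of-correlation-coefficients idea can be sketched concretely for a multivariate lognormal: if the target parameters have means mu_i, relative uncertainties cv_i and correlations rho_ij, the underlying normal distribution uses covariance S_ij = ln(1 + rho_ij * cv_i * cv_j) and mean m_i = ln(mu_i) - S_ii / 2. This is the standard moment-matching transformation, not necessarily the paper's exact formulation, and the numbers below are illustrative. Note that a strongly negative correlation combined with very large relative uncertainties can drive the logarithm's argument non-positive, one of the "unfavourable conditions" the title alludes to.

```python
import numpy as np

rng = np.random.default_rng(1)

# Target: inherently positive parameters with given means, relative
# uncertainties (coefficients of variation) and correlation matrix.
mu  = np.array([1.0, 5.0, 0.2])         # means
cv  = np.array([0.3, 0.5, 1.5])         # relative standard deviations
rho = np.array([[ 1.0, 0.6, -0.2],
                [ 0.6, 1.0,  0.1],
                [-0.2, 0.1,  1.0]])     # target correlations of the lognormal

# Transformation of correlation coefficients: covariance of the underlying
# normal distribution that reproduces the lognormal moments above.
S = np.log(1.0 + rho * np.outer(cv, cv))
m = np.log(mu) - 0.5 * np.diag(S)

z = rng.multivariate_normal(m, S, size=100_000)
x = np.exp(z)                            # lognormal samples, always positive

print("sample means:", x.mean(axis=0))                    # close to mu
print("sample corr :\n", np.corrcoef(x, rowvar=False))    # close to rho
```

Because the samples are exponentials of Gaussians, positivity is guaranteed by construction, which is the key advantage over naive normal sampling of inherently positive quantities.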

  1. Misrepresenting random sampling? A systematic review of research papers in the Journal of Advanced Nursing.

    Science.gov (United States)

    Williamson, Graham R

    2003-11-01

    This paper discusses the theoretical limitations of the use of random sampling and probability theory in the production of a significance level (or P-value) in nursing research. Potential alternatives, in the form of randomization tests, are proposed. Research papers in nursing, medicine and psychology frequently misrepresent their statistical findings, as the P-values reported assume random sampling. In this systematic review of studies published between January 1995 and June 2002 in the Journal of Advanced Nursing, 89 (68%) studies broke this assumption because they used convenience samples or entire populations. As a result, some of the findings may be questionable. The key ideas of random sampling and probability theory for statistical testing (for generating a P-value) are outlined. The result of a systematic review of research papers published in the Journal of Advanced Nursing is then presented, showing how frequently random sampling appears to have been misrepresented. Useful alternative techniques that might overcome these limitations are then discussed. REVIEW LIMITATIONS: This review is limited in scope because it is applied to one journal, and so the findings cannot be generalized to other nursing journals or to nursing research in general. However, it is possible that other nursing journals are also publishing research articles based on the misrepresentation of random sampling. The review is also limited because in several of the articles the sampling method was not completely clearly stated, and in this circumstance a judgment has been made as to the sampling method employed, based on the indications given by author(s). Quantitative researchers in nursing should be very careful that the statistical techniques they use are appropriate for the design and sampling methods of their studies. If the techniques they employ are not appropriate, they run the risk of misinterpreting findings by using inappropriate, unrepresentative and biased samples.
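
The alternative advocated here, a randomization (permutation) test, is straightforward to implement. The sketch below computes a two-sided p-value for a difference in means by repeatedly relabelling the pooled observations; the group sizes and data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

def randomization_test(a, b, n_perm=10_000):
    """Two-sample randomization test on the difference in means.

    The p-value is the proportion of random relabellings of the pooled
    data that produce a difference at least as extreme as the observed
    one. No random-sampling assumption is needed, only random assignment
    of labels, which suits convenience samples."""
    observed = a.mean() - b.mean()
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        diff = perm[:len(a)].mean() - perm[len(a):].mean()
        count += abs(diff) >= abs(observed)
    return count / n_perm

# Hypothetical convenience sample split into two groups
a = rng.normal(5.0, 1.0, size=30)
b = rng.normal(5.6, 1.0, size=28)
print("p =", randomization_test(a, b))
```

Because the reference distribution is generated from the data themselves, the resulting p-value does not rest on the random-sampling assumption that the review found so often violated.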

  2. Definitions of love in a sample of British women: an empirical study using Q methodology.

    Science.gov (United States)

    Watts, Simon; Stenner, Paul

    2014-09-01

    Social psychological research has increasingly acknowledged that any pretensions to a singular theory of love should be replaced with a concern about its affirmation and what people actually say and do in love's name. Lee's (1977) love styles research and Sternberg's (1995) theory of love as a story are prime examples. Despite traditional definitions of love in western cultures being dominated by feminine images and tales of gender difference, however, the personal definitions and experiences of women have received comparatively little empirical attention, particularly in recent years and despite some well-documented changes in their cultural circumstances. This study remedies that situation through presentation of a Q methodological study in which a convenience sample of 59 British women were asked to Q sort 54 single-word descriptors of love to define love as they had experienced it. Factor analysis of the resulting Q sorts revealed six distinct definitions of love, interpreted as 'attraction, passion & romance', 'unconditional love', 'sex & fun', 'friendship & spirituality', 'a permanent commitment', and 'separate people, separate lives'. The six definitions are then discussed in terms of their allegiance to traditionally feminine and/or masculine values and as a means of highlighting the changing face of Britain's relational culture. © 2013 The British Psychological Society.

  3. Sampling and analytical methodologies for instrumental neutron activation analysis of airborne particulate matter

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1992-12-01

    The IAEA supports a number of projects having to do with the analysis of airborne particulate matter by nuclear techniques. Most of this work involves the use of activation analysis in its various forms, particularly instrumental neutron activation analysis (INAA). This technique has been widely used in many different countries for the analysis of airborne particulate matter, and there are already many publications in scientific journals, books and reports describing such work. The present document represents an attempt to summarize the most important features of INAA as applied to the analysis of airborne particulate matter. It is intended to serve as a set of guidelines for use by participants in the IAEA's own programmes, and other scientists, who are not yet fully experienced in the application of INAA to airborne particulate samples, and who wish either to make a start on using this technique or to improve their existing procedures. The methodologies for sampling described in this document are of rather general applicability, although they are presented here in a way that takes account of the particular requirements arising from the use of INAA as the analytical technique. The analytical part of the document, however, is presented in a form that is applicable only to INAA. (Subsequent publications in this series are expected to deal specifically with other nuclear related techniques such as energy dispersive X ray fluorescence (ED-XRF) and particle induced X ray emission (PIXE) analysis). Although the methods and procedures described here have been found through experience to yield acceptable results, they should not be considered mandatory. Any other procedure used should, however, be chosen to be capable of yielding results at least of equal quality to those described.

  4. Sampling and analytical methodologies for instrumental neutron activation analysis of airborne particulate matter

    International Nuclear Information System (INIS)

    1992-01-01

    The IAEA supports a number of projects having to do with the analysis of airborne particulate matter by nuclear techniques. Most of this work involves the use of activation analysis in its various forms, particularly instrumental neutron activation analysis (INAA). This technique has been widely used in many different countries for the analysis of airborne particulate matter, and there are already many publications in scientific journals, books and reports describing such work. The present document represents an attempt to summarize the most important features of INAA as applied to the analysis of airborne particulate matter. It is intended to serve as a set of guidelines for use by participants in the IAEA's own programmes, and other scientists, who are not yet fully experienced in the application of INAA to airborne particulate samples, and who wish either to make a start on using this technique or to improve their existing procedures. The methodologies for sampling described in this document are of rather general applicability, although they are presented here in a way that takes account of the particular requirements arising from the use of INAA as the analytical technique. The analytical part of the document, however, is presented in a form that is applicable only to INAA. (Subsequent publications in this series are expected to deal specifically with other nuclear related techniques such as energy dispersive X ray fluorescence (ED-XRF) and particle induced X ray emission (PIXE) analysis). Although the methods and procedures described here have been found through experience to yield acceptable results, they should not be considered mandatory. Any other procedure used should, however, be chosen to be capable of yielding results at least of equal quality to those described

  5. Generating Random Samples of a Given Size Using Social Security Numbers.

    Science.gov (United States)

    Erickson, Richard C.; Brauchle, Paul E.

    1984-01-01

    The purposes of this article are (1) to present a method by which social security numbers may be used to draw cluster samples of a predetermined size and (2) to describe procedures used to validate this method of drawing random samples. (JOW)
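
The article itself should be consulted for the exact procedure, but the underlying idea can be sketched as follows: if the terminal digits of social security numbers are approximately uniformly distributed, randomly choosing enough two-digit endings yields a cluster sample of roughly the desired size. The roster, field layout and sizes below are entirely hypothetical.

```python
import random

def ssn_cluster_sample(records, ssn_of, n_target, seed=0):
    """Draw an approximately n_target-sized sample by randomly selecting
    terminal two-digit SSN values; each selected value defines one
    'cluster' of records sharing that ending.

    Assumes the last two digits are close to uniformly distributed, so
    each two-digit value captures about 1% of the population."""
    rng = random.Random(seed)
    frac = n_target / len(records)
    k = max(1, round(frac * 100))             # number of two-digit endings
    chosen = set(rng.sample(range(100), k))
    return [r for r in records if int(ssn_of(r)[-2:]) in chosen]

# Hypothetical roster: (name, ssn) pairs with made-up nine-digit numbers
roster = [(f"person{i}", f"{random.Random(i).randrange(10**9):09d}")
          for i in range(5000)]
sample = ssn_cluster_sample(roster, ssn_of=lambda r: r[1], n_target=250)
print(len(sample), "records selected")
```

The achieved size is only approximately n_target, since each ending captures a slightly different share of the roster; validating that approximation is presumably what the article's second purpose addresses.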

  6. Occupational position and its relation to mental distress in a random sample of Danish residents

    DEFF Research Database (Denmark)

    Rugulies, Reiner Ernst; Madsen, Ida E H; Nielsen, Maj Britt D

    2010-01-01

    PURPOSE: To analyze the distribution of depressive, anxiety, and somatization symptoms across different occupational positions in a random sample of Danish residents. METHODS: The study sample consisted of 591 Danish residents (50% women), aged 20-65, drawn from an age- and gender-stratified random sample of the Danish population. Participants filled out a survey that included the 92-item version of the Hopkins Symptom Checklist (SCL-92). We categorized occupational position into seven groups: high- and low-grade non-manual workers, skilled and unskilled manual workers, high- and low-grade self…

  7. Methodology for Quantitative Analysis of Large Liquid Samples with Prompt Gamma Neutron Activation Analysis using Am-Be Source

    International Nuclear Information System (INIS)

    Idiri, Z.; Mazrou, H.; Beddek, S.; Amokrane, A.

    2009-01-01

    An optimized set-up for prompt gamma neutron activation analysis (PGNAA) with an Am-Be source is described and used for the analysis of large liquid samples. A methodology for quantitative analysis is proposed: it consists of normalizing the prompt gamma count rates with thermal neutron flux measurements carried out with a He-3 detector and gamma attenuation factors calculated using MCNP-5. Both the relative and the absolute methods are considered. This methodology is then applied to the determination of cadmium in industrial phosphoric acid. The same sample is then analyzed by the inductively coupled plasma (ICP) method. Our results are in good agreement with those obtained with the ICP method.
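
A hedged sketch of the relative-method arithmetic described above: prompt gamma count rates are divided by the measured thermal neutron flux and the computed gamma attenuation factor, and the analyte concentration follows by comparison with a standard of known concentration. All numerical values below are invented for illustration.

```python
def normalized_rate(count_rate, neutron_flux, attenuation_factor):
    """Prompt-gamma count rate corrected for the thermal neutron flux
    (e.g. from a He-3 monitor) and for the gamma self-attenuation of
    the sample (e.g. a factor computed with MCNP)."""
    return count_rate / (neutron_flux * attenuation_factor)

def concentration_relative(sample, standard, c_standard):
    """Relative method: compare the normalized rate of the sample with
    that of a standard containing a known analyte concentration."""
    return c_standard * normalized_rate(*sample) / normalized_rate(*standard)

# Hypothetical measurements: (count rate [1/s], relative flux, attenuation factor)
sample   = (12.4, 0.92, 0.81)
standard = (15.0, 1.00, 0.88)
c = concentration_relative(sample, standard, c_standard=5.0)
print(f"analyte concentration ~ {c:.2f} (same units as the standard)")
```

Normalizing by flux and attenuation is what lets samples of different geometry and matrix be compared against a single standard.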

  8. Experience sampling methodology in mental health research: new insights and technical developments.

    Science.gov (United States)

    Myin-Germeys, Inez; Kasanova, Zuzana; Vaessen, Thomas; Vachon, Hugo; Kirtley, Olivia; Viechtbauer, Wolfgang; Reininghaus, Ulrich

    2018-06-01

    In the mental health field, there is a growing awareness that the study of psychiatric symptoms in the context of everyday life, using experience sampling methodology (ESM), may provide a powerful and necessary addition to more conventional research approaches. ESM, a structured self-report diary technique, allows the investigation of experiences within, and in interaction with, the real-world context. This paper provides an overview of how zooming in on the micro-level of experience and behaviour using ESM adds new insights and additional perspectives to standard approaches. More specifically, it discusses how ESM: a) contributes to a deeper understanding of psychopathological phenomena, b) allows to capture variability over time, c) aids in identifying internal and situational determinants of variability in symptomatology, and d) enables a thorough investigation of the interaction between the person and his/her environment and of real-life social interactions. Next to improving assessment of psychopathology and its underlying mechanisms, ESM contributes to advancing and changing clinical practice by allowing a more fine-grained evaluation of treatment effects as well as by providing the opportunity for extending treatment beyond the clinical setting into real life with the development of ecological momentary interventions. Furthermore, this paper provides an overview of the technical details of setting up an ESM study in terms of design, questionnaire development and statistical approaches. Overall, although a number of considerations and challenges remain, ESM offers one of the best opportunities for personalized medicine in psychiatry, from both a research and a clinical perspective. © 2018 World Psychiatric Association.
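
On the design side, a common ESM choice is semi-random (stratified random) signalling: the waking day is divided into equal blocks and one prompt is placed randomly within each block, subject to a minimum gap between prompts. The sketch below implements that generic scheme; the specific hours, prompt count and gap are assumptions for illustration, not prescriptions from the paper.

```python
import datetime as dt
import random

def esm_schedule(day, n_beeps=10, start_h=8, end_h=22, min_gap_min=15, seed=None):
    """Semi-random ESM signalling schedule: the waking day (08:00-22:00
    by default) is split into n_beeps equal blocks and one signal is
    placed at random inside each block, enforcing a minimum gap so
    signals never cluster at block boundaries."""
    rng = random.Random(seed)
    block = (end_h - start_h) * 60 / n_beeps     # block length in minutes
    times, last = [], -min_gap_min
    for i in range(n_beeps):
        t = i * block + rng.uniform(0, block)    # random moment in block i
        t = max(t, last + min_gap_min)           # enforce the minimum gap
        times.append(t)
        last = t
    base = dt.datetime.combine(day, dt.time(start_h))
    return [base + dt.timedelta(minutes=m) for m in times]

for beep in esm_schedule(dt.date(2024, 5, 6), seed=42):
    print(beep.strftime("%H:%M"))
```

Stratifying the prompts over blocks keeps them unpredictable to the participant while still covering the whole day, which is what makes the resulting momentary reports representative of daily life.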

  9. Design and methodology of the LA Sprouts nutrition, cooking and gardening program for Latino youth: A randomized controlled intervention.

    Science.gov (United States)

    Martinez, Lauren C; Gatto, Nicole M; Spruijt-Metz, Donna; Davis, Jaimie N

    2015-05-01

    The LA Sprouts 12-week nutrition, cooking and gardening intervention targets obesity reduction in Latino children. While other gardening and nutrition programs have been shown to improve dietary intake, LA Sprouts is unique in that it utilized a curriculum demonstrated to decrease obesity. This methodology paper outlines the design and processes of the LA Sprouts study, and discusses key strategies employed to foster successful implementation of the program. Setting: after-school program in four Los Angeles elementary schools. Participants: 3rd-5th grade students. Design: randomized controlled trial. Gardens were built on two of the four school campuses, and the 90-minute weekly lessons focused on strategies to increase fruit and vegetable consumption, gardening at school and home, and cooking healthy meals/snacks. Data collection was conducted pre- and post-intervention and included basic clinical and anthropometric measures, dietary intake and psychosocial constructs measured by questionnaire, and an optional fasting blood draw. Baseline data were collected from 364 children, and 320 (88%) completed follow-up. No participants withdrew from the program (data were missing for other reasons). Intervention students attended 9.7 ± 2.3 lessons. Fasting blood samples were collected from 169 children at baseline, and 113 (67%) at follow-up. Questionnaire scales had good internal consistency (IC) and intra-rater reliability (IRR; in child scales: 88% of items with IC > 0.7 and 70% of items with IRR > 0.50; in parent scales: 75% of items with IC > 0.7). The intervention was successfully implemented in the schools, and the scales appear appropriate to evaluate psychosocial constructs relevant to a gardening intervention. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. The saving and empowering young lives in Europe (SEYLE) randomized controlled trial (RCT): methodological issues and participant characteristics.

    Science.gov (United States)

    Carli, Vladimir; Wasserman, Camilla; Wasserman, Danuta; Sarchiapone, Marco; Apter, Alan; Balazs, Judit; Bobes, Julio; Brunner, Romuald; Corcoran, Paul; Cosman, Doina; Guillemin, Francis; Haring, Christian; Kaess, Michael; Kahn, Jean Pierre; Keeley, Helen; Keresztény, Agnes; Iosue, Miriam; Mars, Ursa; Musa, George; Nemes, Bogdan; Postuvan, Vita; Reiter-Theil, Stella; Saiz, Pilar; Varnik, Peeter; Varnik, Airi; Hoven, Christina W

    2013-05-16

    Mental health problems and risk behaviours among young people are of great public health concern. Consequently, within the VII Framework Programme, the European Commission funded the Saving and Empowering Young Lives in Europe (SEYLE) project. This Randomized Controlled Trial (RCT) was conducted in eleven European countries, with Sweden as the coordinating centre, and was designed to identify an effective way to promote mental health and reduce suicidality and risk-taking behaviours among adolescents. The aims were to describe the methodological and field procedures in the SEYLE RCT among adolescents, as well as to present the main characteristics of the recruited sample. Analyses were conducted to determine: 1) representativeness of study sites compared to respective national data; 2) response rates of schools and pupils, and drop-out rates from baseline to the 3- and 12-month follow-ups; 3) comparability of samples among the four Intervention Arms; 4) properties of the standard scales employed: Beck Depression Inventory, Second Edition (BDI-II), Zung Self-Rating Anxiety Scale (Z-SAS), Strengths and Difficulties Questionnaire (SDQ), and World Health Organization Well-Being Scale (WHO-5). Participants at baseline comprised 12,395 adolescents (M/F: 5,529/6,799; mean age = 14.9 ± 0.9) from Austria, Estonia, France, Germany, Hungary, Ireland, Israel, Italy, Romania, Slovenia and Spain. At the 3- and 12-month follow-ups, participation rates were 87.3% and 79.4%, respectively. Demographic characteristics of participating sites were found to be reasonably representative of their respective national populations. The overall response rate of schools was 67.8%. All scales utilised in the study had good to very good internal reliability, as measured by Cronbach's alpha (BDI-II: 0.864; Z-SAS: 0.805; SDQ: 0.740; WHO-5: 0.799). SEYLE achieved its objective of recruiting a large representative sample of adolescents within participating European countries. Analysis of SEYLE data will shed light on the effectiveness
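
For reference, the internal-reliability statistic quoted above, Cronbach's alpha, is easy to compute from an item-score matrix: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). The sketch below applies it to simulated questionnaire data; the respondent count, item count and noise level are arbitrary.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of total score
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical 5-item questionnaire answered by 200 respondents: items
# share a latent trait, so they correlate and alpha comes out high.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
scores = latent + rng.normal(scale=0.8, size=(200, 5))
print(f"alpha = {cronbach_alpha(scores):.3f}")
```

Values around 0.7 or higher, like those reported for the SEYLE scales, are conventionally read as acceptable-to-good internal consistency.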

  11. Random sampling of quantum states: a survey of methods and some issues regarding the Overparametrized Method

    International Nuclear Information System (INIS)

    Maziero, Jonas

    2015-01-01

    The numerical generation of random quantum states (RQS) is an important procedure for investigations in quantum information science. Here, we review some methods that may be used for performing that task. We start by presenting a simple procedure for generating random state vectors, for which the main tool is the random sampling of unbiased discrete probability distributions (DPD). Afterwards, the creation of random density matrices is addressed. In this context, we first present the standard method, which consists in using the spectral decomposition of a quantum state for getting RQS from random DPDs and random unitary matrices. In the sequence, the Bloch vector parametrization method is described. This approach, despite being useful in several instances, is not in general convenient for RQS generation. In the last part of the article, we consider the overparametrized method (OPM) and the related Ginibre and Bures techniques. The OPM can be used to create random positive semidefinite matrices with unit trace from randomly produced general complex matrices, in a simple way that is friendly for numerical implementations. We consider a physically relevant issue related to the possible domains that may be used for the real and imaginary parts of the elements of such general complex matrices. Subsequently, an excessively fast concentration of measure in the quantum state space that appears in this parametrization is noted. (author)
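
Of the techniques mentioned, the Ginibre construction is perhaps the simplest to implement: a matrix with independent standard complex Gaussian entries is turned into a valid density matrix by forming G G† and normalizing the trace (for square G this induces the Hilbert-Schmidt measure). A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(11)

def random_density_matrix(d):
    """Ginibre construction: G has i.i.d. standard complex Gaussian
    entries; rho = G G^dagger / Tr(G G^dagger) is positive semidefinite
    with unit trace, i.e. a valid random density matrix."""
    G = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = G @ G.conj().T
    return rho / np.trace(rho)

rho = random_density_matrix(4)
print("trace      :", np.trace(rho).real)       # 1.0 by construction
print("eigenvalues:", np.linalg.eigvalsh(rho))  # all >= 0, summing to 1
```

The same recipe with a rectangular G samples states of fixed rank, one of the variations the review's overparametrized-method discussion touches on.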

  12. An application of Random Forests to a genome-wide association dataset: Methodological considerations & new findings

    Directory of Open Access Journals (Sweden)

    Hubbard Alan E

    2010-06-01

    Full Text Available Abstract Background As computational power improves, the application of more advanced machine learning techniques to the analysis of large genome-wide association (GWA datasets becomes possible. While most traditional statistical methods can only elucidate main effects of genetic variants on risk for disease, certain machine learning approaches are particularly suited to discover higher order and non-linear effects. One such approach is the Random Forests (RF algorithm. The use of RF for SNP discovery related to human disease has grown in recent years; however, most work has focused on small datasets or simulation studies which are limited. Results Using a multiple sclerosis (MS case-control dataset comprised of 300 K SNP genotypes across the genome, we outline an approach and some considerations for optimally tuning the RF algorithm based on the empirical dataset. Importantly, results show that typical default parameter values are not appropriate for large GWA datasets. Furthermore, gains can be made by sub-sampling the data, pruning based on linkage disequilibrium (LD, and removing strong effects from RF analyses. The new RF results are compared to findings from the original MS GWA study and demonstrate overlap. In addition, four new interesting candidate MS genes are identified, MPHOSPH9, CTNNA3, PHACTR2 and IL7, by RF analysis and warrant further follow-up in independent studies. Conclusions This study presents one of the first illustrations of successfully analyzing GWA data with a machine learning algorithm. It is shown that RF is computationally feasible for GWA data and the results obtained make biologic sense based on previous studies. More importantly, new genes were identified as potentially being associated with MS, suggesting new avenues of investigation for this complex disease.
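
A hedged sketch of the kind of tuning the authors describe, using scikit-learn on simulated genotype data: the default mtry (max_features close to sqrt(p)) is compared with larger values, with out-of-bag accuracy as the yardstick. The data dimensions, causal-SNP model and parameter grid are illustrative and far smaller than a real 300 K SNP GWA dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(5)

# Hypothetical SNP matrix: genotypes coded 0/1/2, with a handful of causal SNPs
n, p, causal = 400, 2000, 5
X = rng.integers(0, 3, size=(n, p)).astype(float)
y = (X[:, :causal].sum(axis=1) + rng.normal(0, 1.5, n) > causal).astype(int)

# Defaults (mtry ~ sqrt(p)) are often too small when most predictors are
# noise; a larger mtry raises the chance a causal SNP is available per split.
for mtry in (int(np.sqrt(p)), p // 10, p // 4):
    rf = RandomForestClassifier(n_estimators=500, max_features=mtry,
                                oob_score=True, n_jobs=-1, random_state=0)
    rf.fit(X, y)
    print(f"mtry={mtry:4d}  OOB accuracy={rf.oob_score_:.3f}")

# Variable importances can then rank SNPs for follow-up
top = np.argsort(rf.feature_importances_)[::-1][:10]
print("top-ranked SNP indices:", top)
```

Out-of-bag accuracy gives a built-in tuning criterion without a separate validation split, which matters when, as in the paper, refitting on genome-scale data is expensive.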

  13. Abundance, distribution and diversity of gelatinous predators along the northern Mid-Atlantic Ridge: A comparison of different sampling methodologies.

    Directory of Open Access Journals (Sweden)

    Aino Hosia

    Full Text Available The diversity and distribution of gelatinous zooplankton were investigated along the northern Mid-Atlantic Ridge (MAR from June to August 2004.Here, we present results from macrozooplankton trawl sampling, as well as comparisons made between five different methodologies that were employed during the MAR-ECO survey. In total, 16 species of hydromedusae, 31 species of siphonophores and four species of scyphozoans were identified to species level from macrozooplankton trawl samples. Additional taxa were identified to higher taxonomic levels and a single ctenophore genus was observed. Samples were collected at 17 stations along the MAR between the Azores and Iceland. A divergence in the species assemblages was observed at the southern limit of the Subpolar Frontal Zone. The catch composition of gelatinous zooplankton is compared between different sampling methodologies including: a macrozooplankton trawl; a Multinet; a ringnet attached to bottom trawl; and optical platforms (Underwater Video Profiler (UVP & Remotely Operated Vehicle (ROV. Different sampling methodologies are shown to exhibit selectivity towards different groups of gelatinous zooplankton. Only ~21% of taxa caught during the survey were caught by both the macrozooplankton trawl and the Multinet when deployed at the same station. The estimates of gelatinous zooplankton abundance calculated using these two gear types also varied widely (1.4 ± 0.9 individuals 1000 m-3 estimated by the macrozooplankton trawl vs. 468.3 ± 315.4 individuals 1000 m-3 estimated by the Multinet (mean ± s.d. when used at the same stations (n = 6. While it appears that traditional net sampling can generate useful data on pelagic cnidarians, comparisons with results from the optical platforms suggest that ctenophore diversity and abundance are consistently underestimated, particularly when net sampling is conducted in combination with formalin fixation. The results emphasise the importance of considering

  14. Abundance, distribution and diversity of gelatinous predators along the northern Mid-Atlantic Ridge: A comparison of different sampling methodologies

    Science.gov (United States)

    Falkenhaug, Tone; Baxter, Emily J.

    2017-01-01

    The diversity and distribution of gelatinous zooplankton were investigated along the northern Mid-Atlantic Ridge (MAR) from June to August 2004. Here, we present results from macrozooplankton trawl sampling, as well as comparisons made between five different methodologies that were employed during the MAR-ECO survey. In total, 16 species of hydromedusae, 31 species of siphonophores and four species of scyphozoans were identified to species level from macrozooplankton trawl samples. Additional taxa were identified to higher taxonomic levels and a single ctenophore genus was observed. Samples were collected at 17 stations along the MAR between the Azores and Iceland. A divergence in the species assemblages was observed at the southern limit of the Subpolar Frontal Zone. The catch composition of gelatinous zooplankton is compared between different sampling methodologies including: a macrozooplankton trawl; a Multinet; a ringnet attached to a bottom trawl; and optical platforms (Underwater Video Profiler (UVP) & Remotely Operated Vehicle (ROV)). Different sampling methodologies are shown to exhibit selectivity towards different groups of gelatinous zooplankton. Only ~21% of taxa caught during the survey were caught by both the macrozooplankton trawl and the Multinet when deployed at the same station. The estimates of gelatinous zooplankton abundance calculated using these two gear types also varied widely (1.4 ± 0.9 individuals 1000 m-3 estimated by the macrozooplankton trawl vs. 468.3 ± 315.4 individuals 1000 m-3 estimated by the Multinet (mean ± s.d.)) when used at the same stations (n = 6). While it appears that traditional net sampling can generate useful data on pelagic cnidarians, comparisons with results from the optical platforms suggest that ctenophore diversity and abundance are consistently underestimated, particularly when net sampling is conducted in combination with formalin fixation. The results emphasise the importance of considering sampling methodology

  15. Abundance, distribution and diversity of gelatinous predators along the northern Mid-Atlantic Ridge: A comparison of different sampling methodologies.

    Science.gov (United States)

    Hosia, Aino; Falkenhaug, Tone; Baxter, Emily J; Pagès, Francesc

    2017-01-01

    The diversity and distribution of gelatinous zooplankton were investigated along the northern Mid-Atlantic Ridge (MAR) from June to August 2004. Here, we present results from macrozooplankton trawl sampling, as well as comparisons made between five different methodologies that were employed during the MAR-ECO survey. In total, 16 species of hydromedusae, 31 species of siphonophores and four species of scyphozoans were identified to species level from macrozooplankton trawl samples. Additional taxa were identified to higher taxonomic levels and a single ctenophore genus was observed. Samples were collected at 17 stations along the MAR between the Azores and Iceland. A divergence in the species assemblages was observed at the southern limit of the Subpolar Frontal Zone. The catch composition of gelatinous zooplankton is compared between different sampling methodologies including: a macrozooplankton trawl; a Multinet; a ringnet attached to a bottom trawl; and optical platforms (Underwater Video Profiler (UVP) & Remotely Operated Vehicle (ROV)). Different sampling methodologies are shown to exhibit selectivity towards different groups of gelatinous zooplankton. Only ~21% of taxa caught during the survey were caught by both the macrozooplankton trawl and the Multinet when deployed at the same station. The estimates of gelatinous zooplankton abundance calculated using these two gear types also varied widely (1.4 ± 0.9 individuals 1000 m-3 estimated by the macrozooplankton trawl vs. 468.3 ± 315.4 individuals 1000 m-3 estimated by the Multinet (mean ± s.d.)) when used at the same stations (n = 6). While it appears that traditional net sampling can generate useful data on pelagic cnidarians, comparisons with results from the optical platforms suggest that ctenophore diversity and abundance are consistently underestimated, particularly when net sampling is conducted in combination with formalin fixation. The results emphasise the importance of considering sampling methodology

  16. Development of Methodology and Field Deployable Sampling Tools for Spent Nuclear Fuel Interrogation in Liquid Storage

    International Nuclear Information System (INIS)

    Berry, T.; Milliken, C.; Martinez-Rodriguez, M.; Hathcock, D.; Heitkamp, M.

    2012-01-01

    This project developed methodology and field deployable tools (test kits) to analyze the chemical and microbiological condition of the fuel storage medium and determine the oxide thickness on the spent fuel basin materials. The overall objective of this project was to estimate the amount of time fuel has spent in a storage basin, in order to determine whether the operation of the reactor and storage basin is consistent with safeguard declarations or expectations. This project developed and validated forensic tools that can be used to predict the age and condition of spent nuclear fuels stored in liquid basins based on key physical, chemical and microbiological basin characteristics. Key parameters were identified based on a literature review, the parameters were used to design test cells for corrosion analyses, tools were purchased to analyze the key parameters, and these were used to characterize an active spent fuel basin, the Savannah River Site (SRS) L-Area basin. The key parameters identified in the literature review included chloride concentration, conductivity, and total organic carbon level. Focus was also placed on aluminum-based cladding because of its application to weapons production. The literature review was helpful in identifying important parameters, but relationships between these parameters and corrosion rates were not available. Bench scale test systems were designed, operated, harvested, and analyzed to determine corrosion relationships between water parameters and water conditions, chemistry and microbiological conditions. The data from the bench scale system indicated that corrosion rates were dependent on total organic carbon levels and chloride concentrations. The highest corrosion rates were observed in test cells amended with sediment, a large microbial inoculum and an organic carbon source. A complete characterization test kit was field tested to characterize the SRS L-Area spent fuel basin. The sampling kit consisted of a TOC analyzer, a YSI

  17. A new methodology for sampling blackflies for the entomological surveillance of onchocerciasis in Brazil.

    Directory of Open Access Journals (Sweden)

    Érika S do Nascimento-Carvalho

    Full Text Available The effectiveness of the MosqTent® trap was evaluated in an area of Brazil endemic for onchocerciasis. This study seeks to provide subsidies for the monitoring of onchocerciasis transmission in the country. The study was carried out in the Homoxi and Thirei villages, located in the Yanomami Indigenous Land, in the state of Roraima. This area presents hyperendemicity, high blackfly densities, large population migrations and mining activities. The Homoxi and Thirei villages are assisted by the Brazilian Ministry of Health. To conduct the present study, the village leader, health leaders and the Brazilian Ethics Committee were consulted. Blackfly captures were carried out simultaneously at Homoxi and Thirei, using systematized methods to allow for comparisons between the traditional Human Landing Catch (HLC) and HLC protected by the MosqTent®. Female blackflies were captured at two equidistant capture stations per locality, by two collectors per station, for five consecutive days. Individuals captured per interval/station/day were counted, identified and maintained at -20°C. The underlying probability distributions and the differences between the methods for the independent sample data were verified in a comparative statistical analysis between the use of the MosqTent® and the HLC. A total of 10,855 anthropophilic blackflies were captured by both methodologies. A total of 7,367 (67.87%) blackflies belonging to seven species were captured by the MosqTent®: Simulium incrustatum s.l. (99.06%), S. guianense s.l. (0.74%), S. oyapockense s.l. (0.01%), S. exiguum (0.10%), S. metallicum (0.05%), S. ochraceum (0.03%) and S. minusculum s.l. (0.01%). Moreover, 3,488 (32.14%) blackflies belonging to four species were captured by HLC: S. incrustatum s.l. (98.33%), S. guianense s.l. (1.38%), S. oyapockense s.l. (0.26%) and S. metallicum (0.03%). The MosqTent® was more effective and efficient when compared to HLC. When comparing total blackflies captured/day, the Mosq

  18. A new methodology for sampling blackflies for the entomological surveillance of onchocerciasis in Brazil.

    Science.gov (United States)

    Nascimento-Carvalho, Érika S do; Cesário, Raquel de Andrade; do Vale, Vladimir Fazito; Aranda, Arion Tulio; Valente, Ana Carolina Dos Santos; Maia-Herzog, Marilza

    2017-01-01

    The effectiveness of the MosqTent® trap was evaluated in an area of Brazil endemic for onchocerciasis. This study seeks to provide subsidies for the monitoring of onchocerciasis transmission in the country. The study was carried out in the Homoxi and Thirei villages, located in the Yanomami Indigenous Land, in the state of Roraima. This area presents hyperendemicity, high blackfly densities, large population migrations and mining activities. The Homoxi and Thirei villages are assisted by the Brazilian Ministry of Health. To conduct the present study, the village leader, health leaders and the Brazilian Ethics Committee were consulted. Blackfly captures were carried out simultaneously at Homoxi and Thirei, using systematized methods to allow for comparisons between the traditional Human Landing Catch (HLC) and HLC protected by the MosqTent®. Female blackflies were captured at two equidistant capture stations per locality, by two collectors per station, for five consecutive days. Individuals captured per interval/station/day were counted, identified and maintained at -20°C. The underlying probability distributions and the differences between the methods for the independent sample data were verified in a comparative statistical analysis between the use of the MosqTent® and the HLC. A total of 10,855 anthropophilic blackflies were captured by both methodologies. A total of 7,367 (67.87%) blackflies belonging to seven species were captured by the MosqTent®: Simulium incrustatum s.l. (99.06%), S. guianense s.l. (0.74%), S. oyapockense s.l. (0.01%), S. exiguum (0.10%), S. metallicum (0.05%), S. ochraceum (0.03%) and S. minusculum s.l. (0.01%). Moreover, 3,488 (32.14%) blackflies belonging to four species were captured by HLC: S. incrustatum s.l. (98.33%), S. guianense s.l. (1.38%), S. oyapockense s.l. (0.26%) and S. metallicum (0.03%). The MosqTent® was more effective and efficient when compared to HLC. When comparing total blackflies captured/day, the MosqTent® was

  19. DEVELOPMENT OF METHODOLOGY AND FIELD DEPLOYABLE SAMPLING TOOLS FOR SPENT NUCLEAR FUEL INTERROGATION IN LIQUID STORAGE

    Energy Technology Data Exchange (ETDEWEB)

    Berry, T.; Milliken, C.; Martinez-Rodriguez, M.; Hathcock, D.; Heitkamp, M.

    2012-06-04

    This project developed methodology and field deployable tools (test kits) to analyze the chemical and microbiological condition of the fuel storage medium and determine the oxide thickness on the spent fuel basin materials. The overall objective of this project was to estimate the amount of time fuel has spent in a storage basin, in order to determine whether the operation of the reactor and storage basin is consistent with safeguard declarations or expectations. This project developed and validated forensic tools that can be used to predict the age and condition of spent nuclear fuels stored in liquid basins based on key physical, chemical and microbiological basin characteristics. Key parameters were identified based on a literature review, the parameters were used to design test cells for corrosion analyses, tools were purchased to analyze the key parameters, and these were used to characterize an active spent fuel basin, the Savannah River Site (SRS) L-Area basin. The key parameters identified in the literature review included chloride concentration, conductivity, and total organic carbon level. Focus was also placed on aluminum-based cladding because of its application to weapons production. The literature review was helpful in identifying important parameters, but relationships between these parameters and corrosion rates were not available. Bench scale test systems were designed, operated, harvested, and analyzed to determine corrosion relationships between water parameters and water conditions, chemistry and microbiological conditions. The data from the bench scale system indicated that corrosion rates were dependent on total organic carbon levels and chloride concentrations. The highest corrosion rates were observed in test cells amended with sediment, a large microbial inoculum and an organic carbon source. A complete characterization test kit was field tested to characterize the SRS L-Area spent fuel basin. The sampling kit consisted of a TOC analyzer, a YSI

  20. Field-based random sampling without a sampling frame: control selection for a case-control study in rural Africa.

    Science.gov (United States)

    Crampin, A C; Mwinuka, V; Malema, S S; Glynn, J R; Fine, P E

    2001-01-01

    Selection bias, particularly of controls, is common in case-control studies and may materially affect the results. Methods of control selection should be tailored both for the risk factors and disease under investigation and for the population being studied. We present here a control selection method devised for a case-control study of tuberculosis in rural Africa (Karonga, northern Malawi) that selects an age/sex frequency-matched random sample of the population, with a geographical distribution in proportion to the population density. We also present an audit of the selection process, and discuss the potential of this method in other settings.

  1. Determination of rare earth elements in natural water samples – A review of sample separation, preconcentration and direct methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Fisher, Andrew, E-mail: afisher@plymouth.ac.uk [School of Geography, Earth and Environmental Sciences, Plymouth University, Drake Circus, Plymouth, Devon, PL4 8AA (United Kingdom); Kara, Derya [Department of Chemistry, Art and Science Faculty, Balikesir University, 10100, Balikesir (Turkey)

    2016-09-07

    This review discusses and compares the methods given for the determination of rare earth elements (REE) in natural water samples, including sea, river, lake, tap, ground and waste waters as well as Antarctic ice. Since REE are at very low concentrations in natural waters, numerous preconcentration methods have been proposed to enable their measurement. These include liquid-liquid extraction, dispersive liquid-liquid micro-extraction and solidified floating drop micro-extraction. In addition to liquid-liquid extraction methods, solid phase extraction using commercial resins, resins made in-house, silica-based exchange materials and other solid media is also discussed. These and other techniques such as precipitation/co-precipitation and flotation are compared in terms of speed, preconcentration factors achieved, precision, accuracy and limits of detection (LOD). Some papers have discussed the direct determination of REE in these sample types. Some have used specialised sample introduction systems such as ultrasonic nebulization, whereas others have used a standard sample introduction system coupled with inductively coupled plasma mass spectrometry (ICP-MS) detection. These direct methods have also been discussed and compared. - Highlights: • The determination of rare earth elements in waters is reviewed. • Assorted preconcentration techniques are discussed and evaluated. • Detection techniques include atomic spectrometry, potentiometry and spectrophotometry. • Special nebulisers and electrothermal vaporization approaches are reviewed.

  2. Estimation of Sensitive Proportion by Randomized Response Data in Successive Sampling

    Directory of Open Access Journals (Sweden)

    Bo Yu

    2015-01-01

    This paper considers the problem of estimating binomial proportions of sensitive or stigmatizing attributes in the population of interest. Randomized response techniques are suggested for protecting the privacy of respondents and reducing response bias while eliciting information on sensitive attributes. In many sensitive-question surveys, the same population is sampled repeatedly across occasions. In this paper, we apply a successive sampling scheme to improve the estimation of the sensitive proportion on the current occasion.
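
    As a minimal illustration of the randomized response idea, the sketch below recovers a sensitive proportion under the classic Warner design with a known randomizer probability p. The design choice and all numbers are assumptions for illustration; the paper's successive-sampling estimator is more elaborate and is not reproduced here.

      import random

      def warner_estimate(answers, p):
          # Unbiased Warner estimator: the observed 'yes' rate lam satisfies
          # lam = (2p - 1) * pi + (1 - p); invert for pi (requires p != 0.5).
          lam = sum(answers) / len(answers)
          return (lam - (1 - p)) / (2 * p - 1)

      # Simulate 2,000 respondents with true sensitive proportion 0.20, p = 0.7.
      rng = random.Random(1)
      TRUE_PI, P = 0.20, 0.7
      answers = []
      for _ in range(2000):
          sensitive = rng.random() < TRUE_PI    # respondent's true status
          asked_sensitive = rng.random() < P    # outcome of the randomizing device
          answers.append(int(sensitive == asked_sensitive))
      print(round(warner_estimate(answers, P), 3))   # close to 0.20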

  3. 40 CFR 761.308 - Sample selection by random number generation on any two-dimensional square grid.

    Science.gov (United States)

    2010-07-01

    40 CFR Protection of Environment, vol. 30 (2010-07-01): § 761.308 Sample selection by random number generation on any two-dimensional square grid, § 761.79(b)(3). For each area created in accordance with paragraph (a) of this section, select two random numbers: one each for...
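
    The selection step described in the regulation reduces to drawing one random index per axis of the grid. A sketch (the 10 × 10 grid size and seed are hypothetical):

      import random

      def select_grid_cell(n_rows, n_cols, rng=random):
          # One random number per dimension of the two-dimensional square grid.
          return rng.randrange(n_rows), rng.randrange(n_cols)

      rng = random.Random(42)
      print([select_grid_cell(10, 10, rng) for _ in range(3)])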

  4. Random sampling of elementary flux modes in large-scale metabolic networks.

    Science.gov (United States)

    Machado, Daniel; Soons, Zita; Patil, Kiran Raosaheb; Ferreira, Eugénio C; Rocha, Isabel

    2012-09-15

    The description of a metabolic network in terms of elementary (flux) modes (EMs) provides an important framework for metabolic pathway analysis. However, their application to large networks has been hampered by the combinatorial explosion in the number of modes. In this work, we develop a method for generating random samples of EMs without computing the whole set. Our algorithm is an adaptation of the canonical basis approach, where we add an additional filtering step which, at each iteration, selects a random subset of the new combinations of modes. In order to obtain an unbiased sample, all candidates are assigned the same probability of getting selected. This approach avoids the exponential growth of the number of modes during computation, thus generating a random sample of the complete set of EMs within reasonable time. We generated samples of different sizes for a metabolic network of Escherichia coli, and observed that they preserve several properties of the full EM set. It is also shown that EM sampling can be used for rational strain design. A well distributed sample, that is representative of the complete set of EMs, should be suitable to most EM-based methods for analysis and optimization of metabolic networks. Source code for a cross-platform implementation in Python is freely available at http://code.google.com/p/emsampler. dmachado@deb.uminho.pt Supplementary data are available at Bioinformatics online.
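
    The filtering step described in the abstract, keeping a uniform random subset of the new candidate combinations at each iteration so that growth is capped without biasing the final sample, can be sketched as follows (the cap of 500 and the mode labels are hypothetical; this is not the emsampler code itself):

      import random

      def filter_candidates(candidates, max_keep, rng=random):
          # Every candidate gets the same inclusion probability, so capping
          # the set at each iteration does not bias the final sample.
          if len(candidates) <= max_keep:
              return list(candidates)
          return rng.sample(candidates, max_keep)

      rng = random.Random(2)
      new_modes = ["mode_%d" % i for i in range(10000)]
      print(len(filter_candidates(new_modes, 500, rng)))   # 500 kept uniformly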

  5. The Dirichlet-Multinomial model for multivariate randomized response data and small samples

    NARCIS (Netherlands)

    Avetisyan, Marianna; Fox, Gerardus J.A.

    2012-01-01

    In survey sampling the randomized response (RR) technique can be used to obtain truthful answers to sensitive questions. Although the individual answers are masked due to the RR technique, individual (sensitive) response rates can be estimated when observing multivariate response data. The

  6. The Dirichlet-Multinomial Model for Multivariate Randomized Response Data and Small Samples

    Science.gov (United States)

    Avetisyan, Marianna; Fox, Jean-Paul

    2012-01-01

    In survey sampling the randomized response (RR) technique can be used to obtain truthful answers to sensitive questions. Although the individual answers are masked due to the RR technique, individual (sensitive) response rates can be estimated when observing multivariate response data. The beta-binomial model for binary RR data will be generalized…

  7. A Nationwide Random Sampling Survey of Potential Complicated Grief in Japan

    Science.gov (United States)

    Mizuno, Yasunao; Kishimoto, Junji; Asukai, Nozomu

    2012-01-01

    To investigate the prevalence of significant loss, potential complicated grief (CG), and its contributing factors, we conducted a nationwide random sampling survey of Japanese adults aged 18 or older (N = 1,343) using a self-rating Japanese-language version of the Complicated Grief Brief Screen. Among them, 37.0% experienced their most significant…

  8. A simple sample size formula for analysis of covariance in cluster randomized trials.

    NARCIS (Netherlands)

    Teerenstra, S.; Eldridge, S.; Graff, M.J.; Hoop, E. de; Borm, G.F.

    2012-01-01

    For cluster randomized trials with a continuous outcome, the sample size is often calculated as if an analysis of the outcomes at the end of the treatment period (follow-up scores) would be performed. However, often a baseline measurement of the outcome is available or feasible to obtain. An
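
    The abstract is truncated before the formula itself; the sketch below shows only the usual structure of such calculations, an unadjusted two-sample size inflated by the clustering design effect and deflated by a baseline-adjustment factor, and should not be read as the paper's exact result.

      from math import ceil
      from statistics import NormalDist

      def cluster_ancova_n(delta, sd, icc, m, r, alpha=0.05, power=0.8):
          # n per arm = simple two-sample n
          #             * (1 + (m - 1) * icc)   clustering design effect
          #             * (1 - r ** 2)          gain from a baseline covariate
          z = NormalDist()
          za, zb = z.inv_cdf(1 - alpha / 2), z.inv_cdf(power)
          n_simple = 2 * (za + zb) ** 2 * sd ** 2 / delta ** 2
          return ceil(n_simple * (1 + (m - 1) * icc) * (1 - r ** 2))

      # Hypothetical inputs: effect 0.5 SD, ICC 0.05, 20 subjects per cluster,
      # baseline-to-follow-up correlation 0.6.
      print(cluster_ancova_n(delta=0.5, sd=1.0, icc=0.05, m=20, r=0.6))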

  9. Random selection of items. Selection of n1 samples among N items composing a stratum

    International Nuclear Information System (INIS)

    Jaech, J.L.; Lemaire, R.J.

    1987-02-01

    STR-224 provides generalized procedures to determine required sample sizes, for instance in the course of a Physical Inventory Verification at Bulk Handling Facilities. The present report describes procedures to generate random numbers and select groups of items to be verified in a given stratum through each of the measurement methods involved in the verification. (author). 3 refs
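
    The selection itself is a simple random sample without replacement from the stratum; a sketch with hypothetical sizes (one such list would be drawn per measurement method):

      import random

      def select_items(n_in_stratum, n_to_verify, seed=None):
          # Item identifiers 1..N; draw n1 of them without replacement.
          rng = random.Random(seed)
          return sorted(rng.sample(range(1, n_in_stratum + 1), n_to_verify))

      print(select_items(n_in_stratum=200, n_to_verify=12, seed=7))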

  10. Reinforcing Sampling Distributions through a Randomization-Based Activity for Introducing ANOVA

    Science.gov (United States)

    Taylor, Laura; Doehler, Kirsten

    2015-01-01

    This paper examines the use of a randomization-based activity to introduce the ANOVA F-test to students. The two main goals of this activity are to successfully teach students to comprehend ANOVA F-tests and to increase student comprehension of sampling distributions. Four sections of students in an advanced introductory statistics course…

  11. An efficient method of randomly sampling the coherent angular scatter distribution

    International Nuclear Information System (INIS)

    Williamson, J.F.; Morin, R.L.

    1983-01-01

    Monte Carlo simulations of photon transport phenomena require random selection of an interaction process at each collision site along the photon track. Possible choices are usually limited to photoelectric absorption and incoherent scatter as approximated by the Klein-Nishina distribution. A technique is described for sampling the coherent angular scatter distribution, for the benefit of workers in medical physics. (U.K.)
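
    The abstract does not spell out the technique; as a generic illustration of randomly selecting a scattering angle, here is plain rejection sampling from a Thomson-shaped angular density (the coherent form-factor weighting used in the paper is omitted):

      import math
      import random

      def sample_angle(pdf, pdf_max, rng=random):
          # Rejection sampling: propose theta uniformly on [0, pi] and accept
          # with probability pdf(theta) / pdf_max; pdf need not be normalized.
          while True:
              theta = rng.uniform(0.0, math.pi)
              if rng.uniform(0.0, pdf_max) <= pdf(theta):
                  return theta

      def thomson(theta):
          # (1 + cos^2 theta) differential shape times the sin(theta)
          # solid-angle factor; bounded above by 2.0 on [0, pi].
          return (1.0 + math.cos(theta) ** 2) * math.sin(theta)

      rng = random.Random(4)
      print([round(sample_angle(thomson, 2.0, rng), 2) for _ in range(5)])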

  12. Monitoring oil persistence on beaches : SCAT versus stratified random sampling designs

    International Nuclear Information System (INIS)

    Short, J.W.; Lindeberg, M.R.; Harris, P.M.; Maselko, J.M.; Pella, J.J.; Rice, S.D.

    2003-01-01

    In the event of a coastal oil spill, shoreline clean-up assessment teams (SCAT) commonly rely on visual inspection of the entire affected area to monitor the persistence of the oil on beaches. Occasionally, pits are excavated to evaluate the persistence of subsurface oil. This approach is practical for directing clean-up efforts directly following a spill. However, sampling of the 1989 Exxon Valdez oil spill in Prince William Sound 12 years later has shown that visual inspection combined with pit excavation does not offer estimates of contaminated beach area or stranded oil volumes. This information is needed to statistically evaluate the significance of change with time. Assumptions regarding the correlation of visually-evident surface oil and cryptic subsurface oil are usually not evaluated as part of the SCAT mandate. Stratified random sampling can avoid such problems and could produce precise estimates of oiled area and volume that allow for statistical assessment of major temporal trends and the extent of the impact. The 2001 sampling of the shoreline of Prince William Sound showed that 15 per cent of surface oil occurrences were associated with subsurface oil. This study demonstrates the usefulness of the stratified random sampling method and shows how sampling design parameters impact statistical outcome. Power analysis based on the study results indicates that optimum power is achieved when unnecessary stratification is avoided. It was emphasized that sampling effort should be balanced between choosing sufficient beaches for sampling and the intensity of sampling.

  13. Comparing the performance of cluster random sampling and integrated threshold mapping for targeting trachoma control, using computer simulation.

    Directory of Open Access Journals (Sweden)

    Jennifer L Smith

    Implementation of trachoma control strategies requires reliable district-level estimates of trachomatous inflammation-follicular (TF), generally collected using the recommended gold-standard cluster randomized surveys (CRS). Integrated Threshold Mapping (ITM) has been proposed as an integrated and cost-effective means of rapidly surveying trachoma in order to classify districts according to treatment thresholds. ITM differs from CRS in a number of important ways, including the use of a school-based sampling platform for children aged 1-9 and a different age distribution of participants. This study uses computerised sampling simulations to compare the performance of these survey designs and evaluate the impact of varying key parameters. Realistic pseudo gold standard data for 100 districts were generated that maintained the relative risk of disease between important sub-groups and incorporated empirical estimates of disease clustering at the household, village and district level. To simulate the different sampling approaches, 20 clusters were selected from each district, with individuals sampled according to the protocol for ITM and CRS. Results showed that ITM generally under-estimated the true prevalence of TF over a range of epidemiological settings and introduced more district misclassification according to treatment thresholds than did CRS. However, the extent of underestimation and resulting misclassification was found to be dependent on three main factors: (i) the district prevalence of TF; (ii) the relative risk of TF between enrolled and non-enrolled children within clusters; and (iii) the enrollment rate in schools. Although in some contexts the two methodologies may be equivalent, ITM can introduce a bias-dependent shift as prevalence of TF increases, resulting in a greater risk of misclassification around treatment thresholds. In addition to strengthening the evidence base around choice of trachoma survey methodologies, this study illustrates

  14. The Bootstrap, the Jackknife, and the Randomization Test: A Sampling Taxonomy.

    Science.gov (United States)

    Rodgers, J L

    1999-10-01

    A simple sampling taxonomy is defined that shows the differences between and relationships among the bootstrap, the jackknife, and the randomization test. Each method has as its goal the creation of an empirical sampling distribution that can be used to test statistical hypotheses, estimate standard errors, and/or create confidence intervals. Distinctions between the methods can be made based on the sampling approach (with replacement versus without replacement) and the sample size (replacing the whole original sample versus replacing a subset of the original sample). The taxonomy is useful for teaching the goals and purposes of resampling schemes. An extension of the taxonomy implies other possible resampling approaches that have not previously been considered. Univariate and multivariate examples are presented.
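
    The taxonomy's two axes (with versus without replacement; whole sample versus subset) can be made concrete in a few lines of illustrative code:

      import random

      data = [12, 15, 9, 14, 11, 17, 10, 13]
      rng = random.Random(0)

      # Bootstrap: WITH replacement, replacing the whole original sample.
      boot = [rng.choice(data) for _ in data]

      # Jackknife: WITHOUT replacement, replacing a subset (leave-one-out).
      jack = [x for i, x in enumerate(data) if i != 3]

      # Randomization test: WITHOUT replacement, the whole sample reshuffled,
      # e.g. reallocating observations between two groups under the null.
      shuffled = rng.sample(data, len(data))
      group_a, group_b = shuffled[:4], shuffled[4:]

      print(boot, jack, group_a, group_b, sep="\n")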

  15. A simple and reliable methodology to detect egg white in art samples

    Indian Academy of Sciences (India)

    2013-04-26

    Apr 26, 2013 ... threshold density values useful for the detection of ovalbumin in samples from ancient works of art ... slides a mixture of a water solution of dry egg white and the ... facing the problems of sample leakage, background.

  16. Additive non-uniform random sampling in superimposed fiber Bragg grating strain gauge

    International Nuclear Information System (INIS)

    Ma, Y C; Liu, H Y; Yan, S B; Li, J M; Tang, J; Yang, Y H; Yang, M W

    2013-01-01

    This paper demonstrates an additive non-uniform random sampling and interrogation method for dynamic and/or static strain gauge using a reflection spectrum from two superimposed fiber Bragg gratings (FBGs). The superimposed FBGs are designed to generate non-equidistant space of a sensing pulse train in the time domain during dynamic strain gauge. By combining centroid finding with smooth filtering methods, both the interrogation speed and accuracy are improved. A 1.9 kHz dynamic strain is measured by generating an additive non-uniform randomly distributed 2 kHz optical sensing pulse train from a mean 500 Hz triangular periodically changing scanning frequency. (paper)

  17. Additive non-uniform random sampling in superimposed fiber Bragg grating strain gauge

    Science.gov (United States)

    Ma, Y. C.; Liu, H. Y.; Yan, S. B.; Yang, Y. H.; Yang, M. W.; Li, J. M.; Tang, J.

    2013-05-01

    This paper demonstrates an additive non-uniform random sampling and interrogation method for dynamic and/or static strain gauge using a reflection spectrum from two superimposed fiber Bragg gratings (FBGs). The superimposed FBGs are designed to generate non-equidistant space of a sensing pulse train in the time domain during dynamic strain gauge. By combining centroid finding with smooth filtering methods, both the interrogation speed and accuracy are improved. A 1.9 kHz dynamic strain is measured by generating an additive non-uniform randomly distributed 2 kHz optical sensing pulse train from a mean 500 Hz triangular periodically changing scanning frequency.

  18. Fixed-location hydroacoustic monitoring designs for estimating fish passage using stratified random and systematic sampling

    International Nuclear Information System (INIS)

    Skalski, J.R.; Hoffman, A.; Ransom, B.H.; Steig, T.W.

    1993-01-01

    Five alternate sampling designs are compared using 15 d of 24-h continuous hydroacoustic data to identify the most favorable approach to fixed-location hydroacoustic monitoring of salmonid outmigrants. Four alternative approaches to systematic sampling are compared among themselves and with stratified random sampling (STRS). Stratifying systematic sampling (STSYS) on a daily basis is found to reduce sampling error in multiday monitoring studies. Although sampling precision was predictable with varying levels of effort in STRS, neither magnitude nor direction of change in precision was predictable when effort was varied in systematic sampling (SYS). Furthermore, modifying systematic sampling to include replicated (e.g., nested) sampling (RSYS) is shown to provide unbiased point and variance estimates, as does STRS. Numerous short sampling intervals (e.g., 12 samples of 1-min duration per hour) must be monitored hourly using RSYS to provide efficient, unbiased point and interval estimates. For equal levels of effort, STRS outperformed all variations of SYS examined. Parametric approaches to confidence interval estimates are found to be superior to nonparametric interval estimates (i.e., bootstrap and jackknife) in estimating total fish passage. 10 refs., 1 fig., 8 tabs

  19. Sequential automated fusion/extraction chromatography methodology for the dissolution of uranium in environmental samples for mass spectrometric determination

    Energy Technology Data Exchange (ETDEWEB)

    Milliard, Alex; Durand-Jezequel, Myriam [Laboratoire de Radioecologie, Departement de chimie, Universite Laval, 1045 Avenue de la Medecine, Quebec, QC, G1V 0A6 (Canada); Lariviere, Dominic, E-mail: dominic.lariviere@chm.ulaval.ca [Laboratoire de Radioecologie, Departement de chimie, Universite Laval, 1045 Avenue de la Medecine, Quebec, QC, G1V 0A6 (Canada)

    2011-01-17

    An improved methodology has been developed, based on dissolution by automated fusion followed by extraction chromatography, for the detection and quantification of uranium in environmental matrices by mass spectrometry. A rapid fusion protocol (<8 min) was investigated for the complete dissolution of various samples. It could be preceded, if required, by an effective ashing procedure using the M4 fluxer and a newly designed platinum lid. Complete dissolution of the sample was observed and measured using standard reference materials (SRMs), and experimental data show no evidence of cross-contamination of crucibles when LiBO2/LiBr melts were used. The use of an M4 fusion unit also improved repeatability in sample preparation over muffle furnace fusion. Instrumental issues originating from the presence of high salt concentrations in the digestate after lithium metaborate fusion were also mitigated using an extraction chromatography (EXC) protocol aimed at removing lithium and interfering matrix constituents prior to the elution of uranium. The sequential methodology, which can be performed simultaneously on three samples, requires less than 20 min per sample for fusion and separation. It was successfully coupled to inductively coupled plasma mass spectrometry (ICP-MS), achieving detection limits below 100 pg kg(-1) for 5-300 mg of sample.

  20. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    Science.gov (United States)

    Yashchuk, Valeriy V.; Conley, Raymond; Anderson, Erik H.; Barber, Samuel K.; Bouet, Nathalie; McKinney, Wayne R.; Takacs, Peter Z.; Voronov, Dmitriy L.

    2011-09-01

    Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested [1,2] and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.

  1. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    International Nuclear Information System (INIS)

    Yashchuk, Valeriy V.; Conley, Raymond; Anderson, Erik H.; Barber, Samuel K.; Bouet, Nathalie; McKinney, Wayne R.; Takacs, Peter Z.; Voronov, Dmitriy L.

    2011-01-01

    Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.

  2. LOD score exclusion analyses for candidate genes using random population samples.

    Science.gov (United States)

    Deng, H W; Li, J; Recker, R R

    2001-05-01

    While extensive analyses have been conducted to test for the importance of candidate genes with random population samples, no formal analyses have been conducted to test against it. We develop a LOD score approach for exclusion analyses of candidate genes with random population samples. Under this approach, specific genetic effects and inheritance models at candidate genes can be analysed, and if a LOD score is ≤ -2.0, the locus can be excluded from having an effect larger than that specified. Computer simulations show that, with sample sizes often employed in association studies, this approach has high power to exclude a gene from having moderate genetic effects. In contrast to regular association analyses, population admixture will not affect the robustness of our analyses; in fact, it renders our analyses more conservative and thus any significant exclusion result is robust. Our exclusion analysis complements association analysis for candidate genes in random population samples and is parallel to the exclusion mapping analyses that may be conducted in linkage analyses with pedigrees or relative pairs. The usefulness of the approach is demonstrated by an application to test the importance of the vitamin D receptor and estrogen receptor genes underlying the differential risk to osteoporotic fractures.

  3. Path integral methods for primordial density perturbations - sampling of constrained Gaussian random fields

    International Nuclear Information System (INIS)

    Bertschinger, E.

    1987-01-01

    Path integrals may be used to describe the statistical properties of a random field such as the primordial density perturbation field. In this framework the probability distribution is given for a Gaussian random field subjected to constraints such as the presence of a protovoid or supercluster at a specific location in the initial conditions. An algorithm has been constructed for generating samples of a constrained Gaussian random field on a lattice using Monte Carlo techniques. The method makes possible a systematic study of the density field around peaks or other constrained regions in the biased galaxy formation scenario, and it is effective for generating initial conditions for N-body simulations with rare objects in the computational volume. 21 references
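
    A sketch of the basic conditioning step on a small lattice, assuming the familiar Hoffman-Ribak-style construction for a single linear constraint (the paper's path-integral formulation and Monte Carlo details are not reproduced): draw an unconstrained realization, then correct it so the constraint, here a fixed peak height, holds exactly.

      import numpy as np

      rng = np.random.default_rng(0)

      # Toy one-dimensional 'density field' with squared-exponential covariance.
      n = 64
      x = np.arange(n)
      C = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 5.0) ** 2)

      # Unconstrained Gaussian realization via Cholesky (jitter for stability).
      L = np.linalg.cholesky(C + 1e-10 * np.eye(n))
      field = L @ rng.standard_normal(n)

      # Enforce one linear constraint a.x = c: a peak of height 3 at site 32.
      a = np.zeros(n)
      a[32] = 1.0
      c = 3.0
      field_c = field + C @ a * (c - a @ field) / (a @ C @ a)
      print(field_c[32])   # 3.0 up to round-off; correlations elsewhere kept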

  4. Concrete crushing and sampling, a methodology and technology for the unconditional release of concrete material from decommissioning

    International Nuclear Information System (INIS)

    Baumann, S.; Teunckens, L.; Walthery, R.; Lewandowski, P.; Millen, D.

    2002-01-01

    Belgoprocess started the industrial decommissioning of the main process building of the former Eurochemic reprocessing plant in 1990, after completion of a pilot project. Two small storage buildings for final products from reprocessing were dismantled to verify the assumptions made in a previous paper study on decommissioning, to demonstrate and develop dismantling techniques and to train personnel. Both buildings were emptied and decontaminated to background levels. They were demolished, the remaining concrete debris was disposed of as industrial waste, and green-field conditions were restored. Currently, the decommissioning operations carried out at the main building have made substantial progress. They are executed on an industrial scale and will continue until the end of 2005. In view of the final demolition of the building, a clearance methodology has to be proposed. Applying the methodology used for the storage buildings of the pilot project is complicated for several reasons. Although this methodology is not rejected as such, an alternative has been studied thoroughly. It considers at least one complete measurement of all concrete structures and the removal of all detected residual radioactivity. This monitoring sequence is followed by a controlled demolition of the concrete structures and crushing of the resulting concrete parts to smaller particles. During the crushing operations, metal parts are separated from the concrete and representative concrete samples are taken. The frequency of sampling meets the prevailing standards. In a further step, the concrete samples are milled and homogenised, and a smaller fraction is sent to the laboratory for analyses. The paper describes the developed concrete crushing and sampling methodology. (authors)

  5. Investigation of Super Learner Methodology on HIV-1 Small Sample: Application on Jaguar Trial Data.

    Science.gov (United States)

    Houssaïni, Allal; Assoumou, Lambert; Marcelin, Anne Geneviève; Molina, Jean Michel; Calvez, Vincent; Flandre, Philippe

    2012-01-01

    Background. Many statistical models have been tested to predict phenotypic or virological response from genotypic data. A statistical framework called Super Learner has been introduced either to compare different methods/learners (discrete Super Learner) or to combine them in a Super Learner prediction method. Methods. The Jaguar trial is used to apply the Super Learner framework. The Jaguar study is an "add-on" trial comparing the efficacy of adding didanosine to an ongoing failing regimen. Our aim was also to investigate the impact of using different cross-validation strategies and different loss functions. Four different splits between training and validation sets were tested with two loss functions. Six statistical methods were compared. We assessed performance by evaluating R² values and accuracy by calculating the rates of patients correctly classified. Results. Our results indicated that the more recent Super Learner methodology of building a new predictor based on a weighted combination of different methods/learners provided good performance. A simple linear model provided results similar to those of this new predictor. Slight discrepancies arise between the two loss functions investigated, and slight differences arise between results based on cross-validated risks and results from the full dataset. The Super Learner methodology and the linear model correctly classified around 80% of patients. The difference between the lower and higher rates is around 10 percent. The number of mutations retained in different learners also varies from 1 to 41. Conclusions. The more recent Super Learner methodology, combining the predictions of many learners, provided good performance on our small dataset.

  6. Use of FTA® card methodology for sampling and molecular characterization of Echinococcus granulosus sensu lato in Africa.

    Science.gov (United States)

    Boué, Franck; El Berbri, Ikhlass; Hormaz, Vanessa; Boucher, Jean-Marc; El Mamy, Ahmed Bezeid; Traore, Abdallah; Fihri, Ouafaa Fassi; Petavy, Anne-Françoise; Dakkak, Allal; Umhang, Gérald

    2017-02-01

    Cystic echinococcosis is a parasitic disease caused by the cestode Echinococcus granulosus, widely distributed in Africa. Monitoring of this parasite requires access to cyst samples from intermediate hosts observed at the slaughterhouse. In order to facilitate sampling in the field and subsequent analysis, the French National Reference Laboratory for Echinococcus spp. has developed a tissue-derived DNA sampling method based on FTA® card technology. The DNA samples were taken by applying the FTA® paper to the germinal layer after opening the cysts. The sampling technique was validated using frozen cysts (n = 76) stored in the laboratory and field samples (n = 134) taken at the slaughterhouse by veterinary technicians during meat inspection in Morocco, Mali and Mauritania. DNA was extracted after several weeks of storage at room temperature. PCR assays were performed using generic cestode primers (cox1) and amplified fragments were sequenced. All samples taken in the lab, and 80% of field samples, yielded DNA suitable for molecular characterization. Cyst-derived DNA from FTA® samples allows easy sampling and storage and rapid, safe and cheap shipment. The use of the FTA® methodology will facilitate field studies investigating the presence and genetic characterization of E. granulosus sensu lato in African countries. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Accounting for Sampling Error in Genetic Eigenvalues Using Random Matrix Theory.

    Science.gov (United States)

    Sztepanacz, Jacqueline L; Blows, Mark W

    2017-07-01

    The distribution of genetic variance in multivariate phenotypes is characterized by the empirical spectral distribution of the eigenvalues of the genetic covariance matrix. Empirical estimates of genetic eigenvalues from random effects linear models are known to be overdispersed by sampling error, where large eigenvalues are biased upward and small eigenvalues are biased downward. The overdispersion of the leading eigenvalues of sample covariance matrices has been demonstrated to conform to the Tracy-Widom (TW) distribution. Here we show that genetic eigenvalues estimated using restricted maximum likelihood (REML) in a multivariate random effects model with an unconstrained genetic covariance structure will also conform to the TW distribution after empirical scaling and centering. However, where estimation procedures using either REML or MCMC impose boundary constraints, the resulting genetic eigenvalues tend not to be TW distributed. We show how using confidence intervals from sampling distributions of genetic eigenvalues without reference to the TW distribution is insufficient protection against mistaking sampling error for genetic variance, particularly when eigenvalues are small. By scaling such sampling distributions to the appropriate TW distribution, the critical value of the TW statistic can be used to determine whether the magnitude of a genetic eigenvalue exceeds the sampling error for each eigenvalue in the spectral distribution of a given genetic covariance matrix. Copyright © 2017 by the Genetics Society of America.
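
    The overdispersion itself is easy to reproduce: even when every true eigenvalue equals one, the sample covariance spectrum spreads out. A quick demonstration (a plain sample-covariance setting, not the paper's REML analysis):

      import numpy as np

      rng = np.random.default_rng(1)
      p, n, reps = 10, 50, 500   # traits, individuals, replicate samples

      top, bottom = [], []
      for _ in range(reps):
          X = rng.standard_normal((n, p))          # true covariance = identity
          ev = np.linalg.eigvalsh(np.cov(X, rowvar=False))
          top.append(ev[-1])
          bottom.append(ev[0])

      # Sampling error alone biases the leading eigenvalue upward and the
      # smallest downward, the overdispersion described above.
      print(np.mean(top), np.mean(bottom))   # markedly above and below 1.0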

  8. Methodology of testing environmental samples from the area surrounding radioactive waste deposits

    International Nuclear Information System (INIS)

    Kropikova, S.; Pastuchova, D.

    1979-01-01

    Methods are described for investigating environmental samples from the area surrounding radioactive waste deposits, namely monitoring of ground water, surface water, sediments, water flows and catchments, vegetation and soil. Methods of sample preparation and methods for determining radionuclides in mixtures are also discussed, as are spot activity measurement methods. (author)

  9. An automated laboratory-scale methodology for the generation of sheared mammalian cell culture samples.

    Science.gov (United States)

    Joseph, Adrian; Goldrick, Stephen; Mollet, Michael; Turner, Richard; Bender, Jean; Gruber, David; Farid, Suzanne S; Titchener-Hooker, Nigel

    2017-05-01

    Continuous disk-stack centrifugation is typically used for the removal of cells and cellular debris from mammalian cell culture broths at manufacturing scale. The use of scale-down methods to characterise disk-stack centrifugation performance enables substantial reductions in material requirements and allows a much wider design space to be tested than is currently possible at pilot scale. The process of scaling down centrifugation has historically been challenging due to the difficulties in mimicking the Energy Dissipation Rates (EDRs) in typical machines. This paper describes an alternative and easy-to-assemble automated capillary-based methodology to generate levels of EDRs consistent with those found in a continuous disk-stack centrifuge. Variations in EDR were achieved through changes in capillary internal diameter and the flow rate of operation through the capillary. The EDRs found to match the levels of shear in the feed zone of a pilot-scale centrifuge using the experimental method developed in this paper (2.4×10⁵ W/kg) are consistent with those obtained through previously published computational fluid dynamic (CFD) studies (2.0×10⁵ W/kg). Furthermore, this methodology can be incorporated into existing scale-down methods to model the process performance of continuous disk-stack centrifuges. This was demonstrated through the characterisation of the effects of culture hold time, culture temperature and EDRs on centrate quality. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. A call to improve sampling methodology and reporting in young novice driver research.

    Science.gov (United States)

    Scott-Parker, B; Senserrick, T

    2017-02-01

    Young drivers continue to be over-represented in road crash fatalities despite a multitude of research, communication and intervention. Evidence-based improvement depends to a great extent upon research methodology quality and its reporting, with known limitations in the peer-review process. The aim of the current research was to review the scope of research methodologies applied in 'young driver' and 'teen driver' research and their reporting in four peer-review journals in the field between January 2006 and December 2013. In total, 806 articles were identified and assessed. Reporting omissions included participant gender (11% of papers), response rates (49%), retention rates (39%) and information regarding incentives (44%). Greater breadth and specific improvements in study designs and reporting are thereby identified as a means to further advance the field. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  11. A simple and efficient alternative to implementing systematic random sampling in stereological designs without a motorized microscope stage.

    Science.gov (United States)

    Melvin, Neal R; Poda, Daniel; Sutherland, Robert J

    2007-10-01

    When properly applied, stereology is a very robust and efficient method to quantify a variety of parameters from biological material. A common sampling strategy in stereology is systematic random sampling, which involves choosing a random start point outside the structure of interest and sampling relevant objects at sites placed at pre-determined, equidistant intervals. This has proven to be a very efficient sampling strategy, and is used widely in stereological designs. At the microscopic level, this is most often achieved through the use of a motorized stage that facilitates the systematic random stepping across the structure of interest. Here, we report a simple, precise and cost-effective software-based alternative for accomplishing systematic random sampling under the microscope. We believe that this approach will facilitate the use of stereological designs that employ systematic random sampling in laboratories that lack the resources to acquire costly, fully automated systems.
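
    The strategy reduces to one random start plus equidistant steps; a one-axis sketch of the site positions such software would visit (extent and site count are hypothetical):

      import random

      def systematic_positions(extent, n_sites, rng=random):
          # One random start inside the first interval, then equidistant steps.
          step = extent / n_sites
          start = rng.uniform(0.0, step)
          return [start + i * step for i in range(n_sites)]

      rng = random.Random(3)
      print([round(p, 1) for p in systematic_positions(1000.0, 8, rng)])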

  12. Tobacco smoking surveillance: is quota sampling an efficient tool for monitoring national trends? A comparison with a random cross-sectional survey.

    Directory of Open Access Journals (Sweden)

    Romain Guignard

    OBJECTIVES: It is crucial for policy makers to monitor the evolution of tobacco smoking prevalence. In France, this monitoring is based on a series of cross-sectional general population surveys, the Health Barometers, conducted every five years and based on random samples. A methodological study has been carried out to assess the reliability of a monitoring system for smoking prevalence based on regular quota sampling surveys. DESIGN / OUTCOME MEASURES: In 2010, current and daily tobacco smoking prevalences obtained in a quota survey of 8,018 people were compared with those of the 2010 Health Barometer carried out on 27,653 people. Prevalences were assessed separately according to the telephone equipment of the interviewee (landline phone owner vs "mobile-only"), and logistic regressions were conducted in the pooled database to assess the impact of the telephone equipment and of the survey mode on the prevalences found. Finally, logistic regressions adjusted for sociodemographic characteristics were conducted in the random sample in order to determine the impact of the number of calls needed to interview "hard-to-reach" people on the prevalence found. RESULTS: Current and daily prevalences were higher in the random sample (respectively 33.9% and 27.5% among 15-75 year-olds) than in the quota sample (respectively 30.2% and 25.3%). In both surveys, current and daily prevalences were lower among landline phone owners (respectively 31.8% and 25.5% in the random sample and 28.9% and 24.0% in the quota survey). The required number of calls was slightly related to smoking status after adjustment for sociodemographic characteristics. CONCLUSION: Random sampling appears to be more effective than quota sampling, mainly by making it possible to interview hard-to-reach populations.

  13. Precision of systematic and random sampling in clustered populations: habitat patches and aggregating organisms.

    Science.gov (United States)

    McGarvey, Richard; Burch, Paul; Matthews, Janet M

    2016-01-01

    Natural populations of plants and animals spatially cluster because (1) suitable habitat is patchy, and (2) within suitable habitat, individuals aggregate further into clusters of higher density. We compare the precision of random and systematic field sampling survey designs under these two processes of species clustering. Second, we evaluate the performance of 13 estimators for the variance of the sample mean from a systematic survey. Replicated simulated surveys, as counts from 100 transects allocated either randomly or systematically within the study region, were used to estimate population density in six spatial point populations including habitat patches and Matérn circular clustered aggregations of organisms, together and in combination. The standard one-start aligned systematic survey design, a uniform 10 × 10 grid of transects, was much more precise. Variances of the 10,000 replicated systematic survey mean densities were one-third to one-fifth of those from randomly allocated transects, implying transect sample sizes giving equivalent precision by random survey would need to be three to five times larger. Organisms being restricted to patches of habitat was alone sufficient to yield this precision advantage for the systematic design. But this improved precision for systematic sampling in clustered populations is underestimated by the standard variance estimators used to compute confidence intervals. True variance for the survey sample mean was computed from the variance of 10,000 simulated survey mean estimates. Testing 10 published and three newly proposed variance estimators, the two estimators (v8 and vW) that corrected for inter-transect correlation were the most accurate and also the most precise in clustered populations. These greatly outperformed the two "post-stratification" variance estimators (v2 and v3) that are now more commonly applied in systematic surveys. Similar variance estimator performance rankings were found with

  14. Study of radioelements carried by the Rhône River to the Mediterranean Sea: sampling strategy and methodology

    International Nuclear Information System (INIS)

    Arnaud, M.; Charmasson, S.; Calmet, D.; Fernandez, J.M.

    1992-01-01

    This paper describes the methods used for sampling water and sediments in rivers and at sea. The purpose is to study the migration of radionuclides (cesium-134, cesium-137) in the Mediterranean Sea (Gulf of Lion). 20 refs., 11 figs., 1 tab

  15. Iterative algorithm of discrete Fourier transform for processing randomly sampled NMR data sets

    International Nuclear Information System (INIS)

    Stanek, Jan; Kozminski, Wiktor

    2010-01-01

    Spectra obtained by applying multidimensional Fourier transformation (MFT) to sparsely sampled nD NMR signals are usually corrupted due to missing data. In the present paper this phenomenon is investigated through simulations and experiments. An effective iterative algorithm for artifact suppression for sparse on-grid NMR data sets is discussed in detail. It includes automated peak recognition based on statistical methods. The results enable one to study NMR spectra with a high dynamic range of peak intensities while preserving the benefits of random sampling, namely the superior resolution in indirectly measured dimensions. Experimental examples include 3D ¹⁵N- and ¹³C-edited NOESY-HSQC spectra of human ubiquitin.

  16. Technical Note: New methodology for measuring viscosities in small volumes characteristic of environmental chamber particle samples

    Directory of Open Access Journals (Sweden)

    L. Renbaum-Wolff

    2013-01-01

    Herein, a method for the determination of viscosities of small sample volumes is introduced, with important implications for the viscosity determination of particle samples from environmental chambers (used to simulate atmospheric conditions). The amount of sample needed is < 1 μl, and the technique is capable of determining viscosities (η) ranging between 10⁻³ and 10³ Pascal seconds (Pa s) in samples that cover a range of chemical properties, with real-time relative humidity and temperature control; hence, the technique should be well-suited for determining the viscosities, under atmospherically relevant conditions, of particles collected from environmental chambers. In this technique, supermicron particles are first deposited on an inert hydrophobic substrate. Then, insoluble beads (~1 μm in diameter) are embedded in the particles. Next, a flow of gas is introduced over the particles, which generates a shear stress on the particle surfaces. The sample responds to this shear stress by generating internal circulations, which are quantified with an optical microscope by monitoring the movement of the beads. The rate of internal circulation is shown to be a function of particle viscosity but independent of the particle material for a wide range of organic and organic-water samples. A calibration curve is constructed from the experimental data that relates the rate of internal circulation to particle viscosity, and this calibration curve is successfully used to predict viscosities in multicomponent organic mixtures.

  17. A methodology for more efficient tail area sampling with discrete probability distribution

    International Nuclear Information System (INIS)

    Park, Sang Ryeol; Lee, Byung Ho; Kim, Tae Woon

    1988-01-01

    The Monte Carlo method is commonly used to observe an overall distribution and to determine lower or upper bound values in a statistical approach when direct analytical calculation is unavailable. However, this method is inefficient when the tail area of a distribution is of concern. A new method, entitled 'Two Step Tail Area Sampling', is developed; it assumes a discrete probability distribution and samples only the tail area without distorting the overall distribution. The method uses a two-step sampling procedure: first, sampling is done at points separated by large intervals; second, sampling is done at points separated by small intervals, with check points determined from the first-step sampling. Comparison with the Monte Carlo method shows that results from the new method converge to the analytic value faster than the Monte Carlo method for the same number of calculations. This new method is applied to the DNBR (Departure from Nucleate Boiling Ratio) prediction problem in the design of a pressurized light-water nuclear reactor.
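
    A sketch of the two-step idea on a discrete distribution with a decreasing tail: a coarse pass locates the last check point above a threshold, and a fine pass re-scans only the bracketed interval. The toy pmf and spacings are assumptions; the report's DNBR application and exact bookkeeping are not reproduced.

      def two_step_tail_scan(pmf, support, threshold, coarse=50, fine=1):
          # Step 1: coarse pass, probing the support at wide spacing to find
          # the last 'check point' whose probability is still >= threshold.
          anchor = support[0]
          for x in support[::coarse]:
              if pmf(x) >= threshold:
                  anchor = x
          # Step 2: fine pass restricted to one coarse interval past the
          # anchor (assumes the pmf decreases through the tail).
          lo = support.index(anchor)
          fine_pts = support[lo:lo + coarse + 1:fine]
          return [x for x in fine_pts if pmf(x) < threshold]

      # Toy geometric-like pmf on 0..999.
      pmf = lambda k: 0.05 * (0.95 ** k)
      tail = two_step_tail_scan(pmf, list(range(1000)), threshold=1e-4)
      print(tail[0])   # first support point where the pmf drops below 1e-4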

  18. Randomized branch sampling to estimate fruit production in pecan trees cv. 'Barton'

    Directory of Open Access Journals (Sweden)

    Filemom Manoel Mokochinski

    ABSTRACT: Sampling techniques to quantify fruit production are still very scarce, creating a gap in crop development research. This study was conducted on a rural property in the municipality of Cachoeira do Sul - RS to estimate the efficiency of randomized branch sampling (RBS) in quantifying pecan fruit production at three different tree ages (5, 7 and 10 years). Two selection techniques were tested: probability proportional to diameter (PPD) and uniform probability (UP), performed on nine trees, three of each age, randomly chosen. The RBS underestimated fruit production for all ages, and its main drawback was the high sampling error (125.17% for PPD and 111.04% for UP). The UP was regarded as more efficient than the PPD, though both techniques estimated similar production and similar experimental errors. In conclusion, branch sampling was inaccurate for this case study, and new studies are required to produce estimates with smaller sampling error.
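
    One randomized-branch-sampling path can be sketched as below, choosing each branch with probability proportional to squared diameter (one common PPD size measure; the study's exact measure is not stated in the abstract) and dividing the terminal fruit count by the path's selection probability, which makes the estimate unbiased for the tree total. The toy tree is hypothetical.

      import random

      def rbs_estimate(tree, rng=random):
          # Walk one random path from the trunk to a terminal shoot.
          node, prob = tree, 1.0
          while node["children"]:
              weights = [child["diam"] ** 2 for child in node["children"]]
              node = rng.choices(node["children"], weights=weights)[0]
              prob *= node["diam"] ** 2 / sum(weights)
          # Horvitz-Thompson style expansion of the terminal count.
          return node["fruit"] / prob

      # Hypothetical tree: diameters in cm, fruit on terminal shoots only.
      tree = {"diam": 20, "children": [
          {"diam": 12, "fruit": 30, "children": []},
          {"diam": 8, "children": [
              {"diam": 5, "fruit": 10, "children": []},
              {"diam": 4, "fruit": 6, "children": []}]}]}
      rng = random.Random(5)
      print(round(rbs_estimate(tree, rng), 1))   # unbiased for the total of 46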

  19. Revisiting random walk based sampling in networks: evasion of burn-in period and frequent regenerations.

    Science.gov (United States)

    Avrachenkov, Konstantin; Borkar, Vivek S; Kadavankandy, Arun; Sreedharan, Jithin K

    2018-01-01

    In the framework of network sampling, random walk (RW) based estimation techniques provide many pragmatic solutions while uncovering the unknown network as little as possible. Despite several theoretical advances in this area, RW based sampling techniques usually make a strong assumption that the samples are in the stationary regime, and hence are impelled to leave out the samples collected during the burn-in period. This work proposes two sampling schemes without the burn-in time constraint to estimate the average of an arbitrary function defined on the network nodes, for example, the average age of users in a social network. The central idea of the algorithms lies in exploiting regeneration of RWs at revisits to an aggregated super-node or to a set of nodes, and in strategies to enhance the frequency of such regenerations either by contracting the graph or by making the hitting set larger. Our first algorithm, which is based on reinforcement learning (RL), uses stochastic approximation to derive an estimator. This method can be seen as intermediate between purely stochastic Markov chain Monte Carlo iterations and deterministic relative value iterations. The second algorithm, which we call the Ratio with Tours (RT)-estimator, is a modified form of respondent-driven sampling (RDS) that accommodates the idea of regeneration. We study the methods via simulations on real networks. We observe that the trajectories of the RL-estimator are much more stable than those of standard random walk based estimation procedures, and its error performance is comparable to that of respondent-driven sampling (RDS), which has a smaller asymptotic variance than many other estimators. Simulation studies also show that the mean squared error of the RT-estimator decays much faster than that of RDS with time. The newly developed RW based estimators (RL- and RT-estimators) make it possible to avoid the burn-in period, provide better control of stability along the sample path, and overall reduce the estimation time. Our
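
    A sketch in the spirit of the tour idea: walks that begin and end at an anchor (super-)node regenerate at each return, so no burn-in is discarded, and a degree-weighted ratio corrects the random walk's bias toward high-degree nodes. The graph and 'age' attribute are toy inputs, not the paper's algorithms.

      import random

      def tour_ratio_estimate(adj, f, anchor, n_tours, rng=random):
          # Estimate the plain node-average of f via regenerating tours:
          # sum of f(v)/deg(v) over visits divided by sum of 1/deg(v).
          num = den = 0.0
          for _ in range(n_tours):
              v = anchor
              while True:
                  num += f(v) / len(adj[v])
                  den += 1.0 / len(adj[v])
                  v = rng.choice(adj[v])
                  if v == anchor:          # tour ends: regeneration point
                      break
          return num / den

      # Toy graph (node -> neighbours) with an 'age' attribute to average.
      adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2]}
      age = {0: 20, 1: 30, 2: 40, 3: 50}
      rng = random.Random(11)
      print(round(tour_ratio_estimate(adj, age.get, 0, 2000, rng), 1))  # ~35.0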

  20. Extraction of ochratoxin A in bread samples by the QuEChERS methodology.

    Science.gov (United States)

    Paíga, Paula; Morais, Simone; Oliva-Teles, Teresa; Correia, Manuela; Delerue-Matos, Cristina; Duarte, Sofia C; Pena, Angelina; Lino, Celeste Matos

    2012-12-15

    A QuEChERS method for the extraction of ochratoxin A (OTA) from bread samples was evaluated. A 2³ factorial design was used to find the optimal QuEChERS parameters (extraction time, extraction solvent volume and sample mass). Extracts were analysed by LC with fluorescence detection. The optimal extraction conditions were: 5 g of sample, 15 mL of acetonitrile and 3 min of agitation. The extraction procedure was validated by systematic recovery experiments at three levels. The recoveries obtained ranged from 94.8% (at 1.0 μg kg(-1)) to 96.6% (at 3.0 μg kg(-1)). The limit of quantification of the method was 0.05 μg kg(-1). The optimised procedure was applied to 20 samples of different bread types ("Carcaça", "Broa de Milho", and "Broa de Avintes") highly consumed in Portugal. None of the samples exceeded the established European legal limit of 3 μg kg(-1). Copyright © 2012 Elsevier Ltd. All rights reserved.
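
    A 2³ design simply crosses low and high levels of the three factors, giving eight runs; they can be enumerated directly. The low levels below are hypothetical, with the high levels set to the reported optimum (3 min, 15 mL, 5 g).

      from itertools import product

      # Low/high levels for the three QuEChERS factors in the 2^3 design.
      levels = {
          "time_min": (1, 3),
          "solvent_mL": (10, 15),
          "mass_g": (2.5, 5),
      }
      runs = list(product(*levels.values()))
      for run in runs:
          print(dict(zip(levels.keys(), run)))
      print(len(runs), "runs")   # 2**3 = 8 experimental runs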

  1. Improving ambulatory saliva-sampling compliance in pregnant women: a randomized controlled study.

    Directory of Open Access Journals (Sweden)

    Julian Moeller

    OBJECTIVE: Noncompliance with scheduled ambulatory saliva sampling is common and has been associated with biased cortisol estimates in nonpregnant subjects. This study is the first to investigate, in pregnant women, strategies to improve ambulatory saliva-sampling compliance and the association between sampling noncompliance and saliva cortisol estimates. METHODS: We instructed 64 pregnant women to collect eight scheduled saliva samples on two consecutive days each. Objective compliance with scheduled sampling times was assessed with a Medication Event Monitoring System, and self-reported compliance with a paper-and-pencil diary. In a randomized controlled study, we estimated whether a disclosure intervention (informing women about objective compliance monitoring) and a reminder intervention (use of acoustical reminders) improved compliance. A mixed model analysis was used to estimate associations between women's objective compliance and their diurnal cortisol profiles, and between deviation from scheduled sampling and the cortisol concentration measured in the related sample. RESULTS: Self-reported compliance with the saliva-sampling protocol was 91%, and objective compliance was 70%. The disclosure intervention was associated with improved objective compliance (informed: 81%, noninformed: 60%; F(1,60) = 17.64, p < 0.001), but not the reminder intervention (reminders: 68%, without reminders: 72%; F(1,60) = 0.78, p = 0.379). Furthermore, a woman's increased objective compliance was associated with a higher diurnal cortisol profile, F(2,64) = 8.22, p < 0.001. Altered cortisol levels were observed in less objectively compliant samples, F(1,705) = 7.38, p = 0.007, with delayed sampling associated with lower cortisol levels. CONCLUSIONS: The results suggest that in pregnant women, objective noncompliance with scheduled ambulatory saliva sampling is common and is associated with biased cortisol estimates. To improve sampling compliance, results suggest

  2. Concrete crushing and sampling, a methodology and technology for the unconditional release of concrete material from decommissioning

    International Nuclear Information System (INIS)

    Gills, R.; Lewandowski, P.; Ooms, B.; Reusen, N.; Van Laer, W.; Walthery, R.

    2007-01-01

    Belgoprocess started the industrial decommissioning of the main process building of the former Eurochemic reprocessing plant in 1990, after completion of a pilot project. Two small storage buildings for final products from reprocessing were dismantled to verify the assumptions made in a previous paper study on decommissioning, to demonstrate and develop dismantling techniques and to train personnel. Both buildings were emptied and decontaminated to background levels. They were demolished, the remaining concrete debris was disposed of as industrial waste, and green-field conditions were restored. Currently, the decommissioning operations carried out at the main building have made substantial progress. They are executed on an industrial scale. In view of the final demolition of the building, foreseen to start in the middle of 2008, a clearance methodology has been developed for the concrete from the cells of the Eurochemic building. It considers at least one complete measurement of all concrete structures and the removal of all detected residual radionuclides. This monitoring sequence is followed by a controlled demolition of the concrete structures and crushing of the resulting concrete parts to smaller particles. During the crushing operations, metal parts are separated from the concrete and representative concrete samples are taken. The frequency of sampling meets the prevailing standards. In a further step, the concrete samples are milled and homogenised, and a smaller fraction is sent to the laboratory for analyses. The paper describes the developed concrete crushing and sampling methodology. (authors)

  3. A Comparison of Online versus On-site Training in Health Research Methodology: A Randomized Study

    Directory of Open Access Journals (Sweden)

    Kanchanaraksa Sukon

    2011-06-01

    Background: Distance learning may be useful for building health research capacity. However, evidence that it can improve knowledge and skills in health research, particularly in resource-poor settings, is limited. We compared the impact and acceptability of teaching two distinct content areas, Biostatistics and Research Ethics, through either an on-line distance learning format or traditional on-site training, in a randomized study in India. Our objective was to determine whether on-line courses in Biostatistics and Research Ethics could achieve improvements in knowledge similar to those of traditional on-site, classroom-based courses. Methods: Subjects: Volunteer Indian scientists were randomly assigned to one of two arms. Intervention: Students in Arm 1 attended a 3.5-day on-site course in Biostatistics and completed a 3.5-week on-line course in Research Ethics. Students in Arm 2 attended a 3.5-week on-line course in Biostatistics and a 3.5-day on-site course in Research Ethics. For the two course formats, learning objectives, course contents and knowledge tests were identical. Main Outcome Measures: Improvement in knowledge immediately and 3 months after course completion, compared to baseline. Results: Baseline characteristics were similar in both arms (n = 29 each). Median knowledge score for Biostatistics increased from a baseline of 49% to 64% (p … Conclusion: On-line and on-site training formats led to marked and similar improvements of knowledge in Biostatistics and Research Ethics. This, combined with the logistical and cost advantages of on-line training, may make on-line courses particularly useful for expanding health research capacity in resource-limited settings.

  4. Development of the methodology of sample preparation to X-ray diffractometry of clay minerals at Petrobras Research Center

    International Nuclear Information System (INIS)

    Alves, D.B.

    1987-01-01

    Various procedures can be used in the analysis of the clay mineral content of rocks by X-ray diffraction. This article describes the principal ones and discusses those adopted in the X-ray clay mineral laboratory of the PETROBRAS Research Center (CENPES) in Rio de Janeiro. It presents the methodology used and provides users with information about its application and limitations. The methodology has been developed to study polymineral samples. The aim is to identify clay mineral groups and to estimate their relative proportions. Of the four main steps of this analysis - separation and concentration of clay minerals, preparation of oriented specimens, X-ray irradiation under standard conditions and interpretation of X-ray diffraction patterns - only the first three are discussed here

  5. Assessment of Psychopathic Traits in an Incarcerated Adolescent Sample: A Methodological Comparison

    Science.gov (United States)

    Fink, Brandi C.; Tant, Adam S.; Tremba, Katherine; Kiehl, Kent A.

    2012-01-01

    Analyses of convergent validity and group assignment using self-report, caregiver-report and interview-based measures of adolescent psychopathy were conducted in a sample of 160 incarcerated adolescents. Results reveal significant convergent validity between caregiver-report measures of adolescent psychopathy, significant convergent validity…

  6. The Expression of Adult ADHD Symptoms in Daily Life: An Application of Experience Sampling Methodology

    Science.gov (United States)

    Knouse, Laura E.; Mitchell, John T.; Brown, Leslie H.; Silvia, Paul J.; Kane, Michael J.; Myin-Germeys, Inez; Kwapil, Thomas R.

    2008-01-01

    Objective: To use experience sampling method (ESM) to examine the impact of inattentive and hyperactive-impulsive ADHD symptoms on emotional well-being, activities and distress, cognitive impairment, and social functioning assessed in the daily lives of young adults. The impact of subjective appraisals on their experiences is also examined.…

  7. Investigating causal associations between use of nicotine, alcohol, caffeine and cannabis: a two-sample bidirectional Mendelian randomization study.

    Science.gov (United States)

    Verweij, Karin J H; Treur, Jorien L; Vink, Jacqueline M

    2018-07-01

    Epidemiological studies consistently show co-occurrence of use of different addictive substances. Whether these associations are causal or due to overlapping underlying influences remains an important question in addiction research. Methodological advances have made it possible to use published genetic associations to infer causal relationships between phenotypes. In this exploratory study, we used Mendelian randomization (MR) to examine the causality of well-established associations between nicotine, alcohol, caffeine and cannabis use. Two-sample MR was employed to estimate bidirectional causal effects between four addictive substances: nicotine (smoking initiation and cigarettes smoked per day), caffeine (cups of coffee per day), alcohol (units per week) and cannabis (initiation). Based on existing genome-wide association results we selected genetic variants associated with the exposure measure as an instrument to estimate causal effects. Where possible we applied sensitivity analyses (MR-Egger and weighted median) more robust to horizontal pleiotropy. Most MR tests did not reveal causal associations. There was some weak evidence for a causal positive effect of genetically instrumented alcohol use on smoking initiation and of cigarettes per day on caffeine use, but these were not supported by the sensitivity analyses. There was also some suggestive evidence for a positive effect of alcohol use on caffeine use (only with MR-Egger) and smoking initiation on cannabis initiation (only with weighted median). None of the suggestive causal associations survived corrections for multiple testing. Two-sample Mendelian randomization analyses found little evidence for causal relationships between nicotine, alcohol, caffeine and cannabis use. © 2018 Society for the Study of Addiction.
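
    As an illustration of the core calculation behind two-sample MR, the sketch below computes per-variant Wald ratios and a fixed-effect inverse-variance weighted (IVW) estimate from summary statistics. All numbers are hypothetical placeholders, not values from the study, and the study's MR-Egger and weighted-median sensitivity analyses are not reproduced here.

        import numpy as np

        # Hypothetical summary statistics for 4 instrument SNPs:
        # SNP-exposure betas (e.g., alcohol units/week) and SNP-outcome
        # betas with standard errors (e.g., smoking initiation).
        beta_exp = np.array([0.042, 0.031, 0.055, 0.027])
        beta_out = np.array([0.010, 0.006, 0.014, 0.004])
        se_out = np.array([0.004, 0.003, 0.005, 0.003])

        # Wald ratio: the causal effect implied by each SNP alone
        wald = beta_out / beta_exp

        # Fixed-effect inverse-variance weighting across instruments
        weights = (beta_exp / se_out) ** 2
        ivw = np.sum(weights * wald) / np.sum(weights)
        se_ivw = 1.0 / np.sqrt(np.sum(weights))
        print(f"IVW estimate: {ivw:.3f} (SE {se_ivw:.3f})")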

  8. Analytical Methodology for the Determination of Radium Isotopes in Environmental Samples

    International Nuclear Information System (INIS)

    2010-01-01

    Reliable, comparable and 'fit for purpose' results are an essential requirement for any decision based on analytical measurements. For the analyst, the availability of tested and validated analytical procedures is an extremely important tool for production of such analytical measurements. For maximum utility, such procedures should be comprehensive, clearly formulated, and readily available to both the analyst and the customer for reference. Since 2004, the environment programme of the IAEA has included activities aimed at the development of a set of procedures for the determination of radionuclides in terrestrial environmental samples. Measurements of radium isotopes are important for radiological and environmental protection, geochemical and geochronological investigations, hydrology, etc. The suite of isotopes creates and stimulates continuing interest in the development of new methods for determination of radium in various media. In this publication, the four most routinely used analytical methods for radium determination in biological and environmental samples, i.e. alpha spectrometry, gamma spectrometry, liquid scintillation spectrometry and mass spectrometry, are reviewed

  9. Validated methodology for quantifying infestation levels of dreissenid mussels in environmental DNA (eDNA) samples.

    Science.gov (United States)

    Peñarrubia, Luis; Alcaraz, Carles; Vaate, Abraham Bij de; Sanz, Nuria; Pla, Carles; Vidal, Oriol; Viñas, Jordi

    2016-12-14

    The zebra mussel (Dreissena polymorpha Pallas, 1771) and the quagga mussel (D. rostriformis Deshayes, 1838) are successful invasive bivalves with substantial ecological and economic impacts in freshwater systems once they become established. Since their eradication is extremely difficult, their detection at an early stage is crucial to prevent spread. In this study, we optimized and validated a qPCR detection method based on the histone H2B gene to quantify combined infestation levels of zebra and quagga mussels in environmental DNA samples. Our results show specific dreissenid DNA present in filtered water samples for which microscopic diagnostic identification for larvae failed. Monitoring a large number of locations for invasive dreissenid species based on a highly specific environmental DNA qPCR assay may prove to be an essential tool for management and control plans focused on prevention of establishment of dreissenid mussels in new locations.
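
    The abstract does not give the calibration details, but qPCR quantification of this kind typically rests on a standard curve relating quantification cycle (Cq) to log copy number. A minimal sketch, with hypothetical dilution-series values rather than the study's data:

        import numpy as np

        # Hypothetical dilution series of an H2B standard (copies, Cq)
        log10_copies = np.array([6.0, 5.0, 4.0, 3.0, 2.0])
        cq = np.array([16.1, 19.5, 22.9, 26.4, 29.8])

        slope, intercept = np.polyfit(log10_copies, cq, 1)
        efficiency = 10 ** (-1.0 / slope) - 1.0  # ~1.0 means 100%

        def copies_from_cq(sample_cq):
            """Invert the standard curve for an unknown water sample."""
            return 10 ** ((sample_cq - intercept) / slope)

        print(f"Efficiency: {efficiency:.1%}")
        print(f"Cq 24.7 -> {copies_from_cq(24.7):.0f} target copies")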

  10. Validation of an analytical methodology for the quantitative analysis of petroleum hydrocarbons in marine sediment samples

    Directory of Open Access Journals (Sweden)

    Eloy Yordad Companioni Damas

    2009-01-01

    Full Text Available This work describes the validation of an analytical procedure for the analysis of petroleum hydrocarbons in marine sediment samples. The proposed protocol is able to measure n-alkanes and polycyclic aromatic hydrocarbons (PAH) in samples at concentrations as low as 30 ng/g, with a precision better than 15% for most analytes. The extraction efficiency of fortified sediments varied from 65.1 to 105.6% and from 59.7 to 97.8% for n-alkanes and PAH in the ranges C16 - C32 and fluoranthene - benzo(a)pyrene, respectively. The analytical protocol was applied to determine petroleum hydrocarbons in sediments collected from a marine coastal zone.
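
    For readers unfamiliar with the validation figures quoted above, extraction efficiency (recovery) and precision are computed roughly as follows; the replicate values here are hypothetical, not the study's data:

        import numpy as np

        # Hypothetical replicates (ng/g) for a sediment fortified with
        # 100 ng/g of one n-alkane; unfortified (blank) level 12 ng/g.
        fortified = np.array([101.3, 95.8, 108.2, 99.4, 104.1])
        blank, spike = 12.0, 100.0

        recovery = (fortified.mean() - blank) / spike * 100.0
        rsd = fortified.std(ddof=1) / fortified.mean() * 100.0
        print(f"recovery = {recovery:.1f}%, RSD = {rsd:.1f}%")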

  11. Analysis of Sampling Methodologies for Noise Pollution Assessment and the Impact on the Population

    Directory of Open Access Journals (Sweden)

    Guillermo Rey Gozalo

    2016-05-01

    Full Text Available Today, noise pollution is an increasing environmental stressor. Noise maps are recognised as the main tool for assessing and managing environmental noise, but their accuracy largely depends on the sampling method used. The sampling methods most commonly used by different researchers (grid, legislative road types and categorisation methods) were analysed and compared using the city of Talca (Chile) as a test case. The results show that the stratification of sound values in road categories has a significantly lower prediction error and a higher capacity for discrimination and prediction than in the legislative road types used by the Ministry of Transport and Telecommunications in Chile. Also, the use of one or another method implies significant differences in the assessment of population exposure to noise pollution. Thus, the selection of a suitable method for performing noise maps through measurements is essential to achieve an accurate assessment of the impact of noise pollution on the population.

  12. Analysis of Sampling Methodologies for Noise Pollution Assessment and the Impact on the Population.

    Science.gov (United States)

    Rey Gozalo, Guillermo; Barrigón Morillas, Juan Miguel

    2016-05-11

    Today, noise pollution is an increasing environmental stressor. Noise maps are recognised as the main tool for assessing and managing environmental noise, but their accuracy largely depends on the sampling method used. The sampling methods most commonly used by different researchers (grid, legislative road types and categorisation methods) were analysed and compared using the city of Talca (Chile) as a test case. The results show that the stratification of sound values in road categories has a significantly lower prediction error and a higher capacity for discrimination and prediction than in the legislative road types used by the Ministry of Transport and Telecommunications in Chile. Also, the use of one or another method implies significant differences in the assessment of population exposure to noise pollution. Thus, the selection of a suitable method for performing noise maps through measurements is essential to achieve an accurate assessment of the impact of noise pollution on the population.

  13. Culturally appropriate methodology in obtaining a representative sample of South Australian Aboriginal adults for a cross-sectional population health study: challenges and resolutions.

    Science.gov (United States)

    Marin, Tania; Taylor, Anne Winifred; Grande, Eleonora Dal; Avery, Jodie; Tucker, Graeme; Morey, Kim

    2015-05-19

    The considerably lower average life expectancy of Aboriginal and Torres Strait Islander Australians, compared with non-Aboriginal and non-Torres Strait Islander Australians, has been widely reported. Prevalence data for chronic disease and health risk factors are needed to provide evidence-based estimates for Australian Aboriginal and Torres Strait Islander population health planning. Representative surveys for these populations are difficult due to complex methodology. The focus of this paper is to describe in detail the methodological challenges and resolutions of a representative South Australian Aboriginal population-based health survey. Using a stratified multi-stage sampling methodology based on the Australian Bureau of Statistics 2006 Census, with culturally appropriate and epidemiologically rigorous methods, 11,428 randomly selected dwellings were approached from a total of 209 census collection districts. All persons identifying as Aboriginal and/or Torres Strait Islander were eligible for the survey and were selected from dwellings identified as having one or more Aboriginal person(s) living there at the time of the survey. Overall, the 399 interviews from an eligible sample of 691 SA Aboriginal adults yielded a response rate of 57.7%. These face-to-face interviews were conducted by ten interviewers retained from a total of 27 trained Aboriginal interviewers. Challenges were found in three main areas: identification and recruitment of participants; interviewer recruitment and retention; and appropriate engagement with communities. These challenges were resolved, or at least largely overcome, by following local protocols with communities and their representatives, and by reaching agreement on the process of research for Aboriginal people. Obtaining a representative sample of Aboriginal participants in a culturally appropriate way was methodologically challenging and required high levels of commitment and resources.

  14. Use of NIR spectroscopy and multivariate process spectra calibration methodology for pharmaceutical solid samples analysis

    OpenAIRE

    Cárdenas Espitia, Vanessa

    2012-01-01

    Accomplishing high quality of final products in the pharmaceutical industry is a challenge that requires the control and supervision of all the manufacturing steps. This requirement created the need for fast and accurate analytical methods. Near infrared spectroscopy, together with chemometrics, fulfils this growing demand. The speed with which it provides relevant information, and the versatility of its application to different types of samples, make these combined techniques one of the most app...
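
    The record names NIR spectroscopy plus chemometrics without further detail; a common pairing in this setting is partial least squares (PLS) regression of spectra against a reference assay. A minimal sketch on synthetic stand-in data (scikit-learn's PLSRegression is an assumption here, not the thesis's stated toolchain):

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        # Synthetic stand-in: 50 samples x 200 NIR wavelengths, API
        # content (% w/w) driven by a few informative bands.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(50, 200))
        coefs = np.zeros(200)
        coefs[:10] = 0.3
        y = 10.0 + X @ coefs + rng.normal(scale=0.2, size=50)

        pls = PLSRegression(n_components=5)
        y_cv = cross_val_predict(pls, X, y, cv=10).ravel()
        rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
        print(f"RMSECV: {rmsecv:.2f} % w/w")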

  15. Acute stress symptoms during the second Lebanon war in a random sample of Israeli citizens.

    Science.gov (United States)

    Cohen, Miri; Yahav, Rivka

    2008-02-01

    The aims of this study were to assess prevalence of acute stress disorder (ASD) and acute stress symptoms (ASS) in Israel during the second Lebanon war. A telephone survey was conducted in July 2006 of a random sample of 235 residents of northern Israel, who were subjected to missile attacks, and of central Israel, who were not subjected to missile attacks. Results indicate that ASS scores were higher in the northern respondents; 6.8% of the northern sample and 3.9% of the central sample met ASD criteria. Appearance of each symptom ranged from 15.4% for dissociative to 88.4% for reexperiencing, with significant differences between northern and central respondents only for reexperiencing and arousal. A low ASD rate and a moderate difference between areas subjected and not subjected to attack were found.

  16. Distributed fiber sparse-wideband vibration sensing by sub-Nyquist additive random sampling

    Science.gov (United States)

    Zhang, Jingdong; Zheng, Hua; Zhu, Tao; Yin, Guolu; Liu, Min; Bai, Yongzhong; Qu, Dingrong; Qiu, Feng; Huang, Xianbing

    2018-05-01

    The round trip time of the light pulse limits the maximum detectable vibration frequency response range of phase-sensitive optical time domain reflectometry (φ-OTDR). Unlike the uniform laser pulse interval in conventional φ-OTDR, we randomly modulate the pulse interval, so that an equivalent sub-Nyquist additive random sampling (sNARS) is realized for every sensing point of the long interrogation fiber. For a φ-OTDR system with 10 km sensing length, the sNARS method is optimized by theoretical analysis and Monte Carlo simulation, and the experimental results verify that a wide-band sparse signal can be identified and reconstructed. Such a method can broaden the vibration frequency response range of φ-OTDR, which is of great significance in sparse-wideband-frequency vibration signal detection, such as rail track monitoring and metal defect detection.
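
    Additive random sampling draws each sampling instant as the previous one plus a mean period plus random jitter, which suppresses aliasing and lets tones above the mean-rate Nyquist limit be identified. The sketch below illustrates the idea with a Lomb-Scargle periodogram on synthetic data; all parameter values are illustrative and the paper's actual reconstruction algorithm is not reproduced:

        import numpy as np
        from scipy.signal import lombscargle

        # Additive random sampling: next instant = previous + mean
        # period + jitter; mean rate 10 kHz (uniform Nyquist: 5 kHz).
        rng = np.random.default_rng(1)
        mean_dt = 1e-4
        t = np.cumsum(mean_dt * (1.0 + rng.uniform(-0.5, 0.5, 2000)))

        # Sparse vibration tone at 18 kHz, above the uniform limit
        x = np.sin(2 * np.pi * 18e3 * t)

        freqs_hz = np.linspace(1e3, 25e3, 2400)
        power = lombscargle(t, x, 2 * np.pi * freqs_hz)
        print(f"peak near {freqs_hz[np.argmax(power)] / 1e3:.1f} kHz")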

  17. Location and multi-depot vehicle routing for emergency vehicles using tour coverage and random sampling

    Directory of Open Access Journals (Sweden)

    Alireza Goli

    2015-09-01

    Full Text Available Distribution and optimum allocation of emergency resources are among the most important tasks to be accomplished during a crisis. When a natural disaster such as an earthquake or flood takes place, it is necessary to deliver rescue efforts as quickly as possible. Therefore, it is important to find the optimum location and distribution of emergency relief resources. When a natural disaster occurs, it is not possible to reach some damaged areas. In this paper, location and multi-depot vehicle routing for emergency vehicles using tour coverage and random sampling is investigated. In this study, there is no need to visit all the places, and some demand points receive their needs from the nearest possible location. The proposed study is implemented on randomly generated instances of different sizes. The preliminary results indicate that the proposed method was capable of reaching desirable solutions in a reasonable amount of time.

  18. Random Walks on Directed Networks: Inference and Respondent-Driven Sampling

    Directory of Open Access Journals (Sweden)

    Malmros Jens

    2016-06-01

    Full Text Available Respondent-driven sampling (RDS is often used to estimate population properties (e.g., sexual risk behavior in hard-to-reach populations. In RDS, already sampled individuals recruit population members to the sample from their social contacts in an efficient snowball-like sampling procedure. By assuming a Markov model for the recruitment of individuals, asymptotically unbiased estimates of population characteristics can be obtained. Current RDS estimation methodology assumes that the social network is undirected, that is, all edges are reciprocal. However, empirical social networks in general also include a substantial number of nonreciprocal edges. In this article, we develop an estimation method for RDS in populations connected by social networks that include reciprocal and nonreciprocal edges. We derive estimators of the selection probabilities of individuals as a function of the number of outgoing edges of sampled individuals. The proposed estimators are evaluated on artificial and empirical networks and are shown to generally perform better than existing estimators. This is the case in particular when the fraction of directed edges in the network is large.
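
    As context for the degree-based estimators discussed above, the classical RDS correction weights each respondent by the inverse of their reported degree; the article's directed-network estimator refines this using out-degrees. A toy version with hypothetical data, not the article's estimator in full:

        import numpy as np

        # Hypothetical sample: reported out-degrees and a binary trait
        out_degree = np.array([3, 8, 5, 12, 4, 7, 9, 2])
        y = np.array([1, 0, 1, 0, 0, 1, 1, 0])

        # The recruitment random walk over-samples well-connected
        # individuals, so weight each respondent by 1/degree.
        w = 1.0 / out_degree
        p_hat = np.sum(w * y) / np.sum(w)
        print(f"estimated trait prevalence: {p_hat:.3f}")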

  19. Stature in archeological samples from central Italy: methodological issues and diachronic changes.

    Science.gov (United States)

    Giannecchini, Monica; Moggi-Cecchi, Jacopo

    2008-03-01

    Stature reconstructions from skeletal remains are usually obtained through regression equations based on the relationship between height and limb bone length. Different equations have been employed to reconstruct stature in skeletal samples, but this is the first study to provide a systematic analysis of the reliability of the different methods for Italian historical samples. Aims of this article are: 1) to analyze the reliability of different regression methods to estimate stature for populations living in Central Italy from the Iron Age to Medieval times; 2) to search for trends in stature over this time period by applying the most reliable regression method. Long bone measurements were collected from 1,021 individuals (560 males, 461 females), from 66 archeological sites for males and 54 for females. Three time periods were identified: Iron Age, Roman period, and Medieval period. To determine the most appropriate equation to reconstruct stature the Delta parameter of Gini (Memorie di metodologia statistica. Milano: Giuffre A. 1939), in which stature estimates derived from different limb bones are compared, was employed. The equations proposed by Pearson (Philos Trans R Soc London 192 (1899) 169-244) and Trotter and Gleser for Afro-Americans (Am J Phys Anthropol 10 (1952) 463-514; Am J Phys Anthropol 47 (1977) 355-356) provided the most consistent estimates when applied to our sample. We then used the equation by Pearson for further analyses. Results indicate a reduction in stature in the transition from the Iron Age to the Roman period, and a subsequent increase in the transition from the Roman period to the Medieval period. Changes of limb lengths over time were more pronounced in the distal than in the proximal elements in both limbs. 2007 Wiley-Liss, Inc.

  20. Uncertainty Determination Methodology, Sampling Maps Generation and Trend Studies with Biomass Thermogravimetric Analysis

    Science.gov (United States)

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

    This paper investigates a method for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG analysis) for several lignocellulosic materials (ground olive stone, almond shell, pine pellets and oak pellets), completing previous work of the same authors. A comparison has been made between results of TG analysis and prompt analysis. Levels of uncertainty and errors were obtained, demonstrating that properties evaluated by TG analysis were representative of the overall fuel composition, and no correlation between prompt and TG analysis exists. Additionally, a study of trends and time correlations is indicated. These results are particularly interesting for biomass energy applications. PMID:21152292
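
    The maximum sampling error and confidence interval for a TG-derived property follow the usual small-sample formula; a minimal sketch with hypothetical replicate ash contents, not the paper's measurements:

        import numpy as np
        from scipy import stats

        # Hypothetical replicate ash contents (% dry basis) from TG runs
        ash = np.array([1.42, 1.38, 1.51, 1.45, 1.40, 1.47])
        n = len(ash)
        se = ash.std(ddof=1) / np.sqrt(n)
        half_width = stats.t.ppf(0.975, df=n - 1) * se  # max sampling error
        print(f"ash = {ash.mean():.2f} +/- {half_width:.2f} % (95% CI)")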

  1. Finding needles in a haystack: a methodology for identifying and sampling community-based youth smoking cessation programs.

    Science.gov (United States)

    Emery, Sherry; Lee, Jungwha; Curry, Susan J; Johnson, Tim; Sporer, Amy K; Mermelstein, Robin; Flay, Brian; Warnecke, Richard

    2010-02-01

    Surveys of community-based programs are difficult to conduct when there is virtually no information about the number or locations of the programs of interest. This article describes the methodology used by the Helping Young Smokers Quit (HYSQ) initiative to identify and profile community-based youth smoking cessation programs in the absence of a defined sample frame. We developed a two-stage sampling design, with counties as the first-stage probability sampling units. The second stage used snowball sampling to saturation, to identify individuals who administered youth smoking cessation programs across three economic sectors in each county. Multivariate analyses modeled the relationship between program screening, eligibility, and response rates and economic sector and stratification criteria. Cumulative logit models analyzed the relationship between the number of contacts in a county and the number of programs screened, eligible, or profiled in a county. The snowball process yielded 9,983 unique and traceable contacts. Urban and high-income counties yielded significantly more screened program administrators; urban counties produced significantly more eligible programs, but there was no significant association between the county characteristics and program response rate. There is a positive relationship between the number of informants initially located and the number of programs screened, eligible, and profiled in a county. Our strategy to identify youth tobacco cessation programs could be used to create a sample frame for other nonprofit organizations that are difficult to identify due to a lack of existing directories, lists, or other traditional sample frames.

  2. Random On-Board Pixel Sampling (ROPS) X-Ray Camera

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Zhehui [Los Alamos; Iaroshenko, O. [Los Alamos; Li, S. [Los Alamos; Liu, T. [Fermilab; Parab, N. [Argonne (main); Chen, W. W. [Purdue U.; Chu, P. [Los Alamos; Kenyon, G. [Los Alamos; Lipton, R. [Fermilab; Sun, K.-X. [Nevada U., Las Vegas

    2017-09-25

    Recent advances in compressed sensing theory and algorithms offer new possibilities for high-speed X-ray camera design. In many CMOS cameras, each pixel has an independent on-board circuit that includes an amplifier, noise rejection, signal shaper, an analog-to-digital converter (ADC), and optional in-pixel storage. When X-ray images are sparse, i.e., when one of the following cases is true: (a.) The number of pixels with true X-ray hits is much smaller than the total number of pixels; (b.) The X-ray information is redundant; or (c.) Some prior knowledge about the X-ray images exists, sparse sampling may be allowed. Here we first illustrate the feasibility of random on-board pixel sampling (ROPS) using an existing set of X-ray images, followed by a discussion about signal to noise as a function of pixel size. Next, we describe a possible circuit architecture to achieve random pixel access and in-pixel storage. The combination of a multilayer architecture, sparse on-chip sampling, and computational image techniques, is expected to facilitate the development and applications of high-speed X-ray camera technology.

  3. Prevalence and correlates of problematic smartphone use in a large random sample of Chinese undergraduates.

    Science.gov (United States)

    Long, Jiang; Liu, Tie-Qiao; Liao, Yan-Hui; Qi, Chang; He, Hao-Yu; Chen, Shu-Bao; Billieux, Joël

    2016-11-17

    Smartphones are becoming a daily necessity for most undergraduates in Mainland China. Because the present scenario of problematic smartphone use (PSU) is largely unexplored, in the current study we aimed to estimate the prevalence of PSU and to screen suitable predictors for PSU among Chinese undergraduates in the framework of the stress-coping theory. A sample of 1062 undergraduate smartphone users was recruited by means of the stratified cluster random sampling strategy between April and May 2015. The Problematic Cellular Phone Use Questionnaire was used to identify PSU. We evaluated five candidate risk factors for PSU by using logistic regression analysis while controlling for demographic characteristics and specific features of smartphone use. The prevalence of PSU among Chinese undergraduates was estimated to be 21.3%. The risk factors for PSU were majoring in the humanities, high monthly income from the family (≥1500 RMB), serious emotional symptoms, high perceived stress, and perfectionism-related factors (high doubts about actions, high parental expectations). PSU among undergraduates appears to be ubiquitous and thus constitutes a public health issue in Mainland China. Although further longitudinal studies are required to test whether PSU is a transient phenomenon or a chronic and progressive condition, our study successfully identified socio-demographic and psychological risk factors for PSU. These results, obtained from a random and thus representative sample of undergraduates, opens up new avenues in terms of prevention and regulation policies.
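
    The risk-factor screening described above is a standard logistic regression with demographic controls; the sketch below shows the shape of such an analysis on simulated stand-in data (statsmodels is an assumption, and all coefficients are invented, not the study's estimates):

        import numpy as np
        import statsmodels.api as sm

        # Simulated stand-in data: income dummy, perceived stress,
        # doubts-about-actions; outcome = problematic smartphone use.
        rng = np.random.default_rng(2)
        n = 500
        X = np.column_stack([
            rng.integers(0, 2, n),
            rng.normal(20, 5, n),
            rng.normal(10, 3, n),
        ])
        logit_p = -4 + 0.5 * X[:, 0] + 0.12 * X[:, 1] + 0.08 * X[:, 2]
        y = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

        fit = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
        print(np.exp(fit.params[1:]))  # odds ratios per predictor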

  4. [The methodology and sample description of the National Survey on Addiction Problems in Hungary 2015 (NSAPH 2015)].

    Science.gov (United States)

    Paksi, Borbala; Demetrovics, Zsolt; Magi, Anna; Felvinczi, Katalin

    2017-06-01

    This paper introduces the methods and methodological findings of the National Survey on Addiction Problems in Hungary (NSAPH 2015). Use patterns of smoking, alcohol use and other psychoactive substances were measured, as well as those of certain behavioural addictions (problematic gambling - PGSI, DSM-V, eating disorders - SCOFF, problematic internet use - PIUQ, problematic on-line gaming - POGO, problematic social media use - FAS, exercise addictions - EAI-HU, work addiction - BWAS, compulsive buying - CBS). The paper describes the applied measurement techniques, sample selection, recruitment of respondents and the data collection strategy as well. Methodological results of the survey, including reliability and validity of the measures, are reported. The NSAPH 2015 research was carried out on a nationally representative sample of the Hungarian adult population aged 16-64 yrs (gross sample 2477, net sample 2274 persons) with the age group of 18-34 being overrepresented. Statistical analysis of the weight-distribution suggests that weighting did not create any artificial distortion in the database, leaving the representativeness of the sample unaffected. The size of the weighted sample of the 18-64 years old adult population is 1490 persons. The extent of the theoretical margin of error in the weighted sample is ±2.5% at a reliability level of 95%, which is in line with the original data collection plans. Based on the analysis of reliability and the extent of errors beyond sampling within the context of the database, we conclude that inconsistencies create relatively minor distortions in cumulative prevalence rates; consequently the database makes possible the reliable estimation of risk factors related to different substance use behaviours. The reliability indexes of measurements used for prevalence estimates of behavioural addictions proved to be appropriate, though the psychometric features in some cases suggest the presence of redundant items.
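
    The reported ±2.5% theoretical margin of error is consistent with the usual simple-random-sampling approximation at the weighted sample size:

        import math

        n, z, p = 1490, 1.96, 0.5  # weighted n, 95% level, worst-case p
        moe = z * math.sqrt(p * (1 - p) / n)
        print(f"margin of error: +/-{moe:.1%}")  # +/-2.5%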

  5. MUP, CEC-DES, STRADE. Codes for uncertainty propagation, experimental design and stratified random sampling techniques

    International Nuclear Information System (INIS)

    Amendola, A.; Astolfi, M.; Lisanti, B.

    1983-01-01

    The report describes how to use the codes: MUP (Monte Carlo Uncertainty Propagation) for uncertainty analysis by Monte Carlo simulation, including correlation analysis, extreme value identification and study of selected ranges of the variable space; CEC-DES (Central Composite Design) for building experimental matrices according to the requirements of Central Composite and Factorial Experimental Designs; and STRADE (Stratified Random Design) for experimental designs based on Latin Hypercube Sampling techniques. Application fields of the codes are probabilistic risk assessment, experimental design, sensitivity analysis and system identification problems
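
    For orientation, a Latin hypercube sample of the kind STRADE builds stratifies each variable's range into n equal intervals and places one point per stratum, with strata randomly permuted across variables. A minimal hand-rolled sketch (not STRADE's actual implementation):

        import numpy as np

        def latin_hypercube(n_samples, n_vars, seed=None):
            """One point per stratum for each variable, strata randomly
            permuted across variables, on the unit hypercube."""
            rng = np.random.default_rng(seed)
            jitter = rng.random((n_samples, n_vars))
            strata = np.column_stack(
                [rng.permutation(n_samples) for _ in range(n_vars)]
            )
            return (strata + jitter) / n_samples

        print(latin_hypercube(10, 3, seed=42).round(3))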

  6. Environmental methodology. Sampling and preparing fresh water organisms. Measuring of emitting radionuclides

    International Nuclear Information System (INIS)

    Foulquier, Luc; Philippot, J.C.; Baudin-Jaulent, Yvette.

    1982-05-01

    This paper provides some initial responses to questions asked by users of radioecological documents. Working with aquatic plants and fish sampled in situ, the authors' results often reveal very low activity levels; knowing how to deal with such levels is essential, since the fundamental objective is to interpret transfer mechanisms. Establishing the environmental level of radioactivity requires that the reports produced demonstrate the use of reproducible methods and contain results whose degree of reliability is clearly specified. Aquatic plants and fish are, among all fresh water organisms, the most interesting links in the study of artificial and natural radioactivity. By systematically using concrete examples, this work reaffirms the precautions that should be taken in a site study. Once the objective is clearly defined, the required properties of the sampling can be specified

  7. Methodology for estimation of 32P in bioassay samples by Cerenkov counting

    International Nuclear Information System (INIS)

    Wankhede, Sonal; Sawant, Pramilla D.; Yadav, R.K.B.; Rao, D.D.

    2016-01-01

    Radioactive phosphorus (32P) as phosphate is used to effectively reduce bone pain in terminal cancer patients. Several hospitals in India carry out this palliative care procedure on a regular basis. Thus, production as well as synthesis of 32P compounds has increased over the years to meet this requirement. Monitoring of radiation workers handling 32P compounds is important for further strengthening of the radiological protection program at the processing facility. 32P being a pure beta emitter (β max = 1.71 MeV, t1/2 = 14.3 d), bioassay is the preferred individual monitoring technique. The method standardized at the Bioassay Lab, Trombay, includes estimation of 32P in urine by co-precipitation with ammonium phosphomolybdate (AMP) followed by gross beta counting. In the present study, the feasibility of Cerenkov counting for detection of 32P in bioassay samples was explored and the results obtained were compared with the gross beta counting technique

  8. A sampling and metagenomic sequencing-based methodology for monitoring antimicrobial resistance in swine herds

    DEFF Research Database (Denmark)

    Munk, Patrick; Dalhoff Andersen, Vibe; de Knegt, Leonardo

    2016-01-01

    Objectives Reliable methods for monitoring antimicrobial resistance (AMR) in livestock and other reservoirs are essential to understand the trends, transmission and importance of agricultural resistance. Quantification of AMR is mostly done using culture-based techniques, but metagenomic read mapping shows promise for quantitative resistance monitoring. Methods We evaluated the ability of (i) MIC determination for Escherichia coli, (ii) cfu counting of E. coli, (iii) cfu counting of aerobic bacteria and (iv) metagenomic shotgun sequencing to predict expected tetracycline resistance based on antimicrobial consumption. Results Metagenomic sequencing outperformed the cultivation-based techniques in terms of predicting expected tetracycline resistance based on antimicrobial consumption. Our metagenomic approach had sufficient resolution to detect antimicrobial-induced changes to individual resistance gene abundances. Pen floor manure samples were found to represent rectal...

  9. Assessing heat load in drylot dairy cattle: Refining on-farm sampling methodology.

    Science.gov (United States)

    Tresoldi, Grazyne; Schütz, Karin E; Tucker, Cassandra B

    2016-11-01

    Identifying dairy cattle experiencing heat stress and adopting appropriate mitigation strategies can improve welfare and profitability. However, little is known about how cattle use heat abatement resources (shade, sprayed water) on drylot dairies. It is also unclear how often we need to observe animals to measure high heat load, or the relevance of specific aspects of this response, particularly in terms of panting. Our objectives were to describe and determine sampling intervals to measure cattle use of heat abatement resources, respiration rate (RR) and panting characteristics (drooling, open mouth, protruding tongue), and to evaluate the relationship between the latter 2. High-producing cows were chosen from 4 drylots (8 cows/dairy, n=32) and observed for at least 5.9 h (1000 to 1800 h, excluding milking) when air temperature, humidity, and the combined index averaged 33°C, 30%, and 79, respectively. Use of heat abatement resources was recorded continuously; RR and the presence and absence of each panting characteristic were recorded every 5 min. From the observed values, estimates using the specified sub-sampling intervals were calculated for heat abatement resource use (1, 5, 10, 15, 20, 30, 60, 90, and 120 min), and for RR and panting (10, 15, 20, 30, 60, 90, and 120 min). Estimates and observed values were compared using linear regression. Sampling intervals were considered accurate if they met 3 criteria: R2 ≥ 0.9, intercept = 0, and slope = 1. The relationship between RR and each panting characteristic was analyzed using mixed models. Cows used shade (at corral or over feed bunk) and the feed bunk area (where water was sprayed) for about 90 and 50% of the observed time, respectively, and used areas with no cooling for 2 min at a time, on average. Cows exhibited drooling (34±4% of observations) more often than open mouth and protruding tongue (11±3 and 8±3% of observations, respectively). Respiration rate varied depending on the presence of panting.
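
    The accuracy criteria above (regressing sub-sampled estimates on the continuously observed values and requiring R2 ≥ 0.9, intercept = 0, slope = 1) can be mimicked as follows; the behavioural data here are simulated placeholders, not the study's observations:

        import numpy as np
        from scipy import stats

        # Simulated shade use: 32 cows, 1 = in shade, scored each minute
        rng = np.random.default_rng(3)
        cows = [rng.random(480) < p for p in rng.uniform(0.4, 1.0, 32)]

        full = np.array([c.mean() for c in cows])         # observed values
        sub30 = np.array([c[::30].mean() for c in cows])  # 30-min interval

        slope, intercept, r, _, _ = stats.linregress(full, sub30)
        ok = r ** 2 >= 0.9 and abs(intercept) < 0.05 and abs(slope - 1) < 0.05
        print(f"R2={r**2:.2f} slope={slope:.2f} "
              f"intercept={intercept:.2f} accurate={ok}")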

  10. Standardization of a PIGE methodology for simultaneous quantification of low Z elements in barium borosilicate glass samples

    International Nuclear Information System (INIS)

    Chhillar, S.; Acharya, R.; Dasari, K.B.; Pujari, P.K.; Mishra, R.K.; Kaushik, C.P.

    2013-01-01

    In order to standardize the particle induced gamma-ray emission (PIGE) methodology for simultaneous quantification of light elements, analytical sensitivities of Li, F, B, Na, Al and Si were evaluated with a 4 MeV proton beam (~10 nA current) from the 3 MV Pelletron at IOP, Bhubaneswar. The PIGE method was validated by determining all six elements in a synthetic sample in graphite matrix and applied to two barium borosilicate glass (BaBSG) samples. The prompt γ-rays emitted from inelastic scattering or nuclear reactions of the corresponding isotopes were measured using a 60% relative efficiency HPGe detector coupled to an MCA, and the current-normalized count rates were used for concentration calculation. (author)

  11. Methodologies and perspectives of proteomics applied to filamentous fungi: from sample preparation to secretome analysis.

    Science.gov (United States)

    Bianco, Linda; Perrotta, Gaetano

    2015-03-12

    Filamentous fungi possess the extraordinary ability to digest complex biomasses and mineralize numerous xenobiotics, as consequence of their aptitude to sensing the environment and regulating their intra and extra cellular proteins, producing drastic changes in proteome and secretome composition. Recent advancement in proteomic technologies offers an exciting opportunity to reveal the fluctuations of fungal proteins and enzymes, responsible for their metabolic adaptation to a large variety of environmental conditions. Here, an overview of the most commonly used proteomic strategies will be provided; this paper will range from sample preparation to gel-free and gel-based proteomics, discussing pros and cons of each mentioned state-of-the-art technique. The main focus will be kept on filamentous fungi. Due to the biotechnological relevance of lignocellulose degrading fungi, special attention will be finally given to their extracellular proteome, or secretome. Secreted proteins and enzymes will be discussed in relation to their involvement in bio-based processes, such as biomass deconstruction and mycoremediation.

  12. Application of response surface methodology for determination of methyl red in water samples by spectrophotometry method.

    Science.gov (United States)

    Khodadoust, Saeid; Ghaedi, Mehrorang

    2014-12-10

    In this study a rapid and effective method (dispersive liquid-liquid microextraction (DLLME)) was developed for extraction of methyl red (MR) prior to its determination by UV-Vis spectrophotometry. Influence variables on DLLME such as volume of chloroform (as extractant solvent) and methanol (as dispersive solvent), pH and ionic strength and extraction time were investigated. Then significant variables were optimized by using a Box-Behnken design (BBD) and desirability function (DF). The optimized conditions (100 μL of chloroform, 1.3 mL of ethanol, pH 4 and 4% (w/v) NaCl) resulted in a linear calibration graph in the range of 0.015-10.0 mg mL-1 of MR in initial solution with R2 = 0.995 (n = 5). The limits of detection (LOD) and quantification (LOQ) were 0.005 and 0.015 mg mL-1, respectively. Finally, the DLLME method was applied for determination of MR in different water samples with relative standard deviation (RSD) less than 5% (n = 5). Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Methodology For Reduction Of Sampling On The Visual Inspection Of Developed And Etched Wafers

    Science.gov (United States)

    van de Ven, Jamie S.; Khorasani, Fred

    1989-07-01

    There is a lot of inspection in the manufacturing of semiconductor devices. Generally, the more important a manufacturing step, the higher the level of inspection. In some cases 100% of the wafers are inspected after certain steps. Inspection is a non-value-added and expensive activity. It requires an army of "inspectors," often expensive equipment, and becomes a "bottleneck" when the level of inspection is high. Although inspection helps identify quality problems, it hurts productivity. The new management, quality and productivity philosophies recommend against over-inspection [Point #3 in Dr. Deming's 14 Points for Management (1)]; 100% inspection is quite unnecessary. Often the nature of a process allows us to reduce inspection drastically and still maintain a high level of confidence in quality. In section 2, we discuss such situations and show that some elementary probability theory allows us to determine sample sizes and measure the chances of catching a bad "lot" and accepting a good lot. In section 3, we provide an example and application of the theory, and make a few comments on money and time saved because of this work. Finally, in section 4, we draw some conclusions about the new quality and productivity philosophies and how applied statisticians and engineers should study every situation individually and avoid blindly using methods and tables given in books.
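
    The elementary probability referred to in section 2 is hypergeometric acceptance sampling: the chance that a random sample of wafers contains no defectives even though the lot is bad. A minimal sketch with made-up lot numbers:

        from math import comb

        def p_accept(lot, bad, sample):
            """Chance a random sample contains no defective wafers,
            i.e. a lot with `bad` defectives is wrongly accepted."""
            return comb(lot - bad, sample) / comb(lot, sample)

        # 50-wafer lot, 5 defective: a 10-wafer sample misses them all
        # about 31% of the time, so larger samples are needed for high
        # confidence, yet far fewer than 100% inspection.
        print(f"P(accept bad lot) = {p_accept(50, 5, 10):.2f}")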

  14. Trace-element characterization of evidential cannabis sativa samples using k0-standardization methodology

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, D.P. Jr.; Vernetson, W.G.; Ratner, R.T. [Univ. of Florida, Gainesville, FL (United States)] [and others]

    1995-12-31

    The University of Florida Training Reactor (UFTR) facilities, including the analytical laboratory, are used for a wide range of educational, research, training, and service functions. The UFTR is a 100-kW light-water-cooled, graphite-and-water-moderated modified Argonaut-type reactor. The UFTR utilizes highly enriched plate-type fuel in a two-slab arrangement and operates at a 100-kW power level. Since first licensed to operate at 10 kW in 1959, this nonpower reactor facility has had an active but evolving record of continuous service to a wide range of academic, utility, and community users. The services of the UFTR have also been used by various state authorities in criminal investigations. Because of its relatively low power and careful laboratory analyses, the UFTR neutron flux characteristics in several ports are not only well characterized but also quite invariant with time. As a result, such a facility is well-suited to the application of multielement analysis using the k0-standardization method of neutron activation analysis. The analysis of untreated evidential botanical samples presented a unique opportunity to demonstrate implementation of this method at the UFTR facilities.

  15. Methodologies and Perspectives of Proteomics Applied to Filamentous Fungi: From Sample Preparation to Secretome Analysis

    Science.gov (United States)

    Bianco, Linda; Perrotta, Gaetano

    2015-01-01

    Filamentous fungi possess the extraordinary ability to digest complex biomasses and mineralize numerous xenobiotics, as consequence of their aptitude to sensing the environment and regulating their intra and extra cellular proteins, producing drastic changes in proteome and secretome composition. Recent advancement in proteomic technologies offers an exciting opportunity to reveal the fluctuations of fungal proteins and enzymes, responsible for their metabolic adaptation to a large variety of environmental conditions. Here, an overview of the most commonly used proteomic strategies will be provided; this paper will range from sample preparation to gel-free and gel-based proteomics, discussing pros and cons of each mentioned state-of-the-art technique. The main focus will be kept on filamentous fungi. Due to the biotechnological relevance of lignocellulose degrading fungi, special attention will be finally given to their extracellular proteome, or secretome. Secreted proteins and enzymes will be discussed in relation to their involvement in bio-based processes, such as biomass deconstruction and mycoremediation. PMID:25775160

  16. A Methodology to Estimate Ores Work Index Values, Using Miduk Copper Mine Sample

    Directory of Open Access Journals (Sweden)

    Mohammad Noaparast

    2012-12-01

    Full Text Available It is always attempted to reduce the costs of comminution in mineral processing plants. One of the difficulties in the size reduction section is that it may not be designed properly. The key factor in designing size reduction units such as crushers and grinding mills is the ore's work index. The work index, wi, represents the ore grindability and is used in Bond's formula to calculate the required energy. Bond defined a specific relationship among several parameters (control screen, fine particles produced, feed and product d80) that is applied to calculate wi. In this research work, a high grade copper sample from the Miduk copper concentrator was prepared, and its work index values were experimentally estimated using different control screens: 600, 425, 212, 150, 106 and 75 microns. The results obtained from the tests showed two different behaviors in fine production. According to these two trends, models were then defined to calculate the fine mass using the control screen. In the next step, an equation was presented to calculate the Miduk copper ore work index for any size. In addition, to verify the model's credibility, a test using a 300 micron control screen was performed and its result was compared with values calculated using the defined model, which showed a good fit. Finally, the experimental and calculated values were compared and their relative error was equal to 4.11%, which indicates a good fit for the results.
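
    For reference, Bond's formula uses the work index wi to convert feed and product 80%-passing sizes into a specific energy requirement; the figures below are illustrative, not the Miduk test values:

        def bond_energy(wi, f80_um, p80_um):
            """Specific energy (kWh/t) from Bond's formula, with the
            work index wi in kWh/t and sizes in microns."""
            return 10.0 * wi * (p80_um ** -0.5 - f80_um ** -0.5)

        # e.g. grinding from F80 = 2000 um to P80 = 150 um at wi = 14
        print(f"{bond_energy(14.0, 2000.0, 150.0):.1f} kWh/t")  # ~8.3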

  17. Trace-element characterization of evidential cannabis sativa samples using k0-standardization methodology

    International Nuclear Information System (INIS)

    Henderson, D.P. Jr.; Vernetson, W.G.; Ratner, R.T.

    1995-01-01

    The University of Florida Training Reactor (UFTR) facilities, including the analytical laboratory, are used for a wide range of educational, research, training, and service functions. The UFTR is a 100-kW light-water-cooled, graphite-and-water-moderated modified Argonaut-type reactor. The UFTR utilizes highly enriched plate-type fuel in a two-slab arrangement and operates at a 100-kW power level. Since first licensed to operate at 10 kW in 1959, this nonpower reactor facility has had an active but evolving record of continuous service to a wide range of academic, utility, and community users. The services of the UFTR have also been used by various state authorities in criminal investigations. Because of its relatively low power and careful laboratory analyses, the UFTR neutron flux characteristics in several ports are not only well characterized but also quite invariant with time. As a result, such a facility is well-suited to the application of multielement analysis using the k0-standardization method of neutron activation analysis. The analysis of untreated evidential botanical samples presented a unique opportunity to demonstrate implementation of this method at the UFTR facilities

  18. Application of response surface methodology for determination of methyl red in water samples by spectrophotometry method

    Science.gov (United States)

    Khodadoust, Saeid; Ghaedi, Mehrorang

    2014-12-01

    In this study a rapid and effective method (dispersive liquid-liquid microextraction (DLLME)) was developed for extraction of methyl red (MR) prior to its determination by UV-Vis spectrophotometry. Influence variables on DLLME such as volume of chloroform (as extractant solvent) and methanol (as dispersive solvent), pH and ionic strength and extraction time were investigated. Then significant variables were optimized by using a Box-Behnken design (BBD) and desirability function (DF). The optimized conditions (100 μL of chloroform, 1.3 mL of ethanol, pH 4 and 4% (w/v) NaCl) resulted in a linear calibration graph in the range of 0.015-10.0 mg mL-1 of MR in initial solution with R2 = 0.995 (n = 5). The limits of detection (LOD) and quantification (LOQ) were 0.005 and 0.015 mg mL-1, respectively. Finally, the DLLME method was applied for determination of MR in different water samples with relative standard deviation (RSD) less than 5% (n = 5).

  19. A methodology to measure cervical vertebral bone maturation in a sample from low-income children.

    Science.gov (United States)

    Aguiar, Luciana Barreto Vieira; Caldas, Maria de Paula; Haiter Neto, Francisco; Ambrosano, Glaucia Maria Bovi

    2013-01-01

    This study evaluated the applicability of the regression method for determining vertebral age developed by Caldas et al. (2007) by testing this method in children from low-income families of the rural zone. The sample comprised cephalometric and hand-wrist radiographs of 76 boys and 64 girls aged 7.0 to 14.9 years living in a medium-sized city in the desert region of the northeastern region of Brazil, with an HDI of 0.678. C3 and C4 vertebrae were traced and measured on cephalometric radiographs to estimate the bone age. The average age, average hand-wrist age and average error estimated for girls and boys were, respectively, 10.62 and 10.44 years, 11.28 and 10.57 years, and 1.42 and 1.18 years. Based on these results, the formula proposed by Caldas et al. (2007) was not applicable to the studied population, and new multiple regression models were developed to obtain the children's vertebral bone age accurately.

  20. Sample Preparation Methodologies for In Situ Liquid and Gaseous Cell Analytical Transmission Electron Microscopy of Electropolished Specimens.

    Science.gov (United States)

    Zhong, Xiang Li; Schilling, Sibylle; Zaluzec, Nestor J; Burke, M Grace

    2016-12-01

    In recent years, an increasing number of studies utilizing in situ liquid and/or gaseous cell scanning/transmission electron microscopy (S/TEM) have been reported. Because of the difficulty in the preparation of suitable specimens, these environmental S/TEM studies have been generally limited to studies of nanoscale structured materials such as nanoparticles, nanowires, or sputtered thin films. In this paper, we present two methodologies which have been developed to facilitate the preparation of electron-transparent samples from conventional bulk metals and alloys for in situ liquid/gaseous cell S/TEM experiments. These methods take advantage of combining sequential electrochemical jet polishing followed by focused ion beam extraction techniques to create large electron-transparent areas for site-specific observation. As an example, we illustrate the application of this methodology for the preparation of in situ specimens from a cold-rolled Type 304 austenitic stainless steel sample, which was subsequently examined in both 1 atm of air as well as fully immersed in a H2O environment in the S/TEM followed by hyperspectral imaging. These preparation techniques can be successfully applied as a general procedure for a wide range of metals and alloys, and are suitable for a variety of in situ analytical S/TEM studies in both aqueous and gaseous environments.

  1. Event-triggered synchronization for reaction-diffusion complex networks via random sampling

    Science.gov (United States)

    Dong, Tao; Wang, Aijuan; Zhu, Huiyun; Liao, Xiaofeng

    2018-04-01

    In this paper, the synchronization problem of reaction-diffusion complex networks (RDCNs) with Dirichlet boundary conditions is considered, where the data is sampled randomly. An event-triggered controller based on the sampled data is proposed, which can reduce the number of controller updates and the communication load. Under this strategy, the synchronization problem of the diffusion complex network is equivalently converted to the stability of a class of reaction-diffusion complex dynamical systems with time delay. By using the matrix inequality technique and the Lyapunov method, the synchronization conditions of the RDCNs are derived, which are dependent on the diffusion term. Moreover, it is found that the proposed control strategy naturally avoids Zeno behavior. Finally, a numerical example is given to verify the obtained results.

  2. Defining the Enterovirus Diversity Landscape of a Fecal Sample: A Methodological Challenge?

    Science.gov (United States)

    Faleye, Temitope Oluwasegun Cephas; Adewumi, Moses Olubusuyi; Adeniji, Johnson Adekunle

    2016-01-12

    Enteroviruses are a group of over 250 naked icosahedral virus serotypes that have been associated with clinical conditions ranging from intrauterine enterovirus transmission with fatal outcome, through encephalitis and meningitis, to paralysis. Classically, enterovirus detection was done by assaying for the development of the classic enterovirus-specific cytopathic effect in cell culture. Historically, the isolates were then identified by a neutralization assay. More recently, identification has been done by reverse transcriptase-polymerase chain reaction (RT-PCR). However, in recent times, there is a move towards direct detection and identification of enteroviruses from clinical samples using the cell culture-independent RT semi-nested PCR (RT-snPCR) assay. This RT-snPCR procedure amplifies the VP1 gene, which is then sequenced and used for identification. However, while cell culture-based strategies tend to show a preponderance of certain enterovirus species depending on the cell lines included in the isolation protocol, the RT-snPCR strategies tilt in a different direction. Consequently, it is becoming apparent that the diversity observed in certain enterovirus species, e.g., enterovirus species B (EV-B), might not be because they are the most evolutionarily successful. Rather, it might stem from cell line-specific bias accumulated over several years of use of the cell culture-dependent isolation protocols. Furthermore, it might also be a reflection of the impact of the relative genome concentration on the result of pan-enterovirus VP1 RT-snPCR screens used during the identification of cell culture isolates. This review highlights the impact of these two processes on the current diversity landscape of enteroviruses and the need to re-assess enterovirus detection and identification algorithms in a bid to better balance our understanding of the enterovirus diversity landscape.

  3. Methodological interference of biochar in the determination of extracellular enzyme activities in composting samples

    Science.gov (United States)

    Jindo, K.; Matsumoto, K.; García Izquierdo, C.; Sonoki, T.; Sanchez-Monedero, M. A.

    2014-07-01

    Biochar application has received increasing attention as a means to trap recalcitrant carbon and enhance soil fertility. Hydrolytic enzymatic assays, such as β-glucosidase and phosphatase activities, are used for the assessment of soil quality and the composting process, and are based on the use of p-nitrophenol (PNP) derivatives as substrate. However, the sorption capacity of biochar can interfere with the colorimetric determination of the hydrolysed PNP, either by sorption of the substrate or of the reaction product of hydrolysis onto the biochar surface. The aim of the present work is to study the biochar sorption capacity for PNP in biochar-blended composting mixtures in order to assess its impact on the estimation of the colorimetric-based enzymatic assays. A retention test was conducted by adding a solution of known amounts of PNP in universal buffer solution (pH = 5, 6.5 and 11, corresponding to the β-glucosidase, acid and alkaline phosphatase activity assays, respectively) to samples taken at the initial stage and after the maturation stage from four different composting piles (two manure composting piles, PM: poultry manure and CM: cow manure, and two similar piles containing 10% additional biochar, PM + B and CM + B). The results show that the biochar-blended composts (PM + B, CM + B) generally exhibited low enzymatic activities compared to the manure composts without biochar (PM, CM). In terms of the difference between the initial and maturation stages of the composting process, PNP retention in biochar was higher at the maturation stage, caused most probably by an enlarged proportion of biochar inside the compost mixture after the selective degradation of easily decomposable organic matter. The retention of PNP on biochar was influenced by the pH dependence of the biochar sorption capacity and/or PNP solubility, since PNP was more efficiently retained by biochar at low pH values (5 and 6.5) than at a high pH value (11).

  4. LOD score exclusion analyses for candidate QTLs using random population samples.

    Science.gov (United States)

    Deng, Hong-Wen

    2003-11-01

    While extensive analyses have been conducted to test for, no formal analyses have been conducted to test against, the importance of candidate genes as putative QTLs using random population samples. Previously, we developed an LOD score exclusion mapping approach for candidate genes for complex diseases. Here, we extend this LOD score approach for exclusion analyses of candidate genes for quantitative traits. Under this approach, specific genetic effects (as reflected by heritability) and inheritance models at candidate QTLs can be analyzed and if an LOD score is ≤ -2.0, the locus can be excluded from having a heritability larger than that specified. Simulations show that this approach has high power to exclude a candidate gene from having moderate genetic effects if it is not a QTL and is robust to population admixture. Our exclusion analysis complements association analysis for candidate genes as putative QTLs in random population samples. The approach is applied to test the importance of Vitamin D receptor (VDR) gene as a potential QTL underlying the variation of bone mass, an important determinant of osteoporosis.

  5. Landslide Susceptibility Assessment Using Frequency Ratio Technique with Iterative Random Sampling

    Directory of Open Access Journals (Sweden)

    Hyun-Joo Oh

    2017-01-01

    Full Text Available This paper assesses the performance of landslide susceptibility analysis using the frequency ratio (FR) with iterative random sampling. A pair of before-and-after digital aerial photographs with 50 cm spatial resolution was used to detect landslide occurrences in the Yongin area, Korea. Iterative random sampling was run ten times in total, each time producing separate training and validation datasets. Thirteen landslide causative factors were derived from the topographic, soil, forest, and geological maps. The FR scores were calculated from the causative factors and training occurrences in each of the ten runs. Ten landslide susceptibility maps were obtained by integrating the causative factors with their assigned FR scores, and each map was validated against its corresponding validation dataset. The FR method achieved susceptibility accuracies ranging from 89.48% to 93.21%, i.e., consistently above 89%. Moreover, the ten-fold iterative FR modeling may contribute to a better understanding of a regularized relationship between the causative factors and landslide susceptibility. This makes it possible to incorporate knowledge-driven considerations of the causative factors into the landslide susceptibility analysis, and the approach can also be extended to other areas.
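
    The core FR computation is simple enough to sketch. The snippet below (hypothetical grid and class layout) scores a single causative factor; the record sums such scores over thirteen factors to build the susceptibility map.

```python
# Minimal sketch of the frequency-ratio (FR) score for one causative factor.
import numpy as np

def frequency_ratio(factor_classes, landslide_mask):
    """FR per class = (% of landslide cells in class) / (% of area in class)."""
    scores = {}
    n_cells = factor_classes.size
    n_slides = landslide_mask.sum()
    for c in np.unique(factor_classes):
        in_class = factor_classes == c
        pct_slides = landslide_mask[in_class].sum() / n_slides
        pct_area = in_class.sum() / n_cells
        scores[c] = pct_slides / pct_area
    return scores

# Susceptibility index = sum of FR scores over all causative factors.
rng = np.random.default_rng(1)
slope_class = rng.integers(1, 6, size=(100, 100))    # e.g. 5 slope classes
landslides = rng.random((100, 100)) < 0.02           # observed occurrences
fr = frequency_ratio(slope_class.ravel(), landslides.ravel())
susceptibility = np.vectorize(fr.get)(slope_class)   # map scores onto the grid
```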

  6. A Combined Weighting Method Based on Hybrid of Interval Evidence Fusion and Random Sampling

    Directory of Open Access Journals (Sweden)

    Ying Yan

    2017-01-01

    Full Text Available Due to system complexity and a lack of expertise, epistemic uncertainties may be present in experts' judgments on the importance of certain indices during group decision-making. A novel combination weighting method is proposed to solve the index weighting problem when various uncertainties are present in expert comments. Based on the ideas of evidence theory, the various types of uncertain evaluation information are uniformly expressed through interval evidence structures. A similarity matrix between interval evidences is constructed, and the experts' information is fused. Comment grades are quantified using interval numbers, and a cumulative probability function for evaluating the importance of indices is constructed from the fused information. Finally, index weights are obtained by Monte Carlo random sampling. The method can process expert information with varying degrees of uncertainty, giving it good compatibility. It avoids both the difficulty of effectively fusing high-conflict group decision-making information and the large information loss that can follow fusion, and the original expert judgments are retained objectively throughout the procedure. The construction of the cumulative probability function and the random sampling process require no human intervention or judgment, and can easily be implemented in computer programs, giving the method a clear advantage in evaluation practice for very large index systems.
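
    A minimal sketch of the final Monte Carlo weighting step may clarify the idea; the interval-evidence fusion itself is elided here, and the fused importance intervals below are invented.

```python
# Minimal sketch: index weights by Monte Carlo sampling of fused intervals.
import numpy as np

rng = np.random.default_rng(7)
# Fused importance intervals per index after (hypothetical) evidence fusion.
fused = np.array([[0.6, 0.9],    # index 1
                  [0.3, 0.5],    # index 2
                  [0.1, 0.4]])   # index 3

n_draws = 100_000
draws = rng.uniform(fused[:, 0], fused[:, 1], size=(n_draws, len(fused)))
weights = (draws / draws.sum(axis=1, keepdims=True)).mean(axis=0)
print(weights.round(3), weights.sum())   # normalized weights summing to 1
```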

  7. Randomized comparison of vaginal self-sampling by standard vs. dry swabs for Human papillomavirus testing

    International Nuclear Information System (INIS)

    Eperon, Isabelle; Vassilakos, Pierre; Navarria, Isabelle; Menoud, Pierre-Alain; Gauthier, Aude; Pache, Jean-Claude; Boulvain, Michel; Untiet, Sarah; Petignat, Patrick

    2013-01-01

    To evaluate if human papillomavirus (HPV) self-sampling (Self-HPV) using a dry vaginal swab is a valid alternative for HPV testing. Women attending a colposcopy clinic were recruited to collect two consecutive Self-HPV samples: a Self-HPV using a dry swab (S-DRY) and a Self-HPV using a standard wet transport medium (S-WET). These samples were analyzed for HPV using real-time PCR (Roche Cobas). Participants were randomized to determine the order of the tests. Questionnaires assessing preferences and acceptability for both tests were conducted. Subsequently, women were invited for colposcopic examination; a physician collected a cervical sample (physician-sampling) with a broom-type device and placed it into a liquid-based cytology medium. Specimens were then processed for the production of cytology slides, and a Hybrid Capture HPV DNA test (Qiagen) was performed on the residual liquid. Biopsies were performed if indicated. Unweighted kappa statistics (κ) and McNemar tests were used to measure agreement among the sampling methods. A total of 120 women were randomized. Overall HPV prevalence was 68.7% (95% Confidence Interval (CI) 59.3–77.2) by S-WET, 54.4% (95% CI 44.8–63.9) by S-DRY and 53.8% (95% CI 43.8–63.7) by HC. Among paired samples (S-WET and S-DRY), the overall agreement was good (85.7%; 95% CI 77.8–91.6) and the κ was substantial (0.70; 95% CI 0.57–0.70). The proportion of positive type-specific HPV agreement was also good (77.3%; 95% CI 68.2–84.9). No differences in sensitivity for cervical intraepithelial neoplasia grade one (CIN1) or worse were observed between the two Self-HPV tests. Women reported both Self-HPV tests as highly acceptable. Self-HPV using dry swab transfer does not appear to compromise specimen integrity. Further study in a large screening population is needed. ClinicalTrials.gov: http://clinicaltrials.gov/show/NCT01316120
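
    The agreement statistics reported here are standard and easy to reproduce; the sketch below computes an unweighted kappa and a McNemar test from a hypothetical 2x2 table of paired S-WET/S-DRY results (the cell counts are illustrative, not the study's data).

```python
# Minimal sketch of the paired-agreement statistics used above.
import numpy as np
from scipy import stats

def unweighted_kappa(table):
    """Cohen's kappa for a square contingency table."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    p_obs = np.trace(table) / n
    p_exp = (table.sum(0) * table.sum(1)).sum() / n**2
    return (p_obs - p_exp) / (1 - p_exp)

def mcnemar_chi2(table):
    """McNemar test on the discordant cells b and c."""
    b, c = table[0][1], table[1][0]
    chi2 = (abs(b - c) - 1) ** 2 / (b + c)   # continuity-corrected
    return chi2, stats.chi2.sf(chi2, df=1)

#            S-DRY +  S-DRY -
paired = [[60,      19],       # S-WET +
          [5,       36]]       # S-WET -
print(unweighted_kappa(paired), mcnemar_chi2(paired))
```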

  8. Smoothing the redshift distributions of random samples for the baryon acoustic oscillations: applications to the SDSS-III BOSS DR12 and QPM mock samples

    Science.gov (United States)

    Wang, Shao-Jiang; Guo, Qi; Cai, Rong-Gen

    2017-12-01

    We investigate the impact of different redshift distributions of random samples on the baryon acoustic oscillations (BAO) measurements of D_V(z)r_d^fid/r_d from the two-point correlation functions of galaxies in the Data Release 12 of the Baryon Oscillation Spectroscopic Survey (BOSS). Large surveys, such as BOSS, usually assign redshifts to the random samples by randomly drawing values from the measured redshift distributions of the data, which necessarily introduces fiducial fluctuation signals into the random samples and weakens the BAO signal if cosmic variance cannot be ignored. We propose a smooth function of the redshift distribution that fits the data well to populate the random galaxy samples. The resulting cosmological parameters match the input parameters of the mock catalogue very well. The significance of the BAO signal improved by 0.33σ for a low-redshift sample and by 0.03σ for a constant-stellar-mass sample, though the absolute values do not change significantly. Given the precision of current cosmological parameter measurements, such smoothing should prove valuable for future measurements of galaxy clustering.
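
    The smoothing idea can be sketched compactly: fit a smooth curve to the measured n(z) and draw random-catalogue redshifts from it by inverse-CDF sampling. The spline choice and all numbers below are illustrative, not the authors' pipeline.

```python
# Minimal sketch: populate random-sample redshifts from a smooth fit to n(z).
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(3)
z_data = rng.normal(0.5, 0.1, size=50_000)           # stand-in for galaxy z

# Histogram the data, then smooth n(z) with a spline; the smoothing level is
# set to the Poisson-noise scale, sum(counts), a common heuristic.
counts, edges = np.histogram(z_data, bins=60, range=(0.1, 0.9))
centers = 0.5 * (edges[:-1] + edges[1:])
smooth = UnivariateSpline(centers, counts, s=counts.sum())
nz = np.clip(smooth(centers), 0, None)

# Inverse-CDF sampling of the smooth n(z) for the random catalogue (10x data).
cdf = np.cumsum(nz) / nz.sum()
z_random = np.interp(rng.random(10 * z_data.size), cdf, centers)
```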

  9. Assessment of fracture risk: value of random population-based samples--the Geelong Osteoporosis Study.

    Science.gov (United States)

    Henry, M J; Pasco, J A; Seeman, E; Nicholson, G C; Sanders, K M; Kotowicz, M A

    2001-01-01

    Fracture risk is determined by bone mineral density (BMD). The T-score, a measure of fracture risk, is the position of an individual's BMD in relation to a reference range. The aim of this study was to determine the magnitude of change in the T-score when different sampling techniques were used to produce the reference range. Reference ranges were derived from three samples, drawn from the same region: (1) an age-stratified population-based random sample, (2) unselected volunteers, and (3) a selected healthy subset of the population-based sample with no diseases or drugs known to affect bone. T-scores were calculated using the three reference ranges for a cohort of women who had sustained a fracture and as a group had a low mean BMD (ages 35-72 yr; n = 484). For most comparisons, the T-scores for the fracture cohort were more negative using the population reference range. The difference in T-scores reached 1.0 SD. The proportion of the fracture cohort classified as having osteoporosis at the spine was 26, 14, and 23% when the population, volunteer, and healthy reference ranges were applied, respectively. The use of inappropriate reference ranges results in substantial changes to T-scores and may lead to inappropriate management.
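
    The T-score arithmetic behind this comparison is a one-liner; the sketch below shows how the same BMD maps to different T-scores under different reference means and SDs (all numbers invented for illustration).

```python
# Minimal sketch: T-score sensitivity to the chosen reference range.
def t_score(bmd, ref_mean, ref_sd):
    return (bmd - ref_mean) / ref_sd

bmd = 0.85  # patient's spine BMD, g/cm^2 (hypothetical)
references = {"population": (1.00, 0.11),
              "volunteer":  (1.05, 0.12),
              "healthy":    (1.02, 0.10)}
for name, (mean, sd) in references.items():
    print(f"{name:>10}: T = {t_score(bmd, mean, sd):+.2f}")
```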

  10. Toward decentralized analysis of mercury (II) in real samples. A critical review on nanotechnology-based methodologies

    International Nuclear Information System (INIS)

    Botasini, Santiago; Heijo, Gonzalo; Méndez, Eduardo

    2013-01-01

    Highlights: •Several methods based on nanotechnology achieve limits of detection in the pM and nM ranges for mercury (II) analysis. •Most of these methods are validated in filtered water samples and/or spiked samples. •Thiols in real samples constitute actual competition for any sensor based on the binding of mercury (II) ions. •Future research should include the study of matrix interferences, including thiols and dissolved organic matter. -- Abstract: In recent years, the number of works focused on the development of novel nanoparticle-based sensors for mercury detection has increased, mainly motivated by the need for low-cost portable devices capable of giving a fast and reliable analytical response, thus contributing to analytical decentralization. Methodologies employing colorimetric, fluorometric, magnetic, and electrochemical output signals have reached detection limits within the pM and nM ranges. Most of these developments proved their suitability for detecting and quantifying mercury (II) ions in synthetic solutions or spiked water samples. However, the state of the art in these technologies still lags behind standard methods of mercury quantification, such as cold vapor atomic absorption spectrometry and inductively coupled plasma techniques, in terms of reliability and sensitivity. This is mainly because the response of nanoparticle-based sensors is strongly affected by the sample matrix. The developed analytical nanosystems may fail in real samples because of the negative influence of ionic strength and the presence of exchangeable ligands. The aim of this review is to critically consider the recently published innovations in this area, and to highlight the need to include more realistic assays in future research in order to make these advances suitable for on-site analysis.

  11. Accounting for randomness in measurement and sampling in studying cancer cell population dynamics.

    Science.gov (United States)

    Ghavami, Siavash; Wolkenhauer, Olaf; Lahouti, Farshad; Ullah, Mukhtar; Linnebacher, Michael

    2014-10-01

    Knowing the expected temporal evolution of the proportion of different cell types in sample tissues gives an indication of the progression of the disease and its possible response to drugs. Such systems have been modelled using Markov processes. Here we consider an experimentally realistic scenario in which transition probabilities are estimated from noisy cell population size measurements. Using aggregated data from FACS measurements, we develop MMSE and ML estimators and formulate two problems to find the minimum number of required samples and measurements to guarantee the accuracy of predicted population sizes. Our numerical results show that the estimated transition probabilities and steady states differ widely from the real values if one uses the standard deterministic approach for noisy measurements. This supports our argument that, for the analysis of FACS data, the observed state should be treated as a random variable. The second problem we address concerns the consequences of estimating the probability of a cell being in a particular state from measurements of a small population of cells. We show how the uncertainty arising from small sample sizes can be captured by a distribution for the state probability.
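
    As a toy version of the estimation problem, the sketch below simulates a two-state cell population, perturbs the transition counts with measurement noise, and row-normalizes to obtain a maximum-likelihood estimate; the record's MMSE/ML treatment of FACS data is considerably more elaborate.

```python
# Minimal sketch: ML estimation of Markov transition probabilities from
# noisy state-transition counts (toy two-state model).
import numpy as np

rng = np.random.default_rng(5)
P_true = np.array([[0.9, 0.1],
                   [0.2, 0.8]])

# Simulate a cell-state chain.
states = [0]
for _ in range(5000):
    states.append(rng.choice(2, p=P_true[states[-1]]))

counts = np.zeros((2, 2))
for a, b in zip(states[:-1], states[1:]):
    counts[a, b] += 1
# Add measurement noise to mimic noisy FACS-like population counts.
counts = np.maximum(counts + rng.normal(0, 20, counts.shape), 1)

P_hat = counts / counts.sum(axis=1, keepdims=True)   # row-normalized estimate
print(P_hat.round(3))
```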

  12. RANdom SAmple Consensus (RANSAC) algorithm for material-informatics: application to photovoltaic solar cells.

    Science.gov (United States)

    Kaspi, Omer; Yosipof, Abraham; Senderowitz, Hanoch

    2017-06-06

    An important aspect of chemoinformatics and material-informatics is the use of machine learning algorithms to build Quantitative Structure Activity Relationship (QSAR) models. The RANdom SAmple Consensus (RANSAC) algorithm is a predictive modeling tool widely used in the image processing field for cleaning datasets of noise. RANSAC can be used as a "one stop shop" algorithm for developing and validating QSAR models, performing outlier removal, descriptor selection, model development and predictions for test set samples using an applicability domain. For "future" predictions (i.e., for samples not included in the original test set) RANSAC provides a statistical estimate for the probability of obtaining reliable predictions, i.e., predictions within a pre-defined number of standard deviations from the true values. In this work we describe the first application of RANSAC in material informatics, focusing on the analysis of solar cells. We demonstrate that for three datasets representing different metal oxide (MO) based solar cell libraries, RANSAC-derived models select descriptors previously shown to correlate with key photovoltaic properties and lead to good predictive statistics for these properties. These models were subsequently used to predict the properties of virtual solar cell libraries, highlighting interesting dependencies of PV properties on MO compositions.
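
    For readers unfamiliar with RANSAC, a minimal sketch of the consensus idea on a line-fitting toy problem follows; the record applies the same principle to QSAR model building, outlier removal and descriptor selection.

```python
# Minimal sketch of RANSAC: fit on random minimal subsets, keep the largest
# consensus (inlier) set, then refit on the inliers only.
import numpy as np

def ransac_line(x, y, n_iter=200, tol=0.5, rng=None):
    rng = rng or np.random.default_rng()
    best_inliers = np.zeros(x.size, dtype=bool)
    for _ in range(n_iter):
        i, j = rng.choice(x.size, size=2, replace=False)
        if x[i] == x[j]:
            continue
        slope = (y[j] - y[i]) / (x[j] - x[i])
        intercept = y[i] - slope * x[i]
        inliers = np.abs(y - (slope * x + intercept)) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return np.polyfit(x[best_inliers], y[best_inliers], 1), best_inliers

rng = np.random.default_rng(2)
x = np.linspace(0, 10, 100)
y = 1.5 * x + rng.normal(0, 0.2, x.size)
y[::10] += rng.normal(0, 8, 10)          # inject gross outliers
coef, mask = ransac_line(x, y, rng=rng)
print(coef, mask.sum(), "inliers kept")
```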

  13. Gray bootstrap method for estimating frequency-varying random vibration signals with small samples

    Directory of Open Access Journals (Sweden)

    Wang Yanqing

    2014-04-01

    Full Text Available During environmental testing, the estimation of random vibration signals (RVS) is an important technique for airborne platform safety and reliability. However, the available methods, including the extreme value envelope method (EVEM), the statistical tolerances method (STM) and the improved statistical tolerance method (ISTM), require large samples and a typical probability distribution. Moreover, the frequency-varying characteristic of RVS is usually not taken into account. The gray bootstrap method (GBM) is proposed to solve the problem of estimating frequency-varying RVS with small samples. First, the estimation indexes are obtained, including the estimated interval, the estimated uncertainty, the estimated value, the estimated error and the estimated reliability. GBM is then applied to data from a single flight test of a certain aircraft. Finally, to evaluate the estimation performance, GBM is compared with the bootstrap method (BM) and the gray method (GM) in testing analysis. The results show that GBM is superior for estimating dynamic signals with small samples, and the estimated reliability is proved to be 100% at the given confidence level.
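
    The bootstrap half of GBM is easy to sketch: resample the small sample with replacement and read the estimated value and interval off the bootstrap distribution. The gray GM(1,1) extrapolation that handles the frequency-varying behaviour is omitted here, and the amplitudes below are invented.

```python
# Minimal sketch of bootstrap interval estimation for a small vibration sample.
import numpy as np

rng = np.random.default_rng(4)
sample = np.array([2.1, 2.7, 1.9, 3.2, 2.4, 2.8])   # small toy RVS amplitudes

boot = rng.choice(sample, size=(10_000, sample.size), replace=True)
means = boot.mean(axis=1)
lo, hi = np.percentile(means, [2.5, 97.5])
print(f"estimated value {sample.mean():.2f}, 95% interval [{lo:.2f}, {hi:.2f}]")
```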

  14. Toward decentralized analysis of mercury (II) in real samples. A critical review on nanotechnology-based methodologies.

    Science.gov (United States)

    Botasini, Santiago; Heijo, Gonzalo; Méndez, Eduardo

    2013-10-24

    In recent years, the number of works focused on the development of novel nanoparticle-based sensors for mercury detection has increased, mainly motivated by the need for low-cost portable devices capable of giving a fast and reliable analytical response, thus contributing to analytical decentralization. Methodologies employing colorimetric, fluorometric, magnetic, and electrochemical output signals have reached detection limits within the pM and nM ranges. Most of these developments proved their suitability for detecting and quantifying mercury (II) ions in synthetic solutions or spiked water samples. However, the state of the art in these technologies still lags behind standard methods of mercury quantification, such as cold vapor atomic absorption spectrometry and inductively coupled plasma techniques, in terms of reliability and sensitivity. This is mainly because the response of nanoparticle-based sensors is strongly affected by the sample matrix. The developed analytical nanosystems may fail in real samples because of the negative influence of ionic strength and the presence of exchangeable ligands. The aim of this review is to critically consider the recently published innovations in this area, and to highlight the need to include more realistic assays in future research in order to make these advances suitable for on-site analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. Randomized controlled trial of attention bias modification in a racially diverse, socially anxious, alcohol dependent sample.

    Science.gov (United States)

    Clerkin, Elise M; Magee, Joshua C; Wells, Tony T; Beard, Courtney; Barnett, Nancy P

    2016-12-01

    Attention biases may be an important treatment target for both alcohol dependence and social anxiety. This is the first ABM trial to investigate two (vs. one) targets of attention bias within a sample with co-occurring symptoms of social anxiety and alcohol dependence. Additionally, we used trial-level bias scores (TL-BS) to capture the phenomena of attention bias in a more ecologically valid, dynamic way compared to traditional attention bias scores. Adult participants (N = 86; 41% Female; 52% African American; 40% White) with elevated social anxiety symptoms and alcohol dependence were randomly assigned to an 8-session training condition in this 2 (Social Anxiety ABM vs. Social Anxiety Control) by 2 (Alcohol ABM vs. Alcohol Control) design. Symptoms of social anxiety, alcohol dependence, and attention bias were assessed across time. Multilevel models estimated the trajectories for each measure within individuals, and tested whether these trajectories differed according to the randomized training conditions. Across time, there were significant or trending decreases in all attention TL-BS parameters (but not traditional attention bias scores) and most symptom measures. However, there were no significant differences in the trajectories of change between any ABM and control conditions for any symptom measures. These findings add to previous evidence questioning the robustness of ABM and point to the need to extend the effects of ABM to samples that are racially diverse and/or have co-occurring psychopathology. The results also illustrate the potential importance of calculating trial-level attention bias scores rather than only including traditional bias scores. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Randomized Controlled Trial of Attention Bias Modification in a Racially Diverse, Socially Anxious, Alcohol Dependent Sample

    Science.gov (United States)

    Clerkin, Elise M.; Magee, Joshua C.; Wells, Tony T.; Beard, Courtney; Barnett, Nancy P.

    2016-01-01

    Objective Attention biases may be an important treatment target for both alcohol dependence and social anxiety. This is the first ABM trial to investigate two (vs. one) targets of attention bias within a sample with co-occurring symptoms of social anxiety and alcohol dependence. Additionally, we used trial-level bias scores (TL-BS) to capture the phenomena of attention bias in a more ecologically valid, dynamic way compared to traditional attention bias scores. Method Adult participants (N=86; 41% Female; 52% African American; 40% White) with elevated social anxiety symptoms and alcohol dependence were randomly assigned to an 8-session training condition in this 2 (Social Anxiety ABM vs. Social Anxiety Control) by 2 (Alcohol ABM vs. Alcohol Control) design. Symptoms of social anxiety, alcohol dependence, and attention bias were assessed across time. Results Multilevel models estimated the trajectories for each measure within individuals, and tested whether these trajectories differed according to the randomized training conditions. Across time, there were significant or trending decreases in all attention TL-BS parameters (but not traditional attention bias scores) and most symptom measures. However, there were no significant differences in the trajectories of change between any ABM and control conditions for any symptom measures. Conclusions These findings add to previous evidence questioning the robustness of ABM and point to the need to extend the effects of ABM to samples that are racially diverse and/or have co-occurring psychopathology. The results also illustrate the potential importance of calculating trial-level attention bias scores rather than only including traditional bias scores. PMID:27591918

  17. Relative efficiency and sample size for cluster randomized trials with variable cluster sizes.

    Science.gov (United States)

    You, Zhiying; Williams, O Dale; Aban, Inmaculada; Kabagambe, Edmond Kato; Tiwari, Hemant K; Cutter, Gary

    2011-02-01

    The statistical power of cluster randomized trials depends on two sample size components, the number of clusters per group and the numbers of individuals within clusters (cluster size). Variable cluster sizes are common and this variation alone may have significant impact on study power. Previous approaches have taken this into account by either adjusting total sample size using a designated design effect or adjusting the number of clusters according to an assessment of the relative efficiency of unequal versus equal cluster sizes. This article defines a relative efficiency of unequal versus equal cluster sizes using noncentrality parameters, investigates properties of this measure, and proposes an approach for adjusting the required sample size accordingly. We focus on comparing two groups with normally distributed outcomes using t-test, and use the noncentrality parameter to define the relative efficiency of unequal versus equal cluster sizes and show that statistical power depends only on this parameter for a given number of clusters. We calculate the sample size required for an unequal cluster sizes trial to have the same power as one with equal cluster sizes. Relative efficiency based on the noncentrality parameter is straightforward to calculate and easy to interpret. It connects the required mean cluster size directly to the required sample size with equal cluster sizes. Consequently, our approach first determines the sample size requirements with equal cluster sizes for a pre-specified study power and then calculates the required mean cluster size while keeping the number of clusters unchanged. Our approach allows adjustment in mean cluster size alone or simultaneous adjustment in mean cluster size and number of clusters, and is a flexible alternative to and a useful complement to existing methods. Comparison indicated that we have defined a relative efficiency that is greater than the relative efficiency in the literature under some conditions. Our measure
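
    For intuition, the sketch below uses a commonly cited design-effect approximation for variable cluster sizes, DE = 1 + ((CV^2 + 1) * m_bar - 1) * ICC, rather than the noncentrality-based measure the record defines; it shows how a larger coefficient of variation of cluster size inflates the required number of clusters.

```python
# Minimal sketch: sample size under a design-effect approximation for
# variable cluster sizes (an assumption, not the record's exact method).
import math
from scipy import stats

def n_individual(alpha=0.05, power=0.8, delta=0.3):
    """Per-arm sample size for an individually randomized two-group t-test."""
    za = stats.norm.ppf(1 - alpha / 2)
    zb = stats.norm.ppf(power)
    return (za + zb) ** 2 * 2 / delta ** 2

def clusters_per_arm(m_bar, cv, icc, **kw):
    de = 1 + ((cv ** 2 + 1) * m_bar - 1) * icc   # design effect
    return math.ceil(n_individual(**kw) * de / m_bar)

print(clusters_per_arm(m_bar=20, cv=0.0, icc=0.05))  # equal cluster sizes
print(clusters_per_arm(m_bar=20, cv=0.6, icc=0.05))  # variable cluster sizes
```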

  18. Random Photon Absorption Model Elucidates How Early Gain Control in Fly Photoreceptors Arises from Quantal Sampling

    Science.gov (United States)

    Song, Zhuoyi; Zhou, Yu; Juusola, Mikko

    2016-01-01

    Many diurnal photoreceptors encode vast real-world light changes effectively, but how this performance originates from photon sampling is unclear. A 4-module biophysically-realistic fly photoreceptor model, in which information capture is limited by the number of its sampling units (microvilli) and their photon-hit recovery time (refractoriness), can accurately simulate real recordings and their information content. However, sublinear summation in quantum bump production (quantum-gain-nonlinearity) may also cause adaptation by reducing the bump/photon gain when multiple photons hit the same microvillus simultaneously. Here, we use a Random Photon Absorption Model (RandPAM), which is the 1st module of the 4-module fly photoreceptor model, to quantify the contribution of quantum-gain-nonlinearity in light adaptation. We show how quantum-gain-nonlinearity already results from photon sampling alone. In the extreme case, when two or more simultaneous photon-hits reduce to a single sublinear value, quantum-gain-nonlinearity is preset before the phototransduction reactions adapt the quantum bump waveform. However, the contribution of quantum-gain-nonlinearity in light adaptation depends upon the likelihood of multi-photon-hits, which is strictly determined by the number of microvilli and light intensity. Specifically, its contribution to light-adaptation is marginal (≤ 1%) in fly photoreceptors with many thousands of microvilli, because the probability of simultaneous multi-photon-hits on any one microvillus is low even during daylight conditions. However, in cells with fewer sampling units, the impact of quantum-gain-nonlinearity increases with brightening light. PMID:27445779
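
    The quantal-sampling argument can be checked with a few lines of simulation: scatter Poisson photon arrivals uniformly over microvilli and measure how often an occupied microvillus receives more than one hit in the same window. All counts below are illustrative.

```python
# Minimal sketch: multi-photon-hit likelihood versus number of microvilli.
import numpy as np

rng = np.random.default_rng(6)

def multi_hit_fraction(n_microvilli, mean_photons, n_trials=200):
    """Fraction of photon-hit microvilli that receive >1 hit per window."""
    frac = 0.0
    for _ in range(n_trials):
        photons = rng.poisson(mean_photons)                 # photons this window
        hits = rng.integers(0, n_microvilli, size=photons)  # uniform targets
        counts = np.bincount(hits, minlength=n_microvilli)
        frac += (counts > 1).sum() / max((counts > 0).sum(), 1)
    return frac / n_trials

print(multi_hit_fraction(30_000, 1_000))   # many microvilli: ~1-2% multi-hits
print(multi_hit_fraction(1_000, 1_000))    # few microvilli: multi-hits common
```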

  19. Random sampling or geostatistical modelling? Choosing between design-based and model-based sampling strategies for soil (with discussion)

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    1997-01-01

    Classical sampling theory has been repeatedly identified with classical statistics, which assumes that data are identically and independently distributed. This explains the switch of many soil scientists from design-based sampling strategies, based on classical sampling theory, to the model-based approach.

  20. The contribution of simple random sampling to observed variations in faecal egg counts.

    Science.gov (United States)

    Torgerson, Paul R; Paul, Michaela; Lewis, Fraser I

    2012-09-10

    It has been over 100 years since the classical paper published by Gosset in 1907, under the pseudonym "Student", demonstrated that yeast cells suspended in a fluid and measured by a haemocytometer conform to a Poisson process. Similarly, parasite eggs in a faecal suspension also conform to a Poisson process. Despite this, there are common misconceptions about how to analyse or interpret observations from the McMaster or similar quantitative parasitological diagnostic techniques, widely used for evaluating parasite eggs in faeces. From a theoretical perspective, the McMaster technique can easily be shown to give variable results that inevitably arise from the random distribution of parasite eggs in a well-mixed faecal sample. The Poisson processes that lead to this variability are described, with illustrative examples of the potentially large confidence intervals that can arise from faecal egg counts calculated from the observations on a McMaster slide. Attempts to modify the McMaster technique, or indeed other quantitative techniques, to ensure uniform egg counts are doomed to failure and betray ignorance of Poisson processes. A simple method to immediately identify excess variation/poor sampling from replicate counts is provided. Copyright © 2012 Elsevier B.V. All rights reserved.
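
    The resulting uncertainty is easy to quantify. The sketch below computes the exact Poisson confidence interval implied by a raw McMaster chamber count, assuming the common (but technique-dependent) multiplication factor of 50 eggs per gram.

```python
# Minimal sketch: exact Poisson CI for a McMaster faecal egg count.
from scipy import stats

def mcmaster_ci(raw_count, factor=50, conf=0.95):
    a = 1 - conf
    lower = stats.chi2.ppf(a / 2, 2 * raw_count) / 2 if raw_count else 0.0
    upper = stats.chi2.ppf(1 - a / 2, 2 * raw_count + 2) / 2
    return raw_count * factor, lower * factor, upper * factor

epg, lo, hi = mcmaster_ci(4)   # 4 eggs observed -> nominal 200 epg
print(f"{epg} epg, 95% CI [{lo:.0f}, {hi:.0f}]")   # wide interval from few eggs
```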

  1. Intention to treat (ITT) analysis as reported in orthodontic randomized controlled trials-evaluations of methodology and recommendations for the accurate use of ITT analysis and handling dropouts.

    Science.gov (United States)

    Bondemark, Lars; Abdulraheem, Salem

    2017-10-21

    To systematically evaluate in five orthodontic journals how many randomized controlled trials (RCTs) use intention to treat (ITT) analysis, to assess the methodological quality of the ITT analysis, and finally, to demonstrate in an academic way how outcomes can be affected when the ITT analysis is not implemented. A search of the Medline database was performed via PubMed for publication type 'randomized controlled trial' published for each journal between 1 January 2013 and 30 April 2017. The five orthodontic journals assessed were the American Journal of Orthodontics and Dentofacial Orthopedics, Angle Orthodontics, European Journal of Orthodontics, Journal of Orthodontics, and Orthodontics and Craniofacial Research. Two independent reviewers assessed each RCT to determine whether the trial reported an ITT analysis or a per-protocol analysis. The initial search generated 137 possible trials. After applying the inclusion and exclusion criteria, 90 RCTs were included and assessed. Seventeen out of 90 RCTs (18.9%) either reported an ITT analysis in the text and/or supported the ITT by flow diagrams or tables. However, only six RCTs applied and reported the ITT analysis correctly, while the majority performed a per-protocol analysis instead. Nearly all the trials that applied the ITT analysis incorrectly analysed the results using a per-protocol analysis, thus overestimating the results and/or reducing the sample size, which could diminish statistical power. © The Author 2017. Published by Oxford University Press on behalf of the European Orthodontic Society. All rights reserved. For permissions, please email: journals.permissions@oup.com

  2. Korean Brain Aging Study for the Early Diagnosis and Prediction of Alzheimer's Disease: Methodology and Baseline Sample Characteristics.

    Science.gov (United States)

    Byun, Min Soo; Yi, Dahyun; Lee, Jun Ho; Choe, Young Min; Sohn, Bo Kyung; Lee, Jun-Young; Choi, Hyo Jung; Baek, Hyewon; Kim, Yu Kyeong; Lee, Yun-Sang; Sohn, Chul-Ho; Mook-Jung, Inhee; Choi, Murim; Lee, Yu Jin; Lee, Dong Woo; Ryu, Seung-Ho; Kim, Shin Gyeom; Kim, Jee Wook; Woo, Jong Inn; Lee, Dong Young

    2017-11-01

    The Korean Brain Aging Study for the Early Diagnosis and Prediction of Alzheimer's disease (KBASE) aimed to recruit 650 individuals, aged from 20 to 90 years, to search for new biomarkers of Alzheimer's disease (AD) and to investigate how multi-faceted lifetime experiences and bodily changes contribute to the brain changes or brain pathologies related to the AD process. All participants received comprehensive clinical and neuropsychological evaluations, multi-modal brain imaging, including magnetic resonance imaging, magnetic resonance angiography, [11C]Pittsburgh compound B-positron emission tomography (PET), and [18F]fluorodeoxyglucose-PET, and blood and genetic marker analyses at baseline, and a subset of participants underwent actigraph monitoring and completed a sleep diary. Participants are to be followed annually with clinical and neuropsychological assessments, and biannually with the full KBASE assessment, including neuroimaging and laboratory tests. As of March 2017, a total of 758 individuals had volunteered for this study. Among them, 591 participants were enrolled at baseline, after excluding 162 individuals: 291 cognitively normal (CN) old-aged individuals, 74 CN young- and middle-aged individuals, 139 individuals with mild cognitive impairment (MCI), and 87 individuals with AD dementia (ADD). A subset of participants (n=275) underwent actigraph monitoring. The KBASE cohort is a prospective, longitudinal cohort study that recruited participants with a wide age range and a wide distribution of cognitive status (CN, MCI, and ADD), and it has several strengths in its design and methodologies. Details of the recruitment, study methodology, and baseline sample characteristics are described in this paper.

  3. Variability and predictors of negative mood intensity in patients with borderline personality disorder and recurrent suicidal behavior: multilevel analyses applied to experience sampling methodology.

    Science.gov (United States)

    Nisenbaum, Rosane; Links, Paul S; Eynan, Rahel; Heisel, Marnin J

    2010-05-01

    Variability in mood swings is a characteristic of borderline personality disorder (BPD) and is associated with suicidal behavior. This study investigated patterns of mood variability and whether such patterns could be predicted from demographic and suicide-related psychological risk factors. Eighty-two adults with BPD and histories of recurrent suicidal behavior were recruited from 3 outpatient psychiatric programs in Canada. Experience sampling methodology (ESM) was used to assess negative mood intensity ratings on a visual analogue scale, 6 random times daily, for 21 days. Three-level models estimated variability between times (52.8%), days (22.2%), and patients (25.1%) and supported a quadratic pattern of daily mood variability. Depression scores predicted variability between patients' initial rating of the day. Average daily mood patterns depended on levels of hopelessness, suicide ideation, and sexual abuse history. Patients reporting moderate to severe sexual abuse and elevated suicide ideation were characterized by worsening moods from early morning up through evening, with little or no relief; patients reporting mild sexual abuse and low suicide ideation reported improved mood throughout the day. These patterns, if replicated in larger ESM studies, may potentially assist the clinician in determining which patients require close monitoring.

  4. An R package for spatial coverage sampling and random sampling from compact geographical strata by k-means

    NARCIS (Netherlands)

    Walvoort, D.J.J.; Brus, D.J.; Gruijter, de J.J.

    2010-01-01

    Both for mapping and for estimating spatial means of an environmental variable, the accuracy of the result will usually be increased by dispersing the sample locations so that they cover the study area as uniformly as possible. We developed a new R package for designing spatial coverage samples for mapping, and for random sampling from compact geographical strata formed by k-means.

  5. A random sampling approach for robust estimation of tissue-to-plasma ratio from extremely sparse data.

    Science.gov (United States)

    Chu, Hui-May; Ette, Ene I

    2005-09-02

    This study was performed to develop a new nonparametric approach for the estimation of a robust tissue-to-plasma ratio from extremely sparsely sampled paired data (i.e., one sample each from plasma and tissue per subject). Tissue-to-plasma ratio was estimated from paired/unpaired experimental data using the independent time points approach, area under the curve (AUC) values calculated with the naïve data averaging approach, and AUC values calculated using sampling-based approaches (e.g., the pseudoprofile-based bootstrap [PpbB] approach and the random sampling approach [our proposed approach]). The random sampling approach involves the use of a 2-phase algorithm. The convergence of the sampling/resampling approaches was investigated, as well as the robustness of the estimates produced by the different approaches. To evaluate the latter, new data sets were generated by introducing outlier(s) into the real data set. One to 2 concentration values were inflated by 10% to 40% from their original values to produce the outliers. Tissue-to-plasma ratios computed using the independent time points approach varied between 0 and 50 across time points. The ratio obtained from AUC values acquired using the naïve data averaging approach was not associated with any measure of uncertainty or variability. Calculating the ratio without regard to pairing yielded poorer estimates. The random sampling and pseudoprofile-based bootstrap approaches yielded tissue-to-plasma ratios with uncertainty and variability. However, the random sampling approach, because of the 2-phase nature of its algorithm, yielded more robust estimates and required fewer replications. Therefore, a 2-phase random sampling approach is proposed for the robust estimation of tissue-to-plasma ratio from extremely sparsely sampled data.
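
    A simplified resampling sketch of the idea follows: with one plasma and one tissue sample per subject, repeatedly draw one observation per time point, form profiles, and collect the distribution of AUC ratios. The record's actual two-phase algorithm differs in detail, and the toy data below are invented.

```python
# Simplified resampling sketch of a tissue-to-plasma AUC ratio estimate
# from sparse paired data (not the record's exact 2-phase algorithm).
import numpy as np

rng = np.random.default_rng(8)
times = np.array([0.5, 1, 2, 4, 8])
# One (plasma, tissue) pair per subject, grouped by sampling time (toy data).
plasma = {t: rng.lognormal(np.log(10 / t), 0.3, size=6) for t in times}
tissue = {t: rng.lognormal(np.log(25 / t), 0.3, size=6) for t in times}

ratios = []
for _ in range(5000):
    p = [rng.choice(plasma[t]) for t in times]   # phase 1: sample subjects
    q = [rng.choice(tissue[t]) for t in times]
    ratios.append(np.trapz(q, times) / np.trapz(p, times))  # phase 2: AUC ratio
print(np.mean(ratios), np.percentile(ratios, [2.5, 97.5]))
```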

  6. The study of combining Latin Hypercube Sampling method and LU decomposition method (LULHS method) for constructing spatial random field

    Science.gov (United States)

    WANG, P. T.

    2015-12-01

    Groundwater modeling requires assigning hydrogeological properties to every numerical grid cell. Due to the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. A hydrogeological property is assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be completed. Therefore, statistical sampling plays an important role in the efficiency of the modeling procedure. Latin hypercube sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random sampling procedure of LHS with simulation by LU decomposition to form LULHS. Both conditional and unconditional simulations of LULHS were developed. The simulation efficiency and spatial correlation of LULHS are compared to those of three other simulation methods. The results show that, for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort: fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
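
    A minimal sketch of the unconditional case: draw Latin hypercube samples of standard normals (one stratum per grid cell) and correlate them through the lower-triangular factor of an assumed exponential covariance model. This is a simplification of LULHS, not the paper's full algorithm.

```python
# Minimal sketch: LHS draws correlated via a lower-triangular (LU/Cholesky)
# factor of a spatial covariance matrix, on a 1-D grid for simplicity.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
n = 100
coords = np.arange(n, dtype=float)

# Assumed exponential covariance model: C(h) = sill * exp(-h / range).
h = np.abs(coords[:, None] - coords[None, :])
C = 1.0 * np.exp(-h / 10.0)
L = np.linalg.cholesky(C + 1e-10 * np.eye(n))   # lower-triangular factor

# Latin hypercube: one stratified uniform draw per cell, mapped to N(0, 1).
u = (rng.permutation(n) + rng.random(n).clip(1e-9, 1 - 1e-9)) / n
z = stats.norm.ppf(u)

field = L @ z    # random field with the imposed spatial correlation
print(field[:5].round(3))
```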

  7. Comparing cluster-level dynamic treatment regimens using sequential, multiple assignment, randomized trials: Regression estimation and sample size considerations.

    Science.gov (United States)

    NeCamp, Timothy; Kilbourne, Amy; Almirall, Daniel

    2017-08-01

    Cluster-level dynamic treatment regimens can be used to guide sequential treatment decision-making at the cluster level in order to improve outcomes at the individual or patient-level. In a cluster-level dynamic treatment regimen, the treatment is potentially adapted and re-adapted over time based on changes in the cluster that could be impacted by prior intervention, including aggregate measures of the individuals or patients that compose it. Cluster-randomized sequential multiple assignment randomized trials can be used to answer multiple open questions preventing scientists from developing high-quality cluster-level dynamic treatment regimens. In a cluster-randomized sequential multiple assignment randomized trial, sequential randomizations occur at the cluster level and outcomes are observed at the individual level. This manuscript makes two contributions to the design and analysis of cluster-randomized sequential multiple assignment randomized trials. First, a weighted least squares regression approach is proposed for comparing the mean of a patient-level outcome between the cluster-level dynamic treatment regimens embedded in a sequential multiple assignment randomized trial. The regression approach facilitates the use of baseline covariates which is often critical in the analysis of cluster-level trials. Second, sample size calculators are derived for two common cluster-randomized sequential multiple assignment randomized trial designs for use when the primary aim is a between-dynamic treatment regimen comparison of the mean of a continuous patient-level outcome. The methods are motivated by the Adaptive Implementation of Effective Programs Trial which is, to our knowledge, the first-ever cluster-randomized sequential multiple assignment randomized trial in psychiatry.

  8. Hardware architecture for projective model calculation and false match refining using random sample consensus algorithm

    Science.gov (United States)

    Azimi, Ehsan; Behrad, Alireza; Ghaznavi-Ghoushchi, Mohammad Bagher; Shanbehzadeh, Jamshid

    2016-11-01

    The projective model is an important mapping function for the calculation of global transformation between two images. However, its hardware implementation is challenging because of a large number of coefficients with different required precisions for fixed point representation. A VLSI hardware architecture is proposed for the calculation of a global projective model between input and reference images and refining false matches using random sample consensus (RANSAC) algorithm. To make the hardware implementation feasible, it is proved that the calculation of the projective model can be divided into four submodels comprising two translations, an affine model and a simpler projective mapping. This approach makes the hardware implementation feasible and considerably reduces the required number of bits for fixed point representation of model coefficients and intermediate variables. The proposed hardware architecture for the calculation of a global projective model using the RANSAC algorithm was implemented using Verilog hardware description language and the functionality of the design was validated through several experiments. The proposed architecture was synthesized by using an application-specific integrated circuit digital design flow utilizing 180-nm CMOS technology as well as a Virtex-6 field programmable gate array. Experimental results confirm the efficiency of the proposed hardware architecture in comparison with software implementation.

  9. Accelerating Markov chain Monte Carlo simulation by differential evolution with self-adaptive randomized subspace sampling

    Energy Technology Data Exchange (ETDEWEB)

    Vrugt, Jasper A [Los Alamos National Laboratory; Hyman, James M [Los Alamos National Laboratory; Robinson, Bruce A [Los Alamos National Laboratory; Higdon, Dave [Los Alamos National Laboratory; Ter Braak, Cajo J F [NETHERLANDS; Diks, Cees G H [UNIV OF AMSTERDAM

    2008-01-01

    Markov chain Monte Carlo (MCMC) methods have found widespread use in many fields of study to estimate the average properties of complex systems, and for posterior inference in a Bayesian framework. Existing theory and experiments prove convergence of well-constructed MCMC schemes to the appropriate limiting distribution under a variety of different conditions. In practice, however, this convergence is often observed to be disturbingly slow. This is frequently caused by an inappropriate selection of the proposal distribution used to generate trial moves in the Markov chain. Here we show that significant improvements to the efficiency of MCMC simulation can be made by using a self-adaptive Differential Evolution learning strategy within a population-based evolutionary framework. This scheme, entitled DiffeRential Evolution Adaptive Metropolis or DREAM, runs multiple different chains simultaneously for global exploration, and automatically tunes the scale and orientation of the proposal distribution in randomized subspaces during the search. Ergodicity of the algorithm is proved, and various examples involving nonlinearity, high-dimensionality, and multimodality show that DREAM is generally superior to other adaptive MCMC sampling approaches. The DREAM scheme significantly enhances the applicability of MCMC simulation to complex, multi-modal search problems.
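
    The differential-evolution proposal at the heart of DREAM-style samplers is compact enough to sketch on a Gaussian toy target; the full DREAM scheme adds randomized subspace sampling, adaptive crossover probabilities and outlier-chain handling, all omitted here.

```python
# Minimal sketch of a DE-MC step: propose x_i + gamma*(x_r1 - x_r2) + noise,
# accept by the Metropolis rule. Toy 2-D Gaussian target.
import numpy as np

rng = np.random.default_rng(10)
log_target = lambda x: -0.5 * np.sum(x ** 2)

n_chains, d, n_steps = 8, 2, 5000
X = rng.normal(0, 5, size=(n_chains, d))       # over-dispersed start
gamma = 2.38 / np.sqrt(2 * d)                  # standard DE-MC jump scale

samples = []
for _ in range(n_steps):
    for i in range(n_chains):
        r1, r2 = rng.choice([j for j in range(n_chains) if j != i], 2,
                            replace=False)
        prop = X[i] + gamma * (X[r1] - X[r2]) + rng.normal(0, 1e-6, d)
        if np.log(rng.random()) < log_target(prop) - log_target(X[i]):
            X[i] = prop
    samples.append(X.copy())
print(np.concatenate(samples[1000:]).std(axis=0))   # ~1 in each dimension
```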

  10. Discriminative motif discovery via simulated evolution and random under-sampling.

    Directory of Open Access Journals (Sweden)

    Tao Song

    Full Text Available Conserved motifs in biological sequences are closely related to their structure and functions. Recently, discriminative motif discovery methods have attracted increasing attention. However, little attention has been devoted to the data imbalance problem, which is one of the main factors affecting the performance of discriminative models. In this article, a simulated evolution method is applied to solve the multi-class imbalance problem at the data preprocessing stage, and at the Hidden Markov Model (HMM) training stage, a random under-sampling method is introduced to address the imbalance between the positive and negative datasets. It is shown that, in the task of discovering targeting motifs of nine subcellular compartments, the motifs found by our method are more conserved than those found by methods that do not consider the data imbalance problem, and recover most of the known targeting motifs from Minimotif Miner and InterPro. Meanwhile, we use the found motifs to predict protein subcellular localization and achieve higher prediction precision and recall for the minority classes.

  11. Discriminative motif discovery via simulated evolution and random under-sampling.

    Science.gov (United States)

    Song, Tao; Gu, Hong

    2014-01-01

    Conserved motifs in biological sequences are closely related to their structure and functions. Recently, discriminative motif discovery methods have attracted increasing attention. However, little attention has been devoted to the data imbalance problem, which is one of the main factors affecting the performance of discriminative models. In this article, a simulated evolution method is applied to solve the multi-class imbalance problem at the data preprocessing stage, and at the Hidden Markov Model (HMM) training stage, a random under-sampling method is introduced to address the imbalance between the positive and negative datasets. It is shown that, in the task of discovering targeting motifs of nine subcellular compartments, the motifs found by our method are more conserved than those found by methods that do not consider the data imbalance problem, and recover most of the known targeting motifs from Minimotif Miner and InterPro. Meanwhile, we use the found motifs to predict protein subcellular localization and achieve higher prediction precision and recall for the minority classes.

  12. Neurofeedback Against Binge Eating: A Randomized Controlled Trial in a Female Subclinical Threshold Sample.

    Science.gov (United States)

    Schmidt, Jennifer; Martin, Alexandra

    2016-09-01

    Brain-directed treatment techniques, such as neurofeedback, have recently been proposed as adjuncts in the treatment of eating disorders to improve therapeutic outcomes. In line with this recommendation, a cue exposure EEG-neurofeedback protocol was developed. The present study aimed at the evaluation of the specific efficacy of neurofeedback to reduce subjective binge eating in a female subthreshold sample. A total of 75 subjects were randomized to EEG-neurofeedback, mental imagery with a comparable treatment set-up or a waitlist group. At post-treatment, only EEG-neurofeedback led to a reduced frequency of binge eating (p = .015, g = 0.65). The effects remained stable to a 3-month follow-up. EEG-neurofeedback further showed particularly beneficial effects on perceived stress and dietary self-efficacy. Differences in outcomes did not arise from divergent treatment expectations. Because EEG-neurofeedback showed a specific efficacy, it may be a promising brain-directed approach that should be tested as a treatment adjunct in clinical groups with binge eating. Copyright © 2016 John Wiley & Sons, Ltd and Eating Disorders Association.

  13. Randomization-Based Inference about Latent Variables from Complex Samples: The Case of Two-Stage Sampling

    Science.gov (United States)

    Li, Tiandong

    2012-01-01

    In large-scale assessments, such as the National Assessment of Educational Progress (NAEP), plausible values based on Multiple Imputations (MI) have been used to estimate population characteristics for latent constructs under complex sample designs. Mislevy (1991) derived a closed-form analytic solution for a fixed-effect model in creating…

  14. At convenience and systematic random sampling: effects on the prognostic value of nuclear area assessments in breast cancer patients.

    Science.gov (United States)

    Jannink, I; Bennen, J N; Blaauw, J; van Diest, P J; Baak, J P

    1995-01-01

    This study compares the influence of two different nuclear sampling methods on the prognostic value of assessments of the mean and standard deviation of nuclear area (MNA, SDNA) in 191 consecutive invasive breast cancer patients with long-term follow-up. The first sampling method used was 'at convenience' sampling (ACS); the second, systematic random sampling (SRS). Both sampling methods were tested with a sample size of 50 nuclei (ACS-50 and SRS-50). To determine whether, besides the sampling method, sample size had an impact on prognostic value as well, the SRS method was also tested using a sample size of 100 nuclei (SRS-100). SDNA values were systematically lower for ACS, obviously due to (unconsciously) not including small and large nuclei. Testing the prognostic value of a series of cut-off points, MNA and SDNA values assessed by the SRS method were prognostically significantly stronger than the values obtained by the ACS method. This was confirmed in Cox regression analysis. For the MNA, the Mantel-Cox p-values from SRS-50 and SRS-100 measurements were not significantly different. However, for the SDNA, SRS-100 yielded significantly lower p-values than SRS-50. In conclusion, compared with the 'at convenience' nuclear sampling method, systematic random sampling of nuclei is not only superior with respect to reproducibility of results, but also provides better prognostic value in patients with invasive breast cancer.

  15. Benefits of expressive writing in reducing test anxiety: A randomized controlled trial in Chinese samples.

    Science.gov (United States)

    Shen, Lujun; Yang, Lei; Zhang, Jing; Zhang, Meng

    2018-01-01

    To explore the effect of expressive writing of positive emotions on test anxiety among senior-high-school students. The Test Anxiety Scale (TAS) was used to assess the anxiety level of 200 senior-high-school students. Seventy-five students with high anxiety were recruited and divided randomly into experimental and control groups. Each day for 30 days, the experimental group engaged in 20 minutes of expressive writing of positive emotions, while the control group was asked to merely write down their daily events. A second test was given after the month-long experiment to analyze whether there had been a reduction in anxiety among the sample. Quantitative data was obtained from TAS scores. The NVivo10.0 software program was used to examine the frequency of particular word categories used in participants' writing manuscripts. Senior-high-school students indicated moderate to high test anxiety. There was a significant difference in post-test results (P < 0.05). Students' writing manuscripts were mainly encoded on five code categories: cause, anxiety manifestation, positive emotion, insight and evaluation. There was a negative relation between positive emotion, insight codes and test anxiety. There were significant differences in the positive emotion, anxiety manifestation, and insight code categories between the first 10 days' manuscripts and the last 10 days' ones. Long-term expressive writing of positive emotions appears to help reduce test anxiety by using insight and positive emotion words for Chinese students. Efficient and effective intervention programs to ease test anxiety can be designed based on this study.

  16. Benefits of expressive writing in reducing test anxiety: A randomized controlled trial in Chinese samples.

    Directory of Open Access Journals (Sweden)

    Lujun Shen

    Full Text Available To explore the effect of expressive writing of positive emotions on test anxiety among senior-high-school students.The Test Anxiety Scale (TAS) was used to assess the anxiety level of 200 senior-high-school students. Seventy-five students with high anxiety were recruited and divided randomly into experimental and control groups. Each day for 30 days, the experimental group engaged in 20 minutes of expressive writing of positive emotions, while the control group was asked to merely write down their daily events. A second test was given after the month-long experiment to analyze whether there had been a reduction in anxiety among the sample. Quantitative data was obtained from TAS scores. The NVivo10.0 software program was used to examine the frequency of particular word categories used in participants' writing manuscripts.Senior-high-school students indicated moderate to high test anxiety. There was a significant difference in post-test results (P < 0.05). Students' writing manuscripts were mainly encoded on five code categories: cause, anxiety manifestation, positive emotion, insight and evaluation. There was a negative relation between positive emotion, insight codes and test anxiety. There were significant differences in the positive emotion, anxiety manifestation, and insight code categories between the first 10 days' manuscripts and the last 10 days' ones.Long-term expressive writing of positive emotions appears to help reduce test anxiety by using insight and positive emotion words for Chinese students. Efficient and effective intervention programs to ease test anxiety can be designed based on this study.

  17. Benefits of expressive writing in reducing test anxiety: A randomized controlled trial in Chinese samples

    Science.gov (United States)

    Zhang, Jing; Zhang, Meng

    2018-01-01

    Purpose To explore the effect of expressive writing of positive emotions on test anxiety among senior-high-school students. Methods The Test Anxiety Scale (TAS) was used to assess the anxiety level of 200 senior-high-school students. Seventy-five students with high anxiety were recruited and divided randomly into experimental and control groups. Each day for 30 days, the experimental group engaged in 20 minutes of expressive writing of positive emotions, while the control group was asked to merely write down their daily events. A second test was given after the month-long experiment to analyze whether there had been a reduction in anxiety among the sample. Quantitative data was obtained from TAS scores. The NVivo10.0 software program was used to examine the frequency of particular word categories used in participants’ writing manuscripts. Results Senior-high-school students indicated moderate to high test anxiety. There was a significant difference in post-test results (P < 0.05). Students’ writing manuscripts were mainly encoded on five code categories: cause, anxiety manifestation, positive emotion, insight and evaluation. There was a negative relation between positive emotion, insight codes and test anxiety. There were significant differences in the positive emotion, anxiety manifestation, and insight code categories between the first 10 days’ manuscripts and the last 10 days’ ones. Conclusions Long-term expressive writing of positive emotions appears to help reduce test anxiety by using insight and positive emotion words for Chinese students. Efficient and effective intervention programs to ease test anxiety can be designed based on this study. PMID:29401473

  18. [Systematic review on methodology of randomized controlled trials of post-marketing Chinese patent drugs for treatment of type 2 diabetes].

    Science.gov (United States)

    Ma, Li-xin; Wang, Yu-yi; Li, Xin-xue; Liu, Jian-ping

    2012-03-01

    Randomized controlled trial (RCT) is considered the gold standard for the efficacy assessment of medicines. With the increasing number of Chinese patent drugs for treatment of type 2 diabetes, the methodology of post-marketing RCTs evaluating efficacy and specific effects has become more important. To investigate post-marketing Chinese patent drugs for treatment of type 2 diabetes, as well as the methodological quality of post-marketing RCTs. Literature was searched from the books Newly Compiled Traditional Chinese Patent Medicine and Chinese Pharmacopeia, the websites of the State Food and Drug Administration and the Ministry of Human Resources and Social Security of the People's Republic of China, the China National Knowledge Infrastructure Database, the Chongqing VIP Chinese Science and Technology Periodical Database, the Chinese Biomedical Database (SinoMed) and Wanfang Data. The search period ran from the commencement of each database to August 2011. RCTs of post-marketing Chinese patent drugs for treatment of type 2 diabetes with an intervention course of no less than 3 months were included. Two authors independently evaluated the research quality of the RCTs using a risk-of-bias checklist and data collection forms based on the CONSORT Statement. Independent double data-extraction was performed. The authors identified a total of 149 Chinese patent drugs for treatment of type 2 diabetes. According to their indicative syndromes, the Chinese patent drugs can be divided into the following types: yin deficiency and interior heat (n=48, 32%), dual deficiency of qi and yin (n=58, 39%) and dual deficiency of qi and yin combined with blood stasis (n=22, 15%). A total of 41 RCTs meeting the inclusion criteria were included. Neither multicenter RCTs nor endpoint outcome reports were found. Risk of bias analysis showed that 81% of the included studies reported randomization for grouping without sequence generation, 98% of these studies did not report

  19. Computer code ENDSAM for random sampling and validation of the resonance parameters covariance matrices of some major nuclear data libraries

    International Nuclear Information System (INIS)

    Plevnik, Lucijan; Žerovnik, Gašper

    2016-01-01

    Highlights: • Methods for random sampling of correlated parameters. • Link to open-source code for sampling of resonance parameters in ENDF-6 format. • Validation of the code on realistic and artificial data. • Validation of covariances in three major contemporary nuclear data libraries. - Abstract: Methods for random sampling of correlated parameters are presented. The methods are implemented for sampling of resonance parameters in ENDF-6 format and a link to the open-source code ENDSAM is given. The code has been validated on realistic data. Additionally, consistency of covariances of resonance parameters of three major contemporary nuclear data libraries (JEFF-3.2, ENDF/B-VII.1 and JENDL-4.0u2) has been checked.

  20. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge

    OpenAIRE

    Catarino, Rosa; Vassilakos, Pierre; Bilancioni, Aline; Vanden Eynde, Mathieu; Meyer-Hamme, Ulrike; Menoud, Pierre-Alain; Guerry, Fr?d?ric; Petignat, Patrick

    2015-01-01

    Background Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium. Dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and swab. Methods A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed f...

  1. Application of a stratified random sampling technique to the estimation and minimization of respirable quartz exposure to underground miners

    International Nuclear Information System (INIS)

    Makepeace, C.E.; Horvath, F.J.; Stocker, H.

    1981-11-01

    The aim of a stratified random sampling plan is to provide the best estimate (in the absence of full-shift personal gravimetric sampling) of personal exposure to respirable quartz among underground miners. One also gains information on the exposure distribution of all the miners at the same time. Three variables (or strata) are considered in the present scheme: locations, occupations and times of sampling. Random sampling within each stratum ensures that each location, occupation and time of sampling has an equal opportunity of being selected, without bias. Following implementation of the plan and analysis of the collected data, one can determine the individual exposures and the mean. This information can then be used to identify those groups whose exposure contributes significantly to the collective exposure. In turn, this identification, along with other considerations, allows the mine operator to carry out a cost-benefit optimization and eventual implementation of engineering controls for these groups. This optimization and engineering control procedure, together with the random sampling plan, can then be used in an iterative manner to minimize the mean value of the distribution and the collective exposure.
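
    A minimal sketch of such a plan, with made-up stratum labels rather than any real mine's layout: cells are drawn uniformly at random within each occupation stratum, so every location and time of sampling has the same chance of selection and systematic bias is avoided:

        import random
        from itertools import product

        # Illustrative strata; a real plan would enumerate the mine's actual
        # locations, occupations and sampling times.
        locations = ["stope_A", "stope_B", "haulage_way", "repair_shop"]
        occupations = ["driller", "mucker", "timberman"]
        times = ["shift_start", "mid_shift", "shift_end"]

        def stratified_plan(n_per_stratum, seed=None):
            """Random sampling within each occupation stratum: every
            (location, time) cell has the same chance of selection, so no
            location or time of day is systematically favoured."""
            rng = random.Random(seed)
            cells = list(product(locations, times))
            return {occ: [(occ, *rng.choice(cells)) for _ in range(n_per_stratum)]
                    for occ in occupations}

        plan = stratified_plan(n_per_stratum=4, seed=1)
        for occ, picks in plan.items():
            print(occ, picks[:2])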

  2. Albumin to creatinine ratio in a random urine sample: Correlation with severity of preeclampsia

    Directory of Open Access Journals (Sweden)

    Fady S. Moiety

    2014-06-01

    Conclusions: Random urine ACR may be a reliable method for prediction and assessment of severity of preeclampsia. Using the estimated cut-off may add to the predictive value of such a simple quick test.

  3. Breast ductal lavage for biomarker assessment in high risk women: rationale, design and methodology of a randomized phase II clinical trial with nimesulide, simvastatin and placebo

    International Nuclear Information System (INIS)

    Lazzeroni, Matteo; Radice, Davide; Bonanni, Bernardo; Guerrieri-Gonzaga, Aliana; Serrano, Davide; Cazzaniga, Massimiliano; Mora, Serena; Casadio, Chiara; Jemos, Costantino; Pizzamiglio, Maria; Cortesi, Laura

    2012-01-01

    Although large phase III clinical trials have shown that it is possible to prevent estrogen-responsive breast cancers with selective estrogen receptor modulators and aromatase inhibitors, no significant results have been achieved so far in preventing hormone non-responsive tumors. The Ductal Lavage (DL) procedure offers a minimally invasive method to obtain breast epithelial cells from the ductal system for cytopathologic analysis. Several studies with long-term follow-up have shown that women with atypical hyperplasia have an elevated risk of developing breast cancer. The objective of the proposed trial is to assess the efficacy and safety of daily administration of nimesulide or simvastatin in women at higher risk for breast cancer, with a particular focus on hormone non-responsive tumor risk. The primary endpoint is the change in prevalence of atypical cells and cell proliferation (measured by Ki67) in DL or fine needle aspirate samples, after 12 months of treatment and 12 months after treatment cessation. From 2005 to 2011, 150 women with a history of estrogen receptor negative ductal intraepithelial neoplasia or lobular intraepithelial neoplasia or atypical hyperplasia, or unaffected subjects carrying a mutation of BRCA1 or with a probability of mutation >10% (according to BRCAPRO), were randomized to receive nimesulide 100 mg/day versus simvastatin 20 mg/day versus placebo for one year, followed by a second year of follow-up. This is the first randomized placebo-controlled trial to evaluate the role of DL in studying surrogate endpoint biomarkers and the effects of these drugs on breast carcinogenesis. In 2007 the European Medicines Agency limited the use of systemic formulations of nimesulide to 15 days. In line with a communication from the European Institute of Oncology Ethics Committee, we are now monitoring the study participants even more carefully. Preliminary results showed that DL is a feasible procedure and the treatment is well tolerated

  4. Evaluation of a Class of Simple and Effective Uncertainty Methods for Sparse Samples of Random Variables and Functions

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bonney, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schroeder, Benjamin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Weirs, V. Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-11-01

    When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: the central 95% of response, and a 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depends on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large data base and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
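
    The report's specific estimators are not reproduced here, but its trial-based evaluation strategy is easy to sketch: repeatedly draw a sparse sample, apply a candidate conservative estimator, and record how often it actually bounds the true quantity. The estimator below (an inflated sample range) is deliberately crude and purely illustrative:

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)

        # Width of the central 95% of a standard normal (the quantity to bound).
        true_width = norm.ppf(0.975) - norm.ppf(0.025)

        def candidate_bound(sample, inflation=2.0):
            """Crude conservative estimator: the sample range inflated by a
            fixed factor. The report studies more principled alternatives;
            this one only illustrates the trial-based evaluation."""
            return inflation * (sample.max() - sample.min())

        n, trials, hits = 5, 10_000, 0
        for _ in range(trials):
            s = rng.normal(0.0, 1.0, n)
            hits += candidate_bound(s) >= true_width
        print(f"n={n}: bounded the true central-95% width "
              f"in {hits / trials:.1%} of trials")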

  5. Randomization tests

    CERN Document Server

    Edgington, Eugene

    2007-01-01

    Statistical Tests That Do Not Require Random Sampling Randomization Tests Numerical Examples Randomization Tests and Nonrandom Samples The Prevalence of Nonrandom Samples in Experiments The Irrelevance of Random Samples for the Typical Experiment Generalizing from Nonrandom Samples Intelligibility Respect for the Validity of Randomization Tests Versatility Practicality Precursors of Randomization Tests Other Applications of Permutation Tests Questions and Exercises Notes References Randomized Experiments Unique Benefits of Experiments Experimentation without Mani
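
    The book's central object, the randomization (permutation) test, derives its p-value from random reassignment of observations to groups rather than from any random-sampling assumption. A minimal two-group sketch, with made-up data:

        import numpy as np

        def randomization_test(x, y, n_perm=10_000, seed=None):
            """Two-sample randomization test for a difference in means.

            The p-value is the fraction of random reassignments of the pooled
            observations to groups whose mean difference is at least as
            extreme as the observed one.
            """
            rng = np.random.default_rng(seed)
            pooled = np.concatenate([x, y])
            observed = abs(np.mean(x) - np.mean(y))
            count = 0
            for _ in range(n_perm):
                perm = rng.permutation(pooled)
                diff = abs(perm[:len(x)].mean() - perm[len(x):].mean())
                count += diff >= observed
            return (count + 1) / (n_perm + 1)   # add-one correction

        treatment = np.array([12.1, 14.3, 13.8, 15.0, 12.9])
        control = np.array([11.0, 12.4, 11.7, 12.0, 13.1])
        print(randomization_test(treatment, control, seed=0))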

  6. Relationship between accuracy and number of samples on statistical quantity and contour map of environmental gamma-ray dose rate. Example of random sampling

    International Nuclear Information System (INIS)

    Matsuda, Hideharu; Minato, Susumu

    2002-01-01

    The accuracy of statistical quantities like the mean value and the contour map obtained by measurement of the environmental gamma-ray dose rate was evaluated by random sampling of 5 different model distribution maps generated with the mean slope, -1.3, of the power spectra calculated from actually measured values. The values were derived from 58 natural gamma dose rate data sets reported worldwide, with means in the range of 10-100 nGy/h and areas of 10^-3 to 10^7 km^2. The accuracy of the mean value was about ±7% even for 60 or 80 samplings (the most frequent numbers), and the standard deviation had an accuracy of less than 1/4-1/3 of the mean. The correlation coefficient of the frequency distribution was 0.860 or more for 200-400 samplings (the most frequent numbers), but that of the contour map was 0.502-0.770. (K.H.)
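
    A hedged sketch of the study's approach: sample a synthetic dose-rate map at random points and watch the error of the estimated mean shrink with the number of samplings. A lognormal random field stands in for the paper's power-spectrum-based model maps, so the numbers are only indicative:

        import numpy as np

        rng = np.random.default_rng(3)
        # Synthetic dose-rate map; a lognormal field replaces the paper's
        # power-spectrum (slope -1.3) model distributions.
        field = rng.lognormal(mean=3.0, sigma=0.5, size=(200, 200)).ravel()
        true_mean = field.mean()

        for n in (20, 60, 200, 400):
            rel_errors = [abs(rng.choice(field, n).mean() - true_mean) / true_mean
                          for _ in range(1000)]
            print(f"n={n:4d}  mean relative error of the estimated mean: "
                  f"{np.mean(rel_errors):.1%}")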

  7. Sampling Methodologies for Epidemiologic Surveillance of Men Who Have Sex with Men and Transgender Women in Latin America: An Empiric Comparison of Convenience Sampling, Time Space Sampling, and Respondent Driven Sampling

    OpenAIRE

    Clark, J. L.; Konda, K. A.; Silva-Santisteban, A.; Peinado, J.; Lama, J. R.; Kusunoki, L.; Perez-Brumer, A.; Pun, M.; Cabello, R.; Sebastian, J. L.; Suarez-Ognio, L.; Sanchez, J.

    2014-01-01

    Alternatives to convenience sampling (CS) are needed for HIV/STI surveillance of most-at-risk populations in Latin America. We compared CS, time space sampling (TSS), and respondent driven sampling (RDS) for recruitment of men who have sex with men (MSM) and transgender women (TW) in Lima, Peru. During concurrent 60-day periods from June-August, 2011, we recruited MSM/TW for epidemiologic surveillance using CS, TSS, and RDS. A total of 748 participants were recruited through CS, 233 through T...

  8. New classification of natural breeding habitats for Neotropical anophelines in the Yanomami Indian Reserve, Amazon Region, Brazil and a new larval sampling methodology.

    Science.gov (United States)

    Sánchez-Ribas, Jordi; Oliveira-Ferreira, Joseli; Rosa-Freitas, Maria Goreti; Trilla, Lluís; Silva-do-Nascimento, Teresa Fernandes

    2015-09-01

    Here we present the first in a series of articles about the ecology of immature stages of anophelines in the Brazilian Yanomami area. We propose a new larval habitat classification and a new larval sampling methodology. We also report some preliminary results illustrating the applicability of the methodology based on data collected in the Brazilian Amazon rainforest in a longitudinal study of two remote Yanomami communities, Parafuri and Toototobi. In these areas, we mapped and classified 112 natural breeding habitats located in low-order river systems based on their association with river flood pulses, seasonality and exposure to sun. Our classification rendered seven types of larval habitats: lakes associated with the river, which are subdivided into oxbow lakes and nonoxbow lakes, flooded areas associated with the river, flooded areas not associated with the river, rainfall pools, small forest streams, medium forest streams and rivers. The methodology for larval sampling was based on the accurate quantification of the effective breeding area, taking into account the area of the perimeter and subtypes of microenvironments present per larval habitat type using a laser range finder and a small portable inflatable boat. The new classification and new sampling methodology proposed herein may be useful in vector control programs.

  9. New classification of natural breeding habitats for Neotropical anophelines in the Yanomami Indian Reserve, Amazon Region, Brazil and a new larval sampling methodology

    Directory of Open Access Journals (Sweden)

    Jordi Sánchez-Ribas

    2015-09-01

    Full Text Available Here we present the first in a series of articles about the ecology of immature stages of anophelines in the Brazilian Yanomami area. We propose a new larval habitat classification and a new larval sampling methodology. We also report some preliminary results illustrating the applicability of the methodology based on data collected in the Brazilian Amazon rainforest in a longitudinal study of two remote Yanomami communities, Parafuri and Toototobi. In these areas, we mapped and classified 112 natural breeding habitats located in low-order river systems based on their association with river flood pulses, seasonality and exposure to sun. Our classification rendered seven types of larval habitats: lakes associated with the river, which are subdivided into oxbow lakes and nonoxbow lakes, flooded areas associated with the river, flooded areas not associated with the river, rainfall pools, small forest streams, medium forest streams and rivers. The methodology for larval sampling was based on the accurate quantification of the effective breeding area, taking into account the area of the perimeter and subtypes of microenvironments present per larval habitat type using a laser range finder and a small portable inflatable boat. The new classification and new sampling methodology proposed herein may be useful in vector control programs.

  10. Sample Size Calculation: Inaccurate A Priori Assumptions for Nuisance Parameters Can Greatly Affect the Power of a Randomized Controlled Trial.

    Directory of Open Access Journals (Sweden)

    Elsa Tavernier

    Full Text Available We aimed to examine the extent to which inaccurate assumptions for nuisance parameters used to calculate sample size can affect the power of a randomized controlled trial (RCT). In a simulation study, we separately considered an RCT with continuous, dichotomous or time-to-event outcomes, with associated nuisance parameters of standard deviation, success rate in the control group and survival rate in the control group at some time point, respectively. For each type of outcome, we calculated a required sample size N for a hypothesized treatment effect, an assumed nuisance parameter and a nominal power of 80%. We then assumed a nuisance parameter associated with a relative error at the design stage. For each type of outcome, we randomly drew 10,000 relative errors of the associated nuisance parameter (from empirical distributions derived from a previously published review). Then, retro-fitting the sample size formula, we derived, for the pre-calculated sample size N, the real power of the RCT, taking into account the relative error for the nuisance parameter. In total, 23%, 0% and 18% of RCTs with continuous, binary and time-to-event outcomes, respectively, were underpowered (i.e., the real power fell short of the nominal 80%), and others were overpowered (real power above 90%). Even with proper calculation of sample size, a substantial number of trials are underpowered or overpowered because of imprecise knowledge of nuisance parameters. Such findings raise questions about how sample size for RCTs should be determined.
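
    The retro-fitting idea is easy to reproduce for a continuous outcome with a normal-approximation formula: compute N from an assumed standard deviation, then evaluate the power actually achieved if the true standard deviation differs. This sketch uses textbook formulas, not the paper's exact simulation code:

        from scipy.stats import norm

        def n_per_group(delta, sd, alpha=0.05, power=0.80):
            """Normal-approximation sample size per group for two means."""
            z_a, z_b = norm.ppf(1 - alpha / 2), norm.ppf(power)
            return 2 * ((z_a + z_b) * sd / delta) ** 2

        def real_power(n, delta, true_sd, alpha=0.05):
            """Power actually achieved with n per group if the SD is true_sd."""
            z_a = norm.ppf(1 - alpha / 2)
            return norm.cdf(delta / (true_sd * (2 / n) ** 0.5) - z_a)

        n = n_per_group(delta=5.0, sd=10.0)        # planned with assumed SD = 10
        print(f"planned n per group: {n:.0f}")     # about 63
        for rel_err in (-0.2, 0.0, 0.2):           # true SD 20% below/at/above
            p = real_power(n, delta=5.0, true_sd=10.0 * (1 + rel_err))
            print(f"true SD off by {rel_err:+.0%}: real power = {p:.2f}")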

  11. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data Sampling provides an up-to-date treat

  12. Characteristics of men with substance use disorder consequent to illicit drug use: comparison of a random sample and volunteers.

    Science.gov (United States)

    Reynolds, Maureen D; Tarter, Ralph E; Kirisci, Levent

    2004-09-06

    Men qualifying for substance use disorder (SUD) consequent to consumption of an illicit drug were compared according to recruitment method. It was hypothesized that volunteers would be more self-disclosing and exhibit more severe disturbances than randomly recruited subjects. Personal, demographic, family, social, substance use, psychiatric, and SUD characteristics of volunteers (N = 146) were compared to randomly recruited (N = 102) subjects. Volunteers had lower socioeconomic status, were more likely to be African American, and had lower IQ than randomly recruited subjects. Volunteers also evidenced greater social and family maladjustment and more frequently had received treatment for substance abuse. In addition, lower social desirability response bias was observed in the volunteers. SUD was not more severe in the volunteers; however, they reported a higher lifetime rate of opiate, diet, depressant, and analgesic drug use. Volunteers and randomly recruited subjects qualifying for SUD consequent to illicit drug use are similar in SUD severity but differ in severity of psychosocial disturbance and history of drug involvement. The factors discriminating volunteers from randomly recruited subjects are well known to affect outcome; hence they need to be considered in research design, especially when selecting a sampling strategy in treatment research.

  13. Sampling methodologies for epidemiologic surveillance of men who have sex with men and transgender women in Latin America: an empiric comparison of convenience sampling, time space sampling, and respondent driven sampling.

    Science.gov (United States)

    Clark, J L; Konda, K A; Silva-Santisteban, A; Peinado, J; Lama, J R; Kusunoki, L; Perez-Brumer, A; Pun, M; Cabello, R; Sebastian, J L; Suarez-Ognio, L; Sanchez, J

    2014-12-01

    Alternatives to convenience sampling (CS) are needed for HIV/STI surveillance of most-at-risk populations in Latin America. We compared CS, time space sampling (TSS), and respondent driven sampling (RDS) for recruitment of men who have sex with men (MSM) and transgender women (TW) in Lima, Peru. During concurrent 60-day periods from June-August, 2011, we recruited MSM/TW for epidemiologic surveillance using CS, TSS, and RDS. A total of 748 participants were recruited through CS, 233 through TSS, and 127 through RDS. The TSS sample included the largest proportion of TW (30.7 %) and the lowest percentage of subjects who had previously participated in HIV/STI research (14.9 %). The prevalence of newly diagnosed HIV infection, according to participants' self-reported previous HIV diagnosis, was highest among TSS recruits (17.9 %) compared with RDS (12.6 %) and CS (10.2 %). TSS identified diverse populations of MSM/TW with higher prevalences of HIV/STIs not accessed by other methods.

  14. Sampling Methodologies for Epidemiologic Surveillance of Men Who Have Sex with Men and Transgender Women in Latin America: An Empiric Comparison of Convenience Sampling, Time Space Sampling, and Respondent Driven Sampling

    Science.gov (United States)

    Clark, J. L.; Konda, K. A.; Silva-Santisteban, A.; Peinado, J.; Lama, J. R.; Kusunoki, L.; Perez-Brumer, A.; Pun, M.; Cabello, R.; Sebastian, J. L.; Suarez-Ognio, L.; Sanchez, J.

    2014-01-01

    Alternatives to convenience sampling (CS) are needed for HIV/STI surveillance of most-at-risk populations in Latin America. We compared CS, time space sampling (TSS), and respondent driven sampling (RDS) for recruitment of men who have sex with men (MSM) and transgender women (TW) in Lima, Peru. During concurrent 60-day periods from June–August, 2011, we recruited MSM/TW for epidemiologic surveillance using CS, TSS, and RDS. A total of 748 participants were recruited through CS, 233 through TSS, and 127 through RDS. The TSS sample included the largest proportion of TW (30.7 %) and the lowest percentage of subjects who had previously participated in HIV/STI research (14.9 %). The prevalence of newly diagnosed HIV infection, according to participants’ self-reported previous HIV diagnosis, was highest among TSS recruits (17.9 %) compared with RDS (12.6 %) and CS (10.2 %). TSS identified diverse populations of MSM/TW with higher prevalences of HIV/STIs not accessed by other methods. PMID:24362754

  15. Randomization of grab-sampling strategies for estimating the annual exposure of U miners to Rn daughters.

    Science.gov (United States)

    Borak, T B

    1986-04-01

    Periodic grab sampling in combination with time-of-occupancy surveys has been the accepted procedure for estimating the annual exposure of underground U miners to Rn daughters. Temporal variations in the concentration of potential alpha energy in the mine generate uncertainties in this process. A system to randomize the selection of locations for measurement is described which can reduce uncertainties and eliminate systematic biases in the data. In general, a sample frequency of 50 measurements per year is sufficient to satisfy the criteria that the annual exposure be determined in working level months to within +/- 50% of the true value with a 95% level of confidence. Suggestions for implementing this randomization scheme are presented.
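
    A small simulation conveys why a modest number of randomized grab samples can meet such a criterion. The lognormal shift-to-shift variation below is an assumption for illustration, not the paper's model of mine-air concentrations:

        import numpy as np

        rng = np.random.default_rng(7)

        # Synthetic working-level concentrations for 250 shifts in a year.
        shifts = rng.lognormal(mean=-1.0, sigma=0.8, size=250)
        true_annual = shifts.sum()

        trials, within = 10_000, 0
        for _ in range(trials):
            picked = rng.choice(shifts, size=50, replace=False)  # 50 random samples
            estimate = picked.mean() * len(shifts)               # scale to the year
            within += abs(estimate - true_annual) <= 0.5 * true_annual
        print(f"{within / trials:.1%} of estimates within +/-50% of the true value")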

  16. Random Sampling with Interspike-Intervals of the Exponential Integrate and Fire Neuron: A Computational Interpretation of UP-States.

    Directory of Open Access Journals (Sweden)

    Andreas Steimer

    Full Text Available Oscillations between high and low values of the membrane potential (UP and DOWN states, respectively) are a ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, only a surprisingly small number of quantitative studies have dealt with this phenomenon's implications for computation. Here we present a novel theory that explains on a detailed mathematical level the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate and fire (EIF) model neuron, such that each spike is considered a sample, whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, which kicks in when balancing a noisy membrane potential around values close to the firing threshold, leads to a particularly simple, approximative relationship between the neuron's ISI distribution and input current. Approximation quality depends on the frequency spectrum of the current and is improved upon increasing the voltage baseline towards threshold. Thus, the conceptually simpler leaky integrate and fire neuron, which is missing such an additional current boost, performs consistently worse than the EIF and does not improve when the voltage baseline is increased. For the EIF, in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP-states. Our theoretical results are confirmed by accompanying simulations, which were conducted for input currents of varying spectral composition. Moreover, we provide analytical estimations of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be considered by any algorithmic procedure that is based on random sampling, such as Markov Chain Monte Carlo or message-passing methods. Finally, we explain how spike-based random sampling relates to existing

  17. Random Sampling with Interspike-Intervals of the Exponential Integrate and Fire Neuron: A Computational Interpretation of UP-States.

    Science.gov (United States)

    Steimer, Andreas; Schindler, Kaspar

    2015-01-01

    Oscillations between high and low values of the membrane potential (UP and DOWN states, respectively) are a ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, only a surprisingly small number of quantitative studies have dealt with this phenomenon's implications for computation. Here we present a novel theory that explains on a detailed mathematical level the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate and fire (EIF) model neuron, such that each spike is considered a sample, whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, which kicks in when balancing a noisy membrane potential around values close to the firing threshold, leads to a particularly simple, approximative relationship between the neuron's ISI distribution and input current. Approximation quality depends on the frequency spectrum of the current and is improved upon increasing the voltage baseline towards threshold. Thus, the conceptually simpler leaky integrate and fire neuron, which is missing such an additional current boost, performs consistently worse than the EIF and does not improve when the voltage baseline is increased. For the EIF, in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP-states. Our theoretical results are confirmed by accompanying simulations, which were conducted for input currents of varying spectral composition. Moreover, we provide analytical estimations of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be considered by any algorithmic procedure that is based on random sampling, such as Markov Chain Monte Carlo or message-passing methods. Finally, we explain how spike-based random sampling relates to existing computational

  18. Parameters, test criteria and fault assessment in random sampling of waste barrels from non-qualified processes

    International Nuclear Information System (INIS)

    Martens, B.R.

    1989-01-01

    In the context of random sampling tests, parameters are checked on the waste barrels and criteria are given on which these tests are based. Also, it is shown how faulty data on the properties of the waste or faulty waste barrels should be treated. To decide the extent of testing, the properties of the waste relevant to final storage are determined based on the conditioning process used. (DG) [de

  19. Random or systematic sampling to detect a localised microbial contamination within a batch of food

    NARCIS (Netherlands)

    Jongenburger, I.; Reij, M.W.; Boer, E.P.J.; Gorris, L.G.M.; Zwietering, M.H.

    2011-01-01

    Pathogenic microorganisms are known to be distributed heterogeneously in food products that are solid, semi-solid or powdered, like for instance peanut butter, cereals, or powdered milk. This complicates effective detection of the pathogens by sampling. Two-class sampling plans, which are deployed

  20. Conditional estimation of exponential random graph models from snowball sampling designs

    NARCIS (Netherlands)

    Pattison, Philippa E.; Robins, Garry L.; Snijders, Tom A. B.; Wang, Peng

    2013-01-01

    A complete survey of a network in a large population may be prohibitively difficult and costly. So it is important to estimate models for networks using data from various network sampling designs, such as link-tracing designs. We focus here on snowball sampling designs, designs in which the members

  1. Inferences about Variance Components and Reliability-Generalizability Coefficients in the Absence of Random Sampling.

    Science.gov (United States)

    Kane, Michael

    2002-01-01

    Reviews the criticisms of sampling assumptions in generalizability theory (and in reliability theory) and examines the feasibility of using representative sampling, stratification, homogeneity assumptions, and replications to address these criticisms. Suggests some general outlines for the conduct of generalizability theory studies. (SLD)

  2. Implementing the PAIN RelieveIt Randomized Controlled Trial in Hospice Care: Mechanisms for Success and Meeting PCORI Methodology Standards.

    Science.gov (United States)

    Ezenwa, Miriam O; Suarez, Marie L; Carrasco, Jesus D; Hipp, Theresa; Gill, Anayza; Miller, Jacob; Shea, Robert; Shuey, David; Zhao, Zhongsheng; Angulo, Veronica; McCurry, Timothy; Martin, Joanna; Yao, Yingwei; Molokie, Robert E; Wang, Zaijie Jim; Wilkie, Diana J

    2017-07-01

    The purpose of this article is to describe how we adhere to the Patient-Centered Outcomes Research Institute's (PCORI) methodology standards relevant to the design and implementation of our PCORI-funded study, the PAIN RelieveIt Trial. We present details of the PAIN RelieveIt Trial organized by the PCORI methodology standards and components that are relevant to our study. The PAIN RelieveIt Trial adheres to four PCORI standards and 21 subsumed components. The four standards include standards for formulating research questions, standards associated with patient-centeredness, standards for data integrity and rigorous analyses, and standards for preventing and handling missing data. In the past 24 months, we screened 2,837 cancer patients and their caregivers; 874 dyads were eligible; 223.5 dyads consented and provided baseline data. Only 55 patients were lost to follow-up, a 25% attrition rate. The design and implementation of the PAIN RelieveIt Trial adhered to PCORI's methodology standards for research rigor.

  3. FACE Analysis as a Fast and Reliable Methodology to Monitor the Sulfation and Total Amount of Chondroitin Sulfate in Biological Samples of Clinical Importance

    Directory of Open Access Journals (Sweden)

    Evgenia Karousou

    2014-06-01

    Full Text Available Glycosaminoglycans (GAGs), due to their hydrophilic character and high anionic charge densities, play important roles in various (patho)physiological processes. The identification and quantification of GAGs in biological samples and tissues could be useful prognostic and diagnostic tools in pathological conditions. Despite the noteworthy progress in the development of sensitive and accurate methodologies for the determination of GAGs, there is a significant lack of methodologies regarding sample preparation and reliable fast analysis methods enabling the simultaneous analysis of several biological samples. In this report, developed protocols for the isolation of GAGs in biological samples were applied to analyze various sulfated chondroitin sulfate- and hyaluronan-derived disaccharides using fluorophore-assisted carbohydrate electrophoresis (FACE). Applications to biological samples of clinical importance include blood serum, lens capsule tissue and urine. The sample preparation protocol followed by FACE analysis allows quantification with an optimal linearity over the concentration range 1.0-220.0 µg/mL, affording a limit of quantitation of 50 ng of disaccharides. Validation of FACE results was performed by capillary electrophoresis and high performance liquid chromatography techniques.

  4. New methodological approaches to the simultaneous measurement of the 90Sr and 137Cs activity in environmental samples

    Directory of Open Access Journals (Sweden)

    M. V. Zheltonozhska

    2012-12-01

    Full Text Available A non-radiochemical method for measuring the 90Sr and 137Cs activity in environmental samples is proposed. The method is based on spectrometric investigation of the electrons accompanying the decay of 90Sr and 137Cs. Accounting for the contribution of 40K electrons to the total activity of samples from zones with a contamination density of 1-5 Ci/km2 improved the accuracy of the measurements to 15-20% for samples of small rodents (the ratio A(137Cs)/A(90Sr) ranged from 2 to 100) and to 10-15% for samples of soil (the activity in these samples varied by a factor of ten thousand). The results of the spectrometric measurements were confirmed by traditional radiochemical research.

  5. Random sampling technique for ultra-fast computations of molecular opacities for exoplanet atmospheres

    Science.gov (United States)

    Min, M.

    2017-10-01

    Context. Opacities of molecules in exoplanet atmospheres rely on increasingly detailed line lists for these molecules. The line lists available today contain, for many species, up to several billion lines. Computation of the spectral line profile created by pressure and temperature broadening, the Voigt profile, for all of these lines is becoming a computational challenge. Aims: We aim to create a method to compute the Voigt profile in a way that automatically focuses the computation time on the strongest lines, while still maintaining the continuum contribution of the large number of weaker lines. Methods: Here, we outline a statistical line sampling technique that samples the Voigt profile quickly and with high accuracy. The number of samples is adjusted to the strength of the line and the local spectral line density. This automatically provides high-accuracy line shapes for strong lines or lines that are spectrally isolated. The line sampling technique automatically preserves the integrated line opacity for all lines, thereby also providing the continuum opacity created by the large number of weak lines at very low computational cost. Results: The line sampling technique is tested for accuracy when computing line spectra and correlated-k tables. Extremely fast computations (~3.5 × 10^5 lines per second per core on a standard current-day desktop computer) with high accuracy (≤1% almost everywhere) are obtained. A detailed recipe on how to perform the computations is given.
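
    A minimal sketch of the sampling idea (not the paper's recipe): draw random frequency offsets from the Voigt distribution itself, which is a Gaussian draw plus a Cauchy draw, give each sample an equal share of the line strength so integrated opacity is conserved exactly, and allocate more samples to stronger lines. All parameter values are illustrative:

        import numpy as np
        from scipy.special import wofz

        def voigt(x, sigma, gamma):
            """Unit-area Voigt profile via the Faddeeva function wofz."""
            z = (x + 1j * gamma) / (sigma * np.sqrt(2.0))
            return wofz(z).real / (sigma * np.sqrt(2.0 * np.pi))

        def deposit_line(grid, center, strength, sigma, gamma, n, rng):
            """Deposit one line onto the opacity grid using n random samples.

            A Voigt variate is a Gaussian(0, sigma) plus a Cauchy(0, gamma)
            draw; each sample carries strength/n, so the integrated opacity
            is preserved however few samples a weak line receives.
            """
            kappa = np.zeros_like(grid)
            x = rng.normal(0.0, sigma, n) \
                + gamma * np.tan(np.pi * (rng.random(n) - 0.5))
            idx = np.clip(np.searchsorted(grid, center + x), 0, len(grid) - 1)
            np.add.at(kappa, idx, strength / n)
            return kappa / (grid[1] - grid[0])

        rng = np.random.default_rng(1)
        grid = np.linspace(990.0, 1010.0, 2001)          # cm^-1, illustrative
        strengths = np.array([1e-20, 1e-23, 1e-25])      # strong to weak line
        # Sample counts scale with line strength, with a floor for weak lines.
        counts = np.maximum(4, (1e4 * strengths / strengths.max()).astype(int))
        total = sum(deposit_line(grid, 1000.0, s, 0.05, 0.05, n, rng)
                    for s, n in zip(strengths, counts))
        print("integrated:", total.sum() * (grid[1] - grid[0]), "=", strengths.sum())
        print("peak vs analytic:", total.max(), strengths[0] * voigt(0.0, 0.05, 0.05))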

  6. Characterization of electron microscopes with binary pseudo-random multilayer test samples

    International Nuclear Information System (INIS)

    Yashchuk, Valeriy V.; Conley, Raymond; Anderson, Erik H.; Barber, Samuel K.; Bouet, Nathalie; McKinney, Wayne R.; Takacs, Peter Z.; Voronov, Dmitriy L.

    2010-01-01

    We discuss the results of SEM and TEM measurements with test samples fabricated from a binary pseudo-random multilayer (BPRML; WSi2/Si with a fundamental layer thickness of 3 nm) using a dual-beam FIB (focused ion beam)/SEM technique. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize x-ray microscopes. Corresponding work with x-ray microscopes is in progress.

  7. Random sampling of the Central European bat fauna reveals the existence of numerous hitherto unknown adenoviruses.

    Science.gov (United States)

    Vidovszky, Márton; Kohl, Claudia; Boldogh, Sándor; Görföl, Tamás; Wibbelt, Gudrun; Kurth, Andreas; Harrach, Balázs

    2015-12-01

    From over 1250 extant species of the order Chiroptera, 25 and 28 are known to occur in Germany and Hungary, respectively. Close to 350 samples originating from 28 bat species (17 from Germany, 27 from Hungary) were screened for the presence of adenoviruses (AdVs) using a nested PCR that targets the DNA polymerase gene of AdVs. An additional PCR was designed and applied to amplify a fragment from the gene encoding the IVa2 protein of mastadenoviruses. All German samples originated from organs of bats found moribund or dead. The Hungarian samples were excrements collected from colonies of known bat species, throat or rectal swab samples, taken from live individuals that had been captured for faunistic surveys and migration studies, as well as internal organs of dead specimens. Overall, 51 samples (14.73%) were found positive. We detected 28 seemingly novel and six previously described bat AdVs by sequencing the PCR products. The positivity rate was the highest among the guano samples of bat colonies. In phylogeny reconstructions, the AdVs detected in bats clustered roughly, but not perfectly, according to the hosts' families (Vespertilionidae, Rhinolophidae, Hipposideridae, Phyllostomidae and Pteropodidae). In a few cases, identical sequences were derived from animals of closely related species. On the other hand, some bat species proved to harbour more than one type of AdV. The high prevalence of infection and the large number of chiropteran species worldwide make us hypothesise that hundreds of different yet unknown AdV types might circulate in bats.

  8. Effect of sampling and short isolation methodologies on the recovery of human pathogenic Yersinia enterocolitica from pig tonsils.

    Science.gov (United States)

    Van Damme, Inge; Berkvens, Dirk; De Zutter, Lieven

    2012-07-01

    The objective of this study was to determine the effect of sampling (swab samples compared to destructive samples) on isolation rates of human pathogenic Yersinia enterocolitica from pig tonsils. Moreover, the relative efficiency of different rapid, routinely applicable isolation methods was evaluated. Therefore, swab and destructive samples from tonsils of 120 pigs at slaughter were analyzed in parallel using direct plating and different enrichment methods. Salmonella-Shigella-desoxycholate-calcium chloride (SSDC) agar, cefsulodin-irgasan-novobiocin (CIN) agar, and Yersinia enterocolitica chromogenic medium (YeCM) were used as selective agar media. For enrichment, irgasan-ticarcillin-potassium chlorate (ITC) broth and peptone-sorbitol-bile (PSB) broth were incubated at 25°C for 48 h. Overall, 55 tonsils (45.8%) were positive for Y. enterocolitica bioserotype 4/O:3. Recovery was significantly higher using the destructive method compared to the swabbing method. Direct plating resulted in 47 and 28 Y. enterocolitica-positive destructive and swab samples, respectively. Alkali treatment of PSB and ITC enrichment broths significantly increased recovery of pathogenic Y. enterocolitica from destructive tonsil samples. The performance of YeCM for qualitative and quantitative isolation of pathogenic Y. enterocolitica from pig tonsils was equal to SSDC and CIN. In conclusion, direct plating and ISO 10273: 2003 with minor modifications are suitable and rapid methods for isolation of pathogenic Y. enterocolitica from destructive tonsil samples.

  9. Determination of radium isotopes in environmental samples by gamma spectrometry, liquid scintillation counting and alpha spectrometry: a review of analytical methodology

    International Nuclear Information System (INIS)

    Jia, Guogang; Jia, Jing

    2012-01-01

    Radium (Ra) isotopes are important from the viewpoints of radiation protection and environmental protection. Their high toxicity has stimulated the continuing interest in methodology research for determination of Ra isotopes in various media. In this paper, the three most routinely used analytical techniques for Ra isotope determination in biological and environmental samples, i.e. low-background γ-spectrometry, liquid scintillation counting and α-spectrometry, were reviewed, with emphasis on new methodological developments in sample preparation, preconcentration, separation, purification, source preparation and measurement techniques. The accuracy, selectivity, traceability, applicability and minimum detectable activity (MDA) of the three techniques were discussed. It was concluded that the MDA (0.1 mBq L^-1) of the α-spectrometry technique coupled with chemical separation is about two orders of magnitude lower than that of low-background HPGe γ-spectrometry and LSC techniques. Therefore, when maximum sensitivity is required, the α-spectrometry technique remains the first choice. - Highlights: ► A review is made for determination of Ra isotopes in environmental samples. ► Gamma spectrometry, LSC and α-spectrometry are the main radiometric approaches concerned. ► Sample preparation, preconcentration, separation and source preparation are discussed. ► The methods can analyse air, water, seawater, soil, sediment and foodstuff samples. ► Some new data obtained recently from our laboratory for Ra method study are included.

  10. Associations Among Religiousness and Community Volunteerism in National Random Samples of American Adults.

    Science.gov (United States)

    Haggard, Megan C; Kang, Linda L; Rowatt, Wade C; Shen, Megan Johnson

    2015-01-01

    The connection between religiousness and volunteering for the community can be explained through two distinct features of religion. First, religious organizations are social groups that encourage members to help others through planned opportunities. Second, helping others is regarded as an important value for members in religious organizations to uphold. We examined the relationship between religiousness and self-reported community volunteering in two independent national random surveys of American adults (i.e., the 2005 and 2007 waves of the Baylor Religion Survey). In both waves, frequency of religious service attendance was associated with an increase in likelihood that individuals would volunteer, whether through their religious organization or not, whereas frequency of reading sacred texts outside of religious services was associated with an increase in likelihood of volunteering only for or through their religious organization. The role of religion in community volunteering is discussed in light of these findings.

  11. Adaptation of G-TAG Software for Validating Touch-and-Go Comet Surface Sampling Design Methodology

    Science.gov (United States)

    Mandic, Milan; Acikmese, Behcet; Blackmore, Lars

    2011-01-01

    The G-TAG software tool was developed under the R&TD on Integrated Autonomous Guidance, Navigation, and Control for Comet Sample Return, and represents a novel, multi-body dynamics simulation software tool for studying TAG sampling. The G-TAG multi-body simulation tool provides a simulation environment in which a Touch-and-Go (TAG) sampling event can be extensively tested. TAG sampling requires the spacecraft to descend to the surface, contact the surface with a sampling collection device, and then to ascend to a safe altitude. The TAG event lasts only a few seconds but is mission-critical with potentially high risk. Consequently, there is a need for the TAG event to be well characterized and studied by simulation and analysis in order for the proposal teams to converge on a reliable spacecraft design. This adaptation of the G-TAG tool was developed to support the Comet Odyssey proposal effort, and is specifically focused to address comet sample return missions. In this application, the spacecraft descends to and samples from the surface of a comet. Performance of the spacecraft during TAG is assessed based on survivability and sample collection performance. For the adaptation of the G-TAG simulation tool to comet scenarios, models are developed that accurately describe the properties of the spacecraft, approach trajectories, and descent velocities, as well as the models of the external forces and torques acting on the spacecraft. The adapted models of the spacecraft, descent profiles, and external sampling forces/torques were more sophisticated and customized for comets than those available in the basic G-TAG simulation tool. Scenarios implemented include the study of variations in requirements, spacecraft design (size, locations, etc. of the spacecraft components), and the environment (surface properties, slope, disturbances, etc.). The simulations, along with their visual representations using G-View, contributed to the Comet Odyssey New Frontiers proposal

  12. Re-estimating sample size in cluster randomized trials with active recruitment within clusters

    NARCIS (Netherlands)

    van Schie, Sander; Moerbeek, Mirjam

    2014-01-01

    Often only a limited number of clusters can be obtained in cluster randomised trials, although many potential participants can be recruited within each cluster. Thus, active recruitment is feasible within the clusters. To obtain an efficient sample size in a cluster randomised trial, the cluster

  13. A systematic random sampling scheme optimized to detect the proportion of rare synapses in the neuropil.

    Science.gov (United States)

    da Costa, Nuno Maçarico; Hepp, Klaus; Martin, Kevan A C

    2009-05-30

    Synapses can only be morphologically identified by electron microscopy and this is often a very labor-intensive and time-consuming task. When quantitative estimates are required for pathways that contribute a small proportion of synapses to the neuropil, the problems of accurate sampling are particularly severe and the total time required may become prohibitive. Here we present a sampling method devised to count the percentage of rarely occurring synapses in the neuropil using a large sample (approximately 1000 sampling sites), with the strong constraint of doing it in reasonable time. The strategy, which uses the unbiased physical disector technique, resembles that used in particle physics to detect rare events. We validated our method in the primary visual cortex of the cat, where we used biotinylated dextran amine to label thalamic afferents and measured the density of their synapses using the physical disector method. Our results show that we could obtain accurate counts of the labeled synapses, even when they represented only 0.2% of all the synapses in the neuropil.
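
    Why roughly 1000 sampling sites are needed for a synapse class contributing only 0.2% of synapses can be seen from the binomial distribution of hit counts; even at that sample size, a survey has a sizeable chance of finding no labeled synapse at all:

        from scipy.stats import binom

        p, n_sites = 0.002, 1000   # rare synapse proportion; disector sites

        # Probability of finding exactly k labeled synapses among n_sites sites.
        for k in range(5):
            print(k, round(binom.pmf(k, n_sites, p), 3))

        # Chance of finding at least one labeled synapse in the whole survey.
        print("P(at least one) =", round(1 - binom.pmf(0, n_sites, p), 3))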

  14. Random Assignment of Schools to Groups in the Drug Resistance Strategies Rural Project: Some New Methodological Twists

    Science.gov (United States)

    Pettigrew, Jonathan; Miller-Day, Michelle; Krieger, Janice L.; Zhou, Jiangxiu; Hecht, Michael L.

    2014-01-01

    Random assignment to groups is the foundation for scientifically rigorous clinical trials. But assignment is challenging in group randomized trials when only a few units (schools) are assigned to each condition. In the DRSR project, we assigned 39 rural Pennsylvania and Ohio schools to three conditions (rural, classic, control). But even with 13 schools per condition, achieving pretest equivalence on important variables is not guaranteed. We collected data on six important school-level variables: rurality, number of grades in the school, enrollment per grade, percent white, percent receiving free/assisted lunch, and test scores. Key to our procedure was the inclusion of school-level drug use data, available for a subset of the schools; also key was that we handled the partial data with modern missing-data techniques. We chose to create one composite stratifying variable based on the seven school-level variables available. Principal components analysis with the seven variables yielded two factors, which were averaged to form the composite inflate-suppress (CIS) score, which was the basis of stratification, as sketched below. The CIS score was broken into three strata within each state; schools were assigned at random to the three program conditions from within each stratum, within each state. Results showed that program group membership was unrelated to the CIS score, the two factors making up the CIS score, and the seven items making up the factors. Program group membership was not significantly related to pretest measures of drug use (alcohol, cigarettes, marijuana, chewing tobacco; smallest p > .15), thus verifying that pretest equivalence was achieved. PMID:23722619
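
    A hedged sketch of the assignment procedure described above, with simulated covariates standing in for the real school data: build a composite from the first two principal components, cut it into tertile strata, and randomize to conditions within each stratum:

        import numpy as np

        rng = np.random.default_rng(13)

        # Illustrative school-level covariates (rows = 39 schools); the study
        # used rurality, grades, enrollment, demographics, lunch assistance,
        # test scores and partial drug-use data.
        X = rng.normal(size=(39, 7))
        Xz = (X - X.mean(0)) / X.std(0)

        # First two principal components via SVD, averaged into one composite.
        U, s, Vt = np.linalg.svd(Xz, full_matrices=False)
        composite = (Xz @ Vt[:2].T).mean(axis=1)

        # Three strata by composite tertile; random assignment within stratum.
        strata = np.digitize(composite, np.quantile(composite, [1 / 3, 2 / 3]))
        conditions = np.empty(len(X), dtype=object)
        for stratum in np.unique(strata):
            idx = rng.permutation(np.where(strata == stratum)[0])
            for j, cond in zip(np.array_split(idx, 3),
                               ["rural", "classic", "control"]):
                conditions[j] = cond
        print(list(zip(strata, conditions))[:5])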

  15. JOINT STUDY OF IMPROVED SAFEGUARDS METHODOLOGY USING NO-NOTICE RANDOMIZED INSPECTION AT JNC'S Pu HANDLING FACILITIES

    International Nuclear Information System (INIS)

    LU, M.S.; SANBORN, J.B.

    2000-01-01

    After the Iraq war, the International Atomic Energy Agency (IAEA) 93+2 Program was developed to strengthen and improve the cost-effectiveness of the existing safeguards system. In particular, the Program aims to enhance the IAEA's ability to detect undeclared nuclear activities and materials. The IAEA 93+2 Program includes: (1) increased access to information and its effective use; (2) increased physical access; (3) optimum use of the existing system. The measures considered are divided into two parts: measures in Part 1 are those which may be implemented within the existing IAEA authority; Part 2 measures require complementary legal authority, in the form of an Additional Protocol, INFCIRC/540. A description of the status of its implementation can be found in ''Implementation of the Additional Protocol'' (Cooley, 1999). In particular, increased physical access includes access beyond declared locations, requiring additional authority derived from INFCIRC/540, and no-notice randomized inspections. No-notice randomized inspections could enhance inspection effectiveness and efficiency by increasing the coverage of the material involved, providing better confirmation of the operational status of the facilities and a higher degree of confidence that no undeclared activities or materials exist at the facilities, including the detection of possible measures to conceal diversions

  16. Random Evolutionary Dynamics Driven by Fitness and House-of-Cards Mutations: Sampling Formulae

    Science.gov (United States)

    Huillet, Thierry E.

    2017-07-01

    We first revisit the multi-allelic mutation-fitness balance problem, especially when mutations obey a house of cards condition, where the discrete-time deterministic evolutionary dynamics of the allelic frequencies derives from a Shahshahani potential. We then consider multi-allelic Wright-Fisher stochastic models whose deviation to neutrality is from the Shahshahani mutation/selection potential. We next focus on the weak selection, weak mutation cases and, making use of a Gamma calculus, we compute the normalizing partition functions of the invariant probability densities appearing in their Wright-Fisher diffusive approximations. Using these results, generalized Ewens sampling formulae (ESF) from the equilibrium distributions are derived. We start treating the ESF in the mixed mutation/selection potential case and then we restrict ourselves to the ESF in the simpler house-of-cards mutations only situation. We also address some issues concerning sampling problems from infinitely-many alleles weak limits.

  17. Seroincidence of non-typhoid Salmonella infections: convenience vs. random community-based sampling.

    Science.gov (United States)

    Emborg, H-D; Simonsen, J; Jørgensen, C S; Harritshøj, L H; Krogfelt, K A; Linneberg, A; Mølbak, K

    2016-01-01

    The incidence of reported infections of non-typhoid Salmonella is affected by biases inherent to passive laboratory surveillance, whereas analysis of blood sera may provide a less biased alternative to estimate the force of Salmonella transmission in humans. We developed a mathematical model that enabled a back-calculation of the annual seroincidence of Salmonella based on measurements of specific antibodies. The aim of the present study was to determine the seroincidence in two convenience samples from 2012 (Danish blood donors, n = 500, and pregnant women, n = 637) and a community-based sample of healthy individuals from 2006 to 2007 (n = 1780). The lowest antibody levels were measured in the samples from the community cohort and the highest in pregnant women. The annual Salmonella seroincidences were 319 infections/1000 pregnant women [90% credibility interval (CrI) 210-441], 182/1000 in blood donors (90% CrI 85-298) and 77/1000 in the community cohort (90% CrI 45-114). Although the differences between study populations decreased when accounting for different age distributions the estimates depend on the study population. It is important to be aware of this issue and define a certain population under surveillance in order to obtain consistent results in an application of serological measures for public health purposes.

  18. Self-reference and random sampling approach for label-free identification of DNA composition using plasmonic nanomaterials.

    Science.gov (United States)

    Freeman, Lindsay M; Pang, Lin; Fainman, Yeshaiahu

    2018-05-09

    The analysis of DNA has led to revolutionary advancements in the fields of medical diagnostics, genomics, prenatal screening, and forensic science, with the global DNA testing market expected to reach revenues of USD 10.04 billion per year by 2020. However, the current methods for DNA analysis remain dependent on fluorophores or conjugated proteins, leading to high costs associated with consumable materials and manual labor. Here, we demonstrate a potential label-free DNA composition detection method using surface-enhanced Raman spectroscopy (SERS), in which we identify the composition of cytosine and adenine within single strands of DNA. This approach depends on the fact that there is one phosphate backbone per nucleotide, which we use as a reference to compensate for systematic measurement variations. We utilize plasmonic nanomaterials with random Raman sampling to perform label-free detection of the nucleotide composition within DNA strands, generating a calibration curve from standard samples of DNA and demonstrating the capability of resolving the nucleotide composition. The work represents an innovative way to detect the DNA composition within DNA strands without the necessity of attached labels, offering a highly sensitive and reproducible method that factors in random sampling to minimize error.
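
    The self-reference idea, one phosphate backbone per nucleotide as an internal standard, can be sketched with simulated intensities: dividing a nucleotide peak by the backbone peak cancels the random, site-to-site SERS enhancement, leaving a clean calibration line. All numbers below are synthetic:

        import numpy as np

        rng = np.random.default_rng(5)

        # Hypothetical SERS peak intensities from randomly sampled hot spots;
        # one row per spectrum: cytosine peak and phosphate-backbone peak.
        known_fraction = np.repeat([0.2, 0.4, 0.6, 0.8], 25)   # calibration strands
        gain = rng.lognormal(0, 0.5, known_fraction.size)      # random enhancement
        cytosine = gain * known_fraction * (1 + rng.normal(0, 0.05, gain.size))
        phosphate = gain * 1.0                                 # one backbone per base

        ratio = cytosine / phosphate        # self-reference cancels the random gain
        slope, intercept = np.polyfit(known_fraction, ratio, 1)
        print(f"calibration: ratio = {slope:.3f} * fraction + {intercept:.3f}")

        # Estimate the composition of an unknown strand from its measured ratio.
        unknown_ratio = 0.55
        print("estimated cytosine fraction:", (unknown_ratio - intercept) / slope)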

  19. The quality of the reported sample size calculations in randomized controlled trials indexed in PubMed.

    Science.gov (United States)

    Lee, Paul H; Tse, Andy C Y

    2017-05-01

    There are limited data on the quality of reporting of information essential for replication of sample size calculations, as well as on the accuracy of those calculations. We examined the current quality of reporting of the sample size calculation in randomized controlled trials (RCTs) published in PubMed, and examined the variation in reporting across study design, study characteristics, and journal impact factor. We also reviewed the targeted sample sizes reported in trial registries. We reviewed and analyzed all RCTs published in December 2014 in journals indexed in PubMed. The 2014 Impact Factors for the journals were used as proxies for their quality. Of the 451 analyzed papers, 58.1% reported an a priori sample size calculation. Nearly all papers provided the level of significance (97.7%) and desired power (96.6%), and most of the papers reported the minimum clinically important effect size (73.3%). The median percentage difference between the reported and recalculated sample sizes was 0.0% (IQR -4.6% to 3.0%). The accuracy of the reported sample size was better for studies published in journals that endorsed the CONSORT statement and journals with an impact factor. A total of 98 papers provided a targeted sample size in trial registries, and about two-thirds of these papers (n=62) reported a sample size calculation, but only 25 (40.3%) showed no discrepancy with the number reported in the trial registries. The reporting of sample size calculations in RCTs published in PubMed-indexed journals and trial registries was poor. The CONSORT statement should be more widely endorsed. Copyright © 2016 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.

  20. Active Learning Not Associated with Student Learning in a Random Sample of College Biology Courses

    Science.gov (United States)

    Andrews, T. M.; Leonard, M. J.; Colgrove, C. A.; Kalinowski, S. T.

    2011-01-01

    Previous research has suggested that adding active learning to traditional college science lectures substantially improves student learning. However, this research predominantly studied courses taught by science education researchers, who are likely to have exceptional teaching expertise. The present study investigated introductory biology courses randomly selected from a list of prominent colleges and universities to include instructors representing a broader population. We examined the relationship between active learning and student learning in the subject area of natural selection. We found no association between student learning gains and the use of active-learning instruction. Although active learning has the potential to substantially improve student learning, this research suggests that active learning, as used by typical college biology instructors, is not associated with greater learning gains. We contend that most instructors lack the rich and nuanced understanding of teaching and learning that science education researchers have developed. Therefore, active learning as designed and implemented by typical college biology instructors may superficially resemble active learning used by education researchers, but lacks the constructivist elements necessary for improving learning. PMID:22135373

  1. Analytical methodology for determination of the sulfate in vinasse samples; Metodologia analitica para a determinacao de sulfato em vinhoto

    Energy Technology Data Exchange (ETDEWEB)

    Prada, Silvio Miranda; Guekezian, Marcia; Suarez-Ilha, Maria Encarnacion V. [Sao Paulo Univ., SP (Brazil). Inst. de Quimica

    1998-05-01

    When sulfate is present in high concentrations, it acts as an inhibitor of methane production (biogas formation) in anaerobic biodigestion processes. It is therefore very important to know the sulfate concentration in vinasse samples before designing the biodigester. A previously developed indirect method (Anal. Chim. Acta 1996, 329, 197) was used to determine sulfate in vinasse samples after pre-treatment to eliminate organic matter with a mixture of 30% hydrogen peroxide and concentrated nitric acid (3:1) under heating. Interfering cationic ions were isolated using ion exchange columns. The results obtained for samples from Araraquara and Penapolis are presented here. The phosphate concentration was also determined. (author) 23 refs., 3 tabs.

  2. Vitamin D measurement in the intensive care unit: methodology, clinical relevance and interpretation of a random value.

    Science.gov (United States)

    Krishnan, Anand; Venkatesh, Bala

    2013-08-01

    Vitamin D deficiency, as measured by a random level of 25-hydroxyvitamin D is very prevalent in critically ill patients admitted to the ICU and is associated with adverse outcomes. Both 25(OH)vitamin D and 1α,25(OH)2D3 are difficult to analyse because of their lipophilic nature, affinity for VDBP and small concentrations. Also, the various tests used to estimate vitamin D levels show significant inter- and intra-assay variability, which significantly affect the veracity of the results obtained and confound their interpretation. The two main types of assays include those that directly estimate vitamin D levels (HPLC, LC-MS/MS) and competitive binding assays (RIA, EIA). The former methods require skilled operators, with prolonged assay times and increased cost, whereas the latter are cheaper and easy to perform, but with decreased accuracy. The direct assays are not affected by lipophilic substances in plasma and heterophile antibodies, but may overestimate vitamin D levels by measuring the 3-epimers. These problems can be eliminated by adequate standardization of the test using SRMs provided by NIST, as well as participating in proficiency schemes like DEQAS. It is therefore important to consider the test employed as well as laboratory quality control, while interpreting vitamin D results. A single random measurement may not be reflective of the vitamin D status in ICU patients because of changes with fluid administration, and intra-day variation in 25-hydroxyvitamin D levels. 1α,25(OH)2D3 may behave differently to 25-hydroxyvitamin D, both in plasma and at tissue level, in inflammatory states. Measurement of tissue 1α,25(OH)2D3 levels may provide the true estimate of vitamin D activity.

  3. Adult Congenital Heart Disease-Coping And REsilience (ACHD-CARE): Rationale and methodology of a pilot randomized controlled trial.

    Science.gov (United States)

    Kovacs, Adrienne H; Bandyopadhyay, Mimi; Grace, Sherry L; Kentner, Amanda C; Nolan, Robert P; Silversides, Candice K; Irvine, M Jane

    2015-11-01

    One-third of North American adults with congenital heart disease (CHD) have diagnosable mood or anxiety disorders and most do not receive mental health treatment. There are no published interventions targeting the psychosocial needs of patients with CHD of any age. We describe the development of a group psychosocial intervention aimed at improving the psychosocial functioning, quality of life, and resilience of adults with CHD and the design of a study protocol to determine the feasibility of a potential full-scale randomized controlled trial (RCT). Drawing upon our quantitative and qualitative research, we developed the Adult CHD-Coping And REsilience (ACHD-CARE) intervention and designed a feasibility study that included a 2-parallel arm non-blinded pilot RCT. Eligible participants (CHD, age ≥ 18 years, no planned surgery, symptoms suggestive of a mood and/or anxiety disorder) were randomized to the ACHD-CARE intervention or Usual Care (1:1 allocation ratio). The group intervention was delivered during eight 90-minute weekly sessions. Feasibility will be assessed in the following domains: (i) process (e.g. recruitment and retention), (ii) resources, (iii) management, (iv) scientific outcomes, and (v) intervention acceptability. This study underscores the importance of carefully developing and testing the feasibility of psychosocial interventions in medical populations before moving to full-scale clinical trials. At study conclusion, we will be poised to make one of three determinations for a full-scale RCT: (1) feasible, (2) feasible with modifications, or (3) not feasible. This study will guide the future evaluation and provision of psychosocial treatment for adults with CHD. Copyright © 2015. Published by Elsevier Inc.

  4. Colorization-Based RGB-White Color Interpolation using Color Filter Array with Randomly Sampled Pattern.

    Science.gov (United States)

    Oh, Paul; Lee, Sukho; Kang, Moon Gi

    2017-06-28

    Recently, several RGB-White (RGBW) color filter arrays (CFAs) have been proposed, which have extra white (W) pixels in the filter array that are highly sensitive. Due to this high sensitivity, the W pixels have better SNR (signal-to-noise ratio) characteristics than the other color pixels in the filter array, especially in low-light conditions. However, most RGBW CFAs are designed so that the acquired RGBW pattern image can be converted into the conventional Bayer pattern image, which is then converted into the final color image by conventional demosaicing methods, i.e., color interpolation techniques. In this paper, we propose a new RGBW color filter array based on a totally different color interpolation technique, the colorization algorithm. The colorization algorithm was initially proposed for colorizing a gray image into a color image using a small number of color seeds. Here, we adopt this algorithm as a color interpolation technique, so that the RGBW color filter array can be designed with a very large number of W pixels to make the most of the highly sensitive characteristics of the W channel. The resulting RGBW color filter array has a pattern with a large proportion of W pixels, while the small number of RGB pixels are randomly distributed over the array. The colorization algorithm makes it possible to reconstruct the colors from such a small number of RGB values. Due to the large proportion of W pixels, the reconstructed color image has a high SNR value, higher in particular than those of conventional CFAs in low-light conditions. Experimental results show that much important information that is not perceived in color images reconstructed with conventional CFAs is perceived in the images reconstructed with the proposed method.
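
    A minimal sketch of the sampling idea (not the authors' actual pattern generator): a mostly-white CFA with a sparse, random scattering of RGB pixels. The 10% RGB share and the seed are arbitrary illustration choices.

        import numpy as np

        rng = np.random.default_rng(0)

        def random_rgbw_cfa(height, width, rgb_fraction=0.1):
            """Mostly-W filter array with randomly placed R/G/B pixels."""
            cfa = np.full((height, width), 'W', dtype='<U1')
            n_rgb = int(height * width * rgb_fraction)
            idx = rng.choice(height * width, size=n_rgb, replace=False)
            cfa.flat[idx] = rng.choice(list('RGB'), size=n_rgb)
            return cfa

        print(random_rgbw_cfa(4, 8))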

  5. Characterization of cotton gin PM10 emissions based on EPA stack sampling methodologies and particle size distributions

    Science.gov (United States)

    A project to characterize cotton gin emissions in terms of stack sampling was conducted during the 2008 through 2011 ginning seasons. The impetus behind the project was the urgent need to collect additional cotton gin emissions data to address current regulatory issues. EPA AP-42 emission factors ar...

  6. Incorporating covariance estimation uncertainty in spatial sampling design for prediction with trans-Gaussian random fields

    Directory of Open Access Journals (Sweden)

    Gunter eSpöck

    2015-05-01

    Full Text Available Recently, Spock and Pilz [38] demonstrated that the spatial sampling design problem for the Bayesian linear kriging predictor can be transformed to an equivalent experimental design problem for a linear regression model with stochastic regression coefficients and uncorrelated errors. The stochastic regression coefficients derive from the polar spectral approximation of the residual process. Thus, standard optimal convex experimental design theory can be used to calculate optimal spatial sampling designs. The design functionals considered in Spock and Pilz [38] did not take into account the fact that kriging is actually a plug-in predictor which uses the estimated covariance function. The resulting optimal designs were close to space-filling configurations, because the design criterion did not consider the uncertainty of the covariance function. In this paper we also assume that the covariance function is estimated, e.g., by restricted maximum likelihood (REML). We then develop a design criterion that fully takes account of the covariance uncertainty. The resulting designs are less regular and space-filling compared to those ignoring covariance uncertainty. The new designs, however, also require some closely spaced samples in order to improve the estimate of the covariance function. We also relax the assumption of Gaussian observations and assume that the data is transformed to Gaussianity by means of the Box-Cox transformation. The resulting prediction method is known as trans-Gaussian kriging. We apply the Smith and Zhu [37] approach to this kriging method and show that resulting optimal designs also depend on the available data. We illustrate our results with a data set of monthly rainfall measurements from Upper Austria.
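
    The Box-Cox step mentioned above is easy to reproduce. A small sketch with synthetic right-skewed "rainfall" follows, using SciPy's maximum-likelihood estimate of the transformation parameter; the gamma distribution and its parameters stand in for the paper's Upper Austria data.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        rainfall = rng.gamma(shape=2.0, scale=30.0, size=500)  # synthetic, skewed

        transformed, lam = stats.boxcox(rainfall)  # ML estimate of Box-Cox lambda
        print(f"lambda = {lam:.3f}")
        print("skewness before/after:",
              round(float(stats.skew(rainfall)), 2),
              round(float(stats.skew(transformed)), 2))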

  7. Development of standardized methodology for identifying toxins in clinical samples and fish species associated with tetrodotoxin-borne poisoning incidents

    Directory of Open Access Journals (Sweden)

    Tai-Yuan Chen

    2016-01-01

    Full Text Available Tetrodotoxin (TTX is a naturally occurring toxin in food, especially in puffer fish. TTX poisoning is observed frequently in South East Asian regions. In TTX-derived food poisoning outbreaks, the amount of TTX recovered from suspicious fish samples or leftovers, and residual levels from biological fluids of victims are typically trace. However, liquid chromatography–mass spectrometry and liquid chromatography–tandem mass spectrometry methods have been demonstrated to qualitatively and quantitatively determine TTX in clinical samples from victims. Identification and validation of the TTX-originating seafood species responsible for a food poisoning incident is needed. A polymerase chain reaction-based method on mitochondrial DNA analysis is useful for identification of fish species. This review aims to collect pertinent information available on TTX-borne food poisoning incidents with a special emphasis on the analytical methods employed for TTX detection in clinical laboratories as well as for the identification of TTX-bearing species.

  8. Building research capacity in Botswana: a randomized trial comparing training methodologies in the Botswana ethics training initiative

    Science.gov (United States)

    2013-01-01

    Background Little empirical data are available on the extent to which capacity-building programs in research ethics prepare trainees to apply ethical reasoning skills to the design, conduct, or review of research. A randomized controlled trial was conducted in Botswana in 2010 to assess the effectiveness of a case-based intervention using email to augment in-person seminars. Methods University faculty and current and prospective IRB/REC members took part in a semester-long training program in research ethics. Participants attended two 2-day seminars and were assigned at random to one of two on-line arms of the trial. Participants in both arms completed on-line international modules from the Collaborative Institutional Training Initiative. Between seminars, intervention-arm participants were also emailed a weekly case to analyze in response to set questions; responses and individualized faculty feedback were exchanged via email. Tests assessing ethics knowledge were administered at the start of each seminar. The post-test included an additional section in which participants were asked to identify the ethical issues highlighted in five case studies from a list of multiple-choice responses. Results were analyzed using regression and ANOVA. Results Of the 71 participants (36 control, 35 intervention) enrolled at the first seminar, 41 (57.7%) attended the second seminar (19 control, 22 intervention). In the intervention arm, 19 (54.3%) participants fully completed and 8 (22.9%) partially completed all six weekly cases. The mean score was higher on the post-test (30.3/40) than on the pre-test (28.0/40), and individual post- and pre-test scores were highly correlated (r = 0.65, p < 0.001); post-test scores did not differ significantly by study arm (p > 0.84), but intervention-arm subjects who completed all assigned cases answered an average of 3.2 more questions correctly on the post-test than others, controlling for pre-test scores (p = 0.003). Conclusions Completion of the case-based intervention improved respondents’ test

  9. Building research capacity in Botswana: a randomized trial comparing training methodologies in the Botswana ethics training initiative.

    Science.gov (United States)

    Barchi, Francis H; Kasimatis-Singleton, Megan; Kasule, Mary; Khulumani, Pilate; Merz, Jon F

    2013-02-01

    Little empirical data are available on the extent to which capacity-building programs in research ethics prepare trainees to apply ethical reasoning skills to the design, conduct, or review of research. A randomized controlled trial was conducted in Botswana in 2010 to assess the effectiveness of a case-based intervention using email to augment in-person seminars. University faculty and current and prospective IRB/REC members took part in a semester-long training program in research ethics. Participants attended two 2-day seminars and were assigned at random to one of two on-line arms of the trial. Participants in both arms completed on-line international modules from the Collaborative Institutional Training Initiative. Between seminars, intervention-arm participants were also emailed a weekly case to analyze in response to set questions; responses and individualized faculty feedback were exchanged via email. Tests assessing ethics knowledge were administered at the start of each seminar. The post-test included an additional section in which participants were asked to identify the ethical issues highlighted in five case studies from a list of multiple-choice responses. Results were analyzed using regression and ANOVA. Of the 71 participants (36 control, 35 intervention) enrolled at the first seminar, 41 (57.7%) attended the second seminar (19 control, 22 intervention). In the intervention arm, 19 (54.3%) participants fully completed and 8 (22.9%) partially completed all six weekly cases. The mean score was higher on the post-test (30.3/40) than on the pre-test (28.0/40), and individual post- and pre-test scores were highly correlated (r = 0.65, p < 0.001); post-test scores did not differ significantly by study arm (p > 0.84), but intervention-arm subjects who completed all assigned cases answered an average of 3.2 more questions correctly on the post-test than others, controlling for pre-test scores (p = 0.003). Completion of the case-based intervention improved respondents' test scores, with those who completed all six

  10. Teachers' Attitude towards Implementation of Learner-Centered Methodology in Science Education in Kenya

    Science.gov (United States)

    Ndirangu, Caroline

    2017-01-01

    This study aims to evaluate teachers' attitude towards implementation of learner-centered methodology in science education in Kenya. The study used a survey design methodology, adopting the purposive, stratified random and simple random sampling procedures and hypothesised that there was no significant relationship between the head teachers'…

  11. Global Stratigraphy of Venus: Analysis of a Random Sample of Thirty-Six Test Areas

    Science.gov (United States)

    Basilevsky, Alexander T.; Head, James W., III

    1995-01-01

    The age relations between 36 impact craters with dark paraboloids and other geologic units and structures at these localities have been studied through photogeologic analysis of Magellan SAR images of the surface of Venus. Geologic settings in all 36 sites, about 1000 x 1000 km each, could be characterized using only 10 different terrain units and six types of structures. These units and structures form a major stratigraphic and geologic sequence (from oldest to youngest): (1) tessera terrain; (2) densely fractured terrains associated with coronae and in the form of remnants among plains; (3) fractured and ridged plains and ridge belts; (4) plains with wrinkle ridges; (5) ridges associated with coronae annulae and ridges of arachnoid annulae which are contemporary with wrinkle ridges of the ridged plains; (6) smooth and lobate plains; (7) fractures of coronae annulae, and fractures not related to coronae annulae, which disrupt ridged and smooth plains; (8) rift-associated fractures; and (9) craters with associated dark paraboloids, which represent the youngest 10% of the Venus impact crater population (Campbell et al.), and are on top of all volcanic and tectonic units except the youngest episodes of rift-associated fracturing and volcanism; surficial streaks and patches are approximately contemporary with dark-paraboloid craters. Mapping of such units and structures in 36 randomly distributed large regions (each approximately 10^6 sq km) shows evidence for a distinctive regional and global stratigraphic and geologic sequence. On the basis of this sequence we have developed a model that illustrates several major themes in the history of Venus. Most of the history of Venus (that of its first 80% or so) is not preserved in the surface geomorphological record. The major deformation associated with tessera formation in the period sometime between 0.5-1.0 b.y. ago (Ivanov and Basilevsky) is the earliest event detected. In the terminal stages of tessera fon

  12. Protocol adherence for continuously titrated interventions in randomized trials: an overview of the current methodology and case study

    Directory of Open Access Journals (Sweden)

    F. Lauzier

    2017-07-01

    Full Text Available Abstract Background The standard definition for protocol adherence is the proportion of all scheduled doses that are delivered. In clinical research, this definition has several limitations when evaluating protocol adherence in trials that study interventions requiring continuous titration. Discussion Building upon a specific case study, we analyzed a recent trial of a continuously titrated intervention to assess the impact of different definitions of protocol deviations on the interpretation of protocol adherence. The OVATION pilot trial was an open-label randomized controlled trial of higher (75–80 mmHg) versus lower (60–65 mmHg) mean arterial pressure (MAP) targets for vasopressor therapy in shock. In this trial, potential protocol deviations were defined as MAP values outside the targeted range for >4 consecutive hours during vasopressor therapy without synchronous and consistent adjustments of vasopressor doses. An adjudication committee reviewed each potential deviation to determine if it was clinically-justified or not. There are four reasons for this contextual measurement and reporting of protocol adherence. First, between-arm separation is a robust measure of adherence to complex protocols. Second, adherence assessed by protocol deviations varies as a function of the definition of deviations and the frequency of measurements. Third, distinguishing clinically-justified vs. not clinically-justified protocol deviations acknowledges clinically sensible bedside decision-making and offers a clear terminology before the trial begins. Finally, multiple metrics exist to report protocol deviations, which provides different but complementary information on protocol adherence. Conclusions In trials of interventions requiring continuous titration, metrics used for defining protocol deviations have a considerable impact on the interpretation of protocol adherence. Definitions for protocol deviations should be prespecified and correlated with between-arm separation, if it can be measured.

  13. Protocol adherence for continuously titrated interventions in randomized trials: an overview of the current methodology and case study.

    Science.gov (United States)

    Lauzier, F; Adhikari, N K; Seely, A; Koo, K K Y; Belley-Côté, E P; Burns, K E A; Cook, D J; D'Aragon, F; Rochwerg, B; Kho, M E; Oczkowksi, S J W; Duan, E H; Meade, M O; Day, A G; Lamontagne, F

    2017-07-17

    The standard definition for protocol adherence is the proportion of all scheduled doses that are delivered. In clinical research, this definition has several limitations when evaluating protocol adherence in trials that study interventions requiring continuous titration. Building upon a specific case study, we analyzed a recent trial of a continuously titrated intervention to assess the impact of different definitions of protocol deviations on the interpretation of protocol adherence. The OVATION pilot trial was an open-label randomized controlled trial of higher (75-80 mmHg) versus lower (60-65 mmHg) mean arterial pressure (MAP) targets for vasopressor therapy in shock. In this trial, potential protocol deviations were defined as MAP values outside the targeted range for >4 consecutive hours during vasopressor therapy without synchronous and consistent adjustments of vasopressor doses. An adjudication committee reviewed each potential deviation to determine if it was clinically-justified or not. There are four reasons for this contextual measurement and reporting of protocol adherence. First, between-arm separation is a robust measure of adherence to complex protocols. Second, adherence assessed by protocol deviations varies as a function of the definition of deviations and the frequency of measurements. Third, distinguishing clinically-justified vs. not clinically-justified protocol deviations acknowledges clinically sensible bedside decision-making and offers a clear terminology before the trial begins. Finally, multiple metrics exist to report protocol deviations, which provides different but complementary information on protocol adherence. In trials of interventions requiring continuous titration, metrics used for defining protocol deviations have a considerable impact on the interpretation of protocol adherence. Definitions for protocol deviations should be prespecified and correlated with between-arm separation, if it can be measured.
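
    The deviation definition used in OVATION lends itself to a compact operationalization. The sketch below flags runs of more than four consecutive hourly MAP readings outside the target band with no change in vasopressor dose; the column names and hourly sampling are hypothetical, and the trial's adjudication step is not modeled.

        import pandas as pd

        def potential_deviations(df, lo=60, hi=65, max_hours=4):
            """Flag runs of more than `max_hours` consecutive hourly MAP readings
            outside [lo, hi] mmHg during which the dose never changed.
            `df` has one row per hour with columns 'map' and 'dose'."""
            out = (df['map'] < lo) | (df['map'] > hi)
            run_id = (out != out.shift()).cumsum()  # label consecutive runs
            flagged = []
            for _, run in df[out].groupby(run_id[out]):
                if len(run) > max_hours and run['dose'].nunique() == 1:
                    flagged.append((run.index[0], run.index[-1]))
            return flagged

        hours = pd.DataFrame({'map': [63, 58, 57, 59, 58, 57, 64],
                              'dose': [5, 5, 5, 5, 5, 5, 5]})
        print(potential_deviations(hours))  # [(1, 5)]: 5 h low, no titration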

  14. Active learning for clinical text classification: is it better than random sampling?

    Science.gov (United States)

    Figueroa, Rosa L; Ngo, Long H; Goryachev, Sergey; Wiechmann, Eduardo P

    2012-01-01

    Objective This study explores active learning algorithms as a way to reduce the requirements for large training sets in medical text classification tasks. Design Three existing active learning algorithms (distance-based (DIST), diversity-based (DIV), and a combination of both (CMB)) were used to classify text from five datasets. The performance of these algorithms was compared to that of passive learning on the five datasets. We then conducted a novel investigation of the interaction between dataset characteristics and the performance results. Measurements Classification accuracy and area under receiver operating characteristics (ROC) curves for each algorithm at different sample sizes were generated. The performance of active learning algorithms was compared with that of passive learning using a weighted mean of paired differences. To determine why the performance varies on different datasets, we measured the diversity and uncertainty of each dataset using relative entropy and correlated the results with the performance differences. Results The DIST and CMB algorithms performed better than passive learning. With a statistical significance level set at 0.05, DIST outperformed passive learning in all five datasets, while CMB was found to be better than passive learning in four datasets. We found strong correlations between the dataset diversity and the DIV performance, as well as the dataset uncertainty and the performance of the DIST algorithm. Conclusion For medical text classification, appropriate active learning algorithms can yield performance comparable to that of passive learning with considerably smaller training sets. In particular, our results suggest that DIV performs better on data with higher diversity and DIST on data with lower uncertainty. PMID:22707743
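
    A generic sketch of the two selection heuristics named above, i.e. a plain reading of distance- and diversity-based querying on vectorized documents, not the authors' exact implementations; the toy arrays stand in for document feature vectors.

        import numpy as np
        from sklearn.svm import LinearSVC

        rng = np.random.default_rng(0)
        X_lab = rng.normal(size=(40, 20))        # labeled seed set (toy vectors)
        y_lab = rng.integers(0, 2, size=40)
        X_pool = rng.normal(size=(500, 20))      # unlabeled pool

        clf = LinearSVC().fit(X_lab, y_lab)

        def dist_query(clf, X_pool, k=10):
            """DIST-style pick: k pool samples closest to the decision boundary."""
            return np.argsort(np.abs(clf.decision_function(X_pool)))[:k]

        def div_query(X_pool, X_labeled, k=10):
            """DIV-style pick: k pool samples farthest, on average, from the
            already labeled set."""
            d = ((X_pool[:, None, :] - X_labeled[None, :, :]) ** 2).sum(-1).mean(1)
            return np.argsort(d)[-k:]

        print(dist_query(clf, X_pool, 5), div_query(X_pool, X_lab, 5))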

  15. Development of a methodology for low-energy X-ray absorption correction in biological samples using radiation scattering techniques

    International Nuclear Information System (INIS)

    Pereira, Marcelo O.; Anjos, Marcelino J.; Lopes, Ricardo T.

    2009-01-01

    Non-destructive X-ray techniques, such as tomography, radiography and X-ray fluorescence, are sensitive to the attenuation coefficient and have a large field of applications in the medical as well as the industrial area. In the case of X-ray fluorescence analysis, knowledge of the photon X-ray attenuation coefficients provides important information for obtaining the elemental concentration. On the other hand, mass attenuation coefficient values are usually determined by transmission methods, and the use of X-ray scattering can be considered as an alternative to them. This work proposes a new method for obtaining the X-ray absorption curve through the superposition of the Rayleigh and Compton scattering peaks of the Lα and Lβ lines of tungsten (the L lines of an X-ray tube with a W anode). The absorption curve was obtained using standard samples with effective atomic numbers in the range from 6 to 16. The method was applied to certified samples of bovine liver (NIST 1577B), milk powder and V-10. The experimental measurements were obtained using the portable EDXRF system of the Nuclear Instrumentation Laboratory (LIN-COPPE/UFRJ) with a tungsten (W) anode. (author)

  16. Novel approach to systematic random sampling in population surveys: Lessons from the United Arab Emirates National Diabetes Study (UAEDIAB).

    Science.gov (United States)

    Sulaiman, Nabil; Albadawi, Salah; Abusnana, Salah; Fikri, Mahmoud; Madani, Abdulrazzag; Mairghani, Maisoon; Alawadi, Fatheya; Zimmet, Paul; Shaw, Jonathan

    2015-09-01

    The prevalence of diabetes has risen rapidly in the Middle East, particularly in the Gulf Region. However, some prevalence estimates have not fully accounted for large migrant worker populations and have focused on minority indigenous populations. The objectives of the UAE National Diabetes and Lifestyle Study are to: (i) define the prevalence of, and risk factors for, T2DM; (ii) describe the distribution and determinants of T2DM risk factors; (iii) study health knowledge and attitudes; (iv) identify gene-environment interactions; and (v) develop baseline data for evaluation of future intervention programs. Given the high burden of diabetes in the region and the absence of accurate data on non-UAE nationals in the UAE, a representative sample of the non-UAE nationals was essential. We used an innovative methodology in which non-UAE nationals were sampled when attending the mandatory biannual health check that is required for visa renewal. Such an approach could also be used in other countries in the region. Complete data were available for 2719 eligible non-UAE nationals (25.9% Arabs, 70.7% Asian non-Arabs, 1.1% African non-Arabs, and 2.3% Westerners). Most were men < 65 years of age. The response rate was 68%, and the non-response was greater among women than men; 26.9% earned less than UAE Dirham (AED) 24 000 (US$6500) and the most common areas of employment were as managers or professionals, in service and sales, and unskilled occupations. Most (37.4%) had completed high school and 4.1% had a postgraduate degree. This novel methodology could provide insights for epidemiological studies in the UAE and other Gulf States, particularly for expatriates. © 2015 Ruijin Hospital, Shanghai Jiaotong University School of Medicine and Wiley Publishing Asia Pty Ltd.

  17. Thermal Protection for Mars Sample Return Earth Entry Vehicle: A Grand Challenge for Design Methodology and Reliability Verification

    Science.gov (United States)

    Venkatapathy, Ethiraj; Gage, Peter; Wright, Michael J.

    2017-01-01

    Mars Sample Return is our Grand Challenge for the coming decade. TPS (Thermal Protection System) nominal performance is not the key challenge. The main difficulty for designers is the need to verify unprecedented reliability for the entry system: current guidelines for prevention of backward contamination require that the probability of spores larger than 1 micron diameter escaping into the Earth environment be lower than 1 in a million for the entire system, and the allocation to TPS would be more stringent than that. For reference, the reliability allocation for Orion TPS is closer to 1 in 1000, and the demonstrated reliability for previous human Earth return systems was closer to 1 in 100. Improving reliability by more than 3 orders of magnitude is a grand challenge indeed. The TPS community must embrace the possibility of new architectures that are focused on reliability above thermal performance and mass efficiency. MSR (Mars Sample Return) EEV (Earth Entry Vehicle) will be hit with MMOD (Micrometeoroid and Orbital Debris) prior to reentry. A chute-less aero-shell design which allows for a self-righting shape was baselined in prior MSR studies, with the assumption that a passive system will maximize EEV robustness. Hence the aero-shell along with the TPS has to take ground impact and not break apart. System verification will require testing to establish ablative performance and thermal failure, but also testing of damage from MMOD and structural performance at ground impact. Mission requirements will demand analysis, testing and verification that are focused on establishing reliability of the design. In this proposed talk, we will focus on the grand challenge of MSR EEV TPS and the need for innovative approaches to address challenges in modeling, testing, manufacturing and verification.

  18. Intraarticular Facet Injections for Low Back Pain: Design Considerations, Consensus Methodology to Develop the Protocol for a Randomized Controlled Trial.

    Science.gov (United States)

    Mars, Tom; Ellard, David R; Antrobus, James H L; Cairns, Melinda; Underwood, Martin; Haywood, Kirstie; Keohane, Susie; Sandhu, Harbinder; Griffiths, Frances

    2015-01-01

    Since the publication of the guidelines by the UK National Institute for Health and Care Excellence (NICE) and the American Pain Society for low back pain in 2009, there have been deep divisions in the pain treatment community about the use of therapeutic intraarticular facet joint injections. While evidence for the effectiveness or otherwise of intraarticular facet joint injections remains sparse, uncertainty will remain. The Warwick feasibility study, along with a concurrent study with a different design led by another group, aims to provide a stable platform from which the effectiveness and cost-effectiveness of intraarticular facet joint injections added to normal care could be evaluated in randomized controlled trials (RCTs). To reach consensus on key design considerations for the Warwick facet feasibility study from which the study protocol and working manuals will be developed. A consensus conference involving expert professionals and lay members. Preliminary work identified 5 key design considerations for deliberation at our consensus conference. Three concerned patient assessment and treatment: diagnosis of possible facet joint pain, intraarticular facet joint injection technique, and best usual care. Two concerned trial analysis: a priori sub-groups and minimally important difference and are reported elsewhere. We did systematic evidence reviews of the design considerations and summarized the evidence. Our design questions and evidence summaries were distributed to all delegates. This formed the basis for discussions on the day. Clinical experts in all aspects of facet joint injection from across the UK along with lay people were invited via relevant organizations. Nominal group technique was used in 15 facilitated initial small group discussions. Further discussion and ranking was undertaken in plenary. All small group and plenary results were recorded and checked and verified post conference. Where necessary participants were contacted via email to

  19. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge.

    Science.gov (United States)

    Catarino, Rosa; Vassilakos, Pierre; Bilancioni, Aline; Vanden Eynde, Mathieu; Meyer-Hamme, Ulrike; Menoud, Pierre-Alain; Guerry, Frédéric; Petignat, Patrick

    2015-01-01

    Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium. Dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and swab. A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed first: self-HPV using dry swabs (s-DRY) or vaginal specimen collection using a cytobrush applied to an FTA cartridge (s-FTA). After self-HPV, a physician collected a cervical sample using liquid-based medium (Dr-WET). HPV types were identified by real-time PCR. Agreement between collection methods was measured using the kappa statistic. HPV prevalence for high-risk types was 62.3% (95%CI: 53.7-70.2) detected by s-DRY, 56.2% (95%CI: 47.6-64.4) by Dr-WET, and 54.6% (95%CI: 46.1-62.9) by s-FTA. There was overall agreement of 70.8% between s-FTA and s-DRY samples (kappa = 0.34), and of 82.3% between self-HPV and Dr-WET samples (kappa = 0.56). Detection sensitivities for low-grade squamous intraepithelial lesion or worse (LSIL+) were: 64.0% (95%CI: 44.5-79.8) for s-FTA, 84.6% (95%CI: 66.5-93.9) for s-DRY, and 76.9% (95%CI: 58.0-89.0) for Dr-WET. The preferred self-collection method among patients was s-DRY (40.8% vs. 15.4%). Regarding costs, the FTA card was five times more expensive than the swab (~5 US dollars (USD) per card vs. ~1 USD per swab). Self-HPV using dry swabs is sensitive for detecting LSIL+ and less expensive than s-FTA. International Standard Randomized Controlled Trial Number (ISRCTN): 43310942.
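
    The kappa agreement statistic used in this record is a one-liner to reproduce. The sketch below uses hypothetical paired positive/negative calls, not the study's data.

        import numpy as np
        from sklearn.metrics import cohen_kappa_score

        # Hypothetical paired high-risk HPV calls (1 = positive) for the same women
        s_dry = np.array([1, 1, 0, 1, 0, 1, 0, 0, 1, 1])
        s_fta = np.array([1, 0, 0, 1, 0, 1, 1, 0, 1, 0])

        print("raw agreement:", (s_dry == s_fta).mean())          # 0.7
        print("kappa:", round(cohen_kappa_score(s_dry, s_fta), 2))  # 0.4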

  20. Randomized Comparison of Two Vaginal Self-Sampling Methods for Human Papillomavirus Detection: Dry Swab versus FTA Cartridge.

    Directory of Open Access Journals (Sweden)

    Rosa Catarino

    Full Text Available Human papillomavirus (HPV) self-sampling (self-HPV) is valuable in cervical cancer screening. HPV testing is usually performed on physician-collected cervical smears stored in liquid-based medium. Dry filters and swabs are an alternative. We evaluated the adequacy of self-HPV using two dry storage and transport devices, the FTA cartridge and swab. A total of 130 women performed two consecutive self-HPV samples. Randomization determined which of the two tests was performed first: self-HPV using dry swabs (s-DRY) or vaginal specimen collection using a cytobrush applied to an FTA cartridge (s-FTA). After self-HPV, a physician collected a cervical sample using liquid-based medium (Dr-WET). HPV types were identified by real-time PCR. Agreement between collection methods was measured using the kappa statistic. HPV prevalence for high-risk types was 62.3% (95%CI: 53.7-70.2) detected by s-DRY, 56.2% (95%CI: 47.6-64.4) by Dr-WET, and 54.6% (95%CI: 46.1-62.9) by s-FTA. There was overall agreement of 70.8% between s-FTA and s-DRY samples (kappa = 0.34), and of 82.3% between self-HPV and Dr-WET samples (kappa = 0.56). Detection sensitivities for low-grade squamous intraepithelial lesion or worse (LSIL+) were: 64.0% (95%CI: 44.5-79.8) for s-FTA, 84.6% (95%CI: 66.5-93.9) for s-DRY, and 76.9% (95%CI: 58.0-89.0) for Dr-WET. The preferred self-collection method among patients was s-DRY (40.8% vs. 15.4%). Regarding costs, the FTA card was five times more expensive than the swab (~5 USD per card vs. ~1 USD per swab). Self-HPV using dry swabs is sensitive for detecting LSIL+ and less expensive than s-FTA. International Standard Randomized Controlled Trial Number (ISRCTN): 43310942.

  1. A novel approach to non-biased systematic random sampling: a stereologic estimate of Purkinje cells in the human cerebellum.

    Science.gov (United States)

    Agashiwala, Rajiv M; Louis, Elan D; Hof, Patrick R; Perl, Daniel P

    2008-10-21

    Non-biased systematic sampling using the principles of stereology provides accurate quantitative estimates of objects within neuroanatomic structures. However, the basic principles of stereology are not optimally suited for counting objects that selectively exist within a limited but complex and convoluted portion of the sample, such as occurs when counting cerebellar Purkinje cells. In an effort to quantify Purkinje cells in association with certain neurodegenerative disorders, we developed a new method for stereologic sampling of the cerebellar cortex, involving calculating the volume of the cerebellar tissues, identifying and isolating the Purkinje cell layer and using this information to extrapolate non-biased systematic sampling data to estimate the total number of Purkinje cells in the tissues. Using this approach, we counted Purkinje cells in the right cerebella of four human male control specimens, aged 41, 67, 70 and 84 years, and estimated the total Purkinje cell number for the four entire cerebella to be 27.03, 19.74, 20.44 and 22.03 million cells, respectively. The precision of the method is seen when comparing the density of the cells within the tissue: 266,274, 173,166, 167,603 and 183,575 cells/cm3, respectively. Prior literature documents Purkinje cell counts ranging from 14.8 to 30.5 million cells. These data demonstrate the accuracy of our approach. Our novel approach, which offers an improvement over previous methodologies, is of value for quantitative work of this nature. This approach could be applied to morphometric studies of other similarly complex tissues as well.
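
    The extrapolation logic is simple once the sampling fractions are fixed. The sketch below uses made-up illustrative numbers, not the paper's measurements.

        # Fractionator-style extrapolation: with known sampling fractions, the
        # total count is the raw count divided by the product of the fractions.
        cells_counted = 8000            # Purkinje cells counted in the probes
        section_fraction = 1 / 50       # every 50th section sampled
        area_fraction = 0.02            # fraction of the layer area probed

        total = cells_counted / (section_fraction * area_fraction)  # 20 million
        density = total / 110.0   # cells/cm^3 if cerebellar volume is 110 cm^3
        print(f"{total:.3g} cells, {density:.3g} cells/cm^3")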

  2. Optimal Subset Selection of Time-Series MODIS Images and Sample Data Transfer with Random Forests for Supervised Classification Modelling.

    Science.gov (United States)

    Zhou, Fuqun; Zhang, Aining

    2016-10-25

    Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets including 8-day composites from NASA, and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to efficiently use these time-series MODIS datasets for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2-3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle the two important issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of Random Forests' features: variable importance, outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about a half of the variables we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data.
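
    A compact sketch of the variable-importance screening step, using scikit-learn and toy arrays; the CCRS composites and training polygons are obviously not reproduced here, and the "keep the top half" cutoff mirrors the abstract's finding rather than a rule from the paper.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        # Toy stand-ins: rows = pixels, columns = 36 time-series variables
        rng = np.random.default_rng(1)
        X = rng.normal(size=(500, 36))
        y = rng.integers(0, 4, size=500)  # four land-cover classes

        rf = RandomForestClassifier(n_estimators=300, oob_score=True,
                                    random_state=0)
        rf.fit(X, y)

        # Keep the top half of the variables by importance ("optimal subset")
        keep = np.argsort(rf.feature_importances_)[::-1][:18]
        rf_sub = RandomForestClassifier(n_estimators=300, oob_score=True,
                                        random_state=0).fit(X[:, keep], y)
        print(rf.oob_score_, rf_sub.oob_score_)  # compare out-of-bag accuracies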

  3. Media Use and Source Trust among Muslims in Seven Countries: Results of a Large Random Sample Survey

    Directory of Open Access Journals (Sweden)

    Steven R. Corman

    2013-12-01

    Full Text Available Despite the perceived importance of media in the spread of and resistance against Islamist extremism, little is known about how Muslims use different kinds of media to get information about religious issues, and what sources they trust when doing so. This paper reports the results of a large, random sample survey among Muslims in seven countries in Southeast Asia, West Africa and Western Europe, which helps fill this gap. Results show a diverse set of profiles of media use and source trust that differ by country, with overall low trust in mediated sources of information. Based on these findings, we conclude that mass media is still the most common source of religious information for Muslims, but that trust in mediated information is low overall. This suggests that media are probably best used to persuade opinion leaders, who will then carry anti-extremist messages through more personal means.

  4. The relationship between blood viscosity and blood pressure in a random sample of the population aged 55 to 74 years.

    Science.gov (United States)

    Fowkes, F G; Lowe, G D; Rumley, A; Lennie, S E; Smith, F B; Donnan, P T

    1993-05-01

    Blood viscosity is elevated in hypertensive subjects, but the association of viscosity with arterial blood pressure in the general population, and the influence of social, lifestyle and disease characteristics on this association, are not established. In the Edinburgh Artery Study, 1592 men and women aged 55-74 years selected randomly from the general population attended a university clinic. A fasting blood sample was taken for the measurement of blood viscosity and its major determinants (haematocrit, plasma viscosity and fibrinogen). Systolic pressure was related univariately to blood viscosity and plasma viscosity, as was diastolic pressure. The univariate association between blood viscosity and systolic pressure was confined to males. Blood viscosity was associated equally with systolic and diastolic pressures in males, and remained independently related on multivariate analysis adjusting for age, sex, body mass index, social class, smoking, alcohol intake, exercise, angina, HDL and non-HDL cholesterol, diabetes mellitus, plasma viscosity, fibrinogen, and haematocrit.

  5. Application of bias factor method using random sampling technique for prediction accuracy improvement of critical eigenvalue of BWR

    International Nuclear Information System (INIS)

    Ito, Motohiro; Endo, Tomohiro; Yamamoto, Akio; Kuroda, Yusuke; Yoshii, Takashi

    2017-01-01

    The bias factor method based on the random sampling technique is applied to the benchmark problem of Peach Bottom Unit 2. The validity and applicability of the present method, i.e. correction of calculation results and reduction of uncertainty, are confirmed, and its features and performance are examined. In the present study, core characteristics in cycle 3 are corrected with the proposed method using predicted and 'measured' critical eigenvalues in cycles 1 and 2. As the source of uncertainty, the variance-covariance of cross sections is considered. The calculation results indicate that the bias between predicted and measured results, and the uncertainty owing to cross sections, can be reduced. Extension to other uncertainties such as thermal-hydraulic properties will be a future task. (author)
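
    In spirit, the bias factor method multiplies a new prediction by the measured-to-predicted ratio observed in earlier cycles, with random sampling of cross-section sets supplying the uncertainty. The sketch below is a heavily simplified toy with invented numbers: the real method propagates variance-covariance data and exploits correlations between cycle uncertainties, which this version does not.

        import numpy as np

        rng = np.random.default_rng(0)

        # Invented cycle-1/2 values: bias factor = measured / predicted eigenvalue
        k_pred_prev, k_meas_prev = 1.0025, 1.0001
        bias = k_meas_prev / k_pred_prev

        # Ensemble of cycle-3 predictions from randomly sampled cross-section sets
        k_pred_c3 = rng.normal(loc=1.0030, scale=0.0015, size=200)

        k_corr_c3 = bias * k_pred_c3  # multiplicative correction of each sample
        print(f"corrected k-eff: {k_corr_c3.mean():.5f} +/- {k_corr_c3.std():.5f}")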

  6. Soil map disaggregation improved by soil-landscape relationships, area-proportional sampling and random forest implementation

    DEFF Research Database (Denmark)

    Møller, Anders Bjørn; Malone, Brendan P.; Odgers, Nathan

    Detailed soil information is often needed to support agricultural practices, environmental protection and policy decisions. Several digital approaches can be used to map soil properties based on field observations. When soil observations are sparse or missing, an alternative approach is to disaggregate existing conventional soil maps. At present, the DSMART algorithm represents the most sophisticated approach for disaggregating conventional soil maps (Odgers et al., 2014). The algorithm relies on classification trees trained from resampled points, which are assigned classes according... The implementation generally improved the algorithm's ability to predict the correct soil class. The implementation of soil-landscape relationships and area-proportional sampling generally increased the calculation time, while the random forest implementation reduced the calculation time. In the most successful...

  7. Variable screening and ranking using sampling-based sensitivity measures

    International Nuclear Information System (INIS)

    Wu, Y-T.; Mohanty, Sitakanta

    2006-01-01

    This paper presents a methodology for screening out insignificant random variables and ranking the significant ones using sensitivity measures, including two cumulative distribution function (CDF)-based and two mean-response-based measures. The methodology features (1) using random samples to compute sensitivities and (2) using acceptance limits, derived from hypothesis testing, to classify significant and insignificant random variables. Because no approximation is needed in either the form of the performance functions or the type of continuous distribution functions representing input variables, the sampling-based approach can handle highly nonlinear functions with non-normal variables. The main characteristics and effectiveness of the sampling-based sensitivity measures are investigated using both simple and complex examples. Because the number of samples needed does not depend on the number of variables, the methodology appears to be particularly suitable for problems with large, complex models that have large numbers of random variables but relatively few significant ones.
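
    One simple way to realize a CDF-based, sampling-driven screen of the kind described: split the random sample at each input's median and test whether the two conditional output CDFs differ. This is a sketch of the general idea, not the authors' specific measures or acceptance limits; the model and inputs are synthetic.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        n = 5000
        x = rng.normal(size=(n, 3))            # three candidate input variables
        y = 3 * x[:, 0] + 0.5 * x[:, 1] + rng.normal(scale=0.5, size=n)
        # x[:, 2] never enters y, so it should be screened out.

        # Compare the conditional output CDFs on either side of each input's
        # median; a small KS distance (large p) flags an insignificant input.
        for j in range(3):
            below = y[x[:, j] <= np.median(x[:, j])]
            above = y[x[:, j] > np.median(x[:, j])]
            d, p = stats.ks_2samp(below, above)
            print(f"x{j}: KS distance = {d:.3f}, p = {p:.2g}")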

  8. Evaluation of the methodologies used to generate random pavement profiles based on the power spectral density: An approach based on the International Roughness Index

    Directory of Open Access Journals (Sweden)

    Boris Jesús Goenaga

    2017-01-01

    Full Text Available The pavement roughness is the main variable that produces vertical excitation in vehicles. Pavement profiles are the main determinant of (i) discomfort perception of users and (ii) dynamic loads generated at the tire-pavement interface, hence their evaluation constitutes an essential step in a Pavement Management System. The present document evaluates two specific techniques used to simulate pavement profiles, the shaping filter and the sinusoidal approach, both based on the power spectral density. Pavement roughness was evaluated using the International Roughness Index (IRI), which represents the most used index to characterize longitudinal road profiles. Appropriate parameters were defined in the simulation process to obtain pavement profiles with specific ranges of IRI values using both simulation techniques. The results suggest that using a sinusoidal approach one can generate random profiles with IRI values that are representative of different road types; therefore, one could generate a profile for a paved or an unpaved road, representing all the proposed categories defined by the ISO 8608 standard. On the other hand, to obtain similar results using the shaping filter approximation, a modification of the simulation parameters is necessary. The new proposed values allow one to generate pavement profiles with high levels of roughness, covering a wider range of surface types. Finally, the results of the current investigation could be used to further improve our understanding of the effect of pavement roughness on tire-pavement interaction. The evaluated methodologies could be used to generate random profiles with specific levels of roughness to assess their effect on dynamic loads generated at the tire-pavement interface and users' perception of road condition.
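
    A minimal sketch of the sinusoidal approach discussed above: cosines with random phases whose amplitudes follow an ISO 8608-style displacement PSD. The roughness coefficient below corresponds to a class B road; the frequency band, discretization, and seed are arbitrary illustration choices, and no IRI computation is included.

        import numpy as np

        def sinusoidal_profile(length=250.0, dx=0.05, Gd0=64e-6, n0=0.1, seed=0):
            """Road profile as a sum of cosines with random phases, amplitudes
            following an ISO 8608-style PSD Gd(n) = Gd0 * (n / n0)**-2.
            Gd0 = 64e-6 m^3 corresponds to a class B road."""
            rng = np.random.default_rng(seed)
            x = np.arange(0.0, length, dx)
            n = np.linspace(0.011, 2.83, 1000)       # spatial freq., cycles/m
            dn = n[1] - n[0]
            amp = np.sqrt(2.0 * Gd0 * (n / n0) ** -2 * dn)
            phase = rng.uniform(0.0, 2.0 * np.pi, size=n.size)
            z = (amp * np.cos(2 * np.pi * x[:, None] * n + phase)).sum(axis=1)
            return x, z

        x, z = sinusoidal_profile()
        print(z.std())  # roughness scale of the simulated class B profile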

  9. Pilot Testing of a Sampling Methodology for Assessing Seed Attachment Propensity and Transport Rate in a Soil Matrix Carried on Boot Soles and Bike Tires.

    Science.gov (United States)

    Hardiman, Nigel; Dietz, Kristina Charlotte; Bride, Ian; Passfield, Louis

    2017-01-01

    Land managers of natural areas are under pressure to balance demands for increased recreation access with protection of the natural resource. Unintended dispersal of seeds by visitors to natural areas has high potential for weedy plant invasions, with initial seed attachment an important step in the dispersal process. Although walking and mountain biking are popular nature-based recreation activities, there are few studies quantifying the propensity for seed attachment and transport rate on boot soles, and none for bike tires. Attachment and transport rate can potentially be affected by a wide range of factors for which field testing can be time-consuming and expensive. We pilot tested a sampling methodology for measuring seed attachment and transport rate in a soil matrix carried on boot soles and bike tires traversing a known quantity and density of a seed analog (beads) over different distances and soil conditions. We found that the percentage attachment rate on boot soles was much lower overall than previously reported, but that boot soles had a higher propensity for seed attachment than bike tires in almost all conditions. We believe our methodology offers a cost-effective option for researchers seeking to manipulate and test the effects of different influencing factors on these two dispersal vectors.

  10. Pilot Testing of a Sampling Methodology for Assessing Seed Attachment Propensity and Transport Rate in a Soil Matrix Carried on Boot Soles and Bike Tires

    Science.gov (United States)

    Hardiman, Nigel; Dietz, Kristina Charlotte; Bride, Ian; Passfield, Louis

    2017-01-01

    Land managers of natural areas are under pressure to balance demands for increased recreation access with protection of the natural resource. Unintended dispersal of seeds by visitors to natural areas has high potential for weedy plant invasions, with initial seed attachment an important step in the dispersal process. Although walking and mountain biking are popular nature-based recreation activities, there are few studies quantifying the propensity for seed attachment and transport rate on boot soles, and none for bike tires. Attachment and transport rate can potentially be affected by a wide range of factors for which field testing can be time-consuming and expensive. We pilot tested a sampling methodology for measuring seed attachment and transport rate in a soil matrix carried on boot soles and bike tires traversing a known quantity and density of a seed analog (beads) over different distances and soil conditions. We found that the percentage attachment rate on boot soles was much lower overall than previously reported, but that boot soles had a higher propensity for seed attachment than bike tires in almost all conditions. We believe our methodology offers a cost-effective option for researchers seeking to manipulate and test the effects of different influencing factors on these two dispersal vectors.

  11. Assessing sample representativeness in randomized controlled trials: application to the National Institute of Drug Abuse Clinical Trials Network.

    Science.gov (United States)

    Susukida, Ryoko; Crum, Rosa M; Stuart, Elizabeth A; Ebnesajjad, Cyrus; Mojtabai, Ramin

    2016-07-01

    To compare the characteristics of individuals participating in randomized controlled trials (RCTs) of treatments of substance use disorder (SUD) with individuals receiving treatment in usual care settings, and to provide a summary quantitative measure of differences between the characteristics of these two groups using propensity score methods. Design Analyses using data from RCT samples from the National Institute of Drug Abuse Clinical Trials Network (CTN) and target populations of patients drawn from the Treatment Episodes Data Set-Admissions (TEDS-A). Settings Multiple clinical trial sites and nation-wide usual SUD treatment settings in the United States. A total of 3592 individuals from 10 CTN samples and 1 602 226 individuals selected from TEDS-A between 2001 and 2009. Measurements The propensity scores for enrolling in the RCTs were computed based on the following nine observable characteristics: sex, race/ethnicity, age, education, employment status, marital status, admission to treatment through criminal justice, intravenous drug use and the number of prior treatments. Findings The proportion of those with ≥ 12 years of education and the proportion of those who had full-time jobs were significantly higher among RCT samples than among target populations (in seven and nine trials, respectively, at P < 0.05). The difference in the mean propensity scores between the RCTs and the target population was 1.54 standard deviations and was statistically significant, indicating that RCT participants were substantially different from individuals receiving treatment in usual care settings. Notably, RCT participants tend to have more years of education and a greater likelihood of full-time work compared with people receiving care in usual care settings. © 2016 Society for the Study of Addiction.
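
    The summary measure reported above is straightforward to compute once enrollment is modeled. The sketch below uses synthetic covariates standing in for the nine characteristics: a logistic propensity model, with the difference in mean scores expressed in SD units (the overall SD is used as a simple pooling choice; the paper's exact standardization may differ).

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Synthetic stand-ins for the nine covariates; z = 1 marks trial samples
        rng = np.random.default_rng(3)
        X = np.vstack([rng.normal(0.3, 1.0, (300, 9)),    # "RCT" group, shifted
                       rng.normal(0.0, 1.0, (3000, 9))])  # "usual care" group
        z = np.r_[np.ones(300), np.zeros(3000)]

        ps = LogisticRegression(max_iter=1000).fit(X, z).predict_proba(X)[:, 1]
        logit = np.log(ps / (1.0 - ps))

        # Difference in mean propensity (logit) scores between groups, in SD units
        diff = (logit[z == 1].mean() - logit[z == 0].mean()) / logit.std()
        print(f"standardized difference = {diff:.2f}")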

  12. High Field In Vivo 13C Magnetic Resonance Spectroscopy of Brain by Random Radiofrequency Heteronuclear Decoupling and Data Sampling

    Science.gov (United States)

    Li, Ningzhi; Li, Shizhe; Shen, Jun

    2017-06-01

    In vivo 13C magnetic resonance spectroscopy (MRS) is a unique and effective tool for studying dynamic human brain metabolism and the cycling of neurotransmitters. One of the major technical challenges for in vivo 13C-MRS is the high radio frequency (RF) power necessary for heteronuclear decoupling. In the common practice of in vivo 13C-MRS, alkanyl carbons are detected in the spectral range of 10-65 ppm. The amplitude of decoupling pulses has to be significantly greater than the large one-bond 1H-13C scalar coupling (1JCH = 125-145 Hz). Two main proton decoupling methods have been developed: broadband stochastic decoupling and coherent composite or adiabatic pulse decoupling (e.g., WALTZ); the latter is widely used because of its efficiency and superb performance under inhomogeneous B1 field. Because the RF power required for proton decoupling increases quadratically with field strength, in vivo 13C-MRS using coherent decoupling is often limited to low magnetic fields. In contrast, carboxylic/amide carbons are coupled to protons via weak long-range 1H-13C scalar couplings, which can be decoupled using low RF power broadband stochastic decoupling. Recently, the carboxylic/amide 13C-MRS technique using low power random RF heteronuclear decoupling was safely applied to human brain studies at 7T. Here, we review the two major decoupling methods and the carboxylic/amide 13C-MRS with low power decoupling strategy. Further decreases in RF power deposition by frequency-domain windowing and time-domain random under-sampling are also discussed. Low RF power decoupling opens the possibility of performing in vivo 13C experiments of the human brain at very high magnetic fields (such as 11.7T), where the signal-to-noise ratio as well as spatial and temporal spectral resolution are more favorable than at lower fields.

  13. Iterative random vs. Kennard-Stone sampling for IR spectrum-based classification task using PLS2-DA

    Science.gov (United States)

    Lee, Loong Chuen; Liong, Choong-Yeun; Jemain, Abdul Aziz

    2018-04-01

    External testing (ET) is preferred over auto-prediction (AP) or k-fold cross-validation in estimating a more realistic predictive ability of a statistical model. With IR spectra, the Kennard-Stone (KS) sampling algorithm is often used to split the data into training and test sets, i.e. respectively for model construction and for model testing. On the other hand, iterative random sampling (IRS) has not been the favored choice, though it is theoretically more likely to produce reliable estimation. The aim of this preliminary work is to compare the performance of KS and IRS in sampling a representative training set from an attenuated total reflectance - Fourier transform infrared spectral dataset (of four varieties of blue gel pen inks) for PLS2-DA modeling. The 'best' performance achievable from the dataset is estimated with AP on the full dataset (APF,error). Both IRS (n = 200) and KS were used to split the dataset in the ratio of 7:3. The classic decision rule (i.e. maximum value-based) is employed for new sample prediction via partial least squares - discriminant analysis (PLS2-DA). The error rate of each model was estimated repeatedly via: (a) AP on the full dataset (APF,error); (b) AP on the training set (APS,error); and (c) ET on the respective test set (ETS,error). A good PLS2-DA model is expected to produce APS,error and ETS,error values that are similar to the APF,error. Bearing that in mind, the similarities between (a) APS,error vs. APF,error; (b) ETS,error vs. APF,error; and (c) APS,error vs. ETS,error were evaluated using correlation tests (i.e. Pearson and Spearman's rank tests) on series of PLS2-DA models computed from the KS-set and IRS-set, respectively. Overall, models constructed from the IRS-set exhibit more similarity between the internal and external error rates than the respective KS-set, i.e. less risk of overfitting. In conclusion, IRS is more reliable than KS in sampling a representative training set.
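
    For reference, a plain implementation of the Kennard-Stone selection compared here (maximin on Euclidean distances); the toy matrix stands in for the spectral data, and the IRS alternative is simply repeated random permutation.

        import numpy as np

        def kennard_stone(X, n_train):
            """Plain Kennard-Stone: start from the two mutually farthest samples,
            then repeatedly add the sample whose minimum distance to the selected
            set is largest. Returns training-set indices. O(n^2) memory."""
            d = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
            picked = list(np.unravel_index(d.argmax(), d.shape))
            while len(picked) < n_train:
                rest = [i for i in range(len(X)) if i not in picked]
                nearest = d[np.ix_(rest, picked)].min(axis=1)
                picked.append(rest[int(nearest.argmax())])
            return np.array(picked)

        X = np.random.default_rng(0).normal(size=(60, 5))
        train_idx = kennard_stone(X, int(0.7 * len(X)))
        # An IRS split is a fresh random permutation per iteration, e.g.:
        # idx = np.random.default_rng(s).permutation(len(X))[:n_train]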

  14. A pilot double-blind, randomized, placebo-controlled trial of the efficacy of trace elements in the treatment of endometriosis-related pain: study design and methodology

    Directory of Open Access Journals (Sweden)

    Oberweis D

    2016-02-01

    Full Text Available Didier Oberweis,1 Patrick Madelenat,2 Michelle Nisolle,3 Etienne Demanet4 1Department of Gynecology and Obstetrics, CHU de Charleroi, Hôpital André Vésale, Montigny-le-Tilleul, Belgium; 2Private Consultation, Paris, France; 3Department of Gynecology and Obstetrics, CHR Citadelle, Liège, 4Clinical Research Unit, Charleroi, Belgium Abstract: Endometriosis is one of the most common benign gynecological disorders, affecting almost 10%-15% of all women of reproductive age and >30% of infertile women. The pathology is associated with various distressing symptoms, particularly pelvic pain, which adversely affect patients' quality of life. It is an estrogen-dependent disease. There is evidence both in animals and in humans that metal ions can activate the estrogen receptors. They are defined as a variety of xenoestrogens, called metalloestrogens, which could act as endocrine disruptors. Therefore, it could be considered to act on this gynecological disorder using food supplements containing trace elements (ie, nutripuncture). The assumption is that they could modulate estrogen receptors and thus influence the tropism and the survival of cells involved in endometriosis. By a modulation of the antioxidant system, they might also interact with various parameters influencing tissue biochemistry. The objective of this article is to describe and discuss the design and methodology of an ongoing double-blind, randomized, placebo-controlled study aiming to evaluate the efficacy of metal trace elements on the reduction of pain and improvement of quality of life, in patients with a revised American Fertility Society Score Stages II-IV endometriosis, combined or not with adenomyosis, during a treatment period of 4 months. Trace elements or placebo is proposed in the absence of any other treatment or as an add-on to current therapies, such as sexual hormones, nonsteroidal anti-inflammatory drugs, and surgery. A placebo run-in period of one menstrual cycle or

  15. Design and methodology of a randomized clinical trial of home-based telemental health treatment for U.S. military personnel and veterans with depression.

    Science.gov (United States)

    Luxton, David D; Pruitt, Larry D; O'Brien, Karen; Stanfill, Katherine; Jenkins-Guarnieri, Michael A; Johnson, Kristine; Wagner, Amy; Thomas, Elissa; Gahm, Gregory A

    2014-05-01

    Home-based telemental health (TMH) treatments have the potential to address current and future health needs of military service members, veterans, and their families, especially for those who live in rural or underserved areas. The use of home-based TMH treatments to address the behavioral health care needs of U.S. military healthcare beneficiaries is not presently considered standard of care in the Military Health System. The feasibility, safety, and clinical efficacy of home-based TMH treatments must be established before broad dissemination of home-based treatment programs can be implemented. This paper describes the design, methodology, and protocol of a clinical trial that compares in-office to home-based Behavioral Activation for Depression (BATD) treatment delivered via web-based video technology for service members and veterans with depression. This grant-funded, three-year randomized clinical trial is being conducted at the National Center for Telehealth and Technology at Joint Base Lewis-McChord and at the Portland VA Medical Center. Best practice recommendations regarding the implementation of in-home telehealth in the military setting as well as the cultural and contextual factors of providing in-home care to active duty and veteran military populations are also discussed. Published by Elsevier Inc.

  16. Comparing attitudes about legal sanctions and teratogenic effects for cocaine, alcohol, tobacco and caffeine: A randomized, independent samples design

    Directory of Open Access Journals (Sweden)

    Alanis Kelly L

    2006-02-01

    Full Text Available Abstract Background Establishing more sensible measures to treat cocaine-addicted mothers and their children is essential for improving U.S. drug policy. Favorable post-natal environments have moderated potential deleterious prenatal effects. However, since cocaine is an illicit substance that has long been demonized, we hypothesized that attitudes toward prenatal cocaine exposure would be more negative than for the licit substances alcohol, nicotine and caffeine. Further, media portrayals about long-term outcomes were hypothesized to influence viewers' attitudes, measured immediately post-viewing. Reducing popular "crack baby" stigmas could influence future policy decisions by legislators. In Study 1, 336 participants were randomly assigned to 1 of 4 conditions describing hypothetical legal sanction scenarios for pregnant women using cocaine, alcohol, nicotine or caffeine. Participants rated legal sanctions against pregnant women who used one of these substances and the risk potential for developing children. In Study 2, 139 participants were randomly assigned to positive, neutral and negative media conditions. Immediately post-viewing, participants rated prenatal cocaine-exposed or non-exposed teens for their academic performance and risk for problems at age 18. Results Participants in Study 1 imposed significantly greater legal sanctions for cocaine, perceiving prenatal cocaine exposure as more harmful than alcohol, nicotine or caffeine. A one-way ANOVA for independent samples showed significant differences beyond the .0001 level. A post hoc Scheffé test illustrated that cocaine was rated differently from the other substances. In Study 2, a one-way ANOVA for independent samples was performed on difference scores for the positive, neutral or negative media conditions about prenatal cocaine exposure. Participants in the neutral and negative media conditions estimated significantly lower grade point averages and more problems for the teen with prenatal cocaine exposure

  17. A cross-sectional, randomized cluster sample survey of household vulnerability to extreme heat among slum dwellers in ahmedabad, india.

    Science.gov (United States)

    Tran, Kathy V; Azhar, Gulrez S; Nair, Rajesh; Knowlton, Kim; Jaiswal, Anjali; Sheffield, Perry; Mavalankar, Dileep; Hess, Jeremy

    2013-06-18

    Extreme heat is a significant public health concern in India; extreme heat hazards are projected to increase in frequency and severity with climate change. Few of the factors driving population heat vulnerability are documented, though poverty is a presumed risk factor. To facilitate public health preparedness, an assessment of factors affecting vulnerability among slum dwellers was conducted in summer 2011 in Ahmedabad, Gujarat, India. Indicators of heat exposure, susceptibility to heat illness, and adaptive capacity, all of which feed into heat vulnerability, were assessed through a cross-sectional household survey using randomized multistage cluster sampling. Associations between heat-related morbidity and vulnerability factors were identified using multivariate logistic regression with generalized estimating equations to account for clustering effects. Age, preexisting medical conditions, work location, and access to health information and resources were associated with self-reported heat illness. Several of these variables were unique to this study. As sociodemographics, occupational heat exposure, and access to resources were shown to increase vulnerability, future interventions (e.g., health education) might target specific populations among Ahmedabad urban slum dwellers to reduce vulnerability to extreme heat. Surveillance and evaluations of future interventions may also be worthwhile.
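
    The analysis pairs multistage cluster sampling with logistic regression estimated via generalized estimating equations (GEE). A minimal statsmodels sketch of that estimation step follows; the data frame, variable names, and effects are invented stand-ins, not the survey's actual codebook.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)

# Hypothetical stand-in for the survey data: households nested in clusters.
n = 300
df = pd.DataFrame({
    "cluster": rng.integers(0, 30, n),        # multistage cluster id
    "age": rng.integers(18, 80, n),
    "works_outdoors": rng.integers(0, 2, n),
    "heat_illness": rng.integers(0, 2, n),    # self-reported outcome
})

# Logistic model with GEE; the exchangeable working correlation accounts
# for within-cluster dependence induced by the sampling design.
X = sm.add_constant(df[["age", "works_outdoors"]])
model = sm.GEE(df["heat_illness"], X, groups=df["cluster"],
               family=sm.families.Binomial(),
               cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```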

  18. Multiple-image authentication with a cascaded multilevel architecture based on amplitude field random sampling and phase information multiplexing.

    Science.gov (United States)

    Fan, Desheng; Meng, Xiangfeng; Wang, Yurong; Yang, Xiulun; Pan, Xuemei; Peng, Xiang; He, Wenqi; Dong, Guoyan; Chen, Hongyi

    2015-04-10

    A multiple-image authentication method with a cascaded multilevel architecture in the Fresnel domain is proposed, in which a synthetic encoded complex amplitude is first fabricated, and its real amplitude component is generated by iterative amplitude encoding, random sampling, and space multiplexing for the low-level certification images, while the phase component of the synthetic encoded complex amplitude is constructed by iterative phase information encoding and multiplexing for the high-level certification images. Then the synthetic encoded complex amplitude is iteratively encoded into two phase-type ciphertexts located in two different transform planes. During high-level authentication, when the two phase-type ciphertexts and the high-level decryption key are presented to the system and then the Fresnel transform is carried out, a meaningful image with good quality and a high correlation coefficient with the original certification image can be recovered in the output plane. Similar to the procedure of high-level authentication, in the case of low-level authentication with the aid of a low-level decryption key, no significant or meaningful information is retrieved, but it can result in a remarkable peak output in the nonlinear correlation coefficient of the output image and the corresponding original certification image. Therefore, the method realizes different levels of accessibility to the original certification image for different authority levels with the same cascaded multilevel architecture.
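
    The amplitude side of the scheme rests on random sampling plus space multiplexing: each low-level certification image contributes a disjoint, randomly chosen subset of pixels to the synthetic amplitude. A toy numpy sketch of just that step is below; it is our illustration of the sampling/multiplexing idea only, and does not reproduce the paper's iterative Fresnel-domain encoding.

```python
import numpy as np

rng = np.random.default_rng(3)

shape = (64, 64)
images = [rng.random(shape) for _ in range(3)]  # stand-ins for encoded amplitudes

# Randomly partition the pixel grid into one disjoint mask per image.
owner = rng.integers(0, len(images), size=shape)
masks = [(owner == k) for k in range(len(images))]

# Space multiplexing: assemble the synthetic amplitude from the sampled pixels.
synthetic_amplitude = np.zeros(shape)
for img, mask in zip(images, masks):
    synthetic_amplitude[mask] = img[mask]
```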

  19. Mental Health Impact of Hosting Disaster Refugees: Analyses from a Random Sample Survey Among Haitians Living in Miami.

    Science.gov (United States)

    Messiah, Antoine; Lacoste, Jérôme; Gokalsing, Erick; Shultz, James M; Rodríguez de la Vega, Pura; Castro, Grettel; Acuna, Juan M

    2016-08-01

    Studies on the mental health of families hosting disaster refugees are lacking. This study compares participants in households that hosted 2010 Haitian earthquake disaster refugees with their nonhost counterparts. A random sample survey was conducted from October 2011 through December 2012 in Miami-Dade County, Florida. Haitian participants were assessed regarding their 2010 earthquake exposure and impact on family and friends and whether they hosted earthquake refugees. Using standardized scores and thresholds, they were evaluated for symptoms of three common mental disorders (CMDs): posttraumatic stress disorder, generalized anxiety disorder, and major depressive disorder (MDD). Participants who hosted refugees (n = 51) had significantly higher percentages of scores beyond thresholds for MDD than those who did not host refugees (n = 365) and for at least one CMD, after adjusting for participants' earthquake exposures and effects on family and friends. Hosting refugees from a natural disaster appears to elevate the risk for MDD and possibly other CMDs, independent of risks posed by exposure to the disaster itself. Families hosting refugees deserve special attention.

  20. [Transcriptome among Mexicans: a large-scale methodology to analyze the gene expression profile of simultaneous samples in muscle, adipose tissue and lymphocytes obtained from the same individual].

    Science.gov (United States)

    Bastarrachea, Raúl A; López-Alvarenga, Juan Carlos; Kent, Jack W; Laviada-Molina, Hugo A; Cerda-Flores, Ricardo M; Calderón-Garcidueñas, Ana Laura; Torres-Salazar, Amada; Torres-Salazar, Amanda; Nava-González, Edna J; Solis-Pérez, Elizabeth; Gallegos-Cabrales, Esther C; Cole, Shelley A; Comuzzie, Anthony G

    2008-01-01

    We describe the methodology used to analyze multiple transcripts using microarray techniques in simultaneous biopsies of muscle, adipose tissue and lymphocytes obtained from the same individual as part of the standard protocol of the Genetics of Metabolic Diseases in Mexico: GEMM Family Study. We recruited 4 healthy male subjects with BMI 20-41 who signed an informed consent letter. Subjects participated in a clinical examination that included anthropometric and body composition measurements, muscle biopsies (vastus lateralis), subcutaneous fat biopsies and a blood draw. All samples provided sufficient amplified RNA for microarray analysis. Total RNA was extracted from the biopsy samples and amplified for analysis. Of the 48,687 transcript targets queried, 39.4% were detectable in at least one of the studied tissues. Leptin was not detectable in lymphocytes, weakly expressed in muscle, but overexpressed and highly correlated with BMI in subcutaneous fat. Another example was GLUT4, which was detectable only in muscle and not correlated with BMI. Expression level concordance was 0.7 (p < 0.001) for the three tissues studied. We demonstrated the feasibility of carrying out simultaneous analysis of gene expression in multiple tissues, showed the concordance of genetic expression in different tissues, and gained confidence that this method corroborates the expected biological relationships for LEP and GLUT4. The GEMM study will provide a broad and valuable overview of metabolic diseases, including obesity and type 2 diabetes.

  1. Methodology of ABNT ISO/IEC GUIA 25 implantation in the laboratories of radionuclides analysis in environmental samples of the Analysis Division/CNEN

    International Nuclear Information System (INIS)

    Oliveira, Josue Peter de

    1997-07-01

    The ISO/IEC Guide 25:1993 standard, "General requirements for the competence of calibration and testing laboratories", is published in Brazil by the Brazilian Association for Technical Standards (ABNT) as ABNT ISO/IEC GUIA 25 and establishes the general requirements a laboratory must demonstrably meet in order to be recognized as having the technical competence (accreditation) to carry out specific calibrations or tests. An accredited laboratory then joins, respectively, the Brazilian Calibration Network (RBC) or the Brazilian Testing Laboratories Network (RBLE). The Environmental Radioanalysis Division (DIAMB) of the Environmental Radiological Protection Department (DEPRA) of the Institute of Radiation Protection and Dosimetry (IRD) of the Brazilian National Nuclear Energy Commission (CNEN) is the laboratory responsible for analyzing radionuclides in samples from DEPRA's surveillance program, research and services, monitoring possible radionuclide contamination of the environment, foods and other raw materials for human consumption, including for certification of imported and exported products. For all these reasons, DIAMB needs formal recognition to carry out radionuclide analysis in environmental samples. This work provides a methodology to guide a laboratory that intends to implement an accreditation process. It also describes policies for meeting the requirements of the Standard, gives guidance on specifying some of the steps, and comments on some points of the Standard in order to make the whole accreditation process easier to understand. (author)

  2. Comparison of Address-based Sampling and Random-digit Dialing Methods for Recruiting Young Men as Controls in a Case-Control Study of Testicular Cancer Susceptibility

    OpenAIRE

    Clagett, Bartholt; Nathanson, Katherine L.; Ciosek, Stephanie L.; McDermoth, Monique; Vaughn, David J.; Mitra, Nandita; Weiss, Andrew; Martonik, Rachel; Kanetsky, Peter A.

    2013-01-01

    Random-digit dialing (RDD) using landline telephone numbers is the historical gold standard for control recruitment in population-based epidemiologic research. However, increasing cell-phone usage and diminishing response rates suggest that the effectiveness of RDD in recruiting a random sample of the general population, particularly for younger target populations, is decreasing. In this study, we compared landline RDD with alternative methods of control recruitment, including RDD using cell-...

  3. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples

    Energy Technology Data Exchange (ETDEWEB)

    Laborda, Francisco, E-mail: flaborda@unizar.es; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T.; Jiménez, María S.; Pérez-Arantegui, Josefina; Castillo, Juan R.

    2016-01-21

    dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. - Highlights: • The challenge to analyze inorganic nanomaterials is described. • Techniques for detection, characterization and quantification of inorganic nanomaterials are presented. • Sample preparation methods for the analysis of nanomaterials in complex samples are presented. • Methodological approaches posed by stakeholders for solving nanometrological problems are discussed.

  4. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples

    International Nuclear Information System (INIS)

    Laborda, Francisco; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T.; Jiménez, María S.; Pérez-Arantegui, Josefina; Castillo, Juan R.

    2016-01-01

    dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. - Highlights: • The challenge to analyze inorganic nanomaterials is described. • Techniques for detection, characterization and quantification of inorganic nanomaterials are presented. • Sample preparation methods for the analysis of nanomaterials in complex samples are presented. • Methodological approaches posed by stakeholders for solving nanometrological problems are discussed.

  5. Contributions to sampling statistics

    CERN Document Server

    Conti, Pier; Ranalli, Maria

    2014-01-01

    This book contains a selection of the papers presented at the ITACOSM 2013 Conference, held in Milan in June 2013. ITACOSM is the bi-annual meeting of the Survey Sampling Group S2G of the Italian Statistical Society, intended as an international  forum of scientific discussion on the developments of theory and application of survey sampling methodologies and applications in human and natural sciences. The book gathers research papers carefully selected from both invited and contributed sessions of the conference. The whole book appears to be a relevant contribution to various key aspects of sampling methodology and techniques; it deals with some hot topics in sampling theory, such as calibration, quantile-regression and multiple frame surveys, and with innovative methodologies in important topics of both sampling theory and applications. Contributions cut across current sampling methodologies such as interval estimation for complex samples, randomized responses, bootstrap, weighting, modeling, imputati...

  6. A Systematic Review of Surgical Randomized Controlled Trials: Part 2. Funding Source, Conflict of Interest, and Sample Size in Plastic Surgery.

    Science.gov (United States)

    Voineskos, Sophocles H; Coroneos, Christopher J; Ziolkowski, Natalia I; Kaur, Manraj N; Banfield, Laura; Meade, Maureen O; Chung, Kevin C; Thoma, Achilleas; Bhandari, Mohit

    2016-02-01

    The authors examined industry support, conflict of interest, and sample size in plastic surgery randomized controlled trials that compared surgical interventions. They hypothesized that industry-funded trials demonstrate statistically significant outcomes more often, and randomized controlled trials with small sample sizes report statistically significant results more frequently. An electronic search identified randomized controlled trials published between 2000 and 2013. Independent reviewers assessed manuscripts and performed data extraction. Funding source, conflict of interest, primary outcome direction, and sample size were examined. Chi-squared and independent-samples t tests were used in the analysis. The search identified 173 randomized controlled trials, of which 100 (58 percent) did not acknowledge funding status. A relationship between funding source and trial outcome direction was not observed. Both funding status and conflict of interest reporting improved over time. Only 24 percent (six of 25) of industry-funded randomized controlled trials reported authors to have independent control of data and manuscript contents. The mean number of patients randomized was 73 per trial (median, 43; minimum, 3; maximum, 936). Small trials were not found to be positive more often than large trials (p = 0.87). Randomized controlled trials with small sample size were common; however, this provides great opportunity for the field to engage in further collaboration and produce larger, more definitive trials. Reporting of trial funding and conflict of interest is historically poor, but it greatly improved over the study period. Underreporting at author and journal levels remains a limitation when assessing the relationship between funding source and trial outcomes. Improved reporting and manuscript control should be goals that both authors and journals can actively achieve.

  7. The relationship between external and internal validity of randomized controlled trials: A sample of hypertension trials from China.

    Science.gov (United States)

    Zhang, Xin; Wu, Yuxia; Ren, Pengwei; Liu, Xueting; Kang, Deying

    2015-10-30

    To explore the relationship between the external validity and the internal validity of hypertension RCTs conducted in China. Comprehensive literature searches were performed in Medline, Embase, the Cochrane Central Register of Controlled Trials (CCTR), CBMdisc (Chinese biomedical literature database), CNKI (China National Knowledge Infrastructure/China Academic Journals Full-text Database) and VIP (Chinese scientific journals database), and advanced search strategies were used to locate hypertension RCTs. The risk of bias in the RCTs was assessed with a modified scale and the Jadad scale, respectively, and studies with grading scores of 3 or more were included for the evaluation of external validity. A data extraction form covering 4 domains and 25 items was used to explore the relationship between the external validity and the internal validity. Statistical analyses were performed using SPSS software, version 21.0 (SPSS, Chicago, IL). 226 hypertension RCTs were included in the final analysis. RCTs conducted in university-affiliated hospitals scored higher on internal validity. Multi-center studies (median = 4.0, IQR = 2.0) scored higher on internal validity than single-center studies (median = 3.0, IQR = 1.0; P = 0.004). Multivariate regression indicated that sample size, industry funding, quality of life (QOL) taken as an outcome measure, and a university-affiliated hospital as the trial setting were statistically significant. Several components related to the external validity of RCTs thus do associate with the internal validity, although the two do not stand in a simple relationship to each other. Given the generally poor reporting, other possible links between the two variables need to be traced in future methodological research.

  8. Recovery from work-related stress: a randomized controlled trial of a stress management intervention in a clinical sample.

    Science.gov (United States)

    Glasscock, David J; Carstensen, Ole; Dalgaard, Vita Ligaya

    2018-05-28

    Randomized controlled trials (RCTs) of interventions aimed at reducing work-related stress indicate that cognitive behavioural therapy (CBT) is more effective than other interventions. However, definitions of study populations are often unclear and there is a lack of interventions targeting both the individual and the workplace. The aim of this study was to determine whether a stress management intervention combining individual CBT and a workplace focus is superior to no treatment in the reduction of perceived stress and stress symptoms and time to lasting return to work (RTW) in a clinical sample. Patients with work-related stress reactions or adjustment disorders were randomly assigned to an intervention group (n = 57, 84.2% female) or a control group (n = 80, 83.8% female). Subjects were followed via questionnaires and register data. The intervention contained individual CBT and the offer of a workplace meeting. We examined intervention effects by analysing group differences in score changes on the Perceived Stress Scale (PSS-10) and the General Health Questionnaire (GHQ-30). We also tested if intervention led to faster lasting RTW. Mean baseline values of PSS were 24.79 in the intervention group and 23.26 in the control group while the corresponding values for GHQ were 21.3 and 20.27, respectively. There was a significant effect of time. 10 months after baseline, both groups reported less perceived stress and improved mental health. 4 months after baseline, we found significant treatment effects for both perceived stress and mental health. The difference in mean change in PSS after 4 months was -3.09 (-5.47, -0.72), while for GHQ it was -3.91 (-7.15, -0.68). There were no group differences in RTW. The intervention led to faster reductions in perceived stress and stress symptoms amongst patients with work-related stress reactions and adjustment disorders. 6 months after the intervention ended there were no longer differences between

  9. Electronic symptom reporting between patient and provider for improved health care service quality: a systematic review of randomized controlled trials. part 2: methodological quality and effects.

    Science.gov (United States)

    Johansen, Monika Alise; Berntsen, Gro K Rosvold; Schuster, Tibor; Henriksen, Eva; Horsch, Alexander

    2012-10-03

    We conducted in two parts a systematic review of randomized controlled trials (RCTs) on electronic symptom reporting between patients and providers to improve health care service quality. Part 1 reviewed the typology of patient groups, health service innovations, and research targets. Four innovation categories were identified: consultation support, monitoring with clinician support, self-management with clinician support, and therapy. To assess the methodological quality of the RCTs, and summarize effects and benefits from the methodologically best studies. We searched Medline, EMBASE, PsycINFO, Cochrane Central Register of Controlled Trials, and IEEE Xplore for original studies presented in English-language articles between 1990 and November 2011. Risk of bias and feasibility were judged according to the Cochrane recommendation, and theoretical evidence and preclinical testing were evaluated according to the Framework for Design and Evaluation of Complex Interventions to Improve Health. Three authors assessed the risk of bias and two authors extracted the effect data independently. Disagreement regarding bias assessment, extraction, and interpretation of results were resolved by consensus discussions. Of 642 records identified, we included 32 articles representing 29 studies. No articles fulfilled all quality requirements. All interventions were feasible to implement in a real-life setting, and theoretical evidence was provided for almost all studies. However, preclinical testing was reported in only a third of the articles. We judged three-quarters of the articles to have low risk for random sequence allocation and approximately half of the articles to have low risk for the following biases: allocation concealment, incomplete outcome data, and selective reporting. Slightly more than one fifth of the articles were judged as low risk for blinding of outcome assessment. Only 1 article had low risk of bias for blinding of participants and personnel. We excluded 12

  10. The French national survey on food consumption of children under 3 years of age - Nutri-Bébé 2013: design, methodology, population sampling and feeding practices.

    Science.gov (United States)

    Chouraqui, Jean-Pierre; Tavoularis, Gabriel; Emery, Yves; Francou, Aurée; Hébel, Pascale; Bocquet, Magali; Hankard, Régis; Turck, Dominique

    2018-02-01

    To update the data on food consumption and practices in children under 3 years of age in metropolitan France. The Nutri-Bébé 2013 cross-sectional study selected a random sample, according to the quota sampling method. After giving their informed consent, parents had to record the food consumption during three non-consecutive days framed by two face-to-face interviews, using for quantitative information different portion size measurement aids. One thousand one hundred and eighty-four children were enrolled. Mothers' mean age was 30·8 (sd 5·4) years; 38 % were primiparous; 89 % lived with a partner; 60 % had an occupation. Of the infants younger than 4 months, 31 % were breast-fed. One thousand and thirty-five children consumed infant formula followed by growing-up milk in 63 % of them; solid foods were introduced at a mean age of 5·4 (sd 2·13) months. From 8 months onwards, 25 % of children consumed the same foods as their parents on a more or less regular basis; 29 % ate in front of a screen, with a daily average screen time of 43·0 (sd 40·4) min. This robust survey highlights the low prevalence and duration of breast-feeding in France and shows a modest improvement since the previous survey of 2005 in the observance of recommendations concerning other feeding practices. The frequent consumption of adult foods and the screen time are of concern.

  11. CT-Guided Transgluteal Biopsy for Systematic Random Sampling of the Prostate in Patients Without Rectal Access.

    Science.gov (United States)

    Goenka, Ajit H; Remer, Erick M; Veniero, Joseph C; Thupili, Chakradhar R; Klein, Eric A

    2015-09-01

    The objective of our study was to review our experience with CT-guided transgluteal prostate biopsy in patients without rectal access. Twenty-one CT-guided transgluteal prostate biopsy procedures were performed in 16 men (mean age, 68 years; age range, 60-78 years) who were under conscious sedation. The mean prostate-specific antigen (PSA) value was 11.4 ng/mL (range, 2.3-39.4 ng/mL). Six had seven prior unsuccessful transperineal or transurethral biopsies. Biopsy results, complications, sedation time, and radiation dose were recorded. The mean PSA values and number of core specimens were compared between patients with malignant results and patients with nonmalignant results using the Student t test. The average procedural sedation time was 50.6 minutes (range, 15-90 minutes) (n = 20), and the mean effective radiation dose was 8.2 mSv (median, 6.6 mSv; range, 3.6-19.3 mSv) (n = 13). Twenty of the 21 (95%) procedures were technically successful. The only complication was a single episode of gross hematuria and penile pain in one patient, which resolved spontaneously. Of 20 successful biopsies, 8 (40%) yielded adenocarcinoma (Gleason score: mean, 8; range, 7-9). Twelve biopsies yielded nonmalignant results (60%): high-grade prostatic intraepithelial neoplasia (n = 3) or benign prostatic tissue with or without inflammation (n = 9). Three patients had carcinoma diagnosed on subsequent biopsies (second biopsy, n = 2 patients; third biopsy, n = 1 patient). A malignant biopsy result was not significantly associated with the number of core specimens (p = 0.3) or the mean PSA value (p = 0.1). CT-guided transgluteal prostate biopsy is a safe and reliable technique for the systematic random sampling of the prostate in patients without rectal access. In patients with initial negative biopsy results, repeat biopsy should be considered if there is a persistent rise in the PSA value.

  12. Distribution of peak expiratory flow variability by age, gender and smoking habits in a random population sample aged 20-70 yrs

    NARCIS (Netherlands)

    Boezen, H M; Schouten, J. P.; Postma, D S; Rijcken, B

    1994-01-01

    Peak expiratory flow (PEF) variability can be considered as an index of bronchial lability. Population studies on PEF variability are few. The purpose of the current paper is to describe the distribution of PEF variability in a random population sample of adults with a wide age range (20-70 yrs),

  13. Evaluating effectiveness of down-sampling for stratified designs and unbalanced prevalence in Random Forest models of tree species distributions in Nevada

    Science.gov (United States)

    Elizabeth A. Freeman; Gretchen G. Moisen; Tracy S. Frescino

    2012-01-01

    Random Forests is frequently used to model species distributions over large geographic areas. Complications arise when data used to train the models have been collected in stratified designs that involve different sampling intensity per stratum. The modeling process is further complicated if some of the target species are relatively rare on the landscape leading to an...
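
    One common remedy in this setting is down-sampling the majority class before fitting. A minimal sklearn sketch follows, with synthetic data and a single global down-sample; per-stratum handling and the authors' actual design are not reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(4)

# Synthetic stand-in: a rare tree species (about 5% prevalence) in plot data.
X = rng.normal(size=(2000, 10))          # e.g., climate/terrain predictors
y = (rng.random(2000) < 0.05).astype(int)

# Down-sample absences to match the number of presences before fitting.
pres = np.flatnonzero(y == 1)
absn = np.flatnonzero(y == 0)
absn_down = rng.choice(absn, size=len(pres), replace=False)
idx = np.concatenate([pres, absn_down])

rf = RandomForestClassifier(n_estimators=500, random_state=0)
rf.fit(X[idx], y[idx])

# Note: down-sampling shifts predicted probabilities, so a threshold other
# than 0.5 (or a recalibration step) is typically needed on the full data.
```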

  14. Convergence analysis for Latin-hypercube lattice-sample selection strategies for 3D correlated random hydraulic-conductivity fields

    OpenAIRE

    Simuta-Champo, R.; Herrera-Zamarrón, G. S.

    2010-01-01

    The Monte Carlo technique provides a natural method for evaluating uncertainties. The uncertainty is represented by a probability distribution or by related quantities such as statistical moments. When the groundwater flow and transport governing equations are solved and the hydraulic conductivity field is treated as a random spatial function, the hydraulic head, velocities and concentrations also become random spatial functions. When that is the case, for the stochastic simulation of groundw...
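
    A minimal example of drawing a Latin-hypercube sample with SciPy's qmc module is below; the three parameters and their ranges are invented placeholders for whatever parameterizes the correlated hydraulic-conductivity field in a given study.

```python
from scipy.stats import qmc

# One Latin-hypercube draw for a 3-parameter problem.
sampler = qmc.LatinHypercube(d=3, seed=7)
u = sampler.random(n=100)                  # 100 samples on the unit cube

# Rescale to assumed physical ranges, e.g., log-conductivity mean,
# variance, and correlation length (illustrative values only).
lower, upper = [-12.0, 0.1, 5.0], [-8.0, 2.0, 50.0]
params = qmc.scale(u, lower, upper)

# Each row parameterizes one Monte Carlo realization of the random field;
# repeating the procedure with different seeds supports the kind of
# convergence analysis the study describes.
```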

  15. Nutritional status and falls in community-dwelling older people: a longitudinal study of a population-based random sample.

    Directory of Open Access Journals (Sweden)

    Ming-Hung Chien

    Full Text Available Falls are common in older people and may lead to functional decline, disability, and death. Many risk factors have been identified, but studies evaluating the effects of nutritional status are limited. To determine whether nutritional status is a predictor of falls in older people living in the community, we analyzed data collected through the Survey of Health and Living Status of the Elderly in Taiwan (SHLSET). SHLSET comprises a series of interview surveys conducted by the government on a random sample of people living in community dwellings in the nation. We included participants who received a nutritional status assessment using the Mini Nutritional Assessment Taiwan Version 2 (MNA-T2) in the 1999 survey, when they were 53 years or older, and followed up on the cumulative incidence of falls in the one-year period before the interview in the 2003 survey. At the beginning of follow-up, the 4440 participants had a mean age of 69.5 (standard deviation = 9.1) years, and 467 participants were "not well-nourished," which was defined as having an MNA-T2 score of 23 or less. In the one-year study period, 659 participants reported having at least one fall. After adjusting for other risk factors, we found the associated odds ratio for falls was 1.73 (95% confidence interval, 1.23-2.42) for "not well-nourished," 1.57 (1.30-1.90) for female gender, 1.03 (1.02-1.04) for each additional year of age, 1.55 (1.22-1.98) for history of falls, 1.34 (1.05-1.72) for hospital stay during the past 12 months, 1.66 (1.07-2.58) for difficulties in activities of daily living, and 1.53 (1.23-1.91) for difficulties in instrumental activities of daily living. Nutritional status is an independent predictor of falls in older people living in the community. Further studies are warranted to identify nutritional interventions that can help prevent falls in the elderly.

  16. The relationship between external and internal validity of randomized controlled trials: A sample of hypertension trials from China

    Directory of Open Access Journals (Sweden)

    Xin Zhang

    2015-10-01

    Conclusion: Several components related to the external validity of RCTs do associate with the internal validity, although the two do not stand in a simple relationship to each other. Given the generally poor reporting, other possible links between the two variables need to be traced in future methodological research.

  17. Ultrasonic assisted dispersive solid-phase microextraction of Eriochrome Cyanine R from water sample on ultrasonically synthesized lead (II) dioxide nanoparticles loaded on activated carbon: Experimental design methodology.

    Science.gov (United States)

    Bahrani, Sonia; Ghaedi, Mehrorang; Mansoorkhani, Mohammad Javad Khoshnood; Asfaram, Arash; Bazrafshan, Ali Akbar; Purkait, Mihir Kumar

    2017-01-01

    The present research focuses on designing an appropriate ultrasound-assisted dispersive solid-phase microextraction (UA-DSPME) procedure for the preconcentration and determination of Eriochrome Cyanine R (ECR) in aqueous solutions, using lead (II) dioxide nanoparticles loaded on activated carbon (PbO-NPs-AC). This material was fully characterized by XRD and SEM. The influence of pH, amount of sorbent, type and volume of eluent, and sonication time on the response was investigated and optimized by central composite design (CCD) combined with response surface methodology using STATISTICA. Among different solvents, dimethyl sulfoxide (DMSO) was selected as an efficient eluent; its combination with the present nanoparticles and the application of ultrasound waves enhanced mass transfer. The predicted maximum extraction (100%) under the optimum conditions of the process variables, viz. pH 4.5, 200 μL of eluent, 2.5 mg of adsorbent and 5 min of sonication, was close to the experimental value (99.50%). Under optimum conditions, a wide linear range (5-2000 ng mL-1 ECR), a low detection limit (0.43 ng mL-1, S/N = 3:1) and good repeatability and reproducibility (relative standard deviation <5.5%, n = 12) indicate the successful applicability of the present method to real sample analysis. Investigation of accuracy by spiking known concentrations of ECR over 200-600 ng mL-1 gave mean recoveries from 94.85% to 101.42% under optimal conditions. The procedure was also applied to the preconcentration and subsequent determination of ECR in tap and waste waters. Copyright © 2016 Elsevier B.V. All rights reserved.
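
    For orientation, a central composite design for four coded factors can be generated in a few lines. The sketch below is our own (numpy only); it uses the standard rotatability choice for the axial distance and an assumed six center points, which is not necessarily the run layout used in the study, and the mapping of coded factors to real units (pH, eluent volume, sorbent mass, sonication time) is ours for illustration.

```python
import numpy as np
from itertools import product

def central_composite(k, alpha=None, n_center=6):
    """Coded design matrix of a central composite design for k factors."""
    if alpha is None:
        alpha = (2 ** k) ** 0.25       # rotatability criterion: alpha = nf^(1/4)
    factorial = np.array(list(product([-1.0, 1.0], repeat=k)))  # 2^k corner runs
    axial = np.vstack([alpha * np.eye(k), -alpha * np.eye(k)])  # 2k star runs
    center = np.zeros((n_center, k))                            # replicate centers
    return np.vstack([factorial, axial, center])

design = central_composite(k=4)
print(design.shape)   # (16 + 8 + 6, 4) = (30, 4) runs in coded units
```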

  18. The use of a lot quality assurance sampling methodology to assess and manage primary health interventions in conflict-affected West Darfur, Sudan.

    Science.gov (United States)

    Pham, Kiemanh; Sharpe, Emily Chambers; Weiss, William M; Vu, Alexander

    2016-01-01

    Organizations working in conflict-affected areas need to monitor and evaluate their programs; however, this is often difficult due to the logistical challenges of conflict areas. Lot quality assurance sampling may be a suitable method of assessing programs in these situations. We conducted a secondary data analysis of information collected during Medair's routine program management functions. Medair's service area in West Darfur, Sudan was divided into seven supervisory areas. Using the available population information, a sampling frame was developed and interviews were conducted with randomly selected caretakers of children in each supervisory area every six months over 19 months. A survey instrument with questions related to key indicators for immunizations and maternal, newborn, and child health was used for the interviews. Based on Medair's goals for each indicator, decision rules were calculated for the indicators; these decision rules determined which supervisory areas and indicators performed adequately in each assessment period. Pearson's chi-squared tests, adjusted for the survey design using STATA "svy: tab" commands, were used to detect overall differences in coverage in this analysis. The coverage of tetanus toxoid vaccination among pregnant women increased from 47.2 to 69.7 % (p value = 0.046), and births attended by a skilled health professional increased from 35.7 to 52.7 % (p value = 0.025) from the first to last assessment periods. Measles vaccinations declined from 72.0 to 54.1 % (p value = 0.046). The estimated coverage for the proportion of women receiving a postpartum dose of vitamin A (54.7 to 61.3 %, p value = 0.44); pregnant women receiving a clean delivery kit (54.6 to 47.1 %, p value = 0.49); and pentavalent vaccinations (49.7 to 42.1 %, p value = 0.28) did not significantly change. Lot quality assurance sampling was a feasible method for Medair staff to evaluate and optimize primary health programs
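
    The decision rules mentioned above come from binomial calculations. A small sketch of how such a rule can be derived with scipy is below; the sample size, coverage targets, and error limits are assumed for illustration, not Medair's actual parameters.

```python
from scipy.stats import binom

def lqas_rule(n, p_hi, p_lo, alpha=0.10, beta=0.10):
    """Smallest pass threshold d meeting both error constraints, if any.

    A supervisory area "passes" when at least d of n respondents report
    receiving the service.
    """
    for d in range(n + 1):
        miss_good = binom.cdf(d - 1, n, p_hi)      # reject an area at target coverage
        pass_bad = 1 - binom.cdf(d - 1, n, p_lo)   # accept an area below threshold
        if miss_good <= alpha and pass_bad <= beta:
            return d
    return None  # n too small for these constraints

# e.g., 19 caretakers per area, 80% target vs. 50% unacceptable coverage
print(lqas_rule(n=19, p_hi=0.80, p_lo=0.50))       # -> 13
```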

  19. Statistical Power and Optimum Sample Allocation Ratio for Treatment and Control Having Unequal Costs Per Unit of Randomization

    Science.gov (United States)

    Liu, Xiaofeng

    2003-01-01

    This article considers optimal sample allocation between the treatment and control condition in multilevel designs when the costs per sampling unit vary due to treatment assignment. Optimal unequal allocation may reduce the cost from that of a balanced design without sacrificing any power. The optimum sample allocation ratio depends only on the…
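
    The record is truncated mid-sentence. For context, the classical cost-constrained result in this literature, assuming equal outcome variances in the two arms, is the square-root allocation rule sketched below; this is our gloss, not a quotation from the article.

```latex
% With per-unit costs c_T and c_C in the treatment and control conditions,
% minimizing the variance of the estimated treatment effect under a fixed
% budget gives the allocation ratio
\[
  \frac{n_T}{n_C} \;=\; \sqrt{\frac{c_C}{c_T}} ,
\]
% so the cheaper condition receives proportionally more units, and equal
% costs recover the balanced design $n_T = n_C$.
```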

  20. Estimation of Daily Proteinuria in Patients with Amyloidosis by Using the Protein-To-Creatinine ratio in Random Urine Samples.

    Science.gov (United States)

    Talamo, Giampaolo; Mir Muhammad, A; Pandey, Manoj K; Zhu, Junjia; Creer, Michael H; Malysz, Jozef

    2015-02-11

    Measurement of daily proteinuria in patients with amyloidosis is recommended at the time of diagnosis for assessing renal involvement, and for monitoring disease activity. Renal involvement is usually defined by proteinuria >500 mg/day. We evaluated the accuracy of the random urine protein-to-creatinine ratio (Pr/Cr) in predicting 24 hour proteinuria in patients with amyloidosis. We compared results of random urine Pr/Cr ratio and concomitant 24-hour urine collections in 44 patients with amyloidosis. We found a strong correlation (Spearman's ρ=0.874) between the Pr/Cr ratio and the 24 hour urine protein excretion. For predicting renal involvement, the optimal cut-off point of the Pr/Cr ratio was 715 mg/g. The sensitivity and specificity for this point were 91.8% and 95.5%, respectively, and the area under the curve value was 97.4%. We conclude that the random urine Pr/Cr ratio could be useful in the screening of renal involvement in patients with amyloidosis. If validated in a prospective study, the random urine Pr/Cr ratio could replace the 24 hour urine collection for the assessment of daily proteinuria and presence of nephrotic syndrome in patients with amyloidosis.
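
    The reported cutoff, sensitivity, specificity, and area under the curve are standard ROC quantities. The sketch below shows how such numbers are obtained with sklearn on synthetic stand-in data; the Youden criterion is our assumption about how an optimal cutoff like 715 mg/g would be chosen, not a detail stated in the abstract.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(5)

# Synthetic stand-in for 44 paired measurements: random-urine Pr/Cr (mg/g)
# and a 24-hour proteinuria label (>500 mg/day = renal involvement).
involvement = rng.integers(0, 2, 44)
pr_cr = np.where(involvement == 1,
                 rng.lognormal(7.5, 0.6, 44),   # higher ratios if involved
                 rng.lognormal(5.5, 0.6, 44))

print("AUC:", roc_auc_score(involvement, pr_cr))

# Youden-optimal cutoff: maximize sensitivity + specificity - 1.
fpr, tpr, thresholds = roc_curve(involvement, pr_cr)
best = np.argmax(tpr - fpr)
print(f"cutoff {thresholds[best]:.0f} mg/g: "
      f"sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f}")
```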

  1. Estimation of daily proteinuria in patients with amyloidosis by using the protein-to-creatinine ratio in random urine sample

    Directory of Open Access Journals (Sweden)

    Giampaolo Talamo

    2015-02-01

    Full Text Available Measurement of daily proteinuria in patients with amyloidosis is recommended at the time of diagnosis for assessing renal involvement, and for monitoring disease activity. Renal involvement is usually defined by proteinuria >500 mg/day. We evaluated the accuracy of the random urine protein-to-creatinine ratio (Pr/Cr) in predicting 24 hour proteinuria in patients with amyloidosis. We compared results of random urine Pr/Cr ratio and concomitant 24-hour urine collections in 44 patients with amyloidosis. We found a strong correlation (Spearman's ρ=0.874) between the Pr/Cr ratio and the 24 hour urine protein excretion. For predicting renal involvement, the optimal cut-off point of the Pr/Cr ratio was 715 mg/g. The sensitivity and specificity for this point were 91.8% and 95.5%, respectively, and the area under the curve value was 97.4%. We conclude that the random urine Pr/Cr ratio could be useful in the screening of renal involvement in patients with amyloidosis. If validated in a prospective study, the random urine Pr/Cr ratio could replace the 24 hour urine collection for the assessment of daily proteinuria and presence of nephrotic syndrome in patients with amyloidosis.

  2. 40 CFR Appendix I to Subpart S of... - Vehicle Procurement Methodology

    Science.gov (United States)

    2010-07-01

    ... I to Subpart S of Part 86—Vehicle Procurement Methodology I. Test Sampling: The master owner list... randomized master owner list. The manufacturer or their representative shall perform the following steps: (a... order of their appearance on a randomized master owner list until the required number of vehicles are...

  3. Approximating the variance of estimated means for systematic random sampling, illustrated with data of the French Soil Monitoring Network

    NARCIS (Netherlands)

    Brus, D.J.; Saby, N.P.A.

    2016-01-01

    In France like in many other countries, the soil is monitored at the locations of a regular, square grid thus forming a systematic sample (SY). This sampling design leads to good spatial coverage, enhancing the precision of design-based estimates of spatial means and totals. Design-based

  4. The concentration of heavy metals: zinc, cadmium, lead, copper, mercury, iron and calcium in head hair of a randomly selected sample of Kenyan people

    International Nuclear Information System (INIS)

    Wandiga, S.O.; Jumba, I.O.

    1982-01-01

    An intercomparative analysis of the concentrations of the heavy metals zinc, cadmium, lead, copper, mercury, iron and calcium in the head hair of a randomly selected sample of Kenyan people has been undertaken using the techniques of atomic absorption spectrophotometry (AAS) and differential pulse anodic stripping voltammetry (DPAS). The percent relative standard deviation for each sample analysed using either of the techniques shows good sensitivity and correlation between the techniques. DPAS was found to be slightly more sensitive than the AAS instrument used. The recalculated body burden ratios of Cd to Zn and Pb to Fe reveal no unusual health impairment symptoms and suggest a relatively clean environment in Kenya. (author)

  5. A random cluster survey and a convenience sample give comparable estimates of immunity to vaccine preventable diseases in children of school age in Victoria, Australia.

    Science.gov (United States)

    Kelly, Heath; Riddell, Michaela A; Gidding, Heather F; Nolan, Terry; Gilbert, Gwendolyn L

    2002-08-19

    We compared estimates of the age-specific population immunity to measles, mumps, rubella, hepatitis B and varicella zoster viruses in Victorian school children obtained by a national sero-survey, using a convenience sample of residual sera from diagnostic laboratories throughout Australia, with those from a three-stage random cluster survey. When grouped according to school age (primary or secondary school) there was no significant difference in the estimates of immunity to measles, mumps, hepatitis B or varicella. Compared with the convenience sample, the random cluster survey estimated higher immunity to rubella in samples from both primary (98.7% versus 93.6%, P = 0.002) and secondary school students (98.4% versus 93.2%, P = 0.03). Despite some limitations, this study suggests that the collection of a convenience sample of sera from diagnostic laboratories is an appropriate sampling strategy to provide population immunity data that will inform Australia's current and future immunisation policies. Copyright 2002 Elsevier Science Ltd.

  6. Simultaneous escaping of explicit and hidden free energy barriers: application of the orthogonal space random walk strategy in generalized ensemble based conformational sampling.

    Science.gov (United States)

    Zheng, Lianqing; Chen, Mengen; Yang, Wei

    2009-06-21

    To overcome the pseudoergodicity problem, conformational sampling can be accelerated via generalized ensemble methods, e.g., through the realization of random walks along prechosen collective variables, such as spatial order parameters, energy scaling parameters, or even system temperatures or pressures. As usually observed in generalized ensemble simulations, hidden barriers are likely to exist in the space perpendicular to the collective variable direction, and these residual free energy barriers can greatly diminish sampling efficiency. This sampling issue is particularly severe when the collective variable is defined in a low-dimension subset of the target system; then the "Hamiltonian lagging" problem, in which necessary structural relaxation falls behind the motion of the collective variable, is likely to occur. To overcome this problem in equilibrium conformational sampling, we adopted the orthogonal space random walk (OSRW) strategy, which was originally developed in the context of free energy simulation [L. Zheng, M. Chen, and W. Yang, Proc. Natl. Acad. Sci. U.S.A. 105, 20227 (2008)]. Thereby, generalized ensemble simulations can simultaneously escape both the explicit barriers along the collective variable direction and the hidden barriers that are strongly coupled with the collective variable move. As demonstrated in our model studies, the present OSRW based generalized ensemble treatments show improved sampling capability over the corresponding classical generalized ensemble treatments.
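
    As a generic illustration of the generalized-ensemble idea the abstract builds on (depositing a running bias so the system random-walks along a collective variable), here is a metadynamics-flavored toy in one dimension. This is our sketch of flat-histogram sampling on a double well, not an implementation of OSRW, which additionally biases the orthogonal generalized force; the potential, bin layout, and schedule are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

def U(x):
    return 2.0 * (x**2 - 1.0)**2       # double well: explicit barrier at x = 0

edges = np.linspace(-2.0, 2.0, 41)
penalty = np.zeros(len(edges) - 1)     # accumulated bias along x
bin_of = lambda x: min(np.searchsorted(edges, x) - 1, len(penalty) - 1)

x, f = -1.0, 0.05
for step in range(200_000):
    xn = x + rng.normal(scale=0.1)
    if -2.0 < xn < 2.0:
        # Metropolis test on the biased potential U(x) + penalty(x).
        dV = (U(xn) + penalty[bin_of(xn)]) - (U(x) + penalty[bin_of(x)])
        if dV <= 0 or rng.random() < np.exp(-dV):
            x = xn
    penalty[bin_of(x)] += f            # push the walker out of visited bins
    if step % 20_000 == 19_999:
        f *= 0.5                       # anneal the deposition rate

# At convergence the walk is near-flat in x and, up to a constant,
# the free energy profile is recovered as:
free_energy = penalty.max() - penalty
```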

  7. Differentiating emotions across contexts: comparing adults with and without social anxiety disorder using random, social interaction, and daily experience sampling.

    Science.gov (United States)

    Kashdan, Todd B; Farmer, Antonina S

    2014-06-01

    The ability to recognize and label emotional experiences has been associated with well-being and adaptive functioning. This skill is particularly important in social situations, as emotions provide information about the state of relationships and help guide interpersonal decisions, such as whether to disclose personal information. Given the interpersonal difficulties linked to social anxiety disorder (SAD), deficient negative emotion differentiation may contribute to impairment in this population. We hypothesized that people with SAD would exhibit less negative emotion differentiation in daily life, and these differences would translate to impairment in social functioning. We recruited 43 people diagnosed with generalized SAD and 43 healthy adults to describe the emotions they experienced over 14 days. Participants received palmtop computers for responding to random prompts and describing naturalistic social interactions; to complete end-of-day diary entries, they used a secure online website. We calculated intraclass correlation coefficients to capture the degree of differentiation of negative and positive emotions for each context (random moments, face-to-face social interactions, and end-of-day reflections). Compared to healthy controls, the SAD group exhibited less negative (but not positive) emotion differentiation during random prompts, social interactions, and (at trend level) end-of-day assessments. These differences could not be explained by emotion intensity or variability over the 14 days, or to comorbid depression or anxiety disorders. Our findings suggest that people with generalized SAD have deficits in clarifying specific negative emotions felt at a given point of time. These deficits may contribute to difficulties with effective emotion regulation and healthy social relationship functioning.
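
    Emotion differentiation in this literature is typically quantified with an intraclass correlation computed across emotion items within a reporting context, with lower ICC indicating sharper differentiation. A minimal numpy sketch of one common variant, the single-measures consistency ICC(3,1), is below; this is our illustration, since the abstract does not specify the exact ICC form or code.

```python
import numpy as np

def icc_consistency(X):
    """Two-way mixed, single-measures consistency ICC, i.e. ICC(3,1).

    X: moments x emotion-items matrix of intensity ratings. A high ICC means
    the emotions rise and fall together, i.e. LOW differentiation.
    """
    n, k = X.shape
    grand = X.mean()
    ms_rows = k * ((X.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ss_err = ((X - X.mean(axis=1, keepdims=True)
                 - X.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

rng = np.random.default_rng(8)
# Hypothetical data: 30 random prompts x 5 negative-emotion ratings (1-5).
ratings = rng.integers(1, 6, size=(30, 5)).astype(float)
print("ICC (higher = less differentiated):", icc_consistency(ratings))
```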

  8. Differentiating Emotions Across Contexts: Comparing Adults with and without Social Anxiety Disorder Using Random, Social Interaction, and Daily Experience Sampling

    Science.gov (United States)

    Kashdan, Todd B.; Farmer, Antonina S.

    2014-01-01

    The ability to recognize and label emotional experiences has been associated with well-being and adaptive functioning. This skill is particularly important in social situations, as emotions provide information about the state of relationships and help guide interpersonal decisions, such as whether to disclose personal information. Given the interpersonal difficulties linked to social anxiety disorder (SAD), deficient negative emotion differentiation may contribute to impairment in this population. We hypothesized that people with SAD would exhibit less negative emotion differentiation in daily life, and these differences would translate to impairment in social functioning. We recruited 43 people diagnosed with generalized SAD and 43 healthy adults to describe the emotions they experienced over 14 days. Participants received palmtop computers for responding to random prompts and describing naturalistic social interactions; to complete end-of-day diary entries, they used a secure online website. We calculated intraclass correlation coefficients to capture the degree of differentiation of negative and positive emotions for each context (random moments, face-to-face social interactions, and end-of-day reflections). Compared to healthy controls, the SAD group exhibited less negative (but not positive) emotion differentiation during random prompts, social interactions, and (at trend level) end-of-day assessments. These differences could not be explained by emotion intensity or variability over the 14 days, or to comorbid depression or anxiety disorders. Our findings suggest that people with generalized SAD have deficits in clarifying specific negative emotions felt at a given point of time. These deficits may contribute to difficulties with effective emotion regulation and healthy social relationship functioning. PMID:24512246

  9. Innovative Methodologies for 21st Century Learning, Teaching and Assessment: A Convenience Sampling Investigation into the Use of Social Media Technologies in Higher Education

    Science.gov (United States)

    Kivunja, Charles

    2015-01-01

    The advent of the Web as a social technology has created opportunities for the creation of informal learning environments, which have potential for innovative methodologies in learning, teaching and assessment. However, as Wolfe (2001) admonishes, "contrary to the rhetoric of cheerleaders, the Web places greater demands on students than…

  10. Reconstructing random media

    International Nuclear Information System (INIS)

    Yeong, C.L.; Torquato, S.

    1998-01-01

    We formulate a procedure to reconstruct the structure of general random heterogeneous media from limited morphological information by extending the methodology of Rintoul and Torquato [J. Colloid Interface Sci. 186, 467 (1997)] developed for dispersions. The procedure has the advantages that it is simple to implement and generally applicable to multidimensional, multiphase, and anisotropic structures. Furthermore, an extremely useful feature is that it can incorporate any type and number of correlation functions in order to provide as much morphological information as is necessary for accurate reconstruction. We consider a variety of one- and two-dimensional reconstructions, including periodic and random arrays of rods, various distributions of disks, Debye random media, and a Fontainebleau sandstone sample. We also use our algorithm to construct heterogeneous media from specified hypothetical correlation functions, including an exponentially damped, oscillating function as well as physically unrealizable ones. copyright 1998 The American Physical Society
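
    The reconstruction procedure is stochastic optimization: anneal pixel swaps, which preserve the volume fraction, until a chosen correlation function of the trial image matches the target's. A compact numpy sketch in this spirit is below; it is our toy, matching the full two-point autocorrelation of a small binary image, whereas the paper samples correlation functions more economically and supports additional function types.

```python
import numpy as np

rng = np.random.default_rng(9)

def s2(img):
    """Two-point probability function of phase 1, via FFT autocorrelation."""
    f = np.fft.fft2(img)
    return np.fft.ifft2(f * np.conj(f)).real / img.size

# Target: a synthetic binary medium whose S2 we try to match from scratch.
target = (rng.random((64, 64)) < 0.3).astype(float)
S2_target = s2(target)

# Yeong-Torquato-style annealing: swap a 1-pixel with a 0-pixel (preserving
# volume fraction) and accept by Metropolis on the S2-mismatch "energy".
recon = rng.permutation(target.ravel()).reshape(target.shape)
energy = ((s2(recon) - S2_target) ** 2).sum()
T = 1e-4
for step in range(20_000):
    ones, zeros = np.argwhere(recon == 1), np.argwhere(recon == 0)
    i = tuple(ones[rng.integers(len(ones))])
    j = tuple(zeros[rng.integers(len(zeros))])
    recon[i], recon[j] = 0.0, 1.0                  # trial swap
    e_new = ((s2(recon) - S2_target) ** 2).sum()
    if e_new < energy or rng.random() < np.exp((energy - e_new) / T):
        energy = e_new                             # accept
    else:
        recon[i], recon[j] = 1.0, 0.0              # revert
    T *= 0.9995                                    # cooling schedule
```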

  11. A randomized trial of a DWI intervention program for first offenders: intervention outcomes and interactions with antisocial personality disorder among a primarily American-Indian sample.

    Science.gov (United States)

    Woodall, W Gill; Delaney, Harold D; Kunitz, Stephen J; Westerberg, Verner S; Zhao, Hongwei

    2007-06-01

    Randomized trial evidence on the effectiveness of incarceration and treatment of first-time driving while intoxicated (DWI) offenders who are primarily American Indian has yet to be reported in the literature on DWI prevention. Further, research has confirmed the association of antisocial personality disorder (ASPD) with problems with alcohol including DWI. A randomized clinical trial was conducted, in conjunction with 28 days of incarceration, of a treatment program incorporating motivational interviewing principles for first-time DWI offenders. The sample of 305 offenders including 52 diagnosed as ASPD by the Diagnostic Interview Schedule were assessed before assignment to conditions and at 6, 12, and 24 months after discharge. Self-reported frequency of drinking and driving as well as various measures of drinking over the preceding 90 days were available at all assessments for 244 participants. Further, DWI rearrest data for 274 participants were available for analysis. Participants randomized to receive the first offender incarceration and treatment program reported greater reductions in alcohol consumption from baseline levels when compared with participants who were only incarcerated. Antisocial personality disorder participants reported heavier and more frequent drinking but showed significantly greater declines in drinking from intake to posttreatment assessments. Further, the treatment resulted in larger effects relative to the control on ASPD than non-ASPD participants. Nonconfrontational treatment may significantly enhance outcomes for DWI offenders with ASPD when delivered in an incarcerated setting, and in the present study, such effects were found in a primarily American-Indian sample.

  12. Assessing differences in groups randomized by recruitment chain in a respondent-driven sample of Seattle-area injection drug users.

    Science.gov (United States)

    Burt, Richard D; Thiede, Hanne

    2014-11-01

    Respondent-driven sampling (RDS) is a form of peer-based study recruitment and analysis that incorporates features designed to limit and adjust for biases in traditional snowball sampling. It is being widely used in studies of hidden populations. We report an empirical evaluation of RDS's consistency and variability, comparing groups recruited contemporaneously, by identical methods and using identical survey instruments. We randomized recruitment chains from the RDS-based 2012 National HIV Behavioral Surveillance survey of injection drug users in the Seattle area into two groups and compared them in terms of sociodemographic characteristics, drug-associated risk behaviors, sexual risk behaviors, human immunodeficiency virus (HIV) status and HIV testing frequency. The two groups differed in five of the 18 variables examined (P ≤ .001): race (e.g., 60% white vs. 47%), gender (52% male vs. 67%), area of residence (32% downtown Seattle vs. 44%), an HIV test in the previous 12 months (51% vs. 38%). The difference in serologic HIV status was particularly pronounced (4% positive vs. 18%). In four further randomizations, differences in one to five variables attained this level of significance, although the specific variables involved differed. We found some material differences between the randomized groups. Although the variability of the present study was less than has been reported in serial RDS surveys, these findings indicate caution in the interpretation of RDS results. Copyright © 2014 Elsevier Inc. All rights reserved.

  13. Computed tomography of the brain, hepatotoxic drugs and high alcohol consumption in male alcoholic patients and a random sample from the general male population

    Energy Technology Data Exchange (ETDEWEB)

    Muetzell, S. (Univ. Hospital of Uppsala (Sweden). Dept. of Family Medicine)

    1992-01-01

Computed tomography (CT) of the brain was performed in a random sample of 195 men and in 211 male alcoholic patients admitted for the first time during a period of two years, drawn from the same geographically limited area of Greater Stockholm as the sample. Laboratory tests were performed, including liver and pancreatic tests. Toxicological screening was performed, and the consumption of hepatotoxic drugs was also investigated. The groups were then subdivided with respect to alcohol consumption and use of hepatotoxic drugs: group IA, men from the random sample with low or moderate alcohol consumption and no use of hepatotoxic drugs; IB, men from the random sample with low or moderate alcohol consumption with use of hepatotoxic drugs; IIA, alcoholic inpatients with use of alcohol and no drugs; and IIB, alcoholic inpatients with use of alcohol and drugs. Group IIB was found to have a higher incidence of cortical and subcortical changes than group IA. Group IB had a higher incidence of subcortical changes than group IA, and the two groups differed only in drug use. Groups IIB and IIA likewise differed only in drug use, and IIB had a higher incidence of brain damage on all measures except the anterior horn index and wide cerebellar sulci indicating vermian atrophy. Significantly higher serum levels of bilirubin, GGT, ASAT, ALAT, CK, LD, and amylase were found in IIB. The results indicate that drug use influences the incidence of cortical and subcortical aberrations, except for the anterior horn index. It is concluded that the groups with alcohol abuse who used hepatotoxic drugs showed a picture of cortical changes (wide transport sulci and clear-cut high-grade cortical changes) and also of subcortical aberrations, expressed as an increased widening of the third ventricle.

  14. Computed tomography of the brain, hepatotoxic drugs and high alcohol consumption in male alcoholic patients and a random sample from the general male population

    International Nuclear Information System (INIS)

    Muetzell, S.

    1992-01-01

Computed tomography (CT) of the brain was performed in a random sample of 195 men and in 211 male alcoholic patients admitted for the first time during a period of two years, drawn from the same geographically limited area of Greater Stockholm as the sample. Laboratory tests were performed, including liver and pancreatic tests. Toxicological screening was performed, and the consumption of hepatotoxic drugs was also investigated. The groups were then subdivided with respect to alcohol consumption and use of hepatotoxic drugs: group IA, men from the random sample with low or moderate alcohol consumption and no use of hepatotoxic drugs; IB, men from the random sample with low or moderate alcohol consumption with use of hepatotoxic drugs; IIA, alcoholic inpatients with use of alcohol and no drugs; and IIB, alcoholic inpatients with use of alcohol and drugs. Group IIB was found to have a higher incidence of cortical and subcortical changes than group IA. Group IB had a higher incidence of subcortical changes than group IA, and the two groups differed only in drug use. Groups IIB and IIA likewise differed only in drug use, and IIB had a higher incidence of brain damage on all measures except the anterior horn index and wide cerebellar sulci indicating vermian atrophy. Significantly higher serum levels of bilirubin, GGT, ASAT, ALAT, CK, LD, and amylase were found in IIB. The results indicate that drug use influences the incidence of cortical and subcortical aberrations, except for the anterior horn index. It is concluded that the groups with alcohol abuse who used hepatotoxic drugs showed a picture of cortical changes (wide transport sulci and clear-cut high-grade cortical changes) and also of subcortical aberrations, expressed as an increased widening of the third ventricle.

  15. Open-Label Randomized Trial of Titrated Disease Management for Patients with Hypertension: Study Design and Baseline Sample Characteristics

    Science.gov (United States)

    Jackson, George L.; Weinberger, Morris; Kirshner, Miriam A.; Stechuchak, Karen M.; Melnyk, Stephanie D.; Bosworth, Hayden B.; Coffman, Cynthia J.; Neelon, Brian; Van Houtven, Courtney; Gentry, Pamela W.; Morris, Isis J.; Rose, Cynthia M.; Taylor, Jennifer P.; May, Carrie L.; Han, Byungjoo; Wainwright, Christi; Alkon, Aviel; Powell, Lesa; Edelman, David

    2016-01-01

Despite the availability of efficacious treatments, only half of patients with hypertension achieve adequate blood pressure (BP) control. This paper describes the protocol and baseline subject characteristics of a 2-arm, 18-month randomized clinical trial of titrated disease management (TDM) for patients with pharmaceutically-treated hypertension whose systolic blood pressure (SBP) is not controlled (≥140 mm Hg for non-diabetic or ≥130 mm Hg for diabetic patients). The trial is being conducted among patients of four clinic locations associated with a Veterans Affairs Medical Center. The intervention arm uses a TDM strategy in which patients' hypertension control at baseline, 6, and 12 months determines the resource intensity of disease management. Intensity levels include: a low-intensity strategy utilizing a licensed practical nurse to provide bi-monthly, non-tailored behavioral support calls to patients whose SBP comes under control; a medium-intensity strategy utilizing a registered nurse to provide monthly tailored behavioral support telephone calls plus home BP monitoring; and a high-intensity strategy utilizing a pharmacist to provide monthly tailored behavioral support telephone calls, home BP monitoring, and pharmacist-directed medication management. Control arm patients receive the low-intensity strategy regardless of BP control. The primary outcome is SBP. The 385 randomized veterans (192 intervention; 193 control) are predominantly older (mean age 63.5 years) men (92.5%); 61.8% are African American, and the mean baseline SBP for all subjects is 143.6 mm Hg. This trial will determine whether a disease management program that is titrated by matching the intensity of resources to patients' BP control leads to superior outcomes compared with a low-intensity management strategy. PMID:27417982

  16. Predictors of poor retention on antiretroviral therapy as a major HIV drug resistance early warning indicator in Cameroon: results from a nationwide systematic random sampling.

    Science.gov (United States)

    Billong, Serge Clotaire; Fokam, Joseph; Penda, Calixte Ida; Amadou, Salmon; Kob, David Same; Billong, Edson-Joan; Colizzi, Vittorio; Ndjolo, Alexis; Bisseck, Anne-Cecile Zoung-Kani; Elat, Jean-Bosco Nfetam

    2016-11-15

Retention on lifelong antiretroviral therapy (ART) is essential in sustaining treatment success while preventing HIV drug resistance (HIVDR), especially in resource-limited settings (RLS). In an era of rising numbers of patients on ART, mastering patients in care is becoming more strategic for programmatic interventions. Due to lapses and uncertainty with the current WHO sampling approach in Cameroon, we aimed to ascertain the national performance of, and determinants in, retention on ART at 12 months. Using a systematic random sampling, a survey was conducted in the ten regions (56 sites) of Cameroon, within the "reporting period" of October 2013-November 2014, enrolling 5005 eligible adults and children. Performance in retention on ART at 12 months was interpreted following the definition of the HIVDR early warning indicator: excellent (>85%), fair (85-75%), or poor (<75%). The findings suggest that the sampling strategy could be further strengthened for informed ART monitoring and HIVDR prevention perspectives.

  17. An econometric method for estimating population parameters from non-random samples: An application to clinical case finding.

    Science.gov (United States)

    Burger, Rulof P; McLaren, Zoë M

    2017-09-01

    The problem of sample selection complicates the process of drawing inference about populations. Selective sampling arises in many real world situations when agents such as doctors and customs officials search for targets with high values of a characteristic. We propose a new method for estimating population characteristics from these types of selected samples. We develop a model that captures key features of the agent's sampling decision. We use a generalized method of moments with instrumental variables and maximum likelihood to estimate the population prevalence of the characteristic of interest and the agents' accuracy in identifying targets. We apply this method to tuberculosis (TB), which is the leading infectious disease cause of death worldwide. We use a national database of TB test data from South Africa to examine testing for multidrug resistant TB (MDR-TB). Approximately one quarter of MDR-TB cases was undiagnosed between 2004 and 2010. The official estimate of 2.5% is therefore too low, and MDR-TB prevalence is as high as 3.5%. Signal-to-noise ratios are estimated to be between 0.5 and 1. Our approach is widely applicable because of the availability of routinely collected data and abundance of potential instruments. Using routinely collected data to monitor population prevalence can guide evidence-based policy making. Copyright © 2017 John Wiley & Sons, Ltd.

  18. Randomized controlled trial of endoscopic ultrasound-guided fine-needle sampling with or without suction for better cytological diagnosis

    DEFF Research Database (Denmark)

    Puri, Rajesh; Vilmann, Peter; Saftoiu, Adrian

    2009-01-01

… The samples were characterized for cellularity and bloodiness, with a final cytology diagnosis established blindly. The final diagnosis was reached either by EUS-FNA if malignancy was definite, or by surgery and/or clinical follow-up of a minimum of 6 months in the cases of non-specific benign lesions…

  19. Rationale, design and methodology of a double-blind, randomized, placebo-controlled study of escitalopram in prevention of Depression in Acute Coronary Syndrome (DECARD)

    DEFF Research Database (Denmark)

    Hansen, Baiba Hedegaard; Hanash, Jamal Abed; Rasmussen, Alice

    2009-01-01

… with acute coronary syndrome. METHODS: Two hundred forty non-depressed patients with acute coronary syndrome are randomized to treatment with either escitalopram or placebo for 1 year. Psychiatric and cardiac assessment of patients is performed to evaluate the possibility of preventing depression. Diagnosis…

  20. Aerobic physical activity and resistance training: an application of the theory of planned behavior among adults with type 2 diabetes in a random, national sample of Canadians

    Directory of Open Access Journals (Sweden)

    Karunamuni Nandini

    2008-12-01

Full Text Available Abstract Background Aerobic physical activity (PA) and resistance training are paramount in the treatment and management of type 2 diabetes (T2D), but few studies have examined the determinants of both types of exercise in the same sample. Objective The primary purpose was to investigate the utility of the Theory of Planned Behavior (TPB) in explaining aerobic PA and resistance training in a population sample of T2D adults. Methods A total of 244 individuals were recruited through a random national sample, which was created by generating a random list of household phone numbers. The list was proportionate to the actual number of household telephone numbers for each Canadian province (with the exception of Quebec). These individuals completed self-report TPB constructs of attitude, subjective norm, perceived behavioral control and intention, and a 3-month follow-up that assessed aerobic PA and resistance training. Results The TPB explained 10% and 8% of the variance, respectively, for aerobic PA and resistance training, and accounted for 39% and 45% of the variance, respectively, for aerobic PA and resistance training intentions. Conclusion These results may guide the development of appropriate PA interventions for aerobic PA and resistance training based on the TPB.

  1. Detection of cytomegalovirus in blood donors by PCR using the digene SHARP signal system assay: effects of sample preparation and detection methodology.

    OpenAIRE

    Krajden, M; Shankaran, P; Bourke, C; Lau, W

    1996-01-01

    Cytomegalovirus (CMV) is an important cause of transfusion-associated morbidity and mortality; however, only 0.4 to 12% of the blood products obtained from seropositive blood donors transmit infection. The effects of three commercially available whole-blood sample preparation kits on the detection of CMV PCR products by a semiquantitative adaptation of the Digene SHARP Signal System Assay (DSSSA) in samples from volunteer blood donors was assessed. Of 101 samples from seropositive blood donor...

  2. Ventilatory Function in Relation to Mining Experience and Smoking in a Random Sample of Miners and Non-miners in a Witwatersrand Town1

    Science.gov (United States)

    Sluis-Cremer, G. K.; Walters, L. G.; Sichel, H. S.

    1967-01-01

    The ventilatory capacity of a random sample of men over the age of 35 years in the town of Carletonville was estimated by the forced expiratory volume and the peak expiratory flow rate. Five hundred and sixty-two persons were working or had worked in gold-mines and 265 had never worked in gold-mines. No difference in ventilatory function was found between the miners and non-miners other than that due to the excess of chronic bronchitis in miners. PMID:6017134

  3. A Proposal of Estimation Methodology to Improve Calculation Efficiency of Sampling-based Method in Nuclear Data Sensitivity and Uncertainty Analysis

    International Nuclear Information System (INIS)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man

    2014-01-01

The uncertainty with the sampling-based method is evaluated by repeating transport calculations with a number of cross section data sets sampled from the covariance uncertainty data. In a transport calculation with the sampling-based method, the transport equation is not modified; therefore, all uncertainties of the responses, such as k-eff, reaction rates, flux and power distribution, can be obtained directly, all at one time, without code modification. However, a major drawback of the sampling-based method is that it requires a heavy computational load to obtain statistically reliable results (within the 0.95 confidence level) in the uncertainty analysis. The purpose of this study is to develop a method for improving the computational efficiency and obtaining highly reliable uncertainty results when using the sampling-based method with Monte Carlo simulation. The proposed method reduces the convergence time of the response uncertainty by using multiple sets of sampled group cross sections in a single Monte Carlo simulation. The proposed method was verified by estimating the GODIVA benchmark problem, and the results were compared with those of the conventional sampling-based method. In this study, a sampling-based method based on the central limit theorem is proposed to improve calculation efficiency by reducing the number of repetitive Monte Carlo transport calculations required to obtain reliable uncertainty analysis results. Each set of sampled group cross sections is assigned to an active cycle group in a single Monte Carlo simulation. The criticality uncertainty for the GODIVA problem is evaluated by the proposed and the previous method. The results show that the proposed sampling-based method can efficiently decrease the number of Monte Carlo simulations required to evaluate the uncertainty of k-eff. It is expected that the proposed method will improve the computational efficiency of uncertainty analysis with sampling-based methods.
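The core of any sampling-based uncertainty propagation can be demonstrated with a toy response in place of the transport calculation. In this hedged sketch, `mean`, `cov`, and the `response` function are invented stand-ins: parameter sets are drawn from hypothetical covariance data and pushed through the model, and the spread of the responses estimates the nuclear-data-induced uncertainty. The expensive part is one model run per sample, which is exactly the cost the paper's multiple-sets-per-simulation idea attacks.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "nuclear data": two group cross sections with a hypothetical covariance matrix.
mean = np.array([1.50, 0.80])
cov = np.array([[0.010, 0.004],
                [0.004, 0.020]])

def response(xs):
    """Toy stand-in for a transport calculation returning k-eff."""
    return xs[0] / (xs[0] + xs[1])

# Conventional sampling-based method: one sampled cross-section set per (expensive) run.
samples = rng.multivariate_normal(mean, cov, size=500)
k = np.array([response(s) for s in samples])

k_mean, k_std = k.mean(), k.std(ddof=1)
# A CLT-based interval on the estimated standard deviation itself, since the
# reliability of the uncertainty estimate is what limits how few runs suffice.
se_std = k_std / np.sqrt(2 * (k.size - 1))
print(f"k-eff = {k_mean:.5f} +/- {k_std:.5f} (std known to ~{1.96 * se_std:.5f})")
```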

  4. Job strain and resting heart rate: a cross-sectional study in a Swedish random working sample

    Directory of Open Access Journals (Sweden)

    Peter Eriksson

    2016-03-01

Full Text Available Abstract Background Numerous studies have reported an association between stressful working conditions and cardiovascular disease. However, more evidence is needed, and the etiological mechanisms are unknown. Elevated resting heart rate has emerged as a possible risk factor for cardiovascular disease, but little is known about its relation to work-related stress. This study therefore investigated the association between job strain, job control, and job demands and resting heart rate. Methods We conducted a cross-sectional survey of randomly selected men and women in Västra Götalandsregionen, Sweden (the West county of Sweden; n = 1552). Information about job strain, job demands, job control, heart rate and covariates was collected during the period 2001-2004 as part of the INTERGENE/ADONIX research project. Six different linear regression models were used, with adjustments for gender, age, BMI, smoking, education, and physical activity in the fully adjusted model. Job strain was operationalized as the log-transformed ratio of job demands over job control in the statistical analyses. Results No associations were seen between resting heart rate and job demands. Job strain was associated with elevated resting heart rate in the unadjusted model (linear regression coefficient 1.26, 95% CI 0.14 to 2.38), but not in any of the extended models. Low job control was associated with elevated resting heart rate after adjustments for gender, age, BMI, and smoking (linear regression coefficient -0.18, 95% CI -0.30 to -0.02). However, there were no significant associations in the fully adjusted model. Conclusions Low job control and job strain, but not job demands, were associated with elevated resting heart rate. However, the observed associations were modest and may be explained by confounding effects.

  5. Reducing Eating Disorder Onset in a Very High Risk Sample with Significant Comorbid Depression: A Randomized Controlled Trial

    Science.gov (United States)

    Taylor, C. Barr; Kass, Andrea E.; Trockel, Mickey; Cunning, Darby; Weisman, Hannah; Bailey, Jakki; Sinton, Meghan; Aspen, Vandana; Schecthman, Kenneth; Jacobi, Corinna; Wilfley, Denise E.

    2015-01-01

Objective Eating disorders (EDs) are serious problems among college-age women and may be preventable. An indicated on-line eating disorder (ED) intervention, designed to reduce ED and comorbid pathology, was evaluated. Method 206 women (M age = 20 ± 1.8 years; 51% White/Caucasian, 11% African American, 10% Hispanic, 21% Asian/Asian American, 7% other) at very high risk for ED onset (i.e., with high weight/shape concerns plus a history of being teased, current or lifetime depression, and/or non-clinical levels of compensatory behaviors) were randomized to a 10-week, Internet-based, cognitive-behavioral intervention or wait-list control. Assessments included the Eating Disorder Examination (EDE; to assess ED onset), EDE-Questionnaire, Structured Clinical Interview for DSM Disorders, and Beck Depression Inventory-II. Results ED attitudes and behaviors improved more in the intervention than control group (p = 0.02, d = 0.31); although ED onset rate was 27% lower, this difference was not significant (p = 0.28, NNT = 15). In the subgroup with highest shape concerns, ED onset rate was significantly lower in the intervention than control group (20% versus 42%, p = 0.025, NNT = 5). For the 27 individuals with depression at baseline, depressive symptomatology improved more in the intervention than control group (p = 0.016, d = 0.96); although ED onset rate was lower in the intervention than control group, this difference was not significant (25% versus 57%, NNT = 4). Conclusions An inexpensive, easily disseminated intervention might reduce ED onset among those at highest risk. Low adoption rates need to be addressed in future research. PMID:26795936

  6. Reducing eating disorder onset in a very high risk sample with significant comorbid depression: A randomized controlled trial.

    Science.gov (United States)

    Taylor, C Barr; Kass, Andrea E; Trockel, Mickey; Cunning, Darby; Weisman, Hannah; Bailey, Jakki; Sinton, Meghan; Aspen, Vandana; Schecthman, Kenneth; Jacobi, Corinna; Wilfley, Denise E

    2016-05-01

Eating disorders (EDs) are serious problems among college-age women and may be preventable. An indicated online eating disorder (ED) intervention, designed to reduce ED and comorbid pathology, was evaluated. 206 women (M age = 20 ± 1.8 years; 51% White/Caucasian, 11% African American, 10% Hispanic, 21% Asian/Asian American, 7% other) at very high risk for ED onset (i.e., with high weight/shape concerns plus a history of being teased, current or lifetime depression, and/or nonclinical levels of compensatory behaviors) were randomized to a 10-week, Internet-based, cognitive-behavioral intervention or waitlist control. Assessments included the Eating Disorder Examination (EDE, to assess ED onset), EDE-Questionnaire, Structured Clinical Interview for DSM Disorders, and Beck Depression Inventory-II. ED attitudes and behaviors improved more in the intervention than control group (p = .02, d = 0.31); although ED onset rate was 27% lower, this difference was not significant (p = .28, NNT = 15). In the subgroup with highest shape concerns, ED onset rate was significantly lower in the intervention than control group (20% vs. 42%, p = .025, NNT = 5). For the 27 individuals with depression at baseline, depressive symptomatology improved more in the intervention than control group (p = .016, d = 0.96); although ED onset rate was lower in the intervention than control group, this difference was not significant (25% vs. 57%, NNT = 4). An inexpensive, easily disseminated intervention might reduce ED onset among those at highest risk. Low adoption rates need to be addressed in future research. (c) 2016 APA, all rights reserved.

  7. Undergraduate student drinking and related harms at an Australian university: web-based survey of a large random sample

    Directory of Open Access Journals (Sweden)

    Hallett Jonathan

    2012-01-01

Full Text Available Abstract Background There is considerable interest in university student hazardous drinking among the media and policy makers. However, there have been no population-based studies in Australia to date. We sought to estimate the prevalence and correlates of hazardous drinking and secondhand effects among undergraduates at a Western Australian university. Method We invited 13,000 randomly selected undergraduate students from a commuter university in Australia to participate in an online survey of university drinking. Responses were received from 7,237 students (56%), who served as participants in this study. Results Ninety percent had consumed alcohol in the last 12 months and 34% met criteria for hazardous drinking (AUDIT score ≥ 8 and more than 6 standard drinks in one sitting in the previous month). Men and Australian/New Zealand residents had significantly increased odds (OR: 2.1; 95% CI: 1.9-2.3 and OR: 5.2; 95% CI: 4.4-6.2, respectively) of being categorised as dependent (AUDIT score 20 or over) compared with women and non-residents. In the previous 4 weeks, 13% of students had been insulted or humiliated and 6% had been pushed, hit or otherwise assaulted by others who were drinking. One percent of respondents had experienced sexual assault in this time period. Conclusions Half of men and over a third of women were drinking at hazardous levels, and a relatively large proportion of students were negatively affected by their own and other students' drinking. There is a need for intervention to reduce hazardous drinking early in university participation. Trial registration ACTRN12608000104358

  8. Sampling and assessment accuracy in mate choice: a random-walk model of information processing in mating decision.

    Science.gov (United States)

    Castellano, Sergio; Cermelli, Paolo

    2011-04-07

    Mate choice depends on mating preferences and on the manner in which mate-quality information is acquired and used to make decisions. We present a model that describes how these two components of mating decision interact with each other during a comparative evaluation of prospective mates. The model, with its well-explored precedents in psychology and neurophysiology, assumes that decisions are made by the integration over time of noisy information until a stopping-rule criterion is reached. Due to this informational approach, the model builds a coherent theoretical framework for developing an integrated view of functions and mechanisms of mating decisions. From a functional point of view, the model allows us to investigate speed-accuracy tradeoffs in mating decision at both population and individual levels. It shows that, under strong time constraints, decision makers are expected to make fast and frugal decisions and to optimally trade off population-sampling accuracy (i.e. the number of sampled males) against individual-assessment accuracy (i.e. the time spent for evaluating each mate). From the proximate-mechanism point of view, the model makes testable predictions on the interactions of mating preferences and choosiness in different contexts and it might be of compelling empirical utility for a context-independent description of mating preference strength. Copyright © 2011 Elsevier Ltd. All rights reserved.
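A minimal simulation of the integrate-to-threshold decision process that this class of models builds on makes the speed-accuracy tradeoff visible. This is a sketch of the general drift-diffusion idea rather than the authors' exact formulation; `drift_scale`, `noise`, and the thresholds are arbitrary toy values.

```python
import numpy as np

rng = np.random.default_rng(2)

def assess(quality, threshold, noise=1.0, drift_scale=0.1, max_steps=10_000):
    """Integrate noisy quality cues until the evidence crosses +/- threshold.
    Returns (accepted?, number of cues sampled before deciding)."""
    evidence, steps = 0.0, 0
    while abs(evidence) < threshold and steps < max_steps:
        evidence += drift_scale * quality + noise * rng.normal()
        steps += 1
    return evidence >= threshold, steps

# Higher stopping thresholds give slower but more accurate assessments of a good male.
for threshold in (1.0, 3.0, 9.0):
    trials = [assess(quality=0.5, threshold=threshold) for _ in range(2000)]
    p_accept = np.mean([accepted for accepted, _ in trials])
    mean_steps = np.mean([steps for _, steps in trials])
    print(f"threshold={threshold:4.1f}  P(accept)={p_accept:.2f}  mean cues={mean_steps:7.1f}")
```

Under time constraints, lowering the threshold trades individual-assessment accuracy for speed, which is exactly the tradeoff the model studies at the population-sampling level.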

  9. Importance of methodology on (99m)technetium dimercapto-succinic acid scintigraphic image quality: imaging pilot study for RIVUR (Randomized Intervention for Children With Vesicoureteral Reflux) multicenter investigation.

    Science.gov (United States)

    Ziessman, Harvey A; Majd, Massoud

    2009-07-01

We reviewed our experience with (99m)technetium dimercapto-succinic acid scintigraphy obtained during an imaging pilot study for a multicenter investigation (Randomized Intervention for Children With Vesicoureteral Reflux) of the effectiveness of daily antimicrobial prophylaxis for preventing recurrent urinary tract infection and renal scarring. We analyzed imaging methodology and its relation to diagnostic image quality. (99m)Technetium dimercapto-succinic acid imaging guidelines were provided to participating sites. High-resolution planar imaging with parallel hole or pinhole collimation was required. Two core reviewers evaluated all submitted images. Analysis included appropriate views, presence or lack of patient motion, adequate magnification, sufficient counts and diagnostic image quality. Inter-reader agreement was evaluated. We evaluated 70 (99m)technetium dimercapto-succinic acid studies from 14 institutions. Variability was noted in methodology and image quality. The correlation (r value) between dose administered and patient age was 0.780. For parallel hole collimator imaging, good correlation was noted between activity administered and counts (r = 0.800). For pinhole imaging the correlation was poor (r = 0.110). A total of 10 studies (17%) were rejected for quality issues of motion, kidney overlap, inadequate magnification, inadequate counts and poor quality images. The submitting institution was informed and provided with recommendations for improving quality, and resubmission of another study was required. Only 4 studies (6%) were judged differently by the 2 reviewers, and the differences were minor. Methodology and image quality for (99m)technetium dimercapto-succinic acid scintigraphy varied more than expected between institutions. The most common reason for poor image quality was inadequate count acquisition, with insufficient attention to the tradeoff between administered dose, length of image acquisition, start time of imaging, and resulting image quality.

  10. Methodology for the detection of contamination by hydrocarbons and further soil sampling for volatile and semi-volatile organic enrichment in former petrol stations, SE Spain

    Directory of Open Access Journals (Sweden)

    Rosa María Rosales Aranda

    2012-01-01

Full Text Available The optimal detection and quantification of contamination plumes in soil and groundwater caused by petroleum organic compounds (gasoline and diesel) is critical for the reclamation of hydrocarbon-contaminated soil at petrol stations. In this study, the sampling stage was optimized for these scenarios by locating potential contamination areas before sampling, applying the 2D electrical resistivity tomography method, a non-destructive geophysical technique based on resistivity measurements in soils. After the detection of hydrocarbon-contaminated areas, boreholes with continuous coring were drilled at a petrol station located in the Murcia Region (Spain). The drillholes reached depths down to 10 m, and soil samples were taken from each meter of the drilling. Handling and storage of the soil samples, for both volatile and semi-volatile organic compound determinations, were optimized by designing a soil sampler that minimizes volatilization losses and avoids manual contact with the environmental samples during sampling. The soil samples were preserved in two kinds of glass vials, according to European regulations and US Environmental Protection Agency recommendations. Determination techniques to quantify the hydrocarbon pollution, based on gas chromatography with different detectors and the headspace technique to reach a liquid-gas equilibrium for volatile analyses, were also taken into account.

  11. A school-based comprehensive lifestyle intervention among Chinese kids against obesity (CLICK-Obesity): rationale, design and methodology of a randomized controlled trial in Nanjing city, China

    Directory of Open Access Journals (Sweden)

    Xu Fei

    2012-06-01

Full Text Available Abstract Background The prevalence of childhood obesity among adolescents has been rapidly rising in Mainland China in recent decades, especially in urban and rich areas. There is an urgent need to develop effective interventions to prevent childhood obesity. Limited data regarding adolescent overweight prevention in China are available. Thus, we developed a school-based intervention with the aim of reducing excess body weight in children. This report describes the study design. Methods/design We designed a cluster randomized controlled trial in 8 randomly selected urban primary schools between May 2010 and December 2013. Each school was randomly assigned to either the intervention or the control group (four schools in each group). Participants were the 4th graders in each participating school. The multi-component program was implemented within the intervention group, while students in the control group followed their usual health and physical education curriculum with no additional intervention program. The intervention consisted of four components: (a) a classroom curriculum (including physical education and healthy diet education), (b) school environment support, (c) family involvement, and (d) fun programs/events. The primary study outcome was body composition; secondary outcomes were behaviour and behavioural determinants. Discussion The intervention was designed with due consideration of Chinese cultural and familial tradition, social convention, and the current primary education and exam system in Mainland China. We did our best to gain good support from educational authorities, school administrators, teachers and parents, and to integrate intervention components into schools' regular academic programs. The results of and lessons learned from this study will help guide future school-based childhood obesity prevention programs in Mainland China. Trial registration Registration number: ChiCTR-ERC-11001819

  12. Development of a simple and low-cost enzymatic methodology for quantitative analysis of carbamates in meat samples of forensic interest.

    Science.gov (United States)

    Sabino, Bruno Duarte; Torraca, Tathiana Guilliod; Moura, Claudia Melo; Rozenbaum, Hannah Felicia; de Castro Faria, Mauro Velho

    2010-05-01

    Foods contaminated with a granulated material similar to Temik (a commercial pesticide formulation containing the carbamate insecticide aldicarb) are often involved in accidental ingestion, suicides, and homicides in Brazil. We developed a simple technique to detect aldicarb. This technique is based on the inhibition of a stable preparation of the enzyme acetylcholinesterase, and it is specially adapted for forensic purposes. It comprises an initial extraction step with the solvent methylene chloride followed by a colorimetric acetylcholinesterase assay. We propose that results of testing contaminated forensic samples be expressed in aldicarb equivalents because, even though all other carbamates are also potent enzyme inhibitors, aldicarb is the contaminant most frequently found in forensic samples. This method is rapid (several samples can be run in a period of 2 h) and low cost. This method also proved to be precise and accurate, detecting concentrations as low as 40 microg/kg of aldicarb in meat samples.

  13. Two to five repeated measurements per patient reduced the required sample size considerably in a randomized clinical trial for patients with inflammatory rheumatic diseases

    Directory of Open Access Journals (Sweden)

    Smedslund Geir

    2013-02-01

Full Text Available Abstract Background Patient-reported outcomes are accepted as important outcome measures in rheumatology. The fluctuating symptoms in patients with rheumatic diseases have serious implications for sample size in clinical trials. We estimated the effects of measuring the outcome 1-5 times on the sample size required in a two-armed trial. Findings In a randomized controlled trial that evaluated the effects of a mindfulness-based group intervention for patients with inflammatory arthritis (n=71), the outcome variables Numerical Rating Scales (NRS; pain, fatigue, disease activity, self-care ability, and emotional wellbeing) and the General Health Questionnaire (GHQ-20) were measured five times before and after the intervention. For each variable we calculated the necessary sample sizes for obtaining 80% power (α=.05) for one up to five measurements. Two, three, and four measures reduced the required sample sizes by 15%, 21%, and 24%, respectively. With three (and five) measures, the required sample size per group was reduced from 56 to 39 (32) for the GHQ-20, from 71 to 60 (55) for pain, from 96 to 71 (73) for fatigue, from 57 to 51 (48) for disease activity, from 59 to 44 (45) for self-care, and from 47 to 37 (33) for emotional wellbeing. Conclusions Measuring the outcomes five times rather than once reduced the necessary sample size by an average of 27%. When planning a study, researchers should carefully compare the advantages and disadvantages of increasing sample size versus employing three to five repeated measurements in order to obtain the required statistical power.
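The reported savings are consistent with the standard variance formula for an average of k equally correlated measurements. The sketch below is a simplified planning aid, assuming a two-sample z-approximation and compound symmetry with an assumed within-patient correlation of about 0.7 (chosen because it roughly reproduces the 15%/21%/24% reductions above); it is not the paper's exact calculation.

```python
import math

def n_per_group(delta=0.5, sd=1.0, k=1, rho=0.70):
    """Per-group n for a two-arm comparison of means when the endpoint is the
    average of k equally correlated (compound-symmetry) repeated measures,
    at alpha = 0.05 (two-sided) and 80% power."""
    z_a, z_b = 1.959964, 0.841621           # z-values for alpha/2 and power 0.80
    var_factor = (1 + (k - 1) * rho) / k    # variance of a k-measure average
    n = 2 * (sd / delta) ** 2 * (z_a + z_b) ** 2 * var_factor
    return math.ceil(n)

base = n_per_group(k=1)
for k in range(1, 6):
    n = n_per_group(k=k)
    print(f"k={k}: n per group = {n:3d}  (saving {100 * (1 - n / base):.0f}%)")
```

The variance factor (1 + (k-1)ρ)/k shows why the gain saturates: with strongly correlated measurements, each extra assessment adds progressively less independent information.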

  14. Missing citations due to exact reference matching: Analysis of a random sample from WoS. Are publications from peripheral countries disadvantaged?

    Energy Technology Data Exchange (ETDEWEB)

    Donner, P.

    2016-07-01

Citation counts of scientific research contributions are fundamental data in scientometrics. Accuracy and completeness of citation links are therefore crucial data quality issues (Moed, 2005, Ch. 13). However, despite the known flaws of reference matching algorithms, usually no attempts are made to incorporate uncertainty about citation counts into indicators. This study is a step towards that goal. Particular attention is paid to the question of whether publications from countries not using basic Latin script are differently affected by missed citations. The proprietary reference matching procedure of Web of Science (WoS) is based on (near) exact agreement of cited reference data (normalized during processing) with the target paper's bibliographical data. Consequently, the procedure has near-optimal precision but incomplete recall - it is known to miss some slightly inaccurate reference links (Olensky, 2015). However, there has been no attempt so far to estimate the rate of missed citations by a principled method for a random sample. For this study a simple random sample of WoS source papers was drawn, and we attempted to find all reference strings of WoS-indexed documents that refer to them, in particular inexact matches. The objective is to give a statistical estimate of the proportion of missed citations and to describe the relationship of the number of found citations to the number of missed citations, i.e. the conditional error distribution. The empirical error distribution is statistically analyzed and modelled. (Author)
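The distinction between exact and slightly inaccurate reference strings can be illustrated with a toy matcher. The records below are invented examples, and `difflib.SequenceMatcher` similarity stands in for whatever fuzzy criterion one might use to recover links that an exact-match procedure would miss.

```python
from difflib import SequenceMatcher

# A hypothetical target record and cited-reference strings as they might appear
# in the index, including slightly inaccurate variants.
target = "smith j, 2010, j informetr, v4, p123"
cited = [
    "smith j, 2010, j informetr, v4, p123",     # exact -> matched
    "smith j, 2010, j informetrics, v4, p123",  # journal variant -> missed by exact matching
    "smith ja, 2010, j informetr, v4, p132",    # wrong page/initial -> missed by exact matching
]

def similarity(a: str, b: str) -> float:
    """Character-level similarity ratio in [0, 1]."""
    return SequenceMatcher(None, a, b).ratio()

# Exact matching has near-optimal precision but incomplete recall; a fuzzy
# threshold (e.g. 0.9) would recover the slightly inaccurate references.
for ref in cited:
    print(f"exact={ref == target!s:5}  similarity={similarity(ref, target):.2f}  {ref}")
```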

  15. Positive random variables with a discrete probability mass at the origin: Parameter estimation for left-censored samples with application to air quality monitoring data

    International Nuclear Information System (INIS)

    Gogolak, C.V.

    1986-11-01

The concentration of a contaminant measured in a particular medium might be distributed as a positive random variable when it is present, but it may not always be present. If there is a level below which the concentration cannot be distinguished from zero by the analytical apparatus, a sample from such a population will be censored on the left. The presence of both zeros and positive values in the censored portion of such samples complicates the problem of estimating the parameters of the underlying positive random variable and the probability of a zero observation. Using the method of maximum likelihood, it is shown that the solution to this estimation problem reduces largely to that of estimating the parameters of the distribution truncated at the point of censorship. The maximum likelihood estimate of the proportion of zero values follows directly. The derivation of the maximum likelihood estimates for a lognormal population with zeros is given in detail, and the asymptotic properties of the estimates are examined. The estimation method was used to fit several different distributions to a set of severely censored 85Kr monitoring data from six locations at the Savannah River Plant chemical separations facilities.
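A hedged sketch of the estimation idea: simulate a "delta-lognormal" population (a point mass at zero plus a lognormal component), censor it at a detection limit, and maximize a likelihood in which each censored observation contributes the mixed probability of being a true zero or a positive value below the limit. The parameter values and optimizer choice are illustrative, not taken from the report.

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(3)

# Simulated monitoring data: zeros with probability p0, lognormal otherwise,
# left-censored at a detection limit DL (values below DL are only counted).
p0_true, mu_true, sig_true, DL = 0.3, 0.0, 1.0, 0.5
x = np.where(rng.random(2000) < p0_true, 0.0, rng.lognormal(mu_true, sig_true, 2000))
n_cens = np.sum(x < DL)        # true zeros and small positives are indistinguishable
obs = x[x >= DL]

def negloglik(theta):
    p0, mu, log_sig = theta
    sig = np.exp(log_sig)      # keep sigma positive
    if not 0.0 < p0 < 1.0:
        return np.inf
    # Censored observations: a true zero, or a positive value below DL.
    p_cens = p0 + (1 - p0) * stats.norm.cdf((np.log(DL) - mu) / sig)
    ll = n_cens * np.log(p_cens) + obs.size * np.log(1 - p0)
    ll += np.sum(stats.norm.logpdf((np.log(obs) - mu) / sig) - np.log(sig * obs))
    return -ll

fit = optimize.minimize(negloglik, x0=[0.5, 0.0, 0.0], method="Nelder-Mead")
p0_hat, mu_hat, sig_hat = fit.x[0], fit.x[1], np.exp(fit.x[2])
print(f"p0={p0_hat:.3f}  mu={mu_hat:.3f}  sigma={sig_hat:.3f}")
```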

  16. Comparison of address-based sampling and random-digit dialing methods for recruiting young men as controls in a case-control study of testicular cancer susceptibility.

    Science.gov (United States)

    Clagett, Bartholt; Nathanson, Katherine L; Ciosek, Stephanie L; McDermoth, Monique; Vaughn, David J; Mitra, Nandita; Weiss, Andrew; Martonik, Rachel; Kanetsky, Peter A

    2013-12-01

    Random-digit dialing (RDD) using landline telephone numbers is the historical gold standard for control recruitment in population-based epidemiologic research. However, increasing cell-phone usage and diminishing response rates suggest that the effectiveness of RDD in recruiting a random sample of the general population, particularly for younger target populations, is decreasing. In this study, we compared landline RDD with alternative methods of control recruitment, including RDD using cell-phone numbers and address-based sampling (ABS), to recruit primarily white men aged 18-55 years into a study of testicular cancer susceptibility conducted in the Philadelphia, Pennsylvania, metropolitan area between 2009 and 2012. With few exceptions, eligible and enrolled controls recruited by means of RDD and ABS were similar with regard to characteristics for which data were collected on the screening survey. While we find ABS to be a comparably effective method of recruiting young males compared with landline RDD, we acknowledge the potential impact that selection bias may have had on our results because of poor overall response rates, which ranged from 11.4% for landline RDD to 1.7% for ABS.

  17. School-based mindfulness intervention for stress reduction in adolescents: Design and methodology of an open-label, parallel group, randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Jeanette M. Johnstone

    2016-12-01

Full Text Available Adolescents are in a high-risk period developmentally, in terms of susceptibility to stress. A mindfulness intervention represents a potentially useful strategy for developing the cognitive and emotion regulation skills associated with successful stress coping. Mindfulness strategies have been used successfully for emotional coping in adults, but are not as well studied in youth. This article details a novel proposal for the design of an 8-week randomized study to evaluate a high school-based mindfulness curriculum delivered as part of a two-semester health class. A wellness education intervention is proposed as an active control, along with a waitlist control condition. All students enrolled in a sophomore (10th grade) health class at a private suburban high school will be invited to participate (n = 300). Pre-test assessments will be obtained by youth report, parent ratings, and on-site behavioral testing. The assessments will evaluate baseline stress, mood, emotional coping, controlled attention, and working memory. Participants, divided into 13 classrooms, will be randomized into one of three conditions, by classroom: a mindfulness intervention, an active control (wellness education), and a passive control (waitlist). Waitlisted participants will receive one of the interventions in the following term. Intervention groups will meet weekly for 8 weeks during regularly scheduled health classes. Immediate post-tests will be conducted, followed by a 60-day post-test. It is hypothesized that the mindfulness intervention will outperform the other conditions with regard to the adolescents' mood, attention and response to stress.

  18. A sero-survey of rinderpest in nomadic pastoral systems in central and southern Somalia from 2002 to 2003, using a spatially integrated random sampling approach.

    Science.gov (United States)

    Tempia, S; Salman, M D; Keefe, T; Morley, P; Freier, J E; DeMartini, J C; Wamwayi, H M; Njeumi, F; Soumaré, B; Abdi, A M

    2010-12-01

    A cross-sectional sero-survey, using a two-stage cluster sampling design, was conducted between 2002 and 2003 in ten administrative regions of central and southern Somalia, to estimate the seroprevalence and geographic distribution of rinderpest (RP) in the study area, as well as to identify potential risk factors for the observed seroprevalence distribution. The study was also used to test the feasibility of the spatially integrated investigation technique in nomadic and semi-nomadic pastoral systems. In the absence of a systematic list of livestock holdings, the primary sampling units were selected by generating random map coordinates. A total of 9,216 serum samples were collected from cattle aged 12 to 36 months at 562 sampling sites. Two apparent clusters of RP seroprevalence were detected. Four potential risk factors associated with the observed seroprevalence were identified: the mobility of cattle herds, the cattle population density, the proximity of cattle herds to cattle trade routes and cattle herd size. Risk maps were then generated to assist in designing more targeted surveillance strategies. The observed seroprevalence in these areas declined over time. In subsequent years, similar seroprevalence studies in neighbouring areas of Kenya and Ethiopia also showed a very low seroprevalence of RP or the absence of antibodies against RP. The progressive decline in RP antibody prevalence is consistent with virus extinction. Verification of freedom from RP infection in the Somali ecosystem is currently in progress.
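Selecting primary sampling units by random map coordinates can be sketched in a few lines. The bounding box below is a rough, hypothetical rectangle for the study area; drawing the sine of latitude uniformly keeps the sample uniform per unit area rather than per degree (a refinement that matters little near the equator but is cheap to include).

```python
import math
import random

random.seed(4)

def random_site(lat_min, lat_max, lon_min, lon_max):
    """Area-uniform random coordinates inside a latitude/longitude box."""
    s_min, s_max = math.sin(math.radians(lat_min)), math.sin(math.radians(lat_max))
    lat = math.degrees(math.asin(random.uniform(s_min, s_max)))
    lon = random.uniform(lon_min, lon_max)
    return lat, lon

# Hypothetical bounding box roughly covering central and southern Somalia.
sites = [random_site(-2.0, 6.0, 41.0, 48.0) for _ in range(562)]
print(sites[:3])
```

In practice the draws would be intersected with the study-region boundary (rejecting points outside it) before field teams are dispatched to the nearest herds.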

  19. Sample size adjustments for varying cluster sizes in cluster randomized trials with binary outcomes analyzed with second-order PQL mixed logistic regression.

    Science.gov (United States)

    Candel, Math J J M; Van Breukelen, Gerard J P

    2010-06-30

    Adjustments of sample size formulas are given for varying cluster sizes in cluster randomized trials with a binary outcome when testing the treatment effect with mixed effects logistic regression using second-order penalized quasi-likelihood estimation (PQL). Starting from first-order marginal quasi-likelihood (MQL) estimation of the treatment effect, the asymptotic relative efficiency of unequal versus equal cluster sizes is derived. A Monte Carlo simulation study shows this asymptotic relative efficiency to be rather accurate for realistic sample sizes, when employing second-order PQL. An approximate, simpler formula is presented to estimate the efficiency loss due to varying cluster sizes when planning a trial. In many cases sampling 14 per cent more clusters is sufficient to repair the efficiency loss due to varying cluster sizes. Since current closed-form formulas for sample size calculation are based on first-order MQL, planning a trial also requires a conversion factor to obtain the variance of the second-order PQL estimator. In a second Monte Carlo study, this conversion factor turned out to be 1.25 at most. (c) 2010 John Wiley & Sons, Ltd.
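A planning-stage sketch of the adjustment, using the commonly cited relative-efficiency approximation for unequal cluster sizes, RE ≈ 1 − CV²·λ(1−λ) with λ = mρ/(mρ + 1 − ρ), from the related literature on continuous outcomes (van Breukelen and colleagues). This is an assumption-laden stand-in for the paper's second-order PQL formulas; note that since λ(1−λ) ≤ 1/4, the inflation never exceeds CV²/4, which matches the "14 per cent more clusters" rule of thumb for CV up to about 0.75.

```python
import math

def clusters_needed(k_equal, cv, icc, mean_size):
    """Inflate the number of clusters for varying cluster sizes.
    k_equal: clusters needed under equal cluster sizes; cv: coefficient of
    variation of cluster sizes; icc: intraclass correlation; mean_size: mean
    cluster size. Uses RE ~= 1 - cv^2 * lam * (1 - lam)."""
    lam = mean_size * icc / (mean_size * icc + 1 - icc)
    re = 1 - cv ** 2 * lam * (1 - lam)
    return math.ceil(k_equal / re), re

for cv in (0.3, 0.5, 0.75, 1.0):
    k, rel_eff = clusters_needed(k_equal=50, cv=cv, icc=0.05, mean_size=20)
    print(f"CV={cv:.2f}: RE={rel_eff:.3f}, clusters needed={k}")
```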

  20. Dynamical implications of sample shape for avalanches in 2-dimensional random-field Ising model with saw-tooth domain wall

    Science.gov (United States)

    Tadić, Bosiljka

    2018-03-01

We study dynamics of a built-in domain wall (DW) in 2-dimensional disordered ferromagnets with different sample shapes using the random-field Ising model on a square lattice rotated by 45 degrees. The saw-tooth DW of length Lx is created along one side and swept through the sample by slow ramping of the external field until the complete magnetisation reversal and the wall annihilation at the open top boundary at a distance Ly. By fixing the number of spins N = Lx × Ly = 10^6 and the random-field distribution at a value above the critical disorder, we vary the ratio of the DW length to the annihilation distance in the range Lx/Ly ∈ [1/16, 16]. Periodic boundary conditions are applied in the y-direction, so that these ratios comprise different samples, i.e., surfaces of cylinders with changing perimeter Lx and height Ly. We analyse the avalanches of the DW slips between successive field updates, and the multifractal structure of the magnetisation fluctuation time series. Our main findings are that the domain-wall lengths materialised in different sample shapes have an impact on the dynamics at all scales. Moreover, the domain-wall motion at the beginning of the hysteresis loop (HLB) probes the disorder effects, resulting in fluctuations that are significantly different from the large avalanches in the central part of the loop (HLC), where the strong fields dominate. Specifically, the fluctuations in HLB exhibit a wide multifractal spectrum, which shifts towards higher values of the exponents when the DW length is reduced. The distributions of the avalanches in this segment of the loop obey power-law decay with exponential cutoffs, with exponents firmly in the mean-field universality class for long DWs. In contrast, the avalanches in the HLC obey a Tsallis density distribution with power-law tails, which indicates new categories of scale-invariant behaviour for different ratios Lx/Ly. The large fluctuations in the HLC, on the other
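For readers unfamiliar with avalanche dynamics in this model, the following toy implements the generic zero-temperature, adiabatic field-ramp dynamics (a spin flips when its local field turns positive, flips propagate, and avalanche sizes are recorded). It is a small-lattice sketch of the standard model, not the paper's built-in-domain-wall protocol; the disorder strength R and lattice size are arbitrary illustrative values.

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(6)

# Minimal zero-temperature random-field Ising hysteresis on an Lx x Ly lattice,
# open in x and periodic in y. Toy sizes; the paper uses N = 10^6 spins.
Lx, Ly, J, R = 32, 32, 1.0, 2.0
h = rng.normal(0.0, R, size=(Lx, Ly))      # quenched random fields
s = -np.ones((Lx, Ly), dtype=int)          # start fully magnetised down

def neighbours(i, j):
    if i > 0:      yield i - 1, j
    if i < Lx - 1: yield i + 1, j
    yield i, (j - 1) % Ly                  # periodic boundary in y
    yield i, (j + 1) % Ly

def local_field(i, j, H):
    return J * sum(s[a, b] for a, b in neighbours(i, j)) + h[i, j] + H

avalanche_sizes = []
while (s == -1).any():
    down = np.argwhere(s == -1)
    # Adiabatic ramp: raise H just past the weakest remaining down spin.
    H = min(-local_field(i, j, 0.0) for i, j in down) + 1e-9
    queue = deque((i, j) for i, j in down if local_field(i, j, H) > 0)
    size = 0
    while queue:                           # one avalanche: flips trigger flips
        i, j = queue.popleft()
        if s[i, j] == 1 or local_field(i, j, H) <= 0:
            continue
        s[i, j] = 1
        size += 1
        queue.extend((a, b) for a, b in neighbours(i, j) if s[a, b] == -1)
    avalanche_sizes.append(size)

print(f"{len(avalanche_sizes)} avalanches, largest = {max(avalanche_sizes)}")
```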

  1. A Rigorous Test of the Fit of the Circumplex Model to Big Five Personality Data: Theoretical and Methodological Issues and Two Large Sample Empirical Tests.

    Science.gov (United States)

    DeGeest, David Scott; Schmidt, Frank

    2015-01-01

    Our objective was to apply the rigorous test developed by Browne (1992) to determine whether the circumplex model fits Big Five personality data. This test has yet to be applied to personality data. Another objective was to determine whether blended items explained correlations among the Big Five traits. We used two working adult samples, the Eugene-Springfield Community Sample and the Professional Worker Career Experience Survey. Fit to the circumplex was tested via Browne's (1992) procedure. Circumplexes were graphed to identify items with loadings on multiple traits (blended items), and to determine whether removing these items changed five-factor model (FFM) trait intercorrelations. In both samples, the circumplex structure fit the FFM traits well. Each sample had items with dual-factor loadings (8 items in the first sample, 21 in the second). Removing blended items had little effect on construct-level intercorrelations among FFM traits. We conclude that rigorous tests show that the fit of personality data to the circumplex model is good. This finding means the circumplex model is competitive with the factor model in understanding the organization of personality traits. The circumplex structure also provides a theoretically and empirically sound rationale for evaluating intercorrelations among FFM traits. Even after eliminating blended items, FFM personality traits remained correlated.

  2. Long-term sampling of CO2 from waste-to-energy plants: 14C determination methodology, data variation and uncertainty

    DEFF Research Database (Denmark)

    Fuglsang, Karsten; Pedersen, Niels Hald; Larsen, Anna Warberg

    2014-01-01

A dedicated sampling and measurement method was developed for long-term measurements of biogenic and fossil-derived CO2 from thermal waste-to-energy processes. Based on long-term sampling of CO2 and 14C determination, plant-specific emission factors can be determined more accurately, and the annual emission of fossil CO2 from waste-to-energy plants can be monitored according to carbon trading schemes and renewable energy certificates. Weekly and monthly measurements were performed at five Danish waste incinerators. Significant variations between the fractions of biogenic CO2 emitted were observed. … The measurement uncertainty was ± 4.0 pmC (95% confidence interval) at 62 pmC. The long-term sampling method was found to be useful for waste incinerators for the determination of annual fossil and biogenic CO2 emissions with relatively low uncertainty.
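The underlying conversion from a 14C reading to a biogenic share is a simple proportion, since fossil CO2 is 14C-free. In this sketch the reference pmC of contemporary biomass (taken as 108 pmC) is an assumed value that in practice depends on waste composition and sampling year.

```python
def biogenic_fraction(pmc_sample: float, pmc_biomass_ref: float = 108.0) -> float:
    """Fraction of CO2 of biogenic origin from a 14C measurement.
    Fossil CO2 contains no 14C, so the measured pmC scales linearly with the
    biogenic share; pmc_biomass_ref (~108 pmC here) is an assumed reference."""
    return pmc_sample / pmc_biomass_ref

pmc = 62.0                      # the long-term sample reading quoted in the abstract
f_bio = biogenic_fraction(pmc)
print(f"biogenic fraction ~ {f_bio:.2f}, fossil fraction ~ {1 - f_bio:.2f}")
```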

  3. A Comparative Evaluation of EPR and OxyLite Oximetry Using a Random Sampling of pO2 in a Murine Tumor

    Science.gov (United States)

    Vikram, Deepti S.; Bratasz, Anna; Ahmad, Rizwan; Kuppusamy, Periannan

    2015-01-01

    Methods currently available for the measurement of oxygen concentrations (oximetry) in viable tissues differ widely from each other in their methodological basis and applicability. The goal of this study was to compare two novel methods, particulate-based electron paramagnetic resonance (EPR) and OxyLite oximetry, in an experimental tumor model. EPR oximetry uses implantable paramagnetic particulates, whereas OxyLite uses fluorescent probes affixed on a fiber-optic cable. C3H mice were transplanted with radiation-induced fibrosarcoma (RIF-1) tumors in their hind limbs. Lithium phthalocyanine (LiPc) microcrystals were used as EPR probes. The pO2 measurements were taken from random locations at a depth of ~3 mm within the tumor either immediately or 48 h after implantation of LiPc. Both methods revealed significant hypoxia in the tumor. However, there were striking differences between the EPR and OxyLite readings. The differences were attributed to the volume of tissue under examination and the effect of needle invasion at the site of measurement. This study recognizes the unique benefits of EPR oximetry in terms of robustness, repeatability and minimal invasiveness. PMID:17705635

  4. A study on antimony determination in environmental samples by neutron activation analysis: validation of the methodology and determination of the uncertainty of the measurement

    International Nuclear Information System (INIS)

    Matsubara, Tassiane Cristina Martins

    2011-01-01

Antimony is an element found in low concentrations in the environment. However, its determination has attracted great interest due to the knowledge of its toxicity and its increasing application in industry. The determination of antimony has been a challenge for researchers, since this element is found in low concentrations which make its analysis a difficult task. Therefore, although neutron activation analysis (NAA) is an appropriate method for the determination of various elements in different types of matrix, in the case of Sb its analysis presents some difficulties, mainly due to spectral interferences. The objective of this research was to validate the NAA method for Sb determination in environmental samples. To establish appropriate conditions for Sb determination, preliminary assays were carried out before the analysis of certified reference materials (CRM). The experimental procedure was to irradiate samples with a synthetic Sb standard for a period of 8 or 16 hours in the IEA-R1 nuclear research reactor, followed by gamma ray spectrometry. The quantification of Sb was performed by measuring the radioactive isotopes 122Sb and 124Sb. The results of the preliminary assays indicated the presence of Sb in the Whatman no 40 filter paper used in the preparation of the synthetic standard, but at very low concentrations, which could be considered negligible. The plastic material used in bags for sample irradiation should be chosen carefully, because depending on its thickness it may contain Sb. Analyses of the stability of the diluted Sb standard solution showed no change in the Sb concentration within eight months after its preparation. Results obtained in the analysis of certified reference materials indicated the interference of 76As, and also of 134Cs and 152Eu, in Sb determinations based on the measurement of 122Sb, due to the proximity of the gamma ray energies. The high activity of 24Na can also mask the peak of 122Sb, hindering its

  5. Random Fields

    Science.gov (United States)

    Vanmarcke, Erik

    1983-03-01

Random variation over space and time is one of the few attributes that might safely be predicted as characterizing almost any given complex system. Random fields or "distributed disorder systems" confront astronomers, physicists, geologists, meteorologists, biologists, and other natural scientists. They appear in the artifacts developed by electrical, mechanical, civil, and other engineers. They even underlie the processes of social and economic change. The purpose of this book is to bring together existing and new methodologies of random field theory and indicate how they can be applied to these diverse areas where a "deterministic treatment is inefficient and conventional statistics insufficient." Many new results and methods are included. After outlining the extent and characteristics of the random field approach, the book reviews the classical theory of multidimensional random processes and introduces basic probability concepts and methods in the random field context. It next gives a concise account of the second-order analysis of homogeneous random fields, in both the space-time domain and the wave number-frequency domain. This is followed by a chapter on spectral moments and related measures of disorder and on level excursions and extremes of Gaussian and related random fields. After developing a new framework of analysis based on local averages of one-, two-, and n-dimensional processes, the book concludes with a chapter discussing ramifications in the important areas of estimation, prediction, and control. The mathematical prerequisite has been held to basic college-level calculus.

  6. Random noise attenuation of non-uniformly sampled 3D seismic data along two spatial coordinates using non-equispaced curvelet transform

    Science.gov (United States)

    Zhang, Hua; Yang, Hui; Li, Hongxing; Huang, Guangnan; Ding, Zheyi

    2018-04-01

    The attenuation of random noise is important for improving the signal-to-noise ratio (SNR). However, most conventional denoising methods presuppose that the noisy data are sampled on a uniform grid, making them unsuitable for non-uniformly sampled data. In this paper, a denoising method capable of regularizing noisy data from a non-uniform grid to a specified uniform grid is proposed. Firstly, denoising is performed on every time slice extracted from the 3D noisy data along the source and receiver directions; the 2D non-equispaced fast Fourier transform (NFFT) is then introduced into the conventional fast discrete curvelet transform (FDCT). The non-equispaced fast discrete curvelet transform (NFDCT) is obtained through the regularized inversion of an operator that links the uniformly sampled curvelet coefficients to the non-uniformly sampled noisy data. The uniform curvelet coefficients can be calculated using the spectral projected-gradient inversion algorithm for ℓ1-norm problems. Local threshold factors are then chosen for the uniform curvelet coefficients at each decomposition scale, yielding the effective curvelet coefficients for each scale. Finally, the conventional inverse FDCT is applied to the effective curvelet coefficients. This completes the proposed 3D denoising method using the non-equispaced curvelet transform in the source-receiver domain. Examples with synthetic and real data demonstrate the effectiveness of the proposed approach for noise attenuation in non-uniformly sampled data, compared with the conventional FDCT method and the wavelet transform.
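
    The core workflow described above (transform the data, threshold the coefficients, then invert) can be illustrated with a much simpler stand-in. The following sketch is not the authors' NFDCT: it is a minimal, hypothetical example that hard-thresholds ordinary 2D FFT coefficients on a uniform grid to suppress random noise, with the curvelet transform's non-uniform sampling and directional scales left out.

        import numpy as np

        def fft_threshold_denoise(data, k=3.0):
            # Transform-domain denoising: the NFDCT in the paper plays the
            # role the FFT plays here, but additionally handles non-uniform
            # sampling and a directional, multiscale decomposition.
            coeffs = np.fft.fft2(data)
            # Hard-threshold: zero every coefficient below k times the
            # median magnitude (a rough estimate of the noise floor).
            thresh = k * np.median(np.abs(coeffs))
            coeffs[np.abs(coeffs) < thresh] = 0.0
            return np.real(np.fft.ifft2(coeffs))

        # Synthetic example: a coherent plane-wave signal plus white noise.
        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 1.0, 128)
        signal = np.sin(2 * np.pi * 8 * t)[:, None] * np.ones((1, 128))
        noisy = signal + 0.5 * rng.standard_normal((128, 128))
        denoised = fft_threshold_denoise(noisy)
        print("noisy RMS error:   ", np.sqrt(np.mean((noisy - signal) ** 2)))
        print("denoised RMS error:", np.sqrt(np.mean((denoised - signal) ** 2)))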

  7. Development of a Virtual Reality Exposure Tool as Psychological Preparation for Elective Pediatric Day Care Surgery: Methodological Approach for a Randomized Controlled Trial.

    Science.gov (United States)

    Eijlers, Robin; Legerstee, Jeroen S; Dierckx, Bram; Staals, Lonneke M; Berghmans, Johan; van der Schroeff, Marc P; Wijnen, Rene Mh; Utens, Elisabeth Mwj

    2017-09-11

    Preoperative anxiety in children is highly prevalent and is associated with adverse outcomes. Existing psychosocial interventions to reduce preoperative anxiety are often aimed at distraction and are of limited efficacy. Gradual exposure is a far more effective way to reduce anxiety. Virtual reality (VR) provides a unique opportunity to gradually expose children to all aspects of the operating theater. The aims of our study are (1) to develop a virtual reality exposure (VRE) tool to prepare children psychologically for surgery; and (2) to examine the efficacy of the VRE tool in a randomized controlled trial (RCT), in which VRE will be compared to care as usual (CAU). The VRE tool is highly realistic and resembles the operating room environment accurately. With this tool, children will not only be able to explore the operating room environment, but also get accustomed to general anesthesia procedures. The PREoperative Virtual reality Intervention to Enhance Wellbeing (PREVIEW) study will be conducted. In this single-blinded RCT, 200 consecutive patients (aged 4 to 12 years) undergoing elective day care surgery for dental, oral, or ear-nose-throat problems, will be randomly allocated to the preoperative VRE intervention or CAU. The primary outcome is change in child state anxiety level between baseline and induction of anesthesia. Secondary outcome measures include child's postoperative anxiety, emergence delirium, postoperative pain, use of analgesics, health care use, and pre- and postoperative parental anxiety. The VRE tool has been developed. Participant recruitment began March 2017 and is expected to be completed by September 2018. To our knowledge, this is the first RCT evaluating the effect of a VRE tool to prepare children for surgery. The VRE intervention is expected to significantly diminish preoperative anxiety, postoperative pain, and the use of postoperative analgesics in pediatric patients. The tool could create a less stressful experience for both

  8. Development of a Virtual Reality Exposure Tool as Psychological Preparation for Elective Pediatric Day Care Surgery: Methodological Approach for a Randomized Controlled Trial

    Science.gov (United States)

    Eijlers, Robin; Legerstee, Jeroen S; Dierckx, Bram; Staals, Lonneke M; Berghmans, Johan; van der Schroeff, Marc P; Wijnen, Rene MH

    2017-01-01

    Background Preoperative anxiety in children is highly prevalent and is associated with adverse outcomes. Existing psychosocial interventions to reduce preoperative anxiety are often aimed at distraction and are of limited efficacy. Gradual exposure is a far more effective way to reduce anxiety. Virtual reality (VR) provides a unique opportunity to gradually expose children to all aspects of the operating theater. Objective The aims of our study are (1) to develop a virtual reality exposure (VRE) tool to prepare children psychologically for surgery; and (2) to examine the efficacy of the VRE tool in a randomized controlled trial (RCT), in which VRE will be compared to care as usual (CAU). Methods The VRE tool is highly realistic and resembles the operating room environment accurately. With this tool, children will not only be able to explore the operating room environment, but also get accustomed to general anesthesia procedures. The PREoperative Virtual reality Intervention to Enhance Wellbeing (PREVIEW) study will be conducted. In this single-blinded RCT, 200 consecutive patients (aged 4 to 12 years) undergoing elective day care surgery for dental, oral, or ear-nose-throat problems, will be randomly allocated to the preoperative VRE intervention or CAU. The primary outcome is change in child state anxiety level between baseline and induction of anesthesia. Secondary outcome measures include child’s postoperative anxiety, emergence delirium, postoperative pain, use of analgesics, health care use, and pre- and postoperative parental anxiety. Results The VRE tool has been developed. Participant recruitment began March 2017 and is expected to be completed by September 2018. Conclusions To our knowledge, this is the first RCT evaluating the effect of a VRE tool to prepare children for surgery. The VRE intervention is expected to significantly diminish preoperative anxiety, postoperative pain, and the use of postoperative analgesics in pediatric patients. The tool

  9. Importance of Using Multiple Sampling Methodologies for Estimating Fish Community Composition in Offshore Wind Power Construction Areas of the Baltic Sea

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Mathias H.; Gullstroem, Martin; Oehman, Marcus C. (Dept. of Zoology, Stockholm Univ., Stockholm (Sweden)); Asplund, Maria E. (Dept. of Marine Ecology, Goeteborg Univ., Kristineberg Marine Research Station, Fiskebaeckskil (Sweden))

    2007-12-15

    In this study a visual SCUBA investigation was conducted at Utgrunden 2, an area where windmills had not yet been constructed and where the bottom mainly consisted of mud or sand with no, or only sparse, algae or mussel beds. A wind farm at Utgrunden 2 would alter the local habitat from a predominantly sandy soft-bottom habitat to an area in which artificial reef structures resembling hard-bottom habitats are introduced, i.e., the steel foundations and possibly boulders for scour protection. The fish community that develops over time would be expected to change to resemble the assemblages observed at Utgrunden 1, and would hence not be visible using trawling and echo-sounding sampling techniques. As the goal of an EIA is to assess changes following human development, visual techniques are recommended as a complement when examining the environmental effects of offshore wind power; otherwise important ecological changes may go unnoticed. For a comprehensive understanding of the ecological effects of wind farm developments it is recommended that a combination of sampling methods be applied, and that this be defined before an investigation commences. Although it is well established in the scientific literature that different sampling methods give different estimates of fish community composition, environmental impact assessments of offshore wind power have been incorrectly interpreted. In the interpretation of the results of such assessments it is common for the findings to be extrapolated by stakeholders and the media to cover a larger part of the fish populations than was intended. Therefore, to fully understand how wind power influences fish, the underwater visual census technique is here put forward as a necessary complement to more wide-screening fish sampling methods (e.g., gill nets, echo-sounders, trawling).

  10. Why choose Random Forest to predict rare species distribution with few samples in large undersampled areas? Three Asian crane species models provide supporting evidence

    Directory of Open Access Journals (Sweden)

    Chunrong Mi

    2017-01-01

    Full Text Available Species distribution models (SDMs) have become an essential tool in ecology, biogeography, evolution and, more recently, in conservation biology. How to generalize species distributions in large undersampled areas, especially with few samples, is a fundamental issue of SDMs. In order to explore this issue, we used the best available presence records for the Hooded Crane (Grus monacha, n = 33), White-naped Crane (Grus vipio, n = 40), and Black-necked Crane (Grus nigricollis, n = 75) in China as three case studies, employing four powerful and commonly used machine learning algorithms to map the breeding distributions of the three species: TreeNet (Stochastic Gradient Boosting, Boosted Regression Tree Model), Random Forest, CART (Classification and Regression Tree) and Maxent (Maximum Entropy Models). In addition, we developed an ensemble forecast by averaging the predicted probabilities of the above four models. Commonly used model performance metrics (area under the ROC curve (AUC) and true skill statistic (TSS)) were employed to evaluate model accuracy. The latest satellite tracking data and compiled literature data were used as two independent testing datasets to confront model predictions. We found that Random Forest demonstrated the best performance for most assessment methods, provided a better model fit to the testing data, and achieved better species range maps for each crane species in undersampled areas. Random Forest has been generally available for more than 20 years and has been known to perform extremely well in ecological predictions. However, while increasingly on the rise, its potential is still widely underused in conservation, (spatial) ecological applications and for inference. Our results show that it informs ecological and biogeographical theories as well as being suitable for conservation applications, specifically when the study area is undersampled. This method helps to save model-selection time and effort, and allows robust and rapid

  11. Why choose Random Forest to predict rare species distribution with few samples in large undersampled areas? Three Asian crane species models provide supporting evidence.

    Science.gov (United States)

    Mi, Chunrong; Huettmann, Falk; Guo, Yumin; Han, Xuesong; Wen, Lijia

    2017-01-01

    Species distribution models (SDMs) have become an essential tool in ecology, biogeography, evolution and, more recently, in conservation biology. How to generalize species distributions in large undersampled areas, especially with few samples, is a fundamental issue of SDMs. In order to explore this issue, we used the best available presence records for the Hooded Crane (Grus monacha, n = 33), White-naped Crane (Grus vipio, n = 40), and Black-necked Crane (Grus nigricollis, n = 75) in China as three case studies, employing four powerful and commonly used machine learning algorithms to map the breeding distributions of the three species: TreeNet (Stochastic Gradient Boosting, Boosted Regression Tree Model), Random Forest, CART (Classification and Regression Tree) and Maxent (Maximum Entropy Models). In addition, we developed an ensemble forecast by averaging the predicted probabilities of the above four models. Commonly used model performance metrics (area under the ROC curve (AUC) and true skill statistic (TSS)) were employed to evaluate model accuracy. The latest satellite tracking data and compiled literature data were used as two independent testing datasets to confront model predictions. We found that Random Forest demonstrated the best performance for most assessment methods, provided a better model fit to the testing data, and achieved better species range maps for each crane species in undersampled areas. Random Forest has been generally available for more than 20 years and has been known to perform extremely well in ecological predictions. However, while increasingly on the rise, its potential is still widely underused in conservation, (spatial) ecological applications and for inference. Our results show that it informs ecological and biogeographical theories as well as being suitable for conservation applications, specifically when the study area is undersampled. This method helps to save model-selection time and effort, and allows robust and rapid
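
    Both versions of this record describe the same modelling step: fit Random Forest to presence records plus background (pseudo-absence) points, then map the predicted probability of presence as the species range. A minimal hedged sketch with scikit-learn follows; the predictors and data are hypothetical placeholders, not the study's variables.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(42)

        # Hypothetical environmental predictors (e.g., elevation, temperature,
        # precipitation) at presence points and at random background points.
        n_presence, n_background = 40, 400   # few presences, as in the study
        X_presence = rng.normal(loc=1.0, size=(n_presence, 3))
        X_background = rng.normal(loc=0.0, size=(n_background, 3))
        X = np.vstack([X_presence, X_background])
        y = np.r_[np.ones(n_presence), np.zeros(n_background)]

        # Random Forest copes with few presences and correlated predictors
        # without much tuning, one reason it suits undersampled areas.
        rf = RandomForestClassifier(n_estimators=500, oob_score=True,
                                    random_state=0)
        rf.fit(X, y)

        # Predicted probability of presence over map cells = the range map.
        X_grid = rng.normal(size=(1000, 3))  # stand-in for gridded map cells
        suitability = rf.predict_proba(X_grid)[:, 1]
        print("OOB score:", round(rf.oob_score_, 3))
        print("AUC (training):",
              round(roc_auc_score(y, rf.predict_proba(X)[:, 1]), 3))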

  12. Levels of dioxin (PCDD/F) and PCBs in a random sample of Australian aquaculture-produced Southern Bluefin Tuna (Thunnus maccoyii)

    Energy Technology Data Exchange (ETDEWEB)

    Padula, D.; Madigan, T.; Kiermeier, A.; Daughtry, B.; Pointon, A. [South Australian Research and Development Inst. (Australia)

    2004-09-15

    To date there has been no published information available on the levels of dioxin (PCDD/F) and PCBs in Australian aquaculture-produced Southern Bluefin Tuna (Thunnus maccoyii). Southern Bluefin Tuna are commercially farmed off the coast of Port Lincoln in the state of South Australia, Australia. This paper reports the levels of dioxin (PCDD/F) and PCBs in muscle tissue samples from 11 randomly sampled aquaculture-produced Southern Bluefin Tuna collected in 2003. Little published data exist on the levels of dioxin (PCDD/F) and PCBs in Australian aquaculture-produced seafood. Wild tuna are first caught in the Great Australian Bight in South Australian waters and are then brought back to Port Lincoln, where they are ranched in sea-cages before being harvested and exported to Japan. The aim of the study was to identify pathways whereby contaminants such as dioxin (PCDD/F) and PCBs may enter the aquaculture production system. This involved undertaking a through-chain analysis of the levels of dioxin (PCDD/F) and PCBs in wild-caught tuna, in seafloor sediment samples from the marine environment, in feeds, and in the final harvested exported product. The variation of dioxin (PCDD/F) and PCBs across individual tuna carcases was also studied in detail. This paper addresses the levels found in the final harvested product; levels found in the other studies will be published elsewhere shortly.

  13. Sample size determinations for group-based randomized clinical trials with different levels of data hierarchy between experimental and control arms.

    Science.gov (United States)

    Heo, Moonseong; Litwin, Alain H; Blackstock, Oni; Kim, Namhee; Arnsten, Julia H

    2017-02-01

    We derived sample size formulae for detecting main effects in group-based randomized clinical trials with different levels of data hierarchy between experimental and control arms. Such designs are necessary when experimental interventions need to be administered to groups of subjects whereas control conditions need to be administered to individual subjects. This type of trial, often referred to as a partially nested or partially clustered design, has been implemented for management of chronic diseases such as diabetes and is beginning to emerge more commonly in wider clinical settings. Depending on the research setting, the level of hierarchy of data structure for the experimental arm can be three or two, whereas that for the control arm is two or one. Such different levels of data hierarchy assume correlation structures of outcomes that are different between arms, regardless of whether research settings require two or three level data structure for the experimental arm. Therefore, the different correlations should be taken into account for statistical modeling and for sample size determinations. To this end, we considered mixed-effects linear models with different correlation structures between experimental and control arms to theoretically derive and empirically validate the sample size formulae with simulation studies.
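
    The derived formulae themselves are not reproduced in the abstract, but the general shape of such calculations can be sketched. The hedged example below is a standard textbook approximation for a two-arm comparison of means in which only the experimental arm is clustered, inflating that arm's variance by the usual design effect 1 + (m - 1)*ICC; it is not the authors' exact formulae.

        from scipy.stats import norm

        def n_per_arm_partially_nested(delta, sigma=1.0, m=10, icc=0.05,
                                       alpha=0.05, power=0.80):
            # Approximate subjects per arm (equal allocation) when only the
            # experimental arm is clustered (groups of size m, intraclass
            # correlation icc). delta: mean difference; sigma: common SD.
            z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
            de_exp = 1 + (m - 1) * icc   # design effect, experimental arm
            de_ctl = 1.0                 # control arm: independent subjects
            # Equal n solving delta = z * sqrt(sigma^2*(de_exp + de_ctl)/n):
            n = (z ** 2) * (sigma ** 2) * (de_exp + de_ctl) / (delta ** 2)
            return int(round(n))

        # About 120 subjects per arm for a 0.4 SD effect in this scenario.
        print(n_per_arm_partially_nested(delta=0.4, m=10, icc=0.05))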

  14. Approaches to Sampling Gay, Bisexual, and Other Men Who Have Sex with Men from Geosocial-Networking Smartphone Applications: A Methodological Note

    Directory of Open Access Journals (Sweden)

    William C. Goedel

    2016-09-01

    Full Text Available Geosocial-networking smartphone applications utilize global positioning system (GPS) technologies to connect users based on their physical proximity. Many gay, bisexual, and other men who have sex with men (MSM) have smartphones, and these new mobile technologies have generated quicker and easier modes for MSM to meet potential partners. In doing so, these technologies may facilitate a user's ability to have multiple concurrent partners, thereby increasing their risk for acquiring HIV or other sexually transmitted infections. Researchers have sought to recruit users of these applications (e.g., Grindr, Jack'd, Scruff) into HIV prevention studies, primarily through advertising on the application. Given that these advertisements often broadly targeted large urban areas, these approaches have generated samples that are not representative of the population of users of the given application in a given area. As such, we propose a method to generate a spatially representative sample of MSM via direct messaging on a given application, using New York City and its geography as an example of this sampling and recruitment method. These methods can increase geographic representativeness and widen access to MSM who use geosocial-networking smartphone applications.

  15. A drink is a drink? Variation in the amount of alcohol contained in beer, wine and spirits drinks in a US methodological sample.

    Science.gov (United States)

    Kerr, William C; Greenfield, Thomas K; Tujague, Jennifer; Brown, Stephan E

    2005-11-01

    Empirically based estimates of the mean alcohol content of beer, wine and spirits drinks from a national sample of US drinkers have not previously been available. A sample of 310 drinkers from the 2000 National Alcohol Survey was re-contacted to participate in a telephone survey with specific questions about the drinks they consume. Subjects were instructed to prepare their usual drink of each beverage at home and to measure each alcoholic beverage and other ingredients with a provided beaker. Information on the brand or type of each beverage was used to specify its percentage of alcohol. The weighted mean alcohol content of respondents' drinks was 0.67 ounces overall: 0.56 ounces for beer, 0.66 ounces for wine and 0.89 ounces for spirits. Spirits and wine drink contents were particularly variable, with many high-alcohol drinks observed. While the 0.6-ounce-of-alcohol standard drink appears to be a reasonable single standard, it cannot capture the substantial variation evident in this sample, and it underestimates average wine and spirits ethanol content. Direct measurement or beverage-specific mean ethanol content estimates would improve the precision of survey alcohol assessment.
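
    The underlying arithmetic is a single multiplication: ethanol content equals drink volume times alcohol by volume (ABV). A small hedged sketch follows; the ABV values are typical figures assumed for illustration, whereas the study measured actual brands and poured volumes.

        def ethanol_oz(volume_oz, abv):
            # Ounces of pure ethanol in a drink of given volume and ABV.
            return volume_oz * abv

        # The US 'standard drink' of 0.6 oz ethanol corresponds to, e.g.:
        print(ethanol_oz(12.0, 0.05))  # 12 oz beer at 5% ABV    -> 0.60
        print(ethanol_oz(5.0, 0.12))   # 5 oz wine at 12% ABV    -> 0.60
        print(ethanol_oz(1.5, 0.40))   # 1.5 oz spirits at 40%   -> 0.60
        # A generous 7 oz pour of 14% wine instead contains:
        print(ethanol_oz(7.0, 0.14))   # -> 0.98 oz, above the standard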

  16. Methodological Comparison between a Novel Automatic Sampling System for Gas Chromatography versus Photoacoustic Spectroscopy for Measuring Greenhouse Gas Emissions under Field Conditions

    Directory of Open Access Journals (Sweden)

    Alexander J. Schmithausen

    2016-10-01

    Full Text Available Trace gases such as nitrous oxide (N2O), methane (CH4), and carbon dioxide (CO2) are climate-related gases, and their emissions from agricultural livestock barns are not negligible. Conventional measurement systems in the field (Fourier transform infrared spectroscopy (FTIR); photoacoustic system (PAS)) are not sufficiently sensitive to N2O. Laser-based measurement systems are highly accurate, but they are very expensive to purchase and maintain. One cost-effective alternative is gas chromatography (GC) with electron capture detection (ECD), but this is not suitable for field applications because of the radioactive source in the detector. Measuring samples collected automatically under field conditions in the laboratory at a later time presents many challenges. This study presents a sampling system designed to enable laboratory analysis of N2O concentrations in samples collected under field conditions. Analyses were carried out using PAS in the field (online system) and GC in the laboratory (offline system). Both measurement systems showed a good correlation for CH4 and CO2 concentrations. Measured N2O concentrations were near the detection limit for PAS. GC achieved more reliable results for N2O in very low concentration ranges.

  17. Trace metals in fugitive dust from unsurfaced roads in the Viburnum Trend resource mining district of Missouri: implementation of a direct-suspension sampling methodology.

    Science.gov (United States)

    Witt, Emitt C; Wronkiewicz, David J; Pavlowsky, Robert T; Shi, Honglan

    2013-09-01

    Fugitive dust from 18 unsurfaced roadways in Missouri was sampled using a novel cyclonic fugitive dust collector designed to obtain suspended bulk samples for analysis. The samples were analyzed for trace metals, Fe and Al, particle size, and mineralogy to characterize the similarities and differences between roadways. Thirteen roads were located in the Viburnum Trend (VT) mining district, where there has been a history of contaminant metal loading of local soils; the remaining five roads were located southwest of the VT district in a similar rural setting, but without any mining or industrial process that might contribute to trace metal enrichment. Comparison of these two groups shows that trace metal concentrations are higher for dusts collected in the VT district. Lead is the dominant trace metal found in VT district dusts, representing on average 79% of the total trace metal concentration, and was found to be moderately to strongly enriched relative to unsurfaced roads in the non-VT area. Fugitive road dust concentrations calculated for the VT area substantially exceed the 2008 Federal ambient air standard of 0.15 μg m-3 for Pb. The pattern of trace metal contamination in fugitive dust from VT district roads is similar to the trace metal concentration patterns observed for soils measured more than 40 years ago, indicating that Pb in the region is persistent as a long-term soil contaminant. Published by Elsevier Ltd.

  18. Fenton and Fenton-like oxidation of pesticide acetamiprid in water samples: kinetic study of the degradation and optimization using response surface methodology.

    Science.gov (United States)

    Mitsika, Elena E; Christophoridis, Christophoros; Fytianos, Konstantinos

    2013-11-01

    The aims of this study were (a) to evaluate the degradation of acetamiprid using the Fenton reaction, (b) to investigate the effect of different concentrations of H2O2 and Fe(2+), initial pH and various iron salts on the degradation of acetamiprid, and (c) to apply response surface methodology for the evaluation of the degradation kinetics. The kinetic study revealed a two-stage process, described by pseudo-first- and second-order kinetics. Different H2O2:Fe(2+) molar ratios were examined for their effect on acetamiprid degradation kinetics. A ratio of 3 mg L(-1) Fe(2+) : 40 mg L(-1) H2O2 was found to completely remove acetamiprid in less than 10 min. The degradation rate was faster at lower pH, with the optimal value at pH 2.9, while Mohr salt appeared to degrade acetamiprid faster. A central composite design was selected in order to observe the effects of the initial Fe(2+) and H2O2 concentrations on acetamiprid degradation kinetics. A quadratic model fitted the experimental data, with satisfactory regression and fit. The most significant effect on the degradation of acetamiprid was induced by the ferrous iron concentration, followed by H2O2. Optimization, aiming to minimize the applied ferrous concentration and the process time, proposed a ratio of 7.76 mg L(-1) Fe(II) : 19.78 mg L(-1) H2O2. DOC is reduced much more slowly and requires more than 6 h of processing for 50% degradation. The use of zero-valent iron demonstrated fast kinetic rates, with acetamiprid degradation occurring in 10 min, and effective DOC removal. Copyright © 2013 Elsevier Ltd. All rights reserved.
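
    A central composite design is typically used to fit a quadratic (second-order) response surface of the form k = b0 + b1*Fe + b2*H2O2 + b11*Fe^2 + b22*H2O2^2 + b12*Fe*H2O2, which is then optimized. The hedged sketch below shows that fitting step with ordinary least squares on made-up design points, not the study's data.

        import numpy as np

        # Hypothetical central-composite design points: (Fe2+, H2O2) in mg/L,
        # with a made-up pseudo-first-order rate constant k (1/min).
        fe   = np.array([2.0, 2.0, 8.0, 8.0, 1.0, 9.0, 5.0, 5.0, 5.0])
        h2o2 = np.array([15., 45., 15., 45., 30., 30., 10., 50., 30.])
        k    = np.array([.188, .248, .212, .272, .245, .277, .173, .253, .293])

        # Design matrix for the full quadratic model.
        X = np.column_stack([np.ones_like(fe), fe, h2o2,
                             fe ** 2, h2o2 ** 2, fe * h2o2])
        coef, *_ = np.linalg.lstsq(X, k, rcond=None)
        b0, b1, b2, b11, b22, b12 = coef

        # Stationary point of the fitted surface (candidate optimum; in
        # practice check it is a maximum and lies inside the design region).
        A = np.array([[2 * b11, b12], [b12, 2 * b22]])
        fe_opt, h2o2_opt = np.linalg.solve(A, -np.array([b1, b2]))
        print("coefficients:", np.round(coef, 5))
        print("stationary point (Fe, H2O2):",
              round(fe_opt, 2), round(h2o2_opt, 2))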

  19. Study of New Analytical Methodologies for the Analysis of Polychlorinated Dibenzo-p-dioxins (PCDDs) and Polychlorinated Dibenzofurans (PCDFs) by Quadrupole Ion Storage Tandem-in-time Mass Spectrometry. Application to Environmental Samples

    International Nuclear Information System (INIS)

    Sanz Chichon, M. P.

    2008-01-01

    Two alternative analytical methodologies have been developed for the analysis of polychlorinated dibenzo-p-dioxins (PCDDs) and dibenzofurans (PCDFs) in environmental samples. The techniques studied were: pressurized fluid extraction (PFE) and microwave-assisted extraction (MAE) versus Soxhlet extraction; the automated Power-Prep(TM) system versus conventional cleanup using open chromatographic columns with different adsorbents; and the application of tandem mass spectrometry (HRGC-MS/MS) versus high resolution mass spectrometry (HRGC-HRMS) for PCDD/F detection and quantification. (Author) 233 refs

  20. Comparison of open-flow microperfusion and microdialysis methodologies when sampling topically applied fentanyl and benzoic acid in human dermis ex vivo

    DEFF Research Database (Denmark)

    Holmgaard, R; Benfeldt, E; Nielsen, J B

    2012-01-01

    . The second purpose was to provide guidance to researchers in choosing the most efficient method for a given penetrant and give suggestions concerning critical choices for successful dermal sampling. METHODS: The dOFM and dMD techniques are compared in equal set-ups using three probe-types (one dOFM probe...... experimental conditions. The methods each had advantages and limitations in technical, practical and hands-on comparisons. CONCLUSION: When planning a study of cutaneous penetration the advantages and limitations of each probe-type have to be considered in relation to the scientific question posed, the physico...

  1. Optimisation methodology in the chiral and achiral separation in electrokinetic chromatography in the case of a multicomponent sample of dansyl amino acids.

    Science.gov (United States)

    Giuffrida, Alessandro; Messina, Marianna; Contino, Annalinda; Cucinotta, Vincenzo

    2013-11-01

    Two different chiral selectors synthesised in our laboratory were used to test the possibility of separating a sample consisting of ten enantiomeric pairs of dansyl derivatives of α-amino acids by electrokinetic chromatography. Through a careful choice of the experimental conditions it was possible to observe the peaks of all twenty analytes, though only partly resolved. As part of this strategy, a procedure for identifying the individual peaks in the electropherograms, called LACI (lastly added component identification), was developed. Copyright © 2013 Elsevier B.V. All rights reserved.

  2. Methodology of simultaneous analysis of Uranium and Thorium by nuclear and atomic techniques. Application to the determination of Uranium and Thorium in mineralogical samples

    International Nuclear Information System (INIS)

    Fakhi, S.

    1988-01-01

    This work concerns essentially the potential applications of the 100 kW nuclear reactor of the Strasbourg Nuclear Research Centre to the neutron activation analysis of uranium and thorium. Uranium was determined using 239-U, 239-Np, fission products or delayed neutrons. Thorium was detected by means of 233-Th or 233-Pa. Detection of 239-U and 233-Th allows a rapid, non-destructive analysis of uranium and thorium, with maximum sensitivities of 78 ng for uranium and 160 ng for thorium. Determination based on the detection of 239-Np and 233-Pa requires a selective chemical separation of each of these radionuclides; liquid-liquid extraction made it possible to develop rapid and quantitative separation methods, with sensitivities after extraction reaching 30 ng for uranium and 50 ng for thorium. The study of fission-product separation led to the elaboration of La, Ce and Nd extractions, whose application to uranium determination gives satisfactory results. A rapid determination method with a sensitivity of 0.35 μg was developed with the help of delayed-neutron counting. These different methods were applied to the determination of uranium and thorium in samples from the Oklo mine in Gabon. Analyses of these samples by atomic absorption spectroscopy and by the proton-induced X-ray emission (PIXE) method confirm that the neutron activation analysis methods are reliable. 37 figs., 14 tabs., 50 refs

  3. A dispersive liquid-liquid microextraction methodology for copper (II) in environmental samples prior to determination using microsample injection flame atomic absorption spectrometry.

    Science.gov (United States)

    Alothman, Zeid A; Habila, Mohamed; Yilmaz, Erkan; Soylak, Mustafa

    2013-01-01

    A simple, environmentally friendly, and efficient dispersive liquid-liquid microextraction method combined with microsample injection flame atomic absorption spectrometry was developed for the separation and preconcentration of Cu(II). 2-(5-Bromo-2-pyridylazo)-5-(diethylamino)phenol (5-Br-PADAP) was used to form a hydrophobic complex of Cu(II) ions in the aqueous phase before extraction. To extract the Cu(II)-5-Br-PADAP complex from the aqueous phase to the organic phase, 2.0 mL of acetone as a disperser solvent and 200 μL of chloroform as an extraction solvent were used. The influences of important analytical parameters, such as the pH, types and volumes of the extraction and disperser solvents, amount of chelating agent, sample volume, and matrix effects, on the microextraction procedure were evaluated and optimized. Using the optimal conditions, the LOD, LOQ, preconcentration factor, and RSD were determined to be 1.4 μg/L, 4.7 μg/L, 120, and 6.5%, respectively. The accuracy of the proposed method was investigated using standard addition/recovery tests. The analysis of certified reference materials produced satisfactory analytical results. The developed method was applied for the determination of Cu in real samples.

  4. Random and systematic sampling error when hooking fish to monitor skin fluke (Benedenia seriolae) and gill fluke (Zeuxapta seriolae) burden in Australian farmed yellowtail kingfish (Seriola lalandi).

    Science.gov (United States)

    Fensham, J R; Bubner, E; D'Antignana, T; Landos, M; Caraguel, C G B

    2018-05-01

    The Australian farmed yellowtail kingfish (Seriola lalandi, YTK) industry monitors skin fluke (Benedenia seriolae) and gill fluke (Zeuxapta seriolae) burden by pooling the fluke count of 10 hooked YTK. The random and systematic error of this sampling strategy was evaluated to assess its potential impact on treatment decisions. Fluke abundance (fluke count per fish) in a study cage (estimated 30,502 fish) was assessed five times using the current sampling protocol, and its repeatability was estimated using the repeatability coefficient (CR) and the coefficient of variation (CV). Individual body weight, fork length, fluke abundance, prevalence, intensity (fluke count per infested fish) and density (fluke count per kg of fish) were compared between 100 hooked and 100 seined YTK (assumed representative of the entire population) to estimate potential selection bias. Depending on the fluke species and age category, CR (the expected difference in parasite count between two sampling iterations) ranged from 0.78 to 114 flukes per fish. Capturing YTK by hooking increased the selection of fish of a weight and length in the lowest 5th percentile of the cage (RR = 5.75, 95% CI: 2.06-16.03, P-value = 0.0001). These lower-end YTK carried on average an extra 31 juvenile and 6 adult Z. seriolae per kg of fish, and an extra 3 juvenile and 0.4 adult B. seriolae per kg of fish, compared to the rest of the cage population. Hooking thus biased sampling towards the smallest and most heavily infested fish in the population, resulting in poor repeatability (more variability amongst sampled fish) and an overestimation of parasite burden in the population. In this particular commercial situation these findings supported the health management program, as it is an underestimation of parasite burden that could have a production impact on the study population. In instances where fish populations and parasite burdens are more homogeneous, sampling error may be less severe. Sampling error when capturing fish
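
    Both repeatability measures named above are standard: the coefficient of variation (CV) of the repeated pooled counts, and a Bland-Altman-style repeatability coefficient CR = 1.96 * sqrt(2) * SD (about 2.77 * SD), the difference expected between two repeat samplings 95% of the time. A hedged sketch with made-up counts, not the study's data:

        import numpy as np

        # Hypothetical mean fluke counts from five repeats of the
        # 10-fish pooled sampling protocol on the same cage.
        repeat_means = np.array([41.0, 55.0, 38.0, 62.0, 47.0])

        sd = repeat_means.std(ddof=1)
        cv = sd / repeat_means.mean()
        cr = 1.96 * np.sqrt(2) * sd  # ~2.77 * SD

        print(f"CV: {cv:.1%}")
        print(f"CR: {cr:.1f} flukes (expected difference between "
              f"two sampling iterations, 95% of the time)")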

  5. Systematic screening with information and home sampling for genital Chlamydia trachomatis infections in young men and women in Norway: a randomized controlled trial.

    Science.gov (United States)

    Kløvstad, Hilde; Natås, Olav; Tverdal, Aage; Aavitsland, Preben

    2013-01-23

    As most genital Chlamydia trachomatis infections are asymptomatic, many patients do not seek health care for testing. Infections remain undiagnosed and untreated. We studied whether screening with information and home sampling resulted in more young people getting tested, diagnosed and treated for chlamydia in the three months following the intervention compared to the current strategy of testing in the health care system. We conducted a population based randomized controlled trial among all persons aged 18-25 years in one Norwegian county (41 519 persons). 10 000 persons (intervention) received an invitation by mail with chlamydia information and a mail-back urine sampling kit. 31 519 persons received no intervention and continued with usual care (control). All samples from both groups were analysed in the same laboratory. Information on treatment was obtained from the Norwegian Prescription Database (NorPD). We estimated risk ratios and risk differences of being tested, diagnosed and treated in the intervention group compared to the control group. In the intervention group 16.5% got tested and in the control group 3.4%, risk ratio 4.9 (95% CI 4.5-5.2). The intervention led to 2.6 (95% CI 2.0-3.4) times as many individuals being diagnosed and 2.5 (95% CI 1.9-3.4) times as many individuals receiving treatment for chlamydia compared to no intervention in the three months following the intervention. In Norway, systematic screening with information and home sampling results in more young people being tested, diagnosed and treated for chlamydia in the three months following the intervention than the current strategy of testing in the health care system. However, the study has not established that the intervention will reduce the chlamydia prevalence or the risk of complications from chlamydia.
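
    The headline risk ratio can be reproduced from the reported figures. Using the counts implied by the abstract (16.5% of the 10 000 invited versus 3.4% of the 31 519 controls tested; back-calculated, so the rounding is ours), the standard log-method confidence interval recovers the published RR 4.9 (95% CI 4.5-5.2):

        import math

        def risk_ratio_ci(a, n1, b, n2, z=1.96):
            # Risk ratio of two proportions with a log-method 95% CI.
            rr = (a / n1) / (b / n2)
            se_log = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
            lo = math.exp(math.log(rr) - z * se_log)
            hi = math.exp(math.log(rr) + z * se_log)
            return rr, lo, hi

        # Counts back-calculated from the reported percentages.
        tested_intervention = round(0.165 * 10_000)  # 1650
        tested_control = round(0.034 * 31_519)       # ~1072
        rr, lo, hi = risk_ratio_ci(tested_intervention, 10_000,
                                   tested_control, 31_519)
        print(f"RR = {rr:.1f} (95% CI {lo:.1f}-{hi:.1f})")  # 4.9 (4.5-5.2)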

  6. Synthetic and non-synthetic anthropogenic fibers in a river under the impact of Paris Megacity: Sampling methodological aspects and flux estimations.

    Science.gov (United States)

    Dris, Rachid; Gasperi, Johnny; Rocher, Vincent; Tassin, Bruno

    2018-03-15

    Processed fibers are highly present in our daily life and can be natural, artificial (regenerated cellulose) or synthetic (made from petrochemicals). Their widespread use leads inevitably to high contamination of the environment. Previous studies focused on plastic particles regardless of their type or shape, as long as their size lay between 330 μm and 5 mm. By contrast, this study focuses exclusively on fibers, using a net with a smaller mesh size (80 μm) to sample freshwater, and considers all processed organic fibers irrespective of their nature. First, the short-term temporal variability of the fibers in the environment was assessed: exposing the sampling net for 1 min gave a coefficient of variation of approximately 45% (n = 6), while it was only 26% (n = 6) for a 3 min exposure. Assessment of the distribution through the river cross-section showed a possible difference in concentrations between the middle of the water surface and the river banks, which could be attributed to the intense river traffic within the Paris Megacity. The vertical variability seems negligible, as turbulence and current conditions homogenize the distribution of the fibers. Monthly monitoring showed concentrations of 100.6 ± 99.9 fibers·m-3 in the Marne River and of 48.5 ± 98.5, 27.9 ± 26.3, 27.9 ± 40.3 and 22.1 ± 25.3 fibers·m-3 from the upstream to the downstream points on the Seine River. Once these concentrations are converted into fluxes, the impact generated by the Paris Megacity cannot be distinguished; investigations of the role of sedimentation and deposition on the banks are required. This study helped fill some major knowledge gaps regarding fibers in rivers: their sampling, occurrence, spatial-temporal distribution and fluxes. Future studies are encouraged to include both synthetic and non-synthetic fibers. Copyright © 2017 Elsevier B.V. All rights reserved.
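
    Converting the measured concentrations into fluxes is a single multiplication by river discharge. A hedged sketch follows; the discharge figure is an arbitrary placeholder, not a value from the study.

        # Flux (fibers/s) = concentration (fibers/m^3) * discharge (m^3/s).
        concentration = 100.6  # fibers per m^3 (Marne River monthly mean)
        discharge = 65.0       # m^3/s -- placeholder, not from the study

        flux_per_second = concentration * discharge
        flux_per_day = flux_per_second * 86_400
        print(f"{flux_per_second:,.0f} fibers/s "
              f"~ {flux_per_day:,.3e} fibers/day")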

  7. Community-based intermittent mass testing and treatment for malaria in an area of high transmission intensity, western Kenya: study design and methodology for a cluster randomized controlled trial.

    Science.gov (United States)

    Samuels, Aaron M; Awino, Nobert; Odongo, Wycliffe; Abong'o, Benard; Gimnig, John; Otieno, Kephas; Shi, Ya Ping; Were, Vincent; Allen, Denise Roth; Were, Florence; Sang, Tony; Obor, David; Williamson, John; Hamel, Mary J; Patrick Kachur, S; Slutsker, Laurence; Lindblade, Kim A; Kariuki, Simon; Desai, Meghna

    2017-06-07

    Most human Plasmodium infections in western Kenya are asymptomatic and are believed to contribute importantly to malaria transmission. Elimination of asymptomatic infections requires active treatment approaches, such as mass testing and treatment (MTaT) or mass drug administration (MDA), as infected persons do not seek care for their infection. Evaluations of community-based approaches that are designed to reduce malaria transmission require careful attention to study design to ensure that important effects can be measured accurately. This manuscript describes the study design and methodology of a cluster-randomized controlled trial to evaluate a MTaT approach for malaria transmission reduction in an area of high malaria transmission. Ten health facilities in western Kenya were purposively selected for inclusion. The communities within 3 km of each health facility were divided into three clusters of approximately equal population size. Two clusters around each health facility were randomly assigned to the control arm, and one to the intervention arm. Three times per year for 2 years, after the long and short rains, and again before the long rains, teams of community health volunteers visited every household within the intervention arm, tested all consenting individuals with malaria rapid diagnostic tests, and treated all positive individuals with an effective anti-malarial. The effect of mass testing and treatment on malaria transmission was measured through population-based longitudinal cohorts, outpatient visits for clinical malaria, periodic population-based cross-sectional surveys, and entomological indices.

  8. A protocol for a three-arm cluster randomized controlled superiority trial investigating the effects of two pedagogical methodologies in Swedish preschool settings on language and communication, executive functions, auditive selective attention, socioemotional skills and early maths skills.

    Science.gov (United States)

    Gerholm, Tove; Hörberg, Thomas; Tonér, Signe; Kallioinen, Petter; Frankenberg, Sofia; Kjällander, Susanne; Palmer, Anna; Taguchi, Hillevi Lenz

    2018-06-19

    During the preschool years, children develop abilities and skills in areas crucial for later success in life. These abilities include language, executive functions, attention, and socioemotional skills. The pedagogical methods used in preschools hold the potential to enhance these abilities, but our knowledge of which pedagogical practices aid which abilities, and for which children, is limited. The aim of this paper is to describe an intervention study designed to evaluate and compare two pedagogical methodologies in terms of their effect on the above-mentioned skills in Swedish preschool children. The study is a randomized controlled trial (RCT) in which two pedagogical methodologies were tested to evaluate how they enhanced children's language, executive functions and attention, socioemotional skills, and early maths skills during an intensive 6-week intervention. Eighteen preschools comprising 28 units and 432 children were enrolled in a municipality close to Stockholm, Sweden. The children were between 4;0 and 6;0 years old, and each preschool unit was randomly assigned to either of the interventions or to the control group. Background information on all children was collected via questionnaires completed by parents and preschools. Pre- and post-intervention testing consisted of a test battery including tests of language, executive functions, selective auditive attention, socioemotional skills and early maths skills. The interventions consisted of 6 weeks of intensive practice of either a socioemotional and material learning paradigm (SEMLA), in which group-based activities and interactional structures were the main focus, or an individual, digitally implemented attention and maths training paradigm, which also included a set of self-regulation practices (DIL). All preschools were evaluated with the ECERS-3. If this intervention study shows evidence of a difference between group-based learning paradigms and individual training of specific skills in terms of

  9. Ultrasonic energy enhanced the efficiency of an advanced extraction methodology for enrichment of trace levels of copper in serum samples of patients having neurological disorders.

    Science.gov (United States)

    Arain, Mariam S; Kazi, Tasneem G; Afridi, Hassan I; Ali, Jamshed; Akhtar, Asma

    2017-07-01

    An innovative ultrasound-assisted dual dispersive ionic liquid microextraction (UDIL-μE) method was developed for the enrichment of trace levels of copper ions (Cu2+) in the serum (blood) of patients suffering from different neurological disorders. The enriched metal ions were subjected to flame atomic absorption spectrometry (FAAS). In the UDIL-μE method, the extraction solvent, the ionic liquid 1-butyl-3-methylimidazolium hexafluorophosphate [C4mim][PF6], was dispersed into the aqueous samples using an ultrasonic bath. 1-(2-Pyridylazo)-2-naphthol (PAN) was used as the ligand for complexation of the Cu ion in the IL (the extracting solvent). The variables that affect the extraction process, such as sonication time, pH, concentration of complexing agent, time and rate of centrifugation, and IL volume, were optimized. Under favourable conditions, the enhancement factor (EF) and detection limit (LOD) were found to be 31 and 0.36 μg L-1, respectively. The reliability of the proposed method was checked via the relative standard deviation (%RSD), which was found to be <5%. The accuracy of the developed procedure was verified using a certified reference material (CRM) of blood serum. The developed procedure was applied successfully to the analysis of Cu ion concentrations in the blood serum of subjects with different neurological disorders and of referents of the same age group. The Cu ion levels were twofold higher in the serum samples of neurological disorder patients than in normal referents of the same age group. Copyright © 2016. Published by Elsevier B.V.

  10. Random Sampling of Squamate Reptiles in Spanish Natural Reserves Reveals the Presence of Novel Adenoviruses in Lacertids (Family Lacertidae) and Worm Lizards (Amphisbaenia).

    Science.gov (United States)

    Szirovicza, Leonóra; López, Pilar; Kopena, Renáta; Benkő, Mária; Martín, José; Pénzes, Judit J

    2016-01-01

    Here, we report the results of a large-scale PCR survey on the prevalence and diversity of adenoviruses (AdVs) in samples collected randomly from free-living reptiles. On the territories of the Guadarrama Mountains National Park in Central Spain and of the Chafarinas Islands in North Africa, cloacal swabs were taken from 318 specimens of eight native species representing five squamate reptilian families. The healthy-looking animals had been captured temporarily for physiological and ethological examinations, after which they were released. We found 22 AdV-positive samples in representatives of three species, all from Central Spain. Sequence analysis of the PCR products revealed the existence of three hitherto unknown AdVs in 11 Carpetane rock lizards (Iberolacerta cyreni), nine Iberian worm lizards (Blanus cinereus), and two Iberian green lizards (Lacerta schreiberi), respectively. Phylogeny inference showed every novel putative virus to be a member of the genus Atadenovirus. This is the very first description of the occurrence of AdVs in amphisbaenian and lacertid hosts. Unlike all squamate atadenoviruses examined previously, two of the novel putative AdVs had A+T rich DNA, a feature generally deemed to mirror previous host switch events. Our results shed new light on the diversity and evolution of atadenoviruses.

  11. Random Sampling of Squamate Reptiles in Spanish Natural Reserves Reveals the Presence of Novel Adenoviruses in Lacertids (Family Lacertidae and Worm Lizards (Amphisbaenia.

    Directory of Open Access Journals (Sweden)

    Leonóra Szirovicza

    Full Text Available Here, we report the results of a large-scale PCR survey on the prevalence and diversity of adenoviruses (AdVs) in samples collected randomly from free-living reptiles. On the territories of the Guadarrama Mountains National Park in Central Spain and of the Chafarinas Islands in North Africa, cloacal swabs were taken from 318 specimens of eight native species representing five squamate reptilian families. The healthy-looking animals had been captured temporarily for physiological and ethological examinations, after which they were released. We found 22 AdV-positive samples in representatives of three species, all from Central Spain. Sequence analysis of the PCR products revealed the existence of three hitherto unknown AdVs in 11 Carpetane rock lizards (Iberolacerta cyreni), nine Iberian worm lizards (Blanus cinereus), and two Iberian green lizards (Lacerta schreiberi), respectively. Phylogeny inference showed every novel putative virus to be a member of the genus Atadenovirus. This is the very first description of the occurrence of AdVs in amphisbaenian and lacertid hosts. Unlike all squamate atadenoviruses examined previously, two of the novel putative AdVs had A+T-rich DNA, a feature generally deemed to mirror previous host-switch events. Our results shed new light on the diversity and evolution of atadenoviruses.

  12. On methodology

    DEFF Research Database (Denmark)

    Cheesman, Robin; Faraone, Roque

    2002-01-01

    This is an English version of the methodology chapter in the authors' book "El caso Berríos: Estudio sobre información errónea, desinformación y manipulación de la opinión pública".

  13. A simple and sensitive methodology for voltammetric determination of valproic acid in human blood plasma samples using 3-aminopropyltriethoxysilane coated magnetic nanoparticles modified pencil graphite electrode.

    Science.gov (United States)

    Zabardasti, Abedin; Afrouzi, Hossein; Talemi, Rasoul Pourtaghavi

    2017-07-01

    In this work, we have prepared a nanomaterial-modified pencil graphite electrode for the sensing of valproic acid (VA) by immobilizing 3-aminopropyltriethoxysilane-coated magnetic nanoparticles (APTES-MNPs) on the pencil graphite electrode (PGE) surface. Electrochemical studies indicated that the APTES-MNPs efficiently increased the electron-transfer kinetics between VA and the electrode, and that the free NH2 groups of the APTES on the outer surface of the magnetic nanoparticles can interact with the carboxyl groups of VA. On this basis, we propose a sensitive, rapid and convenient electrochemical method for VA determination. Under the optimized conditions, the reduction peak current of VA is proportional to its concentration in the range of 1.0 (±0.2) to 100.0 (±0.3) ppm, with a detection limit of 0.4 (±0.1) ppm. The whole sensor fabrication process was characterized by cyclic voltammetry (CV) and electrochemical impedance spectroscopy (EIS), using [Fe(CN)6]3-/4- as an electrochemical redox indicator. The prepared modified electrode showed several advantages, such as high sensitivity, selectivity, ease of preparation, and good repeatability, reproducibility and stability. The proposed method was applied to the determination of valproic acid in blood plasma samples, and the results obtained were satisfactorily accurate. Copyright © 2017. Published by Elsevier B.V.

  14. Occupational exposure to mineral oil metalworking fluid (MWFs) mist: Development of new methodologies for mist sampling and analysis. Results from an inter-laboratory comparison

    International Nuclear Information System (INIS)

    Huynh, C Khanh; Herrera, H; Parrat, J; Wolf, R; Perret, V

    2009-01-01

    Metalworking fluids (MWFs) are widely used in the machining sector, a large professional activity in Switzerland, in particular in the fine mechanics and watchmaking industry. France proposes a permissible exposure limit (PEL) of 1 mg·m-3 of aerosol. The American Conference of Governmental Industrial Hygienists (ACGIH) sets its value at 5 mg·m-3, but a proposal to lower the standard ('intended changes') to 0.2 mg·m-3 of aerosol has been pending since 2001 and has not become a recognized threshold limit value for exposure. Since 2003, the new Swiss PEL (MAK) recommendations are 0.2 mg·m-3 of aerosol (oil with boiling point > 350 °C, without additives) and/or 20 mg·m-3 of oil aerosol plus vapour for medium or light oils. To evaluate evaporative losses of sampled oil, the German Berufsgenossenschaftliches Institut fuer Arbeitssicherheit (BGIA) recommends the use of an XAD-2 cartridge behind the filter. The method works well for MWFs in a clean occupational atmosphere free from interference by light cleaning solvents such as White Spirit. But in real situations, with machine-shop atmospheres contaminated by traces of White Spirit, the BGIA method fails to estimate MWF levels correctly (over-estimation). In this paper, we propose a new approach intended to measure both oil vapours and aerosols. Five inter-laboratory comparisons are discussed, based on the production of oil mist in an experimental chamber under controlled conditions.

  15. Transport of p-aminohippuric acid (3H-PAH), inulin and dextran out of the cranial cavity: A methodological study using intraventricular injection and sample combustion

    International Nuclear Information System (INIS)

    Jakobson, Aa. M.

    1987-01-01

    Material injected into the cerebral ventricles can leave the cerebrospinal fluid (CSF) but remain in the cranial cavity. To analyze the disappearance of 3H- and 14C-labelled material from the cranial cavity, such material was injected into the lateral ventricles together with a bulk-flow marker labelled with the other radionuclide. In the present pilot study, 3H-PAH and 14C-inulin were used. Five μL of a mixture was injected into each lateral cerebral ventricle in rats, which were killed at various intervals. The whole skull was analyzed without opening the CSF space, after homogenization in the deep-frozen state. The samples were combusted and analyzed by liquid scintillation counting. Probenecid, injected intraperitoneally, inhibited the removal of 3H-PAH from the skull cavity, as anticipated. Immediately after the intraventricular injection, however, 3H-PAH was transiently retained, probably by uptake into actively transporting tissue. After injection of probenecid, this delay in removal was reduced. The difference in disappearance rate between 3H-PAH and 14C-inulin was estimated by comparing the 3H/14C ratio in the skulls with that in the injected solution, which appeared to be a better method than comparing the recovery of each compound. (author)

  16. A facile and selective approach for enrichment of l-cysteine in human plasma sample based on zinc organic polymer: Optimization by response surface methodology.

    Science.gov (United States)

    Bahrani, Sonia; Ghaedi, Mehrorang; Ostovan, Abbas; Javadian, Hamedreza; Mansoorkhani, Mohammad Javad Khoshnood; Taghipour, Tahere

    2018-02-05

    In this research, a facile and selective method is described for extracting l-cysteine (l-Cys), an α-amino acid essential for anti-ageing that plays an important role in human health, from human blood plasma samples. The importance of this research lies in the mild, albeit time-consuming, synthesis of a zinc organic polymer (Zn-MOP) as an adsorbent, and in the evaluation of its ability to efficiently enrich l-Cys by an ultrasound-assisted dispersive micro solid-phase extraction (UA-DMSPE) method. The structure of Zn-MOP was investigated by FT-IR, XRD and SEM. Analysis of variance (ANOVA) was applied to the experimental data to reach the optimum conditions. Quantification of l-Cys was carried out by high performance liquid chromatography with UV detection set at λ = 230 nm. The calibration graph showed a linear response towards l-Cys concentrations in the range of 4.0-1000 μg/L (r2 = 0.999), with a low limit of detection (0.76 μg/L, S/N = 3) and RSD ≤ 2.18 (n = 3). The results revealed the applicability and high performance of this novel strategy for detecting trace l-Cys with Zn-MOP in complicated matrices. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Diagnostic-test evaluation of immunoassays for anti-Toxoplasma gondii IgG antibodies in a random sample of Mexican population.

    Science.gov (United States)

    Caballero-Ortega, Heriberto; Castillo-Cruz, Rocío; Murieta, Sandra; Ortíz-Alegría, Luz Belinda; Calderón-Segura, Esther; Conde-Glez, Carlos J; Cañedo-Solares, Irma; Correa, Dolores

    2014-05-14

    There are few articles on the evaluation of Toxoplasma gondii serological tests. Besides, commercially available tests are not always useful and are expensive for studies in open populations. The aim of this study was to evaluate an in-house ELISA and a western blot for IgG antibodies in a representative sample of people living in Mexico. Three hundred and five serum samples were randomly selected from two national seroepidemiological survey banks; they were taken from men and women of all ages and from all areas of the country. The ELISA cut-off was established as the mean plus three standard deviations of negative samples. Western blots were analysed by two experienced technicians, and positivity was established according to the presence of at least three diagnostic bands. A commercial ELISA kit was used as a third test. Two reference standards were constructed: one using concordant results of two assays, leaving the evaluated test out (OUT), and one in which the evaluated test was included (IN), with at least two concordant results defining the diagnosis. The lowest values of the diagnostic parameters were obtained with the OUT reference standard: the in-house ELISA had 96.9% sensitivity, 62.1% specificity, 49.6% PPV, 98.1% NPV and 71.8% accuracy, while the western blot presented values of 81.8%, 89.7%, 84.0%, 88.2% and 86.6%, and the best kappa coefficient (0.72-0.82). The in-house ELISA is useful for screening people in Mexico, due to its high sensitivity, while the western blot may be used to confirm diagnosis. These techniques might prove useful in other Latin American countries.
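
    Every metric reported above (sensitivity, specificity, PPV, NPV, accuracy and kappa) derives from a 2x2 table of test results against the reference standard. A hedged sketch with placeholder counts follows, since the abstract does not give the underlying table.

        def diagnostic_metrics(tp, fp, fn, tn):
            # Standard diagnostic-test metrics from a 2x2 table.
            n = tp + fp + fn + tn
            sens = tp / (tp + fn)
            spec = tn / (tn + fp)
            ppv = tp / (tp + fp)
            npv = tn / (tn + fn)
            acc = (tp + tn) / n
            # Cohen's kappa: agreement beyond chance.
            p_obs = acc
            p_exp = ((tp + fp) * (tp + fn)
                     + (fn + tn) * (fp + tn)) / n ** 2
            kappa = (p_obs - p_exp) / (1 - p_exp)
            return sens, spec, ppv, npv, acc, kappa

        # Placeholder counts for illustration only, not the study's table.
        names = ("sens", "spec", "PPV", "NPV", "accuracy", "kappa")
        for name, val in zip(names,
                             diagnostic_metrics(tp=90, fp=25, fn=10, tn=180)):
            print(f"{name}: {val:.3f}")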

  18. Employing a Multi-level Approach to Recruit a Representative Sample of Women with Recent Gestational Diabetes Mellitus into a Randomized Lifestyle Intervention Trial.

    Science.gov (United States)

    Nicklas, Jacinda M; Skurnik, Geraldine; Zera, Chloe A; Reforma, Liberty G; Levkoff, Sue E; Seely, Ellen W

    2016-02-01

    The postpartum period is a window of opportunity for diabetes prevention in women with recent gestational diabetes (GDM), but recruitment for clinical trials during this period of life is a major challenge. We adapted a social-ecologic model to develop a multi-level recruitment strategy at the macro (high or institutional), meso (mid or provider), and micro (individual) levels. Our goal was to recruit 100 women with recent GDM into the Balance after Baby randomized controlled trial over a 17-month period. Participants were asked to attend three in-person study visits at 6 weeks, 6 months, and 12 months postpartum. They were randomized into a control arm or a web-based intervention arm at the end of the baseline visit at six weeks postpartum. At the end of the recruitment period, we compared the population characteristics of our enrolled subjects to the entire population of women with GDM delivering at Brigham and Women's Hospital (BWH). We successfully recruited 107 of 156 (69%) women assessed for eligibility, with the majority (92) recruited during pregnancy at a mean of 30 (SD ± 5) weeks of gestation, and 15 recruited postpartum at a mean of 2 (SD ± 3) weeks postpartum. 78 subjects attended the initial baseline visit, and 75 subjects were randomized into the trial at a mean of 7 (SD ± 2) weeks postpartum. The recruited subjects were similar in age and race/ethnicity to the total population of 538 GDM deliveries at BWH over the 17-month recruitment period. Our multi-level approach allowed us to successfully meet our recruitment goal and recruit a representative sample of women with recent GDM. We believe that our most successful strategies included using a dedicated in-person recruiter, integrating recruitment into the clinical flow, allowing for flexibility in recruitment, minimizing barriers to participation, and using an opt-out strategy with providers. Although the majority of women were recruited while pregnant, women recruited in the early postpartum period were

  19. Assessing the suitability of summary data for two-sample Mendelian randomization analyses using MR-Egger regression: the role of the I2 statistic.

    Science.gov (United States)

    Bowden, Jack; Del Greco M, Fabiola; Minelli, Cosetta; Davey Smith, George; Sheehan, Nuala A; Thompson, John R

    2016-12-01

    MR-Egger regression has recently been proposed as a method for Mendelian randomization (MR) analyses incorporating summary-data estimates of causal effect from multiple individual variants, which is robust to invalid instruments. It can be used to test for directional pleiotropy and provides an estimate of the causal effect adjusted for its presence. MR-Egger regression provides a useful additional sensitivity analysis to the standard inverse-variance weighted (IVW) approach, which assumes all variants are valid instruments. Both methods use weights that consider the single nucleotide polymorphism (SNP)-exposure associations to be known rather than estimated. We call this the 'NO Measurement Error' (NOME) assumption. Causal effect estimates from the IVW approach exhibit weak instrument bias whenever the genetic variants utilized violate the NOME assumption, which can be reliably measured using the F-statistic. The effect of NOME violation on MR-Egger regression has yet to be studied. An adaptation of the I² statistic from the field of meta-analysis is proposed to quantify the strength of NOME violation for MR-Egger. It lies between 0 and 1, and indicates the expected relative bias (or dilution) of the MR-Egger causal estimate in the two-sample MR context. We call it I²GX. The method of simulation extrapolation is also explored to counteract the dilution. Their joint utility is evaluated using simulated data and applied to a real MR example. In simulated two-sample MR analyses we show that, when a causal effect exists, the MR-Egger estimate of causal effect is biased towards the null when NOME is violated, and the stronger the violation (as indicated by lower values of I²GX), the stronger the dilution. When additionally all genetic variants are valid instruments, the type I error rate of the MR-Egger test for pleiotropy is inflated and the causal effect is underestimated. Simulation extrapolation is shown to substantially mitigate these adverse effects. We
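    A minimal sketch of the I²GX computation, assuming it is Higgins' I² applied to the SNP-exposure estimates via Cochran's Q (an illustrative reading of the statistic, not the authors' code):

        import numpy as np

        def i2_gx(beta_exposure, se_exposure):
            """I^2_GX: values near 1 indicate little expected dilution of the
            MR-Egger estimate; low values signal NOME violation."""
            b = np.asarray(beta_exposure, dtype=float)
            w = 1.0 / np.asarray(se_exposure, dtype=float) ** 2
            mu = np.sum(w * b) / np.sum(w)           # inverse-variance weighted mean
            q = np.sum(w * (b - mu) ** 2)            # Cochran's Q statistic
            return max(0.0, (q - (len(b) - 1)) / q)  # bounded below at 0

        # Hypothetical summary data for five variants.
        print(i2_gx([0.08, 0.11, 0.06, 0.09, 0.12],
                    [0.010, 0.012, 0.009, 0.011, 0.010]))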

  20. Methodological guidelines

    International Nuclear Information System (INIS)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-01-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objectives of the project have been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs

  1. Methodological guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-04-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objectives of the project have been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs.

  2. Application of bimodal distribution to the detection of changes in uranium concentration in drinking water collected by random daytime sampling method from a large water supply zone.

    Science.gov (United States)

    Garboś, Sławomir; Święcicka, Dorota

    2015-11-01

    The random daytime (RDT) sampling method was used for the first time in the assessment of average weekly exposure to uranium through drinking water in a large water supply zone. The data set of uranium concentrations determined in 106 RDT samples, collected in three runs from the water supply zone in Wroclaw (Poland), cannot be simply described by normal or log-normal distributions. Therefore, a numerical method designed for the detection and calculation of bimodal distributions was applied. The two extracted distributions, containing data from the summer season of 2011 and the winter season of 2012 (n = 72) and from the summer season of 2013 (n = 34), allowed the mean U concentrations in drinking water to be estimated at 0.947 μg/L and 1.23 μg/L, respectively. As the removal efficiency of uranium during the applied treatment process is negligible, the increase in uranium concentration can be explained by a higher U concentration in the surface-infiltration water used for the production of drinking water. During the summer season of 2013, heavy rains were observed in the Lower Silesia region, causing floods over the territory of the entire region. Fluctuations in uranium concentrations in surface-infiltration water can be attributed to releases of uranium from specific sources - migration from phosphate fertilizers and leaching from mineral deposits. Thus, exposure to uranium through drinking water may increase during extreme rainfall events. The average chronic weekly intakes of uranium through drinking water, estimated on the basis of the central values of the extracted normal distributions, accounted for 3.2% and 4.1% of the tolerable weekly intake.
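    One way to detect and characterise such a bimodal structure is to fit a two-component Gaussian mixture and compare it with a single Gaussian by BIC; a sketch on simulated concentrations (the means and spreads below are stand-ins, not the Wroclaw data):

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)
        x = np.concatenate([rng.normal(0.95, 0.10, 72),    # seasons 2011/2012
                            rng.normal(1.23, 0.10, 34)])   # summer 2013
        x = x.reshape(-1, 1)

        gm1 = GaussianMixture(n_components=1, random_state=0).fit(x)
        gm2 = GaussianMixture(n_components=2, random_state=0).fit(x)
        if gm2.bic(x) < gm1.bic(x):                        # lower BIC favours bimodality
            print("bimodal; component means:", np.sort(gm2.means_.ravel()))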

  3. Impact of an educational intervention on women's knowledge and acceptability of human papillomavirus self-sampling: a randomized controlled trial in Cameroon.

    Directory of Open Access Journals (Sweden)

    Gaëtan Sossauer

    Full Text Available OBJECTIVE: Human papillomavirus (HPV) self-sampling (Self-HPV) may be used as a primary cervical cancer screening method in low-resource settings. Our aim was to evaluate whether an educational intervention would improve women's knowledge of and confidence in the Self-HPV method. METHOD: Women aged between 25 and 65 years, eligible for cervical cancer screening, were randomly assigned to receive standard information (control group) or standard information followed by an educational intervention (intervention group). Standard information included explanations about what the test detects (HPV), the link between HPV and cervical cancer, and how to perform HPV self-sampling. The educational intervention consisted of a culturally tailored video about HPV, cervical cancer, Self-HPV and its relevance as a screening test. All participants completed a questionnaire that assessed sociodemographic data, women's knowledge about cervical cancer, and the acceptability of Self-HPV. RESULTS: A total of 302 women were enrolled at 4 health care centers in Yaoundé and the surrounding countryside. 301 women (149 in the control group and 152 in the intervention group) completed the full process and were included in the analysis. Participants who received the educational intervention had significantly higher knowledge about HPV and cervical cancer than the control group (p < 0.05), but no significant difference in Self-HPV acceptability or confidence in the method was noticed between the two groups. CONCLUSION: Educational intervention promotes an increase in knowledge about HPV and cervical cancer. Further investigation should be conducted to determine whether this effect can be sustained beyond the short term and influences screening behavior. TRIAL REGISTRATION: International Standard Randomised Controlled Trial Number (ISRCTN) Register ISRCTN78123709.

  4. Association between Spouse/Child Separation and Migration-Related Stress among a Random Sample of Rural-to-Urban Migrants in Wuhan, China.

    Science.gov (United States)

    Guo, Yan; Chen, Xinguang; Gong, Jie; Li, Fang; Zhu, Chaoyang; Yan, Yaqiong; Wang, Liang

    2016-01-01

    Millions of people move from rural to urban areas in China to pursue new opportunities while leaving their spouses and children at rural homes. Little is known about the impact of migration-related separation on the mental health of these rural migrants in urban China. Survey data from a random sample of rural-to-urban migrants (n = 1113, aged 18-45) from Wuhan were analyzed. The Domestic Migration Stress Questionnaire (DMSQ), an instrument with four subconstructs, was used to measure migration-related stress. The relationship between spouse/child separation and stress was assessed using survey estimation methods to account for the multi-level sampling design. 16.46% of couples were separated from their spouses (spouse separation only) and 25.81% of parents were separated from their children (child separation only). Among the participants who were married and had children, 5.97% were separated from both their spouses and children (double separation). Participants with spouse separation only or double separation did not score significantly higher on the DMSQ than those with no separation. Compared to parents without child separation, parents with child separation scored significantly higher on the DMSQ (mean score = 2.88, 95% CI: [2.81, 2.95] vs. 2.60 [2.53, 2.67], p < .05). Stratified analysis by separation type and by gender indicated that the association was stronger for child separation only and for female participants. Child separation is an important source of migration-related stress, and the effect is particularly strong for migrant women. Public policies and intervention programs should consider these factors to encourage and facilitate the co-migration of parents with their children to mitigate migration-related stress.

  5. Association between Spouse/Child Separation and Migration-Related Stress among a Random Sample of Rural-to-Urban Migrants in Wuhan, China.

    Directory of Open Access Journals (Sweden)

    Yan Guo

    Full Text Available Millions of people move from rural to urban areas in China to pursue new opportunities while leaving their spouses and children at rural homes. Little is known about the impact of migration-related separation on the mental health of these rural migrants in urban China. Survey data from a random sample of rural-to-urban migrants (n = 1113, aged 18-45) from Wuhan were analyzed. The Domestic Migration Stress Questionnaire (DMSQ), an instrument with four subconstructs, was used to measure migration-related stress. The relationship between spouse/child separation and stress was assessed using survey estimation methods to account for the multi-level sampling design. 16.46% of couples were separated from their spouses (spouse separation only) and 25.81% of parents were separated from their children (child separation only). Among the participants who were married and had children, 5.97% were separated from both their spouses and children (double separation). Participants with spouse separation only or double separation did not score significantly higher on the DMSQ than those with no separation. Compared to parents without child separation, parents with child separation scored significantly higher on the DMSQ (mean score = 2.88, 95% CI: [2.81, 2.95] vs. 2.60 [2.53, 2.67], p < .05). Stratified analysis by separation type and by gender indicated that the association was stronger for child separation only and for female participants. Child separation is an important source of migration-related stress, and the effect is particularly strong for migrant women. Public policies and intervention programs should consider these factors to encourage and facilitate the co-migration of parents with their children to mitigate migration-related stress.

  6. Primary prevention of stroke and cardiovascular disease in the community (PREVENTS): Methodology of a health wellness coaching intervention to reduce stroke and cardiovascular disease risk, a randomized clinical trial.

    Science.gov (United States)

    Mahon, Susan; Krishnamurthi, Rita; Vandal, Alain; Witt, Emma; Barker-Collo, Suzanne; Parmar, Priya; Theadom, Alice; Barber, Alan; Arroll, Bruce; Rush, Elaine; Elder, Hinemoa; Dyer, Jesse; Feigin, Valery

    2018-02-01

    Rationale: Stroke is a major cause of death and disability worldwide, yet 80% of strokes can be prevented through modification of risk factors and lifestyle and by medication. While management strategies for primary stroke prevention in individuals at high cardiovascular disease risk are well established, they are underutilized, and existing practice of primary stroke prevention is inadequate. Behavioral interventions are emerging as highly promising strategies to improve cardiovascular disease risk factor management. Health Wellness Coaching is an innovative, patient-focused, cost-effective, multidimensional psychological intervention designed to motivate participants to adhere to recommended medication and lifestyle changes, and has been shown to improve health and enhance well-being. Aims and/or hypothesis: To determine the effectiveness of Health Wellness Coaching for primary stroke prevention in an ethnically diverse sample including Māori, Pacific Island, New Zealand European and Asian participants. Design: A parallel, prospective, randomized, open-treatment, single-blinded end-point trial. Participants include 320 adults with an absolute five-year cardiovascular disease risk ≥ 10%, calculated using the PREDICT web-based clinical tool. Randomization will be to Health Wellness Coaching or usual care groups. Participants randomized to Health Wellness Coaching will receive 15 coaching sessions over nine months. Study outcomes: A substantial relative reduction of five-year cardiovascular disease risk at nine months post-randomization, defined as a 10% relative risk reduction among those at moderate five-year cardiovascular disease risk (10-15%) and 25% among those at high risk (>15%). Discussion: This clinical trial will determine whether Health Wellness Coaching is an effective intervention for reducing modifiable risk factors and hence decreasing the risk of stroke and cardiovascular disease.

  7. Thermal discomfort with cold extremities in relation to age, gender, and body mass index in a random sample of a Swiss urban population

    Directory of Open Access Journals (Sweden)

    Orgül Selim

    2010-06-01

    Full Text Available Background: The aim of this epidemiological study was to investigate the relationship of thermal discomfort with cold extremities (TDCE) to age, gender, and body mass index (BMI) in a Swiss urban population. Methods: In a random population sample of Basel city, 2,800 subjects aged 20-40 years were asked to complete a questionnaire evaluating the extent of cold extremities. Values of cold extremities were based on questionnaire-derived scores. The correlation of age, gender, and BMI to TDCE was analyzed using multiple regression analysis. Results: A total of 1,001 women (72.3% response rate) and 809 men (60% response rate) returned a completed questionnaire. Statistical analyses revealed the following findings: younger subjects suffered more intensely from cold extremities than the elderly, and women suffered more than men (particularly younger women). Slimmer subjects suffered significantly more often from cold extremities than subjects with higher BMIs. Conclusions: Thermal discomfort with cold extremities (a relevant symptom of primary vascular dysregulation) occurs at highest intensity in younger, slimmer women and at lowest intensity in elderly, stouter men.
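    The multiple regression step could look like the following sketch on synthetic data; variable names and effect sizes are assumptions chosen only to mirror the direction of the reported associations, not the study's data.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)
        n = 500
        df = pd.DataFrame({"age": rng.uniform(20, 40, n),
                           "female": rng.integers(0, 2, n),
                           "bmi": rng.normal(23.0, 3.0, n)})
        # Synthetic TDCE score: higher for younger, slimmer, female subjects.
        df["tdce"] = (8.0 - 0.05 * df["age"] - 0.15 * df["bmi"]
                      + 1.2 * df["female"] + rng.normal(0, 1, n))

        model = smf.ols("tdce ~ age + female + bmi", data=df).fit()
        print(model.params)   # coefficient signs mirror the reported findings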

  8. Do parents of adolescents request the same universal parental support as parents of younger children? A random sample of Swedish parents.

    Science.gov (United States)

    Thorslund, Karin; Johansson Hanse, Jan; Axberg, Ulf

    2017-07-01

    Universal parental support intended to enhance parents' capacity for parenting is an important aspect of public health strategies. However, support has mostly been aimed at parents, especially mothers, of younger children; there is a gap in the research concerning parents of adolescents and fathers' interest in parenting support. The aims were to investigate and compare the interest in parenting support of parents of adolescents and of younger children, potential differences between mothers and fathers, their knowledge of what is already being offered to them, and their requirements for future universal parental support. Telephone interviews were conducted with a random sample of 1336 parents. Quantitative methods were used to analyze differences between groups, and qualitative methods were used to analyze open-ended questions regarding parents' requirements for future universal parental support. About 82% of the parents of adolescents interviewed think that offering universal parental support is most important during a child's adolescence. There is substantial interest, particularly among mothers, in most forms of support. Despite their interest, parents have limited awareness of the support available: only 7% knew about the local municipality website, although 70% reported a possible interest in such a website; similarly, 3% knew that a parent phone line was available to them, while 59% reported a possible interest. It poses a challenge, but is nevertheless important, for municipalities to develop support targeted at parents of adolescents that is tailored to their needs, and to reach out with information.

  9. [Qualitative research methodology in health care].

    Science.gov (United States)

    Bedregal, Paula; Besoain, Carolina; Reinoso, Alejandro; Zubarew, Tamara

    2017-03-01

    Health care research requires different methodological approaches, both qualitative and quantitative, to understand the phenomena under study. Qualitative research is usually the least considered. Central elements of the qualitative method are that the object of study is constituted by perceptions, emotions and beliefs; non-random, purposive sampling; a circular process of knowledge construction; and methodological rigor throughout the research process, from the quality of the design to the consistency of the results. The objective of this work is to contribute to methodological knowledge about qualitative research in health services, based on the implementation of the study "The transition process from pediatric to adult services: perspectives from adolescents with chronic diseases, caregivers and health professionals". The information gathered through the qualitative methodology facilitated the understanding of critical points, barriers and facilitators of the transition process of adolescents with chronic diseases, considering the perspectives of users and the health team. This study allowed the design of a transition services model from pediatric to adult health services based on the needs of adolescents with chronic diseases, their caregivers and the health team.

  10. A Comparison of the Methodological Quality of Articles in Computer Science Education Journals and Conference Proceedings

    Science.gov (United States)

    Randolph, Justus J.; Julnes, George; Bednarik, Roman; Sutinen, Erkki

    2007-01-01

    In this study we empirically investigate the claim that articles published in computer science education journals are more methodologically sound than articles published in computer science education conference proceedings. A random sample of 352 articles was selected from those articles published in major computer science education forums between…

  11. Increased risk for invasive breast cancer associated with hormonal therapy: a nation-wide random sample of 65,723 women followed from 1997 to 2008.

    Directory of Open Access Journals (Sweden)

    Jung-Nien Lai

    Full Text Available BACKGROUND: Hormonal therapy (HT), either estrogen alone (E-alone) or estrogen plus progesterone (E+P), appears to increase the risk for breast cancer in Western countries. However, limited information is available on the association between HT and breast cancer in Asian women, characterized mainly by dietary phytoestrogen intake and a low prevalence of contraceptive pill prescription. METHODOLOGY: A total of 65,723 women (20-79 years of age) without cancer or use of Chinese herbal products were recruited from a nation-wide one-million representative sample of the National Health Insurance of Taiwan and followed from 1997 to 2008. Seven hundred and eighty incident cases of invasive breast cancer were diagnosed. Using a reference group that comprised 40,052 women who had never received a hormone prescription, Cox proportional hazards models were constructed to determine the hazard ratios for receiving different types of HT and the occurrence of breast cancer. CONCLUSIONS: 5,156 (20%) women had ever used E+P, 2,798 (10.8%) had ever used E-alone, and 17,717 (69%) had ever used other preparation types. The Cox model revealed adjusted hazard ratios (HRs) of 2.05 (95% CI 1.37-3.07) for current users of E-alone and 8.65 (95% CI 5.45-13.70) for current users of E+P. Using women who had ceased to take hormonal medication for 6 years or more as the reference group, the adjusted HRs were significantly elevated, both for current users and for women who had discontinued hormonal medication for less than 6 years. Current users of either E-alone or E+P have an increased risk for invasive breast cancer in Taiwan, and precautions should be taken when such agents are prescribed.
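    Adjusted hazard ratios of this kind come from a Cox proportional hazards model; a self-contained sketch on simulated data using the lifelines package (all variable names and effect sizes below are assumptions, not the study's):

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(2)
        n = 2000
        current_epp = rng.integers(0, 2, n)              # current E+P use (assumed flag)
        hazard = 0.01 * np.exp(1.0 * current_epp)        # elevated hazard if exposed
        t = rng.exponential(1.0 / hazard)
        df = pd.DataFrame({"time": np.minimum(t, 12.0),  # 12 years of follow-up
                           "event": (t <= 12.0).astype(int),
                           "current_epp": current_epp})

        cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
        print(cph.hazard_ratios_)   # cf. the reported HRs of 2.05 and 8.65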

  12. Predictors of poor retention on antiretroviral therapy as a major HIV drug resistance early warning indicator in Cameroon: results from a nationwide systematic random sampling

    Directory of Open Access Journals (Sweden)

    Serge Clotaire Billong

    2016-11-01

    Full Text Available Background: Retention on lifelong antiretroviral therapy (ART) is essential for sustaining treatment success while preventing HIV drug resistance (HIVDR), especially in resource-limited settings (RLS). In an era of rising numbers of patients on ART, monitoring patients in care is becoming more strategic for programmatic interventions. Due to lapses and uncertainty with the current WHO sampling approach in Cameroon, we aimed to ascertain the national performance of, and determinants in, retention on ART at 12 months. Methods: Using systematic random sampling, a survey was conducted in the ten regions (56 sites) of Cameroon within the reporting period of October 2013-November 2014, enrolling 5005 eligible adults and children. Performance in retention on ART at 12 months was interpreted following the definition of the HIVDR early warning indicator: excellent (>85%), fair (75-85%), poor (<75%); factors with p-value < 0.01 were considered statistically significant. Results: The majority (74.4%) of patients were in urban settings, and 50.9% were managed in reference treatment centres. Nationwide, retention on ART at 12 months was 60.4% (2023/3349); only six sites and one region achieved acceptable performances. Retention performance varied between reference treatment centres (54.2%) and management units (66.8%, p < 0.0001); men (57.1%) and women (62.0%, p = 0.007); and WHO clinical stage I (63.3%) and other stages (55.6%, p = 0.007); but neither by age (adults [60.3%] vs. children [58.8%], p = 0.730) nor by immune status (CD4 351-500 [65.9%] vs. other CD4 strata [59.86%], p = 0.077). Conclusions: Poor retention in care within 12 months of ART initiation urges an active search for patients lost to follow-up, targeting preferentially male and symptomatic patients, especially within reference ART clinics. Such a sampling strategy could be further strengthened for informed ART monitoring and HIVDR prevention.
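    The early-warning banding used to interpret these results reduces to a simple threshold rule; a sketch of the classification as defined in the abstract:

        def retention_band(pct_retained_at_12m):
            """HIVDR early-warning indicator bands as defined above."""
            if pct_retained_at_12m > 85:
                return "excellent"
            if pct_retained_at_12m >= 75:
                return "fair"
            return "poor"

        print(retention_band(60.4))   # the national estimate -> 'poor'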

  13. The influence of psychoeducation on regulating biological rhythm in a sample of patients with bipolar II disorder: a randomized clinical trial

    Directory of Open Access Journals (Sweden)

    Faria AD

    2014-06-01

    Full Text Available Augusto Duarte Faria,1 Luciano Dias de Mattos Souza,2 Taiane de Azevedo Cardoso,2 Karen Amaral Tavares Pinheiro,2 Ricardo Tavares Pinheiro,2 Ricardo Azevedo da Silva,2 Karen Jansen2 (1Department of Clinical and Health Psychology, Universidade Federal do Rio Grande – FURG, Rio Grande, RS, Brazil; 2Health and Behavior Postgraduate Program, Universidade Católica de Pelotas – UCPEL, Pelotas, RS, Brazil). Introduction: Changes in biological rhythm are among the various characteristics of bipolar disorder and have long been associated with the functional impairment of the disease. There are only a few viable options of psychosocial interventions that deal with this specific topic; one of them is psychoeducation, a model that, although used by practitioners for some time, has only recently been shown by studies to be efficacious in clinical practice. Aim: To assess whether patients undergoing psychosocial intervention in addition to pharmacological treatment achieve better regulation of their biological rhythm than those using medication only. Method: This study is a randomized clinical trial comparing a standard medication intervention to a combined intervention of medication and psychoeducation. The evaluation of biological rhythm was made using the Biological Rhythm Interview of Assessment in Neuropsychiatry, an 18-item scale divided into four areas (sleep, activity, social rhythm, and eating pattern). The combined intervention consisted of medication and a short-term psychoeducation model summarized in a protocol of six individual sessions of 1 hour each. Results: The sample consisted of 61 patients with bipolar II disorder, but during the study there were 14 losses to follow-up. Therefore, the final sample consisted of 45 individuals (26 for the standard intervention and 19 for the combined one). The results showed that, in this sample and time period evaluated, the combined treatment of medication and psychoeducation had no statistically significant impact on the

  14. Methicillin-sensitive and methicillin-resistant Staphylococcus aureus nasal carriage in a random sample of non-hospitalized adult population in northern Germany.

    Directory of Open Access Journals (Sweden)

    Jaishri Mehraj

    Full Text Available OBJECTIVE: Findings from truly randomized community-based studies on Staphylococcus aureus nasal colonization are scarce. We therefore examined the point prevalence and risk factors of S. aureus nasal carriage in a non-hospitalized population of Braunschweig, northern Germany. METHODS: A total of 2026 potential participants were randomly selected through the residents' registration office and invited by mail. They were requested to collect a nasal swab at home and return it by mail. S. aureus was identified by culture and PCR. Logistic regression was used to determine risk factors of S. aureus carriage. RESULTS: Among the invitees, 405 individuals agreed to participate and 389 provided complete data that were included in the analysis. The median age of the participants was 49 years (IQR: 39-61) and 61% were female. S. aureus was isolated from 85 (21.9%; 95% CI: 18.0-26.2%) of the samples, five of which were MRSA (1.29%; 95% CI: 0.55-2.98%). In multiple logistic regression, male sex (OR = 3.50; 95% CI: 2.01-6.11) and the presence of allergies (OR = 2.43; 95% CI: 1.39-4.24) were found to be associated with S. aureus nasal carriage. Fifty-five different spa types were found, which clustered into nine distinct groups. MRSA belonged to the hospital-associated spa types t032 and t025 (corresponding to MLST CC 22), whereas MSSA spa types varied and mostly belonged to spa-CC 012 (corresponding to MLST CC 30) and spa-CC 084 (corresponding to MLST CC 15). CONCLUSION: This first point-prevalence study of S. aureus in a non-hospitalized population of Germany revealed a prevalence consistent with other European countries and supports previous findings on male sex and allergies as risk factors of S. aureus carriage. The detection of hospital-associated MRSA spa types in the community indicates possible spread of these strains from hospitals into the community.

  15. MIRD methodology

    International Nuclear Information System (INIS)

    Rojo, Ana M.; Gomez Parada, Ines

    2004-01-01

    The MIRD (Medical Internal Radiation Dose) system was established by the Society of Nuclear Medicine of the USA in 1960 to assist the medical community in estimating the dose to organs and tissues due to the incorporation of radioactive materials. Since then, 'MIRD Dose Estimate Reports' (numbers 1 to 12) and 'Pamphlets', of great utility for dose calculations, have been published. The MIRD system was planned essentially for the calculation of doses received by patients during nuclear medicine diagnostic procedures. The MIRD methodology for calculating absorbed doses in different tissues is explained.
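    At its core, the MIRD schema computes the absorbed dose to a target region as the sum over source regions of cumulated activity times an S value, D(r_T) = Σ_S Ã(r_S)·S(r_T ← r_S); a minimal sketch with placeholder numbers (not tabulated MIRD data):

        def absorbed_dose(cumulated_activity, s_values):
            """D(target) = sum over sources of A-tilde(source) * S(target <- source)."""
            return sum(cumulated_activity[src] * s_values[src]
                       for src in cumulated_activity)

        a_tilde = {"liver": 3.6e9, "kidneys": 1.2e9}          # Bq*s, assumed values
        s_to_marrow = {"liver": 1.0e-16, "kidneys": 2.5e-16}  # Gy/(Bq*s), assumed values
        print(f"absorbed dose to marrow: {absorbed_dose(a_tilde, s_to_marrow):.2e} Gy")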

  16. PSA methodology

    Energy Technology Data Exchange (ETDEWEB)

    Magne, L

    1997-12-31

    The purpose of this text is first to ask a certain number of questions about the methods related to PSAs. Notably, we explore the positioning of the French methodological approach - as applied in the EPS 1300 and EPS 900 PSAs - compared to other approaches (Part One). This reflection leads to a more general question: what contents, for what PSA? This is why, in Part Two, we try to offer a framework for defining the criteria a PSA should satisfy to meet clearly identified needs. Finally, Part Three quickly summarizes the questions approached in the first two parts, as an introduction to the debate. 15 refs.

  17. PSA methodology

    International Nuclear Information System (INIS)

    Magne, L.

    1996-01-01

    The purpose of this text is first to ask a certain number of questions about the methods related to PSAs. Notably, we explore the positioning of the French methodological approach - as applied in the EPS 1300 and EPS 900 PSAs - compared to other approaches (Part One). This reflection leads to a more general question: what contents, for what PSA? This is why, in Part Two, we try to offer a framework for defining the criteria a PSA should satisfy to meet clearly identified needs. Finally, Part Three quickly summarizes the questions approached in the first two parts, as an introduction to the debate. 15 refs

  18. Extraction of aquatic organic matter by temperature decrease: an alternative methodology to keep the original sample characteristics

    Directory of Open Access Journals (Sweden)

    Rosana N. H. Martins de Almeida

    2003-03-01

    Full Text Available In this work, an alternative methodology was developed for separating the aquatic organic matter (AOM) present in natural river waters. The process is based on decreasing the temperature of the aqueous sample under controlled conditions, which provokes freezing of the sample and separation of a dark extract, not frozen and rich in organic matter. The results showed that the rate of temperature decrease strongly influences the relative recovery of organic carbon, the enrichment, and the separation time of the organic matter present in the water samples. Elemental composition, infrared spectra and thermal analysis results showed that the alternative methodology is as gentle as possible, helping to maintain the integrity of the sample.

  19. Effectiveness of Housing First with Intensive Case Management in an Ethnically Diverse Sample of Homeless Adults with Mental Illness: A Randomized Controlled Trial.

    Directory of Open Access Journals (Sweden)

    Vicky Stergiopoulos

    Full Text Available Housing First (HF) is being widely disseminated in efforts to end homelessness among homeless adults with psychiatric disabilities. This study evaluates the effectiveness of HF with Intensive Case Management (ICM) among ethnically diverse homeless adults in an urban setting. 378 participants were randomized to HF with ICM or treatment-as-usual (TAU) in Toronto (Canada) and followed for 24 months. Measures of effectiveness included housing stability, physical (EQ5D-VAS) and mental (CSI, GAIN-SS) health, social functioning (MCAS), quality of life (QoLI20), and health service use. Two-thirds of the sample (63%) was from racialized groups and half (50%) were born outside Canada. Over the 24 months of follow-up, HF participants spent a significantly greater percentage of time in stable residences compared to TAU participants (75.1%, 95% CI 70.5 to 79.7, vs. 39.3%, 95% CI 34.3 to 44.2, respectively). Similarly, community functioning (MCAS) improved significantly from baseline in HF compared to TAU participants (change in mean difference = +1.67, 95% CI 0.04 to 3.30). There was a significant reduction in the number of days spent experiencing alcohol problems among HF compared to TAU participants at 24 months (ratio of rate ratios = 0.47, 95% CI 0.22 to 0.99) relative to baseline, a reduction of 53%. Although the number of emergency department visits and days in hospital over 24 months did not differ significantly between HF and TAU participants, fewer HF participants than TAU participants had 1 or more hospitalizations during this period (70.4% vs. 81.1%, respectively; P = 0.044). Compared to non-racialized HF participants, racialized HF participants saw an increase in the amount of money spent on alcohol (change in mean difference = $112.90, 95% CI 5.84 to 219.96) and a reduction in physical community integration (ratio of rate ratios = 0.67, 95% CI 0.47 to 0.96) from baseline to 24 months. Secondary analyses found a significant reduction in the number of days

  20. Modern survey sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Contents: Exposure to Sampling (concepts of population, sample, and sampling); Initial Ramifications (sampling design and sampling scheme; random numbers and their uses in simple random sampling (SRS); drawing simple random samples with and without replacement; estimation of mean, total, ratio of totals/means: variance and variance estimation; determination of sample sizes; appendix: more on equal probability sampling, Horvitz-Thompson estimator, sufficiency, likelihood, non-existence theorem); More Intricacies (unequal probability sampling strategies; PPS sampling); Exploring Improved Ways (stratified sampling; cluster sampling; multi-stage sampling; multi-phase sampling: ratio and regression estimation; controlled sampling); Modeling (super-population modeling; prediction approach; model-assisted approach; Bayesian methods; spatial smoothing); Sampling on Successive Occasions: Panel Rotation; Non-Response and Not-at-Homes; Weighting Adj...

  1. Children's Quality of Life Based on the KIDSCREEN-27: Child Self-Report, Parent Ratings and Child-Parent Agreement in a Swedish Random Population Sample.

    Directory of Open Access Journals (Sweden)

    Anne H Berman

    Full Text Available The KIDSCREEN-27 is a measure of child and adolescent quality of life (QoL), with excellent psychometric properties, available in child-report and parent-rating versions in 38 languages. This study provides child-reported and parent-rated norms for the KIDSCREEN-27 among Swedish 11-16 year-olds, as well as child-parent agreement. Sociodemographic correlates of self-reported and parent-rated wellbeing were also measured. A random population sample consisting of 600 children aged 11-16, 100 per age group, and one of their parents (N = 1200) was approached for response to the self-reported and parent-rated versions of the KIDSCREEN-27. Parents were also asked about their education, employment status and their own QoL based on the 26-item WHOQOL-Bref. Based on the final sampling pool of 1158 persons, a 34.8% response rate of 403 individuals was obtained, including 175 child-parent pairs, 27 child singleton responders and 26 parent singletons. Gender and age differences for parent ratings and child-reported data were analyzed using t-tests and the Mann-Whitney U-test. Post-hoc Dunn tests were conducted for pairwise comparisons when the p-value for specific subscales was 0.05 or lower. Child-parent agreement was tested item by item using the Prevalence- and Bias-Adjusted Kappa (PABAK) coefficient for ordinal data (PABAK-OS); dimensional and total score agreement was evaluated based on dichotomous cut-offs for lower wellbeing using the PABAK, and total continuous scores were evaluated using Bland-Altman plots. Compared to European norms, Swedish children in this sample scored lower on Physical wellbeing (48.8 SE/49.94 EU) but higher on the other KIDSCREEN-27 dimensions: Psychological wellbeing (53.4/49.77), Parent relations and autonomy (55.1/49.99), Social support and peers (54.1/49.94) and School (55.8/50.01). Older children self-reported lower wellbeing than younger children. No significant self-reported gender differences occurred, and parent ratings

  2. Critical evaluation of methodology commonly used in sample collection, storage and preparation for the analysis of pharmaceuticals and illicit drugs in surface water and wastewater by solid phase extraction and liquid chromatography-mass spectrometry.

    Science.gov (United States)

    Baker, David R; Kasprzyk-Hordern, Barbara

    2011-11-04

    The main aim of this manuscript is to provide a comprehensive and critical verification of the methodology commonly used for sample collection, storage and preparation in studies concerning the analysis of pharmaceuticals and illicit drugs in aqueous environmental samples using SPE-LC/MS techniques. This manuscript reports the results of investigations into several sample preparation parameters that, to the authors' knowledge, have not been reported or have received very little attention. These include: (i) the effect of evaporation temperature and (ii) of evaporation solvent on solid phase extraction (SPE) extracts; (iii) the effect of silanising glassware; (iv) the recovery of analytes during vacuum filtration through glass fibre filters; and (v) pre-LC-MS filter membranes. All of these parameters are vital to the development of efficient and reliable extraction techniques; an essential factor given that target drug residues are often present in the aqueous environment at ng/L levels. Also presented is the first comprehensive review of the stability of illicit drugs and pharmaceuticals in wastewater. Among the parameters studied are storage time, temperature and pH. Over 60 analytes were targeted, including stimulants, opioid and morphine derivatives, benzodiazepines, antidepressants, dissociative anaesthetics, drug precursors, human urine indicators and their metabolites. The lack of stability of analytes in raw wastewater was found to be significant for many compounds. For instance, 34% of the compounds studied showed a stability change >15% after only 12 h in raw wastewater stored at 2 °C; a very important finding given that wastewater is typically collected with the use of 24-h composite samplers. The stability of these compounds is also critical given the recent development of so-called 'sewage forensics' or 'sewage epidemiology', in which concentrations of target drug residues in wastewater are used to back-calculate drug consumption. Without an understanding of stability

  3. Testing methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, the aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low-level radiation exposures lie mainly in the statistical "signal to noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes at the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs.

  4. Testing methodologies

    International Nuclear Information System (INIS)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, the aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low-level radiation exposures lie mainly in the statistical "signal to noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes at the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs

  5. A Randomized Clinical Trial of Cogmed Working Memory Training in School-Age Children with ADHD: A Replication in a Diverse Sample Using a Control Condition

    Science.gov (United States)

    Chacko, A.; Bedard, A. C.; Marks, D. J.; Feirsen, N.; Uderman, J. Z.; Chimiklis, A.; Rajwan, E.; Cornwell, M.; Anderson, L.; Zwilling, A.; Ramon, M.

    2014-01-01

    Background: Cogmed Working Memory Training (CWMT) has received considerable attention as a promising intervention for the treatment of Attention-Deficit/Hyperactivity Disorder (ADHD) in children. At the same time, methodological weaknesses in previous clinical trials call into question the reported efficacy of CWMT. In particular, lack of equivalence…

  6. Statistical sampling techniques as applied to OSE inspections

    International Nuclear Information System (INIS)

    Davis, J.J.; Cote, R.W.

    1987-01-01

    The need has been recognized for statistically valid methods for gathering information during OSE inspections and for interpreting results, both from performance testing and from records reviews, interviews, etc. Battelle Columbus Division, under contract to DOE OSE, has performed and continues to perform work in the area of statistical methodology for OSE inspections. This paper presents some of the sampling methodology currently being developed for use during OSE inspections. Topics include population definition, sample size requirements, level of confidence, and the practical logistical constraints associated with the conduct of an inspection based on random sampling. Sequential sampling schemes and sampling from finite populations are also discussed. The methods described are applicable to various data-gathering activities, ranging from the sampling and examination of classified documents to the sampling of Protective Force security inspectors for skill testing.
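    For the sample size requirements mentioned above, a standard textbook computation for estimating a proportion from a finite population (a generic formula, not Battelle's specific procedure) looks like this:

        import math

        def sample_size_proportion(population, z=1.96, margin=0.05, p=0.5):
            """Sample size for a proportion with finite-population correction."""
            n0 = z ** 2 * p * (1 - p) / margin ** 2   # infinite-population size
            return math.ceil(n0 / (1 + (n0 - 1) / population))

        # e.g. auditing a hypothetical store of 1200 classified documents
        # at 95% confidence with a 5% margin of error
        print(sample_size_proportion(1200))           # -> 292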

  7. Environmental monitoring at the nuclear power plants and Studsvik 1992-1993. Results from measurements of radionuclide contents of environmental samples, and from random checks by SSI

    International Nuclear Information System (INIS)

    Bengtson, P.; Larsson, C.M.; Simenstad, P.; Suomela, J.

    1995-09-01

    Marine samples from the vicinity of the plants show elevated radionuclide concentrations caused by discharges from the plants. Very low concentrations are noted in terrestrial samples. At several locations, the effects of the Chernobyl disaster still dominate. Control samples measured by SSI have confirmed the measurements performed by the operators. 8 refs, 6 tabs, 46 figs

  8. Soil Radiological Characterisation Methodology

    International Nuclear Information System (INIS)

    Attiogbe, Julien; Aubonnet, Emilie; De Maquille, Laurence; De Moura, Patrick; Desnoyers, Yvon; Dubot, Didier; Feret, Bruno; Fichet, Pascal; Granier, Guy; Iooss, Bertrand; Nokhamzon, Jean-Guy; Ollivier Dehaye, Catherine; Pillette-Cousin, Lucien; Savary, Alain

    2014-12-01

    This report presents the general methodology and best-practice approaches, combining proven existing techniques for sampling and characterisation, to assess the contamination of soils prior to remediation. It is based on feedback from projects conducted by the main French nuclear stakeholders involved in the field of remediation and dismantling (EDF, CEA, AREVA and IRSN). The application of this methodology will enable project managers to obtain the elements necessary for drawing up the files associated with remediation operations, as required by the regulatory authorities. It is applicable to each of the steps necessary for piloting remediation work-sites, depending on the objectives targeted (release into the public domain, re-use, etc.). The main part describes the statistical methodology applied, with exploratory analysis and variogram data and the identification and location of singular points. The results obtained permit the production of a map identifying the contaminated surface and subsurface areas. This paves the way for radiological site characterisation, from the initial investigations based on historical and functional analysis through to the check that the remediation objectives have been met. An example application follows, from the feedback of the remediation of a contaminated site at the Fontenay-aux-Roses facility. It is supplemented by a glossary of the main terms used in the field, drawn from various publications and international standards. This technical report supports the ISO standard ISO/TC 85/SC 5 N 18557, 'Sampling and characterisation principles for soils, buildings and infrastructures contaminated by radionuclides for remediation purposes'. (authors)

  9. Assessment of generalizability, applicability and predictability (GAP) for evaluating external validity in studies of universal family-based prevention of alcohol misuse in young people: systematic methodological review of randomized controlled trials.

    Science.gov (United States)

    Fernandez-Hermida, Jose Ramon; Calafat, Amador; Becoña, Elisardo; Tsertsvadze, Alexander; Foxcroft, David R

    2012-09-01

    To assess external validity characteristics of studies from two Cochrane Systematic Reviews of the effectiveness of universal family-based prevention of alcohol misuse in young people. Two reviewers used an a priori developed external validity rating form and independently assessed three external validity dimensions of generalizability, applicability and predictability (GAP) in randomized controlled trials. The majority (69%) of the included 29 studies were rated 'unclear' on the reporting of sufficient information for judging generalizability from sample to study population. Ten studies (35%) were rated 'unclear' on the reporting of sufficient information for judging applicability to other populations and settings. No study provided an assessment of the validity of the trial end-point measures for subsequent mortality, morbidity, quality of life or other economic or social outcomes. Similarly, no study reported on the validity of surrogate measures using established criteria for assessing surrogate end-points. Studies evaluating the benefits of family-based prevention of alcohol misuse in young people are generally inadequate at reporting information relevant to generalizability of the findings or implications for health or social outcomes. Researchers, study authors, peer reviewers, journal editors and scientific societies should take steps to improve the reporting of information relevant to external validity in prevention trials.

  10. Random-effects linear modeling and sample size tables for two special crossover designs of average bioequivalence studies: the four-period, two-sequence, two-formulation and six-period, three-sequence, three-formulation designs.

    Science.gov (United States)

    Diaz, Francisco J; Berg, Michel J; Krebill, Ron; Welty, Timothy; Gidal, Barry E; Alloway, Rita; Privitera, Michael

    2013-12-01

    Due to concern and debate in the epilepsy medical community and to the current interest of the US Food and Drug Administration (FDA) in revising approaches to the approval of generic drugs, the FDA is currently supporting ongoing bioequivalence studies of antiepileptic drugs, the EQUIGEN studies. During the design of these crossover studies, the researchers could not find commercial or non-commercial statistical software that quickly allowed computation of sample sizes for their designs, particularly software implementing the FDA requirement of using random-effects linear models for the analyses of bioequivalence studies. This article presents tables for sample-size evaluations of average bioequivalence studies based on the two crossover designs used in the EQUIGEN studies: the four-period, two-sequence, two-formulation design, and the six-period, three-sequence, three-formulation design. Sample-size computations assume that random-effects linear models are used in bioequivalence analyses with crossover designs. Random-effects linear models have been traditionally viewed by many pharmacologists and clinical researchers as just mathematical devices to analyze repeated-measures data. In contrast, a modern view of these models attributes an important mathematical role in theoretical formulations in personalized medicine to them, because these models not only have parameters that represent average patients, but also have parameters that represent individual patients. Moreover, the notation and language of random-effects linear models have evolved over the years. Thus, another goal of this article is to provide a presentation of the statistical modeling of data from bioequivalence studies that highlights the modern view of these models, with special emphasis on power analyses and sample-size computations.
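    A random-effects linear model of the kind required by the FDA can be sketched with statsmodels, here for the four-period, two-sequence, two-formulation design on simulated log-scale data (all variable names, sequences and effect sizes below are assumptions, not the EQUIGEN specification):

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(3)
        n_subj = 24
        subject = np.repeat(np.arange(n_subj), 4)
        period = np.tile([1, 2, 3, 4], n_subj)
        sequence = np.repeat(np.where(np.arange(n_subj) % 2 == 0, "TRTR", "RTRT"), 4)
        formulation = np.array([s[p - 1] for s, p in zip(sequence, period)])

        subj_re = np.repeat(rng.normal(0, 0.25, n_subj), 4)   # between-subject variation
        log_auc = (4.0 + 0.02 * (formulation == "T") + subj_re
                   + rng.normal(0, 0.1, 4 * n_subj))
        df = pd.DataFrame({"log_auc": log_auc, "subject": subject, "period": period,
                           "sequence": sequence, "formulation": formulation})

        # Random intercept per subject; fixed effects for formulation, period, sequence.
        fit = smf.mixedlm("log_auc ~ C(formulation) + C(period) + C(sequence)",
                          df, groups=df["subject"]).fit()
        print(fit.params)   # the formulation contrast estimates the T/R log-difference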

  11. Effect of sample size on multi-parametric prediction of tissue outcome in acute ischemic stroke using a random forest classifier

    Science.gov (United States)

    Forkert, Nils Daniel; Fiehler, Jens

    2015-03-01

    The tissue outcome prediction in acute ischemic stroke patients is highly relevant for clinical and research purposes. It has been shown that the combined analysis of diffusion and perfusion MRI datasets using high-level machine learning techniques leads to an improved prediction of final infarction compared to single perfusion parameter thresholding. However, most high-level classifiers require previous training and, until now, it has remained unclear how many subjects are required for this, which is the focus of this work. 23 MRI datasets of acute stroke patients with known tissue outcome were used in this work. Relative values of diffusion and perfusion parameters as well as the binary tissue outcome were extracted on a voxel-by-voxel level for all patients and used for training of a random forest classifier. The number of patients used for training set definition was iteratively and randomly reduced from all 22 other patients down to only one other patient. Thus, 22 tissue outcome predictions were generated for each patient using the trained random forest classifiers and compared to the known tissue outcome using the Dice coefficient. Overall, a logarithmic relation between the number of patients used for training set definition and tissue outcome prediction accuracy was found. Quantitatively, a mean Dice coefficient of 0.45 was found for the prediction using the training set consisting of the voxel information from only one other patient, which increased to 0.53 when using all other patients (n = 22). Based on extrapolation, 50-100 patients appear to be a reasonable trade-off between tissue outcome prediction accuracy and the effort required for data acquisition and preparation.
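    The voxel-wise training and Dice evaluation can be sketched as follows, with synthetic stand-ins for the relative diffusion/perfusion features (not the study's data or feature set):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(4)
        X_train = rng.normal(size=(5000, 5))   # voxels x relative parameters
        y_train = X_train[:, 0] + 0.5 * X_train[:, 3] + rng.normal(0, 1, 5000) > 1

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        clf.fit(X_train, y_train)

        X_test = rng.normal(size=(2000, 5))    # voxels of a held-out "patient"
        y_true = X_test[:, 0] + 0.5 * X_test[:, 3] > 1
        y_pred = clf.predict(X_test)

        # Dice coefficient: overlap between predicted and known final infarct.
        dice = 2 * np.sum(y_pred & y_true) / (np.sum(y_pred) + np.sum(y_true))
        print(f"Dice = {dice:.2f}")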

  12. A study on the advanced statistical core thermal design methodology

    International Nuclear Information System (INIS)

    Lee, Seung Hyuk

    1992-02-01

    A statistical core thermal design methodology for generating the limit DNBR and the nominal DNBR is proposed and used in assessing the best-estimate thermal margin in a reactor core. Firstly, the Latin Hypercube Sampling method, instead of the conventional experimental design technique, is utilized as the input sampling method for a regression analysis, to evaluate its sampling efficiency. Secondly, and as the main topic, the Modified Latin Hypercube Sampling and Hypothesis Test Statistics method is proposed as a substitute for the current statistical core thermal design method. This new methodology adopts a 'Modified Latin Hypercube Sampling Method', which uses the mean value of each interval of the input variables instead of random values, to avoid the extreme cases that arise in the tail areas of some parameters. Next, the independence between the input variables is verified through a 'Correlation Coefficient Test' for the statistical treatment of their uncertainties, and the distribution type of the DNBR response is determined through a 'Goodness of Fit Test'. Finally, the limit DNBR with one-sided 95% probability and 95% confidence level, DNBR 95/95, is estimated. The advantage of this methodology over the conventional statistical method using the Response Surface and Monte Carlo simulation technique lies in the simplicity of its analysis procedure, while maintaining the same level of confidence in the limit DNBR result. This methodology is applied to two cases of DNBR margin calculation. The first case is the application to the determination of the limit DNBR, where the DNBR margin is determined by the difference between the nominal DNBR and the limit DNBR. The second case is the application to the determination of the nominal DNBR, where the DNBR margin is determined by the difference between the lower limit value of the nominal DNBR and the CHF correlation limit being used. From this study, it is deduced that the proposed methodology gives good agreement in the DNBR results
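    Standard Latin Hypercube Sampling is available in scipy; the 'modified' variant described above, which replaces the random point in each stratum by a central value, can be approximated by taking stratum midpoints, as in this sketch (the input distributions are assumed, not the study's):

        import numpy as np
        from scipy.stats import norm, qmc

        n, d = 100, 3
        u = qmc.LatinHypercube(d=d, seed=0).random(n)   # standard LHS in [0, 1)^d

        # Modified LHS: use the midpoint of each of the n strata per variable,
        # avoiding extreme tail values of the input distributions.
        ranks = np.argsort(np.argsort(u, axis=0), axis=0)
        u_mid = (ranks + 0.5) / n

        # Map to physical inputs, e.g. three normally distributed uncertainties.
        inputs = norm.ppf(u_mid, loc=[1.0, 300.0, 15.5], scale=[0.02, 2.0, 0.3])
        print(inputs.shape)                             # (100, 3)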

  13. A sequential logic circuit for coincidences randomly distributed in 'time' and 'duration', with selection and total sampling

    International Nuclear Information System (INIS)

    Carnet, Bernard; Delhumeau, Michel

    1971-06-01

    The principles of binary analysis applied to the investigation of sequential circuits were used to design a two-way coincidence circuit whose inputs may be random or periodic signals of constant or variable duration. The output signal strictly reproduces the characteristics of the input signal triggering the coincidence. A coincidence between input signals does not produce any output signal if one of the signals has already triggered the output signal. The characteristics of the output signal in relation to those of the input signal are: minimum time jitter, excellent duration reproducibility and maximum efficiency. Some rules are given for achieving these results. The symmetry, transitivity and non-transitivity characteristics of the edges on the primitive graph are analyzed and lead to rules for positioning the states on a secondary graph, from which the equations of the circuit can be calculated. The development of the circuit and its dynamic testing are discussed. For this testing, the functioning of the circuit is simulated by feeding randomly generated signals into the inputs.
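
    The paper specifies a logic circuit rather than code, but its stated behaviour can be imitated in a sampled-time simulation. The sketch below is one speculative reading of the rules: an output pulse starts on a coincidence, copies the remaining duration of the input that completed it, and no new output can start while one is in progress (simultaneous rising edges are broken arbitrarily in favour of the first input):

    def coincidence(a, b):
        # a, b: lists of 0/1 samples of the two inputs at a fixed clock rate
        out, active, trigger = [], False, None
        prev_a = prev_b = 0
        for xa, xb in zip(a, b):
            # the output pulse ends when its triggering input goes low
            if active and not (trigger == 'a' and xa or trigger == 'b' and xb):
                active = False
            # a coincidence starts a pulse only if no pulse is in progress
            if not active and xa and xb:
                trigger = 'a' if xa and not prev_a else 'b'
                active = True
            out.append(int(active))
            prev_a, prev_b = xa, xb
        return out

    a = [0, 1, 1, 1, 0, 0, 1, 1, 0]
    b = [0, 0, 1, 1, 1, 0, 1, 1, 1]
    print(coincidence(a, b))  # -> [0, 0, 1, 1, 1, 0, 1, 1, 0]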

  14. A randomized trial found online questionnaires supplemented by postal reminders generated a cost-effective and generalizable sample but don't forget the reminders.

    Science.gov (United States)

    Loban, Amanda; Mandefield, Laura; Hind, Daniel; Bradburn, Mike

    2017-12-01

    The objective of this study was to compare the response rates, data completeness, and representativeness of survey data produced by online and postal surveys. A randomized trial was nested within a cohort study in Yorkshire, United Kingdom. Participants were randomized to receive either an electronic (online) survey questionnaire with a paper reminder (N = 2,982) or a paper questionnaire with an electronic reminder (N = 2,855). Response rates were similar for the electronic and postal approaches (50.9% vs. 49.7%, difference = 1.2%, 95% confidence interval: -1.3% to 3.8%), and the characteristics of responders in the two groups were similar. Participants nevertheless demonstrated an overwhelming preference for postal questionnaires, with the majority responding by post in both groups. Online survey questionnaire systems need to be supplemented with a postal reminder to achieve acceptable uptake, but doing so provides a similar response rate and case mix when compared to postal questionnaires alone. For large surveys, online survey systems may be cost saving. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. A combinatorial and probabilistic study of initial and end heights of descents in samples of geometrically distributed random variables and in permutations

    Directory of Open Access Journals (Sweden)

    Helmut Prodinger

    2007-01-01

    In words generated by independent geometrically distributed random variables, we study the l-th descent, which is, roughly speaking, the l-th occurrence of a neighbouring pair ab with a > b. The value a is called the initial height, and b the end height. We study these two random variables (and some similar ones) by combinatorial and probabilistic tools. In all instances we find a generating function Ψ(v,u), where the coefficient of v^j u^i refers to the j-th descent (ascent) and i to the initial (end) height. From this, various conclusions can be drawn, in particular about expected values. In the probabilistic part, a Markov chain model is used, which allows us to obtain explicit expressions for the heights of the second descent. In principle, one could go further, but the complexity of the results forbids it. This is extended to permutations of a large number of elements. Methods from q-analysis are used to simplify the expressions; this is the reason we confine ourselves to the geometric distribution only, since for general discrete distributions no such tools are available.
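
    The objects being analysed are easy to simulate. A quick empirical check (parameters invented; letters follow P(X = k) = p·q^(k-1), k ≥ 1) that tallies the initial and end heights of the second descent:

    import math
    import random
    from collections import Counter

    rng = random.Random(1)
    p = 0.5
    q = 1 - p

    def geom():
        # geometric on {1, 2, ...} with P(X = k) = p * q**(k - 1)
        return 1 + int(math.log(rng.random()) / math.log(q))

    def lth_descent(word, l):
        # the l-th neighbouring pair (a, b) with a > b, if any
        seen = 0
        for a, b in zip(word, word[1:]):
            if a > b:
                seen += 1
                if seen == l:
                    return a, b   # (initial height, end height)
        return None

    counts = Counter()
    for _ in range(20_000):
        d = lth_descent([geom() for _ in range(50)], l=2)
        if d:
            counts[d] += 1
    print(counts.most_common(5))  # empirical joint law of the two heights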

  16. Network Sampling with Memory: A proposal for more efficient sampling from social networks

    Science.gov (United States)

    Mouw, Ted; Verdery, Ashton M.

    2013-01-01

    Techniques for sampling from networks have grown into an important area of research across several fields. For sociologists, the possibility of sampling from a network is appealing for two reasons: (1) A network sample can yield substantively interesting data about network structures and social interactions, and (2) it is useful in situations where study populations are difficult or impossible to survey with traditional sampling approaches because of the lack of a sampling frame. Despite its appeal, methodological concerns about the precision and accuracy of network-based sampling methods remain. In particular, recent research has shown that sampling from a network using a random walk based approach such as Respondent Driven Sampling (RDS) can result in high design effects (DE)—the ratio of the sampling variance to the sampling variance of simple random sampling (SRS). A high design effect means that more cases must be collected to achieve the same level of precision as SRS. In this paper we propose an alternative strategy, Network Sampling with Memory (NSM), which collects network data from respondents in order to reduce design effects and, correspondingly, the number of interviews needed to achieve a given level of statistical power. NSM combines a “List” mode, where all individuals on the revealed network list are sampled with the same cumulative probability, with a “Search” mode, which gives priority to bridge nodes connecting the current sample to unexplored parts of the network. We test the relative efficiency of NSM compared to RDS and SRS on 162 school and university networks from Add Health and Facebook that range in size from 110 to 16,278 nodes. The results show that the average design effect for NSM on these 162 networks is 1.16, which is very close to the efficiency of a simple random sample (DE=1), and 98.5% lower than the average DE we observed for RDS. PMID:24159246
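
    The design effect being discussed is straightforward to estimate by simulation. Below is a hedged illustration, not the authors' NSM code: a plain random walk stands in for RDS (ignoring the degree weighting of real RDS estimators) on a toy graph whose trait is clustered on the ring to mimic homophily, which is what inflates the variance of walk-based estimates.

    import random
    import networkx as nx

    rng = random.Random(0)
    G = nx.connected_watts_strogatz_graph(2000, 10, 0.1, seed=0)
    trait = {v: int(v < 600) for v in G}   # prevalence 0.3, clustered on the ring

    def srs_mean(n=200):
        return sum(trait[v] for v in rng.sample(list(G), n)) / n

    def walk_mean(n=200):
        v = rng.choice(list(G))
        total = 0
        for _ in range(n):
            v = rng.choice(list(G.neighbors(v)))
            total += trait[v]
        return total / n

    def variance(estimator, reps=500):
        xs = [estimator() for _ in range(reps)]
        m = sum(xs) / reps
        return sum((x - m) ** 2 for x in xs) / (reps - 1)

    # DE = Var(walk estimator) / Var(SRS estimator); well above 1 here
    print("design effect ~", variance(walk_mean) / variance(srs_mean))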

  17. A cluster-randomized trial of a college health center-based alcohol and sexual violence intervention (GIFTSS): Design, rationale, and baseline sample.

    Science.gov (United States)

    Abebe, Kaleab Z; Jones, Kelley A; Rofey, Dana; McCauley, Heather L; Clark, Duncan B; Dick, Rebecca; Gmelin, Theresa; Talis, Janine; Anderson, Jocelyn; Chugani, Carla; Algarroba, Gabriela; Antonio, Ashley; Bee, Courtney; Edwards, Clare; Lethihet, Nadia; Macak, Justin; Paley, Joshua; Torres, Irving; Van Dusen, Courtney; Miller, Elizabeth

    2018-02-01

    Sexual violence (SV) on college campuses is common, especially alcohol-related SV. This is a 2-arm cluster randomized controlled trial to test a brief intervention to reduce risk for alcohol-related SV among students receiving care from college health centers (CHCs). Intervention CHC staff are trained to deliver universal SV education to all students seeking care, to facilitate patient and provider comfort in discussing SV and related abusive experiences (including the role of alcohol). Control sites provide participants with information about drinking responsibly. Across 28 participating campuses (12 randomized to intervention and 16 to control), 2292 students seeking care at CHCs complete surveys prior to their appointment (baseline), immediately after (exit), 4 months later (T2) and one year later (T3). The primary outcome is change in recognition of SV and sexual risk. Among those reporting SV exposure at baseline, changes in SV victimization, disclosure, and use of SV services are additional outcomes. Intervention effects will be assessed using generalized linear mixed models that account for clustering of repeated observations both within CHCs and within students, as sketched below. Slightly more than half of the participating colleges have undergraduate enrollment of ≥3000 students; two-thirds are public and almost half are urban. Among participants there were relatively more Asian (10% vs. 1%) and Black/African American (13% vs. 7%) and fewer White (58% vs. 74%) participants in the intervention arm compared to control. This study will offer the first formal assessment of SV prevention in the CHC setting. Clinical Trials #: NCT02355470. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
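
    A hedged sketch of the stated modelling approach (synthetic data and invented effect sizes; a linear mixed model is used here for simplicity, whereas binary outcomes would call for a generalized variant): random intercepts for centre, and for student nested within centre, via statsmodels.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    rows = []
    for c in range(28):                      # 28 college health centres
        arm = int(c < 12)                    # 12 intervention, 16 control
        for s in range(20):                  # students per centre (invented)
            u = rng.normal(scale=0.5)        # student-level random intercept
            for wave in range(4):            # baseline, exit, T2, T3
                y = 0.2 * arm * wave + u + rng.normal()
                rows.append((y, arm, wave, c, f"{c}-{s}"))
    df = pd.DataFrame(rows, columns=["recognition", "arm", "wave", "chc", "student"])

    # random intercept for centre (groups) plus a variance component for
    # students nested within centres
    m = smf.mixedlm("recognition ~ arm * wave", df, groups="chc",
                    re_formula="1", vc_formula={"student": "0 + C(student)"})
    print(m.fit().summary())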

  18. Quantitative culture of endotracheal aspirate and BAL fluid samples in the management of patients with ventilator-associated pneumonia: a randomized clinical trial

    Directory of Open Access Journals (Sweden)

    Ricardo de Amorim Corrêa

    2014-12-01

    OBJECTIVE: To compare 28-day mortality rates and clinical outcomes in ICU patients with ventilator-associated pneumonia (VAP) according to the diagnostic strategy used. METHODS: This was a prospective randomized clinical trial. Of the 73 patients included in the study, 36 and 37 were randomized to undergo BAL or endotracheal aspiration (EA), respectively. Antibiotic therapy was based on guidelines and was adjusted according to the results of quantitative cultures. RESULTS: The 28-day mortality rate was similar in the BAL and EA groups (25.0% and 37.8%, respectively; p = 0.353). There were no differences between the groups regarding the duration of mechanical ventilation, antibiotic therapy, secondary complications, VAP recurrence, or length of ICU and hospital stay. Initial antibiotic therapy was deemed appropriate in 28 (77.8%) and 30 (83.3%) of the patients in the BAL and EA groups, respectively (p = 0.551). The 28-day mortality rate was not associated with the appropriateness of initial therapy in the BAL and EA groups (appropriate therapy: 35.7% vs. 43.3%, p = 0.553; inappropriate therapy: 62.5% vs. 50.0%, p = 1.000). Previous use of antibiotics did not affect the culture yield in the EA or BAL group (p = 0.130 and p = 0.484, respectively). CONCLUSIONS: In the context of this study, the management of VAP patients based on the results of quantitative endotracheal aspirate cultures led to clinical outcomes similar to those obtained with the results of quantitative BAL fluid cultures.

  19. Effectiveness of school-based humanistic counselling for psychological distress in young people: Pilot randomized controlled trial with follow-up in an ethnically diverse sample.

    Science.gov (United States)

    Pearce, Peter; Sewell, Ros; Cooper, Mick; Osman, Sarah; Fugard, Andrew J B; Pybis, Joanne

    2017-06-01

    The aim of this study was to pilot a test of the effectiveness of school-based humanistic counselling (SBHC) in an ethnically diverse group of young people (aged 11-18 years), with follow-up assessments at 6 and 9 months. This was a pilot randomized controlled trial, using linear mixed-effects modelling and intention-to-treat analysis to compare changes in levels of psychological distress for participants in SBHC against usual care (UC). ISRCTN44253140. In total, 64 young people were randomized to either SBHC or UC. Participants were aged between 11 and 18 (M = 14.2, SD = 1.8), with 78.1% of a non-white ethnicity. The primary outcome was psychological distress at 6 weeks (mid-therapy), 12 weeks (end of therapy), 6-month follow-up and 9-month follow-up. Secondary measures included emotional symptoms, self-esteem and attainment of personal goals. Recruitment and retention rates for the study were acceptable. Participants in the SBHC condition, as compared with participants in the UC condition, showed greater reductions in psychological distress and emotional symptoms, and greater improvements in self-esteem, over time. However, at follow-up, only emotional symptoms showed significant differences across groups. The study adds to the pool of evidence suggesting that SBHC can be tested and that it brings about short-term reductions in psychological and emotional distress in young people, across ethnicities. However, there is no evidence of longer-term effects. School-based humanistic counselling can be an effective means of reducing the psychological distress experienced by young people with emotional symptoms in the short term, and this short-term effectiveness is not limited to young people of a White ethnicity. There is no evidence that school-based humanistic counselling has effects beyond the end of therapy. © 2016 The British Psychological Society.

  20. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    ... in the pharmaceutical industry, Clinical trial methodology emphasizes the importance of statistical thinking in clinical research and presents the methodology as a key component of clinical research...

  1. A double-blind, placebo-controlled, randomized trial of Ginkgo biloba extract EGb 761 in a sample of cognitively intact older adults: neuropsychological findings.

    Science.gov (United States)

    Mix, Joseph A; Crews, W David

    2002-08-01

    There appears to be an absence of large-scale clinical trials that have examined the efficacy of Ginkgo biloba extract on the neuropsychological functioning of cognitively intact older adults. The importance of such clinical research appears paramount in light of the plethora of products containing Ginkgo biloba that are currently being widely marketed to predominantly cognitively intact adults with claims of enhanced cognitive performance. The purpose of this research was to conduct the first known, large-scale clinical trial of the efficacy of Ginkgo biloba extract (EGb 761) on the neuropsychological functioning of cognitively intact older adults. Two hundred and sixty-two community-dwelling volunteers (both male and female) 60 years of age and older, who reported no history of dementia or significant neurocognitive impairments and obtained Mini-Mental State Examination total scores of at least 26, were examined via a 6-week, randomized, double-blind, fixed-dose, placebo-controlled, parallel-group, clinical trial. Participants were randomly assigned to receive either Ginkgo biloba extract EGb 761 (n = 131; 180 mg/day) or placebo (n = 131) for 6 weeks. Efficacy measures consisted of participants' raw change in performance scores from pretreatment baseline to those obtained just prior to termination of treatment on the following standardized neuropsychological measures: Selective Reminding Test (SRT), Wechsler Adult Intelligence Scale-III Block Design (WAIS-III BD) and Digit Symbol-Coding (WAIS-III DS) subtests, and the Wechsler Memory Scale-III Faces I (WMS-III FI) and Faces II (WMS-III FII) subtests. A subjective Follow-up Self-report Questionnaire was also administered to participants just prior to termination of the treatment phase. Analyses of covariance indicated that cognitively intact participants who received 180 mg of EGb 761 daily for 6 weeks exhibited significantly more improvement on SRT tasks involving delayed (30 min) free recall (p < …) … visual material.

  2. The potential of Virtual Reality as anxiety management tool: a randomized controlled study in a sample of patients affected by Generalized Anxiety Disorder

    Directory of Open Access Journals (Sweden)

    Gorini Alessandra

    2008-05-01

    Background: Generalized anxiety disorder (GAD) is a psychiatric disorder characterized by constant and unspecific anxiety that interferes with daily-life activities. Its high prevalence in the general population and the severe limitations it causes point out the necessity of finding new, efficient strategies to treat it. Together with cognitive-behavioural treatments, relaxation represents a useful approach for the treatment of GAD, but it has the limitation of being hard to learn. To overcome this limitation we propose the use of virtual reality (VR) to facilitate the relaxation process by visually presenting key relaxing images to the subjects. The visual presentation of a virtual calm scenario can facilitate patients' practice and mastery of relaxation, making the experience more vivid and real than the one most subjects can create using their own imagination and memory, and triggering a broad empowerment process within the experience induced by a high sense of presence. According to these premises, the aim of the present study is to investigate the advantages of using a VR-based relaxation protocol in reducing anxiety in patients affected by GAD. Methods/Design: The trial is based on a randomized controlled study including three groups of 25 patients each (for a total of 75 patients): (1) the VR group, (2) the non-VR group and (3) the waiting list (WL) group. Patients in the VR group will be taught to relax using a VR relaxing environment and audio-visual mobile narratives; patients in the non-VR group will be taught to relax using the same relaxing narratives proposed to the VR group, but without the VR support; and patients in the WL group will not receive any kind of relaxation training. Psychometric and psychophysiological outcomes will serve as quantitative dependent variables, while subjective reports of participants will be used as qualitative dependent variables. Conclusion: We argue that the use of VR for relaxation …

  3. Parent-Child Associations in Pedometer-Determined Physical Activity and Sedentary Behaviour on Weekdays and Weekends in Random Samples of Families in the Czech Republic

    Directory of Open Access Journals (Sweden)

    Dagmar Sigmundová

    2014-07-01

    This study investigates whether more physically active parents bring up more physically active children and whether parents' level of physical activity helps children achieve step count recommendations on weekdays and weekends. The participants (388 parents aged 35–45 and their 485 children aged 9–12) were randomly recruited from 21 Czech government-funded primary schools. The participants recorded pedometer step counts for seven days (≥10 h a day) during April–May and September–October of 2013. Logistic regression (Enter method) was used to examine the achievement of the international recommendations of 11,000 steps/day for girls and 13,000 steps/day for boys. The children of fathers and mothers who met the weekend recommendation of 10,000 steps were 5.48 times (95% confidence interval: 1.65–18.19; p < 0.01) and 3.60 times (95% confidence interval: 1.21–10.74; p < 0.05) more likely, respectively, to achieve the international weekend recommendation than the children of less active parents. The children of mothers who reached the weekday pedometer-based step count recommendation were 4.94 times (95% confidence interval: 1.45–16.82; p < 0.05) more likely to fulfil the step count recommendation on weekdays than the children of less active mothers.
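
    A minimal sketch of this kind of analysis (synthetic data; the prevalences and effect sizes below are invented, not the study's): logistic regression of a child's achievement on indicators of each parent meeting the recommendation, with odds ratios and confidence intervals read off the fit.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 485
    mother_ok = rng.binomial(1, 0.4, n)
    father_ok = rng.binomial(1, 0.3, n)
    logit = -1.0 + 1.3 * mother_ok + 1.7 * father_ok      # invented effects
    child_ok = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    X = sm.add_constant(np.column_stack([mother_ok, father_ok]))
    fit = sm.Logit(child_ok, X).fit(disp=0)
    print(np.exp(fit.params[1:]))        # odds ratios (mother, father)
    print(np.exp(fit.conf_int()[1:]))    # 95% confidence intervals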

  4. Measurement methodology of vegetable samples from an area affected by residual contamination due to uranium mining sterile; Metodologia de medida de muestras vegetales procedentes de un terreno afectado por contaminacion residual debida a esteriles de mineria de uranio

    Energy Technology Data Exchange (ETDEWEB)

    Navarro, N.; Suarez, J. A.; Yague, L.; Ortiz Gandia, M. I.; Marijuan, M. J.; Garcia, E.; Ortiz, T.; Alvarez, A.

    2013-07-01

    This paper presents the methodology established for the radiological characterization of the plant material generated during the first stage of earth-moving works on land affected by residual contamination due to buried uranium-mining waste rock (estériles). (Author)

  5. Tolerance limits and tolerance intervals for ratios of normal random variables using a bootstrap calibration.

    Science.gov (United States)

    Flouri, Marilena; Zhai, Shuyan; Mathew, Thomas; Bebu, Ionut

    2017-05-01

    This paper addresses the problem of deriving one-sided tolerance limits and two-sided tolerance intervals for a ratio of two random variables that follow a bivariate normal distribution, or a lognormal/normal distribution. The methodology that is developed uses nonparametric tolerance limits based on a parametric bootstrap sample, coupled with a bootstrap calibration in order to improve accuracy. The methodology is also adapted for computing confidence limits for the median of the ratio random variable. Numerical results are reported to demonstrate the accuracy of the proposed approach. The methodology is illustrated using examples where ratio random variables are of interest: an example on the radioactivity count in reverse transcriptase assays and an example from the area of cost-effectiveness analysis in health economics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
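
    A simplified sketch of the core construction (toy data; the paper's bootstrap calibration step, which adjusts the nominal levels for accuracy, is omitted here): fit a bivariate normal, draw a large parametric bootstrap sample of the ratio, and take a nonparametric order-statistic tolerance limit from it.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    data = rng.multivariate_normal([10, 5], [[4, 1], [1, 1]], size=40)  # toy (X, Y)

    mu, cov = data.mean(axis=0), np.cov(data.T)
    m = 100_000
    boot = rng.multivariate_normal(mu, cov, size=m)   # parametric bootstrap
    ratios = np.sort(boot[:, 0] / boot[:, 1])

    P, conf = 0.90, 0.95
    # smallest rank k with P(Binomial(m, P) < k) >= conf gives the usual
    # nonparametric upper (P, conf) tolerance limit based on order statistics
    k = int(np.searchsorted(stats.binom.cdf(np.arange(m), m, P), conf)) + 1
    print("upper (0.90, 0.95) tolerance limit ~", ratios[k - 1])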

  6. A comparison of random walks in dependent random environments

    NARCIS (Netherlands)

    Scheinhardt, Willem R.W.; Kroese, Dirk

    2015-01-01

    Although the theoretical behavior of one-dimensional random walks in random environments is well understood, the actual evaluation of various characteristics of such processes has received relatively little attention. This paper develops new methodology for the exact computation of the drift in such processes.
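
    For intuition only, the drift in question can be approximated by simulation (an i.i.d. environment is used below for simplicity, whereas the paper's subject is dependent environments and exact, rather than simulated, computation):

    import random

    rng = random.Random(0)
    env = {}   # site -> probability of stepping right, drawn lazily

    def p_right(site):
        if site not in env:
            env[site] = rng.uniform(0.4, 0.9)   # the random environment
        return env[site]

    x, n = 0, 1_000_000
    for _ in range(n):
        x += 1 if rng.random() < p_right(x) else -1
    print("empirical drift X_n / n ~", x / n)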

  7. Data fabrication and other reasons for non-random sampling in 5087 randomised, controlled trials in anaesthetic and general medical journals.

    Science.gov (United States)

    Carlisle, J B

    2017-08-01

    …10⁻¹⁰. The difference between the distributions of these two subgroups was confirmed by comparison of their overall distributions, p = 5.3 × 10⁻¹⁵. Each journal exhibited the same abnormal distribution of baseline means. There was no difference in the distributions of baseline means for 1453 trials in non-anaesthetic journals and 3634 trials in anaesthetic journals, p = 0.30. The rate of retractions from JAMA and NEJM, 6/1453 or 1 in 242, was one-quarter the rate from the six anaesthetic journals, 66/3634 or 1 in 55, relative risk (99% CI) 0.23 (0.08-0.68), p = 0.00022. A probability threshold of 1 in 10,000 identified 8/72 (11%) retracted trials (7 by Fujii et al.) and 82/5015 (1.6%) unretracted trials. Some p values were so extreme that the baseline data could not be correct: for instance, for 43/5015 unretracted trials the probability was less than 1 in 10¹⁵ (equivalent to one drop of water in 20,000 Olympic-sized swimming pools). A probability threshold of 1 in 100 for two or more trials by the same author identified three authors of retracted trials (Boldt, Fujii and Reuben) and 21 first or corresponding authors of 65 unretracted trials. Fraud, unintentional error, correlation, stratified allocation and poor methodology might have contributed to the excess of randomised, controlled trials with similar or dissimilar means, a pattern that was common to all the surveyed journals. It is likely that this work will lead to the identification, correction and retraction of hitherto unretracted randomised, controlled trials. © 2017 The Association of Anaesthetists of Great Britain and Ireland.
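
    A simplified sketch of this style of screening (not Carlisle's exact Monte Carlo method): recover a baseline p-value from each trial's reported summary statistics and check the collection against the uniform distribution that genuine randomisation should produce.

    import numpy as np
    from scipy import stats

    def baseline_p(m1, s1, n1, m2, s2, n2):
        # Welch t-test computed from reported summary statistics
        se = np.sqrt(s1**2 / n1 + s2**2 / n2)
        df = se**4 / ((s1**2 / n1)**2 / (n1 - 1) + (s2**2 / n2)**2 / (n2 - 1))
        return 2 * stats.t.sf(abs(m1 - m2) / se, df)

    # 500 honestly randomised toy trials should yield roughly uniform p-values
    rng = np.random.default_rng(0)
    ps = []
    for _ in range(500):
        a, b = rng.normal(50, 10, 40), rng.normal(50, 10, 40)
        ps.append(baseline_p(a.mean(), a.std(ddof=1), 40,
                             b.mean(), b.std(ddof=1), 40))
    print(stats.kstest(ps, "uniform"))  # a small p-value here would flag trouble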

  8. Rationale and design of the HOME trial: A pragmatic randomized controlled trial of home-based human papillomavirus (HPV) self-sampling for increasing cervical cancer screening uptake and effectiveness in a U.S. healthcare system.

    Science.gov (United States)

    Winer, Rachel L; Tiro, Jasmin A; Miglioretti, Diana L; Thayer, Chris; Beatty, Tara; Lin, John; Gao, Hongyuan; Kimbel, Kilian; Buist, Diana S M

    2018-01-01

    Women who delay or do not attend Papanicolaou (Pap) screening are at increased risk for cervical cancer. Trials in countries with organized screening programs have demonstrated that mailing high-risk (hr) human papillomavirus (HPV) self-sampling kits to under-screened women increases participation, but U.S. data are lacking. HOME is a pragmatic randomized controlled trial set within a U.S. integrated healthcare delivery system to compare two programmatic approaches for increasing cervical cancer screening uptake and effectiveness in under-screened women (≥3.4 years since last Pap) aged 30-64 years: 1) usual care (annual patient reminders and ad hoc outreach by clinics) and 2) usual care plus mailed hrHPV self-screening kits. Over 2.5 years, eligible women were identified through electronic medical record (EMR) data and randomized 1:1 to the intervention or control arm. Women in the intervention arm were mailed kits with pre-paid envelopes to return samples to the central clinical laboratory for hrHPV testing. Results were documented in the EMR to notify women's primary care providers of appropriate follow-up. Primary outcomes are detection and treatment of cervical neoplasia. Secondary outcomes are cervical cancer screening uptake, abnormal screening results, and women's experiences and attitudes towards hrHPV self-sampling and follow-up of hrHPV-positive results (measured through surveys and interviews). The trial was designed to evaluate whether a programmatic strategy incorporating hrHPV self-sampling is effective in promoting adherence to the complete screening process (including follow-up of abnormal screening results and treatment). The objective of this report is to describe the rationale and design of this pragmatic trial. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Health indicators: eliminating bias from convenience sampling estimators.

    Science.gov (United States)

    Hedt, Bethany L; Pagano, Marcello

    2011-02-28

    Public health practitioners are often called upon to make inference about a health indicator for a population at large when the sole available information is data gathered from a convenience sample, such as data gathered on visitors to a clinic. These data may be of the highest quality and quite extensive, but the biases inherent in a convenience sample preclude the legitimate use of powerful inferential tools that are usually associated with a random sample. In general, we know nothing about those who do not visit the clinic beyond the fact that they do not visit the clinic. An alternative is to take a random sample of the population. However, we show that this solution would be wasteful if it excluded the use of available information. Hence, we present a simple annealing methodology that combines a relatively small, and presumably far less expensive, random sample with the convenience sample. This allows us to not only take advantage of powerful inferential tools, but also provides more accurate information than that available from just using data from the random sample alone. Copyright © 2011 John Wiley & Sons, Ltd.
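
    The intuition (though not the authors' estimator, which is more refined) can be shown in a few lines on a toy population: use the large convenience sample for clinic attenders, and the small random sample both to estimate the attender fraction and to cover the non-attenders the clinic never sees.

    import numpy as np

    rng = np.random.default_rng(0)
    N = 100_000                                  # toy population
    attends = rng.random(N) < 0.30               # 30% visit the clinic
    sick = np.where(attends, rng.random(N) < 0.25, rng.random(N) < 0.10)

    conv = sick[attends][:5000]                  # large convenience sample
    srs = rng.choice(N, 300, replace=False)      # small random sample

    w = attends[srs].mean()                      # estimated attender fraction
    p_att = conv.mean()                          # precise, but attenders only
    p_non = sick[srs][~attends[srs]].mean()      # non-attenders, from the SRS

    print("convenience-only estimate:", p_att)            # badly biased
    print("combined estimate:", w * p_att + (1 - w) * p_non)
    print("true prevalence:", sick.mean())                # ~0.145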

  10. Investigating the Randomness of Numbers

    Science.gov (United States)

    Pendleton, Kenn L.

    2009-01-01

    The use of random numbers is pervasive in today's world. Random numbers have practical applications in such far-flung arenas as computer simulations, cryptography, gambling, the legal system, statistical sampling, and even the war on terrorism. Evaluating the randomness of extremely large samples is a complex, intricate process. However, the…
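
    One of the simplest of the checks alluded to above, shown for concreteness (a chi-square frequency test, which genuinely random digits should pass; more stringent batteries also test runs, serial correlation and more):

    import random
    from collections import Counter
    from scipy import stats

    rng = random.Random(42)
    digits = [rng.randrange(10) for _ in range(100_000)]
    counts = Counter(digits)
    observed = [counts[d] for d in range(10)]

    # expected counts are uniform over the digits 0..9 by default
    chi2, p = stats.chisquare(observed)
    print(f"chi-square = {chi2:.1f}, p = {p:.3f}")   # large p: looks uniform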

  11. Rationale and design of the iPap trial: a randomized controlled trial of home-based HPV self-sampling for improving participation in cervical screening by never- and under-screened women in Australia

    International Nuclear Information System (INIS)

    Sultana, Farhana; Gertig, Dorota M; English, Dallas R; Simpson, Julie A; Brotherton, Julia ML; Drennan, Kelly; Mullins, Robyn; Heley, Stella; Wrede, C David; Saville, Marion

    2014-01-01

    Organized screening based on Pap tests has substantially reduced deaths from cervical cancer in many countries, including Australia. However, the impact of the program depends upon the degree to which women participate. A new method of screening, testing for human papillomavirus (HPV) DNA to detect the virus that causes cervical cancer, has recently become available. Because women can collect their own samples for this test at home, it has the potential to overcome some of the barriers to Pap tests. …