WorldWideScience

Sample records for outcome-dependent sampling designs

  1. Outcome-Dependent Sampling Design and Inference for Cox's Proportional Hazards Model.

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Cai, Jianwen; Sandler, Dale P; Zhou, Haibo

    2016-11-01

    We propose a cost-effective outcome-dependent sampling (ODS) design for failure time data and develop an efficient inference procedure for data collected with this design. To account for the biased sampling scheme, we derive estimators from a weighted partial likelihood estimating equation. The proposed estimators for regression parameters are shown to be consistent and asymptotically normally distributed. A criterion that can be used to optimally implement the ODS design in practice is proposed and studied. The small sample performance of the proposed method is evaluated by simulation studies. The proposed design and inference procedure are shown to be statistically more powerful than existing alternative designs with the same sample sizes. We illustrate the proposed method with data from the Cancer Incidence and Mortality of Uranium Miners Study.
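
The weighted partial-likelihood estimating equation described above can be sketched in a few lines. The following is an illustrative single-covariate version under assumed inverse-probability-of-sampling weights, not the authors' implementation; the weight vector `w` and the bisection solver are simplifications for exposition.

```python
import numpy as np

def weighted_cox_score(beta, time, event, x, w):
    """Weighted partial-likelihood score U(beta) for one covariate.

    Each subject enters both its own term and every risk set with
    weight w, e.g. the inverse of its sampling probability under an
    outcome-dependent sampling design."""
    score = 0.0
    for i in range(len(time)):
        if not event[i]:
            continue
        at_risk = time >= time[i]
        r = w[at_risk] * np.exp(beta * x[at_risk])
        score += w[i] * (x[i] - np.sum(r * x[at_risk]) / np.sum(r))
    return score

def solve_beta(time, event, x, w, lo=-5.0, hi=5.0, tol=1e-8):
    """Root of U(beta) = 0 by bisection; U is decreasing in beta
    because the weighted partial log-likelihood is concave."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if weighted_cox_score(mid, time, event, x, w) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Because each subject's contribution is multiplied by its weight, rescaling all weights by a constant leaves the estimate unchanged; only the relative sampling probabilities matter.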

  2. Outcome-Dependent Sampling Design and Inference for Cox’s Proportional Hazards Model

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Cai, Jianwen; Sandler, Dale P.; Zhou, Haibo

    2016-01-01

    We propose a cost-effective outcome-dependent sampling (ODS) design for failure time data and develop an efficient inference procedure for data collected with this design. To account for the biased sampling scheme, we derive estimators from a weighted partial likelihood estimating equation. The proposed estimators for regression parameters are shown to be consistent and asymptotically normally distributed. A criterion that can be used to optimally implement the ODS design in practice is proposed and studied. The small sample performance of the proposed method is evaluated by simulation studies. The proposed design and inference procedure are shown to be statistically more powerful than existing alternative designs with the same sample sizes. We illustrate the proposed method with data from the Cancer Incidence and Mortality of Uranium Miners Study. PMID:28090134

  3. Statistical inference for the additive hazards model under outcome-dependent sampling.

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Sandler, Dale P; Zhou, Haibo

    2015-09-01

    Cost-effective study designs and proper inference procedures for data from such designs are always of particular interest to study investigators. In this article, we propose a biased sampling scheme, an outcome-dependent sampling (ODS) design, for survival data with right censoring under the additive hazards model. We develop a weighted pseudo-score estimator for the regression parameters under the proposed design and derive the asymptotic properties of the proposed estimator. We also provide some suggestions for using the proposed method by evaluating the relative efficiency of the proposed method against the simple random sampling design, and we derive the optimal allocation of the subsamples for the proposed design. Simulation studies show that the proposed ODS design is more powerful than other existing designs and the proposed estimator is more efficient than other estimators. We apply our method to analyze a cancer study conducted at NIEHS, the Cancer Incidence and Mortality of Uranium Miners Study, to investigate the association between radon exposure and cancer risk.

  4. Graphical models for inference under outcome-dependent sampling

    DEFF Research Database (Denmark)

    Didelez, V; Kreiner, S; Keiding, N

    2010-01-01

    We consider situations where data have been collected such that the sampling depends on the outcome of interest and possibly further covariates, as for instance in case-control studies. Graphical models represent assumptions about the conditional independencies among the variables. By including a node for the sampling indicator, assumptions about sampling processes can be made explicit. We demonstrate how to read off such graphs whether consistent estimation of the association between exposure and outcome is possible. Moreover, we give sufficient graphical conditions for testing and estimating...

  5. Joint analysis of binary and quantitative traits with data sharing and outcome-dependent sampling.

    Science.gov (United States)

    Zheng, Gang; Wu, Colin O; Kwak, Minjung; Jiang, Wenhua; Joo, Jungnam; Lima, Joao A C

    2012-04-01

    We study the analysis of a joint association between a genetic marker and both binary (case-control) and quantitative (continuous) traits, where the quantitative trait values are only available for the cases due to data sharing and outcome-dependent sampling. Data sharing has become common in genetic association studies, and outcome-dependent sampling is a consequence of data sharing, under which a phenotype of interest is not measured for some subgroup. The trend test (or Pearson's test) and the F-test are often used to analyze the binary and quantitative traits, respectively. Because of the outcome-dependent sampling, the usual F-test can be applied using the subgroup with the observed quantitative traits. We propose a modified F-test that also incorporates the genotype frequencies of the subgroup whose traits are not observed. Further, we propose combining this modified F-test and Pearson's test through Fisher's combination of their P-values as a joint analysis. Because of the correlation between the two analyses, we propose to use a Gamma (scaled chi-squared) distribution to fit the asymptotic null distribution of the joint statistic. The proposed modified F-test and the joint analysis can also be applied to test single-trait association (either the binary or the quantitative trait). Through simulations, we identify the situations under which the proposed tests are more powerful than the existing ones. Application to a real dataset of rheumatoid arthritis is presented. © 2012 Wiley Periodicals, Inc.
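
The Gamma (scaled chi-squared) approximation mentioned above can be illustrated by moment matching. This is a sketch, not the paper's exact calibration: the `pairwise_cov` argument, the covariance between the -2*log(p) components of the correlated tests, is a hypothetical input the analyst would have to estimate.

```python
import math

def fisher_stat(p_values):
    """Fisher's combination statistic T = -2 * sum(log p_i)."""
    return -2.0 * sum(math.log(p) for p in p_values)

def gamma_null_params(k, pairwise_cov=0.0):
    """Moment-matched Gamma(shape, scale) null distribution for T.

    Each -2*log(p_i) term has mean 2 and variance 4 under the null;
    correlation between the k component tests adds their pairwise
    covariances to Var(T). Matching the first two moments of a Gamma
    gives shape = mean^2 / var and scale = var / mean."""
    mean = 2.0 * k
    var = 4.0 * k + pairwise_cov * k * (k - 1)
    return mean * mean / var, var / mean
```

With `pairwise_cov = 0` this recovers Gamma(k, 2), i.e. the usual chi-squared distribution with 2k degrees of freedom, so the fitted null reduces to Fisher's classical result when the tests are independent.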

  6. Are quantitative trait-dependent sampling designs cost-effective for analysis of rare and common variants?

    Science.gov (United States)

    Yilmaz, Yildiz E; Bull, Shelley B

    2011-11-29

    Use of trait-dependent sampling designs in whole-genome association studies of sequence data can reduce total sequencing costs with modest losses of statistical efficiency. In a quantitative trait (QT) analysis of data from the Genetic Analysis Workshop 17 mini-exome for unrelated individuals in the Asian subpopulation, we investigate alternative designs that sequence only 50% of the entire cohort. In addition to a simple random sampling design, we consider extreme-phenotype designs that are of increasing interest in genetic association analysis of QTs, especially in studies concerned with the detection of rare genetic variants. We also evaluate a novel sampling design in which all individuals have a nonzero probability of being selected into the sample but in which individuals with extreme phenotypes have a proportionately larger probability. We take differential sampling of individuals with informative trait values into account by inverse probability weighting using standard survey methods which thus generalizes to the source population. In replicate 1 data, we applied the designs in association analysis of Q1 with both rare and common variants in the FLT1 gene, based on knowledge of the generating model. Using all 200 replicate data sets, we similarly analyzed Q1 and Q4 (which is known to be free of association with FLT1) to evaluate relative efficiency, type I error, and power. Simulation study results suggest that the QT-dependent selection designs generally yield greater than 50% relative efficiency compared to using the entire cohort, implying cost-effectiveness of 50% sample selection and worthwhile reduction of sequencing costs.
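
The design in which everyone has a nonzero selection probability, but extreme phenotypes are proportionately more likely to be chosen, combined with inverse probability weighting, can be sketched as follows. The selection scheme, sample size, and trait distribution are invented for illustration and are not the paper's design.

```python
import numpy as np

rng = np.random.default_rng(17)

# Cohort with a normally distributed quantitative trait; only about
# half of the cohort will be selected for sequencing.
n = 2000
trait = rng.normal(size=n)

# Everyone has a nonzero selection probability, but extreme phenotypes
# are proportionately more likely to be chosen (illustrative scheme,
# scaled so the expected sample size is n/2).
raw = 0.2 + np.abs(trait)
prob = np.clip(raw * (n / 2) / raw.sum(), 0.0, 1.0)
selected = rng.random(n) < prob

# Inverse probability weighting (Horvitz-Thompson) generalizes the
# sampled subjects back to the source cohort.
w = 1.0 / prob[selected]
y = trait[selected]
ht_mean = np.sum(w * y) / np.sum(w)
ht_var = np.sum(w * (y - ht_mean) ** 2) / np.sum(w)

# The unweighted sample over-represents the tails, inflating variance.
naive_var = y.var()
```

Here the weighted variance estimate stays close to the cohort value of about 1, while the naive estimate from the oversampled tails is markedly inflated.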

  7. Two‐phase designs for joint quantitative‐trait‐dependent and genotype‐dependent sampling in post‐GWAS regional sequencing

    Science.gov (United States)

    Espin‐Garcia, Osvaldo; Craiu, Radu V.

    2017-01-01

    We evaluate two‐phase designs for following up findings from a genome‐wide association study (GWAS) when the cost of regional sequencing in the entire cohort is prohibitive. We develop novel expectation‐maximization‐based inference under a semiparametric maximum likelihood formulation tailored for post‐GWAS inference. A GWAS SNP (single nucleotide polymorphism) serves as a surrogate covariate in inferring association between a sequence variant and a normally distributed quantitative trait (QT). We assess test validity and quantify the efficiency and power of joint QT‐ and SNP‐dependent sampling and analysis under alternative sample allocations by simulation. Joint allocation balanced on SNP genotype and extreme‐QT strata yields significant power improvements compared with marginal QT‐ or SNP‐based allocations. We illustrate the proposed method and evaluate the sensitivity of sample allocation to sampling variation using data from a sequencing study of systolic blood pressure. PMID:29239496

  8. Adaptive designs for the one-sample log-rank test.

    Science.gov (United States)

    Schmidt, Rene; Faldum, Andreas; Kwiecien, Robert

    2017-09-22

    Traditional designs in phase IIa cancer trials are single-arm designs with a binary outcome, for example, tumor response. In some settings, however, a time-to-event endpoint might appear more appropriate, particularly in the presence of loss to follow-up. Then the one-sample log-rank test might be the method of choice. It allows the survival curve of the patients under treatment to be compared with a prespecified reference survival curve. The reference curve usually represents the expected survival under the standard of care. In this work, convergence of the one-sample log-rank statistic to Brownian motion is proven using Rebolledo's martingale central limit theorem while accounting for staggered entry times of the patients. On this basis, a confirmatory adaptive one-sample log-rank test is proposed where provision is made for data-dependent sample size reassessment. The focus is on applying the inverse normal method. This is done in two different directions. The first strategy exploits the independent increments property of the one-sample log-rank statistic. The second strategy is based on the patient-wise separation principle. It is shown by simulation that the proposed adaptive test may help to rescue an underpowered trial while lowering the average sample number (ASN) under the null hypothesis compared with a single-stage fixed sample design. © 2017, The International Biometric Society.
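
The basic one-sample log-rank comparison against a reference curve can be sketched as below. This is the textbook fixed-sample statistic, not the paper's adaptive procedure; the normal approximation and the exponential reference curve in the usage example are assumptions for illustration.

```python
import math

def one_sample_logrank(times, events, cum_hazard):
    """One-sample log-rank test against a reference survival curve.

    O is the observed number of events; E is the expected number under
    the reference, obtained by summing the reference cumulative hazard
    at each subject's follow-up time. Z = (O - E) / sqrt(E) is
    approximately standard normal under the null hypothesis."""
    O = sum(events)
    E = sum(cum_hazard(t) for t in times)
    return O, E, (O - E) / math.sqrt(E)

# Usage: exponential reference with a 12-month median survival,
# so the cumulative hazard is (log 2 / 12) * t.
lam = math.log(2) / 12.0
O, E, z = one_sample_logrank([6, 12, 18, 24], [1, 0, 1, 1],
                             lambda t: lam * t)
```

A negative Z indicates fewer events than the reference curve predicts, i.e. survival under treatment exceeding the historical expectation.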

  9. The significance of Sampling Design on Inference: An Analysis of Binary Outcome Model of Children’s Schooling Using Indonesian Large Multi-stage Sampling Data

    OpenAIRE

    Ekki Syamsulhakim

    2008-01-01

    This paper aims to exercise a rather recent trend in applied microeconometrics, namely the effect of sampling design on statistical inference, especially in binary outcome models. Much theoretical research in econometrics has shown the inappropriateness of applying i.i.d.-assumed statistical analysis to non-i.i.d. data. This research has provided proofs showing that applying i.i.d.-assumed analysis to non-i.i.d. observations results in inflated standard errors, which could make the esti...

  10. Estimation of AUC or Partial AUC under Test-Result-Dependent Sampling.

    Science.gov (United States)

    Wang, Xiaofei; Ma, Junling; George, Stephen; Zhou, Haibo

    2012-01-01

    The area under the ROC curve (AUC) and partial area under the ROC curve (pAUC) are summary measures used to assess the accuracy of a biomarker in discriminating true disease status. The standard sampling approach used in biomarker validation studies is often inefficient and costly, especially when ascertaining the true disease status is costly and invasive. To improve efficiency and reduce the cost of biomarker validation studies, we consider a test-result-dependent sampling (TDS) scheme, in which subject selection for determining the disease state is dependent on the result of a biomarker assay. We first estimate the test-result distribution using data arising from the TDS design. With the estimated empirical test-result distribution, we propose consistent nonparametric estimators for AUC and pAUC and establish the asymptotic properties of the proposed estimators. Simulation studies show that the proposed estimators have good finite sample properties and that the TDS design yields more efficient AUC and pAUC estimates than a simple random sampling (SRS) design. A data example based on an ongoing cancer clinical trial is provided to illustrate the TDS design and the proposed estimators. This work can find broad applications in design and analysis of biomarker validation studies.
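
The nonparametric AUC in the abstract above is the familiar proportion of correctly ordered case/control pairs. The sketch below shows that estimator plus a weighted variant; the weighted version is one plausible way to incorporate inverse sampling probabilities under a test-result-dependent design, not necessarily the estimator the authors propose.

```python
def empirical_auc(cases, controls):
    """Nonparametric AUC: the proportion of case/control pairs in which
    the case has the larger marker value (ties count one half)."""
    total = 0.0
    for yc in cases:
        for yn in controls:
            total += 1.0 if yc > yn else 0.5 if yc == yn else 0.0
    return total / (len(cases) * len(controls))

def weighted_auc(cases, controls, w_cases, w_controls):
    """Same pairwise statistic with per-subject weights, e.g. inverse
    sampling probabilities under test-result-dependent sampling."""
    num = den = 0.0
    for yc, wc in zip(cases, w_cases):
        for yn, wn in zip(controls, w_controls):
            pair_w = wc * wn
            den += pair_w
            num += pair_w * (1.0 if yc > yn else 0.5 if yc == yn else 0.0)
    return num / den
```

With all weights equal to 1, the weighted estimator reduces exactly to the unweighted one.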

  11. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys.

    Directory of Open Access Journals (Sweden)

    Lauren Hund

    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we compare these latter cluster LQAS methodologies and provide recommendations for choosing a cluster LQAS design. We compare technical differences in the three methods and determine situations in which the choice of method results in a substantively different design. We consider two different aspects of the methods: the distributional assumptions and the clustering parameterization. Further, we provide software tools for implementing each method and clarify misconceptions about these designs in the literature. We illustrate the differences in these methods using vaccination and nutrition cluster LQAS surveys as example designs. The cluster methods are not sensitive to the distributional assumptions but can result in substantially different designs (sample sizes) depending on the clustering parameterization. However, none of the clustering parameterizations used in the existing methods appears to be consistent with the observed data, and, consequently, choice between the cluster LQAS methods is not straightforward. Further research should attempt to characterize clustering patterns in specific applications and provide suggestions for best-practice cluster LQAS designs on a setting-specific basis.

  12. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys.

    Science.gov (United States)

    Hund, Lauren; Bedrick, Edward J; Pagano, Marcello

    2015-01-01

    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we compare these latter cluster LQAS methodologies and provide recommendations for choosing a cluster LQAS design. We compare technical differences in the three methods and determine situations in which the choice of method results in a substantively different design. We consider two different aspects of the methods: the distributional assumptions and the clustering parameterization. Further, we provide software tools for implementing each method and clarify misconceptions about these designs in the literature. We illustrate the differences in these methods using vaccination and nutrition cluster LQAS surveys as example designs. The cluster methods are not sensitive to the distributional assumptions but can result in substantially different designs (sample sizes) depending on the clustering parameterization. However, none of the clustering parameterizations used in the existing methods appears to be consistent with the observed data, and, consequently, choice between the cluster LQAS methods is not straightforward. Further research should attempt to characterize clustering patterns in specific applications and provide suggestions for best-practice cluster LQAS designs on a setting-specific basis.
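
The "standard binomial model" for constructing LQAS decision rules, referred to in the abstract above, can be sketched as follows. The coverage thresholds and rule in the usage line are invented for illustration; real surveys choose n and d to meet prespecified risk targets.

```python
from math import comb

def binom_cdf(d, n, p):
    """P(X <= d) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(d + 1))

def lqas_errors(n, d, p_low, p_high):
    """Misclassification risks for the rule 'accept the lot if more
    than d of the n sampled units are positive':
      alpha = P(reject | true coverage p_high)
      beta  = P(accept | true coverage p_low)."""
    alpha = binom_cdf(d, n, p_high)
    beta = 1.0 - binom_cdf(d, n, p_low)
    return alpha, beta

# Toy usage: sample 2 units, accept only if both are positive.
alpha, beta = lqas_errors(2, 0, p_low=0.5, p_high=0.8)
```

The cluster LQAS methods compared in the paper modify exactly this calculation, replacing the binomial with a model that allows within-cluster correlation.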

  13. Exercise for methamphetamine dependence: rationale, design, and methodology.

    Science.gov (United States)

    Mooney, Larissa J; Cooper, Christopher; London, Edythe D; Chudzynski, Joy; Dolezal, Brett; Dickerson, Daniel; Brecht, Mary-Lynn; Peñate, Jose; Rawson, Richard A

    2014-01-01

    Effective pharmacotherapies to treat methamphetamine (MA) dependence have not been identified, and behavioral therapies are marginally effective. Based on behavioral studies demonstrating the potential efficacy of aerobic exercise for improving depressive symptoms, anxiety, cognitive deficits, and substance use outcomes, the study described here is examining exercise as a potential treatment for MA-dependent individuals. This study is randomizing 150 participants with MA dependence at a residential treatment facility for addictive disorders to receive either a thrice-weekly structured aerobic and resistance exercise intervention or a health education condition. Recruitment commenced in March, 2010. Enrollment and follow-up phases are ongoing, and recruitment is exceeding targeted enrollment rates. Seeking evidence for a possibly effective adjunct to traditional behavioral approaches for treatment of MA dependence, this study is assessing the ability of an 8-week aerobic and resistance exercise protocol to reduce relapse to MA use during a 12-week follow-up period after discharge from residential-based treatment. The study also is evaluating improvements in health and functional outcomes during and after the protocol. This paper describes the design and methods of the study. Copyright © 2013 Elsevier Inc. All rights reserved.

  14. Outcome dependency alters the neural substrates of impression formation

    Science.gov (United States)

    Ames, Daniel L.; Fiske, Susan T.

    2015-01-01

    How do people maintain consistent impressions of other people when other people are often inconsistent? The present research addresses this question by combining recent neuroscientific insights with ecologically meaningful behavioral methods. Participants formed impressions of real people whom they met in a personally involving situation. fMRI and supporting behavioral data revealed that outcome dependency (i.e., depending on another person for a desired outcome) alters previously identified neural dynamics of impression formation. Consistent with past research, a functional localizer identified a region of dorsomedial PFC previously linked to social impression formation. In the main task, this ROI revealed the predicted patterns of activity across outcome dependency conditions: greater BOLD response when information confirmed (vs. violated) social expectations if participants were outcome-independent and the reverse pattern if participants were outcome-dependent. We suggest that, although social perceivers often discount expectancy-disconfirming information as noise, being dependent on another person for a desired outcome focuses impression-formation processing on the most diagnostic information, rather than on the most tractable information. PMID:23850465

  15. Predictors of posttreatment drinking outcomes in patients with alcohol dependence.

    Science.gov (United States)

    Flórez, Gerardo; Saiz, Pilar A; García-Portilla, Paz; De Cos, Francisco J; Dapía, Sonia; Alvarez, Sandra; Nogueiras, Luis; Bobes, Julio

    2015-01-01

    This cohort study examined how predictors of alcohol dependence treatment outcomes work together over time by comparing pretreatment and posttreatment predictors. A sample of 274 alcohol-dependent patients was recruited and assessed at baseline, 6 months after treatment initiation (end of the active intervention phase), and 18 months after treatment initiation (end of the 12-month research follow-up phase). At each assessment point, the participants completed a battery of standardized tests [European Addiction Severity Index (EuropASI), Obsessive Compulsive Drinking Scale (OCDS), Alcohol Timeline Followback (TLFB), Fagerström, and International Personality Disorder Examination (IPDE)] that measured symptom severity and consequences; biological markers of alcohol consumption were also tested at each assessment point. A sequential strategy with univariate and multivariate analyses was used to identify how pretreatment and posttreatment predictors influence outcomes up to 1 year after treatment. Pretreatment variables had less predictive power than posttreatment ones. OCDS scores and biological markers of alcohol consumption were the most significant variables for the prediction of posttreatment outcomes. Prior pharmacotherapy treatment and relapse prevention interventions were also associated with posttreatment outcomes. The findings highlight the positive impact of pharmacotherapy during the first 6 months after treatment initiation and of relapse prevention during the first year after treatment and how posttreatment predictors are more important than pretreatment predictors.

  16. Impact of family-friendly prison policies on health, justice and child protection outcomes for incarcerated mothers and their dependent children: a cohort study protocol.

    Science.gov (United States)

    Myers, Helen; Segal, Leonie; Lopez, Derrick; Li, Ian W; Preen, David B

    2017-08-23

    Female imprisonment has numerous health and social sequelae for both women prisoners and their children. Examples of comprehensive family-friendly prison policies that seek to improve the health and social functioning of women prisoners and their children exist but have not been evaluated. This study will determine the impact of exposure to a family-friendly prison environment on health, child protection and justice outcomes for incarcerated mothers and their dependent children. A longitudinal retrospective cohort design will be used to compare outcomes for mothers incarcerated at Boronia Pre-release Centre, a women's prison with a dedicated family-friendly environment, and their dependent children, with outcomes for mothers incarcerated at other prisons in Western Australia (that do not offer this environment) and their dependent children. Routinely collected administrative data from 1985 to 2013 will be used to determine child and mother outcomes such as hospital admissions, emergency department presentations, custodial sentences, community service orders and placement in out-of home care. The sample consists of all children born in Western Australia between 1 January 1985 and 31 December 2011 who had a mother in a West Australian prison between 1990 and 2012 and their mothers. Children are included if they were alive and aged less than 18 years at the time of their mother's incarceration. The sample comprises an exposed group of 665 women incarcerated at Boronia and their 1714 dependent children and a non-exposed comparison sample of 2976 women incarcerated at other West Australian prisons and their 7186 dependent children, creating a total study sample of 3641 women and 8900 children. This project received ethics approval from the Western Australian Department of Health Human Research Ethics Committee, the Western Australian Aboriginal Health Ethics Committee and the University of Western Australia Human Research Ethics Committee. 

  17. Sample-size dependence of diversity indices and the determination of sufficient sample size in a high-diversity deep-sea environment

    OpenAIRE

    Soetaert, K.; Heip, C.H.R.

    1990-01-01

    Diversity indices, although designed for comparative purposes, often cannot be used as such, due to their sample-size dependence. It is argued here that this dependence is more pronounced in high-diversity than in low-diversity assemblages and that indices more sensitive to rarer species require larger sample sizes to estimate diversity with reasonable precision than indices which put more weight on commoner species. This was tested for Hill's diversity numbers N_0 to N_∞ ...
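
Hill's diversity numbers form a family in which increasing order puts progressively more weight on common species. A minimal sketch of the usual members of the family (the input counts are invented for illustration):

```python
import math

def hill_numbers(counts):
    """Hill's diversity numbers from a vector of species counts:
    N0 = species richness, N1 = exp(Shannon entropy),
    N2 = inverse Simpson concentration, Ninf = 1 / max proportion."""
    total = sum(counts)
    p = [c / total for c in counts if c > 0]
    n0 = len(p)
    n1 = math.exp(-sum(pi * math.log(pi) for pi in p))
    n2 = 1.0 / sum(pi * pi for pi in p)
    ninf = 1.0 / max(p)
    return n0, n1, n2, ninf
```

For a perfectly even assemblage all orders agree (four equal counts give 4 throughout); for uneven assemblages the numbers decrease with order, N0 >= N1 >= N2 >= Ninf, which is why the rarer-species-sensitive low orders demand the larger sample sizes discussed above.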

  18. Supporting Space Systems Design via Systems Dependency Analysis Methodology

    Science.gov (United States)

    Guariniello, Cesare

    The increasing size and complexity of space systems and space missions pose severe challenges to space systems engineers. When complex systems and Systems-of-Systems are involved, the behavior of the whole entity is due not only to that of the individual systems involved but also to the interactions and dependencies between the systems. Dependencies can be varied and complex, and designers usually do not perform analysis of the impact of dependencies at the level of complex systems, or this analysis involves excessive computational cost, or occurs at a later stage of the design process, after designers have already set detailed requirements, following a bottom-up approach. While classical systems engineering attempts to integrate the perspectives involved across the variety of engineering disciplines and the objectives of multiple stakeholders, there is still a need for more effective tools and methods capable of identifying, analyzing, and quantifying properties of the complex system as a whole and of modeling explicitly the effect of some of the features that characterize complex systems. This research describes the development and usage of Systems Operational Dependency Analysis and Systems Developmental Dependency Analysis, two methods based on parametric models of the behavior of complex systems, one in the operational domain and one in the developmental domain. The parameters of the developed models have intuitive meaning, are usable with subjective and quantitative data alike, and give direct insight into the causes of observed, and possibly emergent, behavior. The approach proposed in this dissertation combines models of one-to-one dependencies among systems and between systems and capabilities, to analyze and evaluate the impact of failures or delays on the outcome of the whole complex system. The analysis accounts for cascading effects, partial operational failures, multiple failures or delays, and partial developmental dependencies. The user of these methods can

  19. HOUSEHOLD NUCLEATION, DEPENDENCY AND CHILD HEALTH OUTCOMES IN GHANA.

    Science.gov (United States)

    Annim, Samuel Kobina; Awusabo-Asare, Kofi; Amo-Adjei, Joshua

    2015-09-01

    This study uses three key anthropometric measures of nutritional status among children (stunting, wasting and underweight) to explore the dual effects of household composition and dependency on nutritional outcomes of under-five children in Ghana. The objective is to examine changes in household living arrangements of under-five children to explore the interaction of dependency and nucleation on child health outcomes. The concept of nucleation refers to the changing structure and composition of household living arrangements, from highly extended with its associated socioeconomic system of production and reproduction, social behaviour and values, towards single-family households - especially the nuclear family, containing a husband and wife and their children alone. A negative relationship between levels of dependency, as measured by the number of children in the household, and child health outcomes is premised on the grounds that high dependency depletes resources, both tangible and intangible, to the disadvantage of young children. Data were drawn from the last four rounds of the Ghana Demographic and Health Surveys (GDHSs), from 1993 to 2008, for the first objective - to explore changes in household composition. For the second objective, the study used data from the 2008 GDHS. The results show that, over time, households in Ghana have been changing towards nucleation. The main finding is that in households with the same number of dependent children, in nucleated households children under age 5 have better health outcomes compared with children under age 5 in non-nucleated households. The results also indicate that the effect of dependency on child health outcomes is mediated by household nucleation and wealth status and that, as such, high levels of dependency do not necessarily translate into negative health outcomes for children under age 5, based on anthropometric measures.

  20. Parameter dependence and outcome dependence in dynamical models for state vector reduction

    International Nuclear Information System (INIS)

    Ghirardi, G.C.; Grassi, R.; Butterfield, J.; Fleming, G.N.

    1993-01-01

    The authors apply the distinction between parameter independence and outcome independence to the linear and nonlinear models of a recent nonrelativistic theory of continuous state vector reduction. It is shown that in the nonlinear model there is a set of realizations of the stochastic process that drives the state vector reduction for which parameter independence is violated for parallel spin components in the EPR-Bohm setup. Such a set has an appreciable probability of occurrence (∼ 1/2). On the other hand, the linear model exhibits only extremely small parameter dependence effects. Some specific features of the models are investigated and it is recalled that, as has been pointed out recently, to be able to speak of definite outcomes (or equivalently of possessed objective elements of reality) at finite times, the criteria for their attribution to physical systems must be slightly changed. The concluding section is devoted to a detailed discussion of the difficulties met when attempting to take, as a starting point for the formulation of a relativistic theory, a nonrelativistic scheme which exhibits parameter dependence. Here the authors derive a theorem which identifies the precise sense in which the occurrence of parameter dependence forbids a genuinely relativistic generalization. Finally, the authors show how the appreciable parameter dependence of the nonlinear model gives rise to problems with relativity, while the extremely weak parameter dependence of the linear model does not give rise to any difficulty, provided the appropriate criteria for the attribution of definite outcomes are taken into account. 19 refs

  1. Sample size reassessment for a two-stage design controlling the false discovery rate.

    Science.gov (United States)

    Zehetmayer, Sonja; Graf, Alexandra C; Posch, Martin

    2015-11-01

    Sample size calculations for gene expression microarray and NGS RNA-Seq experiments are challenging because the overall power depends on unknown quantities such as the proportion of true null hypotheses and the distribution of the effect sizes under the alternative. We propose a two-stage design with an adaptive interim analysis where these quantities are estimated from the interim data. The second-stage sample size is chosen based on these estimates to achieve a specific overall power. The proposed procedure controls the power in all considered scenarios except for very low first-stage sample sizes. The false discovery rate (FDR) is controlled despite the data-dependent choice of sample size. The two-stage design can be a useful tool for determining the sample size of high-dimensional studies if in the planning phase there is high uncertainty regarding the expected effect sizes and variability.
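
One common interim ingredient in such designs, the proportion of true null hypotheses, can be estimated from the stage-one p-values. The sketch below uses Storey's estimator as an illustration; it is not necessarily the estimator used by the authors.

```python
def storey_pi0(p_values, lam=0.5):
    """Storey's estimator of the proportion of true null hypotheses.

    P-values above the threshold lambda are assumed to come almost
    entirely from true nulls (which are uniform on [0, 1]), so
    pi0_hat = #{p > lambda} / ((1 - lambda) * m), capped at 1."""
    m = len(p_values)
    return min(1.0, sum(p > lam for p in p_values) / ((1.0 - lam) * m))
```

A small pi0 estimate at interim suggests many true effects, which in turn permits a smaller second-stage sample size for the same target power.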

  2. Planetary Sample Caching System Design Options

    Science.gov (United States)

    Collins, Curtis; Younse, Paulo; Backes, Paul

    2009-01-01

Potential Mars Sample Return missions would aspire to collect small core and regolith samples using a rover with a sample acquisition tool and sample caching system. Samples would need to be stored in individual sealed tubes in a canister that could be transferred to a Mars ascent vehicle and returned to Earth. A sample handling, encapsulation and containerization system (SHEC) has been developed as part of an integrated system for acquiring and storing core samples for application to future potential MSR and other potential sample return missions. Requirements and design options for the SHEC system were studied and a recommended design concept developed. Two families of solutions were explored: 1) transfer of a raw sample from the tool to the SHEC subsystem and 2) transfer of a tube containing the sample to the SHEC subsystem. The recommended design uses sample tool bit change-out as the mechanism for transferring empty tubes to the tool and sample-filled tubes from it. The SHEC subsystem design, called the Bit Changeout Caching (BiCC) design, is intended for operations on a MER-class rover.

  3. Creel survey sampling designs for estimating effort in short-duration Chinook salmon fisheries

    Science.gov (United States)

    McCormick, Joshua L.; Quist, Michael C.; Schill, Daniel J.

    2013-01-01

    Chinook Salmon Oncorhynchus tshawytscha sport fisheries in the Columbia River basin are commonly monitored using roving creel survey designs and require precise, unbiased catch estimates. The objective of this study was to examine the relative bias and precision of total catch estimates using various sampling designs to estimate angling effort under the assumption that mean catch rate was known. We obtained information on angling populations based on direct visual observations of portions of Chinook Salmon fisheries in three Idaho river systems over a 23-d period. Based on the angling population, Monte Carlo simulations were used to evaluate the properties of effort and catch estimates for each sampling design. All sampling designs evaluated were relatively unbiased. Systematic random sampling (SYS) resulted in the most precise estimates. The SYS and simple random sampling designs had mean square error (MSE) estimates that were generally half of those observed with cluster sampling designs. The SYS design was more efficient (i.e., higher accuracy per unit cost) than a two-cluster design. Increasing the number of clusters available for sampling within a day decreased the MSE of estimates of daily angling effort, but the MSE of total catch estimates was variable depending on the fishery. The results of our simulations provide guidelines on the relative influence of sample sizes and sampling designs on parameters of interest in short-duration Chinook Salmon fisheries.
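The kind of Monte Carlo comparison described above can be mimicked in a few lines. The sketch below contrasts simple random and systematic sampling of within-day angler counts for a hypothetical fishery with a smooth mid-day effort peak; all numbers are invented for illustration, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical "true" angler counts for 48 half-hour periods in one day,
# with effort peaking mid-day (numbers invented, not from the study).
periods = np.arange(48)
true_counts = np.round(30 * np.exp(-0.5 * ((periods - 24) / 8) ** 2))
true_effort = true_counts.sum()        # total angler-periods in the day

def srs_estimate(n):
    """Expand the mean count from a simple random sample of periods."""
    idx = rng.choice(48, size=n, replace=False)
    return true_counts[idx].mean() * 48

def sys_estimate(n):
    """Systematic sample: every (48/n)-th period from a random start."""
    step = 48 // n
    start = rng.integers(step)
    return true_counts[start::step][:n].mean() * 48

def mse(estimator, n, reps=5000):
    est = np.array([estimator(n) for _ in range(reps)])
    return np.mean((est - true_effort) ** 2)

mse_srs = mse(srs_estimate, 8)
mse_sys = mse(sys_estimate, 8)
# With a smooth within-day effort trend, systematic sampling spreads the
# sample across the day and typically yields the lower MSE.
```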

  4. Long-term outcome in pyridoxine-dependent epilepsy

    NARCIS (Netherlands)

    Bok, L.A.; Halbertsma, F.J.; Houterman, S.; Wevers, R.A.; Vreeswijk, C.M.J.M.; Jakobs, C.; Struys, E.; van der Hoeven, J.H.; Sival, D.A.; Willemsen, M.A.

    2012-01-01

Aim: The long-term outcome of the Dutch pyridoxine-dependent epilepsy cohort and correlations between patient characteristics and follow-up data were retrospectively studied. Method: Fourteen patients recruited from a national reference laboratory were included (four males, 10 females, from 11

  5. Sample holder for studying temperature dependent particle guiding

    International Nuclear Information System (INIS)

    Bereczky, R.J.; Toekesi, K.; Kowarik, G.; Aumayr, F.

    2011-01-01

Complete text of publication follows. The so-called guiding effect is a complex process involving the interplay of a large number of charged particles with a solid. Although many research groups have joined this field and carried out various experiments with insulator capillaries, many details of the interactions are still unknown. We investigated the temperature dependence of the guiding since it opens new possibilities both for a fundamental understanding of the guiding phenomenon and for applications. For the temperature-dependent guiding experiments we designed and built a completely new heatable sample holder that makes accurate and reproducible studies of the temperature dependence of the ion guiding effect possible. The target holder (for an exploded view see Fig. 1) consists of two main parts, the front and the back plates. The two plates of the sample holder, which function as an oven, are made of copper. These parts surround the capillary in order to guarantee a uniform temperature along the whole tube. The temperature of the copper parts is monitored by a K-type thermocouple. Stainless steel coaxial heaters surrounding the oven are used for heating. The heating power, up to a few watts, is regulated by a PID controller. Cooling of the capillary is achieved by a copper feed-through connected to a liquid nitrogen bath outside the UHV chamber. This solution allows us to change the temperature of the sample from -30 deg C up to 90 deg C. Our experiments with this newly developed temperature-regulated capillary holder show that the glass temperature (i.e. conductivity) can be used to control the guiding properties of the glass capillary and to adjust the conditions from guiding at room temperature to simple geometrical transmission at elevated temperatures. This holds the promise of investigating the effect of conductivity on particle transport (build-up and removal of charge patches) through capillaries in more detail.

  6. Directional dependency of air sampling

    International Nuclear Information System (INIS)

    1994-01-01

A field study was performed by the Idaho State University Environmental Monitoring Laboratory (EML) to examine the directional dependency of low-volume air samplers. A typical continuous low-volume air sampler contains a sample head that is mounted on the sampler housing either horizontally through one of four walls or vertically on an exterior wall 'looking down or up.' In 1992, a field study was undertaken to estimate sampling error and to detect the directional effect of sampler head orientation. Approximately 1/2 mile downwind from a phosphate plant (a continuous source of alpha activity), four samplers were positioned in identical orientation alongside one sampler configured with the sample head 'looking down'. At least five consecutive weekly samples were collected. The alpha activity, beta activity, and Be-7 activity collected on the particulate filter were analyzed to determine sampling error. Four sample heads were then oriented to the four different horizontal directions. Samples were collected for at least five weeks. Analysis of the alpha data shows the effect of sampler orientation relative to a known near source term. Analysis of the beta and Be-7 activity shows the effect of sampler orientation relative to a ubiquitous source term.

  7. Spatial-dependence recurrence sample entropy

    Science.gov (United States)

    Pham, Tuan D.; Yan, Hong

    2018-03-01

Measuring complexity in terms of the predictability of time series is a major area of research in science and engineering, and its applications are spreading throughout many scientific disciplines, where the analysis of physiological signals is perhaps the most widely reported in the literature. Sample entropy is a popular measure for quantifying signal irregularity. However, sample entropy does not take sequential information, which is inherently useful, into its calculation of sample similarity. Here, we develop a method that is based on the mathematical principle of sample entropy and enables the capture of sequential information of a time series in the context of spatial dependence provided by the binary-level co-occurrence matrix of a recurrence plot. Experimental results on time-series data of the Lorenz system, physiological signals of gait maturation in healthy children, and gait dynamics in Huntington's disease show the potential of the proposed method.
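For readers unfamiliar with the baseline measure, a minimal implementation of standard sample entropy (not the recurrence-plot extension the paper proposes) might look like this; the parameters m = 2 and r = 0.2·SD are conventional choices.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Standard SampEn(m, r): negative log of the conditional probability
    that templates matching at length m (Chebyshev distance < r*SD,
    self-matches excluded) still match at length m + 1."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    n = len(x)

    def matches(m):
        # All overlapping templates of length m.
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance to all later templates (no self-match).
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d < tol)
        return count

    return -np.log(matches(m + 1) / matches(m))

# A periodic signal is far more predictable than white noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 20 * np.pi, 500)
se_sine = sample_entropy(np.sin(t))
se_noise = sample_entropy(rng.standard_normal(500))
```

Lower values indicate more regular, more predictable signals; the paper's contribution is to make this comparison sensitive to sequential structure as well.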

  8. Long-Term Outcome in Pyridoxine-Dependent Epilepsy

    Science.gov (United States)

Bok, Levinus A.; Halbertsma, Feico J.; Houterman, Saskia; Wevers, Ron A.; Vreeswijk, Charlotte; Jakobs, Cornelis; Struys, Eduard; van der Hoeven, Johan H.; Sival, Deborah A.; Willemsen, Michel A.

    2012-01-01

    Aim: The long-term outcome of the Dutch pyridoxine-dependent epilepsy cohort and correlations between patient characteristics and follow-up data were retrospectively studied. Method: Fourteen patients recruited from a national reference laboratory were included (four males, 10 females, from 11 families; median age at assessment 6y; range 2y…

  9. The Betel Quid Dependence Scale: replication and extension in a Guamanian sample.

    Science.gov (United States)

    Herzog, Thaddeus A; Murphy, Kelle L; Little, Melissa A; Suguitan, Gil S; Pokhrel, Pallav; Kawamoto, Crissy T

    2014-05-01

Betel quid is the fourth most commonly consumed psychoactive substance in the world. The Betel Quid Dependence Scale (BQDS) is the first instrument designed specifically to measure betel quid dependence. The three-factor structure of the BQDS consists of "physical and psychological urgent need," "increasing dose," and "maladaptive use." The BQDS was initially validated in a sample of male prisoner ex-chewers in Taiwan. To replicate and extend the original validation research on the BQDS in a sample of male and female current betel quid chewers in Guam, a survey containing the BQDS was administered to 300 current betel quid chewers in Guam. Participants were compensated for their time with a gift card worth $25. Confirmatory factor analysis revealed an adequate fit with the hypothesized three-factor measurement model. ANOVAs and structural equation modeling revealed that betel quid dependence is associated with the inclusion of tobacco in the quid, number of chews per day, years of chewing, and education. The BQDS is valid for current English-speaking male and female chewers in Guam. Overall levels of betel quid dependence were high, and most chewers included tobacco in their betel quid. The results suggest that levels of dependence for betel quid are similar to those observed for nicotine dependence. Future research should explore other important psychological and behavioral aspects of betel quid chewing such as health risk perceptions and motivation to quit chewing. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  10. Outcomes of couples with infidelity in a community-based sample of couple therapy.

    Science.gov (United States)

    Atkins, David C; Marín, Rebeca A; Lo, Tracy T Y; Klann, Notker; Hahlweg, Kurt

    2010-04-01

    Infidelity is an often cited problem for couples seeking therapy, but the research literature to date is very limited on couple therapy outcomes when infidelity is a problem. The current study is a secondary analysis of a community-based sample of couple therapy in Germany and Austria. Outcomes for 145 couples who reported infidelity as a problem in their relationship were compared with 385 couples who sought therapy for other reasons. Analyses based on hierarchical linear modeling revealed that infidelity couples were significantly more distressed and reported more depressive symptoms at the start of therapy but continued improving through the end of therapy and to 6 months posttherapy. At the follow-up assessment, infidelity couples were not statistically distinguishable from non-infidelity couples, replicating previous research. Sexual dissatisfaction did not depend on infidelity status. Although there was substantial missing data, sensitivity analyses suggested that the primary findings were not due to missing data. The current findings based on a large community sample replicated previous work from an efficacy trial and show generally optimistic results for couples in which there has been an affair. 2010 APA, all rights reserved

  11. Life Design Counseling Group Intervention with Portuguese Adolescents: A Process and Outcome Study

    Science.gov (United States)

    Cardoso, Paulo; Janeiro, Isabel Nunes; Duarte, Maria Eduarda

    2018-01-01

    This article examines the process and outcome of a life design counseling group intervention with students in Grades 9 and 12. First, we applied a quasi-experimental methodology to analyze the intervention's effectiveness in promoting career certainty, career decision-making, self-efficacy, and career adaptability in a sample of 236 students.…

  12. Event dependent sampling of recurrent events

    DEFF Research Database (Denmark)

    Kvist, Tine Kajsa; Andersen, Per Kragh; Angst, Jules

    2010-01-01

    The effect of event-dependent sampling of processes consisting of recurrent events is investigated when analyzing whether the risk of recurrence increases with event count. We study the situation where processes are selected for study if an event occurs in a certain selection interval. Motivation...... retrospective and prospective disease course histories are used. We examine two methods to correct for the selection depending on which data are used in the analysis. In the first case, the conditional distribution of the process given the pre-selection history is determined. In the second case, an inverse...

  13. Failure Probability Estimation Using Asymptotic Sampling and Its Dependence upon the Selected Sampling Scheme

    Directory of Open Access Journals (Sweden)

    Martinásková Magdalena

    2017-12-01

The article examines the use of Asymptotic Sampling (AS) for the estimation of failure probability. The AS algorithm requires samples of multidimensional Gaussian random vectors, which may be obtained by many alternative means that influence the performance of the AS method. Several reliability problems (test functions) have been selected in order to test AS with various sampling schemes: (i) Monte Carlo designs; (ii) LHS designs optimized using the Periodic Audze-Eglājs (PAE) criterion; (iii) designs prepared using Sobol' sequences. All results are compared with the exact failure probability value.
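The core AS idea, inflating the input standard deviation by 1/f (f < 1) so that failures become frequent and then scaling the estimated safety index back, can be sketched for a linear limit state; the limit-state function is a hypothetical test function, and the regression over several f values used by the full method is omitted.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)

# Linear limit state on two standard Gaussian inputs; failure is g(u) < 0.
# With a unit-norm direction vector the exact failure probability is
# Phi(-beta).
beta = 3.0
def g(u):
    return beta - (u[:, 0] + u[:, 1]) / np.sqrt(2)

# Crude Monte Carlo needs many samples because failures are rare.
u = rng.standard_normal((200_000, 2))
pf_mc = np.mean(g(u) < 0)
pf_exact = norm.cdf(-beta)

# Asymptotic Sampling: sample with inflated std 1/f so failures are
# frequent, then scale the safety index back. For a linear limit state
# beta(f) = f * beta exactly, so a single support point suffices here.
f = 0.5
u_scaled = rng.standard_normal((20_000, 2)) / f
pf_f = np.mean(g(u_scaled) < 0)
beta_as = -norm.ppf(pf_f) / f
```

Note that the AS run recovers beta with a tenth of the samples of the crude estimate; the choice of sampling scheme (MC, PAE-optimized LHS, Sobol') enters through how `u_scaled` is generated.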

  14. Benchmarking bio-inspired designs with brainstorming in terms of novelty of design outcomes

    DEFF Research Database (Denmark)

    Keshwani, Sonal; Lenau, Torben Anker; Ahmed-Kristensen, Saeema

    2013-01-01

With the increasing demand for innovative products in the market, there is a need for effective creativity approaches that will support the development of creative design outcomes. Most researchers agree that novelty of design concepts is a major element of creativity; design outcomes are more creative...... generated using existing traditional creative problem solving approaches. In this research we have compared the novelty of design concepts produced by using biological analogies with the novelty of design concepts produced by using traditional brainstorming. Results show that there is an increase...... in the percentage of highly novel concepts produced in a design task, as well as the novelty of the concept space, when biological analogies are used over traditional brainstorming....

  15. Predictors of social anxiety in an opioid dependent sample and a control sample

    OpenAIRE

    Shand, Fiona L.; Degenhardt, Louisa; Nelson, Elliot C.; Mattick, Richard P.

    2010-01-01

    Compared to other mental health problems, social anxiety is under-acknowledged amongst opioid dependent populations. This study aimed to assess levels of social anxiety and identify its predictors in an opioid dependent sample and a matched control group. Opioid dependent participants (n = 1385) and controls (n = 417) completed the Social Interaction Anxiety Scale (SIAS), the Social Phobia Scale (SPS) and a diagnostic interview. Regression analyses were used to test a range of predictors of s...

  16. Evaluation of optimized bronchoalveolar lavage sampling designs for characterization of pulmonary drug distribution.

    Science.gov (United States)

    Clewe, Oskar; Karlsson, Mats O; Simonsson, Ulrika S H

    2015-12-01

    Bronchoalveolar lavage (BAL) is a pulmonary sampling technique for characterization of drug concentrations in epithelial lining fluid and alveolar cells. Two hypothetical drugs with different pulmonary distribution rates (fast and slow) were considered. An optimized BAL sampling design was generated assuming no previous information regarding the pulmonary distribution (rate and extent) and with a maximum of two samples per subject. Simulations were performed to evaluate the impact of the number of samples per subject (1 or 2) and the sample size on the relative bias and relative root mean square error of the parameter estimates (rate and extent of pulmonary distribution). The optimized BAL sampling design depends on a characterized plasma concentration time profile, a population plasma pharmacokinetic model, the limit of quantification (LOQ) of the BAL method and involves only two BAL sample time points, one early and one late. The early sample should be taken as early as possible, where concentrations in the BAL fluid ≥ LOQ. The second sample should be taken at a time point in the declining part of the plasma curve, where the plasma concentration is equivalent to the plasma concentration in the early sample. Using a previously described general pulmonary distribution model linked to a plasma population pharmacokinetic model, simulated data using the final BAL sampling design enabled characterization of both the rate and extent of pulmonary distribution. The optimized BAL sampling design enables characterization of both the rate and extent of the pulmonary distribution for both fast and slowly equilibrating drugs.
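The two-point rule described above can be sketched numerically. Assuming a simple one-compartment oral-absorption plasma profile (a stand-in for the population pharmacokinetic model the design depends on, with invented parameters and LOQ), the early sample is placed at the first quantifiable time and the late sample where the declining limb returns to that concentration:

```python
import numpy as np

# One-compartment oral-absorption plasma model (an assumption for this
# sketch): C(t) = A * (exp(-ke*t) - exp(-ka*t)), with ka > ke.
A, ka, ke = 10.0, 1.5, 0.2   # hypothetical PK parameters
loq = 1.0                    # hypothetical limit of quantification

t = np.linspace(0.01, 24.0, 4000)
c = A * (np.exp(-ke * t) - np.exp(-ka * t))
t_peak = t[np.argmax(c)]

# Early sample: as early as possible with concentration >= LOQ.
i_early = int(np.argmax(c >= loq))
t_early, c_early = t[i_early], c[i_early]

# Late sample: on the declining limb, at the time point where the plasma
# concentration equals the early-sample concentration.
decl = t > t_peak
i_late = int(np.argmin(np.abs(c[decl] - c_early)))
t_late, c_late = t[decl][i_late], c[decl][i_late]
```

With these numbers the early sample falls on the steep absorption phase and the late sample far out on the elimination phase, which is what lets two samples per subject pin down both the rate and the extent of pulmonary distribution.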

  17. Attending to Objects as Outcomes of Design Research

    DEFF Research Database (Denmark)

    Jenkins, Tom; Andersen, Kristina; Gaver, William

    2016-01-01

The goal for this workshop is to provide a venue at CHI for research through design practitioners to materially share their work with each other. Conversation will largely be centered upon a discussion of objects produced through a research through design process. Bringing together researchers...... outcomes. The premise of this workshop is simple: We need additional spaces for interacting with and reflecting upon material design outcomes at CHI. The goal of this workshop is to experiment with such a space, and to initially do so without a strong theoretical or conceptual framing.......

  18. Context-dependent representation of response-outcome in monkey prefrontal neurons.

    Science.gov (United States)

    Tsujimoto, Satoshi; Sawaguchi, Toshiyuki

    2005-07-01

    For behaviour to be purposeful, it is important to monitor the preceding behavioural context, particularly for factors regarding stimulus, response and outcome. The dorsolateral prefrontal cortex (DLPFC) appears to play a major role in such a context-dependent, flexible behavioural control system, and this area is likely to have a neuronal mechanism for such retrospective coding, which associates response-outcome with the information and/or neural systems that guided the response. To address this hypothesis, we recorded neuronal activity from the DLPFC of monkeys performing memory- and sensory-guided saccade tasks, each of which had two conditions with reward contingencies. We found that post-response activity of a subset of DLPFC neurons was modulated by three factors relating to earlier events: the direction of the immediately preceding response, its outcome (reward or non-reward) and the information type (memory or sensory) that guided the response. Such neuronal coding should play a role in associating response-outcome with information and/or neural systems used to guide behaviour - that is, 'retrospective monitoring' of behavioural context and/or neural systems used for guiding behaviour - thereby contributing to context-dependent, flexible control of behaviours.

  19. Inclusion of mobile phone numbers into an ongoing population health survey in New South Wales, Australia: design, methods, call outcomes, costs and sample representativeness.

    Science.gov (United States)

    Barr, Margo L; van Ritten, Jason J; Steel, David G; Thackway, Sarah V

    2012-11-22

    In Australia telephone surveys have been the method of choice for ongoing jurisdictional population health surveys. Although it was estimated in 2011 that nearly 20% of the Australian population were mobile-only phone users, the inclusion of mobile phone numbers into these existing landline population health surveys has not occurred. This paper describes the methods used for the inclusion of mobile phone numbers into an existing ongoing landline random digit dialling (RDD) health survey in an Australian state, the New South Wales Population Health Survey (NSWPHS). This paper also compares the call outcomes, costs and the representativeness of the resultant sample to that of the previous landline sample. After examining several mobile phone pilot studies conducted in Australia and possible sample designs (screening dual-frame and overlapping dual-frame), mobile phone numbers were included into the NSWPHS using an overlapping dual-frame design. Data collection was consistent, where possible, with the previous years' landline RDD phone surveys and between frames. Survey operational data for the frames were compared and combined. Demographic information from the interview data for mobile-only phone users, both, and total were compared to the landline frame using χ2 tests. Demographic information for each frame, landline and the mobile-only (equivalent to a screening dual frame design), and the frames combined (with appropriate overlap adjustment) were compared to the NSW demographic profile from the 2011 census using χ2 tests. In the first quarter of 2012, 3395 interviews were completed with 2171 respondents (63.9%) from the landline frame (17.6% landline only) and 1224 (36.1%) from the mobile frame (25.8% mobile only). Overall combined response, contact and cooperation rates were 33.1%, 65.1% and 72.2% respectively. As expected from previous research, the demographic profile of the mobile-only phone respondents differed most (more that were young, males, Aboriginal

  20. A New Attribute Control Chart using Multiple Dependent State Repetitive Sampling

    KAUST Repository

    Aldosari, Mansour Sattam; Aslam, Muhammad; Jun, Chi-Hyuck

    2017-01-01

In this manuscript, a new attribute control chart using multiple dependent state repetitive sampling is designed. The operational procedure and structure of the proposed control chart are given, along with the measures required to determine the average run length (ARL) for in-control and out-of-control processes. Tables of ARLs are reported for various control chart parameters. The proposed control chart is more sensitive in detecting a small shift in the process than the existing attribute control charts. The simulation study shows the efficiency of the proposed chart over the existing charts. An example is given for illustration.
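As a baseline illustration of the ARL notion used above (for a plain np-chart with 3-sigma limits, not the proposed multiple dependent state repetitive sampling chart), one can simulate run lengths before and after a shift; all parameters are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

# Plain np-chart with 3-sigma limits (illustration of the ARL notion
# only; not the MDSRS chart). Subgroup size and fractions are invented.
n, p0 = 100, 0.05
mean0 = n * p0
sd0 = np.sqrt(n * p0 * (1 - p0))
ucl = mean0 + 3 * sd0
lcl = max(0.0, mean0 - 3 * sd0)

def run_length(p, max_len=100_000):
    """Number of subgroups until the chart signals, for true fraction p."""
    for i in range(1, max_len + 1):
        d = rng.binomial(n, p)
        if d > ucl or d < lcl:
            return i
    return max_len

arl_in = np.mean([run_length(p0) for _ in range(1000)])     # in control
arl_out = np.mean([run_length(0.10) for _ in range(1000)])  # after shift
# A good chart keeps arl_in large while driving arl_out down; the MDSRS
# chart in the paper improves this trade-off for small shifts.
```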

  2. Evaluation of design flood estimates with respect to sample size

    Science.gov (United States)

    Kobierska, Florian; Engeland, Kolbjorn

    2016-04-01

Estimation of design floods forms the basis for hazard management related to flood risk and is a legal obligation when building infrastructure such as dams, bridges and roads close to water bodies. Flood inundation maps used for land use planning are also produced based on design flood estimates. In Norway, the current guidelines for design flood estimates give recommendations on which data, probability distribution, and method to use depending on the length of the local record. If less than 30 years of local data are available, an index flood approach is recommended where the local observations are used for estimating the index flood and regional data are used for estimating the growth curve. For 30-50 years of data, a two-parameter distribution is recommended, and for more than 50 years of data, a three-parameter distribution should be used. Many countries have national guidelines for flood frequency estimation, and recommended distributions include the log-Pearson type III, generalized logistic and generalized extreme value distributions. For estimating distribution parameters, ordinary moments, L-moments, maximum likelihood and Bayesian methods are used. The aim of this study is to re-evaluate the guidelines for local flood frequency estimation. In particular, we wanted to answer the following questions: (i) Which distribution gives the best fit to the data? (ii) Which estimation method provides the best fit to the data? (iii) Does the answer to (i) and (ii) depend on local data availability? To answer these questions we set up a test bench for local flood frequency analysis using data-based cross-validation methods. The criteria were based on indices describing the stability and reliability of design flood estimates. Stability is used as a criterion since design flood estimates should not depend excessively on the data sample. The reliability indices describe to which degree design flood predictions can be trusted.
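A minimal version of one test-bench building block, fitting a three-parameter GEV by maximum likelihood to a long synthetic annual-maximum record and reading off the 100-year design flood, might look like this; the distribution parameters are invented, and scipy's shape convention (c = -ξ) is used:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# Synthetic annual-maximum flood series (60 "years"; parameters invented).
true_c, true_loc, true_scale = -0.1, 100.0, 30.0
sample = stats.genextreme.rvs(true_c, loc=true_loc, scale=true_scale,
                              size=60, random_state=rng)

# More than 50 years of data, so per the guidelines fit a three-parameter
# distribution (here GEV) by maximum likelihood, then read off the
# 100-year design flood as the 0.99 annual non-exceedance quantile.
c_hat, loc_hat, scale_hat = stats.genextreme.fit(sample)
q100 = stats.genextreme.ppf(1 - 1 / 100, c_hat,
                            loc=loc_hat, scale=scale_hat)
```

In a cross-validation test bench, `q100` would be re-estimated on data subsets and judged on its stability across subsets and its reliability against held-out exceedances.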

  3. Dependent seniors garment design

    Science.gov (United States)

    Caldas, A. L.; Carvalho, M. A.; Lopes, H. P.

    2017-10-01

This paper is part of a PhD research in Textile Engineering at the University of Minho and aims to establish an ergonomic pattern design methodology to be used in the construction of garments for elderly women, aged 65 and over, who are dependent on care. The research was developed in close contact with four institutions involved in supporting this aged population, located in the cities of Guimarães (Portugal) and Teresina (Brazil). These clothes should be adequate for their anthropometrics and their special needs, in accordance with functional factors that are important when the user depends on a caregiver, such as care for the caregiver and comfort for the user. Questions regarding the functional properties of the materials, the pattern design process, the trimmings and the assembly of the garments are specially considered with respect to the desired comfort levels, in order to provide adequate handling by facilitating the dressing and undressing tasks, but also to assure the user the needed comfort in all its variables.

  4. Sample design effects in landscape genetics

    Science.gov (United States)

    Oyler-McCance, Sara J.; Fedy, Bradley C.; Landguth, Erin L.

    2012-01-01

An important research gap in landscape genetics is the impact of different field sampling designs on the ability to detect the effects of landscape pattern on gene flow. We evaluated how five different sampling regimes (random, linear, systematic, cluster, and single study site) affected the probability of correctly identifying the generating landscape process of population structure. Sampling regimes were chosen to represent a suite of designs common in field studies. We used genetic data generated from a spatially explicit, individual-based program and simulated gene flow in a continuous population across a landscape with gradual spatial changes in resistance to movement. Additionally, we evaluated the sampling regimes using realistic and obtainable numbers of loci (10 and 20), alleles per locus (5 and 10), individuals sampled (10-300), and generational times after the landscape was introduced (20 and 400). For a simulated continuously distributed species, we found that random, linear, and systematic sampling regimes performed well with high sample sizes (>200), levels of polymorphism (10 alleles per locus), and number of molecular markers (20). The cluster and single study site sampling regimes were not able to correctly identify the generating process under any conditions and thus are not advisable strategies for scenarios similar to our simulations. Our research emphasizes the importance of sampling data at ecologically appropriate spatial and temporal scales and suggests careful consideration for sampling near landscape components that are likely to most influence the genetic structure of the species. In addition, simulating sampling designs a priori could help guide field data collection efforts.
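The effect of regime choice can be illustrated with a deliberately simple stand-in for the spatially explicit simulations: generate pairwise "genetic" distances by isolation by distance plus noise, then compare how well random versus cluster sampling recovers the distance-distance correlation; all numbers are hypothetical.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import pearsonr

rng = np.random.default_rng(5)

# 400 individuals at random locations; pairwise "genetic" distance is
# geographic distance plus noise (a toy isolation-by-distance process).
n_ind = 400
coords = rng.uniform(0, 1, (n_ind, 2))
geo_m = squareform(pdist(coords))
gen_m = geo_m + squareform(rng.normal(0, 0.25, n_ind * (n_ind - 1) // 2))

def regime_correlation(idx):
    """Geographic-genetic distance correlation within a sampled subset."""
    iu = np.triu_indices(len(idx), 1)
    sub = np.ix_(idx, idx)
    return pearsonr(geo_m[sub][iu], gen_m[sub][iu])[0]

# Random regime: 50 individuals anywhere on the landscape.
random_idx = rng.choice(n_ind, 50, replace=False)
# Cluster regime: the 50 individuals nearest one focal point.
cluster_idx = np.argsort(np.linalg.norm(coords - [0.5, 0.5], axis=1))[:50]

r_random = regime_correlation(random_idx)
r_cluster = regime_correlation(cluster_idx)
# The cluster sample restricts the range of geographic distances, so it
# recovers the generating isolation-by-distance signal much more weakly.
```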

  5. Experimental and Sampling Design for the INL-2 Sample Collection Operational Test

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.; Amidan, Brett G.; Matzke, Brett D.

    2009-02-16

This report describes the experimental and sampling design developed to assess sampling approaches and methods for detecting contamination in a building and clearing the building for use after decontamination. An Idaho National Laboratory (INL) building will be contaminated with BG (Bacillus globigii, renamed Bacillus atrophaeus), a simulant for Bacillus anthracis (BA). The contamination, sampling, decontamination, and re-sampling will occur per the experimental and sampling design. This INL-2 Sample Collection Operational Test is being planned by the Validated Sampling Plan Working Group (VSPWG). The primary objectives are: 1) Evaluate judgmental and probabilistic sampling for characterization as well as probabilistic and combined (judgment and probabilistic) sampling approaches for clearance, 2) Conduct these evaluations for gradient contamination (from low or moderate down to absent or undetectable) for different initial concentrations of the contaminant, 3) Explore judgment composite sampling approaches to reduce sample numbers, 4) Collect baseline data to serve as an indication of the actual levels of contamination in the tests. A combined judgmental and random (CJR) approach uses Bayesian methodology to combine judgmental and probabilistic samples to make clearance statements of the form "X% confidence that at least Y% of an area does not contain detectable contamination" (X%/Y% clearance statements). The INL-2 experimental design has five test events, which 1) vary the floor of the INL building on which the contaminant will be released, 2) provide for varying the amount of contaminant released to obtain desired concentration gradients, and 3) investigate overt as well as covert release of contaminants. Desirable contaminant gradients would have moderate to low concentrations of contaminant in rooms near the release point, with concentrations down to zero in other rooms. Such gradients would provide a range of contamination levels to challenge the sampling
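Setting aside the Bayesian combination of judgmental samples, the purely probabilistic core of an X%/Y% clearance statement reduces to classic compliance sampling: if n random samples are all negative, one can claim X% confidence that at least Y% of the area is uncontaminated when n ≥ ln(1−X)/ln(Y). A sketch:

```python
import math

def clearance_sample_size(x_conf, y_clean):
    """Probabilistic samples needed so that, if all are negative, one can
    state 'x_conf confidence that at least y_clean of the area is free of
    detectable contamination'. This is the frequentist approximation
    only; the CJR approach additionally folds in judgmental samples via
    Bayesian methods to reduce this number."""
    return math.ceil(math.log(1.0 - x_conf) / math.log(y_clean))

n_95_99 = clearance_sample_size(0.95, 0.99)  # 95%/99% statement
n_95_95 = clearance_sample_size(0.95, 0.95)  # 95%/95% statement
```

A 95%/99% statement requires 299 all-negative random samples under this approximation, which is precisely why combining judgmental information and composite sampling to reduce sample numbers is attractive.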

  6. Predicting Inpatient Detoxification Outcome of Alcohol and Drug Dependent Patients: The Influence of Sociodemographic Environment, Motivation, Impulsivity, and Medical Comorbidities

    Directory of Open Access Journals (Sweden)

    Yvonne Sofin

    2017-01-01

Aims. This prospective study aims to identify patient characteristics as predictors for treatment outcome during inpatient detoxification treatment for drug and alcohol dependent patients. Methods. A mixed-gender sample of 832 consecutively admitted drug and alcohol dependent patients were interviewed by an experienced physician. The impact of a variety of factors concerning social environment, therapy motivation, impulsivity-related variables, medical history, and addiction severity on treatment outcome was examined. Results. 525 (63.1%) of the patients completed detoxification treatment whereas 307 (36.9%) dropped out prematurely. Being female, living in a partnership, having children, being employed, and having good education were predictive of a positive outcome. Family, health, the fear of losing the job, prosecution, and emergency admission were significant motivational predictors for treatment outcome. Being younger, history of imprisonment, and the number of previous drop-outs were predictive of a negative outcome. Conclusions. Variables concerning social environment and the number of previous drop-outs have been identified as the best predictors for treatment outcome. Socially stable patients benefit from the current treatment setting and treatment should be adapted for patients with negative predictors. Treatment may consequently be tailored with respect to intervention type, duration, and intensity to improve the outcome for those patients that fulfil criteria with negative impact on treatment retention.

  7. Heat experiment design to estimate temperature dependent thermal properties

    International Nuclear Information System (INIS)

    Romanovski, M

    2008-01-01

    Experimental conditions are studied to optimize transient experiments for estimating temperature dependent thermal conductivity and volumetric heat capacity. The mathematical model of the specimen is the one-dimensional heat equation with boundary conditions of the second kind. Thermal properties are assumed to vary nonlinearly with temperature. Experimental conditions refer to the thermal loading scheme, sampling times, and sensor location. A numerical model of experimental configurations is studied to elicit the optimal conditions. The numerical solution of the design problem is formulated on a regularization scheme with stabilizer minimization and no regularization parameter. An explicit design criterion is used to reveal the optimal sensor location, heating duration, and flux magnitude. The results obtained indicate that even the strongly nonlinear experimental design problem admits aggregation of its solution and has a strictly defined optimal measurement scheme. An additional region of temperature measurements with allowable identification error is also revealed.

  8. Causality in Statistical Power: Isomorphic Properties of Measurement, Research Design, Effect Size, and Sample Size

    Directory of Open Access Journals (Sweden)

    R. Eric Heidel

    2016-01-01

    Full Text Available Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power.
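
    The interplay of those five components can be made concrete with the simplest a priori calculation: a continuous outcome (scale of measurement), two independent groups (research design), and a standardized effect size d that folds together the magnitude and variance of the effect. This normal-approximation sketch is illustrative, not from the article, and slightly undershoots exact t-based sample sizes.

```python
import math
from statistics import NormalDist

def n_per_group(d: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """A priori sample size per group for a two-sided, two-sample comparison
    of means: n = 2 * ((z_{1-alpha/2} + z_{power}) / d)**2, rounded up.

    d is the standardized effect size (Cohen's d), so the effect's variance
    is already folded into the denominator.
    """
    z = NormalDist()
    return math.ceil(2 * ((z.inv_cdf(1 - alpha / 2) + z.inv_cdf(power)) / d) ** 2)
```

    At alpha = 0.05 and 80% power, a medium effect (d = 0.5) needs 63 participants per group, while d = 0.25 needs 252: halving the effect size quadruples the required sample size, which is the isomorphism among the components in concrete form.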

  9. Frictional behaviour of sandstone: A sample-size dependent triaxial investigation

    Science.gov (United States)

    Roshan, Hamid; Masoumi, Hossein; Regenauer-Lieb, Klaus

    2017-01-01

    Frictional behaviour of rocks from the initial stage of loading to final shear displacement along the formed shear plane has been widely investigated in the past. However, the effect of sample size on such frictional behaviour has not attracted much attention. This is mainly related to the limitations of rock testing facilities as well as the complex mechanisms involved in sample-size dependent frictional behaviour of rocks. In this study, a suite of advanced triaxial experiments was performed on Gosford sandstone samples of different sizes and at different confining pressures. The post-peak response of the rock along the formed shear plane was captured for analysis, with particular interest in sample-size dependency. Several important phenomena were observed: a) the rate of transition from brittleness to ductility in rock is sample-size dependent, with relatively smaller samples showing a faster transition toward ductility at any confining pressure; b) the sample size influences the angle of the formed shear band; and c) the friction coefficient of the formed shear plane is sample-size dependent, with relatively smaller samples exhibiting a lower friction coefficient than larger samples. We interpret our results in terms of a thermodynamics approach in which the frictional properties for finite deformation are viewed as encompassing a multitude of ephemeral slipping surfaces prior to the formation of the through-going fracture. The final fracture itself is seen as a result of the self-organisation of a sufficiently large ensemble of micro-slip surfaces and is therefore consistent with the theory of thermodynamics. This assumption vindicates the use of classical rock mechanics experiments to constrain failure of pressure-sensitive rocks, and the future imaging of these micro-slips opens an exciting path for research in rock failure mechanisms.

  10. Pyridoxine-dependent epilepsy due to antiquitin deficiency: achieving a favourable outcome

    NARCIS (Netherlands)

    Oliveira, R.; Pereira, C.; Rodrigues, F.; Alfaite, C.; Garcia, P.; Robalo, C.; Fineza, I.; Goncalves, O.; Struys, E.A.; Salomons, G.S.; Jakobs, C.A.J.M.; Diogo, L.

    2013-01-01

    We report 4 pyridoxine-dependent epilepsy patients in which good outcome was determined in three. The 4 patients were male and aged from 7 to 24 years old (from three unrelated Caucasian families). A clinical diagnosis of neonatal pyridoxine-dependent epilepsy was confirmed by biochemical and

  11. Measuring Outcome in the Treatment of Cocaine Dependence

    Science.gov (United States)

    Crits-Christoph, Paul; Gallop, Robert; Gibbons, Mary Beth Connolly; Sadicario, Jaclyn S.; Woody, George

    2015-01-01

    Background Little is known about the extent to which outcome measures used in studies of the treatment of cocaine dependence are associated with longer-term use and with broader measures of clinical improvement. The current study examined reductions in use, and abstinence-oriented measures, in relation to functioning and longer-term clinical benefits in the treatment of cocaine dependence. Methods Overall drug use, cocaine use, and functioning in a number of addiction-related domains for 487 patients diagnosed with DSM-IV cocaine dependence and treated with one of four psychosocial interventions in the NIDA Cocaine Collaborative Treatment Study were assessed monthly during 6 months of treatment and at 9, 12, 15, and 18 month follow-up. Results Measures of during-treatment reduction in use were moderately correlated with drug and cocaine use measures at 12 months, but showed non-significant or small correlations with measures of functioning at 12 months. The highest correlations were evident for abstinence measures (maximum consecutive days abstinent and completely abstinent) during treatment in relation to sustained (3 month) abstinence at 12 months. Latent class analysis of patterns of change over time revealed that most patients initially (months 1 to 4 of treatment) either became abstinent immediately or continued to use every month. Over the course of follow-up, patients either maintained abstinence or used regularly; intermittent use was less common. Conclusions There were generally small associations between various measures of cocaine use and longer-term clinical benefits, except that abstinence was associated with continued abstinence. No one method of measuring outcome of treatment of cocaine dependence appears superior to others. PMID:26366427

  12. Predictors of social anxiety in an opioid dependent sample and a control sample.

    Science.gov (United States)

    Shand, Fiona L; Degenhardt, Louisa; Nelson, Elliot C; Mattick, Richard P

    2010-01-01

    Compared to other mental health problems, social anxiety is under-acknowledged amongst opioid dependent populations. This study aimed to assess levels of social anxiety and identify its predictors in an opioid dependent sample and a matched control group. Opioid dependent participants (n=1385) and controls (n=417) completed the Social Interaction Anxiety Scale (SIAS), the Social Phobia Scale (SPS) and a diagnostic interview. Regression analyses were used to test a range of predictors of social anxiety. Opioid dependent cases had higher mean scores on both scales compared to controls. Predictors of social anxiety centred on emotional rejection in childhood, either by parents or peers. For opioid dependent cases, but not controls, lifetime non-opioid substance dependence (cannabis, sedatives, and tobacco) was associated with higher levels of social anxiety. However, much of the variance in social anxiety remains unexplained for this population.

  13. Sensitivity of adaptive enrichment trial designs to accrual rates, time to outcome measurement, and prognostic variables

    Directory of Open Access Journals (Sweden)

    Tianchen Qian

    2017-12-01

    Full Text Available Adaptive enrichment designs involve rules for restricting enrollment to a subset of the population during the course of an ongoing trial. This can be used to target those who benefit from the experimental treatment. Trial characteristics such as the accrual rate and the prognostic value of baseline variables are typically unknown when a trial is being planned; these values are typically assumed based on information available before the trial starts. Because of the added complexity in adaptive enrichment designs compared to standard designs, it may be of special concern how sensitive the trial performance is to deviations from assumptions. Through simulation studies, we evaluate the sensitivity of Type I error, power, expected sample size, and trial duration to different design characteristics. Our simulation distributions mimic features of data from the Alzheimer's Disease Neuroimaging Initiative cohort study, and involve two subpopulations based on a genetic marker. We investigate the impact of the following design characteristics: the accrual rate, the time from enrollment to measurement of a short-term outcome and the primary outcome, and the prognostic value of baseline variables and short-term outcomes. To leverage prognostic information in baseline variables and short-term outcomes, we use a semiparametric, locally efficient estimator, and investigate its strengths and limitations compared to standard estimators. We apply information-based monitoring, and evaluate how accurately information can be estimated in an ongoing trial.

  14. Temporal framing and the hidden-zero effect: rate-dependent outcomes on delay discounting.

    Science.gov (United States)

    Naudé, Gideon P; Kaplan, Brent A; Reed, Derek D; Henley, Amy J; DiGennaro Reed, Florence D

    2018-05-01

    Recent research suggests that presenting time intervals as units (e.g., days) or as specific dates can modulate the degree to which humans discount delayed outcomes. Another framing effect involves explicitly stating that choosing a smaller-sooner reward is mutually exclusive to receiving a larger-later reward, thus presenting choices as an extended sequence. In Experiment 1, participants (N = 201) recruited from Amazon Mechanical Turk completed the Monetary Choice Questionnaire in a 2 (delay framing) by 2 (zero framing) design. Regression suggested a main effect of delay, but not zero, framing after accounting for other demographic variables and manipulations. We observed a rate-dependent effect for the date-framing group, such that those with initially steep discounting exhibited greater sensitivity to the manipulation than those with initially shallow discounting. Subsequent analyses suggest these effects cannot be explained by regression to the mean. Experiment 2 addressed the possibility that the null effect of zero framing was due to within-subject exposure to the hidden- and explicit-zero conditions. A new Amazon Mechanical Turk sample completed the Monetary Choice Questionnaire in either hidden- or explicit-zero formats. Analyses revealed a main effect of reward magnitude, but not zero framing, suggesting potential limitations to the generality of the hidden-zero effect. © 2018 Society for the Experimental Analysis of Behavior.
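
    The Monetary Choice Questionnaire used in both experiments infers a discounting rate k from repeated choices between smaller-sooner and larger-later rewards under Mazur's hyperbolic model, V = A / (1 + kD). A minimal sketch of that choice rule (function names are mine; this is not the authors' scoring code):

```python
def discounted_value(amount: float, delay: float, k: float) -> float:
    """Mazur's hyperbolic model: V = A / (1 + k*D). Larger k means the
    delayed outcome loses subjective value faster."""
    return amount / (1 + k * delay)

def prefers_larger_later(ss_amount: float, ll_amount: float,
                         ll_delay: float, k: float) -> bool:
    """MCQ-style item: choose the larger-later reward when its discounted
    value exceeds the smaller-sooner amount available now."""
    return discounted_value(ll_amount, ll_delay, k) > ss_amount
```

    A steeper discounter (larger k) rejects the same larger-later reward that a shallow discounter accepts, which is the individual-difference axis on which the rate-dependent framing effect was observed.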

  15. 30 CFR 71.208 - Bimonthly sampling; designated work positions.

    Science.gov (United States)

    2010-07-01

    30 CFR, Mineral Resources, Vol. 1 (2010-07-01): UNDERGROUND COAL MINES, Sampling Procedures, § 71.208 Bimonthly sampling; designated work positions. (a) Each... standard when quartz is present), respirable dust sampling of designated work positions shall begin on the...

  16. Minimum variance Monte Carlo importance sampling with parametric dependence

    International Nuclear Information System (INIS)

    Ragheb, M.M.H.; Halton, J.; Maynard, C.W.

    1981-01-01

    An approach for Monte Carlo importance sampling with parametric dependence is proposed. It depends upon obtaining, by proper weighting over a single stage, the overall functional dependence of the variance on the importance function parameter over a broad range of its values. Results corresponding to minimum variance are adopted and other results rejected. Numerical calculations for the estimation of integrals are compared to crude Monte Carlo. The results explain the occurrence of effective biases (even though the theoretical bias is zero) and infinite variances which arise in calculations involving severe biasing and a moderate number of histories. Extension to particle transport applications is briefly discussed. The approach constitutes an extension, to biasing or importance sampling calculations, of a theory on the application of Monte Carlo for the calculation of functional dependences introduced by Frolov and Chentsov, and is a generalization which avoids nonconvergence to the optimal values in some cases of a multistage method for variance reduction introduced by Spanier. (orig.)
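
    The selection principle, evaluating the estimator's variance across a range of importance-function parameter values, adopting the minimum-variance result, and rejecting the rest, can be illustrated on a toy integral. The sketch below brute-forces one run per parameter value rather than using the paper's single-stage weighting, and the truncated-exponential proposal family is my own choice for illustration:

```python
import math
import random

def is_estimate(f, theta, n, rng):
    """Importance-sampling estimate of I = integral of f over [0, 1] using
    the proposal q_theta(x) = theta * exp(-theta*x) / (1 - exp(-theta)).
    Returns (estimate, empirical variance of the weighted draws)."""
    norm = 1.0 - math.exp(-theta)
    vals = []
    for _ in range(n):
        x = -math.log(1.0 - rng.random() * norm) / theta  # inverse-CDF draw from q_theta
        q = theta * math.exp(-theta * x) / norm            # proposal density at x
        vals.append(f(x) / q)                              # importance-weighted contribution
    mean = sum(vals) / n
    var = sum((v - mean) ** 2 for v in vals) / (n - 1)
    return mean, var

# Scan the proposal parameter, adopt the minimum-variance result, reject the rest.
rng = random.Random(7)
f = lambda x: math.exp(-x)  # integrand; true value of the integral is 1 - exp(-1)
results = {t: is_estimate(f, t, 2000, rng) for t in (0.25, 0.5, 1.0, 2.0, 4.0)}
best_theta = min(results, key=lambda t: results[t][1])
```

    Because the theta = 1 proposal is exactly proportional to the integrand e^(-x), its importance weights are constant and the empirical variance collapses to (near) zero, so the minimum-variance scan adopts theta = 1: the limiting case of an optimal importance function.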

  17. The SDSS-IV MaNGA Sample: Design, Optimization, and Usage Considerations

    Science.gov (United States)

    Wake, David A.; Bundy, Kevin; Diamond-Stanic, Aleksandar M.; Yan, Renbin; Blanton, Michael R.; Bershady, Matthew A.; Sánchez-Gallego, José R.; Drory, Niv; Jones, Amy; Kauffmann, Guinevere; Law, David R.; Li, Cheng; MacDonald, Nicholas; Masters, Karen; Thomas, Daniel; Tinker, Jeremy; Weijmans, Anne-Marie; Brownstein, Joel R.

    2017-09-01

    We describe the sample design for the SDSS-IV MaNGA survey and present the final properties of the main samples along with important considerations for using these samples for science. Our target selection criteria were developed while simultaneously optimizing the size distribution of the MaNGA integral field units (IFUs), the IFU allocation strategy, and the target density to produce a survey defined in terms of maximizing signal-to-noise ratio, spatial resolution, and sample size. Our selection strategy makes use of redshift limits that only depend on I-band absolute magnitude (M_I), or, for a small subset of our sample, M_I and color (NUV - I). Such a strategy ensures that all galaxies span the same range in angular size irrespective of luminosity and are therefore covered evenly by the adopted range of IFU sizes. We define three samples: the Primary and Secondary samples are selected to have a flat number density with respect to M_I and are targeted to have spectroscopic coverage to 1.5 and 2.5 effective radii (R_e), respectively. The Color-Enhanced supplement increases the number of galaxies in the low-density regions of color-magnitude space by extending the redshift limits of the Primary sample in the appropriate color bins. The samples cover the stellar mass range 5 × 10⁸ ≤ M* ≤ 3 × 10¹¹ M⊙ h⁻² and are sampled at median physical resolutions of 1.37 and 2.5 kpc for the Primary and Secondary samples, respectively. We provide weights that will statistically correct for our luminosity- and color-dependent selection function and IFU allocation strategy, thus correcting the observed sample to a volume-limited sample.

  18. Substance use, symptom, and employment outcomes of persons with a workplace mandate for chemical dependency treatment.

    Science.gov (United States)

    Weisner, Constance; Lu, Yun; Hinman, Agatha; Monahan, John; Bonnie, Richard J; Moore, Charles D; Chi, Felicia W; Appelbaum, Paul S

    2009-05-01

    This study examined the role of workplace mandates to chemical dependency treatment in treatment adherence, alcohol and drug abstinence, severity of employment problems, and severity of psychiatric problems. The sample included 448 employed members of a private, nonprofit U.S. managed care health plan who entered chemical dependency treatment with a workplace mandate (N=75) or without one (N=373); 405 of these individuals were followed up at one year (N=70 and N=335, respectively), and 362 participated in a five-year follow up (N=60 and N=302, respectively). Propensity scores predicting receipt of a workplace mandate were calculated. Logistic regression and ordinary least-squares regression were used to predict length of stay in chemical dependency treatment, alcohol and drug abstinence, and psychiatric and employment problem severity at one and five years. Overall, participants with a workplace mandate had one- and five-year outcomes similar to those without such a mandate. Having a workplace mandate also predicted longer treatment stays and improvement in employment problems. When other factors related to outcomes were controlled for, having a workplace mandate predicted abstinence at one year, with length of stay as a mediating variable. Workplace mandates can be an effective mechanism for improving work performance and other outcomes. Study participants who had a workplace mandate were more likely than those who did not have a workplace mandate to be abstinent at follow-up, and they did as well in treatment, both short and long term. Pressure from the workplace likely gets people to treatment earlier and provides incentives for treatment adherence.

  19. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys

    OpenAIRE

    Hund, Lauren; Bedrick, Edward J.; Pagano, Marcello

    2015-01-01

    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we comp...
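
    Under the standard binomial model mentioned above (clustering ignored), an LQAS design reduces to a pair (n, d): sample n subjects and accept the lot when at least d are covered. The search below is a generic sketch with illustrative thresholds and risk levels, not a method from this paper:

```python
from math import comb

def binom_cdf(d: int, n: int, p: float) -> float:
    """P(X <= d) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d + 1))

def find_lqas_design(p_lo: float, p_hi: float, alpha: float = 0.10,
                     beta: float = 0.10, n_max: int = 100):
    """Smallest (n, d) with both misclassification risks capped: a lot with
    true coverage p_lo is (wrongly) accepted with probability <= alpha, and
    a lot with coverage p_hi is (wrongly) rejected with probability <= beta."""
    for n in range(1, n_max + 1):
        for d in range(n + 1):
            accept_lo = 1.0 - binom_cdf(d - 1, n, p_lo)  # P(accept | p = p_lo)
            reject_hi = binom_cdf(d - 1, n, p_hi)        # P(reject | p = p_hi)
            if accept_lo <= alpha and reject_hi <= beta:
                return n, d
    return None
```

    For distinguishing 80% from 50% coverage with both risks capped at 10%, this search reproduces the classic rule n = 19, d = 13 found in standard LQAS tables.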

  20. Relational Intimacy Mediates Sexual Outcomes Associated With Impaired Sexual Function: Examination in a Clinical Sample.

    Science.gov (United States)

    Witherow, Marta Parkanyi; Chandraiah, Shambhavi; Seals, Samantha R; Sarver, Dustin E; Parisi, Kathryn E; Bugan, Antal

    2017-06-01

    Relational intimacy is hypothesized to underlie the association between female sexual functioning and various sexual outcomes, and married women and women with sexual dysfunction have been generally absent from prior studies investigating these associations, thus restricting generalizability. To investigate whether relational intimacy mediates sexual outcomes (sexual satisfaction, coital frequency, and sexual distress) in a sample of married women with and without impaired sexual functioning presenting in clinical settings. Using a cross-sectional design, 64 heterosexual married women with (n = 44) and without (n = 20) impaired sexual functioning completed a battery of validated measurements assessing relational intimacy, sexual dysfunction, sexual frequency, satisfaction, and distress. Intimacy measurements were combined using latent factor scores before analysis. Bias-corrected mediation models of the indirect effect were used to test mediation effects. Moderated mediation models examined whether indirect effects were influenced by age and marital duration. Patients completed the Female Sexual Function Index, the Couple's Satisfaction Index, the Sexual Satisfaction Scale for Women, the Inclusion of the Other in the Self Scale, and the Miller Social Intimacy Test. Mediation models showed that impaired sexual functioning is associated with all sexual outcomes directly and indirectly through relational intimacy. Results were predominantly independent of age and marital duration. Findings have important treatment implications for modifying interventions to focus on enhancing relational intimacy to improve the sexual functioning of women with impaired sexual functioning. The importance of the role relational intimacy plays in broad sexual outcomes of women with impaired sexual functioning is supported in clinically referred and married women. Latent factor scores to improve estimation of study constructs and the use of contemporary mediation analysis also are

  1. The effects of field dependent/independent style awareness on learning strategies and outcomes in an instructional hypermedia module

    Science.gov (United States)

    Fyle, Clifford Omodele

    The purpose of this study was to examine whether field-dependent/independent style awareness affects learning outcomes and learning strategies used in a hypermedia instructional module. Field-dependent/independent style was measured using the Global Embedded Figures Test. Style awareness meant that students were provided with information and explanations about their individual cognitive styles and the learning strategies that accommodate those styles. The study entailed examining students' achievement in a multiple-choice test and performance in a design task, and also their navigation patterns as they studied a science-oriented Webquest. The sample consisted of 149 eighth-grade students in 10 sections of a science class taught by two teachers in a public middle school. A two-group posttest-only design on one factor (style awareness) was used. Sixty-eight students in five sections of the class were assigned to the treatment group (field dependent/independent style awareness) while the other 81 students in five sections were assigned to the control group (no field dependent/independent style awareness). The study took place over a period of 6 days. On the first day, students in the treatment group were first tested and debriefed on their individual styles. Next, all students in both the treatment and control groups studied the hypermedia instructional module (Webquest) over a period of two days. On the fourth and fifth days students worked on the performance tasks, and on the sixth day students took the multiple-choice test and students in the control group were tested and debriefed on their individual styles. The findings indicate that style awareness significantly influenced the learning strategies of field-dependent students as they studied and carried out learning tasks in the Webquest. 
Field-dependent students with style awareness used hypertext links and navigated the menu sequentially a greater number of times than their counterparts with no style awareness

  2. A two-stage Bayesian design with sample size reestimation and subgroup analysis for phase II binary response trials.

    Science.gov (United States)

    Zhong, Wei; Koopmeiners, Joseph S; Carlin, Bradley P

    2013-11-01

    Frequentist sample size determination for binary outcome data in a two-arm clinical trial requires initial guesses of the event probabilities for the two treatments. Misspecification of these event rates may lead to a poor estimate of the necessary sample size. In contrast, the Bayesian approach, which considers the treatment effect to be a random variable having some distribution, may offer a better, more flexible approach. The Bayesian sample size proposed by Whitehead et al. (2008) for exploratory studies on efficacy justifies the acceptable minimum sample size by a "conclusiveness" condition. In this work, we introduce a new two-stage Bayesian design with sample size reestimation at the interim stage. Our design inherits the properties of good interpretation and easy implementation from Whitehead et al. (2008), generalizes their method to a two-sample setting, and uses a fully Bayesian predictive approach to reduce an overly large initial sample size when necessary. Moreover, our design can be extended to allow patient-level covariates via logistic regression, adjusting the sample size within each subgroup based on interim analyses. We illustrate the benefits of our approach with a design in non-Hodgkin lymphoma with a simple binary covariate (patient gender), offering an initial step toward within-trial personalized medicine. Copyright © 2013 Elsevier Inc. All rights reserved.
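
    The fully Bayesian predictive step can be sketched for a single arm with a conjugate uniform prior: at the interim analysis, average the indicator of eventual trial success over the beta-binomial posterior predictive distribution of the remaining patients' responses. This is a generic predictive-probability sketch, not the actual design of Zhong et al.; the names and the success criterion P(p > p0 | data) > threshold are illustrative.

```python
import math

def log_beta(a: float, b: float) -> float:
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def beta_binom_pmf(k: int, n: int, a: float, b: float) -> float:
    """Posterior predictive P(k successes in n future patients), p ~ Beta(a, b)."""
    return math.exp(math.log(math.comb(n, k)) + log_beta(a + k, b + n - k) - log_beta(a, b))

def posterior_prob_exceeds(y: int, n: int, p0: float) -> float:
    """P(p > p0 | y of n successes, uniform prior). The Beta(y+1, n-y+1)
    posterior tail is evaluated via the integer-shape identity
    P(p > p0) = P(Bin(n+1, p0) <= y)."""
    return sum(math.comb(n + 1, j) * p0**j * (1 - p0)**(n + 1 - j) for j in range(y + 1))

def predictive_success(y: int, m: int, n_final: int, p0: float, threshold: float) -> float:
    """Interim predictive probability that the final analysis declares success,
    i.e. that P(p > p0 | all n_final patients) > threshold, given y of m so far."""
    rest = n_final - m
    return sum(beta_binom_pmf(k, rest, y + 1, m - y + 1)
               for k in range(rest + 1)
               if posterior_prob_exceeds(y + k, n_final, p0) > threshold)
```

    If the predictive probability is already very high at the interim, the planned final sample size can be cut back; if it is moderate, enrollment continues, and the same machinery applies per subgroup once covariates enter.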

  3. Substance Use, Symptom, and Employment Outcomes of Persons With a Workplace Mandate for Chemical Dependency Treatment

    Science.gov (United States)

    Weisner, Constance; Lu, Yun; Hinman, Agatha; Monahan, John; Bonnie, Richard J.; Moore, Charles D.; Chi, Felicia W.; Appelbaum, Paul S.

    2010-01-01

    Objective This study examined the role of workplace mandates to chemical dependency treatment in treatment adherence, alcohol and drug abstinence, severity of employment problems, and severity of psychiatric problems. Methods The sample included 448 employed members of a private, nonprofit U.S. managed care health plan who entered chemical dependency treatment with a workplace mandate (N=75) or without one (N=373); 405 of these individuals were followed up at one year (N=70 and N=335, respectively), and 362 participated in a five-year follow up (N=60 and N=302, respectively). Propensity scores predicting receipt of a workplace mandate were calculated. Logistic regression and ordinary least-squares regression were used to predict length of stay in chemical dependency treatment, alcohol and drug abstinence, and psychiatric and employment problem severity at one and five years. Results Overall, participants with a workplace mandate had one- and five-year outcomes similar to those without such a mandate. Having a workplace mandate also predicted longer treatment stays and improvement in employment problems. When other factors related to outcomes were controlled for, having a workplace mandate predicted abstinence at one year, with length of stay as a mediating variable. Conclusions Workplace mandates can be an effective mechanism for improving work performance and other outcomes. Study participants who had a workplace mandate were more likely than those who did not have a workplace mandate to be abstinent at follow-up, and they did as well in treatment, both short and long term. Pressure from the workplace likely gets people to treatment earlier and provides incentives for treatment adherence. PMID:19411353

  4. A review of empirical research related to the use of small quantitative samples in clinical outcome scale development.

    Science.gov (United States)

    Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S

    2016-11-01

    There has been a notable increase in the advocacy of using small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating the developing COAs using a small sample. We review the benefits such methods are purported to offer from both a practical and statistical standpoint and detail several problematic areas, including both practical and statistical theory concerns, with respect to the use of quantitative methods, including Rasch-consistent methods, with small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.

  5. Correlates of interpersonal dependency and detachment in an adolescent inpatient sample.

    Science.gov (United States)

    Haggerty, Greg; Siefert, Caleb J; Bornstein, Robert F; Sinclair, Samuel Justin; Blais, Mark A; Zodan, Jennifer; Rao, Nyapati

    2015-01-01

    Interpersonal dependency has been linked to psychological distress, depression, help seeking, treatment compliance, and sensitivity to interpersonal cues in adult samples. However, there is a dearth of research focusing on dependency in child and adolescent samples. The current study examined the construct validity of a measure of interpersonal dependency. The authors investigated how interpersonal dependency and detachment relate to behavioral problems, subjective well-being, interpersonal problems, and global symptom severity in adolescent inpatients. Destructive overdependence (DO) and dysfunctional detachment (DD) were positively related to interpersonal distress, behavioral problems, and symptom severity and negatively related to psychological health and well-being. Healthy dependency (HD) was associated with fewer behavioral problems and less symptom severity and positively related to subjective well-being. The clinical implications of these findings are discussed.

  6. On efficiency of some ratio estimators in double sampling design ...

    African Journals Online (AJOL)

    In this paper, three ratio estimators in double sampling design were proposed with the intention of finding an alternative double sampling design estimator to the conventional ratio estimator in double sampling design discussed by Cochran (1997), Okafor (2002), Raj (1972), and Raj and Chandhok (1999).
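
    For reference, the conventional ratio estimator in double sampling uses a large, cheap first-phase sample only to estimate the auxiliary-variable mean, and a smaller second-phase subsample on which both variables are measured. A hedged sketch of that baseline estimator (notation mine):

```python
def ratio_estimate_double_sampling(y, x, xbar_first_phase):
    """Conventional ratio estimator under double sampling:
    ybar_rd = (ybar / xbar) * xbar', where xbar' is the auxiliary mean from
    the large first-phase sample and (y, x) are measured only on the
    second-phase subsample."""
    ybar = sum(y) / len(y)
    xbar = sum(x) / len(x)
    return (ybar / xbar) * xbar_first_phase
```

    The estimator gains over the plain subsample mean when y and x are strongly positively correlated, because the first-phase mean xbar' corrects the subsample's chance imbalance in x.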

  7. Predictors and outcomes of shunt-dependent hydrocephalus in patients with aneurysmal sub-arachnoid hemorrhage

    Science.gov (United States)

    2012-01-01

    Background Hydrocephalus following spontaneous aneurysmal sub-arachnoid hemorrhage (SAH) is often associated with unfavorable outcome. This study aimed to determine the potential risk factors and outcomes of shunt-dependent hydrocephalus in aneurysmal SAH patients but without hydrocephalus upon arrival at the hospital. Methods One hundred and sixty-eight aneurysmal SAH patients were evaluated. Using functional scores, those without hydrocephalus upon arrival at the hospital were compared to those already with hydrocephalus on admission, those who developed it during hospitalization, and those who did not develop it throughout their hospital stay. The Glasgow Coma Score, modified Fisher SAH grade, and World Federation of Neurosurgical Societies grade were determined at the emergency room. Therapeutic outcomes immediately after discharge and 18 months after were assessed using the Glasgow Outcome Score. Results Hydrocephalus accounted for 61.9% (104/168) of all episodes, including 82 with initial hydrocephalus on admission and 22 with subsequent hydrocephalus. Both the presence of intra-ventricular hemorrhage on admission and post-operative intra-cerebral hemorrhage were independently associated with shunt-dependent hydrocephalus in patients without hydrocephalus on admission. After a minimum 1.5 years of follow-up, the mean Glasgow outcome score was 3.33 ± 1.40 for patients with shunt-dependent hydrocephalus and 4.21 ± 1.19 for those without. Conclusions The presence of intra-ventricular hemorrhage, lower mean Glasgow Coma Scale score, and higher mean scores of the modified Fisher SAH and World Federation of Neurosurgical grading on admission imply risk of shunt-dependent hydrocephalus in patients without initial hydrocephalus. These patients have worse short- and long-term outcomes and longer hospitalization. PMID:22765765

  8. Monitoring oil persistence on beaches : SCAT versus stratified random sampling designs

    International Nuclear Information System (INIS)

    Short, J.W.; Lindeberg, M.R.; Harris, P.M.; Maselko, J.M.; Pella, J.J.; Rice, S.D.

    2003-01-01

    In the event of a coastal oil spill, shoreline clean-up assessment teams (SCAT) commonly rely on visual inspection of the entire affected area to monitor the persistence of oil on beaches. Occasionally, pits are excavated to evaluate the persistence of subsurface oil. This approach is practical for directing clean-up efforts directly following a spill. However, sampling of the 1989 Exxon Valdez oil spill in Prince William Sound 12 years later has shown that visual inspection combined with pit excavation does not offer estimates of contaminated beach area or stranded oil volumes. This information is needed to statistically evaluate the significance of change with time. Assumptions regarding the correlation of visually evident surface oil and cryptic subsurface oil are usually not evaluated as part of the SCAT mandate. Stratified random sampling can avoid such problems and could produce precise estimates of oiled area and volume that allow for statistical assessment of major temporal trends and the extent of the impact. The 2001 sampling of the shoreline of Prince William Sound showed that 15 per cent of surface oil occurrences were associated with subsurface oil. This study demonstrates the usefulness of the stratified random sampling method and shows how sampling design parameters impact statistical outcomes. Power analysis based on the study results indicates that optimum power is derived when unnecessary stratification is avoided. It was emphasized that sampling effort should be balanced between choosing sufficient beaches for sampling and the intensity of sampling.
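
    The contrast with SCAT is that a stratified random design yields closed-form estimates of oiled area or volume together with standard errors. A generic sketch of the stratified estimator of a beach-wide total, with the finite population correction (the strata and numbers are illustrative, not the Prince William Sound survey):

```python
def stratified_total(strata):
    """Stratified estimate of a population total and its variance.

    strata: list of (N_h, sample_values) pairs, where N_h is the number of
    units in stratum h and sample_values are the measurements on the n_h
    sampled units. Includes the finite population correction (1 - n_h/N_h).
    """
    total, var = 0.0, 0.0
    for N_h, ys in strata:
        n_h = len(ys)
        mean = sum(ys) / n_h
        s2 = sum((y - mean) ** 2 for y in ys) / (n_h - 1)  # sample variance
        total += N_h * mean
        var += N_h ** 2 * (1 - n_h / N_h) * s2 / n_h
    return total, var
```

    Over-stratifying shrinks the per-stratum sample sizes n_h that enter each variance term, which is one way to see the paper's point that unnecessary stratification costs power.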

  9. Patient activation and disparate health care outcomes in a racially diverse sample of chronically ill older adults.

    Science.gov (United States)

    Ryvicker, Miriam; Peng, Timothy R; Feldman, Penny Hollander

    2012-11-01

    The Patient Activation Measure (PAM) assesses people's ability to self-manage their health. Variations in PAM score have been linked with health behaviors, outcomes, and potential disparities. This study assessed the relative impacts of activation, socio-demographic and clinical factors on health care outcomes in a racially diverse sample of chronically ill, elderly homecare patients. Using survey and administrative data from 249 predominantly non-White patients, logistic regression was conducted to examine the effects of activation level and patient characteristics on the likelihood of subsequent hospitalization and emergency department (ED) use. Activation was not a significant predictor of hospitalization or ED use in adjusted models. Non-Whites were more likely than Whites to have a hospitalization or ED visit. Obesity was a strong predictor of both outcomes. Further research should examine potential sources of disadvantage among chronically ill homecare patients to design effective interventions to reduce health disparities in this population.

  10. Evaluation of the International Outcome Inventory for Hearing Aids in a veteran sample.

    Science.gov (United States)

    Smith, Sherri L; Noe, Colleen M; Alexander, Genevieve C

    2009-06-01

    The International Outcome Inventory for Hearing Aids (IOI-HA) was developed as a global hearing aid outcome measure targeting seven outcome domains. The published norms were based on a private-pay sample fitted with analog hearing aids. The purpose of this study was to evaluate the psychometric properties of the IOI-HA and to establish normative data in a veteran sample. Survey. The participants were 131 male veterans (mean age of 74.3 years, SD = 7.4) who were issued hearing aids with digital signal processing (DSP). Hearing aids with DSP were fitted bilaterally between 2005 and 2007. Veterans were mailed two copies of the IOI-HA. The participants were instructed to complete the first copy of the questionnaire immediately and the second copy two weeks later. The completed questionnaires were mailed to the laboratory, and the psychometric properties of the questionnaire were evaluated. As suggested by Cox and colleagues, the participants were divided into two categories based on their unaided subjective hearing difficulty: (1) those with less hearing difficulty (none-to-moderate category) and (2) those who reported more hearing difficulty (moderately severe+ category). The norms from the current veteran sample were then compared to the original published sample. For each hearing difficulty category, the critical difference values were calculated for each item and for the total score. A factor analysis showed that the IOI-HA in the veteran sample had the identical subscale structure as reported in the original sample. For the total scale, the internal consistency was good (Cronbach's alpha = 0.83), and the test-retest reliability was high (lambda = 0.94). Group and individual norms were developed for both hearing difficulty categories in the veteran sample. For each IOI-HA item, the critical difference was one response unit; a difference of more than one response unit between two test sessions reflects a true change in outcome for a given domain. 
The results of this study
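The critical-difference logic used in questionnaire studies like this can be illustrated with a short sketch. The formula CD = z · SEM · √2, with SEM = SD · √(1 − r), is the standard test-retest form; the input values below are hypothetical, not the study's actual item statistics.

```python
import math
from statistics import NormalDist

def critical_difference(sd, reliability, conf=0.95):
    """Smallest test-retest change that exceeds measurement error.

    CD = z * SEM * sqrt(2), where SEM = sd * sqrt(1 - reliability)
    and sqrt(2) accounts for error in both test sessions.
    """
    z = NormalDist().inv_cdf(1 - (1 - conf) / 2)
    sem = sd * math.sqrt(1 - reliability)
    return z * sem * math.sqrt(2)
```

With a test-retest reliability of 0.94 and an item SD near one response unit, the 95% critical difference falls below one response unit, which is consistent with a one-unit change marking a true change in outcome.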

  11. Serious game design principles: The impact of game design on learning outcomes

    Science.gov (United States)

    Martin, Michael W.

    This dissertation examines the research question "How do video game design principles affect learning outcomes in serious games?" This research first develops a theoretical foundation concerning the meaning of the terms "game" and "serious game". This conceptual clarification is broken down into analytic propositions, which state that games have participants, rules, goals, and challenges, and synthetic propositions, which state that games should be intrinsically compelling, provide meaningful choices, and be self-encapsulated. Based on these synthetic propositions, three hypotheses were developed: that games with an enhanced aesthetic presentation, more meaningful choices, or player competition will elicit higher learning outcomes than identical games without these factors. These hypotheses were tested via a quantitative experiment involving 172 undergraduate students in the Old Dominion University Chemistry Department. The students were asked to play a chemistry-oriented serious game entitled Element Solitaire©, which was created by the research author. The students were randomly given different treatments of the Element Solitaire© game to play, and the differences in their learning outcomes were compared. The experimental results demonstrated that the aesthetic presentation of a game can have a significant impact upon the learning outcome. The experiment was not able to discern significant effects from the choice or competition conditions, but further examination of the experimental data did reveal some insight into these aspects of serious game design. Choices need to offer the player options of sufficient value to warrant consideration, and competition within games needs to be implemented judiciously to promote positive affect for all players. The results of the theoretical foundations and empirical evidence were then combined with additional theoretical research to develop a set of

  12. Mental illness and housing outcomes among a sample of homeless men in an Australian urban centre.

    Science.gov (United States)

    Spicer, Bridget; Smith, David I; Conroy, Elizabeth; Flatau, Paul R; Burns, Lucy

    2015-05-01

    The over-representation of mental illness among homeless people across the globe is well documented. However, there is a dearth of Australian literature on the mental health needs of homeless individuals. Furthermore, longitudinal research examining the factors that contribute to better housing outcomes among this population is sparse. The aim of this research is to describe the mental illness profile of a sample of homeless men in an Australian urban centre (in Sydney) and examine the factors associated with better housing outcomes at 12-month follow-up. A longitudinal survey was administered to 253 homeless men who were involved in the Michael Project: a 3-year initiative which combined existing accommodation support services with assertive case management and access to coordinated additional specialist allied health and support services. A total of 107 participants were followed up 12 months later. The survey examined the demographics of the sample and lifetime mental disorder diagnoses, and also included psychological screeners for current substance use and dependence, psychological distress, psychosis, and post-traumatic stress. Consistent with existing literature, the prevalence of mental illness was significantly greater amongst this sample than the general Australian population. However, mental illness presentation was not associated with housing situation at 12-month follow-up. Instead, type of support service at baseline was the best predictor of housing outcome, wherein participants who received short to medium-term accommodation and support were significantly more likely to be housed in stable, long-term housing at the 12-month follow-up than participants who received outreach or emergency accommodation support. This study provides evidence to support an innovative support model for homeless people in Australia and contributes to the limited Australian research on mental illness in this population. © The Royal Australian and New Zealand College of

  13. Design compliance matrix waste sample container filling system for nested, fixed-depth sampling system

    International Nuclear Information System (INIS)

    BOGER, R.M.

    1999-01-01

    This design compliance matrix document provides specific design related functional characteristics, constraints, and requirements for the container filling system that is part of the nested, fixed-depth sampling system. This document addresses performance, external interfaces, ALARA, Authorization Basis, environmental and design code requirements for the container filling system. The container filling system will interface with the waste stream from the fluidic pumping channels of the nested, fixed-depth sampling system and will fill containers with waste that meet the Resource Conservation and Recovery Act (RCRA) criteria for waste that contains volatile and semi-volatile organic materials. The specifications for the nested, fixed-depth sampling system are described in a Level 2 Specification document (HNF-3483, Rev. 1). The basis for this design compliance matrix document is the Tank Waste Remediation System (TWRS) desk instructions for design Compliance matrix documents (PI-CP-008-00, Rev. 0)

  14. Cluster designs to assess the prevalence of acute malnutrition by lot quality assurance sampling: a validation study by computer simulation.

    Science.gov (United States)

    Olives, Casey; Pagano, Marcello; Deitchler, Megan; Hedt, Bethany L; Egge, Kari; Valadez, Joseph J

    2009-04-01

    Traditional lot quality assurance sampling (LQAS) methods require simple random sampling to guarantee valid results. However, cluster sampling has been proposed to reduce the number of random starting points. This study uses simulations to examine the classification error of two such designs, a 67 × 3 (67 clusters of three observations) and a 33 × 6 (33 clusters of six observations) sampling scheme, to assess the prevalence of global acute malnutrition (GAM). Further, we explore the use of a 67 × 3 sequential sampling scheme for LQAS classification of GAM prevalence. Results indicate that, for independent clusters with moderate intracluster correlation for the GAM outcome, the three sampling designs maintain approximate validity for LQAS analysis. Sequential sampling can substantially reduce the average sample size required for data collection. The presence of intercluster correlation can have a dramatic impact on the classification error associated with LQAS analysis.
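The classification behavior of a clustered LQAS rule can be explored with a small Monte Carlo sketch. This is an illustrative assumption-laden toy, not the paper's simulation: the decision threshold, prevalence values, and the beta-binomial model of intracluster correlation are all choices made here for demonstration.

```python
import random

def simulate_lqas(p, rho, clusters=67, m=3, threshold=15, reps=2000, seed=1):
    """Fraction of simulated 67x3-style surveys classified as 'high
    GAM prevalence' (total cases >= threshold), with cluster-level
    prevalences drawn from a beta distribution whose mean is p and
    whose spread encodes the intracluster correlation rho."""
    rng = random.Random(seed)
    a = p * (1 - rho) / rho
    b = (1 - p) * (1 - rho) / rho
    high = 0
    for _ in range(reps):
        cases = 0
        for _ in range(clusters):
            pc = rng.betavariate(a, b)  # this cluster's prevalence
            cases += sum(rng.random() < pc for _ in range(m))
        high += cases >= threshold
    return high / reps
```

Comparing the classification rate at a low and a high true prevalence gives a direct view of the operating characteristic of the rule, and increasing rho shows how correlation inflates classification error.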

  15. Simulation of design dependent failure exposure levels for CMOS ICs

    International Nuclear Information System (INIS)

    Kaul, N.; Bhuva, B.L.; Rangavajjhala, V.; van der Molen, H.; Kerns, S.E.

    1990-01-01

    The total dose exposure of CMOS ICs introduces bias-dependent parameter shifts in individual devices. This bias dependency causes different designs to behave differently under identical testing conditions. This paper studies the effect of design and bias on the radiation tolerance of ICs and presents an automated design tool that produces different designs for a logic function and presents the important parameters of each design to the circuit designer for trade-off analysis.

  16. Adaptive clinical trial designs with pre-specified rules for modifying the sample size: understanding efficient types of adaptation.

    Science.gov (United States)

    Levin, Gregory P; Emerson, Sarah C; Emerson, Scott S

    2013-04-15

    Adaptive clinical trial design has been proposed as a promising new approach that may improve the drug discovery process. Proponents of adaptive sample size re-estimation promote its ability to avoid 'up-front' commitment of resources, better address the complicated decisions faced by data monitoring committees, and minimize accrual to studies having delayed ascertainment of outcomes. We investigate aspects of adaptation rules, such as timing of the adaptation analysis and magnitude of sample size adjustment, that lead to greater or lesser statistical efficiency. Owing in part to the recent Food and Drug Administration guidance that promotes the use of pre-specified sampling plans, we evaluate alternative approaches in the context of well-defined, pre-specified adaptation. We quantify the relative costs and benefits of fixed sample, group sequential, and pre-specified adaptive designs with respect to standard operating characteristics such as type I error, maximal sample size, power, and expected sample size under a range of alternatives. Our results build on others' prior research by demonstrating in realistic settings that simple and easily implemented pre-specified adaptive designs provide only very small efficiency gains over group sequential designs with the same number of analyses. In addition, we describe optimal rules for modifying the sample size, providing efficient adaptation boundaries on a variety of scales for the interim test statistic for adaptation analyses occurring at several different stages of the trial. We thus provide insight into what are good and bad choices of adaptive sampling plans when the added flexibility of adaptive designs is desired. Copyright © 2012 John Wiley & Sons, Ltd.

  17. Sampling Design of Soil Physical Properties in a Conilon Coffee Field

    Directory of Open Access Journals (Sweden)

    Eduardo Oliveira de Jesus Santos

    Full Text Available ABSTRACT Establishing the number of samples required to determine values of soil physical properties ultimately results in optimization of labor and allows better representation of such attributes. The objective of this study was to analyze the spatial variability of soil physical properties in a Conilon coffee field and propose a soil sampling method better attuned to conditions of the management system. The experiment was performed in a Conilon coffee field in Espírito Santo state, Brazil, under a 3.0 × 2.0 × 1.0 m (4,000 plants ha-1 double spacing design. An irregular grid, with dimensions of 107 × 95.7 m and 65 sampling points, was set up. Soil samples were collected from the 0.00-0.20 m depth from each sampling point. Data were analyzed under descriptive statistical and geostatistical methods. Using statistical parameters, the adequate number of samples for analyzing the attributes under study was established, which ranged from 1 to 11 sampling points. With the exception of particle density, all soil physical properties showed a spatial dependence structure best fitted to the spherical model. Establishment of the number of samples and spatial variability for the physical properties of soils may be useful in developing sampling strategies that minimize costs for farmers within a tolerable and predictable level of error.

  18. The Self-perception of Text-message Dependency Scale (STDS): Psychometric update based on a United States sample.

    Science.gov (United States)

    Liese, Bruce S; Benau, Erik M; Atchley, Paul; Reed, Derek; Becirevic, Amel; Kaplan, Brent

    2018-05-14

    Some have suggested that text messaging is an addictive behavior. However, this characterization is uncertain, partly due to lack of well-validated measures of text messaging attitudes and behaviors. One standard instrument for measuring text messaging attitudes and behaviors is the Self-perception of Text-message Dependency Scale (STDS), though the psychometric properties of this scale have only been examined with a sample of Japanese youth. The primary objective of this study was to evaluate the STDS in the United States to determine its utility as a measure of text messaging dependence. We were interested in examining the factor structure and determining the extent to which this scale would correlate with two important outcome measures: motor vehicle accidents (MVAs) and moving violations. We analyzed data from 468 adults (age 18-74; 274 women) recruited via Amazon's Mechanical Turk (mTurk) service. Participants completed the STDS and provided information about their driving-related incidents in the past year. First we performed a confirmatory factor analysis, which supported the instrument's original factor structure. Then we tested the relationship between scores on the STDS and two important variables, MVAs and moving violations. We found that the STDS significantly correlated with both MVAs and moving violations. The present study confirms that the STDS is a potentially useful instrument for studying texting dependence in the United States and with adults of all ages. The instrument may be particularly useful in predicting motor vehicle outcomes.

  19. Portable ultrahigh-vacuum sample storage system for polarization-dependent total-reflection fluorescence x-ray absorption fine structure spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Watanabe, Yoshihide, E-mail: e0827@mosk.tytlabs.co.jp; Nishimura, Yusaku F.; Suzuki, Ryo; Beniya, Atsushi; Isomura, Noritake [Toyota Central R&D Labs., Inc., Yokomichi 41-1, Nagakute, Aichi 480-1192 (Japan)]; Uehara, Hiromitsu; Asakura, Kiyotaka; Takakusagi, Satoru [Catalysis Research Center, Hokkaido University, Kita 21-10, Sapporo, Hokkaido 001-0021 (Japan)]; Nimura, Tomoyuki [AVC Co., Ltd., Inada 1450-6, Hitachinaka, Ibaraki 312-0061 (Japan)]

    2016-03-15

    A portable ultrahigh-vacuum sample storage system was designed and built to investigate the detailed geometric structures of mass-selected metal clusters on oxide substrates by polarization-dependent total-reflection fluorescence x-ray absorption fine structure spectroscopy (PTRF-XAFS). This ultrahigh-vacuum (UHV) sample storage system provides for the handover of samples between two different sample manipulating systems. The sample storage system is adaptable to public transportation, facilitating experiments using air-sensitive samples in synchrotron radiation or other quantum beam facilities. The samples were transferred by the developed portable UHV transfer system via public transportation over a distance of more than 400 km. The performance of the transfer system was demonstrated by a successful PTRF-XAFS study of Pt4 clusters deposited on a TiO2(110) surface.

  20. Sample design for the residential energy consumption survey

    Energy Technology Data Exchange (ETDEWEB)

    1994-08-01

    The purpose of this report is to provide detailed information about the multistage area-probability sample design used for the Residential Energy Consumption Survey (RECS). It is intended as a technical report, for use by statisticians, to better understand the theory and procedures followed in the creation of the RECS sample frame. For a more cursory overview of the RECS sample design, refer to the appendix entitled "How the Survey was Conducted," which is included in the statistical reports produced for each RECS survey year.

  1. System design description for sampling fuel in K basins

    International Nuclear Information System (INIS)

    Baker, R.B.

    1996-01-01

    This System Design Description provides: (1) statements of the Spent Nuclear Fuel Projects (SNFP) needs requiring sampling of fuel in the K East and K West Basins, (2) the sampling equipment functions and requirements, (3) a general work plan and the design logic being followed to develop the equipment, and (4) a summary description of the design for the sampling equipment. The report summarizes the integrated application of both the subject equipment and the canister sludge sampler in near-term characterization campaigns at K Basins

  2. Sampling design for use by the soil decontamination project

    International Nuclear Information System (INIS)

    Rutherford, D.W.; Stevens, J.R.

    1981-01-01

    This report proposes a general approach to the problem and discusses sampling of soil to map the contaminated area and to provide samples for characterization of soil components and contamination. Basic concepts in sample design are reviewed with reference to environmental transuranic studies. Common designs are reviewed and evaluated for use with specific objectives that might be required by the soil decontamination project. Examples of a hierarchical design pilot study and a combined hierarchical and grid study are proposed for the Rocky Flats 903 pad area.

  3. Implications of clinical trial design on sample size requirements.

    Science.gov (United States)

    Leon, Andrew C

    2008-07-01

    The primary goal in designing a randomized controlled clinical trial (RCT) is to minimize bias in the estimate of treatment effect. Randomized group assignment, double-blinded assessments, and control or comparison groups reduce the risk of bias. The design must also provide sufficient statistical power to detect a clinically meaningful treatment effect and maintain a nominal level of type I error. An attempt to integrate neurocognitive science into an RCT poses additional challenges. Two particularly relevant aspects of such a design often receive insufficient attention in an RCT. Multiple outcomes inflate type I error, and an unreliable assessment process introduces bias and reduces statistical power. Here we describe how both unreliability and multiple outcomes can increase the study costs and duration and reduce the feasibility of the study. The objective of this article is to consider strategies that overcome the problems of unreliability and multiplicity.
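The two problems the article highlights, attenuation from unreliable measurement and alpha-splitting across multiple outcomes, can be made concrete with a normal-approximation power sketch. This is an illustrative calculation under assumed effect sizes, not the article's own analysis, and the Bonferroni split is one simple way to handle multiplicity.

```python
import math
from statistics import NormalDist

Z = NormalDist()

def power_two_sample(d_true, n_per_group, reliability=1.0, k_outcomes=1, alpha=0.05):
    """Approximate power of a two-sample z-test when the outcome is measured
    with the given reliability and alpha is Bonferroni-split across
    k_outcomes endpoints."""
    d_obs = d_true * math.sqrt(reliability)       # attenuation by unreliability
    za = Z.inv_cdf(1 - alpha / (2 * k_outcomes))  # multiplicity-adjusted critical value
    return Z.cdf(d_obs * math.sqrt(n_per_group / 2) - za)
```

Running this with, say, d = 0.5 and 64 patients per arm shows how either an unreliable assessment or a handful of co-primary outcomes erodes power, forcing a larger and costlier trial to maintain the same sensitivity.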

  4. Investigating Treatment Outcomes Across OCD Symptom Dimensions in a Clinical Sample of OCD Patients.

    Science.gov (United States)

    Chase, Tannah; Wetterneck, Chad T; Bartsch, Robert A; Leonard, Rachel C; Riemann, Bradley C

    2015-01-01

    Despite the heterogeneous nature of obsessive-compulsive disorder (OCD), many self-report assessments do not adequately capture the clinical picture presenting within each symptom dimension, particularly unacceptable thoughts (UTs). In addition, obsessions and ordering/arranging compulsions are often underrepresented in samples of treatment outcome studies for OCD. Such methodological discrepancies may obscure research findings comparing treatment outcomes across OCD symptom dimensions. This study aimed to improve upon previous research by investigating treatment outcomes across OCD symptom dimensions using the Dimensional Obsessive-Compulsive Scale, which offers a more comprehensive assessment of UTs. The study included a primarily residential sample of 134 OCD patients. Results indicated that there were no significant differences in treatment outcomes across symptom dimensions. However, the severity of UTs remained significantly greater than other symptom dimensions at both admission and discharge. Thus, it is possible that UTs may exhibit uniquely impairing features, compared with other symptom dimensions. It is also possible that these findings may reflect the characteristics of the residential OCD samples. These speculations as well as implications for OCD treatment and future research are discussed.

  5. [Saarland Growth Study: sampling design].

    Science.gov (United States)

    Danker-Hopfe, H; Zabransky, S

    2000-01-01

    The use of reference data to evaluate the physical development of children and adolescents is part of the daily routine in paediatric outpatient care. The construction of such references is based on the collection of extensive data. There are different kinds of reference data: cross-sectional references, which are based on data collected from a large representative cross-sectional sample of the population; longitudinal references, which are based on follow-up surveys of usually smaller samples of individuals from birth to maturity; and mixed-longitudinal references, which are a combination of longitudinal and cross-sectional reference data. The advantages and disadvantages of the different methods of data collection and the resulting reference data are discussed. The Saarland Growth Study was conducted for several reasons: growth processes are subject to secular changes, there are no specific reference data for children and adolescents from this part of the country, and the growth charts in use in paediatric practice are possibly no longer appropriate. The Saarland Growth Study therefore served two purposes: (a) to create current regional reference data, and (b) to create a database for future studies on secular trends in growth processes of children and adolescents from Saarland. The present contribution focuses on general remarks on the sampling design of (cross-sectional) growth surveys and its inferences for the design of the present study.

  6. Design and development of multiple sample counting setup

    International Nuclear Information System (INIS)

    Rath, D.P.; Murali, S.; Babu, D.A.R.

    2010-01-01

    The analysis of active samples on a regular basis for ambient air activity and floor contamination from the radiochemical lab accounts for a major share of the operational activity in the Health Physicist's responsibilities. The requirement for daily air sample analysis, with immediate and delayed counting, from various labs, in addition to smear swipe checks of the labs, led to the need for a system that could carry out multiple sample analyses in a time-programmed manner from a single sample loading. A multiple alpha/beta counting system was designed and fabricated. It has arrangements for loading 10 samples in ordered slots, which are counted in a time-programmed manner with results displayed and records maintained on a PC. The paper describes the design and development of the multiple sample counting setup presently in use at the facility, which has resulted in a reduction of the man-hours consumed in counting and recording the results.

  7. Effects of Environmental Design on Patient Outcome

    DEFF Research Database (Denmark)

    Laursen, Jannie; Danielsen, Anne Kjaergaard; Rosenberg, Jacob

    2014-01-01

    OBJECTIVE: The aim of this systematic review was to assess how inpatients were affected by the built environment design during their hospitalization. BACKGROUND: Over the last decade, the healthcare system has become increasingly aware of how focus on the healthcare environment might affect patient satisfaction. The focus on environmental design has become a field with great potential because of its possible impact on cost control while improving quality of care. METHODS: A systematic literature search was conducted to identify current and past studies about evidence-based healthcare design. The following databases were searched: Medline/PubMed, Cinahl, and Embase. Inclusion criteria were randomized clinical trials (RCTs) investigating the effect of built environment design interventions such as music, natural murals, and plants in relation to patients' health outcome. RESULTS: Built environment...

  8. The two-sample problem with induced dependent censorship.

    Science.gov (United States)

    Huang, Y

    1999-12-01

    Induced dependent censorship is a general phenomenon in health service evaluation studies in which a measure such as quality-adjusted survival time or lifetime medical cost is of interest. We investigate the two-sample problem and propose two classes of nonparametric tests. Based on consistent estimation of the survival function for each sample, the two classes of test statistics examine the cumulative weighted difference in hazard functions and in survival functions. We derive a unified asymptotic null distribution theory and inference procedure. The tests are applied to trial V of the International Breast Cancer Study Group and show that long duration chemotherapy significantly improves time without symptoms of disease and toxicity of treatment as compared with the short duration treatment. Simulation studies demonstrate that the proposed tests, with a wide range of weight choices, perform well under moderate sample sizes.

  9. Probability sampling design in ethnobotanical surveys of medicinal plants

    Directory of Open Access Journals (Sweden)

    Mariano Martinez Espinosa

    2012-07-01

    Non-probability sampling design can be used in ethnobotanical surveys of medicinal plants. However, this method does not allow statistical inferences to be made from the data generated. The aim of this paper is to present a probability sampling design that is applicable in ethnobotanical studies of medicinal plants. The sampling design employed in the research titled "Ethnobotanical knowledge of medicinal plants used by traditional communities of Nossa Senhora Aparecida do Chumbo district (NSACD), Poconé, Mato Grosso, Brazil" was used as a case study. Probability sampling methods (simple random and stratified sampling) were used in this study. In order to determine the sample size, the following data were considered: population size (N) of 1179 families; confidence coefficient, 95%; sample error (d), 0.05; and a proportion (p), 0.5. The application of this sampling method resulted in a sample size (n) of at least 290 families in the district. The present study concludes that probability sampling methods necessarily have to be employed in ethnobotanical studies of medicinal plants, particularly where statistical inferences have to be made using the data obtained. This can be achieved by applying different existing probability sampling methods, or better still, a combination of such methods.
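The sample-size arithmetic in this abstract follows the standard finite-population formula for estimating a proportion. The sketch below reproduces the reported figure of at least 290 families from the stated inputs, assuming z = 1.96 for the 95% confidence coefficient.

```python
import math

def sample_size(N, p=0.5, d=0.05, z=1.96):
    """Minimum sample size for estimating a proportion in a finite
    population of size N with margin of error d:
    n = N * z^2 * p * (1-p) / (d^2 * (N-1) + z^2 * p * (1-p))."""
    num = N * z**2 * p * (1 - p)
    den = d**2 * (N - 1) + z**2 * p * (1 - p)
    return math.ceil(num / den)
```

With N = 1179, p = 0.5 and d = 0.05 this yields 290; as N grows, the result approaches the familiar infinite-population value of about 385.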

  10. Rationale, study design and sample characteristics of a randomized controlled trial of directly administered antiretroviral therapy for HIV-infected prisoners transitioning to the community - a potential conduit to improved HIV treatment outcomes.

    Science.gov (United States)

    Saber-Tehrani, Ali Shabahang; Springer, Sandra A; Qiu, Jingjun; Herme, Maua; Wickersham, Jeffrey; Altice, Frederick L

    2012-03-01

    HIV-infected prisoners experience poor HIV treatment outcomes post-release. Directly administered antiretroviral therapy (DAART) is a CDC-designated, evidence-based adherence intervention for drug users, yet untested among released prisoners. Sentenced HIV-infected prisoners on antiretroviral therapy (ART) and returning to New Haven or Hartford, Connecticut were recruited and randomized 2:1 to a prospective controlled trial (RCT) of 6 months of DAART versus self-administered therapy (SAT); all subjects received case management services. Subjects meeting DSM-IV criteria for opioid dependence were offered immediate medication-assisted treatment. Trained outreach workers provided DAART once-daily, seven days per week, including behavioral skills training during the last intervention month. Both study groups were assessed for 6 months after the intervention period. Assessments occurred within 90 days pre-release (baseline), day of release, and then monthly for 12 months. Viral load (VL) and CD4 testing was conducted baseline and quarterly; genotypic resistance testing was conducted at baseline, 6 and 12 months. The primary outcome was pre-defined as viral suppression (VLHIV treatment outcomes after release from prison, a period associated with adverse HIV and other medical consequences. Copyright © 2011 Elsevier Inc. All rights reserved.

  11. Burnout and Engagement: Relative Importance of Predictors and Outcomes in Two Health Care Worker Samples.

    Science.gov (United States)

    Fragoso, Zachary L; Holcombe, Kyla J; McCluney, Courtney L; Fisher, Gwenith G; McGonagle, Alyssa K; Friebe, Susan J

    2016-06-09

    This study's purpose was twofold: first, to examine the relative importance of job demands and resources as predictors of burnout and engagement, and second, to examine the relative importance of engagement and burnout as predictors of health, depressive symptoms, work ability, organizational commitment, and turnover intentions in two samples of health care workers. Nurse leaders (n = 162) and licensed emergency medical technicians (EMTs; n = 102) completed surveys. In both samples, job demands predicted burnout more strongly than job resources did, and job resources predicted engagement more strongly than job demands did. Engagement held more weight than burnout for predicting commitment, and burnout held more weight for predicting health outcomes, depressive symptoms, and work ability. Results have implications for the design, evaluation, and effectiveness of workplace interventions to reduce burnout and improve engagement among health care workers. Actionable recommendations for increasing engagement and decreasing burnout in health care organizations are provided. © 2016 The Author(s).

  12. Sample size calculations based on a difference in medians for positively skewed outcomes in health care studies

    Directory of Open Access Journals (Sweden)

    Aidan G. O’Keeffe

    2017-12-01

    Background: In healthcare research, outcomes with skewed probability distributions are common. Sample size calculations for such outcomes are typically based on estimates on a transformed scale (e.g. log), which may sometimes be difficult to obtain. In contrast, estimates of median and variance on the untransformed scale are generally easier to pre-specify. The aim of this paper is to describe how to calculate a sample size for a two-group comparison of interest based on median and untransformed variance estimates for log-normal outcome data. Methods: A log-normal distribution for outcome data is assumed, and a sample size calculation approach for a two-sample t-test that compares log-transformed outcome data is demonstrated, where the change of interest is specified as a difference in median values on the untransformed scale. A simulation study is used to compare the method with a non-parametric alternative (Mann-Whitney U test) in a variety of scenarios, and the method is applied to a real example in neurosurgery. Results: The method attained the nominal power value in simulation studies and compared favourably with a Mann-Whitney U test and a two-sample t-test of untransformed outcomes. In addition, the method can be adjusted and used in some situations where the outcome distribution is not strictly log-normal. Conclusions: We recommend this sample size calculation approach for outcome data that are expected to be positively skewed and where a two-group comparison on a log-transformed scale is planned. An advantage of this method over the usual calculations based on estimates on the log-transformed scale is that it allows clinical efficacy to be specified as a difference in medians and requires a variance estimate on the untransformed scale. Such estimates are often easier to obtain and more interpretable than those for log-transformed outcomes.
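
    The calculation described above can be sketched in a few lines. This is a minimal illustration under stated assumptions, not necessarily the paper's exact procedure: both groups share a log-normal outcome with a common untransformed variance, the log-scale variance is recovered from the median and variance via the log-normal moment identities, and the usual two-sample normal-approximation formula is applied on the log scale. The example medians (10 vs. 15) and variance (100) are invented.

```python
import math
from statistics import NormalDist

def lognormal_sigma2(median, variance):
    """Recover the log-scale variance sigma^2 of a log-normal variable from its
    median m = exp(mu) and untransformed variance v, using the moment identity
    v = (exp(sigma^2) - 1) * m^2 * exp(sigma^2)."""
    # Solve x^2 - x - v/m^2 = 0 for x = exp(sigma^2), taking the positive root.
    x = (1 + math.sqrt(1 + 4 * variance / median**2)) / 2
    return math.log(x)

def n_per_group(median1, median2, variance, alpha=0.05, power=0.9):
    """Per-group sample size for a two-sample t-test on log-transformed data,
    with the effect specified as a difference in medians on the raw scale."""
    delta = math.log(median2) - math.log(median1)  # effect on the log scale
    sigma2 = lognormal_sigma2(median1, variance)   # common log-scale variance
    z = NormalDist().inv_cdf
    n = 2 * sigma2 * (z(1 - alpha / 2) + z(power)) ** 2 / delta**2
    return math.ceil(n)

print(n_per_group(10, 15, 100))
```

    Here a 50% increase in the median, with untransformed variance 100, needs 62 subjects per group at 90% power; such a figure can then be verified by simulation, as the paper does.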

  13. Designing an enhanced groundwater sample collection system

    International Nuclear Information System (INIS)

    Schalla, R.

    1994-10-01

    As part of an ongoing technical support mission to achieve excellence and efficiency in environmental restoration activities at the Laboratory for Energy and Health-Related Research (LEHR), Pacific Northwest Laboratory (PNL) provided guidance on the design and construction of monitoring wells and identified the most suitable type of groundwater sampling pump and accessories for monitoring wells. The goal was to utilize a monitoring well design that would allow for hydrologic testing and reduce turbidity to minimize the impact of sampling. The sampling results of the newly designed monitoring wells were clearly superior to those of the previously installed monitoring wells. The new wells exhibited reduced turbidity, in addition to improved access for instrumentation and hydrologic testing. The variable frequency submersible pump was selected as the best choice for obtaining groundwater samples. The literature references are listed at the end of this report. Despite some initial difficulties, the variable frequency submersible pump and its accessories were effective in reducing sampling time and labor costs, and their ease of use was preferred over the previously used bladder pumps. The surface seal system, called the Dedicator, proved to be a useful accessory that prevents surface contamination while providing easy access for water-level measurements and for connecting the pump. Cost savings resulted from the use of the pre-production pumps (beta units) donated by the manufacturer for the demonstration. However, larger savings resulted from the shortened field time due to the ease of using the submersible pumps and the surface seal access system. Proper deployment of the monitoring wells also resulted in cost savings and ensured representative samples.

  14. Extending cluster lot quality assurance sampling designs for surveillance programs.

    Science.gov (United States)

    Hund, Lauren; Pagano, Marcello

    2014-07-20

    Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as poor or acceptable performance on the basis of the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than simple random sampling. By applying survey sampling results to the binary classification procedure, we develop a simple and flexible nonparametric procedure to incorporate clustering effects into the LQAS sample design to appropriately inflate the sample size, accommodating finite numbers of clusters in the population when relevant. We use this framework to then discuss principled selection of survey design parameters in longitudinal surveillance programs. We apply this framework to design surveys to detect rises in malnutrition prevalence in nutrition surveillance programs in Kenya and South Sudan, accounting for clustering within villages. By combining historical information with data from previous surveys, we design surveys to detect spikes in the childhood malnutrition rate. Copyright © 2014 John Wiley & Sons, Ltd.
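
    The sample-size inflation at the core of this idea can be illustrated with the textbook design effect for two-stage cluster sampling. This generic sketch (DEFF = 1 + (m - 1) * ICC, equal cluster sizes) is not the authors' nonparametric procedure, and the example numbers are invented.

```python
import math

def inflated_sample_size(n_srs, cluster_size, icc):
    """Inflate a simple-random-sample size for a two-stage cluster design
    using the classic design effect DEFF = 1 + (m - 1) * ICC."""
    deff = 1 + (cluster_size - 1) * icc
    return math.ceil(n_srs * deff)

def clusters_needed(n_srs, cluster_size, icc):
    """Number of clusters required to reach the inflated sample size."""
    return math.ceil(inflated_sample_size(n_srs, cluster_size, icc) / cluster_size)

# e.g. an LQAS design needing 192 children under SRS, 10 per village, ICC = 0.1
print(inflated_sample_size(192, 10, 0.1), clusters_needed(192, 10, 0.1))
```

    With these invented inputs the design effect is 1.9, so the 192-child SRS design grows to 365 children spread over 37 villages.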

  15. Thermal probe design for Europa sample acquisition

    Science.gov (United States)

    Horne, Mera F.

    2018-01-01

    The planned lander missions to the surface of Europa will access samples from the subsurface of the ice in a search for signs of life. A small thermal drill (probe) is proposed to meet the sample requirement of the Science Definition Team's (SDT) report for the Europa mission. The probe is 2 cm in diameter and 16 cm in length and is designed to access the subsurface to a depth of 10 cm and to collect five ice samples of approximately 7 cm3 each. The energy required to penetrate the top 10 cm of ice in a vacuum is approximately 26 Wh, and the energy to melt 7 cm3 of ice is approximately 1.2 Wh. The requirement stated in the SDT report of collecting samples from five different sites can be accommodated with repeated use of the same thermal drill. For smaller sample sizes, a smaller probe of 1.0 cm in diameter with the same 16 cm length could be utilized; it would require approximately 6.4 Wh to penetrate the top 10 cm of ice and 0.02 Wh to collect 0.1 g of sample. The thermal drill has the advantages of simplicity of design and operation and the ability to penetrate ice over a range of densities and hardness while maintaining sample integrity.
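
    The quoted melt energy is easy to sanity-check from handbook values. The figures below are our assumptions, not taken from the abstract: ice warmed from a roughly 100 K surface temperature to 273 K, density 0.92 g/cm3, an average specific heat of 2.0 J/(g K), and a latent heat of fusion of 334 J/g.

```python
# Back-of-envelope check of the quoted ~1.2 Wh to melt 7 cm^3 of ice.
volume_cm3 = 7.0
density = 0.92          # g/cm^3, assumed ice density
c_ice = 2.0             # J/(g K), rough average over 100 K -> 273 K
dT = 273.0 - 100.0      # warm from an assumed ~100 K surface to the melting point
latent = 334.0          # J/g, latent heat of fusion

mass = volume_cm3 * density                      # 6.44 g
energy_J = mass * (c_ice * dT + latent)          # sensible heat + melting
energy_Wh = energy_J / 3600.0
print(round(energy_Wh, 2))
```

    The result, about 1.22 Wh, is consistent with the abstract's figure of approximately 1.2 Wh.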

  16. Neurophysiological model of tinnitus: dependence of the minimal masking level on treatment outcome.

    Science.gov (United States)

    Jastreboff, P J; Hazell, J W; Graham, R L

    1994-11-01

    The validity of the neurophysiological model of tinnitus (Jastreboff, 1990), outlined in this paper, was tested on data from a multicenter trial of tinnitus masking (Hazell et al., 1985). Minimal masking level, intensity match of tinnitus, and the threshold of hearing were evaluated in a total of 382 patients before and after 6 months of treatment with maskers, hearing aids, or combination devices. The data were divided into categories depending on treatment outcome and the type of approach used. The analysis revealed that: i) the psychoacoustical description of tinnitus does not possess predictive value for the outcome of the treatment; ii) minimal masking level changed significantly depending on the treatment outcome, decreasing on average by 5.3 dB in patients reporting improvement, and increasing by 4.9 dB in those whose tinnitus remained the same or worsened; iii) 73.9% of patients reporting improvement had their minimal masking level decreased, as compared with 50.5% of patients not showing improvement, which is at the level of random change; iv) the type of device used had no significant impact on the treatment outcome or on the change in minimal masking level; v) intensity match and threshold of hearing did not exhibit any significant changes that can be related to treatment outcome. These results are fully consistent with the neurophysiological interpretation of the mechanisms involved in the phenomenon of tinnitus and its alleviation.

  17. A Framework for Designing a Healthcare Outcome Data Warehouse

    Science.gov (United States)

    Parmanto, Bambang; Scotch, Matthew; Ahmad, Sjarif

    2005-01-01

    Many healthcare processes involve a series of patient visits or a series of outcomes. The modeling of outcomes associated with these types of healthcare processes is different from and not as well understood as the modeling of standard industry environments. For this reason, the typical multidimensional data warehouse designs that are frequently seen in other industries are often not a good match for data obtained from healthcare processes. Dimensional modeling is a data warehouse design technique that uses a data structure similar to the easily understood entity-relationship (ER) model but is sophisticated in that it supports high-performance data access. In the context of rehabilitation services, we implemented a slight variation of the dimensional modeling technique to make a data warehouse more appropriate for healthcare. One of the key aspects of designing a healthcare data warehouse is finding the right grain (scope) for different levels of analysis. We propose three levels of grain that enable the analysis of healthcare outcomes from highly summarized reports on episodes of care to fine-grained studies of progress from one treatment visit to the next. These grains allow the database to support multiple levels of analysis, which is imperative for healthcare decision making. PMID:18066371

  18. A framework for designing a healthcare outcome data warehouse.

    Science.gov (United States)

    Parmanto, Bambang; Scotch, Matthew; Ahmad, Sjarif

    2005-09-06

    Many healthcare processes involve a series of patient visits or a series of outcomes. The modeling of outcomes associated with these types of healthcare processes is different from and not as well understood as the modeling of standard industry environments. For this reason, the typical multidimensional data warehouse designs that are frequently seen in other industries are often not a good match for data obtained from healthcare processes. Dimensional modeling is a data warehouse design technique that uses a data structure similar to the easily understood entity-relationship (ER) model but is sophisticated in that it supports high-performance data access. In the context of rehabilitation services, we implemented a slight variation of the dimensional modeling technique to make a data warehouse more appropriate for healthcare. One of the key aspects of designing a healthcare data warehouse is finding the right grain (scope) for different levels of analysis. We propose three levels of grain that enable the analysis of healthcare outcomes from highly summarized reports on episodes of care to fine-grained studies of progress from one treatment visit to the next. These grains allow the database to support multiple levels of analysis, which is imperative for healthcare decision making.

  19. Learning Bounds of ERM Principle for Sequences of Time-Dependent Samples

    Directory of Open Access Journals (Sweden)

    Mingchen Yao

    2015-01-01

    Many generalization results in learning theory are established under the assumption that samples are independent and identically distributed (i.i.d.). However, numerous learning tasks in practical applications involve time-dependent data. In this paper, we propose a theoretical framework to analyze the generalization performance of the empirical risk minimization (ERM) principle for sequences of time-dependent samples (TDS). In particular, we first present the generalization bound of the ERM principle for TDS. By introducing some auxiliary quantities, we also give a further analysis of the generalization properties and the asymptotic behavior of the ERM principle for TDS.

  20. The Things of Design Research: Diversity in Objects and Outcomes

    DEFF Research Database (Denmark)

    Jenkins, Tom; Andersen, Kristina; Gaver, Bill

    2017-01-01

    ...of attending to its made-material outcomes. The premise of this workshop is simple: we need additional social spaces and platforms for interacting with and reflecting upon material design outcomes at CHI. The goal of this workshop is to keep experimenting with such a space, with an emphasis on how prototyping...

  1. Triangulation based inclusion probabilities: a design-unbiased sampling approach

    OpenAIRE

    Fehrmann, Lutz; Gregoire, Timothy; Kleinn, Christoph

    2011-01-01

    A probabilistic sampling approach for design-unbiased estimation of area-related quantitative characteristics of spatially dispersed population units is proposed. The developed field protocol includes a fixed number of 3 units per sampling location and is based on partial triangulations over their natural neighbors to derive the individual inclusion probabilities. The performance of the proposed design is tested in comparison to fixed area sample plots in a simulation with two forest stands. ...

  2. Evaluation of single and two-stage adaptive sampling designs for estimation of density and abundance of freshwater mussels in a large river

    Science.gov (United States)

    Smith, D.R.; Rogala, J.T.; Gray, B.R.; Zigler, S.J.; Newton, T.J.

    2011-01-01

    Reliable estimates of abundance are needed to assess consequences of proposed habitat restoration and enhancement projects on freshwater mussels in the Upper Mississippi River (UMR). Although there is general guidance on sampling techniques for population assessment of freshwater mussels, the actual performance of sampling designs can depend critically on the population density and spatial distribution at the project site. To evaluate various sampling designs, we simulated sampling of populations, which varied in density and degree of spatial clustering. Because of logistics and costs of large river sampling and spatial clustering of freshwater mussels, we focused on adaptive and non-adaptive versions of single and two-stage sampling. The candidate designs performed similarly in terms of precision (CV) and probability of species detection for fixed sample size. Both CV and species detection were determined largely by density, spatial distribution and sample size. However, designs did differ in the rate that occupied quadrats were encountered. Occupied units had a higher probability of selection using adaptive designs than conventional designs. We used two measures of cost: sample size (i.e. number of quadrats) and distance travelled between the quadrats. Adaptive and two-stage designs tended to reduce distance between sampling units, and thus performed better when distance travelled was considered. Based on the comparisons, we provide general recommendations on the sampling designs for the freshwater mussels in the UMR, and presumably other large rivers.
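
    The abstract's observation that precision depends critically on density and spatial distribution can be reproduced with a toy Monte-Carlo: quadrat counts drawn as Poisson (spatially random) versus negative-binomial (clustered) at the same mean density. This is a generic sketch, not the authors' simulation; the density, dispersion, and sample-size values are invented.

```python
import math
import random
import statistics

def cv_of_density_estimate(sampler, n_quadrats=50, n_sims=2000, seed=1):
    """Monte-Carlo CV of the mean-density estimator under a given count model."""
    rng = random.Random(seed)
    estimates = [statistics.mean(sampler(rng) for _ in range(n_quadrats))
                 for _ in range(n_sims)]
    return statistics.stdev(estimates) / statistics.mean(estimates)

def poisson(rng, lam=2.0):
    """Poisson draw via Knuth's product method (fine for small lam)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def clustered(rng, lam=2.0, k_disp=0.5):
    """Negative-binomial counts: a Poisson whose mean is gamma-distributed,
    i.e. the same overall density but spatially clustered."""
    return poisson(rng, rng.gammavariate(k_disp, lam / k_disp))

cv_random = cv_of_density_estimate(poisson)
cv_clustered = cv_of_density_estimate(clustered)
print(cv_random < cv_clustered)  # clustering inflates the CV at a fixed sample size
```

    At equal mean density and equal effort, the clustered population yields a visibly larger CV, which is why the candidate designs in the abstract are compared across spatial distributions rather than at a single one.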

  3. Differences in treatment outcome among marijuana-dependent young adults with and without antisocial personality disorder.

    Science.gov (United States)

    Easton, Caroline J; Oberleitner, Lindsay M; Scott, Melanie C; Crowley, Michael J; Babuscio, Theresa A; Carroll, Kathleen M

    2012-07-01

    Few studies have addressed comorbid antisocial personality disorder (ASPD) and marijuana dependence in young adults, and results from previous studies are inconsistent. This study evaluated differences in pretreatment characteristics and treatment outcomes between marijuana-dependent young adults with and without ASPD. Data for this study were derived from a randomized trial, in which marijuana-dependent young adults (n = 136) between 18 and 25 years of age were randomized to four behavioral conditions: (1) MET/CBT with CM, (2) MET/CBT without CM, (3) DC with CM, and (4) DC without CM. Forty-four percent of the participants met DSM-IV-TR criteria for ASPD. ASPD clients had significantly more lifetime alcohol dependence disorders, marijuana use in the 28 days pretreatment, arrests, and assault and weapon charges compared to those without ASPD. ASPD clients did not differ in retention or substance use outcomes at 8 weeks posttreatment or the 6-month follow-up. In general, both groups had more attendance in the voucher condition, but there were no significant ASPD by treatment interactions. These data suggest that marijuana-dependent young adults with comorbid ASPD do not necessarily have poorer retention or substance use outcomes compared with marijuana-dependent young adults who do not have ASPD when treated in a well-defined behavioral therapy protocol. Previous research has shown increased risks for clients with comorbid ASPD and marijuana dependence; however, our findings suggest that specialized programs for clients with ASPD may not be necessary if they are provided with empirically supported, structured treatments.

  4. Multiple cyber attacks against a target with observation errors and dependent outcomes: Characterization and optimization

    International Nuclear Information System (INIS)

    Hu, Xiaoxiao; Xu, Maochao; Xu, Shouhuai; Zhao, Peng

    2017-01-01

    In this paper we investigate a cybersecurity model: An attacker can launch multiple attacks against a target with a termination strategy that says that the attacker will stop after observing a number of successful attacks or when the attacker is out of attack resources. However, the attacker's observation of the attack outcomes (i.e., random variables indicating whether the target is compromised or not) has an observation error that is specified by both a false-negative and a false-positive probability. The novelty of the model we study is the accommodation of the dependence between the attack outcomes, because the dependence was assumed away in the literature. In this model, we characterize the monotonicity and bounds of the compromise probability (i.e., the probability that the target is compromised). In addition to extensively showing the impact of dependence on quantities such as compromise probability and attack cost, we give methods for finding the optimal strategy that leads to maximum compromise probability or minimum attack cost. This study highlights that the dependence between random variables cannot be assumed away, because the results will be misleading. - Highlights: • A novel cybersecurity model is proposed to accommodate the dependence among attack outcomes. • The monotonicity and bounds of the compromise probability are studied. • The dependence effect on the compromise probability and attack cost is discussed via simulation. • The optimal strategy that leads to maximum compromise probability or minimum attack cost is presented.
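
    A model of this kind is straightforward to explore by simulation. The sketch below is our own illustration, not the authors' formulation: dependence between attack outcomes is induced by a latent per-episode success rate, observation error is a false-negative/false-positive pair, and all numeric parameters are invented.

```python
import random

def compromise_prob(n_max=5, stop_after=1, fn=0.2, fp=0.1,
                    mix=((0.5, 0.2), (0.5, 0.8)), n_sims=100_000, seed=7):
    """Monte-Carlo estimate of P(target truly compromised) when the attacker
    stops after observing `stop_after` apparent successes or after n_max
    attempts. Dependence between attack outcomes comes from a latent success
    rate drawn once per episode from `mix` = ((weight, p), ...)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        u, p = rng.random(), None
        for w, pv in mix:               # draw the shared latent success rate
            u -= w
            if u <= 0:
                p = pv
                break
        observed, truly = 0, False
        for _ in range(n_max):
            success = rng.random() < p
            truly = truly or success
            # observation error: miss a success w.p. fn, see a phantom w.p. fp
            seen = (rng.random() >= fn) if success else (rng.random() < fp)
            observed += seen
            if observed >= stop_after:  # attacker's termination strategy
                break
        hits += truly
    return hits / n_sims

dep = compromise_prob()                    # dependent outcomes (mixture)
ind = compromise_prob(mix=((1.0, 0.5),))  # independent, same mean success rate
print(dep, ind)
```

    Varying `mix` while holding the mean success rate fixed shows how the compromise probability moves with the dependence structure, which is the comparison the paper formalizes.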

  5. ACS sampling system: design, implementation, and performance evaluation

    Science.gov (United States)

    Di Marcantonio, Paolo; Cirami, Roberto; Chiozzi, Gianluca

    2004-09-01

    By means of the ACS (ALMA Common Software) framework, we designed and implemented a sampling system which allows sampling of every Characteristic Component Property with a specific, user-defined, sustained frequency limited only by the hardware. Collected data are sent to various clients (one or more Java plotting widgets, a dedicated GUI, or a COTS application) using the ACS/CORBA Notification Channel. The data transport is optimized: samples are cached locally and sent in packets with a lower, user-defined frequency to keep the network load under control. Simultaneous sampling of the Properties of different Components is also possible. Together with the design and implementation issues, we present the performance of the sampling system evaluated on two different platforms: on a VME-based system using the VxWorks RTOS (currently adopted by ALMA) and on a PC/104+ embedded platform using the Red Hat 9 Linux operating system. The PC/104+ solution offers, as an alternative, a low-cost PC-compatible hardware environment with a free and open operating system.

  6. Effect of quality chronic disease management for alcohol and drug dependence on addiction outcomes.

    Science.gov (United States)

    Kim, Theresa W; Saitz, Richard; Cheng, Debbie M; Winter, Michael R; Witas, Julie; Samet, Jeffrey H

    2012-12-01

    We examined the effect of the quality of primary care-based chronic disease management (CDM) for alcohol and/or other drug (AOD) dependence on addiction outcomes. We assessed quality using (1) a visit-frequency-based measure and (2) a self-reported assessment measuring alignment with the chronic care model. The visit-frequency-based measure had no significant association with addiction outcomes. When care was received at a CDM clinic, the self-reported measure of care was associated with lower drug addiction severity. The self-reported assessment of care from any healthcare source (CDM clinic or elsewhere) was associated with lower alcohol addiction severity and abstinence. These findings suggest that high-quality CDM for AOD dependence may improve addiction outcomes. Quality measures based upon alignment with the chronic care model may better capture features of effective CDM care than a visit-frequency measure. Copyright © 2012 Elsevier Inc. All rights reserved.

  7. Assessing nicotine dependence in adolescent E-cigarette users: The 4-item Patient-Reported Outcomes Measurement Information System (PROMIS) Nicotine Dependence Item Bank for electronic cigarettes.

    Science.gov (United States)

    Morean, Meghan E; Krishnan-Sarin, Suchitra; O'Malley, Stephanie S

    2018-04-26

    Adolescent e-cigarette use (i.e., "vaping") likely confers risk for developing nicotine dependence. However, there have been no studies assessing e-cigarette nicotine dependence in youth. We evaluated the psychometric properties of the 4-item Patient-Reported Outcomes Measurement Information System Nicotine Dependence Item Bank for E-cigarettes (PROMIS-E) for assessing youth e-cigarette nicotine dependence and examined risk factors for experiencing stronger dependence symptoms. In 2017, 520 adolescent past-month e-cigarette users completed the PROMIS-E during a school-based survey (50.5% female, 84.8% White, 16.22 [1.19] years old). Adolescents also reported on sex, grade, race, age at e-cigarette use onset, vaping frequency, nicotine e-liquid use, and past-month cigarette smoking. Analyses included confirmatory factor analysis and examination of the internal consistency of the PROMIS-E. Bivariate correlations and independent-samples t-tests were used to examine unadjusted relationships between e-cigarette nicotine dependence and the proposed risk factors. Regression models were run in which all potential risk factors were entered as simultaneous predictors of PROMIS-E scores. The single-factor structure of the PROMIS-E was confirmed and evidenced good internal consistency. Across models, larger PROMIS-E scores were associated with being in a higher grade, initiating e-cigarette use at an earlier age, vaping more frequently, using nicotine e-liquid (and higher nicotine concentrations), and smoking cigarettes. Adolescent e-cigarette users reported experiencing nicotine dependence, which was assessed using the psychometrically sound PROMIS-E. Experiencing stronger nicotine dependence symptoms was associated with characteristics that have previously been shown to confer risk for frequent vaping and tobacco cigarette dependence. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Mobile Variable Depth Sampling System Design Study

    International Nuclear Information System (INIS)

    BOGER, R.M.

    2000-01-01

    A design study is presented for a mobile, variable depth sampling system (MVDSS) that will support the treatment and immobilization of Hanford LAW and HLW. The sampler can be deployed in a 4-inch tank riser and has a design that is based on requirements identified in the Level 2 Specification (latest revision). The waste feed sequence for the MVDSS is based on the Phase 1, Case 3S6 waste feed sequence. Technical information is also presented that supports the design study.

  9. Mobile Variable Depth Sampling System Design Study

    Energy Technology Data Exchange (ETDEWEB)

    BOGER, R.M.

    2000-08-25

    A design study is presented for a mobile, variable depth sampling system (MVDSS) that will support the treatment and immobilization of Hanford LAW and HLW. The sampler can be deployed in a 4-inch tank riser and has a design that is based on requirements identified in the Level 2 Specification (latest revision). The waste feed sequence for the MVDSS is based on Phase 1, Case 3S6 waste feed sequence. Technical information is also presented that supports the design study.

  10. Sample requirements and design of an inter-laboratory trial for radiocarbon laboratories

    International Nuclear Information System (INIS)

    Bryant, Charlotte; Carmi, Israel; Cook, Gordon; Gulliksen, Steinar; Harkness, Doug; Heinemeier, Jan; McGee, Edward; Naysmith, Philip; Possnert, Goran; Scott, Marian; Plicht, Hans van der; Strydonck, Mark van

    2000-01-01

    An on-going inter-comparison programme which is focused on assessing and establishing consensus protocols to be applied in the identification, selection and sub-sampling of materials for subsequent 14C analysis is described. The outcome of the programme will provide a detailed quantification of the uncertainties associated with 14C measurements, including the issues of accuracy and precision. Such projects have become recognised as a fundamental aspect of continuing laboratory quality assurance schemes, providing a mechanism for the harmonisation of measurements and for demonstrating the traceability of results. The design of this study and its rationale are described. In summary, a suite of core samples has been defined which will be made available to both AMS and radiometric laboratories. These core materials are representative of routinely dated material and their ages span the full range of the applied 14C time-scale. Two of the samples are of wood from the German and Irish dendrochronologies, thus providing a direct connection to the master dendrochronological calibration curve. Further samples link this new inter-comparison to past studies. Sample size and precision have been identified as being of paramount importance in defining dating confidence, and so several core samples have been identified for more in-depth study of these practical issues. In addition to the core samples, optional samples have been identified and prepared specifically for either AMS and/or radiometric laboratories. For AMS laboratories, these include bone, textile, leather and parchment samples. Participation in the study requires a commitment to a minimum of 10 core analyses, with results to be returned within a year.

  11. Testing rank-dependent utility theory for health outcomes.

    Science.gov (United States)

    Oliver, Adam

    2003-10-01

    Systematic violations of expected utility theory (EU) have been reported in the context of both money and health outcomes. Rank-dependent utility theory (RDU) is currently the most popular and influential alternative theory of choice under circumstances of risk. This paper reports a test of the descriptive performance of RDU compared to EU in the context of health. When one of the options is certain, violations of EU that can be explained by RDU are found. When both options are risky, no evidence that RDU is a descriptive improvement over EU is found, though this finding may be due to the low power of the tests. Copyright 2002 John Wiley & Sons, Ltd.
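
    The RDU functional being tested can be written down compactly: outcomes are ranked from best to worst and weighted by differences of a transformed cumulative probability, with EU recovered when the weighting function is the identity. The power utility and Tversky-Kahneman weighting form below are standard illustrative choices, not the paper's elicited functions.

```python
def rdu(outcomes, probs, u, w):
    """Rank-dependent utility: rank outcomes from best to worst and apply
    decision weights formed as differences of the transformed cumulative
    probability of doing at least that well."""
    pairs = sorted(zip(outcomes, probs), key=lambda t: -u(t[0]))
    total, cum = 0.0, 0.0
    for x, p in pairs:
        total += (w(cum + p) - w(cum)) * u(x)
        cum += p
    return total

u = lambda x: x ** 0.5               # illustrative utility over outcome values
w_identity = lambda p: p             # EU is RDU with linear weighting
w_tk = lambda p, g=0.61: p**g / (p**g + (1 - p)**g) ** (1 / g)  # Tversky-Kahneman form

lottery = ([100, 0], [0.1, 0.9])
print(rdu(*lottery, u, w_identity))  # EU: 0.1 * sqrt(100) = 1.0
print(rdu(*lottery, u, w_tk))        # > 1: the small 0.1 chance is overweighted
```

    Comparing the two evaluations of the same lottery shows the mechanism RDU adds over EU: probability weighting, which is what allows it to account for certainty-effect violations.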

  12. Small Sample Properties of the Wilcoxon Signed Rank Test with Discontinuous and Dependent Observations

    OpenAIRE

    Nadine Chlass; Jens J. Krueger

    2007-01-01

    This Monte-Carlo study investigates the sensitivity of the Wilcoxon signed rank test to certain assumption violations in small samples. Emphasis is put on within-sample dependence, between-sample dependence, and the presence of ties. Our results show that both assumption violations induce severe size distortions and entail power losses. Surprisingly, these consequences vary substantially with other properties the data may display. The results provided are particularly relevant for experimental set...
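
    The kind of size distortion studied here can be reproduced in miniature: with heavily tied, discrete differences under the null, the textbook normal approximation to the signed-rank test (no tie or zero corrections) is clearly mis-sized, in this setup conservative. The data-generating choices below are invented for illustration and are not the authors' Monte-Carlo design.

```python
import random
from statistics import NormalDist

def signed_rank_p(diffs):
    """Two-sided Wilcoxon signed-rank p-value via the textbook normal
    approximation, deliberately ignoring tie corrections."""
    d = [x for x in diffs if x != 0]          # drop zeros (Wilcoxon convention)
    n = len(d)
    if n == 0:
        return 1.0
    order = sorted(range(n), key=lambda i: abs(d[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:                              # assign midranks to tied |d|
        j = i
        while j + 1 < n and abs(d[order[j + 1]]) == abs(d[order[i]]):
            j += 1
        for k in range(i, j + 1):
            ranks[order[k]] = (i + j + 2) / 2
        i = j + 1
    w = sum(r for r, x in zip(ranks, d) if x > 0)
    mean = n * (n + 1) / 4
    var = n * (n + 1) * (2 * n + 1) / 24      # nominal variance, no tie correction
    z = (w - mean) / var ** 0.5
    return 2 * (1 - NormalDist().cdf(abs(z)))

def empirical_size(n=15, n_sims=4000, alpha=0.05, seed=3):
    """Rejection rate under H0 for heavily tied, discrete differences."""
    rng = random.Random(seed)
    rej = sum(signed_rank_p([rng.choice([-1, 0, 1]) for _ in range(n)]) < alpha
              for _ in range(n_sims))
    return rej / n_sims

print(empirical_size())  # well below the nominal 0.05
```

    With all nonzero differences tied at |d| = 1, the uncorrected variance formula overstates the true variance of the statistic, so the empirical size falls well short of the nominal level; other data configurations push the distortion in the opposite direction.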

  13. Re-construction of action awareness depends on an internal model of action-outcome timing.

    Science.gov (United States)

    Stenner, Max-Philipp; Bauer, Markus; Machts, Judith; Heinze, Hans-Jochen; Haggard, Patrick; Dolan, Raymond J

    2014-04-01

    The subjective time of an instrumental action is shifted towards its outcome. This temporal binding effect is partially retrospective, i.e., occurs upon outcome perception. Retrospective binding is thought to reflect post-hoc inference on agency based on sensory evidence of the action-outcome association. However, many previous binding paradigms cannot exclude the possibility that retrospective binding results from bottom-up interference of sensory outcome processing with action awareness and is functionally unrelated to the processing of the action-outcome association. Here, we keep bottom-up interference constant and use a contextual manipulation instead. We demonstrate a shift of subjective action time by its outcome in a context of variable outcome timing. Crucially, this shift is absent when there is no such variability. Thus, retrospective action binding reflects a context-dependent, model-based phenomenon. Such top-down re-construction of action awareness seems to bias agency attribution when outcome predictability is low. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  14. [Transabdominal chorionic villus sampling using biopsy forceps or needle: pregnancy outcomes by technique used].

    Science.gov (United States)

    Spallina, J; Anselem, O; Haddad, B; Touboul, C; Tsatsaris, V; Le Ray, C

    2014-11-01

    To compare pregnancy outcomes after transabdominal chorionic villus sampling using biopsy forceps or a needle. Retrospective bicentric study including all women who had a transabdominal chorionic villus sampling between 2005 and 2009 (172 using biopsy forceps and 160 using a needle). The primary endpoint was the rate of fetal loss, after excluding medical abortions due to the result of the biopsy. The secondary endpoint was the rate of premature rupture of the membranes. All cases were reviewed to try to determine the responsibility of the biopsy. Pregnancy outcomes did not differ between the two groups: 4 (4.4%) fetal losses in the biopsy forceps group and 6 (7.4%) in the needle group (P=0.52). Only one case (1.2%) of fetal loss could be attributed to the biopsy, using a needle, and none (0%) following a forceps biopsy (P=0.29). The rate of premature rupture of the membranes was comparable in the two groups. Pregnancy outcomes following chorionic villus sampling using biopsy forceps or a needle seem comparable. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  15. Associations between ADHD symptoms and smoking outcome expectancies in a non-clinical sample of daily cigarette smokers.

    Science.gov (United States)

    Goldenson, Nicholas I; Pang, Raina D; Leventhal, Adam M

    2016-03-01

    Smoking outcome expectancies for positive reinforcement (PR: beliefs that smoking produces desirable outcomes) and negative reinforcement (NR: beliefs that smoking alleviates negative affect) are modifiable cognitive manifestations of affect-mediated smoking motivation. Based on prior data and theory, we hypothesized that NR and PR expectancies are associated with ADHD symptom levels in a non-clinical sample of cigarette smokers. Daily cigarette smokers (N = 256) completed self-report measures of ADHD symptoms and smoking outcome expectancies. Cross-sectional associations of overall ADHD symptomatology and the ADHD symptom dimensions of inattention (IN: difficulty concentrating and distractibility) and hyperactivity-impulsivity (HI: poor inhibitory control and motor restlessness) with PR and NR smoking outcome expectancies were examined. Higher levels of overall, IN, and HI ADHD symptoms were positively associated with NR smoking expectancies after statistically controlling for anxiety, depression, alcohol/drug use problems, nicotine dependence, and other smoking expectancies. Although neither the HI nor the IN symptom dimension exhibited unique relations to NR expectancies over and above the other, the collective variance across IN and HI was associated with NR expectancies. PR expectancies were not associated with ADHD symptoms. Although PR and NR expectancies may be important etiological influences in the overall population of smokers, NR outcome expectancies appear to be disproportionately expressed in smokers with elevated ADHD symptoms. Cognitive manifestations of NR motivation, which may be modifiable via intervention, are prominent in smokers with elevated ADHD symptoms. Beliefs that smoking alleviates negative affect may underlie ADHD-smoking comorbidity. © American Academy of Addiction Psychiatry.

  16. Reliability of impingement sampling designs: An example from the Indian Point station

    International Nuclear Information System (INIS)

    Mattson, M.T.; Waxman, J.B.; Watson, D.A.

    1988-01-01

    A 4-year data base (1976-1979) of daily fish impingement counts at the Indian Point electric power station on the Hudson River was used to compare the precision and reliability of three random-sampling designs: (1) simple random, (2) seasonally stratified, and (3) empirically stratified. The precision of daily impingement estimates improved logarithmically for each design as more days in the year were sampled. Simple random sampling was the least, and empirically stratified sampling was the most precise design, and the difference in precision between the two stratified designs was small. Computer-simulated sampling was used to estimate the reliability of the two stratified-random-sampling designs. A seasonally stratified sampling design was selected as the most appropriate reduced-sampling program for Indian Point station because: (1) reasonably precise and reliable impingement estimates were obtained using this design for all species combined and for eight common Hudson River fish by sampling only 30% of the days in a year (110 d); and (2) seasonal strata may be more precise and reliable than empirical strata if future changes in annual impingement patterns occur. The seasonally stratified design applied to the 1976-1983 Indian Point impingement data showed that selection of sampling dates based on daily species-specific impingement variability gave results that were more precise, but not more consistently reliable, than sampling allocations based on the variability of all fish species combined. 14 refs., 1 fig., 6 tabs
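The precision gain from stratification described in this record can be illustrated with a small simulation. The sketch below is purely illustrative (the seasonal pattern, counts, and strata are invented, not the Indian Point data): it compares the spread of annual-total estimates from simple random sampling and seasonally stratified sampling of roughly 110 days.

```python
import random
import statistics

random.seed(1)

# Hypothetical daily impingement counts for one year: a winter peak of
# roughly +40 fish/day over a summer baseline, plus noise.  This mimics
# the *kind* of seasonality described for Indian Point; it is not the
# station's actual data.
DAYS = 365
counts = [50 + (40 if d < 90 or d >= 300 else 0) + random.gauss(0, 10)
          for d in range(DAYS)]

def srs_estimate(data, n):
    """Annual-total estimate from a simple random sample of n days."""
    sample = random.sample(data, n)
    return len(data) * statistics.mean(sample)

def stratified_estimate(data, strata, n_per_stratum):
    """Annual-total estimate from a seasonally stratified sample."""
    total = 0.0
    for lo, hi in strata:
        stratum = data[lo:hi]
        sample = random.sample(stratum, min(n_per_stratum, len(stratum)))
        total += len(stratum) * statistics.mean(sample)
    return total

strata = [(0, 90), (90, 300), (300, 365)]   # winter / summer / winter

# Repeat each design many times; the spread of the estimates is a direct
# measure of each design's precision.
srs = [srs_estimate(counts, 110) for _ in range(500)]
strat = [stratified_estimate(counts, strata, 37) for _ in range(500)]
print("simple random sd: ", round(statistics.stdev(srs)))
print("stratified sd:    ", round(statistics.stdev(strat)))
```

Because stratification removes the between-season component of variance, the stratified estimates cluster much more tightly, mirroring the record's finding that stratified designs were more precise than simple random sampling at the same sampling effort.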

  17. Conditional estimation of exponential random graph models from snowball sampling designs

    NARCIS (Netherlands)

    Pattison, Philippa E.; Robins, Garry L.; Snijders, Tom A. B.; Wang, Peng

    2013-01-01

    A complete survey of a network in a large population may be prohibitively difficult and costly. So it is important to estimate models for networks using data from various network sampling designs, such as link-tracing designs. We focus here on snowball sampling designs, designs in which the members

  18. Incorporating covariance estimation uncertainty in spatial sampling design for prediction with trans-Gaussian random fields

    Directory of Open Access Journals (Sweden)

    Gunter eSpöck

    2015-05-01

Full Text Available Recently, Spock and Pilz [38] demonstrated that the spatial sampling design problem for the Bayesian linear kriging predictor can be transformed to an equivalent experimental design problem for a linear regression model with stochastic regression coefficients and uncorrelated errors. The stochastic regression coefficients derive from the polar spectral approximation of the residual process. Thus, standard optimal convex experimental design theory can be used to calculate optimal spatial sampling designs. The design functionals considered in Spock and Pilz [38] did not take into account the fact that kriging is actually a plug-in predictor which uses the estimated covariance function. The resulting optimal designs were close to space-filling configurations, because the design criterion did not consider the uncertainty of the covariance function. In this paper we also assume that the covariance function is estimated, e.g., by restricted maximum likelihood (REML). We then develop a design criterion that fully takes account of the covariance uncertainty. The resulting designs are less regular and space-filling compared to those ignoring covariance uncertainty. The new designs, however, also require some closely spaced samples in order to improve the estimate of the covariance function. We also relax the assumption of Gaussian observations and assume that the data is transformed to Gaussianity by means of the Box-Cox transformation. The resulting prediction method is known as trans-Gaussian kriging. We apply the Smith and Zhu [37] approach to this kriging method and show that the resulting optimal designs also depend on the available data. We illustrate our results with a data set of monthly rainfall measurements from Upper Austria.

  19. ANL small-sample calorimeter system design and operation

    International Nuclear Information System (INIS)

    Roche, C.T.; Perry, R.B.; Lewis, R.N.; Jung, E.A.; Haumann, J.R.

    1978-07-01

    The Small-Sample Calorimetric System is a portable instrument designed to measure the thermal power produced by radioactive decay of plutonium-containing fuels. The small-sample calorimeter is capable of measuring samples producing power up to 32 milliwatts at a rate of one sample every 20 min. The instrument is contained in two packages: a data-acquisition module consisting of a microprocessor with an 8K-byte nonvolatile memory, and a measurement module consisting of the calorimeter and a sample preheater. The total weight of the system is 18 kg

  20. Applying Universal Design to Disability Service Provision: Outcome Analysis of a Universal Design (UD) Audit

    Science.gov (United States)

    Beck, Tanja; Diaz del Castillo, Patricia; Fovet, Frederic; Mole, Heather; Noga, Brodie

    2014-01-01

    This article presents an outcome analysis of a Universal Design (UD) audit of the various professional facets of a disability service (DS) provider's office on a large North American campus. The context of the audit is a broad campus-wide drive to implement Universal Design for Learning (UDL) in teaching practices. In an effort for consistency…

  1. Orientation-Dependent Handedness and Chiral Design

    Directory of Open Access Journals (Sweden)

    Efi Efrati

    2014-01-01

    Full Text Available Chirality occupies a central role in fields ranging from biological self-assembly to the design of optical metamaterials. The definition of chirality, as given by Lord Kelvin, associates chirality with the lack of mirror symmetry: the inability to superpose an object on its mirror image. While this definition has guided the classification of chiral objects for over a century, the quantification of handed phenomena based on this definition has proven elusive, if not impossible, as manifest in the paradox of chiral connectedness. In this work, we put forward a quantification scheme in which the handedness of an object depends on the direction in which it is viewed. While consistent with familiar chiral notions, such as the right-hand rule, this framework allows objects to be simultaneously right and left handed. We demonstrate this orientation dependence in three different systems—a biomimetic elastic bilayer, a chiral propeller, and an optical metamaterial—and find quantitative agreement with chirality pseudotensors whose form we explicitly compute. The use of this approach resolves the existing paradoxes and naturally enables the design of handed metamaterials from symmetry principles.

  2. The Effects of Game Design on Learning Outcomes

    Science.gov (United States)

    Martin, Michael W.; Shen, Yuzhong

    2014-01-01

    This article details the administration and results of an experiment conducted to assess the impact of three video game design concepts upon learning outcomes. The principles tested include game aesthetics, player choice, and player competition. The experiment participants were asked to play a serious game over the course of a week, and the…

  3. Baseline Design Compliance Matrix for the Rotary Mode Core Sampling System

    International Nuclear Information System (INIS)

    LECHELT, J.A.

    2000-01-01

    The purpose of the design compliance matrix (DCM) is to provide a single-source document of all design requirements associated with the fifteen subsystems that make up the rotary mode core sampling (RMCS) system. It is intended to be the baseline requirement document for the RMCS system and to be used in governing all future design and design verification activities associated with it. This document is the DCM for the RMCS system used on Hanford single-shell radioactive waste storage tanks. This includes the Exhauster System, Rotary Mode Core Sample Trucks, Universal Sampling System, Diesel Generator System, Distribution Trailer, X-Ray Cart System, Breathing Air Compressor, Nitrogen Supply Trailer, Casks and Cask Truck, Service Trailer, Core Sampling Riser Equipment, Core Sampling Support Trucks, Foot Clamp, Ramps and Platforms and Purged Camera System. Excluded items are tools such as light plants and light stands. Other items such as the breather inlet filter are covered by a different design baseline. In this case, the inlet breather filter is covered by the Tank Farms Design Compliance Matrix

  4. Antisocial personality disorder predicts methamphetamine treatment outcomes in homeless, substance-dependent men who have sex with men.

    Science.gov (United States)

    Fletcher, Jesse B; Reback, Cathy J

    2013-09-01

    One hundred thirty-one homeless, substance-dependent MSM were enrolled in a randomized controlled trial to assess the efficacy of a contingency management (CM) intervention for reducing substance use and increasing healthy behavior. Participants were randomized into conditions that either provided additional rewards for substance abstinence and/or health-promoting/prosocial behaviors ("CM-full"; n=64) or for study compliance and attendance only ("CM-lite"; n=67). The purpose of this secondary analysis was to determine the effect of ASPD status on two primary study outcomes: methamphetamine abstinence and engagement in prosocial/health-promoting behavior. Analyses revealed that individuals with ASPD provided more methamphetamine-negative urine samples (37.5%) than participants without ASPD (30.6%). When controlling for participant sociodemographics and condition assignment, the magnitude of this predicted difference increased to 10% and reached statistical significance (p<.05). On average, participants with ASPD earned fewer vouchers for health-promoting/prosocial behaviors than participants without ASPD ($10.21 [SD=$7.02] versus $18.38 [SD=$13.60]; p<.01). Participants with ASPD displayed superior methamphetamine abstinence outcomes regardless of CM schedule; even with potentially unlimited positive reinforcement, individuals with ASPD displayed suboptimal outcomes in achieving health-promoting/prosocial behaviors. Copyright © 2013 Elsevier Inc. All rights reserved.

  5. Design dependencies within the automatic generation of hypermedia presentations

    NARCIS (Netherlands)

    O. Rosell Martinez

    2002-01-01

    Many dependencies appear between the different stages of the creation of a hypermedia presentation. These dependencies have to be taken into account while designing a system for their automatic generation. In this work we study two of them and propose some techniques to treat them.

  6. Randomized controlled trial of attention bias modification in a racially diverse, socially anxious, alcohol dependent sample.

    Science.gov (United States)

    Clerkin, Elise M; Magee, Joshua C; Wells, Tony T; Beard, Courtney; Barnett, Nancy P

    2016-12-01

    Attention biases may be an important treatment target for both alcohol dependence and social anxiety. This is the first ABM trial to investigate two (vs. one) targets of attention bias within a sample with co-occurring symptoms of social anxiety and alcohol dependence. Additionally, we used trial-level bias scores (TL-BS) to capture the phenomena of attention bias in a more ecologically valid, dynamic way compared to traditional attention bias scores. Adult participants (N = 86; 41% Female; 52% African American; 40% White) with elevated social anxiety symptoms and alcohol dependence were randomly assigned to an 8-session training condition in this 2 (Social Anxiety ABM vs. Social Anxiety Control) by 2 (Alcohol ABM vs. Alcohol Control) design. Symptoms of social anxiety, alcohol dependence, and attention bias were assessed across time. Multilevel models estimated the trajectories for each measure within individuals, and tested whether these trajectories differed according to the randomized training conditions. Across time, there were significant or trending decreases in all attention TL-BS parameters (but not traditional attention bias scores) and most symptom measures. However, there were not significant differences in the trajectories of change between any ABM and control conditions for any symptom measures. These findings add to previous evidence questioning the robustness of ABM and point to the need to extend the effects of ABM to samples that are racially diverse and/or have co-occurring psychopathology. The results also illustrate the potential importance of calculating trial-level attention bias scores rather than only including traditional bias scores. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Randomized Controlled Trial of Attention Bias Modification in a Racially Diverse, Socially Anxious, Alcohol Dependent Sample

    Science.gov (United States)

    Clerkin, Elise M.; Magee, Joshua C.; Wells, Tony T.; Beard, Courtney; Barnett, Nancy P.

    2016-01-01

    Objective Attention biases may be an important treatment target for both alcohol dependence and social anxiety. This is the first ABM trial to investigate two (vs. one) targets of attention bias within a sample with co-occurring symptoms of social anxiety and alcohol dependence. Additionally, we used trial-level bias scores (TL-BS) to capture the phenomena of attention bias in a more ecologically valid, dynamic way compared to traditional attention bias scores. Method Adult participants (N=86; 41% Female; 52% African American; 40% White) with elevated social anxiety symptoms and alcohol dependence were randomly assigned to an 8-session training condition in this 2 (Social Anxiety ABM vs. Social Anxiety Control) by 2 (Alcohol ABM vs. Alcohol Control) design. Symptoms of social anxiety, alcohol dependence, and attention bias were assessed across time. Results Multilevel models estimated the trajectories for each measure within individuals, and tested whether these trajectories differed according to the randomized training conditions. Across time, there were significant or trending decreases in all attention TL-BS parameters (but not traditional attention bias scores) and most symptom measures. However, there were not significant differences in the trajectories of change between any ABM and control conditions for any symptom measures. Conclusions These findings add to previous evidence questioning the robustness of ABM and point to the need to extend the effects of ABM to samples that are racially diverse and/or have co-occurring psychopathology. The results also illustrate the potential importance of calculating trial-level attention bias scores rather than only including traditional bias scores. PMID:27591918

  8. A Frequency Domain Design Method For Sampled-Data Compensators

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Jannerup, Ole Erik

    1990-01-01

    A new approach to the design of a sampled-data compensator in the frequency domain is investigated. The starting point is a continuous-time compensator for the continuous-time system which satisfy specific design criteria. The new design method will graphically show how the discrete...

  9. Cocaine dependent individuals discount future rewards more than future losses for both cocaine and monetary outcomes.

    Science.gov (United States)

    Johnson, Matthew W; Bruner, Natalie R; Johnson, Patrick S

    2015-01-01

    Cocaine dependence and other forms of drug dependence are associated with steeper devaluation of future outcomes (delay discounting). Although studies in this domain have typically assessed choices between monetary gains (e.g., receive less money now versus receive more money after a delay), delay discounting is also applicable to decisions involving losses (e.g., small loss now versus larger delayed loss), with gains typically discounted more than losses (the "sign effect"). It is also known that drugs are discounted more than equivalently valued money. In the context of drug dependence, however, relatively little is known about the discounting of delayed monetary and drug losses and the presence of the sign effect. In this within-subject, laboratory study, delay discounting for gains and losses was assessed for cocaine and money outcomes in cocaine-dependent individuals (n=89). Both cocaine and monetary gains were discounted at significantly greater rates than cocaine and monetary losses, respectively (i.e., the sign effect). Cocaine gains were discounted significantly more than monetary gains, but cocaine and monetary losses were discounted similarly. Results suggest that cocaine is discounted by cocaine-dependent individuals in a systematic manner similar to other rewards. Because the sign effect was shown for both cocaine and money, delayed aversive outcomes may generally have greater impact than delayed rewards in shaping present behavior in this population. Copyright © 2014. Published by Elsevier Ltd.
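The abstract does not state which discounting model was fitted; a common choice in this literature is Mazur's hyperbolic model, V = A/(1 + kD), where A is the outcome amount, D the delay, and k the discount rate. The sketch below uses invented discount rates to illustrate the sign effect described above (steeper discounting of gains than of losses):

```python
def hyperbolic_value(amount, delay, k):
    """Mazur's hyperbolic model: subjective value of an outcome of size
    `amount` received after `delay`, with discount rate k."""
    return amount / (1.0 + k * delay)

# Invented discount rates: gains discounted more steeply than losses,
# i.e. the "sign effect".  These are illustrative values, not estimates
# from the study.
K_GAIN, K_LOSS = 0.05, 0.01
for d in (0, 7, 30, 180, 365):                    # delays in days
    gain = hyperbolic_value(100, d, K_GAIN)       # value of a $100 gain
    loss = hyperbolic_value(100, d, K_LOSS)       # impact of a $100 loss
    print(f"delay {d:3d} d: gain feels like ${gain:6.2f}, "
          f"loss feels like ${loss:6.2f}")
```

With these rates, a year-delayed gain has lost far more of its subjective value than a year-delayed loss, which is exactly the asymmetry the study reports for both money and cocaine.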

  10. Time-dependent importance sampling in semiclassical initial value representation calculations for time correlation functions.

    Science.gov (United States)

    Tao, Guohua; Miller, William H

    2011-07-14

    An efficient time-dependent importance sampling method is developed for the Monte Carlo calculation of time correlation functions via the initial value representation (IVR) of semiclassical (SC) theory. A prefactor-free time-dependent sampling function weights the importance of a trajectory based on the magnitude of its contribution to the time correlation function, and global trial moves are used to facilitate efficient sampling of the phase space of initial conditions. The method can be applied generally to sampling rare events efficiently while avoiding becoming trapped in a local region of the phase space. Results presented in the paper for two system-bath models demonstrate the efficiency of this new importance sampling method for full SC-IVR calculations.
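The SC-IVR machinery itself is beyond a short example, but the underlying idea of importance sampling, drawing from a distribution concentrated where the integrand contributes and reweighting by the ratio of densities, can be sketched on a toy problem (a Gaussian tail probability; all details here are illustrative, not from the paper):

```python
import math
import random

random.seed(0)

def normal_pdf(x, mu=0.0):
    """Unit-width normal density centred at mu."""
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

def f(x):
    """'Rare event' observable: contributes only in the tail x > 3."""
    return 1.0 if x > 3.0 else 0.0

def plain_mc(n):
    """Naive Monte Carlo: almost all samples miss the tail."""
    return sum(f(random.gauss(0, 1)) for _ in range(n)) / n

def importance_mc(n, mu=3.0):
    """Draw from a proposal centred on the important region and reweight
    each draw by the density ratio p(x)/q(x)."""
    total = 0.0
    for _ in range(n):
        x = random.gauss(mu, 1)
        total += f(x) * normal_pdf(x) / normal_pdf(x, mu)
    return total / n

exact = 0.5 * math.erfc(3 / math.sqrt(2))   # P(X > 3), X standard normal
print(f"exact       {exact:.6f}")
print(f"plain MC    {plain_mc(20000):.6f}")
print(f"importance  {importance_mc(20000):.6f}")
```

The weighted estimator converges far faster than the naive one because nearly every proposal sample lands where f is nonzero; this is the same variance-reduction principle the paper exploits for trajectory contributions to correlation functions.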

  11. Enhancing sampling design in mist-net bat surveys by accounting for sample size optimization

    OpenAIRE

    Trevelin, Leonardo Carreira; Novaes, Roberto Leonan Morim; Colas-Rosas, Paul François; Benathar, Thayse Cristhina Melo; Peres, Carlos A.

    2017-01-01

    The advantages of mist-netting, the main technique used in Neotropical bat community studies to date, include logistical implementation, standardization and sampling representativeness. Nonetheless, study designs still have to deal with issues of detectability related to how different species behave and use the environment. Yet there is considerable sampling heterogeneity across available studies in the literature. Here, we approach the problem of sample size optimization. We evaluated the co...

  12. Designing clinical trials for assessing the effects of cognitive training and physical activity interventions on cognitive outcomes: The Seniors Health and Activity Research Program Pilot (SHARP-P Study, a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Rejeski W Jack

    2011-05-01

    Full Text Available Abstract Background The efficacy of non-pharmacological intervention approaches such as physical activity, strength, and cognitive training for improving brain health has not been established. Before definitive trials are mounted, important design questions on participation/adherence, training, and intervention effects must be answered to more fully inform a full-scale trial. Methods SHARP-P was a single-blinded randomized controlled pilot trial of a 4-month physical activity training intervention (PA) and/or cognitive training intervention (CT) in a 2 × 2 factorial design with a health education control condition in 73 community-dwelling persons, aged 70-85 years, who were at risk for cognitive decline but did not have mild cognitive impairment. Results Intervention attendance rates were higher in the CT and PACT groups: CT: 96%, PA: 76%, PACT: 90% (p=0.004); the interventions produced marked changes in cognitive and physical performance measures (p≤0.05), and retention rates exceeded 90%. There were no statistically significant differences in 4-month changes in composite scores of cognitive, executive, and episodic memory function among arms. Four-month improvements in the composite measure increased with age among participants assigned to physical activity training but decreased with age for other participants (intervention*age interaction p = 0.01). Depending on the choice of outcome, two-armed full-scale trials may require fewer than 1,000 participants (continuous outcome) or 2,000 participants (categorical outcome). Conclusions Good levels of participation, adherence, and retention appear to be achievable for participants through age 85 years. Care should be taken to ensure that an attention control condition does not attenuate intervention effects. Depending on the choice of outcome measures, the necessary sample sizes to conduct four-year trials appear to be feasible. Trial Registration Clinicaltrials.gov Identifier: NCT00688155
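Sample-size figures of the kind quoted above come from the standard normal-approximation formula for a two-arm comparison of means, n per arm = 2·sd²·(z₁₋α/₂ + z_power)²/Δ². The sketch below uses illustrative assumptions (a 0.25-SD effect at 80% power), not numbers taken from the trial:

```python
import math
from statistics import NormalDist

def per_arm_n(delta, sd, alpha=0.05, power=0.80):
    """Normal-approximation sample size per arm for a two-sample
    comparison of means: n = 2 * sd^2 * (z_[1-a/2] + z_[power])^2 / delta^2."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return math.ceil(2 * (sd / delta) ** 2 * (z_a + z_b) ** 2)

# Illustrative only: detecting a 0.25-SD difference in a standardized
# continuous outcome at 80% power lands in the "fewer than 1,000
# participants" range the abstract mentions (two arms combined).
n = per_arm_n(delta=0.25, sd=1.0)
print(n, "per arm,", 2 * n, "total")
```

Halving the detectable effect quadruples the required n, which is why the abstract's required sample size depends so strongly on the choice of outcome measure.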

  13. Within-otolith variability in chemical fingerprints: implications for sampling designs and possible environmental interpretation.

    Directory of Open Access Journals (Sweden)

    Antonio Di Franco

    Full Text Available Largely used as a natural biological tag in studies of dispersal/connectivity of fish, otolith elemental fingerprinting is usually analyzed by laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS). LA-ICP-MS produces an elemental fingerprint at a discrete time-point in the life of a fish and can generate data on within-otolith variability of that fingerprint. The presence of within-otolith variability has been previously acknowledged but not incorporated into experimental designs on the presumed, but untested, grounds of both its negligibility compared to among-otolith variability and of spatial autocorrelation among multiple ablations within an otolith. Here, using a hierarchical sampling design of spatial variation at multiple scales in otolith chemical fingerprints for two Mediterranean coastal fishes, we explore: 1) whether multiple ablations within an otolith can be used as independent replicates for significance tests among otoliths, and 2) the implications of incorporating within-otolith variability when assessing spatial variability in otolith chemistry at a hierarchy of spatial scales (different fish, from different sites, at different locations on the Apulian Adriatic coast). We find that multiple ablations along the same daily rings do not necessarily exhibit spatial dependency within the otolith and can be used to estimate residual variability in a hierarchical sampling design. Inclusion of within-otolith measurements reveals that individuals at the same site can show significant variability in elemental uptake. Within-otolith variability examined across the spatial hierarchy identifies differences between the two fish species investigated, and this finding leads to discussion of the potential for within-otolith variability to be used as a marker for fish exposure to stressful conditions. We also demonstrate that a 'cost'-optimal allocation of sampling effort should typically include some level of within

  14. Within-otolith variability in chemical fingerprints: implications for sampling designs and possible environmental interpretation.

    Science.gov (United States)

    Di Franco, Antonio; Bulleri, Fabio; Pennetta, Antonio; De Benedetto, Giuseppe; Clarke, K Robert; Guidetti, Paolo

    2014-01-01

    Largely used as a natural biological tag in studies of dispersal/connectivity of fish, otolith elemental fingerprinting is usually analyzed by laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS). LA-ICP-MS produces an elemental fingerprint at a discrete time-point in the life of a fish and can generate data on within-otolith variability of that fingerprint. The presence of within-otolith variability has been previously acknowledged but not incorporated into experimental designs on the presumed, but untested, grounds of both its negligibility compared to among-otolith variability and of spatial autocorrelation among multiple ablations within an otolith. Here, using a hierarchical sampling design of spatial variation at multiple scales in otolith chemical fingerprints for two Mediterranean coastal fishes, we explore: 1) whether multiple ablations within an otolith can be used as independent replicates for significance tests among otoliths, and 2) the implications of incorporating within-otolith variability when assessing spatial variability in otolith chemistry at a hierarchy of spatial scales (different fish, from different sites, at different locations on the Apulian Adriatic coast). We find that multiple ablations along the same daily rings do not necessarily exhibit spatial dependency within the otolith and can be used to estimate residual variability in a hierarchical sampling design. Inclusion of within-otolith measurements reveals that individuals at the same site can show significant variability in elemental uptake. Within-otolith variability examined across the spatial hierarchy identifies differences between the two fish species investigated, and this finding leads to discussion of the potential for within-otolith variability to be used as a marker for fish exposure to stressful conditions. We also demonstrate that a 'cost'-optimal allocation of sampling effort should typically include some level of within-otolith replication in the

  15. Sample design considerations of indoor air exposure surveys

    International Nuclear Information System (INIS)

    Cox, B.G.; Mage, D.T.; Immerman, F.W.

    1988-01-01

    Concern about the potential for indoor air pollution has prompted recent surveys of radon and NO2 concentrations in homes and personal exposure studies of volatile organics, carbon monoxide, and pesticides, to name a few. The statistical problems in designing sample surveys that measure the physical environment are diverse and more complicated than those encountered in traditional surveys of human attitudes and attributes. This paper addresses issues encountered when designing indoor air quality (IAQ) studies. General statistical concepts related to target population definition, frame creation, and sample selection for area household surveys and telephone surveys are presented. The implications of different measurement approaches are discussed, and response rate considerations are described.

  16. Designing key-dependent chaotic S-box with larger key space

    International Nuclear Information System (INIS)

    Yin Ruming; Yuan Jian; Wang Jian; Shan Xiuming; Wang Xiqin

    2009-01-01

    The construction of cryptographically strong substitution boxes (S-boxes) is an important concern in designing secure cryptosystems. The key-dependent S-boxes designed using chaotic maps have received increasing attention in recent years. However, the key space of such S-boxes does not seem to be sufficiently large due to the limited parameter range of discretized chaotic maps. In this paper, we propose a new key-dependent S-box based on the iteration of continuous chaotic maps. We explore the continuous-valued state space of chaotic systems, and devise the discrete mapping between the input and the output of the S-box. A key-dependent S-box is constructed with the logistic map in this paper. We show that its key space could be much larger than the current key-dependent chaotic S-boxes.
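The paper's exact construction is not reproduced here, but the generic recipe for a key-dependent chaotic S-box, iterating a chaotic map from a key-derived initial condition and extracting a permutation from the trajectory, can be sketched as follows (the ranking construction and parameter choices are illustrative assumptions, not the paper's scheme):

```python
def logistic_sbox(x0, r=3.99, size=256, burn_in=1000):
    """Derive a bijective S-box from a logistic-map trajectory.

    The key is (x0, r) with x0 in (0, 1).  After discarding a transient,
    `size` iterates of x -> r*x*(1-x) are collected, and the permutation
    that sorts them is taken as the S-box.
    """
    x = x0
    for _ in range(burn_in):            # discard transient behaviour
        x = r * x * (1 - x)
    traj = []
    for _ in range(size):
        x = r * x * (1 - x)
        traj.append(x)
    # Rank the trajectory values: the sorting permutation is the S-box.
    return sorted(range(size), key=lambda i: traj[i])

sbox = logistic_sbox(0.31415926)
print(sbox[:8])
```

Because the map is chaotic, a change in x0 at the eighth decimal place produces an entirely different permutation after the burn-in, which is the key-sensitivity property such designs rely on; the paper's point is that iterating in the continuous state space makes the usable (x0, r) key space much larger than with discretized maps.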

  17. Missing continuous outcomes under covariate dependent missingness in cluster randomised trials.

    Science.gov (United States)

    Hossain, Anower; Diaz-Ordaz, Karla; Bartlett, Jonathan W

    2017-06-01

    Attrition is a common occurrence in cluster randomised trials and leads to missing outcome data. Two approaches for analysing such trials are cluster-level analysis and individual-level analysis. This paper compares the performance of unadjusted cluster-level analysis, baseline-covariate-adjusted cluster-level analysis, and linear mixed model analysis under baseline-covariate-dependent missingness in continuous outcomes, in terms of bias, average estimated standard error, and coverage probability. The methods of complete records analysis and multiple imputation are used to handle the missing outcome data. We considered four scenarios, with the missingness mechanism and the baseline covariate effect on outcome either the same or different between intervention groups. We show that both unadjusted and baseline-covariate-adjusted cluster-level analysis give unbiased estimates of the intervention effect only if both intervention groups have the same missingness mechanism and there is no interaction between baseline covariate and intervention group. The linear mixed model and multiple imputation give unbiased estimates under all four scenarios considered, provided that an interaction of intervention and baseline covariate is included in the model when appropriate. Cluster mean imputation has been proposed as a valid approach for handling missing outcomes in cluster randomised trials. We show that cluster mean imputation gives unbiased estimates only when the missingness mechanism is the same between the intervention groups and there is no interaction between baseline covariate and intervention group. Multiple imputation shows overcoverage for a small number of clusters in each intervention group.
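Cluster mean imputation, as discussed above, simply replaces each missing outcome with the mean of the observed outcomes in the same cluster. A minimal sketch (hypothetical data; assumes every cluster has at least one observed value):

```python
def cluster_mean_impute(outcomes):
    """Replace each missing outcome (None) with the mean of the observed
    outcomes in the same cluster."""
    imputed = {}
    for cluster, values in outcomes.items():
        observed = [v for v in values if v is not None]
        mean = sum(observed) / len(observed)
        imputed[cluster] = [mean if v is None else v for v in values]
    return imputed

# Hypothetical trial: two clusters, a continuous outcome, some attrition.
trial = {
    "cluster_A": [4.2, 5.1, None, 3.8],
    "cluster_B": [6.0, None, None, 5.5],
}
print(cluster_mean_impute(trial))
```

The simplicity is also the weakness the paper identifies: the imputed values carry no information about why data are missing, so the approach is only unbiased under the restrictive conditions stated above.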

  18. Field Dependence and Vocational Choice of Interior Design Students.

    Science.gov (United States)

    Davis, Diane M.; And Others

    One hundred ninety-three interior design college students were administered the Group Embedded Figures Test, a measure of field dependence, in order to evaluate two of Witkin's hypotheses regarding career choice. The career-differentiation hypothesis predicted that students electing to major in interior design would be field independent because…

  19. Estimating HIES Data through Ratio and Regression Methods for Different Sampling Designs

    Directory of Open Access Journals (Sweden)

    Faqir Muhammad

    2007-01-01

    Full Text Available In this study, a comparison has been made of different sampling designs, using the HIES data of North West Frontier Province (NWFP) for 2001-02 and 1998-99 collected from the Federal Bureau of Statistics, Statistical Division, Government of Pakistan, Islamabad. The performance of the estimators has also been considered using the bootstrap and jackknife. A two-stage stratified random sample design is adopted by HIES. In the first stage, enumeration blocks and villages are treated as the first-stage Primary Sampling Units (PSU). The sample PSUs are selected with probability proportional to size. Secondary Sampling Units (SSU), i.e., households, are selected by systematic sampling with a random start. They have used a single study variable. We have compared the HIES technique with some other designs, which are: Stratified Simple Random Sampling; Stratified Systematic Sampling; Stratified Ranked Set Sampling; and Stratified Two-Phase Sampling. Ratio and regression methods were applied with two study variables: Income (y) and Household size (x). Jackknife and bootstrap are used for variance replication. Simple Random Sampling with sample sizes (462 to 561) gave moderate variances by both jackknife and bootstrap. By applying Systematic Sampling, we obtained moderate variance with sample size (467). In jackknife with Systematic Sampling, the variance of the regression estimator was greater than that of the ratio estimator for sample sizes (467 to 631). At sample size (952), the variance of the ratio estimator became greater than that of the regression estimator. The most efficient design turns out to be Ranked Set Sampling compared with the other designs. Ranked Set Sampling with jackknife and bootstrap gives minimum variance even with the smallest sample size (467). Two-Phase Sampling gave poor performance. The multi-stage sampling applied by HIES gave large variances, especially if used with a single study variable.
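The ratio and regression estimators applied in this study have standard textbook forms: the ratio estimator scales the sample ratio ȳ/x̄ by the known population mean of the auxiliary variable, while the regression estimator adjusts ȳ by the fitted slope times the gap between the population and sample means of x. A sketch with invented income/household-size data (the numbers are illustrative, not HIES values):

```python
def ratio_estimate(y, x, x_pop_mean):
    """Ratio estimator of the population mean of y: scale the sample
    ratio y-bar / x-bar by the known population mean of x."""
    y_bar = sum(y) / len(y)
    x_bar = sum(x) / len(x)
    return (y_bar / x_bar) * x_pop_mean

def regression_estimate(y, x, x_pop_mean):
    """Linear regression estimator: y-bar adjusted by the fitted slope
    times the gap between the population and sample means of x."""
    n = len(y)
    y_bar = sum(y) / n
    x_bar = sum(x) / n
    sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    sxx = sum((xi - x_bar) ** 2 for xi in x)
    return y_bar + (sxy / sxx) * (x_pop_mean - x_bar)

# Invented mini-sample: income (y) and household size (x), with the
# population mean household size known from the sampling frame.
income = [220, 310, 450, 180, 390]
hh_size = [3, 4, 6, 2, 5]
print(ratio_estimate(income, hh_size, 4.2))       # ~325.5
print(regression_estimate(income, hh_size, 4.2))  # ~324.2
```

Both estimators exploit the correlation between income and household size; which one has the smaller variance depends on the sample, which is exactly the jackknife/bootstrap comparison the study carries out.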

  20. Sampling designs matching species biology produce accurate and affordable abundance indices

    Directory of Open Access Journals (Sweden)

    Grant Harris

    2013-12-01

    Full Text Available Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application, the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption of all individuals having an equal probability of capture. Violating this assumption biases results. Does an alternative exist? Perhaps by sampling only where resources attract animals (i.e., targeted sampling), it would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on geographic positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams. This provided a scenario for testing the targeted approach. Grid and targeted sampling varied by trap amount, location (traps placed randomly, systematically or by expert opinion), and traps stationary or moved between capture sessions. We began by identifying when to sample, and if bears had equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates, and probabilities of capture and recapture. One grid (49 km² cells) and one targeted configuration provided the most accurate results. Both placed traps by expert opinion and moved traps between capture sessions
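    As a concrete instance of the two-session capture-recapture estimates such designs feed, here is the Chapman bias-corrected Lincoln-Petersen estimator; the counts are hypothetical, not from the bear study:

    ```python
    def chapman_estimate(n1, n2, m2):
        """Chapman's bias-corrected Lincoln-Petersen abundance estimator.

        n1: animals captured and marked in session 1
        n2: animals captured in session 2
        m2: marked animals among the session-2 captures
        """
        return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

    # Hypothetical counts from two hair-snare capture sessions.
    print(chapman_estimate(30, 28, 12))
    ```

    Unequal capture probabilities (the assumption violation discussed above) bias exactly this kind of estimate, which is why the simulation study compares designs on bias, precision and capture probability.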

  1. Sampling designs matching species biology produce accurate and affordable abundance indices.

    Science.gov (United States)

    Harris, Grant; Farley, Sean; Russell, Gareth J; Butler, Matthew J; Selinger, Jeff

    2013-01-01

    Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application, the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption of all individuals having an equal probability of capture. Violating this assumption biases results. Does an alternative exist? Perhaps by sampling only where resources attract animals (i.e., targeted sampling), it would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on geographic positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams. This provided a scenario for testing the targeted approach. Grid and targeted sampling varied by trap amount, location (traps placed randomly, systematically or by expert opinion), and traps stationary or moved between capture sessions. We began by identifying when to sample, and if bears had equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates, and probabilities of capture and recapture. One grid (49 km² cells) and one targeted configuration provided the most accurate results. Both placed traps by expert opinion and moved traps between capture sessions

  2. Sampling designs matching species biology produce accurate and affordable abundance indices

    Science.gov (United States)

    Farley, Sean; Russell, Gareth J.; Butler, Matthew J.; Selinger, Jeff

    2013-01-01

    Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application, the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption of all individuals having an equal probability of capture. Violating this assumption biases results. Does an alternative exist? Perhaps by sampling only where resources attract animals (i.e., targeted sampling), it would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on geographic positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams. This provided a scenario for testing the targeted approach. Grid and targeted sampling varied by trap amount, location (traps placed randomly, systematically or by expert opinion), and traps stationary or moved between capture sessions. We began by identifying when to sample, and if bears had equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates, and probabilities of capture and recapture. One grid (49 km² cells) and one targeted configuration provided the most accurate results. 
Both placed traps by expert opinion and moved traps between capture sessions, which

  3. Lagoa Real design. Description and evaluation of sampling system

    International Nuclear Information System (INIS)

    Hashizume, B.K.

    1982-10-01

    This report describes the sample preparation system for drill cores from the Lagoa Real Project, aimed at obtaining a representative fraction of each drill-core half. The combined sampling-plus-analysis error and the analytical accuracy were determined by delayed neutron analysis. (author)

  4. A stratified two-stage sampling design for digital soil mapping in a Mediterranean basin

    Science.gov (United States)

    Blaschek, Michael; Duttmann, Rainer

    2015-04-01

    The quality of environmental modelling results often depends on reliable soil information. In order to obtain soil data in an efficient manner, several sampling strategies are at hand, depending on the level of prior knowledge and the overall objective of the planned survey. This study focuses on the collection of soil samples considering available continuous secondary information in an undulating, 16 km² river catchment near Ussana in southern Sardinia (Italy). A design-based, stratified, two-stage sampling design has been applied, aiming at the spatial prediction of soil property values at individual locations. The stratification was based on quantiles of the density functions of two land-surface parameters - topographic wetness index and potential incoming solar radiation - derived from a digital elevation model. Combined with four main geological units, the applied procedure led to 30 different classes in the given test site. Up to six polygons of each available class were selected randomly, excluding areas smaller than 1 ha to avoid incorrect location of the points in the field. Further exclusion rules were applied before polygon selection, masking out roads and buildings using a 20 m buffer. The selection procedure was repeated ten times and the set of polygons with the best geographical spread was chosen. Finally, exact point locations were selected randomly from inside the chosen polygon features. A second selection based on the same stratification and following the same methodology (selecting one polygon instead of six) was made in order to create an appropriate validation set. Supplementary samples were obtained during a second survey focusing on polygons that had either not been considered during the first phase at all or were not adequately represented with respect to feature size. In total, both field campaigns produced an interpolation set of 156 samples and a validation set of 41 points. 
The selection of sample point locations has been done using
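    The quantile-based stratification step described above can be sketched as follows; the cell values, tercile counts and geological codes are invented for illustration:

    ```python
    import random

    random.seed(0)
    # Hypothetical raster cells: topographic wetness index (TWI), potential
    # incoming solar radiation (PISR) and a geological unit code.
    cells = [{"twi": random.uniform(2, 14),
              "pisr": random.uniform(600, 1100),
              "geology": random.choice("ABCD")}
             for _ in range(500)]

    def quantile_breaks(values, n_classes):
        # Break points splitting the sorted values into n_classes groups.
        s = sorted(values)
        return [s[len(s) * k // n_classes] for k in range(1, n_classes)]

    def classify(value, breaks):
        # Index of the quantile class the value falls into (0-based).
        return sum(value >= b for b in breaks)

    twi_breaks = quantile_breaks([c["twi"] for c in cells], 3)
    pisr_breaks = quantile_breaks([c["pisr"] for c in cells], 3)

    # Stratum = TWI tercile x PISR tercile x geological unit.
    for c in cells:
        c["stratum"] = (classify(c["twi"], twi_breaks),
                        classify(c["pisr"], pisr_breaks),
                        c["geology"])

    print(len({c["stratum"] for c in cells}))  # at most 3 * 3 * 4 = 36 strata
    ```

    Polygons would then be drawn at random within each stratum, as in the survey's first sampling stage.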

  5. Cumulative risk, cumulative outcome: a 20-year longitudinal study.

    Directory of Open Access Journals (Sweden)

    Leslie Atkinson

    Full Text Available Cumulative risk (CR) models provide some of the most robust findings in the developmental literature, predicting numerous and varied outcomes. Typically, however, these outcomes are predicted one at a time, across different samples, using concurrent designs, longitudinal designs of short duration, or retrospective designs. We predicted that a single CR index, applied within a single sample, would prospectively predict diverse outcomes, i.e., depression, intelligence, school dropout, arrest, smoking, and physical disease from childhood to adulthood. Further, we predicted that number of risk factors would predict number of adverse outcomes (cumulative outcome; CO). We also predicted that early CR (assessed at age 5/6) explains variance in CO above and beyond that explained by subsequent risk (assessed at ages 12/13 and 19/20). The sample consisted of 284 individuals, 48% of whom were diagnosed with a speech/language disorder. Cumulative risk, assessed at 5/6-, 12/13-, and 19/20-years-old, predicted aforementioned outcomes at age 25/26 in every instance. Furthermore, number of risk factors was positively associated with number of negative outcomes. Finally, early risk accounted for variance beyond that explained by later risk in the prediction of CO. We discuss these findings in terms of five criteria posed by these data, positing a "mediated net of adversity" model, suggesting that CR may increase some central integrative factor, simultaneously augmenting risk across cognitive, quality of life, psychiatric and physical health outcomes.

  6. Graph Transformation and Designing Parallel Sparse Matrix Algorithms beyond Data Dependence Analysis

    Directory of Open Access Journals (Sweden)

    H.X. Lin

    2004-01-01

    Full Text Available Algorithms are often parallelized based on data dependence analysis, either manually or by means of parallel compilers. Some vector/matrix computations, such as matrix-vector products with simple data dependence structures (data parallelism), can be easily parallelized. For problems with more complicated data dependence structures, parallelization is less straightforward. The data dependence graph is a powerful means for designing and analyzing parallel algorithms. However, for sparse matrix computations, parallelization based solely on exploiting the existing parallelism in an algorithm does not always give satisfactory results. For example, the conventional Gaussian elimination algorithm for the solution of a tri-diagonal system is inherently sequential, so algorithms have to be designed specifically for parallel computation. After briefly reviewing different parallelization approaches, a powerful graph formalism for designing parallel algorithms is introduced. This formalism is discussed using a tri-diagonal system as an example. Its application to general matrix computations is also discussed. Its power in designing parallel algorithms beyond the ability of data dependence analysis is shown by means of a new algorithm called ACER (Alternating Cyclic Elimination and Reduction).
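    A compact sketch of the cyclic reduction idea for tri-diagonal systems, the kind of restructured elimination the abstract alludes to (this is the textbook scheme, not the paper's ACER algorithm itself):

    ```python
    def cyclic_reduction(a, b, c, d):
        """Solve a tridiagonal system by cyclic reduction.

        Row i reads a[i]*x[i-1] + b[i]*x[i] + c[i]*x[i+1] = d[i], with
        a[0] == 0 and c[-1] == 0.  Requires len(b) == 2**k - 1.
        """
        n = len(b)
        if n == 1:
            return [d[0] / b[0]]
        na, nb, nc, nd = [], [], [], []
        # Eliminate the even-position unknowns (0-based) by combining each
        # odd row with its two neighbours; the combinations within one
        # level are mutually independent, hence parallelizable.
        for i in range(1, n, 2):
            al = a[i] / b[i - 1]
            ga = c[i] / b[i + 1]
            na.append(-al * a[i - 1])
            nb.append(b[i] - al * c[i - 1] - ga * a[i + 1])
            nc.append(-ga * c[i + 1])
            nd.append(d[i] - al * d[i - 1] - ga * d[i + 1])
        inner = cyclic_reduction(na, nb, nc, nd)   # half-sized system
        # Back-substitute the eliminated unknowns.
        x = [0.0] * n
        for j, i in enumerate(range(1, n, 2)):
            x[i] = inner[j]
        for i in range(0, n, 2):
            left = x[i - 1] if i > 0 else 0.0
            right = x[i + 1] if i < n - 1 else 0.0
            x[i] = (d[i] - a[i] * left - c[i] * right) / b[i]
        return x
    ```

    With n = 2^k - 1 unknowns the recursion bottoms out after k levels; unlike sequential Gaussian elimination, every row combination within a level can run simultaneously on a parallel machine.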

  7. Design review report for rotary mode core sample truck (RMCST) modifications for flammable gas tanks, preliminary design

    International Nuclear Information System (INIS)

    Corbett, J.E.

    1996-02-01

    This report documents the completion of a preliminary design review for the Rotary Mode Core Sample Truck (RMCST) modifications for flammable gas tanks. The RMCST modifications are intended to support core sampling operations in waste tanks requiring flammable gas controls. The objective of this review was to validate basic design assumptions and concepts to support a path forward leading to a final design. The conclusion reached by the review committee was that the design was acceptable and efforts should continue toward a final design review

  8. Mechanical design and simulation of an automatized sample exchanger

    International Nuclear Information System (INIS)

    Lopez, Yon; Gora, Jimmy; Bedregal, Patricia; Hernandez, Yuri; Baltuano, Oscar; Gago, Javier

    2013-01-01

    The design of a turntable-type sample exchanger for irradiation, with a capacity of up to 20 capsules, was performed. Its function is the automatic sending of samples contained in polyethylene capsules for irradiation in the grid position of the reactor core, using a pneumatic system, with further analysis by neutron activation. This study shows the structural design analysis and the calculations used in selecting motors and actuators. This development will improve efficiency in the analysis, reducing manual handling by the workers and also their radiation exposure time. (authors).

  9. Topology optimization considering design-dependent Stokes flow loads

    NARCIS (Netherlands)

    Picelli, R.; Vicente, W.M.; Pavanello, R.; van Keulen, A.; Li, Qing; Steven, Grant P.; Zhang, Zhongpu

    2015-01-01

    This article presents an evolutionary topology optimization method for mean compliance minimization of structures under design-dependent viscous fluid flow loads. The structural domain is governed by the elasticity equation and the fluid by the incompressible Stokes flow equations. When the

  10. [Sampling and measurement methods of the protocol design of the China Nine-Province Survey for blindness, visual impairment and cataract surgery].

    Science.gov (United States)

    Zhao, Jia-liang; Wang, Yu; Gao, Xue-cheng; Ellwein, Leon B; Liu, Hu

    2011-09-01

    To design the protocol of the China nine-province survey for blindness, visual impairment and cataract surgery, in order to evaluate the prevalence and main causes of blindness and visual impairment, and the prevalence and outcomes of cataract surgery. Protocol design began after accepting the task for the national survey for blindness, visual impairment and cataract surgery from the Department of Medicine, Ministry of Health, China, in November 2005. The protocols of the Beijing Shunyi Eye Study in 1996 and the Guangdong Doumen County Eye Study in 1997, both supported by the World Health Organization, were taken as the basis for the protocol design. Relevant experts were invited to discuss and refine the draft protocol. An international advisory committee was established to examine and approve the draft protocol. Finally, the survey protocol was checked and approved by the Department of Medicine, Ministry of Health, China, and the Prevention of Blindness and Deafness Programme, WHO. The survey protocol was designed according to the characteristics and the scale of the survey. The contents of the protocol included determination of the target population and survey sites, calculation of the sample size, design of the random sampling, composition and organization of the survey teams, determination of the examinees, the flowchart of the field work, survey items and methods, diagnostic criteria for blindness and for moderate and severe visual impairment, quality-control measures, and data-management methods. The designed protocol became the standard and practical protocol for the survey to evaluate the prevalence and main causes of blindness and visual impairment, and the prevalence and outcomes of cataract surgery.
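    The sample-size calculation such a survey protocol typically specifies can be sketched with the standard prevalence formula; the prevalence, precision, design effect and response rate below are illustrative assumptions, not the survey's actual parameters:

    ```python
    import math

    def survey_sample_size(p, margin, deff=2.0, confidence=1.96, response=0.9):
        """Cluster-survey sample size for estimating a prevalence.

        p: expected prevalence, margin: absolute precision,
        deff: design effect for multistage cluster sampling,
        response: anticipated response rate.
        """
        n_srs = confidence ** 2 * p * (1 - p) / margin ** 2
        return math.ceil(n_srs * deff / response)

    # e.g. an assumed blindness prevalence of 2% estimated to +/-0.5%:
    print(survey_sample_size(0.02, 0.005))
    ```

    The design effect inflates the simple-random-sampling size to account for within-cluster correlation in a multistage design.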

  11. Evidence-based architectural and space design supports Magnet® empirical outcomes.

    Science.gov (United States)

    Ecoff, Laurie; Brown, Caroline E

    2010-12-01

    This department expands nursing leaders' knowledge and competencies in health facility design. The editor of this department, Dr Jaynelle Stichler, asked guest authors, Drs Ecoff and Brown, to describe the process of using the conceptual models of a nursing evidence-based practice model and the Magnet Recognition Program® as a structured process to lead decision making in the planning and design processes and to achieve desired outcomes in hospital design.

  12. Functional brain networks associated with cognitive control, cocaine dependence, and treatment outcome.

    Science.gov (United States)

    Worhunsky, Patrick D; Stevens, Michael C; Carroll, Kathleen M; Rounsaville, Bruce J; Calhoun, Vince D; Pearlson, Godfrey D; Potenza, Marc N

    2013-06-01

    Individuals with cocaine dependence often evidence poor cognitive control. The purpose of this exploratory study was to investigate networks of functional connectivity underlying cognitive control in cocaine dependence and examine the relationship of the networks to the disorder and its treatment. Independent component analysis (ICA) was applied to fMRI data to investigate if regional activations underlying cognitive control processes operate in functional networks, and whether these networks relate to performance and treatment outcome measures in cocaine dependence. Twenty patients completed a Stroop task during fMRI prior to entering outpatient treatment and were compared to 20 control participants. ICA identified five distinct functional networks related to cognitive control interference events. Cocaine-dependent patients displayed differences in performance-related recruitment of three networks. Reduced involvement of a "top-down" fronto-cingular network contributing to conflict monitoring correlated with better treatment retention. Greater engagement of two "bottom-up" subcortical and ventral prefrontal networks related to cue-elicited motivational processing correlated with abstinence during treatment. The identification of subcortical networks linked to cocaine abstinence and cortical networks to treatment retention suggests that specific circuits may represent important, complementary targets in treatment development for cocaine dependence. 2013 APA, all rights reserved

  13. Conservative Sample Size Determination for Repeated Measures Analysis of Covariance.

    Science.gov (United States)

    Morgan, Timothy M; Case, L Douglas

    2013-07-05

    In the design of a randomized clinical trial with one pre and multiple post randomized assessments of the outcome variable, one needs to account for the repeated measures in determining the appropriate sample size. Unfortunately, one seldom has a good estimate of the variance of the outcome measure, let alone the correlations among the measurements over time. We show how sample sizes can be calculated by making conservative assumptions regarding the correlations for a variety of covariance structures. The most conservative choice for the correlation depends on the covariance structure and the number of repeated measures. In the absence of good estimates of the correlations, the sample size is often based on a two-sample t-test, making the 'ultra' conservative and unrealistic assumption that there are zero correlations between the baseline and follow-up measures while at the same time assuming there are perfect correlations between the follow-up measures. Compared to the case of taking a single measurement, substantial savings in sample size can be realized by accounting for the repeated measures, even with very conservative assumptions regarding the parameters of the assumed correlation matrix. Assuming compound symmetry, the sample size from the two-sample t-test calculation can be reduced at least 44%, 56%, and 61% for repeated measures analysis of covariance by taking 2, 3, and 4 follow-up measures, respectively. The results offer a rational basis for determining a fairly conservative, yet efficient, sample size for clinical trials with repeated measures and a baseline value.
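    The conservative savings quoted above can be reproduced with the compound-symmetry variance factor for ANCOVA with one baseline and r follow-up measures (a Frison-and-Pocock-style formula; treat the exact expression as this sketch's assumption):

    ```python
    def ancova_variance_factor(r, rho):
        """Variance multiplier for the treatment effect estimated by ANCOVA
        with one baseline and r equally correlated follow-up measures
        (compound symmetry, common correlation rho)."""
        return (1 + (r - 1) * rho) / r - rho ** 2

    # The most conservative rho is the one maximizing the factor; relative
    # to a two-sample t-test on a single measurement (factor 1), the
    # guaranteed saving in sample size is then:
    for r in (2, 3, 4):
        worst = max(ancova_variance_factor(r, k / 1000) for k in range(1001))
        print(r, round(100 * (1 - worst)))  # 2 44, 3 56, 4 61 (percent saved)
    ```

    Even at the worst-case correlation, the factor stays well below 1, which is exactly the 44%, 56% and 61% reductions reported for 2, 3 and 4 follow-up measures.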

  14. Identifying Indicators Related to Constructs for Engineering Design Outcome

    Science.gov (United States)

    Wilhelmsen, Cheryl A.; Dixon, Raymond A.

    2016-01-01

    This study ranked constructs articulated by Childress and Rhodes (2008) and identified the key indicators for each construct as a starting point to explore what should be included on an instrument to measure the engineering design process and outcomes of students in high schools that use the PLTW and EbD™ curricula in Idaho. A case-study design…

  15. Design development of robotic system for on line sampling in fuel reprocessing

    International Nuclear Information System (INIS)

    Balasubramanian, G.R.; Venugopal, P.R.; Padmashali, G.K.

    1990-01-01

    This presentation describes the design and developmental work being carried out for an automated sampling system for fast reactor fuel reprocessing plants. The plant proposes to use an integrated sampling system. The sample is taken across regular process streams from any intermediate hold-up pot. A robot system is planned to take the sample from the sample pot, transfer it to the sample bottle, cap the bottle and transfer the bottle to a pneumatic conveying station. The system covers a large number of sample pots. Alternate automated systems are also examined (1). (author). 4 refs., 2 figs

  16. Sample dependent response of a LaCl{sub 3}:Ce detector in prompt gamma neutron activation analysis of bulk hydrocarbon samples

    Energy Technology Data Exchange (ETDEWEB)

    Naqvi, A.A., E-mail: aanaqvi@kfupm.edu.sa [Department of Physics, King Fahd University of Petroleum and Minerals, Dhahran (Saudi Arabia); Al-Matouq, Faris A.; Khiari, F.Z. [Department of Physics, King Fahd University of Petroleum and Minerals, Dhahran (Saudi Arabia); Isab, A.A. [Department of Chemistry, King Fahd University of Petroleum and Minerals, Dhahran (Saudi Arabia); Khateeb-ur-Rehman,; Raashid, M. [Department of Physics, King Fahd University of Petroleum and Minerals, Dhahran (Saudi Arabia)

    2013-08-11

    The response of a LaCl{sub 3}:Ce detector has been found to depend upon the hydrogen content of bulk samples in prompt gamma analysis using 14 MeV neutron inelastic scattering. The moderation of 14 MeV neutrons from hydrogen in the bulk sample produces thermal neutrons around the sample which ultimately excite chlorine capture gamma rays in the LaCl{sub 3}:Ce detector material. Interference of 6.11 MeV chlorine gamma rays from the detector itself with 6.13 MeV oxygen gamma rays from the bulk samples makes the intensity of the 6.13 MeV oxygen gamma ray peak relatively insensitive to variations in oxygen concentration. The strong dependence of the 1.95 MeV doublet chlorine gamma ray yield on hydrogen content of the bulk samples confirms fast neutron moderation from hydrogen in the bulk samples as a major source of production of thermal neutrons and chlorine gamma rays in the LaCl{sub 3}:Ce detector material. Despite their poor oxygen detection capabilities, these detectors nonetheless have excellent detection capabilities for hydrogen and carbon in benzene, butyl alcohol, propanol, propanic acid, and formic acid bulk samples using 14 MeV neutron inelastic scattering.

  17. Bayesian dose selection design for a binary outcome using restricted response adaptive randomization.

    Science.gov (United States)

    Meinzer, Caitlyn; Martin, Renee; Suarez, Jose I

    2017-09-08

    In phase II trials, the most efficacious dose is usually not known. Moreover, given limited resources, it is difficult to robustly identify a dose while also testing for a signal of efficacy that would support a phase III trial. Recent designs have sought to be more efficient by exploring multiple doses through the use of adaptive strategies. However, the added flexibility may potentially increase the risk of making incorrect assumptions and reduce the total amount of information available across the dose range as a function of imbalanced sample size. To balance these challenges, a novel placebo-controlled design is presented in which a restricted Bayesian response adaptive randomization (RAR) is used to allocate a majority of subjects to the optimal dose of active drug, defined as the dose with the lowest probability of poor outcome. However, the allocation between subjects who receive active drug or placebo is held constant to retain the maximum possible power for a hypothesis test of overall efficacy comparing the optimal dose to placebo. The design properties and optimization of the design are presented in the context of a phase II trial for subarachnoid hemorrhage. For a fixed total sample size, a trade-off exists between the ability to select the optimal dose and the probability of rejecting the null hypothesis. This relationship is modified by the allocation ratio between active and control subjects, the choice of RAR algorithm, and the number of subjects allocated to an initial fixed allocation period. While a responsive RAR algorithm improves the ability to select the correct dose, there is an increased risk of assigning more subjects to a worse arm as a function of ephemeral trends in the data. A subarachnoid treatment trial is used to illustrate how this design can be customized for specific objectives and available data. 
Bayesian adaptive designs are a flexible approach to addressing multiple questions surrounding the optimal dose for treatment efficacy
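    A toy version of such a restricted response-adaptive scheme, using Thompson-style draws over Beta posteriors with a fixed placebo fraction (the dose probabilities and the 2:1 active:placebo split are invented for illustration, not the trial's design):

    ```python
    import random

    random.seed(42)
    # Hypothetical true poor-outcome probabilities for three active doses.
    TRUE_POOR = {"low": 0.45, "mid": 0.30, "high": 0.40}
    poor = {d: 0 for d in TRUE_POOR}   # poor outcomes observed per dose
    ok = {d: 0 for d in TRUE_POOR}     # good outcomes observed per dose
    placebo_n = 0

    for subject in range(300):
        # Restriction: every third subject receives placebo, holding the
        # active:placebo allocation (and hence power vs placebo) constant.
        if subject % 3 == 2:
            placebo_n += 1
            continue
        # Thompson-style draw from each dose's Beta posterior on the
        # probability of poor outcome; allocate to the lowest draw.
        draw = {d: random.betavariate(1 + poor[d], 1 + ok[d])
                for d in TRUE_POOR}
        dose = min(draw, key=draw.get)
        if random.random() < TRUE_POOR[dose]:
            poor[dose] += 1
        else:
            ok[dose] += 1

    totals = {d: poor[d] + ok[d] for d in TRUE_POOR}
    print(totals, "placebo:", placebo_n)
    ```

    Over time the adaptive draws concentrate active subjects on the dose with the fewest poor outcomes, while the fixed placebo share preserves the power of the final optimal-dose-versus-placebo comparison.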

  18. Stratified sampling design based on data mining.

    Science.gov (United States)

    Kim, Yeonkook J; Oh, Yoonhwan; Park, Sunghoon; Cho, Sungzoon; Park, Hayoung

    2013-09-01

    To explore classification rules based on data mining methodologies which are to be used in defining strata in stratified sampling of healthcare providers with improved sampling efficiency. We performed k-means clustering to group providers with similar characteristics, and then constructed decision trees on cluster labels to generate stratification rules. We assessed the variance explained by the stratification proposed in this study and by conventional stratification to evaluate the performance of the sampling design. We constructed a study database from health insurance claims data and providers' profile data made available to this study by the Health Insurance Review and Assessment Service of South Korea, and population data from Statistics Korea. From our database, we used the data for single specialty clinics or hospitals in two specialties, general surgery and ophthalmology, for the year 2011 in this study. Data mining resulted in five strata in general surgery with two stratification variables, the number of inpatients per specialist and population density of provider location, and five strata in ophthalmology with two stratification variables, the number of inpatients per specialist and number of beds. The percentages of variance in annual changes in the productivity of specialists explained by the stratification in general surgery and ophthalmology were 22% and 8%, respectively, whereas conventional stratification by the type of provider location and number of beds explained 2% and 0.2% of variance, respectively. This study demonstrated that data mining methods can be used in designing efficient stratified sampling with variables readily available to the insurer and government; it offers an alternative to the existing stratification method that is widely used in healthcare provider surveys in South Korea.
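    The two-step idea (cluster providers, then turn cluster labels into a transparent rule) can be sketched without any ML library; the provider data and the single-split stand-in for a decision tree are simplifications of the paper's method:

    ```python
    import random

    random.seed(7)
    # Hypothetical providers: (inpatients per specialist, number of beds).
    providers = ([(random.gauss(20, 5), random.gauss(40, 10)) for _ in range(60)]
                 + [(random.gauss(60, 8), random.gauss(120, 15)) for _ in range(60)])

    def kmeans(points, k=2, iters=25):
        # Plain Lloyd's algorithm with random initial centers.
        centers = random.sample(points, k)
        labels = [0] * len(points)
        for _ in range(iters):
            labels = [min(range(k),
                          key=lambda j: sum((p - c) ** 2
                                            for p, c in zip(pt, centers[j])))
                      for pt in points]
            for j in range(k):
                members = [pt for pt, lab in zip(points, labels) if lab == j]
                if members:
                    centers[j] = tuple(sum(v) / len(members)
                                       for v in zip(*members))
        return labels

    labels = kmeans(providers)
    n = len(providers)

    # A one-split rule on the first variable approximates the cluster
    # labels, giving a stratification rule the insurer can apply directly.
    def split_errors(t):
        mis = sum((xv >= t) != bool(lab)
                  for (xv, _), lab in zip(providers, labels))
        return min(mis, n - mis)   # rule orientation is arbitrary

    threshold = min((xv for xv, _ in providers), key=split_errors)
    print(round(threshold, 1), split_errors(threshold))
    ```

    A real decision tree would search splits over several variables and depths, but the principle is the same: replace opaque cluster membership with explicit threshold rules that define the strata.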

  19. Latent spatial models and sampling design for landscape genetics

    Science.gov (United States)

    Hanks, Ephraim M.; Hooten, Mevin B.; Knick, Steven T.; Oyler-McCance, Sara J.; Fike, Jennifer A.; Cross, Todd B.; Schwartz, Michael K.

    2016-01-01

    We propose a spatially-explicit approach for modeling genetic variation across space and illustrate how this approach can be used to optimize spatial prediction and sampling design for landscape genetic data. We propose a multinomial data model for categorical microsatellite allele data commonly used in landscape genetic studies, and introduce a latent spatial random effect to allow for spatial correlation between genetic observations. We illustrate how modern dimension reduction approaches to spatial statistics can allow for efficient computation in landscape genetic statistical models covering large spatial domains. We apply our approach to propose a retrospective spatial sampling design for greater sage-grouse (Centrocercus urophasianus) population genetics in the western United States.

  20. Surface area-dependence of gas-particle interactions influences pulmonary and neuroinflammatory outcomes

    OpenAIRE

    Tyler, Christina R.; Zychowski, Katherine E.; Sanchez, Bethany N.; Rivero, Valeria; Lucas, Selita; Herbert, Guy; Liu, June; Irshad, Hammad; McDonald, Jacob D.; Bleske, Barry E.; Campen, Matthew J.

    2016-01-01

    Background Deleterious consequences of exposure to traffic emissions may derive from interactions between carbonaceous particulate matter (PM) and gaseous components in a manner that is dependent on the surface area or complexity of the particles. To determine the validity of this hypothesis, we examined pulmonary and neurological inflammatory outcomes in C57BL/6 and apolipoprotein E knockout (ApoE-/-) male mice after acute and chronic exposure to vehicle engine-derived particulate matter, ge...

  1. An assessment of Lot Quality Assurance Sampling to evaluate malaria outcome indicators: extending malaria indicator surveys.

    Science.gov (United States)

    Biedron, Caitlin; Pagano, Marcello; Hedt, Bethany L; Kilian, Albert; Ratcliffe, Amy; Mabunda, Samuel; Valadez, Joseph J

    2010-02-01

    Large investments and increased global prioritization of malaria prevention and treatment have resulted in greater emphasis on programme monitoring and evaluation (M&E) in many countries. Many countries currently use large multistage cluster sample surveys to monitor malaria outcome indicators on a regional and national level. However, these surveys often mask local-level variability important to programme management. Lot Quality Assurance Sampling (LQAS) has played a valuable role for local-level programme M&E. If incorporated into these larger surveys, it would provide a comprehensive M&E plan at little, if any, extra cost. The Mozambique Ministry of Health conducted a Malaria Indicator Survey (MIS) in June and July 2007. We applied LQAS classification rules to the 345 sampled enumeration areas to demonstrate identifying high- and low-performing areas with respect to two malaria program indicators: 'household possession of any bednet' and 'household possession of any insecticide-treated bednet (ITN)'. As shown by the MIS, no province in Mozambique achieved the 70% coverage target for household possession of bednets or ITNs. By applying LQAS classification rules to the data, we identify 266 of the 345 enumeration areas as having bednet coverage severely below the 70% target. An additional 73 were identified with low ITN coverage. This article demonstrates the feasibility of integrating LQAS into multistage cluster sampling surveys and using these results to support a comprehensive national, regional and local programme M&E system. Furthermore, in the recommendations we outlined how to integrate the Large Country-LQAS design into macro-surveys while still obtaining results available through current sampling practices.
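    An LQAS classification rule of the kind applied to each enumeration area can be derived from the binomial distribution; the sample size of 19 and the 10% risk level below are illustrative assumptions, not the survey's parameters:

    ```python
    from math import comb

    def binom_cdf(k, n, p):
        # P(X <= k) for X ~ Binomial(n, p).
        return sum(comb(n, i) * p ** i * (1 - p) ** (n - i)
                   for i in range(k + 1))

    def lqas_rule(n, p_target=0.70, alpha=0.10):
        """Largest decision rule d (classify an area as reaching the
        coverage target when >= d of n sampled households have a bednet)
        such that a truly on-target area (coverage p_target) is
        misclassified as low with probability <= alpha."""
        best = 0
        for d in range(n + 1):
            if binom_cdf(d - 1, n, p_target) <= alpha:   # P(X < d | p_target)
                best = d
        return best

    n = 19
    d = lqas_rule(n)
    # Probability a truly low-coverage (50%) area still passes the rule:
    print(d, 1 - binom_cdf(d - 1, n, 0.50))
    ```

    Applying such a rule area by area is what turns the cluster survey's local samples into the high/low coverage classifications reported above.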

  2. Video-enabled cue-exposure-based intervention improves postdischarge drinking outcomes among alcohol-dependent men: A prospective study at a government addiction treatment setting in India.

    Science.gov (United States)

    Nattala, Prasanthi; Murthy, Pratima; Leung, Kit Sang; Rentala, Sreevani; Ramakrishna, Jayashree

    2017-04-25

    Returning to alcohol use following inpatient treatment occurs due to various real-life cues/triggers, and it is a challenge to demonstrate to patients how to deal with these triggers during inpatient treatment. The aims of the current study were (a) to evaluate the effectiveness of a video-enabled cue-exposure-based intervention (VE-CEI) in influencing treatment outcomes in alcohol dependence, and (b) to identify postdischarge predictors of intervention failure (returning to ≥50% of baseline alcohol consumption quantity/day). The VE-CEI comprises live-action videos in which human characters model various alcohol use cues and strategies to deal with them effectively. The VE-CEI was administered to an inpatient alcohol-dependent sample (n = 43) and compared with treatment as usual (TAU) (n = 42) at a government addiction treatment setting in India. Patients were followed up over 6 months postdischarge to evaluate the effectiveness of the VE-CEI on specific drinking outcomes. Over the 6-month follow-up, the VE-CEI group (vs. TAU) reported significantly lower alcohol consumption quantity, fewer drinking days, and lower intervention failure rates. Results of multivariate Cox regression showed that participants who did not receive the VE-CEI had an elevated risk of intervention failure (hazard ratio: 11.14; 95% confidence interval [4.93, 25.15]); other intervention failure predictors were early-onset dependence and increased baseline drinking. Findings provide evidence from India for the effectiveness of a cue-exposure-based intervention delivered using video technology in improving postdischarge treatment outcomes.

  3. Visual Sample Plan (VSP) Software: Designs and Data Analyses for Sampling Contaminated Buildings

    International Nuclear Information System (INIS)

    Pulsipher, Brent A.; Wilson, John E.; Gilbert, Richard O.; Nuffer, Lisa L.; Hassig, Nancy L.

    2005-01-01

    A new module of the Visual Sample Plan (VSP) software has been developed to provide sampling designs and data analyses for potentially contaminated buildings. An important application is assessing levels of contamination in buildings after a terrorist attack. This new module, funded by DHS through the Combating Terrorism Technology Support Office, Technical Support Working Group, was developed to provide a tailored, user-friendly and visually oriented buildings module within the existing VSP software toolkit, the latest version of which can be downloaded from http://dqo.pnl.gov/vsp. In case of, or when planning against, a chemical, biological, or radionuclide release within a building, the VSP module can be used to quickly and easily develop and visualize technically defensible sampling schemes for walls, floors, ceilings, and other surfaces to statistically determine whether contamination is present, its magnitude and extent throughout the building, and whether decontamination has been effective. This paper demonstrates the features of this new VSP buildings module, which include: the ability to import building floor plans or to easily draw, manipulate, and view rooms in several ways; the ability to insert doors, windows, and annotations into a room; and 3-D graphic room views with surfaces labeled and floor plans that show building zones that have separate air handling units. The paper also discusses the statistical design and data analysis options available in the buildings module. Design objectives supported include comparing an average to a threshold when the data distribution is normal or unknown, and comparing measurements to a threshold to detect hotspots or to ensure most of the area is uncontaminated when the data distribution is normal or unknown.

  4. Dependence of the outcomes of sports exercises on the mental state of student athletes.

    Directory of Open Access Journals (Sweden)

    Titovich A.O.

    2010-08-01

    The dependence of the outcome of a given sports exercise on the athlete's mental state is examined. Twenty-four student athletes participated in the study, performing six kinds of competitive exercise: the 100 m sprint, the 110 m hurdles, the long jump, the high jump, the pole vault, and the shot put. It is shown that, before exercises of a given type, the athletes' diagnostic indicators vary in line with the basic demands of that exercise, and it is established that the outcome of a physical exercise depends on the adequacy of the athlete's mental set.

  5. Mediated and Moderated Effects of Neurocognitive Impairment on Outcomes of Treatment for Substance Dependence and Major Depression

    Science.gov (United States)

    Worley, Matthew J.; Tate, Susan R.; Granholm, Eric; Brown, Sandra A.

    2015-01-01

    Objective: Neurocognitive impairment has not consistently predicted substance use treatment outcomes but has been linked to proximal mediators of outcome. These indirect effects have not been examined in adults with substance dependence and co-occurring psychiatric disorders. We examined mediators and moderators of the effects of neurocognitive impairment on substance use among adults in treatment for alcohol or drug dependence and major depression (MDD). Method: Participants were veterans (N = 197, mean age = 49.3 years, 90% male, 75% Caucasian) in a trial of two group interventions for alcohol/drug dependence and MDD. Measures examined here included intake neurocognitive assessments and percent days drinking (PDD), percent days using drugs (PDDRG), self-efficacy, 12-step affiliation, and depressive symptoms measured every 3 months from intake to the 18-month follow-up. Results: Greater intake neurocognitive impairment predicted lower self-efficacy, lower 12-step affiliation, and greater depression severity, and these time-varying variables mediated the effects of impairment on future PDD and PDDRG. The prospective effects of 12-step affiliation on future PDD were greater for those with greater neurocognitive impairment. Impairment also interacted with depression to moderate the effects of 12-step affiliation and self-efficacy on PDD. Adults with greater impairment and currently severe depression had the strongest associations between 12-step affiliation/self-efficacy and future drinking. Conclusions: Greater neurocognitive impairment may lead to poorer outcomes from group therapy for alcohol/drug dependence and MDD due to compromised change in therapeutic processes. Distal factors such as neurocognitive impairment can interact with dynamic risk factors to modulate the association between therapeutic processes and future drinking outcomes. PMID:24588403

  6. Coupling methods for multistage sampling

    OpenAIRE

    Chauvet, Guillaume

    2015-01-01

    Multistage sampling is commonly used for household surveys when there exists no sampling frame, or when the population is scattered over a wide area. Multistage sampling usually introduces a complex dependence in the selection of the final units, which makes asymptotic results quite difficult to prove. In this work, we consider multistage sampling with simple random without replacement sampling at the first stage, and with an arbitrary sampling design for further stages. We consider coupling ...

  7. Rapid Sampling of Hydrogen Bond Networks for Computational Protein Design.

    Science.gov (United States)

    Maguire, Jack B; Boyken, Scott E; Baker, David; Kuhlman, Brian

    2018-05-08

    Hydrogen bond networks play a critical role in determining the stability and specificity of biomolecular complexes, and the ability to design such networks is important for engineering novel structures, interactions, and enzymes. One key feature of hydrogen bond networks that makes them difficult to rationally engineer is that they are highly cooperative and are not energetically favorable until the hydrogen bonding potential has been satisfied for all buried polar groups in the network. Existing computational methods for protein design are ill-equipped for creating these highly cooperative networks because they rely on energy functions and sampling strategies that are focused on pairwise interactions. To enable the design of complex hydrogen bond networks, we have developed a new sampling protocol in the molecular modeling program Rosetta that explicitly searches for sets of amino acid mutations that can form self-contained hydrogen bond networks. For a given set of designable residues, the protocol often identifies many alternative sets of mutations/networks, and we show that it can readily be applied to large sets of residues at protein-protein interfaces or in the interior of proteins. The protocol builds on a recently developed method in Rosetta for designing hydrogen bond networks that has been experimentally validated for small symmetric systems but was not extensible to many larger protein structures and complexes. The sampling protocol we describe here not only recapitulates previously validated designs with performance improvements but also yields viable hydrogen bond networks for cases where the previous method fails, such as the design of large, asymmetric interfaces relevant to engineering protein-based therapeutics.

  8. Random sampling or geostatistical modelling? Choosing between design-based and model-based sampling strategies for soil (with discussion)

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    1997-01-01

    Classical sampling theory has been repeatedly identified with classical statistics which assumes that data are identically and independently distributed. This explains the switch of many soil scientists from design-based sampling strategies, based on classical sampling theory, to the model-based

  9. Velocity dependent passive sampling for monitoring of micropollutants in dynamic stormwater discharges

    DEFF Research Database (Denmark)

    Birch, Heidi; Sharma, Anitha Kumari; Vezzaro, Luca

    2013-01-01

    Micropollutant monitoring in stormwater discharges is challenging because of the diversity of sources and thus the large number of pollutants found in stormwater. This is further complicated by the dynamics of runoff flows and the large number of discharge points. Most passive samplers are non-ideal for sampling such systems because they sample in a time-integrative manner. This paper reports a test of a flow-through passive sampler deployed in stormwater runoff at the outlet of a residential-industrial catchment. Momentum from the water velocity during runoff events created flow through the sampler, resulting in velocity-dependent sampling. This approach enables the integrative sampling of stormwater runoff during periods of weeks to months while weighting actual runoff events higher than no-flow periods. Results were comparable to results from volume-proportional samples.

  10. Sample size estimation and sampling techniques for selecting a representative sample

    Directory of Open Access Journals (Sweden)

    Aamir Omair

    2014-01-01

    Introduction: The purpose of this article is to provide a general understanding of the concepts of sampling as applied to health-related research. Sample Size Estimation: It is important to select a representative sample in quantitative research in order to be able to generalize the results to the target population. The sample should be of the required sample size and must be selected using an appropriate probability sampling technique. There are many hidden biases which can adversely affect the outcome of the study. Important factors to consider when estimating the sample size include the size of the study population, the confidence level, the expected proportion of the outcome variable (for categorical variables) or the standard deviation of the outcome variable (for numerical variables), and the required precision (margin of accuracy) of the study. The greater the required precision, the larger the required sample size. Sampling Techniques: The probability sampling techniques applied in health-related research include simple random sampling, systematic random sampling, stratified random sampling, cluster sampling, and multistage sampling. These are recommended over nonprobability sampling techniques because the results of the study can be generalized to the target population.
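The sample size estimation described above, for an expected proportion with a given margin of error, can be sketched with the standard normal-approximation formula n = z²·p·(1−p)/d²; the optional finite-population correction is one common refinement. The helper name and numeric inputs below are illustrative:

```python
from math import ceil
from statistics import NormalDist

def sample_size_proportion(p, margin, confidence=0.95, population=None):
    """Required n for estimating a proportion p to within +/- margin:
    n = z^2 * p * (1 - p) / margin^2, optionally with the
    finite-population correction n / (1 + (n - 1) / N)."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    n = z**2 * p * (1 - p) / margin**2
    if population is not None:
        n = n / (1 + (n - 1) / population)
    return ceil(n)

# Most conservative case: expected proportion 50%, 5% margin, 95% confidence
n = sample_size_proportion(0.50, 0.05)          # 385
n_small = sample_size_proportion(0.50, 0.05, population=1000)
```

Using p = 0.5 maximizes p(1−p) and so gives the largest, safest sample size when the true proportion is unknown.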

  11. Outcome of heroin-dependent adolescents presenting for opiate substitution treatment.

    LENUS (Irish Health Repository)

    Smyth, Bobby P

    2012-01-01

    Because the outcome of methadone and buprenorphine substitution treatment in adolescents is unclear, we completed a retrospective cohort study of 100 consecutive heroin-dependent adolescents who sought these treatments over an 8-year recruitment period. The participants' average age was 16.6 years, and 54 were female. Half of the patient group remained in treatment for over 1 year. Among those still in treatment at 12 months, 39% demonstrated abstinence from heroin. The final route of departure from the treatment program was via planned detox for 22%, dropout for 32%, and imprisonment for 8%. The remaining 39% were transferred elsewhere for ongoing opiate substitution treatment after a median period of 23 months of treatment. Males were more likely to exit via imprisonment (p < .05), but other outcomes were not predicted by gender. There were no deaths during treatment among these 100 patients who had a cumulative period of 129 person years at risk. Our findings suggest that this treatment delivers reductions in heroin use and that one fifth of patients will exit treatment following detox completion within a 1- to 2-year time frame.

  12. Social network recruitment for Yo Puedo: an innovative sexual health intervention in an underserved urban neighborhood—sample and design implications.

    Science.gov (United States)

    Minnis, Alexandra M; vanDommelen-Gonzalez, Evan; Luecke, Ellen; Cheng, Helen; Dow, William; Bautista-Arredondo, Sergio; Padian, Nancy S

    2015-02-01

    Most existing evidence-based sexual health interventions focus on individual-level behavior, even though there is substantial evidence that highlights the influential role of social environments in shaping adolescents' behaviors and reproductive health outcomes. We developed Yo Puedo, a combined conditional cash transfer and life skills intervention for youth to promote educational attainment, job training, and reproductive health wellness that we then evaluated for feasibility among 162 youth aged 16-21 years in a predominantly Latino community in San Francisco, CA. The intervention targeted youth's social networks and involved recruitment and randomization of small social network clusters. In this paper we describe the design of the feasibility study and report participants' baseline characteristics. Furthermore, we examined the sample and design implications of recruiting social network clusters as the unit of randomization. Baseline data provide evidence that we successfully enrolled high risk youth using a social network recruitment approach in community and school-based settings. Nearly all participants (95%) were high risk for adverse educational and reproductive health outcomes based on multiple measures of low socioeconomic status (81%) and/or reported high risk behaviors (e.g., gang affiliation, past pregnancy, recent unprotected sex, frequent substance use; 62%). We achieved variability in the study sample through heterogeneity in recruitment of the index participants, whereas the individuals within the small social networks of close friends demonstrated substantial homogeneity across sociodemographic and risk profile characteristics. Social networks recruitment was feasible and yielded a sample of high risk youth willing to enroll in a randomized study to evaluate a novel sexual health intervention.

  13. The Dutch Cannabis Dependence (CanDep) study on the course of frequent cannabis use and dependence: objectives, methods and sample characteristics

    NARCIS (Netherlands)

    van der Pol, P.; Liebregts, N.; de Graaf, R.; Korf, D.J.; van den Brink, W.; van Laar, M.

    2011-01-01

    This paper presents an overview of the prospective cohort design of the Dutch Cannabis Dependence (CanDep) study, which investigates (i) the three-year natural course of frequent cannabis use (≥ three days per week in the past 12 months) and cannabis dependence; and (ii) the factors involved in the

  14. A proposal of optimal sampling design using a modularity strategy

    Science.gov (United States)

    Simone, A.; Giustolisi, O.; Laucelli, D. B.

    2016-08-01

    Real water distribution networks (WDNs) contain thousands of nodes, and the optimal placement of pressure and flow observations is a relevant issue for several management tasks. Planning pressure observations, in terms of their number and spatial distribution, is known as sampling design and has traditionally been addressed with model calibration in mind. Nowadays, the design of system monitoring is a relevant issue for water utilities, e.g., in order to manage background leakage, detect anomalies and bursts, guarantee service quality, etc. In recent years, the optimal location of flow observations, related to the design of optimal district metering areas (DMAs) and to leakage management, has been addressed through optimal network segmentation and the modularity index within a multiobjective strategy. Optimal network segmentation identifies network modules by means of optimal conceptual cuts, which are the candidate locations of the closed gates or flow meters creating the DMAs. Starting from the WDN-oriented modularity index as a metric for WDN segmentation, this paper proposes a new way to perform sampling design, i.e., the optimal location of pressure meters, using a newly developed sampling-oriented modularity index. The strategy optimizes the pressure monitoring system based mainly on network topology, with weights assigned to pipes according to the specific technical tasks. A multiobjective optimization minimizes the cost of pressure meters while maximizing the sampling-oriented modularity index. The methodology is presented and discussed using the Apulian and Exnet networks.
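For readers unfamiliar with the modularity metric underlying this approach, a minimal sketch of the classic (unweighted) Newman-Girvan index is shown below; the WDN-oriented and sampling-oriented indices proposed in the paper are task-weighted variants, and the toy graph is purely illustrative:

```python
import numpy as np

def modularity(adj, labels):
    """Newman-Girvan modularity Q of a partition of an undirected graph:
    fraction of edges within modules minus the fraction expected under a
    random degree-preserving rewiring."""
    adj = np.asarray(adj, dtype=float)
    two_m = adj.sum()                      # twice the number of edges
    k = adj.sum(axis=1)                    # node degrees
    same = np.equal.outer(labels, labels)  # same-module indicator
    return ((adj - np.outer(k, k) / two_m) * same).sum() / two_m

# Toy example: two triangles joined by a single bridge edge; cutting the
# bridge (a "conceptual cut") separates two well-formed modules.
A = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1
labels = np.array([0, 0, 0, 1, 1, 1])
q = modularity(A, labels)   # 5/14, about 0.357
```

A segmentation-oriented optimizer would search over partitions (and hence cut locations) to maximize such an index, subject to the cost constraints described in the abstract.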

  15. Rationale, design, methodology and sample characteristics for the Vietnam pre-conceptual micronutrient supplementation trial (PRECONCEPT: a randomized controlled study

    Directory of Open Access Journals (Sweden)

    Nguyen Phuong H

    2012-10-01

    Background: Low birth weight and maternal anemia remain intractable problems in many developing countries. The adequacy of the current strategy of providing iron-folic acid (IFA) supplements only during pregnancy has been questioned, given that many women enter pregnancy with poor iron stores, that there is substantial micronutrient demand by maternal and fetal tissues, and that there are programmatic issues related to the timing and coverage of prenatal care. Weekly IFA supplementation for women of reproductive age (WRA) improves iron status and reduces the burden of anemia in the short term, but few studies have evaluated subsequent pregnancy and birth outcomes. The PRECONCEPT trial aims to determine whether pre-pregnancy weekly IFA or multiple micronutrient (MM) supplementation will improve birth outcomes and maternal and infant iron status compared to the current practice of prenatal IFA supplementation only. This paper provides an overview of the study design, methodology and sample characteristics from baseline survey data, and key lessons learned. Methods/design: We have recruited 5011 WRA in a double-blind stratified randomized controlled trial in rural Vietnam and randomly assigned them to receive weekly supplements containing either (1) 2800 μg folic acid, (2) 60 mg iron and 2800 μg folic acid, or (3) MM. Women who become pregnant receive daily IFA and are being followed through pregnancy, delivery, and up to three months post-partum. Study outcomes include birth outcomes and maternal and infant iron status. Data are being collected on household characteristics, maternal diet and mental health, anthropometry, infant feeding practices, morbidity and compliance. Discussion: The study is timely and responds to the WHO Global Expert Consultation, which identified the need to evaluate the long-term benefits of weekly IFA and MM supplementation in WRA. Findings will generate new information to help guide policy and programs designed to reduce the burden of anemia in women and

  16. Empirically sampling Universal Dependencies

    DEFF Research Database (Denmark)

    Schluter, Natalie; Agic, Zeljko

    2017-01-01

    Universal Dependencies incur a high cost in computation for unbiased system development. We propose a 100% empirically chosen small subset of UD languages for efficient parsing system development. The technique used is based on measurements of model capacity globally. We show that the diversity o...

  17. Secondary Analysis under Cohort Sampling Designs Using Conditional Likelihood

    Directory of Open Access Journals (Sweden)

    Olli Saarela

    2012-01-01

    Under cohort sampling designs, additional covariate data are collected on cases of a specific type and a randomly selected subset of noncases, primarily for the purpose of studying associations with a time-to-event response of interest. With such data available, an interest may arise to reuse them for studying associations between the additional covariate data and a secondary non-time-to-event response variable, usually collected for the whole study cohort at the outset of the study. Following earlier literature, we refer to such a situation as secondary analysis. We outline a general conditional likelihood approach for secondary analysis under cohort sampling designs and discuss the specific situations of case-cohort and nested case-control designs. We also review alternative methods based on full likelihood and inverse probability weighting. We compare the alternative methods for secondary analysis in two simulated settings and apply them in a real-data example.
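One of the alternatives reviewed above, inverse probability weighting, can be sketched on simulated case-cohort data: observations in the sampled subset are weighted by the inverse of their known inclusion probabilities so that the secondary regression is unbiased despite outcome-dependent sampling. The data-generating model and all numbers below are illustrative assumptions, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

# Full cohort: a secondary (non-time-to-event) response y and an
# "expensive" covariate x measured only on the sampled subset.
N = 10_000
x = rng.normal(size=N)
y = 1.0 + 2.0 * x + rng.normal(size=N)

# Case status for the primary outcome depends on y, so the sampled
# subset is not representative of the cohort.
p_case = 1.0 / (1.0 + np.exp(5.0 - y))
case = rng.random(N) < p_case

# Case-cohort sampling: all cases plus a 10% random subcohort.
subcohort = rng.random(N) < 0.10
sampled = case | subcohort
pi = np.where(case, 1.0, 0.10)   # known inclusion probabilities

# Inverse-probability-weighted least squares on the sampled subset
# (Horvitz-Thompson-style weighting of the normal equations).
w = 1.0 / pi[sampled]
X = np.column_stack([np.ones(int(sampled.sum())), x[sampled]])
beta = np.linalg.solve(X.T * w @ X, X.T @ (w * y[sampled]))
# beta should recover approximately (1.0, 2.0)
```

An unweighted regression on the same subset would be biased toward the cases; the weights undo that oversampling.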

  18. Detecting spatial structures in throughfall data: The effect of extent, sample size, sampling design, and variogram estimation method

    Science.gov (United States)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-09-01

    In the last decades, an increasing number of studies analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling) and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments (non-robust and robust estimators) and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the number recommended by studies dealing with Gaussian data by up to 100 %. 
Given that most previous
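The method-of-moments estimator compared in the study above can be sketched as follows. This is the classic non-robust (Matheron) version, leaving aside the robust and residual-maximum-likelihood variants the study also examines; the function name and binning scheme are illustrative:

```python
import numpy as np

def empirical_variogram(coords, values, bin_edges):
    """Method-of-moments (Matheron) variogram estimator:
    gamma(h) = mean of 0.5 * (z_i - z_j)^2 over pairs whose separation
    distance falls in the lag bin around h."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    i, j = np.triu_indices(len(values), k=1)        # all unordered pairs
    if coords.ndim == 1:
        d = np.abs(coords[i] - coords[j])
    else:
        d = np.linalg.norm(coords[i] - coords[j], axis=1)
    sq = 0.5 * (values[i] - values[j]) ** 2          # semivariances
    which = np.digitize(d, bin_edges) - 1            # lag-bin index per pair
    gamma = np.array([sq[which == b].mean() if np.any(which == b) else np.nan
                      for b in range(len(bin_edges) - 1)])
    return gamma
```

Because each lag estimate averages squared differences, a few large outliers inflate the estimate badly, which is one reason the study compares robust and likelihood-based alternatives for non-Gaussian throughfall data.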

  19. Understanding the cluster randomised crossover design: a graphical illustration of the components of variation and a sample size tutorial.

    Science.gov (United States)

    Arnup, Sarah J; McKenzie, Joanne E; Hemming, Karla; Pilcher, David; Forbes, Andrew B

    2017-08-15

    In a cluster randomised crossover (CRXO) design, a sequence of interventions is assigned to a group, or 'cluster', of individuals. Each cluster receives each intervention in a separate period of time, forming 'cluster-periods'. Sample size calculations for CRXO trials need to account for both the cluster randomisation and crossover aspects of the design. Formulae are available for the two-period, two-intervention, cross-sectional CRXO design; however, implementation of these formulae is known to be suboptimal. The aims of this tutorial are to illustrate the intuition behind the design and to provide guidance on performing sample size calculations. Graphical illustrations are used to describe the effect of the cluster randomisation and crossover aspects of the design on the correlation between individual responses in a CRXO trial. Sample size calculations for binary and continuous outcomes are illustrated using parameters estimated from the Australia and New Zealand Intensive Care Society - Adult Patient Database (ANZICS-APD) for patient mortality and length of stay (LOS). The similarity between individual responses in a CRXO trial can be understood in terms of three components of variation: variation in the cluster mean response; variation in the cluster-period mean response; and variation between individual responses within a cluster-period; or equivalently in terms of the correlation between individual responses in the same cluster-period (within-cluster within-period correlation, WPC) and between individual responses in the same cluster but in different periods (within-cluster between-period correlation, BPC). The BPC lies between zero and the WPC. When the WPC and BPC are equal, the precision gained by the crossover aspect of the CRXO design equals the precision lost by cluster randomisation. When the BPC is zero there is no advantage in a CRXO over a parallel-group cluster randomised trial.
Sample size calculations illustrate that small changes in the specification of
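The three components of variation described above translate into a design effect for the two-period cross-sectional CRXO design. A standard variance derivation (not spelled out in the abstract) gives DE = 1 + (m - 1)·WPC - m·BPC relative to an individually randomised trial of the same total size, with m individuals per cluster-period; note that at BPC = 0 this reduces to the parallel-group cluster design effect, matching the statement above. The numeric inputs in the sketch are illustrative, not the ANZICS-APD estimates:

```python
from math import ceil
from statistics import NormalDist

def crxo_design_effect(m, wpc, bpc):
    """Design effect of a two-period, cross-sectional cluster randomised
    crossover design, relative to individual randomisation, for m
    individuals per cluster-period."""
    return 1 + (m - 1) * wpc - m * bpc

def crxo_n_per_arm(delta, sd, m, wpc, bpc, alpha=0.05, power=0.80):
    """Individually randomised n per arm for a two-sample comparison of
    means, inflated (or deflated) by the CRXO design effect."""
    z = NormalDist().inv_cdf
    n_ind = 2 * (z(1 - alpha / 2) + z(power)) ** 2 * (sd / delta) ** 2
    return ceil(n_ind * crxo_design_effect(m, wpc, bpc))

# Illustrative inputs: effect of 0.25 SD, 20 patients per cluster-period,
# WPC = 0.05, BPC = 0.02
n = crxo_n_per_arm(delta=0.25, sd=1.0, m=20, wpc=0.05, bpc=0.02)
```

As BPC approaches WPC the design effect falls below 1, reflecting the precision recovered by the within-cluster crossover comparison.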

  20. Concurrent cannabis use during treatment for comorbid ADHD and cocaine dependence: effects on outcome.

    Science.gov (United States)

    Aharonovich, Efrat; Garawi, Fatima; Bisaga, Adam; Brooks, Daniel; Raby, Wilfrid N; Rubin, Eric; Nunes, Edward V; Levin, Frances R

    2006-01-01

    Cannabis is the most widely used illicit substance in the United States with especially high prevalence of use among those with psychiatric disorders. Few studies have examined the relationship between concurrent cannabis use and treatment outcome among patients receiving treatment for comorbid substance abuse and psychiatric disorders. This study investigated the effects of cannabis use on treatment retention and abstinence from cocaine among cocaine dependent patients with Attention Deficit Hyperactivity Disorder (ADHD). Cocaine dependent patients diagnosed with current ADHD (DSM-IV, N = 92) aged 25 to 51 participated in a randomized clinical trial of methylphenidate for treatment of ADHD and cocaine dependence in an outpatient setting. The majority of patients (69%) used cannabis during treatment. Results suggest that moderate/intermittent cannabis users had greater retention rates compared to abstainers and consistent users (p = .02). This study is the first to examine concurrent cannabis use in cocaine dependent patients diagnosed with ADHD.

  1. Sampling designs and methods for estimating fish-impingement losses at cooling-water intakes

    International Nuclear Information System (INIS)

    Murarka, I.P.; Bodeau, D.J.

    1977-01-01

    Several systems for estimating fish impingement at power plant cooling-water intakes are compared to determine the most statistically efficient sampling designs and methods. Compared to a simple random sampling scheme, the stratified systematic random sampling scheme, the systematic random sampling scheme, and the stratified random sampling scheme yield higher efficiencies and better estimators for the parameters in two models of fish impingement as a time-series process. Mathematical results and illustrative examples of the applications of the sampling schemes to simulated and real data are given. Some sampling designs applicable to fish-impingement studies are presented in appendixes.
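The efficiency gain of stratified over simple random sampling of a time-series process like impingement can be illustrated with a small simulation. The diurnal-rate model, function names, and all numbers below are hypothetical, not taken from the report:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical hourly impingement counts for 30 days with a strong
# diurnal cycle (fish counts vary systematically by hour of day).
HOURS = 30 * 24
rate = 50 + 40 * np.sin(2 * np.pi * np.arange(HOURS) / 24)
counts = rng.poisson(rate)

def one_replicate(n=120):
    # Simple random sample of n of the 720 hours
    srs = counts[rng.choice(HOURS, n, replace=False)].mean()
    # Stratified: n/24 hours drawn within each hour-of-day stratum,
    # so the diurnal cycle is represented in every sample
    by_hour = counts.reshape(30, 24)   # rows: days, columns: hour of day
    strat = np.mean([rng.choice(by_hour[:, h], n // 24, replace=False).mean()
                     for h in range(24)])
    return srs, strat

draws = np.array([one_replicate() for _ in range(500)])
var_srs, var_strat = draws.var(axis=0)
# Both estimators are unbiased, but stratifying on time of day removes
# the diurnal component of the variance, so var_strat << var_srs.
```

This mirrors the report's conclusion qualitatively: designs that exploit the time-series structure (stratified and systematic schemes) beat simple random sampling of hours.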

  2. Extending cluster Lot Quality Assurance Sampling designs for surveillance programs

    OpenAIRE

    Hund, Lauren; Pagano, Marcello

    2014-01-01

    Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as poor or acceptable performance based on the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than ...

  3. Individualized Treatment for Tobacco Dependence in Addictions Treatment Settings: The Role of Current Depressive Symptoms on Outcomes at 3 and 6 Months.

    Science.gov (United States)

    Zawertailo, Laurie A; Baliunas, Dolly; Ivanova, Anna; Selby, Peter L

    2015-08-01

    Individuals with concurrent tobacco dependence and other addictions often report symptoms of low mood and depression, and as such may have more difficulty quitting smoking. We hypothesized that current symptoms of depression would be a significant predictor of quit success among a group of smokers receiving individualized treatment for tobacco dependence within addiction treatment settings. Individuals in treatment for other addictions were enrolled in a smoking cessation program involving brief behavioral counseling and individualized dosing of nicotine replacement therapy. The baseline assessment included the Patient Health Questionnaire (PHQ9) for depression. Smoking cessation outcomes were measured at 3 and 6 months post-enrollment. Bivariate associations between cessation outcomes and PHQ9 score were analyzed. Of the 1,196 subjects enrolled to date, 1,171 (98%) completed the PHQ9. Moderate to severe depression (score >9) was reported by 28% of the sample, and another 29% reported mild depression (score between 5 and 9). Contrary to the extant literature and other findings by our own group, there was no association between current depression and cessation outcome at either 3 months (n = 1,171; 17.0% quit in those with PHQ9 > 9 vs. 19.8% in those with PHQ9 ≤ 9) or 6 months in this addictions treatment setting. These data indicate that patients in an addictions treatment setting can successfully quit smoking regardless of current depressive symptoms.

  4. Bionic Design for Mars Sampling Scoop Inspired by Himalayan Marmot Claw

    Directory of Open Access Journals (Sweden)

    Long Xue

    2016-01-01

    Cave animals are often adapted to digging and life underground, with clawed toes similar in structure and function to a sampling scoop. In this paper, the clawed toes of the Himalayan marmot were selected as a biological prototype for bionic research. Based on geometric parameter optimization of the clawed toes, a bionic sampling scoop for use on Mars was designed. Using a 3D laser scanner, the point cloud data of the second front clawed toe were acquired. Parametric equations and contour curves for the claw were then built with cubic polynomial fitting, yielding 18 characteristic curve equations for the internal and external contours of the claw. A bionic sampling scoop was designed according to the structural parameters of Curiosity's sampling shovel and the contours of the Himalayan marmot's claw. Verification tests showed that when the penetration angle was 45° and the sampling speed was 0.33 r/min, the bionic sampling scoop's resistance torque was 49.6% less than that of the prototype sampling scoop; when the penetration angle was 60° and the sampling speed was 0.22 r/min, it was 28.8% lower.

  5. Multi-morbidity, dependency, and frailty singly or in combination have different impact on health outcomes.

    Science.gov (United States)

    Woo, Jean; Leung, Jason

    2014-04-01

    Multi-morbidity, dependency, and frailty were studied simultaneously in a community-living cohort of 4,000 men and women aged 65 years and over to examine their independent and combined effects on four health outcomes (mortality, decline in physical function, depression, and polypharmacy). The influence of socioeconomic status on these relationships was also examined. Mortality data were documented after a mean follow-up period of 9 years, while other health outcomes were documented after 4 years of follow-up. Fifteen percent of the cohort did not have any of these syndromes. Of the remaining participants, nearly one third had multi-morbidity and frailty (pre-frail and frail), while all three syndromes were present in 11%. All syndromes as well as socioeconomic status were significantly associated with all health outcomes. Mortality risk was increased only with age, being male, frailty status, and combinations of syndromes that included frailty. Both multi-morbidity and frailty predicted decline in physical function, while being male was protective. Only a combination of all three syndromes, and age per se, increased the risk of depressive symptoms at 4 years, while being male conferred reduced risk. Multi-morbidity, but not frailty status or dependency, and all syndrome combinations that included multi-morbidity were associated with use of ≥4 medications. Decline in homeostatic function with age may thus be quantified and taken into account in the prediction of various health outcomes, with a view to prevention, management, formulation of guidelines, service planning, and the conduct of randomized controlled trials of interventions or treatment.

  6. Design of Field Experiments for Adaptive Sampling of the Ocean with Autonomous Vehicles

    Science.gov (United States)

    Zheng, H.; Ooi, B. H.; Cho, W.; Dao, M. H.; Tkalich, P.; Patrikalakis, N. M.

    2010-05-01

    Due to the highly non-linear and dynamical nature of oceanic phenomena, the predictive capability of various ocean models depends on the availability of operational data. A practical method to improve the accuracy of the ocean forecast is to use a data assimilation methodology to combine in-situ measured and remotely acquired data with numerical forecast models of the physical environment. Autonomous surface and underwater vehicles with various sensors are economic and efficient tools for exploring and sampling the ocean for data assimilation; however, such vehicles have limited energy, so effective resource allocation for adaptive sampling is required to optimize the efficiency of exploration. In this paper, we use physical oceanography forecasts of the coastal zone of Singapore to design a set of field experiments that acquire useful data for model calibration and data assimilation. The design process relied on oceanographic forecast maps of current speed, current gradient, and vorticity in a region of interest for which permits for field experiments could be obtained, and for time intervals corresponding to strong tidal currents. Based on these maps, resources available to our experimental team, including Autonomous Surface Craft (ASC), were allocated so as to capture the oceanic features that result from jets and vortices behind bluff bodies (e.g., islands) in the tidal current. Results are summarized from this resource allocation process and from field experiments conducted in January 2009.

  7. Evaluation of three sampling methods to monitor outcomes of antiretroviral treatment programmes in low- and middle-income countries.

    Science.gov (United States)

    Tassie, Jean-Michel; Malateste, Karen; Pujades-Rodríguez, Mar; Poulet, Elisabeth; Bennett, Diane; Harries, Anthony; Mahy, Mary; Schechter, Mauro; Souteyrand, Yves; Dabis, François

    2010-11-10

    Retention of patients on antiretroviral therapy (ART) over time is a proxy for quality of care and an outcome indicator to monitor ART programs. Using existing databases (Antiretroviral in Lower Income Countries of the International Databases to Evaluate AIDS and Médecins Sans Frontières), we evaluated three sampling approaches to simplify the generation of outcome indicators. We used individual patient data from 27 ART sites and included 27,201 ART-naive adults (≥15 years) who initiated ART in 2005. For each site, we generated two outcome indicators at 12 months, retention on ART and proportion of patients lost to follow-up (LFU), first using all patient data and then within a smaller group of patients selected using three sampling methods (random, systematic and consecutive sampling). For each method and each site, 500 samples were generated, and the average result was compared with the unsampled value. The 95% sampling distribution (SD) was expressed as the 2.5th and 97.5th percentile values from the 500 samples. Overall, retention on ART was 76.5% (range 58.9-88.6) and the proportion of patients LFU, 13.5% (range 0.8-31.9). Estimates of retention from sampling (n = 5696) were 76.5% (SD 75.4-77.7) for random, 76.5% (75.3-77.5) for systematic and 76.0% (74.1-78.2) for the consecutive method. Estimates for the proportion of patients LFU were 13.5% (12.6-14.5), 13.5% (12.6-14.3) and 14.0% (12.5-15.5), respectively. With consecutive sampling, 50% of sites had SD within ±5% of the unsampled site value. Our results suggest that random, systematic or consecutive sampling methods are feasible for monitoring ART indicators at national level. However, sampling may not produce precise estimates in some sites.
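    The three indicator-sampling schemes this record compares (random, systematic, consecutive) can be sketched as follows. This is a minimal illustration, not the study's code: the cohort is simulated with the abstract's overall 76.5% retention rate, while the cohort size of 5,000 and the draw size of 500 are invented for the example. As in the abstract, 500 repeated samples are drawn per method and summarized by their 2.5th/97.5th percentiles.

```python
import random

def retention_estimate(status, idx):
    """Proportion retained among the sampled patient indices."""
    return sum(status[i] for i in idx) / len(idx)

def random_sample(n, k, rng):
    return rng.sample(range(n), k)

def systematic_sample(n, k, rng):
    step = n // k
    start = rng.randrange(step)
    return list(range(start, start + step * k, step))

def consecutive_sample(n, k, rng):
    start = rng.randrange(n - k + 1)
    return list(range(start, start + k))

rng = random.Random(42)
# Hypothetical cohort: 1 = retained on ART at 12 months (true rate ~76.5%).
cohort = [1 if rng.random() < 0.765 else 0 for _ in range(5000)]

for name, sampler in [("random", random_sample),
                      ("systematic", systematic_sample),
                      ("consecutive", consecutive_sample)]:
    # 500 repeated draws per method, as in the abstract's evaluation.
    ests = sorted(retention_estimate(cohort, sampler(len(cohort), 500, rng))
                  for _ in range(500))
    lo, hi = ests[12], ests[487]  # ~2.5th and 97.5th percentiles
    print(f"{name:11s} mean={sum(ests) / 500:.3f}  95% range=({lo:.3f}, {hi:.3f})")
```

    Because the simulated cohort is homogeneous, the three methods behave similarly here; the abstract's wider consecutive-sampling intervals arise when retention varies over the enrolment period, which consecutive blocks cannot average out.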

  8. Optimal experiment design in a filtering context with application to sampled network data

    OpenAIRE

    Singhal, Harsh; Michailidis, George

    2010-01-01

    We examine the problem of optimal design in the context of filtering multiple random walks. Specifically, we define the steady state E-optimal design criterion and show that the underlying optimization problem leads to a second order cone program. The developed methodology is applied to tracking network flow volumes using sampled data, where the design variable corresponds to controlling the sampling rate. The optimal design is numerically compared to a myopic and a naive strategy. Finally, w...

  9. Preeminence and prerequisites of sample size calculations in clinical trials

    Directory of Open Access Journals (Sweden)

    Richa Singhal

    2015-01-01

    Full Text Available The key components while planning a clinical study are the study design, study duration, and sample size. These features are an integral part of planning a clinical trial efficiently, ethically, and cost-effectively. This article describes some of the prerequisites for sample size calculation. It also explains that sample size calculation differs across study designs, and describes in detail the sample size calculation for a randomized controlled trial when the primary outcome is a continuous variable and when it is a proportion or a qualitative variable.
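    The distinction this record draws between continuous and proportion outcomes can be made concrete with the standard normal-approximation formulas for a two-arm trial with equal allocation. These are textbook formulas, not necessarily the article's exact notation, and the effect sizes below are invented examples:

```python
from math import ceil, sqrt
from statistics import NormalDist

def n_per_arm_continuous(delta, sd, alpha=0.05, power=0.80):
    """Per-arm n for a two-arm trial with a continuous primary outcome:
    detect a mean difference `delta` given outcome SD `sd`."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return ceil(2 * (sd * (z_a + z_b) / delta) ** 2)

def n_per_arm_proportion(p1, p2, alpha=0.05, power=0.80):
    """Per-arm n for a two-arm trial with a binary primary outcome
    (expected proportions p1 vs p2)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    num = (z_a * sqrt(2 * p_bar * (1 - p_bar))
           + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(num / (p1 - p2) ** 2)

print(n_per_arm_continuous(delta=5, sd=10))  # 63 per arm (effect size 0.5)
print(n_per_arm_proportion(0.40, 0.25))      # 152 per arm
```

    Note how the binary-outcome formula needs assumed proportions in both arms, not just a difference, which is one reason the calculation is design- and outcome-specific.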

  10. Sample requirements and design of an inter-laboratory trial for radiocarbon laboratories

    NARCIS (Netherlands)

    Bryant, C; Carmi, [No Value; Cook, G; Gulliksen, S; Harkness, D; Heinemeier, J; McGee, E; Naysmith, P; Possnert, G; van der Plicht, H; van Strydonck, M; Carmi, Israel

    2000-01-01

    An on-going inter-comparison programme which is focused on assessing and establishing consensus protocols to be applied in the identification, selection and sub-sampling of materials for subsequent C-14 analysis is described. The outcome of the programme will provide a detailed quantification of the

  11. Advanced dependent pressure vessel (DPV) nickel-hydrogen spacecraft battery design

    Energy Technology Data Exchange (ETDEWEB)

    Coates, D.K.; Grindstaff, B.; Swaim, O.; Fox, C. [Eagle-Picher Industries, Inc., Joplin, MO (United States). Advanced Systems Operation

    1995-12-31

    The dependent pressure vessel (DPV) nickel-hydrogen (NiH{sub 2}) battery is being developed as a potential spacecraft battery design for both military and commercial satellites. The limitations of standard NiH{sub 2} individual pressure vessel (IPV) flight battery technology are primarily related to the internal cell design and the battery packaging issues associated with grouping multiple cylindrical cells. The DPV cell design offers higher energy density and reduced cost, while retaining the established IPV technology flight heritage and database. The advanced cell design offers a more efficient mechanical, electrical and thermal cell configuration and a reduced parts count. The geometry of the DPV cell promotes compact, minimum volume packaging and weight efficiency. The DPV battery design offers significant cost and weight savings advantages while providing minimal design risks.

  12. Design/Operations review of core sampling trucks and associated equipment

    International Nuclear Information System (INIS)

    Shrivastava, H.P.

    1996-01-01

    A systematic review of the design and operations of the core sampling trucks was commissioned by Characterization Equipment Engineering of the Westinghouse Hanford Company in October 1995. The review team reviewed the design documents, specifications, operating procedure, training manuals and safety analysis reports. The review process, findings and corrective actions are summarized in this supporting document

  13. Indirect estimation of signal-dependent noise with nonadaptive heterogeneous samples.

    Science.gov (United States)

    Azzari, Lucio; Foi, Alessandro

    2014-08-01

    We consider the estimation of signal-dependent noise from a single image. Unlike conventional algorithms that build a scatterplot of local mean-variance pairs from either small or adaptively selected homogeneous data samples, our proposed approach relies on arbitrarily large patches of heterogeneous data extracted at random from the image. We demonstrate the feasibility of our approach through an extensive theoretical analysis based on mixture of Gaussian distributions. A prototype algorithm is also developed in order to validate the approach on simulated data as well as on real camera raw images.
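    For contrast with the heterogeneous-patch method this record proposes, the conventional baseline it mentions (a scatterplot of local mean-variance pairs from homogeneous samples, followed by a fit of var = a + b·mean) can be sketched as follows. The synthetic image, segment size, and sample counts are all illustrative assumptions:

```python
import random

def estimate_noise_model(image, seg=64, n_samples=3000, rng=None):
    """Fit var = a + b*mean by ordinary least squares over mean-variance
    pairs computed from randomly placed (homogeneous) row segments."""
    rng = rng or random.Random(0)
    h, w = len(image), len(image[0])
    xs, ys = [], []
    for _ in range(n_samples):
        r = rng.randrange(h)
        c = rng.randrange(w - seg + 1)
        vals = image[r][c:c + seg]
        m = sum(vals) / seg
        xs.append(m)
        ys.append(sum((v - m) ** 2 for v in vals) / (seg - 1))
    # closed-form OLS for the line a + b*mean
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b  # intercept a, slope b

# Synthetic raw image: constant signal mu per row, Gaussian noise with
# variance equal to mu (Poissonian behaviour, i.e. a = 0, b = 1).
rng = random.Random(1)
image = [[rng.gauss(mu, mu ** 0.5) for _ in range(256)]
         for mu in (rng.uniform(10, 130) for _ in range(256))]
a, b = estimate_noise_model(image, rng=rng)
print(f"a = {a:.1f}, b = {b:.2f}")  # close to a = 0, b = 1
```

    The paper's contribution is precisely that it avoids the homogeneity requirement this baseline depends on, using a Gaussian-mixture analysis of heterogeneous patches instead.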

  14. Practical iterative learning control with frequency domain design and sampled data implementation

    CERN Document Server

    Wang, Danwei; Zhang, Bin

    2014-01-01

    This book is on iterative learning control (ILC), with a focus on design and implementation. We approach the ILC design based on frequency domain analysis and address the ILC implementation based on sampled data methods. This is the first book on ILC from the frequency domain and sampled data methodologies. The frequency domain design methods offer ILC users insights into the convergence performance, which is of practical benefit. This book presents a comprehensive framework with various methodologies to ensure that the learnable bandwidth in the ILC system is set with a balance between learning performance and learning stability. The sampled data implementation ensures effective execution of ILC in practical dynamic systems. The presented sampled data ILC methods also ensure the balance of performance and stability of the learning process. Furthermore, the presented theories and methodologies are tested with an ILC controlled robotic system. The experimental results show that the machines can work in much h...

  15. Design of sampling tools for Monte Carlo particle transport code JMCT

    International Nuclear Information System (INIS)

    Shangguan Danhua; Li Gang; Zhang Baoyin; Deng Li

    2012-01-01

    A class of sampling tools for the general Monte Carlo particle transport code JMCT is designed. Two ways are provided to sample from distributions. One is the use of special sampling methods for particular distributions; the other is the use of general sampling methods for arbitrary discrete distributions and one-dimensional continuous distributions on a finite interval. Some open source codes are included in the general sampling method for the maximum convenience of users. The results show that distributions commonly encountered in particle transport can be sampled correctly with these tools, while assuring ease of use. (authors)
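    The two general-purpose tools described (sampling an arbitrary discrete distribution, and a one-dimensional continuous distribution on a finite interval) correspond to textbook inverse-transform and rejection sampling. A sketch in Python rather than JMCT's own implementation, with invented example distributions:

```python
import bisect
import random

def sample_discrete(weights, rng):
    """Inverse-transform draw from an arbitrary discrete distribution
    given by (unnormalized) weights; returns a category index."""
    cum, total = [], 0.0
    for w in weights:
        total += w
        cum.append(total)
    return bisect.bisect_right(cum, rng.random() * total)

def sample_continuous(pdf, a, b, pdf_max, rng):
    """Rejection sampling from a 1-D density on the finite interval [a, b];
    pdf_max must bound the density from above on [a, b]."""
    while True:
        x = rng.uniform(a, b)
        if rng.uniform(0.0, pdf_max) <= pdf(x):
            return x

rng = random.Random(7)

counts = [0, 0, 0]
for _ in range(30000):
    counts[sample_discrete([0.2, 0.5, 0.3], rng)] += 1
print([round(c / 30000, 2) for c in counts])  # close to [0.2, 0.5, 0.3]

# Triangular density f(x) = 2x on [0, 1]; its mean is 2/3.
xs = [sample_continuous(lambda x: 2.0 * x, 0.0, 1.0, 2.0, rng)
      for _ in range(20000)]
print(round(sum(xs) / len(xs), 3))  # close to 2/3
```

    Rejection sampling only requires evaluating the density pointwise, which is why it suits arbitrary user-supplied distributions on a finite interval; its cost grows with how loosely `pdf_max` bounds the density.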

  16. Accuracy of micro four-point probe measurements on inhomogeneous samples: A probe spacing dependence study

    DEFF Research Database (Denmark)

    Wang, Fei; Petersen, Dirch Hjorth; Østerberg, Frederik Westergaard

    2009-01-01

    In this paper, we discuss a probe spacing dependence study in order to estimate the accuracy of micro four-point probe measurements on inhomogeneous samples. Based on sensitivity calculations, both sheet resistance and Hall effect measurements are studied for samples (e.g. laser annealed samples) with periodic variations of sheet resistance, sheet carrier density, and carrier mobility. With a variation wavelength of λ, probe spacings from 0.001λ to 100λ have been applied to characterize the local variations. The calculations show that the measurement error is highly dependent on the probe spacing. When the probe spacing is smaller than 1/40 of the variation wavelength, micro four-point probes can provide an accurate record of local properties with less than 1% measurement error. All the calculations agree well with previous experimental results.

  17. Optimizing incomplete sample designs for item response model parameters

    NARCIS (Netherlands)

    van der Linden, Willem J.

    Several models for optimizing incomplete sample designs with respect to information on the item parameters are presented. The following cases are considered: (1) known ability parameters; (2) unknown ability parameters; (3) item sets with multiple ability scales; and (4) response models with

  18. Temperature dependence of the calibration factor of radon and radium determination in water samples by SSNTD

    CERN Document Server

    Hunyadi, I; Hakl, J; Baradacs, E; Dezso, Z

    1999-01-01

    The sensitivity of a ²²⁶Ra determination method for water samples by SSNTD was measured as a function of storage temperature during exposure. The method is based on an etched-track-type radon monitor, which is enclosed in a gas-permeable foil and immersed in the water sample. The sample is sealed in a glass vessel and stored for an exposure time of 10-30 days. The sensitivity increased by more than a factor of two when the storage temperature was raised from 2 °C to 30 °C. The temperature dependence of the partition coefficient of radon between water and air explains this behaviour. For practical radioanalytical application, the temperature dependence of the calibration factor is given by fitting the sensitivity data obtained by measuring ²²⁶Ra standard solutions (in the activity concentration range 0.1-48.5 kBq m⁻³) at different storage temperatures.

  19. The Study on Mental Health at Work: Design and sampling.

    Science.gov (United States)

    Rose, Uwe; Schiel, Stefan; Schröder, Helmut; Kleudgen, Martin; Tophoven, Silke; Rauch, Angela; Freude, Gabriele; Müller, Grit

    2017-08-01

    The Study on Mental Health at Work (S-MGA) generates the first nationwide representative survey enabling the exploration of the relationship between working conditions, mental health and functioning. This paper describes the study design, sampling procedures and data collection, and presents a summary of the sample characteristics. S-MGA is a representative study of German employees aged 31-60 years subject to social security contributions. The sample was drawn from the employment register based on a two-stage cluster sampling procedure. Firstly, 206 municipalities were randomly selected from a pool of 12,227 municipalities in Germany. Secondly, 13,590 addresses were drawn from the selected municipalities for the purpose of conducting 4500 face-to-face interviews. The questionnaire covers psychosocial working and employment conditions, measures of mental health, work ability and functioning. Data from personal interviews were combined with employment histories from register data. Descriptive statistics of socio-demographic characteristics and logistic regression analyses were used for comparing population, gross sample and respondents. In total, 4511 face-to-face interviews were conducted. A test for sampling bias revealed that individuals in older cohorts participated more often, while individuals with an unknown educational level, residing in major cities or with a non-German ethnic background were slightly underrepresented. There is no indication of major deviations in characteristics between the basic population and the sample of respondents. Hence, S-MGA provides representative data for research on work and health, designed as a cohort study with plans to rerun the survey 5 years after the first assessment.

  1. Considerations on sample holder design and custom-made non-polarizable electrodes for Spectral Induced Polarization measurements on unsaturated soils

    Science.gov (United States)

    Kaouane, C.; Chouteau, M. C.; Fauchard, C.; Cote, P.

    2014-12-01

    Spectral Induced Polarization (SIP) is a geophysical method sensitive to water content, saturation and grain size distribution. It could be used as an alternative to nuclear probes to assess the compaction of soils in road works. To evaluate the potential of SIP as a practical tool, we designed an experiment for complex conductivity measurements on unsaturated soil samples. The literature presents a large variety of sample holders and designs, each depending on the context. Although precise descriptions of a sample holder can sometimes be found, exact replication is not always possible. Furthermore, the potential measurements are often made using custom-made Ag/AgCl electrodes, and very few indications are given on their reliability over time and temperature. Our objective is to perform complex conductivity measurements on soil samples compacted in a PVC cylindrical mould (10 cm long, 5 cm diameter) according to geotechnical standards. To obtain a homogeneous current density, electrical current is transmitted through the sample via chambers filled with agar gel. Agar gel is a good non-polarizable conductor within the frequency range of interest (1 mHz to 20 kHz), but its electrical properties are poorly characterized. We measured an increase in agar gel electrical conductivity over time and modelled the influence of this variation on the measurement; the influence is minimized if the electrodes are located on the sample. Because of the dimensions at stake and the need for a simple design, the potential electrodes are located outside the sample, hence the gel contributes to the measurements. Since the gel is fairly conductive, we expect to overestimate the sample conductivity. Potential electrodes are non-polarizable Ag/AgCl electrodes. To avoid any leakage, the KCl solution in the electrodes is replaced by saturated KCl-agar gel. These electrodes are low cost and show a low, stable self-potential (<1 mV). In addition, the electrode fabrication technique is easily reproduced, and storage and maintenance are simple.

  2. Optimum study designs.

    Science.gov (United States)

    Gu, C; Rao, D C

    2001-01-01

    Because simplistic designs will lead to prohibitively large sample sizes, the optimization of genetic study designs is critical for successfully mapping genes for complex diseases. Creative designs are necessary for detecting and amplifying the usually weak signals for complex traits. Two important outcomes of a study design--power and resolution--are implicitly tied together by the principle of uncertainty. Overemphasis on either one may lead to suboptimal designs. To achieve optimality for a particular study, therefore, practical measures such as cost-effectiveness must be used to strike a balance between power and resolution. In this light, the myriad of factors involved in study design can be checked for their effects on the ultimate outcomes, and the popular existing designs can be sorted into building blocks that may be useful for particular situations. It is hoped that imaginative construction of novel designs using such building blocks will lead to enhanced efficiency in finding genes for complex human traits.

  3. Economic burden associated with alcohol dependence in a German primary care sample: a bottom-up study

    Directory of Open Access Journals (Sweden)

    Jakob Manthey

    2016-08-01

    Full Text Available Abstract Background A considerable economic burden has been repeatedly associated with alcohol dependence (AD) – mostly calculated using aggregate data and alcohol-attributable fractions (top-down approach). However, this approach is limited by a number of assumptions, which are hard to test. Thus, cost estimates should ideally be validated with studies using individual data to estimate the same costs (bottom-up approach). However, bottom-up studies on the economic burden associated with AD are lacking. Our study aimed to fill this gap using the bottom-up approach to examine costs for AD, and also stratified the results by the following subgroups: sex, age, diagnostic approach and severity of AD, as relevant variations could be expected by these factors. Methods Sample: 1356 primary health care patients, representative of two German regions. AD was diagnosed by a standardized instrument and treating physicians. Individual costs were calculated by combining resource use and productivity data representing a period of six months prior to the time of interview, with unit costs derived from the literature or official statistics. The economic burden associated with AD was determined via excess costs by comparing utilization of various health care resources and impaired productivity between people with and without AD, controlling for relevant confounders. Additional analyses for several AD characteristics were performed. Results Mean costs among alcohol dependent patients were 50% higher compared to the remaining patients, resulting in 1836 € excess costs per alcohol dependent patient in 6 months. More than half of these excess costs were incurred through increased productivity loss among alcohol dependent patients. Treatment for alcohol problems represents only 6% of these costs. The economic burden associated with AD was incurred mainly among males and among 30- to 49-year-old patients. Both diagnostic approaches were significantly related to the

  4. Subjective and objective outcomes from new BiCROS technology in a veteran sample.

    Science.gov (United States)

    Williams, Victoria A; McArdle, Rachel A; Chisolm, Theresa H

    2012-01-01

    Overall, the objective (WIN) and subjective (SSQ, MarkeTrak, and open-ended questions) measures indicated that the new BiCROS provided better outcomes than the previous BiCROS system. In addition, an overlap of favorable results was seen across measures. Of the 39 participants, 95% reported improvements with the new BiCROS and chose to utilize the device regularly. The favorable objective and subjective outcomes indicate that the new BiCROS system is as good as, or better than, what was previously utilized by our sample of veterans. American Academy of Audiology.

  5. Estimating scaled treatment effects with multiple outcomes.

    Science.gov (United States)

    Kennedy, Edward H; Kangovi, Shreya; Mitra, Nandita

    2017-01-01

    In classical study designs, the aim is often to learn about the effects of a treatment or intervention on a single outcome; in many modern studies, however, data on multiple outcomes are collected and it is of interest to explore effects on multiple outcomes simultaneously. Such designs can be particularly useful in patient-centered research, where different outcomes might be more or less important to different patients. In this paper, we propose scaled effect measures (via potential outcomes) that translate effects on multiple outcomes to a common scale, using mean-variance and median-interquartile range based standardizations. We present efficient, nonparametric, doubly robust methods for estimating these scaled effects (and weighted average summary measures), and for testing the null hypothesis that treatment affects all outcomes equally. We also discuss methods for exploring how treatment effects depend on covariates (i.e., effect modification). In addition to describing efficiency theory for our estimands and the asymptotic behavior of our estimators, we illustrate the methods in a simulation study and a data analysis. Importantly, and in contrast to much of the literature concerning effects on multiple outcomes, our methods are nonparametric and can be used not only in randomized trials to yield increased efficiency, but also in observational studies with high-dimensional covariates to reduce confounding bias.
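    A minimal sketch of the mean-variance and median-interquartile-range standardizations described here, ignoring the paper's doubly robust estimation machinery; the two outcomes ("sbp", "chol"), their distributions, and the sample sizes are invented for illustration:

```python
import random
from statistics import mean, median, quantiles, stdev

def scaled_effects(treated, control):
    """Per-outcome treatment effects on two common scales:
    (mean difference)/SD and (median difference)/IQR, each
    standardized by the pooled sample of both arms."""
    effects = {}
    for name in treated:
        t, c = treated[name], control[name]
        pooled = t + c
        sd = stdev(pooled)
        q1, _, q3 = quantiles(pooled, n=4)
        effects[name] = {
            "mean_sd": (mean(t) - mean(c)) / sd,
            "median_iqr": (median(t) - median(c)) / (q3 - q1),
        }
    return effects

# Hypothetical trial: treatment lowers blood pressure and cholesterol.
rng = random.Random(3)
treated = {"sbp": [rng.gauss(125, 10) for _ in range(500)],
           "chol": [rng.gauss(4.8, 0.9) for _ in range(500)]}
control = {"sbp": [rng.gauss(130, 10) for _ in range(500)],
           "chol": [rng.gauss(5.0, 0.9) for _ in range(500)]}
for name, eff in scaled_effects(treated, control).items():
    print(name, {k: round(v, 2) for k, v in eff.items()})
```

    After standardization, effects measured in mmHg and mmol/L land on a common unit-free scale, which is what makes the paper's cross-outcome comparisons and weighted summary measures possible.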

  6. Trait-specific dependence in romantic relationships.

    Science.gov (United States)

    Ellis, Bruce J; Simpson, Jeffry A; Campbell, Lorne

    2002-10-01

    Informed by three theoretical frameworks--trait psychology, evolutionary psychology, and interdependence theory--we report four investigations designed to develop and test the reliability and validity of a new construct and accompanying multiscale inventory, the Trait-Specific Dependence Inventory (TSDI). The TSDI assesses comparisons between present and alternative romantic partners on major dimensions of mate value. In Study 1, principal components analyses revealed that the provisional pool of theory-generated TSDI items were represented by six factors: Agreeable/Committed, Resource Accruing Potential, Physical Prowess, Emotional Stability, Surgency, and Physical Attractiveness. In Study 2, confirmatory factor analysis replicated these results on a different sample and tested how well different structural models fit the data. Study 3 provided evidence for the convergent and discriminant validity of the six TSDI scales by correlating each one with a matched personality trait scale that did not explicitly incorporate comparisons between partners. Study 4 provided further validation evidence, revealing that the six TSDI scales successfully predicted three relationship outcome measures--love, time investment, and anger/upset--above and beyond matched sets of traditional personality trait measures. These results suggest that the TSDI is a reliable, valid, and unique construct that represents a new trait-specific method of assessing dependence in romantic relationships. The construct of trait-specific dependence is introduced and linked with other theories of mate value.

  7. The Impact of Mobile Phone Dependency on Health and Biomarkers in a Greek University Student Sample

    Directory of Open Access Journals (Sweden)

    Eugenia Minasidou

    2015-12-01

    Full Text Available Introduction: Mobile phone use can be addictive for the young. However, little is known about the behavioral and biological effects of this addiction among the student population. Aim: The purpose of this study was to explore the effects of mobile phone use on health behaviors and specific biomarkers in a sample of Greek students. Methods: The sample included 104 nursing students from a stratified randomised sample. In 30 students randomly selected out of the 104, melatonin and total antioxidant levels in saliva were also measured. Mobile phone dependence was estimated with the Mobile Phone Dependence Questionnaire by Toda et al and general health with the GHQ-28. We used the Antioxidant Assay Kit (SIGMA) to measure total antioxidant levels and the immunoenzymatic ELISA method (IBL kit) to measure melatonin levels. Analysis was performed with SPSS v15.0 software. All hypotheses were tested at the p<0.05 level. Results: No statistical difference between genders was detected (p=0.182). High dependence was apparent in 13.5% of the sample, which correlated with worse general health (p=0.004), greater alcohol consumption (p=0.007), sleep disturbances (p=0.02) and worse nutritional habits (p=0.032). Moreover, highly mobile-dependent students exhibited higher concentrations of early morning melatonin (p=0.07) and lower antioxidant concentrations (p=0.333) in saliva, compared to students with low mobile phone dependence. Conclusions: Excessive use of mobile phones among students seems to correlate with unhealthy habits and impaired health. The effects on well-known biomarkers may suggest a burden on the health of the student population. However, the long-term effects on health require further investigation.

  8. Improving tobacco dependence treatment outcomes for smokers of lower socioeconomic status: A randomized clinical trial.

    Science.gov (United States)

    Sheffer, Christine E; Bickel, Warren K; Franck, Christopher T; Panissidi, Luana; Pittman, Jami C; Stayna, Helen; Evans, Shenell

    2017-12-01

    Evidence-based treatments for tobacco dependence are significantly less effective for smokers of lower socioeconomic status, which contributes to socioeconomic disparities in smoking prevalence rates and health. We aimed to reduce the socioeconomic gradient in treatment outcomes by systematically adapting evidence-based, cognitive-behavioral treatment for tobacco dependence for diverse lower socioeconomic smokers. Participants were randomized to adapted or standard treatment, received six 1-h group treatment sessions, and were followed for six months. We examined the effectiveness of the adapted treatment in improving treatment outcomes for lower socioeconomic groups. Participants (n=227) were ethnically, racially, and socioeconomically diverse. The adapted treatment significantly increased the days to relapse for the two lowest socioeconomic groups: SES1: M=76.6 (SD 72.9) vs. 38.3 (SD 60.1) days to relapse (RR=0.63, 95% CI 0.45, 0.88, p=0.0013); SES2: M=88.2 (SD 67.3) vs. 40.1 (SD 62.6) days to relapse (RR=0.57, 95% CI 0.18, 0.70, p=0.0024). Interactions between socioeconomic status and condition were significant for initial abstinence (OR=1.26, 95% CI 1.09, 1.46, p=0.002) and approached significance for 3-month abstinence (OR=0.90, 95% CI 0.80, 1.01). The adapted treatment improved outcomes for lower socioeconomic smokers and reduced inequities in days to relapse. Novel methods of providing targeted extended support are needed to improve long-term outcomes. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Frequency-Dependent Streaming Potential of Porous Media—Part 1: Experimental Approaches and Apparatus Design

    Directory of Open Access Journals (Sweden)

    P. W. J. Glover

    2012-01-01

    Full Text Available Electrokinetic phenomena link fluid flow and electrical flow in porous and fractured media such that a hydraulic flow will generate an electrical current and vice versa. Such a link is likely to be extremely useful, especially in the development of the electroseismic method. However, surprisingly few experimental measurements have been carried out, particularly as a function of frequency, because of their difficulty. Here we have considered six different approaches to making laboratory determinations of the frequency-dependent streaming potential coefficient. In each case, we have analyzed the mechanical, electrical, and other technical difficulties involved in each method. We conclude that the electromagnetic drive is currently the only approach that is practicable, while the piezoelectric drive may be useful for low permeability samples and at specified high frequencies. We have used the electromagnetic drive approach to design, build, and test an apparatus for measuring the streaming potential coefficient of unconsolidated and disaggregated samples such as sands, gravels, and soils with a diameter of 25.4 mm and lengths between 50 mm and 300 mm.

  10. Sampling design for long-term regional trends in marine rocky intertidal communities

    Science.gov (United States)

    Irvine, Gail V.; Shelley, Alice

    2013-01-01

    Probability-based designs reduce bias and allow inference of results to the pool of sites from which they were chosen. We developed and tested probability-based designs for monitoring marine rocky intertidal assemblages at Glacier Bay National Park and Preserve (GLBA), Alaska. A multilevel design was used that varied in scale and inference. The levels included aerial surveys, extensive sampling of 25 sites, and more intensive sampling of 6 sites. Aerial surveys of a subset of intertidal habitat indicated that the original target habitat of bedrock-dominated sites with slope ≤30° was rare. This unexpected finding illustrated one value of probability-based surveys and led to a shift in the target habitat type to include steeper, more mixed rocky habitat. Subsequently, we evaluated the statistical power of different sampling methods and sampling strategies to detect changes in the abundances of the predominant sessile intertidal taxa: barnacles Balanomorpha, the mussel Mytilus trossulus, and the rockweed Fucus distichus subsp. evanescens. There was greatest power to detect trends in Mytilus and lesser power for barnacles and Fucus. Because of its greater power, the extensive, coarse-grained sampling scheme was adopted in subsequent years over the intensive, fine-grained scheme. The sampling attributes that had the largest effects on power included sampling of “vertical” line transects (vs. horizontal line transects or quadrats) and increasing the number of sites. We also evaluated the power of several management-set parameters. Given equal sampling effort, sampling more sites fewer times had greater power. The information gained through intertidal monitoring is likely to be useful in assessing changes due to climate, including ocean acidification; invasive species; trampling effects; and oil spills.
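
    The trade-off described above (given equal effort, sampling more sites fewer times can have greater power to detect a trend) can be explored with a small Monte Carlo sketch. This is an illustrative simplification, not the authors' analysis: the variance components, effect size, and the naive pooled OLS trend test are all assumed for illustration, and a real analysis would model within-site correlation (e.g. with a mixed model).

```python
import math
import random

def trend_power(n_sites, n_years, slope, site_sd=1.0, resid_sd=1.0,
                crit=1.96, n_sims=300, seed=7):
    """Monte Carlo power of a naive pooled OLS trend test on simulated
    abundance data with site-to-site variation (hypothetical values)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        xs, ys = [], []
        # each site has a persistent offset; all sites are revisited each year
        site_effects = [rng.gauss(0.0, site_sd) for _ in range(n_sites)]
        for year in range(n_years):
            for eff in site_effects:
                xs.append(float(year))
                ys.append(slope * year + eff + rng.gauss(0.0, resid_sd))
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxx = sum((x - mx) ** 2 for x in xs)
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        b = sxy / sxx                       # OLS slope (trend estimate)
        sse = sum((y - my - b * (x - mx)) ** 2 for x, y in zip(xs, ys))
        se = math.sqrt(sse / ((n - 2) * sxx))
        if abs(b / se) > crit:              # normal-approximation test
            hits += 1
    return hits / n_sims
```

    With this sketch, designs of equal total effort can be compared directly, e.g. `trend_power(25, 4, 0.3)` (more sites, fewer visits) against `trend_power(10, 10, 0.3)` (fewer sites, more visits).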

  11. Quantification of errors in ordinal outcome scales using shannon entropy: effect on sample size calculations.

    Science.gov (United States)

    Mandava, Pitchaiah; Krumpelman, Chase S; Shah, Jharna N; White, Donna L; Kent, Thomas A

    2013-01-01

    Clinical trial outcomes often involve an ordinal scale of subjective functional assessments, but the optimal way to quantify results is not clear. In stroke, for the most commonly used scale, the modified Rankin Score (mRS), a range of scores ("Shift") is proposed as superior to dichotomization because of greater information transfer. The influence of known uncertainties in mRS assessment has not been quantified. We hypothesized that errors caused by uncertainties could be quantified by applying information theory. Using Shannon's model, we quantified errors of the "Shift" compared to dichotomized outcomes using published distributions of mRS uncertainties and applied this model to clinical trials. We identified 35 randomized stroke trials that met inclusion criteria. Each trial's mRS distribution was multiplied with the noise distribution from published mRS inter-rater variability to generate an error percentage for "shift" and dichotomized cut-points. For the SAINT I neuroprotectant trial, considered positive by "shift" mRS while the larger follow-up SAINT II trial was negative, we recalculated the sample size required if classification uncertainty was taken into account. Considering the full mRS range, the error rate was 26.1%±5.31 (mean±SD). Error rates were lower for all dichotomizations tested using cut-points (e.g. mRS 1: 6.8%±2.89; overall p=…); the greater information transfer of the full-range "Shift" thus comes at the cost of a decrease in reliability. The resultant errors need to be considered since sample size may otherwise be underestimated. In principle, we have outlined an approach to error estimation for any condition in which there are uncertainties in outcome assessment. We provide the user with programs to calculate and incorporate errors into sample size estimation.
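
    The core computation described in the abstract (weighting an outcome distribution by a rater noise distribution to obtain an error percentage) can be illustrated with a toy example. The 3-level scale and the confusion probabilities below are hypothetical values, not the published mRS inter-rater data.

```python
def misclassification_error(outcome_dist, confusion):
    """P(rated score != true score) = sum_k P(true=k) * P(rated != k | true=k)."""
    return sum(p_k * (1.0 - confusion[k][k])
               for k, p_k in enumerate(outcome_dist))

def dichotomized_error(outcome_dist, confusion, cut):
    """Error rate after dichotomizing at `cut` (score <= cut counts as 'good'):
    only ratings that cross the cut-point count as misclassifications."""
    err = 0.0
    for k, p_k in enumerate(outcome_dist):
        for j, p_jk in enumerate(confusion[k]):
            if (j <= cut) != (k <= cut):
                err += p_k * p_jk
    return err

# Toy 3-level scale: P(true score) and rows of P(rated j | true k)
dist = [0.3, 0.4, 0.3]
confusion = [[0.90, 0.10, 0.00],
             [0.05, 0.90, 0.05],
             [0.00, 0.10, 0.90]]
full_err = misclassification_error(dist, confusion)    # 0.10
dich_err = dichotomized_error(dist, confusion, cut=0)  # 0.05
```

    Consistent with the abstract's finding, only errors that cross the cut-point affect a dichotomized analysis, so its error rate is lower than that of the full-range "shift" analysis.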

  12. Outcomes of Zika virus infection during pregnancy: contributions to the debate on the efficiency of cohort studies.

    Science.gov (United States)

    Duarte, Elisabeth Carmen; Garcia, Leila Posenato; de Araújo, Wildo Navegantes; Velez, Maria P

    2017-12-02

    Zika virus infection during pregnancy (ZIKVP) is known to be associated with adverse outcomes. Studies on this matter involve both rare outcomes and rare exposures, and methodological choices are not straightforward. Cohort studies will surely offer more robust evidence, but their efficiency must be enhanced. We aim to contribute to the debate on sample selection strategies in cohort studies to assess outcomes associated with ZIKVP. A study can be statistically more efficient than another if its estimates are more accurate (precise and valid), even if the studies involve the same number of subjects. Sample size and specific design strategies can enhance or impair the statistical efficiency of a study, depending on how the subjects are distributed in subgroups pertinent to the analysis. In most ZIKVP cohort studies to date there is an a priori identification of the source population (pregnant women, regardless of their exposure status), which is then sampled or included in its entirety (census). Subsequently, the group of pregnant women is classified according to exposure (presence or absence of ZIKVP), respecting the exposed:unexposed ratio in the source population. We propose that the sample selection be done from the a priori identification of groups of pregnant women exposed and unexposed to ZIKVP. This method will allow for an oversampling (even 100%) of the pregnant women with ZIKVP and an optimized sampling from the general population of pregnant women unexposed to ZIKVP, saving resources in the unexposed group and improving the expected number of incident cases (outcomes) overall. We hope that this proposal will broaden the methodological debate on the improvement of statistical power and protocol harmonization of cohort studies that aim to evaluate the association between Zika infection during pregnancy and outcomes for the offspring, as well as those with similar objectives.
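
    The efficiency argument can be made concrete with a standard delta-method approximation for the variance of a log risk ratio in a cohort: with total enrollment fixed, oversampling the exposed group shrinks the standard error. The prevalence and risk values below are hypothetical, chosen only to mimic a rare-exposure, rare-outcome setting.

```python
import math

def se_log_rr(n_exposed, n_unexposed, risk_exposed, risk_unexposed):
    """Approximate (delta-method) SE of log(RR) for a cohort with binomial outcomes."""
    return math.sqrt((1 - risk_exposed) / (n_exposed * risk_exposed)
                     + (1 - risk_unexposed) / (n_unexposed * risk_unexposed))

N = 10_000                        # total pregnancies enrolled (hypothetical)
prev, r1, r0 = 0.02, 0.05, 0.005  # exposure prevalence, risks in exposed/unexposed

# Census of the source population: exposed fraction equals the prevalence
se_census = se_log_rr(N * prev, N * (1 - prev), r1, r0)
# Exposure-stratified design: oversample ZIKVP pregnancies to a 50:50 split
se_stratified = se_log_rr(N / 2, N / 2, r1, r0)
```

    With these inputs the stratified design yields a markedly smaller standard error for the same total sample size, which is the efficiency gain the authors argue for.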

  13. Design of shell-and-tube heat exchangers when the fouling depends on local temperature and velocity

    Energy Technology Data Exchange (ETDEWEB)

    Butterworth, D. [HTFS, Hyprotech, Didcot (United Kingdom)

    2002-07-01

    Shell-and-tube heat exchangers are normally designed on the basis of a uniform and constant fouling resistance that is specified in advance by the exchanger user. The design process is then one of determining the best exchanger that will achieve the thermal duty within the specified pressure drop constraints. It has been shown in previous papers [Designing shell-and-tube heat exchangers with velocity-dependent fouling, 34th US National Heat Transfer Conference, 20-22 August 2000, Pittsburgh, PA; Designing shell-and-tube heat exchangers with velocity-dependent fouling, 2nd Int. Conf. on Petroleum and Gas Phase Behavior and Fouling, 27-31 August 2000, Copenhagen] that this approach can be extended to the design of exchangers where the design fouling resistance depends on velocity. The current paper briefly reviews the main findings of the previous papers and goes on to treat the case where the fouling depends also on the local temperatures. The Ebert-Panchal [Analysis of Exxon crude-oil, slip-stream coking data, Engineering Foundation Conference on Fouling Mitigation of Heat Exchangers, 18-23 June 1995, California] form of the fouling rate equation is used to evaluate this fouling dependence. When allowing for temperature effects, it becomes difficult to divorce the design from the way the exchanger will be operated up to the point when the design fouling is achieved. However, rational ways of separating the design from the operation are proposed. (author)

  14. The impact of parental alcohol dependence on the development and behavior outcome of children in a tertiary care hospital

    Directory of Open Access Journals (Sweden)

    Jasmeet Sidhu

    2016-01-01

    Full Text Available Parents play a pivotal role in bringing up a child and shaping their future. However, children of alcoholics (COAs) suffer due to their parent's dependence pattern. The domains affected encompass the cognitive, behavioural, psychological, emotional and social spheres. This study was designed to assess the impact of alcohol dependence in the parent on the development and behaviour of their children, so that further steps could be taken to minimize the negative influences. Aims: To study the impact of parental alcohol dependence on the development and behaviour outcome of children in various domains, along with the effect of the family environment. Materials and Methods: The study was a cross-sectional observational study conducted at a tertiary care teaching hospital on 25 children between 6 and 18 years of age, at least one of whose parents was diagnosed as alcohol dependent. The other parent was assessed using the General Health Questionnaire-28. The Child Behaviour Checklist and the Family Evaluation Scale (FES) were then applied. Statistical Analysis Used: The analysis was done according to the manuals provided with the respective scales to calculate the scores. Results: Both male and female COAs had high externalizing and internalizing scores. The girls had higher internalizing scores, while the boys had higher externalizing scores. The FES showed dysfunction in all three dimensions, namely relationship, personal growth and system maintenance. Conclusions: Our study corroborates the findings of studies done in the past on COAs. The COAs face various affective, anxiety, somatic, attention-deficit/hyperactivity, oppositional defiant and conduct problems.

  15. Sample size and number of outcome measures of veterinary randomised controlled trials of pharmaceutical interventions funded by different sources, a cross-sectional study.

    Science.gov (United States)

    Wareham, K J; Hyde, R M; Grindlay, D; Brennan, M L; Dean, R S

    2017-10-04

    Randomised controlled trials (RCTs) are a key component of the veterinary evidence base. Sample sizes and defined outcome measures are crucial components of RCTs. The objective was to describe the sample size and number of outcome measures of veterinary RCTs published in 2011, comparing trials funded by the pharmaceutical industry with those that were not. A structured search of PubMed identified RCTs examining the efficacy of pharmaceutical interventions. The number of outcome measures, the number of animals enrolled per trial, whether a primary outcome was identified, and the presence of a sample size calculation were extracted from the RCTs. The source of funding was identified for each trial and the groups were compared on the above parameters. Literature searches returned 972 papers; 86 papers comprising 126 individual trials were analysed. The median number of outcomes per trial was 5.0; there were no significant differences across funding groups (p = 0.133). The median number of animals enrolled per trial was 30.0; this was similar across funding groups (p = 0.302). A primary outcome was identified in 40.5% of trials and was significantly more likely to be stated in trials funded by a pharmaceutical company. A very low percentage of trials reported a sample size calculation (14.3%). Failure to report primary outcomes, failure to justify sample sizes, and the reporting of multiple outcome measures were common features in all of the clinical trials examined in this study. It is possible that some of these factors may be affected by the source of funding of the studies, but the influence of funding needs to be explored with a larger number of trials. Some veterinary RCTs provide a weak evidence base, and targeted strategies are required to improve the quality of veterinary RCTs to ensure there is reliable evidence on which to base clinical decisions.
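
    For a binary outcome compared between two arms, the sample size calculation that most of the reviewed trials failed to report is short. The sketch below uses the standard normal approximation for two proportions; the cure rates in the usage example are hypothetical, not data from the reviewed trials.

```python
import math

def n_per_group(p1, p2, z_alpha=1.96, z_beta=0.8416):
    """Animals per arm to detect p1 vs p2 with two-sided alpha = 0.05 and
    80% power (normal approximation for comparing two proportions)."""
    pooled_var = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * pooled_var / (p1 - p2) ** 2)
```

    For example, detecting an improvement from a 50% to a 70% cure rate requires about 91 animals per arm, while a smaller 50% vs 60% difference requires several times more, which is why unjustified small samples are a concern.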

  16. Family violence in a sample of treatment-seeking gamblers: the effect of having dependent children.

    Science.gov (United States)

    Bellringer, Maria; Pearson, Janet; du Preez, Katie Palmer; Wilson, Denise; Koziol-McLain, Jane; Garrett, Nick; Abbott, Max

    2017-01-01

    This study investigated the effect of problem gambler gender on the relationship between the gambler having dependent children (younger than 18 years) living at home and the gambler perpetrating or being a victim of family violence. The sample comprised 164 help-seeking gamblers (43% female; 37% with dependent child/ren) recruited from three national gambling treatment services in New Zealand. Family violence was measured using a modified version of the HITS scale covering physical, psychological, verbal, emotional and sexual violence. Forty-nine percent of participants reported being a victim of violence and 43% had perpetrated violence. Multivariable logistic regression modelling was conducted, adjusting in sequence for significant socio-demographic, psychosocial and gambling factors. The relationship between having dependent children and being a victim of family violence was gender-related. Female gamblers living with dependent children reported more family violence perpetration and victimisation than male gamblers living with dependent children. Female gamblers with dependent children living at home had greater odds of being a victim of family violence than male gamblers without dependent children living at home. This relationship remained when adjusted for contextual factors of being a victim (ethnicity, income support status, and feelings of inadequacy) in this sample. A similar gender effect of having dependent children living at home on violence perpetration disappeared when known psychosocial contextual factors of violence perpetration (aggression, difficulties in emotion regulation, drug issue in the family, and interpersonal support) were taken into account. These findings suggest the value of coordinated approaches between gambling treatment services and programmes supporting vulnerable families in order to identify vulnerable families and put support mechanisms in place.

  17. Family violence in a sample of treatment-seeking gamblers: the effect of having dependent children

    Directory of Open Access Journals (Sweden)

    Maria Bellringer

    2017-10-01

    Full Text Available Abstract This study investigated the effect of problem gambler gender on the relationship between the gambler having dependent children (younger than 18 years) living at home and the gambler perpetrating or being a victim of family violence. The sample comprised 164 help-seeking gamblers (43% female; 37% with dependent child/ren) recruited from three national gambling treatment services in New Zealand. Family violence was measured using a modified version of the HITS scale covering physical, psychological, verbal, emotional and sexual violence. Forty-nine percent of participants reported being a victim of violence and 43% had perpetrated violence. Multivariable logistic regression modelling was conducted, adjusting in sequence for significant socio-demographic, psychosocial and gambling factors. The relationship between having dependent children and being a victim of family violence was gender-related. Female gamblers living with dependent children reported more family violence perpetration and victimisation than male gamblers living with dependent children. Female gamblers with dependent children living at home had greater odds of being a victim of family violence than male gamblers without dependent children living at home. This relationship remained when adjusted for contextual factors of being a victim (ethnicity, income support status, and feelings of inadequacy) in this sample. A similar gender effect of having dependent children living at home on violence perpetration disappeared when known psychosocial contextual factors of violence perpetration (aggression, difficulties in emotion regulation, drug issue in the family, and interpersonal support) were taken into account. These findings suggest the value of coordinated approaches between gambling treatment services and programmes supporting vulnerable families in order to identify vulnerable families and put support mechanisms in place.

  18. Occurrence of multiple mental health or substance use outcomes among bisexuals: a respondent-driven sampling study

    Directory of Open Access Journals (Sweden)

    Greta R. Bauer

    2016-06-01

    Full Text Available Abstract Background Bisexual populations have higher prevalence of depression, anxiety, suicidality and substance use than heterosexuals, and often than gay men or lesbians. The co-occurrence of multiple outcomes has rarely been studied. Methods Data were collected from 405 bisexuals using respondent-driven sampling. Weighted analyses were conducted for the 387 with outcome data. Multiple outcomes were defined as 3 or more of: depression, anxiety, suicide ideation, problematic alcohol use, or polysubstance use. Results Among bisexuals, 19.0% had multiple outcomes. We did not find variation in the raw frequency of multiple outcomes across sociodemographic variables (e.g. gender, age). After adjustment, gender and sexual orientation identity were associated, with transgender women and those identifying as bisexual only more likely to have multiple outcomes. Social equity factors had a strong impact in both crude and adjusted analysis: controlling for other factors, high mental health/substance use burden was associated with greater discrimination (prevalence risk ratio (PRR) = 5.71; 95% CI: 2.08, 15.63) and lower education (PRR = 2.41; 95% CI: 1.06, 5.49), while a higher income-to-needs ratio was protective (PRR = 0.44; 95% CI: 0.20, 1.00). Conclusions Mental health and substance use outcomes with high prevalence among bisexuals frequently co-occurred. We find some support for the theory that these multiple outcomes represent a syndemic, defined as co-occurring and mutually reinforcing adverse outcomes driven by social inequity.

  19. Does Power Distance Exacerbate or Mitigate the Effects of Abusive Supervision? It Depends on the Outcome

    Science.gov (United States)

    Lian, Huiwen; Ferris, D. Lance; Brown, Douglas J.

    2012-01-01

    We predicted that the effects of abusive supervision are likely to be moderated by subordinate power distance orientation and that the nature of the moderating effect will depend on the outcome. Drawing upon work suggesting that high power distance orientation subordinates are more tolerant of supervisory mistreatment, we posited that high power…

  20. Comparative analysis of whole mount processing and systematic sampling of radical prostatectomy specimens: pathological outcomes and risk of biochemical recurrence.

    Science.gov (United States)

    Salem, Shady; Chang, Sam S; Clark, Peter E; Davis, Rodney; Herrell, S Duke; Kordan, Yakup; Wills, Marcia L; Shappell, Scott B; Baumgartner, Roxelyn; Phillips, Sharon; Smith, Joseph A; Cookson, Michael S; Barocas, Daniel A

    2010-10-01

    Whole mount processing is more resource intensive than routine systematic sampling of radical retropubic prostatectomy specimens. We compared whole mount and systematic sampling for detecting pathological outcomes, and compared the prognostic value of pathological findings across pathological methods. We included men (608 whole mount and 525 systematic sampling samples) with no prior treatment who underwent radical retropubic prostatectomy at Vanderbilt University Medical Center between January 2000 and June 2008. We used univariate and multivariate analysis to compare the pathological outcome detection rate between pathological methods. Kaplan-Meier curves and the log rank test were used to compare the prognostic value of pathological findings across pathological methods. There were no significant differences between the whole mount and the systematic sampling groups in detecting extraprostatic extension (25% vs 30%), positive surgical margins (31% vs 31%), pathological Gleason score less than 7 (49% vs 43%), 7 (39% vs 43%) or greater than 7 (12% vs 13%), seminal vesicle invasion (8% vs 10%) or lymph node involvement (3% vs 5%). Tumor volume was higher in the systematic sampling group and whole mount detected more multiple surgical margins (each p=…). Whole mount and systematic sampling yield similar pathological information. Each method stratifies patients into comparable risk groups for biochemical recurrence. Thus, while whole mount is more resource intensive, it does not appear to result in improved detection of clinically important pathological outcomes or prognostication. Copyright © 2010 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  1. Can Research Design Explain Variation in Head Start Research Results? A Meta-Analysis of Cognitive and Achievement Outcomes

    Science.gov (United States)

    Shager, Hilary M.; Schindler, Holly S.; Magnuson, Katherine A.; Duncan, Greg J.; Yoshikawa, Hirokazu; Hart, Cassandra M. D.

    2013-01-01

    This study explores the extent to which differences in research design explain variation in Head Start program impacts. We employ meta-analytic techniques to predict effect sizes for cognitive and achievement outcomes as a function of the type and rigor of research design, quality and type of outcome measure, activity level of control group, and…

  2. Subjective and objective outcomes in randomized clinical trials

    DEFF Research Database (Denmark)

    Moustgaard, Helene; Bello, Segun; Miller, Franklin G

    2014-01-01

    OBJECTIVES: The degree of bias in randomized clinical trials varies depending on whether the outcome is subjective or objective. Assessment of the risk of bias in a clinical trial will therefore often involve categorization of the type of outcome. Our primary aim was to examine how the concepts "subjective outcome" and "objective outcome" are defined in methodological publications and clinical trial reports. To put this examination into perspective, we also provide an overview of how outcomes are classified more broadly. STUDY DESIGN AND SETTING: A systematic review of methodological publications … explicitly defined the terms. CONCLUSION: The terms "subjective" and "objective" are ambiguous when used to describe outcomes in randomized clinical trials. We suggest that the terms should be defined explicitly when used in connection with the assessment of risk of bias in a clinical trial.

  3. Patterns of drug dependence in a Queensland (Australia) sample of Indigenous and non-Indigenous people who inject drugs.

    Science.gov (United States)

    Smirnov, Andrew; Kemp, Robert; Ward, James; Henderson, Suzanna; Williams, Sidney; Dev, Abhilash; Najman, Jake M

    2016-09-01

    Despite over-representation of Indigenous Australians in sentinel studies of injecting drug use, little is known about relevant patterns of drug use and dependence. This study compares drug dependence and possible contributing factors in Indigenous and non-Indigenous Australians who inject drugs. Respondent-driven sampling was used in major cities and 'peer recruitment' in regional towns of Queensland to obtain a community sample of Indigenous (n = 282) and non-Indigenous (n = 267) injectors. Data are cross-sectional. Multinomial models were developed for each group to examine types of dependence on injected drugs (no dependence, methamphetamine-dependent only, opioid-dependent only, dependent on methamphetamine and opioids). Around one-fifth of Indigenous and non-Indigenous injectors were dependent on both methamphetamine and opioids in the previous 12 months. Psychological distress was associated with dual dependence on these drugs for Indigenous [adjusted relative risk (ARR) 4.86, 95% confidence interval (CI) 2.08-11.34] and non-Indigenous (ARR 4.14, 95% CI 1.59-10.78) participants. Unemployment (ARR 8.98, 95% CI 2.25-35.82) and repeated (> once) incarceration as an adult (ARR 3.78, 95% CI 1.43-9.97) were associated with dual dependence for Indigenous participants only. Indigenous participants had high rates of alcohol dependence, except for those dependent on opioids only. The drug dependence patterns of Indigenous and non-Indigenous people who inject drugs were similar, including the proportions dependent on both methamphetamine and opioids. However, for Indigenous injectors, there was a stronger association between drug dependence and contextual factors such as unemployment and incarceration. Expansion of treatment options and community-level programs may be required. [Smirnov A, Kemp R, Ward J, Henderson S, Williams S, Dev A, Najman J M. Patterns of drug dependence in a Queensland (Australia) sample of Indigenous and non-Indigenous people who inject drugs.]

  4. An internal pilot design for prospective cancer screening trials with unknown disease prevalence.

    Science.gov (United States)

    Brinton, John T; Ringham, Brandy M; Glueck, Deborah H

    2015-10-13

    For studies that compare the diagnostic accuracy of two screening tests, the sample size depends on the prevalence of disease in the study population, and on the variance of the outcome. Both parameters may be unknown during the design stage, which makes finding an accurate sample size difficult. To solve this problem, we propose adapting an internal pilot design. In this adapted design, researchers will accrue some percentage of the planned sample size, then estimate both the disease prevalence and the variances of the screening tests. The updated estimates of the disease prevalence and variance are used to conduct a more accurate power and sample size calculation. We demonstrate that in large samples, the adapted internal pilot design produces no Type I inflation. For small samples (N less than 50), we introduce a novel adjustment of the critical value to control the Type I error rate. We apply the method to two proposed prospective cancer screening studies: 1) a small oral cancer screening study in individuals with Fanconi anemia and 2) a large oral cancer screening trial. Conducting an internal pilot study without adjusting the critical value can cause Type I error rate inflation in small samples, but not in large samples. An internal pilot approach usually achieves goal power and, for most studies with sample size greater than 50, requires no Type I error correction. Further, we have provided a flexible and accurate approach to bound Type I error below a goal level for studies with small sample size.
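
    The update step of such an internal pilot can be sketched as follows. This is an illustrative simplification of the general idea (re-estimate the disease prevalence and variability from the pilot, then recompute the enrollment target), not the authors' exact procedure; the sensitivities and prevalence below are hypothetical.

```python
import math

def updated_total_n(prev_hat, sens_a, sens_b, z_alpha=1.96, z_beta=0.8416):
    """Recompute total enrollment after the pilot: diseased subjects needed
    to compare two screening-test sensitivities (normal approximation,
    two-sided alpha = 0.05, 80% power), inflated by the pilot estimate of
    disease prevalence to give the total number to enroll."""
    var = sens_a * (1 - sens_a) + sens_b * (1 - sens_b)
    n_diseased = (z_alpha + z_beta) ** 2 * var / (sens_a - sens_b) ** 2
    return math.ceil(n_diseased / prev_hat)
```

    For example, with a pilot prevalence estimate of 10% and sensitivities of 0.7 vs 0.9, roughly 590 subjects are needed in total; a lower prevalence estimate drives the target sharply upward, which is exactly why re-estimating it mid-study matters.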

  5. Impulsivity and attentional bias as predictors of modafinil treatment outcome for retention and drug use in crack-cocaine dependent patients: Results of a randomised controlled trial

    NARCIS (Netherlands)

    Nuijten, Mascha; Blanken, Peter; van den Brink, Wim; Goudriaan, Anna E.; Hendriks, Vincent M.

    2016-01-01

    High impulsivity and attentional bias are common in cocaine-dependent patients and predict poor treatment outcomes. The pharmacological agent modafinil is studied for its cognitive-enhancing capacities and may therefore improve clinical outcomes in crack-cocaine dependent patients. In this study, we

  6. Infill sampling criteria to locate extremes

    CSIR Research Space (South Africa)

    Watson, AG

    1995-07-01

    Full Text Available Three problem-dependent meanings for engineering ''extremes'' are motivated, established, and translated into formal geostatistical (model-based) criteria for designing infill sample networks. (I) Locate an area within the domain of interest where a...

  7. Sampling inspection for the evaluation of time-dependent reliability of deteriorating systems under imperfect defect detection

    International Nuclear Information System (INIS)

    Kuniewski, Sebastian P.; Weide, Johannes A.M. van der; Noortwijk, Jan M. van

    2009-01-01

    The paper presents a sampling-inspection strategy for the evaluation of time-dependent reliability of deteriorating systems, where the deterioration is assumed to initiate at random times and at random locations. After initiation, defects are weakening the system's resistance. The system becomes unacceptable when at least one defect reaches a critical depth. The defects are assumed to initiate at random times modeled as event times of a non-homogeneous Poisson process (NHPP) and to develop according to a non-decreasing time-dependent gamma process. The intensity rate of the NHPP is assumed to be a combination of a known time-dependent shape function and an unknown proportionality constant. When sampling inspection (i.e. inspection of a selected subregion of the system) results in a number of defect initiations, Bayes' theorem can be used to update prior beliefs about the proportionality constant of the NHPP intensity rate to the posterior distribution. On the basis of a time- and space-dependent Poisson process for the defect initiation, an adaptive Bayesian model for sampling inspection is developed to determine the predictive probability distribution of the time to failure. A potential application is, for instance, the inspection of a large vessel or pipeline suffering pitting/localized corrosion in the oil industry. The possibility of imperfect defect detection is also incorporated in the model.
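
    The Bayesian update described above — a known time-dependent shape function g(t) scaled by an unknown proportionality constant — is conjugate when the constant is given a gamma prior, because the defect count in the inspected subregion is Poisson. A minimal sketch, with all prior parameters and observed counts hypothetical:

```python
def update_intensity_constant(prior_shape, prior_rate, k_defects, exposure):
    """Gamma(prior_shape, prior_rate) prior on the NHPP constant c, where the
    initiation intensity is c * g(t). Observing k_defects initiations over an
    integrated inspected exposure m = (inspected fraction) * integral of g(t) dt
    gives the posterior Gamma(prior_shape + k_defects, prior_rate + m)."""
    return prior_shape + k_defects, prior_rate + exposure

# Hypothetical numbers: prior mean c = 2/4 = 0.5; 6 initiations over exposure 10
shape, rate = update_intensity_constant(2.0, 4.0, 6, 10.0)
posterior_mean = shape / rate
```

    The posterior on c then feeds the predictive distribution of the time to failure: more observed initiations shift the posterior mean upward, shortening the predicted remaining life.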

  8. Diffusion Capillary Phantom vs. Human Data: Outcomes for Reconstruction Methods Depend on Evaluation Medium

    Directory of Open Access Journals (Sweden)

    Sarah D. Lichenstein

    2016-09-01

    Full Text Available Purpose: Diffusion MRI provides a non-invasive way of estimating structural connectivity in the brain. Many studies have used diffusion phantoms as benchmarks to assess the performance of different tractography reconstruction algorithms and assumed that the results can be applied to in vivo studies. Here we examined whether quality metrics derived from a common, publicly available, diffusion phantom can reliably predict tractography performance in human white matter tissue. Material and Methods: We compared estimates of fiber length and fiber crossing among a simple tensor model (diffusion tensor imaging), a more complicated model (ball-and-sticks) and model-free (diffusion spectrum imaging, generalized q-sampling imaging) reconstruction methods using a capillary phantom and in vivo human data (N=14). Results: Our analysis showed that evaluation outcomes differ depending on whether they were obtained from phantom or human data. Specifically, the diffusion phantom favored a more complicated model over a simple tensor model or model-free methods for resolving crossing fibers. On the other hand, the human studies showed the opposite pattern of results, with the model-free methods being more advantageous than model-based methods or simple tensor models. This performance difference was consistent across several metrics, including estimating fiber length and resolving fiber crossings in established white matter pathways. Conclusions: These findings indicate that the construction of current capillary diffusion phantoms tends to favor complicated reconstruction models over a simple tensor model or model-free methods, whereas the in vivo data tend to produce opposite results. This brings into question previous phantom-based evaluation approaches and suggests that a more realistic phantom or simulation is necessary to accurately predict the relative performance of different tractography reconstruction methods. Acronyms: BSM: ball-and-sticks model; d

  9. Design of a gravity corer for near shore sediment sampling

    Digital Repository Service at National Institute of Oceanography (India)

    Bhat, S.T.; Sonawane, A.V.; Nayak, B.U.

    For the purpose of geotechnical investigation a gravity corer has been designed and fabricated to obtain undisturbed sediment core samples from near shore waters. The corer was successfully operated at 75 stations up to water depth 30 m. Simplicity...

  10. ESPRIT study design and outcomes--a critical appraisal.

    Science.gov (United States)

    Einhäupl, Karl

    2007-02-01

    Evidence is needed to guide therapeutic decisions on patients who had ischaemic cerebral events. The recently published European/Australasian Stroke Prevention in Reversible Ischaemia Trial (ESPRIT), an open-label randomised controlled study, compared long-term treatment of patients randomised to aspirin 30-325 mg daily with (n = 1363) or without (n = 1376) dipyridamole 200 mg twice daily. The study found the combination to be superior to aspirin alone (13% vs. 16% events in a composite endpoint of vascular death, non-fatal stroke, non-fatal myocardial infarction, or major bleeding; hazard ratio 0.8; 95% confidence interval 0.66-0.98). In the interpretation of the results, criticism has been raised related to the study design (open-label, change during the study), the study conduct (half of the aspirin patients underdosed, 33% drop-out rate in the combination group, missing information on potential confounders such as protective concomitant medication), and the outcomes (lack of differences in the efficacy outcomes between the intent-to-treat and the on-treatment populations, lack of differences in minor bleedings between treatment groups, borderline statistical significance of primary study endpoint). Further studies are needed to determine the place of aspirin/dipyridamole combinations in the secondary prevention of stroke.
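    The borderline significance noted above can be illustrated with a back-of-the-envelope calculation from the published event rates. This is only a sketch: the trial reported a Cox hazard ratio, whereas the code below uses a simple risk-ratio approximation, and the event counts are reconstructed from the reported percentages (13% of 1363 and 16% of 1376).

```python
import math

def risk_ratio_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Approximate risk ratio and 95% CI via the log-normal approximation."""
    rr = (events_a / n_a) / (events_b / n_b)
    se = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    lo, hi = (math.exp(math.log(rr) + s * z * se) for s in (-1, 1))
    return rr, lo, hi

# Reconstructed counts: ~13% of 1363 combination patients vs ~16% of 1376 on aspirin alone
rr, lo, hi = risk_ratio_ci(177, 1363, 220, 1376)
print(f"RR={rr:.2f}, 95% CI {lo:.2f}-{hi:.2f}")  # upper limit just below 1.0
```

    The upper confidence limit sitting just under 1.0 mirrors the "borderline statistical significance" criticism raised in the appraisal.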

  11. A binary logistic regression model with complex sampling design of ...

    African Journals Online (AJOL)

    2017-09-03

    Sep 3, 2017 ... Bi-variable and multi-variable binary logistic regression models with complex sampling design were fitted. ... Data were entered into STATA-12 and analyzed using SPSS-21. ... lack of access/too far or costs too much.

  12. Modelling of Context: Designing Mobile Systems from Domain-Dependent Models

    DEFF Research Database (Denmark)

    Nielsen, Peter Axel; Stage, Jan

    2009-01-01

    Modelling of domain-dependent aspects is a key prerequisite for the design of software for mobile systems. Most mobile systems include a more or less advanced model of selected aspects of the domain in which they are used. This paper discusses the creation of such a model and its relevance for te...

  13. Service Design Outcomes in Finnish Book Industry : From Transition to Transformation

    OpenAIRE

    Nousiainen, Anu K.

    2013-01-01

    Service Design Outcomes in Finnish Book Industry - From Transition to Transformation Book industry is in transition especially due to technology advancements, converging operational environment and evolving consumer values and practices in the digital context. Simultaneously, the post-industrial paradigm shift from products to services, systems and experiences highlights the intangibles-driven economy, where existing value chain members need to find new directions and new business oppo...

  14. A literature review of the effects of computer input device design on biomechanical loading and musculoskeletal outcomes during computer work.

    Science.gov (United States)

    Bruno Garza, J L; Young, J G

    2015-01-01

    Extended use of conventional computer input devices is associated with negative musculoskeletal outcomes. While many alternative designs have been proposed, it is unclear whether these devices reduce biomechanical loading and musculoskeletal outcomes. To review studies describing and evaluating the biomechanical loading and musculoskeletal outcomes associated with conventional and alternative input devices. Included studies evaluated biomechanical loading and/or musculoskeletal outcomes of users' distal or proximal upper extremity regions associated with the operation of alternative input devices (pointing devices, mice, other devices) that could be used in a desktop personal computing environment during typical office work. Some alternative pointing device designs (e.g. rollerbar) were consistently associated with decreased biomechanical loading while other designs had inconsistent results across studies. Most alternative keyboards evaluated in the literature reduce biomechanical loading and musculoskeletal outcomes. Studies of other input devices (e.g. touchscreen and gestural controls) were rare; however, those reported to date indicate that these devices are currently unsuitable as replacements for traditional devices. Alternative input devices that reduce biomechanical loading may be better choices for preventing or alleviating musculoskeletal outcomes during computer use; however, it is unclear whether many existing designs are effective.

  15. Time-dependent rheoforging of A6061 aluminum alloy on a mechanical servo press and the effects of forming conditions on homogeneity of rheoforged samples

    Directory of Open Access Journals (Sweden)

    Meng Yi

    2015-01-01

    Full Text Available The solid and liquid phases in semisolid metal slurry exhibit different forming behaviours during deformation, resulting in products with inhomogeneous quality. A6061 aluminum alloy was forged in the semisolid state on a mechanical servo press with the capability of multistage compression. To improve the homogeneity of rheoforged samples, a time-dependent rheoforging strategy was designed. The distributions of the microstructure and mechanical properties of the samples manufactured under various experimental conditions were investigated. The A6061 samples forged in the temperature range from 625 to 628 °C with a short holding time of 4 s and the upper die preheated to 300 °C exhibited a homogeneous microstructure and mechanical properties. The homogeneity of rheoforged samples resulted from the controllable free-motion capability of the mechanical servo press and the adjustable fluidity and viscosity of the semisolid slurry.

  16. Turnover, staffing, skill mix, and resident outcomes in a national sample of US nursing homes.

    Science.gov (United States)

    Trinkoff, Alison M; Han, Kihye; Storr, Carla L; Lerner, Nancy; Johantgen, Meg; Gartrell, Kyungsook

    2013-12-01

    The authors examined the relationship of staff turnover to selected nursing home quality outcomes, in the context of staffing and skill mix. Staff turnover is a serious concern in nursing homes as it has been found to adversely affect care. When employee turnover is minimized, better care quality is more likely in nursing homes. Data from the National Nursing Home Survey, a nationally representative sample of US nursing homes, were linked to Nursing Home Compare quality outcomes and analyzed using logistic regression. Nursing homes with high certified nursing assistant turnover had significantly higher odds of pressure ulcers, pain, and urinary tract infections even after controlling for staffing, skill mix, bed size, and ownership. Nurse turnover was associated with twice the odds of pressure ulcers, although this was attenuated when staffing was controlled. This study suggests turnover may be more important in explaining nursing home (NH) outcomes than staffing and skill mix and should therefore be given greater emphasis.

  17. Sexual orientation, substance use behaviors and substance dependence in the United States

    Science.gov (United States)

    McCabe, Sean Esteban; Hughes, Tonda L.; Bostwick, Wendy B.; West, Brady T.; Boyd, Carol J.

    2009-01-01

    Aims To assess past-year prevalence rates of substance use behaviors and substance dependence across three major dimensions of sexual orientation (identity, attraction, and behavior) in a large national sample of adult women and men in the United States. Design Data were collected from structured diagnostic face-to-face interviews using the Alcohol Use Disorder and Associated Disabilities Interview Schedule DSM-IV Version (AUDADIS-IV). Setting Prevalence estimates were based on data collected from the 2004–2005 (Wave 2) National Epidemiologic Survey on Alcohol and Related Conditions (NESARC). Participants A large national sample of 34,653 adults aged 20 years and older: 52% female, 71% White, 12% Hispanic, 11% African American, 4% Asian, and 2% Native American or other racial/ethnic categories. Findings Approximately 2% of the sample self-identified as lesbian, gay or bisexual; 4% reported at least one lifetime same-sex sexual partner, and 6% reported same-sex sexual attraction. Although non-heterosexual orientation was generally associated with a higher risk of substance use and substance dependence, the majority of sexual minority respondents did not report substance use or meet criteria for DSM-IV substance dependence. There was considerable variation in substance use outcomes across sexual orientation dimensions; these variations were more pronounced among women than among men. Conclusions Results support previous research findings of heightened risk of substance use and substance dependence among some sexual minority groups and point to the need for research that examines reasons for such differences. Results also highlight important gender differences and question previous findings indicating uniformly higher risk for substance dependence among sexual minorities. Risks appear to vary based on gender and how sexual orientation is defined. Findings have implications for prevention and intervention efforts that more effectively target subgroups at greatest

  18. Improving neutron multiplicity counting for the spatial dependence of multiplication: Results for spherical plutonium samples

    Energy Technology Data Exchange (ETDEWEB)

    Göttsche, Malte, E-mail: malte.goettsche@physik.uni-hamburg.de; Kirchner, Gerald

    2015-10-21

    The fissile mass deduced from a neutron multiplicity counting measurement of high mass dense items is underestimated if the spatial dependence of the multiplication is not taken into account. It is shown that an appropriate physics-based correction successfully removes the bias. It depends on four correction coefficients which can only be exactly determined if the sample geometry and composition are known. In some cases, for example in warhead authentication, available information on the sample will be very limited. MCNPX-PoliMi simulations have been performed to obtain the correction coefficients for a range of spherical plutonium metal geometries, with and without polyethylene reflection placed around the spheres. For hollow spheres, the analysis shows that the correction coefficients can be approximated with high accuracy as a function of the sphere's thickness depending only slightly on the radius. If the thickness remains unknown, less accurate estimates of the correction coefficients can be obtained from the neutron multiplication. The influence of isotopic composition is limited. The correction coefficients become somewhat smaller when reflection is present.

  19. Development of a standard data base for FBR core nuclear design (XIII). Analysis of small sample reactivity experiments at ZPPR-9

    International Nuclear Information System (INIS)

    Sato, Wakaei; Fukushima, Manabu; Ishikawa, Makoto

    2000-09-01

    A comprehensive study to evaluate and accumulate the abundant results of fast reactor physics is now in progress at O-arai Engineering Center to improve analytical methods and prediction accuracy of nuclear design for large fast breeder cores such as future commercial FBRs. The present report summarizes the analytical results of sample reactivity experiments at the ZPPR-9 core, which had not yet been evaluated with the latest analytical method. The intention of the work is to extend and further generalize the standard data base for FBR core nuclear design. The analysis of the sample reactivity experiments (samples: PU-30, U-6, DU-6, SS-1 and B-1) at the ZPPR-9 core in the JUPITER series, with the latest nuclear data library JENDL-3.2 and the analytical method established by the JUPITER analysis, can be summarized as follows: The region-averaged final C/E values generally agreed with unity within 5% at the inner core region. However, the C/E values of every sample showed a radial space-dependency, increasing from center to core edge, with B-1 showing the largest discrepancy, about 10%. Next, the influence of the present ZPPR-9 sample reactivity results on the cross-section adjustment was evaluated. The reference case was a unified cross-section set ADJ98 based on the recent JUPITER analysis. As a conclusion, the present analytical results have sufficient physical consistency with other JUPITER data, and qualify as part of the standard data base for FBR nuclear design. (author)

  20. Assessment of radioactivity for 24 hours urine sample depending on correction factor by using creatinine

    International Nuclear Information System (INIS)

    Kharita, M. H.; Maghrabi, M.

    2006-09-01

    Assessment of intake and internal dose requires knowing the amount of radioactivity in a 24 hour urine sample. It is sometimes difficult to obtain a 24 hour sample because the method is inconvenient and in most cases workers refuse to collect this amount of urine. This work focuses on finding a correction factor for the 24 hour sample based on the amount of creatinine in the sample, whatever the size of that sample. The 24 hour excretion of the radionuclide is then calculated from the amount of activity and creatinine in the urine sample, assuming an average creatinine excretion rate of 1.7 g per 24 hours. Several urine samples were collected from occupationally exposed workers; the amounts and ratios of creatinine and activity in these samples were determined and then normalized to the 24 hour excretion of the radionuclide. The average chemical recovery was 77%. It should be emphasized that this method should only be used if a 24 hour sample is impossible to collect. (author)
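    The creatinine normalization described above reduces to a simple proportional scaling. A minimal sketch, assuming the stated average excretion rate of 1.7 g creatinine per day (the function name and example numbers are illustrative, not from the source):

```python
def normalize_to_24h(activity_bq, creatinine_g, daily_creatinine_g=1.7):
    """Scale the activity measured in a spot urine sample to an estimated
    24-hour excretion, assuming ~1.7 g of creatinine is excreted per day."""
    if creatinine_g <= 0:
        raise ValueError("creatinine mass must be positive")
    return activity_bq * daily_creatinine_g / creatinine_g

# A spot sample containing 0.425 g creatinine and 2.0 Bq of activity
print(normalize_to_24h(2.0, 0.425))  # -> 8.0 Bq estimated over 24 h
```

    A sample containing exactly 1.7 g of creatinine needs no correction, which is the sanity check for the scaling.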

  1. Sampling methods

    International Nuclear Information System (INIS)

    Loughran, R.J.; Wallbrink, P.J.; Walling, D.E.; Appleby, P.G.

    2002-01-01

    Methods for the collection of soil samples to determine levels of 137 Cs and other fallout radionuclides, such as excess 210 Pb and 7 Be, will depend on the purposes (aims) of the project, site and soil characteristics, analytical capacity, the total number of samples that can be analysed and the sample mass required. The latter two will depend partly on detector type and capabilities. A variety of field methods have been developed for different field conditions and circumstances over the past twenty years, many of them inherited or adapted from soil science and sedimentology. The use of 137 Cs in erosion studies has been widely developed, while the application of fallout 210 Pb and 7 Be is still developing. Although it is possible to measure these nuclides simultaneously, it is common for experiments to be designed around the use of 137 Cs alone. Caesium studies typically involve comparison of the inventories found at eroded or sedimentation sites with that of a 'reference' site. An accurate characterization of the depth distribution of these fallout nuclides is often required in order to apply and/or calibrate the conversion models. However, depending on the tracer involved, the depth distribution, and thus the sampling resolution required to define it, differs. For example, a depth resolution of 1 cm is often adequate when using 137 Cs. However, fallout 210 Pb and 7 Be commonly have very strong surface maxima that decrease exponentially with depth, and fine depth increments are required at or close to the soil surface. Consequently, different depth incremental sampling methods are required when using different fallout radionuclides. Geomorphic investigations also frequently require determination of the depth-distribution of fallout nuclides on slopes and depositional sites as well as their total inventories

  2. Analysis of Longitudinal Studies With Repeated Outcome Measures: Adjusting for Time-Dependent Confounding Using Conventional Methods.

    Science.gov (United States)

    Keogh, Ruth H; Daniel, Rhian M; VanderWeele, Tyler J; Vansteelandt, Stijn

    2018-05-01

    Estimation of causal effects of time-varying exposures using longitudinal data is a common problem in epidemiology. When there are time-varying confounders, which may include past outcomes, affected by prior exposure, standard regression methods can lead to bias. Methods such as inverse probability weighted estimation of marginal structural models have been developed to address this problem. However, in this paper we show how standard regression methods can be used, even in the presence of time-dependent confounding, to estimate the total effect of an exposure on a subsequent outcome by controlling appropriately for prior exposures, outcomes, and time-varying covariates. We refer to the resulting estimation approach as sequential conditional mean models (SCMMs), which can be fitted using generalized estimating equations. We outline this approach and describe how including propensity score adjustment is advantageous. We compare the causal effects being estimated using SCMMs and marginal structural models, and we compare the two approaches using simulations. SCMMs enable more precise inferences, with greater robustness against model misspecification via propensity score adjustment, and easily accommodate continuous exposures and interactions. A new test for direct effects of past exposures on a subsequent outcome is described.
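    The SCMM idea above, controlling for prior exposures and time-varying covariates in an ordinary regression, can be illustrated with a simulated two-period example. This is only a sketch under invented coefficients: a real analysis would fit the model by GEE with robust standard errors, as the paper describes, and the variable names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Time-varying confounding: the covariate L2 is affected by the prior
# exposure X1 and confounds both the later exposure X2 and the outcome Y.
L1 = rng.normal(size=n)
X1 = 0.5 * L1 + rng.normal(size=n)
L2 = 0.6 * X1 + rng.normal(size=n)
X2 = 0.5 * L2 + 0.3 * X1 + rng.normal(size=n)
Y = 1.0 * X2 + 0.8 * L2 + 0.4 * X1 + rng.normal(size=n)  # true effect of X2 is 1.0

# SCMM-style fit: regress the outcome on the current exposure while
# conditioning on the prior exposure and the time-varying covariate.
Z = np.column_stack([np.ones(n), X2, X1, L2])
beta = np.linalg.lstsq(Z, Y, rcond=None)[0]
print(f"estimated effect of X2: {beta[1]:.2f}")  # close to 1.0
```

    Omitting X1 and L2 from the design matrix would bias the coefficient on X2, which is the standard-regression pitfall the paper shows how to avoid.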

  3. Irritability Trajectories, Cortical Thickness, and Clinical Outcomes in a Sample Enriched for Preschool Depression.

    Science.gov (United States)

    Pagliaccio, David; Pine, Daniel S; Barch, Deanna M; Luby, Joan L; Leibenluft, Ellen

    2018-05-01

    Cross-sectional, longitudinal, and genetic associations exist between irritability and depression. Prior studies have examined developmental trajectories of irritability, clinical outcomes, and associations with child and familial depression. However, studies have not integrated neurobiological measures. The present study examined developmental trajectories of irritability, clinical outcomes, and cortical structure among preschoolers oversampled for depressive symptoms. Beginning at 3 to 5 years old, a sample of 271 children enriched for early depressive symptoms were assessed longitudinally by clinical interview. Latent class mixture models identified trajectories of irritability severity. Risk factors, clinical outcomes, and cortical thickness were compared across trajectory classes. Cortical thickness measures were extracted from 3 waves of magnetic resonance imaging at 7 to 12 years of age. Three trajectory classes were identified among these youth: 53.50% of children exhibited elevated irritability during preschool that decreased longitudinally, 30.26% exhibited consistently low irritability, and 16.24% exhibited consistently elevated irritability. Compared with other classes, the elevated irritability class exhibited higher rates of maternal depression, early life adversity, later psychiatric diagnoses, and functional impairment. Further, elevated baseline irritability predicted later depression beyond adversity and personal and maternal depression history. The elevated irritability class exhibited a thicker cortex in the left superior frontal and temporal gyri and the right inferior parietal lobule. Irritability manifested with specific developmental trajectories in this sample enriched for early depression. Persistently elevated irritability predicted poor psychiatric outcomes, higher risk for later depression, and decreased overall function later in development. Greater frontal, temporal, and parietal cortical thickness also was found, providing neural

  4. The Impact of Group Design Projects in Engineering on Achievement Goal Orientations and Academic Outcomes

    Science.gov (United States)

    Rambo-Hernandez, Karen E.; Atadero, Rebecca A.; Balgopal, Meena

    2017-01-01

    This study examined the impact of incorporating group design projects into a second-year engineering class on achievement goal orientations and two academic outcomes: concept inventory and final exam scores. In this study, two sections were taught using lecture format, but one section also completed three group design projects as part of their…

  5. Quantification of errors in ordinal outcome scales using shannon entropy: effect on sample size calculations.

    Directory of Open Access Journals (Sweden)

    Pitchaiah Mandava

    Full Text Available OBJECTIVE: Clinical trial outcomes often involve an ordinal scale of subjective functional assessments, but the optimal way to quantify results is not clear. In stroke, for the most commonly used scale, the modified Rankin Score (mRS), a range of scores ("Shift") is proposed as superior to dichotomization because of greater information transfer. The influence of known uncertainties in mRS assessment has not been quantified. We hypothesized that errors caused by uncertainties could be quantified by applying information theory. Using Shannon's model, we quantified errors of the "Shift" compared to dichotomized outcomes using published distributions of mRS uncertainties and applied this model to clinical trials. METHODS: We identified 35 randomized stroke trials that met inclusion criteria. Each trial's mRS distribution was multiplied with the noise distribution from published mRS inter-rater variability to generate an error percentage for "shift" and dichotomized cut-points. For the SAINT I neuroprotectant trial, considered positive by "shift" mRS while the larger follow-up SAINT II trial was negative, we recalculated the sample size required if classification uncertainty was taken into account. RESULTS: Considering the full mRS range, the error rate was 26.1%±5.31 (Mean±SD). Error rates were lower for all dichotomizations tested using cut-points (e.g. mRS 1; 6.8%±2.89; overall p<0.001). Taking errors into account, SAINT I would have required 24% more subjects than were randomized. CONCLUSION: We show that when uncertainty in assessments is considered, the lowest error rates are with dichotomization. While using the full range of mRS is conceptually appealing, a gain of information is counter-balanced by a decrease in reliability. The resultant errors need to be considered since sample size may otherwise be underestimated. In principle, we have outlined an approach to error estimation for any condition in which there are uncertainties in outcome assessment. We
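    The core step, multiplying an mRS distribution by an inter-rater misclassification matrix and comparing full-scale versus dichotomized error rates, can be sketched as follows. The distributions below are invented for illustration; the study used published mRS distributions and inter-rater variability data.

```python
import numpy as np

# Hypothetical mRS distribution over scores 0-6 for one trial arm
mrs = np.array([0.10, 0.15, 0.15, 0.20, 0.20, 0.10, 0.10])

# Hypothetical misclassification matrix: row i = true score,
# column j = probability a rater records score j instead.
noise = np.full((7, 7), 0.02)
np.fill_diagonal(noise, 0.88)
noise /= noise.sum(axis=1, keepdims=True)  # rows sum to 1

# Probability that a recorded score differs from the true score ("shift" error)
error_full_scale = 1.0 - np.sum(mrs * np.diag(noise))

# A dichotomization (e.g. mRS <= 1 vs > 1) only errs when the recorded
# score crosses the cut-point, so its error rate is lower.
cut = 1
cross = sum(mrs[i] * noise[i, j]
            for i in range(7) for j in range(7)
            if (i <= cut) != (j <= cut))
print(f"full-scale error: {error_full_scale:.3f}, dichotomized: {cross:.3f}")
```

    Even with this toy noise matrix the dichotomized error rate is less than half the full-scale rate, matching the qualitative direction of the published result.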

  6. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    Science.gov (United States)

    Pavlacky, David C; Lukacs, Paul M; Blakesley, Jennifer A; Skorkowsky, Robert C; Klute, David S; Hahn, Beth A; Dreitz, Victoria J; George, T Luke; Hanni, David J

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous statistical

  7. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    Directory of Open Access Journals (Sweden)

    David C Pavlacky

    Full Text Available Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous

  8. Length-dependent optical properties of single-walled carbon nanotube samples

    International Nuclear Information System (INIS)

    Naumov, Anton V.; Tsyboulski, Dmitri A.; Bachilo, Sergei M.; Weisman, R. Bruce

    2013-01-01

    Highlights: ► Length-independent absorption per atom in single-walled carbon nanotubes. ► Reduced fluorescence quantum yield for short nanotubes. ► Exciton quenching at nanotube ends, sidewall defects probably limits quantum yield. - Abstract: Contradictory findings have been reported on the length dependence of optical absorption cross sections and fluorescence quantum yields in single-walled carbon nanotubes (SWCNTs). To clarify these points, studies have been made on bulk SWCNT dispersions subjected to length fractionation by electrophoretic separation or by ultrasonication-induced scission. Fractions ranged from ca. 120 to 760 nm in mean length. Samples prepared by shear-assisted dispersion were subsequently shortened by ultrasonic processing. After accounting for processing-induced changes in the surfactant absorption background, SWCNT absorption was found constant within ±11% as average nanotube length changed by a factor of 3.8. This indicates that the absorption cross-section per carbon atom is not length dependent. By contrast, in length fractions prepared by both methods, the bulk fluorescence efficiency or average quantum yield increased with SWCNT average length and approached an apparent asymptotic limit near 1 μm. This result is interpreted as reflecting the combined contributions of exciton quenching by sidewall defects and by the ends of shorter nanotubes

  9. Design for mosquito abundance, diversity, and phenology sampling within the National Ecological Observatory Network

    Science.gov (United States)

    Hoekman, D.; Springer, Yuri P.; Barker, C.M.; Barrera, R.; Blackmore, M.S.; Bradshaw, W.E.; Foley, D. H.; Ginsberg, Howard; Hayden, M. H.; Holzapfel, C. M.; Juliano, S. A.; Kramer, L. D.; LaDeau, S. L.; Livdahl, T. P.; Moore, C. G.; Nasci, R.S.; Reisen, W.K.; Savage, H. M.

    2016-01-01

    The National Ecological Observatory Network (NEON) intends to monitor mosquito populations across its broad geographical range of sites because of their prevalence in food webs, sensitivity to abiotic factors and relevance for human health. We describe the design of mosquito population sampling in the context of NEON’s long term continental scale monitoring program, emphasizing the sampling design schedule, priorities and collection methods. Freely available NEON data and associated field and laboratory samples, will increase our understanding of how mosquito abundance, demography, diversity and phenology are responding to land use and climate change.

  10. Impact of Social Cognition on Alcohol Dependence Treatment Outcome: Poorer Facial Emotion Recognition Predicts Relapse/Dropout.

    Science.gov (United States)

    Rupp, Claudia I; Derntl, Birgit; Osthaus, Friederike; Kemmler, Georg; Fleischhacker, W Wolfgang

    2017-12-01

    Despite growing evidence for neurobehavioral deficits in social cognition in alcohol use disorder (AUD), the clinical relevance remains unclear, and little is known about its impact on treatment outcome. This study prospectively investigated the impact of neurocognitive social abilities at treatment onset on treatment completion. Fifty-nine alcohol-dependent patients were assessed with measures of social cognition including 3 core components of empathy via paradigms measuring: (i) emotion recognition (the ability to recognize emotions via facial expression), (ii) emotional perspective taking, and (iii) affective responsiveness at the beginning of inpatient treatment for alcohol dependence. Subjective measures were also obtained, including estimates of task performance and a self-report measure of empathic abilities (Interpersonal Reactivity Index). According to treatment outcomes, patients were divided into a patient group with a regular treatment course (e.g., with planned discharge and without relapse during treatment) or an irregular treatment course (e.g., relapse and/or premature and unplanned termination of treatment, "dropout"). Compared with patients completing treatment in a regular fashion, patients with relapse and/or dropout of treatment had significantly poorer facial emotion recognition ability at treatment onset. Additional logistic regression analyses confirmed these results and identified poor emotion recognition performance as a significant predictor for relapse/dropout. Self-report (subjective) measures did not correspond with neurobehavioral social cognition measures, respectively objective task performance. Analyses of individual subtypes of facial emotions revealed poorer recognition particularly of disgust, anger, and no (neutral faces) emotion in patients with relapse/dropout. Social cognition in AUD is clinically relevant. Less successful treatment outcome was associated with poorer facial emotion recognition ability at the beginning of

  11. The contribution of self-efficacy and outcome expectations in the ...

    African Journals Online (AJOL)

    This study examined the effectiveness of Bandura's self-efficacy theory to predict exercise adherence. A sample of new members at a gymnasium was assessed on a Physical Self-Efficacy Scale, an Adherence Efficacy Scale and an Outcome Expectancy Scale. The dependent variable, exercise adherence, was assessed by ...

  12. Sampling depth confounds soil acidification outcomes

    Science.gov (United States)

    In the northern Great Plains (NGP) of North America, surface sampling depths of 0-15 or 0-20 cm are suggested for testing soil characteristics such as pH. However, acidification is often most pronounced near the soil surface. Thus, sampling deeper can potentially dilute (increase) pH measurements an...

  13. Multi-saline sample distillation apparatus for hydrogen isotope analyses : design and accuracy

    Science.gov (United States)

    Hassan, Afifa Afifi

    1981-01-01

    A distillation apparatus for saline water samples was designed and tested. Six samples may be distilled simultaneously. The temperature was maintained at 400 °C to ensure complete dehydration of the precipitating salts. Consequently, the error in the measured ratio of stable hydrogen isotopes resulting from incomplete dehydration of hydrated salts during distillation was eliminated. (USGS)

  14. Using simulation to aid trial design: Ring-vaccination trials.

    Directory of Open Access Journals (Sweden)

    Matt David Thomas Hitchings

    2017-03-01

    The 2014-16 West African Ebola epidemic highlights the need for rigorous, rapid clinical trial methods for vaccines. A challenge for trial design is making sample-size calculations based on incidence within the trial, total vaccine effect, and intracluster correlation when these parameters are uncertain in the presence of indirect effects of vaccination. We present a stochastic, compartmental model for a ring vaccination trial. After identification of an index case, a ring of contacts is recruited and either vaccinated immediately or after 21 days. The primary outcome of the trial is total vaccine effect, counting cases only from a pre-specified window in which the immediate arm is assumed to be fully protected and the delayed arm is not. Simulation results are used to calculate the necessary sample size and estimated vaccine effect. Under baseline assumptions about vaccine properties, monthly incidence in unvaccinated rings, and trial design, a standard sample-size calculation neglecting dynamic effects estimated that 7,100 participants would be needed to achieve 80% power to detect a difference in attack rate between arms, while incorporating dynamic considerations in the model increased the estimate to 8,900. This approach replaces assumptions about parameters at the ring level with assumptions about disease dynamics and vaccine characteristics at the individual level, so within this framework we were able to describe the sensitivity of the trial power and estimated effect to various parameters. We found that both of these quantities are sensitive to properties of the vaccine, to setting-specific parameters over which investigators have little control, and to parameters that are determined by the study design. Incorporating simulation into the trial design process can improve the robustness of sample-size calculations. For this specific trial design, vaccine effectiveness depends on properties of the ring vaccination design and on the
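The "standard sample-size calculation" the abstract contrasts with its simulation can be sketched with the classical two-proportion formula for comparing attack rates between arms. The attack rates below are hypothetical placeholders, not the trial's actual parameters, and the formula deliberately ignores the indirect/dynamic effects the paper models.

```python
from math import sqrt
from statistics import NormalDist

def n_per_arm(p_control, p_vaccine, alpha=0.05, power=0.80):
    """Classical two-proportion sample size (per arm) for detecting a
    difference in attack rates; neglects indirect effects of vaccination,
    which the simulation study shows can inflate requirements."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    p_bar = (p_control + p_vaccine) / 2
    num = (z_a * sqrt(2 * p_bar * (1 - p_bar))
           + z_b * sqrt(p_control * (1 - p_control)
                        + p_vaccine * (1 - p_vaccine))) ** 2
    return num / (p_control - p_vaccine) ** 2

# hypothetical attack rates: 2% in the delayed arm vs 1% in the immediate arm
n = n_per_arm(0.02, 0.01)
```

A dynamic model would replace the fixed attack rates here with quantities that emerge from simulated transmission, which is why the two approaches disagree.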

  15. Derivation of time dependent design-values for SNR 300 structural material

    International Nuclear Information System (INIS)

    Lorenz, H.; Breitling, H.; de Heesen, E.

    1976-01-01

    Time-dependent design values were derived from long-term creep rupture data for steel X 6 CrNi 1811 in the unwelded and welded condition. The design values had to satisfy the ASME CC 1592 criteria with respect to creep rupture strength, time to reach 1% strain, and transition to tertiary creep, as well as the requirement of German regulatory rules to properly account for weld behaviour. Two proven computer programmes were used for the evaluation and extrapolation. The design data derived under consideration of weld joints show relatively good agreement with the values of ASME CC 1592. Consideration of welds leads to lower design values above 550 °C and 5×10³ h, with the difference between rolled and weld material becoming larger with increasing time and temperature. (author)

  16. Note: Design and development of wireless controlled aerosol sampling network for large scale aerosol dispersion experiments

    International Nuclear Information System (INIS)

    Gopalakrishnan, V.; Subramanian, V.; Baskaran, R.; Venkatraman, B.

    2015-01-01

    A wireless-based, custom-built aerosol sampling network has been designed, developed, and implemented for environmental aerosol sampling. These aerosol sampling systems were used in a field measurement campaign in which sodium aerosol dispersion experiments were conducted as part of environmental impact studies related to a sodium-cooled fast reactor. The sampling network contains 40 aerosol sampling units, each with a custom-built sampling head and wireless control networking designed with a Programmable System on Chip (PSoC™) and Xbee Pro RF modules. The base-station control is designed using the graphical programming language LabVIEW. The sampling network is programmed to operate at a preset time, and the running status of the samplers in the network is visualized from the base station. The system is developed in such a way that it can be used for any other environmental sampling system deployed over a wide area and uneven terrain, where manual operation is difficult because of the requirement of simultaneous operation and status logging.

  18. Non-motor outcomes of subthalamic stimulation in Parkinson's disease depend on location of active contacts.

    Science.gov (United States)

    Dafsari, Haidar Salimi; Petry-Schmelzer, Jan Niklas; Ray-Chaudhuri, K; Ashkan, Keyoumars; Weis, Luca; Dembek, Till A; Samuel, Michael; Rizos, Alexandra; Silverdale, Monty; Barbe, Michael T; Fink, Gereon R; Evans, Julian; Martinez-Martin, Pablo; Antonini, Angelo; Visser-Vandewalle, Veerle; Timmermann, Lars

    2018-03-16

    Subthalamic nucleus (STN) deep brain stimulation (DBS) improves quality of life (QoL), motor, and non-motor symptoms (NMS) in Parkinson's disease (PD). Few studies have investigated the influence of the location of neurostimulation on NMS. To investigate the impact of active contact location on NMS in STN-DBS in PD. In this prospective, open-label, multicenter study including 50 PD patients undergoing bilateral STN-DBS, we collected NMSScale (NMSS), NMSQuestionnaire (NMSQ), Hospital Anxiety and Depression Scale (anxiety/depression, HADS-A/-D), PDQuestionnaire-8 (PDQ-8), Scales for Outcomes in PD-motor examination, motor complications, activities of daily living (ADL), and levodopa equivalent daily dose (LEDD) preoperatively and at 6 months follow-up. Changes were analyzed with Wilcoxon signed-rank/t-test and Bonferroni-correction for multiple comparisons. Although the STN was targeted visually, we employed an atlas-based approach to explore the relationship between active contact locations and DBS outcomes. Based on fused MRI/CT-images, we identified Cartesian coordinates of active contacts with patient-specific Mai-atlas standardization. We computed linear mixed-effects models with x-/y-/z-coordinates as independent, hemispheres as within-subject, and test change scores as dependent variables. NMSS, NMSQ, PDQ-8, motor examination, complications, and LEDD significantly improved at follow-up. Linear mixed-effect models showed that NMS and QoL improvement significantly depended on more medial (HADS-D, NMSS), anterior (HADS-D, NMSQ, PDQ-8), and ventral (HADS-A/-D, NMSS, PDQ-8) neurostimulation. ADL improved more in posterior, LEDD in lateral neurostimulation locations. No relationship was observed for motor examination and complications scores. Our study provides evidence that more anterior, medial, and ventral STN-DBS is significantly related to more beneficial non-motor outcomes. Copyright © 2018. Published by Elsevier Inc.

  19. Design issues in toxicogenomics using DNA microarray experiment

    International Nuclear Information System (INIS)

    Lee, Kyoung-Mu; Kim, Ju-Han; Kang, Daehee

    2005-01-01

    The methods of toxicogenomics might be classified into omics studies (e.g., genomics, proteomics, and metabolomics) and population studies focusing on risk assessment and gene-environment interaction. In omics studies, the microarray is the most popular approach. Up to 20,000 genes falling into several categories (e.g., xenobiotic metabolism, cell cycle control, DNA repair, etc.) can be selected according to an a priori hypothesis. The appropriate type of samples and species should be selected in advance. Multiple doses and varied exposure durations are suggested to identify those genes clearly linked to toxic response. Microarray experiments can be affected by numerous nuisance variables, including experimental design, sample extraction, type of scanner, etc. The number of slides might be determined from the magnitude and variance of expression change, the false-positive rate, and the desired power; pooling samples is an alternative. Online databases on chemicals with known exposure-disease outcomes and genetic information can aid the interpretation of the normalized results. Gene function can be inferred from microarray data analyzed by bioinformatics methods such as cluster analysis. Population studies often adopt hospital-based or nested case-control designs. Biases in subject selection and exposure assessment should be minimized, and confounding should be controlled for in stratified or multiple regression analysis. Optimal sample sizes depend on the statistical test for gene-environment or gene-gene interaction. The design issues addressed in this mini-review are crucial in conducting a toxicogenomics study. In addition, an integrative approach of exposure assessment, epidemiology, and clinical trials is required.

  20. Shielding design of highly activated sample storage at reactor TRIGA PUSPATI

    International Nuclear Information System (INIS)

    Naim Syauqi Hamzah; Julia Abdul Karim; Mohamad Hairie Rabir; Muhd Husamuddin Abdul Khalil; Mohd Amin Sharifuldin Salleh

    2010-01-01

    Radiation protection has always been one of the most important considerations in Reaktor TRIGA PUSPATI (RTP) management. Demand for sample activation has been increasing from a variety of applicants in different research areas. A radiological hazard may occur if sample evaluations are misjudged or miscalculated. At present, there is no appropriate storage for highly activated samples. For that purpose, a special irradiated-sample storage box should be provided in order to segregate highly activated samples that produce high dose levels from typical activated samples that produce lower dose levels (1 - 2 mR/hr). In this study, the thicknesses required by common shielding materials such as lead and concrete to reduce a highly activated radiotracer sample (potassium bromide) with an initial exposure dose of 5 R/hr to background level (0.05 mR/hr) were determined. Analyses were done using several methods, including the conventional shielding equation, half-value-layer calculation, and the MicroShield computer code. A design for a new irradiated-sample storage box for RTP capable of containing high-level gamma radioactivity is then proposed. (author)
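The half-value-layer step described above can be illustrated numerically. The HVL figures below are generic textbook-scale values assumed for ~1 MeV gammas, not the study's own data, and the simple exponential model ignores buildup.

```python
import math

def shield_thickness_cm(dose_in_mR_hr, dose_out_mR_hr, hvl_cm):
    """Thickness that attenuates dose_in to dose_out under the simple
    half-value-layer model (no buildup factor; a conservative design
    would add margin or use a code such as MicroShield)."""
    n_hvls = math.log2(dose_in_mR_hr / dose_out_mR_hr)
    return n_hvls * hvl_cm

n_hvls = math.log2(5000 / 0.05)                       # 5 R/hr -> 0.05 mR/hr: ~16.6 HVLs
lead = shield_thickness_cm(5000, 0.05, hvl_cm=1.2)    # assumed lead HVL for ~1 MeV gammas
concrete = shield_thickness_cm(5000, 0.05, hvl_cm=6.1)  # assumed ordinary-concrete HVL
```

Reducing a 5 R/hr source to background thus takes roughly 16-17 half-value layers, which is why the required lead and concrete thicknesses differ by the ratio of their HVLs.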

  1. Sampling designs for contaminant temporal trend analyses using sedentary species exemplified by the snails Bellamya aeruginosa and Viviparus viviparus.

    Science.gov (United States)

    Yin, Ge; Danielsson, Sara; Dahlberg, Anna-Karin; Zhou, Yihui; Qiu, Yanling; Nyberg, Elisabeth; Bignert, Anders

    2017-10-01

    Environmental monitoring typically assumes samples and sampling activities to be representative of the population being studied. Given a limited budget, an appropriate sampling strategy is essential to support detecting temporal trends of contaminants. In the present study, based on real chemical analysis data on polybrominated diphenyl ethers in snails collected from five subsites in Tianmu Lake, computer simulation was performed to evaluate three sampling strategies by estimating the sample size required to detect an annual change of 5% with a statistical power of 80% or 90% at a significance level of 5%. The results showed that sampling from an arbitrarily selected sampling spot is the worst strategy, requiring many more individual analyses to achieve the above-mentioned criteria compared with the other two approaches. A fixed sampling site requires the lowest sample size but may not be representative of the intended study object, e.g., a lake, and is also sensitive to changes at that particular sampling site. In contrast, sampling at multiple sites along the shore each year, and using pooled samples when the cost to collect and prepare individual specimens is much lower than the cost of chemical analysis, would be the most robust and cost-efficient strategy in the long run. Using statistical power as the criterion, the results demonstrate quantitatively the consequences of various sampling strategies and can guide users with respect to the sample sizes required by each sampling design in long-term monitoring programs. Copyright © 2017 Elsevier Ltd. All rights reserved.
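The kind of simulation the authors describe can be sketched as a Monte Carlo power calculation for a log-linear trend in yearly mean concentrations. The variability parameters below (coefficient of variation, years of monitoring) are illustrative assumptions, not the study's PBDE data.

```python
import numpy as np

rng = np.random.default_rng(0)

def trend_power(n_per_year, years=10, annual_change=0.05, cv=0.5, n_sim=2000):
    """Monte Carlo power to detect an `annual_change` decline from a
    regression of yearly mean log-concentrations on time (approximate
    two-sided z-test at the 5% level)."""
    sigma = np.sqrt(np.log(1 + cv**2))      # lognormal sigma from between-sample CV
    t = np.arange(years)
    slope_true = np.log(1 - annual_change)  # yearly change on the log scale
    sxx = ((t - t.mean()) ** 2).sum()
    hits = 0
    for _ in range(n_sim):
        # yearly means of log-concentrations from n_per_year individuals
        y = np.array([rng.normal(slope_true * yr, sigma, n_per_year).mean()
                      for yr in t])
        slope, intercept = np.polyfit(t, y, 1)
        resid = y - (slope * t + intercept)
        se = np.sqrt(resid.var(ddof=2) / sxx)
        hits += abs(slope / se) > 1.96
    return hits / n_sim
```

Varying `n_per_year` (and, with small extensions, the between-site variance of each strategy) reproduces the kind of sample-size-versus-power comparison the abstract reports.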

  2. Radon exhalation and its dependence on moisture content from samples of soil and building materials

    International Nuclear Information System (INIS)

    Faheem, Munazza; Matiullah

    2008-01-01

    Indoor radon has long been recognized as a potential health hazard for mankind. Building materials are considered one of the major sources of radon in the indoor environment. To study the radon exhalation rate and its dependence on moisture content, samples of soil and some common types of building materials (sand, cement, bricks, and marble) were collected from the Gujranwala, Gujrat, Hafizabad, Sialkot, Mandibahauddin, and Narowal districts of the Punjab province (Pakistan). After processing, samples of 200 g each were placed in plastic vessels. CR-39 based NRPB detectors were placed at the top of these vessels, which were then hermetically sealed. After exposure to radon for 30 days within the closed vessels, the CR-39 detectors were processed. The radon exhalation rate was found to vary from 122±19 to 681±10 mBq m⁻² h⁻¹ with an average of 376±147 mBq m⁻² h⁻¹ in the soil samples, whereas averages of 212±34, 195±25, 231±30, and 292±35 mBq m⁻² h⁻¹ were observed in the brick, sand, cement, and marble samples, respectively. The dependence of exhalation on moisture content was also studied: the radon exhalation rate was found to increase with moisture, reach a maximum value, and then decrease with further increase in water content.
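Sealed-vessel measurements like the one described are usually reduced to an exhalation rate with the standard "closed-can" accumulation formula, which corrects the elapsed time for radon decay. The example numbers below are hypothetical, not the paper's data, and leakage and back-diffusion are assumed negligible.

```python
import math

# Rn-222 decay constant in h^-1 (half-life ~3.82 d)
LAMBDA_RN222 = math.log(2) / (3.82 * 24)

def exhalation_rate(conc_bq_m3, volume_m3, area_m2, t_hours, lam=LAMBDA_RN222):
    """Surface exhalation rate (Bq m^-2 h^-1) from the closed-can
    accumulation formula E = C*V*lam / (A * [T + (exp(-lam*T) - 1)/lam])."""
    effective_time = t_hours + (math.exp(-lam * t_hours) - 1) / lam
    return conc_bq_m3 * volume_m3 * lam / (area_m2 * effective_time)

# hypothetical 30-day run: 100 Bq/m^3 built up in a 1 L vessel over a 50 cm^2 sample
rate = exhalation_rate(100, 1e-3, 5e-3, 30 * 24)
```

Because radon decays during the 30-day exposure, the effective accumulation time is shorter than the elapsed time, so the corrected rate is higher than a naive C·V·λ/(A·T) estimate.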

  3. Two specialized delayed-neutron detector designs for assays of fissionable elements in water and sediment samples

    International Nuclear Information System (INIS)

    Balestrini, S.J.; Balagna, J.P.; Menlove, H.O.

    1976-01-01

    Two specialized neutron-sensitive detectors are described which are employed for rapid assays of fissionable elements by sensing for delayed neutrons emitted by samples after they have been irradiated in a nuclear reactor. The more sensitive of the two detectors, designed to assay for uranium in water samples, is 40% efficient; the other, designed for sediment sample assays, is 27% efficient. These detectors are also designed to operate under water as an inexpensive shielding against neutron leakage from the reactor and neutrons from cosmic rays. (Auth.)

  4. The Study of early maladaptive Schemas in Men Dependent on Drugs and those Not Dependent

    Directory of Open Access Journals (Sweden)

    MR Firoozi

    2015-11-01

    Introduction and Objectives: Dependence on drugs is a prevalent problem throughout the world in general and in Iran in particular. Such a phenomenon is associated with numerous negative consequences. Given the changes in consumption patterns in different countries, especially Iran, and the abuse of drugs, identifying the factors that may pave the way for drug abuse is absolutely essential and can be taken into account in setting the objectives of therapy programs. Cognizant of this, the present study sought to examine early maladaptive schemas in men dependent on drugs and those who are not. Materials and Method: The research design adopted in the present study was causal-comparative. The population of interest was all the men dependent on drugs in the city of Yasouj who had presented to recovery centers affiliated with the welfare organization and Yasouj University of Medical Sciences in the year 2014 and were undergoing treatment. Using multi-stage cluster sampling, four of the 23 centers for recovery programs were initially chosen at random. Following that, 20 clients from each center, 80 in total, were chosen as the sample through convenience sampling. In addition, 80 men not dependent on drugs were chosen by matching with the drug-dependent sample in terms of age, gender, and locality. For the purpose of measuring early maladaptive schemas, we made use of the short version of Young's Early Maladaptive Schemas Questionnaire. In order to determine the prevalent schemas in men dependent on drugs and those not dependent and the difference between the two groups, use was made of independent-samples t-tests and effect size (d). Findings: The findings suggest that the mean values of those dependent on drugs on all the schemas in question were significantly higher than those of men not dependent. Although the effect sizes of all schemas fell in the domain of the large effect, the largest

  5. Clinical syndromes, personality and recovery from stress: A study in occupational samples

    Directory of Open Access Journals (Sweden)

    José Miguel Rodríguez Molina

    2012-12-01

    Introduction. Stress manifests itself with different intensity and effects in different people. In many cases it leads to serious health problems or may worsen the prognosis of certain diseases. Stress has been linked to many conditions, such as cancer, cardiovascular diseases, and infectious diseases. The workplace can be a source of chronic stress. Many variables have been described that allegedly modulate the stress response. Aim. To rank the relationships between some of these variables. A model is presented in this study whereby psychopathological personality traits should be related to one of those modulating variables and thus to the subject's ability to recover from stress. Design. A cross-sectional descriptive design was used. Participants. The sample consisted of 108 volunteers: 15 drivers of Madrid city buses, 44 Iberia flight attendants, and 49 waiters in bars in the Community of Madrid. Only 4 bus drivers refused to participate; all flight attendants and waiters consented to be included in the study. Intervention. The RESTQ-Work (Kallus and Jiménez) and the MCMI (Millon) were administered to the sample of 108 workers (bus drivers, waiters, and flight attendants). Outcomes. The hypothesis was tested through hierarchical multiple regression analysis for each dependent variable.

  6. Influence of Flow Sequencing Attributed to Climate Change and Climate Variability on the Assessment of Water-dependent Ecosystem Outcomes

    Science.gov (United States)

    Wang, J.; Nathan, R.; Horne, A.

    2017-12-01

    Traditional approaches to characterize water-dependent ecosystem outcomes in response to flow have been based on time-averaged hydrological indicators, however there is increasing recognition for the need to characterize ecological processes that are highly dependent on the sequencing of flow conditions (i.e. floods and droughts). This study considers the representation of flow regimes when considering assessment of ecological outcomes, and in particular, the need to account for sequencing and variability of flow. We conducted two case studies - one in the largely unregulated Ovens River catchment and one in the highly regulated Murray River catchment (both located in south-eastern Australia) - to explore the importance of flow sequencing to the condition of a typical long-lived ecological asset in Australia, the River Red Gum forests. In the first, the Ovens River case study, the implications of representing climate change using different downscaling methods (annual scaling, monthly scaling, quantile mapping, and weather generator method) on the sequencing of flows and resulting ecological outcomes were considered. In the second, the Murray River catchment, sequencing within a historic drought period was considered by systematically making modest adjustments on an annual basis to the hydrological records. In both cases, the condition of River Red Gum forests was assessed using an ecological model that incorporates transitions between ecological conditions in response to sequences of required flow components. The results of both studies show the importance of considering how hydrological alterations are represented when assessing ecological outcomes. The Ovens case study showed that there is significant variation in the predicted ecological outcomes when different downscaling techniques are applied. Similarly, the analysis in the Murray case study showed that the drought as it historically occurred provided one of the best possible outcomes for River Red Gum

  7. A phoswich detector design for improved spatial sampling in PET

    Science.gov (United States)

    Thiessen, Jonathan D.; Koschan, Merry A.; Melcher, Charles L.; Meng, Fang; Schellenberg, Graham; Goertzen, Andrew L.

    2018-02-01

    Block detector designs, utilizing a pixelated scintillator array coupled to a photosensor array in a light-sharing design, are commonly used for positron emission tomography (PET) imaging applications. In practice, the spatial sampling of these designs is limited by the crystal pitch, which must be large enough for individual crystals to be resolved in the detector flood image. Replacing the conventional 2D scintillator array with an array of phoswich elements, each consisting of an optically coupled side-by-side scintillator pair, may improve spatial sampling in one direction of the array without requiring resolving smaller crystal elements. To test the feasibility of this design, a 4 × 4 phoswich array was constructed, with each phoswich element consisting of two optically coupled, 3.17 × 1.58 × 10 mm³ LSO crystals co-doped with cerium and calcium. The amount of calcium doping was varied to create a 'fast' LSO crystal with decay time of 32.9 ns and a 'slow' LSO crystal with decay time of 41.2 ns. Using a Hamamatsu R8900U-00-C12 position-sensitive photomultiplier tube (PS-PMT) and a CAEN V1720 250 MS/s waveform digitizer, we were able to show effective discrimination of the fast and slow LSO crystals in the phoswich array. Although a side-by-side phoswich array is feasible, reflections at the crystal boundary due to a mismatch between the refractive index of the optical adhesive (n = 1.5) and LSO (n = 1.82) caused it to behave optically as an 8 × 4 array rather than a 4 × 4 array. Direct coupling of each phoswich element to individual photodetector elements may be necessary with the current phoswich array design. Alternatively, in order to implement this phoswich design with a conventional light sharing PET block detector, a high refractive index optical adhesive is necessary to closely match the refractive index of LSO.

  8. Threshold-dependent sample sizes for selenium assessment with stream fish tissue

    Science.gov (United States)

    Hitt, Nathaniel P.; Smith, David R.

    2015-01-01

    precision of composites for estimating mean conditions. However, low sample sizes (<5 fish) did not achieve 80% power to detect near-threshold values (i.e., <1 mg Se/kg) under any scenario we evaluated. This analysis can assist the sampling design and interpretation of Se assessments from fish tissue by accounting for natural variation in stream fish populations.
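The threshold-dependent sample-size idea above can be sketched with a normal-approximation power calculation: the closer the true mean is to the threshold, the more fish are needed. The effect and variability numbers below are placeholders, not the study's selenium data.

```python
import math
from statistics import NormalDist

def power_vs_threshold(n, excess, sd, alpha=0.05):
    """Approximate (normal-theory) power of a one-sided test that mean
    tissue Se exceeds a threshold by `excess` (mg Se/kg), given
    among-fish standard deviation `sd`."""
    z_alpha = NormalDist().inv_cdf(1 - alpha)
    z = excess / (sd / math.sqrt(n)) - z_alpha
    return NormalDist().cdf(z)

def fish_needed(excess, sd, target=0.80):
    """Smallest sample size reaching the target power."""
    n = 2
    while power_vs_threshold(n, excess, sd) < target:
        n += 1
    return n

# near-threshold exceedances demand far larger samples
near = fish_needed(0.5, 2.0)  # mean only 0.5 mg/kg above the threshold
far = fish_needed(1.0, 2.0)   # mean 1.0 mg/kg above the threshold
```

This mirrors the abstract's finding that very small samples cannot achieve 80% power for near-threshold values, while modest samples suffice when the mean is well above the threshold.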

  9. Clinical presentation and outcome of avoidant/restrictive food intake disorder in a Japanese sample.

    Science.gov (United States)

    Nakai, Yoshikatsu; Nin, Kazuko; Noma, Shun'ichi; Hamagaki, Seiji; Takagi, Ryuro; Teramukai, Satoshi; Wonderlich, Stephen A

    2017-01-01

    We conducted a study of the clinical presentation and outcome in patients with avoidant/restrictive food intake disorder (ARFID), aged 15-40 years, and compared this group to an anorexia nervosa (AN) group in a Japanese sample. A retrospective chart review was completed on 245 patients with feeding and eating disorders (FEDs), analyzing prevalence, clinical presentation, psychopathological properties, and outcomes. Using the DSM-5 criteria, 27 (11.0%) out of the 245 patients with a FED met the criteria for ARFID at entry. All patients with ARFID were women. In terms of eating disorder symptoms, all patients with ARFID had restrictive eating related to emotional problems and/or gastrointestinal symptoms. However, none of the ARFID patients reported food avoidance related to sensory characteristics or functional dysphagia. Additionally, none of them exhibited binge eating or purging behaviors, and none of them reported excessive exercise. The ARFID group had a significantly shorter duration of illness, lower rates of admission history, and less severe psychopathology than the AN group. The ARFID group reported significantly better outcome results than the AN group. These results suggest that patients with ARFID in this study were clinically distinct from those with AN and somewhat different from pediatric patients with ARFID in previous studies. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Effect of smoking status and nicotine dependence on pain intensity and outcome of treatment in Indian patients with temporomandibular disorders: A longitudinal cohort study.

    Science.gov (United States)

    Katyayan, Preeti Agarwal; Katyayan, Manish Khan

    2017-01-01

    Evidence regarding the association of smoking with various forms of chronic musculoskeletal pain is vast, but that with temporomandibular disorders (TMD) is scarce. The aims of this study are to evaluate the effect of smoking status (SS) and nicotine dependence (ND) on TMD pain intensity and treatment outcome in an Indian population with TMD. Nine hundred and sixty-two patients with TMD were selected for this longitudinal cohort study. Lifetime SS was evaluated and patients were classified as current smokers (YS), former smokers (FS), or nonsmokers (NS). The Fagerstrom test was used to evaluate the ND of YS. Pain intensity was evaluated using visual analog scale scores. Six months posttreatment, the pain intensity was again recorded. The effect of treatment was evaluated using a global transition outcome measure and categorized as treatment success or failure. A minimum 30% reduction in pain was used as a criterion for categorizing patients as those who had gotten "better." Data obtained from the study were compared using Chi-square tests, paired-samples t-tests, and one-way ANOVA tests. The criterion for statistical significance for all analyses was set at P = 0.05. Among groups of SS, YS showed the maximum pain intensity at baseline and posttreatment. The outcome of treatment was most successful in NS and least in FS. The number of patients who had gotten "better" after treatment was significantly highest in NS. There was no significant difference between groups of ND with respect to pain intensity, treatment outcome, or "better" patients. Among Indian patients with TMD, smokers reported significantly greater pain intensity and poorer response to treatment than NS. Pain intensity or treatment outcome was independent of ND.

  11. Quality-control design for surface-water sampling in the National Water-Quality Network

    Science.gov (United States)

    Riskin, Melissa L.; Reutter, David C.; Martin, Jeffrey D.; Mueller, David K.

    2018-04-10

    The data-quality objectives for samples collected at surface-water sites in the National Water-Quality Network include estimating the extent to which contamination, matrix effects, and measurement variability affect interpretation of environmental conditions. Quality-control samples provide insight into how well the samples collected at surface-water sites represent the true environmental conditions. Quality-control samples used in this program include field blanks, replicates, and field matrix spikes. This report describes the design for collection of these quality-control samples and the data management needed to properly identify these samples in the U.S. Geological Survey’s national database.

  12. Planning Considerations for a Mars Sample Receiving Facility: Summary and Interpretation of Three Design Studies

    Science.gov (United States)

    Beaty, David W.; Allen, Carlton C.; Bass, Deborah S.; Buxbaum, Karen L.; Campbell, James K.; Lindstrom, David J.; Miller, Sylvia L.; Papanastassiou, Dimitri A.

    2009-10-01

    It has been widely understood for many years that an essential component of a Mars Sample Return mission is a Sample Receiving Facility (SRF). The purpose of such a facility would be to take delivery of the flight hardware that lands on Earth, open the spacecraft and extract the sample container and samples, and conduct an agreed-upon test protocol, while ensuring strict containment and contamination control of the samples while in the SRF. Any samples that are found to be non-hazardous (or are rendered non-hazardous by sterilization) would then be transferred to long-term curation. Although the general concept of an SRF is relatively straightforward, there has been considerable discussion about implementation planning. The Mars Exploration Program carried out an analysis of the attributes of an SRF to establish its scope, including minimum size and functionality, budgetary requirements (capital cost, operating costs, cost profile), and development schedule. The approach was to arrange for three independent design studies, each led by an architectural design firm, and compare the results. While there were many design elements in common identified by each study team, there were significant differences in the way human operators were to interact with the systems. In aggregate, the design studies provided insight into the attributes of a future SRF and the complex factors to consider for future programmatic planning.

  13. Low-sensitivity H ∞ filter design for linear delta operator systems with sampling time jitter

    Science.gov (United States)

    Guo, Xiang-Gui; Yang, Guang-Hong

    2012-04-01

This article is concerned with the problem of designing H ∞ filters with low sensitivity to sampling time jitter for a class of linear discrete-time systems via the delta operator approach. The delta-domain model is used to avoid the inherent numerical ill-conditioning that results from using the standard shift-domain model at high sampling rates. Based on the projection lemma, in combination with the descriptor system approach often used to solve delay-related problems, a novel bounded real lemma with three slack variables for delta operator systems is presented. A sensitivity approach based on this novel lemma is proposed to mitigate the effects of sampling time jitter on system performance. The problem of designing a low-sensitivity filter can then be reduced to a convex optimisation problem. An important consideration in the design of such filters is the optimal trade-off between the standard H ∞ criterion and the sensitivity of the transfer function with respect to sampling time jitter. Finally, a numerical example demonstrates the validity of the proposed design method.
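The numerical-conditioning motivation for the delta operator can be seen in a small stand-alone sketch (not from the article): as the sampling period T shrinks, the shift-operator pole of a sampled first-order system crowds against 1, while the delta-operator pole, defined by δ = (q − 1)/T, stays near the continuous-time pole.

```python
from math import exp

# Pole of the sampled system x[k+1] = a*x[k] obtained from dx/dt = -x.
# In the shift (q) domain the pole is exp(-T); in the delta domain it is
# (exp(-T) - 1)/T, which converges to the continuous-time pole -1 as T -> 0.
for T in (1.0, 0.1, 0.01, 0.001):
    a_shift = exp(-T)
    a_delta = (a_shift - 1) / T
    print(f"T={T:6.3f}  shift pole={a_shift:.6f}  delta pole={a_delta:.6f}")
```

At T = 0.001 the shift pole is 0.999001, numerically indistinguishable from 1 in low-precision arithmetic, while the delta pole is about -0.9995, which is the ill-conditioning the abstract refers to.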

  14. Core design and operation optimization methods based on time-dependent perturbation theory

    International Nuclear Information System (INIS)

    Greenspan, E.

    1983-08-01

A general approach for the optimization of nuclear reactor core design and operation is outlined; it rests on two cornerstones: a newly developed time-dependent (or burnup-dependent) perturbation theory for nonlinear problems and a successive iteration technique. The resulting approach can handle realistic reactor models using computational methods of any degree of sophistication desired, while accounting for all the constraints imposed. Three general optimization strategies, differing in how they handle the constraints, are formulated. (author)

  15. Marijuana use and inpatient outcomes among hospitalized patients: analysis of the nationwide inpatient sample database

    OpenAIRE

Vin-Raviv, Neomi; Akinyemiju, Tomi; Meng, Qingrui; Sakhuja, Swati; Hayward, Reid

    2016-01-01

Abstract The purpose of this paper is to examine the relationship between marijuana use and health outcomes among hospitalized patients, including those hospitalized with a diagnosis of cancer. A total of 387,608 current marijuana users were identified based on ICD-9 codes for marijuana use among hospitalized patients in the Nationwide Inpatient Sample database between 2007 and 2011. Logistic regression analysis was performed to determine the association between marijuana use and heart failur...

  16. A SIMPLE FRAILTY QUESTIONNAIRE (FRAIL) PREDICTS OUTCOMES IN MIDDLE AGED AFRICAN AMERICANS

    Science.gov (United States)

    MORLEY, J.E.; MALMSTROM, T.K.; MILLER, D.K.

    2015-01-01

    Objective To validate the FRAIL scale. Design Longitudinal study. Setting Community. Participants Representative sample of African Americans age 49 to 65 years at onset of study. Measurements The 5-item FRAIL scale (Fatigue, Resistance, Ambulation, Illnesses, & Loss of Weight), at baseline and activities of daily living (ADLs), instrumental activities of daily living (IADLs), mortality, short physical performance battery (SPPB), gait speed, one-leg stand, grip strength and injurious falls at baseline and 9 years. Blood tests for CRP, SIL6R, STNFR1, STNFR2 and 25 (OH) vitamin D at baseline. Results Cross-sectionally the FRAIL scale correlated significantly with IADL difficulties, SPPB, grip strength and one-leg stand among participants with no baseline ADL difficulties (N=703) and those outcomes plus gait speed in those with no baseline ADL dependencies (N=883). TNFR1 was increased in pre-frail and frail subjects and CRP in some subgroups. Longitudinally (N=423 with no baseline ADL difficulties or N=528 with no baseline ADL dependencies), and adjusted for the baseline value for each outcome, being pre-frail at baseline significantly predicted future ADL difficulties, worse one-leg stand scores, and mortality in both groups, plus IADL difficulties in the dependence-excluded group. Being frail at baseline significantly predicted future ADL difficulties, IADL difficulties, and mortality in both groups, plus worse SPPB in the dependence-excluded group. Conclusion This study has validated the FRAIL scale in a late middle-aged African American population. This simple 5-question scale is an excellent screening test for clinicians to identify frail persons at risk of developing disability as well as decline in health functioning and mortality. PMID:22836700

  17. The Importance of Design Thinking for Technological Literacy: A Phenomenological Perspective

    Science.gov (United States)

    Wells, Alastair

    2013-01-01

"We know that progress depends on discovery, inventions, creativity and design, but we have simply supposed that it happens anyway," de Bono (1999 p. 43). Technology education is ostensibly a foundation for future designers and creative thinking. However, evidence of good design or creative thinking in outcomes displayed in school…

  18. Architectural Design Space Exploration of an FPGA-based Compressed Sampling Engine

    DEFF Research Database (Denmark)

    El-Sayed, Mohammad; Koch, Peter; Le Moullec, Yannick

    2015-01-01

    We present the architectural design space exploration of a compressed sampling engine for use in a wireless heart-rate monitoring system. We show how parallelism affects execution time at the register transfer level. Furthermore, two example solutions (modified semi-parallel and full...

  19. Employment-based abstinence reinforcement as a maintenance intervention for the treatment of cocaine dependence: post-intervention outcomes

    Science.gov (United States)

    DeFulio, Anthony; Silverman, Kenneth

    2011-01-01

    Aims Due to the chronicity of cocaine dependence, practical and effective maintenance interventions are needed to sustain long-term abstinence. We sought to assess the effects of long-term employment-based reinforcement of cocaine abstinence after discontinuation of the intervention. Design Participants who initiated sustained opiate and cocaine abstinence during a 6-month abstinence reinforcement and training program worked as data entry operators and were randomly assigned to a group that could work independent of drug use (Control, n = 24), or an abstinence-contingent employment (n = 27) group that was required to provide cocaine- and opiate-negative urine samples to work and maintain maximum rate of pay. Setting A nonprofit data entry business. Participants Unemployed welfare recipients who persistently used cocaine while in methadone treatment. Measurements Urine samples and self-reports were collected every six months for 30 months. Findings During the employment year, abstinence-contingent employment participants provided significantly more cocaine-negative samples than controls (82.7% and 54.2%; P = .01, OR = 4.61). During the follow-up year, the groups had similar rates of cocaine-negative samples (44.2% and 50.0%; P = .93), and HIV-risk behaviors. Participants’ social, employment, economic, and legal conditions were similar in the two groups across all phases of the study. Conclusions Employment-based reinforcement effectively maintains long-term cocaine abstinence, but many patients relapse to use when the abstinence contingency is discontinued, even after a year of abstinence-contingent employment. Relapse could be prevented in many patients by leaving employment-based abstinence reinforcement in place indefinitely, which could be facilitated by integrating it into typical workplaces. PMID:21226886

  20. Outcomes in cervical screening using various cytology technologies

    DEFF Research Database (Denmark)

    Barken, Sidsel S; Rebolj, Matejka; Lynge, Elsebeth

    2013-01-01

Unlike for human papillomavirus screening, little is known about the possible age-dependent variation in the outcomes of cervical cytology screening. The aim of our study was to describe age-related outcomes of five cytological technologies in a population-based screening program targeting women aged 23-59 years. All cervical cytology from women residing in Copenhagen has been analyzed in the laboratory of the Department of Pathology, Hvidovre University Hospital. We studied five technology phases: (1) conventional cytology with manual reading, (2) conventional cytology with 50% automatically... of samples with atypical squamous cells of undetermined significance or worse (≥ASCUS) by age and technology phase. We included 391 140 samples. The proportion of ≥ASCUS increased steadily from 3.8% in phase 1 to 6.0% in phase 5. This pattern varied considerably across age groups. In women aged 23-34 years...

  1. The Fagerström Test for Nicotine Dependence in a Dutch sample of daily smokers and ex-smokers

    NARCIS (Netherlands)

    Vink, Jacqueline M.; Willemsen, Gonneke; Beem, A. Leo; Boomsma, Dorret I.

    2005-01-01

    We explored the performance of the Fagerström Test for Nicotine Dependence (FTND) in a sample of 1378 daily smokers and 1058 ex-smokers who participated in a survey study of the Netherlands Twin Register. FTND scores were higher for smokers than for ex-smokers. Nicotine dependence level was not

  2. Integrating Patient-Reported Outcomes into Spine Surgical Care through Visual Dashboards: Lessons Learned from Human-Centered Design

    Science.gov (United States)

    Hartzler, Andrea L.; Chaudhuri, Shomir; Fey, Brett C.; Flum, David R.; Lavallee, Danielle

    2015-01-01

    Introduction: The collection of patient-reported outcomes (PROs) draws attention to issues of importance to patients—physical function and quality of life. The integration of PRO data into clinical decisions and discussions with patients requires thoughtful design of user-friendly interfaces that consider user experience and present data in personalized ways to enhance patient care. Whereas most prior work on PROs focuses on capturing data from patients, little research details how to design effective user interfaces that facilitate use of this data in clinical practice. We share lessons learned from engaging health care professionals to inform design of visual dashboards, an emerging type of health information technology (HIT). Methods: We employed human-centered design (HCD) methods to create visual displays of PROs to support patient care and quality improvement. HCD aims to optimize the design of interactive systems through iterative input from representative users who are likely to use the system in the future. Through three major steps, we engaged health care professionals in targeted, iterative design activities to inform the development of a PRO Dashboard that visually displays patient-reported pain and disability outcomes following spine surgery. Findings: Design activities to engage health care administrators, providers, and staff guided our work from design concept to specifications for dashboard implementation. Stakeholder feedback from these health care professionals shaped user interface design features, including predefined overviews that illustrate at-a-glance trends and quarterly snapshots, granular data filters that enable users to dive into detailed PRO analytics, and user-defined views to share and reuse. Feedback also revealed important considerations for quality indicators and privacy-preserving sharing and use of PROs. Conclusion: Our work illustrates a range of engagement methods guided by human-centered principles and design

  3. Integrating Patient-Reported Outcomes into Spine Surgical Care through Visual Dashboards: Lessons Learned from Human-Centered Design.

    Science.gov (United States)

    Hartzler, Andrea L; Chaudhuri, Shomir; Fey, Brett C; Flum, David R; Lavallee, Danielle

    2015-01-01

    The collection of patient-reported outcomes (PROs) draws attention to issues of importance to patients-physical function and quality of life. The integration of PRO data into clinical decisions and discussions with patients requires thoughtful design of user-friendly interfaces that consider user experience and present data in personalized ways to enhance patient care. Whereas most prior work on PROs focuses on capturing data from patients, little research details how to design effective user interfaces that facilitate use of this data in clinical practice. We share lessons learned from engaging health care professionals to inform design of visual dashboards, an emerging type of health information technology (HIT). We employed human-centered design (HCD) methods to create visual displays of PROs to support patient care and quality improvement. HCD aims to optimize the design of interactive systems through iterative input from representative users who are likely to use the system in the future. Through three major steps, we engaged health care professionals in targeted, iterative design activities to inform the development of a PRO Dashboard that visually displays patient-reported pain and disability outcomes following spine surgery. Design activities to engage health care administrators, providers, and staff guided our work from design concept to specifications for dashboard implementation. Stakeholder feedback from these health care professionals shaped user interface design features, including predefined overviews that illustrate at-a-glance trends and quarterly snapshots, granular data filters that enable users to dive into detailed PRO analytics, and user-defined views to share and reuse. Feedback also revealed important considerations for quality indicators and privacy-preserving sharing and use of PROs. Our work illustrates a range of engagement methods guided by human-centered principles and design recommendations for optimizing PRO Dashboards for patient

  4. Spatiotemporally Representative and Cost-Efficient Sampling Design for Validation Activities in Wanglang Experimental Site

    Directory of Open Access Journals (Sweden)

    Gaofei Yin

    2017-11-01

Full Text Available Spatiotemporally representative Elementary Sampling Units (ESUs) are required for capturing the temporal variations in surface spatial heterogeneity through field measurements. Since inaccessibility often coexists with heterogeneity, a cost-efficient sampling design is mandatory. We proposed a sampling strategy to generate spatiotemporally representative and cost-efficient ESUs based on the conditioned Latin hypercube sampling scheme. The proposed strategy was constrained by multi-temporal Normalized Difference Vegetation Index (NDVI) imagery, and the ESUs were limited to a sampling-feasible region established based on accessibility criteria. A novel criterion based on the Overlapping Area (OA) between the NDVI frequency distribution histogram from the sampled ESUs and that from the entire study area was used to assess the sampling efficiency. A case study in Wanglang National Nature Reserve in China showed that the proposed strategy improves the spatiotemporal representativeness of the sampling (mean annual OA = 74.7%) compared to the single-temporally constrained (OA = 68.7%) and random sampling (OA = 63.1%) strategies. The introduction of the feasible-region constraint significantly reduces the labour-intensive in-situ characterization required, at the expense of about a 9% loss in the spatiotemporal representativeness of the sampling. Our study will support the validation activities in the Wanglang experimental site, providing a benchmark for locating the nodes of automatic observation systems (e.g., LAINet) that need a spatially distributed and temporally fixed sampling design.
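The Overlapping Area criterion described in this record is simple to compute. The stdlib-only sketch below is illustrative (made-up stand-in values rather than the Wanglang NDVI data): it bins the sampled and region-wide values on shared edges, normalizes to relative frequencies, and sums the bin-wise minima, so OA = 1 means the sample reproduces the regional distribution exactly.

```python
import random
from collections import Counter

def overlap_area(sampled, region, bins=20, lo=0.0, hi=1.0):
    """OA between the value-frequency histograms of the sampled units
    and the whole study area (1.0 = perfectly representative)."""
    width = (hi - lo) / bins
    def rel_freq(values):
        counts = Counter(min(int((v - lo) / width), bins - 1) for v in values)
        return [counts.get(b, 0) / len(values) for b in range(bins)]
    return sum(min(a, b) for a, b in zip(rel_freq(sampled), rel_freq(region)))

random.seed(0)
region = [random.betavariate(2, 2) for _ in range(10_000)]  # stand-in NDVI
representative = random.sample(region, 200)                 # unconstrained sample
biased = [v for v in region if v > 0.5][:200]               # "accessible" areas only
print(overlap_area(representative, region))  # close to 1
print(overlap_area(biased, region))          # much lower
```

The accessibility-biased sample misses the lower half of the distribution entirely, which the OA criterion penalizes directly; this is the same trade-off the abstract quantifies as the ~9% representativeness loss.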

  5. Evaluation of Nine Consensus Indices in Delphi Foresight Research and Their Dependency on Delphi Survey Characteristics: A Simulation Study and Debate on Delphi Design and Interpretation.

    Science.gov (United States)

    Birko, Stanislav; Dove, Edward S; Özdemir, Vural

    2015-01-01

    The extent of consensus (or the lack thereof) among experts in emerging fields of innovation can serve as antecedents of scientific, societal, investor and stakeholder synergy or conflict. Naturally, how we measure consensus is of great importance to science and technology strategic foresight. The Delphi methodology is a widely used anonymous survey technique to evaluate consensus among a panel of experts. Surprisingly, there is little guidance on how indices of consensus can be influenced by parameters of the Delphi survey itself. We simulated a classic three-round Delphi survey building on the concept of clustered consensus/dissensus. We evaluated three study characteristics that are pertinent for design of Delphi foresight research: (1) the number of survey questions, (2) the sample size, and (3) the extent to which experts conform to group opinion (the Group Conformity Index) in a Delphi study. Their impacts on the following nine Delphi consensus indices were then examined in 1000 simulations: Clustered Mode, Clustered Pairwise Agreement, Conger's Kappa, De Moivre index, Extremities Version of the Clustered Pairwise Agreement, Fleiss' Kappa, Mode, the Interquartile Range and Pairwise Agreement. The dependency of a consensus index on the Delphi survey characteristics was expressed from 0.000 (no dependency) to 1.000 (full dependency). The number of questions (range: 6 to 40) in a survey did not have a notable impact whereby the dependency values remained below 0.030. The variation in sample size (range: 6 to 50) displayed the top three impacts for the Interquartile Range, the Clustered Mode and the Mode (dependency = 0.396, 0.130, 0.116, respectively). The Group Conformity Index, a construct akin to measuring stubbornness/flexibility of experts' opinions, greatly impacted all nine Delphi consensus indices (dependency = 0.200 to 0.504), except the Extremity CPWA and the Interquartile Range that were impacted only beyond the first decimal point (dependency = 0
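Two of the indices named above, the Interquartile Range and Pairwise Agreement, are easy to illustrate on simulated panel data. This stdlib-only sketch uses invented response distributions, not the paper's simulation: a panel that conforms to a shared opinion yields a small IQR and high pairwise agreement, while a dispersed panel does not.

```python
import random
import statistics

def iqr(responses):
    # interquartile range: smaller spread = stronger consensus
    q1, _, q3 = statistics.quantiles(responses, n=4)
    return q3 - q1

def pairwise_agreement(responses):
    # fraction of expert pairs giving the identical rating
    n = len(responses)
    agree = sum(responses[i] == responses[j]
                for i in range(n) for j in range(i + 1, n))
    return agree / (n * (n - 1) / 2)

random.seed(7)
# one Delphi question rated on a 1-9 scale by a 30-expert panel
conforming = [min(9, max(1, round(random.gauss(7, 0.8)))) for _ in range(30)]
dispersed = [random.randint(1, 9) for _ in range(30)]
print(iqr(conforming), pairwise_agreement(conforming))
print(iqr(dispersed), pairwise_agreement(dispersed))
```

The gap between the two panels is what the paper's Group Conformity Index manipulates; the strong dependency it reports for the IQR and Mode is visible here as the sensitivity of these indices to how tightly responses cluster.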

  6. Outcome of Minnesota's gambling treatment programs.

    Science.gov (United States)

    Stinchfield, R; Winters, K C

    2001-01-01

    This study measured the outcome of four state-supported outpatient gambling treatment programs in Minnesota. The programs were developed specifically for the treatment of pathological gamblers and offered multiple modalities of treatment including individual, group, education, twelve-step work, family groups, and financial counseling. The therapeutic orientation was eclectic with an emphasis on the twelve steps of Gamblers Anonymous (GA) and a treatment goal of abstinence. The sample included 348 men and 220 women treated between January 1992 and January 1995. A pretest-posttest design was utilized with multidimensional assessments obtained at intake, discharge, six-months, and twelve-months post-discharge. Variables assessed included a range of clinical and outcome variables. At six month follow-up, 28% reported that they had abstained from gambling during the six months following discharge and an additional 20% had gambled less than once per month. Almost half of the sample (48%) showed clinically significant improvement in gambling frequency at six month follow-up. Outcome variables of gambling frequency, SOGS scores, amount of money gambled, number of friends who gamble, psychosocial problems, and number of financial problems, all showed statistically significant improvements from pretreatment to follow-up. The treatment programs yielded outcome results similar to those reported for alcohol and drug abuse treatment programs.

  7. The Effect of Childbirth Self-Efficacy on Perinatal Outcomes

    Science.gov (United States)

    Tilden, Ellen L.; Caughey, Aaron B.; Lee, Christopher S.; Emeis, Cathy

    2016-01-01

    Objective To synthesize and critique the quantitative literature on measuring childbirth self-efficacy and the effect of childbirth self-efficacy on perinatal outcomes. Data Sources Eligible studies were identified through searching MEDLINE, CINAHL, Scopus, and Google Scholar databases. Study Selection Published research using a tool explicitly intended to measure childbirth self-efficacy and also examining outcomes within the perinatal period were included. All manuscripts were in English and published in peer-reviewed journals. Data Extraction First author, country, year of publication, reference and definition of childbirth self-efficacy, measurement of childbirth self-efficacy, sample recruitment and retention, sample characteristics, study design, interventions (with experimental and quasi-experimental studies), and perinatal outcomes were extracted and summarized. Data Synthesis Of 619 publications, 23 studies published between 1983 and 2015 met inclusion criteria and were critiqued and synthesized in this review. Conclusions There is overall consistency in how childbirth self-efficacy is defined and measured among studies, facilitating comparison and synthesis. Our findings suggest that increased childbirth self-efficacy is associated with a wide variety of improved perinatal outcomes. Moreover, there is evidence that childbirth self-efficacy is a psychosocial factor that can be modified through various efficacy-enhancing interventions. Future researchers will be able to build knowledge in this area through: (a) utilization of experimental and quasi-experimental design; (b) recruitment and retention of more diverse samples; (c) explicit reporting of definitions of terms (e.g. ‘high risk’); (d) investigation of interventions that increase childbirth self-efficacy during pregnancy; and, (e) investigation regarding how childbirth self-efficacy enhancing interventions might lead to decreased active labor pain and suffering. Exploratory research should

  8. Association of the Affordable Care Act Dependent Coverage Provision With Prenatal Care Use and Birth Outcomes.

    Science.gov (United States)

    Daw, Jamie R; Sommers, Benjamin D

    2018-02-13

    The effect of the Affordable Care Act (ACA) dependent coverage provision on pregnancy-related health care and health outcomes is unknown. To determine whether the dependent coverage provision was associated with changes in payment for birth, prenatal care, and birth outcomes. Retrospective cohort study, using a differences-in-differences analysis of individual-level birth certificate data comparing live births among US women aged 24 to 25 years (exposure group) and women aged 27 to 28 years (control group) before (2009) and after (2011-2013) enactment of the dependent coverage provision. Results were stratified by marital status. The dependent coverage provision of the ACA, which allowed young adults to stay on their parent's health insurance until age 26 years. Primary outcomes were payment source for birth, early prenatal care (first visit in first trimester), and adequate prenatal care (a first trimester visit and 80% of expected visits). Secondary outcomes were cesarean delivery, premature birth, low birth weight, and infant neonatal intensive care unit (NICU) admission. The study population included 1 379 005 births among women aged 24-25 years (exposure group; 299 024 in 2009; 1 079 981 in 2011-2013), and 1 551 192 births among women aged 27-28 years (control group; 325 564 in 2009; 1 225 628 in 2011-2013). From 2011-2013, compared with 2009, private insurance payment for births increased in the exposure group (36.9% to 35.9% [difference, -1.0%]) compared with the control group (52.4% to 51.1% [difference, -1.3%]), adjusted difference-in-differences, 1.9 percentage points (95% CI, 1.6 to 2.1). Medicaid payment decreased in the exposure group (51.6% to 53.6% [difference, 2.0%]) compared with the control group (37.4% to 39.4% [difference, 1.9%]), adjusted difference-in-differences, -1.4 percentage points (95% CI, -1.7 to -1.2). Self-payment for births decreased in the exposure group (5.2% to 4.3% [difference, -0.9%]) compared with the
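The differences-in-differences logic of this study reduces to a simple contrast, sketched below with made-up percentages (not the study's estimates; the published figures are covariate-adjusted, so they do not equal a raw contrast like this one).

```python
def diff_in_diff(exp_pre, exp_post, ctrl_pre, ctrl_post):
    # change in the exposure group net of the secular change
    # observed in the control group over the same period
    return (exp_post - exp_pre) - (ctrl_post - ctrl_pre)

# illustrative numbers only: exposure group +2.0 points, control -1.0 point
print(diff_in_diff(37.0, 39.0, 52.0, 51.0))  # → 3.0
```

Subtracting the control group's change is what lets the design attribute the remaining difference to the dependent coverage provision rather than to trends affecting all young women.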

  9. The Influence of Pituitary Size on Outcome After Transsphenoidal Hypophysectomy in a Large Cohort of Dogs with Pituitary-Dependent Hypercortisolism

    NARCIS (Netherlands)

    van Rijn, Sarah; Galac, S.; Tryfonidou, M. A.; Hesselink, J. W.; Penning, L. C.; Kooistra, H. S.; Meij, B. P.

    2016-01-01

    BACKGROUND Transsphenoidal hypophysectomy is one of the treatment strategies in the comprehensive management of dogs with pituitary-dependent hypercortisolism (PDH). OBJECTIVES To describe the influence of pituitary size at time of pituitary gland surgery on long-term outcome. ANIMALS

  10. Relative efficiency and sample size for cluster randomized trials with variable cluster sizes.

    Science.gov (United States)

    You, Zhiying; Williams, O Dale; Aban, Inmaculada; Kabagambe, Edmond Kato; Tiwari, Hemant K; Cutter, Gary

    2011-02-01

The statistical power of cluster randomized trials depends on two sample size components: the number of clusters per group and the number of individuals within clusters (cluster size). Variable cluster sizes are common, and this variation alone may have a significant impact on study power. Previous approaches have taken this into account either by adjusting the total sample size using a designated design effect or by adjusting the number of clusters according to an assessment of the relative efficiency of unequal versus equal cluster sizes. This article defines a relative efficiency of unequal versus equal cluster sizes using noncentrality parameters, investigates properties of this measure, and proposes an approach for adjusting the required sample size accordingly. We focus on comparing two groups with normally distributed outcomes using the t-test, use the noncentrality parameter to define the relative efficiency of unequal versus equal cluster sizes, and show that statistical power depends only on this parameter for a given number of clusters. We calculate the sample size required for a trial with unequal cluster sizes to have the same power as one with equal cluster sizes. Relative efficiency based on the noncentrality parameter is straightforward to calculate and easy to interpret. It connects the required mean cluster size directly to the required sample size with equal cluster sizes. Consequently, our approach first determines the sample size requirements with equal cluster sizes for a pre-specified study power and then calculates the required mean cluster size while keeping the number of clusters unchanged. Our approach allows adjustment in mean cluster size alone or simultaneous adjustment in mean cluster size and number of clusters, and is a flexible alternative and useful complement to existing methods. Comparison indicated that we have defined a relative efficiency that is greater than the relative efficiency in the literature under some conditions. Our measure...
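The design-effect reasoning in the abstract can be made concrete with a short sketch. This uses the standard design-effect approximation with a coefficient-of-variation correction for unequal cluster sizes, which is an assumption here, not the authors' noncentrality-based measure.

```python
from math import ceil

def design_effect(mean_cluster_size, icc, cv=0.0):
    # cv = coefficient of variation of cluster sizes; cv = 0 recovers
    # the classic equal-cluster inflation factor 1 + (m - 1) * icc
    return 1 + ((cv ** 2 + 1) * mean_cluster_size - 1) * icc

n_individual = 175  # per-group n from a standard two-sample calculation
for cv in (0.0, 0.4, 0.8):
    de = design_effect(20, icc=0.05, cv=cv)
    clusters = ceil(n_individual * de / 20)
    print(f"cv={cv}: design effect={de:.2f}, clusters per group={clusters}")
```

Holding the number of clusters fixed instead and solving the same identity for the required mean cluster size mirrors the adjustment strategy the article proposes.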

  11. Sample size for post-marketing safety studies based on historical controls.

    Science.gov (United States)

    Wu, Yu-te; Makuch, Robert W

    2010-08-01

As part of a drug's entire life cycle, post-marketing studies are an important part of identifying rare, serious adverse events. Recently, the US Food and Drug Administration (FDA) has begun to implement new post-marketing safety mandates as a consequence of increased emphasis on safety. The purpose of this research is to provide an exact sample size formula for the proposed hybrid design, based on a two-group cohort study with incorporation of historical external data. An exact sample size formula based on the Poisson distribution is developed, because the detection of rare events is our outcome of interest. Performance of the exact method is compared to its approximate large-sample-theory counterpart. The proposed hybrid design requires a smaller sample size compared to the standard two-group prospective study design. In addition, the exact method reduces the number of subjects required in the treatment group by up to 30% compared to the approximate method for the study scenarios examined. The proposed hybrid design retains the advantages and rationale of the two-group design, with smaller sample sizes generally required. 2010 John Wiley & Sons, Ltd.
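Because the outcome is a rare-event count, power calculations of this kind rest on exact Poisson tail probabilities rather than normal approximations. The stdlib-only sketch below is a generic one-sample illustration, not the paper's hybrid formula: it sizes a cohort so that an exact one-sided test of the event count attains a target power against an elevated rate.

```python
from math import exp, factorial

def poisson_sf(k, lam):
    # P(X >= k) for X ~ Poisson(lam), computed exactly
    return 1 - sum(lam ** i * exp(-lam) / factorial(i) for i in range(k))

def critical_value(lam0, alpha=0.05):
    # smallest c with P(X >= c | lam0) <= alpha (one-sided exact test)
    c = 0
    while poisson_sf(c, lam0) > alpha:
        c += 1
    return c

def power(n, rate0, rate1):
    # power to detect rate1 when the null adverse-event rate is rate0
    c = critical_value(n * rate0)
    return poisson_sf(c, n * rate1)

def required_n(rate0, rate1, target_power=0.80):
    n = 100
    while power(n, rate0, rate1) < target_power:
        n += 100  # coarse search step, for illustration only
    return n

n = required_n(rate0=0.01, rate1=0.03)
print(n, round(power(n, 0.01, 0.03), 3))
```

Exact tail sums like these are why the exact approach can undercut the approximate formula for rare events: the normal approximation is conservative when the expected counts are small.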

  12. Age-dependent effect of apolipoprotein E4 on functional outcome after controlled cortical impact in mice.

    Science.gov (United States)

    Mannix, Rebekah C; Zhang, Jimmy; Park, Juyeon; Zhang, Xuan; Bilal, Kiran; Walker, Kendall; Tanzi, Rudolph E; Tesco, Giuseppina; Whalen, Michael J

    2011-01-01

    The apolipoprotein E4 (APOE4) gene leads to increased brain amyloid beta (Aβ) and poor outcome in adults with traumatic brain injury (TBI); however, its role in childhood TBI is controversial. We hypothesized that the transgenic expression of human APOE4 worsens the outcome after controlled cortical impact (CCI) in adult but not immature mice. Adult and immature APOE4 mice had worse motor outcome after CCI (P<0.001 versus wild type (WT)), but the Morris water maze performance was worse only in adult APOE4 mice (P=0.028 at 2 weeks, P=0.019 at 6 months versus WT), because immature APOE4 mice had performance similar to WT for up to 1 year after injury. Brain lesion size was similar in adult APOE4 mice but was decreased (P=0.029 versus WT) in injured immature APOE4 mice. Microgliosis was similar in all groups. Soluble brain Aβ(40) was increased at 48 hours after CCI in adult and immature APOE4 mice and in adult WT (P<0.05), and was dynamically regulated during the chronic period by APOE4 in adults but not immature mice. The data suggest age-dependent effects of APOE4 on cognitive outcome after TBI, and that therapies targeting APOE4 may be more effective in adults versus children with TBI.

  13. Dose dependency of outcomes of intrapleural fibrinolytic therapy in new rabbit empyema models

    Science.gov (United States)

    Florova, Galina; Azghani, Ali O.; Buchanan, Ann; Boren, Jake; Allen, Timothy; Rahman, Najib M.; Koenig, Kathleen; Chamiso, Mignote; Karandashova, Sophia; Henry, James; Idell, Steven

    2016-01-01

    The incidence of empyema (EMP) is increasing worldwide; EMP generally occurs with pleural loculation and impaired drainage is often treated with intrapleural fibrinolytic therapy (IPFT) or surgery. A number of IPFT options are used clinically with empiric dosing and variable outcomes in adults. To evaluate mechanisms governing intrapleural fibrinolysis and disease outcomes, models of Pasteurella multocida and Streptococcus pneumoniae were generated in rabbits and the animals were treated with either human tissue (tPA) plasminogen activator or prourokinase (scuPA). Rabbit EMP was characterized by the development of pleural adhesions detectable by chest ultrasonography and fibrinous coating of the pleura. Similar to human EMP, rabbits with EMP accumulated sizable, 20- to 40-ml fibrinopurulent pleural effusions associated with extensive intrapleural organization, significantly increased pleural thickness, suppression of fibrinolytic and plasminogen-activating activities, and accumulation of high levels of plasminogen activator inhibitor 1, plasminogen, and extracellular DNA. IPFT with tPA (0.145 mg/kg) or scuPA (0.5 mg/kg) was ineffective in rabbit EMP (n = 9 and 3 for P. multocida and S. pneumoniae, respectively); 2 mg/kg tPA or scuPA IPFT (n = 5) effectively cleared S. pneumoniae-induced EMP collections in 24 h with no bleeding observed. Although intrapleural fibrinolytic activity for up to 40 min after IPFT was similar for effective and ineffective doses of fibrinolysin, it was lower for tPA than for scuPA treatments. These results demonstrate similarities between rabbit and human EMP, the importance of pleural fluid PAI-1 activity, and levels of plasminogen in the regulation of intrapleural fibrinolysis and illustrate the dose dependency of IPFT outcomes in EMP. PMID:27343192

  14. Concepts in sample size determination

    Directory of Open Access Journals (Sweden)

    Umadevi K Rao

    2012-01-01

    Investigators involved in clinical, epidemiological or translational research have the drive to publish their results so that they can extrapolate their findings to the population. This begins with the preliminary steps of deciding the topic to be studied, the subjects, and the type of study design. In this context, the researcher must determine how many subjects will be required for the proposed study. Thus, the number of individuals to be included in the study, i.e., the sample size, is an important consideration in the design of many clinical studies. Sample size determination should be based on the difference in the outcome between the two groups studied, as in an analytical study, as well as on the accepted p value for statistical significance and the required statistical power to test the hypothesis. The accepted risk of type I error, or alpha, which by convention is set at 0.05 in biomedical research, defines the cutoff at which the p value obtained in the study is judged significant or not. Power in clinical research is the likelihood of finding a statistically significant result when one exists, and is typically set to at least 80%. This is necessary because even the most rigorously executed studies may fail to answer the research question if the sample size is too small. Conversely, a study with too large a sample size will be difficult to conduct and will waste time and resources. The goal of sample size planning is therefore to estimate an appropriate number of subjects for a given study design. This article describes the concepts involved in estimating sample size.
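
    The quantities discussed above combine in the standard normal-approximation formula for comparing two means, n = 2(z_(1-alpha/2) + z_(1-beta))^2 * sigma^2 / delta^2 per group. A minimal sketch with illustrative numbers (not values from the article):

```python
from math import ceil
from statistics import NormalDist

def sample_size_two_means(delta, sd, alpha=0.05, power=0.80):
    """Approximate n per group for a two-sided two-sample z-test.

    delta: smallest difference in means worth detecting
    sd:    common standard deviation of the outcome
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value for alpha
    z_beta = NormalDist().inv_cdf(power)           # critical value for power
    n = 2 * ((z_alpha + z_beta) * sd / delta) ** 2
    return ceil(n)

# e.g. detect a 5-unit mean difference with SD 10 at alpha = 0.05, power = 80%
n = sample_size_two_means(delta=5, sd=10)
```

Raising the required power (say, to 90%) increases the required n, illustrating the trade-off the article describes.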

  15. Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples

    Energy Technology Data Exchange (ETDEWEB)

    Shine, E. P.; Poirier, M. R.

    2013-10-29

    Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process to achieve glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not a list of outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and still continue to be performed in order to evaluate options for process improvement. Study designs were based on use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose only to include prototypic case studies that typify the characteristics of frequently used designs. Case studies have been presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the final waste glass product chemical composition and durability to that of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition in the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. The time when many of these designs were implemented was in an age when the sampling ideas of Pierre Gy were not as widespread as they are today. Nonetheless, the engineers and

  16. Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples

    International Nuclear Information System (INIS)

    Shine, E. P.; Poirier, M. R.

    2013-01-01

    Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process to achieve glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not a list of outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and still continue to be performed in order to evaluate options for process improvement. Study designs were based on use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose only to include prototypic case studies that typify the characteristics of frequently used designs. Case studies have been presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the final waste glass product chemical composition and durability to that of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition in the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. The time when many of these designs were implemented was in an age when the sampling ideas of Pierre Gy were not as widespread as they are today. Nonetheless, the engineers and

  17. Precision, time, and cost: a comparison of three sampling designs in an emergency setting

    Science.gov (United States)

    Deitchler, Megan; Deconinck, Hedwig; Bergeron, Gilles

    2008-01-01

    The conventional method to collect data on the health, nutrition, and food security status of a population affected by an emergency is a 30 × 30 cluster survey. This sampling method can be time and resource intensive and, accordingly, may not be the most appropriate one when data are needed rapidly for decision making. In this study, we compare the precision, time and cost of the 30 × 30 cluster survey with two alternative sampling designs: a 33 × 6 cluster design (33 clusters, 6 observations per cluster) and a 67 × 3 cluster design (67 clusters, 3 observations per cluster). Data for each sampling design were collected concurrently in West Darfur, Sudan in September-October 2005 in an emergency setting. Results of the study show the 30 × 30 design to provide more precise results (i.e. narrower 95% confidence intervals) than the 33 × 6 and 67 × 3 designs for most child-level indicators. Exceptions are indicators of immunization and vitamin A capsule supplementation coverage, which show a high intra-cluster correlation. Although the 33 × 6 and 67 × 3 designs provide wider confidence intervals than the 30 × 30 design for child anthropometric indicators, the 33 × 6 and 67 × 3 designs provide the opportunity to conduct an LQAS hypothesis test to detect whether or not a critical threshold of global acute malnutrition prevalence has been exceeded, whereas the 30 × 30 design does not. For the household-level indicators tested in this study, the 67 × 3 design provides the most precise results. However, our results show that neither the 33 × 6 nor the 67 × 3 design is appropriate for assessing indicators of mortality. In this field application, data collection for the 33 × 6 and 67 × 3 designs required substantially less time and cost than that required for the 30 × 30 design. The findings of this study suggest the 33 × 6 and 67 × 3 designs can provide useful time- and resource-saving alternatives to the 30 × 30 method of data collection in emergency
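
    The precision differences among such cluster designs can be roughly anticipated from the design effect DEFF = 1 + (m − 1) × ICC, where m is the number of observations per cluster and ICC the intra-cluster correlation: fewer observations per cluster shrink DEFF, but fewer total observations widen the interval. A sketch with assumed prevalence and ICC values (illustrative, not the values measured in Darfur):

```python
from math import sqrt

def ci_half_width(p, clusters, m, icc, z=1.96):
    """Approximate 95% CI half-width for a prevalence p under cluster sampling.

    Uses the standard design effect DEFF = 1 + (m - 1) * ICC,
    where m is the number of observations per cluster.
    """
    n = clusters * m                      # total sample size
    deff = 1 + (m - 1) * icc              # variance inflation from clustering
    return z * sqrt(p * (1 - p) * deff / n)

# Illustrative comparison at 15% prevalence and a low ICC of 0.05 (assumed):
for clusters, m in [(30, 30), (33, 6), (67, 3)]:
    hw = ci_half_width(p=0.15, clusters=clusters, m=m, icc=0.05)
    print(f"{clusters} x {m}: +/- {hw:.3f}")
```

With a low ICC the 30 × 30 design gives the narrowest interval, as in the article; with a high ICC (as for the immunization indicators) the ordering reverses and the smaller-cluster designs become relatively more precise.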

  18. Precision, time, and cost: a comparison of three sampling designs in an emergency setting

    Directory of Open Access Journals (Sweden)

    Deconinck Hedwig

    2008-05-01

    The conventional method to collect data on the health, nutrition, and food security status of a population affected by an emergency is a 30 × 30 cluster survey. This sampling method can be time and resource intensive and, accordingly, may not be the most appropriate one when data are needed rapidly for decision making. In this study, we compare the precision, time and cost of the 30 × 30 cluster survey with two alternative sampling designs: a 33 × 6 cluster design (33 clusters, 6 observations per cluster) and a 67 × 3 cluster design (67 clusters, 3 observations per cluster). Data for each sampling design were collected concurrently in West Darfur, Sudan in September-October 2005 in an emergency setting. Results of the study show the 30 × 30 design to provide more precise results (i.e. narrower 95% confidence intervals) than the 33 × 6 and 67 × 3 designs for most child-level indicators. Exceptions are indicators of immunization and vitamin A capsule supplementation coverage, which show a high intra-cluster correlation. Although the 33 × 6 and 67 × 3 designs provide wider confidence intervals than the 30 × 30 design for child anthropometric indicators, the 33 × 6 and 67 × 3 designs provide the opportunity to conduct an LQAS hypothesis test to detect whether or not a critical threshold of global acute malnutrition prevalence has been exceeded, whereas the 30 × 30 design does not. For the household-level indicators tested in this study, the 67 × 3 design provides the most precise results. However, our results show that neither the 33 × 6 nor the 67 × 3 design is appropriate for assessing indicators of mortality. In this field application, data collection for the 33 × 6 and 67 × 3 designs required substantially less time and cost than that required for the 30 × 30 design. The findings of this study suggest the 33 × 6 and 67 × 3 designs can provide useful time- and resource-saving alternatives to the 30 × 30 method of data

  19. Gamma-ray yield dependence on bulk density and moisture content of a sample of a PGNAA setup. A Monte Carlo study

    International Nuclear Information System (INIS)

    Nagadi, M.M.; Naqvi, A.A.

    2007-01-01

    Monte Carlo calculations were carried out to study the dependence of γ-ray yield on the bulk density and moisture content of a sample in a thermal-neutron capture-based prompt gamma neutron activation analysis (PGNAA) setup. The results of the study showed a strong dependence of the γ-ray yield upon the sample bulk density. An order of magnitude increase in yield of 1.94 and 6.42 MeV prompt γ-rays from calcium in a Portland cement sample was observed for a corresponding order of magnitude increase in the sample bulk density. In contrast, the γ-ray yield has a weak dependence on sample moisture content: an increase of only 20% in yield of the 1.94 and 6.42 MeV prompt γ-rays from calcium in the Portland cement sample was observed for an order of magnitude increase in the moisture content of the sample. A similar moisture effect was observed on the yield of 1.167 MeV prompt γ-rays from chlorine contaminants in Portland cement samples. For an order of magnitude increase in the moisture content of the sample, a 7 to 12% increase in the yield of the 1.167 MeV chlorine γ-ray was observed for Portland cement samples containing 1 to 5 wt.% chlorine contaminants. This study has shown that the effects of sample moisture content on prompt γ-ray yield from the constituents of a Portland cement sample are insignificant in a thermal-neutron capture-based PGNAA setup. (author)

  20. Sample size determinations for group-based randomized clinical trials with different levels of data hierarchy between experimental and control arms.

    Science.gov (United States)

    Heo, Moonseong; Litwin, Alain H; Blackstock, Oni; Kim, Namhee; Arnsten, Julia H

    2017-02-01

    We derived sample size formulae for detecting main effects in group-based randomized clinical trials with different levels of data hierarchy between experimental and control arms. Such designs are necessary when experimental interventions need to be administered to groups of subjects whereas control conditions need to be administered to individual subjects. This type of trial, often referred to as a partially nested or partially clustered design, has been implemented for management of chronic diseases such as diabetes and is beginning to emerge more commonly in wider clinical settings. Depending on the research setting, the level of hierarchy of data structure for the experimental arm can be three or two, whereas that for the control arm is two or one. Such different levels of data hierarchy assume correlation structures of outcomes that are different between arms, regardless of whether research settings require two or three level data structure for the experimental arm. Therefore, the different correlations should be taken into account for statistical modeling and for sample size determinations. To this end, we considered mixed-effects linear models with different correlation structures between experimental and control arms to theoretically derive and empirically validate the sample size formulae with simulation studies.
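
    The core idea, that only the clustered experimental arm's variance is inflated by a design effect while the individually randomized control arm's is not, can be sketched with a simplified normal-approximation formula. This is an illustrative reduction of the problem, not the authors' mixed-effects derivation:

```python
from math import ceil
from statistics import NormalDist

def n_per_arm_partially_nested(delta, sd, m, rho, alpha=0.05, power=0.80):
    """Illustrative n per arm when only the experimental arm is clustered.

    The clustered arm's variance is inflated by DEFF = 1 + (m - 1) * rho
    (m subjects per treatment group, intraclass correlation rho); the
    control arm consists of independent individuals (DEFF = 1).
    """
    z = NormalDist().inv_cdf
    deff = 1 + (m - 1) * rho
    # variance of the mean difference is proportional to (deff + 1) / n
    n = (z(1 - alpha / 2) + z(power)) ** 2 * sd**2 * (deff + 1) / delta**2
    return ceil(n)

# e.g. groups of 10 with rho = 0.05 versus no clustering at all
n_clustered = n_per_arm_partially_nested(delta=5, sd=10, m=10, rho=0.05)
n_flat = n_per_arm_partially_nested(delta=5, sd=10, m=10, rho=0.0)
```

When rho = 0 the formula collapses to the ordinary two-sample calculation, so the extra subjects required (n_clustered − n_flat) quantify the cost of clustering one arm.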

  1. Impact of hospital atmosphere on perceived health care outcome.

    Science.gov (United States)

    Narang, Ritu; Polsa, Pia; Soneye, Alabi; Fuxiang, Wei

    2015-01-01

    Healthcare service quality studies primarily examine the relationships between patients' perceived quality and satisfaction with healthcare services, clinical effectiveness, service use, recommendations and value for money. These studies suggest that patient-independent quality dimensions (structure, process and outcome) are antecedents to quality. The purpose of this paper is to propose an alternative by looking at the relationship between hospital atmosphere and healthcare quality with perceived outcome. Data were collected from Finland, India, Nigeria and the People's Republic of China. Regression analysis used perceived outcome as the dependent variable and atmosphere and healthcare service quality as independent variables. Results showed that atmosphere and healthcare service quality have a statistically significant relationship with patient-perceived outcomes. The sample size was small and the sampling units were selected by convenience; thus, caution must be exercised in generalizing the findings. The study determined that service quality and atmosphere are significant for both developing and developed nations. This result could have significant implications for policy makers and service providers developing healthcare quality and hospital atmosphere. Previous studies concentrate on healthcare outcomes primarily regarding population health status, mortality, morbidity, customer satisfaction, loyalty, quality of life, customer behavior and consumption. The present study, in contrast, examines how patients perceive their health after treatment. Furthermore, the authors extend the healthcare service literature by considering atmosphere together with perceived outcome.

  2. Dependence of critical current on sample length analyzed by the variation of local critical current bent of BSCCO superconducting composite tape

    International Nuclear Information System (INIS)

    Matsubayashi, H.; Mukai, Y.; Shin, J.K.; Ochiai, S.; Okuda, H.; Osamura, K.; Otto, A.; Malozemoff, A.

    2008-01-01

    Using the high critical current type BSCCO composite tape fabricated at American Superconductor Corporation, the relation of overall critical current to the distribution of local critical current, and the dependence of overall critical current on sample length, were studied experimentally and analytically for bent samples. The measured overall critical current was described well from the distribution of local critical current and n-values of the constituent short elements, by regarding the overall sample as composed of local series circuits and applying the voltage summation model. The dependence of overall critical current on sample length could also be reproduced satisfactorily in computer simulations by the proposed method.
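
    The voltage summation model treats the tape as a series circuit of short elements, each obeying a power-law relation E_i = Ec(I/Ic_i)^n_i; the overall critical current is the current at which the average electric field along the sample reaches the criterion Ec. A minimal sketch with a hypothetical local Ic distribution and n-values (not the measured BSCCO data):

```python
import random

EC = 1e-4  # electric-field criterion, V/m (1 uV/cm)

def overall_ic(local_ic, local_n, elem_len=0.01):
    """Overall critical current of a series circuit of short elements.

    Each element obeys E_i = EC * (I / Ic_i)^n_i; the overall Ic is the
    current at which the average field over the whole sample reaches EC
    (voltage-summation model). Solved by bisection.
    """
    total_len = elem_len * len(local_ic)

    def avg_field(current):
        v = sum(elem_len * EC * (current / ic) ** n
                for ic, n in zip(local_ic, local_n))
        return v / total_len

    lo, hi = 0.0, max(local_ic) * 2
    for _ in range(60):  # bisect until the field criterion is met
        mid = (lo + hi) / 2
        if avg_field(mid) < EC:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical local Ic distribution (A) along a bent sample:
random.seed(1)
ics = [random.gauss(100, 10) for _ in range(20)]
ns = [20] * 20
ic_total = overall_ic(ics, ns)
```

Because the weakest elements dominate the summed voltage, the overall Ic falls between the minimum and the mean of the local values, which is the qualitative behaviour the model is meant to capture.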

  3. Complex Dynamic Systems View on Conceptual Change: How a Picture of Students' Intuitive Conceptions Accrue from Dynamically Robust Task Dependent Learning Outcomes

    Science.gov (United States)

    Koponen, Ismo T.; Kokkonen, Tommi; Nousiainen, Maiji

    2017-01-01

    We discuss here conceptual change and the formation of robust learning outcomes from the viewpoint of complex dynamic systems (CDS). The CDS view considers students' conceptions as context dependent and multifaceted structures which depend on the context of their application. In the CDS view the conceptual patterns (i.e. intuitive conceptions…

  4. Accounting for interactions and complex inter-subject dependency in estimating treatment effect in cluster-randomized trials with missing outcomes.

    Science.gov (United States)

    Prague, Melanie; Wang, Rui; Stephens, Alisa; Tchetgen Tchetgen, Eric; DeGruttola, Victor

    2016-12-01

    Semi-parametric methods are often used for the estimation of intervention effects on correlated outcomes in cluster-randomized trials (CRTs). When outcomes are missing at random (MAR), Inverse Probability Weighted (IPW) methods incorporating baseline covariates can be used to deal with informative missingness. Also, augmented generalized estimating equations (AUG) correct for imbalance in baseline covariates but need to be extended for MAR outcomes. However, in the presence of interactions between treatment and baseline covariates, neither method alone produces consistent estimates for the marginal treatment effect if the model for the interaction is not correctly specified. We propose an AUG-IPW estimator that weights by the inverse of the probability of being a complete case and allows different outcome models in each intervention arm. This estimator is doubly robust (DR); it gives correct estimates whether the missing data process or the outcome model is correctly specified. We consider the problem of covariate interference, which arises when the outcome of an individual may depend on covariates of other individuals. When interfering covariates are not modeled, the DR property prevents bias as long as covariate interference is not present simultaneously for the outcome and the missingness. An R package is developed implementing the proposed method. An extensive simulation study and an application to a CRT of an HIV risk-reduction intervention in South Africa illustrate the method. © 2016, The International Biometric Society.
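
    The weighting-plus-augmentation idea can be illustrated for a single arm's mean outcome with a toy AIPW estimator. This sketch ignores clustering and the trial context entirely; the data-generating process and working models are invented for illustration, not taken from the paper:

```python
import random

def dr_mean(y, x, observed, prob_obs, outcome_model):
    """Doubly robust (AIPW) estimate of E[Y] with outcomes missing at random.

    y[i] is used only when observed[i] is True; prob_obs(x) models the
    probability of being a complete case, outcome_model(x) models E[Y | X].
    """
    total = 0.0
    for yi, xi, ri in zip(y, x, observed):
        pi = prob_obs(xi)
        m = outcome_model(xi)
        r = 1.0 if ri else 0.0
        yv = yi if ri else 0.0
        # IPW term plus augmentation term
        total += r * yv / pi - (r - pi) / pi * m
    return total / len(y)

# Toy check: Y = 2 + X + noise, with missingness depending on X (MAR).
random.seed(0)
x = [random.gauss(0, 1) for _ in range(20000)]
y = [2 + xi + random.gauss(0, 1) for xi in x]
p = lambda xi: 0.8 if xi > 0 else 0.4   # true complete-case probability
obs = [random.random() < p(xi) for xi in x]

est = dr_mean(y, x, obs, prob_obs=p, outcome_model=lambda xi: 2 + xi)
# est should be close to the true mean E[Y] = 2, whereas the naive
# complete-case mean is biased upward because large-X subjects are
# observed more often.
```

Replacing the outcome model with a deliberately wrong one (e.g. a constant) still recovers the truth as long as prob_obs is correct, which is the double-robustness property the paper exploits.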

  5. Dealing with trade-offs in destructive sampling designs for occupancy surveys.

    Directory of Open Access Journals (Sweden)

    Stefano Canessa

    Occupancy surveys should be designed to minimise false absences. This is commonly achieved by increasing replication or increasing the efficiency of surveys. In the case of destructive sampling designs, in which searches of individual microhabitats represent the repeat surveys, minimising false absences leads to an inherent trade-off. Surveyors can sample more low quality microhabitats, bearing the resultant financial costs and producing wider-spread impacts, or they can target high quality microhabitats where the focal species is more likely to be found and risk more severe impacts on local habitat quality. We show how this trade-off can be solved with a decision-theoretic approach, using the Millewa Skink Hemiergis millewae from southern Australia as a case study. Hemiergis millewae is an endangered reptile that is best detected using destructive sampling of grass hummocks. Within sites that were known to be occupied by H. millewae, logistic regression modelling revealed that lizards were more frequently detected in large hummocks. If this model is an accurate representation of the detection process, searching large hummocks is more efficient and requires less replication, but this strategy also entails destruction of the best microhabitats for the species. We developed an optimisation tool to calculate the minimum combination of the number and size of hummocks to search to achieve a given cumulative probability of detecting the species at a site, incorporating weights to reflect the sensitivity of the results to a surveyor's priorities. The optimisation showed that placing high weight on minimising volume necessitates impractical replication, whereas placing high weight on minimising replication requires searching very large hummocks which are less common and may be vital for H. millewae. While destructive sampling methods are sometimes necessary, surveyors must be conscious of the ecological impacts of these methods. This study provides a
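
    The trade-off rests on the cumulative detection probability 1 − (1 − p)^k for k independently searched hummocks: the larger the per-hummock detection probability p, the fewer searches are needed. A minimal sketch of the replication calculation, assuming a hypothetical logistic detection model (the coefficients are illustrative placeholders, not the fitted values for H. millewae):

```python
from math import exp, ceil, log

def p_detect(size, b0=-2.0, b1=0.5):
    """Per-hummock detection probability from a logistic model.

    Coefficients b0, b1 are hypothetical, not the study's fitted values.
    """
    return 1 / (1 + exp(-(b0 + b1 * size)))

def hummocks_needed(size, target=0.95):
    """Minimum number of hummocks of a given size to search so that the
    cumulative detection probability reaches the target:
    1 - (1 - p)^k >= target  =>  k >= log(1 - target) / log(1 - p).
    """
    p = p_detect(size)
    return ceil(log(1 - target) / log(1 - p))

# Larger hummocks need fewer searches, but each search destroys better habitat:
for size in (2, 4, 8):
    print(size, hummocks_needed(size))
```

Weighting the two costs (replication versus habitat volume destroyed) on top of this calculation gives the kind of optimisation the study describes.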

  6. Determination of temperature dependence of full matrix material constants of PZT-8 piezoceramics using only one sample.

    Science.gov (United States)

    Zhang, Yang; Tang, Liguo; Tian, Hua; Wang, Jiyang; Cao, Wenwu; Zhang, Zhongwu

    2017-08-15

    Resonant ultrasound spectroscopy (RUS) was used to determine the temperature dependence of full matrix material constants of PZT-8 piezoceramics from room temperature to 100 °C. Property variations from sample to sample are eliminated by using only one sample, so that data self-consistency can be guaranteed. The RUS measurement system error was estimated to be lower than 2.35%. The obtained full matrix material constants at different temperatures all have excellent self-consistency, which can help accurately predict device performance at high temperatures using finite element simulations.

  7. Utility of the Mayo-Portland adaptability inventory-4 for self-reported outcomes in a military sample with traumatic brain injury.

    Science.gov (United States)

    Kean, Jacob; Malec, James F; Cooper, Douglas B; Bowles, Amy O

    2013-12-01

    To investigate the psychometric properties of the Mayo-Portland Adaptability Inventory-4 (MPAI-4) obtained by self-report in a large sample of active duty military personnel with traumatic brain injury (TBI). The design was a consecutive cohort who completed the MPAI-4 as part of a larger battery of clinical outcome measures at the time of intake to an outpatient brain injury clinic at a medical center. Participants were a consecutively referred sample of active duty military personnel (N=404) who had suffered predominantly mild (n=355), but also moderate (n=37) and severe (n=12), TBI. Interventions were not applicable; the main outcome measure was the MPAI-4. Initial factor analysis suggested 2 salient dimensions. In subsequent analysis, the ratio of the first and second eigenvalues (6.84:1) and parallel analysis indicated sufficient unidimensionality in 26 retained items. Iterative Rasch analysis resulted in the rescaling of the measure and the removal of 5 additional items for poor fit. The items of the final 21-item Mayo-Portland Adaptability Inventory-Military were locally independent, demonstrated monotonically increasing responses, adequately fit the item response model, and permitted the identification of nearly 5 statistically distinct levels of disability in the study population. Slight mistargeting of the population resulted in the global outcome, as measured by the Mayo-Portland Adaptability Inventory-Military, tending to be less reflective of very mild levels of disability. These data, collected in a relatively large sample of active duty service members with TBI, provide insight into the ability of patients to self-report functional impairment and the distinct effects of military deployment on outcome, providing important guidance for the meaningful measurement of outcome in this population. Copyright © 2013 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  8. Genetic Contribution to Alcohol Dependence: Investigation of a Heterogeneous German Sample of Individuals with Alcohol Dependence, Chronic Alcoholic Pancreatitis, and Alcohol-Related Cirrhosis

    Science.gov (United States)

    Treutlein, Jens; Streit, Fabian; Juraeva, Dilafruz; Degenhardt, Franziska; Rietschel, Liz; Forstner, Andreas J.; Ridinger, Monika; Dukal, Helene; Foo, Jerome C.; Soyka, Michael; Maier, Wolfgang; Gaebel, Wolfgang; Dahmen, Norbert; Scherbaum, Norbert; Müller-Myhsok, Bertram; Lucae, Susanne; Ising, Marcus; Stickel, Felix; Berg, Thomas; Roggenbuck, Ulla; Jöckel, Karl-Heinz; Scholz, Henrike; Zimmermann, Ulrich S.; Buch, Stephan; Sommer, Wolfgang H.; Spanagel, Rainer; Brors, Benedikt; Cichon, Sven; Mann, Karl; Kiefer, Falk; Hampe, Jochen; Rosendahl, Jonas; Nöthen, Markus M.; Rietschel, Marcella

    2017-01-01

    The present study investigated the genetic contribution to alcohol dependence (AD) using genome-wide association data from three German samples. These comprised patients with: (i) AD; (ii) chronic alcoholic pancreatitis (ACP); and (iii) alcohol-related liver cirrhosis (ALC). Single marker, gene-based, and pathway analyses were conducted. A significant association was detected for the ADH1B locus in a gene-based approach (uncorrected p = 1.2 × 10⁻⁶; corrected p = 0.020). This was driven by the AD subsample. No association with ADH1B was found in the combined ACP + ALC sample. On first inspection, this seems surprising, since ADH1B is a robustly replicated risk gene for AD and may therefore be expected to be associated also with subgroups of AD patients. The negative finding in the ACP + ALC sample, however, may reflect genetic stratification as well as random fluctuation of allele frequencies in the cases and controls, demonstrating the importance of large samples in which the phenotype is well assessed. PMID:28714907

  9. A novel sampling design to explore gene-longevity associations

    DEFF Research Database (Denmark)

    De Rango, Francesco; Dato, Serena; Bellizzi, Dina

    2008-01-01

    To investigate the genetic contribution to familial similarity in longevity, we set up a novel experimental design where cousin-pairs born from siblings who were concordant or discordant for the longevity trait were analyzed. To check this design, two chromosomal regions already known to encompass … from concordant and discordant siblings. In addition, we analyzed haplotype transmission from centenarians to offspring, and a statistically significant Transmission Ratio Distortion (TRD) was observed for both chromosomal regions in the discordant families (P=0.007 for 6p21.3 and P=0.015 for 11p15.5). In concordant families, a marginally significant TRD was observed at 6p21.3 only (P=0.06). Although no significant difference emerged between the two groups of cousin-pairs, our study gave new insights on the hindrances to recruiting a suitable sample to obtain significant IBD data on longevity…

  10. A quasi-experimental design based on regional variations: discussion of a method for evaluating outcomes of medical practice

    DEFF Research Database (Denmark)

    Loft, A; Andersen, T F; Madsen, Mette

    1989-01-01

    A large proportion of common medical practices are subject to substantial regional variation, resulting in numerous natural experiments. Opportunities are thereby provided for outcome evaluation through quasi-experimental design. If patients treated in different regions were comparable, a natural experiment involving alternative treatments could be regarded as 'pseudo randomised', but empirical investigations are needed to verify this prerequisite. This paper discusses the role of quasi-experimental designs in assessment of medical care, with evaluation of outcomes after hysterectomy in Denmark … groups are elicited from administrative data. We conclude that it is possible to establish a quasi-experimental design based on regional variations and that the comparability of the groups included may be assessed through registry data. The importance of technology diffusion for the prospects…

  11. Time-Dependent Deformation Modelling for a Chopped-Glass Fiber Composite for Automotive Durability Design Criteria

    Energy Technology Data Exchange (ETDEWEB)

    Ren, W

    2001-08-24

    Time-dependent deformation behavior of a polymeric composite with chopped-glass-fiber reinforcement was investigated for automotive applications. The material under stress was exposed to representative automobile service environments. Results show that the environment has substantial effects on the time-dependent deformation behavior of the material. The data were analyzed and experimentally based models were developed for the time-dependent deformation behavior as a basis for automotive structural durability design criteria.

  12. Dependability evaluation of computing systems - physical faults, design faults, malicious faults

    International Nuclear Information System (INIS)

    Kaaniche, Mohamed

    1999-01-01

    The research summarized in this report focuses on the dependability of computer systems. It addresses several complementary, theoretical as well as experimental, issues that are grouped into four topics. The first topic concerns the definition of efficient methods that aim to assist the users in the construction and validation of complex dependability analysis and evaluation models. The second topic deals with the modeling of reliability and availability growth that mainly result from the progressive removal of design faults. A method is also defined to support the application of software reliability evaluation studies in an industrial context. The third topic deals with the development and experimentation of a new approach for the quantitative evaluation of operational security. This approach aims to assist the system administrators in the monitoring of operational security, when modifications, that are likely to introduce new vulnerabilities, occur in the system configuration, the applications, the user behavior, etc. Finally, the fourth topic addresses: a) the definition of a development model focused at the production of dependable systems, and b) the development of assessment criteria to obtain justified confidence that a system will achieve, during its operation and up to its decommissioning, its dependability objectives. (author) [fr

  13. Assessment of long-term gas sampling design at two commercial manure-belt layer barns.

    Science.gov (United States)

    Chai, Li-Long; Ni, Ji-Qin; Chen, Yan; Diehl, Claude A; Heber, Albert J; Lim, Teng T

    2010-06-01

    Understanding temporal and spatial variations of aerial pollutant concentrations is important for designing air quality monitoring systems. In long-term and continuous air quality monitoring in large livestock and poultry barns, these systems usually use location-shared analyzers and sensors and can only sample air at limited number of locations. To assess the validity of the gas sampling design at a commercial layer farm, a new methodology was developed to map pollutant gas concentrations using portable sensors under steady-state or quasi-steady-state barn conditions. Three assessment tests were conducted from December 2008 to February 2009 in two manure-belt layer barns. Each barn was 140.2 m long and 19.5 m wide and had 250,000 birds. Each test included four measurements of ammonia and carbon dioxide concentrations at 20 locations that covered all operating fans, including six of the fans used in the long-term sampling that represented three zones along the lengths of the barns, to generate data for complete-barn monitoring. To simulate the long-term monitoring, gas concentrations from the six long-term sampling locations were extracted from the 20 assessment locations. Statistical analyses were performed to test the variances (F-test) and sample means (t test) between the 6- and 20-sample data. The study clearly demonstrated ammonia and carbon dioxide concentration gradients that were characterized by increasing concentrations from the west to east ends of the barns following the under-cage manure-belt travel direction. Mean concentrations increased from 7.1 to 47.7 parts per million (ppm) for ammonia and from 2303 to 3454 ppm for carbon dioxide from the west to east of the barns. Variations of mean gas concentrations were much less apparent between the south and north sides of the barns, because they were 21.2 and 20.9 ppm for ammonia and 2979 and 2951 ppm for carbon dioxide, respectively. 
The null hypotheses that the variances and means between the 6- and 20
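The 6-versus-20 comparison described above can be sketched with a variance-ratio (F) statistic and Welch's t statistic. The sketch below uses only the Python standard library; the concentration values are illustrative stand-ins, not the study's measurements, and a full analysis would also compute p-values (e.g. with scipy.stats).

```python
# Sketch: compare a 6-location subsample against a full 20-location survey
# with an F statistic (variances) and Welch's t statistic (means).
# Data values are illustrative, not the paper's measurements.
import math
import statistics as st

full = [7.1, 9.3, 12.0, 15.2, 18.4, 21.7, 24.9, 28.1, 30.6, 33.2,
        35.8, 38.1, 40.3, 42.0, 43.6, 45.1, 46.2, 47.0, 47.5, 47.7]
sub = full[::4] + [full[-1]]           # 6 of the 20 locations

f_stat = st.variance(sub) / st.variance(full)   # F-test statistic

def welch_t(a, b):
    """Welch's t statistic for two samples with unequal variances."""
    va, vb = st.variance(a) / len(a), st.variance(b) / len(b)
    return (st.mean(a) - st.mean(b)) / math.sqrt(va + vb)

t_stat = welch_t(sub, full)
print(round(f_stat, 3), round(t_stat, 3))
```

If the subsample is representative, the F ratio stays near 1 and the t statistic near 0, which is the pattern the study's null hypotheses formalize.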

  14. Sampling design considerations for demographic studies: a case of colonial seabirds

    Science.gov (United States)

    Kendall, William L.; Converse, Sarah J.; Doherty, Paul F.; Naughton, Maura B.; Anders, Angela; Hines, James E.; Flint, Elizabeth

    2009-01-01

    For the purposes of making many informed conservation decisions, the main goal for data collection is to assess population status and allow prediction of the consequences of candidate management actions. Reducing the bias and variance of estimates of population parameters reduces uncertainty in population status and projections, thereby reducing the overall uncertainty under which a population manager must make a decision. In capture-recapture studies, imperfect detection of individuals, unobservable life-history states, local movement outside study areas, and tag loss can cause bias or precision problems with estimates of population parameters. Furthermore, excessive disturbance to individuals during capture-recapture sampling may be of concern because disturbance may have demographic consequences. We address these problems using as an example a monitoring program for Black-footed Albatross (Phoebastria nigripes) and Laysan Albatross (Phoebastria immutabilis) nesting populations in the northwestern Hawaiian Islands. To mitigate these estimation problems, we describe a synergistic combination of sampling design and modeling approaches. Solutions include multiple capture periods per season and multistate, robust design statistical models, dead recoveries and incidental observations, telemetry and data loggers, buffer areas around study plots to neutralize the effect of local movements outside study plots, and double banding and statistical models that account for band loss. We also present a variation on the robust capture-recapture design and a corresponding statistical model that minimizes disturbance to individuals. For the albatross case study, this less invasive robust design was more time efficient and, when used in combination with a traditional robust design, reduced the standard error of detection probability by 14% with only two hours of additional effort in the field. 
These field techniques and associated modeling approaches are applicable to studies of

  15. An Examination of Dependence Power, Father Involvement, and Judgments about Violence in an At-Risk Community Sample of Mothers

    Science.gov (United States)

    Samp, Jennifer A.; Abbott, Leslie

    2011-01-01

    Individuals sometimes remain in dysfunctional, and even violent, relationships due to a perceived dependence on a partner. We examined the influence of dependence power judgments (defined by a combined assessment of mother commitment, perceived father commitment, and perceived father alternatives) in a community sample of mothers potentially bound…

  16. An Exploration of Supervisory and Therapeutic Relationships and Client Outcomes

    Science.gov (United States)

    Bell, Hope; Hagedorn, W. Bryce; Robinson, E. H. Mike

    2016-01-01

    The authors explored the connection between the facilitative conditions present within the supervisory relationship, the therapeutic relationship, and client outcomes. A correlational research design was used with a sample of 55 counselors-in-training and 88 clients. Results indicated a significant positive relationship between the therapeutic…

  17. Involving Latina/o parents in patient-centered outcomes research: Contributions to research study design, implementation and outcomes.

    Science.gov (United States)

    Pérez Jolles, Mónica; Martinez, Maria; Garcia, San Juanita; Stein, Gabriela L; Thomas, Kathleen C

    2017-10-01

    Comparative effectiveness research (CER) is supported by policymakers as a way to provide service providers and patients with evidence-based information to make better health-care decisions and ultimately improve services for patients. However, Latina/o patients are rarely involved as study advisors, and there is a lack of documentation on how their voices contribute to the research process when they are included as collaborators. The purpose of this article was to contribute to the literature by presenting concrete contributions of Latina/o parent involvement to study design, implementation and outcomes in the context of a CER study called Padres Efectivos (Parent Activation). Researchers facilitated a collaborative relationship with parents by establishing a mentor parent group. The contributions of parent involvement in the following stages of the research process are described: (i) proposal development, (ii) implementation of protocols, (iii) analysis plan, and (iv) dissemination of results. Mentor parents' contributions helped tailor the content of the intervention to their needs during proposal development, increased recruitment, validated the main outcome measure and added two important outcome measures, emphasized the importance of controlling for novice treatment status and developed innovative dissemination strategies. Mentor parents' guidance to the researchers has contributed to reaching recruitment goals, strengthened the study protocol, expanded findings, supported broad ownership of study implications and enriched the overall study data collection efforts. These findings can inform future research efforts seeking an active Latino parent collaboration and the timely incorporation of parent voices in each phase of the research process. © 2017 The Authors Health Expectations Published by John Wiley & Sons Ltd.

  18. Classifier-guided sampling for discrete variable, discontinuous design space exploration: Convergence and computational performance

    Energy Technology Data Exchange (ETDEWEB)

    Backlund, Peter B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shahan, David W. [HRL Labs., LLC, Malibu, CA (United States); Seepersad, Carolyn Conner [Univ. of Texas, Austin, TX (United States)

    2014-04-22

    A classifier-guided sampling (CGS) method is introduced for solving engineering design optimization problems with discrete and/or continuous variables and continuous and/or discontinuous responses. The method merges concepts from metamodel-guided sampling and population-based optimization algorithms. The CGS method uses a Bayesian network classifier for predicting the performance of new designs based on a set of known observations or training points. Unlike most metamodeling techniques, however, the classifier assigns a categorical class label to a new design, rather than predicting the resulting response in continuous space, and thereby accommodates nondifferentiable and discontinuous functions of discrete or categorical variables. The CGS method uses these classifiers to guide a population-based sampling process towards combinations of discrete and/or continuous variable values with a high probability of yielding preferred performance. Accordingly, the CGS method is appropriate for discrete/discontinuous design problems that are ill-suited for conventional metamodeling techniques and too computationally expensive to be solved by population-based algorithms alone. In addition, the rates of convergence and computational properties of the CGS method are investigated when applied to a set of discrete variable optimization problems. Results show that the CGS method significantly improves the rate of convergence towards known global optima, on average, when compared to genetic algorithms.
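The CGS loop described above can be illustrated on a toy discrete problem. The nearest-centroid rule below is a deliberately simple stand-in for the paper's Bayesian network classifier, and the objective, variable levels, and population sizes are invented for the sketch; only the overall structure (label evaluated designs, fit a classifier, bias new samples toward predicted-good regions) follows the method.

```python
# Sketch of a classifier-guided sampling loop on a discrete toy problem.
# The nearest-centroid "classifier" stands in for the paper's Bayesian
# network classifier; objective and sizes are illustrative.
import random

random.seed(1)
DIM = 4
LEVELS = list(range(10))                  # discrete design-variable values

def objective(x):
    """Toy performance measure; the true optimum is the all-9s design."""
    return sum(v * v for v in x)

def centroid(points):
    return [sum(p[i] for p in points) / len(points) for i in range(DIM)]

def dist2(a, b):
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

def cgs(iterations=30, batch=40, keep=10):
    pop = [[random.choice(LEVELS) for _ in range(DIM)] for _ in range(batch)]
    for _ in range(iterations):
        pop.sort(key=objective, reverse=True)
        good, bad = centroid(pop[:keep]), centroid(pop[-keep:])
        cands = [[random.choice(LEVELS) for _ in range(DIM)] for _ in range(200)]
        # classifier-guided step: admit only candidates predicted "good",
        # i.e. closer to the good-class centroid than to the bad one
        admitted = [x for x in cands if dist2(x, good) < dist2(x, bad)][:batch]
        pop = pop[:keep] + (admitted or cands[:batch])   # elitism + guided samples
    return max(pop, key=objective)

best = cgs()
print(best, objective(best))
```

The guided filter discards most poor candidates before any (in a real problem, expensive) objective evaluation, which is the cost argument the abstract makes against running a population-based algorithm alone.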

  19. Orientation-dependent backbone-only residue pair scoring functions for fixed backbone protein design

    Directory of Open Access Journals (Sweden)

    Bordner Andrew J

    2010-04-01

    Background: Empirical scoring functions have proven useful in protein structure modeling. Most such scoring functions depend on protein side chain conformations. However, backbone-only scoring functions do not require computationally intensive structure optimization and so are well suited to protein design, which requires fast score evaluation. Furthermore, scoring functions that account for the distinctive relative position and orientation preferences of residue pairs are expected to be more accurate than those that depend only on the separation distance. Results: Residue pair scoring functions for fixed backbone protein design were derived using only backbone geometry. Unlike previous studies that used spherical harmonics to fit 2D angular distributions, Gaussian Mixture Models were used to fit the full 3D (position only) and 6D (position and orientation) distributions of residue pairs. The performance of the 1D (residue separation only), 3D, and 6D scoring functions was compared by their ability to identify correct threading solutions for a non-redundant benchmark set of protein backbone structures. The threading accuracy was found to steadily increase with increasing dimension, with the 6D scoring function achieving the highest accuracy. Furthermore, the 3D and 6D scoring functions were shown to outperform side chain-dependent empirical potentials from three other studies. Next, two computational methods that take advantage of the speed and pairwise form of these new backbone-only scoring functions were investigated. The first is a procedure that exploits available sequence data by averaging scores over threading solutions for homologs. This was evaluated by applying it to the challenging problem of identifying interacting transmembrane alpha-helices and found to further improve prediction accuracy. The second is a protein design method for determining the optimal sequence for a backbone structure by applying Belief Propagation
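The pair-scoring idea can be sketched in one dimension: score a residue pair by the log-odds of its observed geometry under a pair-specific mixture model versus a background model. The mixture parameters below are hand-set stand-ins for ones fit to real structures, and a real scoring function would model the full 3D or 6D pair geometry as the paper describes.

```python
# Sketch: a distance-dependent log-odds pair score, with hand-set Gaussian
# mixture parameters standing in for ones fit to structures (illustrative).
import math

def gmm_pdf(x, comps):
    """Density of a 1D Gaussian mixture; comps is a list of (weight, mean, std)."""
    return sum(w * math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))
               for w, m, s in comps)

pair_model = [(0.7, 5.5, 1.0), (0.3, 9.0, 2.0)]   # e.g. a contact-prone pair
background = [(1.0, 9.0, 3.0)]                    # reference distribution

def pair_score(distance):
    """Log-odds of the distance under the pair model vs. the background."""
    return math.log(gmm_pdf(distance, pair_model) / gmm_pdf(distance, background))

print(round(pair_score(5.5), 2), round(pair_score(12.0), 2))
```

A favorable geometry scores positive and an unfavorable one negative; summing such terms over all pairs gives the fast, pairwise score that threading and design both exploit.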

  20. Designing a two-rank acceptance sampling plan for quality inspection of geospatial data products

    Science.gov (United States)

    Tong, Xiaohua; Wang, Zhenhua; Xie, Huan; Liang, Dan; Jiang, Zuoqin; Li, Jinchao; Li, Jun

    2011-10-01

    To address the disadvantages of classical sampling plans designed for traditional industrial products, we propose a two-rank acceptance sampling plan (TRASP) for the inspection of geospatial data outputs based on the acceptance quality level (AQL). The first rank sampling plan is to inspect the lot consisting of map sheets, and the second is to inspect the lot consisting of features in an individual map sheet. The TRASP design is formulated as an optimization problem with respect to sample size and acceptance number, which covers two lot size cases. The first case is for a small lot size with nonconformities being modeled by a hypergeometric distribution function, and the second is for a larger lot size with nonconformities being modeled by a Poisson distribution function. The proposed TRASP is illustrated through two empirical case studies. Our analysis demonstrates that: (1) the proposed TRASP provides a general approach for quality inspection of geospatial data outputs consisting of non-uniform items and (2) the proposed acceptance sampling plan based on TRASP performs better than other classical sampling plans. It overcomes the drawbacks of percent sampling, i.e., "strictness for large lot size, toleration for small lot size," and those of a national standard used specifically for industrial outputs, i.e., "lots with different sizes corresponding to the same sampling plan."
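The two lot-size cases in the abstract correspond to two standard acceptance-probability formulas: hypergeometric for small lots and Poisson for large ones. The sketch below computes both; the plan parameters (N, D, n, c, p) are illustrative, not values from the paper.

```python
# Sketch: probability of accepting a lot when at most c nonconforming items
# are found in a sample of n. Plan parameters are illustrative.
import math

def accept_hypergeom(N, D, n, c):
    """Small-lot case: exactly D nonconformities in a lot of N items."""
    return sum(math.comb(D, k) * math.comb(N - D, n - k)
               for k in range(c + 1)) / math.comb(N, n)

def accept_poisson(n, p, c):
    """Large-lot case: nonconformity count approximated as Poisson(n*p)."""
    lam = n * p
    return sum(math.exp(-lam) * lam**k / math.factorial(k)
               for k in range(c + 1))

p_small = accept_hypergeom(N=50, D=2, n=10, c=1)
p_large = accept_poisson(n=80, p=0.02, c=3)
print(round(p_small, 4), round(p_large, 4))
```

The TRASP optimization searches over (n, c) so that these acceptance probabilities satisfy the AQL constraints at both ranks.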

  1. An investigation of the effects of relevant samples and a comparison of verification versus discovery based lab design

    Science.gov (United States)

    Rieben, James C., Jr.

    This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect of relevant (or "real world") samples on student learning and a verification-based lab design versus a discovery-based lab design. With the cellulose regeneration experiment, the effect of a discovery-based lab design vs. a verification-based lab design was the sole focus. Evaluation surveys consisting of six questions were used at three different times to assess student knowledge of experimental concepts. In the general chemistry laboratory portion of this study, four experimental variants were employed to investigate the effect of relevance and lab design on student learning. These variants consisted of a traditional (or verification) lab design, a traditional lab design using "real world" samples, a new lab design employing real world samples/situations using unknown samples, and the new lab design using real world samples/situations that were known to the student. Data used in this analysis were collected during the Fall 08, Winter 09, and Fall 09 terms. For the second part of this study a cellulose regeneration experiment was employed to investigate the effects of lab design. A demonstration creating regenerated cellulose "rayon" was modified and converted to an efficient and low-waste experiment. In the first variant students tested their products and verified a list of physical properties. In the second variant, students filled in a blank physical property chart with their own experimental results for the physical properties. 
Results from the conductivity experiment show significant student learning of the effects of concentration on conductivity and how to use conductivity to differentiate solution types with the

  2. Optimal sampling designs for large-scale fishery sample surveys in Greece

    Directory of Open Access Journals (Sweden)

    G. BAZIGOS

    2007-12-01

    The paper deals with the optimization of the following three large-scale sample surveys: the biological sample survey of commercial landings (BSCL), the experimental fishing sample survey (EFSS), and the commercial landings and effort sample survey (CLES).

  3. Extended exergy concept to facilitate designing and optimization of frequency-dependent direct energy conversion systems

    International Nuclear Information System (INIS)

    Wijewardane, S.; Goswami, Yogi

    2014-01-01

    Highlights:
    • Proved the exergy method is not adequate to optimize frequency-dependent energy conversion.
    • The exergy concept is modified to facilitate the thermoeconomic optimization of photocells.
    • The exergy of arbitrary radiation is used for a practical purpose.
    • The utility of the concept is illustrated using pragmatic examples.
    Abstract: Providing the radiation within the acceptable (responsive) frequency range(s) is a common method to increase the efficiency of frequency-dependent energy conversion systems, such as photovoltaics and nano-scale rectennas. Appropriately designed auxiliary items such as spectrally selective thermal emitters, optical filters, and lenses are used for this purpose. However, any energy conversion method that utilizes auxiliary components to increase the efficiency of a system has to justify the potential cost incurred by those auxiliary components through the economic gain emerging from the increased system efficiency. Therefore, much effort should be devoted to designing innovative systems that effectively integrate the auxiliary items, and to optimizing the system with economic considerations. Exergy is the widely used method to design and optimize conventional energy conversion systems. Although the exergy concept is used to analyze photovoltaic systems, it has not been used effectively to design and optimize such systems. In this manuscript, we present a modified exergy method in order to effectively design and economically optimize frequency-dependent energy conversion systems. We also illustrate the utility of this concept using examples of thermophotovoltaic, photovoltaic/thermal, and concentrated solar photovoltaic

  4. Understanding reasons for and outcomes of patients lost to follow-up in antiretroviral therapy programs in Africa through a sampling-based approach.

    Science.gov (United States)

    Geng, Elvin H; Bangsberg, David R; Musinguzi, Nicolas; Emenyonu, Nneka; Bwana, Mwebesa Bosco; Yiannoutsos, Constantin T; Glidden, David V; Deeks, Steven G; Martin, Jeffrey N

    2010-03-01

    Losses to follow-up after initiation of antiretroviral therapy (ART) are common in Africa and are a considerable obstacle to understanding the effectiveness of nascent treatment programs. We sought to characterize, through a sampling-based approach, reasons for and outcomes of patients who become lost to follow-up. Cohort study. We searched for and interviewed a representative sample of lost patients or close informants in the community to determine reasons for and outcomes among lost patients. Three thousand six hundred twenty-eight HIV-infected adults initiated ART between January 1, 2004 and September 30, 2007 in Mbarara, Uganda. Eight hundred twenty-nine became lost to follow-up (cumulative incidence at 1, 2, and 3 years of 16%, 30%, and 39%). We sought a representative sample of 128 lost patients in the community and ascertained vital status in 111 (87%). Top reasons for loss included lack of transportation or money and work/child care responsibilities. Among the 111 lost patients who had their vital status ascertained through tracking, 32 deaths occurred (cumulative 1-year incidence 36%); mortality was highest shortly after the last clinic visit. Lower pre-ART CD4 T-cell count, older age, low blood pressure, and a central nervous system syndrome at the last clinic visit predicted deaths. Of patients directly interviewed, 83% were in care at another clinic and 71% were still using ART. Sociostructural factors are the primary reasons for loss to follow-up. Outcomes among the lost are heterogeneous: both deaths and transfers to other clinics were common. Tracking a sample of lost patients is an efficient means for programs to understand site-specific reasons for and outcomes among patients lost to follow-up.
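The sampling-based idea in the record above can be made concrete with its own reported counts (829 lost, 111 with vital status ascertained, 32 deaths among the tracked). Note the naive proportion below ignores censoring and follow-up time; the study's 36% figure is a one-year cumulative incidence from time-to-event methods, so this is only a back-of-the-envelope extrapolation.

```python
# Sketch: extrapolating outcomes in a tracked subsample of lost patients
# to all lost patients. Counts are from the abstract; the simple
# proportion ignores censoring, unlike the study's cumulative incidence.
lost_total = 829
tracked = 111
deaths_in_tracked = 32

death_rate_lost = deaths_in_tracked / tracked          # crude death proportion
estimated_deaths_all_lost = death_rate_lost * lost_total
print(round(death_rate_lost, 3), round(estimated_deaths_all_lost))
```

This is the efficiency argument of the design: tracking a representative sample of 111 patients stands in for tracing all 829.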

  5. Psychological changes in alcohol-dependent patients during a residential rehabilitation program

    Directory of Open Access Journals (Sweden)

    Giorgi I

    2015-12-01

    Ines Giorgi, Marcella Ottonello, Giovanni Vittadini, Giorgio Bertolotti (Salvatore Maugeri Foundation, Clinica del Lavoro e della Riabilitazione, IRCCS, Pavia, Genoa, and Tradate; Università di Tor Vergata, Rome, Italy). Background: Alcohol-dependent patients usually experience negative affects under the influence of alcohol, and these affective symptoms have been shown to decrease as a result of alcohol-withdrawal treatment. A recent cognitive–affective model suggests an interaction between drug motivation and affective symptoms. The aim of this multicenter study was to evaluate the psychological changes in subjects undergoing a residential rehabilitation program specifically designed for alcohol addiction, and to identify at discharge patients with greater affective symptoms and therefore more at risk of relapse. Materials and methods: The sample included 560 subjects (mean age 46.91±10.2 years) who completed 28-day rehabilitation programs for alcohol addiction, following a tailored routine characterized by short duration and high intensity of medical and psychotherapeutic treatment. The psychological clinical profiles of anxiety, depression, psychological distress, psychological well-being, and self-perception of a positive change were assessed using the Cognitive Behavioral Assessment – Outcome Evaluation questionnaire at the beginning and at the end of the program. The changes in the psychological variables of the questionnaire were identified and considered as outcome

  6. Fast Bayesian experimental design: Laplace-based importance sampling for the expected information gain

    KAUST Repository

    Beck, Joakim

    2018-02-19

    In calculating expected information gain in optimal Bayesian experimental design, the computation of the inner loop in the classical double-loop Monte Carlo requires a large number of samples and suffers from underflow if the number of samples is small. These drawbacks can be avoided by using an importance sampling approach. We present a computationally efficient method for optimal Bayesian experimental design that introduces importance sampling based on the Laplace method to the inner loop. We derive the optimal values for the method parameters in which the average computational cost is minimized for a specified error tolerance. We use three numerical examples to demonstrate the computational efficiency of our method compared with the classical double-loop Monte Carlo, and a single-loop Monte Carlo method that uses the Laplace approximation of the return value of the inner loop. The first demonstration example is a scalar problem that is linear in the uncertain parameter. The second example is a nonlinear scalar problem. The third example deals with the optimal sensor placement for an electrical impedance tomography experiment to recover the fiber orientation in laminate composites.

  7. Fast Bayesian experimental design: Laplace-based importance sampling for the expected information gain

    Science.gov (United States)

    Beck, Joakim; Dia, Ben Mansour; Espath, Luis F. R.; Long, Quan; Tempone, Raúl

    2018-06-01

    In calculating expected information gain in optimal Bayesian experimental design, the computation of the inner loop in the classical double-loop Monte Carlo requires a large number of samples and suffers from underflow if the number of samples is small. These drawbacks can be avoided by using an importance sampling approach. We present a computationally efficient method for optimal Bayesian experimental design that introduces importance sampling based on the Laplace method to the inner loop. We derive the optimal values for the method parameters in which the average computational cost is minimized according to the desired error tolerance. We use three numerical examples to demonstrate the computational efficiency of our method compared with the classical double-loop Monte Carlo, and a more recent single-loop Monte Carlo method that uses the Laplace method as an approximation of the return value of the inner loop. The first example is a scalar problem that is linear in the uncertain parameter. The second example is a nonlinear scalar problem. The third example deals with the optimal sensor placement for an electrical impedance tomography experiment to recover the fiber orientation in laminate composites.
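The classical double-loop Monte Carlo baseline that both records above improve on can be sketched for their first demonstration case, a model linear in the uncertain parameter. The prior, noise level, and sample sizes below are illustrative; the inner-loop evidence estimate is exactly where too few samples cause the underflow the abstract mentions, which motivates the Laplace-based importance sampling.

```python
# Sketch: classical double-loop Monte Carlo estimate of expected
# information gain (EIG) for a linear-Gaussian toy model. Illustrative.
import math
import random

random.seed(0)
SIGMA = 0.5                                   # observation noise std

def likelihood(y, theta):
    z = (y - theta) / SIGMA
    return math.exp(-0.5 * z * z) / (SIGMA * math.sqrt(2 * math.pi))

def eig_double_loop(n_outer=200, n_inner=200):
    """MC estimate of E[log p(y|theta) - log p(y)] over prior and data."""
    total = 0.0
    for _ in range(n_outer):
        theta = random.gauss(0.0, 1.0)        # draw from the N(0,1) prior
        y = theta + random.gauss(0.0, SIGMA)  # simulate one observation
        # inner loop: MC estimate of the evidence p(y); too few samples
        # here is where underflow bites in the classical estimator
        evidence = sum(likelihood(y, random.gauss(0.0, 1.0))
                       for _ in range(n_inner)) / n_inner
        total += math.log(likelihood(y, theta)) - math.log(evidence)
    return total / n_outer

eig = eig_double_loop()
print(round(eig, 3))   # analytic value here is 0.5*ln(1 + 1/SIGMA**2) ≈ 0.805
```

The nested loops cost n_outer × n_inner likelihood evaluations; replacing the inner loop with a Laplace-based importance sampler is the paper's route to cutting that cost for a given error tolerance.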

  8. Single-subject withdrawal designs in delayed matching-to-sample procedures

    OpenAIRE

    Eilifsen, Christoffer; Arntzen, Erik

    2011-01-01

    In most studies of delayed matching-to-sample (DMTS) and stimulus equivalence, the delay has remained fixed throughout a single experimental condition. We wanted to expand on the DMTS and stimulus equivalence literature by examining the effects of using titrating delays with different starting points during the establishment of conditional discriminations prerequisite for stimulus equivalence. In Experiment 1, a variation of a single-subject withdrawal design was used. Ten adults were exposed...

  9. Designing efficient nitrous oxide sampling strategies in agroecosystems using simulation models

    Science.gov (United States)

    Debasish Saha; Armen R. Kemanian; Benjamin M. Rau; Paul R. Adler; Felipe Montes

    2017-01-01

    Annual cumulative soil nitrous oxide (N2O) emissions calculated from discrete chamber-based flux measurements have unknown uncertainty. We used outputs from simulations obtained with an agroecosystem model to design sampling strategies that yield accurate cumulative N2O flux estimates with a known uncertainty level. Daily soil N2O fluxes were simulated for Ames, IA (...

  10. An Alternative View of Some FIA Sample Design and Analysis Issues

    Science.gov (United States)

    Paul C. Van Deusen

    2005-01-01

    Sample design and analysis decisions are the result of compromises and inputs from many sources. The end result would likely change if different individuals or groups were involved in the planning process. Discussed here are some alternatives to the procedures that are currently being used for the annual inventory. The purpose is to indicate that alternatives exist and...

  11. Predictive Sampling of Rare Conformational Events in Aqueous Solution: Designing a Generalized Orthogonal Space Tempering Method.

    Science.gov (United States)

    Lu, Chao; Li, Xubin; Wu, Dongsheng; Zheng, Lianqing; Yang, Wei

    2016-01-12

    In aqueous solution, solute conformational transitions are governed by intimate interplays of the fluctuations of solute-solute, solute-water, and water-water interactions. To promote molecular fluctuations to enhance sampling of essential conformational changes, a common strategy is to construct an expanded Hamiltonian through a series of Hamiltonian perturbations and thereby broaden the distribution of certain interactions of focus. Due to a lack of active sampling of configuration response to Hamiltonian transitions, it is challenging for common expanded Hamiltonian methods to robustly explore solvent mediated rare conformational events. The orthogonal space sampling (OSS) scheme, as exemplified by the orthogonal space random walk and orthogonal space tempering methods, provides a general framework for synchronous acceleration of slow configuration responses. To more effectively sample conformational transitions in aqueous solution, in this work, we devised a generalized orthogonal space tempering (gOST) algorithm. Specifically, in the Hamiltonian perturbation part, a solvent-accessible-surface-area-dependent term is introduced to implicitly perturb near-solute water-water fluctuations; more importantly, in the orthogonal space response part, the generalized force order parameter is extended to a two-dimensional order parameter set, in which essential solute-solvent and solute-solute components are treated separately. The gOST algorithm is evaluated through a molecular dynamics simulation study on the explicitly solvated deca-alanine (Ala10) peptide. On the basis of a fully automated sampling protocol, the gOST simulation enabled repetitive folding and unfolding of the solvated peptide within a single continuous trajectory and allowed for detailed constructions of Ala10 folding/unfolding free energy surfaces. The gOST result reveals that solvent cooperative fluctuations play a pivotal role in Ala10 folding/unfolding transitions. In addition, our assessment

  12. Effect of dislocation pile-up on size-dependent yield strength in finite single-crystal micro-samples

    Energy Technology Data Exchange (ETDEWEB)

    Pan, Bo; Shibutani, Yoji, E-mail: sibutani@mech.eng.osaka-u.ac.jp [Department of Mechanical Engineering, Osaka University, Suita 565-0871 (Japan); Zhang, Xu [State Key Laboratory for Strength and Vibration of Mechanical Structures, School of Aerospace, Xi' an Jiaotong University, Xi' an 710049 (China); School of Mechanics and Engineering Science, Zhengzhou University, Zhengzhou 450001 (China); Shang, Fulin [State Key Laboratory for Strength and Vibration of Mechanical Structures, School of Aerospace, Xi' an Jiaotong University, Xi' an 710049 (China)

    2015-07-07

    Recent research has explained that the steeply increasing yield strength in metals depends on decreasing sample size. In this work, we derive a statistical physical model of the yield strength of finite single-crystal micro-pillars that depends on single-ended dislocation pile-up inside the micro-pillars. We show that this size effect can be explained almost completely by considering the stochastic lengths of the dislocation source and the dislocation pile-up length in the single-crystal micro-pillars. The Hall–Petch-type relation holds even in a microscale single-crystal, which is characterized by its dislocation source lengths. Our quantitative conclusions suggest that the number of dislocation sources and pile-ups are significant factors for the size effect. They also indicate that starvation of dislocation sources is another reason for the size effect. Moreover, we investigated the explicit relationship between the stacking fault energy and the dislocation “pile-up” effect inside the sample: materials with low stacking fault energy exhibit an obvious dislocation pile-up effect. Our proposed physical model predicts a sample strength that agrees well with experimental data, and our model can give a more precise prediction than the current single arm source model, especially for materials with low stacking fault energy.
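The Hall–Petch-type dependence the abstract invokes can be sketched as a strength that rises with the inverse square root of a characteristic length, here the dislocation-source length. The functional form is the standard Hall–Petch scaling; the constants and units below are illustrative, not fitted to the paper's data.

```python
# Sketch: Hall–Petch-type scaling, sigma_y = sigma_0 + k / sqrt(length).
# Constants are illustrative stand-ins, not values from the paper.
import math

def yield_strength(source_length_nm, sigma0_mpa=200.0, k=2000.0):
    """Yield strength rises as the characteristic source length shrinks."""
    return sigma0_mpa + k / math.sqrt(source_length_nm)

small, large = yield_strength(100.0), yield_strength(10000.0)
print(round(small, 1), round(large, 1))   # smaller sample -> stronger
```

This captures the qualitative size effect: a micro-pillar with shorter stochastic source lengths yields at a markedly higher stress than a bulk-like sample.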

  13. A UAV-Based Fog Collector Design for Fine-Scale Aerobiological Sampling

    Science.gov (United States)

    Gentry, Diana; Guarro, Marcello; Demachkie, Isabella Siham; Stumfall, Isabel; Dahlgren, Robert P.

    2017-01-01

    Airborne microbes are found throughout the troposphere and into the stratosphere. Knowing how the activity of airborne microorganisms can alter water, carbon, and other geochemical cycles is vital to a full understanding of local and global ecosystems. Just as on the land or in the ocean, atmospheric regions vary in habitability; the underlying geochemical, climatic, and ecological dynamics must be characterized at different scales to be effectively modeled. Most aerobiological studies have focused on a high level: 'How high are airborne microbes found?' and 'How far can they travel?' Most fog and cloud water studies collect from stationary ground stations (point) or along flight transects (1D). To complement and provide context for this data, we have designed a UAV-based modified fog and cloud water collector to retrieve 4D-resolved samples for biological and chemical analysis.Our design uses a passive impacting collector hanging from a rigid rod suspended between two multi-rotor UAVs. The suspension design reduces the effect of turbulence and potential for contamination from the UAV downwash. The UAVs are currently modeled in a leader-follower configuration, taking advantage of recent advances in modular UAVs, UAV swarming, and flight planning.The collector itself is a hydrophobic mesh. Materials including Tyvek, PTFE, nylon, and polypropylene monofilament fabricated via laser cutting, CNC knife, or 3D printing were characterized for droplet collection efficiency using a benchtop atomizer and particle counter. Because the meshes can be easily and inexpensively fabricated, a set can be pre-sterilized and brought to the field for 'hot swapping' to decrease cross-contamination between flight sessions or use as negative controls.An onboard sensor and logging system records the time and location of each sample; when combined with flight tracking data, the samples can be resolved into a 4D volumetric map of the fog bank. 
Collected samples can be returned to the lab for

  14. Reliability of risk-adjusted outcomes for profiling hospital surgical quality.

    Science.gov (United States)

    Krell, Robert W; Hozain, Ahmed; Kao, Lillian S; Dimick, Justin B

    2014-05-01

    Quality improvement platforms commonly use risk-adjusted morbidity and mortality to profile hospital performance. However, given small hospital caseloads and low event rates for some procedures, it is unclear whether these outcomes reliably reflect hospital performance. To determine the reliability of risk-adjusted morbidity and mortality for hospital performance profiling using clinical registry data. A retrospective cohort study was conducted using data from the American College of Surgeons National Surgical Quality Improvement Program, 2009. Participants included all patients (N = 55,466) who underwent colon resection, pancreatic resection, laparoscopic gastric bypass, ventral hernia repair, abdominal aortic aneurysm repair, and lower extremity bypass. Outcomes included risk-adjusted overall morbidity, severe morbidity, and mortality. We assessed reliability (0-1 scale: 0, completely unreliable; and 1, perfectly reliable) for all 3 outcomes. We also quantified the number of hospitals meeting minimum acceptable reliability thresholds (>0.70, good reliability; and >0.50, fair reliability) for each outcome. For overall morbidity, the most common outcome studied, the mean reliability depended on sample size (ie, how high the hospital caseload was) and the event rate (ie, how frequently the outcome occurred). For example, mean reliability for overall morbidity was low for abdominal aortic aneurysm repair (reliability, 0.29; sample size, 25 cases per year; and event rate, 18.3%). In contrast, mean reliability for overall morbidity was higher for colon resection (reliability, 0.61; sample size, 114 cases per year; and event rate, 26.8%). Colon resection (37.7% of hospitals), pancreatic resection (7.1% of hospitals), and laparoscopic gastric bypass (11.5% of hospitals) were the only procedures for which any hospitals met a reliability threshold of 0.70 for overall morbidity. Because severe morbidity and mortality are less frequent outcomes, their mean
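The caseload and event-rate dependence described above follows from a common signal-to-noise formulation of reliability: between-hospital variance divided by between-hospital variance plus within-hospital noise, where the noise shrinks with caseload. The variance components below are illustrative choices (not the study's estimates), picked only to echo the abdominal-aortic-aneurysm versus colon-resection contrast in the abstract.

```python
# Sketch: reliability = between-hospital variance /
#         (between-hospital variance + within-hospital variance / caseload).
# Variance components are illustrative, not the study's estimates.
def reliability(var_between, event_rate, caseload):
    """Binomial within-hospital noise shrinks as caseload grows."""
    var_within = event_rate * (1.0 - event_rate)
    return var_between / (var_between + var_within / caseload)

low = reliability(0.002, 0.183, 25)     # AAA-repair-like: few cases
high = reliability(0.002, 0.268, 114)   # colon-resection-like: many cases
print(round(low, 2), round(high, 2))
```

The same between-hospital signal yields very different reliabilities once caseload and event rate change, which is why low-volume procedures rarely clear the 0.70 threshold.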

  15. Accuracy assessment of the National Forest Inventory map of Mexico: sampling designs and the fuzzy characterization of landscapes

    Directory of Open Access Journals (Sweden)

    Stéphane Couturier

    2009-10-01

    Full Text Available There is no record so far in the literature of a comprehensive method to assess the accuracy of regional scale Land Cover/Land Use (LCLU) maps in the sub-tropical belt. The elevated biodiversity and the presence of highly fragmented classes hamper the use of sampling designs commonly employed in previous assessments of mainly temperate zones. A sampling design for assessing the accuracy of the Mexican National Forest Inventory (NFI) map at community level is presented. A pilot study was conducted on the Cuitzeo Lake watershed region covering 400 000 ha of the 2000 Landsat-derived map. Various sampling designs were tested in order to find a trade-off between operational costs, a good spatial distribution of the sample and the inclusion of all scarcely distributed classes (‘rare classes’). A two-stage sampling design, where the selection of Primary Sampling Units (PSU) was done under separate schemes for commonly and scarcely distributed classes, showed the best characteristics. A total of 2,023 point secondary sampling units were verified against their NFI map label. Issues regarding the assessment strategy and trends of class confusions are discussed.
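
    The two-scheme PSU selection can be sketched as follows; the function, class names, thresholds, and sample sizes are hypothetical illustrations, not the pilot study's actual rules:

```python
import random

def two_scheme_sample(units_by_class, common_n, rare_min, rare_threshold, seed=0):
    # Classes with many mapped units get a simple random sample of PSUs;
    # scarce ('rare') classes get a guaranteed minimum so that they are
    # not missed by chance.
    rng = random.Random(seed)
    chosen = {}
    for cls, units in units_by_class.items():
        size = common_n if len(units) >= rare_threshold else rare_min
        chosen[cls] = rng.sample(units, min(size, len(units)))
    return chosen

# hypothetical map: a common class and a scarcely distributed one
units = {"forest": list(range(100)), "mangrove": list(range(5))}
sample = two_scheme_sample(units, common_n=10, rare_min=3, rare_threshold=20)
```

    The rare class is guaranteed representation in the sample, which a single simple random scheme over all PSUs would not ensure.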

  16. Economic Design of Acceptance Sampling Plans in a Two-Stage Supply Chain

    Directory of Open Access Journals (Sweden)

    Lie-Fern Hsu

    2012-01-01

    Full Text Available Supply Chain Management, which is concerned with material and information flows between facilities and the final customers, has been considered the most popular operations strategy for improving organizational competitiveness nowadays. With the advanced development of computer technology, it is getting easier to derive an acceptance sampling plan satisfying both the producer's and consumer's quality and risk requirements. However, all the available QC tables and computer software determine the sampling plan on a noneconomic basis. In this paper, we design an economic model to determine the optimal sampling plan in a two-stage supply chain that minimizes the producer's and the consumer's total quality cost while satisfying both the producer's and consumer's quality and risk requirements. Numerical examples show that the optimal sampling plan is quite sensitive to the producer's product quality. The product's inspection, internal failure, and postsale failure costs also have an effect on the optimal sampling plan.
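
    For contrast, the noneconomic determination the abstract refers to, a single-sampling plan chosen purely to satisfy the producer's and consumer's risk requirements via the binomial operating-characteristic curve, can be sketched as follows (the AQL, LTPD, and risk levels are illustrative):

```python
from math import comb

def accept_prob(n, c, p):
    # Operating characteristic: probability of accepting a lot with true
    # defect rate p when n items are inspected and at most c defects pass.
    return sum(comb(n, d) * p**d * (1 - p)**(n - d) for d in range(c + 1))

def find_plan(aql, ltpd, alpha=0.05, beta=0.10, n_max=500):
    # Smallest sample size n (with some acceptance number c) meeting the
    # producer's risk alpha at the AQL and the consumer's risk beta at
    # the LTPD.
    for n in range(1, n_max + 1):
        for c in range(n + 1):
            if (accept_prob(n, c, aql) >= 1 - alpha
                    and accept_prob(n, c, ltpd) <= beta):
                return n, c
    return None
```

    An economic design would instead search over (n, c) for the plan minimizing total quality cost, subject to these same risk constraints.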

  17. Sampling design and procedures for fixed surface-water sites in the Georgia-Florida coastal plain study unit, 1993

    Science.gov (United States)

    Hatzell, H.H.; Oaksford, E.T.; Asbury, C.E.

    1995-01-01

    The implementation of design guidelines for the National Water-Quality Assessment (NAWQA) Program has resulted in the development of new sampling procedures and the modification of existing procedures commonly used in the Water Resources Division of the U.S. Geological Survey. The Georgia-Florida Coastal Plain (GAFL) study unit began the intensive data collection phase of the program in October 1992. This report documents the implementation of the NAWQA guidelines by describing the sampling design and procedures for collecting surface-water samples in the GAFL study unit in 1993. This documentation is provided for agencies that use water-quality data and for future study units that will be entering the intensive phase of data collection. The sampling design is intended to account for large- and small-scale spatial variations, and temporal variations in water quality for the study area. Nine fixed sites were selected in drainage basins of different sizes and different land-use characteristics located in different land-resource provinces. Each of the nine fixed sites was sampled regularly for a combination of six constituent groups composed of physical and chemical constituents: field measurements, major ions and metals, nutrients, organic carbon, pesticides, and suspended sediments. Some sites were also sampled during high-flow conditions and storm events. Discussion of the sampling procedure is divided into three phases: sample collection, sample splitting, and sample processing. A cone splitter was used to split water samples for the analysis of the sampling constituent groups except organic carbon from approximately nine liters of stream water collected at four fixed sites that were sampled intensively. An example of the sample splitting schemes designed to provide the sample volumes required for each sample constituent group is described in detail. 
Information about onsite sample processing has been organized into a flowchart that describes a pathway for each of

  18. Genetic Contribution to Alcohol Dependence: Investigation of a Heterogeneous German Sample of Individuals with Alcohol Dependence, Chronic Alcoholic Pancreatitis, and Alcohol-Related Cirrhosis

    Directory of Open Access Journals (Sweden)

    Jens Treutlein

    2017-07-01

    Full Text Available The present study investigated the genetic contribution to alcohol dependence (AD using genome-wide association data from three German samples. These comprised patients with: (i AD; (ii chronic alcoholic pancreatitis (ACP; and (iii alcohol-related liver cirrhosis (ALC. Single marker, gene-based, and pathway analyses were conducted. A significant association was detected for the ADH1B locus in a gene-based approach (puncorrected = 1.2 × 10−6; pcorrected = 0.020. This was driven by the AD subsample. No association with ADH1B was found in the combined ACP + ALC sample. On first inspection, this seems surprising, since ADH1B is a robustly replicated risk gene for AD and may therefore be expected to be associated also with subgroups of AD patients. The negative finding in the ACP + ALC sample, however, may reflect genetic stratification as well as random fluctuation of allele frequencies in the cases and controls, demonstrating the importance of large samples in which the phenotype is well assessed.

  19. Optimizing trial design in pharmacogenetics research: comparing a fixed parallel group, group sequential, and adaptive selection design on sample size requirements.

    Science.gov (United States)

    Boessen, Ruud; van der Baan, Frederieke; Groenwold, Rolf; Egberts, Antoine; Klungel, Olaf; Grobbee, Diederick; Knol, Mirjam; Roes, Kit

    2013-01-01

    Two-stage clinical trial designs may be efficient in pharmacogenetics research when there is some but inconclusive evidence of effect modification by a genomic marker. Two-stage designs allow early stopping for efficacy or futility and can offer the additional opportunity to enrich the study population to a specific patient subgroup after an interim analysis. This study compared sample size requirements for fixed parallel group, group sequential, and adaptive selection designs with equal overall power and control of the family-wise type I error rate. The designs were evaluated across scenarios that defined the effect sizes in the marker positive and marker negative subgroups and the prevalence of marker positive patients in the overall study population. Effect sizes were chosen to reflect realistic planning scenarios, where at least some effect is present in the marker negative subgroup. In addition, scenarios were considered in which the assumed 'true' subgroup effects (i.e., the postulated effects) differed from those hypothesized at the planning stage. As expected, both two-stage designs generally required fewer patients than a fixed parallel group design, and the advantage increased as the difference between subgroups increased. The adaptive selection design added little further reduction in sample size, as compared with the group sequential design, when the postulated effect sizes were equal to those hypothesized at the planning stage. However, when the postulated effects deviated strongly in favor of enrichment, the comparative advantage of the adaptive selection design increased, which precisely reflects the adaptive nature of the design. Copyright © 2013 John Wiley & Sons, Ltd.
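
    The sample-size advantage of early stopping can be illustrated with a minimal Monte Carlo sketch of a two-look design, here a one-sample z-test stand-in using the Pocock boundary of roughly 2.178 for two equally spaced looks at two-sided alpha 0.05; all numbers are illustrative, not the study's scenarios:

```python
import math
import random

def expected_n(effect, n_per_stage=50, trials=2000, z_bound=2.178, seed=0):
    # Monte Carlo expected total sample size for a two-look design:
    # stop at the interim if |z| crosses the boundary, otherwise
    # continue to the full sample of 2 * n_per_stage observations.
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        stage1 = sum(rng.gauss(effect, 1.0) for _ in range(n_per_stage))
        z1 = stage1 / math.sqrt(n_per_stage)
        total += n_per_stage if abs(z1) >= z_bound else 2 * n_per_stage
    return total / trials
```

    Under a large true effect the interim look usually stops the trial, so the expected sample size falls well below the fixed design's; under the null it stays near the maximum.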

  20. Spin-dependent transport and functional design in organic ferromagnetic devices

    Directory of Open Access Journals (Sweden)

    Guichao Hu

    2017-09-01

    Full Text Available Organic ferromagnets are intriguing materials in that they combine ferromagnetic and organic properties. Although challenges in their synthesis still remain, the development of organic spintronics has triggered strong interest in high-performance organic ferromagnetic devices. This review first introduces our theory for spin-dependent electron transport through organic ferromagnetic devices, which combines an extended Su–Schrieffer–Heeger model with the Green’s function method. The effects of the intrinsic interactions in the organic ferromagnets, including strong electron–lattice interaction and spin–spin correlation between π-electrons and radicals, are highlighted. Several interesting functional designs of organic ferromagnetic devices are discussed, specifically the concepts of a spin filter, multi-state magnetoresistance, and spin-current rectification. The mechanism of each phenomenon is explained by transmission and orbital analysis. These works show that organic ferromagnets are promising components for spintronic devices that deserve to be designed and examined in future experiments.

  1. Statistical Power and Optimum Sample Allocation Ratio for Treatment and Control Having Unequal Costs Per Unit of Randomization

    Science.gov (United States)

    Liu, Xiaofeng

    2003-01-01

    This article considers optimal sample allocation between the treatment and control condition in multilevel designs when the costs per sampling unit vary due to treatment assignment. Optimal unequal allocation may reduce the cost from that of a balanced design without sacrificing any power. The optimum sample allocation ratio depends only on the…
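
    In the simplest two-group setting with equal variances (a simpler case than the multilevel designs the article analyzes), the classical cost-based optimum allocates sample sizes in inverse proportion to the square root of unit cost, n_t / n_c = sqrt(c_c / c_t). A sketch under that assumption:

```python
from math import sqrt

def optimal_allocation(budget, cost_treat, cost_control):
    # Classical optimum for comparing two means with equal variances:
    # group sizes proportional to 1 / sqrt(unit cost), with the budget
    # exhausted: budget = n_t * c_t + n_c * c_c, n_t / n_c = sqrt(c_c / c_t).
    ratio = sqrt(cost_control / cost_treat)          # n_t / n_c
    n_control = budget / (cost_control + cost_treat * ratio)
    return ratio * n_control, n_control

# hypothetical costs: treatment unit 4x the control unit, budget 600
n_t, n_c = optimal_allocation(600.0, 4.0, 1.0)
```

    With these numbers the optimum enrolls twice as many (cheap) controls as (expensive) treated units, and the variance of the mean difference, proportional to 1/n_t + 1/n_c, is smaller than under a balanced design costing the same.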

  2. Design-based estimators for snowball sampling

    OpenAIRE

    Shafie, Termeh

    2010-01-01

    Snowball sampling, where existing study subjects recruit further subjects from among their acquaintances, is a popular approach when sampling from hidden populations. Since people with many in-links are more likely to be selected, there will be a selection bias in the samples obtained. In order to eliminate this bias, the sample data must be weighted. However, the exact selection probabilities are unknown for snowball samples and need to be approximated in an appropriate way. This paper proposes d...
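
    The weighting idea can be sketched with a Horvitz-Thompson style estimator in its ratio (Hájek) form; this is a generic sketch, and the paper's actual approximation of the selection probabilities is not reproduced here:

```python
def ht_mean(values, inclusion_probs):
    # Weighted mean in which each sampled unit is weighted by the inverse
    # of its (approximate) selection probability, so that well-connected
    # units, which a snowball oversamples, count for less.
    weights = [1.0 / p for p in inclusion_probs]
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

# hypothetical sample: two high-degree units (prob 0.5) with trait = 1
# and one hard-to-reach unit (prob 0.1) with trait = 0
estimate = ht_mean([1.0, 0.0, 1.0], [0.5, 0.1, 0.5])
```

    The weighted estimate (4/14, about 0.29) is pulled well below the naive sample mean of 2/3, correcting for the overrepresentation of easily reached units.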

  3. Slicing, sampling, and distance-dependent effects affect network measures in simulated cortical circuit structures.

    Science.gov (United States)

    Miner, Daniel C; Triesch, Jochen

    2014-01-01

    The neuroanatomical connectivity of cortical circuits is believed to follow certain rules, the exact origins of which are still poorly understood. In particular, numerous nonrandom features, such as common neighbor clustering, overrepresentation of reciprocal connectivity, and overrepresentation of certain triadic graph motifs have been experimentally observed in cortical slice data. Some of these data, particularly regarding bidirectional connectivity, are seemingly contradictory, and the reasons for this are unclear. Here we present a simple static geometric network model with distance-dependent connectivity on a realistic scale that naturally gives rise to certain elements of these observed behaviors, and may provide plausible explanations for some of the conflicting findings. Specifically, investigation of the model shows that experimentally measured nonrandom effects, especially bidirectional connectivity, may depend sensitively on experimental parameters such as slice thickness and sampling area, suggesting potential explanations for the seemingly conflicting experimental results.
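
    A minimal version of such a static geometric model (with illustrative parameters, not those of the paper) already shows distance-dependence inflating reciprocity above the chance level of an equally dense random graph, because both directions of a pair share the same, possibly short, distance:

```python
import math
import random

def simulate(n=300, side=300.0, lam=80.0, p_max=0.3, seed=1):
    # Neurons placed uniformly in a cube; each ordered pair is connected
    # independently with a probability that decays with distance.
    rng = random.Random(seed)
    pts = [(rng.uniform(0, side), rng.uniform(0, side), rng.uniform(0, side))
           for _ in range(n)]
    edges = set()
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = math.dist(pts[i], pts[j])
            if rng.random() < p_max * math.exp(-d / lam):
                edges.add((i, j))
    return pts, edges

def reciprocity(edges):
    # Fraction of directed edges whose reverse edge also exists.
    return sum((j, i) in edges for (i, j) in edges) / len(edges)
```

    Restricting the sampled volume (the analogue of a thin slice or small sampling window) changes the distance distribution of surviving pairs and therefore the measured reciprocity, which is the sensitivity the paper describes.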

  4. Occupational therapy practitioners' perceptions of rehabilitation managers' leadership styles and the outcomes of leadership.

    Science.gov (United States)

    Snodgrass, Jeff; Douthitt, Shannon; Ellis, Rachel; Wade, Shelly; Plemons, Josh

    2008-01-01

    The purpose of this research was to serve as a pilot study to investigate the association between occupational therapy practitioners' perceptions of rehabilitation managers' leadership styles and the outcomes of leadership. Data for this study were collected using the Multifactor Leadership Questionnaire Form 5X and a self-designed demographic questionnaire. The study working sample included 73 occupational therapy practitioners. Major findings from the study indicate that overall, transformational, and transactional leadership styles are associated with leadership outcomes. Transformational leadership had a significant positive association with leadership outcomes, whereas transactional leadership had a significant negative association with leadership outcomes. The contingent reward leadership attribute (although belonging to the transactional leadership construct) was found to be positively associated with leadership outcomes, similar to the transformational leadership constructs. The results of this research suggest that transformational leadership styles have a positive association with leadership outcomes, whereas transactional leadership styles have a negative association, excluding the positive transactional contingent reward attribute. A larger, random sample is recommended as a follow-up study.

  5. Sampling and energy evaluation challenges in ligand binding protein design.

    Science.gov (United States)

    Dou, Jiayi; Doyle, Lindsey; Greisen, Per, Jr; Schena, Alberto; Park, Hahnbeom; Johnsson, Kai; Stoddard, Barry L; Baker, David

    2017-12-01

    The steroid hormone 17α-hydroxyprogesterone (17-OHP) is a biomarker for congenital adrenal hyperplasia and hence there is considerable interest in development of sensors for this compound. We used computational protein design to generate protein models with binding sites for 17-OHP containing an extended, nonpolar, shape-complementary binding pocket for the four-ring core of the compound, and hydrogen bonding residues at the base of the pocket to interact with carbonyl and hydroxyl groups at the more polar end of the ligand. Eight of 16 designed proteins experimentally tested bind 17-OHP with micromolar affinity. A co-crystal structure of one of the designs revealed that 17-OHP is rotated 180° around a pseudo-two-fold axis in the compound and displays multiple binding modes within the pocket, while still interacting with all of the designed residues in the engineered site. Subsequent rounds of mutagenesis and binding selection improved the ligand affinity to nanomolar range, while appearing to constrain the ligand to a single bound conformation that maintains the same "flipped" orientation relative to the original design. We trace the discrepancy in the design calculations to two sources: first, a failure to model subtle backbone changes which alter the distribution of sidechain rotameric states and second, an underestimation of the energetic cost of desolvating the carbonyl and hydroxyl groups of the ligand. The difference between design model and crystal structure thus arises from both sampling limitations and energy function inaccuracies that are exacerbated by the near two-fold symmetry of the molecule. © 2017 The Authors Protein Science published by Wiley Periodicals, Inc. on behalf of The Protein Society.

  6. How Mobile App Design Impacts User Responses to Mixed Self-Tracking Outcomes: Randomized Online Experiment to Explore the Role of Spatial Distance for Hedonic Editing

    Science.gov (United States)

    Lorenz, Jana

    2018-01-01

    Background Goal setting is among the most common behavioral change techniques employed in contemporary self-tracking apps. For these techniques to be effective, it is relevant to understand how the visual presentation of goal-related outcomes employed in the app design affects users’ responses to their self-tracking outcomes. Objective This study examined whether a spatially close (vs distant) presentation of mixed positive and negative self-tracking outcomes from multiple domains (ie, activity, diet) on a digital device’s screen can provide users the opportunity to hedonically edit their self-tracking outcome profile (ie, to view their mixed self-tracking outcomes in the most positive light). Further, this study examined how the opportunity to hedonically edit one’s self-tracking outcome profile relates to users’ future health behavior intentions. Methods To assess users’ responses to a spatially close (vs distant) presentation of a mixed-gain (vs mixed-loss) self-tracking outcome profile, a randomized 2×2 between-subjects online experiment with a final sample of 397 participants (mean age 27.4, SD 7.2 years; 71.5%, 284/397 female) was conducted in Germany. The experiment started with a cover story about a fictitious self-tracking app. Thereafter, participants saw one of four manipulated self-tracking outcome profiles. Variables of interest measured were health behavior intentions, compensatory health beliefs, health motivation, and recall of the outcome profile. We analyzed data using chi-square tests (SPSS version 23) and moderated mediation analyses with the PROCESS macro 2.16.1. Results Spatial distance facilitated hedonic editing, which was indicated by systematic memory biases in users’ recall of positive and negative self-tracking outcomes. In the case of a mixed-gain outcome profile, a spatially close (vs distant) presentation tended to increase the underestimation of the negative outcome (P=.06). In the case of a mixed-loss outcome profile, a

  7. Inferior petrosal sinus sampling in the diagnosis of adrenocorticotropin dependent Cushing syndrome with unknown origin

    International Nuclear Information System (INIS)

    Shen Xuefeng; Yuan Dequan; Yue Ming; Feng Juanjuan

    2011-01-01

    Objective: To evaluate the value of inferior petrosal sinus sampling (IPSS) in the diagnosis of adrenocorticotropic hormone (ACTH) dependent Cushing syndrome (CS) with unknown origin. Methods: IPSS was carried out for the diagnosis of 16 cases with ACTH dependent CS who had not been identified after a series of dexamethasone suppression tests and radiological examinations. The ratio of inferior petrosal sinus/peripheral ACTH was assayed. The sensitivity and specificity of diagnosis of the Cushing disease were estimated. Results: The inferior petrosal sinus/peripheral ACTH ratio was over 2.0 in 13 cases. Twelve cases underwent surgery with pathological diagnosis of pituitary ACTH adenoma, and 1 patient was relieved after γ knife treatment. The ratio was < 2.0 in 3 cases including 2 pulmonary carcinoid and one pituitary ACTH adenoma. The sensitivity and specificity of IPSS for the diagnosis of Cushing disease were 13/14 and 2/2 respectively. Conclusion: IPSS was a safe technique with high sensitivity and specificity and infrequent complications in the diagnosis of ACTH dependent Cushing disease. It had great clinical value in the differential diagnosis of ACTH dependent Cushing disease with unknown origin. (authors)
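
    The quoted fractions follow from the standard definitions; a minimal sketch using the counts as read from the abstract (13 of the 14 confirmed Cushing disease cases had an inferior petrosal sinus/peripheral ACTH ratio over 2.0, and both non-pituitary cases fell below it):

```python
def sens_spec(tp, fn, tn, fp):
    # Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP).
    return tp / (tp + fn), tn / (tn + fp)

# counts as read from the abstract
sens, spec = sens_spec(tp=13, fn=1, tn=2, fp=0)
```

    This reproduces the reported 13/14 sensitivity and 2/2 specificity, though with only two true-negative cases the specificity estimate is of course very imprecise.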

  8. Adrenal Vein Sampling for Conn's Syndrome: Diagnosis and Clinical Outcomes.

    Science.gov (United States)

    Deipolyi, Amy R; Bailin, Alexander; Wicky, Stephan; Alansari, Shehab; Oklu, Rahmi

    2015-06-19

    Adrenal vein sampling (AVS) is the gold standard test to determine unilateral causes of primary aldosteronism (PA). We have retrospectively characterized our experience with AVS including concordance of AVS results and imaging, and describe the approach for the PA patient in whom bilateral AVS is unsuccessful. We reviewed the medical records of 85 patients with PA and compared patients who were treated medically and surgically on pre-procedure presentation and post-treatment outcomes, and evaluated how technically unsuccessful AVS results were used in further patient management. Out of the 92 AVS performed in 85 patients, AVS was technically successful bilaterally in 58 (63%) of cases. Either unsuccessful AVS prompted a repeat AVS, or results from the contralateral side and from CT imaging were used to guide further therapy. Patients who were managed surgically with adrenalectomy had higher initial blood pressure and lower potassium levels compared with patients who were managed medically. Adrenalectomy results in significantly decreased blood pressure and normalization of potassium levels. AVS can identify surgically curable causes of PA, but can be technically challenging. When one adrenal vein fails to be cannulated, results from the contralateral vein can be useful in conjunction with imaging and clinical findings to suggest further management.

  9. Surface area-dependence of gas-particle interactions influences pulmonary and neuroinflammatory outcomes.

    Science.gov (United States)

    Tyler, Christina R; Zychowski, Katherine E; Sanchez, Bethany N; Rivero, Valeria; Lucas, Selita; Herbert, Guy; Liu, June; Irshad, Hammad; McDonald, Jacob D; Bleske, Barry E; Campen, Matthew J

    2016-12-01

    Deleterious consequences of exposure to traffic emissions may derive from interactions between carbonaceous particulate matter (PM) and gaseous components in a manner that is dependent on the surface area or complexity of the particles. To determine the validity of this hypothesis, we examined pulmonary and neurological inflammatory outcomes in C57BL/6 and apolipoprotein E knockout (ApoE -/- ) male mice after acute and chronic exposure to vehicle engine-derived particulate matter, generated as ultrafine (UFP) and fine (FP) sizes, with additional exposures using UFP or FP combined with gaseous copollutants derived from fresh gasoline and diesel emissions, labeled as UFP + G and FP + G. The UFP and UFP + G exposure groups resulted in the most profound pulmonary and neuroinflammatory effects. Phagocytosis of UFP + G particles via resident alveolar macrophages was substantial in both mouse strains, particularly after chronic exposure, with concurrent increased proinflammatory cytokine expression of CXCL1 and TNFα in the bronchial lavage fluid. In the acute exposure paradigm, only UFP and UFP + G induced significant changes in pulmonary inflammation and only in the ApoE -/- animals. Similarly, acute exposure to UFP and UFP + G increased the expression of several cytokines in the hippocampus of ApoE -/- mice including Il-1β, IL-6, Tgf-β and Tnf-α and in the hippocampus of C57BL/6 mice including Ccl5, Cxcl1, Il-1β, and Tnf-α. Interestingly, Il-6 and Tgf-β expression were decreased in the C57BL/6 hippocampus after acute exposure. Chronic exposure to UFP + G increased expression of Ccl5, Cxcl1, Il-6, and Tgf-β in the ApoE -/- hippocampus, but this effect was minimal in the C57BL/6 mice, suggesting compensatory mechanisms to manage neuroinflammation in this strain. Inflammatory responses in the lung and brain were most substantial in ApoE -/- animals exposed to UFP + G, suggesting that the surface area-dependent interaction of gases and

  10. A retrospective study of long-term treatment outcomes for reduced vocal intensity in hypokinetic dysarthria

    Directory of Open Access Journals (Sweden)

    Christopher R. Watts

    2016-02-01

    Full Text Available Abstract Background Reduced vocal intensity is a core impairment of hypokinetic dysarthria in Parkinson’s disease (PD. Speech treatments have been developed to rehabilitate the vocal subsystems underlying this impairment. Intensive treatment programs requiring high-intensity voice and speech exercises with clinician-guided prompting and feedback have been established as effective for improving vocal function. Less is known, however, regarding long-term outcomes of clinical benefit in speakers with PD who receive these treatments. Methods A retrospective cohort design was utilized. Data from 78 patient files across a three year period were analyzed. All patients received a structured, intensive program of voice therapy focusing on speaking intent and loudness. The dependent variable for all analyses was vocal intensity in decibels (dBSPL. Vocal intensity during sustained vowel production, reading, and novel conversational speech was compared at pre-treatment, post-treatment, six month follow-up, and twelve month follow-up periods. Results Statistically significant increases in vocal intensity were found at post-treatment, 6 months, and 12 month follow-up periods with intensity gains ranging from 5 to 17 dB depending on speaking condition and measurement period. Significant treatment effects were found in all three speaking conditions. Effect sizes for all outcome measures were large, suggesting a strong degree of practical significance. Conclusions Significant increases in vocal intensity measured at 6 and 12 month follow-up periods suggested that the sample of patients maintained treatment benefit for up to a year. These findings are supported by outcome studies reporting treatment outcomes within a few months post-treatment, in addition to prior studies that have reported long-term outcome results. The positive treatment outcomes experienced by the PD cohort in this study are consistent with treatment responses subsequent to other treatment

  11. Sample-length dependence of the critical current of slightly and significantly bent-damaged Bi2223 superconducting composite tape

    International Nuclear Information System (INIS)

    Ochiai, S; Fujimoto, M; Okuda, H; Oh, S S; Ha, D W

    2007-01-01

    The local critical current along a sample length is different from position to position in a long sample, especially when the sample is damaged by externally applied strain. In the present work, we attempted to reveal the relation of the distribution of the local critical current to overall critical current and the sample-length dependence of critical current for slightly and significantly damaged Bi2223 composite tape samples. In the experiment, 48 cm long Bi2223 composite tape samples, composed of 48 local elements with a length of 1 cm and 8 parts with a length 6 cm, were bent by 0.37 and 1.0% to cause slight and significant damage, respectively. The V-I curve, critical current (1 μV/cm criterion) and n value were measured for the overall sample as well as for the local elements and parts. It was found that the critical current distributions of the 1 cm elements at 0.37 and 1.0% bending strains are described by the three-parameter and bimodal Weibull distribution functions, respectively. The critical current of a long sample at both bending strains could be described well by substituting the distributed critical current and n value of the short elements into the series circuit model for voltage generation. Also the measured relation of average critical current to sample length could be reproduced well in the computer by a Monte Carlo simulation method. It was shown that the critical current and n value decrease with increasing sample length at both bending strains. The extent of the decrease in critical current with sample length is dependent on the criterion of the critical current; the critical current decreases only slightly under the 1 μV/cm criterion which is not damage-sensitive, while it decreases greatly with increasing sample length under damage-sensitive criteria such as the 1 μV one
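
    The series-circuit and Monte Carlo computation the authors describe can be sketched as follows; the Weibull parameters and the n value here are illustrative placeholders, not the fitted values from the paper:

```python
import random

def series_ic(ic_elems, n=15, criterion=1.0):
    # Overall critical current of a series of 1 cm elements, each with a
    # power-law E-field E_i = (I / Ic_i)**n in units of 1 uV/cm.  The
    # sample-level Ic is the current at which the E-field averaged over
    # the sample reaches the criterion; solved by bisection.
    length = len(ic_elems)
    def efield(current):
        return sum((current / ic) ** n for ic in ic_elems) / length
    lo, hi = 0.0, 2.0 * max(ic_elems)
    for _ in range(40):
        mid = 0.5 * (lo + hi)
        if efield(mid) < criterion:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def mean_ic(length_cm, trials=100, seed=0):
    # Monte Carlo mean critical current vs. sample length, drawing each
    # element's Ic from a Weibull distribution (scale 100 A, shape 8).
    rng = random.Random(seed)
    return sum(series_ic([rng.weibullvariate(100.0, 8.0)
                          for _ in range(length_cm)])
               for _ in range(trials)) / trials
```

    Averaged over trials, the 48 cm sample comes out below the 1 cm element: longer samples are more likely to contain a weak element, reproducing the qualitative length dependence reported.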

  12. Residential energy efficiency retrofits: How program design affects participation and outcomes

    International Nuclear Information System (INIS)

    Hoicka, Christina E.; Parker, Paul; Andrey, Jean

    2014-01-01

    Better methods of characterizing and addressing heterogeneity in preferences and decision making are needed to stimulate reductions in household greenhouse gas emissions. Four residential energy efficiency programs were delivered consecutively in the Region of Waterloo, Canada, between 1999 and 2011, and each offered a unique combination of information, financial reward structure, and price. A natural quasi-experimental intervention design was employed to assess differences in outcomes across these program structures. Participation at the initial (evaluation by an energy advisor) and follow-up (verification of retrofit) stages, and the material characteristics (e.g., energy performance) were measured and compared between the groups of houses included in each program at each stage. The programs appealed to people with different types of material concerns; each phase of the program was associated with houses with a different mix of material characteristics and depths of recommended and achieved changes. While a performance-based reward attracted fewer houses at each stage than a larger list-based reward, older houses with poorer energy performance were included at each stage. The findings support experimentation with program designs to target sub-populations of housing stock; future program designs should experiment more carefully and with larger performance-based rewards and test parallels with potential carbon market structures. - Highlights: • Multi-program data over 12 years detailing residential energy retrofits. • Natural experimental intervention research design for program evaluation. • Number and attributes of participating households differed by program design. • Financial rewards attracted more participants to the verification stage. • Performance-based incentives have the largest potential for energy savings

  13. Time-Dependent Deformation Modelling for a Chopped-Glass Fiber Composite for Automotive Durability Design Criteria; FINAL

    International Nuclear Information System (INIS)

    Ren, W

    2001-01-01

    Time-dependent deformation behavior of a polymeric composite with chopped-glass-fiber reinforcement was investigated for automotive applications. The material under stress was exposed to representative automobile service environments. Results show that environment has substantial effects on the time-dependent deformation behavior of the material. The data were analyzed and experimentally-based models were developed for the time-dependent deformation behavior as a basis for automotive structural durability design criteria
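
    Experimentally-based models of this kind are often Findley-type power laws; the sketch below is a generic hedged illustration of that form, not the report's actual fitted model, and all parameter values are hypothetical:

```python
def creep_strain(t_hours, eps0, amplitude, exponent):
    # Findley-type power-law creep: instantaneous strain eps0 plus a
    # time-dependent term growing as t**exponent (0 < exponent < 1).
    return eps0 + amplitude * t_hours ** exponent

# illustrative parameters for a chopped-glass-fiber composite coupon
strain_1h = creep_strain(1.0, 0.004, 0.0008, 0.25)
strain_1000h = creep_strain(1000.0, 0.004, 0.0008, 0.25)
```

    Environment (temperature, fluid exposure) typically enters such models through the amplitude and exponent, which is where durability design criteria would apply service-condition adjustments.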

  14. Treatment outcome for a sample of patients with Class II division 1 malocclusion treated at a regional hospital orthodontic department.

    LENUS (Irish Health Repository)

    Burden, D J

    1998-01-01

    This retrospective study assessed the outcome of orthodontic treatment of 264 patients with Class II division 1 malocclusion (overjet greater than 6 mm). The sample comprised patients who had completed their fixed appliance orthodontic treatment at a regional hospital orthodontic unit in the Republic of Ireland. The PAR Index (Peer Assessment Rating) was used to evaluate treatment outcome using before and after treatment study casts. The results revealed that treatment for this particular type of malocclusion was highly effective with a very few patients failing to benefit from their orthodontic treatment.

  15. OSIRIS-REx Touch-and-Go (TAG) Mission Design for Asteroid Sample Collection

    Science.gov (United States)

    May, Alexander; Sutter, Brian; Linn, Timothy; Bierhaus, Beau; Berry, Kevin; Mink, Ron

    2014-01-01

    The Origins, Spectral Interpretation, Resource Identification, Security, Regolith Explorer (OSIRIS-REx) mission is a NASA New Frontiers mission launching in September 2016 to rendezvous with the near-Earth asteroid Bennu in October 2018. After several months of proximity operations to characterize the asteroid, OSIRIS-REx flies a Touch-And-Go (TAG) trajectory to the asteroid's surface to collect at least 60 g of pristine regolith sample for Earth return. This paper provides mission and flight system overviews, with more details on the TAG mission design and key events that occur to safely and successfully collect the sample. An overview of the navigation performed relative to a chosen sample site, along with the maneuvers to reach the desired site is described. Safety monitoring during descent is performed with onboard sensors providing an option to abort, troubleshoot, and try again if necessary. Sample collection occurs using a collection device at the end of an articulating robotic arm during a brief five-second contact period, while a constant-force spring mechanism in the arm assists to rebound the spacecraft away from the surface. Finally, the sample is measured quantitatively utilizing the law of conservation of angular momentum, along with qualitative data from imagery of the sampling device. Upon sample mass verification, the arm places the sample into the Stardust-heritage Sample Return Capsule (SRC) for return to Earth in September 2023.
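
    The quantitative mass measurement rests on conservation of angular momentum: with thrusters off, added inertia slows the spin. A point-mass sketch of the principle (all numbers hypothetical; the actual flight procedure infers inertia from calibrated spin maneuvers and is considerably more elaborate):

```python
def sample_mass(inertia0, omega0, omega1, arm_radius):
    # Before sampling: L = I0 * w0.  Afterward, the same angular momentum
    # spins the spacecraft slower: L = (I0 + m * r**2) * w1, treating the
    # collected sample as a point mass at the arm's radius r.
    ang_momentum = inertia0 * omega0
    return (ang_momentum / omega1 - inertia0) / arm_radius**2
```

    A round trip with hypothetical values (2000 kg m^2 spacecraft inertia, 3 m arm, 0.1 rad/s initial spin) recovers the collected mass exactly, which is what makes the spin-rate change a usable mass gauge even for tens of grams.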

  16. Do clinical outcomes in chronic hemodialysis depend on the choice of a dialyzer?

    Science.gov (United States)

    Ward, Richard A

    2011-01-01

    Nephrologists are presented with a range of choices when selecting a dialyzer for chronic hemodialysis. Dialyzers differ in the material, structure, permeability and surface area of their membrane, and in how the dialyzer is sterilized. Opinions vary regarding the impact of dialyzer characteristics on patient outcomes, and deciding which, if any, of these properties to take into account when choosing a dialyzer can be confusing. In the general dialysis population, there is no compelling evidence that the choice of a membrane material from among those currently in clinical use has a significant impact on morbidity or mortality (although there are rare patients who will react adversely to a given dialysis membrane). Similarly, most dialyzers are capable of adequately removing small solutes, such as urea, provided they are used with an appropriate blood flow rate and treatment time to ensure delivery of a single-pool Kt/V(urea) of at least 1.25 for men and 1.65 for women. However, in some dialysis patient subpopulations, the results of randomized clinical trials suggest that use of dialyzers containing high-flux membranes confers an outcome advantage. The extent to which this advantage is realized might also depend on how the dialyzer is used, with application in convective therapies such as hemodiafiltration being superior to diffusive therapies such as hemodialysis. This possibility is currently the subject of several large clinical trials. © 2011 Wiley Periodicals, Inc.
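
The single-pool Kt/V targets quoted above are commonly estimated from pre- and post-dialysis blood urea nitrogen using the second-generation Daugirdas formula. A minimal sketch; the patient values are invented for illustration:

```python
import math

def single_pool_ktv(pre_bun, post_bun, session_hours, ultrafiltrate_l, post_weight_kg):
    """Second-generation Daugirdas estimate:
    spKt/V = -ln(R - 0.008*t) + (4 - 3.5*R) * UF / W, where R = post/pre BUN,
    t is session length in hours, UF is fluid removed in liters, and W is
    post-dialysis weight in kg."""
    r = post_bun / pre_bun
    return -math.log(r - 0.008 * session_hours) + (4 - 3.5 * r) * ultrafiltrate_l / post_weight_kg

# Hypothetical patient: BUN falls 70 -> 20 mg/dL over a 4 h session,
# 2 L of fluid removed, 70 kg post-dialysis weight.
ktv = single_pool_ktv(70.0, 20.0, 4.0, 2.0, 70.0)  # ~1.46
```

A value of about 1.46 would meet the quoted 1.25 target for men but fall short of the 1.65 target for women.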

  17. Sonographic prediction of outcome of vacuum deliveries

    DEFF Research Database (Denmark)

    Kahrs, Birgitte H; Usman, Sana; Ghi, Tullio

    2017-01-01

    ... outcome in nulliparous women with prolonged second stage of labor. STUDY DESIGN: We performed a prospective cohort study in nulliparous women at term with prolonged second stage of labor in 7 European maternity units from 2013 through 2016. Fetal head position and station were determined using ... of vacuum extraction in relation to ultrasound-measured head-perineum distance with a predefined cut-off of 25 mm, and 220 women were needed to discriminate between groups using a hazard ratio of 1.5 with 80% power and alpha 5%. Secondary outcomes were delivery mode and umbilical artery cord blood samples ... after birth. The time interval was evaluated using survival analyses, and the outcomes of delivery were evaluated using receiver operating characteristic curves and descriptive statistics. Results were analyzed according to intention to treat. RESULTS: The study population comprised 222 women ...

  18. Robotic Irradiated Sample Handling Concept Design in Reactor TRIGA PUSPATI using Simulation Software

    International Nuclear Information System (INIS)

    Mohd Khairulezwan Abdul Manan; Mohd Sabri Minhat; Ridzuan Abdul Mutalib; Zareen Khan Abdul Jalil Khan; Nurfarhana Ayuni Joha

    2015-01-01

    This paper introduces the concept design of a Robotic Irradiated Sample Handling Machine using a graphical software application, designed as a general, flexible and open platform for work on robotics. Webots has proven to be a useful tool in many fields of robotics, such as manipulator programming, mobile robot control (wheeled, sub-aquatic and walking robots), distance computation, sensor simulation, collision detection, motion planning and so on. Webots is used as the common interface for all the applications. Some practical cases and applications of this concept design are illustrated in the paper to present the possibilities of this simulation software. (author)

  19. Urine sampling techniques in symptomatic primary-care patients

    DEFF Research Database (Denmark)

    Holm, Anne; Aabenhus, Rune

    2016-01-01

    Background: Choice of urine sampling technique in urinary tract infection may impact diagnostic accuracy and thus lead to possible over- or undertreatment. Currently no evidence-based consensus exists regarding correct sampling technique of urine from women with symptoms of urinary tract infection ... a randomized or paired design to compare the result of urine culture obtained with two or more collection techniques in adult, female, non-pregnant patients with symptoms of urinary tract infection. We evaluated quality of the studies and compared accuracy based on dichotomized outcomes. Results: We included ... in infection rate between mid-stream-clean-catch, mid-stream-urine and random samples. Conclusions: At present, no evidence suggests that sampling technique affects the accuracy of the microbiological diagnosis in non-pregnant women with symptoms of urinary tract infection in primary care. However ...

  20. Slicing, sampling, and distance-dependent effects affect network measures in simulated cortical circuit structures

    Directory of Open Access Journals (Sweden)

    Daniel Carl Miner

    2014-11-01

    The neuroanatomical connectivity of cortical circuits is believed to follow certain rules, the exact origins of which are still poorly understood. In particular, numerous nonrandom features, such as common-neighbor clustering, overrepresentation of reciprocal connectivity, and overrepresentation of certain triadic graph motifs have been experimentally observed in cortical slice data. Some of these data, particularly regarding bidirectional connectivity, are seemingly contradictory, and the reasons for this are unclear. Here we present a simple static geometric network model with distance-dependent connectivity on a realistic scale that naturally gives rise to certain elements of these observed behaviors, and may provide plausible explanations for some of the conflicting findings. Specifically, investigation of the model shows that experimentally measured nonrandom effects, especially bidirectional connectivity, may depend sensitively on experimental parameters such as slice thickness and sampling area, suggesting potential explanations for the seemingly conflicting experimental results.

  1. [Clinical skills and outcomes of chair-side computer aided design and computer aided manufacture system].

    Science.gov (United States)

    Yu, Q

    2018-04-09

    Computer aided design and computer aided manufacture (CAD/CAM) technology is an oral digital system applied to clinical diagnosis and treatment. It overturns the traditional pattern and provides a solution for restoring a defective tooth quickly and efficiently. In this paper we mainly discuss the clinical skills of chair-side CAD/CAM systems, including tooth preparation, digital impression, the three-dimensional design of the prosthesis, numerical control machining, clinical bonding and so on, and also review the outcomes of several common kinds of materials.

  2. Design and characterization of poly(dimethylsiloxane)-based valves for interfacing continuous-flow sampling to microchip electrophoresis.

    Science.gov (United States)

    Li, Michelle W; Huynh, Bryan H; Hulvey, Matthew K; Lunte, Susan M; Martin, R Scott

    2006-02-15

    This work describes the fabrication and evaluation of a poly(dimethyl)siloxane (PDMS)-based device that enables the discrete injection of a sample plug from a continuous-flow stream into a microchannel for subsequent analysis by electrophoresis. Devices were fabricated by aligning valving and flow channel layers followed by plasma sealing the combined layers onto a glass plate that contained fittings for the introduction of liquid sample and nitrogen gas. The design incorporates a reduced-volume pneumatic valve that actuates (on the order of hundreds of milliseconds) to allow analyte from a continuously flowing sampling channel to be injected into a separation channel for electrophoresis. The injector design was optimized to include a pushback channel to flush away stagnant sample associated with the injector dead volume. The effect of the valve actuation time, the pushback voltage, and the sampling stream flow rate on the performance of the device was characterized. Using the optimized design and an injection frequency of 0.64 Hz showed that the injection process is reproducible (RSD of 1.77%, n = 15). Concentration change experiments using fluorescein as the analyte showed that the device could achieve a lag time as small as 14 s. Finally, to demonstrate the potential uses of this device, the microchip was coupled to a microdialysis probe to monitor a concentration change and sample a fluorescein dye mixture.

  3. Sampling design for the Study of Cardiovascular Risks in Adolescents (ERICA

    Directory of Open Access Journals (Sweden)

    Mauricio Teixeira Leite de Vasconcellos

    2015-05-01

    The Study of Cardiovascular Risk in Adolescents (ERICA) aims to estimate the prevalence of cardiovascular risk factors and metabolic syndrome in adolescents (12-17 years) enrolled in public and private schools of the 273 municipalities with over 100,000 inhabitants in Brazil. The study population was stratified into 32 geographical strata (27 capitals and five sets with other municipalities in each macro-region of the country) and a sample of 1,251 schools was selected with probability proportional to size. In each school three combinations of shift (morning and afternoon) and grade were selected, and within each of these combinations, one class was selected. All eligible students in the selected classes were included in the study. The design sampling weights were calculated as the product of the reciprocals of the inclusion probabilities in each sampling stage, and were later calibrated considering the projections of the numbers of adolescents enrolled in schools located in the geographical strata, by sex and age.
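
The weighting scheme described, a product of reciprocals of stage-wise inclusion probabilities followed by calibration to known enrolment totals, can be sketched as follows. The stage probabilities, strata labels, and totals below are invented for illustration:

```python
def design_weight(inclusion_probs):
    """Basic design weight: product of the reciprocals of the inclusion
    probabilities at each sampling stage (e.g. school, shift/grade, class)."""
    w = 1.0
    for p in inclusion_probs:
        w /= p
    return w

def calibrate(weights, strata, known_totals):
    """Rescale weights so that, within each calibration cell (e.g. a
    sex-age group within a geographical stratum), they sum to the known
    projected enrolment."""
    sums = {}
    for w, s in zip(weights, strata):
        sums[s] = sums.get(s, 0.0) + w
    return [w * known_totals[s] / sums[s] for w, s in zip(weights, strata)]

# A student sampled with probability 0.10 (school) * 0.50 (class) * 1.0
# (all students in the class) represents 20 students before calibration.
w = design_weight([0.10, 0.50, 1.0])
```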

  4. A design-based approximation to the Bayes Information Criterion in finite population sampling

    Directory of Open Access Journals (Sweden)

    Enrico Fabrizi

    2014-05-01

    In this article, various issues related to the implementation of the usual Bayesian Information Criterion (BIC) are critically examined in the context of modelling a finite population. A suitable design-based approximation to the BIC is proposed in order to avoid the derivation of the exact likelihood of the sample, which is often very complex in finite population sampling. The approximation is justified using a theoretical argument and a Monte Carlo simulation study.
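
For reference, the quantity being approximated is the standard BIC, −2·logL̂ + k·log(n); the article's design-based approximation itself is not reproduced here. A minimal implementation:

```python
import math

def bic(max_log_likelihood, n_params, n_obs):
    """Standard Bayesian Information Criterion: -2*logL + k*log(n).
    Lower values indicate a better parsimony/fit trade-off."""
    return -2.0 * max_log_likelihood + n_params * math.log(n_obs)

# A 3-parameter model with logL = -100 on 100 observations.
value = bic(-100.0, 3, 100)
```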

  5. Adaptation of G-TAG Software for Validating Touch-and-Go Comet Surface Sampling Design Methodology

    Science.gov (United States)

    Mandic, Milan; Acikmese, Behcet; Blackmore, Lars

    2011-01-01

    The G-TAG software tool was developed under the R&TD on Integrated Autonomous Guidance, Navigation, and Control for Comet Sample Return, and represents a novel, multi-body dynamics simulation software tool for studying TAG sampling. The G-TAG multi-body simulation tool provides a simulation environment in which a Touch-and-Go (TAG) sampling event can be extensively tested. TAG sampling requires the spacecraft to descend to the surface, contact the surface with a sampling collection device, and then to ascend to a safe altitude. The TAG event lasts only a few seconds but is mission-critical with potentially high risk. Consequently, there is a need for the TAG event to be well characterized and studied by simulation and analysis in order for the proposal teams to converge on a reliable spacecraft design. This adaptation of the G-TAG tool was developed to support the Comet Odyssey proposal effort, and is specifically focused to address comet sample return missions. In this application, the spacecraft descends to and samples from the surface of a comet. Performance of the spacecraft during TAG is assessed based on survivability and sample collection performance. For the adaptation of the G-TAG simulation tool to comet scenarios, models are developed that accurately describe the properties of the spacecraft, approach trajectories, and descent velocities, as well as the models of the external forces and torques acting on the spacecraft. The adapted models of the spacecraft, descent profiles, and external sampling forces/torques were more sophisticated and customized for comets than those available in the basic G-TAG simulation tool. Scenarios implemented include the study of variations in requirements, spacecraft design (size, locations, etc. of the spacecraft components), and the environment (surface properties, slope, disturbances, etc.). The simulations, along with their visual representations using G-View, contributed to the Comet Odyssey New Frontiers proposal

  6. RARtool: A MATLAB Software Package for Designing Response-Adaptive Randomized Clinical Trials with Time-to-Event Outcomes.

    Science.gov (United States)

    Ryeznik, Yevgen; Sverdlov, Oleksandr; Wong, Weng Kee

    2015-08-01

    Response-adaptive randomization designs are becoming increasingly popular in clinical trial practice. In this paper, we present RARtool, a user interface software developed in MATLAB for designing response-adaptive randomized comparative clinical trials with censored time-to-event outcomes. The RARtool software can compute different types of optimal treatment allocation designs, and it can simulate response-adaptive randomization procedures targeting selected optimal allocations. Through simulations, an investigator can assess design characteristics under a variety of experimental scenarios and select the best procedure for practical implementation. We illustrate the utility of our RARtool software by redesigning a survival trial from the literature.

  7. Variation in choice of study design: findings from the Epidemiology Design Decision Inventory and Evaluation (EDDIE) survey.

    Science.gov (United States)

    Stang, Paul E; Ryan, Patrick B; Overhage, J Marc; Schuemie, Martijn J; Hartzema, Abraham G; Welebob, Emily

    2013-10-01

    Researchers using observational data to understand drug effects must make a number of analytic design choices that suit the characteristics of the data and the subject of the study. Review of the published literature suggests that there is a lack of consistency even when addressing the same research question in the same database. To characterize the degree of similarity or difference in the method and analysis choices made by observational database research experts when presented with research study scenarios. On-line survey using research scenarios on drug-effect studies to capture method selection and analysis choices that follow a dependency branching based on response to key questions. Voluntary participants experienced in epidemiological study design solicited for participation through registration on the Observational Medical Outcomes Partnership website, membership in particular professional organizations, or links in relevant newsletters. Description (proportion) of respondents selecting particular methods and making specific analysis choices based on individual drug-outcome scenario pairs. The number of questions/decisions differed based on stem questions of study design, time-at-risk, outcome definition, and comparator. There is little consistency across scenarios, by drug or by outcome of interest, in the decisions made for design and analyses in scenarios using large healthcare databases. The most consistent choice was the cohort study design but variability in the other critical decisions was common. There is great variation among epidemiologists in the design and analytical choices that they make when implementing analyses in observational healthcare databases. These findings confirm that it will be important to generate empiric evidence to inform these decisions and to promote a better understanding of the impact of standardization on research implementation.

  8. Design, analysis, and interpretation of field quality-control data for water-sampling projects

    Science.gov (United States)

    Mueller, David K.; Schertz, Terry L.; Martin, Jeffrey D.; Sandstrom, Mark W.

    2015-01-01

    The process of obtaining and analyzing water samples from the environment includes a number of steps that can affect the reported result. The equipment used to collect and filter samples, the bottles used for specific subsamples, any added preservatives, sample storage in the field, and shipment to the laboratory have the potential to affect how accurately samples represent the environment from which they were collected. During the early 1990s, the U.S. Geological Survey implemented policies to include the routine collection of quality-control samples in order to evaluate these effects and to ensure that water-quality data were adequately representing environmental conditions. Since that time, the U.S. Geological Survey Office of Water Quality has provided training in how to design effective field quality-control sampling programs and how to evaluate the resultant quality-control data. This report documents that training material and provides a reference for methods used to analyze quality-control data.

  9. Quality of life outcome after subthalamic stimulation in Parkinson's disease depends on age.

    Science.gov (United States)

    Dafsari, Haidar S; Reker, Paul; Stalinski, Lisa; Silverdale, Monty; Rizos, Alexandra; Ashkan, Keyoumars; Barbe, Michael T; Fink, Gereon R; Evans, Julian; Steffen, Julia; Samuel, Michael; Dembek, Till A; Visser-Vandewalle, Veerle; Antonini, Angelo; Ray-Chaudhuri, K; Martinez-Martin, Pablo; Timmermann, Lars

    2018-01-01

    The purpose of this study was to investigate how quality of life outcome after bilateral subthalamic nucleus (STN) deep brain stimulation (DBS) in Parkinson's disease (PD) depends on age. In this prospective, open-label, multicenter study including 120 PD patients undergoing bilateral STN-DBS, we investigated the PDQuestionnaire-8 (PDQ-8), Unified PD Rating Scale-III, Scales for Outcomes in PD-motor examination, complications, activities of daily living, and levodopa equivalent daily dose preoperatively and at 5 months follow-up. Significant changes at follow-up were analyzed with Wilcoxon signed-rank test and Bonferroni correction for multiple comparisons. To explore the influence of age post hoc, the patients were classified into 3 age groups (≤59, 60-69, ≥70 years). Intragroup changes were analyzed with Wilcoxon signed-rank and intergroup differences with Kruskal-Wallis tests. The strength of clinical responses was evaluated using effect size. The PDQuestionnaire-8, Scales for Outcomes in PD-motor complications, activities of daily living, and levodopa equivalent daily dose significantly improved in the overall cohort and all age groups with no significant intergroup differences. However, PDQuestionnaire-8 effect sizes for age groups ≤59, 60 to 69, and ≥70 years, respectively, were strong, moderate, and small. Furthermore, PDQuestionnaire-8 domain analyses revealed that all domains except cognition and emotional well-being significantly improved in patients aged ≤59 years, whereas only communication, activities of daily living, and stigma improved in patients aged 60-69 years, and activities of daily living and stigma in patients aged ≥70 years. Although quality of life, motor complications, and activities of daily living significantly improved in all age groups after bilateral STN-DBS, the beneficial effect on overall quality of life was more pronounced and affected a wider range of quality of life domains in younger patients. © 2017 International

  10. Design and building of a homemade sample changer for automation of the irradiation in neutron activation analysis technique

    International Nuclear Information System (INIS)

    Gago, Javier; Hernandez, Yuri; Baltuano, Oscar; Bedregal, Patricia; Lopez, Yon; Urquizo, Rafael

    2014-01-01

    Because the RP-10 research reactor operates during weekends, it was necessary to design and build a sample changer for irradiation as part of the automation of the neutron activation analysis technique. The device is formed by an aluminum turntable disk which can accommodate 19 polyethylene capsules containing samples to be sent, via the pneumatic transfer system, from the laboratory to the irradiation position. The system is operated by a control switchboard to send and return capsules within a variable preset time and by two different routes, allowing the determination of short-, medium- and long-lived radionuclides. Another mechanism, called an 'exchange valve', was designed for changing travel paths (pipelines), allowing the irradiated samples to be stored for a longer time in the reactor hall. The system design has allowed complete automation of this technique, enabling the irradiation of samples without the presence of an analyst. The design, construction and operation of the device are described and presented in this article. (authors).

  11. Design of a sample acquisition system for the Mars exobiological penetrator

    Science.gov (United States)

    Thomson, Ron; Gwynne, Owen

    1988-01-01

    The Mars Exobiological Penetrator will be embedded into several locations on the Martian surface. It contains various scientific instruments, such as an Alpha-Particle Instrument (API), Differential Scanning Calorimeter (DSC), Evolved Gas Analyzer (EGA) and accelerometers. A sample is required for analysis in the API and DSC. To avoid impact-contaminated material, this sample must be taken from soil greater than 2 cm away from the penetrator shell. This study examines the design of a dedicated sampling system including deployment, suspension, fore/after body coupling, sample gathering and placement. To prevent subsurface material from entering the penetrator sampling compartment during impact, a plug is placed in the exit hole of the wall. A U-lever device is used to hold this plug in the penetrator wall. The U-lever rotates upon initial motion of the core-grinder mechanism (CGM), releasing the plug. Research points to a combination of coring and grinding as a plausible solution to the problem of dry drilling. The CGM, driven by two compressed springs, will be deployed along a tracking system. A slowly varying load (i.e., springs) is favored over a fixed-displacement motion because of its adaptability to different material hardnesses. However, to accommodate sampling in low-density soil, two dashpots set a maximum transverse velocity. In addition, minimal power use is achieved by unidirectional motion of the CGM. The sample will be transported to the scientific instruments by means of a sample placement tray that is driven by a compressed spring to avoid unnecessary power usage. This paper also explores possible modifications for size, weight, and time as well as possible future studies.

  12. Optimization of sampling pattern and the design of Fourier ptychographic illuminator.

    Science.gov (United States)

    Guo, Kaikai; Dong, Siyuan; Nanda, Pariksheet; Zheng, Guoan

    2015-03-09

    Fourier ptychography (FP) is a recently developed imaging approach that facilitates high-resolution imaging beyond the cutoff frequency of the employed optics. In the original FP approach, a periodic LED array is used for sample illumination, and therefore, the scanning pattern is a uniform grid in the Fourier space. Such a uniform sampling scheme leads to 3 major problems for FP, namely: 1) it requires a large number of raw images, 2) it introduces raster grid artifacts in the reconstruction process, and 3) it requires a high-dynamic-range detector. Here, we investigate scanning sequences and sampling patterns to optimize the FP approach. For most biological samples, signal energy is concentrated in the low-frequency region, and as such, we can perform non-uniform Fourier sampling in FP by considering the signal structure. In contrast, conventional ptychography performs uniform sampling over the entire real space. To implement the non-uniform Fourier sampling scheme in FP, we have designed and built an illuminator using LEDs mounted on a 3D-printed plastic case. The advantages of this illuminator are threefold in that: 1) it reduces the number of image acquisitions by at least 50% (68 raw images versus 137 in the original FP setup), 2) it departs from the translational symmetry of sampling to solve the raster grid artifact problem, and 3) it reduces the dynamic range of the captured images 6-fold. The results reported in this paper significantly shortened acquisition time and improved the quality of FP reconstructions. It may provide new insights for developing Fourier ptychographic imaging platforms and find important applications in digital pathology.
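
One generic way to realize such a low-frequency-weighted pattern is to place concentric LED rings whose radii grow faster than linearly, so that sources cluster near the optical axis where most of the signal energy lies. The sketch below is an illustrative construction under that assumption, not the authors' actual 3D-printed layout:

```python
import math

def nonuniform_led_pattern(n_rings=5, base_leds=6, max_radius=1.0, density_power=2.0):
    """Hypothetical non-uniform illumination pattern: ring radii follow a
    power law (i/n)**density_power, concentrating samples near the
    low-frequency (central) region for density_power > 1."""
    positions = [(0.0, 0.0)]  # central brightfield LED
    for i in range(1, n_rings + 1):
        r = max_radius * (i / n_rings) ** density_power
        n_leds = base_leds * i  # more LEDs on outer (longer) rings
        for k in range(n_leds):
            theta = 2 * math.pi * k / n_leds
            positions.append((r * math.cos(theta), r * math.sin(theta)))
    return positions

pattern = nonuniform_led_pattern()
```

With the default parameters the innermost ring sits at radius 0.04 instead of the 0.2 a uniform spacing would give, illustrating the bias toward low spatial frequencies.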

  13. How to include frequency dependent complex permeability Into SPICE models to improve EMI filters design?

    Science.gov (United States)

    Sixdenier, Fabien; Yade, Ousseynou; Martin, Christian; Bréard, Arnaud; Vollaire, Christian

    2018-05-01

    Electromagnetic interference (EMI) filter design is a rather difficult task in which engineers have to choose adequate magnetic materials, design the magnetic circuit and choose the size and number of turns. The final design must achieve the attenuation requirements (constraints) and has to be as compact as possible (goal). Alternating current (AC) analysis is a powerful tool to predict the global impedance or attenuation of any filter. However, AC analyses are generally performed without taking into account the frequency-dependent complex permeability behaviour of soft magnetic materials. We therefore developed two frequency-dependent complex permeability models able to be included in SPICE models. After an identification process, the performance of each model is compared to measurements made on a realistic EMI filter prototype in common mode (CM) and differential mode (DM) to show the benefit of the approach. Simulation results are in good agreement with the measured ones, especially in the middle frequency range.
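
As a minimal illustration of the kind of behaviour such models capture (not the authors' two identified models), a one-pole Debye-type complex permeability can be evaluated directly; in a SPICE netlist this frequency dependence would typically be mapped onto an equivalent R-L ladder network:

```python
def debye_permeability(f_hz, mu_dc=2000.0, f_relax=1.0e6):
    """One-pole complex permeability mu(f) = mu_dc / (1 + j*f/f_relax):
    the real (storage) part rolls off with frequency while the loss
    (negative imaginary) part peaks around the relaxation frequency.
    mu_dc and f_relax are illustrative values, not fitted material data."""
    return mu_dc / (1.0 + 1j * f_hz / f_relax)

mu = debye_permeability(1.0e6)  # evaluated at f = f_relax
```

At the relaxation frequency the storage and loss parts are equal in magnitude (here 1000 and −1000), the signature roll-off that a plain constant-inductance AC analysis misses.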

  14. Baseline Design Compliance Matrix for the Type 4 In Situ Vapor Samplers and Supernate and Sludge and Soft Saltcake Grab Sampling

    International Nuclear Information System (INIS)

    BOGER, R.M.

    2000-01-01

    The DOE has identified a need to sample vapor space, exhaust ducts, supernate, sludge, and soft saltcake in waste tanks that store radioactive waste. This document provides the Design Compliance Matrix (DCM) for the Type 4 In-Situ Vapor Sampling (ISVS) system and the Grab Sampling System that are used for completing this type of sampling function. The DCM identifies the design requirements and the source of the requirements for the Type 4 ISVS system and the Grab Sampling system. The DCM is a single-source compilation of design requirements for sampling and sampling support equipment and supports the configuration management of these systems.

  15. Reading and Comprehension Levels in a Sample of Urban, Low-Income Persons

    Science.gov (United States)

    Delgado, Cheryl; Weitzel, Marilyn

    2013-01-01

    Objective: Because health literacy is related to healthcare outcomes, this study looked at reading and comprehension levels in a sample of urban, low-income persons. Design: This was a descriptive exploration of reading comprehension levels, controlled for medical problems that could impact on vision and therefore ability to read. Setting: Ninety…

  16. Attention Deficit Hyperactivity Disorder Symptoms, Comorbidities, Substance Use, and Social Outcomes among Men and Women in a Canadian Sample

    Directory of Open Access Journals (Sweden)

    Evelyn Vingilis

    2015-01-01

    Background. Attention deficit hyperactivity disorder (ADHD) is a neurodevelopmental disorder that can persist into adolescence and adulthood. Aim. To examine the prevalence of ADHD symptoms and correlates in a representative sample of adults 18 years and older living in Ontario, Canada. Method. We used the Centre for Addiction and Mental Health Monitor, an ongoing cross-sectional telephone survey, to examine the relationships between ADHD-positive symptoms and comorbidities, substance use, medication use, social outcomes, and sociodemographics. Results. Of 4014 residents sampled in 2011-2012, 3.30% (2.75%–3.85%) screened positively for ADHD symptoms (women = 3.6%; men = 3.0%). For men, distress, antisocial symptoms, cocaine use, antianxiety medication use, antidepressant medication use, and criminal offence arrest were associated with a positive ADHD screen. For women, distress, cocaine use, antianxiety medication use, antidepressant medication use, pain medication use, and motor vehicle collision in the past year were associated with a positive ADHD screen. Conclusions. ADHD symptoms are associated with adverse medical and social outcomes that are in some cases gender-specific.

  17. MUP, CEC-DES, STRADE. Codes for uncertainty propagation, experimental design and stratified random sampling techniques

    International Nuclear Information System (INIS)

    Amendola, A.; Astolfi, M.; Lisanti, B.

    1983-01-01

    The report describes how to use the codes: MUP (Monte Carlo Uncertainty Propagation) for uncertainty analysis by Monte Carlo simulation, including correlation analysis, extreme value identification and study of selected ranges of the variable space; CEC-DES (Central Composite Design) for building experimental matrices according to the requirements of Central Composite and Factorial Experimental Designs; and STRADE (Stratified Random Design) for experimental designs based on Latin Hypercube Sampling techniques. Application fields of the codes are probabilistic risk assessment, experimental design, sensitivity analysis and system identification problems.
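
Of the three codes, the stratified-random idea behind STRADE is easy to sketch: split each input's range into n equal strata, draw one point from every stratum, and pair strata across dimensions at random. A generic Latin Hypercube construction follows (an illustration of the technique, not the STRADE implementation):

```python
import random

def latin_hypercube(n_samples, n_dims, seed=0):
    """Latin Hypercube design on [0, 1)^d: for each dimension, take one
    uniform draw from each of n_samples equal strata, then shuffle the
    strata independently per dimension before pairing them into points."""
    rng = random.Random(seed)
    columns = []
    for _ in range(n_dims):
        column = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(column)
        columns.append(column)
    return [tuple(col[i] for col in columns) for i in range(n_samples)]

points = latin_hypercube(10, 2)
```

Unlike plain Monte Carlo, every marginal decile is guaranteed to be sampled exactly once, which is what makes the design space-filling with few runs.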

  18. [Design of standard voice sample text for subjective auditory perceptual evaluation of voice disorders].

    Science.gov (United States)

    Li, Jin-rang; Sun, Yan-yan; Xu, Wen

    2010-09-01

    To design a speech voice sample text containing all phonemes in Mandarin for subjective auditory perceptual evaluation of voice disorders. The principles for the design of the text were: the short text should include the 21 initials and 39 finals, so as to cover all the phonemes in Mandarin, and the short text should be meaningful. A short text was produced. It had 155 Chinese words and included 21 initials and 38 finals (the final ê was not included because it is rarely used in Mandarin). The text also covered 17 light tones and one "Erhua". The constituent ratios of the initials and finals presented in this short text were statistically similar to those in Mandarin according to the method of similarity of the sample and population (r = 0.742, P < 0.05), while the tones presented in this short text were statistically not similar to those in Mandarin (r = 0.731, P > 0.05). A speech voice sample text with all phonemes in Mandarin was produced. The constituent ratios of the initials and finals presented in this short text are similar to those in Mandarin. Its value for subjective auditory perceptual evaluation of voice disorders needs further study.

  19. Dependence of the time-constant of a fuel rod on different design and operational parameters

    International Nuclear Information System (INIS)

    Elenkov, D.; Lassmann, K.; Schubert, A.; Laar, J. van de

    2001-01-01

    The temperature response during a reactor shutdown has been measured for many years in the OECD-Halden Project. It has been shown that the complicated shutdown processes can be characterized by a time constant τ which depends on different fuel design and operational parameters, such as fuel geometry, gap size, fill gas pressure and composition, burnup and linear heat rate. In the paper the concept of a time constant is analyzed and the dependence of the time constant on various parameters is investigated analytically. Measured time constants for different designs and conditions are compared with those derived from calculations of the TRANSURANUS code. Employing standard models results in a systematic underprediction of the time constant, i.e. the heat transfer during shutdown is overestimated. (author)
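
To first order, the shutdown response the abstract characterizes is T(t) = T∞ + (T0 − T∞)·exp(−t/τ), so τ can be recovered from measured cooldown data by a log-linear least-squares fit. A generic sketch with synthetic data (not the TRANSURANUS models, whose assumed temperatures below are illustrative):

```python
import math

def fit_time_constant(times_s, temps, t_inf):
    """Least-squares slope of ln(T(t) - T_inf) versus t; for first-order
    cooling this line has slope -1/tau, so tau = -1/slope."""
    ys = [math.log(t_k - t_inf) for t_k in temps]
    n = len(times_s)
    mx = sum(times_s) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(times_s, ys))
    den = sum((x - mx) ** 2 for x in times_s)
    return -den / num  # tau = -1/slope with slope = num/den

# Synthetic cooldown with tau = 8 s: the fit should recover it.
data_t = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]
data_temp = [300.0 + 700.0 * math.exp(-t / 8.0) for t in data_t]
tau = fit_time_constant(data_t, data_temp, 300.0)
```

The same fit applied to measured shutdown traces is one way to extract the τ values that are then compared across gap size, fill gas, burnup, and linear heat rate.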

  20. Gas and liquid sampling for closed canisters in K-West basins - functional design criteria

    International Nuclear Information System (INIS)

    Pitkoff, C.C.

    1994-01-01

    The purpose of this document is to provide functions and requirements for the design and fabrication of equipment for sampling closed canisters in the K-West basin. The samples will be used to help determine the state of the fuel elements in closed canisters. The characterization information obtained will support evaluation and development of processes required for safe storage and disposition of Spent Nuclear Fuel (SNF) materials

  1. Prosperity of Thought Versus Retreat of Application: A Comprehensive Approach in Urban Design Teaching

    Directory of Open Access Journals (Sweden)

    Abeer Mohamed Elshater

    2014-11-01

    This study focuses on a question of practical relevance in urban design studios: when will specialists discuss their educational visions for urban design in teaching plans? Although Egyptian architecture and urban environments currently resemble those of postwar European cities, numerous architectural schools teach the new urban design paradigm while ignoring the decline of cities. To reverse this trend, the author proposes that architectural educational institutions in Egypt alter their learning programs. This study therefore aims to create a new urban design module based on outcome-based learning, consistent with the present Egyptian city scene and grounded in intended learning outcomes.

  2. Caregiver Burden in Alcohol Dependence Syndrome

    Directory of Open Access Journals (Sweden)

    Ramanujam Vaishnavi

    2017-01-01

    Background. Alcoholism is a major threat to the individual as well as to society, and the maximum burden of the illness is borne by the family. Aim. The study aimed to assess the pattern of burden on the caregivers of alcohol-dependent patients and the relationship between the severity of dependence and the burden on caregivers. Settings and Design. Cross-sectional descriptive study conducted in the Department of Psychiatry, Sri Ramachandra Medical College and Research Institute. Materials and Methods. A cross-sectional assessment was done in 200 patients with alcohol dependence and their caregivers. The severity of dependence and the pattern of burden on caregivers were assessed. Statistical Analysis. The data collected were analyzed using SPSS version 20. Results. The study demonstrates that caregivers of alcohol-dependent patients reported significant objective and subjective burden. Furthermore, the severity of alcohol dependence and the domains of burden such as financial burden, disruption of family interaction, and disruption of family routine activities were positively correlated, with a high level of significance. Conclusion. All the caregivers experienced a significant amount of burden, which has to be addressed for better treatment outcomes for the patients.

  3. Design Dependent Cutoff Frequency of Nanotransistors Near the Ultimate Performance Limit

    Science.gov (United States)

    Kordrostami, Zoheir; Sheikhi, M. Hossein; Zarifkar, Abbas

    2012-12-01

    We have studied the effect of different structural designs of double gate MOSFETs (DG-MOSFETs) and carbon nanotube field effect transistors (CNTFETs) on the cutoff frequency (fT). The effects of metallic contacts with Schottky barriers, gate work function, dual material gate (DMG), halo-doped channel and lightly doped drain and source (LDDS) architectures on the fT have been investigated for DG-MOSFETs and CNTFETs, and the design-dependent fT for both types of transistors has been studied for the first time. The simulations are based on Schrödinger-Poisson solvers developed for each nanotransistor separately. The ballistic limit has been studied as the ultimate performance limit of the DG-MOSFETs and CNTFETs. The results of this paper show, for the first time, how some design choices used to mitigate short-channel effects or shape current-voltage characteristics affect the fT. The results revealed that the cutoff frequencies of both types of transistors exhibit the same behavior as design parameters change. We have shown that the Schottky barriers, parasitic capacitances and halo doping reduce the fT, and we propose the DMG and LDDS architectures as ways to increase the fT for DG-MOSFETs and CNTFETs.

  4. Information and Communication Technology: Design, Delivery, and Outcomes from a Nursing Informatics Boot Camp

    Science.gov (United States)

    Kleib, Manal; Simpson, Nicole; Rhodes, Beverly

    2016-05-31

    Information and communication technology (ICT) is integral to today's healthcare as a critical piece of support to both track and improve patient and organizational outcomes. Facilitating nurses' informatics competency development through continuing education is paramount to enhancing their readiness to practice safely and accurately in technologically enabled work environments. In this article, we briefly describe progress in nursing informatics (NI) and share a project exemplar describing our experience in the design, implementation, and evaluation of an NI educational event, a one-day boot camp format used to provide foundational knowledge in NI, targeted primarily at frontline nurses in Alberta, Canada. We also discuss the project outcomes, including lessons learned and future implications. Overall, the boot camp was successful in raising nurses' awareness of the importance of informatics in nursing practice.

  5. Towards Representative Metallurgical Sampling and Gold Recovery Testwork Programmes

    Directory of Open Access Journals (Sweden)

    Simon C. Dominy

    2018-05-01

    When developing a process flowsheet, the risks in achieving positive financial outcomes are minimised by ensuring representative metallurgical samples and high-quality testwork. The quality and type of samples used are as important as the testwork itself. The key characteristic required of any set of samples is that they represent a given domain and quantify its variability. There are those who think that stating a sample(s) is representative makes it representative, without justification. There is a need to consider both (1) in-situ and (2) testwork sub-sample representativity. Early ore/waste characterisation and domain definition are required, so that sampling and testwork protocols can be designed to suit the style of mineralisation in question. The Theory of Sampling (TOS) provides an insight into the causes and magnitude of errors that may occur during the sampling of particulate materials (e.g., broken rock) and is wholly applicable to metallurgical sampling. Quality assurance/quality control (QAQC) is critical throughout all programmes. Metallurgical sampling and testwork should be fully integrated into geometallurgical studies. Traditional metallurgical testwork is critical for plant design and is an inherent part of geometallurgy. In a geometallurgical study, multiple spatially distributed small-scale tests are used as proxies for process parameters. These will be validated against traditional testwork results. This paper focusses on sampling and testwork for gold recovery determination. It aims to provide the reader with the background to move towards the design, implementation and reporting of representative and fit-for-purpose sampling and testwork programmes. While the paper does not intend to provide a definitive commentary, it critically assesses the hard-rock sampling methods used and their optimal collection and preparation. The need for representative sampling and quality testwork to avoid financial and intangible losses is emphasised.

  6. Outcome Measurement Using Naturalistic Language Samples: A Feasibility Pilot Study Using Language Transcription Software and Speech and Language Therapy Assistants

    Science.gov (United States)

    Overton, Sarah; Wren, Yvonne

    2014-01-01

    The ultimate aim of intervention for children with language impairment is an improvement in their functional language skills. Baseline and outcome measurement of this is often problematic however and practitioners commonly resort to using formal assessments that may not adequately reflect the child's competence. Language sampling,…

  7. ANALYSING THE USE OF FOUR CREATIVITY TOOLS IN A CONSTRAINED DESIGN SITUATION

    DEFF Research Database (Denmark)

    Snider, C.M.; Dekoninck, E.A.; Yue, H.

    2011-01-01

    This paper investigates creativity tools and their use within highly constrained design tasks. Previously, a coding scheme was developed to classify design changes as 'Creative Modes of Change'. The coding scheme is used to compare the outcomes from the use of four creativity tools (supported design) against unsupported design within a constrained task. The tools showed design space expansion, developing additional concepts beyond those from the unsupported stage. All four tools stimulated 'Creative Modes of Change', although the type varied depending on their operation. 'Assumption Smashing' and the 'Contradiction Matrix' usually stimulate extra function; 'Analogies' and 'Trends of Evolution' improve design performance. The former two usually produce 'Creative Modes of Change' as opposed to routine ones. The results show some links between the designer's driving force, mode of change and the design outcome. 'New…

  8. Nurse working conditions and patient safety outcomes.

    Science.gov (United States)

    Stone, Patricia W; Mooney-Kane, Cathy; Larson, Elaine L; Horan, Teresa; Glance, Laurent G; Zwanziger, Jack; Dick, Andrew W

    2007-06-01

    System approaches, such as improving working conditions, have been advocated to improve patient safety. However, the independent effect of many working condition variables on patient outcomes is unknown. To examine effects of a comprehensive set of working conditions on elderly patient safety outcomes in intensive care units. Observational study, with patient outcome data collected using the National Nosocomial Infection Surveillance system protocols and Medicare files. Several measures of health status and fixed setting characteristics were used to capture distinct dimensions of patient severity of illness and risk for disease. Working condition variables included organizational climate measured by nurse survey; objective measures of staffing, overtime, and wages (derived from payroll data); and hospital profitability and magnet accreditation. The sample comprised 15,846 patients in 51 adult intensive care units in 31 hospitals, depending on the outcome analyzed; 1095 nurses were surveyed. Central line associated bloodstream infections (CLBSI), ventilator-associated pneumonia, catheter-associated urinary tract infections, 30-day mortality, and decubiti. Units with higher staffing had lower incidence of CLBSI, ventilator-associated pneumonia, 30-day mortality, and decubiti (P < 0.05). Overall, working conditions were associated with all outcomes measured. Improving working conditions will most likely promote patient safety. Future researchers and policymakers should consider a broad set of working condition variables.

  9. Dimensions of dependence and their influence on the outcome of cognitive behaviour therapy for health anxiety: randomized controlled trial.

    Science.gov (United States)

    Tyrer, Peter; Wang, Duolao; Tyrer, Helen; Crawford, Mike; Cooper, Sylvia

    2016-05-01

    The personality trait of dependence is common in health-seeking behaviour. We therefore examined its impact in a large randomized controlled trial of psychological treatment for health anxiety. The aims were to test whether dependent personality traits were positive or negative in determining the outcome of an adapted form of cognitive behaviour therapy for health anxiety (CBT-HA) over the course of 5 years, and whether dependent personality dysfunction could be viewed dimensionally in a similar way to the new ICD-11 diagnostic system for general personality disorder. Dependent personality dysfunction was assessed using a self-rated questionnaire, the Dependent Personality Questionnaire, at baseline in a randomized controlled trial of 444 patients from medical clinics with pathological health anxiety treated with a modified form of CBT-HA or standard treatment in the medical clinics, with assessment on five occasions over 5 years. Dependent personality dysfunction was grouped into four severity levels. Patients with mild and moderate dependent personality disorder treated with CBT-HA showed the greatest reduction in health anxiety compared with standard care, and those with no dependent dysfunction showed the least benefit. Patients with higher dependent traits received significantly more treatment sessions (8.6) than those with low trait levels (5.4), and outcomes after CBT-HA were better in those with dependent personality. The reasons for this may be related to better adherence to psychological treatment and greater negative effects of frequent reassurance and excessive consultation in those treated in standard care. Copyright © 2016 John Wiley & Sons, Ltd.

  10. Outcomes of the Smoker's Health Project: A Pragmatic Comparative Effectiveness Trial of Tobacco-Dependence Interventions Based on Self-Determination Theory

    Science.gov (United States)

    Williams, Geoffrey C.; Niemiec, Christopher P.; Patrick, Heather; Ryan, Richard M.; Deci, Edward L.

    2016-01-01

    A pragmatic comparative effectiveness trial examined whether extending the duration of a cost-effective, intensive tobacco-dependence intervention designed to support autonomy will facilitate long-term tobacco abstinence. Participants were randomly assigned to one of three tobacco-dependence interventions based on self-determination theory,…

  11. Sample Size Calculation for Controlling False Discovery Proportion

    Directory of Open Access Journals (Sweden)

    Shulian Shang

    2012-01-01

    The false discovery proportion (FDP), the proportion of incorrect rejections among all rejections, is a direct measure of the abundance of false positive findings in multiple testing. Many methods have been proposed to control the FDP, but they are too conservative to be useful for power analysis. Study designs controlling the mean of the FDP, which is the false discovery rate, have been commonly used. However, there has been little attempt to design studies with direct FDP control that achieve a given level of efficiency. We provide a sample size calculation method that uses the variance formula of the FDP under weak-dependence assumptions to achieve the desired overall power. The relationship between design parameters and sample size is explored. The adequacy of the procedure is assessed by simulation. We illustrate the method using estimated correlations from a prostate cancer dataset.
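The design idea above can be explored numerically. The sketch below is not the authors' variance formula; it simply simulates the FDP distribution of the standard Benjamini-Hochberg step-up procedure under an equicorrelated ("weak dependence") Gaussian model, with all parameters hypothetical:

```python
import numpy as np
from scipy.stats import norm

def fdp_distribution(n_tests=200, n_true=20, effect=3.0, rho=0.1,
                     q=0.05, n_sims=500, seed=1):
    """Simulate the false discovery proportion (FDP) of the
    Benjamini-Hochberg step-up procedure under an equicorrelated
    Gaussian model (one shared factor plus idiosyncratic noise)."""
    rng = np.random.default_rng(seed)
    fdps = np.empty(n_sims)
    for s in range(n_sims):
        # equicorrelated test statistics with correlation rho
        z = (np.sqrt(rho) * rng.standard_normal()
             + np.sqrt(1 - rho) * rng.standard_normal(n_tests))
        z[:n_true] += effect                      # true signals
        p = norm.sf(z)                            # one-sided p-values
        order = np.argsort(p)
        below = p[order] <= q * np.arange(1, n_tests + 1) / n_tests
        k = below.nonzero()[0].max() + 1 if below.any() else 0
        false_rej = np.sum(order[:k] >= n_true)   # rejected true nulls
        fdps[s] = false_rej / max(k, 1)
    return fdps

fdps = fdp_distribution()
print(round(fdps.mean(), 3), round(np.quantile(fdps, 0.9), 3))
```

Repeating this over a grid of sample sizes (i.e., effect values) gives a crude, simulation-based counterpart to the analytic design calculation the abstract describes.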

  12. Low ficolin-3 levels in early follow-up serum samples are associated with the severity and unfavorable outcome of acute ischemic stroke

    DEFF Research Database (Denmark)

    Füst, George; Munthe-Fog, Lea; Illes, Zsolt

    2011-01-01

    … demonstrated the significance of MBL in ischemic stroke, but the role of ficolins has not been examined. METHODS: Sera were obtained within 12 hours after the onset of ischemic stroke (admission samples) and 3-4 days later (follow-up samples) from 65 patients. The control group comprised 100 healthy individuals. … In the follow-up samples an inverse correlation was observed between ficolin-3 levels and the concentration of S100β, an indicator of the size of the cerebral infarct. Patients with low ficolin-3 levels and high CRP levels in the follow-up samples had a significantly worse outcome (adjusted ORs 5.6 and 3.9, respectively) …

  13. Assertive Community Treatment for alcohol dependence (ACTAD): study protocol for a randomised controlled trial

    Directory of Open Access Journals (Sweden)

    Gilburt Helen

    2012-02-01

    Background. Alcohol dependence is a significant and costly problem in the UK, yet only 6% of affected people receive treatment each year. Current service provision, based on the treatment of acute episodes of illness and emphasising personal choice and motivation, results in only a small proportion of these patients engaging with alcohol treatment. There is a need for interventions targeted at the population of alcohol-dependent patients who are hard to engage in conventional treatment. Assertive Community Treatment (ACT), a model of care based on assertive outreach, has been used for treating patients with severe mental illnesses and presents a promising avenue for engaging patients with primary alcohol dependence. So far there has been little research on this. Methods/Design. In this single-blind exploratory randomised controlled trial, a total of 90 alcohol-dependent participants will be recruited from community addiction services. After completing a baseline assessment, they will be assigned to one of two conditions: (1) ACT plus care as usual, or (2) care as usual. Those allocated to ACT plus care as usual will receive the same treatment that is routinely provided by services, plus a trained key worker who will provide ACT. ACT comprises intensive and assertive contact at least once a week, with over 50% of contacts in the participant's home or local community, and comprehensive case management across social and health care, for a period of one year. All participants will be followed up at 6 and 12 months to assess outcome post-randomisation. The primary outcome measures will be alcohol consumption: mean drinks per drinking day and percentage of days abstinent, measured by the Time Line Follow Back interview. Secondary outcome measures will include severity of alcohol dependence, alcohol-related problems, motivation to change, social network involvement, quality of life, therapeutic relationship and service use. Other outcome variables are treatment…

  14. Design of sample analysis device for iodine adsorption efficiency test in NPPs

    International Nuclear Information System (INIS)

    Ji Jinnan

    2015-01-01

    In nuclear power plants, the iodine adsorption efficiency test is used to check the iodine adsorption efficiency of the iodine adsorber. The iodine adsorption efficiency can be calculated from analysis of the test sample, thereby determining whether the performance of the adsorber meets the requirements on equipment operation and emission. Considering the test process and actual demand, this paper presents the design of a special device for analysing this kind of test sample. Application shows that the device is convenient to operate, highly reliable, and accurate in calculation, improving experimental efficiency and reducing experimental risk. (author)
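The abstract does not spell out the efficiency calculation. A common definition, assumed here purely for illustration, compares tracer activity measured upstream and downstream of the adsorber bed:

```python
def adsorption_efficiency(upstream_activity, downstream_activity):
    """Adsorber efficiency (%) from tracer activities measured on samples
    taken upstream and downstream of the iodine adsorber.
    Assumed definition: the fraction of iodine retained by the bed."""
    if upstream_activity <= 0:
        raise ValueError("upstream activity must be positive")
    return 100.0 * (1.0 - downstream_activity / upstream_activity)

# hypothetical activities in counts per second
print(round(adsorption_efficiency(1.0e4, 25.0), 2))  # → 99.75
```

In practice the acceptance criterion (e.g., a minimum efficiency for continued operation) would come from the plant's technical specifications, not from this sketch.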

  15. Double pulse laser induced breakdown spectroscopy: Experimental study of lead emission intensity dependence on the wavelengths and sample matrix

    Energy Technology Data Exchange (ETDEWEB)

    Piscitelli S, V; Martinez L, M A; Fernandez C, A J [Laboratorio de Espectroscopia Laser, Escuela de Quimica, Facultad de Ciencias, Universidad Central de Venezuela, Caracas, DC 1020 (Venezuela, Bolivarian Republic of); Gonzalez, J J; Mao, X L [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Russo, R.E. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)], E-mail: RERusso@lbl.gov

    2009-02-15

    Lead (Pb) emission intensity (atomic line 405.78 nm) dependence on the sample matrix (metal alloy) was studied by means of collinear double pulse (DP) laser induced breakdown spectroscopy (LIBS). The measurement of the emission intensity produced by three different wavelength combinations (I:532 nm-II:1064 nm, I:532 nm-II:532 nm, and I:532 nm-II:355 nm) from three series of standard reference materials showed that the lead atomic line 405.78 nm emission intensity was dependent on the sample matrix for all combinations of wavelengths; however, reduced dependency was found for the wavelength combination I:532 nm-II:355 nm. Two series of standard reference materials from the National Institute of Standards and Technology (NIST) and one series from the British Chemical Standards (BCS) were used for these experiments. Calibration curves for lead ablated from NIST 626-630 (Zn95Al4Cu1) provided higher sensitivity (slope) than the calibration curves produced from NIST 1737-1741 (Zn99.5Al0.5) and from the series BCS 551-556 (Cu87Sn11). Similar trends between lead emission intensity (calibration curve sensitivities) and reported variations in plasma temperatures caused by the differing ionization potentials of the major and minor elements in these samples were established.
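The "sensitivity" compared above is simply the least-squares slope of the calibration curve (emission intensity versus certified concentration). A minimal sketch with hypothetical Pb data — the numbers below are illustrative, not values from the NIST/BCS series:

```python
import numpy as np

def calibration_sensitivity(concentration, intensity):
    """Least-squares slope and intercept of a LIBS calibration curve
    (line emission intensity vs. certified analyte concentration).
    The slope is the calibration sensitivity."""
    slope, intercept = np.polyfit(concentration, intensity, 1)
    return slope, intercept

# hypothetical Pb concentrations (wt%) and 405.78 nm line intensities (counts)
conc = np.array([0.1, 0.5, 1.0, 2.0, 5.0])
counts = np.array([210.0, 1010.0, 2040.0, 3980.0, 10050.0])
slope, _ = calibration_sensitivity(conc, counts)
print(slope)
```

Comparing such slopes across matrices (as the study does across Zn- and Cu-based alloys) is what reveals the matrix effect.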

  16. Infliximab dependency in children with Crohn's disease

    DEFF Research Database (Denmark)

    Duricova, D; Pedersen, N; Lenicek, M

    2009-01-01

    BACKGROUND: Recently, infliximab dependency has been described. AIM: To assess the frequency of infliximab dependency (ID) in 82 consecutive children with Crohn's disease treated with infliximab in 2000-2006 and to describe clinical and genetic predictors of long-term infliximab response. METHODS: A phenotype model of infliximab dependency was used to assess treatment response: 'immediate outcome' (30 days after infliximab start)--complete/partial/no response; 'long-term outcome': (i) prolonged response: maintenance of complete/partial response; (ii) infliximab dependency: relapse …/partial response. In long-term outcome, 22% maintained prolonged response, 12% had no response, while 66% became infliximab dependent. Perianal disease and no previous surgery were associated with infliximab dependency (OR 5.34, 95% CI: 1.24-22.55; OR 6.7, 95% CI: 1.67-26.61). No association was found with the studied …

  17. Evaluating outcome-correlated recruitment and geographic recruitment bias in a respondent-driven sample of people who inject drugs in Tijuana, Mexico.

    Science.gov (United States)

    Rudolph, Abby E; Gaines, Tommi L; Lozada, Remedios; Vera, Alicia; Brouwer, Kimberly C

    2014-12-01

    Respondent-driven sampling's (RDS) widespread use and reliance on untested assumptions suggests a need for new exploratory/diagnostic tests. We assessed geographic recruitment bias and outcome-correlated recruitment among 1,048 RDS-recruited people who inject drugs (Tijuana, Mexico). Surveys gathered demographics, drug/sex behaviors, activity locations, and recruiter-recruit pairs. Simulations assessed geographic and network clustering of active syphilis (RPR titers ≥1:8). Gender-specific predicted probabilities were estimated using logistic regression with GEE and robust standard errors. Active syphilis prevalence was 7 % (crude: men = 5.7 % and women = 16.6 %; RDS-adjusted: men = 6.7 % and women = 7.6 %). Syphilis clustered in the Zona Norte, a neighborhood known for drug and sex markets. Network simulations revealed geographic recruitment bias and non-random recruitment by syphilis status. Gender-specific prevalence estimates accounting for clustering were highest among those living/working/injecting/buying drugs in the Zona Norte and directly/indirectly connected to syphilis cases (men: 15.9 %, women: 25.6 %) and lowest among those with neither exposure (men: 3.0 %, women: 6.1 %). Future RDS analyses should assess/account for network and spatial dependencies.
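RDS-adjusted prevalence estimates such as those reported above are commonly computed with the Volz–Heckathorn (RDS-II) inverse-degree-weighted estimator. The abstract does not state which estimator was used, so the sketch below is illustrative of the general idea only, with toy data:

```python
import numpy as np

def rds_ii_prevalence(outcome, degree):
    """Volz-Heckathorn (RDS-II) estimator: inverse network-degree
    weighted prevalence, correcting for the over-sampling of
    high-degree respondents in respondent-driven sampling."""
    outcome = np.asarray(outcome, dtype=float)
    w = 1.0 / np.asarray(degree, dtype=float)
    return float(np.sum(w * outcome) / np.sum(w))

# toy data: positives (1) happen to have high network degree,
# so the crude prevalence overstates the degree-adjusted one
y = np.array([1, 1, 1, 0, 0, 0, 0, 0])
d = np.array([20, 15, 10, 2, 3, 2, 4, 3])
print(float(y.mean()), round(rds_ii_prevalence(y, d), 3))
```

This mirrors the crude-versus-RDS-adjusted gap seen in the abstract (e.g., 16.6% vs. 7.6% for women), though the diagnostics the authors propose go further, probing geographic and outcome-correlated recruitment that degree weighting alone cannot fix.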

  18. Monitoring well design and sampling techniques at NAPL sites

    International Nuclear Information System (INIS)

    Collins, M.; Rohrman, W.R.; Drake, K.D.

    1992-01-01

    The existence of Non-Aqueous Phase Liquids (NAPLs) at many Superfund and RCRA hazardous waste sites has become a recognized problem in recent years. The large number of sites exhibiting this problem results from the fact that many of the most frequently used industrial solvents and petroleum products can exist as NAPLs. Hazardous waste constituents occurring as NAPLs possess a common characteristic that causes great concern during groundwater contamination evaluation: while solubility in water is generally very low, it is sufficient to cause groundwater to exceed Maximum Contamination Levels (MCLs). Thus, even a small quantity of NAPL within a groundwater regime can act as a point source with the ability to contaminate vast quantities of groundwater over time. This property makes it imperative that groundwater investigations focus heavily on characterizing the nature, extent, and migration pathways of NAPLs at sites where it exists. Two types of NAPLs may exist in a groundwater system. Water-immiscible liquid constituents having a specific gravity greater than one are termed Dense Non-Aqueous Phase Liquids, while those with a specific gravity less than one are considered Light Non-Aqueous Phase Liquids. For a groundwater investigation to properly characterize the two types of NAPLs, careful consideration must be given to the placement and sampling of groundwater monitoring wells. Unfortunately, technical reviewers at EPA Region VII and the Corps of Engineers find that many groundwater investigations fall short in characterizing NAPLs because several basic considerations were overlooked. Included among these are monitoring well location and screen placement with respect to the water table and significant confining units, and the ability of the well sampling method to obtain samples of NAPL. Depending on the specific gravity of the NAPL that occurs at a site, various considerations can substantially enhance adequate characterization of NAPL contaminants

  19. Characterizing the Joint Effect of Diverse Test-Statistic Correlation Structures and Effect Size on False Discovery Rates in a Multiple-Comparison Study of Many Outcome Measures

    Science.gov (United States)

    Feiveson, Alan H.; Ploutz-Snyder, Robert; Fiedler, James

    2011-01-01

    In their 2009 Annals of Statistics paper, Gavrilov, Benjamini, and Sarkar report the results of a simulation assessing the robustness of their adaptive step-down procedure (GBS) for controlling the false discovery rate (FDR) when normally distributed test statistics are serially correlated. In this study we extend the investigation to the case of multiple comparisons involving correlated non-central t-statistics, in particular when several treatments or time periods are compared to a control in a repeated-measures design with many dependent outcome measures. In addition, we consider several dependence structures other than serial correlation and illustrate how the FDR depends on the interaction between effect size and the type of correlation structure as indexed by Foerstner's distance metric from an identity. The relationship between the correlation matrix R of the original dependent variables and R*, the correlation matrix of the associated t-statistics, is also studied. In general R* depends not only on R, but also on sample size and the signed effect sizes for the multiple comparisons.
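The distance from the identity used to index correlation structures can be sketched as follows, assuming the usual log-eigenvalue form of the Förstner–Moonen covariance metric (for the identity as the second argument, the generalized eigenvalues reduce to the eigenvalues of R itself):

```python
import numpy as np

def foerstner_distance_from_identity(R):
    """Foerstner-Moonen metric between a correlation matrix R and the
    identity: sqrt of the sum of squared log-eigenvalues of R.
    Equals zero iff R is the identity."""
    eigvals = np.linalg.eigvalsh(R)
    if np.any(eigvals <= 0):
        raise ValueError("R must be positive definite")
    return float(np.sqrt(np.sum(np.log(eigvals) ** 2)))

R_id = np.eye(3)
R_eq = np.full((3, 3), 0.5) + 0.5 * np.eye(3)   # equicorrelated, rho = 0.5
print(foerstner_distance_from_identity(R_id))   # → 0.0
print(round(foerstner_distance_from_identity(R_eq), 3))
```

Stronger dependence pushes the eigenvalues of R away from 1 and hence increases the metric, which is what makes it a convenient one-number index of "how far from independence" a simulation scenario is.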

  20. Sample size requirements for separating out the effects of combination treatments: randomised controlled trials of combination therapy vs. standard treatment compared to factorial designs for patients with tuberculous meningitis.

    Science.gov (United States)

    Wolbers, Marcel; Heemskerk, Dorothee; Chau, Tran Thi Hong; Yen, Nguyen Thi Bich; Caws, Maxine; Farrar, Jeremy; Day, Jeremy

    2011-02-02

    In certain diseases clinical experts may judge that the intervention with the best prospects is the addition of two treatments to the standard of care. This can be tested either with a simple randomized trial of combination versus standard treatment or with a 2 x 2 factorial design. We compared the two approaches using the design of a new trial in tuberculous meningitis as an example. In that trial the combination of two drugs added to standard treatment is assumed to reduce the hazard of death by 30%, and the sample size of the combination trial to achieve 80% power is 750 patients. We calculated the power of corresponding factorial designs with one- to sixteen-fold the sample size of the combination trial, depending on the contribution of each individual drug to the combination treatment effect and the strength of an interaction between the two. In the absence of an interaction, an eight-fold increase in sample size for the factorial design as compared to the combination trial is required to achieve 80% power to jointly detect effects of both drugs if the contribution of the less potent treatment to the total effect is at least 35%. An eight-fold sample size increase also provides a power of 76% to detect a qualitative interaction at the one-sided 10% significance level if the individual effects of both drugs are equal. Factorial designs with a lower sample size have a high chance of being underpowered, of showing significance for only one drug even if both are equally effective, and of missing important interactions. Pragmatic combination trials of multiple interventions versus standard therapy are valuable in diseases with a limited patient pool if all interventions test the same treatment concept, it is considered likely that either both or none of the individual interventions are effective, and only moderate drug interactions are suspected. An adequately powered 2 x 2 factorial design to detect effects of the individual drugs would require at least 8-fold the sample size of the combination trial.
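The survival-trial arithmetic behind such sample sizes is commonly based on Schoenfeld's events formula for the log-rank test. The abstract does not state its exact method, so the sketch below is an assumption for illustration: with 1:1 allocation, a two-sided 5% test and 80% power, a 30% hazard reduction (HR = 0.7) requires roughly 247 deaths — consistent in order of magnitude with a 750-patient trial in a high-mortality disease.

```python
from math import log, ceil
from scipy.stats import norm

def schoenfeld_events(hazard_ratio, alpha=0.05, power=0.80):
    """Number of events needed to detect a given hazard ratio with a
    two-sided log-rank test under 1:1 allocation (Schoenfeld's formula):
    d = 4 * (z_{1-alpha/2} + z_{power})^2 / log(HR)^2."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return ceil(4 * (z_a + z_b) ** 2 / log(hazard_ratio) ** 2)

# 30% hazard reduction, as assumed for the combination arm
print(schoenfeld_events(0.7))  # → 247
```

The factorial comparison in the abstract then amounts to asking how many extra events are needed when the same α and power must be achieved for each factor's (smaller) main effect rather than for the combined effect.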

  1. Using patient-reported outcomes in schizophrenia: the Scottish Schizophrenia Outcomes Study.

    Science.gov (United States)

    Hunter, Robert; Cameron, Rosie; Norrie, John

    2009-02-01

    The primary aim of the Scottish Schizophrenia Outcomes Study (SSOS) was to assess the feasibility and utility of routinely collecting outcome data in everyday clinical settings. Data were collected over three years in the Scottish National Health Service (NHS). There were two secondary aims of SSOS: first, to compare data from patient-rated, objective, and clinician-rated outcomes, and second, to describe trends in outcome data and service use across Scotland over the three years of the study (2002-2005). This study used a naturalistic, longitudinal, observational cohort design. A representative sample of 1,015 persons with ICD-10 F20-F29 diagnoses (schizophrenia, schizotypal disorders, or delusional disorders) was assessed annually using the clinician-rated measure, the Health of the Nation Outcome Scale (HoNOS), and the patient-reported assessment, the Avon Mental Health Measure (Avon). Objective outcomes data and information on services and interventions were collected. Data were analyzed with regression modeling. Of the 1,015 persons recruited, 78% of the cohort (N=789) completed the study. Over the study period, significant decreases were seen in the number of hospitalizations, incidence of attempted suicide and self-harm, and civil detentions. Avon scores indicated significant improvement on all subscales (behavior, social, access, and mental health) and on the total score. However, HoNOS scores on the behavior and symptom subscales did not change, scores on the impairment subscale increased significantly (indicating increased levels of impairment), and scores on the social subscale decreased significantly (indicating improved social functioning). This study has demonstrated that it is feasible within the Scottish NHS to routinely collect meaningful outcomes data in schizophrenia. Patient-reported assessments were also successfully collected and used in care plans. 
This model shows that it is possible to incorporate patient-reported assessments into routine clinical practice.

  2. Multi-saline sample distillation apparatus for hydrogen isotope analyses: design and accuracy. Water-resources investigations

    International Nuclear Information System (INIS)

    Hassan, A.A.

    1981-04-01

    A distillation apparatus for saline water samples was designed and tested. Six samples may be distilled simultaneously. The temperature was maintained at 400 degrees C to ensure complete dehydration of the precipitating salts. Consequently, the error in the measured ratio of stable hydrogen isotopes resulting from incomplete dehydration of hydrated salts during distillation was eliminated

  3. Field Investigation Plan for 1301-N and 1325-N Facilities Sampling to Support Remedial Design

    International Nuclear Information System (INIS)

    Weiss, S. G.

    1998-01-01

    This field investigation plan (FIP) provides for the sampling and analysis activities supporting the remedial design planning for the planned removal action for the 1301-N and 1325-N Liquid Waste Disposal Facilities (LWDFs), which are treatment, storage,and disposal (TSD) units (cribs/trenches). The planned removal action involves excavation, transportation, and disposal of contaminated material at the Environmental Restoration Disposal Facility (ERDF).An engineering study (BHI 1997) was performed to develop and evaluate various options that are predominantly influenced by the volume of high- and low-activity contaminated soil requiring removal. The study recommended that additional sampling be performed to supplement historical data for use in the remedial design

  4. Network meta-analysis of multiple outcome measures accounting for borrowing of information across outcomes.

    Science.gov (United States)

    Achana, Felix A; Cooper, Nicola J; Bujkiewicz, Sylwia; Hubbard, Stephanie J; Kendrick, Denise; Jones, David R; Sutton, Alex J

    2014-07-21

    Network meta-analysis (NMA) enables simultaneous comparison of multiple treatments while preserving randomisation. When summarising evidence to inform an economic evaluation, it is important that the analysis accurately reflects the dependency structure within the data, as correlations between outcomes may have implications for estimating the net benefit associated with treatment. A multivariate NMA offers a framework for evaluating multiple treatments across multiple outcome measures while accounting for the correlation structure between outcomes. The standard NMA model is extended to multiple outcome settings in two stages. In the first stage, information is borrowed across outcomes as well as across studies through modelling the within-study and between-study correlation structure. In the second stage, we make use of the additional assumption that intervention effects are exchangeable between outcomes to predict effect estimates for all outcomes, including effect estimates on outcomes where evidence is either sparse or the treatment had not been considered by any one of the studies included in the analysis. We apply the methods to binary outcome data from a systematic review evaluating the effectiveness of nine home safety interventions on uptake of three poisoning prevention practices (safe storage of medicines, safe storage of other household products, and possession of a poison control centre telephone number) in households with children. Analyses are conducted in WinBUGS using Markov Chain Monte Carlo (MCMC) simulations. The univariate and first-stage multivariate models produced broadly similar point estimates of intervention effects, but the uncertainty around the multivariate estimates varied depending on the prior distribution specified for the between-study covariance structure. The second-stage multivariate analyses produced more precise effect estimates while enabling intervention effects to be predicted for all outcomes, including intervention effects on

  5. Climate Change Professional Development: Design, Implementation, and Initial Outcomes on Teacher Learning, Practice, and Student Beliefs

    Science.gov (United States)

    Shea, Nicole A.; Mouza, Chrystalla; Drewes, Andrea

    2016-01-01

    In this work, we present the design, implementation, and initial outcomes of the Climate Academy, a hybrid professional development program delivered through a combination of face-to-face and online interactions, intended to prepare formal and informal science teachers (grades 5-16) in teaching about climate change. The Climate Academy was…

  6. Primary and secondary exercise dependence in a community-based sample of road race runners.

    Science.gov (United States)

    Cook, Brian; Karr, Trisha M; Zunker, Christie; Mitchell, James E; Thompson, Ron; Sherman, Roberta; Crosby, Ross D; Cao, Li; Erickson, Ann; Wonderlich, Stephen A

    2013-10-01

    The purpose of our study was to examine exercise dependence (EXD) in a large community-based sample of runners. The secondary purpose of this study was to examine differences in EXD symptoms between primary and secondary EXD. Our sample included 2660 runners recruited from a local road race (M age = 38.78 years, SD = 10.80; 66.39% women; 91.62% Caucasian) who completed all study measures online within 3 weeks of the race. In this study, EXD prevalence was lower than most previously reported rates (gamma = .248, p < .001) and individuals in the at-risk for EXD category participated in longer distance races, F(8,1) = 14.13, p = .01, partial eta squared = .05. Group differences were found for gender, F(1,1921) = 8.08, p = .01, partial eta squared = .004, and primary or secondary group status, F(1,1921) = 159.53, p = .01, partial eta squared = .077. Implications of primary and secondary EXD differences and future research are discussed.

  7. A two-phase sampling design for increasing detections of rare species in occupancy surveys

    Science.gov (United States)

    Pacifici, Krishna; Dorazio, Robert M.; Dorazio, Michael J.

    2012-01-01

    1. Occupancy estimation is a commonly used tool in ecological studies owing to the ease with which data can be collected and the large spatial extent that can be covered. One major obstacle to using an occupancy-based approach is the complications associated with designing and implementing an efficient survey. These logistical challenges become magnified when working with rare species, when effort can be wasted in areas with very few or no individuals. 2. Here, we develop a two-phase sampling approach that mitigates these problems by using a design that places more effort in areas with higher predicted probability of occurrence. We compare our new sampling design to traditional single-season occupancy estimation under a range of conditions and population characteristics. We develop an intuitive measure of predictive error to compare the two approaches and use simulations to assess the relative accuracy of each approach. 3. Our two-phase approach exhibited lower predictive error rates compared to the traditional single-season approach in highly spatially correlated environments. The difference was greatest when detection probability was high (0.75) regardless of the habitat or sample size. When the true occupancy rate was below 0.4 (0.05-0.4), we found that allocating 25% of the sample to the first phase resulted in the lowest error rates. 4. In the majority of scenarios, the two-phase approach showed lower error rates compared to the traditional single-season approach, suggesting our new approach is fairly robust to a broad range of conditions and design factors and merits use under a wide variety of settings. 5. Synthesis and applications. Conservation and management of rare species are a challenging task facing natural resource managers. It is critical for studies involving rare species to efficiently allocate effort and resources as they are usually of a finite nature. We believe our approach provides a framework for optimal allocation of effort while
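The allocation logic in point 2 can be sketched in a few lines of Python. This is an illustrative toy, not the authors' estimator: `pred_occ` stands in for whatever occupancy model would be fitted to the phase-one data, and the 25% phase-one share mirrors the figure reported in point 3.

```python
import random

def two_phase_allocation(pred_occ, budget, phase1_frac=0.25, seed=0):
    """Split a fixed visit budget between an exploratory simple random
    sample of sites (phase 1) and targeted visits to the sites with the
    highest predicted probability of occupancy (phase 2)."""
    rng = random.Random(seed)
    sites = list(range(len(pred_occ)))
    n1 = int(budget * phase1_frac)
    phase1 = rng.sample(sites, n1)
    chosen = set(phase1)
    # rank the unvisited sites by predicted occupancy, highest first
    remaining = sorted((s for s in sites if s not in chosen),
                       key=lambda s: pred_occ[s], reverse=True)
    phase2 = remaining[:budget - n1]
    return phase1, phase2
```

With 100 candidate sites, a budget of 20 visits and a 25% phase-one share, 5 visits go to a random sample and the remaining 15 to the sites ranked highest by the phase-one model.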

  8. Competitive closed-loop supply chain network design with price-dependent demands

    DEFF Research Database (Denmark)

    Rezapour, Shabnam; Farahani, Reza Zanjirani; Fahimnia, Behnam

    2015-01-01

    This paper presents a bi-level model for the strategic reverse network design (upper level) and tactical/operational planning (lower level) of a closed-loop single-period supply chain operating in a competitive environment with price-dependent market demand. An existing supply chain (SC...... for the supply of new and remanufactured products. The performance behaviors of both SCs are evaluated with specific focus placed on investigating the impacts of the strategic facility location decisions of the new SC on the tactical/operational transport and inventory decisions of the overall network. The bi...

  9. DESIGN AND CALIBRATION OF A VIBRATING SAMPLE MAGNETOMETER: CHARACTERIZATION OF MAGNETIC MATERIALS

    Directory of Open Access Journals (Sweden)

    Freddy P. Guachun

    2018-01-01

    Full Text Available This paper presents the process followed in the implementation of a vibrating sample magnetometer (VSM), constructed with materials commonly found in an electromagnetism laboratory. It describes the design, construction, calibration and use in the characterization of some magnetic materials. A VSM measures the magnetic moment of a sample when it is vibrated perpendicular to a uniform magnetic field; magnetization and magnetic susceptibility can be determined from these readings. This instrument stands out for its simplicity, versatility and low cost, but it is very sensitive and capable of eliminating or minimizing many sources of error that are found in other methods of measurement, making it possible to obtain very accurate and reliable results. Its operation is based on the Lenz-Faraday law of magnetic induction and consists in measuring the voltage induced in detection coils by the variation of the magnetic flux that crosses them. The calibration of the VSM was performed by means of a standard sample (magnetite) and verified by means of a test sample (nickel).

  10. Bottom sample taker

    Energy Technology Data Exchange (ETDEWEB)

    Garbarenko, O V; Slonimskiy, L D

    1982-01-01

    In order to improve the quality of the samples taken during offshore exploration from benthic sediments, the proposed design of the sample taker has a device which makes it possible to regulate the depth of submersion of the core lifter. For this purpose the upper part of the core lifter has an inner delimiting ring, and within the core lifter there is a piston suspended on a cable. The position of the piston in relation to the core lifter is previously assigned depending on the compactness of the benthic sediments and is fixed by tension of the cable which is held by a clamp in the cover of the core taker housing. When lowered to the bottom, the core taker is released, and under the influence of hydrostatic pressure of sea water, it enters the sediments. The magnitude of penetration is limited by the distance between the piston and the stopping ring. The piston also guarantees better preservation of the sample when the instrument is lifted to the surface.

  11. Design of modified annulus air sampling system for the detection of leakage in waste transfer line

    International Nuclear Information System (INIS)

    Deokar, U.V; Khot, A.R.; Mathew, P.; Ganesh, G.; Tripathi, R.M.; Srivastava, Srishti

    2018-01-01

    Various liquid waste streams are generated during the operation of a reprocessing plant. The High Level (HL), Intermediate Level (IL) and Low Level (LL) liquid wastes generated are transferred from the reprocessing plant to the Waste Management Facility. These waste streams are transferred through pipe-in-pipe lines along a shielded concrete trench. For detection of radioactive leakage from the primary waste transfer line into the secondary line, the annulus air between the two pipes is sampled. The currently installed pressurized annulus air sampling system has no provision for online leakage detection, so there are chances of personnel exposure and airborne activity in the working area. To overcome these design flaws, a free-air-flow, modified online annulus air sampling system with more safety features is designed

  12. An M/M/c/K State-Dependent Model for Pedestrian Flow Control and Design of Facilities.

    Directory of Open Access Journals (Sweden)

    Khalidur Rahman

    Full Text Available Pedestrian overflow causes queuing delay and, in turn, is controlled by the capacity of a facility. Flow control, or blocking control, takes action to prevent queues from building up to extreme values. Thus, in this paper, the problem of pedestrian flow control in open outdoor walking facilities under equilibrium conditions is investigated using M/M/c/K queuing models. A state-dependent service rate based on the speed-density relationship is utilized. The effective rate of the Poisson arrival process to the facility is determined so that there is no overflow of pedestrians. In addition, the use of state-dependent queuing models in the design of facilities and the effect of pedestrian personal capacity on the design and on traffic congestion are discussed. The study does not validate the sustainability of adapting Western design codes for pedestrian facilities in countries like Bangladesh.
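For readers unfamiliar with the notation, the steady state of a plain M/M/c/K queue (constant service rate, unlike the state-dependent rates the paper uses) has a closed form, and the blocking probability that bounds the admissible pedestrian inflow follows directly. A minimal sketch:

```python
from math import factorial

def mmck_probs(lam, mu, c, K):
    """Steady-state probabilities p_0..p_K of an M/M/c/K queue with
    Poisson arrivals at rate lam, c servers each at rate mu, capacity K."""
    a = lam / mu
    unnorm = [a**n / factorial(n) if n <= c
              else a**n / (factorial(c) * c**(n - c))
              for n in range(K + 1)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

def blocking_probability(lam, mu, c, K):
    """Probability that an arriving pedestrian finds the facility full."""
    return mmck_probs(lam, mu, c, K)[-1]

def effective_arrival_rate(lam, mu, c, K):
    """Rate of arrivals actually admitted: lam * (1 - p_K)."""
    return lam * (1 - blocking_probability(lam, mu, c, K))
```

For c = K = 1 and lam = mu this reduces to the textbook result p_K = 1/2, so only half the offered traffic is admitted.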

  13. Blahut-Arimoto algorithm and code design for action-dependent source coding problems

    DEFF Research Database (Denmark)

    Trillingsgaard, Kasper Fløe; Simeone, Osvaldo; Popovski, Petar

    2013-01-01

    The source coding problem with action-dependent side information at the decoder has recently been introduced to model data acquisition in resource-constrained systems. In this paper, an efficient Blahut-Arimoto-type algorithm for the numerical computation of the rate-distortion-cost function...... for this problem is proposed. Moreover, a simplified two-stage code structure based on multiplexing is put forth, whereby the first stage encodes the actions and the second stage is composed of an array of classical Wyner-Ziv codes, one for each action. Leveraging this structure, specific coding/decoding...... strategies are designed based on LDGM codes and message passing. Through numerical examples, the proposed code design is shown to achieve performance close to the rate-distortion-cost function....
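The numerical core the paper builds on is the classical Blahut-Arimoto iteration. As a point of reference only, here is a minimal sketch of the plain rate-distortion version (without the action-dependent cost extension the paper proposes):

```python
import numpy as np

def blahut_arimoto_rd(p_x, dist, s, n_iter=500):
    """Classical Blahut-Arimoto iteration for a point on the
    rate-distortion curve. p_x: source pmf; dist[x, xhat]: distortion
    matrix; s < 0: slope parameter trading rate for distortion.
    Returns the achieved (D, R) pair, with R in bits."""
    n_xhat = dist.shape[1]
    q = np.full(n_xhat, 1.0 / n_xhat)   # reproduction marginal q(xhat)
    A = np.exp(s * dist)                # exponentiated distortion kernel
    for _ in range(n_iter):
        denom = A @ q                   # normalizer per source symbol x
        q = q * ((p_x / denom) @ A)     # Blahut-Arimoto update of q
    Q = (A * q) / (A @ q)[:, None]      # optimal test channel Q(xhat|x)
    D = float(np.sum(p_x[:, None] * Q * dist))
    R = float(np.sum(p_x[:, None] * Q * np.log2(Q / q)))
    return D, R
```

For a uniform binary source with Hamming distortion this reproduces the known curve R(D) = 1 - H(D); for example, s = ln(1/9) lands on D = 0.1, R ≈ 0.531 bits.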

  14. Drinking game participation and outcomes in a sample of Australian university students.

    Science.gov (United States)

    George, Amanda M; Zamboanga, Byron L

    2018-05-15

    Most drinking game (DG) research among university students has been conducted among USA college samples. The extent to which demographics and game type (e.g. team and sculling games) are linked to DG behaviours/consequences among non-USA students is not well understood. As such, the current study investigated characteristics of DG participation (and associated outcomes) among a sample of Australian university students. University students (N = 252; aged 18-24 years; 67% female) who had consumed alcohol in the prior year completed an online survey. Measures included demographics, DG behaviours (lifetime, frequency and consumption) and gaming-specific consequences. Most of the students reported lifetime DG participation (85%). Among those who played a DG in the prior 6 months (69%), most had experienced a negative gaming-specific consequence. While team games were the most popular DG played, regression analysis demonstrated that participation in games which encouraged consumption (e.g. sculling) were associated with increased alcohol consumption during play. In addition to being older, playing DGs more frequently, and consuming more alcohol while playing, participation in both consumption and dice games (e.g. 7-11, doubles) predicted more negative gaming-specific consequences. DG participation is common among Australian university students, as it is in other parts of the world. The importance of game type is clear, particularly the risk of consumption games. Findings could help inform interventions to reduce participation in consumption games and identify students who might be especially at-risk for experiencing negative DG consequences. © 2018 Australasian Professional Society on Alcohol and other Drugs.

  15. Optimal sampling strategies for detecting zoonotic disease epidemics.

    Directory of Open Access Journals (Sweden)

    Jake M Ferguson

    2014-06-01

    Full Text Available The early detection of disease epidemics reduces the chance of successful introductions into new locales, minimizes the number of infections, and reduces the financial impact. We develop a framework to determine the optimal sampling strategy for disease detection in zoonotic host-vector epidemiological systems when a disease goes from below detectable levels to an epidemic. We find that if the time of disease introduction is known then the optimal sampling strategy can switch abruptly between sampling only from the vector population to sampling only from the host population. We also construct time-independent optimal sampling strategies when conducting periodic sampling that can involve sampling both the host and the vector populations simultaneously. Both time-dependent and -independent solutions can be useful for sampling design, depending on whether the time of introduction of the disease is known or not. We illustrate the approach with West Nile virus, a globally-spreading zoonotic arbovirus. Though our analytical results are based on a linearization of the dynamical systems, the sampling rules appear robust over a wide range of parameter space when compared to nonlinear simulation models. Our results suggest some simple rules that can be used by practitioners when developing surveillance programs. These rules require knowledge of transition rates between epidemiological compartments, which population was initially infected, and of the cost per sample for serological tests.

  16. Optimal sampling strategies for detecting zoonotic disease epidemics.

    Science.gov (United States)

    Ferguson, Jake M; Langebrake, Jessica B; Cannataro, Vincent L; Garcia, Andres J; Hamman, Elizabeth A; Martcheva, Maia; Osenberg, Craig W

    2014-06-01

    The early detection of disease epidemics reduces the chance of successful introductions into new locales, minimizes the number of infections, and reduces the financial impact. We develop a framework to determine the optimal sampling strategy for disease detection in zoonotic host-vector epidemiological systems when a disease goes from below detectable levels to an epidemic. We find that if the time of disease introduction is known then the optimal sampling strategy can switch abruptly between sampling only from the vector population to sampling only from the host population. We also construct time-independent optimal sampling strategies when conducting periodic sampling that can involve sampling both the host and the vector populations simultaneously. Both time-dependent and -independent solutions can be useful for sampling design, depending on whether the time of introduction of the disease is known or not. We illustrate the approach with West Nile virus, a globally-spreading zoonotic arbovirus. Though our analytical results are based on a linearization of the dynamical systems, the sampling rules appear robust over a wide range of parameter space when compared to nonlinear simulation models. Our results suggest some simple rules that can be used by practitioners when developing surveillance programs. These rules require knowledge of transition rates between epidemiological compartments, which population was initially infected, and of the cost per sample for serological tests.

  17. Case-Cohort Studies: Design and Applicability to Hand Surgery.

    Science.gov (United States)

    Vojvodic, Miliana; Shafarenko, Mark; McCabe, Steven J

    2018-04-24

    Observational studies are common research strategies in hand surgery. The case-cohort design offers an efficient and resource-friendly method for risk assessment and outcomes analysis. Case-cohorts remain underrepresented in upper extremity research despite several practical and economic advantages over case-control studies. This report outlines the purpose, utility, and structure of the case-cohort design and offers a sample research question to demonstrate its value to risk estimation for adverse surgical outcomes. The application of well-designed case-cohort studies is advocated in an effort to improve the quality and quantity of observational research evidence in hand and upper extremity surgery. Copyright © 2018 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  18. Sample Processor for Life on Icy Worlds (SPLIce): Design and Test Results

    Science.gov (United States)

    Chinn, Tori N.; Lee, Anthony K.; Boone, Travis D.; Tan, Ming X.; Chin, Matthew M.; McCutcheon, Griffin C.; Horne, Mera F.; Padgen, Michael R.; Blaich, Justin T.; Forgione, Joshua B.

    2017-01-01

    We report the design, development, and testing of the Sample Processor for Life on Icy Worlds (SPLIce) system, a microfluidic sample processor to enable autonomous detection of signatures of life and measurements of habitability parameters in Ocean Worlds. This monolithic fluid processing-and-handling system (Figure 1; mass 0.5 kg) retrieves a 50-µL-volume sample and prepares it to supply a suite of detection instruments, each with unique preparation needs. SPLIce has potential applications in orbiter missions that sample ocean plumes, such as those found at Saturn's icy moon Enceladus, or landed missions on the surface of icy satellites, such as Jupiter's moon Europa. Answering the question "Are we alone in the universe?" is captivating and exceptionally challenging. Even general criteria that define life very broadly include a significant role for water [1,2]. Searches for extinct or extant life therefore prioritize locations of abundant water, whether in ancient (Mars) or present (Europa and Enceladus) times. Only two previous planetary missions had onboard fluid processing: the Viking Biology Experiments [3] and Phoenix's Wet Chemistry Laboratory (WCL) [4]. SPLIce differs crucially from those systems in its capability to process and distribute µL-volume samples and in its integrated, autonomous control of a wide range of fluidic functions, including: 1) retrieval of fluid samples from an evacuated sample chamber; 2) onboard multi-year storage of dehydrated reagents; 3) integrated pressure, pH, and conductivity measurement; 4) filtration and retention of insoluble particles for microscopy; 5) dilution or vacuum-driven concentration of samples to accommodate instrument working ranges; 6) removal of gas bubbles from sample aliquots; 7) unidirectional flow (check valves); 8) active flow-path selection (solenoid-actuated valves); 9) metered pumping in 100 nL volume increments. The SPLIce manifold, made of three thermally fused layers of precision-machined cyclo

  19. Experimental determination of size distributions: analyzing proper sample sizes

    International Nuclear Information System (INIS)

    Buffo, A; Alopaeus, V

    2016-01-01

    The measurement of various particle size distributions is a crucial aspect for many applications in the process industry. Size distribution is often related to the final product quality, as in crystallization or polymerization. In other cases it is related to the correct evaluation of heat and mass transfer, as well as reaction rates, which depend on the interfacial area between the different phases, or to the assessment of yield stresses of polycrystalline metal/alloy samples. The experimental determination of such distributions often involves laborious sampling procedures, and the statistical significance of the outcome is rarely investigated. In this work, we propose a novel rigorous tool, based on inferential statistics, to determine the number of samples needed to obtain reliable measurements of size distribution, according to specific requirements defined a priori. Such a methodology can be adopted regardless of the measurement technique used. (paper)
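The paper's tool is more general and technique-agnostic, but the basic shape of an inferential-statistics sample-size rule can be illustrated with the textbook normal-approximation formula n = ceil((z·σ/E)²) for estimating a mean to within a half-width E. This sketch is not the authors' method:

```python
from math import ceil
from statistics import NormalDist

def samples_needed(sigma, margin, confidence=0.95):
    """Smallest n for which a normal-approximation confidence interval
    on the sample mean has half-width <= margin, given the (assumed
    known) standard deviation sigma of a single measurement."""
    z = NormalDist().inv_cdf(0.5 + confidence / 2.0)
    return ceil((z * sigma / margin) ** 2)
```

For σ = 1 and E = 0.1 at 95% confidence this gives the familiar n = 385; doubling σ quadruples the requirement.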

  20. The outcomes of patients newly admitted to nursing homes after hip fracture.

    Science.gov (United States)

    Kiel, D P; Eichorn, A; Intrator, O; Silliman, R A; Mor, V

    1994-08-01

    The outcomes of elderly, hospitalized patients discharged to nursing homes after hip fracture were examined. For 2624 hip fracture patients admitted to any of 43 proprietary nursing homes between 1984 and 1988, admission assessments were examined in relation to 1-month outcomes. Mean patient age was 82 +/- 7 y; 85% of the sample were female. Within 1 month after discharge, 24% had returned home, 12% had been rehospitalized, 3% had died, and 61% remained in the nursing home. Characteristics significantly associated with mortality included disorientation, functional dependency, neurologic diagnoses, and use of cardiac medications, antidepressants, or narcotics. Rehospitalization was significantly associated with age, gender, living with someone, being ambulatory, and functional dependency. Returning home was associated with younger age, living with someone, being ambulatory, and having no disorientation, functional dependency, or psychiatric or neurologic diagnoses, nor any pressure sores. Better-functioning persons and those with social support returned home; physically and cognitively impaired persons and those taking narcotics, cardiac medications, or antidepressants were likely to die; and younger men, those with social support, those with functional dependency, and those who were free of disorientation were more likely to be rehospitalized.

  1. Design and implementation of a factorial randomized controlled trial of methadone maintenance therapy and an evidence-based behavioral intervention for incarcerated people living with HIV and opioid dependence in Malaysia.

    Science.gov (United States)

    Bazazi, Alexander R; Wickersham, Jeffrey A; Wegman, Martin P; Culbert, Gabriel J; Pillai, Veena; Shrestha, Roman; Al-Darraji, Haider; Copenhaver, Michael M; Kamarulzaman, Adeeba; Altice, Frederick L

    2017-08-01

    Incarcerated people living with HIV and opioid dependence face enormous challenges to accessing evidence-based treatment during incarceration and after release into the community, placing them at risk of poor HIV treatment outcomes, relapse to opioid use and accompanying HIV transmission risk behaviors. Here we describe in detail the design and implementation of Project Harapan, a prospective clinical trial conducted among people living with HIV and opioid dependence who transitioned from prison to the community in Malaysia from 2010 to 2014. This trial involved 2 interventions: within-prison initiation of methadone maintenance therapy and an evidence-based behavioral intervention adapted to the Malaysian context (the Holistic Health Recovery Program for Malaysia, HHRP-M). Individuals were recruited and received the interventions while incarcerated and were followed for 12 months after release to assess post-release HIV transmission risk behaviors and a range of other health-related outcomes. Project Harapan was designed as a fully randomized 2×2 factorial trial where individuals would be allocated in equal proportions to methadone maintenance therapy and HHRP-M, methadone maintenance therapy alone, HHRP-M alone, or control. Partway through study implementation, allocation to methadone maintenance therapy was changed from randomization to participant choice; randomization to HHRP-M continued throughout. We describe the justification for this study; the development and implementation of these interventions; changes to the protocol; and screening, enrollment, treatment receipt, and retention of study participants. Logistical, ethical, and analytic issues associated with the implementation of this study are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Sampling bias in climate-conflict research

    Science.gov (United States)

    Adams, Courtland; Ide, Tobias; Barnett, Jon; Detges, Adrien

    2018-03-01

    Critics have argued that the evidence of an association between climate change and conflict is flawed because the research relies on a dependent variable sampling strategy [1-4]. Similarly, it has been hypothesized that convenience of access biases the sample of cases studied (the 'streetlight effect' [5]). This also gives rise to claims that the climate-conflict literature stigmatizes some places as being more 'naturally' violent [6-8]. Yet there has been no proof of such sampling patterns. Here we test whether climate-conflict research is based on such a biased sample through a systematic review of the literature. We demonstrate that research on climate change and violent conflict suffers from a streetlight effect. Further, studies which focus on a small number of cases in particular are strongly informed by cases where there has been conflict, do not sample on the independent variables (climate impact or risk), and hence tend to find some association between these two variables. These biases mean that research on climate change and conflict primarily focuses on a few accessible regions, overstates the links between both phenomena and cannot explain peaceful outcomes from climate change. This could result in maladaptive responses in those places that are stigmatized as being inherently more prone to climate-induced violence.

  3. TA Treatment of Depression - A Hermeneutic Single-Case Efficacy Design Study - ‘Linda’ - a mixed outcome case

    Directory of Open Access Journals (Sweden)

    Mark Widdowson

    2013-07-01

    Full Text Available Hermeneutic Single-Case Efficacy Design (HSCED) is a systematic case study research method involving the cross-examination of mixed method data to generate both plausible arguments that the client changed due to therapy and alternative explanations. The present study is the fourth article of a case series which has investigated the process and outcome of transactional analysis psychotherapy using Hermeneutic Single-Case Efficacy Design (Elliott, 2002). The client, Linda, was a 45-year-old white British woman with mild depression who attended nine sessions of therapy. The conclusion of the judges was that this was a mixed-outcome case: whilst the client improved over the course of therapy and was positive about her experience of therapy, her changes did not last when she experienced considerable stressful events during follow-up. Linda provided a detailed and idiosyncratic description of the aspects of the therapy which were most helpful for her. A cross-case comparison with other cases in this series suggests several interesting features which are worthy of further investigation. Specifically, the use of a shared theoretical framework and an egalitarian therapeutic relationship were helpful. As with other cases in this series, the client experienced positive changes in her interpersonal relationships, suggesting that this outcome of TA therapy warrants further investigation

  4. Probabilistic site dependent design spectra for a NPP

    International Nuclear Information System (INIS)

    Chavez, M.; Arroyo, M.; Romo, M.P.

    1985-01-01

    A methodology is proposed to compute the design spectra for a NPP site. Near-field earthquakes are included by using an appropriately scaled sample of response spectra. Site effects are considered through a probabilistic site response analysis in the frequency domain which considers nonlinear behaviour of soils. The uncertainties of the soil shear modulus, G, are introduced by using Rosenblueth's point estimates. Strong motion duration is treated by using sensitivity analysis. The procedure is applied to a NPP site and the results are: a) the USNRC R.G. 1.60 spectra underestimate the spectral amplitudes for frequencies of interest; b) the omission of the uncertainties in G leads to under- or over-estimation of the spectral amplitudes in certain frequency bands; c) the effect of considering the actual strong motion duration instead of an average value is to reduce the peak spectral amplitudes by ten per cent. (orig.)
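Rosenblueth's point-estimate method cited in the abstract has a particularly simple one-variable form: evaluate the response at the mean plus and minus one standard deviation and combine the two values with equal weights. A generic sketch, not tied to the paper's site-response model:

```python
def rosenblueth_2pe(f, mean, std):
    """Rosenblueth's two-point estimate of E[f(X)] and Var[f(X)] for a
    single random input X described only by its mean and standard
    deviation: evaluate f at mean +/- std and weight equally."""
    y_plus, y_minus = f(mean + std), f(mean - std)
    expectation = 0.5 * (y_plus + y_minus)
    variance = (0.5 * (y_plus - y_minus)) ** 2
    return expectation, variance
```

For a linear response the mean and variance are exact; for nonlinear responses (such as spectral amplitude versus shear modulus) they are low-order approximations, which is the appeal of the method when only two moments of G are known.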

  5. Methodology Series Module 5: Sampling Strategies.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling - based on the researcher's choice and on populations that are accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
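
    The distinction drawn above between probability and non-probability sampling can be sketched in a few lines of code; the population, stratum variable, and sample sizes below are illustrative, not taken from the article:

```python
import random

def simple_random_sample(population, n, seed=0):
    """Probability sampling: every unit has an equal chance of selection."""
    rng = random.Random(seed)
    return rng.sample(population, n)

def stratified_random_sample(population, stratum_of, n_per_stratum, seed=0):
    """Probability sampling within predefined strata (e.g. by sex or age group)."""
    rng = random.Random(seed)
    strata = {}
    for unit in population:
        strata.setdefault(stratum_of(unit), []).append(unit)
    sample = []
    for _, units in sorted(strata.items()):
        sample.extend(rng.sample(units, n_per_stratum))
    return sample

# Hypothetical population of (id, sex) records, 50 of each sex
population = [(i, "F" if i % 2 else "M") for i in range(100)]
srs = simple_random_sample(population, 10)
sts = stratified_random_sample(population, lambda u: u[1], 5)
print(len(srs), len(sts))  # 10 10
```

    A convenience sample, by contrast, would simply take whichever units are at hand (e.g. `population[:10]`), which is why it must not be reported in a manuscript as a 'random sample'.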

  6. Methodology series module 5: Sampling strategies

    Directory of Open Access Journals (Sweden)

    Maninder Singh Setia

    2016-01-01

    Full Text Available Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling – based on the researcher's choice and on populations that are accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.

  7. Thermal stress analysis and the effect of temperature dependence of material properties on Doublet III limiter design

    International Nuclear Information System (INIS)

    McKelvey, T.E.; Koniges, A.E.; Marcus, F.; Sabado, M.; Smith, R.

    1979-10-01

    Temperature and thermal stress parametric design curves are presented for two materials selected for Doublet III primary limiter applications. INC X-750 is a candidate for the medium Z limiter design and ATJ graphite for the low Z design. The dependence of significant material properties on temperature is shown and the impact of this behavior on the decision to actively or passively cool the limiter is discussed

  8. Software documentation and user's manual for fish-impingement sampling design and estimation method computer programs

    International Nuclear Information System (INIS)

    Murarka, I.P.; Bodeau, D.J.

    1977-11-01

    This report contains a description of three computer programs that implement the theory of sampling designs and the methods for estimating fish-impingement at the cooling-water intakes of nuclear power plants as described in companion report ANL/ES-60. Complete FORTRAN listings of these programs, named SAMPLE, ESTIMA, and SIZECO, are given and augmented with examples of how they are used

  9. Adrenal Vein Sampling for Conn’s Syndrome: Diagnosis and Clinical Outcomes

    Directory of Open Access Journals (Sweden)

    Amy R. Deipolyi

    2015-06-01

    Full Text Available Adrenal vein sampling (AVS) is the gold standard test to determine unilateral causes of primary aldosteronism (PA). We have retrospectively characterized our experience with AVS, including concordance of AVS results and imaging, and describe the approach for the PA patient in whom bilateral AVS is unsuccessful. We reviewed the medical records of 85 patients with PA, compared patients who were treated medically and surgically on pre-procedure presentation and post-treatment outcomes, and evaluated how technically unsuccessful AVS results were used in further patient management. Of the 92 AVS performed in 85 patients, AVS was technically successful bilaterally in 58 (63%) of cases. An unsuccessful AVS either prompted a repeat AVS, or results from the contralateral side and from CT imaging were used to guide further therapy. Patients who were managed surgically with adrenalectomy had higher initial blood pressure and lower potassium levels compared with patients who were managed medically. Adrenalectomy resulted in significantly decreased blood pressure and normalization of potassium levels. AVS can identify surgically curable causes of PA, but can be technically challenging. When one adrenal vein fails to be cannulated, results from the contralateral vein can be useful in conjunction with imaging and clinical findings to suggest further management.

  10. Learning Program for Enhancing Visual Literacy for Non-Design Students Using a CMS to Share Outcomes

    Science.gov (United States)

    Ariga, Taeko; Watanabe, Takashi; Otani, Toshio; Masuzawa, Toshimitsu

    2016-01-01

    This study proposes a basic learning program for enhancing visual literacy using an original Web content management system (Web CMS) to share students' outcomes in class as blog posts. It seeks to reinforce students' understanding and awareness of the design of visual content. The learning program described in this research focuses on addressing…

  11. Medical complications of intra-hospital patient transports: implications for architectural design and research.

    Science.gov (United States)

    Ulrich, Roger S; Zhu, Xuemei

    2007-01-01

    Literature on healthcare architecture and evidence-based design has rarely considered explicitly that patient outcomes may be worsened by intra-hospital transport (IHT), which is defined as transport of patients within the hospital. The article focuses on the effects of IHTs on patient complications and outcomes, and the implications of such impacts for designing safer, better hospitals. A review of 22 scientific studies indicates that IHTs are subject to a wide range of complications, many of which occur frequently and have distinctly detrimental effects on patient stability and outcomes. The research suggests that higher patient acuity and longer transport durations are associated with more frequent and serious IHT-related complications and outcome effects. It appears no rigorous research has compared different hospital designs and layouts with respect to having possibly differential effects on transport-related complications and worsened outcomes. Nonetheless, certain design implications can be extracted from the existing research literature, including the importance of minimizing transport delays due to restricted space and congestion, and creating layouts that shorten IHT times for high-acuity patients. Limited evidence raises the possibility that elevator-dependent vertical building layouts may increase susceptibility to transport delays that worsen complications. The strong evidence indicating that IHTs trigger complications and worsen outcomes suggests a powerful justification for adopting acuity-adaptable rooms and care models that substantially reduce transports. A program of studies is outlined to address gaps in knowledge. Key Words: Patient transports, transports within hospitals, patient safety, evidence-based design, hospital design, healthcare architecture, intra-hospital transport complications, acuity-adaptable care, elevators, outcomes.

  12. Just showing up is not enough: Homework adherence and outcome in cognitive-behavioral therapy for cocaine dependence.

    Science.gov (United States)

    Decker, Suzanne E; Kiluk, Brian D; Frankforter, Tami; Babuscio, Theresa; Nich, Charla; Carroll, Kathleen M

    2016-10-01

    Homework in cognitive-behavioral therapy (CBT) provides opportunities to practice skills. In prior studies, homework adherence was associated with improved outcome across a variety of disorders. Few studies have examined whether the relationship between homework adherence and outcome is maintained after treatment end or is independent of treatment attendance. This study combined data from 4 randomized clinical trials of CBT for cocaine dependence to examine relationships among homework adherence, participant variables, and cocaine use outcomes during treatment and at follow-up. The data set included only participants who attended at least 2 CBT sessions to allow for assignment and return of homework (N = 158). Participants returned slightly less than half (41.1%) of assigned homework. Longitudinal random effects regression suggested a greater reduction in cocaine use during treatment and through 12-month follow-up for participants who completed half or more of assigned homework (3-way interaction), F(2, 910.69) = 4.28, p = .01. In multiple linear regression, the percentage of homework adherence was associated with greater number of cocaine-negative urine toxicology screens during treatment, even when accounting for baseline cocaine use frequency and treatment attendance; at 3 months follow-up, multiple logistic regression indicated homework adherence was associated with cocaine-negative urine toxicology screen, controlling for baseline cocaine use and treatment attendance. These results extend findings from prior studies regarding the importance of homework adherence by demonstrating associations among homework and cocaine use outcomes during treatment and up to 12 months after, independent of treatment attendance and baseline cocaine use severity. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  13. Long-term outcomes of kidney transplantation across a positive complement-dependent cytotoxicity crossmatch.

    Science.gov (United States)

    Riella, Leonardo V; Safa, Kassem; Yagan, Jude; Lee, Belinda; Azzi, Jamil; Najafian, Nader; Abdi, Reza; Milford, Edgar; Mah, Helen; Gabardi, Steven; Malek, Sayeed; Tullius, Stefan G; Magee, Colm; Chandraker, Anil

    2014-06-27

    More than 30% of potential kidney transplant recipients have pre-existing anti-human leukocyte antigen antibodies. This subgroup has significantly lower transplant rates and increased mortality. Desensitization has become an important tool to overcome this immunological barrier. However, limited data are available regarding long-term outcomes, in particular for the highest risk group with a positive complement-dependent cytotoxicity crossmatch (CDC XM) before desensitization. Between 2002 and 2010, 39 patients underwent living-kidney transplantation across a positive CDC XM against their donors at our center. The desensitization protocol involved pretransplant immunosuppression, plasmapheresis, and low-dose intravenous immunoglobulin ± rituximab. Measured outcomes included patient survival, graft survival, renal function, rates of rejection, infection, and malignancy. The mean and median follow-up was 5.2 years. Patient survival was 95% at 1 year, 95% at 3 years, and 86% at 5 years. Death-censored graft survival was 94% at 1 year, 88% at 3 years, and 84% at 5 years. Uncensored graft survival was 87% at 1 year, 79% at 3 years, and 72% at 5 years. Twenty-four subjects (61%) developed acute antibody-mediated rejection of the allograft and one patient lost her graft because of hyperacute rejection. Infectious complications included pneumonia (17%), BK nephropathy (10%), and CMV disease (5%). Skin cancer was the most prevalent malignancy in 10% of patients. There were no cases of lymphoproliferative disorder. Mean serum creatinine was 1.7 ± 1 mg/dL in functioning grafts at 5 years after transplantation. Despite high rates of early rejection, desensitization in living-kidney transplantation results in acceptable 5-year patient and graft survival rates.

  14. Expect With Me: development and evaluation design for an innovative model of group prenatal care to improve perinatal outcomes.

    Science.gov (United States)

    Cunningham, Shayna D; Lewis, Jessica B; Thomas, Jordan L; Grilo, Stephanie A; Ickovics, Jeannette R

    2017-05-18

    Despite biomedical advances and intervention efforts, rates of preterm birth and other adverse outcomes in the United States have remained relatively intransigent. Evidence suggests that group prenatal care can reduce these risks, with implications for maternal and child health as well as substantial cost savings. However, widespread dissemination presents challenges, in part because training and health systems have not been designed to deliver care in a group setting. This manuscript describes the design and evaluation of Expect With Me, an innovative model of group prenatal care with a strong integrated information technology (IT) platform designed to be scalable nationally. Expect With Me follows clinical guidelines from the American Congress of Obstetricians and Gynecologists. Expect With Me incorporates the best evidence-based features of existing models of group care with a novel integrated IT platform designed to improve patient engagement and support, enhance health behaviors and decision making, connect providers and patients, and improve health service delivery. A multisite prospective longitudinal cohort study is being conducted to examine the impact of Expect With Me on perinatal and postpartum outcomes, and to identify and address barriers to national scalability. Process and outcome evaluation will include quantitative and qualitative data collection at patient, provider, and organizational levels. Mixed-method data collection includes patient surveys, medical record reviews, patient focus groups; provider surveys, session evaluations, provider focus groups and in-depth interviews; an online tracking system; and clinical site visits. A two-to-one matched cohort of women receiving individual care from each site will provide a comparison group (n = 1,000 Expect With Me patients; n = 2,000 individual care patients) for outcome and cost analyses. By bundling prevention and care services into a high-touch, high-tech group prenatal care model

  15. Tradeoff analysis for Dependable Real-Time Embedded Systems during the Early Design Phases

    DEFF Research Database (Denmark)

    Gan, Junhe

    Embedded systems are becoming increasingly complex and have tight competing constraints in terms of performance, cost, energy consumption, dependability, flexibility, security, etc. The objective of this thesis is to propose design methods and tools for supporting the tradeoff analysis of competing...... to processing elements, as well as the processor voltage and frequency levels for executing each task, such that transient faults are tolerated, the real-time constraints of the application are satisfied, and the energy consumed is minimized. In this thesis, we target the early design phases, when decisions...... have a high impact on the subsequent implementation choices. However, due to a lack of information, the early design phases are characterized by uncertainties, e.g., in the worst-case execution times (WCETs), in the functionality requirements, or in the hardware component costs. In this context, we...

  16. Methodology Series Module 5: Sampling Strategies

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the ‘Sampling Method’. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling – based on the researcher's choice and on populations that are accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term ‘random sample’ when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the ‘generalizability’ of these results. In such a scenario, the researcher may want to use ‘purposive sampling’ for the study. PMID:27688438

  17. Analysing designed experiments in distance sampling

    Science.gov (United States)

    Stephen T. Buckland; Robin E. Russell; Brett G. Dickson; Victoria A. Saab; Donal N. Gorman; William M. Block

    2009-01-01

    Distance sampling is a survey technique for estimating the abundance or density of wild animal populations. Detection probabilities of animals inherently differ by species, age class, habitats, or sex. By incorporating the change in an observer's ability to detect a particular class of animals as a function of distance, distance sampling leads to density estimates...

  18. FACTORIAL VALIDITY OF THE KOREAN VERSION OF THE EXERCISE DEPENDENCE SCALE-REVISED.

    Science.gov (United States)

    Shin, Kyulee; You, Sukkyung

    2015-12-01

    This study evaluated the psychometric properties of the Korean version of the 21-item Exercise Dependence Scale-Revised (EDS-R). The EDS-R was designed to measure the multidimensional aspects of exercise dependence symptoms such as withdrawal, continuance, tolerance, lack of control, reductions, time, and intention. Although the EDS-R has demonstrated sound psychometric properties, it has only been validated in Western samples. Cross-cultural validations of the instrument may increase the knowledge of exercise dependence. Therefore, this study aimed to contribute to the field by investigating the validity and utility of the construct of the EDS-R, using a non-Western sample. 402 adult participants who were over 18 years of age and who reported exercising at least once a week were asked to complete the EDS-R. The results from factor analyses supported that the seven-factor model of exercise dependence symptoms showed an adequate fit for both men and women. The EDS-R scores differentiated between samples with varying amounts of exercise; 15.4% of the sample was classified as being at risk for exercise dependence. In sum, the results indicated that the EDS-R is a psychometrically reliable assessment tool for exercise dependence symptoms in Korea.

  19. The association between smoking and subsequent suicide-related outcomes in the National Comorbidity Survey panel sample.

    Science.gov (United States)

    Kessler, R C; Borges, G; Sampson, N; Miller, M; Nock, M K

    2009-12-01

    Controversy exists about whether the repeatedly documented associations between smoking and subsequent suicide-related outcomes (SROs; ideation, plans, gestures and attempts) are due to unmeasured common causes or to causal effects of smoking on SROs. We address this issue by examining associations of smoking with subsequent SROs, with and without controls for potential explanatory variables, in the National Comorbidity Survey (NCS) panel. The latter consists of 5001 people who participated in both the 1990-1992 NCS and the 2001-2003 NCS follow-up survey. Explanatory variables include sociodemographics, potential common causes (parental history of mental-substance disorders; other respondent childhood adversities) and potential mediators (respondent history of Diagnostic and Statistical Manual of Mental Disorders, 3rd edn, revised mental-substance disorders). Small gross (that is, without controls) prospective associations are found between history of early-onset nicotine dependence and both subsequent suicide ideation and, among ideators, subsequent suicide plans. None of the baseline smoking measures, though, predicts subsequent suicide gestures or attempts among ideators. The smoking-ideation association largely disappears, but the association of early-onset nicotine dependence with subsequent suicide plans persists (odds ratio=3.0), after adjustment for control variables. However, the latter association is as strong with remitted as with active nicotine dependence, arguing against a direct causal effect of nicotine dependence on suicide plans. Decomposition of the control variable effects, furthermore, suggests that these effects are due to common causes more than to mediators. These results refine our understanding of the ways in which smoking is associated with later SROs and, for the most part, argue against the view that these associations are due to causal effects of smoking.

  20. A simple and efficient alternative to implementing systematic random sampling in stereological designs without a motorized microscope stage.

    Science.gov (United States)

    Melvin, Neal R; Poda, Daniel; Sutherland, Robert J

    2007-10-01

    When properly applied, stereology is a very robust and efficient method to quantify a variety of parameters from biological material. A common sampling strategy in stereology is systematic random sampling, which involves choosing a random sampling start point outside the structure of interest, and sampling relevant objects at sites that are placed at pre-determined, equidistant intervals. This has proven to be a very efficient sampling strategy, and is used widely in stereological designs. At the microscopic level, this is most often achieved through the use of a motorized stage that facilitates the systematic random stepping across the structure of interest. Here, we report a simple, precise and cost-effective software-based alternative to accomplishing systematic random sampling under the microscope. We believe that this approach will facilitate the use of stereological designs that employ systematic random sampling in laboratories that lack the resources to acquire costly, fully automated systems.
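
    The strategy described above (a uniformly random start point, then sites at fixed equidistant steps) can be sketched as follows; the field dimensions and step sizes are illustrative assumptions, not values from the paper:

```python
import random

def systematic_random_grid(width, height, step_x, step_y, seed=None):
    """Systematic uniform random sampling in 2D: draw one random offset
    inside the first grid cell, then place sites at equidistant steps."""
    rng = random.Random(seed)
    x0 = rng.uniform(0, step_x)  # random start point, as in the design above
    y0 = rng.uniform(0, step_y)
    sites = []
    y = y0
    while y < height:
        x = x0
        while x < width:
            sites.append((x, y))
            x += step_x
        y += step_y
    return sites

# Hypothetical 1000 x 800 um region sampled every 200 um in each direction
sites = systematic_random_grid(1000, 800, step_x=200, step_y=200, seed=1)
print(len(sites))  # 20 sites: 5 columns x 4 rows
```

    Because only the offset is random, every point of the structure has the same inclusion probability while the sites stay evenly spread, which is what makes this sampling scheme efficient.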

  1. A Bayesian Justification for Random Sampling in Sample Survey

    Directory of Open Access Journals (Sweden)

    Glen Meeden

    2012-07-01

    Full Text Available In the usual Bayesian approach to survey sampling, the sampling design plays a minimal role, at best. Although a close relationship between exchangeable prior distributions and simple random sampling has been noted, how to formally integrate simple random sampling into the Bayesian paradigm is not clear. Recently it has been argued that the sampling design can be thought of as part of a Bayesian's prior distribution. We will show here that under this scenario simple random sampling can be given a Bayesian justification in survey sampling.

  2. Clinical Outcomes among Transferred Children with Ischemic and Hemorrhagic Strokes in the Nationwide Inpatient Sample.

    Science.gov (United States)

    Adil, Malik M; Vidal, Gabriel A; Beslow, Lauren A

    2016-11-01

    Children with ischemic stroke (IS) and hemorrhagic stroke (HS) may require interfacility transfer for higher level of care. We compared the characteristics and clinical outcomes of transferred and nontransferred children with IS and HS. Children aged 1-18 years admitted to hospitals in the United States from 2008 to 2011 with a primary discharge diagnosis of IS and HS were identified from the National Inpatient Sample database by ICD-9 codes. Using logistic regression, we estimated the odds ratios (OR) and 95% confidence intervals (CI) for in-hospital mortality and discharge to nursing facilities (versus discharge home) between transferred and nontransferred patients. Of the 2815 children with IS, 26.7% were transferred. In-hospital mortality and discharge to nursing facilities were not different between transferred and nontransferred children in univariable analysis or in multivariable analysis that adjusted for age, sex, and confounding factors. Of the 6879 children with HS, 27.1% were transferred. Transferred compared to nontransferred children had higher rates of both in-hospital mortality (8% versus 4%, P = .003) and discharge to nursing facilities (25% versus 20%, P = .03). After adjusting for age, sex, and confounding factors, in-hospital mortality (OR 1.5, 95% CI 1.1-2.4, P = .04) remained higher in transferred children, whereas discharge to nursing facilities was not different between the groups. HS but not IS was associated with worse outcomes for children transferred to another hospital compared to children who were not transferred. Additional study is needed to understand what factors may contribute to poorer outcomes among transferred children with HS. Copyright © 2016 National Stroke Association. Published by Elsevier Inc. All rights reserved.
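
    As background to the odds ratios reported above, an unadjusted OR and its Wald (Woolf) 95% confidence interval come directly from the 2x2 table of outcomes; a minimal sketch with illustrative counts (not the study's actual data — the abstract's ORs are adjusted estimates from multivariable logistic regression):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
        transferred:    a events, b non-events
        nontransferred: c events, d non-events
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

# Illustrative counts only, loosely mimicking 8% vs 4% in-hospital mortality
or_, (lo, hi) = odds_ratio_ci(20, 230, 10, 240)
print(round(or_, 2))  # 2.09
```

    A CI whose lower bound exceeds 1 corresponds to a statistically significant excess odds at the 5% level, which is the criterion behind results such as "OR 1.5, 95% CI 1.1-2.4" above.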

  3. Implicit Valuation of the Near-Miss is Dependent on Outcome Context.

    Science.gov (United States)

    Banks, Parker J; Tata, Matthew S; Bennett, Patrick J; Sekuler, Allison B; Gruber, Aaron J

    2018-03-01

    Gambling studies have described a "near-miss effect" wherein the experience of almost winning increases gambling persistence. The near-miss has been proposed to inflate the value of preceding actions through its perceptual similarity to wins. We demonstrate here, however, that it acts as a conditioned stimulus to positively or negatively influence valuation, dependent on reward expectation and cognitive engagement. When subjects are asked to choose between two simulated slot machines, near-misses increase valuation of machines with a low payout rate, whereas they decrease valuation of high payout machines. This contextual effect impairs decisions and persists regardless of manipulations to outcome feedback or financial incentive provided for good performance. It is consistent with proposals that near-misses cause frustration when wins are expected, and we propose that it increases choice stochasticity and overrides avoidance of low-valued options. Intriguingly, the near-miss effect disappears when subjects are required to explicitly value machines by placing bets, rather than choosing between them. We propose that this task increases cognitive engagement and recruits participation of brain regions involved in cognitive processing, causing inhibition of otherwise dominant systems of decision-making. Our results reveal that only implicit, rather than explicit strategies of decision-making are affected by near-misses, and that the brain can fluidly shift between these strategies according to task demands.

  4. Comparative Survey of Mental Disorders and Personality Characteristics in Persons With Drug Dependent and Non Drug Dependent in Hamadan, Iran

    Directory of Open Access Journals (Sweden)

    A. Ghaleiha

    2008-07-01

    Full Text Available Introduction & Objective: The influence of genetic, biological, psychological, social and cultural factors on drug dependency, and the high comorbidity of this phenomenon with psychiatric disorders (for example anxiety and depression) and with personality traits and disorders, is emphasized. The aim of this study was the comparative investigation of mental disorders and personality traits in persons with and without drug dependence in Hamadan city in 2001-2003. Materials & Methods: The present research was a descriptive comparison; the subjects were 100 drug-dependent persons and 100 non-drug-dependent individuals. The case group was chosen through available (convenience) sampling among persons who visited a psychiatrist, and the control group was chosen through simple random sampling from the general population. Measurement and diagnostic tools included a questionnaire for examining demographic characteristics designed by the researchers, the MMPI and SCL-90-R tests, and DSM-IV diagnostic criteria; the t-test was used for analyzing the data. Results: Between the two groups, differences in the clinical and validity scales of the MMPI test, except for hypochondriasis and hysteria, and in the scales of the SCL-90-R test, except for somatization and interpersonal sensitivity, were statistically significant (P<0.05). Conclusion: We can conclude that persons with drug dependence display more signs of psychopathology and mental disorders in comparison with non-drug-dependent people; major depressive disorder and personality disorders were frequent among the drug-dependent group. Depression and personality disorders were frequent in non-drug-dependent persons too. Our results support the results of previous studies.

  5. HPLC/DAD determination of rosmarinic acid in Salvia officinalis: sample preparation optimization by factorial design

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Karina B. de [Universidade Federal do Parana (UFPR), Curitiba, PR (Brazil). Dept. de Farmacia; Oliveira, Bras H. de, E-mail: bho@ufpr.br [Universidade Federal do Parana (UFPR), Curitiba, PR (Brazil). Dept. de Quimica

    2013-01-15

    Sage (Salvia officinalis) contains high amounts of the biologically active rosmarinic acid (RA) and other polyphenolic compounds. RA is easily oxidized, and may undergo degradation during sample preparation for analysis. The objective of this work was to develop and validate an analytical procedure for the determination of RA in sage, using factorial design of experiments for optimizing sample preparation. The statistically significant variables for improving RA extraction yield were determined initially and then used in the optimization step, using a central composite design (CCD). The analytical method was then fully validated and used for the analysis of commercial samples of sage. The optimized procedure involved extraction with aqueous methanol (40%) containing an antioxidant mixture (ascorbic acid and ethylenediaminetetraacetic acid (EDTA)), with sonication at 45 °C for 20 min. The samples were then injected into a system containing a C18 column, using methanol (A) and 0.1% phosphoric acid in water (B) in step gradient mode (45A:55B, 0-5 min; 80A:20B, 5-10 min) with a flow rate of 1.0 mL min-1 and detection at 330 nm. Under these conditions, RA concentrations were 50% higher when compared to extractions without antioxidants (98.94 ± 1.07% recovery). Auto-oxidation of RA during sample extraction was prevented by the use of antioxidants, resulting in more reliable analytical results. The method was then used for the analysis of commercial samples of sage. (author)
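
    The central composite design (CCD) named above consists of factorial corner points, axial (star) points, and replicated center points; a minimal sketch in coded units, with hypothetical factor names and run counts (the abstract does not report the actual design matrix):

```python
from itertools import product

def central_composite_design(factors, alpha=1.682, n_center=3):
    """Coded runs of a rotatable CCD for k factors:
    2**k factorial corners at +/-1, 2k axial points at +/-alpha,
    and n_center replicated center points (all factors at 0)."""
    corners = [dict(zip(factors, combo))
               for combo in product([-1.0, 1.0], repeat=len(factors))]
    axial = []
    for name in factors:
        for a in (-alpha, alpha):
            run = {f: 0.0 for f in factors}
            run[name] = a
            axial.append(run)
    center = [{f: 0.0 for f in factors} for _ in range(n_center)]
    return corners + axial + center

# Hypothetical coded factors for the extraction step
design = central_composite_design(["methanol_pct", "temperature_C", "time_min"])
print(len(design))  # 17 runs: 8 corners + 6 axial + 3 center
```

    For three factors, alpha = 1.682 ≈ 2**(3/4) makes the design rotatable; the measured response (here, RA extraction yield) would then be fit with a quadratic model over these runs to locate the optimum.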

  6. HPLC/DAD determination of rosmarinic acid in Salvia officinalis: sample preparation optimization by factorial design

    International Nuclear Information System (INIS)

    Oliveira, Karina B. de; Oliveira, Bras H. de

    2013-01-01

    Sage (Salvia officinalis) contains high amounts of the biologically active rosmarinic acid (RA) and other polyphenolic compounds. RA is easily oxidized, and may undergo degradation during sample preparation for analysis. The objective of this work was to develop and validate an analytical procedure for the determination of RA in sage, using factorial design of experiments for optimizing sample preparation. The statistically significant variables for improving RA extraction yield were determined initially and then used in the optimization step, using a central composite design (CCD). The analytical method was then fully validated and used for the analysis of commercial samples of sage. The optimized procedure involved extraction with aqueous methanol (40%) containing an antioxidant mixture (ascorbic acid and ethylenediaminetetraacetic acid (EDTA)), with sonication at 45 °C for 20 min. The samples were then injected into a system containing a C18 column, using methanol (A) and 0.1% phosphoric acid in water (B) in step gradient mode (45A:55B, 0-5 min; 80A:20B, 5-10 min) with a flow rate of 1.0 mL min−1 and detection at 330 nm. Under these conditions, RA concentrations were 50% higher when compared to extractions without antioxidants (98.94 ± 1.07% recovery). Auto-oxidation of RA during sample extraction was prevented by the use of antioxidants, resulting in more reliable analytical results. The method was then used for the analysis of commercial samples of sage. (author)

  7. Using instructional design process to improve design and development of Internet interventions.

    Science.gov (United States)

    Hilgart, Michelle M; Ritterband, Lee M; Thorndike, Frances P; Kinzie, Mable B

    2012-06-28

    Given the wide reach and extensive capabilities of the Internet, it is increasingly being used to deliver comprehensive behavioral and mental health intervention and prevention programs. Their goals are to change user behavior, reduce unwanted complications or symptoms, and improve health status and health-related quality of life. Internet interventions have been found efficacious in addressing a wide range of behavioral and mental health problems, including insomnia, nicotine dependence, obesity, diabetes, depression, and anxiety. Despite the existence of many Internet-based interventions, there is little research to inform their design and development. A model for behavior change in Internet interventions has been published to help guide future Internet intervention development and to help predict and explain behavior changes and symptom improvement outcomes through the use of Internet interventions. An argument is made for grounding the development of Internet interventions within a scientific framework. To that end, the model highlights a multitude of design-related components, areas, and elements, including user characteristics, environment, intervention content, level of intervention support, and targeted outcomes. However, more discussion is needed regarding how the design of the program should be developed to address these issues. While there is little research on the design and development of Internet interventions, there is a rich, related literature in the field of instructional design (ID) that can be used to inform Internet intervention development. ID models are prescriptive models that describe a set of activities involved in the planning, implementation, and evaluation of instructional programs. Using ID process models has been shown to increase the effectiveness of learning programs in a broad range of contexts. ID models specify a systematic method for assessing the needs of learners (intervention users) to determine the gaps between current

  8. Design of an automatic sample changer for the measurement of neutron flux by gamma spectrometry

    International Nuclear Information System (INIS)

    Gago, Javier; Bruna, Ruben; Baltuano, Oscar; Montoya, Eduardo; Descreaux, Killian

    2014-01-01

    This paper presents the calculations, component selection, and design for the construction of an automatic system to measure neutron flux in a working nuclear reactor by the gamma spectrometry technique, using samples irradiated in the RP-10 core. The system will perform measurements by interchanging 100 samples in a programmed, automatic way, reducing user operation time and yielding more accurate measurements. (authors).

  9. Design and construction of a prototype vaporization calorimeter for the assay of radioisotopic samples

    International Nuclear Information System (INIS)

    Tormey, T.V.

    1979-10-01

    A prototype vaporization calorimeter has been designed and constructed for use in the assay of low-power-output radioisotopic samples. The prototype calorimeter design was based on that of a previous experimental instrument used by H.P. Stephens to establish the feasibility of the vaporization calorimetry technique for this type of power measurement. The calorimeter is composed of a mechanical calorimeter assembly together with a data acquisition and control system. Detailed drawings of the calorimeter assembly are included and additional drawings are referenced. The data acquisition system is based on an HP 9825A programmable calculator. A description of the hardware is provided together with a listing of all system software programs. The operating procedure is outlined, including initial setup and operation of all related equipment. Preliminary system performance was evaluated by making a series of four measurements on two nominal 1.5 W samples and on a nominal 0.75 W sample. Data for these measurements indicate that the absolute accuracy (one standard deviation) is approximately 0.0035 W in this power range, resulting in an estimated relative one-standard-deviation accuracy of 0.24% at 1.5 W and 0.48% at 0.75 W.

  10. Sample size determination for mediation analysis of longitudinal data.

    Science.gov (United States)

    Pan, Haitao; Liu, Suyu; Miao, Danmin; Yuan, Ying

    2018-03-27

    Sample size planning for longitudinal data is crucial when designing mediation studies because sufficient statistical power is not only required in grant applications and peer-reviewed publications, but is essential to reliable research results. However, sample size determination is not straightforward for mediation analysis of longitudinal designs. To facilitate planning the sample size for longitudinal mediation studies with a multilevel mediation model, this article provides the sample sizes required to achieve 80% power, obtained by simulation under various sizes of the mediation effect, within-subject correlations and numbers of repeated measures. The sample size calculation is based on three commonly used mediation tests: Sobel's method, the distribution of the product method and the bootstrap method. Among the three, Sobel's method required the largest sample size to achieve 80% power. Bootstrapping and the distribution of the product method performed similarly and were more powerful than Sobel's method, as reflected by their relatively smaller required sample sizes. For all three methods, the sample size required to achieve 80% power depended on the ICC (i.e., the within-subject correlation); a larger ICC typically required a larger sample size. Simulation results also illustrated the advantage of the longitudinal study design. Sample size tables for the scenarios most often encountered in practice are also provided for convenient use. The extensive simulation study showed that the distribution of the product method and the bootstrapping method outperform Sobel's method; of the two, the product method is recommended in practice because of its lower computational load. An R package has been developed for sample size determination by the product method in longitudinal mediation study design.
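
    The mechanics of Sobel's method, the least powerful of the three tests compared above, can be sketched with a Monte Carlo power check. This is a simplified single-level mediation model, not the multilevel longitudinal model of the paper; the effect sizes, critical value, and simulation settings below are illustrative assumptions.

```python
import numpy as np

def ols_slope(x, y):
    """Slope and its standard error from a simple linear regression of y on x."""
    x = x - x.mean()
    beta = (x @ y) / (x @ x)
    resid = y - y.mean() - beta * x
    se = np.sqrt(resid @ resid / (len(x) - 2) / (x @ x))
    return beta, se

def sobel_z(a, se_a, b, se_b):
    """Sobel z-statistic for the indirect effect a*b."""
    return (a * b) / np.sqrt(b**2 * se_a**2 + a**2 * se_b**2)

def sobel_power(n, a=0.3, b=0.3, n_sim=1000, z_crit=1.96, seed=1):
    """Monte Carlo power of the Sobel test at sample size n
    (single-level mediation: X -> M -> Y, unit-variance noise)."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sim):
        x = rng.normal(size=n)
        m = a * x + rng.normal(size=n)
        y = b * m + rng.normal(size=n)
        a_hat, se_a = ols_slope(x, m)
        b_hat, se_b = ols_slope(m, y)
        hits += abs(sobel_z(a_hat, se_a, b_hat, se_b)) > z_crit
    return hits / n_sim
```

    Scanning `sobel_power` over a grid of n values reproduces, in miniature, the kind of power table the article provides for the longitudinal case.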

  11. A distance-dependent metal-enhanced fluorescence sensing platform based on molecular beacon design.

    Science.gov (United States)

    Zhou, Zhenpeng; Huang, Hongduan; Chen, Yang; Liu, Feng; Huang, Cheng Zhi; Li, Na

    2014-02-15

    A new metal-enhanced fluorescence (MEF) based platform was developed on the basis of a distance-dependent fluorescence quenching-enhancement effect, combining the ease of Ag-thiol chemistry with the MEF property of noble-metal structures and a molecular beacon design. For AgNPs of a given size, the fluorescence enhancement factor was found to increase with a d^6 dependency, in agreement with a fluorescence resonance energy transfer mechanism, at shorter distances, and to decrease with a d^-3 dependency, in agreement with a plasmonic enhancement mechanism, at longer distances between the fluorophore and the AgNP surface. As a proof of concept, the platform was demonstrated by sensitive detection of mercuric ions, using a thymine-containing molecular beacon to tune silver nanoparticle (AgNP)-enhanced fluorescence. Mercuric ions were detected via formation of a thymine-mercuric-thymine structure that opens the hairpin, facilitating fluorescence recovery and AgNP enhancement to yield a limit of detection of 1 nM, well below the U.S. Environmental Protection Agency Maximum Contaminant Level Goal (10 nM) in drinking water. Since the AgNP functioned not only as a quencher to reduce the reagent blank signal but also as an enhancement substrate to increase fluorescence of the open hairpin when target mercuric ions were present, the quenching-enhancement strategy can greatly improve detection sensitivity and can in principle be a universal approach for various targets when combined with molecular beacon design. © 2013 Elsevier B.V. All rights reserved.
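
    The reported distance dependence can be captured by a toy piecewise model. The crossover distance d_peak below is a hypothetical parameter for illustration, not a value from the study.

```python
def enhancement_factor(d, d_peak=10.0):
    """Toy model of the reported MEF distance dependence (illustrative only):
    enhancement rises as ~d^6 below the crossover distance d_peak, where
    FRET-type quenching dominates, and falls as ~d^-3 beyond it, where the
    plasmonic enhancement decays. Normalized so that E(d_peak) = 1."""
    if d <= 0:
        raise ValueError("distance must be positive")
    if d < d_peak:
        return (d / d_peak) ** 6
    return (d_peak / d) ** 3
```

    The sharp rise and slower decay around d_peak is why the fluorophore-to-surface distance set by the hairpin's open/closed state can switch the AgNP between quencher and enhancer.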

  12. Crude incidence in two-phase designs in the presence of competing risks

    Directory of Open Access Journals (Sweden)

    Paola Rebora

    2016-01-01

    Abstract Background In many studies some information might not be available for the whole cohort; some covariates, or even the outcome, might be ascertained only in selected subsamples. These studies belong to a broad category termed two-phase studies. Common examples include the nested case-control and case-cohort designs. For two-phase studies, appropriate weighted survival estimates have been derived; however, no estimator of cumulative incidence accounting for competing events has been proposed. This is relevant in the presence of multiple types of events, where estimation of event-type-specific quantities is needed for evaluating outcome. Methods We develop a nonparametric estimator of the cumulative incidence function of events that accounts for possible competing events. It handles a general sampling design through weights derived from the sampling probabilities. The variance is derived from the influence function of the subdistribution hazard. Results The proposed method shows good performance in simulations. It is applied to estimate the crude incidence of relapse in childhood acute lymphoblastic leukemia in groups defined by a genotype not available for everyone, in a cohort of nearly 2000 patients where death due to toxicity acted as a competing event. In a second example the aim was to estimate engagement in care in a cohort of HIV patients in a resource-limited setting, where for some patients the outcome itself was missing due to loss to follow-up. A sampling-based approach was used to ascertain the outcome in a subsample of lost patients and to obtain a valid estimate of connection to care. Conclusions A valid estimator for the cumulative incidence of events accounting for competing risks under a general sampling design from an infinite target population is derived.
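
    Under a two-phase design with complete follow-up, the weighting idea reduces to a Horvitz-Thompson estimate; a minimal sketch (ignoring censoring, which the paper's estimator handles via the subdistribution hazard) might look like:

```python
import numpy as np

def weighted_crude_incidence(times, causes, weights, t, cause=1):
    """Design-weighted (Horvitz-Thompson) crude incidence of `cause` by time t.

    Assumes complete follow-up (no censoring) for brevity; `weights` are the
    inverse sampling probabilities of the two-phase design, so fully sampled
    strata get weight 1 and subsampled strata get weight > 1."""
    times = np.asarray(times, dtype=float)
    causes = np.asarray(causes)
    w = np.asarray(weights, dtype=float)
    events = (times <= t) & (causes == cause)   # cause-specific events by t
    return float(np.sum(w * events) / np.sum(w))
```

    With all weights equal this is just the empirical proportion of cause-specific events by time t; unequal weights re-inflate the subsample back to the cohort.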

  13. Designing a monitoring program to estimate estuarine survival of anadromous salmon smolts: simulating the effect of sample design on inference

    Science.gov (United States)

    Romer, Jeremy D.; Gitelman, Alix I.; Clements, Shaun; Schreck, Carl B.

    2015-01-01

    A number of researchers have attempted to estimate salmonid smolt survival during outmigration through an estuary. However, it is currently unclear how the design of such studies influences the accuracy and precision of survival estimates. In this simulation study we consider four patterns of smolt survival probability in the estuary, and test the performance of several different sampling strategies for estimating estuarine survival assuming perfect detection. The four survival probability patterns each incorporate a systematic component (constant, linearly increasing, increasing and then decreasing, and two pulses) and a random component to reflect daily fluctuations in survival probability. Generally, spreading sampling effort (tagging) across the season resulted in more accurate estimates of survival. All sampling designs in this simulation tended to under-estimate the variation in the survival estimates because seasonal and daily variation in survival probability are not incorporated in the estimation procedure. This under-estimation results in poorer performance of estimates from larger samples. Thus, tagging more fish may not result in better estimates of survival if important components of variation are not accounted for. The results of our simulation incorporate survival probabilities and run distribution data from previous studies to help illustrate the tradeoffs among sampling strategies in terms of the number of tags needed and distribution of tagging effort. This information will assist researchers in developing improved monitoring programs and encourage discussion regarding issues that should be addressed prior to implementation of any telemetry-based monitoring plan. We believe implementation of an effective estuary survival monitoring program will strengthen the robustness of life cycle models used in recovery plans by providing missing data on where and how much mortality occurs in the riverine and estuarine portions of smolt migration. 
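
    The simulation's central point, that spreading tagging effort captures seasonal variation in survival while clustered effort does not, can be reproduced in a few lines. The season length, survival pattern ('two pulses'), and tag count below are invented for illustration and assume perfect detection, as in the study.

```python
import numpy as np

def estimate_survival(release_days, daily_survival, rng):
    """Simulate binomial survival outcomes for fish tagged on given days
    (perfect detection) and return the pooled survival estimate."""
    p = daily_survival[release_days]
    survived = rng.random(len(release_days)) < p
    return survived.mean()

rng = np.random.default_rng(42)
season = 100  # days in the outmigration season (hypothetical)

# 'two pulses' pattern: baseline survival with two low-survival windows
daily_survival = np.full(season, 0.8)
daily_survival[20:30] = 0.4
daily_survival[60:70] = 0.4
true_mean_survival = daily_survival.mean()   # 0.72

n_tags = 200
spread = np.linspace(0, season - 1, n_tags).astype(int)  # effort over whole season
clustered = np.full(n_tags, 10)                          # all tags in one early week

est_spread = estimate_survival(spread, daily_survival, rng)
est_clustered = estimate_survival(clustered, daily_survival, rng)
```

    The spread design estimates the season-wide mean (~0.72 here), while the clustered design only sees survival around its release day, illustrating why tagging more fish at one time cannot fix a biased allocation of effort.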

  14. Could Learning Outcomes of the First Course in Accounting Predict Overall Academic Performance?

    Science.gov (United States)

    Alanzi, Khalid A.; Alfraih, Mishari M.

    2017-01-01

    Purpose: This study aims to question whether learning outcomes of the first course in accounting could predict the overall academic performance of accounting students, as measured by their graduating grade point average (GPA). Design/methodology/approach: The sample of the present study was drawn from accounting students who graduated during…

  15. Social anxiety symptoms in alcohol-dependent outpatients: prevalence, severity and predictors

    Directory of Open Access Journals (Sweden)

    Nicoli Tamie Yoshimi

    2016-06-01

    ABSTRACT Objectives High rates of comorbidity between social anxiety disorder (SAD) and alcohol use disorders have been reported, but the predictors of this comorbidity are poorly known and most studies involve primary SAD samples. The aims were to estimate the prevalence and severity of SAD symptoms among alcohol-dependent patients and to investigate sociodemographic and clinical factors associated with SAD comorbidity, including suicidal behaviors. Methods A cross-sectional study with 53 adults who were in treatment for alcohol dependence at a Brazilian public university outpatient service. Assessment instruments: the Social Phobia Inventory (SPIN), the Short Alcohol Dependence Data and the Beck Depression Inventory. Bivariate analyses between the categorical outcome (probable SAD: SPIN ≥ 19) and explanatory variables were conducted. Correlates of SPIN total and subscale scores (dimensional outcomes) were also investigated. Results The diagnosis and treatment of alcohol dependence occurred, on average, 30 years after the onset of alcohol use, and 39.6% of the 53 patients (37 men and 16 women) reported alleviation of social anxiety symptoms with alcohol use. Twenty-four (45.3%) patients presented probable SAD. These patients differed from non-SAD alcohol-dependent individuals by having lower income and higher frequencies of depression, suicidal ideation, suicide plans and suicide attempts. The SPIN subscales most associated with suicidal behaviors were social inadequacy and social inferiority. Conclusions SAD symptoms are common among help-seeking alcohol-dependent individuals and should be directly investigated and treated, since depression and suicidality are associated with this comorbidity. Prospective studies are needed to assess the impact of SAD treatment on the clinical course of alcohol dependence.

  16. Employment-Based Abstinence Reinforcement as a Maintenance Intervention for the Treatment of Cocaine Dependence: A Randomized Controlled Trial

    Science.gov (United States)

    DeFulio, Anthony; Donlin, Wendy D.; Wong, Conrad J.; Silverman, Kenneth

    2009-01-01

    Context: Due to the chronic nature of cocaine dependence, long-term maintenance treatments may be required to sustain abstinence. Abstinence reinforcement is among the most effective means of initiating cocaine abstinence. Practical and effective means of maintaining abstinence reinforcement programs over time are needed. Objective: To determine whether employment-based abstinence reinforcement can be an effective long-term maintenance intervention for cocaine dependence. Design: Participants (N=128) were enrolled in a 6-month job skills training and abstinence initiation program. Participants who initiated abstinence, attended regularly, and developed needed job skills during the first six months were hired as operators in a data entry business and randomly assigned to an employment-only (control, n = 24) or abstinence-contingent employment (n = 27) group. Setting: A nonprofit data entry business. Participants: Unemployed welfare recipients who persistently used cocaine while enrolled in methadone treatment in Baltimore. Intervention: Abstinence-contingent employment participants received one year of employment-based contingency management, in which access to employment was contingent on provision of drug-free urine samples under routine and then random drug testing. If a participant provided a drug-positive urine sample or failed to provide a mandatory sample, that participant received a temporary reduction in pay and could not work until urinalysis confirmed recent abstinence. Main Outcome Measure: Cocaine-negative urine samples at monthly assessments across one year of employment. Results: During the year of employment, abstinence-contingent employment participants provided significantly more cocaine-negative urine samples than employment-only participants (79.3% and 50.7%, respectively; p = 0.004, OR = 3.73, 95% CI = 1.60 – 8.69). Conclusions: Employment-based abstinence reinforcement that includes random drug testing is effective as a long-term maintenance

  17. Design and Use of a Full Flow Sampling System (FFS) for the Quantification of Methane Emissions.

    Science.gov (United States)

    Johnson, Derek R; Covington, April N; Clark, Nigel N

    2016-06-12

    The use of natural gas continues to grow with increased discovery and production of unconventional shale resources. At the same time, the natural gas industry faces continued scrutiny over methane emissions from across the supply chain, due to methane's relatively high global warming potential (25-84x that of carbon dioxide, according to the Energy Information Administration). A variety of techniques with varied uncertainties exists to measure or estimate methane emissions from components or facilities, but only one commercial system is currently available for quantification of component-level emissions, and recent reports have highlighted its weaknesses. In order to improve accuracy and increase measurement flexibility, we have designed, developed, and implemented a novel full flow sampling system (FFS) for quantification of methane emissions and greenhouse gases based on transportation emissions measurement principles. The FFS is a modular system that consists of an explosion-proof blower(s), mass airflow sensor(s) (MAF), thermocouple, sample probe, constant volume sampling pump, laser-based greenhouse gas sensor, data acquisition device, and analysis software. Depending upon the blower and hose configuration employed, the current FFS is able to achieve flow rates ranging from 40 to 1,500 standard cubic feet per minute (SCFM). Utilization of laser-based sensors mitigates interference from higher hydrocarbons (C2+). Co-measurement of water vapor allows for humidity correction. The system is portable, with multiple configurations for a variety of applications, ranging from being carried by a person to being mounted in a hand-drawn cart, an on-road vehicle bed, or the bed of a utility terrain vehicle (UTV). The FFS is able to quantify methane emission rates with a relative uncertainty of ± 4.4%. The FFS has proven real-world operation for the quantification of methane emissions occurring at conventional and remote facilities.
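
    The core of the quantification principle, emission rate as total captured flow times measured concentration, can be sketched as follows. The reference conditions, the form of the humidity correction, and all numeric constants are our assumptions for illustration, not values from the paper.

```python
def methane_mass_rate(flow_scfm, ch4_ppm, h2o_ppm=0.0):
    """Methane mass emission rate (g/h) from a full-flow sample: a simplified
    sketch of the FFS principle, rate = total flow x methane mole fraction.

    Assumes standard conditions of 101.325 kPa and 288.7 K (60 F); the
    system's actual reference conditions and corrections may differ.
    h2o_ppm applies a crude wet-to-dry basis correction."""
    SCF_TO_M3 = 0.0283168   # m^3 per standard cubic foot
    MOLAR_VOL = 0.02369     # m^3/mol, ideal gas at 101.325 kPa and 288.7 K
    M_CH4 = 16.043          # g/mol, molar mass of methane

    dry_fraction = 1.0 - h2o_ppm * 1e-6
    mol_total_per_h = flow_scfm * 60 * SCF_TO_M3 / MOLAR_VOL
    return mol_total_per_h * (ch4_ppm * 1e-6) / dry_fraction * M_CH4
```

    For example, 100 SCFM of captured flow at 100 ppm CH4 works out to roughly 11.5 g/h under these assumed conditions; the measured ± 4.4% relative uncertainty would then propagate directly onto this rate.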

  18. Design and Use of a Full Flow Sampling System (FFS) for the Quantification of Methane Emissions

    Science.gov (United States)

    Johnson, Derek R.; Covington, April N.; Clark, Nigel N.

    2016-01-01

    The use of natural gas continues to grow with increased discovery and production of unconventional shale resources. At the same time, the natural gas industry faces continued scrutiny over methane emissions from across the supply chain, due to methane's relatively high global warming potential (25-84x that of carbon dioxide, according to the Energy Information Administration). A variety of techniques with varied uncertainties exists to measure or estimate methane emissions from components or facilities, but only one commercial system is currently available for quantification of component-level emissions, and recent reports have highlighted its weaknesses. In order to improve accuracy and increase measurement flexibility, we have designed, developed, and implemented a novel full flow sampling system (FFS) for quantification of methane emissions and greenhouse gases based on transportation emissions measurement principles. The FFS is a modular system that consists of an explosion-proof blower(s), mass airflow sensor(s) (MAF), thermocouple, sample probe, constant volume sampling pump, laser-based greenhouse gas sensor, data acquisition device, and analysis software. Depending upon the blower and hose configuration employed, the current FFS is able to achieve flow rates ranging from 40 to 1,500 standard cubic feet per minute (SCFM). Utilization of laser-based sensors mitigates interference from higher hydrocarbons (C2+). Co-measurement of water vapor allows for humidity correction. The system is portable, with multiple configurations for a variety of applications, ranging from being carried by a person to being mounted in a hand-drawn cart, an on-road vehicle bed, or the bed of a utility terrain vehicle (UTV). The FFS is able to quantify methane emission rates with a relative uncertainty of ± 4.4%. The FFS has proven real-world operation for the quantification of methane emissions occurring at conventional and remote facilities. PMID:27341646

  19. Systematic review of psychosocial outcomes for patients with advanced melanoma.

    Science.gov (United States)

    Dunn, Jeff; Watson, Maggie; Aitken, Joanne F; Hyde, Melissa K

    2017-11-01

    New advanced melanoma therapies are associated with improved survival; however, the quality of survivorship, particularly psychosocial outcomes, is unclear for patients overall and for those treated with newer therapies. The aim was to synthesize qualitative and quantitative evidence about psychosocial outcomes for advanced (stage III/IV) melanoma patients. Five databases were searched (01/01/1980 to 31/01/2016). Inclusion criteria were as follows: advanced melanoma patients or sub-group analysis; assessed psychosocial outcomes; and English language. Fifty-two studies met review criteria (4 qualitative, 48 quantitative). Trials comprised mostly medical rather than psychosocial interventions, with psychosocial outcomes assessed within broader quality-of-life measures. Patients receiving chemotherapy or IFN-alpha showed decreased emotional and social function and increased distress. Five trials of newer therapies appeared to show improvements in emotional and social function. Descriptive studies suggest that patients with advanced, versus localized, disease had decreased emotional and social function and increased distress. Contributors to distress were largely unexplored, and no clear framework described coping/adjustment trajectories. Patients with advanced versus localized disease had more supportive care needs, particularly regarding the amount, quality, and timing of melanoma-related information, and communication with and emotional support from clinicians. Limitations included lack of theoretical underpinnings guiding study design, inconsistent measurement approaches, small sample sizes, non-representative sampling, and cross-sectional designs. Quality trial evidence is needed to clarify the impact of treatment innovations for advanced melanoma on patients' psychosocial well-being. Survivorship research, and the subsequent translation of that knowledge into programs and services, currently lags behind gains in the medical treatment of advanced melanoma, a troubling circumstance that requires immediate and focused

  20. Optimizing sampling design to deal with mist-net avoidance in Amazonian birds and bats.

    Directory of Open Access Journals (Sweden)

    João Tiago Marques

    Mist netting is a widely used technique to sample bird and bat assemblages. However, captures often decline with time because animals learn and avoid the locations of nets. This avoidance, or net shyness, can substantially decrease sampling efficiency. We quantified the day-to-day decline in captures of Amazonian birds and bats with mist nets set at the same location for four consecutive days. We also evaluated how net avoidance influences the efficiency of surveys under different logistic scenarios using re-sampling techniques. Net avoidance caused substantial declines in bird and bat captures, though more pronounced in the latter. Most of the decline occurred between the first and second days of netting: 28% in birds and 47% in bats. Captures of commoner species were more affected, and the numbers of species detected also declined. Moving nets daily to minimize the avoidance effect increased captures by 30% in birds and 70% in bats. However, moving the location of nets may cause a reduction in netting time and captures: when moving the nets cost one netting day, it was no longer advantageous to move them frequently, and in bird surveys doing so could even decrease the number of individuals captured and species detected. Net avoidance can greatly affect sampling efficiency, but adjustments in survey design can minimize this. Whenever nets can be moved without losing netting time and the objective is to capture many individuals, they should be moved daily. If the main objective is to survey the species present, then nets should still be moved for bats, but not for birds. However, if relocating nets causes a significant loss of netting time, moving them to reduce the effects of shyness will not improve sampling efficiency in either group. Overall, our findings can improve the design of mist netting sampling strategies in other tropical areas.

  1. Choice of Sample Split in Out-of-Sample Forecast Evaluation

    DEFF Research Database (Denmark)

    Hansen, Peter Reinhard; Timmermann, Allan

    Out-of-sample tests of forecast performance depend on how a given data set is split into estimation and evaluation periods, yet no guidance exists on how to choose the split point. Empirical forecast evaluation results can therefore be difficult to interpret, particularly when several values … while conversely the power of forecast evaluation tests is strongest with long out-of-sample periods. To deal with size distortions, we propose a test statistic that is robust to the effect of considering multiple sample split points. Empirical applications to predictability of stock returns and inflation demonstrate that out-of-sample forecast evaluation results can critically depend on how the sample split is determined.
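
    The sensitivity to the split point is easy to demonstrate: evaluating even a trivial expanding-window mean forecast at several candidate splits gives split-dependent out-of-sample MSEs. This sketch is purely illustrative and is not the authors' robust test statistic.

```python
import numpy as np

def oos_mse_by_split(y, splits):
    """Out-of-sample MSE of a recursive mean forecast for each candidate
    split point s: estimate on y[:s], then evaluate one-step-ahead on
    y[s:], expanding the estimation window at each step."""
    results = {}
    for s in splits:
        errors = []
        for t in range(s, len(y)):
            forecast = y[:t].mean()          # expanding-window estimate
            errors.append(y[t] - forecast)
        results[s] = float(np.mean(np.square(errors)))
    return results
```

    On real return or inflation series, the MSEs (and hence test outcomes) returned for different splits can differ materially, which is the interpretability problem the robust statistic addresses.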

  2. Aerosol sampling of an experimental fluidized bed coal combustor

    International Nuclear Information System (INIS)

    Newton, G.J.; Peele, E.R.; Carpenter, R.L.; Yeh, H.C.

    1977-01-01

    Fluidized bed combustion of coal, lignite or other materials has the potential for widespread use in central electric generating stations in the near future. This technology may allow widespread use of low-grade and/or high-sulfur fuels due to its high energy utilization at low combustion temperature and its ability to meet emission criteria by using limestone bed material. Particulate and gaseous products resulting from fuel combustion and fluidization of bed material are discharged and proceed through the exhaust clean-up system. The sampling philosophy, methodology and equipment used to obtain aerosol samples from the exhaust system of the 18-inch fluidized bed combustor (FBC) at the Morgantown Energy Research Center (MERC) are described. Identification of sampling sites led to the design of an aerosol sampling train that allowed a known quantity of the effluent streams to be sampled. Depending on the position, a 15 to 25 L/min sample is extracted from the duct, immediately diluted and transferred to a sampling/aging chamber. Transmission and scanning electron microscope samples, two types of cascade impactor samples, vapor-phase and particulate-phase organic samples, spiral duct aerosol centrifuge samples, optical size measurements and filter samples were obtained. Samples are undergoing physical, chemical and biological tests to help establish human health risk estimates for fluidized bed coal combustion and to provide information for use in the design and evaluation of control technologies

  3. Comparisons of three nicotine dependence scales in a multiethnic sample of young adult menthol and non-menthol smokers.

    Science.gov (United States)

    Fagan, Pebbles; Pohkrel, Pallav; Herzog, Thaddeus; Pagano, Ian; Vallone, Donna; Trinidad, Dennis R; Sakuma, Kari-Lyn; Sterling, Kymberle; Fryer, Craig S; Moolchan, Eric

    2015-04-01

    Few studies have compared nicotine dependence among menthol and non-menthol cigarette smokers in a multiethnic sample of young adult daily cigarette smokers. This study examines differences in nicotine dependence between menthol and non-menthol daily smokers, and the associations of nicotine dependence with quitting behaviors, among Native Hawaiian, Filipino, and White cigarette smokers aged 18-35. Craigslist.org, newspaper advertisements, and peer-to-peer referrals were used to recruit daily smokers (n = 186) into a lab-based study. Nicotine dependence was assessed using the Fagerstrom Test for Nicotine Dependence (FTND), the Nicotine Dependence Syndrome Scale (NDSS), and the brief Wisconsin Inventory of Smoking Dependence Motives (WISDM). Multiple regression analyses were used to examine differences in nicotine dependence between menthol and non-menthol smokers and the relationship of each nicotine dependence scale with self-efficacy to quit, quit attempts in the past 12 months, and number of attempts. Menthol smokers were more likely to report difficulty refraining from smoking in places where it is forbidden (p = .04) and had higher scores on the social/environmental goads subscale of the WISDM (p = .0005). Two-way interaction models of the FTND and menthol status showed that menthol smokers with higher levels of dependence were more likely to have tried to quit smoking in the past 12 months (p = .02), but were less likely to have had multiple quit attempts (p = .01). Components of the FTND and WISDM distinguish levels of dependence between menthol and non-menthol smokers. Higher FTND scores were associated with having made a quit attempt, but with fewer quit attempts, among menthol smokers. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  4. Real-time photonic sampling with improved signal-to-noise and distortion ratio using polarization-dependent modulators

    Science.gov (United States)

    Liang, Dong; Zhang, Zhiyao; Liu, Yong; Li, Xiaojun; Jiang, Wei; Tan, Qinggui

    2018-04-01

    A real-time photonic sampling structure with effective nonlinearity suppression and excellent signal-to-noise ratio (SNR) performance is proposed. The key elements of this scheme are the polarization-dependent modulators (P-DMZMs) and the Sagnac loop structure. Thanks to the polarization-sensitive characteristic of P-DMZMs, the differences between the transfer functions of the fundamental signal and the distortion become visible. Meanwhile, the selection of specific bias points in the P-DMZMs helps achieve preferable linearized performance with a low noise level for real-time photonic sampling. Compared with the quadrature-biased scheme, the proposed scheme is capable of valid nonlinearity suppression and provides better SNR performance even over a large frequency range. The proposed scheme is shown to be effective and easily implemented for real-time photonic applications.

  5. Discrepancies in sample size calculations and data analyses reported in randomised trials: comparison of publications with protocols

    DEFF Research Database (Denmark)

    Chan, A.W.; Hrobjartsson, A.; Jorgensen, K.J.

    2008-01-01

    OBJECTIVE: To evaluate how often sample size calculations and methods of statistical analysis are pre-specified or changed in randomised trials. DESIGN: Retrospective cohort study. Data source Protocols and journal publications of published randomised parallel group trials initially approved...... in 1994-5 by the scientific-ethics committees for Copenhagen and Frederiksberg, Denmark (n=70). MAIN OUTCOME MEASURE: Proportion of protocols and publications that did not provide key information about sample size calculations and statistical methods; proportion of trials with discrepancies between...... of handling missing data was described in 16 protocols and 49 publications. 39/49 protocols and 42/43 publications reported the statistical test used to analyse primary outcome measures. Unacknowledged discrepancies between protocols and publications were found for sample size calculations (18/34 trials...

  6. Sample size requirements for separating out the effects of combination treatments: Randomised controlled trials of combination therapy vs. standard treatment compared to factorial designs for patients with tuberculous meningitis

    Directory of Open Access Journals (Sweden)

    Farrar Jeremy

    2011-02-01

Full Text Available Background: In certain diseases clinical experts may judge that the intervention with the best prospects is the addition of two treatments to the standard of care. This can either be tested with a simple randomized trial of combination versus standard treatment or with a 2 × 2 factorial design. Methods: We compared the two approaches using the design of a new trial in tuberculous meningitis as an example. In that trial the combination of two drugs added to standard treatment is assumed to reduce the hazard of death by 30%, and the sample size of the combination trial to achieve 80% power is 750 patients. We calculated the power of corresponding factorial designs with one- to sixteen-fold the sample size of the combination trial, depending on the contribution of each individual drug to the combination treatment effect and the strength of an interaction between the two. Results: In the absence of an interaction, an eight-fold increase in sample size for the factorial design as compared to the combination trial is required to get 80% power to jointly detect effects of both drugs if the contribution of the less potent treatment to the total effect is at least 35%. An eight-fold sample size increase also provides a power of 76% to detect a qualitative interaction at the one-sided 10% significance level if the individual effects of both drugs are equal. Factorial designs with a lower sample size have a high chance to be underpowered, to show significance of only one drug even if both are equally effective, and to miss important interactions. Conclusions: Pragmatic combination trials of multiple interventions versus standard therapy are valuable in diseases with a limited patient pool if all interventions test the same treatment concept, it is considered likely that either both or none of the individual interventions are effective, and only moderate drug interactions are suspected. An adequately powered 2 × 2 factorial design to detect effects of
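The log-rank sample-size arithmetic behind a trial like this can be sketched with Schoenfeld's approximation. The 30% hazard reduction (HR = 0.70), 80% power, and 1:1 allocation come from the abstract; the overall event probability of roughly one third used to convert events into patients is an illustrative assumption, not a figure reported by the trial.

```python
from statistics import NormalDist
import math

def events_required(hr, power=0.80, alpha=0.05, alloc=0.5):
    """Schoenfeld's approximation: number of deaths needed for a log-rank
    test to detect hazard ratio `hr` at the given power and two-sided alpha,
    with allocation fraction `alloc` to one arm."""
    z = NormalDist()
    za = z.inv_cdf(1 - alpha / 2)
    zb = z.inv_cdf(power)
    return (za + zb) ** 2 / (alloc * (1 - alloc) * math.log(hr) ** 2)

d = events_required(0.70)
print(f"events needed: {math.ceil(d)}")  # ≈ 247 deaths

# Assuming (for illustration only) that about one patient in three dies
# during follow-up, the required enrolment is roughly:
print(f"patients needed: {math.ceil(d / (1 / 3))}")
```

With that assumed event rate the patient count lands in the same neighborhood as the 750 quoted in the abstract; the factorial-design comparisons then multiply this base figure.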

  7. A National Long-term Outcomes Evaluation of U.S. Premedical Postbaccalaureate Programs Designed to Promote Health care Access and Workforce Diversity.

    Science.gov (United States)

    McDougle, Leon; Way, David P; Lee, Winona K; Morfin, Jose A; Mavis, Brian E; Matthews, De'Andrea; Latham-Sadler, Brenda A; Clinchot, Daniel M

    2015-08-01

    The National Postbaccalaureate Collaborative (NPBC) is a partnership of Postbaccalaureate Programs (PBPs) dedicated to helping promising college graduates from disadvantaged and underrepresented backgrounds get into and succeed in medical school. This study aims to determine long-term program outcomes by looking at PBP graduates, who are now practicing physicians, in terms of health care service to the poor and underserved and contribution to health care workforce diversity. We surveyed the PBP graduates and a randomly drawn sample of non-PBP graduates from the affiliated 10 medical schools stratified by the year of medical school graduation (1996-2002). The PBP graduates were more likely to be providing care in federally designated underserved areas and practicing in institutional settings that enable access to care for vulnerable populations. The NPBC graduates serve a critical role in providing access to care for underserved populations and serve as a source for health care workforce diversity.

  8. Observer-Based Stabilization of Spacecraft Rendezvous with Variable Sampling and Sensor Nonlinearity

    Directory of Open Access Journals (Sweden)

    Zhuoshi Li

    2013-01-01

    Full Text Available This paper addresses the observer-based control problem of spacecraft rendezvous with nonuniform sampling period. The relative dynamic model is based on the classical Clohessy-Wiltshire equation, and sensor nonlinearity and sampling are considered together in a unified framework. The purpose of this paper is to perform an observer-based controller synthesis by using sampled and saturated output measurements, such that the resulting closed-loop system is exponentially stable. A time-dependent Lyapunov functional is developed which depends on time and the upper bound of the sampling period and also does not grow along the input update times. The controller design problem is solved in terms of the linear matrix inequality method, and the obtained results are less conservative than using the traditional Lyapunov functionals. Finally, a numerical simulation example is built to show the validity of the developed sampled-data control strategy.

  9. Identification of Novel Signal Transduction, Immune Function, and Oxidative Stress Genes and Pathways by Topiramate for Treatment of Methamphetamine Dependence Based on Secondary Outcomes

    Directory of Open Access Journals (Sweden)

    Tianhua Niu

    2017-12-01

Full Text Available Background: Topiramate (TPM) is suggested to be a promising medication for treatment of methamphetamine (METH) dependence, but the molecular basis remains to be elucidated. Methods: Among 140 METH-dependent participants randomly assigned to receive either TPM (N = 69) or placebo (N = 71) in a previously conducted randomized controlled trial, 50 TPM- and 49 placebo-treated participants had a total of 212 RNA samples available at baseline, week 8, and week 12 time points. Following our primary analysis of gene expression data, we reanalyzed the microarray expression data based on a latent class analysis of binary secondary outcomes during weeks 1–12 that provided a classification of 21 responders and 31 non-responders with consistent responses at both time points. Results: Based on secondary outcomes, 1,381, 576, 905, and 711 differentially expressed genes at nominal P values < 0.05 were identified in responders versus non-responders for the week 8 TPM, week 8 placebo, week 12 TPM, and week 12 placebo groups, respectively. Among the 1,381 genes identified in week 8 TPM responders, 359 genes were identified in both the week 8 and week 12 TPM groups, of which 300 genes were exclusively detected in TPM responders. Of them, 32 genes had nominal P values < 5 × 10−3 at either week 8 or week 12 and false discovery rates < 0.15 at both time points with consistent directions of gene expression changes, which include GABARAPL1, GPR155, and IL15RA in GABA receptor signaling that represent direct targets for TPM. Analyses of these 300 genes revealed 7 enriched pathways belonging to neuronal function/synaptic plasticity, signal transduction, inflammation/immune function, and oxidative stress response categories. No pathways were enriched for the 72 genes exclusively detected in both the week 8 and week 12 placebo groups. Conclusion: This secondary analysis study of gene expression data from a TPM clinical trial not only yielded consistent results with those of primary
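Screening thousands of genes at nominal P values and then thresholding on false discovery rates, as this record describes, is typically done with the Benjamini–Hochberg step-up procedure. The sketch below is a generic illustration with invented p-values, not the study's actual pipeline.

```python
def benjamini_hochberg(pvals):
    """Benjamini-Hochberg adjusted p-values (step-up FDR): sort p-values,
    scale each by m/rank, then enforce monotonicity from the largest down."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adj = [0.0] * m
    prev = 1.0
    for rank in range(m, 0, -1):
        i = order[rank - 1]          # index of the rank-th smallest p-value
        prev = min(prev, pvals[i] * m / rank)
        adj[i] = prev
    return adj

# Ten hypothetical nominal p-values from a differential-expression screen.
pv = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.216]
fdr = benjamini_hochberg(pv)
print([round(q, 3) for q in fdr])
```

Genes would then be kept if their adjusted value falls below the chosen cutoff (0.15 in the study above).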

  10. The adaptive and maladaptive faces of dependency in later life: links to physical and psychological health outcomes.

    Science.gov (United States)

    Fiori, Katherine; Consedine, Nathan; Magai, Carol

    2008-11-01

    Negotiating the balance between reliance on others and desires for autonomy is a fundamental task of successful aging. The purpose of the present study was to replicate and extend a three-factor model of interpersonal dependency in a sample of older adults, and to examine the physical and psychological health correlates of this multifaceted construct. Data come from the third wave of a population-based study of older Americans (n = 166; mean age 80 years). We conducted an exploratory factor analysis of selected dependency items from two scales, and then conducted logistic and hierarchical linear regressions to analyze the association of dependency factors with self-reported health, use of hypertension medication, depressed affect and positive affect. We found three factors closely paralleling those of Bornstein and Languirand's (Psychological Bulletin, 112(1), 3-23, 2004) measure: destructive overdependence, healthy dependency and dysfunctional detachment, as well as a fourth factor we labeled 'healthy independence'. Healthy dependency was associated with better self-reported health. Dysfunctional detachment was related to a greater likelihood and healthy independence a lesser likelihood of taking hypertension medication. Whereas both healthy independence and healthy dependency were positively related to positive affect and negatively related to depressed affect, destructive overdependence was positively related to depressed affect. Understanding the complex nature of interpersonal dependency and autonomy in old age, as well as their implications for health and wellbeing, may enable practitioners to assist older adults in negotiating the task of balancing these needs.

  11. Illness severity and self-efficacy as course predictors of DSM-IV alcohol dependence in a multisite clinical sample.

    Science.gov (United States)

    Langenbucher, J; Sulesund, D; Chung, T; Morgenstern, J

    1996-01-01

    Illness severity and self-efficacy are two constructs of growing interest as predictors of clinical response in alcoholism. Using alternative measures of illness severity (DSM-IV symptom count, Alcohol Dependence Scale, and Addiction Severity Index) and self-efficacy (brief version of the Situational Confidence Questionnaire) rigorously controlled for theoretically important background variables, we studied their unique contribution to multiple indices of relapse, relapse latency, and use of alternative coping behaviors in a large, heterogeneous clinical sample. The Alcohol Dependence Scale contributed to the prediction of 4 of 5 relapse indicators. The SCQ failed to predict relapse behavior or its precursor, coping response. The findings emphasize the predictive validity of severity of dependence as a course specifier and underline the need for more sensitive and externally valid measures of cognitive processes such as self-efficacy for application in future studies of posttreatment behavior.

  12. Changes in Patient and Nurse Outcomes Associated with Magnet Hospital Recognition

    Science.gov (United States)

    Kutney-Lee, Ann; Stimpfel, Amy Witkoski; Sloane, Douglas M.; Cimiotti, Jeannie P.; Quinn, Lisa W.; Aiken, Linda H.

    2015-01-01

Background Research has documented an association between Magnet hospitals and better outcomes for nurses and patients. However, little longitudinal evidence exists to support a causal link between Magnet recognition and outcomes. Objective To compare changes over time in surgical patient outcomes, nurse-reported quality, and nurse outcomes in a sample of hospitals that attained Magnet recognition between 1999 and 2007 with hospitals that remained non-Magnet. Research Design Retrospective, two-stage panel design using four secondary data sources. Subjects 136 Pennsylvania hospitals (11 “emerging” Magnets and 125 non-Magnets). Measures American Nurses Credentialing Center Magnet recognition; risk-adjusted rates of surgical 30-day mortality and failure-to-rescue, nurse-reported quality measures, and nurse outcomes; the Practice Environment Scale of the Nursing Work Index Methods Fixed effects difference models were used to compare changes in outcomes between emerging Magnet hospitals and hospitals that remained non-Magnet. Results Emerging Magnet hospitals demonstrated markedly greater improvements in their work environments than other hospitals. On average, the changes in 30-day surgical mortality and failure-to-rescue rates over the study period were more pronounced in emerging Magnet hospitals than in non-Magnet hospitals, by 2.4 fewer deaths per 1000 patients. Similar differences between emerging Magnet hospitals and non-Magnet hospitals were observed in nurse-reported quality of care and nurse outcomes. Conclusions In general, Magnet recognition is associated with significant improvements over time in the quality of the work environment, and in patient and nurse outcomes that exceed those of non-Magnet hospitals. PMID:25906016

  13. Designing a Prognostic Scoring System for Predicting the Outcomes of Proximal Fifth Metatarsal Fractures at 20 Weeks

    Directory of Open Access Journals (Sweden)

    Mohammad Ali Tahririan

    2015-03-01

    Full Text Available Background: Fractures of the proximal fifth metatarsal bone are among the most common fractures observed in the foot and their classification and management has been subject to much discussion and disagreement. In this study, we aim to identify and quantify the effect of possible predictors of the outcome of the treatment of proximal fifth metatarsal fractures. Methods: Patients with established proximal fifth metatarsal fractures were enrolled in this prospective cohort and the outcome of their treatment was assessed using the AOFAS mid foot scale at 6 and 20 weeks. Results: 143 patients were included in the study. Our study showed that displacement, weight and type III fractures were significant independent predictors of poor outcome at 6 weeks while at 20 weeks in addition to these factors, gender and diabetes mellitus were also shown to be significant independent predictors of poor outcome. A scoring system was designed by assigning weight to these factors and it was shown to be a strong predictor of outcome at 20 weeks. Conclusion: We recommend that our scoring system would help surgeons to decide whether patients’ prognostic factors are significant enough for him/her to opt for a surgical approach to treatment rather than a conservative approach.

  14. Dependence Of The Structure And Magnetic Properties Of Cast Plate-Shaped Nd60Fe30Al10 Samples On Their Thickness

    Directory of Open Access Journals (Sweden)

    Michalski B.

    2015-09-01

Full Text Available The hard magnetic Nd-Fe-Al alloys are inferior to Nd-Fe-B magnets as far as the magnetic properties are concerned, but their great advantage is that they need no additional annealing to achieve good magnetic properties. These properties depend on the cooling rate from the melting state, and on the thickness of the sample - the best values are achieved at the quenching rates at which the samples have a thickness of 0.3-2 mm. The present study is concerned with the correlation between the magnetic properties of the plate-shaped Nd60Fe30Al10 samples and their thickness. Two casting methods were used, with the melt stream directed either perpendicular or parallel to the surface of the plates. The plates were produced by pressure casting and suction casting. The studies have shown that the cooling rate depends on the local flow of the liquid metal in the mold, resulting in heterogeneity of structure and properties.

  15. No interactions between genetic polymorphisms and stressful life events on outcome of antidepressant treatment

    DEFF Research Database (Denmark)

    Bukh, Jens Drachmann; Bock, Camilla; Vinberg, Maj

    2009-01-01

    Genetic polymorphisms seem to influence the response on antidepressant treatment and moderate the impact of stress on depression. The present study aimed to assess, whether allelic variants and stressful life events interact on the clinical outcome of depression. In a sample of 290 systematically...... recruited patients diagnosed with a single depressive episode according to ICD-10, we assessed the outcome of antidepressant treatment and the presence of stressful life events in a 6-month period preceding onset of depression by means of structured interviews. Further, we genotyped nine polymorphisms...... dependent on stressful life events experienced by the individual prior to onset of depression....

  16. Fixed-location hydroacoustic monitoring designs for estimating fish passage using stratified random and systematic sampling

    International Nuclear Information System (INIS)

    Skalski, J.R.; Hoffman, A.; Ransom, B.H.; Steig, T.W.

    1993-01-01

Five alternate sampling designs are compared using 15 d of 24-h continuous hydroacoustic data to identify the most favorable approach to fixed-location hydroacoustic monitoring of salmonid outmigrants. Four alternative approaches to systematic sampling are compared among themselves and with stratified random sampling (STRS). Stratifying systematic sampling (STSYS) on a daily basis is found to reduce sampling error in multiday monitoring studies. Although sampling precision was predictable with varying levels of effort in STRS, neither the magnitude nor the direction of change in precision was predictable when effort was varied in systematic sampling (SYS). Furthermore, modifying systematic sampling to include replicated (e.g., nested) sampling (RSYS) is shown to provide unbiased point and variance estimates, as does STRS. Numerous short sampling intervals (e.g., 12 samples of 1-min duration per hour) must be monitored hourly using RSYS to provide efficient, unbiased point and interval estimates. For equal levels of effort, STRS outperformed all variations of SYS examined. Parametric approaches to confidence interval estimates are found to be superior to nonparametric interval estimates (i.e., bootstrap and jackknife) in estimating total fish passage. 10 refs., 1 fig., 8 tabs
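The core comparison here, stratified random sampling (STRS) versus systematic sampling (SYS) of short intervals to estimate total passage, can be mimicked in a small simulation. The diel passage curve, effort level (12 one-minute samples per hour), and noise below are invented for illustration and are not taken from the 15-d data set.

```python
import math
import random
import statistics

random.seed(1)

# Hypothetical minute-by-minute passage counts over one 24-h day:
# a smooth diel cycle plus noise (illustrative, not real hydroacoustic data).
passage = [max(0.0, 50 + 40 * math.sin(2 * math.pi * m / 1440) + random.gauss(0, 5))
           for m in range(1440)]
true_total = sum(passage)

def strs_estimate(samples_per_hour=12):
    """STRS: each hour is a stratum; sample minutes at random within it
    and expand the stratum sum by 60 / (minutes sampled)."""
    total = 0.0
    for h in range(24):
        stratum = passage[h * 60:(h + 1) * 60]
        picks = random.sample(range(60), samples_per_hour)
        total += (60 / samples_per_hour) * sum(stratum[i] for i in picks)
    return total

def sys_estimate(samples_per_hour=12):
    """SYS: one random start, then a fixed sampling interval all day."""
    step = 60 // samples_per_hour
    start = random.randrange(step)
    picked = passage[start::step]
    return (1440 / len(picked)) * sum(picked)

strs = [strs_estimate() for _ in range(500)]
syst = [sys_estimate() for _ in range(500)]
print("STRS mean/sd:", round(statistics.mean(strs)), round(statistics.stdev(strs)))
print("SYS  mean/sd:", round(statistics.mean(syst)), round(statistics.stdev(syst)))
```

Both estimators are unbiased here; the practical difference the abstract highlights is that STRS (and replicated SYS) also yields a design-based variance estimate from a single day's sample, which plain SYS does not.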

  17. Integrating virtual reality and BIM for end-user involvement in building design

    DEFF Research Database (Denmark)

    Petrova, Ekaterina Aleksandrova; Rasmussen, Mai; Jensen, Rasmus Lund

    2017-01-01

    The outcome of projects within Architecture, Engineering, and Construction is highly dependent on the quality of the collaboration between the involved actors. The end-users occupy the buildings on a daily basis, and therefore their involvement in the design process is essential to the output. Ho...

  18. Effects of physical activity calorie expenditure (PACE) labeling: study design and baseline sample characteristics.

    Science.gov (United States)

    Viera, Anthony J; Tuttle, Laura; Olsson, Emily; Gras-Najjar, Julie; Gizlice, Ziya; Hales, Derek; Linnan, Laura; Lin, Feng-Chang; Noar, Seth M; Ammerman, Alice

    2017-09-12

Obesity and physical inactivity are responsible for more than 365,000 deaths per year and contribute substantially to rising healthcare costs in the US, making clear the need for effective public health interventions. Calorie labeling on menus has been implemented to guide consumer ordering behaviors, but its effects on calories purchased have been minimal. In this project, we tested the effect of physical activity calorie expenditure (PACE) food labels on actual point-of-decision food purchasing behavior as well as physical activity. Using a two-group interrupted time series cohort study design in three worksite cafeterias, one cafeteria was assigned to the intervention condition, and the other two served as controls. Calories from food purchased in the cafeteria were assessed by photographs of meals (accompanied by notes made on-site) using a standardized calorie database and portion size-estimation protocol. Primary outcomes will be average calories purchased and minutes of moderate to vigorous physical activity (MVPA) by individuals in the cohorts. We will compare pre-post changes in study outcomes between study groups using piecewise generalized linear mixed model regressions (segmented regressions) with a single change point in our interrupted time-series study. The results of this project will provide evidence of the effectiveness of worksite cafeteria menu labeling, which could potentially inform policy intervention approaches. Labels that convey information in a more readily understandable manner may be more effective at motivating behavior change. Strengths of this study include its cohort design and its robust data capture methods using food photographs and accelerometry.

  19. A probabilistic design method for LMFBR fuel rods

    International Nuclear Information System (INIS)

    Peck, S.O.; Lovejoy, W.S.

    1977-01-01

Fuel rod performance analyses for design purposes are dependent upon material properties, dimensions, and loads that are statistical in nature. Conventional design practice accounts for the uncertainties in relevant parameters by designing to a 'safety factor', set so as to assure safe operation. Arbitrary assignment of these safety factors, based upon a number of 'worst case' assumptions, may result in costly over-design. Probabilistic design methods provide a systematic way to reflect the uncertainties in design parameters. PECS-III is a computer code which employs Monte Carlo techniques to generate the probability density and distribution functions for time-to-failure and cumulative damage for sealed plenum LMFBR fuel rods on a single rod or whole core basis. In Monte Carlo analyses, a deterministic model (that maps single-valued inputs into single-valued outputs) is coupled to a statistical 'driver'. Uncertainties in the input are reflected by assigning probability densities to the input parameters. Dependent input variables are considered multivariate normal. Independent input variables may be arbitrarily distributed. Sample values are drawn from these input densities, and a complete analysis is done by the deterministic model to generate a sample point in the output distribution. This process is repeated many times, and the number of times each output value occurs is accumulated. The probability that some measure of rod performance will fall within given limits is estimated by the relative frequency with which the Monte Carlo samples fall within those limits.
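The Monte Carlo coupling described above, a deterministic model fed by a statistical driver, can be sketched minimally. The time-to-failure function, input distributions, and limit below are hypothetical stand-ins for the actual PECS-III fuel-rod model, chosen only to show the sampling loop and the relative-frequency estimate.

```python
import math
import random

random.seed(2)

def time_to_failure(strength, temperature):
    """Hypothetical deterministic model: maps single-valued inputs to a
    single-valued output (a stand-in for the fuel-rod performance code)."""
    return 1.0e4 * strength / math.exp(temperature / 600.0)

N = 20_000
samples = []
for _ in range(N):
    # Independent input: arbitrarily distributed (here lognormal) strength.
    strength = random.lognormvariate(0.0, 0.1)
    # Dependent inputs would be drawn from a multivariate normal; a single
    # normal suffices for this sketch.
    temperature = random.gauss(900.0, 30.0)
    samples.append(time_to_failure(strength, temperature))

# Probability that performance falls within given limits, estimated by the
# relative frequency with which Monte Carlo samples fall within them.
limit = 2000.0
p_fail_early = sum(t < limit for t in samples) / N
print(f"P(time-to-failure < {limit:.0f} h) ≈ {p_fail_early:.3f}")
```

Repeating the inner analysis many times and tallying outcomes is exactly the accumulation step the abstract describes; a histogram of `samples` approximates the output probability density.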

  20. Development and Demonstration of a Method to Evaluate Bio-Sampling Strategies Using Building Simulation and Sample Planning Software.

    Science.gov (United States)

    Dols, W Stuart; Persily, Andrew K; Morrow, Jayne B; Matzke, Brett D; Sego, Landon H; Nuffer, Lisa L; Pulsipher, Brent A

    2010-01-01

    In an effort to validate and demonstrate response and recovery sampling approaches and technologies, the U.S. Department of Homeland Security (DHS), along with several other agencies, have simulated a biothreat agent release within a facility at Idaho National Laboratory (INL) on two separate occasions in the fall of 2007 and the fall of 2008. Because these events constitute only two realizations of many possible scenarios, increased understanding of sampling strategies can be obtained by virtually examining a wide variety of release and dispersion scenarios using computer simulations. This research effort demonstrates the use of two software tools, CONTAM, developed by the National Institute of Standards and Technology (NIST), and Visual Sample Plan (VSP), developed by Pacific Northwest National Laboratory (PNNL). The CONTAM modeling software was used to virtually contaminate a model of the INL test building under various release and dissemination scenarios as well as a range of building design and operation parameters. The results of these CONTAM simulations were then used to investigate the relevance and performance of various sampling strategies using VSP. One of the fundamental outcomes of this project was the demonstration of how CONTAM and VSP can be used together to effectively develop sampling plans to support the various stages of response to an airborne chemical, biological, radiological, or nuclear event. Following such an event (or prior to an event), incident details and the conceptual site model could be used to create an ensemble of CONTAM simulations which model contaminant dispersion within a building. These predictions could then be used to identify priority area zones within the building and then sampling designs and strategies could be developed based on those zones.

  1. AN EVALUATION OF PRIMARY DATA-COLLECTION MODES IN AN ADDRESS-BASED SAMPLING DESIGN.

    Science.gov (United States)

    Amaya, Ashley; Leclere, Felicia; Carris, Kari; Liao, Youlian

    2015-01-01

    As address-based sampling becomes increasingly popular for multimode surveys, researchers continue to refine data-collection best practices. While much work has been conducted to improve efficiency within a given mode, additional research is needed on how multimode designs can be optimized across modes. Previous research has not evaluated the consequences of mode sequencing on multimode mail and phone surveys, nor has significant research been conducted to evaluate mode sequencing on a variety of indicators beyond response rates. We conducted an experiment within the Racial and Ethnic Approaches to Community Health across the U.S. Risk Factor Survey (REACH U.S.) to evaluate two multimode case-flow designs: (1) phone followed by mail (phone-first) and (2) mail followed by phone (mail-first). We compared response rates, cost, timeliness, and data quality to identify differences across case-flow design. Because surveys often differ on the rarity of the target population, we also examined whether changes in the eligibility rate altered the choice of optimal case flow. Our results suggested that, on most metrics, the mail-first design was superior to the phone-first design. Compared with phone-first, mail-first achieved a higher yield rate at a lower cost with equivalent data quality. While the phone-first design initially achieved more interviews compared to the mail-first design, over time the mail-first design surpassed it and obtained the greatest number of interviews.

  2. Rationale and design of the HOME trial: A pragmatic randomized controlled trial of home-based human papillomavirus (HPV) self-sampling for increasing cervical cancer screening uptake and effectiveness in a U.S. healthcare system.

    Science.gov (United States)

    Winer, Rachel L; Tiro, Jasmin A; Miglioretti, Diana L; Thayer, Chris; Beatty, Tara; Lin, John; Gao, Hongyuan; Kimbel, Kilian; Buist, Diana S M

    2018-01-01

Women who delay or do not attend Papanicolaou (Pap) screening are at increased risk for cervical cancer. Trials in countries with organized screening programs have demonstrated that mailing high-risk (hr) human papillomavirus (HPV) self-sampling kits to under-screened women increases participation, but U.S. data are lacking. HOME is a pragmatic randomized controlled trial set within a U.S. integrated healthcare delivery system to compare two programmatic approaches for increasing cervical cancer screening uptake and effectiveness in under-screened women (≥3.4 years since last Pap) aged 30-64 years: 1) usual care (annual patient reminders and ad hoc outreach by clinics) and 2) usual care plus mailed hrHPV self-screening kits. Over 2.5 years, eligible women were identified through electronic medical record (EMR) data and randomized 1:1 to the intervention or control arm. Women in the intervention arm were mailed kits with pre-paid envelopes to return samples to the central clinical laboratory for hrHPV testing. Results were documented in the EMR to notify women's primary care providers of appropriate follow-up. Primary outcomes are detection and treatment of cervical neoplasia. Secondary outcomes are cervical cancer screening uptake, abnormal screening results, and women's experiences and attitudes towards hrHPV self-sampling and follow-up of hrHPV-positive results (measured through surveys and interviews). The trial was designed to evaluate whether a programmatic strategy incorporating hrHPV self-sampling is effective in promoting adherence to the complete screening process (including follow-up of abnormal screening results and treatment). The objective of this report is to describe the rationale and design of this pragmatic trial. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Exploring academics' views on designs, methods, characteristics and outcomes of inclusive health research with people with intellectual disabilities: a modified Delphi study

    Science.gov (United States)

    Frankena, T K; Naaldenberg, J; Cardol, M; Meijering, J V; Leusink, G; van Schrojenstein Lantman-de Valk, H M J

    2016-01-01

Background The British Medical Journal's (BMJ's) patient revolution strives for collaboration with patients in healthcare and health research. This paper studies collaboration with people with intellectual disabilities (ID) in health research, also known as inclusive health research. Currently, transparency and agreement among academics is lacking regarding its main aspects, preventing upscaling of the patient revolution. Objective This study aims to gain agreement among academics on 3 aspects of inclusive health research for people with ID: (1) designs and methods, (2) most important characteristics and (3) outcomes. Design A Delphi study was conducted with academics with experience in inclusive (health) research and on people with ID. The study consisted of 2 sequential questionnaire rounds (n=24; n=17), followed by in-depth interviews (n=10). Results Academics agreed on (1) a collaborative approach to be most suitable to inclusive health research, (2) characteristics regarding the accessibility and facilitation of inclusive health research, and (3) several outcomes of inclusive health research for people with ID and healthcare. Other characteristics agreed on included: atmosphere, relationship, engagement, partnership and power. It was stressed that these characteristics ensure meaningful inclusion. Interviewed academics voiced the need for a tool supporting the facilitation and evaluation of inclusive health research. There was ambiguity as to what this tool should comprise and the extent to which it was possible to capture the complex process of inclusive health research. Discussion and conclusions This study underlines the need for transparency, facilitation and evaluation of inclusive health research. The need for in-depth interviews after 2 Delphi rounds underlines its complexity and context dependence. To increase process transparency, future research should focus on gaining insight into inclusive health research in its context. A tool could be developed

  4. State-dependent importance sampling for a Jackson tandem network

    NARCIS (Netherlands)

    Miretskiy, D.I.; Scheinhardt, W.R.W.; Mandjes, M.R.H.

    2008-01-01

This paper considers importance sampling as a tool for rare-event simulation. The focus is on estimating the probability of overflow in the downstream queue of a Jacksonian two-node tandem queue – it is known that in this setting ‘traditional’ state-independent importance-sampling distributions
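For a single M/M/1 queue, the 'traditional' state-independent change of measure is the classical swap of the arrival and service rates, under which the rare overflow path becomes typical and every successful path carries the same likelihood ratio. The sketch below estimates an overflow probability that way, with arbitrary illustrative rates and level; the paper's point is precisely that such state-independent schemes can break down for the tandem network, where state-dependent distributions are needed.

```python
import random
import statistics

random.seed(3)

LAM, MU, L = 0.3, 0.7, 15   # arrival rate, service rate, overflow level

def is_run():
    """One importance-sampling run of the embedded random walk: simulate
    under the measure with LAM and MU swapped, starting from one customer,
    and track the likelihood ratio of the original measure."""
    x, lr = 1, 1.0
    while 0 < x < L:
        if random.random() < MU / (LAM + MU):   # up step under new measure
            x += 1
            lr *= LAM / MU                      # original/new up-step prob
        else:                                   # down step under new measure
            x -= 1
            lr *= MU / LAM                      # original/new down-step prob
    return lr if x == L else 0.0                # unbiased: LR * indicator

N = 100_000
est = statistics.mean(is_run() for _ in range(N))

# Exact gambler's-ruin value for comparison: P(hit L before 0 | start at 1).
r = MU / LAM
exact = (r - 1) / (r ** L - 1)
print(f"IS estimate {est:.3e}  exact {exact:.3e}")
```

With these rates the target probability is on the order of 4e-6, far too small for naive simulation at this sample size, yet the swapped-measure estimator recovers it with small relative error.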

  5. State-dependent importance sampling for a Jackson tandem network

    NARCIS (Netherlands)

    D.I. Miretskiy; W.R.W. Scheinhardt (Werner); M.R.H. Mandjes (Michel)

    2008-01-01

This paper considers importance sampling as a tool for rare-event simulation. The focus is on estimating the probability of overflow in the downstream queue of a Jacksonian two-node tandem queue – it is known that in this setting ‘traditional’ state-independent importance-sampling

  6. 02A. Design, Methods, and Outcomes for Recent Clinical Trials Utilizing Ayurvedic Medicine, Yoga, and Meditation

    Science.gov (United States)

    Saper, Robert; Vinjamury, Sivarama; Elder, Charles

    2013-01-01

    Focus Area: Integrative Approaches to Care The panel discussants will present on the outcomes of four recent pragmatic trials covering the spectrum of Ayurvedic medicine, yoga, and meditation as therapeutic approaches for both acute and chronic conditions. The presenters will discuss: (1) a pilot study of a whole-systems Ayurveda and Yoga Therapy intervention for obesity; (2) a comparative effectiveness randomized controlled trial of hatha yoga, physical therapy, and education for non-specific chronic low back pain in low-income minority populations; (3) an investigation of the therapeutic usefulness of Shirodhara (Ayurvedic oil dripping therapy) as a treatment for insomnia; and (4) a discussion of the evidence base supporting implementation of meditation interventions in schools and workplace settings. Discussants will present information on study designs, research methodology, and outcome measure selection to highlight special considerations in conducting research on whole medical systems that use multi-target therapies and focus on patient-centered outcomes. Ayurvedic medicine and yoga are characterized by low-cost, noninvasive interventions that can be usefully offered as part of an integrative medicine therapeutic approach.

  7. Examining intrinsic versus extrinsic exercise goals: cognitive, affective, and behavioral outcomes.

    Science.gov (United States)

    Sebire, Simon J; Standage, Martyn; Vansteenkiste, Maarten

    2009-04-01

    Grounded in self-determination theory (SDT), this study had two purposes: (a) examine the associations between intrinsic (relative to extrinsic) exercise goal content and cognitive, affective, and behavioral outcomes; and (b) test the mediating role of psychological need satisfaction in the Exercise Goal Content --> Outcomes relationship. Using a sample of 410 adults, hierarchical regression analysis showed relative intrinsic goal content to positively predict physical self-worth, self-reported exercise behavior, psychological well-being, and psychological need satisfaction and negatively predict exercise anxiety. Except for exercise behavior, the predictive utility of relative intrinsic goal content on the dependent variables of interest remained significant after controlling for participants' relative self-determined exercise motivation. Structural equation modeling analyses showed psychological need satisfaction to partially mediate the effect of relative intrinsic goal content on the outcome variables. Our findings support further investigation of exercise goals commensurate with the goal content perspective advanced in SDT.

  8. Association of Mode of Locomotion and Independence in Locomotion With Long-Term Outcomes After Spinal Cord Injury

    Science.gov (United States)

    Krause, James; Carter, Rickey E; Brotherton, Sandra

    2009-01-01

    Background/Objective: To explore the association of mode of locomotion (ambulation vs wheelchair use) and independence in locomotion (independent vs require assistance) with health, participation, and subjective well-being (SWB) after spinal cord injury (SCI). Research Design: Secondary analysis was conducted on survey data collected from 2 rehabilitation hospitals in the Midwest and a specialty hospital in the southeastern United States. The 1,493 participants were a minimum of 18 years of age and had traumatic SCI of at least 1 year duration at enrollment. Main Outcome Measures: Three sets of outcome measures were used: SWB, participation, and health. SWB was measured by 8 scales and a measure of depressive symptoms, participation by 3 items, health by general health ratings, days in poor health, hospitalizations, and treatments. Results: Small but significant associations were observed between independence in locomotion and every outcome. Ambulation was associated with greater participation but a mixed pattern of favorable and unfavorable health and SWB outcomes. Supplemental analyses were conducted on those who ambulated but who were dependent on others to do so (n = 117), because this group reported poor outcomes in several areas. Individuals who were independent in wheelchair use reported substantially better outcomes than nonwheelchair users and those dependent on others in wheelchair use. Conclusions: Although ambulation is often a recovery goal, individuals with SCI who ambulate do not uniformly report better outcomes than wheelchair users, and those who depend on others for assistance with ambulation may experience a unique set of problems. PMID:19810625

  9. Sampling design optimisation for rainfall prediction using a non-stationary geostatistical model

    Science.gov (United States)

    Wadoux, Alexandre M. J.-C.; Brus, Dick J.; Rico-Ramirez, Miguel A.; Heuvelink, Gerard B. M.

    2017-09-01

    The accuracy of spatial predictions of rainfall by merging rain-gauge and radar data is partly determined by the sampling design of the rain-gauge network. Optimising the locations of the rain-gauges may increase the accuracy of the predictions. Existing spatial sampling design optimisation methods are based on minimisation of the spatially averaged prediction error variance under the assumption of intrinsic stationarity. Over the past years, substantial progress has been made to deal with non-stationary spatial processes in kriging. Various well-documented geostatistical models relax the assumption of stationarity in the mean, while recent studies show the importance of considering non-stationarity in the variance for environmental processes occurring in complex landscapes. We optimised the sampling locations of rain-gauges using an extension of the Kriging with External Drift (KED) model for prediction of rainfall fields. The model incorporates both non-stationarity in the mean and in the variance, which are modelled as functions of external covariates such as radar imagery, distance to radar station and radar beam blockage. Spatial predictions are made repeatedly over time, each time recalibrating the model. The space-time averaged KED variance was minimised by Spatial Simulated Annealing (SSA). The methodology was tested using a case study predicting daily rainfall in the north of England for a one-year period. Results show that (i) the proposed non-stationary variance model outperforms the stationary variance model, and (ii) a small but significant decrease of the rainfall prediction error variance is obtained with the optimised rain-gauge network. In particular, it pays off to place rain-gauges at locations where the radar imagery is inaccurate, while keeping the distribution over the study area sufficiently uniform.
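
    The SSA step described above can be illustrated with a much-simplified criterion. The sketch below optimises gauge locations in a unit square by minimising the mean distance from prediction points to their nearest gauge, a crude stand-in for the space-time averaged KED variance; the annealing loop (perturb one station, accept worse moves with a shrinking Metropolis probability) mirrors the structure of the method. All names and parameters are illustrative.

    ```python
    import math
    import random

    def mean_nearest_dist(stations, grid):
        """Mean distance from each prediction point to its nearest station
        (a simple stand-in for the averaged kriging variance criterion)."""
        return sum(min(math.dist(g, s) for s in stations) for g in grid) / len(grid)

    def ssa_optimise(stations, grid, rng, n_iter=2000, temp=0.1, cool=0.995):
        """Spatial simulated annealing: jitter one station at a time and
        accept worse configurations with a cooling Metropolis probability."""
        current = list(stations)
        best, f_best = list(current), mean_nearest_dist(stations, grid)
        f_cur = f_best
        for _ in range(n_iter):
            cand = list(current)
            i = rng.randrange(len(cand))
            x, y = cand[i]
            # perturb within the unit-square study area
            cand[i] = (min(1, max(0, x + rng.gauss(0, 0.05))),
                       min(1, max(0, y + rng.gauss(0, 0.05))))
            f_new = mean_nearest_dist(cand, grid)
            if f_new < f_cur or rng.random() < math.exp((f_cur - f_new) / temp):
                current, f_cur = cand, f_new
                if f_cur < f_best:
                    best, f_best = list(current), f_cur
            temp *= cool
        return best, f_best

    rng = random.Random(0)
    grid = [(i / 9, j / 9) for i in range(10) for j in range(10)]
    init = [(rng.random(), rng.random()) for _ in range(8)]
    f0 = mean_nearest_dist(init, grid)
    opt, f1 = ssa_optimise(init, grid, rng)
    print(f0, f1)  # the optimised network should lower the criterion
    ```

    In the paper's setting the criterion evaluation is far more expensive (a recalibrated KED prediction over space and time), but the acceptance logic is the same.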

  10. Review of Atrazine Sampling by POCIS and Chemcatcher.

    Science.gov (United States)

    Booij, Kees; Chen, Sunmao

    2018-04-24

    A key success factor for the performance of passive samplers is the proper calibration of sampling rates. Sampling rates for a wide range of polar organic compounds are available for Chemcatchers and polar organic chemical integrative samplers (POCIS), but the mechanistic models that are needed to understand the effects of exposure conditions on sampling rates need improvement. Literature data on atrazine sampling rates by these samplers were reviewed with the aim to assess what can be learned from literature reports of this well-studied compound and to identify knowledge gaps related to the effects of flow and temperature. The flow dependency of sampling rates could be described by a mass transfer resistance model with one (POCIS) or two (Chemcatcher) adjustable parameters. Literature data were insufficient to evaluate the temperature effect on the sampling rates. An evaluation of reported sampler configurations showed that standardization of sampler design can be improved: for POCIS with respect to surface area and sorbent mass, and for Chemcatcher with respect to housing design. Several reports on atrazine sampling could not be used because the experimental setups were insufficiently described with respect to flow conditions. Recommendations are made for standardization of sampler layout and documentation of flow conditions in calibration studies. This article is protected by copyright. All rights reserved.

  11. Health outcome after major trauma: what are we measuring?

    Science.gov (United States)

    Hoffman, Karen; Cole, Elaine; Playford, E Diane; Grill, Eva; Soberg, Helene L; Brohi, Karim

    2014-01-01

    Trauma is a global disease and is among the leading causes of disability in the world. The importance of outcome beyond trauma survival has been recognised over the last decade. Despite this, there is no internationally agreed approach for assessment of health outcome and rehabilitation of trauma patients. To systematically examine to what extent outcome measures evaluate health outcomes in patients with major trauma. MEDLINE, EMBASE, and CINAHL (from 2006-2012) were searched for studies evaluating health outcome after traumatic injuries. Studies of adult patients with injuries involving at least two body areas or organ systems were included. Information on study design, outcome measures used, sample size and outcomes were extracted. The World Health Organisation International Classification of Functioning, Disability and Health (ICF) was used to evaluate to what extent outcome measures captured health impacts. Of 755 identified studies, 34 were included in the review. 38 outcome measures were identified. 21 outcome measures were used only once and only five were used in three or more studies. Only 6% of all possible health impacts were captured. Concepts related to activity and participation were the most represented but still only captured 12% of all possible concepts in this domain. Measures performed very poorly in capturing concepts related to body function (5%), functional activities (11%) and environmental factors (2%). Outcome measures used in major trauma capture only a small proportion of health impacts. There is no inclusive classification for measuring disability or health outcome following trauma. The ICF may provide a useful framework for the development of a comprehensive health outcome measure for trauma care.

  12. A liquid scintillation counter specifically designed for samples deposited on a flat matrix

    International Nuclear Information System (INIS)

    Potter, C.G.; Warner, G.T.

    1986-01-01

    A prototype liquid scintillation counter has been designed to count samples deposited as a 6×16 array on a flat matrix. Applications include the counting of labelled cells processed by a cell harvester from 96-well microtitration plates onto glass fibre filters, and of DNA samples directly deposited onto nitrocellulose or nylon transfer membranes (e.g. 'Genescreen', NEN) for genetic studies by dot-blot hybridisation. The whole filter is placed in a bag with 4-12 ml of scintillant, sufficient to count all 96 samples. Nearest-neighbour intersample cross-talk ranged from 0.004% for ³H to 0.015% for ³²P. Background was 1.4 counts/min for glass fibre and 0.7 counts/min for 'Genescreen' in the ³H channel; for ¹⁴C the respective figures were 5.3 and 4.3 counts/min. Counting efficiency for ³H-labelled cells on glass fibre was 54% (E²/B = 2053) and 26% for tritiated thymidine spotted on 'Genescreen' (E²/B = 980). Similar ¹⁴C samples gave figures of 97% (E²/B = 1775) and 81% (E²/B = 1526), respectively. Electron emission counting from samples containing ¹²⁵I and ⁵¹Cr was also possible. (U.K.)

  13. Content validity and nursing sensitivity of community-level outcomes from the Nursing Outcomes Classification (NOC).

    Science.gov (United States)

    Head, Barbara J; Aquilino, Mary Lober; Johnson, Marion; Reed, David; Maas, Meridean; Moorhead, Sue

    2004-01-01

    To evaluate the content validity and nursing sensitivity of six community-level outcomes from the Nursing Outcomes Classification (NOC; Johnson, Maas, & Moorhead, 2000). A survey research design was used. Questionnaires were mailed to 300 public health nursing experts; 102 nurses responded. Experts evaluated between 11 and 30 indicators for each of the six outcomes for: (a) importance of the indicators for measuring the outcome, and (b) influence of nursing on the indicators. Content validity and nursing sensitivity of the outcomes were estimated with a modified Fehring technique. All outcomes were deemed important; only Community Competence had an outcome content validity score < .80. The outcome sensitivity score for Community Health: Immunity was .80; other outcome scores ranged from .62-.70. Indicator ratios for all 102 indicators met the study criterion for importance, with 87% designated as critical and 13% as supplemental. Sensitivity ratios reflected judgments that 45% of the indicators were sensitive to nursing intervention. The study provided evidence of outcome content validity and nursing sensitivity of the study outcomes; further validation research is recommended, followed by testing of the study outcomes in clinical practice. Community-level nursing-sensitive outcomes will potentially enable study of the efficacy and effectiveness of public health interventions focused on improving health of populations and communities.
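
    The abstract does not spell out its modified Fehring technique; one common form of the Fehring approach assigns fixed weights to 5-point Likert ratings and averages them into indicator ratios and an outcome-level score. The sketch below uses that conventional weighting purely as an illustration — the study's actual modification, cutoffs, and labels may differ.

    ```python
    from statistics import fmean

    # Fehring-style weights for 5-point Likert ratings (5 = most important).
    WEIGHTS = {5: 1.0, 4: 0.75, 3: 0.5, 2: 0.25, 1: 0.0}

    def indicator_ratio(ratings):
        """Weighted ratio for one indicator: mean of the Fehring weights."""
        return fmean(WEIGHTS[r] for r in ratings)

    def outcome_score(indicator_ratings):
        """Outcome-level score: mean of its indicator ratios. Ratios >= 0.80
        are conventionally 'critical', >= 0.50 'supplemental' (illustrative)."""
        ratios = [indicator_ratio(r) for r in indicator_ratings]
        return fmean(ratios), ratios

    score, ratios = outcome_score([
        [5, 5, 4, 4, 5],   # expert ratings for indicator 1
        [4, 3, 4, 5, 4],   # indicator 2
        [3, 3, 2, 4, 3],   # indicator 3
    ])
    print(round(score, 3), [round(x, 2) for x in ratios])
    ```

    The same arithmetic is run twice in such studies: once on importance ratings (content validity) and once on nursing-influence ratings (nursing sensitivity).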

  14. Sample design and gamma-ray counting strategy of neutron activation system for triton burnup measurements in KSTAR

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Jungmin [Department of Energy System Engineering, Seoul National University, Seoul (Korea, Republic of); Cheon, Mun Seong [ITER Korea, National Fusion Research Institute, Daejeon (Korea, Republic of); Chung, Kyoung-Jae, E-mail: jkjlsh1@snu.ac.kr [Department of Energy System Engineering, Seoul National University, Seoul (Korea, Republic of); Hwang, Y.S. [Department of Energy System Engineering, Seoul National University, Seoul (Korea, Republic of)

    2016-11-01

    Highlights: • Sample design for triton burnup ratio measurement is carried out. • Samples for 14.1 MeV neutron measurements are selected for KSTAR. • Si and Cu are the most suitable materials for d-t neutron measurements. • Appropriate γ-ray counting strategies for each selected sample are established. - Abstract: For the purpose of triton burnup measurements in Korea Superconducting Tokamak Advanced Research (KSTAR) deuterium plasmas, appropriate neutron activation system (NAS) samples for 14.1 MeV d-t neutron measurements have been designed and a gamma-ray counting strategy is established. Neutronics calculations are performed with the MCNP5 neutron transport code for the KSTAR neutral beam heated deuterium plasma discharges. Based on those calculations and the assumed d-t neutron yield, the activities induced by d-t neutrons are estimated with the inventory code FISPACT-2007 for candidate sample materials: Si, Cu, Al, Fe, Nb, Co, Ti, and Ni. It is found that Si, Cu, Al, and Fe are suitable for the KSTAR NAS in terms of the minimum detectable activity (MDA) calculated based on the standard deviation of blank measurements. Considering background gamma-rays radiated from surrounding structures activated by thermalized fusion neutrons, an appropriate gamma-ray counting strategy for each selected sample is established.
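
    The abstract derives MDA from the standard deviation of blank measurements; a closely related and widely used convention is Currie's detection limit, L_D = 2.71 + 4.65·√B counts for background B. The sketch below uses the Currie formula as a stand-in for whatever exact definition the study adopted — the conversion factors and all parameter values here are invented for illustration.

    ```python
    import math

    def minimum_detectable_activity(blank_counts, efficiency, count_time,
                                    yield_frac=1.0):
        """Currie-style MDA (Bq): detection limit L_D = 2.71 + 4.65*sqrt(B)
        counts, converted to activity via detector efficiency, counting
        time (s), and gamma emission yield. Illustrative convention only."""
        l_d = 2.71 + 4.65 * math.sqrt(blank_counts)  # detection limit, counts
        return l_d / (efficiency * yield_frac * count_time)

    mda = minimum_detectable_activity(blank_counts=400, efficiency=0.05,
                                      count_time=600)
    print(mda)  # Bq
    ```

    A candidate sample material is then judged suitable when its predicted d-t-induced activity comfortably exceeds the MDA of the counting setup.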

  15. The role of positive/negative outcome expectancy and refusal self-efficacy of Internet use on Internet addiction among college students in Taiwan.

    Science.gov (United States)

    Lin, Min-Pei; Ko, Huei-Chen; Wu, Jo Yung-Wei

    2008-08-01

    Based on Bandura's social cognitive theory, this study was designed to examine positive and negative outcome expectancy and refusal self-efficacy of Internet use and their contribution to Internet addiction among college students by using hierarchical multiple regression analyses in a cross-sectional study design. Schools were first stratified into technical or nontechnical colleges and then into seven majors. A cluster random sampling by department was further applied to randomly choose participants from each major. A representative sample of 4,456 college students participated in this study. The Outcome Expectancy and Refusal Self-Efficacy of Internet Use Questionnaire and the Chen Internet Addiction Scale were used to assess the cognitive factors and the levels of Internet addiction. Results showed that both positive outcome expectancy and negative outcome expectancy were significantly and positively correlated with Internet addiction, and refusal self-efficacy of Internet use was significantly and negatively related to Internet addiction. Further analyses revealed that refusal self-efficacy of Internet use directly and negatively predicted Internet addiction. Moreover, we discovered that positive outcome expectancy positively predicted Internet addiction via refusal self-efficacy of Internet use; however, surprisingly, negative outcome expectancy had both a direct and indirect positive relationship in predicting Internet addiction via the refusal self-efficacy of Internet use. These results give empirical evidence to verify the theoretical effectiveness of the three cognitive factors to Internet addiction and should be incorporated when designing prevention programs and strategies for Internet addicted college students.

  16. Relative Efficiencies of a Three-Stage Versus a Two-Stage Sample Design For a New NLS Cohort Study. 22U-884-38.

    Science.gov (United States)

    Folsom, R. E.; Weber, J. H.

    Two sampling designs were compared for the planned 1978 national longitudinal survey of high school seniors with respect to statistical efficiency and cost. The 1972 survey used a stratified two-stage sample of high schools and seniors within schools. In order to minimize interviewer travel costs, an alternate sampling design was proposed,…

  17. Comparison of sampling designs for estimating deforestation from landsat TM and MODIS imagery: a case study in Mato Grosso, Brazil.

    Science.gov (United States)

    Zhu, Shanyou; Zhang, Hailong; Liu, Ronggao; Cao, Yun; Zhang, Guixin

    2014-01-01

    Sampling designs are commonly used to estimate deforestation over large areas, but comparisons between different sampling strategies are required. Using PRODES deforestation data as a reference, deforestation in the state of Mato Grosso in Brazil from 2005 to 2006 is evaluated using Landsat imagery and a nearly synchronous MODIS dataset. The MODIS-derived deforestation is used to assist in sampling and extrapolation. Three sampling designs are compared according to the estimated deforestation of the entire study area based on simple extrapolation and linear regression models. The results show that stratified sampling for strata construction and sample allocation using the MODIS-derived deforestation hotspots provided more precise estimations than simple random and systematic sampling. Moreover, the relationship between the MODIS-derived and TM-derived deforestation provides a precise estimate of the total deforestation area as well as the distribution of deforestation in each block.
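
    The advantage of hotspot-based stratification over simple random sampling can be demonstrated on synthetic data. The sketch below builds a population in which an auxiliary layer (standing in for MODIS-derived hotspots) separates high- and low-deforestation blocks, then compares the spread of the two mean estimators over repeated samples; all numbers are invented and the allocation is simple proportional allocation.

    ```python
    import random
    import statistics

    rng = random.Random(7)
    # synthetic blocks: 'hotspot' blocks (high deforestation) vs the rest
    hotspot = [rng.gauss(40, 5) for _ in range(200)]
    other = [rng.gauss(5, 2) for _ in range(800)]
    population = hotspot + other

    def srs_mean(n):
        """Simple random sample mean."""
        return statistics.fmean(rng.sample(population, n))

    def stratified_mean(n):
        """Proportional allocation: sample each stratum, weight by size."""
        n_h = n * len(hotspot) // len(population)
        m_h = statistics.fmean(rng.sample(hotspot, n_h))
        m_o = statistics.fmean(rng.sample(other, n - n_h))
        w = len(hotspot) / len(population)
        return w * m_h + (1 - w) * m_o

    reps = 500
    srs = [srs_mean(50) for _ in range(reps)]
    strat = [stratified_mean(50) for _ in range(reps)]
    sd_srs, sd_strat = statistics.stdev(srs), statistics.stdev(strat)
    print(sd_srs, sd_strat)  # stratification removes between-strata variance
    ```

    Because the between-strata contrast dominates the total variance here, the stratified estimator's spread is an order of magnitude smaller at the same sample size, which is the effect the case study reports.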

  18. A weighted sampling algorithm for the design of RNA sequences with targeted secondary structure and nucleotide distribution.

    Science.gov (United States)

    Reinharz, Vladimir; Ponty, Yann; Waldispühl, Jérôme

    2013-07-01

    The design of RNA sequences folding into predefined secondary structures is a milestone for many synthetic biology and gene therapy studies. Most of the current software uses similar local search strategies (i.e. a random seed is progressively adapted to acquire the desired folding properties) and, more importantly, does not allow the user to control explicitly the nucleotide distribution, such as the GC-content, in their sequences. However, the latter is an important criterion for large-scale applications as it could presumably be used to design sequences with better transcription rates and/or structural plasticity. In this article, we introduce IncaRNAtion, a novel algorithm to design RNA sequences folding into target secondary structures with a predefined nucleotide distribution. IncaRNAtion uses a global sampling approach and weighted sampling techniques. We show that our approach is fast (i.e. running time comparable or better than local search methods), seedless (we remove the bias of the seed in local search heuristics) and successfully generates high-quality sequences (i.e. thermodynamically stable) for any GC-content. To complete this study, we develop a hybrid method combining our global sampling approach with local search strategies. Remarkably, our glocal methodology outperforms both local and global approaches for sampling sequences with a specific GC-content and target structure. IncaRNAtion is available at csb.cs.mcgill.ca/incarnation/. Supplementary data are available at Bioinformatics online.
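
    The core weighted-sampling idea — tilt the nucleotide distribution by a weight and tune that weight until the target GC-content is hit — can be shown without any structure constraints. The toy below ignores secondary structure entirely and just bisects on a G/C weight; it is emphatically not IncaRNAtion's algorithm, only a sketch of the weighting mechanism, and all parameter values are invented.

    ```python
    import random

    def sample_seq(length, w_gc, rng):
        """Draw a sequence where each position picks G/C with weight w_gc
        and A/U with weight 1 (structure constraints deliberately omitted)."""
        bases, weights = "GCAU", [w_gc, w_gc, 1.0, 1.0]
        return "".join(rng.choices(bases, weights, k=length))

    def tune_gc_weight(target_gc, length=200, n=200, seed=1):
        """Bisect on the G/C weight until sampled sequences hit target GC."""
        rng = random.Random(seed)
        lo, hi = 0.01, 100.0
        for _ in range(30):
            w = (lo * hi) ** 0.5            # geometric bisection
            seqs = [sample_seq(length, w, rng) for _ in range(n)]
            gc = sum(s.count("G") + s.count("C") for s in seqs) / (n * length)
            if gc < target_gc:
                lo = w
            else:
                hi = w
        return w, gc

    w, gc = tune_gc_weight(0.6)
    print(w, gc)  # weight ~1.5 gives ~60% GC in this unconstrained toy model
    ```

    In the unconstrained case the weight can be solved analytically (P(GC) = w/(w+1)); the bisection stands in for the harder calibration needed once structure-compatible sampling skews the distribution.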

  19. Energy drink use, problem drinking and drinking motives in a diverse sample of Alaskan college students

    Directory of Open Access Journals (Sweden)

    Monica C. Skewes

    2013-08-01

    Background. Recent research has identified the use of caffeinated energy drinks as a common, potentially risky behaviour among college students that is linked to alcohol misuse and consequences. Research also suggests that energy drink consumption is related to other risky behaviours such as tobacco use, marijuana use and risky sexual activity. Objective. This research sought to examine the associations between frequency of energy drink consumption and problematic alcohol use, alcohol-related consequences, symptoms of alcohol dependence and drinking motives in an ethnically diverse sample of college students in Alaska. We also sought to examine whether ethnic group moderated these associations in the present sample of White, Alaska Native/American Indian and other ethnic minority college students. Design. A paper-and-pencil self-report questionnaire was completed by a sample of 298 college students. Analysis of covariance (ANCOVA) was used to examine the effects of energy drink use, ethnic group and energy drink by ethnic group interactions on alcohol outcomes after controlling for variance attributed to gender, age and frequency of binge drinking. Results. Greater energy drink consumption was significantly associated with greater hazardous drinking, alcohol consequences, alcohol dependence symptoms, drinking for enhancement motives and drinking to cope. There were no main effects of ethnic group, and there were no significant energy drink by ethnic group interactions. Conclusion. These findings replicate those of other studies examining the associations between energy drink use and alcohol problems, but contrary to previous research we did not find ethnic minority status to be protective. It is possible that energy drink consumption may serve as a marker for other health risk behaviours among students of various ethnic groups.

  20. A free-knot spline modeling framework for piecewise linear logistic regression in complex samples with body mass index and mortality as an example

    Directory of Open Access Journals (Sweden)

    Scott W. Keith

    2014-09-01

    This paper details the design, evaluation, and implementation of a framework for detecting and modeling nonlinearity between a binary outcome and a continuous predictor variable adjusted for covariates in complex samples. The framework provides familiar-looking parameterizations of output in terms of linear slope coefficients and odds ratios. Estimation methods focus on maximum likelihood optimization of piecewise linear free-knot splines formulated as B-splines. Correctly specifying the optimal number and positions of the knots improves the model, but is marked by computational intensity and numerical instability. Our inference methods utilize both parametric and nonparametric bootstrapping. Unlike other nonlinear modeling packages, this framework is designed to incorporate multistage survey sample designs common to nationally representative datasets. We illustrate the approach and evaluate its performance in specifying the correct number of knots under various conditions with an example using body mass index (BMI; kg/m²) and the complex multi-stage sampling design from the Third National Health and Nutrition Examination Survey to simulate binary mortality outcomes data having realistic nonlinear sample-weighted risk associations with BMI. BMI and mortality data provide a particularly apt example and area of application since BMI is commonly recorded in large health surveys with complex designs, often categorized for modeling, and nonlinearly related to mortality. When complex sample design considerations were ignored, our method was generally similar to or more accurate than two common model selection procedures, Schwarz's Bayesian Information Criterion (BIC) and Akaike's Information Criterion (AIC), in terms of correctly selecting the correct number of knots. Our approach provided accurate knot selections when complex sampling weights were incorporated, while AIC and BIC were not effective under these conditions.
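
    The free-knot idea — treat the knot position as a parameter and profile the fit criterion over it — can be sketched in a simplified setting. The paper fits B-spline logistic models by maximum likelihood under survey weights; the toy below instead uses a one-knot truncated-power basis with least squares on a continuous outcome, which keeps the knot-selection logic visible while avoiding those complications. All data and parameter values are simulated.

    ```python
    import random

    def fit_hinge(xs, ys, knot):
        """Least-squares fit of y ~ b0 + b1*x + b2*max(0, x - knot)
        (truncated-power basis for a one-knot piecewise linear spline)."""
        rows = [[1.0, x, max(0.0, x - knot)] for x in xs]
        # normal equations A b = c, solved by Gaussian elimination
        A = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
        c = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]
        for col in range(3):
            piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
            A[col], A[piv] = A[piv], A[col]
            c[col], c[piv] = c[piv], c[col]
            for r in range(col + 1, 3):
                f = A[r][col] / A[col][col]
                for j in range(col, 3):
                    A[r][j] -= f * A[col][j]
                c[r] -= f * c[col]
        b = [0.0] * 3
        for i in (2, 1, 0):
            b[i] = (c[i] - sum(A[i][j] * b[j] for j in range(i + 1, 3))) / A[i][i]
        preds = [b[0] + b[1] * x + b[2] * max(0.0, x - knot) for x in xs]
        return b, sum((p - y) ** 2 for p, y in zip(preds, ys))

    rng = random.Random(3)
    xs = [rng.uniform(0, 10) for _ in range(400)]
    ys = [1.0 + 0.2 * x + 1.5 * max(0.0, x - 5.0) + rng.gauss(0, 0.3) for x in xs]
    # profile the knot over a grid, keep the position with the lowest SSE
    best_knot = min((k * 0.1 for k in range(10, 91)),
                    key=lambda k: fit_hinge(xs, ys, k)[1])
    print(best_knot)  # should land near the true knot at 5.0
    ```

    The "numerical instability" the abstract mentions shows up here too: when the knot drifts toward the edge of the data, the hinge column becomes nearly collinear with the others, which is one reason the number and placement of knots must be chosen carefully.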