WorldWideScience

Sample records for Bayesian cost-effectiveness models

  1. Bayesian models for cost-effectiveness analysis in the presence of structural zero costs.

    Science.gov (United States)

    Baio, Gianluca

    2014-05-20

    Bayesian modelling for cost-effectiveness data has received much attention in both the health economics and the statistical literature in recent years. Cost-effectiveness data are characterised by a relatively complex structure of relationships linking a suitable measure of clinical benefit (e.g. quality-adjusted life years) and the associated costs. Simplifying assumptions, such as (bivariate) normality of the underlying distributions, are usually not granted, particularly for the cost variable, which is characterised by markedly skewed distributions. In addition, individual-level data sets are often characterised by the presence of structural zeros in the cost variable. Hurdle models can be used to account for the presence of excess zeros in a distribution and have been applied in the context of cost data. We extend their application to cost-effectiveness data, defining a full Bayesian specification, which consists of a model for the individual probability of null costs, a marginal model for the costs and a conditional model for the measure of effectiveness (given the observed costs). We present the model using a working example to describe its main features. © 2013 The Authors. Statistics in Medicine published by John Wiley & Sons, Ltd.
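    As a rough illustration of the hurdle structure described above (a sketch, not the authors' exact specification), the following R code simulates cost-effectiveness data with structural zeros: a logistic model for the individual probability of null costs, a gamma model for the positive costs, and a normal model for effectiveness conditional on cost. The covariate, link choices and parameter values are all hypothetical.

    ```r
    # Illustrative hurdle structure for cost-effectiveness data; every
    # number and name here is hypothetical.
    set.seed(1)
    n    <- 500
    x    <- rnorm(n)                                 # hypothetical covariate
    p0   <- plogis(-1 + 0.8 * x)                     # P(cost == 0), logistic model
    zero <- rbinom(n, 1, p0)                         # structural-zero indicator
    cost <- ifelse(zero == 1, 0,
                   rgamma(n, shape = 2, rate = 2 / 1500))  # skewed positive costs
    # effectiveness (e.g. QALYs) modelled conditionally on the observed costs
    eff  <- rnorm(n, mean = 0.7 + 2e-05 * cost, sd = 0.1)
    mean(cost == 0)                                  # realised share of null costs
    ```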

  2. Bayesian cost-effectiveness analysis with the R package BCEA

    CERN Document Server

    Baio, Gianluca; Heath, Anna

    2017-01-01

    The book provides a description of the process of health economic evaluation and modelling for cost-effectiveness analysis, particularly from the perspective of a Bayesian statistical approach. Some relevant theory and introductory concepts are presented using practical examples and two running case studies. The book also describes in detail how to perform health economic evaluations using the R package BCEA (Bayesian Cost-Effectiveness Analysis). BCEA can be used to post-process the results of a Bayesian cost-effectiveness model and perform advanced analyses producing standardised and highly customisable outputs. It presents all the features of the package, including its many functions and their practical application, as well as its user-friendly web interface. The book is a valuable resource for statisticians and practitioners working in the field of health economics wanting to simplify and standardise their workflow, for example in the preparation of dossiers in support of marketing authorisation, or acade...
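    For orientation, a minimal sketch of the workflow the book teaches, assuming the BCEA package is installed. Here `e` and `c` are simulated posterior samples standing in for the output of a real Bayesian cost-effectiveness model, the intervention labels are invented, and the argument names follow the book-era interface, which may differ in newer package releases.

    ```r
    library(BCEA)
    set.seed(2)
    n.sim <- 1000
    # posterior samples of effectiveness (QALYs) and cost, one column per arm
    e <- cbind(rnorm(n.sim, 0.68, 0.05), rnorm(n.sim, 0.71, 0.05))
    c <- cbind(rgamma(n.sim, shape = 4, rate = 4 / 1200),
               rgamma(n.sim, shape = 4, rate = 4 / 1600))
    he <- bcea(e, c, ref = 2,
               interventions = c("standard care", "new treatment"),
               Kmax = 50000)      # willingness-to-pay grid up to 50,000
    summary(he)                   # EIB, CEAC and EVPI at chosen thresholds
    ceac.plot(he)                 # cost-effectiveness acceptability curve
    ```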

  3. Bayesian sample size determination for cost-effectiveness studies with censored data.

    Directory of Open Access Journals (Sweden)

    Daniel P Beavers

    Cost-effectiveness models are commonly utilized to determine the combined clinical and economic impact of one treatment compared to another. However, most methods for sample size determination of cost-effectiveness studies assume fully observed costs and effectiveness outcomes, which presents challenges for survival-based studies in which censoring exists. We propose a Bayesian method for the design and analysis of cost-effectiveness data in which costs and effectiveness may be censored, and the sample size is approximated for both power and assurance. We explore two parametric models and demonstrate the flexibility of the approach to accommodate a variety of modifications to study assumptions.
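    The power/assurance distinction the abstract draws can be illustrated generically, setting aside the censoring and cost-effectiveness structure of the paper: assurance averages the probability of trial success over the prior on the treatment effect, rather than fixing the effect at a single value as power does. A hypothetical R sketch:

    ```r
    # Assurance by simulation: draw the true effect from its prior, simulate
    # a trial of size n, and record how often the trial declares success.
    set.seed(42)
    n <- 100
    assurance <- mean(replicate(5000, {
      delta <- rnorm(1, mean = 0.3, sd = 0.15)   # prior on the true effect size
      y     <- rnorm(n, mean = delta, sd = 1)    # hypothetical trial outcomes
      t.test(y)$p.value < 0.05 && mean(y) > 0    # success criterion
    }))
    assurance   # contrast with power, which conditions on one fixed delta
    ```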

  4. Bayesian Graphical Models

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Nielsen, Thomas Dyhre

    2016-01-01

    The popularity of Bayesian networks is largely due to the availability of efficient inference algorithms for answering probabilistic queries about the states of the variables in the network. Furthermore, to support the construction of Bayesian network models, learning algorithms are also available. We give an overview of the Bayesian network...

  5. Bayesian comparison of cost-effectiveness of different clinical approaches to diagnose coronary artery disease

    International Nuclear Information System (INIS)

    Patterson, R.E.; Eng, C.; Horowitz, S.F.; Gorlin, R.; Goldstein, S.R.

    1984-01-01

    The objective of this study was to compare the cost-effectiveness of four clinical policies (policies I to IV) in the diagnosis of the presence or absence of coronary artery disease. A model based on Bayes theorem and published clinical data was constructed to make these comparisons. Effectiveness was defined as either the number of patients with coronary disease diagnosed or as the number of quality-adjusted life years extended by therapy after the diagnosis of coronary disease. The following conclusions arise strictly from analysis of the model and may not necessarily be applicable to all situations. 1) As prevalence of coronary disease in the population increased, it caused a linear increase in cost per patient tested, but a hyperbolic decrease in cost per effect, that is, increased cost-effectiveness. Thus, cost-effectiveness of all policies (I to IV) was poor in populations with a prevalence of disease below 10%. 2) At prevalences less than 80%, exercise thallium scintigraphy alone as a first test (policy II) is a more cost-effective initial test than is exercise electrocardiography alone as a first test (policy I) or exercise electrocardiography first combined with thallium imaging as a second test (policy IV). 3) Exercise electrocardiography before thallium imaging (policy IV) is more cost-effective than exercise electrocardiography alone (policy I) at prevalences less than 80%. 4) Noninvasive exercise testing before angiography (policies I, II and IV) is more cost-effective than using coronary angiography as the first and only test (policy III) at prevalences less than 80%. 5) Above a threshold value of prevalence of 80% (for example patients with typical angina), proceeding to angiography as the first test (policy III) was more cost-effective than initial noninvasive exercise tests (policies I, II and IV).
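    The mechanics behind such comparisons follow from Bayes' theorem: post-test probability rises with prevalence, while the cost per case detected falls roughly hyperbolically, which is exactly the behaviour the abstract describes. A sketch with invented test characteristics and costs (not the paper's inputs):

    ```r
    # Post-test probability of disease from prevalence, sensitivity, specificity.
    post_test <- function(prev, sens, spec) {
      (sens * prev) / (sens * prev + (1 - spec) * (1 - prev))
    }
    prev <- seq(0.05, 0.95, by = 0.10)      # pre-test probability (prevalence)
    ppv  <- post_test(prev, sens = 0.85, spec = 0.90)
    cost_per_patient <- 300                 # hypothetical cost per work-up
    cost_per_case    <- cost_per_patient / (0.85 * prev)  # cost per true positive
    round(data.frame(prev, ppv, cost_per_case), 2)
    ```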

  6. Cost effectiveness of recycling: A systems model

    Energy Technology Data Exchange (ETDEWEB)

    Tonjes, David J., E-mail: david.tonjes@stonybrook.edu [Department of Technology and Society, College of Engineering and Applied Sciences, Stony Brook University, Stony Brook, NY 11794-3560 (United States); Waste Reduction and Management Institute, School of Marine and Atmospheric Sciences, Stony Brook University, Stony Brook, NY 11794-5000 (United States); Center for Bioenergy Research and Development, Advanced Energy Research and Technology Center, Stony Brook University, 1000 Innovation Rd., Stony Brook, NY 11794-6044 (United States); Mallikarjun, Sreekanth, E-mail: sreekanth.mallikarjun@stonybrook.edu [Department of Technology and Society, College of Engineering and Applied Sciences, Stony Brook University, Stony Brook, NY 11794-3560 (United States)

    2013-11-15

    Highlights:
    • Curbside collection of recyclables reduces overall system costs over a range of conditions.
    • When avoided costs for recyclables are large, even high collection costs are supported.
    • When avoided costs for recyclables are not great, there are reduced opportunities for savings.
    • For common waste compositions, maximizing curbside recyclables collection always saves money.
    Abstract: Financial analytical models of waste management systems have often found that recycling costs exceed direct benefits, and in order to economically justify recycling activities, externalities such as household expenses or environmental impacts must be invoked. Certain more empirically based studies have also found that recycling is more expensive than disposal. Other work, both through models and surveys, has found differently. Here we present an empirical systems model, largely drawn from a suburban Long Island municipality. The model accounts for changes in distribution of effort as recycling tonnages displace disposal tonnages, and the seven different cases examined all show that curbside collection programs that manage up to between 31% and 37% of the waste stream should result in overall system savings. These savings accrue partially because of assumed cost differences in tip fees for recyclables and disposed wastes, and also because recycling can result in a more efficient, cost-effective collection program. These results imply that increases in recycling are justifiable due to cost-savings alone, not on more difficult-to-measure factors that may not impact program budgets.

  7. Cost effectiveness of recycling: A systems model

    International Nuclear Information System (INIS)

    Tonjes, David J.; Mallikarjun, Sreekanth

    2013-01-01

    Highlights:
    • Curbside collection of recyclables reduces overall system costs over a range of conditions.
    • When avoided costs for recyclables are large, even high collection costs are supported.
    • When avoided costs for recyclables are not great, there are reduced opportunities for savings.
    • For common waste compositions, maximizing curbside recyclables collection always saves money.
    Abstract: Financial analytical models of waste management systems have often found that recycling costs exceed direct benefits, and in order to economically justify recycling activities, externalities such as household expenses or environmental impacts must be invoked. Certain more empirically based studies have also found that recycling is more expensive than disposal. Other work, both through models and surveys, has found differently. Here we present an empirical systems model, largely drawn from a suburban Long Island municipality. The model accounts for changes in distribution of effort as recycling tonnages displace disposal tonnages, and the seven different cases examined all show that curbside collection programs that manage up to between 31% and 37% of the waste stream should result in overall system savings. These savings accrue partially because of assumed cost differences in tip fees for recyclables and disposed wastes, and also because recycling can result in a more efficient, cost-effective collection program. These results imply that increases in recycling are justifiable due to cost-savings alone, not on more difficult-to-measure factors that may not impact program budgets.

  8. Cost effectiveness of recycling: a systems model.

    Science.gov (United States)

    Tonjes, David J; Mallikarjun, Sreekanth

    2013-11-01

    Financial analytical models of waste management systems have often found that recycling costs exceed direct benefits, and in order to economically justify recycling activities, externalities such as household expenses or environmental impacts must be invoked. Certain more empirically based studies have also found that recycling is more expensive than disposal. Other work, both through models and surveys, has found differently. Here we present an empirical systems model, largely drawn from a suburban Long Island municipality. The model accounts for changes in distribution of effort as recycling tonnages displace disposal tonnages, and the seven different cases examined all show that curbside collection programs that manage up to between 31% and 37% of the waste stream should result in overall system savings. These savings accrue partially because of assumed cost differences in tip fees for recyclables and disposed wastes, and also because recycling can result in a more efficient, cost-effective collection program. These results imply that increases in recycling are justifiable due to cost-savings alone, not on more difficult-to-measure factors that may not impact program budgets. Copyright © 2013 Elsevier Ltd. All rights reserved.
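    To see how tip-fee differences alone can produce system-level savings, consider a deliberately simplified version of the comparison (all figures hypothetical; the published model is far more detailed):

    ```r
    tons    <- 100000    # annual waste stream, tons
    share   <- 0.35      # fraction diverted to curbside recycling
    collect <- 60        # $/ton collection cost, assumed equal for both streams
    tip_d   <- 80        # $/ton disposal tip fee
    tip_r   <- 20        # $/ton recyclables tip fee (reflects avoided costs)
    with_recycling    <- tons * collect +
                         tons * ((1 - share) * tip_d + share * tip_r)
    without_recycling <- tons * (collect + tip_d)
    without_recycling - with_recycling   # annual system savings from diversion
    ```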

  9. Applied Bayesian modelling

    CERN Document Server

    Congdon, Peter

    2014-01-01

    This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBUGS...

  10. Bayesian nonparametric hierarchical modeling.

    Science.gov (United States)

    Dunson, David B

    2009-04-01

    In biomedical research, hierarchical models are very widely used to accommodate dependence in multivariate and longitudinal data and for borrowing of information across data from different sources. A primary concern in hierarchical modeling is sensitivity to parametric assumptions, such as linearity and normality of the random effects. Parametric assumptions on latent variable distributions can be challenging to check and are typically unwarranted, given available prior knowledge. This article reviews some recent developments in Bayesian nonparametric methods motivated by complex, multivariate and functional data collected in biomedical studies. The author provides a brief review of flexible parametric approaches relying on finite mixtures and latent class modeling. Dirichlet process mixture models are motivated by the need to generalize these approaches to avoid assuming a fixed finite number of classes. Focusing on an epidemiology application, the author illustrates the practical utility and potential of nonparametric Bayes methods.
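    The Dirichlet process mentioned above avoids fixing the number of classes by placing a prior over infinitely many mixture weights. A minimal sketch of its stick-breaking construction, truncated at 25 components with concentration parameter alpha = 1:

    ```r
    set.seed(3)
    K     <- 25
    alpha <- 1
    v <- rbeta(K, 1, alpha)              # stick-breaking proportions
    w <- v * cumprod(c(1, 1 - v[-K]))    # mixture weights w_k
    sum(w)   # close to 1; leftover mass belongs to unrepresented classes
    ```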

  11. A Layered Decision Model for Cost-Effective System Security

    Energy Technology Data Exchange (ETDEWEB)

    Wei, Huaqiang; Alves-Foss, James; Soule, Terry; Pforsich, Hugh; Zhang, Du; Frincke, Deborah A.

    2008-10-01

    System security involves decisions in at least three areas: identification of well-defined security policies, selection of cost-effective defence strategies, and implementation of real-time defence tactics. Although choices made in each of these areas affect the others, existing decision models typically handle these three decision areas in isolation. There is no comprehensive tool that can integrate them to provide a single efficient model for safeguarding a network. In addition, there is no clear way to determine which particular combinations of defence decisions result in cost-effective solutions. To address these problems, this paper introduces a Layered Decision Model (LDM) for use in deciding how to address defence decisions based on their cost-effectiveness. To validate the LDM and illustrate how it is used, we used simulation to test model rationality and applied the LDM to the design of system security for an e-commerce business case.

  12. Chain Risk Model for quantifying cost effectiveness of phytosanitary measures

    NARCIS (Netherlands)

    Benninga, J.; Hennen, W.H.G.J.; Schans, van de J.

    2010-01-01

    A Chain Risk Model (CRM) was developed for cost-effectiveness assessment of phytosanitary measures. CRM can be applied to phytosanitary assessments of all agricultural product chains. In CRM, stages are connected by product volume flows with which pest infections can be spread from one stage

  13. Bayesian analysis of CCDM models

    Energy Technology Data Exchange (ETDEWEB)

    Jesus, J.F. [Universidade Estadual Paulista (Unesp), Câmpus Experimental de Itapeva, Rua Geraldo Alckmin 519, Vila N. Sra. de Fátima, Itapeva, SP, 18409-010 Brazil (Brazil); Valentim, R. [Departamento de Física, Instituto de Ciências Ambientais, Químicas e Farmacêuticas—ICAQF, Universidade Federal de São Paulo (UNIFESP), Unidade José Alencar, Rua São Nicolau No. 210, Diadema, SP, 09913-030 Brazil (Brazil); Andrade-Oliveira, F., E-mail: jfjesus@itapeva.unesp.br, E-mail: valentim.rodolfo@unifesp.br, E-mail: felipe.oliveira@port.ac.uk [Institute of Cosmology and Gravitation—University of Portsmouth, Burnaby Road, Portsmouth, PO1 3FX United Kingdom (United Kingdom)

    2017-09-01

    Creation of Cold Dark Matter (CCDM), in the context of Einstein Field Equations, produces a negative pressure term which can be used to explain the accelerated expansion of the Universe. In this work we tested six different spatially flat models for matter creation using statistical criteria, in light of SNe Ia data: Akaike Information Criterion (AIC), Bayesian Information Criterion (BIC) and Bayesian Evidence (BE). These criteria allow models to be compared on goodness of fit and number of free parameters, penalizing excess complexity. We find that the JO model is slightly favoured over the LJO/ΛCDM model; however, neither of these, nor the Γ = 3αH₀ model, can be discarded by the current analysis. Three other scenarios are discarded either because of poor fit or because of an excess of free parameters. A method of increasing Bayesian evidence through reparameterization, in order to reduce parameter degeneracy, is also developed.
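    For reference, the first two criteria in their usual forms, where loglik is the maximized log-likelihood, k the number of free parameters and n the number of data points; the fit values below are invented, not the paper's:

    ```r
    aic <- function(loglik, k)    2 * k - 2 * loglik
    bic <- function(loglik, k, n) k * log(n) - 2 * loglik
    # two hypothetical fits to an SNe Ia sample of n = 580 points:
    aic(loglik = -280.1, k = 2);  aic(loglik = -279.6, k = 3)
    bic(loglik = -280.1, k = 2, n = 580);  bic(loglik = -279.6, k = 3, n = 580)
    ```

    Lower values are preferred; BIC's log(n) factor penalizes extra parameters more harshly than AIC at these sample sizes.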

  14. Modeling and Cost-Effectiveness in HIV Prevention.

    Science.gov (United States)

    Jacobsen, Margo M; Walensky, Rochelle P

    2016-02-01

    With HIV funding plateauing and the number of people living with HIV increasing due to the rollout of life-saving antiretroviral therapy, policy makers are faced with increasingly tighter budgets to manage the ongoing HIV epidemic. Cost-effectiveness and modeling analyses can help determine which HIV interventions may be of best value. Incidence remains remarkably high in certain populations and countries, making prevention key to controlling the spread of HIV. This paper briefly reviews concepts in modeling and cost-effectiveness methodology and then examines results of recently published cost-effectiveness analyses on the following HIV prevention strategies: condoms and circumcision, behavioral- or community-based interventions, prevention of mother-to-child transmission, HIV testing, pre-exposure prophylaxis, and treatment as prevention. We find that the majority of published studies demonstrate cost-effectiveness; however, not all interventions are affordable. We urge continued research on combination strategies and methodologies that take into account willingness to pay and budgetary impact.

  15. Bayesian Model Averaging for Propensity Score Analysis

    Science.gov (United States)

    Kaplan, David; Chen, Jianshen

    2013-01-01

    The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…

  16. Bayesian models in cognitive neuroscience: A tutorial

    NARCIS (Netherlands)

    O'Reilly, J.X.; Mars, R.B.

    2015-01-01

    This chapter provides an introduction to Bayesian models and their application in cognitive neuroscience. The central feature of Bayesian models, as opposed to other classes of models, is that Bayesian models represent the beliefs of an observer as probability distributions, allowing them to

  17. Bayesian modeling using WinBUGS

    CERN Document Server

    Ntzoufras, Ioannis

    2009-01-01

    A hands-on introduction to the principles of Bayesian modeling using WinBUGS. Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author provides an accessible treatment of the topic, offering readers a smooth introduction to the principles of Bayesian modeling with detailed guidance on the practical implementation of key principles. The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including: Markov chain Monte Carlo algorithms in Bayesian inference; generalized linear models; Bayesian hierarchical models; predictive distribution and model checking; and Bayesian model and variable evaluation. Computational notes and screen captures illustrate the use of both WinBUGS as well as R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts and all ...

  18. Bayesian Model Averaging for Propensity Score Analysis.

    Science.gov (United States)

    Kaplan, David; Chen, Jianshen

    2014-01-01

    This article considers Bayesian model averaging as a means of addressing uncertainty in the selection of variables in the propensity score equation. We investigate an approximate Bayesian model averaging approach based on the model-averaged propensity score estimates produced by the R package BMA but that ignores uncertainty in the propensity score. We also provide a fully Bayesian model averaging approach via Markov chain Monte Carlo sampling (MCMC) to account for uncertainty in both parameters and models. A detailed study of our approach examines the differences in the causal estimate when incorporating noninformative versus informative priors in the model averaging stage. We examine these approaches under common methods of propensity score implementation. In addition, we evaluate the impact of changing the size of Occam's window used to narrow down the range of possible models. We also assess the predictive performance of both Bayesian model averaging propensity score approaches and compare it with the case without Bayesian model averaging. Overall, results show that both Bayesian model averaging propensity score approaches recover the treatment effect estimates well and generally provide larger uncertainty estimates, as expected. Both Bayesian model averaging approaches offer slightly better prediction of the propensity score compared with the Bayesian approach with a single propensity score equation. Covariate balance checks for the case study show that both Bayesian model averaging approaches offer good balance. The fully Bayesian model averaging approach also provides posterior probability intervals of the balance indices.
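    One standard approximation behind such averaging (not necessarily the implementation in the BMA package) turns per-model BIC values into posterior model probabilities, which then weight the model-specific propensity scores. With invented values:

    ```r
    bic_vals <- c(M1 = 1012.3, M2 = 1009.8, M3 = 1015.1)  # hypothetical fits
    w <- exp(-0.5 * (bic_vals - min(bic_vals)))
    w <- w / sum(w)        # approximate posterior probabilities P(model | data)
    round(w, 3)
    # a model-averaged propensity score is then the weighted sum of the
    # model-specific scores: ps_bma <- ps_matrix %*% w   (names hypothetical)
    ```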

  19. Flexible Bayesian Human Fecundity Models.

    Science.gov (United States)

    Kim, Sungduk; Sundaram, Rajeshwari; Buck Louis, Germaine M; Pyper, Cecilia

    2012-12-01

    Human fecundity is an issue of considerable interest for both epidemiological and clinical audiences, and is dependent upon a couple's biologic capacity for reproduction coupled with behaviors that place a couple at risk for pregnancy. Bayesian hierarchical models have been proposed to better model the conception probabilities by accounting for the acts of intercourse around the day of ovulation, i.e., during the fertile window. These models can be viewed in the framework of a generalized nonlinear model with an exponential link. However, a fixed choice of link function may not always provide the best fit, leading to potentially biased estimates for probability of conception. Motivated by this, we propose a general class of models for fecundity by relaxing the choice of the link function under the generalized nonlinear model framework. We use a sample from the Oxford Conception Study (OCS) to illustrate the utility and fit of this general class of models for estimating human conception. Our findings reinforce the need for attention to be paid to the choice of link function in modeling conception, as it may bias the estimation of conception probabilities. Various properties of the proposed models are examined and a Markov chain Monte Carlo sampling algorithm was developed for implementing the Bayesian computations. The deviance information criterion measure and logarithm of pseudo marginal likelihood are used for guiding the choice of links. The supplemental material section contains technical details of the proof of the theorem stated in the paper, along with further simulation results and analysis.

  20. Bayesian operational risk models

    OpenAIRE

    Silvia Figini; Lijun Gao; Paolo Giudici

    2013-01-01

    Operational risk is hard to quantify, due to the presence of heavy-tailed loss distributions. Extreme value distributions, used in this context, are very sensitive to the data, and this is a problem in the presence of rare loss data. Self risk assessment questionnaires, if properly modelled, may provide the missing piece of information that is necessary to adequately estimate operational risks. In this paper we propose to embody self risk assessment data into suitable prior distributions, and ...

  1. Bayesian variable order Markov models: Towards Bayesian predictive state representations

    NARCIS (Netherlands)

    Dimitrakakis, C.

    2009-01-01

    We present a Bayesian variable order Markov model that shares many similarities with predictive state representations. The resulting models are compact and much easier to specify and learn than classical predictive state representations. Moreover, we show that they significantly outperform a more

  2. The humble Bayesian : Model checking from a fully Bayesian perspective

    NARCIS (Netherlands)

    Morey, Richard D.; Romeijn, Jan-Willem; Rouder, Jeffrey N.

    Gelman and Shalizi (2012) criticize what they call the usual story in Bayesian statistics: that the distribution over hypotheses or models is the sole means of statistical inference, thus excluding model checking and revision, and that inference is inductivist rather than deductivist. They present

  3. Dynamic Modeling of Cost-effectiveness of Rotavirus Vaccination, Kazakhstan

    Science.gov (United States)

    Flem, Elmira; Latipov, Renat; Kuatbaeva, Ajnagul; Kristiansen, Ivar Sønbø

    2014-01-01

    The government of Kazakhstan, a middle-income country in Central Asia, is considering the introduction of rotavirus vaccination into its national immunization program. We performed a cost-effectiveness analysis of rotavirus vaccination spanning 20 years by using a synthesis of dynamic transmission models accounting for herd protection. We found that a vaccination program with 90% coverage would prevent ≈880 rotavirus deaths and save an average of 54,784 life-years for children <5 years of age. Indirect protection accounted for 40% and 60% reduction in severe and mild rotavirus gastroenteritis, respectively. Cost per life year gained was US $18,044 from a societal perspective and US $23,892 from a health care perspective. Comparing the 2 key parameters of cost-effectiveness, mortality rates and vaccine cost at

  4. Modeling Diagnostic Assessments with Bayesian Networks

    Science.gov (United States)

    Almond, Russell G.; DiBello, Louis V.; Moulder, Brad; Zapata-Rivera, Juan-Diego

    2007-01-01

    This paper defines Bayesian network models and examines their applications to IRT-based cognitive diagnostic modeling. These models are especially suited to building inference engines designed to be synchronous with the finer grained student models that arise in skills diagnostic assessment. Aspects of the theory and use of Bayesian network models…

  5. Bayesian modelling of fusion diagnostics

    Science.gov (United States)

    Fischer, R.; Dinklage, A.; Pasch, E.

    2003-07-01

    Integrated data analysis of fusion diagnostics is the combination of different, heterogeneous diagnostics in order to improve physics knowledge and reduce the uncertainties of results. One example is the validation of profiles of plasma quantities. Integration of different diagnostics requires systematic and formalized error analysis for all uncertainties involved. The Bayesian probability theory (BPT) allows a systematic combination of all information entering the measurement descriptive model that considers all uncertainties of the measured data, calibration measurements, physical model parameters and measurement nuisance parameters. A sensitivity analysis of model parameters allows crucial uncertainties to be found, which has an impact on both diagnostic improvement and design. The systematic statistical modelling within the BPT is used for reconstructing electron density and electron temperature profiles from Thomson scattering data from the Wendelstein 7-AS stellarator. The inclusion of different diagnostics and first-principle information is discussed in terms of improvements.

  6. Bayesian models: A statistical primer for ecologists

    Science.gov (United States)

    Hobbs, N. Thompson; Hooten, Mevin B.

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods—in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability and develops a step-by-step sequence of connected ideas, including basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and inference from single and multiple models. This unique book places less emphasis on computer coding, favoring instead a concise presentation of the mathematical statistics needed to understand how and why Bayesian analysis works. It also explains how to write out properly formulated hierarchical Bayesian models and use them in computing, research papers, and proposals. This primer enables ecologists to understand the statistical principles behind Bayesian modeling and apply them to research, teaching, policy, and management.
    • Presents the mathematical and statistical foundations of Bayesian modeling in language accessible to non-statisticians
    • Covers basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and more
    • Deemphasizes computer coding in favor of basic principles
    • Explains how to write out properly factored statistical expressions representing Bayesian models

  7. Calibration in a Bayesian modelling framework

    NARCIS (Netherlands)

    Jansen, M.J.W.; Hagenaars, T.H.J.

    2004-01-01

    Bayesian statistics may constitute the core of a consistent and comprehensive framework for the statistical aspects of modelling complex processes that involve many parameters whose values are derived from many sources. Bayesian statistics holds great promise for model calibration, provides the

  8. Properties of the Bayesian Knowledge Tracing Model

    Science.gov (United States)

    van de Sande, Brett

    2013-01-01

    Bayesian Knowledge Tracing is used very widely to model student learning. It comes in two different forms: The first form is the Bayesian Knowledge Tracing "hidden Markov model" which predicts the probability of correct application of a skill as a function of the number of previous opportunities to apply that skill and the model…

  9. A Bayesian approach to model uncertainty

    International Nuclear Information System (INIS)

    Buslik, A.

    1994-01-01

    A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given

  10. Nonparametric Bayesian Modeling of Complex Networks

    DEFF Research Database (Denmark)

    Schmidt, Mikkel Nørgaard; Mørup, Morten

    2013-01-01

    Modeling structure in complex networks using Bayesian nonparametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data. This article provides a gentle introduction to nonparametric Bayesian modeling of complex networks: using an infinite mixture model as running example, we go through the steps of deriving the model as an infinite limit of a finite parametric model, inferring the model parameters by Markov chain Monte Carlo, and checking the model's fit and predictive performance. We explain how advanced nonparametric models...

  11. Bayesian models a statistical primer for ecologists

    CERN Document Server

    Hobbs, N Thompson

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods-in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability...

  12. Modeling the cost-effectiveness of health care systems for alcohol use disorders: how implementation of eHealth interventions improves cost-effectiveness

    NARCIS (Netherlands)

    Smit, Filip; Lokkerbol, Joran; Riper, Heleen; Majo, Maria Cristina; Boon, Brigitte; Blankers, Matthijs

    2011-01-01

    Informing policy decisions about the cost-effectiveness of health care systems (ie, packages of clinical interventions) is probably best done using a modeling approach. To this end, an alcohol model (ALCMOD) was developed. The aim of ALCMOD is to estimate the cost-effectiveness of competing health

  13. Fully probabilistic design of hierarchical Bayesian models

    Czech Academy of Sciences Publication Activity Database

    Quinn, A.; Kárný, Miroslav; Guy, Tatiana Valentine

    2016-01-01

    Roč. 369, č. 1 (2016), s. 532-547 ISSN 0020-0255 R&D Projects: GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords : Fully probabilistic design * Ideal distribution * Minimum cross-entropy principle * Bayesian conditioning * Kullback-Leibler divergence * Bayesian nonparametric modelling Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 4.832, year: 2016 http://library.utia.cas.cz/separaty/2016/AS/karny-0463052.pdf

  14. A Bayesian cost-effectiveness analysis of a telemedicine-based strategy for the management of sleep apnoea: a multicentre randomised controlled trial.

    Science.gov (United States)

    Isetta, Valentina; Negrín, Miguel A; Monasterio, Carmen; Masa, Juan F; Feu, Nuria; Álvarez, Ainhoa; Campos-Rodriguez, Francisco; Ruiz, Concepción; Abad, Jorge; Vázquez-Polo, Francisco J; Farré, Ramon; Galdeano, Marina; Lloberes, Patricia; Embid, Cristina; de la Peña, Mónica; Puertas, Javier; Dalmases, Mireia; Salord, Neus; Corral, Jaime; Jurado, Bernabé; León, Carmen; Egea, Carlos; Muñoz, Aida; Parra, Olga; Cambrodi, Roser; Martel-Escobar, María; Arqué, Meritxell; Montserrat, Josep M

    2015-11-01

    Compliance with continuous positive airway pressure (CPAP) therapy is essential in patients with obstructive sleep apnoea (OSA), but adequate control is not always possible. This is clinically important because CPAP can reverse the morbidity and mortality associated with OSA. Telemedicine, with support provided via a web platform and video conferences, could represent a cost-effective alternative to standard care management. To assess the telemedicine impact on treatment compliance, cost-effectiveness and improvement in quality of life (QoL) when compared with traditional face-to-face follow-up. A randomised controlled trial was performed to compare a telemedicine-based CPAP follow-up strategy with standard face-to-face management. Consecutive OSA patients requiring CPAP treatment, with sufficient internet skills and who agreed to participate, were enrolled. They were followed-up at 1, 3 and 6 months and answered surveys about sleep, CPAP side effects and lifestyle. We compared CPAP compliance, cost-effectiveness and QoL between the beginning and the end of the study. A Bayesian cost-effectiveness analysis with non-informative priors was performed. We randomised 139 patients. At 6 months, we found similar levels of CPAP compliance, and improved daytime sleepiness, QoL, side effects and degree of satisfaction in both groups. Despite requiring more visits, the telemedicine group was more cost-effective: costs were lower and differences in effectiveness were not relevant. A telemedicine-based strategy for the follow-up of CPAP treatment in patients with OSA was as effective as standard hospital-based care in terms of CPAP compliance and symptom improvement, with comparable side effects and satisfaction rates. The telemedicine-based strategy had lower total costs due to savings on transport and less lost productivity (indirect costs). NCT01716676. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go

  15. Bayesian modeling of unknown diseases for biosurveillance.

    Science.gov (United States)

    Shen, Yanna; Cooper, Gregory F

    2009-11-14

    This paper investigates Bayesian modeling of unknown causes of events in the context of disease-outbreak detection. We introduce a Bayesian approach that models and detects both (1) known diseases (e.g., influenza and anthrax) by using informative prior probabilities and (2) unknown diseases (e.g., a new, highly contagious respiratory virus that has never been seen before) by using relatively non-informative prior probabilities. We report the results of simulation experiments which support that this modeling method can improve the detection of new disease outbreaks in a population. A key contribution of this paper is that it introduces a Bayesian approach for jointly modeling both known and unknown causes of events. Such modeling has broad applicability in medical informatics, where the space of known causes of outcomes of interest is seldom complete.
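    At its simplest, the joint treatment of known and unknown causes is Bayes' rule over a cause set that includes a vaguely specified "unknown" category; a rising posterior on that category flags a possible novel outbreak. A toy sketch with invented probabilities:

    ```r
    # Informative priors for known causes, a relatively vague one for unknown.
    prior <- c(influenza = 0.900, anthrax = 0.001, unknown = 0.099)
    lik   <- c(influenza = 0.30,  anthrax = 0.60,  unknown = 0.40)  # P(data|cause)
    post  <- prior * lik / sum(prior * lik)
    round(post, 3)   # a growing 'unknown' posterior suggests a novel disease
    ```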

  16. Posterior Predictive Model Checking in Bayesian Networks

    Science.gov (United States)

    Crawford, Aaron

    2014-01-01

    This simulation study compared the utility of various discrepancy measures within a posterior predictive model checking (PPMC) framework for detecting different types of data-model misfit in multidimensional Bayesian network (BN) models. The investigated conditions were motivated by an applied research program utilizing an operational complex…

  17. Robust bayesian analysis of an autoregressive model with ...

    African Journals Online (AJOL)

    In this work, robust Bayesian analysis of the Bayesian estimation of an autoregressive model with exponential innovations is performed. Using a Bayesian robustness methodology, we show that, using a suitable generalized quadratic loss, we obtain optimal Bayesian estimators of the parameters corresponding to the ...

  18. Empirical Bayesian inference and model uncertainty

    International Nuclear Information System (INIS)

    Poern, K.

    1994-01-01

    This paper presents a hierarchical or multistage empirical Bayesian approach for the estimation of uncertainty concerning the intensity of a homogeneous Poisson process. A class of contaminated gamma distributions is considered to describe the uncertainty concerning the intensity. These distributions in turn are defined through a set of secondary parameters, the knowledge of which is also described and updated via Bayes formula. This two-stage Bayesian approach is an example where the modeling uncertainty is treated in a comprehensive way. Each contaminated gamma distribution, represented by a point in the 3D space of secondary parameters, can be considered a specific model of the uncertainty about the Poisson intensity. Then, by the empirical Bayesian method, each individual model is assigned a posterior probability.

  19. Bayesian mixture models for partially verified data

    DEFF Research Database (Denmark)

    Kostoulas, Polychronis; Browne, William J.; Nielsen, Søren Saxmose

    2013-01-01

    Bayesian mixture models can be used to discriminate between the distributions of continuous test responses for different infection stages. These models are particularly useful in case of chronic infections with a long latent period, like Mycobacterium avium subsp. paratuberculosis (MAP) infection...

  20. Quantifying Registration Uncertainty With Sparse Bayesian Modelling.

    Science.gov (United States)

    Le Folgoc, Loic; Delingette, Herve; Criminisi, Antonio; Ayache, Nicholas

    2017-02-01

    We investigate uncertainty quantification under a sparse Bayesian model of medical image registration. Bayesian modelling has proven powerful to automate the tuning of registration hyperparameters, such as the trade-off between the data and regularization functionals. Sparsity-inducing priors have recently been used to render the parametrization itself adaptive and data-driven. The sparse prior on transformation parameters effectively favors the use of coarse basis functions to capture the global trends in the visible motion while finer, highly localized bases are introduced only in the presence of coherent image information and motion. In earlier work, approximate inference under the sparse Bayesian model was tackled in an efficient Variational Bayes (VB) framework. In this paper we are interested in the theoretical and empirical quality of uncertainty estimates derived under this approximate scheme vs. under the exact model. We implement an (asymptotically) exact inference scheme based on reversible jump Markov Chain Monte Carlo (MCMC) sampling to characterize the posterior distribution of the transformation and compare the predictions of the VB and MCMC based methods. The true posterior distribution under the sparse Bayesian model is found to be meaningful: orders of magnitude for the estimated uncertainty are quantitatively reasonable, the uncertainty is higher in textureless regions and lower in the direction of strong intensity gradients.

  1. Bayesian Modelling of Functional Whole Brain Connectivity

    DEFF Research Database (Denmark)

    Røge, Rasmus

    This thesis deals with parcellation of whole-brain functional magnetic resonance imaging (fMRI) using Bayesian inference with mixture models tailored to the fMRI data. In the three included papers and manuscripts, we analyze two different approaches to modeling fMRI signal; either we accept the prevalent strategy of standardizing fMRI time series and model data using directional statistics, or we model the variability in the signal across the brain and across multiple subjects. In either case, we use Bayesian nonparametric modeling to automatically learn from the fMRI data the number of functional units, i.e. parcels. We benchmark the proposed mixture models against state-of-the-art methods of brain parcellation, both probabilistic and non-probabilistic. The time series of each voxel are most often standardized using z-scoring, which projects the time series data onto a hypersphere...

  2. Distributed Bayesian Networks for User Modeling

    DEFF Research Database (Denmark)

    Tedesco, Roberto; Dolog, Peter; Nejdl, Wolfgang

    2006-01-01

    The World Wide Web is a popular platform for providing eLearning applications to a wide spectrum of users. However – as users differ in their preferences, background, requirements, and goals – applications should provide personalization mechanisms. In the Web context, user models used by such adaptive applications are often partial fragments of an overall user model. The fragments have then to be collected and merged into a global user profile. In this paper we investigate and present algorithms able to cope with distributed, fragmented user models – based on Bayesian Networks – in the context of Web-based eLearning platforms. The scenario we are tackling assumes learners who use several systems over time, which are able to create partial Bayesian Networks for user models based on the local system context. In particular, we focus on how to merge these partial user models. Our merge mechanism...

  3. Modelling dependable systems using hybrid Bayesian networks

    International Nuclear Information System (INIS)

    Neil, Martin; Tailor, Manesh; Marquez, David; Fenton, Norman; Hearty, Peter

    2008-01-01

    A hybrid Bayesian network (BN) is one that incorporates both discrete and continuous nodes. In our extensive applications of BNs for system dependability assessment, the models are invariably hybrid and the need for efficient and accurate computation is paramount. We apply a new iterative algorithm that efficiently combines dynamic discretisation with robust propagation algorithms on junction tree structures to perform inference in hybrid BNs. We illustrate its use in the field of dependability with two examples of reliability estimation. Firstly we estimate the reliability of a simple single system and next we implement a hierarchical Bayesian model. In the hierarchical model we compute the reliability of two unknown subsystems from data collected on historically similar subsystems and then input the result into a reliability block model to compute system level reliability. We conclude that dynamic discretisation can be used as an alternative to analytical or Monte Carlo methods with high precision and can be applied to a wide range of dependability problems.

  4. Cost Effective Community Based Dementia Screening: A Markov Model Simulation

    Directory of Open Access Journals (Sweden)

    Erin Saito

    2014-01-01

    Background. Given the dementia epidemic and the increasing cost of healthcare, there is a need to assess the economic benefit of community-based dementia screening programs. Materials and Methods. Markov model simulations were generated using data obtained from a community-based dementia screening program over a one-year period. The models simulated yearly costs of caring for patients based on clinical transitions, beginning in pre-dementia and extending for 10 years. Results. A total of 93 individuals (74 female, 19 male) were screened for dementia, and 12 meeting clinical criteria for either mild cognitive impairment (n=7) or dementia (n=5) were identified. Assuming early therapeutic intervention beginning during the year of dementia detection, Markov model simulations demonstrated a 9.8% reduction in the cost of dementia care over a ten-year simulation period, primarily through increased duration in mild stages and reduced time in the more costly moderate and severe stages. Discussion. Community-based dementia screening can reduce healthcare costs associated with caring for demented individuals through earlier detection and treatment, resulting in proportionately reduced time in more costly advanced stages.
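    A generic Markov cohort calculation of the kind described; the state progression mirrors the abstract, but the yearly transition probabilities and stage costs below are invented, not the study's estimates.

    ```r
    states <- c("MCI", "mild", "moderate", "severe")
    P <- matrix(c(0.80, 0.20, 0.00, 0.00,   # yearly transition probabilities
                  0.00, 0.70, 0.30, 0.00,
                  0.00, 0.00, 0.75, 0.25,
                  0.00, 0.00, 0.00, 1.00),
                nrow = 4, byrow = TRUE, dimnames = list(states, states))
    cost <- c(MCI = 2000, mild = 10000, moderate = 30000, severe = 60000)
    dist <- c(1, 0, 0, 0)                   # cohort starts in MCI
    total <- 0
    for (yr in 1:10) {                      # 10-year horizon, no discounting
      dist  <- as.vector(dist %*% P)
      total <- total + sum(dist * cost)
    }
    total   # expected 10-year cost per patient
    ```

    Early therapeutic intervention enters such a model as reduced transition probabilities out of the milder states, which is what drives the simulated saving.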

  5. Bayesian disease mapping: hierarchical modeling in spatial epidemiology

    National Research Council Canada - National Science Library

    Lawson, Andrew

    2013-01-01

    .... Exploring these new developments, Bayesian Disease Mapping: Hierarchical Modeling in Spatial Epidemiology, Second Edition provides an up-to-date, cohesive account of the full range of Bayesian disease mapping methods and applications...

  6. Constrained bayesian inference of project performance models

    OpenAIRE

    Sunmola, Funlade

    2013-01-01

    Project performance models play an important role in the management of project success. When used for monitoring projects, they can offer predictive ability, such as indications of possible delivery problems. Approaches for monitoring project performance rely on available project information, including restrictions imposed on the project, particularly the constraints of cost, quality, scope and time. We study in this paper a Bayesian inference methodology for project performance modelling in ...

  7. Bayesian methodology for reliability model acceptance

    International Nuclear Information System (INIS)

    Zhang Ruoxue; Mahadevan, Sankaran

    2003-01-01

    This paper develops a methodology to assess the reliability computation model validity using the concept of Bayesian hypothesis testing, by comparing the model prediction and experimental observation, when there is only one computational model available to evaluate system behavior. Time-independent and time-dependent problems are investigated, with consideration of both cases: with and without statistical uncertainty in the model. The case of time-independent failure probability prediction with no statistical uncertainty is a straightforward application of Bayesian hypothesis testing. However, for the life prediction (time-dependent reliability) problem, a new methodology is developed in this paper to make the same Bayesian hypothesis testing concept applicable. With the existence of statistical uncertainty in the model, in addition to the application of a predictor estimator of the Bayes factor, the uncertainty in the Bayes factor is explicitly quantified through treating it as a random variable and calculating the probability that it exceeds a specified value. The developed method provides a rational criterion to decision-makers for the acceptance or rejection of the computational model

  8. Bayesian estimation and modeling: Editorial to the second special issue on Bayesian data analysis.

    Science.gov (United States)

    Chow, Sy-Miin; Hoijtink, Herbert

    2017-12-01

    This editorial accompanies the second special issue on Bayesian data analysis published in this journal. The emphases of this issue are on Bayesian estimation and modeling. In this editorial, we outline the basics of current Bayesian estimation techniques and some notable developments in the statistical literature, as well as adaptations and extensions by psychological researchers to better tailor to the modeling applications in psychology. We end with a discussion on future outlooks of Bayesian data analysis in psychology. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  9. Network structure exploration via Bayesian nonparametric models

    International Nuclear Information System (INIS)

    Chen, Y; Wang, X L; Xiang, X; Tang, B Z; Bu, J Z

    2015-01-01

    Complex networks provide a powerful mathematical representation of complex systems in nature and society. To understand complex networks, it is crucial to explore their internal structures, also called structural regularities. The task of network structure exploration is to determine how many groups there are in a complex network and how to group the nodes of the network. Most existing structure exploration methods need to specify either a group number or a certain type of structure when they are applied to a network. In the real world, however, the group number and also the certain type of structure that a network has are usually unknown in advance. To explore structural regularities in complex networks automatically, without any prior knowledge of the group number or the certain type of structure, we extend a probabilistic mixture model that can handle networks with any type of structure but needs to specify a group number using Bayesian nonparametric theory. We also propose a novel Bayesian nonparametric model, called the Bayesian nonparametric mixture (BNPM) model. Experiments conducted on a large number of networks with different structures show that the BNPM model is able to explore structural regularities in networks automatically with a stable, state-of-the-art performance. (paper)

  10. Bayesian Recurrent Neural Network for Language Modeling.

    Science.gov (United States)

    Chien, Jen-Tzung; Ku, Yuan-Chu

    2016-02-01

    A language model (LM) is calculated as the probability of a word sequence that provides the solution to word prediction for a variety of information systems. A recurrent neural network (RNN) is powerful at learning the large-span dynamics of a word sequence in the continuous space. However, the training of the RNN-LM is an ill-posed problem because of too many parameters from a large dictionary size and a high-dimensional hidden layer. This paper presents a Bayesian approach to regularize the RNN-LM and applies it to continuous speech recognition. We aim to penalize an overly complicated RNN-LM by compensating for the uncertainty of the estimated model parameters, which is represented by a Gaussian prior. The objective function in a Bayesian classification network is formed as the regularized cross-entropy error function. The regularized model is constructed not only by calculating the regularized parameters according to the maximum a posteriori criterion but also by estimating the Gaussian hyperparameter by maximizing the marginal likelihood. A rapid approximation to the Hessian matrix is developed to implement the Bayesian RNN-LM (BRNN-LM) by selecting a small set of salient outer-products. The proposed BRNN-LM achieves a sparser model than the RNN-LM. Experiments on different corpora show the robustness of system performance by applying the rapid BRNN-LM under different conditions.

  11. Centralized Bayesian reliability modelling with sensor networks

    Czech Academy of Sciences Publication Activity Database

    Dedecius, Kamil; Sečkárová, Vladimíra

    2013-01-01

    Roč. 19, č. 5 (2013), s. 471-482 ISSN 1387-3954 R&D Projects: GA MŠk 7D12004 Grant - others:GA MŠk(CZ) SVV-265315 Keywords : Bayesian modelling * Sensor network * Reliability Subject RIV: BD - Theory of Information Impact factor: 0.984, year: 2013 http://library.utia.cas.cz/separaty/2013/AS/dedecius-0392551.pdf

  12. Modelling the cost-effectiveness of catch-up 'MenB' (Bexsero) vaccination in England.

    Science.gov (United States)

    Christensen, Hannah; Trotter, Caroline L

    2017-01-05

    We assessed the cost-effectiveness of offering catch-up vaccination with Bexsero against meningococcal disease to children too old to receive the vaccine under the recently introduced infant programme. Offering catch-up vaccination to increasingly older children is less economically attractive because of declining disease burden. We estimate catch-up vaccination of 1-year-old children could be cost-effective, incremental on the infant programme, with a vaccine price of ⩽£8 per dose. Extending vaccination to 2-year-olds could only be cost-effective (incremental on infant and 1-year-old catch-up) with a vaccine price of ⩽£3 per dose and was not cost-effective in sensitivity analyses with more conservative vaccine assumptions. Extending catch-up further to 3-4-year-olds was not cost-effective. Employing the current criteria for assessing vaccines, our models suggest that even with low vaccine prices only catch-up vaccination in 1-year-old children could be cost-effective, when considered incrementally on the infant programme. Copyright © 2016. Published by Elsevier Ltd.

  13. Bayesian Inference of a Multivariate Regression Model

    Directory of Open Access Journals (Sweden)

    Marick S. Sinay

    2014-01-01

    We explore Bayesian inference of a multivariate linear regression model with use of a flexible prior for the covariance structure. The commonly adopted Bayesian setup involves the conjugate prior, multivariate normal distribution for the regression coefficients and inverse Wishart specification for the covariance matrix. Here we depart from this approach and propose a novel Bayesian estimator for the covariance. A multivariate normal prior for the unique elements of the matrix logarithm of the covariance matrix is considered. Such structure allows for a richer class of prior distributions for the covariance, with respect to strength of beliefs in prior location hyperparameters, as well as the added ability to model potential correlation amongst the covariance structure. The posterior moments of all relevant parameters of interest are calculated based upon numerical results via a Markov chain Monte Carlo procedure. The Metropolis-Hastings-within-Gibbs algorithm is invoked to account for the construction of a proposal density that closely matches the shape of the target posterior distribution. As an application of the proposed technique, we investigate a multiple regression based upon the 1980 High School and Beyond Survey.

  14. Bayesian structural equation modeling in sport and exercise psychology.

    Science.gov (United States)

    Stenling, Andreas; Ivarsson, Andreas; Johnson, Urban; Lindwall, Magnus

    2015-08-01

    Bayesian statistics is on the rise in mainstream psychology, but applications in sport and exercise psychology research are scarce. In this article, the foundations of Bayesian analysis are introduced, and we will illustrate how to apply Bayesian structural equation modeling in a sport and exercise psychology setting. More specifically, we contrasted a confirmatory factor analysis on the Sport Motivation Scale II estimated with the most commonly used estimator, maximum likelihood, and a Bayesian approach with weakly informative priors for cross-loadings and correlated residuals. The results indicated that the model with Bayesian estimation and weakly informative priors provided a good fit to the data, whereas the model estimated with a maximum likelihood estimator did not produce a well-fitting model. The reasons for this discrepancy between maximum likelihood and Bayesian estimation are discussed as well as potential advantages and caveats with the Bayesian approach.

  15. Impact and cost-effectiveness of chlamydia testing in Scotland: a mathematical modelling study.

    Science.gov (United States)

    Looker, Katharine J; Wallace, Lesley A; Turner, Katherine M E

    2015-01-15

    Chlamydia is the most common sexually transmitted bacterial infection in Scotland and is associated with potentially serious reproductive outcomes, including pelvic inflammatory disease (PID) and tubal factor infertility (TFI) in women. Chlamydia testing in Scotland is currently targeted towards symptomatic individuals, individuals at high risk of existing undetected infection, and young people. The cost-effectiveness of testing and treatment to prevent PID and TFI in Scotland is uncertain. A compartmental deterministic dynamic model of chlamydia infection in 15-24 year olds in Scotland was developed. The model was used to estimate the impact of a change in testing strategy from baseline (16.8% overall testing coverage; 0.4 partners notified and tested/treated per treated positive index) on PID and TFI cases. Cost-effectiveness calculations informed by best-available estimates of the quality-adjusted life years (QALYs) lost due to PID and TFI were also performed. Increasing overall testing coverage by 50% from baseline to 25.2% is estimated to result in 21% fewer cases in young women each year (PID: 703 fewer; TFI: 88 fewer). A 50% decrease to 8.4% would result in 20% more PID (669 additional) and TFI (84 additional) cases occurring annually. The cost per QALY gained of current testing activities compared to no testing is £40,034, which is above the £20,000-£30,000 cost-effectiveness threshold; however, these calculations are hampered by a lack of reliable data. Any increase in partner notification from baseline would be cost-effective (incremental cost per QALY gained for a partner notification efficacy of 1 compared to baseline: £5,119), and would increase the cost-effectiveness of the current testing strategy compared to no testing, with threshold cost-effectiveness reached at a partner notification efficacy of 1.5. However, there is uncertainty in the extent to which partner notification is currently done, and hence the amount by which it could potentially be increased.

  16. Scale models: A proven cost-effective tool for outage planning

    Energy Technology Data Exchange (ETDEWEB)

    Lee, R. [Commonwealth Edison Co., Morris, IL (United States); Segroves, R. [Sargent & Lundy, Chicago, IL (United States)

    1995-03-01

    As generation costs for operating nuclear stations have risen, more nuclear utilities have initiated efforts to improve cost effectiveness. Nuclear plant owners are also being challenged with lower radiation exposure limits and newly revised radiation protection regulations (10 CFR 20), which place further stress on their budgets. As source term reduction activities continue to lower radiation fields, reducing the amount of time spent in radiation fields becomes one of the most cost-effective ways of reducing radiation exposure. An effective approach for minimizing time spent in radiation areas is to use a physical scale model for worker orientation and for planning and monitoring maintenance, modifications, and outage activities. To meet the challenge of continued reduction in annual cumulative radiation exposures, new cost-effective tools are required. One field-tested and proven tool is the physical scale model.

  17. Cost Effectiveness of Screening Colonoscopy Depends on Adequate Bowel Preparation Rates - A Modeling Study.

    Directory of Open Access Journals (Sweden)

    James Kingsley

    Inadequate bowel preparation during screening colonoscopy necessitates repeating the colonoscopy, and studies suggest inadequate bowel preparation rates of 20-60%. This increases the cost of colonoscopy to society. The aim of this study is to determine the impact of the inadequate bowel preparation rate on the cost effectiveness of colonoscopy compared to other screening strategies for colorectal cancer (CRC). A microsimulation model of CRC screening strategies was built for the general population at average risk for CRC. The strategies include a fecal immunochemical test (FIT) every year, colonoscopy every ten years, sigmoidoscopy every five years, or a stool DNA test every 3 years. Screening could be performed at private practice offices, outpatient hospitals, and ambulatory surgical centers. At the currently assumed inadequate bowel preparation rate of 25%, the cost of colonoscopy as a screening strategy is above the assumed societal willingness-to-pay threshold of $50,000/QALY. Threshold analysis demonstrated that an inadequate bowel preparation rate of 13% or less is necessary before colonoscopy is considered more cost effective than FIT. At an inadequate bowel preparation rate of 25%, colonoscopy is still more cost effective than sigmoidoscopy and the stool DNA test. Sensitivity analysis of all inputs adjusted by ±10% showed that incremental cost-effectiveness ratio values were influenced most by the specificity, adherence, and sensitivity of FIT and colonoscopy. Screening colonoscopy is not a cost-effective strategy compared with the fecal immunochemical test as long as the inadequate bowel preparation rate is greater than 13%.

  18. A cost-effective model for monitoring medicine use in Namibia: Outcomes and implications

    Directory of Open Access Journals (Sweden)

    Dan Kibuule

    2017-11-01

    Conclusions: A multisectoral collaborative model is cost-effective in medicine use surveys if there are mutual benefits. Student placements provide an opportunity to build local capacity for routine medicine use evaluation (MUE). Ministries of Health should utilise this innovative approach to assess service delivery.

  19. Bayesian analysis of a correlated binomial model

    OpenAIRE

    Diniz, Carlos A. R.; Tutia, Marcelo H.; Leite, Jose G.

    2010-01-01

    In this paper, a Bayesian approach is applied to the correlated binomial model, CB(n, p, ρ), proposed by Luceño (Comput. Statist. Data Anal. 20 (1995) 511–520). The data augmentation scheme is used to overcome the complexity of the mixture likelihood. MCMC methods, including Gibbs sampling and Metropolis-within-Gibbs, are applied to estimate the posterior marginals of the probability of success p and the correlation coefficient ρ. The sensitivity of the posterior is studied taking...

  20. Bayesian Test of Significance for Conditional Independence: The Multinomial Model

    Directory of Open Access Journals (Sweden)

    Pablo de Morais Andrade

    2014-03-01

    Conditional independence tests have received special attention lately in the machine learning and computational intelligence literature as an important indicator of the relationships among the variables used by their models. In the field of probabilistic graphical models, which includes Bayesian network models, conditional independence tests are especially important for the task of learning the probabilistic graphical model structure from data. In this paper, we propose the full Bayesian significance test for tests of conditional independence for discrete datasets. The full Bayesian significance test is a powerful Bayesian test for precise hypotheses, as an alternative to frequentist significance tests (characterized by the calculation of the p-value).

  1. Telemonitoring after discharge from hospital with heart failure: cost-effectiveness modelling of alternative service designs.

    Science.gov (United States)

    Thokala, Praveen; Baalbaki, Hassan; Brennan, Alan; Pandor, Abdullah; Stevens, John W; Gomersall, Tim; Wang, Jenny; Bakhai, Ameet; Al-Mohammad, Abdallah; Cleland, John; Cowie, Martin R; Wong, Ruth

    2013-09-18

    To estimate the cost-effectiveness of remote monitoring strategies versus usual care for adults recently discharged after a heart failure (HF) exacerbation. Decision analysis modelling of cost-effectiveness using secondary data sources. Acute hospitals in the UK. Patients recently discharged (within 28 days) after a HF exacerbation. Interventions: (1) structured telephone support (STS) via a human-to-machine (STS HM) interface, (2) STS via human-to-human (STS HH) contact and (3) home telemonitoring (TM), compared with (4) usual care. Outcomes: the incremental cost per quality-adjusted life year (QALY) gained by each strategy compared to the next most effective alternative, and the probability of each strategy being cost-effective at varying willingness to pay per QALY gained. TM was the most cost-effective strategy in the scenario using the base case costs. Compared with usual care, TM had an estimated incremental cost-effectiveness ratio (ICER) of £11 873/QALY, whereas STS HH had an ICER of £228 035/QALY against TM. STS HM was dominated by usual care. Threshold analysis suggested that the monthly cost of TM has to be higher than £390 to have an ICER greater than £20 000/QALY against STS HH. Scenario analyses performed using higher costs of usual care, higher costs of STS HH and lower costs of TM do not substantially change the conclusions. Cost-effectiveness analyses suggest that TM was an optimal strategy in most scenarios, but there is considerable uncertainty in relation to clear descriptions of the interventions and robust estimation of costs.
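
    The incremental comparison logic used in this record (each strategy evaluated against the next most effective non-dominated alternative) is easy to make concrete. The sketch below uses invented cost and QALY figures, not the study's inputs, and handles strong dominance only, not extended dominance.

```python
# Hypothetical per-patient (cost, QALY) pairs for illustration only.
strategies = {
    "usual care": (10_000, 1.00),
    "STS HM":     (11_500, 0.98),   # dominated: costs more, yields fewer QALYs
    "TM":         (12_500, 1.21),
    "STS HH":     (16_000, 1.23),
}

# Sort by effectiveness, drop strongly dominated options, then compute the ICER
# of each remaining strategy against the next most effective alternative.
ordered = sorted(strategies.items(), key=lambda kv: kv[1][1])
frontier = []
for name, (cost, qaly) in ordered:
    while frontier and frontier[-1][1][0] >= cost:  # previous option costs more for fewer QALYs
        frontier.pop()
    frontier.append((name, (cost, qaly)))

for (n0, (c0, q0)), (n1, (c1, q1)) in zip(frontier, frontier[1:]):
    print(f"{n1} vs {n0}: ICER = {(c1 - c0) / (q1 - q0):,.0f} GBP/QALY")
```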

  2. Accelerating Bayesian inference for evolutionary biology models.

    Science.gov (United States)

    Meyer, Xavier; Chopard, Bastien; Salamin, Nicolas

    2017-03-01

    Bayesian inference is widely used nowadays and relies largely on Markov chain Monte Carlo (MCMC) methods. Evolutionary biology has greatly benefited from the development of MCMC methods, but the design of more complex and realistic models and the ever-growing availability of novel data are pushing the limits of the current use of these methods. We present a parallel Metropolis-Hastings (M-H) framework built with a novel combination of enhancements aimed at parameter-rich and complex models. On a parameter-rich macroevolutionary model, we show increases in sampling speed of up to 35 times with 32 processors when compared to a sequential M-H process. More importantly, our framework achieves up to a twentyfold faster convergence when estimating the posterior probability of phylogenetic trees using 32 processors, compared to the well-known software MrBayes for Bayesian inference of phylogenetic trees. Availability: https://bitbucket.org/XavMeyer/hogan. Contact: nicolas.salamin@unil.ch. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  3. Bayesian Predictive Models for Rayleigh Wind Speed

    DEFF Research Database (Denmark)

    Shahirinia, Amir; Hajizadeh, Amin; Yu, David C

    2017-01-01

    One of the major challenges with the increase in wind power generation is the uncertain nature of wind speed. So far, the uncertainty about wind speed has been presented through probability distributions, and the existing models that consider this uncertainty primarily treat the distribution parameters as fixed. Here, instead of using a wind speed distribution whose parameters are known or estimated, the parameters are considered as random, with variations following probability distributions. A Bayesian predictive model is proposed for the Rayleigh distribution, which has only a single scale parameter, and closed-form posterior results are also derived. The Bayesian predictive model of the wind speed aggregates the non-homogeneous distributions into a single continuous distribution, and is therefore able to capture the variation among the probability distributions of the wind speeds at the turbines' locations in a wind farm.
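
    The idea of treating the Rayleigh scale parameter as random can be illustrated with a conjugate toy model. The sketch below is our own illustration under an assumed inverse-gamma prior on the squared scale, not the paper's derivation; the data and hyperparameters are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
speeds = rng.rayleigh(scale=7.0, size=200)     # stand-in wind speed measurements

# Rayleigh likelihood: f(x|s2) = (x/s2) exp(-x^2 / (2 s2)), with s2 = sigma^2.
# An InverseGamma(a0, b0) prior on s2 yields the posterior
# InverseGamma(a0 + n, b0 + sum(x_i^2)/2).
a0, b0 = 2.0, 10.0
a_n = a0 + speeds.size
b_n = b0 + np.sum(speeds**2) / 2.0

# Posterior predictive by Monte Carlo: average the Rayleigh density over the
# posterior of s2 instead of plugging in a point estimate.
s2_draws = b_n / rng.gamma(shape=a_n, scale=1.0, size=10_000)  # inverse-gamma draws
predictive = rng.rayleigh(scale=np.sqrt(s2_draws))
print(predictive.mean(), np.quantile(predictive, [0.05, 0.95]))
```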

  4. Bayesian inference with information content model check for Langevin equations

    DEFF Research Database (Denmark)

    Krog, Jens F. C.; Lomholt, Michael Andersen

    2017-01-01

    The Bayesian data analysis framework has been proven to be a systematic and effective method of parameter inference and model selection for stochastic processes. In this work we introduce an information content model check which may serve as a goodness-of-fit test, akin to the chi-square procedure, to complement conventional Bayesian analysis. We demonstrate this extended Bayesian framework on a system of Langevin equations, where coordinate-dependent mobilities and measurement noise hinder the normal mean squared displacement approach.

  5. Modelling the impact and cost-effectiveness of combination prevention amongst HIV serodiscordant couples in Nigeria.

    Science.gov (United States)

    Mitchell, Kate M; Lépine, Aurélia; Terris-Prestholt, Fern; Torpey, Kwasi; Khamofu, Hadiza; Folayan, Morenike O; Musa, Jonah; Anenih, James; Sagay, Atiene S; Alhassan, Emmanuel; Idoko, John; Vickerman, Peter

    2015-09-24

    To estimate the impact and cost-effectiveness of treatment as prevention (TasP), pre-exposure prophylaxis (PrEP) and condom promotion for serodiscordant couples in Nigeria. Mathematical and cost modelling. A deterministic model of HIV-1 transmission within a cohort of serodiscordant couples and to/from external partners was parameterized using data from Nigeria and other African settings. The impact and cost-effectiveness were estimated for condom promotion, PrEP and/or TasP, compared with a baseline where antiretroviral therapy (ART) was offered according to 2010 national guidelines (CD4 […]); impact was additionally compared with a baseline of current ART coverage (35% of those with CD4 […]). […] the most cost-effective strategy [US$1206/disability-adjusted life-year (DALY)]; the next most cost-effective intervention was to additionally give TasP to HIV-positive partners (incremental cost-effectiveness ratio US$1607/DALY), followed by additionally giving PrEP to HIV-negative partners until their HIV-positive partners initiate ART (US$7870/DALY). When impact was measured in terms of infections averted, PrEP with condom promotion prevented double the number of infections of condom promotion alone. The first priority intervention for serodiscordant couples in Nigeria should be scaled-up ART access for HIV-positive partners. Subsequent incremental benefits are greatest with condom promotion and TasP, followed by PrEP.

  6. Bayesian Fundamentalism or Enlightenment? On the explanatory status and theoretical contributions of Bayesian models of cognition.

    Science.gov (United States)

    Jones, Matt; Love, Bradley C

    2011-08-01

    The prominence of Bayesian modeling of cognition has increased recently largely because of mathematical advances in specifying and deriving predictions from complex probabilistic models. Much of this research aims to demonstrate that cognitive behavior can be explained from rational principles alone, without recourse to psychological or neurological processes and representations. We note commonalities between this rational approach and other movements in psychology - namely, Behaviorism and evolutionary psychology - that set aside mechanistic explanations or make use of optimality assumptions. Through these comparisons, we identify a number of challenges that limit the rational program's potential contribution to psychological theory. Specifically, rational Bayesian models are significantly unconstrained, both because they are uninformed by a wide range of process-level data and because their assumptions about the environment are generally not grounded in empirical measurement. The psychological implications of most Bayesian models are also unclear. Bayesian inference itself is conceptually trivial, but strong assumptions are often embedded in the hypothesis sets and the approximation algorithms used to derive model predictions, without a clear delineation between psychological commitments and implementational details. Comparing multiple Bayesian models of the same task is rare, as is the realization that many Bayesian models recapitulate existing (mechanistic-level) theories. Despite the expressive power of current Bayesian models, we argue they must be developed in conjunction with mechanistic considerations to offer substantive explanations of cognition. We lay out several means for such an integration, which take into account the representations on which Bayesian inference operates, as well as the algorithms and heuristics that carry it out. We argue this unification will better facilitate lasting contributions to psychological theory, avoiding the pitfalls that have plagued previous theoretical movements.

  7. MERGING DIGITAL SURFACE MODELS IMPLEMENTING BAYESIAN APPROACHES

    Directory of Open Access Journals (Sweden)

    H. Sadeq

    2016-06-01

    In this research, DSMs from different sources have been merged, based on a probabilistic model using a Bayesian approach. The implemented data have been sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades). A Bayesian approach is preferable when the data obtained from the sensors are limited and many measurements are difficult or costly to obtain; the lack of data can then be compensated for by introducing a priori estimates. To infer the prior, the roofs of the buildings are assumed to be smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimates, GNSS RTK measurements have been collected in the field and are used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West End of Glasgow, containing different kinds of buildings, such as flat-roofed and hipped-roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results show that the model was able to improve the quality of the DSMs, improving characteristics such as the roof surfaces, which consequently led to better representations. The developed model has also been compared with the well-established maximum likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied to DSMs derived from satellite imagery, it can be applied to DSMs from any other source.
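
    The benefit of a prior when merging noisy DSMs can be shown with a toy per-pixel normal-normal update. All heights and variances below are invented, and the paper's entropy-based prior is replaced by a simple smooth-roof prior mean.

```python
import numpy as np

rng = np.random.default_rng(2)
truth = 40 + rng.normal(0, 0.1, size=(5, 5))            # "true" roof heights, metres

dsm_a = truth + rng.normal(0, 0.8, size=truth.shape)    # e.g. a WorldView-1-derived DSM
dsm_b = truth + rng.normal(0, 0.5, size=truth.shape)    # e.g. a Pleiades-derived DSM
var_a, var_b = 0.8**2, 0.5**2

# Smooth-roof prior: a common prior mean with a loose variance.
mu0, var0 = 40.0, 4.0

# Normal-normal conjugate update: posterior precision is the sum of precisions,
# posterior mean is the precision-weighted average of prior and observations.
post_prec = 1/var0 + 1/var_a + 1/var_b
merged = (mu0/var0 + dsm_a/var_a + dsm_b/var_b) / post_prec

print(np.abs(dsm_a - truth).mean(),
      np.abs(dsm_b - truth).mean(),
      np.abs(merged - truth).mean())   # merged DSM has the smallest mean error
```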

  8. Effect on Prediction when Modeling Covariates in Bayesian Nonparametric Models.

    Science.gov (United States)

    Cruz-Marcelo, Alejandro; Rosner, Gary L; Müller, Peter; Stewart, Clinton F

    2013-04-01

    In biomedical research, it is often of interest to characterize biologic processes giving rise to observations and to make predictions of future observations. Bayesian nonparametric methods provide a means for carrying out Bayesian inference while making as few restrictive parametric assumptions as possible. There are several proposals in the literature for extending Bayesian nonparametric models to include dependence on covariates, but limited attention has been directed to the consequences of that choice. In this article, we examine the effect on fitting and predictive performance of incorporating covariates in a class of Bayesian nonparametric models by one of two primary ways: either in the weights or in the locations of a discrete random probability measure. We show that different strategies for incorporating continuous covariates in Bayesian nonparametric models can result in big differences when used for prediction, even though they lead to otherwise similar posterior inferences. When one needs the predictive density, as in optimal design, and this density is a mixture, it is better to make the weights depend on the covariates. We demonstrate these points via a simulated data example and in an application in which one wants to determine the optimal dose of an anticancer drug used in pediatric oncology.

  9. Cost-Effectiveness of a Community Pharmacist-Led Sleep Apnea Screening Program - A Markov Model.

    Directory of Open Access Journals (Sweden)

    Clémence Perraudin

    Despite its high prevalence and major public health ramifications, obstructive sleep apnea syndrome (OSAS) remains underdiagnosed. In many developed countries, because community pharmacists (CPs) are easily accessible, they have been developing additional clinical services that integrate with and complement the services of other healthcare providers (general practitioners (GPs), nurses, etc.). Alternative strategies for primary care screening programs for OSAS involving the CP are discussed. Objective: To estimate the quality of life, costs, and cost-effectiveness of three screening strategies among patients who are at risk of having moderate to severe OSAS in primary care. Design: Markov decision model. Data sources: Published data. Target population: Hypothetical cohort of 50-year-old male patients with symptoms highly evocative of OSAS. Time horizon: The 5 years after initial evaluation for OSAS. Perspective: Societal. Interventions: Screening strategy with CP (CP-GP collaboration), screening strategy without CP (GP alone) and no screening. Outcome measures: Quality of life, survival and costs for each screening strategy. Results: Under almost all modeled conditions, the involvement of CPs in OSAS screening was cost-effective. The maximal incremental cost for the "screening strategy with CP" was about €455 per QALY gained. Our results were robust but primarily sensitive to the costs of treatment by continuous positive airway pressure and the costs of untreated OSAS. The probabilistic sensitivity analysis showed that the "screening strategy with CP" was dominant in 80% of cases: it was more effective and less costly in 47% of cases, and within the cost-effective range (maximum incremental cost-effectiveness ratio of €6186.67/QALY) in 33% of cases. Conclusions: CP involvement in OSAS screening is a cost-effective strategy. This proposal is consistent with the trend in Europe and the United States to extend the practices and responsibilities of the pharmacist in primary care.
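
    The mechanics of a Markov decision model of this kind are compact enough to sketch. The state names, transition probabilities, costs and utilities below are placeholders, not the study's inputs.

```python
import numpy as np

P = np.array([           # rows: from-state, cols: to-state, one-year transitions
    [0.90, 0.08, 0.02],  # well -> well / OSAS treated / dead
    [0.00, 0.96, 0.04],
    [0.00, 0.00, 1.00],
])
cost = np.array([100.0, 1500.0, 0.0])    # annual cost per person in each state (EUR)
utility = np.array([0.85, 0.80, 0.0])    # annual QALY weight per state
discount = 0.03

cohort = np.array([1.0, 0.0, 0.0])       # everyone starts in "well"
total_cost = total_qaly = 0.0
for year in range(5):                    # the study's 5-year horizon
    cohort = cohort @ P
    df = 1.0 / (1.0 + discount) ** (year + 1)
    total_cost += df * cohort @ cost
    total_qaly += df * cohort @ utility

print(f"expected discounted cost: {total_cost:.0f} EUR, QALYs: {total_qaly:.2f}")
```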

  10. Cost-effective degradation test plan for a nonlinear random-coefficients model

    International Nuclear Information System (INIS)

    Kim, Seong-Joon; Bae, Suk Joo

    2013-01-01

    The determination of the requisite sample size and inspection schedule, considering both testing cost and accuracy, is an important issue in degradation testing. This paper proposes a cost-effective degradation test plan in the context of a nonlinear random-coefficients model, while meeting precision constraints for the failure-time distribution. We introduce a precision measure to quantify the information loss incurred by reducing testing resources. The precision measure is incorporated into time-varying cost functions to reflect real circumstances. We apply a hybrid genetic algorithm to the general cost optimization problem, with reasonable constraints on the level of testing precision, in order to determine a cost-effective inspection scheme. The proposed method is applied to degradation data of plasma display panels (PDPs) following a bi-exponential degradation model. Finally, a simulation-based sensitivity analysis is provided to evaluate the robustness of the proposed degradation test plan.

  11. Model parameter updating using Bayesian networks

    Energy Technology Data Exchange (ETDEWEB)

    Treml, C. A. (Christine A.); Ross, Timothy J.

    2004-01-01

    This paper outlines a model parameter updating technique for a new method of model validation using a modified model reference adaptive control (MRAC) framework with Bayesian Networks (BNs). The model parameter updating within this method is generic in the sense that the model/simulation to be validated is treated as a black box. It must have updateable parameters to which its outputs are sensitive, and those outputs must have metrics that can be compared to that of the model reference, i.e., experimental data. Furthermore, no assumptions are made about the statistics of the model parameter uncertainty, only upper and lower bounds need to be specified. This method is designed for situations where a model is not intended to predict a complete point-by-point time domain description of the item/system behavior; rather, there are specific points, features, or events of interest that need to be predicted. These specific points are compared to the model reference derived from actual experimental data. The logic for updating the model parameters to match the model reference is formed via a BN. The nodes of this BN consist of updateable model input parameters and the specific output values or features of interest. Each time the model is executed, the input/output pairs are used to adapt the conditional probabilities of the BN. Each iteration further refines the inferred model parameters to produce the desired model output. After parameter updating is complete and model inputs are inferred, reliabilities for the model output are supplied. Finally, this method is applied to a simulation of a resonance control cooling system for a prototype coupled cavity linac. The results are compared to experimental data.

  12. Effectiveness and cost-effectiveness of antidepressants in primary care: a multiple treatment comparison meta-analysis and cost-effectiveness model.

    Directory of Open Access Journals (Sweden)

    Joakim Ramsberg

    OBJECTIVE: To determine the effectiveness and cost-effectiveness over a one-year time horizon of pharmacological first-line treatment in primary care for patients with moderate to severe depression. DESIGN: A multiple treatment comparison meta-analysis was employed to determine the relative efficacy in terms of remission of 10 antidepressants (citalopram, duloxetine, escitalopram, fluoxetine, fluvoxamine, mirtazapine, paroxetine, reboxetine, sertraline and venlafaxine). The estimated remission rates were then applied in a decision-analytic model in order to estimate costs and quality of life with different treatments at one year. DATA SOURCES: Meta-analyses of remission rates from randomised controlled trials, and cost and quality-of-life data from published sources. RESULTS: The most favourable pharmacological treatment in terms of remission was escitalopram, with an 8- to 12-week probability of remission of 0.47. Despite a high acquisition cost, this clinical effectiveness translated into escitalopram being both more effective and having a lower total cost than all other comparators from a societal perspective. From a healthcare perspective, the cost per QALY of escitalopram was €3732 compared with venlafaxine. CONCLUSION: Of the investigated antidepressants, escitalopram has the highest probability of remission and is the most effective and cost-effective pharmacological treatment in a primary care setting, when evaluated over a one-year time horizon. Small differences in remission rates may be important when assessing the costs and cost-effectiveness of antidepressants.

  13. Effectiveness and Cost-Effectiveness of Antidepressants in Primary Care: A Multiple Treatment Comparison Meta-Analysis and Cost-Effectiveness Model

    Science.gov (United States)

    Ramsberg, Joakim; Asseburg, Christian; Henriksson, Martin

    2012-01-01

    Objective To determine the effectiveness and cost-effectiveness over a one-year time horizon of pharmacological first-line treatment in primary care for patients with moderate to severe depression. Design A multiple treatment comparison meta-analysis was employed to determine the relative efficacy in terms of remission of 10 antidepressants (citalopram, duloxetine, escitalopram, fluoxetine, fluvoxamine, mirtazapine, paroxetine, reboxetine, sertraline and venlafaxine). The estimated remission rates were then applied in a decision-analytic model in order to estimate costs and quality of life with different treatments at one year. Data Sources Meta-analyses of remission rates from randomised controlled trials, and cost and quality-of-life data from published sources. Results The most favourable pharmacological treatment in terms of remission was escitalopram, with an 8- to 12-week probability of remission of 0.47. Despite a high acquisition cost, this clinical effectiveness translated into escitalopram being both more effective and having a lower total cost than all other comparators from a societal perspective. From a healthcare perspective, the cost per QALY of escitalopram was €3732 compared with venlafaxine. Conclusion Of the investigated antidepressants, escitalopram has the highest probability of remission and is the most effective and cost-effective pharmacological treatment in a primary care setting, when evaluated over a one-year time horizon. Small differences in remission rates may be important when assessing the costs and cost-effectiveness of antidepressants. PMID:22876296

  14. Chemical identification using Bayesian model selection

    Energy Technology Data Exchange (ETDEWEB)

    Burr, Tom; Fry, H. A. (Herbert A.); McVey, B. D. (Brian D.); Sander, E. (Eric)

    2002-01-01

    Remote detection and identification of chemicals in a scene is a challenging problem. We introduce an approach that uses some of the image's pixels to establish the background characteristics, while other pixels represent the target for which we seek to identify all chemical species present. This leads to a generalized least squares problem in which we focus on 'subset selection' to identify the chemicals thought to be present. Bayesian model selection allows us to approximate the posterior probability that each chemical in the library is present by adding the posterior probabilities of all the subsets which include that chemical. We present results using realistic simulated data for cases with 1 to 5 chemicals present in each target, and compare performance to a hybrid forward-and-backward stepwise selection procedure based on the F statistic.
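
    The subset-selection idea can be illustrated by enumerating subsets of a small simulated library and weighting each by an approximate marginal likelihood (here a BIC approximation, a common stand-in rather than the authors' exact computation); each chemical's posterior probability of presence is then the sum over subsets containing it.

```python
import itertools
import numpy as np

rng = np.random.default_rng(3)
n_bands, library = 50, ["chemA", "chemB", "chemC", "chemD"]
S = rng.normal(size=(n_bands, len(library)))         # simulated library spectra (columns)
truth = S[:, [0, 2]] @ np.array([1.0, 0.7])          # chemA and chemC actually present
y = truth + rng.normal(0, 0.1, size=n_bands)         # observed target pixel

def bic(subset):
    """BIC of the least-squares fit using only the given library columns."""
    X = S[:, subset]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return n_bands * np.log(rss / n_bands) + len(subset) * np.log(n_bands)

subsets = [s for r in range(1, 4) for s in itertools.combinations(range(len(library)), r)]
w = np.array([np.exp(-0.5 * bic(list(s))) for s in subsets])
w /= w.sum()                                         # approximate posterior over subsets

for j, name in enumerate(library):                   # P(chemical present) = sum over
    print(name, w[[j in s for s in subsets]].sum())  # subsets that include it
```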

  15. Modelling crime linkage with Bayesian networks.

    Science.gov (United States)

    de Zoete, Jacob; Sjerps, Marjan; Lagnado, David; Fenton, Norman

    2015-05-01

    When two or more crimes show specific similarities, such as a very distinct modus operandi, the probability that they were committed by the same offender becomes of interest. This probability depends on the degree of similarity and distinctiveness. We show how Bayesian networks can be used to model different evidential structures that can occur when linking crimes, and how they assist in understanding the complex underlying dependencies. That is, how evidence that is obtained in one case can be used in another and vice versa. The flip side of this is that the intuitive decision to "unlink" a case in which exculpatory evidence is obtained leads to serious overestimation of the strength of the remaining cases. Copyright © 2014 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.

  16. Cost-effectiveness simulation model of biologic strategies for treating to target rheumatoid arthritis in Germany.

    Science.gov (United States)

    Beresniak, Ariel; Baerwald, Christoph; Zeidler, Henning; Krüger, Klaus; Neubauer, Aljoscha S; Dupont, Danielle; Merkesdal, Sonja

    2013-01-01

    The treatment of active rheumatoid arthritis (RA) usually requires different therapeutic options used sequentially in case of an insufficient response (IR) to previous agents. Since there is a lack of clinical trials comparing biologic treatment sequences, simulation models can add to the understanding of optimal treatment sequences and their cost-effectiveness. The objective of this study was to assess the cost-effectiveness of different biologic treatment strategies in patients with an IR to anti-TNF agents, based on levels of disease activity, from the German public payer's perspective. A cost-effectiveness sequential model was developed in accordance with local RA treatment strategies, using DAS28 scores as dichotomous effectiveness endpoints: achieving remission/no remission (RS/no RS) or a state of low disease activity (LDAS/no LDAS). Costs were estimated using resource utilisation data obtained from a large observational German cohort. Advanced simulations were conducted to assess the cost-effectiveness over 2 years of four sequential biologic strategies composed of up to 3 biologic agents, namely anti-TNF agents, abatacept or rituximab, in patients with moderate-to-severe active RA and an IR to at least one anti-TNF agent. Over two years, the biologic sequence including abatacept after an IR to one anti-TNF agent appeared the most effective and cost-effective versus (vs.) use after two anti-TNF agents (€633 vs. €1,067/day in LDAS and €1,222 vs. €3,592/day in remission), and vs. a similar sequence using rituximab (€633 vs. €728/day in LDAS and €1,222 vs. €1,812/day in remission). The sequence using a third anti-TNF agent was less effective and less cost-effective than the same sequence using abatacept (€2,000 vs. €1,067/day in LDAS and €6,623 vs. €3,592/day in remission). All differences were statistically significant. Overall, the abatacept-based sequence was more cost-effective than similar sequences including rituximab or only cycled anti-TNF agents.

  17. Cost-effectiveness of new pneumococcal conjugate vaccines in Turkey: a decision analytical model

    Directory of Open Access Journals (Sweden)

    Bakır Mustafa

    2012-11-01

    Background: Streptococcus pneumoniae infections, which place a considerable burden on healthcare resources, can be reduced in a cost-effective manner using a 7-valent pneumococcal conjugate vaccine (PCV-7). We compare the cost effectiveness of a 13-valent PCV (PCV-13) and a 10-valent pneumococcal non-typeable Haemophilus influenzae protein D conjugate vaccine (PHiD-CV) with that of PCV-7 in Turkey. Methods: A cost-utility analysis was conducted, and a decision analytical model was used to estimate the proportion of the Turkish population […]. Results: PCV-13 and PHiD-CV are projected to have a substantial impact on pneumococcal disease in Turkey versus PCV-7, with 2,223 and 3,156 quality-adjusted life years (QALYs) and 2,146 and 2,081 life years, respectively, being saved under a 3+1 schedule. Projections of direct medical costs showed that a PHiD-CV vaccination programme would provide the greatest cost savings, offering additional savings of US$11,718,813 versus PCV-7 and US$8,235,010 versus PCV-13. Probabilistic sensitivity analysis showed that PHiD-CV dominated PCV-13 in terms of QALYs gained and cost savings in 58.3% of simulations. Conclusion: Under the modeled conditions, PHiD-CV would provide the most cost-effective intervention for reducing pneumococcal disease in Turkish children.

  18. The cost-effectiveness of the Olweus Bullying Prevention Program: Results from a modelling study.

    Science.gov (United States)

    Beckman, Linda; Svensson, Mikael

    2015-12-01

    Exposure to bullying affects around 3-5 percent of adolescents in secondary school and is related to various mental health problems. Many different anti-bullying programmes are currently available, but economic evaluations are lacking. The aim of this study is to assess the cost-effectiveness of the Olweus Bullying Prevention Program (OBPP). We constructed a decision-tree model for a Swedish secondary school, using a public payer perspective, and retrieved data on costs and effects from the published literature. Probabilistic sensitivity analysis was conducted to reflect the uncertainty in the model. The base-case analysis showed that using the OBPP to reduce the number of victims of bullying costs 131,250 Swedish kronor (€14,470) per victim spared. Compared with a relevant willingness-to-pay threshold for the societal value of bullying reduction, this indicates that the OBPP is a cost-effective intervention. Copyright © 2015 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.

  19. A Cost-Effective Tracking Algorithm for Hypersonic Glide Vehicle Maneuver Based on Modified Aerodynamic Model

    Directory of Open Access Journals (Sweden)

    Yu Fan

    2016-10-01

    In order to defend against the hypersonic glide vehicle (HGV), a cost-effective single-model tracking algorithm using the Cubature Kalman filter (CKF) is proposed in this paper, based on a modified aerodynamic model (MAM) as the process equation and a radar measurement model as the measurement equation. In the existing aerodynamic model, the two control variables, attack angle and bank angle, cannot be measured by existing radar equipment, and their control laws are unknown to defenders. To establish the process equation, the MAM for HGV tracking is proposed by using additive white noise to model the rates of change of the two control variables. For ease of comparison, several multiple-model algorithms based on the CKF are also presented, including the interacting multiple model (IMM) algorithm, the adaptive grid interacting multiple model (AGIMM) algorithm and the hybrid grid multiple model (HGMM) algorithm. The performances of these algorithms are compared and analyzed according to the simulation results, which indicate that the proposed algorithm based on the modified aerodynamic model has the best tracking performance, with the best accuracy and the least computational cost, among all tracking algorithms in this paper. The proposed algorithm is cost-effective for HGV tracking.

  20. Hierarchical Bayesian models of subtask learning.

    Science.gov (United States)

    Anglim, Jeromy; Wynton, Sarah K A

    2015-07-01

    The current study used Bayesian hierarchical methods to challenge and extend previous work on subtask learning consistency. A general model of individual-level subtask learning was proposed focusing on power and exponential functions with constraints to test for inconsistency. To study subtask learning, we developed a novel computer-based booking task, which logged participant actions, enabling measurement of strategy use and subtask performance. Model comparison was performed using deviance information criterion (DIC), posterior predictive checks, plots of model fits, and model recovery simulations. Results showed that although learning tended to be monotonically decreasing and decelerating, and approaching an asymptote for all subtasks, there was substantial inconsistency in learning curves both at the group- and individual-levels. This inconsistency was most apparent when constraining both the rate and the ratio of learning to asymptote to be equal across subtasks, thereby giving learning curves only 1 parameter for scaling. The inclusion of 6 strategy covariates provided improved prediction of subtask performance capturing different subtask learning processes and subtask trade-offs. In addition, strategy use partially explained the inconsistency in subtask learning. Overall, the model provided a more nuanced representation of how complex tasks can be decomposed in terms of simpler learning mechanisms. (c) 2015 APA, all rights reserved.
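
    The two candidate learning-curve families named in the abstract above are easy to fit and compare on simulated data. The sketch below is deliberately non-hierarchical and uses AIC rather than the paper's DIC, so it only gestures at the model-comparison workflow; all data and parameters are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(4)
trial = np.arange(1, 61)
rt = 2.0 + 6.0 * trial**-0.7 + rng.normal(0, 0.3, trial.size)  # simulated subtask times (s)

def power_curve(t, asym, amp, rate):        # decelerating toward an asymptote
    return asym + amp * t**-rate

def exp_curve(t, asym, amp, rate):
    return asym + amp * np.exp(-rate * t)

for name, f in [("power", power_curve), ("exponential", exp_curve)]:
    params, _ = curve_fit(f, trial, rt, p0=(1.0, 5.0, 0.5), maxfev=10_000)
    rss = np.sum((rt - f(trial, *params)) ** 2)
    aic = trial.size * np.log(rss / trial.size) + 2 * 3   # 3 free parameters each
    print(f"{name}: params={np.round(params, 2)}, AIC={aic:.1f}")
```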

  1. Vaginal microbicides save money: a model of cost-effectiveness in South Africa and the USA.

    Science.gov (United States)

    Verguet, S; Walsh, J A

    2010-06-01

    To determine the hypothetical cost-effectiveness of vaginal microbicides for preventing male-to-female HIV transmission, a mathematical epidemiological and cost-effectiveness model was built using data from South Africa and the USA. The prospective 1-year-long intervention targeted a general population of women in a city of 1,000,000 inhabitants in two very different epidemiological settings: South Africa, with a male HIV prevalence of 18.80%, and the USA, with a male HIV prevalence of 0.72%. The base case scenario assumes a microbicide that is 55% effective, used in 30% of sexual episodes, at a public-sector retail price of US$0.51 per use in South Africa and US$2.23 per use in the USA. In South Africa, over 1 year, the intervention would prevent 1908 infections and save US$6712 per infection averted as compared with antiretroviral treatment. In the USA, it would be more costly: over 1 year, the intervention would prevent 21 infections, amounting to a net cost per infection averted of US$405,077. However, in the setting of Washington DC, with a higher HIV prevalence, the same intervention would prevent 93 infections and save US$91,176 per infection averted. Sensitivity analyses were conducted and showed that even a microbicide with a low effectiveness of 30% would still save healthcare costs in South Africa. A microbicide intervention is likely to be very cost-effective in a country undergoing a high-level generalised epidemic such as South Africa, but is unlikely to be cost-effective in a developed country presenting epidemiological features similar to the USA unless the male HIV prevalence exceeds 2.4%.

  2. Prior sensitivity analysis in default Bayesian structural equation modeling

    NARCIS (Netherlands)

    van Erp, S.J.; Mulder, J.; Oberski, Daniel L.

    2018-01-01

    Bayesian structural equation modeling (BSEM) has recently gained popularity because it enables researchers to fit complex models while solving some of the issues often encountered in classical maximum likelihood (ML) estimation, such as nonconvergence and inadmissible solutions. An important […]

  3. Cost-effectiveness models for chronic obstructive pulmonary disease : cross-model comparison of hypothetical treatment scenarios

    NARCIS (Netherlands)

    Hoogendoorn, Martine; Feenstra, Talitha L; Asukai, Yumi; Borg, Sixten; Hansen, Ryan N; Jansson, Sven-Arne; Samyshkin, Yevgeniy; Wacker, Margarethe; Briggs, Andrew H; Lloyd, Adam; Sullivan, Sean D; Rutten-van Mölken, Maureen P M H

    OBJECTIVES: To compare different chronic obstructive pulmonary disease (COPD) cost-effectiveness models with respect to structure and input parameters and to cross-validate the models by running the same hypothetical treatment scenarios. METHODS: COPD modeling groups simulated four hypothetical treatment scenarios.

  4. Tuberculosis active case finding in Cambodia: a pragmatic, cost-effectiveness comparison of three implementation models.

    Science.gov (United States)

    James, Richard; Khim, Keovathanak; Boudarene, Lydia; Yoong, Joanne; Phalla, Chea; Saint, Saly; Koeut, Pichenda; Mao, Tan Eang; Coker, Richard; Khan, Mishal Sameer

    2017-08-22

    Globally, almost 40% of tuberculosis (TB) patients remain undiagnosed, and those that are diagnosed often experience prolonged delays before initiating correct treatment, leading to ongoing transmission. While there is a push for active case finding (ACF) to improve early detection and treatment of TB, there is extremely limited evidence about the relative cost-effectiveness of different ACF implementation models. Cambodia presents a unique opportunity for addressing this gap in evidence, as ACF has been implemented using different models, but no comparisons have been conducted. The objective of our study is to contribute to knowledge and methodology on comparing the cost-effectiveness of alternative ACF implementation models from the health service perspective, using programmatic data, in order to inform national policy and practice. We retrospectively compared three distinct ACF implementation models - door-to-door symptom screening in urban slums, checking contacts of TB patients, and door-to-door symptom screening focusing on rural populations aged above 55 - in terms of the number of new bacteriologically-positive pulmonary TB cases diagnosed and the cost of implementation, assuming activities are conducted by the national TB program of Cambodia. We calculated the cost per additional case detected using the alternative ACF models. Our analysis, which is the first of its kind for TB, revealed that the ACF model based on door-to-door screening in poor urban areas of Phnom Penh was the most cost-effective (249 USD per case detected, 737 cases diagnosed), followed by the model based on testing contacts of TB patients (308 USD per case detected, 807 cases diagnosed), and symptomatic screening of older rural populations (316 USD per case detected, 397 cases diagnosed). Our study provides new evidence on the relative effectiveness and economics of three implementation models for enhanced TB case finding, in line with calls for data from 'routine conditions' to be included […]

  5. Cost-effectiveness of a new rotavirus vaccination program in Pakistan: a decision tree model.

    Science.gov (United States)

    Patel, Hiten D; Roberts, Eric T; Constenla, Dagna O

    2013-12-09

    Rotavirus gastroenteritis places a significant health and economic burden on Pakistan. To determine the public health impact of a national rotavirus vaccination program, we performed a cost-effectiveness study from the perspective of the health care system. A decision tree model was developed to assess the cost-effectiveness of a national vaccination program in Pakistan. Disease and cost burden with the program were compared to the current state. Disease parameters, vaccine-related costs, and medical treatment costs were based on published epidemiological and economic data, specific to Pakistan where possible. An annual birth cohort of children was followed for 5 years to model the public health impact of vaccination on health-related events and costs. Cost-effectiveness was assessed and quantified in cost (2012 US$) per disability-adjusted life-year (DALY) averted and cost per death averted. Sensitivity analyses were performed to assess the robustness of the incremental cost-effectiveness ratios (ICERs). The base case results showed vaccination prevented 1.2 million cases of rotavirus gastroenteritis, 93,000 outpatient visits, 43,000 hospitalizations, and 6700 deaths by 5 years of age for an annual birth cohort scaled from 6% current coverage to DPT3 levels (85%). The medical cost savings would be US$1.4 million from hospitalizations and US$200,000 from outpatient visit costs. The vaccination program would cost US$35 million at a vaccine price of US$5.00. The ICER was US$149.50 per DALY averted, or US$4972 per death averted. Sensitivity analyses showed that changes in the case-fatality ratio, vaccine efficacy, and vaccine cost exerted the greatest influence on the ICER. Across a range of sensitivity analyses, a national rotavirus vaccination program was predicted to decrease the health and economic burden due to rotavirus gastroenteritis in Pakistan by ~40%. Vaccination was highly cost-effective in this context. As discussions of implementing the intervention […]
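
    The expected-value arithmetic of such a decision tree can be sketched in a few lines. Every number below is a placeholder chosen for illustration, not a parameter from the study.

```python
# Two-arm decision tree: vaccinate vs. no vaccination, evaluated over 5 years.
cohort = 4_800_000                  # annual births (illustrative)
p_gastro, p_death = 0.25, 0.0014    # 5-year risks without vaccination (invented)
vaccine_cost_per_child = 5.00 * 3   # three doses at US$5.00 each
coverage, efficacy = 0.85, 0.50
treat_cost, dalys_per_death = 12.0, 30.0

def arm(vaccinated: bool):
    """Expected cost and DALYs for one branch of the tree."""
    eff = coverage * efficacy if vaccinated else 0.0
    cases = cohort * p_gastro * (1 - eff)
    deaths = cohort * p_death * (1 - eff)
    cost = cases * treat_cost
    if vaccinated:
        cost += cohort * coverage * vaccine_cost_per_child
    return cost, deaths * dalys_per_death

cost0, daly0 = arm(False)
cost1, daly1 = arm(True)
print(f"ICER: {(cost1 - cost0) / (daly0 - daly1):,.0f} USD per DALY averted")
```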

  6. A model-based economic analysis of pre-pandemic influenza vaccination cost-effectiveness.

    Science.gov (United States)

    Halder, Nilimesh; Kelso, Joel K; Milne, George J

    2014-05-16

    A vaccine matched to a newly emerged pandemic influenza virus would require a production time of at least 6 months with current proven techniques, and so could only be used reactively after the peak of the pandemic. A pre-pandemic vaccine, although probably having lower efficacy, could be produced and used pre-emptively. While several previous studies have investigated the cost effectiveness of pre-emptive vaccination strategies, they have not been directly compared to realistic reactive vaccination strategies. An individual-based simulation model of ~30,000 people was used to examine a pre-emptive vaccination strategy, assuming vaccination conducted prior to a pandemic using a low-efficacy vaccine. A reactive vaccination strategy, assuming a 6-month delay between pandemic emergence and availability of a high-efficacy vaccine, was also modelled. Social distancing and antiviral interventions were examined in combination with these alternative vaccination strategies. Moderate and severe pandemics were examined, based on estimates of transmissibility and clinical severity of the 1957 and 1918 pandemics respectively, and the cost effectiveness of each strategy was evaluated. Provided that a pre-pandemic vaccine achieved at least 30% efficacy, pre-emptive vaccination strategies were found to be more cost effective when compared to reactive vaccination strategies. Reactive vaccination coupled with sustained social distancing and antiviral interventions was found to be as effective at saving lives as pre-emptive vaccination coupled with limited duration social distancing and antiviral use, with both strategies saving approximately 420 life-years per 10,000 population for a moderate pandemic with a basic reproduction number of 1.9 and case fatality rate of 0.25%. Reactive vaccination was however more costly due to larger productivity losses incurred by sustained social distancing, costing $8 million per 10,000 population ($19,074/LYS) versus $6.8 million per 10,000 population.

  7. A model-based economic analysis of pre-pandemic influenza vaccination cost-effectiveness

    Science.gov (United States)

    2014-01-01

    Background A vaccine matched to a newly emerged pandemic influenza virus would require a production time of at least 6 months with current proven techniques, and so could only be used reactively after the peak of the pandemic. A pre-pandemic vaccine, although probably having lower efficacy, could be produced and used pre-emptively. While several previous studies have investigated the cost effectiveness of pre-emptive vaccination strategies, they have not been directly compared to realistic reactive vaccination strategies. Methods An individual-based simulation model of ~30,000 people was used to examine a pre-emptive vaccination strategy, assuming vaccination conducted prior to a pandemic using a low-efficacy vaccine. A reactive vaccination strategy, assuming a 6-month delay between pandemic emergence and availability of a high-efficacy vaccine, was also modelled. Social distancing and antiviral interventions were examined in combination with these alternative vaccination strategies. Moderate and severe pandemics were examined, based on estimates of transmissibility and clinical severity of the 1957 and 1918 pandemics respectively, and the cost effectiveness of each strategy was evaluated. Results Provided that a pre-pandemic vaccine achieved at least 30% efficacy, pre-emptive vaccination strategies were found to be more cost effective when compared to reactive vaccination strategies. Reactive vaccination coupled with sustained social distancing and antiviral interventions was found to be as effective at saving lives as pre-emptive vaccination coupled with limited duration social distancing and antiviral use, with both strategies saving approximately 420 life-years per 10,000 population for a moderate pandemic with a basic reproduction number of 1.9 and case fatality rate of 0.25%. Reactive vaccination was however more costly due to larger productivity losses incurred by sustained social distancing, costing $8 million per 10,000 population ($19,074/LYS) versus $6.8 million per 10,000 population.

  8. Bootstrap prediction and Bayesian prediction under misspecified models

    OpenAIRE

    Fushiki, Tadayoshi

    2005-01-01

    We consider a statistical prediction problem under misspecified models. In a sense, Bayesian prediction is an optimal prediction method when an assumed model is true. Bootstrap prediction is obtained by applying Breiman's `bagging' method to a plug-in prediction. Bootstrap prediction can be considered to be an approximation to the Bayesian prediction under the assumption that the model is true. However, in applications, there are frequently deviations from the assumed model. In this paper, bo...

  9. A guide to Bayesian model selection for ecologists

    Science.gov (United States)

    Hooten, Mevin B.; Hobbs, N.T.

    2015-01-01

    The steady upward trend in the use of model selection and Bayesian methods in ecological research has made it clear that both approaches to inference are important for modern analysis of models and data. However, in teaching Bayesian methods and in working with our research colleagues, we have noticed a general dissatisfaction with the available literature on Bayesian model selection and multimodel inference. Students and researchers new to Bayesian methods quickly find that the published advice on model selection is often preferential in its treatment of options for analysis, frequently advocating one particular method above others. The recent appearance of many articles and textbooks on Bayesian modeling has provided welcome background on relevant approaches to model selection in the Bayesian framework, but most of these are either very narrowly focused in scope or inaccessible to ecologists. Moreover, the methodological details of Bayesian model selection approaches are spread thinly throughout the literature, appearing in journals from many different fields. Our aim with this guide is to condense the large body of literature on Bayesian approaches to model selection and multimodel inference and present it specifically for quantitative ecologists as neutrally as possible. We also bring to light a few important and fundamental concepts relating directly to model selection that seem to have gone unnoticed in the ecological literature. Throughout, we provide only a minimal discussion of philosophy, preferring instead to examine the breadth of approaches as well as their practical advantages and disadvantages. This guide serves as a reference for ecologists using Bayesian methods, so that they can better understand their options and can make an informed choice that is best aligned with their goals for inference.

  10. Cost-effectiveness of interventions to promote physical activity: a modelling study.

    Directory of Open Access Journals (Sweden)

    Linda J Cobiac

    2009-07-01

    BACKGROUND: Physical inactivity is a key risk factor for chronic disease, but a growing number of people are not achieving the recommended levels of physical activity necessary for good health. Australians are no exception; despite Australia's image as a sporting nation, with success at the elite level, the majority of Australians do not get enough physical activity. There are many options for intervention, from individually tailored advice, such as counselling from a general practitioner, to population-wide approaches, such as mass media campaigns, but the most cost-effective mix of interventions is unknown. In this study we evaluate the cost-effectiveness of interventions to promote physical activity. METHODS AND FINDINGS: From evidence of intervention efficacy in the physical activity literature and evaluation of the health sector costs of intervention and disease treatment, we model the cost impacts and health outcomes of six physical activity interventions, over the lifetime of the Australian population. We then determine the cost-effectiveness of each intervention against current practice for physical activity intervention in Australia and derive the optimal pathway for implementation. Based on current evidence of intervention effectiveness, the intervention programs that encourage use of pedometers (Dominant) and mass media-based community campaigns (Dominant) are the most cost-effective strategies to implement and are very likely to be cost-saving. The internet-based intervention program (AUS$3,000/DALY), the GP physical activity prescription program (AUS$12,000/DALY), and the program to encourage more active transport (AUS$20,000/DALY), although less likely to be cost-saving, have a high probability of being under a AUS$50,000 per DALY threshold. GP referral to an exercise physiologist (AUS$79,000/DALY) is the least cost-effective option if high time and travel costs for patients in screening and consulting an exercise physiologist are considered.

  11. Hierarchical Bayesian Modeling of Fluid-Induced Seismicity

    Science.gov (United States)

    Broccardo, M.; Mignan, A.; Wiemer, S.; Stojadinovic, B.; Giardini, D.

    2017-11-01

    In this study, we present a Bayesian hierarchical framework to model fluid-induced seismicity. The framework is based on a nonhomogeneous Poisson process with a fluid-induced seismicity rate proportional to the rate of injected fluid. The fluid-induced seismicity rate model depends upon a set of physically meaningful parameters and has been validated for six fluid-induced case studies. In line with the vision of hierarchical Bayesian modeling, the rate parameters are considered as random variables. We develop both the Bayesian inference and updating rules, which are used to build a probabilistic forecasting model. We tested the framework on the Basel 2006 fluid-induced seismicity case study, showing that the hierarchical Bayesian model offers a suitable framework to coherently encode both epistemic uncertainty and aleatory variability. Moreover, it provides a robust and consistent short-term seismic forecasting model suitable for online risk quantification and mitigation.
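
    The core assumption (a seismicity rate proportional to the injection rate) admits a simple conjugate illustration. The injection profile, prior and event counts below are invented, and the paper's full hierarchical treatment of the rate parameters is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(5)
injection = np.array([0.5, 1.0, 2.0, 3.0, 2.0, 1.0, 0.2])  # fluid volume per day
true_k = 4.0                                               # events per unit volume
events = rng.poisson(true_k * injection)                   # simulated daily event counts

# Poisson likelihood with rate k * v_t and a Gamma(a0, b0) prior on k gives a
# Gamma(a0 + sum(events), b0 + sum(injection)) posterior (b's are rate parameters).
a0, b0 = 1.0, 0.5
a_n, b_n = a0 + events.sum(), b0 + injection.sum()
post_mean = a_n / b_n
ci = np.quantile(rng.gamma(a_n, 1.0 / b_n, size=20_000), [0.05, 0.95])
print(f"posterior mean k = {post_mean:.2f}, 90% interval = {np.round(ci, 2)}")
```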

  12. Bayesian Model Selection in Geophysics: The evidence

    Science.gov (United States)

    Vrugt, J. A.

    2016-12-01

    Bayesian inference has found widespread application and use in science and engineering to reconcile Earth system models with data, including prediction in space (interpolation), prediction in time (forecasting), assimilation of observations and deterministic/stochastic model output, and inference of the model parameters. Per Bayes theorem, the posterior probability, P(H|D), of a hypothesis, H, given the data D, is equivalent to the product of its prior probability, P(H), and likelihood, L(H|D), divided by a normalization constant, P(D). In geophysics, the hypothesis, H, often constitutes a description (parameterization) of the subsurface for some entity of interest (e.g. porosity, moisture content). The normalization constant, P(D), is not required for inference of the subsurface structure, yet of great value for model selection. Unfortunately, it is not particularly easy to estimate P(D) in practice. Here, I will introduce the various building blocks of a general purpose method which provides robust and unbiased estimates of the evidence, P(D). This method uses multi-dimensional numerical integration of the posterior (parameter) distribution. I will then illustrate this new estimator by application to three competing subsurface models (hypotheses) using GPR travel time data from the South Oyster Bacterial Transport Site, in Virginia, USA. The three subsurface models differ in their treatment of the porosity distribution and use (a) horizontal layering with fixed layer thicknesses, (b) vertical layering with fixed layer thicknesses and (c) a multi-Gaussian field. The results of the new estimator are compared against the brute force Monte Carlo method, and the Laplace-Metropolis method.
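
    For orientation, the brute-force Monte Carlo benchmark mentioned at the end approximates the evidence as the prior-weighted average of the likelihood. A toy sketch with an invented one-parameter Gaussian model (not the GPR case study):

```python
# Toy illustration of the brute-force Monte Carlo evidence estimate:
#   P(D) = integral of L(theta) p(theta) dtheta ~= mean of L(theta_j), theta_j ~ prior.
# Model and data are hypothetical stand-ins for a real subsurface problem.
import math
import random

random.seed(1)
data = [0.9, 1.1, 1.3]          # hypothetical observations
sigma = 0.5                      # known observation noise

def likelihood(theta):
    """Gaussian likelihood of the data given a scalar parameter theta."""
    return math.prod(
        math.exp(-0.5 * ((y - theta) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))
        for y in data
    )

# Prior: theta ~ Uniform(0, 2); the evidence is the average likelihood under it.
samples = [random.uniform(0.0, 2.0) for _ in range(100_000)]
evidence = sum(likelihood(t) for t in samples) / len(samples)
print(f"Monte Carlo estimate of P(D): {evidence:.5f}")
```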

  13. A Comparison of Four Software Programs for Implementing Decision Analytic Cost-Effectiveness Models.

    Science.gov (United States)

    Hollman, Chase; Paulden, Mike; Pechlivanoglou, Petros; McCabe, Christopher

    2017-08-01

    The volume and technical complexity of both academic and commercial research using decision analytic modelling has increased rapidly over the last two decades. The range of software programs used for their implementation has also increased, but it remains true that a small number of programs account for the vast majority of cost-effectiveness modelling work. We report a comparison of four software programs: TreeAge Pro, Microsoft Excel, R and MATLAB. Our focus is on software commonly used for building Markov models and decision trees to conduct cohort simulations, given their predominance in the published literature around cost-effectiveness modelling. Our comparison uses three qualitative criteria as proposed by Eddy et al.: "transparency and validation", "learning curve" and "capability". In addition, we introduce the quantitative criterion of processing speed. We also consider the cost of each program to academic users and commercial users. We rank the programs based on each of these criteria. We find that, whilst Microsoft Excel and TreeAge Pro are good programs for educational purposes and for producing the types of analyses typically required by health technology assessment agencies, the efficiency and transparency advantages of programming languages such as MATLAB and R become increasingly valuable when more complex analyses are required.
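
    For readers unfamiliar with the model type being benchmarked, a Markov cohort simulation boils down to repeated vector-matrix products over cycles. A minimal three-state sketch (all transition probabilities, costs, and utilities are invented for illustration):

```python
# Minimal Markov cohort simulation of the kind compared across the four
# programs: a three-state model (well, sick, dead) run over annual cycles.
# All transition probabilities, costs, and utilities are hypothetical.
import numpy as np

P = np.array([           # row = state now, column = state next cycle
    [0.90, 0.08, 0.02],  # well -> well / sick / dead
    [0.10, 0.75, 0.15],  # sick -> well / sick / dead
    [0.00, 0.00, 1.00],  # dead is absorbing
])
cost = np.array([100.0, 2_000.0, 0.0])   # annual cost per state
utility = np.array([0.95, 0.60, 0.0])    # annual QALY weight per state
discount = 0.035

cohort = np.array([1.0, 0.0, 0.0])       # everyone starts well
total_cost = total_qaly = 0.0
for cycle in range(40):
    disc = (1 + discount) ** -cycle
    total_cost += disc * cohort @ cost
    total_qaly += disc * cohort @ utility
    cohort = cohort @ P                  # advance the cohort one cycle

print(f"discounted cost per person:  {total_cost:,.0f}")
print(f"discounted QALYs per person: {total_qaly:.2f}")
```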

  14. A conceptual model to estimate cost effectiveness of the indoor environment improvements

    Energy Technology Data Exchange (ETDEWEB)

    Seppanen, Olli; Fisk, William J.

    2003-06-01

    Macroeconomic analyses indicate a high cost to society of a deteriorated indoor climate. The few example calculations performed to date indicate that measures taken to improve IEQ are highly cost-effective when health and productivity benefits are considered. We believe that cost-benefit analyses of building designs and operations should routinely incorporate health and productivity impacts. As an initial step, we developed a conceptual model that shows the links between improvements in IEQ and the financial gains from reductions in medical care and sick leave, improved work performance, lower employee turnover, and reduced maintenance due to fewer complaints.

  15. Involving stakeholders in building integrated fisheries models using Bayesian methods

    DEFF Research Database (Denmark)

    Haapasaari, Päivi Elisabet; Mäntyniemi, Samu; Kuikka, Sakari

    2013-01-01

    A participatory Bayesian approach was used to investigate how the views of stakeholders could be utilized to develop models to help understand the Central Baltic herring fishery. In task one, we applied the Bayesian belief network methodology to elicit the causal assumptions of six stakeholders...... on factors that influence natural mortality, growth, and egg survival of the herring stock in probabilistic terms. We also integrated the expressed views into a meta-model using the Bayesian model averaging (BMA) method. In task two, we used influence diagrams to study qualitatively how the stakeholders frame...... the potential of the study to contribute to the development of participatory modeling practices. It is concluded that the subjective perspective to knowledge, that is fundamental in Bayesian theory, suits participatory modeling better than a positivist paradigm that seeks the objective truth. The methodology...

  16. Bayesian disease mapping: hierarchical modeling in spatial epidemiology

    National Research Council Canada - National Science Library

    Lawson, Andrew

    2013-01-01

    Since the publication of the first edition, many new Bayesian tools and methods have been developed for space-time data analysis, the predictive modeling of health outcomes, and other spatial biostatistical areas...

  17. Bayesian network modeling of operator's state recognition process

    International Nuclear Information System (INIS)

    Hatakeyama, Naoki; Furuta, Kazuo

    2000-01-01

    Nowadays we are facing the difficult problem of establishing a good relationship between humans and machines. To solve this problem, we suppose that machine systems need to have a model of human behavior. In this study we model the state cognition process of a PWR plant operator as an example. We use a Bayesian network as an inference engine. We incorporate the knowledge hierarchy in the Bayesian network and confirm its validity using the example of a PWR plant operator. (author)
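
    The inference step such an engine performs can be shown with a toy two-state example; the states, indicator variables, and probabilities below are hypothetical, not taken from the paper.

```python
# Toy sketch of the inference a Bayesian-network engine performs when
# recognising a hidden state from observed evidence. All numbers hypothetical.
import math

prior = {"normal": 0.90, "fault": 0.10}            # P(state)
p_evidence = {                                      # P(indicator on | state)
    "normal": {"alarm_A": 0.05, "alarm_B": 0.10},
    "fault":  {"alarm_A": 0.80, "alarm_B": 0.70},
}
observed = ["alarm_A", "alarm_B"]                   # both alarms fire

# Posterior via Bayes' rule, assuming conditionally independent indicators
unnorm = {s: prior[s] * math.prod(p_evidence[s][e] for e in observed) for s in prior}
z = sum(unnorm.values())
posterior = {s: v / z for s, v in unnorm.items()}
print(posterior)   # "fault" becomes far more probable once both alarms fire
```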

  18. Is bronchial thermoplasty cost-effective as treatment for problematic asthma patients? Singapore's perspective on a global model.

    Science.gov (United States)

    Nguyen, Hai V; Bose, Saideep; Mital, Shweta; Yii, Anthony Chau Ang; Ang, Shin Yuh; Lam, Sean Shao Wei; Anantham, Devanand; Finkelstein, Eric; Koh, Mariko Siyue

    2017-08-01

    Bronchial thermoplasty (BT) has been shown to be effective at reducing asthma exacerbations and improving asthma control for patients with severe persistent asthma, but it is also expensive. Evidence on its cost-effectiveness is limited and inconclusive. In this study, we aim to evaluate the incremental cost-effectiveness of BT combined with optimized asthma therapy (BT-OAT) relative to OAT for difficult-to-treat and severe asthma patients in Singapore, and to provide a general framework for determining BT's cost-effectiveness in other healthcare settings. We developed a Markov model to estimate the costs and quality-adjusted life years (QALYs) gained with BT-OAT versus OAT from the societal and health system perspectives. The model was populated using Singapore-specific costs and transition probabilities and utilities from the literature. Sensitivity analyses were conducted to identify the main factors determining cost-effectiveness of BT-OAT. BT-OAT is not cost-effective relative to OAT over a 5-year time horizon with an incremental cost-effectiveness ratio (ICER) of $US138,889 per QALY from the societal perspective and $US139,041 per QALY from the health system perspective. The cost-effectiveness of BT-OAT largely depends on a combination of the cost of the BT procedure and the cost of asthma-related hospitalizations and emergency department (ED) visits. Based on established thresholds for cost-effectiveness, BT-OAT is not cost-effective compared with OAT in Singapore. Given its current clinical efficacy, BT-OAT is most likely to be cost-effective in a setting where the cost of the BT procedure is low and the costs of hospitalization and ED visits are high. © 2017 Asian Pacific Society of Respirology.

  19. Developing a cost effective rock bed thermal energy storage system: Design and modelling

    Science.gov (United States)

    Laubscher, Hendrik Frederik; von Backström, Theodor Willem; Dinter, Frank

    2017-06-01

    Thermal energy storage is an integral part of the drive to lower the cost of concentrated solar power (CSP). Storage of thermal energy enables CSP plants to provide base load power. Alternative, cheaper concepts for storing thermal energy have been proposed in previous studies. Using rocks as a storage medium and air as a heat transfer fluid, the proposed concept offers the potential of lower cost storage because of the abundance and affordability of rocks. A packed rock bed thermal energy storage (TES) concept is investigated and an experimental rig is designed. This paper describes the design and modelling of an experimental test facility for a cost-effective packed rock bed thermal energy storage system. Cost-effective, simplified designs for the different subsystems of an experimental setup are developed based on the availability of materials and equipment. Modelling of this design to predict the thermal performance of the TES system is covered in this study. If the concept under consideration proves to be successful, a design that is scalable and commercially viable can be proposed for further development of an industrial thermal energy storage system.

  20. A decision model for cost effective design of biomass based green energy supply chains.

    Science.gov (United States)

    Yılmaz Balaman, Şebnem; Selim, Hasan

    2015-09-01

    The core aim of this study is the cost-effective design of anaerobic digestion based biomass-to-energy supply chains. To this end, a decision model is developed. The model is based on fuzzy multi-objective decision making in order to simultaneously optimize multiple economic objectives and tackle the inherent uncertainties in the parameters and decision makers' aspiration levels for the goals. The viability of the decision model is explored with computational experiments on a real-world biomass-to-energy supply chain, and further analyses are performed to observe the effects of different conditions. In particular, scenario analyses are conducted to investigate the effects of energy crop utilization and operational costs on supply chain structure and performance measures. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Cost-effectiveness of female human papillomavirus vaccination in 179 countries: a PRIME modelling study.

    Science.gov (United States)

    Jit, Mark; Brisson, Marc; Portnoy, Allison; Hutubessy, Raymond

    2014-07-01

    Introduction of human papillomavirus (HPV) vaccination in settings with the highest burden of HPV is not universal, partly because of the absence of quantitative estimates of country-specific effects on health and economic costs. We aimed to develop and validate a simple generic model of such effects that could be used and understood in a range of settings with little external support. We developed the Papillomavirus Rapid Interface for Modelling and Economics (PRIME) model to assess cost-effectiveness and health effects of vaccination of girls against HPV before sexual debut in terms of burden of cervical cancer and mortality. PRIME models incidence according to proposed vaccine efficacy against HPV 16/18, vaccine coverage, cervical cancer incidence and mortality, and HPV type distribution. It assumes lifelong vaccine protection and no changes to other screening programmes or vaccine uptake. We validated PRIME against existing reports of HPV vaccination cost-effectiveness, projected outcomes for 179 countries (assuming full vaccination of 12-year-old girls), and outcomes for 71 phase 2 GAVI-eligible countries (using vaccine uptake data from the GAVI Alliance). We assessed differences between countries in terms of cost-effectiveness and health effects. In validation, PRIME reproduced cost-effectiveness conclusions for 24 of 26 countries from 17 published studies, and for all 72 countries in a published study of GAVI-eligible countries. Vaccination of a cohort of 58 million 12-year-old girls in 179 countries prevented 690,000 cases of cervical cancer and 420,000 deaths during their lifetime (mostly in low-income or middle-income countries), at a net cost of US$4 billion. HPV vaccination was very cost effective (with every disability-adjusted life-year averted costing less than the gross domestic product per head) in 156 (87%) of 179 countries. Introduction of the vaccine in countries without national HPV vaccination at present would prevent substantially more cases

  2. Combination of Bayesian Network and Overlay Model in User Modeling

    Directory of Open Access Journals (Sweden)

    Loc Nguyen

    2009-12-01

    Full Text Available The core of an adaptive system is the user model, containing personal information such as knowledge, learning styles, goals, and so on, which is requisite for personalising the learning process. There are many modeling approaches, for example stereotype, overlay and plan recognition, but they do not provide a solid method for reasoning from the user model. This paper introduces a statistical method that combines a Bayesian network with overlay modeling so that the user's knowledge can be inferred from evidence collected during the user's learning process.

  3. Quantifying Multiscale Habitat Structural Complexity: A Cost-Effective Framework for Underwater 3D Modelling

    Directory of Open Access Journals (Sweden)

    Renata Ferrari

    2016-02-01

    Full Text Available Coral reef habitat structural complexity influences key ecological processes, ecosystem biodiversity, and resilience. Measuring structural complexity underwater is not trivial and researchers have been searching for accurate and cost-effective methods that can be applied across spatial extents for over 50 years. This study integrated a set of existing multi-view image-processing algorithms to accurately compute metrics of structural complexity (e.g., ratio of surface to planar area) underwater solely from images. This framework resulted in accurate, high-speed 3D habitat reconstructions at scales ranging from small corals to reef-scapes (10s of km2). Structural complexity was accurately quantified from both contemporary and historical image datasets across three spatial scales: (i) branching coral colony (Acropora spp.); (ii) reef area (400 m2); and (iii) reef transect (2 km). At small scales, our method delivered models with <1 mm error over 90% of the surface area, while the accuracy at transect scale was 85.3% ± 6% (CI). Advantages are: no a priori requirement for image size or resolution, no invasive techniques, cost-effectiveness, and utilization of existing imagery taken from off-the-shelf cameras (both monocular or stereo). This remote sensing method can be integrated into reef monitoring and improve our knowledge of key aspects of coral reef dynamics, from reef accretion to habitat provisioning and productivity, by measuring and up-scaling estimates of structural complexity.
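
    The headline metric, the ratio of 3D surface area to planar area (rugosity), is straightforward to compute once a triangle mesh has been reconstructed. A minimal sketch over a hypothetical two-triangle mesh:

```python
# Sketch of the structural-complexity metric above: ratio of 3D surface area
# to planar (projected) area of a mesh. The tiny two-triangle "mesh" is a
# hypothetical stand-in for a photogrammetric reconstruction.
import numpy as np

def tri_area(tri):
    """Area of a triangle given a (3, 3) array of vertex coordinates."""
    a, b, c = tri
    return 0.5 * np.linalg.norm(np.cross(b - a, c - a))

vertices = np.array([
    [0.0, 0.0, 0.0],
    [1.0, 0.0, 0.4],   # nonzero z gives the surface some rugosity
    [0.0, 1.0, 0.2],
    [1.0, 1.0, 0.0],
])
faces = [(0, 1, 2), (1, 3, 2)]

surface = planar = 0.0
for f in faces:
    tri = vertices[list(f)]
    surface += tri_area(tri)
    flat = tri.copy()
    flat[:, 2] = 0.0          # project onto the horizontal plane
    planar += tri_area(flat)

print(f"rugosity (3D surface / planar area): {surface / planar:.3f}")
```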

  4. A model to estimate the cost effectiveness of the indoorenvironment improvements in office work

    Energy Technology Data Exchange (ETDEWEB)

    Seppanen, Olli; Fisk, William J.

    2004-06-01

    Deteriorated indoor climate is commonly related to increases in sick building syndrome symptoms, respiratory illnesses, sick leave, reduced comfort and losses in productivity. The cost of deteriorated indoor climate for the society is high. Some calculations show that the cost is higher than the heating energy costs of the same buildings. Building-level calculations have also shown that many measures taken to improve indoor air quality and climate are cost-effective when the potential monetary savings resulting from an improved indoor climate are included as benefits gained. As an initial step towards systemizing these building level calculations we have developed a conceptual model to estimate the cost-effectiveness of various measures. The model shows the links between the improvements in the indoor environment and the following potential financial benefits: reduced medical care cost, reduced sick leave, better performance of work, lower turnover of employees, and lower cost of building maintenance due to fewer complaints about indoor air quality and climate. The pathways to these potential benefits from changes in building technology and practices go via several human responses to the indoor environment such as infectious diseases, allergies and asthma, sick building syndrome symptoms, perceived air quality, and thermal environment. The model also includes the annual cost of investments, operation costs, and cost savings of improved indoor climate. The conceptual model illustrates how various factors are linked to each other. SBS symptoms are probably the most commonly assessed health responses in IEQ studies and have been linked to several characteristics of buildings and IEQ. While the available evidence indicates that SBS symptoms can affect these outcomes and suggests that such a linkage exists, at present we cannot quantify the relationships sufficiently for cost-benefit modeling. New research and analyses of existing data to quantify the financial

  5. Bayesian graphical models for genomewide association studies.

    Science.gov (United States)

    Verzilli, Claudio J; Stallard, Nigel; Whittaker, John C

    2006-07-01

    As the extent of human genetic variation becomes more fully characterized, the research community is faced with the challenging task of using this information to dissect the heritable components of complex traits. Genomewide association studies offer great promise in this respect, but their analysis poses formidable difficulties. In this article, we describe a computationally efficient approach to mining genotype-phenotype associations that scales to the size of the data sets currently being collected in such studies. We use discrete graphical models as a data-mining tool, searching for single- or multilocus patterns of association around a causative site. The approach is fully Bayesian, allowing us to incorporate prior knowledge on the spatial dependencies around each marker due to linkage disequilibrium, which reduces considerably the number of possible graphical structures. A Markov chain Monte Carlo scheme is developed that yields samples from the posterior distribution of graphs conditional on the data from which probabilistic statements about the strength of any genotype-phenotype association can be made. Using data simulated under scenarios that vary in marker density, genotype relative risk of a causative allele, and mode of inheritance, we show that the proposed approach has better localization properties and leads to lower false-positive rates than do single-locus analyses. Finally, we present an application of our method to a quasi-synthetic data set in which data from the CYP2D6 region are embedded within simulated data on 100K single-nucleotide polymorphisms. Analysis is quick (<5 min), and we are able to localize the causative site to a very short interval.

  6. A tutorial introduction to Bayesian models of cognitive development.

    Science.gov (United States)

    Perfors, Amy; Tenenbaum, Joshua B; Griffiths, Thomas L; Xu, Fei

    2011-09-01

    We present an introduction to Bayesian inference as it is used in probabilistic models of cognitive development. Our goal is to provide an intuitive and accessible guide to the what, the how, and the why of the Bayesian approach: what sorts of problems and data the framework is most relevant for, and how and why it may be useful for developmentalists. We emphasize a qualitative understanding of Bayesian inference, but also include information about additional resources for those interested in the cognitive science applications, mathematical foundations, or machine learning details in more depth. In addition, we discuss some important interpretation issues that often arise when evaluating Bayesian models in cognitive science. Copyright © 2010 Elsevier B.V. All rights reserved.

  7. Inventory model using bayesian dynamic linear model for demand forecasting

    Directory of Open Access Journals (Sweden)

    Marisol Valencia-Cárdenas

    2014-12-01

    Full Text Available An important factor in the manufacturing process is the inventory management of finished products. Industry is constantly looking for better ways to establish an adequate plan for production and stored quantities at optimal cost, projecting quantities over a time horizon so that the resources and logistics needed to distribute products on time can be defined in advance. The total absence of the historical data required by many statistical forecasting models demands the search for other kinds of accurate techniques. This work presents a Bayesian alternative that not only produces well-adjusted forecasts but also provides optimal quantities to produce and store at an optimal cost. The proposal is illustrated with real data. Keywords: Bayesian statistics, optimization, inventory model, Bayesian dynamic linear model.
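
    The forecasting engine named in the title can be sketched for the simplest local-level case. The demand series and variances below are hypothetical, and the paper's full proposal also couples the forecast to an inventory cost optimisation that is not shown here.

```python
# Minimal sketch of a local-level Bayesian dynamic linear model for demand:
#   level_t = level_{t-1} + w_t,   demand_t = level_t + v_t.
# All numbers are hypothetical placeholders for illustration.

demand = [120, 135, 128, 150, 160]   # hypothetical monthly demand
m, C = 100.0, 400.0                  # prior mean / variance of the level
W, V = 25.0, 100.0                   # evolution and observation variances

for y in demand:
    # predict: the level drifts, so the prior variance grows by W
    a, R = m, C + W
    # update: standard Kalman step with gain R / (R + V)
    K = R / (R + V)
    m, C = a + K * (y - a), (1 - K) * R

print(f"one-step-ahead demand forecast: {m:.1f} (variance {C + W + V:.1f})")
```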

  8. A Bayesian alternative for multi-objective ecohydrological model specification

    Science.gov (United States)

    Tang, Yating; Marshall, Lucy; Sharma, Ashish; Ajami, Hoori

    2018-01-01

    Recent studies have identified the importance of vegetation processes in terrestrial hydrologic systems. Process-based ecohydrological models combine hydrological, physical, biochemical and ecological processes of the catchments, and as such are generally more complex and parametric than conceptual hydrological models. Thus, appropriate calibration objectives and model uncertainty analysis are essential for ecohydrological modeling. In recent years, Bayesian inference has become one of the most popular tools for quantifying the uncertainties in hydrological modeling with the development of Markov chain Monte Carlo (MCMC) techniques. The Bayesian approach offers an appealing alternative to traditional multi-objective hydrologic model calibrations by defining proper prior distributions that can be considered analogous to the ad-hoc weighting often prescribed in multi-objective calibration. Our study aims to develop appropriate prior distributions and likelihood functions that minimize the model uncertainties and bias within a Bayesian ecohydrological modeling framework based on a traditional Pareto-based model calibration technique. In our study, a Pareto-based multi-objective optimization and a formal Bayesian framework are implemented in a conceptual ecohydrological model that combines a hydrological model (HYMOD) and a modified Bucket Grassland Model (BGM). Simulations focused on one objective (streamflow/LAI) and multiple objectives (streamflow and LAI) with different emphasis defined via the prior distribution of the model error parameters. Results show more reliable outputs for both predicted streamflow and LAI using Bayesian multi-objective calibration with specified prior distributions for error parameters based on results from the Pareto front in the ecohydrological modeling. The methodology implemented here provides insight into the usefulness of multiobjective Bayesian calibration for ecohydrologic systems and the importance of appropriate prior

  9. Metrics for evaluating performance and uncertainty of Bayesian network models

    Science.gov (United States)

    Bruce G. Marcot

    2012-01-01

    This paper presents a selected set of existing and new metrics for gauging Bayesian network model performance and uncertainty. Selected existing and new metrics are discussed for conducting model sensitivity analysis (variance reduction, entropy reduction, case file simulation); evaluating scenarios (influence analysis); depicting model complexity (numbers of model...

  10. Bayesian Estimation of the Logistic Positive Exponent IRT Model

    Science.gov (United States)

    Bolfarine, Heleno; Bazan, Jorge Luis

    2010-01-01

    A Bayesian inference approach using Markov Chain Monte Carlo (MCMC) is developed for the logistic positive exponent (LPE) model proposed by Samejima and for a new skewed Logistic Item Response Theory (IRT) model, named Reflection LPE model. Both models lead to asymmetric item characteristic curves (ICC) and can be appropriate because a symmetric…

  11. Modelling the cost-effectiveness of impact-absorbing flooring in Swedish residential care facilities.

    Science.gov (United States)

    Ryen, Linda; Svensson, Mikael

    2016-06-01

    Fall-related injuries among the elderly, specifically hip fractures, cause significant morbidity and mortality as well as imposing a substantial financial cost on the health care system. Impact-absorbing flooring has been advocated as an effective method for preventing hip fractures resulting from falls. This study identifies the cost-effectiveness of impact-absorbing flooring compared to standard flooring in residential care facilities for the elderly in a Swedish setting. An incremental cost-effectiveness analysis was performed comparing impact-absorbing flooring to standard flooring using a Markov decision model. A societal perspective was adopted and incremental costs were compared to incremental gains in quality-adjusted life years (QALYs). Data on costs, probability transitions and health-related quality of life measures were retrieved from the published literature and from Swedish register data. Probabilistic sensitivity analysis was performed through a Monte Carlo simulation. The base-case analysis indicates that the impact-absorbing flooring reduces costs and increases QALYs. When allowing for uncertainty, we find that 60% of the simulations indicate that impact-absorbing flooring is cost-saving compared to standard flooring and an additional 20% that it has a cost per QALY below a commonly used threshold value. Using a modelling approach, we find that impact-absorbing flooring is a dominant strategy at the societal level, considering that it can save resources and improve health in a vulnerable population. © The Author 2015. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.

  12. Using consensus bayesian network to model the reactive oxygen species regulatory pathway.

    Directory of Open Access Journals (Sweden)

    Liangdong Hu

    Full Text Available Bayesian networks are one of the most successful graph models for representing the reactive oxygen species regulatory pathway. With the increasing number of microarray measurements, it is possible to construct a Bayesian network from microarray data directly. Although a large number of Bayesian network learning algorithms have been developed, when they are applied to learn Bayesian networks from microarray data the accuracies are low, because the databases used to learn the networks contain too few microarray measurements. In this paper, we propose a consensus Bayesian network, constructed by combining Bayesian networks from the relevant literature with Bayesian networks learned from microarray data; it achieves a higher accuracy than a Bayesian network learned from a single database. In the experiments, we validated the Bayesian network combination algorithm on several classic machine learning databases and used the consensus Bayesian network to model the ROS pathway of Escherichia coli.

  13. Incorporating herd immunity effects into cohort models of vaccine cost-effectiveness.

    Science.gov (United States)

    Bauch, Chris T; Anonychuk, Andrea M; Van Effelterre, Thierry; Pham, Bá Z; Merid, Maraki Fikre

    2009-01-01

    Cohort models are often used in cost-effectiveness analysis (CEA) of vaccination. However, because they cannot capture herd immunity effects, cohort models underestimate the reduction in incidence caused by vaccination. Dynamic models capture herd immunity effects but are often not adopted in vaccine CEA. The objective was to develop a pseudo-dynamic approximation that can be incorporated into an existing cohort model to capture herd immunity effects. The authors approximated changing force of infection due to universal vaccination for a pediatric infectious disease. The projected lifetime cases in a cohort were compared under 1) a cohort model, 2) a cohort model with pseudo-dynamic approximation, and 3) an age-structured susceptible-exposed-infectious-recovered compartmental (dynamic) model. The authors extended the methodology to sexually transmitted infections. For average to high values of vaccine coverage (P > 60%) and small to average values of the basic reproduction number (R0), as is typical of vaccination programs for many common infections, the pseudo-dynamic approximation significantly improved projected lifetime cases and was close to projections of the full dynamic model. For large values of R0 (R0 > 15), projected lifetime cases were similar under the dynamic model and the cohort model, both with and without pseudo-dynamic approximation. The approximation captures changes in the mean age at infection in the first vaccinated cohort. This methodology allows for preliminary assessment of herd immunity effects on CEA of universal vaccination for pediatric infectious diseases. The method requires simple adjustments to an existing cohort model and less data than a full dynamic model.

  14. The application of a hierarchical Bayesian spatiotemporal model for ...

    Indian Academy of Sciences (India)

    Sahu S K and Bakar K S 2012 Hierarchical Bayesian autoregressive models for large space-time data with application to ozone concentration modeling; Appl. Stochastic Models Bus. Ind. 28 395–415, doi: 10.1002/asmb.1951.

  15. A Bayesian Infinite Hidden Markov Vector Autoregressive Model

    NARCIS (Netherlands)

    D. Nibbering (Didier); R. Paap (Richard); M. van der Wel (Michel)

    2016-01-01

    We propose a Bayesian infinite hidden Markov model to estimate time-varying parameters in a vector autoregressive model. The Markov structure allows for heterogeneity over time while accounting for state-persistence. By modelling the transition distribution as a Dirichlet process mixture

  16. Maritime piracy situation modelling with dynamic Bayesian networks

    CSIR Research Space (South Africa)

    Dabrowski, James M

    2015-05-01

    Full Text Available A generative model for modelling maritime vessel behaviour is proposed. The model is a novel variant of the dynamic Bayesian network (DBN). The proposed DBN is in the form of a switching linear dynamic system (SLDS) that has been extended into a...

  17. Bayesian Plackett-Luce Mixture Models for Partially Ranked Data.

    Science.gov (United States)

    Mollica, Cristina; Tardella, Luca

    2017-06-01

    The elicitation of an ordinal judgment on multiple alternatives is often required in many psychological and behavioral experiments to investigate preference/choice orientation of a specific population. The Plackett-Luce model is one of the most popular and frequently applied parametric distributions to analyze rankings of a finite set of items. The present work introduces a Bayesian finite mixture of Plackett-Luce models to account for unobserved sample heterogeneity of partially ranked data. We describe an efficient way to incorporate the latent group structure in the data augmentation approach and the derivation of existing maximum likelihood procedures as special instances of the proposed Bayesian method. Inference can be conducted with the combination of the Expectation-Maximization algorithm for maximum a posteriori estimation and the Gibbs sampling iterative procedure. We additionally investigate several Bayesian criteria for selecting the optimal mixture configuration and describe diagnostic tools for assessing the fitness of ranking distributions conditionally and unconditionally on the number of ranked items. The utility of the novel Bayesian parametric Plackett-Luce mixture for characterizing sample heterogeneity is illustrated with several applications to simulated and real preference ranked data. We compare our method with the frequentist approach and a Bayesian nonparametric mixture model both assuming the Plackett-Luce model as a mixture component. Our analysis on real datasets reveals the importance of an accurate diagnostic check for an appropriate in-depth understanding of the heterogeneous nature of the partial ranking data.

  18. The cost and impact of scaling up pre-exposure prophylaxis for HIV prevention: a systematic review of cost-effectiveness modelling studies

    NARCIS (Netherlands)

    Gomez, Gabriela B.; Borquez, Annick; Case, Kelsey K.; Wheelock, Ana; Vassall, Anna; Hankins, Catherine

    2013-01-01

    Cost-effectiveness studies inform resource allocation, strategy, and policy development. However, due to their complexity, dependence on the assumptions made, and inherent uncertainty, synthesising and generalising the results can be difficult. We assess cost-effectiveness models evaluating expected

  19. Approximate Bayesian computation (ABC) coupled with Bayesian model averaging method for estimating mean and standard deviation

    OpenAIRE

    Kwon, Deukwoo; Reis, Isildinha M.

    2016-01-01

    Background: We proposed approximate Bayesian computation with single distribution selection (ABC-SD) for estimating mean and standard deviation from other reported summary statistics. The ABC-SD generates pseudo data from a single parametric distribution thought to be the true distribution of underlying study data. This single distribution is either an educated guess, or it is selected via model selection using posterior probability criterion for testing two or more candidate distributions. F...
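
    The core ABC mechanism described here (simulate pseudo data, compare summary statistics, retain the best-matching parameter draws) fits in a short sketch. The reported summaries, priors, and acceptance fraction below are hypothetical choices for illustration, not the authors' exact ABC-SD procedure.

```python
# Toy sketch of ABC for recovering a mean and standard deviation from
# reported summary statistics (median, minimum, maximum). Hypothetical
# "reported" values, flat priors, and a closest-fraction acceptance rule.
import random
import statistics

random.seed(0)
reported = (10.0, 4.0, 18.0)     # reported median, minimum, maximum
n = 50                           # sample size of the original study

draws = []
for _ in range(20_000):
    mu = random.uniform(0.0, 20.0)       # flat priors over plausible ranges
    sd = random.uniform(0.1, 10.0)
    pseudo = [random.gauss(mu, sd) for _ in range(n)]
    summary = (statistics.median(pseudo), min(pseudo), max(pseudo))
    dist = sum(abs(s - r) for s, r in zip(summary, reported))
    draws.append((dist, mu, sd))

draws.sort()                     # keep the 1% of draws closest to the summaries
best = draws[: len(draws) // 100]
print(f"ABC estimate of mean: {statistics.mean(mu for _, mu, _ in best):.2f}")
print(f"ABC estimate of sd:   {statistics.mean(sd for _, _, sd in best):.2f}")
```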

  20. Bayesian inference of chemical kinetic models from proposed reactions

    KAUST Repository

    Galagali, Nikhil

    2015-02-01

    © 2014 Elsevier Ltd. Bayesian inference provides a natural framework for combining experimental data with prior knowledge to develop chemical kinetic models and quantify the associated uncertainties, not only in parameter values but also in model structure. Most existing applications of Bayesian model selection methods to chemical kinetics have been limited to comparisons among a small set of models, however. The significant computational cost of evaluating posterior model probabilities renders traditional Bayesian methods infeasible when the model space becomes large. We present a new framework for tractable Bayesian model inference and uncertainty quantification using a large number of systematically generated model hypotheses. The approach involves imposing point-mass mixture priors over rate constants and exploring the resulting posterior distribution using an adaptive Markov chain Monte Carlo method. The posterior samples are used to identify plausible models, to quantify rate constant uncertainties, and to extract key diagnostic information about model structure-such as the reactions and operating pathways most strongly supported by the data. We provide numerical demonstrations of the proposed framework by inferring kinetic models for catalytic steam and dry reforming of methane using available experimental data.

  1. Bayesian Subset Modeling for High-Dimensional Generalized Linear Models

    KAUST Repository

    Liang, Faming

    2013-06-01

    This article presents a new prior setting for high-dimensional generalized linear models, which leads to a Bayesian subset regression (BSR) with the maximum a posteriori model approximately equivalent to the minimum extended Bayesian information criterion model. The consistency of the resulting posterior is established under mild conditions. Further, a variable screening procedure is proposed based on the marginal inclusion probability, which shares the same properties of sure screening and consistency with the existing sure independence screening (SIS) and iterative sure independence screening (ISIS) procedures. However, since the proposed procedure makes use of joint information from all predictors, it generally outperforms SIS and ISIS in real applications. This article also makes extensive comparisons of BSR with the popular penalized likelihood methods, including Lasso, elastic net, SIS, and ISIS. The numerical results indicate that BSR can generally outperform the penalized likelihood methods. The models selected by BSR tend to be sparser and, more importantly, of higher prediction ability. In addition, the performance of the penalized likelihood methods tends to deteriorate as the number of predictors increases, while this is not significant for BSR. Supplementary materials for this article are available online. © 2013 American Statistical Association.

  2. A mixture copula Bayesian network model for multimodal genomic data

    Directory of Open Access Journals (Sweden)

    Qingyang Zhang

    2017-04-01

    Full Text Available Gaussian Bayesian networks have become a widely used framework to estimate directed associations between joint Gaussian variables, where the network structure encodes the decomposition of multivariate normal density into local terms. However, the resulting estimates can be inaccurate when the normality assumption is moderately or severely violated, making it unsuitable for dealing with recent genomic data such as the Cancer Genome Atlas data. In the present paper, we propose a mixture copula Bayesian network model which provides great flexibility in modeling non-Gaussian and multimodal data for causal inference. The parameters in mixture copula functions can be efficiently estimated by a routine expectation–maximization algorithm. A heuristic search algorithm based on Bayesian information criterion is developed to estimate the network structure, and prediction can be further improved by the best-scoring network out of multiple predictions from random initial values. Our method outperforms Gaussian Bayesian networks and regular copula Bayesian networks in terms of modeling flexibility and prediction accuracy, as demonstrated using a cell signaling data set. We apply the proposed methods to the Cancer Genome Atlas data to study the genetic and epigenetic pathways that underlie serous ovarian cancer.

  3. Reviewing the evidence to inform the population of cost-effectiveness models within health technology assessments.

    Science.gov (United States)

    Kaltenthaler, Eva; Tappenden, Paul; Paisley, Suzy

    2013-01-01

    Health technology assessments (HTAs) typically require the development of a cost-effectiveness model, which necessitates the identification, selection, and use of other types of information beyond clinical effectiveness evidence to populate the model parameters. The reviewing activity associated with model development should be transparent and reproducible but can result in a tension between being both timely and systematic. Little procedural guidance exists in this area. The purpose of this article was to provide guidance, informed by focus groups, on what might constitute a systematic and transparent approach to reviewing information to populate model parameters. A focus group series was held with HTA experts in the United Kingdom including systematic reviewers, information specialists, and health economic modelers to explore these issues. Framework analysis was used to analyze the qualitative data elicited during focus groups. Suggestions included the use of rapid reviewing methods and the need to consider the trade-off between relevance and quality. The need for transparency in the reporting of review methods was emphasized. It was suggested that additional attention should be given to the reporting of parameters deemed to be more important to the model or where the preferred decision regarding the choice of evidence is equivocal. These recommendations form part of a Technical Support Document produced for the National Institute for Health and Clinical Excellence Decision Support Unit in the United Kingdom. It is intended that these recommendations will help to ensure a more systematic, transparent, and reproducible process for the review of model parameters within HTA. Copyright © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  4. A decision analytic model to investigate the cost-effectiveness of poisoning prevention practices in households with young children

    Directory of Open Access Journals (Sweden)

    Felix Achana

    2016-08-01

    Full Text Available Abstract Background Systematic reviews and a network meta-analysis show home safety education with or without the provision of safety equipment is effective in promoting poison prevention behaviours in households with children. This paper compares the cost-effectiveness of home safety interventions to promote poison prevention practices. Methods A probabilistic decision-analytic model simulates healthcare costs and benefits for a hypothetical cohort of under 5 year olds. The model compares the cost-effectiveness of home safety education, home safety inspections, provision of free or low cost safety equipment and fitting of equipment. Analyses are conducted from a UK National Health Service and Personal Social Services perspective and expressed in 2012 prices. Results Education without safety inspection, provision or fitting of equipment was the most cost-effective strategy for promoting safe storage of medicines, with an incremental cost-effectiveness ratio of £2888 (95% credible interval (CrI) £1990–£5774) per poison case avoided or £41,330 (95% CrI £20,007–£91,534) per QALY gained compared with usual care. Compared to usual care, home safety interventions were not cost-effective in promoting safe storage of other household products. Conclusion Education offers better value for money than more intensive but expensive strategies for preventing medicinal poisonings, but is only likely to be cost-effective at £30,000 per QALY gained for families in disadvantaged areas and for those with more than one child. There was considerable uncertainty in cost-effectiveness estimates due to the paucity of evidence on model parameters. Policy makers should consider both the costs and effectiveness of competing interventions to ensure efficient use of resources.

  5. Estimating the cost-effectiveness of lifestyle intervention programmes to prevent diabetes based on an example from Germany: Markov modelling

    Directory of Open Access Journals (Sweden)

    Neumann Anne

    2011-11-01

    Full Text Available Abstract Background Type 2 diabetes mellitus (T2D) poses a large worldwide burden for health care systems. One possible tool to decrease this burden is primary prevention. As it is unethical to wait until perfect data are available to conclude whether T2D primary prevention intervention programmes are cost-effective, we need a model that simulates the effect of prevention initiatives. Thus, the aim of this study is to investigate the long-term cost-effectiveness of lifestyle intervention programmes for the prevention of T2D using a Markov model. As decision makers often face difficulties in applying health economic results, we visualise our results with health economic tools. Methods We use four-state Markov modelling with a probabilistic cohort analysis to calculate the cost per quality-adjusted life year (QALY) gained. A one-year cycle length and a lifetime time horizon are applied. Best available evidence supplies the model with data on transition probabilities between glycaemic states, mortality risks, utility weights, and disease costs. The costs are calculated from a societal perspective. A 3% discount rate is used for costs and QALYs. Cost-effectiveness acceptability curves are presented to assist decision makers. Results The model indicates that diabetes prevention interventions have the potential to be cost-effective, but the outcome reveals a high level of uncertainty. Incremental cost-effectiveness ratios (ICERs) were negative for the intervention, i.e., the intervention leads to a cost reduction for men and women aged 30 or 50 years at initiation of the intervention. For men and women aged 70 at initiation of the intervention, the ICER was EUR27,546/QALY gained and EUR19,433/QALY gained, respectively. In all cases, the QALYs gained were low. Cost-effectiveness acceptability curves show that the higher the willingness-to-pay threshold value, the higher the probability that the intervention is cost-effective. Nonetheless, all curves are
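
    Cost-effectiveness acceptability curves of the kind presented here are read off probabilistic simulation output: for each willingness-to-pay value, the curve gives the share of simulations with positive incremental net benefit. A sketch with hypothetical draws, not the study's output:

```python
# Sketch of computing a cost-effectiveness acceptability curve (CEAC) from
# probabilistic sensitivity-analysis output. The incremental cost/QALY draws
# below are hypothetical, not the study's results.
import random

random.seed(42)
# paired (incremental cost, incremental QALY) draws from a PSA
draws = [(random.gauss(-500, 4_000), random.gauss(0.02, 0.05)) for _ in range(5_000)]

for wtp in (0, 10_000, 30_000, 50_000):   # EUR-per-QALY thresholds
    # incremental net monetary benefit: wtp * dQALY - dCost
    p = sum(wtp * dq - dc > 0 for dc, dq in draws) / len(draws)
    print(f"P(cost-effective) at WTP {wtp:>6}: {p:.2f}")
```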

  6. Accurate phenotyping: Reconciling approaches through Bayesian model averaging.

    Directory of Open Access Journals (Sweden)

    Carla Chia-Ming Chen

    Full Text Available Genetic research into complex diseases is frequently hindered by a lack of clear biomarkers for phenotype ascertainment. Phenotypes for such diseases are often identified on the basis of clinically defined criteria; however such criteria may not be suitable for understanding the genetic composition of the diseases. Various statistical approaches have been proposed for phenotype definition; however our previous studies have shown that differences in phenotypes estimated using different approaches have substantial impact on subsequent analyses. Instead of obtaining results based upon a single model, we propose a new method, using Bayesian model averaging to overcome problems associated with phenotype definition. Although Bayesian model averaging has been used in other fields of research, this is the first study that uses Bayesian model averaging to reconcile phenotypes obtained using multiple models. We illustrate the new method by applying it to simulated genetic and phenotypic data for Kofendred personality disorder, an imaginary disease with several sub-types. Two separate statistical methods were used to identify clusters of individuals with distinct phenotypes: latent class analysis and grade of membership. Bayesian model averaging was then used to combine the two clusterings for the purpose of subsequent linkage analyses. We found that causative genetic loci for the disease produced higher LOD scores using model averaging than under either individual model separately. We attribute this improvement to consolidation of the cores of phenotype clusters identified using each individual method.
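
    The averaging step itself is simple once each candidate model carries a posterior weight. A sketch using BIC-approximated weights (all numbers hypothetical; the paper combines latent class analysis and grade-of-membership fits):

```python
# Sketch of the model-averaging step: weight each model's class-membership
# probabilities by an approximate posterior model probability derived from
# BIC. All BIC values and membership probabilities are hypothetical.
import math

bic = {"latent_class": 2_410.0, "grade_of_membership": 2_404.0}

# Posterior model weights proportional to exp(-BIC/2), assuming equal prior odds
best = min(bic.values())
raw = {m: math.exp(-(b - best) / 2) for m, b in bic.items()}
z = sum(raw.values())
weights = {m: v / z for m, v in raw.items()}

# Per-subject probability of belonging to phenotype cluster 1 under each model
p_cluster1 = {
    "latent_class":        [0.90, 0.20, 0.55],
    "grade_of_membership": [0.70, 0.35, 0.60],
}

averaged = [sum(weights[m] * p_cluster1[m][i] for m in weights) for i in range(3)]
print("model-averaged membership probabilities:", [round(p, 2) for p in averaged])
```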

  7. Involving stakeholders in building integrated fisheries models using Bayesian methods

    DEFF Research Database (Denmark)

    Haapasaari, Päivi Elisabet; Mäntyniemi, Samu; Kuikka, Sakari

    2013-01-01

    the potential of the study to contribute to the development of participatory modeling practices. It is concluded that the subjective perspective to knowledge, that is fundamental in Bayesian theory, suits participatory modeling better than a positivist paradigm that seeks the objective truth. The methodology...

  8. Validation & verification of a Bayesian network model for aircraft vulnerability

    CSIR Research Space (South Africa)

    Schietekat, Sunelle

    2016-09-01

    Full Text Available This paper provides a methodology for Validation and Verification (V&V) of a Bayesian Network (BN) model for aircraft vulnerability against Infrared (IR) missile threats. The model considers that the aircraft vulnerability depends both on a missile...

  9. Bayesian Network Models in Cyber Security: A Systematic Review

    NARCIS (Netherlands)

    Chockalingam, S.; Pieters, W.; Herdeiro Teixeira, A.M.; van Gelder, P.H.A.J.M.; Lipmaa, Helger; Mitrokotsa, Aikaterini; Matulevicius, Raimundas

    2017-01-01

    Bayesian Networks (BNs) are an increasingly popular modelling technique in cyber security, especially due to their capability to overcome data limitations. This is reflected in the growth of BN model development in cyber security. However, a comprehensive comparison and analysis of these

  10. Cost-effectiveness of rotavirus vaccination in the Netherlands; the results of a consensus model

    NARCIS (Netherlands)

    Rozenbaum, M.H.; Mangen, M.J.J.; Giaquinto, C.; Wilschut, J.C.; Hak, E.; Postma, M.J.

    2011-01-01

    Background: Each year rotavirus gastroenteritis results in thousands of paediatric hospitalisations and primary care visits in the Netherlands. While two vaccines against rotavirus are registered, routine immunisation of infants has not yet been implemented. Existing cost-effectiveness studies

  11. Research & development and growth: A Bayesian model averaging analysis

    Czech Academy of Sciences Publication Activity Database

    Horváth, Roman

    2011-01-01

    Vol. 28, No. 6 (2011), pp. 2669–2673. ISSN 0264-9993. [Society for Non-linear Dynamics and Econometrics Annual Conference, Washington DC, 16.03.2011–18.03.2011] R&D Projects: GA ČR GA402/09/0965 Institutional research plan: CEZ:AV0Z10750506 Keywords: Research and development * Growth * Bayesian model averaging Subject RIV: AH - Economics Impact factor: 0.701, year: 2011 http://library.utia.cas.cz/separaty/2011/E/horvath-research & development and growth a bayesian model averaging analysis.pdf

  12. Bayesian log-periodic model for financial crashes

    Science.gov (United States)

    Rodríguez-Caballero, Carlos Vladimir; Knapik, Oskar

    2014-10-01

    This paper introduces a Bayesian approach to the econophysics literature on financial bubbles, with the aim of estimating the most probable time for a financial crash to occur. To this end, we propose using noninformative prior distributions to obtain posterior distributions. Since these distributions cannot be derived analytically, we develop a Markov Chain Monte Carlo algorithm to draw from the posterior distributions. We consider three Bayesian models that involve normal and Student's t-distributions in the disturbances, with an AR(1)-GARCH(1,1) structure in the first case only. In the empirical part of the study, we analyze a well-known example of a financial bubble, the S&P 500 1987 crash, to show the usefulness of the three methods under consideration, and we analyze the Merval-94, Bovespa-97, IPCMX-94 and Hang Seng-97 crashes using the simplest method. The novelty of this research is that the Bayesian models provide 95% credible intervals for the estimated crash time.

  13. Cost-effectiveness of dengue vaccination in Yucatán, Mexico using a dynamic dengue transmission model.

    Directory of Open Access Journals (Sweden)

    Eunha Shim

    Full Text Available The incidence of dengue fever (DF) is steadily increasing in Mexico, burdening health systems with consequent morbidities and mortalities. On December 9th, 2015, Mexico became the first country for which the dengue vaccine was approved for use. In anticipation of a vaccine rollout, analysis of the cost-effectiveness of the dengue vaccination program that quantifies the dynamics of disease transmission is essential. We developed a dynamic transmission model of dengue in Yucatán, Mexico and its proposed vaccination program to incorporate herd immunity into our cost-effectiveness analysis. Our model also incorporates important characteristics of dengue epidemiology, such as clinical cross-immunity and susceptibility enhancement upon secondary infection. Using our model, we evaluated the cost-effectiveness and economic impact of an imperfect dengue vaccine in Yucatán, Mexico. Our study indicates that a dengue vaccination program would prevent 90% of cases of symptomatic DF incidence as well as 90% of dengue hemorrhagic fever (DHF) incidence and dengue-related deaths annually. We conclude that a dengue vaccine program in Yucatán, Mexico would be very cost-effective as long as the vaccination cost per individual is less than $140 and $214 from the health care and societal perspectives, respectively. Furthermore, at an exemplary vaccination cost of $250 USD per individual on average, dengue vaccination is likely to be cost-effective 43% and 88% of the time from the health care and societal perspectives, respectively.

  14. Cost-effectiveness of dengue vaccination in Yucatán, Mexico using a dynamic dengue transmission model

    Science.gov (United States)

    Shim, Eunha

    2017-01-01

    Background The incidence of dengue fever (DF) is steadily increasing in Mexico, burdening health systems with consequent morbidities and mortalities. On December 9th, 2015, Mexico became the first country for which the dengue vaccine was approved for use. In anticipation of a vaccine rollout, analysis of the cost-effectiveness of the dengue vaccination program that quantifies the dynamics of disease transmission is essential. Methods We developed a dynamic transmission model of dengue in Yucatán, Mexico and its proposed vaccination program to incorporate herd immunity into our cost-effectiveness analysis. Our model also incorporates important characteristics of dengue epidemiology, such as clinical cross-immunity and susceptibility enhancement upon secondary infection. Using our model, we evaluated the cost-effectiveness and economic impact of an imperfect dengue vaccine in Yucatán, Mexico. Conclusions Our study indicates that a dengue vaccination program would prevent 90% of cases of symptomatic DF incidence as well as 90% of dengue hemorrhagic fever (DHF) incidence and dengue-related deaths annually. We conclude that a dengue vaccine program in Yucatán, Mexico would be very cost-effective as long as the vaccination cost per individual is less than $140 and $214 from the health care and societal perspectives, respectively. Furthermore, at an exemplary vaccination cost of $250 USD per individual on average, dengue vaccination is likely to be cost-effective 43% and 88% of the time from the health care and societal perspectives, respectively. PMID:28380060

  15. The Cost Effectiveness of Psychological and Pharmacological Interventions for Social Anxiety Disorder: A Model-Based Economic Analysis.

    Directory of Open Access Journals (Sweden)

    Ifigeneia Mavranezouli

    Full Text Available Social anxiety disorder is one of the most persistent and common anxiety disorders. Individually delivered psychological therapies are the most effective treatment options for adults with social anxiety disorder, but they are associated with high intervention costs. Therefore, the objective of this study was to assess the relative cost effectiveness of a variety of psychological and pharmacological interventions for adults with social anxiety disorder. A decision-analytic model was constructed to compare costs and quality adjusted life years (QALYs) of 28 interventions for social anxiety disorder from the perspective of the British National Health Service and personal social services. Efficacy data were derived from a systematic review and network meta-analysis. Other model input parameters were based on published literature and national sources, supplemented by expert opinion. Individual cognitive therapy was the most cost-effective intervention for adults with social anxiety disorder, followed by generic individual cognitive behavioural therapy (CBT), phenelzine and book-based self-help without support. Other drugs, group-based psychological interventions and other individually delivered psychological interventions were less cost-effective. Results were influenced by limited evidence suggesting superiority of psychological interventions over drugs in retaining long-term effects. The analysis did not take into account side effects of drugs. Various forms of individually delivered CBT appear to be the most cost-effective options for the treatment of adults with social anxiety disorder. Consideration of side effects of drugs would only strengthen this conclusion, as it would improve even further the cost effectiveness of individually delivered CBT relative to phenelzine, which was the next most cost-effective option, due to the serious side effects associated with phenelzine. Further research needs to determine more accurately the long

  16. The Cost Effectiveness of Psychological and Pharmacological Interventions for Social Anxiety Disorder: A Model-Based Economic Analysis

    Science.gov (United States)

    Mavranezouli, Ifigeneia; Mayo-Wilson, Evan; Dias, Sofia; Kew, Kayleigh; Clark, David M.; Ades, A. E.; Pilling, Stephen

    2015-01-01

    Background Social anxiety disorder is one of the most persistent and common anxiety disorders. Individually delivered psychological therapies are the most effective treatment options for adults with social anxiety disorder, but they are associated with high intervention costs. Therefore, the objective of this study was to assess the relative cost effectiveness of a variety of psychological and pharmacological interventions for adults with social anxiety disorder. Methods A decision-analytic model was constructed to compare costs and quality adjusted life years (QALYs) of 28 interventions for social anxiety disorder from the perspective of the British National Health Service and personal social services. Efficacy data were derived from a systematic review and network meta-analysis. Other model input parameters were based on published literature and national sources, supplemented by expert opinion. Results Individual cognitive therapy was the most cost-effective intervention for adults with social anxiety disorder, followed by generic individual cognitive behavioural therapy (CBT), phenelzine and book-based self-help without support. Other drugs, group-based psychological interventions and other individually delivered psychological interventions were less cost-effective. Results were influenced by limited evidence suggesting superiority of psychological interventions over drugs in retaining long-term effects. The analysis did not take into account side effects of drugs. Conclusion Various forms of individually delivered CBT appear to be the most cost-effective options for the treatment of adults with social anxiety disorder. Consideration of side effects of drugs would only strengthen this conclusion, as it would improve even further the cost effectiveness of individually delivered CBT relative to phenelzine, which was the next most cost-effective option, due to the serious side effects associated with phenelzine. Further research needs to determine more accurately
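
    Ranking many interventions, as in this 28-arm model, typically reduces to comparing expected net monetary benefits at a chosen threshold. The sketch below is schematic: the cost and QALY pairs for four of the named interventions and the £20,000/QALY threshold are invented illustrations, not the study's data.

        # Illustrative (cost GBP, QALYs) pairs -- invented numbers, not study results.
        interventions = {
            "individual cognitive therapy":     (4200.0, 2.95),
            "generic individual CBT":           (3900.0, 2.88),
            "phenelzine":                       (1100.0, 2.60),
            "book-based self-help, no support": ( 350.0, 2.45),
        }
        WTP = 20_000.0  # assumed willingness to pay per QALY

        ranked = sorted(interventions.items(),
                        key=lambda kv: WTP * kv[1][1] - kv[1][0], reverse=True)
        for name, (cost, qaly) in ranked:
            print(f"{name:35s} NMB = {WTP * qaly - cost:>10,.0f} GBP")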

  17. Modeling error distributions of growth curve models through Bayesian methods.

    Science.gov (United States)

    Zhang, Zhiyong

    2016-06-01

    Growth curve models are widely used in the social and behavioral sciences. However, typical growth curve models often assume that the errors are normally distributed, although non-normal data may be even more common than normal data. To avoid possible statistical inference problems from blindly assuming normality, a general Bayesian framework is proposed to flexibly model normal and non-normal data through the explicit specification of the error distributions. A simulation study shows that when the distribution of the error is correctly specified, one can avoid the loss of efficiency in standard error estimates. A real example on the analysis of mathematical ability growth data from the Early Childhood Longitudinal Study, Kindergarten Class of 1998-99 is used to show the application of the proposed methods. Instructions and code on how to conduct growth curve analysis with both normal and non-normal error distributions using the MCMC procedure of SAS are provided.
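
    The point about error distributions can be illustrated outside SAS. The sketch below is a maximum-likelihood stand-in for the paper's Bayesian MCMC approach: it fits a linear growth trajectory to synthetic heavy-tailed data under normal versus Student-t error assumptions. The data, coefficients and degrees of freedom are all invented.

        import numpy as np
        from scipy import optimize, stats

        rng = np.random.default_rng(1)
        time = rng.uniform(0, 4, 250)                 # e.g. years since kindergarten
        y = 1.0 + 0.8 * time + 0.6 * stats.t(df=3).rvs(250, random_state=rng)

        def negloglik(params, dist):
            intercept, slope, log_scale = params
            scale = np.exp(log_scale)
            resid = (y - intercept - slope * time) / scale
            # log-likelihood with change-of-variables term for the scale
            return -(dist.logpdf(resid).sum() - y.size * log_scale)

        for label, dist in [("normal errors", stats.norm), ("t(3) errors", stats.t(df=3))]:
            fit = optimize.minimize(negloglik, x0=[0.0, 0.0, 0.0], args=(dist,))
            print(f"{label}: intercept {fit.x[0]:.2f}, slope {fit.x[1]:.2f}")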

  18. Orlistat for the treatment of obesity: rapid review and cost-effectiveness model.

    Science.gov (United States)

    Foxcroft, D R; Milne, R

    2000-10-01

    The aim of this study is to clarify the potential benefits, disbenefits and costs of Orlistat for the treatment of obesity. The method was a search for relevant systematic reviews and randomized controlled trials in Medline, Pre-Medline, Embase and the Cochrane Library, using Orlistat and its synonyms. Identified trials were appraised using a standard appraisal checklist, and trial data were extracted for use in cost-effectiveness modelling. Three large multicentre, randomized placebo-controlled trials were included in the rapid review. On average, Orlistat results in obese people losing an additional 3-4% of their initial body weight over diet alone during a 2-year period. There was no strong evidence that this short-term weight loss would have a longer-term impact on morbidity and mortality. The cost utility of Orlistat treatment was estimated at around 46,000 Pounds per Quality Adjusted Life Year gained (extreme values sensitivity analysis: 14,000 Pounds to 132,000 Pounds). This rapid review raises some important questions about the potential value of Orlistat in the treatment of obesity. Further research is needed, not only to clarify the longer-term impact of Orlistat treatment, but also to uncover the longer-term impact on mortality and morbidity from short-term weight loss.

  19. Modeling the Cost Effectiveness of Neuroimaging-Based Treatment of Acute Wake-Up Stroke.

    Directory of Open Access Journals (Sweden)

    Ankur Pandya

    Full Text Available Thrombolytic treatment (tissue-type plasminogen activator [tPA]) is only recommended for acute ischemic stroke patients with stroke onset time <4.5 hours. [...] 46.3% experienced a good stroke outcome. Lifetime discounted QALYs and costs were 5.312 and $88,247 for the no treatment strategy and 5.342 and $90,869 for the MRI-based strategy, resulting in an ICER of $88,000/QALY. Results were sensitive to variations in patient- and provider-specific factors such as sleep duration, hospital travel and door-to-needle times, as well as the onset probability distribution, MRI specificity, and mRS utility values. Our model-based findings suggest that an MRI-based treatment strategy for this population could be cost-effective, and they quantify the impact that patient- and provider-specific factors, such as sleep duration, hospital travel and door-to-needle times, could have on the optimal decision for wake-up stroke patients.

  20. The Brazilian Unified National Health System: Proposal of a Cost-effectiveness Evaluation Model

    Directory of Open Access Journals (Sweden)

    Lilian Ribeiro de Oliveira

    2016-04-01

    Full Text Available The Brazilian Unified National Health System (Sistema Único de Saúde [SUS]) occupies a prominent position among existing social policies. One of the new tools used by SUS is the Performance Index of the Unified Health System (Índice de Desempenho do Sistema Único de Saúde [IDSUS]), which is intended to measure the performance of each municipality. The aim of this study was therefore to propose a cost-effectiveness model comparing IDSUS performance against total revenues in Homogeneous Group 2, consisting of 94 municipalities, analysed using data from IDSUS and the Public Health Budget Information System (Sistema de Informação do Orçamento Público em Saúde [SIOPS]) for the year 2011. After structuring the data, we carried out descriptive statistical and cluster analyses in order to group similar municipalities according to the established variables: IDSUS performance, population and total health revenue per capita. Even with the division of municipalities into homogeneous groups, and after using variables such as population and revenue to regroup them, the results showed that there are municipalities with heterogeneous characteristics. Another finding lies in the use and intersection of two distinct databases (IDSUS and SIOPS), which allowed the impact of health care revenue on the municipalities' performance to be visualized.

  1. The modeled cost-effectiveness of family-based and adolescent-focused treatment for anorexia nervosa.

    Science.gov (United States)

    Le, Long Khanh-Dao; Barendregt, Jan J; Hay, Phillipa; Sawyer, Susan M; Hughes, Elizabeth K; Mihalopoulos, Cathrine

    2017-12-01

    Anorexia nervosa (AN) is a prevalent, serious mental disorder. We aimed to evaluate the cost-effectiveness of family-based treatment (FBT) compared to adolescent-focused individual therapy (AFT) or no intervention within the Australian healthcare system. A Markov model was developed to estimate the costs and disability-adjusted life-years (DALYs) averted of FBT relative to the comparators over 6 years from the health system perspective. The target population was 11-18 year olds with AN of relatively short duration. Uncertainty and sensitivity analyses were conducted to test model assumptions. Results are reported as incremental cost-effectiveness ratios (ICERs) in 2013 Australian dollars per DALY averted. FBT was less costly than AFT. Relative to no intervention, the mean ICERs of FBT and AFT were $5,089 (95% uncertainty interval (UI): dominant to $16,659) and $51,897 (95% UI: $21,591 to $1,712,491) per DALY averted. FBT and AFT are 100% and 45% likely to be cost-effective, respectively, at a threshold of AUD$50,000 per DALY averted. Sensitivity analyses indicated that excluding hospital costs led to increases in the ICERs, but the conclusion of the study did not change. FBT is the most cost-effective of the treatment arms, whereas AFT was not cost-effective compared to no intervention. Further research is required to verify this result. © 2017 Wiley Periodicals, Inc.
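
    A Markov cohort model of this kind can be sketched compactly. The three-state chain, transition probabilities, costs and disability weights below are invented for illustration (they are not the study's inputs); the script traces cohort occupancy over six annual cycles and computes cost per DALY averted for a treatment arm versus no intervention.

        import numpy as np

        # States: 0 = anorexia nervosa, 1 = recovered, 2 = dead (absorbing).
        P_treat = np.array([[0.60, 0.38, 0.02],
                            [0.05, 0.94, 0.01],
                            [0.00, 0.00, 1.00]])
        P_none  = np.array([[0.80, 0.17, 0.03],
                            [0.08, 0.91, 0.01],
                            [0.00, 0.00, 1.00]])
        cycle_cost = np.array([3000.0, 200.0, 0.0])  # health care cost per state-year
        dw = np.array([0.22, 0.00, 1.00])            # crude disability weights (death = 1)

        def run(P, upfront=0.0, cycles=6):
            occ = np.array([1.0, 0.0, 0.0])          # whole cohort starts with AN
            cost, dalys = upfront, 0.0
            for _ in range(cycles):
                occ = occ @ P
                cost += occ @ cycle_cost
                dalys += occ @ dw
            return cost, dalys

        c1, d1 = run(P_treat, upfront=2500.0)        # hypothetical programme cost
        c0, d0 = run(P_none)
        print(f"ICER: {(c1 - c0) / (d0 - d1):,.0f} per DALY averted")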

  2. Development of dynamic Bayesian models for web application test management

    Science.gov (United States)

    Azarnova, T. V.; Polukhin, P. V.; Bondarenko, Yu V.; Kashirina, I. L.

    2018-03-01

    The mathematical apparatus of dynamic Bayesian networks is an effective and technically proven tool for modelling complex stochastic dynamic processes. Mathematical models and methods based on dynamic Bayesian networks provide high coverage of the stochastic tasks associated with error testing in multiuser software products operating in a dynamically changing environment. Formalized representation of the discrete test process as a dynamic Bayesian model allows us to organize the logical connection between individual test assets across multiple time slices. This approach makes it possible to present testing as a discrete process with defined structural components responsible for the generation of test assets. Dynamic Bayesian network-based models allow us to combine, in one management area, individual units and testing components with different functionalities that directly influence each other in the process of comprehensive testing of various groups of computer bugs. The application of the proposed models provides an opportunity to use a consistent approach to formalize test principles and procedures, methods used to treat situational error signs, and methods used to produce analytical conclusions based on test results.
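
    At its core, a dynamic Bayesian network chains a transition model and an observation model across time slices. A minimal two-state sketch follows (hidden "stable"/"error-prone" state of an application, observed pass/fail test outcomes); all probabilities are hypothetical and the example is far simpler than the models described above.

        import numpy as np

        T = np.array([[0.90, 0.10],   # P(next state | current); rows: stable, error-prone
                      [0.30, 0.70]])
        E = np.array([[0.95, 0.05],   # P(observation | state); cols: pass, fail
                      [0.40, 0.60]])

        def forward_filter(observations, prior=(0.8, 0.2)):
            belief = np.array(prior)
            for obs in observations:          # obs: 0 = test passed, 1 = test failed
                belief = belief @ T           # time-slice prediction
                belief = belief * E[:, obs]   # evidence update
                belief /= belief.sum()
                yield belief.copy()

        for step, b in enumerate(forward_filter([0, 0, 1, 1, 0])):
            print(f"slice {step}: P(error-prone) = {b[1]:.3f}")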

  3. Cost-effectiveness of total hip and knee replacements for the Australian population with osteoarthritis: discrete-event simulation model.

    Directory of Open Access Journals (Sweden)

    Hideki Higashi

    Full Text Available BACKGROUND: Osteoarthritis constitutes a major musculoskeletal burden for aged Australians. Hip and knee replacement surgeries are effective interventions once all conservative therapies to manage the symptoms have been exhausted. This study aims to evaluate the cost-effectiveness of hip and knee replacements in Australia. To the best of our knowledge, this study is the first attempt to account for the dual nature of hip and knee osteoarthritis by modelling the severities of right and left joints separately. METHODOLOGY/PRINCIPAL FINDINGS: We developed a discrete-event simulation model that follows individuals with osteoarthritis over their lifetimes. The model defines separate attributes for right and left joints and accounts for several repeat replacements. The Australian population with osteoarthritis who were 40 years of age or older in 2003 were followed up until the cohort was extinct. Intervention effects were modelled by means of disability-adjusted life-years (DALYs) averted. Both hip and knee replacements are highly cost-effective (AUD 5,000 per DALY and AUD 12,000 per DALY, respectively) under an AUD 50,000/DALY threshold level. The exclusion of cost offsets, and inclusion of future unrelated health care costs in extended years of life, did not change the finding that the interventions are cost-effective (AUD 17,000 per DALY and AUD 26,000 per DALY, respectively). However, there was a substantial difference between hip and knee replacements: surgeries administered for hips were more cost-effective than for knees. CONCLUSIONS/SIGNIFICANCE: Both hip and knee replacements are cost-effective interventions to improve the quality of life of people with osteoarthritis. It was also shown that the dual nature of hip and knee OA should be taken into account to provide more accurate estimates of the cost-effectiveness of hip and knee replacements.
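
    Discrete-event simulation advances a clock through an event queue rather than fixed cycles, which is what lets right and left joints carry separate attributes. A toy skeleton follows; the waiting-time and prosthesis-survival distributions are invented placeholders, not the study's calibrated inputs.

        import heapq
        import random

        random.seed(0)

        def simulate_patient(age=60.0, death_age=85.0):
            """Print primary and repeat replacements, tracking each joint separately."""
            events = []
            for side in ("right", "left"):
                t = age + random.expovariate(1 / 8.0)       # time to primary surgery
                while t < death_age:
                    heapq.heappush(events, (t, side))
                    t += random.weibullvariate(18.0, 1.5)   # prosthesis survival time
            n_ops = {"right": 0, "left": 0}
            while events:
                t, side = heapq.heappop(events)
                n_ops[side] += 1
                kind = "primary" if n_ops[side] == 1 else f"revision {n_ops[side] - 1}"
                print(f"age {t:5.1f}: {side} joint replacement ({kind})")

        simulate_patient()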

  4. Bayesian log-periodic model for financial crashes

    DEFF Research Database (Denmark)

    Rodríguez-Caballero, Carlos Vladimir; Knapik, Oskar

    2014-01-01

    This paper introduces a Bayesian approach, within the econophysics literature on financial bubbles, to estimating the most probable time for a financial crash to occur. To this end, we propose using noninformative prior distributions to obtain posterior distributions. Since these posterior distributions cannot be computed analytically, we develop a Markov Chain Monte Carlo algorithm to draw from them. We consider three Bayesian models that involve normal and Student's t-distributions in the disturbances, with an AR(1)-GARCH(1,1) structure in the first case only. In the empirical part of the study, we analyze a well-known example of a financial bubble, the S&P 500 1987 crash, to show the usefulness of the three methods under consideration, and the Merval-94, Bovespa-97, IPCMX-94 and Hang Seng-97 crashes using the simplest method. The novelty of this research is that the Bayesian...

  5. Spatial and spatio-temporal bayesian models with R - INLA

    CERN Document Server

    Blangiardo, Marta

    2015-01-01

    Contents: Dedication; Preface; 1 Introduction: 1.1 Why spatial and spatio-temporal statistics?; 1.2 Why do we use Bayesian methods for modelling spatial and spatio-temporal structures?; 1.3 Why INLA?; 1.4 Datasets; 2 Introduction to R: 2.1 The R language; 2.2 R objects; 2.3 Data and session management; 2.4 Packages; 2.5 Programming in R; 2.6 Basic statistical analysis with R; 3 Introduction to Bayesian Methods: 3.1 Bayesian Philosophy; 3.2 Basic Probability Elements; 3.3 Bayes Theorem; 3.4 Prior and Posterior Distributions; 3.5 Working with the Posterior Distribution; 3.6 Choosing the Prior Distr...

  6. A hierarchical bayesian model to quantify uncertainty of stream water temperature forecasts.

    Directory of Open Access Journals (Sweden)

    Guillaume Bal

    Full Text Available Providing generic and cost-effective modelling approaches to reconstruct and forecast freshwater temperature, using predictors such as air temperature and water discharge, is a prerequisite to understanding the ecological processes underlying the impact of water temperature and of global warming on continental aquatic ecosystems. Using air temperature as a simple linear predictor of water temperature can lead to significant bias in forecasts, as it does not disentangle seasonality and long-term trends in the signal. Here, we develop an alternative approach based on hierarchical Bayesian statistical time series modelling of water temperature, air temperature and water discharge, using seasonal sinusoidal periodic signals and time-varying means and amplitudes. The fitting and forecasting performances of this approach are compared with those of simple linear regression between water and air temperatures using (i) a simulated example and (ii) an application to three French coastal streams with contrasting bio-geographical conditions and sizes. The time series modelling approach fits the data better and, unlike the linear regression, does not exhibit forecasting bias in long-term trends. The new model also allows for more accurate forecasts of water temperature than linear regression, together with a fair assessment of the uncertainty around forecasts. Warming of water temperature forecast by our hierarchical Bayesian model was slower and more uncertain than that expected with the classical regression approach. These new forecasts are in a form that is readily usable in further ecological analyses and will allow weighting of outcomes from different scenarios to manage climate change impacts on freshwater wildlife.
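
    The key modelling idea (decomposing the series into a slowly varying mean plus a seasonal sinusoid, rather than regressing water on air temperature directly) can be shown with ordinary least squares on synthetic data. This is only a non-Bayesian sketch of the periodic structure, not the paper's hierarchical model; all values are made up.

        import numpy as np

        rng = np.random.default_rng(2)
        day = np.arange(3 * 365, dtype=float)
        water = (12.0 + 0.002 * day                              # slow warming trend
                 + 6.0 * np.sin(2 * np.pi * day / 365 - 1.3)     # seasonal cycle
                 + rng.normal(0.0, 0.8, day.size))               # noise

        # Design matrix: intercept, linear trend, annual sine/cosine pair.
        X = np.column_stack([np.ones_like(day), day,
                             np.sin(2 * np.pi * day / 365),
                             np.cos(2 * np.pi * day / 365)])
        beta, *_ = np.linalg.lstsq(X, water, rcond=None)
        print(f"trend: {beta[1] * 365:.2f} deg C per year")
        print(f"seasonal amplitude: {np.hypot(beta[2], beta[3]):.2f} deg C")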

  7. Bayesian inference model for fatigue life of laminated composites

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimirov; Kiureghian, Armen Der; Berggreen, Christian

    2016-01-01

    A probabilistic model for estimating the fatigue life of laminated composite plates is developed. The model is based on lamina-level input data, making it possible to predict fatigue properties for a wide range of laminate configurations. Model parameters are estimated by Bayesian inference. The reference data used consist of constant-amplitude cycle test results for four laminates with different layup configurations. The paper describes the modelling techniques and the parameter estimation procedure, supported by an illustrative application.

  8. Hierarchical Bayesian spatial models for multispecies conservation planning and monitoring.

    Science.gov (United States)

    Carroll, Carlos; Johnson, Devin S; Dunk, Jeffrey R; Zielinski, William J

    2010-12-01

    Biologists who develop and apply habitat models are often familiar with the statistical challenges posed by their data's spatial structure but are unsure of whether the use of complex spatial models will increase the utility of model results in planning. We compared the relative performance of nonspatial and hierarchical Bayesian spatial models for three vertebrate and invertebrate taxa of conservation concern (Church's sideband snails [Monadenia churchi], red tree voles [Arborimus longicaudus], and Pacific fishers [Martes pennanti pacifica]) that provide examples of a range of distributional extents and dispersal abilities. We used presence-absence data derived from regional monitoring programs to develop models with both landscape and site-level environmental covariates. We used Markov chain Monte Carlo algorithms and a conditional autoregressive or intrinsic conditional autoregressive model framework to fit spatial models. The fit of Bayesian spatial models was between 35 and 55% better than the fit of nonspatial analogue models. Bayesian spatial models outperformed analogous models developed with maximum entropy (Maxent) methods. Although the best spatial and nonspatial models included similar environmental variables, spatial models provided estimates of residual spatial effects that suggested how ecological processes might structure distribution patterns. Spatial models built from presence-absence data improved fit most for localized endemic species with ranges constrained by poorly known biogeographic factors and for widely distributed species suspected to be strongly affected by unmeasured environmental variables or population processes. By treating spatial effects as a variable of interest rather than a nuisance, hierarchical Bayesian spatial models, especially when they are based on a common broad-scale spatial lattice (here the national Forest Inventory and Analysis grid of 24 km² hexagons), can increase the relevance of habitat models to multispecies

  9. Bayesian Model Comparison With the g-Prior

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Christensen, Mads Græsbøll; Cemgil, Ali Taylan

    2014-01-01

    Model comparison and selection is an important problem in many model-based signal processing applications. Often, very simple information criteria such as the Akaike information criterion or the Bayesian information criterion are used despite their shortcomings. Compared to these methods, Djuric [...] demonstrate that our proposed model comparison and selection rules outperform the traditional information criteria both in terms of detecting the true model and in terms of predicting unobserved data. The simulation code is available online.

  10. A Bayesian network approach to coastal storm impact modeling

    NARCIS (Netherlands)

    Jäger, W.S.; Den Heijer, C.; Bolle, A.; Hanea, A.M.

    2015-01-01

    In this paper we develop a Bayesian network (BN) that relates offshore storm conditions to their accompanying flood characteristics and damages to residential buildings, following the trend of integrated flood impact modeling. It is based on data from hydrodynamic storm simulations, information

  11. Bayesian model discrimination for glucose-insulin homeostasis

    DEFF Research Database (Denmark)

    Andersen, Kim Emil; Brooks, Stephen P.; Højbjerre, Malene

    In this paper we analyse a set of experimental data on a number of healthy and diabetic patients and discuss a variety of models for describing the physiological processes involved in glucose absorption and insulin secretion within the human body. We adopt a Bayesian approach which facilitates the...

  12. Bayesian Modelling of fMRI Time Series

    DEFF Research Database (Denmark)

    Højen-Sørensen, Pedro; Hansen, Lars Kai; Rasmussen, Carl Edward

    2000-01-01

    We present a Hidden Markov Model (HMM) for inferring the hidden psychological state (or neural activity) during single-trial fMRI activation experiments with blocked task paradigms. Inference is based on Bayesian methodology, using a combination of analytical and a variety of Markov Chain Monte Carlo...

  13. Shortlist B: A Bayesian model of continuous speech recognition

    NARCIS (Netherlands)

    Norris, D.; McQueen, J.M.

    2008-01-01

    A Bayesian model of continuous speech recognition is presented. It is based on Shortlist (D. Norris, 1994; D. Norris, J. M. McQueen, A. Cutler, & S. Butterfield, 1997) and shares many of its key assumptions: parallel competitive evaluation of multiple lexical hypotheses, phonologically abstract prelexical and lexical representations...

  14. Shortlist B: A Bayesian Model of Continuous Speech Recognition

    Science.gov (United States)

    Norris, Dennis; McQueen, James M.

    2008-01-01

    A Bayesian model of continuous speech recognition is presented. It is based on Shortlist (D. Norris, 1994; D. Norris, J. M. McQueen, A. Cutler, & S. Butterfield, 1997) and shares many of its key assumptions: parallel competitive evaluation of multiple lexical hypotheses, phonologically abstract prelexical and lexical representations, a feedforward…

  15. Bayesian nonparametric estimation of hazard rate in monotone Aalen model

    Czech Academy of Sciences Publication Activity Database

    Timková, Jana

    2014-01-01

    Roč. 50, č. 6 (2014), s. 849-868 ISSN 0023-5954 Institutional support: RVO:67985556 Keywords : Aalen model * Bayesian estimation * MCMC Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.541, year: 2014 http://library.utia.cas.cz/separaty/2014/SI/timkova-0438210.pdf

  16. Efficient Bayesian Estimation and Combination of GARCH-Type Models

    NARCIS (Netherlands)

    D. David (David); L.F. Hoogerheide (Lennart)

    2010-01-01

    This paper proposes an up-to-date review of estimation strategies available for the Bayesian inference of GARCH-type models. The emphasis is put on a novel efficient procedure named AdMitIS. The methodology automatically constructs a mixture of Student-t distributions as an approximation...

  17. Revision Arthroscopic Repair Versus Latarjet Procedure in Patients With Recurrent Instability After Initial Repair Attempt: A Cost-Effectiveness Model.

    Science.gov (United States)

    Makhni, Eric C; Lamba, Nayan; Swart, Eric; Steinhaus, Michael E; Ahmad, Christopher S; Romeo, Anthony A; Verma, Nikhil N

    2016-09-01

    To compare the cost-effectiveness of arthroscopic revision instability repair and the Latarjet procedure in treating patients with recurrent instability after initial arthroscopic instability repair. An expected-value decision analysis was modeled for revision arthroscopic instability repair compared with the Latarjet procedure for recurrent instability after a failed repair attempt. Inputs regarding procedure cost, clinical outcomes, and health utilities were derived from the literature. Compared with revision arthroscopic repair, Latarjet was less expensive ($13,672 v $15,287) with improved clinical outcomes (43.78 v 36.76 quality-adjusted life-years). Both arthroscopic repair and Latarjet were cost-effective compared with nonoperative treatment (incremental cost-effectiveness ratios of 3,082 and 1,141, respectively). Results from sensitivity analyses indicate that under scenarios of high rates of postoperative stability, along with improved clinical outcome scores, revision arthroscopic repair becomes increasingly cost-effective. The Latarjet procedure for failed instability repair is a cost-effective treatment option, with lower costs and improved clinical outcomes compared with revision arthroscopic instability repair. However, surgeons must still incorporate clinical judgment into treatment algorithm formation. Level IV, expected-value decision analysis. Copyright © 2016. Published by Elsevier Inc.

  18. Bayesian estimation of parameters in a regional hydrological model

    Directory of Open Access Journals (Sweden)

    K. Engeland

    2002-01-01

    Full Text Available This study evaluates the applicability of the distributed, process-oriented Ecomag model for prediction of daily streamflow in ungauged basins. The Ecomag model is applied as a regional model to nine catchments in the NOPEX area, using Bayesian statistics to estimate the posterior distribution of the model parameters conditioned on the observed streamflow. The distribution is calculated by Markov Chain Monte Carlo (MCMC) analysis. The Bayesian method requires formulation of a likelihood function for the parameters, and three alternative formulations are used. The first is a subjectively chosen objective function that describes the goodness of fit between the simulated and observed streamflow, as defined in the GLUE framework. The second and third formulations are more statistically correct likelihood models that describe the simulation errors. The full statistical likelihood model describes the simulation errors as an AR(1) process, whereas the simple model excludes the auto-regressive part. The statistical parameters depend on the catchments and the hydrological processes, and the statistical and hydrological parameters are estimated simultaneously. The results show that the simple likelihood model gives the most robust parameter estimates. The simulation error may be explained to a large extent by the catchment characteristics and climatic conditions, so it is possible to transfer knowledge about them to ungauged catchments. The statistical models for the simulation errors indicate that structural errors in the model are more important than parameter uncertainties. Keywords: regional hydrological model, model uncertainty, Bayesian analysis, Markov Chain Monte Carlo analysis
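
    The full statistical likelihood described here treats simulation errors as an AR(1) process. The sketch below writes that likelihood down and wraps it in a random-walk Metropolis sampler for a single parameter of a deliberately trivial "rainfall-runoff" model (runoff = k × precipitation). Everything, including the fixed rho and sigma, is a simplified illustration rather than the Ecomag setup.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)

        precip = rng.gamma(2.0, 3.0, 200)
        noise = np.zeros(200)
        for t in range(1, 200):                    # generate AR(1) errors
            noise[t] = 0.5 * noise[t - 1] + rng.normal(0.0, 1.0)
        q_obs = 0.6 * precip + noise

        def ar1_loglik(resid, rho=0.5, sigma=1.0):
            first = stats.norm.logpdf(resid[0], scale=sigma / np.sqrt(1 - rho**2))
            innov = resid[1:] - rho * resid[:-1]
            return first + stats.norm.logpdf(innov, scale=sigma).sum()

        def log_post(k):                           # flat prior on k > 0
            return ar1_loglik(q_obs - k * precip) if k > 0 else -np.inf

        k, lp, draws = 1.0, log_post(1.0), []
        for _ in range(5000):                      # random-walk Metropolis
            k_new = k + rng.normal(0.0, 0.05)
            lp_new = log_post(k_new)
            if np.log(rng.uniform()) < lp_new - lp:
                k, lp = k_new, lp_new
            draws.append(k)
        print(f"posterior mean of k: {np.mean(draws[1000:]):.3f}")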

  19. A Bayesian Markov geostatistical model for estimation of hydrogeological properties

    International Nuclear Information System (INIS)

    Rosen, L.; Gustafson, G.

    1996-01-01

    A geostatistical methodology based on Markov-chain analysis and Bayesian statistics was developed for probability estimations of hydrogeological and geological properties in the siting process of a nuclear waste repository. The probability estimates have practical use in decision-making on issues such as siting, investigation programs, and construction design. The methodology is nonparametric which makes it possible to handle information that does not exhibit standard statistical distributions, as is often the case for classified information. Data do not need to meet the requirements on additivity and normality as with the geostatistical methods based on regionalized variable theory, e.g., kriging. The methodology also has a formal way for incorporating professional judgments through the use of Bayesian statistics, which allows for updating of prior estimates to posterior probabilities each time new information becomes available. A Bayesian Markov Geostatistical Model (BayMar) software was developed for implementation of the methodology in two and three dimensions. This paper gives (1) a theoretical description of the Bayesian Markov Geostatistical Model; (2) a short description of the BayMar software; and (3) an example of application of the model for estimating the suitability for repository establishment with respect to the three parameters of lithology, hydraulic conductivity, and rock quality designation index (RQD) at 400-500 meters below ground surface in an area around the Aespoe Hard Rock Laboratory in southeastern Sweden
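
    The Markov-chain-plus-Bayesian-updating core of such a model is compact: place a Dirichlet prior on each row of a lithology transition matrix and update it with observed transition counts, so the posterior can be revised as new borehole data arrive. The classes and counts below are invented, not BayMar data.

        import numpy as np

        classes = ["granite", "diorite", "fracture zone"]
        prior = np.ones((3, 3))                   # uniform Dirichlet prior per row
        counts = np.array([[30,  4, 2],           # hypothetical downhole transitions
                           [ 5, 18, 3],
                           [ 2,  3, 9]])

        posterior = prior + counts
        print("posterior mean transition matrix:")
        print((posterior / posterior.sum(axis=1, keepdims=True)).round(3))

        rng = np.random.default_rng(4)            # sample matrices to carry uncertainty
        sampled = np.vstack([rng.dirichlet(row) for row in posterior])
        print("one posterior draw:")
        print(sampled.round(3))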

  20. The Albufera Initiative for Biodiversity: a cost effective model for integrating science and volunteer participation in coastal protected area management

    NARCIS (Netherlands)

    Riddiford, N.J.; Veraart, J.A.; Férriz, I.; Owens, N.W.; Royo, L.; Honey, M.R.

    2014-01-01

    This paper puts forward a multi-disciplinary field project, set up in 1989 at the Parc Natural de s’Albufera in Mallorca, Balearic Islands, Spain, as an example of a cost effective model for integrating science and volunteer participation in a coastal protected area. Outcomes include the provision

  1. Bayesian Dimensionality Assessment for the Multidimensional Nominal Response Model

    Directory of Open Access Journals (Sweden)

    Javier Revuelta

    2017-06-01

    Full Text Available This article introduces Bayesian estimation and evaluation procedures for the multidimensional nominal response model. The utility of this model is to perform a nominal factor analysis of items that consist of a finite number of unordered response categories. The key aspect of the model, in comparison with the traditional factorial model, is that there is a slope for each response category on the latent dimensions, instead of slopes associated with the items. The extended parameterization of the multidimensional nominal response model requires large samples for estimation. When the sample size is moderate or small, some of these parameters may be weakly empirically identifiable and the estimation algorithm may run into difficulties. We propose a Bayesian MCMC inferential algorithm to estimate the parameters and the number of dimensions underlying the multidimensional nominal response model. Two Bayesian approaches to model evaluation were compared: discrepancy statistics (DIC, WAICC, and LOO), which provide an indication of the relative merit of different models, and the standardized generalized discrepancy measure, which requires resampling data and is computationally more involved. A simulation study was conducted to compare these two approaches, and the results show that the standardized generalized discrepancy measure can be used to reliably estimate the dimensionality of the model whereas the discrepancy statistics are questionable. The paper also includes an example with real data in the context of learning styles, in which the model is used to conduct an exploratory factor analysis of nominal data.
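
    Of the discrepancy statistics named, WAIC is representative and easy to compute from MCMC output: it needs only the matrix of pointwise log-likelihoods across posterior draws. A generic sketch follows, with random numbers standing in for real MCMC output; it is not the article's implementation.

        import numpy as np
        from scipy.special import logsumexp

        def waic(loglik):
            """loglik: (S posterior draws, n observations) pointwise log-likelihoods."""
            S = loglik.shape[0]
            lppd = logsumexp(loglik, axis=0) - np.log(S)   # log pointwise pred. density
            p_waic = loglik.var(axis=0, ddof=1)            # effective parameter count
            return -2.0 * np.sum(lppd - p_waic)

        rng = np.random.default_rng(5)
        fake_loglik = rng.normal(-1.0, 0.3, size=(2000, 150))  # stand-in for MCMC output
        print(f"WAIC: {waic(fake_loglik):.1f}")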

  2. Cost-Effectiveness of Interventions for Chronic Obstructive Pulmonary Disease (COPD) Using an Ontario Policy Model

    Science.gov (United States)

    Chandra, K; Blackhouse, G; McCurdy, BR; Bornstein, M; Campbell, K; Costa, V; Franek, J; Kaulback, K; Levin, L; Sehatzadeh, S; Sikich, N; Thabane, M; Goeree, R

    2012-01-01

    Background Chronic obstructive pulmonary disease (COPD) is characterized by chronic inflammation throughout the airways, parenchyma, and pulmonary vasculature. The inflammation causes repeated cycles of injury and repair in the airway wall: inflammatory cells release a variety of chemicals and lead to cellular damage. The inflammation process also contributes to the loss of elastic recoil pressure

  3. Copula Based Factorization in Bayesian Multivariate Infinite Mixture Models

    OpenAIRE

    Martin Burda; Artem Prokhorov

    2012-01-01

    Bayesian nonparametric models based on infinite mixtures of density kernels have been recently gaining in popularity due to their flexibility and feasibility of implementation even in complicated modeling scenarios. In economics, they have been particularly useful in estimating nonparametric distributions of latent variables. However, these models have been rarely applied in more than one dimension. Indeed, the multivariate case suffers from the curse of dimensionality, with a rapidly increas...

  4. Cost-effectiveness model comparing olanzapine and other oral atypical antipsychotics in the treatment of schizophrenia in the United States

    Directory of Open Access Journals (Sweden)

    Smolen Lee J

    2009-04-01

    Full Text Available Background Schizophrenia is often a persistent and costly illness that requires continued treatment with antipsychotics. Differences among antipsychotics in efficacy, safety, tolerability, adherence, and cost have cost-effectiveness implications for treating schizophrenia. This study compares the cost-effectiveness of oral olanzapine, oral risperidone (at generic cost, the primary comparator), quetiapine, ziprasidone, and aripiprazole in the treatment of patients with schizophrenia from the perspective of third-party payers in the U.S. health care system. Methods A 1-year microsimulation economic decision model, with quarterly cycles, was developed to simulate the dynamic nature of usual care of schizophrenia patients who switch, continue, discontinue, and restart their medications. The model captures clinical and cost parameters including adherence levels, relapse with and without hospitalization, quality-adjusted life years (QALYs), treatment discontinuation by reason, treatment-emergent adverse events, suicide, health care resource utilization, and direct medical care costs. Published medical literature and a clinical expert panel were used to develop baseline model assumptions. Key model outcomes included mean annual total direct cost per treatment, cost per stable patient, and incremental cost-effectiveness values per QALY gained. Results The results of the microsimulation model indicated that olanzapine had the lowest mean annual direct health care cost ($8,544), followed by generic risperidone ($9,080). In addition, olanzapine resulted in more QALYs than risperidone (0.733 vs. 0.719). The base case and multiple sensitivity analyses found olanzapine to be the dominant choice in terms of incremental cost-effectiveness per QALY gained. Conclusion The utilization of olanzapine is predicted in this model to result in better clinical outcomes and lower total direct health care costs compared to generic risperidone, quetiapine, ziprasidone, and aripiprazole.

  5. Integrated modelling of risk and uncertainty underlying the selection of cost-effective water quality measures

    NARCIS (Netherlands)

    Brouwer, R.; de Blois, C.

    2008-01-01

    In this paper we present an overview of the most important sources of uncertainty when analysing the least cost way to improve water quality. The estimation of the cost-effectiveness of water quality measures is surrounded by environmental, economic and political uncertainty. These different types

  6. Systematic screening for Chlamydia trachomatis : Estimating cost-effectiveness using dynamic modeling and Dutch data

    NARCIS (Netherlands)

    de Vries, R.; Van Bergen, J.E.A.M.; de Jong-van den Berg, Lolkje; Postma, Maarten

    2006-01-01

    To estimate the cost-effectiveness of a systematic one-off Chlamydia trachomatis (CT) screening program including partner treatment for Dutch young adults. Data on infection prevalence, participation rates, and sexual behavior were obtained from a large pilot study conducted in The Netherlands.

  7. Systematic screening for Chlamydia trachomatis: estimating cost-effectiveness using dynamic modeling and Dutch data

    NARCIS (Netherlands)

    de Vries, Robin; van Bergen, Jan E. A. M.; de Jong-van den Berg, Lolkje T. W.; Postma, Maarten J.

    2006-01-01

    To estimate the cost-effectiveness of a systematic one-off Chlamydia trachomatis (CT) screening program including partner treatment for Dutch young adults. Data on infection prevalence, participation rates, and sexual behavior were obtained from a large pilot study conducted in The Netherlands.

  8. Cost-effectiveness of primary prevention of paediatric asthma: a decision-analytic model

    NARCIS (Netherlands)

    Ramos, G. Feljandro P.; van Asselt, Antoinette D. I.; Kuiper, Sandra; Severens, Johan L.; Maas, Tanja; Dompeling, Edward; Knottnerus, J. André; van Schayck, Onno C. P.

    2013-01-01

    Background: Many children stand to benefit from being asthma-free for life with primary (i.e., prenatally started) prevention addressing one environmental exposure in a unifaceted (UF) approach or at least two in a multifaceted (MF) approach. We assessed the cost-effectiveness of primary prevention

  9. From intermediate to final behavioral endpoints; Modeling cognitions in (cost-)effectiveness analyses in health promotion

    NARCIS (Netherlands)

    Prenger, Hendrikje Cornelia

    2012-01-01

    Cost-effectiveness analyses (CEAs) are considered an increasingly important tool in health promotion and psychology. In health promotion, adequate effectiveness data for innovative interventions are often lacking. In the case of many promising interventions, the available data are inadequate for CEAs due

  10. Cost-effectiveness of a multidisciplinary intervention model for community-dwelling frail older people.

    NARCIS (Netherlands)

    Melis, R.J.F.; Adang, E.M.M.; Teerenstra, S.; Eijken, M.I.J. van; Wimo, A.; Achterberg, T. van; Lisdonk, E.H. van de; Olde Rikkert, M.G.M.

    2008-01-01

    BACKGROUND: There is growing interest in geriatric care for community-dwelling older people. There are, however, relatively few reports on the economics of this type of care. This article reports about the cost-effectiveness of the Dutch Geriatric Intervention Program (DGIP) compared to usual care

  11. Comparison of two dose and three dose human papillomavirus vaccine schedules: cost effectiveness analysis based on transmission model.

    Science.gov (United States)

    Jit, Mark; Brisson, Marc; Laprise, Jean-François; Choi, Yoon Hong

    2015-01-06

    Objective To investigate the incremental cost effectiveness of two dose human papillomavirus vaccination and of additionally giving a third dose. Design Cost effectiveness study based on a transmission dynamic model of human papillomavirus vaccination. Two dose schedules for bivalent or quadrivalent human papillomavirus vaccines were assumed to provide 10, 20, or 30 years' vaccine type protection and cross protection, or lifelong vaccine type protection without cross protection. Three dose schedules were assumed to give lifelong vaccine type and cross protection. Setting United Kingdom. Participants Males and females aged 12-74 years. Interventions No, two, or three doses of human papillomavirus vaccine given routinely to 12 year old girls, with an initial catch-up campaign to 18 years. Main outcome measures Costs (from the healthcare provider's perspective), health related utilities, and incremental cost effectiveness ratios. Results Giving at least two doses of vaccine seems to be highly cost effective across the entire range of scenarios considered at the quadrivalent vaccine list price of £86.50 (€109.23; $136.00) per dose. If two doses give only 10 years' protection but adding a third dose extends this to lifetime protection, then the third dose also seems to be cost effective at £86.50 per dose (median incremental cost effectiveness ratio £17,000, interquartile range £11,700-£25,800). If two doses protect for more than 20 years, then the third dose will have to be priced substantially lower (median threshold price £31, interquartile range £28-£35) to be cost effective. Results are similar for a bivalent vaccine priced at £80.50 per dose and when the same scenarios are explored by parameterising a Canadian model (HPV-ADVISE) with economic data from the United Kingdom. Conclusions Two dose human papillomavirus vaccine schedules are likely to be the most cost effective option provided protection lasts for at least 20 years. As the precise duration of two dose schedules may not be known for decades, cohorts given two doses should be closely
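
    The threshold price reported for the third dose is the per-dose price at which the incremental net benefit of the extra dose falls to zero: wtp × ΔQALY = Δnon-vaccine cost + price. A small sketch of that algebra, with purely illustrative inputs rather than the paper's model outputs:

        def threshold_price(delta_qaly, delta_other_cost, wtp, extra_doses=1):
            """Highest per-dose price at which the added dose remains cost effective."""
            return (wtp * delta_qaly - delta_other_cost) / extra_doses

        # Hypothetical: third dose gains 0.004 QALYs per person, adds 20 GBP of
        # delivery cost, evaluated at a 20,000 GBP/QALY threshold.
        print(f"threshold price: {threshold_price(0.004, 20.0, 20_000):.2f} GBP per dose")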

  12. Modeling the cost-effectiveness of ilaprazole versus omeprazole for the treatment of newly diagnosed duodenal ulcer patients in China.

    Science.gov (United States)

    Xuan, J W; Song, R L; Xu, G X; Lu, W Q; Lu, Y J; Liu, Z

    2016-11-01

    To evaluate the cost-effectiveness of 10 mg ilaprazole once daily vs 20 mg omeprazole once daily to treat newly diagnosed duodenal ulcer patients in China. A decision tree model was constructed and the treatment impact was projected up to 1 year. The CYP2C19 polymorphism distribution in the Chinese population, the respective cure rates in the CYP2C19 genotype sub-groups, the impact of duodenal ulcer (DU) on utility values and drug-related side-effect data were obtained from the literature. The total costs of medications were calculated to estimate the treatment costs based on current drug retail prices in China. Expert surveys were conducted when published data were not available. Probabilistic sensitivity analysis was performed to gauge the robustness of the results. Ilaprazole, when compared with omeprazole, achieved better overall clinical efficacy. For the overall population, ilaprazole achieved an incremental cost-effectiveness ratio (ICER) of ¥132,056 per QALY gained. This is less than the WHO-recommended threshold of 3 times the average GDP per capita in China (2014). Furthermore, sub-group analysis showed that ilaprazole is cost-effective in every province for CYP2C19 hetEM patients and in the most developed provinces for CYP2C19 homEM patients. Probabilistic sensitivity analysis suggests that the results are robust, with a 97% probability that ilaprazole is considered cost-effective at a threshold of 3 times China's average GDP per capita. This study did not include data on ilaprazole combined with Helicobacter pylori (Hp) eradication therapy; caution should be taken when extrapolating these findings to DU patients receiving Hp eradication therapy. The cost-effectiveness analysis results demonstrate that ilaprazole would be considered a cost-effective therapy, compared with omeprazole, in Chinese DU patients based on the efficacy projections across CYP2C19 polymorphism types.
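
    Structurally, such a decision tree computes genotype-weighted expected costs and QALYs per arm and takes the ratio of their differences. The sketch below uses invented cure rates, costs and utilities purely to show the mechanics; it does not reproduce the study's inputs or its ICER.

        import numpy as np

        genotype_p = np.array([0.40, 0.45, 0.15])     # homEM, hetEM, PM (hypothetical)
        cure = {"ilaprazole": np.array([0.92, 0.93, 0.94]),
                "omeprazole": np.array([0.84, 0.90, 0.95])}
        drug_cost = {"ilaprazole": 260.0, "omeprazole": 90.0}  # per course (invented)
        fail_cost, q_cured, q_fail = 1500.0, 0.98, 0.90        # 1-year values (invented)

        def arm(drug):
            p_cure = float(genotype_p @ cure[drug])
            cost = drug_cost[drug] + (1 - p_cure) * fail_cost  # failures need retreatment
            qaly = p_cure * q_cured + (1 - p_cure) * q_fail
            return cost, qaly

        (c_i, q_i), (c_o, q_o) = arm("ilaprazole"), arm("omeprazole")
        print(f"ICER: {(c_i - c_o) / (q_i - q_o):,.0f} per QALY gained")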

  13. Bayesian models for comparative analysis integrating phylogenetic uncertainty

    Science.gov (United States)

    2012-01-01

    Background Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible general purpose tool for

  14. Bayesian non parametric modelling of Higgs pair production

    Directory of Open Access Journals (Sweden)

    Scarpa Bruno

    2017-01-01

    Full Text Available Statistical classification models are commonly used to separate a signal from a background. In this talk we face the problem of isolating the signal of Higgs pair production using the decay channel in which each boson decays into a pair of b-quarks. Typically in this context non-parametric methods are used, such as Random Forests or different types of boosting tools. We remain in the same non-parametric framework, but we propose to face the problem following a Bayesian approach. A Dirichlet process is used as the prior for the random effects in a logit model, which is fitted by leveraging the Polya-Gamma data augmentation. Refinements of the model include the insertion of P-splines into the simple model, to relate explanatory variables to the response, and the use of Bayesian trees (BART) to describe the atoms in the Dirichlet process.

  15. Population cost-effectiveness of the Triple P parenting programme for the treatment of conduct disorder: an economic modelling study.

    Science.gov (United States)

    Sampaio, Filipa; Barendregt, Jan J; Feldman, Inna; Lee, Yong Yi; Sawyer, Michael G; Dadds, Mark R; Scott, James G; Mihalopoulos, Cathrine

    2017-12-29

    Parenting programmes are the recommended treatments of conduct disorders (CD) in children, but little is known about their longer term cost-effectiveness. This study aimed to evaluate the population cost-effectiveness of one of the most researched evidence-based parenting programmes, the Triple P-Positive Parenting Programme, delivered in a group and individual format, for the treatment of CD in children. A population-based multiple cohort decision analytic model was developed to estimate the cost per disability-adjusted life year (DALY) averted of Triple P compared with a 'no intervention' scenario, using a health sector perspective. The model targeted a cohort of 5-9-year-old children with CD in Australia currently seeking treatment, and followed them until they reached adulthood (i.e., 18 years). Multivariate probabilistic and univariate sensitivity analyses were conducted to incorporate uncertainty in the model parameters. Triple P was cost-effective compared to no intervention at a threshold of AU$50,000 per DALY averted when delivered in a group format [incremental cost-effectiveness ratio (ICER) = $1013 per DALY averted; 95% uncertainty interval (UI) 471-1956] and in an individual format (ICER = $20,498 per DALY averted; 95% UI 11,146-39,470). Evidence-based parenting programmes, such as the Triple P, for the treatment of CD among children appear to represent good value for money, when delivered in a group or an individual face-to-face format, with the group format being the most cost-effective option. The current model can be used for economic evaluations of other interventions targeting CD and in other settings.

  16. Technical note: Bayesian calibration of dynamic ruminant nutrition models.

    Science.gov (United States)

    Reed, K F; Arhonditsis, G B; France, J; Kebreab, E

    2016-08-01

    Mechanistic models of ruminant digestion and metabolism have advanced our understanding of the processes underlying ruminant animal physiology. Deterministic modeling practices ignore the inherent variation within and among individual animals and thus have no way to assess how sources of error influence model outputs. We introduce Bayesian calibration of mathematical models to address the need for robust mechanistic modeling tools that can accommodate error analysis by remaining within the bounds of data-based parameter estimation. For the purpose of prediction, the Bayesian approach generates a posterior predictive distribution that represents the current estimate of the value of the response variable, taking into account both the uncertainty about the parameters and model residual variability. Predictions are expressed as probability distributions, thereby conveying significantly more information than point estimates in regard to uncertainty. Our study illustrates some of the technical advantages of Bayesian calibration and discusses the future perspectives in the context of animal nutrition modeling. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  17. Cost-effectiveness of HIV and syphilis antenatal screening: a modelling study.

    Science.gov (United States)

    Bristow, Claire C; Larson, Elysia; Anderson, Laura J; Klausner, Jeffrey D

    2016-08-01

    The WHO called for the elimination of maternal-to-child transmission (MTCT) of HIV and syphilis, a harmonised approach for the improvement of health outcomes for mothers and children. Testing early in pregnancy, treating seropositive pregnant women and preventing syphilis reinfection can prevent MTCT of HIV and syphilis. We assessed the health and economic outcomes of a dual testing strategy in a simulated cohort of 100 000 antenatal care patients in Malawi. We compared four screening algorithms: (1) HIV rapid test only, (2) dual HIV and syphilis rapid tests, (3) single rapid tests for HIV and syphilis and (4) HIV rapid and syphilis laboratory tests. We calculated the expected number of adverse pregnancy outcomes, the expected costs and the expected newborn disability-adjusted life years (DALYs) for each screening algorithm. The estimated costs and DALYs for each screening algorithm were assessed from a societal perspective using Markov progression models. Additionally, we conducted a Monte Carlo multiway sensitivity analysis, allowing for ranges of inputs. Our cohort decision model predicted the lowest number of adverse pregnancy outcomes in the dual HIV and syphilis rapid test strategy. Additionally, from the societal perspective, the costs of prevention and care using a dual HIV and syphilis rapid testing strategy was both the least costly ($226.92 per pregnancy) and resulted in the fewest DALYs (116 639) per 100 000 pregnancies. In the Monte Carlo simulation the dual HIV and syphilis algorithm was always cost saving and almost always reduced DALYs compared with HIV testing alone. The results of the cost-effectiveness analysis showed that a dual HIV and syphilis test was cost saving compared with all other screening strategies. Updating existing prevention of mother-to-child HIV transmission programmes in Malawi and similar countries to include dual rapid testing for HIV and syphilis is likely to be advantageous. Published by the BMJ Publishing Group

  18. New Computationally Cost-Effective Implementation of Online Nesting for a Regional Model

    Science.gov (United States)

    Yoshida, R.; Yamaura, T.; Adachi, S. A.; Nishizawa, S.; Yashiro, H.; Sato, Y.; Tomita, H.

    2015-12-01

    A new cost-effective implementation of online nesting has been developed to improve computational performance, which is as important as physical performance in numerical weather prediction and regional climate experiments. A nesting system is an indispensable component of a downscaling experiment. Online nesting has merits over offline nesting in the update interval of boundary data and in not requiring intermediate files; however, the computational efficiency of online nesting has not been evaluated much. In the conventional implementation (CVI) of online nesting, the MPI processes are arranged as a single group, and the group manages all of the nested domains. In the new implementation (NWI), the MPI processes are divided into several groups, and each process group is assigned to one domain. Therefore, there can ideally be almost no idling processes. In addition, the outer domain calculation can be overlapped with the inner domain calculation. The elapsed time of data transfer from the outer domain to the inner domain can also be hidden behind the inner domain calculation by appropriate assignment of the processes. We applied the NWI to the SCALE model (Nishizawa et al. 2015), a regional weather prediction model developed by RIKEN AICS. We evaluated the computational performance of the NWI in a double-nested experiment on the K computer. The grid numbers (x, y, z) were set to (120, 108, 40) for the outer domain with 7.5 km horizontal grid spacing, and (180, 162, 60) for the inner domain with 2.5 km horizontal grid spacing. For the calculation, 90 processes were used in both the CVI and the NWI. In the NWI, the MPI processes were divided into two groups assigned to the outer and inner domains: 9 and 81 processes, respectively. The computational performance improved by a factor of 1.2 in the NWI compared to the CVI. The benefit of the NWI could become larger when domains are multiply nested.
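
    The process-group arrangement described here corresponds to splitting MPI_COMM_WORLD into per-domain communicators. A minimal mpi4py sketch is below, using the abstract's 9/81 split; the rank assignment and the indicated boundary exchange are assumptions about how such a split could be wired, not the SCALE implementation itself.

        # Run with, e.g.: mpiexec -n 90 python nesting_split.py
        from mpi4py import MPI

        world = MPI.COMM_WORLD
        rank = world.Get_rank()

        # First 9 ranks handle the 7.5 km outer domain, the remaining 81 the
        # 2.5 km inner domain, so neither group idles while the other computes.
        color = 0 if rank < 9 else 1
        domain = world.Split(color=color, key=rank)

        name = "outer" if color == 0 else "inner"
        print(f"world rank {rank:2d} -> {name} domain, "
              f"local rank {domain.Get_rank()} of {domain.Get_size()}")

        # Boundary data would flow outer -> inner through an inter-communicator
        # (e.g. built with Create_intercomm), overlapping with inner-domain compute.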

  19. Bayesian inference in heterogeneous dynamic panel data models: three essays.

    OpenAIRE

    Ciccarelli, Matteo

    2001-01-01

    The task of this work is to discuss issues concerning the specification, estimation, inference and forecasting in multivariate dynamic heterogeneous panel data models from a Bayesian perspective. Three essays linked by a few common ideas compose the work. Multivariate dynamic models (mainly VARs) based on micro or macro panel data sets have become increasingly popular in macroeconomics, especially to study the transmission of real and monetary shocks across economies. This great use...

  20. Cost-effectiveness modeling of colorectal cancer: Computed tomography colonography vs colonoscopy or fecal occult blood tests

    International Nuclear Information System (INIS)

    Lucidarme, Olivier; Cadi, Mehdi; Berger, Genevieve; Taieb, Julien; Poynard, Thierry; Grenier, Philippe; Beresniak, Ariel

    2012-01-01

    Objectives: To assess the cost-effectiveness of three colorectal-cancer (CRC) screening strategies in France: fecal-occult-blood tests (FOBT), computed-tomography-colonography (CTC) and optical-colonoscopy (OC). Methods: Ten-year simulation modeling was used to assess a virtual asymptomatic, average-risk population 50–74 years old. Negative OC was repeated 10 years later, and OC positive for advanced or non-advanced adenoma 3 or 5 years later, respectively. FOBT was repeated biennially. Negative CTC was repeated 5 years later. Positive CTC and FOBT led to triennial OC. Total cost and CRC rate after 10 years for each screening strategy and 0–100% adherence rates with 10% increments were computed. Transition probabilities were programmed using distribution ranges to account for uncertainty parameters. Direct medical costs were estimated using the French national health insurance prices. Probabilistic sensitivity analyses used 5000 Monte Carlo simulations generating model outcomes and standard deviations. Results: For a given adherence rate, CTC screening was always the most effective but not the most cost-effective. FOBT was the least effective but most cost-effective strategy. OC was of intermediate efficacy and the least cost-effective strategy. Without screening, treatment of 123 CRC per 10,000 individuals would cost €3,444,000. For 60% adherence, the respective costs of preventing and treating, respectively 49 and 74 FOBT-detected, 73 and 50 CTC-detected and 63 and 60 OC-detected CRC would be €2,810,000, €6,450,000 and €9,340,000. Conclusion: Simulation modeling helped to identify what would be the most effective (CTC) and cost-effective screening (FOBT) strategy in the setting of mass CRC screening in France.

  1. Cost and cost effectiveness of long-lasting insecticide-treated bed nets - a model-based analysis

    Directory of Open Access Journals (Sweden)

    Pulkki-Brännström Anni-Maria

    2012-04-01

    Full Text Available Abstract Background The World Health Organization recommends that national malaria programmes universally distribute long-lasting insecticide-treated bed nets (LLINs). LLINs provide effective insecticide protection for at least three years, while conventional nets must be retreated every 6-12 months. LLINs may also promise longer physical durability (lifespan), but at a higher unit price. No prospective data currently available are sufficient to calculate the comparative cost effectiveness of different net types. We therefore constructed a model to explore the cost effectiveness of LLINs, asking how a longer lifespan affects the relative cost effectiveness of nets, and if, when and why LLINs might be preferred to conventional insecticide-treated nets. An innovation of our model is that we also considered the replenishment need, i.e. the loss of nets over time. Methods We modelled the choice of net over a 10-year period to facilitate the comparison of nets with different lifespans (and/or prices) and replenishment needs over time. Our base case represents a large-scale programme which achieves high coverage and usage throughout the population by distributing either LLINs or conventional nets through existing health services, and retreats a large proportion of conventional nets regularly at low cost. We identified the determinants of bed net programme cost effectiveness, and parameter values for usage rate, delivery and retreatment cost, from the literature. One-way sensitivity analysis was conducted to explicitly compare the differential effect of changing parameters such as price, lifespan, usage and replenishment need. Results If conventional and long-lasting bed nets have the same physical lifespan (3 years), LLINs are more cost effective unless they are priced at more than USD 1.5 above the price of conventional nets. Because a longer lifespan brings delivery cost savings, each one-year increase in lifespan can be accompanied by a USD 1 or more increase in price

  2. Cost-effective conservation of an endangered frog under uncertainty.

    Science.gov (United States)

    Rose, Lucy E; Heard, Geoffrey W; Chee, Yung En; Wintle, Brendan A

    2016-04-01

    How should managers choose among conservation options when resources are scarce and there is uncertainty regarding the effectiveness of actions? Well-developed tools exist for prioritizing areas for one-time and binary actions (e.g., protect vs. not protect), but methods for prioritizing incremental or ongoing actions (such as habitat creation and maintenance) remain uncommon. We devised an approach that combines metapopulation viability and cost-effectiveness analyses to select among alternative conservation actions while accounting for uncertainty. In our study, cost-effectiveness is the ratio between the benefit of an action and its economic cost, where benefit is the change in metapopulation viability. We applied the approach to the case of the endangered growling grass frog (Litoria raniformis), which is threatened by urban development. We extended a Bayesian model to predict metapopulation viability under 9 urbanization and management scenarios and incorporated the full probability distribution of possible outcomes for each scenario into the cost-effectiveness analysis. This allowed us to discern between cost-effective alternatives that were robust to uncertainty and those with a relatively high risk of failure. We found a relatively high risk of extinction following urbanization if the only action was reservation of core habitat; habitat creation actions performed better than enhancement actions; and cost-effectiveness ranking changed depending on the consideration of uncertainty. Our results suggest that creation and maintenance of wetlands dedicated to L. raniformis is the only cost-effective action likely to result in a sufficiently low risk of extinction. To our knowledge, this is the first study to use Bayesian metapopulation viability analysis to explicitly incorporate parametric and demographic uncertainty into a cost-effectiveness evaluation of conservation actions. The approach offers guidance to decision makers aiming to achieve cost-effective

  3. Bayesian Age-Period-Cohort Modeling and Prediction - BAMP

    Directory of Open Access Journals (Sweden)

    Volker J. Schmid

    2007-10-01

    Full Text Available The software package BAMP provides a method of analyzing incidence or mortality data on the Lexis diagram, using a Bayesian version of an age-period-cohort model. A hierarchical model is assumed, with a binomial model in the first stage. As smoothing priors for the age, period and cohort parameters, random walks of first and second order, with and without an additional unstructured component, are available. Unstructured heterogeneity can also be included in the model. In order to evaluate the model fit, posterior deviance, DIC and predictive deviances are computed. By projecting the random walk prior into the future, future death rates can be predicted.
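
    Projecting a second-order random-walk prior forward is simple to sketch. The snippet below is an illustration only: the period effects and innovation standard deviation are invented values, not real BAMP output.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Posterior-mean period effects from a fitted APC model (invented).
    period = np.array([-0.30, -0.22, -0.15, -0.10, -0.04])
    sigma = 0.05                            # RW2 innovation SD (assumed known)
    n_ahead, n_draws = 3, 5000

    draws = np.empty((n_draws, n_ahead))
    for i in range(n_draws):
        x = list(period[-2:])               # an RW2 needs the last two values
        for _ in range(n_ahead):
            # second-order random walk: x_t ~ N(2 x_{t-1} - x_{t-2}, sigma^2)
            x.append(rng.normal(2 * x[-1] - x[-2], sigma))
        draws[i] = x[2:]

    print("predictive mean:", draws.mean(axis=0).round(3))
    print("90% interval:", np.percentile(draws, [5, 95], axis=0).round(3))
    ```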

  4. A tutorial introduction to Bayesian inference for stochastic epidemic models using Approximate Bayesian Computation.

    Science.gov (United States)

    Kypraios, Theodore; Neal, Peter; Prangle, Dennis

    2017-05-01

    Likelihood-based inference for disease outbreak data can be very challenging due to the inherent dependence of the data and the fact that they are usually incomplete. In this paper we review recent Approximate Bayesian Computation (ABC) methods for the analysis of such data by fitting stochastic epidemic models to them, without having to calculate the likelihood of the observed data. We consider both non-temporal and temporal data and illustrate the methods with a number of examples featuring different models and datasets. In addition, we present extensions to existing algorithms which are easy to implement and provide an improvement to the existing methodology. Finally, R code to implement the algorithms presented in the paper is available at https://github.com/kypraios/epiABC. Copyright © 2016 Elsevier Inc. All rights reserved.
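
    The basic ABC rejection idea — simulate from the model under parameters drawn from the prior and keep draws whose summaries land near the observed data — fits in a few lines. This toy example (invented final-size observation, fixed recovery rate) is a generic sketch, not the paper's algorithm or the epiABC code.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def sir_final_size(beta, gamma=1.0, n=100, i0=1):
        """Event-driven stochastic SIR; returns the final epidemic size."""
        s, i = n - i0, i0
        while i > 0:
            rate_inf, rate_rec = beta * s * i / n, gamma * i
            if rng.random() < rate_inf / (rate_inf + rate_rec):
                s, i = s - 1, i + 1      # infection event
            else:
                i -= 1                   # removal event
        return n - s                     # everyone who was ever infected

    observed = 40                        # observed final size (invented)
    accepted = [beta for beta in rng.uniform(0, 3, 20000)
                if abs(sir_final_size(beta) - observed) <= 2]  # tolerance

    print(f"{len(accepted)} draws accepted; "
          f"ABC posterior mean for beta = {np.mean(accepted):.2f}")
    ```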

  5. Detecting Multiple Random Changepoints in Bayesian Piecewise Growth Mixture Models.

    Science.gov (United States)

    Lock, Eric F; Kohli, Nidhi; Bose, Maitreyee

    2017-11-17

    Piecewise growth mixture models (PGMMs) are a flexible and useful class of methods for analyzing segmented trends in individual growth trajectories over time, where the individuals come from a mixture of two or more latent classes. These models allow each segment of the overall developmental process within each class to have a different functional form; examples include two linear phases of growth, or a quadratic phase followed by a linear phase. The changepoint (knot) is the time of transition from one developmental phase (segment) to another. Inferring the location of the changepoint(s) is often of practical interest, along with inference for other model parameters. A random changepoint allows for individual differences in the transition time within each class. The primary objectives of our study are as follows: (1) to develop a PGMM using a Bayesian inference approach that allows the estimation of multiple random changepoints within each class; (2) to develop a procedure to empirically detect the number of random changepoints within each class; and (3) to empirically investigate the bias and precision of the estimation of the model parameters, including the random changepoints, via a simulation study. We have developed the user-friendly package BayesianPGMM for R to facilitate the adoption of this methodology in practice, which is available at https://github.com/lockEF/BayesianPGMM . We describe an application to mouse-tracking data for a visual recognition task.
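
    A random changepoint simply shifts the knot of a piecewise-linear trajectory from subject to subject. A small simulation sketch with invented parameter values (not BayesianPGMM output):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def piecewise(t, intercept, slope1, slope2, knot):
        """Two linear growth phases joined at a changepoint (knot)."""
        return (intercept + slope1 * np.minimum(t, knot)
                + slope2 * np.maximum(t - knot, 0.0))

    t = np.linspace(0.0, 10.0, 11)
    for subj in range(3):
        knot = rng.normal(5.0, 0.8)      # subject-specific random changepoint
        y = piecewise(t, 1.0, 0.9, 0.2, knot) + rng.normal(0.0, 0.1, t.size)
        print(f"subject {subj}: knot = {knot:.2f}, final y = {y[-1]:.2f}")
    ```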

  6. Operational modal analysis modeling, Bayesian inference, uncertainty laws

    CERN Document Server

    Au, Siu-Kui

    2017-01-01

    This book presents operational modal analysis (OMA), employing a coherent and comprehensive Bayesian framework for modal identification and covering stochastic modeling, theoretical formulations, computational algorithms, and practical applications. Mathematical similarities and philosophical differences between Bayesian and classical statistical approaches to system identification are discussed, allowing their mathematical tools to be shared and their results correctly interpreted. Many chapters can be used as lecture notes for the general topic they cover beyond the OMA context. After an introductory chapter (1), Chapters 2–7 present the general theory of stochastic modeling and analysis of ambient vibrations. Readers are first introduced to the spectral analysis of deterministic time series (2) and structural dynamics (3), which do not require the use of probability concepts. The concepts and techniques in these chapters are subsequently extended to a probabilistic context in Chapter 4 (on stochastic pro...

  7. PDS-Modelling and Regional Bayesian Estimation of Extreme Rainfalls

    DEFF Research Database (Denmark)

    Madsen, Henrik; Rosbjerg, Dan; Harremoës, Poul

    1994-01-01

    Since 1979 a country-wide system of raingauges has been operated in Denmark in order to obtain a better basis for design and analysis of urban drainage systems. As an alternative to the traditional non-parametric approach, the Partial Duration Series method is employed in the modelling of extreme rainfalls. […] in Denmark cannot be justified. In order to obtain an estimation procedure at non-monitored sites and to improve at-site estimates, a regional Bayesian approach is adopted. The empirical regional distributions of the parameters in the Partial Duration Series model are used as prior information. The application of the Bayesian approach is derived in the case of both exponential and generalized Pareto distributed exceedances. Finally, the aspect of including economic perspectives in the estimation of the design events is briefly discussed.

  8. Approximate Bayesian computation for spatial SEIR(S) epidemic models.

    Science.gov (United States)

    Brown, Grant D; Porter, Aaron T; Oleson, Jacob J; Hinman, Jessica A

    2018-02-01

    Approximate Bayesian Computation (ABC) provides an attractive approach to estimation in complex Bayesian inferential problems for which evaluation of the kernel of the posterior distribution is impossible or computationally expensive. These highly parallelizable techniques have been successfully applied to many fields, particularly in cases where more traditional approaches such as Markov chain Monte Carlo (MCMC) are impractical. In this work, we demonstrate the application of approximate Bayesian inference to spatially heterogeneous Susceptible-Exposed-Infectious-Removed (SEIR) stochastic epidemic models. Although these models have a tractable posterior distribution, MCMC techniques become computationally infeasible for moderately sized problems. We discuss the practical implementation of these techniques via the open source ABSEIR package for R. The performance of ABC relative to traditional MCMC methods in a small problem is explored under simulation, as well as in the spatially heterogeneous context of the 2014 epidemic of Chikungunya in the Americas. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Bayesian Predictive Modeling Based on Multidimensional Connectivity Profiling

    Science.gov (United States)

    Herskovits, Edward

    2015-01-01

    Dysfunction of brain structural and functional connectivity is increasingly being recognized as playing an important role in many brain disorders. Diffusion tensor imaging (DTI) and functional magnetic resonance (fMR) imaging are widely used to infer structural and functional connectivity, respectively. How to combine structural and functional connectivity patterns for predictive modeling is an important, yet open, problem. We propose a new method, called Bayesian prediction based on multidimensional connectivity profiling (BMCP), to distinguish subjects at the individual level based on structural and functional connectivity patterns. BMCP combines finite mixture modeling and Bayesian network classification. We demonstrate its use in distinguishing young and elderly adults based on DTI and resting-state fMR data. PMID:25924166

  10. Comparison of evidence theory and Bayesian theory for uncertainty modeling

    International Nuclear Information System (INIS)

    Soundappan, Prabhu; Nikolaidis, Efstratios; Haftka, Raphael T.; Grandhi, Ramana; Canfield, Robert

    2004-01-01

    This paper compares Evidence Theory (ET) and Bayesian Theory (BT) for uncertainty modeling and decision under uncertainty, when the evidence about uncertainty is imprecise. The basic concepts of ET and BT are introduced and the ways these theories model uncertainties, propagate them through systems and assess the safety of these systems are presented. ET and BT approaches are demonstrated and compared on challenge problems involving an algebraic function whose input variables are uncertain. The evidence about the input variables consists of intervals provided by experts. It is recommended that a decision-maker compute both the Bayesian probabilities of the outcomes of alternative actions and their plausibility and belief measures when evidence about uncertainty is imprecise, because this helps assess the importance of imprecision and the value of additional information. Finally, the paper presents and demonstrates a method for testing approaches for decision under uncertainty in terms of their effectiveness in making decisions
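
    The belief/plausibility bracket that Evidence Theory places around a Bayesian probability can be illustrated with a toy interval-evidence example; all masses and intervals below are invented.

    ```python
    # Toy Dempster-Shafer calculation for the event A = {x <= 2.0}, given two
    # expert intervals with basic probability assignments (invented values).
    masses = {(0.0, 1.5): 0.6, (1.0, 3.0): 0.4}   # focal interval -> mass

    threshold = 2.0
    # Belief sums masses of intervals entirely inside A; plausibility sums
    # masses of intervals that merely intersect A.  Bel(A) <= P(A) <= Pl(A).
    belief = sum(m for (lo, hi), m in masses.items() if hi <= threshold)
    plausibility = sum(m for (lo, hi), m in masses.items() if lo <= threshold)
    print(f"Bel(A) = {belief:.2f}, Pl(A) = {plausibility:.2f}")
    ```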

  11. A Unified Bayesian Inference Framework for Generalized Linear Models

    Science.gov (United States)

    Meng, Xiangming; Wu, Sheng; Zhu, Jiang

    2018-03-01

    In this letter, we present a unified Bayesian inference framework for generalized linear models (GLM) which iteratively reduces the GLM problem to a sequence of standard linear model (SLM) problems. This framework provides new perspectives on some established GLM algorithms derived from SLM ones and also suggests novel extensions for some other SLM algorithms. Specific instances elucidated under such framework are the GLM versions of approximate message passing (AMP), vector AMP (VAMP), and sparse Bayesian learning (SBL). It is proved that the resultant GLM version of AMP is equivalent to the well-known generalized approximate message passing (GAMP). Numerical results for 1-bit quantized compressed sensing (CS) demonstrate the effectiveness of this unified framework.

  12. A Bayesian hierarchical model for climate change detection and attribution

    Science.gov (United States)

    Katzfuss, Matthias; Hammerling, Dorit; Smith, Richard L.

    2017-06-01

    Regression-based detection and attribution methods continue to take a central role in the study of climate change and its causes. Here we propose a novel Bayesian hierarchical approach to this problem, which allows us to address several open methodological questions. Specifically, we take into account the uncertainties in the true temperature change due to imperfect measurements, the uncertainty in the true climate signal under different forcing scenarios due to the availability of only a small number of climate model simulations, and the uncertainty associated with estimating the climate variability covariance matrix, including the truncation of the number of empirical orthogonal functions (EOFs) in this covariance matrix. We apply Bayesian model averaging to assign optimal probabilistic weights to different possible truncations and incorporate all uncertainties into the inference on the regression coefficients. We provide an efficient implementation of our method in a software package and illustrate its use with a realistic application.

  13. Nonparametric Bayesian models through probit stick-breaking processes.

    Science.gov (United States)

    Rodríguez, Abel; Dunson, David B

    2011-03-01

    We describe a novel class of Bayesian nonparametric priors based on stick-breaking constructions where the weights of the process are constructed as probit transformations of normal random variables. We show that these priors are extremely flexible, allowing us to generate a great variety of models while preserving computational simplicity. Particular emphasis is placed on the construction of rich temporal and spatial processes, which are applied to two problems in finance and ecology.
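
    The probit stick-breaking construction itself is compact: stick fractions are probit transforms of normal variables. A sketch with arbitrary latent normals and a fixed truncation level (illustrative, not from the paper):

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(3)

    K = 10                               # truncation level
    alpha = rng.normal(0.0, 1.0, K)      # latent normals (in the full model
                                         # these can depend on time or space)
    v = norm.cdf(alpha)                  # probit transform -> fractions in (0,1)

    # stick-breaking: w_k = v_k * prod_{j<k} (1 - v_j)
    w = v * np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    w[-1] = 1.0 - w[:-1].sum()           # assign leftover mass to the last atom
    print(w.round(3), w.sum())
    ```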

  14. Bayesian Modelling of fMRI Time Series

    DEFF Research Database (Denmark)

    Højen-Sørensen, Pedro; Hansen, Lars Kai; Rasmussen, Carl Edward

    2000-01-01

    We present a Hidden Markov Model (HMM) for inferring the hidden psychological state (or neural activity) during single-trial fMRI activation experiments with blocked task paradigms. Inference is based on Bayesian methodology, using a combination of analytical methods and a variety of Markov Chain Monte Carlo (MCMC) sampling techniques. The advantage of this method is that detection of short-time learning effects between repeated trials is possible, since inference is based only on single-trial experiments.
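
    Inference in such an HMM rests on the forward recursion. A minimal sketch with invented transition, initial and emission parameters (not the paper's fitted model):

    ```python
    import numpy as np

    # Forward algorithm for a two-state HMM (rest vs. task) with Gaussian
    # emissions; every number here is illustrative, not fitted to fMRI data.
    A = np.array([[0.9, 0.1],
                  [0.1, 0.9]])               # state transition matrix
    pi = np.array([0.5, 0.5])                # initial state distribution
    means, sd = np.array([0.0, 1.0]), 0.5    # emission means / shared SD

    def emit(x):
        """Gaussian emission density for both states at observation x."""
        return np.exp(-0.5 * ((x - means) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

    obs = np.array([0.1, -0.2, 0.9, 1.1, 1.0, 0.2])
    alpha = pi * emit(obs[0])
    for x in obs[1:]:
        alpha = (alpha @ A) * emit(x)        # propagate, then weight by likelihood
    print("marginal likelihood of the series:", alpha.sum())
    ```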

  15. Bayesian tsunami fragility modeling considering input data uncertainty

    OpenAIRE

    De Risi, Raffaele; Goda, Katsu; Mori, Nobuhito; Yasuda, Tomohiro

    2017-01-01

    Empirical tsunami fragility curves are developed based on a Bayesian framework by accounting for uncertainty of input tsunami hazard data in a systematic and comprehensive manner. Three fragility modeling approaches, i.e. lognormal method, binomial logistic method, and multinomial logistic method, are considered, and are applied to extensive tsunami damage data for the 2011 Tohoku earthquake. A unique aspect of this study is that uncertainty of tsunami inundation data (i.e. input hazard data ...

  16. Shortlist B: A Bayesian model of continuous speech recognition

    OpenAIRE

    Norris, D.; McQueen, J.

    2008-01-01

    A Bayesian model of continuous speech recognition is presented. It is based on Shortlist ( D. Norris, 1994; D. Norris, J. M. McQueen, A. Cutler, & S. Butterfield, 1997) and shares many of its key assumptions: parallel competitive evaluation of multiple lexical hypotheses, phonologically abstract prelexical and lexical representations, a feedforward architecture with no online feedback, and a lexical segmentation algorithm based on the viability of chunks of the input as possible words. Shortl...

  17. Cost-Effectiveness of Enhanced Syphilis Screening among HIV-Positive Men Who Have Sex with Men: A Microsimulation Model

    Science.gov (United States)

    Tuite, Ashleigh R.; Burchell, Ann N.; Fisman, David N.

    2014-01-01

    Background Syphilis co-infection risk has increased substantially among HIV-infected men who have sex with men (MSM). Frequent screening for syphilis and treatment of men who test positive might be a practical means of controlling the risk of infection and disease sequelae in this population. Purpose We evaluated the cost-effectiveness of strategies that increased the frequency and population coverage of syphilis screening in HIV-infected MSM receiving HIV care, relative to current standard of care. Methods We developed a state-transition microsimulation model of syphilis natural history and medical care in HIV-infected MSM receiving care for HIV. We performed Monte Carlo simulations using input data derived from a large observational cohort in Ontario, Canada, and from published biomedical literature. Simulations compared usual care (57% of the population screened annually) to different combinations of more frequent (3- or 6-monthly) screening and higher coverage (100% screened). We estimated expected disease-specific outcomes, quality-adjusted survival, costs, and cost-effectiveness associated with each strategy from the perspective of a public health care payer. Results Usual care was more costly and less effective than strategies with more frequent or higher coverage screening. Higher coverage strategies (with screening frequency of 3 or 6 months) were expected to be cost-effective based on usually cited willingness-to-pay thresholds. These findings were robust in the face of probabilistic sensitivity analyses, alternate cost-effectiveness thresholds, and alternate assumptions about duration of risk, program characteristics, and management of underlying HIV. Conclusions We project that higher coverage and more frequent syphilis screening of HIV-infected MSM would be a highly cost-effective health intervention, with many potentially viable screening strategies projected to both save costs and improve health when compared to usual care. The baseline requirement
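
    The state-transition microsimulation logic can be caricatured in a few lines. The toy below uses invented costs and an invented infection risk, not the Ontario cohort inputs, and contrasts only screening frequencies:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Invented annual inputs -- NOT the published model's parameters.
    P_INFECT = 0.08                          # annual infection probability
    COST_SCREEN, COST_EARLY, COST_LATE = 20.0, 150.0, 2000.0

    def strategy_cost(screens_per_year, n=10000, years=10):
        """Tiny state-transition microsimulation of one screening strategy."""
        total = 0.0
        for _ in range(n):                   # simulate individuals one by one
            for _ in range(years):
                total += screens_per_year * COST_SCREEN
                if rng.random() < P_INFECT:
                    # frequent screening -> caught early; infrequent -> late
                    total += COST_EARLY if screens_per_year >= 2 else COST_LATE
        return total / n

    print("annual screening, mean 10-y cost:", strategy_cost(1))
    print("quarterly screening, mean 10-y cost:", strategy_cost(4))
    ```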

  18. Cost-effectiveness of enhanced syphilis screening among HIV-positive men who have sex with men: a microsimulation model.

    Directory of Open Access Journals (Sweden)

    Ashleigh R Tuite

    Full Text Available Syphilis co-infection risk has increased substantially among HIV-infected men who have sex with men (MSM). Frequent screening for syphilis and treatment of men who test positive might be a practical means of controlling the risk of infection and disease sequelae in this population. We evaluated the cost-effectiveness of strategies that increased the frequency and population coverage of syphilis screening in HIV-infected MSM receiving HIV care, relative to current standard of care. We developed a state-transition microsimulation model of syphilis natural history and medical care in HIV-infected MSM receiving care for HIV. We performed Monte Carlo simulations using input data derived from a large observational cohort in Ontario, Canada, and from published biomedical literature. Simulations compared usual care (57% of the population screened annually) to different combinations of more frequent (3- or 6-monthly) screening and higher coverage (100% screened). We estimated expected disease-specific outcomes, quality-adjusted survival, costs, and cost-effectiveness associated with each strategy from the perspective of a public health care payer. Usual care was more costly and less effective than strategies with more frequent or higher coverage screening. Higher coverage strategies (with screening frequency of 3 or 6 months) were expected to be cost-effective based on usually cited willingness-to-pay thresholds. These findings were robust in the face of probabilistic sensitivity analyses, alternate cost-effectiveness thresholds, and alternate assumptions about duration of risk, program characteristics, and management of underlying HIV. We project that higher coverage and more frequent syphilis screening of HIV-infected MSM would be a highly cost-effective health intervention, with many potentially viable screening strategies projected to both save costs and improve health when compared to usual care. The baseline requirement for regular blood testing in this

  19. Bayesian Modeling of Cerebral Information Processing

    OpenAIRE

    Labatut, Vincent; Pastor, Josette

    2001-01-01

    Modeling explicitly the links between cognitive functions and networks of cerebral areas is necessitated both by the understanding of the clinical outcomes of brain lesions and by the interpretation of activation data provided by functional neuroimaging techniques. At this global level of representation, the human brain can be best modeled by a probabilistic functional causal network. Our modeling approach is based on the anatomical connection pattern, the information ...

  20. DISSECTING MAGNETAR VARIABILITY WITH BAYESIAN HIERARCHICAL MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Huppenkothen, Daniela; Elenbaas, Chris; Watts, Anna L.; Horst, Alexander J. van der [Anton Pannekoek Institute for Astronomy, University of Amsterdam, Postbus 94249, 1090 GE Amsterdam (Netherlands); Brewer, Brendon J. [Department of Statistics, The University of Auckland, Private Bag 92019, Auckland 1142 (New Zealand); Hogg, David W. [Center for Data Science, New York University, 726 Broadway, 7th Floor, New York, NY 10003 (United States); Murray, Iain [School of Informatics, University of Edinburgh, Edinburgh EH8 9AB (United Kingdom); Frean, Marcus [School of Engineering and Computer Science, Victoria University of Wellington (New Zealand); Levin, Yuri [Monash Center for Astrophysics and School of Physics, Monash University, Clayton, Victoria 3800 (Australia); Kouveliotou, Chryssa, E-mail: daniela.huppenkothen@nyu.edu [Astrophysics Office, ZP 12, NASA/Marshall Space Flight Center, Huntsville, AL 35812 (United States)

    2015-09-01

    Neutron stars are a prime laboratory for testing physical processes under conditions of strong gravity, high density, and extreme magnetic fields. Among the zoo of neutron star phenomena, magnetars stand out for their bursting behavior, ranging from extremely bright, rare giant flares to numerous, less energetic recurrent bursts. The exact trigger and emission mechanisms for these bursts are not known; favored models involve either a crust fracture and subsequent energy release into the magnetosphere, or explosive reconnection of magnetic field lines. In the absence of a predictive model, understanding the physical processes responsible for magnetar burst variability is difficult. Here, we develop an empirical model that decomposes magnetar bursts into a superposition of small spike-like features with a simple functional form, where the number of model components is itself part of the inference problem. The cascades of spikes that we model might be formed by avalanches of reconnection, or crust rupture aftershocks. Using Markov Chain Monte Carlo sampling augmented with reversible jumps between models with different numbers of parameters, we characterize the posterior distributions of the model parameters and the number of components per burst. We relate these model parameters to physical quantities in the system, and show for the first time that the variability within a burst does not conform to predictions from ideas of self-organized criticality. We also examine how well the properties of the spikes fit the predictions of simplified cascade models for the different trigger mechanisms.

  1. Advances in Applications of Hierarchical Bayesian Methods with Hydrological Models

    Science.gov (United States)

    Alexander, R. B.; Schwarz, G. E.; Boyer, E. W.

    2017-12-01

    Mechanistic and empirical watershed models are increasingly used to inform water resource decisions. Growing access to historical stream measurements and data from in-situ sensor technologies has increased the need for improved techniques for coupling models with hydrological measurements. Techniques that account for the intrinsic uncertainties of both models and measurements are especially needed. Hierarchical Bayesian methods provide an efficient modeling tool for quantifying model and prediction uncertainties, including those associated with measurements. Hierarchical methods can also be used to explore spatial and temporal variations in model parameters and uncertainties that are informed by hydrological measurements. We used hierarchical Bayesian methods to develop a hybrid (statistical-mechanistic) SPARROW (SPAtially Referenced Regression On Watershed attributes) model of long-term mean annual streamflow across diverse environmental and climatic drainages in 18 U.S. hydrological regions. Our application illustrates the use of a new generation of Bayesian methods that offer more advanced computational efficiencies than the prior generation. Evaluations of the effects of hierarchical (regional) variations in model coefficients and uncertainties on model accuracy indicate improved prediction accuracies (median of 10-50%), but primarily in humid eastern regions, where model uncertainties are one-third of those in arid western regions. Generally moderate regional variability is observed for most hierarchical coefficients. Accounting for measurement and structural uncertainties, using hierarchical state-space techniques, revealed the effects of spatially-heterogeneous, latent hydrological processes in the "localized" drainages between calibration sites; this improved model precision, with only minor changes in regional coefficients. Our study can inform advances in the use of hierarchical methods with hydrological models to improve their integration with stream

  2. A Bayesian Nonparametric Meta-Analysis Model

    Science.gov (United States)

    Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G.

    2015-01-01

    In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall…

  3. AIC, BIC, Bayesian evidence against the interacting dark energy model

    International Nuclear Information System (INIS)

    Szydlowski, Marek; Krawiec, Adam; Kurek, Aleksandra; Kamionka, Michal

    2015-01-01

    Recent astronomical observations have indicated that the Universe is in a phase of accelerated expansion. While there are many cosmological models which try to explain this phenomenon, we focus on the interacting ΛCDM model, where an interaction between the dark energy and dark matter sectors takes place. This model is compared to its simpler alternative, the ΛCDM model. To choose between these models, the likelihood ratio test was applied, as well as model comparison methods (employing Occam's principle): the Akaike information criterion (AIC), the Bayesian information criterion (BIC) and the Bayesian evidence. Using the current astronomical data: type Ia supernovae (Union2.1), h(z), baryon acoustic oscillations, the Alcock-Paczynski test, and the cosmic microwave background data, we evaluated both models. The analyses based on the AIC indicated that there is less support for the interacting ΛCDM model when compared to the ΛCDM model, while those based on the BIC indicated that there is strong evidence against it in favor of the ΛCDM model. Given the weak or almost non-existent support for the interacting ΛCDM model, and bearing in mind Occam's razor, we are inclined to reject this model. (orig.)
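
    Both criteria are one-liners once a model's maximized log-likelihood is known: AIC = 2k − 2 ln L and BIC = k ln n − 2 ln L, with lower values preferred. A sketch with invented log-likelihoods and sample size, not the paper's fit results:

    ```python
    import numpy as np

    def aic(loglik, k):
        """Akaike information criterion: AIC = 2k - 2 ln L."""
        return 2 * k - 2 * loglik

    def bic(loglik, k, n):
        """Bayesian information criterion: BIC = k ln n - 2 ln L."""
        return k * np.log(n) - 2 * loglik

    # Invented numbers: a 2-parameter model vs a 3-parameter extension
    # fitted to n = 580 data points; the extra parameter barely helps.
    print("AIC:", aic(-272.1, 2), "vs", aic(-271.9, 3))
    print("BIC:", bic(-272.1, 2, 580), "vs", bic(-271.9, 3, 580))
    ```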

  4. Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model: A Web-based program designed to evaluate the cost-effectiveness of disease management programs in heart failure.

    Science.gov (United States)

    Reed, Shelby D; Neilson, Matthew P; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H; Polsky, Daniel E; Graham, Felicia L; Bowers, Margaret T; Paul, Sara C; Granger, Bradi B; Schulman, Kevin A; Whellan, David J; Riegel, Barbara; Levy, Wayne C

    2015-11-01

    Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics; use of evidence-based medications; and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model. Projections of resource use and quality of life are modeled using relationships with time-varying Seattle Heart Failure Model scores. The model can be used to evaluate parallel-group and single-cohort study designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. The Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. Copyright © 2015 Elsevier Inc. All rights reserved.
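
    The cost-effectiveness arithmetic on paired virtual cohorts reduces to an incremental cost-effectiveness ratio (ICER) and a probability of cost-effectiveness. A sketch with invented cost and QALY draws, not outputs of the tool described above:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Invented draws for 10,000 pairs of virtual cohorts.
    n = 10_000
    cost_usual = rng.normal(40_000, 5_000, n)
    cost_dm    = rng.normal(43_000, 5_000, n)     # disease management programme
    qaly_usual = rng.normal(4.0, 0.5, n)
    qaly_dm    = rng.normal(4.3, 0.5, n)

    d_cost, d_qaly = cost_dm - cost_usual, qaly_dm - qaly_usual
    icer = d_cost.mean() / d_qaly.mean()          # incremental cost per QALY
    wtp = 50_000                                  # willingness to pay per QALY
    p_ce = np.mean(d_cost < wtp * d_qaly)         # P(programme is cost-effective)
    print(f"ICER = ${icer:,.0f}/QALY; "
          f"P(cost-effective at ${wtp:,}/QALY) = {p_ce:.2f}")
    ```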

  5. Cost-effectiveness of dengue vaccination in Yucatán, Mexico using a dynamic dengue transmission model

    OpenAIRE

    Shim, Eunha

    2017-01-01

    Background The incidence of dengue fever (DF) is steadily increasing in Mexico, burdening health systems with consequent morbidities and mortalities. On December 9th, 2015, Mexico became the first country for which the dengue vaccine was approved for use. In anticipation of a vaccine rollout, analysis of the cost-effectiveness of the dengue vaccination program that quantifies the dynamics of disease transmission is essential. Methods We developed a dynamic transmission model of dengue in Yuca...

  6. Cost-effectiveness of seasonal quadrivalent versus trivalent influenza vaccination in the United States: A dynamic transmission modeling approach.

    Science.gov (United States)

    Brogan, Anita J; Talbird, Sandra E; Davis, Ashley E; Thommes, Edward W; Meier, Genevieve

    2017-03-04

    Trivalent inactivated influenza vaccines (IIV3s) protect against 2 A strains and one B lineage; quadrivalent versions (IIV4s) protect against an additional B lineage. The objective was to assess projected health and economic outcomes associated with IIV4 versus IIV3 for preventing seasonal influenza in the US. A cost-effectiveness model was developed to interact with a dynamic transmission model. The transmission model tracked vaccination, influenza cases, infection-spreading interactions, and recovery over 10 y (2012-2022). The cost-effectiveness model estimated influenza-related complications, direct and indirect costs (2013-2014 US$), health outcomes, and cost-effectiveness. Inputs were taken from published/public sources or estimated using regression or calibration. Outcomes were discounted at 3% per year. Scenario analyses tested the reliability of the results. Seasonal vaccination with IIV4 versus IIV3 is predicted to reduce annual influenza cases by 1,973,849 (discounted; 2,325,644 undiscounted), resulting in 12-13% fewer cases and influenza-related complications and deaths. These reductions are predicted to translate into 18,485 more quality-adjusted life years (QALYs) accrued annually for IIV4 versus IIV3. Increased vaccine-related costs ($599 million; 5.7%) are predicted to be more than offset by reduced influenza treatment costs ($699 million; 12.2%), resulting in direct medical cost saving annually ($100 million; 0.6%). Including indirect costs, savings with IIV4 are predicted to be $7.1 billion (5.6%). Scenario analyses predict IIV4 to be cost-saving in all scenarios tested apart from low infectivity, where IIV4 is predicted to be cost-effective. In summary, seasonal influenza vaccination in the US with IIV4 versus IIV3 is predicted to improve health outcomes and reduce costs.

  7. Towards port sustainability through probabilistic models: Bayesian networks

    Directory of Open Access Journals (Sweden)

    B. Molina

    2018-04-01

    Full Text Available It is necessary that an infrastructure manager knows the relations between variables. Using Bayesian networks, variables can be classified, predicted and diagnosed, and the posterior probability of unknown variables can be estimated from known ones. The proposed methodology has generated a database of port variables, classified as economic, social, environmental and institutional, as addressed in the smart-port studies carried out across the whole Spanish Port System. The network has been developed as a directed acyclic graph, which reveals the relationships between variables in terms of parent and child nodes. In probabilistic terms, it can be concluded from the constructed network that the most decisive variables for port sustainability are those belonging to the institutional dimension. Bayesian networks allow uncertainty to be modeled probabilistically even when the number of variables is high, as occurs in port planning and exploitation.
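
    Diagnosis in a Bayesian network is Bayes' rule run against the graph. A two-node toy version with invented conditional probabilities (not the port data):

    ```python
    # Two-node toy network: Institutional strength -> Port sustainability.
    p_inst = {True: 0.6, False: 0.4}          # P(strong institutions)
    p_sust_given = {True: 0.8, False: 0.3}    # P(sustainable | institutions)

    # Diagnosis: observe "port is sustainable", infer institutional strength.
    joint = {i: p_inst[i] * p_sust_given[i] for i in (True, False)}
    posterior = joint[True] / sum(joint.values())
    print(f"P(strong institutions | sustainable) = {posterior:.2f}")
    ```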

  8. Bayesian geostatistical modeling of leishmaniasis incidence in Brazil.

    Directory of Open Access Journals (Sweden)

    Dimitrios-Alexios Karagiannis-Voules

    Full Text Available BACKGROUND: Leishmaniasis is endemic in 98 countries with an estimated 350 million people at risk and approximately 2 million cases annually. Brazil is one of the most severely affected countries. METHODOLOGY: We applied Bayesian geostatistical negative binomial models to analyze reported incidence data of cutaneous and visceral leishmaniasis in Brazil covering a 10-year period (2001-2010). Particular emphasis was placed on spatial and temporal patterns. The models were fitted using integrated nested Laplace approximations to perform fast approximate Bayesian inference. Bayesian variable selection was employed to determine the most important climatic, environmental, and socioeconomic predictors of cutaneous and visceral leishmaniasis. PRINCIPAL FINDINGS: For both types of leishmaniasis, precipitation and socioeconomic proxies were identified as important risk factors. The predicted numbers of cases in 2010 were 30,189 (standard deviation [SD]: 7,676) for cutaneous leishmaniasis and 4,889 (SD: 288) for visceral leishmaniasis. Our risk maps predicted the highest numbers of infected people in the states of Minas Gerais and Pará for visceral and cutaneous leishmaniasis, respectively. CONCLUSIONS/SIGNIFICANCE: Our spatially explicit, high-resolution incidence maps identified priority areas where leishmaniasis control efforts should be targeted with the ultimate goal to reduce disease incidence.

  9. Cost-effectiveness of investing in sidewalks as a means of increasing physical activity: a RESIDE modelling study.

    Science.gov (United States)

    Veerman, J Lennert; Zapata-Diomedi, Belen; Gunn, Lucy; McCormack, Gavin R; Cobiac, Linda J; Mantilla Herrera, Ana Maria; Giles-Corti, Billie; Shiell, Alan

    2016-09-20

    Studies consistently find that supportive neighbourhood built environments increase physical activity by encouraging walking and cycling. However, evidence on the cost-effectiveness of investing in built environment interventions as a means of promoting physical activity is lacking. In this study, we assess the cost-effectiveness of increasing sidewalk availability as one means of encouraging walking. Using data from the RESIDE study in Perth, Australia, we modelled the cost impact and change in health-adjusted life years (HALYs) of installing additional sidewalks in established neighbourhoods. Estimates of the relationship between sidewalk availability and walking were taken from a previous study. Multistate life table models were used to estimate HALYs associated with changes in walking frequency and duration. Sensitivity analyses were used to explore the impact of variations in population density, discount rates, sidewalk costs and the inclusion of unrelated healthcare costs in added life years. Installing and maintaining an additional 10 km of sidewalk in an average neighbourhood with 19 000 adult residents was estimated to cost A$4.2 million over 30 years and gain 24 HALYs over the lifetime of an average neighbourhood adult resident population. The incremental cost-effectiveness ratio was A$176 000/HALY. However, sensitivity results indicated that increasing population densities improves cost-effectiveness. In low-density cities such as in Australia, installing sidewalks in established neighbourhoods as a single intervention is unlikely to cost-effectively improve health. Sidewalks must be considered alongside other complementary elements of walkability, such as density, land use mix and street connectivity. Population density is particularly important because at higher densities, more residents are exposed and this improves the cost-effectiveness. Health gain is one of many benefits of enhancing neighbourhood walkability and future studies might

  10. A Bayesian Network View on Nested Effects Models

    Directory of Open Access Journals (Sweden)

    Fröhlich Holger

    2009-01-01

    Full Text Available Nested effects models (NEMs) are a class of probabilistic models that were designed to reconstruct a hidden signalling structure from a large set of observable effects caused by active interventions into the signalling pathway. We give a more flexible formulation of NEMs in the language of Bayesian networks. Our framework constitutes a natural generalization of the original NEM model, since it explicitly states the assumptions that are tacitly underlying the original version. Our approach gives rise to new learning methods for NEMs, which have been implemented in the R/Bioconductor package nem. We validate these methods in a simulation study and apply them to a synthetic lethality dataset in yeast.

  11. Bayesian inference and model comparison for metallic fatigue data

    KAUST Repository

    Babuška, Ivo

    2016-02-23

    In this work, we present a statistical treatment of stress-life (S-N) data drawn from a collection of records of fatigue experiments that were performed on 75S-T6 aluminum alloys. Our main objective is to predict the fatigue life of materials by providing a systematic approach to model calibration, model selection and model ranking with reference to S-N data. To this purpose, we consider fatigue-limit models and random fatigue-limit models that are specially designed to allow the treatment of the run-outs (right-censored data). We first fit the models to the data by maximum likelihood methods and estimate the quantiles of the life distribution of the alloy specimen. To assess the robustness of the estimation of the quantile functions, we obtain bootstrap confidence bands by stratified resampling with respect to the cycle ratio. We then compare and rank the models by classical measures of fit based on information criteria. We also consider a Bayesian approach that provides, under the prior distribution of the model parameters selected by the user, their simulation-based posterior distributions. We implement and apply Bayesian model comparison methods, such as Bayes factor ranking and predictive information criteria based on cross-validation techniques under various a priori scenarios.
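
    Bayes factor ranking compares marginal likelihoods. For a one-parameter toy model the marginal likelihood can be computed by simple quadrature; all numbers below are invented and do not come from the S-N data:

    ```python
    import numpy as np
    from scipy import integrate, stats

    data = np.array([1.2, 0.9, 1.5, 1.1])    # invented log fatigue lives

    def likelihood(mu):
        """Likelihood of the data under a N(mu, 0.5^2) observation model."""
        return np.prod(stats.norm.pdf(data, loc=mu, scale=0.5))

    # Model 0 fixes mu = 1; model 1 puts a N(0, 1) prior on mu, and its
    # marginal likelihood is obtained by one-dimensional quadrature.
    m0 = likelihood(1.0)
    m1, _ = integrate.quad(lambda mu: likelihood(mu) * stats.norm.pdf(mu), -5, 5)
    print(f"Bayes factor BF01 = {m0 / m1:.2f}")  # >1 favors the simpler model
    ```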

  12. Cervical cancer treatment costs and cost-effectiveness analysis of human papillomavirus vaccination in Vietnam: a PRIME modeling study.

    Science.gov (United States)

    Van Minh, Hoang; My, Nguyen Thi Tuyet; Jit, Mark

    2017-05-15

    Cervical cancer is currently the leading cause of cancer mortality among women in South Vietnam and the second leading cause of cancer mortality in North Vietnam. Human papillomavirus (HPV) vaccination has the potential to substantially decrease this burden. The World Health Organization (WHO) recommends that a cost-effectiveness analysis of HPV vaccination is conducted before nationwide introduction. The Papillomavirus Rapid Interface for Modeling and Economics (PRIME) model was used to evaluate the cost-effectiveness of HPV vaccine introduction. A costing study based on expert panel discussions, interviews and hospital case note reviews was conducted to explore the cost of cervical cancer care. The cost of cervical cancer treatment ranged from US$368–11,400 depending on the type of hospital and treatment involved. Under the Gavi-negotiated price of US$4.55, HPV vaccination is likely to be very cost-effective, with an incremental cost per disability-adjusted life year (DALY) averted in the range US$780–1,120. However, under list prices for Cervarix and Gardasil in Vietnam, the incremental cost per DALY averted for HPV vaccination can exceed US$8,000. HPV vaccine introduction appears to be economically attractive only if Vietnam is able to procure the vaccine at Gavi prices. This highlights the importance of initiating a nationwide vaccination programme while such prices are still available.

  13. Predicting coastal cliff erosion using a Bayesian probabilistic model

    Science.gov (United States)

    Hapke, Cheryl J.; Plant, Nathaniel G.

    2010-01-01

    Regional coastal cliff retreat is difficult to model due to the episodic nature of failures and the along-shore variability of retreat events. There is a growing demand, however, for predictive models that can be used to forecast areas vulnerable to coastal erosion hazards. Increasingly, probabilistic models are being employed that require data sets of high temporal density to define the joint probability density function that relates forcing variables (e.g. wave conditions) and initial conditions (e.g. cliff geometry) to erosion events. In this study we use a multi-parameter Bayesian network to investigate correlations between key variables that control and influence variations in cliff retreat processes. The network uses Bayesian statistical methods to estimate event probabilities using existing observations. Within this framework, we forecast the spatial distribution of cliff retreat along two stretches of cliffed coast in Southern California. The input parameters are the height and slope of the cliff, a descriptor of material strength based on the dominant cliff-forming lithology, and the long-term cliff erosion rate that represents prior behavior. The model is forced using predicted wave impact hours. Results demonstrate that the Bayesian approach is well-suited to the forward modeling of coastal cliff retreat, with the correct outcomes forecast in 70–90% of the modeled transects. The model also performs well in identifying specific locations of high cliff erosion, thus providing a foundation for hazard mapping. This approach can be employed to predict cliff erosion at time-scales ranging from storm events to the impacts of sea-level rise at the century-scale.

  14. The cost-effectiveness of tracking newborns with bilateral hearing impairment in Bavaria: a decision-analytic model

    Directory of Open Access Journals (Sweden)

    Langer Astrid

    2012-11-01

    Full Text Available Abstract Background Although several countries, including Germany, have established newborn hearing screening programmes for early detection and treatment of newborns with hearing impairments, nationwide tracking systems for follow-up of newborns with positive test results until diagnosis of hearing impairment have often not been implemented. However, a recent study on universal newborn hearing screening in Bavaria showed that, in a high proportion of newborns, early diagnosis was only possible with the use of a tracking system. The aim of this study was, therefore, to assess the cost-effectiveness of tracking newborns with bilateral hearing impairment in Bavaria. Methods Data from a Bavarian pilot project on newborn hearing screening and Bavarian newborn hearing screening facilities were used to assess the cost-effectiveness of the inclusion of a tracking system within a newborn hearing screening programme. A model-based cost-effectiveness analysis was conducted. The time horizon of the model was limited to the newborn hearing screening programme. Costs of the initial hearing screening test and subsequent tests were included, as well as costs of diagnosis and costs of tracking. The outcome measure of the economic analysis was the cost per case of bilateral hearing impairment detected. In order to reflect uncertainty, deterministic and probabilistic sensitivity analyses were performed. Results The incremental cost-effectiveness ratio of tracking vs. no tracking was €1,697 per additional case of bilateral hearing impairment detected. Conclusions Compared with no tracking, tracking resulted in more cases of bilateral hearing impairment detected as well as higher costs. If society is willing to pay at least €1,697 per additional case of bilateral hearing impairment detected, tracking can be recommended.

  15. DPpackage: Bayesian Semi- and Nonparametric Modeling in R

    Directory of Open Access Journals (Sweden)

    Alejandro Jara

    2011-04-01

    Full Text Available Data analysis sometimes requires the relaxation of parametric assumptions in order to gain modeling flexibility and robustness against mis-specification of the probability model. In the Bayesian context, this is accomplished by placing a prior distribution on a function space, such as the space of all probability distributions or the space of all regression functions. Unfortunately, posterior distributions ranging over function spaces are highly complex and hence sampling methods play a key role. This paper provides an introduction to a simple, yet comprehensive, set of programs for the implementation of some Bayesian nonparametric and semiparametric models in R, DPpackage. Currently, DPpackage includes models for marginal and conditional density estimation, receiver operating characteristic curve analysis, interval-censored data, binary regression data, item response data, longitudinal and clustered data using generalized linear mixed models, and regression data using generalized additive models. The package also contains functions to compute pseudo-Bayes factors for model comparison and for eliciting the precision parameter of the Dirichlet process prior, and a general purpose Metropolis sampling algorithm. To maximize computational efficiency, the actual sampling for each model is carried out using compiled C, C++ or Fortran code.

  16. A Cost-Effectiveness Model for Frail Older Persons: Development and Application to a Physiotherapy-Based Intervention.

    Science.gov (United States)

    Karnon, Jonathan; Afzali, Hossein Haji Ali; Putro, Gregorius Virgianto Arpuji Anggoro; Thant, Phyu Win; Dompok, Ameline; Cox, Ingrid; Chikhwaza, Owen Henry; Wang, Xian; Mwangangi, Mercy Mukui; Farransahat, Matahari; Cameron, Ian

    2017-10-01

    The clinical importance of frailty is increasing. Existing economic evaluations of interventions to manage frailty have limited time horizons, but even in older populations there may be important longer-term differences in costs and outcomes. This paper reports on the development of a cost-effectiveness model to predict publicly funded health and aged care costs and quality-adjusted life years (QALYs) over the remaining lifetime of frail Australians, and a model-based cost-utility analysis of a physiotherapy-based intervention for frail individuals. A cohort-based state transition (Markov) model was developed to predict costs and QALYs over the remaining lifetime of a frail population. Frailty is defined using the phenotypic definition of frailty, and the model comprises health states that describe frailty status, residential status, the experience of bone fractures and depression, and death. Model input parameters were estimated and calibrated using the Dynamic Analyses to Optimise Ageing dataset, supplemented with data from the published literature. The cost-effectiveness model was subject to a range of validation approaches, which did not negate the validity of the model. The evaluated physiotherapy-based frailty intervention has an expected incremental cost per QALY gained of Australian $8,129 compared to usual care, but there is a probability of 0.3 that usual care is more effective and less costly than the intervention. Frailty reduces quality of life, is costly to manage and its prevalence is increasing, but new approaches to managing frailty need to demonstrate value for money. The value of the reported cost-effectiveness model is illustrated through the estimation of all important costs and effects of a physiotherapy-based frailty intervention, which facilitates comparisons with funding decisions for other new technologies in Australia.
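
    A cohort-based state-transition (Markov) model is, mechanically, a matrix calculation with discounted rewards. A three-state sketch with invented transition probabilities, costs and utilities, not the paper's calibrated inputs:

    ```python
    import numpy as np

    # Three-state cohort model (frail, robust, dead); all values invented.
    P = np.array([[0.70, 0.15, 0.15],
                  [0.20, 0.75, 0.05],
                  [0.00, 0.00, 1.00]])            # dead is absorbing
    cost = np.array([12_000.0, 4_000.0, 0.0])     # annual cost per state
    utility = np.array([0.55, 0.80, 0.0])         # annual QALY weight per state

    state = np.array([1.0, 0.0, 0.0])             # cohort starts frail
    disc, total_cost, total_qaly = 0.05, 0.0, 0.0
    for year in range(40):                        # lifetime horizon
        df = 1.0 / (1.0 + disc) ** year           # discount factor
        total_cost += df * (state @ cost)
        total_qaly += df * (state @ utility)
        state = state @ P                         # advance one annual cycle

    print(f"expected cost ${total_cost:,.0f}, expected QALYs {total_qaly:.2f}")
    ```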

  17. Methodologies used in cost-effectiveness models for evaluating treatments in major depressive disorder: a systematic review

    Directory of Open Access Journals (Sweden)

    Zimovetz Evelina A

    2012-02-01

    Full Text Available Abstract Background Decision makers in many jurisdictions use cost-effectiveness estimates as an aid for selecting interventions with an appropriate balance between health benefits and costs. This systematic literature review aims to provide an overview of published cost-effectiveness models in major depressive disorder (MDD) with a focus on the methods employed. Key components of the identified models are discussed and any challenges in developing models are highlighted. Methods A systematic literature search was performed to identify all primary model-based economic evaluations of MDD interventions indexed in MEDLINE, the Cochrane Library, EMBASE, EconLit, and PsycINFO between January 2000 and May 2010. Results A total of 37 studies were included in the review. These studies predominantly evaluated antidepressant medications. The analyses were performed across a broad set of countries. The majority of models were decision trees; eight were Markov models. Most models had a time horizon of less than 1 year. The majority of analyses took a payer perspective. Clinical input data were obtained from pooled placebo-controlled comparative trials, single head-to-head trials, or meta-analyses. The majority of studies (24 of 37) used treatment success or symptom-free days as main outcomes, 14 studies incorporated health state utilities, and 2 used disability-adjusted life-years. A few models (14 of 37) incorporated probabilities and costs associated with suicide and/or suicide attempts. Two models examined the cost-effectiveness of second-line treatment in patients who had failed to respond to initial therapy. Resource use data used in the models were obtained mostly from expert opinion. All studies, with the exception of one, explored parameter uncertainty. Conclusions The review identified several model input data gaps, including utility values in partial responders, efficacy of second-line treatments, and resource utilisation estimates obtained from

  18. Cost effectiveness of a radiation therapy simulator: a model for the determination of need

    International Nuclear Information System (INIS)

    Dritschilo, A.; Sherman, D.; Emami, B.; Piro, A.J.; Hellman, S.

    1979-01-01

    The requirement for a certificate-of-need for capital expenditures of $100,000 or more has placed a major constraint on purchases of new medical equipment. Consideration of a first principles argument has not proven compelling to the planning agencies in justifying the purchase of a radiation therapy simulator. Thus a strategy based on cost-effectiveness and the consequences of survival in successfully treated patients is proposed for equipment justification. We have reviewed the records of 18-month survivors among patients with lung cancer that were treated by irradiation; we observed 3 spinal cord injuries in non-simulated patients, whereas none were observed in patients who had the benefit of simulation. Considering the societal costs of spinal cord injury, a cost-benefit analysis of a simulator justifies the expense of this equipment

  19. Bayesian analysis of physiologically based toxicokinetic and toxicodynamic models.

    Science.gov (United States)

    Hack, C Eric

    2006-04-17

    Physiologically based toxicokinetic (PBTK) and toxicodynamic (TD) models of bromate in animals and humans would improve our ability to accurately estimate the toxic doses in humans based on available animal studies. These mathematical models are often highly parameterized and must be calibrated in order for the model predictions of internal dose to adequately fit the experimentally measured doses. Highly parameterized models are difficult to calibrate, and it is difficult to obtain accurate estimates of uncertainty or variability in model parameters with commonly used frequentist calibration methods, such as maximum likelihood estimation (MLE) or least-squares error approaches. The Bayesian approach called Markov chain Monte Carlo (MCMC) analysis can be used to successfully calibrate these complex models. Prior knowledge about the biological system and associated model parameters is easily incorporated in this approach in the form of prior parameter distributions, and the distributions are refined or updated using experimental data to generate posterior distributions of parameter estimates. The goal of this paper is to give the non-mathematician a brief description of the Bayesian approach and Markov chain Monte Carlo analysis, how this technique is used in risk assessment, and the issues associated with this approach.
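
    The Metropolis flavour of MCMC described here can be sketched for a toy dose model. The data, error model and prior below are invented for illustration; this is not the bromate PBTK model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy "internal dose" model: predicted dose = D * theta, observed with
    # lognormal error; calibrate theta by random-walk Metropolis.
    observed = np.array([2.1, 1.8, 2.4])      # invented measured internal doses
    D = 3.0                                   # administered dose (assumed known)

    def log_post(theta):
        if theta <= 0.0:
            return -np.inf
        resid = (np.log(observed) - np.log(D * theta)) / 0.2
        log_lik = -0.5 * np.sum(resid ** 2)
        log_prior = -0.5 * ((np.log(theta) - np.log(0.7)) / 0.5) ** 2
        return log_lik + log_prior

    theta, chain = 0.7, []
    for _ in range(20_000):
        proposal = theta + rng.normal(0.0, 0.05)   # random-walk proposal
        if np.log(rng.random()) < log_post(proposal) - log_post(theta):
            theta = proposal                       # Metropolis accept step
        chain.append(theta)

    post = np.array(chain[5_000:])                 # drop burn-in
    print(f"posterior mean {post.mean():.2f}, 95% CrI "
          f"[{np.percentile(post, 2.5):.2f}, {np.percentile(post, 97.5):.2f}]")
    ```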

  20. Cost-effectiveness of quadrivalent vaccine against human papilloma virus in Argentina based on a dynamic transmission model

    Directory of Open Access Journals (Sweden)

    Andrés Pichon-Riviere

    2015-11-01

    Full Text Available Objective. To assess the cost-effectiveness of the quadrivalent vaccine against human papillomavirus (HPV) in Argentina from the health system perspective. Materials and methods. A dynamic transmission model was used to estimate the impact of the vaccine on the incidence of cervical cancer, warts, and other HPV-related diseases; on quality-adjusted life years (QALYs); and on healthcare costs. Results. Vaccination could reduce the risk of cervical cancer by 60% and the risk of genital warts by 67%. Compared to a non-vaccine scenario, the immunization strategy showed an incremental benefit of 0.00234 QALY per person at an incremental cost of US$2.36, resulting in an incremental cost-effectiveness ratio of US$1007.55 per QALY gained. Sensitivity analysis proved the robustness of these results. Conclusions. Immunization with the quadrivalent vaccine was a cost-effective intervention in Argentina, falling far below the threshold of one gross domestic product per capita (US$15,009) per QALY gained.
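
    The reported ratio can be checked directly from the incremental figures in the abstract; a short worked example (Python, figures taken from the record above, with the small discrepancy due to rounding of the published inputs):

        # ICER = incremental cost / incremental effect
        delta_cost = 2.36          # US$ per person
        delta_qaly = 0.00234       # QALYs per person
        icer = delta_cost / delta_qaly
        print(f"ICER = US${icer:,.2f} per QALY gained")   # ~US$1,008 vs the reported US$1,007.55

        # Decision rule used in the abstract: compare against one GDP per capita.
        threshold = 15_009
        print("cost-effective" if icer <= threshold else "not cost-effective")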

  1. A predictive ligand-based Bayesian model for human drug-induced liver injury.

    Science.gov (United States)

    Ekins, Sean; Williams, Antony J; Xu, Jinghai J

    2010-12-01

    Drug-induced liver injury (DILI) is one of the most important reasons for drug development failure at both preapproval and postapproval stages. There has been increased interest in developing predictive in vivo, in vitro, and in silico models to identify compounds that cause idiosyncratic hepatotoxicity. In the current study, we applied machine learning, specifically a Bayesian modeling method with extended connectivity fingerprints and other interpretable descriptors. The model that was developed and internally validated (using a training set of 295 compounds) was then applied to a test set that was large relative to the training set (237 compounds) for external validation. The resulting concordance of 60%, sensitivity of 56%, and specificity of 67% were comparable to results for internal validation. The Bayesian model with extended connectivity functional class fingerprints of maximum diameter 6 (ECFC_6) and interpretable descriptors suggested several substructures that are chemically reactive and may also be important for DILI-causing compounds, e.g., ketones, diols, and α-methyl styrene type structures. Using Smiles Arbitrary Target Specification (SMARTS) filters published by several pharmaceutical companies, we evaluated whether such reactive substructures could be readily detected by any of the published filters. It was apparent that the most stringent filters used in this study, such as the Abbott alerts, which capture thiol traps and other compounds, may be of use in identifying DILI-causing compounds (sensitivity 67%). A significant outcome of the present study is that we provide predictions for many compounds that cause DILI by using the knowledge available from previous studies. These computational models may represent cost-effective selection criteria before in vitro or in vivo experimental studies.

  2. The cost-effectiveness of neonatal screening for Cystic Fibrosis: an analysis of alternative scenarios using a decision model

    Directory of Open Access Journals (Sweden)

    Tu Karen

    2005-08-01

    Full Text Available Abstract Background The use of neonatal screening for cystic fibrosis is widely debated in the United Kingdom and elsewhere, but the evidence available to inform policy is limited. This paper explores the cost-effectiveness of adding screening for cystic fibrosis to an existing routine neonatal screening programme for congenital hypothyroidism and phenylketonuria, under alternative scenarios and assumptions. Methods The study is based on a decision model comparing screening to no screening in terms of a number of outcome measures, including diagnosis of cystic fibrosis, life-time treatment costs, life years and QALYs gained. The setting is a hypothetical UK health region without an existing neonatal screening programme for cystic fibrosis. Results Under initial assumptions, neonatal screening (using an immunoreactive trypsin/DNA two-stage screening protocol) costs £5,387 per infant diagnosed, or £1.83 per infant screened (1998 costs). Neonatal screening for cystic fibrosis produces an incremental cost-effectiveness of £6,864 per QALY gained in our base case scenario (an assumed benefit of a 6-month delay in the emergence of symptoms). A difference of 11 months or more in the emergence of symptoms (and mean survival) means neonatal screening is both less costly and produces better outcomes than no screening. Conclusion Neonatal screening is expensive as a method of diagnosis. Neonatal screening may be a cost-effective intervention if the hypothesised delays in the onset of symptoms are confirmed. Implementing both antenatal and neonatal screening would undermine potential economic benefits, since a reduction in the birth incidence of cystic fibrosis would reduce the cost-effectiveness of neonatal screening.

  3. Forecasting natural gas consumption in China by Bayesian Model Averaging

    Directory of Open Access Journals (Sweden)

    Wei Zhang

    2015-11-01

    Full Text Available With the rapid growth of natural gas consumption in China, there is an urgent need for more accurate and reliable models to produce reasonable forecasts. Considering the limitations of any single model and the uncertainty in model choice, this paper presents a combinative method to forecast natural gas consumption by Bayesian Model Averaging (BMA). It can effectively handle the uncertainty associated with model structure and parameters, and thus improves the forecasting accuracy. This paper chooses six variables for forecasting natural gas consumption, including GDP, urban population, energy consumption structure, industrial structure, energy efficiency, and exports of goods and services. The results show that, compared to the grey prediction model, the linear regression model, and artificial neural networks, the BMA method provides a flexible tool for forecasting natural gas consumption, which is expected to grow rapidly in the future. This study can provide insightful information on natural gas consumption in the future.
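
    A minimal sketch of the BMA mechanics (Python with NumPy assumed): candidate regressions are weighted by BIC-approximated posterior model probabilities and their predictions averaged. The data and candidate models are toy stand-ins, not the paper's six-variable specification.

        import numpy as np

        rng = np.random.default_rng(1)

        # Toy data: one driver truly matters, the second is noise.
        n = 40
        X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
        y = X @ np.array([1.0, 2.0, 0.0]) + rng.normal(0, 0.5, size=n)

        def fit_bic(cols):
            """OLS on a column subset; return fitted values and BIC."""
            Xs = X[:, cols]
            beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
            resid = y - Xs @ beta
            bic = n * np.log(resid @ resid / n) + len(cols) * np.log(n)
            return Xs @ beta, bic

        models = [[0, 1], [0, 2], [0, 1, 2]]          # candidate regressor sets
        preds, bics = zip(*(fit_bic(m) for m in models))

        # exp(-BIC/2) approximates each model's marginal likelihood, so
        # normalising gives posterior model probabilities for the average.
        w = np.exp(-0.5 * (np.array(bics) - min(bics)))
        w /= w.sum()
        bma_pred = sum(wi * p for wi, p in zip(w, preds))
        print("posterior model probabilities:", np.round(w, 3))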

  4. Bayesian analysis for uncertainty estimation of a canopy transpiration model

    Science.gov (United States)

    Samanta, S.; Mackay, D. S.; Clayton, M. K.; Kruger, E. L.; Ewers, B. E.

    2007-04-01

    A Bayesian approach was used to fit a conceptual transpiration model to half-hourly transpiration rates for a sugar maple (Acer saccharum) stand collected over a 5-month period and probabilistically estimate its parameter and prediction uncertainties. The model used the Penman-Monteith equation with the Jarvis model for canopy conductance. This deterministic model was extended by adding a normally distributed error term. This extension enabled using Markov chain Monte Carlo simulations to sample the posterior parameter distributions. The residuals revealed approximate conformance to the assumption of normally distributed errors. However, minor systematic structures in the residuals at fine timescales suggested model changes that would potentially improve the modeling of transpiration. Results also indicated considerable uncertainties in the parameter and transpiration estimates. This simple methodology of uncertainty analysis would facilitate the deductive step during the development cycle of deterministic conceptual models by accounting for these uncertainties while drawing inferences from data.

  5. Modeling operational risks of the nuclear industry with Bayesian networks

    International Nuclear Information System (INIS)

    Wieland, Patricia; Lustosa, Leonardo J.

    2009-01-01

    Basically, planning a new industrial plant requires information on industrial management, regulations, site selection, definition of initial and planned capacity, and estimation of the potential demand. However, this is far from enough to assure the success of an industrial enterprise. Unexpected and extremely damaging events may occur that deviate from the original plan. The so-called operational risks lie not only in system, equipment, process, or human (technical or managerial) failures. They also lie in intentional events such as fraud and sabotage, in extreme events like terrorist attacks or radiological accidents, and even in public reaction to perceived environmental or future-generation impacts. For the nuclear industry, it is a challenge to identify and to assess the operational risks and their various sources. Early identification of operational risks can help in preparing contingency plans, or in delaying the decision to invest in or approve a project that could, at an extreme, affect the public perception of nuclear energy. A major problem in modeling operational risk losses is the lack of internal data, which are essential, for example, to apply the loss distribution approach. As an alternative, methods that consider qualitative and subjective information can be applied, for example, fuzzy logic, neural networks, system dynamics, or Bayesian networks. An advantage of applying Bayesian networks to model operational risk is the possibility to include expert opinions and variables of interest, to structure the model via causal dependencies among these variables, and to specify subjective prior and conditional probability distributions at each step or network node. This paper suggests a classification of operational risks in industry and discusses the benefits and obstacles of the Bayesian networks approach to model those risks. (author)
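
    To make the inference mechanism concrete, here is a hand-rolled sketch of a three-node discrete Bayesian network with exact inference by enumeration (plain Python); the structure and the "expert-elicited" numbers are invented for illustration, not taken from the paper.

        # Illustrative structure: Sabotage -> Incident <- EquipmentFailure.
        P_sab = {True: 0.01, False: 0.99}
        P_eq  = {True: 0.05, False: 0.95}
        # P(Incident | Sabotage, EquipmentFailure): subjective conditional table.
        P_inc = {(True, True): 0.95, (True, False): 0.60,
                 (False, True): 0.30, (False, False): 0.01}

        def joint(sab, eq, inc):
            p_inc = P_inc[(sab, eq)]
            return P_sab[sab] * P_eq[eq] * (p_inc if inc else 1 - p_inc)

        # Posterior P(Sabotage | Incident) by Bayes' rule over the joint.
        num = sum(joint(True, eq, True) for eq in (True, False))
        den = sum(joint(s, eq, True) for s in (True, False) for eq in (True, False))
        print(f"P(sabotage | incident) = {num / den:.3f}")

    Expert opinion enters exactly where the abstract says it can: in the prior and conditional probability tables attached to each node.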

  6. Optimizing Prediction Using Bayesian Model Averaging: Examples Using Large-Scale Educational Assessments.

    Science.gov (United States)

    Kaplan, David; Lee, Chansoon

    2018-01-01

    This article provides a review of Bayesian model averaging as a means of optimizing the predictive performance of common statistical models applied to large-scale educational assessments. The Bayesian framework recognizes that in addition to parameter uncertainty, there is uncertainty in the choice of models themselves. A Bayesian approach to addressing the problem of model uncertainty is the method of Bayesian model averaging. Bayesian model averaging searches the space of possible models for a set of submodels that satisfy certain scientific principles and then averages the coefficients across these submodels weighted by each model's posterior model probability (PMP). Using the weighted coefficients for prediction has been shown to yield optimal predictive performance according to certain scoring rules. We demonstrate the utility of Bayesian model averaging for prediction in education research with three examples: Bayesian regression analysis, Bayesian logistic regression, and a recently developed approach for Bayesian structural equation modeling. In each case, the model-averaged estimates are shown to yield better prediction of the outcome of interest than any submodel based on predictive coverage and the log-score rule. Implications for the design of large-scale assessments when the goal is optimal prediction in a policy context are discussed.

  7. Comparative review of three cost-effectiveness models for rotavirus vaccines in national immunization programs; a generic approach applied to various regions in the world

    NARCIS (Netherlands)

    Postma, Maarten J.; Jit, Mark; Rozenbaum, Mark H.; Standaert, Baudouin; Tu, Hong-Anh; Hutubessy, Raymond C. W.

    2011-01-01

    Background: This study aims to critically review available cost-effectiveness models for rotavirus vaccination, compare their designs using a standardized approach and compare similarities and differences in cost-effectiveness outcomes using a uniform set of input parameters. Methods: We identified

  8. Cost Effectiveness of the Instrumentalism in Occupational Therapy (IOT) Conceptual Model as a Guide for Intervention with Adolescents with Emotional and Behavioral Disorders (EBD)

    Science.gov (United States)

    Ikiugu, Moses N.; Anderson, Lynne

    2007-01-01

    The purpose of this paper was to demonstrate the cost-effectiveness of using the Instrumentalism in Occupational Therapy (IOT) conceptual practice model as a guide for intervention to assist teenagers with emotional and behavioral disorders (EBD) transition successfully into adulthood. The cost effectiveness analysis was based on a project…

  9. Costs and cost effectiveness of different strategies for chlamydia screening and partner notification: an economic and mathematical modelling study.

    Science.gov (United States)

    Turner, Katy; Adams, Elisabeth; Grant, Arabella; Macleod, John; Bell, Gill; Clarke, Jan; Horner, Paddy

    2011-01-04

    To compare the cost, cost effectiveness, and sex equity of different intervention strategies within the English National Chlamydia Screening Programme, and to develop a tool for calculating the cost effectiveness of chlamydia control programmes at a local, national, or international level. An economic and mathematical modelling study with cost effectiveness analysis. Costs were restricted to those of screening and partner notification from the perspective of the NHS and excluded patient costs, the costs of reinfection, and costs of complications arising from initial infection. Setting: England. Population: individuals eligible for the National Chlamydia Screening Programme. Main outcome measures: cost effectiveness of the National Chlamydia Screening Programme in 2008-9 (cost per individual tested, cost per positive diagnosis, total cost of screening, number screened, number infected, sex ratio of those tested and treated); comparison of the baseline programme with two different interventions: (i) increased coverage of primary screening in men and (ii) increased efficacy of partner notification. In 2008-9 screening was estimated to cost about £46.3m in total and £506 per infection treated. Provision for partner notification within the screening programme cost between £9 and £27 per index case, excluding treatment and testing. The model results suggest that increasing male screening coverage from 8% (baseline value) to 24% (to match female coverage) would cost an extra £22.9m and increase the cost per infection treated to £528. In contrast, increasing partner notification efficacy from 0.4 (baseline value) to 0.8 partners per index case would cost an extra £3.3m and would reduce the cost per infection diagnosed to £449. Increasing screening coverage to 24% in men would cost over six times as much as increasing partner notification to 0.8 but only treat twice as many additional infections. In the English National Chlamydia Screening Programme increasing the effectiveness of partner notification is likely
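
    The comparison at the heart of the conclusion is simple arithmetic on the reported figures; a back-of-envelope check (Python, numbers from the record above):

        # Extra spend per intervention, from the abstract.
        extra_cost_screening = 22.9e6   # male coverage 8% -> 24%
        extra_cost_pn        = 3.3e6    # partner notification 0.4 -> 0.8

        ratio = extra_cost_screening / extra_cost_pn
        print(f"screening costs {ratio:.1f}x as much as partner notification")
        # ~6.9x the extra spend while, per the abstract, treating only about
        # twice as many additional infections: hence the stated conclusion.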

  10. Airline Sustainability Modeling: A New Framework with Application of Bayesian Structural Equation Modeling

    Directory of Open Access Journals (Sweden)

    Hashem Salarzadeh Jenatabadi

    2016-11-01

    Full Text Available There are many factors that could influence the sustainability of airlines. The main purpose of this study is to introduce a framework for a financial sustainability index and to model it based on structural equation modeling (SEM) with maximum likelihood and Bayesian predictors. The introduced framework includes economic performance, operational performance, cost performance, and financial performance. Based on both Bayesian SEM (Bayesian-SEM) and classical SEM (Classical-SEM), it was found that economic performance, together with both operational performance and cost performance, is significantly related to the financial performance index. Four mathematical indices are employed, root mean square error, coefficient of determination, mean absolute error, and mean absolute percentage error, to compare the efficiency of Bayesian-SEM and Classical-SEM in predicting airline financial performance. The outputs confirmed that the framework with Bayesian prediction delivered a good fit with the data, whereas the framework predicted with a Classical-SEM approach did not produce a well-fitting model. The reasons for this discrepancy between Classical and Bayesian predictions, as well as the potential advantages and caveats of applying the Bayesian approach in airline sustainability studies, are discussed.

  11. From qualitative reasoning models to Bayesian-based learner modeling

    NARCIS (Netherlands)

    Milošević, U.; Bredeweg, B.; de Kleer, J.; Forbus, K.D.

    2010-01-01

    Assessing the knowledge of a student is a fundamental part of intelligent learning environments. We present a Bayesian network based approach to dealing with uncertainty when estimating a learner’s state of knowledge in the context of Qualitative Reasoning (QR). A proposal for a global architecture

  12. Development of a cyber security risk model using Bayesian networks

    International Nuclear Information System (INIS)

    Shin, Jinsoo; Son, Hanseong; Khalil ur, Rahman; Heo, Gyunyoung

    2015-01-01

    Cyber security is an emerging safety issue in the nuclear industry, especially in the instrumentation and control (I and C) field. To address the cyber security issue systematically, a model that can be used for cyber security evaluation is required. In this work, a cyber security risk model based on a Bayesian network is suggested for evaluating cyber security for nuclear facilities in an integrated manner. The suggested model enables the evaluation of both the procedural and technical aspects of cyber security, which are related to compliance with regulatory guides and system architectures, respectively. The activity-quality analysis model was developed to evaluate how well people and/or organizations comply with the regulatory guidance associated with cyber security. The architecture analysis model was created to evaluate vulnerabilities and mitigation measures with respect to their effect on cyber security. The two models are integrated into a single model, which is called the cyber security risk model, so that cyber security can be evaluated from procedural and technical viewpoints at the same time. The model was applied to evaluate the cyber security risk of the reactor protection system (RPS) of a research reactor and to demonstrate its usefulness and feasibility. - Highlights: • We developed a cyber security risk model that can find the weak points of cyber security by integrating two cyber analysis models using a Bayesian network. • One, the activity-quality model, signifies how well people and/or organizations comply with the cyber security regulatory guide. • The other, the architecture model, represents the probability of a cyber-attack on the RPS architecture. • The cyber security risk model can provide evidence to determine the key elements of cyber security for the RPS of a research reactor

  13. Two-stage Bayesian models-application to ZEDB project

    International Nuclear Information System (INIS)

    Bunea, C.; Charitos, T.; Cooke, R.M.; Becker, G.

    2005-01-01

    A well-known mathematical tool to analyze plant-specific reliability data for nuclear power facilities is the two-stage Bayesian model. Such two-stage Bayesian models are standard practice nowadays, for example in the German ZEDB project or in the Swedish T-Book, although they may differ in their mathematical models and software implementation. In this paper, we review the mathematical model, its underlying assumptions, and supporting arguments. Reasonable conditional assumptions are made to yield a tractable and mathematically valid form for the failure rate at the plant of interest, given failures and operational times at other plants in the population. The posterior distribution of the failure rate at the plant of interest is sensitive to the choice of hyperprior parameters, since the effect of the hyperprior distribution will never be dominated by the effect of observation. The methods of Poern and Jeffrey for choosing distributions over hyperparameters are discussed. Furthermore, we perform verification tasks associated with the theoretical model presented in this paper. The present software implementation produces good agreement with ZEDB results for various prior distributions. The differences between our results and those of ZEDB reflect differences that may arise from numerical implementation, such as the use of different step sizes and truncation bounds
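
    A minimal sketch of the two-stage structure (Python with NumPy/SciPy assumed): plant-specific Poisson failure counts with gamma-distributed rates, hyperparameters sampled by random-walk Metropolis in stage one, and the plant of interest updated conjugately in stage two. The data, priors, and tuning are invented stand-ins, not the ZEDB implementation.

        import numpy as np
        from scipy.special import gammaln

        rng = np.random.default_rng(2)

        # Failure counts x_i over operating times t_i (years); plant 0 is
        # the plant of interest. Numbers are illustrative only.
        x = np.array([1, 0, 3, 2, 5])
        t = np.array([10.0, 8.0, 25.0, 15.0, 40.0])

        def log_target(a, b):
            """log p(x | a, b) with Gamma(a, rate b) rates integrated out
            (negative binomial form; terms constant in a, b are dropped),
            plus a vague log-prior p(a, b) proportional to 1/(a*b)."""
            if a <= 0 or b <= 0:
                return -np.inf
            ll = np.sum(gammaln(a + x) - gammaln(a) + a * np.log(b)
                        - (a + x) * np.log(b + t))
            return ll - np.log(a) - np.log(b)

        a, b, draws = 1.0, 5.0, []
        for _ in range(20000):
            a_n, b_n = a + rng.normal(0, 0.2), b + rng.normal(0, 1.0)
            if np.log(rng.uniform()) < log_target(a_n, b_n) - log_target(a, b):
                a, b = a_n, b_n
            # Stage two: plant 0's rate is conjugate given the hyperparameters.
            draws.append(rng.gamma(a + x[0], 1 / (b + t[0])))

        print(f"posterior mean failure rate, plant 0: {np.mean(draws[5000:]):.4f} per year")

    The sensitivity the abstract warns about can be probed by rerunning the sketch with different hyperpriors in place of the 1/(ab) choice.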

  14. Quantum-Like Bayesian Networks for Modeling Decision Making

    Directory of Open Access Journals (Sweden)

    Catarina eMoreira

    2016-01-01

    Full Text Available In this work, we explore an alternative quantum structure for performing quantum probabilistic inferences that accommodates the paradoxical findings of the Sure Thing Principle. We propose a Quantum-Like Bayesian Network, which consists of replacing classical probabilities by quantum probability amplitudes. However, since this approach suffers from the problem of exponential growth of quantum parameters, we also propose a similarity heuristic that automatically fits quantum parameters through vector similarities. This makes the proposed model general and predictive, in contrast to the current state-of-the-art models, which cannot be generalized to more complex decision scenarios and which only provide an explanatory account of the observed paradoxes. In the end, the model that we propose is a nonparametric method for estimating inference effects from a statistical point of view. It is a statistical model that is simpler than the previous quantum dynamic and quantum-like models proposed in the literature. We tested the proposed network with several empirical data sets from the literature, mainly from the Prisoner's Dilemma game and the Two Stage Gambling game. The results obtained show that the proposed quantum Bayesian network is a general method that can accommodate violations of the laws of classical probability theory and make accurate predictions regarding human decision-making in these scenarios.
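
    The core move, replacing probabilities with complex amplitudes so that an interference term appears in the law of total probability, fits in a few lines (Python with NumPy assumed; the numbers and phase are invented for illustration):

        import numpy as np

        # Classical law of total probability: P(B) = P(A)P(B|A) + P(~A)P(B|~A).
        p_a, p_b_a, p_b_na = 0.5, 0.6, 0.6
        classical = p_a * p_b_a + (1 - p_a) * p_b_na        # = 0.6, always

        # Quantum-like version: sum amplitudes with a free phase theta,
        # then square the magnitude.
        theta = 0.8 * np.pi
        amp = np.sqrt(p_a * p_b_a) + np.sqrt((1 - p_a) * p_b_na) * np.exp(1j * theta)
        quantum = abs(amp) ** 2
        print(f"classical: {classical:.3f}, quantum-like: {quantum:.3f}")

    The cross term 2*sqrt(p_a*p_b_a*(1-p_a)*p_b_na)*cos(theta) is what lets such models fit behaviour (e.g., in Prisoner's Dilemma experiments) that violates the classical law; the paper's similarity heuristic is, in effect, a rule for setting phases like theta.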

  15. Prior Sensitivity Analysis in Default Bayesian Structural Equation Modeling.

    Science.gov (United States)

    van Erp, Sara; Mulder, Joris; Oberski, Daniel L

    2017-11-27

    Bayesian structural equation modeling (BSEM) has recently gained popularity because it enables researchers to fit complex models and solve some of the issues often encountered in classical maximum likelihood estimation, such as nonconvergence and inadmissible solutions. An important component of any Bayesian analysis is the prior distribution of the unknown model parameters. Often, researchers rely on default priors, which are constructed in an automatic fashion without requiring substantive prior information. However, the prior can have a serious influence on the estimation of the model parameters, which affects the mean squared error, bias, coverage rates, and quantiles of the estimates. In this article, we investigate the performance of three different default priors: noninformative improper priors, vague proper priors, and empirical Bayes priors, with the latter being novel in the BSEM literature. Based on a simulation study, we find that these three default BSEM methods may perform very differently, especially with small samples. A careful prior sensitivity analysis is therefore needed when performing a default BSEM analysis. For this purpose, we provide a practical step-by-step guide for practitioners on conducting a prior sensitivity analysis in default BSEM. Our recommendations are illustrated using a well-known case study from the structural equation modeling literature, and all code for conducting the prior sensitivity analysis is available in the online supplemental materials. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
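
    The kind of sensitivity at issue is easy to demonstrate in a conjugate toy problem (Python with NumPy assumed; a normal mean with known variance rather than a full SEM, so purely illustrative):

        import numpy as np

        rng = np.random.default_rng(3)
        y = rng.normal(0.3, 1.0, size=20)          # small sample, as in the article's concern

        def posterior(prior_mean, prior_var):
            """Normal likelihood (sigma = 1) with a normal prior on the mean."""
            n, ybar = len(y), y.mean()
            post_var = 1 / (1 / prior_var + n)
            post_mean = post_var * (prior_mean / prior_var + n * ybar)
            return post_mean, post_var

        for label, pv in [("vague (var=100)", 100.0),
                          ("moderate (var=1)", 1.0),
                          ("tight (var=0.01)", 0.01)]:
            m, v = posterior(0.0, pv)
            print(f"{label:17s} -> posterior mean {m:.3f}, sd {np.sqrt(v):.3f}")

    With only 20 observations the tight prior visibly pulls the estimate toward its own mean, which is exactly the small-sample behaviour the article recommends checking with a prior sensitivity analysis.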

  16. Bayesian Variable Selection on Model Spaces Constrained by Heredity Conditions.

    Science.gov (United States)

    Taylor-Rodriguez, Daniel; Womack, Andrew; Bliznyuk, Nikolay

    2016-01-01

    This paper investigates Bayesian variable selection when there is a hierarchical dependence structure on the inclusion of predictors in the model. In particular, we study the type of dependence found in polynomial response surfaces of orders two and higher, whose model spaces are required to satisfy weak or strong heredity conditions. These conditions restrict the inclusion of higher-order terms depending upon the inclusion of lower-order parent terms. We develop classes of priors on the model space, investigate their theoretical and finite sample properties, and provide a Metropolis-Hastings algorithm for searching the space of models. The tools proposed allow fast and thorough exploration of model spaces that account for hierarchical polynomial structure in the predictors and provide control of the inclusion of false positives in high posterior probability models.
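
    The heredity conditions themselves are mechanical to check; a small sketch (plain Python, with terms encoded as tuples of variable names, an encoding chosen here for illustration):

        from itertools import combinations

        def parents(term):
            """Lower-order parents of a term, e.g. ('x1','x2') -> {('x1',), ('x2',)}."""
            return {tuple(sorted(c)) for c in combinations(term, len(term) - 1) if c}

        def satisfies_heredity(model, strong=True):
            """Strong heredity: all parents of every term are in the model.
            Weak heredity: at least one parent of every term is in the model."""
            model = {tuple(sorted(t)) for t in model}
            for term in model:
                if len(term) < 2:
                    continue                        # main effects have no parents
                present = [p in model for p in parents(term)]
                if strong and not all(present):
                    return False
                if not strong and not any(present):
                    return False
            return True

        m = [("x1",), ("x1", "x2")]                 # x2 main effect missing
        print(satisfies_heredity(m, strong=True))   # False
        print(satisfies_heredity(m, strong=False))  # True: parent x1 is present

    In the paper's setting such a predicate delimits the model space over which the priors and the Metropolis-Hastings search are defined.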

  17. Experimental validation of a Bayesian model of visual acuity.

    LENUS (Irish Health Repository)

    Dalimier, Eugénie

    2009-01-01

    Based on standard procedures used in optometry clinics, we compare measurements of visual acuity for 10 subjects (11 eyes tested) in the presence of natural ocular aberrations and different degrees of induced defocus, with the predictions given by a Bayesian model customized with aberrometric data of the eye. The absolute predictions of the model, without any adjustment, show good agreement with the experimental data, in terms of correlation and absolute error. The efficiency of the model is discussed in comparison with image quality metrics and other customized visual process models. An analysis of the importance and customization of each stage of the model is also given; it stresses the potential high predictive power from precise modeling of ocular and neural transfer functions.

  18. A flexible Bayesian model for studying gene-environment interaction.

    Directory of Open Access Journals (Sweden)

    Kai Yu

    2012-01-01

    Full Text Available An important follow-up step after genetic markers are found to be associated with a disease outcome is a more detailed analysis investigating how the implicated gene or chromosomal region and an established environment risk factor interact to influence the disease risk. The standard approach to this study of gene-environment interaction considers one genetic marker at a time and therefore could misrepresent and underestimate the genetic contribution to the joint effect when one or more functional loci, some of which might not be genotyped, exist in the region and interact with the environment risk factor in a complex way. We develop a more global approach based on a Bayesian model that uses a latent genetic profile variable to capture all of the genetic variation in the entire targeted region and allows the environment effect to vary across different genetic profile categories. We also propose a resampling-based test derived from the developed Bayesian model for the detection of gene-environment interaction. Using data collected in the Environment and Genetics in Lung Cancer Etiology (EAGLE) study, we apply the Bayesian model to evaluate the joint effect of smoking intensity and genetic variants in the 15q25.1 region, which contains a cluster of nicotinic acetylcholine receptor genes and has been shown to be associated with both lung cancer and smoking behavior. We find evidence for gene-environment interaction (P-value = 0.016), with the smoking effect appearing to be stronger in subjects with a genetic profile associated with a higher lung cancer risk; the conventional test of gene-environment interaction based on the single-marker approach is far from significant.

  19. The Cost-Effectiveness of Low-Cost Essential Antihypertensive Medicines for Hypertension Control in China: A Modelling Study.

    Science.gov (United States)

    Gu, Dongfeng; He, Jiang; Coxson, Pamela G; Rasmussen, Petra W; Huang, Chen; Thanataveerat, Anusorn; Tzong, Keane Y; Xiong, Juyang; Wang, Miao; Zhao, Dong; Goldman, Lee; Moran, Andrew E

    2015-08-01

    Hypertension is China's leading cardiovascular disease risk factor. Improved hypertension control in China would result in enormous health gains in the world's largest population. A computer simulation model projected the cost-effectiveness of hypertension treatment in Chinese adults, assuming a range of essential medicines list drug costs. The Cardiovascular Disease Policy Model-China, a Markov-style computer simulation model, simulated hypertension screening, essential medicines program implementation, hypertension control program administration, drug treatment and monitoring costs, disease-related costs, and quality-adjusted life years (QALYs) gained by preventing cardiovascular disease or lost because of drug side effects in untreated hypertensive adults aged 35-84 y over 2015-2025. Cost-effectiveness was assessed in cardiovascular disease patients (secondary prevention) and for two blood pressure ranges in primary prevention (stage one, 140-159/90-99 mm Hg; stage two, ≥160/≥100 mm Hg). Treatment of isolated systolic hypertension and combined systolic and diastolic hypertension were modeled as a reduction in systolic blood pressure; treatment of isolated diastolic hypertension was modeled as a reduction in diastolic blood pressure. One-way and probabilistic sensitivity analyses explored ranges of antihypertensive drug effectiveness and costs, monitoring frequency, medication adherence, side effect severity, background hypertension prevalence, antihypertensive medication treatment, case fatality, incidence and prevalence, and cardiovascular disease treatment costs. Median antihypertensive costs from Shanghai and Yunnan province were entered into the model in order to estimate the effects of very low and high drug prices. Incremental cost-effectiveness ratios less than the per capita gross domestic product of China (11,900 international dollars [Int$] in 2015) were considered cost-effective. Treating hypertensive adults with prior cardiovascular
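
    A stripped-down cohort version of a Markov-style cost-effectiveness model, of the general kind this record describes, can be sketched as follows (Python with NumPy assumed); the states, transition probabilities, costs, and utilities are invented stand-ins, not the Policy Model-China inputs.

        import numpy as np

        # States: well, post-CVD-event, dead. Rows of P sum to 1.
        P_treat = np.array([[0.96, 0.02, 0.02],
                            [0.00, 0.90, 0.10],
                            [0.00, 0.00, 1.00]])
        P_none  = np.array([[0.93, 0.04, 0.03],
                            [0.00, 0.88, 0.12],
                            [0.00, 0.00, 1.00]])
        state_cost = np.array([120.0, 1500.0, 0.0])   # annual cost per state, Int$
        drug_cost  = 100.0                            # annual treatment + monitoring
        utility    = np.array([0.85, 0.60, 0.00])     # QALY weight per state

        def run(P, extra_cost, years=10, disc=0.03):
            s = np.array([1.0, 0.0, 0.0])             # cohort starts in "well"
            cost = qaly = 0.0
            for t in range(years):
                d = 1 / (1 + disc) ** t               # discount factor
                cost += d * (s @ state_cost + s[:2].sum() * extra_cost)
                qaly += d * (s @ utility)
                s = s @ P                             # advance one cycle
            return cost, qaly

        c1, q1 = run(P_treat, drug_cost)
        c0, q0 = run(P_none, 0.0)
        print(f"incremental cost {c1 - c0:,.0f} Int$, incremental QALYs {q1 - q0:.3f}, "
              f"ICER {(c1 - c0) / (q1 - q0):,.0f} Int$/QALY vs threshold 11,900")

    Probabilistic sensitivity analysis, as in the study, would wrap this run in a loop that redraws the transition probabilities and costs from their uncertainty distributions.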

  20. The Cost-Effectiveness of Low-Cost Essential Antihypertensive Medicines for Hypertension Control in China: A Modelling Study.

    Directory of Open Access Journals (Sweden)

    Dongfeng Gu

    2015-08-01

    Full Text Available Hypertension is China's leading cardiovascular disease risk factor. Improved hypertension control in China would result in enormous health gains in the world's largest population. A computer simulation model projected the cost-effectiveness of hypertension treatment in Chinese adults, assuming a range of essential medicines list drug costs. The Cardiovascular Disease Policy Model-China, a Markov-style computer simulation model, simulated hypertension screening, essential medicines program implementation, hypertension control program administration, drug treatment and monitoring costs, disease-related costs, and quality-adjusted life years (QALYs) gained by preventing cardiovascular disease or lost because of drug side effects in untreated hypertensive adults aged 35-84 y over 2015-2025. Cost-effectiveness was assessed in cardiovascular disease patients (secondary prevention) and for two blood pressure ranges in primary prevention (stage one, 140-159/90-99 mm Hg; stage two, ≥160/≥100 mm Hg). Treatment of isolated systolic hypertension and combined systolic and diastolic hypertension were modeled as a reduction in systolic blood pressure; treatment of isolated diastolic hypertension was modeled as a reduction in diastolic blood pressure. One-way and probabilistic sensitivity analyses explored ranges of antihypertensive drug effectiveness and costs, monitoring frequency, medication adherence, side effect severity, background hypertension prevalence, antihypertensive medication treatment, case fatality, incidence and prevalence, and cardiovascular disease treatment costs. Median antihypertensive costs from Shanghai and Yunnan province were entered into the model in order to estimate the effects of very low and high drug prices. Incremental cost-effectiveness ratios less than the per capita gross domestic product of China (11,900 international dollars [Int$] in 2015) were considered cost-effective. Treating hypertensive adults with prior

  1. Cost-effectiveness of Security Measures: A model-based Framework

    DEFF Research Database (Denmark)

    Pieters, Wolter; Probst, Christian W.; Lukszo, Zofia

    2014-01-01

    Recently, cyber security has become an important topic on the agenda of many organisations. It is already widely acknowledged that attacks do happen, and decision makers face the problem of how to respond. As it is almost impossible to secure a complex system completely, it is important to have an adequate estimate of the effectiveness of security measures when making investment decisions. Risk concepts are known in principle, but estimating the effectiveness of countermeasures proves to be difficult and cannot be achieved by qualitative approaches only. In this chapter, the authors consider the question of how to guarantee cost-effectiveness of security measures. They investigate the possibility of using existing frameworks and tools, the challenges in a security context as opposed to a safety context, and directions for future research.

  2. Bayesian Sensitivity Analysis of Statistical Models with Missing Data.

    Science.gov (United States)

    Zhu, Hongtu; Ibrahim, Joseph G; Tang, Niansheng

    2014-04-01

    Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the not missing at random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures.

  3. On-line Bayesian model updating for structural health monitoring

    Science.gov (United States)

    Rocchetta, Roberto; Broggi, Matteo; Huchet, Quentin; Patelli, Edoardo

    2018-03-01

    Fatigue-induced cracking is a dangerous failure mechanism that affects mechanical components subject to alternating load cycles. System health monitoring should be adopted to identify cracks which can jeopardise the structure. Real-time damage detection may fail to identify cracks due to different sources of uncertainty which have been poorly assessed or even fully neglected. In this paper, a novel efficient and robust procedure is used for the detection of crack locations and lengths in mechanical components. A Bayesian model updating framework is employed, which allows accounting for relevant sources of uncertainty. The idea underpinning the approach is to identify the most probable crack consistent with the experimental measurements. To tackle the computational cost of the Bayesian approach, an emulator is adopted to replace the computationally costly finite element model. To improve the overall robustness of the procedure, different numerical likelihoods, measurement noises, and imprecision in the values of model parameters are analysed and their effects quantified. The accuracy of the stochastic updating and the efficiency of the numerical procedure are discussed. An experimental aluminium frame and a numerical model of a typical car suspension arm are used to demonstrate the applicability of the approach.

  4. Bayesian statistical methods and their application in probabilistic simulation models

    Directory of Open Access Journals (Sweden)

    Sergio Iannazzo

    2007-03-01

    Full Text Available Bayesian statistical methods are facing a rapidly growing level of interest and acceptance in the field of health economics. The reasons for this success are probably to be found in the theoretical foundations of the discipline, which make these techniques more appealing for decision analysis. To this should be added the modern progress in IT, which has produced several flexible and powerful statistical software frameworks. Among them, probably one of the most notable is the BUGS language project and its standalone application for MS Windows, WinBUGS. The scope of this paper is to introduce the subject and to show some interesting applications of WinBUGS in developing complex economic models based on Markov chains. The advantages of this approach reside in the elegance of the code produced and in its capability to easily develop probabilistic simulations. Moreover, an example of the integration of Bayesian inference models in a Markov model is shown. This last feature lets the analyst conduct statistical analyses on the available sources of evidence and exploit them directly as inputs in the economic model.
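
    The integration the abstract highlights, statistical inference feeding directly into an economic model, can be sketched without BUGS in a few lines of Python (NumPy assumed); the trial counts, state costs, and relapse probability are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(4)

        # Inference step: response probability from trial evidence,
        # Beta(1, 1) prior updated with 45 responders out of 60 patients.
        a, b = 1 + 45, 1 + (60 - 45)

        def markov_cost(p_resp, p_relapse=0.05, cycles=12):
            """Toy two-state monthly Markov model: responders may relapse
            into the (absorbing, costlier) non-response state."""
            s = np.array([p_resp, 1 - p_resp])
            P = np.array([[1 - p_relapse, p_relapse],
                          [0.0, 1.0]])
            cost = np.array([100.0, 900.0])           # per-cycle state costs
            total = 0.0
            for _ in range(cycles):
                total += s @ cost
                s = s @ P
            return total

        # Probabilistic simulation: push posterior draws through the model.
        draws = rng.beta(a, b, size=10000)
        costs = np.array([markov_cost(p) for p in draws])
        print(f"expected cost {costs.mean():,.0f}, 95% interval "
              f"({np.quantile(costs, 0.025):,.0f}, {np.quantile(costs, 0.975):,.0f})")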

  5. Bayesian approach to errors-in-variables in regression models

    Science.gov (United States)

    Rozliman, Nur Aainaa; Ibrahim, Adriana Irawati Nur; Yunus, Rossita Mohammad

    2017-05-01

    In many applications and experiments, data sets are contaminated with error or mismeasured covariates. When at least one of the covariates in a model is measured with error, an errors-in-variables (EIV) model can be used. Measurement error, when not corrected for, causes misleading statistical inference and analysis. Therefore, our goal is to examine the relationship between the outcome variable and the unobserved exposure variable, given the observed mismeasured surrogate, by applying a Bayesian formulation to the EIV model. We extend the flexible parametric method proposed by Hossain and Gustafson (2009) to another nonlinear regression model, the Poisson regression model. We then illustrate the application of this approach via a simulation study using Markov chain Monte Carlo sampling methods.
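
    A sketch of the model structure (Python with NumPy assumed): the true exposure is latent, the observed surrogate carries normal measurement error, and the outcome is Poisson. The joint log-density below is the target an MCMC sampler, e.g. Metropolis-within-Gibbs, would explore; the data and parameter values are simulated stand-ins.

        import numpy as np

        rng = np.random.default_rng(5)

        # Simulated EIV data: latent exposure x, noisy surrogate w, Poisson y.
        n = 100
        x_true = rng.normal(0, 1, n)
        w = x_true + rng.normal(0, 0.5, n)            # surrogate, tau = 0.5 known
        y = rng.poisson(np.exp(0.2 + 0.8 * x_true))

        def log_posterior(beta0, beta1, x, tau=0.5):
            """y_i ~ Poisson(exp(b0 + b1*x_i)); w_i ~ N(x_i, tau^2);
            x_i ~ N(0, 1); flat priors on (b0, b1). The latent x is a
            parameter block to be sampled alongside the betas."""
            eta = beta0 + beta1 * x
            log_lik_y = np.sum(y * eta - np.exp(eta))         # log y! dropped
            log_lik_w = -0.5 * np.sum(((w - x) / tau) ** 2)   # measurement model
            log_prior_x = -0.5 * np.sum(x ** 2)               # exposure model
            return log_lik_y + log_lik_w + log_prior_x

        print(log_posterior(0.2, 0.8, x_true))        # target evaluated at truth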

  6. Cost-Effectiveness of HIV Screening in STD Clinics, Emergency Departments, and Inpatient Units: A Model-Based Analysis

    Science.gov (United States)

    Prabhu, Vimalanand S.; Farnham, Paul G.; Hutchinson, Angela B.; Soorapanth, Sada; Heffelfinger, James D.; Golden, Matthew R.; Brooks, John T.; Rimland, David; Sansom, Stephanie L.

    2011-01-01

    Background Identifying and treating persons with human immunodeficiency virus (HIV) infection early in their disease stage is considered an effective means of reducing the impact of the disease. We compared the cost-effectiveness of HIV screening in three settings: sexually transmitted disease (STD) clinics serving men who have sex with men and hospital emergency departments (EDs), settings where patients are likely to be diagnosed early, and inpatient units, where diagnosis is based on clinical manifestations. Methods and Findings We developed the Progression and Transmission of HIV/AIDS model, a health state transition model that tracks index patients and their infected partners from HIV infection to death. We used program characteristics for each setting to compare the incremental cost per quality-adjusted life year gained from early versus late diagnosis and treatment. We ran the model for 10,000 index patients for each setting, examining alternative scenarios, excluding and including transmission to partners, and assuming HAART was initiated at a CD4 count of either 350 or 500 cells/µL. Screening in STD clinics and EDs was cost-effective compared with diagnosing inpatients, even when including only the benefits to the index patients. Screening patients in STD clinics, who have less-advanced disease, was cost-effective compared with ED screening when treatment with HAART was initiated at a CD4 count of 500 cells/µL. When the benefits of reduced transmission to partners from early diagnosis were included, screening in settings with less-advanced disease stages was cost-saving compared with screening later in the course of infection. The study was limited by a small number of observations on CD4 count at diagnosis and by including transmission only to first-generation partners of the index patients. Conclusions HIV prevention efforts can be advanced by screening in settings where patients present with less-advanced stages of HIV infection and by initiating treatment with HAART

  7. Estimating the Cost-Effectiveness of HIV Prevention Programmes in Vietnam, 2006-2010: A Modelling Study.

    Directory of Open Access Journals (Sweden)

    Quang Duy Pham

    Full Text Available Vietnam has been largely reliant on international support in its HIV response. Over 2006-2010, a total of US$480 million was invested in its HIV programmes, more than 70% of which came from international sources. This study investigates the potential epidemiological impacts of these programmes and their cost-effectiveness. We conducted a data synthesis of HIV programming, spending, epidemiological, and clinical outcomes. Counterfactual scenarios were defined based on assumed programme coverage and behaviours had the programmes not been implemented. An epidemiological model, calibrated to reflect the actual epidemiological trends, was used to estimate plausible ranges of programme impacts. The model was then used to estimate the costs per averted infection, death, and disability adjusted life-year (DALY). Based on observed prevalence reductions amongst most population groups, and plausible counterfactuals, modelling suggested that antiretroviral therapy (ART) and prevention programmes over 2006-2010 have averted an estimated 50,600 [95% uncertainty bound: 36,300-68,900] new infections and 42,600 [36,100-54,100] deaths, resulting in 401,600 [312,200-496,300] fewer DALYs across all population groups. HIV programmes in Vietnam have cost an estimated US$1,972 [1,447-2,747], US$2,344 [1,843-2,765], and US$248 [201-319] for each averted infection, death, and DALY, respectively. Our evaluation suggests that HIV programmes in Vietnam have most likely had benefits that are cost-effective. ART and direct HIV prevention were the most cost-effective interventions in reducing HIV disease burden.

  8. Estimating the Cost-Effectiveness of HIV Prevention Programmes in Vietnam, 2006-2010: A Modelling Study.

    Science.gov (United States)

    Pham, Quang Duy; Wilson, David P; Kerr, Cliff C; Shattock, Andrew J; Do, Hoa Mai; Duong, Anh Thuy; Nguyen, Long Thanh; Zhang, Lei

    2015-01-01

    Vietnam has been largely reliant on international support in its HIV response. Over 2006-2010, a total of US$480 million was invested in its HIV programmes, more than 70% of which came from international sources. This study investigates the potential epidemiological impacts of these programmes and their cost-effectiveness. We conducted a data synthesis of HIV programming, spending, epidemiological, and clinical outcomes. Counterfactual scenarios were defined based on assumed programme coverage and behaviours had the programmes not been implemented. An epidemiological model, calibrated to reflect the actual epidemiological trends, was used to estimate plausible ranges of programme impacts. The model was then used to estimate the costs per averted infection, death, and disability adjusted life-year (DALY). Based on observed prevalence reductions amongst most population groups, and plausible counterfactuals, modelling suggested that antiretroviral therapy (ART) and prevention programmes over 2006-2010 have averted an estimated 50,600 [95% uncertainty bound: 36,300-68,900] new infections and 42,600 [36,100-54,100] deaths, resulting in 401,600 [312,200-496,300] fewer DALYs across all population groups. HIV programmes in Vietnam have cost an estimated US$1,972 [1,447-2,747], US$2,344 [1,843-2,765], and US$248 [201-319] for each averted infection, death, and DALY, respectively. Our evaluation suggests that HIV programmes in Vietnam have most likely had benefits that are cost-effective. ART and direct HIV prevention were the most cost-effective interventions in reducing HIV disease burden.

  9. Multimethod, multistate Bayesian hierarchical modeling approach for use in regional monitoring of wolves.

    Science.gov (United States)

    Jiménez, José; García, Emilio J; Llaneza, Luis; Palacios, Vicente; González, Luis Mariano; García-Domínguez, Francisco; Múñoz-Igualada, Jaime; López-Bao, José Vicente

    2016-08-01

    In many cases, the first step in large-carnivore management is to obtain objective, reliable, and cost-effective estimates of population parameters through procedures that are reproducible over time. However, monitoring predators over large areas is difficult, and the data have a high level of uncertainty. We devised a practical multimethod and multistate modeling approach based on Bayesian hierarchical site-occupancy models that combined multiple survey methods to estimate different population states for use in monitoring large predators at a regional scale. We used wolves (Canis lupus) as our model species and generated reliable estimates of the number of sites with wolf reproduction (presence of pups). We used 2 wolf data sets from Spain (Western Galicia in 2013 and Asturias in 2004) to test the approach. Based on howling surveys, the naïve estimate (i.e., the estimate based only on observations) of the number of sites with reproduction was 9 and 25 sites in Western Galicia and Asturias, respectively. Our model showed 33.4 (SD 9.6) and 34.4 (3.9) sites with wolf reproduction, respectively. The estimated proportion of occupied sites with wolf reproduction was 0.67 (SD 0.19) and 0.76 (0.11), respectively. This approach can be used to design more cost-effective monitoring programs (i.e., to define the sampling effort needed per site). Our approach should inspire well-coordinated surveys across multiple administrative borders and populations and lead to improved decision making for management of large carnivores on a landscape level. The use of this Bayesian framework provides a simple way to visualize the degree of uncertainty around population-parameter estimates and thus provides managers and stakeholders an intuitive approach to interpreting monitoring results. Our approach can be widely applied to large spatial scales in wildlife monitoring where detection probabilities differ between population states and where several methods are being used to estimate different population

  10. Randomized evaluation and cost-effectiveness of HIV and sexual and reproductive health service referral and linkage models in Zambia

    Directory of Open Access Journals (Sweden)

    Paul C. Hewett

    2016-08-01

    Full Text Available Abstract Background Provision of HIV prevention and sexual and reproductive health services in Zambia is largely characterized by discrete service provision with weak client referral and linkage. The literature reveals gaps in the continuity of care for HIV and sexual and reproductive health. This study assessed whether improved service delivery models increased the uptake and cost-effectiveness of HIV and sexual and reproductive health services. Methods Adult clients 18+ years of age accessing family planning (females), HIV testing and counseling (females and males), and male circumcision services (males) were recruited, enrolled, and individually randomized to one of three study arms: 1) the standard model of service provision at the entry point (N = 1319); 2) enhanced counseling and referral to an add-on service with follow-up (N = 1323); and 3) the components of study arm two, with the additional offer of an escort (N = 1321). Interviews were conducted with the same clients at baseline, six weeks, and six months. Uptake of services for HIV, family planning, male circumcision, and cervical cancer screening at six weeks and six months were the primary endpoints. Pairwise chi-square and multivariable logistic regression statistical tests assessed differences across study arms, which were also assessed for incremental cost-efficiency and cost-effectiveness. Results A total of 3963 clients, 1920 males and 2043 females, were enrolled; 82 % of participants were tracked at six weeks and 81 % at six months; follow-up rates did not vary significantly by study arm. The odds of clients accessing HIV testing and counseling, cervical cancer screening services among females, and circumcision services among males varied significantly by study arm at six weeks and six months; less consistent findings were observed for HIV care and treatment. Client uptake of family planning services did not vary significantly by study arm. Integrated services were found

  11. Does cost-effectiveness of influenza vaccine choice vary across the U.S.? An agent-based modeling study.

    Science.gov (United States)

    DePasse, Jay V; Nowalk, Mary Patricia; Smith, Kenneth J; Raviotta, Jonathan M; Shim, Eunha; Zimmerman, Richard K; Brown, Shawn T

    2017-07-13

    In a prior agent-based modeling study, offering a choice of influenza vaccine type was shown to be cost-effective when the simulated population represented the large Washington, DC metropolitan area. This study calculated the public health impact and cost-effectiveness of the same four strategies: No Choice, Pediatric Choice, Adult Choice, or Choice for Both Age Groups in five United States (U.S.) counties selected to represent extremes in population age distribution. The choice offered was either inactivated influenza vaccine delivered intramuscularly with a needle (IIV-IM) or an age-appropriate needle-sparing vaccine, specifically the nasal spray (LAIV) or intradermal (IIV-ID) delivery system. Using agent-based modeling, individuals were simulated as they interacted with others, and influenza was tracked as it spread through each population. Influenza vaccination coverage, derived from Centers for Disease Control and Prevention (CDC) data, was increased by 6.5% (range 3.25%-11.25%) to reflect the effects of vaccine choice. Assuming moderate influenza infectivity, the number of averted cases was highest for Choice for Both Age Groups in all five counties despite differing demographic profiles. In a cost-effectiveness analysis, Choice for Both Age Groups was the dominant strategy. Sensitivity analyses varying influenza infectivity, costs, and degrees of vaccine coverage increase due to choice supported the base case findings. Offering a choice to receive a needle-sparing influenza vaccine has the potential to significantly reduce influenza disease burden and to be cost saving. Consistent findings across diverse populations confirmed these findings. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. A cost-effectiveness analysis of a proactive management strategy for the Sprint Fidelis recall: a probabilistic decision analysis model.

    Science.gov (United States)

    Bashir, Jamil; Cowan, Simone; Raymakers, Adam; Yamashita, Michael; Danter, Matthew; Krahn, Andrew; Lynd, Larry D

    2013-12-01

    The management of the Sprint Fidelis recall is complicated by the competing risks of lead failure and complications that can occur with lead revision. Many of these patients are currently undergoing an elective generator change, an ideal time to consider lead revision. Objective: to determine the cost-effectiveness of a proactive management strategy for the Sprint Fidelis recall. We obtained detailed clinical outcomes and costing data from a retrospective analysis of 341 patients who received the Sprint Fidelis lead in British Columbia, where patients younger than 60 years were offered lead extraction when undergoing generator replacement. These population-based data were used to construct and populate a probabilistic Markov model in which a proactive management strategy was compared to a conservative strategy to determine the incremental cost per lead failure avoided. In our population, elective lead revisions were half the cost of emergent revisions and had a lower complication rate. In the model, the incremental cost-effectiveness ratio of proactive lead revision versus a recommended monitoring strategy was $12,779 per lead failure avoided. The proactive strategy resulted in 21 fewer failures per 100 patients treated and reduced the chance of an additional complication from an unexpected surgery. Cost-effectiveness analysis suggests that prospective lead revision should be considered when patients with a Sprint Fidelis lead present for pulse generator change. Elective revision of the lead is justified even when 25% of the population is operated on per year, and in some scenarios it is both less costly and provides a better outcome. © 2013 Heart Rhythm Society. Published by Heart Rhythm Society. All rights reserved.

  13. Non-stationary magnetoencephalography by Bayesian filtering of dipole models

    Science.gov (United States)

    Somersalo, E.; Voutilainen, A.; Kaipio, J. P.

    2003-10-01

    In this paper, we consider the biomagnetic inverse problem of estimating a time-varying source current from magnetic field measurements. It is assumed that the data are severely corrupted by measurement noise. This setting is a model for magnetoencephalography (MEG) when the dynamic nature of the source prevents us from effecting noise reduction by averaging over consecutive measurements. Thus, the potential applications of this approach include single-trial estimation of brain activity, in particular from spontaneous MEG data. Our approach is based on non-stationary Bayesian estimation, and we propose the use of particle filters. The source model in this work is either a single-dipole or a multiple-dipole model. Part of the problem consists of determining the model. Numerical simulations are presented.
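
    The particle-filter machinery is easiest to see on a toy one-dimensional state rather than full dipole parameters; a bootstrap (sampling-importance-resampling) sketch in Python with NumPy assumed, all settings invented for illustration:

        import numpy as np

        rng = np.random.default_rng(6)

        # Toy non-stationary source: 1-D state observed through heavy noise.
        T, N = 50, 2000
        x = np.cumsum(rng.normal(0, 0.1, T))          # true random-walk trajectory
        z = x + rng.normal(0, 0.5, T)                 # severely noisy measurements

        particles = rng.normal(0, 1, N)
        estimates = []
        for t in range(T):
            particles = particles + rng.normal(0, 0.1, N)        # propagate (dynamics prior)
            w = np.exp(-0.5 * ((z[t] - particles) / 0.5) ** 2)   # likelihood weights
            w /= w.sum()
            estimates.append(w @ particles)                      # posterior-mean estimate
            particles = particles[rng.choice(N, size=N, p=w)]    # resample

        rmse = np.sqrt(np.mean((np.array(estimates) - x) ** 2))
        print(f"filter RMSE {rmse:.3f} vs measurement noise sd 0.5")

    In the dipole setting the state vector would hold dipole position, orientation, and moment, and the likelihood would come from the forward magnetic field model, but the propagate-weight-resample loop is identical.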

  14. A kinematic model for Bayesian tracking of cyclic human motion

    Science.gov (United States)

    Greif, Thomas; Lienhart, Rainer

    2010-01-01

    We introduce a two-dimensional kinematic model for cyclic motions of humans, which is suitable for the use as temporal prior in any Bayesian tracking framework. This human motion model is solely based on simple kinematic properties: the joint accelerations. Distributions of joint accelerations subject to the cycle progress are learned from training data. We present results obtained by applying the introduced model to the cyclic motion of backstroke swimming in a Kalman filter framework that represents the posterior distribution by a Gaussian. We experimentally evaluate the sensitivity of the motion model with respect to the frequency and noise level of assumed appearance-based pose measurements by simulating various fidelities of the pose measurements using ground truth data.
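
    For reference, one predict/update cycle of the Kalman framework such a motion model plugs into (Python with NumPy assumed); the state here is a single joint angle and its velocity, and all matrices are illustrative stand-ins for the learned acceleration statistics:

        import numpy as np

        F = np.array([[1.0, 1.0], [0.0, 1.0]])             # constant-velocity dynamics
        Q = 0.1**2 * np.array([[0.25, 0.5], [0.5, 1.0]])   # accel.-noise covariance
        H = np.array([[1.0, 0.0]])                         # pose measurement: angle only
        R = np.array([[0.05**2]])                          # measurement noise

        x, P = np.array([0.0, 0.2]), np.eye(2) * 0.1       # prior state and covariance

        # Predict with the motion model, then correct with a pose measurement.
        x, P = F @ x, F @ P @ F.T + Q
        z = np.array([0.25])
        S = H @ P @ H.T + R                                # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)                     # Kalman gain
        x = x + K @ (z - H @ x)
        P = (np.eye(2) - K @ H) @ P
        print("posterior state [angle, velocity]:", np.round(x, 3))

    The learned acceleration distributions described in the abstract would enter through Q, tightening the temporal prior where the motion cycle is more predictable.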

  15. Markov chain Monte Carlo simulation for Bayesian Hidden Markov Models

    Science.gov (United States)

    Chan, Lay Guat; Ibrahim, Adriana Irawati Nur Binti

    2016-10-01

    A hidden Markov model (HMM) is a mixture model which has a Markov chain with finite states as its mixing distribution. HMMs have been applied to a variety of fields, such as speech and face recognition. The main purpose of this study is to investigate the Bayesian approach to HMMs. Using this approach, we can simulate from the parameters' posterior distribution using Markov chain Monte Carlo (MCMC) sampling methods. HMMs seem to be useful, but there are some limitations. Therefore, by using the Mixture of Dirichlet processes Hidden Markov Model (MDPHMM) based on Yau et al. (2011), we hope to overcome these limitations. We conduct a simulation study using MCMC methods to investigate the performance of this model.
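
    The likelihood computation underlying both classical and Bayesian treatments of HMMs is the forward algorithm; a scaled version in a dozen lines (Python with NumPy assumed; the two-state parameters are illustrative):

        import numpy as np

        pi = np.array([0.6, 0.4])                  # initial state distribution
        A = np.array([[0.7, 0.3], [0.2, 0.8]])     # transition matrix
        B = np.array([[0.9, 0.1], [0.3, 0.7]])     # emission probabilities

        def forward_loglik(obs):
            """log p(obs | pi, A, B), scaling each step to avoid underflow."""
            alpha = pi * B[:, obs[0]]
            loglik = np.log(alpha.sum())
            alpha /= alpha.sum()
            for o in obs[1:]:
                alpha = (alpha @ A) * B[:, o]
                loglik += np.log(alpha.sum())
                alpha /= alpha.sum()
            return loglik

        print(forward_loglik([0, 0, 1, 0, 1, 1]))

    In the Bayesian treatment, an MCMC sampler alternates between evaluating this likelihood (or sampling the hidden state sequence directly) and updating pi, A, and B from their posterior conditionals.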

  16. Dynamic Bayesian networks as prognostic models for clinical patient management.

    Science.gov (United States)

    van Gerven, Marcel A J; Taal, Babs G; Lucas, Peter J F

    2008-08-01

    Prognostic models in medicine have usually been built using simple decision rules, proportional hazards models, or Markov models. Dynamic Bayesian networks (DBNs) offer an approach that allows for the incorporation of the causal and temporal nature of medical domain knowledge as elicited from domain experts, thereby allowing for detailed prognostic predictions. The aim of this paper is to describe the considerations that must be taken into account when constructing a DBN for complex medical domains and to demonstrate their usefulness in practice. To this end, we focus on the construction of a DBN for prognosis of carcinoid patients, compare performance with that of a proportional hazards model, and describe predictions for three individual patients. We show that the DBN can make detailed predictions about not only patient survival but also other variables of interest, such as disease progression, the effect of treatment, and the development of complications. Strengths and limitations of our approach are discussed and compared with those offered by traditional methods.

  17. One-Stage and Bayesian Two-Stage Optimal Designs for Mixture Models

    OpenAIRE

    Lin, Hefang

    1999-01-01

    In this research, Bayesian two-stage D-D optimal designs for mixture experiments with or without process variables under model uncertainty are developed. A Bayesian optimality criterion is used in the first stage to minimize the determinant of the posterior variances of the parameters. The second stage design is then generated according to an optimality procedure that collaborates with the improved model from first stage data. Our results show that the Bayesian two-stage D-D optimal design...

  18. Farmed deer: A veterinary model for chronic mycobacterial diseases that is accessible, appropriate and cost-effective

    Directory of Open Access Journals (Sweden)

    Frank Griffin

    2014-01-01

    Although most studies in immunology have used inbred mice as the experimental model to study fundamental immune mechanisms, they have been proven to be limited in their ability to chart complex functional immune pathways, such as are seen in outbred populations of humans or animals. Translation of the findings from inbred mouse studies into practical solutions in therapeutics or the clinic has been remarkably unproductive compared with many other areas of clinical practice in human and veterinary medicine. Access to an unlimited array of mouse strains and an increasing number of genetically modified strains continues to sustain their paramount position in immunology research. Since the mouse studies have provided little more than the dictionary and glossary of immunology, another approach will be required to write the classic exposition of functional immunity. Domestic animals such as ruminants and swine present worthwhile alternatives as models for immunological research into infectious diseases, which may be more informative and cost effective. The original constraint on large animal research through a lack of reagents has been superseded by new molecular technologies and robotics that allow research to progress from gene discovery to systems biology, seamlessly. The current review attempts to highlight how exotic animals such as deer can leverage off the knowledge of ruminant genomics to provide cost-effective models for research into complex, chronic infections. The unique opportunity they provide relates to their diversity and polymorphic genotypes and the integrity of their phenotype for a range of infectious diseases.

  19. Bayesian uncertainty analysis with applications to turbulence modeling

    International Nuclear Information System (INIS)

    Cheung, Sai Hung; Oliver, Todd A.; Prudencio, Ernesto E.; Prudhomme, Serge; Moser, Robert D.

    2011-01-01

    In this paper, we apply Bayesian uncertainty quantification techniques to the processes of calibrating complex mathematical models and predicting quantities of interest (QoI's) with such models. These techniques also enable the systematic comparison of competing model classes. The processes of calibration and comparison constitute the building blocks of a larger validation process, the goal of which is to accept or reject a given mathematical model for the prediction of a particular QoI for a particular scenario. In this work, we take the first step in this process by applying the methodology to the analysis of the Spalart-Allmaras turbulence model in the context of incompressible, boundary layer flows. Three competing model classes based on the Spalart-Allmaras model are formulated, calibrated against experimental data, and used to issue a prediction with quantified uncertainty. The model classes are compared in terms of their posterior probabilities and their prediction of QoI's. The model posterior probability represents the relative plausibility of a model class given the data. Thus, it incorporates the model's ability to fit experimental observations. Alternatively, comparing models using the predicted QoI connects the process to the needs of decision makers that use the results of the model. We show that by using both the model plausibility and predicted QoI, one has the opportunity to reject some model classes after calibration, before subjecting the remaining classes to additional validation challenges.
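
    The model-plausibility comparison described here reduces, mechanically, to normalizing (log) marginal likelihoods across the competing classes. A minimal sketch, with invented log-evidence values standing in for the paper's Spalart-Allmaras results:

```python
# Sketch: posterior model probabilities from log marginal likelihoods under
# equal prior odds. The log-evidence values are illustrative placeholders.
import numpy as np

log_evidence = np.array([-1050.2, -1047.8, -1049.1])  # hypothetical log p(data | M_k)
prior = np.array([1 / 3, 1 / 3, 1 / 3])               # equal prior model probabilities

logp = log_evidence + np.log(prior)
logp -= logp.max()                        # subtract max for numerical stability
post = np.exp(logp) / np.exp(logp).sum()
for k, p in enumerate(post, 1):
    print(f"P(M{k} | data) = {p:.3f}")
```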

  20. The Potential Cost Effectiveness of Different Dengue Vaccination Programmes in Malaysia: A Value-Based Pricing Assessment Using Dynamic Transmission Mathematical Modelling.

    Science.gov (United States)

    Shafie, Asrul Akmal; Yeo, Hui Yee; Coudeville, Laurent; Steinberg, Lucas; Gill, Balvinder Singh; Jahis, Rohani; Amar-Singh Hss

    2017-05-01

    Dengue disease poses a great economic burden in Malaysia. This study evaluated the cost effectiveness and impact of dengue vaccination in Malaysia from both provider and societal perspectives using a dynamic transmission mathematical model. The model incorporated sensitivity analyses, Malaysia-specific data, evidence from recent phase III studies and pooled efficacy and long-term safety data to refine the estimates from previous published studies. Unit costs were valued in $US, year 2013 values. Six vaccination programmes employing a three-dose schedule were identified as the most likely programmes to be implemented. In all programmes, vaccination produced positive benefits expressed as reductions in dengue cases, dengue-related deaths, life-years lost, disability-adjusted life-years and dengue treatment costs. Instead of incremental cost-effectiveness ratios (ICERs), we evaluated the cost effectiveness of the programmes by calculating the threshold prices for a highly cost-effective strategy [ICER <1 × gross domestic product (GDP) per capita] and a cost-effective strategy (ICER between 1 and 3 × GDP per capita). We found that vaccination may be cost effective up to a price of $US32.39 for programme 6 (highly cost effective up to $US14.15) and up to a price of $US100.59 for programme 1 (highly cost effective up to $US47.96) from the provider perspective. The cost-effectiveness analysis is sensitive to under-reporting, vaccine protection duration and model time horizon. Routine vaccination for a population aged 13 years with a catch-up cohort aged 14-30 years in targeted hotspot areas appears to be the best-value strategy among those investigated. Dengue vaccination is a potentially good investment if the purchaser can negotiate a price at or below the cost-effective threshold price.
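
    The decision rule used in this abstract classifies an intervention by comparing its ICER with multiples of GDP per capita. A minimal sketch, with an invented GDP figure rather than Malaysia's actual 2013 value:

```python
# Sketch: WHO-style cost-effectiveness classification -- "highly cost
# effective" below 1x GDP per capita, "cost effective" below 3x. The GDP
# figure and the example ICERs are hypothetical.
gdp_per_capita = 10_500.0        # hypothetical GDP per capita (US$)

def classify(icer: float) -> str:
    if icer < gdp_per_capita:
        return "highly cost effective"
    if icer < 3 * gdp_per_capita:
        return "cost effective"
    return "not cost effective"

for icer in (8_000.0, 25_000.0, 40_000.0):
    print(f"ICER ${icer:,.0f} -> {classify(icer)}")
```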

  1. Ridge, Lasso and Bayesian additive-dominance genomic models.

    Science.gov (United States)

    Azevedo, Camila Ferreira; de Resende, Marcos Deon Vilela; E Silva, Fabyano Fonseca; Viana, José Marcelo Soriano; Valente, Magno Sávio Ferreira; Resende, Márcio Fernando Ribeiro; Muñoz, Patricio

    2015-08-25

    A complete approach for genome-wide selection (GWS) involves reliable statistical genetics models and methods. Reports on this topic are common for additive genetic models but not for additive-dominance models. The objective of this paper was (i) to compare the performance of 10 additive-dominance predictive models (including current models and proposed modifications), fitted using Bayesian, Lasso and Ridge regression approaches; and (ii) to decompose genomic heritability and accuracy in terms of three quantitative genetic information sources, namely, linkage disequilibrium (LD), co-segregation (CS) and pedigree relationships or family structure (PR). The simulation study considered two broad sense heritability levels (0.30 and 0.50, associated with narrow sense heritabilities of 0.20 and 0.35, respectively) and two genetic architectures for traits (the first consisting of small gene effects and the second consisting of a mixed inheritance model with five major genes). G-REML/G-BLUP and a modified Bayesian/Lasso (called BayesA*B* or t-BLASSO) method performed best in the prediction of genomic breeding values as well as the total genotypic values of individuals in all four scenarios (two heritabilities × two genetic architectures). The BayesA*B*-type method showed a better ability to recover the dominance variance/additive variance ratio. Decomposition of genomic heritability and accuracy revealed the following descending importance order of information: LD, CS and PR not captured by markers, the last two being very close. Amongst the 10 models/methods evaluated, the G-BLUP, BAYESA*B* (-2,8) and BAYESA*B* (4,6) methods presented the best results and were found to be adequate for accurately predicting genomic breeding values and total genotypic values, as well as for estimating additive and dominance effects in additive-dominance genomic models.

  2. Bayesian Dose-Response Modeling in Sparse Data

    Science.gov (United States)

    Kim, Steven B.

    This book discusses Bayesian dose-response modeling in small samples applied to two different settings. The first setting is early phase clinical trials, and the second setting is toxicology studies in cancer risk assessment. In early phase clinical trials, experimental units are humans who are actual patients. Prior to a clinical trial, opinions from multiple subject area experts are generally more informative than the opinion of a single expert, but we may face a dilemma when they have disagreeing prior opinions. In this regard, we consider compromising the disagreement and compare two different approaches for making a decision. In addition to combining multiple opinions, we also address balancing two levels of ethics in early phase clinical trials. The first level is individual-level ethics, which reflects the perspective of trial participants. The second level is population-level ethics, which reflects the perspective of future patients. We extensively compare two existing statistical methods which focus on each perspective and propose a new method which balances the two conflicting perspectives. In toxicology studies, experimental units are living animals. Here we focus on a potential non-monotonic dose-response relationship which is known as hormesis. Briefly, hormesis is a phenomenon which can be characterized by a beneficial effect at low doses and a harmful effect at high doses. In cancer risk assessments, the estimation of a parameter, which is known as a benchmark dose, can be highly sensitive to a class of assumptions, monotonicity or hormesis. In this regard, we propose a robust approach which considers both monotonicity and hormesis as a possibility. In addition, we discuss statistical hypothesis testing for hormesis and consider various experimental designs for detecting hormesis based on Bayesian decision theory. Past experiments have not been optimally designed for testing for hormesis, and some Bayesian optimal designs may not be optimal under a …

  3. MATHEMATICAL RISK ANALYSIS: VIA NICHOLAS RISK MODEL AND BAYESIAN ANALYSIS

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-07-01

    The objective of this second part of a two-phased study was to explore the predictive power of the quantitative risk analysis (QRA) method and process within a Higher Education Institution (HEI). The method and process investigated used impact analysis via the Nicholas risk model and Bayesian analysis, with a sample of one hundred (100) risk analysts in a historically black South African university in the greater Eastern Cape Province. The first finding supported and confirmed previous literature (King III report, 2009; Nicholas and Steyn, 2008; Stoney, 2007; COSA, 2004) that there was a direct relationship between a risk factor, its likelihood and its impact, ceteris paribus. The second finding, in relation to controlling either the likelihood or the impact of risk occurrence (Nicholas risk model), was that to obtain a better risk reward it was more important to control the likelihood of occurrence of risks than their impact, so as to have a direct effect on the entire university. On the Bayesian analysis, the third finding was that the impact of risk should be predicted along three aspects: the human impact (decisions made), the property impact (student- and infrastructure-based) and the business impact. Lastly, although business cycles vary considerably depending on the industry and/or institution, the study revealed that most impacts in the HEI (university) fell within the period of one academic year. The recommendation was that the application of quantitative risk analysis should be related to the current legislative framework that affects HEIs.

  4. Bayesian Age-Period-Cohort Model of Lung Cancer Mortality

    Directory of Open Access Journals (Sweden)

    Bhikhari P. Tharu

    2015-09-01

    Background: The objective of this study was to analyze the time trend of lung cancer mortality in the population of the USA in 5-year intervals, based on the most recent available data, namely up to 2010. Knowledge of mortality rates and their temporal trends is necessary to understand the cancer burden. Methods: A Bayesian Age-Period-Cohort model was fitted using Poisson regression with a histogram smoothing prior to decompose mortality rates based on age at death, period at death, and birth cohort. Results: Mortality rates from lung cancer increased more rapidly from age 52 years, reaching about 325 deaths annually at age 82 years on average. The mortality of younger cohorts was lower than that of older cohorts. The risk of lung cancer declined from the 1993 period to recent periods. Conclusions: The fitted Bayesian Age-Period-Cohort model with a histogram smoothing prior is capable of explaining the mortality rate of lung cancer. The reduction in carcinogens in cigarettes and the increase in smoking cessation from around 1960 might have led to the decreasing trend in lung cancer mortality after the 1993 calendar period.
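
    The Poisson building block underlying such age-period-cohort models can be illustrated with a conjugate Gamma-Poisson update for a single mortality rate. The sketch below is not the study's APC model; deaths and exposure are invented:

```python
# Sketch: with a Gamma(a, b) prior on a mortality rate per person-year,
# d deaths over E person-years give a Gamma(a + d, b + E) posterior.
# Numbers are illustrative, not the study's lung cancer data.
import numpy as np

a, b = 1.0, 1.0                  # weak Gamma prior (shape, rate)
deaths, exposure = 325, 100_000  # hypothetical deaths and person-years

rng = np.random.default_rng(0)
draws = rng.gamma(a + deaths, 1.0 / (b + exposure), size=10_000)

print("posterior mean rate per 100k person-years:",
      round(1e5 * draws.mean(), 1))
print("95% credible interval per 100k:",
      np.round(1e5 * np.quantile(draws, [0.025, 0.975]), 1))
```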

  5. Quantitative comparison of canopy conductance models using a Bayesian approach

    Science.gov (United States)

    Samanta, S.; Clayton, M. K.; Mackay, D. S.; Kruger, E. L.; Ewers, B. E.

    2008-09-01

    A quantitative model comparison methodology based on deviance information criterion, a Bayesian measure of the trade-off between model complexity and goodness of fit, is developed and demonstrated by comparing semiempirical transpiration models. This methodology accounts for parameter and prediction uncertainties associated with such models and facilitates objective selection of the simplest model, out of available alternatives, which does not significantly compromise the ability to accurately model observations. We use this methodology to compare various Jarvis canopy conductance model configurations, embedded within a larger transpiration model, against canopy transpiration measured by sap flux. The results indicate that descriptions of the dependence of stomatal conductance on vapor pressure deficit, photosynthetic radiation, and temperature, as well as the gradual variation in canopy conductance through the season are essential in the transpiration model. Use of soil moisture was moderately significant, but only when used with a hyperbolic vapor pressure deficit relationship. Subtle differences in model quality could be clearly associated with small structural changes through the use of this methodology. The results also indicate that increments in model complexity are not always accompanied by improvements in model quality and that such improvements are conditional on model structure. Possible application of this methodology to compare complex semiempirical models of natural systems in general is also discussed.
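
    The deviance information criterion at the heart of this methodology is straightforward to compute from posterior draws. A toy sketch for a one-parameter normal model (not the canopy conductance models of the paper):

```python
# Sketch: DIC = Dbar + pD, where Dbar is the posterior mean deviance and
# pD = Dbar - D(theta_bar) measures effective complexity. Toy data, known
# sigma = 1, flat prior, so exact posterior draws are available directly.
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(1.0, 1.0, size=50)                    # toy data
n = len(y)
theta = rng.normal(y.mean(), 1 / np.sqrt(n), 4000)   # posterior draws of the mean

def deviance(th):
    # -2 log-likelihood of y under Normal(th, 1)
    return n * np.log(2 * np.pi) + np.sum((y - th) ** 2)

D = np.array([deviance(t) for t in theta])
D_bar = D.mean()                                     # posterior mean deviance
p_D = D_bar - deviance(theta.mean())                 # effective number of parameters
print(f"pD = {p_D:.2f} (about 1 for this one-parameter model)")
print(f"DIC = {D_bar + p_D:.1f}")
```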

  6. Effectiveness and cost-effectiveness of serum B-type natriuretic peptide testing and monitoring in patients with heart failure in primary and secondary care: an evidence synthesis, cohort study and cost-effectiveness model.

    Science.gov (United States)

    Pufulete, Maria; Maishman, Rachel; Dabner, Lucy; Mohiuddin, Syed; Hollingworth, William; Rogers, Chris A; Higgins, Julian; Dayer, Mark; Macleod, John; Purdy, Sarah; McDonagh, Theresa; Nightingale, Angus; Williams, Rachael; Reeves, Barnaby C

    2017-08-01

    Heart failure (HF) affects around 500,000 people in the UK. HF medications are frequently underprescribed and B-type natriuretic peptide (BNP)-guided therapy may help to optimise treatment. To evaluate the clinical effectiveness and cost-effectiveness of BNP-guided therapy compared with symptom-guided therapy in HF patients. Systematic review, cohort study and cost-effectiveness model. A literature review and usual care in the NHS. (a) HF patients in randomised controlled trials (RCTs) of BNP-guided therapy; and (b) patients having usual care for HF in the NHS. Systematic review: BNP-guided therapy or symptom-guided therapy in primary or secondary care. Cohort study: BNP monitored (≥ 6 months' follow-up and three or more BNP tests and two or more tests per year), BNP tested (≥ 1 test but not BNP monitored) or never tested. Cost-effectiveness model: BNP-guided therapy in specialist clinics. Mortality, hospital admission (all cause and HF related) and adverse events; and quality-adjusted life-years (QALYs) for the cost-effectiveness model. Systematic review: Individual participant or aggregate data from eligible RCTs. Cohort study: The Clinical Practice Research Datalink, Hospital Episode Statistics and National Heart Failure Audit (NHFA). A systematic literature search (five databases, trial registries, grey literature and reference lists of publications) for published and unpublished RCTs. Five RCTs contributed individual participant data (IPD) and eight RCTs contributed aggregate data (1536 participants were randomised to BNP-guided therapy and 1538 participants were randomised to symptom-guided therapy). For all-cause mortality, the hazard ratio (HR) for BNP-guided therapy was 0.87 [95% confidence interval (CI) 0.73 to 1.04]. Patients who were aged …

  7. Application of Bayesian Model Selection for Metal Yield Models using ALEGRA and Dakota.

    Energy Technology Data Exchange (ETDEWEB)

    Portone, Teresa; Niederhaus, John Henry; Sanchez, Jason James; Swiler, Laura Painton

    2018-02-01

    This report introduces the concepts of Bayesian model selection, which provides a systematic means of calibrating and selecting an optimal model to represent a phenomenon. This has many potential applications, including for comparing constitutive models. The ideas described herein are applied to a model selection problem between different yield models for hardened steel under extreme loading conditions.

  8. 'Traffic-light' nutrition labelling and 'junk-food' tax: a modelled comparison of cost-effectiveness for obesity prevention.

    Science.gov (United States)

    Sacks, G; Veerman, J L; Moodie, M; Swinburn, B

    2011-07-01

    Cost-effectiveness analyses are important tools in efforts to prioritise interventions for obesity prevention. Modelling facilitates evaluation of multiple scenarios with varying assumptions. This study compares the cost-effectiveness of conservative scenarios for two commonly proposed policy-based interventions: front-of-pack 'traffic-light' nutrition labelling (traffic-light labelling) and a tax on unhealthy foods ('junk-food' tax). For traffic-light labelling, estimates of changes in energy intake were based on an assumed 10% shift in consumption towards healthier options in four food categories (breakfast cereals, pastries, sausages and preprepared meals) in 10% of adults. For the 'junk-food' tax, price elasticities were used to estimate a change in energy intake in response to a 10% price increase in seven food categories (including soft drinks, confectionery and snack foods). Changes in population weight and body mass index by sex were then estimated based on these changes in population energy intake, along with subsequent impacts on disability-adjusted life years (DALYs). Associated resource use was measured and costed using pathway analysis, based on a health sector perspective (with some industry costs included). Costs and health outcomes were discounted at 3%. The cost-effectiveness of each intervention was modelled for the 2003 Australian adult population. Both interventions resulted in reduced mean weight (traffic-light labelling: 1.3 kg (95% uncertainty interval (UI): 1.2; 1.4); 'junk-food' tax: 1.6 kg (95% UI: 1.5; 1.7)); and DALYs averted (traffic-light labelling: 45,100 (95% UI: 37,700; 60,100); 'junk-food' tax: 559,000 (95% UI: 459,500; 676,000)). Cost outlays were AUD81 million (95% UI: 44.7; 108.0) for traffic-light labelling and AUD18 million (95% UI: 14.4; 21.6) for 'junk-food' tax. Cost-effectiveness analysis showed both interventions were 'dominant' (effective and cost-saving). Policy-based population-wide interventions such as traffic-light labelling …

  9. A Bayesian Analysis of Unobserved Component Models Using Ox

    Directory of Open Access Journals (Sweden)

    Charles S. Bos

    2011-05-01

    This article details a Bayesian analysis of the Nile river flow data, using a similar state space model as other articles in this volume. For this data set, Metropolis-Hastings and Gibbs sampling algorithms are implemented in the programming language Ox. These Markov chain Monte Carlo methods only provide output conditioned upon the full data set. For filtered output, conditioning only on past observations, the particle filter is introduced. The sampling methods are flexible, and this advantage is used to extend the model to incorporate a stochastic volatility process. The volatility changes both in the Nile data and also in daily S&P 500 return data are investigated. The posterior density of parameters and states is found to provide information on which elements of the model are easily identifiable, and which elements are estimated with less precision.
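
    A random-walk Metropolis-Hastings sampler of the kind implemented in Ox can be sketched in a few lines. The toy below, in Python rather than Ox, targets the posterior of a constant level under noisy flow-like observations with a flat prior; the full state space model of the article is much richer, and all constants here are invented:

```python
# Sketch: random-walk Metropolis-Hastings for the mean level of a noisy
# series, with known observation std and a flat prior.
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(900.0, 120.0, size=100)        # toy "annual flow" data
sigma = 120.0                                 # assumed known observation std

def log_post(mu):
    return -0.5 * np.sum(((y - mu) / sigma) ** 2)   # flat prior on mu

draws, mu, accepted = [], 800.0, 0
for _ in range(20_000):
    prop = mu + rng.normal(0, 25.0)           # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(mu):
        mu, accepted = prop, accepted + 1
    draws.append(mu)

draws = np.array(draws[5_000:])               # discard burn-in
print(f"posterior mean {draws.mean():.1f}, sd {draws.std():.1f}, "
      f"acceptance {accepted / 20_000:.2f}")
```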

  10. Fast Bayesian Inference in Dirichlet Process Mixture Models.

    Science.gov (United States)

    Wang, Lianming; Dunson, David B

    2011-01-01

    There has been increasing interest in applying Bayesian nonparametric methods in large samples and high dimensions. As Markov chain Monte Carlo (MCMC) algorithms are often infeasible, there is a pressing need for much faster algorithms. This article proposes a fast approach for inference in Dirichlet process mixture (DPM) models. Viewing the partitioning of subjects into clusters as a model selection problem, we propose a sequential greedy search algorithm for selecting the partition. Then, when conjugate priors are chosen, the resulting posterior conditional on the selected partition is available in closed form. This approach allows testing of parametric models versus nonparametric alternatives based on Bayes factors. We evaluate the approach using simulation studies and compare it with four other fast nonparametric methods in the literature. We apply the proposed approach to three datasets including one from a large epidemiologic study. Matlab codes for the simulation and data analyses using the proposed approach are available online in the supplemental materials.

  11. Aggregated Residential Load Modeling Using Dynamic Bayesian Networks

    Energy Technology Data Exchange (ETDEWEB)

    Vlachopoulou, Maria; Chin, George; Fuller, Jason C.; Lu, Shuai

    2014-09-28

    It is already obvious that the future power grid will have to address higher demand for power and energy, and to incorporate renewable resources with different energy generation patterns. Demand response (DR) schemes could successfully be used to manage and balance power supply and demand under operating conditions of the future power grid. To achieve that, more advanced tools for DR management of operations and planning are necessary that can estimate the available capacity from DR resources. In this research, a Dynamic Bayesian Network (DBN) is derived, trained, and tested that can model the aggregated load of Heating, Ventilation, and Air Conditioning (HVAC) systems. DBNs can provide flexible and powerful tools for both operations and planning, due to their unique analytical capabilities. The DBN model's accuracy and flexibility of use are demonstrated by testing the model under different operational scenarios.

  12. Development of a Bayesian Belief Network Runway Incursion Model

    Science.gov (United States)

    Green, Lawrence L.

    2014-01-01

    In a previous paper, a statistical analysis of runway incursion (RI) events was conducted to ascertain their relevance to the top ten Technical Challenges (TC) of the National Aeronautics and Space Administration (NASA) Aviation Safety Program (AvSP). The study revealed connections to perhaps several of the AvSP top ten TC. That data also identified several primary causes and contributing factors for RI events that served as the basis for developing a system-level Bayesian Belief Network (BBN) model for RI events. The system-level BBN model will allow NASA to generically model the causes of RI events and to assess the effectiveness of technology products being developed under NASA funding. These products are intended to reduce the frequency of RI events in particular, and to improve runway safety in general. The development, structure and assessment of that BBN for RI events by a Subject Matter Expert panel are documented in this paper.

  13. Advances in Bayesian Model Based Clustering Using Particle Learning

    Energy Technology Data Exchange (ETDEWEB)

    Merl, D M

    2009-11-19

    Recent work by Carvalho, Johannes, Lopes and Polson and Carvalho, Lopes, Polson and Taddy introduced a sequential Monte Carlo (SMC) alternative to traditional iterative Monte Carlo strategies (e.g. MCMC and EM) for Bayesian inference for a large class of dynamic models. The basis of SMC techniques involves representing the underlying inference problem as one of state space estimation, thus giving way to inference via particle filtering. The key insight of Carvalho et al. was to construct the sequence of filtering distributions so as to make use of the posterior predictive distribution of the observable, a distribution usually only accessible in certain Bayesian settings. Access to this distribution allows a reversal of the usual propagate and resample steps characteristic of many SMC methods, thereby alleviating to a large extent many problems associated with particle degeneration. Furthermore, Carvalho et al. point out that for many conjugate models the posterior distribution of the static variables can be parametrized in terms of [recursively defined] sufficient statistics of the previously observed data. For models where such sufficient statistics exist, particle learning, as it is being called, is especially well suited for the analysis of streaming data due to the relative invariance of its algorithmic complexity with the number of data observations. Through a particle learning approach, a statistical model can be fit to data as the data is arriving, allowing at any instant during the observation process direct quantification of uncertainty surrounding underlying model parameters. Here we describe the use of a particle learning approach for fitting a standard Bayesian semiparametric mixture model as described in Carvalho, Lopes, Polson and Taddy. In Section 2 we briefly review the previously presented particle learning algorithm for the case of a Dirichlet process mixture of multivariate normals. In Section 3 we describe several novel extensions to the original …

  14. Road network safety evaluation using Bayesian hierarchical joint model.

    Science.gov (United States)

    Wang, Jie; Huang, Helai

    2016-05-01

    Safety and efficiency are commonly regarded as two significant performance indicators of transportation systems. In practice, road network planning has focused on road capacity and transport efficiency whereas the safety level of a road network has received little attention in the planning stage. This study develops a Bayesian hierarchical joint model for road network safety evaluation to help planners take traffic safety into account when planning a road network. The proposed model establishes relationships between road network risk and micro-level variables related to road entities and traffic volume, as well as socioeconomic, trip generation and network density variables at macro level which are generally used for long term transportation plans. In addition, network spatial correlation between intersections and their connected road segments is also considered in the model. A road network is elaborately selected in order to compare the proposed hierarchical joint model with a previous joint model and a negative binomial model. According to the results of the model comparison, the hierarchical joint model outperforms the joint model and negative binomial model in terms of the goodness-of-fit and predictive performance, which indicates the reasonableness of considering the hierarchical data structure in crash prediction and analysis. Moreover, both random effects at the TAZ level and the spatial correlation between intersections and their adjacent segments are found to be significant, supporting the employment of the hierarchical joint model as an alternative in road-network-level safety modeling as well. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Integrated Bayesian network framework for modeling complex ecological issues.

    Science.gov (United States)

    Johnson, Sandra; Mengersen, Kerrie

    2012-07-01

    The management of environmental problems is multifaceted, requiring varied and sometimes conflicting objectives and perspectives to be considered. Bayesian network (BN) modeling facilitates the integration of information from diverse sources and is well suited to tackling the management challenges of complex environmental problems. However, combining several perspectives in one model can lead to large, unwieldy BNs that are difficult to maintain and understand. Conversely, an oversimplified model may lead to an unrealistic representation of the environmental problem. Environmental managers require the current research and available knowledge about an environmental problem of interest to be consolidated in a meaningful way, thereby enabling the assessment of potential impacts and different courses of action. Previous investigations of the environmental problem of interest may have already resulted in the construction of several disparate ecological models. On the other hand, the opportunity may exist to initiate this modeling. In the first instance, the challenge is to integrate existing models and to merge the information and perspectives from these models. In the second instance, the challenge is to include different aspects of the environmental problem incorporating both the scientific and management requirements. Although the paths leading to the combined model may differ for these 2 situations, the common objective is to design an integrated model that captures the available information and research, yet is simple to maintain, expand, and refine. BN modeling is typically an iterative process, and we describe a heuristic method, the iterative Bayesian network development cycle (IBNDC), for the development of integrated BN models that are suitable for both situations outlined above. The IBNDC approach facilitates object-oriented BN (OOBN) modeling, arguably viewed as the next logical step in adaptive management modeling, which embraces iterative development.

  16. Is computer aided detection (CAD cost effective in screening mammography? A model based on the CADET II study

    Directory of Open Access Journals (Sweden)

    Wallis Matthew G

    2011-01-01

    Background: Single reading with computer aided detection (CAD) is an alternative to double reading for detecting cancer in screening mammograms. The aim of this study is to investigate whether the use of a single reader with CAD is more cost-effective than double reading. Methods: Based on data from the CADET II study, the cost-effectiveness of single reading with CAD versus double reading was measured in terms of cost per cancer detected. The cost (£, year 2007/08) of single reading with CAD versus double reading was estimated assuming a health and social service perspective and a 7 year time horizon. As the equipment cost varies according to the unit size, a separate analysis was conducted for high, average and low volume screening units. One-way sensitivity analyses were performed by varying the reading time, equipment and assessment cost, recall rate and reader qualification. Results: CAD is cost increasing for all sizes of screening unit. The introduction of CAD is cost-increasing compared to double reading because the cost of CAD equipment, staff training and the higher assessment cost associated with CAD are greater than the saving in reading costs. The introduction of single reading with CAD, in place of double reading, would produce an additional cost of £227 and £253 per 1,000 women screened in high and average volume units respectively. In low volume screening units, the high cost of purchasing the equipment will result in an additional cost of £590 per 1,000 women screened. One-way sensitivity analysis showed that the factors having the greatest effect on the cost-effectiveness of CAD with single reading compared with double reading were the reading time and the reader's professional qualification (radiologist versus advanced practitioner). Conclusions: Without improvements in CAD effectiveness (e.g. a decrease in the recall rate) CAD is unlikely to be a cost-effective alternative to double reading for mammography screening in the UK.
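
    The per-1,000-women cost figures above translate directly into a cost per extra cancer detected once a detection increment is assumed. In the sketch below the £227/£253/£590 figures come from the abstract, while the detection increment is an invented placeholder, so the resulting ratios are illustrative only:

```python
# Sketch: additional cost of single reading with CAD per 1,000 women screened
# (from the abstract) divided by a hypothetical increment in cancers detected.
extra_cost_per_1000 = {"high": 227.0, "average": 253.0, "low": 590.0}  # GBP
extra_cancers_per_1000 = 0.2   # hypothetical additional cancers detected by CAD

for unit, cost in extra_cost_per_1000.items():
    print(f"{unit:>7}-volume unit: GBP {cost:.0f} per 1,000 women; "
          f"GBP {cost / extra_cancers_per_1000:,.0f} per extra cancer detected")
```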

  17. An Assessment of the Expected Cost-Effectiveness of Quadrivalent Influenza Vaccines in Ontario, Canada Using a Static Model.

    Directory of Open Access Journals (Sweden)

    Ayman Chit

    Ontario, Canada, immunizes against influenza using a trivalent inactivated influenza vaccine (IIV3) under a Universal Influenza Immunization Program (UIIP). The UIIP offers IIV3 free-of-charge to all Ontarians over 6 months of age. A newly approved quadrivalent inactivated influenza vaccine (IIV4) offers wider protection against influenza B disease. We explored the expected cost-utility and budget impact of replacing IIV3 with IIV4, within the context of Ontario's UIIP, using a probabilistic and static cost-utility model. Wherever possible, epidemiological and cost data were obtained from Ontario sources. Canadian or U.S. sources were used when Ontario data were not available. Vaccine efficacy for IIV3 was obtained from the literature. IIV4 efficacy was derived from meta-analysis of strain-specific vaccine efficacy. Conservatively, herd protection was not considered. In the base case, we used IIV3 and IIV4 prices of $5.5/dose and $7/dose, respectively. We conducted a sensitivity analysis on the price of IIV4, as well as standard univariate and multivariate statistical uncertainty analyses. Over a typical influenza season, relative to IIV3, IIV4 is expected to avert an additional 2,516 influenza cases, 1,683 influenza-associated medical visits, 27 influenza-associated hospitalizations, and 5 influenza-associated deaths. From a societal perspective, IIV4 would generate 76 more quality-adjusted life-years (QALYs) and a net societal budget impact of $4,784,112. The incremental cost effectiveness ratio for this comparison was $63,773/QALY. IIV4 remains cost-effective up to a 53% price premium over IIV3. A probabilistic sensitivity analysis showed that IIV4 was cost-effective with a probability of 65% for a threshold of $100,000/QALY gained. IIV4 is expected to achieve reductions in influenza-related morbidity and mortality compared to IIV3. Despite not accounting for herd protection, IIV4 is still expected to be a cost-effective alternative to IIV3 up to a 53% price premium.

  18. Is computer aided detection (CAD) cost effective in screening mammography? A model based on the CADET II study

    Science.gov (United States)

    2011-01-01

    Background: Single reading with computer aided detection (CAD) is an alternative to double reading for detecting cancer in screening mammograms. The aim of this study is to investigate whether the use of a single reader with CAD is more cost-effective than double reading. Methods: Based on data from the CADET II study, the cost-effectiveness of single reading with CAD versus double reading was measured in terms of cost per cancer detected. Cost (Pound (£), year 2007/08) of single reading with CAD versus double reading was estimated assuming a health and social service perspective and a 7 year time horizon. As the equipment cost varies according to the unit size a separate analysis was conducted for high, average and low volume screening units. One-way sensitivity analyses were performed by varying the reading time, equipment and assessment cost, recall rate and reader qualification. Results: CAD is cost increasing for all sizes of screening unit. The introduction of CAD is cost-increasing compared to double reading because the cost of CAD equipment, staff training and the higher assessment cost associated with CAD are greater than the saving in reading costs. The introduction of single reading with CAD, in place of double reading, would produce an additional cost of £227 and £253 per 1,000 women screened in high and average volume units respectively. In low volume screening units, the high cost of purchasing the equipment will result in an additional cost of £590 per 1,000 women screened. One-way sensitivity analysis showed that the factors having the greatest effect on the cost-effectiveness of CAD with single reading compared with double reading were the reading time and the reader's professional qualification (radiologist versus advanced practitioner). Conclusions: Without improvements in CAD effectiveness (e.g. a decrease in the recall rate) CAD is unlikely to be a cost effective alternative to double reading for mammography screening in the UK. This study …

  19. Modeling Land-Use Decision Behavior with Bayesian Belief Networks

    Directory of Open Access Journals (Sweden)

    Inge Aalders

    2008-06-01

    The ability to incorporate and manage the different drivers of land-use change in a modeling process is one of the key challenges because they are complex and are both quantitative and qualitative in nature. This paper uses Bayesian belief networks (BBN) to incorporate characteristics of land managers in the modeling process and to enhance our understanding of land-use change based on the limited and disparate sources of information. One of the two models based on spatial data represented land managers in the form of a quantitative variable, the area of individual holdings, whereas the other model included qualitative data from a survey of land managers. Random samples from the spatial data provided evidence of the relationship between the different variables, which I used to develop the BBN structure. The model was tested for four different posterior probability distributions, and results showed that the trained and learned models are better at predicting land use than the uniform and random models. The inference from the model demonstrated the constraints that biophysical characteristics impose on land managers; for older land managers without heirs, there is a higher probability of the land use being arable agriculture. The results show the benefits of incorporating a more complex notion of land managers in land-use models, and of using different empirical data sources in the modeling process. Future research should focus on incorporating more complex social processes into the modeling structure, as well as incorporating spatio-temporal dynamics in a BBN.

  20. Evaluating Flight Crew Performance by a Bayesian Network Model

    Directory of Open Access Journals (Sweden)

    Wei Chen

    2018-03-01

    Flight crew performance is of great significance in keeping flights safe and sound. When evaluating crew performance, quantitative detailed behavior information may not be available. The present paper introduces the Bayesian Network to perform flight crew performance evaluation, which permits the utilization of multidisciplinary sources of objective and subjective information, despite sparse behavioral data. In this paper, the causal factors are selected based on the analysis of 484 aviation accidents caused by human factors. Then, a network termed the Flight Crew Performance Model is constructed. The Delphi technique helps to gather subjective data as a supplement to objective data from accident reports. The conditional probabilities are elicited by the leaky noisy MAX model. Two ways of inference for the BN—probability prediction and probabilistic diagnosis—are used and some interesting conclusions are drawn, which could provide data support for making interventions for human error management in aviation safety.

  1. Joint Modeling of Multiple Crimes: A Bayesian Spatial Approach

    Directory of Open Access Journals (Sweden)

    Hongqiang Liu

    2017-01-01

    A multivariate Bayesian spatial modeling approach was used to jointly model the counts of two types of crime, i.e., burglary and non-motor vehicle theft, and explore the geographic pattern of crime risks and relevant risk factors. In contrast to the univariate model, which assumes independence across outcomes, the multivariate approach takes into account potential correlations between crimes. Six independent variables are included in the model as potential risk factors. In order to fully present this method, both the multivariate model and its univariate counterpart are examined. We fitted the two models to the data and assessed them using the deviance information criterion. A comparison of the results from the two models indicates that the multivariate model was superior to the univariate model. Our results show that population density and bar density are clearly associated with both burglary and non-motor vehicle theft risks and indicate a close relationship between these two types of crime. The posterior means and 2.5th percentiles of type-specific crime risks estimated by the multivariate model were mapped to uncover the geographic patterns. The implications, limitations and future work of the study are discussed in the concluding section.

  2. Cost-effective choices of marine fuels in a carbon-constrained world: results from a global energy model.

    Science.gov (United States)

    Taljegard, Maria; Brynolf, Selma; Grahn, Maria; Andersson, Karin; Johnson, Hannes

    2014-11-04

    The regionalized Global Energy Transition model has been modified to include a more detailed shipping sector in order to assess what marine fuels and propulsion technologies might be cost-effective by 2050 when achieving an atmospheric CO2 concentration of 400 or 500 ppm by the year 2100. The robustness of the results was examined in a Monte Carlo analysis, varying uncertain parameters and technology options, including the amount of primary energy resources, the availability of carbon capture and storage (CCS) technologies, and costs of different technologies and fuels. The four main findings are (i) it is cost-effective to start the phase out of fuel oil from the shipping sector in the next decade; (ii) natural gas-based fuels (liquefied natural gas and methanol) are the most probable substitutes during the study period; (iii) availability of CCS, the CO2 target, the liquefied natural gas tank cost and potential oil resources affect marine fuel choices significantly; and (iv) biofuels rarely play a major role in the shipping sector, due to limited supply and competition for bioenergy from other energy sectors.

  3. A 'cost-effective' probabilistic model to select the dominant factors affecting the variation of the component failure rate

    International Nuclear Information System (INIS)

    Kirchsteiger, C.

    1992-11-01

    Within the framework of a Probabilistic Safety Assessment (PSA), the component failure rate λ is a key parameter in the sense that the study of its behavior gives the essential information for estimating the current values as well as the trends in the failure probabilities of interest. Since there is an infinite variety of possible underlying factors which might cause changes in λ (e.g. operating time, maintenance practices, component environment, etc.), an 'importance ranking' process of these factors is considered most desirable to prioritize research efforts. To be 'cost-effective', the modeling effort must be small, i.e. essentially involving no estimation of additional parameters other than λ. In this paper, using a multivariate data analysis technique and various statistical measures, such a 'cost-effective' screening process has been developed. Dominant factors affecting the failure rate of any components of interest can easily be identified and the appropriateness of current research plans (e.g. on the necessity of performing aging studies) can be validated. (author)

  4. Replacing ambulatory surgical follow-up visits with mobile app home monitoring: modeling cost-effective scenarios.

    Science.gov (United States)

    Armstrong, Kathleen A; Semple, John L; Coyte, Peter C

    2014-09-22

    Women's College Hospital (WCH) offers specialized surgical procedures, including ambulatory breast reconstruction in post-mastectomy breast cancer patients. Most patients receiving ambulatory surgery have low rates of postoperative events necessitating clinic visits. Increasingly, mobile monitoring and follow-up care is used to overcome the distance patients must travel to receive specialized care at a reduced cost to society. WCH has completed a feasibility study using a mobile app (QoC Health Inc, Toronto) that suggests high patient satisfaction and adequate detection of postoperative complications. The proposed cost-effectiveness study models the replacement of conventional, in-person postoperative follow-up care with mobile app follow-up care following ambulatory breast reconstruction in post-mastectomy breast cancer patients. This is a societal perspective cost-effectiveness analysis, wherein all costs are assessed irrespective of the payer. The patient/caregiver, health care system, and externally borne costs are calculated within the first postoperative month based on cost information provided by WCH and QoC Health Inc. The effectiveness of telemedicine and conventional follow-up care is measured as successful surgical outcomes at 30-days postoperative, and is modeled based on previous clinical trials containing similar patient populations and surgical risks. This costing assumes that 1000 patients are enrolled in bring-your-own-device (BYOD) mobile app follow-up per year and that 1.64 in-person follow-ups are attended in the conventional arm within the first month postoperatively. The total cost difference between mobile app and in-person follow-up care is $245 CAD ($223 USD based on the current exchange rate), with in-person follow-up being more expensive ($381 CAD) than mobile app follow-up care ($136 CAD). This takes into account the total of health care system, patient, and external borne costs. If we examine health care system costs alone, in-person …

  5. Modelling of population dynamics of red king crab using Bayesian approach

    Directory of Open Access Journals (Sweden)

    Bakanev Sergey ...

    2012-10-01

    Modeling population dynamics based on the Bayesian approach makes it possible to resolve the above issues successfully. The integration of the data from various studies into a unified model based on the Bayesian parameter estimation method provides a much more detailed description of the processes occurring in the population.

  6. Bayesian leave-one-out cross-validation approximations for Gaussian latent variable models

    DEFF Research Database (Denmark)

    Vehtari, Aki; Mononen, Tommi; Tolvanen, Ville

    2016-01-01

    The future predictive performance of a Bayesian model can be estimated using Bayesian cross-validation. In this article, we consider Gaussian latent variable models where the integration over the latent values is approximated using the Laplace method or expectation propagation (EP). We study the ...

  7. BDgraph: An R Package for Bayesian Structure Learning in Graphical Models

    NARCIS (Netherlands)

    Mohammadi, A.; Wit, E.C.

    2017-01-01

    Graphical models provide powerful tools to uncover complicated patterns in multivariate data and are commonly used in Bayesian statistics and machine learning. In this paper, we introduce an R package BDgraph which performs Bayesian structure learning for general undirected graphical models with …

  8. Bayesian network as a modelling tool for risk management in agriculture

    DEFF Research Database (Denmark)

    Rasmussen, Svend; Madsen, Anders L.; Lund, Mogens

    In this paper we use Bayesian networks as an integrated modelling approach for representing uncertainty and analysing risk management in agriculture. It is shown how historical farm account data may be efficiently used to estimate conditional probabilities, which are the core elements in Bayesian network models...

  9. Some explorations into Bayesian modelling of risks due to pesticide intake from food

    OpenAIRE

    Voet, van der, H.; Paulo, M.J.

    2004-01-01

    This paper presents some common types of data and models in pesticide exposure assessment. The problems of traditional methods are discussed in connection with possibilities to address them in a Bayesian framework. We present simple Bayesian models for consumption of food and for residue monitoring data.

  10. Sensitivity of fluvial sediment source apportionment to mixing model assumptions: A Bayesian model comparison.

    Science.gov (United States)

    Cooper, Richard J; Krueger, Tobias; Hiscock, Kevin M; Rawlins, Barry G

    2014-11-01

    Mixing models have become increasingly common tools for apportioning fluvial sediment load to various sediment sources across catchments using a wide variety of Bayesian and frequentist modeling approaches. In this study, we demonstrate how different model setups can impact upon resulting source apportionment estimates in a Bayesian framework via a one-factor-at-a-time (OFAT) sensitivity analysis. We formulate 13 versions of a mixing model, each with different error assumptions and model structural choices, and apply them to sediment geochemistry data from the River Blackwater, Norfolk, UK, to apportion suspended particulate matter (SPM) contributions from three sources (arable topsoils, road verges, and subsurface material) under base flow conditions between August 2012 and August 2013. Whilst all 13 models estimate subsurface sources to be the largest contributor of SPM (median ∼76%), comparison of apportionment estimates reveal varying degrees of sensitivity to changing priors, inclusion of covariance terms, incorporation of time-variant distributions, and methods of proportion characterization. We also demonstrate differences in apportionment results between a full and an empirical Bayesian setup, and between a Bayesian and a frequentist optimization approach. This OFAT sensitivity analysis reveals that mixing model structural choices and error assumptions can significantly impact upon sediment source apportionment results, with estimated median contributions in this study varying by up to 21% between model versions. Users of mixing models are therefore strongly advised to carefully consider and justify their choice of model structure prior to conducting sediment source apportionment investigations. Key points: an OFAT sensitivity analysis of sediment fingerprinting mixing models is conducted; Bayesian models display high sensitivity to error assumptions and structural choices; source apportionment results differ between Bayesian and frequentist approaches.
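
    The core of such a Bayesian mixing model can be sketched compactly: mixture proportions combine known source tracer signatures into a predicted mixture signature, which is compared with the measured one under Gaussian error. All data below are invented for illustration, and the softmax-plus-flat-prior choice is exactly the kind of structural decision the paper shows results are sensitive to:

```python
# Sketch: Bayesian sediment-source mixing model. Proportions p combine source
# signatures S (rows = sources, columns = tracers) into a predicted mixture
# S.T @ p, compared with the measured SPM signature y. Sampled by random-walk
# Metropolis on unconstrained logits; all numbers are toy values.
import numpy as np

rng = np.random.default_rng(0)
S = np.array([[10.0, 4.0, 1.0],     # topsoil signature
              [ 6.0, 9.0, 2.0],     # road verge signature
              [ 2.0, 3.0, 8.0]])    # subsurface signature
y = np.array([4.0, 4.2, 6.1])       # measured mixture signature (toy)
tau = 0.5                           # assumed measurement error std

def log_post(z):
    p = np.exp(z - z.max()); p /= p.sum()     # softmax -> proportions
    resid = y - S.T @ p
    return -0.5 * np.sum((resid / tau) ** 2)  # flat prior on logits z

z, samples = np.zeros(3), []
for i in range(30_000):
    prop = z + rng.normal(0, 0.3, 3)
    if np.log(rng.uniform()) < log_post(prop) - log_post(z):
        z = prop
    if i >= 10_000:                           # keep post-burn-in draws
        p = np.exp(z - z.max()); p /= p.sum()
        samples.append(p)

med = np.median(samples, axis=0)
print("median source proportions (topsoil, verge, subsurface):", med.round(2))
```

    In this toy run the subsurface source dominates, echoing the qualitative finding above; changing the prior or error model would shift the medians, which is precisely the sensitivity the study documents.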

  11. Bayesian Correction for Misclassification in Multilevel Count Data Models

    Directory of Open Access Journals (Sweden)

    Tyler Nelson

    2018-01-01

    Covariate misclassification is well known to yield biased estimates in single level regression models. The impact on hierarchical count models has been less studied. A fully Bayesian approach to modeling both the misclassified covariate and the hierarchical response is proposed. Models with a single diagnostic test and with multiple diagnostic tests are considered. Simulation studies show the ability of the proposed model to appropriately account for the misclassification by reducing bias and improving performance of interval estimators. A real data example further demonstrated the consequences of ignoring the misclassification. Ignoring misclassification yielded a model that indicated there was a significant, positive impact on the number of children of females who observed spousal abuse between their parents. When the misclassification was accounted for, the relationship switched to negative, but not significant. Ignoring misclassification in standard linear and generalized linear models is well known to lead to biased results. We provide an approach to extend misclassification modeling to the important area of hierarchical generalized linear models.

  12. Cost-effective design of reinforced concrete with use of self-terminated carbonation model

    Directory of Open Access Journals (Sweden)

    Woyciechowski Piotr

    2016-01-01

    According to Eurocodes EC0 and EC2, the durability of a concrete structure with respect to carbonation is assured by selecting a suitable thickness of the reinforcement's concrete cover. The selection is made on the basis of the structure category and concrete strength class, regardless of the concrete's material composition or technological type; the selected value is therefore an estimate, and often an exaggerated one. The paper presents an elaborated "self-terminated carbonation model" that includes the abovementioned factors and makes it possible to indicate the maximum possible depth of carbonation. The model, in contrast to the parabolic models published in the literature, describes carbonation depth as a hyperbolic function of time. The paper explains why such a model describes the phenomenon of carbonation better than others, and it contains an example calculation of the cover thickness using that model.
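
    The qualitative contrast between an unbounded parabolic law and a self-terminating law with a finite asymptote can be illustrated as follows. Both functional forms and all constants here are generic placeholders chosen for illustration, not the paper's fitted model:

```python
# Sketch: a classical parabolic carbonation law, x(t) = k * sqrt(t), grows
# without bound, while a self-terminating hyperbolic law approaches a finite
# asymptote, illustrated here by x(t) = x_max * t / (t + c). Hypothetical
# constants; not the paper's model.
import numpy as np

k, x_max, c = 2.0, 20.0, 25.0          # hypothetical constants (mm, years)
for t in (1, 5, 25, 50, 100):          # exposure time in years
    parabolic = k * np.sqrt(t)
    hyperbolic = x_max * t / (t + c)
    print(f"t={t:3d} y: parabolic {parabolic:5.1f} mm, "
          f"self-terminating {hyperbolic:5.1f} mm (asymptote {x_max} mm)")
```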

  13. Bayesian network models for error detection in radiotherapy plans

    Science.gov (United States)

    Kalet, Alan M.; Gennari, John H.; Ford, Eric C.; Phillips, Mark H.

    2015-04-01

    The purpose of this study is to design and develop a probabilistic network for detecting errors in radiotherapy plans for use at the time of initial plan verification. Our group has initiated a multi-pronged approach to reduce these errors. We report on our development of Bayesian models of radiotherapy plans. Bayesian networks consist of joint probability distributions that define the probability of one event, given some set of other known information. Using the networks, we find the probability of obtaining certain radiotherapy parameters, given a set of initial clinical information. A low probability in a propagated network then corresponds to potential errors to be flagged for investigation. To build our networks we first interviewed medical physicists and other domain experts to identify the relevant radiotherapy concepts and their associated interdependencies and to construct a network topology. Next, to populate the network’s conditional probability tables, we used the Hugin Expert software to learn parameter distributions from a subset of de-identified data derived from a radiation oncology based clinical information database system. These data represent 4990 unique prescription cases over a 5 year period. Under test case scenarios with approximately 1.5% introduced error rates, network performance produced areas under the ROC curve of 0.88, 0.98, and 0.89 for the lung, brain and female breast cancer error detection networks, respectively. Comparison of the brain network to human experts' performance (AUC of 0.90 ± 0.01) shows the Bayes network model performs better than domain experts under the same test conditions. Our results demonstrate the feasibility and effectiveness of comprehensive probabilistic models as part of decision support systems for improved detection of errors in initial radiotherapy plan verification procedures.
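
    The AUC figures reported here can be reproduced, in spirit, from any set of detector scores via the rank (Mann-Whitney) identity. Scores and labels below are synthetic, roughly mimicking the ~1.5% error prevalence of the test scenarios:

```python
# Sketch: area under the ROC curve for an error-detection score, computed
# as P(score_error > score_ok) with ties counted as 1/2. Synthetic scores.
import numpy as np

rng = np.random.default_rng(0)
scores_error = rng.normal(2.0, 1.0, 15)   # plans with introduced errors
scores_ok = rng.normal(0.0, 1.0, 985)     # error-free plans (~1.5% error rate)

def auc(pos, neg):
    diff = pos[:, None] - neg[None, :]
    return (diff > 0).mean() + 0.5 * (diff == 0).mean()

print(f"AUC = {auc(scores_error, scores_ok):.3f}")  # around 0.9 here
```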

  14. Evaluation of a Stratified National Breast Screening Program in the United Kingdom: An Early Model-Based Cost-Effectiveness Analysis.

    Science.gov (United States)

    Gray, Ewan; Donten, Anna; Karssemeijer, Nico; van Gils, Carla; Evans, D Gareth; Astley, Sue; Payne, Katherine

    2017-09-01

    To identify the incremental costs and consequences of stratified national breast screening programs (stratified NBSPs) and drivers of relative cost-effectiveness. A decision-analytic model (discrete event simulation) was conceptualized to represent four stratified NBSPs (risk 1, risk 2, masking [supplemental screening for women with higher breast density], and masking and risk 1) compared with the current UK NBSP and no screening. The model assumed a lifetime horizon, the health service perspective to identify costs (£, 2015), and measured consequences in quality-adjusted life-years (QALYs). Multiple data sources were used: systematic reviews of effectiveness and utility, published studies reporting costs, and cohort studies embedded in existing NBSPs. Model parameter uncertainty was assessed using probabilistic sensitivity analysis and one-way sensitivity analysis. The base-case analysis, supported by probabilistic sensitivity analysis, suggested that the risk-stratified NBSPs (risk 1 and risk 2) were relatively cost-effective when compared with the current UK NBSP, with incremental cost-effectiveness ratios of £16,689 per QALY and £23,924 per QALY, respectively. Stratified NBSP including masking approaches (supplemental screening for women with higher breast density) was not a cost-effective alternative, with incremental cost-effectiveness ratios of £212,947 per QALY (masking) and £75,254 per QALY (risk 1 and masking). When compared with no screening, all stratified NBSPs could be considered cost-effective. Key drivers of cost-effectiveness were discount rate, natural history model parameters, mammographic sensitivity, and biopsy rates for recalled cases. A key assumption was that the risk model used in the stratification process was perfectly calibrated to the population. This early model-based cost-effectiveness analysis provides indicative evidence for decision makers to understand the key drivers of costs and QALYs for exemplar stratified NBSP.

  15. Large scale Bayesian nuclear data evaluation with consistent model defects

    International Nuclear Information System (INIS)

    Schnabel, G

    2015-01-01

    The aim of nuclear data evaluation is the reliable determination of cross sections and related quantities of the atomic nuclei. To this end, evaluation methods are applied which combine the information of experiments with the results of model calculations. The evaluated observables with their associated uncertainties and correlations are assembled into data sets, which are required for the development of novel nuclear facilities, such as fusion reactors for energy supply, and accelerator driven systems for nuclear waste incineration. The efficiency and safety of such future facilities are dependent on the quality of these data sets and thus also on the reliability of the applied evaluation methods. This work investigated the performance of the majority of available evaluation methods in two scenarios. The study highlighted an essential but frequently ignored component of these methods: the deficiency of nuclear models. Usually, nuclear models are based on approximations and thus their predictions may deviate from reliable experimental data. As demonstrated in this thesis, neglecting this possibility in evaluation methods can lead to estimates of observables which are inconsistent with experimental data. Because of this finding, an extension of Bayesian evaluation methods is proposed to take into account the deficiency of the nuclear models. The deficiency is modeled as a random function in terms of a Gaussian process and combined with the model prediction. This novel formulation conserves sum rules and allows explicit estimation of the magnitude of the model deficiency. Both features have so far been missing from available evaluation methods. Furthermore, two improvements of existing methods have been developed in the course of this thesis. The first improvement concerns methods relying on Monte Carlo sampling. A Metropolis-Hastings scheme with a specific proposal distribution is suggested, which proved to be more efficient in the studied scenarios than the

  16. Bayesian analysis of a reduced-form air quality model.

    Science.gov (United States)

    Foley, Kristen M; Reich, Brian J; Napelenok, Sergey L

    2012-07-17

    Numerical air quality models are being used for assessing emission control strategies for improving ambient pollution levels across the globe. This paper applies probabilistic modeling to evaluate the effectiveness of emission reduction scenarios aimed at lowering ground-level ozone concentrations. A Bayesian hierarchical model is used to combine air quality model output and monitoring data in order to characterize the impact of emissions reductions while accounting for different degrees of uncertainty in the modeled emissions inputs. The probabilistic model predictions are weighted based on population density in order to better quantify the societal benefits/disbenefits of four hypothetical emission reduction scenarios in which domain-wide NOx emissions from various sectors are reduced individually and then simultaneously. Cross validation analysis shows the statistical model performs well compared to observed ozone levels. Accounting for the variability and uncertainty in the emissions and atmospheric systems being modeled is shown to impact how emission reduction scenarios would be ranked, compared to standard methodology.
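
    The core of such a combination step can be sketched with a conjugate normal-normal update, treating the air quality model output as a prior that monitor readings then refine (all numbers are hypothetical; the paper's hierarchical model is substantially richer):

        import numpy as np

        def combine(model_mean, model_sd, obs, obs_sd):
            # Precision-weighted posterior for the true pollutant level at a site.
            prior_prec = 1.0 / model_sd**2
            obs_prec = len(obs) / obs_sd**2
            post_var = 1.0 / (prior_prec + obs_prec)
            post_mean = post_var * (prior_prec * model_mean + obs_prec * np.mean(obs))
            return post_mean, np.sqrt(post_var)

        monitors = np.array([68.0, 72.5, 70.1])          # observed ozone (ppb)
        print(combine(75.0, 5.0, monitors, obs_sd=4.0))  # pulled toward the monitors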

  17. A Bayesian Attractor Model for Perceptual Decision Making

    Science.gov (United States)

    Bitzer, Sebastian; Bruineberg, Jelle; Kiebel, Stefan J.

    2015-01-01

    Even for simple perceptual decisions, the mechanisms that the brain employs are still under debate. Although current consensus states that the brain accumulates evidence extracted from noisy sensory information, open questions remain about how this simple model relates to other perceptual phenomena such as flexibility in decisions, decision-dependent modulation of sensory gain, or confidence about a decision. We propose a novel approach of how perceptual decisions are made by combining two influential formalisms into a new model. Specifically, we embed an attractor model of decision making into a probabilistic framework that models decision making as Bayesian inference. We show that the new model can explain decision making behaviour by fitting it to experimental data. In addition, the new model combines for the first time three important features: First, the model can update decisions in response to switches in the underlying stimulus. Second, the probabilistic formulation accounts for top-down effects that may explain recent experimental findings of decision-related gain modulation of sensory neurons. Finally, the model computes an explicit measure of confidence which we relate to recent experimental evidence for confidence computations in perceptual decision tasks. PMID:26267143

  18. Bayesian inference and model comparison for metallic fatigue data

    KAUST Repository

    Babuska, Ivo

    2016-01-06

    In this work, we present a statistical treatment of stress-life (S-N) data drawn from a collection of records of fatigue experiments that were performed on 75S-T6 aluminum alloys. Our main objective is to predict the fatigue life of materials by providing a systematic approach to model calibration, model selection and model ranking with reference to S-N data. To this purpose, we consider fatigue-limit models and random fatigue-limit models that are specially designed to allow the treatment of the run-outs (right-censored data). We first fit the models to the data by maximum likelihood methods and estimate the quantiles of the life distribution of the alloy specimen. We then compare and rank the models by classical measures of fit based on information criteria. We also consider a Bayesian approach that provides, under the prior distribution of the model parameters selected by the user, their simulation-based posterior distributions.
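
    The treatment of run-outs can be sketched as a censored maximum likelihood fit: failures contribute the density, run-outs the survival function. A minimal version for a lognormal life distribution at a single stress level (the data are invented; a real S-N fit would also regress on stress, as the fatigue-limit models do):

        import numpy as np
        from scipy import stats, optimize

        # Cycles to failure; the last two specimens are run-outs censored at 1e6.
        life = np.array([1.2e5, 2.3e5, 3.1e5, 5.0e5, 1.0e6, 1.0e6])
        censored = np.array([False, False, False, False, True, True])

        def neg_log_lik(theta):
            mu, log_sigma = theta
            sigma = np.exp(log_sigma)
            z = (np.log(life) - mu) / sigma
            ll = np.where(censored,
                          stats.norm.logsf(z),                          # P(life > t)
                          stats.norm.logpdf(z) - np.log(sigma * life))  # lognormal density
            return -ll.sum()

        res = optimize.minimize(neg_log_lik, x0=[np.log(3e5), 0.0])
        print(res.x[0], np.exp(res.x[1]))  # fitted mu and sigma of log-life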

  19. Comparative review of three cost-effectiveness models for rotavirus vaccines in national immunization programs; a generic approach applied to various regions in the world

    Directory of Open Access Journals (Sweden)

    Tu Hong-Anh

    2011-07-01

    Full Text Available Abstract Background This study aims to critically review available cost-effectiveness models for rotavirus vaccination, compare their designs using a standardized approach and compare similarities and differences in cost-effectiveness outcomes using a uniform set of input parameters. Methods We identified various models used to estimate the cost-effectiveness of rotavirus vaccination. From these, results using a standardized dataset for four regions in the world could be obtained for three specific applications. Results Despite differences in the approaches and individual constituting elements, including costs, QALYs (quality-adjusted life years), and deaths, cost-effectiveness results of the models were quite similar. Differences between the models on the individual components of cost-effectiveness could be related to some specific features of the respective models. Sensitivity analysis revealed that cost-effectiveness of rotavirus vaccination is highly sensitive to vaccine prices, rotavirus-associated mortality and discount rates, in particular that for QALYs. Conclusions The comparative approach followed here is helpful in understanding the various models selected and will thus benefit (low-income) countries in designing their own cost-effectiveness analyses using new or adapted existing models. Potential users of the models in low- and middle-income countries need to consider results from existing studies and reviews. There will be a need for contextualization including the use of country-specific data inputs. However, given that the underlying biological and epidemiological mechanisms do not change between countries, users are likely to be able to adapt existing model designs rather than developing completely new approaches. Also, the communication established between the individual researchers involved in the three models is helpful in the further development of these individual models. Therefore, we recommend that this kind of comparative study

  20. Assimilating multi-source uncertainties of a parsimonious conceptual hydrological model using hierarchical Bayesian modeling

    Science.gov (United States)

    Wei Wu; James Clark; James Vose

    2010-01-01

    Hierarchical Bayesian (HB) modeling allows for multiple sources of uncertainty by factoring complex relationships into conditional distributions that can be used to draw inference and make predictions. We applied an HB model to estimate the parameters and state variables of a parsimonious hydrological model – GR4J – by coherently assimilating the uncertainties from the...

  1. Modelling the impact and cost-effectiveness of the HIV intervention programme amongst commercial sex workers in Ahmedabad, Gujarat, India

    Directory of Open Access Journals (Sweden)

    Foss Anna M

    2007-08-01

    Full Text Available Abstract Background Ahmedabad is an industrial city in Gujarat, India. In 2003, the HIV prevalence among commercial sex workers (CSWs) in Ahmedabad reached 13.0%. In response, the Jyoti Sangh HIV prevention programme for CSWs was initiated, which involves outreach, peer education, condom distribution, and free STD clinics. Two surveys were performed among CSWs in 1999 and 2003. This study estimates the cost-effectiveness of the Jyoti Sangh HIV prevention programme. Methods A dynamic mathematical model was used with survey and intervention-specific data from Ahmedabad to estimate the HIV impact of the Jyoti Sangh project for the 51 months between the two CSW surveys. Uncertainty analysis was used to obtain different model fits to the HIV/STI epidemiological data, producing a range for the HIV impact of the project. Financial and economic costs of the intervention were estimated from the provider's perspective for the same time period. The cost per HIV infection averted was estimated. Results Over 51 months, projections suggest that the intervention averted 624 and 5,131 HIV cases among the CSWs and their clients, respectively. This equates to a 54% and 51% decrease in the HIV infections that would have occurred among the CSWs and clients without the intervention. In the absence of intervention, the model predicts that the HIV prevalence amongst the CSWs in 2003 would have been 26%, almost twice that with the intervention. Cost per HIV infection averted, excluding and including peer educator economic costs, was USD 59 and USD 98 respectively. Conclusion This study demonstrated that targeted CSW interventions in India can be cost-effective, and highlights the importance of replicating this effort in other similar settings.
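
    The headline ratio is simply programme cost divided by the modelled number of infections averted. A small reconstruction using the point estimates above (the implied total is back-calculated for illustration and is not a figure quoted in the study):

        def cost_per_infection_averted(total_cost_usd: float, infections_averted: int) -> float:
            return total_cost_usd / infections_averted

        averted = 624 + 5131          # CSWs + clients over 51 months, from the model
        implied_total = 98 * averted  # back-calculated from the USD 98/case figure
        print(implied_total)          # approx USD 563,990 total economic cost
        print(cost_per_infection_averted(implied_total, averted))  # 98.0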

  2. iSEDfit: Bayesian spectral energy distribution modeling of galaxies

    Science.gov (United States)

    Moustakas, John

    2017-08-01

    iSEDfit uses Bayesian inference to extract the physical properties of galaxies from their observed broadband photometric spectral energy distribution (SED). In its default mode, the inputs to iSEDfit are the measured photometry (fluxes and corresponding inverse variances) and a measurement of the galaxy redshift. Alternatively, iSEDfit can be used to estimate photometric redshifts from the input photometry alone. After the priors have been specified, iSEDfit calculates the marginalized posterior probability distributions for the physical parameters of interest, including the stellar mass, star-formation rate, dust content, star formation history, and stellar metallicity. iSEDfit also optionally computes K-corrections and produces multiple "quality assurance" (QA) plots at each stage of the modeling procedure to aid in the interpretation of the prior parameter choices and subsequent fitting results. The software is distributed as part of the impro IDL suite.

  3. Designing and testing inflationary models with Bayesian networks

    International Nuclear Information System (INIS)

    Price, Layne C.; Auckland Univ.; Peiris, Hiranya V.; Frazer, Jonathan; Univ. of the Basque Country, Bilbao; Basque Foundation for Science, Bilbao; Easther, Richard

    2015-11-01

    Even simple inflationary scenarios have many free parameters. Beyond the variables appearing in the inflationary action, these include dynamical initial conditions, the number of fields, and couplings to other sectors. These quantities are often ignored but cosmological observables can depend on the unknown parameters. We use Bayesian networks to account for a large set of inflationary parameters, deriving generative models for the primordial spectra that are conditioned on a hierarchical set of prior probabilities describing the initial conditions, reheating physics, and other free parameters. We use N_f-quadratic inflation as an illustrative example, finding that the number of e-folds N_* between horizon exit for the pivot scale and the end of inflation is typically the most important parameter, even when the number of fields, their masses and initial conditions are unknown, along with possible conditional dependencies between these parameters.

  4. Bayesian nonparametric meta-analysis using Polya tree mixture models.

    Science.gov (United States)

    Branscum, Adam J; Hanson, Timothy E

    2008-09-01

    A common goal in meta-analysis is estimation of a single effect measure using data from several studies that are each designed to address the same scientific inquiry. Because studies are typically conducted in geographically dispersed locations, recent developments in the statistical analysis of meta-analytic data involve the use of random effects models that account for study-to-study variability attributable to differences in environments, demographics, genetics, and other sources that lead to heterogeneity in populations. Stemming from asymptotic theory, study-specific summary statistics are modeled according to normal distributions with means representing latent true effect measures. A parametric approach subsequently models these latent measures using a normal distribution, which is strictly a convenient modeling assumption absent of theoretical justification. To eliminate the influence of overly restrictive parametric models on inferences, we consider a broader class of random effects distributions. We develop a novel hierarchical Bayesian nonparametric Polya tree mixture (PTM) model. We present methodology for testing the PTM versus a normal random effects model. These methods provide researchers with a straightforward approach for conducting a sensitivity analysis of the normality assumption for random effects. An application involving meta-analysis of epidemiologic studies designed to characterize the association between alcohol consumption and breast cancer is presented; together with results from simulated data, it highlights the performance of PTMs in the presence of nonnormality of effect measures in the source population.

  5. Modeling the cost effectiveness of malaria control interventions in the highlands of western Kenya

    NARCIS (Netherlands)

    Stuckey, E.M.; Stevenson, J.; Galactionova, K.; Baidjoe, A.Y.; Bousema, T.; Odongo, W.; Kariuki, S.; Drakeley, C.; Smith, T.A.; Cox, J.; Chitnis, N.

    2014-01-01

    INTRODUCTION: Tools that allow for in silico optimization of available malaria control strategies can assist the decision-making process for prioritizing interventions. The OpenMalaria stochastic simulation modeling platform can be applied to simulate the impact of interventions singly and in

  6. Bayesian modeling of ChIP-chip data using latent variables

    Directory of Open Access Journals (Sweden)

    Tian Yanan

    2009-10-01

    Full Text Available Abstract Background The ChIP-chip technology has been used in a wide range of biomedical studies, such as identification of human transcription factor binding sites, investigation of DNA methylation, and investigation of histone modifications in animals and plants. Various methods have been proposed in the literature for analyzing the ChIP-chip data, such as the sliding window methods, the hidden Markov model-based methods, and Bayesian methods. Although Bayesian methods can potentially work better than the other two classes of methods because they integrate uncertainty about the models and model parameters, the existing Bayesian methods do not perform satisfactorily: they usually require multiple replicates or some extra experimental information to parametrize the model, and long CPU times because of the MCMC simulations involved. Results In this paper, we propose a Bayesian latent model for the ChIP-chip data. The new model mainly differs from the existing Bayesian models, such as the joint deconvolution model, the hierarchical gamma mixture model, and the Bayesian hierarchical model, in two respects. Firstly, it works on the difference between the averaged treatment and control samples. This enables the use of a simple model for the data, which avoids the probe-specific effect and the sample (control/treatment) effect. As a consequence, this enables an efficient MCMC simulation of the posterior distribution of the model, and also makes the model more robust to outliers. Secondly, it models the neighboring dependence of probes by introducing a latent indicator vector. A truncated Poisson prior distribution is assumed for the latent indicator variable, with the rationale being justified at length. Conclusion The Bayesian latent method is successfully applied to real and ten simulated datasets, with comparisons with some of the existing Bayesian methods, hidden Markov model methods, and sliding window methods. The numerical results

  7. Bayesian modeling of ChIP-chip data using latent variables.

    KAUST Repository

    Wu, Mingqi

    2009-10-26

    BACKGROUND: The ChIP-chip technology has been used in a wide range of biomedical studies, such as identification of human transcription factor binding sites, investigation of DNA methylation, and investigation of histone modifications in animals and plants. Various methods have been proposed in the literature for analyzing the ChIP-chip data, such as the sliding window methods, the hidden Markov model-based methods, and Bayesian methods. Although Bayesian methods can potentially work better than the other two classes of methods because they integrate uncertainty about the models and model parameters, the existing Bayesian methods do not perform satisfactorily: they usually require multiple replicates or some extra experimental information to parametrize the model, and long CPU times because of the MCMC simulations involved. RESULTS: In this paper, we propose a Bayesian latent model for the ChIP-chip data. The new model mainly differs from the existing Bayesian models, such as the joint deconvolution model, the hierarchical gamma mixture model, and the Bayesian hierarchical model, in two respects. Firstly, it works on the difference between the averaged treatment and control samples. This enables the use of a simple model for the data, which avoids the probe-specific effect and the sample (control/treatment) effect. As a consequence, this enables an efficient MCMC simulation of the posterior distribution of the model, and also makes the model more robust to outliers. Secondly, it models the neighboring dependence of probes by introducing a latent indicator vector. A truncated Poisson prior distribution is assumed for the latent indicator variable, with the rationale being justified at length. CONCLUSION: The Bayesian latent method is successfully applied to real and ten simulated datasets, with comparisons with some of the existing Bayesian methods, hidden Markov model methods, and sliding window methods. The numerical results indicate that the

  8. A novel Bayesian hierarchical model for road safety hotspot prediction.

    Science.gov (United States)

    Fawcett, Lee; Thorpe, Neil; Matthews, Joseph; Kremer, Karsten

    2017-02-01

    In this paper, we propose a Bayesian hierarchical model for predicting accident counts in future years at sites within a pool of potential road safety hotspots. The aim is to inform road safety practitioners of the location of likely future hotspots to enable a proactive, rather than reactive, approach to road safety scheme implementation. A feature of our model is the ability to rank sites according to their potential to exceed, in some future time period, a threshold accident count which may be used as a criterion for scheme implementation. Our model specification enables the classical empirical Bayes formulation - commonly used in before-and-after studies, wherein accident counts from a single before period are used to estimate counterfactual counts in the after period - to be extended to incorporate counts from multiple time periods. This allows site-specific variations in historical accident counts (e.g. locally-observed trends) to offset estimates of safety generated by a global accident prediction model (APM), which itself is used to help account for the effects of global trend and regression-to-mean (RTM). The Bayesian posterior predictive distribution is exploited to formulate predictions and to properly quantify our uncertainty in these predictions. The main contributions of our model include (i) the ability to allow accident counts from multiple time-points to inform predictions, with counts in more recent years lending more weight to predictions than counts from time-points further in the past; (ii) where appropriate, the ability to offset global estimates of trend by variations in accident counts observed locally, at a site-specific level; and (iii) the ability to account for unknown/unobserved site-specific factors which may affect accident counts. We illustrate our model with an application to accident counts at 734 potential hotspots in the German city of Halle; we also propose some simple diagnostics to validate the predictive capability of our
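
    The exceedance ranking described above reduces to a posterior predictive probability. A minimal sketch, using simulated gamma draws as a stand-in for MCMC samples of a site's accident rate (all numbers hypothetical):

        import numpy as np

        rng = np.random.default_rng(0)

        # Stand-in posterior draws of a site's annual accident rate.
        rate_draws = rng.gamma(shape=8.0, scale=0.5, size=10_000)
        # Posterior predictive counts for next year: one Poisson draw per rate draw.
        pred_counts = rng.poisson(rate_draws)

        threshold = 6  # hypothetical scheme-implementation criterion
        print((pred_counts >= threshold).mean())  # P(next-year count >= threshold)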

  9. Taming Many-Parameter BSM Models with Bayesian Neural Networks

    Science.gov (United States)

    Kuchera, M. P.; Karbo, A.; Prosper, H. B.; Sanchez, A.; Taylor, J. Z.

    2017-09-01

    The search for physics Beyond the Standard Model (BSM) is a major focus of large-scale high energy physics experiments. One method is to look for specific deviations from the Standard Model that are predicted by BSM models. In cases where the model has a large number of free parameters, standard search methods become intractable due to computation time. This talk presents results using Bayesian Neural Networks, a supervised machine learning method, to enable the study of higher-dimensional models. The popular phenomenological Minimal Supersymmetric Standard Model was studied as an example of the feasibility and usefulness of this method. Graphics Processing Units (GPUs) are used to expedite the calculations. Cross-section predictions for 13 TeV proton collisions will be presented. My participation in the Conference Experience for Undergraduates (CEU) in 2004-2006 exposed me to the national and global significance of cutting-edge research. At the 2005 CEU, I presented work from the previous summer's SULI internship at Lawrence Berkeley Laboratory, where I learned to program while working on the Majorana Project. That work inspired me to follow a similar research path, which led me to my current work on computational methods applied to BSM physics.

  10. Bayesian Network Webserver: a comprehensive tool for biological network modeling.

    Science.gov (United States)

    Ziebarth, Jesse D; Bhattacharya, Anindya; Cui, Yan

    2013-11-01

    The Bayesian Network Webserver (BNW) is a platform for comprehensive network modeling of systems genetics and other biological datasets. It allows users to quickly and seamlessly upload a dataset, learn the structure of the network model that best explains the data, and use the model to understand relationships between network variables. Many datasets, including those used to create genetic network models, contain both discrete (e.g. genotype) and continuous (e.g. gene expression traits) variables, and BNW allows for modeling hybrid datasets. Users of BNW can incorporate prior knowledge during structure learning through an easy-to-use structural constraint interface. After structure learning, users are immediately presented with an interactive network model, which can be used to make testable hypotheses about network relationships. BNW, including a downloadable structure learning package, is available at http://compbio.uthsc.edu/BNW. (The BNW interface for adding structural constraints uses HTML5 features that are not supported by the current version of Internet Explorer. We suggest using other browsers (e.g. Google Chrome or Mozilla Firefox) when accessing BNW). ycui2@uthsc.edu. Supplementary data are available at Bioinformatics online.

  11. A Bayesian joint model of menstrual cycle length and fecundity.

    Science.gov (United States)

    Lum, Kirsten J; Sundaram, Rajeshwari; Buck Louis, Germaine M; Louis, Thomas A

    2016-03-01

    Menstrual cycle length (MCL) has been shown to play an important role in couple fecundity, which is the biologic capacity for reproduction irrespective of pregnancy intentions. However, a comprehensive assessment of its role requires a fecundity model that accounts for male and female attributes and the couple's intercourse pattern relative to the ovulation day. To this end, we employ a Bayesian joint model for MCL and pregnancy. MCLs follow a scale-multiplied (accelerated) mixture model with Gaussian and Gumbel components; the pregnancy model includes MCL as a covariate and computes the cycle-specific probability of pregnancy in a menstrual cycle conditional on the pattern of intercourse and no previous fertilization. Day-specific fertilization probability is modeled using natural cubic splines. We analyze data from the Longitudinal Investigation of Fertility and the Environment Study (the LIFE Study), a couple-based prospective pregnancy study, and find a statistically significant quadratic relation between fecundity and menstrual cycle length, after adjustment for intercourse pattern and other attributes, including male semen quality, both partners' ages, and active smoking status (determined by a baseline cotinine level above 100 ng/mL). We compare results to those produced by a more basic model and show the advantages of a more comprehensive approach. © 2015, The International Biometric Society.

  12. Bayesian analysis of inflation: Parameter estimation for single field models

    International Nuclear Information System (INIS)

    Mortonson, Michael J.; Peiris, Hiranya V.; Easther, Richard

    2011-01-01

    Future astrophysical data sets promise to strengthen constraints on models of inflation, and extracting these constraints requires methods and tools commensurate with the quality of the data. In this paper we describe ModeCode, a new, publicly available code that computes the primordial scalar and tensor power spectra for single-field inflationary models. ModeCode solves the inflationary mode equations numerically, avoiding the slow roll approximation. It is interfaced with CAMB and CosmoMC to compute cosmic microwave background angular power spectra and perform likelihood analysis and parameter estimation. ModeCode is easily extendable to additional models of inflation, and future updates will include Bayesian model comparison. Errors from ModeCode contribute negligibly to the error budget for analyses of data from Planck or other next generation experiments. We constrain representative single-field models (φ^n with n = 2/3, 1, 2, and 4; natural inflation; and 'hilltop' inflation) using current data, and provide forecasts for Planck. From current data, we obtain weak but nontrivial limits on the post-inflationary physics, which is a significant source of uncertainty in the predictions of inflationary models, while we find that Planck will dramatically improve these constraints. In particular, Planck will link the inflationary dynamics with the post-inflationary growth of the horizon, and thus begin to probe the 'primordial dark ages' between TeV and grand unified theory scale energies.

  13. Enhancing Flood Prediction Reliability Using Bayesian Model Averaging

    Science.gov (United States)

    Liu, Z.; Merwade, V.

    2017-12-01

    Uncertainty analysis is an indispensable part of modeling the hydrology and hydrodynamics of non-idealized environmental systems. Compared to reliance on prediction from one model simulation, using an ensemble of predictions that consider uncertainty from different sources is more reliable. In this study, Bayesian model averaging (BMA) is applied to the Black River watershed in Arkansas and Missouri by combining multi-model simulations to get reliable deterministic water stage and probabilistic inundation extent predictions. The simulation ensemble is generated from 81 LISFLOOD-FP subgrid model configurations that include uncertainty from channel shape, channel width, channel roughness and discharge. Model simulation outputs are trained with observed water stage data during one flood event, and BMA prediction ability is validated for another flood event. Results from this study indicate that BMA does not always outperform all members in the ensemble, but it provides relatively robust deterministic flood stage predictions across the basin. Station-based BMA (BMA_S) water stage prediction performs better than global-based BMA (BMA_G) prediction, which in turn is superior to the ensemble mean prediction. Additionally, high-frequency flood inundation extent (probability greater than 60%) in the BMA_G probabilistic map is more accurate than the probabilistic flood inundation extent based on equal weights.
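
    The BMA predictive distribution is a weighted mixture of the member forecasts; its mean and variance follow from the standard mixture identities. A minimal sketch with invented weights and member forecasts:

        import numpy as np

        weights = np.array([0.5, 0.3, 0.2])    # BMA weights (sum to 1), hypothetical
        means = np.array([10.2, 10.8, 9.9])    # member water-stage forecasts (m)
        sigmas = np.array([0.30, 0.25, 0.40])  # member predictive spreads (m)

        bma_mean = np.sum(weights * means)
        # Mixture variance = weighted within-member variance + between-member spread.
        bma_var = np.sum(weights * sigmas**2) + np.sum(weights * (means - bma_mean)**2)
        print(bma_mean, np.sqrt(bma_var))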

  14. Bayesian model ensembling using meta-trained recurrent neural networks

    NARCIS (Netherlands)

    Ambrogioni, L.; Berezutskaya, Y.; Güçlü, U.; Borne, E.W.P. van den; Güçlütürk, Y.; Gerven, M.A.J. van; Maris, E.G.G.

    2017-01-01

    In this paper we demonstrate that a recurrent neural network meta-trained on an ensemble of arbitrary classification tasks can be used as an approximation of the Bayes optimal classifier. This result is obtained by relying on the framework of ε-free approximate Bayesian inference, where the Bayesian

  15. A Bayesian Reformulation of the Extended Drift-Diffusion Model in Perceptual Decision Making

    Science.gov (United States)

    Fard, Pouyan R.; Park, Hame; Warkentin, Andrej; Kiebel, Stefan J.; Bitzer, Sebastian

    2017-01-01

    Perceptual decision making can be described as a process of accumulating evidence to a bound which has been formalized within drift-diffusion models (DDMs). Recently, an equivalent Bayesian model has been proposed. In contrast to standard DDMs, this Bayesian model directly links information in the stimulus to the decision process. Here, we extend this Bayesian model further and allow inter-trial variability of two parameters following the extended version of the DDM. We derive parameter distributions for the Bayesian model and show that they lead to predictions that are qualitatively equivalent to those made by the extended drift-diffusion model (eDDM). Further, we demonstrate the usefulness of the extended Bayesian model (eBM) for the analysis of concrete behavioral data. Specifically, using Bayesian model selection, we find evidence that including additional inter-trial parameter variability provides for a better model, when the model is constrained by trial-wise stimulus features. This result is remarkable because it was derived using just 200 trials per condition, which is typically thought to be insufficient for identifying variability parameters in DDMs. In sum, we present a Bayesian analysis, which provides for a novel and promising analysis of perceptual decision making experiments. PMID:28553219
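
    The generative process being reformulated can be sketched as an Euler simulation of a drift-diffusion trial with the two inter-trial variabilities discussed above, drift rate and starting point (parameter values are illustrative, not fitted):

        import numpy as np

        rng = np.random.default_rng(1)

        def simulate_trial(v_mean=1.0, v_sd=0.3, z_mean=0.0, z_range=0.2,
                           bound=1.0, dt=1e-3, noise=1.0):
            v = rng.normal(v_mean, v_sd)                         # this trial's drift rate
            x = z_mean + rng.uniform(-z_range / 2, z_range / 2)  # this trial's start point
            t = 0.0
            while abs(x) < bound:                                # accumulate to a bound
                x += v * dt + noise * np.sqrt(dt) * rng.normal()
                t += dt
            return t, int(x >= bound)  # (decision time, choice: 1 = upper bound)

        trials = [simulate_trial() for _ in range(200)]  # 200 trials per condition
        print(np.mean([t for t, _ in trials]),           # mean decision time
              np.mean([c for _, c in trials]))           # fraction of upper-bound choices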

  16. A Bayesian Reformulation of the Extended Drift-Diffusion Model in Perceptual Decision Making

    Directory of Open Access Journals (Sweden)

    Pouyan R. Fard

    2017-05-01

    Full Text Available Perceptual decision making can be described as a process of accumulating evidence to a bound which has been formalized within drift-diffusion models (DDMs). Recently, an equivalent Bayesian model has been proposed. In contrast to standard DDMs, this Bayesian model directly links information in the stimulus to the decision process. Here, we extend this Bayesian model further and allow inter-trial variability of two parameters following the extended version of the DDM. We derive parameter distributions for the Bayesian model and show that they lead to predictions that are qualitatively equivalent to those made by the extended drift-diffusion model (eDDM). Further, we demonstrate the usefulness of the extended Bayesian model (eBM) for the analysis of concrete behavioral data. Specifically, using Bayesian model selection, we find evidence that including additional inter-trial parameter variability provides for a better model, when the model is constrained by trial-wise stimulus features. This result is remarkable because it was derived using just 200 trials per condition, which is typically thought to be insufficient for identifying variability parameters in DDMs. In sum, we present a Bayesian analysis, which provides for a novel and promising analysis of perceptual decision making experiments.

  17. Early warning systems for the management of chronic heart failure: a systematic literature review of cost-effectiveness models.

    Science.gov (United States)

    Albuquerque De Almeida, Fernando; Al, Maiwenn; Koymans, Ron; Caliskan, Kadir; Kerstens, Ankie; Severens, Johan L

    2018-04-01

    Describing the general and methodological characteristics of decision-analytical models used in the economic evaluation of early warning systems for the management of chronic heart failure patients and performing a quality assessment of their methodological characteristics is expected to provide concise and useful insight to inform the future development of decision-analytical models in the field of heart failure management. Areas covered: The literature on decision-analytical models for the economic evaluation of early warning systems for the management of chronic heart failure patients was systematically reviewed. Nine electronic databases were searched through the combination of synonyms for heart failure and sensitive filters for cost-effectiveness and early warning systems. Expert commentary: The retrieved models show some variability with regards to their general study characteristics. Overall, they display satisfactory methodological quality, even though some points could be improved, namely on the consideration and discussion of any competing theories regarding model structure and disease progression, identification of key parameters and the use of expert opinion, and uncertainty analyses. A comprehensive definition of early warning systems and further research under this label should be pursued. To improve the transparency of economic evaluation publications, authors should make available detailed technical information regarding the published models.

  18. The Cost and Cost-Effectiveness of Scaling up Screening and Treatment of Syphilis in Pregnancy: A Model

    Science.gov (United States)

    Kahn, James G.; Jiwani, Aliya; Gomez, Gabriela B.; Hawkes, Sarah J.; Chesson, Harrell W.; Broutet, Nathalie; Kamb, Mary L.; Newman, Lori M.

    2014-01-01

    Background Syphilis in pregnancy imposes a significant global health and economic burden. More than half of cases result in serious adverse events, including infant mortality and infection. The annual global burden from mother-to-child transmission (MTCT) of syphilis is estimated at 3.6 million disability-adjusted life years (DALYs) and $309 million in medical costs. Syphilis screening and treatment is simple, effective, and affordable, yet, worldwide, most pregnant women do not receive these services. We assessed cost-effectiveness of scaling-up syphilis screening and treatment in existing antenatal care (ANC) programs in various programmatic, epidemiologic, and economic contexts. Methods and Findings We modeled the cost, health impact, and cost-effectiveness of expanded syphilis screening and treatment in ANC, compared to current services, for 1,000,000 pregnancies per year over four years. We defined eight generic country scenarios by systematically varying three factors: current maternal syphilis testing and treatment coverage, syphilis prevalence in pregnant women, and the cost of healthcare. We calculated program and net costs, DALYs averted, and net costs per DALY averted over four years in each scenario. Program costs are estimated at $4,142,287 – $8,235,796 per million pregnant women (2010 USD). Net costs, adjusted for averted medical care and current services, range from net savings of $12,261,250 to net costs of $1,736,807. The program averts an estimated 5,754 – 93,484 DALYs, yielding net savings in four scenarios, and a cost per DALY averted of $24 – $111 in the four scenarios with net costs. Results were robust in sensitivity analyses. Conclusions Eliminating MTCT of syphilis through expanded screening and treatment in ANC is likely to be highly cost-effective by WHO-defined thresholds in a wide range of settings. Countries with high prevalence, low current service coverage, and high healthcare cost would benefit most. Future analyses can be
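
    The scenario comparisons above rest on one ratio: net cost (programme cost minus averted medical care) per DALY averted. A minimal sketch with hypothetical inputs chosen inside the reported ranges:

        def net_cost_per_daly(program_cost: float, averted_medical_cost: float,
                              dalys_averted: float) -> float:
            net_cost = program_cost - averted_medical_cost
            return net_cost / dalys_averted  # negative => a net-saving scenario

        # Hypothetical scenario: $6M programme cost, $4M in averted care,
        # 40,000 DALYs averted.
        print(net_cost_per_daly(6_000_000, 4_000_000, 40_000))  # $50 per DALY averted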

  19. A budget-impact and cost-effectiveness model for second-line treatment of major depression.

    Science.gov (United States)

    Malone, Daniel C

    2007-07-01

    Depressed patients who initially fail to achieve remission when placed on a selective serotonin reuptake inhibitor (SSRI) may require a second treatment. The purpose of this study was to evaluate the effectiveness, cost, cost-effectiveness, and budget impact of second-line pharmacologic treatment for major depressive disorder (MDD). A cost-effectiveness analysis was conducted to evaluate second-line therapies (citalopram, escitalopram, fluoxetine, paroxetine, paroxetine controlled release [CR], sertraline, and venlafaxine extended release [XR]) for the treatment of depression. Effectiveness data were obtained from published clinical studies. The primary outcome was remission, defined as a score of 7 or less on the Hamilton Rating Scale for Depression (HAM-D) or a score of 10 or less on the Montgomery-Åsberg Depression Rating Scale (MADRS). The wholesale acquisition cost (WAC) for medications and medical treatment costs for depression were included. The perspective was derived from a managed care organization (MCO) with 500,000 members, a 1.9% annual incidence of depression, and treatment duration of 6 months. Assumptions included: second-line treatment is not as effective as first-line treatment, WAC price reflects MCO costs, and side effects were identical. Sensitivity analyses were conducted to determine variables that influenced the results. Second-line remission rates were 20.4% for venlafaxine XR, 16.9% for sertraline, 16.4% for escitalopram, 15.1% for generic SSRIs (weighted average), and 13.6% for paroxetine CR. Pharmacy costs ranged from $163 for generic SSRIs to $319 for venlafaxine XR. Total cost per patient achieving remission was $14,275 for venlafaxine XR, followed by $16,100 for escitalopram. The incremental cost-effectiveness ratio (ICER) for venlafaxine XR compared with generic SSRIs was $2,073 per patient achieving remission, followed by escitalopram with an ICER of $3,566. The model was most sensitive to other therapies
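
    The ICER itself is a two-line computation. A rough illustrative check against the venlafaxine-versus-generic-SSRI comparison, using only the per-patient drug costs and remission rates quoted above (the published $2,073 figure also nets out medical-cost offsets, so the drug-cost-only number differs):

        def icer(cost_new: float, cost_ref: float, eff_new: float, eff_ref: float) -> float:
            """Extra cost per additional patient achieving remission."""
            return (cost_new - cost_ref) / (eff_new - eff_ref)

        # Drug costs only: $319 vs $163; remission rates 20.4% vs 15.1%.
        print(icer(cost_new=319, cost_ref=163, eff_new=0.204, eff_ref=0.151))
        # approx $2,943 per additional remission on drug costs alone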

  20. Incorporating Parameter Uncertainty in Bayesian Segmentation Models: Application to Hippocampal Subfield Volumetry

    DEFF Research Database (Denmark)

    Iglesias, J. E.; Sabuncu, M. R.; Van Leemput, Koen

    2012-01-01

    Many successful segmentation algorithms are based on Bayesian models in which prior anatomical knowledge is combined with the available image information. However, these methods typically have many free parameters that are estimated to obtain point estimates only, whereas a faithful Bayesian anal...

  1. A Bayesian Approach to Person Fit Analysis in Item Response Theory Models. Research Report.

    Science.gov (United States)

    Glas, Cees A. W.; Meijer, Rob R.

    A Bayesian approach to the evaluation of person fit in item response theory (IRT) models is presented. In a posterior predictive check, the observed value on a discrepancy variable is positioned in its posterior distribution. In a Bayesian framework, a Markov Chain Monte Carlo procedure can be used to generate samples of the posterior distribution…

  2. Bayesian Methods for Analyzing Structural Equation Models with Covariates, Interaction, and Quadratic Latent Variables

    Science.gov (United States)

    Lee, Sik-Yum; Song, Xin-Yuan; Tang, Nian-Sheng

    2007-01-01

    The analysis of interaction among latent variables has received much attention. This article introduces a Bayesian approach to analyze a general structural equation model that accommodates the general nonlinear terms of latent variables and covariates. This approach produces a Bayesian estimate that has the same statistical optimal properties as a…

  3. More Bayesian Transdimensional Inversion for Thermal History Modelling (Invited)

    Science.gov (United States)

    Gallagher, K.

    2013-12-01

    Since the publication of Dodson (1973) quantifying the relationship between geochronological ages and closure temperatures, an ongoing concern in thermochronology is reconstruction of thermal histories consistent with the measured data. Extracting this thermal history information is best treated as an inverse problem, given the complex relationship between the observations and the thermal history. When solving the inverse problem (i.e. finding acceptable thermal histories), stochastic sampling methods have often been used, as these are relatively global when searching the model space. However, the issue remains how best to estimate those parts of the thermal history unconstrained by independent information, i.e. what is required to fit the data? To solve this general problem, we use a Bayesian transdimensional Markov Chain Monte Carlo method, and this has been integrated into user-friendly software, QTQt (Quantitative Thermochronology with Qt), which runs on both Macintosh and PC. The Bayesian approach allows us to consider a wide range of possible thermal histories as general prior information on time, temperature (and temperature offset for multiple samples in a vertical profile). We can also incorporate more focussed geological constraints in terms of more specific priors. In this framework, it is the data themselves (and their errors) that determine the complexity of the thermal history solutions. For example, more precise data will justify a more complex solution, while noisier data will be satisfied with simpler solutions. We can express complexity in terms of the number of time-temperature points defining the total thermal history. Another useful feature of this method is that we can easily deal with imprecise parameter values (e.g. kinetics, data errors) by drawing samples from a user-specified probability distribution, rather than using a single value. Finally, the method can be applied to either single samples, or multiple samples (from a borehole or

  4. Rational Irrationality: Modeling Climate Change Belief Polarization Using Bayesian Networks.

    Science.gov (United States)

    Cook, John; Lewandowsky, Stephan

    2016-01-01

    Belief polarization is said to occur when two people respond to the same evidence by updating their beliefs in opposite directions. This response is considered to be "irrational" because it involves contrary updating, a form of belief updating that appears to violate normatively optimal responding, as for example dictated by Bayes' theorem. In light of much evidence that people are capable of normatively optimal behavior, belief polarization presents a puzzling exception. We show that Bayesian networks, or Bayes nets, can simulate rational belief updating. When fit to experimental data, Bayes nets can help identify the factors that contribute to polarization. We present a study into belief updating concerning the reality of climate change in response to information about the scientific consensus on anthropogenic global warming (AGW). The study used representative samples of Australian and U.S. participants. Among Australians, consensus information partially neutralized the influence of worldview, with free-market supporters showing a greater increase in acceptance of human-caused global warming relative to free-market opponents. In contrast, while consensus information overall had a positive effect on perceived consensus among U.S. participants, there was a reduction in perceived consensus and acceptance of human-caused global warming for strong supporters of unregulated free markets. Fitting a Bayes net model to the data indicated that under a Bayesian framework, free-market support is a significant driver of beliefs about climate change and trust in climate scientists. Further, active distrust of climate scientists among a small number of U.S. conservatives drives contrary updating in response to consensus information among this particular group. Copyright © 2016 Cognitive Science Society, Inc.

  5. Model-based Bayesian signal extraction algorithm for peripheral nerves

    Science.gov (United States)

    Eggers, Thomas E.; Dweiri, Yazan M.; McCallum, Grant A.; Durand, Dominique M.

    2017-10-01

    Objective. Multi-channel cuff electrodes have recently been investigated for extracting fascicular-level motor commands from mixed neural recordings. Such signals could provide volitional, intuitive control over a robotic prosthesis for amputee patients. Recent work has demonstrated success in extracting these signals in acute and chronic preparations using spatial filtering techniques. These extracted signals, however, had low signal-to-noise ratios and thus limited their utility to binary classification. In this work a new algorithm is proposed which combines previous source localization approaches to create a model-based method which operates in real time. Approach. To validate this algorithm, a saline benchtop setup was created to allow the precise placement of artificial sources within a cuff and interference sources outside the cuff. The artificial source was taken from five seconds of chronic neural activity to replicate realistic recordings. The proposed algorithm, hybrid Bayesian signal extraction (HBSE), is then compared to previous algorithms, beamforming and a Bayesian spatial filtering method, on this test data. An example chronic neural recording is also analyzed with all three algorithms. Main results. The proposed algorithm improved the signal-to-noise and signal-to-interference ratios of extracted test signals two- to threefold, and increased the correlation coefficient between the original and recovered signals by 10–20%. These improvements translated to the chronic recording example and increased the calculated bit rate between the recovered signals and the recorded motor activity. Significance. HBSE significantly outperforms previous algorithms in extracting realistic neural signals, even in the presence of external noise sources. These results demonstrate the feasibility of extracting dynamic motor signals from a multi-fascicled intact nerve trunk, which in turn could extract motor command signals from an amputee for the end goal of

  6. Ensemble bayesian model averaging using markov chain Monte Carlo sampling

    Energy Technology Data Exchange (ETDEWEB)

    Vrugt, Jasper A [Los Alamos National Laboratory]; Diks, Cees G H [NON LANL]; Clark, Martyn P [NON LANL]

    2008-01-01

    Bayesian model averaging (BMA) has recently been proposed as a statistical method to calibrate forecast ensembles from numerical weather models. Successful implementation of BMA, however, requires accurate estimates of the weights and variances of the individual competing models in the ensemble. In their seminal paper, Raftery et al. (Mon Weather Rev 133:1155-1174, 2005) recommended the Expectation-Maximization (EM) algorithm for BMA model training, even though global convergence of this algorithm cannot be guaranteed. In this paper, we compare the performance of the EM algorithm and the recently developed Differential Evolution Adaptive Metropolis (DREAM) Markov Chain Monte Carlo (MCMC) algorithm for estimating the BMA weights and variances. Simulation experiments using 48-hour ensemble data of surface temperature and multi-model stream-flow forecasts show that both methods produce similar results, and that their performance is unaffected by the length of the training data set. However, MCMC simulation with DREAM is capable of efficiently handling a wide variety of BMA predictive distributions, and provides useful information about the uncertainty associated with the estimated BMA weights and variances.
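
    A minimal sketch of the EM iteration for Gaussian-kernel BMA, simplified to a single shared variance and synthetic data (toy member forecasts; not the paper's setup):

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(2)
        y = rng.normal(10, 1, size=200)                            # verifying observations
        # Three synthetic member forecasts with increasing error spreads.
        F = y[:, None] + rng.normal(0, [0.5, 1.0, 2.0], (200, 3))

        w = np.full(3, 1 / 3)   # initial BMA weights
        sigma2 = 1.0            # initial shared predictive variance
        for _ in range(100):
            dens = norm.pdf(y[:, None], loc=F, scale=np.sqrt(sigma2))  # n x K
            z = w * dens
            z /= z.sum(axis=1, keepdims=True)                     # E-step: responsibilities
            w = z.mean(axis=0)                                    # M-step: weights
            sigma2 = np.sum(z * (y[:, None] - F) ** 2) / len(y)   # M-step: variance
        print(w, sigma2)  # the sharpest member should receive the largest weight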

  7. Bayesian calibration of the Community Land Model using surrogates

    Energy Technology Data Exchange (ETDEWEB)

    Ray, Jaideep; Hou, Zhangshuan; Huang, Maoyi; Swiler, Laura Painton

    2014-02-01

    We present results from the Bayesian calibration of hydrological parameters of the Community Land Model (CLM), which is often used in climate simulations and Earth system models. A statistical inverse problem is formulated for three hydrological parameters, conditional on observations of latent heat surface fluxes over 48 months. Our calibration method uses polynomial and Gaussian process surrogates of the CLM, and solves the parameter estimation problem using a Markov chain Monte Carlo sampler. Posterior probability densities for the parameters are developed for two sites with different soil and vegetation covers. Our method also allows us to examine the structural error in CLM under two error models. We find that surrogate models can be created for CLM in most cases. The posterior distributions are more predictive than the default parameter values in CLM. Climatologically averaging the observations does not modify the parameters' distributions significantly. The structural error model reveals a correlation time-scale which can be used to identify the physical process that could be contributing to it. While the calibrated CLM has a higher predictive skill, the calibration is under-dispersive.
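
    Surrogate-based calibration in miniature: fit a cheap polynomial surrogate to a handful of "expensive" runs, then run a random-walk Metropolis sampler against the surrogate. Everything here (the stand-in simulator, prior range, noise level) is invented for illustration:

        import numpy as np

        rng = np.random.default_rng(3)

        def expensive_model(theta):             # stand-in for a CLM-like simulator
            return 3.0 * theta + 0.5 * theta**2

        design = np.linspace(0, 2, 8)           # training runs of the simulator
        coef = np.polyfit(design, expensive_model(design), deg=2)
        surrogate = lambda t: np.polyval(coef, t)

        y_obs = expensive_model(1.3) + rng.normal(0, 0.1)  # synthetic observation

        def log_post(theta):
            if not 0 <= theta <= 2:             # uniform prior on [0, 2]
                return -np.inf
            return -0.5 * ((y_obs - surrogate(theta)) / 0.1) ** 2

        theta, chain = 1.0, []
        for _ in range(5000):                   # random-walk Metropolis on the surrogate
            prop = theta + rng.normal(0, 0.1)
            if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
                theta = prop
            chain.append(theta)
        print(np.mean(chain[1000:]))            # posterior mean, approx 1.3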

  8. A Bayesian Model of Category-Specific Emotional Brain Responses

    Science.gov (United States)

    Wager, Tor D.; Kang, Jian; Johnson, Timothy D.; Nichols, Thomas E.; Satpute, Ajay B.; Barrett, Lisa Feldman

    2015-01-01

    Understanding emotion is critical for a science of healthy and disordered brain function, but the neurophysiological basis of emotional experience is still poorly understood. We analyzed human brain activity patterns from 148 studies of emotion categories (2159 total participants) using a novel hierarchical Bayesian model. The model allowed us to classify which of five categories—fear, anger, disgust, sadness, or happiness—is engaged by a study with 66% accuracy (43-86% across categories). Analyses of the activity patterns encoded in the model revealed that each emotion category is associated with unique, prototypical patterns of activity across multiple brain systems including the cortex, thalamus, amygdala, and other structures. The results indicate that emotion categories are not contained within any one region or system, but are represented as configurations across multiple brain networks. The model provides a precise summary of the prototypical patterns for each emotion category, and demonstrates that a sufficient characterization of emotion categories relies on (a) differential patterns of involvement in neocortical systems that differ between humans and other species, and (b) distinctive patterns of cortical-subcortical interactions. Thus, these findings are incompatible with several contemporary theories of emotion, including those that emphasize emotion-dedicated brain systems and those that propose emotion is localized primarily in subcortical activity. They are consistent with componential and constructionist views, which propose that emotions are differentiated by a combination of perceptual, mnemonic, prospective, and motivational elements. Such brain-based models of emotion provide a foundation for new translational and clinical approaches. PMID:25853490

  9. Bayesian Genomic Prediction with Genotype × Environment Interaction Kernel Models

    Science.gov (United States)

    Cuevas, Jaime; Crossa, José; Montesinos-López, Osval A.; Burgueño, Juan; Pérez-Rodríguez, Paulino; de los Campos, Gustavo

    2016-01-01

    The phenomenon of genotype × environment (G × E) interaction in plant breeding decreases selection accuracy, thereby negatively affecting genetic gains. Several genomic prediction models incorporating G × E have been recently developed and used in genomic selection of plant breeding programs. Genomic prediction models for assessing multi-environment G × E interaction are extensions of a single-environment model, and have advantages and limitations. In this study, we propose two multi-environment Bayesian genomic models: the first model considers genetic effects (u) that can be assessed by the Kronecker product of variance–covariance matrices of genetic correlations between environments and genomic kernels through markers under two linear kernel methods, linear (genomic best linear unbiased predictors, GBLUP) and Gaussian (Gaussian kernel, GK). The other model has the same genetic component as the first model (u) plus an extra component, f, that captures random effects between environments that were not captured by the random effects u. We used five CIMMYT data sets (one maize and four wheat) that were previously used in different studies. Results show that models with G × E always have superior prediction ability than single-environment models, and the higher prediction ability of multi-environment models with u and f over the multi-environment model with only u occurred 85% of the time with GBLUP and 45% of the time with GK across the five data sets. The latter result indicated that including the random effect f is still beneficial for increasing prediction ability after adjusting by the random effect u. PMID:27793970
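
    The two marker-based kernels being compared can be sketched directly from a marker matrix X (n individuals by p markers, coded 0/1/2 and column-centred). This is a simplified toy construction; the paper's models additionally involve the Kronecker structure across environments:

        import numpy as np

        rng = np.random.default_rng(4)

        X = rng.integers(0, 3, size=(50, 500)).astype(float)  # toy marker matrix
        X -= X.mean(axis=0)                                   # centre each marker

        K_gblup = X @ X.T / X.shape[1]                        # linear (GBLUP) kernel

        # Gaussian kernel from squared Euclidean distances between genotypes,
        # with the bandwidth set to the median off-diagonal distance (a common heuristic).
        sq = np.sum(X**2, axis=1)
        D2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
        h = np.median(D2[np.triu_indices_from(D2, k=1)])
        K_gk = np.exp(-D2 / h)

        print(K_gblup.shape, K_gk.shape)  # both n x n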

  10. Bayesian Genomic Prediction with Genotype × Environment Interaction Kernel Models

    Directory of Open Access Journals (Sweden)

    Jaime Cuevas

    2017-01-01

    Full Text Available The phenomenon of genotype × environment (G × E) interaction in plant breeding decreases selection accuracy, thereby negatively affecting genetic gains. Several genomic prediction models incorporating G × E have recently been developed and used in genomic selection in plant breeding programs. Genomic prediction models for assessing multi-environment G × E interaction are extensions of a single-environment model, and have advantages and limitations. In this study, we propose two multi-environment Bayesian genomic models: the first model considers genetic effects (u) that can be assessed by the Kronecker product of variance–covariance matrices of genetic correlations between environments and genomic kernels through markers under two kernel methods, linear (genomic best linear unbiased predictors, GBLUP) and Gaussian (Gaussian kernel, GK). The other model has the same genetic component as the first model (u) plus an extra component, f, that captures random effects between environments that were not captured by the random effects u. We used five CIMMYT data sets (one maize and four wheat) that were previously used in different studies. Results show that models with G × E always have better prediction ability than single-environment models, and the higher prediction ability of multi-environment models with u and f over the multi-environment model with only u occurred 85% of the time with GBLUP and 45% of the time with GK across the five data sets. The latter result indicates that including the random effect f is still beneficial for increasing prediction ability after adjusting for the random effect u.

  11. Bayesian Genomic Prediction with Genotype × Environment Interaction Kernel Models.

    Science.gov (United States)

    Cuevas, Jaime; Crossa, José; Montesinos-López, Osval A; Burgueño, Juan; Pérez-Rodríguez, Paulino; de Los Campos, Gustavo

    2017-01-05

    The phenomenon of genotype × environment (G × E) interaction in plant breeding decreases selection accuracy, thereby negatively affecting genetic gains. Several genomic prediction models incorporating G × E have recently been developed and used in genomic selection of plant breeding programs. Genomic prediction models for assessing multi-environment G × E interaction are extensions of a single-environment model, and have advantages and limitations. In this study, we propose two multi-environment Bayesian genomic models: the first model considers genetic effects (u) that can be assessed by the Kronecker product of variance-covariance matrices of genetic correlations between environments and genomic kernels through markers under two kernel methods, linear (genomic best linear unbiased predictors, GBLUP) and Gaussian (Gaussian kernel, GK). The other model has the same genetic component as the first model (u) plus an extra component, f, that captures random effects between environments that were not captured by the random effects u. We used five CIMMYT data sets (one maize and four wheat) that were previously used in different studies. Results show that models with G × E always have better prediction ability than single-environment models, and the higher prediction ability of multi-environment models with u and f over the multi-environment model with only u occurred 85% of the time with GBLUP and 45% of the time with GK across the five data sets. The latter result indicates that including the random effect f is still beneficial for increasing prediction ability after adjusting for the random effect u. Copyright © 2017 Cuevas et al.

  12. A Review of Generic Preference-Based Measures for Use in Cost-Effectiveness Models.

    Science.gov (United States)

    Brazier, John; Ara, Roberta; Rowen, Donna; Chevrou-Severac, Helene

    2017-12-01

    Generic preference-based measures (GPBMs) of health are used to obtain the quality adjustment weight required to calculate the quality-adjusted life year in health economic models. GPBMs have been developed to use across different interventions and medical conditions and typically consist of a self-complete patient questionnaire, a health state classification system, and preference weights for all states defined by the classification system. Of the six main GPBMs, the three most frequently used are the Health Utilities Index version 3, the EuroQol 5 dimensions (3 and 5 levels), and the Short Form 6 dimensions. There are considerable differences in GPBMs in terms of the content and size of descriptive systems (i.e. the numbers of dimensions of health and levels of severity within these), the methods of valuation [e.g. time trade-off (TTO), standard gamble (SG)], and the populations (e.g. general population, patients) used to value the health states within the descriptive systems. Although GPBMs are anchored at 1 (full health) and 0 (dead), they produce different health state utility values when completed by the same patient. Considerations when selecting a measure for use in a clinical trial include practicality, reliability, validity and responsiveness. Requirements of reimbursement agencies may impose additional restrictions on suitable measures for use in economic evaluations, such as the valuation technique (TTO, SG) or the source of values (general public vs. patients).

  13. Nonlinear regression modeling of nutrient loads in streams: A Bayesian approach

    Science.gov (United States)

    Qian, S.S.; Reckhow, K.H.; Zhai, J.; McMahon, G.

    2005-01-01

    A Bayesian nonlinear regression modeling method is introduced and compared with the least squares method for modeling nutrient loads in stream networks. The objective of the study is to better model spatial correlation in river basin hydrology and land use for improving the model as a forecasting tool. The Bayesian modeling approach is introduced in three steps, each with a more complicated model and data error structure. The approach is illustrated using a data set from three large river basins in eastern North Carolina. Results indicate that the Bayesian model better accounts for model and data uncertainties than does the conventional least squares approach. Applications of the Bayesian models for ambient water quality standards compliance and TMDL assessment are discussed. Copyright 2005 by the American Geophysical Union.
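
    As a concrete stand-in for the kind of Bayesian nonlinear regression described here, the sketch below fits a power-law load-flow relationship with log-normal errors by random-walk Metropolis. The data are synthetic and the error structure simplified; the paper's actual models and spatially correlated structures are richer:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: nutrient load as a power function of streamflow with
# log-normal errors (a common load-flow form; a stand-in, not the
# paper's spatially correlated model).
q = rng.uniform(1, 100, 80)                  # streamflow
b0_true, b1_true, s_true = 0.5, 1.3, 0.2
load = b0_true * q**b1_true * np.exp(rng.normal(0, s_true, q.size))

def log_post(theta):
    b0, b1, log_s = theta
    if b0 <= 0:
        return -np.inf                       # positivity constraint on b0
    s = np.exp(log_s)
    resid = np.log(load) - (np.log(b0) + b1 * np.log(q))
    return -q.size * np.log(s) - 0.5 * np.sum(resid**2) / s**2  # flat priors

# Random-walk Metropolis sampler.
theta = np.array([1.0, 1.0, 0.0])
lp = log_post(theta)
draws = []
for i in range(20000):
    prop = theta + rng.normal(0.0, [0.05, 0.02, 0.05])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # accept/reject
        theta, lp = prop, lp_prop
    if i >= 10000:                            # keep post-burn-in draws
        draws.append(theta.copy())

print(np.mean(draws, axis=0))                 # posterior means near truth
```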

  14. Accounting for Heterogeneity in Relative Treatment Effects for Use in Cost-Effectiveness Models and Value-of-Information Analyses.

    Science.gov (United States)

    Welton, Nicky J; Soares, Marta O; Palmer, Stephen; Ades, Anthony E; Harrison, David; Shankar-Hari, Manu; Rowan, Kathy M

    2015-07-01

    Cost-effectiveness analysis (CEA) models are routinely used to inform health care policy. Key model inputs include relative effectiveness of competing treatments, typically informed by meta-analysis. Heterogeneity is ubiquitous in meta-analysis, and random effects models are usually used when there is variability in effects across studies. In the absence of observed treatment effect modifiers, various summaries from the random effects distribution (random effects mean, predictive distribution, random effects distribution, or study-specific estimate [shrunken or independent of other studies]) can be used depending on the relationship between the setting for the decision (population characteristics, treatment definitions, and other contextual factors) and the included studies. If covariates have been measured that could potentially explain the heterogeneity, then these can be included in a meta-regression model. We describe how covariates can be included in a network meta-analysis model and how the output from such an analysis can be used in a CEA model. We outline a model selection procedure to help choose between competing models and stress the importance of clinical input. We illustrate the approach with a health technology assessment of intravenous immunoglobulin for the management of adult patients with severe sepsis in an intensive care setting, which exemplifies how risk of bias information can be incorporated into CEA models. We show that the results of the CEA and value-of-information analyses are sensitive to the model and highlight the importance of sensitivity analyses when conducting CEA in the presence of heterogeneity. The methods presented extend naturally to heterogeneity in other model inputs, such as baseline risk. © The Author(s) 2015.
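
    The practical difference between the random-effects summaries listed in this record can be shown in a few lines. The sketch below uses stand-in posterior draws with hypothetical values, and contrasts feeding the random-effects mean versus the predictive distribution for a new setting into a CEA model:

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in posterior draws from a random-effects meta-analysis:
# d = mean treatment effect (log odds ratio), tau = between-study SD.
# All values are hypothetical.
n_draws = 10_000
d = rng.normal(-0.3, 0.08, n_draws)          # posterior of the RE mean
tau = np.abs(rng.normal(0.2, 0.05, n_draws)) # posterior of heterogeneity

# Option 1: feed the random-effects mean into the CEA model.
effect_mean = d
# Option 2: feed the predictive distribution for a new setting, which
# adds the between-study heterogeneity back in.
effect_pred = rng.normal(d, tau)

for name, eff in [("RE mean", effect_mean), ("predictive", effect_pred)]:
    lo, hi = np.percentile(eff, [2.5, 97.5])
    print(f"{name:10s} mean={eff.mean():6.3f}  95% CrI=({lo:6.3f}, {hi:6.3f})")
# The predictive interval is wider, so it propagates more uncertainty
# into the cost-effectiveness and value-of-information results.
```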

  15. Calibration and Uncertainty Analysis of the SWAT Model Using Genetic Algorithms and Bayesian Model Averaging

    Science.gov (United States)

    In this paper, the Genetic Algorithms (GA) and Bayesian model averaging (BMA) were combined to simultaneously conduct calibration and uncertainty analysis for the Soil and Water Assessment Tool (SWAT). In this hybrid method, several SWAT models with different structures are first selected; next GA i...

  16. Confronting different models of community structure to species-abundance data: a Bayesian model comparison

    NARCIS (Netherlands)

    Etienne, RS; Olff, H

    Species abundances are undoubtedly the most widely available macroecological data, but can we use them to distinguish among several models of community structure? Here we present a Bayesian analysis of species-abundance data that yields a full joint probability distribution of each model's

  17. Confronting different models of community structure to species-abundance data: a Bayesian model comparison

    NARCIS (Netherlands)

    Etienne, R.S.; Olff, H.

    2005-01-01

    Species abundances are undoubtedly the most widely available macroecological data, but can we use them to distinguish among several models of community structure? Here we present a Bayesian analysis of species-abundance data that yields a full joint probability distribution of each model's

  18. Assessing fit in Bayesian models for spatial processes

    KAUST Repository

    Jun, M.

    2014-09-16

    © 2014 John Wiley & Sons, Ltd. Gaussian random fields are frequently used to model spatial and spatial-temporal data, particularly in geostatistical settings. As much of the attention of the statistics community has been focused on defining and estimating the mean and covariance functions of these processes, little effort has been devoted to developing goodness-of-fit tests to allow users to assess the models' adequacy. We describe a general goodness-of-fit test and related graphical diagnostics for assessing the fit of Bayesian Gaussian process models using pivotal discrepancy measures. Our method is applicable for both regularly and irregularly spaced observation locations on planar and spherical domains. The essential idea behind our method is to evaluate pivotal quantities defined for a realization of a Gaussian random field at parameter values drawn from the posterior distribution. Because the nominal distribution of the resulting pivotal discrepancy measures is known, it is possible to quantitatively assess model fit directly from the output of Markov chain Monte Carlo algorithms used to sample from the posterior distribution on the parameter space. We illustrate our method in a simulation study and in two applications.
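
    A minimal version of the pivotal-discrepancy idea, under simplifying assumptions (a 1-D squared-exponential GP and stand-in posterior draws rather than real MCMC output): whiten the data under each posterior draw and compare the squared norm with its nominal chi-square distribution.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Toy setup: data from a 1-D GP with squared-exponential covariance;
# the posterior draws of (variance, range) below are stand-ins for
# real MCMC output, and all numbers are hypothetical.
n = 40
x = np.sort(rng.uniform(0, 10, n))

def cov(sig2, rho):
    d = np.abs(x[:, None] - x[None, :])
    return sig2 * np.exp(-0.5 * (d / rho)**2) + 1e-6 * np.eye(n)

y = np.linalg.cholesky(cov(1.0, 1.5)) @ rng.standard_normal(n)
post_draws = [(rng.gamma(20, 0.05), rng.gamma(30, 0.05)) for _ in range(200)]

# Pivotal discrepancy: whiten y under each posterior draw; the squared
# norm is chi-square(n) when the model fits, so the resulting p-values
# should look roughly uniform across draws.
pvals = []
for sig2, rho in post_draws:
    z = np.linalg.solve(np.linalg.cholesky(cov(sig2, rho)), y)
    pvals.append(stats.chi2.sf(z @ z, df=n))
print(np.percentile(pvals, [5, 50, 95]))
```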

  19. Image Segmentation Using Disjunctive Normal Bayesian Shape and Appearance Models.

    Science.gov (United States)

    Mesadi, Fitsum; Erdil, Ertunc; Cetin, Mujdat; Tasdizen, Tolga

    2018-01-01

    The use of appearance and shape priors in image segmentation is known to improve accuracy; however, existing techniques have several drawbacks. For instance, most active shape and appearance models require landmark points and assume unimodal shape and appearance distributions, and the level set representation does not support construction of local priors. In this paper, we present novel appearance and shape models for image segmentation based on a differentiable implicit parametric shape representation called a disjunctive normal shape model (DNSM). The DNSM is formed by the disjunction of polytopes, which themselves are formed by the conjunctions of half-spaces. The DNSM's parametric nature allows the use of powerful local prior statistics, and its implicit nature removes the need to use landmarks and easily handles topological changes. In a Bayesian inference framework, we model arbitrary shape and appearance distributions using nonparametric density estimations, at any local scale. The proposed local shape prior results in accurate segmentation even when very few training shapes are available, because the method generates a rich set of shape variations by locally combining training samples. We demonstrate the performance of the framework by applying it to both 2-D and 3-D data sets with emphasis on biomedical image segmentation applications.
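
    The disjunctive normal construction itself is compact enough to sketch. The fragment below is illustrative only (the paper additionally learns the parameters and adds appearance terms); it evaluates a relaxed DNSM indicator as a disjunction of polytopes, each a conjunction of sigmoid-relaxed half-spaces:

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def dnsm(x, W, b):
    """Differentiable disjunctive normal shape model.

    W[i, j] : normal of half-space j in polytope i, shape (P, H, D)
    b[i, j] : offset of that half-space, shape (P, H)
    Returns values in (0, 1); ~1 inside the shape, ~0 outside.
    """
    # Conjunction: product of relaxed half-space indicators per polytope.
    half = sigmoid(np.einsum('phd,nd->nph', W, x) + b)   # (N, P, H)
    poly = half.prod(axis=2)                             # (N, P)
    # Disjunction via De Morgan: 1 - prod_i (1 - polytope_i).
    return 1.0 - np.prod(1.0 - poly, axis=1)

# Two hypothetical axis-aligned square polytopes in 2-D (4 half-spaces each).
W = np.array([[[ 10, 0], [-10, 0], [0,  10], [0, -10]],
              [[ 10, 0], [-10, 0], [0,  10], [0, -10]]], float)
b = np.array([[  0, 10,  0, 10],
              [-15, 25,  0, 10]], float)   # second square shifted in x

pts = np.array([[0.5, 0.5], [2.0, 0.5], [5.0, 5.0]])
print(dnsm(pts, W, b))   # high, high, low
```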

  20. Extraction of Airways with Probabilistic State-Space Models and Bayesian Smoothing

    DEFF Research Database (Denmark)

    Raghavendra, Selvan; Petersen, Jens; Pedersen, Jesper Johannes Holst

    of elongated branches using probabilistic state-space models and Bayesian smoothing. Unlike most existing methods that proceed with sequential tracking of branches, we present an exploratory method that is less sensitive to local anomalies in the data due to acquisition noise and/or interfering structures… The evolution of individual branches is modelled using a process model, and the observed data is incorporated into the update step of the Bayesian smoother using a measurement model that is based on a multi-scale blob detector. Bayesian smoothing is performed using the RTS (Rauch–Tung–Striebel) smoother, which…

  1. Bayesian specification analysis and estimation of simultaneous equation models using Monte Carlo methods

    NARCIS (Netherlands)

    A. Zellner (Arnold); L. Bauwens (Luc); H.K. van Dijk (Herman)

    1988-01-01

    Bayesian procedures for specification analysis or diagnostic checking of modeling assumptions for structural equations of econometric models are developed and applied using Monte Carlo numerical methods. Checks on the validity of identifying restrictions, exogeneity assumptions and other

  2. A full-capture Hierarchical Bayesian model of Pollock's Closed Robust Design and application to dolphins

    Directory of Open Access Journals (Sweden)

    Robert William Rankin

    2016-03-01

    Full Text Available We present a Hierarchical Bayesian version of Pollock's Closed Robust Design for studying the survival, temporary-migration, and abundance of marked animals. Through simulations and analyses of a bottlenose dolphin photo-identification dataset, we compare several estimation frameworks, including Maximum Likelihood estimation (ML), model-averaging by AICc, as well as Bayesian and Hierarchical Bayesian (HB) procedures. Our results demonstrate a number of advantages of the Bayesian framework over other popular methods. First, for simple fixed-effect models, we show the near-equivalence of Bayesian and ML point-estimates and confidence/credibility intervals. Second, we demonstrate how there is an inherent correlation among temporary-migration and survival parameter estimates in the PCRD, and while this can lead to serious convergence issues and singularities among MLEs, we show that the Bayesian estimates were more reliable. Third, we demonstrate that a Hierarchical Bayesian model with carefully thought-out hyperpriors can lead to similar parameter estimates and conclusions as multi-model inference by AICc model-averaging. This latter point is especially interesting for mark-recapture practitioners, for whom model-uncertainty and multi-model inference have become a major preoccupation. Lastly, we extend the Hierarchical Bayesian PCRD to include full-capture histories (i.e., by modelling a recruitment process) and individual-level heterogeneity in detection probabilities, which can have important consequences for the range of phenomena studied by the PCRD, as well as lead to large differences in abundance estimates. For example, we estimate 8%-24% more bottlenose dolphins in the western gulf of Shark Bay than previously estimated by ML and AICc-based model-averaging. Other important extensions are discussed. Our Bayesian PCRD models are written in the BUGS-like JAGS language for easy dissemination and customization by the community of capture

  3. Cost-effectiveness of a nurse practitioner-family physician model of care in a nursing home: controlled before and after study.

    Science.gov (United States)

    Lacny, Sarah; Zarrabi, Mahmood; Martin-Misener, Ruth; Donald, Faith; Sketris, Ingrid; Murphy, Andrea L; DiCenso, Alba; Marshall, Deborah A

    2016-09-01

    To examine the cost-effectiveness of a nurse practitioner-family physician model of care compared with family physician-only care in a Canadian nursing home. As demand for long-term care increases, alternative care models including nurse practitioners are being explored. Cost-effectiveness analysis using a controlled before-after design. The study included an 18-month 'before' period (2005-2006) and a 21-month 'after' time period (2007-2009). Data were abstracted from charts from 2008-2010. We calculated incremental cost-effectiveness ratios comparing the intervention (nurse practitioner-family physician model; n = 45) to internal (n = 65), external (n = 70) and combined internal/external family physician-only control groups, measured as the change in healthcare costs divided by the change in emergency department transfers/person-month. We assessed joint uncertainty around costs and effects using non-parametric bootstrapping and cost-effectiveness acceptability curves. Point estimates of the incremental cost-effectiveness ratio demonstrated the nurse practitioner-family physician model dominated the internal and combined control groups (i.e. was associated with smaller increases in costs and emergency department transfers/person-month). Compared with the external control, the intervention resulted in a smaller increase in costs and larger increase in emergency department transfers. Using a willingness-to-pay threshold of $1000 CAD/emergency department transfer, the probability the intervention was cost-effective compared with the internal, external and combined control groups was 26%, 21% and 25%. Due to uncertainty around the distribution of costs and effects, we were unable to make a definitive conclusion regarding the cost-effectiveness of the nurse practitioner-family physician model; however, these results suggest benefits that could be confirmed in a larger study. © 2016 John Wiley & Sons Ltd.
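
    The bootstrap/CEAC machinery in this record reduces to a short loop. The sketch below uses synthetic per-group data standing in for the study's chart-abstracted costs and transfer rates; it bootstraps the incremental net benefit at the stated $1000-per-transfer threshold, and the fraction of positive replicates is one point on the cost-effectiveness acceptability curve:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic per-resident changes standing in for the study's data:
# (cost change in CAD, change in ED transfers per person-month).
n_int, n_ctl = 45, 65
cost_int = rng.normal(120, 60, n_int); eff_int = rng.normal(-0.05, 0.10, n_int)
cost_ctl = rng.normal(180, 70, n_ctl); eff_ctl = rng.normal(-0.01, 0.10, n_ctl)

wtp = 1000.0   # CAD per ED transfer averted (the threshold in the study)
n_boot = 5000
inb = np.empty(n_boot)   # incremental net benefit per bootstrap replicate
for k in range(n_boot):
    i = rng.integers(0, n_int, n_int)        # resample with replacement
    j = rng.integers(0, n_ctl, n_ctl)
    d_cost = cost_int[i].mean() - cost_ctl[j].mean()
    d_eff = -(eff_int[i].mean() - eff_ctl[j].mean())   # transfers averted
    inb[k] = wtp * d_eff - d_cost

# One point on the cost-effectiveness acceptability curve; sweeping wtp
# traces the full curve.
print("P(cost-effective at $1000/transfer) =", (inb > 0).mean())
```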

  4. Bayesian nonparametric clustering in phylogenetics: modeling antigenic evolution in influenza.

    Science.gov (United States)

    Cybis, Gabriela B; Sinsheimer, Janet S; Bedford, Trevor; Rambaut, Andrew; Lemey, Philippe; Suchard, Marc A

    2018-01-30

    Influenza is responsible for up to 500,000 deaths every year, and antigenic variability represents much of its epidemiological burden. To visualize antigenic differences across many viral strains, antigenic cartography methods use multidimensional scaling on binding assay data to map influenza antigenicity onto a low-dimensional space. Analysis of such assay data ideally leads to natural clustering of influenza strains of similar antigenicity that correlate with sequence evolution. To understand the dynamics of these antigenic groups, we present a framework that jointly models genetic and antigenic evolution by combining multidimensional scaling of binding assay data, Bayesian phylogenetic machinery and nonparametric clustering methods. We propose a phylogenetic Chinese restaurant process that extends the current process to incorporate the phylogenetic dependency structure between strains in the modeling of antigenic clusters. With this method, we are able to use the genetic information to better understand the evolution of antigenicity throughout epidemics, as shown in applications of this model to H1N1 influenza. Copyright © 2017 John Wiley & Sons, Ltd.

  5. A Bayesian semiparametric Markov regression model for juvenile dermatomyositis.

    Science.gov (United States)

    De Iorio, Maria; Gallot, Natacha; Valcarcel, Beatriz; Wedderburn, Lucy

    2018-02-20

    Juvenile dermatomyositis (JDM) is a rare autoimmune disease that may lead to serious complications, even to death. We develop a 2-state Markov regression model in a Bayesian framework to characterise disease progression in JDM over time and gain a better understanding of the factors influencing disease risk. The transition probabilities between disease and remission state (and vice versa) are a function of time-homogeneous and time-varying covariates. These latter types of covariates are introduced in the model through a latent health state function, which describes patient-specific health over time and accounts for variability among patients. We assume a nonparametric prior based on the Dirichlet process to model the health state function and the baseline transition intensities between disease and remission state and vice versa. The Dirichlet process induces a clustering of the patients in homogeneous risk groups. To highlight clinical variables that most affect the transition probabilities, we perform variable selection using spike and slab prior distributions. Posterior inference is performed through Markov chain Monte Carlo methods. Data were made available from the UK JDM Cohort and Biomarker Study and Repository, hosted at the UCL Institute of Child Health. Copyright © 2018 John Wiley & Sons, Ltd.

  6. Bayesian network model of crowd emotion and negative behavior

    Science.gov (United States)

    Ramli, Nurulhuda; Ghani, Noraida Abdul; Hatta, Zulkarnain Ahmad; Hashim, Intan Hashimah Mohd; Sulong, Jasni; Mahudin, Nor Diana Mohd; Rahman, Shukran Abd; Saad, Zarina Mat

    2014-12-01

    The effects of overcrowding have become a major concern for event organizers. One aspect of this concern has been the idea that overcrowding can enhance the occurrence of serious incidents during events. As one of the largest Muslim religious gatherings, attended by pilgrims from all over the world, Hajj has become extremely overcrowded, with many incidents being reported. The purpose of this study is to analyze the nature of human emotion and negative behavior resulting from overcrowding during Hajj events, using data gathered in the Malaysian Hajj Experience Survey in 2013. The sample comprised 147 Malaysian pilgrims (70 males and 77 females). Utilizing a probabilistic model called a Bayesian network, this paper models the dependence structure between different emotions and negative behaviors of pilgrims in the crowd. The model included five emotion variables (negative, negative comfortable, positive, positive comfortable and positive spiritual) and two negative-behavior variables (aggressive and hazardous acts). The study demonstrated that negative, negative comfortable, positive spiritual and positive emotions have a direct influence on aggressive behavior, whereas negative comfortable, positive spiritual and positive emotions have a direct influence on hazardous acts. The sensitivity analysis showed that a low level of negative and negative comfortable emotions leads to a lower level of aggressive and hazardous behavior. Findings of the study can be further developed to identify the exact causes and risk factors of crowd-related incidents, helping to prevent crowd disasters during mass gathering events.
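
    In miniature, the kind of inference such a Bayesian network supports looks like the following (a two-parent toy network; all probabilities are illustrative placeholders, not the paper's estimates):

```python
# Minimal two-parent Bayesian network sketch.  All probabilities are
# illustrative placeholders, not estimates from the survey data.

# Conditional probability table P(aggressive = yes | neg, spirit),
# indexed by (negative emotion high?, positive spiritual emotion high?).
p_aggr = {(1, 1): 0.20, (1, 0): 0.45,
          (0, 1): 0.05, (0, 0): 0.15}

def marginal_aggression(p_neg, p_spirit):
    """Marginal P(aggressive) by enumerating the two parent nodes."""
    return sum(p_aggr[(n, s)]
               * (p_neg if n else 1 - p_neg)
               * (p_spirit if s else 1 - p_spirit)
               for n in (0, 1) for s in (0, 1))

print(f"baseline:              P(aggressive) = {marginal_aggression(0.3, 0.7):.3f}")
# Sensitivity analysis in miniature: lowering negative emotion lowers
# the marginal probability of aggressive behavior, as in the study.
print(f"less negative emotion: P(aggressive) = {marginal_aggression(0.1, 0.7):.3f}")
```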

  7. Flexible Bayesian Dynamic Modeling of Covariance and Correlation Matrices

    KAUST Repository

    Lan, Shiwei

    2017-11-08

    Modeling covariance (and correlation) matrices is a challenging problem due to the large dimensionality and positive-definiteness constraint. In this paper, we propose a novel Bayesian framework based on decomposing the covariance matrix into variance and correlation matrices. The highlight is that the correlations are represented as products of vectors on unit spheres. We propose a variety of distributions on spheres (e.g. the squared-Dirichlet distribution) to induce flexible prior distributions for covariance matrices that go beyond the commonly used inverse-Wishart prior. To handle the intractability of the resulting posterior, we introduce the adaptive Δ-Spherical Hamiltonian Monte Carlo. We also extend our structured framework to dynamic cases and introduce unit-vector Gaussian process priors for modeling the evolution of correlation among multiple time series. Using an example of a Normal-Inverse-Wishart problem, a simulated periodic process, and an analysis of local field potential data (collected from the hippocampus of rats performing a complex sequence memory task), we demonstrate the validity and effectiveness of our proposed framework for (dynamic) modeling of covariance and correlation matrices.
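
    The sphere-product representation of correlations is simple to reproduce: if each row of V is a unit vector, then VVᵀ is positive semi-definite with ones on the diagonal by construction. A small sketch follows (not the paper's sampler, which places distributions on the sphere coordinates and uses spherical HMC):

```python
import numpy as np

rng = np.random.default_rng(5)

def random_correlation(d, k=None, rng=rng):
    """Correlation matrix from unit vectors: C[i, j] = <v_i, v_j>.

    Each row of V lies on the unit sphere, so C is positive
    semi-definite with ones on the diagonal by construction.
    """
    k = k or d
    V = rng.standard_normal((d, k))
    V /= np.linalg.norm(V, axis=1, keepdims=True)
    return V @ V.T

def covariance(sd, corr):
    # The paper's decomposition: Sigma = diag(sd) @ C @ diag(sd).
    return np.outer(sd, sd) * corr

C = random_correlation(4)
Sigma = covariance(np.array([1.0, 0.5, 2.0, 1.5]), C)
print(np.diag(C))                                    # all ones
print(np.all(np.linalg.eigvalsh(Sigma) >= -1e-10))   # PSD check
```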

  8. Parameter Estimation of Structural Equation Modeling Using Bayesian Approach

    Directory of Open Access Journals (Sweden)

    Dewi Kurnia Sari

    2016-05-01

    Full Text Available Leadership is a process of influencing, directing or setting an example for employees in order to achieve the objectives of the organization, and it is a key element in organizational effectiveness. In addition to leadership style, the success of an organization in achieving its objectives can also be influenced by organizational commitment, the commitment made by each individual for the betterment of the organization. The purpose of this research is to model the effects of leadership style and organizational commitment on job satisfaction and employee performance, and to determine the factors that influence job satisfaction and employee performance, using SEM with a Bayesian approach. This research was conducted on 15 Statistics FNI employees in Malang. The results showed that, in the measurement model, all indicators significantly measure their respective latent variables. In the structural model, Leadership Style and Organizational Commitment have significant direct effects on Job Satisfaction, and Job Satisfaction has a significant effect on Employee Performance, whereas the direct effects of Leadership Style and Organizational Commitment on Employee Performance were found to be insignificant.

  9. Bayesian models for astrophysical data using R, JAGS, Python, and Stan

    CERN Document Server

    Hilbe, Joseph M; Ishida, Emille E O

    2017-01-01

    This comprehensive guide to Bayesian methods in astronomy enables hands-on work by supplying complete R, JAGS, Python, and Stan code, to use directly or to adapt. It begins by examining the normal model from both frequentist and Bayesian perspectives and then progresses to a full range of Bayesian generalized linear and mixed or hierarchical models, as well as additional types of models such as ABC and INLA. The book provides code that is largely unavailable elsewhere and includes details on interpreting and evaluating Bayesian models. Initial discussions offer models in synthetic form so that readers can easily adapt them to their own data; later the models are applied to real astronomical data. The consistent focus is on hands-on modeling, analysis of data, and interpretations that address scientific questions. A must-have for astronomers, its concrete approach will also be attractive to researchers in the sciences more generally.

  10. Bayesian Regression of Thermodynamic Models of Redox Active Materials

    Energy Technology Data Exchange (ETDEWEB)

    Johnston, Katherine [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

    Finding a suitable functional redox material is a critical challenge to achieving scalable, economically viable technologies for storing concentrated solar energy in the form of a defected oxide. Demonstrating effectiveness for thermal storage or solar fuel is largely accomplished by using a thermodynamic model derived from experimental data. The purpose of this project is to test the accuracy of our regression model on representative data sets. Determining the accuracy of the model includes fitting the model parameters to the data, comparing models with different numbers of parameters, and analyzing the entropy and enthalpy calculated from the model. Three data sets were considered in this project: two demonstrating materials for solar fuels by water splitting and one of a material for thermal storage. Using Bayesian inference and Markov Chain Monte Carlo (MCMC), parameter estimation was performed on the three data sets. Good results were achieved, except for some deviations at the edges of the data input ranges. The evidence values were then calculated in a variety of ways and used to compare models with different numbers of parameters. It was believed that at least one of the parameters was unnecessary; comparing evidence values demonstrated that the parameter was needed on one data set and not significantly helpful on another. The entropy was calculated by taking the derivative in one variable and integrating over another, and its uncertainty was calculated by evaluating the entropy over multiple MCMC samples. Afterwards, all the parts were written up as a tutorial for the Uncertainty Quantification Toolkit (UQTk).

  11. Results from evaluations of models and cost-effectiveness tools to support introduction decisions for new vaccines need critical appraisal

    Directory of Open Access Journals (Sweden)

    Moorthy Vasee

    2011-05-01

    Full Text Available The World Health Organization (WHO) recommends that the cost-effectiveness (CE) of introducing new vaccines be considered before such a programme is implemented. However, in low- and middle-income countries (LMICs), it is often challenging to perform and interpret the results of model-based economic appraisals of vaccines that benefit from locally relevant data. As a result, WHO embarked on a series of consultations to assess economic analytical tools to support vaccine introduction decisions for pneumococcal, rotavirus and human papillomavirus vaccines. The objectives of these assessments are to provide decision makers with a menu of existing CE tools for vaccines and their characteristics rather than to endorse the use of a single tool. The outcome will provide policy makers in LMICs with information about the feasibility of applying these models to inform their own decision making. We argue that if models and CE analyses are used to inform decisions, they ought to be critically appraised beforehand, including a transparent evaluation of their structure, assumptions and data sources (in isolation or in comparison to similar tools), so that decision makers can use them while being fully aware of their robustness and limitations.

  12. Macroscopic Models of Clique Tree Growth for Bayesian Networks

    Data.gov (United States)

    National Aeronautics and Space Administration — In clique tree clustering, inference consists of propagation in a clique tree compiled from a Bayesian network. In this paper, we develop an analytical approach to...

  13. Multilevel temporal Bayesian networks can model longitudinal change in multimorbidity

    NARCIS (Netherlands)

    Lappenschaar, M.; Hommersom, A.; Lucas, P.J.; Lagro, J.; Visscher, S.; Korevaar, J.C.; Schellevis, F.G.

    2013-01-01

    Objectives Although the course of single diseases can be studied using traditional epidemiologic techniques, these methods cannot capture the complex joint evolutionary course of multiple disorders. In this study, multilevel temporal Bayesian networks were adopted to study the course of

  14. Bayesian modeling and chronological precision for Polynesian settlement of Tonga.

    Directory of Open Access Journals (Sweden)

    David Burley

    Full Text Available First settlement of Polynesia and population expansion throughout the ancestral Polynesian homeland are foundation events for global history. A precise chronology is paramount to informed archaeological interpretation of these events and their consequences. Recently applied chronometric hygiene protocols that exclude radiocarbon dates on wood charcoal without species identification all but eliminate this chronology as it has been built for the Kingdom of Tonga, the initial islands to be settled in Polynesia. In this paper we re-examine and redevelop this chronology through application of Bayesian models to the questioned suite of radiocarbon dates, but also incorporating short-lived wood charcoal dates from archived samples and high-precision U/Th dates on coral artifacts. These models provide generation-level precision, allowing us to track population migration from first Lapita occupation on the island of Tongatapu through Tonga's central and northern island groups. They further illustrate an exceptionally short duration for the initial colonizing Lapita phase and a somewhat abrupt transition to ancestral Polynesian society as it is currently defined.

  15. Bayesian mixture models for source separation in MEG

    International Nuclear Information System (INIS)

    Calvetti, Daniela; Homa, Laura; Somersalo, Erkki

    2011-01-01

    This paper discusses the problem of imaging electromagnetic brain activity from measurements of the induced magnetic field outside the head. This imaging modality, magnetoencephalography (MEG), is known to be severely ill posed, and in order to obtain useful estimates for the activity map, complementary information needs to be used to regularize the problem. In this paper, a particular emphasis is on finding non-superficial focal sources that induce a magnetic field that may be confused with noise due to external sources and with distributed brain noise. The data are assumed to come from a mixture of a focal source and a spatially distributed possibly virtual source; hence, to differentiate between those two components, the problem is solved within a Bayesian framework, with a mixture model prior encoding the information that different sources may be concurrently active. The mixture model prior combines one density that favors strongly focal sources and another that favors spatially distributed sources, interpreted as clutter in the source estimation. Furthermore, to address the challenge of localizing deep focal sources, a novel depth sounding algorithm is suggested, and it is shown with simulated data that the method is able to distinguish between a signal arising from a deep focal source and a clutter signal. (paper)

  16. Nitrate source apportionment in a subtropical watershed using Bayesian model

    International Nuclear Information System (INIS)

    Yang, Liping; Han, Jiangpei; Xue, Jianlong; Zeng, Lingzao; Shi, Jiachun; Wu, Laosheng; Jiang, Yonghai

    2013-01-01

    Nitrate (NO₃⁻) pollution in aquatic systems is a worldwide problem. The temporal distribution pattern and sources of nitrate are of great concern for water quality. The nitrogen (N) cycling processes in a subtropical watershed located in Changxing County, Zhejiang Province, China were greatly influenced by the temporal variations of precipitation and temperature during the study period (September 2011 to July 2012). The highest NO₃⁻ concentration in water was in May (wet season, mean ± SD = 17.45 ± 9.50 mg L⁻¹) and the lowest concentration occurred in December (dry season, mean ± SD = 10.54 ± 6.28 mg L⁻¹). Nevertheless, no water sample in the study area exceeded the WHO drinking water limit of 50 mg L⁻¹ NO₃⁻. Four sources of NO₃⁻ (atmospheric deposition, AD; soil N, SN; synthetic fertilizer, SF; manure and sewage, M and S) were identified using both hydrochemical characteristics [Cl⁻, NO₃⁻, HCO₃⁻, SO₄²⁻, Ca²⁺, K⁺, Mg²⁺, Na⁺, dissolved oxygen (DO)] and a dual isotope approach (δ¹⁵N–NO₃⁻ and δ¹⁸O–NO₃⁻). Both chemical and isotopic characteristics indicated that denitrification was not the main N cycling process in the study area. Using a Bayesian model (stable isotope analysis in R, SIAR), the contribution of each source was apportioned. Source apportionment results showed that source contributions differed significantly between the dry and wet seasons: AD and M and S contributed more in December than in May, whereas SN and SF contributed more NO₃⁻ to water in May than in December. M and S and SF were the major contributors in December and May, respectively. Moreover, the shortcomings and uncertainties of SIAR are discussed to provide implications for future work. With the assessment of temporal variation and sources of NO₃⁻, better agricultural management practices and sewage disposal programs can be implemented to sustain water quality in subtropical watersheds.
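
    A SIAR-style mixing model can be prototyped with prior importance sampling in a few lines. The source signatures and uncertainties below are illustrative placeholders, not the study's data, and SIAR itself uses MCMC rather than importance sampling:

```python
import numpy as np

rng = np.random.default_rng(11)

# Source signatures (mean d15N, d18O of nitrate per source) and the
# observed mixture; all numbers are illustrative placeholders.
sources = np.array([[ 2.0, 60.0],   # atmospheric deposition (AD)
                    [ 5.0,  2.0],   # soil N (SN)
                    [ 0.0, -5.0],   # synthetic fertilizer (SF)
                    [12.0,  1.0]])  # manure and sewage (M and S)
obs = np.array([6.0, 4.0])
sd = np.array([1.0, 3.0])           # measurement/process SD per isotope

# Prior importance sampling: p ~ Dirichlet(1,...,1) on the simplex,
# Gaussian likelihood for the mixture around sum_k p_k * source_k.
n = 200_000
p = rng.dirichlet(np.ones(4), size=n)             # (n, 4) proportions
mix = p @ sources                                 # predicted signatures
loglik = -0.5 * (((obs - mix) / sd)**2).sum(axis=1)
w = np.exp(loglik - loglik.max())
w /= w.sum()

post_mean = w @ p                                 # posterior mean proportions
for name, m in zip(["AD", "SN", "SF", "M&S"], post_mean):
    print(f"{name:4s} posterior mean contribution = {m:.2f}")
```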

  17. Taxes and Subsidies for Improving Diet and Population Health in Australia: A Cost-Effectiveness Modelling Study.

    Directory of Open Access Journals (Sweden)

    Linda J Cobiac

    2017-02-01

    Full Text Available An increasing number of countries are implementing taxes on unhealthy foods and drinks to address the growing burden of dietary-related disease, but the cost-effectiveness of combining taxes on unhealthy foods and subsidies on healthy foods is not well understood. Using a population model of dietary-related diseases and health care costs and food price elasticities, we simulated the effect of taxes on saturated fat, salt, sugar, and sugar-sweetened beverages and a subsidy on fruits and vegetables, over the lifetime of the Australian population. The sizes of the taxes and subsidy were set such that, when combined as a package, there would be a negligible effect on average weekly expenditure on food (<1% change). We evaluated the cost-effectiveness of the interventions individually, then determined the optimal combination based on maximising net monetary benefit at a threshold of AU$50,000 per disability-adjusted life year (DALY). The simulations suggested that the combination of taxes and subsidy might avert as many as 470,000 DALYs (95% uncertainty interval [UI]: 420,000 to 510,000) in the Australian population of 22 million, with a net cost-saving of AU$3.4 billion (95% UI: AU$2.4 billion to AU$4.6 billion; US$2.3 billion) to the health sector. Of the taxes evaluated, the sugar tax produced the biggest estimates of health gain (270,000 [95% UI: 250,000 to 290,000] DALYs averted), followed by the salt tax (130,000 [95% UI: 120,000 to 140,000] DALYs), the saturated fat tax (97,000 [95% UI: 77,000 to 120,000] DALYs), and the sugar-sweetened beverage tax (12,000 [95% UI: 2,100 to 21,000] DALYs). The fruit and vegetable subsidy (-13,000 [95% UI: -44,000 to 18,000] DALYs) was a cost-effective addition to the package of taxes. However, it did not necessarily lead to a net health benefit for the population when modelled as an intervention on its own, because of the possible adverse cross-price elasticity effects on consumption of other foods (e.g., foods high in
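
    The optimisation step described here, choosing the intervention package that maximises net monetary benefit at AU$50,000/DALY, can be sketched by exhaustive search. The DALY figures below are the point estimates quoted in the abstract; the cost figures are hypothetical placeholders, and interactions between interventions are ignored:

```python
from itertools import combinations

# DALYs averted are the point estimates quoted in the abstract; the net
# cost figures (negative = cost-saving, AU$) are hypothetical placeholders.
interventions = {
    "sugar tax":   (270_000, -1.8e9),
    "salt tax":    (130_000, -0.9e9),
    "sat-fat tax": ( 97_000, -0.5e9),
    "SSB tax":     ( 12_000, -0.1e9),
    "F&V subsidy": (-13_000, -0.8e9),
}
WTP = 50_000.0  # AU$ per DALY averted

def nmb(combo):
    """Net monetary benefit of a package, ignoring interactions."""
    dalys = sum(interventions[i][0] for i in combo)
    cost = sum(interventions[i][1] for i in combo)
    return WTP * dalys - cost

packages = (c for r in range(1, len(interventions) + 1)
            for c in combinations(interventions, r))
best = max(packages, key=nmb)
print(best, f"NMB = AU${nmb(best):,.0f}")
```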

  18. Bayesian inference for partially identified models exploring the limits of limited data

    CERN Document Server

    Gustafson, Paul

    2015-01-01

    Introduction; Identification; What Is against Us?; What Is for Us?; Some Simple Examples of Partially Identified Models; The Road Ahead; The Structure of Inference in Partially Identified Models; Bayesian Inference; The Structure of Posterior Distributions in PIMs; Computational Strategies; Strength of Bayesian Updating, Revisited; Posterior Moments; Credible Intervals; Evaluating the Worth of Inference; Partial Identification versus Model Misspecification; The Siren Call of Identification; Comp

  19. Health impact and cost-effectiveness of a domestically-produced rotavirus vaccine in India: A model based analysis.

    Directory of Open Access Journals (Sweden)

    Johnie Rose

    Full Text Available Currently, Indian officials are incorporating a domestically manufactured rotavirus vaccine (based on the 116E rotavirus strain) into the country's universal immunization program; this vaccine will cost significantly less than western rotavirus vaccines. Here, we examine the public health impact, cost, and cost-effectiveness of universal vaccination in India using the 116E vaccine. This work will allow comparison of universal 116E vaccination with other approaches to child mortality reduction, shed light on the future burden of rotavirus disease in India, and help stakeholders understand future resource needs. Using information from published literature, we developed a dynamic simulation model of rotavirus transmission, natural history, and related utilization among Indian infants followed until age five. Infection risk depended on the degree of viral shedding in the population. Infection risk and severity were influenced by age, number of previous infections, and vaccination history. Probabilities of inpatient and outpatient health services utilization depended on symptom severity. With the model, we compared a strategy of nationwide 116E vaccination to one of no vaccination. Costs were considered from the perspective of all payers (including families) and from the societal perspective. We estimated that an established 116E vaccination program would reduce symptomatic rotavirus infection by 13.0%, while reducing population-wide rotavirus mortality by 34.6% (over 34,000 lives annually). Rotavirus outpatient visits would decline by 21.3%, and hospitalization would decline by 28.1%. The cost per disability-adjusted life year (DALY) averted was estimated at 3,429 Rupees (approximately $56). Predicted mortality reduction in children born during the first five years of vaccination implementation was nearly identical to that in children born in later years (34.4% versus 34.6%). 116E vaccination of Indian infants would likely substantially reduce rotavirus

  20. Cost-effectiveness of omega-3 fatty acid supplements in parenteral nutrition therapy in hospitals: a discrete event simulation model.

    Science.gov (United States)

    Pradelli, Lorenzo; Eandi, Mario; Povero, Massimiliano; Mayer, Konstantin; Muscaritoli, Maurizio; Heller, Axel R; Fries-Schaffner, Eva

    2014-10-01

    A recent meta-analysis showed that supplementation of omega-3 fatty acids in parenteral nutrition (PN) regimens is associated with a statistically and clinically significant reduction in infection rate and length of hospital stay (LOS) in medical and surgical patients admitted to the ICU and in surgical patients not admitted to the ICU. The objective of this present study was to evaluate the cost-effectiveness of the addition of omega-3 fatty acids to standard PN regimens in four European countries (Italy, France, Germany and the UK) from the healthcare provider perspective. Using a discrete event simulation scheme, a patient-level simulation model was developed, based on outcomes from the Italian ICU patient population and published literature. Comparative efficacy data for PN regimens containing omega-3 fatty acids versus standard PN regimens were taken from the meta-analysis of published randomised clinical trials (n = 23 studies with a total of 1502 patients), and hospital LOS reduction was further processed in order to separate the reduction in ICU stay from the reduction in ward stay for patients admitted to the ICU. Country-specific cost data were obtained for the Italian, French, German and UK healthcare systems. Clinical outcomes included in the model were death rates, nosocomial infection rates, and ICU/hospital LOS. Probabilistic and deterministic sensitivity analyses were undertaken to test the reliability of results. PN regimens containing omega-3 fatty acids were more effective on average than standard PN both in ICU and in non-ICU patients in the four countries considered, reducing infection rates and overall LOS, and resulting in a lower total cost per patient. Overall costs for patients receiving PN regimens containing omega-3 fatty acids were between €14 144 to €19 825 per ICU patient and €5484 to €14 232 per non-ICU patient, translating into savings of between €3972 and €4897 per ICU patient and savings of between €561 and €1762 per non-ICU patient.

  1. Clinical effectiveness and cost-effectiveness of second- and third-generation left ventricular assist devices as either bridge to transplant or alternative to transplant for adults eligible for heart transplantation: systematic review and cost-effectiveness model.

    Science.gov (United States)

    Sutcliffe, P; Connock, M; Pulikottil-Jacob, R; Kandala, N-B; Suri, G; Gurung, T; Grove, A; Shyangdan, D; Briscoe, S; Maheswaran, H; Clarke, A

    2013-11-01

    of second- and third-generation US Food and Drug Administration (FDA) and/or Conformité Européenne (CE) approved VADs. Publications from the last 5 years with control groups, or case series with 50 or more patients, were included. Outcomes included survival, functional capacity (e.g. change in New York Heart Association functional classification), quality of life (QoL) and adverse events. Data from the BTDB were obtained. A discrete-time, semi-Markov, multistate model was built. Deterministic and probabilistic methods with multiple sensitivity analyses varying survival, utilities and cost inputs to the model were used. Model outputs were incremental cost-effectiveness ratios (ICERs), cost/quality-adjusted life-years (QALYs) gained and cost/life-year gained (LYG). The discount rate was 3.5% and the time horizon varied over 3 years, 10 years and lifetime. Forty publications reported clinical effectiveness of VADs and one study reported cost-effectiveness. We found no high-quality comparative empirical studies of VADs as BTT compared with MM or as ATT compared with BTT. Approximately 15-25% of the patients receiving a device had died by 12 months. Studies reported the following wide ranges for adverse events: 4-27% bleeding requiring transfusion; 1.5-40% stroke; 3.3-48% infection; 1-14% device failure; 3-30% HF; 11-32% reoperation; and 3-53% renal failure. QoL and functional status were reported as improved in studies of two devices [HeartMate II (HMII; Thoratec Inc., Pleasanton, CA, USA) and HeartWare (HW; HeartWare Inc., Framingham, MA, USA)]. At 3 years, 10 years and lifetime, the ICERs for VADs as BTT compared with MM were £122,730, £68,088 and £55,173 respectively. These values were stable to changes in survival of the MM group. Both QoL and costs were reduced by VADs as ATT compared with VADs as BTT, giving ICERs in the south-west quadrant of the cost-effectiveness plane (cost saving/QALY sacrificed) of £353,467, £31,685 and £20,637 over the 3 years, 10 years

  2. Hybrid nested sampling algorithm for Bayesian model selection applied to inverse subsurface flow problems

    International Nuclear Information System (INIS)

    Elsheikh, Ahmed H.; Wheeler, Mary F.; Hoteit, Ibrahim

    2014-01-01

    A Hybrid Nested Sampling (HNS) algorithm is proposed for efficient Bayesian model calibration and prior model selection. The proposed algorithm combines the Nested Sampling (NS) algorithm, Hybrid Monte Carlo (HMC) sampling, and gradient estimation using the Stochastic Ensemble Method (SEM). NS is an efficient sampling algorithm that can be used for Bayesian calibration and for estimating the Bayesian evidence for prior model selection, and it has the advantage of computational feasibility. Within the nested sampling algorithm, a constrained sampling step is performed. For this step, we utilize HMC to reduce the correlation between successive sampled states. HMC relies on the gradient of the logarithm of the posterior distribution, which we estimate using a stochastic ensemble method based on an ensemble of directional derivatives. SEM requires only forward model runs; the simulator is used as a black box and no adjoint code is needed. The developed HNS algorithm is successfully applied for Bayesian calibration and prior model selection of several nonlinear subsurface flow problems
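
    The plain nested-sampling skeleton underlying HNS is shown below for a toy 2-D problem (uniform prior, Gaussian likelihood). The crude rejection step that replaces the worst live point is exactly what the paper's HMC step with ensemble-estimated gradients substitutes for:

```python
import numpy as np

rng = np.random.default_rng(13)

# Plain nested-sampling skeleton on a toy problem: uniform prior on
# [-5, 5]^2, standard bivariate normal likelihood (true evidence ~ 1/100).
def loglike(x):
    return -0.5 * (x @ x) - np.log(2.0 * np.pi)

n_live = 200
live = rng.uniform(-5, 5, (n_live, 2))
live_ll = np.array([loglike(p) for p in live])

log_z, log_x_prev = -np.inf, 0.0
for i in range(1, 1501):
    worst = np.argmin(live_ll)
    ll_star = live_ll[worst]
    log_x = -i / n_live                     # expected prior-mass shrinkage
    log_w = log_x_prev + np.log1p(-np.exp(log_x - log_x_prev))
    log_z = np.logaddexp(log_z, ll_star + log_w)
    log_x_prev = log_x
    # Constrained step: resample from the prior until L > L* (rejection;
    # this is the step HNS replaces with HMC moves).
    while True:
        prop = rng.uniform(-5, 5, 2)
        if loglike(prop) > ll_star:
            live[worst], live_ll[worst] = prop, loglike(prop)
            break

# Add the remaining live-point mass, then report the log-evidence.
log_z = np.logaddexp(log_z, np.log(np.exp(live_ll).mean()) + log_x_prev)
print(log_z)   # should land near log(1/100) = -4.61
```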

  3. Reliability assessment using degradation models: bayesian and classical approaches

    Directory of Open Access Journals (Sweden)

    Marta Afonso Freitas

    2010-04-01

    Full Text Available Traditionally, reliability assessment of devices has been based on (accelerated) life tests. However, for highly reliable products, little information about reliability is provided by life tests in which few or no failures are typically observed. Since most failures arise from a degradation mechanism at work, for which there are characteristics that degrade over time, one alternative is to monitor the device for a period of time and assess its reliability from the changes in performance (degradation) observed during that period. The goal of this article is to illustrate how degradation data can be modeled and analyzed using "classical" and Bayesian approaches. Four methods of data analysis based on classical inference are presented. Next we show how Bayesian methods can also be used to provide a natural approach to analyzing degradation data. The approaches are applied to a real data set regarding train wheel degradation.

  4. Bayesian selection of misspecified models is overconfident and may cause spurious posterior probabilities for phylogenetic trees.

    Science.gov (United States)

    Yang, Ziheng; Zhu, Tianqi

    2018-02-20

    The Bayesian method is noted to produce spuriously high posterior probabilities for phylogenetic trees in analysis of large datasets, but the precise reasons for this overconfidence are unknown. In general, the performance of Bayesian selection of misspecified models is poorly understood, even though this is of great scientific interest since models are never true in real data analysis. Here we characterize the asymptotic behavior of Bayesian model selection and show that when the competing models are equally wrong, Bayesian model selection exhibits surprising and polarized behaviors in large datasets, supporting one model with full force while rejecting the others. If one model is slightly less wrong than the other, the less wrong model will eventually win when the amount of data increases, but the method may become overconfident before it becomes reliable. We suggest that this extreme behavior may be a major factor for the spuriously high posterior probabilities for evolutionary trees. The philosophical implications of our results to the application of Bayesian model selection to evaluate opposing scientific hypotheses are yet to be explored, as are the behaviors of non-Bayesian methods in similar situations.
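
    The polarized behavior the authors describe is easy to reproduce with two equally wrong simple models (a toy construction in the spirit of, but not taken from, the paper):

```python
import numpy as np

rng = np.random.default_rng(2018)

# Two equally wrong fixed models for data generated from N(0.1, 1):
# model A says mean 0.2, model B says mean 0.0; both sit at the same
# Kullback-Leibler distance from the truth.
def posterior_prob_A(n):
    y = rng.normal(0.1, 1.0, n)
    # Log Bayes factor of A vs B (no free parameters in either model).
    log_bf = np.sum(-0.5 * (y - 0.2)**2 + 0.5 * (y - 0.0)**2)
    return 0.5 * (1.0 + np.tanh(0.5 * log_bf))   # P(A | y), equal priors

for n in (100, 10_000, 1_000_000):
    probs = [posterior_prob_A(n) for _ in range(20)]
    polarized = np.mean([(p < 0.01) or (p > 0.99) for p in probs])
    print(f"n={n:>9,}: share of replicates with P(A|y) outside (0.01, 0.99) "
          f"= {polarized:.2f}")
# As n grows, P(A|y) lurches to 0 or 1 across replicate datasets even
# though neither model fits better: the overconfidence characterized
# in the paper.
```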

  5. Bayesian Safety Risk Modeling of Human-Flightdeck Automation Interaction

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2015-01-01

    Usage of automatic systems in airliners has increased fuel efficiency, added extra capabilities, and enhanced safety, reliability, and passenger comfort since their introduction in the late 1980s. However, the original benefits of automation, including reduced flight crew workload, human errors, and training requirements, were not achieved as originally expected. Instead, automation introduced new failure modes, redistributed and sometimes increased workload, brought in new cognitive and attention demands, and increased training requirements. Modern airliners have numerous flight modes, providing more flexibility (and inherently more complexity) to the flight crew. However, the price to pay for the increased flexibility is the need for increased mode awareness, as well as the need to supervise, understand, and predict automated system behavior. Also, over-reliance on automation is linked to manual flight skill degradation and complacency in commercial pilots. As a result, recent accidents involving human errors are often caused by the interactions between humans and the automated systems (e.g., the breakdown in man-machine coordination), deteriorated manual flying skills, and/or loss of situational awareness due to heavy dependence on automated systems. This paper describes the development of the increased complexity and reliance on automation baseline model, named FLAP for FLightdeck Automation Problems. The model development process starts with a comprehensive literature review followed by the construction of a framework comprising high-level causal factors leading to an automation-related flight anomaly. The framework was then converted into a Bayesian Belief Network (BBN) using the Hugin software v7.8. The effects of automation on the flight crew are incorporated into the model, including flight skill degradation, increased cognitive demand and training requirements, along with their interactions. Besides flight crew deficiencies, automation system

  6. Public health impact and cost effectiveness of routine childhood vaccination for hepatitis a in Jordan: a dynamic model approach.

    Science.gov (United States)

    Hayajneh, Wail A; Daniels, Vincent J; James, Cerise K; Kanıbir, Muhammet Nabi; Pilsbury, Matthew; Marks, Morgan; Goveia, Michelle G; Elbasha, Elamin H; Dasbach, Erik; Acosta, Camilo J

    2018-03-07

    As the socioeconomic conditions in Jordan have improved over recent decades, the disease and economic burden of hepatitis A have increased. The purpose of this study is to assess the potential health and economic impact of a two-dose hepatitis A vaccine program covering one-year-old children in Jordan. We adapted an age-structured population model of hepatitis A transmission dynamics to project the epidemiologic and economic impact of vaccinating one-year-old children for 50 years in Jordan. The epidemiologic model was calibrated using local data on hepatitis A in Jordan. These data included seroprevalence and incidence data from the Jordan Ministry of Health as well as hospitalization data from King Abdullah University Hospital in Irbid, Jordan. We assumed 90% of all children would be vaccinated with the two-dose regimen by two years of age. The economic evaluation adopted a societal perspective and measured benefits using the quality-adjusted life-year (QALY). The modeled vaccination program reduced the incidence of hepatitis A in Jordan by 99%, 50 years after its introduction. The model projected 4.26 million avoided hepatitis A infections, 1.42 million outpatient visits, 22,475 hospitalizations, 508 fulminant cases, 95 liver transplants, and 76 deaths over a 50 year time horizon. In addition, we found, over a 50 year time horizon, the vaccination program would gain 37,502 QALYs and save over $42.6 million in total costs. The vaccination program became cost-saving within 6 years of its introduction and was highly cost-effective during the first 5 years. A vaccination program covering one-year-old children is projected to be a cost-saving intervention that will significantly reduce the public health and economic burden of hepatitis A in Jordan.

  7. Digitized Onondaga Lake Dissolved Oxygen Concentrations and Model Simulated Values using Bayesian Monte Carlo Methods

    Data.gov (United States)

    U.S. Environmental Protection Agency — The dataset is lake dissolved oxygen concentrations obtained form plots published by Gelda et al. (1996) and lake reaeration model simulated values using Bayesian...

  8. Propagation of Uncertainty in Bayesian Kernel Models - Application to Multiple-Step Ahead Forecasting

    DEFF Research Database (Denmark)

    Quinonero, Joaquin; Girard, Agathe; Larsen, Jan

    2003-01-01

    The object of Bayesian modelling is the predictive distribution, which, in a forecasting scenario, enables evaluation of forecasted values and their uncertainties. We focus on reliably estimating the predictive mean and variance of forecasted values using Bayesian kernel based models such as the Gaussian process and the relevance vector machine. We derive novel analytic expressions for the predictive mean and variance for Gaussian kernel shapes under the assumption of a Gaussian input distribution in the static case, and of a recursive Gaussian predictive density in iterative forecasting...
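
    In the static case, the predictive mean and variance referred to above have a standard closed form for Gaussian-kernel models; a minimal numpy sketch of Gaussian process regression (toy data, fixed hyperparameters, all names ours) is:

        import numpy as np

        def rbf_kernel(a, b, length_scale=1.0):
            # Squared-exponential (Gaussian) kernel between 1-D inputs.
            d2 = (a[:, None] - b[None, :]) ** 2
            return np.exp(-0.5 * d2 / length_scale**2)

        def gp_predict(x_train, y_train, x_test, noise_var=0.1, length_scale=1.0):
            """Predictive mean and variance of GP regression at x_test."""
            K = rbf_kernel(x_train, x_train, length_scale) \
                + noise_var * np.eye(len(x_train))
            K_s = rbf_kernel(x_test, x_train, length_scale)   # cross-covariance
            K_ss = rbf_kernel(x_test, x_test, length_scale)   # test covariance
            L = np.linalg.cholesky(K)                         # stable solve
            alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
            mean = K_s @ alpha                                # predictive mean
            v = np.linalg.solve(L, K_s.T)
            var = np.diag(K_ss) - np.sum(v**2, axis=0) + noise_var
            return mean, var

        x = np.linspace(0, 6, 30)
        y = np.sin(x) + 0.1 * np.random.randn(30)
        mu, var = gp_predict(x, y, np.array([2.5, 7.0]))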

  9. A comparison between Markovian models and Bayesian networks for treating some dependent events in reliability evaluations

    International Nuclear Information System (INIS)

    Duarte, Juliana P.; Leite, Victor C.; Melo, P.F. Frutuoso e

    2013-01-01

    Bayesian networks have become a very handy tool for solving problems in various application areas. This paper discusses the use of Bayesian networks to treat dependent events in reliability engineering typically modeled by Markovian models. Dependent events play an important role as, for example, when treating load-sharing systems, bridge systems, common-cause failures, and switching systems (those for which a standby component is activated after the main one fails by means of a switching mechanism). Repair plays an important role in all these cases (as, for example, the number of repairmen). All Bayesian network calculations are performed by means of the Netica™ software, of Norsys Software Corporation, with Fortran 90 used to evaluate them over time. The discussion considers the development of time-dependent reliability figures of merit, which are easily obtained through Markovian models but not through Bayesian networks, because the latter need probability figures as input rather than failure and repair rates. Bayesian networks produced results in very good agreement with those of Markov models and pivotal decomposition. Static and discrete-time Bayesian networks (DTBN) were used in order to check their capabilities for modeling specific situations, like switching failures in cold-standby systems. The DTBN was more flexible for modeling systems where the time of occurrence of an event is important, for example, standby failure and repair. However, the static network model produced results as good as the DTBN's using a much simpler approach. (author)
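
    The distinction the authors draw, rates as Markov inputs versus probabilities as Bayesian network inputs, can be made concrete with the classic two-state repairable component, whose time-dependent availability has a closed form; a hypothetical sketch (illustrative rates only):

        import numpy as np

        # Two-state repairable component: failure rate lam, repair rate mu.
        # A(t) solves the Kolmogorov equations for the 2-state generator;
        # the closed form below assumes the component starts working.
        def availability(t, lam=1e-3, mu=1e-1):
            s = lam + mu
            return mu / s + (lam / s) * np.exp(-s * t)

        # A static Bayesian network needs probabilities, not rates, so one
        # would pass A(t) at a fixed mission time as the node's prior.
        for t in (0.0, 10.0, 100.0, 1e4):
            print(f"A({t:g}) = {availability(t):.4f}")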

  10. Leak localization in water distribution networks using model-based bayesian reasoning

    OpenAIRE

    Soldevila Coma, Adrià; Fernández Canti, Rosa M.; Blesa Izquierdo, Joaquim; Tornil Sin, Sebastián; Puig Cayuela, Vicenç

    2016-01-01

    This paper presents a new method for leak localization in Water Distribution Networks that uses a model-based approach combined with Bayesian reasoning. Probability density functions of model-based pressure residuals are calibrated off-line for all the possible leak scenarios by using a hydraulic simulator, with leak size uncertainty, demand uncertainty and sensor noise taken into account. Bayesian reasoning is applied online to the available residuals to determine the location of leaks present in...
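
    The online step amounts to a recursive application of Bayes' rule over the candidate leak scenarios; a toy sketch under invented Gaussian residual densities (not the paper's calibrated ones) could look like:

        import numpy as np
        from scipy.stats import norm

        # Off-line calibration gives, per candidate leak node, a density over
        # the pressure residual (means/sigmas here are invented).
        residual_means = np.array([0.8, 0.1, -0.4])   # one per leak scenario
        residual_sigmas = np.array([0.3, 0.2, 0.25])

        posterior = np.ones(3) / 3                    # uniform prior
        for r in [0.75, 0.65, 0.9]:                   # observed residuals
            likelihood = norm.pdf(r, residual_means, residual_sigmas)
            posterior = posterior * likelihood
            posterior /= posterior.sum()              # Bayes rule, online

        print("most probable leak node:", int(posterior.argmax()))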

  11. Cost-effectiveness of interventions for increasing the possession of functioning smoke alarms in households with pre-school children: a modelling study.

    Science.gov (United States)

    Saramago, Pedro; Cooper, Nicola J; Sutton, Alex J; Hayes, Mike; Dunn, Ken; Manca, Andrea; Kendrick, Denise

    2014-05-16

    The UK has one of the highest rates of death from fire and flames in children aged 0-14 years compared to other high-income countries. Evidence shows that smoke alarms can reduce the risk of fire-related injury, but little exists on their cost-effectiveness. We aimed to compare the cost-effectiveness of different interventions for the uptake of 'functioning' smoke alarms and, consequently, for the prevention of fire-related injuries in children in the UK. We carried out a decision model-based probabilistic cost-effectiveness analysis. We used a hypothetical population of newborns and evaluated the impact of living in a household with or without a functioning smoke alarm during the first 5 years of their life on overall lifetime costs and quality of life from a public health perspective. We compared seven interventions, ranging from usual care to more complex interventions comprising education, free/low cost equipment giveaway, equipment fitting and/or home safety inspection. Education and free/low cost equipment was the most cost-effective intervention, with an estimated incremental cost-effectiveness ratio of £34,200 per QALY gained compared to usual care. This was reduced to approximately £4,500 per QALY gained when 1.8 children under the age of 5 were assumed per household. Assessing cost-effectiveness, as well as effectiveness, is important in a public sector system operating under a fixed budget constraint. As highlighted in this study, the more effective interventions (in this case the more complex interventions) may not necessarily be the ones considered the most cost-effective.
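
    The headline figure above is an incremental cost-effectiveness ratio (ICER), which is simple arithmetic once incremental costs and QALYs are known; a sketch with placeholder numbers chosen only to reproduce the £34,200 shape of the result:

        # ICER: extra cost per extra QALY relative to the comparator.
        # The inputs below are placeholders, not the study's estimates.
        def icer(cost_new, qaly_new, cost_old, qaly_old):
            return (cost_new - cost_old) / (qaly_new - qaly_old)

        print(icer(cost_new=520.0, qaly_new=25.015,
                   cost_old=178.0, qaly_old=25.005))
        # prints ≈ 34200 per QALY gained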

  12. Predicting water main failures using Bayesian model averaging and survival modelling approach

    International Nuclear Information System (INIS)

    Kabir, Golam; Tesfamariam, Solomon; Sadiq, Rehan

    2015-01-01

    To develop an effective preventive or proactive repair and replacement action plan, water utilities often rely on water main failure prediction models. However, in predicting the failure of water mains, uncertainty is inherent regardless of the quality and quantity of data used in the model. To improve the understanding of water main failure, a Bayesian framework is developed for predicting the failure of water mains while considering uncertainties. In this study, the Bayesian model averaging method (BMA) is presented to identify the influential pipe-dependent and time-dependent covariates considering model uncertainties, whereas the Bayesian Weibull Proportional Hazard Model (BWPHM) is applied to develop the survival curves and to predict the failure rates of water mains. To validate the proposed framework, it is applied to predict the failure of cast iron (CI) and ductile iron (DI) pipes of the water distribution network of the City of Calgary, Alberta, Canada. Results indicate that the predicted 95% uncertainty bounds of the proposed BWPHMs effectively capture the observed breaks for both CI and DI water mains. Moreover, the performance of the proposed BWPHMs is better compared to the Cox Proportional Hazard Model (Cox-PHM), owing to the use of a Weibull distribution for the baseline hazard function and the treatment of model uncertainties. - Highlights: • Prioritize rehabilitation and replacement (R/R) strategies of water mains. • Consider the uncertainties for the failure prediction. • Improve the prediction capability of the water main failure models. • Identify the influential and appropriate covariates for different models. • Determine the effects of the covariates on failure
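
    The BWPHM's key ingredient is a Weibull baseline hazard scaled by covariate effects; a minimal survival-function sketch with invented parameters (not the Calgary fits):

        import numpy as np

        # Weibull proportional-hazards survival: baseline cumulative hazard
        # (t/scale)^shape multiplied by exp(beta . x). Numbers illustrative.
        def survival(t, x, shape=1.8, scale=40.0, beta=np.array([0.03, 0.5])):
            hr = np.exp(beta @ x)                 # covariate hazard ratio
            return np.exp(-((t / scale) ** shape) * hr)

        pipe = np.array([60.0, 1.0])              # e.g. [age, material flag]
        for t in (10, 25, 50):
            print(t, survival(t, pipe))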

  13. A risk adjusted cost-effectiveness analysis of alternative models of nurse involvement in obesity management in primary care.

    Science.gov (United States)

    Karnon, J; Ali Afzali, H Haji; Gray, J; Holton, C; Banham, D; Beilby, J

    2013-03-01

    Controlled evaluations are subject to uncertainty regarding their replication in the real world, particularly around systems of service provision. Using routinely collected data, we undertook a risk adjusted cost-effectiveness (RAC-E) analysis of alternative applied models of primary health care for the management of obese adult patients. Models were based on the reported level of involvement of practice nurses (registered or enrolled nurses working in general practice) in the provision of clinical-based activities. Linked, routinely collected data describing clinical outcomes (weight, BMI, and obesity-related complications) and resource use (primary care, pharmaceutical, and hospital resource use) were analysed. Potential confounders were controlled for using propensity weighted regression analyses. Relative to low-level involvement of practice nurses in the provision of clinical-based activities to obese patients, high-level involvement was associated with lower costs and better outcomes (more patients losing weight, and larger mean reductions in BMI). Excluding hospital costs, high-level practice nurse involvement was associated with slightly higher costs. Incrementally, the high-level model gets one additional obese patient to lose weight at an additional cost of $6,741, and reduces mean BMI by an additional one point at an additional cost of $563 (upper 95% confidence interval $1,547). Converted to quality adjusted life year (QALY) gains, the results provide a strong indication that increased involvement of practice nurses in clinical activities is associated with additional health benefits that are achieved at reasonable additional cost. Dissemination activities and incentives are required to encourage general practices to better integrate practice nurses in the active provision of clinical services. Copyright © 2013 The Obesity Society.

  14. Bayesian Model Averaging of Artificial Intelligence Models for Hydraulic Conductivity Estimation

    Science.gov (United States)

    Nadiri, A.; Chitsazan, N.; Tsai, F. T.; Asghari Moghaddam, A.

    2012-12-01

    This research presents a Bayesian artificial intelligence model averaging (BAIMA) method that incorporates multiple artificial intelligence (AI) models to estimate hydraulic conductivity and evaluate estimation uncertainties. Uncertainty in the AI model outputs stems from error in model input as well as non-uniqueness in selecting different AI methods. Using one single AI model tends to bias the estimation and underestimate uncertainty. BAIMA employs Bayesian model averaging (BMA) technique to address the issue of using one single AI model for estimation. BAIMA estimates hydraulic conductivity by averaging the outputs of AI models according to their model weights. In this study, the model weights were determined using the Bayesian information criterion (BIC) that follows the parsimony principle. BAIMA calculates the within-model variances to account for uncertainty propagation from input data to AI model output. Between-model variances are evaluated to account for uncertainty due to model non-uniqueness. We employed Takagi-Sugeno fuzzy logic (TS-FL), artificial neural network (ANN) and neurofuzzy (NF) to estimate hydraulic conductivity for the Tasuj plain aquifer, Iran. BAIMA combined three AI models and produced better fitting than individual models. While NF was expected to be the best AI model owing to its utilization of both TS-FL and ANN models, the NF model is nearly discarded by the parsimony principle. The TS-FL model and the ANN model showed equal importance although their hydraulic conductivity estimates were quite different. This resulted in significant between-model variances that are normally ignored by using one AI model.
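
    The BIC-based weighting described here reduces to normalised exp(-ΔBIC/2) factors; a short sketch with illustrative BIC values (not the paper's):

        import numpy as np

        # Posterior model weights from BIC, as in BMA under the usual
        # unit-information prior approximation.
        def bma_weights(bics):
            bics = np.asarray(bics, dtype=float)
            delta = bics - bics.min()
            w = np.exp(-0.5 * delta)
            return w / w.sum()

        # Hypothetical BIC values for TS-FL, ANN and NF models.
        print(bma_weights([102.4, 102.9, 121.7]))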

  15. Bayesian network as a modelling tool for risk management in agriculture

    DEFF Research Database (Denmark)

    Rasmussen, Svend; Madsen, Anders Læsø; Lund, Mogens

    In this paper we use Bayesian networks as an integrated modelling approach for representing uncertainty and analysing risk management in agriculture. It is shown how historical farm account data may be efficiently used to estimate conditional probabilities, which are the core elements in Bayesian network models. We further show how the Bayesian network model RiBay is used for stochastic simulation of farm income, and we demonstrate how RiBay can be used to simulate risk management at the farm level. It is concluded that the key strength of a Bayesian network is the transparency of assumptions, and that it has the ability to link uncertainty from different external sources to budget figures and to quantify risk at the farm level.

  16. Bayesian Proteoform Modeling Improves Protein Quantification of Global Proteomic Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Datta, Susmita; Payne, Samuel H.; Kang, Jiyun; Bramer, Lisa M.; Nicora, Carrie D.; Shukla, Anil K.; Metz, Thomas O.; Rodland, Karin D.; Smith, Richard D.; Tardiff, Mark F.; McDermott, Jason E.; Pounds, Joel G.; Waters, Katrina M.

    2014-12-01

    As the capability of mass spectrometry-based proteomics has matured, tens of thousands of peptides can be measured simultaneously, which has the benefit of offering a systems view of protein expression. However, a major challenge is that with an increase in throughput, protein quantification estimation from the native measured peptides has become a computational task. A limitation of existing computationally-driven protein quantification methods is that most ignore protein variation, such as alternate splicing of the RNA transcript and post-translational modifications or other possible proteoforms, which will affect a significant fraction of the proteome. The consequence of this assumption is that statistical inference at the protein level, and consequently downstream analyses, such as network and pathway modeling, have only limited power for biomarker discovery. Here, we describe a Bayesian model (BP-Quant) that uses statistically derived peptide signatures to identify peptides that are outside the dominant pattern, or the existence of multiple over-expressed patterns, to improve relative protein abundance estimates. It is a research-driven approach that utilizes the objectives of the experiment, defined in the context of a standard statistical hypothesis, to identify a set of peptides exhibiting similar statistical behavior relating to a protein. This approach infers that changes in relative protein abundance can be used as a surrogate for changes in function, without necessarily taking into account the effect of differential post-translational modifications, processing, or splicing in altering protein function. We verify the approach using a dilution study from mouse plasma samples and demonstrate that BP-Quant achieves similar accuracy as the current state-of-the-art methods at proteoform identification with significantly better specificity. BP-Quant is available as MATLAB® and R packages at https://github.com/PNNL-Comp-Mass-Spec/BP-Quant.

  17. Cost-effectiveness and resource implications of aggressive action on TB in China, India and South Africa: a combined analysis of nine models

    Science.gov (United States)

    Menzies, Nicolas A; Gomez, Gabriela B; Bozzani, Fiammetta; Chatterjee, Susmita; Foster, Nicola; Baena, Ines Garcia; Laurence, Yoko V; Qiang, Sun; Siroka, Andrew; Sweeney, Sedona; Verguet, Stéphane; Arinaminpathy, Nimalan; Azman, Andrew S; Bendavid, Eran; Chang, Stewart T; Cohen, Ted; Denholm, Justin T; Dowdy, David W; Eckhoff, Philip A; Goldhaber-Fiebert, Jeremy D; Handel, Andreas; Huynh, Grace H; Lalli, Marek; Lin, Hsien-Ho; Mandal, Sandip; McBryde, Emma S; Pandey, Surabhi; Salomon, Joshua A; Suen, Sze-chuan; Sumner, Tom; Trauer, James M; Wagner, Bradley G; Whalen, Christopher C; Wu, Chieh-Yin; Boccia, Delia; Chadha, Vineet K; Charalambous, Salome; Chin, Daniel P; Churchyard, Gavin; Daniels, Colleen; Dewan, Puneet; Ditiu, Lucica; Eaton, Jeffrey W; Grant, Alison D; Hippner, Piotr; Hosseini, Mehran; Mametja, David; Pretorius, Carel; Pillay, Yogan; Rade, Kiran; Sahu, Suvanand; Wang, Lixia; Houben, Rein MGJ; Kimerling, Michael E; White, Richard G; Vassall, Anna

    2017-01-01

    BACKGROUND The End TB Strategy sets global goals of reducing TB incidence and mortality by 50% and 75% respectively by 2025. We assessed resource requirements and cost-effectiveness of strategies to achieve these targets in China, India, and South Africa. METHODS We examined intervention scenarios developed in consultation with country stakeholders, which scaled-up existing interventions to high but feasible coverage by 2025. Nine independent TB modelling groups collaborated to estimate policy outcomes, and we costed each scenario by synthesizing service utilization estimates, empirical cost data, and expert opinion on implementation strategies. We estimated health impact and resource implications for 2016–2035, including patient-incurred costs. To assess resource requirements and cost-effectiveness, we compared scenarios to a base case representing continued current practice. FINDINGS Incremental TB service costs differed by scenario and country, and in some cases more than doubled current funding needs. In general, expanding TB services substantially reduced patient-incurred costs; and in India and China this produced net cost-savings for most interventions under a societal perspective. In all countries, expanding TB care access produced substantial health gains. Compared to current practice, most intervention approaches appeared highly cost-effective when compared to conventional cost-effectiveness thresholds. INTERPRETATION Expanding TB services appears cost-effective for high-burden countries and could generate substantial health and economic benefits for patients, though funding needs challenge affordability. Further work is required to determine the optimal intervention mix for each country. PMID:27720689

  18. Experiences in applying Bayesian integrative models in interdisciplinary modeling: the computational and human challenges

    DEFF Research Database (Denmark)

    Kuikka, Sakari; Haapasaari, Päivi Elisabet; Helle, Inari

    2011-01-01

    Bayesian networks are flexible tools that can take into account the different research traditions and the various types of information sources. We present two types of cases. With the Baltic salmon stocks modeled with Bayesian techniques, the existing data sets are rich and the estimation of the parameters ... components, which favors the use of quantitative risk analysis. However, the traditions and quality criteria of these scientific fields are in many respects different. This creates both technical and human challenges to the modeling tasks.

  19. Bayesian network modeling applied to coastal geomorphology: lessons learned from a decade of experimentation and application

    Science.gov (United States)

    Plant, N. G.; Thieler, E. R.; Gutierrez, B.; Lentz, E. E.; Zeigler, S. L.; Van Dongeren, A.; Fienen, M. N.

    2016-12-01

    We evaluate the strengths and weaknesses of Bayesian networks that have been used to address scientific and decision-support questions related to coastal geomorphology. We will provide an overview of coastal geomorphology research that has used Bayesian networks and describe what this approach can do and when it works (or fails to work). Over the past decade, Bayesian networks have been formulated to analyze the multi-variate structure and evolution of coastal morphology and associated human and ecological impacts. The approach relates observable system variables to each other by estimating discrete correlations. The resulting Bayesian networks make predictions that propagate errors, conduct inference via Bayes rule, or both. In scientific applications, the model results are useful for hypothesis testing, using confidence estimates to gauge the strength of tests, while applications to coastal resource management are aimed at decision support, where the probabilities of desired ecosystem outcomes are evaluated. The range of Bayesian network applications to coastal morphology includes emulation of high-resolution wave transformation models to make oceanographic predictions, morphologic response to storms and/or sea-level rise, groundwater response to sea-level rise and morphologic variability, habitat suitability for endangered species, and assessment of monetary or human-life risk associated with storms. All of these examples are based on vast observational data sets, numerical model output, or both. We will discuss the progression of our experiments, which has included testing whether the Bayesian-network approach can be implemented and is appropriate for addressing basic and applied scientific problems, and evaluating the hindcast and forecast skill of these implementations. We will present and discuss calibration/validation tests that are used to assess the robustness of Bayesian-network models and we will compare these results to tests of other models. This will

  20. Bayesian estimation of regularization parameters for deformable surface models

    Energy Technology Data Exchange (ETDEWEB)

    Cunningham, G.S.; Lehovich, A.; Hanson, K.M.

    1999-02-20

    In this article the authors build on their past attempts to reconstruct a 3D, time-varying bolus of radiotracer from first-pass data obtained by the dynamic SPECT imager, FASTSPECT, built by the University of Arizona. The object imaged is a CardioWest total artificial heart. The bolus is entirely contained in one ventricle and its associated inlet and outlet tubes. The model for the radiotracer distribution at a given time is a closed surface parameterized by 482 vertices that are connected to make 960 triangles, with nonuniform intensity variations of radiotracer allowed inside the surface on a voxel-to-voxel basis. The total curvature of the surface is minimized through the use of a weighted prior in the Bayesian framework, as is the weighted norm of the gradient of the voxellated grid. MAP estimates for the vertices, interior intensity voxels and background count level are produced. The strengths of the priors, or hyperparameters, are determined by maximizing the probability of the data given the hyperparameters, called the evidence. The evidence is calculated by first assuming that the posterior is approximately normal in the values of the vertices and voxels, and then by evaluating the integral of the multi-dimensional normal distribution. This integral (which requires evaluating the determinant of a covariance matrix) is computed by applying a recent algorithm from Bai et al. that calculates the needed determinant efficiently. They demonstrate that the radiotracer is highly inhomogeneous in early time frames, as suspected in earlier reconstruction attempts that assumed a uniform intensity of radiotracer within the closed surface, and that the optimal choice of hyperparameters is substantially different for different time frames.
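
    The evidence computation sketched above, a Gaussian approximation around the posterior mode plus a log-determinant, can be written compactly; the following toy sketch assumes the Hessian of the negative log posterior at the mode is available (all values invented):

        import numpy as np

        # Laplace (Gaussian) approximation to the evidence:
        # log Z ≈ log p(d|θ*)p(θ*) + (k/2) log 2π - 0.5 log det H,
        # where H is the Hessian of the negative log posterior at the mode.
        def log_evidence(log_post_at_map, hessian):
            k = hessian.shape[0]
            sign, logdet = np.linalg.slogdet(hessian)   # stable determinant
            assert sign > 0, "Hessian must be positive definite at a mode"
            return log_post_at_map + 0.5 * k * np.log(2 * np.pi) - 0.5 * logdet

        H = np.array([[4.0, 0.5], [0.5, 2.0]])          # toy 2-parameter case
        print(log_evidence(-12.3, H))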

  1. Bayesian model calibration of computational models in velocimetry diagnosed dynamic compression experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Justin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hund, Lauren [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    Dynamic compression experiments are being performed on complicated materials using increasingly complex drivers. The data produced in these experiments are beginning to reach a regime where traditional analysis techniques break down, requiring the solution of an inverse problem. A common measurement in dynamic experiments is an interface velocity as a function of time, and often this functional output can be simulated using a hydrodynamics code. Bayesian model calibration is a statistical framework to estimate inputs into a computational model in the presence of multiple uncertainties, making it well suited to measurements of this type. In this article, we apply Bayesian model calibration to high pressure (250 GPa) ramp compression measurements in tantalum. We address several issues specific to this calibration, including the functional nature of the output as well as parameter and model discrepancy identifiability. Specifically, we propose scaling the likelihood function by an effective sample size rather than modeling the autocorrelation function to accommodate the functional output, and propose sensitivity analyses using the notion of 'modularization' to assess the impact of experiment-specific nuisance input parameters on estimates of material properties. We conclude that the proposed Bayesian model calibration procedure results in simple, fast, and valid inferences on the equation of state parameters for tantalum.
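
    One way to read the proposed likelihood scaling: temper a Gaussian log-likelihood by n_eff/n so that autocorrelated velocity samples do not overstate the information content. A hypothetical sketch only; the study's actual likelihood and the estimation of n_eff are more involved:

        import numpy as np

        # Gaussian log-likelihood tempered by an effective sample size;
        # n_eff would come from the residual autocorrelation structure.
        def scaled_loglik(residuals, sigma, n_eff):
            n = len(residuals)
            loglik = (-0.5 * np.sum((residuals / sigma) ** 2)
                      - n * np.log(sigma) - 0.5 * n * np.log(2 * np.pi))
            return (n_eff / n) * loglik   # each point counts as n_eff/n points

        r = np.random.randn(2000) * 0.05
        print(scaled_loglik(r, sigma=0.05, n_eff=40))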

  2. Evaluation of a Stratified National Breast Screening Program in the United Kingdom: An Early Model-Based Cost-Effectiveness Analysis

    NARCIS (Netherlands)

    Gray, E.; Donten, A.; Karssemeijer, N.; Gils, C. van; Evans, D.G.; Astley, S.; Payne, K.

    2017-01-01

    OBJECTIVES: To identify the incremental costs and consequences of stratified national breast screening programs (stratified NBSPs) and drivers of relative cost-effectiveness. METHODS: A decision-analytic model (discrete event simulation) was conceptualized to represent four stratified NBSPs (risk 1,

  3. Evaluation of a Stratified National Breast Screening Program in the United Kingdom : An Early Model-Based Cost-Effectiveness Analysis

    NARCIS (Netherlands)

    Gray, Ewan; Donten, Anna; Karssemeijer, Nico; van Gils, Carla; Evans, D. Gareth R.; Astley, Sue; Payne, Katherine

    Objectives: To identify the incremental costs and consequences of stratified national breast screening programs (stratified NBSPs) and drivers of relative cost-effectiveness. Methods: A decision-analytic model (discrete event simulation) was conceptualized to represent four stratified NBSPs (risk 1,

  4. Constructing Cost-Effective Crystal Structures with Table Tennis Balls and Tape That Allows Students to Assemble and Model Multiple Unit Cells

    Science.gov (United States)

    Elsworth, Catherine; Li, Barbara T. Y.; Ten, Abilio

    2017-01-01

    In this letter we present an innovative and cost-effective method of constructing crystal structures using Dual Lock fastening adhesive tape with table tennis (ping pong) balls. The use of these fasteners allows the balls to be easily assembled into layers to model various crystal structures and unit cells and then completely disassembled again.…

  5. Increasing the use of second-line therapy is a cost-effective approach to prevent the spread of drug-resistant HIV: a mathematical modelling study

    NARCIS (Netherlands)

    B.E. Nichols (Brooke); K.C. Sigaloff (Kim); C. Kityo (Cissy); R.L. Hamers (Raph); R.M.P.M. Baltussen (Rob); S. Bertagnolio (Silvia); M.R. Jordan (Michael); T.B. Hallett (Timothy); C.A.B. Boucher (Charles); M. De Wit (Meike); D.A.M.C. van de Vijver (David)

    2014-01-01

    METHODS: We develop a deterministic mathematical model representing Kampala, Uganda, to predict the prevalence of TDR over a 10-year period. We then compare the impact on TDR and cost-effectiveness of: (1) introduction of pre-therapy genotyping; (2) doubling use of second-line treatment

  6. Integrated HIV testing, malaria, and diarrhea prevention campaign in Kenya: modeled health impact and cost-effectiveness.

    Directory of Open Access Journals (Sweden)

    James G Kahn

    Efficiently delivered interventions to reduce HIV, malaria, and diarrhea are essential to accelerating global health efforts. A 2008 community integrated prevention campaign in Western Province, Kenya, reached 47,000 individuals over 7 days, providing HIV testing and counseling, water filters, insecticide-treated bed nets, condoms, and, for HIV-infected individuals, cotrimoxazole prophylaxis and referral for ongoing care. We modeled the potential cost-effectiveness of a scaled-up integrated prevention campaign. We estimated averted deaths and disability-adjusted life years (DALYs) based on published data on baseline mortality and morbidity and on the protective effect of interventions, including antiretroviral therapy. We incorporate a previously estimated scaled-up campaign cost. We used published costs of medical care to estimate savings from averted illness (for all three diseases) and the added costs of initiating treatment earlier in the course of HIV disease. Per 1000 participants, projected reductions in cases of diarrhea, malaria, and HIV infection avert an estimated 16.3 deaths, 359 DALYs and $85,113 in medical care costs. Earlier care for HIV-infected persons adds an estimated 82 DALYs averted (to a total of 442), at a cost of $37,097 (reducing total averted costs to $48,015). Accounting for the estimated campaign cost of $32,000, the campaign saves an estimated $16,015 per 1000 participants. In multivariate sensitivity analyses, 83% of simulations result in net savings, and 93% in a cost per DALY averted of less than $20. A mass, rapidly implemented campaign for HIV testing, safe water, and malaria control appears economically attractive.

  7. The impact and cost-effectiveness of nonavalent HPV vaccination in the United States: Estimates from a simplified transmission model

    Science.gov (United States)

    Chesson, Harrell W.; Markowitz, Lauri E.; Hariri, Susan; Ekwueme, Donatus U.; Saraiya, Mona

    2016-01-01

    Introduction: The objective of this study was to assess the incremental costs and benefits of the 9-valent HPV vaccine (9vHPV) compared with the quadrivalent HPV vaccine (4vHPV). Like 4vHPV, 9vHPV protects against HPV types 6, 11, 16, and 18. 9vHPV also protects against 5 additional HPV types: 31, 33, 45, 52, and 58. Methods: We adapted a previously published model of the impact and cost-effectiveness of 4vHPV to include the 5 additional HPV types in 9vHPV. The vaccine strategies we examined were (1) 4vHPV for males and females; (2) 9vHPV for females and 4vHPV for males; and (3) 9vHPV for males and females. In the base case, 9vHPV cost $13 more per dose than 4vHPV, based on available vaccine price information. Results: Providing 9vHPV to females compared with 4vHPV for females (assuming 4vHPV for males in both scenarios) was cost-saving regardless of whether or not cross-protection for 4vHPV was assumed. The cost per quality-adjusted life year (QALY) gained by 9vHPV for both sexes (compared with 4vHPV for both sexes) was favorable. Compared with a vaccination program of 4vHPV for both sexes, a vaccination program of 9vHPV for both sexes can improve health outcomes and can be cost-saving. PMID:26890978

  8. Model Criticism of Bayesian Networks with Latent Variables.

    Science.gov (United States)

    Williamson, David M.; Mislevy, Robert J.; Almond, Russell G.

    This study investigated statistical methods for identifying errors in Bayesian networks (BN) with latent variables, as found in intelligent cognitive assessments. BN, commonly used in artificial intelligence systems, are promising mechanisms for scoring constructed-response examinations. The success of an intelligent assessment or tutoring system…

  9. BAYESIAN FORECASTS COMBINATION TO IMPROVE THE ROMANIAN INFLATION PREDICTIONS BASED ON ECONOMETRIC MODELS

    Directory of Open Access Journals (Sweden)

    Mihaela Simionescu

    2014-12-01

    There are many types of econometric models used in predicting the inflation rate, but in this study we used a Bayesian shrinkage combination approach. This methodology is used in order to improve prediction accuracy by including information that is not captured by the econometric models. Therefore, experts' forecasts are utilized as prior information, these predictions for Romania being provided by the Institute for Economic Forecasting (Dobrescu macromodel), the National Commission for Prognosis, and the European Commission. The empirical results for Romanian inflation show the superiority of a fixed effects model compared to other types of econometric models, such as VAR, Bayesian VAR, simultaneous equations model, dynamic model, and log-linear model. The Bayesian combinations that used experts' predictions as priors, when the shrinkage parameter tends to infinity, improved the accuracy of all forecasts based on individual models, outperforming also zero and equal weights predictions and naïve forecasts.
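
    A stylised version of such a shrinkage combination pulls the model forecast toward the expert prior and recovers the expert value as the shrinkage parameter grows without bound; a toy sketch with invented numbers (not the paper's estimator):

        # Precision-weighted shrinkage of a model forecast toward an expert
        # prior; as lam -> infinity the combination tends to the expert value.
        def combine(model_forecast, expert_prior, lam):
            return (model_forecast + lam * expert_prior) / (1.0 + lam)

        for lam in (0.0, 1.0, 10.0, 1e6):
            print(lam, combine(model_forecast=3.2, expert_prior=2.5, lam=lam))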

  10. BiomeNet: a Bayesian model for inference of metabolic divergence among microbial communities.

    OpenAIRE

    Mahdi Shafiei; Katherine A Dunn; Hugh Chipman; Hong Gu; Joseph P Bielawski

    2014-01-01

    Metagenomics yields enormous numbers of microbial sequences that can be assigned a metabolic function. Using such data to infer community-level metabolic divergence is hindered by the lack of a suitable statistical framework. Here, we describe a novel hierarchical Bayesian model, called BiomeNet (Bayesian inference of metabolic networks), for inferring differential prevalence of metabolic subnetworks among microbial communities. To infer the structure of community-level metabolic interactions...

  11. On the Practice of Bayesian Inference in Basic Economic Time Series Models using Gibbs Sampling

    NARCIS (Netherlands)

    M.D. de Pooter (Michiel); R. Segers (René); H.K. van Dijk (Herman)

    2006-01-01

    Several lessons learned from a Bayesian analysis of basic economic time series models by means of the Gibbs sampling algorithm are presented. Models include the Cochrane-Orcutt model for serial correlation, the Koyck distributed lag model, the Unit Root model, the Instrumental Variables

  12. Bayesian spatial modeling of HIV mortality via zero-inflated Poisson models.

    Science.gov (United States)

    Musal, Muzaffer; Aktekin, Tevfik

    2013-01-30

    In this paper, we investigate the effects of poverty and inequality on the number of HIV-related deaths in 62 New York counties via Bayesian zero-inflated Poisson models that exhibit spatial dependence. We quantify inequality via the Theil index and poverty via the ratios of two Census 2000 variables, the number of people under the poverty line and the number of people for whom poverty status is determined, in each Zip Code Tabulation Area. The purpose of this study was to investigate the effects of inequality and poverty in addition to spatial dependence between neighboring regions on HIV mortality rate, which can lead to improved health resource allocation decisions. In modeling county-specific HIV counts, we propose Bayesian zero-inflated Poisson models whose rates are functions of both covariate and spatial/random effects. To show how the proposed models work, we used three different publicly available data sets: TIGER Shapefiles, Census 2000, and mortality index files. In addition, we introduce parameter estimation issues of Bayesian zero-inflated Poisson models and discuss MCMC method implications. Copyright © 2012 John Wiley & Sons, Ltd.
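
    The zero-inflated Poisson likelihood at the heart of such models mixes a point mass at zero with a Poisson count; a minimal log-likelihood sketch (without the covariate and spatial/random effects of the full model):

        import numpy as np
        from scipy.special import gammaln

        # Zero-inflated Poisson: with probability pi the count is a
        # structural zero, otherwise Poisson(lam).
        def zip_loglik(y, pi, lam):
            y = np.asarray(y)
            ll_zero = np.log(pi + (1 - pi) * np.exp(-lam))
            ll_pos = np.log(1 - pi) - lam + y * np.log(lam) - gammaln(y + 1)
            return np.where(y == 0, ll_zero, ll_pos).sum()

        print(zip_loglik([0, 0, 3, 1, 0, 7], pi=0.4, lam=2.2))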

  13. Bayesian Modeling of ChIP-chip Data Through a High-Order Ising Model

    KAUST Repository

    Mo, Qianxing

    2010-01-29

    ChIP-chip experiments are procedures that combine chromatin immunoprecipitation (ChIP) and DNA microarray (chip) technology to study a variety of biological problems, including protein-DNA interaction, histone modification, and DNA methylation. The most important feature of ChIP-chip data is that the intensity measurements of probes are spatially correlated because the DNA fragments are hybridized to neighboring probes in the experiments. We propose a simple, but powerful Bayesian hierarchical approach to ChIP-chip data through an Ising model with high-order interactions. The proposed method naturally takes into account the intrinsic spatial structure of the data and can be used to analyze data from multiple platforms with different genomic resolutions. The model parameters are estimated using the Gibbs sampler. The proposed method is illustrated using two publicly available data sets from Affymetrix and Agilent platforms, and compared with three alternative Bayesian methods, namely, Bayesian hierarchical model, hierarchical gamma mixture model, and Tilemap hidden Markov model. The numerical results indicate that the proposed method performs as well as the other three methods for the data from Affymetrix tiling arrays, but significantly outperforms the other three methods for the data from Agilent promoter arrays. In addition, we find that the proposed method has better operating characteristics in terms of sensitivities and false discovery rates under various scenarios. © 2010, The International Biometric Society.

  14. Modeling the cost-effectiveness of the integrated disease surveillance and response (IDSR) system: meningitis in Burkina Faso.

    Directory of Open Access Journals (Sweden)

    Zana C Somda

    BACKGROUND: Effective surveillance for infectious diseases is an essential component of public health. There are few studies estimating the cost-effectiveness of starting or improving disease surveillance. We present a cost-effectiveness analysis of the Integrated Disease Surveillance and Response (IDSR) strategy in Africa. METHODOLOGY/PRINCIPAL FINDINGS: To assess the impact of the IDSR in Africa, we used pre- and post-IDSR meningococcal meningitis surveillance data from Burkina Faso (1996-2002 and 2003-2007). IDSR implementation was correlated with a median reduction of 2 weeks to peak of outbreaks (25th percentile: 1 week; 75th percentile: 4 weeks). IDSR was also correlated with a reduction of 43 meningitis cases per 100,000 (25th percentile: 40; 75th percentile: 129). Assuming the correlations between reductions in time to peak of outbreaks and cases are related, the cost-effectiveness of IDSR was $23 per case averted (25th percentile: $30; 75th percentile: cost saving) and $98 per meningitis-related death averted (25th percentile: $140; 75th percentile: cost saving). CONCLUSIONS/SIGNIFICANCE: We cannot absolutely claim that the measured differences were due to IDSR. We believe, however, that it is reasonable to claim that IDSR can improve the cost-effectiveness of public health surveillance.

  15. Use of SAMC for Bayesian analysis of statistical models with intractable normalizing constants

    KAUST Repository

    Jin, Ick Hoon

    2014-03-01

    Statistical inference for the models with intractable normalizing constants has attracted much attention. During the past two decades, various approximation- or simulation-based methods have been proposed for the problem, such as the Monte Carlo maximum likelihood method and the auxiliary variable Markov chain Monte Carlo methods. The Bayesian stochastic approximation Monte Carlo algorithm specifically addresses this problem: It works by sampling from a sequence of approximate distributions with their average converging to the target posterior distribution, where the approximate distributions can be achieved using the stochastic approximation Monte Carlo algorithm. A strong law of large numbers is established for the Bayesian stochastic approximation Monte Carlo estimator under mild conditions. Compared to the Monte Carlo maximum likelihood method, the Bayesian stochastic approximation Monte Carlo algorithm is more robust to the initial guess of model parameters. Compared to the auxiliary variable MCMC methods, the Bayesian stochastic approximation Monte Carlo algorithm avoids the requirement for perfect samples, and thus can be applied to many models for which perfect sampling is not available or very expensive. The Bayesian stochastic approximation Monte Carlo algorithm also provides a general framework for approximate Bayesian analysis. © 2012 Elsevier B.V. All rights reserved.

  16. Cost-effectiveness of medical primary prevention strategies to reduce absolute risk of cardiovascular disease in Tanzania: a Markov modelling study.

    Science.gov (United States)

    Ngalesoni, Frida N; Ruhago, George M; Mori, Amani T; Robberstad, Bjarne; Norheim, Ole F

    2016-05-17

    Cardiovascular disease (CVD) is a growing cause of mortality and morbidity in Tanzania, but contextualized evidence on cost-effective medical strategies to prevent it is scarce. We aim to perform a cost-effectiveness analysis of medical interventions for primary prevention of CVD using the World Health Organization's (WHO) absolute risk approach for four risk levels. The cost-effectiveness analysis was performed from a societal perspective using two Markov decision models: CVD risk without diabetes and CVD risk with diabetes. Primary provider and patient costs were estimated using the ingredients approach and step-down methodologies. Epidemiological data and efficacy inputs were derived from systematic reviews and meta-analyses. We used disability-adjusted life years (DALYs) averted as the outcome measure. Sensitivity analyses were conducted to evaluate the robustness of the model results. For CVD low-risk patients without diabetes, medical management is not cost-effective unless willingness to pay (WTP) is higher than US$1327 per DALY averted. For moderate-risk patients, WTP must exceed US$164 per DALY before a combination of angiotensin converting enzyme inhibitor (ACEI) and diuretic (Diu) becomes cost-effective, while for high-risk and very high-risk patients the thresholds are US$349 (ACEI, calcium channel blocker (CCB) and Diu) and US$498 per DALY (ACEI, CCB, Diu and aspirin (ASA)), respectively. For patients with CVD risk and diabetes, a combination of sulfonylureas (Sulf), ACEI and CCB is the most cost-effective for low and moderate risk (incremental cost-effectiveness ratios (ICERs) of US$608 and US$115 per DALY, respectively), while adding a biguanide (Big) to this combination yielded the most favourable ICERs of US$309 and US$350 per DALY for high and very high risk, respectively. For the latter, ASA is also part of the combination. Medical preventive cardiology is very cost-effective for all risk levels except low CVD risk. Budget impact analyses and

  17. A Bayesian model for pooling gene expression studies that incorporates co-regulation information.

    Directory of Open Access Journals (Sweden)

    Erin M Conlon

    Current Bayesian microarray models that pool multiple studies assume gene expression is independent of other genes. However, in prokaryotic organisms, genes are arranged in units that are co-regulated (called operons). Here, we introduce a new Bayesian model for pooling gene expression studies that incorporates operon information into the model. Our Bayesian model borrows information from other genes within the same operon to improve estimation of gene expression. The model produces the gene-specific posterior probability of differential expression, which is the basis for inference. We found in simulations and in biological studies that incorporating co-regulation information improves upon the independence model. We assume that each study contains two experimental conditions: a treatment and control. We note that there exist environmental conditions for which genes that are supposed to be transcribed together lose their operon structure, and that our model is best carried out for known operon structures.

  18. The cost-effectiveness of different feeding patterns combined with prompt treatments for preventing mother-to-child HIV transmission in South Africa: estimates from simulation modeling.

    Directory of Open Access Journals (Sweden)

    Wenhua Yu

    OBJECTIVES: Based on the important changes in South Africa since 2009 and the recommendations of the 2013 Antiretroviral Treatment Guideline, we explored the cost-effectiveness of different strategy combinations according to South African HIV-infected mothers' prompt treatments and different feeding patterns. STUDY DESIGN: A decision analytic model was applied to simulate cohorts of 10,000 HIV-infected pregnant women to compare the cost-effectiveness of two different HIV strategy combinations: (1) women were tested and treated promptly at any time during pregnancy (promptly treated cohort); (2) women did not get testing or treatment until after delivery, and appropriate standard treatments were offered as a remedy (remedy cohort). Replacement feeding or exclusive breastfeeding was assigned in both strategies. Outcome measures included the number of infant HIV cases averted, the cost per infant HIV case averted, and the cost per life year (LY) saved from the interventions. One-way and multivariate sensitivity analyses were performed to estimate the uncertainty ranges of all outcomes. RESULTS: The remedy strategy is not particularly cost-effective. Compared with the untreated baseline cohort, which leads to 1127 infected infants, 698 (61.93%) and 110 (9.76%) of pediatric HIV cases are averted in the promptly treated cohort and remedy cohort, respectively, with incremental cost-effectiveness of $68.51 and $118.33 per LY, respectively. With or without antenatal testing and treatments, breastfeeding is less cost-effective ($193.26 per LY) than replacement feeding ($134.88 per LY), without considering the impact of willingness to pay. CONCLUSION: Compared with the prompt treatments, remedy in labor or during the postnatal period is less cost-effective. Antenatal HIV testing and prompt treatments and avoiding breastfeeding are the best strategies. Although encouraging mothers to practice replacement feeding in South Africa is far from easy and the advantages of

  19. Modelling the Impact and Cost-Effectiveness of Biomarker Tests as Compared with Pathogen-Specific Diagnostics in the Management of Undifferentiated Fever in Remote Tropical Settings.

    Directory of Open Access Journals (Sweden)

    Yoel Lubell

    Malaria accounts for a small fraction of febrile cases in increasingly large areas of the malaria endemic world. Point-of-care tests to improve the management of non-malarial fevers appropriate for primary care are few, consisting of either diagnostic tests for specific pathogens or testing for biomarkers of host response that indicate whether antibiotics might be required. The impact and cost-effectiveness of these approaches are relatively unexplored and methods to do so are not well-developed. We model the ability of dengue and scrub typhus rapid tests to inform antibiotic treatment, as compared with testing for elevated C-Reactive Protein (CRP), a biomarker of host inflammation. Using data on causes of fever in rural Laos, we estimate the proportion of outpatients that would be correctly classified as requiring an antibiotic and the likely cost-effectiveness of the approaches. Use of either pathogen-specific test slightly increased the proportion of patients correctly classified as requiring antibiotics. CRP testing was consistently superior to the pathogen-specific tests, despite heterogeneity in causes of fever. All testing strategies are likely to result in higher average costs, but only the scrub typhus and CRP tests are likely to be cost-effective when considering direct health benefits, with median cost per disability adjusted life year averted of approximately $48 USD and $94 USD, respectively. Testing for viral infections is unlikely to be cost-effective when considering only direct health benefits to patients. Testing for prevalent bacterial pathogens can be cost-effective, having the benefit of informing not only whether treatment is required, but also as to the most appropriate antibiotic; this advantage, however, varies widely in response to heterogeneity in causes of fever. Testing for biomarkers of host inflammation is likely to be consistently cost-effective despite high heterogeneity, and can also offer substantial reductions in

  20. Modelling the Impact and Cost-Effectiveness of Biomarker Tests as Compared with Pathogen-Specific Diagnostics in the Management of Undifferentiated Fever in Remote Tropical Settings.

    Science.gov (United States)

    Lubell, Yoel; Althaus, Thomas; Blacksell, Stuart D; Paris, Daniel H; Mayxay, Mayfong; Pan-Ngum, Wirichada; White, Lisa J; Day, Nicholas P J; Newton, Paul N

    2016-01-01

    Malaria accounts for a small fraction of febrile cases in increasingly large areas of the malaria endemic world. Point-of-care tests to improve the management of non-malarial fevers appropriate for primary care are few, consisting of either diagnostic tests for specific pathogens or testing for biomarkers of host response that indicate whether antibiotics might be required. The impact and cost-effectiveness of these approaches are relatively unexplored and methods to do so are not well-developed. We model the ability of dengue and scrub typhus rapid tests to inform antibiotic treatment, as compared with testing for elevated C-Reactive Protein (CRP), a biomarker of host-inflammation. Using data on causes of fever in rural Laos, we estimate the proportion of outpatients that would be correctly classified as requiring an antibiotic and the likely cost-effectiveness of the approaches. Use of either pathogen-specific test slightly increased the proportion of patients correctly classified as requiring antibiotics. CRP testing was consistently superior to the pathogen-specific tests, despite heterogeneity in causes of fever. All testing strategies are likely to result in higher average costs, but only the scrub typhus and CRP tests are likely to be cost-effective when considering direct health benefits, with median cost per disability adjusted life year averted of approximately $48 USD and $94 USD, respectively. Testing for viral infections is unlikely to be cost-effective when considering only direct health benefits to patients. Testing for prevalent bacterial pathogens can be cost-effective, having the benefit of informing not only whether treatment is required, but also as to the most appropriate antibiotic; this advantage, however, varies widely in response to heterogeneity in causes of fever. Testing for biomarkers of host inflammation is likely to be consistently cost-effective despite high heterogeneity, and can also offer substantial reductions in over-use of

  1. Bayesian Inference for Step-Stress Partially Accelerated Competing Failure Model under Type II Progressive Censoring

    Directory of Open Access Journals (Sweden)

    Xiaolin Shi

    2016-01-01

    This paper deals with the Bayesian inference on step-stress partially accelerated life tests using Type II progressively censored data in the presence of competing failure causes. Suppose that the occurrence time of the failure cause follows a Pareto distribution under use stress levels. Based on the tampered failure rate model, the objective Bayesian estimates, Bayesian estimates, and E-Bayesian estimates of the unknown parameters and acceleration factor are obtained under the squared loss function. To evaluate the performance of the obtained estimates, the average relative errors (AREs) and mean squared errors (MSEs) are calculated. In addition, the comparisons of the three estimates of the unknown parameters and acceleration factor for different sample sizes and different progressive censoring schemes are conducted through Monte Carlo simulations.

  2. Social and material deprivation and the cost-effectiveness of an intervention to promote physical activity: cohort study and Markov model.

    Science.gov (United States)

    Gulliford, Martin; Charlton, Judith; Bhattarai, Nawaraj; Rudisill, Caroline

    2014-12-01

    We developed a method to model the cost-effectiveness, at different levels of deprivation, of an intervention to promote physical activity. The cost-effectiveness of a brief intervention in primary care was estimated by means of a Markov model stratified by deprivation quintile. Estimates for disease incidence, mortality, depression prevalence and health service utilization were obtained from 282 887 participants in the UK Clinical Practice Research Datalink with linked deprivation scores. Discounted results were compared for the least deprived and most deprived quintiles. An effective intervention to promote physical activity continuing for 5 years gave an increase in life years free from disease: least deprived 54.9 (95% interval 17.5-93.5) per 1000 participants entering the model; most deprived 74.5 (22.8-128.0) per 1000. The overall incremental quality-adjusted life years were: least deprived, 3.7 per 1000 and most deprived, 6.1 per 1000, with the probability of being cost-effective at £30 000 per QALY 52.5% and 63.3%, respectively. When the intervention was modelled to be 30% less effective in the most deprived than the least deprived quintile, the probability cost-effective was least deprived 52.9% and most deprived 55.9%. Physical activity interventions may generate greater health benefits in deprived populations. When intervention effectiveness is attenuated in deprived groups, cost-effectiveness may sometimes still be similar to that in the most affluent groups. Even with favourable assumptions, evidence was insufficient to support wider use of presently available brief primary care interventions in a universal strategy for primary prevention. © The Author 2014, Published by Oxford University Press on behalf of Faculty of Public Health.
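
    The machinery of such a quintile-stratified Markov model is a cohort vector pushed through a transition matrix, accumulating discounted costs and QALYs each cycle; a three-state sketch with placeholder inputs (not the CPRD estimates), which would be run once per quintile with its own transition rates:

        import numpy as np

        # Three-state Markov cohort model: well -> disease -> dead.
        P = np.array([[0.96, 0.03, 0.01],     # rows: from-state probabilities
                      [0.00, 0.90, 0.10],
                      [0.00, 0.00, 1.00]])
        utility = np.array([0.85, 0.60, 0.0]) # QALY weight per state-cycle
        cost = np.array([100.0, 900.0, 0.0])  # annual cost per state

        state = np.array([1.0, 0.0, 0.0])     # cohort starts well
        qalys = costs = 0.0
        for cycle in range(50):               # 50 annual cycles, 3.5% discount
            disc = 1.035 ** -cycle
            qalys += disc * state @ utility
            costs += disc * state @ cost
            state = state @ P

        print(f"discounted QALYs {qalys:.2f}, costs {costs:.0f}")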

  3. Multivariate Bayesian modeling of known and unknown causes of events--an application to biosurveillance.

    Science.gov (United States)

    Shen, Yanna; Cooper, Gregory F

    2012-09-01

    This paper investigates Bayesian modeling of known and unknown causes of events in the context of disease-outbreak detection. We introduce a multivariate Bayesian approach that models multiple evidential features of every person in the population. This approach models and detects (1) known diseases (e.g., influenza and anthrax) by using informative prior probabilities and (2) unknown diseases (e.g., a new, highly contagious respiratory virus that has never been seen before) by using relatively non-informative prior probabilities. We report the results of simulation experiments which support that this modeling method can improve the detection of new disease outbreaks in a population. A contribution of this paper is that it introduces a multivariate Bayesian approach for jointly modeling both known and unknown causes of events. Such modeling has general applicability in domains where the space of known causes is incomplete. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  4. Fast and accurate Bayesian model criticism and conflict diagnostics using R-INLA

    KAUST Repository

    Ferkingstad, Egil

    2017-10-16

    Bayesian hierarchical models are increasingly popular for realistic modelling and analysis of complex data. This trend is accompanied by the need for flexible, general and computationally efficient methods for model criticism and conflict detection. Usually, a Bayesian hierarchical model incorporates a grouping of the individual data points, as, for example, with individuals in repeated measurement data. In such cases, the following question arises: Are any of the groups “outliers,” or in conflict with the remaining groups? Existing general approaches aiming to answer such questions tend to be extremely computationally demanding when model fitting is based on Markov chain Monte Carlo. We show how group-level model criticism and conflict detection can be carried out quickly and accurately through integrated nested Laplace approximations (INLA). The new method is implemented as a part of the open-source R-INLA package for Bayesian computing (http://r-inla.org).

  5. Bayesian interpolation in a dynamic sinusoidal model with application to packet-loss concealment

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Christensen, Mads Græsbøll; Cemgil, Ali Taylan

    2010-01-01

    In this paper, we consider Bayesian interpolation and parameter estimation in a dynamic sinusoidal model. This model is more flexible than the static sinusoidal model since it enables the amplitudes and phases of the sinusoids to be time-varying. For the dynamic sinusoidal model, we derive...

  6. Cost effectiveness of primary care referral to a commercial provider for weight loss treatment, relative to standard care: a modelled lifetime analysis.

    Science.gov (United States)

    Fuller, N R; Carter, H; Schofield, D; Hauner, H; Jebb, S A; Colagiuri, S; Caterson, I D

    2014-08-01

    Because of the high prevalence of overweight and obesity, there is a need to identify cost-effective approaches for weight loss in primary care and community settings. To evaluate the long-term cost effectiveness of a commercial weight loss programme (Weight Watchers) (CP) compared with standard care (SC), as defined by national guidelines. A Markov model was developed to calculate the incremental cost-effectiveness ratio (ICER), expressed as the cost per quality-adjusted life year (QALY) over the lifetime. The probabilities and quality-of-life utilities of outcomes were extrapolated from trial data using estimates from the published literature. A health sector perspective was adopted. Over a patient's lifetime, the CP resulted in an incremental cost saving of AUD 70 per patient, and an incremental 0.03 QALYs gained per patient. As such, the CP was found to be the dominant treatment, being more effective and less costly than SC (95% confidence interval: dominant to 6225 per QALY). Despite the CP delaying the onset of diabetes by ∼10 months, there was no significant difference in the incidence of type 2 diabetes, with the CP achieving <0.1% fewer cases than SC over the lifetime. The modelled results suggest that referral to community-based interventions may provide a highly cost-effective approach for those at high risk of weight-related comorbidities.

  7. The cost and impact of scaling up pre-exposure prophylaxis for HIV prevention: a systematic review of cost-effectiveness modelling studies.

    Directory of Open Access Journals (Sweden)

    Gabriela B Gomez

    Full Text Available Cost-effectiveness studies inform resource allocation, strategy, and policy development. However, due to their complexity, dependence on assumptions made, and inherent uncertainty, synthesising and generalising the results can be difficult. We assess cost-effectiveness models evaluating expected health gains and costs of HIV pre-exposure prophylaxis (PrEP) interventions. We conducted a systematic review comparing epidemiological and economic assumptions of cost-effectiveness studies using various modelling approaches. The following databases were searched (until January 2013): PubMed/Medline, ISI Web of Knowledge, Centre for Reviews and Dissemination databases, EconLIT, and region-specific databases. We included modelling studies reporting both cost and expected impact of a PrEP roll-out. We explored five issues: prioritisation strategies, adherence, behaviour change, toxicity, and resistance. Of 961 studies retrieved, 13 were included. Studies modelled populations (heterosexual couples, men who have sex with men, people who inject drugs) in generalised and concentrated epidemics from Southern Africa (including South Africa), Ukraine, USA, and Peru. PrEP was found to have the potential to be a cost-effective addition to HIV prevention programmes in specific settings. The extent of the impact of PrEP depended upon assumptions made concerning cost, epidemic context, programme coverage, prioritisation strategies, and individual-level adherence. Delivery of PrEP to key populations at highest risk of HIV exposure appears the most cost-effective strategy. Limitations of this review include the partial geographical coverage, our inability to perform a meta-analysis, and the paucity of information available exploring trade-offs between early treatment and PrEP. Our review identifies the main considerations to address in assessing cost-effectiveness analyses of a PrEP intervention: cost, epidemic context, individual adherence level, PrEP programme coverage

  8. Bayesian biostatistics

    CERN Document Server

    Lesaffre, Emmanuel

    2012-01-01

    The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introd

  9. Model selection on solid ground: Rigorous comparison of nine ways to evaluate Bayesian model evidence.

    Science.gov (United States)

    Schöniger, Anneli; Wöhling, Thomas; Samaniego, Luis; Nowak, Wolfgang

    2014-12-01

    Bayesian model selection or averaging objectively ranks a number of plausible, competing conceptual models based on Bayes' theorem. It implicitly performs an optimal trade-off between performance in fitting available data and minimum model complexity. The procedure requires determining Bayesian model evidence (BME), which is the likelihood of the observed data integrated over each model's parameter space. The computation of this integral is highly challenging because it is as high-dimensional as the number of model parameters. Three classes of techniques to compute BME are available, each with its own challenges and limitations: (1) Exact and fast analytical solutions are limited by strong assumptions. (2) Numerical evaluation quickly becomes unfeasible for expensive models. (3) Approximations known as information criteria (ICs) such as the AIC, BIC, or KIC (Akaike, Bayesian, or Kashyap information criterion, respectively) yield contradicting results with regard to model ranking. Our study features a theory-based intercomparison of these techniques. We further assess their accuracy in a simplistic synthetic example where for some scenarios an exact analytical solution exists. In more challenging scenarios, we use a brute-force Monte Carlo integration method as reference. We continue this analysis with a real-world application of hydrological model selection. This is a first-time benchmarking of the various methods for BME evaluation against true solutions. Results show that BME values from ICs are often heavily biased and that the choice of approximation method substantially influences the accuracy of model ranking. For reliable model selection, bias-free numerical methods should be preferred over ICs whenever computationally feasible.
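
    For a model with a conjugate prior the evidence integral has a closed form, which makes it easy to see the contrast drawn in this record: a brute-force Monte Carlo estimate converges to the true BME, while an information-criterion approximation need not. A minimal sketch, assuming a one-parameter Gaussian model and synthetic data (none of it from the study):

    ```python
    import numpy as np
    from scipy import stats
    from scipy.special import logsumexp

    rng = np.random.default_rng(0)

    # Toy model: y_i ~ N(theta, sigma^2) with conjugate prior theta ~ N(mu0, tau^2).
    sigma, mu0, tau = 1.0, 0.0, 2.0
    y = rng.normal(0.5, sigma, size=20)   # synthetic data

    # Exact log-evidence: marginally, y is multivariate normal with
    # covariance sigma^2 * I + tau^2 * 11^T.
    n = len(y)
    cov = sigma**2 * np.eye(n) + tau**2 * np.ones((n, n))
    log_bme_exact = stats.multivariate_normal.logpdf(y, mean=np.full(n, mu0), cov=cov)

    # Brute-force Monte Carlo: average the likelihood over prior draws (log space).
    thetas = rng.normal(mu0, tau, size=200_000)
    loglik = stats.norm.logpdf(y[:, None], loc=thetas, scale=sigma).sum(axis=0)
    log_bme_mc = logsumexp(loglik) - np.log(len(thetas))

    # BIC-style approximation: max log-likelihood minus (k/2) log(n), here k = 1.
    loglik_hat = stats.norm.logpdf(y, loc=y.mean(), scale=sigma).sum()
    log_bme_bic = loglik_hat - 0.5 * np.log(n)

    print(f"exact {log_bme_exact:.3f}  MC {log_bme_mc:.3f}  BIC {log_bme_bic:.3f}")
    ```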

  10. Probabilistic Inference: Task Dependency and Individual Differences of Probability Weighting Revealed by Hierarchical Bayesian Modeling.

    Science.gov (United States)

    Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno

    2016-01-01

    Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.
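
    A standard member of the (inverted) S-shaped family referred to here is the Tversky-Kahneman weighting function, in which a single parameter gamma controls the distortion (gamma = 1 leaves probabilities undistorted). The sketch below uses illustrative parameter values, not estimates from the study:

    ```python
    import numpy as np

    def tk_weight(p, gamma):
        """Tversky-Kahneman (1992) probability weighting function.

        Produces an inverted-S shape for gamma < 1: small probabilities are
        overweighted and large probabilities underweighted.
        """
        p = np.asarray(p, dtype=float)
        return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

    ps = np.array([0.01, 0.1, 0.5, 0.9, 0.99])
    for g in (1.0, 0.6):   # 0.6 is an illustrative individual-level value
        print(g, np.round(tk_weight(ps, g), 3))
    ```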

  11. Predictive models for pressure ulcers from intensive care unit electronic health records using Bayesian networks.

    Science.gov (United States)

    Kaewprag, Pacharmon; Newton, Cheryl; Vermillion, Brenda; Hyun, Sookyung; Huang, Kun; Machiraju, Raghu

    2017-07-05

    We develop predictive models enabling clinicians to better understand and explore patient clinical data along with risk factors for pressure ulcers in intensive care unit patients from electronic health record data. Identifying accurate risk factors of pressure ulcers is essential to determining appropriate prevention strategies; in this work we examine medication, diagnosis, and traditional Braden pressure ulcer assessment scale measurements as patient features. In order to predict pressure ulcer incidence and better understand the structure of related risk factors, we construct Bayesian networks from patient features. Bayesian network nodes (features) and edges (conditional dependencies) are simplified with statistical network techniques. Upon reviewing a network visualization of our model, our clinician collaborators were able to identify strong relationships between risk factors widely recognized as associated with pressure ulcers. We present a three-stage framework for predictive analysis of patient clinical data: 1) developing electronic health record feature extraction functions with the assistance of clinicians, 2) simplifying features, and 3) building Bayesian network predictive models. We evaluate all combinations of Bayesian network models from different search algorithms, scoring functions, prior structure initializations, and sets of features. From the EHRs of 7,717 ICU patients, we construct Bayesian network predictive models from 86 medication, diagnosis, and Braden scale features. Our model not only identifies known and suspected high pressure ulcer risk factors, but also substantially increases the sensitivity of the prediction (nearly three times that of logistic regression models) without sacrificing overall accuracy. We visualize a representative model with which our clinician collaborators identify strong relationships between risk factors widely recognized as associated with pressure ulcers. Given the strong adverse effect of pressure ulcers

  12. Numerical Demons in Monte Carlo Estimation of Bayesian Model Evidence with Application to Soil Respiration Models

    Science.gov (United States)

    Elshall, A. S.; Ye, M.; Niu, G. Y.; Barron-Gafford, G.

    2016-12-01

    Bayesian multimodel inference is increasingly being used in hydrology. Estimating Bayesian model evidence (BME) is of central importance in many Bayesian multimodel analyses, such as Bayesian model averaging and model selection. BME is the overall probability of the model in reproducing the data, accounting for the trade-off between goodness-of-fit and model complexity. Yet estimating BME is challenging, especially for high-dimensional problems with a complex sampling space. Estimating BME using Monte Carlo numerical methods is preferred, as these methods yield higher accuracy than semi-analytical solutions (e.g. Laplace approximations, BIC, KIC, etc.). However, numerical methods are prone to numerical demons arising from arithmetic underflow and round-off errors. Although a few studies have alluded to this issue, to our knowledge this is the first study that illustrates these numerical demons. We show that finite-precision arithmetic can impose a threshold on likelihood values and the Metropolis acceptance ratio, which results in trimming parameter regions (when the likelihood function is less than the smallest floating-point number that a computer can represent) and corrupting the empirical measures of the random states of the MCMC sampler (when using the log-likelihood function). We consider two of the most powerful numerical estimators of BME, the path sampling method of thermodynamic integration (TI) and the importance sampling method of steppingstone sampling (SS). We also consider the two most widely used numerical estimators, the prior sampling arithmetic mean (AM) and the posterior sampling harmonic mean (HM). We investigate the vulnerability of these four estimators to the numerical demons. Interestingly, the most biased estimator, namely the HM, turned out to be the least vulnerable. While it is generally assumed that AM is a bias-free estimator that will always approximate the true BME given sufficient computational effort, we show that arithmetic underflow can
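
    The underflow failure mode described in this record is easy to reproduce: averaging raw likelihoods for even a moderately sized dataset silently returns zero, whereas the same average computed in log space with a log-sum-exp reduction survives. A minimal sketch with synthetic Gaussian data (not the soil-respiration models of the study):

    ```python
    import numpy as np
    from scipy.special import logsumexp

    rng = np.random.default_rng(1)

    # Log-likelihoods of n = 1000 standard-normal data points evaluated at
    # 10,000 prior draws of the mean parameter theta ~ N(0, 1).
    y = rng.normal(size=1_000)
    thetas = rng.normal(size=10_000)
    n, s, ss = len(y), y.sum(), (y**2).sum()
    loglik = -0.5 * n * np.log(2 * np.pi) - 0.5 * (ss - 2 * thetas * s + n * thetas**2)

    # Naive arithmetic-mean (AM) estimator: every exp() underflows to 0.0,
    # since the log-likelihoods (around -1400) sit far below log(realmin) ~ -708.
    naive_am = np.exp(loglik).mean()

    # The same estimator computed entirely in log space survives.
    log_bme = logsumexp(loglik) - np.log(len(thetas))

    print(naive_am)   # 0.0: silent underflow
    print(log_bme)    # finite log-evidence estimate
    ```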

  13. Cost-Effectiveness Analysis of Etanercept in Combination with Methotrexate for Rheumatoid Arthritis - Markov Model Based on Data from Serbia

    Directory of Open Access Journals (Sweden)

    Kostic Marina

    2017-12-01

    Full Text Available Biological therapeutic strategies have shown clinical and radiological benefits for chronic and progressive rheumatoid arthritis (RA). Despite these results, the use of biological drugs in the treatment of RA is limited by high costs. The aim of this study was to compare the cost effectiveness of etanercept in combination with methotrexate against methotrexate alone in patients with RA in the socioeconomic environment of a Balkan country.

  14. Probabilistic inference: Task dependency and individual differences of probability weighting revealed by hierarchical Bayesian modelling

    Directory of Open Access Journals (Sweden)

    Moritz eBoos

    2016-05-01

    Full Text Available Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modelling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behaviour. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modelling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modelling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.

  15. Technical Note: Probabilistically constraining proxy age–depth models within a Bayesian hierarchical reconstruction model

    Directory of Open Access Journals (Sweden)

    J. P. Werner

    2015-03-01

    Full Text Available Reconstructions of the late-Holocene climate rely heavily upon proxies that are assumed to be accurately dated by layer counting, such as measurements of tree rings, ice cores, and varved lake sediments. Considerable advances could be achieved if time-uncertain proxies were able to be included within these multiproxy reconstructions, and if time uncertainties were recognized and correctly modeled for proxies commonly treated as free of age model errors. Current approaches for accounting for time uncertainty are generally limited to repeating the reconstruction using each one of an ensemble of age models, thereby inflating the final estimated uncertainty – in effect, each possible age model is given equal weighting. Uncertainties can be reduced by exploiting the inferred space–time covariance structure of the climate to re-weight the possible age models. Here, we demonstrate how Bayesian hierarchical climate reconstruction models can be augmented to account for time-uncertain proxies. Critically, although a priori all age models are given equal probability of being correct, the probabilities associated with the age models are formally updated within the Bayesian framework, thereby reducing uncertainties. Numerical experiments show that updating the age model probabilities decreases uncertainty in the resulting reconstructions, as compared with the current de facto standard of sampling over all age models, provided there is sufficient information from other data sources in the spatial region of the time-uncertain proxy. This approach can readily be generalized to non-layer-counted proxies, such as those derived from marine sediments.
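
    The re-weighting step described here amounts to Bayes' rule over a finite ensemble of age models: each member starts with equal prior probability and is re-weighted by how well the proxy, placed on that member's time scale, agrees with the climate inferred from the remaining records. A schematic sketch with invented misfit values:

    ```python
    import numpy as np

    # Suppose 5 candidate age models for one time-uncertain proxy record.
    # log_like[i] measures agreement of age model i with the space-time
    # covariance of the remaining (well-dated) proxies; values are invented.
    log_like = np.array([-12.1, -9.4, -8.7, -15.3, -9.9])

    prior = np.full(len(log_like), 1.0 / len(log_like))  # a priori equal weights

    # Posterior age-model probabilities (shifted in log space for stability).
    w = np.exp(log_like - log_like.max()) * prior
    w /= w.sum()
    print(np.round(w, 3))  # the reconstruction then averages over age models with these weights
    ```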

  16. Corruption of parameter behavior and regionalization by model and forcing data errors: a Bayesian example using the SNOW17 model

    NARCIS (Netherlands)

    He, M.; Hogue, T.S.; Franz, K.J.; Margulis, S.A.; Vrugt, J.A.

    2011-01-01

    The current study evaluates the impacts of various sources of uncertainty involved in hydrologic modeling on parameter behavior and regionalization utilizing different Bayesian likelihood functions and the Differential Evolution Adaptive Metropolis (DREAM) algorithm. The developed likelihood

  17. Cost effectiveness of strategies to combat vision and hearing loss in sub-Saharan Africa and South East Asia: mathematical modelling study.

    NARCIS (Netherlands)

    Baltussen, R.M.; Smith, A.

    2012-01-01

    OBJECTIVE: To determine the relative costs, effects, and cost effectiveness of selected interventions to control cataract, trachoma, refractive error, hearing loss, meningitis and chronic otitis media. DESIGN: Cost effectiveness analysis of single or combined strategies for controlling vision and hearing

  18. Bayesian Estimation of Graded Response Multilevel Models Using Gibbs Sampling: Formulation and Illustration

    Science.gov (United States)

    Natesan, Prathiba; Limbers, Christine; Varni, James W.

    2010-01-01

    The present study presents the formulation of graded response models in the multilevel framework (as nonlinear mixed models) and demonstrates their use in estimating item parameters and investigating the group-level effects for specific covariates using Bayesian estimation. The graded response multilevel model (GRMM) combines the formulation of…

  19. Bayesian Comparison of Alternative Graded Response Models for Performance Assessment Applications

    Science.gov (United States)

    Zhu, Xiaowen; Stone, Clement A.

    2012-01-01

    This study examined the relative effectiveness of Bayesian model comparison methods in selecting an appropriate graded response (GR) model for performance assessment applications. Three popular methods were considered: deviance information criterion (DIC), conditional predictive ordinate (CPO), and posterior predictive model checking (PPMC). Using…

  20. Bayesian networks for multivariate data analysis and prognostic modelling in cardiac surgery

    NARCIS (Netherlands)

    Peek, Niels; Verduijn, Marion; Rosseel, Peter M. J.; de Jonge, Evert; de Mol, Bas A.

    2007-01-01

    Prognostic models are tools to predict the outcome of disease and disease treatment. These models are traditionally built with supervised machine learning techniques, and consider prognosis as a static, one-shot activity. This paper presents a new type of prognostic model that builds on the Bayesian

  1. Bayesian model selection: Evidence estimation based on DREAM simulation and bridge sampling

    Science.gov (United States)

    Volpi, Elena; Schoups, Gerrit; Firmani, Giovanni; Vrugt, Jasper A.

    2017-04-01

    Bayesian inference has found widespread application in Earth and Environmental Systems Modeling, providing an effective tool for prediction, data assimilation, parameter estimation, uncertainty analysis and hypothesis testing. Under multiple competing hypotheses, the Bayesian approach also provides an attractive alternative to traditional information criteria (e.g. AIC, BIC) for model selection. The key variable for Bayesian model selection is the evidence (or marginal likelihood), the normalizing constant in the denominator of Bayes' theorem; while it is fundamental for model selection, the evidence is not required for Bayesian inference. It is computed for each hypothesis (model) by averaging the likelihood function over the prior parameter distribution, rather than maximizing it as information criteria do; the larger a model's evidence, the more support it receives among a collection of hypotheses, as the simulated values assign relatively high probability density to the observed data. Hence, the evidence naturally acts as an Occam's razor, preferring simpler and more constrained models and guarding against the selection of over-fitted ones by information criteria that incorporate only the likelihood maximum. Since it is not particularly easy to estimate the evidence in practice, Bayesian model selection via the marginal likelihood has not yet found mainstream use. We illustrate here the properties of a new estimator of the Bayesian model evidence, which provides robust and unbiased estimates of the marginal likelihood; the method is coined Gaussian Mixture Importance Sampling (GMIS). GMIS uses multidimensional numerical integration of the posterior parameter distribution via bridge sampling (a generalization of importance sampling) of a mixture distribution fitted to samples of the posterior distribution derived from the DREAM algorithm (Vrugt et al., 2008; 2009). Some illustrative examples are presented to show the robustness and superiority of the GMIS estimator with
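
    The estimator's central idea, fitting a density to posterior samples and using it as an importance density for the marginal likelihood, can be sketched for a one-dimensional problem with a known answer. The sketch below simplifies GMIS in two ways: a single Gaussian replaces the fitted mixture, and plain importance sampling replaces bridge sampling; all data are synthetic:

    ```python
    import numpy as np
    from scipy import stats
    from scipy.special import logsumexp

    rng = np.random.default_rng(2)

    # Conjugate toy problem with a known log-evidence:
    # y_i ~ N(theta, 1), prior theta ~ N(0, tau^2).
    tau = 2.0
    y = rng.normal(1.0, 1.0, size=30)
    n = len(y)
    cov = np.eye(n) + tau**2 * np.ones((n, n))
    log_z_exact = stats.multivariate_normal.logpdf(y, mean=np.zeros(n), cov=cov)

    # Stand-in for MCMC draws from the posterior (sampled exactly here).
    post_var = 1.0 / (n + 1.0 / tau**2)
    post_mean = post_var * y.sum()
    draws = rng.normal(post_mean, np.sqrt(post_var), size=5_000)

    # Importance density fitted to the posterior draws; GMIS would fit a
    # Gaussian mixture and apply bridge sampling instead.
    q_mean, q_std = draws.mean(), 1.5 * draws.std()   # inflate for heavier tails
    theta = rng.normal(q_mean, q_std, size=50_000)
    log_joint = (stats.norm.logpdf(y[:, None], loc=theta).sum(axis=0)
                 + stats.norm.logpdf(theta, 0.0, tau))
    log_w = log_joint - stats.norm.logpdf(theta, q_mean, q_std)
    log_z_is = logsumexp(log_w) - np.log(len(theta))

    print(f"exact {log_z_exact:.3f}  importance-sampling {log_z_is:.3f}")
    ```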

  2. Comparative performance of Bayesian and AIC-based measures of phylogenetic model uncertainty.

    Science.gov (United States)

    Alfaro, Michael E; Huelsenbeck, John P

    2006-02-01

    Reversible-jump Markov chain Monte Carlo (RJ-MCMC) is a technique for simultaneously evaluating multiple related (but not necessarily nested) statistical models that has recently been applied to the problem of phylogenetic model selection. Here we use a simulation approach to assess the performance of this method and compare it to Akaike weights, a measure of model uncertainty based on the Akaike information criterion. Under conditions where the assumptions of the candidate models matched the generating conditions, both Bayesian and AIC-based methods performed well. The 95% credible interval contained the generating model close to 95% of the time. However, the size of the credible interval differed, with the Bayesian credible set containing approximately 25% to 50% fewer models than an AIC-based credible interval. The posterior probability was a better indicator of the correct model than the Akaike weight when all assumptions were met, but both measures performed similarly when some model assumptions were violated. Models in the Bayesian posterior distribution were also more similar to the generating model in their number of parameters and were less biased in their complexity. In contrast, Akaike-weighted models were more distant from the generating model and biased towards slightly greater complexity. The AIC-based credible interval appeared to be more robust to the violation of the rate homogeneity assumption. Both AIC and Bayesian approaches suggest that substantial uncertainty can accompany the choice of model for phylogenetic analyses, suggesting that alternative candidate models should be examined in analysis of phylogenetic data. [AIC; Akaike weights; Bayesian phylogenetics; model averaging; model selection; model uncertainty; posterior probability; reversible jump.]
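
    Akaike weights, the AIC-based measure of model uncertainty used in this study, follow directly from AIC differences, and the AIC-based credible set is the smallest set of models whose weights accumulate to the chosen level. A minimal sketch with invented AIC values for four candidate models:

    ```python
    import numpy as np

    # Invented AIC values for four candidate phylogenetic models.
    aic = np.array([1000.2, 1001.5, 1004.8, 1012.3])

    delta = aic - aic.min()     # AIC differences
    w = np.exp(-0.5 * delta)
    w /= w.sum()                # Akaike weights, sum to 1

    # 95% credible set: smallest set of models with cumulative weight >= 0.95.
    order = np.argsort(w)[::-1]
    cum = np.cumsum(w[order])
    conf_set = order[: np.searchsorted(cum, 0.95) + 1]
    print(np.round(w, 3), "models in 95% set:", conf_set)
    ```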

  3. The cost-effectiveness of alternative vaccination strategies for polyvalent meningococcal vaccines in Burkina Faso: A transmission dynamic modeling study.

    Science.gov (United States)

    Yaesoubi, Reza; Trotter, Caroline; Colijn, Caroline; Yaesoubi, Maziar; Colombini, Anaïs; Resch, Stephen; Kristiansen, Paul A; LaForce, F Marc; Cohen, Ted

    2018-01-01

    circulating meningococcal serogroups can be aggregated into a single group; while this assumption is critical for model tractability, it would compromise the insights derived from our model if the effectiveness of the vaccine differs markedly between serogroups or if there are complex between-serogroup interactions that influence the frequency and magnitude of future meningitis epidemics. Our results suggest that a vaccination strategy that includes a catch-up nationwide immunization campaign in young adults with a PMC vaccine and the addition of this new vaccine into EPI is cost-effective and would avert a substantial portion of meningococcal cases expected under the current World Health Organization-recommended strategy of reactive vaccination. This analysis is limited to Burkina Faso and assumes that polyvalent vaccines offer equal protection against all meningococcal serogroups; further studies are needed to evaluate the robustness of this assumption and applicability for other countries in the meningitis belt.

  4. The cost-effectiveness of alternative vaccination strategies for polyvalent meningococcal vaccines in Burkina Faso: A transmission dynamic modeling study.

    Directory of Open Access Journals (Sweden)

    Reza Yaesoubi

    2018-01-01

    all circulating meningococcal serogroups can be aggregated into a single group; while this assumption is critical for model tractability, it would compromise the insights derived from our model if the effectiveness of the vaccine differs markedly between serogroups or if there are complex between-serogroup interactions that influence the frequency and magnitude of future meningitis epidemics. Our results suggest that a vaccination strategy that includes a catch-up nationwide immunization campaign in young adults with a PMC vaccine and the addition of this new vaccine into EPI is cost-effective and would avert a substantial portion of meningococcal cases expected under the current World Health Organization-recommended strategy of reactive vaccination. This analysis is limited to Burkina Faso and assumes that polyvalent vaccines offer equal protection against all meningococcal serogroups; further studies are needed to evaluate the robustness of this assumption and applicability for other countries in the meningitis belt.

  5. Verification of a decision analytic model assumption using real-world practice data: implications for the cost effectiveness of cyclo-oxygenase 2 inhibitors (COX-2s).

    Science.gov (United States)

    Cox, Emily R; Motheral, Brenda; Mager, Doug

    2003-12-01

    To verify the gastroprotective agent (GPA) rate assumption used in cost-effectiveness models for cyclo-oxygenase 2 inhibitors (COX-2s) and to re-estimate model outcomes using GPA rates from actual practice. Prescription and medical claims data obtained from January 1, 1999, through May 31, 2001, from a large preferred provider organization in the Midwest, were used to estimate GPA rates within 3 groups of patients aged at least 18 years who were new to nonselective nonsteroidal anti-inflammatory drugs (NSAIDs) and COX-2 therapy: all new NSAID users, new NSAID users with a diagnosis of rheumatoid arthritis (RA) or osteoarthritis (OA), and a matched cohort of new NSAID users. Of the more than 319,000 members with at least 1 day of eligibility, 1900 met the study inclusion criteria for new NSAID users, 289 had a diagnosis of OA or RA, and 1232 were included in the matched cohort. Gastroprotective agent estimates for nonselective NSAID and COX-2 users were consistent across all 3 samples (all new NSAID users, new NSAID users with a diagnosis of OA or RA, and the matched cohort), with COX-2 GPA rates of 22%, 21%, and 20%, and nonselective NSAID GPA rates of 15%, 15%, and 18%, respectively. Re-estimation of the cost-effectiveness model increased the cost per year of life saved for COX-2s from $18,614 to more than $100,000. Contrary to COX-2 cost-effectiveness model assumptions, the rate of GPA use is positive and marginally higher among COX-2 users than among nonselective NSAID users. These findings call into question the use of expert opinion in estimating practice pattern model inputs prior to a product's use in clinical practice. A re-evaluation of COX-2 cost-effectiveness models is warranted.

  6. A surrogate-based sensitivity quantification and Bayesian inversion of a regional groundwater flow model

    Science.gov (United States)

    Chen, Mingjie; Izady, Azizallah; Abdalla, Osman A.; Amerjeed, Mansoor

    2018-02-01

    Bayesian inference using Markov chain Monte Carlo (MCMC) provides an explicit framework for stochastic calibration of hydrogeologic models accounting for uncertainties; however, MCMC sampling entails a large number of model calls and can easily become computationally unwieldy if the high-fidelity hydrogeologic model simulation is time-consuming. This study proposes a surrogate-based Bayesian framework to address this notorious issue and illustrates the methodology by inverse modeling of a regional MODFLOW model. The high-fidelity groundwater model is approximated by a fast statistical model using the Bagging Multivariate Adaptive Regression Spline (BMARS) algorithm, so that MCMC sampling can be performed efficiently. In this study, the MODFLOW model is developed to simulate groundwater flow in an arid region of Oman consisting of mountain-coast aquifers, and is used to run representative simulations to generate a training dataset for BMARS model construction. A BMARS-based Sobol' method is also employed to efficiently calculate input parameter sensitivities, which are used to evaluate and rank parameter importance for the groundwater flow model system. According to the sensitivity analysis, insensitive parameters are screened out of the Bayesian inversion of the MODFLOW model, further saving computing effort. The posterior probability distribution of the input parameters is efficiently inferred from the prescribed prior distribution using observed head data, demonstrating that the presented BMARS-based Bayesian framework is an efficient tool to reduce parameter uncertainties of a groundwater system.
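
    The workflow described here, training a cheap emulator on a limited budget of expensive simulations and then running MCMC against the emulator alone, can be sketched generically. The sketch below substitutes a quadratic-polynomial surrogate and a hand-rolled Metropolis sampler for the paper's BMARS/MODFLOW machinery; the "simulator", observation, and priors are all invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def expensive_model(k):
        """Stand-in for a slow groundwater simulator: head as a function of log-K."""
        return 10.0 + 3.0 * k - 0.5 * k**2

    # 1) Training runs of the high-fidelity model (the costly step, done once).
    k_train = np.linspace(-2, 2, 15)
    h_train = expensive_model(k_train)

    # 2) Fast surrogate: a quadratic fit (BMARS in the paper).
    surrogate = np.poly1d(np.polyfit(k_train, h_train, deg=2))

    # 3) Metropolis sampling against the surrogate only.
    h_obs, obs_sigma = 12.0, 0.5       # invented observation and error
    def log_post(k):
        log_prior = -0.5 * (k / 2.0) ** 2                       # N(0, 2^2) prior
        log_like = -0.5 * ((h_obs - surrogate(k)) / obs_sigma) ** 2
        return log_prior + log_like

    k, chain = 0.0, []
    for _ in range(20_000):
        k_new = k + rng.normal(0.0, 0.3)
        if np.log(rng.uniform()) < log_post(k_new) - log_post(k):
            k = k_new
        chain.append(k)

    print("posterior mean of log-K:", np.mean(chain[5_000:]))
    ```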

  7. The modelled cost-effectiveness of cognitive dissonance for the prevention of anorexia nervosa and bulimia nervosa in adolescent girls in Australia.

    Science.gov (United States)

    Le, Long Khanh-Dao; Barendregt, Jan J; Hay, Phillipa; Sawyer, Susan M; Paxton, Susan J; Mihalopoulos, Cathrine

    2017-07-01

    Eating disorders (EDs), including anorexia nervosa (AN) and bulimia nervosa (BN), are prevalent disorders that carry substantial economic and social burden. The aim of the current study was to evaluate the modelled population cost-effectiveness of cognitive dissonance (CD), a school-based preventive intervention for EDs, in the Australian health care context. A population-based Markov model was developed to estimate the cost per disability adjusted life-year (DALY) averted by CD relative to no intervention. We modelled the cases of AN and BN that could be prevented over a 10-year time horizon in each study arm and the subsequent reduction in DALYs associated with this. The target population was 15-18 year old secondary school girls with high body-image concerns. This study only considered costs of the health sector providing services and not costs to individuals. Multivariate probabilistic and one-way sensitivity analyses were conducted to test model assumptions. Findings showed that the mean incremental cost-effectiveness ratio at base-case for the intervention was $103,980 per DALY averted with none of the uncertainty iterations falling below the threshold of AUD$50,000 per DALY averted. The evaluation was most sensitive to estimates of participant rates with higher rates associated with more favourable results. The intervention would become cost-effective (84% chance) if the effect of the intervention lasted up to 5 years. As modelled, school-based CD intervention is not a cost-effective preventive intervention for AN and BN. Given the burden of EDs, understanding how to improve participation rates is an important opportunity for future research. © 2017 Wiley Periodicals, Inc.

  8. Simplifying Probability Elicitation and Uncertainty Modeling in Bayesian Networks

    Energy Technology Data Exchange (ETDEWEB)

    Paulson, Patrick R; Carroll, Thomas E; Sivaraman, Chitra; Neorr, Peter A; Unwin, Stephen D; Hossain, Shamina S

    2011-04-16

    In this paper we contribute two methods that simplify the demands of knowledge elicitation for particular types of Bayesian networks. The first method simplifies the task of providing probabilities when the states that a random variable takes can be described by a new, fully ordered state set in which each state implies all the preceding states. The second method leverages the Dempster-Shafer theory of evidence to provide a way for the expert to express the degree of ignorance they feel about the estimates being provided.

  9. Banking Crisis Early Warning Model based on a Bayesian Model Averaging Approach

    Directory of Open Access Journals (Sweden)

    Taha Zaghdoudi

    2016-08-01

    Full Text Available The succession of banking crises, most of which have resulted in huge economic and financial losses, has prompted several authors to study their determinants. These authors constructed early warning models to prevent such crises from occurring. It is in this vein that our study takes its inspiration. In particular, we have developed a warning model of banking crises based on a Bayesian approach. The results of this approach allowed us to identify the involvement of declining bank profitability, deterioration of the competitiveness of traditional intermediation, banking concentration and higher real interest rates in triggering banking crises.

  10. A Bayesian state-space model for mixed-stock migrations, with ...

    African Journals Online (AJOL)

    We present a multi-stock, multi-fleet, multi-area, seasonally structured Bayesian state-space model in which different stocks spawn in spatially different areas and the mixing of these stocks is explicitly accounted for in the absence of sufficient tagging data with which to estimate migration rates. The model is applied to the ...

  11. Guidelines for developing and updating Bayesian belief networks applied to ecological modeling and conservation.

    Science.gov (United States)

    B.G. Marcot; J.D. Steventon; G.D. Sutherland; R.K. McCann

    2006-01-01

    We provide practical guidelines for developing, testing, and revising Bayesian belief networks (BBNs). Primary steps in this process include creating influence diagrams of the hypothesized "causal web" of key factors affecting a species or ecological outcome of interest; developing a first, alpha-level BBN model from the influence diagram; revising the model...

  12. Non-parametric Bayesian graph models reveal community structure in resting state fMRI

    DEFF Research Database (Denmark)

    Andersen, Kasper Winther; Madsen, Kristoffer H.; Siebner, Hartwig Roman

    2014-01-01

    Modeling of resting state functional magnetic resonance imaging (rs-fMRI) data using network models is of increasing interest. It is often desirable to group nodes into clusters to interpret the communication patterns between nodes. In this study we consider three different nonparametric Bayesian...

  13. Adaptive mastery testing using the Rasch model and Bayesian sequential decision theory

    NARCIS (Netherlands)

    Glas, Cornelis A.W.; Vos, Hendrik J.

    1998-01-01

    A version of sequential mastery testing is studied in which response behavior is modeled by an item response theory (IRT) model. First, a general theoretical framework is sketched that is based on a combination of Bayesian sequential decision theory and item response theory. A discussion follows on

  14. Bayesian prediction of spatial count data using generalized linear mixed models

    DEFF Research Database (Denmark)

    Christensen, Ole Fredslund; Waagepetersen, Rasmus Plenge

    2002-01-01

    Spatial weed count data are modeled and predicted using a generalized linear mixed model combined with a Bayesian approach and Markov chain Monte Carlo. Informative priors for a data set with sparse sampling are elicited using a previously collected data set with extensive sampling. Furthermore, ...

  15. A Bayesian Multi-Level Factor Analytic Model of Consumer Price Sensitivities across Categories

    Science.gov (United States)

    Duvvuri, Sri Devi; Gruca, Thomas S.

    2010-01-01

    Identifying price sensitive consumers is an important problem in marketing. We develop a Bayesian multi-level factor analytic model of the covariation among household-level price sensitivities across product categories that are substitutes. Based on a multivariate probit model of category incidence, this framework also allows the researcher to…

  16. A Bayesian MCMC method for point process models with intractable normalising constants

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2004-01-01

    to simulate from the "unknown distribution", perfect simulation algorithms become useful. We illustrate the method in cases where the likelihood is given by a Markov point process model. Particularly, we consider semi-parametric Bayesian inference in connection to both inhomogeneous Markov point process models... and pairwise interaction point processes...

  17. Multi-objective calibration of forecast ensembles using Bayesian model averaging

    NARCIS (Netherlands)

    Vrugt, J.A.; Clark, M.P.; Diks, C.G.H.; Duan, Q.; Robinson, B.A.

    2006-01-01

    Bayesian Model Averaging (BMA) has recently been proposed as a method for statistical postprocessing of forecast ensembles from numerical weather prediction models. The BMA predictive probability density function (PDF) of any weather quantity of interest is a weighted average of PDFs centered on the
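
    The BMA predictive PDF described here is a plain mixture: each ensemble member contributes a density centered on its forecast, scaled by the member's weight. A minimal sketch with invented forecasts, weights, and spread:

    ```python
    import numpy as np
    from scipy import stats

    # Invented forecasts of one weather quantity from a 3-member ensemble.
    forecasts = np.array([21.4, 22.9, 20.8])   # member point forecasts
    weights = np.array([0.5, 0.3, 0.2])        # BMA weights (sum to 1)
    sigma = 1.2                                # common member spread

    def bma_pdf(y):
        """BMA predictive density: weighted average of member-centered PDFs."""
        return np.sum(weights * stats.norm.pdf(y, loc=forecasts, scale=sigma))

    print(bma_pdf(22.0))                         # predictive density at y = 22
    print("BMA mean:", np.dot(weights, forecasts))
    ```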

  18. Hierarchical Bayesian modeling of the space - time diffusion patterns of cholera epidemic in Kumasi, Ghana

    NARCIS (Netherlands)

    Osei, Frank B.; Osei, F.B.; Duker, Alfred A.; Stein, A.

    2011-01-01

    This study analyses the joint effects of the two transmission routes of cholera on the space-time diffusion dynamics. Statistical models are developed and presented to investigate the transmission network routes of cholera diffusion. A hierarchical Bayesian modelling approach is employed for a joint

  19. Bayesian estimation and hypothesis tests for a circular Generalized Linear Model

    NARCIS (Netherlands)

    Mulder, Kees; Klugkist, Irene

    2017-01-01

    Motivated by a study from cognitive psychology, we develop a Generalized Linear Model for circular data within the Bayesian framework, using the von Mises distribution. Although circular data arise in a wide variety of scientific fields, the number of methods for their analysis is limited. Our model

  20. Bayesian prediction of spatial count data using generalized linear mixed models

    DEFF Research Database (Denmark)

    Christensen, Ole Fredslund; Waagepetersen, Rasmus Plenge

    2002-01-01

    Spatial weed count data are modeled and predicted using a generalized linear mixed model combined with a Bayesian approach and Markov chain Monte Carlo. Informative priors for a data set with sparse sampling are elicited using a previously collected data set with extensive sampling. Furthermore, we...

  1. SensibleSleep: A Bayesian Model for Learning Sleep Patterns from Smartphone Events

    DEFF Research Database (Denmark)

    Cuttone, Andrea; Bækgaard, Per; Sekara, Vedran

    2017-01-01

    participants from two different datasets, and we verify the results against ground truth from dedicated armband sleep trackers. We show that the model is able to produce reliable sleep estimates with an accuracy of 0.89, both at the individual and at the collective level. Moreover the Bayesian model is able...

  2. Bayesian networks with a logistic regression model for the conditional probabilities

    NARCIS (Netherlands)

    Rijmen, F.P.J.

    2008-01-01

    Logistic regression techniques can be used to restrict the conditional probabilities of a Bayesian network for discrete variables. More specifically, each variable of the network can be modeled through a logistic regression model, in which the parents of the variable define the covariates. When all
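
    The construction described in this record is compact to write down: a node's conditional probability table is replaced by a logistic regression whose covariates are the parent states, so a node with k binary parents needs k + 1 parameters instead of 2^k table entries. A minimal sketch with invented coefficients:

    ```python
    import itertools
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Node X with parents (A, B); the conditional probability distribution is
    # a logistic regression in which the parent states are the covariates.
    w0, w = -1.0, np.array([2.0, 0.5])   # invented intercept and parent effects

    # The full conditional probability table follows from the 4 parent configs.
    for a, b in itertools.product((0, 1), repeat=2):
        p = sigmoid(w0 + w @ np.array([a, b]))
        print(f"P(X=1 | A={a}, B={b}) = {p:.3f}")
    ```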

  3. Featuring Multiple Local Optima to Assist the User in the Interpretation of Induced Bayesian Network Models

    DEFF Research Database (Denmark)

    Dalgaard, Jens; Pena, Jose; Kocka, Tomas

    2004-01-01

    We propose a method to assist the user in the interpretation of the best Bayesian network model induced from data. The method consists in extracting relevant features from the model (e.g. edges, directed paths and Markov blankets) and then assessing the confidence in them by studying multiple...

  4. Bayesian network models for the management of ventilator-associated pneumonia

    NARCIS (Netherlands)

    Visscher, S.

    2008-01-01

    The purpose of the research described in this thesis was to develop Bayesian network models for the analysis of patient data, as well as to use such a model as a clinical decision-support system for assisting clinicians in the diagnosis and treatment of ventilator-associated pneumonia (VAP) in

  5. A Danish cost-effectiveness model of escitalopram in comparison with citalopram and venlafaxine as first-line treatments for major depressive disorder in primary care.

    Science.gov (United States)

    Sørensen, Jan; Stage, Kurt B; Damsbo, Niels; Le Lay, Agathe; Hemels, Michiel E

    2007-01-01

    The objective of this study was to model the cost-effectiveness of escitalopram in comparison with generic citalopram and venlafaxine in primary care treatment of major depressive disorder (baseline scores 22-40 on the Montgomery-Asberg Depression Rating Scale, MADRS) in Denmark. A three-path decision analytic model with a 6-month horizon was used. All patients started at the primary care path and were referred to outpatient or inpatient secondary care in the case of insufficient response to treatment. Model inputs included drug-specific probabilities derived from systematic literature review, an ad-hoc survey and expert opinion. The main outcome measure was remission, defined by MADRS score; the expected remission rate was higher for escitalopram (64.1%) than for citalopram (58.9%). From both perspectives, the total expected cost per successfully treated patient was lower for escitalopram (DKK 22,323 healthcare, DKK 72,399 societal) than for citalopram (DKK 25,778 healthcare, DKK 87,786 societal). Remission rates and costs were similar for escitalopram and venlafaxine. Robustness of the findings was verified in multivariate sensitivity analyses. For patients in primary care, escitalopram appears to be a cost-effective alternative to (generic) citalopram, with greater clinical benefit and cost-savings, and similar in cost-effectiveness to venlafaxine.

  6. Basic and Advanced Bayesian Structural Equation Modeling With Applications in the Medical and Behavioral Sciences

    CERN Document Server

    Lee, Sik-Yum

    2012-01-01

    This book provides clear instructions to researchers on how to apply Structural Equation Models (SEMs) for analyzing the interrelationships between observed and latent variables. Basic and Advanced Bayesian Structural Equation Modeling introduces basic and advanced SEMs for analyzing various kinds of complex data, such as ordered and unordered categorical data, multilevel data, mixture data, longitudinal data, highly non-normal data, as well as some of their combinations. In addition, Bayesian semiparametric SEMs to capture the true distribution of explanatory latent variables are introduced

  7. Bayesian conditional-independence modeling of the AIDS epidemic in England and Wales

    Science.gov (United States)

    Gilks, Walter R.; De Angelis, Daniela; Day, Nicholas E.

    We describe the use of conditional-independence modeling, Bayesian inference and Markov chain Monte Carlo, to model and project the HIV-AIDS epidemic in homosexual/bisexual males in England and Wales. Complexity in this analysis arises through selectively missing data, indirectly observed underlying processes, and measurement error. Our emphasis is on presentation and discussion of the concepts, not on the technicalities of this analysis, which can be found elsewhere [D. De Angelis, W.R. Gilks, N.E. Day, Bayesian projection of the acquired immune deficiency syndrome epidemic (with discussion), Applied Statistics, in press].

  8. Sparse Estimation Using Bayesian Hierarchical Prior Modeling for Real and Complex Linear Models

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand; Manchón, Carles Navarro; Badiu, Mihai Alin

    2015-01-01

    In sparse Bayesian learning (SBL), Gaussian scale mixtures (GSMs) have been used to model sparsity-inducing priors that realize a class of concave penalty functions for the regression task in real-valued signal models. Motivated by the relative scarcity of formal tools for SBL in complex-valued models, this paper proposes a GSM model - the Bessel K model - that induces concave penalty functions for the estimation of complex sparse signals. The properties of the Bessel K model are analyzed when it is applied to Type I and Type II estimation. This analysis reveals that, by tuning the parameters of the mixing pdf, different penalty functions are invoked depending on the estimation type used, the value of the noise variance, and whether real or complex signals are estimated. Using the Bessel K model, we derive a sparse estimator based on a modification of the expectation-maximization algorithm formulated...

  9. Public Health Impact and Cost-Effectiveness of Hepatitis A Vaccination in the United States: A Disease Transmission Dynamic Modeling Approach.

    Science.gov (United States)

    Dhankhar, Praveen; Nwankwo, Chizoba; Pillsbury, Matthew; Lauschke, Andreas; Goveia, Michelle G; Acosta, Camilo J; Elbasha, Elamin H

    2015-06-01

    To assess the population-level impact and cost-effectiveness of hepatitis A vaccination programs in the United States. We developed an age-structured population model of hepatitis A transmission dynamics to evaluate two policies of administering a two-dose hepatitis A vaccine to children aged 12 to 18 months: 1) universal routine vaccination as recommended by the Advisory Committee on Immunization Practices in 2006 and 2) the Advisory Committee on Immunization Practices' previous regional policy of routine vaccination of children living in states with high hepatitis A incidence. Inputs were obtained from the published literature, public sources, and clinical trial data. The model was fitted to hepatitis A seroprevalence (National Health and Nutrition Examination Survey II and III) and reported incidence from the National Notifiable Diseases Surveillance System (1980-1995). We used a societal perspective and projected costs (in 2013 US $), quality-adjusted life-years, incremental cost-effectiveness ratio, and other outcomes over the period 2006 to 2106. On average, universal routine hepatitis A vaccination prevented 259,776 additional infections, 167,094 outpatient visits, 4781 hospitalizations, and 228 deaths annually. Compared with the regional vaccination policy, universal routine hepatitis A vaccination was cost saving. In scenario analysis, universal vaccination prevented 94,957 infections, 46,179 outpatient visits, 1286 hospitalizations, and 15 deaths annually and had an incremental cost-effectiveness ratio of $21,223/quality-adjusted life-year when herd protection was ignored. Our model predicted that universal childhood hepatitis A vaccination led to significant reductions in hepatitis A mortality and morbidity. Consequently, universal vaccination was cost saving compared with a regional vaccination policy. Herd protection effects of hepatitis A vaccination programs had a significant impact on hepatitis A mortality, morbidity, and cost-effectiveness ratios

  10. Cost Effective Prototyping

    Science.gov (United States)

    Wickman, Jerry L.; Kundu, Nikhil K.

    1996-01-01

    This laboratory exercise seeks to develop a cost effective prototype development. The exercise has the potential of linking part design, CAD, mold development, quality control, metrology, mold flow, materials testing, fixture design, automation, limited parts production and other issues as related to plastics manufacturing.

  11. Bayesian Hierarchical Scale Mixtures of Log-Normal Models for Inference in Reliability with Stochastic Constraint

    Directory of Open Access Journals (Sweden)

    Hea-Jung Kim

    2017-06-01

    Full Text Available This paper develops Bayesian inference in reliability of a class of scale mixtures of log-normal failure time (SMLNFT) models with stochastic (or uncertain) constraints on their reliability measures. The class is comprehensive and includes existing failure time (FT) models (such as log-normal, log-Cauchy, and log-logistic FT models) as well as new models that are robust in terms of heavy-tailed FT observations. Since classical frequentist approaches to reliability analysis based on the SMLNFT model with stochastic constraint are intractable, the Bayesian method is pursued utilizing a Markov chain Monte Carlo (MCMC) sampling-based approach. This paper introduces a two-stage maximum entropy (MaxEnt) prior, which elicits the a priori uncertain constraint, and develops a Bayesian hierarchical SMLNFT model using the prior. The paper also proposes an MCMC method for Bayesian inference in the SMLNFT model reliability and calls attention to properties of the MaxEnt prior that are useful for method development. Finally, two data sets are used to illustrate how the proposed methodology works.

  12. Bayesian analysis of data and model error in rainfall-runoff hydrological models

    Science.gov (United States)

    Kavetski, D.; Franks, S. W.; Kuczera, G.

    2004-12-01

    A major unresolved issue in the identification and use of conceptual hydrologic models is realistic description of uncertainty in the data and model structure. In particular, hydrologic parameters often cannot be measured directly and must be inferred (calibrated) from observed forcing/response data (typically, rainfall and runoff). However, rainfall varies significantly in space and time, yet is often estimated from sparse gauge networks. Recent work showed that current calibration methods (e.g., standard least squares, multi-objective calibration, generalized likelihood uncertainty estimation) ignore forcing uncertainty and assume that the rainfall is known exactly. Consequently, they can yield strongly biased and misleading parameter estimates. This deficiency confounds attempts to reliably test model hypotheses, to generalize results across catchments (the regionalization problem) and to quantify predictive uncertainty when the hydrologic model is extrapolated. This paper continues the development of a Bayesian total error analysis (BATEA) methodology for the calibration and identification of hydrologic models, which explicitly incorporates the uncertainty in both the forcing and response data, and allows systematic model comparison based on residual model errors and formal Bayesian hypothesis testing (e.g., using Bayes factors). BATEA is based on explicit stochastic models for both forcing and response uncertainty, whereas current techniques focus solely on response errors. Hence, unlike existing methods, the BATEA parameter equations directly reflect the modeler's confidence in all the data. We compare several approaches to approximating the parameter distributions: a) full Markov Chain Monte Carlo methods and b) simplified approaches based on linear approximations. Studies using synthetic and real data from the US and Australia show that BATEA systematically reduces the parameter bias, leads to more meaningful model fits and allows model comparison taking

  13. A Bayesian Stepwise Discriminant Model for Predicting Risk Factors of Preterm Premature Rupture of Membranes: A Case-control Study

    Directory of Open Access Journals (Sweden)

    Li-Xia Zhang

    2017-01-01

    Conclusions: This study established a Bayesian stepwise discriminant model to predict the incidence of PPROM. UU, CT, and GBS infections were discriminant factors for PPROM according to the Bayesian stepwise discriminant analysis. This model could provide a new method for the early prediction of PPROM in pregnant women.

  14. Introduction of a methodology for visualization and graphical interpretation of Bayesian classification models.

    Science.gov (United States)

    Balfer, Jenny; Bajorath, Jürgen

    2014-09-22

    Supervised machine learning models are widely used in chemoinformatics, especially for the prediction of new active compounds or targets of known actives. Bayesian classification methods are among the most popular machine learning approaches for the prediction of activity from chemical structure. Much work has focused on predicting structure-activity relationships (SARs) on the basis of experimental training data. By contrast, only a few efforts have thus far been made to rationalize the performance of Bayesian or other supervised machine learning models and better understand why they might succeed or fail. In this study, we introduce an intuitive approach for the visualization and graphical interpretation of naïve Bayesian classification models. Parameters derived during supervised learning are visualized and interactively analyzed to gain insights into model performance and identify features that determine predictions. The methodology is introduced in detail and applied to assess Bayesian modeling efforts and predictions on compound data sets of varying structural complexity. Different classification models and features determining their performance are characterized in detail. A prototypic implementation of the approach is provided.
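
    One way to make a naive Bayes model graphically interpretable, in the spirit of this record, is to decompose each prediction into additive per-feature log-odds contributions that can be ranked or plotted. A minimal sketch for a Bernoulli naive Bayes classifier with invented parameters (not the paper's visualization method):

    ```python
    import numpy as np

    # Bernoulli naive Bayes with 4 binary features (e.g., substructure fingerprints).
    # p1[j], p0[j]: P(feature j = 1 | active) and P(feature j = 1 | inactive); invented.
    p1 = np.array([0.80, 0.10, 0.55, 0.30])
    p0 = np.array([0.20, 0.40, 0.50, 0.05])
    log_prior_odds = np.log(0.3 / 0.7)   # assumed P(active) = 0.3

    x = np.array([1, 0, 1, 1])           # one test compound

    # Per-feature contribution to the log-odds of "active"; the sum plus the
    # prior term gives the full naive Bayes decision value.
    contrib = np.where(x == 1, np.log(p1 / p0), np.log((1 - p1) / (1 - p0)))
    log_odds = log_prior_odds + contrib.sum()

    for j, c in enumerate(contrib):
        print(f"feature {j}: {c:+.2f}")
    print(f"P(active | x) = {1 / (1 + np.exp(-log_odds)):.3f}")
    ```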

  15. Introduction to Bayesian statistics

    CERN Document Server

    Bolstad, William M

    2017-01-01

    There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts present only frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for Multivariate Normal mean vector; Bayesian inference for Multiple Linear Regression Model; and Computati...

  16. Risk prediction model for knee pain in the Nottingham community: a Bayesian modelling approach.

    Science.gov (United States)

    Fernandes, G S; Bhattacharya, A; McWilliams, D F; Ingham, S L; Doherty, M; Zhang, W

    2017-03-20

    Twenty-five percent of the British population over the age of 50 years experiences knee pain. Knee pain can limit physical ability and cause distress, and it carries significant socioeconomic costs. The objectives of this study were to develop and validate the first risk prediction model for incident knee pain in the Nottingham community, validating it internally within the Nottingham cohort and externally within the Osteoarthritis Initiative (OAI) cohort. A total of 1822 participants from the Nottingham community who were at risk for knee pain were followed for 12 years. Of this cohort, two-thirds (n = 1203) were used to develop the risk prediction model, and one-third (n = 619) were used to validate the model. Incident knee pain was defined as pain on most days for at least 1 month in the past 12 months. Predictors were age, sex, body mass index, pain elsewhere, prior knee injury and knee alignment. A Bayesian logistic regression model was used to determine the probability of an OR >1. The Hosmer-Lemeshow χ2 statistic (HLS) was used for calibration, and ROC curve analysis was used for discrimination. The OAI cohort from the United States was also used to examine the performance of the model. A risk prediction model for knee pain incidence was developed using a Bayesian approach. The model had good calibration, with an HLS of 7.17 (p = 0.52) and moderate discriminative ability (ROC 0.70) in the community. Individual scenarios are given using the model. However, the model had poor calibration in the OAI cohort (HLS 5866.28). This is the first risk prediction model for knee pain, regardless of underlying structural changes of knee osteoarthritis, developed in the community using a Bayesian modelling approach. The model appears to work well in a community-based population but not in individuals at higher risk for knee osteoarthritis, and it may provide a convenient tool for use in primary care to predict the risk of knee pain in the general population.
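
    The Bayesian quantity reported here, the probability that an odds ratio exceeds 1, is read directly off posterior draws of the corresponding log-odds coefficient. A schematic sketch that assumes MCMC draws are already available; the numbers are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Stand-in for MCMC draws of a log-odds-ratio coefficient (e.g., prior knee
    # injury) from the Bayesian logistic regression; values are invented.
    beta_draws = rng.normal(0.45, 0.20, size=8_000)

    prob_or_gt_1 = np.mean(beta_draws > 0.0)   # P(OR > 1 | data)
    or_ci = np.exp(np.percentile(beta_draws, [2.5, 97.5]))

    print(f"P(OR > 1) = {prob_or_gt_1:.3f}")
    print(f"95% credible interval for OR: [{or_ci[0]:.2f}, {or_ci[1]:.2f}]")
    ```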

  17. Bayesian methods for data analysis

    CERN Document Server

    Carlin, Bradley P.

    2009-01-01

    Contents include: Approaches for Statistical Inference (Introduction; Motivating Vignettes; Defining the Approaches; The Bayes-Frequentist Controversy; Some Basic Bayesian Models); The Bayes Approach (Introduction; Prior Distributions; Bayesian Inference; Hierarchical Modeling; Model Assessment; Nonparametric Methods); Bayesian Computation (Introduction; Asymptotic Methods; Noniterative Monte Carlo Methods; Markov Chain Monte Carlo Methods); Model Criticism and Selection (Bayesian Modeling; Bayesian Robustness; Model Assessment; Bayes Factors via Marginal Density Estimation; Bayes Factors...

  18. Hybrid nested sampling algorithm for Bayesian model selection applied to inverse subsurface flow problems

    KAUST Repository

    Elsheikh, Ahmed H.

    2014-02-01

    A Hybrid Nested Sampling (HNS) algorithm is proposed for efficient Bayesian model calibration and prior model selection. The proposed algorithm combines the Nested Sampling (NS) algorithm, Hybrid Monte Carlo (HMC) sampling and gradient estimation using the Stochastic Ensemble Method (SEM). NS is an efficient sampling algorithm that can be used for Bayesian calibration and for estimating the Bayesian evidence for prior model selection, and it has the advantage of computational feasibility. Within the nested sampling algorithm, a constrained sampling step is performed; for this step, we utilize HMC to reduce the correlation between successive sampled states. HMC relies on the gradient of the logarithm of the posterior distribution, which we estimate using a stochastic ensemble method based on an ensemble of directional derivatives. SEM requires only forward model runs, so the simulator is used as a black box and no adjoint code is needed. The developed HNS algorithm is successfully applied to Bayesian calibration and prior model selection for several nonlinear subsurface flow problems. © 2013 Elsevier Inc.
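
    The core loop of nested sampling is compact. The following minimal Python sketch (a toy 1D Gaussian likelihood under a uniform prior, with plain rejection sampling standing in for the paper's HMC constrained step) estimates the evidence Z:

        import numpy as np

        rng = np.random.default_rng(1)

        def loglike(theta):
            # toy unit-variance Gaussian likelihood centred at 0
            return -0.5 * theta**2 - 0.5 * np.log(2 * np.pi)

        # N live points drawn from a U(-5, 5) prior
        N = 200
        live = rng.uniform(-5, 5, N)
        logL = loglike(live)
        logZ, logw = -np.inf, np.log(1 - np.exp(-1 / N))   # first shell's prior mass

        for _ in range(1000):
            worst = np.argmin(logL)                        # lowest-likelihood live point
            logZ = np.logaddexp(logZ, logw + logL[worst])  # accumulate evidence
            # constrained replacement: resample the prior until the likelihood
            # exceeds the current threshold (the paper uses HMC here; plain
            # rejection suffices for this toy problem)
            while True:
                theta = rng.uniform(-5, 5)
                if loglike(theta) > logL[worst]:
                    break
            live[worst], logL[worst] = theta, loglike(theta)
            logw -= 1 / N                 # prior volume shrinks by a factor e^{-1/N}

        # final live-point contribution omitted for brevity
        print("log-evidence ~", logZ)     # analytic value: log(1/10) ~ -2.303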

  19. Medical Inpatient Journey Modeling and Clustering: A Bayesian Hidden Markov Model Based Approach.

    Science.gov (United States)

    Huang, Zhengxing; Dong, Wei; Wang, Fei; Duan, Huilong

    2015-01-01

    Modeling and clustering medical inpatient journeys is useful to healthcare organizations for a number of reasons, including reorganizing inpatient journeys into a form that is easier to understand and browse. In this study, we present a probabilistic model-based approach to model and cluster medical inpatient journeys. Specifically, we exploit a Bayesian Hidden Markov Model based approach to transform medical inpatient journeys into a probabilistic space, which can be seen as a richer representation of the inpatient journeys to be clustered. Then, using hierarchical clustering on the matrix of similarities, inpatient journeys can be clustered into different categories with respect to their clinical and temporal characteristics. We evaluated the proposed approach on a real clinical data set pertaining to the unstable angina treatment process. The experimental results reveal that our method can identify and model latent treatment topics underlying personalized inpatient journeys, and yield impressive clustering quality.
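
    The clustering stage is straightforward once journeys have a probabilistic representation. A minimal sketch, assuming each journey has already been mapped by the HMM to a probability vector over latent treatment states (the Dirichlet draws below are stand-ins), with Jensen-Shannon distance as one plausible similarity measure:

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import pdist

        rng = np.random.default_rng(2)

        # stand-in for the HMM step: each row is one journey's inferred
        # distribution over 5 latent treatment states (rows sum to 1)
        journeys = rng.dirichlet(np.ones(5), size=60)

        # pairwise distances between journey representations; Jensen-Shannon
        # is one reasonable choice for probability vectors
        dists = pdist(journeys, metric="jensenshannon")
        tree = linkage(dists, method="average")            # agglomerative clustering
        labels = fcluster(tree, t=4, criterion="maxclust")
        print(np.bincount(labels)[1:])                     # journeys per cluster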

  20. Adaptive surrogate modeling for response surface approximations with application to bayesian inference

    KAUST Repository

    Prudhomme, Serge

    2015-09-17

    Parameter estimation for complex models using Bayesian inference is usually a very costly process as it requires a large number of solves of the forward problem. We show here how the construction of adaptive surrogate models using a posteriori error estimates for quantities of interest can significantly reduce the computational cost in problems of statistical inference. As surrogate models provide only approximations of the true solutions of the forward problem, it is nevertheless necessary to control these errors in order to construct an accurate reduced model with respect to the observables utilized in the identification of the model parameters. Effectiveness of the proposed approach is demonstrated on a numerical example dealing with the Spalart–Allmaras model for the simulation of turbulent channel flows. In particular, we illustrate how Bayesian model selection using the adapted surrogate model in place of solving the coupled nonlinear equations leads to the same quality of results while requiring fewer nonlinear PDE solves.
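
    The idea of running inference on a cheap surrogate can be illustrated without a PDE. A minimal sketch with a toy "expensive" forward model, a polynomial surrogate fitted from a handful of solves, and random-walk Metropolis on the surrogate (illustrative only, not the paper's Spalart-Allmaras setup):

        import numpy as np

        rng = np.random.default_rng(3)

        def forward(theta):
            # stand-in "expensive" forward model (imagine a PDE solve)
            return np.sin(theta) + 0.1 * theta**2

        # observed data at the true parameter, with noise
        theta_true, sigma = 1.2, 0.05
        y_obs = forward(theta_true) + rng.normal(0, sigma)

        # build a cheap polynomial surrogate from a handful of forward solves
        design = np.linspace(-2, 3, 12)
        coeffs = np.polyfit(design, forward(design), deg=5)
        surrogate = lambda t: np.polyval(coeffs, t)

        def logpost(t, model):
            # flat prior on [-2, 3]; Gaussian likelihood
            if not -2 <= t <= 3:
                return -np.inf
            return -0.5 * ((y_obs - model(t)) / sigma) ** 2

        # random-walk Metropolis using the surrogate instead of the forward model
        theta, samples = 0.0, []
        for _ in range(20000):
            prop = theta + rng.normal(0, 0.2)
            if np.log(rng.uniform()) < logpost(prop, surrogate) - logpost(theta, surrogate):
                theta = prop
            samples.append(theta)

        print("posterior mean ~", np.mean(samples[2000:]))   # should be near theta_true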

  1. Bayesian Inference on the Memory Parameter for Gamma-Modulated Regression Models

    Directory of Open Access Journals (Sweden)

    Plinio Andrade

    2015-09-01

    In this work, we propose a Bayesian methodology to make inferences about the memory parameter and other characteristics under non-standard assumptions for a class of stochastic processes. This class generalizes the Gamma-modulated process, with trajectories that exhibit long-memory behavior as well as decreasing variability as time increases. Different values of the memory parameter influence the speed of this decrease, making this heteroscedastic model very flexible. Its properties are used to implement an approximate Bayesian computation and MCMC scheme to obtain posterior estimates. We test and validate our method through simulations and real data from the large earthquake that occurred in Chile in 2010.
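
    A plain ABC rejection sampler conveys the flavor of the posterior computation. The sketch below uses a toy process in which variability decays at a rate set by a "memory" parameter (a stand-in, not the paper's Gamma-modulated construction):

        import numpy as np

        rng = np.random.default_rng(4)

        def simulate(d, n=200):
            # toy stand-in: variability decays with time at a rate set by the
            # memory parameter d (not the paper's exact construction)
            t = np.arange(1, n + 1)
            return rng.normal(0.0, t ** (-d / 2))

        d_true = 0.6
        obs = simulate(d_true)
        s_obs = np.std(obs[-50:])          # summary statistic: late-series spread

        # ABC rejection: keep prior draws whose simulated summary lands
        # close to the observed one
        accepted = []
        for _ in range(20000):
            d = rng.uniform(0.0, 1.5)      # prior on the memory parameter
            if abs(np.std(simulate(d)[-50:]) - s_obs) < 0.02:
                accepted.append(d)

        print(len(accepted), "accepted; posterior mean ~", np.mean(accepted))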

  2. [Evaluation of estimation of prevalence ratio using bayesian log-binomial regression model].

    Science.gov (United States)

    Gao, W L; Lin, H; Liu, X N; Ren, X W; Li, J S; Shen, X P; Zhu, S L

    2017-03-10

    To evaluate the estimation of the prevalence ratio (PR) using a Bayesian log-binomial regression model and its application, we estimated the PR relating medical care-seeking prevalence to caregivers' recognition of risk signs of diarrhea in their infants, using a Bayesian log-binomial regression model in the OpenBUGS software. The results showed that caregivers' recognition of an infant's risk signs of diarrhea was significantly associated with a 13% increase in medical care-seeking. We also compared the point and interval estimates of the PR, and the convergence, of three models (model 1: not adjusting for covariates; model 2: adjusting for the duration of caregivers' education; model 3: adjusting for the distance between village and township and for child age in months, in addition to model 2) between the Bayesian log-binomial regression model and the conventional log-binomial regression model. All three Bayesian log-binomial regression models converged, and the estimated PRs were 1.130 (95% CI: 1.005-1.265), 1.128 (95% CI: 1.001-1.264) and 1.132 (95% CI: 1.004-1.267), respectively. Conventional log-binomial regression models 1 and 2 converged, with PRs of 1.130 (95% CI: 1.055-1.206) and 1.126 (95% CI: 1.051-1.203), respectively, but model 3 failed to converge, so the COPY method was used to estimate its PR, which was 1.125 (95% CI: 1.051-1.200). The point and interval estimates of the PRs from the three Bayesian log-binomial regression models differed slightly from those of the conventional log-binomial regression models, but showed good consistency in estimating the PR. Therefore, the Bayesian log-binomial regression model can effectively estimate the PR with fewer convergence problems and offers advantages in application over the conventional log-binomial regression model.
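
    The model itself is easy to sketch outside OpenBUGS. Below is a minimal random-walk Metropolis sampler for a Bayesian log-binomial regression on synthetic data, where the log-binomial constraint exp(Xb) < 1 is enforced by rejection and PR = exp(b1):

        import numpy as np

        rng = np.random.default_rng(5)

        # synthetic data: care-seeking (y) vs. caregiver recognition of risk signs (x)
        n = 800
        x = rng.binomial(1, 0.5, n)
        p_true = np.exp(-0.60 + 0.12 * x)       # log link keeps the PR interpretation
        y = rng.binomial(1, p_true)

        def loglik(b0, b1):
            p = np.exp(b0 + b1 * x)
            if p.max() >= 1:                    # log-binomial constraint: p must stay below 1
                return -np.inf
            return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

        # random-walk Metropolis with vague normal priors (sd = 10)
        beta, chain = np.array([-0.5, 0.0]), []
        for _ in range(30000):
            prop = beta + rng.normal(0, 0.03, 2)
            delta = (loglik(*prop) - loglik(*beta)
                     + np.sum(-0.5 * (prop / 10) ** 2 + 0.5 * (beta / 10) ** 2))
            if np.log(rng.uniform()) < delta:
                beta = prop
            chain.append(beta[1])

        pr = np.exp(np.array(chain[5000:]))
        print("PR ~ %.3f (95%% CrI %.3f-%.3f)" % (pr.mean(), *np.quantile(pr, [.025, .975])))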

  3. Using Bayesian statistics for modeling PTSD through Latent Growth Mixture Modeling: implementation and discussion

    Directory of Open Access Journals (Sweden)

    Sarah Depaoli

    2015-03-01

    Background: After traumatic events, such as disaster, war trauma, and injuries including burns (which is the focus here), the risk of developing posttraumatic stress disorder (PTSD) is approximately 10% (Breslau & Davis, 1992). Latent Growth Mixture Modeling can be used to classify individuals into distinct groups exhibiting different patterns of PTSD (Galatzer-Levy, 2015). Currently, empirical evidence points to four distinct trajectories of PTSD patterns in those who have experienced burn trauma. These trajectories are labeled as: resilient, recovery, chronic, and delayed onset trajectories (e.g., Bonanno, 2004; Bonanno, Brewin, Kaniasty, & Greca, 2010; Maercker, Gäbler, O'Neil, Schützwohl, & Müller, 2013; Pietrzak et al., 2013). The delayed onset trajectory affects only a small group of individuals, that is, about 4–5% (O'Donnell, Elliott, Lau, & Creamer, 2007). In addition to its low frequency, the later onset of this trajectory may contribute to the fact that these individuals can be easily overlooked by professionals. In this special symposium on estimating PTSD trajectories (Van de Schoot, 2015a), we illustrate how to properly identify this small group of individuals through the Bayesian estimation framework, using previous knowledge in the form of priors (see, e.g., Depaoli & Boyajian, 2014; Van de Schoot, Broere, Perryck, Zondervan-Zwijnenburg, & Van Loey, 2015). Method: We used latent growth mixture modeling (LGMM) (Van de Schoot, 2015b) to estimate PTSD trajectories across the 4 years that followed a traumatic burn. We demonstrate and compare results from traditional (maximum likelihood) and Bayesian estimation using priors (see Depaoli, 2012, 2013). Further, we discuss where priors come from and how to define them in the estimation process. Results: We demonstrate that only the Bayesian approach results in the desired theory-driven solution of PTSD trajectories. Since the priors are chosen subjectively, we also present a sensitivity analysis of the

  4. Evidence on a Real Business Cycle Model with Neutral and Investment-Specific Technology Shocks using Bayesian Model Averaging

    NARCIS (Netherlands)

    R.W. Strachan (Rodney); H.K. van Dijk (Herman)

    2010-01-01

    The empirical support for a real business cycle model with two technology shocks is evaluated using a Bayesian model averaging procedure. This procedure makes use of a finite mixture of many models within the class of vector autoregressive (VAR) processes. The linear VAR model is

  5. Bayesian Analysis of Multidimensional Item Response Theory Models: A Discussion and Illustration of Three Response Style Models

    Science.gov (United States)

    Leventhal, Brian C.; Stone, Clement A.

    2018-01-01

    Interest in Bayesian analysis of item response theory (IRT) models has grown tremendously due to the appeal of the paradigm among psychometricians, advantages of these methods when analyzing complex models, and availability of general-purpose software. Possible models include models which reflect multidimensionality due to designed test structure,…

  6. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2010-01-01

    Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundation, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology.New to the Second EditionNew chapter on Bayesian network classifiersNew section on object-oriente

  7. Cost-effectiveness evaluation of quadrivalent influenza vaccines for seasonal influenza prevention: a dynamic modeling study of Canada and the United Kingdom.

    Science.gov (United States)

    Thommes, Edward W; Ismaila, Afisi; Chit, Ayman; Meier, Genevieve; Bauch, Christopher T

    2015-10-27

    The adoption of quadrivalent influenza vaccine (QIV) to replace trivalent influenza vaccine (TIV) in immunization programs is growing worldwide, thus helping to address the problem of influenza B lineage mismatch. However, the price per dose of QIV is higher than that of TIV. In such circumstances, cost-effectiveness analyses provide important and relevant information to inform national health recommendations and implementation decisions. This analysis assessed potential vaccine impacts and the cost-effectiveness of a country-wide switch from TIV to QIV, in Canada and the UK, from a third-party payer perspective. An age-stratified, dynamic four-strain transmission model, which incorporates strain interaction, transmission-rate seasonality and age-specific mixing in the population, was used. Model input data were obtained from published literature and online databases. In Canada, we evaluated a switch from TIV to QIV in the entire population. For the UK, we considered two strategies: children aged 2-17 years who receive the live-attenuated influenza vaccine (LAIV) switch to the quadrivalent formulation (QLAIV), while individuals aged > 18 years switch from TIV to QIV. Two vaccination uptake scenarios in children (UK1 and UK2, which differ in the level of vaccine uptake) were considered. Health and cost outcomes for both vaccination strategies, and the cost-effectiveness of switching from TIV/LAIV to QIV/QLAIV, were estimated from the payer perspective. For Canada and the UK, costs and outcomes were discounted at 5% and 3.5% per year, respectively. Overall, in an average influenza season, our model predicts that a nationwide switch from TIV to QIV would prevent 4.6% of influenza cases, 4.9% of general practitioner (GP) visits, 5.7% each of emergency room (ER) visits and hospitalizations, and 6.8% of deaths in Canada. In the UK (UK1/UK2), implementing QIV would prevent 1.4%/1.8% of influenza cases, 1.6%/2.0% each of GP and ER visits, 1.5%/1.9% of
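
    The headline cost-effectiveness quantity is the incremental cost-effectiveness ratio (ICER). A minimal sketch of the arithmetic with discounting, using hypothetical round numbers rather than the paper's model outputs:

        # minimal ICER arithmetic with discounting (hypothetical inputs,
        # not the paper's model): a QIV-like strategy vs. a TIV-like baseline
        def discounted(stream, rate):
            return sum(v / (1 + rate) ** t for t, v in enumerate(stream))

        years = 10
        rate = 0.05                          # the Canadian discount rate used in the paper
        cost_tiv = [1_000_000] * years       # annual programme + illness costs
        cost_qiv = [1_040_000] * years       # higher vaccine price, fewer cases
        qaly_tiv = [50_000.0] * years
        qaly_qiv = [50_006.0] * years        # small annual QALY gain

        d_cost = discounted(cost_qiv, rate) - discounted(cost_tiv, rate)
        d_qaly = discounted(qaly_qiv, rate) - discounted(qaly_tiv, rate)
        print(f"ICER = {d_cost / d_qaly:,.0f} per QALY gained")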

  8. A comparison of machine learning and Bayesian modelling for molecular serotyping.

    Science.gov (United States)

    Newton, Richard; Wernisch, Lorenz

    2017-08-11

    Streptococcus pneumoniae is a human pathogen that is a major cause of infant mortality. Identifying the pneumococcal serotype is an important step in monitoring the impact of vaccines used to protect against disease. Genomic microarrays provide an effective method for molecular serotyping. Previously we developed an empirical Bayesian model for the classification of serotypes from a molecular serotyping array. With only a few samples available, a model-driven approach was the only option. In the meantime, several thousand samples have been made available to us, providing an opportunity to investigate serotype classification by machine learning methods, which could complement the Bayesian model. We compare the performance of the original Bayesian model with two machine learning algorithms: Gradient Boosting Machines and Random Forests. We present our results as an example of a generic strategy whereby a preliminary probabilistic model is complemented or replaced by a machine learning classifier once enough data are available. Despite the availability of thousands of serotyping arrays, a problem encountered when applying machine learning methods is the lack of training data containing mixtures of serotypes, owing to the large number of possible combinations. Most of the available training data comprises samples with only a single serotype. To overcome the lack of training data we implemented an iterative analysis, creating artificial training data for serotype mixtures by combining raw data from single-serotype arrays. With the enhanced training set, the machine learning algorithms outperform the original Bayesian model. However, for serotypes currently lacking sufficient training data the best-performing implementation was a combination of the results of the Bayesian model and the Gradient Boosting Machine. As well as being an effective method for classifying biological data, machine learning can also be used as an efficient method for revealing subtle biological
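
    The data-augmentation trick, synthesizing mixture training examples from single-serotype arrays, can be sketched as follows (hypothetical probe signatures and a one-classifier-per-serotype Gradient Boosting setup; not the authors' pipeline):

        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier

        rng = np.random.default_rng(6)

        n_probes, serotypes = 40, 4
        # hypothetical single-serotype signatures: each serotype lights up
        # its own sparse subset of probes
        signatures = (rng.uniform(0.6, 1.0, (serotypes, n_probes))
                      * (rng.random((serotypes, n_probes)) < 0.3))

        def array_for(present):
            # simulate raw probe intensities for a set of present serotypes by
            # combining single-serotype signatures (elementwise max) plus noise
            signal = signatures[list(present)].max(axis=0)
            return np.clip(signal + rng.normal(0, 0.05, n_probes), 0, None)

        # build training data: singles plus synthetic two-serotype mixtures
        X, Y = [], []
        for _ in range(2000):
            k = rng.choice([1, 2])
            present = rng.choice(serotypes, size=k, replace=False)
            X.append(array_for(present))
            Y.append([int(s in present) for s in range(serotypes)])
        X, Y = np.array(X), np.array(Y)

        # binary relevance: one GBM per serotype, trained on augmented data
        models = [GradientBoostingClassifier().fit(X, Y[:, s]) for s in range(serotypes)]
        test = array_for({0, 2})
        print([int(m.predict(test[None])[0]) for m in models])   # expect [1, 0, 1, 0]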

  9. Bayesian state space models for dynamic genetic network construction across multiple tissues.

    Science.gov (United States)

    Liang, Yulan; Kelemen, Arpad

    2016-08-01

    Construction of gene-gene interaction networks and potential pathways is a challenging and important problem in genomic research on complex diseases, and estimating the dynamic changes of temporal correlations and non-stationarity is key in this process. In this paper, we develop dynamic state space models with hierarchical Bayesian settings to tackle this challenge and infer the dynamic profiles and genetic networks associated with disease treatments. We treat both the stochastic transition matrix and the observation matrix as time-variant, and include temporal correlation structures in the covariance matrix estimates of the multivariate Bayesian state space models. The unevenly spaced short time courses with unseen time points are treated as hidden state variables. Hierarchical Bayesian approaches with various prior and hyper-prior models, together with Markov chain Monte Carlo and Gibbs sampling algorithms, are used to estimate the model parameters and the hidden state variables. We apply the proposed hierarchical Bayesian state space models to Affymetrix time course data sets from multiple tissues (liver, skeletal muscle, and kidney) following corticosteroid (CS) drug administration. Both simulation and real data analysis results show that the genomic changes over time and the gene-gene interactions in response to CS treatment can be well captured by the proposed models. The proposed dynamic hierarchical Bayesian state space modeling approach could be expanded and applied to other large-scale genomic data, such as next-generation sequencing (NGS) data combined with real-time, time-varying electronic health records (EHR), for more comprehensive and robust systematic, network-based analysis, in order to transform big biomedical data into predictions and diagnostics for precision medicine and personalized healthcare with better decision making and patient outcomes.

  10. Epidemiological and Economic Impact of Monovalent and Pentavalent Rotavirus Vaccines in Low and Middle Income Countries: A Cost-effectiveness Modeling Analysis.

    Science.gov (United States)

    Paternina-Caicedo, Angel; De la Hoz-Restrepo, Fernando; Alvis-Guzmán, Nelson

    2015-07-01

    The competing choice between RV1 and RV5 vaccination, the potential budget impact of differently priced vaccines on the EPI, and new evidence make an updated analysis important for health decision makers in each country. The objective of this study is to assess the cost-effectiveness of the monovalent and pentavalent rotavirus vaccines and their impact on children's deaths, inpatient and outpatient visits in 116 low and middle income countries that represent approximately 99% of rotavirus mortality. A decision tree model followed hypothetical cohorts of children from birth up to 5 years of age for each country in 2010. Inputs were gathered from international databases and previous research on the incidence and effectiveness of the monovalent and pentavalent vaccines. Costs were expressed in 2010 international dollars. Outcomes were reported in terms of cost per disability-adjusted life-year averted, comparing no vaccination with either monovalent or pentavalent mass introduction. Vaccine price was assumed fixed for all low-income and middle-income countries. Around 292,000 deaths, 3.34 million inpatient cases and 23.09 million outpatient cases would occur with no vaccination. In the base-case scenario, monovalent vaccination would prevent 54.7% of inpatient cases and 45.4% of deaths. Pentavalent vaccination would prevent 51.4% of inpatient cases and 41.1% of deaths. The vaccine was cost-effective in all countries in the base-case scenario for both vaccines. The cost per disability-adjusted life-year averted across all selected countries was I$372 for monovalent, and I$453 for pentavalent, vaccination. Rotavirus vaccine is cost-effective in most analyzed countries. Although cost-effectiveness analysis is a useful tool for decision making in middle-income countries, health decision makers in low-income countries should also assess the impact of introducing either vaccine on local resources, together with a budget impact analysis of vaccination.
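
    The summary measure here, cost per DALY averted, reduces to simple arithmetic once the model outputs are in hand. A sketch with hypothetical round numbers in the spirit of the base case:

        # cost per DALY averted with hypothetical round numbers (the paper's
        # country-level inputs are far more detailed)
        deaths_averted  = 132_000    # e.g. ~45% of the ~292,000 deaths with no vaccination
        dalys_per_death = 30         # assumed discounted years of life lost per child death
        programme_cost  = 1.6e9      # hypothetical vaccination programme cost, I$
        treatment_saved = 0.2e9      # averted inpatient/outpatient treatment costs, I$

        dalys_averted = deaths_averted * dalys_per_death
        net_cost = programme_cost - treatment_saved
        print(f"I${net_cost / dalys_averted:,.0f} per DALY averted")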

  11. A Bayesian Performance Prediction Model for Mathematics Education: A Prototypical Approach for Effective Group Composition

    Science.gov (United States)

    Bekele, Rahel; McPherson, Maggie

    2011-01-01

    This research work presents a Bayesian Performance Prediction Model that was created in order to determine the strength of personality traits in predicting the level of mathematics performance of high school students in Addis Ababa. It is an automated tool that can be used to collect information from students for the purpose of effective group…

  12. A Bayesian Beta-Mixture Model for Nonparametric IRT (BBM-IRT)

    Science.gov (United States)

    Arenson, Ethan A.; Karabatsos, George

    2017-01-01

    Item response models typically assume that the item characteristic (step) curves follow a logistic or normal cumulative distribution function, which are strictly monotone functions of person test ability. Such assumptions can be overly-restrictive for real item response data. We propose a simple and more flexible Bayesian nonparametric IRT model…

  13. Joint Bayesian Analysis of Parameters and States in Nonlinear, Non-Gaussian State Space Models

    NARCIS (Netherlands)

    Barra, I.; Hoogerheide, L.F.; Koopman, S.J.; Lucas, A.

    2017-01-01

    We propose a new methodology for designing flexible proposal densities for the joint posterior density of parameters and states in a nonlinear, non-Gaussian state space model. We show that a highly efficient Bayesian procedure emerges when these proposal densities are used in an independent

  14. What Type of Finance Matters for Growth? Bayesian Model Averaging Evidence

    Czech Academy of Sciences Publication Activity Database

    Iftekhar, H.; Horváth, Roman; Mareš, J.

    -, - (2018), ISSN 0258-6770. R&D Projects: GA ČR GA16-09190S. Institutional support: RVO:67985556. Keywords: long-term economic growth * Bayesian model * uncertainty. Subject RIV: AH - Economics. Impact factor: 1.431, year: 2016. http://library.utia.cas.cz/separaty/2017/E/horvath-0466516.pdf

  15. Food Reconstruction Using Isotopic Transferred Signals (FRUITS): A Bayesian Model for Diet Reconstruction

    Czech Academy of Sciences Publication Activity Database

    Fernandes, R.; Millard, A.R.; Brabec, Marek; Nadeau, M.J.; Grootes, P.

    2014-01-01

    Roč. 9, č. 2 (2014), Art. no. e87436. E-ISSN 1932-6203. Institutional support: RVO:67985807. Keywords: ancient diet reconstruction * stable isotope measurements * mixture model * Bayesian estimation * Dirichlet prior. Subject RIV: BB - Applied Statistics, Operational Research. Impact factor: 3.234, year: 2014

  16. Pretense, Counterfactuals, and Bayesian Causal Models: Why What Is Not Real Really Matters

    Science.gov (United States)

    Weisberg, Deena S.; Gopnik, Alison

    2013-01-01

    Young children spend a large portion of their time pretending about non-real situations. Why? We answer this question by using the framework of Bayesian causal models to argue that pretending and counterfactual reasoning engage the same component cognitive abilities: disengaging with current reality, making inferences about an alternative…

  17. Bayesian Uncertainty Quantification for Subsurface Inversion Using a Multiscale Hierarchical Model

    KAUST Repository

    Mondal, Anirban

    2014-07-03

    We consider a Bayesian approach to nonlinear inverse problems in which the unknown quantity is a random field (spatial or temporal). The Bayesian approach contains a natural mechanism for regularization in the form of prior information, can incorporate information from heterogeneous sources, and provides a quantitative assessment of uncertainty in the inverse solution. The Bayesian setting casts the inverse solution as a posterior probability distribution over the model parameters. The Karhunen-Loeve expansion is used for dimension reduction of the random field. Furthermore, we use a hierarchical Bayes model to inject multiscale data into the modeling framework. In this Bayesian framework, we show that this inverse problem is well-posed by proving that the posterior measure is Lipschitz continuous with respect to the data in the total variation norm. Computational challenges in this construction arise from the need for repeated evaluations of the forward model (e.g., in the context of MCMC) and are compounded by the high dimensionality of the posterior. We develop a two-stage reversible-jump MCMC method that has the ability to screen out bad proposals in the first, inexpensive stage. Numerical results are presented by analyzing simulated as well as real data from a hydrocarbon reservoir. This article has supplementary material available online. © 2014 American Statistical Association and the American Society for Quality.

  18. A Bayesian model for predicting face recognition performance using image quality

    NARCIS (Netherlands)

    Dutta, A.; Veldhuis, Raymond N.J.; Spreeuwers, Lieuwe Jan

    2014-01-01

    Quality of a pair of facial images is a strong indicator of the uncertainty in a decision about identity based on that image pair. In this paper, we describe a Bayesian approach to model the relation between image quality (e.g., pose, illumination, noise, sharpness) and corresponding face

  19. Bayesian Model Averaging Employing Fixed and Flexible Priors: The BMS Package for R

    Directory of Open Access Journals (Sweden)

    Stefan Zeugner

    2015-11-01

    This article describes the BMS (Bayesian model sampling) package for R, which implements Bayesian model averaging for linear regression models. The package excels in allowing for a variety of prior structures, among them the "binomial-beta" prior on the model space and the so-called "hyper-g" specifications for Zellner's g prior. The BMS package also allows the user to specify her own model priors and offers the possibility of subjective inference by setting "prior inclusion probabilities" according to the researcher's beliefs. Furthermore, graphical analysis of results is provided by numerous built-in plot functions of posterior densities, predictive densities and graphical illustrations to compare results under different prior settings. Finally, the package provides full enumeration of the model space for small-scale problems as well as two efficient MCMC (Markov chain Monte Carlo) samplers that sort through the model space when the number of potential covariates is large.
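
    In Python terms, the enumeration that BMS performs for small model spaces looks roughly like the sketch below, using the closed-form Bayes factor under Zellner's g prior with a fixed unit-information g (an illustration of the idea, not the package's code):

        import numpy as np
        from itertools import combinations

        rng = np.random.default_rng(7)

        # synthetic regression: only x0 and x2 matter
        n, K = 100, 4
        X = rng.normal(size=(n, K))
        y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(size=n)

        def r2(cols):
            Z = np.column_stack([np.ones(n)] + [X[:, c] for c in cols])
            resid = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
            yc = y - y.mean()
            return 1.0 - (resid @ resid) / (yc @ yc)

        g = float(n)   # unit-information g prior

        def log_bf(cols):
            # Bayes factor against the null (intercept-only) model under
            # Zellner's g prior: (1+g)^((n-1-k)/2) / (1 + g(1-R^2))^((n-1)/2)
            k = len(cols)
            return (0.5 * (n - 1 - k) * np.log1p(g)
                    - 0.5 * (n - 1) * np.log1p(g * (1.0 - r2(cols))))

        models = [c for k in range(K + 1) for c in combinations(range(K), k)]
        logp = np.array([log_bf(m) for m in models])    # uniform prior over models
        post = np.exp(logp - logp.max())
        post /= post.sum()

        # posterior inclusion probabilities, the headline BMA summary
        pip = [sum(p for m, p in zip(models, post) if j in m) for j in range(K)]
        print(np.round(pip, 3))     # variables 0 and 2 should have PIPs near 1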

  20. Climatic Models Ensemble-based Mid-21st Century Runoff Projections: A Bayesian Framework

    Science.gov (United States)

    Achieng, K. O.; Zhu, J.

    2017-12-01

    There are a number of North American Regional Climate Change Assessment Program (NARCCAP) climatic models that have been used to project surface runoff in the mid-21st century. Statistical model selection techniques are often used to select the model that best fits the data; however, such techniques often lead to different conclusions. In this study, ten models are averaged in a Bayesian paradigm to project runoff. Bayesian Model Averaging (BMA) is used to project runoff and to identify the effect of model uncertainty on future runoff projections. Baseflow separation, using a two-parameter recursive digital filter also called the Eckhardt filter, is used to separate USGS streamflow (total runoff) into two components: baseflow and surface runoff. We use this surface runoff as the a priori runoff when conducting BMA of the runoff simulated by the ten RCM models. The primary objective of this study is to evaluate how well RCM multi-model ensembles simulate surface runoff in a Bayesian framework. Specifically, we investigate and discuss the following questions: How well does the ten-model RCM ensemble jointly simulate surface runoff when averaging over all the models using BMA, given the a priori surface runoff? What are the effects of model uncertainty on surface runoff simulation?
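
    The Eckhardt filter itself is a short recursion. A sketch, assuming the commonly cited two-parameter form with filter parameter alpha and maximum baseflow index BFImax:

        import numpy as np

        def eckhardt(Q, alpha=0.98, bfi_max=0.8):
            # two-parameter recursive digital (Eckhardt) baseflow filter:
            # b[t] = ((1 - BFImax)*alpha*b[t-1] + (1 - alpha)*BFImax*Q[t])
            #        / (1 - alpha*BFImax), constrained so that b[t] <= Q[t]
            b = np.empty_like(Q, dtype=float)
            b[0] = Q[0] * bfi_max                    # simple initialisation
            for t in range(1, len(Q)):
                b[t] = ((1 - bfi_max) * alpha * b[t - 1]
                        + (1 - alpha) * bfi_max * Q[t]) / (1 - alpha * bfi_max)
                b[t] = min(b[t], Q[t])
            return b

        Q = np.array([5.0, 20.0, 60.0, 35.0, 18.0, 10.0, 7.0, 6.0])   # toy hydrograph
        base = eckhardt(Q)
        print("surface runoff:", np.round(Q - base, 2))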

  1. Systematic review and economic modelling of the clinical effectiveness and cost-effectiveness of art therapy among people with non-psychotic mental health disorders.

    Science.gov (United States)

    Uttley, Lesley; Scope, Alison; Stevenson, Matt; Rawdin, Andrew; Taylor Buck, Elizabeth; Sutton, Anthea; Stevens, John; Kaltenthaler, Eva; Dent-Brown, Kim; Wood, Chris

    2015-03-01

    One study reported that outcomes were more favourable in the control group. The quality of included RCTs was generally low. In the qualitative review, 12 cohort studies were included (n = 188 service users; n = 16 service providers). Themes relating to the benefits of art therapy for service users included the relationship with the therapist, personal achievement and distraction. Areas of potential harm related to the activation of emotions that were then unresolved, lack of skill of the art therapist and sudden termination of art therapy. The quality of included qualitative studies was generally low to moderate. In the cost-effectiveness review, a de novo model was constructed and populated with data identified from the clinical review. Scenario analyses were conducted comparing group art therapy with wait-list control, group art therapy with group verbal therapy, and individual art therapy with control. Art therapy appeared cost-effective compared with wait-list control with high certainty, although generalisability to the target population was unclear. Verbal therapy appeared more cost-effective than art therapy, but there was considerable uncertainty and a sizeable probability that art therapy was more clinically effective. The cost-effectiveness of individual art therapy was uncertain and dependent on assumptions regarding clinical benefit and duration of benefit. From the limited available evidence, art therapy was associated with positive effects compared with a control in a number of studies of patients with different clinical profiles; it was reported to be an acceptable treatment and was associated with a number of benefits. Art therapy appeared to be cost-effective compared with wait-list, but further studies are needed to confirm this finding, as well as evidence to inform future cost-effectiveness analyses of art therapy versus other treatments. The study is registered as PROSPERO CRD42013003957. The National Institute for

  2. Cost effectiveness of self-monitoring of blood glucose (SMBG) for patients with type 2 diabetes and not on insulin: impact of modelling assumptions on recent Canadian findings.

    Science.gov (United States)

    Tunis, Sandra L

    2011-11-01

    Canadian patients, healthcare providers and payers share interest in assessing the value of self-monitoring of blood glucose (SMBG) for individuals with type 2 diabetes but not on insulin. Using the UKPDS (UK Prospective Diabetes Study) model, the Canadian Optimal Prescribing and Utilization Service (COMPUS) conducted an SMBG cost-effectiveness analysis. Based on the results, COMPUS does not recommend routine strip use for most adults with type 2 diabetes who are not on insulin. Cost-effectiveness studies require many assumptions regarding cohort, clinical effect, complication costs, etc. The COMPUS evaluation included several conservative assumptions that negatively impacted SMBG cost effectiveness. Current objectives were to (i) review key, impactful COMPUS assumptions; (ii) illustrate how alternative inputs can lead to more favourable results for SMBG cost effectiveness; and (iii) provide recommendations for assessing its long-term value. A summary of COMPUS methods and results was followed by a review of assumptions (for trial-based glycosylated haemoglobin [HbA(1c)] effect, patient characteristics, costs, simulation pathway) and their potential impact. The UKPDS model was used for a 40-year cost-effectiveness analysis of SMBG (1.29 strips per day) versus no SMBG in the Canadian payer setting. COMPUS assumptions for patient characteristics (e.g. HbA(1c) 8.4%), SMBG HbA(1c) advantage (-0.25%) and costs were retained. As with the COMPUS analysis, UKPDS HbA(1c) decay curves were incorporated into SMBG and no-SMBG pathways. An important difference was that SMBG HbA(1c) benefits in the current study could extend beyond the initial simulation period. Sensitivity analyses examined SMBG HbA(1c) advantage, adherence, complication history and cost inputs. Outcomes (discounted at 5%) included QALYs, complication rates, total costs (year 2008 values) and incremental cost-effectiveness ratios (ICERs). The base-case ICER was $Can63 664 per QALY gained; approximately 56% of
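
    The sensitivity analyses described here amount to re-running the ICER under alternative assumptions. A toy one-way sensitivity sketch on the assumed HbA1c advantage (a linear stand-in for the UKPDS model; all numbers illustrative):

        # one-way sensitivity of the ICER to the assumed HbA1c advantage of SMBG
        # (a linear toy standing in for the UKPDS simulation; numbers illustrative)
        strip_cost_per_year = 500.0     # hypothetical annual strip cost, $Can
        horizon = 40                    # years, matching the analysis horizon

        def icer(hba1c_advantage, qaly_per_pct_year=0.012):
            # assume QALY gains scale linearly with the HbA1c advantage -- an
            # assumption made only so the sensitivity pattern is visible
            d_cost = strip_cost_per_year * horizon
            d_qaly = hba1c_advantage * qaly_per_pct_year * horizon
            return d_cost / d_qaly

        for adv in (0.15, 0.25, 0.40):
            print(f"HbA1c advantage {adv:.2f}% -> ICER $Can{icer(adv):,.0f}/QALY")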

  3. Iterative Bayesian Model Averaging: a method for the application of survival analysis to high-dimensional microarray data

    Directory of Open Access Journals (Sweden)

    Raftery Adrian E

    2009-02-01

    Background: Microarray technology is increasingly used to identify potential biomarkers for cancer prognostics and diagnostics. Previously, we developed the iterative Bayesian Model Averaging (BMA) algorithm for use in classification. Here, we extend the iterative BMA algorithm for application to survival analysis on high-dimensional microarray data. The main goal in applying survival analysis to microarray data is to determine a highly predictive model of patients' time to event (such as death, relapse, or metastasis) using a small number of selected genes. Our multivariate procedure combines the effectiveness of multiple contending models by calculating the weighted average of their posterior probability distributions. Our results demonstrate that our iterative BMA algorithm for survival analysis achieves high prediction accuracy while consistently selecting a small and cost-effective number of predictor genes. Results: We applied the iterative BMA algorithm to two cancer datasets: breast cancer and diffuse large B-cell lymphoma (DLBCL) data. On the breast cancer data, the algorithm selected a total of 15 predictor genes across 84 contending models from the training data. The maximum likelihood estimates of the selected genes and the posterior probabilities of the selected models from the training data were used to divide patients in the test (or validation) dataset into high- and low-risk categories. Using the genes and models determined from the training data, we assigned patients from the test data into highly distinct risk groups (as indicated by a p-value of 7.26e-05 from the log-rank test). Moreover, we achieved comparable results using only the 5 top selected genes with 100% posterior probabilities. On the DLBCL data, our iterative BMA procedure selected a total of 25 genes across 3 contending models from the training data. Once again, we assigned the patients in the validation set to significantly distinct risk groups (p
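
    The risk-group step combines the contending models by their posterior probabilities. A minimal numpy sketch with hypothetical gene sets, coefficients and model weights:

        import numpy as np

        rng = np.random.default_rng(8)

        # hypothetical setup: 3 contending gene signatures (models), each giving
        # a linear risk score; posterior probabilities come from the BMA step
        n_patients, n_genes = 120, 15
        expr = rng.normal(size=(n_patients, n_genes))
        models = [np.array([0, 3, 7]), np.array([0, 3]), np.array([7, 11, 2, 5])]
        post_prob = np.array([0.6, 0.3, 0.1])      # stand-in posterior model weights
        betas = [rng.normal(size=len(m)) for m in models]

        # BMA risk score: posterior-probability-weighted average over the models
        risk = sum(w * expr[:, m] @ b for w, m, b in zip(post_prob, models, betas))

        # dichotomise at the median to form high- and low-risk groups
        high = risk > np.median(risk)
        print(f"{high.sum()} high-risk / {(~high).sum()} low-risk patients")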

  4. Bayesian model and spatial analysis of oral and oropharynx cancer mortality in Minas Gerais, Brazil.

    Science.gov (United States)

    Fonseca, Emílio Prado da; Oliveira, Cláudia Di Lorenzo; Chiaravalloti, Francisco; Pereira, Antonio Carlos; Vedovello, Silvia Amélia Scudeler; Meneghim, Marcelo de Castro

    2018-01-01

    The objective of this study was to determine the oral and oropharyngeal cancer mortality rate; the results were analyzed by applying an empirical Bayesian spatial analysis model. To this end, we used the information contained in the International Classification of Diseases (ICD-10), Chapter II, Categories C00 to C14, and the Brazilian Mortality Information System (SIM) for Minas Gerais State. Descriptive statistics were computed and the crude mortality rate was calculated for each municipality. Then empirical Bayesian estimators were applied. The results showed that, in 2012, in the state of Minas Gerais, 769 deaths of patients with cancer of the oral cavity and oropharynx were registered, with 607 (78.96%) in men and 162 (21.04%) in women. There was wide variation in the spatial distribution of the crude mortality rate, and clusters in the South, Central and North regions were identified more accurately by the global and local empirical Bayesian estimators. Through Bayesian models it was possible to map the spatial clustering of deaths from oral cancer more accurately, and with the application of spatial epidemiology methods it was possible to obtain more accurate results and provide subsidies to reduce the number of deaths from this type of cancer.
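
    Global empirical Bayesian smoothing of area-level rates is short to write down. A sketch of a Marshall-style global estimator on toy municipalities (one common variant; implementations differ in detail):

        import numpy as np

        rng = np.random.default_rng(9)

        # toy municipalities: deaths and populations of very different sizes
        pop = rng.integers(2_000, 500_000, 50)
        true_rate = 6e-5
        deaths = rng.poisson(true_rate * pop)
        raw = deaths / pop                              # crude mortality rates

        # Marshall-style global empirical Bayes smoothing: shrink the rates of
        # small-population areas toward the overall mean
        m = deaths.sum() / pop.sum()                    # pooled mean rate
        s2 = np.average((raw - m) ** 2, weights=pop)    # between-area variance estimate
        a = max(s2 - m / np.mean(pop), 0)               # signal variance component
        shrink = a / (a + m / pop)                      # weight given to the local rate
        smoothed = m + shrink * (raw - m)

        print("crude rate spread:   ", np.round(raw.std() * 1e5, 2), "per 100k")
        print("smoothed rate spread:", np.round(smoothed.std() * 1e5, 2), "per 100k")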

  5. Equifinality of formal (DREAM) and informal (GLUE) bayesian approaches in hydrologic modeling?

    Energy Technology Data Exchange (ETDEWEB)

    Vrugt, Jasper A [Los Alamos National Laboratory; Robinson, Bruce A [Los Alamos National Laboratory; Ter Braak, Cajo J F [NON LANL; Gupta, Hoshin V [NON LANL

    2008-01-01

    In recent years, a strong debate has emerged in the hydrologic literature regarding what constitutes an appropriate framework for uncertainty estimation. In particular, there is strong disagreement over whether an uncertainty framework should have its roots within a proper statistical (Bayesian) context, or whether such a framework should be based on a different philosophy and implement informal measures and weaker inference to summarize parameter and predictive distributions. In this paper, we compare a formal Bayesian approach using Markov Chain Monte Carlo (MCMC) with generalized likelihood uncertainty estimation (GLUE) for assessing uncertainty in conceptual watershed modeling. Our formal Bayesian approach is implemented using the recently developed differential evolution adaptive metropolis (DREAM) MCMC scheme with a likelihood function that explicitly considers model structural, input and parameter uncertainty. Our results demonstrate that DREAM and GLUE can generate very similar estimates of total streamflow uncertainty. This suggests that formal and informal Bayesian approaches have more common ground than the hydrologic literature and ongoing debate might suggest. The main advantage of formal approaches is, however, that they attempt to disentangle the effect of forcing, parameter and model structural error on total predictive uncertainty. This is key to improving hydrologic theory and to better understanding and predicting the flow of water through catchments.
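
    For contrast with the formal DREAM machinery, the informal GLUE recipe fits in a few lines: Monte Carlo sampling, an informal likelihood such as the Nash-Sutcliffe efficiency, a behavioural threshold, and predictive bounds from the retained runs. A toy sketch:

        import numpy as np

        rng = np.random.default_rng(10)

        t = np.linspace(0, 6, 100)
        obs = 3.0 * np.sin(t) + 5.0 + rng.normal(0, 0.3, 100)   # stand-in "observed" flow

        def model(a, b):
            # toy two-parameter rainfall-runoff stand-in
            return a * np.sin(t) + b

        # GLUE: Monte Carlo sampling, informal likelihood (Nash-Sutcliffe
        # efficiency), behavioural threshold, then predictive bounds
        params = rng.uniform([0, 0], [6, 10], size=(5000, 2))
        sims = np.array([model(a, b) for a, b in params])
        nse = 1 - ((sims - obs) ** 2).sum(axis=1) / ((obs - obs.mean()) ** 2).sum()

        behavioural = nse > 0.7                  # retain only "behavioural" runs
        kept = sims[behavioural]
        band_lo, band_hi = np.percentile(kept, [5, 95], axis=0)
        print(f"{behavioural.sum()} behavioural runs;",
              f"90% band width at t=3: {(band_hi - band_lo)[50]:.2f}")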

  6. Bayesian Network Model with Application to Smart Power Semiconductor Lifetime Data.

    Science.gov (United States)

    Plankensteiner, Kathrin; Bluder, Olivia; Pilz, Jürgen

    2015-09-01

    In this article, Bayesian networks are used to model semiconductor lifetime data obtained from a cyclic stress test system. The data of interest are a mixture of log-normal distributions, representing two dominant physical failure mechanisms. Moreover, the data can be censored due to limited test resources. For a better understanding of the complex lifetime behavior, interactions between test settings, geometric designs, material properties, and physical parameters of the semiconductor device are modeled by a Bayesian network. Statistical toolboxes in MATLAB® have been extended and applied to find the best structure of the Bayesian network and to perform parameter learning. Because of the censored observations, Markov chain Monte Carlo (MCMC) simulations are employed to determine the posterior distributions. For model selection, the automatic relevance determination (ARD) algorithm and goodness-of-fit criteria such as marginal likelihoods, Bayes factors, posterior predictive density distributions, and the sum of squared errors of prediction (SSEP) are applied and evaluated. The results indicate that the application of Bayesian networks to semiconductor reliability provides useful information about the interactions between the significant covariates and serves as a reliable alternative to currently applied methods. © 2015 Society for Risk Analysis.

  7. Maximum likelihood Bayesian model averaging and its predictive analysis for groundwater reactive transport models

    Science.gov (United States)

    Curtis, Gary P.; Lu, Dan; Ye, Ming

    2015-01-01

    While Bayesian model averaging (BMA) has been widely used in groundwater modeling, it is infrequently applied to groundwater reactive transport modeling because of the multiple sources of uncertainty in the coupled hydrogeochemical processes and because of the long execution time of each model run. To resolve these problems, this study analyzed different levels of uncertainty in a hierarchical way and used the maximum likelihood version of BMA, i.e., MLBMA, to improve computational efficiency. This study demonstrates the applicability of MLBMA to groundwater reactive transport modeling in a synthetic case in which twenty-seven reactive transport models were designed to predict the reactive transport of hexavalent uranium (U(VI)) based on observations at a former uranium mill site near Naturita, CO. These reactive transport models contain three uncertain model components, i.e., the parameterization of hydraulic conductivity, the configuration of the model boundary, and the surface complexation reactions that simulate U(VI) adsorption. These uncertain model components were aggregated into the alternative models by integrating a hierarchical structure into MLBMA. The modeling results of the individual models and of MLBMA were analyzed to investigate their predictive performance. The predictive log-score results show that MLBMA generally outperforms the best single model, suggesting that using MLBMA is a sound strategy to achieve more robust model predictions. MLBMA works best when the alternative models are structurally distinct and have diverse model predictions. When correlation in model structure exists, two strategies were used to improve predictive performance: retaining structurally distinct models, or assigning smaller prior model probabilities to correlated models. Since the synthetic models were designed using data from the Naturita site, the results of this study are expected to provide guidance for real-world modeling. Limitations of applying MLBMA to the
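
    The mechanics of MLBMA-style averaging can be sketched with information-criterion weights (BIC below; MLBMA proper uses KIC) and the usual within-model plus between-model variance decomposition, with hypothetical numbers:

        import numpy as np

        # MLBMA-style weights: approximate each model's posterior probability
        # from a calibration criterion and combine the model predictions
        log_lik = np.array([-120.4, -118.9, -125.0])   # hypothetical maximised log-likelihoods
        n_params = np.array([4, 6, 3])
        n_obs = 60
        bic = -2 * log_lik + n_params * np.log(n_obs)

        w = np.exp(-0.5 * (bic - bic.min()))
        w /= w.sum()                      # approximate posterior model probabilities

        preds = np.array([2.1, 2.6, 1.8])   # each model's prediction (say, U(VI) arrival)
        var_in = np.array([0.04, 0.05, 0.03])   # within-model predictive variances

        mean = w @ preds
        # total variance = within-model + between-model components
        var = w @ var_in + w @ (preds - mean) ** 2
        print(f"BMA prediction {mean:.2f} +/- {np.sqrt(var):.2f}; weights {np.round(w, 3)}")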

  8. Careful with Those Priors: A Note on Bayesian Estimation in Two-Parameter Logistic Item Response Theory Models

    Science.gov (United States)

    Marcoulides, Katerina M.

    2018-01-01

    This study examined the use of Bayesian analysis methods for the estimation of item parameters in a two-parameter logistic item response theory model. Using simulated data under various design conditions with both informative and non-informative priors, the parameter recovery of Bayesian analysis methods were examined. Overall results showed that…

  9. A Bayesian estimation of a stochastic predator-prey model of economic fluctuations

    Science.gov (United States)

    Dibeh, Ghassan; Luchinsky, Dmitry G.; Luchinskaya, Daria D.; Smelyanskiy, Vadim N.

    2007-06-01

    In this paper, we develop a Bayesian framework for the empirical estimation of the parameters of one of the best known nonlinear models of the business cycle: The Marx-inspired model of a growth cycle introduced by R. M. Goodwin. The model predicts a series of closed cycles representing the dynamics of labor's share and the employment rate in the capitalist economy. The Bayesian framework is used to empirically estimate a modified Goodwin model. The original model is extended in two ways. First, we allow for exogenous periodic variations of the otherwise steady growth rates of the labor force and productivity per worker. Second, we allow for stochastic variations of those parameters. The resultant modified Goodwin model is a stochastic predator-prey model with periodic forcing. The model is then estimated using a newly developed Bayesian estimation method on data sets representing growth cycles in France and Italy during the years 1960-2005. Results show that inference of the parameters of the stochastic Goodwin model can be achieved. The comparison of the dynamics of the Goodwin model with the inferred values of parameters demonstrates quantitative agreement with the growth cycle empirical data.
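
    A Goodwin-type cycle with periodic forcing and stochastic variation can be simulated with Euler-Maruyama in a few lines. The sketch below uses a simplified textbook form with illustrative parameters, not the paper's estimated specification for France or Italy:

        import numpy as np

        rng = np.random.default_rng(11)

        # Goodwin-type growth cycle: u = labour share, v = employment rate.
        # Simplified form with periodic forcing and multiplicative noise;
        # parameter values are illustrative only.
        T, dt = 100.0, 0.01
        n = int(T / dt)
        u, v = 0.8, 0.9
        traj = np.empty((n, 2))
        for i in range(n):
            t = i * dt
            growth = 0.03 * (1 + 0.3 * np.sin(2 * np.pi * t / 8))   # forced productivity growth
            du = u * (0.5 * v - 0.48) * dt + 0.01 * u * np.sqrt(dt) * rng.normal()
            dv = v * ((1 - u) / 3.0 - growth - 0.02) * dt + 0.01 * v * np.sqrt(dt) * rng.normal()
            u, v = u + du, v + dv
            traj[i] = u, v

        print("labour share range:",
              np.round(traj[:, 0].min(), 3), "-", np.round(traj[:, 0].max(), 3))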

  10. Open Source Bayesian Models. 1. Application to ADME/Tox and Drug Discovery Datasets

    Science.gov (United States)

    2015-01-01

    On the order of hundreds of absorption, distribution, metabolism, excretion, and toxicity (ADME/Tox) models have been described in the literature in the past decade which are more often than not inaccessible to anyone but their authors. Public accessibility is also an issue with computational models for bioactivity, and the ability to share such models still remains a major challenge limiting drug discovery. We describe the creation of a reference implementation of a Bayesian model-building software module, which