Sample records for Bayesian cost-effectiveness models

  1. Bayesian Variable Selection in Cost-Effectiveness Analysis

    Miguel A. Negrín


    Linear regression models are often used to represent the cost and effectiveness of medical treatment. The covariates used may include sociodemographic variables, such as age, gender or race; clinical variables, such as initial health status, years of treatment or the existence of concomitant illnesses; and a binary variable indicating the treatment received. However, most studies estimate only one model, which usually includes all the covariates. This procedure ignores the question of uncertainty in model selection. In this paper, we examine four alternative Bayesian variable selection methods that have been proposed. In this analysis, we estimate the inclusion probability of each covariate in the true model conditional on the data. Variable selection can be useful for estimating incremental effectiveness and incremental cost, through Bayesian model averaging, as well as for subgroup analysis.
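The Bayesian model averaging the abstract mentions can be sketched in a few lines. Everything below is an illustrative assumption, not the paper's actual method: simulated data, made-up covariate names (age, treatment, plus a pure-noise column), and a BIC approximation to the posterior model weights.

```python
# Sketch: posterior inclusion probabilities via BIC-approximate Bayesian model
# averaging over all subsets of candidate covariates (illustrative data only).
import itertools
import math
import numpy as np

rng = np.random.default_rng(0)
n = 200
age = rng.normal(50, 10, n)
treatment = rng.integers(0, 2, n).astype(float)
noise = rng.normal(0, 1, n)
# True cost depends on age and treatment but not on the noise covariate.
cost = 100 + 2.0 * age + 30.0 * treatment + rng.normal(0, 5, n)

covs = {"age": age, "treatment": treatment, "noise": noise}

def bic(names):
    """BIC of an OLS fit of cost on an intercept plus the chosen covariates."""
    X = np.column_stack([np.ones(n)] + [covs[m] for m in names])
    beta, *_ = np.linalg.lstsq(X, cost, rcond=None)
    rss = float(np.sum((cost - X @ beta) ** 2))
    return n * math.log(rss / n) + X.shape[1] * math.log(n)

models = [tuple(s) for r in range(4) for s in itertools.combinations(covs, r)]
# Approximate posterior model weights: w ∝ exp(-BIC/2) under a uniform prior.
b = np.array([bic(m) for m in models])
w = np.exp(-(b - b.min()) / 2)
w /= w.sum()
incl = {v: sum(wi for wi, m in zip(w, models) if v in m) for v in covs}
for v, p in incl.items():
    print(f"P({v} in model | data) ≈ {p:.3f}")
```

With a strong simulated signal, the genuine covariates get inclusion probabilities near one while the noise column does not.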

  2. Bayesian Graphical Models

    Jensen, Finn Verner; Nielsen, Thomas Dyhre


    Mathematically, a Bayesian graphical model is a compact representation of the joint probability distribution for a set of variables. The most frequently used type of Bayesian graphical model is the Bayesian network. The structural part of a Bayesian graphical model is a graph consisting of nodes and ... largely due to the availability of efficient inference algorithms for answering probabilistic queries about the states of the variables in the network. Furthermore, to support the construction of Bayesian network models, learning algorithms are also available. We give an overview of the Bayesian network ...
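A minimal illustration of such a network, with three binary variables, made-up conditional probabilities, and a query answered by brute-force enumeration of the joint distribution:

```python
# Toy Bayesian network: Rain -> Sprinkler, and {Rain, Sprinkler} -> WetGrass.
# All probabilities are illustrative, not from any cited model.
P_rain = 0.2
P_sprinkler = {True: 0.01, False: 0.4}               # P(Sprinkler | Rain)
P_wet = {(True, True): 0.99, (True, False): 0.80,
         (False, True): 0.90, (False, False): 0.0}   # P(Wet | Rain, Sprinkler)

def joint(r, s, w):
    """Joint probability factorised along the network's edges."""
    p = P_rain if r else 1 - P_rain
    p *= P_sprinkler[r] if s else 1 - P_sprinkler[r]
    pw = P_wet[(r, s)]
    return p * (pw if w else 1 - pw)

# Query: P(Rain | WetGrass = true), summing out the sprinkler and normalising.
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(r, s, True) for r in (True, False) for s in (True, False))
print(f"P(Rain | WetGrass) = {num / den:.3f}")
```

Enumeration is exponential in the number of variables; the efficient inference algorithms the abstract refers to exploit the graph structure to avoid this.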

  3. Bayesian default probability models

    Andrlíková, Petra


    This paper proposes a methodology for default probability estimation for low default portfolios, where statistical inference may become troublesome. The author suggests using logistic regression models with Bayesian estimation of parameters. A piecewise logistic regression model and a Box-Cox transformation of the credit risk score are used to derive estimates of the probability of default, extending the work by Neagu et al. (2009). The paper shows that the Bayesian models are more acc...

  4. Cost effectiveness of recycling: A systems model

    Highlights:
    • Curbside collection of recyclables reduces overall system costs over a range of conditions.
    • When avoided costs for recyclables are large, even high collection costs are supported.
    • When avoided costs for recyclables are not great, there are reduced opportunities for savings.
    • For common waste compositions, maximizing curbside recyclables collection always saves money.
    Abstract: Financial analytical models of waste management systems have often found that recycling costs exceed direct benefits, and that in order to economically justify recycling activities, externalities such as household expenses or environmental impacts must be invoked. Certain more empirically based studies have also found that recycling is more expensive than disposal. Other work, both through models and surveys, has found differently. Here we present an empirical systems model, largely drawn from a suburban Long Island municipality. The model accounts for changes in distribution of effort as recycling tonnages displace disposal tonnages, and the seven different cases examined all show that curbside collection programs that manage up to between 31% and 37% of the waste stream should result in overall system savings. These savings accrue partially because of assumed cost differences in tip fees for recyclables and disposed wastes, and also because recycling can result in a more efficient, cost-effective collection program. These results imply that increases in recycling are justifiable on the basis of cost savings alone, not on more difficult-to-measure factors that may not impact program budgets.

  5. Bayesian comparison of cost-effectiveness of different clinical approaches to diagnose coronary artery disease.

    Patterson, R E; Eng, C; Horowitz, S F; Gorlin, R; Goldstein, S R


    The objective of this study was to compare the cost-effectiveness of four clinical policies (policies I to IV) in the diagnosis of the presence or absence of coronary artery disease. A model based on Bayes' theorem and published clinical data was constructed to make these comparisons. Effectiveness was defined as either the number of patients with coronary disease diagnosed or as the number of quality-adjusted life years extended by therapy after the diagnosis of coronary disease. The following conclusions arise strictly from analysis of the model and may not necessarily be applicable to all situations. As prevalence of coronary disease in the population increased, cost per patient tested increased linearly, but cost per effect decreased hyperbolically; that is, cost-effectiveness improved. Thus, cost-effectiveness of all policies (I to IV) was poor in populations with a prevalence of disease below 10%, for example, asymptomatic people with no risk factors. Analysis of the model also indicates that at prevalences less than 80%, exercise thallium scintigraphy alone as a first test (policy II) is a more cost-effective initial test than exercise electrocardiography alone as a first test (policy I) or exercise electrocardiography first combined with thallium imaging as a second test (policy IV). Exercise electrocardiography before thallium imaging (policy IV) is more cost-effective than exercise electrocardiography alone (policy I) at prevalences less than 80%. Noninvasive exercise testing before angiography (policies I, II and IV) is more cost-effective than using coronary angiography as the first and only test (policy III) at prevalences less than 80%. Above a threshold prevalence of 80% (for example, patients with typical angina), proceeding to angiography as the first test (policy III) was more cost-effective than initial noninvasive exercise tests (policies I, II and IV). One advantage of this quantitative model is that it estimates a...
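The Bayes'-theorem mechanics behind these prevalence effects can be sketched directly. Sensitivity, specificity, and the per-test cost below are illustrative numbers, not the study's values:

```python
# Sketch: post-test probability (via Bayes' theorem) and cost per case detected
# as functions of disease prevalence. All test characteristics are illustrative.
def cost_per_diagnosis(prevalence, sens=0.85, spec=0.85, cost_per_test=200.0):
    p_pos = sens * prevalence + (1 - spec) * (1 - prevalence)  # P(test positive)
    ppv = sens * prevalence / p_pos                            # Bayes' theorem
    true_pos_per_patient = sens * prevalence                   # cases found per patient tested
    return ppv, cost_per_test / true_pos_per_patient

for prev in (0.05, 0.50, 0.90):
    ppv, cpd = cost_per_diagnosis(prev)
    print(f"prevalence {prev:.0%}: PPV {ppv:.2f}, cost per diagnosis ${cpd:,.0f}")
```

Cost per patient tested is constant here, but cost per case detected falls hyperbolically as prevalence rises, which is the qualitative behaviour the abstract describes.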

  6. Applied Bayesian modelling

    Congdon, Peter


    This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBU

  7. A Layered Decision Model for Cost-Effective System Security

    Wei, Huaqiang; Alves-Foss, James; Soule, Terry; Pforsich, Hugh; Zhang, Du; Frincke, Deborah A.


    System security involves decisions in at least three areas: identification of well-defined security policies, selection of cost-effective defence strategies, and implementation of real-time defence tactics. Although choices made in each of these areas affect the others, existing decision models typically handle these three decision areas in isolation. There is no comprehensive tool that can integrate them to provide a single efficient model for safeguarding a network. In addition, there is no clear way to determine which particular combinations of defence decisions result in cost-effective solutions. To address these problems, this paper introduces a Layered Decision Model (LDM) for use in deciding how to address defence decisions based on their cost-effectiveness. To validate the LDM and illustrate how it is used, we used simulation to test model rationality and applied the LDM to the design of system security for an e-commerce business case.

  8. Modeling and Cost-Effectiveness in HIV Prevention.

    Jacobsen, Margo M; Walensky, Rochelle P


    With HIV funding plateauing and the number of people living with HIV increasing due to the rollout of life-saving antiretroviral therapy, policy makers are faced with increasingly tighter budgets to manage the ongoing HIV epidemic. Cost-effectiveness and modeling analyses can help determine which HIV interventions may be of best value. Incidence remains remarkably high in certain populations and countries, making prevention key to controlling the spread of HIV. This paper briefly reviews concepts in modeling and cost-effectiveness methodology and then examines results of recently published cost-effectiveness analyses on the following HIV prevention strategies: condoms and circumcision, behavioral- or community-based interventions, prevention of mother-to-child transmission, HIV testing, pre-exposure prophylaxis, and treatment as prevention. We find that the majority of published studies demonstrate cost-effectiveness; however, not all interventions are affordable. We urge continued research on combination strategies and methodologies that take into account willingness to pay and budgetary impact. PMID:26830283
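The incremental cost-effectiveness ratio (ICER) at the heart of such analyses is simple to compute. The costs, QALY figures, and willingness-to-pay threshold below are made-up numbers for illustration:

```python
# Sketch: ICER of a hypothetical prevention strategy vs. a comparator.
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost per quality-adjusted life year gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

value = icer(cost_new=12_000, qaly_new=6.1, cost_old=9_000, qaly_old=5.9)
print(f"ICER = ${value:,.0f} per QALY gained")   # $3,000 extra cost / 0.2 QALYs

threshold = 50_000  # assumed willingness-to-pay per QALY
print("cost-effective" if value < threshold else "not cost-effective")
```

As the abstract notes, an intervention can be cost-effective by this criterion yet still unaffordable, since the ICER says nothing about total budget impact.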

  9. Bayesian Model Averaging for Propensity Score Analysis

    Kaplan, David; Chen, Jianshen


    The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…

  10. Bayesian modeling using WinBUGS

    Ntzoufras, Ioannis


    A hands-on introduction to the principles of Bayesian modeling using WinBUGS Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author provides an accessible treatment of the topic, offering readers a smooth introduction to the principles of Bayesian modeling with detailed guidance on the practical implementation of key principles. The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including: Markov Chain Monte Carlo algorithms in Bayesian inference Generalized linear models Bayesian hierarchical models Predictive distribution and model checking Bayesian model and variable evaluation Computational notes and screen captures illustrate the use of both WinBUGS as well as R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts and all ...
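WinBUGS automates the Markov chain Monte Carlo that such models require; the core idea can be sketched by hand. Below is a toy random-walk Metropolis sampler (not WinBUGS itself) for the mean of normally distributed data under an assumed N(0, 10^2) prior, with simulated data:

```python
# Sketch: random-walk Metropolis sampling of a posterior, the kind of
# computation WinBUGS performs automatically. Model and data are illustrative.
import math
import random

random.seed(42)
data = [random.gauss(3.0, 1.0) for _ in range(50)]  # simulated observations

def log_post(mu):
    """Log posterior (up to a constant): N(0, 10^2) prior, N(mu, 1) likelihood."""
    log_prior = -mu * mu / (2 * 10.0 ** 2)
    log_like = -sum((x - mu) ** 2 for x in data) / 2
    return log_prior + log_like

mu, chain = 0.0, []
for _ in range(5000):
    proposal = mu + random.gauss(0, 0.5)
    # Accept with probability min(1, posterior ratio).
    if math.log(random.random()) < log_post(proposal) - log_post(mu):
        mu = proposal
    chain.append(mu)

burn = chain[1000:]  # discard burn-in
print(f"posterior mean of mu ≈ {sum(burn) / len(burn):.2f}")
```

In WinBUGS the same model would be declared, not coded: the sampler is derived from the model specification.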

  11. A Cost-Effectiveness Analysis Model for Evaluating and Planning Secondary Vocational Programs

    Kim, Jin Eun


    This paper conceptualizes a cost-effectiveness analysis and describes a cost-effectiveness analysis model for secondary vocational programs. It generates three kinds of cost-effectiveness measures: program effectiveness, cost efficiency, and cost-effectiveness and/or performance ratio. (Author)

  12. Cost effectiveness of the 1993 Model Energy Code in Colorado

    Lucas, R.G.


    This report documents an analysis of the cost effectiveness of the Council of American Building Officials' 1993 Model Energy Code (MEC) building thermal-envelope requirements for single-family homes in Colorado. The goal of this analysis was to compare the cost effectiveness of the 1993 MEC to current construction practice in Colorado based on an objective methodology that determined the total life-cycle cost associated with complying with the 1993 MEC. This analysis was performed for the range of Colorado climates. The costs and benefits of complying with the 1993 MEC were estimated from the consumer's perspective. The time when the homeowner realizes net cash savings (net positive cash flow) for homes built in accordance with the 1993 MEC was estimated to vary from 0.9 year in Steamboat Springs to 2.4 years in Denver. Compliance with the 1993 MEC was estimated to increase first costs by $1190 to $2274, resulting in an incremental down payment increase of $119 to $227 (at 10% down). The net present value of all costs and benefits to the home buyer, accounting for the mortgage and taxes, varied from a savings of $1772 in Springfield to a savings of $6614 in Steamboat Springs. The ratio of benefits to costs ranged from 2.3 in Denver to 3.8 in Steamboat Springs.
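The cash-flow logic behind such estimates (the first-cost increase is financed in the mortgage, and annual energy savings offset the added payment) can be sketched as follows. All dollar figures and rates are illustrative, not taken from the report:

```python
# Sketch: years until an energy-code upgrade reaches net positive cash flow.
# First-cost increase, savings, mortgage rate, and term are illustrative.
def years_to_positive_cash_flow(first_cost, annual_savings,
                                down_frac=0.10, rate=0.07, term=30):
    down = first_cost * down_frac
    loan = first_cost - down
    m = rate / 12
    # Standard amortised mortgage payment, annualised.
    annual_payment = 12 * loan * m / (1 - (1 + m) ** (-12 * term))
    net_annual = annual_savings - annual_payment
    if net_annual <= 0:
        return float("inf")  # savings never cover the added payment
    return down / net_annual  # years until savings repay the down payment

print(f"{years_to_positive_cash_flow(2000, 300):.1f} years")
```

Because most of the first cost is mortgaged, the homeowner only needs the energy savings to exceed the incremental payment, which is why the report's break-even times are so short.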

  13. Cost effectiveness of the 1995 model energy code in Massachusetts

    Lucas, R.G.


    This report documents an analysis of the cost effectiveness of the Council of American Building Officials' 1995 Model Energy Code (MEC) building thermal-envelope requirements for single-family houses and multifamily housing units in Massachusetts. The goal was to compare the cost effectiveness of the 1995 MEC to the energy conservation requirements of the Massachusetts State Building Code, based on a comparison of the costs and benefits associated with complying with each. This comparison was performed for three cities representing three geographical regions of Massachusetts: Boston, Worcester, and Pittsfield. The analysis was done for two different scenarios: a "move-up" home buyer purchasing a single-family house and a "first-time" financially limited home buyer purchasing a multifamily condominium unit. Natural gas, oil, and electric resistance heating were examined. The Massachusetts state code has much more stringent requirements if electric resistance heating is used rather than other heating fuels and/or equipment types. The MEC requirements do not vary by fuel type. For single-family homes, the 1995 MEC has requirements that are more energy-efficient than the non-electric-resistance requirements of the current state code. For multifamily housing, the 1995 MEC has requirements approximately equal in energy efficiency to the non-electric-resistance requirements of the current state code. The 1995 MEC is generally not more stringent than the electric resistance requirements of the state code; in fact, for multifamily buildings the 1995 MEC is much less stringent.

  14. Bayesian kinematic earthquake source models

    Minson, S. E.; Simons, M.; Beck, J. L.; Genrich, J. F.; Galetzka, J. E.; Chowdhury, F.; Owen, S. E.; Webb, F.; Comte, D.; Glass, B.; Leiva, C.; Ortega, F. H.


    Most coseismic, postseismic, and interseismic slip models are based on highly regularized optimizations which yield one solution which satisfies the data given a particular set of regularizing constraints. This regularization hampers our ability to answer basic questions such as whether seismic and aseismic slip overlap or instead rupture separate portions of the fault zone. We present a Bayesian methodology for generating kinematic earthquake source models with a focus on large subduction zone earthquakes. Unlike classical optimization approaches, Bayesian techniques sample the ensemble of all acceptable models presented as an a posteriori probability density function (PDF), and thus we can explore the entire solution space to determine, for example, which model parameters are well determined and which are not, or what is the likelihood that two slip distributions overlap in space. Bayesian sampling also has the advantage that all a priori knowledge of the source process can be used to mold the a posteriori ensemble of models. Although very powerful, Bayesian methods have up to now been of limited use in geophysical modeling because they are only computationally feasible for problems with a small number of free parameters due to what is called the "curse of dimensionality." However, our methodology can successfully sample solution spaces of many hundreds of parameters, which is sufficient to produce finite fault kinematic earthquake models. Our algorithm is a modification of the tempered Markov chain Monte Carlo (tempered MCMC or TMCMC) method. In our algorithm, we sample a "tempered" a posteriori PDF using many MCMC simulations running in parallel and evolutionary computation in which models which fit the data poorly are preferentially eliminated in favor of models which better predict the data. We present results for both synthetic test problems as well as for the 2007 Mw 7.8 Tocopilla, Chile earthquake, the latter of which is constrained by InSAR, local high

  15. A Bayesian Nonparametric IRT Model

    Karabatsos, George


    This paper introduces a flexible Bayesian nonparametric Item Response Theory (IRT) model, which applies to dichotomous or polytomous item responses, and which can apply to either unidimensional or multidimensional scaling. This is an infinite-mixture IRT model, with person ability and item difficulty parameters, and with a random intercept parameter that is assigned a mixing distribution, with mixing weights a probit function of other person and item parameters. As a result of its flexibility...

  16. Bayesian Stable Isotope Mixing Models

    Parnell, Andrew C.; Phillips, Donald L.; Bearhop, Stuart; Semmens, Brice X.; Ward, Eric J.; Moore, Jonathan W.; Andrew L Jackson; Inger, Richard


    In this paper we review recent advances in Stable Isotope Mixing Models (SIMMs) and place them into an over-arching Bayesian statistical framework which allows for several useful extensions. SIMMs are used to quantify the proportional contributions of various sources to a mixture. The most widely used application is quantifying the diet of organisms based on the food sources they have been observed to consume. At the centre of the multivariate statistical model we propose is a compositional m...

  17. Bayesian variable order Markov models: Towards Bayesian predictive state representations

    C. Dimitrakakis


    We present a Bayesian variable order Markov model that shares many similarities with predictive state representations. The resulting models are compact and much easier to specify and learn than classical predictive state representations. Moreover, we show that they significantly outperform a more st

  18. Nonparametric Bayesian Modeling of Complex Networks

    Schmidt, Mikkel Nørgaard; Mørup, Morten


    Modeling structure in complex networks using Bayesian nonparametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data. This article provides a gentle introduction to nonparametric Bayesian modeling of complex networks: Using...... for complex networks can be derived and point out relevant literature....

  19. Involving Stakeholders in Building Integrated Fisheries Models Using Bayesian Methods

    Haapasaari, Päivi; Mäntyniemi, Samu; Kuikka, Sakari


    A participatory Bayesian approach was used to investigate how the views of stakeholders could be utilized to develop models to help understand the Central Baltic herring fishery. In task one, we applied the Bayesian belief network methodology to elicit the causal assumptions of six stakeholders on factors that influence natural mortality, growth, and egg survival of the herring stock in probabilistic terms. We also integrated the expressed views into a meta-model using the Bayesian model averaging (BMA) method. In task two, we used influence diagrams to study qualitatively how the stakeholders frame the management problem of the herring fishery and elucidate what kind of causalities the different views involve. The paper combines these two tasks to assess the suitability of the methodological choices to participatory modeling in terms of both a modeling tool and participation mode. The paper also assesses the potential of the study to contribute to the development of participatory modeling practices. It is concluded that the subjective perspective to knowledge, that is fundamental in Bayesian theory, suits participatory modeling better than a positivist paradigm that seeks the objective truth. The methodology provides a flexible tool that can be adapted to different kinds of needs and challenges of participatory modeling. The ability of the approach to deal with small data sets makes it cost-effective in participatory contexts. However, the BMA methodology used in modeling the biological uncertainties is so complex that it needs further development before it can be introduced to wider use in participatory contexts.

  20. A Bayesian approach to model uncertainty

    A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given
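For a finite set of alternative models, this reduction to parameter uncertainty amounts to a discrete posterior over the models: prior times marginal likelihood, normalised. The priors and marginal likelihoods below are made-up numbers:

```python
# Sketch: posterior probabilities over a finite set of candidate models.
# Priors and marginal likelihoods are illustrative.
priors = {"M1": 0.5, "M2": 0.3, "M3": 0.2}
marginal_lik = {"M1": 0.002, "M2": 0.010, "M3": 0.001}  # P(data | model)

unnorm = {m: priors[m] * marginal_lik[m] for m in priors}
z = sum(unnorm.values())
posterior = {m: p / z for m, p in unnorm.items()}
for m, p in posterior.items():
    print(f"P({m} | data) = {p:.3f}")
```

Model-averaged predictions then weight each model's prediction by these posterior probabilities.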

  1. Computational methods for Bayesian model choice

    Robert, Christian P.; Wraith, Darren


    In this note, we briefly survey some recent approaches to the approximation of the Bayes factor used in Bayesian hypothesis testing and Bayesian model choice. In particular, we reassess importance sampling, harmonic mean sampling, and nested sampling from a unified perspective.
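Of the estimators such notes reassess, plain Monte Carlo sampling from the prior (the simplest form of importance sampling) is the easiest to sketch. The model and data below are illustrative: 7 successes in 10 Bernoulli trials, comparing a point hypothesis theta = 0.5 against theta ~ Uniform(0, 1):

```python
# Sketch: a Bayes factor via prior-sampling estimation of a marginal likelihood.
# Model and data are illustrative.
import random

random.seed(0)
s, f = 7, 3  # successes and failures for one observed sequence

def likelihood(theta):
    return theta ** s * (1 - theta) ** f

m1 = 0.5 ** (s + f)                           # H1: theta = 0.5, exact
N = 100_000
m2 = sum(likelihood(random.random()) for _ in range(N)) / N  # H2: theta ~ U(0,1)
print(f"Bayes factor BF12 ≈ {m1 / m2:.2f}")   # exact value is 0.5^10 / (7!3!/11!) ≈ 1.29
```

Prior sampling is stable here because the likelihood is bounded; the harmonic mean estimator, by contrast, has infinite variance in many such settings, which is one reason these approximations get reassessed.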

  2. Bayesian Variable Selection in Spatial Autoregressive Models

    Jesus Crespo Cuaresma; Philipp Piribauer


    This paper compares the performance of Bayesian variable selection approaches for spatial autoregressive models. We present two alternative approaches which can be implemented using Gibbs sampling methods in a straightforward way and allow us to deal with the problem of model uncertainty in spatial autoregressive models in a flexible and computationally efficient way. In a simulation study we show that the variable selection approaches tend to outperform existing Bayesian model averaging tech...

  3. Bayesian Models of Brain and Behaviour

    Penny, William


    This paper presents a review of Bayesian models of brain and behaviour. We first review the basic principles of Bayesian inference. This is followed by descriptions of sampling and variational methods for approximate inference, and forward and backward recursions in time for inference in dynamical models. The review of behavioural models covers work in visual processing, sensory integration, sensorimotor integration, and collective decision making. The review of brain models covers a range of...

  4. Bayesian models a statistical primer for ecologists

    Hobbs, N Thompson


    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods-in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probabili

  5. Risk-Based Analysis of Drilling Waste Handling Operations. Bayesian Network, Cost-effectiveness, and Operational Conditions

    Ayele, Yonas Zewdu


    The papers of this thesis are not available in Munin. Paper I. Ayele YZ, Barabadi A, Barabady J.: A methodology for identification of a suitable drilling waste handling system in the Arctic region. (Manuscript). Paper II. Ayele YZ, Barabady J, Droguett EL.: Dynamic Bayesian network based risk assessment for Arctic offshore drilling waste handling practices. (Manuscript). Published version available in Journal of Offshore Mechanics and Arctic Engineering 138(5), 051302 (Jun 17, 2016) ...

  6. Bayesian Analysis of Multivariate Probit Models

    Siddhartha Chib; Edward Greenberg


    This paper provides a unified simulation-based Bayesian and non-Bayesian analysis of correlated binary data using the multivariate probit model. The posterior distribution is simulated by Markov chain Monte Carlo methods, and maximum likelihood estimates are obtained by a Markov chain Monte Carlo version of the E-M algorithm. Computation of Bayes factors from the simulation output is also considered. The methods are applied to a bivariate data set, to a 534-subject, four-year longitudinal dat...

  7. Bayesian Network Models for Adaptive Testing

    Plajner, Martin; Vomlel, Jiří

    Aachen: Sun SITE Central Europe, 2016 - (Agosta, J.; Carvalho, R.), pp. 24-33. (CEUR Workshop Proceedings. Vol. 1565). ISSN 1613-0073. [The Twelfth UAI Bayesian Modeling Applications Workshop (BMAW 2015). Amsterdam (NL), 16.07.2015] R&D Projects: GA ČR GA13-20012S Institutional support: RVO:67985556 Keywords: Bayesian networks * Computerized adaptive testing Subject RIV: JD - Computer Applications, Robotics

  8. On Bayesian Nonparametric Continuous Time Series Models

    Karabatsos, George; Walker, Stephen G.


    This paper is a note on the use of Bayesian nonparametric mixture models for continuous time series. We identify a key requirement for such models, and then establish that there is a single type of model which meets this requirement. As it turns out, the model is well known in multiple change-point problems.

  9. Bayesian semiparametric dynamic Nelson-Siegel model

    C. Cakmakli


    This paper proposes the Bayesian semiparametric dynamic Nelson-Siegel model where the density of the yield curve factors and thereby the density of the yields are estimated along with other model parameters. This is accomplished by modeling the error distributions of the factors according to a Diric

  10. Bayesian calibration of car-following models

    Van Hinsbergen, C.P.IJ.; Van Lint, H.W.C.; Hoogendoorn, S.P.; Van Zuylen, H.J.


    Recent research has revealed that there exist large inter-driver differences in car-following behavior such that different car-following models may apply to different drivers. This study applies Bayesian techniques to the calibration of car-following models, where prior distributions on each model p

  11. Cost Effective Community Based Dementia Screening: A Markov Model Simulation

    Erin Saito


    Background. Given the dementia epidemic and the increasing cost of healthcare, there is a need to assess the economic benefit of community based dementia screening programs. Materials and Methods. Markov model simulations were generated using data obtained from a community based dementia screening program over a one-year period. The models simulated yearly costs of caring for patients based on clinical transitions beginning in predementia and extending for 10 years. Results. A total of 93 individuals (74 female, 19 male) were screened for dementia, and 12 meeting clinical criteria for either mild cognitive impairment (n=7) or dementia (n=5) were identified. Assuming early therapeutic intervention beginning during the year of dementia detection, Markov model simulations demonstrated a 9.8% reduction in the cost of dementia care over a ten-year simulation period, primarily through increased duration in mild stages and reduced time in the more costly moderate and severe stages. Discussion. Community based dementia screening can reduce healthcare costs associated with caring for demented individuals through earlier detection and treatment, resulting in proportionately reduced time in more costly advanced stages.
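The structure of such a Markov cost model can be sketched as follows. The states, transition probabilities, and annual costs are illustrative, not the study's calibrated values:

```python
# Sketch: a cohort Markov model of dementia progression with annual care costs.
# Treatment is assumed to raise the probability of remaining in the mild state.
states = ["mild", "moderate", "severe"]
cost = {"mild": 10_000, "moderate": 30_000, "severe": 60_000}  # annual, illustrative
P = {"mild":     {"mild": 0.80, "moderate": 0.20, "severe": 0.00},
     "moderate": {"mild": 0.00, "moderate": 0.70, "severe": 0.30},
     "severe":   {"mild": 0.00, "moderate": 0.00, "severe": 1.00}}  # absorbing

def ten_year_cost(p_stay_mild):
    """Expected 10-year cost for a cohort that starts in the mild state."""
    P["mild"]["mild"], P["mild"]["moderate"] = p_stay_mild, 1 - p_stay_mild
    dist = {"mild": 1.0, "moderate": 0.0, "severe": 0.0}
    total = 0.0
    for _ in range(10):
        total += sum(dist[s] * cost[s] for s in states)  # cost this cycle
        dist = {t: sum(dist[s] * P[s][t] for s in states) for t in states}
    return total

untreated = ten_year_cost(0.80)   # natural progression
treated = ten_year_cost(0.90)     # slower progression under treatment (assumed)
print(f"savings from slower progression: {1 - treated / untreated:.1%}")
```

The savings arise, as in the study, purely from keeping more of the cohort in the cheaper mild state for longer.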

  12. Cost Effective System Modeling of Active Micro- Module Solar Tracker

    Md. Faisal Shuvo


    Increasing interest in renewable energy ranges from solar thermal energy and solar photovoltaic systems to the micro production of electricity. Solar tracking has usually been considered only in large-scale applications such as power plants and satellites; most small-scale applications have no solar tracker system, mainly because of its high cost and complex circuit design. From that perspective, this paper discusses a microcontroller-based one-dimensional active micro-module solar tracking system, in which an inexpensive LDR is used to generate a reference voltage for the microcontroller operating the tracking system. The system responds quickly to parameters such as changes in light intensity as well as temperature variations. This micro-module tracking system can be used for small-scale applications such as portable electronic devices and moving vehicles.

  13. Bayesian Semiparametric Modeling of Realized Covariance Matrices

    Jin, Xin; John M Maheu


    This paper introduces several new Bayesian nonparametric models suitable for capturing the unknown conditional distribution of realized covariance (RCOV) matrices. Existing dynamic Wishart models are extended to countably infinite mixture models of Wishart and inverse-Wishart distributions. In addition to mixture models with constant weights we propose models with time-varying weights to capture time dependence in the unknown distribution. Each of our models can be combined with returns...

  14. Complex Bayesian models: construction, and sampling strategies

    Huston, Carolyn Marie


    Bayesian models are useful tools for realistically modeling processes occurring in the real world. In particular, we consider models for spatio-temporal data where the response vector is compositional, ie. has components that sum-to-one. A unique multivariate conditional hierarchical model (MVCAR) is proposed. Statistical methods for MVCAR models are well developed and we extend these tools for use with a discrete compositional response. We harness the advantages of an MVCAR model when the re...

  15. Bayesian Approach to Neuro-Rough Models for Modelling HIV

    Marwala, Tshilidzi


    This paper proposes a new neuro-rough model for modelling the risk of HIV from demographic data. The model is formulated using Bayesian framework and trained using Markov Chain Monte Carlo method and Metropolis criterion. When the model was tested to estimate the risk of HIV infection given the demographic data it was found to give the accuracy of 62% as opposed to 58% obtained from a Bayesian formulated rough set model trained using Markov chain Monte Carlo method and 62% obtained from a Bayesian formulated multi-layered perceptron (MLP) model trained using hybrid Monte. The proposed model is able to combine the accuracy of the Bayesian MLP model and the transparency of Bayesian rough set model.

  16. Survey of Bayesian Models for Modelling of Stochastic Temporal Processes

    Ng, B


    This survey gives an overview of popular generative models used in the modeling of stochastic temporal systems. In particular, this survey is organized into two parts. The first part discusses the discrete-time representations of dynamic Bayesian networks and dynamic relational probabilistic models, while the second part discusses the continuous-time representation of continuous-time Bayesian networks.

  17. Novel anticoagulants for stroke prevention in atrial fibrillation: a systematic review of cost-effectiveness models.

    Brendan L Limone

    OBJECTIVE: To conduct a systematic review of economic models of newer anticoagulants for stroke prevention in atrial fibrillation (SPAF). PATIENTS AND METHODS: We searched Medline, Embase, NHSEED and HTA databases and the Tufts Registry from January 1, 2008 through October 10, 2012 to identify economic (Markov or discrete event simulation) models of newer agents for SPAF. RESULTS: Eighteen models were identified. Each was based on a lone randomized trial/new agent, and these trials were clinically and methodologically heterogeneous. Dabigatran 150 mg, 110 mg and sequentially-dosed dabigatran were assessed in 9, 8, and 9 models, rivaroxaban in 4 and apixaban in 4. Warfarin was a first-line comparator in 94% of models. Models were conducted from United States (44%), European (39%) and Canadian (17%) perspectives. Models typically assumed patients between 65-73 years old at moderate risk of stroke initiated anticoagulation for/near a lifetime. All models reported cost/quality-adjusted life-year; 22% reported using a societal perspective, but none included indirect costs. Four models reported an incremental cost-effectiveness ratio (ICER) for a newer anticoagulant (dabigatran 110 mg (n=4), 150 mg (n=2); rivaroxaban (n=1)) vs. warfarin above commonly reported willingness-to-pay thresholds. ICERs vs. warfarin ranged from $3,547-$86,000 for dabigatran 150 mg, $20,713-$150,000 for dabigatran 110 mg, $4,084-$21,466 for sequentially-dosed dabigatran and $23,065-$57,470 for rivaroxaban. Apixaban was found economically dominant to aspirin, and dominant or cost-effective ($11,400-$25,059) vs. warfarin. Indirect comparisons from 3 models suggested conflicting comparative cost-effectiveness results. CONCLUSIONS: Cost-effectiveness models frequently found newer anticoagulants cost-effective, but the lack of head-to-head trials and the heterogeneous characteristics of underlying trials and modeling methods make it difficult to determine the most cost-effective agent.

  18. Modeling the cost effectiveness of injury interventions in lower and middle income countries: opportunities and challenges

    Hyder Adnan A


    Full Text Available Abstract Background This paper estimates the cost-effectiveness of five interventions that could counter injuries in lower and middle income countries (LMICs): better traffic enforcement, erecting speed bumps, promoting helmets for bicyclists, promoting helmets for motorcyclists, and storing kerosene in child-proof containers. Methods We adopt an ingredients-based approach to model what each intervention would cost in 6 world regions over a 10-year period, discounted at both 3% and 6%, from both the governmental and societal perspectives. Costs are expressed in local currency converted into US $2001. Each of these interventions has been assessed for effectiveness in a LMIC in a limited region; these effectiveness estimates have been used to model the disability-adjusted life years (DALYs) averted for the various regions, taking account of regional differences in the baseline burden of injury. Results The interventions modeled in this paper have cost-effectiveness ratios ranging from US $5 to $556 per DALY averted, depending on region. Depending on local acceptability thresholds, many of them could be judged cost-effective relative to interventions that are already adopted. Enhanced enforcement of traffic regulations is the most cost-effective intervention, with an average cost per DALY of $64. Conclusion Injury countermeasures appear to be cost-effective based on models. More evaluations of real interventions will help to strengthen the evidence base.
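The cost-per-DALY figures above combine a discounted cost stream with discounted DALYs averted. A sketch of that arithmetic with invented numbers (a front-loaded cost profile is assumed so that the two discount rates used in the paper give visibly different results; the dollar amounts are illustrative only):

```python
def present_value(stream, rate):
    """Discount an annual stream (year 0 first) to present value."""
    return sum(x / (1 + rate) ** t for t, x in enumerate(stream))

def cost_per_daly(annual_costs, annual_dalys_averted, rate):
    return (present_value(annual_costs, rate)
            / present_value(annual_dalys_averted, rate))

# Hypothetical intervention: heavy start-up cost, then maintenance,
# averting a constant 700 DALYs per year over the 10-year period.
costs = [100_000] + [20_000] * 9
dalys = [700] * 10
for rate in (0.03, 0.06):  # the two discount rates used in the paper
    print(f"{rate:.0%}: ${cost_per_daly(costs, dalys, rate):,.0f} per DALY averted")
```

Because the costs fall earlier than the health gains, a higher discount rate raises the cost per DALY slightly; both illustrative results land inside the paper's $5-$556 range.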

  19. Bayesian Spatial Modelling with R-INLA

    Finn Lindgren; Håvard Rue


    The principles behind the interface to continuous domain spatial models in the R-INLA software package for R are described. The integrated nested Laplace approximation (INLA) approach proposed by Rue, Martino, and Chopin (2009) is a computationally effective alternative to MCMC for Bayesian inference. INLA is designed for latent Gaussian models, a very wide and flexible class of models ranging from (generalized) linear mixed models to spatial and spatio-temporal models. Combined with the stochastic...

  20. Bayesian modeling and classification of neural signals

    Lewicki, Michael S.


    Signal processing and classification algorithms often have limited applicability resulting from an inaccurate model of the signal's underlying structure. We present here an efficient, Bayesian algorithm for modeling a signal composed of the superposition of brief, Poisson-distributed functions. This methodology is applied to the specific problem of modeling and classifying extracellular neural waveforms, which are composed of a superposition of an unknown number of action potentials (APs). ...

  1. Distributed Bayesian Networks for User Modeling

    Tedesco, Roberto; Dolog, Peter; Nejdl, Wolfgang;


    The World Wide Web is a popular platform for providing eLearning applications to a wide spectrum of users. However – as users differ in their preferences, background, requirements, and goals – applications should provide personalization mechanisms. In the Web context, user models used by such adaptive applications are often partial fragments of an overall user model. The fragments have then to be collected and merged into a global user profile. In this paper we investigate and present algorithms able to cope with distributed, fragmented user models – based on Bayesian networks – in the context … The mechanism efficiently combines distributed learner models without the need to exchange the internal structure of local Bayesian networks, nor local evidence, between the involved platforms.

  2. Constrained bayesian inference of project performance models

    Sunmola, Funlade


    Project performance models play an important role in the management of project success. When used for monitoring projects, they can offer predictive ability, such as indications of possible delivery problems. Approaches for monitoring project performance rely on available project information, including restrictions imposed on the project, particularly the constraints of cost, quality, scope and time. We study in this paper a Bayesian inference methodology for project performance modelling in ...

  3. Cost effectiveness of vaccination against pandemic influenza in European countries : mathematical modelling analysis

    Lugner, A.K.; van Boven, Michiel; de Vries, Robin; Postma, M.J.; Wallinga, J.


    Objective To investigate whether a single optimal vaccination strategy exists across countries to deal with a future influenza pandemic by comparing the cost effectiveness of different strategies in various pandemic scenarios for three European countries. Design Economic and epidemic modelling study.

  4. Bayesian Network Based XP Process Modelling

    Mohamed Abouelela


    Full Text Available A Bayesian network based mathematical model has been used for modelling the Extreme Programming software development process. The model is capable of predicting the expected finish time and the expected defect rate for each XP release. Therefore, it can be used to determine the success/failure of any XP project. The model takes into account the effect of three XP practices, namely: Pair Programming, Test Driven Development and Onsite Customer practices. The model's predictions were validated against two case studies. Results show the precision of our model, especially in predicting the project finish time.

  5. A Bayesian Modelling of Wildfires in Portugal

    Silva, Giovani L.; Soares, Paulo; Marques, Susete; Dias, Inês M.; Oliveira, Manuela M.; Borges, Guilherme J.


    In the last decade wildfires became a serious problem in Portugal due to different issues, such as climatic characteristics and the nature of Portuguese forests. In order to analyse wildfire data, we employ beta regression for modelling the proportion of burned forest area, under a Bayesian perspective. Our main goal is to find out the fire risk factors that influence the proportion of area burned and what may make a forest type susceptible or resistant to fire. Then, we analyse wildfire...

  6. Market Segmentation Using Bayesian Model Based Clustering

    Van Hattum, P.


    This dissertation deals with two basic problems in marketing: market segmentation, which is the grouping of persons who share common aspects, and market targeting, which is focusing your marketing efforts on one or more attractive market segments. For the grouping of persons who share common aspects, a Bayesian model based clustering approach is proposed that can be applied to data sets that are specifically used for market segmentation. The cluster algorithm can handle very l...

  7. Centralized Bayesian reliability modelling with sensor networks

    Dedecius, Kamil; Sečkárová, Vladimíra


    Vol. 19, No. 5 (2013), pp. 471-482. ISSN 1387-3954. R&D Projects: GA MŠk 7D12004. Other grants: GA MŠk(CZ) SVV-265315. Keywords: Bayesian modelling * Sensor network * Reliability. Subject RIV: BD - Theory of Information. Impact factor: 0.984, year: 2013

  8. Bayesian mixture models for Poisson astronomical images

    Guglielmetti, Fabrizia; Fischer, Rainer; Dose, Volker


    Astronomical images in the Poisson regime are typically characterized by a spatially varying cosmic background, a large variety of source morphologies and intensities, data incompleteness, steep gradients in the data, and few photon counts per pixel. The Background-Source separation technique is developed with the aim of detecting faint and extended sources in astronomical images characterized by Poisson statistics. The technique employs Bayesian mixture models to reliably detect the background as...
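A toy version of the idea: if each pixel's photon count is Poisson, with one rate for background and a higher rate where a source is present, Bayes' rule gives a per-pixel posterior probability of "source". The rates and prior below are invented, and the real technique models the background as spatially varying rather than constant:

```python
import math

def source_posterior(counts, bg_rate, src_rate, p_source=0.1):
    """P(source | k) per pixel under a two-component Poisson mixture
    with known rates -- a simplified stand-in for the full model."""
    def pois(k, lam):
        return math.exp(-lam) * lam ** k / math.factorial(k)
    return [p_source * pois(k, src_rate)
            / (p_source * pois(k, src_rate) + (1 - p_source) * pois(k, bg_rate))
            for k in counts]

# Few counts -> almost certainly background; many counts -> almost certainly a source
print(source_posterior([0, 2, 8], bg_rate=1.0, src_rate=5.0))
```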

  9. Bayesian Inference of a Multivariate Regression Model

    Marick S. Sinay


    Full Text Available We explore Bayesian inference of a multivariate linear regression model with use of a flexible prior for the covariance structure. The commonly adopted Bayesian setup involves the conjugate prior, multivariate normal distribution for the regression coefficients and inverse Wishart specification for the covariance matrix. Here we depart from this approach and propose a novel Bayesian estimator for the covariance. A multivariate normal prior for the unique elements of the matrix logarithm of the covariance matrix is considered. Such structure allows for a richer class of prior distributions for the covariance, with respect to strength of beliefs in prior location hyperparameters, as well as the added ability to model potential correlation amongst the covariance structure. The posterior moments of all relevant parameters of interest are calculated based upon numerical results via a Markov chain Monte Carlo procedure. The Metropolis-Hastings-within-Gibbs algorithm is invoked to account for the construction of a proposal density that closely matches the shape of the target posterior distribution. As an application of the proposed technique, we investigate a multiple regression based upon the 1980 High School and Beyond Survey.
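The matrix-logarithm reparameterisation works because a symmetric positive-definite covariance maps one-to-one onto an unconstrained symmetric matrix, whose unique elements can then be given a multivariate normal prior. A sketch of the transform via eigendecomposition (not the authors' code):

```python
import numpy as np

def cov_to_logm(S):
    """Matrix log of an SPD covariance: log(S) = V diag(log w) V^T."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.log(w)) @ V.T

def logm_to_cov(A):
    """Inverse map: the matrix exponential of a symmetric matrix is SPD."""
    w, V = np.linalg.eigh(A)
    return V @ np.diag(np.exp(w)) @ V.T

S = np.array([[2.0, 0.5],
              [0.5, 1.0]])
A = cov_to_logm(S)      # symmetric, but entries are otherwise unconstrained
print(np.allclose(logm_to_cov(A), S))  # True: round-trips to the original
```

Sampling the unique elements of A and mapping back guarantees a valid covariance at every MCMC step, which is the point of the construction.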

  10. Cost effectiveness of vaccination against pandemic influenza in European countries: mathematical modelling analysis

    Lugner, A.K.; van Boven, Michiel; de Vries, Robin; Postma, M. J.; Wallinga, J.


    Objective To investigate whether a single optimal vaccination strategy exists across countries to deal with a future influenza pandemic by comparing the cost effectiveness of different strategies in various pandemic scenarios for three European countries. Design Economic and epidemic modelling study. Settings General populations in Germany, the Netherlands, and the United Kingdom. Data sources Country specific patterns of social contact and demographic data. Model An age structured susceptibl...


    Abhijit Kundu; Soumyajit Nath; Saheli Nag; Saikat Bhattacharyya; Bap Sadhukhan; M. Ray Kanjilal


    A simple and successful design is developed with the objective of putting together a cost-effective model, scaled down both in size and energy required, for an average residential home driven by solar panels. It also deals with the autonomous illumination of the streets of a model colony through solar panels, to meet the requirements and attain the maximum efficiency of the available energy. The photovoltaic system, along with an inverter and intensity control circuit, accounts for...

  12. Cost-Effectiveness of a Community Pharmacist-Led Sleep Apnea Screening Program - A Markov Model.

    Clémence Perraudin

    Full Text Available Despite its high prevalence and major public health ramifications, obstructive sleep apnea syndrome (OSAS) remains underdiagnosed. In many developed countries, because community pharmacists (CPs) are easily accessible, they have been developing additional clinical services that integrate with and complement the services of other healthcare providers (general practitioners (GPs), nurses, etc.). Alternative strategies for primary care screening programs for OSAS involving the CP are discussed. Objective: To estimate the quality of life, costs, and cost-effectiveness of three screening strategies among patients who are at risk of having moderate to severe OSAS in primary care. Design: Markov decision model. Data sources: Published data. Target population: Hypothetical cohort of 50-year-old male patients with symptoms highly evocative of OSAS. Time horizon: The 5 years after initial evaluation for OSAS. Perspective: Societal. Interventions: Screening strategy with CP (CP-GP collaboration), screening strategy without CP (GP alone), and no screening. Outcome measures: Quality of life, survival and costs for each screening strategy. Results: Under almost all modeled conditions, the involvement of CPs in OSAS screening was cost-effective. The maximal incremental cost for the "screening strategy with CP" was about €455 per QALY gained. Our results were robust but primarily sensitive to the costs of treatment by continuous positive airway pressure and the costs of untreated OSAS. The probabilistic sensitivity analysis showed that the "screening strategy with CP" was dominant in 80% of cases. It was more effective and less costly in 47% of cases, and within the cost-effective range (maximum incremental cost-effectiveness ratio of €6186.67/QALY) in 33% of cases. Conclusion: CP involvement in OSAS screening is a cost-effective strategy. This proposal is consistent with the trend in Europe and the United States to extend the practices and responsibilities of the pharmacist in primary care.
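A Markov decision model of this kind moves a cohort through health states year by year with a transition-probability matrix, accumulating state-specific costs and utilities. A deliberately tiny sketch with made-up states and numbers (the real model has more states, and discounting is omitted for brevity):

```python
import numpy as np

# Hypothetical three-state cohort model: treated OSAS, untreated OSAS, dead.
P = np.array([[0.92, 0.05, 0.03],   # from treated
              [0.10, 0.84, 0.06],   # from untreated
              [0.00, 0.00, 1.00]])  # dead is absorbing
utility = np.array([0.85, 0.70, 0.00])  # QALY weight per state-year
cost = np.array([1200.0, 800.0, 0.0])   # cost per state-year (EUR)

state = np.array([0.0, 1.0, 0.0])  # everyone starts untreated
total_qalys = total_cost = 0.0
for _ in range(5):                  # 5-year horizon, as in the paper
    state = state @ P               # one annual cycle
    total_qalys += state @ utility
    total_cost += state @ cost
print(round(total_qalys, 2), round(total_cost, 2))
```

Comparing the accumulated costs and QALYs of two such models, one per screening strategy, yields the ICERs reported above.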

  13. Bayesian Kinematic Finite Fault Source Models (Invited)

    Minson, S. E.; Simons, M.; Beck, J. L.


    Finite fault earthquake source models are inherently under-determined: there is no unique solution to the inverse problem of determining the rupture history at depth as a function of time and space when our data are only limited observations at the Earth's surface. Traditional inverse techniques rely on model constraints and regularization to generate one model from the possibly broad space of all possible solutions. However, Bayesian methods allow us to determine the ensemble of all possible source models which are consistent with the data and our a priori assumptions about the physics of the earthquake source. Until now, Bayesian techniques have been of limited utility because they are computationally intractable for problems with as many free parameters as kinematic finite fault models. We have developed a methodology called Cascading Adaptive Tempered Metropolis In Parallel (CATMIP) which allows us to sample very high-dimensional problems in a parallel computing framework. The CATMIP algorithm combines elements of simulated annealing and genetic algorithms with the Metropolis algorithm to dynamically optimize the algorithm's efficiency as it runs. We will present synthetic performance tests of finite fault models made with this methodology as well as a kinematic source model for the 2007 Mw 7.7 Tocopilla, Chile earthquake. This earthquake was well recorded by multiple ascending and descending interferograms and a network of high-rate GPS stations whose records can be used as near-field seismograms.

  14. Bayesian Estimation of a Mixture Model

    Ilhem Merah; Assia Chadli


    We present the properties of a bathtub-curve reliability model, introduced by Idée and Pierrat (2010), that has both sufficient adaptability and a minimal number of parameters. It is a mixture of a Gamma distribution G(2, (1/θ)) and a new distribution L(θ). We are interested in Bayesian estimation of the parameters and survival function of this model with a squared-error loss function and a non-informative prior, using the approximations of Lindley (1980) and Tierney and Kadane (1986). Usin...
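Under squared-error loss the Bayes estimator is the posterior mean, which is exactly the quantity the Lindley and Tierney-Kadane approximations target analytically. A numerical stand-in on a parameter grid, using exponential lifetimes with a flat prior (the data and prior here are invented for illustration, not the mixture model of the paper):

```python
import math

def posterior_mean(loglik, prior, grid):
    """Bayes estimator under squared-error loss = posterior mean,
    approximated on a grid instead of an analytic approximation."""
    w = [math.exp(loglik(t)) * prior(t) for t in grid]
    return sum(t * wi for t, wi in zip(grid, w)) / sum(w)

data = [1.2, 0.7, 2.5, 1.9, 0.4]                 # hypothetical lifetimes
loglik = lambda th: len(data) * math.log(th) - th * sum(data)
grid = [i / 100 for i in range(1, 501)]          # theta on (0, 5]
theta_hat = posterior_mean(loglik, lambda th: 1.0, grid)
print(round(theta_hat, 3))  # close to the exact Gamma(6, 6.7) posterior mean, 6/6.7
```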

  15. Bayesian mixture models for partially verified data

    Kostoulas, Polychronis; Browne, William J.; Nielsen, Søren Saxmose;


    Bayesian mixture models can be used to discriminate between the distributions of continuous test responses for different infection stages. These models are particularly useful in case of chronic infections with a long latent period, like Mycobacterium avium subsp. paratuberculosis (MAP) infection, where a perfect reference test does not exist. However, their discriminatory ability diminishes with increasing overlap of the distributions and with increasing number of latent infection stages to be discriminated. We provide a method that uses partially verified data, with known infection status for...


    Abhijit Kundu


    Full Text Available A simple and successful design is developed with the objective of putting together a cost-effective model, scaled down both in size and energy required, for an average residential home driven by solar panels. It also deals with the autonomous illumination of the streets of a model colony through solar panels, to meet the requirements and attain the maximum efficiency of the available energy. The photovoltaic system, along with an inverter and an intensity control circuit, accounts for the basic design. The effort deals with the efficient, cost-effective and needful implementation of photovoltaic systems, which would be useful primarily in rural and remote parts of India for both the social and economic development of the people.

  17. Effectiveness and cost-effectiveness of antidepressants in primary care: a multiple treatment comparison meta-analysis and cost-effectiveness model.

    Joakim Ramsberg

    Full Text Available OBJECTIVE: To determine the effectiveness and cost-effectiveness, over a one-year time horizon, of pharmacological first-line treatment in primary care for patients with moderate to severe depression. DESIGN: A multiple treatment comparison meta-analysis was employed to determine the relative efficacy, in terms of remission, of 10 antidepressants (citalopram, duloxetine, escitalopram, fluoxetine, fluvoxamine, mirtazapine, paroxetine, reboxetine, sertraline and venlafaxine). The estimated remission rates were then applied in a decision-analytic model in order to estimate costs and quality of life with different treatments at one year. DATA SOURCES: Meta-analyses of remission rates from randomised controlled trials, and cost and quality-of-life data from published sources. RESULTS: The most favourable pharmacological treatment in terms of remission was escitalopram, with an 8- to 12-week probability of remission of 0.47. Despite a high acquisition cost, this clinical effectiveness translated into escitalopram being both more effective and having a lower total cost than all other comparators from a societal perspective. From a healthcare perspective, the cost per QALY of escitalopram was €3732 compared with venlafaxine. CONCLUSION: Of the investigated antidepressants, escitalopram has the highest probability of remission and is the most effective and cost-effective pharmacological treatment in a primary care setting when evaluated over a one-year time horizon. Small differences in remission rates may be important when assessing the costs and cost-effectiveness of antidepressants.
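The decision-analytic step can be pictured as a one-year tree: each arm pays its drug cost, then splits between remission and non-remission branches with different follow-up costs and QALY weights. In the sketch below the 0.47 remission probability echoes the abstract, but every other figure (the comparator's 0.43, all costs, all QALY weights) is invented; with such inputs a small remission advantage can make the dearer drug both cheaper overall and more effective, mirroring the dominance result above:

```python
def arm_outcomes(p_remit, drug_cost, remit_cost, no_remit_cost,
                 remit_qaly, no_remit_qaly):
    """Expected one-year cost and QALYs for one treatment arm."""
    cost = drug_cost + p_remit * remit_cost + (1 - p_remit) * no_remit_cost
    qalys = p_remit * remit_qaly + (1 - p_remit) * no_remit_qaly
    return cost, qalys

# Remission 0.47 as reported for escitalopram; every other figure invented.
esc_cost, esc_qaly = arm_outcomes(0.47, 300, 1000, 4000, 0.80, 0.60)
cmp_cost, cmp_qaly = arm_outcomes(0.43, 250, 1000, 4000, 0.80, 0.60)
d_cost, d_qaly = esc_cost - cmp_cost, esc_qaly - cmp_qaly
print(d_cost < 0 and d_qaly > 0)  # True here: dominant despite higher drug cost
```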

  18. Different approaches to modelling the cost-effectiveness of schistosomiasis control

    Guyatt Helen


    Full Text Available This paper reviews three different approaches to modelling the cost-effectiveness of schistosomiasis control. Although these approaches vary in their assessment of costs, the major focus of the paper is on the evaluation of effectiveness. The first model presented is a static economic model which assesses effectiveness in terms of the proportion of cases cured. This model is important in highlighting that the optimal choice of chemotherapy regime depends critically on the level of budget constraint, the unit costs of screening and treatment, the rates of compliance with screening and chemotherapy, and the prevalence of infection. The limitation of this approach is that it models the cost-effectiveness of only one cycle of treatment, and effectiveness reflects only the immediate impact of treatment. The second model presented is a prevalence-based dynamic model which links prevalence rates from one year to the next, and assesses effectiveness as the proportion of cases prevented. This model was important as it introduced the concept of measuring the long-term impact of control by using a transmission model which can assess the reduction in infection through time, but it is limited to assessing the impact only on the prevalence of infection. The third approach presented is a theoretical framework which describes the dynamic relationships between infection and morbidity, and which assesses effectiveness in terms of case-years of infection and morbidity prevented. The use of this model in assessing the cost-effectiveness of age-targeted treatment in controlling Schistosoma mansoni is explored in detail, with respect to varying frequencies of treatment and the interaction between drug price and drug efficacy.

  19. Structuring and validating a cost-effectiveness model of primary asthma prevention amongst children

    Ramos G Feljandro P


    Full Text Available Abstract Background Given the rising number of asthma cases and the increasing costs of health care, prevention may be the best cure. Decisions regarding the implementation of prevention programmes in general, and choosing between unifaceted and multifaceted strategies in particular, are urgently needed. Existing trials on the primary prevention of asthma are, however, insufficient on their own to inform stakeholders' decisions regarding the cost-effectiveness of such prevention strategies. Decision-analytic modelling synthesises the available data for the cost-effectiveness evaluation of strategies in an explicit manner. Published reports on model development should provide the detail and transparency required to increase the acceptability of cost-effectiveness modelling. But detail on the explicit steps taken, and on the involvement of experts in structuring a model, is often unevenly reported. In this paper, we describe a procedure to structure and validate a model for the primary prevention of asthma in children. Methods An expert panel was convened for round-table discussions to frame the cost-effectiveness research question and to select and structure a model. The model's structural validity, which indicates how well the model reflects reality, was determined through descriptive and parallel validation. Descriptive validation was performed with the experts. Parallel validation qualitatively compared similarity with other published models addressing different decision problems. Results The multidisciplinary input of experts helped to develop a decision-tree structure comparing the current situation with screening and prevention. Prevention was further divided into multifaceted and unifaceted approaches to analyse the differences. The clinical outcome was a diagnosis of asthma. No similar model discussing the same decision problem was found in the literature. Structural validity in terms of descriptive validity was achieved with the experts.

  20. A Nonparametric Bayesian Model for Nested Clustering.

    Lee, Juhee; Müller, Peter; Zhu, Yitan; Ji, Yuan


    We propose a nonparametric Bayesian model for clustering where clusters of experimental units are determined by a shared pattern of clustering another set of experimental units. The proposed model is motivated by the analysis of protein activation data, where we cluster proteins such that all proteins in one cluster give rise to the same clustering of patients. That is, we define clusters of proteins by the way that patients group with respect to the corresponding protein activations. This is in contrast to (almost) all currently available models that use shared parameters in the sampling model to define clusters. This includes in particular model based clustering, Dirichlet process mixtures, product partition models, and more. We show results for two typical biostatistical inference problems that give rise to clustering. PMID:26519174

  1. Bayesian Spatial Modelling with R-INLA

    Finn Lindgren


    Full Text Available The principles behind the interface to continuous domain spatial models in the R-INLA software package for R are described. The integrated nested Laplace approximation (INLA) approach proposed by Rue, Martino, and Chopin (2009) is a computationally effective alternative to MCMC for Bayesian inference. INLA is designed for latent Gaussian models, a very wide and flexible class of models ranging from (generalized) linear mixed models to spatial and spatio-temporal models. Combined with the stochastic partial differential equation (SPDE) approach (Lindgren, Rue, and Lindström 2011), one can accommodate all kinds of geographically referenced data, including areal and geostatistical data, as well as spatial point process data. The implementation interface covers stationary spatial models, non-stationary spatial models, and also spatio-temporal models, and is applicable in epidemiology, ecology, environmental risk assessment, as well as general geostatistics.

  2. The cost-effectiveness of the Olweus Bullying Prevention Program: Results from a modelling study.

    Beckman, Linda; Svensson, Mikael


    Exposure to bullying affects around 3-5 percent of adolescents in secondary school and is related to various mental health problems. Many different anti-bullying programmes are currently available, but economic evaluations are lacking. The aim of this study is to identify the cost-effectiveness of the Olweus Bullying Prevention Program (OBPP). We constructed a decision-tree model for a Swedish secondary school, using a public-payer perspective, and retrieved data on costs and effects from the published literature. A probabilistic sensitivity analysis was conducted to reflect the uncertainty in the model. The base-case analysis showed that using the OBPP to reduce the number of victims of bullying costs 131,250 Swedish kronor (€14,470) per victim spared. Compared with a relevant willingness-to-pay threshold for the societal value of bullying reduction, this indicates that the OBPP is a cost-effective intervention. PMID:26433734
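A probabilistic sensitivity analysis like the one above can be sketched as a Monte Carlo loop: draw the uncertain incremental cost and effect from assumed distributions, and count the share of draws falling under the willingness-to-pay threshold. The distributions and threshold below are invented placeholders, not the study's inputs:

```python
import random

def psa_fraction_cost_effective(n_sims=10_000, wtp=150_000, seed=1):
    """Share of simulations in which the programme's cost per victim
    spared falls under the willingness-to-pay threshold (toy model)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        d_cost = rng.gauss(131_250, 30_000)          # SEK per school (assumed)
        d_effect = max(rng.gauss(1.0, 0.25), 0.05)   # victims spared (assumed)
        if d_cost / d_effect <= wtp:
            hits += 1
    return hits / n_sims

print(psa_fraction_cost_effective())  # fraction of draws deemed cost-effective
```

Reporting this fraction across thresholds is what produces the familiar cost-effectiveness acceptability curve.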

  3. Cost-effectiveness of new pneumococcal conjugate vaccines in Turkey: a decision analytical model

    Bakır Mustafa


    Full Text Available Abstract Background Streptococcus pneumoniae infections, which place a considerable burden on healthcare resources, can be reduced in a cost-effective manner using a 7-valent pneumococcal conjugate vaccine (PCV-7). We compare the cost-effectiveness of a 13-valent PCV (PCV-13) and a 10-valent pneumococcal non-typeable Haemophilus influenzae protein D conjugate vaccine (PHiD-CV) with that of PCV-7 in Turkey. Methods A cost-utility analysis was conducted and a decision-analytic model was used to estimate the proportion of the Turkish population … Results PCV-13 and PHiD-CV are projected to have a substantial impact on pneumococcal disease in Turkey versus PCV-7, with 2,223 and 3,156 quality-adjusted life years (QALYs) and 2,146 and 2,081 life years, respectively, being saved under a 3+1 schedule. Projections of direct medical costs showed that a PHiD-CV vaccination programme would provide the greatest cost savings, offering additional savings of US$11,718,813 versus PCV-7 and US$8,235,010 versus PCV-13. Probabilistic sensitivity analysis showed that PHiD-CV dominated PCV-13 in terms of QALYs gained and cost savings in 58.3% of simulations. Conclusion Under the modeled conditions, PHiD-CV would provide the most cost-effective intervention for reducing pneumococcal disease in Turkish children.

  4. Bayesian Discovery of Linear Acyclic Causal Models

    Hoyer, Patrik O


    Methods for automated discovery of causal relationships from non-interventional data have received much attention recently. A widely used and well-understood model family is given by linear acyclic causal models (recursive structural equation models). For Gaussian data both constraint-based methods (Spirtes et al., 1993; Pearl, 2000) (which output a single equivalence class) and Bayesian score-based methods (Geiger and Heckerman, 1994) (which assign relative scores to the equivalence classes) are available. In contrast, all current methods able to utilize non-Gaussianity in the data (Shimizu et al., 2006; Hoyer et al., 2008) always return only a single graph or a single equivalence class, and so are fundamentally unable to express the degree of certainty attached to that output. In this paper we develop a Bayesian score-based approach able to take advantage of non-Gaussianity when estimating linear acyclic causal models, and we empirically demonstrate that, at least on very modest size networks, its accur...

  5. Adversarial life testing: A Bayesian negotiation model

    Life testing is a procedure intended for facilitating the process of making decisions in the context of industrial reliability. On the other hand, negotiation is a process of making joint decisions that has one of its main foundations in decision theory. A Bayesian sequential model of negotiation in the context of adversarial life testing is proposed. This model considers a general setting for which a manufacturer offers a product batch to a consumer. It is assumed that the reliability of the product is measured in terms of its lifetime. Furthermore, both the manufacturer and the consumer have to use their own information with respect to the quality of the product. Under these assumptions, two situations can be analyzed. For both of them, the main aim is to accept or reject the product batch based on the product reliability. This topic is related to a reliability demonstration problem. The procedure is applied to a class of distributions that belong to the exponential family. Thus, a unified framework addressing the main topics in the considered Bayesian model is presented. An illustrative example shows that the proposed technique can be easily applied in practice

  6. Bayesian Fundamentalism or Enlightenment? On the explanatory status and theoretical contributions of Bayesian models of cognition.

    Jones, Matt; Love, Bradley C


    The prominence of Bayesian modeling of cognition has increased recently largely because of mathematical advances in specifying and deriving predictions from complex probabilistic models. Much of this research aims to demonstrate that cognitive behavior can be explained from rational principles alone, without recourse to psychological or neurological processes and representations. We note commonalities between this rational approach and other movements in psychology - namely, Behaviorism and evolutionary psychology - that set aside mechanistic explanations or make use of optimality assumptions. Through these comparisons, we identify a number of challenges that limit the rational program's potential contribution to psychological theory. Specifically, rational Bayesian models are significantly unconstrained, both because they are uninformed by a wide range of process-level data and because their assumptions about the environment are generally not grounded in empirical measurement. The psychological implications of most Bayesian models are also unclear. Bayesian inference itself is conceptually trivial, but strong assumptions are often embedded in the hypothesis sets and the approximation algorithms used to derive model predictions, without a clear delineation between psychological commitments and implementational details. Comparing multiple Bayesian models of the same task is rare, as is the realization that many Bayesian models recapitulate existing (mechanistic level) theories. Despite the expressive power of current Bayesian models, we argue they must be developed in conjunction with mechanistic considerations to offer substantive explanations of cognition. We lay out several means for such an integration, which take into account the representations on which Bayesian inference operates, as well as the algorithms and heuristics that carry it out. We argue this unification will better facilitate lasting contributions to psychological theory, avoiding the pitfalls

  7. Assessing the cost-effectiveness of electric vehicles in European countries using integrated modeling

    Electric vehicles (EVs) are considered alternatives to internal combustion engines due to their energy efficiency and contribution to CO2 mitigation. The adoption of EVs depends on consumer preferences, including cost, social status and driving habits, although it is agreed that current and expected costs play a major role. We use a partial equilibrium model that minimizes total energy system costs to assess whether EVs can be a cost-effective option for the consumers of each EU27 member state up to 2050, focusing on the impact of different vehicle investment costs and CO2 mitigation targets. We found that for an EU-wide greenhouse gas emission reduction cap of 40% and 70% by 2050 vis-à-vis 1990 emissions, battery electric vehicles (BEVs) are cost-effective in the EU only by 2030 and only if their costs are 30% lower than currently expected. At the EU level, vehicle costs and the capability to deliver both short- and long-distance mobility are the main drivers of BEV deployment. Other drivers include each state’s national mobility patterns and the cost-effectiveness of alternative mitigation options, both in the transport sector, such as plug-in hybrid electric vehicles (PHEVs) or biofuels, and in other sectors, such as renewable electricity. - Highlights: • Electric vehicles were assessed through the minimization of the total energy systems costs. • EU climate policy targets could act as a major driver for PHEV adoption. • Battery EV is an option before 2030 if costs will drop by 30% from expected costs. • EV deployment varies per country depending on each energy system configuration. • Incentives at the country level should consider specific cost-effectiveness factors

  8. The cost-effectiveness of testing strategies for type 2 diabetes: a modelling study.

    Gillett, Mike; Brennan, Alan; Watson, Penny; Khunti, Kamlesh; Davies, Melanie; Mostafa, Samiul; Gray, Laura J


    BACKGROUND An estimated 850,000 people have diabetes without knowing it and as many as 7 million more are at high risk of developing it. Within the NHS Health Checks programme, blood glucose testing can be undertaken using a fasting plasma glucose (FPG) or a glycated haemoglobin (HbA1c) test, but the relative cost-effectiveness of these is unknown. OBJECTIVES To estimate and compare the cost-effectiveness of screening for type 2 diabetes using an HbA1c test versus an FPG test. In addition, to compare the use of a random capillary glucose (RCG) test versus a non-invasive risk score to prioritise individuals who should undertake an HbA1c or FPG test. DESIGN Cost-effectiveness analysis using the Sheffield Type 2 Diabetes Model to model lifetime incidence of complications, costs and health benefits of screening. SETTING England; population in the 40-74 years age range eligible for an NHS Health Check. DATA SOURCES The Leicester Ethnic Atherosclerosis and Diabetes Risk (LEADER) data set was used to analyse prevalence and screening outcomes for a multiethnic population. Alternative prevalence rates were obtained from the literature or through personal communication. METHODS (1) Modelling of screening pathways to determine the cost per case detected, followed by long-term modelling of glucose progression and complications associated with hyperglycaemia; and (2) calculation of the costs and health-related quality of life arising from complications and calculation of overall cost per quality-adjusted life-year (QALY), net monetary benefit and the likelihood of cost-effectiveness. RESULTS Based on the LEADER data set from a multiethnic population, the results indicate that screening using an HbA1c test is more cost-effective than using an FPG test. For National Institute for Health and Care Excellence (NICE)-recommended screening strategies, HbA1c leads to a cost saving of £12 and a QALY gain of 0.0220 per person when a risk score is used as a prescreen. With no prescreen, the cost

  9. Bayesian Estimation of a Mixture Model

    Ilhem Merah


    We present the properties of a bathtub-curve reliability model, introduced by Idée and Pierrat (2010), that combines sufficient adaptability with a minimal number of parameters. It is a mixture of a Gamma distribution G(2, 1/θ) and a new distribution L(θ). We are interested in Bayesian estimation of the parameters and survival function of this model, with a squared-error loss function and a non-informative prior, using the approximations of Lindley (1980) and Tierney and Kadane (1986). Using a statistical sample of 60 failure data points from a technical device, we illustrate the results derived. Based on a simulation study, comparisons are made between these two methods and the maximum likelihood method for this two-parameter model.
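Under squared-error loss, the Bayes estimate discussed in the abstract above is the posterior mean. As an illustrative sketch only (not the paper's mixture model, and using brute-force grid integration in place of the Lindley/Tierney-Kadane approximations; the exponential-lifetime example and function names are hypothetical):

```python
from math import exp, log

def posterior_mean_grid(log_lik, log_prior, grid):
    """Bayes estimate under squared-error loss: the posterior mean,
    computed by brute-force grid approximation (a stand-in for the
    Lindley / Tierney-Kadane analytic approximations)."""
    logs = [log_lik(t) + log_prior(t) for t in grid]
    m = max(logs)                      # stabilize before exponentiating
    w = [exp(l - m) for l in logs]
    z = sum(w)
    return sum(t * wi for t, wi in zip(grid, w)) / z

# Toy example: exponential lifetimes with rate theta and a flat prior,
# so the posterior is Gamma(n + 1, sum(data)) with mean (n+1)/sum(data).
data = [0.8, 1.1, 0.9, 1.3, 1.0]
log_lik = lambda th: len(data) * log(th) - th * sum(data)
grid = [i / 1000 for i in range(1, 5000)]
theta_hat = posterior_mean_grid(log_lik, lambda th: 0.0, grid)
assert abs(theta_hat - (len(data) + 1) / sum(data)) < 0.01
```

The same grid machinery yields posterior means of derived quantities (e.g. a survival probability) by averaging that quantity instead of the parameter itself.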

  10. Impact and cost-effectiveness of current and future tuberculosis diagnostics: the contribution of modelling.

    Dowdy, D W; Houben, R; Cohen, T; Pai, M; Cobelens, F; Vassall, A; Menzies, N A; Gomez, G B; Langley, I; Squire, S B; White, R


    The landscape of diagnostic testing for tuberculosis (TB) is changing rapidly, and stakeholders need urgent guidance on how to develop, deploy and optimize TB diagnostics in a way that maximizes impact and makes best use of available resources. When decisions must be made with only incomplete or preliminary data available, modelling is a useful tool for providing such guidance. Following a meeting of modelers and other key stakeholders organized by the TB Modelling and Analysis Consortium, we propose a conceptual framework for positioning models of TB diagnostics. We use that framework to describe modelling priorities in four key areas: Xpert(®) MTB/RIF scale-up, target product profiles for novel assays, drug susceptibility testing to support new drug regimens, and the improvement of future TB diagnostic models. If we are to maximize the impact and cost-effectiveness of TB diagnostics, these modelling priorities should figure prominently as targets for future research. PMID:25189546

  11. The Bayesian Modelling Of Inflation Rate In Romania

    Mihaela Simionescu


    Bayesian econometrics has seen a considerable increase in popularity in recent years, attracting the interest of various groups of researchers in the economic sciences, including specialists in econometrics, commerce, industry, marketing, finance, microeconomics, macroeconomics and other domains. The purpose of this research is to provide an introduction to the Bayesian approach as applied in economics, starting with Bayes' theorem. For Bayesian linear regression models the methodology of estim...

  12. A tutorial introduction to Bayesian models of cognitive development

    Perfors, Amy; Tenenbaum, Joshua B.; Griffiths, Thomas L.; Xu, Fei


    We present an introduction to Bayesian inference as it is used in probabilistic models of cognitive development. Our goal is to provide an intuitive and accessible guide to the what, the how, and the why of the Bayesian approach: what sorts of problems and data the framework is most relevant for, and how and why it may be useful for developmentalists. We emphasize a qualitative understanding of Bayesian inference, but also include information about additional resources for those interested in...

  13. Merging Digital Surface Models Implementing Bayesian Approaches

    Sadeq, H.; Drummond, J.; Li, Z.


    In this research, DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian approach. The input data were sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades). A Bayesian approach is deemed preferable when the data obtained from the sensors are limited and additional measurements are difficult or costly to obtain; the lack of data can then be compensated for by introducing a priori estimates. To infer the prior data, the roofs of the buildings are assumed to be smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimates, GNSS RTK measurements collected in the field are used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West End of Glasgow, an area containing different kinds of buildings, such as flat-roofed and hipped-roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results show that the model successfully improved the quality of the DSMs, in particular characteristics such as the roof surfaces, which consequently led to better representations. In addition, the developed model has been compared with the well-established maximum likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied to DSMs derived from satellite imagery, it can be applied to DSMs from any other source.

  14. Cost-effectiveness of interventions to promote physical activity: a modelling study.

    Linda J Cobiac


    BACKGROUND: Physical inactivity is a key risk factor for chronic disease, but a growing number of people are not achieving the recommended levels of physical activity necessary for good health. Australians are no exception; despite Australia's image as a sporting nation, with success at the elite level, the majority of Australians do not get enough physical activity. There are many options for intervention, from individually tailored advice, such as counselling from a general practitioner, to population-wide approaches, such as mass media campaigns, but the most cost-effective mix of interventions is unknown. In this study we evaluate the cost-effectiveness of interventions to promote physical activity. METHODS AND FINDINGS: From evidence of intervention efficacy in the physical activity literature and evaluation of the health sector costs of intervention and disease treatment, we model the cost impacts and health outcomes of six physical activity interventions, over the lifetime of the Australian population. We then determine the cost-effectiveness of each intervention against current practice for physical activity intervention in Australia and derive the optimal pathway for implementation. Based on current evidence of intervention effectiveness, the intervention programs that encourage use of pedometers (Dominant) and mass media-based community campaigns (Dominant) are the most cost-effective strategies to implement and are very likely to be cost-saving. The internet-based intervention program (AUS$3,000/DALY), the GP physical activity prescription program (AUS$12,000/DALY), and the program to encourage more active transport (AUS$20,000/DALY), although less likely to be cost-saving, have a high probability of being under an AUS$50,000 per DALY threshold. GP referral to an exercise physiologist (AUS$79,000/DALY) is the least cost-effective option if high time and travel costs for patients in screening and consulting an exercise physiologist are considered
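The dominance and cost-per-DALY logic behind these rankings can be sketched in a few lines. This is an illustrative, hypothetical calculation; the function names, threshold default, and figures below are assumptions, not the study's data:

```python
# Sketch of incremental cost-effectiveness screening against a
# willingness-to-pay threshold (illustrative values only).

def icer(delta_cost, delta_daly_averted):
    """Incremental cost-effectiveness ratio: extra cost per DALY averted."""
    if delta_daly_averted <= 0:
        raise ValueError("intervention averts no additional DALYs")
    return delta_cost / delta_daly_averted

def is_cost_effective(delta_cost, delta_daly_averted, threshold=50_000):
    """Cost-saving interventions (negative incremental cost) dominate;
    otherwise compare the ICER against the threshold (AUS$ per DALY)."""
    if delta_cost <= 0 and delta_daly_averted > 0:
        return True  # dominant: cheaper and more effective
    return icer(delta_cost, delta_daly_averted) <= threshold

# Illustrative numbers only:
assert is_cost_effective(-1_000_000, 500)      # dominant (cost-saving)
assert is_cost_effective(3_000 * 400, 400)     # AUS$3,000 per DALY
assert not is_cost_effective(79_000 * 10, 10)  # AUS$79,000 per DALY
```

In practice such comparisons are made against the next-best alternative on the cost-effectiveness frontier rather than pairwise, but the threshold test above is the basic building block.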

  15. Bayesian Models of Graphs, Arrays and Other Exchangeable Random Structures.

    Orbanz, Peter; Roy, Daniel M


    The natural habitat of most Bayesian methods is data represented by exchangeable sequences of observations, for which de Finetti's theorem provides the theoretical foundation. Dirichlet process clustering, Gaussian process regression, and many other parametric and nonparametric Bayesian models fall within the remit of this framework; many problems arising in modern data analysis do not. This article provides an introduction to Bayesian models of graphs, matrices, and other data that can be modeled by random structures. We describe results in probability theory that generalize de Finetti's theorem to such data and discuss their relevance to nonparametric Bayesian modeling. With the basic ideas in place, we survey example models available in the literature; applications of such models include collaborative filtering, link prediction, and graph and network analysis. We also highlight connections to recent developments in graph theory and probability, and sketch the more general mathematical foundation of Bayesian methods for other types of data beyond sequences and arrays. PMID:26353253
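For readers new to this area, the Dirichlet process clustering mentioned above has a simple sequential predictive rule, the Chinese restaurant process. The following is a minimal sketch of that rule, not code from the article; the function and parameter names are hypothetical:

```python
import random

def crp_partition(n, alpha, seed=0):
    """Sample a partition of n exchangeable items from the Chinese
    restaurant process: item i joins an existing cluster with probability
    proportional to its size, or opens a new cluster with probability
    proportional to the concentration parameter alpha."""
    rng = random.Random(seed)
    sizes = []   # current cluster sizes
    labels = []  # cluster label assigned to each item
    for i in range(n):
        r = rng.random() * (i + alpha)   # total mass is i + alpha
        acc = 0.0
        for k, s in enumerate(sizes):
            acc += s
            if r < acc:                  # join existing cluster k
                sizes[k] += 1
                labels.append(k)
                break
        else:                            # open a new cluster
            labels.append(len(sizes))
            sizes.append(1)
    return labels

labels = crp_partition(100, alpha=2.0)
assert len(labels) == 100 and labels[0] == 0
```

Because the resulting distribution over partitions does not depend on the order of the items, this sampler is one concrete face of the exchangeability that de Finetti-style theorems characterize.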

  16. Modeling the cost effectiveness of malaria control interventions in the highlands of western Kenya.

    Erin M Stuckey

    INTRODUCTION: Tools that allow for in silico optimization of available malaria control strategies can assist the decision-making process for prioritizing interventions. The OpenMalaria stochastic simulation modeling platform can be applied to simulate the impact of interventions singly and in combination as implemented in Rachuonyo South District, western Kenya, to support this goal. METHODS: Combinations of malaria interventions were simulated using a previously published, validated model of malaria epidemiology and control in the study area. An economic model of the costs of case management and malaria control interventions in Kenya was applied to the simulation results, and the cost-effectiveness of each intervention combination was compared to the corresponding simulated outputs of a scenario without interventions. Uncertainty was evaluated by varying health system and intervention delivery parameters. RESULTS: The intervention strategy with the greatest simulated health impact employed long-lasting insecticide-treated net (LLIN) use by 80% of the population, 90% of households covered by indoor residual spraying (IRS) with deployment starting in April, and intermittent screen and treat (IST) of school children using artemether-lumefantrine (AL) with 80% coverage twice per term. However, the current malaria control strategy in the study area, including LLIN use of 56% and IRS coverage of 70%, was the most cost-effective at reducing disability-adjusted life years (DALYs) over a five-year period. CONCLUSIONS: All the simulated intervention combinations can be considered cost-effective in the context of available resources for health in Kenya. Increasing coverage of vector control interventions has a larger simulated impact compared to adding IST to the current implementation strategy, suggesting that transmission in the study area is not at a level to warrant replacing vector control with a school-based screen-and-treat program. These results have the potential to

  17. Modeling Social Annotation: a Bayesian Approach

    Plangprasopchok, Anon


    Collaborative tagging systems, such as CiteULike and others, allow users to annotate objects, e.g., Web pages or scientific papers, with descriptive labels called tags. The social annotations, contributed by thousands of users, can potentially be used to infer categorical knowledge, classify documents or recommend new relevant information. Traditional text inference methods do not make best use of socially generated data, since they do not take into account variations in individual users' perspectives and vocabulary. In previous work, we introduced a simple probabilistic model that takes the interests of individual annotators into account in order to find hidden topics of annotated objects. Unfortunately, our proposed approach had a number of shortcomings, including overfitting, local maxima and the requirement to specify values for some parameters. In this paper we address these shortcomings in two ways. First, we extend the model to a fully Bayesian framework. Second, we describe an infinite ver...

  18. Improving randomness characterization through Bayesian model selection

    Díaz-H. R., Rafael; Angulo Martínez, Alí M.; U'Ren, Alfred B.; Hirsch, Jorge G.; Marsili, Matteo; Pérez Castillo, Isaac


    Nowadays, random number generation plays an essential role in technology, with important applications in areas ranging from cryptography, which lies at the core of current communication protocols, to Monte Carlo methods and other probabilistic algorithms. In this context, a crucial scientific endeavour is to develop effective methods that allow the characterization of random number generators. However, commonly employed methods either lack formality (e.g. the NIST test suite) or are inapplicable in principle (e.g. the characterization derived from the Algorithmic Theory of Information (ATI)). In this letter we present a novel method based on Bayesian model selection, which is both rigorous and effective, for characterizing randomness in a bit sequence. We derive analytic expressions for a model's likelihood, which is then used to compute its posterior probability distribution. Our method proves to be more rigorous than NIST's suite and the Borel-Normality criterion, and its implementation is straightforward. We...
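The core idea, comparing models of a bit sequence via their analytic marginal likelihoods, can be illustrated in the simplest possible case. The toy sketch below (a Beta-Bernoulli "biased coin" model versus a fixed fair-coin model) is an illustration of the general technique, not the letter's actual model family:

```python
from math import lgamma, exp, log

def log_evidence_beta_bernoulli(heads, tails, a=1.0, b=1.0):
    """Analytic marginal likelihood of a bit sequence under a Bernoulli
    model with a Beta(a, b) prior on the unknown bias:
    p(D) = B(a + heads, b + tails) / B(a, b)."""
    def log_beta(x, y):
        return lgamma(x) + lgamma(y) - lgamma(x + y)
    return log_beta(a + heads, b + tails) - log_beta(a, b)

def log_evidence_fair(heads, tails):
    """Marginal likelihood of the 'truly random' model: p = 1/2, fixed."""
    return (heads + tails) * log(0.5)

def bayes_factor_biased_vs_fair(heads, tails):
    """Bayes factor of the biased-coin model over the fair-coin model."""
    return exp(log_evidence_beta_bernoulli(heads, tails)
               - log_evidence_fair(heads, tails))

# A balanced sequence favours the fair-coin model (Bayes factor < 1);
# a heavily skewed one favours the biased model.
assert bayes_factor_biased_vs_fair(500, 500) < 1
assert bayes_factor_biased_vs_fair(900, 100) > 1
```

The automatic Occam penalty visible here (the flexible model loses on balanced data because its prior spreads mass over all biases) is exactly what makes marginal-likelihood comparison attractive for randomness characterization.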

  19. A conceptual model to estimate cost effectiveness of the indoor environment improvements

    Seppanen, Olli; Fisk, William J.


    Macroeconomic analyses indicate a high cost to society of a deteriorated indoor climate. The few example calculations performed to date indicate that measures taken to improve indoor environmental quality (IEQ) are highly cost-effective when health and productivity benefits are considered. We believe that cost-benefit analyses of building designs and operations should routinely incorporate health and productivity impacts. As an initial step, we developed a conceptual model that shows the links between improvements in IEQ and the financial gains from reductions in medical care and sick leave, improved work performance, lower employee turnover, and reduced maintenance due to fewer complaints.

  20. EPICE-HIV: An Epidemiologic Cost-Effectiveness Model for HIV Treatment.

    Vandewalle, Björn; Llibre, Josep M; Parienti, Jean-Jacques; Ustianowski, Andrew; Camacho, Ricardo; Smith, Colette; Miners, Alec; Ferreira, Diana; Félix, Jorge


    The goal of this research was to establish a new and innovative framework for cost-effectiveness modeling of HIV-1 treatment, simultaneously considering both clinical and epidemiological outcomes. EPICE-HIV is a multi-paradigm model based on a within-host micro-simulation model for the disease progression of HIV-1 infected individuals and an agent-based sexual contact network (SCN) model for the transmission of HIV-1 infection. It includes HIV-1 viral dynamics, CD4+ T cell infection rates, and pharmacokinetics/pharmacodynamics modeling. Disease progression of HIV-1 infected individuals is driven by the interdependent changes in CD4+ T cell count, changes in plasma HIV-1 RNA, accumulation of resistance mutations and adherence to treatment. The two parts of the model are joined through a per-sexual-act and viral load dependent probability of disease transmission in HIV-discordant couples. Internal validity of the disease progression part of the model is assessed and external validity is demonstrated in comparison to the outcomes observed in the STaR randomized controlled clinical trial. We found that overall adherence to treatment and the resulting pattern of treatment interruptions are key drivers of HIV-1 treatment outcomes. Our model, though largely independent of efficacy data from RCTs, was accurate in producing 96-week outcomes qualitatively and quantitatively comparable to those observed in the STaR trial. We demonstrate that multi-paradigm micro-simulation modeling is a promising tool to generate evidence about optimal policy strategies in HIV-1 treatment, including treatment efficacy, HIV-1 transmission, and cost-effectiveness analysis. PMID:26870960

  1. Bayesian mixture models for Poisson astronomical images

    Guglielmetti, Fabrizia; Dose, Volker


    Astronomical images in the Poisson regime are typically characterized by a spatially varying cosmic background, large variety of source morphologies and intensities, data incompleteness, steep gradients in the data, and few photon counts per pixel. The Background-Source separation technique is developed with the aim to detect faint and extended sources in astronomical images characterized by Poisson statistics. The technique employs Bayesian mixture models to reliably detect the background as well as the sources with their respective uncertainties. Background estimation and source detection is achieved in a single algorithm. A large variety of source morphologies is revealed. The technique is applied in the X-ray part of the electromagnetic spectrum on ROSAT and Chandra data sets and it is under a feasibility study for the forthcoming eROSITA mission.

  2. Cost-effectiveness analysis of rotavirus vaccination among Libyan children using a simple economic model

    Salem Alkoshi


    Background: Rotavirus infection is a major cause of childhood diarrhea in Libya. The objective of this study is to evaluate the cost-effectiveness of rotavirus vaccination in that country. Methods: We used a published decision tree model, adapted to the Libyan situation, to analyze a birth cohort of 160,000 children. The evaluation of diarrhea events in three public hospitals helped to estimate the rotavirus burden. The economic analysis was done from two perspectives: health care provider and societal. Univariate sensitivity analyses were conducted to assess uncertainty in some values of the variables selected. Results: The three hospitals received 545 diarrhea patients aged ≤5 years, of whom 311 (57%) had rotavirus-positive test results, during a 9-month period. The societal cost for treatment of a case of rotavirus diarrhea was estimated at US$ 661/event. The incremental cost-effectiveness ratio with a vaccine price of US$ 27 per course was US$ 8,972 per quality-adjusted life year gained from the health care perspective. From a societal perspective, the analysis shows cost savings of around US$ 16 per child. Conclusion: The model shows that rotavirus vaccination could be an economically very attractive intervention in Libya.


    We present a Bayesian approach to regress a circular variable on a linear predictor. The regression coefficients are assumed to have a nonparametric distribution with a Dirichlet process prior. The semiparametric Bayesian approach gives added flexibility to the model and is usefu...

  4. A new approach for Bayesian model averaging

    TIAN XiangJun; XIE ZhengHui; WANG AiHui; YANG XiaoChun


    Bayesian model averaging (BMA) is a recently proposed statistical method for calibrating forecast ensembles from numerical weather models. However, successful implementation of BMA requires accurate estimates of the weights and variances of the individual competing models in the ensemble. Two methods, namely the Expectation-Maximization (EM) and the Markov Chain Monte Carlo (MCMC) algorithms, are widely used for BMA model training. Both methods have their own respective strengths and weaknesses. In this paper, we first modify the BMA log-likelihood function with the aim of removing the additional limitation that the BMA weights must add to one, and then use a limited-memory quasi-Newton algorithm to solve the resulting nonlinear optimization problem, thereby formulating a new approach for BMA (referred to as BMA-BFGS). Several groups of multi-model soil moisture simulation experiments with three land surface models show that the performance of BMA-BFGS is similar to the MCMC method in terms of simulation accuracy, and that both are superior to the EM algorithm. On the other hand, the computational cost of the BMA-BFGS algorithm is substantially less than for MCMC and is almost equivalent to that for EM.
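The reparameterization idea, removing the sum-to-one constraint so an unconstrained optimizer can be used, can be sketched concretely. The toy below uses a softmax parameterization of the BMA weights with a fixed member variance and plain finite-difference gradient ascent standing in for the paper's limited-memory quasi-Newton (BFGS) solver; all names and data are hypothetical:

```python
from math import exp, log, pi, sqrt

def softmax(theta):
    """Map unconstrained theta to weights on the simplex, absorbing the
    sum-to-one restriction that the paper's reformulation removes."""
    m = max(theta)
    e = [exp(t - m) for t in theta]
    s = sum(e)
    return [x / s for x in e]

def normal_pdf(y, mu, sigma):
    return exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))

def bma_log_lik(theta, forecasts, obs, sigma):
    """BMA predictive log-likelihood: each observation is modeled as a
    mixture of the member forecasts with softmax(theta) weights."""
    w = softmax(theta)
    ll = 0.0
    for fc, y in zip(forecasts, obs):
        ll += log(sum(wk * normal_pdf(y, f, sigma) for wk, f in zip(w, fc)))
    return ll

def fit_weights(forecasts, obs, sigma, steps=500, lr=0.1):
    """Unconstrained ascent on theta via central finite differences
    (a crude stand-in for an L-BFGS solver)."""
    theta = [0.0] * len(forecasts[0])
    h = 1e-5
    for _ in range(steps):
        grad = []
        for k in range(len(theta)):
            tp = theta[:]; tp[k] += h
            tm = theta[:]; tm[k] -= h
            grad.append((bma_log_lik(tp, forecasts, obs, sigma)
                         - bma_log_lik(tm, forecasts, obs, sigma)) / (2 * h))
        theta = [t + lr * g for t, g in zip(theta, grad)]
    return softmax(theta)

# Member 0 tracks the truth; member 1 is biased. The weight favours 0.
obs = [0.0, 1.0, 2.0, 3.0, 4.0]
forecasts = [(y + 0.1, y + 3.0) for y in obs]
w = fit_weights(forecasts, obs, sigma=1.0)
assert w[0] > 0.9 and abs(sum(w) - 1.0) < 1e-9
```

The full method also estimates the per-member variances; they are held fixed here to keep the sketch short.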

  5. Quantifying Multiscale Habitat Structural Complexity: A Cost-Effective Framework for Underwater 3D Modelling

    Renata Ferrari


    Coral reef habitat structural complexity influences key ecological processes, ecosystem biodiversity, and resilience. Measuring structural complexity underwater is not trivial, and researchers have been searching for accurate and cost-effective methods that can be applied across spatial extents for over 50 years. This study integrated a set of existing multi-view image-processing algorithms to accurately compute metrics of structural complexity (e.g., ratio of surface to planar area) underwater solely from images. This framework resulted in accurate, high-speed 3D habitat reconstructions at scales ranging from small corals to reef-scapes (10s of km2). Structural complexity was accurately quantified from both contemporary and historical image datasets across three spatial scales: (i) branching coral colony (Acropora spp.); (ii) reef area (400 m2); and (iii) reef transect (2 km). At small scales, our method delivered models with <1 mm error over 90% of the surface area, while the accuracy at transect scale was 85.3% ± 6% (CI). Advantages are: no a priori requirement for image size or resolution, no invasive techniques, cost-effectiveness, and utilization of existing imagery taken with off-the-shelf cameras (either monocular or stereo). This remote sensing method can be integrated into reef monitoring and improve our knowledge of key aspects of coral reef dynamics, from reef accretion to habitat provisioning and productivity, by measuring and up-scaling estimates of structural complexity.

  6. Bayesian Model Selection for LISA Pathfinder

    Karnesis, Nikolaos; Sopuerta, Carlos F; Gibert, Ferran; Armano, Michele; Audley, Heather; Congedo, Giuseppe; Diepholz, Ingo; Ferraioli, Luigi; Hewitson, Martin; Hueller, Mauro; Korsakova, Natalia; Plagnol, Eric; Vitale, Stefano


    The main goal of the LISA Pathfinder (LPF) mission is to fully characterize the acceleration noise models and to test key technologies for future space-based gravitational-wave observatories similar to the LISA/eLISA concept. The Data Analysis (DA) team has developed complex three-dimensional models of the LISA Technology Package (LTP) experiment on-board LPF. These models are used for simulations, but more importantly, they will be used for parameter estimation purposes during flight operations. One of the tasks of the DA team is to identify the physical effects that contribute significantly to the properties of the instrument noise. One way of approaching this problem is to recover the essential parameters of the LTP that describe the data. Thus, we want to define the simplest model that efficiently explains the observations. To do so, adopting a Bayesian framework, one has to estimate the so-called Bayes factor between two competing models. In our analysis, we use three different methods to estimate...

  7. Bayesian Model Averaging in the Instrumental Variable Regression Model

    Gary Koop; Robert Leon Gonzalez; Rodney Strachan


    This paper considers the instrumental variable regression model when there is uncertainty about the set of instruments, exogeneity restrictions, the validity of identifying restrictions and the set of exogenous regressors. This uncertainty can result in a huge number of models. To avoid statistical problems associated with standard model selection procedures, we develop a reversible jump Markov chain Monte Carlo algorithm that allows us to do Bayesian model averaging. The algorithm is very fl...


    Lanos, Philippe; Philippe, Anne


    We propose a new modeling approach for combining dates through the Event model, using hierarchical Bayesian statistics. The Event model aims to estimate the date of a context (unit of stratification) from individual dates assumed to be contemporaneous, which are affected by errors of different types: laboratory and calibration curve errors, and also irreducible errors related to contamination, taphonomic disturbance, etc., hence the possible presence of outliers. The Event model has a hi...

  9. A model to estimate the cost effectiveness of the indoorenvironment improvements in office work

    Seppanen, Olli; Fisk, William J.


    A deteriorated indoor climate is commonly related to increases in sick building syndrome (SBS) symptoms, respiratory illnesses, sick leave, reduced comfort and losses in productivity. The cost to society of a deteriorated indoor climate is high; some calculations show that it exceeds the heating energy costs of the same buildings. Building-level calculations have also shown that many measures taken to improve indoor air quality and climate are cost-effective when the potential monetary savings resulting from an improved indoor climate are included as benefits. As an initial step towards systematizing these building-level calculations, we have developed a conceptual model to estimate the cost-effectiveness of various measures. The model shows the links between improvements in the indoor environment and the following potential financial benefits: reduced medical care cost, reduced sick leave, better work performance, lower turnover of employees, and lower cost of building maintenance due to fewer complaints about indoor air quality and climate. The pathways to these potential benefits from changes in building technology and practices go via several human responses to the indoor environment, such as infectious diseases, allergies and asthma, SBS symptoms, perceived air quality, and the thermal environment. The model also includes the annual cost of investments, operation costs, and cost savings of improved indoor climate. The conceptual model illustrates how various factors are linked to each other. SBS symptoms are probably the most commonly assessed health responses in IEQ studies and have been linked to several characteristics of buildings and IEQ. While the available evidence indicates that SBS symptoms can affect these outcomes and suggests that such a linkage exists, at present we cannot quantify the relationships sufficiently for cost-benefit modeling. New research and analyses of existing data to quantify the financial

  10. Stochastic model updating utilizing Bayesian approach and Gaussian process model

    Wan, Hua-Ping; Ren, Wei-Xin


    Stochastic model updating (SMU) has been increasingly applied in quantifying structural parameter uncertainty from response variability. SMU for parameter uncertainty quantification refers to the problem of inverse uncertainty quantification (IUQ), which is a nontrivial task. Inverse problems solved with optimization usually bring about the issues of gradient computation, ill-conditioning, and non-uniqueness. Moreover, the uncertainty present in the response makes the inverse problem more complicated. In this study, a Bayesian approach is adopted in SMU for parameter uncertainty quantification. The prominent strength of the Bayesian approach for the IUQ problem is that it solves the IUQ problem in a straightforward manner, which enables it to avoid the previous issues. However, when applied to engineering structures that are modeled with a high-resolution finite element model (FEM), the Bayesian approach is still computationally expensive, since the commonly used Markov chain Monte Carlo (MCMC) method for Bayesian inference requires a large number of model runs to guarantee convergence. Herein we reduce computational cost in two aspects. On the one hand, the fast-running Gaussian process model (GPM) is utilized to approximate the time-consuming high-resolution FEM. On the other hand, the advanced MCMC method using the delayed rejection adaptive Metropolis (DRAM) algorithm, which incorporates a local adaptive strategy with a global adaptive strategy, is employed for Bayesian inference. In addition, we propose the use of the powerful variance-based global sensitivity analysis (GSA) in parameter selection to exclude non-influential parameters from the calibration parameters, which yields a reduced-order model and thus further alleviates the computational burden. A simulated aluminum plate and a real-world complex cable-stayed pedestrian bridge are presented to illustrate the proposed framework and verify its feasibility.
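The MCMC machinery referenced above reduces, in its simplest form, to a few lines. The following is a toy random-walk Metropolis sampler, offered only as a sketch of the idea: it has none of DRAM's delayed rejection or adaptive proposals, and the target density below is hypothetical:

```python
import random
from math import log

def metropolis(log_post, x0, n_samples, step=0.5, seed=0):
    """Minimal random-walk Metropolis sampler for a 1-D log-posterior.
    A toy stand-in for the DRAM algorithm described above (no delayed
    rejection, no adaptation of the proposal scale)."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_samples):
        prop = x + rng.gauss(0.0, step)       # symmetric proposal
        lp_prop = log_post(prop)
        if log(rng.random()) < lp_prop - lp:  # Metropolis accept/reject
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Toy posterior: standard normal. The sample mean should be near 0.
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
mean = sum(draws) / len(draws)
assert abs(mean) < 0.15
```

In the surrogate-based setting of the abstract, `log_post` would call the cheap Gaussian process emulator instead of the full finite element model; the sampler itself is unchanged.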

  11. Bayesian estimation of parameters in a regional hydrological model

    Engeland, K.; Gottschalk, L.


    This study evaluates the applicability of the distributed, process-oriented Ecomag model for prediction of daily streamflow in ungauged basins. The Ecomag model is applied as a regional model to nine catchments in the NOPEX area, using Bayesian statistics to estimate the posterior distribution of the model parameters conditioned on the observed streamflow. The distribution is calculated by Markov Chain Monte Carlo (MCMC) analysis. The Bayesian method requires formulation of a likelihood funct...

  13. Bayesian Analysis of Dynamic Multivariate Models with Multiple Structural Breaks

    Sugita, Katsuhiro


    This paper considers a vector autoregressive model or a vector error correction model with multiple structural breaks in any subset of parameters, using a Bayesian approach with Markov chain Monte Carlo simulation technique. The number of structural breaks is determined as a sort of model selection by the posterior odds. For a cointegrated model, cointegrating rank is also allowed to change with breaks. Bayesian approach by Strachan (Journal of Business and Economic Statistics 21 (2003) 185) ...

  14. Bayesian Test of Significance for Conditional Independence: The Multinomial Model

    de Morais Andrade, Pablo; Stern, Julio; de Bragança Pereira, Carlos


    Conditional independence tests (CI tests) have received special attention lately in the Machine Learning and Computational Intelligence literature as an important indicator of the relationships among the variables used by their models. In the field of Probabilistic Graphical Models (PGM)--which includes Bayesian Network (BN) models--CI tests are especially important for the task of learning the PGM structure from data. In this paper, we propose the Full Bayesian Significance Test (FBST) for tests of conditional independence for discrete datasets. FBST is a powerful Bayesian test for precise hypotheses, offered as an alternative to frequentist significance tests (characterized by the calculation of the p-value).

  15. Bayesian Nonparametrics in Topic Modeling: A Brief Tutorial

    Spangher, Alexander


    Nonparametric methods have been increasingly explored in Bayesian hierarchical modeling as a way to increase model flexibility. Although the field shows a lot of promise, inference in many models, including the Hierarchical Dirichlet Process (HDP), remains prohibitively slow. One promising path forward is to exploit the submodularity inherent in the Indian Buffet Process (IBP) to derive near-optimal solutions in polynomial time. In this work, I will present a brief tutorial on Bayesian nonparame...

  16. Two-Stage Bayesian Model Averaging in Endogenous Variable Models.

    Lenkoski, Alex; Eicher, Theo S; Raftery, Adrian E


    Economic modeling in the presence of endogeneity is subject to model uncertainty at both the instrument and covariate level. We propose a Two-Stage Bayesian Model Averaging (2SBMA) methodology that extends the Two-Stage Least Squares (2SLS) estimator. By constructing a Two-Stage Unit Information Prior in the endogenous variable model, we are able to efficiently combine established methods for addressing model uncertainty in regression models with the classic technique of 2SLS. To assess the validity of instruments in the 2SBMA context, we develop Bayesian tests of the identification restriction that are based on model averaged posterior predictive p-values. A simulation study showed that 2SBMA has the ability to recover structure in both the instrument and covariate set, and substantially improves the sharpness of resulting coefficient estimates in comparison to 2SLS using the full specification in an automatic fashion. Due to the increased parsimony of the 2SBMA estimate, the Bayesian Sargan test had a power of 50 percent in detecting a violation of the exogeneity assumption, while the method based on 2SLS using the full specification had negligible power. We apply our approach to the problem of development accounting, and find support not only for institutions, but also for geography and integration as development determinants, once both model uncertainty and endogeneity have been jointly addressed. PMID:24223471

  17. Bayesian model reduction and empirical Bayes for group (DCM) studies.

    Friston, Karl J; Litvak, Vladimir; Oswal, Ashwini; Razi, Adeel; Stephan, Klaas E; van Wijk, Bernadette C M; Ziegler, Gabriel; Zeidman, Peter


    This technical note describes some Bayesian procedures for the analysis of group studies that use nonlinear models at the first (within-subject) level - e.g., dynamic causal models - and linear models at subsequent (between-subject) levels. Its focus is on using Bayesian model reduction to finesse the inversion of multiple models of a single dataset or a single (hierarchical or empirical Bayes) model of multiple datasets. These applications of Bayesian model reduction allow one to consider parametric random effects and make inferences about group effects very efficiently (in a few seconds). We provide the relatively straightforward theoretical background to these procedures and illustrate their application using a worked example. This example uses a simulated mismatch negativity study of schizophrenia. We illustrate the robustness of Bayesian model reduction to violations of the (commonly used) Laplace assumption in dynamic causal modelling and show how its recursive application can facilitate both classical and Bayesian inference about group differences. Finally, we consider the application of these empirical Bayesian procedures to classification and prediction. PMID:26569570

  18. Sampling Techniques in Bayesian Finite Element Model Updating

    Boulkaibet, I; Mthembu, L; Friswell, M I; Adhikari, S


    Recent papers in the field of Finite Element Model (FEM) updating have highlighted the benefits of Bayesian techniques. The Bayesian approaches are designed to deal with the uncertainties associated with complex systems, which is the main problem in the development and updating of FEMs. This paper highlights the complexities and challenges of implementing any Bayesian method when the analysis involves a complicated structural dynamic model. In such systems the Bayesian formulation might not be available in analytic form, which leads to the use of numerical methods, i.e. sampling methods. The main challenge then is to sample the model parameter space efficiently. In this paper, three sampling techniques, the Metropolis-Hastings (MH) algorithm, Slice Sampling and the Hybrid Monte Carlo (HMC) technique, are tested by updating a structural beam model. The efficiency and limitations of each technique are investigated when the FEM updating problem is implemented using the Bayesi...
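    For intuition about one of the three samplers compared in this record, here is a minimal single-variable slice sampler (the stepping-out and shrinkage scheme) applied to a toy Gaussian target standing in for a beam-parameter posterior; the target, its parameters, and the step width are all hypothetical and unrelated to the paper's beam model:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy unnormalized log-posterior for a single stiffness-like parameter: N(2, 0.5^2).
    def log_post(x):
        return -0.5 * (x - 2.0) ** 2 / 0.25

    def slice_step(x, w=1.0, max_steps=50):
        """One slice-sampling update with stepping-out and shrinkage."""
        logy = log_post(x) + np.log(rng.random())   # draw the vertical slice level
        left = x - w * rng.random()                 # random initial bracket around x
        right = left + w
        for _ in range(max_steps):                  # step out until outside the slice
            if log_post(left) < logy:
                break
            left -= w
        for _ in range(max_steps):
            if log_post(right) < logy:
                break
            right += w
        while True:                                 # shrink bracket until acceptance
            prop = left + (right - left) * rng.random()
            if log_post(prop) >= logy:
                return prop
            if prop < x:
                left = prop
            else:
                right = prop

    x, draws = 0.0, []
    for _ in range(4000):
        x = slice_step(x)
        draws.append(x)
    sample = np.array(draws[500:])                  # discard burn-in
    print(sample.mean(), sample.std())
    ```

    Unlike Metropolis-Hastings, the slice sampler has no proposal scale to hand-tune for acceptance rate, which is part of why it is attractive for parameter spaces like those in FEM updating.
    
    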

  19. Cost effectiveness of the 1993 model energy code in New Jersey

    Lucas, R.G.


    This is an analysis of the cost effectiveness of the Council of American Building Officials' 1993 Model Energy Code (MEC) building thermal-envelope requirements for single-family houses and multifamily housing units in New Jersey. The goal was to compare the cost effectiveness of the 1993 MEC to the alternative allowed in the 1993 Building Officials & Code Administrators (BOCA) National Energy Conservation Code -- American Society of Heating, Refrigerating, and Air-Conditioning Engineers (ASHRAE) Standard 90A-1980 -- based on a comparison of the costs and benefits associated with complying with each. This comparison was performed for Camden, New Brunswick, Somerville, and Sparta. The analysis was done for two different scenarios: a "move-up" home buyer purchasing a single-family house and a "first-time" financially limited home buyer purchasing a multifamily unit. For the single-family home buyer, compliance with the 1993 MEC was estimated to increase first costs by $1028 to $1564, resulting in an incremental down-payment increase of $206 to $313 (at 20% down). The time until the homeowner realizes net cash savings (net positive cash flow) for houses built in accordance with the 1993 MEC was 1 to 5 years. The home buyer who paid 20% down had recovered the increases in down payment and mortgage payments through energy cost savings by the end of the fifth year or sooner, and thereafter saves more money each year. For the multifamily-unit home buyer, first costs were estimated to increase by $121 to $223, resulting in an incremental down-payment increase of $12 to $22 (at 10% down). The time until the homeowner realizes net cash savings for units built in accordance with the 1993 MEC was 1 to 3 years.

  20. Efficient Nonparametric Bayesian Modelling with Sparse Gaussian Process Approximations

    Seeger, Matthias; Lawrence, Neil; Herbrich, Ralf


    Sparse approximations to Bayesian inference for nonparametric Gaussian Process models scale linearly in the number of training points, allowing for the application of powerful kernel-based models to large datasets. We present a general framework based on the informative vector machine (IVM) (Lawrence, 2002) and show how the complete Bayesian task of inference and learning of free hyperparameters can be performed in a practically efficient manner. Our framework allows for arbitrary like...

  1. Modelling biogeochemical cycles in forest ecosystems: a Bayesian approach

    Bagnara, Maurizio


    Forest models are tools for explaining and predicting the dynamics of forest ecosystems. They simulate forest behavior by integrating information on the underlying processes in trees, soil and atmosphere. Bayesian calibration is the application of probability theory to parameter estimation. It is a method, applicable to all models, that quantifies output uncertainty and identifies key parameters and variables. This study aims at testing the Bayesian procedure for calibration to different t...

  2. Varying efficacy of Helicobacter pylori eradication regimens: cost effectiveness study using a decision analysis model

    Duggan, A E; Tolley, K.; Hawkey, C. J.; Logan, R F A


    Objective: To determine how small differences in the efficacy and cost of two antibiotic regimens to eradicate Helicobacter pylori can affect the overall cost effectiveness of H pylori eradication in duodenal ulcer disease.

  3. Bayesian Inference and Optimal Design in the Sparse Linear Model

    Seeger, Matthias; Steinke, Florian; Tsuda, Koji


    The sparse linear model has seen many successful applications in Statistics, Machine Learning, and Computational Biology, such as identification of gene regulatory networks from micro-array expression data. Prior work has either approximated Bayesian inference by expensive Markov chain Monte Carlo, or replaced it by point estimation. We show how to obtain a good approximation to Bayesian analysis efficiently, using the Expectation Propagation method. We also address the problems of optimal de...

  4. A Bayesian observer model constrained by efficient coding can explain 'anti-Bayesian' percepts.

    Wei, Xue-Xin; Stocker, Alan A


    Bayesian observer models provide a principled account of the fact that our perception of the world rarely matches physical reality. The standard explanation is that our percepts are biased toward our prior beliefs. However, reported psychophysical data suggest that this view may be simplistic. We propose a new model formulation based on efficient coding that is fully specified for any given natural stimulus distribution. The model makes two new and seemingly anti-Bayesian predictions. First, it predicts that perception is often biased away from an observer's prior beliefs. Second, it predicts that stimulus uncertainty differentially affects perceptual bias depending on whether the uncertainty is induced by internal or external noise. We found that both model predictions match reported perceptual biases in perceived visual orientation and spatial frequency, and were able to explain data that have not been explained before. The model is general and should prove applicable to other perceptual variables and tasks. PMID:26343249

  5. Modelling of JET diagnostics using Bayesian Graphical Models

    Svensson, J. [IPP Greifswald, Greifswald (Germany); Ford, O. [Imperial College, London (United Kingdom); McDonald, D.; Hole, M.; Nessi, G. von; Meakins, A.; Brix, M.; Thomsen, H.; Werner, A.; Sirinelli, A.


    The mapping between physics parameters (such as densities, currents, flows, temperatures, etc.) defining the plasma 'state' under a given model and the raw observations of each plasma diagnostic will 1) depend on the particular physics model used, and 2) be inherently probabilistic, owing to uncertainties in both the observations and instrumental aspects of the mapping, such as calibrations, instrument functions, etc. A flexible and principled way of modelling such interconnected probabilistic systems is through so-called Bayesian graphical models. As an amalgam of graph theory and probability theory, Bayesian graphical models can simulate the complex interconnections between physics models and diagnostic observations from multiple heterogeneous diagnostic systems, making it relatively easy to optimally combine the observations from multiple diagnostics for joint inference on parameters of the underlying physics model, which can itself be represented as part of the graph. At JET, about 10 diagnostic systems have to date been modelled in this way, which has led to a number of new results, including: the reconstruction of the flux surface topology and q-profiles without any specific equilibrium assumption, using information from a number of different diagnostic systems; profile inversions taking into account the uncertainties in the flux surface positions; and a substantial increase in the accuracy of JET electron density and temperature profiles, including improved pedestal resolution, through the joint analysis of three diagnostic systems. It is believed that the Bayesian graph approach could potentially be utilised for very large sets of diagnostics, providing a generic data analysis framework for nuclear fusion experiments that would be able to optimally utilize the information from multiple diagnostics simultaneously, and where the explicit graph representation of the connections to underlying physics models could be used for sophisticated model testing. This

  6. Bayesian model discrimination for glucose-insulin homeostasis

    Andersen, Kim Emil; Brooks, Stephen P.; Højbjerre, Malene

    In this paper we analyse a set of experimental data on a number of healthy and diabetic patients and discuss a variety of models for describing the physiological processes involved in glucose absorption and insulin secretion within the human body. We adopt a Bayesian approach which facilitates the reformulation of existing deterministic models as stochastic state space models which properly account for both measurement and process variability. The analysis is further enhanced by Bayesian model discrimination techniques and model averaged parameter estimation which fully accounts for model as well...

  7. Using consensus bayesian network to model the reactive oxygen species regulatory pathway.

    Liangdong Hu

    Full Text Available A Bayesian network is one of the most successful graph models for representing the reactive oxygen species (ROS) regulatory pathway. With the increasing number of microarray measurements, it is possible to construct the Bayesian network from microarray data directly. Although a large number of Bayesian network learning algorithms have been developed, when they are applied to learn Bayesian networks from microarray data the accuracies are low, because the databases used to learn the networks contain too few microarray measurements. In this paper, we propose a consensus Bayesian network, constructed by combining Bayesian networks from the relevant literature with Bayesian networks learned from microarray data. It achieves a higher accuracy than Bayesian networks learned from a single database. In the experiments, we validated the Bayesian network combination algorithm on several classic machine learning databases and used the consensus Bayesian network to model the Escherichia coli ROS pathway.

  8. Modeling the Cost Effectiveness of Neuroimaging-Based Treatment of Acute Wake-Up Stroke.

    Ankur Pandya

    Full Text Available Thrombolytic treatment (tissue-type plasminogen activator [tPA]) is only recommended for acute ischemic stroke patients with stroke onset time <4.5 hours; 46.3% experienced a good stroke outcome. Lifetime discounted QALYs and costs were 5.312 and $88,247 for the no-treatment strategy and 5.342 and $90,869 for the MRI-based strategy, resulting in an ICER of $88,000/QALY. Results were sensitive to variations in patient- and provider-specific factors such as sleep duration, hospital travel and door-to-needle times, as well as the onset probability distribution, MRI specificity, and mRS utility values. Our model-based findings suggest that an MRI-based treatment strategy for this population could be cost-effective, and quantify the impact that patient- and provider-specific factors, such as sleep duration, hospital travel and door-to-needle times, could have on the optimal decision for wake-up stroke patients.

  9. Hellinger Distance and Bayesian Non-Parametrics: Hierarchical Models for Robust and Efficient Bayesian Inference

    Wu, Yuefeng; Hooker, Giles


    This paper introduces a hierarchical framework to incorporate Hellinger distance methods into Bayesian analysis. We propose to modify a prior over non-parametric densities with the exponential of twice the Hellinger distance between a candidate and a parametric density. By incorporating a prior over the parameters of the second density, we arrive at a hierarchical model in which a non-parametric model is placed between parameters and the data. The parameters of the family can then be estimate...

  10. Analysis of Gumbel Model for Software Reliability Using Bayesian Paradigm

    Raj Kumar


    Full Text Available In this paper, we illustrate the suitability of the Gumbel model for software reliability data. The model parameters are estimated using likelihood-based inferential procedures: classical as well as Bayesian. The quasi-Newton-Raphson algorithm is applied to obtain the maximum likelihood estimates and associated probability intervals. The Bayesian estimates of the parameters of the Gumbel model are obtained using the Markov Chain Monte Carlo (MCMC) simulation method in OpenBUGS (established software for Bayesian analysis using Markov Chain Monte Carlo methods). R functions are developed to study the statistical properties, model validation and comparison tools of the model, and the output analysis of MCMC samples generated from OpenBUGS. Details of applying MCMC to parameter estimation for the Gumbel model are elaborated, and a real software reliability data set is considered to illustrate the methods of inference discussed in this paper.

  11. Lack of Confidence in Approximate Bayesian Computation Model Choice

    Robert, Christian P.; Cornuet, Jean-Marie; Marin, Jean-Michel; Pillai, Natesh S.


    Approximate Bayesian computation (ABC) has become an essential tool for the analysis of complex stochastic models. Grelaud et al. [(2009) Bayesian Anal 3:427–442] advocated the use of ABC for model choice in the specific case of Gibbs random fields, relying on an intermodel sufficiency property to show that the approximation was legitimate. We implemented ABC model choice in a wide range of phylogenetic models in the Do It Yourself-ABC (DIY-ABC) software [Cornuet et al. (2008) Bioinformatics...
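    The basic ABC model-choice recipe this record scrutinizes can be illustrated with a toy rejection sampler: simulate from each model's prior predictive, keep the model index whenever a summary statistic lands close to the observed one, and read posterior model probabilities off the accepted indices. The two Gaussian models, the summary (a sample mean), and the tolerance below are all hypothetical, and the choice of summary statistic is exactly where the paper locates the danger:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # "Observed" data, summarized by its mean (a possibly non-sufficient statistic
    # for model comparison -- the pitfall the paper warns about).
    y_obs = rng.normal(1.0, 1.0, size=100)
    s_obs = y_obs.mean()

    def simulate_summary(model):
        """Draw one prior-predictive summary under model 0 or model 1."""
        if model == 0:
            theta = 0.0                      # M0: fixed-mean N(0, 1)
        else:
            theta = rng.normal()             # M1: N(theta, 1), theta ~ N(0, 1)
        return rng.normal(theta, 1.0, size=100).mean()

    eps, accepted = 0.05, []                 # tolerance on the summary distance
    for _ in range(20000):
        m = int(rng.integers(2))             # uniform prior over the two models
        if abs(simulate_summary(m) - s_obs) < eps:
            accepted.append(m)

    accepted = np.array(accepted)
    print("approx. P(M1 | data) =", accepted.mean())
    ```

    With data generated far from M0's fixed mean, nearly all accepted indices come from M1; the paper's caution is that this acceptance proportion is only trustworthy when the summary statistic is (close to) sufficient across models.
    
    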

  12. On the Bayesian Nonparametric Generalization of IRT-Type Models

    San Martin, Ernesto; Jara, Alejandro; Rolin, Jean-Marie; Mouchart, Michel


    We study the identification and consistency of Bayesian semiparametric IRT-type models, where the uncertainty on the abilities' distribution is modeled using a prior distribution on the space of probability measures. We show that for the semiparametric Rasch Poisson counts model, simple restrictions ensure the identification of a general…

  13. Bayesian inference model for fatigue life of laminated composites

    Dimitrov, Nikolay Krasimirov; Kiureghian, Armen Der; Berggreen, Christian


    A probabilistic model for estimating the fatigue life of laminated composite plates is developed. The model is based on lamina-level input data, making it possible to predict fatigue properties for a wide range of laminate configurations. Model parameters are estimated by Bayesian inference. The...

  14. The Albufera Initiative for Biodiversity: a cost effective model for integrating science and volunteer participation in coastal protected area management

    Riddiford, N.J.; Veraart, J.A.; Férriz, I.; Owens, N.W.; Royo, L.; Honey, M.R.


    This paper puts forward a multi-disciplinary field project, set up in 1989 at the Parc Natural de s’Albufera in Mallorca, Balearic Islands, Spain, as an example of a cost effective model for integrating science and volunteer participation in a coastal protected area. Outcomes include the provision o

  15. Modelling LGD for unsecured retail loans using Bayesian methods

    Katarzyna Bijak; Thomas, Lyn C


    Loss Given Default (LGD) is the loss borne by the bank when a customer defaults on a loan. LGD for unsecured retail loans is often difficult to model. In the frequentist (non-Bayesian) two-step approach, two separate regression models are estimated independently, which is potentially problematic when trying to combine them to make predictions about LGD. The result is a point estimate of LGD for each loan. Alternatively, LGD can be modelled using Bayesian methods. In the B...

  16. A Bayesian Matrix Factorization Model for Relational Data

    Singh, Ajit P


    Relational learning can be used to augment one data source with other correlated sources of information, to improve predictive accuracy. We frame a large class of relational learning problems as matrix factorization problems, and propose a hierarchical Bayesian model. Training our Bayesian model using random-walk Metropolis-Hastings is impractically slow, and so we develop a block Metropolis-Hastings sampler which uses the gradient and Hessian of the likelihood to dynamically tune the proposal. We demonstrate that a predictive model of brain response to stimuli can be improved by augmenting it with side information about the stimuli.

  17. Bayesian inference of chemical kinetic models from proposed reactions

    Galagali, Nikhil


    © 2014 Elsevier Ltd. Bayesian inference provides a natural framework for combining experimental data with prior knowledge to develop chemical kinetic models and quantify the associated uncertainties, not only in parameter values but also in model structure. Most existing applications of Bayesian model selection methods to chemical kinetics have been limited to comparisons among a small set of models, however. The significant computational cost of evaluating posterior model probabilities renders traditional Bayesian methods infeasible when the model space becomes large. We present a new framework for tractable Bayesian model inference and uncertainty quantification using a large number of systematically generated model hypotheses. The approach involves imposing point-mass mixture priors over rate constants and exploring the resulting posterior distribution using an adaptive Markov chain Monte Carlo method. The posterior samples are used to identify plausible models, to quantify rate constant uncertainties, and to extract key diagnostic information about model structure-such as the reactions and operating pathways most strongly supported by the data. We provide numerical demonstrations of the proposed framework by inferring kinetic models for catalytic steam and dry reforming of methane using available experimental data.

  18. Cost-effectiveness model comparing olanzapine and other oral atypical antipsychotics in the treatment of schizophrenia in the United States

    Smolen Lee J


    Full Text Available Abstract Background Schizophrenia is often a persistent and costly illness that requires continued treatment with antipsychotics. Differences among antipsychotics in efficacy, safety, tolerability, adherence, and cost have cost-effectiveness implications for treating schizophrenia. This study compares the cost-effectiveness of oral olanzapine, oral risperidone (at generic cost, the primary comparator), quetiapine, ziprasidone, and aripiprazole in the treatment of patients with schizophrenia from the perspective of third-party payers in the U.S. health care system. Methods A 1-year microsimulation economic decision model, with quarterly cycles, was developed to simulate the dynamic nature of usual care of schizophrenia patients who switch, continue, discontinue, and restart their medications. The model captures clinical and cost parameters including adherence levels, relapse with and without hospitalization, quality-adjusted life years (QALYs), treatment discontinuation by reason, treatment-emergent adverse events, suicide, health care resource utilization, and direct medical care costs. Published medical literature and a clinical expert panel were used to develop baseline model assumptions. Key model outcomes included mean annual total direct cost per treatment, cost per stable patient, and incremental cost-effectiveness values per QALY gained. Results The results of the microsimulation model indicated that olanzapine had the lowest mean annual direct health care cost ($8,544), followed by generic risperidone ($9,080). In addition, olanzapine resulted in more QALYs than risperidone (0.733 vs. 0.719). The base case and multiple sensitivity analyses found olanzapine to be the dominant choice in terms of incremental cost-effectiveness per QALY gained.
Conclusion The utilization of olanzapine is predicted in this model to result in better clinical outcomes and lower total direct health care costs compared to generic risperidone, quetiapine, ziprasidone, and

  19. Cost-effectiveness of face-to-face smoking cessation interventions: A dynamic modeling study

    T.L. Feenstra (Talitha); H.H. Hamberg-Van Reenen (Heleen); R.T. Hoogenveen (Rudolf); M.P.M.H. Rutten-van Mölken (Maureen)


    Objectives: To estimate the cost-effectiveness of five face-to-face smoking cessation interventions (i.e., minimal counseling by a general practitioner (GP) with or without nicotine replacement therapy (NRT), intensive counseling with NRT or bupropion, and telephone counseling) in term

  20. From intermediate to final behavioral endpoints; Modeling cognitions in (cost-)effectiveness analyses in health promotion

    Prenger, Rilana


    Cost-effectiveness analyses (CEAs) are considered an increasingly important tool in health promotion and psychology. In health promotion adequate effectiveness data of innovative interventions are often lacking. In case of many promising interventions the available data are inadequate for CEAs due t

  1. The Bayesian Modelling Of Inflation Rate In Romania

    Mihaela Simionescu (Bratu)


    Full Text Available Bayesian econometrics has seen a considerable increase in popularity in recent years, attracting the interest of various groups of researchers in the economic sciences as well as specialists in econometrics, commerce, industry, marketing, finance, microeconomics, macroeconomics and other domains. The purpose of this research is to provide an introduction to the Bayesian approach as applied in economics, starting with Bayes' theorem. For Bayesian linear regression models the estimation methodology is presented, along with two empirical studies on data taken from the Romanian economy. Thus, an autoregressive model of order 2 and a multiple regression model were built for the index of consumer prices. The Gibbs sampling algorithm was used for estimation in the R software, computing the posterior means and standard deviations. The parameters' stability proved to be greater than in the case of estimations based on the methods of classical econometrics.
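    As a miniature of the Gibbs-sampling estimation this record describes, the sketch below runs a conjugate Gibbs sampler for a Bayesian linear regression on synthetic data. The priors, sample size, and true coefficients are illustrative stand-ins (not the Romanian price-index data), and Python is used here in place of the paper's R setup:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic regression data (hypothetical stand-in for the empirical study).
    n = 200
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    beta_true, sigma_true = np.array([1.0, 2.0]), 0.5
    y = X @ beta_true + sigma_true * rng.normal(size=n)

    # Conjugate priors: beta ~ N(0, tau2 * I), sigma^2 ~ Inverse-Gamma(a0, b0).
    tau2, a0, b0 = 100.0, 2.0, 1.0
    beta, sig2 = np.zeros(2), 1.0
    XtX, Xty = X.T @ X, X.T @ y

    draws = []
    for _ in range(3000):
        # Full conditional: beta | sigma^2, y ~ N(mu_n, V_n)
        V_n = np.linalg.inv(XtX / sig2 + np.eye(2) / tau2)
        mu_n = V_n @ (Xty / sig2)
        beta = rng.multivariate_normal(mu_n, V_n)
        # Full conditional: sigma^2 | beta, y ~ IG(a0 + n/2, b0 + RSS/2),
        # sampled as the reciprocal of a Gamma draw.
        rss = np.sum((y - X @ beta) ** 2)
        sig2 = 1.0 / rng.gamma(a0 + n / 2, 1.0 / (b0 + rss / 2))
        draws.append(np.r_[beta, sig2])

    post = np.array(draws[500:])          # discard burn-in
    print(post.mean(axis=0))              # posterior means: [intercept, slope, sigma^2]
    ```

    Because both full conditionals are available in closed form, the sampler needs no tuning; each sweep draws the coefficients and the variance in turn, exactly the alternating structure Gibbs sampling exploits.
    
    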

  2. Bayesian Subset Modeling for High-Dimensional Generalized Linear Models

    Liang, Faming


    This article presents a new prior setting for high-dimensional generalized linear models, which leads to a Bayesian subset regression (BSR) with the maximum a posteriori model approximately equivalent to the minimum extended Bayesian information criterion model. The consistency of the resulting posterior is established under mild conditions. Further, a variable screening procedure is proposed based on the marginal inclusion probability, which shares the same properties of sure screening and consistency with the existing sure independence screening (SIS) and iterative sure independence screening (ISIS) procedures. However, since the proposed procedure makes use of joint information from all predictors, it generally outperforms SIS and ISIS in real applications. This article also makes extensive comparisons of BSR with the popular penalized likelihood methods, including Lasso, elastic net, SIS, and ISIS. The numerical results indicate that BSR can generally outperform the penalized likelihood methods. The models selected by BSR tend to be sparser and, more importantly, of higher prediction ability. In addition, the performance of the penalized likelihood methods tends to deteriorate as the number of predictors increases, while this is not significant for BSR. Supplementary materials for this article are available online. © 2013 American Statistical Association.


    Mohammad Syuhaimi Ab-Rahman


    Full Text Available In the past decade the automotive industry has faced an exponential increase in in-vehicle electronic devices. Hydraulic systems are being replaced with sophisticated electronic systems. Market demand for new in-vehicle technologies such as multimedia systems, internet access, GPS, mobile communication, internal private networks, and intelligent control and monitoring of the engine, body and power train is increasing daily. These new needs make the wire harness, the physical pathway for power and data, more complex. Transmitting many different data types over in-vehicle networks requires higher bandwidth and, consequently, expensive and advanced equipment. More functions and facilities also raise the number of Electronic Control Units (ECUs). The high cost of manufacturing and implementing all of this equipment and these systems can only be justified by the high prices of luxury vehicles. This study presents a conceptual model of in-vehicle networking that would allow a considerable portion of these advanced systems to be applied in non-luxury vehicles. In this context, Polymer Optical Fibers (POFs) are exploited as a high-bandwidth, cost-effective solution for transferring large amounts of data, with one ECU to control and manage body/cabin electronic devices. Given the technical specifications of POFs, which use visible light as the data carrier, they can meet all the needs of implementing the expected modern technologies in non-luxury cars at low cost. In addition, POFs are easy to use, reliable and flexible compared with silica-based optical fibers. This study suggests using red, blue and green light for transferring video/audio, communication data (such as internet/vehicle internal network traffic) and body/cabin command lines, respectively. Moreover, this conceptual model aims to reduce the wire harness by integrating command lines into a multiplexed POF line. Through command-line integration it is also possible to merge

  4. Cost-effective conservation of an endangered frog under uncertainty.

    Rose, Lucy E; Heard, Geoffrey W; Chee, Yung En; Wintle, Brendan A


    How should managers choose among conservation options when resources are scarce and there is uncertainty regarding the effectiveness of actions? Well-developed tools exist for prioritizing areas for one-time and binary actions (e.g., protect vs. not protect), but methods for prioritizing incremental or ongoing actions (such as habitat creation and maintenance) remain uncommon. We devised an approach that combines metapopulation viability and cost-effectiveness analyses to select among alternative conservation actions while accounting for uncertainty. In our study, cost-effectiveness is the ratio between the benefit of an action and its economic cost, where benefit is the change in metapopulation viability. We applied the approach to the case of the endangered growling grass frog (Litoria raniformis), which is threatened by urban development. We extended a Bayesian model to predict metapopulation viability under 9 urbanization and management scenarios and incorporated the full probability distribution of possible outcomes for each scenario into the cost-effectiveness analysis. This allowed us to discern between cost-effective alternatives that were robust to uncertainty and those with a relatively high risk of failure. We found a relatively high risk of extinction following urbanization if the only action was reservation of core habitat; habitat creation actions performed better than enhancement actions; and cost-effectiveness ranking changed depending on the consideration of uncertainty. Our results suggest that creation and maintenance of wetlands dedicated to L. raniformis is the only cost-effective action likely to result in a sufficiently low risk of extinction. To our knowledge, this is the first study to use Bayesian metapopulation viability analysis to explicitly incorporate parametric and demographic uncertainty into a cost-effective evaluation of conservation actions. The approach offers guidance to decision makers aiming to achieve cost-effective

  5. Bayesian modeling and prediction of solar particles flux

    Dedecius, Kamil; Kalová, J.

Praha: FJFI ČVUT v Praze, 2009 - (Štěpán, V.), p. 77. ISBN 978-80-01-04430-8. [XXXI. Dny radiační ochrany (31st Days of Radiation Protection). Kouty nad Desnou, Hrubý Jeseník (CZ), 02.11.2009-06.11.2009] R&D Projects: GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords: Bayesian model * solar particle * solar wind Subject RIV: IN - Informatics, Computer Science

  6. Research & development and growth: A Bayesian model averaging analysis

    Horváth, Roman


Vol. 28, No. 6 (2011), pp. 2669-2673. ISSN 0264-9993. [Society for Nonlinear Dynamics and Econometrics Annual Conference. Washington DC, 16.03.2011-18.03.2011] R&D Projects: GA ČR GA402/09/0965 Institutional research plan: CEZ:AV0Z10750506 Keywords: Research and development * Growth * Bayesian model averaging Subject RIV: AH - Economics Impact factor: 0.701, year: 2011

  7. Approximate Bayesian Recursive Estimation of Linear Model with Uniform Noise

    Pavelková, Lenka; Kárný, Miroslav

Brussels: IFAC, 2012, pp. 1803-1807. ISBN 978-3-902823-06-9. [16th IFAC Symposium on System Identification, The International Federation of Automatic Control. Brussels (BE), 11.07.2012-13.07.2012] R&D Projects: GA TA ČR TA01030123 Institutional support: RVO:67985556 Keywords: recursive parameter estimation * bounded noise * Bayesian learning * autoregressive models Subject RIV: BC - Control Systems Theory

  8. Comparing Bayesian models for multisensory cue combination without mandatory integration

    Beierholm, Ulrik R.; Shams, Ladan; Kording, Konrad P; Ma, Wei Ji


    Bayesian models of multisensory perception traditionally address the problem of estimating an underlying variable that is assumed to be the cause of the two sensory signals. The brain, however, has to solve a more general problem: it also has to establish which signals come from the same source and should be integrated, and which ones do not and should be segregated. In the last couple of years, a few models have been proposed to solve this problem in a Bayesian fashion. One of these ha...

  9. Bayesian model mixing for cold rolling mills: Test results

    Ettler, P.; Puchr, I.; Dedecius, Kamil

Slovakia: Slovak University of Technology, 2013, pp. 359-364. ISBN 978-1-4799-0926-1. [19th International Conference on Process Control. Štrbské Pleso (SK), 18.06.2013-21.06.2013] R&D Projects: GA MŠk(CZ) 7D09008; GA MŠk 7D12004 Keywords: Bayesian statistics * model mixing * process control Subject RIV: BC - Control Systems Theory

  10. Bayesian Model Comparison With the g-Prior

Nielsen, Jesper Kjær; Christensen, Mads Græsbøll; Cemgil, Ali Taylan


    Model comparison and selection is an important problem in many model-based signal processing applications. Often, very simple information criteria such as the Akaike information criterion or the Bayesian information criterion are used despite their shortcomings. Compared to these methods, Djuric’...

  11. Bayesian Estimation of the DINA Model with Gibbs Sampling

    Culpepper, Steven Andrew


    A Bayesian model formulation of the deterministic inputs, noisy "and" gate (DINA) model is presented. Gibbs sampling is employed to simulate from the joint posterior distribution of item guessing and slipping parameters, subject attribute parameters, and latent class probabilities. The procedure extends concepts in Béguin and Glas,…

  12. Cost-effectiveness modeling of colorectal cancer: Computed tomography colonography vs colonoscopy or fecal occult blood tests

Objectives: To assess the cost-effectiveness of three colorectal cancer (CRC) screening strategies in France: fecal occult blood tests (FOBT), computed tomography colonography (CTC) and optical colonoscopy (OC). Methods: Ten-year simulation modeling was used to assess a virtual asymptomatic, average-risk population 50–74 years old. Negative OC was repeated 10 years later, and OC positive for advanced or non-advanced adenoma 3 or 5 years later, respectively. FOBT was repeated biennially. Negative CTC was repeated 5 years later. Positive CTC and FOBT led to triennial OC. Total cost and CRC rate after 10 years were computed for each screening strategy and for adherence rates of 0–100% in 10% increments. Transition probabilities were programmed using distribution ranges to account for parameter uncertainty. Direct medical costs were estimated using the French national health insurance prices. Probabilistic sensitivity analyses used 5000 Monte Carlo simulations generating model outcomes and standard deviations. Results: For a given adherence rate, CTC screening was always the most effective but not the most cost-effective strategy. FOBT was the least effective but most cost-effective strategy. OC was of intermediate efficacy and the least cost-effective strategy. Without screening, treatment of 123 CRC per 10,000 individuals would cost €3,444,000. For 60% adherence, the costs of preventing and treating, respectively, 49 and 74 FOBT-detected, 73 and 50 CTC-detected, and 63 and 60 OC-detected CRC would be €2,810,000, €6,450,000 and €9,340,000. Conclusion: Simulation modeling helped to identify the most effective (CTC) and most cost-effective (FOBT) screening strategies in the setting of mass CRC screening in France.
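The probabilistic sensitivity analysis this record describes (parameters sampled from uncertainty ranges, thousands of Monte Carlo runs, means reported per strategy) can be sketched as below. The sensitivity ranges, screening costs, and treatment cost are invented for illustration; only the structure of the calculation follows the abstract.

```python
import random

random.seed(0)
N_SIM = 5000          # Monte Carlo runs, as in the abstract
POPULATION = 10_000

def one_run(adherence, sens_range, screen_cost, base_crc=123, treat_cost=28_000):
    """One Monte Carlo draw: sample screening sensitivity from its
    uncertainty range, then compute cancers prevented and total cost."""
    sens = random.uniform(*sens_range)
    prevented = base_crc * adherence * sens
    treated = base_crc - prevented
    cost = POPULATION * adherence * screen_cost + treated * treat_cost
    return prevented, cost

def psa(adherence, sens_range, screen_cost):
    runs = [one_run(adherence, sens_range, screen_cost) for _ in range(N_SIM)]
    return (sum(p for p, _ in runs) / N_SIM,   # mean CRC prevented
            sum(c for _, c in runs) / N_SIM)   # mean total cost

# Hypothetical sensitivity ranges and unit costs for the three strategies.
fobt = psa(0.6, (0.55, 0.75), 20)    # cheap, least sensitive
ctc = psa(0.6, (0.85, 0.95), 150)    # most sensitive, mid cost
oc = psa(0.6, (0.90, 0.97), 400)     # sensitive but most expensive
```

With these toy inputs the ordering mirrors the abstract's finding: CTC prevents the most cancers while FOBT remains the cheapest strategy.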

  13. Cost and cost effectiveness of long-lasting insecticide-treated bed nets - a model-based analysis

    Pulkki-Brännström Anni-Maria


Full Text Available Abstract Background The World Health Organization recommends that national malaria programmes universally distribute long-lasting insecticide-treated bed nets (LLINs). LLINs provide effective insecticide protection for at least three years while conventional nets must be retreated every 6-12 months. LLINs may also promise longer physical durability (lifespan), but at a higher unit price. No prospective data currently available are sufficient to calculate the comparative cost effectiveness of different net types. We thus constructed a model to explore the cost effectiveness of LLINs, asking how a longer lifespan affects the relative cost effectiveness of nets, and if, when and why LLINs might be preferred to conventional insecticide-treated nets. An innovation of our model is that we also considered the replenishment need, i.e., the loss of nets over time. Methods We modelled the choice of net over a 10-year period to facilitate the comparison of nets with different lifespans (and/or prices) and replenishment needs over time. Our base case represents a large-scale programme which achieves high coverage and usage throughout the population by distributing either LLINs or conventional nets through existing health services, and retreats a large proportion of conventional nets regularly at low cost. We identified the determinants of bed net programme cost effectiveness and parameter values for usage rate, delivery and retreatment cost from the literature. One-way sensitivity analysis was conducted to explicitly compare the differential effect of changing parameters such as price, lifespan, usage and replenishment need. Results If conventional and long-lasting bed nets have the same physical lifespan (3 years), LLINs are more cost effective unless they are priced at more than USD 1.5 above the price of conventional nets.
Because a longer lifespan brings delivery cost savings, each one-year increase in lifespan can be accompanied by a USD 1 or more increase in price.
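The trade-off this record analyzes (price vs. lifespan vs. retreatment, over a 10-year horizon with replacement of worn-out nets) can be sketched as an annualized-cost comparison. The unit costs below are hypothetical, not the study's parameter values.

```python
def annual_cost(price, lifespan_years, horizon=10,
                delivery=1.0, retreat_cost=0.0, retreats_per_year=0):
    """Programme cost per year of continuous net coverage: each worn-out net
    is replaced (paying price + delivery again); conventional nets also pay
    for periodic insecticide retreatment."""
    distributions = -(-horizon // lifespan_years)   # ceil: initial + replacements
    total = distributions * (price + delivery)
    total += horizon * retreats_per_year * retreat_cost
    return total / horizon

# Hypothetical unit costs (USD): conventional net at 3.00, retreated twice a
# year at 0.50 per retreatment; LLINs need no retreatment.
conventional = annual_cost(3.0, 3, retreat_cost=0.5, retreats_per_year=2)
llin_same_life = annual_cost(5.0, 3)   # same 3-year lifespan, higher price
llin_long_life = annual_cost(7.0, 5)   # higher price offset by longer lifespan
```

The comparison illustrates the abstract's mechanism: avoided retreatment makes an LLIN competitive at the same lifespan, and a longer lifespan cuts the number of paid distributions, so it can absorb a further price increase.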

  14. Cost-effectiveness of changes in alcohol taxation in Denmark: a modelling study

    Holm, Astrid Ledgaard; Veerman, Lennert; Cobiac, Linda; Ekholm, Ola; Diderichsen, Finn


    Introduction Excessive alcohol consumption is a public health problem in many countries including Denmark, where 6% of the burden of disease is due to alcohol consumption, according to the new estimates from the Global Burden of Disease 2010 study. Pricing policies, including tax increases, have been shown to effectively decrease the level of alcohol consumption. Methods We analysed the cost-effectiveness of three different scenarios of changed taxation of alcoholic beverages in Denmark (20% ...

  15. Modelling Agricultural Diffuse Pollution: CAP – WFD Interactions and Cost Effectiveness of Measures

    Mouratiadou, Ioanna; Topp, Cairistiona; Moran, Dominic


    Within the context of the Water Framework Directive (WFD) and the Common Agricultural Policy (CAP), the design of effective and sustainable agricultural and water resources management policies presents multiple challenges. This paper presents a methodological framework that will be used to identify synergies and trade-offs between the CAP and the WFD in relation to their economic and water resources environmental effects, and to assess the cost-effectiveness of measures to control water pollu...

  16. A model-based economic analysis of pre-pandemic influenza vaccination cost-effectiveness

    Halder, Nilimesh; Joel K Kelso; George J Milne


    Background A vaccine matched to a newly emerged pandemic influenza virus would require a production time of at least 6 months with current proven techniques, and so could only be used reactively after the peak of the pandemic. A pre-pandemic vaccine, although probably having lower efficacy, could be produced and used pre-emptively. While several previous studies have investigated the cost effectiveness of pre-emptive vaccination strategies, they have not been directly compared to realistic re...

  17. Cost effectiveness of first-line oral therapies for pulmonary arterial hypertension: A modelling study

    Coyle, K.; Coyle, D.; Blouin, J.; Lee, K; Jabr, MF; Tran, K.; Mielniczuk, L; Swiston, J; Innes, M.


    Background: In recent years, a significant number of costly oral therapies have become available for the treatment of pulmonary arterial hypertension (PAH). Funding decisions for these therapies requires weighing up their effectiveness and costs. Objective: The aim of this study was to assess the cost effectiveness of monotherapy with oral PAH-specific therapies versus supportive care as initial therapy for patients with functional class (FC) II and III PAH in Canada. Methods: A cost-utility ...

  18. Cost Effectiveness of First-Line Oral Therapies for Pulmonary Arterial Hypertension: A Modelling Study

    Coyle, Kathryn; Coyle, Doug; Blouin, Julie; Lee, Karen; Jabr, Mohammed F.; Tran, Khai; Mielniczuk, Lisa; Swiston, John; Innes, Mike


    Background In recent years, a significant number of costly oral therapies have become available for the treatment of pulmonary arterial hypertension (PAH). Funding decisions for these therapies requires weighing up their effectiveness and costs. Objective The aim of this study was to assess the cost effectiveness of monotherapy with oral PAH-specific therapies versus supportive care as initial therapy for patients with functional class (FC) II and III PAH in Canada. Methods A cost-utility ana...

  19. Bayesian Joint Modelling for Object Localisation in Weakly Labelled Images.

    Shi, Zhiyuan; Hospedales, Timothy M; Xiang, Tao


    We address the problem of localisation of objects as bounding boxes in images and videos with weak labels. This weakly supervised object localisation problem has been tackled in the past using discriminative models where each object class is localised independently from other classes. In this paper, a novel framework based on Bayesian joint topic modelling is proposed, which differs significantly from the existing ones in that: (1) All foreground object classes are modelled jointly in a single generative model that encodes multiple object co-existence so that "explaining away" inference can resolve ambiguity and lead to better learning and localisation. (2) Image backgrounds are shared across classes to better learn varying surroundings and "push out" objects of interest. (3) Our model can be learned with a mixture of weakly labelled and unlabelled data, allowing the large volume of unlabelled images on the Internet to be exploited for learning. Moreover, the Bayesian formulation enables the exploitation of various types of prior knowledge to compensate for the limited supervision offered by weakly labelled data, as well as Bayesian domain adaptation for transfer learning. Extensive experiments on the PASCAL VOC, ImageNet and YouTube-Object videos datasets demonstrate the effectiveness of our Bayesian joint model for weakly supervised object localisation. PMID:26340253

  20. A hierarchical bayesian model to quantify uncertainty of stream water temperature forecasts.

    Guillaume Bal

Full Text Available Providing generic and cost-effective modelling approaches to reconstruct and forecast freshwater temperature using predictors such as air temperature and water discharge is a prerequisite to understanding the ecological processes underlying the impact of water temperature and of global warming on continental aquatic ecosystems. Using air temperature as a simple linear predictor of water temperature can lead to significant bias in forecasts, as it does not disentangle seasonality and long-term trends in the signal. Here, we develop an alternative approach based on hierarchical Bayesian statistical time series modelling of water temperature, air temperature and water discharge using seasonal sinusoidal periodic signals and time-varying means and amplitudes. Fitting and forecasting performances of this approach are compared with those of simple linear regression between water and air temperatures using (i) a simulated example and (ii) application to three French coastal streams with contrasting bio-geographical conditions and sizes. The time series modelling approach fits the data better and, contrary to the linear regression, does not exhibit forecasting bias in long-term trends. This new model also allows for more accurate forecasts of water temperature than linear regression, together with a fair assessment of the uncertainty around forecasts. Warming of water temperature forecast by our hierarchical Bayesian model was slower and more uncertain than that expected with the classical regression approach. These new forecasts are in a form that is readily usable in further ecological analyses and will allow weighting of outcomes from different scenarios to manage climate change impacts on freshwater wildlife.
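The seasonal building block of the model in this record, a sinusoid with a mean and an amplitude, can be illustrated with a plain Fourier projection over whole years (the paper's model makes these quantities time-varying within a hierarchical Bayesian fit). The temperature series below is synthetic, with noise omitted for clarity.

```python
import math

# Two years of daily "water temperature": mean 12 °C, seasonal amplitude 6 °C,
# seasonal peak shifted by 30 days (all values hypothetical).
w = 2 * math.pi / 365.0
days = range(730)
temps = [12.0 + 6.0 * math.sin(w * (t - 30)) for t in days]

# Over an integer number of years, sin(wt) and cos(wt) are orthogonal, so the
# seasonal component is recovered by projecting the series onto them:
n = len(temps)
mean_level = sum(temps) / n
b = 2.0 / n * sum(y * math.sin(w * t) for t, y in zip(days, temps))
c = 2.0 / n * sum(y * math.cos(w * t) for t, y in zip(days, temps))
amplitude = math.hypot(b, c)
phase_shift = math.atan2(-c, b)   # recovered phase shift in radians
```

Separating this seasonal signal from the slowly varying mean is what lets the full model avoid confounding seasonality with long-term warming trends.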

  1. Modelling Cost Effectiveness in Neovascular Age-Related Macular Degeneration: The Impact of Using Contrast Sensitivity vs. Visual Acuity.

    Butt, T.; Patel, P. J.; Tufail, A; Rubin, G. S.


    Background The cost utility of treatments of age-related macular degeneration (AMD) is commonly assessed using health state transition models defined by levels of visual acuity. However, there is evidence that another measure of visual function, contrast sensitivity, may be better associated with utility than visual acuity. This paper investigates the difference in cost effectiveness resulting from models based on visual acuity and contrast sensitivity using the example of bevacizumab (Avasti...

  2. Spatial and spatio-temporal bayesian models with R - INLA

    Blangiardo, Marta


Dedication iii; Preface ix; 1 Introduction 1; 1.1 Why spatial and spatio-temporal statistics? 1; 1.2 Why do we use Bayesian methods for modelling spatial and spatio-temporal structures? 2; 1.3 Why INLA? 3; 1.4 Datasets 3; 2 Introduction to R 21; 2.1 The R language 21; 2.2 R objects 22; 2.3 Data and session management 34; 2.4 Packages 35; 2.5 Programming in R 36; 2.6 Basic statistical analysis with R 39; 3 Introduction to Bayesian Methods 53; 3.1 Bayesian Philosophy 53; 3.2 Basic Probability Elements 57; 3.3 Bayes Theorem 62; 3.4 Prior and Posterior Distributions 64; 3.5 Working with the Posterior Distribution 66; 3.6 Choosing the Prior Distr

  3. Modeling error distributions of growth curve models through Bayesian methods.

    Zhang, Zhiyong


Growth curve models are widely used in social and behavioral sciences. However, typical growth curve models often assume that the errors are normally distributed, although non-normal data may be even more common than normal data. In order to avoid possible statistical inference problems from blindly assuming normality, a general Bayesian framework is proposed to flexibly model normal and non-normal data through the explicit specification of the error distributions. A simulation study shows that when the distribution of the error is correctly specified, one can avoid the loss in the efficiency of standard error estimates. A real example on the analysis of mathematical ability growth data from the Early Childhood Longitudinal Study, Kindergarten Class of 1998-99 is used to show the application of the proposed methods. Instructions and code on how to conduct growth curve analysis with both normal and non-normal error distributions using the MCMC procedure of SAS are provided. PMID:26019004

  4. Asymptotically minimax Bayesian predictive densities for multinomial models

    Komaki, Fumiyasu


    One-step ahead prediction for the multinomial model is considered. The performance of a predictive density is evaluated by the average Kullback-Leibler divergence from the true density to the predictive density. Asymptotic approximations of risk functions of Bayesian predictive densities based on Dirichlet priors are obtained. It is shown that a Bayesian predictive density based on a specific Dirichlet prior is asymptotically minimax. The asymptotically minimax prior is different from known objective priors such as the Jeffreys prior or the uniform prior.
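The Bayesian predictive densities this record compares have a simple closed form for Dirichlet priors: the predictive probability of category j after observing counts n is (n_j + α_j) / (N + Σα). A minimal sketch, showing the uniform and Jeffreys priors that the paper's asymptotically minimax prior is contrasted with:

```python
def dirichlet_predictive(counts, alpha):
    """Posterior predictive distribution of the next multinomial draw under
    a Dirichlet(alpha) prior: p(j) = (n_j + alpha_j) / (N + sum(alpha))."""
    total = sum(counts) + sum(alpha)
    return [(n + a) / total for n, a in zip(counts, alpha)]

counts = [3, 0, 1]                                   # toy observed counts
uniform = dirichlet_predictive(counts, [1.0] * 3)    # uniform prior (alpha = 1)
jeffreys = dirichlet_predictive(counts, [0.5] * 3)   # Jeffreys prior (alpha = 1/2)
```

The paper's point is that the prior minimizing worst-case Kullback-Leibler risk asymptotically is neither of these standard choices.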

  5. Uncertainty Modeling Based on Bayesian Network in Ontology Mapping

    LI Yuhua; LIU Tao; SUN Xiaolin


How to deal with uncertainty is crucial in exact concept mapping between ontologies. This paper presents a new framework for modeling uncertainty in ontologies based on Bayesian networks (BNs). In our approach, the Web Ontology Language (OWL) is extended with probabilistic markups for attaching probability information; the source and target ontologies (expressed in the extended OWL) are translated into Bayesian networks; and the mapping between the two ontologies is derived by constructing the conditional probability tables (CPTs) of the BN using an improved algorithm named I-IPFP based on the iterative proportional fitting procedure (IPFP). The basic idea of this framework and algorithm is validated by positive results from computer experiments.

  6. Multimethod, multistate Bayesian hierarchical modeling approach for use in regional monitoring of wolves.

    Jiménez, José; García, Emilio J; Llaneza, Luis; Palacios, Vicente; González, Luis Mariano; García-Domínguez, Francisco; Múñoz-Igualada, Jaime; López-Bao, José Vicente


In many cases, the first step in large-carnivore management is to obtain objective, reliable, and cost-effective estimates of population parameters through procedures that are reproducible over time. However, monitoring predators over large areas is difficult, and the data have a high level of uncertainty. We devised a practical multimethod and multistate modeling approach based on Bayesian hierarchical site-occupancy models that combined multiple survey methods to estimate different population states for use in monitoring large predators at a regional scale. We used wolves (Canis lupus) as our model species and generated reliable estimates of the number of sites with wolf reproduction (presence of pups). We used 2 wolf data sets from Spain (Western Galicia in 2013 and Asturias in 2004) to test the approach. Based on howling surveys, the naïve estimate (i.e., based only on observations) of the number of sites with reproduction was 9 and 25 sites in Western Galicia and Asturias, respectively. Our model estimated 33.4 (SD 9.6) and 34.4 (3.9) sites with wolf reproduction, respectively. The estimated proportion of occupied sites with wolf reproduction was 0.67 (SD 0.19) and 0.76 (0.11), respectively. This approach can be used to design more cost-effective monitoring programs (i.e., to define the sampling effort needed per site). Our approach should inspire well-coordinated surveys across multiple administrative borders and populations and lead to improved decision making for management of large carnivores on a landscape level. The use of this Bayesian framework provides a simple way to visualize the degree of uncertainty around population-parameter estimates and thus provides managers and stakeholders an intuitive approach to interpreting monitoring results.
Our approach can be widely applied to large spatial scales in wildlife monitoring where detection probabilities differ between population states and where several methods are being used to estimate different population

  7. Hierarchical Bayesian spatial models for multispecies conservation planning and monitoring.

    Carroll, Carlos; Johnson, Devin S; Dunk, Jeffrey R; Zielinski, William J


    Biologists who develop and apply habitat models are often familiar with the statistical challenges posed by their data's spatial structure but are unsure of whether the use of complex spatial models will increase the utility of model results in planning. We compared the relative performance of nonspatial and hierarchical Bayesian spatial models for three vertebrate and invertebrate taxa of conservation concern (Church's sideband snails [Monadenia churchi], red tree voles [Arborimus longicaudus], and Pacific fishers [Martes pennanti pacifica]) that provide examples of a range of distributional extents and dispersal abilities. We used presence-absence data derived from regional monitoring programs to develop models with both landscape and site-level environmental covariates. We used Markov chain Monte Carlo algorithms and a conditional autoregressive or intrinsic conditional autoregressive model framework to fit spatial models. The fit of Bayesian spatial models was between 35 and 55% better than the fit of nonspatial analogue models. Bayesian spatial models outperformed analogous models developed with maximum entropy (Maxent) methods. Although the best spatial and nonspatial models included similar environmental variables, spatial models provided estimates of residual spatial effects that suggested how ecological processes might structure distribution patterns. Spatial models built from presence-absence data improved fit most for localized endemic species with ranges constrained by poorly known biogeographic factors and for widely distributed species suspected to be strongly affected by unmeasured environmental variables or population processes. 
By treating spatial effects as a variable of interest rather than a nuisance, hierarchical Bayesian spatial models, especially when they are based on a common broad-scale spatial lattice (here the national Forest Inventory and Analysis grid of 24 km(2) hexagons), can increase the relevance of habitat models to multispecies

  8. Bayesian Modelling of fMRI Time Series

    Højen-Sørensen, Pedro; Hansen, Lars Kai; Rasmussen, Carl Edward


    We present a Hidden Markov Model (HMM) for inferring the hidden psychological state (or neural activity) during single trial fMRI activation experiments with blocked task paradigms. Inference is based on Bayesian methodology, using a combination of analytical and a variety of Markov Chain Monte...

  10. Bayesian nonparametric estimation of hazard rate in monotone Aalen model

    Timková, Jana


Vol. 50, No. 6 (2014), pp. 849-868. ISSN 0023-5954 Institutional support: RVO:67985556 Keywords: Aalen model * Bayesian estimation * MCMC Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.541, year: 2014

  11. An Inhomogeneous Bayesian Texture Model for Spatially Varying Parameter Estimation

    Dharmagunawardhana, Chathurika; Mahmoodi, Sasan; Bennett, Michael; Niranjan, Mahesan


    In statistical model based texture feature extraction, features based on spatially varying parameters achieve higher discriminative performances compared to spatially constant parameters. In this paper we formulate a novel Bayesian framework which achieves texture characterization by spatially varying parameters based on Gaussian Markov random fields. The parameter estimation is carried out by Metropolis-Hastings algorithm. The distributions of estimated spatially varying paramete...

  12. Cost-effectiveness of enhanced syphilis screening among HIV-positive men who have sex with men: a microsimulation model.

    Ashleigh R Tuite

Full Text Available Syphilis co-infection risk has increased substantially among HIV-infected men who have sex with men (MSM). Frequent screening for syphilis and treatment of men who test positive might be a practical means of controlling the risk of infection and disease sequelae in this population. We evaluated the cost-effectiveness of strategies that increased the frequency and population coverage of syphilis screening in HIV-infected MSM receiving HIV care, relative to the current standard of care. We developed a state-transition microsimulation model of syphilis natural history and medical care in HIV-infected MSM receiving care for HIV. We performed Monte Carlo simulations using input data derived from a large observational cohort in Ontario, Canada, and from published biomedical literature. Simulations compared usual care (57% of the population screened annually) to different combinations of more frequent (3- or 6-monthly) screening and higher coverage (100% screened). We estimated expected disease-specific outcomes, quality-adjusted survival, costs, and cost-effectiveness associated with each strategy from the perspective of a public health care payer. Usual care was more costly and less effective than strategies with more frequent or higher coverage screening. Higher coverage strategies (with screening frequency of 3 or 6 months) were expected to be cost-effective based on usually cited willingness-to-pay thresholds. These findings were robust in the face of probabilistic sensitivity analyses, alternate cost-effectiveness thresholds, and alternate assumptions about duration of risk, program characteristics, and management of underlying HIV. We project that higher coverage and more frequent syphilis screening of HIV-infected MSM would be a highly cost-effective health intervention, with many potentially viable screening strategies projected to both save costs and improve health when compared to usual care. The baseline requirement for regular blood testing in this
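A state-transition microsimulation of screening frequency like the one described can be sketched at toy scale. The infection and detection probabilities below are invented for illustration only and do not come from the cohort data the study used.

```python
import random

random.seed(42)

def untreated_years(screens_per_year, years=10, p_infect=0.05, p_detect=0.95):
    """Simulate one person: each year an infection may occur; each screening
    visit that year detects it (leading to treatment) with prob p_detect."""
    total = 0
    for _ in range(years):
        if random.random() < p_infect:
            detected = any(random.random() < p_detect
                           for _ in range(screens_per_year))
            if not detected:
                total += 1   # a person-year of untreated infection
    return total

def mean_untreated(screens_per_year, n=2000):
    """Average untreated person-years across a simulated cohort."""
    return sum(untreated_years(screens_per_year) for _ in range(n)) / n

annual = mean_untreated(1)      # usual care: one screen per year
quarterly = mean_untreated(4)   # enhanced: four screens per year
```

Aggregating such per-person trajectories, and attaching costs and utilities to each state, is the basic mechanism behind the cost-effectiveness comparison in the record.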

  13. Research on Bayesian Network Based User's Interest Model

    ZHANG Weifeng; XU Baowen; CUI Zifeng; XU Lei


Filtering and selectively retrieving the large volume of information on the Internet is of real practical significance for improving the quality of users' access to information. On the basis of analyzing existing user interest models and some basic questions of user interest (representation, derivation and identification), a Bayesian network based user interest model is given. In this model, a user interest reduction algorithm based on the Markov Blanket model is used to reduce interest noise, and then documents the user is and is not interested in are used to train the Bayesian network. Compared to the simple model, this model has advantages such as small space requirements, a simple reasoning method and a high recognition rate. The experimental results show that this model can more appropriately reflect the user's interest, and has higher performance and good usability.

  14. Bayesian estimation of parameters in a regional hydrological model

    K. Engeland


Full Text Available This study evaluates the applicability of the distributed, process-oriented Ecomag model for prediction of daily streamflow in ungauged basins. The Ecomag model is applied as a regional model to nine catchments in the NOPEX area, using Bayesian statistics to estimate the posterior distribution of the model parameters conditioned on the observed streamflow. The distribution is calculated by Markov Chain Monte Carlo (MCMC) analysis. The Bayesian method requires formulation of a likelihood function for the parameters, and three alternative formulations are used. The first is a subjectively chosen objective function that describes the goodness of fit between the simulated and observed streamflow, as defined in the GLUE framework. The second and third formulations are more statistically correct likelihood models that describe the simulation errors. The full statistical likelihood model describes the simulation errors as an AR(1) process, whereas the simple model excludes the auto-regressive part. The statistical parameters depend on the catchments and the hydrological processes, and the statistical and hydrological parameters are estimated simultaneously. The results show that the simple likelihood model gives the most robust parameter estimates. The simulation error may be explained to a large extent by the catchment characteristics and climatic conditions, so it is possible to transfer knowledge about them to ungauged catchments. The statistical models for the simulation errors indicate that structural errors in the model are more important than parameter uncertainties. Keywords: regional hydrological model, model uncertainty, Bayesian analysis, Markov Chain Monte Carlo analysis
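The AR(1) error model in the full statistical likelihood can be written down directly; a minimal sketch of its log-likelihood follows, with toy residuals and parameter values that are not from the study.

```python
import math

def ar1_loglik(errors, rho, sigma):
    """Log-likelihood of simulation errors e_t under an AR(1) model:
    e_t = rho * e_{t-1} + eps_t, with eps_t ~ N(0, sigma^2)."""
    var0 = sigma ** 2 / (1 - rho ** 2)     # stationary variance of e_0
    ll = -0.5 * (math.log(2 * math.pi * var0) + errors[0] ** 2 / var0)
    for prev, cur in zip(errors, errors[1:]):
        resid = cur - rho * prev           # innovation eps_t
        ll -= 0.5 * (math.log(2 * math.pi * sigma ** 2)
                     + resid ** 2 / sigma ** 2)
    return ll

residuals = [0.2, 0.15, 0.1, -0.05, 0.0]   # toy simulation errors
ll_ar = ar1_loglik(residuals, rho=0.5, sigma=0.1)
```

Dropping the `rho * prev` term (i.e., setting rho = 0) recovers the simple independent-error likelihood that the study found gave the most robust parameter estimates.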

  15. A Bayesian Markov geostatistical model for estimation of hydrogeological properties

A geostatistical methodology based on Markov-chain analysis and Bayesian statistics was developed for probability estimations of hydrogeological and geological properties in the siting process of a nuclear waste repository. The probability estimates have practical use in decision-making on issues such as siting, investigation programs, and construction design. The methodology is nonparametric, which makes it possible to handle information that does not exhibit standard statistical distributions, as is often the case for classified information. Data do not need to meet the requirements on additivity and normality as with the geostatistical methods based on regionalized variable theory, e.g., kriging. The methodology also has a formal way for incorporating professional judgments through the use of Bayesian statistics, which allows for updating of prior estimates to posterior probabilities each time new information becomes available. A Bayesian Markov Geostatistical Model (BayMar) software was developed for implementation of the methodology in two and three dimensions. This paper gives (1) a theoretical description of the Bayesian Markov Geostatistical Model; (2) a short description of the BayMar software; and (3) an example of application of the model for estimating the suitability for repository establishment with respect to the three parameters of lithology, hydraulic conductivity, and rock quality designation index (RQD) at 400-500 meters below ground surface in an area around the Aespoe Hard Rock Laboratory in southeastern Sweden.
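The updating step this record describes, turning prior estimates into posterior probabilities each time new information arrives, reduces to Bayes' theorem over discrete classes. A minimal sketch with hypothetical lithology classes and likelihoods (not the BayMar software's actual interface):

```python
def bayes_update(prior, likelihood):
    """Posterior class probabilities: posterior_j ∝ prior_j * P(obs | class j)."""
    unnorm = [p * l for p, l in zip(prior, likelihood)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

# Hypothetical prior over three lithology classes at one grid cell, updated
# twice as new borehole observations become available.
prior = [0.5, 0.3, 0.2]
posterior = bayes_update(prior, [0.9, 0.2, 0.1])      # evidence favours class 1
posterior = bayes_update(posterior, [0.8, 0.3, 0.2])  # further evidence
```

Because each posterior becomes the prior for the next observation, professional judgment encoded in the initial prior is progressively dominated by data as investigations proceed.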

  16. Theoretical Basis in Regression Model Based Selection of the Most Cost Effective Parameters of Hard Rock Surface Mining

    Antipas T. S. Massawe; Karim R. Baruti; Paul S. M. Gongo


Selection of the most cost-effective parameters of hard rock surface mining requires consideration of all alternative variants of mine design and the conflicting effects of their parameters on cost. This consideration can be realized with a mathematical model of the cumulative influence of rockmass and mine design variables on the overall cost per ton of the hard rock drilled, blasted, hauled and primary crushed. Available works on the topic mostly dwelt on four processes of hard ...

  17. Bayesian and maximin optimal designs for heteroscedastic regression models

    Dette, Holger; Haines, Linda M.; Imhof, Lorens A.


    The problem of constructing standardized maximin D-optimal designs for weighted polynomial regression models is addressed. In particular it is shown that, by following the broad approach to the construction of maximin designs introduced recently by Dette, Haines and Imhof (2003), such designs can be obtained as weak limits of the corresponding Bayesian Φq-optimal designs. The approach is illustrated for two specific weighted polynomial models and also for a particular growth model.

  18. Bayesian modeling growth curves for quail assuming skewness in errors

    Robson Marcelo Rossi


    Full Text Available Bayesian modeling growth curves for quail assuming skewness in errors - Assuming normal distributions in data analysis is common in many areas of knowledge. However, other distributions that can model the skewness parameter are available for situations where the data have tails heavier than the normal. This article presents alternatives to the assumption of normality in the errors by introducing asymmetric distributions. A Bayesian approach is proposed to fit nonlinear models when the errors are not normal; the t, skew-normal and skew-t distributions are adopted. The methodology is applied to fit different growth curves to quail body weights. The Gompertz model assuming skew-normal errors for males and skew-t errors for females provided the best fit to the data.
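    For reference, the Gompertz curve named in this abstract has the standard form W(t) = a * exp(-b * exp(-c * t)), where a is the asymptotic weight. The sketch below evaluates it; the parameter values are illustrative assumptions, not the paper's estimates.

```python
import math

# Gompertz growth curve: a = asymptotic weight; b and c control displacement
# and growth rate. Parameter values below are assumed for illustration only.
def gompertz(t, a, b, c):
    return a * math.exp(-b * math.exp(-c * t))

# Weekly body weights over six weeks for an assumed quail growth trajectory.
weights = [gompertz(t, a=250.0, b=4.0, c=0.1) for t in range(0, 43, 7)]
```

    The curve rises monotonically toward the asymptote a, which is why it is a common choice for animal growth data.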


    Uncertainties in ozone concentrations predicted with a Lagrangian photochemical air quality model have been estimated using Bayesian Monte Carlo (BMC) analysis. Bayesian Monte Carlo analysis provides a means of combining subjective "prior" uncertainty estimates developed ...

  20. A Bayesian nonlinear mixed-effects disease progression model

    Kim, Seongho; Jang, Hyejeong; Wu, Dongfeng; Abrams, Judith


    A nonlinear mixed-effects approach is developed for disease progression models that incorporate variation in age in a Bayesian framework. We further generalize the probability model for sensitivity to depend on age at diagnosis, time spent in the preclinical state and sojourn time. The developed models are then applied to the Johns Hopkins Lung Project data and the Health Insurance Plan for Greater New York data using Bayesian Markov chain Monte Carlo and are compared with the estimation method that does not consider random-effects from age. Using the developed models, we obtain not only age-specific individual-level distributions, but also population-level distributions of sensitivity, sojourn time and transition probability. PMID:26798562

  1. Non-stationarity in GARCH models: A Bayesian analysis

    Kleibergen, Frank; Dijk, Herman


    First, the non-stationarity properties of the conditional variances in the GARCH(1,1) model are analysed using the concept of infinite persistence of shocks. Given a time sequence of probabilities for increasing/decreasing conditional variances, a theoretical formula for quasi-strict non-stationarity is defined. The resulting conditions for the GARCH(1,1) model are shown to differ from the weak stationarity conditions mainly used in the literature. Bayesian statistical analysis us...

  2. A New Bayesian Unit Root Test in Stochastic Volatility Models

    Yong Li; Jun Yu


    A new posterior odds analysis is proposed to test for a unit root in volatility dynamics in the context of stochastic volatility models. This analysis extends the Bayesian unit root test of So and Li (1999, Journal of Business & Economic Statistics) in two important ways. First, a numerically more stable algorithm is introduced to compute the Bayes factor, taking into account the special structure of the competing models. Owing to its numerical stability, the algorithm overcomes the problem of ...

  3. Bayesian Modelling in Machine Learning: A Tutorial Review

    Seeger, Matthias


    Many facets of Bayesian Modelling are firmly established in Machine Learning and give rise to state-of-the-art solutions to application problems. The sheer number of techniques, ideas and models which have been proposed, and the terminology, can be bewildering. With this tutorial review, we aim to give a wide high-level overview over this important field, concentrating on central ideas and methods, and on their interconnections. The reader will gain a basic understanding of the topics and the...

  4. Performance and prediction: Bayesian modelling of fallible choice in chess

    Haworth, Guy McCrossan; Regan, Ken; Di Fatta, Giuseppe


    Evaluating agents in decision-making applications requires assessing their skill and predicting their behaviour. Both are well developed in Poker-like situations, but less so in more complex game and model domains. This paper addresses both tasks by using Bayesian inference in a benchmark space of reference agents. The concepts are explained and demonstrated using the game of chess but the model applies generically to any domain with quantifiable options and fallible choice. Demonstration ...

  5. Bayesian modeling and prediction of solar particles flux

    An autoregression model was developed based on the Bayesian approach. To account for the non-homogeneity of the solar wind, the pure autoregressive properties of the model were combined with expert knowledge about the similar behaviour of various phenomena related to the flux properties. Examples of such situations include the hardening of the X-ray spectrum, which is often followed by a coronal mass ejection and a significant increase in particle flux intensity.

  6. Bayesian modeling and prediction of solar particles flux

    Dedecius, Kamil; Kalová, J.

    18/56/, 7/8 (2010), s. 228-230. ISSN 1210-7085 R&D Projects: GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords: mathematical models * solar activity * solar flares * solar flux * solar particles Subject RIV: BB - Applied Statistics, Operational Research

  7. Hierarchical Bayesian Modeling of Hitting Performance in Baseball

    Jensen, Shane T.; McShane, Blake; Wyner, Abraham J.


    We have developed a sophisticated statistical model for predicting the hitting performance of Major League baseball players. The Bayesian paradigm provides a principled method for balancing past performance with crucial covariates, such as player age and position. We share information across time and across players by using mixture distributions to control shrinkage for improved accuracy. We compare the performance of our model to current sabermetric methods on a held-out seaso...

  8. Bayesian estimation of a DSGE model with inventories

    Foerster, Marcel


    This paper introduces inventories in an otherwise standard Dynamic Stochastic General Equilibrium Model (DSGE) of the business cycle. Firms accumulate inventories to facilitate sales, but face a cost of doing so in terms of costly storage of intermediate goods. The paper's main contribution is to present a DSGE model with inventories that is estimated using Bayesian methods. Based on US data we show that accounting for inventory dynamics has a significant impact on parameter estimates and imp...

  9. Methodologies used in cost-effectiveness models for evaluating treatments in major depressive disorder: a systematic review

    Zimovetz Evelina A


    Full Text Available Abstract Background Decision makers in many jurisdictions use cost-effectiveness estimates as an aid for selecting interventions with an appropriate balance between health benefits and costs. This systematic literature review aims to provide an overview of published cost-effectiveness models in major depressive disorder (MDD) with a focus on the methods employed. Key components of the identified models are discussed and any challenges in developing models are highlighted. Methods A systematic literature search was performed to identify all primary model-based economic evaluations of MDD interventions indexed in MEDLINE, the Cochrane Library, EMBASE, EconLit, and PsycINFO between January 2000 and May 2010. Results A total of 37 studies were included in the review. These studies predominantly evaluated antidepressant medications. The analyses were performed across a broad set of countries. The majority of models were decision-trees; eight were Markov models. Most models had a time horizon of less than 1 year. The majority of analyses took a payer perspective. Clinical input data were obtained from pooled placebo-controlled comparative trials, single head-to-head trials, or meta-analyses. The majority of studies (24 of 37) used treatment success or symptom-free days as main outcomes, 14 studies incorporated health state utilities, and 2 used disability-adjusted life-years. A few models (14 of 37) incorporated probabilities and costs associated with suicide and/or suicide attempts. Two models examined the cost-effectiveness of second-line treatment in patients who had failed to respond to initial therapy. Resource use data used in the models were obtained mostly from expert opinion. All studies, with the exception of one, explored parameter uncertainty. Conclusions The review identified several model input data gaps, including utility values in partial responders, efficacy of second-line treatments, and resource utilisation estimates obtained from
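    Since most of the reviewed models are decision trees, it may help to recall the basic computation such models perform: each arm's expected cost and expected outcome is a probability-weighted sum over its branches. The probabilities, costs, and utilities below are invented for illustration and do not come from any reviewed study.

```python
# Expected value over the branches of one decision-tree arm.
def expected_value(branches):
    """branches: (probability, value) pairs whose probabilities sum to 1."""
    assert abs(sum(p for p, _ in branches) - 1.0) < 1e-9
    return sum(p * v for p, v in branches)

# Hypothetical treatment arm: 60% remission, 40% no remission (assumed).
cost_treatment = expected_value([(0.6, 1_000.0), (0.4, 3_500.0)])   # cost per patient
utility_treatment = expected_value([(0.6, 0.8), (0.4, 0.5)])        # health-state utility
```

    Comparing these expected values across arms, per unit of health gained, is what yields the cost-effectiveness ratios the review discusses.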

  10. Markov Model of Wind Power Time Series Using Bayesian Inference of Transition Matrix

    Chen, Peiyuan; Berthelsen, Kasper Klitgaard; Bak-Jensen, Birgitte; Chen, Zhe


    This paper proposes using Bayesian inference of the transition matrix when developing a discrete Markov model of a wind speed/power time series, together with a 95% credible interval for model verification. The Dirichlet distribution is used as a conjugate prior for the transition matrix. Three discrete Markov models are compared, i.e. the basic Markov model, the Bayesian Markov model and the birth-and-death Markov model. The proposed Bayesian Markov model shows the best accuracy in modeling the autocorr...
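    The conjugacy noted in the abstract makes the update simple: with a Dirichlet prior on each row of the transition matrix, the posterior is again Dirichlet, with observed transition counts added to the prior pseudo-counts. The sketch below computes the posterior-mean matrix; the state labels, sequence, and symmetric prior are illustrative assumptions, not the paper's setup.

```python
# Posterior-mean Markov transition matrix under a symmetric Dirichlet(alpha)
# prior on each row. Row i, column j: P(next state = j | current state = i).
def transition_posterior_mean(states, sequence, alpha=1.0):
    n = len(states)
    index = {s: k for k, s in enumerate(states)}
    counts = [[0] * n for _ in range(n)]
    for cur, nxt in zip(sequence, sequence[1:]):
        counts[index[cur]][index[nxt]] += 1
    matrix = []
    for row in counts:
        total = sum(row) + alpha * n          # counts plus prior pseudo-counts
        matrix.append([(c + alpha) / total for c in row])
    return matrix

states = ["low", "medium", "high"]            # discretised power levels (assumed)
sequence = ["low", "low", "medium", "high", "medium", "medium", "low"]
P = transition_posterior_mean(states, sequence)
```

    Unlike a raw maximum-likelihood estimate, the prior pseudo-counts keep every row a proper distribution even for states with few or no observed transitions.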

  11. The cost-effectiveness of neonatal screening for Cystic Fibrosis: an analysis of alternative scenarios using a decision model

    Tu Karen


    Full Text Available Abstract Background The use of neonatal screening for cystic fibrosis is widely debated in the United Kingdom and elsewhere, but the evidence available to inform policy is limited. This paper explores the cost-effectiveness of adding screening for cystic fibrosis to an existing routine neonatal screening programme for congenital hypothyroidism and phenylketonuria, under alternative scenarios and assumptions. Methods The study is based on a decision model comparing screening to no screening in terms of a number of outcome measures, including diagnosis of cystic fibrosis, life-time treatment costs, life years and QALYs gained. The setting is a hypothetical UK health region without an existing neonatal screening programme for cystic fibrosis. Results Under initial assumptions, neonatal screening (using an immunoreactive trypsin/DNA two-stage screening protocol) costs £5,387 per infant diagnosed, or £1.83 per infant screened (1998 costs). Neonatal screening for cystic fibrosis produces an incremental cost-effectiveness of £6,864 per QALY gained in our base case scenario (an assumed benefit of a 6-month delay in the emergence of symptoms). A difference of 11 months or more in the emergence of symptoms (and mean survival) means neonatal screening is both less costly and produces better outcomes than no screening. Conclusion Neonatal screening is expensive as a method of diagnosis. Neonatal screening may be a cost-effective intervention if the hypothesised delays in the onset of symptoms are confirmed. Implementing both antenatal and neonatal screening would undermine potential economic benefits, since a reduction in the birth incidence of cystic fibrosis would reduce the cost-effectiveness of neonatal screening.
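    The figures quoted above are incremental cost-effectiveness ratios (ICERs): the extra cost of screening divided by the extra health gained. The sketch below shows the calculation; the input costs and QALY totals are invented, chosen only so the ratio reproduces the quoted £6,864 per QALY, and are not data from the study.

```python
# Incremental cost-effectiveness ratio: extra cost per extra unit of effect.
def icer(cost_new, cost_old, effect_new, effect_old):
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical cohort totals (assumed values, chosen only to reproduce the
# quoted base-case ratio of 6,864 pounds per QALY gained).
ratio = icer(cost_new=120_000.0, cost_old=51_360.0, effect_new=20.0, effect_old=10.0)
print(ratio)  # -> 6864.0
```

    An intervention "dominates" (as in the 11-month scenario above) when the cost difference is negative while the effect difference is positive, so no ratio needs to be quoted at all.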

  12. Bayesian point event modeling in spatial and environmental epidemiology.

    Lawson, Andrew B


    This paper reviews the current state of point event modeling in spatial epidemiology from a Bayesian perspective. Point event (or case event) data arise when geo-coded addresses of disease events are available. Often, this level of spatial resolution would not be accessible due to medical confidentiality constraints. However, for the examination of small spatial scales, it is important to be capable of examining point process data directly. Models for such data are usually formulated based on point process theory. In addition, special conditioning arguments can lead to simpler Bernoulli likelihoods and logistic spatial models. Goodness-of-fit diagnostics and Bayesian residuals are also considered. Applications within putative health hazard risk assessment, cluster detection, and linkage to environmental risk fields (misalignment) are considered. PMID:23035034

  13. A global economic model to assess the cost-effectiveness of new treatments for advanced breast cancer in Canada.

    Beauchemin, C; Letarte, N; Mathurin, K; Yelle, L; Lachaine, J


    Objective Considering the increasing number of treatment options for metastatic breast cancer (MBC), it is important to develop high-quality methods to assess the cost-effectiveness of new anti-cancer drugs. This study aims to develop a global economic model that could be used as a benchmark for the economic evaluation of new therapies for MBC. Methods The Global Pharmacoeconomics of Metastatic Breast Cancer (GPMBC) model is a Markov model that was constructed to estimate the incremental cost per quality-adjusted life years (QALY) of new treatments for MBC from a Canadian healthcare system perspective over a lifetime horizon. Specific parameters included in the model are cost of drug treatment, survival outcomes, and incidence of treatment-related adverse events (AEs). Global parameters are patient characteristics, health states utilities, disutilities, and costs associated with treatment-related AEs, as well as costs associated with drug administration, medical follow-up, and end-of-life care. The GPMBC model was tested and validated in a specific context, by assessing the cost-effectiveness of lapatinib plus letrozole compared with other widely used first-line therapies for post-menopausal women with hormone receptor-positive (HR+) and epidermal growth factor receptor 2-positive (HER2+) MBC. Results When tested, the GPMBC model led to incremental cost-utility ratios of CA$131 811 per QALY, CA$56 211 per QALY, and CA$102 477 per QALY for the comparison of lapatinib plus letrozole vs letrozole alone, trastuzumab plus anastrozole, and anastrozole alone, respectively. Results of the model testing were quite similar to those obtained by Delea et al., who also assessed the cost-effectiveness of lapatinib in combination with letrozole in HR+/HER2 + MBC in Canada, thus suggesting that the GPMBC model can replicate results of well-conducted economic evaluations. 
Conclusions The GPMBC model can be very valuable as it allows a quick and valid assessment of the cost-effectiveness

  14. Cost Effectiveness of the Instrumentalism in Occupational Therapy (IOT) Conceptual Model as a Guide for Intervention with Adolescents with Emotional and Behavioral Disorders (EBD)

    Ikiugu, Moses N.; Anderson, Lynne


    The purpose of this paper was to demonstrate the cost-effectiveness of using the Instrumentalism in Occupational Therapy (IOT) conceptual practice model as a guide for intervention to assist teenagers with emotional and behavioral disorders (EBD) transition successfully into adulthood. The cost effectiveness analysis was based on a project…

  15. Bayesian hierarchical modelling of weak lensing - the golden goal

    Heavens, Alan; Jaffe, Andrew; Hoffmann, Till; Kiessling, Alina; Wandelt, Benjamin


    Accomplishing correct Bayesian inference from weak lensing shear data requires a complete statistical description of the data. The natural framework to do this is a Bayesian Hierarchical Model, which divides the chain of reasoning into component steps. Starting with a catalogue of shear estimates in tomographic bins, we build a model that allows us to sample simultaneously from the underlying tomographic shear fields and the relevant power spectra (E-mode, B-mode, and E-B, for auto- and cross-power spectra). The procedure deals easily with masked data and intrinsic alignments. Using Gibbs sampling and messenger fields, we show with simulated data that the large (over 67,000)-dimensional parameter space can be efficiently sampled and the full joint posterior probability density function for the parameters can feasibly be obtained. The method correctly recovers the underlying shear fields and all of the power spectra, including at levels well below the shot noise.

  16. A localization model to localize multiple sources using Bayesian inference

    Dunham, Joshua Rolv

    Accurate localization of a sound source in a room setting is important in both psychoacoustics and architectural acoustics. Binaural models have been proposed to explain how the brain processes and utilizes the interaural time differences (ITDs) and interaural level differences (ILDs) of sound waves arriving at the ears of a listener in determining source location. Recent work shows that applying Bayesian methods to this problem is proving fruitful. In this thesis, pink noise samples are convolved with head-related transfer functions (HRTFs) and compared to combinations of one and two anechoic speech signals convolved with different HRTFs or binaural room impulse responses (BRIRs) to simulate room positions. Through exhaustive calculation of Bayesian posterior probabilities and using a maximal likelihood approach, model selection will determine the number of sources present, and parameter estimation will result in azimuthal direction of the source(s).

  17. Bayesian Inference and Forecasting in the Stationary Bilinear Model

    Roberto Leon-Gonzalez; Fuyu Yang


    A stationary bilinear (SB) model can be used to describe processes with a time-varying degree of persistence that depends on past shocks. An example of such a process is inflation. This study develops methods for Bayesian inference, model comparison, and forecasting in the SB model. Using monthly U.K. inflation data, we find that the SB model outperforms the random walk and first order autoregressive AR(1) models in terms of root mean squared forecast errors for both the one-step-ahead and th...

  18. Bayesian Age-Period-Cohort Modeling and Prediction - BAMP

    Volker J. Schmid


    Full Text Available The software package BAMP provides a method of analyzing incidence or mortality data on the Lexis diagram, using a Bayesian version of an age-period-cohort model. A hierarchical model is assumed, with a binomial model in the first stage. As smoothing priors for the age, period and cohort parameters, random walks of first and second order, with and without an additional unstructured component, are available. Unstructured heterogeneity can also be included in the model. In order to evaluate the model fit, posterior deviance, DIC and predictive deviances are computed. By projecting the random walk prior into the future, future death rates can be predicted.
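    The prediction step described above (projecting the random-walk prior into the future) has a simple point-forecast analogue: under a second-order random walk, the mean projection extrapolates the last linear trend, x[t] = 2*x[t-1] - x[t-2]. The sketch below shows that mean projection only, ignoring the widening predictive uncertainty that BAMP itself would report; the series values are illustrative.

```python
# Point projection of a second-order random walk: each new value continues
# the last linear trend. This is the posterior-mean path; BAMP would also
# simulate noise around it to quantify predictive uncertainty.
def rw2_project(series, steps):
    out = list(series)
    for _ in range(steps):
        out.append(2 * out[-1] - out[-2])
    return out

period_effects = [0.10, 0.08, 0.05, 0.03]   # illustrative log-rate period effects
projected = rw2_project(period_effects, steps=2)
```

    A first-order random walk would instead carry the last value forward unchanged, which is why the choice between first- and second-order priors matters for projections.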

  19. Introduction to Hierarchical Bayesian Modeling for Ecological Data

    Parent, Eric


    Making statistical modeling and inference more accessible to ecologists and related scientists, Introduction to Hierarchical Bayesian Modeling for Ecological Data gives readers a flexible and effective framework to learn about complex ecological processes from various sources of data. It also helps readers get started on building their own statistical models. The text begins with simple models that progressively become more complex and realistic through explanatory covariates and intermediate hidden states variables. When fitting the models to data, the authors gradually present the concepts a

  20. The Cost-Effectiveness of Low-Cost Essential Antihypertensive Medicines for Hypertension Control in China: A Modelling Study.

    Dongfeng Gu


    Full Text Available Hypertension is China's leading cardiovascular disease risk factor. Improved hypertension control in China would result in enormous health gains in the world's largest population. A computer simulation model projected the cost-effectiveness of hypertension treatment in Chinese adults, assuming a range of essential medicines list drug costs. The Cardiovascular Disease Policy Model-China, a Markov-style computer simulation model, simulated hypertension screening, essential medicines program implementation, hypertension control program administration, drug treatment and monitoring costs, disease-related costs, and quality-adjusted life years (QALYs) gained by preventing cardiovascular disease or lost because of drug side effects in untreated hypertensive adults aged 35-84 y over 2015-2025. Cost-effectiveness was assessed in cardiovascular disease patients (secondary prevention) and for two blood pressure ranges in primary prevention (stage one, 140-159/90-99 mm Hg; stage two, ≥160/≥100 mm Hg). Treatment of isolated systolic hypertension and combined systolic and diastolic hypertension were modeled as a reduction in systolic blood pressure; treatment of isolated diastolic hypertension was modeled as a reduction in diastolic blood pressure. One-way and probabilistic sensitivity analyses explored ranges of antihypertensive drug effectiveness and costs, monitoring frequency, medication adherence, side effect severity, background hypertension prevalence, antihypertensive medication treatment, case fatality, incidence and prevalence, and cardiovascular disease treatment costs. Median antihypertensive costs from Shanghai and Yunnan province were entered into the model in order to estimate the effects of very low and high drug prices. Incremental cost-effectiveness ratios less than the per capita gross domestic product of China (11,900 international dollars [Int$] in 2015) were considered cost-effective. Treating hypertensive adults with prior

  1. Bayesian analysis of recursive SVAR models with overidentifying restrictions

    Kociecki, Andrzej; Rubaszek, Michał; Ca' Zorzi, Michele


    The paper provides a novel Bayesian methodological framework to estimate structural VAR (SVAR) models with recursive identification schemes that allows for the inclusion of over-identifying restrictions. The proposed framework enables the researcher to (i) elicit the prior on the non-zero contemporaneous relations between economic variables and to (ii) derive an analytical expression for the posterior distribution and marginal data density. We illustrate our methodological framework by estima...

  2. Differential gene co-expression networks via Bayesian biclustering models

    Gao, Chuan; Zhao, Shiwen; McDowell, Ian C.; Brown, Christopher D.; Engelhardt, Barbara E.


    Identifying latent structure in large data matrices is essential for exploring biological processes. Here, we consider recovering gene co-expression networks from gene expression data, where each network encodes relationships between genes that are locally co-regulated by shared biological mechanisms. To do this, we develop a Bayesian statistical model for biclustering to infer subsets of co-regulated genes whose covariation may be observed in only a subset of the samples. Our biclustering me...

  3. Bayesian parsimonious covariance estimation for hierarchical linear mixed models

    Frühwirth-Schnatter, Sylvia; Tüchler, Regina


    We considered a non-centered parameterization of the standard random-effects model, which is based on the Cholesky decomposition of the variance-covariance matrix. The regression type structure of the non-centered parameterization allows us to choose a simple, conditionally conjugate normal prior on the Cholesky factor. Based on the non-centered parameterization, we search for a parsimonious variance-covariance matrix by identifying the non-zero elements of the Cholesky factors using Bayesian va...

  4. Diffusion Estimation Of State-Space Models: Bayesian Formulation

    Dedecius, Kamil

    Reims: IEEE, 2014. ISBN 978-1-4799-3693-9. [The 24th IEEE International Workshop on Machine Learning for Signal Processing (MLSP2014). Reims (FR), 21.09.2014-24.09.2014] R&D Projects: GA ČR(CZ) GP14-06678P Keywords: distributed estimation * state-space models * Bayesian estimation Subject RIV: BB - Applied Statistics, Operational Research

  5. Bayesian Methods for Neural Networks and Related Models

    Titterington, D.M.


    Models such as feed-forward neural networks and certain other structures investigated in the computer science literature are not amenable to closed-form Bayesian analysis. The paper reviews the various approaches taken to overcome this difficulty, involving the use of Gaussian approximations, Markov chain Monte Carlo simulation routines and a class of non-Gaussian but “deterministic” approximations called variational approximations.

  6. Bayesian network models in brain functional connectivity analysis

    Ide, Jaime S.; Zhang, Sheng; Li, Chiang-shan R.


    Much effort has been made to better understand the complex integration of distinct parts of the human brain using functional magnetic resonance imaging (fMRI). Altered functional connectivity between brain regions is associated with many neurological and mental illnesses, such as Alzheimer and Parkinson diseases, addiction, and depression. In computational science, Bayesian networks (BN) have been used in a broad range of studies to model complex data set in the presence of uncertainty and wh...

  7. Bayesian Models of Learning and Reasoning with Relations

    Chen, Dawn


    How do humans acquire relational concepts such as larger, which are essential for analogical inference and other forms of high-level reasoning? Are they necessarily innate, or can they be learned from non-relational inputs? Using comparative relations as a model domain, we show that structured relations can be learned from unstructured inputs of realistic complexity, applying bottom-up Bayesian learning mechanisms that make minimal assumptions about innate representations. First, we introduce...

  8. Bayesian regression model for seasonal forecast of precipitation over Korea

    Jo, Seongil; Lim, Yaeji; Lee, Jaeyong; Kang, Hyun-Suk; Oh, Hee-Seok


    In this paper, we apply three different Bayesian methods to the seasonal forecasting of the precipitation in a region around Korea (32.5°N-42.5°N, 122.5°E-132.5°E). We focus on the precipitation of summer season (June-July-August; JJA) for the period of 1979-2007 using the precipitation produced by the Global Data Assimilation and Prediction System (GDAPS) as predictors. Through cross-validation, we demonstrate improvement for seasonal forecast of precipitation in terms of root mean squared error (RMSE) and linear error in probability space score (LEPS). The proposed methods yield RMSE of 1.09 and LEPS of 0.31 between the predicted and observed precipitations, while the prediction using GDAPS output only produces RMSE of 1.20 and LEPS of 0.33 for CPC Merged Analyzed Precipitation (CMAP) data. For station-measured precipitation data, the RMSE and LEPS of the proposed Bayesian methods are 0.53 and 0.29, while GDAPS output is 0.66 and 0.33, respectively. The methods seem to capture the spatial pattern of the observed precipitation. The Bayesian paradigm incorporates the model uncertainty as an integral part of modeling in a natural way. We provide a probabilistic forecast integrating model uncertainty.
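    The root mean squared error used above to compare forecasts is computed as follows; the observation and forecast series here are made up for illustration, not GDAPS or CMAP data.

```python
import math

# Root mean squared error between forecasts and observations.
def rmse(pred, obs):
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

obs = [3.1, 4.0, 2.5, 5.2]          # "observed" seasonal precipitation (made up)
raw_model = [2.0, 5.1, 3.4, 4.0]    # uncorrected model output (made up)
calibrated = [2.8, 4.3, 2.8, 4.9]   # post-processed forecast (made up)

print(rmse(raw_model, obs), rmse(calibrated, obs))
```

    A lower RMSE for the post-processed series, as in the paper's comparison against raw GDAPS output, indicates the calibration step is adding skill.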

  9. Statistical modelling of railway track geometry degradation using hierarchical Bayesian models

    Andrade, António Ramos; Teixeira, P. Fonseca


    Railway maintenance planners require a predictive model that can assess the railway track geometry degradation. The present paper uses a hierarchical Bayesian model as a tool to model the main two quality indicators related to railway track geometry degradation: the standard deviation of longitudinal level defects and the standard deviation of horizontal alignment defects. Hierarchical Bayesian Models (HBM) are flexible statistical models that allow specifying different spatially correlated c...

  10. The comparative cost-effectiveness of an equity-focused approach to child survival, health, and nutrition: a modelling approach.

    Carrera, Carlos; Azrack, Adeline; Begkoyian, Genevieve; Pfaffmann, Jerome; Ribaira, Eric; O'Connell, Thomas; Doughty, Patricia; Aung, Kyaw Myint; Prieto, Lorena; Rasanathan, Kumanan; Sharkey, Alyssa; Chopra, Mickey; Knippenberg, Rudolf


    Progress on child mortality and undernutrition has seen widening inequities and a concentration of child deaths and undernutrition in the most deprived communities, threatening the achievement of the Millennium Development Goals. Conversely, a series of recent process and technological innovations have provided effective and efficient options to reach the most deprived populations. These trends raise the possibility that the perceived trade-off between equity and efficiency no longer applies for child health--that prioritising services for the poorest and most marginalised is now more effective and cost effective than mainstream approaches. We tested this hypothesis with a mathematical-modelling approach by comparing the cost-effectiveness in terms of child deaths and stunting events averted between two approaches (from 2011-15 in 14 countries and one province): an equity-focused approach that prioritises the most deprived communities, and a mainstream approach that is representative of current strategies. We combined some existing models, notably the Marginal Budgeting for Bottlenecks Toolkit and the Lives Saved Tool, to do our analysis. We showed that, with the same level of investment, disproportionately higher effects are possible by prioritising the poorest and most marginalised populations, for averting both child mortality and stunting. Our results suggest that an equity-focused approach could result in sharper decreases in child mortality and stunting and higher cost-effectiveness than mainstream approaches, while reducing inequities in effective intervention coverage, health outcomes, and out-of-pocket spending between the most and least deprived groups and geographic areas within countries. Our findings should be interpreted with caution due to uncertainties around some of the model parameters and baseline data. Further research is needed to address some of these gaps in the evidence base. Strategies for improving child nutrition and survival, however

  11. AIC, BIC, Bayesian evidence against the interacting dark energy model

    Szydlowski, Marek [Jagiellonian University, Astronomical Observatory, Krakow (Poland); Jagiellonian University, Mark Kac Complex Systems Research Centre, Krakow (Poland); Krawiec, Adam [Jagiellonian University, Institute of Economics, Finance and Management, Krakow (Poland); Jagiellonian University, Mark Kac Complex Systems Research Centre, Krakow (Poland); Kurek, Aleksandra [Jagiellonian University, Astronomical Observatory, Krakow (Poland); Kamionka, Michal [University of Wroclaw, Astronomical Institute, Wroclaw (Poland)


    Recent astronomical observations have indicated that the Universe is in a phase of accelerated expansion. While there are many cosmological models which try to explain this phenomenon, we focus on the interacting ΛCDM model where an interaction between the dark energy and dark matter sectors takes place. This model is compared to its simpler alternative - the ΛCDM model. To choose between these models the likelihood ratio test was applied as well as the model comparison methods (employing Occam's principle): the Akaike information criterion (AIC), the Bayesian information criterion (BIC) and the Bayesian evidence. Using the current astronomical data: type Ia supernova (Union2.1), h(z), baryon acoustic oscillation, the Alcock-Paczynski test, and the cosmic microwave background data, we evaluated both models. The analyses based on the AIC indicated that there is less support for the interacting ΛCDM model when compared to the ΛCDM model, while those based on the BIC indicated that there is strong evidence against it in favor of the ΛCDM model. Given the weak or almost non-existing support for the interacting ΛCDM model and bearing in mind Occam's razor we are inclined to reject this model. (orig.)
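    The information criteria used above follow the standard definitions AIC = 2k - 2 ln L and BIC = k ln n - 2 ln L, where k is the number of free parameters, n the number of data points, and ln L the maximized log-likelihood. The log-likelihoods and sample size below are invented to illustrate how a small likelihood gain can fail to justify an extra parameter; they are not values from the paper.

```python
import math

# Standard AIC and BIC formulas; lower values indicate the preferred model.
def aic(loglik, k):
    return 2 * k - 2 * loglik

def bic(loglik, k, n):
    return k * math.log(n) - 2 * loglik

# Two nested models: the extra parameter buys only a small likelihood gain.
n = 580  # e.g. number of data points (assumed)
simple = (aic(loglik=-270.0, k=2), bic(loglik=-270.0, k=2, n=n))
extended = (aic(loglik=-269.5, k=3), bic(loglik=-269.5, k=3, n=n))
```

    Because BIC's penalty grows with ln n, it punishes the extra parameter more severely than AIC, which matches the paper's finding that BIC speaks more strongly against the interacting model.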

  12. AIC, BIC, Bayesian evidence against the interacting dark energy model

    Szydłowski, Marek [Astronomical Observatory, Jagiellonian University, Orla 171, 30-244, Kraków (Poland); Mark Kac Complex Systems Research Centre, Jagiellonian University, Reymonta 4, 30-059, Kraków (Poland)]; Krawiec, Adam [Institute of Economics, Finance and Management, Jagiellonian University, Łojasiewicza 4, 30-348, Kraków (Poland); Mark Kac Complex Systems Research Centre, Jagiellonian University, Reymonta 4, 30-059, Kraków (Poland)]; Kurek, Aleksandra [Astronomical Observatory, Jagiellonian University, Orla 171, 30-244, Kraków (Poland)]; Kamionka, Michał [Astronomical Institute, University of Wrocław, ul. Kopernika 11, 51-622, Wrocław (Poland)]


    Recent astronomical observations have indicated that the Universe is in a phase of accelerated expansion. While there are many cosmological models which try to explain this phenomenon, we focus on the interacting ΛCDM model where an interaction between the dark energy and dark matter sectors takes place. This model is compared to its simpler alternative—the ΛCDM model. To choose between these models the likelihood ratio test was applied as well as the model comparison methods (employing Occam’s principle): the Akaike information criterion (AIC), the Bayesian information criterion (BIC) and the Bayesian evidence. Using the current astronomical data: type Ia supernova (Union2.1), h(z), baryon acoustic oscillation, the Alcock–Paczynski test, and the cosmic microwave background data, we evaluated both models. The analyses based on the AIC indicated that there is less support for the interacting ΛCDM model when compared to the ΛCDM model, while those based on the BIC indicated that there is strong evidence against it in favor of the ΛCDM model. Given the weak or almost non-existing support for the interacting ΛCDM model and bearing in mind Occam’s razor we are inclined to reject this model.
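    The model comparison the record describes reduces to simple arithmetic once the maximized log-likelihoods are in hand. A minimal sketch follows; the log-likelihoods, parameter counts and sample size below are hypothetical placeholders for illustration, not values from the paper:

```python
import math

def aic(log_lik, k):
    # Akaike information criterion: 2k - 2 ln L
    return 2 * k - 2 * log_lik

def bic(log_lik, k, n):
    # Bayesian information criterion: k ln n - 2 ln L
    return k * math.log(n) - 2 * log_lik

# Hypothetical maximized log-likelihoods and parameter counts,
# purely for illustration (not values from the paper).
n = 580                        # data points, e.g. a SN Ia sample size
ll_lcdm, k_lcdm = -273.5, 2    # LCDM
ll_int, k_int = -273.1, 3      # interacting LCDM: one extra parameter

delta_aic = aic(ll_int, k_int) - aic(ll_lcdm, k_lcdm)
delta_bic = bic(ll_int, k_int, n) - bic(ll_lcdm, k_lcdm, n)
# Positive differences mean the criteria penalise the extra
# interaction parameter more than the small gain in fit.
```

    Note how the BIC difference exceeds the AIC difference for any realistic sample size (ln n > 2), which is why the two criteria can support the simpler model with different strength, as in the abstract.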

  14. Dissecting Magnetar Variability with Bayesian Hierarchical Models

    Huppenkothen, Daniela; Brewer, Brendon J.; Hogg, David W.; Murray, Iain; Frean, Marcus; Elenbaas, Chris; Watts, Anna L.; Levin, Yuri; van der Horst, Alexander J.; Kouveliotou, Chryssa


    Neutron stars are a prime laboratory for testing physical processes under conditions of strong gravity, high density, and extreme magnetic fields. Among the zoo of neutron star phenomena, magnetars stand out for their bursting behavior, ranging from extremely bright, rare giant flares to numerous, less energetic recurrent bursts. The exact trigger and emission mechanisms for these bursts are not known; favored models involve either a crust fracture and subsequent energy release into the magnetosphere, or explosive reconnection of magnetic field lines. In the absence of a predictive model, understanding the physical processes responsible for magnetar burst variability is difficult. Here, we develop an empirical model that decomposes magnetar bursts into a superposition of small spike-like features with a simple functional form, where the number of model components is itself part of the inference problem. The cascades of spikes that we model might be formed by avalanches of reconnection, or crust rupture aftershocks. Using Markov Chain Monte Carlo sampling augmented with reversible jumps between models with different numbers of parameters, we characterize the posterior distributions of the model parameters and the number of components per burst. We relate these model parameters to physical quantities in the system, and show for the first time that the variability within a burst does not conform to predictions from ideas of self-organized criticality. We also examine how well the properties of the spikes fit the predictions of simplified cascade models for the different trigger mechanisms.

  16. Farmed deer: A veterinary model for chronic mycobacterial diseases that is accessible, appropriate and cost-effective

    Frank Griffin


    Although most studies in immunology have used inbred mice as the experimental model for probing fundamental immune mechanisms, such models have proven limited in their ability to chart complex functional immune pathways of the kind seen in outbred populations of humans or animals. Translation of findings from inbred mouse studies into practical solutions in therapeutics or the clinic has been remarkably unproductive compared with many other areas of clinical practice in human and veterinary medicine. Access to an unlimited array of mouse strains and an increasing number of genetically modified strains continues to sustain their paramount position in immunology research. Since mouse studies have provided little more than the dictionary and glossary of immunology, another approach will be required to write the classic exposition of functional immunity. Domestic animals such as ruminants and swine present worthwhile alternatives as models for immunological research into infectious diseases, and may be more informative and cost effective. The original constraint on large animal research, a lack of reagents, has been superseded by new molecular technologies and robotics that allow research to progress seamlessly from gene discovery to systems biology. The current review attempts to highlight how exotic animals such as deer can leverage the knowledge of ruminant genomics to provide cost-effective models for research into complex, chronic infections. The unique opportunity they provide relates to their diversity and polymorphic genotypes and the integrity of their phenotype for a range of infectious diseases.

  17. Dynamic model based on Bayesian method for energy security assessment

    Highlights: • Methodology for dynamic indicator model construction and forecasting of indicators. • Application of the dynamic indicator model to energy system development scenarios. • Expert judgement involvement using the Bayesian method. - Abstract: The methodology for dynamic indicator model construction and the forecasting of indicators for the assessment of energy security level is presented in this article. An indicator is a special index that provides numerical values for factors important to the investigated area. In real life, models of different processes take into account various factors that are time-dependent and dependent on each other. Thus, it is advisable to construct a dynamic model in order to describe these dependences. The energy security indicators are used as factors in the dynamic model. Usually, the values of indicators are obtained from statistical data. The developed dynamic model enables forecasting of indicator variation, taking into account changes in system configuration. Energy system development is usually based on the construction of a new object. Since the parameters of changes to the new system are not exactly known, information about their influence on the indicators cannot be incorporated into the model by deterministic methods. Thus, the dynamic indicator model based on historical data is adjusted by a probabilistic model of the influence of new factors on the indicators, using the Bayesian method.

  18. A Bayesian Network View on Nested Effects Models

    Fröhlich Holger


    Nested effects models (NEMs) are a class of probabilistic models that were designed to reconstruct a hidden signalling structure from a large set of observable effects caused by active interventions into the signalling pathway. We give a more flexible formulation of NEMs in the language of Bayesian networks. Our framework constitutes a natural generalization of the original NEM model, since it explicitly states the assumptions that are tacitly underlying the original version. Our approach gives rise to new learning methods for NEMs, which have been implemented in the R/Bioconductor package nem. We validate these methods in a simulation study and apply them to a synthetic lethality dataset in yeast.

  19. Probe Error Modeling Research Based on Bayesian Network

    Wu Huaiqiang; Xing Zilong; Zhang Jian; Yan Yan


    Probe calibration is carried out under specific conditions; most of the error caused by changes in the speed parameter has not been corrected. In order to reduce the influence of this error on measurement accuracy, this article analyzes the relationship between the speed parameter and probe error, and uses a Bayesian network to establish a model of the probe error. The model takes account of both prior knowledge and sample data; as data are updated, it can reflect changes in the probe error and continually revise the modeling results.
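    The prior-plus-data updating this record describes can be sketched with the simplest conjugate case, a Beta-Binomial model for an error rate; the prior pseudo-counts and calibration figures below are invented for illustration and are not from the paper:

```python
def update_beta(alpha, beta, errors, trials):
    """Beta-Binomial conjugate update: the posterior simply adds the
    observed counts to the prior pseudo-counts."""
    return alpha + errors, beta + (trials - errors)

# Hypothetical prior: roughly a 2% out-of-tolerance rate expected
# from earlier calibrations (these numbers are invented).
alpha, beta = 2.0, 98.0

# New calibration run at a changed probing speed: 5 errors in 100 touches.
alpha, beta = update_beta(alpha, beta, errors=5, trials=100)

post_mean = alpha / (alpha + beta)   # updated error-rate estimate
```

    Each new calibration run can be folded in by calling `update_beta` again, which is the "constantly revised modeling results" behaviour the abstract attributes to the Bayesian network.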

  1. Bayesian inference and model comparison for metallic fatigue data

    Babuška, Ivo; Sawlan, Zaid; Scavino, Marco; Szabó, Barna; Tempone, Raúl


    In this work, we present a statistical treatment of stress-life (S-N) data drawn from a collection of records of fatigue experiments that were performed on 75S-T6 aluminum alloys. Our main objective is to predict the fatigue life of materials by providing a systematic approach to model calibration, model selection and model ranking with reference to S-N data. To this purpose, we consider fatigue-limit models and random fatigue-limit models that are specially designed to allow the treatment of the run-outs (right-censored data). We first fit the models to the data by maximum likelihood methods and estimate the quantiles of the life distribution of the alloy specimen. To assess the robustness of the estimation of the quantile functions, we obtain bootstrap confidence bands by stratified resampling with respect to the cycle ratio. We then compare and rank the models by classical measures of fit based on information criteria. We also consider a Bayesian approach that provides, under the prior distribution of the model parameters selected by the user, their simulation-based posterior distributions. We implement and apply Bayesian model comparison methods, such as Bayes factor ranking and predictive information criteria based on cross-validation techniques under various a priori scenarios.

  2. A Bayesian Model for Discovering Typological Implications

    Daumé, Hal


    A standard form of analysis for linguistic typology is the universal implication. These implications state facts about the range of extant languages, such as ``if objects come after verbs, then adjectives come after nouns.'' Such implications are typically discovered by painstaking hand analysis over a small sample of languages. We propose a computational model for assisting at this process. Our model is able to discover both well-known implications as well as some novel implications that deserve further study. Moreover, through a careful application of hierarchical analysis, we are able to cope with the well-known sampling problem: languages are not independent.

  3. DPpackage: Bayesian Semi- and Nonparametric Modeling in R

    Alejandro Jara


    Data analysis sometimes requires the relaxation of parametric assumptions in order to gain modeling flexibility and robustness against mis-specification of the probability model. In the Bayesian context, this is accomplished by placing a prior distribution on a function space, such as the space of all probability distributions or the space of all regression functions. Unfortunately, posterior distributions ranging over function spaces are highly complex and hence sampling methods play a key role. This paper provides an introduction to a simple, yet comprehensive, set of programs for the implementation of some Bayesian nonparametric and semiparametric models in R, DPpackage. Currently, DPpackage includes models for marginal and conditional density estimation, receiver operating characteristic curve analysis, interval-censored data, binary regression data, item response data, longitudinal and clustered data using generalized linear mixed models, and regression data using generalized additive models. The package also contains functions to compute pseudo-Bayes factors for model comparison and for eliciting the precision parameter of the Dirichlet process prior, and a general purpose Metropolis sampling algorithm. To maximize computational efficiency, the actual sampling for each model is carried out using compiled C, C++ or Fortran code.

  4. KNET: Integrating Hypermedia and Bayesian Modeling

    Chavez, R. Martin; Cooper, Gregory F.


    KNET is a general-purpose shell for constructing expert systems based on belief networks and decision networks. Such networks serve as graphical representations for decision models, in which the knowledge engineer must define clearly the alternatives, states, preferences, and relationships that constitute a decision basis. KNET contains a knowledge-engineering core written in Object Pascal and an interface that tightly integrates HyperCard, a hypertext authoring tool for the Apple Macintosh c...

  5. An Assessment of the Expected Cost-Effectiveness of Quadrivalent Influenza Vaccines in Ontario, Canada Using a Static Model.

    Ayman Chit

    Ontario, Canada, immunizes against influenza using a trivalent inactivated influenza vaccine (IIV3) under a Universal Influenza Immunization Program (UIIP). The UIIP offers IIV3 free-of-charge to all Ontarians over 6 months of age. A newly approved quadrivalent inactivated influenza vaccine (IIV4) offers wider protection against influenza B disease. We explored the expected cost-utility and budget impact of replacing IIV3 with IIV4, within the context of Ontario's UIIP, using a probabilistic and static cost-utility model. Wherever possible, epidemiological and cost data were obtained from Ontario sources. Canadian or U.S. sources were used when Ontario data were not available. Vaccine efficacy for IIV3 was obtained from the literature. IIV4 efficacy was derived from meta-analysis of strain-specific vaccine efficacy. Conservatively, herd protection was not considered. In the base case, we used IIV3 and IIV4 prices of $5.5/dose and $7/dose, respectively. We conducted a sensitivity analysis on the price of IIV4, as well as standard univariate and multivariate statistical uncertainty analyses. Over a typical influenza season, relative to IIV3, IIV4 is expected to avert an additional 2,516 influenza cases, 1,683 influenza-associated medical visits, 27 influenza-associated hospitalizations, and 5 influenza-associated deaths. From a societal perspective, IIV4 would generate 76 more Quality Adjusted Life Years (QALYs) and a net societal budget impact of $4,784,112. The incremental cost-effectiveness ratio for this comparison was $63,773/QALY. IIV4 remains cost-effective up to a 53% price premium over IIV3. A probabilistic sensitivity analysis showed that IIV4 was cost-effective with a probability of 65% for a threshold of $100,000/QALY gained. IIV4 is expected to achieve reductions in influenza-related morbidity and mortality compared to IIV3. Despite not accounting for herd protection, IIV4 is still expected to be a cost-effective alternative to IIV3 up to a 53% price premium.
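    The headline figure in this record is a plain incremental cost-effectiveness ratio. Using the societal-perspective numbers quoted in the abstract gives roughly the reported value; the small gap from the published $63,773/QALY presumably reflects rounding in the inputs:

```python
def icer(delta_cost, delta_effect):
    """Incremental cost-effectiveness ratio: extra cost per unit of
    extra effect (here, per QALY gained)."""
    return delta_cost / delta_effect

# Societal-perspective figures quoted in the abstract.
delta_cost = 4_784_112   # net budget impact of IIV4 vs IIV3 (CAD)
delta_qalys = 76         # additional QALYs from IIV4

ratio = icer(delta_cost, delta_qalys)   # roughly $63,000 per QALY
```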

  6. Is computer aided detection (CAD) cost effective in screening mammography? A model based on the CADET II study

    Wallis Matthew G


    Abstract Background Single reading with computer aided detection (CAD) is an alternative to double reading for detecting cancer in screening mammograms. The aim of this study is to investigate whether the use of a single reader with CAD is more cost-effective than double reading. Methods Based on data from the CADET II study, the cost-effectiveness of single reading with CAD versus double reading was measured in terms of cost per cancer detected. The cost (Pound (£), year 2007/08) of single reading with CAD versus double reading was estimated assuming a health and social service perspective and a 7 year time horizon. As the equipment cost varies according to the unit size, a separate analysis was conducted for high, average and low volume screening units. One-way sensitivity analyses were performed by varying the reading time, equipment and assessment cost, recall rate and reader qualification. Results CAD is cost increasing for all sizes of screening unit. The introduction of CAD is cost-increasing compared to double reading because the cost of CAD equipment, staff training and the higher assessment cost associated with CAD are greater than the saving in reading costs. The introduction of single reading with CAD, in place of double reading, would produce an additional cost of £227 and £253 per 1,000 women screened in high and average volume units respectively. In low volume screening units, the high cost of purchasing the equipment will result in an additional cost of £590 per 1,000 women screened. One-way sensitivity analysis showed that the factors having the greatest effect on the cost-effectiveness of CAD with single reading compared with double reading were the reading time and the reader's professional qualification (radiologist versus advanced practitioner). Conclusions Without improvements in CAD effectiveness (e.g. a decrease in the recall rate) CAD is unlikely to be a cost-effective alternative to double reading for mammography screening.

  7. Lack of confidence in approximate Bayesian computation model choice.

    Robert, Christian P; Cornuet, Jean-Marie; Marin, Jean-Michel; Pillai, Natesh S


    Approximate Bayesian computation (ABC) has become an essential tool for the analysis of complex stochastic models. Grelaud et al. [(2009) Bayesian Anal 3:427-442] advocated the use of ABC for model choice in the specific case of Gibbs random fields, relying on an intermodel sufficiency property to show that the approximation was legitimate. We implemented ABC model choice in a wide range of phylogenetic models in the Do It Yourself-ABC (DIY-ABC) software [Cornuet et al. (2008) Bioinformatics 24:2713-2719]. We now present arguments as to why theoretical guarantees for ABC model choice are missing, because the algorithm involves an unknown loss of information induced by the use of insufficient summary statistics. The approximation error of the posterior probabilities of the models under comparison may thus be unrelated to the computational effort spent in running an ABC algorithm. We then conclude that additional empirical verifications of the performance of the ABC procedure, such as those available in DIY-ABC, are necessary to conduct model choice. PMID:21876135
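    A toy rejection-ABC sampler illustrates the mechanics the record discusses. Here the model is a single Bernoulli parameter and the chosen summary (the success count) happens to be sufficient, so ABC recovers the exact posterior; the authors' point is precisely that this sufficiency is generally unavailable for model choice. All numbers below are invented:

```python
import random

random.seed(1)

def abc_rejection(observed_sum, n, n_sims=50000, tol=0):
    """Minimal rejection-ABC: keep prior draws whose simulated summary
    statistic falls within tol of the observed one."""
    accepted = []
    for _ in range(n_sims):
        p = random.random()                               # Uniform(0,1) prior
        sim = sum(random.random() < p for _ in range(n))  # simulate Bernoulli data
        if abs(sim - observed_sum) <= tol:
            accepted.append(p)
    return accepted

# Observed: 7 successes out of 10 trials (invented data).
draws = abc_rejection(observed_sum=7, n=10)
post_mean = sum(draws) / len(draws)   # close to the exact Beta(8, 4) mean of 2/3
```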

  8. Bayesian analysis of physiologically based toxicokinetic and toxicodynamic models.

    Hack, C Eric


    Physiologically based toxicokinetic (PBTK) and toxicodynamic (TD) models of bromate in animals and humans would improve our ability to accurately estimate the toxic doses in humans based on available animal studies. These mathematical models are often highly parameterized and must be calibrated in order for the model predictions of internal dose to adequately fit the experimentally measured doses. Highly parameterized models are difficult to calibrate and it is difficult to obtain accurate estimates of uncertainty or variability in model parameters with commonly used frequentist calibration methods, such as maximum likelihood estimation (MLE) or least squared error approaches. The Bayesian approach called Markov chain Monte Carlo (MCMC) analysis can be used to successfully calibrate these complex models. Prior knowledge about the biological system and associated model parameters is easily incorporated in this approach in the form of prior parameter distributions, and the distributions are refined or updated using experimental data to generate posterior distributions of parameter estimates. The goal of this paper is to give the non-mathematician a brief description of the Bayesian approach and Markov chain Monte Carlo analysis, how this technique is used in risk assessment, and the issues associated with this approach. PMID:16466842
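    A minimal random-walk Metropolis sampler of the kind the record describes, run on a one-parameter toy posterior (standard normal prior times a normal likelihood) rather than the highly parameterized bromate PBTK model; everything here is an illustration of the technique, not of the paper's model:

```python
import math
import random

random.seed(0)

def log_post(theta):
    # Toy one-parameter posterior: N(0, 1) prior times a N(1.0, variance 0.25)
    # likelihood; the exact posterior is N(0.8, variance 0.2).
    return -0.5 * theta ** 2 - 0.5 * (theta - 1.0) ** 2 / 0.25

def metropolis(n_iter=20000, step=0.5):
    theta, samples = 0.0, []
    for _ in range(n_iter):
        proposal = theta + random.gauss(0.0, step)
        # Accept with probability min(1, posterior ratio).
        if math.log(random.random()) < log_post(proposal) - log_post(theta):
            theta = proposal
        samples.append(theta)
    return samples[5000:]   # discard burn-in

draws = metropolis()
post_mean = sum(draws) / len(draws)   # close to the exact value 0.8
```

    In a real PBTK calibration the scalar `theta` becomes a parameter vector, `log_post` sums the log-prior and the log-likelihood of the measured internal doses, and convergence diagnostics are run on multiple chains.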

  9. A study of finite mixture model: Bayesian approach on financial time series data

    Phoong, Seuk-Yen; Ismail, Mohd Tahir


    Recently, statisticians have emphasized fitting finite mixture models using Bayesian methods. A finite mixture model represents a statistical distribution as a mixture of component distributions, and the Bayesian method is used to fit such a model. The Bayesian method is widely used because it has asymptotic properties that provide remarkable results. In addition, the Bayesian method also shows a consistency characteristic, meaning that the parameter estimates are close to the predictive distributions. In the present paper, the number of components for the mixture model is chosen using the Bayesian Information Criterion. Identifying the number of components is important because a wrong choice may lead to an invalid result. The Bayesian method is then utilized to fit the k-component mixture model in order to explore the relationship between rubber price and stock market price for Malaysia, Thailand, the Philippines and Indonesia. The results showed that there is a negative relationship between rubber price and stock market price for all selected countries.

  10. Cost-effective choices of marine fuels in a carbon-constrained world: results from a global energy model.

    Taljegard, Maria; Brynolf, Selma; Grahn, Maria; Andersson, Karin; Johnson, Hannes


    The regionalized Global Energy Transition model has been modified to include a more detailed shipping sector in order to assess what marine fuels and propulsion technologies might be cost-effective by 2050 when achieving an atmospheric CO2 concentration of 400 or 500 ppm by the year 2100. The robustness of the results was examined in a Monte Carlo analysis, varying uncertain parameters and technology options, including the amount of primary energy resources, the availability of carbon capture and storage (CCS) technologies, and costs of different technologies and fuels. The four main findings are (i) it is cost-effective to start the phase out of fuel oil from the shipping sector in the next decade; (ii) natural gas-based fuels (liquefied natural gas and methanol) are the most probable substitutes during the study period; (iii) availability of CCS, the CO2 target, the liquefied natural gas tank cost and potential oil resources affect marine fuel choices significantly; and (iv) biofuels rarely play a major role in the shipping sector, due to limited supply and competition for bioenergy from other energy sectors. PMID:25286282

  11. A 'cost-effective' probabilistic model to select the dominant factors affecting the variation of the component failure rate

    Within the framework of a Probabilistic Safety Assessment (PSA), the component failure rate λ is a key parameter in the sense that the study of its behavior gives the essential information for estimating the current values as well as the trends in the failure probabilities of interest. Since there is an infinite variety of possible underlying factors which might cause changes in λ (e.g. operating time, maintenance practices, component environment, etc.), an 'importance ranking' process of these factors is considered most desirable to prioritize research efforts. To be 'cost-effective', the modeling effort must be small, i.e. essentially involving no estimation of additional parameters other than λ. In this paper, using a multivariate data analysis technique and various statistical measures, such a 'cost-effective' screening process has been developed. Dominant factors affecting the failure rate of any components of interest can easily be identified and the appropriateness of current research plans (e.g. on the necessity of performing aging studies) can be validated. (author)

  12. Macroeconomic Forecasts in Models with Bayesian Averaging of Classical Estimates

    Piotr Białowolski


    The aim of this paper is to construct a forecasting model oriented towards predicting basic macroeconomic variables, namely: the GDP growth rate, the unemployment rate, and consumer price inflation. In order to select the set of best regressors, Bayesian Averaging of Classical Estimators (BACE) is employed. The models are atheoretical (i.e. they do not reflect causal relationships postulated by macroeconomic theory), and the role of regressors is played by business and consumer tendency survey-based indicators. Additionally, survey-based indicators are included with a lag that enables forecasting the variables of interest (GDP, unemployment, and inflation) for the four forthcoming quarters without the need to make any additional assumptions concerning the values of predictor variables in the forecast period. Bayesian Averaging of Classical Estimators is a method allowing for a full and controlled overview of all econometric models that can be obtained from a particular set of regressors. In this paper the authors describe the method of generating a family of econometric models and the procedure for selecting a final forecasting model. Verification of the procedure is performed by means of out-of-sample forecasts of the main economic variables for the quarters of 2011. The accuracy of the forecasts implies that there is still a need to search for new solutions in atheoretical modelling.

  13. Bayesian joint modeling of longitudinal and spatial survival AIDS data.

    Martins, Rui; Silva, Giovani L; Andreozzi, Valeska


    Joint analysis of longitudinal and survival data has received increasing attention in the recent years, especially for analyzing cancer and AIDS data. As both repeated measurements (longitudinal) and time-to-event (survival) outcomes are observed in an individual, a joint modeling is more appropriate because it takes into account the dependence between the two types of responses, which are often analyzed separately. We propose a Bayesian hierarchical model for jointly modeling longitudinal and survival data considering functional time and spatial frailty effects, respectively. That is, the proposed model deals with non-linear longitudinal effects and spatial survival effects accounting for the unobserved heterogeneity among individuals living in the same region. This joint approach is applied to a cohort study of patients with HIV/AIDS in Brazil during the years 2002-2006. Our Bayesian joint model presents considerable improvements in the estimation of survival times of the Brazilian HIV/AIDS patients when compared with those obtained through a separate survival model and shows that the spatial risk of death is the same across the different Brazilian states. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26990773

  14. A Cost-Effective Model for Increasing Access to Mental Health Care at the Primary Care Level in Nigeria.

    Omigbodun, Olayinka O.


    BACKGROUND: Although effective treatment modalities for mental health problems currently exist in Nigeria, they remain irrelevant to the 70% of Nigeria's 120 million people who have no access to modern mental health care services. The nation's Health Ministry has adopted mental health as the 9th component of Primary Health Care (PHC) but ten years later, very little has been done to put this policy into practice. Mental Health is part of the training curriculum of PHC workers, but this appears to be money down the drain. AIMS OF THE STUDY: To review the weaknesses and problems with existing mode of mental health training for PHC workers with a view to developing a cost-effective model for integration. METHODS: A review and analysis of current training methods and their impact on the provision of mental health services in PHC in a rural and an urban local government area in Nigeria were done. An analysis of tested approaches for integrating mental health into PHC was carried out and a cost-effective model for the Nigerian situation based on these approaches and the local circumstances was derived. RESULTS: Virtually no mental health services are being provided at the PHC levels in the two local government areas studied. Current training is not effective and virtually none of what was learnt appears to be used by PHC workers in the field. Two models for integrating mental health into PHC emerged from the literature. Enhancement, which refers to the training of PHC personnel to carry out mental health care independently is not effective on its own and needs to be accompanied by supervision of PHC staff. Linkage, which occurs when mental health professionals leave their hospital bases to provide mental health care in PHC settings, requires a large number of skilled staff who are unavailable in Nigeria. 
In view of past experiences in Nigeria and other countries, a mixed enhancement-linkage model for mental health in PHC appears to be the most cost-effective approach for

  15. Modeling operational risks of the nuclear industry with Bayesian networks

    Basically, planning a new industrial plant requires information on industrial management, regulations, site selection, the definition of initial and planned capacity, and the estimation of potential demand. However, this is far from enough to assure the success of an industrial enterprise. Unexpected and extremely damaging events may occur that deviate from the original plan. The so-called operational risks lie not only in system, equipment, process or human (technical or managerial) failures. They also lie in intentional events such as fraud and sabotage, in extreme events like terrorist attacks or radiological accidents, and even in public reaction to perceived environmental or future-generation impacts. For the nuclear industry, it is a challenge to identify and assess the operational risks and their various sources. Early identification of operational risks can help in preparing contingency plans and in deciding whether to delay investment in, or approval of, a project that could, in the extreme, affect public perception of nuclear energy. A major problem in modeling operational risk losses is the lack of internal data, which are essential, for example, to apply the loss distribution approach. As an alternative, methods that consider qualitative and subjective information can be applied, for example fuzzy logic, neural networks, system dynamics or Bayesian networks. An advantage of applying Bayesian networks to model operational risk is the possibility to include expert opinions and variables of interest, to structure the model via causal dependencies among these variables, and to specify subjective prior and conditional probability distributions at each step or network node. This paper suggests a classification of operational risks in industry and discusses the benefits and obstacles of the Bayesian network approach to modeling those risks. (author)
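
    A minimal sketch of the kind of Bayesian network described above, with hypothetical expert-elicited numbers (not from any real plant): two root causes, fraud and technical failure, feed an operational-loss node, and inference is done by brute-force enumeration.

```python
# Expert-elicited priors and a conditional probability table (all values
# are illustrative assumptions, not real plant data).
P_fraud = {True: 0.02, False: 0.98}   # P(fraud)
P_tech = {True: 0.10, False: 0.90}    # P(technical failure)

# Conditional probability table: P(loss | fraud, tech)
P_loss = {
    (True, True): 0.95,
    (True, False): 0.70,
    (False, True): 0.40,
    (False, False): 0.01,
}

def prob_loss():
    """Marginal P(Loss) by summing over all root-cause states."""
    return sum(P_fraud[f] * P_tech[t] * P_loss[(f, t)]
               for f in (True, False) for t in (True, False))

def prob_fraud_given_loss():
    """Posterior P(Fraud | Loss) via Bayes' rule."""
    joint = sum(P_fraud[True] * P_tech[t] * P_loss[(True, t)]
                for t in (True, False))
    return joint / prob_loss()
```

    Observing a loss raises the probability of fraud from 2% to roughly 23% under these toy numbers, which is exactly the kind of diagnostic reasoning the network structure supports.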

  16. Finite element modelling approaches for well-ordered porous metallic materials for orthopaedic applications: cost effectiveness and geometrical considerations.

    Quevedo González, Fernando José; Nuño, Natalia


    The mechanical properties of well-ordered porous materials are related to their geometrical parameters at the mesoscale. Finite element (FE) analysis is a powerful tool for designing well-ordered porous materials by analysing their mechanical behaviour. However, FE models are often computationally expensive. This article aims to develop a cost-effective FE model to simulate well-ordered porous metallic materials for orthopaedic applications. Solid and beam FE modelling approaches are compared, using finite size and infinite media models with a cubic unit cell geometry. The model is then applied to compare two unit cell geometries: cubic and diamond. Finite size models provide results similar to the infinite media model approach for large sample sizes. In addition, these finite size models also capture the influence of the boundary conditions on the mechanical response for small sample sizes. The beam FE modelling approach had a much lower computational cost than the solid FE modelling approach while giving similar results. The diamond unit cell geometry appeared to be more suitable for orthopaedic applications than the cubic unit cell geometry. PMID:26260268

  17. Two-stage Bayesian models-application to ZEDB project

    Bunea, C. (George Washington University, School of Applied Science, 1776 G Street, NW, Suite 108, Washington, DC 20052, United States); Charitos, T. (Institute of Information and Computing Sciences, Padualaan 14, de Uithof, 3508 TB, Utrecht, Netherlands); Cooke, R.M. (Delft University of Technology, EWI Faculty, Mekelweg 4, 2628 CD, Delft, Netherlands; e-mail: r.m.cooke@ewi.tudelft.nl); Becker, G. (RISA, Krumme Str., Berlin 10627, Germany)


    A well-known mathematical tool for analyzing plant-specific reliability data for nuclear power facilities is the two-stage Bayesian model. Such two-stage Bayesian models are standard practice nowadays, for example in the German ZEDB project or in the Swedish T-Book, although they may differ in their mathematical models and software implementation. In this paper, we review the mathematical model, its underlying assumptions and supporting arguments. Reasonable conditional assumptions are made to yield a tractable and mathematically valid form for the failure rate at the plant of interest, given failures and operational times at other plants in the population. The posterior distribution of the failure rate at the plant of interest is sensitive to the choice of hyperprior parameters, since the effect of the hyperprior distribution is never dominated by the effect of the observations. The methods of Pörn and Jeffrey for choosing distributions over hyperparameters are discussed. Furthermore, we perform verification tasks associated with the theoretical model presented in this paper. The present software implementation produces good agreement with ZEDB results for various prior distributions. The differences between our results and those of ZEDB reflect differences that may arise from numerical implementation, such as the use of different step sizes and truncation bounds.
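
    A toy grid-based sketch of the two-stage idea, with made-up failure counts (not ZEDB data): each plant's rate λ_i gets a Gamma(a, b) first-stage prior, a discrete hyperprior over (a, b) forms the second stage, and the posterior mean rate at the plant of interest averages the conjugate result (a + x₀)/(b + t₀) over the hyperposterior weights.

```python
import math

# Hypothetical pooled data: failure counts x_i over operating times t_i;
# plant 0 is the plant of interest. Numbers are illustrative only.
x = [1, 0, 3, 2]
t = [10.0, 8.0, 25.0, 15.0]

def log_marglik(a, b):
    """log of prod_i  ∫ Poisson(x_i | λ t_i) Gamma(λ | a, b) dλ."""
    s = 0.0
    for xi, ti in zip(x, t):
        s += (a * math.log(b) - (a + xi) * math.log(b + ti)
              + math.lgamma(a + xi) - math.lgamma(a)
              + xi * math.log(ti) - math.lgamma(xi + 1))
    return s

# Flat hyperprior over a coarse (a, b) grid; weights from the marginal
# likelihood of all plants (this is the "second stage").
grid = [(a, b) for a in (0.5, 1.0, 2.0, 4.0) for b in (5.0, 10.0, 20.0, 40.0)]
logw = [log_marglik(a, b) for a, b in grid]
m = max(logw)
w = [math.exp(lw - m) for lw in logw]
Z = sum(w)

# Posterior mean failure rate at the plant of interest
post_mean = sum(wi * (a + x[0]) / (b + t[0])
                for wi, (a, b) in zip(w, grid)) / Z
```

    A real implementation would integrate over a continuous hyperprior (e.g. Pörn's choice) rather than a 16-point grid, but the averaging structure is the same.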

  19. Quantum-Like Bayesian Networks for Modeling Decision Making.

    Moreira, Catarina; Wichert, Andreas


    In this work, we explore an alternative quantum structure to perform quantum probabilistic inferences to accommodate the paradoxical findings of the Sure Thing Principle. We propose a Quantum-Like Bayesian Network, which replaces classical probabilities with quantum probability amplitudes. However, since this approach suffers from the problem of exponential growth of quantum parameters, we also propose a similarity heuristic that automatically fits the quantum parameters through vector similarities. This makes the proposed model general and predictive, in contrast to the current state-of-the-art models, which cannot be generalized to more complex decision scenarios and only provide explanations for the observed paradoxes. In the end, the model that we propose is a nonparametric method for estimating inference effects from a statistical point of view. It is a statistical model that is simpler than the previous quantum dynamic and quantum-like models proposed in the literature. We tested the proposed network on several empirical datasets from the literature, mainly from the Prisoner's Dilemma game and the Two Stage Gambling game. The results obtained show that the proposed quantum Bayesian Network is a general method that can accommodate violations of the laws of classical probability theory and make accurate predictions regarding human decision-making in these scenarios. PMID:26858669
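
    The core mechanism can be sketched in a few lines: replacing probabilities with amplitudes adds an interference term to the law of total probability, controlled by a phase parameter θ (the quantity the paper's similarity heuristic would set). All numbers below are illustrative.

```python
import math

p_a = 0.5                  # P(first gamble won)
p_b_given_a = 0.6          # P(play again | won)
p_b_given_not_a = 0.5      # P(play again | lost)

def classical(p_a, p_ba, p_bna):
    """Classical law of total probability."""
    return p_a * p_ba + (1 - p_a) * p_bna

def quantum_like(p_a, p_ba, p_bna, theta):
    """Quantum-like version: classical probability plus the interference
    term 2*sqrt(P(A)P(B|A)P(¬A)P(B|¬A))*cos(theta)."""
    interference = 2 * math.sqrt(p_a * p_ba * (1 - p_a) * p_bna) * math.cos(theta)
    return classical(p_a, p_ba, p_bna) + interference
```

    At θ = π/2 the interference vanishes and the classical result is recovered; other phases reproduce the under- or over-estimation seen in the paradox data.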

  20. Development of a cyber security risk model using Bayesian networks

    Cyber security is an emerging safety issue in the nuclear industry, especially in the instrumentation and control (I and C) field. To address the cyber security issue systematically, a model that can be used for cyber security evaluation is required. In this work, a cyber security risk model based on a Bayesian network is suggested for evaluating cyber security for nuclear facilities in an integrated manner. The suggested model enables the evaluation of both the procedural and technical aspects of cyber security, which are related to compliance with regulatory guides and system architectures, respectively. The activity-quality analysis model was developed to evaluate how well people and/or organizations comply with the regulatory guidance associated with cyber security. The architecture analysis model was created to evaluate vulnerabilities and mitigation measures with respect to their effect on cyber security. The two models are integrated into a single model, which is called the cyber security risk model, so that cyber security can be evaluated from procedural and technical viewpoints at the same time. The model was applied to evaluate the cyber security risk of the reactor protection system (RPS) of a research reactor and to demonstrate its usefulness and feasibility. - Highlights: • We developed a cyber security risk model that can find the weak points of cyber security by integrating two analysis models using a Bayesian network. • The activity-quality model signifies how well people and/or organizations comply with the cyber security regulatory guide. • The architecture model represents the probability of a cyber-attack on the RPS architecture. • The cyber security risk model can provide evidence to determine the key elements of cyber security for the RPS of a research reactor.

  1. Cost-Effectiveness of Interventions to Promote Physical Activity: A Modelling Study

    Cobiac, Linda J; Vos, Theo; Barendregt, Jan J


    Linda Cobiac and colleagues model the costs and health outcomes associated with interventions to improve physical activity in the population, and identify specific interventions that are likely to be cost-saving.

  2. A numerical model for cost effective mitigation of CO₂ in the EU with stochastic carbon sink

    Gren, Ing-Marie; Munnich, Miriam; Carlsson, Mattias; Elofsson, Katarina


    This paper presents a model for the analysis of the potential of carbon sinks in the EU Emissions Trading Scheme (ETS) under conditions of stochastic carbon sequestration by forest land. A partial equilibrium model is developed which takes into account both the ETS and national commitments. Chance constraint programming is used to analyze the role of stochastic carbon sinks for national and EU-wide costs as well as carbon allowance price. The results show that the inclusion of the carbon sink...
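
    The chance-constraint step described above has a simple deterministic equivalent under normality: a stochastic sink with mean μ and standard deviation σ may only be credited at the level that is met with probability α, i.e. credit = μ − z_α σ. A minimal sketch with illustrative numbers:

```python
from statistics import NormalDist

def sink_credit(mu, sigma, alpha):
    """Creditable carbon sink under Pr(actual sink >= credit) >= alpha,
    assuming a normally distributed sequestration outcome."""
    z = NormalDist().inv_cdf(alpha)   # one-sided safety margin
    return mu - z * sigma

# At alpha = 0.5 the full mean is credited; demanding higher reliability
# (e.g. alpha = 0.95) shrinks the creditable sink.
```

    This is why, in such models, raising the reliability level α reduces the cost-saving potential of the sink: less of the uncertain sequestration can be counted against the emissions cap.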


    David D. Hanagal


    In this paper, we study the compound Poisson distribution as the shared frailty distribution, with two different baseline distributions, namely the Pareto and linear failure rate distributions, for modeling survival data. We use the Markov Chain Monte Carlo (MCMC) technique to estimate the parameters of the proposed models within a Bayesian estimation procedure. In the present study, a simulation is done to compare the true values of the parameters with the estimated values. We fit the proposed models to a real-life bivariate survival data set of McGilchrist and Aisbett (1991) related to kidney infection. Also, we present a comparison study for the same data using a model selection criterion, and suggest the better frailty model of the two proposed.
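
    The MCMC machinery used for such frailty models can be illustrated on the simplest possible case: a random-walk Metropolis sampler for the rate of exponential survival times under a Gamma(a, b) prior. Because this case is conjugate, the sampler's answer can be checked against the exact posterior mean. Data and settings below are illustrative.

```python
import math
import random

random.seed(1)
times = [0.5, 1.2, 0.3, 2.0, 0.9, 1.5]   # illustrative event times
a, b = 1.0, 1.0                           # Gamma prior hyperparameters

def log_post(lam):
    """Unnormalized log posterior of the exponential rate."""
    if lam <= 0:
        return -math.inf
    loglik = len(times) * math.log(lam) - lam * sum(times)
    logprior = (a - 1) * math.log(lam) - b * lam
    return loglik + logprior

lam, draws = 1.0, []
for i in range(20000):
    prop = lam + random.gauss(0, 0.5)      # random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(lam):
        lam = prop                         # accept
    if i >= 5000:                          # discard burn-in
        draws.append(lam)

mcmc_mean = sum(draws) / len(draws)
exact_mean = (a + len(times)) / (b + sum(times))   # conjugate posterior mean
```

    The frailty models in the paper differ only in having more parameters and non-conjugate full conditionals; the accept/reject step is the same.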

  4. Experimental validation of a Bayesian model of visual acuity.

    Dalimier, Eugénie


    Based on standard procedures used in optometry clinics, we compare measurements of visual acuity for 10 subjects (11 eyes tested) in the presence of natural ocular aberrations and different degrees of induced defocus, with the predictions given by a Bayesian model customized with aberrometric data of the eye. The absolute predictions of the model, without any adjustment, show good agreement with the experimental data, in terms of correlation and absolute error. The efficiency of the model is discussed in comparison with image quality metrics and other customized visual process models. An analysis of the importance and customization of each stage of the model is also given; it stresses the potential high predictive power from precise modeling of ocular and neural transfer functions.

  5. Assessing global vegetation activity using spatio-temporal Bayesian modelling

    Mulder, Vera L.; van Eck, Christel M.; Friedlingstein, Pierre; Regnier, Pierre A. G.


    This work demonstrates the potential of modelling vegetation activity using a hierarchical Bayesian spatio-temporal model. This approach allows modelling changes in vegetation and climate simultaneously in space and time. Changes in vegetation activity, such as phenology, are modelled as a dynamic process depending on climate variability in both space and time. Additionally, differences in observed vegetation status can be attributed to other abiotic ecosystem properties, e.g. soil and terrain properties. Although these properties do not change in time, they do change in space and may provide valuable information in addition to the climate dynamics. The spatio-temporal Bayesian models were calibrated at a regional scale because the local trends in space and time can be better captured by the model. The regional subsets were defined according to the SREX segmentation, as defined by the IPCC. Each region is considered relatively homogeneous in terms of large-scale climate and biomes, while still capturing small-scale (grid-cell level) variability. Modelling within these regions is hence expected to be less uncertain, due to the absence of these large-scale patterns, than a global approach. This overall modelling approach allows the comparison of model behaviour between regions and may provide insights into the main dynamic processes driving the interaction between vegetation and climate within different regions. The data employed in this study encompass the global datasets for soil properties (SoilGrids), terrain properties (Global Relief Model based on SRTM DEM and ETOPO), monthly time series of satellite-derived vegetation indices (GIMMS NDVI3g) and climate variables (Princeton Meteorological Forcing Dataset). The findings demonstrated the potential of a spatio-temporal Bayesian modelling approach for assessing vegetation dynamics at a regional scale. 
The observed interrelationships of the employed data and the different spatial and temporal trends support

  6. Non-parametric Bayesian modeling of cervical mucus symptom

    Bin, Riccardo De; Scarpa, Bruno


    The analysis of the cervical mucus symptom is useful to identify the period of maximum fertility of a woman. In this paper we analyze the daily evolution of the cervical mucus symptom during the menstrual cycle, based on the data collected in two retrospective studies, in which the mucus symptom is treated as an ordinal variable. To produce our statistical model, we follow a non-parametric Bayesian approach. In particular, we use the idea of non-parametric mixtures of rounded continuous kerne...

  7. Bayesian statistical methods and their application in probabilistic simulation models

    Sergio Iannazzo


    Bayesian statistical methods are facing a rapidly growing level of interest and acceptance in the field of health economics. The reasons for this success probably lie in the theoretical foundations of the discipline, which make these techniques more appealing for decision analysis. To this should be added modern IT progress, which has produced several flexible and powerful statistical software frameworks; among the most notable is the BUGS language project and its standalone application for MS Windows, WinBUGS. The scope of this paper is to introduce the subject and to show some interesting applications of WinBUGS in developing complex economic models based on Markov chains. The advantages of this approach reside in the elegance of the code produced and in its capability to easily develop probabilistic simulations. Moreover, an example of the integration of Bayesian inference models in a Markov model is shown. This last feature lets the analyst conduct statistical analyses on the available sources of evidence and exploit them directly as inputs to the economic model.
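
    The probabilistic Markov-model idea can be sketched outside WinBUGS as well: draw the transition probabilities from distributions (second-order uncertainty), run the cohort for each draw, and summarize the resulting outcome distribution. The three-state structure and all parameter values below are illustrative.

```python
import random

random.seed(42)

def run_cohort(p_well_sick, p_sick_dead, cycles=20):
    """Life-years for a Well/Sick/Dead cohort Markov model."""
    well, sick, dead = 1.0, 0.0, 0.0
    life_years = 0.0
    for _ in range(cycles):
        life_years += well + sick            # credit the cycle to the living
        well, sick, dead = (well * (1 - p_well_sick),
                            well * p_well_sick + sick * (1 - p_sick_dead),
                            dead + sick * p_sick_dead)
    return life_years

# Probabilistic sensitivity analysis: sample transition probabilities from
# Beta distributions, propagate, and summarize.
samples = [run_cohort(random.betavariate(2, 18),    # mean ~0.10
                      random.betavariate(3, 27))    # mean ~0.10
           for _ in range(2000)]
mean_ly = sum(samples) / len(samples)
```

    In WinBUGS the Beta draws would instead come from posterior distributions fitted to the evidence, which is the integration of inference and economic model the paper highlights.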

  8. Bayesian calibration of power plant models for accurate performance prediction

    Highlights: • Bayesian calibration is applied to power plant performance prediction. • Measurements from a plant in operation are used for model calibration. • A gas turbine performance model and steam cycle model are calibrated. • An integrated plant model is derived. • Part load efficiency is accurately predicted as a function of ambient conditions. - Abstract: Gas turbine combined cycles are expected to play an increasingly important role in the balancing of supply and demand in future energy markets. Thermodynamic modeling of these energy systems is frequently applied to assist in decision making processes related to the management of plant operation and maintenance. In most cases, model inputs, parameters and outputs are treated as deterministic quantities and plant operators make decisions with limited or no regard of uncertainties. As the steady integration of wind and solar energy into the energy market induces extra uncertainties, part load operation and reliability are becoming increasingly important. In the current study, methods are proposed to not only quantify various types of uncertainties in measurements and plant model parameters using measured data, but to also assess their effect on various aspects of performance prediction. The authors aim to account for model parameter and measurement uncertainty, and for systematic discrepancy of models with respect to reality. For this purpose, the Bayesian calibration framework of Kennedy and O’Hagan is used, which is especially suitable for high-dimensional industrial problems. The article derives a calibrated model of the plant efficiency as a function of ambient conditions and operational parameters, which is also accurate in part load. The article shows that complete statistical modeling of power plants not only enhances process models, but can also increase confidence in operational decisions.

  9. Modelling the impact and cost-effectiveness of the HIV intervention programme amongst commercial sex workers in Ahmedabad, Gujarat, India

    Foss Anna M


    Abstract Background Ahmedabad is an industrial city in Gujarat, India. In 2003, the HIV prevalence among commercial sex workers (CSWs) in Ahmedabad reached 13.0%. In response, the Jyoti Sangh HIV prevention programme for CSWs was initiated, which involves outreach, peer education, condom distribution, and free STD clinics. Two surveys were performed among CSWs in 1999 and 2003. This study estimates the cost-effectiveness of the Jyoti Sangh HIV prevention programme. Methods A dynamic mathematical model was used with survey and intervention-specific data from Ahmedabad to estimate the HIV impact of the Jyoti Sangh project for the 51 months between the two CSW surveys. Uncertainty analysis was used to obtain different model fits to the HIV/STI epidemiological data, producing a range for the HIV impact of the project. Financial and economic costs of the intervention were estimated from the provider's perspective for the same time period. The cost per HIV infection averted was estimated. Results Over 51 months, projections suggest that the intervention averted 624 and 5,131 HIV cases among the CSWs and their clients, respectively. This equates to a 54% and 51% decrease in the HIV infections that would have occurred among the CSWs and clients without the intervention. In the absence of intervention, the model predicts that the HIV prevalence amongst the CSWs in 2003 would have been 26%, almost twice that with the intervention. The cost per HIV infection averted, excluding and including peer educator economic costs, was USD 59 and USD 98 respectively. Conclusion This study demonstrated that targeted CSW interventions in India can be cost-effective, and highlights the importance of replicating this effort in other similar settings.
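
    Assuming the reported unit costs are simply total programme cost divided by total infections averted (the usual convention, though the abstract does not state it), the implied programme totals can be recovered by multiplication:

```python
# Back-of-envelope check of the headline numbers (assumes the unit costs
# apply to the combined CSW + client total).
averted_csw, averted_clients = 624, 5131
averted = averted_csw + averted_clients          # total infections averted

implied_total_excl = 59 * averted   # USD, excluding peer-educator economic costs
implied_total_incl = 98 * averted   # USD, including them
```

    That is roughly USD 0.34M and USD 0.56M over 51 months, which puts the per-month programme scale in context.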

  10. One-Stage and Bayesian Two-Stage Optimal Designs for Mixture Models

    Lin, Hefang


    In this research, Bayesian two-stage D-D optimal designs for mixture experiments with or without process variables under model uncertainty are developed. A Bayesian optimality criterion is used in the first stage to minimize the determinant of the posterior variances of the parameters. The second-stage design is then generated according to an optimality procedure that incorporates the improved model from the first-stage data. Our results show that the Bayesian two-stage D-D optimal design...

  11. Uncovering Transcriptional Regulatory Networks by Sparse Bayesian Factor Model

    Qi, Yuan (Alan)


    Abstract The problem of uncovering transcriptional regulation by transcription factors (TFs) based on microarray data is considered. A novel Bayesian sparse correlated rectified factor model (BSCRFM) is proposed that models the unknown TF protein-level activity, the correlated regulations between TFs, and the sparse nature of TF-regulated genes. The model admits prior knowledge from existing databases regarding TF-regulated target genes through a sparse prior, and via a developed Gibbs sampling algorithm, a context-specific transcriptional regulatory network specific to the experimental condition of the microarray data can be obtained. The proposed model and the Gibbs sampling algorithm were evaluated on simulated systems, and the results demonstrated the validity and effectiveness of the proposed approach. The proposed model was then applied to breast cancer microarray data from patients with Estrogen Receptor positive (ER+) status and Estrogen Receptor negative (ER-) status, respectively.

  12. Efficient multilevel brain tumor segmentation with integrated bayesian model classification.

    Corso, J J; Sharon, E; Dube, S; El-Saden, S; Sinha, U; Yuille, A


    We present a new method for automatic segmentation of heterogeneous image data that takes a step toward bridging the gap between bottom-up affinity-based segmentation methods and top-down generative model based approaches. The main contribution of the paper is a Bayesian formulation for incorporating soft model assignments into the calculation of affinities, which are conventionally model free. We integrate the resulting model-aware affinities into the multilevel segmentation by weighted aggregation algorithm, and apply the technique to the task of detecting and segmenting brain tumor and edema in multichannel magnetic resonance (MR) volumes. The computationally efficient method runs orders of magnitude faster than current state-of-the-art techniques giving comparable or improved results. Our quantitative results indicate the benefit of incorporating model-aware affinities into the segmentation process for the difficult case of glioblastoma multiforme brain tumor. PMID:18450536

  13. Modelling the cost-effectiveness of mitigation methods for multiple pollutants at farm scale.

    Gooday, R D; Anthony, S G; Chadwick, D R; Newell-Price, P; Harris, D; Duethmann, D; Fish, R; Collins, A L; Winter, M


    Reductions in agricultural pollution are essential for meeting nationally and internationally agreed policy targets for losses to both air and water. Numerous studies quantify the impact of relevant mitigation methods by field experimentation or computer modelling. The majority of these studies have addressed individual methods, and frequently also individual pollutants. This paper presents a conceptual model for the synthesis of the evidence base to calculate the impact of multiple methods addressing multiple pollutants, in order to identify least-cost solutions for multiple policy objectives. The model is implemented as a farm scale decision support tool, Farmscoper, which quantifies baseline pollutant losses for identifiable sources, areas and pathways and incorporates a genetic-algorithm-based multi-objective procedure for determining optimal suites of mitigation methods. The tool is generic, as baseline losses can be replaced with measured data and the default library of mitigation methods can be edited and expanded. The tool is demonstrated through application to two contrasting farm systems, using survey data on agricultural practices typical of England and Wales. These examples show how the tool could be used to help target the adoption of mitigation options for the control of diffuse pollution from agriculture. Feedback from workshops where Farmscoper was demonstrated is included to highlight its potential role in the farm advisory process. PMID:23706481
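
    The optimisation step can be sketched with a toy method library: given per-method costs and pollutant reductions, find the cheapest subset meeting all reduction targets. The paper uses a genetic algorithm for realistic libraries; exhaustive search suffices here. Method names, costs, reductions (assumed additive, a simplification) are all illustrative.

```python
from itertools import combinations

methods = {
    "cover_crops":    {"cost": 40, "nitrate": 0.20, "phosphorus": 0.05},
    "buffer_strips":  {"cost": 25, "nitrate": 0.05, "phosphorus": 0.25},
    "manure_storage": {"cost": 60, "nitrate": 0.15, "phosphorus": 0.10},
}
targets = {"nitrate": 0.20, "phosphorus": 0.25}   # required fractional reductions

def cheapest_subset():
    """Exhaustively search all method subsets for the least-cost one that
    meets every pollutant target (reductions treated as additive)."""
    best, best_cost = None, float("inf")
    names = list(methods)
    for r in range(1, len(names) + 1):
        for subset in combinations(names, r):
            cost = sum(methods[m]["cost"] for m in subset)
            ok = all(sum(methods[m][p] for m in subset) >= t
                     for p, t in targets.items())
            if ok and cost < best_cost:
                best, best_cost = set(subset), cost
    return best, best_cost
```

    With a realistic library of tens of methods the subset space explodes, which is exactly why the tool swaps this brute-force loop for a genetic algorithm.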

  14. The perioperative surgical home: An innovative, patient-centred and cost-effective perioperative care model.

    Desebbe, Olivier; Lanz, Thomas; Kain, Zeev; Cannesson, Maxime


    Contrary to the intraoperative period, the current perioperative environment is known to be fragmented and expensive. One of the potential solutions to this problem is the newly proposed perioperative surgical home (PSH) model of care. The PSH is a patient-centred micro healthcare system, which begins at the time the decision for surgery is made, is continuous through the perioperative period and concludes 30 days after discharge from the hospital. The model is based on multidisciplinary involvement: coordination of care, consistent application of best evidence/best practice protocols, full transparency with continuous monitoring and reporting of safety, quality, and cost data to optimize and decrease variation in care practices. To reduce said variation in care, the entire continuum of the perioperative process must evolve into a unique care environment handled by one perioperative team and coordinated by a leader. Anaesthesiologists are ideally positioned to lead this new model and thus significantly contribute to the highest standards in transitional medicine. The unique characteristics that place anaesthesiologists in this framework include their systematic role in hospitals (as coordinators between patients/medical staff and institutions), the culture of safety and health care metrics innate to the specialty, and a significant role in the preoperative evaluation and counselling process, making them ideal leaders in perioperative medicine. PMID:26613678

  15. Emulation: A fast stochastic Bayesian method to eliminate model space

    Roberts, Alan; Hobbs, Richard; Goldstein, Michael


    Joint inversion of large 3D datasets has been the goal of geophysicists ever since such datasets first started to be produced. There are two broad approaches to this kind of problem: traditional deterministic inversion schemes, and more recently developed Bayesian search methods such as MCMC (Markov Chain Monte Carlo). However, both kinds of scheme have proved prohibitively expensive, in both computing power and time, due to the normally very large model space which needs to be searched using forward model simulators that take considerable time to run. At the heart of strategies aimed at accomplishing this kind of inversion is the question of how to reliably and practicably reduce the size of the model space in which the inversion is to be carried out. Here we present a practical Bayesian method, known as emulation, which can address this issue. Emulation is a Bayesian technique used with considerable success in a number of technical fields, such as astronomy, where the evolution of the universe has been modelled using this technique, and the petroleum industry, where history matching of hydrocarbon reservoirs is carried out. The method of emulation involves building a fast-to-compute, uncertainty-calibrated approximation to a forward model simulator. We do this by modelling the output data from a number of forward simulator runs with a computationally cheap function, and then fitting the coefficients defining this function to the model parameters. By calibrating the error of the emulator output with respect to the full simulator output, we can use this to screen out large areas of model space which contain only implausible models. For example, starting with what may be considered a geologically reasonable prior model space of 10000 models, using the emulator we can quickly show that only models which lie within 10% of that model space actually produce output data which is plausibly similar in character to an observed dataset. We can thus much
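
    A minimal one-parameter sketch of the emulation workflow: run an "expensive" simulator (a stand-in function here) a few times, fit a cheap quadratic surrogate, then screen candidate models by an implausibility measure, keeping only those whose emulated output is within a few standard deviations of the observation. All values are illustrative.

```python
def simulator(m):                 # expensive forward model (stand-in)
    return m * m + 2.0 * m

# A handful of "costly" training runs
train_m = [-2.0, -1.0, 0.0, 1.0, 2.0]
train_y = [simulator(m) for m in train_m]

def fit_quadratic(xs, ys):
    """Least-squares fit y ~ c0 + c1*x + c2*x^2 via the 3x3 normal equations."""
    S = [sum(x ** k for x in xs) for k in range(5)]
    T = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    A = [[S[i + j] for j in range(3)] for i in range(3)]
    for col in range(3):                      # Gaussian elimination with pivoting
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        T[col], T[piv] = T[piv], T[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            T[r] -= f * T[col]
    coef = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):                       # back substitution
        coef[r] = (T[r] - sum(A[r][c] * coef[c] for c in range(r + 1, 3))) / A[r][r]
    return coef

c0, c1, c2 = fit_quadratic(train_m, train_y)

def emulator(m):                  # fast-to-compute surrogate
    return c0 + c1 * m + c2 * m * m

# Screen model space: keep only models with implausibility < 3, i.e. whose
# emulated output is within 3 sigma of the observation.
observed, sigma = 3.0, 0.5
candidates = [i / 10.0 for i in range(-30, 31)]
plausible = [m for m in candidates
             if abs(observed - emulator(m)) / sigma < 3.0]
```

    In practice σ would combine observation error with the calibrated emulator error, and the surrogate would be a Gaussian-process-style fit over many parameters, but the screen-and-discard structure is the same.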

  16. Bayesian Dose-Response Modeling in Sparse Data

    Kim, Steven B.

    This book discusses Bayesian dose-response modeling in small samples applied to two different settings. The first setting is early phase clinical trials, and the second setting is toxicology studies in cancer risk assessment. In early phase clinical trials, experimental units are humans who are actual patients. Prior to a clinical trial, opinions from multiple subject area experts are generally more informative than the opinion of a single expert, but we may face a dilemma when they have disagreeing prior opinions. In this regard, we consider compromising the disagreement and compare two different approaches for making a decision. In addition to combining multiple opinions, we also address balancing two levels of ethics in early phase clinical trials. The first level is individual-level ethics which reflects the perspective of trial participants. The second level is population-level ethics which reflects the perspective of future patients. We extensively compare two existing statistical methods which focus on each perspective and propose a new method which balances the two conflicting perspectives. In toxicology studies, experimental units are living animals. Here we focus on a potential non-monotonic dose-response relationship which is known as hormesis. Briefly, hormesis is a phenomenon which can be characterized by a beneficial effect at low doses and a harmful effect at high doses. In cancer risk assessments, the estimation of a parameter, which is known as a benchmark dose, can be highly sensitive to a class of assumptions, monotonicity or hormesis. In this regard, we propose a robust approach which considers both monotonicity and hormesis as a possibility. In addition, we discuss statistical hypothesis testing for hormesis and consider various experimental designs for detecting hormesis based on Bayesian decision theory. Past experiments have not been optimally designed for testing for hormesis, and some Bayesian optimal designs may not be optimal under a

  17. Perceptual decision making: Drift-diffusion model is equivalent to a Bayesian model

    Sebastian Bitzer


    Behavioural data obtained with perceptual decision making experiments are typically analysed with the drift-diffusion model. This parsimonious model accumulates noisy pieces of evidence towards a decision bound to explain the accuracy and reaction times of subjects. Recently, Bayesian models have been proposed to explain how the brain extracts information from noisy input, as typically presented in perceptual decision making tasks. It has long been known that the drift-diffusion model is tightly linked with such functional Bayesian models, but the precise relationship of the two mechanisms was never made explicit. Using a Bayesian model, we derived the equations which relate parameter values between these models. In practice we show that this equivalence is useful when fitting multi-subject data. We further show that the Bayesian model suggests different decision variables which all predict equal responses, and discuss how these may be discriminated based on neural correlates of accumulated evidence. In addition, we discuss extensions to the Bayesian model which would be difficult to derive for the drift-diffusion model. We suggest that these and other extensions may be highly useful for deriving new experiments which test novel hypotheses.
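
    The core of the equivalence can be shown in a few lines for the textbook case of two Gaussian hypotheses N(+μ, s) vs N(−μ, s): each sample adds (2μ/s²)·x to the log posterior odds, so the Bayesian accumulator is a scaled random walk with drift, i.e. a discretized drift-diffusion process. Parameter values below are illustrative.

```python
import math
import random

random.seed(0)
mu, s = 0.5, 1.0   # hypothesis means +/- mu, common noise s

def llr_increment(x):
    """log [ N(x | +mu, s) / N(x | -mu, s) ]; algebra reduces this to
    2*mu*x / s**2 (the normalizing constants cancel)."""
    logp_plus = -((x - mu) ** 2) / (2 * s * s)
    logp_minus = -((x + mu) ** 2) / (2 * s * s)
    return logp_plus - logp_minus

# The explicit log-likelihood-ratio sum and the drift-diffusion-style
# closed form agree sample by sample.
xs = [random.gauss(mu, s) for _ in range(1000)]
explicit = sum(llr_increment(x) for x in xs)
closed_form = sum(2 * mu * x / (s * s) for x in xs)
```

    Thresholding this accumulated quantity at ±B reproduces the drift-diffusion model's decision bounds, with drift and diffusion coefficients given by the Gaussian parameters.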


    Anass BAYAGA


    Full Text Available The objective of this second part of a two-phased study was to explore the predictive power of the quantitative risk analysis (QRA) method and process within a Higher Education Institution (HEI). The method and process investigated the use of impact analysis via the Nicholas risk model and Bayesian analysis, with a sample of one hundred (100) risk analysts in a historically black South African University in the greater Eastern Cape Province. The first finding supported and confirmed previous literature (King III report, 2009; Nicholas and Steyn, 2008; Stoney, 2007; COSA, 2004) that there was a direct relationship between a risk factor, its likelihood and its impact, ceteris paribus. The second finding, in relation to controlling either the likelihood or the impact of occurrence of risk (Nicholas risk model), was that to obtain a better risk reward it was more important to control the likelihood of occurrence of risks than their impact, so as to have a direct effect on the entire University. On the Bayesian analysis, the third finding was that the impact of risk should be predicted along three aspects: the human impact (decisions made), the property impact (student and infrastructure based) and the business impact. Lastly, although business cycles vary considerably depending on the industry and/or the institution, the study revealed that most impacts in the HEI (University) occurred within the period of one academic year. The recommendation was that the application of quantitative risk analysis should be related to the current legislative framework that affects HEIs.

  19. Bayesian predictive modeling for genomic based personalized treatment selection.

    Ma, Junsheng; Stingo, Francesco C; Hobbs, Brian P


    Efforts to personalize medicine in oncology have been limited by reductive characterizations of the intrinsically complex underlying biological phenomena. Future advances in personalized medicine will rely on molecular signatures that derive from synthesis of multifarious interdependent molecular quantities requiring robust quantitative methods. However, highly parameterized statistical models when applied in these settings often require a prohibitively large database and are sensitive to proper characterizations of the treatment-by-covariate interactions, which in practice are difficult to specify and may be limited by generalized linear models. In this article, we present a Bayesian predictive framework that enables the integration of a high-dimensional set of genomic features with clinical responses and treatment histories of historical patients, providing a probabilistic basis for using the clinical and molecular information to personalize therapy for future patients. Our work represents one of the first attempts to define personalized treatment assignment rules based on large-scale genomic data. We use actual gene expression data acquired from The Cancer Genome Atlas in the settings of leukemia and glioma to explore the statistical properties of our proposed Bayesian approach for personalizing treatment selection. The method is shown to yield considerable improvements in predictive accuracy when compared to penalized regression approaches. PMID:26575856

  20. Accounting for Heterogeneity in Relative Treatment Effects for Use in Cost-Effectiveness Models and Value-of-Information Analyses.

    Welton, Nicky J; Soares, Marta O; Palmer, Stephen; Ades, Anthony E; Harrison, David; Shankar-Hari, Manu; Rowan, Kathy M


    Cost-effectiveness analysis (CEA) models are routinely used to inform health care policy. Key model inputs include relative effectiveness of competing treatments, typically informed by meta-analysis. Heterogeneity is ubiquitous in meta-analysis, and random effects models are usually used when there is variability in effects across studies. In the absence of observed treatment effect modifiers, various summaries from the random effects distribution (random effects mean, predictive distribution, random effects distribution, or study-specific estimate [shrunken or independent of other studies]) can be used depending on the relationship between the setting for the decision (population characteristics, treatment definitions, and other contextual factors) and the included studies. If covariates have been measured that could potentially explain the heterogeneity, then these can be included in a meta-regression model. We describe how covariates can be included in a network meta-analysis model and how the output from such an analysis can be used in a CEA model. We outline a model selection procedure to help choose between competing models and stress the importance of clinical input. We illustrate the approach with a health technology assessment of intravenous immunoglobulin for the management of adult patients with severe sepsis in an intensive care setting, which exemplifies how risk of bias information can be incorporated into CEA models. We show that the results of the CEA and value-of-information analyses are sensitive to the model and highlight the importance of sensitivity analyses when conducting CEA in the presence of heterogeneity. The methods presented extend naturally to heterogeneity in other model inputs, such as baseline risk. PMID:25712447
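The distinction drawn above between the random effects mean and the predictive distribution can be sketched with a standard DerSimonian-Laird calculation (the paper itself works in a Bayesian network meta-analysis framework; the study data below are invented):

```python
import math

def random_effects_summary(y, se):
    """DerSimonian-Laird random-effects meta-analysis of study estimates y
    with standard errors se. Returns the pooled mean, its standard error,
    and the predictive standard deviation for the effect in a new setting."""
    w = [1.0 / s**2 for s in se]
    mu_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - mu_fixed)**2 for wi, yi in zip(w, y))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)   # between-study variance
    w_re = [1.0 / (s**2 + tau2) for s in se]
    mu = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se_mu = math.sqrt(1.0 / sum(w_re))
    # The predictive SD adds tau^2 back in, which is why the predictive
    # distribution is wider than the CI around the random effects mean.
    sd_pred = math.sqrt(1.0 / sum(w_re) + tau2)
    return mu, se_mu, sd_pred

mu, se_mu, sd_new = random_effects_summary(
    [0.2, 0.5, -0.1, 0.4], [0.1, 0.15, 0.2, 0.1])
```

Which summary feeds the CEA model (the mean, the predictive distribution, or a study-specific estimate) is precisely the decision-context judgement the paper discusses.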

  1. Development of a Bayesian Belief Network Runway Incursion Model

    Green, Lawrence L.


    In a previous paper, a statistical analysis of runway incursion (RI) events was conducted to ascertain their relevance to the top ten Technical Challenges (TC) of the National Aeronautics and Space Administration (NASA) Aviation Safety Program (AvSP). The study revealed connections to perhaps several of the AvSP top ten TC. That data also identified several primary causes and contributing factors for RI events that served as the basis for developing a system-level Bayesian Belief Network (BBN) model for RI events. The system-level BBN model will allow NASA to generically model the causes of RI events and to assess the effectiveness of technology products being developed under NASA funding. These products are intended to reduce the frequency of RI events in particular, and to improve runway safety in general. The development, structure and assessment of that BBN for RI events by a Subject Matter Expert panel are documented in this paper.
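A BBN query of the kind such a model supports can be sketched by exact enumeration over a toy two-cause network; the structure and probabilities below are hypothetical, not taken from the NASA model:

```python
# Toy two-cause Bayesian network for runway incursion (RI) events.
# Priors and the conditional table are invented for illustration.
P_comm_error = 0.05   # prior: pilot/controller communication breakdown
P_signage = 0.02      # prior: confusing airport signage

def p_incursion(comm, sign):
    """Conditional probability of an RI given the two parent causes."""
    table = {(True, True): 0.60, (True, False): 0.30,
             (False, True): 0.20, (False, False): 0.001}
    return table[(comm, sign)]

# Marginal probability of an RI: sum over all parent configurations.
p_ri = sum(p_incursion(c, s)
           * (P_comm_error if c else 1 - P_comm_error)
           * (P_signage if s else 1 - P_signage)
           for c in (True, False) for s in (True, False))

# Posterior of a communication error given an observed RI (Bayes' rule);
# lowering this prior mimics assessing a mitigation technology's effect.
p_comm_given_ri = sum(p_incursion(True, s) * P_comm_error
                      * (P_signage if s else 1 - P_signage)
                      for s in (True, False)) / p_ri
```

Assessing a technology product then amounts to reducing a cause's prior (or its conditional entry) and recomputing the marginal RI probability.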

  2. Bayesian reduced-order models for multiscale dynamical systems

    Koutsourelakis, P S


    While existing mathematical descriptions can accurately account for phenomena at microscopic scales (e.g. molecular dynamics), these are often high-dimensional, stochastic and their applicability over macroscopic time scales of physical interest is computationally infeasible or impractical. In complex systems, with limited physical insight on the coherent behavior of their constituents, the only available information is data obtained from simulations of the trajectories of huge numbers of degrees of freedom over microscopic time scales. This paper discusses a Bayesian approach to deriving probabilistic coarse-grained models that simultaneously address the problems of identifying appropriate reduced coordinates and the effective dynamics in this lower-dimensional representation. At the core of the models proposed lie simple, low-dimensional dynamical systems which serve as the building blocks of the global model. These approximate the latent, generating sources and parameterize the reduced-order dynamics. We d...

  3. Extended Bayesian Information Criteria for Gaussian Graphical Models

    Foygel, Rina


    Gaussian graphical models with sparsity in the inverse covariance matrix are of significant interest in many modern applications. For the problem of recovering the graphical structure, information criteria provide useful optimization objectives for algorithms searching through sets of graphs or for selection of tuning parameters of other methods such as the graphical lasso, which is a likelihood penalization technique. In this paper we establish the consistency of an extended Bayesian information criterion for Gaussian graphical models in a scenario where both the number of variables p and the sample size n grow. Compared to earlier work on the regression case, our treatment allows for growth in the number of non-zero parameters in the true model, which is necessary in order to cover connected graphs. We demonstrate the performance of this criterion on simulated data when used in conjunction with the graphical lasso, and verify that the criterion indeed performs better than either cross-validation or the ordi...
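The criterion itself is simple to apply once each candidate graph has been fitted; a minimal sketch (with made-up log-likelihoods and edge counts) of scoring two candidate graphs:

```python
import math

def ebic(log_lik, n_edges, n, p, gamma=0.5):
    """Extended BIC for a Gaussian graphical model with n samples,
    p variables and n_edges nonzero off-diagonal entries.

    gamma = 0 recovers the ordinary BIC; gamma in (0, 1] adds a penalty
    that accounts for the p*(p-1)/2 candidate edges.
    """
    return (-2.0 * log_lik
            + n_edges * math.log(n)
            + 4.0 * n_edges * gamma * math.log(p))

# Between two fitted graphs (e.g. two graphical-lasso tuning parameters),
# choose the one with the smaller EBIC. Inputs here are illustrative.
cand = {"sparse": ebic(log_lik=-520.0, n_edges=10, n=200, p=50),
        "dense": ebic(log_lik=-505.0, n_edges=40, n=200, p=50)}
best = min(cand, key=cand.get)
```

In this toy comparison the extra edges of the denser graph do not buy enough likelihood to offset the penalty, so the sparser graph is selected.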

  4. A Bayesian approach to the modelling of alpha Cen A

    Bazot, M; Christensen-Dalsgaard, J


    Determining the physical characteristics of a star is an inverse problem consisting in estimating the parameters of models for the stellar structure and evolution, knowing certain observable quantities. We use a Bayesian approach to solve this problem for alpha Cen A, which allows us to incorporate prior information on the parameters to be estimated, in order to better constrain the problem. Our strategy is based on the use of a Markov Chain Monte Carlo (MCMC) algorithm to estimate the posterior probability densities of the stellar parameters: mass, age, initial chemical composition,... We use the stellar evolutionary code ASTEC to model the star. To constrain this model both seismic and non-seismic observations were considered. Several different strategies were tested to fit these values, either using two or five free parameters in ASTEC. We are thus able to show evidence that MCMC methods become efficient with respect to more classical grid-based strategies when the number of parameters increases. The resul...
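The MCMC machinery underlying such an analysis can be sketched with a random-walk Metropolis sampler on a toy one-parameter "posterior" (ASTEC and the real seismic likelihood are not involved; the target below is an assumed Gaussian):

```python
import math
import random

def metropolis(log_post, x0, n_steps=5000, step=0.1, seed=0):
    """Random-walk Metropolis sampler: the building block of the MCMC
    strategy described above."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, step)
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:   # accept/reject
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Toy target: a Gaussian posterior for a stellar mass, N(1.1, 0.05)
# in solar masses (values invented for illustration).
log_post = lambda m: -0.5 * ((m - 1.1) / 0.05) ** 2
draws = metropolis(log_post, x0=1.0)
mean = sum(draws[1000:]) / len(draws[1000:])   # discard burn-in
```

The real problem runs the same loop over five stellar parameters, with each log-posterior evaluation requiring a full ASTEC model, which is where MCMC's advantage over grid searches in higher dimensions comes from.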

  5. Advances in Bayesian Model Based Clustering Using Particle Learning

    Merl, D M


    Recent work by Carvalho, Johannes, Lopes and Polson and by Carvalho, Lopes, Polson and Taddy introduced a sequential Monte Carlo (SMC) alternative to traditional iterative Monte Carlo strategies (e.g. MCMC and EM) for Bayesian inference for a large class of dynamic models. The basis of SMC techniques involves representing the underlying inference problem as one of state space estimation, thus giving way to inference via particle filtering. The key insight of Carvalho et al. was to construct the sequence of filtering distributions so as to make use of the posterior predictive distribution of the observable, a distribution usually only accessible in certain Bayesian settings. Access to this distribution allows a reversal of the usual propagate and resample steps characteristic of many SMC methods, thereby alleviating to a large extent many problems associated with particle degeneration. Furthermore, Carvalho et al. point out that for many conjugate models the posterior distribution of the static variables can be parametrized in terms of [recursively defined] sufficient statistics of the previously observed data. For models where such sufficient statistics exist, particle learning, as it is being called, is especially well suited for the analysis of streaming data due to the relative invariance of its algorithmic complexity with the number of data observations. Through a particle learning approach, a statistical model can be fit to data as the data arrive, allowing at any instant during the observation process direct quantification of uncertainty surrounding underlying model parameters. Here we describe the use of a particle learning approach for fitting a standard Bayesian semiparametric mixture model as described in Carvalho, Lopes, Polson and Taddy. In Section 2 we briefly review the previously presented particle learning algorithm for the case of a Dirichlet process mixture of multivariate normals. In Section 3 we describe several novel extensions to the original...
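The "reversal of the usual propagate and resample steps" can be sketched on a conjugate toy model rather than the Dirichlet process mixture: a Gaussian local-level model, where both the one-step predictive density and the conditional state posterior are available in closed form (all parameter values below are assumptions for illustration).

```python
import math
import random

def resample_propagate_filter(ys, n_particles=500, q=0.1, r=0.2, seed=0):
    """Minimal resample-then-propagate particle filter for the model
        x_t = x_{t-1} + N(0, q),   y_t = x_t + N(0, r).

    Particles are first resampled with weights from the predictive
    density p(y_t | x_{t-1}) = N(x_{t-1}, q + r), then propagated from
    the exact conditional p(x_t | x_{t-1}, y_t) -- the step order that
    mitigates particle degeneration.
    """
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    for y in ys:
        # 1) resample using the predictive likelihood of the new datum
        w = [math.exp(-0.5 * (y - x) ** 2 / (q + r)) for x in xs]
        xs = rng.choices(xs, weights=w, k=n_particles)
        # 2) propagate from the conjugate conditional posterior
        v = 1.0 / (1.0 / q + 1.0 / r)
        xs = [rng.gauss(v * (x / q + y / r), math.sqrt(v)) for x in xs]
    return sum(xs) / n_particles   # filtered mean of the state

est = resample_propagate_filter([0.9, 1.1, 1.0, 1.2])
```

Because the filter touches each observation once, uncertainty about the state (and, in full particle learning, about static parameters via their sufficient statistics) is available at any instant as the data stream in.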

  6. Using Bayesian model averaging to estimate terrestrial evapotranspiration in China

    Chen, Yang; Yuan, Wenping; Xia, Jiangzhou; Fisher, Joshua B.; Dong, Wenjie; Zhang, Xiaotong; Liang, Shunlin; Ye, Aizhong; Cai, Wenwen; Feng, Jinming


    Evapotranspiration (ET) is critical to terrestrial ecosystems as it links the water, carbon, and surface energy exchanges. Numerous ET models have been developed for ET estimation, but they carry large model uncertainties. In this study, a Bayesian Model Averaging (BMA) method was used to merge eight satellite-based models, including five empirical and three process-based models, to improve the accuracy of ET estimates. At twenty-three eddy covariance flux towers, we examined the model performance on all possible combinations of the eight models and found that an ensemble of four models (BMA_Best) showed the best performance. BMA_Best outperformed the best of the eight individual models, increasing the Kling-Gupta efficiency (KGE) by 4% relative to the model with the highest KGE and decreasing RMSE by 4%. Although the correlation coefficient of BMA_Best is less than that of the best single model, the bias of BMA_Best is the smallest of the eight models. Moreover, based on the water balance principle at the river basin scale, the validation indicated the BMA_Best estimates can explain 86% of the variation. In general, the results showed BMA estimates will be very useful for future studies characterizing regional water availability over long time series.
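The core of BMA is a weighted average of the member models' predictions, with weights reflecting each model's support from the data. The study trains its weights on flux-tower data (typically via EM); the sketch below uses a simpler BIC-based approximation to the weights, with entirely invented fit statistics:

```python
import math

def bma_weights(sse, n, k):
    """Approximate BMA weights from BIC (an illustrative shortcut, not
    the paper's training procedure). sse: sum of squared errors per
    model, n: number of observations, k: parameter count per model."""
    bics = [n * math.log(s / n) + ki * math.log(n) for s, ki in zip(sse, k)]
    bmin = min(bics)
    w = [math.exp(-0.5 * (b - bmin)) for b in bics]
    total = sum(w)
    return [wi / total for wi in w]

def bma_predict(preds, weights):
    """Weighted average of the member models' ET predictions."""
    return sum(p * w for p, w in zip(preds, weights))

# Three hypothetical member models with made-up fit statistics.
w = bma_weights(sse=[12.0, 10.5, 15.0], n=100, k=[3, 4, 3])
et = bma_predict([2.1, 2.4, 1.9], w)   # ET in, e.g., mm/day
```

The ensemble prediction always lies within the span of the member predictions, which is consistent with BMA reducing bias while its correlation need not beat the single best model.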

  7. Semi-parametric Bayesian Partially Identified Models based on Support Function

    Liao, Yuan; De Simoni, Anna


    We provide a comprehensive semi-parametric study of Bayesian partially identified econometric models. While the existing literature on Bayesian partial identification has mostly focused on the structural parameter, our primary focus is on Bayesian credible sets (BCS's) of the unknown identified set and the posterior distribution of its support function. We construct a (two-sided) BCS based on the support function of the identified set. We prove the Bernstein-von Mises theorem for the posterio...

  8. A Bayesian analysis of two probability models describing thunderstorm activity at Cape Kennedy, Florida

    Williford, W. O.; Hsieh, P.; Carter, M. C.


    A Bayesian analysis of two discrete probability models, the negative binomial and the modified negative binomial distributions, which have been used to describe thunderstorm activity at Cape Kennedy, Florida, is presented. The Bayesian approach with beta prior distributions is compared to the classical approach, which uses a moment method of estimation or a maximum-likelihood method. The accuracy and simplicity of the Bayesian method are demonstrated.
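The beta prior is conjugate to the negative binomial success probability, so the Bayesian update is closed-form; a minimal sketch with invented counts (not the Cape Kennedy data), parameterizing each count as failures before the r-th success:

```python
def beta_nb_update(a, b, r, counts):
    """Conjugate Beta(a, b) update for the success probability p of a
    negative binomial with known r.

    Each observed count k contributes likelihood proportional to
    p**r * (1 - p)**k, so after m counts the posterior is
    Beta(a + m*r, b + sum(counts)).
    """
    m = len(counts)
    return a + m * r, b + sum(counts)

# Hypothetical counts over five observation periods, r = 2.
counts = [3, 1, 4, 2, 0]
a_post, b_post = beta_nb_update(a=2.0, b=1.0, r=2, counts=counts)
post_mean = a_post / (a_post + b_post)          # Bayesian point estimate
mle = (5 * 2) / (5 * 2 + sum(counts))           # maximum-likelihood estimate
```

The posterior mean and the maximum-likelihood estimate coincide as the data grow, while the prior keeps small-sample estimates away from the boundary, which illustrates the simplicity the abstract refers to.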

  9. Road network safety evaluation using Bayesian hierarchical joint model.

    Wang, Jie; Huang, Helai


    Safety and efficiency are commonly regarded as two significant performance indicators of transportation systems. In practice, road network planning has focused on road capacity and transport efficiency whereas the safety level of a road network has received little attention in the planning stage. This study develops a Bayesian hierarchical joint model for road network safety evaluation to help planners take traffic safety into account when planning a road network. The proposed model establishes relationships between road network risk and micro-level variables related to road entities and traffic volume, as well as socioeconomic, trip generation and network density variables at macro level which are generally used for long term transportation plans. In addition, network spatial correlation between intersections and their connected road segments is also considered in the model. A road network is elaborately selected in order to compare the proposed hierarchical joint model with a previous joint model and a negative binomial model. According to the results of the model comparison, the hierarchical joint model outperforms the joint model and negative binomial model in terms of the goodness-of-fit and predictive performance, which indicates the reasonableness of considering the hierarchical data structure in crash prediction and analysis. Moreover, both random effects at the TAZ level and the spatial correlation between intersections and their adjacent segments are found to be significant, supporting the employment of the hierarchical joint model as an alternative in road-network-level safety modeling as well. PMID:26945109

  10. Modelling of population dynamics of red king crab using Bayesian approach

    Bakanev Sergey ...


    Modeling population dynamics based on the Bayesian approach makes it possible to resolve the above issues successfully. The integration of data from various studies into a unified model, based on Bayesian parameter estimation, provides a much more detailed description of the processes occurring in the population.