WorldWideScience

Sample records for bayesian inference models

  1. Constrained Bayesian inference of project performance models

    OpenAIRE

    Sunmola, Funlade

    2013-01-01

    Project performance models play an important role in the management of project success. When used for monitoring projects, they can offer predictive ability, such as indications of possible delivery problems. Approaches for monitoring project performance rely on available project information, including restrictions imposed on the project, particularly the constraints of cost, quality, scope and time. We study in this paper a Bayesian inference methodology for project performance modelling in ...

  2. Bayesian Inference of a Multivariate Regression Model

    Directory of Open Access Journals (Sweden)

    Marick S. Sinay

    2014-01-01

    Full Text Available We explore Bayesian inference of a multivariate linear regression model with use of a flexible prior for the covariance structure. The commonly adopted Bayesian setup involves the conjugate prior, multivariate normal distribution for the regression coefficients and inverse Wishart specification for the covariance matrix. Here we depart from this approach and propose a novel Bayesian estimator for the covariance. A multivariate normal prior for the unique elements of the matrix logarithm of the covariance matrix is considered. Such structure allows for a richer class of prior distributions for the covariance, with respect to strength of beliefs in prior location hyperparameters, as well as the added ability to model potential correlation amongst the covariance structure. The posterior moments of all relevant parameters of interest are calculated based upon numerical results via a Markov chain Monte Carlo procedure. The Metropolis-Hastings-within-Gibbs algorithm is invoked, with a proposal density constructed to closely match the shape of the target posterior distribution. As an application of the proposed technique, we investigate a multiple regression based upon the 1980 High School and Beyond Survey.
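
    The prior described in this record hinges on the matrix-logarithm parameterization: a normal prior on the unique elements of log(Sigma) always maps back to a valid covariance matrix. A minimal sketch of one prior draw follows (the dimension and hyperparameters are illustrative assumptions, not values from the paper):

        import numpy as np
        from scipy.linalg import expm

        rng = np.random.default_rng(0)
        p = 3                                # dimension of the response
        n_unique = p * (p + 1) // 2          # unique elements of a symmetric matrix

        # Multivariate normal prior on the unique elements of A = log(Sigma)
        a = rng.normal(loc=0.0, scale=1.0, size=n_unique)

        # Rebuild the symmetric matrix A from its unique elements
        A = np.zeros((p, p))
        A[np.triu_indices(p)] = a
        A = A + A.T - np.diag(np.diag(A))

        # The matrix exponential of a symmetric matrix is symmetric positive
        # definite, so Sigma is a valid covariance for every prior draw.
        Sigma = expm(A)
        print(np.linalg.eigvalsh(Sigma))     # all eigenvalues > 0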

  3. Bayesian Inference and Optimal Design in the Sparse Linear Model

    OpenAIRE

    Seeger, Matthias; Steinke, Florian; Tsuda, Koji

    2007-01-01

    The sparse linear model has seen many successful applications in Statistics, Machine Learning, and Computational Biology, such as identification of gene regulatory networks from micro-array expression data. Prior work has either approximated Bayesian inference by expensive Markov chain Monte Carlo, or replaced it by point estimation. We show how to obtain a good approximation to Bayesian analysis efficiently, using the Expectation Propagation method. We also address the problems of optimal de...

  4. Bayesian inference model for fatigue life of laminated composites

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimirov; Kiureghian, Armen Der; Berggreen, Christian

    2016-01-01

    A probabilistic model for estimating the fatigue life of laminated composite plates is developed. The model is based on lamina-level input data, making it possible to predict fatigue properties for a wide range of laminate configurations. Model parameters are estimated by Bayesian inference. The ...

  5. Bayesian inference of chemical kinetic models from proposed reactions

    KAUST Repository

    Galagali, Nikhil

    2015-02-01

    Bayesian inference provides a natural framework for combining experimental data with prior knowledge to develop chemical kinetic models and quantify the associated uncertainties, not only in parameter values but also in model structure. Most existing applications of Bayesian model selection methods to chemical kinetics have been limited to comparisons among a small set of models, however. The significant computational cost of evaluating posterior model probabilities renders traditional Bayesian methods infeasible when the model space becomes large. We present a new framework for tractable Bayesian model inference and uncertainty quantification using a large number of systematically generated model hypotheses. The approach involves imposing point-mass mixture priors over rate constants and exploring the resulting posterior distribution using an adaptive Markov chain Monte Carlo method. The posterior samples are used to identify plausible models, to quantify rate constant uncertainties, and to extract key diagnostic information about model structure, such as the reactions and operating pathways most strongly supported by the data. We provide numerical demonstrations of the proposed framework by inferring kinetic models for catalytic steam and dry reforming of methane using available experimental data.
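
    The point-mass ("spike-and-slab") mixture prior is the device that lets posterior samples switch reactions on and off. A minimal sketch of one prior draw, with an assumed inclusion probability and slab scale (both are illustrative, not values from the paper):

        import numpy as np

        rng = np.random.default_rng(1)
        n_reactions = 10
        p_include = 0.5            # prior probability that a reaction is active
        slab_scale = 1.0           # prior scale of an active log rate constant

        active = rng.random(n_reactions) < p_include
        log_k = np.where(active, rng.normal(0.0, slab_scale, n_reactions), -np.inf)
        # A -inf log-rate is the point mass at zero: the reaction is removed
        # from the model, so posterior samples of `active` rank model structures.
        print(active, log_k)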

  6. Bayesian Inference and Forecasting in the Stationary Bilinear Model

    OpenAIRE

    Roberto Leon-Gonzalez; Fuyu Yang

    2014-01-01

    A stationary bilinear (SB) model can be used to describe processes with a time-varying degree of persistence that depends on past shocks. An example of such a process is inflation. This study develops methods for Bayesian inference, model comparison, and forecasting in the SB model. Using monthly U.K. inflation data, we find that the SB model outperforms the random walk and first order autoregressive AR(1) models in terms of root mean squared forecast errors for both the one-step-ahead and th...

  7. GPU Computing in Bayesian Inference of Realized Stochastic Volatility Model

    International Nuclear Information System (INIS)

    The realized stochastic volatility (RSV) model that utilizes the realized volatility as additional information has been proposed to infer volatility of financial time series. We consider the Bayesian inference of the RSV model by the Hybrid Monte Carlo (HMC) algorithm. The HMC algorithm can be parallelized and thus performed on the GPU for speedup. The GPU code is developed with CUDA Fortran. We compare the computational time in performing the HMC algorithm on GPU (GTX 760) and CPU (Intel i7-4770, 3.4 GHz) and find that the GPU can be up to 17 times faster than the CPU. We also code the program with OpenACC and find that appropriate coding can achieve a speedup similar to that of CUDA Fortran
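
    For orientation, a single HMC update is a leapfrog trajectory followed by a Metropolis accept/reject step; the GPU gain comes from parallelizing exactly these gradient evaluations. A minimal CPU sketch for a standard-normal target (the step size and trajectory length are assumptions, and the target is a stand-in for the RSV posterior):

        import numpy as np

        rng = np.random.default_rng(5)

        def grad_neg_logp(x):
            return x          # target: standard normal, -log p(x) = x^2 / 2

        def hmc_step(x, eps=0.1, n_steps=20):
            p0 = rng.normal()                        # resample momentum
            x_new, p = x, p0 - 0.5 * eps * grad_neg_logp(x)
            for i in range(n_steps):                 # leapfrog integration
                x_new = x_new + eps * p
                if i < n_steps - 1:
                    p = p - eps * grad_neg_logp(x_new)
            p = p - 0.5 * eps * grad_neg_logp(x_new)
            h_old = 0.5 * x ** 2 + 0.5 * p0 ** 2     # Hamiltonians
            h_new = 0.5 * x_new ** 2 + 0.5 * p ** 2
            return x_new if rng.random() < np.exp(h_old - h_new) else x

        x, samples = 0.0, []
        for _ in range(2000):
            x = hmc_step(x)
            samples.append(x)
        print(np.mean(samples), np.std(samples))     # approx. 0 and 1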

  8. Bayesian inference for partially identified models exploring the limits of limited data

    CERN Document Server

    Gustafson, Paul

    2015-01-01

    Contents: Introduction; Identification; What Is against Us?; What Is for Us?; Some Simple Examples of Partially Identified Models; The Road Ahead; The Structure of Inference in Partially Identified Models; Bayesian Inference; The Structure of Posterior Distributions in PIMs; Computational Strategies; Strength of Bayesian Updating, Revisited; Posterior Moments; Credible Intervals; Evaluating the Worth of Inference; Partial Identification versus Model Misspecification; The Siren Call of Identification; Comp...

  9. Markov Model of Wind Power Time Series Using Bayesian Inference of Transition Matrix

    OpenAIRE

    Chen, Peiyuan; Berthelsen, Kasper Klitgaard; Bak-Jensen, Birgitte; Chen, Zhe

    2009-01-01

    This paper proposes using Bayesian inference of the transition matrix when developing a discrete Markov model of a wind speed/power time series, together with a 95% credible interval for model verification. The Dirichlet distribution is used as a conjugate prior for the transition matrix. Three discrete Markov models are compared, i.e. the basic Markov model, the Bayesian Markov model and the birth-and-death Markov model. The proposed Bayesian Markov model shows the best accuracy in modeling the autocorr...
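
    Because the Dirichlet prior is conjugate to the multinomial rows of a transition matrix, the posterior is available in closed form and credible intervals follow by simulation. A minimal sketch (the state count, prior weights and stand-in data are illustrative assumptions):

        import numpy as np

        rng = np.random.default_rng(2)
        states = rng.integers(0, 3, size=1000)      # stand-in wind-power states
        K = 3
        counts = np.zeros((K, K))
        for s, t in zip(states[:-1], states[1:]):
            counts[s, t] += 1

        alpha = np.ones(K)                          # symmetric Dirichlet prior
        post = counts + alpha                       # conjugate posterior, per row

        # 95% credible interval for each transition probability by simulation
        draws = np.stack([rng.dirichlet(post[i], 5000) for i in range(K)], axis=1)
        lo, hi = np.percentile(draws, [2.5, 97.5], axis=0)
        print(np.round(lo, 3), np.round(hi, 3), sep="\n")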

  10. On Fuzzy Bayesian Inference

    OpenAIRE

    Frühwirth-Schnatter, Sylvia

    1990-01-01

    In the paper at hand we apply methods of fuzzy set theory to Bayesian statistics to obtain "Fuzzy Bayesian Inference". In the subsequent sections we will discuss a fuzzy-valued likelihood function, Bayes' theorem for both fuzzy data and fuzzy priors, a fuzzy Bayes' estimator, fuzzy predictive densities and distributions, and fuzzy H.P.D. regions. (author's abstract)

  11. A localization model to localize multiple sources using Bayesian inference

    Science.gov (United States)

    Dunham, Joshua Rolv

    Accurate localization of a sound source in a room setting is important in both psychoacoustics and architectural acoustics. Binaural models have been proposed to explain how the brain processes and utilizes the interaural time differences (ITDs) and interaural level differences (ILDs) of sound waves arriving at the ears of a listener in determining source location. Recent work shows that applying Bayesian methods to this problem is proving fruitful. In this thesis, pink noise samples are convolved with head-related transfer functions (HRTFs) and compared to combinations of one and two anechoic speech signals convolved with different HRTFs or binaural room impulse responses (BRIRs) to simulate room positions. Through exhaustive calculation of Bayesian posterior probabilities and using a maximum likelihood approach, model selection will determine the number of sources present, and parameter estimation will result in the azimuthal direction of the source(s).

  12. Robust Bayesian inference in lq-spherical models

    OpenAIRE

    Osiewalski, Jacek; Steel, Mark F.J.

    1992-01-01

    The class of multivariate lq-spherical distributions is introduced and defined through their isodensity surfaces. We prove that, under a Jeffreys' type improper prior on the scale parameter, posterior inference on the location parameters is the same for all lq-spherical sampling models with common q. This gives us perfect inference robustness with respect to any departures from the reference case of independent sampling from the exponential power distribution.

  13. Bayesian inference and model comparison for metallic fatigue data

    KAUST Repository

    Babuška, Ivo

    2016-02-23

    In this work, we present a statistical treatment of stress-life (S-N) data drawn from a collection of records of fatigue experiments that were performed on 75S-T6 aluminum alloys. Our main objective is to predict the fatigue life of materials by providing a systematic approach to model calibration, model selection and model ranking with reference to S-N data. To this purpose, we consider fatigue-limit models and random fatigue-limit models that are specially designed to allow the treatment of the run-outs (right-censored data). We first fit the models to the data by maximum likelihood methods and estimate the quantiles of the life distribution of the alloy specimen. To assess the robustness of the estimation of the quantile functions, we obtain bootstrap confidence bands by stratified resampling with respect to the cycle ratio. We then compare and rank the models by classical measures of fit based on information criteria. We also consider a Bayesian approach that provides, under the prior distribution of the model parameters selected by the user, their simulation-based posterior distributions. We implement and apply Bayesian model comparison methods, such as Bayes factor ranking and predictive information criteria based on cross-validation techniques under various a priori scenarios.
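
    As an illustration of the classical ranking step mentioned above, information criteria reduce to a one-line computation once the models are fitted. The log-likelihoods and parameter counts below are hypothetical stand-ins, not the paper's results:

        # Sketch of model ranking by the Akaike information criterion (AIC)
        fits = {
            "fatigue-limit":        {"loglik": -512.4, "k": 4},
            "random-fatigue-limit": {"loglik": -498.7, "k": 6},
        }
        for name, f in fits.items():
            aic = 2 * f["k"] - 2 * f["loglik"]
            print(name, "AIC =", round(aic, 1))
        # The lower AIC wins; Bayes factors give the Bayesian counterpart.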

  14. Bayesian inference and model comparison for metallic fatigue data

    Science.gov (United States)

    Babuška, Ivo; Sawlan, Zaid; Scavino, Marco; Szabó, Barna; Tempone, Raúl

    2016-06-01

    In this work, we present a statistical treatment of stress-life (S-N) data drawn from a collection of records of fatigue experiments that were performed on 75S-T6 aluminum alloys. Our main objective is to predict the fatigue life of materials by providing a systematic approach to model calibration, model selection and model ranking with reference to S-N data. To this purpose, we consider fatigue-limit models and random fatigue-limit models that are specially designed to allow the treatment of the run-outs (right-censored data). We first fit the models to the data by maximum likelihood methods and estimate the quantiles of the life distribution of the alloy specimen. To assess the robustness of the estimation of the quantile functions, we obtain bootstrap confidence bands by stratified resampling with respect to the cycle ratio. We then compare and rank the models by classical measures of fit based on information criteria. We also consider a Bayesian approach that provides, under the prior distribution of the model parameters selected by the user, their simulation-based posterior distributions. We implement and apply Bayesian model comparison methods, such as Bayes factor ranking and predictive information criteria based on cross-validation techniques under various a priori scenarios.

  15. The Lumiere Project: Bayesian User Modeling for Inferring the Goals and Needs of Software Users

    OpenAIRE

    Horvitz, Eric J.; Breese, John S.; Heckerman, David; Hovel, David; Rommelse, Koos

    2013-01-01

    The Lumiere Project centers on harnessing probability and utility to provide assistance to computer software users. We review work on Bayesian user models that can be employed to infer a user's needs by considering the user's background, actions, and queries. Several problems were tackled in Lumiere research, including (1) the construction of Bayesian models for reasoning about the time-varying goals of computer users from their observed actions and queries, (2) gaining access to a stream of eve...

  16. Approximate Bayesian inference in semi-mechanistic models

    OpenAIRE

    Aderhold, Andrej; Husmeier, Dirk; Grzegorczyk, Marco

    2016-01-01

    Inference of interaction networks represented by systems of differential equations is a challenging problem in many scientific disciplines. In the present article, we follow a semi-mechanistic modelling approach based on gradient matching. We investigate the extent to which key factors, including the kinetic model, statistical formulation and numerical methods, impact upon performance at network reconstruction. We emphasize general lessons for computational statisticians when faced with the c...

  17. Bayesian inference of BWR model parameters by Markov chain Monte Carlo

    International Nuclear Information System (INIS)

    In this paper, the Markov chain Monte Carlo approach to Bayesian inference is applied for estimating the parameters of a reduced-order model of the dynamics of a boiling water reactor system. A Bayesian updating strategy is devised to progressively refine the estimates, as newly measured data become available. Finally, the technique is used for detecting parameter changes during the system lifetime, e.g. due to component degradation
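
    The updating strategy described here is easiest to see in a conjugate toy model, where the posterior after one data batch becomes the prior for the next. The normal-normal stand-in below is an illustrative assumption, not the reduced-order BWR model itself:

        import numpy as np

        rng = np.random.default_rng(7)
        mu, tau2 = 0.0, 4.0        # prior mean and variance of the parameter
        sigma2 = 1.0               # known measurement variance
        true_theta = 1.5

        for batch in range(5):
            y = rng.normal(true_theta, np.sqrt(sigma2), size=20)
            prec = 1.0 / tau2 + len(y) / sigma2       # posterior precision
            mu = (mu / tau2 + y.sum() / sigma2) / prec
            tau2 = 1.0 / prec                          # shrinks as data accrue
            print(f"after batch {batch}: mean={mu:.3f}, sd={np.sqrt(tau2):.3f}")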

  18. Bayesian inference for a wavefront model of the Neolithisation of Europe

    CERN Document Server

    Baggaley, Andrew W; Shukurov, Anvar; Boys, Richard J; Golightly, Andrew

    2012-01-01

    We consider a wavefront model for the spread of Neolithic culture across Europe, and use Bayesian inference techniques to provide estimates for the parameters within this model, as constrained by radiocarbon data from Southern and Western Europe. Our wavefront model allows for both an isotropic background spread (incorporating the effects of local geography), and a localized anisotropic spread associated with major waterways. We introduce an innovative numerical scheme to track the wavefront, allowing us to simulate the times of the first arrival at any site orders of magnitude more efficiently than traditional PDE approaches. We adopt a Bayesian approach to inference and use Gaussian process emulators to facilitate further increases in efficiency in the inference scheme, thereby making Markov chain Monte Carlo methods practical. We allow for uncertainty in the fit of our model, and also infer a parameter specifying the magnitude of this uncertainty. We obtain a magnitude for the background spread of order 1 ...

  19. deBInfer: Bayesian inference for dynamical models of biological systems in R

    OpenAIRE

    Boersch-Supan, Philipp H.; Johnson, Leah R

    2016-01-01

    1. Differential equations (DEs) are commonly used to model the temporal evolution of biological systems, but statistical methods for comparing DE models to data and for parameter inference are relatively poorly developed. This is especially problematic in the context of biological systems where observations are often noisy and only a small number of time points may be available. 2. Bayesian approaches offer a coherent framework for parameter inference that can account for multiple sources of ...

  20. Probabilistic Modelling of Fatigue Life of Composite Laminates Using Bayesian Inference

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimirov; Kiureghian, Armen Der

    2014-01-01

    ... Model parameters are estimated by Bayesian inference. The reference data used consists of constant-amplitude fatigue test results for a multi-directional laminate subjected to seven different load ratios. The paper describes the modelling techniques and the parameter estimation procedure, supported by ...

  1. Markov Model of Wind Power Time Series Using Bayesian Inference of Transition Matrix

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Berthelsen, Kasper Klitgaard; Bak-Jensen, Birgitte; Chen, Zhe

    2009-01-01

    This paper proposes using Bayesian inference of the transition matrix when developing a discrete Markov model of a wind speed/power time series, together with a 95% credible interval for model verification. The Dirichlet distribution is used as a conjugate prior for the transition matrix. Three discrete Markov...

  2. Probabilistic inference: Task dependency and individual differences of probability weighting revealed by hierarchical Bayesian modelling

    Directory of Open Access Journals (Sweden)

    Moritz Boos

    2016-05-01

    Full Text Available Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modelling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behaviour. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modelling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modelling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.
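
    One standard member of the (inverted) S-shaped weighting family is the Prelec function; using it here is an illustrative choice, not necessarily the parameterization of the study:

        import numpy as np

        def prelec_weight(p, gamma):
            """w(p) = exp(-(-ln p)^gamma); gamma < 1 gives the inverted-S shape."""
            p = np.clip(p, 1e-12, 1.0)
            return np.exp(-(-np.log(p)) ** gamma)

        p = np.linspace(0.01, 0.99, 5)
        print(prelec_weight(p, gamma=0.6))   # overweights small, underweights large p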

  3. Skill Rating by Bayesian Inference

    OpenAIRE

    Di Fatta, Giuseppe; Haworth, Guy McCrossan; Regan, Kenneth W.

    2009-01-01

    Systems Engineering often involves computer modelling the behaviour of proposed systems and their components. Where a component is human, fallibility must be modelled by a stochastic agent. The identification of a model of decision-making over quantifiable options is investigated using the game-domain of Chess. Bayesian methods are used to infer the distribution of players’ skill levels from the moves they play rather than from their competitive results. The approach is used on large sets of ...

  4. Hellinger Distance and Bayesian Non-Parametrics: Hierarchical Models for Robust and Efficient Bayesian Inference

    OpenAIRE

    Wu, Yuefeng; Hooker, Giles

    2013-01-01

    This paper introduces a hierarchical framework to incorporate Hellinger distance methods into Bayesian analysis. We propose to modify a prior over non-parametric densities with the exponential of twice the Hellinger distance between a candidate and a parametric density. By incorporating a prior over the parameters of the second density, we arrive at a hierarchical model in which a non-parametric model is placed between parameters and the data. The parameters of the family can then be estimate...

  5. Adaptive surrogate modeling for response surface approximations with application to bayesian inference

    KAUST Repository

    Prudhomme, Serge

    2015-09-17

    Parameter estimation for complex models using Bayesian inference is usually a very costly process as it requires a large number of solves of the forward problem. We show here how the construction of adaptive surrogate models using a posteriori error estimates for quantities of interest can significantly reduce the computational cost in problems of statistical inference. As surrogate models provide only approximations of the true solutions of the forward problem, it is nevertheless necessary to control these errors in order to construct an accurate reduced model with respect to the observables utilized in the identification of the model parameters. Effectiveness of the proposed approach is demonstrated on a numerical example dealing with the Spalart–Allmaras model for the simulation of turbulent channel flows. In particular, we illustrate how Bayesian model selection using the adapted surrogate model in place of solving the coupled nonlinear equations leads to the same quality of results while requiring fewer nonlinear PDE solves.

  6. Bayesian inference of models and hyper-parameters for robust optic-flow estimation

    OpenAIRE

    Héas, Patrick; Herzet, Cédric; Memin, Etienne

    2012-01-01

    Selecting optimal models and hyper-parameters is crucial for accurate optic-flow estimation. This paper provides a solution to the problem in a generic Bayesian framework. The method is based on a conditional model linking the image intensity function, the unknown velocity field, hyper-parameters and the prior and likelihood motion models. Inference is performed on each of the three levels of this so-defined hierarchical model by maximization of marginalized a...

  7. Inference in hybrid Bayesian networks

    DEFF Research Database (Denmark)

    Langseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael;

    2009-01-01

    ... and reliability block diagrams). However, limitations in the BNs' calculation engine have prevented BNs from becoming equally popular for domains containing mixtures of both discrete and continuous variables (so-called hybrid domains). In this paper we focus on these difficulties, and summarize some of the last decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability.

  8. Bayesian Inference for Radio Observations

    CERN Document Server

    Lochner, Michelle; Zwart, Jonathan T L; Smirnov, Oleg; Bassett, Bruce A; Oozeer, Nadeem; Kunz, Martin

    2015-01-01

    (Abridged) New telescopes like the Square Kilometre Array (SKA) will push into a new sensitivity regime and expose systematics, such as direction-dependent effects, that could previously be ignored. Current methods for handling such systematics rely on alternating best estimates of instrumental calibration and models of the underlying sky, which can lead to inaccurate uncertainty estimates and biased results because such methods ignore any correlations between parameters. These deconvolution algorithms produce a single image that is assumed to be a true representation of the sky, when in fact it is just one realisation of an infinite ensemble of images compatible with the noise in the data. In contrast, here we report a Bayesian formalism that simultaneously infers both systematics and science. Our technique, Bayesian Inference for Radio Observations (BIRO), determines all parameters directly from the raw data, bypassing image-making entirely, by sampling from the joint posterior probability distribution. Thi...

  9. Composite Bayesian inference

    OpenAIRE

    Roche, Alexis

    2015-01-01

    This paper revisits the concept of composite likelihood from the perspective of probabilistic inference, and proposes a generalization called super composite likelihood for sharper inference in multiclass problems. It is argued that, besides providing a new interpretation and a general justification of naïve Bayes procedures, super composite likelihood yields a much wider class of discriminative models suitable for unsupervised and weakly supervised learning.

  10. Inferring activities from context measurements using Bayesian inference and random utility models

    OpenAIRE

    Hurtubia, Ricardo; Bierlaire, Michel; Flötteröd, Gunnar

    2009-01-01

    Smartphones collect a wealth of information about their users. This includes GPS tracks and the MAC addresses of devices around the user, and it can go as far as taking visual and acoustic samples of the user's environment. We present a framework to identify a smartphone user's activities in a Bayesian setting. As prior information, we use a random utility model that accounts for the type of activity a user is likely to perform at any given location and time; this model was estimated for the w...

  11. GNU MCSim : bayesian statistical inference for SBML-coded systems biology models

    OpenAIRE

    Bois, Frédéric Y.

    2009-01-01

    Statistical inference about the parameter values of complex models, such as the ones routinely developed in systems biology, is efficiently performed through Bayesian numerical techniques. In that framework, prior information and multiple levels of uncertainty can be seamlessly integrated. GNU MCSim was precisely developed to achieve those aims, in a general non-linear differential context. Starting with version 5.3.0, GNU MCSim reads in and simulates Systems Biology...

  12. RevBayes: Bayesian Phylogenetic Inference Using Graphical Models and an Interactive Model-Specification Language.

    Science.gov (United States)

    Höhna, Sebastian; Landis, Michael J; Heath, Tracy A; Boussau, Bastien; Lartillot, Nicolas; Moore, Brian R; Huelsenbeck, John P; Ronquist, Fredrik

    2016-07-01

    Programs for Bayesian inference of phylogeny currently implement a unique and fixed suite of models. Consequently, users of these software packages are simultaneously forced to use a number of programs for a given study, while also lacking the freedom to explore models that have not been implemented by the developers of those programs. We developed a new open-source software package, RevBayes, to address these problems. RevBayes is entirely based on probabilistic graphical models, a powerful generic framework for specifying and analyzing statistical models. Phylogenetic-graphical models can be specified interactively in RevBayes, piece by piece, using a new succinct and intuitive language called Rev. Rev is similar to the R language and the BUGS model-specification language, and should be easy to learn for most users. The strength of RevBayes is the simplicity with which one can design, specify, and implement new and complex models. Fortunately, this tremendous flexibility does not come at the cost of slower computation; as we demonstrate, RevBayes outperforms competing software for several standard analyses. Compared with other programs, RevBayes has fewer black-box elements. Users need to explicitly specify each part of the model and analysis. Although this explicitness may initially be unfamiliar, we are convinced that this transparency will improve understanding of phylogenetic models in our field. Moreover, it will motivate the search for improvements to existing methods by brazenly exposing the model choices that we make to critical scrutiny. RevBayes is freely available at http://www.RevBayes.com [Bayesian inference; Graphical models; MCMC; statistical phylogenetics.]. PMID:27235697

  13. Bayesian Inference in the Time Varying Cointegration Model

    OpenAIRE

    Gary Koop; Roberto Leon-Gonzalez; Rodney Strachan

    2008-01-01

    There are both theoretical and empirical reasons for believing that the parameters of macroeconomic models may vary over time. However, work with time-varying parameter models has largely involved Vector autoregressions (VARs), ignoring cointegration. This is despite the fact that cointegration plays an important role in informing macroeconomists on a range of issues. In this paper we develop time varying parameter models which permit cointegration. Time-varying parameter VARs (TVP-VARs) ...

  14. Quantum Inference on Bayesian Networks

    Science.gov (United States)

    Yoder, Theodore; Low, Guang Hao; Chuang, Isaac

    2014-03-01

    Because quantum physics is naturally probabilistic, it seems reasonable to expect physical systems to describe probabilities and their evolution in a natural fashion. Here, we use quantum computation to speedup sampling from a graphical probability model, the Bayesian network. A specialization of this sampling problem is approximate Bayesian inference, where the distribution on query variables is sampled given the values e of evidence variables. Inference is a key part of modern machine learning and artificial intelligence tasks, but is known to be NP-hard. Classically, a single unbiased sample is obtained from a Bayesian network on n variables with at most m parents per node in time O(nm P(e)^-1), depending critically on P(e), the probability the evidence might occur in the first place. However, by implementing a quantum version of rejection sampling, we obtain a square-root speedup, taking O(n 2^m P(e)^-1/2) time per sample. The speedup is the result of amplitude amplification, which is proving to be broadly applicable in sampling and machine learning tasks. In particular, we provide an explicit and efficient circuit construction that implements the algorithm without the need for oracle access.
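
    The 1/P(e) classical cost comes from plain rejection sampling: draws that contradict the evidence are thrown away. A toy two-node network makes this concrete (the structure and probabilities are assumptions for illustration):

        import numpy as np

        rng = np.random.default_rng(3)

        def sample_network():
            a = rng.random() < 0.3                     # P(A=1) = 0.3
            b = rng.random() < (0.8 if a else 0.1)     # P(B=1 | A)
            return a, b

        accepted, tries = [], 0
        while len(accepted) < 1000:                    # condition on evidence B = 1
            tries += 1
            a, b = sample_network()
            if b:
                accepted.append(a)
        print("P(A=1 | B=1) ~", np.mean(accepted), "; acceptance rate", 1000 / tries)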

  15. Nonparametric Bayesian inference in biostatistics

    CERN Document Server

    Müller, Peter

    2015-01-01

    Nonparametric Bayesian approaches (BNP) play an ever-expanding role in biostatistical inference, from use in proteomics to clinical trials. As chapters in this book demonstrate, BNP has important uses in clinical sciences and inference for issues like unknown partitions in genomics. Many research problems involve an abundance of data and require flexible and complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival Analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like arrangement of patients into clinically meaningful subpopulations and segmenting the genome into functionally distinct regions. This book is designed to both review and introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...

  16. Modelling Odor Decoding in the Antennal Lobe by Combining Sequential Firing Rate Models with Bayesian Inference.

    Science.gov (United States)

    Cuevas Rivera, Dario; Bitzer, Sebastian; Kiebel, Stefan J

    2015-10-01

    The olfactory information that is received by the insect brain is encoded in the form of spatiotemporal patterns in the projection neurons of the antennal lobe. These dense and overlapping patterns are transformed into a sparse code in Kenyon cells in the mushroom body. Although it is clear that this sparse code is the basis for rapid categorization of odors, it is yet unclear how the sparse code in Kenyon cells is computed and what information it represents. Here we show that this computation can be modeled by sequential firing rate patterns using Lotka-Volterra equations and Bayesian online inference. This new model can be understood as an 'intelligent coincidence detector', which robustly and dynamically encodes the presence of specific odor features. We found that the model is able to qualitatively reproduce experimentally observed activity in both the projection neurons and the Kenyon cells. In particular, the model explains mechanistically how sparse activity in the Kenyon cells arises from the dense code in the projection neurons. The odor classification performance of the model proved to be robust against noise and time jitter in the observed input sequences. As in recent experimental results, we found that recognition of an odor happened very early during stimulus presentation in the model. Critically, by using the model, we found surprising but simple computational explanations for several experimental phenomena. PMID:26451888

  17. Modelling Odor Decoding in the Antennal Lobe by Combining Sequential Firing Rate Models with Bayesian Inference.

    Directory of Open Access Journals (Sweden)

    Dario Cuevas Rivera

    2015-10-01

    Full Text Available The olfactory information that is received by the insect brain is encoded in the form of spatiotemporal patterns in the projection neurons of the antennal lobe. These dense and overlapping patterns are transformed into a sparse code in Kenyon cells in the mushroom body. Although it is clear that this sparse code is the basis for rapid categorization of odors, it is yet unclear how the sparse code in Kenyon cells is computed and what information it represents. Here we show that this computation can be modeled by sequential firing rate patterns using Lotka-Volterra equations and Bayesian online inference. This new model can be understood as an 'intelligent coincidence detector', which robustly and dynamically encodes the presence of specific odor features. We found that the model is able to qualitatively reproduce experimentally observed activity in both the projection neurons and the Kenyon cells. In particular, the model explains mechanistically how sparse activity in the Kenyon cells arises from the dense code in the projection neurons. The odor classification performance of the model proved to be robust against noise and time jitter in the observed input sequences. As in recent experimental results, we found that recognition of an odor happened very early during stimulus presentation in the model. Critically, by using the model, we found surprising but simple computational explanations for several experimental phenomena.

  18. Computationally efficient Bayesian inference for inverse problems.

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef M.; Najm, Habib N.; Rahn, Larry A.

    2007-10-01

    Bayesian statistics provides a foundation for inference from noisy and incomplete data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the inferred results. Inverse problems - representing indirect estimation of model parameters, inputs, or structural components - can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high-dimensional model spaces, as when the unknown is a spatiotemporal field. We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that dramatically accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also explore dimensionality reduction for the inference of spatiotemporal fields, using truncated spectral representations of Gaussian process priors. These new approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous material or transport properties. We also present a Bayesian framework for parameter estimation in stochastic models, where intrinsic stochasticity may be intermingled with observational noise. Evaluation of a likelihood function may not be analytically tractable in these cases, and thus several alternative Markov chain Monte Carlo (MCMC) schemes, operating on the product space of the observations and the parameters, are introduced.

  19. Boosting Bayesian parameter inference of nonlinear stochastic differential equation models by Hamiltonian scale separation

    Science.gov (United States)

    Albert, Carlo; Ulzega, Simone; Stoop, Ruedi

    2016-04-01

    Parameter inference is a fundamental problem in data-driven modeling. Given observed data that is believed to be a realization of some parameterized model, the aim is to find parameter values that are able to explain the observed data. In many situations, the dominant sources of uncertainty must be included into the model for making reliable predictions. This naturally leads to stochastic models. Stochastic models render parameter inference much harder, as the aim then is to find a distribution of likely parameter values. In Bayesian statistics, which is a consistent framework for data-driven learning, this so-called posterior distribution can be used to make probabilistic predictions. We propose a novel, exact, and very efficient approach for generating posterior parameter distributions for stochastic differential equation models calibrated to measured time series. The algorithm is inspired by reinterpreting the posterior distribution as a statistical mechanics partition function of an object akin to a polymer, where the measurements are mapped on heavier beads compared to those of the simulated data. To arrive at distribution samples, we employ a Hamiltonian Monte Carlo approach combined with a multiple time-scale integration. A separation of time scales naturally arises if either the number of measurement points or the number of simulation points becomes large. Furthermore, at least for one-dimensional problems, we can decouple the harmonic modes between measurement points and solve the fastest part of their dynamics analytically. Our approach is applicable to a wide range of inference problems and is highly parallelizable.

  20. Optimal modeling of 1D azimuth correlations in the context of Bayesian inference

    CERN Document Server

    De Kock, Michiel B; Trainor, Thomas A

    2015-01-01

    Analysis and interpretation of spectrum and correlation data from high-energy nuclear collisions is currently controversial because two opposing physics narratives derive contradictory implications from the same data: one narrative claiming collision dynamics is dominated by dijet production and projectile-nucleon fragmentation, the other claiming collision dynamics is dominated by a dense, flowing QCD medium. Opposing interpretations seem to be supported by alternative data models, and current model-comparison schemes are unable to distinguish between them. There is clearly need for a convincing new methodology to break the deadlock. In this study we introduce Bayesian Inference (BI) methods applied to angular correlation data as a basis to evaluate competing data models. For simplicity the data considered are projections of 2D angular correlations onto 1D azimuth from three centrality classes of 200 GeV Au-Au collisions. We consider several data models typical of current model choices, including Fourier seri...

  1. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre C. R. Martins

    2006-11-01

    Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.

  2. Bayesian inference in camera trapping studies for a class of spatial capture-recapture models

    Science.gov (United States)

    Royle, J. Andrew; Karanth, K. Ullas; Gopalaswamy, Arjun M.; Kumar, N. Samba

    2009-01-01

    We develop a class of models for inference about abundance or density using spatial capture-recapture data from studies based on camera trapping and related methods. The model is a hierarchical model composed of two components: a point process model describing the distribution of individuals in space (or their home range centers) and a model describing the observation of individuals in traps. We suppose that trap- and individual-specific capture probabilities are a function of distance between individual home range centers and trap locations. We show that the models can be regarded as generalized linear mixed models, where the individual home range centers are random effects. We adopt a Bayesian framework for inference under these models using a formulation based on data augmentation. We apply the models to camera trapping data on tigers from the Nagarahole Reserve, India, collected over 48 nights in 2006. For this study, 120 camera locations were used, but cameras were only operational at 30 locations during any given sample occasion. Movement of traps is common in many camera-trapping studies and represents an important feature of the observation model that we address explicitly in our application.
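
    The distance-dependent observation model is typically a half-normal detection function; the functional form and parameter values below are illustrative assumptions, not the paper's fitted values:

        import numpy as np

        def capture_prob(center, trap, p0=0.6, sigma=0.8):
            """Capture probability decays with distance from home-range centre to trap."""
            d2 = np.sum((np.asarray(center) - np.asarray(trap)) ** 2)
            return p0 * np.exp(-d2 / (2.0 * sigma ** 2))

        print(capture_prob(center=(0.0, 0.0), trap=(1.0, 0.5)))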

  3. Bayesian Graphical Models

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Nielsen, Thomas Dyhre

    2016-01-01

    Mathematically, a Bayesian graphical model is a compact representation of the joint probability distribution for a set of variables. The most frequently used type of Bayesian graphical model is the Bayesian network. The structural part of a Bayesian graphical model is a graph consisting of nodes and ... largely due to the availability of efficient inference algorithms for answering probabilistic queries about the states of the variables in the network. Furthermore, to support the construction of Bayesian network models, learning algorithms are also available. We give an overview of the Bayesian network...

  4. A Unified Method for Inference of Tokamak Equilibria and Validation of Force-Balance Models Based on Bayesian Analysis

    CERN Document Server

    von Nessi, G T

    2012-01-01

    A new method, based on Bayesian analysis, is presented which unifies the inference of plasma equilibria parameters in a Tokamak with the ability to quantify differences between inferred equilibria and Grad-Shafranov force-balance solutions. At the heart of this technique is the new method of observation splitting, which allows multiple forward models to be associated with a single diagnostic observation. This new idea subsequently provides a means by which the space of GS solutions can be efficiently characterised via a prior distribution. Moreover, by folding force-balance directly into one set of forward models and utilising simple Biot-Savart responses in another, the Bayesian inference of the plasma parameters itself produces an evidence (a normalisation constant of the inferred posterior distribution) which is sensitive to the relative consistency between both sets of models. This evidence can then be used to help determine the relative accuracy of the tested force-balance model across several discha...

  5. Spectral energy distribution modelling of Southern candidate massive protostars using the Bayesian inference method

    CERN Document Server

    Hill, T; Minier, V; Burton, M G; Cunningham, M R

    2008-01-01

    Concatenating data from the millimetre regime to the infrared, we have performed spectral energy distribution modelling for 227 of the 405 millimetre continuum sources of Hill et al. (2005) which are thought to contain young massive stars in the earliest stages of their formation. Three main parameters are extracted from the fits: temperature, mass and luminosity. The method employed was Bayesian inference, which allows a statistically probable range of suitable values for each parameter to be drawn for each individual protostellar candidate. This is the first application of this method to massive star formation. The cumulative distribution plots of the SED modelled parameters in this work indicate that collectively, the sources without methanol maser and/or radio continuum associations (MM-only cores) display similar characteristics to those of high mass star formation regions. Attributing significance to the marginal distinctions between the MM-only cores and the high-mass star formation sample, we draw hypo...

  6. Tactile length contraction as Bayesian inference.

    Science.gov (United States)

    Tong, Jonathan; Ngo, Vy; Goldreich, Daniel

    2016-08-01

    To perceive, the brain must interpret stimulus-evoked neural activity. This is challenging: The stochastic nature of the neural response renders its interpretation inherently uncertain. Perception would be optimized if the brain used Bayesian inference to interpret inputs in light of expectations derived from experience. Bayesian inference would improve perception on average but cause illusions when stimuli violate expectation. Intriguingly, tactile, auditory, and visual perception are all prone to length contraction illusions, characterized by the dramatic underestimation of the distance between punctate stimuli delivered in rapid succession; the origin of these illusions has been mysterious. We previously proposed that length contraction illusions occur because the brain interprets punctate stimulus sequences using Bayesian inference with a low-velocity expectation. A novel prediction of our Bayesian observer model is that length contraction should intensify if stimuli are made more difficult to localize. Here we report a tactile psychophysical study that tested this prediction. Twenty humans compared two distances on the forearm: a fixed reference distance defined by two taps with 1-s temporal separation and an adjustable comparison distance defined by two taps with temporal separation t ≤ 1 s. We observed significant length contraction: As t was decreased, participants perceived the two distances as equal only when the comparison distance was made progressively greater than the reference distance. Furthermore, the use of weaker taps significantly enhanced participants' length contraction. These findings confirm the model's predictions, supporting the view that the spatiotemporal percept is a best estimate resulting from a Bayesian inference process. PMID:27121574
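
    The model's core prediction can be reproduced with a one-dimensional Gaussian sketch: a zero-mean prior on velocity shrinks the inferred length more as sensory noise grows. All numbers below are illustrative assumptions, not the study's fitted parameters:

        import numpy as np

        def posterior_length(measured_length, t, sigma_sensory, sigma_velocity):
            # Implied velocity likelihood: v ~ N(measured_length / t, (sigma_sensory / t)^2)
            v_hat = measured_length / t
            var_like = (sigma_sensory / t) ** 2
            var_prior = sigma_velocity ** 2
            # Zero-mean Gaussian prior x Gaussian likelihood -> shrunken posterior mean
            v_post = v_hat * var_prior / (var_prior + var_like)
            return v_post * t

        for noise in (0.5, 1.0, 2.0):   # harder-to-localize taps -> more contraction
            print(noise, posterior_length(10.0, t=0.2, sigma_sensory=noise, sigma_velocity=5.0))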

  7. Bayesian inference on proportional elections.

    Science.gov (United States)

    Brunello, Gabriel Hideki Vatanabe; Nakano, Eduardo Yoshio

    2015-01-01

    Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee the candidate with the most percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied on proportional elections. In this context, the purpose of this paper was to perform a Bayesian inference on proportional elections considering the Brazilian system of seats distribution. More specifically, a methodology to answer the probability that a given party will have representation on the chamber of deputies was developed. Inferences were made on a Bayesian scenario using the Monte Carlo simulation technique, and the developed methodology was applied on data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software. PMID:25786259
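
    A hedged sketch of the simulation idea: draw vote shares from a posterior (a Dirichlet under a flat prior is assumed here), allocate seats by a D'Hondt-style largest-averages rule (the Brazilian rule combines an electoral quotient with a largest-averages step; the simplified rule here is illustrative), and estimate the probability that each party gains representation. Counts and seat numbers are hypothetical:

        import numpy as np

        rng = np.random.default_rng(4)
        observed_votes = np.array([420, 310, 150, 40])   # hypothetical poll counts
        n_seats, n_sims = 10, 5000

        def dhondt(shares, seats):
            alloc = np.zeros_like(shares, dtype=int)
            for _ in range(seats):                       # give each seat to the
                alloc[np.argmax(shares / (alloc + 1))] += 1   # highest average
            return alloc

        wins = np.zeros(len(observed_votes))
        for _ in range(n_sims):
            shares = rng.dirichlet(observed_votes + 1)   # posterior draw, flat prior
            wins += dhondt(shares, n_seats) > 0
        print("P(at least one seat):", np.round(wins / n_sims, 3))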

  8. Likelihood-free inference of population structure and local adaptation in a Bayesian hierarchical model.

    Science.gov (United States)

    Bazin, Eric; Dawson, Kevin J; Beaumont, Mark A

    2010-06-01

    We address the problem of finding evidence of natural selection from genetic data, accounting for the confounding effects of demographic history. In the absence of natural selection, gene genealogies should all be sampled from the same underlying distribution, often approximated by a coalescent model. Selection at a particular locus will lead to a modified genealogy, and this motivates a number of recent approaches for detecting the effects of natural selection in the genome as "outliers" under some models. The demographic history of a population affects the sampling distribution of genealogies, and therefore the observed genotypes and the classification of outliers. Since we cannot see genealogies directly, we have to infer them from the observed data under some model of mutation and demography. Thus the accuracy of an outlier-based approach depends to a greater or a lesser extent on the uncertainty about the demographic and mutational model. A natural modeling framework for this type of problem is provided by Bayesian hierarchical models, in which parameters, such as mutation rates and selection coefficients, are allowed to vary across loci. It has proved quite difficult computationally to implement fully probabilistic genealogical models with complex demographies, and this has motivated the development of approximations such as approximate Bayesian computation (ABC). In ABC the data are compressed into summary statistics, and computation of the likelihood function is replaced by simulation of data under the model. In a hierarchical setting one may be interested both in hyperparameters and parameters, and there may be very many of the latter--for example, in a genetic model, these may be parameters describing each of many loci or populations. This poses a problem for ABC in that one then requires summary statistics for each locus, which, if used naively, leads to a consequent difficulty in conditional density estimation. We develop a general method for applying
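
    Rejection ABC in its simplest form matches the description above: simulate from the prior and keep parameters whose simulated summary statistic lands near the observed one. The model, summary statistic and tolerance below are assumptions for illustration:

        import numpy as np

        rng = np.random.default_rng(6)
        observed_mean = 2.0                       # summary statistic of the data
        tol, n_draws, n_obs = 0.1, 20000, 50

        theta = rng.normal(0.0, 3.0, n_draws)     # draws from the prior
        # Sample mean of n_obs draws from N(theta, 1) is N(theta, 1/n_obs)
        sim_means = rng.normal(theta, 1.0 / np.sqrt(n_obs))
        posterior = theta[np.abs(sim_means - observed_mean) < tol]
        print(len(posterior), posterior.mean(), posterior.std())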

  9. BAMBI: blind accelerated multimodal Bayesian inference

    CERN Document Server

    Graff, Philip; Hobson, Michael P; Lasenby, Anthony

    2011-01-01

    In this paper we present an algorithm for rapid Bayesian analysis that combines the benefits of nested sampling and artificial neural networks. The blind accelerated multimodal Bayesian inference (BAMBI) algorithm implements the MultiNest package for nested sampling as well as the training of an artificial neural network (NN) to learn the likelihood function. In the case of computationally expensive likelihoods, this allows the substitution of a much more rapid approximation in order to increase significantly the speed of the analysis. We begin by demonstrating, with a few toy examples, the ability of a NN to learn complicated likelihood surfaces. BAMBI's ability to decrease running time for Bayesian inference is then demonstrated in the context of estimating cosmological parameters from WMAP and other observations. We show that valuable speed increases are achieved in addition to obtaining NNs trained on the likelihood functions for the different model and data combinations. These NNs can then be used for an...

  10. Universal Darwinism as a process of Bayesian inference

    CERN Document Server

    Campbell, John O

    2016-01-01

    Many of the mathematical frameworks describing natural selection are equivalent to Bayes Theorem, also known as Bayesian updating. By definition, a process of Bayesian Inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus natural selection serves as a counterexample to a widely held interpretation that restricts Bayesian Inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an "experiment" in the external world environment, and the results of that "experiment" or the "surprise" entailed by predicted and actual outcomes of the "experiment". Minimization of free energy implies that the implicit measure of "surprise" experienced serves to update the generative model in a Bayesian manner. This description clo...
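
    The equivalence the abstract appeals to can be stated compactly: the discrete replicator update has the same form as Bayes' rule, with fitness in the role of likelihood (a standard identification, written here in generic notation rather than the paper's):

        p_{t+1}(i) = \frac{f(i)\, p_t(i)}{\sum_j f(j)\, p_t(j)}
        \quad\Longleftrightarrow\quad
        P(h \mid d) = \frac{P(d \mid h)\, P(h)}{\sum_{h'} P(d \mid h')\, P(h')}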

  11. Bayesian model comparison and parameter inference in systems biology using nested sampling.

    Science.gov (United States)

    Pullen, Nick; Morris, Richard J

    2014-01-01

    Inferring parameters for models of biological processes is a current challenge in systems biology, as is the related problem of comparing competing models that explain the data. In this work we apply Skilling's nested sampling to address both of these problems. Nested sampling is a Bayesian method for exploring parameter space that transforms a multi-dimensional integral to a 1D integration over likelihood space. This approach focuses on the computation of the marginal likelihood or evidence. The ratio of evidences of different models leads to the Bayes factor, which can be used for model comparison. We demonstrate how nested sampling can be used to reverse-engineer a system's behaviour whilst accounting for the uncertainty in the results. The effect of missing initial conditions of the variables as well as unknown parameters is investigated. We show how the evidence and the model ranking can change as a function of the available data. Furthermore, the addition of data from extra variables of the system can deliver more information for model comparison than increasing the data from one variable, thus providing a basis for experimental design. PMID:24523891
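
    The model-comparison step reduces to a ratio of evidences; a minimal sketch, assuming two log-evidence values of the kind a nested-sampling run returns (the numbers are hypothetical):

        import math

        log_Z_model_a = -142.3   # hypothetical log-evidence, model A
        log_Z_model_b = -145.9   # hypothetical log-evidence, model B

        log_bayes_factor = log_Z_model_a - log_Z_model_b
        print("log10 Bayes factor:", log_bayes_factor / math.log(10))
        # > 2 on the log10 scale is conventionally "decisive" support for A.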

  12. Bayesian model comparison and parameter inference in systems biology using nested sampling.

    Directory of Open Access Journals (Sweden)

    Nick Pullen

    Full Text Available Inferring parameters for models of biological processes is a current challenge in systems biology, as is the related problem of comparing competing models that explain the data. In this work we apply Skilling's nested sampling to address both of these problems. Nested sampling is a Bayesian method for exploring parameter space that transforms a multi-dimensional integral to a 1D integration over likelihood space. This approach focuses on the computation of the marginal likelihood or evidence. The ratio of evidences of different models leads to the Bayes factor, which can be used for model comparison. We demonstrate how nested sampling can be used to reverse-engineer a system's behaviour whilst accounting for the uncertainty in the results. The effect of missing initial conditions of the variables as well as unknown parameters is investigated. We show how the evidence and the model ranking can change as a function of the available data. Furthermore, the addition of data from extra variables of the system can deliver more information for model comparison than increasing the data from one variable, thus providing a basis for experimental design.

  13. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...... intensity function, while the second approach is based on an underlying clustering and branching structure in the Hawkes process. For practical use, MCMC (Markov chain Monte Carlo) methods are employed. The two approaches are compared numerically using three examples of the Hawkes process....
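
    For reference, the conditional intensity underlying the first approach is, for an exponential kernel, lambda(t) = mu + sum over t_i < t of alpha * beta * exp(-beta * (t - t_i)); a direct sketch follows (parameter values are illustrative assumptions):

        import numpy as np

        def hawkes_intensity(t, events, mu=0.5, alpha=0.8, beta=1.2):
            """Conditional intensity of an exponential-kernel Hawkes process at time t."""
            events = np.asarray(events)
            past = events[events < t]
            return mu + np.sum(alpha * beta * np.exp(-beta * (t - past)))

        print(hawkes_intensity(5.0, events=[1.0, 2.5, 4.8]))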

  14. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    2013-01-01

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional intensity function, while the second approach is based on an underlying clustering and branching structure in the Hawkes process. For practical use, MCMC (Markov chain Monte Carlo) methods are employed. The two approaches are compared numerically using three examples of the Hawkes process.

  15. Multi-scale inference of interaction rules in animal groups using Bayesian model selection.

    Directory of Open Access Journals (Sweden)

    Richard P Mann

    Full Text Available Inference of interaction rules of animals moving in groups usually relies on an analysis of large scale system behaviour. Models are tuned through repeated simulation until they match the observed behaviour. More recent work has used the fine scale motions of animals to validate and fit the rules of interaction of animals in groups. Here, we use a Bayesian methodology to compare a variety of models to the collective motion of glass prawns (Paratya australiensis). We show that these exhibit a stereotypical 'phase transition', whereby an increase in density leads to the onset of collective motion in one direction. We fit models to this data, which range from: a mean-field model where all prawns interact globally; to a spatial Markovian model where prawns are self-propelled particles influenced only by the current positions and directions of their neighbours; up to non-Markovian models where prawns have 'memory' of previous interactions, integrating their experiences over time when deciding to change behaviour. We show that the mean-field model fits the large scale behaviour of the system, but does not capture the observed locality of interactions. Traditional self-propelled particle models fail to capture the fine scale dynamics of the system. The most sophisticated model, the non-Markovian model, provides a good match to the data at both the fine scale and in terms of reproducing global dynamics, while maintaining a biologically plausible perceptual range. We conclude that prawns' movements are influenced by not just the current direction of nearby conspecifics, but also those encountered in the recent past. Given the simplicity of prawns as a study system our research suggests that self-propelled particle models of collective motion should, if they are to be realistic at multiple biological scales, include memory of previous interactions and other non-Markovian effects.

  16. Multi-scale inference of interaction rules in animal groups using Bayesian model selection.

    Directory of Open Access Journals (Sweden)

    Richard P Mann

    2012-01-01

    Full Text Available Inference of interaction rules of animals moving in groups usually relies on an analysis of large scale system behaviour. Models are tuned through repeated simulation until they match the observed behaviour. More recent work has used the fine scale motions of animals to validate and fit the rules of interaction of animals in groups. Here, we use a Bayesian methodology to compare a variety of models to the collective motion of glass prawns (Paratya australiensis). We show that these exhibit a stereotypical 'phase transition', whereby an increase in density leads to the onset of collective motion in one direction. We fit models to this data, which range from: a mean-field model where all prawns interact globally; to a spatial Markovian model where prawns are self-propelled particles influenced only by the current positions and directions of their neighbours; up to non-Markovian models where prawns have 'memory' of previous interactions, integrating their experiences over time when deciding to change behaviour. We show that the mean-field model fits the large scale behaviour of the system, but does not capture fine scale rules of interaction, which are primarily mediated by physical contact. Conversely, the Markovian self-propelled particle model captures the fine scale rules of interaction but fails to reproduce global dynamics. The most sophisticated model, the non-Markovian model, provides a good match to the data at both the fine scale and in terms of reproducing global dynamics. We conclude that prawns' movements are influenced by not just the current direction of nearby conspecifics, but also those encountered in the recent past. Given the simplicity of prawns as a study system our research suggests that self-propelled particle models of collective motion should, if they are to be realistic at multiple biological scales, include memory of previous interactions and other non-Markovian effects.

  17. A Fast Iterative Bayesian Inference Algorithm for Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand; Manchón, Carles Navarro; Fleury, Bernard Henri

    2013-01-01

    representation of the Bessel K probability density function; a highly efficient, fast iterative Bayesian inference method is then applied to the proposed model. The resulting estimator outperforms other state-of-the-art Bayesian and non-Bayesian estimators, either by yielding lower mean squared estimation error...

  18. Bayesian inference in partially identified models: Is the shape of the posterior distribution useful?

    OpenAIRE

    Gustafson, Paul

    2014-01-01

    Partially identified models are characterized by the distribution of observables being compatible with a set of values for the target parameter, rather than a single value. This set is often referred to as an identification region. From a non-Bayesian point of view, the identification region is the object revealed to the investigator in the limit of increasing sample size. Conversely, a Bayesian analysis provides the identification region plus the limiting posterior distribution over this reg...

  19. From low-dimensional model selection to high-dimensional inference: tailoring Bayesian methods to biological dynamical systems

    OpenAIRE

    Hug, Sabine Carolin

    2015-01-01

    In this thesis we use differential equations for mathematically representing biological processes. For this we have to infer the associated parameters for fitting the differential equations to measurement data. If the structure of the ODE itself is uncertain, model selection methods have to be applied. We refine several existing Bayesian methods, ranging from an adaptive scheme for the computation of high-dimensional integrals to multi-chain Metropolis-Hastings algorithms for high-dimensional...

  20. Bayesianism and inference to the best explanation

    Directory of Open Access Journals (Sweden)

    Valeriano IRANZO

    2008-01-01

    Full Text Available Bayesianism and Inference to the best explanation (IBE) are two different models of inference. Recently there has been some debate about the possibility of “bayesianizing” IBE. Firstly I explore several alternatives to include explanatory considerations in Bayes’s Theorem. Then I distinguish two different interpretations of prior probabilities: “IBE-Bayesianism” (IBE-Bay) and “frequentist-Bayesianism” (Freq-Bay). After detailing the content of the latter, I propose a rule for assessing the priors. I also argue that Freq-Bay: (i) endorses a role for explanatory value in the assessment of scientific hypotheses; (ii) avoids a purely subjectivist reading of prior probabilities; and (iii) fits better than IBE-Bayesianism with two basic facts about science, i.e., the prominent role played by empirical testing and the existence of many scientific theories in the past that failed to fulfil their promises and were subsequently abandoned.

  1. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating and ...
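
    PRIMULA itself is a separate tool, but the compilation idea can be shown on a toy propositional network (our sketch, with made-up probabilities): the network polynomial is a sum-product expression over indicator variables and parameters, setting evidence amounts to zeroing indicators, and a single bottom-up evaluation of the circuit yields the probability of evidence.

```python
# Arithmetic-circuit-style evaluation of a two-node network A -> B:
#   f = sum_a lam_a[a] * th_a[a] * sum_b lam_b[b] * th_b[a][b]
th_a = {True: 0.3, False: 0.7}                      # P(A)
th_b = {True: {True: 0.9, False: 0.1},              # P(B | A)
        False: {True: 0.2, False: 0.8}}

def prob_of_evidence(a_obs=None, b_obs=None):
    # indicator variables: 1 unless the observation rules the value out
    lam_a = {v: 1.0 if a_obs in (None, v) else 0.0 for v in (True, False)}
    lam_b = {v: 1.0 if b_obs in (None, v) else 0.0 for v in (True, False)}
    return sum(lam_a[a] * th_a[a] *
               sum(lam_b[b] * th_b[a][b] for b in (True, False))
               for a in (True, False))

print(prob_of_evidence(b_obs=True))                 # P(B = true) = 0.41
print(prob_of_evidence(a_obs=True, b_obs=True) /
      prob_of_evidence(b_obs=True))                 # P(A = true | B = true)
```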

  2. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by eva...

  3. Semiparametric Bayesian inference on skew-normal joint modeling of multivariate longitudinal and survival data.

    Science.gov (United States)

    Tang, An-Min; Tang, Nian-Sheng

    2015-02-28

    We propose a semiparametric multivariate skew-normal joint model for multivariate longitudinal and multivariate survival data. One main feature of the posited model is that we relax the commonly used normality assumption for random effects and within-subject error by using a centered Dirichlet process prior to specify the random effects distribution and using a multivariate skew-normal distribution to specify the within-subject error distribution and model trajectory functions of longitudinal responses semiparametrically. A Bayesian approach is proposed to simultaneously obtain Bayesian estimates of unknown parameters, random effects and nonparametric functions by combining the Gibbs sampler and the Metropolis-Hastings algorithm. Particularly, a Bayesian local influence approach is developed to assess the effect of minor perturbations to within-subject measurement error and random effects. Several simulation studies and an example are presented to illustrate the proposed methodologies. PMID:25404574
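
    The Gibbs-plus-Metropolis-Hastings combination used here is a general pattern; the following sketch (ours, on a deliberately simple normal model rather than the authors' joint model) shows the structure: one block with a closed-form full conditional is Gibbs-updated, while one without is updated by a random-walk MH step.

```python
# Metropolis-Hastings-within-Gibbs on y_i ~ N(mu, exp(beta)).
import numpy as np

rng = np.random.default_rng(1)
y = rng.normal(2.0, 1.5, size=100)               # toy data: mean 2, sd 1.5

def log_cond_beta(beta, mu):                     # beta = log variance
    return (-0.5 * len(y) * beta
            - 0.5 * np.exp(-beta) * np.sum((y - mu) ** 2)
            - 0.5 * beta ** 2 / 10.0)            # N(0, 10) prior on beta

mu, beta, draws = 0.0, 0.0, []
for _ in range(5000):
    # Gibbs step: mu | beta, y is conjugate normal under a flat prior
    mu = rng.normal(y.mean(), np.sqrt(np.exp(beta) / len(y)))
    # MH step: beta | mu, y has no closed form -> random-walk proposal
    prop = beta + 0.3 * rng.normal()
    if np.log(rng.uniform()) < log_cond_beta(prop, mu) - log_cond_beta(beta, mu):
        beta = prop
    draws.append((mu, beta))

mu_s, beta_s = np.array(draws[1000:]).T
print(mu_s.mean(), np.exp(beta_s.mean() / 2))    # roughly 2.0 and 1.5
```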

  4. Bayesian Estimation and Inference Using Stochastic Electronics.

    Science.gov (United States)

    Thakur, Chetan Singh; Afshar, Saeed; Wang, Runchun M; Hamilton, Tara J; Tapson, Jonathan; van Schaik, André

    2016-01-01

    In this paper, we present the implementation of two types of Bayesian inference problems to demonstrate the potential of building probabilistic algorithms in hardware using a single set of building blocks with the ability to perform these computations in real time. The first implementation, referred to as the BEAST (Bayesian Estimation and Stochastic Tracker), demonstrates a simple problem where an observer uses an underlying Hidden Markov Model (HMM) to track a target in one dimension. In this implementation, sensors make noisy observations of the target position at discrete time steps. The tracker learns the transition model for target movement, and the observation model for the noisy sensors, and uses these to estimate the target position by solving the Bayesian recursive equation online. We show the tracking performance of the system and demonstrate how it can learn the observation model, the transition model, and the external distractor (noise) probability interfering with the observations. In the second implementation, referred to as the Bayesian INference in DAG (BIND), we show how inference can be performed in a Directed Acyclic Graph (DAG) using stochastic circuits. We show how these building blocks can be easily implemented using simple digital logic gates. An advantage of the stochastic electronic implementation is that it is robust to certain types of noise, which may become an issue in integrated circuit (IC) technology with feature sizes in the order of tens of nanometers due to their low noise margin, the effect of high-energy cosmic rays and the low supply voltage. In our framework, the flipping of random individual bits would not affect the system performance because information is encoded in a bit stream. PMID:27047326
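
    A plain software analogue of the Bayesian recursive equation such a tracker solves (our sketch; the hardware described above uses stochastic bit streams rather than floating point) is the discrete HMM filter, alternating a predict step through the transition model with a Bayes update through the observation model.

```python
# Discrete HMM Bayes filter for 1D tracking on a grid.
import numpy as np

n = 10                                            # positions on a 1D grid
T = np.zeros((n, n))                              # transition model
for i in range(n):                                # random walk: stay/left/right
    for j in (i - 1, i, i + 1):
        if 0 <= j < n:
            T[i, j] = 1.0
T /= T.sum(axis=1, keepdims=True)

def obs_lik(z):                                   # noisy sensor centered on z
    lik = np.full(n, 0.05)
    lik[z] = 1.0
    return lik

belief = np.full(n, 1.0 / n)                      # uniform prior
for z in [3, 4, 4, 5, 6]:                         # noisy observed positions
    belief = belief @ T                           # predict (transition model)
    belief *= obs_lik(z)                          # Bayes update (observation)
    belief /= belief.sum()

print(np.argmax(belief))                          # MAP estimate of position
```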

  5. Bayesian inference for generalized linear mixed model based on the multivariate t distribution in population pharmacokinetic study.

    Directory of Open Access Journals (Sweden)

    Fang-Rong Yan

    Full Text Available This article provides a fully Bayesian approach for modeling of single-dose and complete pharmacokinetic data in a population pharmacokinetic (PK) model. To overcome the impact of outliers and the difficulty of computation, a generalized linear model is chosen with the hypothesis that the errors follow a multivariate Student t distribution, which is a heavy-tailed distribution. The aim of this study is to investigate and implement the performance of the multivariate t distribution to analyze population pharmacokinetic data. Bayesian predictive inferences and the Metropolis-Hastings algorithm schemes are used to process the intractable posterior integration. The precision and accuracy of the proposed model are illustrated by simulated data and a real example of theophylline data.

  6. Universal Darwinism As a Process of Bayesian Inference.

    Science.gov (United States)

    Campbell, John O

    2016-01-01

    Many of the mathematical frameworks describing natural selection are equivalent to Bayes' Theorem, also known as Bayesian updating. By definition, a process of Bayesian Inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus, natural selection serves as a counter example to a widely-held interpretation that restricts Bayesian Inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an "experiment" in the external world environment, and the results of that "experiment" or the "surprise" entailed by predicted and actual outcomes of the "experiment." Minimization of free energy implies that the implicit measure of "surprise" experienced serves to update the generative model in a Bayesian manner. This description closely accords with the mechanisms of generalized Darwinian process proposed both by Dawkins, in terms of replicators and vehicles, and Campbell, in terms of inferential systems. Bayesian inference is an algorithm for the accumulation of evidence-based knowledge. This algorithm is now seen to operate over a wide range of evolutionary processes, including natural selection, the evolution of mental models and cultural evolutionary processes, notably including science itself. The variational principle of free energy minimization may thus serve as a unifying mathematical framework for universal Darwinism, the study of evolutionary processes operating throughout nature. PMID:27375438
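
    The equivalence the paper builds on can be checked numerically in a few lines (our illustration, using the discrete-generation replicator map): iterating selection for n generations yields exactly the posterior obtained from one Bayes update with the n-fold accumulated likelihood, with type frequencies playing the role of the prior and fitnesses the role of the likelihood.

```python
# Natural selection as Bayesian updating: replicator map vs. Bayes' rule.
import numpy as np

p0 = np.array([0.5, 0.3, 0.2])      # initial type frequencies ("prior")
w = np.array([1.0, 1.5, 0.8])       # relative fitnesses ("likelihood")

# n generations of selection: p' = p * w / sum(p * w)
p = p0.copy()
for _ in range(10):
    p = p * w / np.sum(p * w)

# one Bayesian update with the accumulated likelihood w**10
posterior = p0 * w**10 / np.sum(p0 * w**10)
print(np.allclose(p, posterior))    # True: selection = Bayesian updating
```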

  7. Bayesian default probability models

    OpenAIRE

    Andrlíková, Petra

    2014-01-01

    This paper proposes a methodology for default probability estimation for low default portfolios, where the statistical inference may become troublesome. The author suggests using logistic regression models with the Bayesian estimation of parameters. The piecewise logistic regression model and Box-Cox transformation of credit risk score is used to derive the estimates of probability of default, which extends the work by Neagu et al. (2009). The paper shows that the Bayesian models are more acc...

  8. A sequential point process model and Bayesian inference for spatial point patterns with linear structures

    DEFF Research Database (Denmark)

    Møller, Jesper; Rasmussen, Jakob Gulddahl

    We introduce a flexible spatial point process model for spatial point patterns exhibiting linear structures, without incorporating a latent line process. The model is given by an underlying sequential point process model, i.e. each new point is generated given the previous points. Under this model...... previous points is such that the dependent cluster point is likely to occur close to a previous cluster point. We demonstrate the flexibility of the model for producing point patterns with linear structures, and propose to use the model as the likelihood in a Bayesian setting when analyzing a spatial...

  9. From least squares to multilevel modeling: A graphical introduction to Bayesian inference

    Science.gov (United States)

    Loredo, Thomas J.

    2016-01-01

    This tutorial presentation will introduce some of the key ideas and techniques involved in applying Bayesian methods to problems in astrostatistics. The focus will be on the big picture: understanding the foundations (interpreting probability, Bayes's theorem, the law of total probability and marginalization), making connections to traditional methods (propagation of errors, least squares, chi-squared, maximum likelihood, Monte Carlo simulation), and highlighting problems where a Bayesian approach can be particularly powerful (Poisson processes, density estimation and curve fitting with measurement error). The "graphical" component of the title reflects an emphasis on pictorial representations of some of the math, but also on the use of graphical models (multilevel or hierarchical models) for analyzing complex data. Code for some examples from the talk will be available to participants, in Python and in the Stan probabilistic programming language.

  10. Propriety conditions for the Bayesian autologistic model: Inference for histone modifications

    OpenAIRE

    Mitra, Riten; Müller, Peter; Ji, Yuan

    2013-01-01

    Motivated by inference for a set of histone modifications we consider an improper prior for an autologistic model. We state sufficient conditions for posterior propriety under a constant prior on the coefficients of an autologistic model. We use known results for a multinomial logistic regression to prove posterior propriety under the autologistic model. The conditions are easily verified.

  11. Bayesian Inference Methods for Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand

    2013-01-01

    This thesis deals with sparse Bayesian learning (SBL) with application to radio channel estimation. As opposed to the classical approach for sparse signal representation, we focus on the problem of inferring complex signals. Our investigations within SBL constitute the basis for the development of Bayesian inference algorithms for sparse channel estimation. Sparse inference methods aim at finding the sparse representation of a signal given in some overcomplete dictionary of basis vectors. Within this context, one of our main contributions to the field of SBL is a hierarchical representation of ... inference algorithms based on the proposed prior representation for sparse channel estimation in orthogonal frequency-division multiplexing receivers. The inference algorithms, which are mainly obtained from variational Bayesian methods, exploit the underlying sparse structure of wireless channel responses...

  12. An adaptive sparse-grid high-order stochastic collocation method for Bayesian inference in groundwater reactive transport modeling

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Guannan [ORNL; Webster, Clayton G [ORNL; Gunzburger, Max D [ORNL

    2012-09-01

    Although Bayesian analysis has become vital to the quantification of prediction uncertainty in groundwater modeling, its application has been hindered due to the computational cost associated with numerous model executions needed for exploring the posterior probability density function (PPDF) of model parameters. This is particularly the case when the PPDF is estimated using Markov Chain Monte Carlo (MCMC) sampling. In this study, we develop a new approach that improves computational efficiency of Bayesian inference by constructing a surrogate system based on an adaptive sparse-grid high-order stochastic collocation (aSG-hSC) method. Unlike previous works using first-order hierarchical basis, we utilize a compactly supported higher-order hierarchical basis to construct the surrogate system, resulting in a significant reduction in the number of computational simulations required. In addition, we use hierarchical surplus as an error indicator to determine adaptive sparse grids. This allows local refinement in the uncertain domain and/or anisotropic detection with respect to the random model parameters, which further improves computational efficiency. Finally, we incorporate a global optimization technique and propose an iterative algorithm for building the surrogate system for the PPDF with multiple significant modes. Once the surrogate system is determined, the PPDF can be evaluated by sampling the surrogate system directly with very little computational cost. The developed method is evaluated first using a simple analytical density function with multiple modes and then using two synthetic groundwater reactive transport models. The groundwater models represent different levels of complexity; the first example involves coupled linear reactions and the second example simulates nonlinear uranium surface complexation. The results show that the aSG-hSC is an effective and efficient tool for Bayesian inference in groundwater modeling in comparison with conventional

  13. Bayesian inference tools for inverse problems

    Science.gov (United States)

    Mohammad-Djafari, Ali

    2013-08-01

    In this paper, first the basics of Bayesian inference with a parametric model of the data are presented. Then, the extensions needed when dealing with inverse problems are given, in particular for linear models such as deconvolution or image reconstruction in Computed Tomography (CT). The main point to discuss then is the prior modeling of signals and images. A classification of these priors is presented, first into separable and Markovian models and then into simple or hierarchical models with hidden variables. For practical applications, we also need to consider the estimation of the hyperparameters. Finally, we see that we have to infer simultaneously the unknowns, the hidden variables and the hyperparameters. Very often, the expression of this joint posterior law is too complex to be handled directly; indeed, we can rarely obtain analytical solutions for point estimators such as the Maximum A Posteriori (MAP) or Posterior Mean (PM). Three main tools can then be used: Laplace approximation (LAP), Markov Chain Monte Carlo (MCMC) and Bayesian Variational Approximations (BVA). To illustrate all these aspects, we consider a deconvolution problem where we know that the input signal is sparse and propose to use a Student-t prior for it. To handle the Bayesian computations with this model, we use the property that the Student-t distribution can be modelled via an infinite mixture of Gaussians, thus introducing hidden variables, which are the variances. Then, the expression of the joint posterior of the input signal samples, the hidden variables (which are here the inverse variances of those samples) and the hyperparameters of the problem (for example the variance of the noise) is given. From this point, we present the joint maximization by alternate optimization and the three possible approximation methods. Finally, the proposed methodology is applied in different applications such as mass spectrometry, spectrum estimation of quasi periodic biological signals and
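
    The scale-mixture property invoked above is easy to verify by simulation (our sketch): drawing Gamma-distributed inverse variances as hidden variables and then conditionally Gaussian samples reproduces the Student-t distribution.

```python
# Student-t as an infinite mixture of Gaussians with Gamma inverse variances:
# if lam ~ Gamma(nu/2, rate nu/2) and x | lam ~ N(0, 1/lam), then x ~ t_nu.
import numpy as np

rng = np.random.default_rng(0)
nu, n = 4.0, 200_000

lam = rng.gamma(shape=nu / 2, scale=2 / nu, size=n)  # hidden inverse variances
x_mix = rng.normal(0.0, 1.0 / np.sqrt(lam))          # conditionally Gaussian
x_t = rng.standard_t(nu, size=n)                     # direct Student-t draws

# the two samples agree in distribution (compare a few quantiles)
qs = [0.05, 0.25, 0.5, 0.75, 0.95]
print(np.quantile(x_mix, qs).round(2), np.quantile(x_t, qs).round(2))
```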

  14. Human collective intelligence as distributed Bayesian inference

    CERN Document Server

    Krafft, Peter M; Pan, Wei; Della Penna, Nicolás; Altshuler, Yaniv; Shmueli, Erez; Tenenbaum, Joshua B; Pentland, Alex

    2016-01-01

    Collective intelligence is believed to underlie the remarkable success of human society. The formation of accurate shared beliefs is one of the key components of human collective intelligence. How are accurate shared beliefs formed in groups of fallible individuals? Answering this question requires a multiscale analysis. We must understand both the individual decision mechanisms people use, and the properties and dynamics of those mechanisms in the aggregate. As of yet, mathematical tools for such an approach have been lacking. To address this gap, we introduce a new analytical framework: We propose that groups arrive at accurate shared beliefs via distributed Bayesian inference. Distributed inference occurs through information processing at the individual level, and yields rational belief formation at the group level. We instantiate this framework in a new model of human social decision-making, which we validate using a dataset we collected of over 50,000 users of an online social trading platform where inves...

  15. Bayesian Inference on Gravitational Waves

    Directory of Open Access Journals (Sweden)

    Asad Ali

    2015-12-01

    Full Text Available The Bayesian approach is becoming increasingly popular among the astrophysics data analysis communities. However, the Pakistan statistics communities are unaware of this fertile interaction between the two disciplines. Bayesian methods have been used to address astronomical problems since the very birth of Bayes probability in the eighteenth century. Today the Bayesian methods for the detection and parameter estimation of gravitational waves have solid theoretical grounds with a strong promise for realistic applications. This article aims to introduce the Pakistan statistics communities to the applications of Bayesian Monte Carlo methods in the analysis of gravitational wave data, with an overview of the Bayesian signal detection and estimation methods and a demonstration by a couple of simplified examples.

  16. Efficient Bayesian inference of subsurface flow models using nested sampling and sparse polynomial chaos surrogates

    KAUST Repository

    Elsheikh, Ahmed H.

    2014-02-01

    An efficient Bayesian calibration method based on the nested sampling (NS) algorithm and non-intrusive polynomial chaos method is presented. Nested sampling is a Bayesian sampling algorithm that builds a discrete representation of the posterior distributions by iteratively re-focusing a set of samples to high likelihood regions. NS allows representing the posterior probability density function (PDF) with a smaller number of samples and reduces the curse of dimensionality effects. The main difficulty of the NS algorithm is in the constrained sampling step which is commonly performed using a random walk Markov Chain Monte-Carlo (MCMC) algorithm. In this work, we perform a two-stage sampling using a polynomial chaos response surface to filter out rejected samples in the Markov Chain Monte-Carlo method. The combined use of nested sampling and the two-stage MCMC based on approximate response surfaces provides significant computational gains in terms of the number of simulation runs. The proposed algorithm is applied for calibration and model selection of subsurface flow models. © 2013.

  17. Efficient Bayesian inference for ARFIMA processes

    Science.gov (United States)

    Graves, T.; Gramacy, R. B.; Franzke, C. L. E.; Watkins, N. W.

    2015-03-01

    Many geophysical quantities, like atmospheric temperature, water levels in rivers, and wind speeds, have shown evidence of long-range dependence (LRD). LRD means that these quantities experience non-trivial temporal memory, which potentially enhances their predictability, but also hampers the detection of externally forced trends. Thus, it is important to reliably identify whether or not a system exhibits LRD. In this paper we present a modern and systematic approach to the inference of LRD. Rather than Mandelbrot's fractional Gaussian noise, we use the more flexible Autoregressive Fractional Integrated Moving Average (ARFIMA) model which is widely used in time series analysis, and of increasing interest in climate science. Unlike most previous work on the inference of LRD, which is frequentist in nature, we provide a systematic treatment of Bayesian inference. In particular, we provide a new approximate likelihood for efficient parameter inference, and show how nuisance parameters (e.g. short memory effects) can be integrated over in order to focus on long memory parameters, and hypothesis testing more directly. We illustrate our new methodology on the Nile water level data, with favorable comparison to the standard estimators.
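
    For the long-memory core of the model, an ARFIMA(0, d, 0) series can be generated directly from the binomial expansion of the fractional integration operator; the sketch below is ours (the function name and the truncation of the MA(infinity) expansion at the sample length are illustrative choices, not the paper's code).

```python
# ARFIMA(0, d, 0): x_t = (1 - B)^{-d} eps_t, via the recursive weights
# psi_0 = 1, psi_k = psi_{k-1} * (k - 1 + d) / k of the MA expansion.
import numpy as np

def arfima_0d0(d, n, rng):
    psi = np.empty(n)
    psi[0] = 1.0
    for k in range(1, n):
        psi[k] = psi[k - 1] * (k - 1 + d) / k
    eps = rng.normal(size=n)
    # x_t = sum_k psi_k * eps_{t-k}: a truncated convolution
    return np.convolve(eps, psi)[:n]

rng = np.random.default_rng(0)
x = arfima_0d0(d=0.3, n=2000, rng=rng)   # 0 < d < 0.5: stationary LRD
print(x[:5])
```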

  18. Systematic validation of non-equilibrium thermochemical models using Bayesian inference

    Energy Technology Data Exchange (ETDEWEB)

    Miki, Kenji [NASA Glenn Research Center, OAI, 22800 Cedar Point Rd, Cleveland, OH 44142 (United States); Panesi, Marco, E-mail: mpanesi@illinois.edu [Department of Aerospace Engineering, University of Illinois at Urbana-Champaign, 306 Talbot Lab, 104 S. Wright St., Urbana, IL 61801 (United States); Prudhomme, Serge [Département de mathématiques et de génie industriel, Ecole Polytechnique de Montréal, C.P. 6079, succ. Centre-ville, Montréal, QC, H3C 3A7 (Canada)

    2015-10-01

    The validation process proposed by Babuška et al. [1] is applied to thermochemical models describing post-shock flow conditions. In this validation approach, experimental data is involved only in the calibration of the models, and the decision process is based on quantities of interest (QoIs) predicted on scenarios that are not necessarily amenable experimentally. Moreover, uncertainties present in the experimental data, as well as those resulting from an incomplete physical model description, are propagated to the QoIs. We investigate four commonly used thermochemical models: a one-temperature model (which assumes thermal equilibrium among all inner modes), and two-temperature models developed by Macheret et al. [2], Marrone and Treanor [3], and Park [4]. Up to 16 uncertain parameters are estimated using Bayesian updating based on the latest absolute volumetric radiance data collected at the Electric Arc Shock Tube (EAST) installed inside the NASA Ames Research Center. Following the solution of the inverse problems, the forward problems are solved in order to predict the radiative heat flux, QoI, and examine the validity of these models. Our results show that all four models are invalid, but for different reasons: the one-temperature model simply fails to reproduce the data while the two-temperature models exhibit unacceptably large uncertainties in the QoI predictions.

  19. Systematic validation of non-equilibrium thermochemical models using Bayesian inference

    KAUST Repository

    Miki, Kenji

    2015-10-01

    © 2015 Elsevier Inc. The validation process proposed by Babuška et al. [1] is applied to thermochemical models describing post-shock flow conditions. In this validation approach, experimental data is involved only in the calibration of the models, and the decision process is based on quantities of interest (QoIs) predicted on scenarios that are not necessarily amenable experimentally. Moreover, uncertainties present in the experimental data, as well as those resulting from an incomplete physical model description, are propagated to the QoIs. We investigate four commonly used thermochemical models: a one-temperature model (which assumes thermal equilibrium among all inner modes), and two-temperature models developed by Macheret et al. [2], Marrone and Treanor [3], and Park [4]. Up to 16 uncertain parameters are estimated using Bayesian updating based on the latest absolute volumetric radiance data collected at the Electric Arc Shock Tube (EAST) installed inside the NASA Ames Research Center. Following the solution of the inverse problems, the forward problems are solved in order to predict the radiative heat flux, QoI, and examine the validity of these models. Our results show that all four models are invalid, but for different reasons: the one-temperature model simply fails to reproduce the data while the two-temperature models exhibit unacceptably large uncertainties in the QoI predictions.

  20. An Intuitive Dashboard for Bayesian Network Inference

    Science.gov (United States)

    Reddy, Vikas; Charisse Farr, Anna; Wu, Paul; Mengersen, Kerrie; Yarlagadda, Prasad K. D. V.

    2014-03-01

    Current Bayesian network software packages provide a good graphical interface for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing and at times it could be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard, which provides an additional layer of abstraction, enabling the end-users to easily perform inferences over the Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on the cause-and-effect relationship, making the user-interaction more intuitive and friendly. In addition to performing various types of inferences, the users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using QT and SMILE libraries in C++.

  1. An Intuitive Dashboard for Bayesian Network Inference

    International Nuclear Information System (INIS)

    Current Bayesian network software packages provide a good graphical interface for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing and at times it could be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard, which provides an additional layer of abstraction, enabling the end-users to easily perform inferences over the Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on the cause-and-effect relationship, making the user-interaction more intuitive and friendly. In addition to performing various types of inferences, the users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using QT and SMILE libraries in C++.

  2. Kernel Bayesian Inference with Posterior Regularization

    OpenAIRE

    Song, Yang; Jun ZHU; Ren, Yong

    2016-01-01

    We propose a vector-valued regression problem whose solution is equivalent to the reproducing kernel Hilbert space (RKHS) embedding of the Bayesian posterior distribution. This equivalence provides a new understanding of kernel Bayesian inference. Moreover, the optimization problem induces a new regularization for the posterior embedding estimator, which is faster and has comparable performance to the squared regularization in kernel Bayes' rule. This regularization coincides with a former th...

  3. Improving inferences from short-term ecological studies with Bayesian hierarchical modeling: white-headed woodpeckers in managed forests.

    Science.gov (United States)

    Linden, Daniel W; Roloff, Gary J

    2015-08-01

    Pilot studies are often used to design short-term research projects and long-term ecological monitoring programs, but data are sometimes discarded when they do not match the eventual survey design. Bayesian hierarchical modeling provides a convenient framework for integrating multiple data sources while explicitly separating sample variation into observation and ecological state processes. Such an approach can better estimate state uncertainty and improve inferences from short-term studies in dynamic systems. We used a dynamic multistate occupancy model to estimate the probabilities of occurrence and nesting for white-headed woodpeckers Picoides albolarvatus in recent harvest units within managed forests of northern California, USA. Our objectives were to examine how occupancy states and state transitions were related to forest management practices, and how the probabilities changed over time. Using Gibbs variable selection, we made inferences using multiple model structures and generated model-averaged estimates. Probabilities of white-headed woodpecker occurrence and nesting were high in 2009 and 2010, and the probability that nesting persisted at a site was positively related to the snag density in harvest units. Prior-year nesting resulted in higher probabilities of subsequent occurrence and nesting. We demonstrate the benefit of forest management practices that increase the density of retained snags in harvest units for providing white-headed woodpecker nesting habitat. While including an additional year of data from our pilot study did not drastically alter management recommendations, it changed the interpretation of the mechanism behind the observed dynamics. Bayesian hierarchical modeling has the potential to maximize the utility of studies based on small sample sizes while fully accounting for measurement error and both estimation and model uncertainty, thereby improving the ability of observational data to inform conservation and management strategies.

  4. A Bayesian method for characterizing distributed micro-releases: II. inference under model uncertainty with short time-series data.

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef; Fast P. (Lawrence Livermore National Laboratory, Livermore, CA); Kraus, M. (Peterson AFB, CO); Ray, J. P.

    2006-01-01

    Terrorist attacks using an aerosolized pathogen preparation have gained credibility as a national security concern after the anthrax attacks of 2001. The ability to characterize such attacks, i.e., to estimate the number of people infected, the time of infection, and the average dose received, is important when planning a medical response. We address this question of characterization by formulating a Bayesian inverse problem predicated on a short time-series of diagnosed patients exhibiting symptoms. To be of relevance to response planning, we limit ourselves to 3-5 days of data. In tests performed with anthrax as the pathogen, we find that these data are usually sufficient, especially if the model of the outbreak used in the inverse problem is an accurate one. In some cases the scarcity of data may initially support outbreak characterizations at odds with the true one, but with sufficient data the correct inferences are recovered; in other words, the inverse problem posed and its solution methodology are consistent. We also explore the effect of model error, i.e., situations for which the model used in the inverse problem is only a partially accurate representation of the outbreak; here, the model predictions and the observations differ by more than random noise. We find that while there is a consistent discrepancy between the inferred and the true characterizations, they are also close enough to be of relevance when planning a response.

  5. Bayesian inference of a lake water quality model by emulating its posterior density

    Science.gov (United States)

    Dietzel, A.; Reichert, P.

    2014-10-01

    We use a Gaussian stochastic process emulator to interpolate the posterior probability density of a computationally demanding application of the biogeochemical-ecological lake model BELAMO to accelerate statistical inference of deterministic model and error model parameters. The deterministic model consists of a mechanistic description of key processes influencing the mass balance of nutrients, dissolved oxygen, organic particles, and phytoplankton and zooplankton in the lake. This model is complemented by a Gaussian stochastic process to describe the remaining model bias and by Normal, independent observation errors. A small subsample of the Markov chain representing the posterior of the model parameters is propagated through the full model to get model predictions and uncertainty estimates. We expect this approximation to be more accurate at only slightly higher computational costs compared to using a Normal approximation to the posterior probability density and linear error propagation to the results as we did in an earlier paper. The performance of the two techniques is compared for a didactical example as well as for the lake model. As expected, for the didactical example, the use of the emulator led to posterior marginals of the model parameters that are closer to those calculated by Markov chain simulation using the full model than those based on the Normal approximation. For the lake model, the new technique proved applicable without an excessive increase in computational requirements, but we faced challenges in the choice of the design data set for emulator calibration. As the posterior is a scalar function of the parameters, the suggested technique is an alternative to the emulation of a potentially more complex, structured output of the simulation model that allows for the use of a less case-specific emulator. This comes at the cost that the full model still has to be used for prediction (which can be done with a smaller, approximately independent subsample
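
    A stripped-down version of the idea (ours; the paper's emulator and lake model are far richer) fits a Gaussian-process interpolator to a handful of log-posterior evaluations of an "expensive" model, then runs Metropolis sampling against the cheap emulator instead of the model itself.

```python
# Emulate a log-posterior with a GP posterior mean, then sample the emulator.
import numpy as np

def log_post(theta):                      # stand-in for the costly model
    return -0.5 * (theta - 1.0) ** 2

def rbf(a, b, ell=1.0):                   # squared-exponential kernel
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

design = np.linspace(-3, 5, 15)           # design points = expensive model runs
K = rbf(design, design) + 1e-8 * np.eye(design.size)
alpha = np.linalg.solve(K, log_post(design))

def emulator(theta):                      # GP mean interpolation of log_post
    return rbf(np.atleast_1d(theta), design) @ alpha

rng = np.random.default_rng(0)
theta, chain = 0.0, []
for _ in range(20000):
    prop = theta + rng.normal()
    # keep the chain inside the region covered by the design points
    if -3 <= prop <= 5 and np.log(rng.uniform()) < float(emulator(prop) - emulator(theta)):
        theta = prop
    chain.append(theta)

print(np.mean(chain[2000:]), np.std(chain[2000:]))   # roughly 1.0 and 1.0
```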

  6. A simple algorithm to estimate genetic variance in an animal threshold model using Bayesian inference

    Directory of Open Access Journals (Sweden)

    Heringstad Bjørg

    2010-07-01

    Full Text Available Abstract Background In the genetic analysis of binary traits with one observation per animal, animal threshold models frequently give biased heritability estimates. In some cases, this problem can be circumvented by fitting sire- or sire-dam models. However, these models are not appropriate in cases where individual records exist on parents. Therefore, the aim of our study was to develop a new Gibbs sampling algorithm for a proper estimation of genetic (co)variance components within an animal threshold model framework. Methods In the proposed algorithm, individuals are classified as either "informative" or "non-informative" with respect to genetic (co)variance components. The "non-informative" individuals are characterized by their Mendelian sampling deviations (deviations from the mid-parent mean) being completely confounded with a single residual on the underlying liability scale. For threshold models, residual variance on the underlying scale is not identifiable. Hence, variance of fully confounded Mendelian sampling deviations cannot be identified either, but can be inferred from the between-family variation. In the new algorithm, breeding values are sampled as in a standard animal model using the full relationship matrix, but genetic (co)variance components are inferred from the sampled breeding values and relationships between "informative" individuals (usually parents) only. The latter is analogous to a sire-dam model (in cases with no individual records on the parents). Results When applied to simulated data sets, the standard animal threshold model failed to produce useful results since samples of genetic variance always drifted towards infinity, while the new algorithm produced proper parameter estimates essentially identical to the results from a sire-dam model (given the fact that no individual records exist for the parents). Furthermore, the new algorithm showed much faster Markov chain mixing properties for genetic parameters (similar to

  7. Bayesian inference of baseline fertility and treatment effects via a crop yield-fertility model.

    Science.gov (United States)

    Chen, Hungyen; Yamagishi, Junko; Kishino, Hirohisa

    2014-01-01

    To effectively manage soil fertility, knowledge is needed of how a crop uses nutrients from fertilizer applied to the soil. Soil quality is a combination of biological, chemical and physical properties and is hard to assess directly because of collective and multiple functional effects. In this paper, we focus on the application of these concepts to agriculture. We define the baseline fertility of soil as the level of fertility that a crop can acquire for growth from the soil. With this strict definition, we propose a new crop yield-fertility model that enables quantification of the process of improving baseline fertility and the effects of treatments solely from the time series of crop yields. The model, modified from Michaelis-Menten kinetics, measures the additional effects of the treatments given the baseline fertility. Using more than 30 years of experimental data, we used the Bayesian framework to estimate the improvements in baseline fertility and the effects of fertilizer and farmyard manure (FYM) on maize (Zea mays), barley (Hordeum vulgare), and soybean (Glycine max) yields. Fertilizer contributed the most to the barley yield and FYM contributed the most to the soybean yield among the three crops. The baseline fertility of the subsurface soil was very low for maize and barley prior to fertilization. In contrast, the baseline fertility in this soil approximated half-saturated fertility for the soybean crop. The long-term soil fertility was increased by adding FYM, but the effect of FYM addition was reduced by the addition of fertilizer. Our results provide evidence that long-term soil fertility under continuous farming was maintained, or increased, by the application of natural nutrients compared with the application of synthetic fertilizer. PMID:25405353

  8. Stan: A Probabilistic Programming Language for Bayesian Inference and Optimization

    Science.gov (United States)

    Gelman, Andrew; Lee, Daniel; Guo, Jiqiang

    2015-01-01

    Stan is a free and open-source C++ program that performs Bayesian inference or optimization for arbitrary user-specified models. It can be called from the command line, R, Python, Matlab, or Julia, and has great promise for fitting large and complex statistical models in many areas of application. We discuss Stan from users' and developers'…
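
    A minimal end-to-end example, assuming the pystan 3 interface (stan.build / sample; the command-line, R, Matlab and Julia interfaces wrap the same inference engine), estimating a normal mean and scale:

```python
# Fit a normal model with Stan via pystan (assumed interface: pystan 3).
import stan  # pip install pystan

program = """
data {
  int<lower=1> N;
  vector[N] y;
}
parameters {
  real mu;
  real<lower=0> sigma;
}
model {
  mu ~ normal(0, 10);       // weakly informative priors
  sigma ~ cauchy(0, 5);
  y ~ normal(mu, sigma);    // likelihood
}
"""

data = {"N": 5, "y": [1.2, 0.8, 1.9, 1.4, 1.1]}
posterior = stan.build(program, data=data, random_seed=1)
fit = posterior.sample(num_chains=2, num_samples=1000)
print(fit["mu"].mean())     # posterior mean of mu
```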

  9. Bayesian parameter inference for empirical stochastic models of paleoclimatic records with dating uncertainty

    Science.gov (United States)

    Boers, Niklas; Goswami, Bedartha; Chekroun, Mickael; Svensson, Anders; Rousseau, Denis-Didier; Ghil, Michael

    2016-04-01

    In the recent past, empirical stochastic models have been successfully applied to model a wide range of climatic phenomena [1,2]. In addition to enhancing our understanding of the geophysical systems under consideration, multilayer stochastic models (MSMs) have been shown to be solidly grounded in the Mori-Zwanzig formalism of statistical physics [3]. They are also well-suited for predictive purposes, e.g., for the El Niño Southern Oscillation [4] and the Madden-Julian Oscillation [5]. In general, these models are trained on a given time series under consideration, and then assumed to reproduce certain dynamical properties of the underlying natural system. Most existing approaches are based on least-squares fitting to determine optimal model parameters, which does not allow for an uncertainty estimation of these parameters. This approach significantly limits the degree to which dynamical characteristics of the time series can be safely inferred from the model. Here, we are specifically interested in fitting low-dimensional stochastic models to time series obtained from paleoclimatic proxy records, such as the oxygen isotope ratio and dust concentration of the NGRIP record [6]. The time series derived from these records exhibit substantial dating uncertainties, in addition to the proxy measurement errors. In particular, for time series of this kind, it is crucial to obtain uncertainty estimates for the final model parameters. Following [7], we first propose a statistical procedure to shift dating uncertainties from the time axis to the proxy axis of layer-counted paleoclimatic records. Thereafter, we show how Maximum Likelihood Estimation in combination with Markov Chain Monte Carlo parameter sampling can be employed to translate all uncertainties present in the original proxy time series to uncertainties of the parameter estimates of the stochastic model. We compare time series simulated by the empirical model to the original time series in terms of standard

  10. A Bayesian Approach to Protein Inference Problem in Shotgun Proteomics

    OpenAIRE

    Li, Yong Fuga; Arnold, Randy J.; Li, Yixue; Radivojac, Predrag; Sheng, Quanhu; Tang, Haixu

    2009-01-01

    The protein inference problem represents a major challenge in shotgun proteomics. In this article, we describe a novel Bayesian approach to address this challenge by incorporating the predicted peptide detectabilities as the prior probabilities of peptide identification. We propose a rigorous probabilistic model for protein inference and provide practical algorithmic solutions to this problem. We used a complex synthetic protein mixture to test our method and obtained promising results.

  11. The NIFTY way of Bayesian signal inference

    International Nuclear Information System (INIS)

    We introduce NIFTY, 'Numerical Information Field Theory', a software package for the development of Bayesian signal inference algorithms that operate independently of any underlying spatial grid and its resolution. A large number of Bayesian and Maximum Entropy methods for 1D signal reconstruction, 2D imaging, as well as 3D tomography, appear formally similar, but one often finds individualized implementations that are neither flexible nor easily transferable. Signal inference in the framework of NIFTY can be done in an abstract way, such that algorithms prototyped in 1D can be applied to real-world problems in higher-dimensional settings. As a versatile library, NIFTY is applicable to, and has already been applied in, 1D, 2D, 3D and spherical settings. A recent application is the D3PO algorithm, targeting the non-trivial task of denoising, deconvolving, and decomposing photon observations in high-energy astronomy.

  12. Using Alien Coins to Test Whether Simple Inference Is Bayesian

    Science.gov (United States)

    Cassey, Peter; Hawkins, Guy E.; Donkin, Chris; Brown, Scott D.

    2016-01-01

    Reasoning and inference are well-studied aspects of basic cognition that have been explained as statistically optimal Bayesian inference. Using a simplified experimental design, we conducted quantitative comparisons between Bayesian inference and human inference at the level of individuals. In 3 experiments, with more than 13,000 participants, we…
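
    The normative yardstick in such experiments is exact Bayesian updating; a sketch (ours, not the authors' materials) with a discrete prior over candidate coin biases:

```python
# Exact Bayesian belief updating about a coin's bias, flip by flip.
import numpy as np

biases = np.array([0.25, 0.5, 0.75])        # candidate coin types
belief = np.array([1 / 3, 1 / 3, 1 / 3])    # uniform prior over types

for flip in [1, 1, 0, 1]:                   # observed flips (1 = heads)
    lik = biases if flip == 1 else 1 - biases
    belief = belief * lik / np.sum(belief * lik)   # Bayes' rule, one datum
    print(belief.round(3))                         # belief after each flip
```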

  13. Analysis of KATRIN data using Bayesian inference

    CERN Document Server

    Riis, Anna Sejersen; Weinheimer, Christian

    2011-01-01

    The KATRIN (KArlsruhe TRItium Neutrino) experiment will be analyzing the tritium beta-spectrum to determine the mass of the neutrino with a sensitivity of 0.2 eV (90% C.L.). This approach to a measurement of the absolute value of the neutrino mass relies only on the principle of energy conservation and can in some sense be called model-independent, as compared to cosmology and neutrino-less double beta decay. However, by model-independent we mean only within the minimal extension of the standard model. One should therefore also analyse the data for non-standard couplings to, e.g., right-handed or sterile neutrinos. As an alternative to the frequentist minimization methods used in the analysis of the earlier experiments in Mainz and Troitsk, we have been investigating Markov Chain Monte Carlo (MCMC) methods, which are very well suited for probing multi-parameter spaces. We found that implementing the KATRIN chi-squared function in the COSMOMC package (an MCMC code using Bayesian parameter inference) solved the ...

  14. Bayesian Inference of Kinematics and Memberships of Open Cluster

    Science.gov (United States)

    Shao, Z. Y.; Chen, L.; Zhong, J.; Hou, J. L.

    2014-07-01

    Based on the Bayesian inference (BI) method, the multiple-modelling approach is improved to combine coordinate positions, proper motions (PM) and radial velocities (RV), to separate the motion of the open cluster from that of field stars, and to describe the intrinsic kinematic status of the cluster.
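
    A toy version of the membership step in such an approach (ours, with invented numbers): modelling one kinematic coordinate as a two-component cluster-plus-field mixture, Bayes' rule converts the component densities into per-star membership probabilities.

```python
# Per-star cluster membership probabilities from a two-component mixture.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
stars = np.concatenate([rng.normal(5.0, 0.5, 30),    # cluster members
                        rng.normal(0.0, 4.0, 70)])   # field stars

w = 0.3                                              # mixture weight: cluster
dens_c = norm.pdf(stars, loc=5.0, scale=0.5)         # cluster model density
dens_f = norm.pdf(stars, loc=0.0, scale=4.0)         # field model density
p_member = w * dens_c / (w * dens_c + (1 - w) * dens_f)

print(p_member[:5].round(3))    # first few stars are true cluster members
```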

  15. Bayesian modeling using WinBUGS

    CERN Document Server

    Ntzoufras, Ioannis

    2009-01-01

    A hands-on introduction to the principles of Bayesian modeling using WinBUGS. Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author provides an accessible treatment of the topic, offering readers a smooth introduction to the principles of Bayesian modeling with detailed guidance on the practical implementation of key principles. The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including: Markov chain Monte Carlo algorithms in Bayesian inference; generalized linear models; Bayesian hierarchical models; predictive distribution and model checking; and Bayesian model and variable evaluation. Computational notes and screen captures illustrate the use of both WinBUGS and R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts and all ...

  16. Bayesian Inference and Application of Robust Growth Curve Models Using Student's "t" Distribution

    Science.gov (United States)

    Zhang, Zhiyong; Lai, Keke; Lu, Zhenqiu; Tong, Xin

    2013-01-01

    Despite the widespread popularity of growth curve analysis, few studies have investigated robust growth curve models. In this article, the "t" distribution is applied to model heavy-tailed data and contaminated normal data with outliers for growth curve analysis. The derived robust growth curve models are estimated through Bayesian…

  17. A sequential point process model and Bayesian inference for spatial point patterns with linear structures

    DEFF Research Database (Denmark)

    Møller, Jesper; Rasmussen, Jakob Gulddahl

    We introduce a flexible spatial point process model for spatial point patterns exhibiting linear structures, without incorporating a latent line process. The model is given by an underlying sequential point process model, i.e. each new point is generated given the previous points. Under this mode...

  18. Bayesian Inference in the Modern Design of Experiments

    Science.gov (United States)

    DeLoach, Richard

    2008-01-01

    This paper provides an elementary tutorial overview of Bayesian inference and its potential for application in aerospace experimentation in general and wind tunnel testing in particular. Bayes' Theorem is reviewed and examples are provided to illustrate how it can be applied to objectively revise prior knowledge by incorporating insights subsequently obtained from additional observations, resulting in new (posterior) knowledge that combines information from both sources. A logical merger of Bayesian methods and certain aspects of Response Surface Modeling is explored. Specific applications to wind tunnel testing, computational code validation, and instrumentation calibration are discussed.

  19. A sequential point process model and Bayesian inference for spatial point patterns with linear structures

    DEFF Research Database (Denmark)

    Møller, Jesper; Rasmussen, Jakob Gulddahl

    2012-01-01

    We introduce a flexible spatial point process model for spatial point patterns exhibiting linear structures, without incorporating a latent line process. The model is given by an underlying sequential point process model. Under this model, the points can be of one of three types: a ‘background point’, an ‘independent cluster point’ or a ‘dependent cluster point’. The background and independent cluster points are thought to exhibit ‘complete spatial randomness’, whereas the dependent cluster points are likely to occur close to previous cluster points. We demonstrate the flexibility of the model...

  20. Bayesian inference based modelling for gene transcriptional dynamics by integrating multiple source of knowledge

    Directory of Open Access Journals (Sweden)

    Wang Shu-Qiang

    2012-07-01

    Full Text Available Abstract Background A key challenge in the post-genome era is to identify genome-wide transcriptional regulatory networks, which specify the interactions between transcription factors and their target genes. Numerous methods have been developed for reconstructing gene regulatory networks from expression data. However, most of them are based on coarse-grained qualitative models, and cannot provide a quantitative view of regulatory systems. Results A binding affinity based regulatory model is proposed to quantify the transcriptional regulatory network. Multiple quantities, including binding affinity and the activity level of transcription factor (TF), are incorporated into a general learning model. The sequence features of the promoter and the possible occupancy of nucleosomes are exploited to estimate the binding probability of regulators. Compared with previous models that only employ microarray data, the proposed model can bridge the gap between the relative background frequency of the observed nucleotide and the gene's transcription rate. Conclusions We test the proposed approach on two real-world microarray datasets. Experimental results show that the proposed model can effectively identify the parameters and the activity level of TF. Moreover, the kinetic parameters introduced in the proposed model can reveal more biological meaning than previous models do.

  1. Bayesian Inference for Smoking Cessation with a Latent Cure State

    OpenAIRE

    Luo, Sheng; Crainiceanu, Ciprian M.; Louis, Thomas A.; Chatterjee, Nilanjan

    2009-01-01

    We present a Bayesian approach to modeling dynamic smoking addiction behavior processes when cure is not directly observed due to censoring. Subject-specific probabilities model the stochastic transitions among three behavioral states: smoking, transient quitting, and permanent quitting (absorbent state). A multivariate normal distribution for random effects is used to account for the potential correlation among the subject-specific transition probabilities. Inference is conducted using a Bay...

  2. An equilibrium validation technique based on Bayesian inference

    International Nuclear Information System (INIS)

    In recent years, Bayesian probability theory has been used in a number of experiments to fold uncertainties and interdependencies of the diagnostic data and forward models together with prior knowledge of the state of the plasma, thereby increasing the accuracy of inferred physics variables. Key developments include the application to current and flux surface tomography, effective charge, the electron energy distribution function, neutron spectrometry and density. Virtual observations have also been introduced to better constrain inferred quantities in current tomography. In this work we present Bayesian inference results of toroidal and poloidal current and flux surface tomography. Whilst the uncertainty in these profiles, as well as the uncertainty in inferred parameters such as the safety factor profile, is small (<5%), the inference can change substantially depending on the physics model used. We also present Bayesian inference results for Thomson scattering and charge-exchange recombination spectroscopy. In separate work we have computed radial force balance components on the midplane in the Mega Ampere Spherical Tokamak. Our aim is to establish a validation framework for different equilibrium physics models. We find that in the overlapping region of the core (normalized poloidal flux less than 0.4) and motional Stark effect (MSE) chords, the plasma is consistent with static Grad-Shafranov force balance to within two standard deviations. In the outboard edge region, where MSE data are also available, the pressure gradient exceeds the Lorentz force, most likely because the poloidal current is not constrained to zero at the plasma edge. To lowest order, the results suggest that computing components of force balance is useful for assessing data consistency, independent of any equilibrium solution. To first order, we have integrated the residual of the force balance to infer an energetic particle pressure.

  3. In-Home Activity Recognition: Bayesian Inference for Hidden Markov Models

    NARCIS (Netherlands)

    F. Javier Ordoñez; G. Englebienne; P. de Toledo; T. van Kasteren; A. Sanchez; B. Kröse

    2014-01-01

    Activity recognition in a home setting is being widely explored as a means to support elderly people living alone. Probabilistic models using classical, maximum-likelihood estimation methods are known to work well in this domain, but they are prone to overfitting and require labeled activity data for training…

  4. Bayesian Models of Brain and Behaviour

    OpenAIRE

    Penny, William

    2012-01-01

    This paper presents a review of Bayesian models of brain and behaviour. We first review the basic principles of Bayesian inference. This is followed by descriptions of sampling and variational methods for approximate inference, and forward and backward recursions in time for inference in dynamical models. The review of behavioural models covers work in visual processing, sensory integration, sensorimotor integration, and collective decision making. The review of brain models covers a range of...

  5. Revealing ecological networks using Bayesian network inference algorithms.

    Science.gov (United States)

    Milns, Isobel; Beale, Colin M; Smith, V Anne

    2010-07-01

    Understanding functional relationships within ecological networks can help reveal keys to ecosystem stability or fragility. Revealing these relationships is complicated by the difficulties of isolating variables or performing experimental manipulations within a natural ecosystem, and thus inferences are often made by matching models to observational data. Such models, however, require assumptions (or detailed measurements) of parameters such as birth and death rate, encounter frequency, territorial exclusion, and predation success. Here, we evaluate the use of a Bayesian network inference algorithm, which can reveal ecological networks based upon species and habitat abundance alone. We test the algorithm's performance and applicability on observational data of avian communities and habitat in the Peak District National Park, United Kingdom. The resulting networks correctly reveal known relationships among habitat types and known interspecific relationships. In addition, the networks produced novel insights into ecosystem structure and identified key species with high connectivity. Thus, Bayesian networks show potential for becoming a valuable tool in ecosystem analysis. PMID:20715607

  6. Thin film subsurface environments; Advanced X-ray spectroscopies and a novel Bayesian inference modeling algorithm

    Science.gov (United States)

    Church, Jonathan R.

    New condensed matter metrologies are being used to probe ever smaller length scales. In support of the diverse field of materials research, synchrotron-based spectroscopies provide sub-micron spatial resolution and a breadth of photon wavelengths for scientific studies. For electronic materials, the thinnest layers in a complementary metal-oxide-semiconductor (CMOS) device have been reduced to just a few nanometers. This raises concerns for layer uniformity, complete surface coverage, and interfacial quality. Deposition processes like chemical vapor deposition (CVD) and atomic layer deposition (ALD) have been shown to deposit the needed high-quality films at the requisite thicknesses. However, new materials beget new chemistries and, unfortunately, unwanted side-reactions and by-products. CVD/ALD tools and chemical precursors provided by our collaborators at Air Liquide utilized these new chemistries, and films were deposited for which novel spectroscopic characterization methods were used. The second portion of the thesis focuses on fading and decomposing paint pigments in iconic artworks, where efforts have been directed towards understanding the micro-environments causing degradation. Hard X-ray photoelectron spectroscopy (HAXPES) and variable kinetic energy X-ray photoelectron spectroscopy (VKE-XPS) are advanced XPS techniques capable of elucidating both chemical environments and electronic band structures in sub-surface regions of electronic materials. HAXPES has been used to study the electronic band structure in a typical CMOS structure; it will be shown that unexpected band alignments are associated with the presence of electronic charges near a buried interface. Additionally, a computational modeling algorithm, Bayes-Sim, was developed to reconstruct compositional depth profiles (CDP) using VKE-XPS data sets; a subset algorithm also reconstructs CDP from angle-resolved XPS data. Reconstructed CDP produced by Bayes-Sim were most strongly correlated to the real…

  7. Approximate Bayesian inference for complex ecosystems

    OpenAIRE

    Michael P H Stumpf

    2014-01-01

    Mathematical models have been central to ecology for nearly a century. Simple models of population dynamics have allowed us to understand fundamental aspects underlying the dynamics and stability of ecological systems. What has remained a challenge, however, is to meaningfully interpret experimental or observational data in light of mathematical models. Here, we review recent developments, notably in the growing field of approximate Bayesian computation (ABC), that allow us to calibrate mathematical models…

  8. Efficient fuzzy Bayesian inference algorithms for incorporating expert knowledge in parameter estimation

    Science.gov (United States)

    Rajabi, Mohammad Mahdi; Ataie-Ashtiani, Behzad

    2016-05-01

    Bayesian inference has traditionally been conceived as the proper framework for the formal incorporation of expert knowledge in parameter estimation of groundwater models. However, conventional Bayesian inference is incapable of taking into account the imprecision essentially embedded in expert-provided information. In order to solve this problem, a number of extensions to conventional Bayesian inference have been introduced in recent years. One of these extensions is 'fuzzy Bayesian inference', which is the result of integrating fuzzy techniques into Bayesian statistics. Fuzzy Bayesian inference has a number of desirable features which make it an attractive approach for incorporating expert knowledge in the parameter estimation process of groundwater models: (1) it is well adapted to the nature of expert-provided information, (2) it allows uncertainty and imprecision to be modelled distinctly, and (3) it presents a framework for fusing expert-provided information regarding the various inputs of the Bayesian inference algorithm. However, an important obstacle in employing fuzzy Bayesian inference in groundwater numerical modeling applications is the computational burden, as the required number of numerical model simulations often becomes extremely large and often computationally infeasible. In this paper, a novel approach to accelerating the fuzzy Bayesian inference algorithm is proposed, based on using approximate posterior distributions derived from surrogate modeling as a screening tool in the computations. The proposed approach is first applied to a synthetic test case of seawater intrusion (SWI) in a coastal aquifer. It is shown that for this synthetic test case, the proposed approach decreases the number of required numerical simulations by an order of magnitude. Then the proposed approach is applied to a real-world test case involving three-dimensional numerical modeling of SWI in Kish Island, located in the Persian Gulf. An expert…
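
    The screening idea can be sketched as follows, with a cheap stand-in function playing the role of the expensive groundwater simulator and a polynomial fit as the surrogate; the functions, values and cut-off are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np

    # Surrogate screening: pre-evaluate candidate parameters with a cheap
    # approximate posterior, run the expensive model only on the survivors.
    rng = np.random.default_rng(4)

    def expensive_model(theta):                 # stand-in for a numerical SWI model
        return np.sin(theta) + 0.1 * theta**2

    data, sigma = 1.2, 0.1                      # hypothetical observation and noise

    # 1. Build the surrogate from a small design of expensive runs.
    design = np.linspace(-3.0, 3.0, 12)
    coeffs = np.polyfit(design, [expensive_model(t) for t in design], deg=5)
    surrogate = np.poly1d(coeffs)

    # 2. Screen a large candidate set with the approximate (surrogate) posterior.
    candidates = rng.uniform(-3.0, 3.0, 10_000)
    approx_logpost = -0.5 * ((data - surrogate(candidates)) / sigma) ** 2
    keep = candidates[approx_logpost > approx_logpost.max() - 10.0]  # screening cut

    # 3. Spend full-model evaluations only where the surrogate says it matters.
    logpost = -0.5 * ((data - np.array([expensive_model(t) for t in keep])) / sigma) ** 2
    print(f"full-model runs: {keep.size} of {candidates.size} candidates")
    ```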

  9. Bayesian inference of the lung alveolar spatial model for the identification of alveolar mechanics associated with acute respiratory distress syndrome

    Science.gov (United States)

    Christley, Scott; Emr, Bryanna; Ghosh, Auyon; Satalin, Josh; Gatto, Louis; Vodovotz, Yoram; Nieman, Gary F.; An, Gary

    2013-06-01

    Acute respiratory distress syndrome (ARDS) is acute lung failure secondary to severe systemic inflammation, resulting in a derangement of alveolar mechanics (i.e. the dynamic change in alveolar size and shape during tidal ventilation), leading to alveolar instability that can cause further damage to the pulmonary parenchyma. Mechanical ventilation is a mainstay in the treatment of ARDS, but may induce mechano-physical stresses on unstable alveoli, which can paradoxically propagate the cellular and molecular processes exacerbating ARDS pathology. This phenomenon is called ventilator induced lung injury (VILI), and plays a significant role in morbidity and mortality associated with ARDS. In order to identify optimal ventilation strategies to limit VILI and treat ARDS, it is necessary to understand the complex interplay between biological and physical mechanisms of VILI, first at the alveolar level, and then in aggregate at the whole-lung level. Since there is no current consensus about the underlying dynamics of alveolar mechanics, as an initial step we investigate the ventilatory dynamics of an alveolar sac (AS) with the lung alveolar spatial model (LASM), a 3D spatial biomechanical representation of the AS and its interaction with airflow pressure and the surface tension effects of pulmonary surfactant. We use the LASM to identify the mechanical ramifications of alveolar dynamics associated with ARDS. Using graphical processing unit parallel algorithms, we perform Bayesian inference on the model parameters using experimental data from rat lung under control and Tween-induced ARDS conditions. Our results provide two plausible models that recapitulate two fundamental hypotheses about volume change at the alveolar level: (1) increase in alveolar size through isotropic volume change, or (2) minimal change in AS radius with primary expansion of the mouth of the AS, with the implication that the majority of change in lung volume during the respiratory cycle occurs in the

  10. Bayesian inference of the lung alveolar spatial model for the identification of alveolar mechanics associated with acute respiratory distress syndrome

    International Nuclear Information System (INIS)

    Acute respiratory distress syndrome (ARDS) is acute lung failure secondary to severe systemic inflammation, resulting in a derangement of alveolar mechanics (i.e. the dynamic change in alveolar size and shape during tidal ventilation), leading to alveolar instability that can cause further damage to the pulmonary parenchyma. Mechanical ventilation is a mainstay in the treatment of ARDS, but may induce mechano-physical stresses on unstable alveoli, which can paradoxically propagate the cellular and molecular processes exacerbating ARDS pathology. This phenomenon is called ventilator induced lung injury (VILI), and plays a significant role in morbidity and mortality associated with ARDS. In order to identify optimal ventilation strategies to limit VILI and treat ARDS, it is necessary to understand the complex interplay between biological and physical mechanisms of VILI, first at the alveolar level, and then in aggregate at the whole-lung level. Since there is no current consensus about the underlying dynamics of alveolar mechanics, as an initial step we investigate the ventilatory dynamics of an alveolar sac (AS) with the lung alveolar spatial model (LASM), a 3D spatial biomechanical representation of the AS and its interaction with airflow pressure and the surface tension effects of pulmonary surfactant. We use the LASM to identify the mechanical ramifications of alveolar dynamics associated with ARDS. Using graphical processing unit parallel algorithms, we perform Bayesian inference on the model parameters using experimental data from rat lung under control and Tween-induced ARDS conditions. Our results provide two plausible models that recapitulate two fundamental hypotheses about volume change at the alveolar level: (1) increase in alveolar size through isotropic volume change, or (2) minimal change in AS radius with primary expansion of the mouth of the AS, with the implication that the majority of change in lung volume during the respiratory cycle occurs in the

  11. Bayesian Inference for Functional Dynamics Exploring in fMRI Data

    Directory of Open Access Journals (Sweden)

    Xuan Guo

    2016-01-01

    Full Text Available This paper aims to review state-of-the-art Bayesian-inference-based methods applied to functional magnetic resonance imaging (fMRI) data. Particularly, we focus on one specific long-standing challenge in the computational modeling of fMRI datasets: how to effectively explore typical functional interactions from fMRI time series and the corresponding boundaries of temporal segments. Bayesian inference is a method of statistical inference which has been shown to be a powerful tool to encode dependence relationships among the variables with uncertainty. Here we provide an introduction to a group of Bayesian-inference-based methods for fMRI data analysis, which were designed to detect magnitude or functional connectivity change points and to infer their functional interaction patterns based on corresponding temporal boundaries. We also provide a comparison of three popular Bayesian models, that is, the Bayesian Magnitude Change Point Model (BMCPM), the Bayesian Connectivity Change Point Model (BCCPM), and the Dynamic Bayesian Variable Partition Model (DBVPM), and give a summary of their applications. We envision that more delicate Bayesian inference models will be emerging and play increasingly important roles in modeling brain functions in the years to come.

  12. Optical tweezers calibration with Bayesian inference

    Science.gov (United States)

    Türkcan, Silvan; Richly, Maximilian U.; Le Gall, Antoine; Fiszman, Nicolas; Masson, Jean-Baptiste; Westbrook, Nathalie; Perronet, Karen; Alexandrou, Antigoni

    2014-09-01

    We present a new method for calibrating an optical-tweezer setup that is based on Bayesian inference. This method employs an algorithm previously used to analyze the confined trajectories of receptors within lipid rafts. The main advantages of this method are that it does not require input parameters and is insensitive to systematic errors like the drift of the setup. Additionally, it exploits a much larger amount of the information stored in the recorded bead trajectory than standard calibration approaches. The additional information can be used to detect deviations from the perfect harmonic potential or detect environmental influences on the bead. The algorithm infers the diffusion coefficient and the potential felt by a trapped bead, and only requires the bead trajectory as input. We demonstrate that this method outperforms the equipartition method and the power-spectrum method in input information required (bead radius and trajectory length) and in output accuracy. Furthermore, by inferring a higher-order potential our method can reveal deviations from the assumed second-order potential. More generally, this method can also be used for magnetic-tweezer calibration.
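
    A minimal sketch of this style of trajectory-only inference, assuming the trapped bead follows an Ornstein-Uhlenbeck process and evaluating its exact transition likelihood on a parameter grid; the constants, grid ranges and simulated data are illustrative, not the authors' algorithm.

    ```python
    import numpy as np

    # Infer trap stiffness k and diffusion coefficient D from a bead trajectory
    # alone, via the exact Ornstein-Uhlenbeck transition density.
    rng = np.random.default_rng(0)
    kBT, dt = 4.11e-21, 1e-4                   # thermal energy [J], sample time [s]

    def ou_loglik(x, k, D):
        lam = k * D / kBT                      # relaxation rate k/gamma (D = kBT/gamma)
        mean = x[:-1] * np.exp(-lam * dt)
        var = (kBT / k) * (1.0 - np.exp(-2.0 * lam * dt))
        r = x[1:] - mean
        return -0.5 * np.sum(r**2 / var + np.log(2.0 * np.pi * var))

    # Simulated stand-in for a recorded trajectory (k = 2e-5 N/m, D = 4e-13 m^2/s).
    k0, D0 = 2e-5, 4e-13
    lam0 = k0 * D0 / kBT
    sd = np.sqrt((kBT / k0) * (1.0 - np.exp(-2.0 * lam0 * dt)))
    x = np.zeros(20_000)
    for i in range(1, x.size):
        x[i] = x[i - 1] * np.exp(-lam0 * dt) + rng.normal(0.0, sd)

    ks = np.linspace(5e-6, 5e-5, 40)           # candidate stiffnesses [N/m]
    Ds = np.linspace(1e-13, 1e-12, 40)         # candidate diffusion coefficients
    logpost = np.array([[ou_loglik(x, k, D) for D in Ds] for k in ks])  # flat prior
    i, j = np.unravel_index(np.argmax(logpost), logpost.shape)
    print(f"MAP estimate: k = {ks[i]:.2e} N/m, D = {Ds[j]:.2e} m^2/s")
    ```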

  13. Parsing optical scanned 3D data by Bayesian inference

    Science.gov (United States)

    Xiong, Hanwei; Xu, Jun; Xu, Chenxi; Pan, Ming

    2015-10-01

    Optical devices are widely used to digitize complex objects, capturing their shapes as point clouds. The resulting data carry no semantic information about the objects, and a tedious manual process is normally required to segment the scanned data into meaningful parts. A person perceives an object correctly by drawing on prior knowledge, so Bayesian inference is applied to the same end. A probabilistic And-Or-Graph is used as a unified framework of representation, learning, and recognition for a large number of object categories, and a probabilistic model defined on this And-Or-Graph is learned from a relatively small training set per category. Given a set of 3D scanned data, Bayesian inference constructs the most probable interpretation of the object, and a semantic segmentation is obtained from the part decomposition. Examples are given to illustrate the method.

  14. Fast Bayesian inference for slow-roll inflation

    CERN Document Server

    Ringeval, Christophe

    2013-01-01

    We present and discuss a new approach increasing by orders of magnitude the speed of performing Bayesian inference and parameter estimation within the framework of slow-roll inflation. The method relies on the determination of an effective likelihood for inflation which is a function of the primordial amplitude of the scalar perturbations complemented with the necessary number of the so-called Hubble flow functions to reach the desired accuracy. Starting from any cosmological data set, the effective likelihood is obtained by marginalisation over the standard cosmological parameters, here viewed as "nuisance" from the early Universe point of view. Because the effective likelihood is low-dimensional, basic machine-learning algorithms can be trained to accurately reproduce its multidimensional shape and then be used as a proxy to perform fast Bayesian inference on the inflationary models. The robustness and accuracy of the method are illustrated using the Planck Cosmic Microwave Background (CMB) data to perform primordial parameter estima...

  15. Approximate bayesian parameter inference for dynamical systems in systems biology

    International Nuclear Information System (INIS)

    This paper proposes to use approximate instead of exact stochastic simulation algorithms for approximate Bayesian parameter inference of dynamical systems in systems biology. It first presents the mathematical framework for the description of systems biology models, especially from the aspect of a stochastic formulation as opposed to deterministic model formulations based on the law of mass action. In contrast to maximum likelihood methods for parameter inference, approximate inference methods are presented which are based on sampling parameters from a known prior probability distribution, which gradually evolves toward a posterior distribution, through the comparison of simulated data from the model to a given data set of measurements. The paper then discusses the simulation process, where an overview is given of the different exact and approximate methods for stochastic simulation and their improvements that we propose. The exact and approximate simulators are implemented and used within approximate Bayesian parameter inference methods. Our evaluation of these methods on two tasks of parameter estimation in two different models shows that equally good results are obtained much faster when using approximate simulation as compared to using exact simulation. (Author)
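
    The sampling-and-comparison loop described here is essentially approximate Bayesian computation; a minimal rejection-style sketch, with a toy stochastic model standing in for a systems biology simulator, might look like this (all values are illustrative assumptions).

    ```python
    import numpy as np

    # ABC rejection: draw from the prior, simulate, keep draws whose simulated
    # data lie within a tolerance of the measurements.
    rng = np.random.default_rng(1)

    def simulate(theta, n=20):
        # stand-in stochastic model: noisy exponential decay with rate theta
        t = np.linspace(0.0, 5.0, n)
        return np.exp(-theta * t) + rng.normal(0.0, 0.05, n)

    data = simulate(0.8)                        # pretend these are measurements

    accepted = []
    while len(accepted) < 200:
        theta = rng.uniform(0.0, 3.0)           # draw from the prior
        sim = simulate(theta)
        if np.sqrt(np.mean((sim - data) ** 2)) < 0.1:   # tolerance on distance
            accepted.append(theta)

    print(f"approximate posterior mean ~ {np.mean(accepted):.3f}")
    ```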

  16. Nonparametric Bayesian Modeling of Complex Networks

    DEFF Research Database (Denmark)

    Schmidt, Mikkel Nørgaard; Mørup, Morten

    2013-01-01

    Modeling structure in complex networks using Bayesian nonparametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data. This article provides a gentle introduction to nonparametric Bayesian modeling of complex networks: Using...... for complex networks can be derived and point out relevant literature....

  17. Bayesian inference on dynamic linear models of day-to-day origin-destination flows in transportation networks

    OpenAIRE

    Pitombeira-Neto, Anselmo Ramalho; Loureiro, Carlos Felipe Grangeiro; Carvalho, Luis Eduardo

    2016-01-01

    Estimation of origin-destination (OD) demand plays a key role in successful transportation studies. In this paper, we consider the estimation of time-varying day-to-day OD flows given data on traffic volumes in a transportation network for a sequence of days. We propose a dynamic linear model (DLM) in order to represent the stochastic evolution of OD flows over time. DLMs are Bayesian state-space models which can capture non-stationarity. We take into account the hierarchical relationships b...

  18. Bayesian Inference of Reticulate Phylogenies under the Multispecies Network Coalescent.

    Science.gov (United States)

    Wen, Dingqiao; Yu, Yun; Nakhleh, Luay

    2016-05-01

    The multispecies coalescent (MSC) is a statistical framework that models how gene genealogies grow within the branches of a species tree. The field of computational phylogenetics has witnessed an explosion in the development of methods for species tree inference under MSC, owing mainly to the accumulating evidence of incomplete lineage sorting in phylogenomic analyses. However, the evolutionary history of a set of genomes, or species, could be reticulate due to the occurrence of evolutionary processes such as hybridization or horizontal gene transfer. We report on a novel method for Bayesian inference of genome and species phylogenies under the multispecies network coalescent (MSNC). This framework models gene evolution within the branches of a phylogenetic network, thus incorporating reticulate evolutionary processes, such as hybridization, in addition to incomplete lineage sorting. As phylogenetic networks with different numbers of reticulation events correspond to points of different dimensions in the space of models, we devise a reversible-jump Markov chain Monte Carlo (RJMCMC) technique for sampling the posterior distribution of phylogenetic networks under MSNC. We implemented the methods in the publicly available, open-source software package PhyloNet and studied their performance on simulated and biological data. The work extends the reach of Bayesian inference to phylogenetic networks and enables new evolutionary analyses that account for reticulation. PMID:27144273

  19. Bayesian inference for a nonlinear mixed-effects Tobit model with multivariate skew-t distributions: application to AIDS studies.

    Science.gov (United States)

    Dagne, Getachew; Huang, Yangxin

    2012-01-01

    Censored data are characteristic of many bioassays in HIV/AIDS studies, where assays may not be sensitive enough to determine gradations in viral load among those below a detectable threshold. Not accounting for such left-censoring appropriately can lead to biased parameter estimates in most data analyses. To properly adjust for left-censoring, this paper presents an extension of the Tobit model for fitting nonlinear dynamic mixed-effects models with skew distributions. Such extensions allow one to specify the conditional distributions for the viral load response to account for left-censoring, skewness and heaviness in the tails of the distributions of the response variable. A Bayesian modeling approach via a Markov chain Monte Carlo (MCMC) algorithm is used to estimate model parameters. The proposed methods are illustrated using real data from an HIV/AIDS study. PMID:22992288

  20. Bayesian inference for a nonlinear mixed-effects Tobit model with multivariate skew-t distributions: application to AIDS studies

    Science.gov (United States)

    Dagne, Getachew; Huang, Yangxin

    2016-01-01

    Censored data are characteristic of many bioassays in HIV/AIDS studies, where assays may not be sensitive enough to determine gradations in viral load among those below a detectable threshold. Not accounting for such left-censoring appropriately can lead to biased parameter estimates in most data analyses. To properly adjust for left-censoring, this paper presents an extension of the Tobit model for fitting nonlinear dynamic mixed-effects models with skew distributions. Such extensions allow one to specify the conditional distributions for the viral load response to account for left-censoring, skewness and heaviness in the tails of the distributions of the response variable. A Bayesian modeling approach via a Markov chain Monte Carlo (MCMC) algorithm is used to estimate model parameters. The proposed methods are illustrated using real data from an HIV/AIDS study. PMID:22992288

  1. Progress on Bayesian Inference of the Fast Ion Distribution Function

    DEFF Research Database (Denmark)

    Stagner, L.; Heidbrink, W.W.; Chen, X.

    2013-01-01

    The fast-ion distribution function (DF) has a complicated dependence on several phase-space variables. The standard analysis procedure in energetic particle research is to compute the DF theoretically, use that DF in forward modeling to predict diagnostic signals, then compare with measured data. However, when theory and experiment disagree (for one or more diagnostics), it is unclear how to proceed. Bayesian statistics provides a framework to infer the DF, quantify errors, and reconcile discrepant diagnostic measurements. Diagnostic errors and weight functions that describe the phase space...

  2. Towards Bayesian Inference of the Fast-Ion Distribution Function

    DEFF Research Database (Denmark)

    Stagner, L.; Heidbrink, W.W.; Salewski, Mirko

    2012-01-01

    The fast-ion distribution function (DF) has a complicated dependence on several phase-space variables. The standard analysis procedure in energetic particle research is to compute the DF theoretically, use that DF in forward modeling to predict diagnostic signals, then compare with measured data. However, when theory and experiment disagree (for one or more diagnostics), it is unclear how to proceed. Bayesian statistics provides a framework to infer the DF, quantify errors, and reconcile discrepant diagnostic measurements. Diagnostic errors and "weight functions" that describe the phase space...

  3. Bayesian Inference for Signal-Based Seismic Monitoring

    Science.gov (United States)

    Moore, D.

    2015-12-01

    Traditional seismic monitoring systems rely on discrete detections produced by station processing software, discarding significant information present in the original recorded signal. SIG-VISA (Signal-based Vertically Integrated Seismic Analysis) is a system for global seismic monitoring through Bayesian inference on seismic signals. By modeling signals directly, our forward model is able to incorporate a rich representation of the physics underlying the signal generation process, including source mechanisms, wave propagation, and station response. This allows inference in the model to recover the qualitative behavior of recent geophysical methods including waveform matching and double-differencing, all as part of a unified Bayesian monitoring system that simultaneously detects and locates events from a global network of stations. We demonstrate recent progress in scaling up SIG-VISA to efficiently process the data stream of global signals recorded by the International Monitoring System (IMS), including comparisons against existing processing methods that show increased sensitivity from our signal-based model and in particular the ability to locate events (including aftershock sequences that can tax analyst processing) precisely from waveform correlation effects. We also provide a Bayesian analysis of an alleged low-magnitude event near the DPRK test site in May 2010 [1] [2], investigating whether such an event could plausibly be detected through automated processing in a signal-based monitoring system. [1] Zhang, Miao and Wen, Lianxing. "Seismological Evidence for a Low-Yield Nuclear Test on 12 May 2010 in North Korea". Seismological Research Letters, January/February 2015. [2] Richards, Paul. "A Seismic Event in North Korea on 12 May 2010". CTBTO SnT 2015 oral presentation, video at https://video-archive.ctbto.org/index.php/kmc/preview/partner_id/103/uiconf_id/4421629/entry_id/0_ymmtpps0/delivery/http

  4. Bayesian inference on genetic merit under uncertain paternity

    Directory of Open Access Journals (Sweden)

    Tempelman Robert J

    2003-09-01

    Full Text Available Abstract A hierarchical animal model was developed for inference on genetic merit of livestock with uncertain paternity. Fully conditional posterior distributions for fixed and genetic effects, variance components, sire assignments and their probabilities are derived to facilitate a Bayesian inference strategy using MCMC methods. We compared this model to a model based on the Henderson average numerator relationship (ANRM) in a simulation study with 10 replicated datasets generated for each of two traits. Trait 1 had a medium heritability (h2) for each of the direct and maternal genetic effects, whereas Trait 2 had a high h2 attributable only to direct effects. The average posterior probabilities inferred on the true sire were between 1 and 10% larger than the corresponding priors (the inverse of the number of candidate sires in a mating pasture) for Trait 1, and between 4 and 13% larger than the corresponding priors for Trait 2. The predicted additive and maternal genetic effects were very similar using both models; however, model choice criteria (Pseudo Bayes Factor and Deviance Information Criterion) decisively favored the proposed hierarchical model over the ANRM model.

  5. Reconstruction of Exposure to m-Xylene from Human Biomonitoring Data Using PBPK Modelling, Bayesian Inference, and Markov Chain Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    Kevin McNally

    2012-01-01

    Full Text Available There are numerous biomonitoring programs, both recent and ongoing, to evaluate environmental exposure of humans to chemicals. Due to the lack of exposure and kinetic data, the correlation of biomarker levels with exposure concentrations leads to difficulty in utilizing biomonitoring data for biological guidance values. Exposure reconstruction or reverse dosimetry is the retrospective interpretation of external exposure consistent with biomonitoring data. We investigated the integration of physiologically based pharmacokinetic modelling, global sensitivity analysis, Bayesian inference, and Markov chain Monte Carlo simulation to obtain a population estimate of inhalation exposure to m-xylene. We used exhaled breath and venous blood m-xylene and urinary 3-methylhippuric acid measurements from a controlled human volunteer study in order to evaluate the ability of our computational framework to predict known inhalation exposures. We also investigated the importance of model structure and dimensionality with respect to its ability to reconstruct exposure.
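
    A minimal sketch of the reverse-dosimetry idea, with a toy linear forward model standing in for the full PBPK model and a random-walk Metropolis-Hastings sampler over the unknown exposure concentration; all data, priors and constants below are invented for illustration.

    ```python
    import numpy as np

    # Random-walk Metropolis-Hastings: reconstruct an inhaled concentration
    # from biomarker measurements through a (toy) forward model.
    rng = np.random.default_rng(2)

    def forward(conc):
        return 0.04 * conc                      # toy PBPK surrogate: blood level vs ppm

    observed = np.array([1.9, 2.2, 2.0])        # hypothetical blood m-xylene levels
    sigma = 0.2                                 # assumed measurement noise

    def log_post(conc):
        if conc <= 0.0:
            return -np.inf                      # exposure must be positive
        loglik = -0.5 * np.sum((observed - forward(conc)) ** 2) / sigma**2
        logprior = -0.5 * ((conc - 40.0) / 30.0) ** 2   # weak prior around 40 ppm
        return loglik + logprior

    chain, conc = [], 40.0
    lp = log_post(conc)
    for _ in range(20_000):
        prop = conc + rng.normal(0.0, 2.0)      # random-walk proposal
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp: # Metropolis acceptance rule
            conc, lp = prop, lp_prop
        chain.append(conc)

    print(f"reconstructed exposure ~ {np.mean(chain[5000:]):.1f} ppm")
    ```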

  6. Bayesian inference for Markov jump processes with informative observations.

    Science.gov (United States)

    Golightly, Andrew; Wilkinson, Darren J

    2015-04-01

    In this paper we consider the problem of parameter inference for Markov jump process (MJP) representations of stochastic kinetic models. Since transition probabilities are intractable for most processes of interest yet forward simulation is straightforward, Bayesian inference typically proceeds through computationally intensive methods such as (particle) MCMC. Such methods ostensibly require the ability to simulate trajectories from the conditioned jump process. When observations are highly informative, use of the forward simulator is likely to be inefficient and may even preclude an exact (simulation based) analysis. We therefore propose three methods for improving the efficiency of simulating conditioned jump processes. A conditioned hazard is derived based on an approximation to the jump process, and used to generate end-point conditioned trajectories for use inside an importance sampling algorithm. We also adapt a recently proposed sequential Monte Carlo scheme to our problem. Essentially, trajectories are reweighted at a set of intermediate time points, with more weight assigned to trajectories that are consistent with the next observation. We consider two implementations of this approach, based on two continuous approximations of the MJP. We compare these constructs for a simple tractable jump process before using them to perform inference for a Lotka-Volterra system. The best performing construct is used to infer the parameters governing a simple model of motility regulation in Bacillus subtilis. PMID:25720091
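
    The straightforward forward simulation the abstract relies on is typically the Gillespie algorithm; a minimal sketch for a Lotka-Volterra MJP follows, with rate constants and initial state chosen purely for illustration.

    ```python
    import numpy as np

    # Gillespie stochastic simulation of a Lotka-Volterra predator-prey MJP
    # with rates c1 (prey birth), c2 (predation), c3 (predator death).
    rng = np.random.default_rng(3)

    def gillespie_lv(x0, y0, c=(0.5, 0.0025, 0.3), t_end=30.0):
        x, y, t = x0, y0, 0.0
        path = [(t, x, y)]
        while t < t_end:
            h = np.array([c[0] * x, c[1] * x * y, c[2] * y])  # reaction hazards
            h0 = h.sum()
            if h0 == 0.0:                      # extinction: nothing can happen
                break
            t += rng.exponential(1.0 / h0)     # time to next reaction
            r = rng.choice(3, p=h / h0)        # which reaction fires
            if r == 0:   x += 1                # prey birth
            elif r == 1: x -= 1; y += 1        # predation
            else:        y -= 1                # predator death
            path.append((t, x, y))
        return np.array(path)

    path = gillespie_lv(100, 100)
    print(path[-1])                            # final (time, prey, predators)
    ```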

  7. Ajuste de modelos autorregressivos, na forma de modelos lineares dinâmicos, via inferência Bayesiana Autorregresive models fitting with a dynamic linear models approach via Bayesian inference

    Directory of Open Access Journals (Sweden)

    Marcelo Costa Souza

    2004-10-01

    … in which the parameters are regarded as fixed quantities, assumed not to change in time. This work aimed at fitting autoregressive models of order 2, AR(2), specified in the form of dynamic linear models, using Bayesian inference. Markov chain Monte Carlo (MCMC) methods were used to obtain the estimates, via the Gibbs sampler and Forward Filtering Backward Sampling (FFBS). To evaluate the fit, two chains with 8000 iterations each and three different series sizes, with 200, 500 and 800 observations, were sampled. The Canadian lynx series (NICHOLLS and QUIN, 1982) was fitted with different discount factors (0.90, 0.95 and 0.99), and the resulting mean square error was used for comparison with the fit obtained using classical inference. A better fit was observed for the model with discount factor equal to 0.99. One-step-ahead forecasts were made to check the estimates obtained for the updated and the backward-sampled series. For the latter, the fit was better and the mean square error lower. In general, a good fit of the AR(2) dynamic models via Bayesian inference was observed, which gives a better understanding of the fitting in different situations, both simulated and real.
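
    A sketch of the state-space (DLM) form of an AR(2) model, with a basic Kalman filter giving the forward-filtering pass on which FFBS builds; the coefficients and variances are illustrative assumptions, not estimates from the paper.

    ```python
    import numpy as np

    # AR(2) in DLM form: y_t = F theta_t + v_t, theta_t = G theta_{t-1} + w_t.
    phi1, phi2 = 1.2, -0.5              # illustrative (stationary) AR(2) coefficients
    F = np.array([[1.0, 0.0]])          # observe the first state component
    G = np.array([[phi1, phi2],
                  [1.0,  0.0]])         # companion form of the AR(2) recursion
    V = 0.01                            # small observation variance (AR(2) observed directly)
    W = np.diag([1.0, 0.0])             # innovation enters only the first component

    def kalman_filter(y, m0=np.zeros(2), C0=10.0 * np.eye(2)):
        m, C, forecasts = m0, C0, []
        for yt in y:
            a = G @ m                            # predicted state mean
            R = G @ C @ G.T + W                  # predicted state covariance
            f = (F @ a).item()                   # one-step-ahead forecast
            Q = (F @ R @ F.T).item() + V         # forecast variance
            A = (R @ F.T) / Q                    # Kalman gain
            m = a + (A * (yt - f)).ravel()       # posterior state mean
            C = R - A @ F @ R                    # posterior state covariance
            forecasts.append(f)
        return np.array(forecasts)

    # Simulate an AR(2) series and check the one-step-ahead forecasts.
    rng = np.random.default_rng(5)
    y = np.zeros(300)
    for t in range(2, y.size):
        y[t] = phi1 * y[t - 1] + phi2 * y[t - 2] + rng.normal()
    print(f"forecast RMSE: {np.sqrt(np.mean((y - kalman_filter(y)) ** 2)):.3f}")
    ```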

  8. Bayesian inference for identifying interaction rules in moving animal groups.

    Science.gov (United States)

    Mann, Richard P

    2011-01-01

    The emergence of similar collective patterns from different self-propelled particle models of animal groups points to a restricted set of "universal" classes for these patterns. While universality is interesting, it is often the fine details of animal interactions that are of biological importance. Universality thus presents a challenge to inferring such interactions from macroscopic group dynamics since these can be consistent with many underlying interaction models. We present a Bayesian framework for learning animal interaction rules from fine scale recordings of animal movements in swarms. We apply these techniques to the inverse problem of inferring interaction rules from simulation models, showing that parameters can often be inferred from a small number of observations. Our methodology allows us to quantify our confidence in parameter fitting. For example, we show that attraction and alignment terms can be reliably estimated when animals are milling in a torus shape, while interaction radius cannot be reliably measured in such a situation. We assess the importance of rate of data collection and show how to test different models, such as topological and metric neighbourhood models. Taken together our results both inform the design of experiments on animal interactions and suggest how these data should be best analysed. PMID:21829657

  9. Bayesian inference for identifying interaction rules in moving animal groups.

    Directory of Open Access Journals (Sweden)

    Richard P Mann

    Full Text Available The emergence of similar collective patterns from different self-propelled particle models of animal groups points to a restricted set of "universal" classes for these patterns. While universality is interesting, it is often the fine details of animal interactions that are of biological importance. Universality thus presents a challenge to inferring such interactions from macroscopic group dynamics since these can be consistent with many underlying interaction models. We present a Bayesian framework for learning animal interaction rules from fine scale recordings of animal movements in swarms. We apply these techniques to the inverse problem of inferring interaction rules from simulation models, showing that parameters can often be inferred from a small number of observations. Our methodology allows us to quantify our confidence in parameter fitting. For example, we show that attraction and alignment terms can be reliably estimated when animals are milling in a torus shape, while interaction radius cannot be reliably measured in such a situation. We assess the importance of rate of data collection and show how to test different models, such as topological and metric neighbourhood models. Taken together our results both inform the design of experiments on animal interactions and suggest how these data should be best analysed.

  10. Bayesian inference of population size history from multiple loci

    Directory of Open Access Journals (Sweden)

    Drummond Alexei J

    2008-10-01

    Full Text Available Abstract Background Effective population size (Ne) is related to genetic variability and is a basic parameter in many models of population genetics. A number of methods for inferring current and past population sizes from genetic data have been developed since JFC Kingman introduced the n-coalescent in 1982. Here we present the Extended Bayesian Skyline Plot, a non-parametric Bayesian Markov chain Monte Carlo algorithm that extends a previous coalescent-based method in several ways, including the ability to analyze multiple loci. Results Through extensive simulations we show the accuracy and limitations of inferring population size as a function of the amount of data, including recovering information about evolutionary bottlenecks. We also analyzed two real data sets to demonstrate the behavior of the new method; a single-gene Hepatitis C virus data set sampled from Egypt and a 10-locus Drosophila ananassae data set representing 16 different populations. Conclusion The results demonstrate the essential role of multiple loci in recovering population size dynamics. Multi-locus data from a small number of individuals can precisely recover past bottlenecks in population size which cannot be characterized by analysis of a single locus. We also demonstrate that sequence data quality is important because even moderate levels of sequencing errors result in a considerable decrease in estimation accuracy for realistic levels of population genetic variability.

  11. Using a Simple Binomial Model to Assess Improvement in Predictive Capability: Sequential Bayesian Inference, Hypothesis Testing, and Power Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sigeti, David E. [Los Alamos National Laboratory; Pelak, Robert A. [Los Alamos National Laboratory

    2012-09-11

    We present a Bayesian statistical methodology for identifying improvement in predictive simulations, including an analysis of the number of (presumably expensive) simulations that will need to be made in order to establish with a given level of confidence that an improvement has been observed. Our analysis assumes the ability to predict (or postdict) the same experiments with legacy and new simulation codes and uses a simple binomial model for the probability, θ, that, in an experiment chosen at random, the new code will provide a better prediction than the old. This model makes it possible to do statistical analysis with an absolute minimum of assumptions about the statistics of the quantities involved, at the price of discarding some potentially important information in the data. In particular, the analysis depends only on whether or not the new code predicts better than the old in any given experiment, and not on the magnitude of the improvement. We show how the posterior distribution for θ may be used, in a kind of Bayesian hypothesis testing, both to decide if an improvement has been observed and to quantify our confidence in that decision. We quantify the predictive probability that should be assigned, prior to taking any data, to the possibility of achieving a given level of confidence, as a function of sample size. We show how this predictive probability depends on the true value of θ and, in particular, how there will always be a region around θ = 1/2 where it is highly improbable that we will be able to identify an improvement in predictive capability, although the width of this region will shrink to zero as the sample size goes to infinity. We show how the posterior standard deviation may be used, as a kind of 'plan B metric' in the case that the analysis shows that θ is close to 1/2, and argue that such a plan B should generally be part of hypothesis testing. All the analysis presented in the paper is done with a
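
    With a uniform Beta(1,1) prior (one simple choice), the posterior for θ after observing s successes in n trials is Beta(1+s, 1+n-s), so the confidence computations described above reduce to a few lines; the counts below are hypothetical.

    ```python
    from scipy.stats import beta

    # Beta-Binomial posterior for theta = P(new code beats old in a random experiment).
    n, s = 20, 15                              # 15 of 20 experiments favoured the new code
    posterior = beta(1 + s, 1 + n - s)         # Beta(1,1) prior + binomial likelihood

    conf = 1.0 - posterior.cdf(0.5)            # posterior probability that theta > 1/2
    print(f"P(theta > 1/2 | data) = {conf:.3f}")
    print(f"posterior sd = {posterior.std():.3f}")   # the 'plan B metric'
    ```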

  12. Unsupervised Transient Light Curve Analysis Via Hierarchical Bayesian Inference

    CERN Document Server

    Sanders, Nathan; Soderberg, Alicia

    2014-01-01

    Historically, light curve studies of supernovae (SNe) and other transient classes have focused on individual objects with copious and high signal-to-noise observations. In the nascent era of wide field transient searches, objects with detailed observations are decreasing as a fraction of the overall known SN population, and this strategy sacrifices the majority of the information contained in the data about the underlying population of transients. A population level modeling approach, simultaneously fitting all available observations of objects in a transient sub-class of interest, fully mines the data to infer the properties of the population and avoids certain systematic biases. We present a novel hierarchical Bayesian statistical model for population level modeling of transient light curves, and discuss its implementation using an efficient Hamiltonian Monte Carlo technique. As a test case, we apply this model to the Type IIP SN sample from the Pan-STARRS1 Medium Deep Survey, consisting of 18,837 photometr...

  13. Modeling photosynthesis of Spartina alterniflora (smooth cordgrass) impacted by the Deepwater Horizon oil spill using Bayesian inference

    International Nuclear Information System (INIS)

    To study the impact of the Deepwater Horizon oil spill on photosynthesis of coastal salt marsh plants in Mississippi, we developed a hierarchical Bayesian (HB) model based on field measurements collected from July 2010 to November 2011. We sampled three locations in Davis Bayou, Mississippi (30.375°N, 88.790°W) representative of a range of oil spill impacts. Measured photosynthesis was negative (respiration only) at the heavily oiled location in July 2010 only, and rates started to increase by August 2010. Photosynthesis at the medium oiling location was lower than at the control location in July 2010 and it continued to decrease in September 2010. During winter 2010–2011, the contrast between the control and the two impacted locations was not as obvious as in the growing season of 2010. Photosynthesis increased through spring 2011 at the three locations and decreased starting with October at the control location and a month earlier (September) at the impacted locations. Using the field data, we developed an HB model. The model simulations agreed well with the measured photosynthesis, capturing most of the variability of the measured data. On the basis of the posteriors of the parameters, we found that air temperature and photosynthetic active radiation positively influenced photosynthesis whereas the leaf stress level negatively affected photosynthesis. The photosynthesis rates at the heavily impacted location had recovered to the status of the control location about 140 days after the initial impact, while the impact at the medium impact location was never severe enough to make photosynthesis significantly lower than that at the control location over the study period. The uncertainty in modeling photosynthesis rates mainly came from the individual and micro-site scales, and to a lesser extent from the leaf scale. (letter)

  14. Bayesian networks inference algorithm to implement Dempster Shafer theory in reliability analysis

    International Nuclear Information System (INIS)

    This paper deals with the use of Bayesian networks to compute system reliability. The reliability analysis problem is described and the usual methods for quantitative reliability analysis are presented within a case study. Some drawbacks that justify the use of Bayesian networks are identified. The basic concepts of the Bayesian networks application to reliability analysis are introduced and a model to compute the reliability for the case study is presented. Dempster Shafer theory to treat epistemic uncertainty in reliability analysis is then discussed and its basic concepts that can be applied thanks to the Bayesian network inference algorithm are introduced. Finally, it is shown, with a numerical example, how Bayesian networks' inference algorithms compute complex system reliability and what the Dempster Shafer theory can provide to reliability analysis

  15. Metainference: A Bayesian Inference Method for Heterogeneous Systems

    CERN Document Server

    Bonomi, Massimiliano; Cavalli, Andrea; Vendruscolo, Michele

    2015-01-01

    Modelling a complex system is almost invariably a challenging task. The incorporation of experimental observations can be used to improve the quality of a model, and thus to obtain better predictions about the behavior of the corresponding system. This approach, however, is affected by a variety of different errors, especially when a system simultaneously populates an ensemble of different states and experimental data are measured as averages over such states. To address this problem we present a method, called metainference, that combines Bayesian inference, which is a powerful strategy to deal with errors in experimental measurements, with the maximum entropy principle, which represents a rigorous approach to deal with experimental measurements averaged over multiple states. To illustrate the method we present its application to the determination of an ensemble of structures corresponding to the thermal fluctuations of a protein molecule. Metainference thus provides an approach to model complex systems with...

  16. Genetic evidence for long-term population decline in a savannah-dwelling primate: inferences from a hierarchical bayesian model.

    Science.gov (United States)

    Storz, Jay F; Beaumont, Mark A; Alberts, Susan C

    2002-11-01

    The purpose of this study was to test for evidence that savannah baboons (Papio cynocephalus) underwent a population expansion in concert with a hypothesized expansion of African human and chimpanzee populations during the late Pleistocene. The rationale is that any type of environmental event sufficient to cause simultaneous population expansions in African humans and chimpanzees would also be expected to affect other codistributed mammals. To test for genetic evidence of population expansion or contraction, we performed a coalescent analysis of multilocus microsatellite data using a hierarchical Bayesian model. Markov chain Monte Carlo (MCMC) simulations were used to estimate the posterior probability density of demographic and genealogical parameters. The model was designed to allow interlocus variation in mutational and demographic parameters, which made it possible to detect aberrant patterns of variation at individual loci that could result from heterogeneity in mutational dynamics or from the effects of selection at linked sites. Results of the MCMC simulations were consistent with zero variance in demographic parameters among loci, but there was evidence for a 10- to 20-fold difference in mutation rate between the most slowly and most rapidly evolving loci. Results of the model provided strong evidence that savannah baboons have undergone a long-term historical decline in population size. The mode of the highest posterior density for the joint distribution of current and ancestral population size indicated a roughly eightfold contraction over the past 1,000 to 250,000 years. These results indicate that savannah baboons apparently did not share a common demographic history with other codistributed primate species. PMID:12411607

  17. Kernel Approximate Bayesian Computation for Population Genetic Inferences

    OpenAIRE

    Nakagome, Shigeki; Fukumizu, Kenji; Mano, Shuhei

    2012-01-01

    Approximate Bayesian computation (ABC) is a likelihood-free approach for Bayesian inferences based on a rejection algorithm method that applies a tolerance of dissimilarity between summary statistics from observed and simulated data. Although several improvements to the algorithm have been proposed, none of these improvements avoid the following two sources of approximation: 1) lack of sufficient statistics: sampling is not from the true posterior density given data but from an approximate po...

  18. Bayesian large-scale structure inference and cosmic web analysis

    CERN Document Server

    Leclercq, Florent

    2015-01-01

    Surveys of the cosmic large-scale structure carry opportunities for building and testing cosmological theories about the origin and evolution of the Universe. This endeavor requires appropriate data assimilation tools, for establishing the contact between survey catalogs and models of structure formation. In this thesis, we present an innovative statistical approach for the ab initio simultaneous analysis of the formation history and morphology of the cosmic web: the BORG algorithm infers the primordial density fluctuations and produces physical reconstructions of the dark matter distribution that underlies observed galaxies, by assimilating the survey data into a cosmological structure formation model. The method, based on Bayesian probability theory, provides accurate means of uncertainty quantification. We demonstrate the application of BORG to the Sloan Digital Sky Survey data and describe the primordial and late-time large-scale structure in the observed volume. We show how the approach has led to the fi...

  19. Bayesian inference of mass segregation of open clusters

    Science.gov (United States)

    Shao, Zhengyi; Chen, Li; Lin, Chien-Cheng; Zhong, Jing; Hou, Jinliang

    2015-08-01

    Based on the Bayesian inference (BI) method, the mixture-modeling approach is improved to combine all kinematic data, including position, proper motion (PM) and radial velocity (RV), to separate the motion of the cluster from field stars in its area, as well as to describe the intrinsic kinematic status. Meanwhile, the membership probabilities of individual stars are determined as by-products. The method has been tested on simulated toy models, where it was found that the joint use of multiple kinematic data types can significantly reduce the missing rate of membership determination, from ~15% for a single data type to 1% when position, proper motion and radial velocity data are all used. By combining kinematic data from multiple photometric and redshift surveys, such as WIYN and APOGEE, M67 and NGC188 are revisited. Mass segregation is identified clearly for both of these old open clusters, in position as well as in PM space, since the Bayesian evidence (BE) of the model that includes the segregation parameters is much larger than that of the model without them. Ongoing work applies this method to the LAMOST released data, which contain a large number of RVs covering ~200 nearby open clusters. If the coming GAIA data can be used, the accuracy of tangential velocities will be greatly improved and the intrinsic kinematics of open clusters can be investigated in detail, though these velocities are usually less than 1 km/s.

  20. Online query answering with differential privacy: a utility-driven approach using Bayesian inference

    CERN Document Server

    Xiao, Yonghui

    2012-01-01

    Data privacy issues frequently and increasingly arise for data sharing and data analysis tasks. In this paper, we study the problem of online query answering under the rigorous differential privacy model. The existing interactive mechanisms for differential privacy can only support a limited number of queries before the accumulated cost of privacy reaches a certain bound. This limitation has greatly hindered their applicability, especially in the scenario where multiple users legitimately need to pose a large number of queries. To minimize the privacy cost and extend the life span of a system, we propose a utility-driven mechanism for online query answering using Bayesian statistical inference. The key idea is to keep track of the query history and use Bayesian inference to answer a new query using previous query answers. The Bayesian inference algorithm provides both optimal point estimation and optimal interval estimation. We formally quantify the error of the inference result to determine if it satisfies t...
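
    A grid-posterior sketch of the underlying idea, assuming each released answer is the true count plus Laplace noise of scale 1/ε; the ε and answers are invented, and this is not the paper's mechanism in full.

    ```python
    import numpy as np

    # Bayesian combination of differentially private answers: given several
    # Laplace-noised releases of the same counting query, compute a grid
    # posterior over the true count and reuse it to answer repeated queries.
    eps = 0.1                                   # privacy parameter per answer
    answers = np.array([103.0, 96.0, 110.0])    # noisy answers already released

    xs = np.arange(0, 301)                      # candidate true counts, flat prior
    loglik = -eps * np.abs(answers[:, None] - xs[None, :]).sum(axis=0)  # Laplace log-density (up to a constant)
    post = np.exp(loglik - loglik.max())
    post /= post.sum()

    print("posterior mean estimate:", (xs * post).sum())
    ```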

  1. Methods for Bayesian power spectrum inference with galaxy surveys

    CERN Document Server

    Jasche, Jens

    2013-01-01

    We derive and implement a full Bayesian large scale structure inference method aiming at precision recovery of the cosmological power spectrum from galaxy redshift surveys. Our approach improves over previous Bayesian methods by performing a joint inference of the three dimensional density field, the cosmological power spectrum, luminosity dependent galaxy biases and corresponding normalizations. We account for all joint and correlated uncertainties between all inferred quantities. Classes of galaxies with different biases are treated as separate subsamples. The method therefore also allows the combined analysis of more than one galaxy survey. In particular, it solves the problem of inferring the power spectrum from galaxy surveys with non-trivial survey geometries by exploring the joint posterior distribution with efficient implementations of multiple block Markov chain and Hybrid Monte Carlo methods. Our Markov sampler achieves high statistical efficiency in low signal-to-noise regimes by using a determini...

  2. Bayesian inference on the sphere beyond statistical isotropy

    CERN Document Server

    Das, Santanu; Souradeep, Tarun

    2015-01-01

    We present a general method for Bayesian inference of the underlying covariance structure of random fields on a sphere. We employ the Bipolar Spherical Harmonic (BipoSH) representation of general covariance structure on the sphere. We illustrate the efficacy of the method as a principled approach to assess violation of statistical isotropy (SI) in the sky maps of Cosmic Microwave Background (CMB) fluctuations. SI violation in observed CMB maps arises due to known physical effects such as Doppler boost and weak lensing; as-yet-unknown theoretical possibilities like cosmic topology and subtle violations of the cosmological principle; as well as expected observational artefacts of scanning the sky with a non-circular beam, masking, foreground residuals, anisotropic noise, etc. We explicitly demonstrate the recovery of the input SI violation signals with their full statistics in simulated CMB maps. Our formalism easily adapts to exploring parametric physical models with non-SI covariance, as we illustrate for the in...

  3. Bayesian Inference of Natural Rankings in Incomplete Competition Networks

    CERN Document Server

    Park, Juyong

    2013-01-01

    Competition between a complex system's constituents and a corresponding reward mechanism based on it have profound influence on the functioning, stability, and evolution of the system. But determining the dominance hierarchy or ranking among the constituent parts from the strongest to the weakest -- essential in determining reward or penalty -- is almost always an ambiguous task due to the incomplete nature of competition networks. Here we introduce "Natural Ranking," a desirably unambiguous ranking method applicable to a complete (full) competition network, and formulate an analytical model based on the Bayesian formula inferring the expected mean and error of the natural ranking of nodes from an incomplete network. We investigate its potential and uses in solving issues in ranking by applying it to a real-world competition network of economic and social importance.

  4. Pig Data and Bayesian Inference on Multinomial Probabilities

    Science.gov (United States)

    Kern, John C.

    2006-01-01

    Bayesian inference on multinomial probabilities is conducted based on data collected from the game Pass the Pigs[R]. Prior information on these probabilities is readily available from the instruction manual, and is easily incorporated in a Dirichlet prior. Posterior analysis of the scoring probabilities quantifies the discrepancy between empirical…
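
    The conjugate update behind such an analysis is short enough to state in code. The sketch below shows the Dirichlet-multinomial update; the prior pseudo-counts and observed counts are placeholders, not the published Pass the Pigs values.

```python
import numpy as np

# Hypothetical Dirichlet prior pseudo-counts over six scoring outcomes
# (placeholders, not the game manual's actual probabilities).
alpha_prior = np.array([35.0, 30.0, 22.0, 9.0, 3.0, 1.0])
# Observed outcome counts from played games (also illustrative).
counts = np.array([120, 95, 70, 25, 8, 2])

# Conjugacy: a Dirichlet(alpha) prior combined with multinomial counts
# yields a Dirichlet(alpha + counts) posterior.
alpha_post = alpha_prior + counts
print(alpha_post / alpha_post.sum())  # posterior mean of each probability
```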

  5. Montblanc: GPU accelerated Radio Interferometer Measurement Equations in support of Bayesian Inference for Radio Observations

    CERN Document Server

    Perkins, Simon; Zwart, Jonathan; Natarajan, Iniyan; Smirnov, Oleg

    2015-01-01

    We present Montblanc, a GPU implementation of the Radio interferometer measurement equation (RIME) in support of the Bayesian inference for radio observations (BIRO) technique. BIRO uses Bayesian inference to select sky models that best match the visibilities observed by a radio interferometer. To accomplish this, BIRO evaluates the RIME multiple times, varying sky model parameters to produce multiple model visibilities. Chi-squared values computed from the model and observed visibilities are used as likelihood values to drive the Bayesian sampling process and select the best sky model. As most of the elements of the RIME and chi-squared calculation are independent of one another, they are highly amenable to parallel computation. Additionally, Montblanc caters for iterative RIME evaluation to produce multiple chi-squared values. Only modified model parameters are transferred to the GPU between each iteration. We implemented Montblanc as a Python package based upon NVIDIA's CUDA architecture. As such, it is ea...

  6. Metainference: A Bayesian inference method for heterogeneous systems.

    Science.gov (United States)

    Bonomi, Massimiliano; Camilloni, Carlo; Cavalli, Andrea; Vendruscolo, Michele

    2016-01-01

    Modeling a complex system is almost invariably a challenging task. The incorporation of experimental observations can be used to improve the quality of a model and thus to obtain better predictions about the behavior of the corresponding system. This approach, however, is affected by a variety of different errors, especially when a system simultaneously populates an ensemble of different states and experimental data are measured as averages over such states. To address this problem, we present a Bayesian inference method, called "metainference," that is able to deal with errors in experimental measurements and with experimental measurements averaged over multiple states. To achieve this goal, metainference models a finite sample of the distribution of models using a replica approach, in the spirit of the replica-averaging modeling based on the maximum entropy principle. To illustrate the method, we present its application to a heterogeneous model system and to the determination of an ensemble of structures corresponding to the thermal fluctuations of a protein molecule. Metainference thus provides an approach to modeling complex systems with heterogeneous components and interconverting between different states by taking into account all possible sources of errors. PMID:26844300

  7. Fast, fully Bayesian spatiotemporal inference for fMRI data.

    Science.gov (United States)

    Musgrove, Donald R; Hughes, John; Eberly, Lynn E

    2016-04-01

    We propose a spatial Bayesian variable selection method for detecting blood oxygenation level dependent activation in functional magnetic resonance imaging (fMRI) data. Typical fMRI experiments generate large datasets that exhibit complex spatial and temporal dependence. Fitting a full statistical model to such data can be so computationally burdensome that many practitioners resort to fitting oversimplified models, which can lead to lower quality inference. We develop a full statistical model that permits efficient computation. Our approach eases the computational burden in two ways. We partition the brain into 3D parcels, and fit our model to the parcels in parallel. Voxel-level activation within each parcel is modeled as regressions located on a lattice. Regressors represent the magnitude of change in blood oxygenation in response to a stimulus, while a latent indicator for each regressor represents whether the change is zero or non-zero. A sparse spatial generalized linear mixed model captures the spatial dependence among indicator variables within a parcel and for a given stimulus. The sparse SGLMM permits considerably more efficient computation than does the spatial model typically employed in fMRI. Through simulation we show that our parcellation scheme performs well in various realistic scenarios. Importantly, indicator variables on the boundary between parcels do not exhibit edge effects. We conclude by applying our methodology to data from a task-based fMRI experiment. PMID:26553916

  8. A tutorial introduction to Bayesian models of cognitive development

    OpenAIRE

    Perfors, Amy; Tenenbaum, Joshua B.; Griffiths, Thomas L.; Xu, Fei

    2010-01-01

    We present an introduction to Bayesian inference as it is used in probabilistic models of cognitive development. Our goal is to provide an intuitive and accessible guide to the what, the how, and the why of the Bayesian approach: what sorts of problems and data the framework is most relevant for, and how and why it may be useful for developmentalists. We emphasize a qualitative understanding of Bayesian inference, but also include information about additional resources for those interested in...

  9. Bayesian Inference for Neighborhood Filters With Application in Denoising.

    Science.gov (United States)

    Huang, Chao-Tsung

    2015-11-01

    Range-weighted neighborhood filters are useful and popular for their edge-preserving property and simplicity, but they were originally proposed as intuitive tools. Previous works needed to connect them to other tools or models for indirect property reasoning or parameter estimation. In this paper, we introduce a unified empirical Bayesian framework to do both directly. A neighborhood noise model is proposed to reason about and infer the Yaroslavsky, bilateral, and modified non-local means filters by joint maximum a posteriori and maximum likelihood estimation. Then, the essential parameter, range variance, can be estimated via model fitting to the empirical distribution of an observable chi scale mixture variable. An algorithm based on expectation-maximization and quasi-Newton optimization is devised to perform the model fitting efficiently. Finally, we apply this framework to the problem of color-image denoising. A recursive fitting and filtering scheme is proposed to improve the image quality. Extensive experiments are performed for a variety of configurations, including different kernel functions, filter types and support sizes, color channel numbers, and noise types. The results show that the proposed framework can fit noisy images well and the range variance can be estimated successfully and efficiently. PMID:26259244

  10. Type Ia Supernova Light Curve Inference: Hierarchical Bayesian Analysis in the Near Infrared

    CERN Document Server

    Mandel, Kaisey S; Friedman, Andrew S; Kirshner, Robert P

    2009-01-01

    We present a comprehensive statistical analysis of the properties of Type Ia SN light curves in the near infrared using recent data from PAIRITEL and the literature. We construct a hierarchical Bayesian framework, incorporating several uncertainties including photometric error, peculiar velocities, dust extinction and intrinsic variations, for coherent statistical inference. SN Ia light curve inferences are drawn from the global posterior probability of parameters describing both individual supernovae and the population conditioned on the entire SN Ia NIR dataset. The logical structure of the hierarchical Bayesian model is represented by a directed acyclic graph. Fully Bayesian analysis of the model and data is enabled by an efficient MCMC algorithm exploiting the conditional structure using Gibbs sampling. We apply this framework to the JHK_s SN Ia light curve data. A new light curve model captures the observed J-band light curve shape variations. The intrinsic variances in peak absolute magnitudes are: sigm...
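
    Hierarchical fits of this kind typically exploit conditional conjugacy via Gibbs sampling. The toy two-level normal model below (latent peak magnitudes scattered around a population mean, observed with known noise) is a generic stand-in for the paper's far richer light-curve model; all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.15   # known measurement noise (assumed)
tau = 0.2      # assumed population scatter
m_obs = np.array([-18.2, -18.5, -18.3, -18.6, -18.4])  # toy peak magnitudes

mu = m_obs.mean()  # initialize the population mean
for _ in range(2000):  # Gibbs sweeps over latent magnitudes and the mean
    # Conditional for each latent M_i: product of two Gaussians.
    prec = 1 / sigma**2 + 1 / tau**2
    mean = (m_obs / sigma**2 + mu / tau**2) / prec
    M = rng.normal(mean, 1 / np.sqrt(prec))
    # Conditional for mu under a flat prior: Gaussian around the mean of M.
    mu = rng.normal(M.mean(), tau / np.sqrt(len(M)))

print(mu)  # one posterior draw of the population mean
```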

  11. Practical Statistics for LHC Physicists: Bayesian Inference (3/3)

    CERN Document Server

    CERN. Geneva

    2015-01-01

    These lectures cover those principles and practices of statistics that are most relevant for work at the LHC. The first lecture discusses the basic ideas of descriptive statistics, probability and likelihood. The second lecture covers the key ideas in the frequentist approach, including confidence limits, profile likelihoods, p-values, and hypothesis testing. The third lecture covers inference in the Bayesian approach. Throughout, real-world examples will be used to illustrate the practical application of the ideas. No previous knowledge is assumed.

  12. Bayesian Fusion Algorithm for Inferring Trust in Wireless Sensor Networks

    OpenAIRE

    Mohammad Momani; Subhash Challa; Rami Alhmouz

    2010-01-01

    This paper introduces a new Bayesian fusion algorithm to combine more than one trust component (data trust and communication trust) to infer the overall trust between nodes. This research work proposes that one trust component is not enough when deciding on whether or not to trust a specific node in a wireless sensor network. This paper discusses and analyses the results from the communication trust component (binary) and the data trust component (continuous) and proves that either component ...

  13. Halo detection via large-scale Bayesian inference

    Science.gov (United States)

    Merson, Alexander I.; Jasche, Jens; Abdalla, Filipe B.; Lahav, Ofer; Wandelt, Benjamin; Jones, D. Heath; Colless, Matthew

    2016-08-01

    We present a proof-of-concept of a novel and fully Bayesian methodology designed to detect haloes of different masses in cosmological observations subject to noise and systematic uncertainties. Our methodology combines the previously published Bayesian large-scale structure inference algorithm, HAmiltonian Density Estimation and Sampling algorithm (HADES), and a Bayesian chain rule (the Blackwell-Rao estimator), which we use to connect the inferred density field to the properties of dark matter haloes. To demonstrate the capability of our approach, we construct a realistic galaxy mock catalogue emulating the wide-area 6-degree Field Galaxy Survey, which has a median redshift of approximately 0.05. Application of HADES to the catalogue provides us with accurately inferred three-dimensional density fields and corresponding quantification of uncertainties inherent to any cosmological observation. We then use a cosmological simulation to relate the amplitude of the density field to the probability of detecting a halo with mass above a specified threshold. With this information, we can sum over the HADES density field realisations to construct maps of detection probabilities and demonstrate the validity of this approach within our mock scenario. We find that the probability of successful detection of haloes in the mock catalogue increases as a function of the signal to noise of the local galaxy observations. Our proposed methodology can easily be extended to account for more complex scientific questions and is a promising novel tool to analyse the cosmic large-scale structure in observations.

  14. Trans-Dimensional Bayesian Inference for Gravitational Lens Substructures

    CERN Document Server

    Brewer, Brendon J; Lewis, Geraint F

    2015-01-01

    We introduce a Bayesian solution to the problem of inferring the density profile of strong gravitational lenses when the lens galaxy may contain multiple dark or faint substructures. The source and lens models are based on a superposition of an unknown number of non-negative basis functions (or "blobs") whose form was chosen with speed as a primary criterion. The prior distribution for the blobs' properties is specified hierarchically, so the mass function of substructures is a natural output of the method. We use reversible jump Markov Chain Monte Carlo (MCMC) within Diffusive Nested Sampling (DNS) to sample the posterior distribution and evaluate the marginal likelihood of the model, including the summation over the unknown number of blobs in the source and the lens. We demonstrate the method on a simulated data set with a single substructure, which is recovered well with moderate uncertainties. We also apply the method to the g-band image of the "Cosmic Horseshoe" system, and find some hints of potential s...

  15. UNSUPERVISED TRANSIENT LIGHT CURVE ANALYSIS VIA HIERARCHICAL BAYESIAN INFERENCE

    Energy Technology Data Exchange (ETDEWEB)

    Sanders, N. E.; Soderberg, A. M. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Betancourt, M., E-mail: nsanders@cfa.harvard.edu [Department of Statistics, University of Warwick, Coventry CV4 7AL (United Kingdom)

    2015-02-10

    Historically, light curve studies of supernovae (SNe) and other transient classes have focused on individual objects with copious and high signal-to-noise observations. In the nascent era of wide field transient searches, objects with detailed observations are decreasing as a fraction of the overall known SN population, and this strategy sacrifices the majority of the information contained in the data about the underlying population of transients. A population level modeling approach, simultaneously fitting all available observations of objects in a transient sub-class of interest, fully mines the data to infer the properties of the population and avoids certain systematic biases. We present a novel hierarchical Bayesian statistical model for population level modeling of transient light curves, and discuss its implementation using an efficient Hamiltonian Monte Carlo technique. As a test case, we apply this model to the Type IIP SN sample from the Pan-STARRS1 Medium Deep Survey, consisting of 18,837 photometric observations of 76 SNe, corresponding to a joint posterior distribution with 9176 parameters under our model. Our hierarchical model fits provide improved constraints on light curve parameters relevant to the physical properties of their progenitor stars relative to modeling individual light curves alone. Moreover, we directly evaluate the probability for occurrence rates of unseen light curve characteristics from the model hyperparameters, addressing observational biases in survey methodology. We view this modeling framework as an unsupervised machine learning technique with the ability to maximize scientific returns from data to be collected by future wide field transient searches like LSST.

  16. Methods for Bayesian Power Spectrum Inference with Galaxy Surveys

    Science.gov (United States)

    Jasche, Jens; Wandelt, Benjamin D.

    2013-12-01

    We derive and implement a full Bayesian large scale structure inference method aiming at precision recovery of the cosmological power spectrum from galaxy redshift surveys. Our approach improves upon previous Bayesian methods by performing a joint inference of the three-dimensional density field, the cosmological power spectrum, luminosity dependent galaxy biases, and corresponding normalizations. We account for all joint and correlated uncertainties between all inferred quantities. Classes of galaxies with different biases are treated as separate subsamples. This method therefore also allows the combined analysis of more than one galaxy survey. In particular, it solves the problem of inferring the power spectrum from galaxy surveys with non-trivial survey geometries by exploring the joint posterior distribution with efficient implementations of multiple block Markov chain and Hybrid Monte Carlo methods. Our Markov sampler achieves high statistical efficiency in low signal-to-noise regimes by using a deterministic reversible jump algorithm. This approach reduces the correlation length of the sampler by several orders of magnitude, turning the otherwise numerically unfeasible problem of joint parameter exploration into a numerically manageable task. We test our method on an artificial mock galaxy survey, emulating characteristic features of the Sloan Digital Sky Survey data release 7, such as its survey geometry and luminosity-dependent biases. These tests demonstrate the numerical feasibility of our large scale Bayesian inference framework when the parameter space has millions of dimensions. This method reveals and correctly treats the anti-correlation between bias amplitudes and the power spectrum, an effect of about 20% across large ranges in k space that is not taken into account in current approaches to power spectrum estimation. In addition, this method results in constrained realizations of density fields obtained without assuming the power spectrum or bias parameters

  17. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    Science.gov (United States)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both: A broad perspective on data analysis collection and evaluation issues. A narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining a risk informed decision making environment that is being sought by NASA requirements and procedures such as 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).

  18. Uncertainty Quantification of GEOS-5 L-band Radiative Transfer Model Parameters Using Bayesian Inference and SMOS Observations

    Science.gov (United States)

    DeLannoy, Gabrielle J. M.; Reichle, Rolf H.; Vrugt, Jasper A.

    2013-01-01

    Uncertainties in L-band (1.4 GHz) radiative transfer modeling (RTM) affect the simulation of brightness temperatures (Tb) over land and the inversion of satellite-observed Tb into soil moisture retrievals. In particular, accurate estimates of the microwave soil roughness, vegetation opacity and scattering albedo for large-scale applications are difficult to obtain from field studies and often lack an uncertainty estimate. Here, a Markov Chain Monte Carlo (MCMC) simulation method is used to determine satellite-scale estimates of RTM parameters and their posterior uncertainty by minimizing the misfit between long-term averages and standard deviations of simulated and observed Tb at a range of incidence angles, at horizontal and vertical polarization, and for morning and evening overpasses. Tb simulations are generated with the Goddard Earth Observing System (GEOS-5) and confronted with Tb observations from the Soil Moisture Ocean Salinity (SMOS) mission. The MCMC algorithm suggests that the relative uncertainty of the RTM parameter estimates is typically less than 25% of the maximum a posteriori density (MAP) parameter value. Furthermore, the actual root-mean-square-differences in long-term Tb averages and standard deviations are found consistent with the respective estimated total simulation and observation error standard deviations of approximately 3.1 K and 2.4 K. It is also shown that the MAP parameter values estimated through MCMC simulation are in close agreement with those obtained with Particle Swarm Optimization (PSO).
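
    The MCMC machinery referred to here can be illustrated with a minimal random-walk Metropolis sampler. The one-parameter forward model, data values and prior bounds below are invented for illustration and are not the GEOS-5/SMOS configuration.

```python
import numpy as np

rng = np.random.default_rng(1)
tb_obs, tb_err = 260.0, 2.5  # synthetic "observed" Tb average and its error

def tb_model(h):
    # Hypothetical forward model: Tb as a toy function of roughness h.
    return 250.0 + 8.0 * h

def log_post(h):
    if not 0.0 < h < 3.0:  # assumed uniform prior bounds
        return -np.inf
    return -0.5 * ((tb_model(h) - tb_obs) / tb_err) ** 2

h, lp = 1.0, log_post(1.0)
samples = []
for _ in range(5000):
    h_new = h + rng.normal(0.0, 0.1)            # random-walk proposal
    lp_new = log_post(h_new)
    if np.log(rng.uniform()) < lp_new - lp:     # Metropolis acceptance
        h, lp = h_new, lp_new
    samples.append(h)

burned = samples[1000:]
print(np.mean(burned), np.std(burned))  # posterior mean and uncertainty
```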

  19. Inference algorithms and learning theory for Bayesian sparse factor analysis

    International Nuclear Information System (INIS)

    Bayesian sparse factor analysis has many applications; for example, it has been applied to the problem of inferring a sparse regulatory network from gene expression data. We describe a number of inference algorithms for Bayesian sparse factor analysis using a slab and spike mixture prior. These include well-established Markov chain Monte Carlo (MCMC) and variational Bayes (VB) algorithms as well as a novel hybrid of VB and Expectation Propagation (EP). For the case of a single latent factor we derive a theory for learning performance using the replica method. We compare the MCMC and VB/EP algorithm results with simulated data to the theoretical prediction. The results for MCMC agree closely with the theory as expected. Results for VB/EP are slightly sub-optimal but show that the new algorithm is effective for sparse inference. In large-scale problems MCMC is infeasible due to computational limitations and the VB/EP algorithm then provides a very useful computationally efficient alternative.

  20. Bayesian inference of the demographic history of chimpanzees.

    Science.gov (United States)

    Wegmann, Daniel; Excoffier, Laurent

    2010-06-01

    Due to an almost complete absence of fossil record, the evolutionary history of chimpanzees has only been studied recently on the basis of genetic data. Although the general topology of the chimpanzee phylogeny is well established, uncertainties remain concerning the size of current and past populations, the occurrence of bottlenecks or population expansions, or about divergence times and migration rates between subspecies. Here, we present a novel attempt at globally inferring the detailed evolution of the Pan genus based on approximate Bayesian computation, an approach preferentially applied to complex models where the likelihood cannot be computed analytically. Based on two microsatellite and DNA sequence data sets and adjusting simulated data for local levels of inbreeding and patterns of missing data, we find support for several new features of chimpanzee evolution as compared with previous studies based on smaller data sets and simpler evolutionary models. We find that the central chimpanzees are certainly the oldest population of all P. troglodytes subspecies and that the other two P. t. subspecies diverged from the central chimpanzees by founder events. We also find an older divergence time (1.6 million years [My]) between common chimpanzees and Bonobos than previous studies (0.9-1.3 My), but this divergence appears to have been very progressive with the maintenance of relatively high levels of gene flow between the ancestral chimpanzee population and the Bonobos. Finally, we could also confirm the existence of strong unidirectional gene flow from the western into the central chimpanzees. These results show that interesting and innovative features of chimpanzee history emerge when considering their whole evolutionary history in a single analysis, rather than relying on simpler models involving several comparisons of pairs of populations. PMID:20118191
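
    Approximate Bayesian computation sidesteps intractable likelihoods by comparing simulated and observed summary statistics. The rejection-sampling sketch below uses a deliberately trivial simulator and made-up numbers, not the paper's coalescent models.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_summary(theta):
    # Hypothetical simulator: a summary statistic of data generated under
    # parameter theta (a stand-in for a coalescent simulation).
    return rng.normal(theta, 1.0, size=50).mean()

s_obs = 1.6   # "observed" summary statistic (illustrative)
eps = 0.05    # acceptance tolerance
accepted = []
while len(accepted) < 1000:
    theta = rng.uniform(0.0, 5.0)                 # draw from the prior
    if abs(simulate_summary(theta) - s_obs) < eps:
        accepted.append(theta)                    # keep parameters that fit

print(np.mean(accepted), np.std(accepted))        # approximate posterior
```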

  1. Bayesian Spatial Modelling with R-INLA

    OpenAIRE

    Finn Lindgren; Håvard Rue

    2015-01-01

    The principles behind the interface to continuous domain spatial models in the R- INLA software package for R are described. The integrated nested Laplace approximation (INLA) approach proposed by Rue, Martino, and Chopin (2009) is a computationally effective alternative to MCMC for Bayesian inference. INLA is designed for latent Gaussian models, a very wide and flexible class of models ranging from (generalized) linear mixed to spatial and spatio-temporal models. Combined with the stochastic...

  2. Applied Bayesian modelling

    CERN Document Server

    Congdon, Peter

    2014-01-01

    This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBU

  3. Structural damage identification using piezoelectric impedance and Bayesian inference

    Science.gov (United States)

    Shuai, Q.; Zhou, K.; Tang, J.

    2015-04-01

    Structural damage identification is a challenging subject in structural health monitoring research. Piezoelectric impedance-based damage identification, which usually utilizes matrix inverse-based optimization, may in theory identify the damage location and damage severity. However, the sensitivity matrix is oftentimes ill-conditioned in practice, since the number of unknowns may far exceed the useful measurements/inputs. In this research, a new method based on an intelligent inference framework for damage identification is presented. Bayesian inference is used to directly predict damage location and severity from impedance measurements through forward prediction and comparison. A Gaussian process is employed to enrich the forward analysis result, thereby reducing computational cost. A case study is carried out to illustrate the identification performance.

  4. Efficient Nonparametric Bayesian Modelling with Sparse Gaussian Process Approximations

    OpenAIRE

    Seeger, Matthias; Lawrence, Neil; Herbrich, Ralf

    2006-01-01

    Sparse approximations to Bayesian inference for nonparametric Gaussian Process models scale linearly in the number of training points, allowing for the application of powerful kernel-based models to large datasets. We present a general framework based on the informative vector machine (IVM) (Lawrence et.al., 2002) and show how the complete Bayesian task of inference and learning of free hyperparameters can be performed in a practically efficient manner. Our framework allows for arbitrary like...

  5. Inference for Multiplicative Models

    OpenAIRE

    Wexler, Ydo; Meek, Christopher

    2012-01-01

    The paper introduces a generalization of known probabilistic models such as log-linear and graphical models, called here multiplicative models. These models, which express probabilities via products of parameters, are shown to capture multiple forms of contextual independence between variables, including decision graphs and noisy-OR functions. An inference algorithm for multiplicative models is provided and its correctness is proved. The complexity analysis of the inference algorithm uses a mor...

  6. Uncertainty Analysis in Fatigue Life Prediction of Gas Turbine Blades Using Bayesian Inference

    Science.gov (United States)

    Li, Yan-Feng; Zhu, Shun-Peng; Li, Jing; Peng, Weiwen; Huang, Hong-Zhong

    2015-12-01

    This paper investigates Bayesian model selection for fatigue life estimation of gas turbine blades considering model uncertainty and parameter uncertainty. Fatigue life estimation of gas turbine blades is a critical issue for the operation and health management of modern aircraft engines. Since many life prediction models have been proposed for the fatigue life of gas turbine blades, model uncertainty and model selection among them have become an important issue in the lifecycle management of turbine blades. In this paper, fatigue life estimation is carried out by considering model uncertainty and parameter uncertainty simultaneously. It is formulated as the joint posterior distribution of a fatigue life prediction model and its model parameters using the Bayesian inference method. The Bayes factor is incorporated to implement the model selection with the quantified model uncertainty. The Markov Chain Monte Carlo method is used to facilitate the calculation. A pictorial framework and a step-by-step procedure of the Bayesian inference method for fatigue life estimation considering model uncertainty are presented. Fatigue life estimation of a gas turbine blade is implemented to demonstrate the proposed method.
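
    A Bayes factor is a ratio of marginal likelihoods, each obtained by integrating the likelihood against the prior. For a one-dimensional parameter this can be done by direct quadrature, as in the toy comparison below; both the data and the two competing scatter values are illustrative, not fatigue models from the paper.

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

# Toy fatigue-life data (log-cycles to failure); values are illustrative.
y = np.array([5.1, 5.3, 4.9, 5.2, 5.0])

def marginal_likelihood(sigma):
    # Marginal likelihood of a one-parameter location model with scatter
    # sigma, integrating the likelihood against a N(5, 0.5^2) prior.
    def integrand(theta):
        return np.prod(norm.pdf(y, theta, sigma)) * norm.pdf(theta, 5.0, 0.5)
    return quad(integrand, 3.0, 7.0)[0]

# Model 1 assumes tight scatter, model 2 broad scatter (toy variants).
B12 = marginal_likelihood(0.2) / marginal_likelihood(0.5)
print("Bayes factor B12 =", B12)  # > 1 favours the tight-scatter model here
```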

  7. A generalized bayesian inference method for constraining the interiors of super Earths and sub-Neptunes

    CERN Document Server

    Dorn, C; Khan, A; Heng, K; Alibert, Y; Helled, R; Rivoldini, A; Benz, W

    2016-01-01

    We aim to present a generalized Bayesian inference method for constraining interiors of super Earths and sub-Neptunes. Our methodology succeeds in quantifying the degeneracy and correlation of structural parameters for high dimensional parameter spaces. Specifically, we identify what constraints can be placed on composition and thickness of core, mantle, ice, ocean, and atmospheric layers given observations of mass, radius, and bulk refractory abundance constraints (Fe, Mg, Si) from observations of the host star's photospheric composition. We employed a full probabilistic Bayesian inference analysis that formally accounts for observational and model uncertainties. Using a Markov chain Monte Carlo technique, we computed joint and marginal posterior probability distributions for all structural parameters of interest. We included state-of-the-art structural models based on self-consistent thermodynamics of core, mantle, high-pressure ice, and liquid water. Furthermore, we tested and compared two different atmosp...

  8. Stochastic Collapsed Variational Bayesian Inference for Latent Dirichlet Allocation

    NARCIS (Netherlands)

    J. Foulds; L. Boyles; C. DuBois; P. Smyth; M. Welling

    2013-01-01

    There has been an explosion in the amount of digital text information available in recent years, leading to challenges of scale for traditional inference algorithms for topic models. Recent advances in stochastic variational inference algorithms for latent Dirichlet allocation (LDA) have made it fea

  9. Bayesian inference and decision theory - A framework for decision making in natural resource management

    Science.gov (United States)

    Dorazio, R.M.; Johnson, F.A.

    2003-01-01

    Bayesian inference and decision theory may be used in the solution of relatively complex problems of natural resource management, owing to recent advances in statistical theory and computing. In particular, Markov chain Monte Carlo algorithms provide a computational framework for fitting models of adequate complexity and for evaluating the expected consequences of alternative management actions. We illustrate these features using an example based on management of waterfowl habitat.

  10. A Decision Making Framework in Production Processes Using Bayesian Inference and Stochastic Dynamic Programming

    OpenAIRE

    Seyed Taghi Akhavan Niaki; Mohammad Saber Fallah Nezhad

    2007-01-01

    In order to design a decision-making framework in production environments, in this study, we use both the stochastic dynamic programming and Bayesian inference concepts. Using the posterior probability of the production process to be in state λ (the hazard rate of defective products), first we formulate the problem into a stochastic dynamic programming model. Next, we derive some properties for the optimal value of the objective function. Then, we propose a solution algorithm. At the end, the...

  11. Inference of Gene Regulatory Network Based on Local Bayesian Networks

    Science.gov (United States)

    Liu, Fei; Zhang, Shao-Wu; Guo, Wei-Feng; Chen, Luonan

    2016-01-01

    The inference of gene regulatory networks (GRNs) from expression data can mine the direct regulations among genes and gain deep insights into biological processes at a network level. During past decades, numerous computational approaches have been introduced for inferring the GRNs. However, many of them still suffer from various problems, e.g., Bayesian network (BN) methods cannot handle large-scale networks due to their high computational complexity, while information theory-based methods cannot identify the directions of regulatory interactions and also suffer from false positive/negative problems. To overcome the limitations, in this work we present a novel algorithm, namely local Bayesian network (LBN), to infer GRNs from gene expression data by using the network decomposition strategy and false-positive edge elimination scheme. Specifically, LBN algorithm first uses conditional mutual information (CMI) to construct an initial network or GRN, which is decomposed into a number of local networks or GRNs. Then, BN method is employed to generate a series of local BNs by selecting the k-nearest neighbors of each gene as its candidate regulatory genes, which significantly reduces the exponential search space from all possible GRN structures. Integrating these local BNs forms a tentative network or GRN by performing CMI, which reduces redundant regulations in the GRN and thus alleviates the false positive problem. The final network or GRN can be obtained by iteratively performing CMI and local BN on the tentative network. In the iterative process, the false or redundant regulations are gradually removed. When tested on the benchmark GRN datasets from DREAM challenge as well as the SOS DNA repair network in E.coli, our results suggest that LBN outperforms other state-of-the-art methods (ARACNE, GENIE3 and NARROMI) significantly, with more accurate and robust performance. In particular, the decomposition strategy with local Bayesian networks not only effectively reduce

  12. Inference of Gene Regulatory Network Based on Local Bayesian Networks.

    Science.gov (United States)

    Liu, Fei; Zhang, Shao-Wu; Guo, Wei-Feng; Wei, Ze-Gang; Chen, Luonan

    2016-08-01

    The inference of gene regulatory networks (GRNs) from expression data can mine the direct regulations among genes and gain deep insights into biological processes at a network level. During past decades, numerous computational approaches have been introduced for inferring the GRNs. However, many of them still suffer from various problems, e.g., Bayesian network (BN) methods cannot handle large-scale networks due to their high computational complexity, while information theory-based methods cannot identify the directions of regulatory interactions and also suffer from false positive/negative problems. To overcome the limitations, in this work we present a novel algorithm, namely local Bayesian network (LBN), to infer GRNs from gene expression data by using the network decomposition strategy and false-positive edge elimination scheme. Specifically, LBN algorithm first uses conditional mutual information (CMI) to construct an initial network or GRN, which is decomposed into a number of local networks or GRNs. Then, BN method is employed to generate a series of local BNs by selecting the k-nearest neighbors of each gene as its candidate regulatory genes, which significantly reduces the exponential search space from all possible GRN structures. Integrating these local BNs forms a tentative network or GRN by performing CMI, which reduces redundant regulations in the GRN and thus alleviates the false positive problem. The final network or GRN can be obtained by iteratively performing CMI and local BN on the tentative network. In the iterative process, the false or redundant regulations are gradually removed. When tested on the benchmark GRN datasets from DREAM challenge as well as the SOS DNA repair network in E.coli, our results suggest that LBN outperforms other state-of-the-art methods (ARACNE, GENIE3 and NARROMI) significantly, with more accurate and robust performance. In particular, the decomposition strategy with local Bayesian networks not only effectively reduce
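
    The conditional mutual information (CMI) screening that LBN-type algorithms rely on is easy to state for discrete data. The plug-in estimator below is a generic implementation, not the authors' code; the tiny check constructs variables that depend on each other only through Z, so the estimated CMI should be near zero.

```python
import numpy as np
from collections import Counter

def cmi(x, y, z):
    """Plug-in estimate of I(X;Y|Z) for discrete sequences."""
    n = len(x)
    pxyz, pxz = Counter(zip(x, y, z)), Counter(zip(x, z))
    pyz, pz = Counter(zip(y, z)), Counter(z)
    total = 0.0
    for (xi, yi, zi), c in pxyz.items():
        total += (c / n) * np.log(
            (c / n) * (pz[zi] / n) / ((pxz[(xi, zi)] / n) * (pyz[(yi, zi)] / n))
        )
    return total

# X and Y both depend on Z but are independent given Z, so CMI ~ 0.
rng = np.random.default_rng(3)
z = rng.integers(0, 2, 5000)
x = (z + (rng.random(5000) < 0.25)) % 2
y = (z + (rng.random(5000) < 0.25)) % 2
print(cmi(x, y, z))
```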

  13. Assessing the relationship between spectral solar irradiance and stratospheric ozone using Bayesian inference

    CERN Document Server

    Ball, William T; Egerton, Jack S; Haigh, Joanna D

    2014-01-01

    We investigate the relationship between spectral solar irradiance (SSI) and ozone in the tropical upper stratosphere. We find that solar cycle (SC) changes in ozone can be well approximated by considering the ozone response to SSI changes in a small number of individual wavelength bands between 176 and 310 nm, operating independently of each other. Additionally, we find that ozone varies approximately linearly with changes in the SSI. Using these facts, we present a Bayesian formalism for inferring SC SSI changes and uncertainties from measured SC ozone profiles. Bayesian inference is a powerful, mathematically self-consistent method of considering both the uncertainties of the data and additional external information to provide the best estimate of parameters being estimated. Using this method, we show that, given measurement uncertainties in both ozone and SSI datasets, it is not currently possible to distinguish between observed or modelled SSI datasets using available estimates of ozone change profiles, ...

  14. A simple algorithm to estimate genetic variance in an animal threshold model using Bayesian inference (Genetics Selection Evolution 2010, 42:29)

    DEFF Research Database (Denmark)

    Ødegård, Jørgen; Meuwissen, Theo HE; Heringstad, Bjørg;

    2010-01-01

    relationship matrix, but genetic (co)variance components are inferred from the sampled breeding values and relationships between "informative" individuals (usually parents) only. The latter is analogous to a sire-dam model (in cases with no individual records on the parents). Results When applied to simulated......, residual variance on the underlying scale is not identifiable. Hence, variance of fully confounded Mendelian sampling deviations cannot be identified either, but can be inferred from the between-family variation. In the new algorithm, breeding values are sampled as in a standard animal model using the full...

  15. Bayesian inference of inaccuracies in radiation transport physics from inertial confinement fusion experiments

    CERN Document Server

    Gaffney, Jim A; Sonnad, Vijay; Libby, Stephen B

    2013-01-01

    First principles microphysics models are essential to the design and analysis of high energy density physics experiments. Using experimental data to investigate the underlying physics is also essential, particularly when simulations and experiments are not consistent with each other. This is a difficult task, due to the large number of physical models that play a role, and due to the complex (and as a result, noisy) nature of the experiments. This results in a large number of parameters that make any inference a daunting task; it is also very important to consistently treat both experimental and prior understanding of the problem. In this paper we present a Bayesian method that includes both these effects, and allows the inference of a set of modifiers which have been constructed to give information about microphysics models from experimental data. We pay particular attention to radiation transport models. The inference takes into account a large set of experimental parameters and an estimate of the prior kno...

  16. Quantifying uncertainty in soot volume fraction estimates using Bayesian inference of auto-correlated laser-induced incandescence measurements

    Science.gov (United States)

    Hadwin, Paul J.; Sipkens, T. A.; Thomson, K. A.; Liu, F.; Daun, K. J.

    2016-01-01

    Auto-correlated laser-induced incandescence (AC-LII) infers the soot volume fraction (SVF) of soot particles by comparing the spectral incandescence from laser-energized particles to the pyrometrically inferred peak soot temperature. This calculation requires detailed knowledge of model parameters such as the absorption function of soot, which may vary with combustion chemistry, soot age, and the internal structure of the soot. This work presents a Bayesian methodology to quantify such uncertainties. This technique treats the additional "nuisance" model parameters, including the soot absorption function, as stochastic variables and incorporates the current state of knowledge of these parameters into the inference process through maximum entropy priors. While standard AC-LII analysis provides a point estimate of the SVF, Bayesian techniques infer the posterior probability density, which will allow scientists and engineers to better assess the reliability of AC-LII inferred SVFs in the context of environmental regulations and competing diagnostics.

  17. Bayesian Fusion Algorithm for Inferring Trust in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Mohammad Momani

    2010-07-01

    This paper introduces a new Bayesian fusion algorithm to combine more than one trust component (data trust and communication trust) to infer the overall trust between nodes. This research work proposes that one trust component is not enough when deciding on whether or not to trust a specific node in a wireless sensor network. This paper discusses and analyses the results from the communication trust component (binary) and the data trust component (continuous) and proves that either component by itself can mislead the network and eventually cause a total breakdown of the network. As a result of this, new algorithms are needed to combine more than one trust component to infer the overall trust. The proposed algorithm is simple and generic as it allows trust components to be added and deleted easily. Simulation results demonstrate that a node is highly trustworthy provided that both trust components simultaneously confirm its trustworthiness and, conversely, a node is highly untrustworthy if its untrustworthiness is asserted by both components.
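
    A minimal numerical reading of this fusion idea is sketched below: a Beta-Binomial posterior for the binary communication component, a sample mean for the continuous data component, and a precision-weighted combination. The combination rule and all numbers are assumptions for illustration, not the authors' exact algorithm.

```python
import numpy as np

# Communication trust: binary success/failure interactions -> Beta posterior.
successes, failures = 18, 2
a, b = 1 + successes, 1 + failures                # Beta(1,1) prior updated
comm_trust = a / (a + b)                          # posterior mean
comm_var = a * b / ((a + b) ** 2 * (a + b + 1))   # posterior variance

# Data trust: continuous consistency scores in [0, 1] (illustrative values).
scores = np.array([0.9, 0.85, 0.95, 0.8])
data_trust = scores.mean()
data_var = scores.var(ddof=1) / len(scores)

# Precision-weighted fusion of the two components (an assumed rule).
w1, w2 = 1 / comm_var, 1 / data_var
print((w1 * comm_trust + w2 * data_trust) / (w1 + w2))
```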

  18. Bayesian Modelling of fMRI Time Series

    DEFF Research Database (Denmark)

    Højen-Sørensen, Pedro; Hansen, Lars Kai; Rasmussen, Carl Edward

    2000-01-01

    We present a Hidden Markov Model (HMM) for inferring the hidden psychological state (or neural activity) during single trial fMRI activation experiments with blocked task paradigms. Inference is based on Bayesian methodology, using a combination of analytical and a variety of Markov Chain Monte...
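
    The filtering computation inside such an HMM is the forward recursion. The two-state sketch below, with Gaussian emissions and invented parameters, is a generic illustration rather than the fitted fMRI model.

```python
import numpy as np
from scipy.stats import norm

# Two hidden states (rest=0, active=1) with Gaussian emissions (toy values).
A = np.array([[0.9, 0.1],
              [0.1, 0.9]])               # transition matrix
means, sd = np.array([0.0, 1.0]), 0.5    # emission means and shared scale
pi = np.array([0.5, 0.5])                # initial state distribution
obs = np.array([0.1, 0.2, 1.1, 0.9, 1.0, 0.1])

# Forward recursion with per-step normalization (filtering).
alpha = pi * norm.pdf(obs[0], means, sd)
alpha /= alpha.sum()
for y in obs[1:]:
    alpha = (alpha @ A) * norm.pdf(y, means, sd)
    alpha /= alpha.sum()

print(alpha)  # filtered P(state | observations up to the last scan)
```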

  20. A formal model of interpersonal inference

    Directory of Open Access Journals (Sweden)

    Michael Moutoussis

    2014-03-01

    Introduction: We propose that active Bayesian inference – a general framework for decision-making – can equally be applied to interpersonal exchanges. Social cognition, however, entails special challenges. We address these challenges through a novel formulation of a formal model and demonstrate its psychological significance. Method: We review relevant literature, especially with regard to interpersonal representations, formulate a mathematical model and present a simulation study. The model accommodates normative models from utility theory and places them within the broader setting of Bayesian inference. Crucially, we endow people's prior beliefs, into which utilities are absorbed, with preferences of self and others. The simulation illustrates the model's dynamics and furnishes elementary predictions of the theory. Results: 1. Because beliefs about self and others inform both the desirability and plausibility of outcomes, in this framework interpersonal representations become beliefs that have to be actively inferred. This inference, akin to 'mentalising' in the psychological literature, is based upon the outcomes of interpersonal exchanges. 2. We show how some well-known social-psychological phenomena (e.g. self-serving biases) can be explained in terms of active interpersonal inference. 3. Mentalising naturally entails Bayesian updating of how people value social outcomes. Crucially, this includes inference about one's own qualities and preferences. Conclusion: We inaugurate a Bayes optimal framework for modelling intersubject variability in mentalising during interpersonal exchanges. Here, interpersonal representations are endowed with explicit functional and affective properties. We suggest the active inference framework lends itself to the study of psychiatric conditions where mentalising is distorted.

  1. CGBayesNets: conditional Gaussian Bayesian network learning and inference with mixed discrete and continuous data.

    Directory of Open Access Journals (Sweden)

    Michael J McGeachie

    2014-06-01

    Bayesian Networks (BNs) have been a popular predictive modeling formalism in bioinformatics, but their application in modern genomics has been slowed by an inability to cleanly handle domains with mixed discrete and continuous variables. Existing free BN software packages either discretize continuous variables, which can lead to information loss, or do not include inference routines, which makes prediction with the BN impossible. We present CGBayesNets, a BN package focused around prediction of a clinical phenotype from mixed discrete and continuous variables, which fills these gaps. CGBayesNets implements Bayesian likelihood and inference algorithms for the conditional Gaussian Bayesian network (CGBN) formalism, one appropriate for predicting an outcome of interest from, e.g., multimodal genomic data. We provide four different network learning algorithms, each making a different tradeoff between computational cost and network likelihood. CGBayesNets provides a full suite of functions for model exploration and verification, including cross validation, bootstrapping, and AUC manipulation. We highlight several results obtained previously with CGBayesNets, including predictive models of wood properties from tree genomics, leukemia subtype classification from mixed genomic data, and robust prediction of intensive care unit mortality outcomes from metabolomic profiles. We also provide detailed example analysis on public metabolomic and gene expression datasets. CGBayesNets is implemented in MATLAB and available as MATLAB source code, under an Open Source license and anonymous download at http://www.cgbayesnets.com.

  2. Bayesian electron density inference from JET lithium beam emission spectra using Gaussian processes

    CERN Document Server

    Kwak, Sehyun; Brix, M; Ghim, Y -c

    2016-01-01

    A Bayesian model to infer edge electron density profiles is developed for the JET lithium beam emission spectroscopy system, measuring Li I line radiation using 26 channels with ~1 cm spatial resolution and 10~20 ms temporal resolution. The density profile is modelled using a Gaussian process prior, and the uncertainty of the density profile is calculated by a Markov Chain Monte Carlo (MCMC) scheme. From the spectra measured by the transmission grating spectrometer, the Li line intensities are extracted, and modelled as a function of the plasma density by a multi-state model which describes the relevant processes between neutral lithium beam atoms and plasma particles. The spectral model fully takes into account interference filter and instrument effects, that are separately estimated, again using Gaussian processes. The line intensities are inferred based on a spectral model consistent with the measured spectra within their uncertainties, which includes photon statistics and electronic noise. Our newly devel...

  3. Coordinate transformation and Polynomial Chaos for the Bayesian inference of a Gaussian process with parametrized prior covariance function

    KAUST Repository

    Sraj, Ihab

    2015-10-22

    This paper addresses model dimensionality reduction for Bayesian inference based on prior Gaussian fields with uncertainty in the covariance function hyper-parameters. The dimensionality reduction is traditionally achieved using the Karhunen-Loève expansion of a prior Gaussian process assuming covariance function with fixed hyper-parameters, despite the fact that these are uncertain in nature. The posterior distribution of the Karhunen-Loève coordinates is then inferred using available observations. The resulting inferred field is therefore dependent on the assumed hyper-parameters. Here, we seek to efficiently estimate both the field and covariance hyper-parameters using Bayesian inference. To this end, a generalized Karhunen-Loève expansion is derived using a coordinate transformation to account for the dependence with respect to the covariance hyper-parameters. Polynomial Chaos expansions are employed for the acceleration of the Bayesian inference using similar coordinate transformations, enabling us to avoid expanding explicitly the solution dependence on the uncertain hyper-parameters. We demonstrate the feasibility of the proposed method on a transient diffusion equation by inferring spatially-varying log-diffusivity fields from noisy data. The inferred profiles were found closer to the true profiles when including the hyper-parameters’ uncertainty in the inference formulation.
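
    On a discrete grid, the Karhunen-Loève expansion is simply the eigendecomposition of the covariance matrix, truncated to the dominant modes. The squared-exponential kernel, grid, and fixed hyper-parameters below are illustrative choices, not the paper's setup.

```python
import numpy as np

# Grid and squared-exponential covariance with fixed hyper-parameters.
x = np.linspace(0.0, 1.0, 100)
ell, var = 0.2, 1.0  # assumed length scale and variance
K = var * np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell**2)

# Truncated Karhunen-Loeve expansion: keep the leading eigenmodes.
vals, vecs = np.linalg.eigh(K)
idx = np.argsort(vals)[::-1][:10]        # 10 dominant modes
lam, phi = vals[idx], vecs[:, idx]

# A field sample is a linear combination of modes with N(0,1) coordinates.
rng = np.random.default_rng(4)
xi = rng.standard_normal(10)
field = phi @ (np.sqrt(lam) * xi)
print(field[:5])
```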

  4. Bayesian Nonparametrics in Topic Modeling: A Brief Tutorial

    OpenAIRE

    Spangher, Alexander

    2015-01-01

    Using nonparametric methods has been increasingly explored in Bayesian hierarchical modeling as a way to increase model flexibility. Although the field shows a lot of promise, inference in many models, including Hierachical Dirichlet Processes (HDP), remain prohibitively slow. One promising path forward is to exploit the submodularity inherent in Indian Buffet Process (IBP) to derive near-optimal solutions in polynomial time. In this work, I will present a brief tutorial on Bayesian nonparame...

  5. Bayesian inference of solar and stellar magnetic fields in the weak-field approximation

    CERN Document Server

    Ramos, A Asensio

    2011-01-01

    The weak-field approximation is one of the simplest models that allows us to relate the observed polarization induced by the Zeeman effect with the magnetic field vector present in the plasma of interest. It is usually applied for diagnosing magnetic fields in solar and stellar atmospheres. A fully Bayesian approach to the inference of magnetic properties in unresolved structures is presented. The analytical expression for the marginal posterior distribution is obtained, from which we can obtain statistically relevant information about the model parameters. The role of a priori information is discussed and a hierarchical procedure is presented that gives robust results that are almost insensitive to the precise choice of the prior. The strength of the formalism is demonstrated through an application to IMaX data. Bayesian methods can optimally exploit data from filter-polarimeters given the scarcity of spectral information as compared with spectro-polarimeters. The effect of noise and how it degrades ou...

  6. Bayesian Inference on Predictors of Sex of the Baby

    OpenAIRE

    Scarpa, Bruno

    2016-01-01

    It is well known that the sex ratio at birth is a biological constant, being about 106 boys to 100 girls. However, couples have always wanted to know, and to decide in advance, the sex of a newborn. For example, a large number of papers have appeared connecting biometrical variables, such as the length of the follicular phase in the woman's menstrual cycle or the timing of intercourse acts, to the sex of the new baby. In this paper, we propose a Bayesian model to validate some of these theories by using an independent dat...

  7. Planetary micro-rover operations on Mars using a Bayesian framework for inference and control

    Science.gov (United States)

    Post, Mark A.; Li, Junquan; Quine, Brendan M.

    2016-03-01

    With the recent progress toward the application of commercially-available hardware to small-scale space missions, it is now becoming feasible for groups of small, efficient robots based on low-power embedded hardware to perform simple tasks on other planets in the place of large-scale, heavy and expensive robots. In this paper, we describe design and programming of the Beaver micro-rover developed for Northern Light, a Canadian initiative to send a small lander and rover to Mars to study the Martian surface and subsurface. For a small, hardware-limited rover to handle an uncertain and mostly unknown environment without constant management by human operators, we use a Bayesian network of discrete random variables as an abstraction of expert knowledge about the rover and its environment, and inference operations for control. A framework for efficient construction and inference into a Bayesian network using only the C language and fixed-point mathematics on embedded hardware has been developed for the Beaver to make intelligent decisions with minimal sensor data. We study the performance of the Beaver as it probabilistically maps a simple outdoor environment with sensor models that include uncertainty. Results indicate that the Beaver and other small and simple robotic platforms can make use of a Bayesian network to make intelligent decisions in uncertain planetary environments.

  8. Bayesian inference and life testing plans for generalized exponential distribution

    Institute of Scientific and Technical Information of China (English)

    KUNDU; Debasis; PRADHAN; Biswabrata

    2009-01-01

    Recently, the generalized exponential distribution has received considerable attention. In this paper, we deal with the Bayesian inference of the unknown parameters of the progressively censored generalized exponential distribution. It is assumed that the scale and the shape parameters have independent gamma priors. The Bayes estimates of the unknown parameters cannot be obtained in closed form. Lindley's approximation and an importance sampling technique have been suggested to compute the approximate Bayes estimates. A Markov chain Monte Carlo method has been used to compute the approximate Bayes estimates and also to construct the highest posterior density credible intervals. We also provide different criteria to compare two different sampling schemes and hence to find the optimal sampling scheme. It is observed that finding the optimum censoring procedure is a computationally expensive process, and we therefore recommend using the sub-optimal censoring procedure, which can be obtained very easily. Monte Carlo simulations are performed to compare the performances of the different methods, and one data analysis has been performed for illustrative purposes.
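
    The importance sampling technique mentioned here estimates posterior expectations by reweighting draws from a tractable proposal. The sketch below applies self-normalized importance sampling to a simpler stand-in model (an exponential likelihood with a gamma prior), not the progressively censored generalized exponential model; all numbers are illustrative.

```python
import numpy as np
from scipy.stats import gamma, expon

rng = np.random.default_rng(5)
data = np.array([0.8, 1.2, 0.5, 2.0, 1.1])  # toy lifetimes (illustrative)

def log_post_unnorm(lam):
    # Gamma(2, scale=1) prior times an exponential(rate=lam) likelihood.
    return gamma.logpdf(lam, 2, scale=1.0) + expon.logpdf(data, scale=1/lam).sum()

# Proposal roughly covering the posterior region (an assumed choice).
prop = gamma(a=6, scale=0.2)
lam = prop.rvs(size=20000, random_state=rng)
logw = np.array([log_post_unnorm(l) for l in lam]) - prop.logpdf(lam)
w = np.exp(logw - logw.max())                    # stabilized weights
print((w * lam).sum() / w.sum())                 # self-normalized posterior mean
```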

  9. Internal dosimetry of uranium isotopes using bayesian inference methods

    International Nuclear Information System (INIS)

    A group of personnel at Los Alamos National Laboratory is routinely monitored for the presence of uranium isotopes by urine bioassay. Samples are analysed by alpha spectroscopy, and the results are examined for evidence of an intake of uranium. Because the measurement uncertainties are often comparable to the quantities of material we wish to detect, statistical considerations are crucial for the proper interpretation of the data. The problem is further complicated by the significant, but highly non-uniform, presence of uranium in local drinking water and, in some cases, food supply. Software originally developed for internal dosimetry of plutonium has been adapted to the problem of uranium dosimetry. The software uses an unfolding algorithm to calculate an approximate Bayesian solution to the problem of characterising any intakes which may have occurred, given the history of urine bioassay results for each individual in the monitored population. The program uses biokinetic models from ICRP Publications 68 and later, and a prior probability distribution derived empirically from the body of uranium bioassay data collected at Los Alamos over the operating history of the Laboratory. For each individual, the software creates a posterior probability distribution of intake quantity and solubility type as a function of time. From this distribution, estimates are made of the cumulative committed dose (CEDE) to each individual. Results of the method are compared with those obtained using an earlier classical (non-Bayesian) algorithm for uranium dosimetry. We also discuss the problem of distinguishing occupational intakes from intake of environmental uranium, within a Bayesian framework. (author)

  10. Bayesian Fundamentalism or Enlightenment? On the explanatory status and theoretical contributions of Bayesian models of cognition.

    Science.gov (United States)

    Jones, Matt; Love, Bradley C

    2011-08-01

    The prominence of Bayesian modeling of cognition has increased recently largely because of mathematical advances in specifying and deriving predictions from complex probabilistic models. Much of this research aims to demonstrate that cognitive behavior can be explained from rational principles alone, without recourse to psychological or neurological processes and representations. We note commonalities between this rational approach and other movements in psychology - namely, Behaviorism and evolutionary psychology - that set aside mechanistic explanations or make use of optimality assumptions. Through these comparisons, we identify a number of challenges that limit the rational program's potential contribution to psychological theory. Specifically, rational Bayesian models are significantly unconstrained, both because they are uninformed by a wide range of process-level data and because their assumptions about the environment are generally not grounded in empirical measurement. The psychological implications of most Bayesian models are also unclear. Bayesian inference itself is conceptually trivial, but strong assumptions are often embedded in the hypothesis sets and the approximation algorithms used to derive model predictions, without a clear delineation between psychological commitments and implementational details. Comparing multiple Bayesian models of the same task is rare, as is the realization that many Bayesian models recapitulate existing (mechanistic level) theories. Despite the expressive power of current Bayesian models, we argue they must be developed in conjunction with mechanistic considerations to offer substantive explanations of cognition. We lay out several means for such an integration, which take into account the representations on which Bayesian inference operates, as well as the algorithms and heuristics that carry it out. We argue this unification will better facilitate lasting contributions to psychological theory, avoiding the pitfalls

  11. Mocapy++ - A toolkit for inference and learning in dynamic Bayesian networks

    DEFF Research Database (Denmark)

    Paluszewski, Martin; Hamelryck, Thomas Wim

    2010-01-01

    Background Mocapy++ is a toolkit for parameter learning and inference in dynamic Bayesian networks (DBNs). It supports a wide range of DBN architectures and probability distributions, including distributions from directional statistics (the statistics of angles, directions and orientations...

  12. VIGoR: Variational Bayesian Inference for Genome-Wide Regression

    OpenAIRE

    Onogi, Akio; Iwata, Hiroyoshi

    2016-01-01

    Genome-wide regression using a number of genome-wide markers as predictors is now widely used for genome-wide association mapping and genomic prediction. We developed novel software for genome-wide regression which we named VIGoR (variational Bayesian inference for genome-wide regression). Variational Bayesian inference is computationally much faster than widely used Markov chain Monte Carlo algorithms. VIGoR implements seven regression methods, and is provided as a command line program packa...

  13. Hierarchical Bayesian inference of galaxy redshift distributions from photometric surveys

    CERN Document Server

    Leistedt, Boris; Peiris, Hiranya V

    2016-01-01

    Accurately characterizing the redshift distributions of galaxies is essential for analysing deep photometric surveys and testing cosmological models. We present a technique to simultaneously infer redshift distributions and individual redshifts from photometric galaxy catalogues. Our model constructs a piecewise constant representation (effectively a histogram) of the distribution of galaxy types and redshifts, the parameters of which are efficiently inferred from noisy photometric flux measurements. This approach can be seen as a generalization of template-fitting photometric redshift methods and relies on a library of spectral templates to relate the photometric fluxes of individual galaxies to their redshifts. We illustrate this technique on simulated galaxy survey data, and demonstrate that it delivers correct posterior distributions on the underlying type and redshift distributions, as well as on the individual types and redshifts of galaxies. We show that even with uninformative priors, large photometri...

  14. Montblanc1: GPU accelerated radio interferometer measurement equations in support of Bayesian inference for radio observations

    Science.gov (United States)

    Perkins, S. J.; Marais, P. C.; Zwart, J. T. L.; Natarajan, I.; Tasse, C.; Smirnov, O.

    2015-09-01

    We present Montblanc, a GPU implementation of the Radio interferometer measurement equation (RIME) in support of the Bayesian inference for radio observations (BIRO) technique. BIRO uses Bayesian inference to select sky models that best match the visibilities observed by a radio interferometer. To accomplish this, BIRO evaluates the RIME multiple times, varying sky model parameters to produce multiple model visibilities. χ2 values computed from the model and observed visibilities are used as likelihood values to drive the Bayesian sampling process and select the best sky model. As most of the elements of the RIME and χ2 calculation are independent of one another, they are highly amenable to parallel computation. Additionally, Montblanc caters for iterative RIME evaluation to produce multiple χ2 values. Modified model parameters are transferred to the GPU between each iteration. We implemented Montblanc as a Python package based upon NVIDIA's CUDA architecture. As such, it is easy to extend and implement different pipelines. At present, Montblanc supports point and Gaussian morphologies, but is designed for easy addition of new source profiles. Montblanc's RIME implementation is performant: on an NVIDIA K40, it is approximately 250 times faster than MEQTREES on a dual hexacore Intel E5-2620v2 CPU. Compared to the OSKAR simulator's GPU-implemented RIME components, it is 7.7 and 12 times faster on the same K40 for single- and double-precision floating point, respectively. However, OSKAR's RIME implementation is more general than Montblanc's BIRO-tailored RIME. Theoretical analysis of Montblanc's dominant CUDA kernel suggests that it is memory bound. In practice, profiling shows that it is balanced between compute and memory, as much of the data required by the problem is retained in L1 and L2 caches.
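
    The χ2-as-likelihood step is simple to state in code. The sketch below is plain NumPy on the CPU rather than Montblanc's CUDA implementation, with illustrative array names that are not part of Montblanc's API:

    import numpy as np

    def chi2_log_likelihood(v_obs, v_model, sigma):
        """chi^2 between observed and model complex visibilities,
        mapped to a Gaussian log-likelihood (additive constants dropped)."""
        chi2 = np.sum(np.abs(v_obs - v_model) ** 2 / sigma ** 2)
        return -0.5 * chi2

    rng = np.random.default_rng(1)
    v_true = rng.standard_normal(1000) + 1j * rng.standard_normal(1000)
    sigma = 0.1
    v_obs = v_true + sigma * (rng.standard_normal(1000) + 1j * rng.standard_normal(1000))

    print(chi2_log_likelihood(v_obs, v_true, sigma))  # near-maximal for the true sky model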

  15. Atmospheric Dispersion Unknown Source Parameters Determination Using AERMOD and Bayesian Inference Along Markov Chain Monte Carlo

    International Nuclear Information System (INIS)

    Occurrence of a hazardous accident in nuclear power plants and industrial units usually leads to the release of radioactive materials and pollutants into the environment. These materials and pollutants can be transported far downstream by the wind flow. In this paper, we implemented an atmospheric dispersion code to solve the inverse problem: having received and detected the pollutants in one region, we may estimate the rate and location of the unknown source. This requires a model capable of atmospheric dispersion calculations, together with a mathematical approach to infer the source location and the related rates. In this paper the AERMOD software and Bayesian inference along with Markov chain Monte Carlo have been applied. Using a Bayesian approach with Markov chain Monte Carlo for this problem is not new, but coupling it with AERMOD, a well-known regulatory model, is, and it enhances the reliability of the outcomes. To evaluate the method, an example is considered by defining pollutant concentrations in a specific region and then obtaining the source location and intensity by a direct calculation. The calculation estimates the average source location at a distance of 7 km with an accuracy of 5 m, which is good enough to support the ability of the proposed algorithm.
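
    A minimal sketch of the inversion, with a crude ground-level Gaussian-plume formula and made-up dispersion coefficients standing in for AERMOD: a Metropolis chain explores source rate and along-wind position given noisy receptor concentrations.

    import numpy as np

    rng = np.random.default_rng(2)

    def plume(q, x0, receptors, u=3.0):
        """Very simplified ground-level Gaussian plume standing in for AERMOD:
        source rate q at along-wind position x0, wind speed u (all values assumed)."""
        dx = np.maximum(receptors[:, 0] - x0, 1.0)   # only downwind receptors see the plume
        dy = receptors[:, 1]
        sy = 0.08 * dx ** 0.9                        # assumed dispersion coefficients
        sz = 0.06 * dx ** 0.85
        return q / (np.pi * u * sy * sz) * np.exp(-0.5 * (dy / sy) ** 2)

    receptors = np.column_stack([rng.uniform(200, 2000, 30), rng.uniform(-200, 200, 30)])
    c_obs = plume(5.0, 0.0, receptors) * np.exp(0.1 * rng.standard_normal(30))  # noisy data

    def log_post(q, x0):
        if q <= 0 or not (-500 < x0 < 500):          # flat priors on a bounded box (assumed)
            return -np.inf
        resid = np.log(c_obs) - np.log(plume(q, x0, receptors))
        return -0.5 * np.sum(resid ** 2) / 0.1 ** 2

    theta = np.array([1.0, 100.0])
    lp, chain = log_post(*theta), []
    for _ in range(20000):
        prop = theta + rng.standard_normal(2) * [0.2, 20.0]
        lp_prop = log_post(*prop)
        if np.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta.copy())
    print(np.mean(chain[5000:], axis=0))  # roughly recovers the source rate and location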

  16. Bayesian model reduction and empirical Bayes for group (DCM) studies.

    Science.gov (United States)

    Friston, Karl J; Litvak, Vladimir; Oswal, Ashwini; Razi, Adeel; Stephan, Klaas E; van Wijk, Bernadette C M; Ziegler, Gabriel; Zeidman, Peter

    2016-03-01

    This technical note describes some Bayesian procedures for the analysis of group studies that use nonlinear models at the first (within-subject) level - e.g., dynamic causal models - and linear models at subsequent (between-subject) levels. Its focus is on using Bayesian model reduction to finesse the inversion of multiple models of a single dataset or a single (hierarchical or empirical Bayes) model of multiple datasets. These applications of Bayesian model reduction allow one to consider parametric random effects and make inferences about group effects very efficiently (in a few seconds). We provide the relatively straightforward theoretical background to these procedures and illustrate their application using a worked example. This example uses a simulated mismatch negativity study of schizophrenia. We illustrate the robustness of Bayesian model reduction to violations of the (commonly used) Laplace assumption in dynamic causal modelling and show how its recursive application can facilitate both classical and Bayesian inference about group differences. Finally, we consider the application of these empirical Bayesian procedures to classification and prediction. PMID:26569570
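
    The identity underpinning Bayesian model reduction is worth stating. For a reduced model m_r that differs from the full model m_f only in its priors, the reduced evidence follows from the full-model posterior without refitting (under the Laplace assumption the integral is available in closed form):

    p(y \mid m_r) = p(y \mid m_f) \int p(\theta \mid y, m_f)\,\frac{p(\theta \mid m_r)}{p(\theta \mid m_f)}\, d\theta

    This is the device that lets one score many reduced models after inverting the full model only once.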

  17. Modeling Social Annotation: a Bayesian Approach

    CERN Document Server

    Plangprasopchok, Anon

    2008-01-01

    Collaborative tagging systems, such as del.icio.us, CiteULike, and others, allow users to annotate objects, e.g., Web pages or scientific papers, with descriptive labels called tags. The social annotations, contributed by thousands of users, can potentially be used to infer categorical knowledge, classify documents or recommend new relevant information. Traditional text inference methods do not make best use of socially-generated data, since they do not take into account variations in individual users' perspectives and vocabulary. In a previous work, we introduced a simple probabilistic model that takes interests of individual annotators into account in order to find hidden topics of annotated objects. Unfortunately, our proposed approach had a number of shortcomings, including overfitting, local maxima and the requirement to specify values for some parameters. In this paper we address these shortcomings in two ways. First, we extend the model to a fully Bayesian framework. Second, we describe an infinite ver...

  18. Bayesian Model Averaging for Propensity Score Analysis

    Science.gov (United States)

    Kaplan, David; Chen, Jianshen

    2013-01-01

    The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…

  19. Hierarchical Bayesian inference of galaxy redshift distributions from photometric surveys

    Science.gov (United States)

    Leistedt, Boris; Mortlock, Daniel J.; Peiris, Hiranya V.

    2016-08-01

    Accurately characterizing the redshift distributions of galaxies is essential for analysing deep photometric surveys and testing cosmological models. We present a technique to simultaneously infer redshift distributions and individual redshifts from photometric galaxy catalogues. Our model constructs a piecewise constant representation (effectively a histogram) of the distribution of galaxy types and redshifts, the parameters of which are efficiently inferred from noisy photometric flux measurements. This approach can be seen as a generalization of template-fitting photometric redshift methods and relies on a library of spectral templates to relate the photometric fluxes of individual galaxies to their redshifts. We illustrate this technique on simulated galaxy survey data, and demonstrate that it delivers correct posterior distributions on the underlying type and redshift distributions, as well as on the individual types and redshifts of galaxies. We show that even with uninformative priors, large photometric errors and parameter degeneracies, the redshift and type distributions can be recovered robustly thanks to the hierarchical nature of the model, which is not possible with common photometric redshift estimation techniques. As a result, redshift uncertainties can be fully propagated in cosmological analyses for the first time, fulfilling an essential requirement for the current and future generations of surveys.

  20. TYPE Ia SUPERNOVA LIGHT-CURVE INFERENCE: HIERARCHICAL BAYESIAN ANALYSIS IN THE NEAR-INFRARED

    International Nuclear Information System (INIS)

    We present a comprehensive statistical analysis of the properties of Type Ia supernova (SN Ia) light curves in the near-infrared using recent data from the Peters Automated InfraRed Imaging TELescope and the literature. We construct a hierarchical Bayesian framework, incorporating several uncertainties including photometric error, peculiar velocities, dust extinction, and intrinsic variations, for principled and coherent statistical inference. SN Ia light-curve inferences are drawn from the global posterior probability of parameters describing both individual supernovae and the population, conditioned on the entire SN Ia NIR data set. The logical structure of the hierarchical model is represented by a directed acyclic graph. Fully Bayesian analysis of the model and data is enabled by an efficient Markov chain Monte Carlo algorithm exploiting the conditional probabilistic structure using Gibbs sampling. We apply this framework to the JHK_s SN Ia light-curve data. A new light-curve model captures the observed J-band light-curve shape variations. The marginal intrinsic variances in peak absolute magnitudes are σ(M_J) = 0.17 ± 0.03, σ(M_H) = 0.11 ± 0.03, and σ(M_Ks) = 0.19 ± 0.04. We describe the first quantitative evidence for correlations between the NIR absolute magnitudes and J-band light-curve shapes, and demonstrate their utility for distance estimation. The average residual in the Hubble diagram for the training set SNe at cz > 2000 km s^-1 is 0.10 mag. The new application of bootstrap cross-validation to SN Ia light-curve inference tests the sensitivity of the statistical model fit to the finite sample and estimates the prediction error at 0.15 mag. These results demonstrate that SN Ia NIR light curves are as effective as corrected optical light curves, and, because they are less vulnerable to dust absorption, they have great potential as precise and accurate cosmological distance indicators.

  1. Evidence cross-validation and Bayesian inference of MAST plasma equilibria

    International Nuclear Information System (INIS)

    In this paper, current profiles for plasma discharges on the Mega-Ampere Spherical Tokamak are directly calculated from pickup coil, flux loop, and motional-Stark effect observations via methods based in the statistical theory of Bayesian analysis. By representing toroidal plasma current as a series of axisymmetric current beams with rectangular cross-section and inferring the current for each one of these beams, flux-surface geometry and q-profiles are subsequently calculated by elementary application of the Biot-Savart law. The use of this plasma model in the context of Bayesian analysis was pioneered by Svensson and Werner on the Joint European Torus [Svensson and Werner, Plasma Phys. Controlled Fusion 50(8), 085002 (2008)]. In this framework, linear forward models are used to generate diagnostic predictions, and the probability distribution for the currents in the collection of plasma beams is subsequently calculated directly via application of Bayes' formula. In this work, we introduce a new diagnostic technique to identify and remove outlier observations associated with diagnostics falling out of calibration or suffering from an unidentified malfunction. These modifications enable good agreement between Bayesian inference of the last-closed flux-surface and other corroborating data, such as that from force balance considerations using EFIT++ [Appel et al., "A unified approach to equilibrium reconstruction," Proceedings of the 33rd EPS Conference on Plasma Physics (Rome, Italy, 2006)]. In addition, this analysis also yields errors on the plasma current profile and flux-surface geometry, as well as directly predicting the Shafranov shift of the plasma core.
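
    Since the forward models are linear and the noise Gaussian, the posterior over beam currents is available in closed form. A toy sketch, with a random matrix standing in for the real diagnostic response and all noise and prior scales assumed:

    import numpy as np

    rng = np.random.default_rng(3)

    # Toy linear-Gaussian inversion in the spirit of the current-beam model:
    # d = G i + noise, with G a hypothetical response matrix mapping beam
    # currents i to magnetic diagnostics d (pickup coils, flux loops).
    n_beams, n_obs = 20, 60
    G = rng.standard_normal((n_obs, n_beams))
    i_true = rng.standard_normal(n_beams)
    sigma_d, sigma_i = 0.05, 1.0                       # noise / prior scales (assumed)
    d = G @ i_true + sigma_d * rng.standard_normal(n_obs)

    # Gaussian posterior over beam currents via Bayes' formula:
    precision = G.T @ G / sigma_d**2 + np.eye(n_beams) / sigma_i**2
    cov = np.linalg.inv(precision)
    mean = cov @ G.T @ d / sigma_d**2
    print(np.max(np.abs(mean - i_true)), np.sqrt(np.diag(cov)).max())
    # the posterior mean recovers the currents; diag(cov) gives error bars, from
    # which flux surfaces and q-profiles would follow via the Biot-Savart law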

  2. A Non-Parametric Bayesian Method for Inferring Hidden Causes

    OpenAIRE

    Wood, Frank; Griffiths, Thomas; Ghahramani, Zoubin

    2012-01-01

    We present a non-parametric Bayesian approach to structure learning with hidden causes. Previous Bayesian treatments of this problem define a prior over the number of hidden causes and use algorithms such as reversible jump Markov chain Monte Carlo to move between solutions. In contrast, we assume that the number of hidden causes is unbounded, but only a finite number influence observable variables. This makes it possible to use a Gibbs sampler to approximate the distribution over causal stru...

  3. Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Willcox, Karen [MIT]; Marzouk, Youssef [MIT]

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to

  4. Low-Rank Separated Representation Surrogates of High-Dimensional Stochastic Functions: Application in Bayesian Inference

    CERN Document Server

    Validi, AbdoulAhad

    2013-01-01

    This study introduces a non-intrusive approach in the context of low-rank separated representation to construct a surrogate of high-dimensional stochastic functions, e.g., PDEs/ODEs, in order to decrease the computational cost of Markov Chain Monte Carlo simulations in Bayesian inference. The surrogate model is constructed via a regularized alternating least-squares regression with Tikhonov regularization using a roughening matrix computing the gradient of the solution, in conjunction with a perturbation-based error indicator to detect optimal model complexities. The model approximates a vector of a continuous solution at discrete values of a physical variable. The required number of random realizations to achieve a successful approximation linearly depends on the function dimensionality. The computational cost of the model construction is quadratic in the number of random inputs, which potentially tackles the curse of dimensionality in high-dimensional stochastic functions. Furthermore, this vector valued sep...

  5. A Nonparametric Bayesian Model for Nested Clustering.

    Science.gov (United States)

    Lee, Juhee; Müller, Peter; Zhu, Yitan; Ji, Yuan

    2016-01-01

    We propose a nonparametric Bayesian model for clustering where clusters of experimental units are determined by a shared pattern of clustering another set of experimental units. The proposed model is motivated by the analysis of protein activation data, where we cluster proteins such that all proteins in one cluster give rise to the same clustering of patients. That is, we define clusters of proteins by the way that patients group with respect to the corresponding protein activations. This is in contrast to (almost) all currently available models that use shared parameters in the sampling model to define clusters. This includes in particular model based clustering, Dirichlet process mixtures, product partition models, and more. We show results for two typical biostatistical inference problems that give rise to clustering. PMID:26519174

  6. Bayesian Spatial Modelling with R-INLA

    Directory of Open Access Journals (Sweden)

    Finn Lindgren

    2015-02-01

    Full Text Available The principles behind the interface to continuous domain spatial models in the R-INLA software package for R are described. The integrated nested Laplace approximation (INLA) approach proposed by Rue, Martino, and Chopin (2009) is a computationally effective alternative to MCMC for Bayesian inference. INLA is designed for latent Gaussian models, a very wide and flexible class of models ranging from (generalized) linear mixed to spatial and spatio-temporal models. Combined with the stochastic partial differential equation (SPDE) approach (Lindgren, Rue, and Lindström 2011), one can accommodate all kinds of geographically referenced data, including areal and geostatistical ones, as well as spatial point process data. The implementation interface covers stationary spatial models, non-stationary spatial models, and also spatio-temporal models, and is applicable in epidemiology, ecology, environmental risk assessment, as well as general geostatistics.

  7. Bayesian kinematic earthquake source models

    Science.gov (United States)

    Minson, S. E.; Simons, M.; Beck, J. L.; Genrich, J. F.; Galetzka, J. E.; Chowdhury, F.; Owen, S. E.; Webb, F.; Comte, D.; Glass, B.; Leiva, C.; Ortega, F. H.

    2009-12-01

    Most coseismic, postseismic, and interseismic slip models are based on highly regularized optimizations which yield one solution which satisfies the data given a particular set of regularizing constraints. This regularization hampers our ability to answer basic questions such as whether seismic and aseismic slip overlap or instead rupture separate portions of the fault zone. We present a Bayesian methodology for generating kinematic earthquake source models with a focus on large subduction zone earthquakes. Unlike classical optimization approaches, Bayesian techniques sample the ensemble of all acceptable models presented as an a posteriori probability density function (PDF), and thus we can explore the entire solution space to determine, for example, which model parameters are well determined and which are not, or what is the likelihood that two slip distributions overlap in space. Bayesian sampling also has the advantage that all a priori knowledge of the source process can be used to mold the a posteriori ensemble of models. Although very powerful, Bayesian methods have up to now been of limited use in geophysical modeling because they are only computationally feasible for problems with a small number of free parameters due to what is called the "curse of dimensionality." However, our methodology can successfully sample solution spaces of many hundreds of parameters, which is sufficient to produce finite fault kinematic earthquake models. Our algorithm is a modification of the tempered Markov chain Monte Carlo (tempered MCMC or TMCMC) method. In our algorithm, we sample a "tempered" a posteriori PDF using many MCMC simulations running in parallel and evolutionary computation in which models which fit the data poorly are preferentially eliminated in favor of models which better predict the data. We present results for both synthetic test problems as well as for the 2007 Mw 7.8 Tocopilla, Chile earthquake, the latter of which is constrained by InSAR, local high
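
    A stripped-down sketch of the tempering-plus-resampling idea: a toy 2-D bimodal likelihood, fixed temperature increments, and one Metropolis move per stage, whereas the real TMCMC algorithm chooses increments adaptively and runs on hundreds of fault parameters.

    import numpy as np

    rng = np.random.default_rng(4)

    def log_like(theta):
        # toy bimodal likelihood standing in for the fault-slip posterior surface
        return -0.5 * np.minimum(((theta - 2.0) ** 2).sum(axis=-1),
                                 ((theta + 2.0) ** 2).sum(axis=-1))

    ndim, n = 2, 2000
    samples = rng.uniform(-10, 10, (n, ndim))     # start from the flat prior
    beta = 0.0                                    # inverse "temperature"
    while beta < 1.0:
        dbeta = min(0.1, 1.0 - beta)              # fixed increment; real TMCMC adapts this
        ll = log_like(samples)
        w = np.exp(dbeta * (ll - ll.max()))       # incremental importance weights
        w /= w.sum()
        samples = samples[rng.choice(n, n, p=w)]  # resample: poor fits are eliminated
        beta += dbeta
        # one Metropolis move per particle at the new temperature
        prop = samples + 0.5 * rng.standard_normal((n, ndim))
        inside = np.all(np.abs(prop) <= 10, axis=1)   # stay inside the flat prior box
        accept = inside & (np.log(rng.random(n)) < beta * (log_like(prop) - log_like(samples)))
        samples[accept] = prop[accept]

    print(samples.mean(axis=0), samples.std(axis=0))  # particles cover both posterior modes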

  8. A Bayesian Nonparametric IRT Model

    OpenAIRE

    Karabatsos, George

    2015-01-01

    This paper introduces a flexible Bayesian nonparametric Item Response Theory (IRT) model, which applies to dichotomous or polytomous item responses, and which can apply to either unidimensional or multidimensional scaling. This is an infinite-mixture IRT model, with person ability and item difficulty parameters, and with a random intercept parameter that is assigned a mixing distribution, with mixing weights a probit function of other person and item parameters. As a result of its flexibility...

  9. Bayesian Stable Isotope Mixing Models

    OpenAIRE

    Parnell, Andrew C.; Phillips, Donald L.; Bearhop, Stuart; Semmens, Brice X.; Ward, Eric J.; Moore, Jonathan W.; Andrew L Jackson; Inger, Richard

    2012-01-01

    In this paper we review recent advances in Stable Isotope Mixing Models (SIMMs) and place them into an over-arching Bayesian statistical framework which allows for several useful extensions. SIMMs are used to quantify the proportional contributions of various sources to a mixture. The most widely used application is quantifying the diet of organisms based on the food sources they have been observed to consume. At the centre of the multivariate statistical model we propose is a compositional m...
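
    The core of a mixing model fits in a few lines. A sketch for the simplest case of two sources and one tracer, with made-up source signatures and a flat prior on the diet proportion, evaluated on a grid rather than by MCMC:

    import numpy as np

    # Minimal two-source stable-isotope mixing sketch (not the full SIMM machinery):
    # the consumer's signature is a proportional mix of two source signatures.
    d_source = np.array([-25.0, -12.0])   # e.g. delta13C of two food sources (assumed)
    sd_mix = 1.0                          # residual standard deviation (assumed)
    d_consumer = np.array([-18.5, -17.2, -19.1, -18.0])   # observed mixture values

    p_grid = np.linspace(0, 1, 1001)      # proportion of source 1; flat prior
    mu = p_grid[:, None] * d_source[0] + (1 - p_grid[:, None]) * d_source[1]
    loglik = -0.5 * np.sum((d_consumer - mu) ** 2, axis=1) / sd_mix**2
    post = np.exp(loglik - loglik.max())
    post /= post.sum()

    print(f"posterior mean proportion of source 1: {np.sum(p_grid * post):.2f}")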

  10. Application of classical and Bayesian inference to the estimation of weed population density model parameters

    Directory of Open Access Journals (Sweden)

    L.S. Vismara

    2007-12-01

    Full Text Available The dynamics of weed populations can be described by a system of equations relating the densities of seeds produced and of seedlings in crop areas. The model parameter values can be either directly inferred from experimentation and statistical analysis or obtained from the literature. The objective of this work was to estimate the weed population density model parameters based on an experiment conducted in the experimental area of Embrapa Milho e Sorgo, Sete Lagoas, MG, using classical and Bayesian inference procedures.

  11. Bayesian interpolation in a dynamic sinusoidal model with application to packet-loss concealment

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Christensen, Mads Græsbøll; Cemgil, Ali Taylan;

    2010-01-01

    a Bayesian inference scheme for the missing observations, hidden states and model parameters of the dynamic model. The inference scheme is based on a Markov chain Monte Carlo method known as the Gibbs sampler. We illustrate the performance of the inference scheme in an application to packet-loss concealment...

  12. Hilbertian sine as an absolute measure of Bayesian inference in ISR, homeland security, medicine, and defense

    Science.gov (United States)

    Jannson, Tomasz; Wang, Wenjian; Hodelin, Juan; Forrester, Thomas; Romanov, Volodymyr; Kostrzewski, Andrew

    2016-05-01

    In this paper, Bayesian Binary Sensing (BBS) is discussed as an effective tool for Bayesian Inference (BI) evaluation in interdisciplinary areas such as ISR (and C3I), Homeland Security, QC, medicine, defense, and many others. In particular, the Hilbertian Sine (HS) is introduced as an absolute measure of BI, avoiding the relativity of decision threshold identification that affects traditional measures of BI related to false positives and false negatives.

  13. Bayesian inference of biochemical kinetic parameters using the linear noise approximation

    Directory of Open Access Journals (Sweden)

    Finkenstädt Bärbel

    2009-10-01

    Full Text Available Abstract Background Fluorescent and luminescent gene reporters allow us to dynamically quantify changes in molecular species concentration over time on the single cell level. The mathematical modeling of their interaction through multivariate dynamical models requires the development of effective statistical methods to calibrate such models against available data. Given the prevalence of stochasticity and noise in biochemical systems, inference for stochastic models is of special interest. In this paper we present a simple and computationally efficient algorithm for the estimation of biochemical kinetic parameters from gene reporter data. Results We use the linear noise approximation to model biochemical reactions through a stochastic dynamic model which essentially approximates a diffusion model by an ordinary differential equation model with an appropriately defined noise process. An explicit formula for the likelihood function can be derived, allowing for computationally efficient parameter estimation. The proposed algorithm is embedded in a Bayesian framework and inference is performed using Markov chain Monte Carlo. Conclusion The major advantage of the method is that, in contrast to the more established diffusion approximation based methods, the computationally costly methods of data augmentation are not necessary. Our approach also allows for unobserved variables and measurement error. The application of the method to both simulated and experimental data shows that the proposed methodology provides a useful alternative to diffusion approximation based methods.
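
    The computational point is that the linear noise approximation yields Gaussian transition densities, so the likelihood is a product of normal terms. The sketch below illustrates the same closed-form structure with an Ornstein-Uhlenbeck process (a linear SDE for which such formulas are exact), not with the paper's gene-reporter model:

    import numpy as np

    def ou_loglik(x, dt, theta, mu, sigma):
        """Exact Gaussian transition log-likelihood of an OU process
        dX = theta*(mu - X) dt + sigma dW, the kind of closed form the
        linear noise approximation provides for biochemical models."""
        a = np.exp(-theta * dt)
        mean = mu + (x[:-1] - mu) * a
        var = sigma**2 * (1 - a**2) / (2 * theta)
        resid = x[1:] - mean
        return np.sum(-0.5 * np.log(2 * np.pi * var) - 0.5 * resid**2 / var)

    rng = np.random.default_rng(5)
    theta, mu, sigma, dt = 0.5, 10.0, 1.0, 0.1
    x = np.empty(2000); x[0] = mu
    a = np.exp(-theta * dt)
    sd = np.sqrt(sigma**2 * (1 - a**2) / (2 * theta))
    for t in range(1999):                     # simulate with the exact discretization
        x[t + 1] = mu + (x[t] - mu) * a + sd * rng.standard_normal()

    for th in (0.25, 0.5, 1.0):               # likelihood peaks near the true theta = 0.5
        print(th, ou_loglik(x, dt, th, mu, sigma))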

  14. Joining Forces of Bayesian and Frequentist Methodology: A Study for Inference in the Presence of Non-Identifiability

    CERN Document Server

    Raue, Andreas; Theis, Fabian Joachim; Timmer, Jens

    2012-01-01

    Increasingly complex applications involve large datasets in combination with non-linear and high dimensional mathematical models. In this context, statistical inference is a challenging issue that calls for pragmatic approaches that take advantage of both Bayesian and frequentist methods. The elegance of Bayesian methodology is founded in the propagation of information content provided by experimental data and prior assumptions to the posterior probability distribution of model predictions. However, for complex applications experimental data and prior assumptions potentially constrain the posterior probability distribution insufficiently. In these situations Bayesian Markov chain Monte Carlo sampling can be infeasible. From a frequentist point of view insufficient experimental data and prior assumptions can be interpreted as non-identifiability. The profile likelihood approach offers to detect and to resolve non-identifiability by experimental design iteratively. Therefore, it allows one to better constrain t...

  15. Cosmological parameters, shear maps and power spectra from CFHTLenS using Bayesian hierarchical inference

    CERN Document Server

    Alsing, Justin; Jaffe, Andrew H

    2016-01-01

    We apply two Bayesian hierarchical inference schemes to infer shear power spectra, shear maps and cosmological parameters from the CFHTLenS weak lensing survey - the first application of this method to data. In the first approach, we sample the joint posterior distribution of the shear maps and power spectra by Gibbs sampling, with minimal model assumptions. In the second approach, we sample the joint posterior of the shear maps and cosmological parameters, providing a new, accurate and principled approach to cosmological parameter inference from cosmic shear data. As a first demonstration on data we perform a 2-bin tomographic analysis to constrain cosmological parameters and investigate the possibility of photometric redshift bias in the CFHTLenS data. Under the baseline ΛCDM model we constrain S_8 = σ_8(Ω_m/0.3)^0.5 = 0.67 ± 0.03 (68%), consistent with previous CFHTLenS analysis but in tension with Planck. Adding neutrino m...

  16. Stochastic model updating utilizing Bayesian approach and Gaussian process model

    Science.gov (United States)

    Wan, Hua-Ping; Ren, Wei-Xin

    2016-03-01

    Stochastic model updating (SMU) has been increasingly applied in quantifying structural parameter uncertainty from response variability. SMU for parameter uncertainty quantification refers to the problem of inverse uncertainty quantification (IUQ), which is a nontrivial task. An inverse problem solved with optimization usually brings about issues of gradient computation, ill-conditioning, and non-uniqueness. Moreover, the uncertainty present in the response makes the inverse problem more complicated. In this study, a Bayesian approach is adopted in SMU for parameter uncertainty quantification. The prominent strength of the Bayesian approach for IUQ problems is that it solves the IUQ problem in a straightforward manner, which enables it to avoid the previous issues. However, when applied to engineering structures that are modeled with a high-resolution finite element model (FEM), the Bayesian approach is still computationally expensive, since the commonly used Markov chain Monte Carlo (MCMC) method for Bayesian inference requires a large number of model runs to guarantee convergence. Herein we reduce the computational cost in two aspects. On the one hand, the fast-running Gaussian process model (GPM) is utilized to approximate the time-consuming high-resolution FEM. On the other hand, an advanced MCMC method using the delayed rejection adaptive Metropolis (DRAM) algorithm, which incorporates a local adaptive strategy with a global adaptive strategy, is employed for Bayesian inference. In addition, we propose the use of powerful variance-based global sensitivity analysis (GSA) in parameter selection to exclude non-influential parameters from the calibration parameters, which yields a reduced-order model and thus further alleviates the computational burden. A simulated aluminum plate and a real-world complex cable-stayed pedestrian bridge are presented to illustrate the proposed framework and verify its feasibility.
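
    A compact sketch of the surrogate-accelerated step, with a one-dimensional toy forward model in place of the FEM, a fixed-hyperparameter squared-exponential GP mean in place of a fully trained GPM, and plain Metropolis in place of DRAM:

    import numpy as np

    rng = np.random.default_rng(6)

    def forward(theta):                     # stand-in for an expensive FEM run
        return np.sin(3 * theta) + 0.5 * theta**2

    # Train a simple GP surrogate on a handful of forward-model evaluations
    X = np.linspace(-2, 2, 15)
    y = forward(X)
    def k(a, b, ell=0.5, s=1.0):            # squared-exponential kernel, fixed hyperparameters
        return s * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)
    K_inv = np.linalg.inv(k(X, X) + 1e-6 * np.eye(len(X)))
    def surrogate(theta):                   # GP posterior mean prediction
        return (k(np.atleast_1d(theta), X) @ K_inv @ y)[0]

    # Metropolis on the surrogate instead of the expensive model
    obs, sd = forward(0.7) + 0.05, 0.1      # a mock noisy measurement
    def log_post(theta):
        if abs(theta) > 2:
            return -np.inf                  # flat prior on [-2, 2] (assumed)
        return -0.5 * (obs - surrogate(theta))**2 / sd**2

    theta, lp, chain = 0.0, None, []
    lp = log_post(theta)
    for _ in range(10000):
        prop = theta + 0.3 * rng.standard_normal()
        lpp = log_post(prop)
        if np.log(rng.random()) < lpp - lp:
            theta, lp = prop, lpp
        chain.append(theta)
    print(np.mean(chain[2000:]))            # sampled at surrogate cost; no further FEM runs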

  17. A Bayesian method for inferring transmission chains in a partially observed epidemic.

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef M.; Ray, Jaideep

    2008-10-01

    We present a Bayesian approach for estimating transmission chains and rates in the Abakaliki smallpox epidemic of 1967. The epidemic affected 30 individuals in a community of 74; only the dates of appearance of symptoms were recorded. Our model assumes stochastic transmission of the infections over a social network. Distinct binomial random graphs model intra- and inter-compound social connections, while disease transmission over each link is treated as a Poisson process. Link probabilities and rate parameters are objects of inference. Dates of infection and recovery comprise the remaining unknowns. Distributions for smallpox incubation and recovery periods are obtained from historical data. Using Markov chain Monte Carlo, we explore the joint posterior distribution of the scalar parameters and provide an expected connectivity pattern for the social graph and infection pathway.

  18. Mocapy++ - A toolkit for inference and learning in dynamic Bayesian networks

    Directory of Open Access Journals (Sweden)

    Hamelryck Thomas

    2010-03-01

    Full Text Available Abstract Background Mocapy++ is a toolkit for parameter learning and inference in dynamic Bayesian networks (DBNs). It supports a wide range of DBN architectures and probability distributions, including distributions from directional statistics (the statistics of angles, directions and orientations). Results The program package is freely available under the GNU General Public Licence (GPL) from SourceForge http://sourceforge.net/projects/mocapy. The package contains the source for building the Mocapy++ library, several usage examples and the user manual. Conclusions Mocapy++ is especially suitable for constructing probabilistic models of biomolecular structure, due to its support for directional statistics. In particular, it supports the Kent distribution on the sphere and the bivariate von Mises distribution on the torus. These distributions have proven useful to formulate probabilistic models of protein and RNA structure in atomic detail.

  19. Reconstruction of elongated bubbles fusing the information from multiple optical probes through a Bayesian inference technique

    Science.gov (United States)

    Chakraborty, Shubhankar; Roy Chaudhuri, Partha; Das, Prasanta Kr.

    2016-07-01

    In this communication, a novel optical technique has been proposed for the reconstruction of the shape of a Taylor bubble using measurements from multiple arrays of optical sensors. The deviation of an optical beam passing through the bubble depends on the contour of the bubble surface. A theoretical model of the deviation of a beam during the traverse of a Taylor bubble through it has been developed. Using this model and the time history of the deviation captured by the sensor array, the bubble shape has been reconstructed. The reconstruction has been performed using an inverse algorithm based on a Bayesian inference technique and a Markov chain Monte Carlo sampling algorithm. The reconstructed nose shape has been compared with the true shape, extracted through image processing of high speed images. Finally, an error analysis has been performed to pinpoint the sources of the errors.

  20. Bayesian Parameter Inference by Markov Chain Monte Carlo with Hybrid Fitness Measures: Theory and Test in Apoptosis Signal Transduction Network

    OpenAIRE

    Murakami, Yohei; Takada, Shoji

    2013-01-01

    When exact values of model parameters in systems biology are not available from experiments, they need to be inferred so that the resulting simulation reproduces the experimentally known phenomena. For this purpose, Bayesian statistics with Markov chain Monte Carlo (MCMC) is a useful method. Biological experiments are often performed with cell populations, and the results are represented by histograms. On another front, experiments sometimes indicate the existence of a specific bifurcation patt...

  1. What is the `relevant population' in Bayesian forensic inference?

    OpenAIRE

    Brümmer, Niko; de Villiers, Edward

    2014-01-01

    In works discussing the Bayesian paradigm for presenting forensic evidence in court, the concept of a `relevant population' is often mentioned, without a clear definition of what is meant, and without recommendations of how to select such populations. This note is to try to better understand this concept. Our analysis is intended to be general enough to be applicable to different forensic technologies and we shall consider both DNA profiling and speaker recognition as examples.

  2. Improving PWR core simulations by Monte Carlo uncertainty analysis and Bayesian inference

    CERN Document Server

    Castro, Emilio; Buss, Oliver; Garcia-Herranz, Nuria; Hoefer, Axel; Porsch, Dieter

    2016-01-01

    A Monte Carlo-based Bayesian inference model is applied to the prediction of reactor operation parameters of a PWR nuclear power plant. In this non-perturbative framework, high-dimensional covariance information describing the uncertainty of microscopic nuclear data is combined with measured reactor operation data in order to provide statistically sound, well founded uncertainty estimates of integral parameters, such as the boron letdown curve and the burnup-dependent reactor power distribution. The performance of this methodology is assessed in a blind test approach, where we use measurements of a given reactor cycle to improve the prediction of the subsequent cycle. As it turns out, the resulting improvement of the prediction quality is impressive. In particular, the prediction uncertainty of the boron letdown curve, which is of utmost importance for the planning of the reactor cycle length, can be reduced by one order of magnitude by including the boron concentration measurement information of the previous...

  3. Study on an inference model for pipeline corrosion leak fires based on Bayesian networks

    Institute of Scientific and Technical Information of China (English)

    左哲

    2015-01-01

    In order to research the evolutionary laws of unconfined vapor cloud explosion (UVCE) accidents induced by combustible gas leaks in long-distance oil and gas pipelines, Bayesian networks for buried pipeline corrosion leak fires were built by analyzing event nodes for inner and outer wall corrosion failure, combustible gas leak, gas cloud diffusion and UVCE. The state ranges and discretization methods of the node variables were studied. Prior probabilities and conditional probability distributions of the node variables were set by analyzing accident statistics and expert judgements. A Bayesian network inference strategy was developed, the sensitivities of each network node variable on the inference results were analyzed by researching the evolution mechanism of corrosion leak fires, and the rationality of the model was verified. The results show that there is greater uncertainty in the process of pipeline corrosion leaks and secondary disasters, mainly reflected in the diverse states of intermediate events and in the dependence of accident evolution path probabilities on the model input conditions. The Bayesian network method has clear advantages for describing the dependencies between intermediate node events in the accident process, and can quantitatively measure the uncertainty of accident risk.

  4. Constraining East Antarctic mass trends using a Bayesian inference approach

    Science.gov (United States)

    Martin-Español, Alba; Bamber, Jonathan L.

    2016-04-01

    East Antarctica is an order of magnitude larger than its western neighbour and the Greenland ice sheet. It has the greatest potential to contribute to sea level rise of any source, including non-glacial contributors. It is, however, the most challenging ice mass to constrain because of a range of factors including the relative paucity of in-situ observations and the poor signal to noise ratio of Earth Observation data such as satellite altimetry and gravimetry. A recent study using satellite radar and laser altimetry (Zwally et al. 2015) concluded that the East Antarctic Ice Sheet (EAIS) had been accumulating mass at a rate of 136±28 Gt/yr for the period 2003-08. Here, we use a Bayesian hierarchical model, which has been tested on, and applied to, the whole of Antarctica, to investigate the impact of different assumptions regarding the origin of elevation changes of the EAIS. We combined GRACE, satellite laser and radar altimeter data and GPS measurements to solve simultaneously for surface processes (primarily surface mass balance, SMB), ice dynamics and glacio-isostatic adjustment over the period 2003-13. The hierarchical model partitions mass trends between SMB and ice dynamics based on physical principles and measures of statistical likelihood. Without imposing the division between these processes, the model apportions about a third of the mass trend to ice dynamics, +18 Gt/yr, and two thirds, +39 Gt/yr, to SMB. The total mass trend for that period for the EAIS was 57±20 Gt/yr. Over the period 2003-08, we obtain an ice dynamic trend of 12 Gt/yr and a SMB trend of 15 Gt/yr, with a total mass trend of 27 Gt/yr. We then imposed the condition that the surface mass balance is tightly constrained by the regional climate model RACMO2.3 and allowed height changes due to ice dynamics to occur in areas of low surface velocities (<10 m/yr), such as those in the interior of East Antarctica (a similar condition as used in Zwally 2015). The model must find a solution that

  5. Inferring population history with DIYABC: a user-friendly approach to Approximate Bayesian Computation

    OpenAIRE

    Cornuet, Jean-Marie; Santos, Filipe; Beaumont, Mark A; Robert, Christian P.; Marin, Jean-Michel; Balding, David J.; Guillemaud, Thomas; Estoup, Arnaud

    2008-01-01

    Summary: Genetic data obtained on population samples convey information about their evolutionary history. Inference methods can extract part of this information but they require sophisticated statistical techniques that have been made available to the biologist community (through computer programs) only for simple and standard situations typically involving a small number of samples. We propose here a computer program (DIY ABC) for inference based on approximate Bayesian computation (ABC), in...

  6. Analysis of Gumbel Model for Software Reliability Using Bayesian Paradigm

    Directory of Open Access Journals (Sweden)

    Raj Kumar

    2012-12-01

    Full Text Available In this paper, we have illustrated the suitability of the Gumbel model for software reliability data. The model parameters are estimated using likelihood based inferential procedures: classical as well as Bayesian. The quasi Newton-Raphson algorithm is applied to obtain the maximum likelihood estimates and associated probability intervals. The Bayesian estimates of the parameters of the Gumbel model are obtained using the Markov chain Monte Carlo (MCMC) simulation method in OpenBUGS (established software for Bayesian analysis using Markov chain Monte Carlo methods). The R functions are developed to study the statistical properties, model validation and comparison tools of the model and the output analysis of MCMC samples generated from OpenBUGS. Details of applying MCMC to parameter estimation for the Gumbel model are elaborated and a real software reliability data set is considered to illustrate the methods of inference discussed in this paper.
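
    As a plain-Python stand-in for the OpenBUGS MCMC run (with assumed vague priors, flat on the location and 1/scale on the scale, and mock failure-time data), a Metropolis sampler over the Gumbel parameters might look like:

    import numpy as np
    from scipy.stats import gumbel_r

    rng = np.random.default_rng(7)
    t = gumbel_r.rvs(loc=30.0, scale=8.0, size=100, random_state=rng)  # mock failure times

    def log_post(loc, scale):
        if scale <= 0:
            return -np.inf
        return gumbel_r.logpdf(t, loc=loc, scale=scale).sum() - np.log(scale)

    theta = np.array([20.0, 5.0])
    lp, chain = log_post(*theta), []
    for _ in range(20000):
        prop = theta + rng.standard_normal(2) * [1.0, 0.5]
        lpp = log_post(*prop)
        if np.log(rng.random()) < lpp - lp:
            theta, lp = prop, lpp
        chain.append(theta.copy())
    print(np.mean(chain[5000:], axis=0))  # posterior means of (location, scale)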

  7. Bayesian variable order Markov models: Towards Bayesian predictive state representations

    NARCIS (Netherlands)

    C. Dimitrakakis

    2009-01-01

    We present a Bayesian variable order Markov model that shares many similarities with predictive state representations. The resulting models are compact and much easier to specify and learn than classical predictive state representations. Moreover, we show that they significantly outperform a more st

  8. Accelerating inference for diffusions observed with measurement error and large sample sizes using approximate Bayesian computation

    DEFF Research Database (Denmark)

    Picchini, Umberto; Forman, Julie Lyng

    2016-01-01

    applications. A simulation study is conducted to compare our strategy with exact Bayesian inference, the latter resulting in computation two orders of magnitude slower than ABC-MCMC for the considered set-up. Finally, the ABC algorithm is applied to a large protein data set. The suggested methodology is fairly general and...
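
    The ABC-MCMC acceptance rule reduces to a distance test under a flat prior and symmetric proposal. A toy sketch with a Gaussian model and (mean, sd) summary statistics, not the paper's diffusion or protein application:

    import numpy as np

    rng = np.random.default_rng(8)

    data = rng.normal(3.0, 1.0, 500)                  # "observed" data
    s_obs = np.array([data.mean(), data.std()])       # observed summary statistics

    def simulate(mu):
        x = rng.normal(mu, 1.0, 500)                  # forward simulation of the model
        return np.array([x.mean(), x.std()])

    eps, chain = 0.1, []
    mu = data.mean()              # start from a plausible value so the chain can move
    for _ in range(20000):
        prop = mu + 0.5 * rng.standard_normal()       # symmetric proposal, flat prior
        if np.linalg.norm(simulate(prop) - s_obs) < eps:
            mu = prop                                 # accept only near-matching simulations
        chain.append(mu)
    print(np.mean(chain[5000:]))                      # approximate posterior mean of mu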

  9. Trans-dimensional Bayesian inference for large sequential data sets

    Science.gov (United States)

    Mandolesi, E.; Dettmer, J.; Dosso, S. E.; Holland, C. W.

    2015-12-01

    This work develops a sequential Monte Carlo method to infer seismic parameters of layered seabeds from large sequential reflection-coefficient data sets. The approach provides parameter estimates and uncertainties along survey tracks with the goal to aid in the detection of unexploded ordnance in shallow water. The sequential data are acquired by a moving platform with source and receiver array towed close to the seabed. This geometry requires consideration of spherical reflection coefficients, computed efficiently by massively parallel implementation of the Sommerfeld integral via Levin integration on a graphics processing unit. The seabed is parametrized with a trans-dimensional model to account for changes in the environment (i.e. changes in layering) along the track. The method combines advanced Markov chain Monte Carlo methods (annealing) with particle filtering (resampling). Since data from closely-spaced source transmissions (pings) often sample similar environments, the solution from one ping can be utilized to efficiently estimate the posterior for data from subsequent pings. Since reflection-coefficient data are highly informative, the likelihood function can be extremely peaked, resulting in little overlap between posteriors of adjacent pings. This is addressed by adding bridging distributions (via annealed importance sampling) between pings for more efficient transitions. The approach assumes the environment to be changing slowly enough to justify the local 1D parametrization. However, bridging allows rapid changes between pings to be addressed and we demonstrate the method to be stable in such situations. Results are in terms of trans-D parameter estimates and uncertainties along the track. The algorithm is examined for realistic simulated data along a track and applied to a dataset collected by an autonomous underwater vehicle on the Malta Plateau, Mediterranean Sea. [Work supported by the SERDP, DoD.]

  10. Matrix-free Large Scale Bayesian inference in cosmology

    CERN Document Server

    Jasche, Jens

    2014-01-01

    In this work we propose a new matrix-free implementation of the Wiener sampler which is traditionally applied to high dimensional analysis when signal covariances are unknown. Specifically, the proposed method addresses the problem of jointly inferring a high dimensional signal and its corresponding covariance matrix from a set of observations. Our method implements a Gibbs sampling adaptation of the previously presented messenger approach, permitting the complex multivariate inference problem to be cast into a sequence of univariate random processes. In this fashion, the traditional requirement of inverting high dimensional matrices is completely eliminated from the inference process, resulting in an efficient algorithm that is trivial to implement. Using cosmic large scale structure data as a showcase, we demonstrate the capabilities of our Gibbs sampling approach by performing a joint analysis of three dimensional density fields and corresponding power-spectra from Gaussian mock catalogues. These tests clear...

  11. Combinatorial Inference for Graphical Models

    OpenAIRE

    Neykov, Matey; Lu, Junwei; Liu, Han

    2016-01-01

    We propose a new family of combinatorial inference problems for graphical models. Unlike classical statistical inference where the main interest is point estimation or parameter testing, combinatorial inference aims at testing the global structure of the underlying graph. Examples include testing the graph connectivity, the presence of a cycle of certain size, or the maximum degree of the graph. To begin with, we develop a unified theory for the fundamental limits of a large family of combina...

  12. Solving #SAT and Bayesian Inference with Backtracking Search

    OpenAIRE

    Bacchus, Fahiem; Dalmao, Shannon; Pitassi, Toniann

    2014-01-01

    Inference in Bayes Nets (BAYES) is an important problem with numerous applications in probabilistic reasoning. Counting the number of satisfying assignments of a propositional formula (#SAT) is a closely related problem of fundamental theoretical importance. Both these problems, and others, are members of the class of sum-of-products (SUMPROD) problems. In this paper we show that standard backtracking search when augmented with a simple memoization scheme (caching) can solve any sum-of-produc...

  13. Using Bayesian inference for parameter estimation when the system response and experimental conditions are measured with error and some variables are considered as nuisance variables

    Science.gov (United States)

    Emery, A. F.; Valenti, E.; Bardot, D.

    2007-01-01

    Parameter estimation is generally based upon the maximum likelihood approach and often involves regularization. Typically it is desired that the results be unbiased and of minimum variance. However, it is often better to accept biased estimates that have minimum mean square error. Bayesian inference is an attractive approach that achieves this goal and incorporates regularization automatically. More importantly, it permits us to analyse experiments in which both the system response and the independent variables (time, sensor position, experimental conditions, etc) are corrupted by noise and in which the model includes nuisance variables. This paper describes the use of Bayesian inference for an apparently simple experiment which is, in fact, fundamentally difficult and is compounded by a nuisance variable. By presenting this analysis we hope that members of the inverse community will see the value of applying Bayesian inference.

  14. Performance and prediction: Bayesian modelling of fallible choice in chess

    OpenAIRE

    Haworth, Guy McCrossan; Regan, Ken; Di Fatta, Giuseppe

    2010-01-01

    Evaluating agents in decision-making applications requires assessing their skill and predicting their behaviour. Both are well developed in Poker-like situations, but less so in more complex game and model domains. This paper addresses both tasks by using Bayesian inference in a benchmark space of reference agents. The concepts are explained and demonstrated using the game of chess but the model applies generically to any domain with quantifiable options and fallible choice. Demonstration ...

  15. Bayesian data analysis

    CERN Document Server

    Gelman, Andrew; Stern, Hal S; Dunson, David B; Vehtari, Aki; Rubin, Donald B

    2013-01-01

    FUNDAMENTALS OF BAYESIAN INFERENCE: Probability and Inference; Single-Parameter Models; Introduction to Multiparameter Models; Asymptotics and Connections to Non-Bayesian Approaches; Hierarchical Models. FUNDAMENTALS OF BAYESIAN DATA ANALYSIS: Model Checking; Evaluating, Comparing, and Expanding Models; Modeling Accounting for Data Collection; Decision Analysis. ADVANCED COMPUTATION: Introduction to Bayesian Computation; Basics of Markov Chain Simulation; Computationally Efficient Markov Chain Simulation; Modal and Distributional Approximations. REGRESSION MODELS: Introduction to Regression Models; Hierarchical Linear...

  16. Merging Digital Surface Models Implementing Bayesian Approaches

    Science.gov (United States)

    Sadeq, H.; Drummond, J.; Li, Z.

    2016-06-01

    In this research different DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian approach. The implemented data have been sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades). A Bayesian approach is deemed preferable when the data obtained from the sensors are limited and it is difficult or very costly to obtain many measurements; the problem of the lack of data can then be solved by introducing a priori estimates. To infer the prior data, it is assumed that the roofs of the buildings are smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimates, GNSS RTK measurements have been collected in the field, which are used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West End of Glasgow, containing different kinds of buildings, such as flat-roofed and hipped-roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results have shown that the model was able to improve the quality of the DSMs and characteristics such as the roof surfaces, which consequently led to better representations. In addition, the developed model has been compared with the well-established maximum likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied on DSMs that were derived from satellite imagery, it can be applied to any other sourced DSMs.
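
    For Gaussian errors the Bayesian merge of two DSM height estimates reduces to a precision-weighted average. A toy sketch with assumed per-sensor standard deviations (not the values estimated in the paper):

    import numpy as np

    rng = np.random.default_rng(9)
    truth = rng.uniform(10, 40, (4, 4))                # toy "roof" heights
    dsm_a = truth + 1.5 * rng.standard_normal((4, 4))  # e.g. WorldView-1-derived, sd 1.5 m (assumed)
    dsm_b = truth + 0.8 * rng.standard_normal((4, 4))  # e.g. Pleiades-derived, sd 0.8 m (assumed)

    wa, wb = 1 / 1.5**2, 1 / 0.8**2                    # precisions act as weights
    merged = (wa * dsm_a + wb * dsm_b) / (wa + wb)     # posterior mean under a flat prior
    sd_merged = np.sqrt(1 / (wa + wb))                 # posterior sd, smaller than either input

    print(np.abs(dsm_b - truth).mean(), np.abs(merged - truth).mean(), sd_merged)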

  17. A Bayesian Approach to Inferring Rates of Selfing and Locus-Specific Mutation.

    Science.gov (United States)

    Redelings, Benjamin D; Kumagai, Seiji; Tatarenkov, Andrey; Wang, Liuyang; Sakai, Ann K; Weller, Stephen G; Culley, Theresa M; Avise, John C; Uyenoyama, Marcy K

    2015-11-01

    We present a Bayesian method for characterizing the mating system of populations reproducing through a mixture of self-fertilization and random outcrossing. Our method uses patterns of genetic variation across the genome as a basis for inference about reproduction under pure hermaphroditism, gynodioecy, and a model developed to describe the self-fertilizing killifish Kryptolebias marmoratus. We extend the standard coalescent model to accommodate these mating systems, accounting explicitly for multilocus identity disequilibrium, inbreeding depression, and variation in fertility among mating types. We incorporate the Ewens sampling formula (ESF) under the infinite-alleles model of mutation to obtain a novel expression for the likelihood of mating system parameters. Our Markov chain Monte Carlo (MCMC) algorithm assigns locus-specific mutation rates, drawn from a common mutation rate distribution that is itself estimated from the data using a Dirichlet process prior model. Our sampler is designed to accommodate additional information, including observations pertaining to the sex ratio, the intensity of inbreeding depression, and other aspects of reproduction. It can provide joint posterior distributions for the population-wide proportion of uniparental individuals, locus-specific mutation rates, and the number of generations since the most recent outcrossing event for each sampled individual. Further, estimation of all basic parameters of a given model permits estimation of functions of those parameters, including the proportion of the gene pool contributed by each sex and relative effective numbers. PMID:26374460

  18. Low-rank separated representation surrogates of high-dimensional stochastic functions: Application in Bayesian inference

    International Nuclear Information System (INIS)

    This study introduces a non-intrusive approach, in the context of low-rank separated representation, to construct a surrogate of high-dimensional stochastic functions, e.g., of PDEs/ODEs, in order to decrease the computational cost of Markov chain Monte Carlo simulations in Bayesian inference. The surrogate model is constructed via a regularized alternating least-squares regression with Tikhonov regularization, using a roughening matrix that penalizes the gradient of the solution, in conjunction with a perturbation-based error indicator to detect optimal model complexities. The model approximates a vector of a continuous solution at discrete values of a physical variable. The number of random realizations required to achieve a successful approximation depends linearly on the dimensionality of the function. The computational cost of the model construction is quadratic in the number of random inputs, which potentially tackles the curse of dimensionality in high-dimensional stochastic functions. Furthermore, this vector-valued separated-representation-based model, in comparison to the available scalar-valued case, leads to a significant reduction in the cost of approximation, by a factor equal to the vector size. The performance of the method is studied through its application to three numerical examples including a 41-dimensional elliptic PDE and a 21-dimensional cavity flow
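
    The alternating least-squares building block can be illustrated with a rank-1 separated representation f(x, y) = u(x) v(y); the sketch below uses a plain ridge penalty in place of the paper's gradient-based roughening matrix, and the target function is a toy of our own:

      import numpy as np

      rng = np.random.default_rng(7)
      x = np.linspace(0, 1, 30)
      y = np.linspace(0, 1, 30)
      F = np.outer(np.sin(np.pi * x), np.exp(-y)) \
          + 0.01 * rng.standard_normal((30, 30))   # noisy samples of f

      lam = 1e-3                                   # Tikhonov parameter
      u = rng.standard_normal(30)
      v = rng.standard_normal(30)
      for _ in range(50):
          # Fix v, solve min_u ||F - u v^T||^2 + lam ||u||^2, then swap roles.
          u = F @ v / (v @ v + lam)
          v = F.T @ u / (u @ u + lam)

      err = np.linalg.norm(F - np.outer(u, v)) / np.linalg.norm(F)
      print("relative approximation error:", err)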

  19. A Bayesian approach to model uncertainty

    International Nuclear Information System (INIS)

    A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given
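
    The equivalence can be made concrete in a few lines: the model index is treated as a discrete unknown and updated by Bayes' rule, exactly like a parameter. The priors and log evidences below are made-up placeholders:

      import numpy as np

      log_prior = np.log(np.array([0.5, 0.3, 0.2]))    # P(M_k), assumed
      log_evidence = np.array([-12.4, -11.1, -15.0])   # log p(data | M_k), assumed

      log_post = log_prior + log_evidence
      log_post -= np.logaddexp.reduce(log_post)        # normalize over models
      print("posterior model probabilities:", np.exp(log_post))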

  20. Differential gene co-expression networks via Bayesian biclustering models

    OpenAIRE

    Gao, Chuan; Zhao, Shiwen; McDowell, Ian C.; Brown, Christopher D.; Barbara E Engelhardt

    2014-01-01

    Identifying latent structure in large data matrices is essential for exploring biological processes. Here, we consider recovering gene co-expression networks from gene expression data, where each network encodes relationships between genes that are locally co-regulated by shared biological mechanisms. To do this, we develop a Bayesian statistical model for biclustering to infer subsets of co-regulated genes whose covariation may be observed in only a subset of the samples. Our biclustering me...

  1. Bayesian Models of Learning and Reasoning with Relations

    OpenAIRE

    Chen, Dawn

    2014-01-01

    How do humans acquire relational concepts such as larger, which are essential for analogical inference and other forms of high-level reasoning? Are they necessarily innate, or can they be learned from non-relational inputs? Using comparative relations as a model domain, we show that structured relations can be learned from unstructured inputs of realistic complexity, applying bottom-up Bayesian learning mechanisms that make minimal assumptions about innate representations. First, we introduce...

  2. Computational methods for Bayesian model choice

    OpenAIRE

    Robert, Christian P.; Wraith, Darren

    2009-01-01

    In this note, we briefly survey some recent approaches to the approximation of the Bayes factor used in Bayesian hypothesis testing and in Bayesian model choice. In particular, we reassess importance sampling, harmonic mean sampling, and nested sampling from a unified perspective.
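
    For a toy conjugate model, two of the estimators the note reassesses can be written down directly; the sketch below (our own, using the prior as the importance proposal) contrasts an importance-sampling estimate of the marginal likelihood with the notoriously unstable harmonic mean estimate:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      y = rng.normal(0.5, 1.0, size=20)   # data; likelihood N(theta, 1), prior N(0, 1)
      n = len(y)

      def loglik(theta):
          return stats.norm.logpdf(y[:, None], theta, 1.0).sum(axis=0)

      # Exact conjugate posterior N(mu_n, s2_n), used to draw posterior samples.
      s2_n = 1.0 / (n + 1.0)
      mu_n = s2_n * y.sum()
      theta_post = rng.normal(mu_n, np.sqrt(s2_n), 5000)

      # Importance sampling with the prior as proposal: Z is the mean of p(y | theta).
      theta_prior = rng.normal(0.0, 1.0, 5000)
      logZ_is = np.logaddexp.reduce(loglik(theta_prior)) - np.log(5000)

      # Harmonic mean: 1/Z is the posterior average of 1/p(y | theta).
      ll = loglik(theta_post)
      logZ_hm = -(np.logaddexp.reduce(-ll) - np.log(len(ll)))

      print("log Z, importance sampling:", logZ_is)
      print("log Z, harmonic mean:      ", logZ_hm)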

  3. Uncertainty Reduction using Bayesian Inference and Sensitivity Analysis: A Sequential Approach to the NASA Langley Uncertainty Quantification Challenge

    Science.gov (United States)

    Sankararaman, Shankar

    2016-01-01

    This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis under the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem, which deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all at once. Instead, only one variable is first identified; then Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all four variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and the methodology is then applied to the NASA Langley Uncertainty Quantification Challenge problem.

  4. Application of Bayesian inference to the study of hierarchical organization in self-organized complex adaptive systems

    Science.gov (United States)

    Knuth, K. H.

    2001-05-01

    We consider the application of Bayesian inference to the study of self-organized structures in complex adaptive systems. In particular, we examine the distribution of elements, agents, or processes in systems dominated by hierarchical structure. We demonstrate that results obtained by Caianiello [1] on Hierarchical Modular Systems (HMS) can be found by applying Jaynes' Principle of Group Invariance [2] to a few key assumptions about our knowledge of hierarchical organization. Subsequent application of the Principle of Maximum Entropy allows inferences to be made about specific systems. The utility of the Bayesian method is considered by examining both successes and failures of the hierarchical model. We discuss how Caianiello's original statements suffer from the Mind Projection Fallacy [3] and we restate his assumptions thus widening the applicability of the HMS model. The relationship between inference and statistical physics, described by Jaynes [4], is reiterated with the expectation that this realization will aid the field of complex systems research by moving away from often inappropriate direct application of statistical mechanics to a more encompassing inferential methodology.

  5. On an adaptive preconditioned Crank-Nicolson MCMC algorithm for infinite dimensional Bayesian inferences

    OpenAIRE

    Hu, Zixi; Yao, Zhewei; Li, Jinglai

    2015-01-01

    Many scientific and engineering problems require performing Bayesian inference for unknowns of infinite dimension. In such problems, many standard Markov chain Monte Carlo (MCMC) algorithms become arbitrarily slow under mesh refinement, which is referred to as being dimension dependent. To this end, a family of dimension-independent MCMC algorithms, known as the preconditioned Crank-Nicolson (pCN) methods, was proposed to sample the infinite-dimensional parameters. In this work we devel...
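
    The core of the (non-adaptive) pCN method fits in a few lines: the proposal u' = sqrt(1 - beta^2) u + beta xi, with xi drawn from the Gaussian prior, leaves the prior invariant, so the acceptance ratio involves only the likelihood and does not degenerate under mesh refinement. The toy prior and likelihood below are assumptions for illustration:

      import numpy as np

      rng = np.random.default_rng(2)

      def pcn_step(u, log_like, sample_prior, beta=0.2):
          xi = sample_prior()
          u_prop = np.sqrt(1.0 - beta**2) * u + beta * xi
          # Prior-reversible proposal: accept on the likelihood ratio alone.
          if np.log(rng.uniform()) < log_like(u_prop) - log_like(u):
              return u_prop
          return u

      d = 200                                        # discretization dimension
      C_sqrt = np.diag(1.0 / np.arange(1, d + 1))    # assumed prior covariance factor
      sample_prior = lambda: C_sqrt @ rng.standard_normal(d)
      log_like = lambda u: -0.5 * np.sum((u[:5] - 1.0)**2) / 0.1

      u = sample_prior()
      for _ in range(1000):
          u = pcn_step(u, log_like, sample_prior)
      print("first five coordinates of a posterior sample:", np.round(u[:5], 2))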

  6. Bayesian inferences of the thermal properties of a wall using temperature and heat flux measurements

    OpenAIRE

    Iglesias, Marco; Sawlan, Zaid; Scavino, Marco; Tempone, Raul; Wood, Christopher

    2016-01-01

    We develop a hierarchical Bayesian inference method to estimate the thermal resistance and the volumetric heat capacity of a wall. These thermal properties are essential for accurate building energy simulations that are needed to make effective energy-saving policies. We apply our methodology to an experimental case study conducted in an environmental chamber, where measurements are recorded every minute from temperature probes and heat flux sensors placed on both sides of a solid brick wall ...

  7. Adaptability and phenotypic stability of common bean genotypes through Bayesian inference.

    Science.gov (United States)

    Corrêa, A M; Teodoro, P E; Gonçalves, M C; Barroso, L M A; Nascimento, M; Santos, A; Torres, F E

    2016-01-01

    This study used Bayesian inference to investigate the genotype × environment interaction in common bean grown in Mato Grosso do Sul State, and it also evaluated the efficiency of using informative and minimally informative a priori distributions. Six trials were conducted in randomized blocks, and the grain yield of 13 common bean genotypes was assessed. To represent the minimally informative a priori distributions, a probability distribution with high variance was used, and a meta-analysis concept was adopted to represent the informative a priori distributions. Bayes factors were used to conduct comparisons between the a priori distributions. The Bayesian inference was effective for the selection of upright common bean genotypes with high adaptability and phenotypic stability using the Eberhart and Russell method. Bayes factors indicated that the use of informative a priori distributions provided more accurate results than minimally informative a priori distributions. According to the Bayesian inference, the EMGOPA-201, BAMBUÍ, CNF 4999, CNF 4129 A 54, and CNFv 8025 genotypes had specific adaptability to favorable environments, while the IAPAR 14 and IAC CARIOCA ETE genotypes had specific adaptability to unfavorable environments. PMID:27173270

  8. Experiments for Evaluating Application of Bayesian Inference to Situation Awareness of Human Operators in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seong Keun; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of)

    2014-08-15

    Bayesian methodology has been widely used in various research fields. It is a method of inference that uses Bayes' rule to update the estimated probability of a hypothesis as additional evidence is acquired. According to current research, malfunctions of a nuclear power plant can be detected by using Bayesian inference, which consistently accumulates the newly incoming data and updates its estimates. However, those studies are based on the assumption that people perform this updating as perfectly as a computer, which can be criticized and may cause problems in real-world applications. Studies in cognitive psychology indicate that when the amount of information becomes large, people cannot retain all of the data, because of the limited memory capacity known as working memory, and they also have limited attention. The purpose of this paper is to consider these psychological factors and determine how much working memory and attention affect the estimates obtained from Bayesian inference. To confirm this, experiments on human subjects are needed, and the experiments are conducted using the Compact Nuclear Simulator (CNS)
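
    The contrast the experiment probes can be mimicked in a toy simulation (our own illustration, not the paper's setup): an ideal Bayesian updater that accumulates all evidence about a binary fault hypothesis versus an operator-like updater that, because of limited working memory, retains only the last few observations:

      import numpy as np

      p_e_fault, p_e_normal = 0.8, 0.3   # assumed P(evidence | state)
      prior = 0.05                        # prior P(fault)

      def posterior(evidence, prior):
          """Sequential Bayes update of P(fault) from binary evidence."""
          post = prior
          for e in evidence:
              like_f = p_e_fault if e else 1 - p_e_fault
              like_n = p_e_normal if e else 1 - p_e_normal
              post = post * like_f / (post * like_f + (1 - post) * like_n)
          return post

      rng = np.random.default_rng(3)
      evidence = rng.uniform(size=30) < p_e_fault    # stream generated under "fault"

      print("ideal Bayesian:      ", posterior(evidence, prior))
      print("memory-limited (k=7):", posterior(evidence[-7:], prior))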

  9. Experiments for Evaluating Application of Bayesian Inference to Situation Awareness of Human Operators in NPPs

    International Nuclear Information System (INIS)

    Bayesian methodology has been widely used in various research fields. It is a method of inference that uses Bayes' rule to update the estimated probability of a hypothesis as additional evidence is acquired. According to current research, malfunctions of a nuclear power plant can be detected by using Bayesian inference, which consistently accumulates the newly incoming data and updates its estimates. However, those studies are based on the assumption that people perform this updating as perfectly as a computer, which can be criticized and may cause problems in real-world applications. Studies in cognitive psychology indicate that when the amount of information becomes large, people cannot retain all of the data, because of the limited memory capacity known as working memory, and they also have limited attention. The purpose of this paper is to consider these psychological factors and determine how much working memory and attention affect the estimates obtained from Bayesian inference. To confirm this, experiments on human subjects are needed, and the experiments are conducted using the Compact Nuclear Simulator (CNS)

  10. A novel multimode process monitoring method integrating LDRSKM with Bayesian inference

    Institute of Scientific and Technical Information of China (English)

    Shi-jin REN; Yin LIANG; Xiang-jun ZHAO; Mao-yun YANG

    2015-01-01

    A local discriminant regularized soft k-means (LDRSKM) method with Bayesian inference is proposed for multimode process monitoring. LDRSKM extends the regularized soft k-means algorithm by exploiting the local and non-local geometric information of the data and generalized linear discriminant analysis to provide a better and more meaningful data partition. LDRSKM can perform clustering and subspace selection simultaneously, enhancing the separability of data residing in different clusters. With the data partition obtained, kernel support vector data description (KSVDD) is used to establish the monitoring statistics and control limits. Two Bayesian inference based global fault detection indicators are then developed using the local monitoring results associated with principal and residual subspaces. Based on clustering analysis, Bayesian inference and manifold learning methods, the within- and cross-mode correlations and local geometric information can be exploited to enhance monitoring performance for nonlinear and non-Gaussian processes. The effectiveness and efficiency of the proposed method are evaluated using the Tennessee Eastman benchmark process.

  11. Bayesian Correlated Component Analysis for inference of joint EEG activation

    DEFF Research Database (Denmark)

    Poulsen, Andreas Trier; Kamronn, Simon Due; Parra, Lucas; Hansen, Lars Kai

    2014-01-01

    We propose a probabilistic generative multi-view model to test the representational universality of human information processing. The model is tested in simulated data and in a well-established benchmark EEG dataset....

  12. Bayesian Correlated Component Analysis for inference of joint EEG activation

    DEFF Research Database (Denmark)

    Poulsen, Andreas Trier; Kamronn, Simon Due; Parra, Lucas;

    2014-01-01

    We propose a probabilistic generative multi-view model to test the representational universality of human information processing. The model is tested in simulated data and in a well-established benchmark EEG dataset.

  13. Bayesian Inference of the Evolution of a Phenotype Distribution on a Phylogenetic Tree

    Science.gov (United States)

    Ansari, M. Azim; Didelot, Xavier

    2016-01-01

    The distribution of a phenotype on a phylogenetic tree is often a quantity of interest. Many phenotypes have imperfect heritability, so that a measurement of the phenotype for an individual can be thought of as a single realization from the phenotype distribution of that individual. If all individuals in a phylogeny had the same phenotype distribution, measured phenotypes would be randomly distributed on the tree leaves. This is, however, often not the case, implying that the phenotype distribution evolves over time. Here we propose a new model based on this principle of evolving phenotype distribution on the branches of a phylogeny, which is different from ancestral state reconstruction where the phenotype itself is assumed to evolve. We develop an efficient Bayesian inference method to estimate the parameters of our model and to test the evidence for changes in the phenotype distribution. We use multiple simulated data sets to show that our algorithm has good sensitivity and specificity properties. Since our method identifies branches on the tree on which the phenotype distribution has changed, it is able to break down a tree into components for which this distribution is unique and constant. We present two applications of our method, one investigating the association between HIV genetic variation and human leukocyte antigen and the other studying host range distribution in a lineage of Salmonella enterica, and we discuss many other potential applications. PMID:27412711

  14. Novel Gauss-Hermite integration based Bayesian inference on optimal wavelet parameters for bearing fault diagnosis

    Science.gov (United States)

    Wang, Dong; Tsui, Kwok-Leung; Zhou, Qiang

    2016-05-01

    Rolling element bearings are commonly used in machines to provide support for rotating shafts. Bearing failures may cause unexpected machine breakdowns and increase economic cost. To prevent machine breakdowns and reduce unnecessary economic loss, bearing faults should be detected as early as possible. Because wavelet transform can be used to highlight impulses caused by localized bearing faults, it has been widely investigated and proven to be one of the most effective and efficient methods for bearing fault diagnosis. In this paper, a new Gauss-Hermite integration based Bayesian inference method is proposed to estimate the posterior distribution of wavelet parameters. The innovations of this paper are as follows. First, a non-linear state space model of wavelet parameters is constructed to describe the relationship between wavelet parameters and hypothetical measurements. Second, the joint posterior probability density function of wavelet parameters and hypothetical measurements is assumed to follow a joint Gaussian distribution so as to generate Gaussian perturbations for the state space model. Third, Gauss-Hermite integration is introduced to analytically predict and update the moments of the joint Gaussian distribution, from which optimal wavelet parameters are derived. Finally, optimal wavelet filtering is conducted to extract bearing fault features and thus identify localized bearing faults. Two case studies are investigated to illustrate how the proposed method works, and two comparisons with the fast kurtogram demonstrate that the proposed method can achieve better visual inspection performance.
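
    The numerical workhorse here, Gauss-Hermite integration of a nonlinear transform of a Gaussian variable, can be sketched as follows (the nonlinearity and parameters are our own toy choices, not the paper's state space model):

      import numpy as np

      def gh_moments(f, mu, sigma, order=10):
          """E[f(x)] and Var[f(x)] for x ~ N(mu, sigma^2) via Gauss-Hermite."""
          nodes, weights = np.polynomial.hermite.hermgauss(order)
          x = mu + np.sqrt(2.0) * sigma * nodes   # change of variables
          w = weights / np.sqrt(np.pi)
          mean = np.sum(w * f(x))
          var = np.sum(w * (f(x) - mean)**2)
          return mean, var

      m, v = gh_moments(np.sin, mu=0.3, sigma=0.5)
      print(f"E[sin(x)] = {m:.4f}, Var[sin(x)] = {v:.4f}")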

  15. Bayesian Variable Selection in Spatial Autoregressive Models

    OpenAIRE

    Jesus Crespo Cuaresma; Philipp Piribauer

    2015-01-01

    This paper compares the performance of Bayesian variable selection approaches for spatial autoregressive models. We present two alternative approaches which can be implemented using Gibbs sampling methods in a straightforward way and allow us to deal with the problem of model uncertainty in spatial autoregressive models in a flexible and computationally efficient way. In a simulation study we show that the variable selection approaches tend to outperform existing Bayesian model averaging tech...

  16. Back to BaySICS: a user-friendly program for Bayesian Statistical Inference from Coalescent Simulations.

    Directory of Open Access Journals (Sweden)

    Edson Sandoval-Castellanos

    Full Text Available Inference of population demographic history has vastly improved in recent years due to a number of technological and theoretical advances, including the use of ancient DNA. Approximate Bayesian computation (ABC) stands among the most promising methods due to its simple theoretical foundation and exceptional flexibility. However, the limited availability of user-friendly programs that perform ABC analysis renders it difficult to implement, and hence programming skills are frequently required. In addition, there is limited availability of programs able to deal with heterochronous data. Here we present the software BaySICS: Bayesian Statistical Inference of Coalescent Simulations. BaySICS provides an integrated and user-friendly platform that performs ABC analyses by means of coalescent simulations from DNA sequence data. It estimates historical demographic population parameters and performs hypothesis testing by means of Bayes factors obtained from model comparisons. Although providing specific features that improve inference from datasets with heterochronous data, BaySICS also has several capabilities making it a suitable tool for analysing contemporary genetic datasets. Those capabilities include joint analysis of independent tables, a graphical interface and the implementation of Markov chain Monte Carlo without likelihoods.
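
    The ABC rejection scheme underlying such programs is simple enough to sketch; here a normal toy model stands in for a coalescent simulator, with our own tolerance and summary statistic:

      import numpy as np

      rng = np.random.default_rng(4)
      obs = rng.normal(2.0, 1.0, size=50)        # "observed" data
      s_obs = obs.mean()                          # summary statistic

      def simulate(theta, n=50):
          return rng.normal(theta, 1.0, size=n)

      accepted = []
      for _ in range(20000):
          theta = rng.uniform(-5, 5)              # draw from the prior
          if abs(simulate(theta).mean() - s_obs) < 0.05:   # tolerance epsilon
              accepted.append(theta)

      print("ABC posterior mean:", np.mean(accepted),
            " accepted draws:", len(accepted))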

  17. Metainference: A Bayesian Inference Method for Heterogeneous Systems

    OpenAIRE

    Bonomi, Massimiliano; Camilloni, Carlo; Cavalli, Andrea; Vendruscolo, Michele

    2015-01-01

    Modelling a complex system is almost invariably a challenging task. The incorporation of experimental observations can be used to improve the quality of a model, and thus to obtain better predictions about the behavior of the corresponding system. This approach, however, is affected by a variety of different errors, especially when a system populates simultaneously an ensemble of different states and experimental data are measured as averages over such states. To address this problem we prese...

  18. Technical Note: How to use Winbugs to infer animal models

    DEFF Research Database (Denmark)

    Damgaard, Lars Holm

    This paper deals with Bayesian inference of animal models using Gibbs sampling. First, we suggest a general and efficient method for updating additive genetic effects, in which the computational cost is independent of the pedigree depth and increases linearly only with the size of the pedigree. Second, we show how this approach can be used to draw inferences from a wide range of animal models using the computer package Winbugs. Finally, we illustrate the approach in a simulation study, in which the data are generated and analyzed using Winbugs according to a linear model with i.i.d. errors...

  19. Bayesian Inference for Neural Electromagnetic Source Localization: Analysis of MEG Visual Evoked Activity

    International Nuclear Information System (INIS)

    We have developed a Bayesian approach to the analysis of neural electromagnetic (MEG/EEG) data that can incorporate or fuse information from other imaging modalities and addresses the ill-posed inverse problem by sampling the many different solutions which could have produced the given data. From these samples one can draw probabilistic inferences about regions of activation. Our source model assumes a variable number of variable-size cortical regions of stimulus-correlated activity. An active region consists of locations on the cortical surface, within a sphere centered on some location in cortex. The number and radii of active regions can vary up to defined maximum values. The goal of the analysis is to determine the posterior probability distribution for the set of parameters that govern the number, location, and extent of active regions. Markov Chain Monte Carlo is used to generate a large sample of sets of parameters distributed according to the posterior distribution. This sample is representative of the many different source distributions that could account for the given data, and allows identification of probable (i.e. consistent) features across solutions. Examples of the use of this analysis technique with both simulated and empirical MEG data are presented

  20. Bayesian Inference for Neural Electromagnetic Source Localization: Analysis of MEG Visual Evoked Activity

    Energy Technology Data Exchange (ETDEWEB)

    George, J.S.; Schmidt, D.M.; Wood, C.C.

    1999-02-01

    We have developed a Bayesian approach to the analysis of neural electromagnetic (MEG/EEG) data that can incorporate or fuse information from other imaging modalities and addresses the ill-posed inverse problem by sampling the many different solutions which could have produced the given data. From these samples one can draw probabilistic inferences about regions of activation. Our source model assumes a variable number of variable-size cortical regions of stimulus-correlated activity. An active region consists of locations on the cortical surface, within a sphere centered on some location in cortex. The number and radii of active regions can vary up to defined maximum values. The goal of the analysis is to determine the posterior probability distribution for the set of parameters that govern the number, location, and extent of active regions. Markov Chain Monte Carlo is used to generate a large sample of sets of parameters distributed according to the posterior distribution. This sample is representative of the many different source distributions that could account for the given data, and allows identification of probable (i.e. consistent) features across solutions. Examples of the use of this analysis technique with both simulated and empirical MEG data are presented.

  1. Bayesian semiparametric inference for multivariate doubly-interval-censored data

    NARCIS (Netherlands)

    A. Jara; E.M.E.H. Lesaffre (Emmanuel); M. de Iorio (Maria); F. Quintana (Fernando)

    2010-01-01

    Based on a data set obtained in a dental longitudinal study, conducted in Flanders (Belgium), the joint time to caries distribution of permanent first molars was modeled as a function of covariates. This involves an analysis of multivariate continuous doubly-interval-censored data since:

  2. Bayesian models a statistical primer for ecologists

    CERN Document Server

    Hobbs, N Thompson

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods-in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probabili

  3. Machine health prognostics using the Bayesian-inference-based probabilistic indication and high-order particle filtering framework

    Science.gov (United States)

    Yu, Jianbo

    2015-12-01

    Prognostics is an efficient way to achieve zero-downtime performance, maximum productivity and proactive maintenance of machines. Prognostics aims to assess and predict the time evolution of machine health degradation so that machine failures can be predicted and prevented. A novel prognostics system is developed based on a data-model-fusion scheme using a Bayesian inference-based self-organizing map (SOM) and an integration of logistic regression (LR) and high-order particle filtering (HOPF). In this prognostics system, a baseline SOM is constructed to model the data distribution space of the healthy machine, under the assumption that predictable fault patterns are not available. A Bayesian inference-based probability (BIP) derived from the baseline SOM is developed as a quantitative indication of machine health degradation. BIP provides a failure probability for the monitored machine, with an intuitive interpretation in terms of the health degradation state. Based on historic BIPs, the constructed LR and its modeling noise constitute a high-order Markov process (HOMP) that describes machine health propagation. HOPF is used to solve the HOMP estimation and predict the evolution of the machine health in the form of a probability density function (PDF). An on-line model update scheme is developed to quickly adapt the Markov process to changes in machine health dynamics. The experimental results on a bearing test-bed illustrate the potential applications of the proposed system as an effective and simple tool for machine health prognostics.
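
    A standard first-order bootstrap particle filter (a simplification of our own, not the high-order HOPF of the paper) conveys the predict-weight-resample cycle; the degradation model and noise levels below are assumptions:

      import numpy as np

      rng = np.random.default_rng(5)
      N = 2000                                    # number of particles
      x = rng.normal(0.0, 0.1, N)                 # initial health-state particles

      def step(x):                                # assumed degradation dynamics
          return x + 0.05 + rng.normal(0.0, 0.02, x.shape)

      true_x, ys = 0.0, []                        # simulate noisy health indications
      for t in range(40):
          true_x += 0.05 + rng.normal(0, 0.02)
          ys.append(true_x + rng.normal(0, 0.1))

      for y in ys:
          x = step(x)                             # predict
          w = np.exp(-0.5 * (y - x)**2 / 0.1**2)  # weight by measurement likelihood
          w /= w.sum()
          x = x[rng.choice(N, N, p=w)]            # resample

      print("filtered health estimate:", x.mean(), " true state:", true_x)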

  4. Bayesian Joint Modelling for Object Localisation in Weakly Labelled Images.

    Science.gov (United States)

    Shi, Zhiyuan; Hospedales, Timothy M; Xiang, Tao

    2015-10-01

    We address the problem of localisation of objects as bounding boxes in images and videos with weak labels. This weakly supervised object localisation problem has been tackled in the past using discriminative models where each object class is localised independently from other classes. In this paper, a novel framework based on Bayesian joint topic modelling is proposed, which differs significantly from the existing ones in that: (1) All foreground object classes are modelled jointly in a single generative model that encodes multiple object co-existence so that "explaining away" inference can resolve ambiguity and lead to better learning and localisation. (2) Image backgrounds are shared across classes to better learn varying surroundings and "push out" objects of interest. (3) Our model can be learned with a mixture of weakly labelled and unlabelled data, allowing the large volume of unlabelled images on the Internet to be exploited for learning. Moreover, the Bayesian formulation enables the exploitation of various types of prior knowledge to compensate for the limited supervision offered by weakly labelled data, as well as Bayesian domain adaptation for transfer learning. Extensive experiments on the PASCAL VOC, ImageNet and YouTube-Object videos datasets demonstrate the effectiveness of our Bayesian joint model for weakly supervised object localisation. PMID:26340253

  5. Introduction to Hierarchical Bayesian Modeling for Ecological Data

    CERN Document Server

    Parent, Eric

    2012-01-01

    Making statistical modeling and inference more accessible to ecologists and related scientists, Introduction to Hierarchical Bayesian Modeling for Ecological Data gives readers a flexible and effective framework to learn about complex ecological processes from various sources of data. It also helps readers get started on building their own statistical models. The text begins with simple models that progressively become more complex and realistic through explanatory covariates and intermediate hidden states variables. When fitting the models to data, the authors gradually present the concepts a

  6. Modelling of JET diagnostics using Bayesian Graphical Models

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, J. [IPP Greifswald, Greifswald (Germany); Ford, O. [Imperial College, London (United Kingdom); McDonald, D.; Hole, M.; Nessi, G. von; Meakins, A.; Brix, M.; Thomsen, H.; Werner, A.; Sirinelli, A.

    2011-07-01

    The mapping between physics parameters (such as densities, currents, flows, temperatures, etc.) defining the plasma 'state' under a given model and the raw observations of each plasma diagnostic will 1) depend on the particular physics model used, and 2) be inherently probabilistic, owing to uncertainties in both the observations and instrumental aspects of the mapping, such as calibrations, instrument functions, etc. A flexible and principled way of modelling such interconnected probabilistic systems is through so-called Bayesian graphical models. Being an amalgam of graph theory and probability theory, Bayesian graphical models can simulate the complex interconnections between physics models and diagnostic observations from multiple heterogeneous diagnostic systems, making it relatively easy to optimally combine the observations from multiple diagnostics for joint inference on parameters of the underlying physics model, which can itself be represented as part of the graph. At JET, about 10 diagnostic systems have to date been modelled in this way, which has led to a number of new results, including: the reconstruction of the flux surface topology and q-profiles without any specific equilibrium assumption, using information from a number of different diagnostic systems; profile inversions taking into account the uncertainties in the flux surface positions; and a substantial increase in accuracy of JET electron density and temperature profiles, including improved pedestal resolution, through the joint analysis of three diagnostic systems. It is believed that the Bayesian graph approach could potentially be utilised for very large sets of diagnostics, providing a generic data analysis framework for nuclear fusion experiments that would be able to optimally utilize the information from multiple diagnostics simultaneously, and where the explicit graph representation of the connections to underlying physics models could be used for sophisticated model testing. This

  7. Bayesian inference of protein structure from chemical shift data

    DEFF Research Database (Denmark)

    Bratholm, Lars Andersen; Christensen, Anders Steen; Hamelryck, Thomas Wim; Jensen, Jan Halborg

    2015-01-01

    Protein chemical shifts are routinely used to augment molecular mechanics force fields in protein structure simulations, with weights of the chemical shift restraints determined empirically. These weights, however, might not be an optimal descriptor of a given protein structure and predictive model, and a bias is introduced which might result in incorrect structures. The methodology is demonstrated and compared to a set of empirically weighted potentials through Markov chain Monte Carlo simulations of three small proteins (ENHD, Protein G and the SMN Tudor Domain) using the PROFASI force field and the chemical shift predictor CamShift. Using a clustering criterion for identifying the best structure, together with the addition of a solvent exposure scoring term, sampling both the structure and the uncertainties in chemical shift prediction leads to more accurate structures. The Cauchy distribution, using either sampled uncertainties or predetermined weights, did, however, result in overall better convergence to the native fold, suggesting that both types of distribution might be useful in different aspects of protein structure prediction.

  8. Bayesian inference of protein structure from chemical shift data

    Directory of Open Access Journals (Sweden)

    Lars A. Bratholm

    2015-03-01

    Full Text Available Protein chemical shifts are routinely used to augment molecular mechanics force fields in protein structure simulations, with weights of the chemical shift restraints determined empirically. These weights, however, might not be an optimal descriptor of a given protein structure and predictive model, and a bias is introduced which might result in incorrect structures. In the inferential structure determination framework, both the unknown structure and the disagreement between experimental and back-calculated data are formulated as a joint probability distribution, thus utilizing the full information content of the data. Here, we present the formulation of such a probability distribution where the error in chemical shift prediction is described by either a Gaussian or Cauchy distribution. The methodology is demonstrated and compared to a set of empirically weighted potentials through Markov chain Monte Carlo simulations of three small proteins (ENHD, Protein G and the SMN Tudor Domain) using the PROFASI force field and the chemical shift predictor CamShift. Using a clustering criterion for identifying the best structure, together with the addition of a solvent exposure scoring term, the simulations suggest that sampling both the structure and the uncertainties in chemical shift prediction leads to more accurate structures compared to conventional methods using empirically determined weights. The Cauchy distribution, using either sampled uncertainties or predetermined weights, did, however, result in overall better convergence to the native fold, suggesting that both types of distribution might be useful in different aspects of protein structure prediction.

  9. Protein NMR Structure Refinement based on Bayesian Inference

    Science.gov (United States)

    Ikeya, Teppei; Ikeda, Shiro; Kigawa, Takanori; Ito, Yutaka; Güntert, Peter

    2016-03-01

    Nuclear Magnetic Resonance (NMR) spectroscopy is a tool to investigate three-dimensional (3D) structures and dynamics of biomacromolecules at atomic resolution in solution or in more natural environments such as living cells. Since NMR data are principally only spectra with peak signals, it is necessary to properly deduce structural information from the sparse experimental data, with their imperfections and uncertainty, and to visualize 3D conformations by NMR structure calculation. In order to efficiently analyse the data, Rieping et al. proposed a new structure calculation method based on Bayes' theorem. We implemented a similar approach into the program CYANA with some modifications. It allows us to handle automatic NOE cross peak assignments in unambiguous and ambiguous usages, and to create a prior distribution based on a physical force field with the generalized Born implicit water model. The posterior is sampled by a hybrid Monte Carlo algorithm that combines Markov chain Monte Carlo (MCMC) by the Gibbs sampler with molecular dynamics (MD) simulation, yielding a canonical ensemble of conformations. Since it is not trivial to search the entire function space, particularly when exploring the conformational prior, owing to the extraordinarily large conformational space of proteins, the replica exchange method is employed, in which several MCMC calculations with different temperatures run in parallel as replicas. It is shown, with simulated data or randomly deleted experimental peaks, that the new structure calculation method can provide accurate structures even with fewer peaks, especially compared with the conventional method. In particular, it dramatically improves the in-cell structures of the proteins GB1 and TTHA1718, using exclusively information obtained in living Escherichia coli (E. coli) cells.

  10. Bayesian Analysis of Multivariate Probit Models

    OpenAIRE

    Siddhartha Chib; Edward Greenberg

    1996-01-01

    This paper provides a unified simulation-based Bayesian and non-Bayesian analysis of correlated binary data using the multivariate probit model. The posterior distribution is simulated by Markov chain Monte Carlo methods, and maximum likelihood estimates are obtained by a Markov chain Monte Carlo version of the E-M algorithm. Computation of Bayes factors from the simulation output is also considered. The methods are applied to a bivariate data set, to a 534-subject, four-year longitudinal dat...

  11. Bayesian Network Models for Adaptive Testing

    Czech Academy of Sciences Publication Activity Database

    Plajner, Martin; Vomlel, Jiří

    Aachen: Sun SITE Central Europe, 2016 - (Agosta, J.; Carvalho, R.), s. 24-33. (CEUR Workshop Proceedings. Vol 1565). ISSN 1613-0073. [The Twelfth UAI Bayesian Modeling Applications Workshop (BMAW 2015). Amsterdam (NL), 16.07.2015] R&D Projects: GA ČR GA13-20012S Institutional support: RVO:67985556 Keywords: Bayesian networks * Computerized adaptive testing Subject RIV: JD - Computer Applications, Robotics http://library.utia.cas.cz/separaty/2016/MTR/plajner-0458062.pdf

  12. Modeling error distributions of growth curve models through Bayesian methods.

    Science.gov (United States)

    Zhang, Zhiyong

    2016-06-01

    Growth curve models are widely used in social and behavioral sciences. However, typical growth curve models often assume that the errors are normally distributed, although non-normal data may be even more common than normal data. In order to avoid possible statistical inference problems from blindly assuming normality, a general Bayesian framework is proposed to flexibly model normal and non-normal data through the explicit specification of the error distributions. A simulation study shows that when the distribution of the error is correctly specified, one can avoid the loss in efficiency of standard error estimates. A real example on the analysis of mathematical ability growth data from the Early Childhood Longitudinal Study, Kindergarten Class of 1998-99, is used to show the application of the proposed methods. Instructions and code on how to conduct growth curve analysis with both normal and non-normal error distributions using the MCMC procedure of SAS are provided. PMID:26019004

  13. Uncorrelated Track Classification, Characterization, & Prioritization using Admissible Regions and Bayesian Inference

    Science.gov (United States)

    Holzinger, M.

    2014-09-01

    This paper introduces and discusses a method to rigorously classify and prioritize uncorrelated tracks (UCTs) using Bayesian inference and admissible regions. A detailed derivation and discussion of the methodology is given, followed by a generalized definition of prioritization parameters. Several example prioritization parameters, including 'time left to detect', 'zero-effort miss', and 'effective albedo-area', are motivated and given. A number of illustrative applications with optical UCTs are examined to demonstrate the information that can be extracted from each observation. Finally, the information extracted from each UCT is compared and approaches to observation prioritization are discussed.

  14. DIP -- Diagnostics for Insufficiencies of Posterior calculations in Bayesian signal inference

    CERN Document Server

    Dorn, Sebastian; Enßlin, Torsten A.

    2013-01-01

    We present an error-diagnostic validation method for posterior distributions in Bayesian signal inference. It transfers deviations from the correct posterior into characteristic deviations from a uniform distribution of a quantity constructed for this purpose. We show that this method is able to reveal and discriminate several kinds of numerical and approximation errors. For this we present a number of analytical examples of posteriors with incorrect variance, skewness, position of the maximum, or normalization. We show further how this test can be applied to multidimensional signals.

  15. Bayesian hierarchical modelling of weak lensing - the golden goal

    CERN Document Server

    Heavens, Alan; Jaffe, Andrew; Hoffmann, Till; Kiessling, Alina; Wandelt, Benjamin

    2016-01-01

    Accomplishing correct Bayesian inference from weak lensing shear data requires a complete statistical description of the data. The natural framework to do this is a Bayesian Hierarchical Model, which divides the chain of reasoning into component steps. Starting with a catalogue of shear estimates in tomographic bins, we build a model that allows us to sample simultaneously from the underlying tomographic shear fields and the relevant power spectra (E-mode, B-mode, and E-B, for auto- and cross-power spectra). The procedure deals easily with masked data and intrinsic alignments. Using Gibbs sampling and messenger fields, we show with simulated data that the large (over 67,000-dimensional) parameter space can be efficiently sampled and the full joint posterior probability density function for the parameters can feasibly be obtained. The method correctly recovers the underlying shear fields and all of the power spectra, including at levels well below the shot noise.

  16. On Bayesian Nonparametric Continuous Time Series Models

    OpenAIRE

    Karabatsos, George; Walker, Stephen G.

    2013-01-01

    This paper is a note on the use of Bayesian nonparametric mixture models for continuous time series. We identify a key requirement for such models, and then establish that there is a single type of model which meets this requirement. As it turns out, the model is well known in multiple change-point problems.

  17. The Bayesian Bootstrap

    OpenAIRE

    Rubin, Donald B.

    1981-01-01

    The Bayesian bootstrap is the Bayesian analogue of the bootstrap. Instead of simulating the sampling distribution of a statistic estimating a parameter, the Bayesian bootstrap simulates the posterior distribution of the parameter; operationally and inferentially the methods are quite similar. Because both methods of drawing inferences are based on somewhat peculiar model assumptions and the resulting inferences are generally sensitive to these assumptions, neither method should be applied wit...
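
    Operationally, the Bayesian bootstrap replaces resampling with Dirichlet(1, ..., 1) weights over the observed data; the sketch below (with toy data of our own) draws from the posterior of a mean this way:

      import numpy as np

      rng = np.random.default_rng(6)
      data = rng.exponential(2.0, size=100)        # observed sample

      draws = []
      for _ in range(5000):
          w = rng.dirichlet(np.ones(len(data)))    # posterior weights on the data
          draws.append(np.sum(w * data))           # weighted mean

      lo, hi = np.percentile(draws, [2.5, 97.5])
      print(f"posterior mean: {np.mean(draws):.3f}, 95% interval ({lo:.3f}, {hi:.3f})")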

  18. Analysis of trend changes in Northern African palaeo-climate by using Bayesian inference

    Science.gov (United States)

    Schütz, Nadine; Trauth, Martin H.; Holschneider, Matthias

    2010-05-01

    Climate variability in Northern Africa is of high interest because of the climate-evolution linkages under study. The reconstruction of the palaeo-climate over long time scales, including the expected linkages (> 3 Ma), is mainly accessible through proxy data from deep sea drilling cores. Concentrating on published data sets, we try to decipher rhythms and trends in order to detect correlations between different proxy time series by advanced mathematical methods. Our primary data are dust concentrations, an indicator of climatic changes such as humidity, from ODP sites 659, 721 and 967 situated around Northern Africa. Our interest is in challenging the available time series with advanced statistical methods to detect significant trend changes and to compare different model assumptions. For that purpose, we want to avoid rescaling the time axis to obtain equidistant time steps for filtering methods. Additionally, we demand a plausible description of the errors of the estimated parameters, in terms of confidence intervals. Finally, depending on the model we restrict ourselves to, we also want insight into the parameter structure of the assumed models. To gain this information, we focus on Bayesian inference, formulating the problem as a linear mixed model so that the expectation and deviation are of linear structure. By using the Bayesian method we can formulate the posterior density as a function of the model parameters and calculate this probability density in the parameter space. Depending on which parameters are of interest, we analytically and numerically marginalize the posterior with respect to the remaining parameters of less interest. We apply a simple linear mixed model to calculate the posterior densities of ODP sites 659 and 721 over at most the last 5 Ma. From preliminary calculations on these data sets, we can confirm results obtained by the method of break-fit regression combined with block bootstrapping ([1]). We obtain a significant change

  19. Bayesian semiparametric dynamic Nelson-Siegel model

    NARCIS (Netherlands)

    C. Cakmakli

    2011-01-01

    This paper proposes the Bayesian semiparametric dynamic Nelson-Siegel model where the density of the yield curve factors and thereby the density of the yields are estimated along with other model parameters. This is accomplished by modeling the error distributions of the factors according to a Diric

  20. Bayesian calibration of car-following models

    NARCIS (Netherlands)

    Van Hinsbergen, C.P.IJ.; Van Lint, H.W.C.; Hoogendoorn, S.P.; Van Zuylen, H.J.

    2010-01-01

    Recent research has revealed that there exist large inter-driver differences in car-following behavior such that different car-following models may apply to different drivers. This study applies Bayesian techniques to the calibration of car-following models, where prior distributions on each model p

  1. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2010-01-01

    Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundation, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology.New to the Second EditionNew chapter on Bayesian network classifiersNew section on object-oriente

  2. Model Data Fusion: developing Bayesian inversion to constrain equilibrium and mode structure

    OpenAIRE

    Hole, M. J.; von Nessi, G.; Bertram, J; J. Svensson; Appel, L. C.; Blackwell, B. D.; Dewar, R L; Howard, J

    2010-01-01

    Recently, a new probabilistic "data fusion" framework based on Bayesian principles has been developed on JET and W7-AS. The Bayesian analysis framework folds in uncertainties and inter-dependencies in the diagnostic data and signal forward-models, together with prior knowledge of the state of the plasma, to yield predictions of internal magnetic structure. A feature of the framework, known as MINERVA (J. Svensson, A. Werner, Plasma Physics and Controlled Fusion 50, 085022, 2008), is the infer...

  3. Understanding the Scalability of Bayesian Network Inference Using Clique Tree Growth Curves

    Science.gov (United States)

    Mengshoel, Ole J.

    2010-01-01

    One of the main approaches to performing computation in Bayesian networks (BNs) is clique tree clustering and propagation. The clique tree approach consists of propagation in a clique tree compiled from a Bayesian network, and while it was introduced in the 1980s, there is still a lack of understanding of how clique tree computation time depends on variations in BN size and structure. In this article, we improve this understanding by developing an approach to characterizing clique tree growth as a function of parameters that can be computed in polynomial time from BNs, specifically: (i) the ratio of the number of a BN's non-root nodes to the number of root nodes, and (ii) the expected number of moral edges in their moral graphs. Analytically, we partition the set of cliques in a clique tree into different sets, and introduce a growth curve for the total size of each set. For the special case of bipartite BNs, there are two sets and two growth curves, a mixed clique growth curve and a root clique growth curve. In experiments, where random bipartite BNs generated using the BPART algorithm are studied, we systematically increase the out-degree of the root nodes in bipartite Bayesian networks by increasing the number of leaf nodes. Surprisingly, root clique growth is well-approximated by Gompertz growth curves, an S-shaped family of curves that has previously been used to describe growth processes in biology, medicine, and neuroscience. We believe that this research improves the understanding of the scaling behavior of clique tree clustering for a certain class of Bayesian networks; presents an aid for trade-off studies of clique tree clustering using growth curves; and ultimately provides a foundation for benchmarking and developing improved BN inference and machine learning algorithms.

  4. Controlling the degree of caution in statistical inference with the Bayesian and frequentist approaches as opposite extremes

    CERN Document Server

    Bickel, David R

    2011-01-01

    In statistical practice, whether a Bayesian or frequentist approach is used in inference depends not only on the availability of prior information but also on the attitude taken toward partial prior information, with frequentists tending to be more cautious than Bayesians. The proposed framework defines that attitude in terms of a specified amount of caution, thereby enabling data analysis at the level of caution desired and on the basis of any prior information. The caution parameter represents the attitude toward partial prior information in much the same way as a loss function represents the attitude toward risk. When there is very little prior information and nonzero caution, the resulting inferences correspond to those of the candidate confidence intervals and p-values that are most similar to the credible intervals and hypothesis probabilities of the specified Bayesian posterior. On the other hand, in the presence of a known physical distribution of the parameter, inferences are based only on the corres...

  5. Macroscopic Models of Clique Tree Growth for Bayesian Networks

    Data.gov (United States)

    National Aeronautics and Space Administration — In clique tree clustering, inference consists of propagation in a clique tree compiled from a Bayesian network. In this paper, we develop an analytical approach to...

  6. Estimating Population Parameters using the Structured Serial Coalescent with Bayesian MCMC Inference when some Demes are Hidden

    Directory of Open Access Journals (Sweden)

    Allen Rodrigo

    2006-01-01

    Full Text Available Using the structured serial coalescent with Bayesian MCMC and serial samples, we estimate population size when some demes are not sampled or are hidden, i.e., ghost demes. It is found that even in the presence of a ghost deme, accurate inference is possible if the parameters are estimated with the true model. With an incorrect model, however, estimates are biased and can be positively misleading. We extend these results to the case where there are sequences from the ghost at the last time sample. This case can arise in HIV patients, when some tissue samples and viral sequences only become available after death. When some sequences from the ghost deme are available at the last sampling time, estimation bias is reduced and accurate estimation of parameters associated with the ghost deme is possible despite sampling bias. Migration rate estimates for this case are also shown to be good when migration values are low.

  7. Bayesian Semiparametric Modeling of Realized Covariance Matrices

    OpenAIRE

    Jin, Xin; John M Maheu

    2014-01-01

    This paper introduces several new Bayesian nonparametric models suitable for capturing the unknown conditional distribution of realized covariance (RCOV) matrices. Existing dynamic Wishart models are extended to countably infinite mixture models of Wishart and inverse-Wishart distributions. In addition to mixture models with constant weights we propose models with time-varying weights to capture time dependence in the unknown distribution. Each of our models can be combined with returns...

  8. Complex Bayesian models: construction, and sampling strategies

    OpenAIRE

    Huston, Carolyn Marie

    2011-01-01

    Bayesian models are useful tools for realistically modeling processes occurring in the real world. In particular, we consider models for spatio-temporal data where the response vector is compositional, i.e., has components that sum to one. A unique multivariate conditional hierarchical model (MVCAR) is proposed. Statistical methods for MVCAR models are well developed, and we extend these tools for use with a discrete compositional response. We harness the advantages of an MVCAR model when the re...

  9. Bayesian Approach to Neuro-Rough Models for Modelling HIV

    CERN Document Server

    Marwala, Tshilidzi

    2007-01-01

    This paper proposes a new neuro-rough model for modelling the risk of HIV from demographic data. The model is formulated using a Bayesian framework and trained using the Markov chain Monte Carlo method and the Metropolis criterion. When the model was tested on estimating the risk of HIV infection from demographic data, it gave an accuracy of 62%, compared with 58% obtained from a Bayesian-formulated rough set model trained using Markov chain Monte Carlo and 62% obtained from a Bayesian-formulated multi-layered perceptron (MLP) model trained using hybrid Monte Carlo. The proposed model combines the accuracy of the Bayesian MLP model with the transparency of the Bayesian rough set model.

  10. Survey of Bayesian Models for Modelling of Stochastic Temporal Processes

    Energy Technology Data Exchange (ETDEWEB)

    Ng, B

    2006-10-12

    This survey gives an overview of popular generative models used in the modeling of stochastic temporal systems. In particular, this survey is organized into two parts. The first part discusses the discrete-time representations of dynamic Bayesian networks and dynamic relational probabilistic models, while the second part discusses the continuous-time representation of continuous-time Bayesian networks.

  11. Quantum-Like Bayesian Networks for Modeling Decision Making.

    Science.gov (United States)

    Moreira, Catarina; Wichert, Andreas

    2016-01-01

    In this work, we explore an alternative quantum structure to perform quantum probabilistic inferences to accommodate the paradoxical findings of the Sure Thing Principle. We propose a Quantum-Like Bayesian Network, which replaces classical probabilities with quantum probability amplitudes. However, since this approach suffers from the problem of exponential growth of quantum parameters, we also propose a similarity heuristic that automatically fits quantum parameters through vector similarities. This makes the proposed model general and predictive, in contrast to the current state-of-the-art models, which cannot be generalized to more complex decision scenarios and only provide an explanatory account of the observed paradoxes. In the end, the proposed model amounts to a nonparametric method for estimating inference effects from a statistical point of view. It is a statistical model that is simpler than the previous quantum dynamic and quantum-like models proposed in the literature. We tested the proposed network with several empirical data from the literature, mainly from the Prisoner's Dilemma game and the Two Stage Gambling game. The results obtained show that the proposed quantum Bayesian Network is a general method that can accommodate violations of the laws of classical probability theory and make accurate predictions regarding human decision-making in these scenarios. PMID:26858669
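
    A toy calculation (with invented probabilities and phases, not the paper's fitted parameters) shows the core mechanism: replacing the classical law of total probability by a squared sum of complex amplitudes introduces an interference term that can shift the predicted choice probability, which is what the similarity heuristic must calibrate.

```python
import numpy as np

# Two-path decision (invented numbers): P(act) via the classical law of
# total probability versus the quantum-like squared sum of amplitudes.
p_belief = np.array([0.5, 0.5])      # P(opponent defects), P(cooperates)
p_act = np.array([0.7, 0.2])         # P(defect | each belief)

# Classical: P(A) = sum_i P(B_i) * P(A | B_i)
p_classical = np.sum(p_belief * p_act)

# Quantum-like: amplitudes sqrt(prob) * exp(i*theta); the squared modulus
# of the summed amplitudes adds the interference term
# 2 * sqrt(p1 * p2) * cos(theta1 - theta2), whose phases are the free
# parameters the similarity heuristic is meant to fit.
theta = np.array([0.0, 2.0])
amplitudes = np.sqrt(p_belief * p_act) * np.exp(1j * theta)
p_quantum = np.abs(np.sum(amplitudes)) ** 2

print(f"classical: {p_classical:.3f}   quantum-like: {p_quantum:.3f}")
```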

  12. ESTIMATE OF THE HYPSOMETRIC RELATIONSHIP WITH NONLINEAR MODELS FITTED BY EMPIRICAL BAYESIAN METHODS

    Directory of Open Access Journals (Sweden)

    Monica Fabiana Bento Moreira

    2015-09-01

    Full Text Available In this paper we propose a Bayesian approach to solve the inference problem with restrictions on parameters for nonlinear models used to represent the hypsometric relationship in clones of Eucalyptus sp. The Bayesian estimates are calculated using the Markov chain Monte Carlo (MCMC) method. The proposed method was applied to different groups of actual data, from which two were selected to show the results. These results were compared to those achieved by the least squares method, highlighting the superiority of the Bayesian approach, since it always generates biologically consistent results for the hypsometric relationship.

  13. On how to prevent input and structural uncertainties from corrupting the inference of hydrological parameters using a Bayesian framework

    Science.gov (United States)

    Hernández, Mario R.; Francés, Félix

    2015-04-01

    One phase of the hydrological model implementation process that contributes significantly to the uncertainty of hydrological predictions is the calibration phase, in which the values of the unknown model parameters are tuned by optimizing an objective function. An unsuitable error model (e.g. Standard Least Squares or SLS) introduces noise into the estimation of the parameters. The main sources of this noise are the input errors and the hydrological model structural deficiencies. The biased calibrated parameters thus cause the model divergence phenomenon, where the error variance of the (spatially and temporally) forecasted flows far exceeds the error variance in the fitting period, and provoke the loss of part or all of the physical meaning of the modeled processes. In other words, they yield a calibrated hydrological model which works well, but not for the right reasons. In addition, an unsuitable error model yields an unreliable predictive uncertainty assessment. Hence, with the aim of preventing all these undesirable effects, this research focuses on the Bayesian joint inference (BJI) of both the hydrological and error model parameters, considering a general additive (GA) error model that allows for correlation, non-stationarity (in variance and bias) and non-normality of model residuals. The hydrological model used is a conceptual distributed model called TETIS, with a particular split structure of the effective model parameters. Bayesian inference has been performed with the aid of a Markov Chain Monte Carlo (MCMC) algorithm called Dream-ZS. The MCMC algorithm quantifies the uncertainty of the hydrological and error model parameters by obtaining the joint posterior probability distribution, conditioned on the observed flows. The BJI methodology is a very powerful and reliable tool, but it must be used correctly: that is, if non-stationarity in error variance and bias is modeled, the Total Laws must be taken into account. The results of this research show that the

  14. Selectivity curves of the capture of mangrove crab (Ucides cordatus) on the northern coast of Brazil using bayesian inference.

    Science.gov (United States)

    Furtado-Junior, I; Abrunhosa, F A; Holanda, F C A F; Tavares, M C S

    2016-06-01

    Fishing selectivity of the mangrove crab Ucides cordatus on the north coast of Brazil can be defined as the fisherman's ability to capture and select individuals of a certain size or sex (or a combination of these factors), which suggests an empirical selectivity. Under this hypothesis, we calculated selectivity curves for male and female crabs using the logit function of the logistic model. Bayesian inference consisted of obtaining the posterior distribution by applying the Markov chain Monte Carlo (MCMC) method in the R software, using the OpenBUGS, BRugs, and R2WinBUGS libraries. The estimated mean carapace widths at selection for males and females, compared with previous studies reporting the mean carapace width at sexual maturity, allow us to confirm the hypothesis that most mature individuals do not suffer fishing pressure, thus ensuring their sustainability. PMID:26934154
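
    As a rough sketch of the computation involved, the following self-contained example fits a logistic selection curve by random-walk Metropolis sampling; it stands in for the OpenBUGS/BRugs machinery the authors used, and the capture data in it are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented capture data: carapace widths (mm) and 1/0 capture indicators.
width = np.array([45.0, 50.0, 55.0, 60.0, 65.0, 70.0, 75.0, 80.0])
caught = np.array([0, 0, 1, 0, 1, 1, 1, 1], dtype=float)

def log_posterior(a, b):
    """Bernoulli log-likelihood of the logistic selectivity curve
    s(w) = 1 / (1 + exp(-(a + b*w))), with flat priors on a and b."""
    p = 1.0 / (1.0 + np.exp(-(a + b * width)))
    p = np.clip(p, 1e-12, 1.0 - 1e-12)
    return np.sum(caught * np.log(p) + (1.0 - caught) * np.log(1.0 - p))

# Random-walk Metropolis: the simplest member of the MCMC family that
# OpenBUGS/BRugs drive in the study itself.
a, b = -5.0, 0.1
lp = log_posterior(a, b)
samples = []
for _ in range(20000):
    a_prop, b_prop = a + rng.normal(0, 0.5), b + rng.normal(0, 0.01)
    lp_prop = log_posterior(a_prop, b_prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        a, b, lp = a_prop, b_prop, lp_prop
    samples.append((a, b))

a_s, b_s = np.array(samples[5000:]).T  # discard burn-in
print("posterior mean width at 50% selection (mm):", np.mean(-a_s / b_s))
```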

  15. cosmoabc: Likelihood-free inference via Population Monte Carlo Approximate Bayesian Computation

    CERN Document Server

    Ishida, E E O; Penna-Lima, M; Cisewski, J; de Souza, R S; Trindade, A M M; Cameron, E

    2015-01-01

    Approximate Bayesian Computation (ABC) enables parameter inference for complex physical systems in cases where the true likelihood function is unknown, unavailable, or computationally too expensive. It relies on the forward simulation of mock data and comparison between observed and synthetic catalogues. Here we present cosmoabc, a Python ABC sampler featuring a Population Monte Carlo (PMC) variation of the original ABC algorithm, which uses an adaptive importance sampling scheme. The code is very flexible and can be easily coupled to an external simulator, while allowing the user to incorporate arbitrary distance and prior functions. As an example of practical application, we coupled cosmoabc with the numcosmo library and demonstrate how it can be used to estimate posterior probability distributions over cosmological parameters based on measurements of galaxy cluster number counts without computing the likelihood function. cosmoabc is published under the GPLv3 license on PyPI and GitHub and documentation is availabl...
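
    The rejection-ABC idea underneath such a sampler fits in a few lines. The sketch below uses an invented Poisson toy problem and is not cosmoabc's actual API or its PMC refinement; it only shows the simulator/distance/prior ingredients such a code is assembled from.

```python
import numpy as np

rng = np.random.default_rng(0)

# Observed "catalogue": counts from a process with an unknown rate.
observed = rng.poisson(lam=4.2, size=200)

def simulator(lam):
    """Forward-simulate a mock catalogue for a proposed parameter."""
    return rng.poisson(lam=lam, size=200)

def distance(sim, obs):
    """User-supplied distance between catalogues (here: mean difference)."""
    return abs(sim.mean() - obs.mean())

# Plain rejection ABC: draw from the prior, simulate, keep if close.
accepted = []
while len(accepted) < 1000:
    lam = rng.uniform(0.0, 10.0)        # flat prior on the rate
    if distance(simulator(lam), observed) < 0.2:
        accepted.append(lam)

print("approximate posterior mean:", np.mean(accepted))
```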

  16. Bayesian inference and assessment for rare-event bycatch in marine fisheries: a drift gillnet fishery case study.

    Science.gov (United States)

    Martin, Summer L; Stohs, Stephen M; Moore, Jeffrey E

    2015-03-01

    Fisheries bycatch is a global threat to marine megafauna. Environmental laws require bycatch assessment for protected species, but this is difficult when bycatch is rare. Low bycatch rates, combined with low observer coverage, may lead to biased, imprecise estimates when using standard ratio estimators. Bayesian model-based approaches incorporate uncertainty, produce less volatile estimates, and enable probabilistic evaluation of estimates relative to management thresholds. Here, we demonstrate a pragmatic decision-making process that uses Bayesian model-based inferences to estimate the probability of exceeding management thresholds for bycatch in fisheries. We (1) model rates of rare-event bycatch and mortality using Bayesian Markov chain Monte Carlo estimation methods and 20 years of observer data; (2) predict unobserved counts of bycatch and mortality; (3) infer expected annual mortality; (4) determine probabilities of mortality exceeding regulatory thresholds; and (5) classify the fishery as having low, medium, or high bycatch impact using those probabilities. We focused on leatherback sea turtles (Dermochelys coriacea) and humpback whales (Megaptera novaeangliae). Candidate models included a Poisson or zero-inflated Poisson likelihood, fishing effort, and a bycatch rate that varied with area, time, or regulatory regime. Regulatory regime had the strongest effect on leatherback bycatch, with the highest levels occurring prior to a regulatory change. Area had the strongest effect on humpback bycatch. Cumulative bycatch estimates for the 20-year period were 104-242 leatherbacks (52-153 deaths) and 6-50 humpbacks (0-21 deaths). The probability of exceeding a regulatory threshold under the U.S. Marine Mammal Protection Act (Potential Biological Removal, PBR) of 0.113 humpback deaths was 0.58, warranting a "medium bycatch impact" classification of the fishery. No PBR thresholds exist for leatherbacks, but the probability of exceeding an anticipated level of two deaths
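
    Step (4), converting posterior samples into a probability of exceeding a threshold, is mechanically simple once MCMC output exists; the sketch below uses invented posterior samples of a Poisson mortality rate, with only the PBR value of 0.113 taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented posterior samples of the annual bycatch-mortality rate
# (deaths per 1000 sets); a real analysis would take these from the
# MCMC output of the fitted Poisson / zero-inflated Poisson model.
rate = rng.gamma(shape=2.0, scale=0.03, size=50_000)
effort = 1.5  # projected annual effort, thousands of sets (invented)

# Posterior predictive distribution of annual deaths (Poisson counts).
deaths = rng.poisson(rate * effort)

# Probability of exceeding a regulatory threshold; the PBR of 0.113
# humpback deaths per year is the value quoted in the abstract, and
# since deaths are integers this is just P(at least one death).
pbr = 0.113
print("P(annual deaths > PBR) =", np.mean(deaths > pbr))
```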

  17. Bayesian modelling of the emission spectrum of the JET Li-BES system

    CERN Document Server

    Kwak, Sehyun; Brix, M; Ghim, Y -c; Contributors, JET

    2015-01-01

    A Bayesian model of the emission spectrum of the JET lithium beam has been developed to infer the intensity of the Li I (2p-2s) line radiation and associated uncertainties. The detected spectrum for each channel of the lithium beam emission spectroscopy (Li-BES) system is here modelled by a single Li line modified by an instrumental function, Bremsstrahlung background, instrumental offset, and interference filter curve. Both the instrumental function and the interference filter curve are modelled with non-parametric Gaussian processes. All free parameters of the model, the intensities of the Li line, Bremsstrahlung background, and instrumental offset, are inferred using Bayesian probability theory with a Gaussian likelihood for photon statistics and electronic background noise. The prior distributions of the free parameters are chosen as Gaussians. Given these assumptions, the intensity of the Li line and corresponding uncertainties are analytically available using a Bayesian linear inversion technique. The p...
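
    Because the model is linear in the unknown intensities and all distributions are Gaussian, the posterior is available in closed form. The following sketch performs that analytic Gaussian linear inversion for a toy three-component spectrum; the wavelength grid, profile width, and prior variances are invented.

```python
import numpy as np

rng = np.random.default_rng(9)

# Toy linear forward model d = A m + noise: the measured spectrum is a
# weighted sum of a Gaussian line profile, a sloped background, and a
# constant offset. Grid, line centre/width, and priors are all invented.
wl = np.linspace(668.0, 672.0, 120)                   # wavelength, nm
profile = np.exp(-0.5 * ((wl - 670.8) / 0.15) ** 2)   # instrumental shape
A = np.column_stack([profile, wl - 670.0, np.ones_like(wl)])

m_true = np.array([50.0, 2.0, 5.0])   # line, background slope, offset
sigma = 1.0                           # photon + electronic noise level
d = A @ m_true + rng.normal(0.0, sigma, wl.size)

# Gaussian prior N(m0, S0) + Gaussian likelihood => Gaussian posterior,
# so the inversion is analytic, exactly as exploited in the paper.
m0 = np.zeros(3)
S0_inv = np.linalg.inv(np.diag([1e4, 1e4, 1e4]))
S_post = np.linalg.inv(A.T @ A / sigma**2 + S0_inv)
m_post = S_post @ (A.T @ d / sigma**2 + S0_inv @ m0)

print("inferred amplitudes:", m_post)
print("1-sigma uncertainties:", np.sqrt(np.diag(S_post)))
```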

  18. Bayesian modeling and classification of neural signals

    OpenAIRE

    Lewicki, Michael S.

    1994-01-01

    Signal processing and classification algorithms often have limited applicability resulting from an inaccurate model of the signal's underlying structure. We present here an efficient, Bayesian algorithm for modeling a signal composed of the superposition of brief, Poisson-distributed functions. This methodology is applied to the specific problem of modeling and classifying extracellular neural waveforms which are composed of a superposition of an unknown number of action potentials (APs). ...

  19. Development of a Weibull posterior distribution by combining a Weibull prior with an actual failure distribution using Bayesian inference

    Science.gov (United States)

    Giuntini, Michael E.; Giuntini, Ronald E.

    1991-01-01

    A Bayesian inference process for system logistical planning is presented which provides a method for incorporating actual failures with prediction data to yield ongoing, improving reliability estimates. The process uses the Weibull distribution and provides a means for examining and updating logistical and maintenance support needs.
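
    A minimal version of such an update can be written out directly: place a prior on the Weibull scale that encodes the prediction data, multiply in the likelihood of the observed failures, and renormalize. The shape parameter, prior, and failure times below are invented for illustration.

```python
import numpy as np

# Grid posterior for the Weibull scale eta with the shape beta held fixed,
# combining a prior (encoding the reliability prediction) with observed
# failure times. All numbers are illustrative, not from the cited study.
beta = 1.5
eta = np.linspace(200.0, 3000.0, 2000)

# Lognormal prior on eta centred on the predicted characteristic life.
prior = np.exp(-0.5 * ((np.log(eta) - np.log(1000.0)) / 0.5) ** 2) / eta

# Actual failure times update the prior through the Weibull likelihood:
# log f(t | eta) = log(beta/eta) + (beta-1) log(t/eta) - (t/eta)^beta.
failures = np.array([420.0, 680.0, 950.0])[:, None]
loglik = np.sum(np.log(beta / eta) + (beta - 1) * np.log(failures / eta)
                - (failures / eta) ** beta, axis=0)

post = prior * np.exp(loglik - loglik.max())
post /= post.sum()
print("posterior mean of the Weibull scale:", np.sum(eta * post))
```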

  20. Phenotypic Bayesian phylodynamics: hierarchical graph models, antigenic clustering and latent liabilities

    OpenAIRE

    Cybis, Gabriela Bettella

    2014-01-01

    Combining models for phenotypic and molecular evolution can lead to powerful inference tools. Under the flexible framework of Bayesian phylogenetics, I develop statistical methods to address phylodynamic problems in this intersection. First, I present a hierarchical phylogeographic method that combines information across multiple datasets to draw inference on a common geographical spread process. Each dataset represents a parallel realization of this geographic process on a different group of ...

  1. Distributed Bayesian Networks for User Modeling

    DEFF Research Database (Denmark)

    Tedesco, Roberto; Dolog, Peter; Nejdl, Wolfgang;

    2006-01-01

    The World Wide Web is a popular platform for providing eLearning applications to a wide spectrum of users. However – as users differ in their preferences, background, requirements, and goals – applications should provide personalization mechanisms. In the Web context, user models used by such adaptive applications are often partial fragments of an overall user model. The fragments then have to be collected and merged into a global user profile. In this paper we investigate and present algorithms able to cope with distributed, fragmented user models – based on Bayesian Networks – in the context... The proposed mechanism efficiently combines distributed learner models without the need to exchange the internal structure of local Bayesian networks, nor local evidence, between the involved platforms.

  2. IZI: Inferring the Gas Phase Metallicity (Z) and Ionization Parameter (q) of Ionized Nebulae using Bayesian Statistics

    CERN Document Server

    Blanc, Guillermo A; Vogt, Frédéric P A; Dopita, Michael A

    2014-01-01

    We present a new method for inferring the metallicity (Z) and ionization parameter (q) of HII regions and star-forming galaxies using strong nebular emission lines (SEL). We use Bayesian inference to derive the joint and marginalized posterior probability density functions for Z and q given a set of observed line fluxes and an input photo-ionization model. Our approach allows the use of arbitrary sets of SELs and the inclusion of flux upper limits. The method provides a self-consistent way of determining the physical conditions of ionized nebulae that is not tied to the arbitrary choice of a particular SEL diagnostic and uses all the available information. Unlike theoretically calibrated SEL diagnostics the method is flexible and not tied to a particular photo-ionization model. We describe our algorithm, validate it against other methods, and present a tool that implements it called IZI. Using a sample of nearby extra-galactic HII regions we assess the performance of commonly used SEL abundance diagnostics. W...

  3. Predicting the Future as Bayesian Inference: People Combine Prior Knowledge with Observations when Estimating Duration and Extent

    Science.gov (United States)

    Griffiths, Thomas L.; Tenenbaum, Joshua B.

    2011-01-01

    Predicting the future is a basic problem that people have to solve every day and a component of planning, decision making, memory, and causal reasoning. In this article, we present 5 experiments testing a Bayesian model of predicting the duration or extent of phenomena from their current state. This Bayesian model indicates how people should…
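
    The underlying computation is compact: observe a current duration t, assume it was sampled uniformly within the phenomenon's total span, and report the posterior median of the total. The sketch below, with invented prior parameters, reproduces the qualitative pattern such experiments probe: a power-law prior yields a multiply-by-a-constant rule, while a Gaussian prior does not.

```python
import numpy as np

def predict_total(t_current, prior_pdf, grid):
    """Posterior-median prediction of a total duration/extent t_total,
    given a current value t_current, under the assumption that we are
    equally likely to encounter the phenomenon at any point in its span:
    p(t_current | t_total) = 1 / t_total for t_total >= t_current."""
    posterior = np.where(grid >= t_current, prior_pdf(grid) / grid, 0.0)
    posterior /= posterior.sum()
    cdf = np.cumsum(posterior)
    return grid[np.searchsorted(cdf, 0.5)]

grid = np.linspace(1.0, 500.0, 50_000)

# Power-law prior (e.g. movie grosses): prediction is a constant multiple
# of the observation. Gaussian prior (e.g. life spans): it is not.
# Exponents, means, and sds here are invented for illustration.
powerlaw = lambda t: t ** -1.5
gaussian = lambda t: np.exp(-0.5 * ((t - 75.0) / 16.0) ** 2)

print("power-law prior, t=30 :", predict_total(30.0, powerlaw, grid))
print("gaussian prior,  t=30 :", predict_total(30.0, gaussian, grid))
```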

  4. A population-based Bayesian approach to the minimal model of glucose and insulin homeostasis

    DEFF Research Database (Denmark)

    Andersen, Kim Emil; Højbjerre, Malene

    2005-01-01

    for a whole population. Traditionally it has been analysed in a deterministic set-up with only error terms on the measurements. In this work we adopt a Bayesian graphical model to describe the coupled minimal model that accounts for both measurement and process variability, and the model is extended to a population-based model. The estimation of the parameters is efficiently implemented in a Bayesian approach where posterior inference is made through the use of Markov chain Monte Carlo techniques. Hereby we obtain a powerful and flexible modelling framework for regularizing the ill-posed estimation problem...

  5. Comparison of Bayesian clustering and edge detection methods for inferring boundaries in landscape genetics

    Science.gov (United States)

    Safner, T.; Miller, M.P.; McRae, B.H.; Fortin, M.-J.; Manel, S.

    2011-01-01

    Recently, techniques available for identifying clusters of individuals or boundaries between clusters using genetic data from natural populations have expanded rapidly. Consequently, there is a need to evaluate these different techniques. We used spatially-explicit simulation models to compare three spatial Bayesian clustering programs and two edge detection methods. Spatially-structured populations were simulated where a continuous population was subdivided by barriers. We evaluated the ability of each method to correctly identify boundary locations while varying: (i) time after divergence, (ii) strength of isolation by distance, (iii) level of genetic diversity, and (iv) amount of gene flow across barriers. To further evaluate the methods' effectiveness to detect genetic clusters in natural populations, we used previously published data on North American pumas and a European shrub. Our results show that with simulated and empirical data, the Bayesian spatial clustering algorithms outperformed direct edge detection methods. All methods incorrectly detected boundaries in the presence of strong patterns of isolation by distance. Based on this finding, we support the application of Bayesian spatial clustering algorithms for boundary detection in empirical datasets, with necessary tests for the influence of isolation by distance. © 2011 by the authors; licensee MDPI, Basel, Switzerland.

  6. Bayesian Network Based XP Process Modelling

    Directory of Open Access Journals (Sweden)

    Mohamed Abouelela

    2010-07-01

    Full Text Available A Bayesian Network based mathematical model has been used for modelling the Extreme Programming software development process. The model is capable of predicting the expected finish time and the expected defect rate for each XP release. Therefore, it can be used to determine the success/failure of any XP Project. The model takes into account the effect of three XP practices, namely: Pair Programming, Test Driven Development and Onsite Customer practices. The model’s predictions were validated against two case studies. Results show the precision of our model especially in predicting the project finish time.

  7. A Bayesian Modelling of Wildfires in Portugal

    OpenAIRE

    Silva, Giovani L.; Soares, Paulo; Marques, Susete; Dias, Inês M.; Oliveira, Manuela M.; Borges, Guilherme J.

    2015-01-01

    In the last decade wildfires became a serious problem in Portugal due to different issues such as climatic characteristics and nature of Portuguese forest. In order to analyse wildfire data, we employ beta regression for modelling the proportion of burned forest area, under a Bayesian perspective. Our main goal is to find out fire risk factors that influence the proportion of area burned and what may make a forest type susceptible or resistant to fire. Then, we analyse wildfire...

  8. Market Segmentation Using Bayesian Model Based Clustering

    OpenAIRE

    Van Hattum, P.

    2009-01-01

    This dissertation deals with two basic problems in marketing, that are market segmentation, which is the grouping of persons who share common aspects, and market targeting, which is focusing your marketing efforts on one or more attractive market segments. For the grouping of persons who share common aspects a Bayesian model based clustering approach is proposed such that it can be applied to data sets that are specifically used for market segmentation. The cluster algorithm can handle very l...

  9. Centralized Bayesian reliability modelling with sensor networks

    Czech Academy of Sciences Publication Activity Database

    Dedecius, Kamil; Sečkárová, Vladimíra

    2013-01-01

    Vol. 19, No. 5 (2013), pp. 471-482. ISSN 1387-3954. R&D Projects: GA MŠk 7D12004. Other grants: GA MŠk(CZ) SVV-265315. Keywords: Bayesian modelling * Sensor network * Reliability. Subject RIV: BD - Theory of Information. Impact factor: 0.984, year: 2013. http://library.utia.cas.cz/separaty/2013/AS/dedecius-0392551.pdf

  10. Bayesian mixture models for Poisson astronomical images

    OpenAIRE

    Guglielmetti, Fabrizia; Fischer, Rainer; Dose, Volker

    2012-01-01

    Astronomical images in the Poisson regime are typically characterized by a spatially varying cosmic background, large variety of source morphologies and intensities, data incompleteness, steep gradients in the data, and few photon counts per pixel. The Background-Source separation technique is developed with the aim to detect faint and extended sources in astronomical images characterized by Poisson statistics. The technique employs Bayesian mixture models to reliably detect the background as...

  11. Advances in Bayesian Model Based Clustering Using Particle Learning

    Energy Technology Data Exchange (ETDEWEB)

    Merl, D M

    2009-11-19

    Recent work by Carvalho, Johannes, Lopes and Polson and by Carvalho, Lopes, Polson and Taddy introduced a sequential Monte Carlo (SMC) alternative to traditional iterative Monte Carlo strategies (e.g. MCMC and EM) for Bayesian inference in a large class of dynamic models. The basis of SMC techniques is to represent the underlying inference problem as one of state space estimation, thus giving way to inference via particle filtering. The key insight of Carvalho et al. was to construct the sequence of filtering distributions so as to make use of the posterior predictive distribution of the observable, a distribution usually only accessible in certain Bayesian settings. Access to this distribution allows a reversal of the usual propagate and resample steps characteristic of many SMC methods, thereby alleviating to a large extent many problems associated with particle degeneration. Furthermore, Carvalho et al. point out that for many conjugate models the posterior distribution of the static variables can be parametrized in terms of [recursively defined] sufficient statistics of the previously observed data. For models where such sufficient statistics exist, particle learning, as it is being called, is especially well suited to the analysis of streaming data due to the relative invariance of its algorithmic complexity with the number of data observations. Through a particle learning approach, a statistical model can be fit to data as the data arrive, allowing at any instant during the observation process direct quantification of the uncertainty surrounding the underlying model parameters. Here we describe the use of a particle learning approach for fitting a standard Bayesian semiparametric mixture model as described in Carvalho, Lopes, Polson and Taddy. In Section 2 we briefly review the previously presented particle learning algorithm for the case of a Dirichlet process mixture of multivariate normals. In Section 3 we describe several novel extensions to the original
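
    To make the resample-propagate pattern concrete, here is a heavily simplified particle-learning sketch for streaming data: a two-component Gaussian mixture with known components and an unknown weight, so each particle carries only Beta sufficient statistics. This is a toy stand-in for, not a reproduction of, the Dirichlet process mixture treated in the report.

```python
import numpy as np

rng = np.random.default_rng(7)

# Streaming data from a two-component Gaussian mixture with known means
# and unit variances but an unknown mixing weight w ~ Beta(1, 1).
z_true = rng.uniform(size=500) < 0.7
data = np.where(z_true, rng.normal(-2, 1, 500), rng.normal(3, 1, 500))

def norm_pdf(y, mu):
    return np.exp(-0.5 * (y - mu) ** 2) / np.sqrt(2.0 * np.pi)

P = 2000
a = np.ones(P)  # each particle carries the conjugate Beta(a, b)
b = np.ones(P)  # sufficient statistics for the mixing weight

for y in data:
    like1, like2 = norm_pdf(y, -2.0), norm_pdf(y, 3.0)
    # 1) RESAMPLE particles by the posterior predictive p(y | a, b) ...
    w = a / (a + b)
    pred = w * like1 + (1 - w) * like2
    idx = rng.choice(P, size=P, p=pred / pred.sum())
    a, b = a[idx], b[idx]
    # 2) ... then PROPAGATE: draw the latent allocation, update the stats.
    w = a / (a + b)
    p1 = w * like1 / (w * like1 + (1 - w) * like2)
    comp1 = rng.uniform(size=P) < p1
    a = a + comp1
    b = b + ~comp1

print("posterior mean of the mixing weight:", np.mean(a / (a + b)))
```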

  12. On Russian Roulette Estimates for Bayesian Inference with Doubly-Intractable Likelihoods

    OpenAIRE

    Lyne, Anne-Marie; Girolami, Mark; Atchadé, Yves; Strathmann, Heiko; Simpson, Daniel

    2013-01-01

    A large number of statistical models are “doubly-intractable”: the likelihood normalising term, which is a function of the model parameters, is intractable, as well as the marginal likelihood (model evidence). This means that standard inference techniques to sample from the posterior, such as Markov chain Monte Carlo (MCMC), cannot be used. Examples include, but are not confined to, massive Gaussian Markov random fields, autologistic models and Exponential random graph models. A number of app...

  13. Bayesian Inference in Hidden Markov Random Fields for Binary Data Defined on Large Lattices

    NARCIS (Netherlands)

    Friel, N.; Pettitt, A.N.; Reeves, R.; Wit, E.

    2009-01-01

    Hidden Markov random fields represent a complex hierarchical model, where the hidden latent process is an undirected graphical structure. Performing inference for such models is difficult primarily because the likelihood of the hidden states is often unavailable. The main contribution of this articl

  14. Bayesian Top-Down Protein Sequence Alignment with Inferred Position-Specific Gap Penalties.

    Science.gov (United States)

    Neuwald, Andrew F; Altschul, Stephen F

    2016-05-01

    We describe a Bayesian Markov chain Monte Carlo (MCMC) sampler for protein multiple sequence alignment (MSA) that, as implemented in the program GISMO and applied to large numbers of diverse sequences, is more accurate than the popular MSA programs MUSCLE, MAFFT, Clustal-Ω and Kalign. Features of GISMO central to its performance are: (i) It employs a "top-down" strategy with a favorable asymptotic time complexity that first identifies regions generally shared by all the input sequences, and then realigns closely related subgroups in tandem. (ii) It infers position-specific gap penalties that favor insertions or deletions (indels) within each sequence at alignment positions in which indels are invoked in other sequences. This favors the placement of insertions between conserved blocks, which can be understood as making up the proteins' structural core. (iii) It uses a Bayesian statistical measure of alignment quality based on the minimum description length principle and on Dirichlet mixture priors. Consequently, GISMO aligns sequence regions only when statistically justified. This is unlike methods based on the ad hoc, but widely used, sum-of-the-pairs scoring system, which will align random sequences. (iv) It defines a system for exploring alignment space that provides natural avenues for further experimentation through the development of new sampling strategies for more efficiently escaping from suboptimal traps. GISMO's superior performance is illustrated using 408 protein sets containing, on average, 235 sequences. These sets correspond to NCBI Conserved Domain Database alignments, which have been manually curated in the light of available crystal structures, and thus provide a means to assess alignment accuracy. GISMO fills a different niche than other MSA programs, namely identifying and aligning a conserved domain present within a large, diverse set of full length sequences. The GISMO program is available at http://gismo.igs.umaryland.edu/. PMID:27192614

  15. New PDE-based methods for image enhancement using SOM and Bayesian inference in various discretization schemes

    International Nuclear Information System (INIS)

    A novel approach is presented in this paper for improving anisotropic diffusion PDE models, based on the Perona–Malik equation. A solution is proposed from an engineering perspective to adaptively estimate the parameters of the regularizing function in this equation. The goal of such a new adaptive diffusion scheme is to better preserve edges when the anisotropic diffusion PDE models are applied to image enhancement tasks. The proposed adaptive parameter estimation in the anisotropic diffusion PDE model involves self-organizing maps and Bayesian inference to define edge probabilities accurately. The proposed modifications attempt to capture not only simple edges but also difficult textural edges and incorporate their probability in the anisotropic diffusion model. In the context of the application of PDE models to image processing such adaptive schemes are closely related to the discrete image representation problem and the investigation of more suitable discretization algorithms using constraints derived from image processing theory. The proposed adaptive anisotropic diffusion model illustrates these concepts when it is numerically approximated by various discretization schemes in a database of magnetic resonance images (MRI), where it is shown to be efficient in image filtering and restoration applications

  16. Bayesian inference of nonlinear unsteady aerodynamics from aeroelastic limit cycle oscillations

    Science.gov (United States)

    Sandhu, Rimple; Poirel, Dominique; Pettit, Chris; Khalil, Mohammad; Sarkar, Abhijit

    2016-07-01

    A Bayesian model selection and parameter estimation algorithm is applied to investigate the influence of nonlinear and unsteady aerodynamic loads on the limit cycle oscillation (LCO) of a pitching airfoil in the transitional Reynolds number regime. At small angles of attack, laminar boundary layer trailing edge separation causes negative aerodynamic damping leading to the LCO. The fluid-structure interaction of the rigid, but elastically mounted, airfoil and nonlinear unsteady aerodynamics is represented by two coupled nonlinear stochastic ordinary differential equations containing uncertain parameters and model approximation errors. Several plausible aerodynamic models with increasing complexity are proposed to describe the aeroelastic system leading to LCO. The likelihood in the posterior parameter probability density function (pdf) is available semi-analytically using the extended Kalman filter for the state estimation of the coupled nonlinear structural and unsteady aerodynamic model. The posterior parameter pdf is sampled using a parallel and adaptive Markov Chain Monte Carlo (MCMC) algorithm. The posterior probability of each model is estimated using the Chib-Jeliazkov method that directly uses the posterior MCMC samples for evidence (marginal likelihood) computation. The Bayesian algorithm is validated through a numerical study and then applied to model the nonlinear unsteady aerodynamic loads using wind-tunnel test data at various Reynolds numbers.

  17. A Bayesian approach to the inference of parametric configuration of the signal-to-noise ratio in an adaptive refinement of the measurements

    CERN Document Server

    Marquez, Maria Jose

    2012-01-01

    Calibration is nowadays one of the most important processes involved in the extraction of valuable data from measurements. The current availability of an optimum data cube measured from a heterogeneous set of instruments and surveys relies on a systematic and robust approach in the corresponding measurement analysis. In that sense, the inference of configurable instrument parameters can considerably increase the quality of the data obtained. This paper proposes a solution based on Bayesian inference for the estimation of the configurable parameters relevant to the signal-to-noise ratio. The information obtained by solving this problem can be put to very good use if it is considered as part of an adaptive loop for the overall measurement strategy, such that the outcome of this parametric inference leads to an increase in the knowledge of a model comparison problem in the context of the measurement interpretation. The context of this problem is the multi-wavelength measurements coming...

  18. Plackett-Luce regression: A new Bayesian model for polychotomous data

    OpenAIRE

    Archambeau, Cedric; Caron, Francois

    2012-01-01

    Multinomial logistic regression is one of the most popular models for modelling the effect of explanatory variables on a subject's choice between a set of specified options. This model has found numerous applications in machine learning, psychology, and economics. Bayesian inference in this model is non-trivial and requires either resorting to a Metropolis-Hastings algorithm or rejection sampling within a Gibbs sampler. In this paper, we propose an alternative model to multinomial logistic regress...
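
    For reference, the Plackett-Luce likelihood that any such sampler targets is easy to state: a ranking is generated by repeatedly choosing the next item with probability proportional to its positive worth among the items remaining. The snippet below (with invented worths) implements only this likelihood, not the authors' sampler.

```python
import numpy as np

def plackett_luce_loglik(worths, ranking):
    """Log-likelihood of one observed ranking (best first): at each stage
    the next item is chosen with probability proportional to its positive
    worth among the items still unranked."""
    w = np.asarray(worths, dtype=float)[list(ranking)]
    denom = np.cumsum(w[::-1])[::-1]          # suffix sums of the worths
    return float(np.sum(np.log(w) - np.log(denom)))

# Three options with invented worths; option 0 is the strongest.
worths = [3.0, 1.0, 0.5]
print(np.exp(plackett_luce_loglik(worths, [0, 1, 2])))  # most likely order
print(np.exp(plackett_luce_loglik(worths, [2, 1, 0])))  # least likely order
```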

  19. Bayesian auxiliary variable models for binary and multinomial regression

    OpenAIRE

    Holmes, C C; HELD, L.

    2006-01-01

    In this paper we discuss auxiliary variable approaches to Bayesian binary and multinomial regression. These approaches are ideally suited to automated Markov chain Monte Carlo simulation. In the first part we describe a simple technique using joint updating that improves the performance of the conventional probit regression algorithm. In the second part we discuss auxiliary variable methods for inference in Bayesian logistic regression, including covariate set uncertainty. Fina...
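
    To illustrate what an auxiliary-variable scheme looks like in the simplest case, the sketch below implements an Albert-and-Chib-style Gibbs sampler for probit regression, in which latent truncated normals make every conditional conjugate; the joint-updating and logistic refinements discussed in the paper are beyond this toy, and the data are simulated.

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(3)

# Simulated probit data: y = 1(x'beta + eps > 0), eps ~ N(0, 1).
n = 400
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([-0.5, 1.2])
y = (X @ beta_true + rng.normal(size=n) > 0).astype(int)

XtX_inv = np.linalg.inv(X.T @ X)
beta = np.zeros(2)
draws = []
for _ in range(2000):
    mu = X @ beta
    # z_i | beta, y_i is N(mu_i, 1) truncated to (0, inf) if y_i = 1,
    # and to (-inf, 0) otherwise; the bounds below are standardized.
    lo = np.where(y == 1, -mu, -np.inf)
    hi = np.where(y == 1, np.inf, -mu)
    z = truncnorm.rvs(lo, hi, loc=mu, scale=1.0, random_state=rng)
    # beta | z is exactly Gaussian under a flat prior.
    beta = rng.multivariate_normal(XtX_inv @ X.T @ z, XtX_inv)
    draws.append(beta)

print("posterior mean of beta:", np.array(draws[500:]).mean(axis=0))
```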

  20. Inference of nonlinear gene regulatory networks through optimized ensemble of support vector regression and dynamic Bayesian networks.

    Science.gov (United States)

    Akutekwe, Arinze; Seker, Huseyin

    2015-08-01

    Comprehensive understanding of gene regulatory networks (GRNs) is a major challenge in systems biology. Most methods for modeling and inferring the dynamics of GRNs, such as those based on state space models, vector autoregressive models and the G1DBN algorithm, assume linear dependencies among genes. However, this strong assumption does not truly represent the time-course relationships across genes, which are inherently nonlinear. Nonlinear modeling methods such as the S-systems and causal structure identification (CSI) have been proposed, but are known to be statistically inefficient and analytically intractable in high dimensions. To overcome these limitations, we propose an optimized ensemble approach based on support vector regression (SVR) and dynamic Bayesian networks (DBNs). The method, called SVR-DBN, uses nonlinear kernels of the SVR to infer the temporal relationships among genes within the DBN framework. The two-stage ensemble is further improved by SVR parameter optimization using Particle Swarm Optimization. Results on eight in silico-generated datasets and two real-world datasets, from Drosophila melanogaster and Escherichia coli, show that our method outperformed the G1DBN algorithm by a total average accuracy of 12%. We further applied our method to model the time-course relationships of ovarian carcinoma. From our results, four hub genes were discovered. Stratified analysis further showed that the expression levels of the prostate differentiation factor and BTG family member 2 genes were significantly increased by the cisplatin and oxaliplatin platinum drugs, while expression levels of the Polo-like kinase and Cyclin B1 genes were both decreased by the platinum drugs. These hub genes might be potential biomarkers for ovarian carcinoma. PMID:26738192

  1. Bayesian statistical methods and their application in probabilistic simulation models

    Directory of Open Access Journals (Sweden)

    Sergio Iannazzo

    2007-03-01

    Full Text Available Bayesian statistical methods are facing a rapidly growing level of interest and acceptance in the field of health economics. The reasons for this success are probably to be found in the theoretical foundations of the discipline, which make these techniques more appealing for decision analysis. To this should be added modern IT progress, which has produced several flexible and powerful statistical software frameworks. Among them, probably one of the most notable is the BUGS language project and its standalone application for MS Windows, WinBUGS. The scope of this paper is to introduce the subject and to show some interesting applications of WinBUGS in developing complex economic models based on Markov chains. The advantages of this approach reside in the elegance of the code produced and in its capability to easily develop probabilistic simulations. Moreover, an example of the integration of Bayesian inference models in a Markov model is shown. This last feature lets the analyst conduct statistical analyses on the available sources of evidence and exploit them directly as inputs in the economic model.
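
    As an illustration of the kind of probabilistic Markov model described (written in plain Python rather than the BUGS language), the sketch below runs a three-state cohort model whose transition probabilities carry Dirichlet uncertainty, so the cost estimate emerges as a distribution; all states, costs, and counts are invented.

```python
import numpy as np

rng = np.random.default_rng(11)

# Three-state cohort Markov model (Well, Sick, Dead). The transition
# probabilities carry Dirichlet uncertainty rather than fixed values,
# so the cost output is itself a distribution. All numbers invented.
cost_per_cycle = np.array([100.0, 1500.0, 0.0])   # cost in each state
alpha = np.array([[85.0, 10.0, 5.0],              # Dirichlet counts per row,
                  [0.0, 70.0, 30.0],              # e.g. from trial evidence
                  [0.0, 0.0, 1.0]])               # Dead is absorbing

def simulate_total_cost(n_cycles=20):
    # Draw one transition matrix from its posterior, then run the cohort.
    P = np.array([rng.dirichlet(row + 1e-9) for row in alpha])
    occupancy = np.array([1.0, 0.0, 0.0])         # everyone starts Well
    total = 0.0
    for _ in range(n_cycles):
        total += occupancy @ cost_per_cycle
        occupancy = occupancy @ P
    return total

costs = np.array([simulate_total_cost() for _ in range(2000)])
print(f"expected cost {costs.mean():.0f}, 95% interval "
      f"({np.percentile(costs, 2.5):.0f}, {np.percentile(costs, 97.5):.0f})")
```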

  2. Bayesian Degree-Corrected Stochastic Block Models for Community Detection

    CERN Document Server

    Peng, Lijun

    2013-01-01

    Community detection in networks has drawn much attention in diverse fields, especially the social sciences. Given its significance, a large body of literature has accumulated, much of which is not statistically based. In this paper, we propose a novel stochastic blockmodel based on a logistic regression setup with node correction terms to better address this problem. We follow a Bayesian approach that explicitly captures the community behavior via prior specification. We then adopt a data augmentation strategy with latent Polya-Gamma variables to obtain posterior samples. We conduct inference based on a canonically mapped centroid estimator that formally addresses label non-identifiability. We demonstrate the proposed model and estimation on real-world as well as simulated benchmark networks and show that the proposed model and estimator are more flexible, representative, and yield smaller error rates when compared to the MAP estimator from classical degree-corrected stochastic blockmodels.

  3. Bayesian inference for data assimilation using Least-Squares Finite Element methods

    International Nuclear Information System (INIS)

    It has recently been observed that Least-Squares Finite Element methods (LS-FEMs) can be used to assimilate experimental data into approximations of PDEs in a natural way, as shown by Heyes et al. in the case of incompressible Navier-Stokes flow. The approach was shown to be effective without regularization terms, and can handle substantial noise in the experimental data without filtering. Of great practical importance is that - unlike other data assimilation techniques - it is not significantly more expensive than a single physical simulation. However, the method as presented so far in the literature is not set in the context of an inverse problem framework, so that, for example, the meaning of the final result is unclear. In this paper it is shown that the method can be interpreted as finding a maximum a posteriori (MAP) estimator in a Bayesian approach to data assimilation, with normally distributed observational noise and a Bayesian prior based on an appropriate norm of the governing equations. In this setting the method may be seen to have several desirable properties: most importantly, discretization and modelling error in the simulation code do not affect the solution in the limit of complete experimental information, so these errors do not have to be modelled statistically. The Bayesian interpretation also better justifies the choice of the method, and some useful generalizations become apparent. The technique is applied to incompressible Navier-Stokes flow in a pipe with added velocity data, where its effectiveness, robustness to noise, and application to inverse problems are demonstrated.
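
    A one-dimensional toy version makes the mechanism transparent: the discretized equation residuals and the point observations are stacked into a single overdetermined least-squares system, whose minimizer is the MAP estimate under Gaussian observational noise and a Gaussian prior on the equation residuals. Every number below (grid, forcing, observation points, noise level) is invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# 1D toy analogue: solve -u'' = 1 on [0, 1], u(0) = u(1) = 0, while also
# fitting noisy point observations, by stacking the finite-difference
# residual equations and the data equations into ONE least-squares system.
n = 50
h = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
u_exact = 0.5 * x * (1.0 - x)       # known solution, used to fake data

rows, rhs = [], []
# Interior residuals: (-u[i-1] + 2u[i] - u[i+1]) / h^2 = 1.
for i in range(1, n - 1):
    r = np.zeros(n)
    r[i - 1], r[i], r[i + 1] = -1.0 / h**2, 2.0 / h**2, -1.0 / h**2
    rows.append(r)
    rhs.append(1.0)
# Boundary conditions.
for i in (0, n - 1):
    r = np.zeros(n)
    r[i] = 1.0
    rows.append(r)
    rhs.append(0.0)
# Noisy observations at a few points; in the MAP reading, the relative
# weight of these rows encodes the Gaussian observational-noise model.
for i in (10, 25, 40):
    r = np.zeros(n)
    r[i] = 1.0
    rows.append(r)
    rhs.append(u_exact[i] + rng.normal(0.0, 0.005))

u, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
print("max error vs exact solution:", np.abs(u - u_exact).max())
```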

  4. Generalized Fiducial Inference for Binary Logistic Item Response Models.

    Science.gov (United States)

    Liu, Yang; Hannig, Jan

    2016-06-01

    Generalized fiducial inference (GFI) has been proposed as an alternative to likelihood-based and Bayesian inference in mainstream statistics. Confidence intervals (CIs) can be constructed from a fiducial distribution on the parameter space in a fashion similar to those used with a Bayesian posterior distribution. However, no prior distribution needs to be specified, which renders GFI more suitable when no a priori information about model parameters is available. In the current paper, we apply GFI to a family of binary logistic item response theory models, which includes the two-parameter logistic (2PL), bifactor and exploratory item factor models as special cases. Asymptotic properties of the resulting fiducial distribution are discussed. Random draws from the fiducial distribution can be obtained by the proposed Markov chain Monte Carlo sampling algorithm. We investigate the finite-sample performance of our fiducial percentile CI and two commonly used Wald-type CIs associated with maximum likelihood (ML) estimation via Monte Carlo simulation. The use of GFI in high-dimensional exploratory item factor analysis was illustrated by the analysis of a set of the Eysenck Personality Questionnaire data. PMID:26769340

  5. BayesWave: Bayesian Inference for Gravitational Wave Bursts and Instrument Glitches

    CERN Document Server

    Cornish, Neil J

    2014-01-01

    A central challenge in Gravitational Wave Astronomy is identifying weak signals in the presence of non-stationary and non-Gaussian noise. The separation of gravitational wave signals from noise requires good models for both. When accurate signal models are available, such as for binary Neutron star systems, it is possible to make robust detection statements even when the noise is poorly understood. In contrast, searches for "un-modeled" transient signals are strongly impacted by the methods used to characterize the noise. Here we take a Bayesian approach and introduce a multi-component, variable dimension, parameterized noise model that explicitly accounts for non-stationarity and non-Gaussianity in data from interferometric gravitational wave detectors. Instrumental transients (glitches) and burst sources of gravitational waves are modeled using a Morlet-Gabor continuous wavelet basis. The number and placement of the wavelets is determined by a trans-dimensional Reversible Jump Markov Chain Monte Carlo algor...

  6. EXONEST: Bayesian model selection applied to the detection and characterization of exoplanets via photometric variations

    Energy Technology Data Exchange (ETDEWEB)

    Placek, Ben; Knuth, Kevin H. [Physics Department, University at Albany (SUNY), Albany, NY 12222 (United States); Angerhausen, Daniel, E-mail: bplacek@albany.edu, E-mail: kknuth@albany.edu, E-mail: daniel.angerhausen@gmail.com [Department of Physics, Applied Physics, and Astronomy, Rensselaer Polytechnic Institute, Troy, NY 12180 (United States)

    2014-11-10

    EXONEST is an algorithm dedicated to detecting and characterizing the photometric signatures of exoplanets, which include reflection and thermal emission, Doppler boosting, and ellipsoidal variations. Using Bayesian inference, we can test between competing models that describe the data as well as estimate model parameters. We demonstrate this approach by testing circular versus eccentric planetary orbital models, as well as testing for the presence or absence of four photometric effects. In addition to using Bayesian model selection, a unique aspect of EXONEST is the potential capability to distinguish between reflective and thermal contributions to the light curve. A case study is presented using Kepler data recorded from the transiting planet KOI-13b. By considering only the nontransiting portions of the light curve, we demonstrate that it is possible to estimate the photometrically relevant model parameters of KOI-13b. Furthermore, Bayesian model testing confirms that the orbit of KOI-13b has a detectable eccentricity.
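
    The model-selection arithmetic at the end of such an analysis is worth seeing once: given one log-evidence per competing model from the sampler, posterior model probabilities follow immediately. The log-evidence values below are invented purely to show the bookkeeping.

```python
import numpy as np

# Bayes-factor bookkeeping of the kind Bayesian model testing rests on:
# each competing model returns a log-evidence (marginal likelihood), and
# posterior model probabilities follow from those numbers alone.
log_evidence = {"circular orbit": -10452.3,     # hypothetical values
                "eccentric orbit": -10447.9}

logs = np.array(list(log_evidence.values()))
probs = np.exp(logs - logs.max())
probs /= probs.sum()                            # equal prior odds assumed

for name, p in zip(log_evidence, probs):
    print(f"P({name} | data) = {p:.4f}")
print("log Bayes factor (ecc/circ):",
      log_evidence["eccentric orbit"] - log_evidence["circular orbit"])
```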

  7. Bayesian Kinematic Finite Fault Source Models (Invited)

    Science.gov (United States)

    Minson, S. E.; Simons, M.; Beck, J. L.

    2010-12-01

    Finite fault earthquake source models are inherently under-determined: there is no unique solution to the inverse problem of determining the rupture history at depth as a function of time and space when our data are only limited observations at the Earth's surface. Traditional inverse techniques rely on model constraints and regularization to generate one model from the possibly broad space of all possible solutions. However, Bayesian methods allow us to determine the ensemble of all possible source models which are consistent with the data and our a priori assumptions about the physics of the earthquake source. Until now, Bayesian techniques have been of limited utility because they are computationally intractable for problems with as many free parameters as kinematic finite fault models. We have developed a methodology called Cascading Adaptive Tempered Metropolis In Parallel (CATMIP) which allows us to sample very high-dimensional problems in a parallel computing framework. The CATMIP algorithm combines elements of simulated annealing and genetic algorithms with the Metropolis algorithm to dynamically optimize the algorithm's efficiency as it runs. We will present synthetic performance tests of finite fault models made with this methodology as well as a kinematic source model for the 2007 Mw 7.7 Tocopilla, Chile earthquake. This earthquake was well recorded by multiple ascending and descending interferograms and a network of high-rate GPS stations whose records can be used as near-field seismograms.

  8. Bayesian Estimation of a Mixture Model

    OpenAIRE

    Ilhem Merah; Assia Chadli

    2015-01-01

    We present the properties of a bathtub-curve reliability model, introduced by Idée and Pierrat (2010), that has both sufficient adaptability and a minimal number of parameters. It is a mixture of a Gamma distribution G(2, (1/θ)) and a new distribution L(θ). We are interested in Bayesian estimation of the parameters and survival function of this model with a squared-error loss function and non-informative prior, using the approximations of Lindley (1980) and Tierney and Kadane (1986). Usin...

  9. Bayesian mixture models for partially verified data

    DEFF Research Database (Denmark)

    Kostoulas, Polychronis; Browne, William J.; Nielsen, Søren Saxmose;

    2013-01-01

    Bayesian mixture models can be used to discriminate between the distributions of continuous test responses for different infection stages. These models are particularly useful in case of chronic infections with a long latent period, like Mycobacterium avium subsp. paratuberculosis (MAP) infection, where a perfect reference test does not exist. However, their discriminatory ability diminishes with increasing overlap of the distributions and with increasing number of latent infection stages to be discriminated. We provide a method that uses partially verified data, with known infection status for...

  10. Dynamical Bayesian Inference of Time-evolving Interactions: From a Pair of Coupled Oscillators to Networks of Oscillators

    CERN Document Server

    Duggento, Andrea; McClintock, Peter V E; Stefanovska, Aneta

    2012-01-01

    Living systems have time-evolving interactions that, until recently, could not be identified accurately from recorded time series in the presence of noise. Stankovski et al. (Phys. Rev. Lett. 109 024101, 2012) introduced a method based on dynamical Bayesian inference that facilitates the simultaneous detection of time-varying synchronization, directionality of influence, and coupling functions. It can distinguish unsynchronized dynamics from noise-induced phase slips. The method is based on phase dynamics, with Bayesian inference of the time-evolving parameters being achieved by shaping the prior densities to incorporate knowledge of previous samples. We now present the method in detail using numerically-generated data, data from an analog electronic circuit, and cardio-respiratory data. We also generalize the method to encompass networks of interacting oscillators and thus demonstrate its applicability to small-scale networks.

  11. Use of Bayesian Inference to Correlate In Vitro Embryo Production and In Vivo Fertility in Zebu Bulls

    Science.gov (United States)

    Sudano, Mateus José; Crespilho, André Maciel; Fernandes, Claudia Barbosa; Junior, Alicio Martins; Papa, Frederico Ozanam; Rodrigues, Josemar; Machado, Rui; Landim-Alvarenga, Fernanda Da Cruz

    2011-01-01

    The objective of this experiment was to test in vitro embryo production (IVP) as a tool to estimate fertility performance in zebu bulls using Bayesian inference statistics. Oocytes were matured and fertilized in vitro using sperm cells from three different Zebu bulls (V, T, and G). The three bulls presented similar results with regard to pronuclear formation and blastocyst formation rates. However, the cleavage rates were different between bulls. The estimated conception rates based on combined data of cleavage and blastocyst formation were very similar to the true conception rates observed for the same bulls after a fixed-time artificial insemination program. Moreover, even when we used cleavage rate data only or blastocyst formation data only, the estimated conception rates were still close to the true conception rates. We conclude that Bayesian inference is an effective statistical procedure to estimate in vivo bull fertility using data from IVP. PMID:21547211

  12. Predictive Distribution of the Dirichlet Mixture Model by the Local Variational Inference Method

    DEFF Research Database (Denmark)

    Ma, Zhanyu; Leijon, Arne; Tan, Zheng-Hua;

    2014-01-01

    predictive likelihood of the new upcoming data, especially when the amount of training data is small. The Bayesian estimation of a Dirichlet mixture model (DMM) is, in general, not analytically tractable. In our previous work, we have proposed a global variational inference-based method for approximately...

  13. Inferring cetacean population densities from the absolute dynamic topography of the ocean in a hierarchical Bayesian framework.

    Science.gov (United States)

    Pardo, Mario A; Gerrodette, Tim; Beier, Emilio; Gendron, Diane; Forney, Karin A; Chivers, Susan J; Barlow, Jay; Palacios, Daniel M

    2015-01-01

    We inferred the population densities of blue whales (Balaenoptera musculus) and short-beaked common dolphins (Delphinus delphis) in the Northeast Pacific Ocean as functions of the water-column's physical structure by implementing hierarchical models in a Bayesian framework. This approach allowed us to propagate the uncertainty of the field observations into the inference of species-habitat relationships and to generate spatially explicit population density predictions with reduced effects of sampling heterogeneity. Our hypothesis was that the large-scale spatial distributions of these two cetacean species respond primarily to ecological processes resulting from shoaling and outcropping of the pycnocline in regions of wind-forced upwelling and eddy-like circulation. Physically, these processes affect the thermodynamic balance of the water column, decreasing its volume and thus the height of the absolute dynamic topography (ADT). Biologically, they lead to elevated primary productivity and persistent aggregation of low-trophic-level prey. Unlike other remotely sensed variables, ADT provides information about the structure of the entire water column and it is also routinely measured at high spatial-temporal resolution by satellite altimeters with uniform global coverage. Our models provide spatially explicit population density predictions for both species, even in areas where the pycnocline shoals but does not outcrop (e.g. the Costa Rica Dome and the North Equatorial Countercurrent thermocline ridge). Interannual variations in distribution during El Niño anomalies suggest that the population density of both species decreases dramatically in the Equatorial Cold Tongue and the Costa Rica Dome, and that their distributions retract to particular areas that remain productive, such as the more oceanic waters in the central California Current System, the northern Gulf of California, the North Equatorial Countercurrent thermocline ridge, and the more southern portion of the

  15. Bayesian Discovery of Linear Acyclic Causal Models

    CERN Document Server

    Hoyer, Patrik O

    2012-01-01

    Methods for automated discovery of causal relationships from non-interventional data have received much attention recently. A widely used and well understood model family is given by linear acyclic causal models (recursive structural equation models). For Gaussian data both constraint-based methods (Spirtes et al., 1993; Pearl, 2000) (which output a single equivalence class) and Bayesian score-based methods (Geiger and Heckerman, 1994) (which assign relative scores to the equivalence classes) are available. On the contrary, all current methods able to utilize non-Gaussianity in the data (Shimizu et al., 2006; Hoyer et al., 2008) always return only a single graph or a single equivalence class, and so are fundamentally unable to express the degree of certainty attached to that output. In this paper we develop a Bayesian score-based approach able to take advantage of non-Gaussianity when estimating linear acyclic causal models, and we empirically demonstrate that, at least on very modest size networks, its accur...

  16. Use of SAMC for Bayesian analysis of statistical models with intractable normalizing constants

    KAUST Repository

    Jin, Ick Hoon

    2014-03-01

    Statistical inference for the models with intractable normalizing constants has attracted much attention. During the past two decades, various approximation- or simulation-based methods have been proposed for the problem, such as the Monte Carlo maximum likelihood method and the auxiliary variable Markov chain Monte Carlo methods. The Bayesian stochastic approximation Monte Carlo algorithm specifically addresses this problem: It works by sampling from a sequence of approximate distributions with their average converging to the target posterior distribution, where the approximate distributions can be achieved using the stochastic approximation Monte Carlo algorithm. A strong law of large numbers is established for the Bayesian stochastic approximation Monte Carlo estimator under mild conditions. Compared to the Monte Carlo maximum likelihood method, the Bayesian stochastic approximation Monte Carlo algorithm is more robust to the initial guess of model parameters. Compared to the auxiliary variable MCMC methods, the Bayesian stochastic approximation Monte Carlo algorithm avoids the requirement for perfect samples, and thus can be applied to many models for which perfect sampling is not available or very expensive. The Bayesian stochastic approximation Monte Carlo algorithm also provides a general framework for approximate Bayesian analysis. © 2012 Elsevier B.V. All rights reserved.

  17. Adversarial life testing: A Bayesian negotiation model

    International Nuclear Information System (INIS)

    Life testing is a procedure intended for facilitating the process of making decisions in the context of industrial reliability. On the other hand, negotiation is a process of making joint decisions that has one of its main foundations in decision theory. A Bayesian sequential model of negotiation in the context of adversarial life testing is proposed. This model considers a general setting for which a manufacturer offers a product batch to a consumer. It is assumed that the reliability of the product is measured in terms of its lifetime. Furthermore, both the manufacturer and the consumer have to use their own information with respect to the quality of the product. Under these assumptions, two situations can be analyzed. For both of them, the main aim is to accept or reject the product batch based on the product reliability. This topic is related to a reliability demonstration problem. The procedure is applied to a class of distributions that belong to the exponential family. Thus, a unified framework addressing the main topics in the considered Bayesian model is presented. An illustrative example shows that the proposed technique can be easily applied in practice
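
A hedged sketch of the kind of accept/reject rule this record describes, worked for the exponential-family case: exponential lifetimes with a conjugate Gamma prior on the failure rate, and acceptance when the posterior probability of meeting a reliability requirement is high enough. The prior, test data, requirement, and threshold below are all invented for illustration, not taken from the paper.

```python
import numpy as np
from scipy import stats

# Hypothetical illustration: exponential lifetimes with a conjugate
# Gamma(a0, b0) prior on the failure rate lambda (rate parametrization).
a0, b0 = 2.0, 200.0                  # consumer's prior: E[lambda] = a0/b0
lifetimes = np.array([95.0, 130.0, 210.0, 80.0, 160.0])  # test data (hours)

# Conjugate update: posterior is Gamma(a0 + n, b0 + total time on test).
a_n = a0 + lifetimes.size
b_n = b0 + lifetimes.sum()

# Accept the batch if P(mean lifetime 1/lambda > 100 h) is high enough,
# i.e. P(lambda < 0.01 | data) under the Gamma posterior.
p_reliable = stats.gamma.cdf(0.01, a=a_n, scale=1.0 / b_n)
print(f"P(mean lifetime > 100 h | data) = {p_reliable:.3f}")
print("accept batch" if p_reliable > 0.80 else "reject batch")
```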

  18. Modeling Land-Use Decision Behavior with Bayesian Belief Networks

    Directory of Open Access Journals (Sweden)

    Inge Aalders

    2008-06-01

    Full Text Available The ability to incorporate and manage the different drivers of land-use change in a modeling process is one of the key challenges because they are complex and are both quantitative and qualitative in nature. This paper uses Bayesian belief networks (BBN to incorporate characteristics of land managers in the modeling process and to enhance our understanding of land-use change based on the limited and disparate sources of information. One of the two models based on spatial data represented land managers in the form of a quantitative variable, the area of individual holdings, whereas the other model included qualitative data from a survey of land managers. Random samples from the spatial data provided evidence of the relationship between the different variables, which I used to develop the BBN structure. The model was tested for four different posterior probability distributions, and results showed that the trained and learned models are better at predicting land use than the uniform and random models. The inference from the model demonstrated the constraints that biophysical characteristics impose on land managers; for older land managers without heirs, there is a higher probability of the land use being arable agriculture. The results show the benefits of incorporating a more complex notion of land managers in land-use models, and of using different empirical data sources in the modeling process. Future research should focus on incorporating more complex social processes into the modeling structure, as well as incorporating spatio-temporal dynamics in a BBN.
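
To make the belief-network mechanics concrete, here is a minimal sketch of the kind of query mentioned in the abstract (probability of arable use for an older land manager), computed by enumerating over an unobserved node. The network structure, states, and probability tables are invented for illustration, not taken from the study.

```python
# Minimal sketch of Bayesian-belief-network reasoning of the kind the study
# describes; variables, states and probabilities below are hypothetical.
P_age = {"young": 0.6, "old": 0.4}                      # P(Age)
P_heir = {"yes": 0.7, "no": 0.3}                        # P(Heir)
P_arable = {                                            # P(Use=arable | Age, Heir)
    ("young", "yes"): 0.30, ("young", "no"): 0.40,
    ("old", "yes"): 0.45,   ("old", "no"): 0.75,
}

# P(Use=arable | Age=old), enumerating over the unobserved Heir node.
posterior = sum(P_heir[h] * P_arable[("old", h)] for h in P_heir)
print(f"P(arable | old manager) = {posterior:.3f}")
```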

  19. Bayesian inference along Markov Chain Monte Carlo approach for PWR core loading pattern optimization

    International Nuclear Information System (INIS)

    Highlights: ► The BIMCMC method performs very well and is comparable to GA and PSO techniques. ► The technique shows very good potential for optimization. ► It is observed that the performance of the method is quite adequate. ► The BIMCMC is very easy to implement. -- Abstract: Despite remarkable progress in optimization procedures, inherent complexities in nuclear reactor structure and strong interdependence among the fundamental indices, namely economic, neutronic, thermo-hydraulic and environmental effects, make it necessary to evaluate the most efficient arrangement of a reactor core. In this paper a reactor core reloading technique based on Bayesian inference along Markov Chain Monte Carlo, BIMCMC, is addressed in the context of obtaining an optimal configuration of fuel assemblies in reactor cores. The Markov Chain Monte Carlo with Metropolis–Hastings algorithm has been applied for sampling the decision variables and accepting or rejecting proposals. The proposed algorithm can be used for in-core fuel management optimization problems in pressurized water reactors. Considerable work has been expended on loading pattern optimization, but no preferred approach has yet emerged. To evaluate the proposed technique, increasing the effective multiplication factor Keff of a WWER-1000 core while flattening the power distribution and keeping the power peaking factor below a specific limit is considered as a first test case, and flattening of power alone as a second test case; other variables such as burn-up and cycle length can also be taken into account. The results, convergence rate and reliability of the new method are compared to published data resulting from particle swarm optimization and genetic algorithm; the outcome is quite promising and demonstrates the potential of the technique for optimization applications in the nuclear engineering field.
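
A minimal sketch of a Metropolis–Hastings search over loading patterns of the kind this record describes, with a toy quadratic objective standing in for the core simulator (the real objective would involve Keff and a power-peaking penalty). The pattern encoding, objective, and inverse temperature `beta` are assumptions of this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(pattern):
    # Stand-in for the core simulator: in the paper this would be Keff
    # plus power-flattening terms; here it is a toy quadratic score.
    return -np.sum((pattern - np.arange(pattern.size)) ** 2)

def bimcmc(n_assemblies=10, n_iter=5000, beta=0.5):
    """Metropolis-Hastings over loading patterns with pairwise-swap proposals."""
    current = rng.permutation(n_assemblies)
    f_curr, best = objective(current), None
    for _ in range(n_iter):
        proposal = current.copy()
        i, j = rng.choice(n_assemblies, size=2, replace=False)
        proposal[[i, j]] = proposal[[j, i]]          # symmetric swap proposal
        f_prop = objective(proposal)
        if np.log(rng.random()) < beta * (f_prop - f_curr):  # MH acceptance
            current, f_curr = proposal, f_prop
        if best is None or f_curr > best[1]:
            best = (current.copy(), f_curr)
    return best

pattern, score = bimcmc()
print(pattern, score)
```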

  20. Experiments for Evaluating Application of Bayesian Inference to Situation Awareness of Human Operators in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seongkeun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2014-05-15

    The purpose of this paper is to confirm whether Bayesian inference can properly reflect the situation awareness of real human operators, to find the differences between ideal and practical operators, and to investigate the factors that contribute to those differences. The results show that humans cannot think like computers. If humans could memorize all the information, and if their thinking process were identical to a computer's CPU, the success rates in these two experiments would exceed 99%. In practice, the probability of finding the right malfunction was only 64.52% in the simple experiment and 51.61% in the complex experiment. Cognition is the mental processing that includes the attention of working memory, comprehending and producing language, calculating, reasoning, problem solving, and decision making. There are many reasons why the human thinking process differs from a computer's, but in this experiment we suggest that working memory is the most important factor. Humans have a limited working memory with a capacity of only seven chunks, the so-called magic number. If more than seven sequential pieces of information arrive, people start to forget the earlier information because their working-memory capacity is exceeded. The simple experiment shows how strongly working memory affects the result. What if we neglect the effect of working memory? Seven subjects (3, 5, 6, 7, 8, 15, 25) had incorrect memories; they could have found the right malfunction if their memory had not been corrupted by the lack of working-memory capacity. The probability of finding the correct malfunction would then increase from 64.52% to 87.10%. The complex experiment gives a similar result: eight subjects (1, 5, 8, 9, 15, 17, 18, 30) misremembered, and this affected their ability to find the right malfunction. Taking this into account, the probability would be (16+8)/31 = 77.42%.
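
The sketch below is our own toy rendering (not the authors' code) of the ideal-operator baseline implied above: a sequential Bayesian update over candidate malfunctions, with the working-memory limit modeled by retaining only the last `capacity` observed symptoms. The malfunctions, symptoms, and likelihoods are invented.

```python
# Toy posterior over candidate malfunctions given a sequence of symptoms,
# with a working-memory cutoff of `capacity` items (the "magic number").
likelihood = {            # P(symptom | malfunction), hypothetical numbers
    "pump_trip":   {"low_flow": 0.8, "high_temp": 0.3, "alarm_A": 0.6},
    "valve_stuck": {"low_flow": 0.5, "high_temp": 0.7, "alarm_A": 0.2},
}
prior = {"pump_trip": 0.5, "valve_stuck": 0.5}

def posterior(symptoms, capacity=7):
    kept = symptoms[-capacity:]              # working-memory truncation
    post = dict(prior)
    for s in kept:
        for m in post:
            post[m] *= likelihood[m][s]      # Bayes update per symptom
        z = sum(post.values())
        post = {m: p / z for m, p in post.items()}
    return post

print(posterior(["low_flow", "alarm_A", "high_temp"]))
```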

  2. HASSET: a probability event tree tool to evaluate future volcanic scenarios using Bayesian inference

    Science.gov (United States)

    Sobradelo, Rosa; Bartolini, Stefania; Martí, Joan

    2014-01-01

    Event tree structures constitute one of the most useful and necessary tools in modern volcanology for assessment of hazards from future volcanic scenarios (those that culminate in an eruptive event as well as those that do not). They are particularly relevant for evaluation of long- and short-term probabilities of occurrence of possible volcanic scenarios and their potential impacts on urbanized areas. In this paper, we introduce Hazard Assessment Event Tree (HASSET), a probability tool, built on an event tree structure that uses Bayesian inference to estimate the probability of occurrence of a future volcanic scenario and to evaluate the most relevant sources of uncertainty from the corresponding volcanic system. HASSET includes hazard assessment of noneruptive and nonmagmatic volcanic scenarios, that is, episodes of unrest that do not evolve into volcanic eruption but have an associated volcanic hazard (e.g., sector collapse and phreatic explosion), as well as unrest episodes driven by external triggers rather than the magmatic system alone. Additionally, HASSET introduces the Delta method to assess precision of the probability estimates, by reporting a 1 standard deviation variability interval around the expected value for each scenario. HASSET is presented as a free software package in the form of a plug-in for the open source geographic information system Quantum GIS (QGIS), providing a graphically supported computation of the event tree structure in an interactive and user-friendly way. We also include further in-depth explanations for each node together with an application of HASSET to Teide-Pico Viejo volcanic complex (Spain).
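
As a hedged sketch of the update performed at a single node of such an event tree: a Dirichlet prior over branch probabilities is updated with episode counts, and each branch is reported as mean ± 1 standard deviation in the spirit of HASSET's variability interval. The branch names, prior, and counts below are invented.

```python
import numpy as np

# Bayesian update at one event-tree node (hypothetical numbers).
alpha_prior = np.array([1.0, 1.0, 1.0])   # branches: unrest ends / phreatic / magmatic
counts = np.array([14, 3, 2])             # invented historical record

alpha_post = alpha_prior + counts
a0 = alpha_post.sum()
mean = alpha_post / a0
# Standard deviation of the Dirichlet marginals (Beta distributions).
sd = np.sqrt(alpha_post * (a0 - alpha_post) / (a0**2 * (a0 + 1)))

for name, m, s in zip(["ends", "phreatic", "magmatic"], mean, sd):
    print(f"P({name}) = {m:.2f} +/- {s:.2f}")
```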

  3. A defense of Columbo (and of the use of Bayesian inference in forensics): A multilevel introduction to probabilistic reasoning

    CERN Document Server

    D'Agostini, G

    2010-01-01

    Triggered by a recent interesting New Scientist article on the too-frequent incorrect use of probabilistic evidence in courts, I introduce the basic concepts of probabilistic inference with a toy model, and discuss several important issues that need to be understood in order to extend the basic reasoning to real-life cases. In particular, I emphasize the often neglected point that degrees of belief are updated not by 'bare facts' alone, but by all available information pertaining to them, including how they have been acquired. In this light I show that, contrary to what is claimed in that article, there was no "probabilistic pitfall" in the Columbo episode pointed to as an example of "bad mathematics" yielding "rough justice". Instead, such criticism could provoke a negative reaction to the article itself and to the use of Bayesian reasoning in courts, as well as in all other places in which probabilities need to be assessed and decisions need to be made. Anyway, besides introductory/recreational aspects, the pape...
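
A minimal numeric sketch of the posterior-odds reasoning introduced here, showing why a large likelihood ratio alone need not yield a large posterior probability of guilt; all numbers are invented for illustration.

```python
# Toy posterior-odds update (hypothetical numbers).
# Evidence E: suspect's fingerprints on the murder weapon.
prior_odds = 1 / 1000          # P(guilty) / P(innocent) before the evidence

# Likelihoods must be assessed under all relevant information,
# including how the evidence was acquired.
p_E_given_guilty = 0.7
p_E_given_innocent = 0.01      # e.g. innocent handling of the weapon

posterior_odds = prior_odds * (p_E_given_guilty / p_E_given_innocent)
posterior_prob = posterior_odds / (1 + posterior_odds)
print(f"posterior odds = {posterior_odds:.3f}, P(guilty|E) = {posterior_prob:.3f}")
```

Even with a likelihood ratio of 70, the low prior odds leave the posterior probability of guilt well below one half, which is the point the toy model is meant to convey.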

  4. A Bayesian Based Functional Mixed-Effects Model for Analysis of LC-MS Data

    OpenAIRE

    Befekadu, Getachew K.; Tadesse, Mahlet G.; Ressom, Habtom W

    2009-01-01

    A Bayesian multilevel functional mixed-effects model with group specific random-effects is presented for analysis of liquid chromatography-mass spectrometry (LC-MS) data. The proposed framework allows alignment of LC-MS spectra with respect to both retention time (RT) and mass-to-charge ratio (m/z). Affine transformations are incorporated within the model to account for any variability along the RT and m/z dimensions. Simultaneous posterior inference of all unknown parameters is accomplished ...

  5. A Note on Bayesian Estimation for the Negative-Binomial Model

    OpenAIRE

    L. Lio, Y.

    2009-01-01

    2000 Mathematics Subject Classification: 62F15. The Negative Binomial model, which is generated by a simple mixture model, has been widely applied in the social, health and economic market prediction. The most commonly used methods were the maximum likelihood estimate (MLE) and the moment method estimate (MME). Bradlow et al. (2002) proposed a Bayesian inference with beta-prime and Pearson Type VI as priors for the negative binomial distribution. It is due to the complicated posterior dens...

  6. Bayesian Estimation of a Mixture Model

    Directory of Open Access Journals (Sweden)

    Ilhem Merah

    2015-05-01

    Full Text Available We present the properties of a bathtub-curve reliability model, introduced by Idée and Pierrat (2010), having both sufficient adaptability and a minimal number of parameters. It is a mixture of a Gamma distribution G(2, 1/θ) and a new distribution L(θ). We are interested in Bayesian estimation of the parameters and survival function of this model under a squared-error loss function with a non-informative prior, using the approximations of Lindley (1980) and Tierney and Kadane (1986). Using a statistical sample of 60 failure data points from a technical device, we illustrate the derived results. Based on a simulation study, comparisons are made between these two methods and the maximum likelihood method for this two-parameter model.

  7. On the criticality of inferred models

    Science.gov (United States)

    Mastromatteo, Iacopo; Marsili, Matteo

    2011-10-01

    Advanced inference techniques allow one to reconstruct a pattern of interaction from high dimensional data sets, from probing simultaneously thousands of units of extended systems—such as cells, neural tissues and financial markets. We focus here on the statistical properties of inferred models and argue that inference procedures are likely to yield models which are close to singular values of parameters, akin to critical points in physics where phase transitions occur. These are points where the response of physical systems to external perturbations, as measured by the susceptibility, is very large and diverges in the limit of infinite size. We show that the reparameterization invariant metrics in the space of probability distributions of these models (the Fisher information) are directly related to the susceptibility of the inferred model. As a result, distinguishable models tend to accumulate close to critical points, where the susceptibility diverges in infinite systems. This region is the one where the estimate of inferred parameters is most stable. In order to illustrate these points, we discuss inference of interacting point processes with application to financial data and show that sensible choices of observation time scales naturally yield models which are close to criticality.

  8. The Bayesian Modelling Of Inflation Rate In Romania

    OpenAIRE

    Mihaela Simionescu

    2014-01-01

    Bayesian econometrics has seen a considerable increase in popularity in recent years, attracting the interest of various groups of researchers in the economic sciences and beyond, including specialists in econometrics, commerce, industry, marketing, finance, microeconomics, macroeconomics and other domains. The purpose of this research is to provide an introduction to the Bayesian approach applied in economics, starting with Bayes' theorem. For Bayesian linear regression models, the methodology of estim...

  9. Bayesian Models of Graphs, Arrays and Other Exchangeable Random Structures.

    Science.gov (United States)

    Orbanz, Peter; Roy, Daniel M

    2015-02-01

    The natural habitat of most Bayesian methods is data represented by exchangeable sequences of observations, for which de Finetti's theorem provides the theoretical foundation. Dirichlet process clustering, Gaussian process regression, and many other parametric and nonparametric Bayesian models fall within the remit of this framework; many problems arising in modern data analysis do not. This article provides an introduction to Bayesian models of graphs, matrices, and other data that can be modeled by random structures. We describe results in probability theory that generalize de Finetti's theorem to such data and discuss their relevance to nonparametric Bayesian modeling. With the basic ideas in place, we survey example models available in the literature; applications of such models include collaborative filtering, link prediction, and graph and network analysis. We also highlight connections to recent developments in graph theory and probability, and sketch the more general mathematical foundation of Bayesian methods for other types of data beyond sequences and arrays. PMID:26353253

  10. Bayesian inference of T Tauri star properties using multi-wavelength survey photometry

    Science.gov (United States)

    Barentsen, Geert; Vink, J. S.; Drew, J. E.; Sale, S. E.

    2013-03-01

    There are many pertinent open issues in the area of star and planet formation. Large statistical samples of young stars across star-forming regions are needed to trigger a breakthrough in our understanding, but most optical studies are based on a wide variety of spectrographs and analysis methods, which introduces large biases. Here we show how graphical Bayesian networks can be employed to construct a hierarchical probabilistic model which allows pre-main-sequence ages, masses, accretion rates and extinctions to be estimated using two widely available photometric survey data bases (Isaac Newton Telescope Photometric Hα Survey r'/Hα/i' and Two Micron All Sky Survey J-band magnitudes). Because our approach does not rely on spectroscopy, it can easily be applied to homogeneously study the large number of clusters for which Gaia will yield membership lists. We explain how the analysis is carried out using the Markov chain Monte Carlo method and provide PYTHON source code. We then demonstrate its use on 587 known low-mass members of the star-forming region NGC 2264 (Cone Nebula), arriving at a median age of 3.0 Myr, an accretion fraction of 20 ± 2 per cent and a median accretion rate of 10^-8.4 M⊙ yr^-1. The Bayesian analysis formulated in this work delivers results which are in agreement with spectroscopic studies already in the literature, but achieves this with great efficiency by depending only on photometry. It is a significant step forward from previous photometric studies because the probabilistic approach ensures that nuisance parameters, such as extinction and distance, are fully included in the analysis with a clear picture on any degeneracies.

  11. Improving randomness characterization through Bayesian model selection

    CERN Document Server

    R., Rafael Díaz-H; Martínez, Alí M Angulo; U'Ren, Alfred B; Hirsch, Jorge G; Marsili, Matteo; Castillo, Isaac Pérez

    2016-01-01

    Nowadays random number generation plays an essential role in technology with important applications in areas ranging from cryptography, which lies at the core of current communication protocols, to Monte Carlo methods, and other probabilistic algorithms. In this context, a crucial scientific endeavour is to develop effective methods that allow the characterization of random number generators. However, commonly employed methods either lack formality (e.g. the NIST test suite), or are inapplicable in principle (e.g. the characterization derived from the Algorithmic Theory of Information (ATI)). In this letter we present a novel method based on Bayesian model selection, which is both rigorous and effective, for characterizing randomness in a bit sequence. We derive analytic expressions for a model's likelihood which is then used to compute its posterior probability distribution. Our method proves to be more rigorous than NIST's suite and the Borel-Normality criterion and its implementation is straightforward. We...
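
The sketch below illustrates the flavor of such Bayesian model selection with two candidate models for a bit sequence, an i.i.d. Bernoulli model and an order-1 Markov model, whose marginal likelihoods are available in closed form under Beta(1,1) priors. The specific model pair and priors are our assumptions, not necessarily those of the paper.

```python
import numpy as np
from scipy.special import betaln

def log_marglik_iid(bits):
    """log m(bits) under an i.i.d. Bernoulli model with a Beta(1,1) prior."""
    n1 = int(np.sum(bits)); n0 = len(bits) - n1
    return betaln(n1 + 1, n0 + 1) - betaln(1, 1)

def log_marglik_markov(bits):
    """log m(bits) under an order-1 Markov model, Beta(1,1) prior per row."""
    logm = np.log(0.5)                       # uniform initial bit
    for prev in (0, 1):
        trans = [bits[i + 1] for i in range(len(bits) - 1) if bits[i] == prev]
        n1 = sum(trans); n0 = len(trans) - n1
        logm += betaln(n1 + 1, n0 + 1) - betaln(1, 1)
    return logm

seq = np.array([0, 1] * 50)                  # perfectly alternating, not random
log_bayes_factor = log_marglik_markov(seq) - log_marglik_iid(seq)
print(f"log Bayes factor (Markov vs iid): {log_bayes_factor:.1f}")
```

A strongly positive log Bayes factor flags the alternating sequence as structured, even though its fraction of ones is exactly one half and it would pass a naive frequency test.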

  12. Inversion of hierarchical Bayesian models using Gaussian processes.

    Science.gov (United States)

    Lomakina, Ekaterina I; Paliwal, Saee; Diaconescu, Andreea O; Brodersen, Kay H; Aponte, Eduardo A; Buhmann, Joachim M; Stephan, Klaas E

    2015-09-01

    Over the past decade, computational approaches to neuroimaging have increasingly made use of hierarchical Bayesian models (HBMs), either for inferring on physiological mechanisms underlying fMRI data (e.g., dynamic causal modelling, DCM) or for deriving computational trajectories (from behavioural data) which serve as regressors in general linear models. However, an unresolved problem is that standard methods for inverting the hierarchical Bayesian model are either very slow, e.g. Markov Chain Monte Carlo Methods (MCMC), or are vulnerable to local minima in non-convex optimisation problems, such as variational Bayes (VB). This article considers Gaussian process optimisation (GPO) as an alternative approach for global optimisation of sufficiently smooth and efficiently evaluable objective functions. GPO avoids being trapped in local extrema and can be computationally much more efficient than MCMC. Here, we examine the benefits of GPO for inverting HBMs commonly used in neuroimaging, including DCM for fMRI and the Hierarchical Gaussian Filter (HGF). Importantly, to achieve computational efficiency despite high-dimensional optimisation problems, we introduce a novel combination of GPO and local gradient-based search methods. The utility of this GPO implementation for DCM and HGF is evaluated against MCMC and VB, using both synthetic data from simulations and empirical data. Our results demonstrate that GPO provides parameter estimates with equivalent or better accuracy than the other techniques, but at a fraction of the computational cost required for MCMC. We anticipate that GPO will prove useful for robust and efficient inversion of high-dimensional and nonlinear models of neuroimaging data. PMID:26048619

  13. On the shape of the mass-function of dense clumps in the Hi-GAL fields. II. Using Bayesian inference to study the clump mass function

    CERN Document Server

    Olmi, L; Elia, D; Molinari, S; Pestalozzi, M; Pezzuto, S; Schisano, E; Testi, L; Thompson, M

    2013-01-01

    Context. Stars form in dense, dusty clumps of molecular clouds, but little is known about their origin, their evolution and their detailed physical properties. In particular, the relationship between the mass distribution of these clumps (also known as the "clump mass function", or CMF) and the stellar initial mass function (IMF), is still poorly understood. Aims. In order to better understand how the CMF evolves toward the IMF, and to discern the "true" shape of the CMF, large samples of bona-fide pre- and proto-stellar clumps are required. Two such datasets obtained from the Herschel infrared GALactic Plane Survey (Hi-GAL) have been described in Paper I. Robust statistical methods are needed in order to infer the parameters describing the models used to fit the CMF, and to compare the competing models themselves. Methods. In this paper we apply Bayesian inference to the analysis of the CMF of the two regions discussed in Paper I. First, we determine the Bayesian posterior probability distribution for each of...

  14. Lifted Inference for Relational Continuous Models

    CERN Document Server

    Choi, Jaesik; Hill, David J

    2012-01-01

    Relational Continuous Models (RCMs) represent joint probability densities over attributes of objects, when the attributes have continuous domains. With relational representations, they can model joint probability distributions over large numbers of variables compactly in a natural way. This paper presents a new exact lifted inference algorithm for RCMs, thus it scales up to large models of real world applications. The algorithm applies to Relational Pairwise Models which are (relational) products of potentials of arity 2. Our algorithm is unique in two ways. First, it substantially improves the efficiency of lifted inference with variables of continuous domains. When a relational model has Gaussian potentials, it takes only linear-time compared to cubic time of previous methods. Second, it is the first exact inference algorithm which handles RCMs in a lifted way. The algorithm is illustrated over an example from econometrics. Experimental results show that our algorithm outperforms both a groundlevel inferenc...

  15. Bayesian mixture models for Poisson astronomical images

    CERN Document Server

    Guglielmetti, Fabrizia; Dose, Volker

    2012-01-01

    Astronomical images in the Poisson regime are typically characterized by a spatially varying cosmic background, large variety of source morphologies and intensities, data incompleteness, steep gradients in the data, and few photon counts per pixel. The Background-Source separation technique is developed with the aim to detect faint and extended sources in astronomical images characterized by Poisson statistics. The technique employs Bayesian mixture models to reliably detect the background as well as the sources with their respective uncertainties. Background estimation and source detection is achieved in a single algorithm. A large variety of source morphologies is revealed. The technique is applied in the X-ray part of the electromagnetic spectrum on ROSAT and Chandra data sets and it is under a feasibility study for the forthcoming eROSITA mission.

  16. Dead or gone? Bayesian inference on mortality for the dispersing sex.

    Science.gov (United States)

    Barthold, Julia A; Packer, Craig; Loveridge, Andrew J; Macdonald, David W; Colchero, Fernando

    2016-07-01

    Estimates of age-specific mortality are regularly used in ecology, evolution, and conservation research. However, estimating mortality of the dispersing sex, in species where one sex undergoes natal dispersal, is difficult. This is because it is often unclear whether members of the dispersing sex that disappear from monitored areas have died or dispersed. Here, we develop an extension of a multievent model that imputes dispersal state (i.e., died or dispersed) for uncertain records of the dispersing sex as a latent state and estimates age-specific mortality and dispersal parameters in a Bayesian hierarchical framework. To check the performance of our model, we first conduct a simulation study. We then apply our model to a long-term data set of African lions. Using these data, we further study how well our model estimates mortality of the dispersing sex by incrementally reducing the level of uncertainty in the records of male lions. We achieve this by taking advantage of an expert's indication on the likely fate of each missing male (i.e., likely died or dispersed). We find that our model produces accurate mortality estimates for simulated data of varying sample sizes and proportions of uncertain male records. From the empirical study, we learned that our model provides similar mortality estimates for different levels of uncertainty in records. However, a sensitivity of the mortality estimates to varying uncertainty is, as can be expected, detectable. We conclude that our model provides a solution to the challenge of estimating mortality of the dispersing sex in species with data deficiency due to natal dispersal. Given the utility of sex-specific mortality estimates in biological and conservation research, and the virtual ubiquity of sex-biased dispersal, our model will be useful to a wide variety of applications. PMID:27547322

  17. Bayesian Analysis of Hazard Regression Models under Order Restrictions on Covariate Effects and Ageing

    OpenAIRE

    Bhattacharjee, Arnab; Bhattacharjee, Madhuchhanda

    2007-01-01

    We propose Bayesian inference in hazard regression models where the baseline hazard is unknown, covariate effects are possibly age-varying (non-proportional), and there is multiplicative frailty with arbitrary distribution. Our framework incorporates a wide variety of order restrictions on covariate dependence and duration dependence (ageing). We propose estimation and evaluation of age-varying covariate effects when covariate dependence is monotone rather than proportional. In particular, we...

  18. Low bitrate object coding of musical audio using bayesian harmonic models

    OpenAIRE

    Vincent, Emmanuel; PLUMBLEY, Mark

    2007-01-01

    This article deals with the decomposition of music signals into pitched sound objects made of harmonic sinusoidal partials for very low bitrate coding purposes. After a brief review of existing methods, we recast this problem in the Bayesian framework. We propose a family of probabilistic signal models combining learnt object priors and various perceptually motivated distortion measures. We design efficient algorithms to infer object parameters and build a coder based on the interpolation of ...

  19. Data-driven and Model-based Verification:a Bayesian Identification Approach

    OpenAIRE

    Haesaert, Sofie; van den Hof, S.; Abate, A.

    2015-01-01

    This work develops a measurement-driven and model-based formal verification approach, applicable to systems with partly unknown dynamics. We provide a principled method, grounded on reachability analysis and on Bayesian inference, to compute the confidence that a physical system driven by external inputs and accessed under noisy measurements, verifies a temporal logic property. A case study is discussed, where we investigate the bounded- and unbounded-time safety of a partly unknown linear ti...

  20. A SEMIPARAMETRIC BAYESIAN MODEL FOR CIRCULAR-LINEAR REGRESSION

    Science.gov (United States)

    We present a Bayesian approach to regress a circular variable on a linear predictor. The regression coefficients are assumed to have a nonparametric distribution with a Dirichlet process prior. The semiparametric Bayesian approach gives added flexibility to the model and is usefu...

  1. Bayesian modeling of animal- and herd-level prevalences.

    Science.gov (United States)

    Branscum, A J; Gardner, I A; Johnson, W O

    2004-12-15

    We reviewed Bayesian approaches for animal-level and herd-level prevalence estimation based on cross-sectional sampling designs and demonstrated fitting of these models using the WinBUGS software. We considered estimation of infection prevalence based on use of a single diagnostic test applied to a single herd with binomial and hypergeometric sampling. We then considered multiple herds under binomial sampling with the primary goal of estimating the prevalence distribution and the proportion of infected herds. A new model is presented that can be used to estimate the herd-level prevalence in a region, including the posterior probability that all herds are non-infected. Using this model, inferences for the distribution of prevalences, mean prevalence in the region, and predicted prevalence of herds in the region (including the predicted probability of zero prevalence) are also available. In the models presented, both animal- and herd-level prevalences are modeled as mixture distributions to allow for zero infection prevalences. (If mixture models for the prevalences were not used, prevalence estimates might be artificially inflated, especially in herds and regions with low or zero prevalence.) Finally, we considered estimation of animal-level prevalence based on pooled samples. PMID:15579338
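
A hedged single-herd sketch of the mixture idea described above: prevalence is a mixture of a point mass at zero (herd not infected) and a Beta prior, updated with binomial test results on a grid. The prior weights, Beta parameters, and data are invented, and the paper's full model (multiple herds, imperfect tests, WinBUGS) is much richer.

```python
import numpy as np
from scipy import stats

pi_infected = 0.5                 # prior P(herd is infected)
a, b = 2.0, 8.0                   # Beta prior for prevalence if infected
y, n = 0, 30                      # test results: 0 positives in 30 animals

grid = np.linspace(1e-6, 1 - 1e-6, 2000)
lik = stats.binom.pmf(y, n, grid)

# Marginal likelihoods of the two mixture components.
dx = grid[1] - grid[0]
m_infected = np.sum(lik * stats.beta.pdf(grid, a, b)) * dx
m_free = stats.binom.pmf(y, n, 0.0)          # prevalence exactly zero

post_free = ((1 - pi_infected) * m_free
             / ((1 - pi_infected) * m_free + pi_infected * m_infected))
print(f"P(herd prevalence = 0 | data) = {post_free:.3f}")
```

Without the point mass at zero, the posterior would place all its weight on small but strictly positive prevalences, which is the inflation the abstract warns about.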

  2. Bayesian State-Space Modelling on High-Performance Hardware Using LibBi

    Directory of Open Access Journals (Sweden)

    Lawrence M. Murray

    2015-10-01

    Full Text Available LibBi is a software package for state space modelling and Bayesian inference on modern computer hardware, including multi-core central processing units, many-core graphics processing units, and distributed-memory clusters of such devices. The software parses a domain-specific language for model specification, then optimizes, generates, compiles and runs code for the given model, inference method and hardware platform. In presenting the software, this work serves as an introduction to state space models and the specialized methods developed for Bayesian inference with them. The focus is on sequential Monte Carlo (SMC) methods such as the particle filter for state estimation, and the particle Markov chain Monte Carlo and SMC² methods for parameter estimation. All are well-suited to current computer hardware. Two examples are given and developed throughout, one a linear three-element Windkessel model of the human arterial system, the other a nonlinear Lorenz '96 model. These are specified in the prescribed modelling language, and LibBi demonstrated by performing inference with them. Empirical results are presented, including a performance comparison of the software with different hardware configurations.
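
To show the kind of computation LibBi automates, here is a self-contained bootstrap particle filter for a one-dimensional linear-Gaussian state-space model. The model, parameter values, and particle count are illustrative assumptions; LibBi itself is driven by its own modelling language, not this Python.

```python
import numpy as np

rng = np.random.default_rng(1)

T, N = 50, 1000                     # time steps, particles
a, q, r = 0.9, 0.5, 1.0             # x_t = a*x_{t-1} + N(0,q);  y_t = x_t + N(0,r)

# Simulate a synthetic trajectory and observations.
x = np.zeros(T)
for t in range(1, T):
    x[t] = a * x[t - 1] + rng.normal(0, np.sqrt(q))
y = x + rng.normal(0, np.sqrt(r), size=T)

particles = rng.normal(0, 1, size=N)
means = np.zeros(T)
for t in range(T):
    particles = a * particles + rng.normal(0, np.sqrt(q), size=N)  # propagate
    logw = -0.5 * (y[t] - particles) ** 2 / r                      # weight
    w = np.exp(logw - logw.max()); w /= w.sum()
    means[t] = np.sum(w * particles)                               # filtered mean
    particles = particles[rng.choice(N, size=N, p=w)]              # resample

print("filtered mean at final step:", means[-1])
```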

  3. Stochastic Annealing for Variational Inference

    OpenAIRE

    Gultekin, San; Zhang, Aonan; Paisley, John

    2015-01-01

    We empirically evaluate a stochastic annealing strategy for Bayesian posterior optimization with variational inference. Variational inference is a deterministic approach to approximate posterior inference in Bayesian models in which a typically non-convex objective function is locally optimized over the parameters of the approximating distribution. We investigate an annealing method for optimizing this objective with the aim of finding a better local optimal solution and compare with determin...

  4. A new approach for Bayesian model averaging

    Institute of Scientific and Technical Information of China (English)

    TIAN XiangJun; XIE ZhengHui; WANG AiHui; YANG XiaoChun

    2012-01-01

    Bayesian model averaging (BMA) is a recently proposed statistical method for calibrating forecast ensembles from numerical weather models. However, successful implementation of BMA requires accurate estimates of the weights and variances of the individual competing models in the ensemble. Two methods, namely the Expectation-Maximization (EM) and the Markov Chain Monte Carlo (MCMC) algorithms, are widely used for BMA model training. Both methods have their own respective strengths and weaknesses. In this paper, we first modify the BMA log-likelihood function with the aim of removing the additional limitation that requires the BMA weights to add to one, and then use a limited-memory quasi-Newton algorithm for solving the nonlinear optimization problem, thereby formulating a new approach for BMA (referred to as BMA-BFGS). Several groups of multi-model soil moisture simulation experiments from three land surface models show that the performance of BMA-BFGS is similar to the MCMC method in terms of simulation accuracy, and that both are superior to the EM algorithm. On the other hand, the computational cost of the BMA-BFGS algorithm is substantially less than for MCMC and is almost equivalent to that for EM.
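
A hedged sketch of the core trick: re-parametrize the BMA weights through a softmax so the sum-to-one constraint disappears, then maximize the log-likelihood with a limited-memory quasi-Newton routine (here SciPy's L-BFGS-B). The synthetic forecasts and the single common spread parameter are simplifications; the real BMA-BFGS details may differ.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(2)

K, T = 3, 200                         # ensemble members, verification times
obs = rng.normal(0, 1, size=T)
forecasts = obs[None, :] + rng.normal(0, [[0.5], [1.0], [2.0]], size=(K, T))

def neg_loglik(theta):
    # Softmax removes the "weights sum to one" constraint from the search.
    w = np.exp(theta[:K] - theta[:K].max()); w /= w.sum()
    sigma = np.exp(theta[K])          # common predictive spread
    dens = norm.pdf(obs[None, :], loc=forecasts, scale=sigma)
    return -np.sum(np.log(w @ dens + 1e-300))

res = minimize(neg_loglik, x0=np.zeros(K + 1), method="L-BFGS-B")
w = np.exp(res.x[:K] - res.x[:K].max()); w /= w.sum()
print("BMA weights:", np.round(w, 3), " sigma:", round(float(np.exp(res.x[K])), 3))
```

The member with the smallest forecast error should receive the largest weight, which is a quick sanity check on the fitted mixture.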

  5. Coping with Trial-to-Trial Variability of Event Related Signals: A Bayesian Inference Approach

    Science.gov (United States)

    Ding, Mingzhou; Chen, Youghong; Knuth, Kevin H.; Bressler, Steven L.; Schroeder, Charles E.

    2005-01-01

    In electro-neurophysiology, single-trial brain responses to a sensory stimulus or a motor act are commonly assumed to result from the linear superposition of a stereotypic event-related signal (e.g. the event-related potential or ERP) that is invariant across trials and some ongoing brain activity often referred to as noise. To extract the signal, one performs an ensemble average of the brain responses over many identical trials to attenuate the noise. To date, this simple signal-plus-noise (SPN) model has been the dominant approach in cognitive neuroscience. Mounting empirical evidence has shown that the assumptions underlying this model may be overly simplistic. More realistic models have been proposed that account for the trial-to-trial variability of the event-related signal as well as the possibility of multiple differentially varying components within a given ERP waveform. The variable-signal-plus-noise (VSPN) model, which has been demonstrated to provide the foundation for separation and characterization of multiple differentially varying components, has the potential to provide a rich source of information for questions related to neural functions that complement the SPN model. Thus, being able to estimate the amplitude and latency of each ERP component on a trial-by-trial basis provides a critical link between the perceived benefits of the VSPN model and its many concrete applications. In this paper we describe a Bayesian approach to deal with this issue and the resulting strategy is referred to as the differentially Variable Component Analysis (dVCA). We compare the performance of dVCA on simulated data with Independent Component Analysis (ICA) and analyze neurobiological recordings from monkeys performing cognitive tasks.

  6. Final Report: Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Ghattas, Omar [The University of Texas at Austin

    2013-10-15

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focuses on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. Our research is directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. Our efforts are integrated in the context of a challenging testbed problem that considers subsurface reacting flow and transport. The MIT component of the SAGUARO Project addresses the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.

  7. Determination of the stopping power of 4He using Bayesian inference with the Markov chain Monte Carlo algorithm

    International Nuclear Information System (INIS)

    A new method for the experimental determination of stopping powers based on Bayesian Inference with the Markov chain Monte Carlo (MCMC) algorithm has been devised. This method avoids the difficulties related to thin target preparation. By measuring the RBS spectra for a known material, and using the known underlying physics, the stopping powers are determined by best matching the simulated spectra with the experimental spectra. Using silicon, SiO2 and Al2O3 as test cases, good agreement is obtained between calculated and experimental data. (author)
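
A toy Metropolis rendering of the spectrum-matching idea: a single stopping-power scale factor is inferred by comparing a simulated spectrum with (synthetic) measured counts under a Poisson likelihood. The exponential "forward model" below is merely a placeholder for the real RBS simulation, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

channels = np.arange(100)

def simulate(s):
    # Placeholder forward model; in the paper this is an RBS simulation
    # parametrized by the stopping power.
    return 1000.0 * np.exp(-channels / (30.0 * s))

data = rng.poisson(simulate(1.2))            # synthetic "measured" spectrum

def loglik(s):
    lam = simulate(s)
    return np.sum(data * np.log(lam) - lam)  # Poisson log-likelihood

s, ll = 1.0, loglik(1.0)
samples = []
for _ in range(20000):
    s_new = s + rng.normal(0, 0.02)          # random-walk proposal
    if s_new > 0:
        ll_new = loglik(s_new)
        if np.log(rng.random()) < ll_new - ll:
            s, ll = s_new, ll_new
    samples.append(s)

print("posterior mean scale factor:", round(float(np.mean(samples[5000:])), 3))
```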

  8. Final Report, DOE Early Career Award: Predictive modeling of complex physical systems: new tools for statistical inference, uncertainty quantification, and experimental design

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2016-08-31

    Predictive simulation of complex physical systems increasingly rests on the interplay of experimental observations with computational models. Key inputs, parameters, or structural aspects of models may be incomplete or unknown, and must be developed from indirect and limited observations. At the same time, quantified uncertainties are needed to qualify computational predictions in the support of design and decision-making. In this context, Bayesian statistics provides a foundation for inference from noisy and limited data, but at prohibitive computational expense. This project intends to make rigorous predictive modeling feasible in complex physical systems, via accelerated and scalable tools for uncertainty quantification, Bayesian inference, and experimental design. Specific objectives are as follows: 1. Develop adaptive posterior approximations and dimensionality reduction approaches for Bayesian inference in high-dimensional nonlinear systems. 2. Extend accelerated Bayesian methodologies to large-scale sequential data assimilation, fully treating nonlinear models and non-Gaussian state and parameter distributions. 3. Devise efficient surrogate-based methods for Bayesian model selection and the learning of model structure. 4. Develop scalable simulation/optimization approaches to nonlinear Bayesian experimental design, for both parameter inference and model selection. 5. Demonstrate these inferential tools on chemical kinetic models in reacting flow, constructing and refining thermochemical and electrochemical models from limited data. Demonstrate Bayesian filtering on canonical stochastic PDEs and in the dynamic estimation of inhomogeneous subsurface properties and flow fields.

  9. Bayesian Model Selection for LISA Pathfinder

    CERN Document Server

    Karnesis, Nikolaos; Sopuerta, Carlos F; Gibert, Ferran; Armano, Michele; Audley, Heather; Congedo, Giuseppe; Diepholz, Ingo; Ferraioli, Luigi; Hewitson, Martin; Hueller, Mauro; Korsakova, Natalia; Plagnol, Eric; Vitale, Stefano

    2013-01-01

    The main goal of the LISA Pathfinder (LPF) mission is to fully characterize the acceleration noise models and to test key technologies for future space-based gravitational-wave observatories similar to the LISA/eLISA concept. The Data Analysis (DA) team has developed complex three-dimensional models of the LISA Technology Package (LTP) experiment on-board LPF. These models are used for simulations, but more importantly, they will be used for parameter estimation purposes during flight operations. One of the tasks of the DA team is to identify the physical effects that contribute significantly to the properties of the instrument noise. A way of approaching to this problem is to recover the essential parameters of the LTP which describe the data. Thus, we want to define the simplest model that efficiently explains the observations. To do so, adopting a Bayesian framework, one has to estimate the so-called Bayes Factor between two competing models. In our analysis, we use three main different methods to estimate...

  10. Bayesian Model Averaging in the Instrumental Variable Regression Model

    OpenAIRE

    Gary Koop; Robert Leon Gonzalez; Rodney Strachan

    2011-01-01

    This paper considers the instrumental variable regression model when there is uncertainty about the set of instruments, exogeneity restrictions, the validity of identifying restrictions and the set of exogenous regressors. This uncertainty can result in a huge number of models. To avoid statistical problems associated with standard model selection procedures, we develop a reversible jump Markov chain Monte Carlo algorithm that allows us to do Bayesian model averaging. The algorithm is very fl...

  11. EVENT MODEL: A ROBUST BAYESIAN TOOL FOR CHRONOLOGICAL MODELING

    OpenAIRE

    Lanos, Philippe; Philippe, Anne

    2015-01-01

    We propose a new modeling approach for combining dates through the Event model by using hierarchical Bayesian statistics. The Event model aims to estimate the date of a context (unit of stratification) from individual dates assumed to be contemporaneous, which are affected by errors of different types: laboratory and calibration curve errors, and also irreducible errors related to contaminations, taphonomic disturbances, etc., hence the possible presence of outliers. The Event model has a hi...

  12. A Bayesian approach to the semi-analytic model of galaxy formation: methodology

    CERN Document Server

    Lu, Yu; Weinberg, Martin D; Katz, Neal S

    2010-01-01

    We believe that a wide range of physical processes conspire to shape the observed galaxy population but we remain unsure of their detailed interactions. The semi-analytic model (SAM) of galaxy formation uses multi-dimensional parameterizations of the physical processes of galaxy formation and provides a tool to constrain these underlying physical interactions. Because of the high dimensionality, the parametric problem of galaxy formation may be profitably tackled with a Bayesian-inference based approach, which allows one to constrain theory with data in a statistically rigorous way. In this paper, we develop a generalized SAM using the framework of Bayesian inference. We show that, with a parallel implementation of an advanced Markov-Chain Monte-Carlo algorithm, it is now possible to rigorously sample the posterior distribution of the high-dimensional parameter space of typical SAMs. As an example, we characterize galaxy formation in the current $\\Lambda$CDM cosmology using stellar mass function of galaxies a...

  13. Inferring cultural models from corpus data

    DEFF Research Database (Denmark)

    Jensen, Kim Ebensgaard

    2015-01-01

    into the possibility of inferring cultural models from naturally occurring verbal behavior as documented in language corpora. Even rarer are such corpus-based studies of the interaction between cultural models and constructions. Exploring the usability of corpus data and methodology in the observation...

  14. Bayesian estimation of parameters in a regional hydrological model

    OpenAIRE

    Engeland, K.; Gottschalk, L.

    2002-01-01

    This study evaluates the applicability of the distributed, process-oriented Ecomag model for prediction of daily streamflow in ungauged basins. The Ecomag model is applied as a regional model to nine catchments in the NOPEX area, using Bayesian statistics to estimate the posterior distribution of the model parameters conditioned on the observed streamflow. The distribution is calculated by Markov Chain Monte Carlo (MCMC) analysis. The Bayesian method requires formulation of a likelihood funct...

  16. Bayesian Analysis of Dynamic Multivariate Models with Multiple Structural Breaks

    OpenAIRE

    Sugita, Katsuhiro

    2006-01-01

    This paper considers a vector autoregressive model or a vector error correction model with multiple structural breaks in any subset of parameters, using a Bayesian approach with Markov chain Monte Carlo simulation technique. The number of structural breaks is determined as a sort of model selection by the posterior odds. For a cointegrated model, cointegrating rank is also allowed to change with breaks. Bayesian approach by Strachan (Journal of Business and Economic Statistics 21 (2003) 185) ...

  17. Dissecting magnetar variability with Bayesian hierarchical models

    CERN Document Server

    Huppenkothen, D; Hogg, D W; Murray, I; Frean, M; Elenbaas, C; Watts, A L; Levin, Y; van der Horst, A J; Kouveliotou, C

    2015-01-01

    Neutron stars are a prime laboratory for testing physical processes under conditions of strong gravity, high density, and extreme magnetic fields. Among the zoo of neutron star phenomena, magnetars stand out for their bursting behaviour, ranging from extremely bright, rare giant flares to numerous, less energetic recurrent bursts. The exact trigger and emission mechanisms for these bursts are not known; favoured models involve either a crust fracture and subsequent energy release into the magnetosphere, or explosive reconnection of magnetic field lines. In the absence of a predictive model, understanding the physical processes responsible for magnetar burst variability is difficult. Here, we develop an empirical model that decomposes magnetar bursts into a superposition of small spike-like features with a simple functional form, where the number of model components is itself part of the inference problem. The cascades of spikes that we model might be formed by avalanches of reconnection, or crust rupture afte...

  18. An imprecise Dirichlet model for Bayesian analysis of failure data including right-censored observations

    International Nuclear Information System (INIS)

    This paper is intended to make researchers in reliability theory aware of a recently introduced Bayesian model with imprecise prior distributions for statistical inference on failure data, that can also be considered as a robust Bayesian model. The model consists of a multinomial distribution with Dirichlet priors, making the approach basically nonparametric. New results for the model are presented, related to right-censored observations, where estimation based on this model is closely related to the product-limit estimator, which is an important statistical method to deal with reliability or survival data including right-censored observations. As for the product-limit estimator, the model considered in this paper aims at not using any information other than that provided by observed data, but our model fits into the robust Bayesian context which has the advantage that all inferences can be based on probabilities or expectations, or bounds for probabilities or expectations. The model uses a finite partition of the time-axis, and as such it is also related to life-tables
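
As a pocket illustration of the imprecise-Dirichlet idea (here in its binomial special case, Walley's imprecise beta model), the bounds below replace a single posterior probability with a lower-upper pair. The hyperparameter s and the counts are invented, and the paper's actual model additionally handles right-censoring over a partition of the time axis.

```python
# Imprecise beta/Dirichlet bounds: instead of one posterior probability,
# report lower and upper probabilities that an event occurs.
# s controls how cautious the bounds are; n1 = occurrences, n = observations.
def idm_bounds(n1: int, n: int, s: float = 2.0) -> tuple[float, float]:
    return n1 / (n + s), (n1 + s) / (n + s)

# Hypothetical example: 3 failures observed among 20 items in one time bin.
low, high = idm_bounds(n1=3, n=20)
print(f"P(failure in bin) in [{low:.3f}, {high:.3f}]")
```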

  19. Bayesian Inference on the Effect of Density Dependence and Weather on a Guanaco Population from Chile

    Science.gov (United States)

    Zubillaga, María; Skewes, Oscar; Soto, Nicolás; Rabinovich, Jorge E.; Colchero, Fernando

    2014-01-01

    Understanding the mechanisms that drive population dynamics is fundamental for management of wild populations. The guanaco (Lama guanicoe) is one of two wild camelid species in South America. We evaluated the effects of density dependence and weather variables on population regulation based on a time series of 36 years of population sampling of guanacos in Tierra del Fuego, Chile. The population density varied between 2.7 and 30.7 guanaco/km2, with an apparent monotonic growth during the first 25 years; however, in the last 10 years the population has shown large fluctuations, suggesting that it might have reached its carrying capacity. We used a Bayesian state-space framework and model selection to determine the effect of density and environmental variables on guanaco population dynamics. Our results show that the population is under density dependent regulation and that it is currently fluctuating around an average carrying capacity of 45,000 guanacos. We also found a significant positive effect of previous winter temperature while sheep density has a strong negative effect on the guanaco population growth. We conclude that there are significant density dependent processes and that climate as well as competition with domestic species have important effects determining the population size of guanacos, with important implications for management and conservation. PMID:25514510

  20. Bayesian inference of T Tauri star properties using multi-wavelength survey photometry

    CERN Document Server

    Barentsen, Geert; Drew, Janet E; Sale, Stuart E

    2012-01-01

    There are many pertinent open issues in the area of star and planet formation. Large statistical samples of young stars across star-forming regions are needed to trigger a breakthrough in our understanding, but most optical studies are based on a wide variety of spectrographs and analysis methods, which introduces large biases. Here we show how graphical Bayesian networks can be employed to construct a hierarchical probabilistic model which allows pre-main sequence ages, masses, accretion rates, and extinctions to be estimated using two widely available photometric survey databases (IPHAS r/i/Halpha and 2MASS J-band magnitudes.) Because our approach does not rely on spectroscopy, it can easily be applied to homogeneously study the large number of clusters for which Gaia will yield membership lists. We explain how the analysis is carried out using the Markov Chain Monte Carlo (MCMC) method and provide Python source code. We then demonstrate its use on 587 known low-mass members of the star-forming region NGC 2...

  1. Bayesian nonparametric inference on quantile residual life function: Application to breast cancer data.

    Science.gov (United States)

    Park, Taeyoung; Jeong, Jong-Hyeon; Lee, Jae Won

    2012-08-15

    There is often an interest in estimating a residual life function as a summary measure of survival data. For ease in presentation of the potential therapeutic effect of a new drug, investigators may summarize survival data in terms of the remaining life years of patients. Under heavy right censoring, however, some reasonably high quantiles (e.g., median) of a residual lifetime distribution cannot be always estimated via a popular nonparametric approach on the basis of the Kaplan-Meier estimator. To overcome the difficulties in dealing with heavily censored survival data, this paper develops a Bayesian nonparametric approach that takes advantage of a fully model-based but highly flexible probabilistic framework. We use a Dirichlet process mixture of Weibull distributions to avoid strong parametric assumptions on the unknown failure time distribution, making it possible to estimate any quantile residual life function under heavy censoring. Posterior computation through Markov chain Monte Carlo is straightforward and efficient because of conjugacy properties and partial collapse. We illustrate the proposed methods by using both simulated data and heavily censored survival data from a recent breast cancer clinical trial conducted by the National Surgical Adjuvant Breast and Bowel Project. PMID:22437758
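
As a small illustration of the quantity being estimated, the q-quantile residual life at time t0 solves S(t0 + r) = (1 - q) * S(t0). The sketch below evaluates it for a single Weibull component; the paper instead mixes many Weibulls under a Dirichlet process, so this is only the building block, with invented parameters.

```python
from scipy.stats import weibull_min

shape, scale = 1.5, 10.0
t0, q = 5.0, 0.5                  # median residual life at 5 years

S = lambda t: weibull_min.sf(t, shape, scale=scale)   # survival function
target = (1 - q) * S(t0)
# Invert the survival function to find t0 + r, then subtract t0.
r = weibull_min.isf(target, shape, scale=scale) - t0
print(f"median residual life at t0={t0}: {r:.2f}")
```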

  3. Bayesian Test of Significance for Conditional Independence: The Multinomial Model

    Science.gov (United States)

    de Morais Andrade, Pablo; Stern, Julio; de Bragança Pereira, Carlos

    2014-03-01

    Conditional independence tests (CI tests) have received special attention lately in Machine Learning and Computational Intelligence related literature as an important indicator of the relationship among the variables used by their models. In the field of Probabilistic Graphical Models (PGM)--which includes Bayesian Networks (BN) models--CI tests are especially important for the task of learning the PGM structure from data. In this paper, we propose the Full Bayesian Significance Test (FBST) for tests of conditional independence for discrete datasets. FBST is a powerful Bayesian test for precise hypotheses, as an alternative to frequentist significance tests (characterized by the calculation of the p-value).

  4. Probabilistic Inference in BN2T Models by Weighted Model Counting

    Czech Academy of Sciences Publication Activity Database

    Vomlel, Jiří; Tichavský, Petr

    Amsterdam: IOS Press, 2013 - (Jaeger, M.; Nielsen, T.; Viappiani, P.), pp. 275-284. (Frontiers in Artificial Intelligence and Applications). ISBN 978-1-61499-329-2. [The Scandinavian Conference on Artificial Intelligence (SCAI 2013), 12th. Aalborg (DK), 20.11.2013-22.11.2013] R&D Projects: GA ČR GA13-20012S; GA ČR GA102/09/1278 Institutional support: RVO:67985556 Keywords: Bayesian networks * Models of independence of causal influence * Noisy threshold models * Probabilistic inference * Weighted model counting * Arithmetic circuits Subject RIV: JD - Computer Applications, Robotics http://library.utia.cas.cz/separaty/2013/MTR/vomlel-0399130.pdf

  5. Inferring late-Holocene climate in the Ecuadorian Andes using a chironomid-based temperature inference model

    Science.gov (United States)

    Matthews-Bird, Frazer; Brooks, Stephen J.; Holden, Philip B.; Montoya, Encarni; Gosling, William D.

    2016-06-01

    Presented here is the first chironomid calibration data set for tropical South America. Surface sediments were collected from 59 lakes across Bolivia (15 lakes), Peru (32 lakes), and Ecuador (12 lakes) between 2004 and 2013, over an altitudinal gradient from 150 m above sea level (a.s.l.) to 4655 m a.s.l. and between 0-17° S and 64-78° W. The study sites cover a mean annual temperature (MAT) gradient of 25 °C. In total, 55 chironomid taxa were identified in the 59 calibration data set lakes. When used as a single explanatory variable, MAT explains 12.9 % of the variance (λ1/λ2 = 1.431). Two inference models were developed using weighted averaging (WA) and Bayesian methods. The best-performing model using conventional statistical methods was a WA (inverse) model (R2jack = 0.890; RMSEPjack = 2.404 °C, where RMSEP is the root mean squared error of prediction; mean biasjack = -0.017 °C; max biasjack = 4.665 °C). The Bayesian method produced a model with R2jack = 0.909, RMSEPjack = 2.373 °C, mean biasjack = 0.598 °C, and max biasjack = 3.158 °C. Both models were used to infer past temperatures from a ca. 3000-year record from the tropical Andes of Ecuador, Laguna Pindo. Inferred temperatures fluctuated around modern-day conditions but showed significant departures at certain intervals (ca. 1600 cal yr BP; ca. 3000-2500 cal yr BP). Both methods (WA and Bayesian) showed similar patterns of temperature variability; however, the magnitude of fluctuations differed. In general, the WA method was more variable and often underestimated Holocene temperatures (by ca. -7 ± 2.5 °C relative to the modern period). The Bayesian method provided temperature anomaly estimates for cool periods that lay within the expected range of the Holocene (ca. -3 ± 3.4 °C). The error associated with both reconstructions is consistent with a constant temperature of 20 °C for the past 3000 years. We would caution, however, against an over-interpretation at this stage. The reconstruction can only
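
    A minimal sketch of the weighted-averaging (WA) side of such a transfer function, with random toy data in place of the 59-lake, 55-taxon calibration set (the Bayesian model in the record is more involved); initial estimates are deshrunk by regressing them on the observed temperatures, as in WA (inverse) calibration:

        import numpy as np

        rng = np.random.default_rng(0)
        Y = rng.random((59, 55))        # lakes x taxa abundances (toy data)
        t = rng.uniform(5, 30, 59)      # observed mean annual temperature per lake

        # Taxon optima: abundance-weighted mean of lake temperatures.
        opt = (Y.T @ t) / Y.sum(axis=0)

        # Initial WA estimates per lake, then inverse deshrinking regression.
        t0 = (Y @ opt) / Y.sum(axis=1)
        slope, intercept = np.polyfit(t0, t, 1)

        def infer_temperature(counts):
            """Infer MAT for a fossil sample from its taxon counts."""
            raw = counts @ opt / counts.sum()
            return intercept + slope * raw

        print(infer_temperature(rng.random(55)))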

  6. Glutamatergic Model Psychoses: Prediction Error, Learning, and Inference

    Science.gov (United States)

    Corlett, Philip R; Honey, Garry D; Krystal, John H; Fletcher, Paul C

    2011-01-01

    Modulating glutamatergic neurotransmission induces alterations in conscious experience that mimic the symptoms of early psychotic illness. We review studies that use intravenous administration of ketamine, focusing on interindividual variability in the profundity of the ketamine experience. We consider this individual variability within a hypothetical model of brain and cognitive function centered upon learning and inference. Within this model, brains, neural systems, and even single neurons specify expectations about their inputs and respond to violations of those expectations with new learning that renders future inputs more predictable. We argue that ketamine temporarily deranges this ability by perturbing both the ways in which prior expectations are specified and the ways in which expectancy violations are signaled. We suggest that the former effect is predominantly mediated by NMDA blockade and the latter by augmented and inappropriate feedforward glutamatergic signaling. We suggest that the observed interindividual variability emerges from individual differences in the neural circuits that normally underpin the learning and inference processes described. The exact source of that variability is uncertain, although it is likely to arise not only from genetic variation but also from subjects' previous experiences and prior learning. Furthermore, we argue that chronic, unlike acute, NMDA blockade alters the specification of expectancies more profoundly and permanently. Scrutinizing individual differences in the effects of acute and chronic ketamine administration in the context of the Bayesian brain model may generate new insights about the symptoms of psychosis, their underlying cognitive processes, and neurocircuitry. PMID:20861831

  7. Bayesian methods for measures of agreement

    CERN Document Server

    Broemeling, Lyle D

    2009-01-01

    Using WinBUGS to implement Bayesian inferences of estimation and testing hypotheses, Bayesian Methods for Measures of Agreement presents useful methods for the design and analysis of agreement studies. It focuses on agreement among the various players in the diagnostic process. The author employs a Bayesian approach to provide statistical inferences based on various models of intra- and interrater agreement. He presents many examples that illustrate the Bayesian mode of reasoning and explains elements of a Bayesian application, including prior information, experimental information, the likelihood function, posterior distribution, and predictive distribution. The appendices provide the necessary theoretical foundation to understand Bayesian methods as well as introduce the fundamentals of programming and executing the WinBUGS software. Taking a Bayesian approach to inference, this hands-on book explores numerous measures of agreement, including the Kappa coefficient, the G coefficient, and intraclass correlation...
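
    As one hedged illustration of the kind of Bayesian agreement analysis the book covers (the book itself works in WinBUGS), the posterior of Cohen's kappa for a 2x2 rater table can be simulated directly from a Dirichlet posterior over the cell probabilities; the counts below are invented:

        import numpy as np

        rng = np.random.default_rng(0)

        # 2x2 agreement table between two raters (toy counts).
        n = np.array([[40, 5],
                      [8, 47]])

        # Dirichlet posterior over the four cell probabilities (uniform prior).
        draws = rng.dirichlet(n.flatten() + 1, size=10000).reshape(-1, 2, 2)

        po = draws[:, 0, 0] + draws[:, 1, 1]                      # observed agreement
        pe = (draws.sum(axis=2) * draws.sum(axis=1)).sum(axis=1)  # chance agreement
        kappa = (po - pe) / (1 - pe)

        print(kappa.mean(), np.percentile(kappa, [2.5, 97.5]))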

  8. Bayesian inference of nanoparticle-broadened x-ray line profiles

    CERN Document Server

    Armstrong, N; Cline, J P; Bonevich, J

    2003-01-01

    A single and self-contained method for determining the crystallite-size distribution and shape from experimental x-ray line profile data is presented. We show that the crystallite-size distribution can be determined without assuming a functional form for the size distribution, by applying the Bayesian/MaxEnt method, which makes the fewest assumptions. The Bayesian/MaxEnt method is tested using both simulated and experimental CeO2 data. The results demonstrate that the proposed method can determine size distributions while making the fewest assumptions. The comparison of the Bayesian/MaxEnt results from experimental CeO2 with TEM results is favorable.

  9. Two-Stage Bayesian Model Averaging in Endogenous Variable Models.

    Science.gov (United States)

    Lenkoski, Alex; Eicher, Theo S; Raftery, Adrian E

    2014-01-01

    Economic modeling in the presence of endogeneity is subject to model uncertainty at both the instrument and covariate level. We propose a Two-Stage Bayesian Model Averaging (2SBMA) methodology that extends the Two-Stage Least Squares (2SLS) estimator. By constructing a Two-Stage Unit Information Prior in the endogenous variable model, we are able to efficiently combine established methods for addressing model uncertainty in regression models with the classic technique of 2SLS. To assess the validity of instruments in the 2SBMA context, we develop Bayesian tests of the identification restriction that are based on model averaged posterior predictive p-values. A simulation study showed that 2SBMA has the ability to recover structure in both the instrument and covariate set, and substantially improves the sharpness of resulting coefficient estimates in comparison to 2SLS using the full specification in an automatic fashion. Due to the increased parsimony of the 2SBMA estimate, the Bayesian Sargan test had a power of 50 percent in detecting a violation of the exogeneity assumption, while the method based on 2SLS using the full specification had negligible power. We apply our approach to the problem of development accounting, and find support not only for institutions, but also for geography and integration as development determinants, once both model uncertainty and endogeneity have been jointly addressed. PMID:24223471

  10. Bayesian methods for proteomic biomarker development

    Directory of Open Access Journals (Sweden)

    Belinda Hernández

    2015-12-01

    In this review we provide an introduction to Bayesian inference and demonstrate some of the advantages of using a Bayesian framework. We summarize how Bayesian methods have been used previously in proteomics and other areas of bioinformatics. Finally, we describe some popular and emerging Bayesian models from the statistical literature and provide a worked tutorial including code snippets to show how these methods may be applied for the evaluation of proteomic biomarkers.

  11. Bayesian hierarchical models suggest oldest known plant-visiting bat was omnivorous.

    Science.gov (United States)

    Yohe, Laurel R; Velazco, Paúl M; Rojas, Danny; Gerstner, Beth E; Simmons, Nancy B; Dávalos, Liliana M

    2015-11-01

    The earliest record of plant visiting in bats dates to the Middle Miocene of La Venta, the world's most diverse tropical palaeocommunity. Palynephyllum antimaster is known from molars that indicate nectarivory. Skull length, an important indicator of key traits such as body size, bite force and trophic specialization, remains unknown. We developed Bayesian models to infer skull length based on dental measurements. These models account for variation within and between species, variation between clades, and phylogenetic error structure. Models relating skull length to trophic level for nectarivorous bats were then used to infer the diet of the fossil. The skull length estimate for Palynephyllum places it among the larger lonchophylline bats. The inferred diet suggests Palynephyllum fed on nectar and insects, similar to its living relatives. Omnivory has persisted since the mid-Miocene. This is the first study to corroborate with fossil data that highly specialized nectarivory in bats requires an omnivorous transition. PMID:26559512

  12. Statistical inference: an integrated Bayesian/likelihood approach

    CERN Document Server

    Aitkin, Murray

    2010-01-01

    Filling a gap in current Bayesian theory, Statistical Inference: An Integrated Bayesian/Likelihood Approach presents a unified Bayesian treatment of parameter inference and model comparisons that can be used with simple diffuse prior specifications. This novel approach provides new solutions to difficult model comparison problems and offers direct Bayesian counterparts of frequentist t-tests and other standard statistical methods for hypothesis testing.After an overview of the competing theories of statistical inference, the book introduces the Bayes/likelihood approach used throughout. It pre

  13. Dissecting Magnetar Variability with Bayesian Hierarchical Models

    Science.gov (United States)

    Huppenkothen, Daniela; Brewer, Brendon J.; Hogg, David W.; Murray, Iain; Frean, Marcus; Elenbaas, Chris; Watts, Anna L.; Levin, Yuri; van der Horst, Alexander J.; Kouveliotou, Chryssa

    2015-09-01

    Neutron stars are a prime laboratory for testing physical processes under conditions of strong gravity, high density, and extreme magnetic fields. Among the zoo of neutron star phenomena, magnetars stand out for their bursting behavior, ranging from extremely bright, rare giant flares to numerous, less energetic recurrent bursts. The exact trigger and emission mechanisms for these bursts are not known; favored models involve either a crust fracture and subsequent energy release into the magnetosphere, or explosive reconnection of magnetic field lines. In the absence of a predictive model, understanding the physical processes responsible for magnetar burst variability is difficult. Here, we develop an empirical model that decomposes magnetar bursts into a superposition of small spike-like features with a simple functional form, where the number of model components is itself part of the inference problem. The cascades of spikes that we model might be formed by avalanches of reconnection, or crust rupture aftershocks. Using Markov Chain Monte Carlo sampling augmented with reversible jumps between models with different numbers of parameters, we characterize the posterior distributions of the model parameters and the number of components per burst. We relate these model parameters to physical quantities in the system, and show for the first time that the variability within a burst does not conform to predictions from ideas of self-organized criticality. We also examine how well the properties of the spikes fit the predictions of simplified cascade models for the different trigger mechanisms.

  14. Sampling Techniques in Bayesian Finite Element Model Updating

    CERN Document Server

    Boulkaibet, I; Mthembu, L; Friswell, M I; Adhikari, S

    2011-01-01

    Recent papers in the field of Finite Element Model (FEM) updating have highlighted the benefits of Bayesian techniques. The Bayesian approaches are designed to deal with the uncertainties associated with complex systems, which is the main problem in the development and updating of FEMs. This paper highlights the complexities and challenges of implementing any Bayesian method when the analysis involves a complicated structural dynamic model. In such systems the Bayesian formulation might not be available in analytic form, which leads to the use of numerical methods, i.e. sampling methods. The main challenge then is to determine an efficient sampling of the model parameter space. In this paper, three sampling techniques, the Metropolis-Hastings (MH) algorithm, Slice Sampling and the Hybrid Monte Carlo (HMC) technique, are tested by updating a structural beam model. The efficiency and limitations of each technique are investigated when the FEM updating problem is implemented using the Bayesi...
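
    A minimal sketch of Metropolis-Hastings for FEM updating in the spirit of the paper, with a single stiffness-scaling parameter and invented frequencies in place of a real beam model:

        import numpy as np

        rng = np.random.default_rng(2)

        f_meas = np.array([10.2, 63.4, 177.9])   # "measured" natural frequencies (Hz)
        sigma = 0.5                              # measurement noise std (Hz)
        f_nominal = np.array([10.0, 62.7, 175.5])

        def model_freqs(e_scale):
            # Natural frequencies scale with sqrt(stiffness) for a linear structure.
            return f_nominal * np.sqrt(e_scale)

        def log_post(e_scale):
            if e_scale <= 0:
                return -np.inf
            resid = f_meas - model_freqs(e_scale)
            # Gaussian likelihood plus a N(1, 0.2^2) prior on the stiffness scale.
            return -0.5 * np.sum(resid**2) / sigma**2 - 0.5 * (e_scale - 1.0)**2 / 0.04

        theta, chain = 1.0, []
        for _ in range(10000):                   # random-walk Metropolis-Hastings
            prop = theta + 0.02 * rng.standard_normal()
            if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
                theta = prop
            chain.append(theta)
        print(np.mean(chain[2000:]))             # posterior mean stiffness scaling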

  15. Bayesian Proteoform Modeling Improves Protein Quantification of Global Proteomic Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Datta, Susmita; Payne, Samuel H.; Kang, Jiyun; Bramer, Lisa M.; Nicora, Carrie D.; Shukla, Anil K.; Metz, Thomas O.; Rodland, Karin D.; Smith, Richard D.; Tardiff, Mark F.; McDermott, Jason E.; Pounds, Joel G.; Waters, Katrina M.

    2014-12-01

    As the capability of mass spectrometry-based proteomics has matured, tens of thousands of peptides can be measured simultaneously, which has the benefit of offering a systems view of protein expression. However, a major challenge is that with an increase in throughput, protein quantification estimation from the native measured peptides has become a computational task. A limitation of existing computationally-driven protein quantification methods is that most ignore protein variation, such as alternate splicing of the RNA transcript and post-translational modifications or other possible proteoforms, which will affect a significant fraction of the proteome. The consequence of this assumption is that statistical inference at the protein level, and consequently downstream analyses, such as network and pathway modeling, have only limited power for biomarker discovery. Here, we describe a Bayesian model (BP-Quant) that uses statistically derived peptide signatures to identify peptides that are outside the dominant pattern, or the existence of multiple over-expressed patterns, to improve relative protein abundance estimates. It is a research-driven approach that utilizes the objectives of the experiment, defined in the context of a standard statistical hypothesis, to identify a set of peptides exhibiting similar statistical behavior relating to a protein. This approach infers that changes in relative protein abundance can be used as a surrogate for changes in function, without necessarily taking into account the effect of differential post-translational modifications, processing, or splicing in altering protein function. We verify the approach using a dilution study from mouse plasma samples and demonstrate that BP-Quant achieves similar accuracy as the current state-of-the-art methods at proteoform identification with significantly better specificity. BP-Quant is available as MATLAB® and R packages at https://github.com/PNNL-Comp-Mass-Spec/BP-Quant.

  16. Sequential Inference for Latent Force Models

    CERN Document Server

    Hartikainen, Jouni

    2012-01-01

    Latent force models (LFMs) are hybrid models combining mechanistic principles with non-parametric components. In this article, we show how LFMs can be equivalently formulated and solved using the state variable approach. We also show how the Gaussian process prior used in LFMs can be equivalently formulated as a linear state-space model driven by a white noise process, and how inference on the resulting model can be efficiently implemented using a Kalman filter and smoother. We then show how the recently proposed switching LFM can be reformulated using the state variable approach, and how we can construct a probabilistic model for the switches by formulating a similar switching LFM as a switching linear dynamic system (SLDS). We illustrate the performance of the proposed methodology in simulated scenarios and apply it to inferring the switching points in GPS data collected from cars moving in an urban environment.
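
    A minimal sketch of the state-space formulation: a latent force modelled as a linear SDE driven by white noise and recovered with a standard Kalman filter; the matrices and noise levels here are illustrative, not those of the paper:

        import numpy as np

        rng = np.random.default_rng(3)
        dt, n = 0.1, 200

        # State = [output x, latent force f]; f decays like an OU process.
        A = np.array([[1.0, dt], [0.0, 1.0 - 0.5 * dt]])
        Q = np.diag([1e-6, 0.01 * dt])     # process noise covariance
        H = np.array([[1.0, 0.0]])         # we observe x only
        R = np.array([[0.05]])

        # Simulate data, then run the Kalman filter to infer the latent force.
        xs, y = np.zeros((n, 2)), np.zeros(n)
        for k in range(1, n):
            xs[k] = A @ xs[k-1] + rng.multivariate_normal([0, 0], Q)
            y[k] = xs[k, 0] + rng.normal(0, np.sqrt(R[0, 0]))

        m, P, force_est = np.zeros(2), np.eye(2), []
        for k in range(n):
            m, P = A @ m, A @ P @ A.T + Q                  # predict
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)                 # update
            m = m + (K @ (y[k] - H @ m)).ravel()
            P = P - K @ S @ K.T
            force_est.append(m[1])
        print(force_est[-5:])   # filtered estimates of the latent force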

  17. Inference of random walk models to describe leukocyte migration

    Science.gov (United States)

    Jones, Phoebe J. M.; Sim, Aaron; Taylor, Harriet B.; Bugeon, Laurence; Dallman, Magaret J.; Pereira, Bernard; Stumpf, Michael P. H.; Liepe, Juliane

    2015-12-01

    While the majority of cells in an organism are static and remain relatively immobile in their tissue, migrating cells occur commonly during developmental processes and are crucial for a functioning immune response. The mode of migration has been described in terms of various types of random walks. To understand the details of the migratory behaviour we rely on mathematical models and their calibration to experimental data. Here we propose an approximate Bayesian inference scheme to calibrate a class of random walk models, characterized by a specific, parametric particle re-orientation mechanism, to observed trajectory data. We elaborate the concept of transition matrices (TMs) to detect random walk patterns and determine a statistic to quantify these TMs, making them applicable to inference schemes. We apply the developed pipeline to in vivo trajectory data of macrophages and neutrophils, extracted from zebrafish that had undergone tail transection. We find that macrophages and neutrophils exhibit very distinct biased persistent random walk patterns, where the strengths of the persistence and bias are spatio-temporally regulated. Furthermore, the movement of macrophages is far less persistent than that of neutrophils in response to wounding.
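
    A minimal sketch of approximate Bayesian computation in this setting, using plain ABC rejection (the paper's scheme is more sophisticated) with an invented persistence parameter and summary statistic:

        import numpy as np

        rng = np.random.default_rng(4)

        def simulate_walk(persistence, n=500):
            # Correlated random walk: turning angles shrink as persistence grows.
            angles = np.cumsum(rng.normal(0, (1 - persistence) * np.pi, n))
            steps = np.column_stack([np.cos(angles), np.sin(angles)])
            return np.cumsum(steps, axis=0)

        def summary(track):
            # Net-to-gross displacement ratio: higher for more persistent walks.
            return np.linalg.norm(track[-1] - track[0]) / len(track)

        observed = summary(simulate_walk(0.8))   # stand-in for a real trajectory

        # ABC rejection: keep prior draws whose simulated summary is close to data.
        accepted = []
        for _ in range(5000):
            p = rng.uniform(0, 1)                # uniform prior on persistence
            if abs(summary(simulate_walk(p)) - observed) < 0.02:
                accepted.append(p)
        print(np.mean(accepted), len(accepted))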

  18. Modelling biogeochemical cycles in forest ecosystems: a Bayesian approach

    OpenAIRE

    Bagnara, Maurizio

    2015-01-01

    Forest models are tools for explaining and predicting the dynamics of forest ecosystems. They simulate forest behavior by integrating information on the underlying processes in trees, soil and atmosphere. Bayesian calibration is the application of probability theory to parameter estimation. It is a method, applicable to all models, that quantifies output uncertainty and identifies key parameters and variables. This study aims at testing the Bayesian procedure for calibration to different t...

  19. Bayesian modelling of the emission spectrum of the Joint European Torus Lithium Beam Emission Spectroscopy system.

    Science.gov (United States)

    Kwak, Sehyun; Svensson, J; Brix, M; Ghim, Y-C

    2016-02-01

    A Bayesian model of the emission spectrum of the JET lithium beam has been developed to infer the intensity of the Li I (2p-2s) line radiation and associated uncertainties. The detected spectrum for each channel of the lithium beam emission spectroscopy system is here modelled by a single Li line modified by an instrumental function, Bremsstrahlung background, instrumental offset, and interference filter curve. Both the instrumental function and the interference filter curve are modelled with non-parametric Gaussian processes. All free parameters of the model, the intensities of the Li line, Bremsstrahlung background, and instrumental offset, are inferred using Bayesian probability theory with a Gaussian likelihood for photon statistics and electronic background noise. The prior distributions of the free parameters are chosen as Gaussians. Given these assumptions, the intensity of the Li line and corresponding uncertainties are analytically available using a Bayesian linear inversion technique. The proposed approach makes it possible to extract the intensity of Li line without doing a separate background subtraction through modulation of the Li beam. PMID:26931843

  20. Technical Note: Probabilistically constraining proxy age–depth models within a Bayesian hierarchical reconstruction model

    Directory of Open Access Journals (Sweden)

    J. P. Werner

    2015-03-01

    Reconstructions of the late-Holocene climate rely heavily upon proxies that are assumed to be accurately dated by layer counting, such as measurements of tree rings, ice cores, and varved lake sediments. Considerable advances could be achieved if time-uncertain proxies were able to be included within these multiproxy reconstructions, and if time uncertainties were recognized and correctly modeled for proxies commonly treated as free of age model errors. Current approaches for accounting for time uncertainty are generally limited to repeating the reconstruction using each one of an ensemble of age models, thereby inflating the final estimated uncertainty – in effect, each possible age model is given equal weighting. Uncertainties can be reduced by exploiting the inferred space–time covariance structure of the climate to re-weight the possible age models. Here, we demonstrate how Bayesian hierarchical climate reconstruction models can be augmented to account for time-uncertain proxies. Critically, although a priori all age models are given equal probability of being correct, the probabilities associated with the age models are formally updated within the Bayesian framework, thereby reducing uncertainties. Numerical experiments show that updating the age model probabilities decreases uncertainty in the resulting reconstructions, as compared with the current de facto standard of sampling over all age models, provided there is sufficient information from other data sources in the spatial region of the time-uncertain proxy. This approach can readily be generalized to non-layer-counted proxies, such as those derived from marine sediments.

  1. A Bayesian Belief Network to Infer Incentive Mechanisms to Reduce Antibiotic Use in Livestock Production

    OpenAIRE

    Ge, L.; Asseldonk, van, N.; Valeeva, N.I.; Hennen, W.H.G.J.; Bergevoet, R.H.M.

    2011-01-01

    Efficient policy intervention to reduce antibiotic use in livestock production requires knowledge about the rationale underlying antibiotic usage. Animal health status and management quality are considered the two most important factors that influence farmers' decision-making concerning antibiotic use. Information on these two factors is therefore crucial in designing incentive mechanisms. In this paper, a Bayesian belief network (BBN) is built to represent the knowledge on how these factor...

  2. A Bayesian observer model constrained by efficient coding can explain 'anti-Bayesian' percepts.

    Science.gov (United States)

    Wei, Xue-Xin; Stocker, Alan A

    2015-10-01

    Bayesian observer models provide a principled account of the fact that our perception of the world rarely matches physical reality. The standard explanation is that our percepts are biased toward our prior beliefs. However, reported psychophysical data suggest that this view may be simplistic. We propose a new model formulation based on efficient coding that is fully specified for any given natural stimulus distribution. The model makes two new and seemingly anti-Bayesian predictions. First, it predicts that perception is often biased away from an observer's prior beliefs. Second, it predicts that stimulus uncertainty differentially affects perceptual bias depending on whether the uncertainty is induced by internal or external noise. We found that both model predictions match reported perceptual biases in perceived visual orientation and spatial frequency, and were able to explain data that have not been explained before. The model is general and should prove applicable to other perceptual variables and tasks. PMID:26343249

  3. Equifinality of formal (DREAM) and informal (GLUE) Bayesian approaches in hydrologic modeling?

    Energy Technology Data Exchange (ETDEWEB)

    Vrugt, Jasper A [Los Alamos National Laboratory; Robinson, Bruce A [Los Alamos National Laboratory; Ter Braak, Cajo J F [NON LANL; Gupta, Hoshin V [NON LANL

    2008-01-01

    In recent years, a strong debate has emerged in the hydrologic literature regarding what constitutes an appropriate framework for uncertainty estimation. In particular, there is strong disagreement about whether an uncertainty framework should have its roots within a proper statistical (Bayesian) context, or whether such a framework should be based on a different philosophy and implement informal measures and weaker inference to summarize parameter and predictive distributions. In this paper, we compare a formal Bayesian approach using Markov Chain Monte Carlo (MCMC) with generalized likelihood uncertainty estimation (GLUE) for assessing uncertainty in conceptual watershed modeling. Our formal Bayesian approach is implemented using the recently developed differential evolution adaptive metropolis (DREAM) MCMC scheme with a likelihood function that explicitly considers model structural, input and parameter uncertainty. Our results demonstrate that DREAM and GLUE can generate very similar estimates of total streamflow uncertainty. This suggests that formal and informal Bayesian approaches have more common ground than the hydrologic literature and ongoing debate might suggest. The main advantage of formal approaches is, however, that they attempt to disentangle the effect of forcing, parameter and model structural error on total predictive uncertainty. This is key to improving hydrologic theory and to better understanding and predicting the flow of water through catchments.

  4. Improved testing inference in mixed linear models

    CERN Document Server

    Melo, Tatiane F N; Cribari-Neto, Francisco; 10.1016/j.csda.2008.12.007

    2011-01-01

    Mixed linear models are commonly used in repeated measures studies. They account for the dependence amongst observations obtained from the same experimental unit. Oftentimes, the number of observations is small, and it is thus important to use inference strategies that incorporate small sample corrections. In this paper, we develop modified versions of the likelihood ratio test for fixed effects inference in mixed linear models. In particular, we derive a Bartlett correction to such a test and also to a test obtained from a modified profile likelihood function. Our results generalize those in Zucker et al. (Journal of the Royal Statistical Society B, 2000, 62, 827-838) by allowing the parameter of interest to be vector-valued. Additionally, our Bartlett corrections allow for random effects nonlinear covariance matrix structure. We report numerical evidence which shows that the proposed tests display superior finite sample behavior relative to the standard likelihood ratio test. An application is also presente...

  5. Bayesian model discrimination for glucose-insulin homeostasis

    DEFF Research Database (Denmark)

    Andersen, Kim Emil; Brooks, Stephen P.; Højbjerre, Malene

    In this paper we analyse a set of experimental data on a number of healthy and diabetic patients and discuss a variety of models for describing the physiological processes involved in glucose absorption and insulin secretion within the human body. We adopt a Bayesian approach which facilitates the reformulation of existing deterministic models as stochastic state-space models which properly account for both measurement and process variability. The analysis is further enhanced by Bayesian model discrimination techniques and model-averaged parameter estimation which fully accounts for model as well...

  6. Prediction of trajectory based on modified Bayesian inference

    Institute of Scientific and Technical Information of China (English)

    李万高; 赵雪梅; 孙德厂

    2013-01-01

    The existing algorithms for trajectory prediction have very low prediction accuracy when only a limited number of trajectories are available. To address this problem, the Modified Bayesian Inference (MBI) approach is proposed, which constructs a Markov model to quantify the correlation between adjacent locations. MBI decomposes historical trajectories into sub-trajectories to obtain a more precise Markov model, from which the probability formula of Bayesian inference is derived. Experimental results based on real datasets show that the MBI approach is two to three times faster than existing algorithms, with higher prediction accuracy and stability. MBI makes full use of the available trajectories, improving both query efficiency and prediction accuracy.
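
    A minimal sketch of the Markov-model core of such an approach: transition counts from (sub-)trajectories, smoothed with a uniform Dirichlet/Laplace prior so that Bayesian prediction remains well-defined when data are sparse; the locations and trajectories here are invented:

        import numpy as np

        locations = ["A", "B", "C", "D"]
        idx = {s: i for i, s in enumerate(locations)}

        # Historical trajectories, decomposed into location pairs (sub-trajectories).
        trajectories = [["A", "B", "C"], ["A", "B", "D"], ["B", "C", "D"], ["A", "B", "C"]]

        counts = np.zeros((4, 4))
        for traj in trajectories:
            for a, b in zip(traj, traj[1:]):
                counts[idx[a], idx[b]] += 1

        # Laplace smoothing: a uniform Dirichlet prior over next locations keeps
        # the transition probabilities well-defined even with few trajectories.
        P = (counts + 1) / (counts + 1).sum(axis=1, keepdims=True)

        current = "B"
        print(locations[np.argmax(P[idx[current]])])  # most probable next location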

  7. Using consensus bayesian network to model the reactive oxygen species regulatory pathway.

    Directory of Open Access Journals (Sweden)

    Liangdong Hu

    The Bayesian network is one of the most successful graph models for representing the reactive oxygen species regulatory pathway. With the increasing number of microarray measurements, it is possible to construct Bayesian networks from microarray data directly. Although a large number of Bayesian network learning algorithms have been developed, their accuracy is low when applied to microarray data, because the databases used to learn the networks contain too few measurements. In this paper, we propose a consensus Bayesian network, constructed by combining Bayesian networks from the relevant literature with Bayesian networks learned from microarray data; it achieves a higher accuracy than networks learned from a single database. In the experiments, we validated the Bayesian network combination algorithm on several classic machine learning databases and used the consensus Bayesian network to model the Escherichia coli ROS pathway.

  8. Modelling the presence of disease under spatial misalignment using Bayesian latent Gaussian models.

    Science.gov (United States)

    Barber, Xavier; Conesa, David; Lladosa, Silvia; López-Quílez, Antonio

    2016-01-01

    Modelling the spatial incidence of disease using local environmental factors has been a growing research area in the last few years. Geostatistical models have become popular lately because they allow estimating and predicting the underlying disease risk and relating it to possible risk factors. Our approach to these models is based on the fact that the presence/absence of a disease can be expressed with a hierarchical Bayesian spatial model that incorporates the information provided by the geographical and environmental characteristics of the region of interest. Nevertheless, our main interest here is to tackle the misalignment problem that arises when the information about possible covariates is partially (or totally) available at locations different from the observed locations and from those at which we want to predict. As a result, we present two different models depending on whether or not there is uncertainty in the covariates. In both cases, Bayesian inference on the parameters and prediction of presence/absence at new locations are performed by considering the model as a latent Gaussian model, which allows the use of the integrated nested Laplace approximation. In particular, the spatial effect is implemented with the stochastic partial differential equation approach. The methodology is evaluated on the presence of Fasciola hepatica in Galicia, a north-west region of Spain. PMID:27087038

  9. Modelling Transcriptional Regulation with a Mixture of Factor Analyzers and Variational Bayesian Expectation Maximization

    Directory of Open Access Journals (Sweden)

    Kuang Lin

    2009-01-01

    Understanding the mechanisms of gene transcriptional regulation through analysis of high-throughput postgenomic data is one of the central problems of computational systems biology. Various approaches have been proposed, but most of them fail to address at least one of the following objectives: (1) allow for the fact that transcription factors are potentially subject to posttranscriptional regulation; (2) allow for the fact that transcription factors cooperate as a functional complex in regulating gene expression; and (3) provide a model and a learning algorithm with manageable computational complexity. The objective of the present study is to propose and test a method that addresses these three issues. The model we employ is a mixture of factor analyzers, in which the latent variables correspond to different transcription factors, grouped into complexes or modules. We pursue inference in a Bayesian framework, using the Variational Bayesian Expectation Maximization (VBEM) algorithm for approximate inference of the posterior distributions of the model parameters, and estimation of a lower bound on the marginal likelihood for model selection. We have evaluated the performance of the proposed method on three criteria: activity profile reconstruction, gene clustering, and network inference.

  10. Bayesian Inference of a Finite Mixture of Inverse Weibull Distributions with an Application to Doubly Censored Data

    Directory of Open Access Journals (Sweden)

    Navid Feroze

    2016-03-01

    The families of mixture distributions have a wide range of applications in fields such as fisheries, agriculture, botany, economics, medicine, psychology, electrophoresis, finance, communication theory, geology and zoology. They provide the necessary flexibility to model failure distributions of components with multiple failure modes. The Bayesian procedure for estimating the parameters of mixture models has mostly been described under Type-I censoring; in particular, Bayesian analysis of mixture models under doubly censored samples has not yet been considered in the literature. The main objective of this paper is to develop Bayes estimation of inverse Weibull mixture distributions under double censoring. The posterior estimation has been conducted under the assumption of gamma and inverse Lévy priors, using the precautionary loss function and the weighted squared error loss function. Comparisons among the different estimators are based on analysis of simulated and real-life data sets.

  11. Bayesian state space models for dynamic genetic network construction across multiple tissues.

    Science.gov (United States)

    Liang, Yulan; Kelemen, Arpad

    2016-08-01

    Construction of gene-gene interaction networks and potential pathways is a challenging and important problem in genomic research for complex diseases, and estimating the dynamic changes of temporal correlations and non-stationarity is key to this process. In this paper, we develop dynamic state-space models with hierarchical Bayesian settings to tackle this challenge and infer the dynamic profiles and genetic networks associated with disease treatments. We treat both the stochastic transition matrix and the observation matrix as time-variant, and include temporal correlation structures in the covariance matrix estimates of the multivariate Bayesian state-space models. The unevenly spaced short time courses with unseen time points are treated as hidden state variables. Hierarchical Bayesian approaches with various prior and hyper-prior models, with Markov chain Monte Carlo and Gibbs sampling algorithms, are used to estimate the model parameters and the hidden state variables. We apply the proposed hierarchical Bayesian state-space models to time course Affymetrix data sets from multiple tissues (liver, skeletal muscle, and kidney) following corticosteroid (CS) drug administration. Both simulation and real data analysis results show that the genomic changes over time and the gene-gene interactions in response to CS treatment are well captured by the proposed models. The proposed dynamic hierarchical Bayesian state-space modelling approaches could be expanded and applied to other large-scale genomic data, such as next-generation sequencing (NGS) data combined with real-time and time-varying electronic health records (EHR), for more comprehensive and robust systematic, network-based analyses that can help transform big biomedical data into predictions and diagnostics for precision medicine and personalized healthcare, with better decision making and patient outcomes. PMID:27343475

  12. Topics in Bayesian statistics and maximum entropy

    International Nuclear Information System (INIS)

    Notions of Bayesian decision theory and maximum entropy methods are reviewed with particular emphasis on probabilistic inference and Bayesian modeling. The axiomatic approach is considered as the best justification of Bayesian analysis and maximum entropy principle applied in natural sciences. Particular emphasis is put on solving the inverse problem in digital image restoration and Bayesian modeling of neural networks. Further topics addressed briefly include language modeling, neutron scattering, multiuser detection and channel equalization in digital communications, genetic information, and Bayesian court decision-making. (author)

  13. Bayesian approach to color-difference models based on threshold and constant-stimuli methods.

    Science.gov (United States)

    Brusola, Fernando; Tortajada, Ignacio; Lengua, Ismael; Jordá, Begoña; Peris, Guillermo

    2015-06-15

    An alternative approach based on statistical Bayesian inference is presented to deal with the development of color-difference models and the precision of parameter estimation. The approach was applied to simulated data and real data, the latter published by selected authors involved with the development of color-difference formulae using traditional methods. Our results show very good agreement between the Bayesian and classical approaches. Among other benefits, our proposed methodology allows one to determine the marginal posterior distribution of each random individual parameter of the color-difference model. In this manner, it is possible to analyze the effect of individual parameters on the statistical significance calculation of a color-difference equation. PMID:26193510

  14. A BAYESIAN ABDUCTION MODEL FOR EXTRACTING THE MOST PROBABLE EVIDENCE TO SUPPORT SENSEMAKING

    Directory of Open Access Journals (Sweden)

    Paul Munya

    2015-01-01

    In this paper, we discuss the development of a Bayesian Abduction Model of Sensemaking Support (BAMSS) as a tool for information fusion to support prospective sensemaking. Currently, BAMSS can identify the Most Probable Explanation from a Bayesian Belief Network (BBN) and extract the prevalent conditional probability values to help sensemaking analysts understand the cause-effect of adversary information. Actual vignettes from databases of modern insurgencies and asymmetric warfare are used to validate the performance of BAMSS. BAMSS computes the posterior probability of the network edges and performs information fusion using a clustering algorithm. In the model, the friendly force commander uses the adversary information to prospectively make sense of the enemy's intent. Sensitivity analyses were used to confirm the robustness of BAMSS in generating the Most Probable Explanations from a BBN through abductive inference. The simulation results demonstrate the utility of BAMSS as a computational tool to support sensemaking.
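
    A minimal sketch of extracting a Most Probable Explanation by enumeration over a toy belief network; the variables and probabilities are invented, and BAMSS itself works on much larger BBNs with clustering:

        # Toy two-edge belief network: Intent -> Attack, Intent -> Movement.
        p_intent = {"hostile": 0.3, "benign": 0.7}
        p_attack = {("hostile", True): 0.8, ("hostile", False): 0.2,
                    ("benign", True): 0.1, ("benign", False): 0.9}
        p_move = {("hostile", True): 0.7, ("hostile", False): 0.3,
                  ("benign", True): 0.4, ("benign", False): 0.6}

        def joint(intent, attack, move):
            # Joint probability factorizes along the network edges.
            return p_intent[intent] * p_attack[(intent, attack)] * p_move[(intent, move)]

        # Most Probable Explanation given observed evidence (attack and movement seen):
        evidence = {"attack": True, "move": True}
        best = max(p_intent, key=lambda i: joint(i, evidence["attack"], evidence["move"]))
        print(best)  # abductively inferred intent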

  15. Parameter inference and model selection in signaling pathway models

    OpenAIRE

    Toni, Tina; Stumpf, Michael P. H.

    2009-01-01

    To support and guide an extensive experimental research into systems biology of signaling pathways, increasingly more mechanistic models are being developed with hopes of gaining further insight into biological processes. In order to analyse these models, computational and statistical techniques are needed to estimate the unknown kinetic parameters. This chapter reviews methods from frequentist and Bayesian statistics for estimation of parameters and for choosing which model is best for model...

  16. Inferring brain-computational mechanisms with models of activity measurements.

    Science.gov (United States)

    Kriegeskorte, Nikolaus; Diedrichsen, Jörn

    2016-10-01

    High-resolution functional imaging is providing increasingly rich measurements of brain activity in animals and humans. A major challenge is to leverage such data to gain insight into the brain's computational mechanisms. The first step is to define candidate brain-computational models (BCMs) that can perform the behavioural task in question. We would then like to infer which of the candidate BCMs best accounts for measured brain-activity data. Here we describe a method that complements each BCM by a measurement model (MM), which simulates the way the brain-activity measurements reflect neuronal activity (e.g. local averaging in functional magnetic resonance imaging (fMRI) voxels or sparse sampling in array recordings). The resulting generative model (BCM-MM) produces simulated measurements. To avoid having to fit the MM to predict each individual measurement channel of the brain-activity data, we compare the measured and predicted data at the level of summary statistics. We describe a novel particular implementation of this approach, called probabilistic representational similarity analysis (pRSA) with MMs, which uses representational dissimilarity matrices (RDMs) as the summary statistics. We validate this method by simulations of fMRI measurements (locally averaging voxels) based on a deep convolutional neural network for visual object recognition. Results indicate that the way the measurements sample the activity patterns strongly affects the apparent representational dissimilarities. However, modelling of the measurement process can account for these effects, and different BCMs remain distinguishable even under substantial noise. The pRSA method enables us to perform Bayesian inference on the set of BCMs and to recognize the data-generating model in each case.This article is part of the themed issue 'Interpreting BOLD: a dialogue between cognitive and cellular neuroscience'. PMID:27574316
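
    A minimal sketch of the RDM summary statistic at the heart of pRSA, computed as the correlation distance between simulated condition patterns; the sizes and data are illustrative:

        import numpy as np
        from scipy.spatial.distance import pdist, squareform

        rng = np.random.default_rng(5)

        # Simulated activity patterns: 8 stimulus conditions x 100 measurement
        # channels (e.g. locally averaged fMRI voxels).
        patterns = rng.standard_normal((8, 100))

        # Representational dissimilarity matrix: 1 - Pearson correlation between
        # the activity patterns of each pair of conditions.
        rdm = squareform(pdist(patterns, metric="correlation"))
        print(rdm.shape)  # (8, 8) summary statistic used to compare models to data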

  17. Extraction of Belief Knowledge from a Relational Database for Quantitative Bayesian Network Inference

    Directory of Open Access Journals (Sweden)

    LiMin Wang

    2013-01-01

    The problem of extracting knowledge from a relational database for probabilistic reasoning is still unsolved. On the basis of a three-phase learning framework, we propose the integration of a Bayesian network (BN) with the functional dependency (FD) discovery technique. Association rule analysis is employed to discover FDs and expert knowledge encoded within a BN; that is, key relationships between attributes are emphasized. Moreover, the BN can be updated by using an expert-driven annotation process wherein redundant nodes and edges are removed. Experimental results show the effectiveness and efficiency of the proposed approach.

  18. Lack of Confidence in Approximate Bayesian Computation Model Choice

    OpenAIRE

    Robert, Christian P.; Cornuet, Jean-Marie; Marin, Jean-Michel; Pillai, Natesh S.

    2011-01-01

    Approximate Bayesian computation (ABC) has become an essential tool for the analysis of complex stochastic models. Grelaud et al. [(2009) Bayesian Anal 3:427–442] advocated the use of ABC for model choice in the specific case of Gibbs random fields, relying on an intermodel sufficiency property to show that the approximation was legitimate. We implemented ABC model choice in a wide range of phylogenetic models in the Do It Yourself-ABC (DIY-ABC) software [Cornuet et al. (2008) Bioinformatics...

  19. A selective view of stochastic inference and modeling problems in nanoscale biophysics

    Institute of Scientific and Technical Information of China (English)

    KOU S. C.

    2009-01-01

    Advances in nanotechnology enable scientists for the first time to study biological processes on a nanoscale molecule-by-molecule basis. They also raise challenges and opportunities for statisticians and applied probabilists. To exemplify the stochastic inference and modeling problems in the field, this paper discusses a few selected cases, ranging from likelihood inference, Bayesian data augmentation, and semi- and non-parametric inference of nanometric biochemical systems to the utilization of stochastic integro-differential equations and stochastic networks to model single-molecule biophysical processes. We discuss the statistical and probabilistic issues as well as the biophysical motivation and physical meaning behind the problems, emphasizing the analysis and modeling of real experimental data.

  1. On the Bayesian Nonparametric Generalization of IRT-Type Models

    Science.gov (United States)

    San Martin, Ernesto; Jara, Alejandro; Rolin, Jean-Marie; Mouchart, Michel

    2011-01-01

    We study the identification and consistency of Bayesian semiparametric IRT-type models, where the uncertainty on the abilities' distribution is modeled using a prior distribution on the space of probability measures. We show that for the semiparametric Rasch Poisson counts model, simple restrictions ensure the identification of a general…

  2. Modelling LGD for unsecured retail loans using Bayesian methods

    OpenAIRE

    Katarzyna Bijak; Thomas, Lyn C

    2015-01-01

    Loss Given Default (LGD) is the loss borne by the bank when a customer defaults on a loan. LGD for unsecured retail loans is often found difficult to model. In the frequentist (non-Bayesian) two-step approach, two separate regression models are estimated independently, which can be considered potentially problematic when trying to combine them to make predictions about LGD. The result is a point estimate of LGD for each loan. Alternatively, LGD can be modelled using Bayesian methods. In the B...

  3. A Bayesian Matrix Factorization Model for Relational Data

    CERN Document Server

    Singh, Ajit P

    2012-01-01

    Relational learning can be used to augment one data source with other correlated sources of information, to improve predictive accuracy. We frame a large class of relational learning problems as matrix factorization problems, and propose a hierarchical Bayesian model. Training our Bayesian model using random-walk Metropolis-Hastings is impractically slow, and so we develop a block Metropolis-Hastings sampler which uses the gradient and Hessian of the likelihood to dynamically tune the proposal. We demonstrate that a predictive model of brain response to stimuli can be improved by augmenting it with side information about the stimuli.

  4. Inferring Biologically Relevant Models: Nested Canalyzing Functions

    CERN Document Server

    Hinkelmann, Franziska

    2010-01-01

    Inferring dynamic biochemical networks is one of the main challenges in systems biology. Given experimental data, the objective is to identify the rules of interaction among the different entities of the network. However, the number of possible models fitting the available data is huge and identifying a biologically relevant model is of great interest. Nested canalyzing functions, where variables in a given order dominate the function, have recently been proposed as a framework for modeling gene regulatory networks. Previously we described this class of functions as an algebraic toric variety. In this paper, we present an algorithm that identifies all nested canalyzing models that fit the given data. We demonstrate our methods using a well-known Boolean model of the cell cycle in budding yeast.
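
    A minimal example of a nested canalyzing function of three Boolean variables, assuming the variable order and canalyzing values shown; any such function fixes its output as soon as some variable in the ordering takes its canalyzing value:

        def nested_canalyzing(x1, x2, x3):
            # Variables are inspected in a fixed order; each can "canalyze" the output.
            if x1 == 0:        # x1 dominates: if it takes its canalyzing value ...
                return 1       # ... the output is fixed regardless of x2, x3
            if x2 == 1:        # otherwise x2 is consulted next
                return 0
            return x3          # innermost variable decides the remaining cases

        # Enumerate the truth table to check which inputs actually matter.
        for x in [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]:
            print(x, nested_canalyzing(*x))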

  5. The Bayesian Modelling Of Inflation Rate In Romania

    Directory of Open Access Journals (Sweden)

    Mihaela Simionescu (Bratu)

    2014-06-01

    Bayesian econometrics has seen a considerable increase in popularity in recent years, drawing the interest of researchers across economics and related fields, including econometrics, commerce, industry, marketing, finance, microeconomics and macroeconomics. The purpose of this research is to provide an introduction to the Bayesian approach in economics, starting with Bayes' theorem. The estimation methodology for Bayesian linear regression models is presented, together with two empirical studies on data from the Romanian economy: an autoregressive model of order 2 and a multiple regression model, both for the index of consumer prices. The Gibbs sampling algorithm was used for estimation in the R software, computing the posterior means and standard deviations. The parameter estimates proved more stable than those obtained with classical econometric methods.
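
    A minimal sketch of Gibbs sampling for a Bayesian AR(2) regression with a conjugate normal-inverse-gamma prior (the study itself used R; the series and hyperparameters here are invented):

        import numpy as np

        rng = np.random.default_rng(6)

        # Simulated inflation-like series from an AR(2) process.
        y = np.zeros(200)
        for t in range(2, 200):
            y[t] = 0.5 + 0.6 * y[t-1] + 0.2 * y[t-2] + rng.normal(0, 0.3)

        X = np.column_stack([np.ones(198), y[1:-1], y[:-2]])   # intercept, lags 1 and 2
        Y = y[2:]

        # Gibbs sampling with a conjugate normal-inverse-gamma prior.
        B0inv = np.eye(3) / 100.0      # vague N(0, 100 I) prior on coefficients
        a0, b0 = 2.0, 0.1              # inverse-gamma prior on the variance
        sigma2, draws = 1.0, []
        for _ in range(5000):
            V = np.linalg.inv(B0inv + X.T @ X / sigma2)        # beta | sigma2, data
            m = V @ (X.T @ Y / sigma2)
            beta = rng.multivariate_normal(m, V)
            resid = Y - X @ beta                               # sigma2 | beta, data
            sigma2 = 1.0 / rng.gamma(a0 + len(Y) / 2, 1.0 / (b0 + 0.5 * resid @ resid))
            draws.append(np.append(beta, sigma2))
        post = np.array(draws[1000:])
        print(post.mean(axis=0), post.std(axis=0))   # posterior means and sds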

  6. Automated Bayesian model development for frequency detection in biological time series

    Directory of Open Access Journals (Sweden)

    Oldroyd Giles ED

    2011-06-01

    Background: A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. Results: In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlight the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Conclusions: Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and
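
    A minimal sketch of grid-based Bayesian frequency detection on a short, noisy two-cycle series; amplitude and phase are profiled out by least squares as a crude stand-in for the full marginalization in Bayesian Spectrum Analysis, and all numbers are illustrative:

        import numpy as np

        rng = np.random.default_rng(7)

        # Short, noisy, detrended time series with one oscillatory component.
        t = np.linspace(0, 48, 60)               # e.g. hours, two circadian cycles
        y = np.sin(2 * np.pi * t / 24.1) + rng.normal(0, 0.5, t.size)

        freqs = np.linspace(0.01, 0.1, 500)      # frequency grid (cycles/hour)
        logpost = np.empty(freqs.size)
        for k, f in enumerate(freqs):
            # Profile out amplitude/phase with sin/cos regressors, then score the
            # residual under a Gaussian likelihood (noise sigma fixed at 0.5 here).
            D = np.column_stack([np.sin(2 * np.pi * f * t), np.cos(2 * np.pi * f * t)])
            resid = y - D @ np.linalg.lstsq(D, y, rcond=None)[0]
            logpost[k] = -0.5 * resid @ resid / 0.5**2   # flat prior over the grid

        post = np.exp(logpost - logpost.max())
        post /= post.sum()
        print(freqs[np.argmax(post)], 1 / freqs[np.argmax(post)])  # MAP frequency, period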

  7. Bayesian Inference for LISA Pathfinder using Markov Chain Monte Carlo Methods

    CERN Document Server

    Ferraioli, Luigi; Plagnol, Eric

    2012-01-01

    We present a parameter estimation procedure based on a Bayesian framework by applying a Markov Chain Monte Carlo algorithm to the calibration of the dynamical parameters of a space based gravitational wave detector. The method is based on the Metropolis-Hastings algorithm and a two-stage annealing treatment in order to ensure an effective exploration of the parameter space at the beginning of the chain. We compare two versions of the algorithm with an application to a LISA Pathfinder data analysis problem. The two algorithms share the same heating strategy but with one moving in coordinate directions using proposals from a multivariate Gaussian distribution, while the other uses the natural logarithm of some parameters and proposes jumps in the eigen-space of the Fisher Information matrix. The algorithm proposing jumps in the eigen-space of the Fisher Information matrix demonstrates a higher acceptance rate and a slightly better convergence towards the equilibrium parameter distributions in the application to...

  8. Multi-Pitch Estimation and Tracking Using Bayesian Inference in Block Sparsity

    DEFF Research Database (Denmark)

    Karimian-Azari, Sam; Jakobsson, Andreas; Jensen, Jesper Rindom;

    2015-01-01

    In this paper, we consider the problem of multi-pitch estimation and tracking of an unknown number of harmonic audio sources. Regularized least-squares offers a solution for simultaneous sparse source selection and parameter estimation. Exploiting block sparsity, the method allows for reliable tracking of the found sources, without posing detailed a priori assumptions on the number of harmonics for each source. The method incorporates a Bayesian prior and assigns data-dependent regularization coefficients to efficiently incorporate both earlier and future data blocks in the tracking of estimates. In comparison with fixed regularization coefficients, the simulation results, using both real and synthetic audio signals, confirm the performance of the proposed method.

  9. Bayesian Subset Modeling for High-Dimensional Generalized Linear Models

    KAUST Repository

    Liang, Faming

    2013-06-01

    This article presents a new prior setting for high-dimensional generalized linear models, which leads to a Bayesian subset regression (BSR) with the maximum a posteriori model approximately equivalent to the minimum extended Bayesian information criterion model. The consistency of the resulting posterior is established under mild conditions. Further, a variable screening procedure is proposed based on the marginal inclusion probability, which shares the same properties of sure screening and consistency with the existing sure independence screening (SIS) and iterative sure independence screening (ISIS) procedures. However, since the proposed procedure makes use of joint information from all predictors, it generally outperforms SIS and ISIS in real applications. This article also makes extensive comparisons of BSR with the popular penalized likelihood methods, including Lasso, elastic net, SIS, and ISIS. The numerical results indicate that BSR can generally outperform the penalized likelihood methods. The models selected by BSR tend to be sparser and, more importantly, of higher prediction ability. In addition, the performance of the penalized likelihood methods tends to deteriorate as the number of predictors increases, while this is not significant for BSR. Supplementary materials for this article are available online. © 2013 American Statistical Association.

  10. Automated adaptive inference of phenomenological dynamical models

    Science.gov (United States)

    Daniels, Bryan C.; Nemenman, Ilya

    2015-08-01

    Dynamics of complex systems is often driven by large and intricate networks of microscopic interactions, whose sheer size obfuscates understanding. With limited experimental data, many parameters of such dynamics are unknown, and thus detailed, mechanistic models risk overfitting and making faulty predictions. At the other extreme, simple ad hoc models often miss defining features of the underlying systems. Here we develop an approach that instead constructs phenomenological, coarse-grained models of network dynamics that automatically adapt their complexity to the available data. Such adaptive models produce accurate predictions even when microscopic details are unknown. The approach is computationally tractable, even for a relatively large number of dynamical variables. Using simulated data, it correctly infers the phase space structure for planetary motion, avoids overfitting in a biological signalling system and produces accurate predictions for yeast glycolysis with tens of data points and over half of the interacting species unobserved.

  11. Robust Spectroscopic Inference with Imperfect Models

    CERN Document Server

    Czekala, Ian; Mandel, Kaisey S; Hogg, David W; Green, Gregory M

    2014-01-01

    We present a modular, extensible framework for the spectroscopic inference of physical parameters based on synthetic model spectra. The subtraction of an imperfect model from a continuously sampled spectrum introduces covariance between adjacent datapoints (pixels) into the residual spectrum. In the limit of high signal-to-noise data with large spectral range that is common for stellar parameter estimation, that covariant structure can bias the parameter determinations. We have designed a likelihood function formalism to account for the structure of the covariance matrix, utilizing the machinery of Gaussian process kernels. We specifically address the common problem of mismatches in model spectral line strengths (with respect to data) due to intrinsic model imperfections (e.g., in the atomic or molecular data, or radiative transfer treatment) by developing a novel local covariance kernel framework that identifies and self-consistently downweights pathological spectral line "outliers." By fitting multiple spec...
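    A minimal sketch of the likelihood evaluation this record describes: a residual spectrum scored under a covariance matrix that combines white noise, a global kernel for correlated model mismatch, and a local kernel inflating the variance around one pathological line. The squared-exponential kernels and all numbers here are illustrative stand-ins for the paper's kernels.

```python
import numpy as np

rng = np.random.default_rng(2)

wl = np.linspace(5000.0, 5100.0, 400)             # wavelength grid (Angstrom)
resid = rng.normal(0.0, 0.01, wl.size)            # data - model residuals (toy)

def rbf(x, amp, scale):
    d = x[:, None] - x[None, :]
    return amp**2 * np.exp(-0.5 * (d / scale) ** 2)

# Global kernel: covariance between nearby pixels from model imperfections.
C = rbf(wl, amp=0.005, scale=2.0)
# Local kernel: extra variance concentrated around a "pathological" line at
# 5040 A, which self-consistently downweights that region in the fit.
mask = np.exp(-0.5 * ((wl - 5040.0) / 1.5) ** 2)
C += 0.02**2 * np.outer(mask, mask)
# White photon noise on the diagonal.
C[np.diag_indices_from(C)] += 0.01**2

# Gaussian log-likelihood of the residual vector via a Cholesky factor.
L = np.linalg.cholesky(C)
alpha = np.linalg.solve(L, resid)
loglike = -0.5 * alpha @ alpha - np.log(np.diag(L)).sum() \
          - 0.5 * wl.size * np.log(2 * np.pi)
print("log-likelihood:", loglike)
```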

  12. Involving Stakeholders in Building Integrated Fisheries Models Using Bayesian Methods

    Science.gov (United States)

    Haapasaari, Päivi; Mäntyniemi, Samu; Kuikka, Sakari

    2013-06-01

    A participatory Bayesian approach was used to investigate how the views of stakeholders could be utilized to develop models to help understand the Central Baltic herring fishery. In task one, we applied the Bayesian belief network methodology to elicit the causal assumptions of six stakeholders on factors that influence natural mortality, growth, and egg survival of the herring stock in probabilistic terms. We also integrated the expressed views into a meta-model using the Bayesian model averaging (BMA) method. In task two, we used influence diagrams to study qualitatively how the stakeholders frame the management problem of the herring fishery and to elucidate what kinds of causalities the different views involve. The paper combines these two tasks to assess the suitability of the methodological choices for participatory modeling in terms of both a modeling tool and participation mode. The paper also assesses the potential of the study to contribute to the development of participatory modeling practices. It is concluded that the subjective perspective on knowledge, which is fundamental in Bayesian theory, suits participatory modeling better than a positivist paradigm that seeks the objective truth. The methodology provides a flexible tool that can be adapted to different kinds of needs and challenges of participatory modeling. The ability of the approach to deal with small data sets makes it cost-effective in participatory contexts. However, the BMA methodology used in modeling the biological uncertainties is so complex that it needs further development before it can be introduced to wider use in participatory contexts.

  13. Bayesian modeling and prediction of solar particles flux

    Czech Academy of Sciences Publication Activity Database

    Dedecius, Kamil; Kalová, J.

    Prague: FJFI ČVUT v Praze, 2009 - (Štěpán, V.), p. 77. ISBN 978-80-01-04430-8. [XXXI. Dny radiační ochrany (31st Radiation Protection Days). Kouty nad Desnou, Hrubý Jeseník (CZ), 02.11.2009-06.11.2009] R&D Projects: GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords: Bayesian model * solar particle * solar wind Subject RIV: IN - Informatics, Computer Science http://library.utia.cas.cz/separaty/2009/AS/dedecius-bayesian modeling and prediction of solar particle s flux.pdf

  14. Research & development and growth: A Bayesian model averaging analysis

    Czech Academy of Sciences Publication Activity Database

    Horváth, Roman

    2011-01-01

    Vol. 28, No. 6 (2011), pp. 2669-2673. ISSN 0264-9993. [Society for Non-linear Dynamics and Econometrics Annual Conference. Washington DC, 16.03.2011-18.03.2011] R&D Projects: GA ČR GA402/09/0965 Institutional research plan: CEZ:AV0Z10750506 Keywords: Research and development * Growth * Bayesian model averaging Subject RIV: AH - Economics Impact factor: 0.701, year: 2011 http://library.utia.cas.cz/separaty/2011/E/horvath-research & development and growth a bayesian model averaging analysis.pdf

  15. Approximate Bayesian Recursive Estimation of Linear Model with Uniform Noise

    Czech Academy of Sciences Publication Activity Database

    Pavelková, Lenka; Kárný, Miroslav

    Brussels: IFAC, 2012, pp. 1803-1807. ISBN 978-3-902823-06-9. [16th IFAC Symposium on System Identification, The International Federation of Automatic Control. Brussels (BE), 11.07.2012-13.07.2012] R&D Projects: GA TA ČR TA01030123 Institutional support: RVO:67985556 Keywords: recursive parameter estimation * bounded noise * Bayesian learning * autoregressive models Subject RIV: BC - Control Systems Theory http://library.utia.cas.cz/separaty/2012/AS/pavelkova-approximate bayesian recursive estimation of linear model with uniform noise.pdf

  16. Comparing Bayesian models for multisensory cue combination without mandatory integration

    OpenAIRE

    Beierholm, Ulrik R.; Shams, Ladan; Kording, Konrad P; Ma, Wei Ji

    2009-01-01

    Bayesian models of multisensory perception traditionally address the problem of estimating an underlying variable that is assumed to be the cause of the two sensory signals. The brain, however, has to solve a more general problem: it also has to establish which signals come from the same source and should be integrated, and which ones do not and should be segregated. In the last couple of years, a few models have been proposed to solve this problem in a Bayesian fashion. One of these ha...
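    The model comparison sketched in this record can be written down compactly: compare the marginal likelihood of a single shared source against that of two independent sources. The sketch below integrates over the latent source on a grid, assuming Gaussian likelihoods and prior; all widths and the prior probability of a common cause are illustrative.

```python
import numpy as np

def common_cause_posterior(x1, x2, sig1=1.0, sig2=2.0, sig_prior=5.0,
                           p_common=0.5):
    """Posterior probability that two noisy cues share one source (C = 1)
    rather than arising from two independent sources (C = 2)."""
    s = np.linspace(-30.0, 30.0, 4001)            # grid over source location
    def gauss(x, mu, sig):
        return np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))
    prior = gauss(s, 0.0, sig_prior)
    l1, l2 = gauss(x1, s, sig1), gauss(x2, s, sig2)
    like_c1 = np.trapz(l1 * l2 * prior, s)        # one source for both cues
    like_c2 = np.trapz(l1 * prior, s) * np.trapz(l2 * prior, s)  # two sources
    return like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))

for dx in (0.5, 3.0, 10.0):
    print(f"cue conflict {dx:4.1f}: P(common cause) = "
          f"{common_cause_posterior(0.0, dx):.3f}")
```

    Small cue conflicts favor integration and large conflicts favor segregation, which is the qualitative behavior these models are built to capture.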

  17. Bayesian model mixing for cold rolling mills: Test results

    Czech Academy of Sciences Publication Activity Database

    Ettler, P.; Puchr, I.; Dedecius, Kamil

    Slovakia: Slovak University of Technology, 2013, pp. 359-364. ISBN 978-1-4799-0926-1. [19th International Conference on Process Control. Štrbské Pleso (SK), 18.06.2013-21.06.2013] R&D Projects: GA MŠk(CZ) 7D09008; GA MŠk 7D12004 Keywords: Bayesian statistics * model mixing * process control Subject RIV: BC - Control Systems Theory http://library.utia.cas.cz/separaty/2013/AS/dedecius-bayesian model mixing for cold rolling mills test results.pdf

  18. Bayesian model selection framework for identifying growth patterns in filamentous fungi.

    Science.gov (United States)

    Lin, Xiao; Terejanu, Gabriel; Shrestha, Sajan; Banerjee, Sourav; Chanda, Anindya

    2016-06-01

    This paper describes a rigorous methodology for quantification of model errors in fungal growth models. This is essential for choosing the model that best describes the data and for guiding modeling efforts. Mathematical modeling of the growth of filamentous fungi is necessary in fungal biology for gaining a systems-level understanding of hyphal and colony behaviors in different environments. A critical challenge in the development of these mathematical models arises from the indeterminate nature of their colony architecture, which is a result of processing diverse intracellular signals induced in response to a heterogeneous set of physical and nutritional factors. There exists a practical gap in connecting fungal growth models with measurement data. Here, we address this gap by introducing the first unified computational framework based on Bayesian inference that can quantify individual model errors and rank the statistical models based on their descriptive power against data. We show that this Bayesian model comparison is just a natural formalization of Occam's razor. The application of this framework is discussed in comparing three models in the context of synthetic data generated from a known true fungal growth model. This framework of model comparison achieves a trade-off between data fitness and model complexity, and the quantified model error not only helps in calibrating and comparing the models, but also in making better predictions and guiding model refinements. PMID:27000772

  19. Bayesian Model Comparison With the g-Prior

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Christensen, Mads Græsbøll; Cemgil, Ali Taylan;

    2014-01-01

    Model comparison and selection is an important problem in many model-based signal processing applications. Often, very simple information criteria such as the Akaike information criterion or the Bayesian information criterion are used despite their shortcomings. Compared to these methods, Djuric’…

  20. Bayesian Estimation of the DINA Model with Gibbs Sampling

    Science.gov (United States)

    Culpepper, Steven Andrew

    2015-01-01

    A Bayesian model formulation of the deterministic inputs, noisy "and" gate (DINA) model is presented. Gibbs sampling is employed to simulate from the joint posterior distribution of item guessing and slipping parameters, subject attribute parameters, and latent class probabilities. The procedure extends concepts in Béguin and Glas,…

  1. Fast Markov chain Monte Carlo sampling for sparse Bayesian inference in high-dimensional inverse problems using L1-type priors

    International Nuclear Information System (INIS)

    Sparsity has become a key concept for solving high-dimensional inverse problems using variational regularization techniques. Recently, using similar sparsity-constraints in the Bayesian framework for inverse problems by encoding them in the prior distribution has attracted attention. Important questions about the relation between regularization theory and Bayesian inference still need to be addressed when using sparsity-promoting inversion. A practical obstacle for these examinations is the lack of fast posterior sampling algorithms for sparse, high-dimensional Bayesian inversion. Accessing the full range of Bayesian inference methods requires being able to draw samples from the posterior probability distribution in a fast and efficient way. This is usually done using Markov chain Monte Carlo (MCMC) sampling algorithms. In this paper, we develop and examine a new implementation of a single-component Gibbs MCMC sampler for sparse priors relying on L1-norms. We demonstrate that the efficiency of our Gibbs sampler increases when the level of sparsity or the dimension of the unknowns is increased. This property is contrary to the properties of the most commonly applied Metropolis–Hastings (MH) sampling schemes. We demonstrate that the efficiency of MH schemes for L1-type priors dramatically decreases when the level of sparsity or the dimension of the unknowns is increased. Practically, Bayesian inversion for L1-type priors using MH samplers is not feasible at all. As this is commonly believed to be an intrinsic feature of MCMC sampling, the performance of our Gibbs sampler also challenges common beliefs about the applicability of sample-based Bayesian inference.
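    For a Gaussian likelihood with a Laplace (L1) prior, each full conditional is a two-piece mixture of truncated Gaussians and can be sampled exactly; that is the essence of a single-component Gibbs sweep. The sketch below illustrates this under those assumptions. It is not the paper's implementation, and the problem sizes and hyperparameters are arbitrary.

```python
import numpy as np
from scipy.stats import norm, truncnorm
from scipy.special import logsumexp

rng = np.random.default_rng(3)

def gibbs_sweep_l1(A, y, x, sigma, lam):
    """One systematic sweep of single-component Gibbs for the posterior
    p(x) ~ exp(-||Ax - y||^2 / (2 sigma^2) - lam * ||x||_1). Each full
    conditional is a mixture of a positive-side and a negative-side
    truncated Gaussian, sampled exactly."""
    for i in range(x.size):
        a_i = A[:, i]
        c = a_i @ a_i / sigma**2                  # conditional precision
        r = y - A @ x + a_i * x[i]                # residual excluding x_i
        d = a_i @ r / sigma**2
        mu_pos, mu_neg = (d - lam) / c, (d + lam) / c
        # Log masses of the two half-line pieces (numerically stable).
        log_w = np.array([0.5 * c * mu_pos**2 + norm.logcdf(mu_pos * np.sqrt(c)),
                          0.5 * c * mu_neg**2 + norm.logcdf(-mu_neg * np.sqrt(c))])
        w = np.exp(log_w - logsumexp(log_w))
        sd = 1.0 / np.sqrt(c)
        if rng.uniform() < w[0]:                  # positive side
            x[i] = truncnorm.rvs((0.0 - mu_pos) / sd, np.inf,
                                 loc=mu_pos, scale=sd, random_state=rng)
        else:                                     # negative side
            x[i] = truncnorm.rvs(-np.inf, (0.0 - mu_neg) / sd,
                                 loc=mu_neg, scale=sd, random_state=rng)
    return x

# Tiny demo: sparse signal, underdetermined A.
n, m = 40, 80
A = rng.normal(size=(n, m)) / np.sqrt(n)
x_true = np.zeros(m); x_true[rng.choice(m, 5, replace=False)] = 3.0
y = A @ x_true + rng.normal(0.0, 0.1, n)
x = np.zeros(m)
for _ in range(200):
    x = gibbs_sweep_l1(A, y, x, sigma=0.1, lam=5.0)
print("recovered support:", np.where(np.abs(x) > 0.5)[0])
```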

  2. Full Bayesian hierarchical light curve modeling of core-collapse supernova populations

    Science.gov (United States)

    Sanders, Nathan; Betancourt, Michael; Soderberg, Alicia Margarita

    2016-06-01

    While wide field surveys have yielded remarkable quantities of photometry of transient objects, including supernovae, light curves reconstructed from these data suffer from several characteristic problems. Because most transients are discovered near the detection limit, the signal-to-noise ratio is generally poor; because coverage is limited to the observing season, light curves are often incomplete; and because temporal sampling can be uneven across filters, these problems can be exacerbated at any one wavelength. While the prevailing approach of modeling individual light curves independently is successful at recovering inferences for the objects with the highest quality observations, it typically neglects a substantial portion of the data and can introduce systematic biases. Joint modeling of the light curves of transient populations enables direct inference on population-level characteristics as well as superior measurements for individual objects. We present a new hierarchical Bayesian model for supernova light curves, where information inferred from observations of every individual light curve in a sample is partially pooled across objects to constrain population-level hyperparameters. Using an efficient Hamiltonian Monte Carlo sampling technique, the model posterior can be explored to enable marginalization over weakly-identified hyperparameters through full Bayesian inference. We demonstrate our technique on the Pan-STARRS1 (PS1) Type IIP supernova light curve sample published by Sanders et al. (2015), consisting of nearly 20,000 individual photometric observations of more than 70 supernovae in five photometric filters. We discuss the Stan probabilistic programming language used to implement the model, computational challenges, and prospects for future work including generalization to multiple supernova types. We also discuss scientific results from the PS1 dataset including a new relation between the peak magnitude and decline rate of SNe IIP, and a new perspective on the …

  3. Bayesian large-scale structure inference: initial conditions and the cosmic web

    CERN Document Server

    Leclercq, Florent

    2014-01-01

    We describe an innovative statistical approach for the ab initio simultaneous analysis of the formation history and morphology of the large-scale structure of the inhomogeneous Universe. Our algorithm explores the joint posterior distribution of the many millions of parameters involved via efficient Hamiltonian Markov Chain Monte Carlo sampling. We describe its application to the Sloan Digital Sky Survey data release 7 and an additional non-linear filtering step. We illustrate the use of our findings for cosmic web analysis: identification of structures via tidal shear analysis and inference of dark matter voids.
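    The workhorse referred to above, Hamiltonian Monte Carlo, scales to millions of parameters because proposals follow simulated dynamics rather than random walks. Below is a minimal sketch of one leapfrog-integrated HMC transition on a toy 2-D target, standing in for the actual density-field posterior.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy 2-D stand-in for the millions-of-parameters density-field posterior.
scales = np.array([1.0, 25.0])

def neg_log_post(q):
    return 0.5 * np.sum(scales * q**2)

def grad(q):
    return scales * q

def hmc_step(q, eps=0.1, n_leap=20):
    """One HMC transition: draw a momentum, integrate Hamilton's equations
    with the leapfrog scheme, then Metropolis accept/reject."""
    p = rng.normal(size=q.size)
    q_new, p_new = q.copy(), p.copy()
    p_new -= 0.5 * eps * grad(q_new)          # initial half step (momentum)
    for _ in range(n_leap - 1):
        q_new += eps * p_new                  # full step (position)
        p_new -= eps * grad(q_new)            # full step (momentum)
    q_new += eps * p_new
    p_new -= 0.5 * eps * grad(q_new)          # final half step (momentum)
    dH = (neg_log_post(q_new) + 0.5 * p_new @ p_new) \
       - (neg_log_post(q) + 0.5 * p @ p)
    return q_new if np.log(rng.uniform()) < -dH else q

q, draws = np.zeros(2), []
for _ in range(5000):
    q = hmc_step(q)
    draws.append(q)
print("sample variances:", np.var(draws, axis=0))  # expect ~ [1.0, 0.04]
```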

  4. Bayesian coalescent inference reveals high evolutionary rates and diversification of Zika virus populations.

    Science.gov (United States)

    Fajardo, Alvaro; Soñora, Martín; Moreno, Pilar; Moratorio, Gonzalo; Cristina, Juan

    2016-10-01

    Zika virus (ZIKV) is a member of the family Flaviviridae. In 2015, ZIKV triggered an epidemic in Brazil and spread across Latin America. By May of 2016, the World Health Organization had warned of the spread of ZIKV beyond this region. Detailed studies on the mode of evolution of ZIKV strains are extremely important for our understanding of the emergence and spread of ZIKV populations. In order to gain insight into these matters, a Bayesian coalescent Markov Chain Monte Carlo analysis of complete genome sequences of recently isolated ZIKV strains was performed. The results of these studies revealed a mean rate of evolution of 1.20 × 10^-3 nucleotide substitutions per site per year (s/s/y) for the ZIKV strains enrolled in this study. Several variants isolated in China are grouped together with all strains isolated in Latin America. Another genetic group composed exclusively of Chinese strains was also observed, suggesting the co-circulation of different genetic lineages in China. These findings indicate a high level of diversification of ZIKV populations. Strains isolated from microcephaly cases do not share amino acid substitutions, suggesting that factors other than viral genetic differences may play a role in the proposed pathogenesis caused by ZIKV infection. J. Med. Virol. 88:1672-1676, 2016. © 2016 Wiley Periodicals, Inc. PMID:27278855

  5. HDDM: Hierarchical Bayesian estimation of the Drift-Diffusion Model in Python

    Directory of Open Access Journals (Sweden)

    Thomas V Wiecki

    2013-08-01

    Full Text Available The diffusion model is a commonly used tool to infer latent psychological processes underlying decision making, and to link them to neural mechanisms based on reaction times. Although efficient open source software has been made available to quantitatively fit the model to data, current estimation methods require an abundance of reaction time measurements to recover meaningful parameters, and only provide point estimates of each parameter. In contrast, hierarchical Bayesian parameter estimation methods are useful for enhancing statistical power, allowing for simultaneous estimation of individual subject parameters and the group distribution that they are drawn from, while also providing measures of uncertainty in these parameters in the posterior distribution. Here, we present a novel Python-based toolbox called HDDM (hierarchical drift diffusion model), which allows fast and flexible estimation of the drift-diffusion model and the related linear ballistic accumulator model. HDDM requires fewer data per subject/condition than non-hierarchical methods, allows for full Bayesian data analysis, and can handle outliers in the data. Finally, HDDM supports the estimation of how trial-by-trial measurements (e.g. fMRI) influence decision-making parameters. This paper first describes the theoretical background of the drift-diffusion model and Bayesian inference. We then illustrate usage of the toolbox on a real-world data set from our lab. Finally, parameter recovery studies show that HDDM beats alternative fitting methods like the chi-quantile method as well as maximum likelihood estimation. The software and documentation can be downloaded at: http://ski.clps.brown.edu/hddm_docs

  6. Spatial and spatio-temporal bayesian models with R - INLA

    CERN Document Server

    Blangiardo, Marta

    2015-01-01

    Dedication; Preface; 1 Introduction (1.1 Why spatial and spatio-temporal statistics? 1.2 Why do we use Bayesian methods for modelling spatial and spatio-temporal structures? 1.3 Why INLA? 1.4 Datasets); 2 Introduction to R (2.1 The R language; 2.2 R objects; 2.3 Data and session management; 2.4 Packages; 2.5 Programming in R; 2.6 Basic statistical analysis with R); 3 Introduction to Bayesian Methods (3.1 Bayesian Philosophy; 3.2 Basic Probability Elements; 3.3 Bayes Theorem; 3.4 Prior and Posterior Distributions; 3.5 Working with the Posterior Distribution; 3.6 Choosing the Prior Distr…

  7. Asymptotically minimax Bayesian predictive densities for multinomial models

    CERN Document Server

    Komaki, Fumiyasu

    2011-01-01

    One-step ahead prediction for the multinomial model is considered. The performance of a predictive density is evaluated by the average Kullback-Leibler divergence from the true density to the predictive density. Asymptotic approximations of risk functions of Bayesian predictive densities based on Dirichlet priors are obtained. It is shown that a Bayesian predictive density based on a specific Dirichlet prior is asymptotically minimax. The asymptotically minimax prior is different from known objective priors such as the Jeffreys prior or the uniform prior.
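    Under a Dirichlet(α) prior the Bayesian predictive density in this setting has a closed form, which is what makes the risk comparison tractable. The snippet below computes it; the record does not state the asymptotically minimax prior, so the uniform and Jeffreys priors are shown purely for illustration.

```python
import numpy as np

def dirichlet_predictive(counts, alpha):
    """One-step-ahead predictive p(next = i | data) for a multinomial model
    with a Dirichlet(alpha) prior: (n_i + alpha_i) / (N + sum(alpha))."""
    counts = np.asarray(counts, dtype=float)
    alpha = np.broadcast_to(np.asarray(alpha, dtype=float), counts.shape)
    return (counts + alpha) / (counts.sum() + alpha.sum())

counts = np.array([7, 2, 1, 0])                   # observed category counts
for name, a in [("uniform prior (alpha=1)  ", 1.0),
                ("Jeffreys prior (alpha=.5)", 0.5)]:
    print(name, dirichlet_predictive(counts, a))
```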

  8. Uncertainty Modeling Based on Bayesian Network in Ontology Mapping

    Institute of Scientific and Technical Information of China (English)

    LI Yuhua; LIU Tao; SUN Xiaolin

    2006-01-01

    How to deal with uncertainty is crucial in exact concept mapping between ontologies. This paper presents a new framework for modeling uncertainty in ontologies based on Bayesian networks (BN). In our approach, the Ontology Web Language (OWL) is extended to add probabilistic markups for attaching probability information; the source and target ontologies (expressed in the extended OWL) are translated into Bayesian networks; and the mapping between the two ontologies is extracted by constructing the conditional probability tables (CPTs) of the BN using an improved algorithm named I-IPFP, based on the iterative proportional fitting procedure (IPFP). The basic idea of this framework and algorithm is validated by positive results from computer experiments.

  9. A surrogate model enables a Bayesian approach to the inverse problem of scatterometry

    International Nuclear Information System (INIS)

    Scatterometry is an indirect optical method for the determination of photomask geometry parameters from scattered light intensities by solving an inverse problem. The Bayesian approach is a powerful method to solve the inverse problem. In the Bayesian framework, estimates of parameters and associated uncertainties are obtained from posterior distributions. The determination of the posterior distribution is typically based on Markov chain Monte Carlo (MCMC) methods. However, in scatterometry the evaluation of MCMC steps requires solutions of partial differential equations that are computationally expensive, so the application of MCMC methods is impractical. In this article we introduce a surrogate model for scatterometry based on polynomial chaos that can be treated by Bayesian inference. We compare the results of the surrogate model with rigorous finite element simulations and demonstrate its convergence. The accuracy reaches better than one percent for a sufficiently fine mesh, and the speed-up amounts to more than two orders of magnitude. Furthermore, we apply the surrogate model to MCMC calculations and we reconstruct geometry parameters of a photomask.
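    The two-stage pattern described above can be made concrete: fit a cheap polynomial surrogate to a handful of expensive forward-model evaluations offline, then let MCMC query only the surrogate. In the sketch below a 1-D Legendre least-squares fit stands in for a polynomial chaos expansion over finite-element solves; the forward model, noise level, and proposal scale are all illustrative.

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(5)

def expensive_forward(theta):
    # Stand-in for a costly finite-element scatterometry solve.
    return np.tanh(3.0 * theta) + 0.5 * theta

# Offline stage: fit the surrogate on a small design over [-1, 1],
# as a polynomial chaos expansion would (here: degree-8 Legendre fit).
train = np.linspace(-1.0, 1.0, 30)
coef = legendre.legfit(train, expensive_forward(train), deg=8)
surrogate = lambda theta: legendre.legval(theta, coef)

# Online stage: MCMC touches only the cheap surrogate.
y_obs = expensive_forward(0.4) + rng.normal(0.0, 0.05)
def log_post(theta):
    if abs(theta) > 1.0:                      # uniform prior on [-1, 1]
        return -np.inf
    return -0.5 * ((y_obs - surrogate(theta)) / 0.05) ** 2

theta, lp = 0.0, log_post(0.0)
samples = []
for _ in range(20000):
    prop = theta + 0.1 * rng.normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
print("posterior mean:", np.mean(samples[2000:]))   # should sit near 0.4
```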

  10. Top-Down Feedback in an HMAX-Like Cortical Model of Object Perception Based on Hierarchical Bayesian Networks and Belief Propagation

    OpenAIRE

    Salvador Dura-Bernal; Thomas Wennekers; DENHAM, SUSAN L.

    2012-01-01

    Hierarchical generative models, such as Bayesian networks, and belief propagation have been shown to provide a theoretical framework that can account for perceptual processes, including feedforward recognition and feedback modulation. The framework explains both psychophysical and physiological experimental data and maps well onto the hierarchical distributed cortical anatomy. However, the complexity required to model cortical processes makes inference, even using approximate methods, very co...

  11. Hierarchical Bayesian spatial models for multispecies conservation planning and monitoring.

    Science.gov (United States)

    Carroll, Carlos; Johnson, Devin S; Dunk, Jeffrey R; Zielinski, William J

    2010-12-01

    Biologists who develop and apply habitat models are often familiar with the statistical challenges posed by their data's spatial structure but are unsure whether the use of complex spatial models will increase the utility of model results in planning. We compared the relative performance of nonspatial and hierarchical Bayesian spatial models for three vertebrate and invertebrate taxa of conservation concern (Church's sideband snails [Monadenia churchi], red tree voles [Arborimus longicaudus], and Pacific fishers [Martes pennanti pacifica]) that provide examples of a range of distributional extents and dispersal abilities. We used presence-absence data derived from regional monitoring programs to develop models with both landscape and site-level environmental covariates. We used Markov chain Monte Carlo algorithms and a conditional autoregressive or intrinsic conditional autoregressive model framework to fit spatial models. The fit of Bayesian spatial models was between 35 and 55% better than the fit of nonspatial analogue models. Bayesian spatial models outperformed analogous models developed with maximum entropy (Maxent) methods. Although the best spatial and nonspatial models included similar environmental variables, spatial models provided estimates of residual spatial effects that suggested how ecological processes might structure distribution patterns. Spatial models built from presence-absence data improved fit most for localized endemic species with ranges constrained by poorly known biogeographic factors and for widely distributed species suspected to be strongly affected by unmeasured environmental variables or population processes. By treating spatial effects as a variable of interest rather than a nuisance, hierarchical Bayesian spatial models, especially when they are based on a common broad-scale spatial lattice (here the national Forest Inventory and Analysis grid of 24 km² hexagons), can increase the relevance of habitat models to multispecies …

  12. Bayesian nonparametric estimation of hazard rate in monotone Aalen model

    Czech Academy of Sciences Publication Activity Database

    Timková, Jana

    2014-01-01

    Vol. 50, No. 6 (2014), pp. 849-868. ISSN 0023-5954 Institutional support: RVO:67985556 Keywords: Aalen model * Bayesian estimation * MCMC Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.541, year: 2014 http://library.utia.cas.cz/separaty/2014/SI/timkova-0438210.pdf

  13. An Inhomogeneous Bayesian Texture Model for Spatially Varying Parameter Estimation

    OpenAIRE

    Dharmagunawardhana, Chathurika; Mahmoodi, Sasan; Bennett, Michael; Niranjan, Mahesan

    2014-01-01

    In statistical model-based texture feature extraction, features based on spatially varying parameters achieve higher discriminative performance compared to spatially constant parameters. In this paper we formulate a novel Bayesian framework which achieves texture characterization by spatially varying parameters based on Gaussian Markov random fields. The parameter estimation is carried out by the Metropolis-Hastings algorithm. The distributions of estimated spatially varying paramete...

  14. A Bayesian computational model for online character recognition and disability assessment during cursive eye writing

    Directory of Open Access Journals (Sweden)

    JulienDiard

    2013-11-01

    Full Text Available This research involves a novel apparatus, in which the user is presented with an illusion-inducing visual stimulus. The user perceives illusory movement that can be followed by the eye, so that smooth pursuit eye movements can be sustained in arbitrary directions. Thus, free-flow trajectories of any shape can be traced. In other words, coupled with an eye-tracking device, this apparatus enables "eye writing", which appears to be an original object of study. We adapt a previous model of reading and writing to this context. We describe a probabilistic model called the Bayesian Action-Perception for Eye On-Line model (BAP-EOL). It encodes probabilistic knowledge about isolated letter trajectories, their size, high-frequency components of the produced trajectory, and pupil diameter. We show how Bayesian inference, in this single model, can be used to solve several tasks, like letter recognition and novelty detection (i.e., recognizing when a presented character is not part of the learned database). We are interested in the potential use of the eye writing apparatus by motor-impaired patients: the final task we solve by Bayesian inference is disability assessment (i.e., measuring and tracking the evolution of motor characteristics of produced trajectories). Preliminary experimental results are presented, which illustrate the method, showing the feasibility of character recognition in the context of eye writing. We then show experimentally how a model of the unknown character can be used to detect trajectories that are likely to be new symbols, and how disability assessment can be performed by opportunistically observing characteristics of fine motor control, as letters are being traced. Experimental analyses also help identify specificities of eye writing, as compared to handwriting, and the resulting technical challenges.

  15. Research on Bayesian Network Based User's Interest Model

    Institute of Scientific and Technical Information of China (English)

    ZHANG Weifeng; XU Baowen; CUI Zifeng; XU Lei

    2007-01-01

    It is of great practical significance for improving the quality of users' access to information to filter and selectively retrieve the large amount of information on the Internet. On the basis of analyzing the existing users' interest models and some basic questions of users' interest (representation, derivation and identification of users' interest), a Bayesian network based users' interest model is given. In this model, the users' interest reduction algorithm based on the Markov Blanket model is used to reduce the interest noise, and then users' interested and not-interested documents are used to train the Bayesian network. Compared to the simple model, this model has advantages such as small space requirements, a simple reasoning method and a high recognition rate. The experiment results show this model can more appropriately reflect the user's interest, and has higher performance and good usability.

  16. Bayesian model selection validates a biokinetic model for zirconium processing in humans

    Directory of Open Access Journals (Sweden)

    Schmidl Daniel

    2012-08-01

    Full Text Available Abstract Background In radiation protection, biokinetic models for zirconium processing are of crucial importance in dose estimation and further risk analysis for humans exposed to this radioactive substance. They provide limiting values of detrimental effects and build the basis for applications in internal dosimetry, the prediction of radioactive zirconium retention in various organs as well as retrospective dosimetry. Multi-compartmental models are the tool of choice for simulating the processing of zirconium. Although such models are easily interpretable, determining the exact compartment structure and interaction mechanisms is generally daunting. In the context of observing the dynamics of multiple compartments, Bayesian methods provide efficient tools for model inference and selection. Results We are the first to apply a Markov chain Monte Carlo approach to compute Bayes factors for the evaluation of two competing models for zirconium processing in the human body after ingestion. Based on in vivo measurements of human plasma and urine levels we were able to show that a recently published model is superior to the standard model of the International Commission on Radiological Protection. The Bayes factors were estimated by means of the numerically stable thermodynamic integration in combination with a recently developed copula-based Metropolis-Hastings sampler. Conclusions In contrast to the standard model the novel model predicts lower accretion of zirconium in bones. This results in lower levels of noxious doses for exposed individuals. Moreover, the Bayesian approach allows for retrospective dose assessment, including credible intervals for the initially ingested zirconium, in a significantly more reliable fashion than previously possible. All methods presented here are readily applicable to many modeling tasks in systems biology.
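    Thermodynamic integration, named above as the evidence estimator, computes log Z = ∫₀¹ E_β[log L] dβ along a path of tempered posteriors; a Bayes factor is then the difference of two such log evidences. The sketch below demonstrates it on a conjugate toy model where the exact log evidence is available as a check; the temperature schedule and plain MH sampler are simplistic stand-ins for the paper's copula-based sampler.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy model: y ~ N(mu, 1) with prior mu ~ N(0, 1); the log evidence has a
# closed form, so the estimate can be verified.
y = rng.normal(0.5, 1.0, size=20)

def log_like(mu):
    return -0.5 * np.sum((y - mu) ** 2) - 0.5 * y.size * np.log(2 * np.pi)

def log_prior(mu):
    return -0.5 * mu**2 - 0.5 * np.log(2 * np.pi)

def expected_loglike_at(beta, n_iter=20000, step=0.5):
    """E_beta[log L] under the tempered posterior prior * L^beta, via MH."""
    mu, out = 0.0, []
    lp = log_prior(mu) + beta * log_like(mu)
    for _ in range(n_iter):
        prop = mu + step * rng.normal()
        lp_prop = log_prior(prop) + beta * log_like(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            mu, lp = prop, lp_prop
        out.append(log_like(mu))
    return np.mean(out[n_iter // 5:])             # drop burn-in

# Thermodynamic integration: log Z = integral over beta of E_beta[log L].
betas = np.linspace(0.0, 1.0, 11) ** 3            # denser near beta = 0
ti = np.trapz([expected_loglike_at(b) for b in betas], betas)

# Closed-form check for the conjugate normal-normal model.
n, yb = y.size, y.mean()
exact = (-0.5 * n * np.log(2 * np.pi) - 0.5 * np.log(n + 1)
         - 0.5 * (np.sum(y**2) - n**2 * yb**2 / (n + 1)))
print(f"TI estimate {ti:.3f}  vs exact {exact:.3f}")
```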

  17. Bayesian estimation of parameters in a regional hydrological model

    Directory of Open Access Journals (Sweden)

    K. Engeland

    2002-01-01

    Full Text Available This study evaluates the applicability of the distributed, process-oriented Ecomag model for prediction of daily streamflow in ungauged basins. The Ecomag model is applied as a regional model to nine catchments in the NOPEX area, using Bayesian statistics to estimate the posterior distribution of the model parameters conditioned on the observed streamflow. The distribution is calculated by Markov Chain Monte Carlo (MCMC) analysis. The Bayesian method requires formulation of a likelihood function for the parameters, and three alternative formulations are used. The first is a subjectively chosen objective function that describes the goodness of fit between the simulated and observed streamflow, as defined in the GLUE framework. The second and third formulations are more statistically correct likelihood models that describe the simulation errors. The full statistical likelihood model describes the simulation errors as an AR(1) process, whereas the simple model excludes the auto-regressive part. The statistical parameters depend on the catchments and the hydrological processes, and the statistical and the hydrological parameters are estimated simultaneously. The results show that the simple likelihood model gives the most robust parameter estimates. The simulation error may be explained to a large extent by the catchment characteristics and climatic conditions, so it is possible to transfer knowledge about them to ungauged catchments. The statistical models for the simulation errors indicate that structural errors in the model are more important than parameter uncertainties. Keywords: regional hydrological model, model uncertainty, Bayesian analysis, Markov Chain Monte Carlo analysis
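    The "full statistical likelihood" contrasted above with the simple model can be written down directly: the exact Gaussian likelihood of simulation errors under an AR(1) process with a stationary initial state. A sketch, with purely illustrative residuals and parameter values:

```python
import numpy as np

def ar1_loglike(errors, rho, sigma):
    """Exact log-likelihood of residuals e_t = rho * e_{t-1} + eta_t,
    eta_t ~ N(0, sigma^2), using the stationary marginal for e_1."""
    e = np.asarray(errors, dtype=float)
    var0 = sigma**2 / (1.0 - rho**2)              # stationary variance
    ll = -0.5 * (np.log(2 * np.pi * var0) + e[0] ** 2 / var0)
    innov = e[1:] - rho * e[:-1]
    ll += -0.5 * np.sum(np.log(2 * np.pi * sigma**2) + innov**2 / sigma**2)
    return ll

# errors = observed - simulated streamflow; compare the AR(1) model with
# the simple alternative (rho = 0, independent errors of matched variance).
rng = np.random.default_rng(7)
e = np.zeros(365)
for t in range(1, e.size):
    e[t] = 0.7 * e[t - 1] + rng.normal(0.0, 1.0)
print("AR(1):      ", ar1_loglike(e, rho=0.7, sigma=1.0))
print("independent:", ar1_loglike(e, rho=0.0, sigma=np.sqrt(1.0 / (1 - 0.49))))
```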

  18. A Bayesian Markov geostatistical model for estimation of hydrogeological properties

    International Nuclear Information System (INIS)

    A geostatistical methodology based on Markov-chain analysis and Bayesian statistics was developed for probability estimations of hydrogeological and geological properties in the siting process of a nuclear waste repository. The probability estimates have practical use in decision-making on issues such as siting, investigation programs, and construction design. The methodology is nonparametric which makes it possible to handle information that does not exhibit standard statistical distributions, as is often the case for classified information. Data do not need to meet the requirements on additivity and normality as with the geostatistical methods based on regionalized variable theory, e.g., kriging. The methodology also has a formal way for incorporating professional judgments through the use of Bayesian statistics, which allows for updating of prior estimates to posterior probabilities each time new information becomes available. A Bayesian Markov Geostatistical Model (BayMar) software was developed for implementation of the methodology in two and three dimensions. This paper gives (1) a theoretical description of the Bayesian Markov Geostatistical Model; (2) a short description of the BayMar software; and (3) an example of application of the model for estimating the suitability for repository establishment with respect to the three parameters of lithology, hydraulic conductivity, and rock quality designation index (RQD) at 400–500 meters below ground surface in an area around the Aespoe Hard Rock Laboratory in southeastern Sweden

  19. Short-term volcanic hazard assessment through Bayesian inference: retrospective application to the Pinatubo 1991 volcanic crisis

    Science.gov (United States)

    Sobradelo, Rosa; Martí, Joan

    2015-01-01

    One of the most challenging aspects of managing a volcanic crisis is the interpretation of the monitoring data, so as to anticipate the evolution of the unrest and implement timely mitigation actions. An unrest episode may include different stages or time intervals of increasing activity that may or may not precede a volcanic eruption, depending on the causes of the unrest (magmatic, geothermal or tectonic). Therefore, one of the main goals in monitoring volcanic unrest is to forecast whether or not such an increase in activity will culminate in an eruption, and if this is the case, how, when, and where this eruption will take place. As an alternative to expert elicitation for assessing and merging monitoring data and relevant past information, we present a probabilistic method to transform precursory activity into the probability of experiencing a significant variation by the next time interval (i.e. the next step in the unrest), given its preceding evolution, and to further estimate the probability of the occurrence of a particular eruptive scenario by combining monitoring and past data. With the 1991 Pinatubo volcanic crisis as a reference, we have developed such a method to assess short-term volcanic hazard using Bayesian inference.

  20. Bayesian inference of cosmic density fields from non-linear, scale-dependent, and stochastic biased tracers

    Science.gov (United States)

    Ata, Metin; Kitaura, Francisco-Shu; Müller, Volker

    2015-02-01

    We present a Bayesian reconstruction algorithm to generate unbiased samples of the underlying dark matter field from halo catalogues. Our new contribution consists of implementing a non-Poisson likelihood including a deterministic non-linear and scale-dependent bias. In particular we present the Hamiltonian equations of motions for the negative binomial (NB) probability distribution function. This permits us to efficiently sample the posterior distribution function of density fields given a sample of galaxies using the Hamiltonian Monte Carlo technique implemented in the ARGO code. We have tested our algorithm with the Bolshoi N-body simulation at redshift z = 0, inferring the underlying dark matter density field from subsamples of the halo catalogue with biases smaller and larger than one. Our method shows that we can draw closely unbiased samples (compatible within 1-σ) from the posterior distribution up to scales of about k ~ 1 h Mpc^-1 in terms of power-spectra and cell-to-cell correlations. We find that a Poisson likelihood including a scale-dependent non-linear deterministic bias can yield reconstructions with power spectra deviating more than 10 per cent at k = 0.2 h Mpc^-1. Our reconstruction algorithm is especially suited for emission line galaxy data for which a complex non-linear stochastic biasing treatment beyond Poissonity becomes indispensable.
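    The key likelihood ingredient named in this record is the negative binomial count model, which adds a dispersion parameter to Poisson counts. A sketch of the per-cell log-likelihood in scipy's parameterization, where the mapping p = r / (r + λ) gives mean λ and variance λ + λ²/r; the counts and intensities are toy values, not ARGO's implementation.

```python
import numpy as np
from scipy.stats import nbinom, poisson

def nb_logpmf(counts, lam, r):
    """Negative binomial log-likelihood with mean lam and dispersion r
    (variance lam + lam^2 / r); scipy uses (n, p) with p = r / (r + lam)."""
    p = r / (r + lam)
    return nbinom.logpmf(counts, r, p)

counts = np.array([0, 1, 3, 7, 2])                 # halos per grid cell (toy)
lam = np.array([0.5, 1.2, 2.8, 5.5, 2.1])          # biased density prediction
print("NB  loglike:", nb_logpmf(counts, lam, r=2.0).sum())
print("Poi loglike:", poisson.logpmf(counts, lam).sum())  # Poisson comparison
```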

  1. Bayesian inference of cosmic density fields from non-linear, scale-dependent, and stochastic biased tracers

    CERN Document Server

    Ata, Metin; Müller, Volker

    2014-01-01

    We present a Bayesian reconstruction algorithm to generate unbiased samples of the underlying dark matter field from galaxy redshift data. Our new contribution consists of implementing a non-Poisson likelihood including a deterministic non-linear and scale-dependent bias. In particular we present the Hamiltonian equations of motions for the negative binomial (NB) probability distribution function. This permits us to efficiently sample the posterior distribution function of density fields given a sample of galaxies using the Hamiltonian Monte Carlo technique implemented in the Argo code. We have tested our algorithm with the Bolshoi N-body simulation, inferring the underlying dark matter density field from a subsample of the halo catalogue. Our method shows that we can draw closely unbiased samples (compatible within 1-σ) from the posterior distribution up to scales of about k~1 h/Mpc in terms of power-spectra and cell-to-cell correlations. We find that a Poisson likelihood yields reconstructions with p...

  2. HASSET: a probability event tree tool to evaluate future volcanic scenarios using Bayesian inference. Presented as a plugin for QGIS.

    Science.gov (United States)

    Sobradelo, Rosa; Bartolini, Stefania; Martí, Joan

    2014-05-01

    Event tree structures constitute one of the most useful and necessary tools of modern volcanology to assess the volcanic hazard of future volcanic scenarios. They are particularly relevant for evaluating long- and short-term probabilities of occurrence of possible volcanic scenarios and their potential impacts on urbanized areas. Here we introduce HASSET, a Hazard Assessment Event Tree probability tool, built on an event tree structure that uses Bayesian inference to estimate the probability of occurrence of a future volcanic scenario, and to evaluate the most relevant sources of uncertainty in the corresponding volcanic system. HASSET includes hazard assessment of non-eruptive and non-magmatic volcanic scenarios, that is, episodes of unrest that do not evolve into a volcanic eruption but have an associated volcanic hazard (e.g., sector collapse and phreatic explosion), as well as those with external triggers as primary sources of unrest (as opposed to magmatic unrest alone). Additionally, HASSET introduces the Delta method to assess how precise the probability estimates are, by reporting a one-standard-deviation variability interval around the expected value for each scenario. HASSET is presented as a free software package in the form of a plugin for the open source geographic information system Quantum GIS (QGIS), providing a graphically supported computation of the event tree structure in an interactive and user-friendly way. We also include an example of HASSET applied to the Teide-Pico Viejo volcanic complex (Spain).
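    A single node of such a Bayesian event tree can be summarized as a Dirichlet-multinomial update, with a one-standard-deviation interval per branch analogous to the variability interval HASSET reports. The sketch below is generic: the branch names, prior weights, and counts are invented for illustration, and HASSET's actual prior structure and Delta-method computation may differ.

```python
import numpy as np

def event_tree_node(prior_weights, counts):
    """Posterior over the branches of one event-tree node: a Dirichlet(prior)
    updated with multinomial counts gives Dirichlet(prior + counts); returns
    posterior means and one-standard-deviation intervals per branch."""
    alpha = np.asarray(prior_weights, float) + np.asarray(counts, float)
    a0 = alpha.sum()
    mean = alpha / a0
    sd = np.sqrt(mean * (1.0 - mean) / (a0 + 1.0))  # Dirichlet marginal sd
    return mean, sd

branches = ["magmatic unrest", "geothermal unrest", "tectonic unrest"]
mean, sd = event_tree_node(prior_weights=[1, 1, 1], counts=[12, 4, 2])
for b, m, s in zip(branches, mean, sd):
    print(f"{b:18s}: {m:.2f} +/- {s:.2f}")
```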

  3. Bayesian and maximin optimal designs for heteroscedastic regression models

    OpenAIRE

    Dette, Holger; Haines, Linda M.; Imhof, Lorens A.

    2003-01-01

    The problem of constructing standardized maximin D-optimal designs for weighted polynomial regression models is addressed. In particular it is shown that, by following the broad approach to the construction of maximin designs introduced recently by Dette, Haines and Imhof (2003), such designs can be obtained as weak limits of the corresponding Bayesian Φq-optimal designs. The approach is illustrated for two specific weighted polynomial models and also for a particular growth model.

  4. Inference-based procedural modeling of solids

    KAUST Repository

    Biggers, Keith

    2011-11-01

    As virtual environments become larger and more complex, there is an increasing need for more automated construction algorithms to support the development process. We present an approach for modeling solids by combining prior examples with a simple sketch. Our algorithm uses an inference-based approach to incrementally fit patches together in a consistent fashion to define the boundary of an object. This algorithm samples and extracts surface patches from input models, and develops a Petri net structure that describes the relationship between patches along an imposed parameterization. Then, given a new parameterized line or curve, we use the Petri net to logically fit patches together in a manner consistent with the input model. This allows us to easily construct objects of varying sizes and configurations using arbitrary articulation, repetition, and interchanging of parts. The result of our process is a solid model representation of the constructed object that can be integrated into a simulation-based environment. © 2011 Elsevier Ltd. All rights reserved.

  5. Bayesian modeling growth curves for quail assuming skewness in errors

    Directory of Open Access Journals (Sweden)

    Robson Marcelo Rossi

    2014-06-01

    Full Text Available Bayesian modeling of growth curves for quail assuming skewness in errors - Assuming normal distributions in data analysis is common in different areas of knowledge. However, we can make use of other distributions that include a skewness parameter, in situations where we need to model data with tails heavier than the normal. This article presents alternatives to the assumption of normality in the errors by adding asymmetric distributions. A Bayesian approach is proposed to fit nonlinear models when the errors are not normal; thus, the t, skew-normal and skew-t distributions are adopted. The methodology is applied to fit different growth curves to quail body weights. It was found that the Gompertz model assuming skew-normal errors (for males) and skew-t errors (for females) gave the best fits to the data.

  6. APPLICATION OF BAYESIAN MONTE CARLO ANALYSIS TO A LAGRANGIAN PHOTOCHEMICAL AIR QUALITY MODEL. (R824792)

    Science.gov (United States)

    Uncertainties in ozone concentrations predicted with a Lagrangian photochemical air quality model have been estimated using Bayesian Monte Carlo (BMC) analysis. Bayesian Monte Carlo analysis provides a means of combining subjective "prior" uncertainty estimates developed ...

  7. A Bayesian approach to the inference of parametric configuration of the signal-to-noise ratio in an adaptive refinement of the measurements

    OpenAIRE

    Marquez, Maria Jose

    2012-01-01

    Calibration is nowadays one of the most important processes involved in the extraction of valuable data from measurements. The current availability of an optimum data cube measured from a heterogeneous set of instruments and surveys relies on a systematic and robust approach in the corresponding measurement analysis. In that sense, the inference of configurable instrument parameters can considerably increase the quality of the data obtained. This paper proposes a solution based on Bayesian in...

  8. Gas turbine engine prognostics using Bayesian hierarchical models: A variational approach

    Science.gov (United States)

    Zaidan, Martha A.; Mills, Andrew R.; Harrison, Robert F.; Fleming, Peter J.

    2016-03-01

    Prognostics is an emerging requirement of modern health monitoring that aims to increase the fidelity of failure-time predictions by the appropriate use of sensory and reliability information. In the aerospace industry it is a key technology to reduce life-cycle costs, improve reliability and asset availability for a diverse fleet of gas turbine engines. In this work, a Bayesian hierarchical model is selected to utilise fleet data from multiple assets to perform probabilistic estimation of remaining useful life (RUL) for civil aerospace gas turbine engines. The hierarchical formulation allows Bayesian updates of an individual predictive model to be made, based upon data received asynchronously from a fleet of assets with different in-service lives and for the entry of new assets into the fleet. In this paper, variational inference is applied to the hierarchical formulation to overcome the computational and convergence concerns that are raised by the numerical sampling techniques needed for inference in the original formulation. The algorithm is tested on synthetic data, where the quality of approximation is shown to be satisfactory with respect to prediction performance, computational speed, and ease of use. A case study of in-service gas turbine engine data demonstrates the value of integrating fleet data for accurately predicting degradation trajectories of assets.
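    Variational inference replaces sampling with optimization of a factorized approximation, which is what makes the hierarchical formulation above computationally tractable. The sketch below runs coordinate-ascent (CAVI) updates for a standard Normal-Gamma toy model as a stand-in for the engine-fleet model; the priors and data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(8)
x = rng.normal(2.0, 0.5, size=50)                  # e.g. degradation increments
n, xbar = x.size, x.mean()

# Priors: mu ~ N(mu0, (lam0 * tau)^-1), tau ~ Gamma(a0, b0).
mu0, lam0, a0, b0 = 0.0, 1.0, 1.0, 1.0

# Factorized approximation q(mu) q(tau); coordinate-ascent updates.
E_tau = 1.0
for _ in range(50):
    # q(mu) = N(mu_n, 1 / lam_n)
    mu_n = (lam0 * mu0 + n * xbar) / (lam0 + n)
    lam_n = (lam0 + n) * E_tau
    # q(tau) = Gamma(a_n, b_n)
    a_n = a0 + 0.5 * (n + 1)
    E_sq = np.sum((x - mu_n) ** 2) + n / lam_n      # E_q[sum (x_i - mu)^2]
    b_n = b0 + 0.5 * (E_sq + lam0 * ((mu_n - mu0) ** 2 + 1.0 / lam_n))
    E_tau = a_n / b_n
print(f"q(mu) mean {mu_n:.3f}, q(tau) mean {E_tau:.3f} "
      f"(true mean 2.0, true precision {1 / 0.25:.1f})")
```

    Each update is a closed-form expectation rather than a Monte Carlo draw, which is why the variational route avoids the convergence diagnostics that sampling in the original formulation requires.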

  9. A fully Bayesian method for jointly fitting instrumental calibration and X-ray spectral models

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Jin; Yu, Yaming [Department of Statistics, University of California, Irvine, Irvine, CA 92697-1250 (United States); Van Dyk, David A. [Statistics Section, Imperial College London, Huxley Building, South Kensington Campus, London SW7 2AZ (United Kingdom); Kashyap, Vinay L.; Siemiginowska, Aneta; Drake, Jeremy; Ratzlaff, Pete [Smithsonian Astrophysical Observatory, 60 Garden Street, Cambridge, MA 02138 (United States); Connors, Alanna; Meng, Xiao-Li, E-mail: jinx@uci.edu, E-mail: yamingy@ics.uci.edu, E-mail: dvandyk@imperial.ac.uk, E-mail: vkashyap@cfa.harvard.edu, E-mail: asiemiginowska@cfa.harvard.edu, E-mail: jdrake@cfa.harvard.edu, E-mail: pratzlaff@cfa.harvard.edu, E-mail: meng@stat.harvard.edu [Department of Statistics, Harvard University, 1 Oxford Street, Cambridge, MA 02138 (United States)

    2014-10-20

    Owing to a lack of robust principled methods, systematic instrumental uncertainties have generally been ignored in astrophysical data analysis despite wide recognition of the importance of including them. Ignoring calibration uncertainty can cause bias in the estimation of source model parameters and can lead to underestimation of the variance of these estimates. We previously introduced a pragmatic Bayesian method to address this problem. The method is 'pragmatic' in that it introduced an ad hoc technique that simplified computation by neglecting the potential information in the data for narrowing the uncertainty for the calibration product. Following that work, we use a principal component analysis to efficiently represent the uncertainty of the effective area of an X-ray (or γ-ray) telescope. Here, however, we leverage this representation to enable a principled, fully Bayesian method that coherently accounts for the calibration uncertainty in high-energy spectral analysis. In this setting, the method is compared with standard analysis techniques and the pragmatic Bayesian method. The advantage of the fully Bayesian method is that it allows the data to provide information not only for estimation of the source parameters but also for the calibration product—here the effective area, conditional on the adopted spectral model. In this way, it can yield more accurate and efficient estimates of the source parameters along with valid estimates of their uncertainty. Provided that the source spectrum can be accurately described by a parameterized model, this method allows rigorous inference about the effective area by quantifying which possible curves are most consistent with the data.

  10. A fully Bayesian method for jointly fitting instrumental calibration and X-ray spectral models

    International Nuclear Information System (INIS)

    Owing to a lack of robust principled methods, systematic instrumental uncertainties have generally been ignored in astrophysical data analysis despite wide recognition of the importance of including them. Ignoring calibration uncertainty can cause bias in the estimation of source model parameters and can lead to underestimation of the variance of these estimates. We previously introduced a pragmatic Bayesian method to address this problem. The method is 'pragmatic' in that it introduced an ad hoc technique that simplified computation by neglecting the potential information in the data for narrowing the uncertainty for the calibration product. Following that work, we use a principal component analysis to efficiently represent the uncertainty of the effective area of an X-ray (or γ-ray) telescope. Here, however, we leverage this representation to enable a principled, fully Bayesian method that coherently accounts for the calibration uncertainty in high-energy spectral analysis. In this setting, the method is compared with standard analysis techniques and the pragmatic Bayesian method. The advantage of the fully Bayesian method is that it allows the data to provide information not only for estimation of the source parameters but also for the calibration product—here the effective area, conditional on the adopted spectral model. In this way, it can yield more accurate and efficient estimates of the source parameters along with valid estimates of their uncertainty. Provided that the source spectrum can be accurately described by a parameterized model, this method allows rigorous inference about the effective area by quantifying which possible curves are most consistent with the data.

  11. A Bayesian nonlinear mixed-effects disease progression model

    Science.gov (United States)

    Kim, Seongho; Jang, Hyejeong; Wu, Dongfeng; Abrams, Judith

    2016-01-01

    A nonlinear mixed-effects approach is developed for disease progression models that incorporate variation in age in a Bayesian framework. We further generalize the probability model for sensitivity to depend on age at diagnosis, time spent in the preclinical state and sojourn time. The developed models are then applied to the Johns Hopkins Lung Project data and the Health Insurance Plan for Greater New York data using Bayesian Markov chain Monte Carlo and are compared with the estimation method that does not consider random-effects from age. Using the developed models, we obtain not only age-specific individual-level distributions, but also population-level distributions of sensitivity, sojourn time and transition probability. PMID:26798562

  12. Inferring gene regression networks with model trees

    Directory of Open Access Journals (Sweden)

    Aguilar-Ruiz Jesus S

    2010-10-01

    Full Text Available Abstract Background Novel strategies are required in order to handle the huge amount of data produced by microarray technologies. To infer gene regulatory networks, the first step is to find direct regulatory relationships between genes by building the so-called gene co-expression networks. They are typically generated using correlation statistics as pairwise similarity measures. Correlation-based methods are very useful in order to determine whether two genes have a strong global similarity but do not detect local similarities. Results We propose model trees as a method to identify gene interaction networks. While correlation-based methods analyze each pair of genes, in our approach we generate a single regression tree for each gene from the remaining genes. Finally, a graph of all the relationships among output and input genes is built, taking into account whether the pair of genes is statistically significant. For this reason we apply a statistical procedure to control the false discovery rate. The performance of our approach, named REGNET, is experimentally tested on two well-known data sets: the Saccharomyces cerevisiae and E. coli data sets. First, the biological coherence of the results is tested. Second, the E. coli transcriptional network (in the Regulon database) is used as a control to compare the results to those of a correlation-based method. This experiment shows that REGNET performs more accurately at detecting true gene associations than the Pearson and Spearman zeroth and first-order correlation-based methods. Conclusions REGNET generates gene association networks from gene expression data, and differs from correlation-based methods in that the relationship between one gene and others is calculated simultaneously. Model trees are very useful techniques to estimate the numerical values for the target genes by linear regression functions. They are very often more precise than linear regression models because they can add just different linear …

  13. Modelling the dynamics of an experimental host-pathogen microcosm within a hierarchical Bayesian framework.

    Directory of Open Access Journals (Sweden)

    David Lunn

    Full Text Available The advantages of Bayesian statistical approaches, such as flexibility and the ability to acknowledge uncertainty in all parameters, have made them the prevailing method for analysing the spread of infectious diseases in human or animal populations. We introduce a Bayesian approach to experimental host-pathogen systems that shares these attractive features. Since uncertainty in all parameters is acknowledged, existing information can be accounted for through prior distributions, rather than through fixing some parameter values. The non-linear dynamics, multi-factorial design, multiple measurements of responses over time and sampling error that are typical features of experimental host-pathogen systems can also be naturally incorporated. We analyse the dynamics of the free-living protozoan Paramecium caudatum and its specialist bacterial parasite Holospora undulata. Our analysis provides strong evidence for a saturable infection function, and we were able to reproduce the two waves of infection apparent in the data by separating the initial inoculum from the parasites released after the first cycle of infection. In addition, the parameter estimates from the hierarchical model can be combined to infer variations in the parasite's basic reproductive ratio across experimental groups, enabling us to make predictions about the effect of resources and host genotype on the ability of the parasite to spread. Even though the high level of variability between replicates limited the resolution of the results, this Bayesian framework has strong potential to be used more widely in experimental ecology.

  14. Bayesian networks with applications in reliability analysis

    OpenAIRE

    Langseth, Helge

    2002-01-01

    A common goal of the papers in this thesis is to propose, formalize and exemplify the use of Bayesian networks as a modelling tool in reliability analysis. The papers span work in which Bayesian networks are merely used as a modelling tool (Paper I), work where models are specially designed to utilize the inference algorithms of Bayesian networks (Paper II and Paper III), and work where the focus has been on extending the applicability of Bayesian networks to very large domains (Paper IV and ...

  15. Non-stationarity in GARCH models: A Bayesian analysis

    OpenAIRE

    Kleibergen, Frank; Dijk, Herman

    1993-01-01

    First, the non-stationarity properties of the conditional variances in the GARCH(1,1) model are analysed using the concept of infinite persistence of shocks. Given a time sequence of probabilities for increasing/decreasing conditional variances, a theoretical formula for quasi-strict non-stationarity is defined. The resulting conditions for the GARCH(1,1) model are shown to differ from the weak stationarity conditions mainly used in the literature. Bayesian statistical analysis us...
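
    For readers unfamiliar with the model, the GARCH(1,1) conditional-variance recursion and its stationarity boundary can be illustrated with a short simulation. The parameter values below are illustrative only; the record's own analysis of quasi-strict non-stationarity is more subtle than the weak-stationarity condition alpha + beta < 1 shown here.

      import numpy as np

      def simulate_garch11(omega, alpha, beta, n=2000, seed=0):
          """Simulate eps_t = sigma_t * z_t with
          sigma2_t = omega + alpha * eps_{t-1}**2 + beta * sigma2_{t-1}."""
          rng = np.random.default_rng(seed)
          sigma2, eps = np.empty(n), np.empty(n)
          sigma2[0] = omega / max(1.0 - alpha - beta, 1e-6)  # unconditional variance, if finite
          eps[0] = np.sqrt(sigma2[0]) * rng.standard_normal()
          for t in range(1, n):
              sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
              eps[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
          return eps, sigma2

      # weakly stationary (alpha + beta < 1) versus integrated (alpha + beta = 1) case
      for a, b in [(0.10, 0.80), (0.15, 0.85)]:
          _, s2 = simulate_garch11(omega=0.05, alpha=a, beta=b)
          print(f"alpha+beta={a+b:.2f}: mean variance={s2.mean():8.2f}, final={s2[-1]:8.2f}")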

  16. A New Bayesian Unit Root Test in Stochastic Volatility Models

    OpenAIRE

    Yong Li; Jun Yu

    2010-01-01

    A new posterior odds analysis is proposed to test for a unit root in volatility dynamics in the context of stochastic volatility models. This analysis extends the Bayesian unit root test of So and Li (1999, Journal of Business & Economic Statistics) in two important ways. First, a numerically more stable algorithm is introduced to compute the Bayes factor, taking into account the special structure of the competing models. Owing to its numerical stability, the algorithm overcomes the problem of ...

  17. Bayesian Modelling in Machine Learning: A Tutorial Review

    OpenAIRE

    Seeger, Matthias

    2006-01-01

    Many facets of Bayesian Modelling are firmly established in Machine Learning and give rise to state-of-the-art solutions to application problems. The sheer number of techniques, ideas and models which have been proposed, and the terminology, can be bewildering. With this tutorial review, we aim to give a broad, high-level overview of this important field, concentrating on central ideas and methods, and on their interconnections. The reader will gain a basic understanding of the topics and the...

  18. Bayesian modeling and prediction of solar particles flux

    International Nuclear Information System (INIS)

    An autoregression model was developed based on the Bayesian approach. Considering the non-homogeneity of the solar wind, the idea of combining the pure autoregressive properties of the model with expert knowledge was applied, exploiting the similar behaviour of various phenomena related to the flux properties. Examples of such situations include the hardening of the X-ray spectrum, which is often followed by a coronal mass ejection and a significant increase in particle flux intensity.
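
    The record does not give the model's details, so the sketch below is a generic conjugate Bayesian AR(p) posterior update (zero prior mean, known noise variance) rather than the authors' model; the expert-knowledge component mentioned above would enter through the prior.

      import numpy as np

      def ar_design(y, p):
          """Lagged design matrix: row t holds y[t-1], ..., y[t-p]."""
          X = np.column_stack([y[p - 1 - k: len(y) - 1 - k] for k in range(p)])
          return X, y[p:]

      def ar_posterior(y, p, sigma2=1.0, prior_var=10.0):
          """Gaussian posterior over AR coefficients under a zero-mean normal prior."""
          X, t = ar_design(y, p)
          precision = np.eye(p) / prior_var + X.T @ X / sigma2
          cov = np.linalg.inv(precision)
          mean = cov @ (X.T @ t / sigma2)
          return mean, cov

      rng = np.random.default_rng(1)
      flux = [0.5, 0.3]                          # toy AR(2) "flux" series
      for _ in range(500):
          flux.append(0.6 * flux[-1] + 0.3 * flux[-2] + 0.1 * rng.standard_normal())
      mean, cov = ar_posterior(np.array(flux), p=2, sigma2=0.01)
      print("posterior mean of AR coefficients:", mean.round(3))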

  19. Bayesian modeling and prediction of solar particles flux

    Czech Academy of Sciences Publication Activity Database

    Dedecius, Kamil; Kalová, J.

    18/56/, 7/8 (2010), s. 228-230. ISSN 1210-7085 R&D Projects: GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : mathematical models * solar activity * solar flares * solar flux * solar particles Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2010/AS/dedecius-bayesian modeling and prediction of solar particles flux.pdf

  20. Hierarchical Bayesian Modeling of Hitting Performance in Baseball

    OpenAIRE

    Jensen, Shane T.; McShane, Blake; Wyner, Abraham J.

    2009-01-01

    We have developed a sophisticated statistical model for predicting the hitting performance of Major League baseball players. The Bayesian paradigm provides a principled method for balancing past performance with crucial covariates, such as player age and position. We share information across time and across players by using mixture distributions to control shrinkage for improved accuracy. We compare the performance of our model to current sabermetric methods on a held-out seaso...

  1. Bayesian estimation of a DSGE model with inventories

    OpenAIRE

    Foerster, Marcel

    2011-01-01

    This paper introduces inventories in an otherwise standard Dynamic Stochastic General Equilibrium Model (DSGE) of the business cycle. Firms accumulate inventories to facilitate sales, but face a cost of doing so in terms of costly storage of intermediate goods. The paper's main contribution is to present a DSGE model with inventories that is estimated using Bayesian methods. Based on US data we show that accounting for inventory dynamics has a significant impact on parameter estimates and imp...

  2. Computational modeling of neural activities for statistical inference

    CERN Document Server

    Kolossa, Antonio

    2016-01-01

    This authored monograph supplies empirical evidence for the Bayesian brain hypothesis by modeling event-related potentials (ERP) of the human electroencephalogram (EEG) during successive trials in cognitive tasks. The observer models employed are used to compute probability distributions over observable events and hidden states, depending on which are present in the respective tasks. Bayesian model selection is then used to choose the model which best explains the ERP amplitude fluctuations. Thus, this book constitutes a decisive step towards a better understanding of the neural coding and computing of probabilities following Bayesian rules. The target audience primarily comprises research experts in the field of computational neurosciences, but the book may also be beneficial for graduate students who want to specialize in this field.

  3. Bayesian Adaptive Exploration

    Science.gov (United States)

    Loredo, Thomas J.

    2004-04-01

    I describe a framework for adaptive scientific exploration based on iterating an Observation-Inference-Design cycle that allows adjustment of hypotheses and observing protocols in response to the results of observation on the fly, as data are gathered. The framework uses a unified Bayesian methodology for the inference and design stages: Bayesian inference to quantify what we have learned from the available data and predict future data, and Bayesian decision theory to identify which new observations would teach us the most. When the goal of the experiment is simply to make inferences, the framework identifies a computationally efficient iterative "maximum entropy sampling" strategy as the optimal strategy in settings where the noise statistics are independent of signal properties. Results of applying the method to two "toy" problems with simulated data (measuring the orbit of an extrasolar planet, and locating a hidden one-dimensional object) show the approach can significantly improve observational efficiency in settings that have well-defined nonlinear models. I conclude with a list of open issues that must be addressed to make Bayesian adaptive exploration a practical and reliable tool for optimizing scientific exploration.
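
    The following toy sketch illustrates the maximum entropy sampling idea: under a Gaussian predictive distribution, the design point with the largest predictive variance is the one whose observation is expected to be most informative. The Bayesian linear model on radial basis features and the sine "truth" are stand-ins, not the paper's extrasolar-planet setup.

      import numpy as np

      rng = np.random.default_rng(2)
      grid = np.linspace(0.0, 1.0, 101)                # candidate design points
      centers = np.linspace(0.0, 1.0, 12)
      phi = lambda x: np.exp(-0.5 * ((np.atleast_1d(x)[:, None] - centers) / 0.1) ** 2)

      noise, prior = 0.05, 1.0
      truth = lambda x: np.sin(6 * np.pi * x)          # hidden signal (illustrative)
      X_obs, y_obs = np.empty((0, len(centers))), np.empty(0)

      for step in range(10):
          # posterior over weights given observations so far
          A = np.eye(len(centers)) / prior + X_obs.T @ X_obs / noise**2
          cov = np.linalg.inv(A)
          mean = cov @ (X_obs.T @ y_obs / noise**2)
          # predictive variance on the grid; observe where it is largest
          P = phi(grid)
          var = np.einsum("ij,jk,ik->i", P, cov, P) + noise**2
          x_next = grid[np.argmax(var)]
          X_obs = np.vstack([X_obs, phi(x_next)])
          y_obs = np.append(y_obs, truth(x_next) + noise * rng.standard_normal())
          print(f"step {step}: observe x={x_next:.2f}, max predictive sd={var.max()**0.5:.3f}")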

  4. Bayesian point event modeling in spatial and environmental epidemiology.

    Science.gov (United States)

    Lawson, Andrew B

    2012-10-01

    This paper reviews the current state of point event modeling in spatial epidemiology from a Bayesian perspective. Point event (or case event) data arise when geo-coded addresses of disease events are available. Often, this level of spatial resolution would not be accessible due to medical confidentiality constraints. However, for the examination of small spatial scales, it is important to be capable of examining point process data directly. Models for such data are usually formulated based on point process theory. In addition, special conditioning arguments can lead to simpler Bernoulli likelihoods and logistic spatial models. Goodness-of-fit diagnostics and Bayesian residuals are also considered. Applications within putative health hazard risk assessment, cluster detection, and linkage to environmental risk fields (misalignment) are considered. PMID:23035034

  5. Bayesian Age-Period-Cohort Modeling and Prediction - BAMP

    Directory of Open Access Journals (Sweden)

    Volker J. Schmid

    2007-10-01

    Full Text Available The software package BAMP provides a method for analyzing incidence or mortality data on the Lexis diagram, using a Bayesian version of an age-period-cohort model. A hierarchical model is assumed, with a binomial model at the first stage. As smoothing priors for the age, period and cohort parameters, random walks of first and second order are available, with and without an additional unstructured component. Unstructured heterogeneity can also be included in the model. In order to evaluate the model fit, posterior deviance, DIC and predictive deviances are computed. By projecting the random walk prior into the future, future death rates can be predicted.
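
    In standard notation (a sketch, not copied from the BAMP documentation), the model described above can be written as

      \begin{align*}
        y_{ij} &\sim \operatorname{Binomial}(n_{ij}, \pi_{ij}), \\
        \operatorname{logit}(\pi_{ij}) &= \mu + \theta_i + \varphi_j + \psi_{k(i,j)},
      \end{align*}

    where $i$, $j$ and $k(i,j)$ index age, period and cohort. A first-order random walk prior takes $\theta_i \mid \theta_{i-1} \sim \mathcal{N}(\theta_{i-1}, \sigma_\theta^2)$, a second-order one $\theta_i \mid \theta_{i-1}, \theta_{i-2} \sim \mathcal{N}(2\theta_{i-1} - \theta_{i-2}, \sigma_\theta^2)$, with analogous priors on $\varphi_j$ and $\psi_k$; continuing the period and cohort walks beyond the observed range yields the predicted future rates.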

  6. Inferring Population Size History from Large Samples of Genome-Wide Molecular Data - An Approximate Bayesian Computation Approach.

    Directory of Open Access Journals (Sweden)

    Simon Boitard

    2016-03-01

    Full Text Available Inferring the ancestral dynamics of effective population size is a long-standing question in population genetics, which can now be tackled much more accurately thanks to the massive genomic data available in many species. Several promising methods that take advantage of whole-genome sequences have recently been developed in this context. However, they can only be applied to rather small samples, which limits their ability to estimate recent population size history. Besides, they can be very sensitive to sequencing or phasing errors. Here we introduce a new approximate Bayesian computation approach named PopSizeABC that allows estimating the evolution of the effective population size through time, using a large sample of complete genomes. This sample is summarized using the folded allele frequency spectrum and the average zygotic linkage disequilibrium at different bins of physical distance, two classes of statistics that are widely used in population genetics and can easily be computed from unphased and unpolarized SNP data. Our approach provides accurate estimates of past population sizes, from the very first generations before present back to the expected time to the most recent common ancestor of the sample, as shown by simulations under a wide range of demographic scenarios. When applied to samples of 15 or 25 complete genomes in four cattle breeds (Angus, Fleckvieh, Holstein and Jersey), PopSizeABC revealed a series of population declines, related to historical events such as domestication or modern breed creation. We further highlight that our approach is robust to sequencing errors, provided summary statistics are computed from SNPs with common alleles.
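
    The two classes of summary statistics can be computed from unphased data roughly as follows (toy genotypes and illustrative bin edges; PopSizeABC's exact definitions may differ, and since the toy SNPs are unlinked, the r-squared values sit near their sampling floor).

      import numpy as np

      rng = np.random.default_rng(3)
      n_ind, n_snp = 25, 5000
      geno = rng.integers(0, 3, size=(n_ind, n_snp))   # unphased genotypes 0/1/2

      # folded allele frequency spectrum: minor-allele counts need no
      # ancestral-allele polarization
      ac = geno.sum(axis=0)                            # alternate-allele counts
      mac = np.minimum(ac, 2 * n_ind - ac)             # minor-allele counts
      afs = np.bincount(mac, minlength=n_ind + 1)[1:]  # drop the monomorphic class

      # zygotic LD (genotype correlation r^2) in bins of physical distance
      pos = np.sort(rng.integers(0, 10**7, size=n_snp))
      for lo, hi in [(10**4, 10**5), (10**5, 10**6)]:
          r2 = []
          for _ in range(2000):                        # sample SNP pairs per bin
              i = rng.integers(0, n_snp)
              # nearest SNP at or beyond a drawn offset (approximate binning)
              j = np.searchsorted(pos, pos[i] + rng.integers(lo, hi))
              if j < n_snp and geno[:, i].std() > 0 and geno[:, j].std() > 0:
                  r2.append(np.corrcoef(geno[:, i], geno[:, j])[0, 1] ** 2)
          print(f"bin {lo}-{hi} bp: mean zygotic r^2 = {np.mean(r2):.3f}")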

  7. Bayesian inference for an emerging arboreal epidemic in the presence of control.

    Science.gov (United States)

    Parry, Matthew; Gibson, Gavin J; Parnell, Stephen; Gottwald, Tim R; Irey, Michael S; Gast, Timothy C; Gilligan, Christopher A

    2014-04-29

    The spread of Huanglongbing through citrus groves is used as a case study for modeling an emerging epidemic in the presence of a control. Specifically, the spread of the disease is modeled as a susceptible-exposed-infectious-detected-removed epidemic, where the exposure and infectious times are not observed, detection times are censored, removal times are known, and the disease is spreading through a heterogeneous host population with trees of different age and susceptibility. We show that it is possible to characterize the disease transmission process under these conditions. Two innovations in our work are (i) accounting for control measures via time dependence of the infectious process and (ii) including seasonal and host age effects in the model of the latent period. By estimating parameters in different subregions of a large commercially cultivated orchard, we establish a temporal pattern of invasion, host age dependence of the dispersal parameters, and a close to linear relationship between primary and secondary infectious rates. The model can be used to simulate Huanglongbing epidemics to assess economic costs and potential benefits of putative control scenarios. PMID:24711393
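
    A discrete-time toy simulation of the susceptible-exposed-infectious-detected-removed structure, with a time-dependent infectious rate standing in for control measures, conveys the flavor of such models; all rates and the control scenario below are invented for illustration, not estimates from the paper.

      import numpy as np

      rng = np.random.default_rng(4)
      n = 500                                          # trees in the grove
      state = np.zeros(n, dtype=int)                   # 0=S 1=E 2=I 3=D 4=R
      state[rng.choice(n, 3, replace=False)] = 2       # initial infectious trees

      beta0, latent, detect, remove = 0.08, 1/30, 1/60, 1/20  # daily rates (illustrative)
      for day in range(365):
          beta = beta0 * (0.3 if day > 180 else 1.0)   # control reduces transmission at day 180
          pressure = beta * np.mean(state == 2)        # force of infection on susceptibles
          u = rng.random(n)
          state[(state == 3) & (u < remove)] = 4       # detected    -> removed
          state[(state == 2) & (u < detect)] = 3       # infectious  -> detected
          state[(state == 1) & (u < latent)] = 2       # exposed     -> infectious
          state[(state == 0) & (u < pressure)] = 1     # susceptible -> exposed

      print("final counts S,E,I,D,R:", np.bincount(state, minlength=5))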

  8. Data-Driven Inference on Sign Restrictions in Bayesian Structural Vector Autoregression

    DEFF Research Database (Denmark)

    Lanne, Markku; Luoto, Jani

    Sign-identified structural vector autoregressive (SVAR) models have recently become popular. However, the conventional approach to sign restrictions only yields set identification, and implicitly assumes an informative prior distribution of the impulse responses whose influence does not vanish as...... methods by two empirical applications to U.S. macroeconomic data....

  9. Bayesian analysis of recursive SVAR models with overidentifying restrictions

    OpenAIRE

    Kociecki, Andrzej; Rubaszek, Michał; Ca' Zorzi, Michele

    2012-01-01

    The paper provides a novel Bayesian methodological framework to estimate structural VAR (SVAR) models with recursive identification schemes that allows for the inclusion of over-identifying restrictions. The proposed framework enables the researcher to (i) elicit the prior on the non-zero contemporaneous relations between economic variables and to (ii) derive an analytical expression for the posterior distribution and marginal data density. We illustrate our methodological framework by estima...

  10. Bayesian parsimonious covariance estimation for hierarchical linear mixed models

    OpenAIRE

    Frühwirth-Schnatter, Sylvia; Tüchler, Regina

    2004-01-01

    We considered a non-centered parameterization of the standard random-effects model, which is based on the Cholesky decomposition of the variance-covariance matrix. The regression-type structure of the non-centered parameterization allows one to choose a simple, conditionally conjugate normal prior on the Cholesky factor. Based on the non-centered parameterization, we search for a parsimonious variance-covariance matrix by identifying the non-zero elements of the Cholesky factors using Bayesian va...
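
    A generic illustration of the parameterization (not the authors' sampler): writing the random effects as b_i = C z_i with z_i ~ N(0, I) makes the implied covariance C C', so normal priors, and selection to exact zeros, can act directly on the elements of the lower-triangular factor C.

      import numpy as np

      rng = np.random.default_rng(5)
      q = 3
      C = np.tril(rng.normal(scale=0.5, size=(q, q)))  # lower-triangular factor
      C[np.diag_indices(q)] = np.abs(C[np.diag_indices(q)]) + 0.1
      C[2, 0] = 0.0                                    # a "selected-out" element

      z = rng.standard_normal((1000, q))               # non-centered innovations
      b = z @ C.T                                      # draws of the random effects
      print("implied covariance C C':\n", (C @ C.T).round(2))
      print("empirical covariance:\n", np.cov(b.T).round(2))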

  11. Diffusion Estimation Of State-Space Models: Bayesian Formulation

    Czech Academy of Sciences Publication Activity Database

    Dedecius, Kamil

    Reims: IEEE, 2014. ISBN 978-1-4799-3693-9. [The 24th IEEE International Workshop on Machine Learning for Signal Processing (MLSP2014). Reims (FR), 21.09.2014-24.09.2014] R&D Projects: GA ČR(CZ) GP14-06678P Keywords : distributed estimation * state-space models * Bayesian estimation Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2014/AS/dedecius-0431804.pdf

  12. Bayesian Methods for Neural Networks and Related Models

    OpenAIRE

    Titterington, D.M.

    2004-01-01

    Models such as feed-forward neural networks and certain other structures investigated in the computer science literature are not amenable to closed-form Bayesian analysis. The paper reviews the various approaches taken to overcome this difficulty, involving the use of Gaussian approximations, Markov chain Monte Carlo simulation routines and a class of non-Gaussian but “deterministic” approximations called variational approximations.

  13. Bayesian network models in brain functional connectivity analysis

    OpenAIRE

    Ide, Jaime S.; Zhang, Sheng; Chiang-shan R. Li

    2013-01-01

    Much effort has been made to better understand the complex integration of distinct parts of the human brain using functional magnetic resonance imaging (fMRI). Altered functional connectivity between brain regions is associated with many neurological and mental illnesses, such as Alzheimer's and Parkinson's diseases, addiction, and depression. In computational science, Bayesian networks (BN) have been used in a broad range of studies to model complex data sets in the presence of uncertainty and wh...

  14. Dead or gone? Bayesian inference on mortality for the dispersing sex

    DEFF Research Database (Denmark)

    Barthold, Julia; Packer, Craig; Loveridge, Andrew;

    2016-01-01

    Estimates of age-specific mortality are regularly used in ecology, evolution, and conservation research. However, estimating mortality of the dispersing sex, in species where one sex undergoes natal dispersal, is difficult. This is because it is often unclear whether members of the dispersing sex that disappear from monitored areas have died or dispersed. Here, we develop an extension of a multievent model that imputes dispersal state (i.e., died or dispersed) for uncertain records of the dispersing sex as a latent state and estimates age-specific mortality and dispersal parameters in a... provides a solution to the challenge of estimating mortality of the dispersing sex in species with data deficiency due to natal dispersal. Given the utility of sex-specific mortality estimates in biological and conservation research, and the virtual ubiquity of sex-biased dispersal, our model will be...

  15. Technical Note: Approximate Bayesian parameterization of a complex tropical forest model

    Directory of Open Access Journals (Sweden)

    F. Hartig

    2013-08-01

    Full Text Available Inverse parameter estimation of process-based models is a long-standing problem in ecology and evolution. A key problem of inverse parameter estimation is to define a metric that quantifies how well model predictions fit the data. Such a metric can be expressed by general cost or objective functions, but statistical inversion approaches are based on a particular metric, the probability of observing the data given the model, known as the likelihood. Deriving likelihoods for dynamic models requires making assumptions about the probability of observations deviating from mean model predictions. For technical reasons, these assumptions are usually derived without explicit consideration of the processes in the simulation. Only in recent years have new methods become available that allow generating likelihoods directly from stochastic simulations. Previous applications of these approximate Bayesian methods have concentrated on relatively simple models. Here, we report on the application of a simulation-based likelihood approximation for FORMIND, a parameter-rich individual-based model of tropical forest dynamics. We show that approximate Bayesian inference, based on a parametric likelihood approximation placed in a conventional MCMC, performs well in retrieving known parameter values from virtual field data generated by the forest model. We analyze the results of the parameter estimation, examine the sensitivity to the choice and aggregation of model outputs and observed data (summary statistics), and show results from using this method to fit the FORMIND model to field data from an Ecuadorian tropical forest. Finally, we discuss differences between this approach and Approximate Bayesian Computing (ABC), another commonly used method to generate simulation-based likelihood approximations. Our results demonstrate that simulation-based inference, which offers considerable conceptual advantages over more traditional methods for inverse parameter

  16. Technical Note: Approximate Bayesian parameterization of a process-based tropical forest model

    Science.gov (United States)

    Hartig, F.; Dislich, C.; Wiegand, T.; Huth, A.

    2014-02-01

    Inverse parameter estimation of process-based models is a long-standing problem in many scientific disciplines. A key question for inverse parameter estimation is how to define the metric that quantifies how well model predictions fit to the data. This metric can be expressed by general cost or objective functions, but statistical inversion methods require a particular metric, the probability of observing the data given the model parameters, known as the likelihood. For technical and computational reasons, likelihoods for process-based stochastic models are usually based on general assumptions about variability in the observed data, and not on the stochasticity generated by the model. Only in recent years have new methods become available that allow the generation of likelihoods directly from stochastic simulations. Previous applications of these approximate Bayesian methods have concentrated on relatively simple models. Here, we report on the application of a simulation-based likelihood approximation for FORMIND, a parameter-rich individual-based model of tropical forest dynamics. We show that approximate Bayesian inference, based on a parametric likelihood approximation placed in a conventional Markov chain Monte Carlo (MCMC) sampler, performs well in retrieving known parameter values from virtual inventory data generated by the forest model. We analyze the results of the parameter estimation, examine its sensitivity to the choice and aggregation of model outputs and observed data (summary statistics), and demonstrate the application of this method by fitting the FORMIND model to field data from an Ecuadorian tropical forest. Finally, we discuss how this approach differs from approximate Bayesian computation (ABC), another method commonly used to generate simulation-based likelihood approximations. Our results demonstrate that simulation-based inference, which offers considerable conceptual advantages over more traditional methods for inverse parameter estimation

  17. Technical Note: Approximate Bayesian parameterization of a complex tropical forest model

    Science.gov (United States)

    Hartig, F.; Dislich, C.; Wiegand, T.; Huth, A.

    2013-08-01

    Inverse parameter estimation of process-based models is a long-standing problem in ecology and evolution. A key problem of inverse parameter estimation is to define a metric that quantifies how well model predictions fit the data. Such a metric can be expressed by general cost or objective functions, but statistical inversion approaches are based on a particular metric, the probability of observing the data given the model, known as the likelihood. Deriving likelihoods for dynamic models requires making assumptions about the probability of observations deviating from mean model predictions. For technical reasons, these assumptions are usually derived without explicit consideration of the processes in the simulation. Only in recent years have new methods become available that allow generating likelihoods directly from stochastic simulations. Previous applications of these approximate Bayesian methods have concentrated on relatively simple models. Here, we report on the application of a simulation-based likelihood approximation for FORMIND, a parameter-rich individual-based model of tropical forest dynamics. We show that approximate Bayesian inference, based on a parametric likelihood approximation placed in a conventional MCMC, performs well in retrieving known parameter values from virtual field data generated by the forest model. We analyze the results of the parameter estimation, examine the sensitivity to the choice and aggregation of model outputs and observed data (summary statistics), and show results from using this method to fit the FORMIND model to field data from an Ecuadorian tropical forest. Finally, we discuss differences between this approach and Approximate Bayesian Computing (ABC), another commonly used method to generate simulation-based likelihood approximations. Our results demonstrate that simulation-based inference, which offers considerable conceptual advantages over more traditional methods for inverse parameter estimation, can
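
    A minimal sketch of the core idea shared by the three records above: a parametric (here Gaussian, often called synthetic) likelihood is fitted to repeated stochastic simulations at each proposed parameter and plugged into a conventional random-walk Metropolis sampler. The one-parameter simulator is a stand-in, not FORMIND.

      import numpy as np

      rng = np.random.default_rng(6)
      simulate = lambda theta: theta + 0.5 * rng.standard_normal(20)  # toy stochastic model
      s_obs = 1.3                                      # observed summary statistic

      def log_synthetic_lik(theta, k=50):
          """Fit a normal to k simulated summaries, evaluate the observed one."""
          sims = np.array([simulate(theta).mean() for _ in range(k)])
          mu, sd = sims.mean(), sims.std(ddof=1)
          return -0.5 * ((s_obs - mu) / sd) ** 2 - np.log(sd)

      theta, ll, chain = 0.0, log_synthetic_lik(0.0), []
      for _ in range(2000):                            # random-walk Metropolis, flat prior
          prop = theta + 0.3 * rng.standard_normal()
          ll_prop = log_synthetic_lik(prop)
          if np.log(rng.random()) < ll_prop - ll:
              theta, ll = prop, ll_prop
          chain.append(theta)
      print("posterior mean ~", np.mean(chain[500:]).round(2))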

  18. Bayesian inference for an emerging arboreal epidemic in the presence of control

    OpenAIRE

    Parry, Matthew; Gibson, Gavin J.; Parnell, Stephen; Tim R. Gottwald; Irey, Michael S.; Gast, Timothy C.; Gilligan, Christopher A.

    2014-01-01

    Fast-moving and destructive emerging epidemics are seldom left to run their course because of the imperative to control further spread. Contemporaneous control measures, however, greatly complicate the characterization of the disease transmission process and the extraction of the epidemiological parameters of interest. The spread of Huanglongbing on orchard scales is used as a case study for modeling an emerging epidemic in the presence of control. We show that even with missing and censored ...

  19. Global sensitivity analysis and Bayesian parameter inference for solute transport in porous media colonized by biofilms

    Science.gov (United States)

    Younes, A.; Delay, F.; Fajraoui, N.; Fahs, M.; Mara, T. A.

    2016-08-01

    The concept of dual flowing continuum is a promising approach for modeling solute transport in porous media that includes biofilm phases. The highly dispersed transit time distributions often generated by these media are taken into consideration by simply stipulating that advection-dispersion transport occurs through both the porous and the biofilm phases. Both phases are coupled but assigned with contrasting hydrodynamic properties. However, the dual flowing continuum suffers from intrinsic equifinality in the sense that the outlet solute concentration can be the result of several parameter sets of the two flowing phases. To assess the applicability of the dual flowing continuum, we investigate how the model behaves with respect to its parameters. For the purpose of this study, a Global Sensitivity Analysis (GSA) and a Statistical Calibration (SC) of model parameters are performed for two transport scenarios that differ by the strength of interaction between the flowing phases. The GSA is shown to be a valuable tool to understand how the complex system behaves. The results indicate that the rate of mass transfer between the two phases is a key parameter of the model behavior and influences the identifiability of the other parameters. For weak mass exchanges, the output concentration is mainly controlled by the velocity in the porous medium and by the porosity of both flowing phases. In the case of large mass exchanges, the kinetics of this exchange also controls the output concentration. The SC results show that transport with large mass exchange between the flowing phases is more likely affected by equifinality than transport with weak exchange. The SC also indicates that weakly sensitive parameters, such as the dispersion in each phase, can be accurately identified. Removing them from calibration procedures is not recommended because it might result in biased estimations of the highly sensitive parameters.
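
    As an illustration of the variance-based global sensitivity analysis applied above, the following sketch computes Monte Carlo estimates of first-order Sobol indices with the Saltelli (2010) estimator; the three-parameter stand-in model is invented and has nothing to do with the actual dual-continuum transport code.

      import numpy as np

      rng = np.random.default_rng(7)

      def model(x):                                    # stand-in for outlet concentration
          v, phi, k = x.T                              # velocity, porosity, exchange rate
          phi = 0.2 + 0.8 * phi                        # keep porosity away from zero
          return v / phi + 0.2 * np.sin(4 * k)

      d, N = 3, 4096
      A, B = rng.random((N, d)), rng.random((N, d))    # two independent sample matrices
      yA, yB = model(A), model(B)
      var = np.var(np.concatenate([yA, yB]))
      for i, name in enumerate(["velocity", "porosity", "exchange rate"]):
          ABi = A.copy()
          ABi[:, i] = B[:, i]                          # swap column i only
          S1 = np.mean(yB * (model(ABi) - yA)) / var   # Saltelli (2010) estimator
          print(f"first-order Sobol index for {name}: {S1:.2f}")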

  20. Hybrid optimization and Bayesian inference techniques for a non-smooth radiation detection problem

    OpenAIRE

    Stefanescu, Razvan; Schmidt, Kathleen; Hite, Jason; Smith, Ralph; Mattingly, John

    2016-01-01

    In this investigation, we propose several algorithms to recover the location and intensity of a radiation source located in a simulated 250 m × 180 m block in an urban center, based on synthetic measurements. Radioactive decay and detection are Poisson random processes, so we employ likelihood functions based on this distribution. Due to the domain geometry and the proposed response model, the negative logarithm of the likelihood is only piecewise continuously differentiable, and it has multiple...
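
    A toy version of the likelihood setup (open 2-D geometry with no obstructions, and a hypothetical detector layout and source) shows why Poisson counting statistics lead to the likelihood functions mentioned above; the piecewise character of the real problem comes from the domain geometry and response model, which this sketch omits.

      import numpy as np

      rng = np.random.default_rng(8)
      detectors = np.array([[0, 0], [250, 0], [0, 180], [250, 180], [125, 90.0]])
      true_src, true_I, bg = np.array([60.0, 120.0]), 5e6, 2.0  # invented values

      def mean_counts(src, intensity):
          """Inverse-square response plus background (1 m^2 regularization)."""
          d2 = np.sum((detectors - src) ** 2, axis=1) + 1.0
          return intensity / (4 * np.pi * d2) + bg

      counts = rng.poisson(mean_counts(true_src, true_I))       # synthetic measurements

      def neg_log_lik(src, intensity):
          lam = mean_counts(src, intensity)
          return np.sum(lam - counts * np.log(lam))             # Poisson NLL up to a constant

      best = min(((x, y, I) for x in range(0, 251, 10)
                            for y in range(0, 181, 10)
                            for I in (1e6, 3e6, 5e6, 1e7)),
                 key=lambda p: neg_log_lik(np.array(p[:2], float), p[2]))
      print("true source:", true_src, " grid estimate:", best)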