WorldWideScience

Sample records for bayesian image analysis

  1. BaTMAn: Bayesian Technique for Multi-image Analysis

    CERN Document Server

    Casado, J; Ascasibar, Y; García-Benito, R; Guidi, G; Choudhury, O S; Bellocchi, E; Sánchez, S; Díaz, A I

    2016-01-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BaTMAn), a novel image segmentation technique based on Bayesian statistics, whose main purpose is to characterize an astronomical dataset containing spatial information and perform a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. signal compatible with being identical within the errors). We illustrate its operation and performance with a set of test cases that comprises both synthetic and real Integral-Field Spectroscopic (IFS) data. Our results show that the segmentations obtained by BaTMAn adapt to the underlying structure of the data, regardless of the precise details of their morphology and the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in those regions where the signal is actually con...

  2. BaTMAn: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2016-12-01

    Bayesian Technique for Multi-image Analysis (BaTMAn) characterizes any astronomical dataset containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). The output segmentations successfully adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. BaTMAn identifies (and keeps) all the statistically-significant information contained in the input multi-image (e.g. an IFS datacube). The main aim of the algorithm is to characterize spatially-resolved data prior to their analysis.

  3. BATMAN: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2016-12-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BATMAN), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical dataset containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real Integral-Field Spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of two. Our analysis reveals that the algorithm prioritizes conservation of all the statistically-significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, the philosophy of BATMAN is not to be used as a `black box' to improve the signal-to-noise ratio, but as a new approach to characterize spatially-resolved data prior to its analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.
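
    A minimal sketch of the merge rule just described, under assumptions of my own (a 1-D strip of measurements, a fixed k-sigma consistency test, inverse-variance averaging); the published code at http://astro.ft.uam.es/SELGIFS/BaTMAn implements the full Bayesian version for 2-D/3-D data:

    ```python
    # Sketch only: greedy 1-D version of "merge while consistent within errors".
    import numpy as np

    def consistent(s1, e1, s2, e2, k=3.0):
        """True if s1 +/- e1 and s2 +/- e2 agree within k combined sigmas."""
        return abs(s1 - s2) <= k * np.hypot(e1, e2)

    def merge(s1, e1, s2, e2):
        """Inverse-variance weighted combination of two consistent elements."""
        w1, w2 = 1.0 / e1**2, 1.0 / e2**2
        return (w1 * s1 + w2 * s2) / (w1 + w2), (w1 + w2) ** -0.5

    signal = [1.02, 0.98, 1.05, 3.10, 3.02]   # toy measurements
    error = [0.05] * 5                        # their 1-sigma errors
    segments = [(signal[0], error[0])]
    for s, e in zip(signal[1:], error[1:]):
        s0, e0 = segments[-1]
        if consistent(s0, e0, s, e):
            segments[-1] = merge(s0, e0, s, e)   # same information: merge
        else:
            segments.append((s, e))              # new segment starts
    print(segments)   # two segments: one near 1.0, one near 3.06
    ```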

  4. Nuclear stockpile stewardship and Bayesian image analysis (DARHT and the BIE)

    Energy Technology Data Exchange (ETDEWEB)

    Carroll, James L [Los Alamos National Laboratory

    2011-01-11

    Since the end of nuclear testing, the reliability of our nation's nuclear weapon stockpile has been assessed using sub-critical hydrodynamic testing. These tests involve some pretty 'extreme' radiography. We will be discussing the challenges posed by these tests and the solutions provided by DARHT (the world's premier hydrodynamic testing facility) and the BIE, or Bayesian Inference Engine (a powerful radiography analysis software tool). We will discuss the application of Bayesian image analysis techniques to this important and difficult problem.

  5. Bayesian data analysis

    CERN Document Server

    Gelman, Andrew; Carlin, John B; Stern, Hal S; Dunson, David B; Vehtari, Aki; Rubin, Donald B

    2013-01-01

    FUNDAMENTALS OF BAYESIAN INFERENCE: Probability and Inference; Single-Parameter Models; Introduction to Multiparameter Models; Asymptotics and Connections to Non-Bayesian Approaches; Hierarchical Models. FUNDAMENTALS OF BAYESIAN DATA ANALYSIS: Model Checking; Evaluating, Comparing, and Expanding Models; Modeling Accounting for Data Collection; Decision Analysis. ADVANCED COMPUTATION: Introduction to Bayesian Computation; Basics of Markov Chain Simulation; Computationally Efficient Markov Chain Simulation; Modal and Distributional Approximations. REGRESSION MODELS: Introduction to Regression Models; Hierarchical Linear...

  6. Nonparametric Bayesian Dictionary Learning for Analysis of Noisy and Incomplete Images

    Science.gov (United States)

    2010-04-01

    [The abstract of this record was garbled in extraction; the surviving fragments are tables of denoising and RGB-image interpolation PSNR results comparing K-SVD and BPFA on the standard test images (Cameraman, House, Peppers, Lena, Barbara, Boats, Fingerprint, Couple, Hill), and a citation of T. Ferguson, "A Bayesian analysis of some nonparametric problems", Annals of Statistics, 1:209-230.]

  7. Bayesian Mediation Analysis

    Science.gov (United States)

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…
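
    As a toy illustration of the kind of output a Bayesian mediation analysis delivers, one can push Monte Carlo draws through the indirect effect a*b; the normal posteriors and every number below are invented, not taken from the article:

    ```python
    # Toy sketch: posterior of the mediated (indirect) effect a*b by Monte Carlo.
    # The two normal "posteriors" stand in for the output of Bayesian regressions
    # m ~ x (path a) and y ~ m + x (path b); all numbers are illustrative.
    import numpy as np

    rng = np.random.default_rng(0)
    a = rng.normal(0.40, 0.08, 100_000)   # posterior draws, path x -> m
    b = rng.normal(0.55, 0.10, 100_000)   # posterior draws, path m -> y
    ab = a * b                            # posterior of the indirect effect

    lo, hi = np.percentile(ab, [2.5, 97.5])
    print(f"indirect effect: mean={ab.mean():.3f}, 95% CI=({lo:.3f}, {hi:.3f})")
    ```

    Because the product a*b has a skewed distribution, the resulting interval is asymmetric, which the simulation captures directly.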

  8. Bayesian Image Reconstruction Based on Voronoi Diagrams

    CERN Document Server

    Cabrera, G F; Hitschfeld, N

    2007-01-01

    We present a Bayesian Voronoi image reconstruction technique (VIR) for interferometric data. Bayesian analysis applied to the inverse problem allows us to derive the a-posteriori probability of a novel parameterization of interferometric images. We use a variable Voronoi diagram as our model in place of the usual fixed pixel grid. A quantization of the intensity field allows us to calculate the likelihood function and a-priori probabilities. The Voronoi image is optimized including the number of polygons as free parameters. We apply our algorithm to deconvolve simulated interferometric data. Residuals, restored images and chi^2 values are used to compare our reconstructions with fixed grid models. VIR has the advantage of modeling the image with few parameters, obtaining a better image from a Bayesian point of view.

  9. Bayesian uncertainty analysis for advanced seismic imaging - Application to the Mentelle Basin, Australia

    Science.gov (United States)

    Michelioudakis, Dimitrios G.; Hobbs, Richard W.; Caiado, Camila C. S.

    2016-04-01

    multivariate posterior distribution. The novelty of our approach and the major difference compared to the traditional semblance spectrum velocity analysis procedure is the calculation of uncertainty of the output model. As the model is able to estimate the credibility intervals of the corresponding interval velocities, we can produce the most probable PSDM images in an iterative manner. The depths extracted using our statistical algorithm are in very good agreement with the key horizons retrieved from the drilled core DSDP-258, showing that the Bayesian model is able to control the depth migration of the seismic data and estimate the uncertainty to the drilling targets.

  10. Bayesian data analysis for newcomers.

    Science.gov (United States)

    Kruschke, John K; Liddell, Torrin M

    2017-04-12

    This article explains the foundational concepts of Bayesian data analysis using virtually no mathematical notation. Bayesian ideas already match your intuitions from everyday reasoning and from traditional data analysis. Simple examples of Bayesian data analysis are presented that illustrate how the information delivered by a Bayesian analysis can be directly interpreted. Bayesian approaches to null-value assessment are discussed. The article clarifies misconceptions about Bayesian methods that newcomers might have acquired elsewhere. We discuss prior distributions and explain how they are not a liability but an important asset. We discuss the relation of Bayesian data analysis to Bayesian models of mind, and we briefly discuss what methodological problems Bayesian data analysis is not meant to solve. After you have read this article, you should have a clear sense of how Bayesian data analysis works and the sort of information it delivers, and why that information is so intuitive and useful for drawing conclusions from data.
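
    In the same spirit, about the smallest possible Bayesian analysis (my example, not the article's): a conjugate Beta-binomial update whose posterior can be read off directly:

    ```python
    # Conjugate updating of a proportion: Beta prior + binomial data -> Beta posterior.
    from scipy import stats

    heads, tails = 7, 3                 # observed data
    prior_a, prior_b = 1, 1             # uniform Beta(1, 1) prior
    post = stats.beta(prior_a + heads, prior_b + tails)

    print("posterior mean:", post.mean())                  # ~0.667
    print("95% credible interval:", post.interval(0.95))   # directly interpretable
    ```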

  11. From retrodiction to Bayesian quantum imaging

    Science.gov (United States)

    Speirits, Fiona C.; Sonnleitner, Matthias; Barnett, Stephen M.

    2017-04-01

    We employ quantum retrodiction to develop a robust Bayesian algorithm for reconstructing the intensity values of an image from sparse photocount data, while also accounting for detector noise in the form of dark counts. This method yields not only a reconstructed image but also provides the full probability distribution function for the intensity at each pixel. We use simulated as well as real data to illustrate both the applications of the algorithm and the analysis options that are only available when the full probability distribution functions are known. These include calculating Bayesian credible regions for each pixel intensity, allowing an objective assessment of the reliability of the reconstructed image intensity values.
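
    A single-pixel caricature of this kind of inference, under assumptions of my own (flat prior, known dark-count rate, brute-force grid posterior): Poisson photocounts yield a full posterior over the intensity, from which a credible region follows directly:

    ```python
    # Grid posterior for one pixel: counts k ~ Poisson(intensity + dark).
    import numpy as np
    from scipy import stats

    k_obs, dark = 4, 0.5                          # photocounts; mean dark-count rate
    grid = np.linspace(0.0, 20.0, 2001)           # candidate intensities
    p = stats.poisson.pmf(k_obs, grid + dark)     # flat prior x likelihood
    p /= p.sum()                                  # normalise to posterior masses

    cdf = np.cumsum(p)
    lo, hi = grid[np.searchsorted(cdf, 0.025)], grid[np.searchsorted(cdf, 0.975)]
    print(f"posterior mean {np.sum(grid * p):.2f}, 95% credible region ({lo:.2f}, {hi:.2f})")
    ```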

  12. Bayesian Exploratory Factor Analysis

    DEFF Research Database (Denmark)

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...

  13. Bayesian Classification of Image Structures

    DEFF Research Database (Denmark)

    Goswami, Dibyendu; Kalkan, Sinan; Krüger, Norbert

    2009-01-01

    In this paper, we describe work on Bayesian classifiers for distinguishing between homogeneous structures, textures, edges and junctions. We build semi-local classifiers from hand-labeled images to distinguish between these four different kinds of structures based on the concept of intrinsic dimensi...

  14. Bayesian image restoration, using configurations

    DEFF Research Database (Denmark)

    Thorarinsdottir, Thordis

    configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for salt and pepper noise. The inference in the model is discussed...

  15. Bayesian image restoration, using configurations

    DEFF Research Database (Denmark)

    Thorarinsdottir, Thordis Linda

    2006-01-01

    configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for the salt and pepper noise. The inference in the model is discussed...

  16. BAYESIAN IMAGE RESTORATION, USING CONFIGURATIONS

    Directory of Open Access Journals (Sweden)

    Thordis Linda Thorarinsdottir

    2011-05-01

    In this paper, we develop a Bayesian procedure for removing noise from images that can be viewed as noisy realisations of random sets in the plane. The procedure utilises recent advances in configuration theory for noise free random sets, where the probabilities of observing the different boundary configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for salt and pepper noise. The inference in the model is discussed in detail for 3 × 3 and 5 × 5 configurations and examples of the performance of the procedure are given.

  17. Bayesian Independent Component Analysis

    DEFF Research Database (Denmark)

    Winther, Ole; Petersen, Kaare Brandt

    2007-01-01

    In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...... in a Matlab toolbox, is demonstrated for non-negative decompositions and compared with non-negative matrix factorization....

  18. Bayesian nonparametric data analysis

    CERN Document Server

    Müller, Peter; Jara, Alejandro; Hanson, Tim

    2015-01-01

    This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.

  19. Bayesian Methods for Statistical Analysis

    OpenAIRE

    Puza, Borek

    2015-01-01

    Bayesian methods for statistical analysis is a book on statistical methods for analysing a wide variety of data. The book consists of 12 chapters, starting with basic concepts and covering numerous topics, including Bayesian estimation, decision theory, prediction, hypothesis testing, hierarchical models, Markov chain Monte Carlo methods, finite population inference, biased sampling and nonignorable nonresponse. The book contains many exercises, all with worked solutions, including complete c...

  1. Bayesian Fusion of Multi-Band Images

    CERN Document Server

    Wei, Qi; Tourneret, Jean-Yves

    2013-01-01

    In this paper, a Bayesian fusion technique for remotely sensed multi-band images is presented. The observed images are related to the high spectral and high spatial resolution image to be recovered through physical degradations, e.g., spatial and spectral blurring and/or subsampling defined by the sensor characteristics. The fusion problem is formulated within a Bayesian estimation framework. An appropriate prior distribution exploiting geometrical consideration is introduced. To compute the Bayesian estimator of the scene of interest from its posterior distribution, a Markov chain Monte Carlo algorithm is designed to generate samples asymptotically distributed according to the target distribution. To efficiently sample from this high-dimensional distribution, a Hamiltonian Monte Carlo step is introduced in the Gibbs sampling strategy. The efficiency of the proposed fusion method is evaluated with respect to several state-of-the-art fusion techniques. In particular, low spatial resolution hyperspectral and mult...

  2. Bayesian Analysis of Experimental Data

    Directory of Open Access Journals (Sweden)

    Lalmohan Bhar

    2013-10-01

    Analysis of experimental data from a Bayesian point of view is considered. Appropriate methodology is developed for application to designed experiments. A Normal-Gamma distribution is used as the prior distribution. The methodology is applied to real experimental data taken from long-term fertilizer experiments.

  3. Multiview Bayesian Correlated Component Analysis

    DEFF Research Database (Denmark)

    Kamronn, Simon Due; Poulsen, Andreas Trier; Hansen, Lars Kai

    2015-01-01

    Here we propose a hierarchical probabilistic model that can infer the level of universality in such multiview data, from completely unrelated representations, corresponding to canonical correlation analysis, to identical representations as in correlated component analysis. This new model, which we denote Bayesian correlated component analysis, evaluates favorably against three relevant algorithms in simulated data. A well-established benchmark EEG data set is used to further validate the new model and infer the variability of spatial representations across multiple subjects.

  4. Bayesian analysis of rare events

    Energy Technology Data Exchange (ETDEWEB)

    Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
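
    The rejection-sampling reading of Bayesian updating that BUS builds on fits in a few lines (all numbers invented); plain Monte Carlo is used below, whereas BUS pairs the same construction with FORM, IS or SuS precisely because brute force fails once the event is genuinely rare:

    ```python
    # Prior capacity C ~ N(5, 1); one noisy observation 4.2 (sigma 0.3);
    # rare event: C < 2. Accept a prior sample with probability L(theta)/c.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    theta = rng.normal(5.0, 1.0, 1_000_000)           # prior samples
    like = stats.norm.pdf(4.2, loc=theta, scale=0.3)  # likelihood of the data
    c = like.max()                                    # any c >= max(likelihood)
    posterior = theta[rng.uniform(0.0, c, theta.size) < like]

    print("prior     P(C < 2) ~", (theta < 2.0).mean())
    print("posterior P(C < 2) ~", (posterior < 2.0).mean())
    ```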

  5. Sparse Bayesian learning in ISAR tomography imaging

    Institute of Scientific and Technical Information of China (English)

    SU Wu-ge; WANG Hong-qiang; DENG Bin; WANG Rui-jun; QIN Yu-liang

    2015-01-01

    Inverse synthetic aperture radar (ISAR) imaging can be regarded as a narrow-band version of computer aided tomography (CT). The traditional CT imaging algorithms for ISAR, including the polar format algorithm (PFA) and the convolution back projection algorithm (CBP), usually suffer from the problem of high sidelobes and low resolution. This paper is concerned with ISAR tomography image reconstruction within a sparse Bayesian framework. Firstly, the sparse ISAR tomography imaging model is established in light of the CT imaging theory. Then, by using the compressed sensing (CS) principle, a high resolution ISAR image can be achieved with a limited number of pulses. Since the performance of existing CS-based ISAR imaging algorithms is sensitive to the user parameter, the existing algorithms are inconvenient to use in practice. It is well known that the Bayesian recovery formalism named sparse Bayesian learning (SBL) acts as an effective tool in regression and classification, which uses an efficient expectation maximization procedure to estimate the necessary parameters, and retains a preferable property of the l0-norm diversity measure. Motivated by this, a fully automated ISAR tomography imaging algorithm based on SBL is proposed. Experimental results based on simulated and electromagnetic (EM) data illustrate the effectiveness and the superiority of the proposed algorithm over the existing algorithms.
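
    A generic, textbook-style sparse Bayesian learning loop for a linear model y = Phi w + noise, with Tipping-style fixed-point updates of the hyperparameters; this is a sketch of the SBL idea, not the authors' ISAR imaging code:

    ```python
    # SBL: per-coefficient precisions alpha_i prune irrelevant atoms automatically.
    import numpy as np

    def sbl(Phi, y, n_iter=200):
        n, m = Phi.shape
        alpha, beta = np.ones(m), 1.0            # coefficient / noise precisions
        for _ in range(n_iter):
            Sigma = np.linalg.inv(np.diag(alpha) + beta * Phi.T @ Phi)
            mu = beta * Sigma @ Phi.T @ y        # posterior mean of coefficients
            gamma = 1.0 - alpha * np.diag(Sigma) # how well-determined each w_i is
            alpha = gamma / (mu**2 + 1e-12)      # fixed-point hyperparameter update
            beta = (n - gamma.sum()) / (np.linalg.norm(y - Phi @ mu) ** 2 + 1e-12)
        return mu

    rng = np.random.default_rng(2)
    Phi = rng.normal(size=(40, 100))             # 40 measurements, 100 atoms
    w = np.zeros(100); w[[5, 37, 80]] = [1.0, -2.0, 1.5]
    y = Phi @ w + 0.01 * rng.normal(size=40)
    mu = sbl(Phi, y)
    print(np.sort(np.argsort(np.abs(mu))[-3:]).tolist())   # expect [5, 37, 80]
    ```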

  6. Bayesian learning of sparse multiscale image representations.

    Science.gov (United States)

    Hughes, James Michael; Rockmore, Daniel N; Wang, Yang

    2013-12-01

    Multiscale representations of images have become a standard tool in image analysis. Such representations offer a number of advantages over fixed-scale methods, including the potential for improved performance in denoising and compression, and the ability to represent distinct but complementary information that exists at various scales. A variety of multiresolution transforms exist, including both orthogonal decompositions such as wavelets as well as nonorthogonal, overcomplete representations. Recently, techniques for finding adaptive, sparse representations have yielded state-of-the-art results when applied to traditional image processing problems. Attempts at developing multiscale versions of these so-called dictionary learning models have yielded modest but encouraging results. However, none of these techniques has sought to combine a rigorous statistical formulation of the multiscale dictionary learning problem and the ability to share atoms across scales. We present a model for multiscale dictionary learning that overcomes some of the drawbacks of previous approaches by first decomposing an input into a pyramid of distinct frequency bands using a recursive filtering scheme, after which we perform dictionary learning and sparse coding on the individual levels of the resulting pyramid. The associated image model allows us to use a single set of adapted dictionary atoms that is shared--and learned--across all scales in the model. The underlying statistical model of our proposed method is fully Bayesian and allows for efficient inference of parameters, including the level of additive noise for denoising applications. We apply the proposed model to several common image processing problems including non-Gaussian and nonstationary denoising of real-world color images.

  7. ANALYSIS OF BAYESIAN CLASSIFIER ACCURACY

    Directory of Open Access Journals (Sweden)

    Felipe Schneider Costa

    2013-01-01

    The naïve Bayes classifier is considered one of the most effective classification algorithms today, competing with more modern and sophisticated classifiers. Despite being based on the unrealistic (naïve) assumption that all variables are independent given the output class, the classifier provides proper results. However, depending on the scenario (network structure, number of samples or training cases, number of variables), the network may not provide appropriate results. This study uses a variable-selection process based on the chi-squared test to verify the existence of dependence between variables in the data model, in order to identify the reasons which prevent a Bayesian network from providing good performance. A detailed analysis of the data is also proposed, unlike other existing work, as well as adjustments in the case of limit values between two adjacent classes. Furthermore, variable weights, calculated with the mutual information function, are used in the computation of the posterior probabilities. Tests were applied to both a naïve Bayesian network and a hierarchical Bayesian network. After testing, a significant reduction in error rate was observed: the naïve Bayesian network's error rate dropped from twenty-five percent to five percent relative to the initial classification results, while in the hierarchical network the error rate not only dropped by fifteen percent but ultimately reached zero.
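
    A minimal stand-in for the pipeline described (chi-squared screening of variables, then a naïve Bayes classifier), using scikit-learn and a public dataset in place of the study's data and weighting scheme:

    ```python
    # Chi-squared variable selection followed by naive Bayes classification.
    from sklearn.datasets import load_breast_cancer
    from sklearn.feature_selection import SelectKBest, chi2
    from sklearn.model_selection import train_test_split
    from sklearn.naive_bayes import GaussianNB

    X, y = load_breast_cancer(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    sel = SelectKBest(chi2, k=10).fit(X_tr, y_tr)   # chi2 needs non-negative features
    clf = GaussianNB().fit(sel.transform(X_tr), y_tr)
    print("held-out accuracy:", clf.score(sel.transform(X_te), y_te))
    ```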

  8. Bayesian Model Averaging for Propensity Score Analysis

    Science.gov (United States)

    Kaplan, David; Chen, Jianshen

    2013-01-01

    The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…

  9. Bayesian image reconstruction: Application to emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Nunez, J.; Llacer, J.

    1989-02-01

    In this paper we propose a Maximum a Posteriori (MAP) method of image reconstruction in the Bayesian framework for the Poisson noise case. We use entropy to define the prior probability and likelihood to define the conditional probability. The method uses sharpness parameters which can be theoretically computed or adjusted, allowing us to obtain MAP reconstructions without the problem of the "grey" reconstructions associated with pre-Bayesian reconstructions. We have developed several ways to solve the reconstruction problem and propose a new iterative algorithm which is stable, maintains positivity and converges to feasible images faster than the Maximum Likelihood Estimate method. We have successfully applied the new method to the case of Emission Tomography, both with simulated and real data. 41 refs., 4 figs., 1 tab.
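
    For orientation, the Maximum Likelihood Estimate baseline the authors compare against is the classic ML-EM multiplicative update, sketched below on a toy 1-D system of my own construction; the update preserves positivity automatically, and the paper's entropy prior and sharpness parameters are not reproduced here:

    ```python
    # Generic ML-EM for Poisson data: x <- x * A^T(y / Ax) / A^T 1.
    import numpy as np

    def mlem(A, counts, n_iter=50):
        x = np.ones(A.shape[1])                   # positive starting image
        sens = A.sum(axis=0)                      # sensitivity image, A^T 1
        for _ in range(n_iter):
            ratio = counts / np.maximum(A @ x, 1e-12)
            x *= (A.T @ ratio) / np.maximum(sens, 1e-12)  # stays non-negative
        return x

    rng = np.random.default_rng(3)
    n = 64
    idx = np.arange(n)
    A = np.clip(1.0 - np.abs(idx[:, None] - idx[None, :]) / 4.0, 0.0, None)  # toy blur
    x_true = np.zeros(n); x_true[20:28] = 50.0
    y = rng.poisson(A @ x_true)                   # Poisson-noisy "projections"
    print(mlem(A, y)[18:30].round(1))             # peaks over the true support
    ```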

  10. Bayesian depth estimation from monocular natural images.

    Science.gov (United States)

    Su, Che-Chun; Cormack, Lawrence K; Bovik, Alan C

    2017-05-01

    Estimating an accurate and naturalistic dense depth map from a single monocular photographic image is a difficult problem. Nevertheless, human observers have little difficulty understanding the depth structure implied by photographs. Two-dimensional (2D) images of the real-world environment contain significant statistical information regarding the three-dimensional (3D) structure of the world that the vision system likely exploits to compute perceived depth, monocularly as well as binocularly. Toward understanding how this might be accomplished, we propose a Bayesian model of monocular depth computation that recovers detailed 3D scene structures by extracting reliable, robust, depth-sensitive statistical features from single natural images. These features are derived using well-accepted univariate natural scene statistics (NSS) models and recent bivariate/correlation NSS models that describe the relationships between 2D photographic images and their associated depth maps. This is accomplished by building a dictionary of canonical local depth patterns from which NSS features are extracted as prior information. The dictionary is used to create a multivariate Gaussian mixture (MGM) likelihood model that associates local image features with depth patterns. A simple Bayesian predictor is then used to form spatial depth estimates. The depth results produced by the model, despite its simplicity, correlate well with ground-truth depths measured by a current-generation terrestrial light detection and ranging (LIDAR) scanner. Such a strong form of statistical depth information could be used by the visual system when creating overall estimated depth maps incorporating stereopsis, accommodation, and other conditions. Indeed, even in isolation, the Bayesian predictor delivers depth estimates that are competitive with state-of-the-art "computer vision" methods that utilize highly engineered image features and sophisticated machine learning algorithms.

  11. STATISTICAL BAYESIAN ANALYSIS OF EXPERIMENTAL DATA.

    Directory of Open Access Journals (Sweden)

    AHLAM LABDAOUI

    2012-12-01

    The Bayesian researcher should know the basic ideas underlying Bayesian methodology and the computational tools used in modern Bayesian econometrics. Some of the most important methods of posterior simulation are Monte Carlo integration, importance sampling, Gibbs sampling and the Metropolis-Hastings algorithm. The Bayesian should also be able to put the theory and computational tools together in the context of substantive empirical problems. We focus primarily on recent developments in Bayesian computation, and then on particular models, inevitably combining theory and computation in the context of those models. Although we have tried to be reasonably complete in covering the basic ideas of Bayesian theory and the computational tools most commonly used by the Bayesian, there is no way we can cover all the classes of models used in econometrics. We illustrate the approach with analysis of variance and the linear regression model.

  12. Bayesian modeling in conjoint analysis

    Directory of Open Access Journals (Sweden)

    Janković-Milić Vesna

    2010-01-01

    Statistical analysis in marketing is largely influenced by the availability of various types of data. There has been a sudden increase in the number and types of information available to market researchers in the last decade. In such conditions, traditional statistical methods have limited ability to solve problems related to the expression of market uncertainty. The aim of this paper is to highlight the advantages of Bayesian inference as an alternative approach to classical inference. Multivariate statistical methods offer extremely powerful tools to achieve many goals of marketing research. One of these methods is conjoint analysis, which provides a quantitative measure of the relative importance of a product or service attribute in relation to the other attributes. The application of this method involves interviewing consumers, who express their preferences, with statistical analysis providing numerical indicators of each attribute's utility. One of the main objections to the discrete-choice approach in conjoint analysis is that it estimates utility only at the aggregate level, expressing the average utility across all respondents in the survey. Application of hierarchical Bayesian models enables the capture of individual utility ratings for each attribute level.

  13. Remote sensing image fusion based on Bayesian linear estimation

    Institute of Scientific and Technical Information of China (English)

    GE ZhiRong; WANG Bin; ZHANG LiMing

    2007-01-01

    A new remote sensing image fusion method based on statistical parameter estimation is proposed in this paper. More specifically, Bayesian linear estimation (BLE) is applied to observation models between remote sensing images with different spatial and spectral resolutions. The proposed method only estimates the mean vector and covariance matrix of the high-resolution multispectral (MS) images, instead of assuming the joint distribution between the panchromatic (PAN) image and low-resolution multispectral image. Furthermore, the proposed method can enhance the spatial resolution of several principal components of MS images, while the traditional Principal Component Analysis (PCA) method is limited to enhancing only the first principal component. Experimental results with real MS images and PAN image of Landsat ETM+ demonstrate that the proposed method performs better than traditional methods based on statistical parameter estimation, the PCA-based method and the wavelet-based method.

  14. Computational Advances for and from Bayesian Analysis

    OpenAIRE

    Andrieu, C.; Doucet, A.; Robert, C. P.

    2004-01-01

    The emergence in recent years of Bayesian analysis in many methodological and applied fields as the solution to the modeling of complex problems cannot be dissociated from major changes in its computational implementation. We show in this review how the advances in Bayesian analysis and statistical computation are intermingled.

  15. Bayesian technique for image classifying registration.

    Science.gov (United States)

    Hachama, Mohamed; Desolneux, Agnès; Richard, Frédéric J P

    2012-09-01

    In this paper, we address a complex image registration issue that arises when the dependencies between intensities of the images to be registered are not spatially homogeneous. Such a situation is frequently encountered in medical imaging when a pathology present in one of the images modifies locally intensity dependencies observed on normal tissues. Usual image registration models, which are based on a single global intensity similarity criterion, fail to register such images, as they are blind to local deviations of intensity dependencies. Such a limitation is also encountered in contrast-enhanced images where there exist multiple pixel classes having different properties of contrast agent absorption. In this paper, we propose a new model in which the similarity criterion is adapted locally to images by classification of image intensity dependencies. Defined in a Bayesian framework, the similarity criterion is a mixture of probability distributions describing dependencies on two classes. The model also includes a class map which locates pixels of the two classes and weighs the two mixture components. The registration problem is formulated both as an energy minimization problem and as a maximum a posteriori estimation problem. It is solved using a gradient descent algorithm. In the problem formulation and resolution, the image deformation and the class map are estimated simultaneously, leading to an original combination of registration and classification that we call image classifying registration. Whenever sufficient information about class location is available in applications, the registration can also be performed on its own by fixing a given class map. Finally, we illustrate the interest of our model on two real applications from medical imaging: template-based segmentation of contrast-enhanced images and lesion detection in mammograms. We also conduct an evaluation of our model on simulated medical data and show its ability to take into account spatial variations

  16. Bayesian analysis of exoplanet and binary orbits

    OpenAIRE

    Schulze-Hartung, Tim; Launhardt, Ralf; Henning, Thomas

    2012-01-01

    We introduce BASE (Bayesian astrometric and spectroscopic exoplanet detection and characterisation tool), a novel program for the combined or separate Bayesian analysis of astrometric and radial-velocity measurements of potential exoplanet hosts and binary stars. The capabilities of BASE are demonstrated using all publicly available data of the binary Mizar A.

  17. Bayesian Statistics for Biological Data: Pedigree Analysis

    Science.gov (United States)

    Stanfield, William D.; Carlton, Matthew A.

    2004-01-01

    Bayes' formula is applied to the biological problem of pedigree analysis to show that Bayesian and non-Bayesian ("classical") methods of probability calculation give different answers. First-year college students of biology can thus be introduced to Bayesian statistics.
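
    A standard worked pedigree example of the sort the article has in mind (the specific numbers are the textbook X-linked recessive case, chosen by me): a mother with prior carrier probability 1/2 then has two unaffected sons:

    ```python
    # Bayes' formula on a pedigree: posterior carrier probability after
    # observing two unaffected sons (each son unaffected w.p. 1/2 if carrier).
    prior = {"carrier": 0.5, "not_carrier": 0.5}
    likelihood = {"carrier": 0.5**2, "not_carrier": 1.0}

    evidence = sum(prior[h] * likelihood[h] for h in prior)
    posterior = {h: prior[h] * likelihood[h] / evidence for h in prior}
    print(posterior)   # carrier: 0.2 -- not the "classical" unchanged 0.5
    ```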

  18. Image Classifying Registration for Gaussian & Bayesian Techniques: A Review

    Directory of Open Access Journals (Sweden)

    Rahul Godghate

    2014-04-01

    A Bayesian technique for image classifying registration performs image registration and pixel classification simultaneously. Medical image registration is critical for the fusion of complementary information about patient anatomy and physiology, for the longitudinal study of a human organ over time and the monitoring of disease development or treatment effect, for the statistical analysis of population variation in comparison to a so-called digital atlas, for image-guided therapy, etc. A Bayesian technique for image classifying registration is well-suited to deal with image pairs that contain two classes of pixels with different inter-image intensity relationships. We will show through different experiments that the model can be applied in many different ways. For instance, if the class map is known, it can be used for template-based segmentation; if the full model is used, it can be applied to lesion detection by image comparison. Experiments have been conducted on both real and simulated data. They show that in the presence of an extra class, the classifying registration improves both the registration and the detection, especially when the deformations are small. The proposed model is defined using only two classes, but it is straightforward to extend it to an arbitrary number of classes.

  19. Bayesian analysis of CCDM Models

    OpenAIRE

    Jesus, J. F.; Valentim, R.; Andrade-Oliveira, F.

    2016-01-01

    Creation of Cold Dark Matter (CCDM), in the context of the Einstein field equations, leads to negative creation pressure, which can be used to explain the accelerated expansion of the Universe. In this work we tested six different spatially flat models for matter creation using statistical tools, in the light of SN Ia data: the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC) and the Bayesian Evidence (BE). These approaches allow us to compare models considering goodness of fit and numbe...
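
    For reference, two of the three criteria named above reduce to one-line formulas in terms of the maximized log-likelihood (the numbers below are invented for illustration; Bayesian Evidence, by contrast, requires integrating the likelihood over the prior and has no such closed form in general):

    ```python
    # AIC and BIC for comparing fitted models on the same data set.
    import math

    def aic(lnL_max, k):           # k = number of free parameters
        return 2 * k - 2 * lnL_max

    def bic(lnL_max, k, n):        # n = number of data points
        return k * math.log(n) - 2 * lnL_max

    n = 580                        # e.g. the size of an SN Ia compilation
    for name, lnL, k in [("model A", -512.3, 2), ("model B", -511.1, 4)]:
        print(name, "AIC =", round(aic(lnL, k), 1), "BIC =", round(bic(lnL, k, n), 1))
    ```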

  20. Bayesian learning for cardiac SPECT image interpretation.

    Science.gov (United States)

    Sacha, Jarosław P; Goodenday, Lucy S; Cios, Krzysztof J

    2002-01-01

    In this paper, we describe a system for automating the diagnosis of myocardial perfusion from single-photon emission computerized tomography (SPECT) images of male and female hearts. Initially we had several thousand SPECT images, other clinical data, and the physician-interpreter's descriptions of the images. The images were divided into segments based on the Yale system. Each segment was described by the physician as showing one of the following conditions: normal perfusion, reversible perfusion defect, partially reversible perfusion defect, fixed perfusion defect, defect showing reverse redistribution, equivocal defect or artifact. The physician's diagnosis of overall left ventricular (LV) perfusion, based on the above descriptions, categorizes a study as showing one or more of eight possible conditions: normal, ischemia, infarct and ischemia, infarct, reverse redistribution, equivocal, artifact or LV dysfunction. Because of the complexity of the task, we decided to use the knowledge discovery approach, consisting of these steps: problem understanding, data understanding, data preparation, data mining, evaluating the discovered knowledge and its implementation. After going through the data preparation step, in which we constructed normal gender-specific models of the LV and image registration, we ended up with 728 patients for whom we had both SPECT images and corresponding diagnoses. Another major contribution of the paper is the data mining step, in which we used several new Bayesian learning classification methods. The approach we have taken, namely the six-step knowledge discovery process, has proven to be very successful in this complex data mining task and as such the process can be extended to other medical data mining projects.

  1. Bayesian Image Classification At High Latitudes

    Science.gov (United States)

    Bulgin, Claire E.; Eastwood, Steinar; Merchant, Chris J.

    2013-12-01

    The European Space Agency created the Climate Change Initiative (CCI) to maximize the usefulness of Earth Observations to climate science. Sea Surface Temperature (SST) is an essential climate variable to which satellite observations make a crucial contribution, and is one of the projects within the CCI program. SST retrieval is dependent on successful cloud clearing and identification of clear-sky pixels over ocean. At high latitudes image classification is more difficult due to the presence of sea-ice. Newly formed ice has a temperature close to the freezing point of water and a dark surface making it difficult to distinguish from open ocean using data at visible and infrared wavelengths. Similarly, melt ponds on the sea-ice surface make image classification more difficult. We present here a three-way Bayesian classifier for the AATSR instrument classifying pixels as 'clear-sky over ocean', 'clear-sky over ice' or 'cloud' using the 0.6, 1.6, 11 and 12 micron channels. We demonstrate the ability of the classifier to successfully identify sea-ice and consider the potential for generating an ice surface temperature record from AATSR which could be extended using data from SLSTR.
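
    Schematically, the three-way rule combines per-class likelihoods of the observed channels with prior probabilities; the sketch below reduces this to one channel with Gaussian class-conditionals and invented numbers, whereas the operational classifier uses the four AATSR channels and empirically derived distributions:

    ```python
    # Three-way Bayesian classification of a single pixel from one channel.
    import numpy as np
    from scipy import stats

    classes = ["clear-sky ocean", "clear-sky ice", "cloud"]
    prior = np.array([0.5, 0.2, 0.3])
    mean = np.array([285.0, 268.0, 255.0])   # 11-micron brightness temp (K), per class
    std = np.array([3.0, 4.0, 10.0])

    obs = 266.0                              # one pixel's brightness temperature
    like = stats.norm.pdf(obs, loc=mean, scale=std)
    post = prior * like / np.sum(prior * like)
    for c, p in zip(classes, post):
        print(f"P({c} | obs) = {p:.3f}")
    ```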

  2. PROFIT: Bayesian profile fitting of galaxy images

    Science.gov (United States)

    Robotham, A. S. G.; Taranu, D. S.; Tobar, R.; Moffett, A.; Driver, S. P.

    2017-04-01

    We present PROFIT, a new code for Bayesian two-dimensional photometric galaxy profile modelling. PROFIT consists of a low-level C++ library (libprofit), accessible via a command-line interface and documented API, along with high-level R (PROFIT) and PYTHON (PyProFit) interfaces (available at github.com/ICRAR/libprofit, github.com/ICRAR/ProFit, and github.com/ICRAR/pyprofit, respectively). R PROFIT is also available pre-built from CRAN; however, this version will be slightly behind the latest GitHub version. libprofit offers fast and accurate two-dimensional integration for a useful number of profiles, including Sérsic, Core-Sérsic, broken-exponential, Ferrer, Moffat, empirical King, point-source, and sky, with a simple mechanism for adding new profiles. We show detailed comparisons between libprofit and GALFIT. libprofit is both faster and more accurate than GALFIT at integrating the ubiquitous Sérsic profile for the most common values of the Sérsic index n (0.5 < n < 8). The high-level fitting code PROFIT is tested on a sample of galaxies with both SDSS and deeper KiDS imaging. We find good agreement in the fit parameters, with larger scatter in best-fitting parameters from fitting images from different sources (SDSS versus KiDS) than from using different codes (PROFIT versus GALFIT). A large suite of Monte Carlo-simulated images are used to assess prospects for automated bulge-disc decomposition with PROFIT on SDSS, KiDS, and future LSST imaging. We find that the biggest increases in fit quality come from moving from SDSS- to KiDS-quality data, with less significant gains moving from KiDS to LSST.

  3. Bayesian analysis of binary sequences

    Science.gov (United States)

    Torney, David C.

    2005-03-01

    This manuscript details Bayesian methodology for "learning by example", with binary n-sequences encoding the objects under consideration. Priors prove influential; conformable priors are described. Laplace approximation of Bayes integrals yields posterior likelihoods for all n-sequences. This involves the optimization of a definite function over a convex domain, efficiently effectuated by the sequential application of quadratic programming.

  4. Bayesian analysis for the social sciences

    CERN Document Server

    Jackman, Simon

    2009-01-01

    Bayesian methods are increasingly being used in the social sciences, as the problems encountered lend themselves so naturally to the subjective qualities of Bayesian methodology. This book provides an accessible introduction to Bayesian methods, tailored specifically for social science students. It contains lots of real examples from political science, psychology, sociology, and economics, exercises in all chapters, and detailed descriptions of all the key concepts, without assuming any background in statistics beyond a first course. It features examples of how to implement the methods using WinBUGS - the most-widely used Bayesian analysis software in the world - and R - an open-source statistical software. The book is supported by a Website featuring WinBUGS and R code, and data sets.

  5. Domino effect analysis using Bayesian networks.

    Science.gov (United States)

    Khakzad, Nima; Khan, Faisal; Amyotte, Paul; Cozzani, Valerio

    2013-02-01

    A new methodology based on Bayesian networks is introduced both to model domino effect propagation patterns and to estimate the domino effect probability at different levels. The flexible structure and the unique modeling techniques offered by Bayesian networks make it possible to analyze domino effects through a probabilistic framework, considering synergistic effects, noisy probabilities, and common cause failures. Further, the uncertainties and the complex interactions among the domino effect components are captured using the Bayesian network. The probabilities of events are updated in the light of new information, and the most probable path of the domino effect is determined on the basis of the new data gathered. This study shows how probability updating helps to update the domino effect model either qualitatively or quantitatively. The methodology is applied to a hypothetical example and also to an earlier-studied case study. These examples accentuate the effectiveness of Bayesian networks in modeling domino effects in processing facilities. © 2012 Society for Risk Analysis.

  6. Bayesian analysis for kaon photoproduction

    Energy Technology Data Exchange (ETDEWEB)

    Marsainy, T., E-mail: tmart@fisika.ui.ac.id; Mart, T., E-mail: tmart@fisika.ui.ac.id [Department Fisika, FMIPA, Universitas Indonesia, Depok 16424 (Indonesia)

    2014-09-25

    We have investigated the contribution of the nucleon resonances to the kaon photoproduction process by using an established statistical decision-making method, i.e., the Bayesian method. This method not only evaluates the model over its entire parameter space, but also takes prior information and experimental data into account. The result indicates that certain resonances have larger probabilities of contributing to the process.

  7. Bayesian Parallel Imaging With Edge-Preserving Priors

    Science.gov (United States)

    Raj, Ashish; Singh, Gurmeet; Zabih, Ramin; Kressler, Bryan; Wang, Yi; Schuff, Norbert; Weiner, Michael

    2007-01-01

    Existing parallel MRI methods are limited by a fundamental trade-off in that suppressing noise introduces aliasing artifacts. Bayesian methods with an appropriately chosen image prior offer a promising alternative; however, previous methods with spatial priors assume that intensities vary smoothly over the entire image, resulting in blurred edges. Here we introduce an edge-preserving prior (EPP) that instead assumes that intensities are piecewise smooth, and propose a new approach to efficiently compute its Bayesian estimate. The estimation task is formulated as an optimization problem that requires a non-convex objective function to be minimized in a space with thousands of dimensions. As a result, traditional continuous minimization methods cannot be applied. This optimization task is closely related to some problems in the field of computer vision for which discrete optimization methods have been developed in the last few years. We adapt these algorithms, which are based on graph cuts, to address our optimization problem. The results of several parallel imaging experiments on brain and torso regions performed under challenging conditions with high acceleration factors are shown and compared with the results of conventional sensitivity encoding (SENSE) methods. An empirical analysis indicates that the proposed method visually improves overall quality compared to conventional methods. PMID:17195165

  8. Bayesian Meta-Analysis of Coefficient Alpha

    Science.gov (United States)

    Brannick, Michael T.; Zhang, Nanhua

    2013-01-01

    The current paper describes and illustrates a Bayesian approach to the meta-analysis of coefficient alpha. Alpha is the most commonly used estimate of the reliability or consistency (freedom from measurement error) for educational and psychological measures. The conventional approach to meta-analysis uses inverse variance weights to combine…
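
    The conventional fixed-effect pooling referred to above takes two lines; the study-level alphas and variances below are invented for illustration:

    ```python
    # Inverse-variance weighting of study-level coefficient-alpha estimates.
    import numpy as np

    alpha_hat = np.array([0.82, 0.88, 0.79, 0.91])    # per-study alpha
    var = np.array([0.0010, 0.0008, 0.0020, 0.0005])  # their sampling variances

    w = 1.0 / var
    pooled = np.sum(w * alpha_hat) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    print(f"pooled alpha = {pooled:.3f} +/- {1.96 * se:.3f}")
    ```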

  9. An Overview of Bayesian Methods for Neural Spike Train Analysis

    Directory of Open Access Journals (Sweden)

    Zhe Chen

    2013-01-01

    Neural spike train analysis is an important task in computational neuroscience which aims to understand neural mechanisms and gain insights into neural circuits. With the advancement of multielectrode recording and imaging technologies, it has become increasingly demanding to develop statistical tools for analyzing large neuronal ensemble spike activity. Here we present a tutorial overview of Bayesian methods and their representative applications in neural spike train analysis, at both single neuron and population levels. On the theoretical side, we focus on various approximate Bayesian inference techniques as applied to latent state and parameter estimation. On the application side, the topics include spike sorting, tuning curve estimation, neural encoding and decoding, deconvolution of spike trains from calcium imaging signals, and inference of neuronal functional connectivity and synchrony. Some research challenges and opportunities for neural spike train analysis are discussed.

  10. On Bayesian System Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen Ringi, M.

    1995-05-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so-called frequentist school. A new model for system reliability prediction is given in two papers. The model incorporates the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non-identical environments. 85 refs.

  11. Hierarchical Bayesian sparse image reconstruction with application to MRFM

    CERN Document Server

    Dobigeon, Nicolas; Tourneret, Jean-Yves

    2008-01-01

    This paper presents a hierarchical Bayesian model to reconstruct sparse images when the observations are obtained from linear transformations and corrupted by an additive white Gaussian noise. Our hierarchical Bayes model is well suited to such naturally sparse image applications as it seamlessly accounts for properties such as sparsity and positivity of the image via appropriate Bayes priors. We propose a prior that is based on a weighted mixture of a positive exponential distribution and a mass at zero. The prior has hyperparameters that are tuned automatically by marginalization over the hierarchical Bayesian model. To overcome the complexity of the posterior distribution, a Gibbs sampling strategy is proposed. The Gibbs samples can be used to estimate the image to be recovered, e.g. by maximizing the estimated posterior distribution. In our fully Bayesian approach the posteriors of all the parameters are available. Thus our algorithm provides more information than other previously proposed sparse reconstr...

  12. Bayesian regularization of diffusion tensor images

    DEFF Research Database (Denmark)

    Frandsen, Jesper; Hobolth, Asger; Østergaard, Leif

    2007-01-01

    several directions. The measured diffusion coefficients and thereby the diffusion tensors are subject to noise, leading to possibly flawed representations of the three dimensional fibre bundles. In this paper we develop a Bayesian procedure for regularizing the diffusion tensor field, fully utilizing...

  13. Bayesian analysis of cosmic structures

    CERN Document Server

    Kitaura, Francisco-Shu

    2011-01-01

    We review the Bayesian inference steps required to analyse the cosmological large-scale structure. Here we place special emphasis on the complications which arise due to the non-Gaussian character of the galaxy and matter distribution. In particular we investigate the advantages and limitations of the Poisson-lognormal model and discuss how to extend this work. With the lognormal prior using the Hamiltonian sampling technique and on scales of about 4 h^{-1} Mpc we find that the over-dense regions are excellently reconstructed; however, under-dense regions (void statistics) are quantitatively poorly recovered. Contrary to the maximum a posteriori (MAP) solution, which was shown to over-estimate the density in the under-dense regions, we obtain lower densities than in N-body simulations. This is due to the fact that the MAP solution is conservative whereas the full posterior yields samples which are consistent with the prior statistics. The lognormal prior is not able to capture the full non-linear regime at scales ...

  14. A Bayesian Approach for Image Segmentation with Shape Priors

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Hang; Yang, Qing; Parvin, Bahram

    2008-06-20

    Color and texture have been widely used in image segmentation; however, their performance is often hindered by scene ambiguities, overlapping objects, or missing parts. In this paper, we propose an interactive image segmentation approach with shape prior models within a Bayesian framework. Interactive features, through mouse strokes, reduce ambiguities, and the incorporation of shape priors enhances quality of the segmentation where color and/or texture are not solely adequate. The novelties of our approach are in (i) formulating the segmentation problem in a well-defined Bayesian framework with multiple shape priors, (ii) efficiently estimating parameters of the Bayesian model, and (iii) multi-object segmentation through user-specified priors. We demonstrate the effectiveness of our method on a set of natural and synthetic images.

  16. Bayesian data analysis tools for atomic physics

    CERN Document Server

    Trassinelli, Martino

    2016-01-01

    We present an introduction to some concepts of Bayesian data analysis in the context of atomic physics. Starting from basic rules of probability, we present Bayes' theorem and its applications. In particular we discuss how to calculate simple and joint probability distributions and the Bayesian evidence, a model-dependent quantity that allows probabilities to be assigned to different hypotheses from the analysis of the same data set. To give some practical examples, these methods are applied to two concrete cases. In the first example, the presence or absence of a satellite line in an atomic spectrum is investigated. In the second example, we determine the most probable model among a set of possible profiles from the analysis of a statistically poor spectrum. We also show how to calculate the probability distribution of the main spectral component without having to determine uniquely the spectrum modeling. For these two studies, we implement the program Nested fit to calculate the different probability distrib...

  17. A Bayesian nonparametric meta-analysis model.

    Science.gov (United States)

    Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G

    2015-03-01

    In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall effect size, such models may be adequate, but for prediction, they surely are not if the effect-size distribution exhibits non-normal behavior. To address this issue, we propose a Bayesian nonparametric meta-analysis model, which can describe a wider range of effect-size distributions, including unimodal symmetric distributions, as well as skewed and more multimodal distributions. We demonstrate our model through the analysis of real meta-analytic data arising from behavioral-genetic research. We compare the predictive performance of the Bayesian nonparametric model against various conventional and more modern normal fixed-effects and random-effects models.

  18. [A medical image semantic modeling based on hierarchical Bayesian networks].

    Science.gov (United States)

    Lin, Chunyi; Ma, Lihong; Yin, Junxun; Chen, Jianyu

    2009-04-01

    A semantic modeling approach for medical image semantic retrieval based on hierarchical Bayesian networks is proposed, tailored to the characteristics of medical images. It uses GMMs (Gaussian mixture models) to map low-level image features into object semantics with probabilities, then captures high-level semantics by fusing these object semantics with a Bayesian network, thereby building a multi-layer medical image semantic model that enables automatic image annotation and semantic retrieval using various keywords at different semantic levels. To validate the method, we built a multi-level semantic model from a small set of astrocytoma MRI (magnetic resonance imaging) samples, in order to extract semantics of astrocytoma malignancy grade. Experimental results show that the approach performs well.
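    A hedged sketch of the first (GMM) layer only: mapping low-level feature vectors to object-semantic posteriors with one Gaussian mixture per semantic class. The class names and feature dimensions below are invented for illustration.

        import numpy as np
        from scipy.special import logsumexp
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(1)
        classes = ["tumor_core", "edema", "normal_tissue"]      # hypothetical semantics
        train = {c: rng.normal(loc=3 * k, scale=1.0, size=(200, 5))
                 for k, c in enumerate(classes)}                # toy feature vectors

        gmms = {c: GaussianMixture(n_components=2, random_state=0).fit(X)
                for c, X in train.items()}
        log_prior = {c: np.log(1.0 / len(classes)) for c in classes}

        def object_semantics(x):
            """Posterior P(class | feature) via Bayes' rule over the per-class GMMs."""
            logp = np.array([log_prior[c] + gmms[c].score_samples(x[None, :])[0]
                             for c in classes])
            return dict(zip(classes, np.exp(logp - logsumexp(logp))))

        print(object_semantics(rng.normal(loc=3.0, scale=1.0, size=5)))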

  19. Compressive dynamic range imaging via Bayesian shrinkage dictionary learning

    Science.gov (United States)

    Yuan, Xin

    2016-12-01

    We apply Bayesian shrinkage dictionary learning to compressive dynamic-range imaging. By attenuating the luminous intensity impinging upon the detector at the pixel level, we demonstrate a conceptual design of an 8-bit camera to sample high-dynamic-range scenes with a single snapshot. Coding strategies for both monochrome and color cameras are proposed. A Bayesian reconstruction algorithm is developed to learn a dictionary in situ on the sampled image, for joint reconstruction and demosaicking. We use global-local shrinkage priors to learn the dictionary and dictionary coefficients representing the data. Simulation results demonstrate the feasibility of the proposed camera and the superior performance of the Bayesian shrinkage dictionary learning algorithm.

  20. A Gentle Introduction to Bayesian Analysis : Applications to Developmental Research

    NARCIS (Netherlands)

    Van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B.; Neyer, Franz J.; van Aken, Marcel A G

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to properly interpret the results. First, t

  1. A SAS Interface for Bayesian Analysis with WinBUGS

    Science.gov (United States)

    Zhang, Zhiyong; McArdle, John J.; Wang, Lijuan; Hamagami, Fumiaki

    2008-01-01

    Bayesian methods are becoming very popular despite some practical difficulties in implementation. To assist in the practical application of Bayesian methods, we show how to implement Bayesian analysis with WinBUGS as part of a standard set of SAS routines. This implementation procedure is first illustrated by fitting a multiple regression model…

  2. Book review: Bayesian analysis for population ecology

    Science.gov (United States)

    Link, William A.

    2011-01-01

    Brian Dennis described the field of ecology as “fertile, uncolonized ground for Bayesian ideas.” He continued: “The Bayesian propagule has arrived at the shore. Ecologists need to think long and hard about the consequences of a Bayesian ecology. The Bayesian outlook is a successful competitor, but is it a weed? I think so.” (Dennis 2004)

  3. Integrative bayesian network analysis of genomic data.

    Science.gov (United States)

    Ni, Yang; Stingo, Francesco C; Baladandayuthapani, Veerabhadran

    2014-01-01

    Rapid development of genome-wide profiling technologies has made it possible to conduct integrative analysis on genomic data from multiple platforms. In this study, we develop a novel integrative Bayesian network approach to investigate the relationships between genetic and epigenetic alterations as well as how these mutations affect a patient's clinical outcome. We take a Bayesian network approach that admits a convenient decomposition of the joint distribution into local distributions. Exploiting prior biological knowledge about regulatory mechanisms, we model each local distribution as a linear regression; a toy version of this decomposition is sketched after this abstract. This allows us to analyze multi-platform genome-wide data in a computationally efficient manner. We illustrate the performance of our approach through simulation studies. Our methods are motivated by and applied to a multi-platform glioblastoma dataset, from which we reveal several biologically relevant relationships that have been validated in the literature as well as new genes that could potentially be novel biomarkers for cancer progression.
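    A sketch under stated assumptions: a toy DAG whose local conditional distributions are linear-Gaussian regressions of each node on its parents, so the joint log-density decomposes into a sum of local terms. The node names and coefficients are invented; in practice they would be estimated from data.

        import numpy as np
        from scipy.stats import norm

        # parents[node] lists the regressors of that node's local distribution
        parents = {"methylation": [], "copy_number": [],
                   "expression": ["methylation", "copy_number"],
                   "outcome": ["expression"]}
        # (coefficients, intercept, noise sd) per node -- set by hand for illustration
        params = {"methylation": (np.array([]), 0.0, 1.0),
                  "copy_number": (np.array([]), 0.0, 1.0),
                  "expression": (np.array([0.8, 0.5]), 0.1, 0.7),
                  "outcome": (np.array([-1.2]), 2.0, 0.5)}

        def joint_logpdf(values):
            """log p(all nodes) = sum over nodes of log p(node | parents)."""
            total = 0.0
            for node, pa in parents.items():
                beta, b0, sd = params[node]
                mean = b0 + sum(bj * values[pj] for bj, pj in zip(beta, pa))
                total += norm.logpdf(values[node], loc=mean, scale=sd)
            return total

        print(joint_logpdf({"methylation": 0.2, "copy_number": -0.1,
                            "expression": 0.3, "outcome": 1.6}))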

  4. Bayesian analysis of multiple direct detection experiments

    CERN Document Server

    Arina, Chiara

    2013-01-01

    Bayesian methods offer a coherent and efficient framework for implementing uncertainties into induction problems. In this article, we review how this approach applies to the analysis of dark matter direct detection experiments. In particular we discuss the exclusion limit of XENON100 and the debated hints of detection under the hypothesis of a WIMP signal. Within parameter inference, marginalizing consistently over uncertainties to extract robust posterior probability distributions, we find that the claimed tension between XENON100 and the other experiments can be partially alleviated in an isospin-violating scenario, while the elastic scattering model appears to be compatible with the classical approach. We then move to model comparison, for which Bayesian methods are particularly well suited. Firstly, we investigate the annual modulation seen in CoGeNT data, finding that there is weak evidence for a modulation. Modulation models due to other physics compare unfavorably with the WIMP models, paying the price for th...

  5. An improved Bayesian matting method based on image statistic characteristics

    Science.gov (United States)

    Sun, Wei; Luo, Siwei; Wu, Lina

    2015-03-01

    Image matting is an important task in image and video editing and has been studied for more than 30 years. In this paper we propose an improved interactive matting method. Starting from a coarse user-guided trimap, we first perform a color estimation based on texture and color information and use the result to refine the original trimap. Then, with the new trimap, we apply a soft matting process, which is an improved Bayesian matting with smoothness constraints. Experimental results on natural images show that the method is useful, especially for images whose background has similar texture features or for which a precise trimap is hard to provide.

  6. Multifrequency Bayesian compressive sensing methods for microwave imaging.

    Science.gov (United States)

    Poli, Lorenzo; Oliveri, Giacomo; Ding, Ping Ping; Moriyama, Toshifumi; Massa, Andrea

    2014-11-01

    The Bayesian retrieval of sparse scatterers under multifrequency transverse magnetic illuminations is addressed. Two innovative imaging strategies are formulated to process the spectral content of microwave scattering data according to either a frequency-hopping multistep scheme or a multifrequency one-shot scheme. To solve the associated inverse problems, customized implementations of single-task and multitask Bayesian compressive sensing are introduced. A set of representative numerical results is discussed to assess the effectiveness and the robustness against the noise of the proposed techniques also in comparison with some state-of-the-art deterministic strategies.

  7. The Relevance Voxel Machine (RVoxM): A Bayesian Method for Image-Based Prediction

    DEFF Research Database (Denmark)

    Sabuncu, Mert R.; Van Leemput, Koen

    2011-01-01

    This paper presents the Relevance Voxel Machine (RVoxM), a Bayesian multivariate pattern analysis (MVPA) algorithm that is specifically designed for making predictions based on image data. In contrast to generic MVPA algorithms that have often been used for this purpose, the method is designed to ...

  8. Bayesian Analysis of Individual Level Personality Dynamics

    Directory of Open Access Journals (Sweden)

    Edward Cripps

    2016-07-01

    Full Text Available A Bayesian technique with analyses of within-person processes at the level of the individual is presented. The approach is used to examine whether the patterns of within-person responses on a 12-trial simulation task are consistent with the predictions of ITA theory (Dweck, 1999). ITA theory states that the performance of an individual with an entity theory of ability is more likely to spiral down following a failure experience than the performance of an individual with an incremental theory of ability. This is because entity theorists interpret failure experiences as evidence of a lack of ability, which they believe is largely innate and therefore relatively fixed; whilst incremental theorists believe in the malleability of abilities and interpret failure experiences as evidence of more controllable factors such as poor strategy or lack of effort. The results of our analyses support ITA theory at both the within- and between-person levels of analyses and demonstrate the benefits of Bayesian techniques for the analysis of within-person processes. These include more formal specification of the theory and the ability to draw inferences about each individual, which allows for more nuanced interpretations of individuals within a personality category, such as differences in the individual probabilities of spiralling. While Bayesian techniques have many potential advantages for the analyses of within-person processes at the individual level, ease of use is not one of them for psychologists trained in traditional frequentist statistical techniques.

  9. Bayesian analysis of factorial designs.

    Science.gov (United States)

    Rouder, Jeffrey N; Morey, Richard D; Verhagen, Josine; Swagman, April R; Wagenmakers, Eric-Jan

    2017-06-01

    This article provides a Bayes factor approach to multiway analysis of variance (ANOVA) that allows researchers to state graded evidence for effects or invariances as determined by the data. ANOVA is conceptualized as a hierarchical model where levels are clustered within factors. The development is comprehensive in that it includes Bayes factors for fixed and random effects and for within-subjects, between-subjects, and mixed designs. Different model construction and comparison strategies are discussed, and an example is provided. We show how Bayes factors may be computed with the BayesFactor package in R and with the JASP statistical package. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
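    A rough Python sketch of the idea (the paper itself uses the BayesFactor R package and JASP, whose default JZS priors this does not reproduce): a BIC-based approximation to the Bayes factor comparing a one-factor ANOVA model against the null model.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(2)
        df = pd.DataFrame({"group": np.repeat(["a", "b", "c"], 30),
                           "y": np.concatenate([rng.normal(m, 1.0, 30)
                                                for m in (0.0, 0.4, 0.8)])})

        m0 = smf.ols("y ~ 1", data=df).fit()          # null: grand mean only
        m1 = smf.ols("y ~ C(group)", data=df).fit()   # effect of the factor

        # BF10 ~= exp((BIC0 - BIC1) / 2), the Schwarz approximation
        bf10 = np.exp((m0.bic - m1.bic) / 2.0)
        print(f"approximate BF10 = {bf10:.2f}")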

  10. Bayesian Inference in Statistical Analysis

    CERN Document Server

    Box, George E P

    2011-01-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson, The Statistical Analysis of Time Series; T. S. Arthanari & Yadolah Dodge, Mathematical Programming in Statistics; Emil Artin, Geometric Algebra; Norman T. J. Bailey, The Elements of Stochastic Processes with Applications to the Natural Sciences; Rob

  11. A Sparse Bayesian Learning Algorithm for Longitudinal Image Data.

    Science.gov (United States)

    Sabuncu, Mert R

    2015-10-01

    Longitudinal imaging studies, where serial (multiple) scans are collected on each individual, are becoming increasingly widespread. The field of machine learning has in general neglected the longitudinal design, since many algorithms are built on the assumption that each datapoint is an independent sample. Thus, the application of general purpose machine learning tools to longitudinal image data can be sub-optimal. Here, we present a novel machine learning algorithm designed to handle longitudinal image datasets. Our approach builds on a sparse Bayesian image-based prediction algorithm. Our empirical results demonstrate that the proposed method can offer a significant boost in prediction performance with longitudinal clinical data.

  12. Analysis of COSIMA spectra: Bayesian approach

    Directory of Open Access Journals (Sweden)

    H. J. Lehto

    2014-11-01

    Full Text Available We describe the use of Bayesian analysis methods applied to TOF-SIMS spectra. The method finds the probability density functions of measured line parameters (number of lines, and their widths, peak amplitudes, integrated amplitudes and positions) in mass intervals over the whole spectrum. We discuss the results we can expect from this analysis. We discuss the effects the instrument dead time causes in the COSIMA TOF-SIMS and address this issue in a new way. The derived line parameters can be used to further calibrate the mass scaling of TOF-SIMS and to feed the results into other analysis methods such as multivariate analyses of spectra. We intend to use the method in two ways, first as a comprehensive tool to perform quantitative analysis of spectra, and second as a fast tool for studying interesting targets for obtaining additional TOF-SIMS measurements of the sample, a property unique to COSIMA. Finally, we point out that the Bayesian method can be thought of as a means to solve inverse problems but with forward calculations only.

  13. Bayesian global analysis of neutrino oscillation data

    CERN Document Server

    Bergstrom, Johannes; Maltoni, Michele; Schwetz, Thomas

    2015-01-01

    We perform a Bayesian analysis of current neutrino oscillation data. When estimating the oscillation parameters we find that the results generally agree with those of the $\chi^2$ method, with some differences involving $s_{23}^2$ and CP-violating effects. We discuss the additional subtleties caused by the circular nature of the CP-violating phase, and how it is possible to obtain correlation coefficients with $s_{23}^2$. When performing model comparison, we find that there is no significant evidence for any mass ordering, any octant of $s_{23}^2$ or a deviation from maximal mixing, nor the presence of CP-violation.

  14. Bayesian Analysis of Type Ia Supernova Data

    Institute of Scientific and Technical Information of China (English)

    王晓峰; 周旭; 李宗伟; 陈黎

    2003-01-01

    Recently, the distances to type Ia supernovae (SNe Ia) at z ~ 0.5 have been measured with the motivation of estimating cosmological parameters. However, different sleuthing techniques tend to give inconsistent measurements for SN Ia distances (~0.3 mag), which significantly affects the determination of cosmological parameters. A Bayesian "hyper-parameter" procedure is used to jointly analyse the current SN Ia data, taking into account the relative weights of the different datasets. For a flat Universe, the combined analysis yields ΩM = 0.20 ± 0.07.

  15. Doing bayesian data analysis a tutorial with R and BUGS

    CERN Document Server

    Kruschke, John K

    2011-01-01

    There is an explosion of interest in Bayesian statistics, primarily because recently created computational methods have finally made Bayesian analysis accessible to a wide audience. Doing Bayesian Data Analysis: A Tutorial Introduction with R and BUGS provides an accessible approach to Bayesian data analysis, as material is explained clearly with concrete examples. The book begins with the basics, including essential concepts of probability and random sampling, and gradually progresses to advanced hierarchical modeling methods for realistic data. The text delivers comprehensive coverage of all

  16. Assessment of CT image quality using a Bayesian approach

    Science.gov (United States)

    Reginatto, M.; Anton, M.; Elster, C.

    2017-08-01

    One of the most promising approaches for evaluating CT image quality is task-specific quality assessment. This involves a simplified version of a clinical task, e.g. deciding whether an image belongs to the class of images that contain the signature of a lesion or not. Task-specific quality assessment can be done by model observers, which are mathematical procedures that carry out the classification task. The most widely used figure of merit for CT image quality is the area under the ROC curve (AUC), a quantity which characterizes the performance of a given model observer. In order to estimate AUC from a finite sample of images, different approaches from classical statistics have been suggested. The goal of this paper is to introduce task-specific quality assessment of CT images to metrology and to propose a novel Bayesian estimation of AUC for the channelized Hotelling observer (CHO) applied to the task of detecting a lesion at a known image location. It is assumed that signal-present and signal-absent images follow multivariate normal distributions with the same covariance matrix. The Bayesian approach results in a posterior distribution for the AUC of the CHO which provides in addition a complete characterization of the uncertainty of this figure of merit. The approach is illustrated by its application to both simulated and experimental data.
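    A minimal sketch of the figure of merit itself (a plug-in estimate, not the paper's Bayesian posterior over AUC): the channelized Hotelling observer under the stated assumption of multivariate normal channel outputs with a common covariance matrix. The channel count and data are synthetic.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(3)
        n_ch = 6                                   # number of channel outputs per image
        mu0, mu1 = np.zeros(n_ch), np.full(n_ch, 0.4)
        cov = 0.5 * np.eye(n_ch) + 0.5            # shared covariance
        absent = rng.multivariate_normal(mu0, cov, 300)    # signal-absent channel data
        present = rng.multivariate_normal(mu1, cov, 300)   # signal-present channel data

        m0, m1 = absent.mean(0), present.mean(0)
        S = 0.5 * (np.cov(absent.T) + np.cov(present.T))   # pooled covariance
        w = np.linalg.solve(S, m1 - m0)                    # Hotelling template

        # detectability and AUC under the equal-covariance Gaussian model
        d2 = (m1 - m0) @ np.linalg.solve(S, m1 - m0)
        auc = norm.cdf(np.sqrt(d2) / np.sqrt(2.0))
        print(f"plug-in CHO AUC = {auc:.3f}")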

  17. A Variational Bayesian Approach to Multiframe Image Restoration.

    Science.gov (United States)

    Sonogashira, Motoharu; Funatomi, Takuya; Iiyama, Masaaki; Minoh, Michihiko

    2017-03-06

    Image restoration is a fundamental problem in the field of image processing. The key objective of image restoration is to recover clean images from images degraded by noise and blur. Recently, a family of new statistical techniques called variational Bayes (VB) has been introduced to image restoration, which enables us to automatically tune parameters that control restoration. While information from one image is often insufficient for high-quality restoration, current state-of-the-art methods of image restoration via VB approaches use only a single degraded image to recover a clean image. In this paper, we propose a novel method of multiframe image restoration via a VB approach, which can achieve higher image quality while tuning parameters automatically. Given multiple degraded images, this method jointly estimates a clean image and other parameters, including an image warping parameter introduced for the use of multiple images, through Bayesian inference that we enable by making full use of VB techniques. Through various experiments, we demonstrate the effectiveness of our multiframe method by comparing it with its single-frame counterpart, and also show the advantages of our VB approach over non-VB approaches.

  18. Bayesian estimation of the multifractality parameter for image texture using a Whittle approximation

    CERN Document Server

    Combrexelle, Sébastien; Dobigeon, Nicolas; Tourneret, Jean-Yves; McLaughlin, Steve; Abry, Patrice

    2014-01-01

    Texture characterization is a central element in many image processing applications. Multifractal analysis is a useful signal and image processing tool, yet, the accurate estimation of multifractal parameters for image texture remains a challenge. This is due in the main to the fact that current estimation procedures consist of performing linear regressions across frequency scales of the two-dimensional (2D) dyadic wavelet transform, for which only a few such scales are computable for images. The strongly non-Gaussian nature of multifractal processes, combined with their complicated dependence structure, makes it difficult to develop suitable models for parameter estimation. Here, we propose a Bayesian procedure that addresses the difficulties in the estimation of the multifractality parameter. The originality of the procedure is threefold: The construction of a generic semi-parametric statistical model for the logarithm of wavelet leaders; the formulation of Bayesian estimators that are associated with this ...

  19. Bayesian analysis of multiple direct detection experiments

    Science.gov (United States)

    Arina, Chiara

    2014-12-01

    Bayesian methods offer a coherent and efficient framework for implementing uncertainties into induction problems. In this article, we review how this approach applies to the analysis of dark matter direct detection experiments. In particular we discuss the exclusion limit of XENON100 and the debated hints of detection under the hypothesis of a WIMP signal. Within parameter inference, marginalizing consistently over uncertainties to extract robust posterior probability distributions, we find that the claimed tension between XENON100 and the other experiments can be partially alleviated in an isospin-violating scenario, while the elastic scattering model appears to be compatible with the frequentist statistical approach. We then move to model comparison, for which Bayesian methods are particularly well suited. Firstly, we investigate the annual modulation seen in CoGeNT data, finding that there is weak evidence for a modulation. Modulation models due to other physics compare unfavorably with the WIMP models, paying the price for their excessive complexity. Secondly, we confront several coherent scattering models to determine the current best physical scenario compatible with the experimental hints. We find that exothermic and inelastic dark matter are moderately disfavored against the elastic scenario, while the isospin-violating model has similar evidence. Lastly, the Bayes factor gives inconclusive evidence for an incompatibility between the data sets of XENON100 and the hints of detection. The same question assessed with goodness of fit would indicate a 2 σ discrepancy. This suggests that more data are therefore needed to settle this question.

  20. Bayesian Analysis of High Dimensional Classification

    Science.gov (United States)

    Mukhopadhyay, Subhadeep; Liang, Faming

    2009-12-01

    Modern data mining and bioinformatics have presented an important playground for statistical learning techniques, where the number of input variables is possibly much larger than the sample size of the training data. In supervised learning, logistic regression or probit regression can be used to model a binary output and form perceptron classification rules based on Bayesian inference. In these settings, there is considerable interest in searching for sparse models in the high-dimensional regression/classification setup. We first discuss two common challenges for analyzing high-dimensional data. The first is the curse of dimensionality: the complexity of many existing algorithms scales exponentially with the dimensionality of the space, so the algorithms soon become computationally intractable and therefore inapplicable in many real applications. The second is multicollinearity among the predictors, which severely slows down the algorithms. In order to make Bayesian analysis operational in high dimension we propose a novel 'hierarchical stochastic approximation Monte Carlo' (HSAMC) algorithm, which overcomes the curse of dimensionality and the multicollinearity of predictors in high dimension, and also possesses a self-adjusting mechanism to avoid local minima separated by high energy barriers. Models and methods are illustrated by simulations inspired by the field of genomics. Numerical results indicate that HSAMC can work as a general model selection sampler in high-dimensional complex model spaces.

  1. A Bayesian Approach for Segmentation in Stereo Image Sequences

    Directory of Open Access Journals (Sweden)

    Tzovaras Dimitrios

    2002-01-01

    Full Text Available Stereoscopic image sequence processing has been the focus of considerable attention in recent literature for videoconference applications. A novel Bayesian scheme is proposed in this paper, for the segmentation of a noisy stereoscopic image sequence. More specifically, occlusions and visible foreground and background regions are detected between the left and the right frame while the uncovered-background areas are identified between two successive frames of the sequence. Combined hypotheses are used for the formulation of the Bayes decision rule which employs a single intensity-difference measurement at each pixel. Experimental results illustrating the performance of the proposed technique are presented and evaluated in videoconference applications.
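    A hedged toy version of such a per-pixel Bayes decision rule: classify each pixel from a single intensity-difference measurement d, assuming zero-mean Gaussian likelihoods whose variances differ by hypothesis. The label set, priors and variances below are invented; the paper's combined hypotheses are richer than this.

        import numpy as np
        from scipy.stats import norm

        labels = ["unchanged", "uncovered_background", "moving_object"]
        prior = np.array([0.7, 0.1, 0.2])
        sigma = np.array([2.0, 8.0, 20.0])   # small differences -> unchanged, etc.

        def classify(diff_image):
            # log posterior (up to a constant) for each hypothesis at every pixel
            logpost = np.stack([np.log(p) + norm.logpdf(diff_image, scale=s)
                                for p, s in zip(prior, sigma)])
            return np.take(np.array(labels), np.argmax(logpost, axis=0))

        d = np.array([[0.5, -30.0], [9.0, 1.2]])     # toy intensity differences
        print(classify(d))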

  2. Logarithmic Laplacian Prior Based Bayesian Inverse Synthetic Aperture Radar Imaging.

    Science.gov (United States)

    Zhang, Shuanghui; Liu, Yongxiang; Li, Xiang; Bi, Guoan

    2016-04-28

    This paper presents a novel Inverse Synthetic Aperture Radar (ISAR) imaging algorithm based on a new sparse prior, known as the logarithmic Laplacian prior. The newly proposed logarithmic Laplacian prior has a narrower main lobe with higher tail values than the Laplacian prior, which helps to achieve performance improvement on sparse representation. The logarithmic Laplacian prior is used for ISAR imaging within the Bayesian framework to achieve a better focused radar image. In the proposed method of ISAR imaging, the phase errors are jointly estimated based on the minimum entropy criterion to accomplish autofocusing. The maximum a posteriori (MAP) estimation and the maximum likelihood estimation (MLE) are utilized to estimate the model parameters to avoid a manual tuning process. Additionally, the fast Fourier transform (FFT) and Hadamard product are used to reduce the required computation. Experimental results based on both simulated and measured data validate that the proposed algorithm outperforms the traditional sparse ISAR imaging algorithms in terms of resolution improvement and noise suppression.

  3. The bugs book a practical introduction to Bayesian analysis

    CERN Document Server

    Lunn, David; Best, Nicky; Thomas, Andrew; Spiegelhalter, David

    2012-01-01

    Introduction: Probability and Parameters; Probability; Probability distributions; Calculating properties of probability distributions; Monte Carlo integration; Monte Carlo Simulations Using BUGS; Introduction to BUGS; DoodleBUGS; Using BUGS to simulate from distributions; Transformations of random variables; Complex calculations using Monte Carlo; Multivariate Monte Carlo analysis; Predictions with unknown parameters; Introduction to Bayesian Inference; Bayesian learning; Posterior predictive distributions; Conjugate Bayesian inference; Inference about a discrete parameter; Combinations of conjugate analyses; Bayesian and classica...

  4. Photoacoustic image reconstruction based on Bayesian compressive sensing algorithm

    Institute of Scientific and Technical Information of China (English)

    Mingjian Sun; Naizhang Feng; Yi Shen; Jiangang Li; Liyong Ma; Zhenghua Wu

    2011-01-01

    The photoacoustic tomography (PAT) method, based on compressive sensing (CS) theory, requires that, for the CS reconstruction, the desired image should have a sparse representation in a known transform domain. However, the sparsity of photoacoustic signals is destroyed because noises always exist. Therefore, the original sparse signal cannot be effectively recovered using the general reconstruction algorithm. In this study, Bayesian compressive sensing (BCS) is employed to obtain highly sparse representations of photoacoustic images based on a set of noisy CS measurements. Results of simulation demonstrate that the BCS-reconstructed image achieves performance superior to other state-of-the-art CS-reconstruction algorithms.
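    A sketch of the recovery step using sparse Bayesian learning as implemented in scikit-learn's ARDRegression (a relevance-vector-machine-style BCS solver, not the authors' exact algorithm): recover a sparse image vector from noisy compressive measurements y = Phi x + n.

        import numpy as np
        from sklearn.linear_model import ARDRegression

        rng = np.random.default_rng(4)
        n_pix, n_meas = 256, 96
        x = np.zeros(n_pix)
        x[rng.choice(n_pix, 10, replace=False)] = rng.normal(0, 2, 10)   # sparse "image"
        Phi = rng.standard_normal((n_meas, n_pix)) / np.sqrt(n_meas)     # CS matrix
        y = Phi @ x + rng.normal(0, 0.01, n_meas)                        # noisy measurements

        bcs = ARDRegression(fit_intercept=False).fit(Phi, y)
        x_hat = bcs.coef_
        print("relative reconstruction error:",
              np.linalg.norm(x_hat - x) / np.linalg.norm(x))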

  5. Confirmation via Analogue Simulation: A Bayesian Analysis

    CERN Document Server

    Dardashti, Radin; Thebault, Karim P Y; Winsberg, Eric

    2016-01-01

    Analogue simulation is a novel mode of scientific inference found increasingly within modern physics, and yet all but neglected in the philosophical literature. Experiments conducted upon a table-top 'source system' are taken to provide insight into features of an inaccessible 'target system', based upon a syntactic isomorphism between the relevant modelling frameworks. An important example is the use of acoustic 'dumb hole' systems to simulate gravitational black holes. In a recent paper it was argued that there exist circumstances in which confirmation via analogue simulation can obtain; in particular when the robustness of the isomorphism is established via universality arguments. The current paper supports these claims via an analysis in terms of Bayesian confirmation theory.

  6. BioEM: GPU-accelerated computing of Bayesian inference of electron microscopy images

    CERN Document Server

    Cossio, Pilar; Baruffa, Fabio; Rampp, Markus; Lindenstruth, Volker; Hummer, Gerhard

    2016-01-01

    In cryo-electron microscopy (EM), molecular structures are determined from large numbers of projection images of individual particles. To harness the full power of this single-molecule information, we use the Bayesian inference of EM (BioEM) formalism. By ranking structural models using posterior probabilities calculated for individual images, BioEM in principle addresses the challenge of working with highly dynamic or heterogeneous systems not easily handled in traditional EM reconstruction. However, the calculation of these posteriors for large numbers of particles and models is computationally demanding. Here we present highly parallelized, GPU-accelerated computer software that performs this task efficiently. Our flexible formulation employs CUDA, OpenMP, and MPI parallelization combined with both CPU and GPU computing. The resulting BioEM software scales nearly ideally both on pure CPU and on CPU+GPU architectures, thus enabling Bayesian analysis of tens of thousands of images in a reasonable time. The g...

  7. Bayesian Framework for Automatic Image Annotation Using Visual Keywords

    Science.gov (United States)

    Agrawal, Rajeev; Wu, Changhua; Grosky, William; Fotouhi, Farshad

    In this paper, we propose a Bayesian probability based framework, which uses visual keywords and already available text keywords to automatically annotate the images. Taking the cue from document classification, an image can be considered as a document and objects present in it as words. Using this concept, we can create visual keywords by dividing an image into tiles based on a certain template size. Visual keywords are simple vector quantization of small-sized image tiles. We estimate the conditional probability of a text keyword in the presence of visual keywords, described by a multivariate Gaussian distribution. We demonstrate the effectiveness of our approach by comparing predicted text annotations with manual annotations and analyze the effect of text annotation length on the performance.
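    An assumption-laden sketch of the visual-keyword construction this abstract describes: cut images into fixed-size tiles and vector-quantize the tiles with k-means, so each image is described by a histogram over the resulting visual keywords. The tile size and vocabulary size are arbitrary choices here, and the text-keyword probability layer is omitted.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(5)
        images = rng.random((20, 64, 64))     # toy grayscale images
        T, K = 8, 32                          # tile size and number of visual keywords

        def tiles(img, t=T):
            h, w = img.shape
            return np.array([img[i:i+t, j:j+t].ravel()
                             for i in range(0, h - t + 1, t)
                             for j in range(0, w - t + 1, t)])

        codebook = KMeans(n_clusters=K, n_init=5, random_state=0).fit(
            np.vstack([tiles(im) for im in images]))

        def visual_keyword_histogram(img):
            ids = codebook.predict(tiles(img))
            return np.bincount(ids, minlength=K) / ids.size

        print(visual_keyword_histogram(images[0]).round(3))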

  8. BEAST: Bayesian evolutionary analysis by sampling trees

    Directory of Open Access Journals (Sweden)

    Drummond Alexei J

    2007-11-01

    Full Text Available Background: The evolutionary analysis of molecular sequence variation is a statistical enterprise. This is reflected in the increased use of probabilistic models for phylogenetic inference, multiple sequence alignment, and molecular population genetics. Here we present BEAST: a fast, flexible software architecture for Bayesian analysis of molecular sequences related by an evolutionary tree. A large number of popular stochastic models of sequence evolution are provided and tree-based models suitable for both within- and between-species sequence data are implemented. Results: BEAST version 1.4.6 consists of 81000 lines of Java source code, 779 classes and 81 packages. It provides models for DNA and protein sequence evolution, highly parametric coalescent analysis, relaxed clock phylogenetics, non-contemporaneous sequence data, statistical alignment and a wide range of options for prior distributions. BEAST source code is object-oriented, modular in design and freely available at http://beast-mcmc.googlecode.com/ under the GNU LGPL license. Conclusion: BEAST is a powerful and flexible evolutionary analysis package for molecular sequence variation. It also provides a resource for the further development of new models and statistical methods of evolutionary analysis.

  9. Analysis of COSIMA spectra: Bayesian approach

    Directory of Open Access Journals (Sweden)

    H. J. Lehto

    2015-06-01

    secondary ion mass spectrometer (TOF-SIMS) spectra. The method is applied to the COmetary Secondary Ion Mass Analyzer (COSIMA) TOF-SIMS mass spectra, where the analysis can be broken into subgroups of lines close to integer mass values. The effects of the instrumental dead time are discussed in a new way. The method finds the joint probability density functions of measured line parameters (number of lines, and their widths, peak amplitudes, integrated amplitudes and positions). In the case of two or more lines, these distributions can take complex forms. The derived line parameters can be used to further calibrate the mass scaling of TOF-SIMS and to feed the results into other analysis methods such as multivariate analyses of spectra. We intend to use the method, first as a comprehensive tool to perform quantitative analysis of spectra, and second as a fast tool for studying interesting targets for obtaining additional TOF-SIMS measurements of the sample, a property unique to COSIMA. Finally, we point out that the Bayesian method can be thought of as a means to solve inverse problems but with forward calculations only, with no iterative corrections or other manipulation of the observed data.

  10. Bayesian data analysis in population ecology: motivations, methods, and benefits

    Science.gov (United States)

    Dorazio, Robert

    2016-01-01

    During the 20th century ecologists largely relied on the frequentist system of inference for the analysis of their data. However, in the past few decades ecologists have become increasingly interested in the use of Bayesian methods of data analysis. In this article I provide guidance to ecologists who would like to decide whether Bayesian methods can be used to improve their conclusions and predictions. I begin by providing a concise summary of Bayesian methods of analysis, including a comparison of differences between Bayesian and frequentist approaches to inference when using hierarchical models. Next I provide a list of problems where Bayesian methods of analysis may arguably be preferred over frequentist methods. These problems are usually encountered in analyses based on hierarchical models of data. I describe the essentials required for applying modern methods of Bayesian computation, and I use real-world examples to illustrate these methods. I conclude by summarizing what I perceive to be the main strengths and weaknesses of using Bayesian methods to solve ecological inference problems.

  11. Developing and Testing a Bayesian Analysis of Fluorescence Lifetime Measurements

    Science.gov (United States)

    Needleman, Daniel J.

    2017-01-01

    FRET measurements can provide dynamic spatial information on length scales smaller than the diffraction limit of light. Several methods exist to measure FRET between fluorophores, including Fluorescence Lifetime Imaging Microscopy (FLIM), which relies on the reduction of fluorescence lifetime when a fluorophore is undergoing FRET. FLIM measurements take the form of histograms of photon arrival times, containing contributions from a mixed population of fluorophores both undergoing and not undergoing FRET, with the measured distribution being a mixture of exponentials of different lifetimes. Here, we present an analysis method based on Bayesian inference that rigorously takes into account several experimental complications. We test the precision and accuracy of our analysis on controlled experimental data and verify that we can faithfully extract model parameters, both in the low-photon and low-fraction regimes. PMID:28060890
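    A simplified sketch of the core inference (ignoring the instrument response and background that the paper accounts for): photon arrival times drawn from a two-lifetime exponential mixture, with a grid posterior over the FRET fraction f under a uniform prior. The lifetimes are treated as known for brevity.

        import numpy as np

        rng = np.random.default_rng(6)
        tau_fret, tau_free = 1.0, 3.0          # lifetimes in ns, assumed known
        f_true, n_photons = 0.3, 2000
        is_fret = rng.random(n_photons) < f_true
        t = np.where(is_fret, rng.exponential(tau_fret, n_photons),
                     rng.exponential(tau_free, n_photons))

        f_grid = np.linspace(0.001, 0.999, 999)
        # mixture density per photon for each grid value of f, then log-likelihood
        mix = (f_grid[:, None] / tau_fret * np.exp(-t[None, :] / tau_fret)
               + (1 - f_grid[:, None]) / tau_free * np.exp(-t[None, :] / tau_free))
        loglik = np.log(mix).sum(axis=1)
        post = np.exp(loglik - loglik.max())
        post /= post.sum()

        print("posterior mean of FRET fraction:", float(np.sum(f_grid * post)))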

  12. Stochastic back analysis of permeability coefficient using generalized Bayesian method

    Institute of Scientific and Technical Information of China (English)

    Zheng Guilan; Wang Yuan; Wang Fei; Yang Jian

    2008-01-01

    Owing to the fact that the conventional deterministic back analysis of the permeability coefficient cannot reflect the uncertainties of parameters, including the hydraulic head at the boundary, the permeability coefficient and measured hydraulic head, a stochastic back analysis taking consideration of uncertainties of parameters was performed using the generalized Bayesian method. Based on the stochastic finite element method (SFEM) for a seepage field, the variable metric algorithm and the generalized Bayesian method, formulas for stochastic back analysis of the permeability coefficient were derived. A case study of seepage analysis of a sluice foundation was performed to illustrate the proposed method. The results indicate that, with the generalized Bayesian method that considers the uncertainties of measured hydraulic head, the permeability coefficient and the hydraulic head at the boundary, both the mean and standard deviation of the permeability coefficient can be obtained and the standard deviation is less than that obtained by the conventional Bayesian method. Therefore, the present method is valid and applicable.

  13. Current trends in Bayesian methodology with applications

    CERN Document Server

    Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia

    2015-01-01

    Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics. Each chapter is self-contained and focuses on

  14. Bayesian approach for near-duplicate image detection

    CERN Document Server

    Bueno, Lucas Moutinho; Torres, Ricardo da Silva

    2011-01-01

    In this paper we propose a Bayesian approach for near-duplicate image detection, and investigate how different probabilistic models affect the performance obtained. The task of identifying an image whose metadata are missing is often demanded for a myriad of applications: metadata retrieval in cultural institutions, detection of copyright violations, investigation of latent cross-links in archives and libraries, duplicate elimination in storage management, etc. The majority of current solutions are based either on voting algorithms, which are very precise but expensive, or on the use of visual dictionaries, which are efficient but less precise. Our approach uses local descriptors in a novel way which, by a careful application of decision theory, allows very fine control of the compromise between precision and efficiency. In addition, the method attains a good compromise between those two axes, with more than 99% accuracy and fewer than 10 database operations.

  15. Bayesian Analysis of the Cosmic Microwave Background

    Science.gov (United States)

    Jewell, Jeffrey

    2007-01-01

    There is a wealth of cosmological information encoded in the spatial power spectrum of temperature anisotropies of the cosmic microwave background! Experiments designed to map the microwave sky are returning a flood of data (time streams of instrument response as a beam is swept over the sky) at several different frequencies (from 30 to 900 GHz), all with different resolutions and noise properties. The resulting analysis challenge is to estimate, and quantify our uncertainty in, the spatial power spectrum of the cosmic microwave background given the complexities of "missing data", foreground emission, and complicated instrumental noise. Bayesian formulation of this problem allows consistent treatment of many complexities, including complicated instrumental noise and foregrounds, and can be numerically implemented with Gibbs sampling. Gibbs sampling has now been validated as an efficient, statistically exact, and practically useful method for low-resolution analyses (as demonstrated on WMAP 1- and 3-year temperature and polarization data). Development continues for Planck - the goal is to exploit the unique capabilities of Gibbs sampling to directly propagate uncertainties in both foreground and instrument models to total uncertainty in cosmological parameters.

  16. Objective Bayesian Analysis of Skew-t Distributions

    KAUST Repository

    BRANCO, MARCIA D'ELIA

    2012-02-27

    We study the Jeffreys prior and its properties for the shape parameter of univariate skew-t distributions with linear and nonlinear Student's t skewing functions. In both cases, we show that the resulting priors for the shape parameter are symmetric around zero and proper. Moreover, we propose a Student's t approximation of the Jeffreys prior that makes an objective Bayesian analysis easy to perform. We carry out a Monte Carlo simulation study that demonstrates an overall better behaviour of the maximum a posteriori estimator compared with the maximum likelihood estimator. We also compare the frequentist coverage of the credible intervals based on the Jeffreys prior and its approximation and show that they are similar. We further discuss location-scale models under scale mixtures of skew-normal distributions and show some conditions for the existence of the posterior distribution and its moments. Finally, we present three numerical examples to illustrate the implications of our results on inference for skew-t distributions. © 2012 Board of the Foundation of the Scandinavian Journal of Statistics.

  17. A Bayesian approach to image expansion for improved definition.

    Science.gov (United States)

    Schultz, R R; Stevenson, R L

    1994-01-01

    Accurate image expansion is important in many areas of image analysis. Common methods of expansion, such as linear and spline techniques, tend to smooth the image data at edge regions. This paper introduces a method for nonlinear image expansion which preserves the discontinuities of the original image, producing an expanded image with improved definition. The maximum a posteriori (MAP) estimation techniques that are proposed for noise-free and noisy images result in the optimization of convex functionals. The expanded images produced from these methods will be shown to be aesthetically and quantitatively superior to images expanded by the standard methods of replication, linear interpolation, and cubic B-spline expansion.
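    A one-dimensional illustration only (the paper treats 2D images): MAP expansion with a discontinuity-preserving Huber prior on neighboring differences, plus a data term tying block averages of the expanded signal to the observations. The convexity of both terms is what keeps the optimization well behaved; all constants are invented.

        import numpy as np
        from scipy.optimize import minimize

        y = np.array([0.0, 0.1, 0.0, 1.0, 1.1, 1.0])   # low-resolution observation
        factor, sigma2, lam, delta = 4, 1e-3, 1.0, 0.05

        def downsample(x):
            return x.reshape(-1, factor).mean(axis=1)

        def huber(u):
            a = np.abs(u)
            return np.where(a <= delta, u ** 2, 2 * delta * a - delta ** 2)

        def objective(x):
            data = np.sum((downsample(x) - y) ** 2) / (2 * sigma2)
            prior = lam * np.sum(huber(np.diff(x)))
            return data + prior

        x0 = np.repeat(y, factor)                      # replication as the initializer
        res = minimize(objective, x0, method="L-BFGS-B")
        print(res.x.round(2))                          # the jump stays sharp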

  18. Bayesian analysis of a disability model for lung cancer survival.

    Science.gov (United States)

    Armero, C; Cabras, S; Castellanos, M E; Perra, S; Quirós, A; Oruezábal, M J; Sánchez-Rubio, J

    2016-02-01

    Bayesian reasoning, survival analysis and multi-state models are used to assess survival times for Stage IV non-small-cell lung cancer patients and the evolution of the disease over time. Bayesian estimation is done using minimum informative priors for the Weibull regression survival model, leading to an automatic inferential procedure. Markov chain Monte Carlo methods have been used for approximating posterior distributions and the Bayesian information criterion has been considered for covariate selection. In particular, the posterior distribution of the transition probabilities, resulting from the multi-state model, constitutes a very interesting tool which could be useful to help oncologists and patients make efficient and effective decisions.

  19. Logarithmic Laplacian Prior Based Bayesian Inverse Synthetic Aperture Radar Imaging

    Directory of Open Access Journals (Sweden)

    Shuanghui Zhang

    2016-04-01

    Full Text Available This paper presents a novel Inverse Synthetic Aperture Radar (ISAR) imaging algorithm based on a new sparse prior, known as the logarithmic Laplacian prior. The newly proposed logarithmic Laplacian prior has a narrower main lobe with higher tail values than the Laplacian prior, which helps to achieve performance improvement on sparse representation. The logarithmic Laplacian prior is used for ISAR imaging within the Bayesian framework to achieve a better focused radar image. In the proposed method of ISAR imaging, the phase errors are jointly estimated based on the minimum entropy criterion to accomplish autofocusing. The maximum a posteriori (MAP) estimation and the maximum likelihood estimation (MLE) are utilized to estimate the model parameters to avoid a manual tuning process. Additionally, the fast Fourier transform (FFT) and Hadamard product are used to reduce the required computation. Experimental results based on both simulated and measured data validate that the proposed algorithm outperforms the traditional sparse ISAR imaging algorithms in terms of resolution improvement and noise suppression.

  20. Bayesian analysis of MEG visual evoked responses

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, D.M.; George, J.S.; Wood, C.C.

    1999-04-01

    The authors developed a method for analyzing neural electromagnetic data that allows probabilistic inferences to be drawn about regions of activation. The method involves the generation of a large number of possible solutions which both fit the data and prior expectations about the nature of probable solutions, made explicit by a Bayesian formalism. In addition, they have introduced a model for the current distributions that produce MEG (and EEG) data that allows extended regions of activity, and can easily incorporate prior information such as anatomical constraints from MRI. To evaluate the feasibility and utility of the Bayesian approach with actual data, they analyzed MEG data from a visual evoked response experiment. They compared Bayesian analyses of MEG responses to visual stimuli in the left and right visual fields, in order to examine the sensitivity of the method to detect known features of human visual cortex organization. They also examined the changing pattern of cortical activation as a function of time.

  1. Bayesian Analysis of Perceived Eye Level

    Science.gov (United States)

    Orendorff, Elaine E.; Kalesinskas, Laurynas; Palumbo, Robert T.; Albert, Mark V.

    2016-01-01

    To accurately perceive the world, people must efficiently combine internal beliefs and external sensory cues. We introduce a Bayesian framework that explains the role of internal balance cues and visual stimuli on perceived eye level (PEL)—a self-reported measure of elevation angle. This framework provides a single, coherent model explaining a set of experimentally observed PEL over a range of experimental conditions. Further, it provides a parsimonious explanation for the additive effect of low fidelity cues as well as the averaging effect of high fidelity cues, as also found in other Bayesian cue combination psychophysical studies. Our model accurately estimates the PEL and explains the form of previous equations used in describing PEL behavior. Most importantly, the proposed Bayesian framework for PEL is more powerful than previous behavioral modeling; it permits behavioral estimation in a wider range of cue combination and perceptual studies than models previously reported. PMID:28018204
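    A toy sketch of the core Bayesian cue-combination computation: with Gaussian internal and visual cues, the posterior over elevation angle is Gaussian with a precision-weighted mean, so high-fidelity (low-variance) cues dominate. The numbers are illustrative, not fitted PEL parameters from the paper.

        import numpy as np

        cues = [(-2.0, 4.0),   # (mean deg, variance) of the internal balance cue
                (1.5, 1.0)]    # visual cue: pitched visual field, higher fidelity

        precision = np.array([1.0 / v for _, v in cues])
        means = np.array([m for m, _ in cues])
        post_mean = np.sum(precision * means) / precision.sum()
        post_var = 1.0 / precision.sum()
        print(f"perceived eye level ~ N({post_mean:.2f}, {post_var:.2f})")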

  2. A Bayesian Analysis of Spectral ARMA Model

    Directory of Open Access Journals (Sweden)

    Manoel I. Silvestre Bezerra

    2012-01-01

    Full Text Available Bezerra et al. (2008) proposed a new method, based on Yule-Walker equations, to estimate the ARMA spectral model. In this paper, a Bayesian approach is developed for this model by using the noninformative prior proposed by Jeffreys (1967). For the Bayesian computations, simulation via Markov chain Monte Carlo (MCMC) is carried out, and characteristics of marginal posterior distributions such as the Bayes estimator and confidence interval for the parameters of the ARMA model are derived. Both methods are also compared with the traditional least squares and maximum likelihood approaches, and a numerical illustration with two examples of the ARMA model is presented to evaluate the performance of the procedures.

  3. THz-SAR Vibrating Target Imaging via the Bayesian Method

    Directory of Open Access Journals (Sweden)

    Bin Deng

    2017-01-01

    Full Text Available Target vibration bears important information for target recognition, and terahertz, due to significant micro-Doppler effects, has strong advantages for remotely sensing vibrations. In this paper, the imaging characteristics of vibrating targets with THz-SAR are first analyzed. An improved algorithm based on an excellent Bayesian approach, that is, the expansion-compression variance-component (ExCoV) method, has been proposed for reconstructing scattering coefficients of vibrating targets, which provides more robust and efficient initialization and overcomes the deficiencies of sidelobes as well as artifacts arising from the traditional correlation method. A real vibration measurement experiment on idle cars was performed to validate the range model. Simulated SAR data of vibrating targets and a tank model in a real background at 220 GHz show good performance at low SNR. Rapidly evolving high-power terahertz devices will offer viable THz-SAR application at a distance of several kilometers.

  4. Bayesian methods for the design and analysis of noninferiority trials.

    Science.gov (United States)

    Gamalo-Siebers, Margaret; Gao, Aijun; Lakshminarayanan, Mani; Liu, Guanghan; Natanegara, Fanni; Railkar, Radha; Schmidli, Heinz; Song, Guochen

    2016-01-01

    The gold standard for evaluating treatment efficacy of a medical product is a placebo-controlled trial. However, when the use of placebo is considered to be unethical or impractical, a viable alternative for evaluating treatment efficacy is through a noninferiority (NI) study where a test treatment is compared to an active control treatment. The minimal objective of such a study is to determine whether the test treatment is superior to placebo. An assumption is made that if the active control treatment remains efficacious, as was observed when it was compared against placebo, then a test treatment that has comparable efficacy with the active control, within a certain range, must also be superior to placebo. Because of this assumption, the design, implementation, and analysis of NI trials present challenges for sponsors and regulators. In designing and analyzing NI trials, substantial historical data are often required on the active control treatment and placebo. Bayesian approaches provide a natural framework for synthesizing the historical data in the form of prior distributions that can effectively be used in design and analysis of a NI clinical trial. Despite a flurry of recent research activities in the area of Bayesian approaches in medical product development, there are still substantial gaps in recognition and acceptance of Bayesian approaches in NI trial design and analysis. The Bayesian Scientific Working Group of the Drug Information Association provides a coordinated effort to target the education and implementation issues on Bayesian approaches for NI trials. In this article, we provide a review of both frequentist and Bayesian approaches in NI trials, and elaborate on the implementation for two common Bayesian methods including hierarchical prior method and meta-analytic-predictive approach. Simulations are conducted to investigate the properties of the Bayesian methods, and some real clinical trial examples are presented for illustration.

  5. Bayesian analysis of Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2006-01-01

    Recently Møller, Pettitt, Berthelsen and Reeves introduced a new MCMC methodology for drawing samples from a posterior distribution when the likelihood function is only specified up to a normalising constant. We illustrate the method in the setting of Bayesian inference for Markov point processes...

  6. Adaptive bayesian analysis for binomial proportions

    CSIR Research Space (South Africa)

    Das, Sonali

    2008-10-01

    Full Text Available The authors consider the problem of statistical inference of binomial proportions for non-matched, correlated samples, under the Bayesian framework. Such inference can arise when the same group is observed at a different number of times with the aim...
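    A minimal conjugate sketch of one piece of this problem: Beta-Binomial posteriors for the proportion at each observation occasion, deliberately ignoring the correlation between occasions that the full method addresses. The counts are invented.

        import numpy as np
        from scipy.stats import beta

        successes = np.array([18, 25])     # toy counts at two observation occasions
        trials = np.array([40, 42])
        a0, b0 = 1.0, 1.0                  # uniform Beta(1, 1) prior

        rng = np.random.default_rng(8)
        draws = [beta.rvs(a0 + s, b0 + n - s, size=20000, random_state=rng)
                 for s, n in zip(successes, trials)]
        diff = draws[1] - draws[0]
        print("P(p2 > p1 | data) =", float((diff > 0).mean()))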

  7. Lossless predictive coding for images with Bayesian treatment.

    Science.gov (United States)

    Liu, Jing; Zhai, Guangtao; Yang, Xiaokang; Chen, Li

    2014-12-01

    Adaptive predictors have long been used for lossless predictive coding of images. Most existing lossless predictive coding techniques mainly focus on the suitability of the prediction model for the training set, under the underlying assumption of local consistency, which may not hold well on object boundaries and can cause large prediction errors. In this paper, we propose a novel approach based on the assumption that local consistency and patch redundancy exist simultaneously in natural images. We derive a family of linear models and design a new algorithm to automatically select one suitable model for prediction. From the Bayesian perspective, the model with maximum posterior probability is considered the best. Two types of model evidence are included in our algorithm. One is traditional training evidence, which represents a model's suitability for the current pixel under the assumption of local consistency. The other is target evidence, which is proposed to express the preference for different models from the perspective of patch redundancy. It is shown that the fusion of training evidence and target evidence jointly exploits the benefits of local consistency and patch redundancy. As a result, our proposed predictor is more suitable for natural images with textures and object boundaries. Comprehensive experiments demonstrate that the proposed predictor achieves higher efficiency compared with state-of-the-art lossless predictors.
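    A sketch of "training evidence" under stated assumptions (the paper's exact predictor family and its target evidence are not reproduced): the marginal likelihood of a linear-Gaussian predictor, p(y | X) = N(y; 0, sigma2*I + alpha*X X^T), used to rank candidate prediction contexts.

        import numpy as np

        def log_evidence(X, y, alpha=1.0, sigma2=0.1):
            """Log marginal likelihood of a Bayesian linear model with Gaussian prior."""
            n = len(y)
            C = sigma2 * np.eye(n) + alpha * X @ X.T
            sign, logdet = np.linalg.slogdet(C)
            return -0.5 * (n * np.log(2 * np.pi) + logdet + y @ np.linalg.solve(C, y))

        rng = np.random.default_rng(9)
        # two candidate contexts (sets of neighboring pixels) predicting the same target
        X_a = rng.standard_normal((50, 4))
        w = rng.standard_normal(4)
        y = X_a @ w + rng.normal(0, 0.3, 50)          # y really depends on context A
        X_b = rng.standard_normal((50, 4))            # context B is irrelevant

        for name, X in (("context A", X_a), ("context B", X_b)):
            print(name, "log evidence:", round(log_evidence(X, y), 1))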

  8. A bayesian approach to deformed pattern matching of iris images.

    Science.gov (United States)

    Thornton, Jason; Savvides, Marios; Vijaya Kumar, B V K

    2007-04-01

    We describe a general probabilistic framework for matching patterns that experience in-plane nonlinear deformations, such as iris patterns. Given a pair of images, we derive a maximum a posteriori probability (MAP) estimate of the parameters of the relative deformation between them. Our estimation process accomplishes two things simultaneously: It normalizes for pattern warping and it returns a distortion-tolerant similarity metric which can be used for matching two nonlinearly deformed image patterns. The prior probability of the deformation parameters is specific to the pattern-type and, therefore, should result in more accurate matching than an arbitrary general distribution. We show that the proposed method is very well suited for handling iris biometrics, applying it to two databases of iris images which contain real instances of warped patterns. We demonstrate a significant improvement in matching accuracy using the proposed deformed Bayesian matching methodology. We also show that the additional computation required to estimate the deformation is relatively inexpensive, making it suitable for real-time applications.

  9. PAC-Bayesian Analysis of Martingales and Multiarmed Bandits

    CERN Document Server

    Seldin, Yevgeny; Shawe-Taylor, John; Peters, Jan; Auer, Peter

    2011-01-01

    We present two alternative ways to apply PAC-Bayesian analysis to sequences of dependent random variables. The first is based on a new lemma that makes it possible to bound expectations of convex functions of certain dependent random variables by expectations of the same functions of independent Bernoulli random variables. This lemma provides an alternative to the Hoeffding-Azuma inequality for bounding the concentration of martingale values. Our second approach is based on integrating the Hoeffding-Azuma inequality with PAC-Bayesian analysis. We also introduce a way to apply PAC-Bayesian analysis in situations of limited feedback. We combine the new tools to derive PAC-Bayesian generalization and regret bounds for the multiarmed bandit problem. Although our regret bound is not yet as tight as state-of-the-art regret bounds based on other well-established techniques, our results significantly expand the range of potential applications of PAC-Bayesian analysis and introduce a new analysis tool to reinforcement learning and many ...

  10. Analysis of KATRIN data using Bayesian inference

    DEFF Research Database (Denmark)

    Riis, Anna Sejersen; Hannestad, Steen; Weinheimer, Christian

    2011-01-01

    The KATRIN (KArlsruhe TRItium Neutrino) experiment will be analyzing the tritium beta-spectrum to determine the mass of the neutrino with a sensitivity of 0.2 eV (90% C.L.). This approach to a measurement of the absolute value of the neutrino mass relies only on the principle of energy conservati... the KATRIN chi-squared function in the COSMOMC package (an MCMC code using Bayesian parameter inference) solved the task at hand very nicely.

  11. MATHEMATICAL RISK ANALYSIS: VIA NICHOLAS RISK MODEL AND BAYESIAN ANALYSIS

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-07-01

    Full Text Available The objective of this second part of a two-phased study was to explore the predictive power of the quantitative risk analysis (QRA) method and process within a Higher Education Institution (HEI). The method and process investigated impact analysis via the Nicholas risk model and Bayesian analysis, with a sample of one hundred (100) risk analysts in a historically black South African university in the greater Eastern Cape Province. The first finding supported and confirmed previous literature (King III report, 2009; Nicholas and Steyn, 2008; Stoney, 2007; COSA, 2004) that there was a direct relationship between a risk factor, its likelihood, and its impact, ceteris paribus. The second finding, in relation to controlling either the likelihood or the impact of occurrence of risk (Nicholas risk model), was that to obtain a better risk reward it was more important to control the likelihood of occurrence of risks than their impact, so as to have a direct effect on the entire university. The third finding, from the Bayesian analysis, was that the impact of risk should be predicted along three aspects: the human impact (decisions made), the property impact (student- and infrastructure-based), and the business impact. Lastly, although business cycles vary considerably depending on the industry and/or the institution, the study revealed that most impacts in the HEI occurred within the period of one academic year. The recommendation was that the application of quantitative risk analysis should be related to the current legislative framework that affects HEIs.

  12. The Relevance Voxel Machine (RVoxM): A Bayesian Method for Image-Based Prediction

    DEFF Research Database (Denmark)

    Sabuncu, Mert R.; Van Leemput, Koen

    2011-01-01

    This paper presents the Relevance Voxel Machine (RVoxM), a Bayesian multivariate pattern analysis (MVPA) algorithm that is specifically designed for making predictions based on image data. In contrast to generic MVPA algorithms that have often been used for this purpose, the method is designed to utilize a small number of spatially clustered sets of voxels that are particularly suited for clinical interpretation. RVoxM automatically tunes all its free parameters during the training phase, and offers the additional advantage of producing probabilistic prediction outcomes. Experiments on age prediction from structural brain MRI indicate that RVoxM yields biologically meaningful models that provide excellent predictive accuracy.

  13. Elite Athletes Refine Their Internal Clocks: A Bayesian Analysis.

    Science.gov (United States)

    Chen, Yin-Hua; Verdinelli, Isabella; Cesari, Paola

    2016-07-01

    This paper carries out a full Bayesian analysis of a data set examined in Chen & Cesari (2015). These data were collected to assess people's ability to evaluate short intervals of time. Chen & Cesari (2015) showed evidence of the existence of two independent internal clocks for evaluating time intervals below and above one second. We reexamine the same question here by performing a complete Bayesian statistical analysis of the data. The Bayesian approach can be used to analyze these data thanks to the specific trial design. Data were obtained from evaluations of time intervals by two groups of individuals. More specifically, information gathered from a nontrained group (considered as a baseline) allowed us to build a prior distribution for the parameter(s) of interest, while data from the trained group determined the likelihood function. This paper's main goals are (i) showing how the Bayesian inferential method can be used in statistical analyses and (ii) showing that the Bayesian methodology gives additional support to the findings presented in Chen & Cesari (2015) regarding the existence of two internal clocks for assessing the duration of time intervals.
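
    In its simplest form, the prior/likelihood split described above amounts to a conjugate normal update; the sketch below uses invented timing-error data and assumes a known observation variance, which is a simplification of the paper's full analysis.

    ```python
    import numpy as np

    # Hypothetical timing errors (seconds) for the two groups.
    baseline = np.random.default_rng(0).normal(0.10, 0.05, 30)   # nontrained group
    trained = np.random.default_rng(1).normal(0.04, 0.05, 30)    # trained group

    # Prior on the mean error, built from the baseline group.
    mu0 = baseline.mean()
    tau0 = baseline.std(ddof=1) / np.sqrt(len(baseline))
    sigma = 0.05                          # assumed known observation sd

    # Conjugate posterior given the trained group's data (the likelihood).
    n, ybar = len(trained), trained.mean()
    post_prec = 1 / tau0**2 + n / sigma**2
    post_mean = (mu0 / tau0**2 + n * ybar / sigma**2) / post_prec
    print(post_mean, np.sqrt(1 / post_prec))
    ```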

  14. Bayesian cost-effectiveness analysis with the R package BCEA

    CERN Document Server

    Baio, Gianluca; Heath, Anna

    2017-01-01

    The book provides a description of the process of health economic evaluation and modelling for cost-effectiveness analysis, particularly from the perspective of a Bayesian statistical approach. Some relevant theory and introductory concepts are presented using practical examples and two running case studies. The book also describes in detail how to perform health economic evaluations using the R package BCEA (Bayesian Cost-Effectiveness Analysis). BCEA can be used to post-process the results of a Bayesian cost-effectiveness model and perform advanced analyses producing standardised and highly customisable outputs. It presents all the features of the package, including its many functions and their practical application, as well as its user-friendly web interface. The book is a valuable resource for statisticians and practitioners working in the field of health economics wanting to simplify and standardise their workflow, for example in the preparation of dossiers in support of marketing authorisation, or acade...

  15. Analysis of Gumbel Model for Software Reliability Using Bayesian Paradigm

    Directory of Open Access Journals (Sweden)

    Raj Kumar

    2012-12-01

    Full Text Available In this paper, we illustrate the suitability of the Gumbel model for software reliability data. The model parameters are estimated using likelihood-based inferential procedures, both classical and Bayesian. The quasi-Newton-Raphson algorithm is applied to obtain the maximum likelihood estimates and associated probability intervals. The Bayesian estimates of the parameters of the Gumbel model are obtained using the Markov Chain Monte Carlo (MCMC) simulation method in OpenBUGS (established software for Bayesian analysis using MCMC methods). R functions are developed to study the statistical properties of the model, to provide model validation and comparison tools, and to analyze the output of the MCMC samples generated from OpenBUGS. Details of applying MCMC to parameter estimation for the Gumbel model are elaborated, and a real software reliability data set is considered to illustrate the methods of inference discussed in this paper.
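
    For illustration, a random-walk Metropolis sampler for the two Gumbel parameters can be written in a few lines; this is a generic sketch on synthetic data, not the paper's OpenBUGS model, and the priors below are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    t = rng.gumbel(loc=100.0, scale=15.0, size=50)   # synthetic "reliability" data

    def log_post(mu, beta):
        if beta <= 0:
            return -np.inf
        z = (t - mu) / beta
        # Gumbel log-likelihood, plus a flat prior on mu and a 1/beta prior on beta.
        return np.sum(-z - np.exp(-z)) - len(t) * np.log(beta) - np.log(beta)

    mu, beta, chain = 90.0, 10.0, []
    for _ in range(20000):
        mu_p, beta_p = mu + rng.normal(0, 2.0), beta + rng.normal(0, 1.0)
        if np.log(rng.uniform()) < log_post(mu_p, beta_p) - log_post(mu, beta):
            mu, beta = mu_p, beta_p
        chain.append((mu, beta))
    print(np.mean(chain[5000:], axis=0))   # posterior means after burn-in
    ```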

  16. Image-based computer-aided prognosis of lung cancer: predicting patient recurrent-free survival via a variational Bayesian mixture modeling framework for cluster analysis of CT histograms

    Science.gov (United States)

    Kawata, Y.; Niki, N.; Ohamatsu, H.; Kusumoto, M.; Tsuchida, T.; Eguchi, K.; Kaneko, M.; Moriyama, N.

    2012-03-01

    In this paper, we present a computer-aided prognosis (CAP) scheme that utilizes quantitatively derived image information to predict patient recurrent-free survival for lung cancers. Our scheme involves analyzing CT histograms to evaluate the volumetric distribution of CT values within pulmonary nodules. A variational Bayesian mixture modeling framework translates the image-derived features into an image-based risk score for predicting the patient recurrence-free survival. Using our dataset of 454 patients with NSCLC, we demonstrate the potential usefulness of the CAP scheme which can provide a quantitative risk score that is strongly correlated with prognostic factors.
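
    As an accessible stand-in for the variational Bayesian mixture step described above, scikit-learn's BayesianGaussianMixture can cluster a CT-value histogram; the data below are synthetic Hounsfield-unit stand-ins, and the paper's actual framework and features are not reproduced.

    ```python
    import numpy as np
    from sklearn.mixture import BayesianGaussianMixture

    rng = np.random.default_rng(0)
    ct_values = np.concatenate([rng.normal(-700, 60, 800),    # air-like voxels (HU)
                                rng.normal(-100, 80, 500)])   # soft-tissue-like voxels

    # Variational Bayes prunes unneeded components via the weight concentration prior.
    vb = BayesianGaussianMixture(n_components=5, weight_concentration_prior=0.1,
                                 random_state=0).fit(ct_values.reshape(-1, 1))
    print(vb.weights_.round(3))   # components with non-negligible weight remain
    ```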

  17. On Bayesian analysis of on–off measurements

    Energy Technology Data Exchange (ETDEWEB)

    Nosek, Dalibor, E-mail: nosek@ipnp.troja.mff.cuni.cz [Charles University, Faculty of Mathematics and Physics, Prague (Czech Republic); Nosková, Jana [Czech Technical University, Faculty of Civil Engineering, Prague (Czech Republic)

    2016-06-01

    We propose an analytical solution to the on–off problem within the framework of Bayesian statistics. Both the statistical significance for the discovery of new phenomena and credible intervals on model parameters are presented in a consistent way. We use a large enough family of prior distributions of relevant parameters. The proposed analysis is designed to provide Bayesian solutions that can be used for any number of observed on–off events, including zero. The procedure is checked using Monte Carlo simulations. The usefulness of the method is demonstrated on examples from γ-ray astronomy.
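
    A toy numerical version of the on-off setup can make the inference concrete; the counts, exposure ratio alpha, and gamma priors below are invented, and this grid-based marginalization is only a stand-in for the analytical solutions the paper derives.

    ```python
    import numpy as np
    from scipy import stats

    n_on, n_off, alpha = 9, 20, 0.25        # hypothetical counts and on/off exposure ratio
    s = np.linspace(0, 20, 400)             # signal rate grid
    b = np.linspace(0.01, 60, 600)          # background rate grid
    S, B = np.meshgrid(s, b, indexing="ij")

    # N_on ~ Poisson(s + alpha*b), N_off ~ Poisson(b), weak gamma priors on s and b.
    log_post = (stats.poisson.logpmf(n_on, S + alpha * B)
                + stats.poisson.logpmf(n_off, B)
                + stats.gamma.logpdf(S, a=1.0, scale=10.0)
                + stats.gamma.logpdf(B, a=1.0, scale=30.0))
    post = np.exp(log_post - log_post.max())
    marg_s = post.sum(axis=1)               # marginalize the background on the grid
    marg_s /= np.trapz(marg_s, s)
    print("posterior mean of s:", np.trapz(s * marg_s, s))
    ```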

  18. On Bayesian analysis of on-off measurements

    CERN Document Server

    Nosek, Dalibor

    2016-01-01

    We propose an analytical solution to the on-off problem within the framework of Bayesian statistics. Both the statistical significance for the discovery of new phenomena and credible intervals on model parameters are presented in a consistent way. We use a large enough family of prior distributions of relevant parameters. The proposed analysis is designed to provide Bayesian solutions that can be used for any number of observed on-off events, including zero. The procedure is checked using Monte Carlo simulations. The usefulness of the method is demonstrated on examples from gamma-ray astronomy.

  19. Bayesian analysis in moment inequality models

    CERN Document Server

    Liao, Yuan; 10.1214/09-AOS714

    2010-01-01

    This paper presents a study of the large-sample behavior of the posterior distribution of a structural parameter which is partially identified by moment inequalities. The posterior density is derived based on the limited information likelihood. The posterior distribution converges to zero exponentially fast on any $\delta$-contraction outside the identified region. Inside, it is bounded below by a positive constant if the identified region is assumed to have a nonempty interior. Our simulation evidence indicates that the Bayesian approach has advantages over frequentist methods, in the sense that, with a proper choice of the prior, the posterior provides more information about the true parameter inside the identified region. We also address the problem of moment and model selection. Our optimality criterion is the maximum posterior procedure and we show that, asymptotically, it selects the true moment/model combination with the most moment inequalities and the simplest model.

  20. Semiparametric bayesian analysis of gene-environment interactions

    OpenAIRE

    Lobach, I.

    2010-01-01

    A key component to prevention and control of complex diseases, such as cancer, diabetes, hypertension, is to analyze the genetic and environmental factors that lead to the development of these complex diseases. We propose a Bayesian approach for analysis of gene-environment interactions that efficiently models information available in the observed data and a priori biomedical knowledge.

  1. Stochastic back analysis of permeability coefficient using generalized Bayesian method

    Directory of Open Access Journals (Sweden)

    Gui-lan ZHENG

    2008-09-01

    Full Text Available Because the conventional deterministic back analysis of the permeability coefficient cannot reflect the uncertainties of parameters, including the hydraulic head at the boundary, the permeability coefficient, and the measured hydraulic head, a stochastic back analysis taking these parameter uncertainties into consideration was performed using the generalized Bayesian method. Based on the stochastic finite element method (SFEM) for a seepage field, the variable metric algorithm, and the generalized Bayesian method, formulas for stochastic back analysis of the permeability coefficient were derived. A case study of seepage analysis of a sluice foundation was performed to illustrate the proposed method. The results indicate that, with the generalized Bayesian method, which considers the uncertainties of the measured hydraulic head, the permeability coefficient, and the hydraulic head at the boundary, both the mean and the standard deviation of the permeability coefficient can be obtained, and the standard deviation is smaller than that obtained by the conventional Bayesian method. Therefore, the present method is valid and applicable.

  2. A Comparison of Imputation Methods for Bayesian Factor Analysis Models

    Science.gov (United States)

    Merkle, Edgar C.

    2011-01-01

    Imputation methods are popular for the handling of missing data in psychology. The methods generally consist of predicting missing data based on observed data, yielding a complete data set that is amenable to standard statistical analyses. In the context of Bayesian factor analysis, this article compares imputation under an unrestricted…

  3. A Bayesian Predictive Discriminant Analysis with Screened Data

    OpenAIRE

    Hea-Jung Kim

    2015-01-01

    In the application of discriminant analysis, a situation sometimes arises where individual measurements are screened by a multidimensional screening scheme. For this situation, a discriminant analysis with screened populations is considered from a Bayesian viewpoint, and an optimal predictive rule for the analysis is proposed. In order to establish a flexible method to incorporate the prior information of the screening mechanism, we propose a hierarchical screened scale mixture of normal (HSS...

  4. A Bayesian model for predicting face recognition performance using image quality

    NARCIS (Netherlands)

    Dutta, A.; Veldhuis, Raymond N.J.; Spreeuwers, Lieuwe Jan

    2014-01-01

    Quality of a pair of facial images is a strong indicator of the uncertainty in a decision about identity based on that image pair. In this paper, we describe a Bayesian approach to model the relation between image quality (e.g., pose, illumination, noise, sharpness) and the corresponding face recognition performance.

  5. Bayesian Spectral Analysis of Metal Abundance Deficient Stars

    CERN Document Server

    Sourlas, E; Kashyap, V L; Drake, J; Pease, D; Sourlas, Epaminondas; Dyk, David van; Kashyap, Vinay; Drake, Jeremy; Pease, Deron

    2002-01-01

    Metallicity can be measured by analyzing the spectra in the X-ray region and comparing the flux in spectral lines to the flux in the underlying Bremsstrahlung continuum. In this paper we propose new Bayesian methods which directly model the Poisson nature of the data and thus are expected to exhibit improved sampling properties. Our model also accounts for the Poisson nature of background contamination of the observations, image blurring due to instrument response, and the absorption of photons in space. The resulting highly structured hierarchical model is fit using the Gibbs sampler, data augmentation, and Metropolis-Hastings. We demonstrate our methods with the X-ray spectral analysis of several "Metal Abundance Deficient" stars. The model is designed to summarize the relative frequency of the energy of photons (X-ray or gamma-ray) arriving at a detector. Independent Poisson distributions are more appropriate to model the counts than the commonly used normal approximation. We model the high energy tail of th...

  6. A new Bayesian Earthquake Analysis Tool (BEAT)

    Science.gov (United States)

    Vasyura-Bathke, Hannes; Dutta, Rishabh; Jónsson, Sigurjón; Mai, Martin

    2017-04-01

    Modern earthquake source estimation studies increasingly use non-linear optimization strategies to estimate kinematic rupture parameters, often considering geodetic and seismic data jointly. However, the optimization process is complex and consists of several steps that need to be followed in the earthquake parameter estimation procedure. These include pre-describing or modeling the fault geometry, calculating the Green's functions (often assuming a layered elastic half-space), and estimating the distributed final slip and possibly other kinematic source parameters. Recently, Bayesian inference has become popular for estimating posterior distributions of earthquake source model parameters given measured/estimated/assumed data and model uncertainties. For instance, some research groups consider uncertainties of the layered medium and propagate these to the source parameter uncertainties. Other groups make use of informative priors to reduce the model parameter space. In addition, innovative sampling algorithms have been developed that efficiently explore the often high-dimensional parameter spaces. Compared to earlier studies, these improvements have resulted in overall more robust source model parameter estimates that include uncertainties. However, the computational demands of these methods are high and estimation codes are rarely distributed along with the published results. Even if codes are made available, it is often difficult to assemble them into a single optimization framework as they are typically coded in different programming languages. Therefore, further progress and future applications of these methods/codes are hampered, while reproducibility and validation of results has become essentially impossible. In the spirit of providing open-access and modular codes to facilitate progress and reproducible research in earthquake source estimation, we undertook the effort of producing BEAT, a Python package that comprises all the above-mentioned features in one framework.

  7. Imaging anisotropic layering with Bayesian inversion of multiple data types

    Science.gov (United States)

    Bodin, T.; Leiva, J.; Romanowicz, B.; Maupin, V.; Yuan, H.

    2016-07-01

    Azimuthal anisotropy is a powerful tool to reveal information about both the present structure and past evolution of the mantle. Anisotropic images of the upper mantle are usually obtained by analysing various types of seismic observables, such as surface wave dispersion curves or waveforms, SKS splitting data, or receiver functions. These different data types sample different volumes of the earth, they are sensitive to different length scales, and hence are associated with different levels of uncertainties. They are traditionally interpreted separately, and often result in incompatible models. We present a Bayesian inversion approach to jointly invert these different data types. Seismograms for SKS and P phases are directly inverted using a cross-convolution approach, thus avoiding intermediate processing steps, such as numerical deconvolution or computation of splitting parameters. Probabilistic 1-D profiles are obtained with a transdimensional Markov chain Monte Carlo scheme, in which the number of layers, as well as the presence or absence of anisotropy in each layer, are treated as unknown parameters. In this way, seismic anisotropy is only introduced if required by the data. The algorithm is used to resolve both isotropic and anisotropic layering down to a depth of 350 km beneath two seismic stations in North America in two different tectonic settings: the stable Canadian shield (station FFC) and the tectonically active southern Basin and Range Province (station TA-214A). In both cases, the lithosphere-asthenosphere boundary is clearly visible, and marked by a change in direction of the fast axis of anisotropy. Our study confirms that azimuthal anisotropy is a powerful tool for detecting layering in the upper mantle.

  8. Multiple quantitative trait analysis using bayesian networks.

    Science.gov (United States)

    Scutari, Marco; Howell, Phil; Balding, David J; Mackay, Ian

    2014-09-01

    Models for genome-wide prediction and association studies usually target a single phenotypic trait. However, in animal and plant genetics it is common to record information on multiple phenotypes for each individual that will be genotyped. Modeling traits individually disregards the fact that they are most likely associated due to pleiotropy and shared biological basis, thus providing only a partial, confounded view of genetic effects and phenotypic interactions. In this article we use data from a Multiparent Advanced Generation Inter-Cross (MAGIC) winter wheat population to explore Bayesian networks as a convenient and interpretable framework for the simultaneous modeling of multiple quantitative traits. We show that they are equivalent to multivariate genetic best linear unbiased prediction (GBLUP) and that they are competitive with single-trait elastic net and single-trait GBLUP in predictive performance. Finally, we discuss their relationship with other additive-effects models and their advantages in inference and interpretation. MAGIC populations provide an ideal setting for this kind of investigation because the very low population structure and large sample size result in predictive models with good power and limited confounding due to relatedness.

  9. Fully Bayesian inference for structural MRI: application to segmentation and statistical analysis of T2-hypointensities.

    Science.gov (United States)

    Schmidt, Paul; Schmid, Volker J; Gaser, Christian; Buck, Dorothea; Bührlen, Susanne; Förschler, Annette; Mühlau, Mark

    2013-01-01

    Aiming at iron-related T2-hypointensity, which is related to normal aging and neurodegenerative processes, we here present two practicable approaches, based on Bayesian inference, for the preprocessing and statistical analysis of a complex set of structural MRI data. In particular, Markov Chain Monte Carlo methods were used to simulate posterior distributions. First, we present a segmentation algorithm that uses outlier detection based on model-checking techniques within a Bayesian mixture model. Second, we present an analytical tool comprising a Bayesian regression model with smoothness priors (in the form of Gaussian Markov random fields), mitigating the necessity to smooth data prior to statistical analysis. For validation, we used simulated data and MRI data of 27 healthy controls. We first observed robust segmentation of both simulated T2-hypointensities and gray-matter regions known to be T2-hypointense. Second, simulated data and images of segmented T2-hypointensity were analyzed. We found not only robust identification of simulated effects but also a biologically plausible age-related increase of T2-hypointensity, primarily within the dentate nucleus but also within the globus pallidus, substantia nigra, and red nucleus. Our results indicate that fully Bayesian inference can successfully be applied for the preprocessing and statistical analysis of structural MRI data.
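
    The smoothness prior mentioned above, in its simplest intrinsic Gaussian Markov random field form for a 2-D field, penalizes squared differences between 4-connected neighbours; the sketch below is generic, not the paper's exact prior.

    ```python
    import numpy as np

    def gmrf_logprior(x, lam=1.0):
        """Unnormalized log-density of an intrinsic first-order GMRF on a 2-D field x."""
        dh = np.diff(x, axis=0)   # vertical neighbour differences
        dv = np.diff(x, axis=1)   # horizontal neighbour differences
        return -0.5 * lam * (np.sum(dh**2) + np.sum(dv**2))

    # A constant field scores higher (0) than a ramp (negative): smoothness is favoured.
    print(gmrf_logprior(np.zeros((4, 4))), gmrf_logprior(np.arange(16.0).reshape(4, 4)))
    ```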

  10. A Bayesian Predictive Discriminant Analysis with Screened Data

    Directory of Open Access Journals (Sweden)

    Hea-Jung Kim

    2015-09-01

    Full Text Available In the application of discriminant analysis, a situation sometimes arises where individual measurements are screened by a multidimensional screening scheme. For this situation, a discriminant analysis with screened populations is considered from a Bayesian viewpoint, and an optimal predictive rule for the analysis is proposed. In order to establish a flexible method to incorporate the prior information of the screening mechanism, we propose a hierarchical screened scale mixture of normal (HSSMN) model, which makes provision for flexible modeling of the screened observations. A Markov chain Monte Carlo (MCMC) method using the Gibbs sampler, with the Metropolis–Hastings algorithm within the Gibbs sampler, is used to perform Bayesian inference on the HSSMN models and to approximate the optimal predictive rule. A simulation study is given to demonstrate the performance of the proposed predictive discrimination procedure.
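
    The "Metropolis–Hastings within Gibbs" pattern mentioned above can be illustrated with a runnable toy: a normal model in which the mean has a conjugate Gibbs draw while the log standard deviation takes an MH step. This is a generic sketch, not the HSSMN full conditionals.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    y = rng.normal(2.0, 1.5, size=100)
    n, mu, log_sig = len(y), 0.0, 0.0

    def loglik_sigma(log_s, mu):
        s = np.exp(log_s)
        return -n * np.log(s) - np.sum((y - mu) ** 2) / (2 * s**2)

    draws = []
    for _ in range(10000):
        s2 = np.exp(2 * log_sig)
        mu = rng.normal(y.mean(), np.sqrt(s2 / n))   # Gibbs: conjugate under a flat prior
        prop = log_sig + rng.normal(0, 0.2)          # MH step on log(sigma), flat log-scale prior
        if np.log(rng.uniform()) < loglik_sigma(prop, mu) - loglik_sigma(log_sig, mu):
            log_sig = prop
        draws.append((mu, np.exp(log_sig)))
    print(np.mean(draws[2000:], axis=0))             # posterior means after burn-in
    ```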

  11. Bayesian Methods for Analysis and Adaptive Scheduling of Exoplanet Observations

    CERN Document Server

    Loredo, Thomas J; Chernoff, David F; Clyde, Merlise A; Liu, Bin

    2011-01-01

    We describe work in progress by a collaboration of astronomers and statisticians developing a suite of Bayesian data analysis tools for extrasolar planet (exoplanet) detection, planetary orbit estimation, and adaptive scheduling of observations. Our work addresses analysis of stellar reflex motion data, where a planet is detected by observing the "wobble" of its host star as it responds to the gravitational tug of the orbiting planet. Newtonian mechanics specifies an analytical model for the resulting time series, but it is strongly nonlinear, yielding complex, multimodal likelihood functions; it is even more complex when multiple planets are present. The parameter spaces range in size from few-dimensional to dozens of dimensions, depending on the number of planets in the system, and the type of motion measured (line-of-sight velocity, or position on the sky). Since orbits are periodic, Bayesian generalizations of periodogram methods facilitate the analysis. This relies on the model being linearly separable, ...

  12. Operational modal analysis modeling, Bayesian inference, uncertainty laws

    CERN Document Server

    Au, Siu-Kui

    2017-01-01

    This book presents operational modal analysis (OMA), employing a coherent and comprehensive Bayesian framework for modal identification and covering stochastic modeling, theoretical formulations, computational algorithms, and practical applications. Mathematical similarities and philosophical differences between Bayesian and classical statistical approaches to system identification are discussed, allowing their mathematical tools to be shared and their results correctly interpreted. Many chapters can be used as lecture notes for the general topic they cover beyond the OMA context. After an introductory chapter (1), Chapters 2–7 present the general theory of stochastic modeling and analysis of ambient vibrations. Readers are first introduced to the spectral analysis of deterministic time series (2) and structural dynamics (3), which do not require the use of probability concepts. The concepts and techniques in these chapters are subsequently extended to a probabilistic context in Chapter 4 (on stochastic pro...

  13. Image Analysis

    DEFF Research Database (Denmark)

    The 19th Scandinavian Conference on Image Analysis was held at the IT University of Copenhagen in Denmark during June 15-17, 2015. The SCIA conference series has been an ongoing biannual event for more than 30 years and over the years it has nurtured a world-class regional research and development... The topics of the accepted papers range from novel applications of vision systems, pattern recognition, machine learning, feature extraction, segmentation, 3D vision, to medical and biomedical image analysis. The papers originate from all the Scandinavian countries and several other European countries...

  14. Fast SAR Image Change Detection Using Bayesian Approach Based Difference Image and Modified Statistical Region Merging

    Directory of Open Access Journals (Sweden)

    Han Zhang

    2014-01-01

    Full Text Available A novel fast SAR image change detection method is presented in this paper. Based on a Bayesian approach, the prior information that speckle follows the Nakagami distribution is incorporated into the difference image (DI) generation process. The new DI performs much better than the familiar log-ratio (LR) DI as well as the cumulant-based Kullback-Leibler divergence (CKLD) DI. The statistical region merging (SRM) approach is first introduced to the change detection context. A new clustering procedure, with the region variance as the statistical inference variable, is presented and tailored to SAR image change detection, with only two classes in the final map: the unchanged and changed classes. The most prominent advantages of the proposed modified SRM (MSRM) method are its ability to cope with noise corruption and its quick implementation. Experimental results show that the proposed method is superior in both change detection accuracy and operational efficiency.
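
    For reference, the familiar log-ratio DI that the proposed DI is compared against is straightforward to compute; the arrays and the global threshold below are synthetic stand-ins, not the MSRM pipeline.

    ```python
    import numpy as np

    # Gamma-distributed stand-ins for two co-registered, speckled SAR intensity images.
    img1 = np.random.default_rng(0).gamma(4.0, 25.0, (256, 256))
    img2 = np.random.default_rng(1).gamma(4.0, 25.0, (256, 256))

    # The log ratio turns multiplicative speckle into an additive disturbance.
    lr_di = np.abs(np.log((img2 + 1e-6) / (img1 + 1e-6)))

    # Naive global threshold, for illustration only; real methods classify the DI.
    changed = lr_di > lr_di.mean() + 2 * lr_di.std()
    ```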

  15. Bayesian phylogeny analysis via stochastic approximation Monte Carlo

    KAUST Repository

    Cheon, Sooyoung

    2009-11-01

    Monte Carlo methods have received much attention in the recent literature on phylogeny analysis. However, conventional Markov chain Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, tend to get trapped in a local mode when simulating from the posterior distribution of phylogenetic trees, rendering the inference ineffective. In this paper, we apply an advanced Monte Carlo algorithm, the stochastic approximation Monte Carlo (SAMC) algorithm, to Bayesian phylogeny analysis. Our method is compared with two popular Bayesian phylogeny software packages, BAMBE and MrBayes, on simulated and real datasets. The numerical results indicate that our method outperforms BAMBE and MrBayes. Among the three methods, SAMC produces the consensus trees with the highest similarity to the true trees and the model parameter estimates with the smallest mean square errors, yet costs the least CPU time. © 2009 Elsevier Inc. All rights reserved.

  16. Bayesian phylogeny analysis via stochastic approximation Monte Carlo.

    Science.gov (United States)

    Cheon, Sooyoung; Liang, Faming

    2009-11-01

    Monte Carlo methods have received much attention in the recent literature on phylogeny analysis. However, conventional Markov chain Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, tend to get trapped in a local mode when simulating from the posterior distribution of phylogenetic trees, rendering the inference ineffective. In this paper, we apply an advanced Monte Carlo algorithm, the stochastic approximation Monte Carlo (SAMC) algorithm, to Bayesian phylogeny analysis. Our method is compared with two popular Bayesian phylogeny software packages, BAMBE and MrBayes, on simulated and real datasets. The numerical results indicate that our method outperforms BAMBE and MrBayes. Among the three methods, SAMC produces the consensus trees with the highest similarity to the true trees and the model parameter estimates with the smallest mean square errors, yet costs the least CPU time.

  17. Bayesian Variable Selection in Cost-Effectiveness Analysis

    Directory of Open Access Journals (Sweden)

    Miguel A. Negrín

    2010-04-01

    Full Text Available Linear regression models are often used to represent the cost and effectiveness of medical treatment. The covariates used may include sociodemographic variables, such as age, gender or race; clinical variables, such as initial health status, years of treatment or the existence of concomitant illnesses; and a binary variable indicating the treatment received. However, most studies estimate only one model, which usually includes all the covariates. This procedure ignores the question of uncertainty in model selection. In this paper, we examine four alternative Bayesian variable selection methods that have been proposed. In this analysis, we estimate the inclusion probability of each covariate in the real model conditional on the data. Variable selection can be useful for estimating incremental effectiveness and incremental cost, through Bayesian model averaging, as well as for subgroup analysis.
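
    A minimal all-subsets sketch shows how posterior inclusion probabilities of the kind estimated above can be computed; here model probabilities are approximated with BIC weights on synthetic data, which is only a crude stand-in for the fully Bayesian selection methods the paper compares.

    ```python
    import itertools
    import numpy as np

    rng = np.random.default_rng(0)
    n, names = 200, ["age", "treatment", "comorbidity"]   # hypothetical covariates
    X = rng.normal(size=(n, 3))
    y = 2.0 + 1.5 * X[:, 1] + rng.normal(size=n)          # synthetic "cost" outcome

    def bic(cols):
        Xd = np.column_stack([np.ones(n)] + [X[:, j] for j in cols])
        beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
        rss = np.sum((y - Xd @ beta) ** 2)
        return n * np.log(rss / n) + Xd.shape[1] * np.log(n)

    models = [c for r in range(4) for c in itertools.combinations(range(3), r)]
    b = np.array([bic(m) for m in models])
    w = np.exp(-0.5 * (b - b.min()))
    w /= w.sum()                                          # approximate model probabilities
    incl = [sum(w[i] for i, m in enumerate(models) if j in m) for j in range(3)]
    print(dict(zip(names, np.round(incl, 3))))            # posterior inclusion probabilities
    ```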

  18. Bayesian imperfect information analysis for clinical recurrent data

    OpenAIRE

    Chang CK; Chang CC

    2014-01-01

    Chih-Kuang Chang,1 Chi-Chang Chang2 1Department of Cardiology, Jen-Ai Hospital, Dali District, Taichung, Taiwan; 2School of Medical Informatics, Chung Shan Medical University, Information Technology Office of Chung Shan Medical University Hospital, Taichung, Taiwan. Abstract: In medical research, clinical practice must often be undertaken with imperfect information from limited resources. This study applied Bayesian imperfect information-value analysis to realistic situations to produce likelih...

  19. Introduction to the Restoration of Astrophysical Images by Multiscale Transforms and Bayesian Methods

    Science.gov (United States)

    Bijaoui, A.

    2013-03-01

    Image restoration is today an important part of astrophysical data analysis. Denoising and deblurring can be performed efficiently using multiscale transforms, for which multiresolution analysis constitutes the fundamental pillar. The discrete wavelet transform is introduced from the theory of approximation by translated functions. The continuous wavelet transform generalizes multiscale representations to translated and dilated wavelets, and the à trous algorithm furnishes its discrete redundant transform. Image denoising is first considered without any hypothesis on the signal distribution, on the basis of a contrario detection. Different softening functions are introduced. The introduction of a regularization constraint may improve the results. The application of Bayesian methods leads to an automated adaptation of the softening function to the signal distribution. The MAP principle leads to basis pursuit, a sparse decomposition on redundant dictionaries; the posterior expectation, however, minimizes the quadratic error scale by scale. The proposed deconvolution algorithm is based on coupling wavelet denoising with an iterative inversion algorithm. The different methods are illustrated by numerical experiments on a simulated image similar to images of the deep sky. White Gaussian stationary noise was added at three levels. In the conclusion, several important connected problems are tackled.
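
    The à trous algorithm mentioned above is compact enough to sketch; the version below uses the common B3-spline kernel and scipy's mirrored boundary handling, and is a generic sketch rather than the chapter's code.

    ```python
    import numpy as np
    from scipy.ndimage import convolve

    def a_trous_level(img, j):
        """One stage of the undecimated a trous transform at scale j (B3-spline kernel)."""
        k = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0
        # Insert 2**j - 1 zeros ("holes") between kernel taps at scale j.
        kj = np.zeros(4 * 2**j + 1)
        kj[::2**j] = k
        smooth = convolve(convolve(img, kj[None, :], mode="mirror"),
                          kj[:, None], mode="mirror")
        return img - smooth, smooth   # wavelet plane and next coarser approximation

    img = np.random.default_rng(0).normal(size=(64, 64))
    w0, c0 = a_trous_level(img, 0)    # finest wavelet plane
    w1, c1 = a_trous_level(c0, 1)     # next scale operates on the smoothed image
    ```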

  20. Bayesian tomography and integrated data analysis in fusion diagnostics

    Energy Technology Data Exchange (ETDEWEB)

    Li, Dong, E-mail: lid@swip.ac.cn; Dong, Y. B.; Deng, Wei; Shi, Z. B.; Fu, B. Z.; Gao, J. M.; Wang, T. B.; Zhou, Yan; Liu, Yi; Yang, Q. W.; Duan, X. R. [Southwestern Institute of Physics, Chengdu, Sichuan 610041 (China)

    2016-11-15

    In this article, a Bayesian tomography method using non-stationary Gaussian process for a prior has been introduced. The Bayesian formalism allows quantities which bear uncertainty to be expressed in the probabilistic form so that the uncertainty of a final solution can be fully resolved from the confidence interval of a posterior probability. Moreover, a consistency check of that solution can be performed by checking whether the misfits between predicted and measured data are reasonably within an assumed data error. In particular, the accuracy of reconstructions is significantly improved by using the non-stationary Gaussian process that can adapt to the varying smoothness of emission distribution. The implementation of this method to a soft X-ray diagnostics on HL-2A has been used to explore relevant physics in equilibrium and MHD instability modes. This project is carried out within a large size inference framework, aiming at an integrated analysis of heterogeneous diagnostics.

  1. Bayesian methods for the analysis of inequality constrained contingency tables.

    Science.gov (United States)

    Laudy, Olav; Hoijtink, Herbert

    2007-04-01

    A Bayesian methodology for the analysis of inequality constrained models for contingency tables is presented. The problem of interest lies in obtaining the estimates of functions of cell probabilities subject to inequality constraints, testing hypotheses and selection of the best model. Constraints on conditional cell probabilities and on local, global, continuation and cumulative odds ratios are discussed. A Gibbs sampler to obtain a discrete representation of the posterior distribution of the inequality constrained parameters is used. Using this discrete representation, the credibility regions of functions of cell probabilities can be constructed. Posterior model probabilities are used for model selection and hypotheses are tested using posterior predictive checks. The Bayesian methodology proposed is illustrated in two examples.
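
    As a minimal illustration of posterior inference under an inequality constraint, one can sample the unconstrained Dirichlet posterior of cell probabilities and keep only the draws satisfying the constraint; this rejection sketch with an invented table is simpler than the authors' Gibbs sampler.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    counts = np.array([30, 22, 48])                   # hypothetical 1x3 table
    draws = rng.dirichlet(counts + 1, size=20000)     # posterior under a uniform prior
    kept = draws[draws[:, 0] > draws[:, 1]]           # enforce the constraint p1 > p2
    print("posterior prob. of constraint:", len(kept) / len(draws))
    print("constrained posterior mean:", kept.mean(axis=0))
    ```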

  2. A Bayesian on-off analysis of cosmic ray data

    Science.gov (United States)

    Nosek, Dalibor; Nosková, Jana

    2017-09-01

    We deal with the analysis of on-off measurements designed for the confirmation of a weak source of events whose presence is hypothesized, based on former observations. The problem of a small number of source events that are masked by an imprecisely known background is addressed from a Bayesian point of view. We examine three closely related variables, the posterior distributions of which carry relevant information about various aspects of the investigated phenomena. This information is utilized for predictions of further observations, given actual data. Backed by details of detection, we propose how to quantify disparities between different measurements. The usefulness of the Bayesian inference is demonstrated on examples taken from cosmic ray physics.

  3. Bayesian tomography and integrated data analysis in fusion diagnostics

    Science.gov (United States)

    Li, Dong; Dong, Y. B.; Deng, Wei; Shi, Z. B.; Fu, B. Z.; Gao, J. M.; Wang, T. B.; Zhou, Yan; Liu, Yi; Yang, Q. W.; Duan, X. R.

    2016-11-01

    In this article, a Bayesian tomography method using non-stationary Gaussian process for a prior has been introduced. The Bayesian formalism allows quantities which bear uncertainty to be expressed in the probabilistic form so that the uncertainty of a final solution can be fully resolved from the confidence interval of a posterior probability. Moreover, a consistency check of that solution can be performed by checking whether the misfits between predicted and measured data are reasonably within an assumed data error. In particular, the accuracy of reconstructions is significantly improved by using the non-stationary Gaussian process that can adapt to the varying smoothness of emission distribution. The implementation of this method to a soft X-ray diagnostics on HL-2A has been used to explore relevant physics in equilibrium and MHD instability modes. This project is carried out within a large size inference framework, aiming at an integrated analysis of heterogeneous diagnostics.

  4. A Bayesian semiparametric factor analysis model for subtype identification.

    Science.gov (United States)

    Sun, Jiehuan; Warren, Joshua L; Zhao, Hongyu

    2017-04-25

    Disease subtype identification (clustering) is an important problem in biomedical research. Gene expression profiles are commonly utilized to infer disease subtypes, which often lead to biologically meaningful insights into disease. Despite many successes, existing clustering methods may not perform well when genes are highly correlated and many uninformative genes are included for clustering due to the high dimensionality. In this article, we introduce a novel subtype identification method in the Bayesian setting based on gene expression profiles. This method, called BCSub, adopts an innovative semiparametric Bayesian factor analysis model to reduce the dimension of the data to a few factor scores for clustering. Specifically, the factor scores are assumed to follow the Dirichlet process mixture model in order to induce clustering. Through extensive simulation studies, we show that BCSub has improved performance over commonly used clustering methods. When applied to two gene expression datasets, our model is able to identify subtypes that are clinically more relevant than those identified from the existing methods.

  5. A Bayesian Super-Resolution Approach to Demosaicing of Blurred Images

    Directory of Open Access Journals (Sweden)

    Molina Rafael

    2006-01-01

    Full Text Available Most available digital color cameras use a single image sensor with a color filter array (CFA) to acquire an image. In order to produce a visible color image, a demosaicing process must be applied, which produces undesirable artifacts. An additional problem appears when the observed color image is also blurred. This paper addresses the problem of deconvolving color images observed with a single charge-coupled device (CCD) from the super-resolution point of view. Utilizing the Bayesian paradigm, estimates of the reconstructed image and the model parameters are generated. The proposed method is tested on real images.

  6. Bayesian networks for omics data analysis

    NARCIS (Netherlands)

    Gavai, A.K.

    2009-01-01

    This thesis focuses on two aspects of high throughput technologies, i.e. data storage and data analysis, in particular in transcriptomics and metabolomics. Both technologies are part of a research field that is generally called ‘omics’ (or ‘-omics’, with a leading hyphen), which refers to genomics,

  7. A Bayesian Nonparametric Meta-Analysis Model

    Science.gov (United States)

    Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G.

    2015-01-01

    In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall…

  8. A Bayesian framework for human body pose tracking from depth image sequences.

    Science.gov (United States)

    Zhu, Youding; Fujimura, Kikuo

    2010-01-01

    This paper addresses the problem of accurate and robust tracking of 3D human body pose from depth image sequences. Recovering the large number of degrees of freedom in human body movements from a depth image sequence is challenging due to the need to resolve the depth ambiguity caused by self-occlusions and the difficulty of recovering from tracking failure. Human body pose can be estimated through model fitting using dense correspondences between depth data and an articulated human model (the local optimization method). Although this usually achieves high accuracy due to dense correspondences, it may fail to recover from tracking failure. Alternatively, human pose may be reconstructed by detecting and tracking human body anatomical landmarks (key-points) based on low-level depth image analysis. While this key-point based method is robust and recovers from tracking failure, its pose estimation accuracy depends solely on the image-based localization accuracy of key-points. To address these limitations, we present a flexible Bayesian framework for integrating pose estimation results obtained by methods based on key-points and local optimization. Experimental results are shown and a performance comparison is presented to demonstrate the effectiveness of the proposed approach.

  9. Bayesian networks for omics data analysis

    OpenAIRE

    Gavai, A.K.

    2009-01-01

    This thesis focuses on two aspects of high throughput technologies, i.e. data storage and data analysis, in particular in transcriptomics and metabolomics. Both technologies are part of a research field that is generally called ‘omics’ (or ‘-omics’, with a leading hyphen), which refers to genomics, transcriptomics, proteomics, or metabolomics. Although these techniques study different entities (genes, gene expression, proteins, or metabolites), they all have in common that they use high-throu...

  10. Empirical Markov Chain Monte Carlo Bayesian analysis of fMRI data.

    Science.gov (United States)

    de Pasquale, F; Del Gratta, C; Romani, G L

    2008-08-01

    In this work an Empirical Markov Chain Monte Carlo Bayesian approach to analyse fMRI data is proposed. The Bayesian framework is appealing since complex models can be adopted in the analysis for both the image and the noise model. Here, the noise autocorrelation is taken into account by adopting an AutoRegressive model of order one, and a versatile non-linear model is assumed for the task-related activation. Model parameters include the noise variance and autocorrelation, activation amplitudes, and the hemodynamic response function parameters. These are estimated at each voxel from samples of the posterior distribution. Prior information is included by means of a 4D spatio-temporal model for the interaction between neighbouring voxels in space and time. The results show that this model can provide smooth estimates from low-SNR data while preserving important spatial structures in the data. A simulation study is presented in which the accuracy and bias of the estimates are addressed. Furthermore, some results on convergence diagnostics of the adopted algorithm are presented. To validate the proposed approach, a comparison of the results with those from a standard GLM analysis, spatial filtering techniques, and a Variational Bayes approach is provided. This comparison shows that our approach outperforms the classical analysis and is consistent with other Bayesian techniques. This is investigated further by means of Bayes factors and the analysis of the residuals. The proposed approach applied to Blocked Design and Event Related datasets produced reliable maps of activation.
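
    The AR(1) noise component mentioned above enters through a log-likelihood of the standard stationary form; the sketch below assumes zero-mean residuals and is generic, not the authors' sampler.

    ```python
    import numpy as np

    def ar1_loglik(resid, rho, sigma2):
        """Exact stationary AR(1) log-likelihood of a zero-mean residual series."""
        n = len(resid)
        e = resid[1:] - rho * resid[:-1]              # innovations for t = 2..n
        return (-0.5 * n * np.log(2 * np.pi * sigma2)
                + 0.5 * np.log(1.0 - rho**2)          # stationary start, t = 1
                - (1.0 - rho**2) * resid[0] ** 2 / (2.0 * sigma2)
                - np.sum(e**2) / (2.0 * sigma2))

    # Demo: a series generated with rho = 0.4 scores higher at the true value.
    rng = np.random.default_rng(0)
    x = np.zeros(200)
    for t in range(1, 200):
        x[t] = 0.4 * x[t - 1] + rng.normal()
    print(ar1_loglik(x, 0.4, 1.0) > ar1_loglik(x, -0.4, 1.0))
    ```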

  11. A combinatorial Bayesian and Dirichlet model for prostate MR image segmentation using probabilistic image features

    Science.gov (United States)

    Li, Ang; Li, Changyang; Wang, Xiuying; Eberl, Stefan; Feng, Dagan; Fulham, Michael

    2016-08-01

    Blurred boundaries and heterogeneous intensities make accurate prostate MR image segmentation problematic. To improve prostate MR image segmentation we suggest an approach that includes: (a) an image patch division method to partition the prostate into homogeneous segments for feature extraction; (b) an image feature formulation and classification method, using the relevance vector machine, to provide probabilistic prior knowledge for graph energy construction; (c) a graph energy formulation scheme with Bayesian priors and Dirichlet graph energy and (d) a non-iterative graph energy minimization scheme, based on matrix differentiation, to perform the probabilistic pixel membership optimization. The segmentation output was obtained by assigning pixels with foreground and background labels based on derived membership probabilities. We evaluated our approach on the PROMISE-12 dataset with 50 prostate MR image volumes. Our approach achieved a mean Dice similarity coefficient (DSC) of 0.90 ± 0.02, which surpassed the five best prior-based methods in the PROMISE-12 segmentation challenge.
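
    The reported evaluation metric, the Dice similarity coefficient, is defined for two binary masks as follows (a generic sketch, not the authors' pipeline):

    ```python
    import numpy as np

    def dice(seg, gt):
        """DSC = 2|A ∩ B| / (|A| + |B|) for binary masks seg and gt."""
        seg, gt = seg.astype(bool), gt.astype(bool)
        return 2.0 * np.logical_and(seg, gt).sum() / (seg.sum() + gt.sum())

    # Demo: 8 overlapping pixels out of 8 and 12 gives 2*8 / (8 + 12) = 0.8.
    a = np.zeros((4, 4), int); a[:2] = 1
    b = np.zeros((4, 4), int); b[:3] = 1
    print(dice(a, b))
    ```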

  12. PFG NMR and Bayesian analysis to characterise non-Newtonian fluids

    Science.gov (United States)

    Blythe, Thomas W.; Sederman, Andrew J.; Stitt, E. Hugh; York, Andrew P. E.; Gladden, Lynn F.

    2017-01-01

    Many industrial flow processes are sensitive to changes in the rheological behaviour of process fluids, and there therefore exists a need for methods that provide online, or inline, rheological characterisation for process control and optimisation over timescales of minutes or less. Nuclear magnetic resonance (NMR) offers a non-invasive technique for this application, without limitation on optical opacity. We present a Bayesian analysis approach using pulsed field gradient (PFG) NMR to enable estimation of the rheological parameters of Herschel-Bulkley fluids in a pipe flow geometry, characterised by a flow behaviour index n, yield stress τ0, and consistency factor k, by analysis of the signal in q-space. This approach eliminates the need for velocity image acquisition and expensive gradient hardware. We investigate the robustness of the proposed Bayesian NMR approach to noisy data and reduced sampling using simulated NMR data and show that even with a signal-to-noise ratio (SNR) of 100, only 16 points need to be sampled to provide rheological parameters accurate to within 2% of the ground truth. Experimental validation is provided through a case study on Carbopol 940 solutions (model Herschel-Bulkley fluids) using PFG NMR at a 1H resonance frequency of 85.2 MHz; for SNR > 1000, only 8 points need to be sampled. This corresponds to a short total acquisition time. Comparison with non-Bayesian NMR methods demonstrates that the Bayesian NMR approach is in agreement with MR flow imaging to within the accuracy of the measurement. Furthermore, as we increase the concentration of Carbopol 940 we observe a change in rheological characteristics, probably due to shear-history-dependent behaviour and the different geometries used. This behaviour highlights the need for online, or inline, rheological characterisation in industrial process applications.
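
    For reference, the Herschel-Bulkley constitutive relation that defines the parameters n, τ0, and k takes the standard form:

    ```latex
    % Shear stress tau vs. shear rate gamma-dot, with yield stress tau_0,
    % consistency factor k, and flow behaviour index n.
    \tau = \tau_0 + k\,\dot{\gamma}^{\,n} \quad \text{for } \tau > \tau_0,
    \qquad \dot{\gamma} = 0 \quad \text{for } \tau \le \tau_0 .
    ```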

  13. A Hierarchical Bayesian M/EEG Imaging Method Correcting for Incomplete Spatio-Temporal Priors

    DEFF Research Database (Denmark)

    Stahlhut, Carsten; Attias, Hagai T.; Sekihara, Kensuke;

    2013-01-01

    In this paper we present a hierarchical Bayesian model, to tackle the highly ill-posed problem that follows with MEG and EEG source imaging. Our model promotes spatiotemporal patterns through the use of both spatial and temporal basis functions. While in contrast to most previous spatio-temporal ...

  14. Bayesian Switching Factor Analysis for Estimating Time-varying Functional Connectivity in fMRI.

    Science.gov (United States)

    Taghia, Jalil; Ryali, Srikanth; Chen, Tianwen; Supekar, Kaustubh; Cai, Weidong; Menon, Vinod

    2017-03-03

    There is growing interest in understanding the dynamical properties of functional interactions between distributed brain regions. However, robust estimation of temporal dynamics from functional magnetic resonance imaging (fMRI) data remains challenging due to limitations in extant multivariate methods for modeling time-varying functional interactions between multiple brain areas. Here, we develop a Bayesian generative model for fMRI time-series within the framework of hidden Markov models (HMMs). The model is a dynamic variant of the static factor analysis model (Ghahramani and Beal, 2000). We refer to this model as Bayesian switching factor analysis (BSFA) as it integrates factor analysis into a generative HMM in a unified Bayesian framework. In BSFA, brain dynamic functional networks are represented by latent states which are learnt from the data. Crucially, BSFA is a generative model which estimates the temporal evolution of brain states and transition probabilities between states as a function of time. An attractive feature of BSFA is the automatic determination of the number of latent states via Bayesian model selection arising from penalization of excessively complex models. Key features of BSFA are validated using extensive simulations on carefully designed synthetic data. We further validate BSFA using fingerprint analysis of multisession resting-state fMRI data from the Human Connectome Project (HCP). Our results show that modeling temporal dependencies in the generative model of BSFA results in improved fingerprinting of individual participants. Finally, we apply BSFA to elucidate the dynamic functional organization of the salience, central-executive, and default mode networks, three core neurocognitive systems with a central role in cognitive and affective information processing (Menon, 2011). Across two HCP sessions, we demonstrate a high level of dynamic interactions between these networks and determine that the salience network has the highest temporal

  15. A Bayesian analysis of pentaquark signals from CLAS data

    CERN Document Server

    Ireland, D G; Protopopescu, D; Ambrozewicz, P; Anghinolfi, M; Asryan, G; Avakian, H; Bagdasaryan, H; Baillie, N; Ball, J P; Baltzell, N A; Batourine, V; Battaglieri, M; Bedlinskiy, I; Bellis, M; Benmouna, N; Berman, B L; Biselli, A S; Blaszczyk, L; Bouchigny, S; Boiarinov, S; Bradford, R; Branford, D; Briscoe, W J; Brooks, W K; Burkert, V D; Butuceanu, C; Calarco, J R; Careccia, S L; Carman, D S; Casey, L; Chen, S; Cheng, L; Cole, P L; Collins, P; Coltharp, P; Crabb, D; Credé, V; Dashyan, N; De Masi, R; De Vita, R; De Sanctis, E; Degtyarenko, P V; Deur, A; Dickson, R; Djalali, C; Dodge, G E; Donnelly, J; Doughty, D; Dugger, M; Dzyubak, O P; Egiyan, H; Egiyan, K S; El Fassi, L; Elouadrhiri, L; Eugenio, P; Fedotov, G; Feldman, G; Fradi, A; Funsten, H; Garçon, M; Gavalian, G; Gevorgyan, N; Gilfoyle, G P; Giovanetti, K L; Girod, F X; Goetz, J T; Gohn, W; Gonenc, A; Gothe, R W; Griffioen, K A; Guidal, M; Guler, N; Guo, L; Gyurjyan, V; Hafidi, K; Hakobyan, H; Hanretty, C; Hassall, N; Hersman, F W; Hleiqawi, I; Holtrop, M; Hyde-Wright, C E; Ilieva, Y; Ishkhanov, B S; Isupov, E L; Jenkins, D; Jo, H S; Johnstone, J R; Joo, K; Jüngst, H G; Kalantarians, N; Kellie, J D; Khandaker, M; Kim, W; Klein, A; Klein, F J; Kossov, M; Krahn, Z; Kramer, L H; Kubarovski, V; Kühn, J; Kuleshov, S V; Kuznetsov, V; Lachniet, J; Laget, J M; Langheinrich, J; Lawrence, D; Livingston, K; Lu, H Y; MacCormick, M; Markov, N; Mattione, P; Mecking, B A; Mestayer, M D; Meyer, C A; Mibe, T; Mikhailov, K; Mirazita, M; Miskimen, R; Mokeev, V; Moreno, B; Moriya, K; Morrow, S A; Moteabbed, M; Munevar, E; Mutchler, G S; Nadel-Turonski, P; Nasseripour, R; Niccolai, S; Niculescu, G; Niculescu, I; Niczyporuk, B B; Niroula, M R; Niyazov, R A; Nozar, M; Osipenko, M; Ostrovidov, A I; Park, K; Pasyuk, E; Paterson, C; Anefalos Pereira, S; Pierce, J; Pivnyuk, N; Pogorelko, O; Pozdniakov, S; Price, J W; Procureur, S; Prok, Y; Raue, B A; Ricco, G; Ripani, M; Ritchie, B G; Ronchetti, F; Rosner, G; Rossi, P; Sabatie, F; Salamanca, J; Salgado, C; Santoro, J P; Sapunenko, V; Schumacher, R A; Serov, V S; Sharabyan, Yu G; Sharov, D; Shvedunov, N V; Smith, E S; Smith, L C; Sober, D I; Sokhan, D; Stavinsky, A; Stepanyan, S S; Stepanyan, S; Stokes, B E; Stoler, P; Strauch, S; Taiuti, M; Tedeschi, D J; Thoma, U; Tkabladze, A; Tkachenko, S; Tur, C; Ungaro, M; Vineyard, M F; Vlassov, A V; Watts, D P; Weinstein, L B; Weygand, D P; Williams, M; Wolin, E; Wood, M H; Yegneswaran, A; Zana, L; Zhang, J; Zhao, B; Zhao, Z W

    2007-01-01

    We examine the results of two measurements by the CLAS collaboration, one of which claimed evidence for a $\Theta^{+}$ pentaquark, whilst the other found no such evidence. The unique feature of these two experiments was that they were performed with the same experimental setup. Using a Bayesian analysis we find that the results of the two experiments are in fact compatible with each other, but that the first measurement did not contain sufficient information to determine unambiguously the existence of a $\Theta^{+}$. Further, we suggest a means by which the existence of a new candidate particle can be tested in a rigorous manner.

  16. Bayesian analysis of the Hector’s Dolphin data

    Directory of Open Access Journals (Sweden)

    King, R.

    2004-06-01

    In recent years there have been increasing concerns for many wildlife populations, due to decreasing population trends. This has led to the introduction of management schemes to increase the survival rates and hence the population size of many species of animals. We concentrate on a particular dolphin population situated off the coast of New Zealand, and investigate whether the introduction of a fishing gill net ban was effective in decreasing dolphin mortality. We undertake a Bayesian analysis of the data, in which we quantitatively compare the different competing biological hypotheses, determining the effect of the sanctuary upon the dolphin population.

  17. Bayesian frequency analysis of HD 201433 observations with BRITE

    CERN Document Server

    Kallinger, T

    2016-01-01

    Multiple oscillation frequencies separated by close to or less than the formal frequency resolution of a data set are a serious problem in the frequency analysis of time series data. We present a new and fully automated Bayesian approach that searches for close frequencies in time series data and assesses their significance by comparison to no signal and a mono-periodic signal. We extensively test the approach with synthetic data sets and apply it to the 156-day-long high-precision BRITE photometry of the SPB star HD 201433, for which we find a sequence of nine statistically significant rotationally split dipole modes.
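
    As a rough illustration of the comparison being made, the Python sketch below scores least-squares sinusoid fits against a no-signal model using the BIC, a crude stand-in for the fully Bayesian evidence computed in the paper; the frequencies, noise level, and data length are invented.

    import numpy as np

    rng = np.random.default_rng(1)
    t = np.linspace(0, 50, 400)
    y = (np.sin(2 * np.pi * 0.50 * t) +
         0.8 * np.sin(2 * np.pi * 0.52 * t) +      # close companion frequency
         rng.normal(scale=0.7, size=t.size))

    def bic(freqs):
        # Linear least-squares fit of a sine/cosine pair per frequency
        cols = [np.ones_like(t)]
        for f in freqs:
            cols += [np.sin(2 * np.pi * f * t), np.cos(2 * np.pi * f * t)]
        X = np.column_stack(cols)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ beta) ** 2)
        return t.size * np.log(rss / t.size) + X.shape[1] * np.log(t.size)

    for label, freqs in [("no signal", []), ("one frequency", [0.50]),
                         ("two close frequencies", [0.50, 0.52])]:
        print(f"{label}: BIC = {bic(freqs):.1f}")   # lower is preferred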

  18. Bayesian Reasoning in Data Analysis A Critical Introduction

    CERN Document Server

    D'Agostini, Giulio

    2003-01-01

    This book provides a multi-level introduction to Bayesian reasoning (as opposed to "conventional statistics") and its applications to data analysis. The basic ideas of this "new" approach to the quantification of uncertainty are presented using examples from research and everyday life. Applications covered include: parametric inference; combination of results; treatment of uncertainty due to systematic errors and background; comparison of hypotheses; unfolding of experimental distributions; upper/lower bounds in frontier-type measurements. Approximate methods for routine use are derived and ar...

  19. A Bayesian analysis of pentaquark signals from CLAS data

    Energy Technology Data Exchange (ETDEWEB)

    David Ireland; Bryan McKinnon; Dan Protopopescu; Pawel Ambrozewicz; Marco Anghinolfi; G. Asryan; Harutyun Avakian; H. Bagdasaryan; Nathan Baillie; Jacques Ball; Nathan Baltzell; V. Batourine; Marco Battaglieri; Ivan Bedlinski; Ivan Bedlinskiy; Matthew Bellis; Nawal Benmouna; Barry Berman; Angela Biselli; Lukasz Blaszczyk; Sylvain Bouchigny; Sergey Boyarinov; Robert Bradford; Derek Branford; William Briscoe; William Brooks; Volker Burkert; Cornel Butuceanu; John Calarco; Sharon Careccia; Daniel Carman; Liam Casey; Shifeng Chen; Lu Cheng; Philip Cole; Patrick Collins; Philip Coltharp; Donald Crabb; Volker Crede; Natalya Dashyan; Rita De Masi; Raffaella De Vita; Enzo De Sanctis; Pavel Degtiarenko; Alexandre Deur; Richard Dickson; Chaden Djalali; Gail Dodge; Joseph Donnelly; David Doughty; Michael Dugger; Oleksandr Dzyubak; Hovanes Egiyan; Kim Egiyan; Lamiaa Elfassi; Latifa Elouadrhiri; Paul Eugenio; Gleb Fedotov; Gerald Feldman; Ahmed Fradi; Herbert Funsten; Michel Garcon; Gagik Gavalian; Nerses Gevorgyan; Gerard Gilfoyle; Kevin Giovanetti; Francois-Xavier Girod; John Goetz; Wesley Gohn; Atilla Gonenc; Ralf Gothe; Keith Griffioen; Michel Guidal; Nevzat Guler; Lei Guo; Vardan Gyurjyan; Kawtar Hafidi; Hayk Hakobyan; Charles Hanretty; Neil Hassall; F. Hersman; Ishaq Hleiqawi; Maurik Holtrop; Charles Hyde; Yordanka Ilieva; Boris Ishkhanov; Eugeny Isupov; D. Jenkins; Hyon-Suk Jo; John Johnstone; Kyungseon Joo; Henry Juengst; Narbe Kalantarians; James Kellie; Mahbubul Khandaker; Wooyoung Kim; Andreas Klein; Franz Klein; Mikhail Kossov; Zebulun Krahn; Laird Kramer; Valery Kubarovsky; Joachim Kuhn; Sergey Kuleshov; Viacheslav Kuznetsov; Jeff Lachniet; Jean Laget; Jorn Langheinrich; D. Lawrence; Kenneth Livingston; Haiyun Lu; Marion MacCormick; Nikolai Markov; Paul Mattione; Bernhard Mecking; Mac Mestayer; Curtis Meyer; Tsutomu Mibe; Konstantin Mikhaylov; Marco Mirazita; Rory Miskimen; Viktor Mokeev; Brahim Moreno; Kei Moriya; Steven Morrow; Maryam Moteabbed; Edwin Munevar Espitia; Gordon Mutchler; Pawel Nadel-Turonski; Rakhsha Nasseripour; Silvia Niccolai; Gabriel Niculescu; Maria-Ioana Niculescu; Bogdan Niczyporuk; Megh Niroula; Rustam Niyazov; Mina Nozar; Mikhail Osipenko; Alexander Ostrovidov; Kijun Park; Evgueni Pasyuk; Craig Paterson; Sergio Pereira; Joshua Pierce; Nikolay Pivnyuk; Oleg Pogorelko; Sergey Pozdnyakov; John Price; Sebastien Procureur; Yelena Prok; Brian Raue; Giovanni Ricco; Marco Ripani; Barry Ritchie; Federico Ronchetti; Guenther Rosner; Patrizia Rossi; Franck Sabatie; Julian Salamanca; Carlos Salgado; Joseph Santoro; Vladimir Sapunenko; Reinhard Schumacher; Vladimir Serov; Youri Sharabian; Dmitri Sharov; Nikolay Shvedunov; Elton Smith; Lee Smith; Daniel Sober; Daria Sokhan; Aleksey Stavinskiy; Samuel Stepanyan; Stepan Stepanyan; Burnham Stokes; Paul Stoler; Steffen Strauch; Mauro Taiuti; David Tedeschi; Ulrike Thoma; Avtandil Tkabladze; Svyatoslav Tkachenko; Clarisse Tur; Maurizio Ungaro; Michael Vineyard; Alexander Vlassov; Daniel Watts; Lawrence Weinstein; Dennis Weygand; M. Williams; Elliott Wolin; M.H. Wood; Amrit Yegneswaran; Lorenzo Zana; Jixie Zhang; Bo Zhao; Zhiwen Zhao

    2008-02-01

    We examine the results of two measurements by the CLAS collaboration, one of which claimed evidence for a $\Theta^{+}$ pentaquark, whilst the other found no such evidence. The unique feature of these two experiments was that they were performed with the same experimental setup. Using a Bayesian analysis we find that the results of the two experiments are in fact compatible with each other, but that the first measurement did not contain sufficient information to determine unambiguously the existence of a $\Theta^{+}$. Further, we suggest a means by which the existence of a new candidate particle can be tested in a rigorous manner.

  20. Safety Analysis of Liquid Rocket Engine Using Bayesian Networks

    Institute of Scientific and Technical Information of China (English)

    WANG Hua-wei; YAN Zhi-qiang

    2007-01-01

    Safety analysis of liquid rocket engines is of great value for shortening the development cycle, saving development expenditure, and reducing development risk. The relationships between the structures and components of a liquid rocket engine are complex, and test data are scarce in the development phase; uncertainties therefore pervade its safety analysis. A safety analysis model integrating FMEA (failure mode and effects analysis) with Bayesian networks (BN) is put forward for liquid rocket engines, which can combine qualitative analysis with quantitative decision-making. The method has the advantages of fusing multiple sources of information, requiring small sample sizes, and achieving high accuracy. An example shows that the method is efficient.

  1. Applications of Bayesian Procrustes shape analysis to ensemble radar reflectivity nowcast verification

    Science.gov (United States)

    Fox, Neil I.; Micheas, Athanasios C.; Peng, Yuqiang

    2016-07-01

    This paper introduces the use of Bayesian full Procrustes shape analysis in object-oriented meteorological applications. In particular, the Procrustes methodology is used to generate mean forecast precipitation fields from a set of ensemble forecasts. This approach has advantages over other ensemble averaging techniques in that it can produce a forecast that retains the morphological features of the precipitation structures and present the range of forecast outcomes represented by the ensemble. The production of the ensemble mean avoids the problems of smoothing that result from simple pixel or cell averaging, while producing credible sets that retain information on ensemble spread. Also in this paper, the full Bayesian Procrustes scheme is used as an object verification tool for precipitation forecasts. This is an extension of a previously presented Procrustes shape analysis based verification approach into a full Bayesian format designed to handle the verification of precipitation forecasts that match objects from an ensemble of forecast fields to a single truth image. The methodology is tested on radar reflectivity nowcasts produced in the Warning Decision Support System - Integrated Information (WDSS-II) by varying parameters in the K-means cluster tracking scheme.

  2. Implementation of a Bayesian Engine for Uncertainty Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Leng Vang; Curtis Smith; Steven Prescott

    2014-08-01

    In probabilistic risk assessment, it is important to have an environment where analysts have access to shared, secured high-performance computing and a statistical analysis tool package. As part of the advanced small modular reactor probabilistic risk analysis framework implementation, we have identified the need for advanced Bayesian computations. However, in order to make this technology available to non-specialists, there is also a need for a simplified tool that allows users to author models and evaluate them within this framework. As a proof-of-concept, we have implemented an advanced open source Bayesian inference tool, OpenBUGS, within the browser-based cloud risk analysis framework that is under development at the Idaho National Laboratory. This development, the “OpenBUGS Scripter”, has been implemented as a client-side, visual, web-based integrated development environment for creating OpenBUGS language scripts. It depends on the shared server environment to execute the generated scripts and to transmit results back to the user. The visual models are in the form of linked diagrams, from which we automatically create the applicable OpenBUGS script that matches the diagram. These diagrams can be saved locally or stored on the server environment to be shared with other users.

  3. Bayesian analysis of inflationary features in Planck and SDSS data

    CERN Document Server

    Benetti, Micol

    2016-01-01

    We perform a Bayesian analysis to study possible features in the primordial inflationary power spectrum of scalar perturbations. In particular, we analyse the possibility of detecting the imprint of these primordial features in the anisotropy temperature power spectrum of the Cosmic Microwave Background (CMB) and also in the matter power spectrum P (k). We use the most recent CMB data provided by the Planck Collaboration and P (k) measurements from the eleventh data release of the Sloan Digital Sky Survey. We focus our analysis on a class of potentials whose features are localised at different intervals of angular scales, corresponding to multipoles in the ranges 10 < l < 60 (Oscill-1) and 150 < l < 300 (Oscill-2). Our results show that one of the step-potentials (Oscill-1) provides a better fit to the CMB data than does the featureless LCDM scenario, with a moderate Bayesian evidence in favor of the former. Adding the P (k) data to the analysis weakens the evidence of the Oscill-1 potential relat...

  4. A Bayesian approach for characterization of soft tissue viscoelasticity in acoustic radiation force imaging.

    Science.gov (United States)

    Zhao, Xiaodong; Pelegri, Assimina A

    2016-04-01

    Biomechanical imaging techniques based on acoustic radiation force (ARF) have been developed to characterize the viscoelasticity of soft tissue by measuring the motion excited by ARF non-invasively. The unknown stress distribution in the region of excitation limits an accurate inverse characterization of soft tissue viscoelasticity, and single degree-of-freedom simplified models have been applied to solve the inverse problem approximately. In this study, the ARF-induced creep imaging is employed to estimate the time constant of a Voigt viscoelastic tissue model, and an inverse finite element (FE) characterization procedure based on a Bayesian formulation is presented. The Bayesian approach aims to estimate a reasonable quantification of the probability distributions of soft tissue mechanical properties in the presence of measurement noise and model parameter uncertainty. Gaussian process metamodeling is applied to provide a fast statistical approximation based on a small number of computationally expensive FE model runs. Numerical simulation results demonstrate that the Bayesian approach provides an efficient and practical estimation of the probability distributions of time constant in the ARF-induced creep imaging. In a comparison study with the single degree of freedom models, the Bayesian approach with FE models improves the estimation results even in the presence of large uncertainty levels of the model parameters.
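
    The metamodeling step can be sketched as follows, assuming Python with scikit-learn; the one-parameter stand-in for the FE model, the design points, and the measurement are all hypothetical. A Gaussian process is trained on a handful of expensive runs and is then queried cheaply inside the Bayesian update.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def expensive_model(tau):
        # Stand-in for a costly FE run: a creep-response feature vs. time constant
        return 1.0 - np.exp(-1.0 / tau)

    # A small design of "FE runs" to train the surrogate
    tau_train = np.linspace(0.1, 2.0, 8).reshape(-1, 1)
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5))
    gp.fit(tau_train, expensive_model(tau_train).ravel())

    # Cheap posterior evaluation over a grid, given one noisy measurement
    y_obs, noise_sd = 0.6, 0.05
    tau_grid = np.linspace(0.1, 2.0, 200).reshape(-1, 1)
    log_post = -0.5 * ((y_obs - gp.predict(tau_grid)) / noise_sd) ** 2  # flat prior
    post = np.exp(log_post - log_post.max())
    post /= np.trapz(post, tau_grid.ravel())      # normalised posterior over tau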

  5. Semi-Huber Quadratic Function and Comparative Study of Some MRFs for Bayesian Image Restoration

    Science.gov (United States)

    De la Rosa, J. I.; Villa-Hernández, J.; González-Ramírez, E.; De la Rosa, M. E.; Gutiérrez, O.; Olvera-Olvera, C.; Castañeda-Miranda, R.; Fleury, G.

    2013-10-01

    The present work introduces an alternative method for digital image restoration in a Bayesian framework; in particular, a new half-quadratic function is proposed whose performance is satisfactory compared with other functions in the existing literature. The Bayesian methodology is based on prior knowledge that allows an efficient modelling of the image acquisition process. An adequate model must preserve the edges of objects in the image while smoothing noise. Thus, we use a convexity criterion given by a semi-Huber function to obtain adequate weighting of the (half-quadratic) cost functions to be minimized. The principal objective when using Bayesian methods based on Markov Random Fields (MRF) in the context of image processing is to eliminate the effects of excessive smoothing on the reconstruction of images rich in contours or edges. A comparison between the newly introduced scheme and three existing schemes, for the cases of noise filtering and image deblurring, is presented. This collection of implemented methods is based on MRFs such as the semi-Huber, the generalized Gaussian, the Welch, and Tukey potential functions with granularity control. The obtained results showed a satisfactory performance and the effectiveness of the proposed estimator with respect to the other three estimators.
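
    The exact semi-Huber form used by the authors is not reproduced here, but the edge-preserving behaviour of half-quadratic potentials can be sketched with a standard Huber-type function in Python (the threshold delta is a hypothetical tuning parameter): the potential is quadratic for small pixel differences and only linear across edges, so the corresponding half-quadratic weights shrink at discontinuities.

    import numpy as np

    def huber_potential(t, delta=1.0):
        # Quadratic near zero, linear in the tails: edges are penalised less
        a = np.abs(t)
        return np.where(a <= delta, 0.5 * t ** 2, delta * (a - 0.5 * delta))

    def half_quadratic_weight(t, delta=1.0):
        # Weight b such that phi(t) behaves locally like 0.5 * b * t^2; small at edges
        a = np.maximum(np.abs(t), 1e-12)
        return np.where(a <= delta, 1.0, delta / a)

    diffs = np.array([0.1, 0.5, 2.0, 10.0])   # pixel differences, e.g. across an edge
    print(huber_potential(diffs))
    print(half_quadratic_weight(diffs))       # down-weights the large differences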

  6. Bayesian imperfect information analysis for clinical recurrent data

    Directory of Open Access Journals (Sweden)

    Chang CK

    2014-12-01

    In medical research, clinical practice must often be undertaken with imperfect information from limited resources. This study applied Bayesian imperfect information-value analysis to realistic situations to produce likelihood functions and posterior distributions for a clinical decision-making problem with recurrent events. Three kinds of failure models are considered, and our methods are illustrated with an analysis of imperfect information from a trial of immunotherapy in the treatment of chronic granulomatous disease. In addition, we present evidence toward a better understanding of the differing behaviors along with concomitant variables. Based on the results of simulations, the imperfect information value of the concomitant variables was evaluated and different realistic situations were compared to see which could yield more accurate results for medical decision-making. Keywords: Bayesian value-of-information, recurrent events, chronic granulomatous disease

  7. Bayesian linkage analysis of categorical traits for arbitrary pedigree designs.

    Directory of Open Access Journals (Sweden)

    Abra Brisbin

    BACKGROUND: Pedigree studies of complex heritable diseases often feature nominal or ordinal phenotypic measurements and missing genetic marker or phenotype data. METHODOLOGY: We have developed a Bayesian method for Linkage analysis of Ordinal and Categorical traits (LOCate) that can analyze complex genealogical structure for family groups and incorporate missing data. LOCate uses a Gibbs sampling approach to assess linkage, incorporating a simulated tempering algorithm for fast mixing. While our treatment is Bayesian, we develop a LOD (log of odds) score estimator for assessing linkage from Gibbs sampling that is highly accurate for simulated data. LOCate is applicable to linkage analysis for ordinal or nominal traits, a versatility which we demonstrate by analyzing simulated data with a nominal trait, on which LOCate outperforms LOT, an existing method designed for ordinal traits. We additionally demonstrate our method's versatility by analyzing a candidate locus (D2S1788) for panic disorder in humans, in a dataset with a large amount of missing data, which LOT was unable to handle. CONCLUSION: LOCate's accuracy and applicability to both ordinal and nominal traits will prove useful to researchers interested in mapping loci for categorical traits.

  8. Analysis of Wave Directional Spreading by Bayesian Parameter Estimation

    Institute of Scientific and Technical Information of China (English)

    钱桦; 莊士贤; 高家俊

    2002-01-01

    A spatial array of wave gauges installed on an observation platform has been designed and arranged to measure the local features of winter monsoon directional waves off the Taishi coast of Taiwan. A new method, named the Bayesian Parameter Estimation Method (BPEM), is developed and adopted to determine the main direction and the directional spreading parameter of directional spectra. The BPEM can be considered a regression analysis that finds the maximum joint probability of parameters which best approximates the observed data from the Bayesian viewpoint. Analysis of the field wave data demonstrates the strong dependency of the characteristics of normalized directional spreading on the wave age. The Mitsuyasu-type empirical formula for the directional spectrum is therefore modified to be representative of the monsoon wave field. Moreover, it is suggested that Smax can be expressed as a function of wave steepness, with the values of Smax decreasing with increasing steepness. Finally, a local directional spreading model, which is simple to use in engineering practice, is proposed.

  9. Node Augmentation Technique in Bayesian Network Evidence Analysis and Marshaling

    Energy Technology Data Exchange (ETDEWEB)

    Keselman, Dmitry [Los Alamos National Laboratory; Tompkins, George H [Los Alamos National Laboratory; Leishman, Deborah A [Los Alamos National Laboratory

    2010-01-01

    Given a Bayesian network, sensitivity analysis is an important activity. This paper begins by describing a network augmentation technique which can simplify the analysis. Next, we present two techniques which allow the user to determine the probability distribution of a hypothesis node under conditions of uncertain evidence; i.e., the state of an evidence node or nodes is described by a user-specified probability distribution. Finally, we conclude with a discussion of three criteria for ranking evidence nodes based on their influence on a hypothesis node. All of these techniques have been used in conjunction with a commercial software package. A Bayesian network based on a directed acyclic graph (DAG) G is a graphical representation of a system of random variables that satisfies the following Markov property: any node (random variable) is independent of its non-descendants given the state of all its parents (Neapolitan, 2004). For simplicity's sake, we consider only discrete variables with a finite number of states, though most of the conclusions may be generalized.

  10. A Bayesian formulation of seismic fragility analysis of safety related equipment

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Z-L.; Pandey, M.; Xie, W-C., E-mail: z268wang@uwaterloo.ca, E-mail: mdpandey@uwaterloo.ca, E-mail: xie@uwaterloo.ca [Univ. of Waterloo, Ontario (Canada)

    2013-07-01

    A Bayesian approach to seismic fragility analysis of safety-related equipment is formulated. Unlike the classical statistical treatment, which handles the two sources of uncertainty in the parameter estimation in two separate steps, a Bayesian hierarchical model is advocated here for interpreting and combining the various uncertainties more clearly. In addition, with the availability of additional earthquake experience data and shaking table test results, a Bayesian approach to updating the fragility model of safety-related equipment is formulated by incorporating acquired failure and survivor evidence. Numerical results show the significance of the Bayesian approach in fragility analysis. (author)

  11. fastRESOLVE: fast Bayesian imaging for aperture synthesis in radio astronomy

    CERN Document Server

    Greiner, Maksim; Junklewitz, Henrik; Enßlin, Torsten A

    2016-01-01

    The standard imaging algorithm for interferometric radio data, CLEAN, is optimal for point source observations, but suboptimal for diffuse emission. Recently, RESOLVE, a new Bayesian algorithm, has been developed, which is ideal for extended source imaging. Unfortunately, RESOLVE is computationally very expensive. In this paper we present fastRESOLVE, a modification of RESOLVE based on an approximation of the interferometric likelihood that allows us to avoid expensive gridding routines and consequently gain a factor of roughly 100 in computation time. Furthermore, we include a Bayesian estimation of the measurement uncertainty of the visibilities into the imaging, a procedure not applied in aperture synthesis before. The algorithm requires little to no user input compared to the standard method CLEAN while being superior for extended and faint emission. We apply the algorithm to VLA data of Abell 2199 and show that it resolves more detailed structures.

  12. A Bayesian subgroup analysis using collections of ANOVA models.

    Science.gov (United States)

    Liu, Jinzhong; Sivaganesan, Siva; Laud, Purushottam W; Müller, Peter

    2017-03-20

    We develop a Bayesian approach to subgroup analysis using ANOVA models with multiple covariates, extending an earlier work. We assume a two-arm clinical trial with normally distributed response variable. We also assume that the covariates for subgroup finding are categorical and are a priori specified, and parsimonious easy-to-interpret subgroups are preferable. We represent the subgroups of interest by a collection of models and use a model selection approach to finding subgroups with heterogeneous effects. We develop suitable priors for the model space and use an objective Bayesian approach that yields multiplicity adjusted posterior probabilities for the models. We use a structured algorithm based on the posterior probabilities of the models to determine which subgroup effects to report. Frequentist operating characteristics of the approach are evaluated using simulation. While our approach is applicable in more general cases, we mainly focus on the 2 × 2 case of two covariates each at two levels for ease of presentation. The approach is illustrated using a real data example.

  13. A Bayesian Framework for Reliability Analysis of Spacecraft Deployments

    Science.gov (United States)

    Evans, John W.; Gallo, Luis; Kaminsky, Mark

    2012-01-01

    Deployable subsystems are essential to mission success of most spacecraft. These subsystems enable critical functions including power, communications and thermal control. The loss of any of these functions will generally result in loss of the mission. These subsystems and their components often consist of unique designs and applications for which various standardized data sources are not applicable for estimating reliability and for assessing risks. In this study, a two stage sequential Bayesian framework for reliability estimation of spacecraft deployment was developed for this purpose. This process was then applied to the James Webb Space Telescope (JWST) Sunshield subsystem, a unique design intended for thermal control of the Optical Telescope Element. Initially, detailed studies of NASA deployment history, "heritage information", were conducted, extending over 45 years of spacecraft launches. This information was then coupled to a non-informative prior and a binomial likelihood function to create a posterior distribution for deployments of various subsystems using Markov chain Monte Carlo sampling. Select distributions were then coupled to a subsequent analysis, using test data and anomaly occurrences on successive ground test deployments of scale model test articles of JWST hardware, to update the NASA heritage data. This allowed for a realistic prediction for the reliability of the complex Sunshield deployment, with credibility limits, within this two stage Bayesian framework.
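
    A conjugate Beta-Binomial toy version of the two-stage update reads as follows (Python/SciPy); the deployment counts are invented, and the actual study used MCMC sampling rather than this closed-form shortcut.

    from scipy import stats

    # Stage 1: hypothetical "heritage" deployment record
    n_heritage, fail_heritage = 120, 3
    a, b = 0.5, 0.5                       # non-informative (Jeffreys) prior
    a += n_heritage - fail_heritage       # successes
    b += fail_heritage                    # failures

    # Stage 2: update with ground-test deployments of the new hardware
    n_test, fail_test = 12, 0
    a += n_test - fail_test
    b += fail_test

    posterior = stats.beta(a, b)
    print(f"Posterior mean reliability: {posterior.mean():.4f}")
    print(f"90% credible interval: {posterior.ppf([0.05, 0.95])}")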

  14. ProFit: Bayesian Profile Fitting of Galaxy Images

    CERN Document Server

    Robotham, A S G; Tobar, R; Moffett, A; Driver, S P

    2016-01-01

    We present ProFit, a new code for Bayesian two-dimensional photometric galaxy profile modelling. ProFit consists of a low-level C++ library (libprofit), accessible via a command-line interface and documented API, along with high-level R (ProFit) and Python (PyProFit) interfaces (available at github.com/ICRAR/libprofit, github.com/ICRAR/ProFit, and github.com/ICRAR/pyprofit respectively). R ProFit is also available pre-built from CRAN; however, this version will be slightly behind the latest GitHub version. libprofit offers fast and accurate two-dimensional integration for a useful number of profiles, including Sersic, Core-Sersic, broken-exponential, Ferrer, Moffat, empirical King, point-source and sky, with a simple mechanism for adding new profiles. We show detailed comparisons between libprofit and GALFIT. libprofit is both faster and more accurate than GALFIT at integrating the ubiquitous Sersic profile for the most common values of the Sersic index n (0.5 < n < 8). The high-level fitting code Pr...

  15. STATISTICAL ANALYSIS OF THE TM- MODEL VIA BAYESIAN APPROACH

    Directory of Open Access Journals (Sweden)

    Muhammad Aslam

    2012-11-01

    The method of paired comparisons calls for the comparison of treatments presented in pairs to judges who prefer the better one based on their sensory evaluations. Thurstone (1927) and Mosteller (1951) employ the method of maximum likelihood to estimate the parameters of the Thurstone-Mosteller model for paired comparisons. A Bayesian analysis of the said model using the non-informative reference (Jeffreys) prior is presented in this study. The posterior estimates (means and joint modes) of the parameters and the posterior probabilities comparing the two parameters are obtained for the analysis. The predictive probabilities that one treatment (Ti) is preferred to any other treatment (Tj) in a future single comparison are also computed. In addition, the graphs of the marginal posterior distributions of the individual parameters are drawn. The appropriateness of the model is also tested using the Chi-Square test statistic.

  16. Bayesian Model Selection with Network Based Diffusion Analysis.

    Science.gov (United States)

    Whalen, Andrew; Hoppitt, William J E

    2016-01-01

    A number of recent studies have used Network Based Diffusion Analysis (NBDA) to detect the role of social transmission in the spread of a novel behavior through a population. In this paper we present a unified framework for performing NBDA in a Bayesian setting, and demonstrate how the Watanabe Akaike Information Criteria (WAIC) can be used for model selection. We present a specific example of applying this method to Time to Acquisition Diffusion Analysis (TADA). To examine the robustness of this technique, we performed a large scale simulation study and found that NBDA using WAIC could recover the correct model of social transmission under a wide range of cases, including under the presence of random effects, individual level variables, and alternative models of social transmission. This work suggests that NBDA is an effective and widely applicable tool for uncovering whether social transmission underpins the spread of a novel behavior, and may still provide accurate results even when key model assumptions are relaxed.

  17. Bayesian data analysis: estimating the efficacy of T'ai Chi as a case study.

    Science.gov (United States)

    Carpenter, Jacque; Gajewski, Byron; Teel, Cynthia; Aaronson, Lauren S

    2008-01-01

    Bayesian inference provides a formal framework for updating knowledge by combining prior knowledge with current data. Over the past 10 years, the Bayesian paradigm has become a popular analytic tool in health research. Although the nursing literature contains examples of Bayes' theorem applications to clinical decision making, it lacks an adequate introduction to Bayesian data analysis. Bayesian data analysis is introduced through a fully Bayesian model for determining the efficacy of tai chi as an illustrative example. The mechanics of using Bayesian models to combine prior knowledge, or data from previous studies, with observed data from a current study are discussed. The primary outcome in the illustrative example was physical function. Three prior probability distributions (priors) were generated for physical function using data from a similar study found in the literature. Each prior was combined with the likelihood from observed data in the current study to obtain a posterior probability distribution. In each case, the posterior distribution showed that the probability that the control group is better than the tai chi treatment group was low. Bayesian analysis is a valid technique that allows the researcher to manage varying amounts of data appropriately. As advancements in computer software continue, Bayesian techniques will become more accessible. Researchers must educate themselves on applications for Bayesian inference, as well as its methods and implications for future research.
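
    The mechanics described here reduce to a few lines in the conjugate Normal-Normal case; the Python sketch below uses invented effect sizes, not the study's actual numbers, with a prior taken from an earlier study and a likelihood from the current data.

    from scipy import stats

    prior_mean, prior_sd = 4.0, 3.0      # treatment-minus-control effect, prior study
    obs_mean, obs_se = 6.0, 2.0          # observed difference and its standard error

    # Precision-weighted conjugate update (variances treated as known)
    w_prior, w_obs = 1 / prior_sd ** 2, 1 / obs_se ** 2
    post_var = 1 / (w_prior + w_obs)
    post_mean = post_var * (w_prior * prior_mean + w_obs * obs_mean)

    posterior = stats.norm(post_mean, post_var ** 0.5)
    # Probability that the control group is better than the treatment group
    print(f"P(effect < 0) = {posterior.cdf(0.0):.3f}")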

  18. A Fast Edge Preserving Bayesian Reconstruction Method for Parallel Imaging Applications in Cardiac MRI

    Science.gov (United States)

    Singh, Gurmeet; Raj, Ashish; Kressler, Bryan; Nguyen, Thanh D.; Spincemaille, Pascal; Zabih, Ramin; Wang, Yi

    2010-01-01

    Among recent parallel MR imaging reconstruction advances, a Bayesian method called Edge-preserving Parallel Imaging with GRAph cut Minimization (EPIGRAM) has been demonstrated to significantly improve signal to noise ratio (SNR) compared to conventional regularized sensitivity encoding (SENSE) method. However, EPIGRAM requires a large number of iterations in proportion to the number of intensity labels in the image, making it computationally expensive for high dynamic range images. The objective of this study is to develop a Fast EPIGRAM reconstruction based on the efficient binary jump move algorithm that provides a logarithmic reduction in reconstruction time while maintaining image quality. Preliminary in vivo validation of the proposed algorithm is presented for 2D cardiac cine MR imaging and 3D coronary MR angiography at acceleration factors of 2-4. Fast EPIGRAM was found to provide similar image quality to EPIGRAM and maintain the previously reported SNR improvement over regularized SENSE, while reducing EPIGRAM reconstruction time by 25-50 times. PMID:20939095

  19. Bayesian Image Restoration Using a Large-Scale Total Patch Variation Prior

    Directory of Open Access Journals (Sweden)

    Yang Chen

    2011-01-01

    Edge-preserving Bayesian restorations using nonquadratic priors are often inefficient in restoring continuous variations and tend to produce block artifacts around edges in ill-posed inverse image restorations. To overcome this, we previously proposed a spatial adaptive (SA) prior with improved performance. However, restoration with this SA prior suffers from high computational cost and unguaranteed convergence. Concerning these issues, this paper proposes a Large-scale Total Patch Variation (LS-TPV) prior model for Bayesian image restoration. In this model, the prior for each pixel is defined as a singleton conditional probability, which is in a mixture prior form of one patch similarity prior and one weight entropy prior. A joint MAP estimation is thus built to ensure the monotonicity of the iteration. The intensive calculation of patch distances is greatly alleviated by parallelization with the Compute Unified Device Architecture (CUDA). Experiments with both simulated and real data validate the good performance of the proposed restoration.

  20. Bayesian-based Wavelet Shrinkage for SAR Image Despeckling Using Cycle Spinning

    Institute of Scientific and Technical Information of China (English)

    ZHANG De-xiang; GAO Qing-wei; CHEN Jun-ning

    2006-01-01

    A novel and efficient speckle noise reduction algorithm based on Bayesian wavelet shrinkage using cycle spinning is proposed. First, the sub-band decompositions of non-logarithmically transformed SAR images are shown. Then, a Bayesian wavelet shrinkage factor is applied to the decomposed data to estimate noise-free wavelet coefficients. The method is based on Mixture Gaussian Distribution (MGD) modeling of the sub-band coefficients. Finally, multi-resolution wavelet coefficients are reconstructed by wavelet thresholding using cycle spinning. Experimental results show that the proposed despeckling algorithm achieves an excellent balance between suppressing speckle effectively and preserving as much image detail and sharpness as possible. The new method showed higher performance than other speckle noise reduction techniques while minimizing the effect of pseudo-Gibbs phenomena.
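
    A skeletal version of cycle spinning in Python, assuming the PyWavelets package, is given below; plain soft thresholding stands in for the paper's MGD-based Bayesian shrinkage factor, and the image is assumed to have even dimensions so that reconstruction shapes match.

    import numpy as np
    import pywt

    def denoise_once(img, wavelet="db4", level=2, thr=0.1):
        coeffs = pywt.wavedec2(img, wavelet, level=level)
        coeffs = [coeffs[0]] + [
            tuple(pywt.threshold(d, thr, mode="soft") for d in detail)
            for detail in coeffs[1:]
        ]
        return pywt.waverec2(coeffs, wavelet)

    def cycle_spin_denoise(img, shifts=4, **kw):
        # Average shifted denoising results to suppress pseudo-Gibbs artefacts
        out = np.zeros_like(img, dtype=float)
        for dx in range(shifts):
            for dy in range(shifts):
                shifted = np.roll(np.roll(img, dx, axis=0), dy, axis=1)
                den = denoise_once(shifted, **kw)
                out += np.roll(np.roll(den, -dx, axis=0), -dy, axis=1)
        return out / shifts ** 2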

  1. Reference priors of nuisance parameters in Bayesian sequential population analysis

    CERN Document Server

    Bousquet, Nicolas

    2010-01-01

    Prior distributions elicited for modelling the natural fluctuations or the uncertainty on parameters of Bayesian fishery population models can be chosen from a vast range of statistical laws. Since the statistical framework is defined by observational processes, observational parameters enter into the estimation and must be considered random, similarly to parameters or states of interest like population levels or real catches. The former are thus perceived as nuisance parameters whose values are intrinsically linked to the considered experiment and which also require noninformative priors. In fishery research, the Jeffreys methodology has been presented by Millar (2002) as a practical way to elicit such priors. However, Jeffreys priors can have undesirable properties in multiparameter contexts. We therefore suggest using the elicitation method proposed by Berger and Bernardo to avoid the paradoxical results raised by Jeffreys priors. These benchmark priors are derived here in the framework of sequential population analysis.

  2. Bayesian large-scale structure inference and cosmic web analysis

    CERN Document Server

    Leclercq, Florent

    2015-01-01

    Surveys of the cosmic large-scale structure carry opportunities for building and testing cosmological theories about the origin and evolution of the Universe. This endeavor requires appropriate data assimilation tools, for establishing the contact between survey catalogs and models of structure formation. In this thesis, we present an innovative statistical approach for the ab initio simultaneous analysis of the formation history and morphology of the cosmic web: the BORG algorithm infers the primordial density fluctuations and produces physical reconstructions of the dark matter distribution that underlies observed galaxies, by assimilating the survey data into a cosmological structure formation model. The method, based on Bayesian probability theory, provides accurate means of uncertainty quantification. We demonstrate the application of BORG to the Sloan Digital Sky Survey data and describe the primordial and late-time large-scale structure in the observed volume. We show how the approach has led to the fi...

  3. BASE-9: Bayesian Analysis for Stellar Evolution with nine variables

    Science.gov (United States)

    Robinson, Elliot; von Hippel, Ted; Stein, Nathan; Stenning, David; Wagner-Kaiser, Rachel; Si, Shijing; van Dyk, David

    2016-08-01

    The BASE-9 (Bayesian Analysis for Stellar Evolution with nine variables) software suite recovers star cluster and stellar parameters from photometry and is useful for analyzing single-age, single-metallicity star clusters, binaries, or single stars, and for simulating such systems. BASE-9 uses a Markov chain Monte Carlo (MCMC) technique along with brute force numerical integration to estimate the posterior probability distribution for the age, metallicity, helium abundance, distance modulus, line-of-sight absorption, and parameters of the initial-final mass relation (IFMR) for a cluster, and for the primary mass, secondary mass (if a binary), and cluster probability for every potential cluster member. The MCMC technique is used for the cluster quantities (the first six items listed above) and numerical integration is used for the stellar quantities (the last three items in the above list).

  4. Bayesian analysis of factors associated with fibromyalgia syndrome subjects

    Science.gov (United States)

    Jayawardana, Veroni; Mondal, Sumona; Russek, Leslie

    2015-01-01

    Factors contributing to movement-related fear were assessed by Russek et al. (2014) for subjects with Fibromyalgia (FM), based on data collected by a national internet survey of community-based individuals. The study focused on the variables Activities-Specific Balance Confidence scale (ABC), Primary Care Post-Traumatic Stress Disorder screen (PC-PTSD), Tampa Scale of Kinesiophobia (TSK), a Joint Hypermobility Syndrome screen (JHS), Vertigo Symptom Scale (VSS-SF), Obsessive-Compulsive Personality Disorder (OCPD), pain, work status, and physical activity, derived from the "Revised Fibromyalgia Impact Questionnaire" (FIQR). The study presented in this paper revisits the same data with a Bayesian analysis in which appropriate priors were introduced for the variables selected in Russek's paper.

  5. Japanese Dairy Cattle Productivity Analysis using Bayesian Network Model (BNM

    Directory of Open Access Journals (Sweden)

    Iqbal Ahmed

    2016-11-01

    Japanese dairy cattle productivity analysis is carried out based on a Bayesian Network Model (BNM). Through an experiment with 280 Japanese anestrus Holstein dairy cows, it is found that estimating the presence of the estrous cycle using the BNM achieves almost 55% accuracy when all samples are considered, whereas almost 73% accuracy could be achieved using suspended likelihood in the sample datasets. Moreover, when the proposed BNM has higher confidence, the estimation accuracy lies between 93 and 100%. In addition, this research also reveals the optimal factors for detecting the presence of the estrous cycle among the 270 individual dairy cows. The objective estimation methods using the BNM offer a way to overcome the error of subjective estimation of the estrous cycle among these Japanese dairy cattle.

  6. A Bayesian analysis of regularised source inversions in gravitational lensing

    CERN Document Server

    Suyu, S H; Hobson, M P; Marshall, P J

    2006-01-01

    Strong gravitational lens systems with extended sources are of special interest because they provide additional constraints on the models of the lens systems. To use a gravitational lens system for measuring the Hubble constant, one would need to determine the lens potential and the source intensity distribution simultaneously. A linear inversion method to reconstruct a pixellated source distribution of a given lens potential model was introduced by Warren and Dye. In the inversion process, a regularisation on the source intensity is often needed to ensure a successful inversion with a faithful resulting source. In this paper, we use Bayesian analysis to determine the optimal regularisation constant (strength of regularisation) of a given form of regularisation and to objectively choose the optimal form of regularisation given a selection of regularisations. We consider and compare quantitatively three different forms of regularisation previously described in the literature for source inversions in gravitatio...

  7. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    CERN Document Server

    Scargle, Jeffrey D; Jackson, Brad; Chiang, James

    2012-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it - an improved and generalized version of Bayesian Blocks (Scargle 1998) - that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multi-variate time series data, analysis of vari...
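
    An implementation of this algorithm ships with Astropy, so applying Bayesian Blocks to event data takes a few lines; the event list below is synthetic.

    import numpy as np
    from astropy.stats import bayesian_blocks

    rng = np.random.default_rng(3)
    # Hypothetical photon arrival times: a quiet interval plus a short burst
    t = np.sort(np.concatenate([rng.uniform(0, 50, 200),
                                rng.uniform(20, 22, 150)]))

    # Optimal change points for event (point-process) data
    edges = bayesian_blocks(t, fitness="events", p0=0.01)
    print(edges)   # the burst appears as short, densely populated blocks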

  8. Objective Bayesian Comparison of Constrained Analysis of Variance Models.

    Science.gov (United States)

    Consonni, Guido; Paroli, Roberta

    2016-10-04

    In the social sciences we are often interested in comparing models specified by parametric equality or inequality constraints. For instance, when examining three group means μ1, μ2, μ3 through an analysis of variance (ANOVA), a model may specify that μ1 = μ2 = μ3, while another one may state that μ1 ≤ μ2 ≤ μ3, and finally a third model may instead suggest that all means are unrestricted. This is a challenging problem, because it involves a combination of nonnested models, as well as nested models having the same dimension. We adopt an objective Bayesian approach, requiring no prior specification from the user, and derive the posterior probability of each model under consideration. Our method is based on the intrinsic prior methodology, suitably modified to accommodate equality and inequality constraints. Focussing on normal ANOVA models, a comparative assessment is carried out through simulation studies. We also present an application to real data collected in a psychological experiment.

  9. Bayesian multi-scale smoothing of photon-limited images with applications to astronomy and medicine

    Science.gov (United States)

    White, John

    Multi-scale models for smoothing Poisson signals or images have gained much attention over the past decade. A new Bayesian model is developed using the concept of the Chinese restaurant process to find structures in two-dimensional images when performing image reconstruction or smoothing. This new model performs very well when compared to other leading methodologies for the same problem. It is developed and evaluated theoretically and empirically throughout Chapter 2. The newly developed Bayesian model is extended to three-dimensional images in Chapter 3. The third dimension has numerous different applications, such as different energy spectra, another spatial index, or possibly a temporal dimension. Empirically, this method shows promise in reducing error with the use of simulation studies. A further development removes background noise in the image. This removal can further reduce the error and is done using a modeling adjustment and post-processing techniques. These details are given in Chapter 4. Applications to real world problems are given throughout. Photon-based images are common in astronomical imaging due to the collection of different types of energy such as X-Rays. Applications to real astronomical images are given, and these consist of X-ray images from the Chandra X-ray observatory satellite. Diagnostic medicine uses many types of imaging such as magnetic resonance imaging and computed tomography that can also benefit from smoothing techniques such as the one developed here. Reducing the amount of radiation a patient takes will make images more noisy, but this can be mitigated through the use of image smoothing techniques. Both types of images represent the potential real world use for these methods.

  10. A Bayesian Seismic Hazard Analysis for the city of Naples

    Science.gov (United States)

    Faenza, Licia; Pierdominici, Simona; Hainzl, Sebastian; Cinti, Francesca R.; Sandri, Laura; Selva, Jacopo; Tonini, Roberto; Perfetti, Paolo

    2016-04-01

    In recent years many studies have focused on the determination and definition of the seismic, volcanic, and tsunamigenic hazard in the city of Naples. The reason is that the town of Naples with its neighboring area is one of the most densely populated places in Italy. In addition, the risk is increased by the type and condition of buildings and monuments in the city. It is crucial therefore to assess which active faults in Naples and the surrounding area could trigger an earthquake able to shake and damage the urban area. We collect data from the most reliable and complete databases of macroseismic intensity records (from 79 AD to present), and an active tectonic structure has been associated with each seismic event. Furthermore, a set of active faults located around the study area, well known from geological investigations but not associated with any earthquake, that could shake the city has also been taken into account in our studies. This geological framework is the starting point for our Bayesian seismic hazard analysis for the city of Naples. We show the feasibility of formulating the hazard assessment procedure to include the information of past earthquakes into the probabilistic seismic hazard analysis. This strategy allows us, on the one hand, to enlarge the information used in the evaluation of the hazard, from alternative models for the earthquake generation process to past shaking, and on the other hand, to explicitly account for all kinds of information and their uncertainties. The Bayesian scheme we propose is applied to evaluate the seismic hazard of Naples. We implement five different spatio-temporal models to parameterize the occurrence of earthquakes potentially dangerous for Naples. Subsequently we combine these hazard curves with ShakeMaps of past earthquakes that have been felt in Naples. The results are posterior hazard assessments for three exposure times, e.g., 50, 10 and 5 years, on a dense grid that covers the municipality of Naples, considering bedrock soil...

  11. An Exploratory Study Examining the Feasibility of Using Bayesian Networks to Predict Circuit Analysis Understanding

    Science.gov (United States)

    Chung, Gregory K. W. K.; Dionne, Gary B.; Kaiser, William J.

    2006-01-01

    Our research question was whether we could develop a feasible technique, using Bayesian networks, to diagnose gaps in student knowledge. Thirty-four college-age participants completed tasks designed to measure conceptual knowledge, procedural knowledge, and problem-solving skills related to circuit analysis. A Bayesian network was used to model…

  12. Bayesian Causal Analysis

    Institute of Scientific and Technical Information of China (English)

    王双成; 林士敏; 陆玉昌

    2000-01-01

    The Bayesian causal analysis includes two techniques: one takes advantage of Bayesian network structure learning under the Causal Markov assumption and the presupposition that hidden variables are absent, while the other uses the canonical-form influence diagram. The two techniques possess distinctive characteristics, and ought to be selected and put to use in the light of specific conditions.

  13. Spatial Dependence and Heterogeneity in Bayesian Factor Analysis : A Cross-National Investigation of Schwartz Values

    NARCIS (Netherlands)

    Stakhovych, Stanislav; Bijmolt, Tammo H. A.; Wedel, Michel

    2012-01-01

    In this article, we present a Bayesian spatial factor analysis model. We extend previous work on confirmatory factor analysis by including geographically distributed latent variables and accounting for heterogeneity and spatial autocorrelation. The simulation study shows excellent recovery of the...

  14. Evaluation of a partial genome screening of two asthma susceptibility regions using bayesian network based bayesian multilevel analysis of relevance.

    Directory of Open Access Journals (Sweden)

    Ildikó Ungvári

    Genetic studies indicate a high number of potential factors related to asthma. Based on earlier linkage analyses we selected the 11q13 and 14q22 asthma susceptibility regions, for which we designed a partial genome screening study using 145 SNPs in 1201 individuals (436 asthmatic children and 765 controls). The results were evaluated with traditional frequentist methods, and we applied a new statistical method, called Bayesian network based Bayesian multilevel analysis of relevance (BN-BMLA). This method uses a Bayesian network representation to provide detailed characterization of the relevance of factors, such as joint significance, the type of dependency, and multi-target aspects. We estimated posteriors for these relations within the Bayesian statistical framework, in order to assess whether a variable is directly relevant or its association is only mediated. With frequentist methods one SNP (rs3751464 in the FRMD6 gene) provided evidence for an association with asthma (OR = 1.43 (1.2-1.8); p = 3×10^-4). The possible role of the FRMD6 gene in asthma was also confirmed in an animal model and in human asthmatics. In the BN-BMLA analysis altogether 5 SNPs in 4 genes were found relevant in connection with the asthma phenotype: PRPF19 on chromosome 11, and FRMD6, PTGER2 and PTGDR on chromosome 14. In a subsequent step a partial dataset containing rhinitis and further clinical parameters was used, which allowed the analysis of the relevance of SNPs for asthma and multiple targets. These analyses suggested that SNPs in the AHNAK and MS4A2 genes were indirectly associated with asthma. This paper indicates that BN-BMLA explores the relevant factors more comprehensively than traditional statistical methods and extends the scope of strong-relevance-based methods to include partial relevance, global characterization of relevance, and multi-target relevance.

  15. Source detection in astronomical images by Bayesian model comparison

    Science.gov (United States)

    Frean, Marcus; Friedlander, Anna; Johnston-Hollitt, Melanie; Hollitt, Christopher

    2014-12-01

    The next generation of radio telescopes will generate exabytes of data on hundreds of millions of objects, making automated methods for the detection of astronomical objects ("sources") essential. Of particular importance are faint, diffuse objects embedded in noise. There is a pressing need for source finding software that identifies these sources, involves little manual tuning, yet is tractable to calculate. We first give a novel image discretisation method that incorporates uncertainty about how an image should be discretised. We then propose a hierarchical prior for astronomical images, which leads to a Bayes factor indicating how well a given region conforms to a model of source that is exceptionally unconstrained, compared to a model of background. This enables the efficient localisation of regions that are "suspiciously different" from the background distribution, so our method looks not for brightness but for anomalous distributions of intensity, which is much more general. The model of background can be iteratively improved by removing the influence on it of sources as they are discovered. The approach is evaluated by identifying sources in real and simulated data, and performs well on these measures: the Bayes factor is maximized at most real objects, while returning only a moderate number of false positives. In comparison to a catalogue constructed by widely-used source detection software with manual post-processing by an astronomer, our method found a number of dim sources that were missing from the "ground truth" catalogue.

  16. Bayesian nonparametric meta-analysis using Polya tree mixture models.

    Science.gov (United States)

    Branscum, Adam J; Hanson, Timothy E

    2008-09-01

    A common goal in meta-analysis is estimation of a single effect measure using data from several studies that are each designed to address the same scientific inquiry. Because studies are typically conducted in geographically disperse locations, recent developments in the statistical analysis of meta-analytic data involve the use of random effects models that account for study-to-study variability attributable to differences in environments, demographics, genetics, and other sources that lead to heterogeneity in populations. Stemming from asymptotic theory, study-specific summary statistics are modeled according to normal distributions with means representing latent true effect measures. A parametric approach subsequently models these latent measures using a normal distribution, which is strictly a convenient modeling assumption absent of theoretical justification. To eliminate the influence of overly restrictive parametric models on inferences, we consider a broader class of random effects distributions. We develop a novel hierarchical Bayesian nonparametric Polya tree mixture (PTM) model. We present methodology for testing the PTM versus a normal random effects model. These methods provide researchers a straightforward approach for conducting a sensitivity analysis of the normality assumption for random effects. An application involving meta-analysis of epidemiologic studies designed to characterize the association between alcohol consumption and breast cancer is presented, which together with results from simulated data highlight the performance of PTMs in the presence of nonnormality of effect measures in the source population.

  17. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    Science.gov (United States)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data collection and evaluation issues and a narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision making environment that is being sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).

  18. Spinal imaging and image analysis

    CERN Document Server

    Yao, Jianhua

    2015-01-01

    This book is instrumental to building a bridge between scientists and clinicians in the field of spine imaging by introducing state-of-the-art computational methods in the context of clinical applications.  Spine imaging via computed tomography, magnetic resonance imaging, and other radiologic imaging modalities, is essential for noninvasively visualizing and assessing spinal pathology. Computational methods support and enhance the physician’s ability to utilize these imaging techniques for diagnosis, non-invasive treatment, and intervention in clinical practice. Chapters cover a broad range of topics encompassing radiological imaging modalities, clinical imaging applications for common spine diseases, image processing, computer-aided diagnosis, quantitative analysis, data reconstruction and visualization, statistical modeling, image-guided spine intervention, and robotic surgery. This volume serves a broad audience as  contributions were written by both clinicians and researchers, which reflects the inte...

  19. Insights on the Bayesian spectral density method for operational modal analysis

    Science.gov (United States)

    Au, Siu-Kui

    2016-01-01

    This paper presents a study on the Bayesian spectral density method for operational modal analysis. The method makes Bayesian inference about the modal properties using the sample power spectral density (PSD) matrix averaged over independent sets of ambient data. In the typical case with a single set of data, the record is divided into non-overlapping segments that are assumed to be independent. This study is motivated by a recent paper that reveals a mathematical equivalence of the method with the Bayesian FFT method, which requires neither averaging concepts nor the independent-segment assumption. This study shows that the equivalence does not hold in practice because the theoretical long-data asymptotic distribution of the PSD matrix may not be valid: a single time history can be considered long for the Bayesian FFT method but not necessarily for the Bayesian PSD method, depending on the number of segments.
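
    For concreteness, the following sketch computes the segment-averaged sample PSD matrix on which the method operates; the segmentation, scaling convention, and synthetic record are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np

def averaged_psd_matrix(x, n_seg, dt=1.0):
    """Sample PSD matrix averaged over non-overlapping segments.

    x : (n_samples, n_channels) ambient vibration record.
    Returns freq (n_freq,) and S (n_freq, n_channels, n_channels),
    the segment-averaged PSD matrix used by the Bayesian PSD method.
    """
    n, nc = x.shape
    L = n // n_seg                      # samples per segment
    segs = x[:L * n_seg].reshape(n_seg, L, nc)
    X = np.fft.rfft(segs, axis=1) * dt  # scaled FFT of each segment
    freq = np.fft.rfftfreq(L, dt)
    # S[f] = average over segments of X[f] X[f]^H / (L*dt)
    S = np.einsum('sfi,sfj->fij', X, X.conj()) / (n_seg * L * dt)
    return freq, S

# illustrative two-channel ambient record
rng = np.random.default_rng(1)
x = rng.standard_normal((60000, 2))
freq, S = averaged_psd_matrix(x, n_seg=30, dt=0.01)
```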

  20. Bayesian Model Selection With Network Based Diffusion Analysis

    Directory of Open Access Journals (Sweden)

    Andrew Whalen

    2016-04-01

    A number of recent studies have used Network Based Diffusion Analysis (NBDA) to detect the role of social transmission in the spread of a novel behavior through a population. In this paper we present a unified framework for performing NBDA in a Bayesian setting, and demonstrate how the Watanabe-Akaike Information Criterion (WAIC) can be used for model selection. We present a specific example of applying this method to Time to Acquisition Diffusion Analysis (TADA). To examine the robustness of this technique, we performed a large-scale simulation study and found that NBDA using WAIC could recover the correct model of social transmission under a wide range of cases, including in the presence of random effects, individual-level variables, and alternative models of social transmission. This work suggests that NBDA is an effective and widely applicable tool for uncovering whether social transmission underpins the spread of a novel behavior, and may still provide accurate results even when key model assumptions are relaxed.

  1. A procedure for seiche analysis with Bayesian information criterion

    Science.gov (United States)

    Aichi, Masaatsu

    2016-04-01

    A seiche is a standing wave in an enclosed or semi-enclosed water body whose amplitude changes irregularly in time due to weather conditions and other factors, so extracting the seiche signal is not easy with the usual methods for time series analysis such as the fast Fourier transform (FFT). In this study, a new method for time series analysis with a Bayesian information criterion was developed to decompose seiche, tide, long-term trend, and residual components from time series data of tide stations. The method is based on maximum marginal likelihood estimation of the tide amplitudes, seiche amplitude, and trend components. The seiche amplitude and trend components were assumed to change gradually, in the sense that their second derivatives in time are close to zero; these assumptions were incorporated as prior distributions. The variances of the prior distributions were estimated by minimizing the Akaike-Bayes information criterion (ABIC). The frequency of the seiche was determined by Newton's method with an initial guess from the FFT. The accuracy of the proposed method was checked by analyzing synthetic time series data composed of known components, and the reproducibility of the original components was quite good. The proposed method was also applied to actual time series data of sea level observed by a tide station and the strain of coastal rock masses observed by a fiber Bragg grating sensor in Aburatsubo Bay, Japan. The seiche in the bay and the response of the rock masses were successfully extracted.

  2. Thermodynamically consistent Bayesian analysis of closed biochemical reaction systems

    Directory of Open Access Journals (Sweden)

    Goutsias John

    2010-11-01

    Background: Estimating the rate constants of a biochemical reaction system with known stoichiometry from noisy time series measurements of molecular concentrations is an important step for building predictive models of cellular function. Inference techniques currently available in the literature may produce rate constant values that defy necessary constraints imposed by the fundamental laws of thermodynamics, and may therefore lead to biochemical reaction systems whose concentration dynamics could not possibly occur in nature. Development of a thermodynamically consistent approach for estimating the rate constants of a biochemical reaction system is thus highly desirable. Results: We introduce a Bayesian analysis approach for computing thermodynamically consistent estimates of the rate constants of a closed biochemical reaction system with known stoichiometry given experimental data. Our method employs an appropriately designed prior probability density function that effectively integrates fundamental biophysical and thermodynamic knowledge into the inference problem, and takes into account experimental strategies for collecting informative observations of molecular concentrations through perturbations. The proposed method employs a maximization-expectation-maximization algorithm that provides thermodynamically feasible estimates of the rate constant values and computes appropriate measures of estimation accuracy. We demonstrate various aspects of the proposed method on synthetic data obtained by simulating a subset of a well-known model of the EGF/ERK signaling pathway, and examine its robustness under conditions that violate key assumptions. Software, coded in MATLAB®, which implements all Bayesian analysis techniques discussed in this paper, is available free of charge at http://www.cis.jhu.edu/~goutsias/CSS%20lab/software.html. Conclusions: Our approach provides an attractive statistical methodology for ...

  3. Combining morphological analysis and Bayesian Networks for strategic decision support

    CSIR Research Space (South Africa)

    De Waal, AJ

    2007-12-01

    ... on the data format. For example, the EM (Expectation-Maximisation) algorithm (Stuart & Norvig, 2003) may be used to estimate the probabilities of a BN if empirical data are available. Many knowledge engineering techniques exist to elicit and translate ...

  4. A hierarchical Bayesian-MAP approach to inverse problems in imaging

    Science.gov (United States)

    Raj, Raghu G.

    2016-07-01

    We present a novel approach to inverse problems in imaging based on a hierarchical Bayesian-MAP (HB-MAP) formulation. In this paper we focus on the difficult and basic inverse problem of multi-sensor (tomographic) imaging, wherein the source object of interest is viewed from multiple directions by independent sensors and the task is to reconstruct the image of the object with a high degree of fidelity from the recorded measurements. We employ a probabilistic graphical modeling extension of the compound Gaussian distribution as a global image prior within a hierarchical Bayesian inference procedure. Since this prior is general enough to subsume a wide class of priors, including those typically employed in compressive sensing (CS) algorithms, the HB-MAP algorithm offers a vehicle for extending the capabilities of current CS algorithms to truly global priors. After rigorously deriving the regression algorithm for solving our inverse problem from first principles, we demonstrate the performance of the HB-MAP algorithm on Monte Carlo trials and on real empirical data (natural scenes). In all cases our algorithm outperforms previous approaches in the literature, including filtered back-projection and a variety of state-of-the-art CS algorithms. We conclude with directions of future research emanating from this work.

  5. Using Bayesian analysis in repeated preclinical in vivo studies for a more effective use of animals.

    Science.gov (United States)

    Walley, Rosalind; Sherington, John; Rastrick, Joe; Detrait, Eric; Hanon, Etienne; Watt, Gillian

    2016-05-01

    Whilst innovative Bayesian approaches are increasingly used in clinical studies, in the preclinical area Bayesian methods appear to be rarely used in the reporting of pharmacology data. This is particularly surprising in the context of regularly repeated in vivo studies, where there is a considerable amount of data from historical control groups which has potential value. This paper describes our experience with introducing Bayesian analysis for such studies using a Bayesian meta-analytic predictive approach. This leads naturally either to an informative prior for a control group as part of a full Bayesian analysis of the next study, or to a predictive distribution that replaces a control group entirely. We use quality-control charts to illustrate study-to-study variation to the scientists and describe informative priors in terms of their approximate effective numbers of animals. We describe two case studies of animal models: the lipopolysaccharide-induced cytokine release model used in inflammation and the novel object recognition model used to screen cognitive enhancers, both of which show the advantage of a Bayesian approach over the standard frequentist analysis. We conclude that using Bayesian methods in stable repeated in vivo studies can result in a more effective use of animals, either by reducing the total number of animals used or by increasing the precision of key treatment differences. This will lead to clearer results and supports the "3Rs initiative" to Refine, Reduce and Replace animals in research. Copyright © 2016 John Wiley & Sons, Ltd.
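
    To illustrate the meta-analytic predictive idea, the sketch below derives an approximate predictive distribution for the control response of a new study from historical control means; it uses DerSimonian-Laird moment estimates in place of a full Bayesian fit, and the data and function name are hypothetical.

```python
import numpy as np

def map_predictive(y, se):
    """Approximate meta-analytic predictive (MAP) distribution for the
    control-group response of a new study, from historical control
    means y and their standard errors se. DerSimonian-Laird moment
    estimates stand in for a full Bayesian hierarchical fit."""
    w = 1.0 / se**2
    mu_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - mu_fixed) ** 2)               # Cochran's Q
    tau2 = max(0.0, (q - (len(y) - 1)) /
               (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1.0 / (se**2 + tau2)                        # random-effects weights
    mu = np.sum(w_re * y) / np.sum(w_re)
    var_mu = 1.0 / np.sum(w_re)
    # predictive SD for a new study's true control mean: between-study
    # heterogeneity plus uncertainty about the overall mean
    return mu, np.sqrt(tau2 + var_mu)

# hypothetical historical control means (e.g., cytokine release) and SEs
y_hist  = np.array([4.1, 3.8, 4.5, 4.0, 4.3, 3.9])
se_hist = np.array([0.30, 0.25, 0.40, 0.30, 0.35, 0.28])
mu, sd = map_predictive(y_hist, se_hist)
print(f"predictive prior for the next control group: N({mu:.2f}, {sd:.2f}^2)")
```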

  6. Guidance on the implementation and reporting of a drug safety Bayesian network meta-analysis.

    Science.gov (United States)

    Ohlssen, David; Price, Karen L; Xia, H Amy; Hong, Hwanhee; Kerman, Jouni; Fu, Haoda; Quartey, George; Heilmann, Cory R; Ma, Haijun; Carlin, Bradley P

    2014-01-01

    The Drug Information Association Bayesian Scientific Working Group (BSWG) was formed in 2011 with a vision to ensure that Bayesian methods are well understood and broadly utilized for design and analysis throughout the medical product development process, and to improve industrial, regulatory, and economic decision making. The group, composed of individuals from academia, industry, and regulatory agencies, has as its mission to facilitate the appropriate use, and contribute to the progress, of Bayesian methodology. In this paper, the safety sub-team of the BSWG explores the use of Bayesian methods applied to drug safety meta-analysis and network meta-analysis. Guidance is presented on the conduct and reporting of such analyses, different structural model assumptions are discussed, and prior specification is considered. The work is illustrated through a case study involving a network meta-analysis related to the cardiovascular safety of non-steroidal anti-inflammatory drugs.

  7. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    Science.gov (United States)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations while suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks (Scargle 1998), that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro, Donoho & Huo (2003). In the spirit of Reproducible Research (Donoho et al. 2008), all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.
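
    The core of the method is a dynamic program over all possible change points. The sketch below implements that recursion for binned counts under a simple Cash-statistic block fitness with a constant per-block prior penalty; it is a stripped-down reading of the published algorithm, and the function name and default penalty are illustrative.

```python
import numpy as np

def bayesian_blocks(t, x, ncp_prior=4.0):
    """Optimal piecewise-constant segmentation of binned count data
    (simplified form of the Scargle et al. dynamic program).

    t : bin centers (sorted), x : counts per bin.
    ncp_prior : penalty per additional change point.
    Returns the edges of the optimal blocks.
    """
    n = len(t)
    edges = np.concatenate([[t[0]], 0.5 * (t[1:] + t[:-1]), [t[-1]]])
    best = np.zeros(n)
    last = np.zeros(n, dtype=int)
    for r in range(n):
        # width and total count of every candidate block ending at cell r
        width = edges[r + 1] - edges[:r + 1]
        cnt = np.cumsum(x[:r + 1][::-1])[::-1]
        with np.errstate(divide='ignore', invalid='ignore'):
            # Cash-like log-likelihood fitness, N log(N/T), with 0 log 0 = 0
            fit = np.where(cnt > 0, cnt * (np.log(cnt) - np.log(width)), 0.0)
        total = fit - ncp_prior
        total[1:] += best[:r]
        last[r] = np.argmax(total)
        best[r] = total[last[r]]
    # backtrack the optimal change points
    cps, r = [], n
    while r > 0:
        cps.append(last[r - 1])
        r = last[r - 1]
    return edges[np.array(cps[::-1] + [n])]
```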

  8. Spatial Hierarchical Bayesian Analysis of the Historical Extreme Streamflow

    Science.gov (United States)

    Najafi, M. R.; Moradkhani, H.

    2012-04-01

    Analysis of the impact of climate change on extreme hydro-climatic events is crucial for future hydrologic/hydraulic designs and water resources decision making. The purpose of this study is to investigate how the parameters of the extreme value distribution change with time, reflecting the impact of climate change. We develop a statistical model using the observed streamflow data of the Columbia River Basin in the USA to estimate the changes of high flows as a function of time as well as other variables. The Generalized Pareto Distribution (GPD) is used to model flows above the 95th percentile during December through March for 31 gauge stations. In the process layer of the model, covariates including time, latitude, longitude, elevation, and basin area are considered to assess the sensitivity of the model to each variable. The Markov chain Monte Carlo (MCMC) method is used to estimate the parameters. The spatial hierarchical Bayesian technique models the GPD parameters spatially and borrows strength from other locations by pooling data together, while providing explicit estimates of the uncertainties at all stages of modeling.
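
    As a single-site stand-in for the spatial hierarchical model, the sketch below fits GPD parameters to threshold exceedances with a random-walk Metropolis sampler; the priors, step size, and synthetic flows are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def gpd_metropolis(exceed, n_iter=20000, step=0.05, seed=0):
    """Random-walk Metropolis sampler for Generalized Pareto parameters
    (shape xi, log-scale) fitted to threshold exceedances, with weak
    normal priors; a single-site stand-in for the spatial model."""
    rng = np.random.default_rng(seed)
    xi, logs = 0.1, np.log(exceed.mean())
    def logpost(xi, logs):
        lp = stats.genpareto.logpdf(exceed, xi, loc=0.0, scale=np.exp(logs)).sum()
        lp += stats.norm.logpdf(xi, 0, 1) + stats.norm.logpdf(logs, 0, 10)
        return lp
    cur = logpost(xi, logs)
    out = np.empty((n_iter, 2))
    for i in range(n_iter):
        xi_p = xi + step * rng.standard_normal()
        logs_p = logs + step * rng.standard_normal()
        prop = logpost(xi_p, logs_p)   # -inf if proposal violates GPD support
        if np.log(rng.random()) < prop - cur:
            xi, logs, cur = xi_p, logs_p, prop
        out[i] = xi, logs
    return out

# illustrative winter flows: model exceedances above the 95th percentile
flows = np.random.default_rng(2).gamma(2.0, 50.0, size=3000)
u = np.quantile(flows, 0.95)
samples = gpd_metropolis(flows[flows > u] - u)
```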

  9. A Bayesian model for the analysis of transgenerational epigenetic variation.

    Science.gov (United States)

    Varona, Luis; Munilla, Sebastián; Mouresan, Elena Flavia; González-Rodríguez, Aldemar; Moreno, Carlos; Altarriba, Juan

    2015-01-23

    Epigenetics has become one of the major areas of biological research. However, the degree of phenotypic variability that is explained by epigenetic processes still remains unclear. From a quantitative genetics perspective, the estimation of variance components is achieved by means of the information provided by the resemblance between relatives. In a previous study, this resemblance was described as a function of the epigenetic variance component and a reset coefficient that indicates the rate of dissipation of epigenetic marks across generations. Given these assumptions, we propose a Bayesian mixed model methodology that allows the estimation of epigenetic variance from a genealogical and phenotypic database. The methodology is based on the development of a matrix of epigenetic relationships (T) that depends on the reset coefficient. In addition, we present a simple procedure for the calculation of the inverse of this matrix (T^-1) and a Gibbs sampler algorithm that obtains posterior estimates of all the unknowns in the model. The new procedure was used with two simulated data sets and with a beef cattle database. In the simulated populations, the results of the analysis provided marginal posterior distributions that included the population parameters in the regions of highest posterior density. In the case of the beef cattle dataset, the posterior estimate of transgenerational epigenetic variability was very low and a model comparison test indicated that a model that did not include it was the most plausible.

  10. Using Bayesian Population Viability Analysis to Define Relevant Conservation Objectives.

    Directory of Open Access Journals (Sweden)

    Adam W Green

    Adaptive management provides a useful framework for managing natural resources in the face of uncertainty. An important component of adaptive management is identifying clear, measurable conservation objectives that reflect the desired outcomes of stakeholders. A common objective is to have a sustainable population, or metapopulation, but it can be difficult to quantify a threshold above which such a population is likely to persist. We performed a Bayesian metapopulation viability analysis (BMPVA) using a dynamic occupancy model to quantify the characteristics of two wood frog (Lithobates sylvaticus) metapopulations resulting in sustainable populations, and we demonstrate how the results could be used to define meaningful objectives that serve as the basis of adaptive management. We explored scenarios involving metapopulations with different numbers of patches (pools), using estimates of breeding occurrence and successful metamorphosis from two study areas to estimate the probability of quasi-extinction and calculate the proportion of vernal pools producing metamorphs. Our results suggest that ≥50 pools are required to ensure long-term persistence, with approximately 16% of pools producing metamorphs in stable metapopulations. We demonstrate one way to incorporate the BMPVA results into a utility function that balances the trade-offs between ecological and financial objectives, which can be used in an adaptive management framework to make optimal, transparent decisions. Our approach provides a framework for using a standard method (i.e., PVA) and available information to inform a formal decision process to determine optimal and timely management policies.

  11. A Bayesian analysis of the 2016 Pedernales (Ecuador) earthquake

    Science.gov (United States)

    Gombert, Baptiste; Duputel, Zacharie; Jolivet, Romain; Rivera, Luis; Simons, Mark; Jiang, Junle; Liang, Cunren; Fielding, Eric

    2017-04-01

    A Mw 7.8 earthquake struck Ecuador on April 16, 2016, causing significant damage and casualties. Long-period W-phase and Global CMT solutions suggest that fault slip for this event agrees with the convergence obliquity of the Ecuadorian subduction. We present a new co-seismic kinematic slip model obtained from the joint inversion of multiple observations in an unregularized and fully Bayesian framework. We use a comprehensive static dataset composed of several SAR interferograms, GPS static offsets, and tsunami waveforms from two nearby DART stations. The kinematic component of the rupture process is constrained by an extensive set of high-rate GPS and seismic data. Our solution includes the ensemble of all plausible slip models that are consistent with our prior information and fit the available observations within data and prediction uncertainties. We analyze the source process in light of the historical seismicity, in particular the Mw 7.8 1942 earthquake, whose rupture extent overlaps with the 2016 event. In addition, we conduct a probabilistic comparison of co-seismic slip with a stochastic interseismic coupling model obtained from GPS data. This analysis gives new insights into the processes at play within the Ecuadorian subduction margin.

  12. STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Scargle, Jeffrey D. [Space Science and Astrobiology Division, MS 245-3, NASA Ames Research Center, Moffett Field, CA 94035-1000 (United States); Norris, Jay P. [Physics Department, Boise State University, 2110 University Drive, Boise, ID 83725-1570 (United States); Jackson, Brad [The Center for Applied Mathematics and Computer Science, Department of Mathematics, San Jose State University, One Washington Square, MH 308, San Jose, CA 95192-0103 (United States); Chiang, James, E-mail: jeffrey.d.scargle@nasa.gov [W. W. Hansen Experimental Physics Laboratory, Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics and SLAC National Accelerator Laboratory, Stanford University, Stanford, CA 94305 (United States)

    2013-02-20

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations while suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks, that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research, all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.

  13. Bayesian Tracking of Visual Objects

    Science.gov (United States)

    Zheng, Nanning; Xue, Jianru

    Tracking objects in image sequences involves performing motion analysis at the object level, which is becoming an increasingly important technology in a wide range of computer video applications, including video teleconferencing, security and surveillance, video segmentation, and editing. In this chapter, we focus on sequential Bayesian estimation techniques for visual tracking. We first introduce the sequential Bayesian estimation framework, which acts as the theoretic basis for visual tracking. Then, we present approaches to constructing representation models for specific objects.
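
    The sequential Bayesian estimation framework referred to here is commonly realized as a particle filter. Below is a minimal bootstrap particle filter for a one-dimensional random-walk motion model with a Gaussian likelihood; the motion and measurement models, noise levels, and data are illustrative assumptions, not the chapter's.

```python
import numpy as np

def bootstrap_particle_filter(obs, n_particles=500, q=0.1, r=0.5, seed=0):
    """Minimal sequential Bayesian (bootstrap) particle filter for a
    1-D random-walk motion model with a Gaussian measurement likelihood."""
    rng = np.random.default_rng(seed)
    x = rng.normal(obs[0], 1.0, n_particles)   # initialize around first measurement
    track = []
    for z in obs:
        x = x + q * rng.standard_normal(n_particles)     # predict: motion model
        w = np.exp(-0.5 * ((z - x) / r) ** 2)            # weight: likelihood
        w /= w.sum()
        track.append(np.sum(w * x))                      # posterior-mean estimate
        idx = rng.choice(n_particles, n_particles, p=w)  # resample particles
        x = x[idx]
    return np.array(track)

# synthetic object position with noisy measurements
truth = np.cumsum(0.1 * np.random.default_rng(3).standard_normal(100))
obs = truth + 0.5 * np.random.default_rng(4).standard_normal(100)
est = bootstrap_particle_filter(obs)
```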

  14. How about a Bayesian M/EEG imaging method correcting for incomplete spatio-temporal priors

    DEFF Research Database (Denmark)

    Stahlhut, Carsten; Attias, Hagai T.; Sekihara, Kensuke;

    2013-01-01

    In this contribution we present a hierarchical Bayesian model, sAquavit, to tackle the highly ill-posed problem that follows with MEG and EEG source imaging. Our model facilitates spatio-temporal patterns through the use of both spatial and temporal basis functions. In contrast to most previous spatio-temporal inverse M/EEG models, the proposed model benefits from consisting of two source terms, namely, a spatio-temporal pattern term limiting the source configuration to a spatio-temporal subspace and a source-correcting term to pick up source activity not covered by the spatio...

  15. Retinal imaging and image analysis

    NARCIS (Netherlands)

    Abramoff, M.D.; Garvin, Mona K.; Sonka, Milan

    2010-01-01

    Many important eye diseases as well as systemic diseases manifest themselves in the retina. While a number of other anatomical structures contribute to the process of vision, this review focuses on retinal imaging and image analysis. Following a brief overview of the most prevalent causes of blindness ...

  16. Image Denoising via Bayesian Estimation of Statistical Parameter Using Generalized Gamma Density Prior in Gaussian Noise Model

    Science.gov (United States)

    Kittisuwan, Pichid

    2015-03-01

    The application of image processing in industry has shown remarkable success over the last decade, for example in security and telecommunication systems. The denoising of natural images corrupted by Gaussian noise is a classical problem in image processing, and denoising is an indispensable step in many pipelines. This paper is concerned with dual-tree complex wavelet-based image denoising using Bayesian techniques. A crux of Bayesian image denoising algorithms is estimating the statistical parameters of the image. Here, we employ maximum a posteriori (MAP) estimation to calculate the local observed variance, with a generalized Gamma density prior for the local observed variance and a Laplacian or Gaussian distribution for the noisy wavelet coefficients. Our selection of prior distribution is motivated by the efficient and flexible properties of the generalized Gamma density. The experimental results show that the proposed method yields good denoising results.
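
    The sketch below shows the general shape of such locally adaptive MAP shrinkage, with two simplifying substitutions: a plain real DWT instead of the dual-tree complex wavelet transform, and a Gaussian signal prior (giving a Wiener-type gain) instead of the generalized Gamma prior; all names and settings are illustrative.

```python
import numpy as np
import pywt
from scipy.ndimage import uniform_filter

def wavelet_map_denoise(img, sigma_n, wavelet='db4', win=5):
    """Locally adaptive MAP-style shrinkage of wavelet detail coefficients.
    A Gaussian signal prior (Wiener-type gain) stands in for the paper's
    generalized-Gamma prior, and a real DWT for the dual-tree CWT."""
    cA, details = pywt.dwt2(img, wavelet)
    shrunk = []
    for d in details:
        # local observed variance, corrected for the noise floor
        var_x = np.maximum(uniform_filter(d * d, win) - sigma_n**2, 0.0)
        shrunk.append(d * var_x / (var_x + sigma_n**2))   # MAP/Wiener gain
    return pywt.idwt2((cA, tuple(shrunk)), wavelet)

rng = np.random.default_rng(5)
clean = np.kron(np.eye(8), np.ones((16, 16)))   # toy piecewise image
noisy = clean + 0.2 * rng.standard_normal(clean.shape)
den = wavelet_map_denoise(noisy, sigma_n=0.2)
```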

  17. Comparison of Bayesian and Classical Analysis of Weibull Regression Model: A Simulation Study

    Directory of Open Access Journals (Sweden)

    İmran KURT ÖMÜRLÜ

    2011-01-01

    Objective: The purpose of this study was to compare the performance of the classical Weibull Regression Model (WRM) and Bayesian-WRM under varying conditions using Monte Carlo simulations. Material and Methods: Data were generated and analyzed with both classical WRM and Bayesian-WRM under varying informative priors and sample sizes using our simulation algorithm. In the simulation studies, sample sizes of n=50, 100 and 250 were used, and informative priors based on a normal prior distribution were selected for b1. For each situation, 1000 simulations were performed. Results: Bayesian-WRM with a proper informative prior showed good performance with very little bias. The bias of Bayesian-WRM increased as the priors moved away from the true parameter values, at all sample sizes. Furthermore, Bayesian-WRM yielded estimates with smaller standard errors than the classical WRM in both small and large samples when proper priors were used. Conclusion: In this simulation study, Bayesian-WRM showed better performance than the classical method when subjective data analysis was performed by incorporating expert opinion and historical knowledge about parameters. Consequently, Bayesian-WRM should be preferred when reliable informative priors exist; otherwise, the classical WRM should be preferred.

  18. Bayesian methods for quantitative trait loci mapping based on model selection: approximate analysis using the Bayesian information criterion.

    Science.gov (United States)

    Ball, R D

    2001-11-01

    We describe an approximate method for the analysis of quantitative trait loci (QTL) based on model selection from multiple regression models with trait values regressed on marker genotypes, using a modification of the easily calculated Bayesian information criterion to estimate the posterior probability of models with various subsets of markers as variables. The BIC-delta criterion, with the parameter delta increasing the penalty for additional variables in a model, is further modified to incorporate prior information, and missing values are handled by multiple imputation. Marginal probabilities for model sizes are calculated, and the posterior probability of nonzero model size is interpreted as the posterior probability of existence of a QTL linked to one or more markers. The method is demonstrated on analysis of associations between wood density and markers on two linkage groups in Pinus radiata. Selection bias, which is the bias that results from using the same data to both select the variables in a model and estimate the coefficients, is shown to be a problem for commonly used non-Bayesian methods for QTL mapping, which do not average over alternative possible models that are consistent with the data.
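
    The sketch below illustrates the flavor of this approach: each subset of markers is scored with a BIC carrying an extra per-variable penalty, the scores are converted to approximate posterior model probabilities, and the posterior probability of a nonzero model size follows by summation. The penalty form, data, and function names are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np
from itertools import combinations

def bic_delta(y, X, delta=2.0):
    """Modified BIC for a linear QTL model: standard BIC plus an extra
    penalty of delta*log(n) per marker (illustrative reading of BIC-delta)."""
    n, p = X.shape
    A = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = np.sum((y - A @ beta) ** 2)
    return n * np.log(rss / n) + (1 + delta) * p * np.log(n)

def marker_posteriors(y, markers, max_size=2, delta=2.0):
    """Posterior probabilities over all marker subsets up to max_size,
    obtained by normalizing exp(-BIC/2) across the model space."""
    cols = range(markers.shape[1])
    models = [()] + [c for k in range(1, max_size + 1)
                     for c in combinations(cols, k)]
    bics = np.array([bic_delta(y, markers[:, list(m)], delta) if m
                     else bic_delta(y, np.empty((len(y), 0)), delta)
                     for m in models])
    w = np.exp(-0.5 * (bics - bics.min()))
    return models, w / w.sum()

rng = np.random.default_rng(6)
G = rng.integers(0, 2, size=(200, 6)).astype(float)   # marker genotypes
y = 0.8 * G[:, 2] + rng.standard_normal(200)          # one true QTL
models, probs = marker_posteriors(y, G)
print("P(at least one linked QTL) =", probs[1:].sum())
```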

  19. JBASE: Joint Bayesian Analysis of Subphenotypes and Epistasis

    Science.gov (United States)

    Colak, Recep; Kim, TaeHyung; Kazan, Hilal; Oh, Yoomi; Cruz, Miguel; Valladares-Salgado, Adan; Peralta, Jesus; Escobedo, Jorge; Parra, Esteban J.; Kim, Philip M.; Goldenberg, Anna

    2016-01-01

    Motivation: Rapid advances in genotyping and genome-wide association studies have enabled the discovery of many new genotype–phenotype associations at the resolution of individual markers. However, these associations explain only a small proportion of the theoretically estimated heritability of most diseases. In this work, we propose an integrative mixture model called JBASE: joint Bayesian analysis of subphenotypes and epistasis. JBASE explores two major reasons for missing heritability: interactions between genetic variants, a phenomenon known as epistasis, and phenotypic heterogeneity, addressed via subphenotyping. Results: Our extensive simulations in a wide range of scenarios repeatedly demonstrate that JBASE can identify true underlying subphenotypes, including their associated variants and their interactions, with high precision. In the presence of phenotypic heterogeneity, JBASE has higher power and lower type 1 error than five state-of-the-art approaches. We applied our method to a sample of individuals from Mexico with Type 2 diabetes and discovered two novel epistatic modules, each involving two loci, that define two subphenotypes characterized by differences in body mass index and waist-to-hip ratio. We successfully replicated these subphenotypes and epistatic modules in an independent dataset from Mexico genotyped with a different platform. Availability and implementation: JBASE is implemented in C++, supported on Linux and is available at http://www.cs.toronto.edu/∼goldenberg/JBASE/jbase.tar.gz. The genotype data underlying this study are available upon approval by the ethics review board of the Medical Centre Siglo XXI. Please contact Dr Miguel Cruz at mcruzl@yahoo.com for assistance with the application. Contact: anna.goldenberg@utoronto.ca Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26411870

  20. Bayesian Analysis of Multiple Populations in Galactic Globular Clusters

    Science.gov (United States)

    Wagner-Kaiser, Rachel A.; Sarajedini, Ata; von Hippel, Ted; Stenning, David; Piotto, Giampaolo; Milone, Antonino; van Dyk, David A.; Robinson, Elliot; Stein, Nathan

    2016-01-01

    We use GO 13297 Cycle 21 Hubble Space Telescope (HST) observations and archival GO 10775 Cycle 14 HST ACS Treasury observations of Galactic Globular Clusters to find and characterize multiple stellar populations. Determining how globular clusters are able to create and retain enriched material to produce several generations of stars is key to understanding how these objects formed and how they have affected the structural, kinematic, and chemical evolution of the Milky Way. We employ a sophisticated Bayesian technique with an adaptive MCMC algorithm to simultaneously fit the age, distance, absorption, and metallicity for each cluster. At the same time, we also fit unique helium values to two distinct populations of the cluster and determine the relative proportions of those populations. Our unique numerical approach allows objective and precise analysis of these complicated clusters, providing posterior distribution functions for each parameter of interest. We use these results to gain a better understanding of multiple populations in these clusters and their role in the history of the Milky Way. Support for this work was provided by NASA through grant numbers HST-GO-10775 and HST-GO-13297 from the Space Telescope Science Institute, which is operated by AURA, Inc., under NASA contract NAS5-26555. This material is based upon work supported by the National Aeronautics and Space Administration under Grant NNX11AF34G issued through the Office of Space Science. This project was supported by the National Aeronautics & Space Administration through the University of Central Florida's NASA Florida Space Grant Consortium.

  1. A novel image fusion algorithm based on 2D scale-mixing complex wavelet transform and Bayesian MAP estimation for multimodal medical images

    Directory of Open Access Journals (Sweden)

    Abdallah Bengueddoudj

    2017-05-01

    In this paper, we propose a new image fusion algorithm based on the two-dimensional Scale-Mixing Complex Wavelet Transform (2D-SMCWT). The fusion of the detail 2D-SMCWT coefficients is performed via a Bayesian Maximum a Posteriori (MAP) approach by considering a trivariate statistical model for the local neighborhood of 2D-SMCWT coefficients. For the approximation coefficients, a new fusion rule based on Principal Component Analysis (PCA) is applied. We conduct several experiments using three different groups of multimodal medical images to evaluate the performance of the proposed method. The obtained results demonstrate the superiority of the proposed method over state-of-the-art fusion methods in terms of visual quality and several commonly used metrics. Robustness of the proposed method is further tested against different types of noise. The plots of fusion metrics establish the accuracy of the proposed fusion method.
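
    A much-simplified sketch of wavelet-domain fusion in this spirit follows: PCA-derived weights combine the approximation bands, while a maximum-absolute rule stands in for the trivariate MAP fusion of the detail bands, and a real DWT stands in for the 2D-SMCWT; everything here is an illustrative assumption.

```python
import numpy as np
import pywt

def fuse_images(a, b, wavelet='db2'):
    """Wavelet-domain fusion sketch: PCA weights for the approximation
    band, maximum-absolute selection for the detail bands."""
    cA1, d1 = pywt.dwt2(a, wavelet)
    cA2, d2 = pywt.dwt2(b, wavelet)
    # PCA rule: weights from the leading eigenvector of the 2x2 covariance
    cov = np.cov(np.stack([cA1.ravel(), cA2.ravel()]))
    w = np.linalg.eigh(cov)[1][:, -1]
    w = np.abs(w) / np.abs(w).sum()
    cA = w[0] * cA1 + w[1] * cA2
    # detail bands: keep the coefficient with the larger magnitude
    det = tuple(np.where(np.abs(x) >= np.abs(y), x, y) for x, y in zip(d1, d2))
    return pywt.idwt2((cA, det), wavelet)
```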

  2. Quantum System Identification: Hamiltonian Estimation using Spectral and Bayesian Analysis

    CERN Document Server

    Schirmer, S G

    2009-01-01

    Identifying the Hamiltonian of a quantum system from experimental data is considered. General limits on the identifiability of model parameters with limited experimental resources are investigated, and a specific Bayesian estimation procedure is proposed and evaluated for a model system where a priori information about the Hamiltonian's structure is available.

  3. Exploiting sensitivity analysis in Bayesian networks for consumer satisfaction study

    NARCIS (Netherlands)

    Jaronski, W.; Bloemer, J.M.M.; Vanhoof, K.; Wets, G.

    2004-01-01

    The paper presents an application of Bayesian network technology in an empirical customer satisfaction study. The findings of the study should provide insight into the importance of product/service dimensions in terms of the strength of their influence on overall satisfaction. To this end we apply a ...

  4. An analysis of the Bayesian track labelling problem

    NARCIS (Netherlands)

    Aoki, E.H.; Boers, Y.; Svensson, Lennart; Mandal, Pranab K.; Bagchi, Arunabha

    In multi-target tracking (MTT), the problem of assigning labels to tracks (track labelling) is widely covered in the literature, but its exact mathematical formulation, in terms of Bayesian statistics, has not yet been examined in detail. Doing so, however, may help us to understand how Bayes-optimal ...

  5. Estimating size and scope economies in the Portuguese water sector using the Bayesian stochastic frontier analysis.

    Science.gov (United States)

    Carvalho, Pedro; Marques, Rui Cunha

    2016-02-15

    This study searches for economies of size and scope in the Portuguese water sector, applying Bayesian and classical statistics to make inference in stochastic frontier analysis (SFA). It demonstrates the usefulness and advantages of Bayesian statistics for making inference in SFA over traditional SFA, which relies on classical statistics alone. The Bayesian methods overcome some problems that arise in the application of traditional SFA, such as bias in small samples and skewness of residuals, and in the present case study of the Portuguese water sector they provide more plausible and acceptable results. Based on the results obtained, we find important economies of output density, economies of size, economies of vertical integration, and economies of scope in the Portuguese water sector, pointing to substantial advantages in undertaking mergers by joining the retail and wholesale components and by joining the drinking water and wastewater services.

  6. Nonparametric Bayesian Clustering of Structural Whole Brain Connectivity in Full Image Resolution

    DEFF Research Database (Denmark)

    Ambrosen, Karen Marie Sandø; Albers, Kristoffer Jon; Dyrby, Tim B.

    2014-01-01

    Diffusion magnetic resonance imaging enables measuring the structural connectivity of the human brain at a high spatial resolution. Local noisy connectivity estimates can be derived using tractography approaches, and statistical models are necessary to quantify the brain's salient structural organization. However, statistically modeling these massive structural connectivity datasets is a computationally challenging task. We develop a high-performance inference procedure for the infinite relational model (a prominent nonparametric Bayesian model for clustering networks into structurally similar groups) that defines structural units at the resolution of statistical support. We apply the model to a network of structural brain connectivity in full image resolution with more than one hundred thousand regions (voxels at the gray-white matter boundary) and around one hundred million connections ...

  7. 3D mapping of buried underworld infrastructure using dynamic Bayesian network based multi-sensory image data fusion

    Science.gov (United States)

    Dutta, Ritaban; Cohn, Anthony G.; Muggleton, Jen M.

    2013-05-01

    The successful operation of buried infrastructure within urban environments is fundamental to the conservation of modern living standards. In this paper a novel multi-sensor image fusion framework based on a dynamic Bayesian network is proposed and investigated for automatic detection of buried underworld infrastructure. Experimental multi-sensor images were acquired for a known buried plastic water pipe using vibro-acoustic location methods and a ground-penetrating radar imaging system. Computationally intelligent and conventional image processing techniques were used to process the three types of sensory images. Independently extracted depth and location information from the different images regarding the target pipe were fused together using the dynamic Bayesian network to predict the most probable location and depth of the pipe. The outcome of this study was very encouraging, as the approach was able to detect the target pipe with high accuracy compared with the existing pipe survey map, and it was also applied successfully to produce a best-probable 3D buried asset map.

  8. A Statistical Theory for Shape Analysis of Curves and Surfaces with Applications in Image Analysis, Biometrics, Bioinformatics and Medical Diagnostics

    Science.gov (United States)

    2010-05-10

    ... targets in noisy/corrupted images (Bayesian active contours), finding shape models in point clouds derived from images, and shape analysis of facial surfaces ...

  9. Applied Bayesian Hierarchical Methods

    CERN Document Server

    Congdon, Peter D

    2010-01-01

    Bayesian methods facilitate the analysis of complex models and data structures. Emphasizing data applications, alternative modeling specifications, and computer implementation, this book provides a practical overview of methods for Bayesian analysis of hierarchical models.

  10. MASSIVE: A Bayesian analysis of giant planet populations around low-mass stars

    Science.gov (United States)

    Lannier, J.; Delorme, P.; Lagrange, A. M.; Borgniet, S.; Rameau, J.; Schlieder, J. E.; Gagné, J.; Bonavita, M. A.; Malo, L.; Chauvin, G.; Bonnefoy, M.; Girard, J. H.

    2016-12-01

    Context. Direct imaging has led to the discovery of several giant planet and brown dwarf companions. These imaged companions populate a domain of mass (above 1 MJup), separation (beyond 5 AU) and age that provides constraints on planetary formation models. Methods: We observed 58 young and nearby M-type dwarfs in L'-band with the VLT/NaCo instrument and used angular differential imaging algorithms to optimize the sensitivity to planetary-mass companions and to derive the best detection limits. We estimate the probability of detecting a planet as a function of its mass and physical separation around each target. We conduct a Bayesian analysis to determine the frequency of substellar companions orbiting low-mass stars, using a homogeneous sub-sample of 54 stars. Results: We derive the frequency of companions with masses in the range of 2-80 MJup, and of planetary-mass companions (2-14 MJup), at physical separations of 8 to 400 AU for both cases. Comparing our results with a previous survey targeting more massive stars, we find evidence that the frequency of substellar companions more massive than 1 MJup with a low mass ratio Q with respect to their host star might be independent of the mass of the host star.

  11. A Dynamic Bayesian Approach to Computational Laban Shape Quality Analysis

    Directory of Open Access Journals (Sweden)

    Dilip Swaminathan

    2009-01-01

    kinesiology. LMA (especially Effort/Shape emphasizes how internal feelings and intentions govern the patterning of movement throughout the whole body. As we argue, a complex understanding of intention via LMA is necessary for human-computer interaction to become embodied in ways that resemble interaction in the physical world. We thus introduce a novel, flexible Bayesian fusion approach for identifying LMA Shape qualities from raw motion capture data in real time. The method uses a dynamic Bayesian network (DBN to fuse movement features across the body and across time and as we discuss can be readily adapted for low-cost video. It has delivered excellent performance in preliminary studies comprising improvisatory movements. Our approach has been incorporated in Response, a mixed-reality environment where users interact via natural, full-body human movement and enhance their bodily-kinesthetic awareness through immersive sound and light feedback, with applications to kinesiology training, Parkinson's patient rehabilitation, interactive dance, and many other areas.

  12. A Bayesian Analysis of the Radioactive Releases of Fukushima

    DEFF Research Database (Denmark)

    Tomioka, Ryota; Mørup, Morten

    2012-01-01

    The Fukushima Daiichi disaster of 11 March 2011 is considered the largest nuclear accident since the 1986 Chernobyl disaster and has been rated at level 7 on the International Nuclear Event Scale. As different radioactive materials have different effects on the human body, it is important to know the types of nuclides and their levels of concentration from the recorded mixture of radiations in order to take the necessary measures. We presently formulate a Bayesian generative model for the data available on radioactive releases from the Fukushima Daiichi disaster across Japan. From the sparsely sampled measurements around the Fukushima Daiichi plant we establish that the model is able to account for the data, and we further demonstrate how the model extends to include all the available measurements recorded throughout Japan. The model can be considered a first attempt to apply unsupervised Bayesian learning in order to give a more...

  13. A NOVEL TECHNIQUE TO IMPROVE PHOTOMETRY IN CONFUSED IMAGES USING GRAPHS AND BAYESIAN PRIORS

    Energy Technology Data Exchange (ETDEWEB)

    Safarzadeh, Mohammadtaher [Department of Physics and Astronomy, Johns Hopkins University, 366 Bloomberg Center, 3400 North Charles Street, Baltimore, MD 21218 (United States); Ferguson, Henry C. [Space Telescope Science Institute, 3700 San Martin Boulevard, Baltimore, MD 21218 (United States); Lu, Yu [Kavli Institute for Particle Astrophysics and Cosmology, Stanford, CA 94309 (United States); Inami, Hanae [National Optical Astronomy Observatory, 950 North Cherry Avenue, Tucson, AZ 85719 (United States); Somerville, Rachel S., E-mail: mts@pha.jhu.edu [Department of Physics and Astronomy, Rutgers, The State University of New Jersey, 136 Frelinghuysen Road, Piscataway, NJ 08854 (United States)

    2015-01-10

    We present a new technique for overcoming confusion noise in deep far-infrared Herschel space telescope images, making use of prior information from shorter wavelengths (λ < 2 μm). For the deepest images obtained by Herschel, the flux limit due to source confusion is about a factor of three brighter than the flux limit due to instrumental noise and (smooth) sky background. We have investigated the possibility of de-confusing simulated Herschel PACS 160 μm images by using strong Bayesian priors on the positions and weak priors on the fluxes of sources. We find the blended sources, group them together, and simultaneously fit their fluxes, deriving the posterior probability distribution function of fluxes subject to these priors through Markov chain Monte Carlo (MCMC) sampling of the image. Assuming we can predict the FIR flux of sources based on the ultraviolet-optical part of their SEDs to within an order of magnitude, the simulations show that we can obtain reliable fluxes and uncertainties at least a factor of three fainter than the confusion noise limit of 3σ_c = 2.7 mJy in our simulated PACS-160 image. This technique could in principle be used to mitigate the effects of source confusion in any situation where one has prior information on the positions and plausible fluxes of blended sources. For Herschel, application of this technique will improve our ability to constrain the dust content in normal galaxies at high redshift.
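
    When positions are pinned down by the priors and the PSF is known, the per-group flux fit reduces, in its simplest Gaussian form, to the linear-algebra sketch below; the Gaussian PSF, Gaussian flux priors, and closed-form MAP solution are simplifying assumptions standing in for the paper's full MCMC treatment.

```python
import numpy as np

def deconfuse_fluxes(img, positions, prior_mu, prior_sigma, psf_fwhm, noise_sigma):
    """MAP fluxes for blended sources with known positions (strong priors)
    and Gaussian flux priors (weak priors), using a Gaussian PSF."""
    ny, nx = img.shape
    yy, xx = np.mgrid[:ny, :nx]
    s = psf_fwhm / 2.355
    # design matrix: one unit-flux PSF image per source
    A = np.stack([np.exp(-0.5 * ((xx - x) ** 2 + (yy - y) ** 2) / s**2).ravel()
                  / (2 * np.pi * s**2) for x, y in positions], axis=1)
    # MAP solution of (A f = img) with Gaussian priors on the fluxes f
    P = np.diag(1.0 / prior_sigma**2)
    lhs = A.T @ A / noise_sigma**2 + P
    rhs = A.T @ img.ravel() / noise_sigma**2 + prior_mu / prior_sigma**2
    fluxes = np.linalg.solve(lhs, rhs)
    cov = np.linalg.inv(lhs)             # posterior flux covariance
    return fluxes, np.sqrt(np.diag(cov))
```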

  14. Bayesian blind separation and deconvolution of dynamic image sequences using sparsity priors.

    Science.gov (United States)

    Tichy, Ondrej; Smidl, Vaclav

    2015-01-01

    A common problem when imaging 3-D objects onto an image plane is the superposition of the projected structures. In dynamic imaging, projection overlaps of organs and tissues complicate the extraction of signals specific to individual structures with different dynamics; the problem manifests itself also in dynamic tomography, as tissue mixtures are present in voxels. Separation of signals specific to dynamic structures belongs to the category of blind source separation; it is an underdetermined problem with many possible solutions, and existing separation methods select the solution that best matches their additional assumptions on the source model. We propose a novel blind source separation method based on a probabilistic model of dynamic image sequences, modeling each source's dynamics as the convolution of an input function and a source-specific kernel (modeling the organ impulse response or retention function). These assumptions are formalized as a Bayesian model with hierarchical priors and solved by the Variational Bayes method. The proposed prior distribution assigns higher probability to sparse source images and sparse convolution kernels. We show that the results of separation are relevant to selected tasks of dynamic renal scintigraphy: on both simulated and clinical data, the accuracy of tissue separation by the proposed method outperformed previously developed methods, as measured by the mean square and mean absolute errors of estimation of the simulated sources and of the sources separated by an expert physician. A MATLAB implementation of the algorithm is available for download.

  15. Use of SAMC for Bayesian analysis of statistical models with intractable normalizing constants

    KAUST Repository

    Jin, Ick Hoon

    2014-03-01

    Statistical inference for models with intractable normalizing constants has attracted much attention. During the past two decades, various approximation- or simulation-based methods have been proposed for the problem, such as the Monte Carlo maximum likelihood method and the auxiliary variable Markov chain Monte Carlo methods. The Bayesian stochastic approximation Monte Carlo algorithm specifically addresses this problem: it works by sampling from a sequence of approximate distributions with their average converging to the target posterior distribution, where the approximate distributions can be constructed using the stochastic approximation Monte Carlo algorithm. A strong law of large numbers is established for the Bayesian stochastic approximation Monte Carlo estimator under mild conditions. Compared to the Monte Carlo maximum likelihood method, the Bayesian stochastic approximation Monte Carlo algorithm is more robust to the initial guess of model parameters. Compared to the auxiliary variable MCMC methods, the Bayesian stochastic approximation Monte Carlo algorithm avoids the requirement for perfect samples, and thus can be applied to many models for which perfect sampling is not available or very expensive. The Bayesian stochastic approximation Monte Carlo algorithm also provides a general framework for approximate Bayesian analysis. © 2012 Elsevier B.V. All rights reserved.

  16. Nonlinear inversion of electrical resistivity imaging using pruning Bayesian neural networks

    Science.gov (United States)

    Jiang, Fei-Bo; Dai, Qian-Wei; Dong, Li

    2016-06-01

    Conventional artificial neural networks used to solve the electrical resistivity imaging (ERI) inversion problem suffer from overfitting and local minima. To solve these problems, we propose a pruning Bayesian neural network (PBNN) nonlinear inversion method and a sample design method based on the K-medoids clustering algorithm. In the sample design method, the training samples of the neural network are designed according to the prior information provided by the K-medoids clustering results; thus, the training process of the neural network is well guided. The proposed PBNN, based on Bayesian regularization, is used to select the hidden layer structure by assessing the effect of each hidden neuron on the inversion results. Then, the hyperparameter α_k, which is based on the generalized mean, is chosen to guide the pruning process according to the prior distribution of the training samples under the small-sample condition. The proposed algorithm is more efficient than other common adaptive regularization methods in geophysics. The inversion of synthetic data and field data suggests that the proposed method suppresses the noise in the neural network training stage and enhances generalization. The inversion results with the proposed method are better than those of the BPNN, RBFNN, and RRBFNN inversion methods as well as the conventional least squares inversion.

  17. Doubly Bayesian Analysis of Confidence in Perceptual Decision-Making.

    OpenAIRE

    Aitchison, L.; Bang, D.; Bahrami, B.; Latham, P. E.

    2015-01-01

    Humans stand out from other animals in that they are able to explicitly report on the reliability of their internal operations. This ability, which is known as metacognition, is typically studied by asking people to report their confidence in the correctness of some decision. However, the computations underlying confidence reports remain unclear. In this paper, we present a fully Bayesian method for directly comparing models of confidence. Using a visual two-interval forced-choice task, we te...

  18. Bayesian reconstruction of P(r) directly from two-dimensional detector images via a Markov chain Monte Carlo method.

    Science.gov (United States)

    Paul, Sudeshna; Friedman, Alan M; Bailey-Kellogg, Chris; Craig, Bruce A

    2013-04-01

    The interatomic distance distribution, P(r), is a valuable tool for evaluating the structure of a molecule in solution and represents the maximum structural information that can be derived from solution scattering data without further assumptions. Most current instrumentation for scattering experiments (typically CCD detectors) generates a finely pixelated two-dimensional image. In continuation of the standard practice with earlier one-dimensional detectors, these images are typically reduced to a one-dimensional profile of scattering intensities, I(q), by circular averaging of the two-dimensional image. Indirect Fourier transformation methods are then used to reconstruct P(r) from I(q). Substantial advantages in data analysis, however, could be achieved by directly estimating the P(r) curve from the two-dimensional images. This article describes a Bayesian framework, using a Markov chain Monte Carlo method, for estimating the parameters of the indirect transform, and thus P(r), directly from the two-dimensional images. Using simulated detector images, it is demonstrated that this method yields P(r) curves nearly identical to the reference P(r). Furthermore, an approach for evaluating spatially correlated errors (such as those that arise from a detector point spread function) is evaluated. Accounting for these errors further improves the precision of the P(r) estimation. Experimental scattering data, where no ground-truth reference P(r) is available, are used to demonstrate that this method yields a scattering and detector model that more closely reflects the two-dimensional data, as judged by smaller residuals in cross-validation, than P(r) obtained by indirect transformation of a one-dimensional profile. Finally, the method allows concurrent estimation of the beam center and Dmax, the longest interatomic distance in P(r), as part of the Bayesian Markov chain Monte Carlo method, reducing experimental effort and providing a well-defined protocol for these ...

  19. Bayesian analysis of the flutter margin method in aeroelasticity

    Science.gov (United States)

    Khalil, Mohammad; Poirel, Dominique; Sarkar, Abhijit

    2016-12-01

    A Bayesian statistical framework is presented for the Zimmerman-Weissenburger flutter margin method, which considers the uncertainties in aeroelastic modal parameters. The proposed methodology overcomes the limitations of the previously developed least-squares based estimation technique, which relies on a Gaussian approximation of the flutter margin probability density function (pdf). Using the measured free-decay responses at subcritical (pre-flutter) airspeeds, the joint non-Gaussian posterior pdf of the modal parameters is sampled using the Metropolis-Hastings (MH) Markov chain Monte Carlo (MCMC) algorithm. The posterior MCMC samples of the modal parameters are then used to obtain the flutter margin pdfs and finally the flutter speed pdf. The usefulness of the Bayesian flutter margin method is demonstrated using synthetic data generated from a two-degree-of-freedom pitch-plunge aeroelastic model. The robustness of the statistical framework is demonstrated using different sets of measurement data. It is shown that the probabilistic (Bayesian) approach reduces the number of test points required to provide a flutter speed estimate of a given accuracy and precision.
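
    The MH step at the heart of this approach is generic; the sketch below shows a random-walk Metropolis-Hastings sampler of the kind used to draw modal-parameter samples from a non-Gaussian posterior, here exercised on a toy two-parameter log-posterior (all numbers illustrative). Each posterior draw would then be propagated through the flutter margin formula.

```python
import numpy as np

def metropolis_hastings(log_post, x0, n_iter=50000, step=0.1, seed=0):
    """Generic random-walk Metropolis-Hastings sampler for a
    user-supplied log-posterior over a parameter vector."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, float)
    cur = log_post(x)
    chain = np.empty((n_iter, x.size))
    for i in range(n_iter):
        prop = x + step * rng.standard_normal(x.size)
        lp = log_post(prop)
        if np.log(rng.random()) < lp - cur:   # accept with prob min(1, ratio)
            x, cur = prop, lp
        chain[i] = x
    return chain

# toy posterior over two modal damping ratios (illustrative only)
log_post = lambda z: -0.5 * np.sum(((z - [0.02, 0.05]) / 0.005) ** 2)
chain = metropolis_hastings(log_post, x0=[0.03, 0.04])
```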

  20. Color Medical Image Analysis

    CERN Document Server

    Schaefer, Gerald

    2013-01-01

    Since the early 20th century, medical imaging has been dominated by monochrome imaging modalities such as x-ray, computed tomography, ultrasound, and magnetic resonance imaging. As a result, color information has been overlooked in medical image analysis applications. Recently, various medical imaging modalities that involve color information have been introduced. These include cervicography, dermoscopy, fundus photography, gastrointestinal endoscopy, microscopy, and wound photography. However, in comparison to monochrome images, the analysis of color images is a relatively unexplored area. The multivariate nature of color image data presents new challenges for researchers and practitioners as the numerous methods developed for monochrome images are often not directly applicable to multichannel images. The goal of this volume is to summarize the state-of-the-art in the utilization of color information in medical image analysis.

  1. The Relevance Voxel Machine (RVoxM): A Self-Tuning Bayesian Model for Informative Image-Based Prediction

    DEFF Research Database (Denmark)

    Sabuncu, Mert R.; Van Leemput, Koen

    2012-01-01

    This paper presents the relevance voxel machine (RVoxM), a dedicated Bayesian model for making predictions based on medical imaging data. In contrast to the generic machine learning algorithms that have often been used for this purpose, the method is designed to utilize a small number of spatially ...

  2. Improving in situ data acquisition using training images and a Bayesian mixture model

    Science.gov (United States)

    Abdollahifard, Mohammad Javad; Mariethoz, Gregoire; Pourfard, Mohammadreza

    2016-06-01

    Estimating the spatial distribution of physical processes using a minimum number of samples is of vital importance in earth science applications where sampling is costly. In recent years, training image-based methods have received a lot of attention for interpolation and simulation. However, training images have never been employed to optimize the spatial sampling process. In this paper, a sequential compressive sampling method is presented which decides the location of new samples based on a training image. First, a Bayesian mixture model is developed based on the training patterns. Then, using this model, unknown values are estimated based on a limited number of random samples. Since the model is probabilistic, it allows estimating local uncertainty conditionally on the available samples. Based on this, new samples are sequentially extracted from the locations with maximum uncertainty. Experiments show that, compared to a random sampling strategy, the proposed supervised sampling method significantly reduces the number of samples needed to achieve the same level of accuracy, even when the training image is not optimally chosen. The method has the potential to reduce the number of observations necessary for the characterization of environmental processes.

  3. Off-Grid Radar Coincidence Imaging Based on Variational Sparse Bayesian Learning

    Directory of Open Access Journals (Sweden)

    Xiaoli Zhou

    2016-01-01

    Full Text Available Radar coincidence imaging (RCI) is a high-resolution staring imaging technique motivated by classical optical coincidence imaging. In RCI, sparse reconstruction methods are commonly used to achieve better imaging results, while the performance guarantee rests on the general assumption that the scatterers are located at the prediscretized grid-cell centers. However, the widely existing off-grid problem degrades the RCI performance considerably. In this paper, an algorithm based on variational sparse Bayesian learning (VSBL) is developed to solve the off-grid RCI. Applying a Taylor expansion, the unknown true dictionary is approximated accurately by a linear model. Target reconstruction is then reformulated as a joint sparse recovery problem that recovers three groups of sparse coefficients over three known dictionaries, with the constraint of a common support shared by the groups. VSBL is then applied to solve the problem by assigning appropriate priors to the three groups of coefficients. Results of numerical experiments demonstrate that the algorithm achieves outstanding reconstruction performance, both in suppressing noise and in adapting to off-grid error.

  4. A Gibbs sampler for Bayesian analysis of site-occupancy data

    Science.gov (United States)

    Dorazio, Robert M.; Rodriguez, Daniel Taylor

    2012-01-01

    1. A Bayesian analysis of site-occupancy data containing covariates of species occurrence and species detection probabilities is usually completed using Markov chain Monte Carlo methods in conjunction with software programs that can implement those methods for any statistical model, not just site-occupancy models. Although these software programs are quite flexible, considerable experience is often required to specify a model and to initialize the Markov chain so that summaries of the posterior distribution can be estimated efficiently and accurately. 2. As an alternative to these programs, we develop a Gibbs sampler for Bayesian analysis of site-occupancy data that include covariates of species occurrence and species detection probabilities. This Gibbs sampler is based on a class of site-occupancy models in which probabilities of species occurrence and detection are specified as probit-regression functions of site- and survey-specific covariate measurements. 3. To illustrate the Gibbs sampler, we analyse site-occupancy data of the blue hawker, Aeshna cyanea (Odonata, Aeshnidae), a common dragonfly species in Switzerland. Our analysis includes a comparison of results based on Bayesian and classical (non-Bayesian) methods of inference. We also provide code (based on the R software program) for conducting Bayesian and classical analyses of site-occupancy data.
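
    The occupancy model couples two probit regressions (one for occurrence, one for detection), and the probit link is what makes a clean Gibbs sampler possible. As a sketch of the key conditional updates, here is the classic Albert-Chib data-augmentation Gibbs step for a single probit regression with a flat prior on the coefficients; extending it to the full site-occupancy model adds a latent occupancy indicator per site.

        import numpy as np
        from scipy.stats import truncnorm

        def probit_gibbs(X, y, n_iter=2000, seed=0):
            """Albert-Chib Gibbs sampler for probit regression."""
            rng = np.random.default_rng(seed)
            n, p = X.shape
            XtX_inv = np.linalg.inv(X.T @ X)
            beta = np.zeros(p)
            draws = np.empty((n_iter, p))
            for it in range(n_iter):
                mu = X @ beta
                # Latent z | beta, y: N(mu, 1) truncated at 0 by the outcome.
                lo = np.where(y == 1, -mu, -np.inf)
                hi = np.where(y == 1, np.inf, -mu)
                z = mu + truncnorm.rvs(lo, hi, size=n, random_state=rng)
                # beta | z: normal around the least-squares fit of z on X.
                beta = rng.multivariate_normal(XtX_inv @ X.T @ z, XtX_inv)
                draws[it] = beta
            return draws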

  5. Bayesian inference – a way to combine statistical data and semantic analysis meaningfully

    Directory of Open Access Journals (Sweden)

    Eila Lindfors

    2011-11-01

    Full Text Available This article focuses on presenting the possibilities of Bayesian modelling (Finite Mixture Modelling) in the semantic analysis of statistically modelled data. The probability of a hypothesis in relation to the data available is an important question in inductive reasoning. Bayesian modelling allows the researcher to use many models at a time and provides tools to evaluate the goodness of different models. The researcher should always be aware that there is no such thing as the exact probability of an exact event. This is the reason for using probabilistic models. Each model presents a different perspective on the phenomenon in focus, and the researcher has to choose the most probable model with a view to previous research and the knowledge available. The idea of Bayesian modelling is illustrated here by presenting two different sets of data, one from craft science research (n=167) and the other (n=63) from educational research (Lindfors, 2007, 2002). The principles of how to build models and how to combine different profiles are described in the light of the research mentioned. Bayesian modelling is an analysis based on calculating probabilities in relation to a specific set of quantitative data. It is a tool for handling data and interpreting it semantically. The reliability of the analysis arises from an argumentation of which model can be selected from the model space as the basis for an interpretation, and on which arguments. Keywords: method, sloyd, Bayesian modelling, student teachers. URN:NBN:no-29959

  6. Bayesian rupture imaging in a complex medium: The 29 May 2012 Emilia, Northern Italy, earthquake

    Science.gov (United States)

    Causse, Mathieu; Cultrera, Giovanna; Moreau, Ludovic; Herrero, André; Schiappapietra, Erika; Courboulex, Françoise

    2017-08-01

    We develop a new approach to image earthquake rupture from strong motion data. We use a large data set of aftershock waveforms, interpolated over the seismic fault, to obtain Green's function approximations. Next we deploy a Bayesian inversion method to characterize the slip distribution, the rupture velocity, the slip duration, and their uncertainties induced by errors in the Green's functions. The method is applied to the 29 May 2012 Mw 6 Emilia earthquake, which ruptured a fault buried below the Po Plain sediments (Northern Italy). Despite the particularly complex wave propagation, the near-field strong motion observations are well reproduced with 15 rupture parameters. The rupture and slip velocities were notably slow (about 0.5 Vs), which opens perspectives for earthquake rupture studies in areas where numerical simulations suffer from imprecise knowledge of the velocity structure.

  7. Whole organism high-content screening by label-free, image-based Bayesian classification for parasitic diseases.

    Directory of Open Access Journals (Sweden)

    Ross A Paveley

    Full Text Available Sole reliance on one drug, Praziquantel, for treatment and control of schistosomiasis raises concerns about the development of widespread resistance, prompting renewed interest in the discovery of new anthelmintics. To discover new leads we designed an automated, label-free, high-content, high-throughput screen (HTS) to assess drug-induced effects on in vitro cultured larvae (schistosomula) using bright-field imaging. Automatic image analysis and Bayesian prediction models define morphological damage, hit/non-hit prediction and larval phenotype characterization. Motility was also assessed from time-lapse images. In screening a 10,041-compound library, the HTS correctly detected 99.8% of the hits scored visually. A proportion of these larval hits were also active in an adult worm ex vivo screen and are the subject of ongoing studies. The method allows, for the first time, screening of large compound collections against schistosomes, and the methods are adaptable to other whole-organism and cell-based screening by morphology and motility phenotyping.

  8. Bayesian belief network analysis applied to determine the progression of temporomandibular disorders using MRI.

    Science.gov (United States)

    Iwasaki, H

    2015-01-01

    This study investigated the applicability of a Bayesian belief network (BBN) to MR images to diagnose temporomandibular disorders (TMDs). Our aim was to determine the progression of TMDs, focusing on how each finding affects the others. We selected 1.5-T MRI findings (33 variables) and diagnoses (bone changes and disc displacement) of patients with TMD from 2007 to 2008. There were a total of 295 cases with 590 sides of temporomandibular joints (TMJs). The data were modified according to the research diagnostic criteria for TMD. We compared the accuracy of the BBN using 11 algorithms (necessary path condition, path condition, greedy search-and-score with Bayesian information criterion, Chow-Liu tree, Rebane-Pearl polytree, tree-augmented naïve Bayes model, maximum log likelihood, Akaike information criterion, minimum description length, K2 and C4.5), a multiple regression analysis and an artificial neural network, using resubstitution validation and 10-fold cross-validation. There were 191 TMJs (32.4%) with bone changes and 340 (57.6%) with articular disc displacement. The BBN path condition algorithm was >99% accurate under both resubstitution validation and 10-fold cross-validation. However, the main advantage of a BBN is that it can represent the causal relationships between different findings and assign conditional probabilities, which can then be used to interpret the progression of TMD. Osteoarthritic bone changes progressed from the condyle to the articular fossa and finally to the mandibular bone contours. Disc displacement was directly related to severe bone changes. Early bone changes were not directly related to disc displacement. TMJ functional factors (condylar translation, bony space and disc form) and age mediated between bone changes and disc displacement.

  9. Spatial Dependence and Heterogeneity in Bayesian Factor Analysis: A Cross-National Investigation of Schwartz Values

    Science.gov (United States)

    Stakhovych, Stanislav; Bijmolt, Tammo H. A.; Wedel, Michel

    2012-01-01

    In this article, we present a Bayesian spatial factor analysis model. We extend previous work on confirmatory factor analysis by including geographically distributed latent variables and accounting for heterogeneity and spatial autocorrelation. The simulation study shows excellent recovery of the model parameters and demonstrates the consequences…

  10. Spatial Dependence and Heterogeneity in Bayesian Factor Analysis: A Cross-National Investigation of Schwartz Values

    NARCIS (Netherlands)

    Stakhovych, Stanislav; Bijmolt, Tammo H. A.; Wedel, Michel

    2012-01-01

    In this article, we present a Bayesian spatial factor analysis model. We extend previous work on confirmatory factor analysis by including geographically distributed latent variables and accounting for heterogeneity and spatial autocorrelation. The simulation study shows excellent recovery of the mo

  11. PAC-Bayesian Analysis of the Exploration-Exploitation Trade-off

    CERN Document Server

    Seldin, Yevgeny; Laviolette, François; Auer, Peter; Shawe-Taylor, John; Peters, Jan

    2011-01-01

    We develop a coherent framework for the integrative, simultaneous analysis of the exploration-exploitation and model order selection trade-offs. We improve over our preceding results on the same subject (Seldin et al., 2011) by combining PAC-Bayesian analysis with a Bernstein-type inequality for martingales. Such a combination is also of independent interest for studies of multiple simultaneously evolving martingales.

  12. Improving displacement signal-to-noise ratio for low-signal radiation force elasticity imaging using Bayesian techniques

    Science.gov (United States)

    Dumont, Douglas M.; Walsh, Kristy M.; Byram, Brett C.

    2017-01-01

    Radiation force-based elasticity imaging is currently being investigated as a possible diagnostic modality for a number of clinical tasks, including liver fibrosis staging and the characterization of cardiovascular tissue. In this study, we evaluate the relationship between peak displacement magnitude and image quality and propose using a Bayesian estimator to overcome the challenge of obtaining viable data in low displacement signal environments. Displacement data quality was quantified for two common radiation force-based applications: acoustic radiation force impulse imaging, which measures the displacement within the region of excitation, and shear wave elasticity imaging, which measures displacements outside the region of excitation. Performance as a function of peak displacement magnitude for acoustic radiation force impulse imaging was assessed in simulations and lesion phantoms by quantifying the signal-to-noise ratio (SNR) and contrast-to-noise ratio for varying peak displacement magnitudes. Overall performance for shear wave elasticity imaging was assessed in ex vivo chicken breast samples by measuring the displacement SNR as a function of distance from the excitation source. The results show that for any given displacement magnitude level, the Bayesian estimator can increase the SNR by approximately 9 dB over normalized cross-correlation and the contrast-to-noise ratio by a factor of two. We conclude from the results that a Bayesian estimator may be useful for increasing data quality in SNR-limited imaging environments. PMID:27157861

  13. Morphological image analysis

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, H. De; Kawakatsu, T.

    2000-01-01

    We describe a morphological image analysis method to characterize images in terms of geometry and topology. We present a method to compute the morphological properties of the objects building up the image and apply the method to triply periodic minimal surfaces and to images taken from polymer chemi
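
    In two dimensions, "geometry and topology" here boils down to the three Minkowski functionals: area, boundary length, and the Euler characteristic. A hedged sketch using scikit-image (the measure functions below exist in recent releases; exact availability depends on the installed version):

        import numpy as np
        from skimage.measure import euler_number, perimeter

        # Synthetic binary image: two discs, one with a hole punched in it.
        yy, xx = np.mgrid[0:128, 0:128]
        img = (((xx - 40) ** 2 + (yy - 40) ** 2 < 20 ** 2)
               | ((xx - 90) ** 2 + (yy - 90) ** 2 < 15 ** 2))
        img &= ~((xx - 40) ** 2 + (yy - 40) ** 2 < 8 ** 2)

        area = img.sum()                           # functional 0: area
        boundary = perimeter(img)                  # functional 1: boundary length
        euler = euler_number(img, connectivity=2)  # functional 2: 2 objects - 1 hole = 1
        print(area, boundary, euler)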

  14. Morphological image analysis

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, H; Kawakatsu, T; Landau, DP; Lewis, SP; Schuttler, HB

    2001-01-01

    We describe a morphological image analysis method to characterize images in terms of geometry and topology. We present a method to compute the morphological properties of the objects building up the image and apply the method to triply periodic minimal surfaces and to images taken from polymer chemi

  15. Quantum System Identification by Bayesian Analysis of Noisy Data: Beyond Hamiltonian Tomography

    CERN Document Server

    Schirmer, S G

    2009-01-01

    We consider how to characterize the dynamics of a quantum system from a restricted set of initial states and measurements using Bayesian analysis. Previous work has shown that Hamiltonian systems can be well estimated from analysis of noisy data. Here we show how to generalize this approach to systems with moderate dephasing in the eigenbasis of the Hamiltonian. We illustrate the process for a range of three-level quantum systems. The results suggest that the Bayesian estimation of the frequencies and dephasing rates is generally highly accurate, and that the main source of error lies in the reconstruction of the Hamiltonian basis.

  16. A Bayesian nonparametric method for prediction in EST analysis

    Directory of Open Access Journals (Sweden)

    Prünster Igor

    2007-09-01

    Full Text Available Abstract Background Expressed sequence tag (EST) analyses are a fundamental tool for gene identification in organisms. Given a preliminary EST sample from a certain library, several statistical prediction problems arise. In particular, it is of interest to estimate how many new genes can be detected in a future EST sample of given size and also to determine the gene discovery rate: these estimates represent the basis for deciding whether to proceed with sequencing the library and, in case of a positive decision, a guideline for selecting the size of the new sample. Such information is also useful for establishing sequencing efficiency in experimental design and for measuring the degree of redundancy of an EST library. Results In this work we propose a Bayesian nonparametric approach for tackling statistical problems related to EST surveys. In particular, we provide estimates for: (a) the coverage, defined as the proportion of unique genes in the library represented in the given sample of reads; (b) the number of new unique genes to be observed in a future sample; (c) the discovery rate of new genes as a function of the future sample size. The Bayesian nonparametric model we adopt conveys, in a statistically rigorous way, the available information into prediction. Our proposal has appealing properties over frequentist nonparametric methods, which become unstable when prediction is required for large future samples. EST libraries, previously studied with frequentist methods, are analyzed in detail. Conclusion The Bayesian nonparametric approach we undertake yields valuable tools for gene capture and prediction in EST libraries. The estimators we obtain do not feature the kind of drawbacks associated with frequentist estimators and are reliable for any size of the additional sample.
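
    For intuition, under the simplest one-parameter Dirichlet-process model (a special case of the class of nonparametric priors used in this line of work), the expected number of new genes in a future sample has a closed form. A sketch, with alpha a hypothetical precision parameter:

        from scipy.special import digamma

        def expected_new_genes(alpha, n, m):
            """E[# new distinct genes among m further ESTs | n reads observed],
            under a Dirichlet-process model with precision alpha."""
            return alpha * (digamma(alpha + n + m) - digamma(alpha + n))

        # After n = 10000 reads, expected discoveries in m = 5000 more:
        print(expected_new_genes(alpha=600.0, n=10000, m=5000))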

  17. Bayesian inference for inverse problems occurring in uncertainty analysis

    OpenAIRE

    Fu, Shuai; Celeux, Gilles; Bousquet, Nicolas; Couplet, Mathieu

    2012-01-01

    The inverse problem considered here is to estimate the distribution of a non-observed random variable $X$ from some noisy observed data $Y$ linked to $X$ through a time-consuming physical model $H$. Bayesian inference is considered to take into account prior expert knowledge on $X$ in a small sample size setting. A Metropolis-Hastings within Gibbs algorithm is proposed to compute the posterior distribution of the parameters of $X$ through a data augmentation process. Since calls to $H$ are qu...

  18. Bayesian analysis of the dynamic structure in China's economic growth

    Science.gov (United States)

    Kyo, Koki; Noda, Hideo

    2008-11-01

    To analyze the dynamic structure in China's economic growth during the period 1952-1998, we introduce a model of the aggregate production function for the Chinese economy that considers total factor productivity (TFP) and output elasticities as time-varying parameters. Specifically, this paper is concerned with the relationship between the rate of economic growth in China and the trend in TFP. Here, we consider the time-varying parameters as random variables and introduce smoothness priors to construct a set of Bayesian linear models for parameter estimation. The results of the estimation are in agreement with the movements in China's social economy, thus illustrating the validity of the proposed methods.

  19. Bayesian analysis of truncation errors in chiral effective field theory

    Science.gov (United States)

    Melendez, J.; Furnstahl, R. J.; Klco, N.; Phillips, D. R.; Wesolowski, S.

    2016-09-01

    In the Bayesian approach to effective field theory (EFT) expansions, truncation errors are derived from degree-of-belief (DOB) intervals for EFT predictions. By encoding expectations about the naturalness of EFT expansion coefficients for observables, this framework provides a statistical interpretation of the standard EFT procedure where truncation errors are estimated using the order-by-order convergence of the expansion. We extend and test previous calculations of DOB intervals for chiral EFT observables, examine correlations between contributions at different orders and energies, and explore methods to validate the statistical consistency of the EFT expansion parameter. Supported in part by the NSF and the DOE.
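
    As a toy illustration of a DOB interval, take the simplest "first omitted term" model with a fixed Gaussian prior on the expansion coefficients; the expansion parameter and coefficients below are invented, and the paper's full treatment marginalizes over the naturalness scale rather than fixing it as done here.

        import numpy as np
        from scipy.stats import norm

        Q = 0.3                                    # hypothetical expansion parameter
        c = np.array([1.0, -0.7, 1.3])             # extracted coefficients c_0..c_2
        cbar = np.sqrt(np.mean(c ** 2))            # fixed naturalness scale estimate
        k = len(c) - 1
        # 68% DOB interval for the first omitted term, c_{k+1} ~ N(0, cbar^2):
        dob68 = norm.ppf(0.84) * cbar * Q ** (k + 1)
        print(f"truncation error (68% DOB): +/- {dob68:.4f}")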

  1. Naive Bayesian classifiers for multinomial features: a theoretical analysis

    CSIR Research Space (South Africa)

    Van Dyk, E

    2007-11-01

    Full Text Available are individually binomial. Then, if we apply the naive Bayesian philosophy and define the likelihood function as the product of all binomial features, we get the likelihood function of class $c_r$: $p(\bar{x}|c_r) = \prod_{d=1}^{D} \frac{m!}{x_d!(m-x_d)!} \, p_{dc_r}^{x_d} \, q_{dc_r}^{m-x_d}$ (1), where $\bar{x}$ is the input vector, $x_d$ is the frequency count for feature $d$, $m$ is the number of Bernoulli trials done, $p_{dc_r}$ is the probability of feature $d$ occurring in a Bernoulli trial for class $c_r$, and $q_{dc_r} = 1 - p_{dc_r}$. The advantage of using eq. (1...
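
    Eq. (1) translates directly into a per-class log-likelihood. The snippet below is a minimal sketch in which the per-class feature probabilities are assumed to be given (in practice they would be estimated from training counts).

        import numpy as np
        from scipy.stats import binom

        def class_log_likelihood(x, m, p_c):
            """log p(x | c_r) under eq. (1): independent binomial features."""
            return binom.logpmf(x, m, p_c).sum()

        x = np.array([3, 0, 7])                     # feature frequency counts
        m = 10                                      # Bernoulli trials per feature
        p_classes = [np.array([0.3, 0.1, 0.6]),     # p_{d c_0}
                     np.array([0.1, 0.2, 0.7])]     # p_{d c_1}
        scores = [class_log_likelihood(x, m, p) for p in p_classes]
        print(int(np.argmax(scores)))               # most likely class, equal priors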

  2. [Meta analysis of the use of Bayesian networks in breast cancer diagnosis].

    Science.gov (United States)

    Simões, Priscyla Waleska; Silva, Geraldo Doneda da; Moretti, Gustavo Pasquali; Simon, Carla Sasso; Winnikow, Erik Paul; Nassar, Silvia Modesto; Medeiros, Lidia Rosi; Rosa, Maria Inês

    2015-01-01

    The aim of this study was to determine the accuracy of Bayesian networks in supporting breast cancer diagnoses. A systematic review and meta-analysis were carried out, including articles and papers published between January 1990 and March 2013. We included prospective and retrospective cross-sectional studies of the accuracy of diagnoses of breast lesions (target conditions) made using Bayesian networks (index test). Four primary studies that included 1,223 breast lesions were analyzed; 89.52% (444/496) of the breast cancer cases and 6.33% (46/727) of the benign lesions were positive based on the Bayesian network analysis. The area under the curve (AUC) for the summary receiver operating characteristic (SROC) curve was 0.97, with a Q* value of 0.92. Using Bayesian networks to diagnose malignant lesions raised the pretest probability of malignancy of 40.03% to a posttest probability of 90.05%, and decreased the probability of malignancy given a negative result to 6.44%. Therefore, our results demonstrate that Bayesian networks provide an accurate and non-invasive method to support breast cancer diagnosis.
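
    The reported probabilities follow from Bayes' rule applied to the pooled sensitivity and specificity. The worked check below uses the counts quoted in the abstract, so small rounding and pooling differences from the published figures are expected.

        sens = 444 / 496        # pooled sensitivity, ~0.8952
        spec = 1 - 46 / 727     # pooled specificity, ~0.9367
        prev = 0.4003           # pretest probability of malignancy

        # Posttest probability of malignancy after a positive network result:
        ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
        # Probability of malignancy remaining after a negative result:
        post_neg = (1 - sens) * prev / ((1 - sens) * prev + spec * (1 - prev))
        print(round(ppv, 4), round(post_neg, 4))    # ~0.904 and ~0.07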

  3. Bayesian analysis of time-series data under case-crossover designs: posterior equivalence and inference.

    Science.gov (United States)

    Li, Shi; Mukherjee, Bhramar; Batterman, Stuart; Ghosh, Malay

    2013-12-01

    Case-crossover designs are widely used to study short-term exposure effects on the risk of acute adverse health events. While the frequentist literature on this topic is vast, there is no Bayesian work in this general area. The contribution of this paper is twofold. First, the paper establishes Bayesian equivalence results that require characterization of the set of priors under which the posterior distributions of the risk ratio parameters based on a case-crossover and time-series analysis are identical. Second, the paper studies inferential issues under case-crossover designs in a Bayesian framework. Traditionally, a conditional logistic regression is used for inference on risk-ratio parameters in case-crossover studies. We consider instead a more general full likelihood-based approach which makes less restrictive assumptions on the risk functions. Formulation of a full likelihood leads to growth in the number of parameters proportional to the sample size. We propose a semi-parametric Bayesian approach using a Dirichlet process prior to handle the random nuisance parameters that appear in a full likelihood formulation. We carry out a simulation study to compare the Bayesian methods based on full and conditional likelihood with the standard frequentist approaches for case-crossover and time-series analysis. The proposed methods are illustrated through the Detroit Asthma Morbidity, Air Quality and Traffic study, which examines the association between acute asthma risk and ambient air pollutant concentrations.

  4. Bayesian Inference for Neural Electromagnetic Source Localization: Analysis of MEG Visual Evoked Activity

    Energy Technology Data Exchange (ETDEWEB)

    George, J.S.; Schmidt, D.M.; Wood, C.C.

    1999-02-01

    We have developed a Bayesian approach to the analysis of neural electromagnetic (MEG/EEG) data that can incorporate or fuse information from other imaging modalities and addresses the ill-posed inverse problem by sampling the many different solutions which could have produced the given data. From these samples one can draw probabilistic inferences about regions of activation. Our source model assumes a variable number of variable-size cortical regions of stimulus-correlated activity. An active region consists of locations on the cortical surface, within a sphere centered on some location in cortex. The number and radii of active regions can vary up to defined maximum values. The goal of the analysis is to determine the posterior probability distribution for the set of parameters that govern the number, location, and extent of active regions. Markov chain Monte Carlo is used to generate a large sample of sets of parameters distributed according to the posterior distribution. This sample is representative of the many different source distributions that could account for the given data, and allows identification of probable (i.e. consistent) features across solutions. Examples of the use of this analysis technique with both simulated and empirical MEG data are presented.

  5. Bayesian orientation estimate and structure information from sparse single-molecule x-ray diffraction images.

    Science.gov (United States)

    Walczak, Michał; Grubmüller, Helmut

    2014-08-01

    We developed a Bayesian method to extract macromolecular structure information from sparse single-molecule x-ray free-electron laser diffraction images. The method addresses two possible scenarios. First, using a "seed" structural model, the molecular orientation is determined for each of the provided diffraction images, which are then averaged in three-dimensional reciprocal space. Subsequently, the real-space electron density is determined using a relaxed averaged alternating reflections algorithm. In the second approach, the probability that the "seed" model fits the given set of diffraction images as a whole is determined and used to distinguish between proposed structures. We show that for a given x-ray intensity, unexpectedly, the achievable resolution increases with molecular mass, such that structure determination should be more challenging for small molecules than for larger ones. For a sufficiently large number of recorded photons (>200) per diffraction image an M^{1/6} scaling is seen. Using synthetic diffraction data for a small glutathione molecule as a challenging test case, successful determination of electron density was demonstrated for 20,000 diffraction patterns with random orientations and an average of 82 elastically scattered and recorded photons per image, also in the presence of up to 50% background noise. The second scenario is exemplified and assessed for three biomolecules of different sizes. In all cases, determining the probability of a structure given the set of diffraction patterns allowed successful discrimination between different conformations of the test molecules. A structure model of the glutathione tripeptide was refined in a Monte Carlo simulation from a random starting conformation. Further, the method effectively distinguished between three differently arranged immunoglobulin domains of a titin molecule and also between different states of a ribosome in a tRNA translocation process. These results show that the proposed method is ...

  6. Introduction to Bayesian statistics

    CERN Document Server

    Bolstad, William M

    2017-01-01

    There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly-added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly-developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for a Multivariate Normal mean vector; Bayesian inference for the Multiple Linear Regression Model; and Computati...

  7. A Bayesian approach for solar resource potential assessment using satellite images

    Science.gov (United States)

    Linguet, L.; Atif, J.

    2014-03-01

    The need for a more sustainable and more protective development opens new possibilities for renewable energy. Among the different renewable energy sources, the direct conversion of sunlight into electricity by solar photovoltaic (PV) technology seems to be the most promising and represents a technically viable solution to energy demands. But the installation and deployment of PV energy require solar resource data for utility planning, accommodating grid capacity, and formulating future adaptive policies. Currently, the best approach to determine the solar resource at a given site is based on the use of satellite images. However, the computation of the solar resource (a non-linear process) from satellite images is unfortunately not straightforward. From a signal processing point of view, it falls within non-stationary, non-linear/non-Gaussian dynamical inverse problems. In this paper, we propose a Bayesian approach combining satellite images and in situ data. We propose original observation and transition functions taking advantage of the characteristics of both types of data involved. A simulation study of solar irradiance is carried out with this method, and a French Guiana solar resource potential map for the year 2010 is given.
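
    For this class of non-stationary, non-linear/non-Gaussian filtering problems, the standard computational tool is sequential Monte Carlo. The sketch below is a generic bootstrap particle filter; its transition and observation models are simple placeholders, not the original observation and transition functions proposed in the paper.

        import numpy as np

        def bootstrap_particle_filter(obs, n_particles=1000, seed=0):
            rng = np.random.default_rng(seed)
            x = rng.normal(0.0, 1.0, n_particles)        # initial particle cloud
            means = []
            for y in obs:
                x = 0.95 * x + rng.normal(0.0, 0.3, n_particles)  # transition model
                w = np.exp(-0.5 * (y - x) ** 2 / 0.5 ** 2)        # observation weight
                w /= w.sum()
                x = x[rng.choice(n_particles, n_particles, p=w)]  # resample
                means.append(x.mean())                            # filtered estimate
            return np.array(means)

        estimates = bootstrap_particle_filter(np.array([0.1, 0.4, 0.2, 0.6]))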

  8. The Bayesian image retrieval system, PicHunter: theory, implementation, and psychophysical experiments.

    Science.gov (United States)

    Cox, I J; Miller, M L; Minka, T P; Papathomas, T V; Yianilos, P N

    2000-01-01

    This paper presents the theory, design principles, implementation and performance results of PicHunter, a prototype content-based image retrieval (CBIR) system. In addition, this document presents the rationale, design and results of psychophysical experiments that were conducted to address some key issues that arose during PicHunter's development. The PicHunter project makes four primary contributions to research on CBIR. First, PicHunter represents a simple instance of a general Bayesian framework which we describe for using relevance feedback to direct a search. With an explicit model of what users would do, given the target image they want, PicHunter uses Bayes's rule to predict the target they want, given their actions. This is done via a probability distribution over possible image targets, rather than by refining a query. Second, an entropy-minimizing display algorithm is described that attempts to maximize the information obtained from a user at each iteration of the search. Third, PicHunter makes use of hidden annotation rather than a possibly inaccurate/inconsistent annotation structure that the user must learn and make queries in. Finally, PicHunter introduces two experimental paradigms to quantitatively evaluate the performance of the system, and psychophysical experiments are presented that support the theoretical claims.
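
    The heart of the framework is a Bayes-rule update of a belief distribution over candidate targets after every user action. A toy sketch, with a hypothetical user model supplying the action likelihoods:

        import numpy as np

        def update_belief(prior, action_likelihood):
            """P(target=i | action) is proportional to P(action | target=i) * P(target=i)."""
            post = prior * action_likelihood
            return post / post.sum()

        n_images = 5
        belief = np.full(n_images, 1.0 / n_images)      # uniform initial belief
        # Hypothetical user model: probability of the observed selection
        # under each candidate target (e.g. from feature similarity).
        likelihood = np.array([0.9, 0.4, 0.2, 0.1, 0.1])
        belief = update_belief(belief, likelihood)
        print(belief)                                   # mass shifts toward image 0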

  9. A Bayesian Analysis of the Ages of Four Open Clusters

    CERN Document Server

    Jeffery, Elizabeth J; van Dyk, David A; Stenning, David C; Robinson, Elliot; Stein, Nathan; Jefferys, W H

    2016-01-01

    In this paper we apply a Bayesian technique to determine the best fit of stellar evolution models to find the main sequence turn off age and other cluster parameters of four intermediate-age open clusters: NGC 2360, NGC 2477, NGC 2660, and NGC 3960. Our algorithm utilizes a Markov chain Monte Carlo technique to fit these various parameters, objectively finding the best-fit isochrone for each cluster. The result is a high-precision isochrone fit. We compare these results with those of traditional "by-eye" isochrone fitting methods. By applying this Bayesian technique to NGC 2360, NGC 2477, NGC 2660, and NGC 3960, we determine the ages of these clusters to be 1.35 +/- 0.05, 1.02 +/- 0.02, 1.64 +/- 0.04, and 0.860 +/- 0.04 Gyr, respectively. The results of this paper continue our effort to determine cluster ages to higher precision than that offered by these traditional methods of isochrone fitting.

  10. Risk Analysis of New Product Development Using Bayesian Networks

    Directory of Open Access Journals (Sweden)

    MohammadRahim Ramezanian

    2012-06-01

    Full Text Available The process of presenting new product development (NPD) to market is of great importance due to the variability of competitive rules in the business world. Product development teams face a lot of pressure due to the rapid growth of technology, the increased risk-taking of world markets and increasing variations in customers' needs. The process of NPD is always associated with high uncertainties and complexities. To be successful in completing an NPD project, existing risks should be identified and assessed. Bayesian networks, as a strong approach to decision-making modeling in uncertain situations, have attracted many researchers in various areas; these networks provide a decision-supporting system for problems with uncertainty or probabilistic reasoning. In this paper, the risk factors present in product development were first identified in an electric company; the Bayesian network was then utilized and their interrelationships modeled to evaluate the risk in the process. To determine the prior and conditional probabilities of the nodes, the viewpoints of experts in this area were applied. The risks in this process were divided into High (H), Medium (M) and Low (L) groups and analyzed with the Agena Risk software. The findings derived from the software output indicate that production of the desired product carries relatively high risk. In addition, predictive support and diagnostic support were performed on the model with two different scenarios.

  11. Risk Analysis of New Product Development Using Bayesian Networks

    Directory of Open Access Journals (Sweden)

    Mohammad Rahim Ramezanian

    2012-01-01

    Full Text Available The process of presenting new product development (NPD) to market is of great importance due to the variability of competitive rules in the business world. Product development teams face a lot of pressure due to the rapid growth of technology, the increased risk-taking of world markets and increasing variations in customers' needs. The process of NPD is always associated with high uncertainties and complexities. To be successful in completing an NPD project, existing risks should be identified and assessed. Bayesian networks, as a strong approach to decision-making modeling in uncertain situations, have attracted many researchers in various areas; these networks provide a decision-supporting system for problems with uncertainty or probabilistic reasoning. In this paper, the risk factors present in product development were first identified in an electric company; the Bayesian network was then utilized and their interrelationships modeled to evaluate the risk in the process. To determine the prior and conditional probabilities of the nodes, the viewpoints of experts in this area were applied. The risks in this process were divided into High (H), Medium (M) and Low (L) groups and analyzed with the Agena Risk software. The findings derived from the software output indicate that production of the desired product carries relatively high risk. In addition, predictive support and diagnostic support were performed on the model with two different scenarios.

  12. Bayesian item fit analysis for unidimensional item response theory models.

    Science.gov (United States)

    Sinharay, Sandip

    2006-11-01

    Assessing item fit for unidimensional item response theory models for dichotomous items has always been an issue of enormous interest, but there exists no unanimously agreed item fit diagnostic for these models, and hence there is room for further investigation of the area. This paper employs the posterior predictive model-checking method, a popular Bayesian model-checking tool, to examine item fit for the above-mentioned models. An item fit plot, comparing the observed and predicted proportion-correct scores of examinees with different raw scores, is suggested. This paper also suggests how to obtain posterior predictive p-values (which are natural Bayesian p-values) for the item fit statistics of Orlando and Thissen that summarize numerically the information in the above-mentioned item fit plots. A number of simulation studies and a real data application demonstrate the effectiveness of the suggested item fit diagnostics. The suggested techniques seem to have adequate power and reasonable Type I error rate, and psychometricians will find them promising.
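
    A posterior predictive p-value of the kind used here is straightforward to compute once posterior draws are available: simulate replicated data from each draw and count how often the replicated discrepancy exceeds the observed one. A generic sketch with a toy beta-binomial model and a hypothetical discrepancy measure:

        import numpy as np

        def posterior_predictive_pvalue(y, draws, simulate, T, seed=0):
            """PPP = P(T(y_rep) >= T(y)) over posterior draws."""
            rng = np.random.default_rng(seed)
            t_obs = T(y)
            t_rep = np.array([T(simulate(theta, rng)) for theta in draws])
            return float(np.mean(t_rep >= t_obs))

        y = np.array([7, 8, 6, 9, 2])                 # observed successes, 10 trials each
        rng = np.random.default_rng(1)
        draws = rng.beta(1 + y.sum(), 1 + 50 - y.sum(), size=500)  # posterior of p
        ppp = posterior_predictive_pvalue(
            y, draws,
            simulate=lambda p, rng: rng.binomial(10, p, size=5),
            T=lambda d: d.var())                      # discrepancy: spread across items
        print(ppp)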

  13. Bayesian analysis of deterministic and stochastic prisoner's dilemma games

    Directory of Open Access Journals (Sweden)

    Howard Kunreuther

    2009-08-01

    Full Text Available This paper compares the behavior of individuals playing a classic two-person deterministic prisoner's dilemma (PD) game with choice data obtained from repeated interdependent security prisoner's dilemma games with varying probabilities of loss and the ability to learn (or not learn) about the actions of one's counterpart, an area of recent interest in experimental economics. This novel data set, from a series of controlled laboratory experiments, is analyzed using Bayesian hierarchical methods, the first application of such methods in this research domain. We find that individuals are much more likely to be cooperative when payoffs are deterministic than when the outcomes are probabilistic. A key factor explaining this difference is that subjects in a stochastic PD game respond not just to what their counterparts did but also to whether or not they suffered a loss. These findings are interpreted in the context of behavioral theories of commitment, altruism and reciprocity. The work provides a linkage between Bayesian statistics, experimental economics, and consumer psychology.

  14. Estimating size and scope economies in the Portuguese water sector using the Bayesian stochastic frontier analysis

    Energy Technology Data Exchange (ETDEWEB)

    Carvalho, Pedro, E-mail: pedrocarv@coc.ufrj.br [Computational Modelling in Engineering and Geophysics Laboratory (LAMEMO), Department of Civil Engineering, COPPE, Federal University of Rio de Janeiro, Av. Pedro Calmon - Ilha do Fundão, 21941-596 Rio de Janeiro (Brazil); Center for Urban and Regional Systems (CESUR), CERIS, Instituto Superior Técnico, Universidade de Lisboa, Av. Rovisco Pais, 1049-001 Lisbon (Portugal); Marques, Rui Cunha, E-mail: pedro.c.carvalho@tecnico.ulisboa.pt [Center for Urban and Regional Systems (CESUR), CERIS, Instituto Superior Técnico, Universidade de Lisboa, Av. Rovisco Pais, 1049-001 Lisbon (Portugal)

    2016-02-15

    This study aims to search for economies of size and scope in the Portuguese water sector, applying Bayesian and classical statistics to make inference in stochastic frontier analysis (SFA). The study demonstrates the usefulness and advantages of applying Bayesian statistics for making inference in SFA over traditional SFA, which uses only classical statistics. The resulting Bayesian methods allow overcoming some problems that arise in the application of traditional SFA, such as bias in small samples and skewness of residuals. In the present case study of the water sector in Portugal, these Bayesian methods provide more plausible and acceptable results. Based on the results obtained, we found that there are important economies of output density, economies of size, economies of vertical integration and economies of scope in the Portuguese water sector, pointing to the huge advantages of undertaking mergers by joining the retail and wholesale components and by joining the drinking water and wastewater services. - Highlights: • This study aims to search for economies of size and scope in the water sector; • The usefulness of the application of Bayesian methods is highlighted; • Important economies of output density, economies of size, economies of vertical integration and economies of scope are found.

  15. Calibration of crash risk models on freeways with limited real-time traffic data using Bayesian meta-analysis and Bayesian inference approach.

    Science.gov (United States)

    Xu, Chengcheng; Wang, Wei; Liu, Pan; Li, Zhibin

    2015-12-01

    This study aimed to develop a real-time crash risk model with limited data in China by using a Bayesian meta-analysis and Bayesian inference approach. A systematic review was first conducted using three different Bayesian meta-analyses: the fixed effect meta-analysis, the random effect meta-analysis, and the meta-regression. The meta-analyses provided a numerical summary of the effects of traffic variables on crash risk by quantitatively synthesizing results from previous studies. The random effect meta-analysis and the meta-regression produced more conservative estimates of the effects of traffic variables than the fixed effect meta-analysis. The meta-analysis results were then used as informative priors for developing crash risk models with limited data. The three different meta-analyses significantly affect model fit and prediction accuracy. The model based on the meta-regression can increase prediction accuracy by about 15% compared to a model developed directly from the limited data. Finally, Bayesian predictive density analysis was used to identify outliers in the limited data, which can further improve prediction accuracy by 5.0%.
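
    The mechanics of "meta-analysis as informative prior" are easiest to see in the conjugate normal-normal case for a single coefficient; the numbers below are invented for illustration.

        # Informative prior from the meta-analysis (e.g. a log relative risk):
        prior_mean, prior_var = 0.20, 0.05 ** 2
        # Estimate from the limited local data alone:
        data_mean, data_var = 0.35, 0.15 ** 2

        post_prec = 1 / prior_var + 1 / data_var          # precisions add
        post_mean = (prior_mean / prior_var + data_mean / data_var) / post_prec
        post_sd = (1 / post_prec) ** 0.5
        print(post_mean, post_sd)   # local estimate shrunk toward the meta-analysis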

  16. A Bayesian Surrogate Model for Rapid Time Series Analysis and Application to Exoplanet Observations

    CERN Document Server

    Ford, Eric B; Veras, Dimitri

    2011-01-01

    We present a Bayesian surrogate model for the analysis of periodic or quasi-periodic time series data. We describe a computationally efficient implementation that enables Bayesian model comparison. We apply this model to simulated and real exoplanet observations. We discuss the results and demonstrate some of the challenges for applying our surrogate model to realistic exoplanet data sets. In particular, we find that analyses of real world data should pay careful attention to the effects of uneven spacing of observations and the choice of prior for the "jitter" parameter.

  17. Variational Bayesian Causal Connectivity Analysis for fMRI

    Directory of Open Access Journals (Sweden)

    Martin eLuessi

    2014-05-01

    Full Text Available The ability to accurately estimate effective connectivity among brain regions from neuroimaging data could help answer many open questions in neuroscience. We propose a method which uses causality to obtain a measure of effective connectivity from fMRI data. The method uses a vector autoregressive model for the latent variables describing neuronal activity, in combination with a linear observation model based on a convolution with a hemodynamic response function. Owing to the employed modeling, it is possible to efficiently estimate all latent variables of the model using a variational Bayesian inference algorithm. The computational efficiency of the method enables us to apply it to large-scale problems with high sampling rates and several hundred regions of interest. We use a comprehensive empirical evaluation with synthetic and real fMRI data to evaluate the performance of our method under various conditions.

  18. Bayesian Analysis of Multiple Populations I: Statistical and Computational Methods

    CERN Document Server

    Stenning, D C; Robinson, E; van Dyk, D A; von Hippel, T; Sarajedini, A; Stein, N

    2016-01-01

    We develop a Bayesian model for globular clusters composed of multiple stellar populations, extending earlier statistical models for open clusters composed of simple (single) stellar populations (van Dyk et al. 2009; Stein et al. 2013). Specifically, we model globular clusters with two populations that differ in helium abundance. Our model assumes a hierarchical structuring of the parameters in which physical properties---age, metallicity, helium abundance, distance, absorption, and initial mass---are common to (i) the cluster as a whole or to (ii) individual populations within a cluster, or are unique to (iii) individual stars. An adaptive Markov chain Monte Carlo (MCMC) algorithm is devised for model fitting that greatly improves convergence relative to its precursor non-adaptive MCMC algorithm. Our model and computational tools are incorporated into an open-source software suite known as BASE-9. We use numerical studies to demonstrate that our method can recover parameters of two-population clusters, and al...

  19. Unsupervised Transient Light Curve Analysis Via Hierarchical Bayesian Inference

    CERN Document Server

    Sanders, Nathan; Soderberg, Alicia

    2014-01-01

    Historically, light curve studies of supernovae (SNe) and other transient classes have focused on individual objects with copious and high signal-to-noise observations. In the nascent era of wide field transient searches, objects with detailed observations are decreasing as a fraction of the overall known SN population, and this strategy sacrifices the majority of the information contained in the data about the underlying population of transients. A population level modeling approach, simultaneously fitting all available observations of objects in a transient sub-class of interest, fully mines the data to infer the properties of the population and avoids certain systematic biases. We present a novel hierarchical Bayesian statistical model for population level modeling of transient light curves, and discuss its implementation using an efficient Hamiltonian Monte Carlo technique. As a test case, we apply this model to the Type IIP SN sample from the Pan-STARRS1 Medium Deep Survey, consisting of 18,837 photometr...

  20. A Software Risk Analysis Model Using Bayesian Belief Network

    Institute of Scientific and Technical Information of China (English)

    Yong Hu; Juhua Chen; Mei Liu; Yang Yun; Junbiao Tang

    2006-01-01

    The uncertainty during the period of software project development often brings huge risks to contractors and clients. If we can find an effective method to predict the cost and quality of software projects based on facts such as the project character and the two sides' cooperating capability at the beginning of the project, we can reduce the risk. A Bayesian Belief Network (BBN) is a good tool for analyzing uncertain consequences, but it is difficult to produce a precise network structure and conditional probability table. In this paper, we build up the network structure by the Delphi method for conditional probability table learning, and learn and update the probability table and nodes' confidence levels continuously according to application cases, which gives the evaluation network learning ability and lets it evaluate the software development risk of an organization more accurately. This paper also introduces the EM algorithm, which enhances the ability to handle hidden nodes arising from variant software projects.

  1. Bayesian 3D X-ray computed tomography image reconstruction with a scaled Gaussian mixture prior model

    Science.gov (United States)

    Wang, Li; Gac, Nicolas; Mohammad-Djafari, Ali

    2015-01-01

    In order to improve the quality of 3D X-ray tomography reconstruction for Non Destructive Testing (NDT), we investigate in this paper hierarchical Bayesian methods. In NDT, useful prior information on the volume, such as the limited number of materials or the presence of homogeneous areas, can be included in the iterative reconstruction algorithms. In hierarchical Bayesian methods, not only is the volume estimated thanks to the prior model of the volume, but so are the hyperparameters of this prior. This additional complexity of the reconstruction methods, when applied to large volumes (from $512^3$ to $8192^3$ voxels), results in an increasing computational cost. To reduce it, the hierarchical Bayesian methods investigated in this paper lead to an algorithmic acceleration by Variational Bayesian Approximation (VBA) [1] and hardware acceleration thanks to projection and back-projection operators parallelized on many-core processors such as GPUs [2]. In this paper, we consider a Student-t prior on the gradient of the image, implemented in a hierarchical way [3, 4, 1]. The operators H (forward or projection) and Ht (adjoint or back-projection) implemented on multi-GPU [2] have been used in this study. Different methods are evaluated on the synthetic "Shepp and Logan" volume in terms of reconstruction quality and time. We used several simple regularizations of order 1 and order 2. Other prior models also exist [5]. Sometimes, for a discrete image, the segmentation and reconstruction can be done at the same time; the reconstruction can then be done with fewer projections.

  2. Bayesian Meta-Analysis of Cronbach's Coefficient Alpha to Evaluate Informative Hypotheses

    Science.gov (United States)

    Okada, Kensuke

    2015-01-01

    This paper proposes a new method to evaluate informative hypotheses for meta-analysis of Cronbach's coefficient alpha using a Bayesian approach. The coefficient alpha is one of the most widely used reliability indices. In meta-analyses of reliability, researchers typically form specific informative hypotheses beforehand, such as "alpha of…

  3. A continuous-time Bayesian network reliability modeling and analysis framework

    NARCIS (Netherlands)

    Boudali, H.; Dugan, J.B.

    2006-01-01

    We present a continuous-time Bayesian network (CTBN) framework for dynamic systems reliability modeling and analysis. Dynamic systems exhibit complex behaviors and interactions between their components; where not only the combination of failure events matters, but so does the sequence ordering of th

  4. Application of a data-mining method based on Bayesian networks to lesion-deficit analysis

    Science.gov (United States)

    Herskovits, Edward H.; Gerring, Joan P.

    2003-01-01

    Although lesion-deficit analysis (LDA) has provided extensive information about structure-function associations in the human brain, LDA has suffered from the difficulties inherent to the analysis of spatial data, i.e., there are many more variables than subjects, and data may be difficult to model using standard distributions, such as the normal distribution. We herein describe a Bayesian method for LDA; this method is based on data-mining techniques that employ Bayesian networks to represent structure-function associations. These methods are computationally tractable, and can represent complex, nonlinear structure-function associations. When applied to the evaluation of data obtained from a study of the psychiatric sequelae of traumatic brain injury in children, this method generates a Bayesian network that demonstrates complex, nonlinear associations among lesions in the left caudate, right globus pallidus, right side of the corpus callosum, right caudate, and left thalamus, and subsequent development of attention-deficit hyperactivity disorder, confirming and extending our previous statistical analysis of these data. Furthermore, analysis of simulated data indicates that methods based on Bayesian networks may be more sensitive and specific for detecting associations among categorical variables than methods based on chi-square and Fisher exact statistics.

  5. Bayesian Network Meta-Analysis for Unordered Categorical Outcomes with Incomplete Data

    Science.gov (United States)

    Schmid, Christopher H.; Trikalinos, Thomas A.; Olkin, Ingram

    2014-01-01

    We develop a Bayesian multinomial network meta-analysis model for unordered (nominal) categorical outcomes that allows for partially observed data in which exact event counts may not be known for each category. This model properly accounts for correlations of counts in mutually exclusive categories and enables proper comparison and ranking of…

  6. Using Discrete Loss Functions and Weighted Kappa for Classification: An Illustration Based on Bayesian Network Analysis

    Science.gov (United States)

    Zwick, Rebecca; Lenaburg, Lubella

    2009-01-01

    In certain data analyses (e.g., multiple discriminant analysis and multinomial log-linear modeling), classification decisions are made based on the estimated posterior probabilities that individuals belong to each of several distinct categories. In the Bayesian network literature, this type of classification is often accomplished by assigning…
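
    The decision rule in question assigns the category that minimizes posterior expected loss rather than simply the most probable one. A minimal sketch (the loss matrix values are illustrative; a 0-1 loss would reproduce the usual argmax rule):

        import numpy as np

        posterior = np.array([0.5, 0.3, 0.2])   # P(category k | data) for one case
        loss = np.array([[0, 1, 4],             # loss[i, j]: cost of assigning i
                         [1, 0, 1],             # when the true category is j
                         [4, 1, 0]])
        expected_loss = loss @ posterior
        print(np.argmin(expected_loss))         # -> 1, despite argmax(posterior) == 0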

  7. Family background variables as instruments for education in income regressions: A Bayesian analysis

    NARCIS (Netherlands)

    L.F. Hoogerheide (Lennart); J.H. Block (Jörn); A.R. Thurik (Roy)

    2012-01-01

    The validity of family background variables instrumenting education in income regressions has been much criticized. In this paper, we use data from the 2004 German Socio-Economic Panel and Bayesian analysis to analyze to what degree violations of the strict validity assumption affect the

  8. Calibration of Uncertainty Analysis of the SWAT Model Using Genetic Algorithms and Bayesian Model Averaging

    Science.gov (United States)

    In this paper, the Genetic Algorithms (GA) and Bayesian model averaging (BMA) were combined to simultaneously conduct calibration and uncertainty analysis for the Soil and Water Assessment Tool (SWAT). In this hybrid method, several SWAT models with different structures are first selected; next GA i...

  9. A Bayesian analysis of the unit root in real exchange rates

    NARCIS (Netherlands)

    P.C. Schotman (Peter); H.K. van Dijk (Herman)

    1991-01-01

    We propose a posterior odds analysis of the hypothesis of a unit root in real exchange rates. From a Bayesian viewpoint the random walk hypothesis for real exchange rates is a posteriori as probable as a stationary AR(1) process for four out of eight time series investigated. The French

  10. Bayesian Factor Analysis as a Variable-Selection Problem: Alternative Priors and Consequences.

    Science.gov (United States)

    Lu, Zhao-Hua; Chow, Sy-Miin; Loken, Eric

    2016-01-01

    Factor analysis is a popular statistical technique for multivariate data analysis. Developments in the structural equation modeling framework have enabled the use of hybrid confirmatory/exploratory approaches in which factor-loading structures can be explored relatively flexibly within a confirmatory factor analysis (CFA) framework. Recently, Muthén & Asparouhov proposed a Bayesian structural equation modeling (BSEM) approach to explore the presence of cross loadings in CFA models. We show that the issue of determining factor-loading patterns may be formulated as a Bayesian variable selection problem in which Muthén and Asparouhov's approach can be regarded as a BSEM approach with ridge regression prior (BSEM-RP). We propose another Bayesian approach, denoted herein as the Bayesian structural equation modeling with spike-and-slab prior (BSEM-SSP), which serves as a one-stage alternative to the BSEM-RP. We review the theoretical advantages and disadvantages of both approaches and compare their empirical performance relative to two modification indices-based approaches and exploratory factor analysis with target rotation. A teacher stress scale data set is used to demonstrate our approach.

  11. A Bayesian multidimensional scaling procedure for the spatial analysis of revealed choice data

    NARCIS (Netherlands)

    DeSarbo, WS; Kim, Y; Fong, D

    1999-01-01

    We present a new Bayesian formulation of a vector multidimensional scaling procedure for the spatial analysis of binary choice data. The Gibbs sampler is gainfully employed to estimate the posterior distribution of the specified scalar products, bilinear model parameters. The computational procedure

  12. A Pragmatic Bayesian Perspective on Correlation Analysis: The exoplanetary gravity - stellar activity case.

    Science.gov (United States)

    Figueira, P; Faria, J P; Adibekyan, V Zh; Oshagh, M; Santos, N C

    2016-11-01

    We apply the Bayesian framework to assess the presence of a correlation between two quantities. To do so, we estimate the probability distribution of the parameter of interest, ρ, characterizing the strength of the correlation. We provide an implementation of these ideas and concepts using the Python programming language and the pyMC module in a very short (~130 lines of code, heavily commented) and user-friendly program. We used this tool to assess the presence and properties of the correlation between planetary surface gravity and stellar activity level as measured by the $\log(R'_{\mathrm{HK}})$ indicator. The results of the Bayesian analysis are qualitatively similar to those obtained via p-value analysis, and support the presence of a correlation in the data. The results are more robust in their derivation and more informative, revealing interesting features such as asymmetric posterior distributions or markedly different credible intervals, and allowing for a deeper exploration. We encourage the reader interested in this kind of problem to apply our code to his/her own scientific problems. The full understanding of what the Bayesian framework is can only be gained through the insight that comes by handling priors, assessing the convergence of Monte Carlo runs, and a multitude of other practical problems. We hope to contribute so that Bayesian analysis becomes a tool in the toolkit of researchers, and they understand by experience its advantages and limitations.
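
    The same inference can be reproduced without MCMC by evaluating the posterior of ρ on a grid, which makes the logic transparent. The snippet below is a numpy/scipy stand-in for the authors' pyMC program; it assumes standardized data (zero mean, unit variance) and a uniform prior on (-1, 1), and the data are synthetic.

        import numpy as np
        from scipy.stats import multivariate_normal

        rng = np.random.default_rng(2)
        data = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=40)

        rhos = np.linspace(-0.99, 0.99, 199)      # uniform prior over (-1, 1)
        logpost = np.array([
            multivariate_normal([0, 0], [[1, r], [r, 1]]).logpdf(data).sum()
            for r in rhos])
        post = np.exp(logpost - logpost.max())
        post /= np.trapz(post, rhos)              # normalized posterior density
        print(rhos[np.argmax(post)])              # posterior mode near 0.6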

  14. A Pragmatic Bayesian Perspective on Correlation Analysis. The exoplanetary gravity - stellar activity case

    Science.gov (United States)

    Figueira, P.; Faria, J. P.; Adibekyan, V. Zh.; Oshagh, M.; Santos, N. C.

    2016-11-01

    We apply the Bayesian framework to assess the presence of a correlation between two quantities. To do so, we estimate the probability distribution of the parameter of interest, ρ, characterizing the strength of the correlation. We provide an implementation of these ideas and concepts using the Python programming language and the pyMC module in a very short (~130 lines of code, heavily commented) and user-friendly program. We used this tool to assess the presence and properties of the correlation between planetary surface gravity and stellar activity level as measured by the log(R'_HK) indicator. The results of the Bayesian analysis are qualitatively similar to those obtained via p-value analysis, and support the presence of a correlation in the data. The results are more robust in their derivation and more informative, revealing interesting features such as asymmetric posterior distributions or markedly different credible intervals, and allowing for a deeper exploration. We encourage the reader interested in this kind of problem to apply our code to his/her own scientific problems. The full understanding of what the Bayesian framework is can only be gained through the insight that comes by handling priors, assessing the convergence of Monte Carlo runs, and a multitude of other practical problems. We hope to contribute so that Bayesian analysis becomes a tool in the toolkit of researchers, and they understand by experience its advantages and limitations.

  15. Bayesian dose-response analysis for epidemiological studies with complex uncertainty in dose estimation.

    Science.gov (United States)

    Kwon, Deukwoo; Hoffman, F Owen; Moroz, Brian E; Simon, Steven L

    2016-02-10

    Most conventional risk analysis methods rely on a single best estimate of exposure per person, which does not allow for adjustment for exposure-related uncertainty. Here, we propose a Bayesian model averaging method to properly quantify the relationship between radiation dose and disease outcomes by accounting for shared and unshared uncertainty in estimated dose. Our Bayesian risk analysis method utilizes multiple realizations of sets (vectors) of doses generated by a two-dimensional Monte Carlo simulation method that properly separates shared and unshared errors in dose estimation. The exposure model used in this work is taken from a study of the risk of thyroid nodules among a cohort of 2376 subjects who were exposed to fallout from nuclear testing in Kazakhstan. We assessed the performance of our method through an extensive series of simulations and comparisons against conventional regression risk analysis methods. When the estimated doses contain relatively small amounts of uncertainty, the Bayesian method using multiple a priori plausible draws of dose vectors gave similar results to the conventional regression-based methods of dose-response analysis. However, when large and complex mixtures of shared and unshared uncertainties are present, the Bayesian method using multiple dose vectors had significantly lower relative bias than conventional regression-based risk analysis methods and better coverage, that is, a markedly increased capability to include the true risk coefficient within the 95% credible interval of the Bayesian-based risk estimate. An evaluation of the dose-response using our method is presented for an epidemiological study of thyroid disease following radiation exposure.
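
    The averaging mechanism can be sketched with a toy linear dose-response (this is not the authors' thyroid-nodule model, and every number below is invented): each plausible dose vector yields its own grid posterior for the risk slope, and the reported posterior is the mixture of all of them.

        import numpy as np

        rng = np.random.default_rng(2)
        n, M, beta_true, sigma = 200, 50, 0.8, 1.0
        true_dose = rng.gamma(2.0, 1.0, size=n)
        y = beta_true * true_dose + rng.normal(0, sigma, size=n)    # synthetic outcomes

        # M equally plausible dose-vector realizations (shared and unshared errors)
        shared = rng.lognormal(0, 0.2, size=M)             # one shared multiplier per vector
        doses = true_dose * shared[:, None] * rng.lognormal(0, 0.3, size=(M, n))

        beta = np.linspace(0.0, 2.0, 400)
        mix = np.zeros_like(beta)
        for d in doses:                                    # grid posterior under each dose vector
            resid2 = ((y[None, :] - beta[:, None] * d[None, :]) ** 2).sum(axis=1)
            post = np.exp(-(resid2 - resid2.min()) / (2 * sigma**2))
            mix += post / post.sum()
        mix /= M                                           # model-averaged posterior for beta
        print("posterior mean of beta:", (beta * mix).sum())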

  16. Competing risk models in reliability systems, a weibull distribution model with bayesian analysis approach

    Science.gov (United States)

    Iskandar, Ismed; Satria Gondokaryono, Yudi

    2016-02-01

    In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that systems are described as simply functioning or failed, whereas in many real situations the failures may arise from many causes, depending on the age and the environment of the system and its components. Another problem in reliability theory is estimating the parameters of the assumed failure models; the estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems, and Bayesian analysis is more beneficial than the classical approach in such cases. Bayesian estimation allows us to combine past knowledge or experience, in the form of a prior distribution, with life test data to make inferences about the parameter of interest. In this paper, we investigate the application of Bayesian estimation to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed by both Bayesian and maximum likelihood analyses. The simulation results show that shifting the true value of one parameter relative to another changes the standard deviation in the opposite direction. Given perfect information on the prior distribution, the Bayesian estimates are better than the maximum likelihood ones. The sensitivity analyses show some amount of sensitivity to shifts of the prior locations. They also show the robustness of the Bayesian analysis within the range
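
    As a minimal illustration of the Bayesian machinery in the simplest sub-case (a single failure cause with the Weibull shape k assumed known, rather than the paper's full competing-risk simulation), the Weibull rate parameter admits a conjugate gamma prior:

        import numpy as np

        # Weibull density assumed: k * lam * t^(k-1) * exp(-lam * t^k), shape k known.
        # Then lam | data ~ Gamma(a + n_failures, b + sum(t_i^k)); with right-censoring,
        # n_failures counts only observed failures but the sum includes censored times too.
        rng = np.random.default_rng(3)
        k, lam_true = 1.5, 0.4
        t = rng.weibull(k, size=30) / lam_true ** (1 / k)   # synthetic failure times
        a, b = 1.0, 1.0                                     # assumed prior hyperparameters

        a_post = a + len(t)
        b_post = b + np.sum(t**k)
        print("posterior mean of lambda:", a_post / b_post, "(true value:", lam_true, ")")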

  17. Noninvasive Brain Stimulation, Maladaptive Plasticity, and Bayesian Analysis in Phantom Limb Pain.

    Science.gov (United States)

    Morales-Quezada, Leon

    2017-08-01

    Introduction: Phantom limb pain (PLP) is a common and poorly understood condition that is difficult to manage medically and that develops progressively after amputation. Objective: This article discusses the multifactorial bases of PLP. These bases involve local changes at the stump level, spinal modifications of excitability, deafferentation, and central sensitization, leading to the development of maladaptive plasticity and, consequently, defective processing of sensory information by associative neural networks. These changes can be traced by neurophysiology and topographical imaging studies, indicating a degree of cortical reorganization that perpetuates pain and discomfort. Intervention: Noninvasive brain stimulation can be an alternative way to manage PLP. This article discusses two techniques, transcranial direct current stimulation (tDCS) and repetitive transcranial magnetic stimulation (rTMS), that have shown promising results for controlling PLP. The modulation that both techniques rely on is based on synaptic mechanisms linked to long-term potentiation and long-term depression phenomena. By applying tDCS or rTMS, clinicians can target processes associated with central sensitization and maladaptive plasticity, while promoting adequate sensory information processing by integrative cognitive behavioral techniques in a comprehensive rehabilitation program. Conclusions: Understanding PLP from a dynamic neurocomputational perspective will help to develop better treatments. Furthermore, Bayesian analysis of sensory information can help guide and monitor therapeutic interventions directed toward PLP resolution.

  18. Case-control studies of gene-environment interaction: Bayesian design and analysis.

    Science.gov (United States)

    Mukherjee, Bhramar; Ahn, Jaeil; Gruber, Stephen B; Ghosh, Malay; Chatterjee, Nilanjan

    2010-09-01

    With increasing frequency, epidemiologic studies are addressing hypotheses regarding gene-environment interaction. In many well-studied candidate genes and for standard dietary and behavioral epidemiologic exposures, there is often substantial prior information available that may be used to analyze current data as well as for designing a new study. In this article, first, we propose a proper full Bayesian approach for analyzing studies of gene-environment interaction. The Bayesian approach provides a natural way to incorporate uncertainties around the assumption of gene-environment independence, often used in such an analysis. We then consider Bayesian sample size determination criteria for both estimation and hypothesis testing regarding the multiplicative gene-environment interaction parameter. We illustrate our proposed methods using data from a large ongoing case-control study of colorectal cancer investigating the interaction of N-acetyl transferase type 2 (NAT2) with smoking and red meat consumption. We use the existing data to elicit a design prior and show how to use this information in allocating cases and controls in planning a future study that investigates the same interaction parameters. The Bayesian design and analysis strategies are compared with their corresponding frequentist counterparts.

  19. Heterogeneous multimodal biomarkers analysis for Alzheimer's disease via Bayesian network.

    Science.gov (United States)

    Jin, Yan; Su, Yi; Zhou, Xiao-Hua; Huang, Shuai

    2016-12-01

    By 2050, it is estimated that the number of Alzheimer's disease (AD) patients worldwide will quadruple from the current 36 million, while no proven disease-modifying treatments are available. At present, the underlying disease mechanisms remain under investigation, and recent studies suggest that the disease involves multiple etiological pathways. To better understand the disease and develop treatment strategies, a number of ongoing studies, including the Alzheimer's Disease Neuroimaging Initiative (ADNI), enroll many study participants and acquire a large number of biomarkers from various modalities, including demographics, genotyping, fluid biomarkers, neuroimaging, neuropsychometric tests, and clinical assessments. However, a systematic approach that can integrate all the collected data is lacking. The overarching goal of our study is to use machine learning techniques to understand the relationships among different biomarkers and to establish a system-level model that can better describe the interactions among biomarkers and provide superior diagnostic and prognostic information. In this pilot study, we use a Bayesian network (BN) to analyze multimodal data from ADNI, including demographics, volumetric MRI, PET, genotypes, and neuropsychometric measurements, and demonstrate that our approach achieves superior prediction accuracy.

  20. On the On-Off Problem: An Objective Bayesian Analysis

    CERN Document Server

    Ahnen, Max Ludwig

    2015-01-01

    The On-Off problem, also known as the Li-Ma problem, is a statistical problem where a measured rate is the sum of two parts, the first due to a signal and the second due to a background, both of which are unknown. The solutions in common use are mostly frequentist and are adequate only for high count numbers; when events are rare, such an approximation is not good enough. Indeed, in high-energy astrophysics this is often the rule rather than the exception. I will present a universal objective Bayesian solution that depends only on the three initial parameters of the On-Off problem: the number of events in the "on" region, the number of events in the "off" region, and their ratio of exposures. With a two-step approach it is possible to infer the signal's significance, strength, uncertainty, or upper limit in a unified way. The approach is valid without restrictions for any count number, including zero, and may be widely applied in particle physics, cosmic-ray physics and high-energy astrophysics. I apply the method to Gamma ...
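
    The paper's solution is analytic; the sketch below reaches the same kind of posterior numerically, on a grid, under one assumed convention for the exposure ratio (all counts below are hypothetical):

        import numpy as np
        from scipy.stats import poisson

        # Convention assumed here: N_on ~ Poisson(s + b), N_off ~ Poisson(b / alpha),
        # with alpha the on/off exposure ratio; flat priors on s and b.
        n_on, n_off, alpha = 9, 12, 0.5        # hypothetical counts and exposure ratio

        s = np.linspace(0, 20, 801)
        b = np.linspace(1e-3, 20, 800)
        S, B = np.meshgrid(s, b, indexing="ij")
        joint = poisson.pmf(n_on, S + B) * poisson.pmf(n_off, B / alpha)

        post_s = joint.sum(axis=1)             # marginalize over the unknown background
        post_s /= post_s.sum() * (s[1] - s[0])
        print("posterior mode of the signal s:", s[np.argmax(post_s)])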

  1. Bayesian inversion analysis of nonlinear dynamics in surface heterogeneous reactions.

    Science.gov (United States)

    Omori, Toshiaki; Kuwatani, Tatsu; Okamoto, Atsushi; Hukushima, Koji

    2016-09-01

    It is essential to extract nonlinear dynamics from time-series data as an inverse problem in the natural sciences. We propose a Bayesian statistical framework for extracting the nonlinear dynamics of surface heterogeneous reactions from sparse and noisy observable data. Surface heterogeneous reactions are chemical reactions involving the conjugation of multiple phases, and their dynamics are intrinsically nonlinear owing to the effect of the surface area between the different phases. We adapt a belief propagation method and an expectation-maximization (EM) algorithm to the partial-observation problem, in order to simultaneously estimate the time course of the hidden variables and the kinetic parameters underlying the dynamics. The proposed belief propagation method is implemented with a sequential Monte Carlo algorithm in order to estimate the nonlinear dynamical system. Using the proposed method, we show that the rate constants of dissolution and precipitation reactions, which are typical examples of surface heterogeneous reactions, as well as the temporal changes of solid reactants and products, can be estimated successfully from the observable temporal changes in the concentration of the dissolved intermediate product alone.
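
    The sequential Monte Carlo ingredient can be illustrated with a generic bootstrap particle filter; a toy one-dimensional state-space model stands in for the reaction dynamics, and all functions and noise levels below are invented:

        import numpy as np

        rng = np.random.default_rng(4)
        T, N = 50, 2000                           # time steps, particles
        f = lambda x: 0.9 * x + 2.0 * np.sin(x)   # toy nonlinear dynamics
        q, r = 0.3, 0.5                           # process / observation noise std

        # Simulate a hidden trajectory and noisy observations
        xs, ys, x = [], [], 0.5
        for _ in range(T):
            x = f(x) + rng.normal(0, q)
            xs.append(x)
            ys.append(x + rng.normal(0, r))

        particles, estimates = rng.normal(0, 1, size=N), []
        for y in ys:
            particles = f(particles) + rng.normal(0, q, size=N)   # propagate
            w = np.exp(-0.5 * ((y - particles) / r) ** 2)         # weight by likelihood
            w /= w.sum()
            estimates.append(np.sum(w * particles))               # filtered mean
            particles = particles[rng.choice(N, size=N, p=w)]     # resample

        rmse = np.sqrt(np.mean((np.array(estimates) - np.array(xs)) ** 2))
        print("RMSE of the filtered means:", rmse)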

  2. Bayesian SPLDA

    OpenAIRE

    Villalba, Jesús

    2015-01-01

    In this document we derive the equations needed to implement Variational Bayes estimation of the parameters of the simplified probabilistic linear discriminant analysis (SPLDA) model. This can be used to adapt SPLDA from one database to another with little development data, or to implement the fully Bayesian recipe. Our approach is similar to Bishop's VB PPCA.

  3. Inferring Neuronal Dynamics from Calcium Imaging Data Using Biophysical Models and Bayesian Inference.

    Science.gov (United States)

    Rahmati, Vahid; Kirmse, Knut; Marković, Dimitrije; Holthoff, Knut; Kiebel, Stefan J

    2016-02-01

    Calcium imaging has been used as a promising technique to monitor the dynamic activity of neuronal populations. However, the calcium trace is temporally smeared, which restricts the extraction of quantities of interest such as spike trains of individual neurons. To address this issue, spike reconstruction algorithms have been introduced. One limitation of such reconstructions is that the underlying models are not informed about the biophysics of spike and burst generation. Such prior knowledge might be useful for constraining the possible spike solutions. Here we describe, in a novel Bayesian approach, how principled knowledge about neuronal dynamics can be employed to infer biophysical variables and parameters from fluorescence traces. Using both synthetic and in vitro recorded fluorescence traces, we demonstrate that the new approach is able to reconstruct different repetitive spiking and/or bursting patterns with accurate single-spike resolution. Furthermore, we show that the high inference precision of the new approach is preserved even if the fluorescence trace is rather noisy or if the fluorescence transients show slow rise kinetics lasting several hundred milliseconds and inhomogeneous rise and decay times. In addition, we discuss the use of the new approach for inferring parameter changes, e.g., due to a pharmacological intervention, as well as for inferring complex characteristics of immature neuronal circuits.

  4. Bayesian Fusion of Multi-Scale Detectors for Road Extraction from SAR Images

    Directory of Open Access Journals (Sweden)

    Rui Xu

    2017-01-01

    This paper introduces an innovative road network extraction algorithm using synthetic aperture radar (SAR) imagery to improve the accuracy of road extraction. State-of-the-art approaches, which run fraction extraction and road network optimization in separate succession, fail to obtain continuous road segments, since the optimization cannot recover the parts ignored by the fraction extraction. In this paper, the proposed algorithm integrates the fraction extraction and optimization procedures simultaneously to extract the road network: (1) the Bayesian framework is used to cast road network extraction as joint reasoning over the likelihood of fraction extraction and the prior of network optimization; (2) the multi-scale linear feature detector (MLFD) and the beamlet network optimization are introduced; (3) the conditional random field (CRF) is used for the joint reasoning. The result is a global optimum, since fraction extraction and network optimization are exploited at the same time. The proposed algorithm solves the problem that road fractions are bound to be lost in the process of network optimization, and it has demonstrated effectiveness in applications to real SAR images.

  5. UNSUPERVISED TRANSIENT LIGHT CURVE ANALYSIS VIA HIERARCHICAL BAYESIAN INFERENCE

    Energy Technology Data Exchange (ETDEWEB)

    Sanders, N. E.; Soderberg, A. M. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Betancourt, M., E-mail: nsanders@cfa.harvard.edu [Department of Statistics, University of Warwick, Coventry CV4 7AL (United Kingdom)

    2015-02-10

    Historically, light curve studies of supernovae (SNe) and other transient classes have focused on individual objects with copious and high signal-to-noise observations. In the nascent era of wide field transient searches, objects with detailed observations are decreasing as a fraction of the overall known SN population, and this strategy sacrifices the majority of the information contained in the data about the underlying population of transients. A population level modeling approach, simultaneously fitting all available observations of objects in a transient sub-class of interest, fully mines the data to infer the properties of the population and avoids certain systematic biases. We present a novel hierarchical Bayesian statistical model for population level modeling of transient light curves, and discuss its implementation using an efficient Hamiltonian Monte Carlo technique. As a test case, we apply this model to the Type IIP SN sample from the Pan-STARRS1 Medium Deep Survey, consisting of 18,837 photometric observations of 76 SNe, corresponding to a joint posterior distribution with 9176 parameters under our model. Our hierarchical model fits provide improved constraints on light curve parameters relevant to the physical properties of their progenitor stars relative to modeling individual light curves alone. Moreover, we directly evaluate the probability for occurrence rates of unseen light curve characteristics from the model hyperparameters, addressing observational biases in survey methodology. We view this modeling framework as an unsupervised machine learning technique with the ability to maximize scientific returns from data to be collected by future wide field transient searches like LSST.

  6. A problem in particle physics and its Bayesian analysis

    Science.gov (United States)

    Landon, Joshua

    An up-and-coming field in contemporary nuclear and particle physics is "Lattice Quantum Chromodynamics," henceforth Lattice QCD. Indeed, the 2004 Nobel Prize in Physics went to the developers of equations that describe QCD. In this dissertation, following a layperson's introduction to the structure of matter, we outline the statistical aspects of a problem in Lattice QCD faced by particle physicists, and point out the difficulties they encountered in trying to address it. The difficulties stem from the fact that one is required to estimate a large (conceptually infinite) number of parameters based on a finite number of non-linear equations, each of which is a sum of exponential functions. We then present a plausible approach for solving the problem. Our approach is Bayesian and is driven by a computationally intensive Markov chain Monte Carlo based solution. However, in order to invoke our approach we first look at the underlying anatomy of the problem and synthesize its essentials. These essentials reveal a pattern that can be harnessed via some assumptions, and this in turn enables us to outline a pathway towards a solution. We demonstrate the viability of our approach via simulated data, followed by its validation against real data provided to us by our physicist colleagues. Our approach yields results that in the past were not obtainable via alternate approaches. The contribution of this dissertation is two-fold. The first is a use of computationally intensive statistical technology to produce results in physics that could not be obtained using physics-based techniques. Since the statistical architecture of the problem considered here can arise in other contexts as well, the second contribution of this dissertation is to indicate a plausible approach for addressing a generic class of problems wherein the number of parameters to be estimated exceeds the number of constraints, each constraint being a non-linear equation that is the sum of

  7. A Bayesian analysis of kaon photoproduction with the Regge-plus-resonance model

    CERN Document Server

    De Cruz, Lesley; Vrancx, Tom; Vancraeyveld, Pieter

    2012-01-01

    We address the issue of unbiased model selection and propose a methodology based on Bayesian inference to extract physical information from kaon photoproduction $p(\gamma,K^+)\Lambda$ data. We use the single-channel Regge-plus-resonance (RPR) framework for $p(\gamma,K^+)\Lambda$ to illustrate the proposed strategy. The Bayesian evidence Z is a quantitative measure for the model's fitness given the world's data. We present a numerical method for performing the multidimensional integrals in the expression for the Bayesian evidence. We use the $p(\gamma,K^+)\Lambda$ data with an invariant energy W > 2.6 GeV in order to constrain the background contributions in the RPR framework with Bayesian inference. Next, the resonance information is extracted from the analysis of differential cross sections, single and double polarization observables. This background and resonance content constitutes the basis of a model which is coined RPR-2011. It is shown that RPR-2011 yields a comprehensive account of the kaon photoprodu...

  8. Individual organisms as units of analysis: Bayesian-clustering alternatives in population genetics.

    Science.gov (United States)

    Mank, Judith E; Avise, John C

    2004-12-01

    Population genetic analyses traditionally focus on the frequencies of alleles or genotypes in 'populations' that are delimited a priori. However, there are potential drawbacks to amalgamating genetic data into such composite attributes of assemblages of specimens: genetic information on individual specimens is lost or submerged as an inherent part of the analysis. A potential also exists for circular reasoning when a population's initial identification and subsequent genetic characterization are coupled. In principle, these problems are circumvented by some newer methods of population identification and individual assignment based on statistical clustering of specimen genotypes. Here we evaluate a recent method in this genre, Bayesian clustering, using four genotypic data sets involving different types of molecular markers in non-model organisms from nature. As expected, measures of population genetic structure (F_ST and Φ_ST) tended to be significantly greater in Bayesian a posteriori data treatments than in analyses where populations were delimited a priori. In the four biological contexts examined, which involved both geographic population structures and hybrid zones, Bayesian clustering was able to recover differentiated populations, and Bayesian assignments were able to identify likely population sources of specific individuals.

  9. Klebsiella pneumoniae blaKPC-3 nosocomial epidemic: Bayesian and evolutionary analysis.

    Science.gov (United States)

    Angeletti, Silvia; Presti, Alessandra Lo; Cella, Eleonora; Fogolari, Marta; De Florio, Lucia; Dedej, Etleva; Blasi, Aletheia; Milano, Teresa; Pascarella, Stefano; Incalzi, Raffaele Antonelli; Coppola, Roberto; Dicuonzo, Giordano; Ciccozzi, Massimo

    2016-12-01

    K. pneumoniae isolates carrying the blaKPC-3 gene were collected to perform Bayesian phylogenetic and selective pressure analyses and to apply homology modeling to the KPC-3 protein. A dataset of 44 blaKPC-3 gene sequences from clinical isolates of K. pneumoniae was used for the Bayesian phylogenetic and selective pressure analyses and for homology modeling. The mean evolutionary rate for the blaKPC-3 gene was 2.67×10^-3 substitutions/site/year (95% HPD: 3.4×10^-4 to 5.59×10^-3). The root of the Bayesian tree dated back to the year 2011 (95% HPD: 2007-2012). Two main clades (I and II) were identified. The population dynamics analysis showed exponential growth from 2011 to 2013 followed by a plateau. The phylogeographic reconstruction showed that the root of the tree had a probable common ancestor in the general surgery ward. Selective pressure analysis revealed twelve positively selected sites. Structural analysis of the KPC-3 protein predicted that the amino acid mutations are destabilizing for the protein and could alter its substrate specificity. Phylogenetic analysis and homology modeling of the blaKPC-3 gene could represent a useful tool to follow KPC spread in the nosocomial setting and to detect amino acid substitutions altering substrate specificity.

  10. Calibrating spatio-temporal models of leukocyte dynamics against in vivo live-imaging data using approximate Bayesian computation

    Science.gov (United States)

    Barnes, Chris P.; Huvet, Maxime; Bugeon, Laurence; Thorne, Thomas; Lamb, Jonathan R.; Dallman, Margaret J.; Stumpf, Michael P. H.

    2016-01-01

    In vivo studies allow us to investigate biological processes at the level of the organism. But not all aspects of in vivo systems are amenable to direct experimental measurements. In order to make the most of such data we therefore require statistical tools that allow us to obtain reliable estimates for e.g. kinetic in vivo parameters. Here we show how we can use approximate Bayesian computation approaches in order to analyse leukocyte migration in zebrafish embryos in response to injuries. We track individual leukocytes using live imaging following surgical injury to the embryos’ tail-fins. The signalling gradient that leukocytes follow towards the site of the injury cannot be directly measured but we can estimate its shape and how it changes with time from the directly observed patterns of leukocyte migration. By coupling simple models of immune signalling and leukocyte migration with the unknown gradient shape into a single statistical framework we can gain detailed insights into the tissue-wide processes that are involved in the innate immune response to wound injury. In particular we find conclusive evidence for a temporally and spatially changing signalling gradient that modulates the changing activity of the leukocyte population in the embryos. We conclude with a robustness analysis which highlights the most important factors determining the leukocyte dynamics. Our approach relies only on the ability to simulate numerically the process under investigation and is therefore also applicable in other in vivo contexts and studies.
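
    A minimal ABC rejection sketch conveys the underlying idea (a toy Gaussian simulator stands in for the leukocyte migration model; the tolerance, prior, and summary statistic are all assumptions of this illustration):

        import numpy as np

        rng = np.random.default_rng(5)

        def simulate(theta, n=100):
            """Toy stochastic 'migration' simulator: mean displacement theta (invented)."""
            return rng.normal(theta, 1.0, size=n).mean()

        obs = simulate(2.0)                          # pretend this is the imaging summary
        prior_draws = rng.uniform(0, 5, size=20000)  # prior over the unknown parameter
        sims = np.array([simulate(t) for t in prior_draws])
        accepted = prior_draws[np.abs(sims - obs) < 0.05]   # keep draws within tolerance

        print(f"accepted {accepted.size} draws; approximate posterior mean {accepted.mean():.2f}")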

  11. Bayesian methods for model uncertainty analysis with application to future sea level rise

    Energy Technology Data Exchange (ETDEWEB)

    Patwardhan, A.; Small, M.J. (Carnegie Mellon Univ., Pittsburgh, PA (United States))

    1992-12-01

    This paper addresses the use of data for identifying and characterizing uncertainties in model parameters and predictions. The Bayesian Monte Carlo method is formally presented and elaborated, and applied to the analysis of the uncertainty in a predictive model for global mean sea level change. The method uses observations of output variables, made with an assumed error structure, to determine a posterior distribution of model outputs. This is used to derive a posterior distribution for the model parameters. Results demonstrate the resolution of the uncertainty that is obtained as a result of the Bayesian analysis and also indicate the key contributors to the uncertainty in the sea level rise model. While the technique is illustrated with a simple, preliminary model, the analysis provides an iterative framework for model refinement. The methodology developed in this paper provides a mechanism for the incorporation of ongoing data collection and research in decision-making for problems involving uncertain environmental change.
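
    The core of the Bayesian Monte Carlo step can be sketched as follows; a toy linear trend model with invented observations stands in for the sea-level model:

        import numpy as np

        rng = np.random.default_rng(6)
        t_obs = np.array([0.0, 10.0, 20.0, 30.0])
        y_obs = np.array([0.1, 2.2, 3.9, 6.1])        # hypothetical observations
        sigma_obs = 0.5                               # assumed observation error

        theta = rng.uniform(0.0, 0.5, size=50000)     # prior draws of a trend rate
        pred = theta[:, None] * t_obs[None, :]        # toy model output per draw
        loglik = -0.5 * np.sum(((y_obs - pred) / sigma_obs) ** 2, axis=1)
        w = np.exp(loglik - loglik.max())
        w /= w.sum()                                  # posterior importance weights

        print("posterior mean trend:", np.sum(w * theta))
        # Posterior predictions reuse the same weights on the model outputs
        print("weighted prediction at t=40:", np.sum(w * theta * 40.0))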

  12. An integrated Bayesian analysis of LOH and copy number data

    Directory of Open Access Journals (Sweden)

    Hutter Marcus

    2010-06-01

    Full Text Available Abstract Background Cancer and other disorders are due to genomic lesions. SNP-microarrays are able to measure simultaneously both genotype and copy number (CN at several Single Nucleotide Polymorphisms (SNPs along the genome. CN is defined as the number of DNA copies, and the normal is two, since we have two copies of each chromosome. The genotype of a SNP is the status given by the nucleotides (alleles which are present on the two copies of DNA. It is defined homozygous or heterozygous if the two alleles are the same or if they differ, respectively. Loss of heterozygosity (LOH is the loss of the heterozygous status due to genomic events. Combining CN and LOH data, it is possible to better identify different types of genomic aberrations. For example, a long sequence of homozygous SNPs might be caused by either the physical loss of one copy or a uniparental disomy event (UPD, i.e. each SNP has two identical nucleotides both derived from only one parent. In this situation, the knowledge of the CN can help in distinguishing between these two events. Results To better identify genomic aberrations, we propose a method (called gBPCR which infers the type of aberration occurred, taking into account all the possible influence in the microarray detection of the homozygosity status of the SNPs, resulting from an altered CN level. Namely, we model the distributions of the detected genotype, given a specific genomic alteration and we estimate the parameters involved on public reference datasets. The estimation is performed similarly to the modified Bayesian Piecewise Constant Regression, but with improved estimators for the detection of the breakpoints. Using artificial and real data, we evaluate the quality of the estimation of gBPCR and we also show that it outperforms other well-known methods for LOH estimation. Conclusions We propose a method (gBPCR for the estimation of both LOH and CN aberrations, improving their estimation by integrating both types

  13. Bayesian model selection for analysis and design of multilayer sound absorbers

    Science.gov (United States)

    Fackler, Cameron Jeff

    New methods for the analysis and design of multilayer sound absorbers, utilizing a model-based Bayesian inference approach, are proposed. Additionally, a Bayesian method for calibrating impedance tubes, widely used to measure the acoustic properties of sound absorbing materials, is developed. Impedance tubes provide a convenient way to characterize the normal-incidence acoustic properties of materials. These measurements rely on accurately knowing the positions of microphones sensing the sound field inside the tube; these positions must be determined acoustically since the physical dimensions of the microphones are larger than the required precision. Using a calibration measurement of the empty tube, the method developed here determines the acoustic positions and their uncertainties for the microphones of an impedance tube. Microperforated panel absorbers are an exciting, relatively new type of sound absorber, requiring no traditional fibrous materials. The provided absorption, however, has a narrow frequency bandwidth. To provide a more broadband absorption, multiple microperforated panels may be combined into a multilayer absorber, but this yields a difficult design challenge. Here, the Bayesian framework is used to design such multilayer microperforated panels. This provides a method that automatically determines the minimum number of layers required and the design parameters for each layer of a multilayer arrangement yielding a desired acoustic absorption profile. Traditional porous materials are widely used as sound absorbers. Additionally, other substances such as soils or sediments may be modeled as porous materials. When studying and attempting to predict the acoustic properties of such materials, knowing the physical properties of the material is essential. A Bayesian approach to infer these physical parameters from an acoustic measurement is developed. In addition to determining the values and associated uncertainties of the physical material parameters

  14. Application of Bayesian graphs to SN Ia data analysis and compression

    Science.gov (United States)

    Ma, Cong; Corasaniti, Pier-Stefano; Bassett, Bruce A.

    2016-12-01

    Bayesian graphical models are an efficient tool for modelling complex data and derive self-consistent expressions of the posterior distribution of model parameters. We apply Bayesian graphs to perform statistical analyses of Type Ia supernova (SN Ia) luminosity distance measurements from the joint light-curve analysis (JLA) data set. In contrast to the χ2 approach used in previous studies, the Bayesian inference allows us to fully account for the standard-candle parameter dependence of the data covariance matrix. Comparing with χ2 analysis results, we find a systematic offset of the marginal model parameter bounds. We demonstrate that the bias is statistically significant in the case of the SN Ia standardization parameters with a maximal 6σ shift of the SN light-curve colour correction. In addition, we find that the evidence for a host galaxy correction is now only 2.4σ. Systematic offsets on the cosmological parameters remain small, but may increase by combining constraints from complementary cosmological probes. The bias of the χ2 analysis is due to neglecting the parameter-dependent log-determinant of the data covariance, which gives more statistical weight to larger values of the standardization parameters. We find a similar effect on compressed distance modulus data. To this end, we implement a fully consistent compression method of the JLA data set that uses a Gaussian approximation of the posterior distribution for fast generation of compressed data. Overall, the results of our analysis emphasize the need for a fully consistent Bayesian statistical approach in the analysis of future large SN Ia data sets.
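
    The source of the reported bias is easy to state: when the covariance C depends on the standardization parameters, the full Gaussian log-likelihood carries a log-determinant term that the χ2 approach drops. A minimal sketch with toy numbers (not the JLA covariance):

        import numpy as np

        def gaussian_loglike(residuals, C):
            """-2 ln L = r^T C^{-1} r + ln det C + const (constant omitted here)."""
            sign, logdet = np.linalg.slogdet(C)
            chi2 = residuals @ np.linalg.solve(C, residuals)
            return -0.5 * (chi2 + logdet)

        # Toy illustration: parameter values that inflate C lower the chi2 term but
        # pay a log-det penalty, which a chi2-only analysis ignores.
        r = np.array([0.3, -0.2, 0.5])
        for scale in (1.0, 2.0):
            print(scale, gaussian_loglike(r, scale * np.eye(3)))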

  15. Application of Bayesian graphs to SN Ia data analysis and compression

    Science.gov (United States)

    Ma, Cong; Corasaniti, Pier-Stefano; Bassett, Bruce A.

    2016-08-01

    Bayesian graphical models are an efficient tool for modelling complex data and derive self-consistent expressions of the posterior distribution of model parameters. We apply Bayesian graphs to perform statistical analyses of Type Ia supernova (SN Ia) luminosity distance measurements from the Joint Light-curve Analysis (JLA) dataset (Betoule et al. 2014). In contrast to the χ2 approach used in previous studies, the Bayesian inference allows us to fully account for the standard-candle parameter dependence of the data covariance matrix. Comparing with χ2 analysis results we find a systematic offset of the marginal model parameter bounds. We demonstrate that the bias is statistically significant in the case of the SN Ia standardization parameters with a maximal 6σ shift of the SN light-curve colour correction. In addition, we find that the evidence for a host galaxy correction is now only 2.4σ. Systematic offsets on the cosmological parameters remain small, but may increase by combining constraints from complementary cosmological probes. The bias of the χ2 analysis is due to neglecting the parameter-dependent log-determinant of the data covariance, which gives more statistical weight to larger values of the standardization parameters. We find a similar effect on compressed distance modulus data. To this end we implement a fully consistent compression method of the JLA dataset that uses a Gaussian approximation of the posterior distribution for fast generation of compressed data. Overall, the results of our analysis emphasize the need for a fully consistent Bayesian statistical approach in the analysis of future large SN Ia datasets.

  16. The effect of close relatives on unsupervised Bayesian clustering algorithms in population genetic structure analysis.

    Science.gov (United States)

    Rodríguez-Ramilo, Silvia T; Wang, Jinliang

    2012-09-01

    The inference of population genetic structure is essential in many research areas in population genetics, conservation biology and evolutionary biology. Recently, unsupervised Bayesian clustering algorithms have been developed to detect hidden population structure from genotypic data, assuming, among other things, that individuals taken from the population are unrelated. Under this assumption, markers in a sample taken from a subpopulation can be considered to be in Hardy-Weinberg and linkage equilibrium. However, close relatives might be sampled from the same subpopulation and consequently might cause Hardy-Weinberg and linkage disequilibrium, thus biasing a population genetic structure analysis. In this study, we used simulated and real data to investigate the impact of close relatives in a sample on Bayesian population structure analysis. We also showed that, when close relatives were identified by a pedigree reconstruction approach and removed, the accuracy of a population genetic structure analysis can be greatly improved. The results indicate that unsupervised Bayesian clustering algorithms cannot be used blindly to detect genetic structure in a sample with closely related individuals. Rather, when closely related individuals are suspected to be frequent in a sample, these individuals should first be identified and removed before conducting a population structure analysis.

  17. A pragmatic Bayesian perspective on correlation analysis: The exoplanetary gravity - stellar activity case

    CERN Document Server

    Figueira, P; Adibekyan, V Zh; Oshagh, M; Santos, N C

    2016-01-01

    We apply the Bayesian framework to assess the presence of a correlation between two quantities. To do so, we estimate the probability distribution of the parameter of interest, $\rho$, characterizing the strength of the correlation. We provide an implementation of these ideas and concepts using the Python programming language and the pyMC module in a very short ($\sim$130 lines of code, heavily commented) and user-friendly program. We used this tool to assess the presence and properties of the correlation between planetary surface gravity and stellar activity level as measured by the log($R'_{\mathrm{HK}}$) indicator. The results of the Bayesian analysis are qualitatively similar to those obtained via p-value analysis, and support the presence of a correlation in the data. The results are more robust in their derivation and more informative, revealing interesting features such as asymmetric posterior distributions or markedly different credible intervals, and allowing for a deeper exploration. We encourage the re...

  18. Assessment of Water Availability Impact on Wetland Management using Multi-temporal Landsat Images and Bayesian-based Learning Machines

    Science.gov (United States)

    Alminagorta, O.; Torres, A. F.

    2013-12-01

    Water availability has a direct impact on wetland ecosystems. While wetland managers need better information to allocate scarce water so as to improve wetland services, most monitoring of flooded areas and vegetation condition in wetlands relies on manual estimation of water depth and the use of airboats with GPS devices. This process is costly and time-consuming. Remote sensing techniques have previously been used to characterize vegetation conditions along with hydrological characteristics of wetlands, with excellent results. Nevertheless, little analysis has been done to relate the resulting wetland characterization to historical water availability records. The present paper addresses the lack of adequate feedback between wetland conditions and the water available to the wetland system by making use of multi-temporal Landsat images. These images are processed at the wetland-unit and system levels to extract information about vegetation, soil, and water conditions. This information is then correlated with historical water availability records for the wetland system by means of the relevance vector machine, a Bayesian-based algorithm known for its robustness, efficiency, and sparseness. This research is applied at the Bear River Migratory Bird Refuge (the Refuge), located on the northeast side of the Great Salt Lake, Utah. The Refuge constitutes one of the most important habitats for migratory birds on the Pacific Flyway of North America. Water-discharge records and vegetation coverage data collected at the Refuge have been used to calibrate the model and to evaluate the effects of flooding and drought on wetland services in wetland units during different years. The final product of this research is a methodology that wetland managers can use to make informed decisions about water allocation to improve wetland services while avoiding waste of resources, effort, time, and money.

  19. A note on the robustness of a full Bayesian method for nonignorable missing data analysis

    OpenAIRE

    Zhang, Zhiyong; Wang, Lijuan

    2012-01-01

    A full Bayesian method utilizing data augmentation and Gibbs sampling algorithms is presented for analyzing nonignorable missing data. The discussion focuses on a simplified selection model for regression analysis. Regardless of missing mechanisms, it is assumed that missingness only depends on the missing variable itself. Simulation results demonstrate that the simplified selection model can recover regression model parameters under both correctly specified situations and many misspecified s...

  20. Bayesian 3D X-ray computed tomography image reconstruction with a scaled Gaussian mixture prior model

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Li; Gac, Nicolas; Mohammad-Djafari, Ali [Laboratoire des Signaux et Systèmes 3, Rue Joliot-Curie 91192 Gif sur Yvette (France)

    2015-01-13

    In order to improve the quality of 3D X-ray tomography reconstruction for Non-Destructive Testing (NDT), we investigate in this paper hierarchical Bayesian methods. In NDT, useful prior information on the volume, such as the limited number of materials or the presence of homogeneous areas, can be included in the iterative reconstruction algorithms. In hierarchical Bayesian methods, not only is the volume estimated thanks to the prior model of the volume, but so are the hyperparameters of this prior. This additional complexity in the reconstruction methods, when applied to large volumes (from 512^3 to 8192^3 voxels), results in an increasing computational cost. To reduce it, the hierarchical Bayesian methods investigated in this paper rely on algorithmic acceleration by Variational Bayesian Approximation (VBA) [1] and on hardware acceleration, with the projection and back-projection operators parallelized on many-core processors such as GPUs [2]. In this paper, we consider a Student-t prior on the gradient of the image, implemented in a hierarchical way [3, 4, 1]. The operators H (forward, or projection) and H^t (adjoint, or back-projection), implemented on multi-GPU [2], have been used in this study. The different methods are evaluated on the synthetic 'Shepp and Logan' volume in terms of reconstruction quality and time. We used several simple regularizations of order 1 and order 2; other prior models also exist [5]. For a discrete image, segmentation and reconstruction can sometimes be performed at the same time, in which case the reconstruction can be done with fewer projections.

  1. A Bayesian spatial temporal mixtures approach to kinetic parametric images in dynamic positron emission tomography.

    Science.gov (United States)

    Zhu, W; Ouyang, J; Rakvongthai, Y; Guehl, N J; Wooten, D W; El Fakhri, G; Normandin, M D; Fan, Y

    2016-03-01

    Estimation of parametric maps is challenging for kinetic models in dynamic positron emission tomography. Since voxel kinetics tend to be spatially contiguous, the authors consider groups of homogeneous voxels together. The authors propose a novel algorithm to identify the groups and estimate kinetic parameters simultaneously; uncertainty estimates for the kinetic parameters are also obtained. Mixture models were used to fit the time activity curves. In order to borrow information from spatially nearby voxels, the Potts model was adopted, and a spatial temporal model was built incorporating both spatial and temporal information in the data. Markov chain Monte Carlo was used to carry out parameter estimation. Evaluation and comparisons with existing methods were carried out on cardiac studies using both simulated data sets and data from a pig study. One-compartment kinetic modeling was used, in which K1, providing a measure of local perfusion, is the parameter of interest. Based on the simulation experiments, the median standard deviations, across all image voxels, of the K1 estimates were 0, 0.13, and 0.16 for the proposed spatial mixture model (SMM), standard curve fitting, and spatial K-means methods, respectively. The corresponding median mean squared biases for K1 were 0.04, 0.06, and 0.06 for the abnormal region of interest (ROI); 0.03, 0.03, and 0.04 for the normal ROI; and 0.007, 0.02, and 0.05 for the noise region. SMM is a fully Bayesian algorithm that determines the optimal number of homogeneous voxel groups, voxel group membership, parameter estimates, and parameter uncertainty estimates simultaneously; the voxel membership can also be used for classification purposes. By borrowing information from spatially nearby voxels, SMM substantially reduces the variability of parameter estimates, and in some ROIs it also reduces the mean squared bias.

  2. Gabor Analysis for Imaging

    DEFF Research Database (Denmark)

    Christensen, Ole; Feichtinger, Hans G.; Paukner, Stephan

    2015-01-01

    Gabor analysis characterizes a function by its transform over phase space, which is the time-frequency plane (TF-plane) in a musical context or the location-wave-number domain in the context of image processing. Since the transition from the signal domain to the phase-space domain introduces an enormous amount of data ... The chapter covers the generalities relevant for an understanding of Gabor analysis of functions on R^d, paying special attention to the case d = 2, which is the most important case for image processing and image analysis applications. The chapter is organized as follows: Section 2 presents central tools from functional analysis ... the application of Gabor expansions to image representation is considered in Sect. 6.

  3. Modified Bayesian Kriging for Noisy Response Problems for Reliability Analysis

    Science.gov (United States)

    2015-01-01

    A surrogate model is used for Monte Carlo simulation (MCS) prediction in the reliability analysis of the sampling-based reliability-based design optimization (RBDO) method. Related references: Choi, K. K., Noh, Y., & Zhao, L. (2011), Sampling-based stochastic sensitivity analysis using score functions for RBDO problems with correlated...; Choi, K. K., and Zhao, L. (2011), Sampling-based RBDO using the stochastic sensitivity analysis and dynamic Kriging method, Structural and...

  4. Bayesian models and meta analysis for multiple tissue gene expression data following corticosteroid administration

    Directory of Open Access Journals (Sweden)

    Kelemen Arpad

    2008-08-01

    Abstract. Background: This paper addresses key biological problems and statistical issues in the analysis of large gene expression data sets that describe systemic temporal response cascades to therapeutic doses in multiple tissues, such as liver, skeletal muscle, and kidney, from the same animals. Affymetrix U34A time course gene expression data were obtained from three different tissues: kidney, liver, and muscle. Our goal is not only to find the concordance of genes in different tissues and identify the common differentially expressed genes over time, but also to examine the reproducibility of the findings by integrating the results through meta-analysis from multiple tissues, in order to gain a significant increase in the power of detecting differentially expressed genes over time and to find the differences in how the three tissues respond to the drug. Results and conclusion: A Bayesian categorical model for estimating the proportion of the 'call' is used for pre-screening genes. A hierarchical Bayesian mixture model is further developed for the identification of differentially expressed genes across time and of dynamic clusters. The deviance information criterion is applied to determine the number of components for model comparison and selection. The Bayesian mixture model produces the gene-specific posterior probability of differential/non-differential expression and the 95% credible interval, which is the basis for our further Bayesian meta-inference. Meta-analysis is performed in order to identify commonly expressed genes from multiple tissues, which may serve as ideal targets for novel treatment strategies, and to integrate the results across separate studies. We found common expressed genes in the three tissues; however, the up/down/no regulation of these common genes differs at different time points. Moreover, the most differentially expressed genes were found in the liver, then in kidney, and then in muscle.

  5. The potential for Bayesian compressive sensing to significantly reduce electron dose in high-resolution STEM images.

    Science.gov (United States)

    Stevens, Andrew; Yang, Hao; Carin, Lawrence; Arslan, Ilke; Browning, Nigel D

    2014-02-01

    The use of high-resolution imaging methods in scanning transmission electron microscopy (STEM) is limited in many cases by the sensitivity of the sample to the beam and the onset of electron beam damage (for example, in the study of organic systems, in tomography and during in situ experiments). To demonstrate that alternative strategies for image acquisition can help alleviate this beam damage issue, here we apply compressive sensing via Bayesian dictionary learning to high-resolution STEM images. These computational algorithms have been applied to a set of images with a reduced number of sampled pixels in the image. For a reduction in the number of pixels down to 5% of the original image, the algorithms can recover the original image from the reduced data set. We show that this approach is valid for both atomic-resolution images and nanometer-resolution studies, such as those that might be used in tomography datasets, by applying the method to images of strontium titanate and zeolites. As STEM images are acquired pixel by pixel while the beam is scanned over the surface of the sample, these postacquisition manipulations of the images can, in principle, be directly implemented as a low-dose acquisition method with no change in the electron optics or the alignment of the microscope itself.
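
    The Bayesian dictionary-learning recovery itself is beyond a few lines, but the flavor of reconstructing an image from ~5% of its pixels can be conveyed with a simpler compressive-sensing stand-in: ℓ1-regularized recovery in an orthonormal DCT basis via iterative soft thresholding (ISTA). The smooth synthetic image and every parameter below are invented:

        import numpy as np
        from scipy.fft import dctn, idctn

        rng = np.random.default_rng(7)
        n = 64
        u, v = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
        img = np.sin(6 * u) + np.cos(4 * v)            # smooth synthetic "image"
        mask = rng.random((n, n)) < 0.05               # keep only 5% of the pixels
        y = mask * img

        lam, coef = 0.01, np.zeros((n, n))
        for _ in range(300):                           # ISTA iterations
            resid = mask * idctn(coef, norm="ortho") - y
            grad = dctn(mask * resid, norm="ortho")    # adjoint of (mask o inverse DCT)
            c = coef - grad                            # step size 1 is safe (||A|| <= 1)
            coef = np.sign(c) * np.maximum(np.abs(c) - lam, 0.0)   # soft threshold

        recon = idctn(coef, norm="ortho")
        err = np.linalg.norm(recon - img) / np.linalg.norm(img)
        print(f"relative reconstruction error: {err:.3f}")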

  6. Bayesian Propensity Score Analysis: Simulation and Case Study

    Science.gov (United States)

    Kaplan, David; Chen, Cassie J. S.

    2011-01-01

    Propensity score analysis (PSA) has been used in a variety of settings, such as education, epidemiology, and sociology. Most typically, propensity score analysis has been implemented within the conventional frequentist perspective of statistics. This perspective, as is well known, does not account for uncertainty in either the parameters of the…

  7. A method of spherical harmonic analysis in the geosciences via hierarchical Bayesian inference

    Science.gov (United States)

    Muir, J. B.; Tkalčić, H.

    2015-11-01

    The problem of decomposing irregular data on the sphere into a set of spherical harmonics is common in many fields of geosciences where it is necessary to build a quantitative understanding of a globally varying field. For example, in global seismology, a compressional or shear wave speed that emerges from tomographic images is used to interpret current state and composition of the mantle, and in geomagnetism, secular variation of magnetic field intensity measured at the surface is studied to better understand the changes in the Earth's core. Optimization methods are widely used for spherical harmonic analysis of irregular data, but they typically do not treat the dependence of the uncertainty estimates on the imposed regularization. This can cause significant difficulties in interpretation, especially when the best-fit model requires more variables as a result of underestimating data noise. Here, with the above limitations in mind, the problem of spherical harmonic expansion of irregular data is treated within the hierarchical Bayesian framework. The hierarchical approach significantly simplifies the problem by removing the need for regularization terms and user-supplied noise estimates. The use of the corrected Akaike Information Criterion for picking the optimal maximum degree of spherical harmonic expansion and the resulting spherical harmonic analyses are first illustrated on a noisy synthetic data set. Subsequently, the method is applied to two global data sets sensitive to the Earth's inner core and lowermost mantle, consisting of PKPab-df and PcP-P differential traveltime residuals relative to a spherically symmetric Earth model. The posterior probability distributions for each spherical harmonic coefficient are calculated via Markov Chain Monte Carlo sampling; the uncertainty obtained for the coefficients thus reflects the noise present in the real data and the imperfections in the spherical harmonic expansion.
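
    Stripped of the hierarchy (noise level and regularization assumed known rather than inferred), the core expansion step is a least-squares fit of spherical-harmonic coefficients to irregularly sampled points; a minimal sketch with a single synthetic harmonic:

        import numpy as np
        from scipy.special import sph_harm

        rng = np.random.default_rng(8)
        npts, L = 400, 4
        theta = rng.uniform(0, 2 * np.pi, npts)       # azimuth
        phi = np.arccos(rng.uniform(-1, 1, npts))     # colatitude, uniform on the sphere

        # Synthetic complex field: a single (l=3, m=2) harmonic plus noise
        data = sph_harm(2, 3, theta, phi) \
             + 0.05 * (rng.normal(size=npts) + 1j * rng.normal(size=npts))

        A = np.column_stack([sph_harm(m, l, theta, phi)
                             for l in range(L + 1) for m in range(-l, l + 1)])
        coef, *_ = np.linalg.lstsq(A, data, rcond=None)
        idx = 3 * (3 + 1) + 2                         # flattened index of (l=3, m=2)
        print("recovered |c_{3,2}|:", abs(coef[idx])) # should be close to 1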

  8. Risks Analysis of Logistics Financial Business Based on Evidential Bayesian Network

    Directory of Open Access Journals (Sweden)

    Ying Yan

    2013-01-01

    Risks in logistics financial business are identified and classified. Taking the failure of the business as the root node, a Bayesian network is constructed to measure the risk levels in the business. Three importance indexes are calculated to find the most important risks in the business. Moreover, considering the epistemic uncertainties in the risks, evidence theory is combined with the Bayesian network into an evidential network for the risk analysis of logistics finance. To find how much uncertainty in the root node is produced by each risk, a new index, epistemic importance, is defined. Numerical examples show that the proposed methods provide a great deal of useful information. With this information, effective approaches can be found to control and avoid these sensitive risks, thus keeping the logistics financial business working more reliably. The proposed method also gives a quantitative measure of risk levels in logistics financial business, which provides guidance for the selection of financing solutions.

  9. MorePower 6.0 for ANOVA with relational confidence intervals and Bayesian analysis.

    Science.gov (United States)

    Campbell, Jamie I D; Thompson, Valerie A

    2012-12-01

    MorePower 6.0 is a flexible freeware statistical calculator that computes sample size, effect size, and power statistics for factorial ANOVA designs. It also calculates relational confidence intervals for ANOVA effects based on formulas from Jarmasz and Hollands (Canadian Journal of Experimental Psychology 63:124-138, 2009), as well as Bayesian posterior probabilities for the null and alternative hypotheses based on formulas in Masson (Behavior Research Methods 43:679-690, 2011). The program is unique in affording direct comparison of these three approaches to the interpretation of ANOVA tests. Its high numerical precision and ability to work with complex ANOVA designs could facilitate researchers' attention to issues of statistical power, Bayesian analysis, and the use of confidence intervals for data interpretation. MorePower 6.0 is available at https://wiki.usask.ca/pages/viewpageattachments.action?pageId=420413544 .

  10. Factor analysis models for structuring covariance matrices of additive genetic effects: a Bayesian implementation

    Directory of Open Access Journals (Sweden)

    Gianola Daniel

    2007-09-01

    Full Text Available Abstract Multivariate linear models are increasingly important in quantitative genetics. In high dimensional specifications, factor analysis (FA) may provide an avenue for structuring (co)variance matrices, thus reducing the number of parameters needed for describing (co)dispersion. We describe how FA can be used to model genetic effects in the context of a multivariate linear mixed model. An orthogonal common factor structure is used to model genetic effects under a Gaussian assumption, so that the marginal likelihood is multivariate normal with a structured genetic (co)variance matrix. Under standard prior assumptions, all fully conditional distributions have closed form, and samples from the joint posterior distribution can be obtained via Gibbs sampling. The model and the algorithm developed for its Bayesian implementation were used to describe five repeated records of milk yield in dairy cattle, and a one-common-factor FA model was compared with a standard multiple-trait model. The Bayesian Information Criterion favored the FA model.

  11. Bayesian analysis of longitudinal Johne's disease diagnostic data without a gold standard test

    DEFF Research Database (Denmark)

    Wang, C.; Turnbull, B.W.; Nielsen, Søren Saxmose

    2011-01-01

    A Bayesian methodology was developed based on a latent change-point model to evaluate the performance of milk ELISA and fecal culture tests for longitudinal Johne's disease diagnostic data. The situation of no perfect reference test was considered; that is, no “gold standard.” A change-point process with a Weibull survival hazard function was used to model the progression of the hidden disease status. The model adjusted for the fixed effects of covariate variables and random effects of subject on the diagnostic testing procedure. Markov chain Monte Carlo methods were used to compute... An application is presented to an analysis of ELISA and fecal culture test outcomes in the diagnostic testing of paratuberculosis (Johne's disease) for a Danish longitudinal study from January 2000 to March 2003. The posterior probability criterion based on the Bayesian model with 4 repeated observations has...

  12. A PAC-Bayesian Analysis of Graph Clustering and Pairwise Clustering

    CERN Document Server

    Seldin, Yevgeny

    2010-01-01

    We formulate weighted graph clustering as a prediction problem: given a subset of edge weights we analyze the ability of graph clustering to predict the remaining edge weights. This formulation enables practical and theoretical comparison of different approaches to graph clustering as well as comparison of graph clustering with other possible ways to model the graph. We adapt the PAC-Bayesian analysis of co-clustering (Seldin and Tishby, 2008; Seldin, 2009) to derive a PAC-Bayesian generalization bound for graph clustering. The bound shows that graph clustering should optimize a trade-off between empirical data fit and the mutual information that clusters preserve on the graph nodes. A similar trade-off derived from information-theoretic considerations was already shown to produce state-of-the-art results in practice (Slonim et al., 2005; Yom-Tov and Slonim, 2009). This paper supports the empirical evidence by providing a better theoretical foundation, suggesting formal generalization guarantees, and offering...

  13. Bayesian analysis of non-homogeneous Markov chains: application to mental health data.

    Science.gov (United States)

    Sung, Minje; Soyer, Refik; Nhan, Nguyen

    2007-07-10

    In this paper we present a formal treatment of non-homogeneous Markov chains by introducing a hierarchical Bayesian framework. Our work is motivated by the analysis of correlated categorical data which arise in assessment of psychiatric treatment programs. In our development, we introduce a Markovian structure to describe the non-homogeneity of transition patterns. In doing so, we introduce a logistic regression set-up for Markov chains and incorporate covariates in our model. We present a Bayesian model using Markov chain Monte Carlo methods and develop inference procedures to address issues encountered in the analyses of data from psychiatric treatment programs. Our model and inference procedures are applied to real data from a psychiatric treatment study.
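
    As a rough illustration of the idea (a toy, not the authors' psychiatric model), a two-state non-homogeneous chain can let the logit of each transition probability depend on a covariate such as time. The sketch below simulates such a chain with invented coefficients.

```python
import numpy as np

def transition_matrix(t, beta):
    """Two-state non-homogeneous chain: the logit of each switching probability depends on time t."""
    p01 = 1 / (1 + np.exp(-(beta[0] + beta[1] * t)))   # P(0 -> 1) at time t
    p10 = 1 / (1 + np.exp(-(beta[2] + beta[3] * t)))   # P(1 -> 0) at time t
    return np.array([[1 - p01, p01],
                     [p10, 1 - p10]])

rng = np.random.default_rng(4)
beta = [-2.0, 0.1, -1.0, -0.05]   # hypothetical regression coefficients
state, path = 0, [0]
for t in range(1, 50):
    P = transition_matrix(t, beta)
    state = rng.choice(2, p=P[state])
    path.append(int(state))
print(path)
```

    In a Bayesian treatment the coefficients beta would receive priors and be sampled by MCMC given observed transition counts, rather than fixed as here.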

  14. Application of Bayesian graphs to SN Ia data analysis and compression

    CERN Document Server

    Ma, Con; Bassett, Bruce A

    2016-01-01

    Bayesian graphical models are an efficient tool for modelling complex data and deriving self-consistent expressions of the posterior distribution of model parameters. We apply Bayesian graphs to perform statistical analyses of Type Ia supernova (SN Ia) luminosity distance measurements from the Joint Light-curve Analysis (JLA) dataset (Betoule et al. 2014, arXiv:1401.4064). In contrast to the $\chi^2$ approach used in previous studies, the Bayesian inference allows us to fully account for the standard-candle parameter dependence of the data covariance matrix. Comparing with $\chi^2$ analysis results we find a systematic offset of the marginal model parameter bounds. We demonstrate that the bias is statistically significant in the case of the SN Ia standardization parameters with a maximal $6\sigma$ shift of the SN light-curve colour correction. In addition, we find that the evidence for a host galaxy correction is now only $2.4\sigma$. Systematic offsets on the cosmological parameters remain small, but may incre...

  15. bspmma: An R Package for Bayesian Semiparametric Models for Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Deborah Burr

    2012-07-01

    Full Text Available We introduce an R package, bspmma, which implements a Dirichlet-based random effects model specific to meta-analysis. In meta-analysis, when combining effect estimates from several heterogeneous studies, it is common to use a random-effects model. The usual frequentist or Bayesian models specify a normal distribution for the true effects. However, in many situations, the effect distribution is not normal, e.g., it can have thick tails, be skewed, or be multi-modal. A Bayesian nonparametric model based on mixtures of Dirichlet process priors has been proposed in the literature, for the purpose of accommodating the non-normality. We review this model and then describe a competitor, a semiparametric version which has the feature that it allows for a well-defined centrality parameter convenient for determining whether the overall effect is significant. This second Bayesian model is based on a different version of the Dirichlet process prior, and we call it the "conditional Dirichlet model". The package contains functions to carry out analyses based on either the ordinary or the conditional Dirichlet model, functions for calculating certain Bayes factors that provide a check on the appropriateness of the conditional Dirichlet model, and functions that enable an empirical Bayes selection of the precision parameter of the Dirichlet process. We illustrate the use of the package on two examples, and give an interpretation of the results in these two different scenarios.

  16. Time-varying nonstationary multivariate risk analysis using a dynamic Bayesian copula

    Science.gov (United States)

    Sarhadi, Ali; Burn, Donald H.; Concepción Ausín, María.; Wiper, Michael P.

    2016-03-01

    A time-varying risk analysis is proposed for an adaptive design framework in nonstationary conditions arising from climate change. A Bayesian, dynamic conditional copula is developed for modeling the time-varying dependence structure between mixed continuous and discrete multiattributes of multidimensional hydrometeorological phenomena. Joint Bayesian inference is carried out to fit the marginals and copula in an illustrative example using an adaptive, Gibbs Markov Chain Monte Carlo (MCMC) sampler. Posterior mean estimates and credible intervals are provided for the model parameters and the Deviance Information Criterion (DIC) is used to select the model that best captures different forms of nonstationarity over time. This study also introduces a fully Bayesian, time-varying joint return period for multivariate time-dependent risk analysis in nonstationary environments. The results demonstrate that the nature and the risk of extreme-climate multidimensional processes change over time under the impact of climate change, and accordingly long-term decision-making strategies should be updated based on the anomalies of the nonstationary environment.
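
    The DIC used above for model selection is easy to compute from MCMC output: DIC = Dbar + pD, where Dbar is the posterior mean deviance and pD = Dbar - D(theta_bar) is the effective number of parameters. A toy illustration on a normal-mean posterior follows (hypothetical data, not the paper's hydrometeorological models).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
y = rng.normal(0.5, 1.0, 50)                       # toy data, sigma known and equal to 1

# Posterior draws of the mean under a flat prior: N(ybar, 1/n)
mu = rng.normal(y.mean(), 1 / np.sqrt(len(y)), 5000)

# Deviance D(mu) = -2 log p(y | mu) evaluated at each posterior draw
dev = np.array([-2 * stats.norm.logpdf(y, m, 1.0).sum() for m in mu])
d_bar = dev.mean()                                 # posterior mean deviance
d_hat = -2 * stats.norm.logpdf(y, mu.mean(), 1.0).sum()   # deviance at the posterior mean
p_d = d_bar - d_hat                                # effective number of parameters
print("pD =", round(p_d, 2), " DIC =", round(d_bar + p_d, 2))
```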

  17. Digital image analysis

    DEFF Research Database (Denmark)

    Riber-Hansen, Rikke; Vainer, Ben; Steiniche, Torben

    2012-01-01

    Digital image analysis (DIA) is increasingly implemented in histopathological research to facilitate truly quantitative measurements, decrease inter-observer variation and reduce hands-on time. Originally, efforts were made to enable DIA to reproduce manually obtained results on histological slides... reproducibility, application of stereology-based quantitative measurements, time consumption, optimization of histological slides, regions of interest selection and recent developments in staining and imaging techniques.

  18. Bayesian model-based cluster analysis for predicting macrofaunal communities

    NARCIS (Netherlands)

    Braak, ter C.J.F.; Hoijtink, H.; Akkermans, W.; Verdonschot, P.F.M.

    2003-01-01

    To predict macrofaunal community composition from environmental data a two-step approach is often followed: (1) the water samples are clustered into groups on the basis of the macrofauna data and (2) the groups are related to the environmental data, e.g. by discriminant analysis. For the cluster ana...

  19. Bayesian conformational analysis of ring molecules through reversible jump MCMC

    DEFF Research Database (Denmark)

    Nolsøe, Kim; Kessler, Mathieu; Pérez, José

    2005-01-01

    In this paper we address the problem of classifying the conformations of m-membered rings using experimental observations obtained by crystal structure analysis. We formulate a model for the data generation mechanism that consists of a multidimensional mixture model. We perform inference...

  20. A hybrid classifier using the parallelepiped and Bayesian techniques. [for multispectral image data

    Science.gov (United States)

    Addington, J. D.

    1975-01-01

    A versatile classification scheme is developed which uses the best features of the parallelepiped algorithm and the Bayesian maximum likelihood algorithm. The parallelepiped technique has the advantage of being very fast, especially when implemented into a table look-up scheme; its disadvantage is its inability to distinguish and classify spectral signatures which are similar in nature. This disadvantage is eliminated by the Bayesian technique which is capable of distinguishing subtle differences very well. The hybrid algorithm developed reduces computer time by as much as 90%. A two- and n-dimensional description of the hybrid classifier is given.
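
    A present-day reconstruction of such a hybrid might look like the sketch below: a cheap per-class bounding-box (parallelepiped) test runs first, and a Gaussian maximum-likelihood tie-break is used only for samples that fall in zero or several boxes. This is an illustrative reimplementation under assumed conventions, not the 1975 code; HybridClassifier and all names are invented.

```python
import numpy as np

class HybridClassifier:
    """Parallelepiped pre-filter with a Gaussian maximum-likelihood fallback (sketch)."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.lo_, self.hi_, self.mean_, self.icov_, self.logdet_ = {}, {}, {}, {}, {}
        for c in self.classes_:
            Xc = X[y == c]
            self.lo_[c], self.hi_[c] = Xc.min(0), Xc.max(0)   # per-band min/max box
            self.mean_[c] = Xc.mean(0)
            cov = np.cov(Xc, rowvar=False)
            self.icov_[c] = np.linalg.inv(cov)
            self.logdet_[c] = np.linalg.slogdet(cov)[1]
        return self

    def predict(self, X):
        out = np.empty(len(X), dtype=self.classes_.dtype)
        for i, x in enumerate(X):
            # Fast table-lookup-style box test first
            inside = [c for c in self.classes_
                      if np.all(x >= self.lo_[c]) and np.all(x <= self.hi_[c])]
            if len(inside) == 1:
                out[i] = inside[0]
                continue
            # Ambiguous or empty: Gaussian maximum likelihood among the candidates
            cands = inside if inside else list(self.classes_)
            def loglik(c):
                d = x - self.mean_[c]
                return -0.5 * (d @ self.icov_[c] @ d + self.logdet_[c])
            out[i] = max(cands, key=loglik)
        return out

rng = np.random.default_rng(0)
X0, X1 = rng.normal(0, 1, (100, 4)), rng.normal(3, 1, (100, 4))
X, y = np.vstack([X0, X1]), np.array([0] * 100 + [1] * 100)
clf = HybridClassifier().fit(X, y)
print("training accuracy:", (clf.predict(X) == y).mean())
```

    The speed-up quoted in the abstract comes from the fact that well-separated signatures are resolved by the cheap box test and never reach the quadratic-form evaluation.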

  1. Bayesian Nonnegative Matrix Factorization with Volume Prior for Unmixing of Hyperspectral Images

    DEFF Research Database (Denmark)

    Arngren, Morten; Schmidt, Mikkel Nørgaard; Larsen, Jan

    2009-01-01

    In hyperspectral image analysis the objective is to unmix a set of acquired pixels into pure spectral signatures (endmembers) and corresponding fractional abundances. The Non-negative Matrix Factorization (NMF) methods have received a lot of attention for this unmixing process. Many of these NMF...
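
    For context, the baseline (non-Bayesian, no volume prior) NMF that such unmixing methods build on can be written in a few lines using the classical Lee-Seung multiplicative updates; the endmember spectra and abundances below are synthetic and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
bands, pixels, endmembers = 50, 200, 3
A_true = rng.random((bands, endmembers))                  # endmember spectra
S_true = rng.dirichlet(np.ones(endmembers), pixels).T     # fractional abundances, sum to one
X = A_true @ S_true + 0.01 * rng.random((bands, pixels))  # noisy hyperspectral pixels

# Plain multiplicative updates for X ~ A S with A, S >= 0 (Frobenius objective)
A = rng.random((bands, endmembers))
S = rng.random((endmembers, pixels))
eps = 1e-9
for _ in range(500):
    S *= (A.T @ X) / (A.T @ A @ S + eps)
    A *= (X @ S.T) / (A @ S @ S.T + eps)
print("relative reconstruction error:", np.linalg.norm(X - A @ S) / np.linalg.norm(X))
```

    The Bayesian variants in the abstract replace this point estimate with a posterior over A and S, using priors (such as a volume prior on the endmember simplex) to resolve the non-uniqueness of the factorization.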

  2. Bayesian latent variable models for the analysis of experimental psychology data.

    Science.gov (United States)

    Merkle, Edgar C; Wang, Ting

    2016-03-18

    In this paper, we address the use of Bayesian factor analysis and structural equation models to draw inferences from experimental psychology data. While such an application is non-standard, the models are generally useful for the unified analysis of multivariate data that stem from, e.g., subjects' responses to multiple experimental stimuli. We first review the models and the parameter identification issues inherent in the models. We then provide details on model estimation via JAGS and on Bayes factor estimation. Finally, we use the models to re-analyze experimental data on risky choice, comparing the approach to simpler, alternative methods.

  3. Microcanonical thermostatistics analysis without histograms: cumulative distribution and Bayesian approaches

    CERN Document Server

    Alves, Nelson A; Rizzi, Leandro G

    2015-01-01

    Microcanonical thermostatistics analysis has become an important tool to reveal essential aspects of phase transitions in complex systems. An efficient way to estimate the microcanonical inverse temperature $\beta(E)$ and the microcanonical entropy $S(E)$ is achieved with the statistical temperature weighted histogram analysis method (ST-WHAM). The strength of this method lies in its flexibility, as it can be used to analyse data produced by algorithms with generalised sampling weights. However, for any sampling weight, ST-WHAM requires the calculation of derivatives of energy histograms $H(E)$, which leads to non-trivial and tedious binning tasks for models with continuous energy spectrum such as those for biomolecular and colloidal systems. Here, we discuss two alternative methods that avoid the need for such energy binning to obtain continuous estimates for $H(E)$ in order to evaluate $\beta(E)$ by using ST-WHAM: (i) a series expansion to estimate probability densities from the empirical cumulative distrib...

  4. Bayesian meta-analysis models for microarray data: a comparative study

    Directory of Open Access Journals (Sweden)

    Song Joon J

    2007-03-01

    Full Text Available Abstract Background With the growing abundance of microarray data, statistical methods are increasingly needed to integrate results across studies. Two common approaches for meta-analysis of microarrays include either combining gene expression measures across studies or combining summaries such as p-values, probabilities or ranks. Here, we compare two Bayesian meta-analysis models that are analogous to these methods. Results Two Bayesian meta-analysis models for microarray data have recently been introduced. The first model combines standardized gene expression measures across studies into an overall mean, accounting for inter-study variability, while the second combines probabilities of differential expression without combining expression values. Both models produce the gene-specific posterior probability of differential expression, which is the basis for inference. Since the standardized expression integration model includes inter-study variability, it may improve accuracy of results versus the probability integration model. However, due to the small number of studies typical in microarray meta-analyses, the variability between studies is challenging to estimate. The probability integration model eliminates the need to model variability between studies, and thus its implementation is more straightforward. We found in simulations of two and five studies that combining probabilities outperformed combining standardized gene expression measures for three comparison values: the percent of true discovered genes in meta-analysis versus individual studies; the percent of true genes omitted in meta-analysis versus separate studies, and the number of true discovered genes for fixed levels of Bayesian false discovery. We identified similar results when pooling two independent studies of Bacillus subtilis. We assumed that each study was produced from the same microarray platform with only two conditions: a treatment and control, and that the data sets

  5. OVERALL SENSITIVITY ANALYSIS UTILIZING BAYESIAN NETWORK FOR THE QUESTIONNAIRE INVESTIGATION ON SNS

    Directory of Open Access Journals (Sweden)

    Tsuyoshi Aburai

    2013-11-01

    Full Text Available Social Networking Services (SNS) have spread rapidly in Japan in recent years. The most popular ones are Facebook, mixi, and Twitter, which are used in many areas of life, together with convenient tools such as smartphones. In this work, a questionnaire investigation is carried out in order to clarify the current usage conditions, issues and desired functions. More than 1,000 samples were gathered. A Bayesian network is utilized for the analysis. Sensitivity analysis is carried out by setting evidence to all items, which enables an overall analysis for each item. Some useful results were obtained from the sensitivity analysis. We have previously presented a paper on this investigation; because of its volume, the material was split, and this paper reports the latter half of the results, obtained by setting evidence on the Bayesian network parameters. Differences in usage objectives and SNS sites are made clear by the attributes and preferences of SNS users. These findings can be utilized effectively for marketing by identifying the target customers through the sensitivity analysis.

  6. Diagnostic accuracy of a bayesian latent group analysis for the detection of malingering-related poor effort.

    Science.gov (United States)

    Ortega, Alonso; Labrenz, Stephan; Markowitsch, Hans J; Piefke, Martina

    2013-01-01

    In the last decade, different statistical techniques have been introduced to improve assessment of malingering-related poor effort. In this context, we have recently shown preliminary evidence that a Bayesian latent group model may help to optimize classification accuracy using a simulation research design. In the present study, we conducted two analyses. Firstly, we evaluated how accurately this Bayesian approach can distinguish between participants answering in an honest way (honest response group) and participants feigning cognitive impairment (experimental malingering group). Secondly, we tested the accuracy of our model in the differentiation between patients who had real cognitive deficits (cognitively impaired group) and participants who belonged to the experimental malingering group. All Bayesian analyses were conducted using the raw scores of a visual recognition forced-choice task (2AFC), the Test of Memory Malingering (TOMM, Trial 2), and the Word Memory Test (WMT, primary effort subtests). The first analysis showed 100% accuracy for the Bayesian model in distinguishing participants of both groups with all effort measures. The second analysis showed outstanding overall accuracy of the Bayesian model when estimates were obtained from the 2AFC and the TOMM raw scores. Diagnostic accuracy of the Bayesian model diminished when using the WMT total raw scores. Despite this, overall diagnostic accuracy can still be considered excellent. The most plausible explanation for this decrement is the low performance in verbal recognition and fluency tasks of some patients of the cognitively impaired group. Additionally, the Bayesian model provides individual estimates, p(z_i | D), of examinees' effort levels. In conclusion, both high classification accuracy levels and Bayesian individual estimates of effort may be very useful for clinicians when assessing effort in medico-legal settings.

  7. A Bayesian approach to distinguishing interdigitated tongue muscles from limited diffusion magnetic resonance imaging.

    Science.gov (United States)

    Ye, Chuyang; Murano, Emi; Stone, Maureen; Prince, Jerry L

    2015-10-01

    ...computed fiber directions in both the controls and the patients were also compared, suggesting a potential clinical use for this imaging and image analysis methodology.

  8. Crash risk analysis for Shanghai urban expressways: A Bayesian semi-parametric modeling approach.

    Science.gov (United States)

    Yu, Rongjie; Wang, Xuesong; Yang, Kui; Abdel-Aty, Mohamed

    2016-10-01

    Urban expressway systems have been developed rapidly in recent years in China; they have become a key part of city roadway networks, carrying large traffic volumes and providing high travel speeds. Along with the increase in traffic volume, traffic safety has become a major issue for Chinese urban expressways due to frequent crash occurrence and the non-recurrent congestion that crashes cause. For the purpose of unveiling crash occurrence mechanisms and further developing Active Traffic Management (ATM) control strategies to improve traffic safety, this study developed disaggregate crash risk analysis models with loop detector traffic data and historical crash data. Bayesian random effects logistic regression models were utilized as they can account for the unobserved heterogeneity among crashes. However, previous crash risk analysis studies formulated random effects distributions in a parametric approach, which assigned them to follow normal distributions. Given the limited information known about random effects distributions, such a subjective parametric setting may be incorrect. In order to construct more flexible and robust random effects to capture the unobserved heterogeneity, a Bayesian semi-parametric inference technique was introduced to crash risk analysis in this study. Models with both inference techniques were developed for total crashes; the semi-parametric models were shown to provide substantially better goodness-of-fit, while the two models shared consistent coefficient estimations. Subsequently, Bayesian semi-parametric random effects logistic regression models were developed for weekday peak hour crashes, weekday non-peak hour crashes, and weekend non-peak hour crashes to investigate different crash occurrence scenarios. Significant factors that affect crash risk have been revealed and conclusions about crash mechanisms have been drawn.

  9. BayesLCA: An R Package for Bayesian Latent Class Analysis

    Directory of Open Access Journals (Sweden)

    Arthur White

    2014-11-01

    Full Text Available The BayesLCA package for R provides tools for performing latent class analysis within a Bayesian setting. Three methods for fitting the model are provided, incorporating an expectation-maximization algorithm, Gibbs sampling and a variational Bayes approximation. The article briefly outlines the methodology behind each of these techniques and discusses some of the technical difficulties associated with them. Methods to remedy these problems are also described. Visualization methods for each of these techniques are included, as well as criteria to aid model selection.

  10. Combat Identification of Synthetic Aperture Radar Images Using Contextual Features and Bayesian Belief Networks

    Science.gov (United States)

    2012-03-01

    ...The second is unsupervised classification, in which the target is assigned to an unknown class. ... classification accuracy at similar or extended operating conditions. Classification accuracy improvements achieved through Bayesian Belief Networks and the direct...

  11. Nonlinear Bayesian Algorithms for Gas Plume Detection and Estimation from Hyper-spectral Thermal Image Data

    Energy Technology Data Exchange (ETDEWEB)

    Heasler, Patrick G.; Posse, Christian; Hylden, Jeff L.; Anderson, Kevin K.

    2007-06-13

    This paper presents a nonlinear Bayesian regression algorithm for the purpose of detecting and estimating gas plume content from hyper-spectral data. Remote sensing data, by its very nature, is collected under less controlled conditions than laboratory data. As a result, the physics-based model that is used to describe the relationship between the observed remote-sensing spectra, and the terrestrial (or atmospheric) parameters that we desire to estimate, is typically littered with many unknown "nuisance" parameters (parameters that we are not interested in estimating, but also appear in the model). Bayesian methods are well-suited for this context as they automatically incorporate the uncertainties associated with all nuisance parameters into the error estimates of the parameters of interest. The nonlinear Bayesian regression methodology is illustrated on realistic simulated data from a three-layer model for longwave infrared (LWIR) measurements from a passive instrument. This shows that this approach should permit more accurate estimation as well as a more reasonable description of estimate uncertainty.

  12. Fuzzy Bayesian Network-Bow-Tie Analysis of Gas Leakage during Biomass Gasification.

    Directory of Open Access Journals (Sweden)

    Fang Yan

    Full Text Available Biomass gasification technology has been rapidly developed recently. But fire and poisoning accidents caused by gas leakage restrict the development and promotion of biomass gasification. Therefore, probabilistic safety assessment (PSA) is necessary for biomass gasification systems. Subsequently, Bayesian network-bow-tie (BN-bow-tie) analysis was proposed by mapping bow-tie analysis into a Bayesian network (BN). Causes of gas leakage and the accidents triggered by gas leakage can be obtained by bow-tie analysis, and BN was used to confirm the critical nodes of accidents by introducing three corresponding importance measures. Meanwhile, the occurrence probabilities of failure were needed in PSA. In view of the insufficient failure data of biomass gasification, the occurrence probabilities of failure which cannot be obtained from standard reliability data sources were confirmed by fuzzy methods based on expert judgment. An improved approach that uses expert weighting to aggregate fuzzy numbers, including triangular and trapezoidal numbers, was proposed, and the occurrence probability of failure was obtained. Finally, safety measures were indicated based on the obtained critical nodes. The theoretical one-year occurrence probabilities of gas leakage and of the accidents caused by it were reduced to 1/10.3 of their original values by these safety measures.

  13. Fuzzy Bayesian Network-Bow-Tie Analysis of Gas Leakage during Biomass Gasification.

    Science.gov (United States)

    Yan, Fang; Xu, Kaili; Yao, Xiwen; Li, Yang

    2016-01-01

    Biomass gasification technology has been rapidly developed recently. But fire and poisoning accidents caused by gas leakage restrict the development and promotion of biomass gasification. Therefore, probabilistic safety assessment (PSA) is necessary for biomass gasification systems. Subsequently, Bayesian network-bow-tie (BN-bow-tie) analysis was proposed by mapping bow-tie analysis into a Bayesian network (BN). Causes of gas leakage and the accidents triggered by gas leakage can be obtained by bow-tie analysis, and BN was used to confirm the critical nodes of accidents by introducing three corresponding importance measures. Meanwhile, the occurrence probabilities of failure were needed in PSA. In view of the insufficient failure data of biomass gasification, the occurrence probabilities of failure which cannot be obtained from standard reliability data sources were confirmed by fuzzy methods based on expert judgment. An improved approach that uses expert weighting to aggregate fuzzy numbers, including triangular and trapezoidal numbers, was proposed, and the occurrence probability of failure was obtained. Finally, safety measures were indicated based on the obtained critical nodes. The theoretical one-year occurrence probabilities of gas leakage and of the accidents caused by it were reduced to 1/10.3 of their original values by these safety measures.

  14. Medical Image Analysis Facility

    Science.gov (United States)

    1978-01-01

    To improve the quality of photos sent to Earth by unmanned spacecraft, NASA's Jet Propulsion Laboratory (JPL) developed a computerized image enhancement process that brings out detail not visible in the basic photo. JPL is now applying this technology to biomedical research in its Medical Image Analysis Facility, which employs computer enhancement techniques to analyze x-ray films of internal organs, such as the heart and lung. A major objective is study of the effects of stress on persons with heart disease. In animal tests, computerized image processing is being used to study coronary artery lesions and the degree to which they reduce arterial blood flow when stress is applied. The photos illustrate the enhancement process. The upper picture is an x-ray photo in which the artery (dotted line) is barely discernible; in the post-enhancement photo at right, the whole artery and the lesions along its wall are clearly visible. The Medical Image Analysis Facility offers a faster means of studying the effects of complex coronary lesions in humans, and the research now being conducted on animals is expected to have important application to diagnosis and treatment of human coronary disease. Other uses of the facility's image processing capability include analysis of muscle biopsy and pap smear specimens, and study of the microscopic structure of fibroprotein in the human lung. Working with JPL on experiments are NASA's Ames Research Center, the University of Southern California School of Medicine, and Rancho Los Amigos Hospital, Downey, California.

  15. Assessment of myocardial metabolic rate of glucose by means of Bayesian ICA and Markov Chain Monte Carlo methods in small animal PET imaging

    Science.gov (United States)

    Berradja, Khadidja; Boughanmi, Nabil

    2016-09-01

    In dynamic cardiac PET FDG studies the assessment of the myocardial metabolic rate of glucose (MMRG) requires knowledge of the blood input function (IF). The IF can be obtained by manual or automatic blood sampling and cross calibrated with PET. These procedures are cumbersome, invasive and generate uncertainties. The IF is contaminated by spillover of radioactivity from the adjacent myocardium and this could cause substantial error in the estimated MMRG. In this study, we show that the IF can be extracted from the images in a rat heart study with 18F-fluorodeoxyglucose (18F-FDG) by means of Independent Component Analysis (ICA) based on Bayesian theory and a Markov Chain Monte Carlo (MCMC) sampling method (BICA). Images of the heart from rats were acquired with the Sherbrooke small animal PET scanner. A region of interest (ROI) was drawn around the rat image and decomposed into blood and tissue using BICA. The statistical study showed that there is a significant difference (p < 0.05) between the MMRG obtained with the IF extracted by BICA and the MMRG obtained with the IF extracted from measured images corrupted with spillover.

  16. Analysis of lifestyle and metabolic predictors of visceral obesity with Bayesian Networks

    Directory of Open Access Journals (Sweden)

    de Morais Sérgio

    2010-09-01

    Full Text Available Abstract Background The aim of this study was to provide a framework for the analysis of visceral obesity and its determinants in women, where complex inter-relationships are observed among lifestyle, nutritional and metabolic predictors. Thirty-four predictors related to lifestyle, adiposity, body fat distribution, blood lipids and adipocyte sizes have been considered as potential correlates of visceral obesity in women. To properly address the difficulties in managing such interactions given our limited sample of 150 women, bootstrapped Bayesian networks were constructed based on novel constraint-based learning methods that appeared recently in the statistical learning community. Statistical significance of edge strengths was evaluated and the less reliable edges were pruned to increase the network robustness. To allow accessible interpretation and integrate biological knowledge into the final network, several undirected edges were afterwards directed with physiological expertise according to relevant literature. Results Extensive experiments on synthetic data sampled from a known Bayesian network show that the algorithm, called Recursive Hybrid Parents and Children (RHPC, outperforms state-of-the-art algorithms that appeared in the recent literature. Regarding biological plausibility, we found that the inference results obtained with the proposed method were in excellent agreement with biological knowledge. For example, these analyses indicated that visceral adipose tissue accumulation is strongly related to blood lipid alterations independent of overall obesity level. Conclusions Bayesian Networks are a useful tool for investigating and summarizing evidence when complex relationships exist among predictors, in particular, as in the case of multifactorial conditions like visceral obesity, when there is a concurrent incidence for several variables, interacting in a complex manner. The source code and the data sets used for the empirical tests

  17. Application of evidence theory in information fusion of multiple sources in bayesian analysis

    Institute of Scientific and Technical Information of China (English)

    周忠宝; 蒋平; 武小悦

    2004-01-01

    How to obtain a proper prior distribution is one of the most critical problems in Bayesian analysis. In many practical cases, the prior information often comes from different sources, and the prior distribution form could be easily known in some certain way while the parameters are hard to determine. In this paper, based on the evidence theory, a new method is presented to fuse the information of multiple sources and determine the parameters of the prior distribution when the form is known. By taking the prior distributions which result from the information of multiple sources and converting them into corresponding mass functions which can be combined by the Dempster-Shafer (D-S) method, we get the combined mass function and the representative points of the prior distribution. These points are then fitted to the given distribution form to determine the parameters of the prior distribution. The fused prior distribution is thus obtained and Bayesian analysis can be performed. How to convert the prior distributions into mass functions properly and get the representative points of the fused prior distribution is the central question we address in this paper. A simulation example shows that the proposed method is effective.
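
    The combination step relies on Dempster's rule: m12(A) is proportional to the sum of m1(B)·m2(C) over all pairs with B∩C = A, normalized by 1 - K, where K is the total conflicting mass. A compact sketch follows; the frame of discernment and the masses are invented for illustration.

```python
from itertools import product

def dempster(m1, m2):
    """Combine two mass functions (dicts mapping frozenset -> mass) by Dempster's rule."""
    combined, conflict = {}, 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc          # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict; Dempster's rule is undefined")
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

# Two hypothetical sources expressing belief about which prior-parameter hypothesis holds
m1 = {frozenset({"low"}): 0.6, frozenset({"low", "high"}): 0.4}
m2 = {frozenset({"high"}): 0.3, frozenset({"low", "high"}): 0.7}
print(dempster(m1, m2))
```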

  18. Risk analysis of emergent water pollution accidents based on a Bayesian Network.

    Science.gov (United States)

    Tang, Caihong; Yi, Yujun; Yang, Zhifeng; Sun, Jie

    2016-01-01

    To guarantee the security of water quality in water transfer channels, especially in open channels, analysis of potential emergent pollution sources in the water transfer process is critical. It is also indispensable for forewarning and protection against emergent pollution accidents. Bridges above open channels with large amounts of truck traffic are the main locations where emergent accidents could occur. A Bayesian Network model, which consists of six root nodes and three middle-layer nodes, was developed in this paper and employed to identify the possibility of potential pollution risk. Dianbei Bridge is reviewed as a typical bridge on an open channel of the Middle Route of the South to North Water Transfer Project where emergent traffic accidents could occur. This study focuses on the risk of water pollution caused by leakage of pollutants into the water. The risk of potential traffic accidents at the Dianbei Bridge implies a risk of water pollution in the canal. Based on survey data, statistical analysis, and domain specialist knowledge, a Bayesian Network model was established. The human factor in emergent accidents has been considered in this model. Additionally, this model has been employed to describe the probability of accidents and the risk level. The factors to which pollution-accident risk is most sensitive have been deduced. A scenario in which these sensitive factors take the states most likely to lead to accidents has also been simulated.

  19. Evolutionary Analysis of Dengue Serotype 2 Viruses Using Phylogenetic and Bayesian Methods from New Delhi, India.

    Directory of Open Access Journals (Sweden)

    Nazia Afreen

    2016-03-01

    Full Text Available Dengue fever is the most important arboviral disease in the tropical and sub-tropical countries of the world. Delhi, the metropolitan capital state of India, has reported many dengue outbreaks, with the last outbreak occurring in 2013. We have recently reported predominance of dengue virus serotype 2 during 2011-2014 in Delhi. In the present study, we report molecular characterization and evolutionary analysis of dengue serotype 2 viruses which were detected in 2011-2014 in Delhi. Envelope genes of 42 DENV-2 strains were sequenced in the study. All DENV-2 strains grouped within the Cosmopolitan genotype and further clustered into three lineages: Lineage I, II and III. Lineage III replaced lineage I during the dengue fever outbreak of 2013. Further, a novel mutation Thr404Ile was detected in the stem region of the envelope protein of a single DENV-2 strain in 2014. Nucleotide substitution rate and time to the most recent common ancestor were determined by molecular clock analysis using Bayesian methods. A change in effective population size of Indian DENV-2 viruses was investigated through a Bayesian skyline plot. The study will be a vital road map for investigation of the epidemiology and evolutionary pattern of dengue viruses in India.

  20. Multivariate Bayesian analysis of Gaussian, right censored Gaussian, ordered categorical and binary traits using Gibbs sampling

    Directory of Open Access Journals (Sweden)

    Madsen Per

    2003-03-01

    Full Text Available Abstract A fully Bayesian analysis using Gibbs sampling and data augmentation in a multivariate model of Gaussian, right censored, and grouped Gaussian traits is described. The grouped Gaussian traits are either ordered categorical traits (with more than two categories) or binary traits, where the grouping is determined via thresholds on the underlying Gaussian scale, the liability scale. Allowances are made for unequal models, unknown covariance matrices and missing data. Having outlined the theory, strategies for implementation are reviewed. These include joint sampling of location parameters; efficient sampling from the fully conditional posterior distribution of augmented data, a multivariate truncated normal distribution; and sampling from the conditional inverse Wishart distribution, the fully conditional posterior distribution of the residual covariance matrix. Finally, a simulated dataset was analysed to illustrate the methodology. This paper concentrates on a model where residuals associated with liabilities of the binary traits are assumed to be independent. A Bayesian analysis using Gibbs sampling is outlined for the model where this assumption is relaxed.

  1. Bayesian flux balance analysis applied to a skeletal muscle metabolic model.

    Science.gov (United States)

    Heino, Jenni; Tunyan, Knarik; Calvetti, Daniela; Somersalo, Erkki

    2007-09-01

    In this article, the steady state condition for the multi-compartment models for cellular metabolism is considered. The problem is to estimate the reaction and transport fluxes, as well as the concentrations in venous blood when the stoichiometry and bound constraints for the fluxes and the concentrations are given. The problem has been addressed previously by a number of authors, and optimization-based approaches as well as extreme pathway analysis have been proposed. These approaches are briefly discussed here. The main emphasis of this work is a Bayesian statistical approach to the flux balance analysis (FBA). We show how the bound constraints and optimality conditions such as maximizing the oxidative phosphorylation flux can be incorporated into the model in the Bayesian framework by proper construction of the prior densities. We propose an effective Markov chain Monte Carlo (MCMC) scheme to explore the posterior densities, and compare the results with those obtained via the previously studied linear programming (LP) approach. The proposed methodology, which is applied here to a two-compartment model for skeletal muscle metabolism, can be extended to more complex models.
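
    A minimal sketch of the Bayesian flux balance idea follows (toy stoichiometry, not the paper's skeletal-muscle model; all numbers are invented). The steady-state condition Sv = 0 is enforced by sampling in the null space of S, bound constraints by rejection, and an "optimality" preference by an exponential prior on one objective flux, explored with a simple Metropolis chain.

```python
import numpy as np
from scipy.linalg import null_space

# Toy stoichiometry: 2 internal metabolites, 4 fluxes, steady state S v = 0
S = np.array([[1.0, -1.0,  0.0,  0.0],
              [0.0,  1.0, -1.0, -1.0]])
lb = np.zeros(4)                          # lower bounds on fluxes
ub = np.array([10.0, 10.0, 8.0, 5.0])     # upper bounds on fluxes
N = null_space(S)                         # any v = N @ a satisfies S v = 0 exactly

def log_prior(v, weight=1.0):
    # Soft optimality condition: favour large flux through reaction 2 (the "objective")
    return weight * v[2]

rng = np.random.default_rng(2)
a = np.zeros(N.shape[1])
v = N @ a                                 # start at v = 0, which satisfies the bounds
samples = []
for _ in range(20000):
    a_new = a + 0.3 * rng.normal(size=a.shape)
    v_new = N @ a_new
    if np.all(v_new >= lb) and np.all(v_new <= ub):          # reject infeasible moves
        if np.log(rng.random()) < log_prior(v_new) - log_prior(v):
            a, v = a_new, v_new
    samples.append(v)
print("posterior mean fluxes:", np.mean(samples[5000:], axis=0))
```

    Unlike a single linear-programming optimum, the chain yields a whole posterior over feasible flux states, which is the practical payoff of the Bayesian formulation described above.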

  2. Multivariate Bayesian analysis of Gaussian, right censored Gaussian, ordered categorical and binary traits using Gibbs sampling.

    Science.gov (United States)

    Korsgaard, Inge Riis; Lund, Mogens Sandø; Sorensen, Daniel; Gianola, Daniel; Madsen, Per; Jensen, Just

    2003-01-01

    A fully Bayesian analysis using Gibbs sampling and data augmentation in a multivariate model of Gaussian, right censored, and grouped Gaussian traits is described. The grouped Gaussian traits are either ordered categorical traits (with more than two categories) or binary traits, where the grouping is determined via thresholds on the underlying Gaussian scale, the liability scale. Allowances are made for unequal models, unknown covariance matrices and missing data. Having outlined the theory, strategies for implementation are reviewed. These include joint sampling of location parameters; efficient sampling from the fully conditional posterior distribution of augmented data, a multivariate truncated normal distribution; and sampling from the conditional inverse Wishart distribution, the fully conditional posterior distribution of the residual covariance matrix. Finally, a simulated dataset was analysed to illustrate the methodology. This paper concentrates on a model where residuals associated with liabilities of the binary traits are assumed to be independent. A Bayesian analysis using Gibbs sampling is outlined for the model where this assumption is relaxed.

  3. Evolutionary Analysis of Dengue Serotype 2 Viruses Using Phylogenetic and Bayesian Methods from New Delhi, India.

    Science.gov (United States)

    Afreen, Nazia; Naqvi, Irshad H; Broor, Shobha; Ahmed, Anwar; Kazim, Syed Naqui; Dohare, Ravins; Kumar, Manoj; Parveen, Shama

    2016-03-01

    Dengue fever is the most important arboviral disease in the tropical and sub-tropical countries of the world. Delhi, the metropolitan capital state of India, has reported many dengue outbreaks, with the last outbreak occurring in 2013. We have recently reported predominance of dengue virus serotype 2 during 2011-2014 in Delhi. In the present study, we report molecular characterization and evolutionary analysis of dengue serotype 2 viruses which were detected in 2011-2014 in Delhi. Envelope genes of 42 DENV-2 strains were sequenced in the study. All DENV-2 strains grouped within the Cosmopolitan genotype and further clustered into three lineages: Lineage I, II and III. Lineage III replaced lineage I during the dengue fever outbreak of 2013. Further, a novel mutation Thr404Ile was detected in the stem region of the envelope protein of a single DENV-2 strain in 2014. Nucleotide substitution rate and time to the most recent common ancestor were determined by molecular clock analysis using Bayesian methods. A change in effective population size of Indian DENV-2 viruses was investigated through a Bayesian skyline plot. The study will be a vital road map for investigation of the epidemiology and evolutionary pattern of dengue viruses in India.

  4. Improving water quality assessments through a hierarchical Bayesian analysis of variability.

    Science.gov (United States)

    Gronewold, Andrew D; Borsuk, Mark E

    2010-10-15

    Water quality measurement error and variability, while well-documented in laboratory-scale studies, is rarely acknowledged or explicitly resolved in most model-based water body assessments, including those conducted in compliance with the United States Environmental Protection Agency (USEPA) Total Maximum Daily Load (TMDL) program. Consequently, proposed pollutant loading reductions in TMDLs and similar water quality management programs may be biased, resulting in either slower-than-expected rates of water quality restoration and designated use reinstatement or, in some cases, overly conservative management decisions. To address this problem, we present a hierarchical Bayesian approach for relating actual in situ or model-predicted pollutant concentrations to multiple sampling and analysis procedures, each with distinct sources of variability. We apply this method to recently approved TMDLs to investigate whether appropriate accounting for measurement error and variability will lead to different management decisions. We find that required pollutant loading reductions may in fact vary depending not only on how measurement variability is addressed but also on which water quality analysis procedure is used to assess standard compliance. As a general strategy, our Bayesian approach to quantifying variability may represent an alternative to the common practice of addressing all forms of uncertainty through an arbitrary margin of safety (MOS).

  5. Bayesian Analysis for Stellar Evolution with Nine Parameters (BASE-9): User's Manual

    CERN Document Server

    von Hippel, Ted; Jeffery, Elizabeth; Wagner-Kaiser, Rachel; DeGennaro, Steven; Stein, Nathan; Stenning, David; Jefferys, William H; van Dyk, David

    2014-01-01

    BASE-9 is a Bayesian software suite that recovers star cluster and stellar parameters from photometry. BASE-9 is useful for analyzing single-age, single-metallicity star clusters, binaries, or single stars, and for simulating such systems. BASE-9 uses Markov chain Monte Carlo and brute-force numerical integration techniques to estimate the posterior probability distributions for the age, metallicity, helium abundance, distance modulus, and line-of-sight absorption for a cluster, and the mass, binary mass ratio, and cluster membership probability for every stellar object. BASE-9 is provided as open source code on a version-controlled web server. The executables are also available as Amazon Elastic Compute Cloud images. This manual provides potential users with an overview of BASE-9, including instructions for installation and use.

  6. Incorporating Parameter Uncertainty in Bayesian Segmentation Models: Application to Hippocampal Subfield Volumetry

    OpenAIRE

    Iglesias, Juan Eugenio; Sabuncu, Mert Rory; Van Leemput, Koen

    2012-01-01

    Many successful segmentation algorithms are based on Bayesian models in which prior anatomical knowledge is combined with the available image information. However, these methods typically have many free parameters that are estimated to obtain point estimates only, whereas a faithful Bayesian analysis would also consider all possible alternate values these parameters may take. In this paper, we propose to incorporate the uncertainty of the free parameters in Bayesian segmentation models more a...

  7. Extraction of Active Regions and Coronal Holes from EUV Images Using the Unsupervised Segmentation Method in the Bayesian Framework

    CERN Document Server

    Arish, Saeid; Safari, Hossein; Amiri, Ali

    2016-01-01

    The solar corona is the origin of very dynamic events that are mostly produced in active regions (AR) and coronal holes (CH). The exact location of these large-scale features can be determined by applying image-processing approaches to extreme-ultraviolet (EUV) data. We here investigate the problem of segmentation of solar EUV images into ARs, CHs, and quiet-Sun (QS) regions in a firm Bayesian way. On the basis of Bayes' rule, we need to obtain both prior and likelihood models. To find the prior model of an image, we used a Potts model in non-local mode. To construct the likelihood model, we combined a mixture of a Markov-Gauss model and non-local means. After estimating labels and hyperparameters with the Gibbs estimator, cellular learning automata were employed to determine the label of each pixel. We applied the proposed method to a Solar Dynamics Observatory/Atmospheric Imaging Assembly (SDO/AIA) dataset recorded during 2011 and found that the mean value of the filling factor of ARs is 0.032 and 0.057 for...

  8. Bayesian analysis for OPC modeling with film stack properties and posterior predictive checking

    Science.gov (United States)

    Burbine, Andrew; Fenger, Germain; Sturtevant, John; Fryer, David

    2016-10-01

    The use of optical proximity correction (OPC) demands increasingly accurate models of the photolithographic process. Model building and analysis techniques in the data science community have seen great strides in the past two decades which make better use of available information. This paper expands upon Bayesian analysis methods for parameter selection in lithographic models by increasing the parameter set and employing posterior predictive checks. Work continues with a Markov chain Monte Carlo (MCMC) search algorithm to generate posterior distributions of parameters. Models now include wafer film stack refractive indices, n and k, as parameters, recognizing the uncertainties associated with these values. Posterior predictive checks are employed as a method to validate parameter vectors discovered by the analysis, akin to cross validation.
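
    Posterior predictive checking itself is model-agnostic: draw parameters from the posterior, simulate replicated data under each draw, and compare a test statistic with its observed value. A toy version is sketched below, with a normal model standing in for a lithographic response; every name and number here is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)
y = rng.normal(1.0, 1.0, 30)              # observed data (toy stand-in for model residuals)

# Posterior for the mean under a flat prior with known sigma = 1: N(ybar, 1/n)
n, ybar = len(y), y.mean()
mu_draws = rng.normal(ybar, 1 / np.sqrt(n), 4000)

# Posterior predictive check on a test statistic (here: the largest absolute value)
t_obs = np.max(np.abs(y))
t_rep = np.array([np.max(np.abs(rng.normal(mu, 1.0, n))) for mu in mu_draws])
print("posterior predictive p-value:", np.mean(t_rep >= t_obs))
```

    A posterior predictive p-value far from 0.5 in either direction flags an aspect of the data the fitted model fails to reproduce, which is the validation role the abstract assigns to these checks.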

  9. Bayesian sensitivity analysis of incomplete data: bridging pattern-mixture and selection models.

    Science.gov (United States)

    Kaciroti, Niko A; Raghunathan, Trivellore

    2014-11-30

    Pattern-mixture models (PMM) and selection models (SM) are alternative approaches for statistical analysis when faced with incomplete data and a nonignorable missing-data mechanism. Both models make empirically unverifiable assumptions and need additional constraints to identify the parameters. Here, we first introduce intuitive parameterizations to identify PMM for different types of outcome with distribution in the exponential family; then we translate these to their equivalent SM approach. This provides a unified framework for performing sensitivity analysis under either setting. These new parameterizations are transparent, easy-to-use, and provide dual interpretation from both the PMM and SM perspectives. A Bayesian approach is used to perform sensitivity analysis, deriving inferences using informative prior distributions on the sensitivity parameters. These models can be fitted using software that implements Gibbs sampling.

  10. A Semi-parametric Bayesian Approach for Differential Expression Analysis of RNA-seq Data.

    Science.gov (United States)

    Liu, Fangfang; Wang, Chong; Liu, Peng

    2015-12-01

    RNA-sequencing (RNA-seq) technologies have revolutionized the way agricultural biologists study gene expression as well as generated a tremendous amount of data waiting for analysis. Detecting differentially expressed genes is one of the fundamental steps in RNA-seq data analysis. In this paper, we model the count data from RNA-seq experiments with a Poisson-Gamma hierarchical model, or equivalently, a negative binomial (NB) model. We derive a semi-parametric Bayesian approach with a Dirichlet process as the prior model for the distribution of fold changes between the two treatment means. An inference strategy using a Gibbs algorithm is developed for differential expression analysis. The results of several simulation studies show that our proposed method outperforms other methods, including the popularly applied edgeR and DESeq methods. We also discuss an application of our method to a dataset that compares gene expression between bundle sheath and mesophyll cells in maize leaves.

  11. Image sequence analysis

    CERN Document Server

    1981-01-01

    The processing of image sequences has a broad spectrum of important applications including target tracking, robot navigation, bandwidth compression of TV conferencing video signals, studying the motion of biological cells using microcinematography, cloud tracking, and highway traffic monitoring. Image sequence processing involves a large amount of data. However, because of the progress in computer, LSI, and VLSI technologies, we have now reached a stage when many useful processing tasks can be done in a reasonable amount of time. As a result, research and development activities in image sequence analysis have recently been growing at a rapid pace. An IEEE Computer Society Workshop on Computer Analysis of Time-Varying Imagery was held in Philadelphia, April 5-6, 1979. A related special issue of the IEEE Transactions on Pattern Analysis and Machine Intelligence was published in November 1980. The IEEE Computer magazine has also published a special issue on the subject in 1981. The purpose of this book ...

  12. Variational Bayesian mixture of experts models and sensitivity analysis for nonlinear dynamical systems

    Science.gov (United States)

    Baldacchino, Tara; Cross, Elizabeth J.; Worden, Keith; Rowson, Jennifer

    2016-01-01

    Most physical systems in reality exhibit a nonlinear relationship between input and output variables. This nonlinearity can manifest itself in terms of piecewise continuous functions or bifurcations, between some or all of the variables. The aims of this paper are two-fold. Firstly, a mixture of experts (MoE) model was trained on different physical systems exhibiting these types of nonlinearities. MoE models separate the input space into homogeneous regions and a different expert is responsible for the different regions. In this paper, the experts were low order polynomial regression models, thus avoiding the need for high-order polynomials. The model was trained within a Bayesian framework using variational Bayes, whereby a novel approach within the MoE literature was used in order to determine the number of experts in the model. Secondly, Bayesian sensitivity analysis (SA) of the systems under investigation was performed using the identified probabilistic MoE model in order to assess how uncertainty in the output can be attributed to uncertainty in the different inputs. The proposed methodology was first tested on a bifurcating Duffing oscillator, and it was then applied to real data sets obtained from the Tamar and Z24 bridges. In all cases, the MoE model was successful in identifying bifurcations and different physical regimes in the data by accurately dividing the input space; including identifying boundaries that were not parallel to coordinate axes.

  13. Multi-level Bayesian safety analysis with unprocessed Automatic Vehicle Identification data for an urban expressway.

    Science.gov (United States)

    Shi, Qi; Abdel-Aty, Mohamed; Yu, Rongjie

    2016-03-01

    In traffic safety studies, crash frequency modeling of total crashes is the cornerstone before proceeding to more detailed safety evaluation. The relationship between crash occurrence and factors such as traffic flow and roadway geometric characteristics has been extensively explored for a better understanding of crash mechanisms. In this study, a multi-level Bayesian framework has been developed in an effort to identify the crash contributing factors on an urban expressway in the Central Florida area. Two types of traffic data from the Automatic Vehicle Identification system, which are the processed data capped at speed limit and the unprocessed data retaining the original speed were incorporated in the analysis along with road geometric information. The model framework was proposed to account for the hierarchical data structure and the heterogeneity among the traffic and roadway geometric data. Multi-level and random parameters models were constructed and compared with the Negative Binomial model under the Bayesian inference framework. Results showed that the unprocessed traffic data was superior. Both multi-level models and random parameters models outperformed the Negative Binomial model and the models with random parameters achieved the best model fitting. The contributing factors identified imply that on the urban expressway lower speed and higher speed variation could significantly increase the crash likelihood. Other geometric factors were significant including auxiliary lanes and horizontal curvature.

  14. Hierarchical Bayesian analysis of censored microbiological contamination data for use in risk assessment and mitigation.

    Science.gov (United States)

    Busschaert, P; Geeraerd, A H; Uyttendaele, M; Van Impe, J F

    2011-06-01

    Microbiological contamination data often is censored because of the presence of non-detects or because measurement outcomes are known only to be smaller than, greater than, or between certain boundary values imposed by the laboratory procedures. Therefore, it is not straightforward to fit distributions that summarize contamination data for use in quantitative microbiological risk assessment, especially when variability and uncertainty are to be characterized separately. In this paper, distributions are fit using Bayesian analysis, and results are compared to results obtained with a methodology based on maximum likelihood estimation and the non-parametric bootstrap method. The Bayesian model is also extended hierarchically to estimate the effects of the individual elements of a covariate such as, for example, on a national level, the food processing company where the analyzed food samples were processed, or, on an international level, the geographical origin of contamination data. Including this extra information allows a risk assessor to differentiate between several scenarios and increase the specificity of the estimate of risk of illness, or to compare different scenarios to each other. Furthermore, inference is made on the predictive importance of several different covariates while taking into account uncertainty, making it possible to indicate which covariates are influential factors determining contamination.
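
    The core of fitting a distribution to censored contamination data is a likelihood that uses the density for detected values and the CDF for non-detects. A small Metropolis sketch for a lognormal with left-censored observations follows; the measurements, detection limits, and priors are all hypothetical.

```python
import numpy as np
from scipy import stats

# Hypothetical data: detected concentrations and non-detect detection limits
detects = np.array([2.3, 1.1, 0.8, 3.0, 1.7])
dl = np.array([0.5, 0.5, 1.0])            # non-detects: value known only to be below DL

def log_post(theta):
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)
    ll = stats.norm.logpdf(np.log(detects), mu, sigma).sum()   # exact observations
    ll += stats.norm.logcdf(np.log(dl), mu, sigma).sum()       # censored: P(X < DL)
    # Weakly informative priors on mu and log(sigma)
    return ll + stats.norm.logpdf(mu, 0, 10) + stats.norm.logpdf(log_sigma, 0, 2)

rng = np.random.default_rng(3)
theta, lp, chain = np.array([0.0, 0.0]), None, []
lp = log_post(theta)
for _ in range(20000):                    # random-walk Metropolis
    prop = theta + 0.2 * rng.normal(size=2)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta.copy())
print("posterior mean of (mu, log sigma):", np.mean(chain[5000:], axis=0))
```

    The hierarchical extension described in the abstract would add a second level in which mu varies by covariate element (e.g. processing company), with those group means drawn from a shared population distribution.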

  15. A Bayesian analysis of the 69 highest energy cosmic rays detected by the Pierre Auger Observatory

    CERN Document Server

    Khanin, Alexander

    2016-01-01

    The origins of ultra-high energy cosmic rays (UHECRs) remain an open question. Several attempts have been made to cross-correlate the arrival directions of the UHECRs with catalogs of potential sources, but no definite conclusion has been reached. We report a Bayesian analysis of the 69 events from the Pierre Auger Observatory (PAO) that aims to determine the fraction of the UHECRs that originate from known AGNs in the Veron-Cetty & Veron (VCV) catalog, as well as AGNs detected with the Swift Burst Alert Telescope (Swift-BAT), galaxies from the 2MASS Redshift Survey (2MRS), and an additional volume-limited sample of 17 nearby AGNs. The study makes use of a multi-level Bayesian model of UHECR injection, propagation and detection. We find that for reasonable ranges of prior parameters, the Bayes factors disfavour a purely isotropic model. For fiducial values of the model parameters, we report 68% credible intervals for the fraction of source-originating UHECRs of 0.09+0.05-0.04, 0.25+0.09-0.08, 0.24+0.12-0....

  16. Spatial-Temporal Epidemiology of Tuberculosis in Mainland China: An Analysis Based on Bayesian Theory

    Directory of Open Access Journals (Sweden)

    Kai Cao

    2016-05-01

    Objective: To explore the spatial-temporal interaction effect within a Bayesian framework and to probe the ecological influential factors for tuberculosis. Methods: Six different statistical models containing parameters of time, space, spatial-temporal interaction and their combinations were constructed based on a Bayesian framework. The optimum model was selected according to the deviance information criterion (DIC) value. Coefficients of climate variables were then estimated using the best-fitting model. Results: The model containing the spatial-temporal interaction parameter was the best-fitting one, with the smallest DIC value (−4,508,660). Ecological analysis results showed the relative risks (RRs) of average temperature, rainfall, wind speed, humidity, and air pressure were 1.00324 (95% CI, 1.00150–1.00550), 1.01010 (95% CI, 1.01007–1.01013), 0.83518 (95% CI, 0.93732–0.96138), 0.97496 (95% CI, 0.97181–1.01386), and 1.01007 (95% CI, 1.01003–1.01011), respectively. Conclusions: The spatial-temporal interaction was statistically meaningful and the prevalence of tuberculosis was influenced by the time and space interaction effect. Average temperature, rainfall, wind speed, and air pressure influenced tuberculosis. Average humidity had no influence on tuberculosis.

  17. A Bayesian analysis of the 69 highest energy cosmic rays detected by the Pierre Auger Observatory

    Science.gov (United States)

    Khanin, Alexander; Mortlock, Daniel J.

    2016-08-01

    The origins of ultrahigh energy cosmic rays (UHECRs) remain an open question. Several attempts have been made to cross-correlate the arrival directions of the UHECRs with catalogues of potential sources, but no definite conclusion has been reached. We report a Bayesian analysis of the 69 events, from the Pierre Auger Observatory (PAO), that aims to determine the fraction of the UHECRs that originate from known AGNs in the Veron-Cetty & Veron (VCV) catalogue, as well as AGNs detected with the Swift Burst Alert Telescope (Swift-BAT), galaxies from the 2MASS Redshift Survey (2MRS), and an additional volume-limited sample of 17 nearby AGNs. The study makes use of a multilevel Bayesian model of UHECR injection, propagation and detection. We find that for reasonable ranges of prior parameters the Bayes factors disfavour a purely isotropic model. For fiducial values of the model parameters, we report 68 per cent credible intervals for the fraction of source originating UHECRs of 0.09^{+0.05}_{-0.04}, 0.25^{+0.09}_{-0.08}, 0.24^{+0.12}_{-0.10}, and 0.08^{+0.04}_{-0.03} for the VCV, Swift-BAT and 2MRS catalogues, and the sample of 17 AGNs, respectively.

  18. A Preliminary Bayesian Analysis of Incomplete Longitudinal Data from a Small Sample: Methodological Advances in an International Comparative Study of Educational Inequality

    Science.gov (United States)

    Hsieh, Chueh-An; Maier, Kimberly S.

    2009-01-01

    The capacity of Bayesian methods for estimating complex statistical models is undeniable. Bayesian data analysis is seen as having a range of advantages, such as an intuitive probabilistic interpretation of the parameters of interest, the efficient incorporation of prior information into empirical data analysis, model averaging and model selection.…

  19. Inverse scattering in a Bayesian framework: application to microwave imaging for breast cancer detection

    Science.gov (United States)

    Gharsalli, Leila; Ayasso, Hacheme; Duchêne, Bernard; Mohammad-Djafari, Ali

    2014-11-01

    In this paper, we deal with a nonlinear inverse scattering problem where the goal is to detect breast cancer from measurements of the scattered field that results from the interaction between the breast and a known interrogating wave in the microwave frequency range. Modeling of the wave-object (breast) interaction is tackled through a domain integral representation of the electric field in a 2D-TM configuration. The inverse problem is solved in a Bayesian framework where prior information, namely that the object is assumed to be composed of compact homogeneous regions made of a restricted number of different materials, is introduced via a Gauss-Markov-Potts model. As an analytic expression for the joint maximum a posteriori (MAP) estimators yields an intractable solution, an approximation of the latter is proposed. This is done by means of a variational Bayesian approximation (VBA) technique that is adapted to complex-valued contrast and applied to compute the posterior estimators and reconstruct maps of both the permittivity and conductivity of the sought object. This leads to a joint semi-supervised estimation approach, which allows us to estimate the induced currents, the contrast and all of the parameters introduced in the prior model. The method is tested on two sets of synthetic data generated in different configurations and its performance is compared with that of a contrast source inversion technique.

  20. Bayesian Analysis of Inertial Confinement Fusion Experiments at the National Ignition Facility

    CERN Document Server

    Gaffney, J A; Sonnad, V; Libby, S B

    2012-01-01

    We develop a Bayesian inference method that allows the efficient determination of several interesting parameters from complicated high-energy-density experiments performed on the National Ignition Facility (NIF). The model is based on an exploration of phase space using the hydrodynamic code HYDRA. A linear model is used to describe the effect of nuisance parameters on the analysis, allowing an analytic likelihood to be derived that can be determined from a small number of HYDRA runs and then used in existing advanced statistical analysis methods. This approach is applied to a recent experiment in order to determine the carbon opacity and X-ray drive; it is found that the inclusion of prior expert knowledge and of fluctuations in capsule dimensions and chemical composition significantly improves the agreement between experiment and theoretical opacity calculations. A parameterisation of HYDRA results is used to test the application of both Markov chain Monte Carlo (MCMC) and genetic algorithm (GA) techniques to e...

  1. Bayesian Analysis of $C_{x'}$ and $C_{z'}$ Double Polarizations in Kaon Photoproduction

    CERN Document Server

    Hutauruk, P T P

    2010-01-01

    The latest experimental data for the $\gamma + p \to K^{+} + \Lambda$ reaction, namely the $C_{x'}$ and $C_{z'}$ double polarizations, have been analyzed. In theoretical calculations, all of these observables can be classified into four Legendre classes and represented by associated Legendre polynomial functions \cite{fasano92}. In this analysis we attempt to determine the best data model for both observables. We use the Bayesian technique to select the best model by calculating the posterior probabilities and comparing the posteriors among the models. The posterior probabilities for each data model are computed using nested sampling integration. From this analysis we conclude that the $C_{x'}$ and $C_{z'}$ double polarizations require second- and third-order associated Legendre polynomials, respectively, to describe the data well. The extracted coefficients of each observable are also presented. They show the structure of the baryon resonances qualitatively.
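    Model selection by nested sampling, as used above, amounts to computing the evidence for each candidate polynomial order and comparing. A minimal sketch with the dynesty sampler on made-up data follows; the Legendre orders, priors and noise level are illustrative, not the kaon photoproduction analysis itself.

```python
import numpy as np
import dynesty

# Toy model comparison via nested-sampling evidences
x = np.linspace(-1, 1, 40)
y = 0.3 + 0.8 * x + 0.05 * np.random.default_rng(3).normal(size=x.size)
sigma = 0.05

def make_loglike(order):
    def loglike(theta):
        model = np.polynomial.legendre.legval(x, theta)
        # constant -n/2*log(2*pi*sigma^2) omitted: identical across models,
        # so Bayes factors are unaffected
        return -0.5 * np.sum(((y - model) / sigma) ** 2)
    return loglike

def prior_transform(u):
    return 2.0 * u - 1.0        # uniform priors on [-1, 1] per coefficient

logz = {}
for order in (1, 2, 3):
    ndim = order + 1
    sampler = dynesty.NestedSampler(make_loglike(order), prior_transform, ndim)
    sampler.run_nested(print_progress=False)
    logz[order] = sampler.results.logz[-1]

# Bayes factor between two orders = exp(difference of log-evidences)
print(logz)
```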

  2. Bayesian methods for meta-analysis of causal relationships estimated using genetic instrumental variables

    DEFF Research Database (Denmark)

    2010-01-01

    Genetic markers can be used as instrumental variables, in an analogous way to randomization in a clinical trial, to estimate the causal relationship between a phenotype and an outcome variable. Our purpose is to extend the existing methods for such Mendelian randomization studies to the context of multiple genetic markers measured in multiple studies, based on the analysis of individual participant data. First, for a single genetic marker in one study, we show that the usual ratio of coefficients approach can be reformulated as a regression with heterogeneous error in the explanatory variable. This can be implemented using a Bayesian approach, which is next extended to include multiple genetic markers. We then propose a hierarchical model for undertaking a meta-analysis of multiple studies, in which it is not necessary that the same genetic markers are measured in each study. This provides…

  3. Bayesian methods for model uncertainty analysis with application to future sea level rise

    Energy Technology Data Exchange (ETDEWEB)

    Patwardhan, A.; Small, M.J.

    1992-01-01

    In no other area is the need for effective analysis of uncertainty more evident than in the problem of evaluating the consequences of increasing atmospheric concentrations of radiatively active gases. The major consequence of concern is global warming, with related environmental effects that include changes in local patterns of precipitation, soil moisture, forest and agricultural productivity, and a potential increase in global mean sea level. In order to identify an optimum set of responses to sea level change, a full characterization of the uncertainties associated with the predictions of future sea level rise is essential. The paper addresses the use of data for identifying and characterizing uncertainties in model parameters and predictions. The Bayesian Monte Carlo method is formally presented and elaborated, and applied to the analysis of the uncertainty in a predictive model for global mean sea level change.
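    The Bayesian Monte Carlo method referred to above reduces, in its simplest form, to drawing parameters from the prior and re-weighting each draw by the likelihood of the observations. A toy sketch, with all numbers invented:

```python
import numpy as np
from scipy import stats

# Bayesian Monte Carlo in miniature: weight prior samples of a model
# parameter by the likelihood of observations (all numbers illustrative)
rng = np.random.default_rng(4)
obs = np.array([1.8, 2.1, 2.4])            # pretend observations (e.g. mm/yr)

theta = rng.uniform(0.0, 5.0, 100_000)     # prior samples of the rate parameter
loglik = stats.norm.logpdf(obs[:, None], loc=theta, scale=0.3).sum(axis=0)
w = np.exp(loglik - loglik.max())
w /= w.sum()                               # normalized importance weights

posterior_mean = np.sum(w * theta)
idx = np.argsort(theta)
cdf = np.cumsum(w[idx])                    # weighted empirical CDF
lo, hi = np.interp([0.05, 0.95], cdf, theta[idx])
print(posterior_mean, (lo, hi))            # posterior mean and 90% interval
```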

  4. Status of the 2D Bayesian analysis of XENON100 data

    Energy Technology Data Exchange (ETDEWEB)

    Schindler, Stefan [JGU, Staudingerweg 7, 55128 Mainz (Germany)

    2015-07-01

    The XENON100 experiment is located in the underground laboratory at LNGS in Italy. Since Dark Matter particles will interact only very rarely with normal matter, an environment with ultra-low background, shielded from cosmic radiation, is needed. The standard analysis of XENON100 data has made use of the profile likelihood method (a frequentist approach) and still provides one of the most sensitive exclusion limits on WIMP Dark Matter. Here we present work towards a Bayesian approach to the analysis of XENON100 data, in which we attempt to include the measured primary (S1) and secondary (S2) scintillation signals in a more complete way. The background and signal models in the S1-S2 space have to be defined and a corresponding likelihood function, describing these models, has to be constructed.

  5. BaalChIP: Bayesian analysis of allele-specific transcription factor binding in cancer genomes.

    Science.gov (United States)

    de Santiago, Ines; Liu, Wei; Yuan, Ke; O'Reilly, Martin; Chilamakuri, Chandra Sekhar Reddy; Ponder, Bruce A J; Meyer, Kerstin B; Markowetz, Florian

    2017-02-24

    Allele-specific measurements of transcription factor binding from ChIP-seq data are key to dissecting the allelic effects of non-coding variants and their contribution to phenotypic diversity. However, most methods of detecting an allelic imbalance assume diploid genomes. This assumption severely limits their applicability to cancer samples with frequent DNA copy-number changes. Here we present a Bayesian statistical approach called BaalChIP to correct for the effect of background allele frequency on the observed ChIP-seq read counts. BaalChIP allows the joint analysis of multiple ChIP-seq samples across a single variant and outperforms competing approaches in simulations. Using 548 ENCODE ChIP-seq and six targeted FAIRE-seq samples, we show that BaalChIP effectively corrects allele-specific analysis for copy-number variation and increases the power to detect putative cis-acting regulatory variants in cancer genomes.

  6. Bayesian Analysis of Hybrid EoS based on Astrophysical Observational Data

    CERN Document Server

    Alvarez-Castillo, David; Blaschke, David; Grigorian, Hovik

    2014-01-01

    The most basic features of a neutron star (NS) are its radius and mass, which so far have not been well determined simultaneously for a single object. In some cases masses are precisely measured, as for binary systems, but radii are quite uncertain. On the other hand, for isolated neutron stars some radius and mass measurements exist but lack the necessary precision to probe their interiors. However, the presently available observational data allow a probabilistic estimation of the internal structure of the star. In this work a preliminary probabilistic estimation of the super-dense stellar matter equation of state, using Bayesian analysis and modelling of relativistic configurations of neutron stars, is shown. This analysis is important for research on the existence of quark-gluon plasma in massive (around two solar masses) neutron stars.

  7. Bayesian analysis for exponential random graph models using the adaptive exchange sampler

    KAUST Repository

    Jin, Ick Hoon

    2013-01-01

    Exponential random graph models have been widely used in social network analysis. However, these models are extremely difficult to handle from a statistical viewpoint, because of the existence of intractable normalizing constants. In this paper, we consider a fully Bayesian analysis for exponential random graph models using the adaptive exchange sampler, which solves the issue of intractable normalizing constants encountered in Markov chain Monte Carlo (MCMC) simulations. The adaptive exchange sampler can be viewed as an MCMC extension of the exchange algorithm, and it generates auxiliary networks via an importance sampling procedure from an auxiliary Markov chain running in parallel. The convergence of this algorithm is established under mild conditions. The adaptive exchange sampler is illustrated using a few social networks, including the Florentine business network, a synthetic molecule network, and the dolphins network. The results indicate that the adaptive exchange algorithm can produce more accurate estimates than approximate exchange algorithms, while maintaining the same computational efficiency.
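    The exchange-algorithm idea at the heart of this sampler can be shown on a toy exponential-family model whose normalizing constant we pretend not to know. Here exact auxiliary draws are available (it is really a Bernoulli model); for ERGMs the auxiliary network would instead come from a long inner MCMC run, which is what the adaptive variant manages. Everything below is illustrative.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy model with "intractable" normalizer: q(y | theta) ~ exp(theta * sum(y)),
# y in {0,1}^n.  Exact auxiliary sampling is easy here (it is Bernoulli).
n = 50
def sample_y(theta, size=n):
    p = 1.0 / (1.0 + np.exp(-theta))
    return rng.random(size) < p

def unnorm_logpdf(y, theta):
    return theta * y.sum()

y_obs = sample_y(0.8)                       # synthetic "observed" data

theta, chain = 0.0, []
for _ in range(5000):
    prop = theta + 0.3 * rng.normal()
    y_aux = sample_y(prop)                  # auxiliary draw at the proposal
    # Exchange acceptance ratio: the normalizing constants cancel exactly
    log_a = (unnorm_logpdf(y_obs, prop) - unnorm_logpdf(y_obs, theta)
             + unnorm_logpdf(y_aux, theta) - unnorm_logpdf(y_aux, prop)
             - 0.5 * (prop**2 - theta**2))  # standard-normal prior on theta
    if np.log(rng.random()) < log_a:
        theta = prop
    chain.append(theta)

print("posterior mean of theta:", np.mean(chain[1000:]))
```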

  8. How to interpret the results of medical time series data analysis: Classical statistical approaches versus dynamic Bayesian network modeling

    Science.gov (United States)

    Onisko, Agnieszka; Druzdzel, Marek J.; Austin, R. Marshall

    2016-01-01

    Background: Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well recognized in the analysis of medical data. Aim: The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. Materials and Methods: This paper offers a comparison of two approaches to the analysis of medical time series data: (1) classical statistical methods, such as the Kaplan–Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. Results: The main outcomes of our comparison are cervical cancer risk assessments produced by the three approaches. However, our analysis also discusses several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, results interpretation, and model validation. Conclusion: Our study shows that the Bayesian approach is (1) much more flexible in terms of modeling effort and (2) offers individualized risk assessment, which is more cumbersome to obtain with classical statistical approaches. PMID:28163973
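    For reference, the classical side of such a comparison takes only a few lines with the lifelines package; the data frame below is a tiny made-up screening dataset, not the Magee-Womens cohort.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Tiny made-up dataset: follow-up time, event indicator, one covariate
df = pd.DataFrame({
    "years": [1.2, 3.4, 0.8, 5.1, 2.2, 4.0, 0.5, 3.9],
    "event": [1,   0,   1,   0,   1,   0,   1,   1],   # 1 = event observed
    "age":   [52,  61,  47,  66,  58,  49,  71,  55],
})

# Kaplan-Meier estimate of the survival function
kmf = KaplanMeierFitter()
kmf.fit(df["years"], event_observed=df["event"])
print(kmf.survival_function_.tail())

# Cox proportional hazards regression on the remaining columns
cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="event")
cph.print_summary()
```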

  9. Bayesian and Geostatistical Approaches to Combining Categorical Data Derived from Visual and Digital Processing of Remotely Sensed Images

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jingxiong; LI Deren

    2005-01-01

    This paper seeks a synthesis of Bayesian and geostatistical approaches to combining categorical data in the context of remote sensing classification. By experiment with aerial photographs and Landsat TM data, the accuracy of spectral, spatial, and combined classification results was evaluated. It was confirmed that the incorporation of spatial information in spectral classification increases accuracy significantly. Secondly, through tests with a 5-class and a 3-class classification scheme, it was revealed that setting a proper semantic framework for classification is fundamental to any endeavor of categorical mapping and is the most important factor affecting accuracy. Lastly, this paper promotes non-parametric methods both for the definition of class membership profiles based on band-specific histograms of image intensities and for the derivation of spatial probabilities via indicator kriging, a non-parametric geostatistical technique.

  10. Antiplatelets versus anticoagulants for the treatment of cervical artery dissection: Bayesian meta-analysis.

    Directory of Open Access Journals (Sweden)

    Hakan Sarikaya

    OBJECTIVE: To compare the effects of antiplatelets and anticoagulants on stroke and death in patients with acute cervical artery dissection. DESIGN: Systematic review with Bayesian meta-analysis. DATA SOURCES: The reviewers searched MEDLINE and EMBASE from inception to November 2012, checked reference lists, and contacted authors. STUDY SELECTION: Studies were eligible if they were randomised, quasi-randomised or observational comparisons of antiplatelets and anticoagulants in patients with cervical artery dissection. DATA EXTRACTION: Data were extracted by one reviewer and checked by another. Bayesian techniques were used to appropriately account for studies with scarce event data and imbalances in the size of comparison groups. DATA SYNTHESIS: Thirty-seven studies (1991 patients) were included. We found no randomised trial. The primary analysis revealed a large treatment effect in favour of antiplatelets for preventing the primary composite outcome of ischaemic stroke, intracranial haemorrhage or death within the first 3 months after treatment initiation (relative risk 0.32, 95% credibility interval 0.12 to 0.63), while the degree of between-study heterogeneity was moderate (τ² = 0.18). In an analysis restricted to studies of higher methodological quality, the possible advantage of antiplatelets over anticoagulants was less obvious than in the main analysis (relative risk 0.73, 95% credibility interval 0.17 to 2.30). CONCLUSION: In view of these results and the safety advantages, easier usage and lower cost of antiplatelets, we conclude that antiplatelets should be given precedence over anticoagulants as a first-line treatment in patients with cervical artery dissection unless the results of an adequately powered randomised trial suggest the opposite.

  11. Extreme Rainfall Analysis using Bayesian Hierarchical Modeling in the Willamette River Basin, Oregon

    Science.gov (United States)

    Love, C. A.; Skahill, B. E.; AghaKouchak, A.; Karlovits, G. S.; England, J. F.; Duren, A. M.

    2016-12-01

    We present preliminary results of ongoing research directed at evaluating the worth of including various covariate data to support extreme rainfall analysis in the Willamette River basin using Bayesian hierarchical modeling (BHM). We also compare the BHM-derived extreme rainfall estimates with their respective counterparts obtained from a traditional regional frequency analysis (RFA) using the same set of rain gage extreme rainfall data. The U.S. Army Corps of Engineers (USACE) Portland District operates thirteen dams in the 11,478-square-mile Willamette River basin (WRB) in northwestern Oregon; its 187-mile-long main stem, the Willamette River, a major tributary of the Columbia River, flows northward between the Coastal and Cascade Ranges. The WRB contains approximately two-thirds of Oregon's population and 20 of the 25 most populous cities in the state. Extreme rainfall estimates are required to support risk-informed hydrologic analyses for these projects as part of the USACE Dam Safety Program. We analyze annual daily rainfall maxima data for the WRB utilizing the spatial BHM R package "spatial.gev.bma", which has been shown to be efficient in developing coherent maps of extreme rainfall by return level. Our intent is to profile for the USACE an alternative methodology to the RFA that was developed in 2008 due to the lack of an official NOAA Atlas 14 update for the state of Oregon. Unlike RFA, the advantage of a BHM-based analysis of hydrometeorological extremes is its ability to account for non-stationarity while providing robust estimates of uncertainty. BHM also allows for the inclusion of geographical and climatological factors, which we show influence regional rainfall extremes in the WRB. Moreover, the Bayesian framework permits one to combine additional data types into the analysis, for example information derived via elicitation and causal information expansion, both being opportunities for future related research.
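    The single-site, non-hierarchical baseline implicit in this comparison is a GEV fit to annual maxima; a BHM pools such fits across gauges and lets parameters vary with covariates. A minimal sketch on synthetic data:

```python
import numpy as np
from scipy import stats

# Single-site GEV fit to synthetic annual rainfall maxima (all values invented)
rng = np.random.default_rng(10)
annual_max = stats.genextreme.rvs(c=-0.1, loc=50, scale=12, size=60,
                                  random_state=rng)

c, loc, scale = stats.genextreme.fit(annual_max)          # MLE of GEV parameters
rl100 = stats.genextreme.ppf(1 - 1 / 100, c, loc, scale)  # 100-year return level
print(f"shape={c:.2f}, loc={loc:.1f}, scale={scale:.1f}, 100-yr level={rl100:.1f}")
```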

  12. Bayesian Framework with Non-local and Low-rank Constraint for Image Reconstruction

    Science.gov (United States)

    Tang, Zhonghe; Wang, Shengzhe; Huo, Jianliang; Guo, Hang; Zhao, Haibo; Mei, Yuan

    2017-01-01

    Built upon the methodology of 'grouping and collaborative filtering', the proposed algorithm recovers image patches from an array of similar noisy patches, based on the assumption that their noise-free versions or approximations lie in a low-dimensional subspace and hence have low rank. Based on an analysis of the effect of noise and perturbation on the singular values, a weighted nuclear norm is defined to replace the conventional nuclear norm, and the corresponding low-rank decomposition model and singular value shrinkage operator are derived. Taking into account the difference between the distributions of the signal and the noise, the weight depends not only on the standard deviation of the noise, but also on the rank of the noise-free matrix and on the singular value itself. Experimental results on image reconstruction tasks show that, at relatively low computational cost, the performance of the proposed method is very close to that of the state-of-the-art reconstruction methods BM3D and LSSC, and even outperforms them in restoring and preserving structure.
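    The basic operator described here, shrinking the singular values of a matrix of grouped patches by data-dependent weights, can be sketched directly; the weighting rule below is illustrative, not the paper's derived weights.

```python
import numpy as np

def weighted_svt(patch_matrix, weights):
    """Weighted singular-value thresholding: shrink each singular value
    by its own weight and rebuild the low-rank estimate."""
    U, s, Vt = np.linalg.svd(patch_matrix, full_matrices=False)
    s_shrunk = np.maximum(s - weights, 0.0)
    return (U * s_shrunk) @ Vt

# Columns = similar noisy patches; a rank-1 "signal" plus Gaussian noise
rng = np.random.default_rng(5)
clean = np.outer(rng.normal(size=64), rng.normal(size=20))
noisy = clean + 0.3 * rng.normal(size=clean.shape)

# Heavier shrinkage on smaller (noise-dominated) singular values; this
# particular weight formula is just a plausible stand-in
_, s, _ = np.linalg.svd(noisy, full_matrices=False)
weights = 0.3 * np.sqrt(noisy.shape[1]) / (s / s.max() + 1e-3)
denoised = weighted_svt(noisy, weights)
```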

  13. A Bayesian Super-Resolution Approach to Demosaicing of Blurred Images

    OpenAIRE

    Molina Rafael; Katsaggelos Aggelos K; Vega Miguel

    2006-01-01

    Most of the available digital color cameras use a single image sensor with a color filter array (CFA) in acquiring an image. In order to produce a visible color image, a demosaicing process must be applied, which produces undesirable artifacts. An additional problem appears when the observed color image is also blurred. This paper addresses the problem of deconvolving color images observed with a single charge-coupled device (CCD) from the super-resolution point of view. Utilizing the Bayes...

  14. Image based performance analysis of thermal imagers

    Science.gov (United States)

    Wegner, D.; Repasi, E.

    2016-05-01

    Due to advances in technology, modern thermal imagers resemble sophisticated image processing systems in functionality. Advanced signal and image processing tools enclosed in the camera body extend the basic image capturing capability of thermal cameras. This is done in order to enhance the display presentation of the captured scene or of specific scene details. Usually, the implemented methods are proprietary company expertise, distributed without extensive documentation. This makes the comparison of thermal imagers, especially those from different companies, a difficult task (or at least a very time-consuming/expensive one, e.g. requiring the execution of a field trial and/or an observer trial). For example, a thermal camera equipped with turbulence mitigation capability is such a closed system. The Fraunhofer IOSB has started to build up a system for testing thermal imagers by image-based methods in the lab environment. This will extend our capability of measuring the classical IR-system parameters (e.g. MTF, MTDP, etc.) in the lab. The system is set up around an IR scene projector, which is necessary for the thermal display (projection) of an image sequence for the IR camera under test. The same set of thermal test sequences can be presented to every unit under test. For turbulence mitigation tests, this could be, e.g., the same turbulence sequence. During system tests, gradual variation of input parameters (e.g. thermal contrast) can be applied. First ideas on the selection of test scenes and on how to assemble an imaging suite (a set of image sequences) for the analysis of thermal imaging systems containing such black boxes in the image-forming path are discussed.

  15. Adaptive Bayesian-based speckle reduction in SAR images using complex wavelet transform

    Science.gov (United States)

    Ma, Ning; Yan, Wei; Zhang, Peng

    2005-10-01

    In this paper, an improved adaptive speckle reduction method is presented based on the dual-tree complex wavelet transform (CWT). It combines the additive-noise-reduction characteristics of soft thresholding with the CWT's directional selectivity; its main contribution is to adapt the effective threshold so as to preserve edge detail. A Bayesian estimator is also applied to the decomposed data to estimate the best value for the noise-free complex wavelet coefficients. This estimation is based on alpha-stable and Gaussian distribution hypotheses for the complex wavelet coefficients of the signal and noise, respectively. Experimental results show that the denoising performance is among the state-of-the-art techniques based on the real discrete wavelet transform (DWT).
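    A hedged stand-in for the method: pywt offers no dual-tree complex wavelet transform, but the soft-thresholding logic is analogous on a real DWT, as sketched below with a universal threshold and a robust noise estimate (all settings illustrative).

```python
import numpy as np
import pywt

# Synthetic noisy image: a bright square plus Gaussian noise
rng = np.random.default_rng(6)
img = np.zeros((64, 64)); img[16:48, 16:48] = 1.0
noisy = img + 0.2 * rng.normal(size=img.shape)

# Real-DWT decomposition (db4, 3 levels) as a stand-in for the dual-tree CWT
coeffs = pywt.wavedec2(noisy, "db4", level=3)
sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745   # robust noise estimate
thr = sigma * np.sqrt(2 * np.log(noisy.size))        # universal threshold

# Soft-threshold every detail subband, keep the approximation untouched
den = [coeffs[0]] + [
    tuple(pywt.threshold(c, thr, mode="soft") for c in level)
    for level in coeffs[1:]
]
denoised = pywt.waverec2(den, "db4")
```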

  16. Aerial Image Texture Classification Based on Multi-level Bayesian Network

    Institute of Scientific and Technical Information of China (English)

    虞欣; 郑肇葆; 叶志伟; 李林宜

    2008-01-01

    In practical applications of image classification, the extracted features (or bands) often exhibit considerable correlation. In order to apply Naive Bayes Classifiers (NBC) more effectively to classification, this paper builds on the NBC model and, from the perspective of partitioning the feature space, generalizes it to a multi-level Bayesian Network. Analysis of the experimental results shows that, because the multi-level Bayesian Network model takes the conditional dependencies among features into account, its classification accuracy is generally higher than that of the original NBC and the maximum likelihood method. However, the classification results differ for different values of n.

  17. A Bayesian sensitivity study of risk difference in the meta-analysis of binary outcomes from sparse data.

    Science.gov (United States)

    Vázquez-Polo, Francisco-Jose; Moreno, Elías; Negrín, Miguel A; Martel, Maria

    2015-04-01

    In most cases, including those of discrete random variables, statistical meta-analysis is carried out using the normal random-effects model. The authors argue that the normal approximation does not always properly reflect the underlying uncertainty of the original discrete data. Furthermore, in the presence of rare events the results from this approximation can be very poor. This review proposes a Bayesian meta-analysis to address binary outcomes from sparse data and also introduces a simple way to examine the sensitivity of the quantities of interest in the meta-analysis with respect to the dependence structure selected. The findings suggest that for binary outcome data it is possible to develop a Bayesian procedure which can be directly applied to sparse data without ad hoc corrections. By choosing a suitable class of linking distributions, the authors found that a Bayesian robustness study can be easily implemented. For illustrative purposes, an example with real data is analyzed using the proposed Bayesian meta-analysis for binomial sparse data.
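    The claim that sparse binary data can be handled directly, with no continuity corrections, is easy to demonstrate for a single 2x2 table with conjugate Beta posteriors (Jeffreys priors assumed here; the meta-analytic model links such tables across studies).

```python
import numpy as np

# One sparse 2x2 table: zero events in the treatment arm, a few in control
rng = np.random.default_rng(12)
events_t, n_t = 0, 25
events_c, n_c = 3, 24

# Jeffreys Beta(0.5, 0.5) priors give closed-form Beta posteriors per arm
p_t = rng.beta(events_t + 0.5, n_t - events_t + 0.5, 100_000)
p_c = rng.beta(events_c + 0.5, n_c - events_c + 0.5, 100_000)

rd = p_t - p_c                 # Monte Carlo posterior of the risk difference
print(np.mean(rd), np.percentile(rd, [2.5, 97.5]))
```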

  18. Selection of Trusted Service Providers by Enforcing Bayesian Analysis in iVCE

    Institute of Scientific and Technical Information of China (English)

    GU Bao-jun; LI Xiao-yong; WANG Wei-nong

    2008-01-01

    The initiative of the internet-based virtual computing environment (iVCE) aims to provide end users and applications with a harmonious, trustworthy and transparent integrated computing environment which will facilitate the sharing of, and collaboration over, network resources between applications. Trust management is an elementary component of iVCE. The uncertain and dynamic characteristics of iVCE require trust management that is subjective, historical-evidence-based and context-dependent. This paper presents a Bayesian-analysis-based trust model, which aims to help active agents select appropriate trusted services in iVCE. Simulations are made to analyze the properties of the trust model, which show that subjective prior information strongly influences trust evaluation and that the model stimulates positive interactions.

  19. Statistical analysis using the Bayesian nonparametric method for irradiation embrittlement of reactor pressure vessels

    Science.gov (United States)

    Takamizawa, Hisashi; Itoh, Hiroto; Nishiyama, Yutaka

    2016-10-01

    In order to understand neutron irradiation embrittlement in high fluence regions, statistical analysis using the Bayesian nonparametric (BNP) method was performed for the Japanese surveillance and material test reactor irradiation database. The BNP method is essentially expressed as an infinite summation of normal distributions, with input data being subdivided into clusters with identical statistical parameters, such as mean and standard deviation, for each cluster to estimate shifts in ductile-to-brittle transition temperature (DBTT). The clusters typically depend on chemical compositions, irradiation conditions, and the irradiation embrittlement. Specific variables contributing to the irradiation embrittlement include the content of Cu, Ni, P, Si, and Mn in the pressure vessel steels, neutron flux, neutron fluence, and irradiation temperatures. It was found that the measured shifts of DBTT correlated well with the calculated ones. Data associated with the same materials were subdivided into the same clusters even if neutron fluences were increased.
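    A convenient off-the-shelf approximation to this kind of BNP clustering is scikit-learn's truncated Dirichlet-process Gaussian mixture, which drives the weights of surplus components toward zero; the two-cluster data below are synthetic, not the surveillance database.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Two synthetic clusters standing in for groups of materials/conditions
rng = np.random.default_rng(7)
X = np.vstack([
    rng.normal([0, 0], 0.3, (100, 2)),
    rng.normal([3, 1], 0.4, (80, 2)),
])

# Truncated Dirichlet-process mixture: n_components is an upper bound,
# not the answer; unused components receive near-zero weight
bgm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    max_iter=500,
    random_state=0,
).fit(X)
print(np.round(bgm.weights_, 3))   # most of the 10 weights collapse to ~0
```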

  20. Bayesian Reliability Analysis of Non-Stationarity in Multi-agent Systems

    Directory of Open Access Journals (Sweden)

    TONT Gabriela

    2013-05-01

    The Bayesian methods provide information about the meaningful parameters in a statistical analysis obtained by combining the prior and sampling distributions to form the posterior distribution of the parameters. The desired inferences are obtained from this joint posterior. An estimation strategy for hierarchical models, where the resulting joint distribution of the associated model parameters cannot be evaluated analytically, is to use sampling algorithms, known as Markov Chain Monte Carlo (MCMC) methods, from which approximate solutions can be obtained. Both serial and parallel configurations of subcomponents are permitted. The capability of the time-dependent method to describe a multi-state system is demonstrated via a case study assessing the operational situation of the studied system. The rationality and validity of the presented model are demonstrated via this case study. The effect of randomness of the structural parameters is also examined.

  1. A Bayesian based functional mixed-effects model for analysis of LC-MS data.

    Science.gov (United States)

    Befekadu, Getachew K; Tadesse, Mahlet G; Ressom, Habtom W

    2009-01-01

    A Bayesian multilevel functional mixed-effects model with group-specific random effects is presented for the analysis of liquid chromatography-mass spectrometry (LC-MS) data. The proposed framework allows alignment of LC-MS spectra with respect to both retention time (RT) and mass-to-charge ratio (m/z). Affine transformations are incorporated within the model to account for any variability along the RT and m/z dimensions. Simultaneous posterior inference of all unknown parameters is accomplished via a Markov chain Monte Carlo method using the Gibbs sampling algorithm. The proposed approach is computationally tractable and allows the incorporation of prior knowledge in the inference process. We demonstrate the applicability of our approach for the alignment of LC-MS spectra based on total ion count profiles derived from two LC-MS datasets.

  2. Fermi's paradox, extraterrestrial life and the future of humanity: a Bayesian analysis

    CERN Document Server

    Verendel, Vilhelm

    2015-01-01

    The Great Filter interpretation of Fermi's great silence asserts that $Npq$ is not a very large number, where $N$ is the number of potentially life-supporting planets in the observable universe, $p$ is the probability that a randomly chosen such planet develops intelligent life to the level of present-day human civilization, and $q$ is the conditional probability that it then goes on to develop a technological supercivilization visible all over the observable universe. Evidence suggests that $N$ is huge, which implies that $pq$ is very small. Hanson (1998) and Bostrom (2008) have argued that the discovery of extraterrestrial life would point towards $p$ not being small and therefore a very small $q$, which can be seen as bad news for humanity's prospects of colonizing the universe. Here we investigate whether a Bayesian analysis supports their argument, and the answer turns out to depend critically on the choice of prior distribution.

  3. Fermi's paradox, extraterrestrial life and the future of humanity: a Bayesian analysis

    Science.gov (United States)

    Verendel, Vilhelm; Häggström, Olle

    2017-01-01

    The Great Filter interpretation of Fermi's great silence asserts that Npq is not a very large number, where N is the number of potentially life-supporting planets in the observable universe, p is the probability that a randomly chosen such planet develops intelligent life to the level of present-day human civilization, and q is the conditional probability that it then goes on to develop a technological supercivilization visible all over the observable universe. Evidence suggests that N is huge, which implies that pq is very small. Hanson (1998) and Bostrom (2008) have argued that the discovery of extraterrestrial life would point towards p not being small and therefore a very small q, which can be seen as bad news for humanity's prospects of colonizing the universe. Here we investigate whether a Bayesian analysis supports their argument, and the answer turns out to depend critically on the choice of prior distribution.

  4. Constraints on cosmic-ray propagation models from a global Bayesian analysis

    CERN Document Server

    Trotta, R; Moskalenko, I V; Porter, T A; de Austri, R Ruiz; Strong, A W

    2010-01-01

    Research in many areas of modern physics, such as indirect searches for dark matter and particle acceleration in SNR shocks, relies heavily on studies of cosmic rays (CRs) and the associated diffuse emissions (radio, microwave, X-rays, gamma rays). While very detailed numerical models of CR propagation exist, a quantitative statistical analysis of such models has so far been hampered by the large computational effort that those models require. Although statistical analyses have been carried out before using semi-analytical models (where the computation is much faster), the evaluation of the results obtained from such models is difficult, as they necessarily suffer from many simplifying assumptions. The main objective of this paper is to present a working method for full Bayesian parameter estimation for a numerical CR propagation model. For this study, we use the GALPROP code, the most advanced of its kind, that uses astrophysical information, nuclear and particle data as input to self-consistently predict ...

  5. Evaluating predictors of dispersion: a comparison of Dominance Analysis and Bayesian Model Averaging.

    Science.gov (United States)

    Shou, Yiyun; Smithson, Michael

    2015-03-01

    Conventional measures of predictor importance in linear models are applicable only when the assumption of homoscedasticity is satisfied. Moreover, they cannot be adapted to evaluating predictor importance in models of heteroscedasticity (i.e., dispersion), an issue that seems not to have been systematically addressed in the literature. We compare two suitable approaches, Dominance Analysis (DA) and Bayesian Model Averaging (BMA), for simultaneously evaluating predictor importance in models of location and dispersion. We apply them to the beta general linear model as a test-case, illustrating this with an example using real data. Simulations using several different model structures, sample sizes, and degrees of multicollinearity suggest that both DA and BMA largely agree on the relative importance of predictors of the mean, but differ when ranking predictors of dispersion. The main implication of these findings for researchers is that the choice between DA and BMA is most important when they wish to evaluate the importance of predictors of dispersion.

  6. Bayesian semiparametric power spectral density estimation in gravitational wave data analysis

    CERN Document Server

    Edwards, Matthew C; Christensen, Nelson

    2015-01-01

    The standard noise model in gravitational wave (GW) data analysis assumes detector noise is stationary and Gaussian distributed, with a known power spectral density (PSD) that is usually estimated using clean off-source data. Real GW data often depart from these assumptions, and misspecified parametric models of the PSD could result in misleading inferences. We propose a Bayesian semiparametric approach to improve this. We use a nonparametric Bernstein polynomial prior on the PSD, with weights attained via a Dirichlet process distribution, and update this using the Whittle likelihood. Posterior samples are obtained using a Metropolis-within-Gibbs sampler. We simultaneously estimate the reconstruction parameters of a rotating core collapse supernova GW burst that has been embedded in simulated Advanced LIGO noise. We also discuss an approach to deal with non-stationary data by breaking longer data streams into smaller and locally stationary components.
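    The Whittle likelihood that drives the posterior updates is simple to state: up to constants, it is minus the sum over positive Fourier frequencies of log f(λ_j) + I(λ_j)/f(λ_j), with I the periodogram and f the candidate PSD. A minimal sketch follows (normalization conventions vary; constants are dropped).

```python
import numpy as np

def whittle_loglik(x, psd):
    """Whittle log-likelihood (constants dropped) of a zero-mean series x
    of even length n, given a PSD evaluated at the n//2 - 1 positive
    Fourier frequencies (DC and Nyquist excluded)."""
    n = len(x)
    periodogram = np.abs(np.fft.rfft(x)) ** 2 / n
    I = periodogram[1:n // 2]
    return -np.sum(np.log(psd) + I / psd)

# White noise has a flat spectrum, so a flat PSD should score well
rng = np.random.default_rng(8)
x = rng.normal(size=1024)
freqs = np.arange(1, 512) / 1024
flat = np.ones_like(freqs)
print(whittle_loglik(x, flat))
```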

  7. Bayesian design and analysis of computer experiments: Use of derivatives in surface prediction

    Energy Technology Data Exchange (ETDEWEB)

    Morris, M.D.; Mitchell, T.J. (Oak Ridge National Lab., TN (USA)); Ylvisaker, D. (California Univ., Los Angeles, CA (USA). Dept. of Mathematics)

    1991-06-01

    The work of Currin et al. and others in developing "fast predictive approximations" of computer models is extended for the case in which derivatives of the output variable of interest with respect to input variables are available. In addition to describing the calculations required for the Bayesian analysis, the issue of experimental design is also discussed, and an algorithm is described for constructing "maximin distance" designs. An example is given based on a demonstration model of eight inputs and one output, in which predictions based on a maximin design, a Latin hypercube design, and two "compromise" designs are evaluated and compared. 12 refs., 2 figs., 6 tabs.
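    As a crude stand-in for the maximin-design algorithm described (not the authors' method), one can randomly search for a design whose smallest pairwise distance is largest; the run count and dimensions below are arbitrary.

```python
import numpy as np
from scipy.spatial.distance import pdist

# Random search for a maximin-distance design: keep the candidate whose
# smallest pairwise distance is largest
rng = np.random.default_rng(13)
best, best_score = None, -np.inf
for _ in range(2000):
    design = rng.random((10, 8))          # 10 runs, 8 inputs in [0, 1]^8
    score = pdist(design).min()           # smallest pairwise separation
    if score > best_score:
        best, best_score = design, score

print("maximin separation:", round(best_score, 3))
```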

  8. Bayesian Method of Moments (BMOM) Analysis of Mean and Regression Models

    CERN Document Server

    Zellner, Arnold

    2008-01-01

    A Bayesian method of moments/instrumental variable (BMOM/IV) approach is developed and applied in the analysis of the important mean and multiple regression models. Given a single set of data, it is shown how to obtain posterior and predictive moments without the use of likelihood functions, prior densities and Bayes' Theorem. The posterior and predictive moments, based on a few relatively weak assumptions, are then used to obtain maximum entropy densities for parameters, realized error terms and future values of variables. Posterior means for parameters and realized error terms are shown to be equal to certain well known estimates and rationalized in terms of quadratic loss functions. Conditional maxent posterior densities for means and regression coefficients given scale parameters are in the normal form while scale parameters' maxent densities are in the exponential form. Marginal densities for individual regression coefficients, realized error terms and future values are in the Laplace or double-exponenti...

  9. Bayesian analysis of general failure data from an ageing distribution: advances in numerical methods

    Energy Technology Data Exchange (ETDEWEB)

    Procaccia, H.; Villain, B. [Electricite de France (EDF), 93 - Saint-Denis (France); Clarotti, C.A. [ENEA, Casaccia (Italy)

    1996-12-31

    EDF and ENEA carried out a joint research program to develop the numerical methods and computer codes needed for Bayesian analysis of component lives in the case of ageing. Early results of this study were presented at ESREL'94. Since then the following further steps have been taken: input data have been generalized to the case where observed lives are censored both on the right and on the left; allowable life distributions are Weibull and gamma - their parameters are both unknown and can be statistically dependent; allowable priors are histograms relative to different parametrizations of the life distribution of concern; first- and second-order moments of the posterior distributions can be computed. In particular, the covariance gives important information about the degree of statistical dependence between the parameters of interest. An application of the code to the appearance of stress corrosion cracking in a tube of the PWR Steam Generator system is presented. (authors). 10 refs.

  10. Linking Neighborhood Characteristics and Drug-Related Police Interventions: A Bayesian Spatial Analysis

    Directory of Open Access Journals (Sweden)

    Miriam Marco

    2017-02-01

    This paper aimed to analyze the spatial distribution of drug-related police interventions and the neighborhood characteristics influencing these spatial patterns. To this end, police officers ranked each census block group in Valencia, Spain (N = 552), providing an index of drug-related police interventions. Data from the City Statistics Office and observational variables were used to analyze neighborhood characteristics. Distance to the police station was used as the control variable. A Bayesian ecological analysis was performed with a spatial beta regression model. Results indicated that high physical decay, low socioeconomic status, and high immigrant concentration were associated with high levels of drug-related police interventions after adjustment for distance to the police station. Results illustrate the importance of a spatial approach to understanding crime.

  11. Intuitive logic revisited: new data and a Bayesian mixed model meta-analysis.

    Science.gov (United States)

    Singmann, Henrik; Klauer, Karl Christoph; Kellen, David

    2014-01-01

    Recent research on syllogistic reasoning suggests that the logical status (valid vs. invalid) of even difficult syllogisms can be intuitively detected via differences in conceptual fluency between logically valid and invalid syllogisms when participants are asked to rate how much they like a conclusion following from a syllogism (Morsanyi & Handley, 2012). These claims of an intuitive logic are at odds with most theories on syllogistic reasoning which posit that detecting the logical status of difficult syllogisms requires effortful and deliberate cognitive processes. We present new data replicating the effects reported by Morsanyi and Handley, but show that this effect is eliminated when controlling for a possible confound in terms of conclusion content. Additionally, we reanalyze three studies (n = 287) without this confound with a Bayesian mixed model meta-analysis (i.e., controlling for participant and item effects) which provides evidence for the null-hypothesis and against Morsanyi and Handley's claim.

  12. A Bayesian analysis of redshifted 21-cm HI signal and foregrounds: Simulations for LOFAR

    CERN Document Server

    Ghosh, Abhik; Chapman, Emma; Jelic, Vibor

    2015-01-01

    Observations of the EoR with the 21-cm hyperfine emission of neutral hydrogen (HI) promise to open an entirely new window onto the formation of the first stars, galaxies and accreting black holes. In order to characterize the weak 21-cm signal, we need to develop imaging techniques which can reconstruct the extended emission very precisely. Here, we present an inversion technique for LOFAR baselines at the NCP, based on a Bayesian formalism with optimal spatial regularization, which is used to reconstruct the diffuse foreground map directly from the simulated visibility data. We notice the spatial regularization de-noises the images to a large extent, allowing one to recover the 21-cm power spectrum over a considerable $k_{\perp}-k_{\parallel}$ space in the range of $0.03\,{\rm Mpc^{-1}}$…

  13. Bayesian deconvolution of scanning electron microscopy images using point-spread function estimation and non-local regularization.

    Science.gov (United States)

    Roels, Joris; Aelterman, Jan; De Vylder, Jonas; Hiep Luong; Saeys, Yvan; Philips, Wilfried

    2016-08-01

    Microscopy is one of the most essential imaging techniques in life sciences. High-quality images are required in order to solve (potentially life-saving) biomedical research problems. Many microscopy techniques do not achieve sufficient resolution for these purposes, being limited by physical diffraction and hardware deficiencies. Electron microscopy addresses optical diffraction by measuring emitted or transmitted electrons instead of photons, yielding nanometer resolution. Despite pushing back the diffraction limit, blur should still be taken into account because of practical hardware imperfections and remaining electron diffraction. Deconvolution algorithms can remove some of the blur in post-processing but they depend on knowledge of the point-spread function (PSF) and should accurately regularize noise. Any errors in the estimated PSF or noise model will reduce their effectiveness. This paper proposes a new procedure to estimate the lateral component of the point spread function of a 3D scanning electron microscope more accurately. We also propose a Bayesian maximum a posteriori deconvolution algorithm with a non-local image prior which employs this PSF estimate and previously developed noise statistics. We demonstrate visual quality improvements and show that applying our method improves the quality of subsequent segmentation steps.
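    As a baseline for the deconvolution step (not the authors' non-local MAP method), Richardson-Lucy with a known PSF is a few lines in scikit-image; the Gaussian PSF below is an assumption for the demo, not an SEM estimate, and the num_iter keyword applies to skimage >= 0.19.

```python
import numpy as np
from scipy.signal import convolve2d
from skimage import restoration

# Assumed Gaussian PSF (the paper estimates the real lateral PSF instead)
xx, yy = np.mgrid[-3:4, -3:4]
psf = np.exp(-(xx**2 + yy**2) / 2.0)
psf /= psf.sum()

# Synthetic blurred, noisy observation of a bright square
rng = np.random.default_rng(9)
img = np.zeros((64, 64)); img[24:40, 24:40] = 1.0
blurred = convolve2d(img, psf, mode="same") + 0.01 * rng.normal(size=img.shape)

# Richardson-Lucy deconvolution (expects a non-negative image)
deconv = restoration.richardson_lucy(np.clip(blurred, 0, None), psf, num_iter=30)
```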

  14. A Bayesian-based multilevel factorial analysis method for analyzing parameter uncertainty of hydrological model

    Science.gov (United States)

    Liu, Y. R.; Li, Y. P.; Huang, G. H.; Zhang, J. L.; Fan, Y. R.

    2017-10-01

    In this study, a Bayesian-based multilevel factorial analysis (BMFA) method is developed to assess parameter uncertainties and their effects on hydrological model responses. In BMFA, the Differential Evolution Adaptive Metropolis (DREAM) algorithm is employed to approximate the posterior distributions of model parameters with Bayesian inference; the factorial analysis (FA) technique is used for measuring the specific variations of hydrological responses in terms of posterior distributions to investigate the individual and interactive effects of parameters on model outputs. BMFA is then applied to a case study of the Jinghe River watershed in the Loess Plateau of China to demonstrate its validity and applicability. The uncertainties of four sensitive parameters, including the soil conservation service runoff curve number for moisture condition II (CN2), soil hydraulic conductivity (SOL_K), plant available water capacity (SOL_AWC), and soil depth (SOL_Z), are investigated. Results reveal that (i) CN2 has a positive effect on peak flow, implying that concentrated rainfall during the rainy season can cause infiltration-excess surface flow, which is a considerable contributor to peak flow in this watershed; (ii) SOL_K has a positive effect on average flow, implying that the widely distributed cambisols can lead to medium percolation capacity; (iii) the interaction between SOL_AWC and SOL_Z has a noticeable effect on peak flow and their effects are dependent upon each other, which discloses that soil depth can significantly influence the processes of plant uptake of soil water in this watershed. Based on the above findings, the significant parameters and the relationships among uncertain parameters can be specified, such that the hydrological model's capability for simulating/predicting the water resources of the Jinghe River watershed can be improved.

  15. Bayesian Analysis of Abduction (从贝叶斯方法看溯因推理)

    Institute of Scientific and Technical Information of China (English)

    袁继红; 陈晓平

    2014-01-01

    Charles S. Peirce argued that abduction (abductive reasoning) is a third kind of reasoning, different from both deduction and induction. However, Peirce's definition of abduction is ambiguous, which gives rise to a paradox: abduction is distinct from induction, yet abduction also belongs to induction. Building on the Bayesian understanding and treatment of induction, this paper examines two typical contemporary approaches to resolving the paradox: Hintikka's distinction between definitory rules and strategic rules, and Lipton's theory of inference to the best explanation (IBE). The analysis shows that neither approach succeeds, whereas the Bayesian framework can accommodate both abductive induction and abduction, thereby dissolving the paradox.

  16. Bayesian adaptive methods for clinical trials

    CERN Document Server

    Berry, Scott M; Muller, Peter

    2010-01-01

    Already popular in the analysis of medical device trials, adaptive Bayesian designs are increasingly being used in drug development for a wide variety of diseases and conditions, from Alzheimer's disease and multiple sclerosis to obesity, diabetes, hepatitis C, and HIV. Written by leading pioneers of Bayesian clinical trial designs, Bayesian Adaptive Methods for Clinical Trials explores the growing role of Bayesian thinking in the rapidly changing world of clinical trial analysis. The book first summarizes the current state of clinical trial design and analysis and introduces the main ideas and potential benefits of a Bayesian alternative. It then gives an overview of basic Bayesian methodological and computational tools needed for Bayesian clinical trials. With a focus on Bayesian designs that achieve good power and Type I error, the next chapters present Bayesian tools useful in early (Phase I) and middle (Phase II) clinical trials as well as two recent Bayesian adaptive Phase II studies: the BATTLE and ISP...

  17. A Bayesian nonrigid registration method to enhance intraoperative target definition in image-guided prostate procedures through uncertainty characterization

    Science.gov (United States)

    Pursley, Jennifer; Risholm, Petter; Fedorov, Andriy; Tuncali, Kemal; Fennessy, Fiona M.; Wells, William M.; Tempany, Clare M.; Cormack, Robert A.

    2012-01-01

    Purpose: This study introduces a probabilistic nonrigid registration method for use in image-guided prostate brachytherapy. Intraoperative imaging for prostate procedures, usually transrectal ultrasound (TRUS), is typically inferior to diagnostic-quality imaging of the pelvis such as endorectal magnetic resonance imaging (MRI). MR images contain superior detail of the prostate boundaries and provide substructure features not otherwise visible. Previous efforts to register diagnostic prostate images with the intraoperative coordinate system have been deterministic and did not offer a measure of the registration uncertainty. The authors developed a Bayesian registration method to estimate the posterior distribution on deformations and provide a case-specific measure of the associated registration uncertainty. Methods: The authors adapted a biomechanical-based probabilistic nonrigid method to register diagnostic to intraoperative images by aligning a physician's segmentations of the prostate in the two images. The posterior distribution was characterized with a Markov Chain Monte Carlo method; the maximum a posteriori deformation and the associated uncertainty were estimated from the collection of deformation samples drawn from the posterior distribution. The authors validated the registration method using a dataset created from ten patients with MRI-guided prostate biopsies who had both diagnostic and intraprocedural 3 Tesla MRI scans. The accuracy and precision of the estimated posterior distribution on deformations were evaluated from two predictive distance distributions: between the deformed central zone-peripheral zone (CZ-PZ) interface and the physician-labeled interface, and based on physician-defined landmarks. Geometric margins on the registration of the prostate's peripheral zone were determined from the posterior predictive distance to the CZ-PZ interface separately for the base, mid-gland, and apical regions of the prostate. Results: The authors observed

  18. Sea-level variability in tide-gauge and geological records: An empirical Bayesian analysis (Invited)

    Science.gov (United States)

    Kopp, R. E.; Hay, C.; Morrow, E.; Mitrovica, J. X.; Horton, B.; Kemp, A.

    2013-12-01

    Sea level varies at a range of temporal and spatial scales, and understanding all its significant sources of variability is crucial to building sea-level rise projections relevant to local decision-making. In the twentieth-century record, sites along the U.S. east coast have exhibited typical year-to-year variability of several centimeters. A faster-than-global increase in sea-level rise in the northeastern United States since about 1990 has led some to hypothesize a 'sea-level rise hot spot' in this region, perhaps driven by a trend in the Atlantic Meridional Overturning Circulation related to anthropogenic climate change [1]. However, such hypotheses must be evaluated in the context of natural variability, as revealed by observational and paleo-records. Bayesian and empirical Bayesian statistical approaches are well suited for assimilating data from diverse sources, such as tide-gauges and peats, with differing data availability and uncertainties, and for identifying regionally covarying patterns within these data. We present empirical Bayesian analyses of twentieth-century tide gauge data [2]. We find that the mid-Atlantic region of the United States has experienced a clear acceleration of sea level relative to the global average since about 1990, but this acceleration does not appear to be unprecedented in the twentieth-century record. The rate and extent of this acceleration instead appears comparable to an acceleration observed in the 1930s and 1940s. Both during the earlier episode of acceleration and today, the effect appears to be significantly positively correlated with the Atlantic Multidecadal Oscillation and likely negatively correlated with the North Atlantic Oscillation [2]. The Holocene and Common Era database of geological sea-level rise proxies [3,4] may allow these relationships to be assessed beyond the span of the direct observational record. At a global scale, similar approaches can be employed to look for the spatial fingerprints of land ice

  19. Hip fracture in the elderly: a re-analysis of the EPIDOS study with causal Bayesian networks.

    Directory of Open Access Journals (Sweden)

    Pascal Caillet

    Full Text Available Hip fractures commonly result in permanent disability, institutionalization or death in the elderly. Existing hip-fracture prediction tools are underused in clinical practice, partly due to their lack of intuitive interpretation. By use of a graphical layer, Bayesian network models could increase the attractiveness of fracture prediction tools. Our aim was to study the potential contribution of a causal Bayesian network in this clinical setting. A logistic regression was performed as a standard control approach to check the robustness of the causal Bayesian network approach. EPIDOS is a multicenter study, conducted in an ambulatory care setting in five French cities between 1992 and 1996 and updated in 2010. The study included 7598 women aged 75 years or older, in whom fractures were assessed quarterly during 4 years. A causal Bayesian network and a logistic regression were performed on the EPIDOS data to describe the major variables involved in hip fracture occurrence. Both models had similar association estimations and predictive performances. They detected gait speed and bone mineral density as the variables most involved in the fracture process. The causal Bayesian network showed that gait speed and bone mineral density were directly connected to fracture and seem to mediate the influence of all the other variables included in our model. The logistic regression approach detected multiple interactions involving psychotropic drug use, age and bone mineral density. Both approaches retrieved similar variables as predictors of hip fractures. However, the Bayesian network highlighted the whole web of relations among the variables involved in the analysis, suggesting a possible mechanism leading to hip fracture. According to the latter results, an intervention focusing concomitantly on gait speed and bone mineral density may be necessary for optimal prevention of hip fracture occurrence in elderly people.
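
    As a toy illustration of the mediation structure the abstract describes (age influencing gait speed and bone mineral density, which in turn drive fracture risk), the following sketch hand-specifies a four-node discrete Bayesian network and queries it by brute-force enumeration. The structure and all probabilities are invented for illustration; they are not EPIDOS estimates.

        import itertools

        # Toy conditional probability tables (binary variables; values invented).
        P_A = {0: 0.5, 1: 0.5}                        # age: 0 = younger, 1 = older
        P_G_given_A = {0: {0: 0.7, 1: 0.3},           # gait: 0 = normal, 1 = slow
                       1: {0: 0.4, 1: 0.6}}
        P_B_given_A = {0: {0: 0.8, 1: 0.2},           # BMD: 0 = normal, 1 = low
                       1: {0: 0.5, 1: 0.5}}
        P_F_given_GB = {(0, 0): 0.01, (0, 1): 0.04,   # fracture prob per (gait, BMD)
                        (1, 0): 0.05, (1, 1): 0.15}

        def joint(a, g, b, f):
            pf = P_F_given_GB[(g, b)]
            return (P_A[a] * P_G_given_A[a][g] * P_B_given_A[a][b]
                    * (pf if f == 1 else 1.0 - pf))

        def prob_fracture(**evidence):
            """P(F = 1 | evidence), by enumeration over the full joint."""
            num = den = 0.0
            for a, g, b, f in itertools.product([0, 1], repeat=4):
                assign = dict(a=a, g=g, b=b, f=f)
                if any(assign[k] != v for k, v in evidence.items()):
                    continue
                p = joint(a, g, b, f)
                den += p
                if f == 1:
                    num += p
            return num / den

        print("P(fracture | slow gait)          =", round(prob_fracture(g=1), 4))
        print("P(fracture | slow gait, low BMD) =", round(prob_fracture(g=1, b=1), 4))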

  20. Bayesian Analysis of Nonlinear Structural Equation Models with Nonignorable Missing Data

    Science.gov (United States)

    Lee, Sik-Yum

    2006-01-01

    A Bayesian approach is developed for analyzing nonlinear structural equation models with nonignorable missing data. The nonignorable missingness mechanism is specified by a logistic regression model. A hybrid algorithm that combines the Gibbs sampler and the Metropolis-Hastings algorithm is used to produce the joint Bayesian estimates of…
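
    The hybrid sampler the abstract mentions alternates closed-form Gibbs updates with Metropolis-Hastings updates for the intractable conditionals. A generic toy version on a bivariate normal target (not the paper's structural equation model) looks like this:

        import numpy as np

        rng = np.random.default_rng(1)
        rho = 0.8  # target: standard bivariate normal with correlation rho

        def log_cond_y(y, x):
            # log density of y | x, up to an additive constant
            return -0.5 * (y - rho * x) ** 2 / (1.0 - rho ** 2)

        x, y, draws = 0.0, 0.0, []
        for it in range(10000):
            # Gibbs step: x | y has the closed form N(rho * y, 1 - rho^2)
            x = rho * y + np.sqrt(1.0 - rho ** 2) * rng.standard_normal()
            # Metropolis-Hastings step for y | x (random-walk proposal)
            y_prop = y + 0.5 * rng.standard_normal()
            if np.log(rng.uniform()) < log_cond_y(y_prop, x) - log_cond_y(y, x):
                y = y_prop
            draws.append((x, y))

        draws = np.asarray(draws)[2000:]
        print("empirical correlation:", np.corrcoef(draws.T)[0, 1])  # close to 0.8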

  1. Bayesian analysis of risk associated with workplace accidents in earthmoving operations

    Directory of Open Access Journals (Sweden)

    J. F. García

    2017-06-01

    Full Text Available This paper analyses the characteristics of earthmoving operations involving a workplace accident. Bayesian networks were used to identify the factors that best predicted potential risk situations. Inference studies were then conducted to analyse the interplay between different risk factors. We demonstrate the potential of Bayesian networks to describe workplace contexts and predict risk situations from a safety and production planning perspective.

  2. Reflections on ultrasound image analysis.

    Science.gov (United States)

    Alison Noble, J

    2016-10-01

    Ultrasound (US) image analysis has advanced considerably over the past twenty years. Progress in ultrasound image analysis has always been fundamental to the advancement of image-guided interventions research, owing to the real-time acquisition capability of ultrasound, and this has remained true over the two decades. But in quantitative ultrasound image analysis - which takes US images and turns them into more meaningful clinical information - thinking has perhaps changed more fundamentally. From its roots as a poor cousin to Computed Tomography (CT) and Magnetic Resonance (MR) image analysis - both of which have richer anatomical definition and thus were better suited to the earlier eras of medical image analysis, which were dominated by model-based methods - ultrasound image analysis has now entered an exciting new era, assisted by advances in machine learning and the growing clinical and commercial interest in employing low-cost portable ultrasound devices outside traditional hospital-based clinical settings. This short article provides a perspective on this change, and highlights some challenges ahead and potential opportunities in ultrasound image analysis, which may have high impact on healthcare delivery worldwide in the future but may also, perhaps, take the subject further away from CT and MR image analysis research with time.

  3. Integrated survival analysis using an event-time approach in a Bayesian framework

    Science.gov (United States)

    Walsh, Daniel P.; Dreitz, VJ; Heisey, Dennis M.

    2015-01-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited the application of these analyses, particularly within the ecological field, where the fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known, including interval-censored times, with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because the fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed that biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected, with the RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined that mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the
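
    A brief sketch of the piece-wise constant hazard machinery for the known-fate part of such a model (bin edges, rates, and data below are hypothetical; the paper's unknown-fate detection component is omitted):

        import numpy as np

        breaks = np.array([0.0, 5.0, 15.0, 30.0])   # age-bin edges, days
        rates = np.array([0.08, 0.03, 0.01])        # hazard per day in each bin

        def cum_hazard(t):
            """Integrated hazard H(t) under piecewise-constant rates."""
            t = np.atleast_1d(np.asarray(t, dtype=float))
            lo, hi = breaks[:-1], breaks[1:]
            exposure = np.clip(t[:, None] - lo, 0.0, hi - lo)  # time in each bin
            return exposure @ rates

        def log_lik(times, died):
            """Known-fate log-likelihood: h(t)S(t) if died, S(t) if censored."""
            H = cum_hazard(times)
            idx = np.clip(np.searchsorted(breaks, times, side="right") - 1,
                          0, len(rates) - 1)
            return float(np.sum(died * np.log(rates[idx]) - H))

        times = np.array([2.0, 7.5, 12.0, 30.0])    # observed event/censoring times
        died = np.array([1, 1, 0, 0])               # 1 = death observed, 0 = censored
        print("S(10 days) =", float(np.exp(-cum_hazard(10.0))[0]))
        print("log-likelihood =", log_lik(times, died))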

  4. Bias correction and Bayesian analysis of aggregate counts in SAGE libraries

    Directory of Open Access Journals (Sweden)

    Briggs William M

    2010-02-01

    Full Text Available Abstract Background Tag-based techniques, such as SAGE, are commonly used to sample the mRNA pool of an organism's transcriptome. Incomplete digestion during the tag formation process may allow multiple tags to be generated from a given mRNA transcript. The probability of forming a tag varies with its relative location. As a result, the observed tag counts represent a biased sample of the actual transcript pool. In SAGE this bias can be avoided by ignoring all but the 3'-most tag, but doing so discards a large fraction of the observed data. Taking this bias into account should allow more of the available data to be used, leading to increased statistical power. Results Three new hierarchical models, which directly embed a model for the variation in tag formation probability, are proposed and their associated Bayesian inference algorithms are developed. These models may be applied to libraries at both the tag and aggregate level. Simulation experiments and analysis of real data are used to contrast the accuracy of the various methods. The consequences of tag formation bias are discussed in the context of testing differential expression. A description is given as to how these algorithms can be applied in that context. Conclusions Several Bayesian inference algorithms that account for tag formation effects are compared with the DPB algorithm, providing clear evidence of superior performance. The accuracy of inferences when using a particular non-informative prior is found to depend on the expression level of a given gene. The multivariate nature of the approach easily allows both univariate and joint tests of differential expression. Calculations demonstrate the potential for false positive and negative findings due to variation in tag formation probabilities across samples when testing for differential expression.
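
    One common simplification of this positional bias (an assumption made here for illustration, not necessarily the hierarchical model used in the paper) is that digestion succeeds independently at each anchoring-enzyme site, so the k-th site from the 3' end forms the observed tag with geometric probability p(1 - p)^(k-1):

        import numpy as np

        rng = np.random.default_rng(2)

        p = 0.7            # per-site digestion probability (assumed)
        n_sites = 4        # anchoring-enzyme sites per transcript
        n_copies = 100000  # simulated transcript copies

        # Site k (k = 1 is the 3'-most site) yields the tag only if digestion
        # fails at the k-1 sites nearer the 3' end and succeeds at site k.
        theory = p * (1.0 - p) ** np.arange(n_sites)

        cuts = rng.uniform(size=(n_copies, n_sites)) < p
        first = np.argmax(cuts, axis=1)           # first successful site, 3' -> 5'
        observed = np.bincount(first[cuts.any(axis=1)], minlength=n_sites)
        print("theoretical tag-formation probs:", np.round(theory, 3))
        print("simulated tag frequencies:      ", np.round(observed / n_copies, 3))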

  5. Genome-wide association study of swine farrowing traits. Part II: Bayesian analysis of marker data.

    Science.gov (United States)

    Schneider, J F; Rempel, L A; Snelling, W M; Wiedmann, R T; Nonneman, D J; Rohrer, G A

    2012-10-01

    Reproductive efficiency has a great impact on the economic success of pork (Sus scrofa) production. Number born alive (NBA) and average piglet birth weight (ABW) contribute greatly to reproductive efficiency. To better understand the underlying genetics of birth traits, a genome-wide association study (GWAS) was undertaken. Samples of DNA were collected from 1,152 first-parity gilts and tested using the Illumina PorcineSNP60 BeadChip. Traits included total number born (TNB), NBA, number born dead (NBD), number stillborn (NSB), number of mummies (MUM), total litter birth weight (LBW), and ABW. A total of 41,151 SNP were tested using a Bayesian approach. Beginning with the first 5 SNP on SSC1 and ending with the last 5 SNP on SSCX, SNP were assigned to groups of 5 consecutive SNP in chromosome-position order and analyzed again using a Bayesian approach. From that analysis, 5-SNP groups were selected that had no overlap with other 5-SNP groups and no overlap across chromosomes. These selected non-overlapping 5-SNP groups were defined as QTL. Of the available 8,814 QTL, 124 were found to be statistically significant (P ABW, 9 on SSC1, 3 on SSC2, 9 on SSC5, 5 on SSC6, 1 on SSC7, 2 on SSC8, 2 on SSC9, 3 on SSC10, 1 on SSC11, 3 on SSC12, 2 on SSC13, 8 on SSC14, 8 on SSC15, 1 on SSC17, and 8 on SSC18. Several candidate genes have been identified that overlap QTL locations among TNB, NBA, NBD, and ABW. These QTL, when combined with information on genes found in the same regions, should provide useful information that could be used for marker-assisted selection, marker-assisted management, or genomic selection applications in commercial pig populations.

  6. Contingency Plan Analysis with Bayesian Networks

    Institute of Scientific and Technical Information of China (English)

    徐立

    2012-01-01

    Uncertainty frequently arises in the evaluation, analysis, and execution of contingency plans. The Critical Path Method (CPM), the traditional tool for contingency plan preparation, cannot handle uncertainty. Bayesian networks, whose ability to analyze uncertain problems has led to their wide use in decision-support applications, are novel in the context of contingency plan evaluation. This paper presents a contingency plan prepared with the traditional Critical Path Method and analyzes it with Bayesian networks.

  7. New formulae for estimating age-at-death in the Balkans utilizing Lamendin's dental technique and Bayesian analysis.

    Science.gov (United States)

    Prince, Debra A; Konigsberg, Lyle W

    2008-05-01

    The present study analyzed apical translucency and periodontal recession on single-rooted teeth in order to generate age-at-death estimations using two inverse calibration methods and one Bayesian method. The three age estimates were compared to highlight inherent problems with the inverse calibration methods. The results showed that the Bayesian analysis reduced the severity of several problems associated with adult skeletal age-at-death estimations. The Bayesian estimates produced a lower overall mean error, a higher correlation with actual age, reduced aging bias, reduced age mimicry, and reduced the age ranges associated with the most probable age as compared to the inverse calibration methods for this sample. This research concluded that periodontal recession cannot be used as a univariate age indicator, due to its low correlation with chronological age. Apical translucency yielded a high correlation with chronological age and was concluded to be an important age indicator. The Bayesian approach offered the most appropriate statistical analysis for the estimation of age-at-death with the current sample.

  8. Bayesian wavelet-based image deconvolution: a GEM algorithm exploiting a class of heavy-tailed priors.

    Science.gov (United States)

    Bioucas-Dias, José M

    2006-04-01

    Image deconvolution is formulated in the wavelet domain under the Bayesian framework. The well-known sparsity of the wavelet coefficients of real-world images is modeled by heavy-tailed priors belonging to the Gaussian scale mixture (GSM) class; i.e., priors given by a linear (finite or infinite) combination of Gaussian densities. This class includes, among others, the generalized Gaussian, the Jeffreys, and the Gaussian mixture priors. Necessary and sufficient conditions are stated under which the prior induced by a thresholding/shrinking denoising rule is a GSM. This result is then used to show that the prior induced by the "nonnegative garrote" thresholding/shrinking rule, herein termed the garrote prior, is a GSM. To compute the maximum a posteriori estimate, we propose a new generalized expectation maximization (GEM) algorithm, where the missing variables are the scale factors of the GSM densities. The maximization step of the underlying expectation maximization algorithm is replaced with a linear stationary second-order iterative method. The result is a GEM algorithm of O(N log N) computational complexity. In a series of benchmark tests, the proposed approach outperforms or performs similarly to state-of-the-art methods, demanding comparable (in some cases, much less) computational complexity.
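
    The "nonnegative garrote" rule named above shrinks a coefficient x to x(1 - t^2/x^2) when |x| > t and to zero otherwise. A short sketch applying it to synthetic sparse coefficients (the signal and threshold are illustrative; this shows the shrinkage rule only, not the paper's GEM deconvolution algorithm):

        import numpy as np

        rng = np.random.default_rng(3)

        def garrote(x, t):
            """Nonnegative garrote: x -> x * max(0, 1 - t^2 / x^2)."""
            x = np.asarray(x, dtype=float)
            out = np.zeros_like(x)
            keep = np.abs(x) > t
            out[keep] = x[keep] - t ** 2 / x[keep]
            return out

        def soft(x, t):
            return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

        # Synthetic sparse 'wavelet coefficients' observed in Gaussian noise.
        coef = np.zeros(1000)
        coef[:50] = rng.normal(0.0, 5.0, 50)
        noisy = coef + rng.normal(0.0, 1.0, coef.size)

        t = 2.0  # threshold, e.g. a multiple of the noise sigma
        for name, est in (("garrote", garrote(noisy, t)), ("soft   ", soft(noisy, t))):
            print(name, "MSE:", round(float(np.mean((est - coef) ** 2)), 4))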

  9. Digital Images Analysis

    OpenAIRE

    2012-01-01

    A specific field of image processing focuses on the evaluation of image quality and the assessment of image authenticity. A loss of image quality may be due to the various processes through which the image passes. In assessing the authenticity of an image, we detect forgeries, hidden messages, etc. In this work, we present an overview of these areas; they have in common the need to develop theories and techniques to detect changes in the image that it is not detect...

  10. Image Analysis in CT Angiography

    NARCIS (Netherlands)

    Manniesing, R.

    2006-01-01

    In this thesis we develop and validate novel image processing techniques for the analysis of vascular structures in medical images. First a new type of filter is proposed which is capable of enhancing vascular structures while suppressing noise in the remainder of the image. This filter is based on

  11. Bayesian inference on multiscale models for poisson intensity estimation: applications to photon-limited image denoising.

    Science.gov (United States)

    Lefkimmiatis, Stamatios; Maragos, Petros; Papandreou, George

    2009-08-01

    We present an improved statistical model for analyzing Poisson processes, with applications to photon-limited imaging. We build on previous work, adopting a multiscale representation of the Poisson process in which the ratios of the underlying Poisson intensities (rates) in adjacent scales are modeled as mixtures of conjugate parametric distributions. Our main contributions include: 1) a rigorous and robust regularized expectation-maximization (EM) algorithm for maximum-likelihood estimation of the rate-ratio density parameters directly from the noisy observed Poisson data (counts); 2) extension of the method to work under a multiscale hidden Markov tree model (HMT) which couples the mixture label assignments in consecutive scales, thus modeling interscale coefficient dependencies in the vicinity of image edges; 3) exploration of a 2-D recursive quad-tree image representation, involving Dirichlet-mixture rate-ratio densities, instead of the conventional separable binary-tree image representation involving beta-mixture rate-ratio densities; and 4) a novel multiscale image representation, which we term Poisson-Haar decomposition, that better models the image edge structure, thus yielding improved performance. Experimental results on standard images with artificially simulated Poisson noise and on real photon-limited images demonstrate the effectiveness of the proposed techniques.
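
    The multiscale representation at the heart of this family of methods merges adjacent counts scale by scale; given a parent count, the left child is binomial with success probability equal to the child-to-parent intensity ratio, which is the quantity given a mixture prior. A sketch of the decomposition on a synthetic 1-D photon-count signal (binary-tree version; sizes and rates are illustrative):

        import numpy as np

        rng = np.random.default_rng(4)

        # Synthetic photon-limited signal: Poisson counts around a smooth intensity.
        intensity = 5.0 + 4.0 * np.sin(np.linspace(0.0, 3.0 * np.pi, 64)) ** 2
        counts = rng.poisson(intensity)

        # Binary-tree (Haar-like) pyramid: at each scale, merge adjacent counts.
        pyramid, ratios = [counts], []
        x = counts
        while x.size > 1:
            parent = x[0::2] + x[1::2]
            # Empirical rate ratio of the left child to its parent; the model
            # treats the underlying ratio as a mixture-distributed parameter.
            rho = np.divide(x[0::2], parent, out=np.full(parent.shape, 0.5),
                            where=parent > 0)
            pyramid.append(parent)
            ratios.append(rho)
            x = parent

        print("total count (coarsest scale):", pyramid[-1])
        print("finest-scale rate ratios:", np.round(ratios[0][:8], 3))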

  12. Reference image selection for difference imaging analysis

    CERN Document Server

    Huckvale, Leo; Sale, Stuart E

    2014-01-01

    Difference image analysis (DIA) is an effective technique for obtaining photometry in crowded fields, relative to a chosen reference image. As yet, however, optimal reference image selection is an unsolved problem. We examine how this selection depends on the combination of seeing, background and detector pixel size. Our tests use a combination of simulated data and quality indicators from DIA of well-sampled optical data and under-sampled near-infrared data from the OGLE and VVV surveys, respectively. We search for a figure-of-merit (FoM) which could be used to select reference images for each survey. While we do not find a universally applicable FoM, survey-specific measures indicate that the effect of spatial under-sampling may require a change in strategy from the standard DIA approach, even though seeing remains the primary criterion. We find that background is not an important criterion for reference selection, at least for the dynamic range in the images we test. For our analysis of VVV data in particu...

  18. Bayesian analysis of two stellar populations in Galactic globular clusters - III. Analysis of 30 clusters

    Science.gov (United States)

    Wagner-Kaiser, R.; Stenning, D. C.; Sarajedini, A.; von Hippel, T.; van Dyk, D. A.; Robinson, E.; Stein, N.; Jefferys, W. H.

    2016-12-01

    We use Cycle 21 Hubble Space Telescope (HST) observations and HST archival ACS Treasury observations of 30 Galactic globular clusters to characterize two distinct stellar populations. A sophisticated Bayesian technique is employed to simultaneously sample the joint posterior distribution of age, distance, and extinction for each cluster, as well as unique helium values for two populations within each cluster and the relative proportion of those populations. We find the helium differences among the two populations in the clusters fall in the range of ˜0.04 to 0.11. Because adequate models varying in carbon, nitrogen, and oxygen are not presently available, we view these spreads as upper limits and present them with statistical rather than observational uncertainties. Evidence supports previous studies suggesting an increase in helium content concurrent with increasing mass of the cluster and we also find that the proportion of the first population of stars increases with mass as well. Our results are examined in the context of proposed globular cluster formation scenarios. Additionally, we leverage our Bayesian technique to shed light on the inconsistencies between the theoretical models and the observed data.

  14. Bayesian reliability analysis for non-periodic inspection with estimation of uncertain parameters; Bayesian shinraisei kaiseki wo tekiyoshita hiteiki kozo kensa ni kansuru kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    Itagaki, H. [Yokohama National University, Yokohama (Japan). Faculty of Engineering; Asada, H.; Ito, S. [National Aerospace Laboratory, Tokyo (Japan); Shinozuka, M.

    1996-12-31

    Structural positions at risk in the pressurized fuselage of a transport-type aircraft designed to damage-tolerance criteria are taken as the subject of discussion. A small number of data obtained from inspections of these positions was used to discuss a Bayesian reliability analysis that can estimate a proper non-periodic inspection schedule, while also estimating proper values for uncertain factors. As a result, the time period over which fatigue cracks initiate was determined according to the procedure of detailed visual inspections. The analysis method was found capable of estimating values that are thought reasonable, and a proper inspection schedule using these values, in spite of the fatigue crack growth expression being kept in a very simple form and both factors being treated as uncertain. Thus, the effectiveness of the present analysis method was verified. The study also discusses, from different viewpoints, the structural positions, the modeling of fatigue cracks that initiate and grow at those positions, the conditions for failure, the damage factors, and the capability of the inspection. This reliability analysis method is thought to be effective for other structures as well, such as offshore structures. 18 refs., 8 figs., 1 tab.

  15. Application of Bayesian and cost benefit risk analysis in water resources management

    Science.gov (United States)

    Varouchakis, E. A.; Palogos, I.; Karatzas, G. P.

    2016-03-01

    Decision making is a significant tool in water resources management applications. This technical note approaches a decision dilemma that has not yet been considered for the water resources management of a watershed. A common cost-benefit analysis approach, which is novel in the risk analysis of hydrologic/hydraulic applications, and a Bayesian decision analysis are applied to aid the decision making on whether or not to construct a water reservoir for irrigation purposes. The alternative option examined applies a scaled parabolic fine that varies with over-pumping violations, in contrast to common practices that usually consider short-term fines. The methodological steps are presented analytically, together with originally developed code; an application in this level of detail is new. The results indicate that the uncertainty in the probabilities is the driving issue that determines the optimal decision with each methodology, and depending on how the unknown probabilities are handled, each methodology may lead to a different optimal decision. Thus, the proposed tool can help decision makers to examine and compare different scenarios using two different approaches before making a decision, considering the cost of a hydrologic/hydraulic project and the varied economic charges that water table limit violations can cause inside an audit interval. In contrast to practices that assess the effect of each proposed action separately, considering only current knowledge of the examined issue, this tool aids decision making by considering prior information and the sampling distribution of future successful audits.

  16. A Bayesian cluster analysis method for single-molecule localization microscopy data.

    Science.gov (United States)

    Griffié, Juliette; Shannon, Michael; Bromley, Claire L; Boelen, Lies; Burn, Garth L; Williamson, David J; Heard, Nicholas A; Cope, Andrew P; Owen, Dylan M; Rubin-Delanchy, Patrick

    2016-12-01

    Cell function is regulated by the spatiotemporal organization of the signaling machinery, and a key facet of this is molecular clustering. Here, we present a protocol for the analysis of clustering in data generated by 2D single-molecule localization microscopy (SMLM)-for example, photoactivated localization microscopy (PALM) or stochastic optical reconstruction microscopy (STORM). Three features of such data can cause standard cluster analysis approaches to be ineffective: (i) the data take the form of a list of points rather than a pixel array; (ii) there is a non-negligible unclustered background density of points that must be accounted for; and (iii) each localization has an associated uncertainty in regard to its position. These issues are overcome using a Bayesian, model-based approach. Many possible cluster configurations are proposed and scored against a generative model, which assumes Gaussian clusters overlaid on a completely spatially random (CSR) background, before every point is scrambled by its localization precision. We present the process of generating simulated and experimental data that are suitable to our algorithm, the analysis itself, and the extraction and interpretation of key cluster descriptors such as the number of clusters, cluster radii and the number of localizations per cluster. Variations in these descriptors can be interpreted as arising from changes in the organization of the cellular nanoarchitecture. The protocol requires no specific programming ability, and the processing time for one data set, typically containing 30 regions of interest, is ∼18 h; user input takes ∼1 h.
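
    A sketch of the generative model used for simulation - Gaussian clusters overlaid on a completely spatially random background, with every point then scrambled by its localization precision. All sizes, counts, and precisions below are illustrative placeholders:

        import numpy as np

        rng = np.random.default_rng(5)

        roi = 3000.0            # nm, side of a square region of interest
        n_clusters = 5
        pts_per_cluster = 50
        cluster_sd = 30.0       # nm, Gaussian cluster spread
        n_background = 200      # CSR background points

        centres = rng.uniform(0.0, roi, size=(n_clusters, 2))
        clustered = np.concatenate(
            [c + cluster_sd * rng.standard_normal((pts_per_cluster, 2))
             for c in centres])
        background = rng.uniform(0.0, roi, size=(n_background, 2))
        points = np.vstack([clustered, background])

        # Each localization carries a positional uncertainty; scramble by it.
        precision = rng.gamma(shape=4.0, scale=5.0, size=points.shape[0])   # nm
        observed = points + precision[:, None] * rng.standard_normal(points.shape)

        print(observed.shape[0], "localizations; mean precision %.1f nm"
              % precision.mean())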

  17. A Bayesian Semi-parametric Approach for the Differential Analysis of Sequence Counts Data.

    Science.gov (United States)

    Guindani, Michele; Sepúlveda, Nuno; Paulino, Carlos Daniel; Müller, Peter

    2014-04-01

    Data obtained using modern sequencing technologies are often summarized by recording the frequencies of observed sequences. Examples include the analysis of T cell counts in immunological research and studies of gene expression based on counts of RNA fragments. In both cases the items being counted are sequences, of proteins and base pairs, respectively. The resulting sequence-abundance distribution is usually characterized by overdispersion. We propose a Bayesian semi-parametric approach to implement inference for such data. Besides modeling the overdispersion, the approach also takes into account two related sources of bias that are usually associated with sequence counts data: some sequence types may not be recorded during the experiment, and the total count may differ from one experiment to another. We illustrate our methodology with two data sets, one regarding the analysis of CD4+ T cell counts in healthy and diabetic mice and another data set concerning the comparison of mRNA fragments recorded in a Serial Analysis of Gene Expression (SAGE) experiment with gastrointestinal tissue of healthy and cancer patients.

  18. Bayesian meta-analysis of diagnostic tests allowing for imperfect reference standards.

    Science.gov (United States)

    Menten, J; Boelaert, M; Lesaffre, E

    2013-12-30

    There is an increasing interest in meta-analyses of rapid diagnostic tests (RDTs) for infectious diseases. To avoid spectrum bias, these meta-analyses should focus on phase IV studies performed in the target population. For many infectious diseases, these target populations attend primary health care centers in resource-constrained settings where it is difficult to perform gold standard diagnostic tests. As a consequence, phase IV diagnostic studies often use imperfect reference standards, which may result in biased meta-analyses of the diagnostic accuracy of novel RDTs. We extend the standard bivariate model for the meta-analysis of diagnostic studies to correct for differing and imperfect reference standards in the primary studies and to accommodate data from studies that try to overcome the absence of a true gold standard through the use of latent class analysis. Using Bayesian methods, improved estimates of sensitivity and specificity are possible, especially when prior information is available on the diagnostic accuracy of the reference test. In this analysis, the deviance information criterion can be used to detect conflicts between the prior information and observed data. When applying the model to a dataset of the diagnostic accuracy of an RDT for visceral leishmaniasis, the standard meta-analytic methods appeared to underestimate the specificity of the RDT.

  19. Bayesian statistical modeling of spatially correlated error structure in atmospheric tracer inverse analysis

    Directory of Open Access Journals (Sweden)

    C. Mukherjee

    2011-01-01

    Full Text Available Inverse modeling applications in atmospheric chemistry are increasingly addressing the challenging statistical issues of data synthesis by adopting refined statistical analysis methods. This paper advances this line of research by addressing several central questions in inverse modeling, focusing specifically on Bayesian statistical computation. Motivated by problems of refining bottom-up estimates of source/sink fluxes of trace gas and aerosols based on increasingly high-resolution satellite retrievals of atmospheric chemical concentrations, we address head-on the need for integrating formal spatial statistical methods of residual error structure in global scale inversion models. We do this using analytically and computationally tractable spatial statistical models, known as conditional autoregressive spatial models, as components of a global inversion framework. We develop Markov chain Monte Carlo methods to explore and fit these spatial structures in an overall statistical framework that simultaneously estimates source fluxes. Additional aspects of the study extend the statistical framework to utilize priors in a more physically realistic manner, and to formally address and deal with missing data in satellite retrievals. We demonstrate the analysis in the context of inferring carbon monoxide (CO) sources constrained by satellite retrievals of column CO from the Measurement of Pollution in the Troposphere (MOPITT) instrument on the TERRA satellite, paying special attention to evaluating performance of the inverse approach using various statistical diagnostic metrics. This is developed using synthetic data generated to resemble MOPITT data to define a proof-of-concept and model assessment, and then in analysis of real MOPITT data.
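
    The conditional autoregressive (CAR) structure referred to above can be summarized by its precision matrix Q = tau(D - alpha W), where W is the spatial adjacency matrix and D its diagonal of row sums. A small sketch that builds Q for a lattice and draws one spatially correlated residual field (grid size and hyperparameters are illustrative, not the paper's settings):

        import numpy as np

        def car_precision(nx, ny, alpha=0.95, tau=1.0):
            """Precision Q = tau * (D - alpha * W) for a proper CAR prior on an
            nx-by-ny lattice with 4-neighbour (rook) adjacency."""
            n = nx * ny
            W = np.zeros((n, n))
            for i in range(nx):
                for j in range(ny):
                    k = i * ny + j
                    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        ii, jj = i + di, j + dj
                        if 0 <= ii < nx and 0 <= jj < ny:
                            W[k, ii * ny + jj] = 1.0
            D = np.diag(W.sum(axis=1))
            return tau * (D - alpha * W)

        rng = np.random.default_rng(6)
        Q = car_precision(8, 8)
        # Draw x ~ N(0, Q^{-1}) via the Cholesky factor of Q: x = L^{-T} z.
        L = np.linalg.cholesky(Q)
        field = np.linalg.solve(L.T, rng.standard_normal(Q.shape[0])).reshape(8, 8)
        print("Q shape:", Q.shape, "| field std:", round(float(field.std()), 3))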

  20. Estimation of protein folding free energy barriers from calorimetric data by multi-model Bayesian analysis.

    Science.gov (United States)

    Naganathan, Athi N; Perez-Jimenez, Raul; Muñoz, Victor; Sanchez-Ruiz, Jose M

    2011-10-14

    The realization that folding free energy barriers can be small enough to result in significant population of the species at the barrier top has spawned several methods for estimating folding barriers from equilibrium experiments. Some of these approaches are based on fitting the experimental thermogram measured by differential scanning calorimetry (DSC) to a one-dimensional representation of the folding free-energy surface (FES). Different physical models have been used to represent the FES: (1) a Landau quartic polynomial as a function of the total enthalpy, which acts as an order parameter; (2) the projection onto a structural order parameter (i.e. number of native residues or native contacts) of the free energy of all the conformations generated by Ising-like statistical mechanical models; and (3) mean-field models that define conformational entropy and stabilization energy as functions of a continuous local order parameter. The fundamental question that emerges is how we can obtain robust, model-independent estimates of the thermodynamic folding barrier from the analysis of DSC experiments. Here we address this issue by comparing the performance of various FES models in interpreting the thermogram of a protein with a marginal folding barrier. We chose the small α-helical protein PDD, which folds-unfolds in microseconds crossing a free energy barrier previously estimated as ~1 RT. The fits of the PDD thermogram to the various models and assumptions produce FES with a consistently small free energy barrier separating the folded and unfolded ensembles. However, the fits vary in quality as well as in the estimated barrier. Applying Bayesian probabilistic analysis we rank the fit performance using a statistically rigorous criterion that leads to a global estimate of the folding barrier and its precision, which for PDD is 1.3 ± 0.4 kJ mol⁻¹. This result confirms that PDD folds over a minor barrier consistent with the downhill folding regime. We have further

  1. ANALYSIS OF FUNDUS IMAGES

    DEFF Research Database (Denmark)

    2000-01-01

    A method of classifying objects in an image as respective arterial or venous vessels, comprising: identifying pixels of the said modified image which are located on a line object, determining which of the said image points is associated with a crossing point or a bifurcation of the respective line object, wherein a crossing point is represented by an image point which is the intersection of four line segments, performing a matching operation on pairs of said line segments for each said crossing point, to determine the path of blood vessels in the image, thereby classifying the line objects in the original image into two arbitrary sets, and thereafter designating one of the sets as representing venous structure, the other of the sets as representing arterial structure, depending on one or more of the following criteria: (a) complexity of structure; (b) average density; (c) average width; (d) tortuosity...

  2. A BAYESIAN HIERARCHICAL SPATIAL POINT PROCESS MODEL FOR MULTI-TYPE NEUROIMAGING META-ANALYSIS.

    Science.gov (United States)

    Kang, Jian; Nichols, Thomas E; Wager, Tor D; Johnson, Timothy D

    2014-09-01

    Neuroimaging meta-analysis is an important tool for finding consistent effects over studies that each usually have 20 or fewer subjects. Interest in meta-analysis in brain mapping is also driven by a recent focus on so-called "reverse inference": whereas traditional "forward inference" identifies the regions of the brain involved in a task, a reverse inference identifies the cognitive processes that a task engages. Such reverse inferences, however, require a set of meta-analyses, one for each possible cognitive domain. However, existing methods for neuroimaging meta-analysis have significant limitations. Commonly used methods for neuroimaging meta-analysis are not model-based, do not provide interpretable parameter estimates, and only produce null hypothesis inferences; further, they are generally designed for a single group of studies and cannot produce reverse inferences. In this work we address these limitations by adopting a non-parametric Bayesian approach for meta-analysis of data from multiple classes or types of studies. In particular, foci from each type of study are modeled as a cluster process driven by a random intensity function that is modeled as a kernel convolution of a gamma random field. The type-specific gamma random fields are linked and modeled as a realization of a common gamma random field, shared by all types, that induces correlation between study types and mimics the behavior of a univariate mixed effects model. We illustrate our model on simulation studies and a meta-analysis of five emotions from 219 studies, and check model fit by a posterior predictive assessment. In addition, we implement reverse inference by using the model to predict study type from a newly presented study. We evaluate this predictive performance via leave-one-out cross-validation that is efficiently implemented using importance sampling techniques.

  3. Probabilistic Delamination Diagnosis of Composite Materials Using a Novel Bayesian Imaging Method

    Data.gov (United States)

    National Aeronautics and Space Administration — In this paper, a probabilistic delamination location and size detection framework is proposed. The delamination probability image using Lamb wave-based damage...

  4. Bayesian Analysis Made Simple An Excel GUI for WinBUGS

    CERN Document Server

    Woodward, Philip

    2011-01-01

    From simple NLMs to complex GLMMs, this book describes how to use the GUI for WinBUGS - BugsXLA - an Excel add-in written by the author that allows a range of Bayesian models to be easily specified. With case studies throughout, the text shows how to routinely apply even the more complex aspects of model specification, such as GLMMs, outlier robust models, random effects Emax models, auto-regressive errors, and Bayesian variable selection. It provides brief, up-to-date discussions of current issues in the practical application of Bayesian methods. The author also explains how to obtain free so

  5. Introduction to Medical Image Analysis

    DEFF Research Database (Denmark)

    Paulsen, Rasmus Reinhold; Moeslund, Thomas B.

    2011-01-01

    of the book is to present the fascinating world of medical image analysis in an easy and interesting way. Compared to many standard books on image analysis, the approach we have chosen is less mathematical and more casual. Some of the key algorithms are exemplified in C-code. Please note that the code...

  6. Introduction to Medical Image Analysis

    DEFF Research Database (Denmark)

    Paulsen, Rasmus Reinhold; Moeslund, Thomas B.

    of the book is to present the fascinating world of medical image analysis in an easy and interesting way. Compared to many standard books on image analysis, the approach we have chosen is less mathematical and more casual. Some of the key algorithms are exemplified in C-code. Please note that the code...

  7. Unmixing of Hyperspectral Images using Bayesian Non-negative Matrix Factorization with Volume Prior

    DEFF Research Database (Denmark)

    Arngren, Morten; Schmidt, Mikkel Nørgaard; Larsen, Jan

    2011-01-01

    Hyperspectral imaging can be used in assessing the quality of foods by decomposing the image into constituents such as protein, starch, and water. Observed data can be considered a mixture of underlying characteristic spectra (endmembers), and estimating the constituents and their abundances requ... perform as well as or better than existing volume-constrained methods. Further, our method gives credible intervals for the endmembers and abundances, which allows us to assess the confidence of the results.

  8. Oncological image analysis: medical and molecular image analysis

    Science.gov (United States)

    Brady, Michael

    2007-03-01

    This paper summarises the work we have been doing on joint projects with GE Healthcare on colorectal and liver cancer, and with Siemens Molecular Imaging on dynamic PET. First, we recall the salient facts about cancer and oncological image analysis. Then we introduce some of the work that we have done on analysing clinical MRI images of colorectal and liver cancer, specifically the detection of lymph nodes and segmentation of the circumferential resection margin. In the second part of the paper, we shift attention to the complementary aspect of molecular image analysis, illustrating our approach with some recent work on: tumour acidosis, tumour hypoxia, and multiply drug resistant tumours.

  9. Bayesian inference analysis of the uncertainty linked to the evaluation of potential flood damage in urban areas.

    Science.gov (United States)

    Fontanazza, C M; Freni, G; Notaro, V

    2012-01-01

    Flood damage in urbanized watersheds may be assessed by combining flood depth-damage curves and the outputs of urban flood models. The complexity of the physical processes that must be simulated and the limited amount of data available for model calibration may lead to high uncertainty in the model results and consequently in the damage estimation. Moreover, depth-damage functions are usually affected by significant uncertainty related to the collected data and to the simplified structure of the regression law that is used. The present paper analyses the uncertainty connected to flood damage estimates obtained by combining hydraulic models and depth-damage curves. A Bayesian inference analysis is proposed, along with a probabilistic approach for parameter estimation. The analysis demonstrated that the Bayesian approach is very effective, considering that the available databases are usually short.

  10. Analysis of traffic accidents on rural highways using Latent Class Clustering and Bayesian Networks.

    Science.gov (United States)

    de Oña, Juan; López, Griselda; Mujalli, Randa; Calvo, Francisco J

    2013-03-01

    One of the principal objectives of traffic accident analyses is to identify the key factors that affect the severity of an accident. However, heterogeneity in the raw data makes the analysis of traffic accidents difficult. In this paper, Latent Class Cluster (LCC) is used as a preliminary tool for segmentation of 3229 accidents on rural highways in Granada (Spain) between 2005 and 2008. Next, Bayesian Networks (BNs) are used to identify the main factors involved in accident severity, both for the entire database (EDB) and for the clusters previously obtained by LCC. The results of these cluster-based analyses are compared with the results of a full-data analysis. The results show that the combined use of both techniques is very interesting, as it reveals further information that would not have been obtained without prior segmentation of the data. BN inference is used to obtain the variables that best identify accidents involving killed or seriously injured people. Accident type and sight distance were identified in all the cases analysed; other variables such as time, occupant involved, or age were identified in the EDB and in only one cluster; whereas the variables vehicles involved, number of injuries, atmospheric factors, pavement markings, and pavement width were identified in only one cluster. Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. Bayesian analysis of the dynamic cosmic web in the SDSS galaxy survey

    CERN Document Server

    Leclercq, Florent; Wandelt, Benjamin

    2015-01-01

    Recent application of the Bayesian algorithm BORG to the Sloan Digital Sky Survey (SDSS) main sample galaxies resulted in the physical inference of the formation history of the observed large-scale structure from its origin to the present epoch. In this work, we use these inferences as inputs for a detailed probabilistic cosmic web-type analysis. To do so, we generate a large set of data-constrained realizations of the large-scale structure using a fast, fully non-linear gravitational model. We then perform a dynamic classification of the cosmic web into four distinct components (voids, sheets, filaments and clusters) on the basis of the tidal field. Our inference framework automatically and self-consistently propagates typical observational uncertainties to web-type classification. As a result, this study produces highly detailed and accurate cosmographic classification of large-scale structure elements in the SDSS volume. By also providing the history of these structure maps, the approach allows an analysis...

  12. Computational model of an infant brain subjected to periodic motion: simplified modelling and Bayesian sensitivity analysis.

    Science.gov (United States)

    Batterbee, D C; Sims, N D; Becker, W; Worden, K; Rowson, J

    2011-11-01

    Non-accidental head injury in infants, or shaken baby syndrome, is a highly controversial and disputed topic. Biomechanical studies often suggest that shaking alone cannot cause the classical symptoms, yet many medical experts believe the contrary. Researchers have turned to finite element modelling for a more detailed understanding of the interactions between the brain, skull, cerebrospinal fluid (CSF), and surrounding tissues. However, the uncertainties in such models are significant; these can arise from theoretical approximations, lack of information, and inherent variability. Consequently, this study presents an uncertainty analysis of a finite element model of a human head subject to shaking. Although the model geometry was greatly simplified, fluid-structure-interaction techniques were used to model the brain, skull, and CSF using a Eulerian mesh formulation with penalty-based coupling. Uncertainty and sensitivity measurements were obtained using Bayesian sensitivity analysis, which is a technique that is relatively new to the engineering community. Uncertainty in nine different model parameters was investigated for two different shaking excitations: sinusoidal translation only, and sinusoidal translation plus rotation about the base of the head. The level and type of sensitivity in the results were found to be highly dependent on the excitation type.

  13. Hyperspectral image analysis. A tutorial

    DEFF Research Database (Denmark)

    Amigo Rubio, Jose Manuel; Babamoradi, Hamid; Elcoroaristizabal Martin, Saioa

    2015-01-01

    This tutorial aims at providing guidelines and practical tools to assist with the analysis of hyperspectral images. Topics like hyperspectral image acquisition, image pre-processing, multivariate exploratory analysis, hyperspectral image resolution, classification and final digital image processing will be exposed, and some guidelines given and discussed. Due to the broad character of current applications and the vast number of multivariate methods available, this paper has focused on an industrial chemical framework to explain, in a step-wise manner, how to develop a classification methodology to differentiate between several types of plastics by using Near infrared hyperspectral imaging and Partial Least Squares - Discriminant Analysis. Thus, the reader is guided through every single step and oriented in order to adapt those strategies to the user's case.

  14. Bayesian Analysis of Two Stellar Populations in Galactic Globular Clusters III: Analysis of 30 Clusters

    CERN Document Server

    Wagner-Kaiser, R; Sarajedini, A; von Hippel, T; van Dyk, D A; Robinson, E; Stein, N; Jefferys, W H

    2016-01-01

    We use Cycle 21 Hubble Space Telescope (HST) observations and HST archival ACS Treasury observations of 30 Galactic Globular Clusters to characterize two distinct stellar populations. A sophisticated Bayesian technique is employed to simultaneously sample the joint posterior distribution of age, distance, and extinction for each cluster, as well as unique helium values for two populations within each cluster and the relative proportion of those populations. We find the helium differences among the two populations in the clusters fall in the range of ~0.04 to 0.11. Because adequate models varying in CNO are not presently available, we view these spreads as upper limits and present them with statistical rather than observational uncertainties. Evidence supports previous studies suggesting an increase in helium content concurrent with increasing mass of the cluster, and we also find that the proportion of the first population of stars increases with mass as well. Our results are examined in the context of proposed g...

  15. Hyperspectral image analysis. A tutorial

    Energy Technology Data Exchange (ETDEWEB)

    Amigo, José Manuel, E-mail: jmar@food.ku.dk [Spectroscopy and Chemometrics Group, Department of Food Sciences, Faculty of Science, University of Copenhagen, Rolighedsvej 30, Frederiksberg C DK–1958 (Denmark); Babamoradi, Hamid [Spectroscopy and Chemometrics Group, Department of Food Sciences, Faculty of Science, University of Copenhagen, Rolighedsvej 30, Frederiksberg C DK–1958 (Denmark); Elcoroaristizabal, Saioa [Spectroscopy and Chemometrics Group, Department of Food Sciences, Faculty of Science, University of Copenhagen, Rolighedsvej 30, Frederiksberg C DK–1958 (Denmark); Chemical and Environmental Engineering Department, School of Engineering, University of the Basque Country, Alameda de Urquijo s/n, E-48013 Bilbao (Spain)

    2015-10-08

    This tutorial aims at providing guidelines and practical tools to assist with the analysis of hyperspectral images. Topics like hyperspectral image acquisition, image pre-processing, multivariate exploratory analysis, hyperspectral image resolution, classification and final digital image processing will be exposed, and some guidelines given and discussed. Due to the broad character of current applications and the vast number of multivariate methods available, this paper has focused on an industrial chemical framework to explain, in a step-wise manner, how to develop a classification methodology to differentiate between several types of plastics by using Near infrared hyperspectral imaging and Partial Least Squares – Discriminant Analysis. Thus, the reader is guided through every single step and oriented in order to adapt those strategies to the user's case. - Highlights: • Comprehensive tutorial of Hyperspectral Image analysis. • Hierarchical discrimination of six classes of plastics containing flame retardant. • Step by step guidelines to perform class-modeling on hyperspectral images. • Fusion of multivariate data analysis and digital image processing methods. • Promising methodology for real-time detection of plastics containing flame retardant.

  16. Bayesian network analysis revealed the connectivity difference of the default mode network from the resting-state to task-state.

    Science.gov (United States)

    Wu, Xia; Yu, Xinyu; Yao, Li; Li, Rui

    2014-01-01

    Functional magnetic resonance imaging (fMRI) studies have converged to reveal the default mode network (DMN), a constellation of regions that display co-activation during resting-state but co-deactivation during attention-demanding tasks in the brain. Here, we employed a Bayesian network (BN) analysis method to construct a directed effective connectivity model of the DMN and compared the organizational architecture and interregional directed connections under both resting-state and task-state. The analysis results indicated that the DMN was consistently organized into two closely interacting subsystems in both resting-state and task-state. The directed connections between DMN regions, however, changed significantly from the resting-state to task-state condition. The results suggest that the DMN intrinsically maintains a relatively stable structure whether at rest or performing tasks but has different information processing mechanisms under varied states.

  17. The Resistible Rise of Bayesian Thinking in Management: Historical Lessons From Decision Analysis

    OpenAIRE

    Cabantous, L.; Gond, J-P.

    2015-01-01

    This paper draws from a case study of decision analysis—a discipline rooted in Bayesianism aimed at supporting managerial decision making—to inform the current discussion on the adoption of Bayesian modes of thinking in management research and practice. Relying on concepts from the science, technology, and society field of study and actor-network theory, we approach the production of scientific knowledge as a cultural, practical, and material affair. Specifically, we analyze the activities de...

  18. Scalable Bayesian modeling, monitoring and analysis of dynamic network flow data

    OpenAIRE

    2016-01-01

    Traffic flow count data in networks arise in many applications, such as automobile or aviation transportation, certain directed social network contexts, and Internet studies. Using an example of Internet browser traffic flow through site-segments of an international news website, we present Bayesian analyses of two linked classes of models which, in tandem, allow fast, scalable and interpretable Bayesian inference. We first develop flexible state-space models for streaming count data, able to...

  19. A Fast C++ Implementation of Neural Network Backpropagation Training Algorithm: Application to Bayesian Optimal Image Demosaicing

    Directory of Open Access Journals (Sweden)

    Yi-Qing Wang

    2015-09-01

    Full Text Available Recent years have seen a surge of interest in multilayer neural networks fueled by their successful applications in numerous image processing and computer vision tasks. In this article, we describe a C++ implementation of stochastic gradient descent to train a multilayer neural network, where a fast and accurate acceleration of tanh(·) is achieved with linear interpolation. As an example of application, we present a neural network able to deliver state-of-the-art performance in image demosaicing.
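
    In Python terms, the interpolation trick amounts to a precomputed lookup table evaluated by linear interpolation (the grid bounds and resolution below are assumptions; the paper's implementation is in C++):

        import numpy as np

        LO, HI, N = -8.0, 8.0, 4096          # table range and resolution (assumed)
        GRID = np.linspace(LO, HI, N)
        TABLE = np.tanh(GRID)
        STEP = (HI - LO) / (N - 1)

        def fast_tanh(x):
            """tanh via table lookup plus linear interpolation."""
            x = np.clip(x, LO, HI - 1e-12)   # beyond the table, tanh saturates
            idx = ((x - LO) / STEP).astype(int)
            frac = (x - (LO + idx * STEP)) / STEP
            return TABLE[idx] * (1.0 - frac) + TABLE[idx + 1] * frac

        x = np.random.default_rng(7).uniform(-10.0, 10.0, 100000)
        print("max |error|:", np.max(np.abs(fast_tanh(x) - np.tanh(x))))  # ~1e-6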

  20. A comparison of Bayesian and Monte Carlo sensitivity analysis for unmeasured confounding.

    Science.gov (United States)

    McCandless, Lawrence C; Gustafson, Paul

    2017-04-06

    Bias from unmeasured confounding is a persistent concern in observational studies, and sensitivity analysis has been proposed as a solution. In recent years, probabilistic sensitivity analysis using either Monte Carlo sensitivity analysis (MCSA) or Bayesian sensitivity analysis (BSA) has emerged as a practical analytic strategy when there are multiple bias parameter inputs. BSA uses Bayes theorem to formally combine evidence from the prior distribution and the data. In contrast, MCSA samples bias parameters directly from the prior distribution. Intuitively, one would think that BSA and MCSA ought to give similar results. Both methods use similar models and the same (prior) probability distributions for the bias parameters. In this paper, we illustrate the surprising finding that BSA and MCSA can give very different results. Specifically, we demonstrate that MCSA can give inaccurate uncertainty assessments (e.g. 95% intervals) that do not reflect the data's influence on uncertainty about unmeasured confounding. Using a data example from epidemiology and simulation studies, we show that certain combinations of data and prior distributions can result in dramatic prior-to-posterior changes in uncertainty about the bias parameters. This occurs because the application of Bayes theorem in a non-identifiable model can sometimes rule out certain patterns of unmeasured confounding that are not compatible with the data. Consequently, the MCSA approach may give 95% intervals that are either too wide or too narrow and that do not have 95% frequentist coverage probability. Based on our findings, we recommend that analysts use BSA for probabilistic sensitivity analysis. Copyright © 2017 John Wiley & Sons, Ltd.
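
    A toy numerical contrast between the two strategies (the model, numbers, and constraint here are invented for illustration, not the paper's examples): the true rate p is observed only through p_obs = p + gamma, and the requirement 0 < p_obs < 1 lets the data rule out some bias values under BSA, while MCSA draws gamma from the prior untouched.

        import numpy as np

        rng = np.random.default_rng(8)

        y, n = 92, 100                       # observed events (deliberately high)
        n_draws = 200000
        gamma = rng.normal(0.0, 0.15, n_draws)        # prior on the bias parameter

        # MCSA: bias sampled from its prior, combined with sampling uncertainty;
        # the data never update gamma, and adjusted estimates can leave [0, 1].
        p_hat = rng.beta(y + 1, n - y + 1, n_draws)
        mcsa = p_hat - gamma

        # BSA: joint prior on (p, gamma), reweighted by the binomial likelihood
        # of p_obs = p + gamma; impossible combinations receive zero weight.
        p = rng.uniform(0.0, 1.0, n_draws)
        p_obs = p + gamma
        ok = (p_obs > 0.0) & (p_obs < 1.0)
        logw = np.full(n_draws, -np.inf)
        logw[ok] = y * np.log(p_obs[ok]) + (n - y) * np.log1p(-p_obs[ok])
        w = np.exp(logw - logw.max())
        w /= w.sum()

        bsa_mean = float(np.sum(w * p))
        bsa_sd = float(np.sqrt(np.sum(w * (p - bsa_mean) ** 2)))
        print("MCSA: mean %.3f, sd %.3f" % (mcsa.mean(), mcsa.std()))
        print("BSA : mean %.3f, sd %.3f" % (bsa_mean, bsa_sd))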

  1. CHARA/MIRC observations of two M supergiants in Perseus OB1: Temperature, bayesian modeling, and compressed sensing imaging

    Energy Technology Data Exchange (ETDEWEB)

    Baron, F.; Monnier, J. D.; Anderson, M.; Aarnio, A. [Department of Astronomy, University of Michigan, 918 Dennison Building, Ann Arbor, MI 48109-1090 (United States); Kiss, L. L. [Sydney Institute for Astrophysics, School of Physics, University of Sydney, NSW 2006 (Australia); Neilson, H. R. [Department of Physics and Astronomy, East Tennessee State University, Box 70652, Johnson City, TN 37614 (United States); Zhao, M. [Department of Astronomy and Astrophysics, Penn State University, University Park, PA 16802 (United States); Pedretti, E.; Thureau, N. [Department of Physics and Astronomy, University of St. Andrews (United Kingdom); Ten Brummelaar, T. A.; Sturmann, J.; Sturmann, L.; Turner, N. [The CHARA Array, Georgia State University, P.O. Box 3965, Atlanta, GA 30302-3965 (United States); Ridgway, S. T. [National Optical Astronomy Observatory, Tucson, AZ 85726-6732 (United States); McAlister, H. A., E-mail: baron@phy-astr.gsu.edu [CHARA and Department of Physics and Astronomy, Georgia State University, P. O. Box 4106, Atlanta, GA 30302-4106 (United States)

    2014-04-10

    Two red supergiants (RSGs) of the Per OB1 association, RS Per and T Per, have been observed in the H band using the Michigan Infra-Red Combiner (MIRC) instrument at the CHARA array. The data show clear evidence of a departure from circular symmetry. We present here new techniques specially developed to analyze such cases, based on state-of-the-art statistical frameworks. The stellar surfaces are first modeled as limb-darkened disks based on SATLAS models that fit both MIRC interferometric data and publicly available spectrophotometric data. Bayesian model selection is then used to determine the most probable number of spots. The effective surface temperatures are also determined and give further support to the recently derived hotter temperature scales of RSGs. The stellar surfaces are reconstructed by our model-independent imaging code SQUEEZE, making use of its novel regularizer based on Compressed Sensing theory. We find excellent agreement between the model-selection results and the reconstructions. Our results provide evidence for the presence of near-infrared spots representing about 3%-5% of the stellar flux.

  2. Stochastic geometry for image analysis

    CERN Document Server

    Descombes, Xavier

    2013-01-01

    This book develops the stochastic geometry framework for image analysis purposes. Two main frameworks are described: marked point process and random closed set models. We discuss the main issues in defining an appropriate model. The algorithms for sampling and optimizing the models, as well as for estimating parameters, are reviewed. Numerous applications, covering remote sensing images and biological and medical imaging, are detailed. This book provides all the necessary tools for developing an image analysis application based on modern stochastic modeling.

  3. Development of Hierarchical Bayesian Model Based on Regional Frequency Analysis and Its Application to Estimate Areal Rainfall in South Korea

    Science.gov (United States)

    Kim, J.; Kwon, H. H.

    2014-12-01

    Existing regional frequency analysis has the disadvantage that it is difficult to account for geographical characteristics when estimating areal rainfall. This study therefore develops a hierarchical Bayesian regional frequency analysis model in which spatial patterns of the design rainfall are explicitly linked to geographical information. The study assumes that the parameters of the Gumbel distribution are a function of geographical characteristics (e.g. altitude, latitude and longitude) within a general linear regression framework. Posterior distributions of the regression parameters are estimated by the Bayesian Markov chain Monte Carlo (MCMC) method, and the identified functional relationship is used to spatially interpolate the parameters of the Gumbel distribution using digital elevation models (DEM) as inputs. The proposed model is applied to derive design rainfalls over the entire Han-river watershed. The proposed Bayesian regional frequency analysis model gave results similar to L-moment-based regional frequency analysis, while offering the advantages of quantifying the uncertainty of the design rainfall and estimating areal rainfall with geographical information taken into account. Acknowledgement: This research was supported by a grant (14AWMP-B079364-01) from the Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
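
    As a rough illustration of the model's core (not the study's code or data), the sketch below ties the Gumbel location parameter to synthetic geographic covariates through a linear regression and samples all parameters with a random-walk Metropolis step; the full hierarchical treatment would add informative priors and spatial interpolation over a DEM.

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic sites: annual-maximum rainfall whose Gumbel location is a
        # linear function of scaled altitude/latitude/longitude (all invented).
        S, n = 20, 30
        X = np.column_stack([np.ones(S), rng.uniform(0, 1, (S, 3))])
        y = X @ np.array([50.0, 20.0, 5.0, -3.0]) + 10.0 * rng.gumbel(size=(n, S))

        def log_post(theta):
            b, beta = theta[:4], np.exp(theta[4])       # beta: Gumbel scale
            z = (y - X @ b) / beta
            loglik = np.sum(-np.log(beta) - z - np.exp(-z))  # Gumbel log-density
            return loglik - 0.5 * np.sum(b ** 2) / 100.0 ** 2  # weak priors

        theta = np.array([y.mean(), 0.0, 0.0, 0.0, np.log(y.std())])
        lp, draws = log_post(theta), []
        for _ in range(50_000):
            prop = theta + rng.normal(0.0, 0.05, 5)
            lp_prop = log_post(prop)
            if np.log(rng.random()) < lp_prop - lp:
                theta, lp = prop, lp_prop
            draws.append(theta[:4].copy())
        print("posterior mean coefficients:", np.mean(draws[10_000:], axis=0))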

  4. Paraxial ghost image analysis

    Science.gov (United States)

    Abd El-Maksoud, Rania H.; Sasian, José M.

    2009-08-01

    This paper develops a methodology to model ghost images that are formed by two reflections between the surfaces of a multi-element lens system in the paraxial regime. An algorithm is presented to generate the ghost layouts from the nominal layout. For each possible ghost layout, paraxial ray tracing is performed to determine the ghost Gaussian cardinal points, the size of the ghost image at the nominal image plane, the location and diameter of the ghost entrance and exit pupils, and the location and diameter for the ghost entrance and exit windows. The paraxial ghost irradiance point spread function is obtained by adding up the irradiance contributions for all ghosts. Ghost simulation results for a simple lens system are provided. This approach provides a quick way to analyze ghost images in the paraxial regime.

  5. Mobile real-time EEG imaging Bayesian inference with sparse, temporally smooth source priors

    DEFF Research Database (Denmark)

    Hansen, Lars Kai; Hansen, Sofie Therese; Stahlhut, Carsten

    2013-01-01

    EEG-based real-time imaging of human brain function has many potential applications, including quality control, in-line experimental design, brain state decoding, and neuro-feedback. In mobile applications these possibilities are attractive as elements in systems for personal state monitoring...

  6. A Bayesian approach for three-dimensional markerless tumor tracking using kV imaging during lung radiotherapy

    Science.gov (United States)

    Shieh, Chun-Chien; Caillet, Vincent; Dunbar, Michelle; Keall, Paul J.; Booth, Jeremy T.; Hardcastle, Nicholas; Haddad, Carol; Eade, Thomas; Feain, Ilana

    2017-04-01

    The ability to monitor tumor motion without implanted markers can potentially enable broad access to more accurate and precise lung radiotherapy. A major challenge is that kilovoltage (kV) imaging based methods are rarely able to continuously track the tumor due to the inferior tumor visibility on 2D kV images. Another challenge is the estimation of 3D tumor position based on only 2D imaging information. The aim of this work is to address both challenges by proposing, for the first time, a Bayesian approach for markerless tumor tracking. The proposed approach adopts the framework of the extended Kalman filter, which combines prediction and measurement steps to make the optimal tumor position update. For each imaging frame, the tumor position is first predicted by a respiratory-correlated model. The 2D tumor position on the kV image is then measured by template matching. Finally, the prediction and 2D measurement are combined based on the 3D distribution of tumor positions in the past 10 s and the estimated uncertainty of template matching. To investigate the clinical feasibility of the proposed method, a total of 13 lung cancer patient datasets were used for retrospective validation, including 11 cone-beam CT scan pairs and two stereotactic ablative body radiotherapy cases. The ground truths for tumor motion were generated from the 3D trajectories of implanted markers or beacons. The mean, standard deviation, and 95th percentile of the 3D tracking error were found to range from 1.6-2.9 mm, 0.6-1.5 mm, and 2.6-5.8 mm, respectively. Markerless tumor tracking always resulted in smaller errors compared to the standard of care. The improvement was most pronounced in the superior-inferior (SI) direction, with up to 9.5 mm reduction in the 95th-percentile SI error for patients with >10 mm 5th-to-95th percentile SI tumor motion. The percentage of errors with 3D magnitude <5 mm was 96.5% for markerless tumor tracking and 84.1% for the
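
    The update step can be sketched as a standard Kalman blend of prediction and measurement. The matrices and numbers below are invented for illustration; the paper's extended Kalman filter additionally builds its covariances from the last 10 s of 3D positions and the template-matching uncertainty.

        import numpy as np

        # One Kalman-style update: blend a predicted 3D tumor position with a
        # noisy 2D kV measurement through a projection matrix H.
        def kalman_update(x_pred, P, z, H, R):
            S = H @ P @ H.T + R                 # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)      # gain: trust data vs. prediction
            x = x_pred + K @ (z - H @ x_pred)   # updated 3D estimate
            return x, (np.eye(3) - K @ H) @ P

        x_pred = np.array([0.0, 1.0, 2.0])      # motion-model prediction (mm)
        P = np.diag([4.0, 4.0, 9.0])            # larger SI uncertainty (assumed)
        H = np.array([[1.0, 0.0, 0.0],          # toy projection onto the imager
                      [0.0, 0.0, 1.0]])
        z = np.array([0.5, 2.8])                # template-matching measurement
        x, P = kalman_update(x_pred, P, z, H, R=np.eye(2))
        print("updated 3D position:", x)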

  7. Applying Image Matching to Video Analysis

    Science.gov (United States)

    2010-09-01

    [Abstract not preserved in this record; only fragments of its reference list survive, citing Forensic Science International articles on a database of spent cartridge cases of firearms and on Bayesian analysis of fingerprint, face and signature evidence with automatic biometric systems, and the derivation of the Kanade-Lucas-Tomasi feature tracker.]

  8. Cyclist activity and injury risk analysis at signalized intersections: a Bayesian modelling approach.

    Science.gov (United States)

    Strauss, Jillian; Miranda-Moreno, Luis F; Morency, Patrick

    2013-10-01

    This study proposes a two-equation Bayesian modelling approach to simultaneously study cyclist injury occurrence and bicycle activity at signalized intersections as joint outcomes. This approach deals with the potential presence of endogeneity and unobserved heterogeneities and is used to identify factors associated with both cyclist injuries and volumes. Its application to identify high-risk corridors is also illustrated. The approach is applied in Montreal, Quebec, Canada, using an extensive inventory of a large sample of signalized intersections containing disaggregate motor-vehicle traffic volumes and bicycle flows, geometric design, traffic control and built environment characteristics in the vicinity of the intersections. Cyclist injury data for the period 2003-2008 are used in this study. Also, manual bicycle counts were standardized using temporal and weather adjustment factors to obtain average annual daily volumes. Results confirm and quantify the effects of both bicycle and motor-vehicle flows on cyclist injury occurrence. Accordingly, more cyclists at an intersection translate into more cyclist injuries but lower injury rates, owing to the non-linear association between bicycle volume and injury occurrence. Furthermore, the results emphasize the importance of turning motor-vehicle movements. The presence of bus stops and total crosswalk length increase cyclist injury occurrence, whereas the presence of a raised median has the opposite effect. Bicycle activity through intersections was found to increase as employment, number of metro stations, land use mix, area of commercial land use type, length of bicycle facilities and the presence of schools within 50-800 m of the intersection increase. Intersections with three approaches are expected to have fewer cyclists than those with four. Using Bayesian analysis, expected injury frequency and injury rates were estimated for each intersection and used to rank corridors. Corridors with high bicycle volumes

  9. No control genes required: Bayesian analysis of qRT-PCR data.

    Directory of Open Access Journals (Sweden)

    Mikhail V Matz

    Full Text Available BACKGROUND: Model-based analysis of data from quantitative reverse-transcription PCR (qRT-PCR) is potentially more powerful and versatile than traditional methods. Yet existing model-based approaches cannot properly deal with the higher sampling variances associated with low-abundant targets, nor do they provide a natural way to incorporate assumptions about the stability of control genes directly into the model-fitting process. RESULTS: In our method, raw qPCR data are represented as molecule counts, and described using generalized linear mixed models under Poisson-lognormal error. A Markov Chain Monte Carlo (MCMC) algorithm is used to sample from the joint posterior distribution over all model parameters, thereby estimating the effects of all experimental factors on the expression of every gene. The Poisson-based model allows for the correct specification of the mean-variance relationship of the PCR amplification process, and can also glean information from instances of no amplification (zero counts). Our method is very flexible with respect to control genes: any prior knowledge about the expected degree of their stability can be directly incorporated into the model. Yet the method provides sensible answers without such assumptions, or even in the complete absence of control genes. We also present a natural Bayesian analogue of the "classic" analysis, which uses standard data pre-processing steps (logarithmic transformation and multi-gene normalization) but estimates all gene expression changes jointly within a single model. The new methods are considerably more flexible and powerful than the standard delta-delta Ct analysis based on pairwise t-tests. CONCLUSIONS: Our methodology expands the applicability of the relative-quantification analysis protocol all the way to the lowest-abundance targets, and provides a novel opportunity to analyze qRT-PCR data without making any assumptions concerning target stability. These procedures have been

  10. JAM: A Scalable Bayesian Framework for Joint Analysis of Marginal SNP Effects.

    Science.gov (United States)

    Newcombe, Paul J; Conti, David V; Richardson, Sylvia

    2016-04-01

    Recently, large-scale genome-wide association study (GWAS) meta-analyses have boosted the number of known signals for some traits into the tens and hundreds. Typically, however, variants are only analysed one at a time. This limits the ability of fine-mapping to identify a small set of SNPs for further functional follow-up. We describe a new and scalable algorithm, joint analysis of marginal summary statistics (JAM), for the re-analysis of published marginal summary statistics under joint multi-SNP models. The correlation is accounted for according to estimates from a reference dataset, and models and SNPs that best explain the complete joint pattern of marginal effects are highlighted via an integrated Bayesian penalized regression framework. We provide both enumerated and Reversible Jump MCMC implementations of JAM and present some comparisons of performance. In a series of realistic simulation studies, JAM demonstrated identical performance to various alternatives designed for single-region settings. In multi-region settings, where the only multivariate alternative involves stepwise selection, JAM offered greater power and specificity. We also present an application to real published results from MAGIC (the Meta-Analyses of Glucose and Insulin-related traits Consortium), a GWAS meta-analysis of more than 15,000 people. We re-analysed several genomic regions that produced multiple significant signals for glucose levels 2 hr after oral stimulation. Through joint multivariate modelling, JAM was able to formally rule out many SNPs and, for one gene, ADCY5, suggests that an additional SNP, which transpired to be more biologically plausible, should be followed up with equal priority to the reported index.
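
    The linear-algebra identity underlying this kind of re-analysis is compact enough to sketch: with standardized genotypes, joint effects are approximately the inverse LD matrix times the marginal effects. The toy example below is not JAM itself (which adds the Bayesian penalized-regression and model-selection machinery on top), and all numbers are synthetic.

        import numpy as np

        rng = np.random.default_rng(2)

        # Synthetic correlated "genotypes": one causal SNP drags three LD
        # partners along in the one-at-a-time analysis.
        n, p = 5000, 4
        L = np.linalg.cholesky(0.4 * np.eye(p) + 0.6 * np.ones((p, p)))
        G = rng.normal(size=(n, p)) @ L.T
        G = (G - G.mean(0)) / G.std(0)
        y = G @ np.array([0.3, 0.0, 0.0, 0.0]) + rng.normal(size=n)

        marginal = G.T @ y / n          # published-style marginal effects
        R = G.T @ G / n                 # LD matrix from a "reference panel"
        print("marginal:", marginal.round(2))                      # all look real
        print("joint:   ", np.linalg.solve(R, marginal).round(2))  # causal isolated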

  11. Bayesian Analysis for Food-Safety Risk Assessment: Evaluation of Dose-Response Functions within WinBUGS

    OpenAIRE

    Williams, Michael S.; Ebel, Eric D.; Jennifer A Hoeting

    2011-01-01

    Bayesian methods are becoming increasingly popular in the field of food-safety risk assessment. Risk assessment models often require the integration of a dose-response function over the distribution of all possible doses of a pathogen ingested with a specific food. This requires the evaluation of an integral for every sample in a Markov chain Monte Carlo analysis of a model. While many statistical software packages have functions that allow for the evaluation of the integral, this functional...

  12. Genotype-Based Bayesian Analysis of Gene-Environment Interactions with Multiple Genetic Markers and Misclassification in Environmental Factors

    OpenAIRE

    Iryna Lobach; Ruzong Fan

    2012-01-01

    A key component of understanding the etiology of complex diseases, such as cancer, diabetes, and alcohol dependence, is to investigate gene-environment interactions. This work is motivated by the following two concerns in the analysis of gene-environment interactions. First, multiple genetic markers in moderate linkage disequilibrium may be involved in susceptibility to a complex disease. Second, environmental factors may be subject to misclassification. We develop a genotype-based Bayesian pseudolik...

  13. Epileptic Seizure Detection Using Lacunarity and Bayesian Linear Discriminant Analysis in Intracranial EEG.

    Science.gov (United States)

    Zhou, Weidong; Liu, Yinxia; Yuan, Qi; Li, Xueli

    2013-12-01

    Automatic seizure detection plays an important role in long-term epilepsy monitoring, and seizure detection algorithms have been intensively investigated over the years. This paper proposes an algorithm for seizure detection using lacunarity and Bayesian linear discriminant analysis (BLDA) in long-term intracranial EEG. Lacunarity is a measure of heterogeneity for a fractal. The proposed method first conducts wavelet decomposition on EEGs with five scales, and selects the wavelet coefficients at scales 3, 4, and 5 for subsequent processing. Effective features, including lacunarity and fluctuation index, are extracted from the selected three scales and then sent to the BLDA for training and classification. Finally, postprocessing, which includes smoothing, threshold judgment, multichannel integration, and a collar technique, is applied to obtain high sensitivity and a low false detection rate. The proposed algorithm is evaluated on 289.14 h of intracranial EEG data from the 21-patient Freiburg dataset and yields a sensitivity of 96.25% and a false detection rate of 0.13/h with a mean delay time of 13.8 s.
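
    Lacunarity has several definitions in the literature; a minimal gliding-box version for a 1D signal is sketched below with synthetic inputs (the paper computes features on selected wavelet sub-bands of EEG, which is not reproduced here).

        import numpy as np

        def lacunarity(x, box):
            # Gliding-box lacunarity of a nonnegative 1D signal: ratio of the
            # second to the squared first moment of the box "mass".
            masses = np.convolve(x, np.ones(box), mode="valid")
            return masses.var() / masses.mean() ** 2 + 1.0

        rng = np.random.default_rng(3)
        smooth = 1.0 + np.sin(np.linspace(0.0, 10.0, 1000))    # homogeneous
        bursty = np.abs(rng.standard_t(2, 1000))               # gappy, heavy-tailed
        print(lacunarity(smooth, 50), lacunarity(bursty, 50))  # bursty is larger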

  14. To be certain about the uncertainty: Bayesian statistics for 13C metabolic flux analysis.

    Science.gov (United States)

    Theorell, Axel; Leweke, Samuel; Wiechert, Wolfgang; Nöh, Katharina

    2017-07-11

    13C Metabolic Flux Analysis (13C MFA) remains the most powerful approach to determine intracellular metabolic reaction rates. Decisions on strain engineering and experimentation heavily rely upon the certainty with which these fluxes are estimated. For uncertainty quantification, the vast majority of 13C MFA studies rely on confidence intervals from the paradigm of frequentist statistics. However, it is well known that the confidence intervals for a given experimental outcome are not uniquely defined. As a result, confidence intervals produced by different methods can be different, but nevertheless equally valid. This is of high relevance to 13C MFA, since practitioners regularly use three different approximate approaches for calculating confidence intervals. By means of a computational study with a realistic model of the central carbon metabolism of E. coli, we provide strong evidence that the confidence intervals used in the field depend strongly on the technique with which they were calculated and, thus, their use leads to misinterpretation of the flux uncertainty. As a better alternative to confidence intervals in 13C MFA, we demonstrate that credible intervals from the paradigm of Bayesian statistics give more reliable flux uncertainty quantifications and can be readily computed with high accuracy using Markov chain Monte Carlo. In addition, the widely applied chi-square test, as a means of testing whether the model reproduces the data, is examined more closely. © 2017 Wiley Periodicals, Inc.

  15. Bayesian model accounting for within-class biological variability in Serial Analysis of Gene Expression (SAGE

    Directory of Open Access Journals (Sweden)

    Brentani Helena

    2004-08-01

    Full Text Available Abstract Background An important challenge for transcript counting methods such as Serial Analysis of Gene Expression (SAGE, "Digital Northern" or Massively Parallel Signature Sequencing (MPSS, is to carry out statistical analyses that account for the within-class variability, i.e., variability due to the intrinsic biological differences among sampled individuals of the same class, and not only variability due to technical sampling error. Results We introduce a Bayesian model that accounts for the within-class variability by means of mixture distribution. We show that the previously available approaches of aggregation in pools ("pseudo-libraries" and the Beta-Binomial model, are particular cases of the mixture model. We illustrate our method with a brain tumor vs. normal comparison using SAGE data from public databases. We show examples of tags regarded as differentially expressed with high significance if the within-class variability is ignored, but clearly not so significant if one accounts for it. Conclusion Using available information about biological replicates, one can transform a list of candidate transcripts showing differential expression to a more reliable one. Our method is freely available, under GPL/GNU copyleft, through a user friendly web-based on-line tool or as R language scripts at supplemental web-site.

  16. Bayesian network meta-analysis comparing five contemporary treatment strategies for newly diagnosed acute promyelocytic leukaemia.

    Science.gov (United States)

    Wu, Fenfang; Wu, Di; Ren, Yong; Duan, Chongyang; Chen, Shangwu; Xu, Anlong

    2016-07-26

    Acute promyelocytic leukemia (APL) is a curable subtype of acute myeloid leukemia. The optimum regimen for newly diagnosed APL remains inconclusive. In this Bayesian network meta-analysis, we compared the effectiveness of five regimens: arsenic trioxide (ATO) + all-trans retinoic acid (ATRA), realgar-indigo naturalis formula (RIF, which contains arsenic tetrasulfide) + ATRA, ATRA + anthracycline-based chemotherapy (CT), ATO alone and ATRA alone, based on fourteen randomized controlled trials (RCTs) that included 1407 newly diagnosed APL patients. According to the results, the ranking of treatment efficacy in the induction stage, in terms of early death and complete remission, was as follows: 1. ATO/RIF + ATRA; 2. ATRA + CT; 3. ATO; and 4. ATRA. For long-term benefit, ATO/RIF + ATRA significantly improved overall survival (OS) (hazard ratio = 0.35, 95%CI 0.15-0.82, p = 0.02) and event-free survival (EFS) (hazard ratio = 0.32, 95%CI 0.16-0.61, p = 0.001) over the ATRA + CT regimen for low-to-intermediate-risk patients. Thus, ATO + ATRA and RIF + ATRA might be considered the optimum treatments for newly diagnosed APL and should be recommended as the standard of care for frontline therapy.

  17. Bayesian analysis of spatial-dependent cosmic-ray propagation: astrophysical background of antiprotons and positrons

    CERN Document Server

    Feng, Jie; Oliva, Alberto

    2016-01-01

    The AMS-02 experiment has reported a new measurement of the antiproton/proton ratio in Galactic cosmic rays (CRs). In the energy range $E\sim\,$60-450 GeV, this ratio is found to be remarkably constant. Using recent data on CR proton, helium, and carbon fluxes, and on the 10Be/9Be and B/C ratios, we have performed a global Bayesian analysis based on a Markov chain Monte Carlo sampling algorithm under a "two halo model" of CR propagation. In this model, CRs are allowed to experience a different type of diffusion when they propagate in the region close to the Galactic disk. We found that the vertical extent of this region is about 900 pc above and below the disk, and the corresponding diffusion coefficient scales with energy as $D\sim\,E^{0.15}$, describing well the observations on primary CR spectra, secondary/primary ratios and anisotropy. Under this model we have carried out improved calculations of antiparticle spectra arising from secondary CR production and their corresponding uncertainties. We made use of Monte-Carlo generato...

  18. Bayesian time series analysis of segments of the Rocky Mountain trumpeter swan population

    Science.gov (United States)

    Wright, Christopher K.; Sojda, Richard S.; Goodman, Daniel

    2002-01-01

    A Bayesian time series analysis technique, the dynamic linear model, was used to analyze counts of Trumpeter Swans (Cygnus buccinator) summering in Idaho, Montana, and Wyoming from 1931 to 2000. For the Yellowstone National Park segment of white birds (sub-adults and adults combined) the estimated probability of a positive growth rate is 0.01. The estimated probability of achieving the Subcommittee on Rocky Mountain Trumpeter Swans 2002 population goal of 40 white birds for the Yellowstone segment is less than 0.01. Outside of Yellowstone National Park, Wyoming white birds are estimated to have a 0.79 probability of a positive growth rate with a 0.05 probability of achieving the 2002 objective of 120 white birds. In the Centennial Valley in southwest Montana, results indicate a probability of 0.87 that the white bird population is growing at a positive rate with considerable uncertainty. The estimated probability of achieving the 2002 Centennial Valley objective of 160 white birds is 0.14 but under an alternative model falls to 0.04. The estimated probability that the Targhee National Forest segment of white birds has a positive growth rate is 0.03. In Idaho outside of the Targhee National Forest, white birds are estimated to have a 0.97 probability of a positive growth rate with a 0.18 probability of attaining the 2002 goal of 150 white birds.
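
    A crude stand-in for this kind of analysis (a static trend fit rather than a full dynamic linear model, with invented counts) shows how a posterior probability of positive growth can be read off a sample of slopes:

        import numpy as np

        rng = np.random.default_rng(4)
        years = np.arange(1990, 2001)                     # invented survey years
        counts = np.array([105, 110, 98, 120, 115, 125,   # invented white-bird
                           119, 130, 128, 140, 135])      # counts

        # Bayesian regression of log counts on time with a flat prior; the
        # slope's posterior gives P(growth rate > 0).
        t = years - years.mean()
        X = np.column_stack([np.ones_like(t), t])
        y = np.log(counts)
        beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
        s2 = np.sum((y - X @ beta_hat) ** 2) / (len(y) - 2)
        cov = s2 * np.linalg.inv(X.T @ X)
        slopes = rng.multivariate_normal(beta_hat, cov, 100_000)[:, 1]
        print("P(positive growth rate) ~", (slopes > 0).mean())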

  19. Bayesian Modeling of MPSS Data: Gene Expression Analysis of Bovine Salmonella Infection

    KAUST Repository

    Dhavala, Soma S.

    2010-09-01

    Massively Parallel Signature Sequencing (MPSS) is a high-throughput, counting-based technology available for gene expression profiling. It produces output that is similar to Serial Analysis of Gene Expression and is ideal for building complex relational databases for gene expression. Our goal is to compare the in vivo global gene expression profiles of tissues infected with different strains of Salmonella obtained using the MPSS technology. In this article, we develop an exact ANOVA-type model for this count data using a zero-inflated Poisson distribution, different from existing methods that assume continuous densities. We adopt two Bayesian hierarchical models: one parametric and the other semiparametric with a Dirichlet process prior that has the ability to "borrow strength" across related signatures, where a signature is a specific arrangement of the nucleotides, usually 16-21 base pairs long. We utilize the discreteness of the Dirichlet process prior to cluster signatures that exhibit similar differential expression profiles. Tests for differential expression are carried out using nonparametric approaches, while controlling the false discovery rate. We identify several differentially expressed genes that have important biological significance and conclude with a summary of the biological discoveries. This article has supplementary materials online. © 2010 American Statistical Association.

  20. Bayesian analysis of diagnostic test accuracy when disease state is unverified for some subjects.

    Science.gov (United States)

    Pennello, Gene A

    2011-09-01

    Studies of the accuracy of medical tests to diagnose the presence or absence of disease can suffer from an inability to verify the true disease state in everyone. When verification is missing at random (MAR), the missing data mechanism can be ignored in likelihood-based inference. However, this assumption may not hold even approximately. When verification is nonignorably missing, the most general model of the distribution of disease state, test result, and verification indicator is overparameterized. Parameters are only partially identified, creating regions of ignorance for maximum likelihood estimators. For studies of a single test, we use Bayesian analysis to implement the most general nonignorable model, a reduced nonignorable model with identifiable parameters, and the MAR model. Simple Gibbs sampling algorithms are derived that enable computation of the posterior distribution of test accuracy parameters. In particular, the posterior distribution is easily obtained for the most general nonignorable model, which makes relatively weak assumptions about the missing data mechanism. For this model, the posterior distribution combines two sources of uncertainty: ignorance in the estimation of partially identified parameters, and imprecision due to finite sampling variability. We compare the three models on data from a study of the accuracy of scintigraphy to diagnose liver disease.

  1. Bayesian linear regression with skew-symmetric error distributions with applications to survival analysis

    KAUST Repository

    Rubio, Francisco J.

    2016-02-09

    We study Bayesian linear regression models with skew-symmetric scale mixtures of normal error distributions. These kinds of models can be used to capture departures from the usual assumption of normality of the errors in terms of heavy tails and asymmetry. We propose a general noninformative prior structure for these regression models and show that the corresponding posterior distribution is proper under mild conditions. We extend these propriety results to cases where the response variables are censored. The latter scenario is of interest in the context of accelerated failure time models, which are relevant in survival analysis. We present a simulation study that demonstrates good frequentist properties of the posterior credible intervals associated with the proposed priors. This study also sheds some light on the trade-off between increased model flexibility and the risk of over-fitting. We illustrate the performance of the proposed models with real data. Although we focus on models with univariate response variables, we also present some extensions to the multivariate case in the Supporting Information.

  2. Integration of Bayesian analysis for eutrophication prediction and assessment in a landscape lake.

    Science.gov (United States)

    Yang, Likun; Zhao, Xinhua; Peng, Sen; Zhou, Guangyu

    2015-01-01

    Eutrophication models have been widely used to assess water quality in landscape lakes. Because flow rate in landscape lakes is relatively low and similar to that of natural lakes, eutrophication is more dominant in landscape lakes. To assess the risk of eutrophication in landscape lakes, a set of dynamic equations was developed to simulate lake water quality for total nitrogen (TN), total phosphorus (TP), dissolved oxygen (DO) and chlorophyll a (Chl a). First, the Bayesian calibration results are described, and the ability of the model to adequately reproduce the observed mean patterns and major cause-effect relationships for water quality conditions in landscape lakes is demonstrated. Two loading scenarios were used. A Monte Carlo algorithm was applied to calculate the predicted water quality distributions, which were used in the established hierarchical assessment system for lake water quality risk. The important factors affecting the lake water quality risk were identified using linear regression analysis. The results indicated that variations in the quality of the recharge water received by the landscape lake caused considerable water quality risk in the surrounding area. Moreover, the Chl a concentration in lake water was significantly affected by TP and TN concentrations; the lake TP concentration was the limiting factor for growth of plankton in lake water, while the lake TN concentration provided the basic nutritional requirements. Lastly, lower TN and TP concentrations in the receiving recharge water caused increased lake water quality risk.

  3. Assessment of occupational safety risks in Floridian solid waste systems using Bayesian analysis.

    Science.gov (United States)

    Bastani, Mehrad; Celik, Nurcin

    2015-10-01

    Safety risks embedded within solid waste management systems continue to be a significant issue and are prevalent at every step in the solid waste management process. To recognise and address these occupational hazards, it is necessary to discover the potential safety concerns that cause them, as well as their direct and/or indirect impacts on the different types of solid waste workers. In this research, our goal is to statistically assess occupational safety risks to solid waste workers in the state of Florida. Here, we first review the related standard industrial codes for the major solid waste management methods, including recycling, incineration, landfilling, and composting. Then, a quantitative assessment of major risks is conducted based on the data collected, using Bayesian data analysis and predictive methods. The risks estimated in this study for the period 2005-2012 are then compared with historical statistics (1993-1997) from previous assessment studies. The results show that the injury rates among refuse collectors for both musculoskeletal and dermal injuries have decreased from 88 and 15 to 16 and 3 injuries per 1000 workers, respectively. However, a contrasting trend is observed for the injury rates among recycling workers, for whom musculoskeletal and dermal injuries have increased from 13 and 4 injuries to 14 and 6 injuries per 1000 workers, respectively. Lastly, a linear regression model has been proposed to identify major elements of the high number of musculoskeletal and dermal injuries.
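
    The kind of rate estimate reported here has a particularly simple Bayesian form when injury counts are modeled as Poisson with a conjugate Gamma prior. The sketch below uses invented counts and prior values, not the study's data or model.

        from scipy import stats

        # Conjugate Gamma-Poisson: injuries ~ Poisson(rate * exposure), with the
        # rate per 1000 worker-years given a Gamma(a, b) prior (values invented).
        injuries, exposure = 16, 1.0          # e.g. 16 injuries per 1000 workers
        a, b = 1.0, 0.01                      # weakly informative prior
        post = stats.gamma(a + injuries, scale=1.0 / (b + exposure))
        print("posterior mean rate:", round(post.mean(), 1))
        print("95% credible interval:", post.interval(0.95))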

  4. Bayesian Nonparametric Regression Analysis of Data with Random Effects Covariates from Longitudinal Measurements

    KAUST Repository

    Ryu, Duchwan

    2010-09-28

    We consider nonparametric regression analysis in a generalized linear model (GLM) framework for data with covariates that are the subject-specific random effects of longitudinal measurements. The usual assumption that the effects of the longitudinal covariate processes are linear in the GLM may be unrealistic and if this happens it can cast doubt on the inference of observed covariate effects. Allowing the regression functions to be unknown, we propose to apply Bayesian nonparametric methods including cubic smoothing splines or P-splines for the possible nonlinearity and use an additive model in this complex setting. To improve computational efficiency, we propose the use of data-augmentation schemes. The approach allows flexible covariance structures for the random effects and within-subject measurement errors of the longitudinal processes. The posterior model space is explored through a Markov chain Monte Carlo (MCMC) sampler. The proposed methods are illustrated and compared to other approaches, the "naive" approach and the regression calibration, via simulations and by an application that investigates the relationship between obesity in adulthood and childhood growth curves. © 2010, The International Biometric Society.

  5. Bayesian analysis of spatial-dependent cosmic-ray propagation: Astrophysical background of antiprotons and positrons

    Science.gov (United States)

    Feng, Jie; Tomassetti, Nicola; Oliva, Alberto

    2016-12-01

    The AMS-02 experiment has reported a new measurement of the antiproton/proton ratio in Galactic cosmic rays (CRs). In the energy range E ˜60 - 450 GeV , this ratio is found to be remarkably constant. Using recent data on CR proton, helium, and carbon fluxes, 10Be/9Be and B/C ratios, we have performed a global Bayesian analysis based on a Markov chain Monte Carlo sampling algorithm under a "two halo model" of CR propagation. In this model, CRs are allowed to experience a different type of diffusion when they propagate in the region close to the Galactic disk. We found that the vertical extent of this region is about 900 pc above and below the disk, and the corresponding diffusion coefficient scales with energy as D ∝E0.15 , describing well the observations on primary CR spectra, secondary/primary ratios, and anisotropy. Under this model, we have carried out improved calculations of antiparticle spectra arising from secondary CR production and their corresponding uncertainties. We made use of Monte Carlo generators and accelerator data to assess the antiproton production cross sections and their uncertainties. While the positron excess requires the contribution of additional unknown sources, we found that the new AMS-02 antiproton data are consistent, within the estimated uncertainties, with our calculations based on secondary production.

  6. Bayesian analysis of sparse anisotropic universe models and application to the 5-yr WMAP data

    CERN Document Server

    Groeneboom, Nicolaas E

    2008-01-01

    We extend the previously described CMB Gibbs sampling framework to allow for exact Bayesian analysis of anisotropic universe models, and apply this method to the 5-year WMAP temperature observations. This involves adding support for non-diagonal signal covariance matrices, and implementing a general spectral parameter MCMC sampler. As a worked example we apply these techniques to the model recently introduced by Ackerman et al., describing for instance violations of rotational invariance during the inflationary epoch. After verifying the code with simulated data, we analyze the foreground-reduced 5-year WMAP temperature sky maps. For l < 400 and the W-band data, we find tentative evidence for a preferred direction pointing towards (l,b) = (110 deg, 10 deg) with an anisotropy amplitude of g* = 0.15 +- 0.039, nominally equivalent to a 3.8 sigma detection. Similar results are obtained from the V-band data [g* = 0.11 +- 0.039; (l,b) = (130 deg, 20 deg)]. Further, the preferred direction is stable with respect ...

  7. Bayesian nonparametric regression analysis of data with random effects covariates from longitudinal measurements.

    Science.gov (United States)

    Ryu, Duchwan; Li, Erning; Mallick, Bani K

    2011-06-01

    We consider nonparametric regression analysis in a generalized linear model (GLM) framework for data with covariates that are the subject-specific random effects of longitudinal measurements. The usual assumption that the effects of the longitudinal covariate processes are linear in the GLM may be unrealistic and if this happens it can cast doubt on the inference of observed covariate effects. Allowing the regression functions to be unknown, we propose to apply Bayesian nonparametric methods including cubic smoothing splines or P-splines for the possible nonlinearity and use an additive model in this complex setting. To improve computational efficiency, we propose the use of data-augmentation schemes. The approach allows flexible covariance structures for the random effects and within-subject measurement errors of the longitudinal processes. The posterior model space is explored through a Markov chain Monte Carlo (MCMC) sampler. The proposed methods are illustrated and compared to other approaches, the "naive" approach and the regression calibration, via simulations and by an application that investigates the relationship between obesity in adulthood and childhood growth curves.

  8. A Bayesian Approach to the Design and Analysis of Computer Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Currin, C.

    1988-01-01

    We consider the problem of designing and analyzing experiments for prediction of the function y(t), t ∈ T, where y is evaluated by means of a computer code (typically by solving complicated equations that model a physical system), and T represents the domain of inputs to the code. We use a Bayesian approach, in which uncertainty about y is represented by a spatial stochastic process (random function); here we restrict attention to stationary Gaussian processes. The posterior mean function can be used as an interpolating function, with uncertainties given by the posterior standard deviations. Instead of completely specifying the prior process, we consider several families of priors, and suggest some cross-validational methods for choosing one that performs relatively well on the function at hand. As a design criterion, we use the expected reduction in the entropy of the random vector y(T*), where T* ⊂ T is a given finite set of "sites" (input configurations) at which predictions are to be made. We describe an exchange algorithm for constructing designs that are optimal with respect to this criterion. To demonstrate the use of these design and analysis methods, several examples are given, including one experiment on a computer model of a thermal energy storage device and another on an integrated circuit simulator.
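
    A minimal version of this idea, with an invented one-dimensional "code" and a fixed squared-exponential covariance in place of the families of priors discussed above, conditions a stationary Gaussian process on a handful of runs and reads off the posterior mean (the interpolator) and pointwise standard deviation:

        import numpy as np

        # Stationary GP prior over the code output; conditioning on five runs
        # gives an interpolating posterior mean and pointwise uncertainty.
        def k(a, b, length=0.3):
            return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

        f = lambda t: np.sin(6.0 * t) + t               # stand-in for expensive code
        T_run = np.array([0.0, 0.25, 0.5, 0.75, 1.0])   # design sites
        T_new = np.linspace(0.0, 1.0, 201)

        K = k(T_run, T_run) + 1e-9 * np.eye(5)          # jitter for stability
        mean = k(T_new, T_run) @ np.linalg.solve(K, f(T_run))
        var = 1.0 - np.sum(k(T_new, T_run) *
                           np.linalg.solve(K, k(T_run, T_new)).T, axis=1)
        print("max posterior sd between runs:", float(np.sqrt(var.max())))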

  9. Cross-validation analysis of bias models in Bayesian multi-model projections of climate

    Science.gov (United States)

    Huttunen, J. M. J.; Räisänen, J.; Nissinen, A.; Lipponen, A.; Kolehmainen, V.

    2017-03-01

    Climate change projections are commonly based on multi-model ensembles of climate simulations. In this paper we consider the choice of bias models in Bayesian multimodel predictions. Buser et al. (Clim Res 44(2-3):227-241, 2010a) introduced a hybrid bias model which combines commonly used constant bias and constant relation bias assumptions. The hybrid model includes a weighting parameter which balances these bias models. In this study, we use a cross-validation approach to study which bias model or bias parameter leads to, in a specific sense, optimal climate change projections. The analysis is carried out for summer and winter season means of 2 m-temperatures spatially averaged over the IPCC SREX regions, using 19 model runs from the CMIP5 data set. The cross-validation approach is applied to calculate optimal bias parameters (in the specific sense) for projecting the temperature change from the control period (1961-2005) to the scenario period (2046-2090). The results are compared to the results of the Buser et al. (Clim Res 44(2-3):227-241, 2010a) method which includes the bias parameter as one of the unknown parameters to be estimated from the data.

  10. Factors Influencing Water System Functionality in Nigeria and Tanzania: A Regression and Bayesian Network Analysis.

    Science.gov (United States)

    Cronk, Ryan; Bartram, Jamie

    2017-09-21

    Sufficient, safe, and continuously available water services are important for human development and health yet many water systems in low- and middle-income countries are nonfunctional. Monitoring data were analyzed using regression and Bayesian networks (BNs) to explore factors influencing the functionality of 82 503 water systems in Nigeria and Tanzania. Functionality varied by system type. In Tanzania, Nira handpumps were more functional than Afridev and India Mark II handpumps. Higher functionality was associated with fee collection in Nigeria. In Tanzania, functionality was higher if fees were collected monthly rather than in response to system breakdown. Systems in Nigeria were more likely to be functional if they were used for both human and livestock consumption. In Tanzania, systems managed by private operators were more functional than community-managed systems. The BNs found strong dependencies between functionality and system type and administrative unit (e.g., district). The BNs predicted functionality increased from 68% to 89% in Nigeria and from 53% to 68% in Tanzania when best observed conditions were in place. Improvements to water system monitoring and analysis of monitoring data with different modeling techniques may be useful for identifying water service improvement opportunities and informing evidence-based decision-making for better management, policy, programming, and practice.

  11. Paired Comparison Analysis of the van Baaren Model Using Bayesian Approach with Noninformative Prior

    Directory of Open Access Journals (Sweden)

    Saima Altaf

    2012-03-01

    Full Text Available One technique commonly studied these days, because of its attractive applications for the comparison of several objects, is the method of paired comparisons. This technique permits the ranking of the objects by means of a score, which reflects the merit of the items on a linear scale. The present study is concerned with the Bayesian analysis of a paired comparison model, namely the van Baaren model VI, using a noninformative uniform prior. For this purpose, the joint posterior distribution for the parameters of the model, their marginal distributions, posterior estimates (means and modes), the posterior probabilities for comparing the two treatment parameters and the predictive probabilities are obtained.

  12. Bayesian analysis of the linear reaction norm model with unknown covariates.

    Science.gov (United States)

    Su, G; Madsen, P; Lund, M S; Sorensen, D; Korsgaard, I R; Jensen, J

    2006-07-01

    The reaction norm model is becoming a popular approach for the analysis of genotype x environment interactions. In a classical reaction norm model, the expression of a genotype in different environments is described as a linear function (a reaction norm) of an environmental gradient or value. An environmental value is typically defined as the mean performance of all genotypes in the environment, which is usually unknown. One approximation is to estimate the mean phenotypic performance in each environment and then treat these estimates as known covariates in the model. However, a more satisfactory alternative is to infer environmental values simultaneously with the other parameters of the model. This study describes a method and its Bayesian Markov Chain Monte Carlo implementation that makes this possible. Frequentist properties of the proposed method are tested in a simulation study. Estimates of parameters of interest agree well with the true values. Further, inferences about genetic parameters from the proposed method are similar to those derived from a reaction norm model using true environmental values. On the other hand, using phenotypic means as proxies for environmental values results in poor inferences.

  13. A Bayesian parameter estimation approach to pulsar time-of-arrival analysis

    CERN Document Server

    Messenger, C; Demorest, P; Ransom, S

    2011-01-01

    The increasing sensitivities of pulsar timing arrays to ultra-low frequency (nHz) gravitational waves promise to achieve direct gravitational wave detection within the next 5-10 years. While there are many parallel efforts being made in the improvement of telescope sensitivity, the detection of stable millisecond pulsars and the improvement of the timing software, there are reasons to believe that the methods used to accurately determine the time-of-arrival (TOA) of pulses from radio pulsars can be improved upon. More specifically, the determination of the uncertainties on these TOAs, which strongly affect the ability to detect GWs through pulsar timing, may be unreliable. We propose two Bayesian methods for the generation of pulsar TOAs starting from pulsar "search-mode" data and pre-folded data. These methods are applied to simulated toy-model examples, and in this initial work we focus on the issue of uncertainties in the folding period. The final results of our analysis are expressed in the form of poster...

  14. Bayesian biostatistics

    CERN Document Server

    Lesaffre, Emmanuel

    2012-01-01

    The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introd

  15. Regularization and Bayesian methods for inverse problems in signal and image processing

    CERN Document Server

    Giovannelli, Jean-François

    2015-01-01

    The focus of this book is on "ill-posed inverse problems". These problems cannot be solved only on the basis of observed data. The building of solutions involves the recognition of other pieces of a priori information. These solutions are then specific to the pieces of information taken into account. Clarifying and taking these pieces of information into account is necessary for grasping the domain of validity and the field of application for the solutions built.  For too long, the interest in these problems has remained very limited in the signal-image community. However, the community has si

  16. Image Analysis for Tongue Characterization

    Institute of Scientific and Technical Information of China (English)

    SHEN Lansun; WEI Baoguo; CAI Yiheng; ZHANG Xinfeng; WANG Yanqing; CHEN Jing; KONG Lingbiao

    2003-01-01

    Tongue diagnosis is one of the essential methods in traditional Chinese medical diagnosis. The accuracy of tongue diagnosis can be improved by tongue characterization. This paper investigates the use of image analysis techniques for tongue characterization by evaluating visual features obtained from images. A tongue imaging and analysis instrument (TIAI) was developed to acquire digital color tongue images. Several novel approaches are presented for color calibration, tongue area segmentation, quantitative analysis and qualitative description of the colors of the tongue and its coating, the thickness and moisture of the coating, and quantification of the cracks of the tongue. The overall accuracy of the automatic analysis of the colors of the tongue and the thickness of the tongue coating exceeds 85%. This work shows the promising future of tongue characterization.

  17. A Bayesian SIRS model for the analysis of respiratory syncytial virus in the region of Valencia, Spain.

    Science.gov (United States)

    Corberán-Vallet, Ana; Santonja, Francisco J

    2014-09-01

    We present a Bayesian stochastic susceptible-infected-recovered-susceptible (SIRS) model in discrete time to understand respiratory syncytial virus dynamics in the region of Valencia, Spain. A SIRS model based on ordinary differential equations has also been proposed to describe RSV dynamics in the region of Valencia. However, this continuous-time deterministic model is not suitable when the initial number of infected individuals is small. Stochastic epidemic models based on a probability of disease transmission provide a more natural description of the spread of infectious diseases. In addition, by allowing the transmission rate to vary stochastically over time, the proposed model provides an improved description of RSV dynamics. The Bayesian analysis of the model allows us to calculate both the posterior distribution of the model parameters and the posterior predictive distribution, which facilitates the computation of point forecasts and prediction intervals for future observations. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
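
    A discrete-time stochastic SIRS model of the general kind described here can be simulated with chain-binomial steps; the sketch below uses illustrative parameter values, not the fitted RSV values, and omits the Bayesian estimation layer.

        import numpy as np

        rng = np.random.default_rng(5)

        # One path from a discrete-time chain-binomial SIRS model (a common
        # construction; all parameter values are invented for illustration).
        N, I, R = 10_000, 10, 0
        beta, gamma, omega = 0.3, 0.1, 0.01   # transmission, recovery, waning
        path = []
        for week in range(200):
            S = N - I - R
            p_inf = 1 - np.exp(-beta * I / N)     # per-susceptible infection prob.
            new_inf = rng.binomial(S, p_inf)
            new_rec = rng.binomial(I, 1 - np.exp(-gamma))
            new_sus = rng.binomial(R, 1 - np.exp(-omega))
            I += new_inf - new_rec
            R += new_rec - new_sus
            path.append(I)
        print("peak infected:", max(path))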

  18. Flightspeed Integral Image Analysis Toolkit

    Science.gov (United States)

    Thompson, David R.

    2009-01-01

    The Flightspeed Integral Image Analysis Toolkit (FIIAT) is a C library that provides image analysis functions in a single, portable package. It provides basic low-level filtering, texture analysis, and subwindow descriptors for applications dealing with image interpretation and object recognition. Designed with spaceflight in mind, it addresses: ease of integration (minimal external dependencies); fast, real-time operation using integer arithmetic where possible (useful for platforms lacking a dedicated floating-point processor); implementation entirely in C (easily modified); mostly static memory allocation; and 8-bit image data. The basic goal of the FIIAT library is to compute meaningful numerical descriptors for images or rectangular image regions. These n-vectors can then be used directly for novelty detection or pattern recognition, or as a feature space for higher-level pattern recognition tasks. The library provides routines for leveraging training data to derive descriptors that are most useful for a specific data set. Its runtime algorithms exploit a structure known as the "integral image". This is a caching method that permits fast summation of values within rectangular regions of an image, and it facilitates a wide range of fast image-processing functions. This toolkit has applicability to a wide range of autonomous image analysis tasks in the space-flight domain, including novelty detection, object and scene classification, target detection for autonomous instrument placement, and science analysis of geomorphology. It makes real-time texture and pattern recognition possible for platforms with severe computational constraints. The software provides an order of magnitude speed increase over alternative software libraries currently in use by the research community. FIIAT can commercially support intelligent video cameras used in intelligent surveillance. It is also useful for object recognition by robots or other autonomous vehicles.
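
    The integral-image trick itself is a few lines: one cumulative-sum pass, after which any rectangular sum costs four lookups regardless of the rectangle's size. A minimal sketch (ours, not FIIAT's C API):

        import numpy as np

        def integral_image(img):
            # One pass of cumulative sums along rows then columns.
            return img.cumsum(0).cumsum(1)

        def box_sum(ii, r0, c0, r1, c1):
            # Sum over the inclusive rectangle [r0..r1] x [c0..c1] by
            # inclusion-exclusion on the cumulative sums.
            s = ii[r1, c1]
            if r0 > 0: s -= ii[r0 - 1, c1]
            if c0 > 0: s -= ii[r1, c0 - 1]
            if r0 > 0 and c0 > 0: s += ii[r0 - 1, c0 - 1]
            return s

        img = np.arange(16.0).reshape(4, 4)
        ii = integral_image(img)
        assert box_sum(ii, 1, 1, 2, 2) == img[1:3, 1:3].sum()
        print("rectangle sum:", box_sum(ii, 1, 1, 2, 2))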

  19. A label field fusion Bayesian model and its penalized maximum Rand estimator for image segmentation.

    Science.gov (United States)

    Mignotte, Max

    2010-06-01

    This paper presents a novel segmentation approach based on a Markov random field (MRF) fusion model which aims at combining several segmentation results associated with simpler clustering models in order to achieve a more reliable and accurate segmentation result. The proposed fusion model is derived from the recently introduced probabilistic Rand measure for comparing one segmentation result to one or more manual segmentations of the same image. This non-parametric measure allows us to easily derive an appealing fusion model of label fields, easily expressed as a Gibbs distribution, or as a nonstationary MRF model defined on a complete graph. Concretely, this Gibbs energy model encodes the set of binary constraints, in terms of pairs of pixel labels, provided by each of the segmentation results to be fused. Combined with a prior distribution, this energy-based Gibbs model also allows for the definition of an interesting penalized maximum probabilistic Rand estimator with which the fusion of simple, quickly estimated segmentation results appears as an interesting alternative to the complex segmentation models existing in the literature. This fusion framework has been successfully applied to the Berkeley image database. The experiments reported in this paper demonstrate that the proposed method is efficient in terms of visual evaluation and quantitative performance measures, and performs well compared to the best existing state-of-the-art segmentation methods recently proposed in the literature.

  20. Light-sheet Bayesian microscopy enables deep-cell super-resolution imaging of heterochromatin in live human embryonic stem cells

    Science.gov (United States)

    Hu, Ying S; Zhu, Quan; Elkins, Keri; Tse, Kevin; Li, Yu; Fitzpatrick, James A J; Verma, Inder M; Cang, Hu

    2016-01-01

    Background Heterochromatin in the nucleus of human embryonic cells plays an important role in the epigenetic regulation of gene expression. The architecture of heterochromatin and its dynamic organization remain elusive because of the lack of fast and high-resolution deep-cell imaging tools. We enable this task by advancing instrumental and algorithmic implementation of the localization-based super-resolution technique. Results We present light-sheet Bayesian super-resolution microscopy (LSBM). We adapt light-sheet illumination for super-resolution imaging by using a novel prism-coupled condenser design to illuminate a thin slice of the nucleus with high signal-to-noise ratio. Coupled with a Bayesian algorithm that resolves overlapping fluorophores from high-density areas, we show, for the first time, nanoscopic features of the heterochromatin structure in both fixed and live human embryonic stem cells. The enhanced temporal resolution allows capturing the dynamic change of heterochromatin with a lateral resolution of 50–60 nm on a time scale of 2.3 s. Conclusion Light-sheet Bayesian microscopy opens up broad new possibilities of probing nanometer-scale nuclear structures and real-time sub-cellular processes and other previously difficult-to-access intracellular regions of living cells at the single-molecule, and single cell level.