Yuan, Ying; MacKinnon, David P.
2009-01-01
This article proposes Bayesian analysis of mediation effects. Compared to conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian mediation analysis, inference is straightforward and exact, which makes it appealing for studies with small samples. Third, the Bayesian approach is conceptua...
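For readers unfamiliar with the setup, the single-mediator model this line of work builds on can be stated compactly (a standard formulation; the article's exact parameterization may differ):

```latex
M_i = \beta_0 + a X_i + \varepsilon_{1i}, \qquad
Y_i = \gamma_0 + c' X_i + b M_i + \varepsilon_{2i}.
```

The mediated (indirect) effect is the product ab. The Bayesian approach places priors on (a, b, c') and summarizes the posterior distribution of ab directly, which is what makes exact interval estimates possible even in small samples, rather than relying on the asymptotic normality of the product of estimates.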
Gelman, Andrew; Stern, Hal S; Dunson, David B; Vehtari, Aki; Rubin, Donald B
2013-01-01
FUNDAMENTALS OF BAYESIAN INFERENCE: Probability and Inference; Single-Parameter Models; Introduction to Multiparameter Models; Asymptotics and Connections to Non-Bayesian Approaches; Hierarchical Models. FUNDAMENTALS OF BAYESIAN DATA ANALYSIS: Model Checking; Evaluating, Comparing, and Expanding Models; Modeling Accounting for Data Collection; Decision Analysis. ADVANCED COMPUTATION: Introduction to Bayesian Computation; Basics of Markov Chain Simulation; Computationally Efficient Markov Chain Simulation; Modal and Distributional Approximations. REGRESSION MODELS: Introduction to Regression Models; Hierarchical Linear...
Yuan, Ying; MacKinnon, David P.
2009-01-01
In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…
Bayesian Exploratory Factor Analysis
DEFF Research Database (Denmark)
Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.;
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...
Bayesian logistic regression analysis
Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.
2012-01-01
In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuisance parameters, the Jacobian transformation is an...
Bayesian Independent Component Analysis
DEFF Research Database (Denmark)
Winther, Ole; Petersen, Kaare Brandt
2007-01-01
In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...
Bayesian nonparametric data analysis
Müller, Peter; Jara, Alejandro; Hanson, Tim
2015-01-01
This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.
Evaluation of Bayesian tensor estimation using tensor coherence
Energy Technology Data Exchange (ETDEWEB)
Kim, Dae-Jin; Park, Hae-Jeong [Laboratory of Molecular Neuroimaging Technology, Brain Korea 21 Project for Medical Science, Yonsei University, College of Medicine, Seoul (Korea, Republic of); Kim, In-Young [Department of Biomedical Engineering, Hanyang University, Seoul (Korea, Republic of); Jeong, Seok-Oh [Department of Statistics, Hankuk University of Foreign Studies, Yongin (Korea, Republic of)], E-mail: parkhj@yuhs.ac
2009-06-21
Fiber tractography, a unique and non-invasive method to estimate axonal fibers within white matter, constructs putative streamlines from diffusion tensor MRI by interconnecting voxels according to the propagation direction defined by the diffusion tensor. This direction has uncertainties due to the properties of the underlying fiber bundles, neighboring structures and image noise. Therefore, robust estimation of the diffusion direction is essential to reconstruct reliable fiber pathways. For this purpose, we propose a tensor estimation method using a Bayesian framework, which includes an a priori probability distribution based on tensor coherence indices, utilizing both the neighborhood direction information and the inertia moment as regularization terms. The reliability of the proposed tensor estimation was evaluated using Monte Carlo simulations in terms of accuracy and precision, with four synthetic tensor fields at various SNRs and with in vivo human data of brain and calf muscle. The proposed Bayesian estimation demonstrated greater robustness to noise and higher reliability than simple tensor regression.
Bayesian Analysis of Experimental Data
Directory of Open Access Journals (Sweden)
Lalmohan Bhar
2013-10-01
Analysis of experimental data from a Bayesian point of view is considered. An appropriate methodology is developed for application to designed experiments. A Normal-Gamma distribution is used as the prior distribution. The developed methodology is applied to real experimental data taken from long-term fertilizer experiments.
Bayesian analysis of rare events
Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
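The BUS idea can be illustrated with plain rejection sampling before any rare-event machinery is added. A minimal sketch under assumed toy ingredients (Gaussian prior and likelihood, hypothetical data); in practice FORM, importance sampling or subset simulation would replace the brute-force acceptance step:

```python
import numpy as np

rng = np.random.default_rng(0)
data = np.array([1.2, 0.8, 1.1])                 # hypothetical observations

def likelihood(theta):
    # Gaussian measurement model with unit noise (illustrative assumption)
    return np.exp(-0.5 * ((data[None, :] - theta[:, None]) ** 2).sum(axis=1))

# BUS reinterprets Bayesian updating as rejection sampling: draw theta from
# the prior and u ~ U(0,1), and accept whenever u < c * L(theta), with the
# constant c chosen so that c * L(theta) <= 1.
theta = rng.normal(0.0, 2.0, size=200_000)       # prior samples
c = 1.0 / likelihood(theta).max()                # empirical choice of c
accept = rng.uniform(size=theta.size) < c * likelihood(theta)
posterior = theta[accept]

# The acceptance event {u < c * L(theta)} defines a "failure" domain, so
# structural-reliability methods (FORM, IS, SuS) can estimate its
# probability when acceptance is rare.
print(posterior.mean(), posterior.std(), accept.mean())
```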
Bayesian Model Averaging for Propensity Score Analysis
Kaplan, David; Chen, Jianshen
2013-01-01
The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…
Bayesian Group Factor Analysis
Virtanen, Seppo; Klami, Arto; Khan, Suleiman A; Kaski, Samuel
2011-01-01
We introduce a factor analysis model that summarizes the dependencies between observed variable groups, instead of dependencies between individual variables as standard factor analysis does. A group may correspond to one view of the same set of objects, one of many data sets tied by co-occurrence, or a set of alternative variables collected from statistics tables to measure one property of interest. We show that by assuming group-wise sparse factors, active in a subset of the sets, the variat...
Multiview Bayesian Correlated Component Analysis
DEFF Research Database (Denmark)
Kamronn, Simon Due; Poulsen, Andreas Trier; Hansen, Lars Kai
2015-01-01
Correlated component analysis as proposed by Dmochowski, Sajda, Dias, and Parra (2012) is a tool for investigating brain process similarity in the responses to multiple views of a given stimulus. Correlated components are identified under the assumption that the involved spatial networks are identical. Here we propose a hierarchical probabilistic model that can infer the level of universality in such multiview data, from completely unrelated representations, corresponding to canonical correlation analysis, to identical representations as in correlated component analysis. This new model, which we denote Bayesian correlated component analysis, evaluates favorably against three relevant algorithms in simulated data. A well-established benchmark EEG data set is used to further validate the new model and infer the variability of spatial representations across multiple subjects.
Bayesian analysis of exoplanet and binary orbits
Schulze-Hartung, Tim; Launhardt, Ralf; Henning, Thomas
2012-01-01
We introduce BASE (Bayesian astrometric and spectroscopic exoplanet detection and characterisation tool), a novel program for the combined or separate Bayesian analysis of astrometric and radial-velocity measurements of potential exoplanet hosts and binary stars. The capabilities of BASE are demonstrated using all publicly available data of the binary Mizar A.
Bayesian analysis of volcanic eruptions
Ho, Chih-Hsiang
1990-10-01
The simple Poisson model generally gives a good fit to many volcanoes for volcanic eruption forecasting. Nonetheless, empirical evidence suggests that volcanic activity in successive equal time-periods tends to be more variable than a simple Poisson with constant eruptive rate. An alternative model is therefore examined in which the eruptive rate (λ) for a given volcano or cluster(s) of volcanoes is described by a gamma distribution (prior) rather than treated as a constant value, as in the assumptions of a simple Poisson model. Bayesian analysis is performed to link the two distributions together to give the aggregate behavior of the volcanic activity. When the Poisson process is expanded to accommodate a gamma mixing distribution on λ, a consequence of this mixed (or compound) Poisson model is that the frequency distribution of eruptions in any given time-period of equal length follows the negative binomial distribution (NBD). Applications of the proposed model and comparisons between the generalized model and the simple Poisson model are discussed based on the historical eruptive count data of volcanoes Mauna Loa (Hawaii) and Etna (Italy). Several relevant facts lead to the conclusion that the generalized model is preferable for practical use both in space and time.
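The mixing step described here is a one-line marginalization. With eruption count N over a window of length t, N | λ ~ Poisson(λt) and λ ~ Gamma(α, β), integrating λ out gives the negative binomial form cited in the abstract:

```latex
P(N = n) \;=\; \int_{0}^{\infty} \frac{(\lambda t)^{n} e^{-\lambda t}}{n!}\,
\frac{\beta^{\alpha}}{\Gamma(\alpha)}\,\lambda^{\alpha-1} e^{-\beta\lambda}\, d\lambda
\;=\; \frac{\Gamma(\alpha+n)}{\Gamma(\alpha)\, n!}
\left(\frac{\beta}{\beta+t}\right)^{\!\alpha}
\left(\frac{t}{\beta+t}\right)^{\!n}.
```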
Subjective Bayesian Analysis: Principles and Practice
Goldstein, Michael
2006-01-01
We address the position of subjectivism within Bayesian statistics. We argue, first, that the subjectivist Bayes approach is the only feasible method for tackling many important practical problems. Second, we describe the essential role of the subjectivist approach in scientific analysis. Third, we consider possible modifications to the Bayesian approach from a subjectivist viewpoint. Finally, we address the issue of pragmatism in implementing the subjectivist approach.
Bayesian Analysis of Multivariate Probit Models
Siddhartha Chib; Edward Greenberg
1996-01-01
This paper provides a unified simulation-based Bayesian and non-Bayesian analysis of correlated binary data using the multivariate probit model. The posterior distribution is simulated by Markov chain Monte Carlo methods, and maximum likelihood estimates are obtained by a Markov chain Monte Carlo version of the E-M algorithm. Computation of Bayes factors from the simulation output is also considered. The methods are applied to a bivariate data set, to a 534-subject, four-year longitudinal dat...
Bayesian analysis of contingency tables
Gómez Villegas, Miguel A.; González Pérez, Beatriz
2005-01-01
The display of data by means of contingency tables is used in different approaches to statistical inference, for example, to test the homogeneity of independent multinomial distributions. We develop a Bayesian procedure to test simple null hypotheses versus bilateral alternatives in contingency tables. Given independent samples of two binomial distributions and taking a mixed prior distribution, we calculate the posterior probability that the proportion of successes in the first...
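The computational core of such a comparison is simple to sketch. Assuming independent Beta(1,1) priors for illustration (the paper itself uses a mixed prior, which this sketch does not reproduce):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 2x2 table: successes out of n in two independent samples
x1, n1 = 18, 50
x2, n2 = 9, 50

# With independent Beta(1,1) priors, posteriors are Beta(x + 1, n - x + 1)
p1 = rng.beta(x1 + 1, n1 - x1 + 1, size=100_000)
p2 = rng.beta(x2 + 1, n2 - x2 + 1, size=100_000)

# Monte Carlo estimate of the posterior probability of a higher rate in sample 1
print("P(p1 > p2 | data) =", (p1 > p2).mean())
```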
Bayesian analysis of cosmic structures
Kitaura, Francisco-Shu
2011-01-01
We review the Bayesian inference steps required to analyse the cosmological large-scale structure, with special emphasis on the complications which arise due to the non-Gaussian character of the galaxy and matter distribution. In particular we investigate the advantages and limitations of the Poisson-lognormal model and discuss how to extend this work. With the lognormal prior, using the Hamiltonian sampling technique and on scales of about 4 h^{-1} Mpc, we find that over-dense regions are excellently reconstructed; however, under-dense regions (void statistics) are quantitatively poorly recovered. Contrary to the maximum a posteriori (MAP) solution, which was shown to over-estimate the density in under-dense regions, we obtain lower densities than in N-body simulations. This is due to the fact that the MAP solution is conservative, whereas the full posterior yields samples which are consistent with the prior statistics. The lognormal prior is not able to capture the full non-linear regime at scales ...
A Bayesian nonparametric meta-analysis model.
Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G
2015-03-01
In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall effect size, such models may be adequate, but for prediction, they surely are not if the effect-size distribution exhibits non-normal behavior. To address this issue, we propose a Bayesian nonparametric meta-analysis model, which can describe a wider range of effect-size distributions, including unimodal symmetric distributions, as well as skewed and more multimodal distributions. We demonstrate our model through the analysis of real meta-analytic data arising from behavioral-genetic research. We compare the predictive performance of the Bayesian nonparametric model against various conventional and more modern normal fixed-effects and random-effects models.
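A generic way to write such a model, for orientation only (the authors' construction differs in detail but targets the same flexibility), replaces the parametric random-effects distribution with a nonparametric prior:

```latex
y_i \mid \theta_i \sim N(\theta_i, \sigma_i^{2}), \qquad
\theta_i \mid G \sim G, \qquad
G \sim \mathrm{DP}(\alpha, G_0),
```

so the effect-size population distribution G is itself inferred and can be skewed or multimodal rather than forced to be normal.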
A Gentle Introduction to Bayesian Analysis: Applications to Developmental Research
Van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B.; Neyer, Franz J.; van Aken, Marcel A G
2014-01-01
Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First, t
Book review: Bayesian analysis for population ecology
Link, William A.
2011-01-01
Brian Dennis described the field of ecology as “fertile, uncolonized ground for Bayesian ideas.” He continued: “The Bayesian propagule has arrived at the shore. Ecologists need to think long and hard about the consequences of a Bayesian ecology. The Bayesian outlook is a successful competitor, but is it a weed? I think so.” (Dennis 2004)
Bayesian Analysis of Individual Level Personality Dynamics
Directory of Open Access Journals (Sweden)
Edward Cripps
2016-07-01
A Bayesian technique with analyses of within-person processes at the level of the individual is presented. The approach is used to examine if the patterns of within-person responses on a 12-trial simulation task are consistent with the predictions of ITA theory (Dweck, 1999). ITA theory states that the performance of an individual with an entity theory of ability is more likely to spiral down following a failure experience than the performance of an individual with an incremental theory of ability. This is because entity theorists interpret failure experiences as evidence of a lack of ability, which they believe is largely innate and therefore relatively fixed; whilst incremental theorists believe in the malleability of abilities and interpret failure experiences as evidence of more controllable factors such as poor strategy or lack of effort. The results of our analyses support ITA theory at both the within- and between-person levels of analyses and demonstrate the benefits of Bayesian techniques for the analysis of within-person processes. These include more formal specification of the theory and the ability to draw inferences about each individual, which allows for more nuanced interpretations of individuals within a personality category, such as differences in the individual probabilities of spiralling. While Bayesian techniques have many potential advantages for the analyses of within-person processes at the individual level, ease of use is not one of them for psychologists trained in traditional frequentist statistical techniques.
Bayesian Analysis of Individual Level Personality Dynamics
Cripps, Edward; Wood, Robert E.; Beckmann, Nadin; Lau, John; Beckmann, Jens F.; Cripps, Sally Ann
2016-01-01
A Bayesian technique with analyses of within-person processes at the level of the individual is presented. The approach is used to examine whether the patterns of within-person responses on a 12-trial simulation task are consistent with the predictions of ITA theory (Dweck, 1999). ITA theory states that the performance of an individual with an entity theory of ability is more likely to spiral down following a failure experience than the performance of an individual with an incremental theory of ability. This is because entity theorists interpret failure experiences as evidence of a lack of ability which they believe is largely innate and therefore relatively fixed; whilst incremental theorists believe in the malleability of abilities and interpret failure experiences as evidence of more controllable factors such as poor strategy or lack of effort. The results of our analyses support ITA theory at both the within- and between-person levels of analyses and demonstrate the benefits of Bayesian techniques for the analysis of within-person processes. These include more formal specification of the theory and the ability to draw inferences about each individual, which allows for more nuanced interpretations of individuals within a personality category, such as differences in the individual probabilities of spiraling. While Bayesian techniques have many potential advantages for the analyses of processes at the level of the individual, ease of use is not one of them for psychologists trained in traditional frequentist statistical techniques.
Bayesian Inference in Statistical Analysis
Box, George E P
2011-01-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Rob
Bayesian Analysis of Type Ia Supernova Data
Institute of Scientific and Technical Information of China (English)
王晓峰; 周旭; 李宗伟; 陈黎
2003-01-01
Recently, the distances to type Ia supernovae (SNe Ia) at z ~ 0.5 have been measured with the motivation of estimating cosmological parameters. However, different sleuthing techniques tend to give inconsistent measurements for SN Ia distances (~0.3 mag), which significantly affects the determination of cosmological parameters. A Bayesian "hyper-parameter" procedure is used to analyse jointly the current SN Ia data, which considers the relative weights of different datasets. For a flat Universe, the combined analysis yields Ω_M = 0.20 ± 0.07.
Bayesian global analysis of neutrino oscillation data
Bergstrom, Johannes; Maltoni, Michele; Schwetz, Thomas
2015-01-01
We perform a Bayesian analysis of current neutrino oscillation data. When estimating the oscillation parameters we find that the results generally agree with those of the $\chi^2$ method, with some differences involving $s_{23}^2$ and CP-violating effects. We discuss the additional subtleties caused by the circular nature of the CP-violating phase, and how it is possible to obtain correlation coefficients with $s_{23}^2$. When performing model comparison, we find that there is no significant evidence for any mass ordering, any octant of $s_{23}^2$ or a deviation from maximal mixing, nor for the presence of CP violation.
Doing Bayesian data analysis: a tutorial with R and BUGS
Kruschke, John K
2011-01-01
There is an explosion of interest in Bayesian statistics, primarily because recently created computational methods have finally made Bayesian analysis accessible to a wide audience. Doing Bayesian Data Analysis: A Tutorial Introduction with R and BUGS provides an accessible approach to Bayesian data analysis, as material is explained clearly with concrete examples. The book begins with the basics, including essential concepts of probability and random sampling, and gradually progresses to advanced hierarchical modeling methods for realistic data. The text delivers comprehensive coverage of all...
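The book's examples are written in R and BUGS; the core computational idea it teaches, posterior simulation, fits in a few lines in any language. A minimal random-walk Metropolis sketch for a coin-flip posterior (a Python stand-in with made-up data, not code from the book):

```python
import numpy as np

rng = np.random.default_rng(2)
heads, flips = 14, 20                        # hypothetical coin data

def log_post(theta):
    # Uniform prior + binomial likelihood, up to an additive constant
    if not 0.0 < theta < 1.0:
        return -np.inf
    return heads * np.log(theta) + (flips - heads) * np.log(1.0 - theta)

theta, chain = 0.5, []
for _ in range(50_000):
    proposal = theta + rng.normal(0.0, 0.1)  # random-walk proposal
    if np.log(rng.uniform()) < log_post(proposal) - log_post(theta):
        theta = proposal                     # Metropolis accept step
    chain.append(theta)

samples = np.array(chain[5_000:])            # discard burn-in
print(samples.mean(), np.percentile(samples, [2.5, 97.5]))
```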
Type Ia Supernova Light Curve Inference: Hierarchical Bayesian Analysis in the Near Infrared
Mandel, Kaisey S; Friedman, Andrew S; Kirshner, Robert P
2009-01-01
We present a comprehensive statistical analysis of the properties of Type Ia SN light curves in the near infrared using recent data from PAIRITEL and the literature. We construct a hierarchical Bayesian framework, incorporating several uncertainties including photometric error, peculiar velocities, dust extinction and intrinsic variations, for coherent statistical inference. SN Ia light curve inferences are drawn from the global posterior probability of parameters describing both individual supernovae and the population conditioned on the entire SN Ia NIR dataset. The logical structure of the hierarchical Bayesian model is represented by a directed acyclic graph. Fully Bayesian analysis of the model and data is enabled by an efficient MCMC algorithm exploiting the conditional structure using Gibbs sampling. We apply this framework to the JHK_s SN Ia light curve data. A new light curve model captures the observed J-band light curve shape variations. The intrinsic variances in peak absolute magnitudes are: sigm...
Bayesian Analysis of High Dimensional Classification
Mukhopadhyay, Subhadeep; Liang, Faming
2009-12-01
Modern data mining and bioinformatics have presented an important playground for statistical learning techniques, where the number of input variables is possibly much larger than the sample size of the training data. In supervised learning, logistic regression or probit regression can be used to model a binary output and form perceptron classification rules based on Bayesian inference. In these cases, there is a lot of interest in searching for sparse models in the high-dimensional regression (or classification) setup. We first discuss two common challenges for analyzing high-dimensional data. The first is the curse of dimensionality: the complexity of many existing algorithms scales exponentially with the dimensionality of the space, so the algorithms soon become computationally intractable and therefore inapplicable in many real applications. The second is multicollinearity among the predictors, which severely slows down the algorithms. In order to make Bayesian analysis operational in high dimensions we propose a novel Hierarchical Stochastic Approximation Monte Carlo (HSAMC) algorithm, which overcomes the curse of dimensionality and the multicollinearity of predictors in high dimensions, and also possesses a self-adjusting mechanism to avoid local minima separated by high energy barriers. Models and methods are illustrated by simulations inspired by the field of genomics. Numerical results indicate that HSAMC can work as a general model selection sampler in high-dimensional complex model spaces.
The BUGS book: a practical introduction to Bayesian analysis
Lunn, David; Best, Nicky; Thomas, Andrew; Spiegelhalter, David
2012-01-01
Introduction: Probability and Parameters: Probability; Probability distributions; Calculating properties of probability distributions; Monte Carlo integration. Monte Carlo Simulations Using BUGS: Introduction to BUGS; DoodleBUGS; Using BUGS to simulate from distributions; Transformations of random variables; Complex calculations using Monte Carlo; Multivariate Monte Carlo analysis; Predictions with unknown parameters. Introduction to Bayesian Inference: Bayesian learning; Posterior predictive distributions; Conjugate Bayesian inference; Inference about a discrete parameter; Combinations of conjugate analyses; Bayesian and classica...
BEAST: Bayesian evolutionary analysis by sampling trees
Directory of Open Access Journals (Sweden)
Drummond Alexei J
2007-11-01
Background: The evolutionary analysis of molecular sequence variation is a statistical enterprise. This is reflected in the increased use of probabilistic models for phylogenetic inference, multiple sequence alignment, and molecular population genetics. Here we present BEAST: a fast, flexible software architecture for Bayesian analysis of molecular sequences related by an evolutionary tree. A large number of popular stochastic models of sequence evolution are provided, and tree-based models suitable for both within- and between-species sequence data are implemented. Results: BEAST version 1.4.6 consists of 81000 lines of Java source code, 779 classes and 81 packages. It provides models for DNA and protein sequence evolution, highly parametric coalescent analysis, relaxed clock phylogenetics, non-contemporaneous sequence data, statistical alignment and a wide range of options for prior distributions. BEAST source code is object-oriented, modular in design and freely available at http://beast-mcmc.googlecode.com/ under the GNU LGPL license. Conclusion: BEAST is a powerful and flexible evolutionary analysis package for molecular sequence variation. It also provides a resource for the further development of new models and statistical methods of evolutionary analysis.
Bayesian data analysis in population ecology: motivations, methods, and benefits
Dorazio, Robert
2016-01-01
During the 20th century ecologists largely relied on the frequentist system of inference for the analysis of their data. However, in the past few decades ecologists have become increasingly interested in the use of Bayesian methods of data analysis. In this article I provide guidance to ecologists who would like to decide whether Bayesian methods can be used to improve their conclusions and predictions. I begin by providing a concise summary of Bayesian methods of analysis, including a comparison of differences between Bayesian and frequentist approaches to inference when using hierarchical models. Next I provide a list of problems where Bayesian methods of analysis may arguably be preferred over frequentist methods. These problems are usually encountered in analyses based on hierarchical models of data. I describe the essentials required for applying modern methods of Bayesian computation, and I use real-world examples to illustrate these methods. I conclude by summarizing what I perceive to be the main strengths and weaknesses of using Bayesian methods to solve ecological inference problems.
Bayesian analysis for EMP damaged function based on Weibull distribution
International Nuclear Information System (INIS)
The Weibull distribution is one of the most commonly used statistical distributions in EMP vulnerability analysis. In this paper, the EMP damage function of solid state relays, based on the Weibull distribution, was solved by Bayesian computation using the Gibbs sampling algorithm. (authors)
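In the common two-parameter form (an assumption here; the abstract does not spell out the parameterization), the Weibull damage function for a stress level E is

```latex
P_d(E) = 1 - \exp\!\left[-\left(E/\eta\right)^{\beta}\right],
```

with scale η and shape β; the Bayesian treatment places priors on (η, β) and explores the joint posterior by Gibbs sampling.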
Stochastic back analysis of permeability coefficient using generalized Bayesian method
Institute of Scientific and Technical Information of China (English)
Zheng Guilan; Wang Yuan; Wang Fei; Yang Jian
2008-01-01
Owing to the fact that the conventional deterministic back analysis of the permeability coefficient cannot reflect the uncertainties of parameters, including the hydraulic head at the boundary, the permeability coefficient and the measured hydraulic head, a stochastic back analysis taking these parameter uncertainties into consideration was performed using the generalized Bayesian method. Based on the stochastic finite element method (SFEM) for a seepage field, the variable metric algorithm and the generalized Bayesian method, formulas for stochastic back analysis of the permeability coefficient were derived. A case study of seepage analysis of a sluice foundation is presented to illustrate the proposed method. The results indicate that, with the generalized Bayesian method, which considers the uncertainties of the measured hydraulic head, the permeability coefficient and the hydraulic head at the boundary, both the mean and the standard deviation of the permeability coefficient can be obtained, and the standard deviation is less than that obtained by the conventional Bayesian method. Therefore, the present method is valid and applicable.
Bayesian Analysis of the Cosmic Microwave Background
Jewell, Jeffrey
2007-01-01
There is a wealth of cosmological information encoded in the spatial power spectrum of temperature anisotropies of the cosmic microwave background! Experiments designed to map the microwave sky are returning a flood of data (time streams of instrument response as a beam is swept over the sky) at several different frequencies (from 30 to 900 GHz), all with different resolutions and noise properties. The resulting analysis challenge is to estimate, and quantify our uncertainty in, the spatial power spectrum of the cosmic microwave background given the complexities of "missing data", foreground emission, and complicated instrumental noise. Bayesian formulation of this problem allows consistent treatment of many complexities, including complicated instrumental noise and foregrounds, and can be numerically implemented with Gibbs sampling. Gibbs sampling has now been validated as an efficient, statistically exact, and practically useful method for low-resolution analysis (as demonstrated on WMAP 1- and 3-year temperature and polarization data). Development is continuing for Planck: the goal is to exploit the unique capabilities of Gibbs sampling to directly propagate uncertainties in both foreground and instrument models to total uncertainty in cosmological parameters.
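Schematically, the Gibbs scheme referred to here alternates two conditional draws (notation assumed, following the usual presentation of CMB Gibbs sampling):

```latex
s^{(i+1)} \sim p\big(s \mid C_\ell^{(i)}, d\big), \qquad
C_\ell^{(i+1)} \sim p\big(C_\ell \mid s^{(i+1)}\big),
```

where d is the observed map data, s the CMB sky signal, and C_ℓ the angular power spectrum; the chain of (s, C_ℓ) draws converges to samples from the joint posterior.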
Objective Bayesian Analysis of Skew-t Distributions
BRANCO, MARCIA D'ELIA
2012-02-27
We study the Jeffreys prior and its properties for the shape parameter of univariate skew-t distributions with linear and nonlinear Student's t skewing functions. In both cases, we show that the resulting priors for the shape parameter are symmetric around zero and proper. Moreover, we propose a Student's t approximation of the Jeffreys prior that makes an objective Bayesian analysis easy to perform. We carry out a Monte Carlo simulation study that demonstrates an overall better behaviour of the maximum a posteriori estimator compared with the maximum likelihood estimator. We also compare the frequentist coverage of the credible intervals based on the Jeffreys prior and its approximation and show that they are similar. We further discuss location-scale models under scale mixtures of skew-normal distributions and show some conditions for the existence of the posterior distribution and its moments. Finally, we present three numerical examples to illustrate the implications of our results on inference for skew-t distributions.
A Bayesian Analysis of Spectral ARMA Model
Directory of Open Access Journals (Sweden)
Manoel I. Silvestre Bezerra
2012-01-01
Bezerra et al. (2008) proposed a new method, based on Yule-Walker equations, to estimate the ARMA spectral model. In this paper, a Bayesian approach is developed for this model by using the noninformative prior proposed by Jeffreys (1967). The Bayesian computation, simulation via Markov chain Monte Carlo (MCMC), is carried out and characteristics of the marginal posterior distributions, such as the Bayes estimator and confidence interval for the parameters of the ARMA model, are derived. Both methods are also compared with the traditional least squares and maximum likelihood approaches, and a numerical illustration with two examples of the ARMA model is presented to evaluate the performance of the procedures.
Bayesian methods for the design and analysis of noninferiority trials.
Gamalo-Siebers, Margaret; Gao, Aijun; Lakshminarayanan, Mani; Liu, Guanghan; Natanegara, Fanni; Railkar, Radha; Schmidli, Heinz; Song, Guochen
2016-01-01
The gold standard for evaluating treatment efficacy of a medical product is a placebo-controlled trial. However, when the use of placebo is considered to be unethical or impractical, a viable alternative for evaluating treatment efficacy is through a noninferiority (NI) study where a test treatment is compared to an active control treatment. The minimal objective of such a study is to determine whether the test treatment is superior to placebo. An assumption is made that if the active control treatment remains efficacious, as was observed when it was compared against placebo, then a test treatment that has comparable efficacy with the active control, within a certain range, must also be superior to placebo. Because of this assumption, the design, implementation, and analysis of NI trials present challenges for sponsors and regulators. In designing and analyzing NI trials, substantial historical data are often required on the active control treatment and placebo. Bayesian approaches provide a natural framework for synthesizing the historical data in the form of prior distributions that can effectively be used in design and analysis of a NI clinical trial. Despite a flurry of recent research activities in the area of Bayesian approaches in medical product development, there are still substantial gaps in recognition and acceptance of Bayesian approaches in NI trial design and analysis. The Bayesian Scientific Working Group of the Drug Information Association provides a coordinated effort to target the education and implementation issues on Bayesian approaches for NI trials. In this article, we provide a review of both frequentist and Bayesian approaches in NI trials, and elaborate on the implementation for two common Bayesian methods including hierarchical prior method and meta-analytic-predictive approach. Simulations are conducted to investigate the properties of the Bayesian methods, and some real clinical trial examples are presented for illustration.
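The simplest instance of borrowing historical information is conjugate and easy to sketch. A hypothetical Beta-binomial example (made-up counts; the hierarchical-prior and meta-analytic-predictive approaches in the article generalize this by discounting the historical data):

```python
import numpy as np

rng = np.random.default_rng(3)

# Historical active-control data pooled into a Beta prior (hypothetical)
hist_resp, hist_n = 120, 200
prior_a, prior_b = 1 + hist_resp, 1 + hist_n - hist_resp

# Current NI trial arms (hypothetical) and noninferiority margin
test_resp, test_n = 55, 100
ctrl_resp, ctrl_n = 57, 100
margin = 0.10

p_test = rng.beta(1 + test_resp, 1 + test_n - test_resp, size=200_000)
p_ctrl = rng.beta(prior_a + ctrl_resp, prior_b + ctrl_n - ctrl_resp, size=200_000)

# Posterior probability that the test treatment is noninferior
print("P(p_test > p_ctrl - margin | data) =", (p_test > p_ctrl - margin).mean())
```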
Bayesian analysis of Markov point processes
DEFF Research Database (Denmark)
Berthelsen, Kasper Klitgaard; Møller, Jesper
2006-01-01
Recently Møller, Pettitt, Berthelsen and Reeves introduced a new MCMC methodology for drawing samples from a posterior distribution when the likelihood function is only specified up to a normalising constant. We illustrate the method in the setting of Bayesian inference for Markov point processes...
PAC-Bayesian Analysis of Martingales and Multiarmed Bandits
Seldin, Yevgeny; Shawe-Taylor, John; Peters, Jan; Auer, Peter
2011-01-01
We present two alternative ways to apply PAC-Bayesian analysis to sequences of dependent random variables. The first is based on a new lemma that makes it possible to bound expectations of convex functions of certain dependent random variables by expectations of the same functions of independent Bernoulli random variables. This lemma provides an alternative tool to the Hoeffding-Azuma inequality for bounding the concentration of martingale values. Our second approach is based on integration of the Hoeffding-Azuma inequality with PAC-Bayesian analysis. We also introduce a way to apply PAC-Bayesian analysis in situations of limited feedback. We combine the new tools to derive PAC-Bayesian generalization and regret bounds for the multiarmed bandit problem. Although our regret bound is not yet as tight as state-of-the-art regret bounds based on other well-established techniques, our results significantly expand the range of potential applications of PAC-Bayesian analysis and introduce a new analysis tool to reinforcement learning and many ...
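For orientation, one standard i.i.d. PAC-Bayesian bound of the kind being extended here (Maurer's refinement of McAllester's bound, quoted from memory; the paper's martingale versions relax the independence assumption): with probability at least 1 − δ over a sample of size n, simultaneously for all distributions ρ over hypotheses,

```latex
\mathbb{E}_{h \sim \rho}[L(h)] \;\le\; \mathbb{E}_{h \sim \rho}[\hat{L}(h)]
\;+\; \sqrt{\frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln\!\big(2\sqrt{n}/\delta\big)}{2n}},
```

where π is a reference ("prior") distribution fixed before seeing the data.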
MATHEMATICAL RISK ANALYSIS: VIA NICHOLAS RISK MODEL AND BAYESIAN ANALYSIS
Directory of Open Access Journals (Sweden)
Anass BAYAGA
2010-07-01
The objective of this second part of a two-phased study was to explore the predictive power of the quantitative risk analysis (QRA) method and process within a Higher Education Institution (HEI). The method and process investigated the use of impact analysis via the Nicholas risk model and Bayesian analysis, with a sample of one hundred (100) risk analysts in a historically black South African university in the greater Eastern Cape Province. The first finding supported and confirmed previous literature (King III report, 2009; Nicholas and Steyn, 2008; Stoney, 2007; COSA, 2004) that there was a direct relationship between a risk factor, its likelihood and its impact, ceteris paribus. The second finding, in relation to controlling either the likelihood or the impact of occurrence of risk (Nicholas risk model), was that to have a brighter risk reward it was more important to control the likelihood of occurrence of risks than their impact, so as to have a direct effect on the entire university. On the Bayesian analysis, the third finding was that the impact of risk should be predicted along three aspects: the human impact (decisions made), the property impact (student and infrastructure based) and the business impact. Lastly, although business cycles vary considerably depending on the industry and/or the institution, the study revealed that most impacts in the HEI (university) fell within the period of one academic year. The recommendation was that the application of quantitative risk analysis should be related to the current legislative framework that affects HEIs.
Elite Athletes Refine Their Internal Clocks: A Bayesian Analysis.
Chen, Yin-Hua; Verdinelli, Isabella; Cesari, Paola
2016-07-01
This paper carries out a full Bayesian analysis for a data set examined in Chen & Cesari (2015). These data were collected to assess people's ability to evaluate short intervals of time. Chen & Cesari (2015) showed evidence of the existence of two independent internal clocks for evaluating time intervals below and above one second. We reexamine the same question here by performing a complete Bayesian statistical analysis of the data. The Bayesian approach can be used to analyze these data thanks to the specific trial design: data were obtained from evaluations of time ranges by two groups of individuals. More specifically, information gathered from a non-trained group (considered as baseline) allowed us to build a prior distribution for the parameter(s) of interest, and data from the trained group determined the likelihood function. This paper's main goals are (i) showing how the Bayesian inferential method can be used in statistical analyses and (ii) showing that the Bayesian methodology gives additional support to the findings presented in Chen & Cesari (2015) regarding the existence of two internal clocks in assessing the duration of time intervals.
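The design described, in which the baseline group fixes the prior and the trained group supplies the likelihood, is conjugate in its simplest normal-normal form. A sketch with made-up numbers (the paper's actual model and data differ):

```python
import numpy as np

# Baseline (untrained) group summarized as a prior on the mean timing error
prior_mu, prior_sd = 0.12, 0.05                  # made-up baseline summary

# Trained (elite athlete) group supplies the likelihood (made-up data)
y = np.array([0.05, 0.07, 0.04, 0.06, 0.08, 0.05])
sigma = 0.03                                     # known sd assumed for simplicity

# Conjugate normal-normal update for the group mean
prec_prior, prec_lik = 1 / prior_sd**2, len(y) / sigma**2
post_var = 1 / (prec_prior + prec_lik)
post_mu = post_var * (prec_prior * prior_mu + prec_lik * y.mean())
print(f"posterior mean {post_mu:.4f}, sd {np.sqrt(post_var):.4f}")
```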
Analysis of Gumbel Model for Software Reliability Using Bayesian Paradigm
Directory of Open Access Journals (Sweden)
Raj Kumar
2012-12-01
In this paper, we illustrate the suitability of the Gumbel model for software reliability data. The model parameters are estimated using likelihood-based inferential procedures: classical as well as Bayesian. The quasi Newton-Raphson algorithm is applied to obtain the maximum likelihood estimates and associated probability intervals. The Bayesian estimates of the parameters of the Gumbel model are obtained using Markov chain Monte Carlo (MCMC) simulation in OpenBUGS (established software for Bayesian analysis using MCMC methods). R functions are developed to study the statistical properties of the model, to provide model validation and comparison tools, and to analyse the output of the MCMC samples generated from OpenBUGS. Details of applying MCMC to parameter estimation for the Gumbel model are elaborated, and a real software reliability data set is considered to illustrate the methods of inference discussed in this paper.
Nested sampling applied in Bayesian room-acoustics decay analysis.
Jasa, Tomislav; Xiang, Ning
2012-11-01
Room-acoustic energy decays often exhibit single-rate or multiple-rate characteristics in a wide variety of rooms/halls. Both the energy decay order and decay parameter estimation are of practical significance in architectural acoustics applications, representing two different levels of Bayesian probabilistic inference. This paper discusses a model-based sound energy decay analysis within a Bayesian framework utilizing the nested sampling algorithm. The nested sampling algorithm is specifically developed to evaluate the Bayesian evidence required for determining the energy decay order with decay parameter estimates as a secondary result. Taking the energy decay analysis in architectural acoustics as an example, this paper demonstrates that two different levels of inference, decay model-selection and decay parameter estimation, can be cohesively accomplished by the nested sampling algorithm.
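Nested sampling itself is compact enough to sketch. A toy version for a one-parameter Gaussian likelihood with a uniform prior (new prior draws are obtained by brute-force rejection above the current likelihood threshold, which only works for such toy problems; real implementations use constrained MCMC or slice sampling):

```python
import numpy as np

rng = np.random.default_rng(4)
data = rng.normal(1.0, 1.0, size=20)       # synthetic observations

def loglike(theta):
    return -0.5 * np.sum((data - theta) ** 2)

def prior_draw(n):
    return rng.uniform(-5.0, 5.0, size=n)  # uniform prior on theta

nlive, niter = 200, 1500
live = prior_draw(nlive)
live_ll = np.array([loglike(t) for t in live])
logZ, logX = -np.inf, 0.0                  # evidence and prior-volume trackers
for _ in range(niter):
    worst = live_ll.argmin()
    # weight of the discarded shell: X_{i-1} * (1 - exp(-1/nlive))
    logw = logX + np.log1p(-np.exp(-1.0 / nlive))
    logZ = np.logaddexp(logZ, live_ll[worst] + logw)
    logX -= 1.0 / nlive                    # deterministic volume shrinkage
    while True:                            # replace the worst live point
        t = prior_draw(1)[0]
        if loglike(t) > live_ll[worst]:
            live[worst], live_ll[worst] = t, loglike(t)
            break

# add the contribution of the remaining live points
m = live_ll.max()
logZ = np.logaddexp(logZ, logX + m + np.log(np.exp(live_ll - m).mean()))
print("log-evidence estimate:", logZ)
```

Running this for two competing models (e.g., single-rate vs. multiple-rate decay) and comparing the resulting evidences is exactly the model-selection step the paper performs.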
Phycas: software for Bayesian phylogenetic analysis.
Lewis, Paul O; Holder, Mark T; Swofford, David L
2015-05-01
Phycas is open source, freely available Bayesian phylogenetics software written primarily in C++ but with a Python interface. Phycas specializes in Bayesian model selection for nucleotide sequence data, particularly the estimation of marginal likelihoods, central to computing Bayes Factors. Marginal likelihoods can be estimated using newer methods (Thermodynamic Integration and Generalized Steppingstone) that are more accurate than the widely used Harmonic Mean estimator. In addition, Phycas supports two posterior predictive approaches to model selection: Gelfand-Ghosh and Conditional Predictive Ordinates. The General Time Reversible family of substitution models, as well as a codon model, are available, and data can be partitioned with all parameters unlinked except tree topology and edge lengths. Phycas provides for analyses in which the prior on tree topologies allows polytomous trees as well as fully resolved trees, and provides for several choices for edge length priors, including a hierarchical model as well as the recently described compound Dirichlet prior, which helps avoid overly informative induced priors on tree length.
Bayesian networks as a tool for epidemiological systems analysis
Lewis, F. I.
2012-11-01
Bayesian network analysis is a form of probabilistic modeling which derives from empirical data a directed acyclic graph (DAG) describing the dependency structure between random variables. Bayesian networks are increasingly finding application in areas such as computational and systems biology, and more recently in epidemiological analyses. The key distinction between standard empirical modeling approaches, such as generalised linear modeling, and Bayesian network analyses is that the latter attempts not only to identify statistically associated variables, but to additionally, and empirically, separate these into those directly and indirectly dependent with one or more outcome variables. Such discrimination is vastly more ambitious but has the potential to reveal far more about key features of complex disease systems. Applying Bayesian network modeling to biological and medical data has considerable computational demands, combined with the need to ensure robust model selection given the vast model space of possible DAGs. These challenges require the use of approximation techniques, such as the Laplace approximation, Markov chain Monte Carlo simulation and parametric bootstrapping, along with computational parallelization. A case study in structure discovery - identification of an optimal DAG for given data - is presented which uses additive Bayesian networks to explore veterinary disease data of industrial and medical relevance.
On Bayesian analysis of on-off measurements
Nosek, Dalibor
2016-01-01
We propose an analytical solution to the on-off problem within the framework of Bayesian statistics. Both the statistical significance for the discovery of new phenomena and credible intervals on model parameters are presented in a consistent way. We use a large enough family of prior distributions of relevant parameters. The proposed analysis is designed to provide Bayesian solutions that can be used for any number of observed on-off events, including zero. The procedure is checked using Monte Carlo simulations. The usefulness of the method is demonstrated on examples from gamma-ray astronomy.
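For a concrete feel of the on-off setup: n_on counts are observed on-source (signal plus scaled background) and n_off off-source (background only). A grid-posterior sketch with flat priors (the paper works analytically with a larger family of priors, which this sketch does not reproduce):

```python
import numpy as np
from scipy.stats import poisson

# Hypothetical on/off observation; alpha = on/off exposure ratio
n_on, n_off, alpha = 12, 20, 0.5

# Grid over signal rate s and background rate b, flat priors on both
s = np.linspace(0.0, 30.0, 301)[:, None]
b = np.linspace(0.01, 30.0, 300)[None, :]
like = poisson.pmf(n_on, s + alpha * b) * poisson.pmf(n_off, b)

# Marginalize the background and normalize the signal posterior
post_s = like.sum(axis=1)
ds = s[1, 0] - s[0, 0]
post_s /= post_s.sum() * ds

# 90% equal-tail credible interval for the signal rate s
cdf = np.cumsum(post_s) * ds
lo = s[np.searchsorted(cdf, 0.05), 0]
hi = s[np.searchsorted(cdf, 0.95), 0]
print(f"90% credible interval for s: [{lo:.2f}, {hi:.2f}]")
```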
Bayesian analysis of log Gaussian Cox processes for disease mapping
DEFF Research Database (Denmark)
Benes, Viktor; Bodlák, Karel; Møller, Jesper;
…of the risk on the covariates. Instead of using the common area-level approaches we consider a Bayesian analysis for a log Gaussian Cox point process with covariates. Posterior characteristics for a discretized version of the log Gaussian Cox process are computed using Markov chain Monte Carlo methods...
Analysis of Technology for Compact Coherent Lidar
Amzajerdian, Farzin
1997-01-01
Recent advances in the area of solid-state and semiconductor lasers have created new possibilities for the development of compact and reliable coherent lidars for a wide range of applications. These applications include: Automated Rendezvous and Capture, wind shear and clear air turbulence detection, aircraft wake vortex detection, and automobile collision avoidance. The work performed by the UAH personnel under this Delivery Order concentrated on the design and analysis of a compact coherent lidar system capable of measuring range and velocity of hard targets, and providing air mass velocity data. The scope of this work was as follows. a. Investigate various laser sources and optical signal detection configurations in support of a compact and lightweight coherent laser radar to be developed for precision range and velocity measurements of hard and fuzzy targets. Through interaction with MSFC engineers, the most suitable laser source and signal detection technique that can provide a reliable, compact and lightweight laser radar design will be selected. b. Analyze and specify the coherent laser radar system configuration and assist with its optical and electronic design efforts. Develop a system design including its optical layout. Specify all optical components and provide the general requirements of the electronic subsystems, including laser beam modulator and demodulator drivers, detector electronic interface, and the signal processor. c. Perform a thorough performance analysis to predict the system measurement range and accuracy. This analysis will utilize various coherent laser radar sensitivity formulations and different target models.
Bayesian analysis of genetic differentiation between populations.
Corander, Jukka; Waldmann, Patrik; Sillanpää, Mikko J
2003-01-01
We introduce a Bayesian method for estimating hidden population substructure using multilocus molecular markers and geographical information provided by the sampling design. The joint posterior distribution of the substructure and allele frequencies of the respective populations is available in an analytical form when the number of populations is small, whereas an approximation based on a Markov chain Monte Carlo simulation approach can be obtained for a moderate or large number of populations. Using the joint posterior distribution, posteriors can also be derived for any evolutionary population parameters, such as the traditional fixation indices. A major advantage compared to most earlier methods is that the number of populations is treated here as an unknown parameter. What is traditionally considered as two genetically distinct populations, either recently founded or connected by considerable gene flow, is here considered as one panmictic population with a certain probability based on marker data and prior information. Analyses of previously published data on the Moroccan argan tree (Argania spinosa) and of simulated data sets suggest that our method is capable of estimating a population substructure, while not artificially enforcing a substructure when it does not exist. The software (BAPS) used for the computations is freely available from http://www.rni.helsinki.fi/~mjs.
Coherent analysis of quantum optical sideband modes
Huntington, E H; Robilliard, C; Ralph, T C
2005-01-01
We demonstrate a device that allows for the coherent analysis of a pair of optical frequency sidebands in an arbitrary basis. We show that our device is quantum noise limited and hence applications for this scheme may be found in discrete and continuous variable optical quantum information experiments.
Ildikó Ungvári; Gábor Hullám; Péter Antal; Petra Sz Kiszel; András Gézsi; Éva Hadadi; Viktor Virág; Gergely Hajós; András Millinghoffer; Adrienne Nagy; András Kiss; Semsei, Ágnes F.; Gergely Temesi; Béla Melegh; Péter Kisfali
2012-01-01
Genetic studies indicate a high number of potential factors related to asthma. Based on earlier linkage analyses we selected the 11q13 and 14q22 asthma susceptibility regions, for which we designed a partial genome screening study using 145 SNPs in 1201 individuals (436 asthmatic children and 765 controls). The results were evaluated with traditional frequentist methods and we applied a new statistical method, called Bayesian network based Bayesian multilevel analysis of relevance (BN-BMLA). Th...
Phase Coherence Analysis of Insect Flight
Shchekinova, E Y
2009-01-01
During insect flight a high-frequency wing oscillatory motion is generated. Here we present an analysis of kinematic data of tethered Drosophila melanogaster flight. For reliable detection of specific regimes of frequency variation we propose a phase coherence method. The approach is generic and can be applied to nonstationary biological signals that feature multi-scale frequency variations. Our analysis reveals the existence of distinct oscillatory modes.
Bayesian phylogeny analysis via stochastic approximation Monte Carlo
Cheon, Sooyoung
2009-11-01
Monte Carlo methods have received much attention in the recent literature of phylogeny analysis. However, the conventional Markov chain Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, tend to get trapped in a local mode when simulating from the posterior distribution of phylogenetic trees, rendering the inference ineffective. In this paper, we apply an advanced Monte Carlo algorithm, the stochastic approximation Monte Carlo (SAMC) algorithm, to Bayesian phylogeny analysis. Our method is compared with two popular Bayesian phylogeny software packages, BAMBE and MrBayes, on simulated and real datasets. The numerical results indicate that our method outperforms BAMBE and MrBayes. Among the three methods, SAMC produces the consensus trees which have the highest similarity to the true trees, and the model parameter estimates which have the smallest mean square errors, but costs the least CPU time.
REMITTANCES, DUTCH DISEASE, AND COMPETITIVENESS: A BAYESIAN ANALYSIS
FARID MAKHLOUF; MAZHAR MUGHAL
2013-01-01
The paper studies symptoms of Dutch disease in the Pakistani economy arising from international remittances. An IV Bayesian analysis is carried out to take care of the endogeneity and the uncertainty due to the managed float of the Pakistani Rupee. We find evidence for both spending and resource movement effects in both the short and the long run. These impacts are stronger than, and different from, those the Official Development Assistance and the FDI exert. We find that while aggregate remittances and the...
Optimizing Nuclear Reaction Analysis (NRA) using Bayesian Experimental Design
von Toussaint, U.; Schwarz-Selinger, T.; Gori, S.
2008-01-01
Nuclear Reaction Analysis with ${}^{3}$He holds promise for measuring deuterium depth profiles up to large depths. However, the extraction of the depth profile from the measured data is an ill-posed inversion problem. Here we demonstrate how Bayesian Experimental Design can be used to optimize the number of measurements as well as the measurement energies to maximize the information gain. Comparison of the inversion properties of the optimized design with standard settings reveals huge possi...
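The underlying idea can be shown in a few lines for the conjugate Gaussian case, where the expected information gain (EIG) of a measurement is analytic. The sketch below is ours, not the authors' NRA code; the design-dependent noise model and the candidate energies are hypothetical:

```python
# Sketch: pick the measurement design (here a hypothetical beam energy E)
# that maximizes the expected information gain about a scalar unknown.
import numpy as np

sigma_prior = 1.0                        # prior std of the unknown quantity

def noise_std(E):
    # Hypothetical design-dependent noise: deeper-probing (higher-energy)
    # measurements are assumed noisier in this toy model.
    return 0.2 + 0.05 * E

def expected_information_gain(E):
    # For y = theta + eps, eps ~ N(0, s^2): EIG = 0.5*log(1 + sigma0^2/s^2)
    s = noise_std(E)
    return 0.5 * np.log(1.0 + sigma_prior ** 2 / s ** 2)

energies = np.linspace(0.5, 5.0, 10)     # candidate measurement energies (MeV)
gains = [expected_information_gain(E) for E in energies]
print("EIG per candidate energy:", np.round(gains, 3))
print("most informative energy:", energies[int(np.argmax(gains))])
```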
Bayesian-network-based safety risk analysis in construction projects
International Nuclear Information System (INIS)
This paper presents a systemic decision support approach for safety risk analysis under uncertainty in tunnel construction. A fuzzy Bayesian network (FBN) is used to investigate causal relationships between tunnel-induced damage and its influential variables based upon the risk/hazard mechanism analysis. Aiming to overcome limitations of current probability estimation, an expert confidence indicator is proposed to ensure the reliability of the surveyed data for fuzzy probability assessment of basic risk factors. A detailed fuzzy-based inference procedure is developed, which is capable of deductive reasoning, sensitivity analysis and abductive reasoning. The “3σ criterion” is adopted to calculate the characteristic values of a triangular fuzzy number in the probability fuzzification process, and the α-weighted valuation method is adopted for defuzzification. The construction safety analysis process is extended to the entire life cycle of risk-prone events, including pre-accident, during-construction and post-accident control. A typical hazard concerning tunnel leakage in the construction of the Wuhan Yangtze Metro Tunnel in China is presented as a case study in order to verify the applicability of the proposed approach. The results demonstrate the feasibility of the proposed approach and its application potential. A comparison of the advantages and disadvantages of FBN and fuzzy fault tree analysis (FFTA) as risk analysis tools is also conducted. The proposed approach can be used to provide guidelines for safety analysis and management in construction projects, and thus increase the likelihood of a successful project in a complex environment. - Highlights: • A systemic Bayesian network based approach for safety risk analysis is developed. • An expert confidence indicator for probability fuzzification is proposed. • The safety risk analysis process is extended to the entire life cycle of risk-prone events. • A typical
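As a rough illustration of the fuzzification and defuzzification steps named above (our reading, with hypothetical survey numbers; the paper's exact valuation formula may differ), a triangular fuzzy probability built with the 3σ criterion can be defuzzified by α-weighted valuation, which for a triangle (a, m, b) reduces to (a + 4m + b)/6:

```python
# Sketch: triangular fuzzy probability from surveyed expert estimates via
# the "3-sigma criterion" (support mu +/- 3*sigma, mode mu), defuzzified by
# alpha-weighted valuation: the integral of alpha * midpoint(alpha-cut)
# over alpha in [0, 1], normalized by the integral of alpha.
import numpy as np

expert_probs = np.array([0.10, 0.14, 0.12, 0.18, 0.11])  # hypothetical survey

mu, sigma = expert_probs.mean(), expert_probs.std(ddof=1)
a, m, b = mu - 3 * sigma, mu, mu + 3 * sigma   # 3-sigma criterion
# (with these numbers the support stays inside [0, 1]; clip in general)

crisp = (a + 4 * m + b) / 6.0                  # alpha-weighted valuation
print("triangular fuzzy number:", (round(a, 3), round(m, 3), round(b, 3)))
print("defuzzified probability:", round(crisp, 3))
```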
Estimation of time delay by coherence analysis
Govindan, R B; Kopper, F; Claussen, J C; Deuschl, G
2004-01-01
Using coherence analysis (an extensively used method to study correlations in the frequency domain between two simultaneously measured signals) we estimate the time delay between two signals. This method is suitable for time delay estimation of narrow-band coherent signals, for which the conventional methods cannot be reliably applied. We show, by analysing coupled Rössler attractors with a known delay, that the method yields satisfactory results. We then apply this method to human pathologic tremor. The delay between simultaneously measured traces of electroencephalogram (EEG) and electromyogram (EMG) data of subjects with essential hand tremor is calculated. We find a delay of 11-27 milliseconds (ms) between the tremor-correlated parts (cortex) of the brain (EEG) and the trembling hand (EMG), in agreement with the experimentally observed value of 15 ms for the cortico-muscular conduction time. By surrogate analysis we calculate error bars for the estimated delay.
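A minimal sketch of this kind of estimator (our construction, with synthetic band-limited signals standing in for EEG/EMG): for y(t) = x(t - d) plus noise, the cross-spectral phase is -2*pi*f*d, so a straight-line fit of phase against frequency over the coherent band recovers the delay:

```python
# Delay estimation from the slope of the cross-spectral phase, restricted
# to frequencies where the magnitude-squared coherence is high.
import numpy as np
from scipy.signal import butter, filtfilt, csd, coherence

fs, d_true = 1000.0, 0.020                     # 1 kHz sampling, 20 ms delay
rng = np.random.default_rng(1)
n = int(60 * fs)
b, a = butter(4, [4, 12], btype='band', fs=fs) # band-limited source, 4-12 Hz
s = filtfilt(b, a, rng.normal(size=n))
s /= s.std()
x = s + 0.5 * rng.normal(size=n)                             # "EEG-like" trace
y = np.roll(s, int(d_true * fs)) + 0.5 * rng.normal(size=n)  # delayed "EMG"

f, Pxy = csd(x, y, fs=fs, nperseg=4096)
_, Cxy = coherence(x, y, fs=fs, nperseg=4096)
band = (f > 4) & (f < 12) & (Cxy > 0.5)        # fit only coherent frequencies
phase = np.unwrap(np.angle(Pxy[band]))
slope = np.polyfit(f[band], phase, 1)[0]       # phase(f) = -2*pi*f*d
print("estimated delay: %.1f ms" % (-slope / (2 * np.pi) * 1e3))
```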
BaTMAn: Bayesian Technique for Multi-image Analysis
Casado, J; García-Benito, R; Guidi, G; Choudhury, O S; Bellocchi, E; Sánchez, S; Díaz, A I
2016-01-01
This paper describes the Bayesian Technique for Multi-image Analysis (BaTMAn), a novel image segmentation technique based on Bayesian statistics, whose main purpose is to characterize an astronomical dataset containing spatial information and perform a tessellation based on the measurements and errors provided as input. The algorithm will iteratively merge spatial elements as long as they are statistically consistent with carrying the same information (i.e. signal compatible with being identical within the errors). We illustrate its operation and performance with a set of test cases that comprises both synthetic and real Integral-Field Spectroscopic (IFS) data. Our results show that the segmentations obtained by BaTMAn adapt to the underlying structure of the data, regardless of the precise details of their morphology and the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in those regions where the signal is actually con...
Bayesian tomography and integrated data analysis in fusion diagnostics
Li, Dong; Dong, Y. B.; Deng, Wei; Shi, Z. B.; Fu, B. Z.; Gao, J. M.; Wang, T. B.; Zhou, Yan; Liu, Yi; Yang, Q. W.; Duan, X. R.
2016-11-01
In this article, a Bayesian tomography method using a non-stationary Gaussian process prior is introduced. The Bayesian formalism allows quantities which bear uncertainty to be expressed in probabilistic form, so that the uncertainty of a final solution can be fully resolved from the confidence interval of a posterior probability. Moreover, a consistency check of that solution can be performed by checking whether the misfits between predicted and measured data are reasonably within the assumed data error. In particular, the accuracy of reconstructions is significantly improved by using the non-stationary Gaussian process, which can adapt to the varying smoothness of the emission distribution. The method has been implemented for a soft X-ray diagnostic on HL-2A and used to explore relevant physics in equilibrium and MHD instability modes. This project is carried out within a large-size inference framework, aiming at an integrated analysis of heterogeneous diagnostics.
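A drastically simplified linear-Gaussian sketch of GP-prior tomography may clarify the setup (stationary kernel and toy chord geometry assumed; the paper's contribution is precisely relaxing stationarity):

```python
# Sketch: emission profile e on a grid, line-integrated data d = G e + noise,
# Gaussian process prior on e => Gaussian posterior in closed form.
import numpy as np

rng = np.random.default_rng(2)
ngrid, ndata = 50, 12
xs = np.linspace(0, 1, ngrid)
true = np.exp(-0.5 * ((xs - 0.5) / 0.1) ** 2)       # toy emission profile

G = (rng.uniform(size=(ndata, ngrid)) < 0.3).astype(float) / ngrid  # toy chords
sigma_d = 0.002
d = G @ true + sigma_d * rng.normal(size=ndata)

ell, amp = 0.1, 1.0                                  # GP hyperparameters
K = amp * np.exp(-0.5 * (xs[:, None] - xs[None, :]) ** 2 / ell ** 2)
S = G @ K @ G.T + sigma_d ** 2 * np.eye(ndata)       # data covariance
post_mean = K @ G.T @ np.linalg.solve(S, d)
post_cov = K - K @ G.T @ np.linalg.solve(S, G @ K)   # pointwise uncertainty

# Consistency check in the spirit of the abstract: misfits vs. data error.
misfit = d - G @ post_mean
print("rms misfit / sigma_d:", np.sqrt(np.mean(misfit ** 2)) / sigma_d)
```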
Bayesian networks for omics data analysis
Gavai, A.K.
2009-01-01
This thesis focuses on two aspects of high throughput technologies, i.e. data storage and data analysis, in particular in transcriptomics and metabolomics. Both technologies are part of a research field that is generally called ‘omics’ (or ‘-omics’, with a leading hyphen), which refers to genomics,
Coherent Energy and Environmental System Analysis
DEFF Research Database (Denmark)
Hvelplund, Frede Kloster; Mathiesen, Brian Vad; Østergaard, Poul Alberg;
This report presents a summary of results of the strategic research project “Coherent Energy and Environmental System Analysis” (CEESA), which was conducted in the period 2007-2011 and funded by the Danish Strategic Research Council together with the participating parties. The project developed energy and environmental analysis tools as well as analyses of the design and implementation of future renewable energy systems. For practical reasons, the work has been carried out as an interaction between five work packages, and a number of reports, papers and tools have been reported separately from the project. This report summarizes the results of the different project parts in a coherent way by presenting tools and methodologies as well as analyses of the design and implementation of renewable energy systems, including both energy and environmental aspects.
Nonparametric Bayesian Negative Binomial Factor Analysis
Zhou, Mingyuan
2016-01-01
A common approach to analyze an attribute-instance count matrix, an element of which represents how many times an attribute appears in an instance, is to factorize it under the Poisson likelihood. We show its limitation in capturing the tendency for an attribute present in an instance to both repeat itself and excite related ones. To address this limitation, we construct negative binomial factor analysis (NBFA) to factorize the matrix under the negative binomial likelihood, and relate it to a...
Bayesian analysis for extreme climatic events: A review
Chu, Pao-Shin; Zhao, Xin
2011-11-01
This article reviews Bayesian analysis methods applied to extreme climatic data. We particularly focus on applications to three different problems related to extreme climatic events: detection of abrupt regime shifts, clustering of tropical cyclone tracks, and statistical forecasting of seasonal tropical cyclone activity. For identifying potential change points in an extreme event count series, a hierarchical Bayesian framework involving three layers - data, parameter, and hypothesis - is formulated to compute the posterior probability of shifts throughout time. For the data layer, a Poisson process with a gamma-distributed rate is presumed. For the hypothesis layer, multiple candidate hypotheses with different change points are considered. To calculate the posterior probability of each hypothesis and its associated parameters we developed an exact analytical formula, a Markov chain Monte Carlo (MCMC) algorithm, and a more sophisticated reversible jump Markov chain Monte Carlo (RJMCMC) algorithm. The algorithms are applied to several rare event series: the annual tropical cyclone or typhoon counts over the central, eastern, and western North Pacific; the annual extremely heavy rainfall event counts at Manoa, Hawaii; and the annual heat wave frequency in France. Using an Expectation-Maximization (EM) algorithm, a Bayesian clustering method built on a Gaussian mixture model is applied to objectively classify historical, spaghetti-like tropical cyclone tracks (1945-2007) over the western North Pacific and the South China Sea into eight distinct track types. A regression-based approach to forecasting seasonal tropical cyclone frequency in a region is also developed. Specifically, by adopting large-scale environmental conditions prior to the tropical cyclone season, a Poisson regression model is built for predicting seasonal tropical cyclone counts, and a probit regression model is alternatively developed for a binary classification problem. With a non
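The data layer described above (Poisson counts with a gamma-distributed rate) admits an exact marginal likelihood per segment, so a single-change-point posterior can be computed in closed form. The following sketch is our paraphrase with simulated counts, not the authors' code:

```python
# Exact posterior over a single change point for Poisson counts with a
# Gamma(a, b) prior on each segment's rate and a uniform prior on the
# change-point location.
import numpy as np
from scipy.special import gammaln

def log_segment_evidence(counts, a=1.0, b=1.0):
    # log integral of the Poisson likelihood against a Gamma(a, b) prior;
    # count-factorial terms are omitted (common to all hypotheses).
    s, L = counts.sum(), counts.size
    return a * np.log(b) - gammaln(a) + gammaln(a + s) - (a + s) * np.log(b + L)

rng = np.random.default_rng(3)
counts = np.concatenate([rng.poisson(2.0, 30), rng.poisson(5.0, 20)])  # shift at 30

T = counts.size
log_post = np.array([log_segment_evidence(counts[:tau]) +
                     log_segment_evidence(counts[tau:])
                     for tau in range(1, T)])
log_post -= log_post.max()
post = np.exp(log_post) / np.exp(log_post).sum()
print("MAP change point:", 1 + int(np.argmax(post)))
```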
Bayesian analysis to detect abrupt changes in extreme hydrological processes
Jo, Seongil; Kim, Gwangsu; Jeon, Jong-June
2016-07-01
In this study, we develop a new method for Bayesian change point analysis. The proposed method is easy to implement and can be extended to a wide class of distributions. Using a generalized extreme-value distribution, we investigate the annual maxima of precipitation observed at stations on the South Korean Peninsula, and find significant changes at the considered sites. We evaluate the hydrological risk in predictions using the estimated return levels. In addition, we explain how misspecification of the probability model can lead to a bias in the number of change points and, using a simple example, show that this problem is difficult to avoid by technical data transformation.
A Bayesian analysis of pentaquark signals from CLAS data
Ireland, D G; Protopopescu, D; Ambrozewicz, P; Anghinolfi, M; Asryan, G; Avakian, H; Bagdasaryan, H; Baillie, N; Ball, J P; Baltzell, N A; Batourine, V; Battaglieri, M; Bedlinskiy, I; Bellis, M; Benmouna, N; Berman, B L; Biselli, A S; Blaszczyk, L; Bouchigny, S; Boiarinov, S; Bradford, R; Branford, D; Briscoe, W J; Brooks, W K; Burkert, V D; Butuceanu, C; Calarco, J R; Careccia, S L; Carman, D S; Casey, L; Chen, S; Cheng, L; Cole, P L; Collins, P; Coltharp, P; Crabb, D; Credé, V; Dashyan, N; De Masi, R; De Vita, R; De Sanctis, E; Degtyarenko, P V; Deur, A; Dickson, R; Djalali, C; Dodge, G E; Donnelly, J; Doughty, D; Dugger, M; Dzyubak, O P; Egiyan, H; Egiyan, K S; El Fassi, L; Elouadrhiri, L; Eugenio, P; Fedotov, G; Feldman, G; Fradi, A; Funsten, H; Garçon, M; Gavalian, G; Gevorgyan, N; Gilfoyle, G P; Giovanetti, K L; Girod, F X; Goetz, J T; Gohn, W; Gonenc, A; Gothe, R W; Griffioen, K A; Guidal, M; Guler, N; Guo, L; Gyurjyan, V; Hafidi, K; Hakobyan, H; Hanretty, C; Hassall, N; Hersman, F W; Hleiqawi, I; Holtrop, M; Hyde-Wright, C E; Ilieva, Y; Ishkhanov, B S; Isupov, E L; Jenkins, D; Jo, H S; Johnstone, J R; Joo, K; Jüngst, H G; Kalantarians, N; Kellie, J D; Khandaker, M; Kim, W; Klein, A; Klein, F J; Kossov, M; Krahn, Z; Kramer, L H; Kubarovski, V; Kühn, J; Kuleshov, S V; Kuznetsov, V; Lachniet, J; Laget, J M; Langheinrich, J; Lawrence, D; Livingston, K; Lu, H Y; MacCormick, M; Markov, N; Mattione, P; Mecking, B A; Mestayer, M D; Meyer, C A; Mibe, T; Mikhailov, K; Mirazita, M; Miskimen, R; Mokeev, V; Moreno, B; Moriya, K; Morrow, S A; Moteabbed, M; Munevar, E; Mutchler, G S; Nadel-Turonski, P; Nasseripour, R; Niccolai, S; Niculescu, G; Niculescu, I; Niczyporuk, B B; Niroula, M R; Niyazov, R A; Nozar, M; Osipenko, M; Ostrovidov, A I; Park, K; Pasyuk, E; Paterson, C; Anefalos Pereira, S; Pierce, J; Pivnyuk, N; Pogorelko, O; Pozdniakov, S; Price, J W; Procureur, S; Prok, Y; Raue, B A; Ricco, G; Ripani, M; Ritchie, B G; Ronchetti, F; Rosner, G; Rossi, P; Sabatie, F; Salamanca, J; Salgado, C; Santoro, J P; Sapunenko, V; Schumacher, R A; Serov, V S; Sharabyan, Yu G; Sharov, D; Shvedunov, N V; Smith, E S; Smith, L C; Sober, D I; Sokhan, D; Stavinsky, A; Stepanyan, S S; Stepanyan, S; Stokes, B E; Stoler, P; Strauch, S; Taiuti, M; Tedeschi, D J; Thoma, U; Tkabladze, A; Tkachenko, S; Tur, C; Ungaro, M; Vineyard, M F; Vlassov, A V; Watts, D P; Weinstein, L B; Weygand, D P; Williams, M; Wolin, E; Wood, M H; Yegneswaran, A; Zana, L; Zhang, J; Zhao, B; Zhao, Z W
2007-01-01
We examine the results of two measurements by the CLAS collaboration, one of which claimed evidence for a $\Theta^{+}$ pentaquark, whilst the other found no such evidence. The unique feature of these two experiments was that they were performed with the same experimental setup. Using a Bayesian analysis we find that the results of the two experiments are in fact compatible with each other, but that the first measurement did not contain sufficient information to determine unambiguously the existence of a $\Theta^{+}$. Further, we suggest a means by which the existence of a new candidate particle can be tested in a rigorous manner.
A Bayesian analysis of pentaquark signals from CLAS data
Energy Technology Data Exchange (ETDEWEB)
David Ireland; Bryan McKinnon; Dan Protopopescu; Pawel Ambrozewicz; Marco Anghinolfi; G. Asryan; Harutyun Avakian; H. Bagdasaryan; Nathan Baillie; Jacques Ball; Nathan Baltzell; V. Batourine; Marco Battaglieri; Ivan Bedlinski; Ivan Bedlinskiy; Matthew Bellis; Nawal Benmouna; Barry Berman; Angela Biselli; Lukasz Blaszczyk; Sylvain Bouchigny; Sergey Boyarinov; Robert Bradford; Derek Branford; William Briscoe; William Brooks; Volker Burkert; Cornel Butuceanu; John Calarco; Sharon Careccia; Daniel Carman; Liam Casey; Shifeng Chen; Lu Cheng; Philip Cole; Patrick Collins; Philip Coltharp; Donald Crabb; Volker Crede; Natalya Dashyan; Rita De Masi; Raffaella De Vita; Enzo De Sanctis; Pavel Degtiarenko; Alexandre Deur; Richard Dickson; Chaden Djalali; Gail Dodge; Joseph Donnelly; David Doughty; Michael Dugger; Oleksandr Dzyubak; Hovanes Egiyan; Kim Egiyan; Lamiaa Elfassi; Latifa Elouadrhiri; Paul Eugenio; Gleb Fedotov; Gerald Feldman; Ahmed Fradi; Herbert Funsten; Michel Garcon; Gagik Gavalian; Nerses Gevorgyan; Gerard Gilfoyle; Kevin Giovanetti; Francois-Xavier Girod; John Goetz; Wesley Gohn; Atilla Gonenc; Ralf Gothe; Keith Griffioen; Michel Guidal; Nevzat Guler; Lei Guo; Vardan Gyurjyan; Kawtar Hafidi; Hayk Hakobyan; Charles Hanretty; Neil Hassall; F. Hersman; Ishaq Hleiqawi; Maurik Holtrop; Charles Hyde; Yordanka Ilieva; Boris Ishkhanov; Eugeny Isupov; D. Jenkins; Hyon-Suk Jo; John Johnstone; Kyungseon Joo; Henry Juengst; Narbe Kalantarians; James Kellie; Mahbubul Khandaker; Wooyoung Kim; Andreas Klein; Franz Klein; Mikhail Kossov; Zebulun Krahn; Laird Kramer; Valery Kubarovsky; Joachim Kuhn; Sergey Kuleshov; Viacheslav Kuznetsov; Jeff Lachniet; Jean Laget; Jorn Langheinrich; D. Lawrence; Kenneth Livingston; Haiyun Lu; Marion MacCormick; Nikolai Markov; Paul Mattione; Bernhard Mecking; Mac Mestayer; Curtis Meyer; Tsutomu Mibe; Konstantin Mikhaylov; Marco Mirazita; Rory Miskimen; Viktor Mokeev; Brahim Moreno; Kei Moriya; Steven Morrow; Maryam Moteabbed; Edwin Munevar Espitia; Gordon Mutchler; Pawel Nadel-Turonski; Rakhsha Nasseripour; Silvia Niccolai; Gabriel Niculescu; Maria-Ioana Niculescu; Bogdan Niczyporuk; Megh Niroula; Rustam Niyazov; Mina Nozar; Mikhail Osipenko; Alexander Ostrovidov; Kijun Park; Evgueni Pasyuk; Craig Paterson; Sergio Pereira; Joshua Pierce; Nikolay Pivnyuk; Oleg Pogorelko; Sergey Pozdnyakov; John Price; Sebastien Procureur; Yelena Prok; Brian Raue; Giovanni Ricco; Marco Ripani; Barry Ritchie; Federico Ronchetti; Guenther Rosner; Patrizia Rossi; Franck Sabatie; Julian Salamanca; Carlos Salgado; Joseph Santoro; Vladimir Sapunenko; Reinhard Schumacher; Vladimir Serov; Youri Sharabian; Dmitri Sharov; Nikolay Shvedunov; Elton Smith; Lee Smith; Daniel Sober; Daria Sokhan; Aleksey Stavinskiy; Samuel Stepanyan; Stepan Stepanyan; Burnham Stokes; Paul Stoler; Steffen Strauch; Mauro Taiuti; David Tedeschi; Ulrike Thoma; Avtandil Tkabladze; Svyatoslav Tkachenko; Clarisse Tur; Maurizio Ungaro; Michael Vineyard; Alexander Vlassov; Daniel Watts; Lawrence Weinstein; Dennis Weygand; M. Williams; Elliott Wolin; M.H. Wood; Amrit Yegneswaran; Lorenzo Zana; Jixie Zhang; Bo Zhao; Zhiwen Zhao
2008-02-01
We examine the results of two measurements by the CLAS collaboration, one of which claimed evidence for a $\Theta^{+}$ pentaquark, whilst the other found no such evidence. The unique feature of these two experiments was that they were performed with the same experimental setup. Using a Bayesian analysis we find that the results of the two experiments are in fact compatible with each other, but that the first measurement did not contain sufficient information to determine unambiguously the existence of a $\Theta^{+}$. Further, we suggest a means by which the existence of a new candidate particle can be tested in a rigorous manner.
Bayesian conformational analysis of ring molecules through reversible jump MCMC
DEFF Research Database (Denmark)
Nolsøe, Kim; Kessler, Mathieu; Pérez, José;
2005-01-01
In this paper we address the problem of classifying the conformations of m-membered rings using experimental observations obtained by crystal structure analysis. We formulate a model for the data generation mechanism that consists in a multidimensional mixture model. We perform inference for the proportions and the components in a Bayesian framework, implementing a reversible jump MCMC algorithm to obtain samples of the posterior distributions. The method is illustrated on a simulated data set and on real data corresponding to cyclo-octane structures.
Bayesian Reasoning in Data Analysis A Critical Introduction
D'Agostini, Giulio
2003-01-01
This book provides a multi-level introduction to Bayesian reasoning (as opposed to "conventional statistics") and its applications to data analysis. The basic ideas of this "new" approach to the quantification of uncertainty are presented using examples from research and everyday life. Applications covered include: parametric inference; combination of results; treatment of uncertainty due to systematic errors and background; comparison of hypotheses; unfolding of experimental distributions; upper/lower bounds in frontier-type measurements. Approximate methods for routine use are derived and ar
A Bayesian analysis of pentaquark signals from CLAS data
International Nuclear Information System (INIS)
We examine the results of two measurements by the CLAS collaboration, one of which claimed evidence for a Θ+ pentaquark, whilst the other found no such evidence. The unique feature of these two experiments was that they were performed with the same experimental setup. Using a Bayesian analysis we find that the results of the two experiments are in fact compatible with each other, but that the first measurement did not contain sufficient information to determine unambiguously the existence of a Θ+. Further, we suggest a means by which the existence of a new candidate particle can be tested in a rigorous manner
Safety Analysis of Liquid Rocket Engine Using Bayesian Networks
Institute of Scientific and Technical Information of China (English)
WANG Hua-wei; YAN Zhi-qiang
2007-01-01
Safety analysis of liquid rocket engines is of great value for shortening the development cycle, saving development expenditure and reducing development risk. The relationships among the structures and components of a liquid rocket engine are complex, and test data are scarce in the development phase; consequently, the safety analysis is subject to considerable uncertainty. A safety analysis model integrating FMEA (failure mode and effects analysis) with Bayesian networks (BN) is put forward for liquid rocket engines, which can combine qualitative analysis with quantitative decision making. The method has the advantages of fusing multiple sources of information, requiring small sample sizes and achieving high accuracy. An example shows that the method is efficient.
Implementation of a Bayesian Engine for Uncertainty Analysis
Energy Technology Data Exchange (ETDEWEB)
Leng Vang; Curtis Smith; Steven Prescott
2014-08-01
In probabilistic risk assessment, it is important to have an environment where analysts have access to shared and secure high-performance computing and a statistical analysis tool package. As part of the advanced small modular reactor probabilistic risk analysis framework implementation, we have identified the need for advanced Bayesian computations. However, in order to make this technology available to non-specialists, there is also a need for a simplified tool that allows users to author models and evaluate them within this framework. As a proof of concept, we have implemented an advanced open source Bayesian inference tool, OpenBUGS, within the browser-based cloud risk analysis framework that is under development at the Idaho National Laboratory. This development, the “OpenBUGS Scripter”, has been implemented as a client-side, visual, web-based integrated development environment for creating OpenBUGS language scripts. It depends on the shared server environment to execute the generated scripts and to transmit results back to the user. The visual models are in the form of linked diagrams, from which we automatically create the applicable OpenBUGS script that matches the diagram. These diagrams can be saved locally or stored in the server environment to be shared with other users.
Inference algorithms and learning theory for Bayesian sparse factor analysis
International Nuclear Information System (INIS)
Bayesian sparse factor analysis has many applications; for example, it has been applied to the problem of inferring a sparse regulatory network from gene expression data. We describe a number of inference algorithms for Bayesian sparse factor analysis using a slab and spike mixture prior. These include well-established Markov chain Monte Carlo (MCMC) and variational Bayes (VB) algorithms as well as a novel hybrid of VB and Expectation Propagation (EP). For the case of a single latent factor we derive a theory for learning performance using the replica method. We compare the MCMC and VB/EP algorithm results with simulated data to the theoretical prediction. The results for MCMC agree closely with the theory as expected. Results for VB/EP are slightly sub-optimal but show that the new algorithm is effective for sparse inference. In large-scale problems MCMC is infeasible due to computational limitations and the VB/EP algorithm then provides a very useful computationally efficient alternative.
Analysis of Wave Directional Spreading by Bayesian Parameter Estimation
Institute of Scientific and Technical Information of China (English)
钱桦; 莊士贤; 高家俊
2002-01-01
A spatial array of wave gauges installed on an observation platform has been designed and arranged to measure the local features of winter monsoon directional waves off the Taishi coast of Taiwan. A new method, named the Bayesian Parameter Estimation Method (BPEM), is developed and adopted to determine the main direction and the directional spreading parameter of directional spectra. The BPEM can be considered a regression analysis that finds the maximum joint probability of parameters which best approximates the observed data from the Bayesian viewpoint. The analysis of field wave data demonstrates the strong dependency of the characteristics of normalized directional spreading on the wave age. The Mitsuyasu-type empirical formula for the directional spectrum is therefore modified to be representative of the monsoon wave field. Moreover, it is suggested that Smax can be expressed as a function of wave steepness, with the values of Smax decreasing with increasing steepness. Finally, a local directional spreading model, which is simple to use in engineering practice, is proposed.
Bayesian analysis of physiologically based toxicokinetic and toxicodynamic models.
Hack, C Eric
2006-04-17
Physiologically based toxicokinetic (PBTK) and toxicodynamic (TD) models of bromate in animals and humans would improve our ability to accurately estimate toxic doses in humans based on available animal studies. These mathematical models are often highly parameterized and must be calibrated in order for the model predictions of internal dose to adequately fit the experimentally measured doses. Highly parameterized models are difficult to calibrate, and it is difficult to obtain accurate estimates of uncertainty or variability in model parameters with commonly used frequentist calibration methods, such as maximum likelihood estimation (MLE) or least-squared-error approaches. A Bayesian approach based on Markov chain Monte Carlo (MCMC) sampling can be used to successfully calibrate these complex models. Prior knowledge about the biological system and associated model parameters is easily incorporated in this approach in the form of prior parameter distributions, and the distributions are refined or updated using experimental data to generate posterior distributions of parameter estimates. The goal of this paper is to give the non-mathematician a brief description of the Bayesian approach and Markov chain Monte Carlo analysis, how this technique is used in risk assessment, and the issues associated with this approach. PMID:16466842
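To give a flavor of what such an MCMC calibration looks like on a deliberately tiny model (a one-compartment kinetic model with a single rate constant, synthetic data and fixed noise; nothing like a full bromate PBTK model), here is a random-walk Metropolis sketch:

```python
# Calibrate the elimination rate k of a one-compartment model C(t) =
# dose * exp(-k t) by sampling its posterior with random-walk Metropolis.
import numpy as np

rng = np.random.default_rng(4)
t_obs = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 12.0])
k_true, dose, sigma = 0.35, 10.0, 0.3
conc = dose * np.exp(-k_true * t_obs) + sigma * rng.normal(size=t_obs.size)

def log_post(theta):                       # theta = log(k) keeps k positive
    k = np.exp(theta)
    pred = dose * np.exp(-k * t_obs)       # model prediction
    loglik = -0.5 * np.sum((conc - pred) ** 2) / sigma ** 2
    logprior = -0.5 * ((theta - np.log(0.3)) / 0.5) ** 2   # prior on log k
    return loglik + logprior

theta, lp = np.log(0.3), log_post(np.log(0.3))
draws = []
for _ in range(20000):                     # random-walk Metropolis in log k
    prop = theta + 0.2 * rng.normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    draws.append(np.exp(theta))

post = np.array(draws[5000:])              # discard burn-in
print("posterior k: %.3f +/- %.3f (true %.2f)" % (post.mean(), post.std(), k_true))
```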
Node Augmentation Technique in Bayesian Network Evidence Analysis and Marshaling
Energy Technology Data Exchange (ETDEWEB)
Keselman, Dmitry [Los Alamos National Laboratory; Tompkins, George H [Los Alamos National Laboratory; Leishman, Deborah A [Los Alamos National Laboratory
2010-01-01
Given a Bayesian network, sensitivity analysis is an important activity. This paper begins by describing a network augmentation technique which can simplify the analysis. Next, we present two techniques which allow the user to determine the probability distribution of a hypothesis node under conditions of uncertain evidence; i.e., the state of an evidence node or nodes is described by a user-specified probability distribution. Finally, we conclude with a discussion of three criteria for ranking evidence nodes based on their influence on a hypothesis node. All of these techniques have been used in conjunction with a commercial software package. A Bayesian network based on a directed acyclic graph (DAG) G is a graphical representation of a system of random variables that satisfies the following Markov property: any node (random variable) is independent of its non-descendants given the state of all its parents (Neapolitan, 2004). For simplicity's sake, we consider only discrete variables with a finite number of states, though most of the conclusions may be generalized.
Williford, W. O.; Hsieh, P.; Carter, M. C.
1974-01-01
A Bayesian analysis of two discrete probability models, the negative binomial and the modified negative binomial distributions, which have been used to describe thunderstorm activity at Cape Kennedy, Florida, is presented. The Bayesian approach with beta prior distributions is compared to the classical approach, which uses a moment method of estimation or a maximum-likelihood method. The accuracy and simplicity of the Bayesian method are demonstrated.
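For the negative binomial with known dispersion r, the beta prior referred to above is conjugate for the success probability p, so the posterior needs no simulation. A minimal sketch with simulated counts (hypothetical numbers, not the Cape Kennedy data):

```python
# Conjugate update: x_i ~ NegBin(r, p) failures before the r-th success and
# p ~ Beta(a, b) give posterior Beta(a + r*n, b + sum(x)).
import numpy as np

rng = np.random.default_rng(5)
r, p_true = 3, 0.4
x = rng.negative_binomial(r, p_true, size=25)   # simulated event counts

a, b = 1.0, 1.0                                 # Beta(1, 1) prior on p
a_post = a + r * x.size                         # conjugate update
b_post = b + x.sum()
print("posterior mean of p: %.3f" % (a_post / (a_post + b_post)))
print("95%% credible interval: %.3f-%.3f" %
      tuple(np.quantile(rng.beta(a_post, b_post, 100000), [0.025, 0.975])))
```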
A Bayesian Framework for Reliability Analysis of Spacecraft Deployments
Evans, John W.; Gallo, Luis; Kaminsky, Mark
2012-01-01
Deployable subsystems are essential to the mission success of most spacecraft. These subsystems enable critical functions including power, communications and thermal control. The loss of any of these functions will generally result in loss of the mission. These subsystems and their components often consist of unique designs and applications, for which various standardized data sources are not applicable for estimating reliability and for assessing risks. In this study, a two-stage sequential Bayesian framework for reliability estimation of spacecraft deployments was developed for this purpose. The process was then applied to the James Webb Space Telescope (JWST) Sunshield subsystem, a unique design intended for thermal control of the Optical Telescope Element. Initially, detailed studies of NASA deployment history ("heritage information") were conducted, extending over 45 years of spacecraft launches. This information was then coupled to a non-informative prior and a binomial likelihood function to create a posterior distribution for deployments of various subsystems using Markov chain Monte Carlo sampling. Selected distributions were then coupled to a subsequent analysis, using test data and anomaly occurrences on successive ground-test deployments of scale-model test articles of JWST hardware, to update the NASA heritage data. This allowed a realistic prediction of the reliability of the complex Sunshield deployment, with credibility limits, within this two-stage Bayesian framework.
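The two-stage logic can be miniaturized with conjugate beta-binomial updates (the actual study used MCMC sampling and real heritage records; all counts below are hypothetical):

```python
# Stage 1 turns heritage deployment history into a posterior, which becomes
# the prior that stage 2 updates with ground-test results.
import numpy as np

a0, b0 = 0.5, 0.5                          # non-informative (Jeffreys) prior
her_succ, her_fail = 180, 6                # hypothetical heritage deployments
a1, b1 = a0 + her_succ, b0 + her_fail      # stage 1: heritage posterior

test_succ, test_fail = 19, 1               # hypothetical scale-model tests
a2, b2 = a1 + test_succ, b1 + test_fail    # stage 2: update with test data

rng = np.random.default_rng(6)
draws = rng.beta(a2, b2, 200000)
print("reliability estimate: %.4f" % (a2 / (a2 + b2)))
print("90%% credibility limits: %.4f-%.4f" %
      tuple(np.quantile(draws, [0.05, 0.95])))
```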
Risk analysis of dust explosion scenarios using Bayesian networks.
Yuan, Zhi; Khakzad, Nima; Khan, Faisal; Amyotte, Paul
2015-02-01
In this study, a methodology has been proposed for risk analysis of dust explosion scenarios based on Bayesian networks. Our methodology also benefits from a bow-tie diagram to better represent the logical relationships existing among contributing factors and consequences of dust explosions. The risks of dust explosion scenarios are evaluated, taking into account common-cause failures and dependencies among root events and possible consequences. Using a diagnostic analysis, dust particle properties, oxygen concentration, and safety training of staff are identified as the most critical root events leading to dust explosions. The probability adaptation concept is also used for sequential updating, and thus learning from past dust explosion accidents, which is of great importance in dynamic risk assessment and management. We also apply the proposed methodology to a case study to model dust explosion scenarios, to estimate the envisaged risks, and to identify the vulnerable parts of the system that need additional safety measures. PMID:25264172
Bayesian large-scale structure inference and cosmic web analysis
Leclercq, Florent
2015-01-01
Surveys of the cosmic large-scale structure carry opportunities for building and testing cosmological theories about the origin and evolution of the Universe. This endeavor requires appropriate data assimilation tools, for establishing the contact between survey catalogs and models of structure formation. In this thesis, we present an innovative statistical approach for the ab initio simultaneous analysis of the formation history and morphology of the cosmic web: the BORG algorithm infers the primordial density fluctuations and produces physical reconstructions of the dark matter distribution that underlies observed galaxies, by assimilating the survey data into a cosmological structure formation model. The method, based on Bayesian probability theory, provides accurate means of uncertainty quantification. We demonstrate the application of BORG to the Sloan Digital Sky Survey data and describe the primordial and late-time large-scale structure in the observed volume. We show how the approach has led to the fi...
Bayesian analysis of factors associated with fibromyalgia syndrome subjects
Jayawardana, Veroni; Mondal, Sumona; Russek, Leslie
2015-01-01
Factors contributing to movement-related fear were assessed by Russek et al. (2014) for subjects with fibromyalgia (FM), based on data collected by a national internet survey of community-based individuals. The study focused on the variables Activities-Specific Balance Confidence scale (ABC), Primary Care Post-Traumatic Stress Disorder screen (PC-PTSD), Tampa Scale of Kinesiophobia (TSK), a Joint Hypermobility Syndrome screen (JHS), Vertigo Symptom Scale (VSS-SF), Obsessive-Compulsive Personality Disorder (OCPD), and pain, work status and physical activity drawn from the Revised Fibromyalgia Impact Questionnaire (FIQR). The study presented in this paper revisits the same data with a Bayesian analysis in which appropriate priors were introduced for the variables selected in Russek's paper.
BASE-9: Bayesian Analysis for Stellar Evolution with nine variables
Robinson, Elliot; von Hippel, Ted; Stein, Nathan; Stenning, David; Wagner-Kaiser, Rachel; Si, Shijing; van Dyk, David
2016-08-01
The BASE-9 (Bayesian Analysis for Stellar Evolution with nine variables) software suite recovers star cluster and stellar parameters from photometry and is useful for analyzing single-age, single-metallicity star clusters, binaries, or single stars, and for simulating such systems. BASE-9 uses a Markov chain Monte Carlo (MCMC) technique along with brute force numerical integration to estimate the posterior probability distribution for the age, metallicity, helium abundance, distance modulus, line-of-sight absorption, and parameters of the initial-final mass relation (IFMR) for a cluster, and for the primary mass, secondary mass (if a binary), and cluster probability for every potential cluster member. The MCMC technique is used for the cluster quantities (the first six items listed above) and numerical integration is used for the stellar quantities (the last three items in the above list).
Reference priors of nuisance parameters in Bayesian sequential population analysis
Bousquet, Nicolas
2010-01-01
Prior distributions elicited for modelling the natural fluctuations or the uncertainty in parameters of Bayesian fishery population models can be chosen from a vast range of statistical laws. Since the statistical framework is defined by observational processes, observational parameters enter into the estimation and must be considered random, similarly to parameters or states of interest such as population levels or real catches. These observational parameters are perceived as nuisance parameters whose values are intrinsically linked to the considered experiment, and they also require noninformative priors. In fishery research, the Jeffreys methodology has been presented by Millar (2002) as a practical way to elicit such priors. However, Jeffreys priors can have undesirable properties in multiparameter contexts. We therefore suggest using the elicitation method proposed by Berger and Bernardo to avoid the paradoxical results raised by Jeffreys priors. These benchmark priors are derived here in the framework of sequential population analysis.
Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations
Scargle, Jeffrey D; Jackson, Brad; Chiang, James
2012-01-01
This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it - an improved and generalized version of Bayesian Blocks (Scargle 1998) - that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multi-variate time series data, analysis of vari...
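The optimal segmentation mentioned above is found by a simple dynamic program. The following compact implementation (after the published algorithm, simplified to event data with a constant prior penalty and assuming distinct event times) returns the optimal block edges:

```python
# Bayesian Blocks for event data: dynamic programming over candidate block
# start points, with fitness N*log(N/T) per block and a fixed penalty per
# change point, followed by backtracking to recover the segmentation.
import numpy as np

def bayesian_blocks(t, ncp_prior=4.0):
    t = np.sort(np.asarray(t, dtype=float))
    n = t.size
    # candidate block edges: midpoints between adjacent events
    edges = np.concatenate([[t[0]], 0.5 * (t[1:] + t[:-1]), [t[-1]]])
    best, last = np.zeros(n), np.zeros(n, dtype=int)
    for k in range(n):
        widths = edges[k + 1] - edges[: k + 1]   # durations of candidate blocks
        counts = k + 1 - np.arange(k + 1)        # events in each candidate block
        fit = counts * (np.log(counts) - np.log(widths)) - ncp_prior
        fit[1:] += best[:k]                      # add best fitness up to r-1
        last[k] = np.argmax(fit)
        best[k] = fit[last[k]]
    cps, i = [], n                               # backtrack the change points
    while i > 0:
        cps.append(last[i - 1])
        i = last[i - 1]
    return edges[np.array(cps[::-1])]

rng = np.random.default_rng(7)
events = np.concatenate([rng.uniform(0, 10, 50), rng.uniform(4, 5, 60)])
print("detected block edges:", np.round(bayesian_blocks(events), 2))
```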
A Bayesian Seismic Hazard Analysis for the city of Naples
Faenza, Licia; Pierdominici, Simona; Hainzl, Sebastian; Cinti, Francesca R.; Sandri, Laura; Selva, Jacopo; Tonini, Roberto; Perfetti, Paolo
2016-04-01
In recent years many studies have focused on the determination and definition of the seismic, volcanic and tsunamigenic hazard in the city of Naples, because Naples and its neighboring area form one of the most densely populated places in Italy. In addition, the risk is increased by the type and condition of buildings and monuments in the city. It is therefore crucial to assess which active faults in Naples and the surrounding area could trigger an earthquake able to shake and damage the urban area. We collect data from the most reliable and complete databases of macroseismic intensity records (from 79 AD to the present), and associate an active tectonic structure with each seismic event. Furthermore, a set of active faults around the study area that are well known from geological investigations but not associated with any recorded earthquake, and that could nevertheless shake the city, has been taken into account. This geological framework is the starting point for our Bayesian seismic hazard analysis for the city of Naples. We show the feasibility of formulating the hazard assessment procedure so as to include the information of past earthquakes in the probabilistic seismic hazard analysis. This strategy allows us, on the one hand, to enlarge the information used in the hazard evaluation, from alternative models of the earthquake generation process to past shaking, and on the other hand, to explicitly account for all kinds of information and their uncertainties. The Bayesian scheme we propose is applied to evaluate the seismic hazard of Naples. We implement five different spatio-temporal models to parameterize the occurrence of earthquakes potentially dangerous for Naples. Subsequently we combine these hazard curves with ShakeMaps of past earthquakes that have been felt in Naples. The results are posterior hazard assessments for three exposure times, e.g., 50, 10 and 5 years, on a dense grid that covers the municipality of Naples, considering bedrock soil
Chung, Gregory K. W. K.; Dionne, Gary B.; Kaiser, William J.
2006-01-01
Our research question was whether we could develop a feasible technique, using Bayesian networks, to diagnose gaps in student knowledge. Thirty-four college-age participants completed tasks designed to measure conceptual knowledge, procedural knowledge, and problem-solving skills related to circuit analysis. A Bayesian network was used to model…
A Bayesian latent group analysis for detecting poor effort in the assessment of malingering
A. Ortega; E.-J. Wagenmakers; M.D. Lee; H.J. Markowitsch; M. Piefke
2012-01-01
Despite their theoretical appeal, Bayesian methods for the assessment of poor effort and malingering are still rarely used in neuropsychological research and clinical diagnosis. In this article, we outline a novel and easy-to-use Bayesian latent group analysis of malingering whose goal is to identif
Directory of Open Access Journals (Sweden)
Ildikó Ungvári
Genetic studies indicate a high number of potential factors related to asthma. Based on earlier linkage analyses we selected the 11q13 and 14q22 asthma susceptibility regions, for which we designed a partial genome screening study using 145 SNPs in 1201 individuals (436 asthmatic children and 765 controls). The results were evaluated with traditional frequentist methods, and we applied a new statistical method, called Bayesian network based Bayesian multilevel analysis of relevance (BN-BMLA). This method uses a Bayesian network representation to provide a detailed characterization of the relevance of factors, such as joint significance, the type of dependency, and multi-target aspects. We estimated posteriors for these relations within the Bayesian statistical framework, in order to estimate the posterior probability that a variable is directly relevant or that its association is only mediated. With frequentist methods one SNP (rs3751464 in the FRMD6 gene) provided evidence for an association with asthma (OR = 1.43 (1.2-1.8); p = 3×10⁻⁴). The possible role of the FRMD6 gene in asthma was also confirmed in an animal model and in human asthmatics. In the BN-BMLA analysis altogether 5 SNPs in 4 genes were found relevant in connection with the asthma phenotype: PRPF19 on chromosome 11, and FRMD6, PTGER2 and PTGDR on chromosome 14. In a subsequent step a partial dataset containing rhinitis and further clinical parameters was used, which allowed the analysis of the relevance of SNPs for asthma and multiple targets. These analyses suggested that SNPs in the AHNAK and MS4A2 genes were indirectly associated with asthma. This paper indicates that BN-BMLA explores the relevant factors more comprehensively than traditional statistical methods and extends the scope of strong-relevance-based methods to include partial relevance, global characterization of relevance and multi-target relevance.
Multivariate meta-analysis of mixed outcomes: a Bayesian approach.
Bujkiewicz, Sylwia; Thompson, John R; Sutton, Alex J; Cooper, Nicola J; Harrison, Mark J; Symmons, Deborah P M; Abrams, Keith R
2013-09-30
Multivariate random effects meta-analysis (MRMA) is an appropriate way of synthesizing data from studies reporting multiple correlated outcomes. In a Bayesian framework, it has great potential for integrating evidence from a variety of sources. In this paper, we propose a Bayesian model for MRMA of mixed outcomes, which extends previously developed bivariate models to the trivariate case and also allows for the combination of multiple outcomes that are both continuous and binary. We have constructed informative prior distributions for the correlations by using external evidence. Prior distributions for the within-study correlations were constructed by employing external individual patient data and using a double bootstrap method to obtain the correlations between mixed outcomes. The between-study model of MRMA was parameterized in the form of a product of a series of univariate conditional normal distributions. This allowed us to place explicit prior distributions on the between-study correlations, which were constructed using external summary data. Traditionally, independent 'vague' prior distributions are placed on all parameters of the model. In contrast to this approach, we constructed prior distributions for the between-study model parameters in a way that takes into account the inter-relationships between them. This is a flexible method that can be extended to incorporate mixed outcomes other than continuous and binary, and beyond the trivariate case. We have applied this model to a motivating example in rheumatoid arthritis with the aim of incorporating all available evidence in the synthesis and potentially reducing uncertainty around the estimate of interest. PMID:23630081
Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis
Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William
2009-01-01
This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data collection and evaluation issues and a narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision-making environment that is being sought by NASA requirements and procedures such as 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
Bayesian Analysis of Graphical Models of Marginal Independence for Three Way Contingency Tables
Tarantola, Claudia; Ntzoufras, Ioannis
2012-01-01
This paper deals with the Bayesian analysis of graphical models of marginal independence for three-way contingency tables. Each marginal independence model corresponds to a particular factorization of the cell probabilities, and a conjugate analysis based on a Dirichlet prior can be performed. We illustrate a comprehensive Bayesian analysis of such models, involving suitable choices of prior parameters, estimation, model determination, as well as the allied computational issues. The posterior di...
New Ephemeris for LSI+61 303, A Bayesian Analysis
Gregory, P. C.
1997-12-01
The luminous early-type binary LSI+61 303 is an interesting radio, X-ray and possible gamma-ray source. At radio wavelengths it exhibits periodic outbursts with an approximate period of 26.5 days, as well as a longer-term modulation of the outburst peaks of approximately 4 years. Recently, from an analysis of RXTE all-sky monitoring data, Paredes et al. found evidence that the X-ray outbursts very likely recur with the same period as the radio outbursts. The system has been observed by many groups at all wavelengths, but the energy source powering the radio outbursts and their relation to the high-energy emission remain a mystery. For more details see the "LSI+61 303 Resource Page" at http://www.srl.caltech.edu/personnel/paulr/lsi.html . There has been increasing evidence for a change in the period of the system. We will present a new ephemeris for the system based on a Bayesian analysis of 20 years of radio observations, including the GBI-NASA radio monitoring data.
Thermodynamically consistent Bayesian analysis of closed biochemical reaction systems
Directory of Open Access Journals (Sweden)
Goutsias John
2010-11-01
Background: Estimating the rate constants of a biochemical reaction system with known stoichiometry from noisy time series measurements of molecular concentrations is an important step for building predictive models of cellular function. Inference techniques currently available in the literature may produce rate constant values that defy necessary constraints imposed by the fundamental laws of thermodynamics. As a result, these techniques may lead to biochemical reaction systems whose concentration dynamics could not possibly occur in nature. Therefore, development of a thermodynamically consistent approach for estimating the rate constants of a biochemical reaction system is highly desirable. Results: We introduce a Bayesian analysis approach for computing thermodynamically consistent estimates of the rate constants of a closed biochemical reaction system with known stoichiometry given experimental data. Our method employs an appropriately designed prior probability density function that effectively integrates fundamental biophysical and thermodynamic knowledge into the inference problem. Moreover, it takes into account experimental strategies for collecting informative observations of molecular concentrations through perturbations. The proposed method employs a maximization-expectation-maximization algorithm that provides thermodynamically feasible estimates of the rate constant values and computes appropriate measures of estimation accuracy. We demonstrate various aspects of the proposed method on synthetic data obtained by simulating a subset of a well-known model of the EGF/ERK signaling pathway, and examine its robustness under conditions that violate key assumptions. Software, coded in MATLAB®, which implements all Bayesian analysis techniques discussed in this paper, is available free of charge at http://www.cis.jhu.edu/~goutsias/CSS%20lab/software.html. Conclusions: Our approach provides an attractive statistical methodology for
High resolution coherence analysis between planetary and climate oscillations
Scafetta, Nicola
2016-05-01
This study investigates the existence of a multi-frequency spectral coherence between planetary and global surface temperature oscillations by using advanced techniques of coherence analysis and statistical significance tests. The performance of the standard Matlab mscohere algorithm is compared with high-resolution coherence analysis methodologies such as canonical correlation analysis. The Matlab mscohere function highlights large coherence peaks at 20- and 60-year periods although, due to the shortness of the global surface temperature record (1850-2014), the statistical significance of the result depends on the specific window function adopted for pre-processing the data; window functions disrupt the low-frequency component of the spectrum. By contrast, using canonical correlation analysis, at least five coherent frequencies are found at the 95% significance level, at the following periods: 6.6, 7.4, 14, 20 and 60 years. Thus, high-resolution coherence analysis confirms that the climate system can be partially modulated by astronomical forces of gravitational, electromagnetic and solar origin. A possible chain of physical causes explaining this coherence is briefly discussed.
Bayesian Analysis of Dynamic Multivariate Models with Multiple Structural Breaks
Sugita, Katsuhiro
2006-01-01
This paper considers a vector autoregressive model or a vector error correction model with multiple structural breaks in any subset of parameters, using a Bayesian approach with Markov chain Monte Carlo simulation technique. The number of structural breaks is determined as a sort of model selection by the posterior odds. For a cointegrated model, cointegrating rank is also allowed to change with breaks. Bayesian approach by Strachan (Journal of Business and Economic Statistics 21 (2003) 185) ...
Using Bayesian analysis in repeated preclinical in vivo studies for a more effective use of animals.
Walley, Rosalind; Sherington, John; Rastrick, Joe; Detrait, Eric; Hanon, Etienne; Watt, Gillian
2016-05-01
Whilst innovative Bayesian approaches are increasingly used in clinical studies, in the preclinical area Bayesian methods appear to be rarely used in the reporting of pharmacology data. This is particularly surprising in the context of regularly repeated in vivo studies, where there is a considerable amount of data from historical control groups which has potential value. This paper describes our experience with introducing Bayesian analysis for such studies using a Bayesian meta-analytic predictive approach. This leads naturally either to an informative prior for a control group, as part of a full Bayesian analysis of the next study, or to a predictive distribution that replaces a control group entirely. We use quality control charts to illustrate study-to-study variation to the scientists and describe informative priors in terms of their approximate effective numbers of animals. We describe two case studies of animal models: the lipopolysaccharide-induced cytokine release model used in inflammation, and the novel object recognition model used to screen cognitive enhancers, both of which show the advantage of a Bayesian approach over the standard frequentist analysis. We conclude that using Bayesian methods in stable repeated in vivo studies can result in a more effective use of animals, either by reducing the total number of animals used or by increasing the precision of key treatment differences. This will lead to clearer results and supports the "3Rs initiative" to Refine, Reduce and Replace animals in research. Copyright © 2016 John Wiley & Sons, Ltd.
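A stripped-down version of the meta-analytic predictive step can be written with a grid approximation (real analyses typically use MCMC; all control-group summaries and the half-normal heterogeneity prior below are hypothetical):

```python
# Normal hierarchical model: historical control means y_h with standard
# errors se_h, true means theta_h ~ N(mu, tau^2). The predictive for the
# next control group's true mean becomes an informative prior.
import numpy as np

y = np.array([5.2, 4.8, 5.6, 5.1, 4.6, 5.4])   # historical control means
se = np.array([0.3, 0.4, 0.3, 0.5, 0.4, 0.3])  # and their standard errors

mu_g = np.linspace(3, 7, 201)
tau_g = np.linspace(0.01, 2, 200)
M, T = np.meshgrid(mu_g, tau_g)
# log joint: flat prior on mu, half-normal(0.5) prior on tau
loglik = sum(-0.5 * (yi - M) ** 2 / (sei ** 2 + T ** 2)
             - 0.5 * np.log(sei ** 2 + T ** 2) for yi, sei in zip(y, se))
logpost = loglik - 0.5 * (T / 0.5) ** 2
w = np.exp(logpost - logpost.max()); w /= w.sum()

rng = np.random.default_rng(8)
idx = rng.choice(w.size, size=50000, p=w.ravel())
theta_new = M.ravel()[idx] + T.ravel()[idx] * rng.normal(size=idx.size)
print("MAP prior for next control: %.2f +/- %.2f"
      % (theta_new.mean(), theta_new.std()))
# Matching this spread to typical within-group variability gives the
# "effective number of animals" the abstract mentions.
```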
Dynamic sensor action selection with Bayesian decision analysis
Kristensen, Steen; Hansen, Volker; Kondak, Konstantin
1998-10-01
The aim of this work is to create a framework for the dynamic planning of sensor actions for an autonomous mobile robot. The framework uses Bayesian decision analysis, i.e., a decision-theoretic method, to evaluate possible sensor actions and select the most appropriate ones given the available sensors and what is currently known about the state of the world. Since sensing changes the knowledge of the system, and since the current state of the robot (task, position, etc.) determines what knowledge is relevant, the evaluation and selection of sensing actions is an ongoing process that effectively determines the behavior of the robot. The framework has been implemented on a real mobile robot and has proven able to control the sensor actions of the system in real time. In current work we are investigating methods to reduce or automatically generate the necessary model information needed by the decision-theoretic method to select the appropriate sensor actions.
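The evaluation the abstract describes amounts to a preposterior expected-utility calculation. A toy sketch (our illustration; states, utilities, sensor models and costs are all made up) compares acting on the prior with buying information from a noisy or a sharp sensor:

```python
# Each sensor action has a cost and an observation model; its value is the
# expected utility of acting optimally after seeing the observation.
import numpy as np

prior = np.array([0.6, 0.4])            # belief over world states s0, s1
U = np.array([[ 1.0, -2.0],             # utility[action, state]
              [-1.0,  3.0]])

def value_of_sensing(likelihood, cost):
    # likelihood[o, s] = p(observation o | state s)
    p_obs = likelihood @ prior                        # marginal p(o)
    value = 0.0
    for o, po in enumerate(p_obs):
        post = likelihood[o] * prior / po             # Bayes update
        value += po * (U @ post).max()                # act optimally given o
    return value - cost

no_sense = (U @ prior).max()                          # act on the prior alone
cheap_noisy = value_of_sensing(np.array([[0.7, 0.3], [0.3, 0.7]]), cost=0.05)
dear_sharp = value_of_sensing(np.array([[0.95, 0.05], [0.05, 0.95]]), cost=0.4)
print("no sensing: %.3f  noisy sensor: %.3f  sharp sensor: %.3f"
      % (no_sense, cheap_noisy, dear_sharp))
```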
STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS
Energy Technology Data Exchange (ETDEWEB)
Scargle, Jeffrey D. [Space Science and Astrobiology Division, MS 245-3, NASA Ames Research Center, Moffett Field, CA 94035-1000 (United States); Norris, Jay P. [Physics Department, Boise State University, 2110 University Drive, Boise, ID 83725-1570 (United States); Jackson, Brad [The Center for Applied Mathematics and Computer Science, Department of Mathematics, San Jose State University, One Washington Square, MH 308, San Jose, CA 95192-0103 (United States); Chiang, James, E-mail: jeffrey.d.scargle@nasa.gov [W. W. Hansen Experimental Physics Laboratory, Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics and SLAC National Accelerator Laboratory, Stanford University, Stanford, CA 94305 (United States)
2013-02-20
This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it-an improved and generalized version of Bayesian Blocks-that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.
A Bayesian model for the analysis of transgenerational epigenetic variation.
Varona, Luis; Munilla, Sebastián; Mouresan, Elena Flavia; González-Rodríguez, Aldemar; Moreno, Carlos; Altarriba, Juan
2015-01-23
Epigenetics has become one of the major areas of biological research. However, the degree of phenotypic variability that is explained by epigenetic processes still remains unclear. From a quantitative genetics perspective, the estimation of variance components is achieved by means of the information provided by the resemblance between relatives. In a previous study, this resemblance was described as a function of the epigenetic variance component and a reset coefficient that indicates the rate of dissipation of epigenetic marks across generations. Given these assumptions, we propose a Bayesian mixed model methodology that allows the estimation of epigenetic variance from a genealogical and phenotypic database. The methodology is based on the development of a matrix T of epigenetic relationships that depends on the reset coefficient. In addition, we present a simple procedure for the calculation of the inverse of this matrix (T⁻¹) and a Gibbs sampler algorithm that obtains posterior estimates of all the unknowns in the model. The new procedure was used with two simulated data sets and with a beef cattle database. In the simulated populations, the results of the analysis provided marginal posterior distributions that included the population parameters in the regions of highest posterior density. In the case of the beef cattle dataset, the posterior estimate of transgenerational epigenetic variability was very low and a model comparison test indicated that a model that did not include it was the most plausible.
Bayesian Spectral Analysis of Metal Abundance Deficient Stars
Sourlas, Epaminondas; van Dyk, David; Kashyap, Vinay; Drake, Jeremy; Pease, Deron
2002-01-01
Metallicity can be measured by analyzing the spectra in the X-ray region and comparing the flux in spectral lines to the flux in the underlying Bremsstrahlung continuum. In this paper we propose new Bayesian methods which directly model the Poisson nature of the data and thus are expected to exhibit improved sampling properties. Our model also accounts for the Poisson nature of background contamination of the observations, image blurring due to instrument response, and the absorption of photons in space. The resulting highly structured hierarchical model is fit using the Gibbs sampler, data augmentation, and Metropolis-Hastings. We demonstrate our methods with the X-ray spectral analysis of several "Metal Abundance Deficient" stars. The model is designed to summarize the relative frequency of the energy of photons (X-ray or gamma-ray) arriving at a detector. Independent Poisson distributions are more appropriate to model the counts than the commonly used normal approximation. We model the high energy tail of th...
Nonparametric survival analysis using Bayesian Additive Regression Trees (BART).
Sparapani, Rodney A; Logan, Brent R; McCulloch, Robert E; Laud, Purushottam W
2016-07-20
Bayesian additive regression trees (BART) provide a framework for flexible nonparametric modeling of the relationships of covariates to outcomes. Recently, BART models have been shown to provide excellent predictive performance for both continuous and binary outcomes, often exceeding that of competing methods. Software is also readily available for such outcomes. In this article, we introduce modeling that extends the usefulness of BART in medical applications by addressing needs arising in survival analysis. Simulation studies of one-sample and two-sample scenarios, in comparison with long-standing traditional methods, establish face validity of the new approach. We then demonstrate the model's ability to accommodate data from complex regression models with a simulation study of a nonproportional hazards scenario with crossing survival functions, and with survival function estimation in a scenario where hazards are multiplicatively modified by a highly nonlinear function of the covariates. Using data from a recently published study of patients undergoing hematopoietic stem cell transplantation, we illustrate the use and some advantages of the proposed method in medical investigations. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26854022
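One common way to let a binary-outcome learner such as BART handle right-censored survival data is to expand each subject into discrete person-period records over a grid of event times. The helper below sketches that expansion; the function name and column layout are hypothetical illustrations of the general idea, not the authors' code.

```python
import numpy as np
import pandas as pd

def to_person_period(time, event, X, grid=None):
    """Expand right-censored survival data into binary person-period rows.

    Subject i contributes one row per grid time t <= time[i], with y = 1 only
    at an observed event time, so a binary-outcome learner such as BART can
    estimate the discrete-time hazard. A sketch of the general idea only.
    """
    time, event = np.asarray(time, float), np.asarray(event, int)
    if grid is None:
        grid = np.unique(time[event == 1])  # distinct observed event times
    rows = []
    for i, (t_i, d_i) in enumerate(zip(time, event)):
        for t in grid[grid <= t_i]:
            rows.append({"id": i, "t": t,
                         "y": int(d_i == 1 and t == t_i),  # event at this period?
                         **{f"x{j}": v for j, v in enumerate(X[i])}})
    return pd.DataFrame(rows)

time, event = [3.0, 5.0, 5.0, 8.0], [1, 0, 1, 1]   # two events, one censoring
X = [[0.1], [0.3], [0.2], [0.9]]
print(to_person_period(time, event, X))
```

A survival function estimate then follows by cumulating the fitted per-period hazards over the grid for any covariate value of interest.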
Gasparini, Mauro; Eisele, J
2003-01-01
Compositional random vectors are fundamental tools in the Bayesian analysis of categorical data. Many of the issues that are discussed with reference to the statistical analysis of compositional data have a natural counterpart in the construction of a Bayesian statistical model for categorical data. This note builds on the idea of cross-fertilization of the two areas recommended by Aitchison (1986) in his seminal book on compositional data. Particular emphasis is put on the pro...
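For a concrete instance of the Bayesian treatment of categorical data that compositional vectors support: with multinomial counts and a Dirichlet prior, the posterior is again Dirichlet, and any question about the composition can be answered by sampling it. The numbers below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
counts = np.array([12, 7, 3])   # made-up counts for a 3-category variable
alpha = np.ones(3)              # uniform Dirichlet prior on the composition
draws = rng.dirichlet(alpha + counts, size=10_000)  # posterior is Dirichlet too

print(draws.mean(axis=0))                  # posterior mean composition
print((draws[:, 0] > draws[:, 1]).mean())  # P(p1 > p2 | data)
```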
Bayesian Analysis of Marginal Log-Linear Graphical Models for Three Way Contingency Tables
Ntzoufras, Ioannis; Tarantola, Claudia
2008-01-01
This paper deals with the Bayesian analysis of graphical models of marginal independence for three way contingency tables. We use a marginal log-linear parametrization, under which the model is defined through suitable zero-constraints on the interaction parameters calculated within marginal distributions. We undertake a comprehensive Bayesian analysis of these models, involving suitable choices of prior distributions, estimation, model determination, as well as the allied computational issue...
On Coherence and Cohesion in Discourse Analysis
Institute of Scientific and Technical Information of China (English)
李振国
2013-01-01
'Cohesion' is the way words formally hang together in sentences and the like, while 'coherence' captures the content-based connections between the words that make them produce sense. In discourse, while the local sequence of turns creates a certain amount of cohesion, it is not sufficient to guarantee coherence.
Gruber, Lutz F.; West, Mike
2016-01-01
The recently introduced class of simultaneous graphical dynamic linear models (SGDLMs) defines an ability to scale on-line Bayesian analysis and forecasting to higher-dimensional time series. This paper advances the methodology of SGDLMs, developing and embedding a novel, adaptive method of simultaneous predictor selection in forward filtering for on-line learning and forecasting. The advances include developments in Bayesian computation for scalability, and a case study in exploring the resu...
Bayesian Analysis of Multiple Populations in Galactic Globular Clusters
Wagner-Kaiser, Rachel A.; Sarajedini, Ata; von Hippel, Ted; Stenning, David; Piotto, Giampaolo; Milone, Antonino; van Dyk, David A.; Robinson, Elliot; Stein, Nathan
2016-01-01
We use GO 13297 Cycle 21 Hubble Space Telescope (HST) observations and archival GO 10775 Cycle 14 HST ACS Treasury observations of Galactic Globular Clusters to find and characterize multiple stellar populations. Determining how globular clusters are able to create and retain enriched material to produce several generations of stars is key to understanding how these objects formed and how they have affected the structural, kinematic, and chemical evolution of the Milky Way. We employ a sophisticated Bayesian technique with an adaptive MCMC algorithm to simultaneously fit the age, distance, absorption, and metallicity for each cluster. At the same time, we also fit unique helium values to two distinct populations of the cluster and determine the relative proportions of those populations. Our unique numerical approach allows objective and precise analysis of these complicated clusters, providing posterior distribution functions for each parameter of interest. We use these results to gain a better understanding of multiple populations in these clusters and their role in the history of the Milky Way. Support for this work was provided by NASA through grant numbers HST-GO-10775 and HST-GO-13297 from the Space Telescope Science Institute, which is operated by AURA, Inc., under NASA contract NAS5-26555. This material is based upon work supported by the National Aeronautics and Space Administration under Grant NNX11AF34G issued through the Office of Space Science. This project was supported by the National Aeronautics & Space Administration through the University of Central Florida's NASA Florida Space Grant Consortium.
JBASE: Joint Bayesian Analysis of Subphenotypes and Epistasis
Colak, Recep; Kim, TaeHyung; Kazan, Hilal; Oh, Yoomi; Cruz, Miguel; Valladares-Salgado, Adan; Peralta, Jesus; Escobedo, Jorge; Parra, Esteban J.; Kim, Philip M.; Goldenberg, Anna
2016-01-01
Motivation: Rapid advances in genotyping and genome-wide association studies have enabled the discovery of many new genotype–phenotype associations at the resolution of individual markers. However, these associations explain only a small proportion of the theoretically estimated heritability of most diseases. In this work, we propose an integrative mixture model called JBASE: joint Bayesian analysis of subphenotypes and epistasis. JBASE explores two major sources of missing heritability: interactions between genetic variants (a phenomenon known as epistasis) and phenotypic heterogeneity (addressed via subphenotyping). Results: Our extensive simulations in a wide range of scenarios repeatedly demonstrate that JBASE can identify true underlying subphenotypes, including their associated variants and their interactions, with high precision. In the presence of phenotypic heterogeneity, JBASE has higher power and lower type 1 error than five state-of-the-art approaches. We applied our method to a sample of individuals from Mexico with Type 2 diabetes and discovered two novel epistatic modules, including two loci each, that define two subphenotypes characterized by differences in body mass index and waist-to-hip ratio. We successfully replicated these subphenotypes and epistatic modules in an independent dataset from Mexico genotyped with a different platform. Availability and implementation: JBASE is implemented in C++, supported on Linux and is available at http://www.cs.toronto.edu/∼goldenberg/JBASE/jbase.tar.gz. The genotype data underlying this study are available upon approval by the ethics review board of the Medical Centre Siglo XXI. Please contact Dr Miguel Cruz at mcruzl@yahoo.com for assistance with the application. Contact: anna.goldenberg@utoronto.ca Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26411870
Exploiting sensitivity analysis in Bayesian networks for consumer satisfaction study
Jaronski, W.; Bloemer, J.M.M.; Vanhoof, K.; Wets, G.
2004-01-01
The paper presents an application of Bayesian network technology in an empirical customer satisfaction study. The findings of the study should provide insight into the importance of product/service dimensions in terms of the strength of their influence on overall satisfaction. To this end we apply a
Carvalho, Pedro; Marques, Rui Cunha
2016-02-15
This study aims to search for economies of size and scope in the Portuguese water sector, applying Bayesian and classical statistics to make inference in stochastic frontier analysis (SFA). The study demonstrates the usefulness and advantages of Bayesian statistics for making inference in SFA over traditional SFA, which relies solely on classical statistics. The Bayesian methods allow overcoming some problems that arise in the application of traditional SFA, such as bias in small samples and skewness of residuals. In the present case study of the water sector in Portugal, these Bayesian methods provide more plausible and acceptable results. Based on the results obtained, we found that there are important economies of output density, economies of size, economies of vertical integration, and economies of scope in the Portuguese water sector, pointing to the considerable advantages of undertaking mergers by joining the retail and wholesale components and by joining the drinking water and wastewater services.
Phase and coherence analysis of VHF scintillation over Christmas Island
Shume, E. B.; Mannucci, A. J.; Caton, R.
2014-03-01
This short paper presents phase and coherence data from the cross-wavelet transform applied on longitudinally separated very high frequency (VHF) equatorial ionospheric scintillation observations over Christmas Island. The phase and coherence analyses were employed on a pair of scintillation observations, namely, the east-looking and west-looking VHF scintillation monitors at Christmas Island. Our analysis includes 3 years of peak season scintillation data from 2008, 2009 (low solar activity), and 2011 (moderate solar activity). In statistically significant and high spectral coherence regions of the cross-wavelet transform, scintillation observations from the east-looking monitor lead those from the west-looking monitor by about 20 to 60 (40 ± 20) min (most frequent lead times). Using several years (seasons and solar cycle) of lead (or lag) and coherence information of the cross-wavelet transform, we envisage construction of a probability model for forecasting scintillation in the nighttime equatorial ionosphere.
A Dynamic Bayesian Approach to Computational Laban Shape Quality Analysis
Directory of Open Access Journals (Sweden)
Dilip Swaminathan
2009-01-01
kinesiology. LMA (especially Effort/Shape) emphasizes how internal feelings and intentions govern the patterning of movement throughout the whole body. As we argue, a complex understanding of intention via LMA is necessary for human-computer interaction to become embodied in ways that resemble interaction in the physical world. We thus introduce a novel, flexible Bayesian fusion approach for identifying LMA Shape qualities from raw motion capture data in real time. The method uses a dynamic Bayesian network (DBN) to fuse movement features across the body and across time and, as we discuss, can be readily adapted for low-cost video. It has delivered excellent performance in preliminary studies comprising improvisatory movements. Our approach has been incorporated in Response, a mixed-reality environment where users interact via natural, full-body human movement and enhance their bodily-kinesthetic awareness through immersive sound and light feedback, with applications to kinesiology training, Parkinson's patient rehabilitation, interactive dance, and many other areas.
Use of SAMC for Bayesian analysis of statistical models with intractable normalizing constants
Jin, Ick Hoon
2014-03-01
Statistical inference for the models with intractable normalizing constants has attracted much attention. During the past two decades, various approximation- or simulation-based methods have been proposed for the problem, such as the Monte Carlo maximum likelihood method and the auxiliary variable Markov chain Monte Carlo methods. The Bayesian stochastic approximation Monte Carlo algorithm specifically addresses this problem: It works by sampling from a sequence of approximate distributions with their average converging to the target posterior distribution, where the approximate distributions can be achieved using the stochastic approximation Monte Carlo algorithm. A strong law of large numbers is established for the Bayesian stochastic approximation Monte Carlo estimator under mild conditions. Compared to the Monte Carlo maximum likelihood method, the Bayesian stochastic approximation Monte Carlo algorithm is more robust to the initial guess of model parameters. Compared to the auxiliary variable MCMC methods, the Bayesian stochastic approximation Monte Carlo algorithm avoids the requirement for perfect samples, and thus can be applied to many models for which perfect sampling is not available or very expensive. The Bayesian stochastic approximation Monte Carlo algorithm also provides a general framework for approximate Bayesian analysis. © 2012 Elsevier B.V. All rights reserved.
Bayesian network models in brain functional connectivity analysis
Ide, Jaime S.; Zhang, Sheng; Li, Chiang-shan R.
2013-01-01
Much effort has been made to better understand the complex integration of distinct parts of the human brain using functional magnetic resonance imaging (fMRI). Altered functional connectivity between brain regions is associated with many neurological and mental illnesses, such as Alzheimer and Parkinson diseases, addiction, and depression. In computational science, Bayesian networks (BN) have been used in a broad range of studies to model complex data set in the presence of uncertainty and wh...
Bayesian Analysis of the Black-Scholes Option Price
Darsinos, Theofanis; Satchell, Stephen E.
2001-01-01
This paper investigates the statistical properties of the Black-Scholes option price under a Bayesian approach. We incorporate randomness, both in the price process and in volatility, to derive the prior and posterior densities of a European call option. Expressions for the density of the option price conditional on the sample estimates of volatility and on the asset price respectively, are also derived. Numerical results are presented to compare how the dispersion of the option price changes...
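A minimal sketch of the idea of a posterior for the option price: propagate volatility uncertainty through the Black-Scholes formula. The sketch assumes iid normal log-returns and the standard noninformative prior p(σ²) ∝ 1/σ², under which σ² given the data follows a scaled inverse-chi-square law; the paper's full model, which also randomizes the price process, is richer than this.

```python
import numpy as np
from scipy.stats import chi2, norm

def bs_call(S, K, T, r, sigma):
    """Black-Scholes European call price (vectorized over sigma)."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

rng = np.random.default_rng(1)
returns = rng.normal(0.0, 0.2 / np.sqrt(252), size=250)  # fake daily log-returns

# With zero-mean normal returns and prior p(sigma^2) ~ 1/sigma^2, the posterior
# of the annualized variance is scaled inverse-chi-square: n*s2 / ChiSq(n).
n, s2 = len(returns), np.mean(returns**2) * 252
sigma_draws = np.sqrt(n * s2 / chi2.rvs(df=n, size=20_000, random_state=2))

prices = bs_call(S=100.0, K=100.0, T=0.5, r=0.02, sigma=sigma_draws)
print(np.percentile(prices, [2.5, 50, 97.5]))  # posterior band for the call price
```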
Hierarchical Bayesian analysis of somatic mutation data in cancer
Ding, Jie; Trippa, Lorenzo; Zhong, Xiaogang; Parmigiani, Giovanni
2013-01-01
Identifying genes underlying cancer development is critical to cancer biology and has important implications across prevention, diagnosis and treatment. Cancer sequencing studies aim at discovering genes with high frequencies of somatic mutations in specific types of cancer, as these genes are potential driving factors (drivers) for cancer development. We introduce a hierarchical Bayesian methodology to estimate gene-specific mutation rates and driver probabilities from somatic mutation data ...
Bayesian analysis of the Hector’s Dolphin data
King, R; Brooks, S.P.
2004-01-01
In recent years there have been increasing concerns for many wildlife populations, due to decreasing population trends. This has led to the introduction of management schemes to increase the survival rates and hence the population size of many species of animals. We concentrate on a particular dolphin population situated off the coast of New Zealand, and investigate whether the introduction of a fishing gill net ban was effective in decreasing dolphin mortality. We undertake a Bayesian analys...
Bayesian models for comparative analysis integrating phylogenetic uncertainty
Directory of Open Access Journals (Sweden)
Villemereuil Pierre de
2012-06-01
Abstract Background Uncertainty in comparative analyses can come from at least two sources: (a) phylogenetic uncertainty in the tree topology or branch lengths, and (b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible
Bayesian analysis of hierarchical multi-fidelity codes
Le Gratiet, Loic
2011-01-01
This paper deals with the Gaussian process based approximation of a code which can be run at different levels of accuracy. This co-kriging method allows us to improve a surrogate model of a complex computer code using fast approximations of it. In particular, we focus on the case of a large number of code levels on the one hand, and on a Bayesian approach when we have two levels on the other hand. Moreover, based on a Bayes linear formulation, an extension of the universal kriging equations is provided for the co-kriging model. We also address the problem of nested space-filling design for multi-fidelity computer experiments, and we provide a significant simplification of the computation of the co-kriging cross-validation equations. A hydrodynamic simulator example is used to illustrate the comparison of Bayesian versus non-Bayesian co-kriging. A thermodynamic example is used to illustrate the comparison between 2-level and 3-level co-kriging.
Stakhovych, Stanislav; Bijmolt, Tammo H. A.; Wedel, Michel
2012-01-01
In this article, we present a Bayesian spatial factor analysis model. We extend previous work on confirmatory factor analysis by including geographically distributed latent variables and accounting for heterogeneity and spatial autocorrelation. The simulation study shows excellent recovery of the mo
Nuclear stockpile stewardship and Bayesian image analysis (DARHT and the BIE)
Energy Technology Data Exchange (ETDEWEB)
Carroll, James L [Los Alamos National Laboratory
2011-01-11
Since the end of nuclear testing, the reliability of our nation's nuclear weapon stockpile has been assessed using subcritical hydrodynamic testing. These tests involve some rather 'extreme' radiography. We discuss the challenges of these problems and the solutions provided by DARHT (the world's premiere hydrodynamic testing facility) and the BIE, or Bayesian Inference Engine (a powerful radiography analysis software tool). We also discuss the application of Bayesian image analysis techniques to this important and difficult problem.
Wavelet coherence analysis of change blindness
International Nuclear Information System (INIS)
Change blindness is the incapability of the brain to detect substantial visual changes in the presence of other visual interruptions. The objectives of this study are to examine EEG (electroencephalographic) changes in the functional connectivity of the brain due to change blindness. The functional connectivity was estimated using the wavelet-based MSC (magnitude square coherence) function of ERPs (event related potentials). The ERPs of 30 subjects were recorded during a visual attention experiment in which subjects were instructed to detect changes in a visual stimulus presented to them on a computer monitor. A two-way ANOVA revealed a significant increase in both gamma and theta band MSC, and a significant decrease in beta band MSC, for change detection trials. These findings imply that change blindness might be associated with a lack of functional connectivity in the gamma and theta bands and an increase of functional connectivity in the beta band. Since the gamma, theta, and beta frequency bands reflect different functions of the cognitive process, such as maintenance, encoding, retrieval, matching, and workload of VSTM (visual short term memory), the changes in functional connectivity might be correlated with these cognitive processes during change blindness. (author)
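The study's wavelet MSC is time-resolved, but its frequency-band logic can be illustrated with the ordinary Welch-based magnitude-squared coherence available in SciPy; the synthetic two-channel signal below is a stand-in for a pair of ERP recordings.

```python
import numpy as np
from scipy.signal import coherence

fs = 250.0                              # sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(3)

common = np.sin(2 * np.pi * 6 * t)      # shared 6 Hz (theta band) component
x = common + rng.normal(size=t.size)        # channel 1: signal + noise
y = 0.8 * common + rng.normal(size=t.size)  # channel 2: attenuated copy + noise

f, Cxy = coherence(x, y, fs=fs, nperseg=512)  # Welch magnitude-squared coherence
theta = (f >= 4) & (f <= 8)
print(f"mean theta-band MSC: {Cxy[theta].mean():.2f}")  # high where channels co-vary
```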
A Bayesian Analysis of the Radioactive Releases of Fukushima
DEFF Research Database (Denmark)
Tomioka, Ryota; Mørup, Morten
2012-01-01
The Fukushima Daiichi disaster of 11 March 2011 is considered the largest nuclear accident since the 1986 Chernobyl disaster and has been rated at level 7 on the International Nuclear Event Scale. As different radioactive materials have different effects on the human body, it is important to know...... the types of nuclides and their levels of concentration from the recorded mixture of radiations to take necessary measures. We presently formulate a Bayesian generative model for the data available on radioactive releases from the Fukushima Daiichi disaster across Japan. From the sparsely sampled...... detailed account also of the latent structure present in the data of the Fukushima Daiichi disaster.
Introduction to Bayesian statistics
Bolstad, William M
2016-01-01
There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly-added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly-developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for the Multivariate Normal mean vector; Bayesian inference for the Multiple Linear Regression Model; and Computati...
A Bayesian Analysis of the Ages of Four Open Clusters
Jeffery, Elizabeth J; van Dyk, David A; Stenning, David C; Robinson, Elliot; Stein, Nathan; Jefferys, W H
2016-01-01
In this paper we apply a Bayesian technique to determine the best fit of stellar evolution models to find the main sequence turn off age and other cluster parameters of four intermediate-age open clusters: NGC 2360, NGC 2477, NGC 2660, and NGC 3960. Our algorithm utilizes a Markov chain Monte Carlo technique to fit these various parameters, objectively finding the best-fit isochrone for each cluster. The result is a high-precision isochrone fit. We compare these results with those of traditional "by-eye" isochrone fitting methods. By applying this Bayesian technique to NGC 2360, NGC 2477, NGC 2660, and NGC 3960, we determine the ages of these clusters to be 1.35 +/- 0.05, 1.02 +/- 0.02, 1.64 +/- 0.04, and 0.860 +/- 0.04 Gyr, respectively. The results of this paper continue our effort to determine cluster ages to higher precision than that offered by traditional methods of isochrone fitting.
Objective Bayesian analysis of "on/off" measurements
Casadei, Diego
2014-01-01
In high-energy astrophysics, it is common practice to account for the background overlaid with the counts from the source of interest with the help of auxiliary measurements carried out by pointing off-source. In this "on/off" measurement, one knows the number of photons detected while pointing at the source, the number of photons collected while pointing away from the source, and how to estimate the background counts in the source region from the flux observed in the auxiliary measurements. For very faint sources, the number of detected photons is so low that the approximations which hold asymptotically are not valid. On the other hand, an analytical solution exists for the Bayesian statistical inference, which is valid at both low and high counts. The Bayesian approach to statistical inference provides a probability distribution describing our degree of belief in the possible outcomes of the parameter of interest, over its entire range. In addition to the specification of the model, in this case assumed to obey Po...
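The posterior the abstract refers to can also be obtained numerically, which makes the logic explicit: put priors on the source and background rates, multiply the two Poisson likelihoods, and marginalize the background. Below is a grid sketch with made-up counts and flat priors assumed for simplicity; the paper works with the analytic solution instead.

```python
import numpy as np
from scipy.stats import poisson

n_on, n_off, alpha = 8, 15, 0.5   # made-up counts; alpha = t_on / t_off

s = np.linspace(0.0, 20.0, 401)   # source counts expected in the on-window
b = np.linspace(0.0, 40.0, 401)   # background counts expected in the off-window
S, B = np.meshgrid(s, b, indexing="ij")

# flat priors; likelihood = Poisson(n_on; s + alpha*b) * Poisson(n_off; b)
like = poisson.pmf(n_on, S + alpha * B) * poisson.pmf(n_off, B)

post_s = like.sum(axis=1)               # marginalize out the background rate
post_s /= post_s.sum() * (s[1] - s[0])  # normalize to a density in s
print("posterior mean source counts:", (s * post_s).sum() * (s[1] - s[0]))
```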
Bayesian analysis of deterministic and stochastic prisoner's dilemma games
Directory of Open Access Journals (Sweden)
Howard Kunreuther
2009-08-01
This paper compares the behavior of individuals playing a classic two-person deterministic prisoner's dilemma (PD) game with choice data obtained from repeated interdependent security prisoner's dilemma games with varying probabilities of loss and the ability to learn (or not) about the actions of one's counterpart, an area of recent interest in experimental economics. This novel data set, from a series of controlled laboratory experiments, is analyzed using Bayesian hierarchical methods, the first application of such methods in this research domain. We find that individuals are much more likely to be cooperative when payoffs are deterministic than when the outcomes are probabilistic. A key factor explaining this difference is that subjects in a stochastic PD game respond not just to what their counterparts did but also to whether or not they suffered a loss. These findings are interpreted in the context of behavioral theories of commitment, altruism and reciprocity. The work provides a linkage between Bayesian statistics, experimental economics, and consumer psychology.
Exclusive breastfeeding practice in Nigeria: a bayesian stepwise regression analysis.
Gayawan, Ezra; Adebayo, Samson B; Chitekwe, Stanley
2014-11-01
Despite the importance of breast milk, the prevalence of exclusive breastfeeding (EBF) in Nigeria is far lower than what has been recommended for developing countries. Worse still, the practice has been on a downward trend in the country recently. This study was aimed at investigating the determinants and geographical variations of EBF in Nigeria. Any intervention programme would require a good knowledge of the factors that enhance the practice. A pooled data set from the Nigeria Demographic and Health Surveys conducted in 1999, 2003, and 2008 was analyzed using a Bayesian stepwise approach that involves simultaneous selection of variables and smoothing parameters. Further, the approach allows for geographical variations at a highly disaggregated level of states to be investigated. Within a Bayesian context, appropriate priors are assigned to all the parameters and functions. Findings reveal that the education of women and their partners, place of delivery, mother's age at birth, and current age of child are associated with increasing prevalence of EBF. However, visits for antenatal care during pregnancy are not associated with EBF in Nigeria. Further, results reveal considerable geographical variations in the practice of EBF. The likelihood of exclusively breastfeeding children is significantly higher in Kwara, Kogi, Osun, and Oyo states but lower in Jigawa, Katsina, and Yobe. Intensive interventions that can lead to improved practice are required in all states in Nigeria. The importance of breastfeeding needs to be emphasized to women during antenatal visits as this can encourage and enhance the practice after delivery. PMID:24619227
Risk Analysis of New Product Development Using Bayesian Networks
Directory of Open Access Journals (Sweden)
Mohammad Rahim Ramezanian
2012-06-01
The process of bringing new product development (NPD) to market is of great importance due to the variability of competitive rules in the business world. Product development teams face a lot of pressure due to the rapid growth of technology, the increased risk-taking of world markets, and increasing variation in customers' needs. However, the process of NPD is always associated with high uncertainties and complexities. To be successful in completing an NPD project, existing risks should be identified and assessed. On the other hand, Bayesian networks, as a strong approach for modeling decision making under uncertainty, have attracted many researchers in various areas. These networks provide a decision-supporting system for problems with uncertainty or probabilistic reasoning. In this paper, the risk factors present in product development were first identified in an electric company; a Bayesian network was then utilized and their interrelationships modeled to evaluate the risk in the process. To determine the prior and conditional probabilities of the nodes, the viewpoints of experts in this area were applied. The risks in this process were divided into High (H), Medium (M), and Low (L) groups and analyzed with the Agena Risk software. The findings derived from the software output indicate that the production of the desired product has relatively high risk. In addition, predictive support and diagnostic support were performed on the model with two different scenarios.
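To see the kind of computation such a network performs, here is a toy two-parent risk node evaluated directly in numpy; the node names, states, and probabilities are hypothetical stand-ins for the expert-elicited tables described above, not the paper's values.

```python
import numpy as np

# P(Risk) = sum over Tech, Market of P(Tech) * P(Market) * P(Risk | Tech, Market)
p_tech = np.array([0.3, 0.5, 0.2])     # P(technology risk = H, M, L)
p_market = np.array([0.4, 0.4, 0.2])   # P(market risk = H, M, L)
# cpt[i, j, k] = P(Risk = k | Tech = i, Market = j); each row sums to 1 over k
cpt = np.array([[[0.8, 0.15, 0.05], [0.6, 0.3, 0.1], [0.5, 0.3, 0.2]],
                [[0.6, 0.3, 0.1],   [0.3, 0.5, 0.2], [0.2, 0.5, 0.3]],
                [[0.4, 0.4, 0.2],   [0.2, 0.5, 0.3], [0.1, 0.3, 0.6]]])

p_risk = np.einsum("i,j,ijk->k", p_tech, p_market, cpt)
print(dict(zip("HML", p_risk.round(3))))   # predictive support: marginal P(Risk)

# diagnostic support: observe Risk = H and update the belief about Tech
joint = np.einsum("i,j,ijk->ijk", p_tech, p_market, cpt)
p_tech_given_high = joint[:, :, 0].sum(axis=1) / joint[:, :, 0].sum()
print(p_tech_given_high.round(3))          # posterior P(Tech | Risk = H)
```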
Theoretical analysis of the coherence-brightened laser in air
Yuan, Luqi; Hokr, Brett H.; Traverso, Andrew J.; Voronine, Dmitri V.; Rostovtsev, Yuri; Sokolov, Alexei V.; Scully, Marlan O.
2013-02-01
We present a detailed theoretical study of a recent experiment [A. J. Traverso et al., Proc. Natl. Acad. Sci. USA 109, 15185 (2012), doi:10.1073/pnas.1211481109] in which a laserlike source is created in air by pumping with a nanosecond pulse. The source generates radiation in the forward and backward directions. The temporal behavior of the emitted pulses is investigated for different pump shapes and durations. Our analysis indicates that the spiky emission is due to quantum coherence via cooperation between atoms of an ensemble, which leads to strong-oscillatory superfluorescence. We show that these cooperative nonadiabatic coherence effects cannot be described by rate equations and instead a full set of the Maxwell-Bloch equations must be used. We consider a range of parameters and study transitions between various regimes. Understanding these coherence-brightened processes in air should lead to improvements in environmental and atmospheric remote sensing and other applications.
Dyadic analysis of partially coherent submillimeter-wave antenna systems
Withington, S.; Yassin, G.; Murphy, J. A.
2001-08-01
We describe a procedure for simulating the behavior of partially coherent submillimeter-wave antenna systems. The procedure is based on the principle that the second-order statistical properties of any partially coherent vector field can be decomposed into a sum of fully coherent, but completely uncorrelated, natural modes. Any of the standard electromagnetic analysis techniques (physical optics, geometrical theory of diffraction, etc.) can be used to propagate and scatter the modes individually, and the statistical properties of the total transformed field reconstructed at the output surface by means of superposition. In the case of modal optics (plane waves, Gaussian optics, waveguide mode matching, etc.), the properties of the field can be traced directly by means of scattering matrices. The overall procedure is of considerable value for calculating the behavior of astronomical instruments comprising planar and waveguide multimode bolometers, submillimeter-wave optical components, and large reflecting antennas.
Xu, Chengcheng; Wang, Wei; Liu, Pan; Li, Zhibin
2015-12-01
This study aimed to develop a real-time crash risk model with limited data in China by using a Bayesian meta-analysis and a Bayesian inference approach. A systematic review was first conducted using three different Bayesian meta-analyses: a fixed effect meta-analysis, a random effect meta-analysis, and a meta-regression. The meta-analyses provided a numerical summary of the effects of traffic variables on crash risk by quantitatively synthesizing results from previous studies. The random effect meta-analysis and the meta-regression produced more conservative estimates of the effects of the traffic variables than the fixed effect meta-analysis. The meta-analysis results were then used as informative priors for developing crash risk models with limited data. The choice among the three meta-analyses significantly affects model fit and prediction accuracy. The model based on the meta-regression can increase prediction accuracy by about 15% compared to a model developed directly from the limited data. Finally, Bayesian predictive density analysis was used to identify outliers in the limited data, which further improved prediction accuracy by 5.0%.
Gutiérrez, Jose Manuel; San Martín, Daniel; Herrera, Sixto; Santiago Cofiño, Antonio
2016-04-01
The growing availability of spatial datasets (observations, reanalyses, and regional and global climate models) demands efficient multivariate spatial modeling techniques for many problems of interest (e.g. teleconnection analysis, multi-site downscaling, etc.). Complex networks have recently been applied in this context using graphs built from pairwise correlations between the different stations (or grid boxes) forming the dataset. However, this analysis does not take into account the full dependence structure underlying the data, given by all possible marginal and conditional dependencies among the stations, and does not allow a probabilistic analysis of the dataset. In this talk we introduce Bayesian networks as an alternative multivariate, data-driven analysis and modeling technique which allows building a joint probability distribution of the stations including all relevant dependencies in the dataset. Bayesian networks are a sound machine learning technique that uses a graph (1) to encode the main dependencies among the variables and (2) to obtain a factorization of the joint probability distribution of the stations given by a reduced number of parameters. For a particular problem, the resulting graph provides a qualitative analysis of the spatial relationships in the dataset (an alternative to complex network analysis), and the resulting model allows for a probabilistic analysis of the dataset. Bayesian networks have been widely applied in many fields, but their use in climate problems is hampered by the large number of variables (stations) involved, since the complexity of the existing algorithms for learning the graphical structure from data grows nonlinearly with the number of variables. In this contribution we present a modified local learning algorithm for Bayesian networks adapted to this problem, which allows inferring the graphical structure for thousands of stations (from observations) and/or grid boxes (from model simulations), thus providing new
Bayesian Analysis of Multiple Populations I: Statistical and Computational Methods
Stenning, D C; Robinson, E; van Dyk, D A; von Hippel, T; Sarajedini, A; Stein, N
2016-01-01
We develop a Bayesian model for globular clusters composed of multiple stellar populations, extending earlier statistical models for open clusters composed of simple (single) stellar populations (van Dyk et al. 2009; Stein et al. 2013). Specifically, we model globular clusters with two populations that differ in helium abundance. Our model assumes a hierarchical structuring of the parameters in which the physical properties (age, metallicity, helium abundance, distance, absorption, and initial mass) are common to (i) the cluster as a whole or to (ii) individual populations within a cluster, or are unique to (iii) individual stars. An adaptive Markov chain Monte Carlo (MCMC) algorithm is devised for model fitting that greatly improves convergence relative to its precursor non-adaptive MCMC algorithm. Our model and computational tools are incorporated into an open-source software suite known as BASE-9. We use numerical studies to demonstrate that our method can recover parameters of two-population clusters, and al...
Unsupervised Transient Light Curve Analysis Via Hierarchical Bayesian Inference
Sanders, Nathan; Soderberg, Alicia
2014-01-01
Historically, light curve studies of supernovae (SNe) and other transient classes have focused on individual objects with copious and high signal-to-noise observations. In the nascent era of wide field transient searches, objects with detailed observations are decreasing as a fraction of the overall known SN population, and this strategy sacrifices the majority of the information contained in the data about the underlying population of transients. A population level modeling approach, simultaneously fitting all available observations of objects in a transient sub-class of interest, fully mines the data to infer the properties of the population and avoids certain systematic biases. We present a novel hierarchical Bayesian statistical model for population level modeling of transient light curves, and discuss its implementation using an efficient Hamiltonian Monte Carlo technique. As a test case, we apply this model to the Type IIP SN sample from the Pan-STARRS1 Medium Deep Survey, consisting of 18,837 photometr...
Variational Bayesian Causal Connectivity Analysis for fMRI
Directory of Open Access Journals (Sweden)
Martin Luessi
2014-05-01
The ability to accurately estimate effective connectivity among brain regions from neuroimaging data could help answer many open questions in neuroscience. We propose a method which uses causality to obtain a measure of effective connectivity from fMRI data. The method uses a vector autoregressive model for the latent variables describing neuronal activity, in combination with a linear observation model based on a convolution with a hemodynamic response function. Due to the employed modeling, it is possible to efficiently estimate all latent variables of the model using a variational Bayesian inference algorithm. The computational efficiency of the method enables us to apply it to large scale problems with high sampling rates and several hundred regions of interest. We use a comprehensive empirical evaluation with synthetic and real fMRI data to evaluate the performance of our method under various conditions.
A Software Risk Analysis Model Using Bayesian Belief Network
Institute of Scientific and Technical Information of China (English)
Yong Hu; Juhua Chen; Mei Liu; Yang Yun; Junbiao Tang
2006-01-01
The uncertainty during the period of software project development often brings huge risks to contractors and clients. If we can find an effective method to predict the cost and quality of software projects based on facts such as the project character and two-side cooperating capability at the beginning of the project, we can reduce the risk. A Bayesian Belief Network (BBN) is a good tool for analyzing uncertain consequences, but it is difficult to produce a precise network structure and conditional probability table. In this paper, we built up the network structure by the Delphi method for conditional probability table learning, and continuously update the probability table and nodes' confidence levels according to application cases, which gives the evaluation network learning ability and lets it evaluate the software development risk of an organization more accurately. This paper also introduces the EM algorithm, which enhances the ability to produce hidden nodes caused by variant software projects.
Family Background Variables as Instruments for Education in Income Regressions: A Bayesian Analysis
Hoogerheide, Lennart; Block, Joern H.; Thurik, Roy
2012-01-01
The validity of family background variables instrumenting education in income regressions has been much criticized. In this paper, we use data from the 2004 German Socio-Economic Panel and Bayesian analysis to analyze to what degree violations of the strict validity assumption affect the estimation results. We show that, in case of moderate direct…
In this paper, the Genetic Algorithms (GA) and Bayesian model averaging (BMA) were combined to simultaneously conduct calibration and uncertainty analysis for the Soil and Water Assessment Tool (SWAT). In this hybrid method, several SWAT models with different structures are first selected; next GA i...
A Bayesian multidimensional scaling procedure for the spatial analysis of revealed choice data
DeSarbo, WS; Kim, Y; Fong, D
1999-01-01
We present a new Bayesian formulation of a vector multidimensional scaling procedure for the spatial analysis of binary choice data. The Gibbs sampler is gainfully employed to estimate the posterior distribution of the specified scalar products, bilinear model parameters. The computational procedure
Bayesian Factor Analysis as a Variable-Selection Problem: Alternative Priors and Consequences.
Lu, Zhao-Hua; Chow, Sy-Miin; Loken, Eric
2016-01-01
Factor analysis is a popular statistical technique for multivariate data analysis. Developments in the structural equation modeling framework have enabled the use of hybrid confirmatory/exploratory approaches in which factor-loading structures can be explored relatively flexibly within a confirmatory factor analysis (CFA) framework. Recently, Muthén & Asparouhov proposed a Bayesian structural equation modeling (BSEM) approach to explore the presence of cross loadings in CFA models. We show that the issue of determining factor-loading patterns may be formulated as a Bayesian variable selection problem in which Muthén and Asparouhov's approach can be regarded as a BSEM approach with ridge regression prior (BSEM-RP). We propose another Bayesian approach, denoted herein as the Bayesian structural equation modeling with spike-and-slab prior (BSEM-SSP), which serves as a one-stage alternative to the BSEM-RP. We review the theoretical advantages and disadvantages of both approaches and compare their empirical performance relative to two modification indices-based approaches and exploratory factor analysis with target rotation. A teacher stress scale data set is used to demonstrate our approach. PMID:27314566
Micronutrients in HIV: a Bayesian meta-analysis.
Directory of Open Access Journals (Sweden)
George M Carter
Approximately 28.5 million people living with HIV are eligible for treatment (CD4<500), but currently have no access to antiretroviral therapy. Reduced serum levels of micronutrients are common in HIV disease. Micronutrient supplementation (MNS) may mitigate disease progression and mortality. We synthesized evidence on the effect of micronutrient supplementation on mortality and the rate of disease progression in HIV disease. We searched the MEDLINE, EMBASE, Cochrane Central, AMED and CINAHL databases through December 2014, without language restriction, for studies of more than 3 micronutrients versus any or no comparator. We built a hierarchical Bayesian random effects model to synthesize the results. Inferences are based on the posterior distribution of the population effects; posterior distributions were approximated by Markov chain Monte Carlo in OpenBugs. From 2166 initial references, we selected 49 studies for full review and identified eight reporting on disease progression and/or mortality. Bayesian synthesis of data from 2,249 adults in three studies estimated the relative risk of disease progression in subjects on MNS vs. control as 0.62 (95% credible interval 0.37, 0.96). The median number needed to treat is 8.4 (4.8, 29.9) and the Bayes factor 53.4. Based on data from 4,095 adults reporting mortality in 7 randomized controlled studies, the RR was 0.84 (0.38, 1.85) and the NNT 25 (4.3, ∞). MNS significantly and substantially slows disease progression in HIV+ adults not on ARV, and possibly reduces mortality. Micronutrient supplements are effective in reducing progression, with a posterior probability of 97.9%. Considering the low cost of MNS and its lack of adverse effects, MNS should be standard of care for HIV+ adults not yet on ARV.
Direct message passing for hybrid Bayesian networks and performance analysis
Sun, Wei; Chang, K. C.
2010-04-01
Probabilistic inference for hybrid Bayesian networks, which involve both discrete and continuous variables, has been an important research topic over recent years. This is not only because a number of efficient inference algorithms have been developed and are in mature use for simple types of networks such as pure discrete models, but also because of the practical need for continuous variables, which are inevitable in modeling complex systems. Pearl's message passing algorithm provides a simple framework to compute posterior distributions by propagating messages between nodes, and can provide exact answers for polytree models with pure discrete or continuous variables. In addition, applying Pearl's message passing to networks with loops usually converges and results in a good approximation. However, for hybrid models, a general message passing algorithm between different types of variables is needed. In this paper, we develop a method called Direct Message Passing (DMP) for exchanging messages between discrete and continuous variables. Based on Pearl's algorithm, we derive formulae to compute messages for variables in the various dependence relationships encoded in conditional probability distributions. A mixture of Gaussians is used to represent continuous messages, with the number of mixture components up to the size of the joint state space of all discrete parents. For a polytree Conditional Linear Gaussian (CLG) Bayesian network, DMP has the same computational requirements as the Junction Tree (JT) algorithm and can provide the same exact solution. However, while JT can only work for the CLG model, DMP can be applied to general nonlinear, non-Gaussian hybrid models to produce approximate solutions using the unscented transformation and loopy propagation. Furthermore, we can scale the algorithm by restricting the number of mixture components in the messages. Empirically, we found that the approximation errors are relatively small, especially for nodes that are far away from
Figueira, P.; Faria, J. P.; Adibekyan, V. Zh.; Oshagh, M.; Santos, N. C.
2016-11-01
We apply the Bayesian framework to assess the presence of a correlation between two quantities. To do so, we estimate the probability distribution of the parameter of interest, ρ, characterizing the strength of the correlation. We provide an implementation of these ideas and concepts using the python programming language and the pyMC module in a very short (~130 lines of code, heavily commented) and user-friendly program. We used this tool to assess the presence and properties of the correlation between planetary surface gravity and stellar activity level as measured by the $\log(R'_{\mathrm{HK}})$ indicator. The results of the Bayesian analysis are qualitatively similar to those obtained via p-value analysis, and support the presence of a correlation in the data. The results are more robust in their derivation and more informative, revealing interesting features such as asymmetric posterior distributions or markedly different credible intervals, and allowing for a deeper exploration. We encourage the reader interested in this kind of problem to apply our code to his/her own scientific problems. The full understanding of what the Bayesian framework is can only be gained through the insight that comes by handling priors, assessing the convergence of Monte Carlo runs, and a multitude of other practical problems. We hope to contribute so that Bayesian analysis becomes a tool in the toolkit of researchers, and they understand by experience its advantages and limitations.
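A compact numerical analogue of such a program is sketched below: the posterior of ρ for a bivariate normal, evaluated on a grid with a flat prior on (-1, 1). For brevity this sketch plugs in sample means and variances rather than marginalizing them, which the full ~130-line pyMC model would handle properly.

```python
import numpy as np

def rho_posterior(x, y, m=399):
    """Grid posterior of the correlation rho of a bivariate normal.

    Data are standardized and sample means/variances are plugged in, so only
    rho is treated as unknown; the prior on rho is flat on (-1, 1).
    """
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n, sxx, syy, sxy = len(x), np.sum(x * x), np.sum(y * y), np.sum(x * y)
    grid = np.linspace(-0.99, 0.99, m)
    loglik = (-0.5 * n * np.log(1 - grid**2)
              - (sxx - 2 * grid * sxy + grid**2 * syy) / (2 * (1 - grid**2)))
    post = np.exp(loglik - loglik.max())
    return grid, post / np.trapz(post, grid)   # normalized posterior density

rng = np.random.default_rng(4)
x = rng.normal(size=80)
y = 0.5 * x + rng.normal(scale=np.sqrt(0.75), size=80)   # true rho = 0.5
grid, post = rho_posterior(x, y)
print("posterior mode of rho:", grid[np.argmax(post)].round(2))
```

The full posterior, rather than a single p-value, is what reveals the asymmetries and credible-interval differences the abstract describes.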
Buddhavarapu, Prasad; Smit, Andre F; Prozzi, Jorge A
2015-07-01
Permeable friction course (PFC), a porous hot-mix asphalt, is typically applied to improve wet-weather safety on high-speed roadways in Texas. In order to warrant expensive PFC construction, a statistical evaluation of its safety benefits is essential. Generally, the literature on the effectiveness of porous mixes in reducing wet-weather crashes is limited and often inconclusive. In this study, the safety effectiveness of PFC was evaluated using a fully Bayesian before-after safety analysis. First, two groups of road segments overlaid with PFC and non-PFC material were identified across Texas; the non-PFC or reference road segments selected were similar to their PFC counterparts in terms of site-specific features. Second, a negative binomial data-generating process was assumed to model the underlying distribution of crash counts of PFC and reference road segments to perform Bayesian inference on the safety effectiveness. A computationally efficient data-augmentation-based algorithm was employed for fully Bayesian estimation. The statistical analysis shows that PFC is not effective in reducing wet-weather crashes. It should be noted that the findings of this study are in agreement with the existing literature, although those studies were not based on a fully Bayesian statistical analysis. Our study suggests that the safety effectiveness of PFC road surfaces, or any other safety infrastructure, largely relies on its interrelationship with the road user. The results suggest that safety infrastructure must be properly used to reap the benefits of the substantial investments. PMID:25897515
Kwon, Deukwoo; Hoffman, F Owen; Moroz, Brian E; Simon, Steven L
2016-02-10
Most conventional risk analysis methods rely on a single best estimate of exposure per person, which does not allow for adjustment for exposure-related uncertainty. Here, we propose a Bayesian model averaging method to properly quantify the relationship between radiation dose and disease outcomes by accounting for shared and unshared uncertainty in estimated dose. Our Bayesian risk analysis method utilizes multiple realizations of sets (vectors) of doses generated by a two-dimensional Monte Carlo simulation method that properly separates shared and unshared errors in dose estimation. The exposure model used in this work is taken from a study of the risk of thyroid nodules among a cohort of 2376 subjects who were exposed to fallout from nuclear testing in Kazakhstan. We assessed the performance of our method through an extensive series of simulations and comparisons against conventional regression risk analysis methods. When the estimated doses contain relatively small amounts of uncertainty, the Bayesian method using multiple a priori plausible draws of dose vectors gave similar results to the conventional regression-based methods of dose-response analysis. However, when large and complex mixtures of shared and unshared uncertainties are present, the Bayesian method using multiple dose vectors had significantly lower relative bias than conventional regression-based risk analysis methods and better coverage, that is, a markedly increased capability to include the true risk coefficient within the 95% credible interval of the Bayesian-based risk estimate. An evaluation of the dose-response using our method is presented for an epidemiological study of thyroid disease following radiation exposure.
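The core mechanic, averaging the dose-response posterior over many plausible dose vectors, can be sketched as below. This is a hedged toy version: a linear-Gaussian response stands in for the study's thyroid-nodule risk model, and all numbers are synthetic.

```python
# Hedged sketch: average the slope posterior over many dose realizations,
# mimicking shared (multiplicative) and unshared (per-subject) dose errors.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, n_real = 200, 50
true_dose = rng.gamma(2.0, 0.5, size=n)
response = 1.0 + 0.8 * true_dose + rng.normal(scale=0.5, size=n)

shared = rng.lognormal(0.0, 0.2, size=n_real)           # shared error per vector
unshared = rng.lognormal(0.0, 0.3, size=(n_real, n))    # unshared errors
dose_real = true_dose * shared[:, None] * unshared      # plausible dose vectors

beta = np.linspace(0.0, 2.0, 401)      # grid for the dose-response slope
mix = np.zeros_like(beta)
for d in dose_real:                    # one grid posterior per realization
    # flat prior; Gaussian likelihood with known sigma, intercept fixed at 1
    ll = np.array([stats.norm.logpdf(response, 1.0 + b * d, 0.5).sum()
                   for b in beta])
    post = np.exp(ll - ll.max())
    mix += post / post.sum()           # accumulate the posterior mixture
mix /= n_real

print("posterior mean slope:", (beta * mix).sum())
```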
Iskandar, Ismed; Satria Gondokaryono, Yudi
2016-02-01
In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that systems are described and explained as simply functioning or failed. In many real situations, failures may arise from many causes, depending on the age and the environment of the system and its components. Another problem in reliability theory is estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation allows us to combine past knowledge or experience, in the form of an a priori distribution, with life test data to make inferences about the parameter of interest. In this paper, we investigate the application of Bayesian estimation to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample size. The simulation data are analyzed using Bayesian and maximum likelihood analyses. The simulation results show that a change in one true parameter relative to another changes the value of the standard deviation in the opposite direction. For perfect information on the prior distribution, the Bayesian estimates are better than the maximum likelihood ones. The sensitivity analyses show some amount of sensitivity to shifts of the prior locations. They also show the robustness of the Bayesian analysis within the range
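As a concrete illustration of the kind of computation involved, the sketch below performs grid-based Bayesian estimation of Weibull shape and scale from uncensored failure times of a single cause and compares it with maximum likelihood; the priors and sample values are our own illustrative choices, not the paper's.

```python
# Hedged illustration: grid posterior for Weibull shape/scale versus the MLE.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
t = stats.weibull_min.rvs(1.5, scale=100.0, size=30, random_state=rng)

shape = np.linspace(0.5, 3.0, 200)
scale = np.linspace(50.0, 200.0, 200)
S, C = np.meshgrid(shape, scale, indexing="ij")

# log-likelihood of all observations on the (shape, scale) grid
ll = stats.weibull_min.logpdf(t[:, None, None], S, scale=C).sum(axis=0)
# weakly informative log-normal priors (illustrative)
lp = stats.lognorm.logpdf(S, 0.5, scale=1.5) + stats.lognorm.logpdf(C, 0.5, scale=100.0)

post = np.exp(ll + lp - (ll + lp).max())
post /= post.sum()
print("posterior means (shape, scale):", (post * S).sum(), (post * C).sum())

k_mle, _, lam_mle = stats.weibull_min.fit(t, floc=0)  # MLE for comparison
print("MLE (shape, scale):", k_mle, lam_mle)
```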
Damage Detection and Quantification Using Transmissibility Coherence Analysis
Directory of Open Access Journals (Sweden)
Yun-Lai Zhou
2015-01-01
A new transmissibility-based damage detection and quantification approach is proposed. Based on operational modal analysis, the transmissibility is extracted from system responses, and the transmissibility coherence is defined and analyzed. Afterwards, a damage-sensitive indicator is defined in order to detect and identify the severity of damage, and compared with an indicator developed by other authors. The proposed approach is validated on data from a physics-based numerical model as well as experimental data from a three-story aluminum frame structure. For both the numerical simulation and the experiment, the results of the new indicator reveal a better performance than the coherence measure proposed in Rizos et al. (2008), Rizos et al. (2002), and Fassois and Sakellariou (2007), especially when nonlinearity occurs, which might be further used in real engineering. The main contribution of this study is the construction of the relation between transmissibility coherence and frequency response function coherence, and the construction of an effective indicator based on the transmissibility modal assurance criterion for damage (especially minor nonlinearity) detection as well as quantification.
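To make the central quantities concrete, here is a hedged sketch (synthetic signals, illustrative filters, not the paper's indicator) of estimating a transmissibility between two measured responses and their ordinary coherence with SciPy; a scalar indicator can then be built from the coherence drop.

```python
# Hedged sketch: transmissibility estimate between two responses and their
# coherence; the signals are synthetic filtered versions of one excitation.
import numpy as np
from scipy import signal

fs = 1000.0
rng = np.random.default_rng(2)
excitation = rng.normal(size=20000)
b1, a1 = signal.butter(4, 0.2)   # two illustrative response paths
b2, a2 = signal.butter(4, 0.3)
x = signal.lfilter(b1, a1, excitation) + 0.05 * rng.normal(size=20000)
y = signal.lfilter(b2, a2, excitation) + 0.05 * rng.normal(size=20000)

f, Pxy = signal.csd(x, y, fs=fs, nperseg=1024)
_, Pyy = signal.welch(y, fs=fs, nperseg=1024)
T = Pxy / Pyy                                  # transmissibility estimate
_, coh = signal.coherence(x, y, fs=fs, nperseg=1024)

# a simple scalar indicator: mean deviation of coherence from unity
indicator = np.mean(1.0 - coh)
print("peak |T|:", np.abs(T).max(), "coherence-drop indicator:", indicator)
```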
Bayesian Analysis of Cosmic Ray Propagation: Evidence against Homogeneous Diffusion
Jóhannesson, G.; Ruiz de Austri, R.; Vincent, A. C.; Moskalenko, I. V.; Orlando, E.; Porter, T. A.; Strong, A. W.; Trotta, R.; Feroz, F.; Graff, P.; Hobson, M. P.
2016-06-01
We present the results of the most complete scan of the parameter space for cosmic ray (CR) injection and propagation. We perform a Bayesian search of the main GALPROP parameters, using the MultiNest nested sampling algorithm, augmented by the BAMBI neural network machine-learning package. This is the first study to separate out low-mass isotopes (p, \\bar{p}, and He) from the usual light elements (Be, B, C, N, and O). We find that the propagation parameters that best fit p, \\bar{p}, and He data are significantly different from those that fit light elements, including the B/C and 10Be/9Be secondary-to-primary ratios normally used to calibrate propagation parameters. This suggests that each set of species is probing a very different interstellar medium, and that the standard approach of calibrating propagation parameters using B/C can lead to incorrect results. We present posterior distributions and best-fit parameters for propagation of both sets of nuclei, as well as for the injection abundances of elements from H to Si. The input GALDEF files with these new parameters will be included in an upcoming public GALPROP update.
On the On-Off Problem: An Objective Bayesian Analysis
Ahnen, Max Ludwig
2015-01-01
The On-Off problem, a.k.a. the Li-Ma problem, is a statistical problem where a measured rate is the sum of two parts. The first is due to a signal and the second due to a background, both of which are unknown. Mostly frequentist solutions are in use, which are only adequate for high count numbers. When the events are rare, such an approximation is not good enough. Indeed, in high-energy astrophysics this is often the rule rather than the exception. I will present a universal objective Bayesian solution that depends only on the initial three parameters of the On-Off problem: the number of events in the "on" region, the number of events in the "off" region, and the ratio of their exposures. With a two-step approach it is possible to infer the signal's significance, strength, uncertainty or upper limit in a unified way. The approach is valid without restrictions for any count number including zero and may be widely applied in particle physics, cosmic-ray physics and high-energy astrophysics. I apply the method to Gamma ...
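A hedged numerical sketch of this setup (flat priors and toy counts of our choosing, not the paper's objective prior): a joint Poisson likelihood for the on/off counts, with the background marginalized on a grid.

```python
# Hedged sketch of the On-Off setup: Poisson likelihoods for the "on" and
# "off" counts, flat priors, numerical marginalization of the background b.
import numpy as np
from scipy import stats

n_on, n_off, alpha = 10, 12, 0.5   # alpha = on/off exposure ratio (toy values)

s = np.linspace(0.0, 30.0, 601)[:, None]    # signal grid
b = np.linspace(0.01, 60.0, 1201)[None, :]  # background grid (in "off" units)

log_post = (stats.poisson.logpmf(n_on, s + alpha * b)
            + stats.poisson.logpmf(n_off, b))       # flat priors on s and b
post = np.exp(log_post - log_post.max())
p_s = post.sum(axis=1)                              # marginalize background
ds = s[1, 0] - s[0, 0]
p_s /= p_s.sum() * ds

print("posterior mean signal:", np.sum(s[:, 0] * p_s) * ds)
```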
Bayesian Analysis for Exponential Random Graph Models Using the Adaptive Exchange Sampler
Jin, Ick Hoon; Yuan, Ying; Liang, Faming
2013-01-01
Exponential random graph models have been widely used in social network analysis. However, these models are extremely difficult to handle from a statistical viewpoint, because of the intractable normalizing constant and model degeneracy. In this paper, we consider a fully Bayesian analysis for exponential random graph models using the adaptive exchange sampler, which solves the intractable normalizing constant and model degeneracy issues encountered in Markov chain Monte Carlo (MCMC) simulati...
OVERALL SENSITIVITY ANALYSIS UTILIZING BAYESIAN NETWORK FOR THE QUESTIONNAIRE INVESTIGATION ON SNS
Tsuyoshi Aburai; Kazuhiro Takeyasu
2013-01-01
Social Networking Service (SNS) is prevailing rapidly in Japan in recent years. The most popular ones are Facebook, mixi, and Twitter, which are utilized in various fields of life together with convenient tools such as smartphones. In this work, a questionnaire investigation is carried out in order to clarify the current usage conditions, issues and desired functions. More than 1,000 samples are gathered. A Bayesian network is utilized for this analysis. Sensitivity analysis is carried out by...
UNSUPERVISED TRANSIENT LIGHT CURVE ANALYSIS VIA HIERARCHICAL BAYESIAN INFERENCE
Energy Technology Data Exchange (ETDEWEB)
Sanders, N. E.; Soderberg, A. M. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Betancourt, M., E-mail: nsanders@cfa.harvard.edu [Department of Statistics, University of Warwick, Coventry CV4 7AL (United Kingdom)
2015-02-10
Historically, light curve studies of supernovae (SNe) and other transient classes have focused on individual objects with copious and high signal-to-noise observations. In the nascent era of wide field transient searches, objects with detailed observations are decreasing as a fraction of the overall known SN population, and this strategy sacrifices the majority of the information contained in the data about the underlying population of transients. A population level modeling approach, simultaneously fitting all available observations of objects in a transient sub-class of interest, fully mines the data to infer the properties of the population and avoids certain systematic biases. We present a novel hierarchical Bayesian statistical model for population level modeling of transient light curves, and discuss its implementation using an efficient Hamiltonian Monte Carlo technique. As a test case, we apply this model to the Type IIP SN sample from the Pan-STARRS1 Medium Deep Survey, consisting of 18,837 photometric observations of 76 SNe, corresponding to a joint posterior distribution with 9176 parameters under our model. Our hierarchical model fits provide improved constraints on light curve parameters relevant to the physical properties of their progenitor stars relative to modeling individual light curves alone. Moreover, we directly evaluate the probability for occurrence rates of unseen light curve characteristics from the model hyperparameters, addressing observational biases in survey methodology. We view this modeling framework as an unsupervised machine learning technique with the ability to maximize scientific returns from data to be collected by future wide field transient searches like LSST.
Individual organisms as units of analysis: Bayesian-clustering alternatives in population genetics.
Mank, Judith E; Avise, John C
2004-12-01
Population genetic analyses traditionally focus on the frequencies of alleles or genotypes in 'populations' that are delimited a priori. However, there are potential drawbacks of amalgamating genetic data into such composite attributes of assemblages of specimens: genetic information on individual specimens is lost or submerged as an inherent part of the analysis. A potential also exists for circular reasoning when a population's initial identification and subsequent genetic characterization are coupled. In principle, these problems are circumvented by some newer methods of population identification and individual assignment based on statistical clustering of specimen genotypes. Here we evaluate a recent method in this genre--Bayesian clustering--using four genotypic data sets involving different types of molecular markers in non-model organisms from nature. As expected, measures of population genetic structure (F_ST and Φ_ST) tended to be significantly greater in Bayesian a posteriori data treatments than in analyses where populations were delimited a priori. In the four biological contexts examined, which involved both geographic population structures and hybrid zones, Bayesian clustering was able to recover differentiated populations, and Bayesian assignments were able to identify likely population sources of specific individuals.
Application of Bayesian graphs to SN Ia data analysis and compression
Ma, Cong; Corasaniti, Pier-Stefano; Bassett, Bruce A.
2016-08-01
Bayesian graphical models are an efficient tool for modelling complex data and for deriving self-consistent expressions of the posterior distribution of model parameters. We apply Bayesian graphs to perform statistical analyses of Type Ia supernova (SN Ia) luminosity distance measurements from the Joint Light-curve Analysis (JLA) dataset (Betoule et al. 2014). In contrast to the χ2 approach used in previous studies, the Bayesian inference allows us to fully account for the standard-candle parameter dependence of the data covariance matrix. Comparing with χ2 analysis results we find a systematic offset of the marginal model parameter bounds. We demonstrate that the bias is statistically significant in the case of the SN Ia standardization parameters with a maximal 6σ shift of the SN light-curve colour correction. In addition, we find that the evidence for a host galaxy correction is now only 2.4σ. Systematic offsets on the cosmological parameters remain small, but may increase by combining constraints from complementary cosmological probes. The bias of the χ2 analysis is due to neglecting the parameter-dependent log-determinant of the data covariance, which gives more statistical weight to larger values of the standardization parameters. We find a similar effect on compressed distance modulus data. To this end we implement a fully consistent compression method of the JLA dataset that uses a Gaussian approximation of the posterior distribution for fast generation of compressed data. Overall, the results of our analysis emphasize the need for a fully consistent Bayesian statistical approach in the analysis of future large SN Ia datasets.
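The point about the neglected log-determinant can be seen in a few lines. Below is a toy Gaussian model (our own construction, not the JLA likelihood) in which the covariance grows with a standardization-like parameter a, so the χ2 term alone keeps decreasing while the full -2 ln L penalizes inflated covariances.

```python
# Toy comparison of the two objective functions: chi^2 alone versus the full
# Gaussian -2 ln L, which adds the parameter-dependent log-determinant.
import numpy as np

def chi2(residual, cov):
    return residual @ np.linalg.solve(cov, residual)

def minus_2_log_like(residual, cov):
    _, logdet = np.linalg.slogdet(cov)
    return chi2(residual, cov) + logdet   # constant terms dropped

rng = np.random.default_rng(3)
data = rng.normal(size=5)
for a in (0.5, 1.0, 2.0):
    cov = (1.0 + a**2) * np.eye(5)        # covariance grows with parameter a
    print(a, chi2(data, cov), minus_2_log_like(data, cov))
```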
Bayesian Statistical Analysis Applied to NAA Data for Neutron Flux Spectrum Determination
Chiesa, D.; Previtali, E.; Sisti, M.
2014-04-01
In this paper, we present a statistical method, based on Bayesian statistics, to evaluate the neutron flux spectrum from the activation data of different isotopes. The experimental data were acquired during a neutron activation analysis (NAA) experiment [A. Borio di Tigliole et al., Absolute flux measurement by NAA at the Pavia University TRIGA Mark II reactor facilities, ENC 2012 - Transactions Research Reactors, ISBN 978-92-95064-14-0, 22 (2012)] performed at the TRIGA Mark II reactor of Pavia University (Italy). In order to evaluate the neutron flux spectrum, subdivided in energy groups, we must solve a system of linear equations containing the grouped cross sections and the activation rate data. We solve this problem with Bayesian statistical analysis, including the uncertainties of the coefficients and the a priori information about the neutron flux. A program for the analysis of Bayesian hierarchical models, based on Markov Chain Monte Carlo (MCMC) simulations, is used to define the problem statistical model and solve it. The energy group fluxes and their uncertainties are then determined with great accuracy and the correlations between the groups are analyzed. Finally, the dependence of the results on the prior distribution choice and on the group cross section data is investigated to confirm the reliability of the analysis.
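In the spirit of the described model (a hedged sketch, not the paper's code: the cross-section matrix, measured rates and priors below are invented for illustration), the unfolding problem A·φ = R with positivity-enforcing priors can be written in PyMC as:

```python
# Hedged sketch: activation rates R are linear in the group fluxes phi through
# a grouped cross-section matrix A; positivity is enforced by the priors.
import numpy as np
import pymc as pm

A = np.array([[0.8, 0.1], [0.3, 0.6], [0.2, 0.9]])  # grouped cross sections
R_obs = np.array([1.1, 1.0, 1.3])                   # measured activation rates
sigma_R = 0.05 * R_obs                              # measurement uncertainties

with pm.Model():
    phi = pm.HalfNormal("phi", sigma=5.0, shape=2)  # a priori positive fluxes
    pm.Normal("R", mu=pm.math.dot(A, phi), sigma=sigma_R, observed=R_obs)
    idata = pm.sample(2000, tune=1000, chains=2)
```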
Figueira, P; Adibekyan, V Zh; Oshagh, M; Santos, N C
2016-01-01
We apply the Bayesian framework to assess the presence of a correlation between two quantities. To do so, we estimate the probability distribution of the parameter of interest, $\\rho$, characterizing the strength of the correlation. We provide an implementation of these ideas and concepts using python programming language and the pyMC module in a very short ($\\sim$130 lines of code, heavily commented) and user-friendly program. We used this tool to assess the presence and properties of the correlation between planetary surface gravity and stellar activity level as measured by the log($R'_{\\mathrm{HK}}$) indicator. The results of the Bayesian analysis are qualitatively similar to those obtained via p-value analysis, and support the presence of a correlation in the data. The results are more robust in their derivation and more informative, revealing interesting features such as asymmetric posterior distributions or markedly different credible intervals, and allowing for a deeper exploration. We encourage the re...
Hongjun Xiao; Yiqi Liu; Daoping Huang
2016-01-01
Mainly due to the hostile environment in wastewater treatment plants (WWTPs), the reliability of sensors measuring important quality variables is often poor. In this work, we present the design of a semiadaptive fault diagnosis method based on variational Bayesian mixture factor analysis (VBMFA) to support process monitoring. The proposed method is capable of capturing the strong nonlinearity and significant dynamic features of WWTPs that seriously limit the application of conventional multivariate st...
A Hybrid Approach for Reliability Analysis Based on Analytic Hierarchy Process and Bayesian Network
Zubair, Muhammad
2014-01-01
Using the analytic hierarchy process (AHP) and a Bayesian network (BN), the present research examines the technical and non-technical issues of nuclear accidents. The study revealed that technical faults were one major cause of these accidents. From another point of view, it becomes clear that human behaviors such as dishonesty, insufficient training, and selfishness also play a key role in causing these accidents. In this study, a hybrid approach for reliability analysis based on AHP ...
Lu, Zhaohua; Zhu, Hongtu; Knickmeyer, Rebecca C.; Sullivan, Patrick F.; Williams, Stephanie N.; Zou, Fei
2015-01-01
The power of genome-wide association studies (GWAS) for mapping complex traits with single SNP analysis may be undermined by modest SNP effect sizes, unobserved causal SNPs, correlation among adjacent SNPs, and SNP-SNP interactions. Alternative approaches for testing the association between a single SNP-set and individual phenotypes have been shown to be promising for improving the power of GWAS. We propose a Bayesian latent variable selection (BLVS) method to simultaneously model the joint a...
conting: an R package for Bayesian analysis of complete and incomplete contingency tables
Overstall, Antony M.; Ruth King
2014-01-01
The aim of this paper is to demonstrate the R package conting for the Bayesian analysis of complete and incomplete contingency tables using hierarchical log-linear models. This package allows a user to identify interactions between categorical factors (via complete contingency tables) and to estimate closed population sizes using capture-recapture studies (via incomplete contingency tables). The models are fitted using Markov chain Monte Carlo methods. In particular, implementations of the ...
Modeling and Analysis of Call Center Arrival Data: A Bayesian Approach
Refik Soyer; M. Murat Tarimcilar
2008-01-01
In this paper, we present a modulated Poisson process model to describe and analyze arrival data to a call center. The attractive feature of this model is that it takes into account both covariate and time effects on the call volume intensity, and in so doing, enables us to assess the effectiveness of different advertising strategies along with predicting the arrival patterns. A Bayesian analysis of the model is developed and an extension of the model is presented to describe potential hetero...
Bayesian Propensity Score Analysis: Simulation and Case Study
Kaplan, David; Chen, Cassie J. S.
2011-01-01
Propensity score analysis (PSA) has been used in a variety of settings, such as education, epidemiology, and sociology. Most typically, propensity score analysis has been implemented within the conventional frequentist perspective of statistics. This perspective, as is well known, does not account for uncertainty in either the parameters of the…
Dating ancient Chinese celadon porcelain by neutron activation analysis and bayesian classification
International Nuclear Information System (INIS)
Dating ancient Chinese porcelain is one of the most important and difficult problems in porcelain archaeological field. Eighteen elements in bodies of ancient celadon porcelains fired in Southern Song to Yuan period (AD 1127-1368) and Ming dynasty (AD 1368-1644), including La, Sm, U, Ce, etc., were determined by neutron activation analysis (NAA). After the outliers of experimental data were excluded and multivariate normal distribution was tested, and Bayesian classification was used for dating of 165 ancient celadon porcelain samples. The results show that 98.2% of total ancient celadon porcelain samples are classified correctly. It means that NAA and Bayesian classification are very useful for dating ancient porcelain. (authors)
Bayesian analysis of longitudinal Johne's disease diagnostic data without a gold standard test
DEFF Research Database (Denmark)
Wang, C.; Turnbull, B.W.; Nielsen, Søren Saxmose;
2011-01-01
A Bayesian methodology was developed based on a latent change-point model to evaluate the performance of milk ELISA and fecal culture tests for longitudinal Johne's disease diagnostic data. The situation of no perfect reference test was considered; that is, no “gold standard.” A change-point process with a Weibull survival hazard function was used to model the progression of the hidden disease status. The model adjusted for the fixed effects of covariate variables and random effects of subject on the diagnostic testing procedure. Markov chain Monte Carlo methods were used to compute ... An application is presented to an analysis of ELISA and fecal culture test outcomes in the diagnostic testing of paratuberculosis (Johne's disease) for a Danish longitudinal study from January 2000 to March 2003. The posterior probability criterion based on the Bayesian model with 4 repeated observations has...
Risks Analysis of Logistics Financial Business Based on Evidential Bayesian Network
Directory of Open Access Journals (Sweden)
Ying Yan
2013-01-01
Risks in logistics financial business are identified and classified. Taking the failure of the business as the root node, a Bayesian network is constructed to measure the risk levels in the business. Three importance indexes are calculated to find the most important risks in the business. Furthermore, considering the epistemic uncertainties in the risks, evidence theory is combined with the Bayesian network into an evidential network for the risk analysis of logistics finance. To find how much uncertainty in the root node is produced by each risk, a new index, epistemic importance, is defined. Numerical examples show that the proposed methods provide a lot of useful information. With this information, effective approaches can be found to control and avoid these sensitive risks, thus keeping the logistics financial business working more reliably. The proposed method also gives a quantitative measure of risk levels in logistics financial business, which provides guidance for the selection of financing solutions.
MorePower 6.0 for ANOVA with relational confidence intervals and Bayesian analysis.
Campbell, Jamie I D; Thompson, Valerie A
2012-12-01
MorePower 6.0 is a flexible freeware statistical calculator that computes sample size, effect size, and power statistics for factorial ANOVA designs. It also calculates relational confidence intervals for ANOVA effects based on formulas from Jarmasz and Hollands (Canadian Journal of Experimental Psychology 63:124-138, 2009), as well as Bayesian posterior probabilities for the null and alternative hypotheses based on formulas in Masson (Behavior Research Methods 43:679-690, 2011). The program is unique in affording direct comparison of these three approaches to the interpretation of ANOVA tests. Its high numerical precision and ability to work with complex ANOVA designs could facilitate researchers' attention to issues of statistical power, Bayesian analysis, and the use of confidence intervals for data interpretation. MorePower 6.0 is available at https://wiki.usask.ca/pages/viewpageattachments.action?pageId=420413544 .
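For the Bayesian part, the BIC-to-posterior-probability conversion attributed above to Masson (2011) can be sketched as follows, assuming the usual unit-information-prior approximation; the sums of squares below are made up for the example.

```python
# Hedged sketch of BIC-based posterior probabilities for an ANOVA effect:
# convert the BIC difference between null and alternative into p(H0|D).
import numpy as np

def bic(sse, n, k):
    """BIC from a model's error sum of squares (constant terms dropped)."""
    return n * np.log(sse / n) + k * np.log(n)

def posterior_probs(sse_null, sse_alt, n, extra_params):
    delta = bic(sse_alt, n, extra_params) - bic(sse_null, n, 0)
    bf01 = np.exp(delta / 2.0)        # approximate Bayes factor for the null
    p_h0 = bf01 / (1.0 + bf01)
    return p_h0, 1.0 - p_h0

# illustrative numbers: 24 subjects, the effect absorbs 1 extra parameter
print(posterior_probs(sse_null=120.0, sse_alt=80.0, n=24, extra_params=1))
```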
Application of Bayesian graphs to SN Ia data analysis and compression
Ma, Cong; Corasaniti, Pier-Stefano; Bassett, Bruce A.
2016-01-01
Bayesian graphical models are an efficient tool for modelling complex data and derive self-consistent expressions of the posterior distribution of model parameters. We apply Bayesian graphs to perform statistical analyses of Type Ia supernova (SN Ia) luminosity distance measurements from the Joint Light-curve Analysis (JLA) dataset (Betoule et al. 2014, arXiv:1401.4064). In contrast to the $\\chi^2$ approach used in previous studies, the Bayesian inference allows us to fully account for the standard-candle parameter dependence of the data covariance matrix. Comparing with $\\chi^2$ analysis results we find a systematic offset of the marginal model parameter bounds. We demonstrate that the bias is statistically significant in the case of the SN Ia standardization parameters with a maximal $6\\sigma$ shift of the SN light-curve colour correction. In addition, we find that the evidence for a host galaxy correction is now only $2.4\\sigma$. Systematic offsets on the cosmological parameters remain small, but may incre...
bspmma: An R Package for Bayesian Semiparametric Models for Meta-Analysis
Directory of Open Access Journals (Sweden)
Deborah Burr
2012-07-01
We introduce an R package, bspmma, which implements a Dirichlet-based random effects model specific to meta-analysis. In meta-analysis, when combining effect estimates from several heterogeneous studies, it is common to use a random-effects model. The usual frequentist or Bayesian models specify a normal distribution for the true effects. However, in many situations, the effect distribution is not normal, e.g., it can have thick tails, be skewed, or be multi-modal. A Bayesian nonparametric model based on mixtures of Dirichlet process priors has been proposed in the literature, for the purpose of accommodating the non-normality. We review this model and then describe a competitor, a semiparametric version which has the feature that it allows for a well-defined centrality parameter convenient for determining whether the overall effect is significant. This second Bayesian model is based on a different version of the Dirichlet process prior, and we call it the "conditional Dirichlet model". The package contains functions to carry out analyses based on either the ordinary or the conditional Dirichlet model, functions for calculating certain Bayes factors that provide a check on the appropriateness of the conditional Dirichlet model, and functions that enable an empirical Bayes selection of the precision parameter of the Dirichlet process. We illustrate the use of the package on two examples, and give an interpretation of the results in these two different scenarios.
cudaBayesreg: Parallel Implementation of a Bayesian Multilevel Model for fMRI Data Analysis
Directory of Open Access Journals (Sweden)
Adelino R. Ferreira da Silva
2011-10-01
Graphic processing units (GPUs) are rapidly gaining maturity as powerful general parallel computing devices. A key feature in the development of modern GPUs has been the advancement of the programming model and programming tools. Compute Unified Device Architecture (CUDA) is a software platform for massively parallel high-performance computing on Nvidia many-core GPUs. In functional magnetic resonance imaging (fMRI), the volume of the data to be processed, and the type of statistical analysis to perform, call for high-performance computing strategies. In this work, we present the main features of the R-CUDA package cudaBayesreg, which implements in CUDA the core of a Bayesian multilevel model for the analysis of brain fMRI data. The statistical model implements a Gibbs sampler for multilevel/hierarchical linear models with a normal prior. The main contribution to the increased performance comes from the use of separate threads for fitting the linear regression model at each voxel in parallel. The R-CUDA implementation of the Bayesian model proposed here has been able to significantly reduce the run-time processing of the Markov chain Monte Carlo (MCMC) simulations used in Bayesian fMRI data analyses. Presently, cudaBayesreg is only configured for Linux systems with Nvidia CUDA support.
Time-varying nonstationary multivariate risk analysis using a dynamic Bayesian copula
Sarhadi, Ali; Burn, Donald H.; Concepción Ausín, María; Wiper, Michael P.
2016-03-01
A time-varying risk analysis is proposed for an adaptive design framework in nonstationary conditions arising from climate change. A Bayesian, dynamic conditional copula is developed for modeling the time-varying dependence structure between mixed continuous and discrete multiattributes of multidimensional hydrometeorological phenomena. Joint Bayesian inference is carried out to fit the marginals and copula in an illustrative example using an adaptive, Gibbs Markov Chain Monte Carlo (MCMC) sampler. Posterior mean estimates and credible intervals are provided for the model parameters and the Deviance Information Criterion (DIC) is used to select the model that best captures different forms of nonstationarity over time. This study also introduces a fully Bayesian, time-varying joint return period for multivariate time-dependent risk analysis in nonstationary environments. The results demonstrate that the nature and the risk of extreme-climate multidimensional processes change over time under the impact of climate change, and accordingly long-term decision-making strategies should be updated based on the anomalies of the nonstationary environment.
Fox, Neil I.; Micheas, Athanasios C.; Peng, Yuqiang
2016-07-01
This paper introduces the use of Bayesian full Procrustes shape analysis in object-oriented meteorological applications. In particular, the Procrustes methodology is used to generate mean forecast precipitation fields from a set of ensemble forecasts. This approach has advantages over other ensemble averaging techniques in that it can produce a forecast that retains the morphological features of the precipitation structures and present the range of forecast outcomes represented by the ensemble. The production of the ensemble mean avoids the problems of smoothing that result from simple pixel or cell averaging, while producing credible sets that retain information on ensemble spread. Also in this paper, the full Bayesian Procrustes scheme is used as an object verification tool for precipitation forecasts. This is an extension of a previously presented Procrustes shape analysis based verification approach into a full Bayesian format designed to handle the verification of precipitation forecasts that match objects from an ensemble of forecast fields to a single truth image. The methodology is tested on radar reflectivity nowcasts produced in the Warning Decision Support System - Integrated Information (WDSS-II) by varying parameters in the K-means cluster tracking scheme.
Bayesian analysis of longitudinal Johne's disease diagnostic data without a gold standard test.
Wang, C; Turnbull, B W; Nielsen, S S; Gröhn, Y T
2011-05-01
A Bayesian methodology was developed based on a latent change-point model to evaluate the performance of milk ELISA and fecal culture tests for longitudinal Johne's disease diagnostic data. The situation of no perfect reference test was considered; that is, no "gold standard." A change-point process with a Weibull survival hazard function was used to model the progression of the hidden disease status. The model adjusted for the fixed effects of covariate variables and random effects of subject on the diagnostic testing procedure. Markov chain Monte Carlo methods were used to compute the posterior estimates of the model parameters that provide the basis for inference concerning the accuracy of the diagnostic procedure. Based on the Bayesian approach, the posterior probability distribution of the change-point onset time can be obtained and used as a criterion for infection diagnosis. An application is presented to an analysis of ELISA and fecal culture test outcomes in the diagnostic testing of paratuberculosis (Johne's disease) for a Danish longitudinal study from January 2000 to March 2003. The posterior probability criterion based on the Bayesian model with 4 repeated observations has an area under the receiver operating characteristic curve (AUC) of 0.984, and is superior to the raw ELISA (AUC=0.911) and fecal culture (sensitivity=0.358, specificity=0.980) tests for Johne's disease diagnosis. PMID:21524521
Puncher, M; Birchall, A; Bull, R K
2014-12-01
In Bayesian inference, the initial knowledge regarding the value of a parameter, before additional data are considered, is represented as a prior probability distribution. This paper describes the derivation of a prior distribution of intake that was used for the Bayesian analysis of plutonium and uranium worker doses in a recent epidemiology study. The chosen distribution is log-normal with a geometric standard deviation of 6 and a median value that is derived for each worker based on the duration of the work history and the number of reported acute intakes. The median value is a function of the work history and a constant related to activity in air concentration, M, which is derived separately for uranium and plutonium. The value of M is based primarily on measurements of plutonium and uranium in air derived from historical personal air sampler (PAS) data. However, there is significant uncertainty on the value of M that results from paucity of PAS data and from extrapolating these measurements to actual intakes. This paper compares posterior and prior distributions of intake and investigates the sensitivity of the Bayesian analyses to the assumed value of M. It is found that varying M by a factor of 10 results in a much smaller factor of 2 variation in mean intake and lung dose for both plutonium and uranium. It is concluded that if a log-normal distribution is considered to adequately represent worker intakes, then the Bayesian posterior distribution of dose is relatively insensitive to the value assumed of M. PMID:24191121
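Constructing such a prior is straightforward; the sketch below builds the log-normal with geometric standard deviation 6 around a worker-specific median. The median formula and the value of M here are stand-ins for illustration, not the paper's exact expressions.

```python
# Hedged sketch: a log-normal intake prior with GSD 6 and a worker-specific
# median derived from the work history (assumed form, illustrative values).
import numpy as np
from scipy import stats

M = 2.0                          # activity-in-air constant (illustrative)
years_worked = 10.0
median_intake = M * years_worked # assumed form of the worker-specific median

gsd = 6.0
prior = stats.lognorm(s=np.log(gsd), scale=median_intake)  # scale = median

print("prior median:", prior.median())
print("central 95% interval:", prior.interval(0.95))
```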
Excitonic Coherence in Semiconductor Nanostructures Measured by Speckle Analysis
DEFF Research Database (Denmark)
Langbein, Wolfgang; Hvam, Jørn Märcher
1999-01-01
A new method to measure the time-dependent coherence of optical excitations in solids is presented, in which the coherence degree of light emission is deduced from its intensity fluctuations over the emission directions (speckles). With this method the decays of intensity and coherence are determ...
Alves, Nelson A; Rizzi, Leandro G
2015-01-01
Microcanonical thermostatistics analysis has become an important tool to reveal essential aspects of phase transitions in complex systems. An efficient way to estimate the microcanonical inverse temperature $\\beta(E)$ and the microcanonical entropy $S(E)$ is achieved with the statistical temperature weighted histogram analysis method (ST-WHAM). The strength of this method lies in its flexibility, as it can be used to analyse data produced by algorithms with generalised sampling weights. However, for any sampling weight, ST-WHAM requires the calculation of derivatives of energy histograms $H(E)$, which leads to non-trivial and tedious binning tasks for models with continuous energy spectra such as those for biomolecular and colloidal systems. Here, we discuss two alternative methods that avoid the need for such energy binning to obtain continuous estimates for $H(E)$ in order to evaluate $\\beta(E)$ by using ST-WHAM: (i) a series expansion to estimate probability densities from the empirical cumulative distrib...
Bayesian Analysis and Segmentation of Multichannel Image Sequences
Chang, Michael Ming Hsin
This thesis is concerned with the segmentation and analysis of multichannel image sequence data. In particular, we use maximum a posteriori probability (MAP) criterion and Gibbs random fields (GRF) to formulate the problems. We start by reviewing the significance of MAP estimation with GRF priors and study the feasibility of various optimization methods for implementing the MAP estimator. We proceed to investigate three areas where image data and parameter estimates are present in multichannels, multiframes, and interrelated in complicated manners. These areas of study include color image segmentation, multislice MR image segmentation, and optical flow estimation and segmentation in multiframe temporal sequences. Besides developing novel algorithms in each of these areas, we demonstrate how to exploit the potential of MAP estimation and GRFs, and we propose practical and efficient implementations. Illustrative examples and relevant experimental results are included.
Brane inflation and the WMAP data: a Bayesian analysis
Lorenz, Larissa; Ringeval, Christophe
2007-01-01
The Wilkinson Microwave Anisotropy Probe (WMAP) constraints on string-inspired "brane inflation" are investigated. Here, the inflaton field is interpreted as the distance between two branes placed in a flux-enriched background geometry and has a Dirac-Born-Infeld (DBI) kinetic term. Our method relies on an exact numerical integration of the inflationary power spectra coupled to a Markov-Chain Monte-Carlo exploration of the parameter space. This analysis is valid for any perturbative value of the string coupling constant and of the string length, and includes a phenomenological modelling of the reheating era to describe the post-inflationary evolution. It is found that the data favour a scenario where inflation stops by violation of the slow-roll conditions well before brane annihilation, rather than by tachyonic instability. Concerning the background geometry, it is established that log(v) > -10 at 95% confidence level (CL), where "v" is the dimensionless ratio of the five-dimensional sub-manifold at the ba...
OVERALL SENSITIVITY ANALYSIS UTILIZING BAYESIAN NETWORK FOR THE QUESTIONNAIRE INVESTIGATION ON SNS
Directory of Open Access Journals (Sweden)
Tsuyoshi Aburai
2013-11-01
Social Networking Service (SNS) is prevailing rapidly in Japan in recent years. The most popular ones are Facebook, mixi, and Twitter, which are utilized in various fields of life together with convenient tools such as smartphones. In this work, a questionnaire investigation is carried out in order to clarify the current usage conditions, issues and desired functions. More than 1,000 samples are gathered. A Bayesian network is utilized for this analysis. Sensitivity analysis is carried out by setting evidence to all items. This enables an overall analysis for each item. We analyzed them by sensitivity analysis and obtained some useful results. We have already presented a paper concerning this investigation, but its volume became too large; we have therefore split it, and this paper presents the latter half of the investigation results, obtained by setting evidence to Bayesian network parameters. Differences in usage objectives and SNS sites are made clear by the attributes and preferences of SNS users. They can be utilized effectively for marketing by clarifying the target customer through the sensitivity analysis.
BayesLCA: An R Package for Bayesian Latent Class Analysis
Directory of Open Access Journals (Sweden)
Arthur White
2014-11-01
The BayesLCA package for R provides tools for performing latent class analysis within a Bayesian setting. Three methods for fitting the model are provided, incorporating an expectation-maximization algorithm, Gibbs sampling and a variational Bayes approximation. The article briefly outlines the methodology behind each of these techniques and discusses some of the technical difficulties associated with them. Methods to remedy these problems are also described. Visualization methods for each of these techniques are included, as well as criteria to aid model selection.
Filipponi, A.; Di Cicco, A.; Principi, E.
2012-12-01
A Bayesian data-analysis approach to data sets of maximum undercooling temperatures recorded in repeated melting-cooling cycles of high-purity samples is proposed. The crystallization phenomenon is described in terms of a nonhomogeneous Poisson process driven by a temperature-dependent sample nucleation rate J(T). The method was extensively tested by computer simulations and applied to real data for undercooled liquid Ge. It proved to be particularly useful in the case of scarce data sets where the usage of binned data would degrade the available experimental information.
Fuzzy Bayesian Network-Bow-Tie Analysis of Gas Leakage during Biomass Gasification
Yan, Fang; Xu, Kaili; Yao, Xiwen; Li, Yang
2016-01-01
Biomass gasification technology has developed rapidly in recent years, but fire and poisoning accidents caused by gas leakage restrict the development and promotion of biomass gasification. Therefore, probabilistic safety assessment (PSA) is necessary for a biomass gasification system. Accordingly, Bayesian network-bow-tie (BN-bow-tie) analysis was proposed by mapping bow-tie analysis into a Bayesian network (BN). Causes of gas leakage and the accidents triggered by gas leakage can be obtained by bow-tie analysis, and the BN was used to identify the critical nodes of accidents by introducing three corresponding importance measures. Meanwhile, the occurrence probabilities of failures were needed in the PSA. In view of the insufficient failure data of biomass gasification, the occurrence probabilities of failures which cannot be obtained from standard reliability data sources were determined by fuzzy methods based on expert judgment. An improved approach that considers expert weighting to aggregate fuzzy numbers, including triangular and trapezoidal numbers, was proposed, and the occurrence probabilities of failures were obtained. Finally, safety measures were indicated based on the obtained critical nodes. The theoretical occurrence probabilities in one year of gas leakage and the accidents caused by it were reduced to 1/10.3 of the original values by these safety measures. PMID:27463975
Fuzzy Bayesian Network-Bow-Tie Analysis of Gas Leakage during Biomass Gasification.
Directory of Open Access Journals (Sweden)
Fang Yan
Biomass gasification technology has developed rapidly in recent years, but fire and poisoning accidents caused by gas leakage restrict the development and promotion of biomass gasification. Therefore, probabilistic safety assessment (PSA) is necessary for a biomass gasification system. Accordingly, Bayesian network-bow-tie (BN-bow-tie) analysis was proposed by mapping bow-tie analysis into a Bayesian network (BN). Causes of gas leakage and the accidents triggered by gas leakage can be obtained by bow-tie analysis, and the BN was used to identify the critical nodes of accidents by introducing three corresponding importance measures. Meanwhile, the occurrence probabilities of failures were needed in the PSA. In view of the insufficient failure data of biomass gasification, the occurrence probabilities of failures which cannot be obtained from standard reliability data sources were determined by fuzzy methods based on expert judgment. An improved approach that considers expert weighting to aggregate fuzzy numbers, including triangular and trapezoidal numbers, was proposed, and the occurrence probabilities of failures were obtained. Finally, safety measures were indicated based on the obtained critical nodes. The theoretical occurrence probabilities in one year of gas leakage and the accidents caused by it were reduced to 1/10.3 of the original values by these safety measures.
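The expert-aggregation step common to these records can be illustrated with a short sketch. The weights, opinions and centroid defuzzification below are our own choices; the paper's aggregation scheme may differ in detail.

```python
# Hedged sketch: expert-weighted aggregation of triangular fuzzy numbers
# (a, m, b) for a failure probability, then centroid defuzzification.
import numpy as np

experts = np.array([      # (a, m, b) triangular fuzzy failure probabilities
    [1e-4, 5e-4, 1e-3],
    [5e-4, 1e-3, 5e-3],
    [1e-4, 1e-3, 2e-3],
])
weights = np.array([0.5, 0.3, 0.2])   # expert weights (sum to 1)

aggregated = weights @ experts        # weighted triangular fuzzy number
a, m, b = aggregated
crisp = (a + m + b) / 3.0             # centroid of a triangular number

print("aggregated fuzzy number:", aggregated, "defuzzified value:", crisp)
```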
Analysis of forest backscattering characteristics based on polarization coherence tomography
Institute of Scientific and Technical Information of China (English)
Anonymous
2010-01-01
It is difficult to make an inventory of vertical profiles of forest structure parameters in field measurements. However, analysis and understanding of forest backscattering characteristics contribute to estimation and detection of forest vertical structure because of the close relationships between backscattering characteristics and structure parameters. The vertical structure function in the complex interferometric coherence definition, which represents the vertical variation of microwave scattering with the penetration depth at a point in the 2-D radar image and can be used to analyze the forest backscattering characteristics, can be reconstructed from polarization coherence tomography (PCT). Based on PCT, the paper analyzes the forest backscattering characteristics and explores the inherent relationship between the result of PCT and the forest structure parameters from numerical simulation of the Random Volume over Ground (RVoG) model, polarimetric SAR interferometry (PolInSAR) simulation of a forest scene, and PolInSAR data at L-band of the test site Traunstein. Firstly, the effects of the extinction coefficient and surface-to-volume scattering ratio in the RVoG model on vertical backscattering characteristics are analyzed by means of numerical simulation. Secondly, by applying PCT to L-band PolInSAR simulations of a forest scene, different variations of vertical backscattering due to different extinction coefficients and ratios of surface-to-volume scattering resulting from different polarizations, forest types and densities are displayed and analyzed. Then a concept of relative average backscattering intensity is presented, and the factors which affect its vertical distribution are also discussed. Preliminary results show that there is high sensitivity of the vertical distribution of forest relative average backscattering intensity to the polarization, forest type and density. Finally, based on repeat-pass DLR E-SAR L-band airborne PolInSAR data, the capability of PCT technology for detection
Bayesian analysis of esophageal cancer mortality in the presence of misclassification
Directory of Open Access Journals (Sweden)
Mohamad Amin Pourhoseingholi
2011-12-01
Background: Esophageal cancer (EC) is one of the most common cancers worldwide. Mortality is a familiar indicator of the burden of cancer. Cancer mortality data are important and are used to monitor the effects of screening programs, earlier diagnosis and other prognostic factors. But according to the Iranian death registry, about 20% of death statistics are still recorded in misclassified categories. The aim of this study is to estimate EC mortality in the Iranian population, using a Bayesian approach in order to revise this misclassification.
Methods: We analyzed national death statistics reported by the Iranian Ministry of Health and Medical Education from 1995 to 2004. EC deaths [ICD-9; C15] were expressed as annual mortality rates per 100,000, overall, by sex, by age group and as age-standardized rates (ASR). The Bayesian approach was used to correct and account for misclassification effects in Poisson count regression, with a beta prior employed to estimate the mortality rate of EC in age and sex groups.
Results: According to the Bayesian analysis, between 20 and 30 percent of deaths related to EC were underreported in the mortality records, and the rate of mortality from EC has increased in recent years.
Conclusions: Our findings suggest a substantial undercount of EC mortality in the Iranian population, so policy makers who determine research and treatment priorities based on reported death rates should take note of this underreporting.
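The correction mechanism described here, a Poisson count thinned by an imperfect reporting probability with a beta prior, can be sketched on a grid; all counts and hyperparameters below are illustrative, not the study's.

```python
# Hedged sketch of the underreporting idea: observed counts are a thinned
# Poisson, y ~ Poisson(lambda * pi), with a beta prior on the reporting
# probability pi (centred near 0.75, echoing 20-30% underreporting).
import numpy as np
from scipy import stats

y = 140                                   # reported EC deaths (illustrative)

lam = np.linspace(50, 400, 701)[:, None]  # true expected-count grid
pi = np.linspace(0.5, 1.0, 201)[None, :]  # reporting probability grid

log_post = (stats.poisson.logpmf(y, lam * pi)
            + stats.beta.logpdf(pi, 15, 5))     # beta prior with mean 0.75
post = np.exp(log_post - log_post.max())
post /= post.sum()

corrected = (post * lam).sum()            # posterior mean of the true count
print("reported:", y, "posterior mean corrected count:", corrected)
```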
BayGO: Bayesian analysis of ontology term enrichment in microarray data
Directory of Open Access Journals (Sweden)
Gomes Suely L
2006-02-01
Background: The search for enriched (aka over-represented or enhanced) ontology terms in a list of genes obtained from microarray experiments is becoming a standard procedure for a system-level analysis. This procedure tries to summarize the information focussing on classification designs such as Gene Ontology, KEGG pathways, and so on, instead of focussing on individual genes. Although it is well known in statistics that association and significance are distinct concepts, only the former approach has been used to deal with the ontology term enrichment problem. Results: BayGO implements a Bayesian approach to search for enriched terms from microarray data. The R source-code is freely available at http://blasto.iq.usp.br/~tkoide/BayGO in three versions: Linux, which can be easily incorporated into pre-existent pipelines; Windows, to be controlled interactively; and as a web-tool. The software was validated using a bacterial heat shock response dataset, since this stress triggers known system-level responses. Conclusion: The Bayesian model accounts for the fact that, eventually, not all the genes from a given category are observable in microarray data due to low intensity signal, quality filters, genes that were not spotted and so on. Moreover, BayGO allows one to measure the statistical association between generic ontology terms and differential expression, instead of working only with the common significance analysis.
Afreen, Nazia; Naqvi, Irshad H; Broor, Shobha; Ahmed, Anwar; Kazim, Syed Naqui; Dohare, Ravins; Kumar, Manoj; Parveen, Shama
2016-03-01
Dengue fever is the most important arboviral disease in the tropical and sub-tropical countries of the world. Delhi, the metropolitan capital state of India, has reported many dengue outbreaks, with the last outbreak occurring in 2013. We have recently reported predominance of dengue virus serotype 2 during 2011-2014 in Delhi. In the present study, we report molecular characterization and evolutionary analysis of dengue serotype 2 viruses which were detected in 2011-2014 in Delhi. Envelope genes of 42 DENV-2 strains were sequenced in the study. All DENV-2 strains grouped within the Cosmopolitan genotype and further clustered into three lineages: Lineage I, II and III. Lineage III replaced Lineage I during the dengue fever outbreak of 2013. Further, a novel mutation, Thr404Ile, was detected in the stem region of the envelope protein of a single DENV-2 strain in 2014. The nucleotide substitution rate and time to the most recent common ancestor were determined by molecular clock analysis using Bayesian methods. A change in the effective population size of Indian DENV-2 viruses was investigated through a Bayesian skyline plot. The study will be a vital road map for investigation of the epidemiology and evolutionary pattern of dengue viruses in India.
Application of evidence theory in information fusion of multiple sources in bayesian analysis
Institute of Scientific and Technical Information of China (English)
周忠宝; 蒋平; 武小悦
2004-01-01
How to obtain a proper prior distribution is one of the most critical problems in Bayesian analysis. In many practical cases, the prior information often comes from different sources, and the prior distribution form can be known in some way while its parameters are hard to determine. In this paper, based on evidence theory, a new method is presented to fuse the information of multiple sources and determine the parameters of the prior distribution when its form is known. By taking the prior distributions which result from the information of multiple sources and converting them into corresponding mass functions, which can be combined by the Dempster-Shafer (D-S) method, we get the combined mass function and the representative points of the prior distribution. These points are used to fit the given distribution form to determine the parameters of the prior distribution. The fused prior distribution is thereby obtained and Bayesian analysis can be performed. How to convert the prior distributions into mass functions properly and get the representative points of the fused prior distribution is the central question we address in this paper. The simulation example shows that the proposed method is effective.
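The combination step at the heart of this approach is Dempster's rule; a minimal self-contained implementation over a small frame of discernment follows (the masses below are invented for the example).

```python
# Minimal sketch of Dempster's rule of combination: combine two mass
# functions over the same frame of discernment (focal sets as frozensets).
def combine(m1, m2):
    """Dempster-Shafer combination of two mass functions."""
    combined, conflict = {}, 0.0
    for s1, v1 in m1.items():
        for s2, v2 in m2.items():
            inter = s1 & s2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2          # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

A, B = frozenset({"A"}), frozenset({"B"})
theta = A | B                                # the whole frame
m1 = {A: 0.6, theta: 0.4}                    # source 1
m2 = {A: 0.3, B: 0.4, theta: 0.3}            # source 2
print(combine(m1, m2))
```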
Integrative Bayesian analysis of neuroimaging-genetic data with application to cocaine dependence.
Azadeh, Shabnam; Hobbs, Brian P; Ma, Liangsuo; Nielsen, David A; Moeller, F Gerard; Baladandayuthapani, Veerabhadran
2016-01-15
Neuroimaging and genetic studies provide distinct and complementary information about the structural and biological aspects of a disease. Integrating the two sources of data facilitates the investigation of the links between genetic variability and brain mechanisms among different individuals for various medical disorders. This article presents a general statistical framework for integrative Bayesian analysis of neuroimaging-genetic (iBANG) data, which is motivated by a neuroimaging-genetic study in cocaine dependence. Statistical inference necessitated the integration of spatially dependent voxel-level measurements with various patient-level genetic and demographic characteristics under an appropriate probability model to account for the multiple inherent sources of variation. Our framework uses Bayesian model averaging to integrate genetic information into the analysis of voxel-wise neuroimaging data, accounting for spatial correlations in the voxels. Using multiplicity controls based on the false discovery rate, we delineate voxels associated with genetic and demographic features that may impact diffusion as measured by fractional anisotropy (FA) obtained from DTI images. We demonstrate the benefits of accounting for model uncertainties in both model fit and prediction. Our results suggest that cocaine consumption is associated with FA reduction in most white matter regions of interest in the brain. Additionally, gene polymorphisms associated with GABAergic, serotonergic and dopaminergic neurotransmitters and receptors were associated with FA. PMID:26484829
Risk analysis of emergent water pollution accidents based on a Bayesian Network.
Tang, Caihong; Yi, Yujun; Yang, Zhifeng; Sun, Jie
2016-01-01
To guarantee the security of water quality in water transfer channels, especially in open channels, analysis of potential emergent pollution sources in the water transfer process is critical. It is also indispensable for forewarning and protection against emergent pollution accidents. Bridges above open channels with large amounts of truck traffic are the main locations where emergent accidents could occur. A Bayesian network model, which consists of six root nodes and three middle-layer nodes, was developed in this paper and employed to identify the possibility of potential pollution risk. Dianbei Bridge is reviewed as a typical bridge on an open channel of the Middle Route of the South to North Water Transfer Project where emergent traffic accidents could occur. This study focuses on the risk of water pollution caused by the leakage of pollutants into the water. The risk of potential traffic accidents at the Dianbei Bridge implies a risk of water pollution in the canal. Based on survey data, statistical analysis, and domain specialist knowledge, a Bayesian network model was established. The human factor in emergent accidents has been considered in this model. Additionally, this model has been employed to describe the probability of accidents and the risk level. The causes to which pollution accidents are most sensitive have been deduced. The case in which the sensitive factors are in the states most likely to lead to accidents has also been simulated.
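As a toy version of such a network (three nodes and invented probabilities, far smaller than the paper's six-root-node model), the marginal probability of a pollution event can be computed by enumeration:

```python
# Hedged toy Bayesian network: truck traffic T -> accident A -> pollution P,
# evaluated by full enumeration. All probabilities are illustrative.
p_T = 0.3                                  # heavy truck traffic on the bridge
p_A_given = {True: 0.02, False: 0.005}     # accident given traffic level
p_P_given = {True: 0.4, False: 0.0}        # pollutant leakage given accident

p_P = 0.0
for T in (True, False):
    pt = p_T if T else 1.0 - p_T
    for A in (True, False):
        pa = p_A_given[T] if A else 1.0 - p_A_given[T]
        p_P += pt * pa * p_P_given[A]      # sum over all parent states

print("marginal probability of a pollution event:", p_P)
```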
Bayesian flux balance analysis applied to a skeletal muscle metabolic model.
Heino, Jenni; Tunyan, Knarik; Calvetti, Daniela; Somersalo, Erkki
2007-09-01
In this article, the steady state condition for the multi-compartment models for cellular metabolism is considered. The problem is to estimate the reaction and transport fluxes, as well as the concentrations in venous blood when the stoichiometry and bound constraints for the fluxes and the concentrations are given. The problem has been addressed previously by a number of authors, and optimization-based approaches as well as extreme pathway analysis have been proposed. These approaches are briefly discussed here. The main emphasis of this work is a Bayesian statistical approach to the flux balance analysis (FBA). We show how the bound constraints and optimality conditions such as maximizing the oxidative phosphorylation flux can be incorporated into the model in the Bayesian framework by proper construction of the prior densities. We propose an effective Markov chain Monte Carlo (MCMC) scheme to explore the posterior densities, and compare the results with those obtained via the previously studied linear programming (LP) approach. The proposed methodology, which is applied here to a two-compartment model for skeletal muscle metabolism, can be extended to more complex models.
Directory of Open Access Journals (Sweden)
Madsen Per
2003-03-01
Full Text Available Abstract A fully Bayesian analysis using Gibbs sampling and data augmentation in a multivariate model of Gaussian, right censored, and grouped Gaussian traits is described. The grouped Gaussian traits are either ordered categorical traits (with more than two categories) or binary traits, where the grouping is determined via thresholds on the underlying Gaussian scale, the liability scale. Allowances are made for unequal models, unknown covariance matrices and missing data. Having outlined the theory, strategies for implementation are reviewed. These include joint sampling of location parameters; efficient sampling from the fully conditional posterior distribution of augmented data, a multivariate truncated normal distribution; and sampling from the conditional inverse Wishart distribution, the fully conditional posterior distribution of the residual covariance matrix. Finally, a simulated dataset was analysed to illustrate the methodology. This paper concentrates on a model where residuals associated with liabilities of the binary traits are assumed to be independent. A Bayesian analysis using Gibbs sampling is outlined for the model where this assumption is relaxed.
Korsgaard, Inge Riis; Lund, Mogens Sandø; Sorensen, Daniel; Gianola, Daniel; Madsen, Per; Jensen, Just
2003-01-01
A fully Bayesian analysis using Gibbs sampling and data augmentation in a multivariate model of Gaussian, right censored, and grouped Gaussian traits is described. The grouped Gaussian traits are either ordered categorical traits (with more than two categories) or binary traits, where the grouping is determined via thresholds on the underlying Gaussian scale, the liability scale. Allowances are made for unequal models, unknown covariance matrices and missing data. Having outlined the theory, strategies for implementation are reviewed. These include joint sampling of location parameters; efficient sampling from the fully conditional posterior distribution of augmented data, a multivariate truncated normal distribution; and sampling from the conditional inverse Wishart distribution, the fully conditional posterior distribution of the residual covariance matrix. Finally, a simulated dataset was analysed to illustrate the methodology. This paper concentrates on a model where residuals associated with liabilities of the binary traits are assumed to be independent. A Bayesian analysis using Gibbs sampling is outlined for the model where this assumption is relaxed.
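The augmented-data step for a single binary trait can be sketched in the style of Albert and Chib's probit sampler: liabilities are drawn from truncated normals given the observed category, then the location parameters from their Gaussian full conditional. This toy uses one trait, a flat prior, and unit residual variance, so it omits the multivariate and covariance-sampling machinery of the paper.

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(1)
n, p = 200, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([-0.3, 0.8])
y = (X @ beta_true + rng.normal(size=n) > 0).astype(int)  # observed binary trait

XtX_inv = np.linalg.inv(X.T @ X)      # flat prior on beta, residual variance 1
beta, draws = np.zeros(p), []
for it in range(2000):
    mu = X @ beta
    # 1) data augmentation: liability is N(mu, 1) truncated by the category
    lo = np.where(y == 1, -mu, -np.inf)   # liability > 0 when y = 1
    hi = np.where(y == 1, np.inf, -mu)    # liability < 0 when y = 0
    liab = mu + truncnorm.rvs(lo, hi, size=n, random_state=rng)
    # 2) location parameters from their Gaussian full conditional
    beta = rng.multivariate_normal(XtX_inv @ X.T @ liab, XtX_inv)
    if it >= 500:                         # discard burn-in draws
        draws.append(beta)
print("posterior mean:", np.mean(draws, axis=0), "truth:", beta_true)
```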
Directory of Open Access Journals (Sweden)
Nazia Afreen
2016-03-01
Full Text Available Dengue fever is the most important arboviral disease in the tropical and sub-tropical countries of the world. Delhi, the metropolitan capital state of India, has reported many dengue outbreaks, with the last outbreak occurring in 2013. We have recently reported predominance of dengue virus serotype 2 during 2011-2014 in Delhi. In the present study, we report molecular characterization and evolutionary analysis of dengue serotype 2 viruses which were detected in 2011-2014 in Delhi. Envelope genes of 42 DENV-2 strains were sequenced in the study. All DENV-2 strains grouped within the Cosmopolitan genotype and further clustered into three lineages: lineages I, II and III. Lineage III replaced lineage I during the dengue fever outbreak of 2013. Further, a novel mutation, Thr404Ile, was detected in the stem region of the envelope protein of a single DENV-2 strain in 2014. The nucleotide substitution rate and the time to the most recent common ancestor were determined by molecular clock analysis using Bayesian methods. A change in the effective population size of Indian DENV-2 viruses was investigated through a Bayesian skyline plot. The study will be a vital road map for investigation of the epidemiology and evolutionary pattern of dengue viruses in India.
Current trends in Bayesian methodology with applications
Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia
2015-01-01
Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics. Each chapter is self-contained and focuses on…
Shi, Qi; Abdel-Aty, Mohamed; Yu, Rongjie
2016-03-01
In traffic safety studies, crash frequency modeling of total crashes is the cornerstone before proceeding to more detailed safety evaluation. The relationship between crash occurrence and factors such as traffic flow and roadway geometric characteristics has been extensively explored for a better understanding of crash mechanisms. In this study, a multi-level Bayesian framework has been developed in an effort to identify the crash contributing factors on an urban expressway in the Central Florida area. Two types of traffic data from the Automatic Vehicle Identification system, which are the processed data capped at speed limit and the unprocessed data retaining the original speed were incorporated in the analysis along with road geometric information. The model framework was proposed to account for the hierarchical data structure and the heterogeneity among the traffic and roadway geometric data. Multi-level and random parameters models were constructed and compared with the Negative Binomial model under the Bayesian inference framework. Results showed that the unprocessed traffic data was superior. Both multi-level models and random parameters models outperformed the Negative Binomial model and the models with random parameters achieved the best model fitting. The contributing factors identified imply that on the urban expressway lower speed and higher speed variation could significantly increase the crash likelihood. Other geometric factors were significant including auxiliary lanes and horizontal curvature. PMID:26722989
A Bayesian analysis of the 69 highest energy cosmic rays detected by the Pierre Auger Observatory
Khanin, Alexander; Mortlock, Daniel J.
2016-08-01
The origins of ultrahigh energy cosmic rays (UHECRs) remain an open question. Several attempts have been made to cross-correlate the arrival directions of the UHECRs with catalogues of potential sources, but no definite conclusion has been reached. We report a Bayesian analysis of the 69 events, from the Pierre Auger Observatory (PAO), that aims to determine the fraction of the UHECRs that originate from known AGNs in the Véron-Cetty & Véron (VCV) catalogue, as well as AGNs detected with the Swift Burst Alert Telescope (Swift-BAT), galaxies from the 2MASS Redshift Survey (2MRS), and an additional volume-limited sample of 17 nearby AGNs. The study makes use of a multilevel Bayesian model of UHECR injection, propagation and detection. We find that for reasonable ranges of prior parameters the Bayes factors disfavour a purely isotropic model. For fiducial values of the model parameters, we report 68 per cent credible intervals for the fraction of source-originating UHECRs of 0.09^{+0.05}_{-0.04}, 0.25^{+0.09}_{-0.08}, 0.24^{+0.12}_{-0.10}, and 0.08^{+0.04}_{-0.03} for the VCV, Swift-BAT and 2MRS catalogues, and the sample of 17 AGNs, respectively.
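A stripped-down version of the key computation, marginalizing the source fraction and comparing against the isotropic model, might look as follows; the per-event density ratios are simulated stand-ins, not the PAO data or the paper's propagation model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 69
# Ratio of source-model to isotropic density at each arrival direction
# (simulated stand-ins; > 1 means the event lies near a catalogued source)
ratio = np.exp(rng.normal(0.0, 1.0, size=n))

# Relative log-likelihood of source fraction f against pure isotropy:
# L_i(f) / L_i(0) = (1 - f) + f * ratio_i
f = np.linspace(0.0, 1.0, 1001)
df = f[1] - f[0]
log_rel = np.array([np.sum(np.log1p(fi * (ratio - 1.0))) for fi in f])

rel = np.exp(log_rel - log_rel.max())
# Bayes factor of the mixture model (uniform prior on f) vs isotropy (f = 0)
bf = np.exp(log_rel.max()) * np.sum(rel) * df
print(f"Bayes factor vs isotropy: {bf:.2f}")

# Posterior for f and an equal-tailed 68% credible interval
post = rel / (np.sum(rel) * df)
cdf = np.cumsum(post) * df
lo, med, hi = np.interp([0.16, 0.50, 0.84], cdf, f)
print(f"source fraction f = {med:.2f} (68% CI {lo:.2f}-{hi:.2f})")
```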
Critically evaluating the theory and performance of Bayesian analysis of macroevolutionary mixtures.
Moore, Brian R; Höhna, Sebastian; May, Michael R; Rannala, Bruce; Huelsenbeck, John P
2016-08-23
Bayesian analysis of macroevolutionary mixtures (BAMM) has recently taken the study of lineage diversification by storm. BAMM estimates the diversification-rate parameters (speciation and extinction) for every branch of a study phylogeny and infers the number and location of diversification-rate shifts across branches of a tree. Our evaluation of BAMM reveals two major theoretical errors: (i) the likelihood function (which estimates the model parameters from the data) is incorrect, and (ii) the compound Poisson process prior model (which describes the prior distribution of diversification-rate shifts across branches) is incoherent. Using simulation, we demonstrate that these theoretical issues cause statistical pathologies; posterior estimates of the number of diversification-rate shifts are strongly influenced by the assumed prior, and estimates of diversification-rate parameters are unreliable. Moreover, the inability to correctly compute the likelihood or to correctly specify the prior for rate-variable trees precludes the use of Bayesian approaches for testing hypotheses regarding the number and location of diversification-rate shifts using BAMM. PMID:27512038
Directory of Open Access Journals (Sweden)
Kai Cao
2016-05-01
Full Text Available Objective: To explore the spatial-temporal interaction effect within a Bayesian framework and to probe the ecological influential factors for tuberculosis. Methods: Six different statistical models containing parameters of time, space, spatial-temporal interaction and their combination were constructed based on a Bayesian framework. The optimum model was selected according to the deviance information criterion (DIC) value. Coefficients of climate variables were then estimated using the best fitting model. Results: The model containing the spatial-temporal interaction parameter was the best fitting one, with the smallest DIC value (−4,508,660). Ecological analysis results showed the relative risks (RRs) of average temperature, rainfall, wind speed, humidity, and air pressure were 1.00324 (95% CI, 1.00150–1.00550), 1.01010 (95% CI, 1.01007–1.01013), 0.83518 (95% CI, 0.93732–0.96138), 0.97496 (95% CI, 0.97181–1.01386), and 1.01007 (95% CI, 1.01003–1.01011), respectively. Conclusions: The spatial-temporal interaction was statistically meaningful and the prevalence of tuberculosis was influenced by the time and space interaction effect. Average temperature, rainfall, wind speed, and air pressure influenced tuberculosis. Average humidity had no influence on tuberculosis.
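For readers unfamiliar with the selection criterion used here, the following sketch computes DIC from posterior draws in a deliberately simple conjugate Poisson model; the data are toy counts, not the tuberculosis records.

```python
import numpy as np
from scipy.stats import poisson, gamma

rng = np.random.default_rng(3)
y = rng.poisson(4.0, size=50)                      # toy case counts

# Posterior for a Poisson rate under a Gamma(1, 1) prior is conjugate
post = gamma(a=1 + y.sum(), scale=1.0 / (1 + len(y)))
lam_draws = post.rvs(size=5000, random_state=rng)

def deviance(lam):
    return -2.0 * poisson.logpmf(y, lam).sum()

d_bar = np.mean([deviance(l) for l in lam_draws])  # mean posterior deviance
d_hat = deviance(lam_draws.mean())                 # deviance at posterior mean
p_d = d_bar - d_hat                                # effective number of parameters
print(f"DIC = {d_hat + 2 * p_d:.1f} (pD = {p_d:.2f})")
```

Smaller DIC indicates a better trade-off between fit and effective complexity, which is how the competing space-time models above are ranked.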
A Bayesian analysis of the 69 highest energy cosmic rays detected by the Pierre Auger Observatory
Khanin, Alexander
2016-01-01
The origins of ultra-high energy cosmic rays (UHECRs) remain an open question. Several attempts have been made to cross-correlate the arrival directions of the UHECRs with catalogs of potential sources, but no definite conclusion has been reached. We report a Bayesian analysis of the 69 events from the Pierre Auger Observatory (PAO) that aims to determine the fraction of the UHECRs that originate from known AGNs in the Véron-Cetty & Véron (VCV) catalog, as well as AGNs detected with the Swift Burst Alert Telescope (Swift-BAT), galaxies from the 2MASS Redshift Survey (2MRS), and an additional volume-limited sample of 17 nearby AGNs. The study makes use of a multi-level Bayesian model of UHECR injection, propagation and detection. We find that for reasonable ranges of prior parameters, the Bayes factors disfavour a purely isotropic model. For fiducial values of the model parameters, we report 68% credible intervals for the fraction of source-originating UHECRs of 0.09+0.05-0.04, 0.25+0.09-0.08, 0.24+0.12-0.10, and 0.08+0.04-0.03 for the VCV, Swift-BAT and 2MRS catalogs, and the sample of 17 AGNs, respectively.
Time-resolved Speckle Analysis: Probing the Coherence of Excitonic Secondary Emission
DEFF Research Database (Denmark)
Langbein, Wolfgang; Hvam, Jørn Märcher; Zimmermann, R.;
1998-01-01
A new technique to analyze the time-dependent coherence of light emitted in a non-specular direction is presented. We demonstrate that the coherence degree of the emission can be deduced from the intensity fluctuations over the emission directions (speckles). The secondary emission of excitons...... in semiconductor quantum wells is investigated. Here, a partial coherence results from an interplay between scattering due to static disorder and inelastic relaxation, without any influence of the radiative decay. The temperature dependence is well explained by dephasing due to phonon scattering....
A covariant approach to the analysis of coherent processes
Karasev, V. P.; Shelepin, L. A.
A general group-theory approach to the study of coherent phenomena in many-particle multilevel systems interacting with boson fields is formulated on the basis of the dynamic symmetry theory and the formalism of group representations. Algebraic models of the systems under study are proposed using this approach, which yield generalized Dicke models with allowance for both radiation and collisional processes and also consider systems with a varying number of particles. By using Glauber coherent states and generalized coherent states of SU(n) groups, covariant methods are developed for calculating various characteristics of cooperative phenomena and processes on the basis of the models proposed here.
DEFF Research Database (Denmark)
2010-01-01
Genetic markers can be used as instrumental variables, in an analogous way to randomization in a clinical trial, to estimate the causal relationship between a phenotype and an outcome variable. Our purpose is to extend the existing methods for such Mendelian randomization studies to the context of multiple genetic markers measured in multiple studies, based on the analysis of individual participant data. First, for a single genetic marker in one study, we show that the usual ratio of coefficients approach can be reformulated as a regression with heterogeneous error in the explanatory variable. This can be implemented using a Bayesian approach, which is next extended to include multiple genetic markers. We then propose a hierarchical model for undertaking a meta-analysis of multiple studies, in which it is not necessary that the same genetic markers are measured in each study. This provides…
Bayesian Analysis of Inertial Confinement Fusion Experiments at the National Ignition Facility
Gaffney, J A; Sonnad, V; Libby, S B
2012-01-01
We develop a Bayesian inference method that allows the efficient determination of several interesting parameters from complicated high-energy-density experiments performed on the National Ignition Facility (NIF). The model is based on an exploration of phase space using the hydrodynamic code HYDRA. A linear model is used to describe the effect of nuisance parameters on the analysis, allowing an analytic likelihood to be derived that can be determined from a small number of HYDRA runs and then used in existing advanced statistical analysis methods. This approach is applied to a recent experiment in order to determine the carbon opacity and X-ray drive; it is found that the inclusion of prior expert knowledge and fluctuations in capsule dimensions and chemical composition significantly improve the agreement between experiment and theoretical opacity calculations. A parameterisation of HYDRA results is used to test the application of both Markov chain Monte Carlo (MCMC) and genetic algorithm (GA) techniques to e...
Bayesian Analysis for Exponential Random Graph Models Using the Adaptive Exchange Sampler.
Jin, Ick Hoon; Yuan, Ying; Liang, Faming
2013-10-01
Exponential random graph models have been widely used in social network analysis. However, these models are extremely difficult to handle from a statistical viewpoint, because of the intractable normalizing constant and model degeneracy. In this paper, we consider a fully Bayesian analysis for exponential random graph models using the adaptive exchange sampler, which solves the intractable normalizing constant and model degeneracy issues encountered in Markov chain Monte Carlo (MCMC) simulations. The adaptive exchange sampler can be viewed as a MCMC extension of the exchange algorithm, and it generates auxiliary networks via an importance sampling procedure from an auxiliary Markov chain running in parallel. The convergence of this algorithm is established under mild conditions. The adaptive exchange sampler is illustrated using a few social networks, including the Florentine business network, molecule synthetic network, and dolphins network. The results indicate that the adaptive exchange algorithm can produce more accurate estimates than approximate exchange algorithms, while maintaining the same computational efficiency. PMID:24653788
Bayesian analysis for exponential random graph models using the adaptive exchange sampler
Jin, Ick Hoon
2013-01-01
Exponential random graph models have been widely used in social network analysis. However, these models are extremely difficult to handle from a statistical viewpoint, because of the existence of intractable normalizing constants. In this paper, we consider a fully Bayesian analysis for exponential random graph models using the adaptive exchange sampler, which solves the issue of intractable normalizing constants encountered in Markov chain Monte Carlo (MCMC) simulations. The adaptive exchange sampler can be viewed as a MCMC extension of the exchange algorithm, and it generates auxiliary networks via an importance sampling procedure from an auxiliary Markov chain running in parallel. The convergence of this algorithm is established under mild conditions. The adaptive exchange sampler is illustrated using a few social networks, including the Florentine business network, molecule synthetic network, and dolphins network. The results indicate that the adaptive exchange algorithm can produce more accurate estimates than approximate exchange algorithms, while maintaining the same computational efficiency.
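The exchange move at the heart of these samplers can be demonstrated on a toy exponential-family model whose auxiliary data can be drawn exactly; the adaptive version in the papers above replaces this exact draw with importance resampling from a parallel auxiliary chain, which the sketch below omits.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 60
theta_true = 0.8
y = rng.random(n) < 1 / (1 + np.exp(-theta_true))  # observed binary data,
                                                   # standing in for network dyads

def s(x):                                   # sufficient statistic
    return x.sum()

def sample_aux(theta, size):                # exact draw from the model
    return rng.random(size) < 1 / (1 + np.exp(-theta))

# Unnormalized model q(x | theta) = exp(theta * s(x)); we pretend its
# normalizing constant Z(theta) is intractable, as for an ERGM.
theta, draws = 0.0, []
for it in range(5000):
    prop = theta + rng.normal(0, 0.3)
    x = sample_aux(prop, n)                 # auxiliary data at the proposal
    # Exchange ratio: the unknown normalizing constants cancel exactly
    log_r = (prop - theta) * (s(y) - s(x))
    if np.log(rng.random()) < log_r:        # flat prior, symmetric proposal
        theta = prop
    draws.append(theta)
print("posterior mean of theta:", np.mean(draws[1000:]))
```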
Bayesian Analysis of $C_{x'}$ and $C_{z'}$ Double Polarizations in Kaon Photoproduction
Hutauruk, P T P
2010-01-01
The latest experimental data on the $C_{x'}$ and $C_{z'}$ double polarizations for the $\gamma + p \to K^{+} + \Lambda$ reaction have been analyzed. In theoretical calculations, all of these observables can be classified into four Legendre classes and represented by associated Legendre polynomial functions \cite{fasano92}. In this analysis we attempt to determine the best data model for both observables. We use the Bayesian technique to select the best model by calculating the posterior probabilities and comparing the posteriors among the models. The posterior probabilities for each data model are computed using nested sampling integration. From this analysis we conclude that the $C_{x'}$ and $C_{z'}$ double polarizations require associated Legendre polynomials up to second and third order, respectively, to describe the data well. The extracted coefficients of each observable will also be presented; they qualitatively reveal the structure of the baryon resonances.
Bayesian Analysis of Hybrid EoS based on Astrophysical Observational Data
Alvarez-Castillo, David; Blaschke, David; Grigorian, Hovik
2014-01-01
The most basic features of a neutron star (NS) are its radius and mass, which so far have not been well determined simultaneously for a single object. In some cases, such as binary systems, masses are precisely measured, but radii remain quite uncertain. On the other hand, for isolated neutron stars some radius and mass measurements exist but lack the precision necessary to probe their interiors. However, the present observational data allow a probabilistic estimation of the internal structure of the star. In this work a preliminary probabilistic estimation of the equation of state of superdense stellar matter, using Bayesian analysis and modelling of relativistic configurations of neutron stars, is presented. This analysis is important for research into the existence of quark-gluon plasma in massive (around two solar masses) neutron stars.
Mass-radius constraints for the neutron star EoS - Bayesian analysis
Ayriyan, A; Blaschke, D; Grigorian, H
2015-01-01
We suggest a new Bayesian analysis (BA) using disjunct M-R constraints for extracting probability measures for cold, dense matter equations of state (EoS). One of the key issues of such an analysis is the question of a deconfinement transition in compact stars and whether it proceeds as a crossover or rather as a first-order transition. We show, by postulating results of not-yet-existing radius measurements for the known pulsars with a mass of $2M_\odot$, that a radius gap of about 3 km would clearly select an EoS with a strong first-order phase transition as the most probable one. This would support the existence of a critical endpoint in the QCD phase diagram under scrutiny in present and upcoming heavy-ion collision experiments.
High-density EEG coherence analysis using functional units applied to mental fatigue
Caat, Michael ten; Lorist, Monicque M.; Bezdan, Eniko; Roerdink, Jos B.T.M.; Maurits, Natasha M.
2008-01-01
Electroencephalography (EEG) coherence provides a quantitative measure of functional brain connectivity which is calculated between pairs of signals as a function of frequency. Without hypotheses, traditional coherence analysis would be cumbersome for high-density EEG which employs a large number of
Directory of Open Access Journals (Sweden)
Hakan Sarikaya
Full Text Available OBJECTIVE: To compare the effects of antiplatelets and anticoagulants on stroke and death in patients with acute cervical artery dissection. DESIGN: Systematic review with Bayesian meta-analysis. DATA SOURCES: The reviewers searched MEDLINE and EMBASE from inception to November 2012, checked reference lists, and contacted authors. STUDY SELECTION: Studies were eligible if they were randomised, quasi-randomised or observational comparisons of antiplatelets and anticoagulants in patients with cervical artery dissection. DATA EXTRACTION: Data were extracted by one reviewer and checked by another. Bayesian techniques were used to appropriately account for studies with scarce event data and imbalances in the size of comparison groups. DATA SYNTHESIS: Thirty-seven studies (1991 patients) were included. We found no randomised trial. The primary analysis revealed a large treatment effect in favour of antiplatelets for preventing the primary composite outcome of ischaemic stroke, intracranial haemorrhage or death within the first 3 months after treatment initiation (relative risk 0.32, 95% credibility interval 0.12 to 0.63), while the degree of between-study heterogeneity was moderate (τ² = 0.18). In an analysis restricted to studies of higher methodological quality, the possible advantage of antiplatelets over anticoagulants was less obvious than in the main analysis (relative risk 0.73, 95% credibility interval 0.17 to 2.30). CONCLUSION: In view of these results and the safety advantages, easier usage and lower cost of antiplatelets, we conclude that antiplatelets should be given precedence over anticoagulants as a first-line treatment in patients with cervical artery dissection unless results of an adequately powered randomised trial suggest the opposite.
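A minimal Bayesian random-effects meta-analysis on the log relative-risk scale, with a random-walk Metropolis sampler, conveys the flavour of the computation; the study-level inputs below are hypothetical, not the review's data.

```python
import numpy as np

rng = np.random.default_rng(11)
# Hypothetical study-level log relative risks and standard errors (stand-ins)
log_rr = np.array([-1.2, -0.8, -1.5, 0.1, -0.6])
se     = np.array([ 0.6,  0.5,  0.9, 0.7,  0.4])

def log_post(mu, log_tau):
    tau2 = np.exp(2 * log_tau)
    var = se**2 + tau2                      # marginal variance per study
    loglik = -0.5 * np.sum((log_rr - mu)**2 / var + np.log(var))
    return loglik - 0.5 * (mu / 2)**2 - 0.5 * (log_tau / 2)**2  # weak priors

mu, lt = 0.0, 0.0
lp, chain = log_post(mu, lt), []
for it in range(20000):
    mu_p, lt_p = mu + rng.normal(0, 0.2), lt + rng.normal(0, 0.2)
    lp_p = log_post(mu_p, lt_p)
    if np.log(rng.random()) < lp_p - lp:    # Metropolis accept/reject
        mu, lt, lp = mu_p, lt_p, lp_p
    chain.append(mu)
post_rr = np.exp(np.array(chain[5000:]))    # back to the relative-risk scale
print("RR (2.5%, 50%, 97.5%):", np.percentile(post_rr, [2.5, 50, 97.5]).round(2))
```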
Ex vivo brain tumor analysis using spectroscopic optical coherence tomography
Lenz, Marcel; Krug, Robin; Welp, Hubert; Schmieder, Kirsten; Hofmann, Martin R.
2016-03-01
A big challenge during neurosurgery is to distinguish between healthy and cancerous tissue, but currently a suitable non-invasive, real-time imaging modality is not available. Optical Coherence Tomography (OCT) is a potential technique for such a modality. OCT has a penetration depth of 1-2 mm and a resolution of 1-15 μm, which is sufficient to reveal structural differences between healthy tissue and brain tumor. Therefore, we investigated gray and white matter of healthy central nervous system and meningioma samples with a Spectral Domain OCT system (Thorlabs Callisto). Additional OCT images were generated after paraffin embedding and after the samples were cut into 10 μm thin slices for histological investigation with a bright-field microscope. All samples were stained with hematoxylin and eosin. In all cases B-scans and 3D images were made. Furthermore, a camera image of the investigated area was made by the built-in video camera of our OCT system. For orientation, the backsides of all samples were marked with blue ink. The structural differences between healthy tissue and meningioma samples were most pronounced directly after removal; after paraffin embedding these differences diminished. A correlation between OCT en face images and microscopy images can be seen. In order to increase contrast, post-processing algorithms were applied. Hence we employed spectroscopic OCT, pattern recognition algorithms, and machine learning algorithms such as k-means clustering and principal component analysis.
Krychowiak, M.; König, R.; Klinger, T.; Fischer, R.
2004-11-01
At the stellarator Wendelstein 7-AS (W7-AS) a spectrally resolving two-channel system for the measurement of line-of-sight-averaged Zeff values has been tested in preparation for its planned installation as a multichannel Zeff-profile measurement system on the stellarator Wendelstein 7-X (W7-X), which is presently under construction. The measurement uses the bremsstrahlung intensity in the wavelength region from the ultraviolet to the near infrared. The spectrally resolved measurement makes it possible to eliminate signal contamination by line radiation. For statistical data analysis a procedure based on Bayesian probability theory has been developed. With this method it is possible to estimate the bremsstrahlung background in the measured signal and its error without the necessity of fitting the spectral lines. For evaluation of the random error in Zeff the signal noise has been investigated. Furthermore, the linearity and the behavior of the charge-coupled device detector at saturation have been analyzed.
Bayesian analysis of flaw sizing data of the NESC III exercise
International Nuclear Information System (INIS)
Non-destructive inspections are performed to give confidence of the non-existence of flaws exceeding a certain safe limit in the inspected structural component. The principal uncertainties related to these inspections are the probability of not detecting an existing flaw larger than a given size, the probability of a false call, and the uncertainty related to the sizing of a flaw. Inspection reliability models aim to account for these uncertainties. This paper presents the analysis of sizing uncertainty of flaws for the results of the NESC III Round Robin Trials on defect-containing dissimilar metal welds. Model parameters are first estimated to characterize the sizing capabilities of various teams. A Bayesian updating of the flaw depth distribution is then demonstrated by combining information from measurement results and sizing performance
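The Bayesian updating of a flaw-depth distribution from a sizing measurement can be sketched on a grid: a prior depth distribution is multiplied by the likelihood implied by a linear sizing-error model. All parameter values below are illustrative, not those estimated from the Round Robin Trials.

```python
import numpy as np
from scipy.stats import lognorm, norm

# Grid over the true flaw depth (mm); prior parameters are illustrative
depth = np.linspace(0.1, 30.0, 2000)
d = depth[1] - depth[0]
prior = lognorm.pdf(depth, s=0.6, scale=6.0)

# Linear sizing-error model characterizing one team's performance:
# measured = a + b * true + Gaussian noise (hypothetical parameters)
a, b, sigma = 0.5, 0.95, 1.2
measured = 8.0                               # one reported flaw depth (mm)

likelihood = norm.pdf(measured, loc=a + b * depth, scale=sigma)
post = prior * likelihood
post /= post.sum() * d                       # normalize on the grid

print(f"posterior mean true depth: {np.sum(depth * post) * d:.2f} mm")
print(f"P(depth > 10 mm | measurement): {np.sum(post[depth > 10]) * d:.3f}")
```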
Constraints on cosmic-ray propagation models from a global Bayesian analysis
Trotta, R; Moskalenko, I V; Porter, T A; de Austri, R Ruiz; Strong, A W
2010-01-01
Research in many areas of modern physics, such as indirect searches for dark matter and particle acceleration in SNR shocks, relies heavily on studies of cosmic rays (CRs) and associated diffuse emissions (radio, microwave, X-rays, gamma rays). While very detailed numerical models of CR propagation exist, a quantitative statistical analysis of such models has so far been hampered by the large computational effort that those models require. Although statistical analyses have been carried out before using semi-analytical models (where the computation is much faster), the evaluation of the results obtained from such models is difficult, as they necessarily suffer from many simplifying assumptions. The main objective of this paper is to present a working method for a full Bayesian parameter estimation for a numerical CR propagation model. For this study, we use the GALPROP code, the most advanced of its kind, that uses astrophysical information, nuclear and particle data as input to self-consistently predict ...
Intuitive logic revisited: new data and a Bayesian mixed model meta-analysis.
Directory of Open Access Journals (Sweden)
Henrik Singmann
Full Text Available Recent research on syllogistic reasoning suggests that the logical status (valid vs. invalid) of even difficult syllogisms can be intuitively detected via differences in conceptual fluency between logically valid and invalid syllogisms when participants are asked to rate how much they like a conclusion following from a syllogism (Morsanyi & Handley, 2012). These claims of an intuitive logic are at odds with most theories on syllogistic reasoning which posit that detecting the logical status of difficult syllogisms requires effortful and deliberate cognitive processes. We present new data replicating the effects reported by Morsanyi and Handley, but show that this effect is eliminated when controlling for a possible confound in terms of conclusion content. Additionally, we reanalyze three studies (n = 287) without this confound with a Bayesian mixed model meta-analysis (i.e., controlling for participant and item effects), which provides evidence for the null-hypothesis and against Morsanyi and Handley's claim.
Bayesian semiparametric power spectral density estimation in gravitational wave data analysis
Edwards, Matthew C; Christensen, Nelson
2015-01-01
The standard noise model in gravitational wave (GW) data analysis assumes detector noise is stationary and Gaussian distributed, with a known power spectral density (PSD) that is usually estimated using clean off-source data. Real GW data often depart from these assumptions, and misspecified parametric models of the PSD could result in misleading inferences. We propose a Bayesian semiparametric approach to improve this. We use a nonparametric Bernstein polynomial prior on the PSD, with weights attained via a Dirichlet process distribution, and update this using the Whittle likelihood. Posterior samples are obtained using a Metropolis-within-Gibbs sampler. We simultaneously estimate the reconstruction parameters of a rotating core collapse supernova GW burst that has been embedded in simulated Advanced LIGO noise. We also discuss an approach to deal with non-stationary data by breaking longer data streams into smaller and locally stationary components.
Edwards, Matthew C.; Meyer, Renate; Christensen, Nelson
2015-09-01
The standard noise model in gravitational wave (GW) data analysis assumes detector noise is stationary and Gaussian distributed, with a known power spectral density (PSD) that is usually estimated using clean off-source data. Real GW data often depart from these assumptions, and misspecified parametric models of the PSD could result in misleading inferences. We propose a Bayesian semiparametric approach to improve this. We use a nonparametric Bernstein polynomial prior on the PSD, with weights attained via a Dirichlet process distribution, and update this using the Whittle likelihood. Posterior samples are obtained using a blocked Metropolis-within-Gibbs sampler. We simultaneously estimate the reconstruction parameters of a rotating core collapse supernova GW burst that has been embedded in simulated Advanced LIGO noise. We also discuss an approach to deal with nonstationary data by breaking longer data streams into smaller and locally stationary components.
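The two ingredients here, a Bernstein-polynomial PSD built from Beta densities and the Whittle likelihood over the periodogram, can be sketched as follows; the Dirichlet-process prior on the weights and the blocked Metropolis-within-Gibbs updates are omitted, and the white-noise input is a stand-in for detector data.

```python
import numpy as np
from scipy.stats import beta

def periodogram(x):
    n = len(x)
    f = np.fft.rfftfreq(n)[1:-1]              # interior Fourier frequencies
    I = np.abs(np.fft.rfft(x))[1:-1] ** 2 / n # raw periodogram
    return f, I

def bernstein_psd(weights, f):
    """PSD as a Bernstein-polynomial mixture of Beta densities on [0, 0.5]."""
    k = len(weights)
    x = f / 0.5                               # map frequency to the unit interval
    basis = np.array([beta.pdf(x, j, k - j + 1) for j in range(1, k + 1)])
    return weights @ basis / 0.5              # change of variables back to [0, 0.5]

def whittle_loglik(weights, f, I):
    S = bernstein_psd(weights, f)
    return -np.sum(np.log(S) + I / S)         # Whittle approximation

rng = np.random.default_rng(5)
x = rng.normal(size=1024)                     # white noise as a data stand-in
f, I = periodogram(x)
w = np.full(8, 1.0 / 8)                       # flat Dirichlet-style weights
print("Whittle log-likelihood:", whittle_loglik(w, f, I))
```

In the full method the weight vector is given a nonparametric prior and updated by MCMC, so the PSD estimate adapts its smoothness to the data rather than assuming a parametric noise model.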
Bayesian analysis of general failure data from an ageing distribution: advances in numerical methods
Energy Technology Data Exchange (ETDEWEB)
Procaccia, H.; Villain, B. [Electricite de France (EDF), 93 - Saint-Denis (France)]; Clarotti, C.A. [ENEA, Casaccia (Italy)]
1996-12-31
EDF and ENEA carried out a joint research program for developing the numerical methods and computer codes needed for Bayesian analysis of component lives in the case of ageing. Early results of this study were presented at ESREL'94. Since then, the following further steps have been taken: input data have been generalized to the case where observed lives are censored both on the right and on the left; allowable life distributions are Weibull and gamma, whose parameters are both unknown and can be statistically dependent; allowable priors are histograms relative to different parametrizations of the life distribution of concern; first- and second-order moments of the posterior distributions can be computed. In particular, the covariance gives important information about the degree of statistical dependence between the parameters of interest. An application of the code to the appearance of stress corrosion cracking in a tube of the PWR Steam Generator system is presented. (authors). 10 refs.
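A reduced version of the likelihood construction, with exact, right-censored and left-censored Weibull lifetimes evaluated on a parameter grid, is sketched below; the data and the uniform grid prior are illustrative, in place of the histogram priors the codes support.

```python
import numpy as np
from scipy.stats import weibull_min

# Component lives (years): exact failures, right- and left-censored times (toy)
t_fail = np.array([12.0, 15.5, 19.0, 23.5])
t_right = np.array([20.0, 25.0, 25.0])     # still operating at last inspection
t_left = np.array([10.0])                  # already failed when first inspected

def loglik(shape, scale):
    d = weibull_min(shape, scale=scale)
    return (d.logpdf(t_fail).sum()          # observed failures:  f(t)
            + d.logsf(t_right).sum()        # right-censored:     S(t)
            + np.log(d.cdf(t_left)).sum())  # left-censored:      F(t)

# Posterior over a (shape, scale) grid with an illustrative uniform prior
shapes = np.linspace(0.5, 6.0, 120)
scales = np.linspace(10.0, 40.0, 120)
logpost = np.array([[loglik(k, s) for s in scales] for k in shapes])
post = np.exp(logpost - logpost.max())
post /= post.sum()

marg_shape = post.sum(axis=1)
print("posterior mean shape:", (shapes * marg_shape).sum())
print("P(shape > 1 | data):", marg_shape[shapes > 1].sum())  # evidence of ageing
```

A Weibull shape parameter above one corresponds to an increasing hazard rate, which is exactly the ageing question the joint program addresses.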
Bayesian Reliability Analysis of Non-Stationarity in Multi-agent Systems
Directory of Open Access Journals (Sweden)
TONT Gabriela
2013-05-01
Full Text Available Bayesian methods provide information about the meaningful parameters in a statistical analysis obtained by combining the prior and sampling distributions to form the posterior distribution of the parameters. The desired inferences are obtained from this joint posterior. An estimation strategy for hierarchical models, where the resulting joint distribution of the associated model parameters cannot be evaluated analytically, is to use sampling algorithms, known as Markov Chain Monte Carlo (MCMC) methods, from which approximate solutions can be obtained. Both serial and parallel configurations of subcomponents are permitted. The capability of the time-dependent method to describe a multi-state system is demonstrated via a case study assessing the operational situation of the studied system. The rationality and validity of the presented model are demonstrated through this case study. The effect of randomness of the structural parameters is also examined.
Selection of Trusted Service Providers by Enforcing Bayesian Analysis in iVCE
Institute of Scientific and Technical Information of China (English)
GU Bao-jun; LI Xiao-yong; WANG Wei-nong
2008-01-01
The initiative of the internet-based virtual computing environment (iVCE) aims to provide end users and applications with a harmonious, trustworthy and transparent integrated computing environment which will facilitate sharing and collaboration of network resources between applications. Trust management is an elementary component of iVCE. The uncertain and dynamic characteristics of iVCE necessitate that trust management be subjective, based on historical evidence, and context dependent. This paper presents a Bayesian analysis-based trust model, which aims to help active agents select appropriate trusted services in iVCE. Simulations are made to analyze the properties of the trust model, which show that subjective prior information strongly influences trust evaluation and that the model stimulates positive interactions.
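The abstract does not spell out the model, but a standard Bayesian trust formulation with these properties is the Beta-Bernoulli update, sketched here on invented interaction histories; taking a lower posterior quantile rather than the mean penalizes providers with little accumulated evidence.

```python
from scipy.stats import beta

# Interaction history with two candidate service providers:
# (successful, failed) outcomes observed so far (illustrative counts)
history = {"provider_A": (18, 2), "provider_B": (4, 0)}

for name, (s, f) in history.items():
    # Beta(1, 1) prior updated by Bernoulli outcomes -> Beta(1 + s, 1 + f)
    post = beta(1 + s, 1 + f)
    trust = post.mean()                      # expected success probability
    pessimistic = post.ppf(0.05)             # lower bound for risk-averse selection
    print(f"{name}: E[trust] = {trust:.3f}, 5% quantile = {pessimistic:.3f}")
```

With these toy counts, provider_B has the higher raw success rate (4/4) but the lower 5% quantile, illustrating why a Bayesian treatment of sparse evidence matters for service selection.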
Lee, Eun Gyung; Kim, Seung Won; Feigley, Charles E; Harper, Martin
2013-01-01
This study introduces two semi-quantitative methods, Structured Subjective Assessment (SSA) and Control of Substances Hazardous to Health (COSHH) Essentials, in conjunction with two-dimensional Monte Carlo simulations for determining prior probabilities. Prior distribution using expert judgment was included for comparison. Practical applications of the proposed methods were demonstrated using personal exposure measurements of isoamyl acetate in an electronics manufacturing facility and of isopropanol in a printing shop. Applicability of these methods in real workplaces was discussed based on the advantages and disadvantages of each method. Although these methods could not be completely independent of expert judgments, this study demonstrated a methodological improvement in the estimation of the prior distribution for the Bayesian decision analysis tool. The proposed methods provide a logical basis for the decision process by considering determinants of worker exposure.
Fermi's paradox, extraterrestrial life and the future of humanity: a Bayesian analysis
Verendel, Vilhelm
2015-01-01
The Great Filter interpretation of Fermi's great silence asserts that $Npq$ is not a very large number, where $N$ is the number of potentially life-supporting planets in the observable universe, $p$ is the probability that a randomly chosen such planet develops intelligent life to the level of present-day human civilization, and $q$ is the conditional probability that it then goes on to develop a technological supercivilization visible all over the observable universe. Evidence suggests that $N$ is huge, which implies that $pq$ is very small. Hanson (1998) and Bostrom (2008) have argued that the discovery of extraterrestrial life would point towards $p$ not being small and therefore a very small $q$, which can be seen as bad news for humanity's prospects of colonizing the universe. Here we investigate whether a Bayesian analysis supports their argument, and the answer turns out to depend critically on the choice of prior distribution.
Intuitive logic revisited: new data and a Bayesian mixed model meta-analysis.
Singmann, Henrik; Klauer, Karl Christoph; Kellen, David
2014-01-01
Recent research on syllogistic reasoning suggests that the logical status (valid vs. invalid) of even difficult syllogisms can be intuitively detected via differences in conceptual fluency between logically valid and invalid syllogisms when participants are asked to rate how much they like a conclusion following from a syllogism (Morsanyi & Handley, 2012). These claims of an intuitive logic are at odds with most theories on syllogistic reasoning which posit that detecting the logical status of difficult syllogisms requires effortful and deliberate cognitive processes. We present new data replicating the effects reported by Morsanyi and Handley, but show that this effect is eliminated when controlling for a possible confound in terms of conclusion content. Additionally, we reanalyze three studies (n = 287) without this confound with a Bayesian mixed model meta-analysis (i.e., controlling for participant and item effects) which provides evidence for the null-hypothesis and against Morsanyi and Handley's claim.
Zhang, Hua; Huo, Mingdong; Chao, Jianqian; Liu, Pei
2016-01-01
Background: Hepatitis B virus (HBV) infection is a major problem for public health; timely antiviral treatment can significantly prevent the progression of liver damage from HBV by slowing down or stopping the virus from reproducing. In this study we applied a Bayesian approach to cost-effectiveness analysis, using Markov Chain Monte Carlo (MCMC) simulation methods for the relevant evidence input into the model, to evaluate the cost-effectiveness of entecavir (ETV) and lamivudine (LVD) therapy for chronic hepatitis B (CHB) in Jiangsu, China, thus providing information to the public health system on CHB therapy. Methods: An eight-stage Markov model was developed, and a hypothetical cohort of 35-year-old HBeAg-positive patients with CHB was entered into the model. Treatment regimens were LVD 100 mg daily and ETV 0.5 mg daily. The transition parameters were derived either from systematic reviews of the literature or from previous economic studies. The outcome measures were life-years, quality-adjusted life-years (QALYs), and expected costs associated with the treatments and disease progression. For the Bayesian models all the analysis was implemented using WinBUGS version 1.4. Results: Expected cost, life expectancy, and QALYs decreased with age, while cost-effectiveness increased with age. The expected cost of ETV was less than that of LVD, while its life expectancy and QALYs were higher, so the ETV strategy was more cost-effective. Costs and benefits from the Monte Carlo simulation were very close to the exact results for the group, but the standard deviation of each group indicated large differences between individual patients. Conclusions: Compared with lamivudine, entecavir is the more cost-effective option. CHB patients should accept antiviral treatment as soon as possible, as the lower the age the more cost-effective the treatment. The Monte Carlo simulation reproduced the cost and effectiveness distributions, indicating that our Markov model is robust. PMID:27574976
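The deterministic core of such an analysis is a Markov cohort model; the sketch below computes discounted costs, QALYs and the incremental cost-effectiveness ratio for a toy three-state model whose transition probabilities, costs and utilities are invented, not the study's inputs.

```python
import numpy as np

# Toy 3-state Markov cohort model (CHB -> cirrhosis -> death), annual cycles.
# Rows of each transition matrix sum to 1; all inputs are illustrative.
P = {"ETV": np.array([[0.95, 0.04, 0.01],
                      [0.00, 0.90, 0.10],
                      [0.00, 0.00, 1.00]]),
     "LVD": np.array([[0.90, 0.08, 0.02],
                      [0.00, 0.88, 0.12],
                      [0.00, 0.00, 1.00]])}
cost = {"ETV": np.array([2500.0, 6000.0, 0.0]),   # annual cost per state
        "LVD": np.array([1200.0, 6000.0, 0.0])}
utility = np.array([0.85, 0.60, 0.0])             # QALY weight per state

def run(strategy, cycles=30, discount=0.03):
    state = np.array([1.0, 0.0, 0.0])             # cohort starts in the CHB state
    total_cost = total_qaly = 0.0
    for t in range(cycles):
        disc = 1.0 / (1.0 + discount) ** t
        total_cost += disc * state @ cost[strategy]
        total_qaly += disc * state @ utility
        state = state @ P[strategy]               # advance the cohort one cycle
    return total_cost, total_qaly

(c_etv, q_etv), (c_lvd, q_lvd) = run("ETV"), run("LVD")
icer = (c_etv - c_lvd) / (q_etv - q_lvd)          # incremental ratio
print(f"ICER of ETV vs LVD: {icer:.0f} per QALY gained")
```

In the Bayesian version the transition and cost parameters are drawn from posterior (MCMC) distributions, so running this cohort model over the draws yields distributions of costs and QALYs rather than point estimates.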
Linkov, Igor; Massey, Olivia; Keisler, Jeff; Rusyn, Ivan; Hartung, Thomas
2015-01-01
"Weighing" available evidence in the process of decision-making is unavoidable, yet it is one step that routinely raises suspicions: what evidence should be used, how much does it weigh, and whose thumb may be tipping the scales? This commentary aims to evaluate the current state and future roles of various types of evidence for hazard assessment as it applies to environmental health. In its recent evaluation of the US Environmental Protection Agency's Integrated Risk Information System assessment process, the National Research Council committee singled out the term "weight of evidence" (WoE) for critique, deeming the process too vague and detractive to the practice of evaluating human health risks of chemicals. Moving the methodology away from qualitative, vague and controversial methods towards generalizable, quantitative and transparent methods for appropriately managing diverse lines of evidence is paramount for both regulatory and public acceptance of the hazard assessments. The choice of terminology notwithstanding, a number of recent Bayesian WoE-based methods, the emergence of multi criteria decision analysis for WoE applications, as well as the general principles behind the foundational concepts of WoE, show promise in how to move forward and regain trust in the data integration step of the assessments. We offer our thoughts on the current state of WoE as a whole and while we acknowledge that many WoE applications have been largely qualitative and subjective in nature, we see this as an opportunity to turn WoE towards a quantitative direction that includes Bayesian and multi criteria decision analysis.
BGX: a Bioconductor package for the Bayesian integrated analysis of Affymetrix GeneChips
Directory of Open Access Journals (Sweden)
Hein Anne-Mette K
2007-11-01
Full Text Available Abstract Background: Affymetrix 3' GeneChip microarrays are widely used to profile the expression of thousands of genes simultaneously. They differ from many other microarray types in that GeneChips are hybridised using a single labelled extract and because they contain multiple 'match' and 'mismatch' sequences for each transcript. Most algorithms extract the signal from GeneChip experiments in a sequence of separate steps, including background correction and normalisation, which inhibits the simultaneous use of all available information. They principally provide a point estimate of gene expression and, in contrast to BGX, do not fully integrate the uncertainty arising from potentially heterogeneous responses of the probes. Results: BGX is a new Bioconductor R package that implements an integrated Bayesian approach to the analysis of 3' GeneChip data. The software takes into account additive and multiplicative error, non-specific hybridisation and replicate summarisation in the spirit of the model outlined in [1]. It also provides a posterior distribution for the expression of each gene. Moreover, BGX can take into account probe affinity effects from probe sequence information where available. The package employs a novel adaptive Markov chain Monte Carlo (MCMC) algorithm that raises considerably the efficiency with which the posterior distributions are sampled. Finally, BGX incorporates various ways to analyse the results, such as ranking genes by expression level as well as statistically based methods for estimating the number of up- and down-regulated genes between two conditions. Conclusion: BGX performs well relative to other widely used methods at estimating expression levels and fold changes. It has the advantage that it provides a statistically sound measure of uncertainty for its estimates. BGX includes various analysis functions to visualise and exploit the rich output that is produced by the Bayesian model.
A parametric Bayesian combination of local and regional information in flood frequency analysis
Seidou, O.; Ouarda, T. B. M. J.; Barbet, M.; Bruneau, P.; Bobée, B.
2006-11-01
Because of their impact on hydraulic structure design as well as on floodplain management, flood quantiles must be estimated with the highest precision given available information. If the site of interest has been monitored for a sufficiently long period (more than 30-40 years), at-site frequency analysis can be used to estimate flood quantiles with a fair precision. Otherwise, regional estimation may be used to mitigate the lack of data, but local information is then ignored. A commonly used approach to combine at-site and regional information is the linear empirical Bayes estimation: under the assumption that both local and regional flood quantile estimators have a normal distribution, the empirical Bayesian estimator of the true quantile is the weighted average of both estimations. The weighting factor for each estimator is inversely proportional to its variance. We propose in this paper an alternative Bayesian method for combining local and regional information which provides the full probability density of quantiles and parameters. The application of the method is made with the generalized extreme values (GEV) distribution, but it can be extended to other types of extreme value distributions. In this method the prior distributions are obtained using a regional log-linear regression model, and then local observations are used within a Markov chain Monte Carlo algorithm to infer the posterior distributions of parameters and quantiles. Unlike the empirical Bayesian approach the proposed method works even with a single local observation. It also relaxes the hypothesis of normality of the local quantiles probability distribution. The performance of the proposed methodology is compared to that of local, regional, and empirical Bayes estimators on three generated regional data sets with different statistical characteristics. The results show that (1) when the regional log-linear model is unbiased, the proposed method gives better estimations of the GEV quantiles and…
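The linear empirical Bayes combination that the proposed method generalizes is a one-liner, shown here with hypothetical at-site and regional quantile estimates.

```python
import numpy as np

# Hypothetical 100-year flood quantile estimates (m^3/s) for one site:
# an at-site estimate and a regional estimate, with their variances
q_local, var_local = 950.0, 120.0**2
q_regional, var_regional = 820.0, 80.0**2

# Linear empirical Bayes: weights inversely proportional to the variances
w_local = (1 / var_local) / (1 / var_local + 1 / var_regional)
q_combined = w_local * q_local + (1 - w_local) * q_regional
var_combined = 1 / (1 / var_local + 1 / var_regional)

print(f"combined quantile: {q_combined:.0f} m^3/s "
      f"(sd {np.sqrt(var_combined):.0f})")
```

The combined variance is smaller than either input variance, which is the efficiency gain the fully Bayesian method retains while dropping the normality assumption.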
Sea-level variability in tide-gauge and geological records: An empirical Bayesian analysis (Invited)
Kopp, R. E.; Hay, C.; Morrow, E.; Mitrovica, J. X.; Horton, B.; Kemp, A.
2013-12-01
Sea level varies at a range of temporal and spatial scales, and understanding all its significant sources of variability is crucial to building sea-level rise projections relevant to local decision-making. In the twentieth-century record, sites along the U.S. east coast have exhibited typical year-to-year variability of several centimeters. A faster-than-global increase in sea-level rise in the northeastern United States since about 1990 has led some to hypothesize a 'sea-level rise hot spot' in this region, perhaps driven by a trend in the Atlantic Meridional Overturning Circulation related to anthropogenic climate change [1]. However, such hypotheses must be evaluated in the context of natural variability, as revealed by observational and paleo-records. Bayesian and empirical Bayesian statistical approaches are well suited for assimilating data from diverse sources, such as tide-gauges and peats, with differing data availability and uncertainties, and for identifying regionally covarying patterns within these data. We present empirical Bayesian analyses of twentieth-century tide gauge data [2]. We find that the mid-Atlantic region of the United States has experienced a clear acceleration of sea level relative to the global average since about 1990, but this acceleration does not appear to be unprecedented in the twentieth-century record. The rate and extent of this acceleration instead appears comparable to an acceleration observed in the 1930s and 1940s. Both during the earlier episode of acceleration and today, the effect appears to be significantly positively correlated with the Atlantic Multidecadal Oscillation and likely negatively correlated with the North Atlantic Oscillation [2]. The Holocene and Common Era database of geological sea-level rise proxies [3,4] may allow these relationships to be assessed beyond the span of the direct observational record. At a global scale, similar approaches can be employed to look for the spatial fingerprints of land ice
Hip fracture in the elderly: a re-analysis of the EPIDOS study with causal Bayesian networks.
Directory of Open Access Journals (Sweden)
Pascal Caillet
Full Text Available Hip fractures commonly result in permanent disability, institutionalization or death in the elderly. Existing hip-fracture prediction tools are underused in clinical practice, partly due to their lack of intuitive interpretation. By use of a graphical layer, Bayesian network models could increase the attractiveness of fracture prediction tools. Our aim was to study the potential contribution of a causal Bayesian network in this clinical setting. A logistic regression was performed as a standard control approach to check the robustness of the causal Bayesian network approach. EPIDOS is a multicenter study, conducted in an ambulatory care setting in five French cities between 1992 and 1996 and updated in 2010. The study included 7598 women aged 75 years or older, in whom fractures were assessed quarterly during 4 years. A causal Bayesian network and a logistic regression were performed on EPIDOS data to describe the major variables involved in hip fracture occurrences. Both models had similar association estimates and predictive performances. They detected gait speed and bone mineral density as the variables most involved in the fracture process. The causal Bayesian network showed that gait speed and bone mineral density were directly connected to fracture and seem to mediate the influence of all the other variables included in our model. The logistic regression approach detected multiple interactions involving psychotropic drug use, age and bone mineral density. Both approaches retrieved similar variables as predictors of hip fractures. However, the Bayesian network highlighted the whole web of relations between the variables involved in the analysis, suggesting a possible mechanism leading to hip fracture. According to the latter results, intervention focusing concomitantly on gait speed and bone mineral density may be necessary for an optimal prevention of hip fracture occurrence in elderly people.
Theoretical and numerical analysis of coherent Smith-Purcell radiation
Institute of Scientific and Technical Information of China (English)
None
2008-01-01
Coherent enhancement of Smith-Purcell radiation has attracted attention not only as a route to better radiation sources but also for beam diagnostics. In this paper, we study the intrinsic mechanism of coherent Smith-Purcell radiation on the basis of the van den Berg model. The emitted power of Smith-Purcell radiation is determined by the bunch profile in the transverse and longitudinal directions. For a short bunch whose longitudinal length is comparable to the radiation wavelength, the power is approximately proportional to the square of the number of electrons per bunch.
Bias correction and Bayesian analysis of aggregate counts in SAGE libraries
Directory of Open Access Journals (Sweden)
Briggs William M
2010-02-01
Full Text Available Abstract Background Tag-based techniques, such as SAGE, are commonly used to sample the mRNA pool of an organism's transcriptome. Incomplete digestion during the tag formation process may allow multiple tags to be generated from a given mRNA transcript. The probability of forming a tag varies with its relative location. As a result, the observed tag counts represent a biased sample of the actual transcript pool. In SAGE this bias can be avoided by ignoring all but the 3'-most tag, but doing so discards a large fraction of the observed data. Taking this bias into account should allow more of the available data to be used, leading to increased statistical power. Results Three new hierarchical models, which directly embed a model for the variation in tag formation probability, are proposed and their associated Bayesian inference algorithms are developed. These models may be applied to libraries at both the tag and aggregate level. Simulation experiments and analysis of real data are used to contrast the accuracy of the various methods. The consequences of tag formation bias are discussed in the context of testing differential expression, and a description is given of how these algorithms can be applied in that context. Conclusions Several Bayesian inference algorithms that account for tag formation effects are compared with the DPB algorithm, providing clear evidence of superior performance. The accuracy of inferences when using a particular non-informative prior is found to depend on the expression level of a given gene. The multivariate nature of the approach easily allows both univariate and joint tests of differential expression. Calculations demonstrate the potential for false positive and negative findings due to variation in tag formation probabilities across samples when testing for differential expression.
Gamma prior distribution selection for Bayesian analysis of failure rate and reliability
International Nuclear Information System (INIS)
It is assumed that the phenomenon under study is such that the time-to-failure may be modeled by an exponential distribution with failure rate lambda. For Bayesian analyses of the assumed model, the family of gamma distributions provides conjugate prior models for lambda. Thus, an experimenter needs to select a particular gamma model to conduct a Bayesian reliability analysis. The purpose of this report is to present a methodology that can be used to translate engineering information, experience, and judgment into a choice of a gamma prior distribution. The proposed methodology assumes that the practicing engineer can provide percentile data relating to either the failure rate or the reliability of the phenomenon being investigated. For example, the methodology will select the gamma prior distribution which conveys an engineer's belief that the failure rate lambda simultaneously satisfies the probability statements P(lambda < 1.0 × 10^-3) = 0.50 and P(lambda < 1.0 × 10^-5) = 0.05. That is, two percentiles provided by an engineer are used to determine a gamma prior model which agrees with the specified percentiles. For those engineers who prefer to specify reliability percentiles rather than the failure rate percentiles illustrated above, it is possible to use the induced negative-log gamma prior distribution which satisfies the probability statements P(R(t0) < 0.99) = 0.50 and P(R(t0) < 0.99999) = 0.95, for some operating time t0. The report also includes graphs for selected percentiles which assist an engineer in applying the procedure. 28 figures, 16 tables
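The percentile-matching step described above reduces to solving two nonlinear equations for the gamma shape and rate. A minimal illustrative sketch in Python (not the report's original procedure): it uses SciPy root-finding, and the function name and starting values are our own assumptions.

```python
import numpy as np
from scipy import optimize, stats

def gamma_prior_from_percentiles(x_lo, p_lo, x_med, p_med):
    """Find shape a and rate b such that P(lambda < x_lo) = p_lo
    and P(lambda < x_med) = p_med."""
    def residuals(log_ab):
        a, b = np.exp(log_ab)  # log scale keeps a, b positive
        return [stats.gamma.cdf(x_lo, a, scale=1.0 / b) - p_lo,
                stats.gamma.cdf(x_med, a, scale=1.0 / b) - p_med]
    # start from an exponential (a = 1) whose median matches x_med;
    # other percentile pairs may need a different starting point
    start = [0.0, np.log(np.log(2.0) / x_med)]
    return np.exp(optimize.fsolve(residuals, start))

a, b = gamma_prior_from_percentiles(1.0e-5, 0.05, 1.0e-3, 0.50)
print(f"shape a = {a:.3f}, rate b = {b:.1f}")
```

For the percentile pair quoted above, the spread between the 5th percentile and the median forces a shape parameter below 1, i.e. a prior concentrated near zero with a heavy right tail.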
Reliability Analysis of I and C Architecture of Research Reactors Using Bayesian Networks
International Nuclear Information System (INIS)
The objective of this research project is to identify a configuration of architecture which gives the highest availability while maintaining a low manufacturing cost. To this end, two configurations of a single channel of the RPS are formulated in the current article and BN models were constructed. Bayesian network analysis was performed to determine the reliability features. This work continues a line of study toward the standardization of I and C architectures for low- and medium-power research reactors, extending earlier analysis of the reliability of a single channel of the Reactor Protection System (RPS) using Bayesian networks, with a focus on architectures for low-power research reactors. What level of reliability is sufficient for protection, safety and control systems in low-power research reactors? There should be a level that satisfies all regulatory requirements as well as operational demands at an optimized construction cost. Scholars, researchers and materials investigators from educational and research institutes are calling for the construction of more research reactors. In order to meet this demand and construct more units, further research is necessary in various areas, including the standardization of research reactor I and C architectures along the same lines as commercial power plants. Research reactors fall into two broad categories: low-power research reactors and medium- to high-power research reactors. According to IAEA TECDOC-1234, research reactors with a power rating of 0.250-2.0 MW or a flux of 2.5-10 × 10^11 n/cm^2·s are termed low-power reactors, whereas research reactors with a power rating of 2-10 MW or a flux of 0.1-10 × 10^13 n/cm^2·s are considered medium- to high-power research reactors. Other standards (IAEA NP-T-5.1) define a multipurpose research reactor ranging in power from a few hundred kW to 10 MW as a low-power research reactor
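As an illustration of the kind of comparison described above, the sketch below enumerates a toy Bayesian network for one RPS channel under two hypothetical configurations (series logic versus redundant logic units). The structure and all failure probabilities are invented placeholders, not values from the study.

```python
from itertools import product

def channel_unavailability(redundant_logic):
    # invented component failure probabilities (placeholders)
    probs = {"sensor": 1e-3, "actuator": 2e-3}
    if redundant_logic:
        probs.update(logic_a=5e-4, logic_b=5e-4)
    else:
        probs.update(logic=5e-4)
    comps = list(probs)
    total = 0.0
    for states in product([0, 1], repeat=len(comps)):  # 1 = failed
        s = dict(zip(comps, states))
        pr = 1.0
        for c in comps:
            pr *= probs[c] if s[c] else 1.0 - probs[c]
        # channel fails if sensor or actuator fails, or all logic units fail
        if redundant_logic:
            logic_failed = s["logic_a"] and s["logic_b"]
        else:
            logic_failed = s["logic"]
        if s["sensor"] or s["actuator"] or logic_failed:
            total += pr
    return total

print("series logic    :", channel_unavailability(False))
print("redundant logic :", channel_unavailability(True))
```

Exact enumeration is feasible here because the toy network is tiny; for realistic architectures, BN inference engines perform the same computation by exploiting the factorized structure.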
Gamma prior distribution selection for Bayesian analysis of failure rate and reliability
Energy Technology Data Exchange (ETDEWEB)
Waller, R.A.; Johnson, M.M.; Waterman, M.S.; Martz, H.F. Jr.
1976-07-01
It is assumed that the phenomenon under study is such that the time-to-failure may be modeled by an exponential distribution with failure rate lambda. For Bayesian analyses of the assumed model, the family of gamma distributions provides conjugate prior models for lambda. Thus, an experimenter needs to select a particular gamma model to conduct a Bayesian reliability analysis. The purpose of this report is to present a methodology that can be used to translate engineering information, experience, and judgment into a choice of a gamma prior distribution. The proposed methodology assumes that the practicing engineer can provide percentile data relating to either the failure rate or the reliability of the phenomenon being investigated. For example, the methodology will select the gamma prior distribution which conveys an engineer's belief that the failure rate lambda simultaneously satisfies the probability statements P(lambda < 1.0 × 10^-3) = 0.50 and P(lambda < 1.0 × 10^-5) = 0.05. That is, two percentiles provided by an engineer are used to determine a gamma prior model which agrees with the specified percentiles. For those engineers who prefer to specify reliability percentiles rather than the failure rate percentiles illustrated above, it is possible to use the induced negative-log gamma prior distribution which satisfies the probability statements P(R(t0) < 0.99) = 0.50 and P(R(t0) < 0.99999) = 0.95, for some operating time t0. The report also includes graphs for selected percentiles which assist an engineer in applying the procedure. 28 figures, 16 tables.
Numerical analysis of partially coherent radiation at soft x-ray beamline.
Meng, Xiangyu; Xue, Chaofan; Yu, Huaina; Wang, Yong; Wu, Yanqing; Tai, Renzhong
2015-11-16
A new model for the numerical analysis of partially coherent x-rays at synchrotron beamlines is presented. The model is based on statistical optics. A four-dimensional coherence function, the mutual optical intensity (MOI), is used to describe the wavefront of the partially coherent light. The propagation of the MOI through the optical elements of the beamline is computed numerically, yielding the coherence of the x-rays along the beamline. We applied the model to analyze the coherence in the STXM beamline at SSRF and obtained the coherence length of the beam at the endstation. To verify the theoretical results, a single-slit diffraction experiment was performed and the diffraction pattern was simulated, giving a coherence length of (31 ± 3.0) µm × (25 ± 2.1) µm (H × V), in good agreement with the theoretical result of (30.7 ± 0.6) µm × (31 ± 5.3) µm (H × V). The model is applicable to the analysis of coherence in synchrotron beamlines.
A Cohesive and Coherent Analysis of The Purloined Letter
Institute of Scientific and Technical Information of China (English)
杜娟娟; 郭鸿雁
2013-01-01
The thesis studies cohesion and coherence in The Purloined Letter by Edgar Allan Poe, based on the theories of Halliday and Hasan as well as those of the Chinese scholars Zhang Delu and Liu Rushan. Through these analytical tools, Poe's ingenious composition and vivid characters emerge in the reader's mind.
Energy Technology Data Exchange (ETDEWEB)
Itagaki, H. [Yokohama National University, Yokohama (Japan). Faculty of Engineering; Asada, H.; Ito, S. [National Aerospace Laboratory, Tokyo (Japan); Shinozuka, M.
1996-12-31
Risk-assessed structural positions in the pressurized fuselage of a transport-type aircraft designed to damage tolerance criteria are the subject of this discussion. A small amount of inspection data from these positions was used to examine a Bayesian reliability analysis that can estimate an appropriate non-periodic inspection schedule while also estimating suitable values for uncertain factors. As a result, the period of fatigue crack initiation was determined according to the procedure of detailed visual inspections. The analysis method was found capable of estimating reasonable values, and a proper inspection schedule based on them, despite using a very simple form for the fatigue crack growth expression and treating both factors as uncertain. The effectiveness of the method was thus verified. The study also discusses, from different viewpoints, the structural positions, the modeling of fatigue cracks initiating and developing at those positions, failure conditions, damage factors, and inspection capability. The reliability analysis method is thought to be effective for other structures as well, such as offshore structures. 18 refs., 8 figs., 1 tab.
Coherence, Pseudo-Coherence, and Non-Coherence.
Enkvist, Nils Erik
Analysis of the factors that make a text coherent or non-coherent suggests that total coherence requires cohesion not only on the textual surface but on the semantic level as well. Syntactic evidence of non-coherence includes lack of formal agreement blocking a potential cross-reference, anaphoric and cataphoric references that do not follow their…
Application of Bayesian and cost benefit risk analysis in water resources management
Varouchakis, E. A.; Palogos, I.; Karatzas, G. P.
2016-03-01
Decision making is a significant tool in water resources management applications. This technical note approaches a decision dilemma that has not previously been considered in the water resources management of a watershed. A common cost-benefit analysis approach, which is novel in the risk analysis of hydrologic/hydraulic applications, and a Bayesian decision analysis are applied to aid the decision on whether or not to construct a water reservoir for irrigation purposes. The alternative option examined applies a fine that scales parabolically with over-pumping violations, in contrast to common practices that usually consider short-term fines. The methodological steps are presented analytically, together with originally developed code; an application in such detail is a new contribution. The results indicate that uncertainty in the probabilities is the driving issue determining the optimal decision under each methodology, and depending on how the unknown probabilities are handled, each methodology may lead to a different optimal decision. Thus, the proposed tool can help decision makers examine and compare different scenarios using two different approaches before making a decision, considering the cost of a hydrologic/hydraulic project and the varied economic charges that water table limit violations can cause within an audit interval. In contrast to practices that assess the effect of each proposed action separately, considering only current knowledge of the examined issue, this tool aids decision making by considering prior information and the sampling distribution of future successful audits.
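The core decision step described above can be illustrated with a small sketch: update a prior on the uncertain state with audit feedback, then pick the action with the lowest posterior expected cost. All probabilities and costs below are invented placeholders, not the study's figures.

```python
# Beta prior on P(dry year), updated with a hypothetical audit record
prior_a, prior_b = 2.0, 2.0
dry_years, wet_years = 7, 3
p_dry = (prior_a + dry_years) / (prior_a + prior_b + dry_years + wet_years)

# invented cost matrix (monetary units): reservoir cost vs. fines if dry
costs = {
    "build":    {"dry": 5.0e6, "wet": 5.0e6},
    "no_build": {"dry": 9.0e6, "wet": 1.0e6},
}
expected = {act: p_dry * c["dry"] + (1.0 - p_dry) * c["wet"]
            for act, c in costs.items()}
print(expected, "-> choose", min(expected, key=expected.get))
```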
Shoemaker, Christine; Espinet, Antoine; Pang, Min
2015-04-01
Models of complex environmental systems can be computationally expensive because they must describe the dynamic interactions of many components over a sizeable time period. Diagnostics of these systems can include forward simulations of calibrated models under uncertainty and analysis of alternatives for systems management. This discussion will focus on applications of new surrogate optimization and uncertainty analysis methods to environmental models that can enhance our ability to extract information and understanding. For complex models, optimization and especially uncertainty analysis can require a large number of model simulations, which is not feasible for computationally expensive models. Surrogate response surfaces can be used in global optimization and uncertainty methods to obtain accurate answers with far fewer model evaluations, making the methods practical for computationally expensive models for which conventional methods are not feasible. In this paper we will discuss the application of the SOARS surrogate method for estimating Bayesian posterior density functions of model parameters for a TOUGH2 model of geologic carbon sequestration. We will also briefly discuss a new parallel surrogate global optimization algorithm, implemented on a supercomputer with up to 64 processors, applied to two groundwater remediation sites. The applications will illustrate the use of these methods to predict the impact of monitoring and management on subsurface contaminants.
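The following sketch illustrates the surrogate idea in miniature (it is not the SOARS code): a Gaussian-process response surface is fitted to a handful of runs of an "expensive" model, and a Metropolis sampler then explores the posterior using only the cheap surrogate. The toy simulator, prior bounds, and tuning constants are assumptions for illustration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)
def expensive_model(theta):           # stand-in for a TOUGH2-like simulator
    return np.sin(3.0 * theta) + 0.5 * theta

theta_obs, sigma = 0.8, 0.1
y_obs = expensive_model(theta_obs) + rng.normal(0, sigma)

# design points: the only places the expensive model is ever evaluated
design = np.linspace(-2, 2, 15)[:, None]
gp = GaussianProcessRegressor().fit(design, expensive_model(design.ravel()))

def log_post(theta):                  # flat prior on [-2, 2]
    if not -2.0 <= theta <= 2.0:
        return -np.inf
    pred = gp.predict(np.array([[theta]]))[0]
    return -0.5 * ((y_obs - pred) / sigma) ** 2

samples, theta = [], 0.0
lp = log_post(theta)
for _ in range(5000):                 # random-walk Metropolis on the surrogate
    prop = theta + rng.normal(0, 0.3)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
print("posterior mean:", np.mean(samples[1000:]))
```

The key economy is that the 5000 MCMC steps never call the expensive simulator; only the 15 design runs do.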
Directory of Open Access Journals (Sweden)
C. Mukherjee
2011-01-01
Full Text Available Inverse modeling applications in atmospheric chemistry are increasingly addressing the challenging statistical issues of data synthesis by adopting refined statistical analysis methods. This paper advances this line of research by addressing several central questions in inverse modeling, focusing specifically on Bayesian statistical computation. Motivated by problems of refining bottom-up estimates of source/sink fluxes of trace gases and aerosols based on increasingly high-resolution satellite retrievals of atmospheric chemical concentrations, we address head-on the need for integrating formal spatial statistical methods of residual error structure in global scale inversion models. We do this using analytically and computationally tractable spatial statistical models, known as conditional autoregressive spatial models, as components of a global inversion framework. We develop Markov chain Monte Carlo methods to explore and fit these spatial structures in an overall statistical framework that simultaneously estimates source fluxes. Additional aspects of the study extend the statistical framework to utilize priors in a more physically realistic manner, and to formally address and deal with missing data in satellite retrievals. We demonstrate the analysis in the context of inferring carbon monoxide (CO) sources constrained by satellite retrievals of column CO from the Measurement of Pollution in the Troposphere (MOPITT) instrument on the TERRA satellite, paying special attention to evaluating performance of the inverse approach using various statistical diagnostic metrics. This is developed using synthetic data generated to resemble MOPITT data to define a proof-of-concept and model assessment, and then in analysis of real MOPITT data.
International Nuclear Information System (INIS)
We have developed a Bayesian approach to the analysis of neural electromagnetic (MEG/EEG) data that can incorporate or fuse information from other imaging modalities and addresses the ill-posed inverse problem by sampling the many different solutions which could have produced the given data. From these samples one can draw probabilistic inferences about regions of activation. Our source model assumes a variable number of variable-size cortical regions of stimulus-correlated activity. An active region consists of locations on the cortical surface within a sphere centered on some location in cortex. The number and radii of active regions can vary up to defined maximum values. The goal of the analysis is to determine the posterior probability distribution for the set of parameters that govern the number, location, and extent of active regions. Markov Chain Monte Carlo is used to generate a large sample of sets of parameters distributed according to the posterior distribution. This sample is representative of the many different source distributions that could account for the given data, and allows identification of probable (i.e. consistent) features across solutions. Examples of the use of this analysis technique with both simulated and empirical MEG data are presented
Naganathan, Athi N; Perez-Jimenez, Raul; Muñoz, Victor; Sanchez-Ruiz, Jose M
2011-10-14
The realization that folding free energy barriers can be small enough to result in significant population of the species at the barrier top has given rise to several methods for estimating folding barriers from equilibrium experiments. Some of these approaches are based on fitting the experimental thermogram measured by differential scanning calorimetry (DSC) to a one-dimensional representation of the folding free-energy surface (FES). Different physical models have been used to represent the FES: (1) a Landau quartic polynomial as a function of the total enthalpy, which acts as an order parameter; (2) the projection onto a structural order parameter (i.e. number of native residues or native contacts) of the free energy of all the conformations generated by Ising-like statistical mechanical models; and (3) mean-field models that define conformational entropy and stabilization energy as functions of a continuous local order parameter. The fundamental question that emerges is how we can obtain robust, model-independent estimates of the thermodynamic folding barrier from the analysis of DSC experiments. Here we address this issue by comparing the performance of various FES models in interpreting the thermogram of a protein with a marginal folding barrier. We chose the small α-helical protein PDD, which folds-unfolds in microseconds crossing a free energy barrier previously estimated as ~1 RT. The fits of the PDD thermogram to the various models and assumptions produce FES with a consistently small free energy barrier separating the folded and unfolded ensembles. However, the fits vary in quality as well as in the estimated barrier. Applying Bayesian probabilistic analysis we rank the fit performance using a statistically rigorous criterion that leads to a global estimate of the folding barrier and its precision, which for PDD is 1.3 ± 0.4 kJ mol^-1. This result confirms that PDD folds over a minor barrier consistent with the downhill folding regime. We have further…
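As a worked illustration of model (1) above, the sketch below defines a Landau quartic free-energy surface over the enthalpy order parameter and reads the barrier height off numerically. The polynomial coefficients are invented; they are not fitted to the PDD thermogram.

```python
import numpy as np

H = np.linspace(-1.5, 1.5, 4001)            # enthalpy order parameter
F = 0.4 * H**4 - 0.8 * H**2                 # double-well quartic, in RT units

# locate the two minima (folded/unfolded) and the barrier top between them
i_left = np.argmin(F[H < 0])
i_right = np.argmin(F[H > 0]) + np.sum(H <= 0)
between = slice(i_left, i_right + 1)
i_top = between.start + np.argmax(F[between])
barrier = F[i_top] - max(F[i_left], F[i_right])
print(f"barrier height ~= {barrier:.3f} RT at H = {H[i_top]:.2f}")
```

For these coefficients the minima sit at H = ±1 with F = -0.4 RT and the barrier top at H = 0, giving a 0.4 RT barrier, i.e. a marginal barrier of the kind discussed in the abstract.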
A BAYESIAN HIERARCHICAL SPATIAL POINT PROCESS MODEL FOR MULTI-TYPE NEUROIMAGING META-ANALYSIS.
Kang, Jian; Nichols, Thomas E; Wager, Tor D; Johnson, Timothy D
2014-09-01
Neuroimaging meta-analysis is an important tool for finding consistent effects over studies that each usually have 20 or fewer subjects. Interest in meta-analysis in brain mapping is also driven by a recent focus on so-called "reverse inference": whereas traditional "forward inference" identifies the regions of the brain involved in a task, a reverse inference identifies the cognitive processes that a task engages. Such reverse inferences, however, require a set of meta-analyses, one for each possible cognitive domain. Existing methods for neuroimaging meta-analysis have significant limitations: commonly used methods are not model based, do not provide interpretable parameter estimates, and only produce null hypothesis inferences; further, they are generally designed for a single group of studies and cannot produce reverse inferences. In this work we address these limitations by adopting a non-parametric Bayesian approach for meta-analysis of data from multiple classes or types of studies. In particular, foci from each type of study are modeled as a cluster process driven by a random intensity function that is modeled as a kernel convolution of a gamma random field. The type-specific gamma random fields are linked and modeled as a realization of a common gamma random field, shared by all types, that induces correlation between study types and mimics the behavior of a univariate mixed effects model. We illustrate our model on simulation studies and a meta-analysis of five emotions from 219 studies, and check model fit by a posterior predictive assessment. In addition, we implement reverse inference by using the model to predict study type from a newly presented study. We evaluate this predictive performance via leave-one-out cross-validation, which is efficiently implemented using importance sampling techniques.
Bayesian Analysis Made Simple An Excel GUI for WinBUGS
Woodward, Philip
2011-01-01
From simple NLMs to complex GLMMs, this book describes how to use the GUI for WinBUGS - BugsXLA - an Excel add-in written by the author that allows a range of Bayesian models to be easily specified. With case studies throughout, the text shows how to routinely apply even the more complex aspects of model specification, such as GLMMs, outlier robust models, random effects Emax models, auto-regressive errors, and Bayesian variable selection. It provides brief, up-to-date discussions of current issues in the practical application of Bayesian methods. The author also explains how to obtain free so
Monte-Carlo and Bayesian techniques in gravitational wave burst data analysis
Searle, Antony C
2008-01-01
Monte-Carlo simulations are used in the gravitational wave burst detection community to demonstrate and compare the properties of different search techniques. We note that every Monte-Carlo simulation has a corresponding optimal search technique according to both the non-Bayesian Neyman-Pearson criterion and the Bayesian approach, and that this optimal search technique is the Bayesian statistic. When practical, we recommend deriving the optimal statistic for a credible Monte-Carlo simulation, rather than testing ad hoc statistics against that simulation.
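For a concrete sense of the optimal statistic referred to above, consider the simplest case of a known burst waveform in white Gaussian noise, where the log Bayes factor reduces to a matched filter minus a fixed template-energy term. A toy sketch (the waveform and noise level are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 256
t = np.arange(n)
s = np.exp(-0.5 * ((t - 128) / 10.0) ** 2) * np.sin(0.5 * t)  # toy burst
sigma = 1.0
d = s + rng.normal(0, sigma, n)                                # signal present

# log Bayes factor for H1 (d = s + noise) vs H0 (noise only):
# log L1/L0 = (d.s)/sigma^2 - ||s||^2 / (2 sigma^2)
log_bf = (d @ s) / sigma**2 - 0.5 * (s @ s) / sigma**2
print("log Bayes factor:", log_bf)
```

Real burst searches must additionally marginalize over unknown waveform parameters, which is where the Monte-Carlo simulation's population model enters the optimal statistic.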
Batterbee, D C; Sims, N D; Becker, W; Worden, K; Rowson, J
2011-11-01
Non-accidental head injury in infants, or shaken baby syndrome, is a highly controversial and disputed topic. Biomechanical studies often suggest that shaking alone cannot cause the classical symptoms, yet many medical experts believe the contrary. Researchers have turned to finite element modelling for a more detailed understanding of the interactions between the brain, skull, cerebrospinal fluid (CSF), and surrounding tissues. However, the uncertainties in such models are significant; these can arise from theoretical approximations, lack of information, and inherent variability. Consequently, this study presents an uncertainty analysis of a finite element model of a human head subject to shaking. Although the model geometry was greatly simplified, fluid-structure-interaction techniques were used to model the brain, skull, and CSF using a Eulerian mesh formulation with penalty-based coupling. Uncertainty and sensitivity measurements were obtained using Bayesian sensitivity analysis, which is a technique that is relatively new to the engineering community. Uncertainty in nine different model parameters was investigated for two different shaking excitations: sinusoidal translation only, and sinusoidal translation plus rotation about the base of the head. The level and type of sensitivity in the results were found to be highly dependent on the excitation type.
Bayesian Analysis for Dynamic Generalized Linear Latent Model with Application to Tree Survival Rate
Directory of Open Access Journals (Sweden)
Yu-sheng Cheng
2014-01-01
Full Text Available The logistic regression model is the most popular regression technique for modeling categorical data, especially dichotomous variables. The classic logistic regression model is typically used to interpret the relationship between response variables and explanatory variables. However, in real applications, many data sets are collected through follow-up, which leads to temporal correlation among the data. In order to characterize the correlations among the different variables, a new method based on latent variables is introduced in this study. At the same time, latent variables following an AR(1) model are used to capture the time dependence. In the framework of Bayesian analysis, parameter estimation and statistical inference are carried out via a Gibbs sampler with a Metropolis-Hastings (MH) algorithm. Model comparison based on the Bayes factor, and forecasting/smoothing of the tree survival rate, are established. A simulation study is conducted to assess the performance of the proposed method, and a pika data set is analyzed to illustrate a real application. Since Bayes factor approaches vary significantly, efficiency tests were performed to decide which solution provides the better tool for the analysis of real relational data sets.
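A stripped-down sketch of Metropolis-within-Gibbs for this kind of model is given below: Bernoulli observations driven by an AR(1) latent state, updated one site at a time. It fixes the AR parameters as known and omits the paper's full parameter estimation and Bayes-factor machinery; all constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
T, phi, tau = 100, 0.9, 0.5              # AR(1) parameters, fixed here
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = phi * x_true[t - 1] + rng.normal(0, tau)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-x_true)))

def log_full_cond(x, t, val):
    """Log full conditional of x_t = val given its neighbours and y_t."""
    lp = -0.5 * ((val - phi * (x[t - 1] if t > 0 else 0.0)) / tau) ** 2
    if t < T - 1:
        lp -= 0.5 * ((x[t + 1] - phi * val) / tau) ** 2
    return lp + y[t] * val - np.log1p(np.exp(val))  # Bernoulli-logit term

x, post_sum, kept = np.zeros(T), np.zeros(T), 0
for sweep in range(3000):                # single-site random-walk MH sweeps
    for t in range(T):
        prop = x[t] + rng.normal(0, 0.8)
        if np.log(rng.uniform()) < (log_full_cond(x, t, prop)
                                    - log_full_cond(x, t, x[t])):
            x[t] = prop
    if sweep >= 1000:                    # discard burn-in
        post_sum += x
        kept += 1
print("corr(posterior mean, true path):",
      np.corrcoef(post_sum / kept, x_true)[0, 1])
```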
Sandric, I.; Petropoulos, Y.; Chitu, Z.; Mihai, B.
2012-04-01
The landslide hazard analysis model takes into consideration both predisposing and triggering factors, combined into a Bayesian temporal network with uncertainty propagation. The model uses as predisposing factors the first and second derivatives of the DEM, effective precipitation, runoff, lithology and land use. The latter is expressed not as land use classes, as in CORINE for example, but as leaf area index (LAI). The LAI offers the advantage of modelling not just changes between different time periods expressed in years, but also seasonal changes in land use throughout a year. The LAI index was derived from Landsat time series images, from 1984 up to 2011. All the images available for the Panatau administrative unit in Buzau County, Romania, have been downloaded from http://earthexplorer.usgs.gov, including images with cloud cover. The model is run at a monthly time step, and for each time step all the parameter values and the a priori, conditional and posterior probabilities are obtained and stored in a log file. The validation process uses landslides that occurred during the period up to the active time step and checks the recorded probabilities and parameter values of those time steps against the values of the active time step. Each time a landslide is positively identified, new a priori probabilities are recorded for each parameter. A complete log for the entire model is saved and used for statistical analysis, and a NETCDF file is created
Bayesian analysis of the dynamic cosmic web in the SDSS galaxy survey
Leclercq, Florent; Wandelt, Benjamin
2015-01-01
Recent application of the Bayesian algorithm BORG to the Sloan Digital Sky Survey (SDSS) main sample galaxies resulted in the physical inference of the formation history of the observed large-scale structure from its origin to the present epoch. In this work, we use these inferences as inputs for a detailed probabilistic cosmic web-type analysis. To do so, we generate a large set of data-constrained realizations of the large-scale structure using a fast, fully non-linear gravitational model. We then perform a dynamic classification of the cosmic web into four distinct components (voids, sheets, filaments and clusters) on the basis of the tidal field. Our inference framework automatically and self-consistently propagates typical observational uncertainties to web-type classification. As a result, this study produces highly detailed and accurate cosmographic classification of large-scale structure elements in the SDSS volume. By also providing the history of these structure maps, the approach allows an analysis...
Bayesian analysis of anisotropic cosmologies: Bianchi VII_h and WMAP
McEwen, J D; Feeney, S M; Peiris, H V; Lasenby, A N
2013-01-01
We perform a definitive analysis of Bianchi VII_h cosmologies with WMAP observations of the cosmic microwave background (CMB) temperature anisotropies. Bayesian analysis techniques are developed to study anisotropic cosmologies using full-sky and partial-sky, masked CMB temperature data. We apply these techniques to analyse the full-sky internal linear combination (ILC) map and a partial-sky, masked W-band map of WMAP 9-year observations. In addition to the physically motivated Bianchi VII_h model, we examine phenomenological models considered in previous studies, in which the Bianchi VII_h parameters are decoupled from the standard cosmological parameters. In the two phenomenological models considered, Bayes factors of 1.7 and 1.1 units of log-evidence favouring a Bianchi component are found in full-sky ILC data. The corresponding best-fit Bianchi maps recovered are similar for both phenomenological models and are very close to those found in previous studies using earlier WMAP data releases. However, no evi...
BayesPeak: Bayesian analysis of ChIP-seq data
Directory of Open Access Journals (Sweden)
Stark Rory
2009-09-01
Full Text Available Abstract Background High-throughput sequencing technology has become popular and widely used to study protein and DNA interactions. Chromatin immunoprecipitation, followed by sequencing of the resulting samples, produces large amounts of data that can be used to map genomic features such as transcription factor binding sites and histone modifications. Methods Our proposed statistical algorithm, BayesPeak, uses a fully Bayesian hidden Markov model to detect enriched locations in the genome. The structure accommodates the natural features of the Solexa/Illumina sequencing data and allows for overdispersion in the abundance of reads in different regions. Moreover, a control sample can be incorporated in the analysis to account for experimental and sequence biases. Markov chain Monte Carlo algorithms are applied to estimate the posterior distributions of the model parameters, and posterior probabilities are used to detect the sites of interest. Conclusion We have presented a flexible approach for identifying peaks from ChIP-seq reads, suitable for use on both transcription factor binding and histone modification data. Our method estimates probabilities of enrichment that can be used in downstream analysis. The method is assessed using experimentally verified data and is shown to provide high-confidence calls with low false positive rates.
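The hidden Markov machinery underlying this kind of peak caller can be illustrated in a few lines. The sketch below uses a two-state HMM with Poisson emissions and fixed parameters — a simplification of BayesPeak, which uses an overdispersed emission model and samples its parameters — and computes per-bin posterior enrichment probabilities by forward-backward.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(3)
lam = np.array([2.0, 10.0])                  # background vs enriched rates
A = np.array([[0.99, 0.01], [0.10, 0.90]])   # state transition matrix
n_bins = 300
states = np.zeros(n_bins, dtype=int)
for t in range(1, n_bins):
    states[t] = rng.choice(2, p=A[states[t - 1]])
reads = rng.poisson(lam[states])             # simulated bin counts

emis = poisson.pmf(reads[:, None], lam)      # emission probs, shape (T, 2)
alpha = np.zeros((n_bins, 2))
beta = np.ones((n_bins, 2))
alpha[0] = 0.5 * emis[0]
alpha[0] /= alpha[0].sum()
for t in range(1, n_bins):                   # forward pass (normalized)
    alpha[t] = emis[t] * (alpha[t - 1] @ A)
    alpha[t] /= alpha[t].sum()
for t in range(n_bins - 2, -1, -1):          # backward pass (normalized)
    beta[t] = A @ (emis[t + 1] * beta[t + 1])
    beta[t] /= beta[t].sum()
post = alpha * beta
post /= post.sum(axis=1, keepdims=True)      # P(enriched | all reads)
print("bins called enriched:", int(np.sum(post[:, 1] > 0.5)),
      "of", int(np.sum(states == 1)), "truly enriched")
```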
Updating reliability data using feedback analysis: feasibility of a Bayesian subjective method
International Nuclear Information System (INIS)
For years, EDF has used Probabilistic Safety Assessment to evaluate a global indicator of the safety of its nuclear power plants and to optimize performance while ensuring a certain safety level. The robustness and relevance of PSA are therefore very important, which is why EDF wants to improve the relevance of the reliability parameters used in these models. This article proposes a Bayesian approach to build PSA parameters when feedback data are not large enough to use the frequentist method. Our method is called subjective because its purpose is to give engineers pragmatic criteria for applying Bayesian methods in a controlled and consistent way. Bayesian methods are quite common in the United States, for example, because the nuclear power plants there are less standardized; they are often used with generic data as the prior. We therefore have to adapt the general methodology to the EDF context. (authors)
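In the simplest version of this setting — an exponential time-to-failure, so plant feedback enters as a Poisson count over exposure time — the Bayesian update is conjugate and needs no sampling. A minimal sketch with invented numbers:

```python
prior_shape, prior_rate = 0.5, 1.0e4   # gamma prior, e.g. from generic data
n_failures, hours = 1, 2.0e5           # plant-specific operating feedback

post_shape = prior_shape + n_failures  # gamma-Poisson conjugate update
post_rate = prior_rate + hours
print("posterior mean failure rate:",
      post_shape / post_rate, "per hour")
```

The "pragmatic criteria" the article calls for then amount to controlled choices of the prior shape and rate, e.g. how strongly generic industry data should weigh against scarce plant-specific counts.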
Wagner-Kaiser, R; Sarajedini, A; von Hippel, T; van Dyk, D A; Robinson, E; Stein, N; Jefferys, W H
2016-01-01
We use Cycle 21 Hubble Space Telescope (HST) observations and HST archival ACS Treasury observations of 30 Galactic globular clusters to characterize two distinct stellar populations. A sophisticated Bayesian technique is employed to simultaneously sample the joint posterior distribution of age, distance, and extinction for each cluster, as well as unique helium values for the two populations within each cluster and the relative proportion of those populations. We find that the helium differences between the two populations in the clusters fall in the range of ~0.04 to 0.11. Because adequate models varying in CNO are not presently available, we view these spreads as upper limits and present them with statistical rather than observational uncertainties. The evidence supports previous studies suggesting an increase in helium content concurrent with increasing cluster mass, and we also find that the proportion of the first population of stars increases with mass as well. Our results are examined in the context of proposed g...
Directory of Open Access Journals (Sweden)
Sven Hassler
2014-12-01
Full Text Available In order to increase understanding of what determines health among immigrants, ethnic minorities and indigenous people, concepts such as acculturation, identity and sense of coherence (SOC) have become central to the analysis. The process of acculturation and the associated concepts of integration, assimilation, marginalization and separation have often been referred to when describing the health of immigrants and indigenous people; of these, integration has been considered to provide the best conditions for good health. The aim of this study is to explore the mutual relations between the concepts of acculturation, SOC and identity through abductive reasoning based on an investigation of a group of Sami regarding their cultural and ethnic self-identification. Through this explorative approach the study also seeks to touch upon some of the relevant neighboring concepts, such as cultural memory, and to position them among the more established social determinants of health. The study demonstrates that coherence as a psychosocial characteristic appears in different concepts and models in the areas of acculturation and cognitive development as well as in cultural memory. It has an intra-individual dimension, expressed in the theories of cognitive development and cultural memory, and an inter-individual, social dimension, noticeable in SOC and the process of acculturation. The mutual correspondence of these structures of thought, values and perspectives has yet to be clarified and understood, especially in relation to health.
Kim, J.; Kwon, H. H.
2014-12-01
Existing regional frequency analysis has the disadvantage that it is difficult to account for geographical characteristics when estimating areal rainfall. In this regard, this study aims to develop a hierarchical Bayesian model-based regional frequency analysis in which spatial patterns of the design rainfall and geographical information are explicitly incorporated. This study assumes that the parameters of the Gumbel distribution are a function of geographical characteristics (e.g. altitude, latitude and longitude) within a general linear regression framework. Posterior distributions of the regression parameters are estimated by a Bayesian Markov Chain Monte Carlo (MCMC) method, and the identified functional relationship is used to spatially interpolate the parameters of the Gumbel distribution using digital elevation models (DEM) as inputs. The proposed model is applied to derive design rainfalls over the entire Han River watershed. It was found that the proposed Bayesian regional frequency analysis model gave similar results to L-moment-based regional frequency analysis. In addition, the model showed an advantage in terms of quantifying the uncertainty of the design rainfall and estimating areal rainfall considering geographical information. Acknowledgement: This research was supported by a grant (14AWMP-B079364-01) from the Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
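The final interpolation step can be illustrated compactly: predict the Gumbel parameters at an ungauged DEM cell from a linear function of geographical covariates, then convert them to a T-year design rainfall via the Gumbel quantile function. The regression coefficients and covariates below are invented placeholders, not the fitted Han River values.

```python
import numpy as np

# hypothetical posterior-mean regression coefficients:
# intercept, altitude (m), centered latitude (deg)
beta_mu = np.array([40.0, 0.02, 1.5])     # Gumbel location regression
beta_sig = np.array([12.0, 0.005, 0.3])   # Gumbel scale regression

def design_rainfall(alt_m, lat_deg, T=100):
    x = np.array([1.0, alt_m, lat_deg - 35.0])
    mu, sigma = beta_mu @ x, beta_sig @ x
    # Gumbel quantile for return period T: x_T = mu - sigma*ln(-ln(1 - 1/T))
    return mu - sigma * np.log(-np.log(1.0 - 1.0 / T))

print(design_rainfall(alt_m=250.0, lat_deg=37.5, T=100), "mm (illustrative)")
```

In the full Bayesian version this function would be evaluated over posterior draws of the coefficients rather than point estimates, which is exactly what gives the uncertainty quantification advantage noted above.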
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics an...
No control genes required: Bayesian analysis of qRT-PCR data.
Directory of Open Access Journals (Sweden)
Mikhail V Matz
Full Text Available BACKGROUND: Model-based analysis of data from quantitative reverse-transcription PCR (qRT-PCR is potentially more powerful and versatile than traditional methods. Yet existing model-based approaches cannot properly deal with the higher sampling variances associated with low-abundant targets, nor do they provide a natural way to incorporate assumptions about the stability of control genes directly into the model-fitting process. RESULTS: In our method, raw qPCR data are represented as molecule counts, and described using generalized linear mixed models under Poisson-lognormal error. A Markov Chain Monte Carlo (MCMC algorithm is used to sample from the joint posterior distribution over all model parameters, thereby estimating the effects of all experimental factors on the expression of every gene. The Poisson-based model allows for the correct specification of the mean-variance relationship of the PCR amplification process, and can also glean information from instances of no amplification (zero counts. Our method is very flexible with respect to control genes: any prior knowledge about the expected degree of their stability can be directly incorporated into the model. Yet the method provides sensible answers without such assumptions, or even in the complete absence of control genes. We also present a natural Bayesian analogue of the "classic" analysis, which uses standard data pre-processing steps (logarithmic transformation and multi-gene normalization but estimates all gene expression changes jointly within a single model. The new methods are considerably more flexible and powerful than the standard delta-delta Ct analysis based on pairwise t-tests. CONCLUSIONS: Our methodology expands the applicability of the relative-quantification analysis protocol all the way to the lowest-abundance targets, and provides a novel opportunity to analyze qRT-PCR data without making any assumptions concerning target stability. These procedures have been
Strauss, Jillian; Miranda-Moreno, Luis F; Morency, Patrick
2013-10-01
This study proposes a two-equation Bayesian modelling approach to simultaneously study cyclist injury occurrence and bicycle activity at signalized intersections as joint outcomes. This approach deals with the potential presence of endogeneity and unobserved heterogeneities and is used to identify factors associated with both cyclist injuries and volumes. Its application to identify high-risk corridors is also illustrated. Montreal, Quebec, Canada is the application environment, using an extensive inventory of a large sample of signalized intersections containing disaggregate motor-vehicle traffic volumes and bicycle flows, geometric design, traffic control and built environment characteristics in the vicinity of the intersections. Cyclist injury data for the period of 2003-2008 is used in this study. Also, manual bicycle counts were standardized using temporal and weather adjustment factors to obtain average annual daily volumes. Results confirm and quantify the effects of both bicycle and motor-vehicle flows on cyclist injury occurrence. Accordingly, more cyclists at an intersection translate into more cyclist injuries but lower injury rates due to the non-linear association between bicycle volume and injury occurrence. Furthermore, the results emphasize the importance of turning motor-vehicle movements. The presence of bus stops and total crosswalk length increase cyclist injury occurrence whereas the presence of a raised median has the opposite effect. Bicycle activity through intersections was found to increase as employment, number of metro stations, land use mix, area of commercial land use type, length of bicycle facilities and the presence of schools within 50-800 m of the intersection increase. Intersections with three approaches are expected to have fewer cyclists than those with four. Using Bayesian analysis, expected injury frequency and injury rates were estimated for each intersection and used to rank corridors. Corridors with high bicycle volumes
Iryna Lobach; Ruzong Fan
2012-01-01
A key component of understanding the etiology of complex diseases, such as cancer, diabetes, and alcohol dependence, is the investigation of gene-environment interactions. This work is motivated by the following two concerns in the analysis of gene-environment interactions. First, multiple genetic markers in moderate linkage disequilibrium may be involved in susceptibility to a complex disease. Second, environmental factors may be subject to misclassification. We develop a genotype-based Bayesian pseudolik...
Rubio, Francisco J; Genton, Marc G
2016-06-30
We study Bayesian linear regression models with skew-symmetric scale mixtures of normal error distributions. These kinds of models can be used to capture departures from the usual assumption of normality of the errors in terms of heavy tails and asymmetry. We propose a general noninformative prior structure for these regression models and show that the corresponding posterior distribution is proper under mild conditions. We extend these propriety results to cases where the response variables are censored. The latter scenario is of interest in the context of accelerated failure time models, which are relevant in survival analysis. We present a simulation study that demonstrates good frequentist properties of the posterior credible intervals associated with the proposed priors. This study also sheds some light on the trade-off between increased model flexibility and the risk of over-fitting. We illustrate the performance of the proposed models with real data. Although we focus on models with univariate response variables, we also present some extensions to the multivariate case in the Supporting Information. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26856806
Ryu, Duchwan
2010-09-28
We consider nonparametric regression analysis in a generalized linear model (GLM) framework for data with covariates that are the subject-specific random effects of longitudinal measurements. The usual assumption that the effects of the longitudinal covariate processes are linear in the GLM may be unrealistic and if this happens it can cast doubt on the inference of observed covariate effects. Allowing the regression functions to be unknown, we propose to apply Bayesian nonparametric methods including cubic smoothing splines or P-splines for the possible nonlinearity and use an additive model in this complex setting. To improve computational efficiency, we propose the use of data-augmentation schemes. The approach allows flexible covariance structures for the random effects and within-subject measurement errors of the longitudinal processes. The posterior model space is explored through a Markov chain Monte Carlo (MCMC) sampler. The proposed methods are illustrated and compared to other approaches, the "naive" approach and the regression calibration, via simulations and by an application that investigates the relationship between obesity in adulthood and childhood growth curves. © 2010, The International Biometric Society.
Rubio, Francisco J.
2016-02-09
We study Bayesian linear regression models with skew-symmetric scale mixtures of normal error distributions. These kinds of models can be used to capture departures from the usual assumption of normality of the errors in terms of heavy tails and asymmetry. We propose a general noninformative prior structure for these regression models and show that the corresponding posterior distribution is proper under mild conditions. We extend these propriety results to cases where the response variables are censored. The latter scenario is of interest in the context of accelerated failure time models, which are relevant in survival analysis. We present a simulation study that demonstrates good frequentist properties of the posterior credible intervals associated with the proposed priors. This study also sheds some light on the trade-off between increased model flexibility and the risk of over-fitting. We illustrate the performance of the proposed models with real data. Although we focus on models with univariate response variables, we also present some extensions to the multivariate case in the Supporting Information.
Bayesian time series analysis of segments of the Rocky Mountain trumpeter swan population
Wright, Christopher K.; Sojda, Richard S.; Goodman, Daniel
2002-01-01
A Bayesian time series analysis technique, the dynamic linear model, was used to analyze counts of Trumpeter Swans (Cygnus buccinator) summering in Idaho, Montana, and Wyoming from 1931 to 2000. For the Yellowstone National Park segment of white birds (sub-adults and adults combined) the estimated probability of a positive growth rate is 0.01. The estimated probability of achieving the Subcommittee on Rocky Mountain Trumpeter Swans 2002 population goal of 40 white birds for the Yellowstone segment is less than 0.01. Outside of Yellowstone National Park, Wyoming white birds are estimated to have a 0.79 probability of a positive growth rate with a 0.05 probability of achieving the 2002 objective of 120 white birds. In the Centennial Valley in southwest Montana, results indicate a probability of 0.87 that the white bird population is growing at a positive rate with considerable uncertainty. The estimated probability of achieving the 2002 Centennial Valley objective of 160 white birds is 0.14 but under an alternative model falls to 0.04. The estimated probability that the Targhee National Forest segment of white birds has a positive growth rate is 0.03. In Idaho outside of the Targhee National Forest, white birds are estimated to have a 0.97 probability of a positive growth rate with a 0.18 probability of attaining the 2002 goal of 150 white birds.
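A minimal sketch of the kind of dynamic linear model used here: a local-level-plus-drift state space filtered with the Kalman recursions, after which the filtered posterior on the drift gives a probability of positive growth. The counts, variances, and initial conditions below are simulated placeholders, not the swan data.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
years, drift = 50, 0.8
counts = 100 + drift * np.arange(years) + rng.normal(0, 5, years)

# state = (level, drift); random-walk drift, observation of level only
F = np.array([[1.0, 1.0], [0.0, 1.0]])       # state transition
H = np.array([1.0, 0.0])                     # observation vector
Q = np.diag([4.0, 0.01])                     # assumed state noise
R = 25.0                                     # assumed observation variance
m = np.array([counts[0], 0.0])
P = np.diag([100.0, 4.0])                    # diffuse-ish initial covariance
for y in counts[1:]:
    m, P = F @ m, F @ P @ F.T + Q            # predict
    S = H @ P @ H + R
    K = P @ H / S                            # Kalman gain
    m = m + K * (y - H @ m)                  # update
    P = P - np.outer(K, H @ P)
p_positive = 1.0 - norm.cdf(0.0, loc=m[1], scale=np.sqrt(P[1, 1]))
print(f"P(growth rate > 0) ~= {p_positive:.3f}")
```

Probabilities of meeting a numeric population objective, as reported in the abstract, follow the same way by evaluating the filtered (or forecast) normal distribution at the target count instead of zero.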
Bayesian analysis of diagnostic test accuracy when disease state is unverified for some subjects.
Pennello, Gene A
2011-09-01
Studies of the accuracy of medical tests to diagnose the presence or absence of disease can suffer from an inability to verify the true disease state in everyone. When verification is missing at random (MAR), the missing data mechanism can be ignored in likelihood-based inference. However, this assumption may not hold even approximately. When verification is nonignorably missing, the most general model of the distribution of disease state, test result, and verification indicator is overparameterized. Parameters are only partially identified, creating regions of ignorance for maximum likelihood estimators. For studies of a single test, we use Bayesian analysis to implement the most general nonignorable model, a reduced nonignorable model with identifiable parameters, and the MAR model. Simple Gibbs sampling algorithms are derived that enable computation of the posterior distribution of test accuracy parameters. In particular, the posterior distribution is easily obtained for the most general nonignorable model, which makes relatively weak assumptions about the missing data mechanism. For this model, the posterior distribution combines two sources of uncertainty: ignorance in the estimation of partially identified parameters, and imprecision due to finite sampling variability. We compare the three models on data from a study of the accuracy of scintigraphy to diagnose liver disease.
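For the MAR case described above, sensitivity and specificity can be recovered from the verified subset by a Begg-Greenes-type correction, with simple Beta conjugacy giving the posterior by Monte Carlo. A minimal sketch with invented counts (not the liver-disease data, and not the paper's more general nonignorable Gibbs sampler):

```python
import numpy as np

rng = np.random.default_rng(5)
n_pos, n_neg = 300, 700                   # test results for all subjects
ver_pos = dict(diseased=45, healthy=55)   # verified among test-positives
ver_neg = dict(diseased=10, healthy=190)  # verified among test-negatives

draws = 20000                             # posterior draws via Beta(1,1) priors
p_t = rng.beta(1 + n_pos, 1 + n_neg, draws)                    # P(T+)
ppv = rng.beta(1 + ver_pos["diseased"], 1 + ver_pos["healthy"], draws)
p_d_neg = rng.beta(1 + ver_neg["diseased"], 1 + ver_neg["healthy"], draws)

# under MAR, P(D|T) is estimable from verified subjects; Bayes' theorem
# then gives sensitivity P(T+|D) and specificity P(T-|not D)
sens = ppv * p_t / (ppv * p_t + p_d_neg * (1 - p_t))
spec = ((1 - p_d_neg) * (1 - p_t)
        / ((1 - p_d_neg) * (1 - p_t) + (1 - ppv) * p_t))
print("sensitivity 95% CrI:", np.round(np.percentile(sens, [2.5, 97.5]), 3))
print("specificity 95% CrI:", np.round(np.percentile(spec, [2.5, 97.5]), 3))
```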
Composite behavior analysis for video surveillance using hierarchical dynamic Bayesian networks
Cheng, Huanhuan; Shan, Yong; Wang, Runsheng
2011-03-01
Analyzing composite behaviors involving objects from multiple categories in surveillance videos is a challenging task due to the complicated relationships among humans and objects. This paper presents a novel behavior analysis framework using a hierarchical dynamic Bayesian network (DBN) for video surveillance systems. The model is built for extracting objects' behaviors and their relationships by representing behaviors using spatial-temporal characteristics. The recognition of object behaviors is processed by the DBN at multiple levels: features of objects at the low level, objects and their relationships at the middle level, and events at the high level, where an event refers to behaviors of a single type of object as well as behaviors involving several types of objects, such as "a person getting in a car." Furthermore, to reduce complexity, a simple model selection criterion is introduced, by which the appropriate model is selected from a pool of candidate models. Experiments demonstrate that the proposed framework can efficiently recognize and semantically describe composite object and human activities in surveillance videos.
Bayesian analysis of the kinetics of quantal transmitter secretion at the neuromuscular junction.
Saveliev, Anatoly; Khuzakhmetova, Venera; Samigullin, Dmitry; Skorinkin, Andrey; Kovyazina, Irina; Nikolsky, Eugeny; Bukharaeva, Ellya
2015-10-01
The timing of transmitter release from nerve endings is considered nowadays as one of the factors determining the plasticity and efficacy of synaptic transmission. In the neuromuscular junction, the moments of release of individual acetylcholine quanta are related to the synaptic delays of uniquantal endplate currents recorded under conditions of lowered extracellular calcium. Using Bayesian modelling, we performed a statistical analysis of synaptic delays in mouse neuromuscular junction with different patterns of rhythmic nerve stimulation and when the entry of calcium ions into the nerve terminal was modified. We have obtained a statistical model of the release timing which is represented as the summation of two independent statistical distributions. The first of these is the exponentially modified Gaussian distribution. The mixture of normal and exponential components in this distribution can be interpreted as a two-stage mechanism of early and late periods of phasic synchronous secretion. The parameters of this distribution depend on both the stimulation frequency of the motor nerve and the calcium ions' entry conditions. The second distribution was modelled as quasi-uniform, with parameters independent of nerve stimulation frequency and calcium entry. Two different probability density functions for the distribution of synaptic delays suggest at least two independent processes controlling the time course of secretion, one of them potentially involving two stages. The relative contribution of these processes to the total number of mediator quanta released depends differently on the motor nerve stimulation pattern and on calcium ion entry into nerve endings. PMID:26129670
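The exponentially modified Gaussian component described above can be fitted directly in SciPy, whose exponnorm distribution is a reparameterized ex-Gaussian (tau = K * scale). The sketch below uses maximum likelihood on simulated delays rather than the authors' full Bayesian mixture model.

```python
import numpy as np
from scipy.stats import exponnorm

rng = np.random.default_rng(6)
# simulated delays (ms): Gaussian early stage plus exponential late tail
delays = rng.normal(0.5, 0.1, 2000) + rng.exponential(0.3, 2000)

K, loc, scale = exponnorm.fit(delays)
# recover ex-Gaussian parameters: mu = loc, sigma = scale, tau = K*scale
print(f"mu={loc:.3f} ms, sigma={scale:.3f} ms, tau={K * scale:.3f} ms")
```

The two-component interpretation in the abstract would additionally mix this density with a quasi-uniform component and infer the mixing weight, which is where the Bayesian treatment becomes essential.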
International Nuclear Information System (INIS)
Bayesian network (BN) is a powerful tool for human reliability analysis (HRA) as it can characterize the dependency among different human performance shaping factors (PSFs) and associated actions. It can also quantify the importance of different PSFs that may cause a human error. Data required to fully quantify a BN for HRA in offshore emergency situations are not readily available; for many situations, there is little or no appropriate data. This presents significant challenges in assigning the prior and conditional probabilities required by the BN approach. To handle the data scarcity problem, this paper presents a data collection methodology that uses a virtual environment for a simplified BN model of offshore emergency evacuation. A two-level, three-factor experiment is used to collect human performance data under different mustering conditions. The collected data are integrated in the BN model and the results are compared with a previous study. The work demonstrates that the BN model can assess the human failure likelihood effectively. In addition, the BN model provides the opportunity to incorporate new evidence and handle complex interactions among PSFs and associated actions
Bayesian analysis of sparse anisotropic universe models and application to the 5-yr WMAP data
Groeneboom, Nicolaas E
2008-01-01
We extend the previously described CMB Gibbs sampling framework to allow for exact Bayesian analysis of anisotropic universe models, and apply this method to the 5-year WMAP temperature observations. This involves adding support for non-diagonal signal covariance matrices, and implementing a general spectral parameter MCMC sampler. As a worked example we apply these techniques to the model recently introduced by Ackerman et al., describing for instance violations of rotational invariance during the inflationary epoch. After verifying the code with simulated data, we analyze the foreground-reduced 5-year WMAP temperature sky maps. For l < 400 and the W-band data, we find tentative evidence for a preferred direction pointing towards (l,b) = (110°, 10°) with an anisotropy amplitude of g* = 0.15 ± 0.039, nominally equivalent to a 3.8 sigma detection. Similar results are obtained from the V-band data [g* = 0.11 ± 0.039; (l,b) = (130°, 20°)]. Further, the preferred direction is stable with respect ...
Bayesian analysis of heavy-tailed and long-range dependent Processes
Graves, Timothy; Watkins, Nick; Gramacy, Robert; Franzke, Christian
2014-05-01
We have used MCMC algorithms to perform a Bayesian analysis of Auto-Regressive Fractionally-Integrated Moving-Average ARFIMA(p,d,q) processes, which are capable of modelling long range dependence (e.g. Beran et al, 2013). Our principal aim is to obtain inference about the long memory parameter, d, with secondary interest in the scale and location parameters. We have developed a reversible-jump method enabling us to integrate over different model forms for the short memory component. We initially assume Gaussianity, and have tested the method on both synthetic and physical time series. We have extended the ARFIMA model by weakening the Gaussianity assumption, assuming an alpha-stable, heavy tailed, distribution for the innovations, and performing joint inference on d and alpha. We will present a study of the dependence of the posterior variance of the memory parameter d on the length of the time series considered. This will be compared with equivalent error diagnostics for other popular measures of d.
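A minimal sketch of the long-memory building block discussed above: simulate ARFIMA(0,d,0) by truncating the infinite moving-average expansion of the fractional difference operator, with the weights computed recursively. Gaussian innovations are used here; the heavy-tailed extension would swap in alpha-stable draws. All constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
d, n, trunc = 0.3, 2000, 1000
# MA(inf) weights psi_k = Gamma(k+d) / (Gamma(d) Gamma(k+1)), via recursion
psi = np.ones(trunc)
for k in range(1, trunc):
    psi[k] = psi[k - 1] * (k - 1 + d) / k
eps = rng.normal(0, 1, n + trunc)            # Gaussian innovations
# x_t = sum_k psi_k * eps_{t-k}, truncated at 'trunc' lags
x = np.array([psi @ eps[t:t + trunc][::-1] for t in range(n)])
print("sample autocorrelation at lag 50:",
      np.corrcoef(x[:-50], x[50:])[0, 1])
```

The slowly decaying positive autocorrelation at long lags is the signature of long-range dependence that the posterior on d is designed to capture.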
Bayesian analysis of the linear reaction norm model with unknown covariates.
Su, G; Madsen, P; Lund, M S; Sorensen, D; Korsgaard, I R; Jensen, J
2006-07-01
The reaction norm model is becoming a popular approach for the analysis of genotype x environment interactions. In a classical reaction norm model, the expression of a genotype in different environments is described as a linear function (a reaction norm) of an environmental gradient or value. An environmental value is typically defined as the mean performance of all genotypes in the environment, which is usually unknown. One approximation is to estimate the mean phenotypic performance in each environment and then treat these estimates as known covariates in the model. However, a more satisfactory alternative is to infer environmental values simultaneously with the other parameters of the model. This study describes a method and its Bayesian Markov Chain Monte Carlo implementation that makes this possible. Frequentist properties of the proposed method are tested in a simulation study. Estimates of parameters of interest agree well with the true values. Further, inferences about genetic parameters from the proposed method are similar to those derived from a reaction norm model using true environmental values. On the other hand, using phenotypic means as proxies for environmental values results in poor inferences.
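A toy Gibbs sampler conveys the key idea of inferring the environmental values jointly with the regression parameters. The sketch below assumes a known residual variance, flat priors on intercepts and slopes, and a standard-normal prior on the environmental values for identifiability; the paper's actual model additionally carries the full genetic (co)variance structure, which is not reproduced here.

    # Toy Gibbs sampler for y_ij = a_i + b_i * x_j + e with unknown x_j.
    import numpy as np

    rng = np.random.default_rng(9)
    n_geno, n_env, s2 = 30, 15, 0.25
    x_true = rng.standard_normal(n_env)            # true environmental values
    a_true = rng.normal(0.0, 1.0, n_geno)
    b_true = rng.normal(1.0, 0.3, n_geno)
    y = (a_true[:, None] + b_true[:, None] * x_true
         + rng.normal(0.0, np.sqrt(s2), (n_geno, n_env)))

    x, a, b = np.zeros(n_env), y.mean(axis=1), np.ones(n_geno)
    x_draws = []
    for sweep in range(2000):
        # (a_i, b_i) | x: conjugate normal regression update under a flat prior
        X = np.column_stack([np.ones(n_env), x])
        XtX_inv = np.linalg.inv(X.T @ X)
        for i in range(n_geno):
            mean = XtX_inv @ (X.T @ y[i])
            a[i], b[i] = rng.multivariate_normal(mean, s2 * XtX_inv)
        # x_j | rest: N(0, 1) prior combined with the likelihood terms
        prec = 1.0 + (b @ b) / s2
        for j in range(n_env):
            m = (b @ (y[:, j] - a)) / s2 / prec
            x[j] = rng.normal(m, 1.0 / np.sqrt(prec))
        if sweep >= 500:                           # discard burn-in
            x_draws.append(x.copy())

    x_hat = np.mean(x_draws, axis=0)
    print("corr(true x, posterior mean x):", np.corrcoef(x_true, x_hat)[0, 1].round(3))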
Li, Qian; Trivedi, Pravin K
2016-02-01
This paper develops an extended specification of the two-part model, which controls for unobservable self-selection and heterogeneity of health insurance, and analyzes the impact of Medicare supplemental plans on the prescription drug expenditure of the elderly, using a linked data set based on the Medicare Current Beneficiary Survey data for 2003-2004. The econometric analysis is conducted using a Bayesian econometric framework. We estimate the treatment effects for different counterfactuals and find significant evidence of endogeneity in plan choice and the presence of both adverse and advantageous selections in the supplemental insurance market. The average incentive effect is estimated to be $757 (2004 value) or 41% increase per person per year for the elderly enrolled in supplemental plans with drug coverage against the Medicare fee-for-service counterfactual and is $350 or 21% against the supplemental plans without drug coverage counterfactual. The incentive effect varies by different sources of drug coverage: highest for employer-sponsored insurance plans, followed by Medigap and managed medicare plans.
Assessment of occupational safety risks in Floridian solid waste systems using Bayesian analysis.
Bastani, Mehrad; Celik, Nurcin
2015-10-01
Safety risks embedded within solid waste management systems continue to be a significant issue and are prevalent at every step in the solid waste management process. To recognise and address these occupational hazards, it is necessary to discover the potential safety concerns that cause them, as well as their direct and/or indirect impacts on the different types of solid waste workers. In this research, our goal is to statistically assess occupational safety risks to solid waste workers in the state of Florida. Here, we first review the related standard industrial codes for the major solid waste management methods, including recycling, incineration, landfilling, and composting. Then, a quantitative assessment of the major risks is conducted on the collected data using Bayesian data analysis and predictive methods. The risks estimated in this study for the period 2005-2012 are then compared with historical statistics (1993-1997) from previous assessment studies. The results show that the injury rates among refuse collectors for both musculoskeletal and dermal injuries have decreased from 88 and 15 to 16 and 3 injuries per 1000 workers, respectively. However, a contrasting trend is observed for the injury rates among recycling workers, for whom musculoskeletal and dermal injuries have increased from 13 and 4 injuries to 14 and 6 injuries per 1000 workers, respectively. Finally, a linear regression model is proposed to identify the major factors behind the high numbers of musculoskeletal and dermal injuries. PMID:26219294
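For count data of this kind, a conjugate Gamma-Poisson update is the standard Bayesian building block: the posterior over an injury rate is available in closed form. The sketch below uses invented injury counts and exposure, not the Florida data.

    # Conjugate Gamma-Poisson updating of an injury rate per 1000 workers.
    from scipy import stats

    a0, b0 = 1.0, 0.1            # weak Gamma(shape, rate) prior on the rate
    injuries = 45                # hypothetical injury count over the period
    exposure = 2.8               # hypothetical exposure, in 1000 worker-years

    # Posterior is Gamma(a0 + injuries, b0 + exposure) by conjugacy.
    posterior = stats.gamma(a=a0 + injuries, scale=1.0 / (b0 + exposure))
    print(f"posterior mean rate per 1000 workers: {posterior.mean():.1f}")
    lo, hi = posterior.interval(0.95)
    print(f"95% credible interval: ({lo:.1f}, {hi:.1f})")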
Paired Comparison Analysis of the van Baaren Model Using Bayesian Approach with Noninformative Prior
Directory of Open Access Journals (Sweden)
Saima Altaf
2012-03-01
Full Text Available One technique being commonly studied these days because of its attractive applications for the comparison of several objects is the method of paired comparisons. This technique permits the ranking of the objects by means of a score, which reflects the merit of the items on a linear scale. The present study is concerned with the Bayesian analysis of a paired comparison model, namely the van Baaren model VI, using a noninformative uniform prior. For this purpose, the joint posterior distribution for the parameters of the model, their marginal distributions, posterior estimates (means and modes), the posterior probabilities for comparing the two treatment parameters, and the predictive probabilities are obtained.
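The workflow, posterior summaries plus the posterior probability that one treatment beats another, can be illustrated with a generic paired-comparison model. The sketch below uses a Bradley-Terry-style likelihood with a flat prior and random-walk Metropolis; the van Baaren model VI has additional structure that is not reproduced here, and the win counts are invented.

    import numpy as np

    wins = np.array([[0, 7, 9],      # wins[i, j] = times item i beat item j
                     [3, 0, 6],
                     [1, 4, 0]])

    def log_post(theta):
        """Bradley-Terry log likelihood, flat prior, sum-to-zero constraint."""
        t = theta - theta.mean()
        ll = 0.0
        for i in range(3):
            for j in range(3):
                if i != j:   # log P(i beats j) = t_i - log(exp(t_i) + exp(t_j))
                    ll += wins[i, j] * (t[i] - np.logaddexp(t[i], t[j]))
        return ll

    rng = np.random.default_rng(1)
    theta, samples = np.zeros(3), []
    lp = log_post(theta)
    for it in range(20000):                       # random-walk Metropolis
        prop = theta + 0.3 * rng.standard_normal(3)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        if it >= 5000:                            # discard burn-in
            samples.append(theta - theta.mean())
    samples = np.array(samples)
    print("posterior means:", samples.mean(axis=0).round(2))
    print("P(item 1 better than item 2):", (samples[:, 0] > samples[:, 1]).mean())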
Bayesian Modeling of MPSS Data: Gene Expression Analysis of Bovine Salmonella Infection
Dhavala, Soma S.
2010-09-01
Massively Parallel Signature Sequencing (MPSS) is a high-throughput, counting-based technology available for gene expression profiling. It produces output that is similar to Serial Analysis of Gene Expression and is ideal for building complex relational databases for gene expression. Our goal is to compare the in vivo global gene expression profiles of tissues infected with different strains of Salmonella obtained using the MPSS technology. In this article, we develop an exact ANOVA-type model for this count data using a zero-inflated Poisson distribution, different from existing methods that assume continuous densities. We adopt two Bayesian hierarchical models - one parametric and the other semiparametric with a Dirichlet process prior that has the ability to "borrow strength" across related signatures, where a signature is a specific arrangement of the nucleotides, usually 16-21 base pairs long. We utilize the discreteness of the Dirichlet process prior to cluster signatures that exhibit similar differential expression profiles. Tests for differential expression are carried out using nonparametric approaches, while controlling the false discovery rate. We identify several differentially expressed genes that have important biological significance and conclude with a summary of the biological discoveries. This article has supplementary materials online.
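The zero-inflated Poisson likelihood at the heart of the model is easy to state on its own. The sketch below fits it to simulated counts by maximum likelihood; the paper instead embeds this likelihood in Bayesian hierarchical models (parametric and Dirichlet-process), which are not reproduced here.

    # Zero-inflated Poisson: a point mass at zero mixed with a Poisson count.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import gammaln

    def zip_loglik(params, y):
        """Log-likelihood of ZIP(pi, lam); pi is the extra-zero probability."""
        pi, lam = params
        ll_zero = np.log(pi + (1 - pi) * np.exp(-lam))            # structural or Poisson zero
        ll_pos = np.log(1 - pi) - lam + y * np.log(lam) - gammaln(y + 1)
        return np.where(y == 0, ll_zero, ll_pos).sum()

    rng = np.random.default_rng(2)
    n, pi_true, lam_true = 2000, 0.3, 4.0
    y = rng.poisson(lam_true, n) * (rng.random(n) > pi_true)      # simulate ZIP counts

    res = minimize(lambda p: -zip_loglik(p, y), x0=[0.5, 1.0],
                   bounds=[(1e-6, 1 - 1e-6), (1e-6, None)])
    print("estimated (pi, lambda):", res.x.round(3))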
Bayesian oligogenic analysis of quantitative and qualitative traits in general pedigrees.
Uimari, P; Sillanpää, M J
2001-11-01
A Bayesian method for multipoint oligogenic analysis of quantitative and qualitative traits is presented. This method can be applied to general pedigrees, which do not necessarily have to be "peelable" and can have large numbers of markers. The number of quantitative/qualitative trait loci (QTL), their map positions in the genome, and phenotypic effects (mode of inheritances) are all estimated simultaneously within the same framework. The summaries of the estimated parameters are based on the marginal posterior distributions that are obtained through Markov chain Monte Carlo (MCMC) methods. The method uses founder alleles together with segregation indicators in order to determine the genotypes of the trait loci of all individuals in the pedigree. To improve mixing properties of the sampler, we propose (1) joint sampling of map position and segregation indicators, (2) omitting data augmentation for untyped or uninformative markers (homozygous parent), and (3) updating several markers jointly within a single block. The performance of the method was tested with two replicate GAW10 data sets (considering two levels of available marker information). The results were concordant and similar to those presented earlier with other methods. These analyses clearly illustrate the utility and wide applicability of the method. PMID:11668579
Feng, Jie; Oliva, Alberto
2016-01-01
The AMS-02 experiment has reported a new measurement of the antiproton/proton ratio in Galactic cosmic rays (CRs). In the energy range $E\sim\,$60-450 GeV, this ratio is found to be remarkably constant. Using recent data on CR proton, helium, carbon, 10Be/9Be, and B/C ratios, we have performed a global Bayesian analysis based on a Markov-Chain Monte-Carlo sampling algorithm under a "two halo model" of CR propagation. In this model, CRs are allowed to experience a different type of diffusion when they propagate in the region close to the Galactic disk. We found that the vertical extent of this region is about 900 pc above and below the disk, and the corresponding diffusion coefficient scales with energy as $D\sim\,E^{0.15}$, describing well the observations on primary CR spectra, secondary/primary ratios and anisotropy. Under this model we have carried out improved calculations of antiparticle spectra arising from secondary CR production and their corresponding uncertainties. We made use of Monte-Carlo generato...
Lesaffre, Emmanuel
2012-01-01
The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introd
A Semantic Connected Coherence Scheme for Efficient Image Segmentation
Directory of Open Access Journals (Sweden)
S.Pannirselvam
2012-06-01
Full Text Available Image processing is a comprehensively researched topic with a long history. Segmenting an image is the most challenging and difficult task in image processing and analysis. The principal difficulty met in image segmentation is the ability of techniques to discover semantic objects efficiently from an image without any prior knowledge. One recent work presented the connected coherence tree algorithm (CCTA) for image segmentation (with no prior knowledge), which discovered regions of semantic coherence based on neighbor-coherence segmentation criteria. It deployed an adaptive spatial scale and a suitable intensity-difference scale to extract several sets of coherent neighboring pixels, maximizing the probability of single image content and minimizing complex backgrounds. However, CCTA-segmented images either consist of small, lengthy and slender objects or are rigorously ruined by noise, irregular lighting, occlusion, poor illumination, and shadow. In this paper, we present a Cluster based Semantic Coherent Tree (CBSCT) scheme for image segmentation. CBSCT's initial work is on the semantic connected coherence criteria for the image segregation. Semantic coherent regions are clustered based on Bayesian nearest neighbor search of neighborhood pixels. The segmentation regions are extracted from the images based on the cluster object purity obtained through semantic coherent regions. The clustered image regions are post-processed with nonlinear noise filters. Performance metrics used in the evaluation of CBSCT are semantic coherent pixel size, number of cluster objects, purity levels of the cluster, segmented coherent region intensity threshold, and quality of segmented images in terms of image clarity with PSNR.
Directory of Open Access Journals (Sweden)
Fenghua Tian
2016-01-01
Full Text Available Cerebral autoregulation represents the physiological mechanisms that keep brain perfusion relatively constant in the face of changes in blood pressure, and thus plays an essential role in normal brain function. This study assessed cerebral autoregulation in nine newborns with moderate-to-severe hypoxic–ischemic encephalopathy (HIE). These neonates received hypothermic therapy during the first 72 h of life while mean arterial pressure (MAP) and cerebral tissue oxygenation saturation (SctO2) were continuously recorded. Wavelet coherence analysis, a time-frequency domain approach, was used to characterize the dynamic relationship between spontaneous oscillations in MAP and SctO2. Wavelet-based metrics of phase, coherence and gain were derived for quantitative evaluation of cerebral autoregulation. We found cerebral autoregulation in neonates with HIE was time-scale-dependent in nature. Specifically, the spontaneous changes in MAP and SctO2 had in-phase coherence at time scales of less than 80 min (<0.0002 Hz in frequency), whereas they showed anti-phase coherence at time scales of around 2.5 h (~0.0001 Hz in frequency). Both the in-phase and anti-phase coherence appeared to be related to worse clinical outcomes. These findings suggest the potential clinical use of wavelet coherence analysis to assess dynamic cerebral autoregulation in neonatal HIE during hypothermia.
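A simplified frequency-domain stand-in conveys the idea of quantifying MAP-SctO2 coupling by coherence. The sketch below uses Welch magnitude-squared coherence from SciPy on simulated traces; the paper's wavelet coherence, with phase and gain at each time scale, would typically use a dedicated continuous-wavelet package (e.g. pycwt), and all signal parameters here are invented.

    import numpy as np
    from scipy.signal import coherence

    fs = 1.0                                  # 1 Hz sampling of both traces
    t = np.arange(0, 6 * 3600, 1 / fs)        # six hours of monitoring
    rng = np.random.default_rng(3)

    slow = np.sin(2 * np.pi * t / 2000.0)     # shared slow oscillation (0.0005 Hz)
    map_sig = 60 + 5 * slow + rng.standard_normal(t.size)   # mean arterial pressure
    scto2 = 75 + 3 * slow + rng.standard_normal(t.size)     # tissue oxygenation

    f, Cxy = coherence(map_sig, scto2, fs=fs, nperseg=4096)
    band = (f > 0) & (f < 0.001)              # very-low-frequency band
    print(f"mean squared coherence below 0.001 Hz: {Cxy[band].mean():.2f}")

Because the two traces share the slow oscillation, the coherence in the very-low-frequency band comes out elevated, which is the coupling signature the wavelet analysis resolves per time scale.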
Bayesian belief networks for human reliability analysis: A review of applications and gaps
International Nuclear Information System (INIS)
The use of Bayesian Belief Networks (BBNs) in risk analysis (and in particular Human Reliability Analysis, HRA) is fostered by a number of features, attractive in fields with a shortage of data and consequent reliance on subjective judgments: the intuitive graphical representation, the possibility of combining diverse sources of information, and the use of the probabilistic framework to characterize uncertainties. In HRA, BBN applications are steadily increasing, each emphasizing a different BBN feature or a different HRA aspect to improve. This paper aims at a critical review of these features as well as at suggesting research needs. Five groups of BBN applications are analysed: modelling of organizational factors, analysis of the relationships among failure influencing factors, BBN-based extensions of existing HRA methods, dependency assessment among human failure events, and assessment of situation awareness. Further, the paper analyses the process for building BBNs and in particular how expert judgment is used in the assessment of the BBN conditional probability distributions. The gaps identified in the review suggest the need for establishing more systematic frameworks to integrate the different sources of information relevant for HRA (cognitive models, empirical data, and expert judgment) and to investigate algorithms to avoid elicitation of many relationships via expert judgment. - Highlights: • We analyze BBN uses for HRA applications; but some conclusions can be generalized. • Special review focus on BBN building approaches, key for model acceptance. • Gaps relate to the transparency of the BBN building and quantification phases. • Need for more systematic framework to integrate different sources of information. • Need of ways to avoid elicitation of many relationships via expert judgment.
Directory of Open Access Journals (Sweden)
Hero Alfred
2010-11-01
Full Text Available Abstract Background Nonparametric Bayesian techniques have been developed recently to extend the sophistication of factor models, allowing one to infer the number of appropriate factors from the observed data. We consider such techniques for sparse factor analysis, with application to gene-expression data from three virus challenge studies. Particular attention is placed on employing the Beta Process (BP), the Indian Buffet Process (IBP), and related sparseness-promoting techniques to infer a proper number of factors. The posterior density function on the model parameters is computed using Gibbs sampling and variational Bayesian (VB) analysis. Results Time-evolving gene-expression data are considered for respiratory syncytial virus (RSV), Rhino virus, and influenza, using blood samples from healthy human subjects. These data were acquired in three challenge studies, each executed after receiving institutional review board (IRB) approval from Duke University. Comparisons are made between several alternative means of performing nonparametric factor analysis on these data, with comparisons as well to sparse-PCA and Penalized Matrix Decomposition (PMD), closely related non-Bayesian approaches. Conclusions Applying the Beta Process to the factor scores, or to the singular values of a pseudo-SVD construction, the proposed algorithms infer the number of factors in gene-expression data. For real data the "true" number of factors is unknown; in our simulations we consider a range of noise variances, and the proposed Bayesian models inferred the number of factors accurately relative to other methods in the literature, such as sparse-PCA and PMD. We have also identified a "pan-viral" factor of importance for each of the three viruses considered in this study. We have identified a set of genes associated with this pan-viral factor, of interest for early detection of such viruses based upon the host response, as quantified via gene-expression data.
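The Indian Buffet Process mentioned above has a short generative description: each new observation takes existing latent features in proportion to their popularity and adds a Poisson number of new ones, which is how the number of factors is inferred rather than fixed. A minimal prior-sampling sketch, not the authors' Gibbs/VB machinery:

    import numpy as np

    def sample_ibp(n_rows, alpha, rng):
        """Draw a binary feature-assignment matrix Z ~ IBP(alpha)."""
        dishes = []                              # per-feature popularity counts
        rows = []
        for i in range(1, n_rows + 1):
            # existing features: taken with probability m_k / i
            row = [rng.random() < m / i for m in dishes]
            dishes = [m + z for m, z in zip(dishes, row)]
            # brand-new features for this row: Poisson(alpha / i) of them
            k_new = rng.poisson(alpha / i)
            row += [True] * k_new
            dishes += [1] * k_new
            rows.append(row)
        Z = np.zeros((n_rows, len(dishes)), dtype=int)
        for i, row in enumerate(rows):
            Z[i, :len(row)] = row
        return Z

    Z = sample_ibp(10, alpha=2.0, rng=np.random.default_rng(4))
    print("sampled", Z.shape[1], "latent factors for 10 samples")
    print(Z)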
Draper, D.
2001-01-01
Article Outline: Glossary Definition of the Subject and Introduction The Bayesian Statistical Paradigm Three Examples Comparison with the Frequentist Statistical Paradigm Future Directions Bibliography
Canadian energy and climate policies: A SWOT analysis in search of federal/provincial coherence
International Nuclear Information System (INIS)
This paper presents an analysis of Canadian energy and climate policies in terms of the coherence between federal and provincial/territorial strategies. After briefly describing the institutional, energy, and climate contexts, we perform a SWOT analysis on the themes of energy security, energy efficiency, and technology and innovation. Within this analytical framework, we discuss the coherence of federal and provincial policies and of energy and climate policies. Our analysis shows that there is a lack of consistency in the Canadian energy and climate strategies beyond the application of market principles. Furthermore, in certain sectors, the Canadian approach amounts to an amalgam of decisions made at a provincial level without cooperation with other provinces or with the federal government. One way to improve policy coherence would be to increase the cooperation between the different jurisdictions by using a combination of policy tools and by relying on existing intergovernmental agencies. - Highlights: • We perform a SWOT analysis of the Canadian energy and climate policies. • We analyse policy coherence between federal and provincial/territorial strategies. • We show that a lack of coordination leads to a weak coherence among policies. • The absence of cooperation results in additional costs for Canada
Bayesian demography 250 years after Bayes.
Bijak, Jakub; Bryant, John
2016-01-01
Bayesian statistics offers an alternative to classical (frequentist) statistics. It is distinguished by its use of probability distributions to describe uncertain quantities, which leads to elegant solutions to many difficult statistical problems. Although Bayesian demography, like Bayesian statistics more generally, is around 250 years old, only recently has it begun to flourish. The aim of this paper is to review the achievements of Bayesian demography, address some misconceptions, and make the case for wider use of Bayesian methods in population studies. We focus on three applications: demographic forecasts, limited data, and highly structured or complex models. The key advantages of Bayesian methods are the ability to integrate information from multiple sources and to describe uncertainty coherently. Bayesian methods also allow for including additional (prior) information next to the data sample. As such, Bayesian approaches are complementary to many traditional methods, which can be productively re-expressed in Bayesian terms. PMID:26902889
Health at the borders: Bayesian multilevel analysis of women's malnutrition determinants in Ethiopia
Directory of Open Access Journals (Sweden)
Tefera Darge Delbiso
2016-07-01
Full Text Available Background: Women's malnutrition, particularly undernutrition, remains an important public health challenge in Ethiopia. Although various studies examined the levels and determinants of women's nutritional status, the influence of living close to an international border on women's nutrition has not been investigated. Yet, Ethiopian borders are regularly affected by conflict and refugee flows, which might ultimately impact health. Objective: To investigate the impact of living close to borders on the nutritional status of women in Ethiopia, while considering other important covariates. Design: Our analysis was based on the body mass index (BMI) of 6,334 adult women aged 20–49 years, obtained from the 2011 Ethiopian Demographic and Health Survey (EDHS). A Bayesian multilevel multinomial logistic regression analysis was used to capture the clustered structure of the data and the possible correlation that may exist within and between clusters. Results: After controlling for potential confounders, women living close to borders (i.e. ≤100 km) in Ethiopia were 59% more likely to be underweight (posterior odds ratio [OR]=1.59; 95% credible interval [CrI]: 1.32–1.90) than their counterparts living far from the borders. This result was robust to different choices of border delineation (i.e. ≤50, ≤75, ≤125, and ≤150 km). Being from a poor family, lacking access to improved toilets, residing in lowland areas, and being Muslim were independently associated with underweight. In contrast, more wealth, higher education, older age, access to improved toilets, being married, and living in urban or lowland areas were independently associated with overweight. Conclusions: The problem of undernutrition among women in Ethiopia is most worrisome in the border areas. Targeted interventions to improve nutritional status in these areas, such as improved access to sanitation and economic and livelihood support, are recommended.
Bayesian hierarchical multi-subject multiscale analysis of functional MRI data.
Sanyal, Nilotpal; Ferreira, Marco A R
2012-11-15
We develop a methodology for Bayesian hierarchical multi-subject multiscale analysis of functional Magnetic Resonance Imaging (fMRI) data. We begin by modeling the brain images temporally with a standard general linear model. After that, we transform the resulting estimated standardized regression coefficient maps through a discrete wavelet transformation to obtain a sparse representation in the wavelet space. Subsequently, we assign to the wavelet coefficients a prior that is a mixture of a point mass at zero and a Gaussian white noise. In this mixture prior for the wavelet coefficients, the mixture probabilities are related to the pattern of brain activity across different resolutions. To incorporate this information, we assume that the mixture probabilities for wavelet coefficients at the same location and level are common across subjects. Furthermore, we assign for the mixture probabilities a prior that depends on a few hyperparameters. We develop an empirical Bayes methodology to estimate the hyperparameters and, as these hyperparameters are shared by all subjects, we obtain precise estimated values. Then we carry out inference in the wavelet space and obtain smoothed images of the regression coefficients by applying the inverse wavelet transform to the posterior means of the wavelet coefficients. An application to computer simulated synthetic data has shown that, when compared to single-subject analysis, our multi-subject methodology performs better in terms of mean squared error. Finally, we illustrate the utility and flexibility of our multi-subject methodology with an application to an event-related fMRI dataset generated by Postle (2005) through a multi-subject fMRI study of working memory related brain activation. PMID:22951257
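The mixture prior described above yields a closed-form posterior for each wavelet coefficient: a posterior probability of being nonzero, and a shrunken posterior mean. The sketch below evaluates both for a single subject with illustrative hyperparameter values, without the paper's empirical-Bayes estimation step.

    # Spike-and-slab posterior for one wavelet coefficient: observation
    # y = theta + noise, prior theta ~ (1-p) * delta_0 + p * N(0, tau^2).
    import numpy as np
    from scipy.stats import norm

    def shrink(y, p=0.2, tau=2.0, sigma=1.0):
        """Posterior inclusion probability and posterior mean."""
        dens_slab = norm.pdf(y, scale=np.sqrt(tau**2 + sigma**2))
        dens_spike = norm.pdf(y, scale=sigma)
        w = p * dens_slab / (p * dens_slab + (1 - p) * dens_spike)
        return w, w * (tau**2 / (tau**2 + sigma**2)) * y

    for y in [0.3, 1.5, 4.0]:            # illustrative empirical coefficients
        w, m = shrink(y)
        print(f"y = {y:4.1f}  ->  P(nonzero | y) = {w:.2f}, posterior mean = {m:.2f}")

Small coefficients are shrunk essentially to zero while large ones survive nearly intact, which is what produces the smoothed regression-coefficient images after the inverse wavelet transform.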
A Bayesian Network Approach for Offshore Risk Analysis Through Linguistic Variables
Institute of Scientific and Technical Information of China (English)
Anonymous
2007-01-01
This paper presents a new approach for offshore risk analysis that is capable of dealing with linguistic probabilities in Bayesian networks (BNs). In this paper, linguistic probabilities are used to describe the occurrence likelihood of hazardous events that may cause possible accidents in offshore operations. In order to use fuzzy information, an f-weighted valuation function is proposed to transform linguistic judgements into crisp probability distributions which can be easily put into a BN to model causal relationships among risk factors. The use of linguistic variables makes it easier for human experts to express their knowledge, and the transformation of linguistic judgements into crisp probabilities can significantly reduce the cost of computing, modifying and maintaining a BN model. The flexibility of the method allows for multiple forms of information to be used to quantify model relationships, including formally assessed expert opinion when quantitative data are lacking, or when only qualitative or vague statements can be made. The model is a modular representation of uncertain knowledge caused by randomness, vagueness and ignorance. This makes the risk analysis of offshore engineering systems more functional and easier in many assessment contexts. Specifically, the proposed f-weighted valuation function takes into account not only the dominating values, but also the α-level values that are ignored by conventional valuation methods. A case study of the collision risk between a Floating Production, Storage and Off-loading (FPSO) unit and authorised vessels due to human elements during operation is used to illustrate the application of the proposed model.
Temporal analysis of the coherent properties of optical images of rough nonplanar objects
Mandrosov, V. I.
2009-01-01
The possibility of using temporal analysis to find the relation between chromatic properties of probe radiation and coherent properties of the optical images of rough non-planar objects is substantiated. The analysis is based on the use of the time correlation function and on the study of the speckl
Wendling, Thierry; Tsamandouras, Nikolaos; Dumitras, Swati; Pigeolet, Etienne; Ogungbenro, Kayode; Aarons, Leon
2016-01-01
Whole-body physiologically based pharmacokinetic (PBPK) models are increasingly used in drug development for their ability to predict drug concentrations in clinically relevant tissues and to extrapolate across species, experimental conditions and sub-populations. A whole-body PBPK model can be fitted to clinical data using a Bayesian population approach. However, the analysis might be time consuming and numerically unstable if prior information on the model parameters is too vague given the complexity of the system. We suggest an approach where (i) a whole-body PBPK model is formally reduced using a Bayesian proper lumping method to retain the mechanistic interpretation of the system and account for parameter uncertainty, (ii) the simplified model is fitted to clinical data using Markov Chain Monte Carlo techniques and (iii) the optimised reduced PBPK model is used for extrapolation. A previously developed 16-compartment whole-body PBPK model for mavoglurant was reduced to 7 compartments while preserving plasma concentration-time profiles (median and variance) and giving emphasis to the brain (target site) and the liver (elimination site). The reduced model was numerically more stable than the whole-body model for the Bayesian analysis of mavoglurant pharmacokinetic data in healthy adult volunteers. Finally, the reduced yet mechanistic model could easily be scaled from adults to children and predict mavoglurant pharmacokinetics in children aged from 3 to 11 years with similar performance compared with the whole-body model. This study is a first example of the practicality of formal reduction of complex mechanistic models for Bayesian inference in drug development.
Analysis on the Cohesive and Coherent Features of John Kennedy’s First Inaugural Address
Institute of Scientific and Technical Information of China (English)
范莹芳
2013-01-01
John Kennedy's first inaugural address is one of the most widely appreciated speeches worldwide. It is famous not only for calling on the American people to serve the country well, but also for its extraordinary linguistic power to arouse the listeners' emotions, which lies to a great extent in the marvelous employment of cohesive and coherent devices in the process of its delivery. Cohesion and coherence are two elementary and significant concepts in the theoretical system of discourse analysis. Therefore, they play an important role in the structuring, arrangement, interpretation and analysis of a discourse. In this sense, it is significant to analyze the cohesive and coherent features of John Kennedy's first inaugural address in order to obtain a penetrating comprehension of the speech in many aspects. A detailed analysis of the cohesive and coherent features of the speech is conducted in this paper. In the aspect of cohesion, the devices employed fall into two categories: structural cohesion and non-structural cohesion. Structural cohesive devices used in the discourse are mainly grammatical cohesion and lexical cohesion, such as repetition, ellipsis and conjunction. Non-structural methods adopted in the speech are transitivity, mood and modality, thematic progression, parallel structure and so on. In the aspect of coherence, five levels of coherent methods are employed, namely the lexical, syntactic, semantic, phonological and social semiotic levels. The neat intermingling of the cohesive and coherent methods functions cooperatively and leads to the smooth flow of the text.
DEFF Research Database (Denmark)
Pedersen, Thorkild Find
2003-01-01
frequency estimation techniques are considered for predicting the true fundamental frequency from a measured acoustic noise or vibration signal. Among the methods are auto-correlation based methods, subspace methods, interpolated Fourier transform methods, and adaptive filters. A modified version of an adaptive comb filter is derived for tracking non-stationary signals. The estimation problem is then rephrased in terms of the Bayesian statistical framework. In the Bayesian framework both parameters and observations are considered stochastic processes. The result of the estimation is an expression for the probability density function (PDF) of the parameters conditioned on observation. Considering the fundamental frequency as a parameter and the acoustic and vibration signals as observations, a novel Bayesian frequency estimator is developed. With simulations the new estimator is shown to be superior to any...
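Among the classical methods listed, the autocorrelation approach is the simplest to show: pick the lag of the strongest autocorrelation peak inside the plausible pitch range. The sketch below is a bare-bones point estimator on a synthetic tone; the thesis's Bayesian estimator instead returns a posterior density over frequency.

    import numpy as np

    def f0_autocorr(x, fs, fmin=50.0, fmax=500.0):
        """Point estimate of the fundamental from the autocorrelation peak."""
        x = x - x.mean()
        r = np.correlate(x, x, mode="full")[x.size - 1:]   # non-negative lags
        lo, hi = int(fs / fmax), int(fs / fmin)            # admissible lag range
        lag = lo + np.argmax(r[lo:hi])
        return fs / lag

    fs = 8000.0
    t = np.arange(0, 0.5, 1 / fs)
    rng = np.random.default_rng(5)
    x = np.sin(2 * np.pi * 220.0 * t) + 0.5 * rng.standard_normal(t.size)
    print(f"estimated fundamental: {f0_autocorr(x, fs):.1f} Hz (true 220 Hz)")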
Measurement and analysis of coherent synchrotron radiation effects at FLASH
Energy Technology Data Exchange (ETDEWEB)
Beutner, B.
2007-12-15
The vacuum-ultra-violet Free Electron Laser in Hamburg (FLASH) is a linac-driven SASE-FEL. High peak currents are produced using magnetic bunch compression chicanes. In these magnetic chicanes, the energy distribution along an electron bunch is changed by effects of Coherent Synchrotron Radiation (CSR). Energy changes in dispersive bunch compressor chicanes lead to transverse displacements along the bunch. These CSR-induced displacements are studied using a transverse deflecting RF-structure. Experiments and simulations concerning the charge dependence of such transverse displacements are presented and analysed. In these experiments an over-compression scheme is used which reduces the peak current downstream of the bunch compressor chicanes. Therefore other self-interactions, such as space charge forces, which might complicate the measurements are suppressed. Numerical simulations are used to analyse the beam dynamics under the influence of CSR forces. The results of these numerical simulations are compared with the data obtained in the over-compression experiments at FLASH. (orig.)
Statistical analysis of motion contrast in optical coherence tomography angiography
Cheng, Yuxuan; Guo, Li; Pan, Cong; Lu, Tongtong; Hong, Tianyu; Ding, Zhihua; Li, Peng
2015-11-01
Optical coherence tomography angiography (Angio-OCT), mainly based on the temporal dynamics of OCT scattering signals, has found a range of potential applications in clinical and scientific research. Based on the model of random phasor sums, temporal statistics of the complex-valued OCT signals are mathematically described. Statistical distributions of the amplitude differential and complex differential Angio-OCT signals are derived. The theories are validated through the flow phantom and live animal experiments. Using the model developed, the origin of the motion contrast in Angio-OCT is mathematically explained, and the implications in the improvement of motion contrast are further discussed, including threshold determination and its residual classification error, averaging method, and scanning protocol. The proposed mathematical model of Angio-OCT signals can aid in the optimal design of the system and associated algorithms.
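The random-phasor-sum model can be conveyed by direct Monte Carlo: static voxels keep a nearly repeated complex signal across scans, while flow decorrelates it, so amplitude-differential and complex-differential signals separate the two. The decorrelation parameterization below is an illustrative assumption, not the paper's exact formulation.

    import numpy as np

    rng = np.random.default_rng(6)
    n = 100000                                   # independent voxel realizations

    def two_repeats(decorrelation):
        """Complex OCT signal at two repeated scans; decorrelation in [0, 1]."""
        a = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
        b = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
        s1 = a
        s2 = np.sqrt(1 - decorrelation) * a + np.sqrt(decorrelation) * b
        return s1, s2

    for label, rho in [("static tissue", 0.02), ("flowing blood", 0.90)]:
        s1, s2 = two_repeats(rho)
        cd = np.mean(np.abs(s1 - s2))                  # complex differential
        ad = np.mean(np.abs(np.abs(s1) - np.abs(s2)))  # amplitude differential
        print(f"{label:14s}  mean CD = {cd:.3f}   mean AD = {ad:.3f}")

Comparing the two rows shows the motion contrast: both differential signals are much larger for the decorrelated "flow" voxels, and the gap between them illustrates why threshold choice and averaging matter for classification error.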
Bayesian Analysis of Linear Inverse Problems with Applications in Economics and Finance
De Simoni, Anna
2009-01-01
In my PhD thesis I propose a Bayesian nonparametric estimation method for structural econometric models where the functional parameter of interest describes the economic agent's behavior. The structural parameter is characterized as the solution of a functional equation, or by using more technical words, as the solution of an inverse problem that can be either ill-posed or well-posed. From a Bayesian point of view, the parameter of interest is a random function and the solution to the infe...
Fancher, Chris M.; Han, Zhen; Levin, Igor; Page, Katharine; Reich, Brian J.; Smith, Ralph C.; Wilson, Alyson G.; Jones, Jacob L.
2016-01-01
A Bayesian inference method for refining crystallographic structures is presented. The distribution of model parameters is stochastically sampled using Markov chain Monte Carlo. Posterior probability distributions are constructed for all model parameters to properly quantify uncertainty by appropriately modeling the heteroskedasticity and correlation of the error structure. The proposed method is demonstrated by analyzing a National Institute of Standards and Technology silicon standard reference material. The results obtained by Bayesian inference are compared with those determined by Rietveld refinement. Posterior probability distributions of model parameters provide both estimates and uncertainties. The new method better estimates the true uncertainties in the model as compared to the Rietveld method. PMID:27550221
Gilkey, Kelly M.; Myers, Jerry G.; McRae, Michael P.; Griffin, Elise A.; Kallrui, Aditya S.
2012-01-01
The Exploration Medical Capability project is creating a catalog of risk assessments using the Integrated Medical Model (IMM). The IMM is a software-based system intended to assist mission planners in preparing for spaceflight missions by helping them to make informed decisions about medical preparations and supplies needed for combating and treating various medical events using Probabilistic Risk Assessment. The objective is to use statistical analyses to inform the IMM decision tool with estimated probabilities of medical events occurring during an exploration mission. Because data regarding astronaut health are limited, Bayesian statistical analysis is used. Bayesian inference combines prior knowledge, such as data from the general U.S. population, the U.S. Submarine Force, or the analog astronaut population located at the NASA Johnson Space Center, with observed data for the medical condition of interest. The posterior results reflect the best evidence for specific medical events occurring in flight. Bayes' theorem provides a formal mechanism for combining available observed data with data from similar studies to support the quantification process. The IMM team performed Bayesian updates on the following medical events: angina, appendicitis, atrial fibrillation, atrial flutter, dental abscess, dental caries, dental periodontal disease, gallstone disease, herpes zoster, renal stones, seizure, and stroke.
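For a single medical condition, the described update has a simple conjugate form: a Beta prior summarizing analog-population evidence combined with Binomial in-flight observations. All counts below are invented for illustration.

    from scipy import stats

    # Prior for the per-mission event probability, built (hypothetically)
    # from an analog population: Beta(3, 997), prior mean 0.3%.
    a0, b0 = 3, 997
    events, missions = 1, 150        # invented in-flight observations

    # Conjugate update: Beta(a0 + events, b0 + missions - events).
    posterior = stats.beta(a=a0 + events, b=b0 + missions - events)
    print(f"prior mean:     {a0 / (a0 + b0):.4f}")
    print(f"posterior mean: {posterior.mean():.4f}")
    lo, hi = posterior.interval(0.95)
    print(f"95% credible interval: ({lo:.4f}, {hi:.4f})")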
New class of hybrid EoS and Bayesian M - R data analysis
Energy Technology Data Exchange (ETDEWEB)
Alvarez-Castillo, D. [JINR Dubna, Bogoliubov Laboratory of Theoretical Physics, Dubna (Russian Federation); Ayriyan, A.; Grigorian, H. [JINR Dubna, Laboratory of Information Technologies, Dubna (Russian Federation); Benic, S. [University of Zagreb, Department of Physics, Zagreb (Croatia); Blaschke, D. [JINR Dubna, Bogoliubov Laboratory of Theoretical Physics, Dubna (Russian Federation); National Research Nuclear University (MEPhI), Moscow (Russian Federation); Typel, S. [GSI Helmholtzzentrum fuer Schwerionenforschung GmbH, Darmstadt (Germany)
2016-03-15
We explore systematically a new class of two-phase equations of state (EoS) for hybrid stars that is characterized by three main features: (1) stiffening of the nuclear EoS at supersaturation densities due to quark exchange effects (Pauli blocking) between hadrons, modelled by an excluded volume correction; (2) stiffening of the quark matter EoS at high densities due to multiquark interactions; and (3) possibility for a strong first-order phase transition with an early onset and large density jump. The third feature results from a Maxwell construction for the possible transition from the nuclear to a quark matter phase and its properties depend on the two parameters used for (1) and (2), respectively. Varying these two parameters, one obtains a class of hybrid EoS that yields solutions of the Tolman-Oppenheimer-Volkoff (TOV) equations for sequences of hadronic and hybrid stars in the mass-radius diagram which cover the full range of patterns according to the Alford-Han-Prakash classification following which a hybrid star branch can be either absent, connected or disconnected with the hadronic one. The latter case often includes a tiny connected branch. The disconnected hybrid star branch, also called "third family", corresponds to high-mass twin stars characterized by the same gravitational mass but different radii. We perform a Bayesian analysis and demonstrate that the observation of such a pair of high-mass twin stars would have a sufficient discriminating power to favor hybrid EoS with a strong first-order phase transition over alternative EoS. (orig.)
Bayesian approach to the analysis of neutron Brillouin scattering data on liquid metals.
De Francesco, A; Guarini, E; Bafile, U; Formisano, F; Scaccia, L
2016-08-01
When the dynamics of liquids and disordered systems at mesoscopic level is investigated by means of inelastic scattering (e.g., neutron or x ray), spectra are often characterized by a poor definition of the excitation lines and spectroscopic features in general and one important issue is to establish how many of these lines need to be included in the modeling function and to estimate their parameters. Furthermore, when strongly damped excitations are present, commonly used and widespread fitting algorithms are particularly affected by the choice of initial values of the parameters. An inadequate choice may lead to an inefficient exploration of the parameter space, resulting in the algorithm getting stuck in a local minimum. In this paper, we present a Bayesian approach to the analysis of neutron Brillouin scattering data in which the number of excitation lines is treated as unknown and estimated along with the other model parameters. We propose a joint estimation procedure based on a reversible-jump Markov chain Monte Carlo algorithm, which efficiently explores the parameter space, producing a probabilistic measure to quantify the uncertainty on the number of excitation lines as well as reliable parameter estimates. The method proposed could turn out of great importance in extracting physical information from experimental data, especially when the detection of spectral features is complicated not only because of the properties of the sample, but also because of the limited instrumental resolution and count statistics. The approach is tested on generated data set and then applied to real experimental spectra of neutron Brillouin scattering from a liquid metal, previously analyzed in a more traditional way. PMID:27627410
A method of spherical harmonic analysis in the geosciences via hierarchical Bayesian inference
Muir, J. B.; Tkalčić, H.
2015-11-01
The problem of decomposing irregular data on the sphere into a set of spherical harmonics is common in many fields of geosciences where it is necessary to build a quantitative understanding of a globally varying field. For example, in global seismology, a compressional or shear wave speed that emerges from tomographic images is used to interpret current state and composition of the mantle, and in geomagnetism, secular variation of magnetic field intensity measured at the surface is studied to better understand the changes in the Earth's core. Optimization methods are widely used for spherical harmonic analysis of irregular data, but they typically do not treat the dependence of the uncertainty estimates on the imposed regularization. This can cause significant difficulties in interpretation, especially when the best-fit model requires more variables as a result of underestimating data noise. Here, with the above limitations in mind, the problem of spherical harmonic expansion of irregular data is treated within the hierarchical Bayesian framework. The hierarchical approach significantly simplifies the problem by removing the need for regularization terms and user-supplied noise estimates. The use of the corrected Akaike Information Criterion for picking the optimal maximum degree of spherical harmonic expansion and the resulting spherical harmonic analyses are first illustrated on a noisy synthetic data set. Subsequently, the method is applied to two global data sets sensitive to the Earth's inner core and lowermost mantle, consisting of PKPab-df and PcP-P differential traveltime residuals relative to a spherically symmetric Earth model. The posterior probability distributions for each spherical harmonic coefficient are calculated via Markov Chain Monte Carlo sampling; the uncertainty obtained for the coefficients thus reflects the noise present in the real data and the imperfections in the spherical harmonic expansion.
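A non-hierarchical simplification shows the two ingredients named above, spherical-harmonic expansion of scattered data and information-criterion selection of the maximum degree. The sketch below uses ordinary least squares with AICc on synthetic data via scipy.special.sph_harm (newer SciPy versions rename this sph_harm_y), rather than the paper's hierarchical MCMC with posterior uncertainty on every coefficient.

    import numpy as np
    from scipy.special import sph_harm

    def design(lmax, theta, phi):
        """Real spherical-harmonic design matrix for scattered points."""
        cols = []
        for l in range(lmax + 1):
            for m in range(0, l + 1):
                y = sph_harm(m, l, theta, phi)   # theta: azimuth, phi: colatitude
                cols.append(y.real)
                if m > 0:
                    cols.append(y.imag)
        return np.column_stack(cols)

    rng = np.random.default_rng(7)
    npts = 300
    theta = rng.uniform(0, 2 * np.pi, npts)
    phi = np.arccos(rng.uniform(-1, 1, npts))    # uniform points on the sphere
    truth = sph_harm(0, 2, theta, phi).real      # a pure degree-2 field
    data = truth + 0.05 * rng.standard_normal(npts)

    for lmax in range(0, 6):
        A = design(lmax, theta, phi)
        coef, *_ = np.linalg.lstsq(A, data, rcond=None)
        rss = np.sum((data - A @ coef) ** 2)
        k = A.shape[1] + 1                       # parameters incl. noise variance
        aicc = npts * np.log(rss / npts) + 2 * k * npts / (npts - k - 1)
        print(f"lmax = {lmax}: {A.shape[1]:2d} coefficients, AICc = {aicc:7.1f}")

The AICc minimum should sit at lmax = 2 for this synthetic field; the hierarchical Bayesian version additionally lets the data determine the noise level instead of fixing it.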
Bayesian Analysis of Non-Gaussian Long-Range Dependent Processes
Graves, Timothy; Watkins, Nicholas; Franzke, Christian; Gramacy, Robert
2013-04-01
Recent studies [e.g. the Antarctic study of Franzke, J. Climate, 2010] have strongly suggested that surface temperatures exhibit long-range dependence (LRD). The presence of LRD would hamper the identification of deterministic trends and the quantification of their significance. It is well established that LRD processes exhibit stochastic trends over rather long periods of time. Thus, accurate methods for discriminating between physical processes that possess long memory and those that do not are an important adjunct to climate modeling. As we briefly review, the LRD idea originated at the same time as H-self-similarity, so it is often not realised that a model does not have to be H-self-similar to show LRD [e.g. Watkins, GRL Frontiers, 2013]. We have used Markov Chain Monte Carlo algorithms to perform a Bayesian analysis of Auto-Regressive Fractionally-Integrated Moving-Average ARFIMA(p,d,q) processes, which are capable of modeling LRD. Our principal aim is to obtain inference about the long memory parameter, d, with secondary interest in the scale and location parameters. We have developed a reversible-jump method enabling us to integrate over different model forms for the short memory component. We initially assume Gaussianity, and have tested the method on both synthetic and physical time series. Many physical processes, for example the Faraday Antarctic time series, are significantly non-Gaussian. We have therefore extended this work by weakening the Gaussianity assumption, assuming an alpha-stable distribution for the innovations, and performing joint inference on d and alpha. Such a modified FARIMA(p,d,q) process is a flexible, initial model for non-Gaussian processes with long memory. We will present a study of the dependence of the posterior variance of the memory parameter d on the length of the time series considered. This will be compared with equivalent error diagnostics for other measures of d.
Cancer mortality inequalities in urban areas: a Bayesian small area analysis in Spanish cities
Directory of Open Access Journals (Sweden)
Martos Carmen M
2011-01-01
Full Text Available Abstract Background Intra-urban inequalities in mortality have been infrequently analysed in European contexts. The aim of the present study was to analyse patterns of cancer mortality and their relationship with socioeconomic deprivation in small areas in 11 Spanish cities. Methods It is a cross-sectional ecological design using mortality data (years 1996-2003). Units of analysis were the census tracts. A deprivation index was calculated for each census tract. In order to control the variability in estimating the risk of dying we used Bayesian models. We present the RR of the census tract with the highest deprivation vs. the census tract with the lowest deprivation. Results In the case of men, socioeconomic inequalities are observed in total cancer mortality in all cities, except in Castellon, Cordoba and Vigo, while Barcelona (RR = 1.53, 95%CI 1.42-1.67), Madrid (RR = 1.57, 95%CI 1.49-1.65) and Seville (RR = 1.53, 95%CI 1.36-1.74) present the greatest inequalities. In general, Barcelona and Madrid present inequalities for most types of cancer. Among women, for total cancer mortality, inequalities have only been found in Barcelona and Zaragoza. The excess number of cancer deaths due to socioeconomic deprivation was 16,413 for men and 1,142 for women. Conclusion This study has analysed inequalities in cancer mortality in small areas of cities in Spain, not only relating this mortality with socioeconomic deprivation, but also calculating the excess mortality which may be attributed to such deprivation. This knowledge is particularly useful to determine which geographical areas in each city need intersectorial policies in order to promote a healthy environment.
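Small-area death counts are classically stabilized with a Gamma-Poisson model, the simplest version of the Bayesian smoothing invoked above: raw standardized mortality ratios (SMRs) are shrunk toward the overall level in proportion to how little data a tract has. The sketch below uses a crude empirical-Bayes moment match on simulated tracts and omits the spatial correlation structure of the paper's models.

    import numpy as np

    rng = np.random.default_rng(8)
    n_tracts = 200
    expected = rng.uniform(0.5, 30.0, n_tracts)    # expected deaths E_i per tract
    true_rr = rng.gamma(shape=10.0, scale=0.1, size=n_tracts)  # true relative risks
    observed = rng.poisson(true_rr * expected)

    # Method-of-moments fit of a Gamma(nu, nu) prior on the relative risk:
    # subtract an estimate of the Poisson noise from the raw SMR variance.
    smr = observed / expected
    nu = smr.mean() ** 2 / max(smr.var() - (smr / expected).mean(), 1e-6)

    # Conjugate posterior mean shrinks each raw SMR toward 1 by data volume.
    post_rr = (observed + nu) / (expected + nu)
    print(f"raw SMR range:      {smr.min():.2f} - {smr.max():.2f}")
    print(f"smoothed RR range:  {post_rr.min():.2f} - {post_rr.max():.2f}")

The smoothed range is visibly narrower than the raw one: tracts with few expected deaths are pulled strongly toward the mean, which is exactly the variability control the abstract's Bayesian models provide (with spatial neighbors replacing the global pool).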
Slater, Hannah; Michael, Edwin
2013-01-01
There is increasing interest to control or eradicate the major neglected tropical diseases. Accurate modelling of the geographic distributions of parasitic infections will be crucial to this endeavour. We used 664 community level infection prevalence data collated from the published literature in conjunction with eight environmental variables, altitude and population density, and a multivariate Bayesian generalized linear spatial model that allows explicit accounting for spatial autocorrelation and incorporation of uncertainty in input data and model parameters, to construct the first spatially-explicit map describing LF prevalence distribution in Africa. We also ran the best-fit model against predictions made by the HADCM3 and CCCMA climate models for 2050 to predict the likely distributions of LF under future climate and population changes. We show that LF prevalence is strongly influenced by spatial autocorrelation between locations but is only weakly associated with environmental covariates. Infection prevalence, however, is found to be related to variations in population density. All associations with key environmental/demographic variables appear to be complex and non-linear. LF prevalence is predicted to be highly heterogenous across Africa, with high prevalences (>20%) estimated to occur primarily along coastal West and East Africa, and lowest prevalences predicted for the central part of the continent. Error maps, however, indicate a need for further surveys to overcome problems with data scarcity in the latter and other regions. Analysis of future changes in prevalence indicates that population growth rather than climate change per se will represent the dominant factor in the predicted increase/decrease and spread of LF on the continent. We indicate that these results could play an important role in aiding the development of strategies that are best able to achieve the goals of parasite elimination locally and globally in a manner that may also account
Directory of Open Access Journals (Sweden)
Krzysztof Tomanek
2014-05-01
The purpose of this article is to present basic methods for classifying text data. These methods draw on achievements in areas such as natural language processing and the analysis of unstructured data. I introduce and compare two analytical techniques applied to text data. The first makes use of a thematic vocabulary tool (sentiment analysis). The second uses the idea of Bayesian classification and applies the so-called naive Bayes algorithm. My comparison concerns the efficiency of these two analytical techniques. I emphasize the solutions needed to build a dictionary accurate enough for the task of text classification. Then, I compare the effectiveness of supervised classification with that of automated unsupervised analysis. These results reinforce the conclusion that a dictionary which has received a good evaluation as a classification tool should still be subjected to review and modification procedures if it is to be applied to new empirical material. In the approach I propose, the adaptation procedures applied to an analytical dictionary become the basic step in the methodology of textual data analysis.
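A minimal sketch of the naive Bayes classification step described above, using scikit-learn; the documents and labels are invented placeholders rather than the author's empirical material.

```python
# Naive Bayes text classification: bag-of-words counts feed a
# multinomial naive Bayes classifier.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

docs = ["great product, works well", "terrible service, very slow",
        "excellent support and quality", "awful experience, broken on arrival"]
labels = ["pos", "neg", "pos", "neg"]

clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(docs, labels)
print(clf.predict(["slow and broken"]))   # -> ['neg']
```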
Analysis of Coherence Properties of 3-rd Generation Synchrotron Sources and Free-Electron Lasers
Vartanyants, I A
2009-01-01
A general theoretical approach based on the results of statistical optics is used for the analysis of the transverse coherence properties of 3-rd generation synchrotron sources and x-ray free-electron lasers (XFEL). Correlation properties of the wavefields are calculated at different distances from an equivalent Gaussian Schell-model source. This model is used to describe coherence properties of the five meter undulator source at the synchrotron storage ring PETRA III. In the case of XFEL sources the decomposition of the statistical fields into a sum of independently propagating transverse modes is used for the analysis of the coherence properties of these new sources. A detailed calculation is performed for the parameters of the SASE1 undulator at the European XFEL. It is demonstrated that only a few modes contribute significantly to the total radiation field of that source.
A Bayesian analysis of rare B decays with advanced Monte Carlo methods
Energy Technology Data Exchange (ETDEWEB)
Beaujean, Frederik
2012-11-12
Searching for new physics in rare B meson decays governed by b → s transitions, we perform a model-independent global fit of the short-distance couplings C7, C9, and C10 of the ΔB=1 effective field theory. We assume the standard-model set of b → sγ and b → sl+l- operators with real-valued Ci. A total of 59 measurements by the experiments BaBar, Belle, CDF, CLEO, and LHCb of observables in B→K*γ, B→K(*)l+l-, and Bs→μ+μ- decays are used in the fit. Our analysis is the first of its kind to harness the full power of the Bayesian approach to probability theory. All main sources of theory uncertainty explicitly enter the fit in the form of nuisance parameters. We make optimal use of the experimental information to simultaneously constrain the Wilson coefficients as well as hadronic form factors, the dominant theory uncertainty. Generating samples from the posterior probability distribution to compute marginal distributions and predict observables by uncertainty propagation is a formidable numerical challenge for two reasons. First, the posterior has multiple well-separated maxima and degeneracies. Second, the computation of the theory predictions is very time consuming: a single posterior evaluation requires O(1 s), and a few million evaluations are needed. Population Monte Carlo (PMC) provides a solution to both issues; a mixture density is iteratively adapted to the posterior, and samples are drawn in a massively parallel way using importance sampling. The major shortcoming of PMC is the need for cogent knowledge of the posterior at the initial stage. In an effort towards a general black-box Monte Carlo sampling algorithm, we present a new method to extract the necessary information in a reliable and automatic manner from Markov chains with the help of hierarchical clustering. Exploiting the latest 2012 measurements, the fit reveals a flipped-sign solution in addition to a standard-model-like solution for the couplings Ci. The two solutions are related
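The following toy illustrates the PMC idea described above: a Gaussian-mixture proposal is iteratively adapted to a bimodal one-dimensional "posterior" via importance-weighted EM-style updates. It is a deliberately simplified stand-in, not the analysis code of the thesis; the target density and all tuning choices are invented.

```python
# Toy population Monte Carlo (adaptive importance sampling) on a
# bimodal 1-D target with two well-separated modes.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def log_post(x):                      # invented bimodal target density
    return np.logaddexp(stats.norm.logpdf(x, -3, 0.5),
                        stats.norm.logpdf(x, 4, 0.8)) - np.log(2)

means, sds, wts = np.array([-1.0, 1.0]), np.array([2.0, 2.0]), np.array([0.5, 0.5])
for step in range(10):
    comp = rng.choice(2, size=2000, p=wts)             # sample the mixture proposal
    x = rng.normal(means[comp], sds[comp])
    log_q = np.logaddexp(np.log(wts[0]) + stats.norm.logpdf(x, means[0], sds[0]),
                         np.log(wts[1]) + stats.norm.logpdf(x, means[1], sds[1]))
    lw = log_post(x) - log_q                           # log importance weights
    w = np.exp(lw - lw.max()); w /= w.sum()
    for c in range(2):                                 # importance-weighted EM update
        r = np.exp(np.log(wts[c]) + stats.norm.logpdf(x, means[c], sds[c]) - log_q)
        rc = w * r                                     # weighted responsibilities
        means[c] = (rc * x).sum() / rc.sum()
        sds[c] = np.sqrt((rc * (x - means[c])**2).sum() / rc.sum())
        wts[c] = rc.sum()
    wts /= wts.sum()

print(means, sds, wts)   # components should now sit near the two modes
```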
Bayesian analysis of spatial point processes in the neighbourhood of Voronoi networks
DEFF Research Database (Denmark)
Skare, Øivind; Møller, Jesper; Jensen, Eva B. Vedel
2007-01-01
A model for an inhomogeneous Poisson process with high intensity near the edges of a Voronoi tessellation in 2D or 3D is proposed. The model is analysed in a Bayesian setting with priors on nuclei of the Voronoi tessellation and other model parameters. An MCMC algorithm is constructed to sample...
Another look at Bayesian analysis of AMMI models for genotype-environment data
Josse, J.; Eeuwijk, van F.A.; Piepho, H.P.; Denis, J.B.
2014-01-01
Linear–bilinear models are frequently used to analyze two-way data such as genotype-by-environment data. A well-known example of this class of models is the additive main effects and multiplicative interaction effects model (AMMI). We propose a new Bayesian treatment of such models offering a proper
Bayesian analysis of censored response data in family-based genetic association studies.
Del Greco M, Fabiola; Pattaro, Cristian; Minelli, Cosetta; Thompson, John R
2016-09-01
Biomarkers are subject to censoring whenever some measurements are not quantifiable given a laboratory detection limit. Methods for handling censoring have received less attention in genetic epidemiology, and censored data are still often replaced with a fixed value. We compared different strategies for handling a left-censored continuous biomarker in a family-based study, where the biomarker is tested for association with a genetic variant, S, adjusting for a covariate, X. Allowing different correlations between X and S, we compared simple substitution of censored observations with the detection limit followed by a linear mixed effect model (LMM), a Bayesian model with noninformative priors, a Tobit model with robust standard errors, and multiple imputation (MI), with and without S in the imputation, followed by an LMM. Our comparison was based on real and simulated data in which 20% and 40% censoring were artificially induced. The complete data were also analyzed with an LMM. In the MICROS study, the Bayesian model gave results closer to those obtained with the complete data. In the simulations, simple substitution was always the most biased method; the Tobit approach gave the least biased estimates at all censoring levels and correlation values; and the Bayesian model and both MI approaches gave slightly biased estimates but smaller root mean square errors. On the basis of these results, the Bayesian approach is highly recommended for candidate gene studies; however, the computationally simpler Tobit and the MI without S are both good options for genome-wide studies.
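A minimal sketch of the left-censored normal likelihood that underlies the Tobit approach compared above: fully observed values contribute the normal density, while values below the detection limit contribute the normal CDF evaluated at the limit. The simulated data here have no covariates or family structure, unlike the study.

```python
# Maximum likelihood for a left-censored normal sample.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(2)
DL = 1.0                                   # detection limit (assumed)
z = rng.normal(1.5, 1.0, 500)              # latent biomarker values
obs = np.maximum(z, DL)                    # left-censored observations
cens = z < DL

def neg_loglik(theta):
    mu, log_sd = theta
    sd = np.exp(log_sd)                    # log-parameterised to keep sd > 0
    ll = stats.norm.logpdf(obs[~cens], mu, sd).sum()
    ll += cens.sum() * stats.norm.logcdf((DL - mu) / sd)
    return -ll

fit = optimize.minimize(neg_loglik, x0=np.array([0.0, 0.0]))
print(fit.x[0], np.exp(fit.x[1]))          # near (1.5, 1.0), unlike substitution
```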
How few countries will do? Comparative survey analysis from a Bayesian perspective
Hox, Joop; van de Schoot, Rens; Matthijsse, Suzette
2012-01-01
Meuleman and Billiet (2009) have carried out a simulation study aimed at the question how many countries are needed for accurate multilevel SEM estimation in comparative studies. The authors concluded that a sample of 50 to 100 countries is needed for accurate estimation. Recently, Bayesian estimati
Hierarchical Bayesian Analysis of Biased Beliefs and Distributional Other-Regarding Preferences
Directory of Open Access Journals (Sweden)
Jeroen Weesie
2013-02-01
This study investigates the relationship between an actor's beliefs about others' other-regarding (social) preferences and her own other-regarding preferences, using an "avant-garde" hierarchical Bayesian method. We estimate two distributional other-regarding preference parameters, α and β, of actors using incentivized choice data in binary Dictator Games. Simultaneously, we estimate the distribution of actors' beliefs about others' α and β, conditional on actors' own α and β, with incentivized belief elicitation. We demonstrate the benefits of the Bayesian method compared with its hierarchical frequentist counterparts. Results show a positive association between an actor's own (α, β) and her beliefs about the average (α, β) in the population. The association between own preferences and the variance in beliefs about others' preferences in the population, however, is curvilinear for α and insignificant for β. These results are partially consistent with the cone effect [1,2], which is described in detail below. Because beliefs and own preferences are assumed to be independent in the Bayesian-Nash equilibrium concept, these results cast doubt on the application of that concept to experimental data.
Coherent-state analysis of the quantum bouncing ball
Mather, William H.; Fox, Ronald F.
2006-03-01
Gaussian-Klauder coherent states are applied to the bound "quantum bouncer," a gravitating particle above an infinite potential boundary. These Gaussian-Klauder states, originally devised for Rydberg atoms, provide an overcomplete set of wave functions that mimic classical trajectories for extended times through the use of energy localization. For the quantum bouncer, analytic methods are applied to compute first and second moments of the position and momentum operators, and from these results at least two scalings of the Gaussian-Klauder parameters are highlighted, one of which tends to remain localized for markedly more bounces than comparable states that are Gaussian in position (by an order of magnitude in some cases). We close with a connection that compares Gaussian-Klauder states and positional Gaussian states directly for the quantum bouncer, relating the two through a known energy-position duality of Airy functions. Our results, taken together, ultimately reemphasize the primacy of energy localization as a key ingredient for long-lived classical correspondence in systems with smooth spectra.
Bayesian Inference on Gravitational Waves
Directory of Open Access Journals (Sweden)
Asad Ali
2015-12-01
The Bayesian approach is becoming increasingly popular among the astrophysics data analysis communities. However, the Pakistani statistics community is largely unaware of this fertile interaction between the two disciplines. Bayesian methods have been used to address astronomical problems since the very birth of Bayesian probability in the eighteenth century. Today, Bayesian methods for the detection and parameter estimation of gravitational waves have solid theoretical grounds and strong promise for realistic applications. This article aims to introduce the Pakistani statistics community to the applications of Bayesian Monte Carlo methods in the analysis of gravitational wave data, with an overview of Bayesian signal detection and estimation methods and a demonstration by a couple of simplified examples.
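In the same spirit as the simplified examples the article mentions, the sketch below computes a grid-based Bayesian posterior for the amplitude of a known waveform buried in Gaussian noise, a toy stand-in for a template-based gravitational-wave search; the waveform, noise level and flat prior are all invented.

```python
# Grid posterior for a signal amplitude: p(A | d) ∝ exp(-0.5 Σ (d - A s)² / σ²)
# under a flat prior on A.
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 500)
template = np.sin(2 * np.pi * 30 * t)              # known signal shape (assumed)
SIGMA, A_TRUE = 1.0, 0.3
d = A_TRUE * template + rng.normal(0, SIGMA, t.size)

A = np.linspace(-1, 1, 2001)                       # flat prior over this grid
loglike = np.array([-0.5 * np.sum((d - a * template)**2) / SIGMA**2 for a in A])
post = np.exp(loglike - loglike.max())
dA = A[1] - A[0]
post /= post.sum() * dA                            # normalise the posterior density

mean = (A * post).sum() * dA
sd = np.sqrt(((A - mean)**2 * post).sum() * dA)
print(f"A = {mean:.3f} +/- {sd:.3f}")              # interval should bracket 0.3
```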
Analysis of dental abfractions by optical coherence tomography
Demjan, Enikö; Mărcăuţeanu, Corina; Bratu, Dorin; Sinescu, Cosmin; Negruţiu, Meda; Ionita, Ciprian; Topală, Florin; Hughes, Michael; Bradu, Adrian; Dobre, George; Podoleanu, Adrian Gh.
2010-02-01
Aim and objectives. Abfraction is the pathological loss of cervical hard tooth substance caused by biomechanical overload. High horizontal occlusal forces result in large stress concentrations in the cervical region of the teeth. These stresses may be high enough to cause microfractures in the dental hard tissues, eventually resulting in the loss of cervical enamel and dentin. The present study proposes the microstructural characterization of these cervical lesions by en face optical coherence tomography (eFOCT). Material and methods: 31 extracted bicuspids were investigated using eFOCT. 24 teeth derived from patients with active bruxism and occlusal interferences; they presented deep buccal abfractions and variable degrees of occlusal pathological attrition. The other 7 bicuspids were not exposed to occlusal overload and had a normal morphology of the dental crowns. The dental samples were investigated using an eFOCT system operating at 1300 nm (B-scan at 1 Hz and C-scan mode at 2 Hz). The system has a lateral resolution better than 5 μm and a depth resolution of 9 μm in tissue. OCT images were further compared with micro-computed tomography images. Results. The eFOCT investigation of bicuspids with a normal morphology revealed a homogeneous structure of the buccal cervical enamel. The C-scan and B-scan images obtained from the occlusally overloaded bicuspids visualized the wedge-shaped loss of cervical enamel and damage in the microstructure of the underlying dentin. The high occlusal forces produced a characteristic pattern of large cracks, which reached the tooth surface. Conclusions: eFOCT is a promising imaging method for dental abfractions and it may offer some insight into the etiological mechanism of these noncarious cervical lesions.
Approaches to verbal, visual and musical creativity by EEG coherence analysis.
Petsche, H
1996-11-01
Our approach to coherence analysis of the ongoing EEG yields data on the cooperation between all possible electrode sites of the 10/20 system. This procedure compares epochs during mental activity with epochs of EEG at rest and takes into account only significant differences, which are represented as lines between the respective electrodes on maps of the brain surface. The patterns thus obtained reflect the temporal average of the changes in the overall coherence pattern caused by a given mental task. They are interpreted as reflecting the differential attention required for the achievement of the mental task in question. Acts of creative thinking, whether verbal, visual or musical, are characterized by more coherence increases between occipital and frontopolar electrode sites than any other mental tasks. The results are interpreted in terms of a stronger involvement of long cortico-cortical fibre systems in creative tasks. PMID:8978440
DEFF Research Database (Denmark)
Hahn, GH; Christensen, KB; Leung, TS;
2010-01-01
Coherence between spontaneous fluctuations in arterial blood pressure (ABP) and the cerebral near-infrared spectroscopy signal can detect cerebral autoregulation. Because reliable measurement depends on signals with a high signal-to-noise ratio, we hypothesized that coherence is more precisely … for the ABP variability among repeated measurements (i.e., weighting measurements with high ABP variability in favor of those with low) improved the precision. The evidence of drift in individual infants was weak. The minimum monitoring time needed to discriminate among infants was 1.3–3.7 h. Coherence analysis in low … frequencies (0.04–0.1 Hz) had higher precision and more statistical power than in very low frequencies (0.003–0.04 Hz). In conclusion, reliable detection of cerebral autoregulation takes hours, and the precision is improved by adjusting for ABP variability between repeated measurements.
Greenacre, Michael; Lewi, Paul
2005-01-01
We consider two fundamental properties in the analysis of two-way tables of positive data: the principle of distributional equivalence, one of the cornerstones of correspondence analysis of contingency tables, and the principle of subcompositional coherence, which forms the basis of compositional data analysis. For an analysis to be subcompositionally coherent, it suffices to analyse the ratios of the data values. The usual approach to dimension reduction in compositional data analysis is to ...
A Bayesian ridge regression analysis of congestion's impact on urban expressway safety.
Shi, Qi; Abdel-Aty, Mohamed; Lee, Jaeyoung
2016-03-01
With the rapid growth of traffic in urban areas, concerns about congestion and traffic safety have been heightened. This study leveraged both the Automatic Vehicle Identification (AVI) system and the Microwave Vehicle Detection System (MVDS) installed on an expressway in Central Florida to explore how congestion impacts crash occurrence in urban areas. Multiple congestion measures from the two systems were developed. To ensure more precise estimates of congestion's effects, the traffic data were aggregated into peak and non-peak hours. Multicollinearity among the traffic parameters was examined; the results showed the presence of multicollinearity, especially during peak hours. In response, ridge regression was introduced to cope with this issue. Poisson models with uncorrelated random effects, correlated random effects, and both correlated random effects and random parameters were constructed within the Bayesian framework. Correlated random effects were shown to significantly enhance model performance. The random parameters model has similar goodness-of-fit compared with the model with only correlated random effects; however, by accounting for unobserved heterogeneity, more variables were found to be significantly related to crash frequency. The models indicated that congestion increased crash frequency during peak hours, while during non-peak hours it was not a major contributing factor to crashes. Using the random parameters model, the three congestion measures were compared. All congestion indicators had similar effects, while the Congestion Index (CI) derived from MVDS data was a better congestion indicator for safety analysis. Analyses also showed that segments with higher congestion intensity could increase not only property damage only (PDO) crashes, but also more severe crashes. In addition, the issues regarding the necessity to incorporate a specific congestion indicator for congestion's effects on safety and to take care of the
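A minimal sketch of ridge-style Bayesian shrinkage under multicollinearity, the situation that motivated the ridge treatment above, using scikit-learn's BayesianRidge. The simulated response is Gaussian rather than the Poisson random-effects crash models of the study, and all variables are invented.

```python
# Bayesian ridge regression on deliberately collinear predictors.
import numpy as np
from sklearn.linear_model import BayesianRidge, LinearRegression

rng = np.random.default_rng(4)
n = 300
speed = rng.normal(60, 10, n)
congestion = 100 - speed + rng.normal(0, 1, n)     # nearly collinear with speed
X = np.column_stack([speed, congestion])
y = 5 - 0.05 * speed + 0.08 * congestion + rng.normal(0, 1, n)

print(LinearRegression().fit(X, y).coef_)   # unstable under collinearity
print(BayesianRidge().fit(X, y).coef_)      # shrunk, more stable estimates
```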
Apples and oranges: avoiding different priors in Bayesian DNA sequence analysis
Directory of Open Access Journals (Sweden)
Posch Stefan
2010-03-01
Background: One of the challenges of bioinformatics remains the recognition of short signal sequences in genomic DNA, such as donor or acceptor splice sites, splicing enhancers or silencers, translation initiation sites, transcription start sites, transcription factor binding sites, nucleosome binding sites, miRNA binding sites, or insulator binding sites. During the last decade, a wealth of algorithms for the recognition of such DNA sequences has been developed and compared, with the goal of improving their performance and deepening our understanding of the underlying cellular processes. Most of these algorithms are based on statistical models belonging to the family of Markov random fields, such as position weight matrix models, weight array matrix models, Markov models of higher order, or moral Bayesian networks. While many comparative studies have compared different learning principles or different statistical models, the influence of choosing different prior distributions for the model parameters when using different learning principles has been overlooked, possibly leading to questionable conclusions. Results: With the goal of allowing direct comparisons of different learning principles for models from the family of Markov random fields based on the same a priori information, we derive a generalization of the commonly used product-Dirichlet prior. We find that the derived prior behaves like a Gaussian prior close to the maximum and like a Laplace prior in the far tails. In two case studies, we illustrate the utility of the derived prior for a direct comparison of different learning principles with different models for the recognition of binding sites of the transcription factor Sp1 and human donor splice sites. Conclusions: We find that comparisons of different learning principles using the same a priori information can lead to conclusions different from those of previous studies, in which the effect resulting from different
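As a small illustration of what a (product-)Dirichlet prior does for a position weight matrix, the sketch below turns toy binding-site counts into posterior-mean base probabilities via pseudocounts; the sites are invented and the generalized prior derived in the paper is not reproduced here.

```python
# Position weight matrix with symmetric Dirichlet pseudocounts:
# posterior mean of each column is (counts + alpha) / (total + 4*alpha).
import numpy as np

sites = ["GGGCGG", "GGGCGG", "GGGAGG", "GAGCGG", "GGGCGT"]  # toy Sp1-like sites
bases = "ACGT"
alpha = 0.5                                  # assumed Dirichlet pseudocount

counts = np.zeros((len(sites[0]), 4))
for s in sites:
    for pos, b in enumerate(s):
        counts[pos, bases.index(b)] += 1

pwm = (counts + alpha) / (counts.sum(axis=1, keepdims=True) + 4 * alpha)
print(np.round(pwm, 2))                      # rows: positions, cols: A, C, G, T
```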
Bayesian Analysis of HMI Images and Comparison to TSI Variations and MWO Image Observables
Parker, D. G.; Ulrich, R. K.; Beck, J.; Tran, T. V.
2015-12-01
We have previously applied the Bayesian automatic classification system AutoClass to solar magnetogram and intensity images from the 150 Foot Solar Tower at Mount Wilson to identify classes of solar surface features associated with variations in total solar irradiance (TSI) and, using those identifications, modeled TSI time series with improved accuracy (r > 0.96) (Ulrich et al., 2010). AutoClass identifies classes by a two-step process in which it: (1) finds, without human supervision, a set of class definitions based on specified attributes of a sample of the image data pixels, such as magnetic field and intensity in the case of MWO images, and (2) applies the class definitions thus found to new data sets to identify automatically in them the classes found in the sample set. HMI high-resolution images capture four observables (magnetic field, continuum intensity, line depth and line width), in contrast to MWO's two observables (magnetic field and intensity). In this study, we apply AutoClass to the HMI observables for images from June 2010 to December 2014 to identify solar surface feature classes. We use contemporaneous TSI measurements to determine whether and how variations in the HMI classes are related to TSI variations, and compare the characteristic statistics of the HMI classes to those found from MWO images. We also attempt to derive scale factors between the HMI and MWO magnetic and intensity observables. The ability to categorize automatically surface features in the HMI images holds out the promise of consistent, relatively quick and manageable analysis of the large quantity of data available in these images. Given that the classes found in MWO images using AutoClass have been found to improve modeling of TSI, application of AutoClass to the more complex HMI images should enhance understanding of the physical processes at work in solar surface features and their implications for the solar-terrestrial environment. Ulrich, R.K., Parker, D, Bertello, L. and
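AutoClass itself is a specific Bayesian classification system; as a rough stand-in for its two-step use described above, this sketch fits a Gaussian mixture to synthetic two-observable pixels (field strength, intensity) and then applies the learned classes to new data. All numbers are invented.

```python
# Unsupervised class discovery plus reuse of the learned classes,
# via a Gaussian mixture model.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
quiet = rng.normal([0, 1.0], [5, 0.02], size=(2000, 2))     # field (G), intensity
spot = rng.normal([800, 0.7], [100, 0.05], size=(200, 2))   # sunspot-like pixels
pixels = np.vstack([quiet, spot])

# step 1: learn class definitions from a sample, without supervision
gmm = GaussianMixture(n_components=2, random_state=0).fit(pixels)

# step 2: apply the learned classes to new pixels
new_pixels = rng.normal([790, 0.72], [90, 0.05], size=(5, 2))
print(gmm.predict(new_pixels))              # new pixels assigned to a class
```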
Takamizawa, Hisashi; Itoh, Hiroto; Nishiyama, Yutaka
2016-10-01
In order to understand neutron irradiation embrittlement in high-fluence regions, statistical analysis using the Bayesian nonparametric (BNP) method was performed on the Japanese surveillance and material test reactor (MTR) irradiation database. The BNP method is essentially expressed as an infinite summation of normal distributions, with input data subdivided into clusters sharing identical statistical parameters, such as mean and standard deviation, for each cluster, to estimate shifts in the ductile-to-brittle transition temperature (DBTT). The clusters typically depend on chemical compositions, irradiation conditions, and the irradiation embrittlement. Specific variables contributing to the irradiation embrittlement include the content of Cu, Ni, P, Si, and Mn in the pressure vessel steels, neutron flux, neutron fluence, and irradiation temperature. It was found that the measured shifts of DBTT correlated well with the calculated ones. Data associated with the same materials were subdivided into the same clusters even when neutron fluences increased. Comparing cluster IDs 2 and 6, embrittlement of high-Cu-bearing materials (…). A flux effect at a higher flux range was demonstrated for cluster ID 3, comprising MTR irradiation in a high-flux region (≤1 × 10^13 n/cm^2/s) [44]. For cluster ID 10, the classification reflects a flux effect whereby embrittlement is accelerated in high-Cu-bearing materials irradiated at lower flux levels (less than 5 × 10^9 n/cm^2·s), possibly due to increased thermal-equilibrium vacancies [44,45]. From all the above considerations, it was ascertained that data belonging to identical cluster IDs share similar embrittlement mechanisms. To examine the clustering versus neutron fluence, the relationship between the representative Cu content of the materials and the fluence for PWR and MTR irradiation was plotted in Fig. 13(a) and (b). To enhance plot clarity, data points were slightly
DEFF Research Database (Denmark)
Strathe, Anders Bjerring; Jørgensen, Henry; Kebreab, E;
2012-01-01
The objective of the current study was to develop Bayesian simultaneous equation models for modelling energy intake and partitioning in growing pigs. A key feature of the Bayesian approach is that parameters are assigned prior distributions, which may reflect the current state … of nature. In the models, rates of metabolizable energy (ME) intake, protein deposition (PD) and lipid deposition (LD) were treated as dependent variables, accounting for correlated residuals. Two complementary equation systems were used to model ME intake (MEI), PD and LD. Informative priors were … genders (barrows, boars and gilts) selected on the basis of similar birth weight. The pigs were fed four diets based on barley, wheat and soybean meal supplemented with crystalline amino acids to meet or exceed Danish nutrient requirement standards. Nutrient balances and gas exchanges were measured at c…
Petit, V.; Wade, G. A.
2011-01-01
In this paper we describe a Bayesian statistical method designed to infer the magnetic properties of stars observed using high-resolution circular spectropolarimetry in the context of large surveys. This approach is well suited for analysing stars for which the stellar rotation period is not known, and therefore the rotational phases of the observations are ambiguous. The model assumes that the magnetic observations correspond to a dipole oblique rotator, a situation commonly encountered in intermediate and high-mass stars. Using reasonable assumptions regarding the model parameter prior probability density distributions, the Bayesian algorithm determines the posterior probability densities corresponding to the surface magnetic field geometry and strength by performing a comparison between the observed and computed Stokes V profiles. Based on the results of numerical simulations, we conclude that this method yields a useful estimate of the surface dipole field strength based on a small number (i.e. 1 or 2) of...
A Bayesian stochastic frontier analysis of Chinese fossil-fuel electricity generation companies
International Nuclear Information System (INIS)
This paper analyses the technical efficiency of Chinese fossil-fuel electricity generation companies from 1999 to 2011, using a Bayesian stochastic frontier model. The results reveal that efficiency varies among the fossil-fuel electricity generation companies that were analysed. We also focus on the factors of size, location, government ownership and mixed sources of electricity generation for the fossil-fuel electricity generation companies, and also examine their effects on the efficiency of these companies. Policy implications are derived. - Highlights: • We analyze the efficiency of 27 quoted Chinese fossil-fuel electricity generation companies during 1999–2011. • We adopt a Bayesian stochastic frontier model taking into consideration the identified heterogeneity. • With reform background in Chinese energy industry, we propose four hypotheses and check their influence on efficiency. • Big size, coastal location, government control and hydro energy sources all have increased costs
PARALLEL ADAPTIVE MULTILEVEL SAMPLING ALGORITHMS FOR THE BAYESIAN ANALYSIS OF MATHEMATICAL MODELS
Prudencio, Ernesto
2012-01-01
In recent years, Bayesian model updating techniques based on measured data have been applied to many engineering and applied science problems. At the same time, parallel computational platforms are becoming increasingly more powerful and are being used more frequently by the engineering and scientific communities. Bayesian techniques usually require the evaluation of multi-dimensional integrals related to the posterior probability density function (PDF) of uncertain model parameters. The fact that such integrals cannot be computed analytically motivates the research of stochastic simulation methods for sampling posterior PDFs. One such algorithm is the adaptive multilevel stochastic simulation algorithm (AMSSA). In this paper we discuss the parallelization of AMSSA, formulating the necessary load balancing step as a binary integer programming problem. We present a variety of results showing the effectiveness of load balancing on the overall performance of AMSSA in a parallel computational environment.
Muhammad eZubair
2014-01-01
Investigations of nuclear accidents reveal that an accumulation of various technical and nontechnical lapses compounded the disasters. Using the Analytic Hierarchy Process (AHP) and Bayesian Networks (BN), the present research examines the technical and nontechnical issues of nuclear accidents. The study showed that besides technical fixes, such as enhanced engineering safety features and better siting choices, the critical ingredient for safe operation of nuclear reactors lie i...
A Bayesian Analysis of GPS Guidance System in Precision Agriculture: The Role of Expectations
Khanal, Aditya R; Mishra, Ashok K.; Lambert, Dayton M.; Paudel, Krishna P.
2013-01-01
Farmers' post-adoption responses to a technology are important for its continuation and diffusion in precision agriculture. We studied farmers' decisions about the frequency of application of GPS guidance systems after adoption. Using a cotton growers' precision farming survey in the U.S. and Bayesian approaches, our study suggests that "meeting expectations" plays an important positive role. A farmer's income level, farm size, and farming occupation are other important factors in modeling GPS...
Salamonson, Yenna; Ramjan, Lucie M; van den Nieuwenhuizen, Simon; Metcalfe, Lauren; Chang, Sungwon; Everett, Bronwyn
2016-03-01
This paper examines the relationship between nursing students' sense of coherence, self-regulated learning and academic performance in bioscience. While there is increasing recognition of a need to foster students' self-regulated learning, little is known about the relationship between psychological strengths, particularly sense of coherence, and academic performance. Using a prospective, correlational design, 563 first-year nursing students completed the three dimensions of the sense of coherence scale (comprehensibility, manageability and meaningfulness) and five components of self-regulated learning strategy (elaboration, organisation, rehearsal, self-efficacy and task value). Cluster analysis was used to group respondents into three clusters, based on their sense of coherence subscale scores. Although there were no sociodemographic differences in sense of coherence subscale scores, those with a higher sense of coherence were more likely to adopt self-regulated learning strategies. Furthermore, academic grades collected at the end of the semester revealed that a higher sense of coherence was consistently related to achieving higher academic grades across all four units of study. Students with a higher sense of coherence were more self-regulated in their learning approach. More importantly, the study suggests that sense of coherence may be an explanatory factor for students' successful adaptation and transition in higher education, as indicated by the positive relationship of sense of coherence to academic performance. PMID:26804936
Sankararaman, Shankar
2016-01-01
This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis under the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all at once. Instead, only one variable is first identified, and then Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all four variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and then applied to the NASA Langley Uncertainty Quantification Challenge problem.
Wavelet coherence analysis of prefrontal oxygenation signals in elderly subjects with hypertension
International Nuclear Information System (INIS)
This study aims to assess prefrontal functional connectivity in elderly subjects with hypertension during the resting state, using wavelet coherence analysis of changes in prefrontal tissue oxyhaemoglobin concentration (Δ[HbO2]) signals measured by near-infrared spectroscopy (NIRS). Continuous recordings of NIRS signals were obtained from the left and right prefrontal lobes in 24 elderly subjects with hypertension (age: 70.7 ± 8.4 years) and 26 elderly normotensive subjects (age: 70.6 ± 7.9 years) during the resting state. The coherence between the left and right prefrontal oscillations in four frequency intervals (I, 0.4 Hz to 2 Hz; II, 0.15 Hz to 0.4 Hz; III, 0.05 Hz to 0.15 Hz; and IV, 0.02 Hz to 0.05 Hz) was analyzed using the wavelet coherence method. The Δ[HbO2] oscillations showed significant wavelet coherence (WCO) in intervals I and III, and significant wavelet phase coherence (WPCO) in intervals I to IV. Remarkably, in elderly subjects with hypertension, the WCO and WPCO in interval III between the left and right prefrontal regions were significantly lower than in healthy elderly subjects (p = 0.014 for WCO, p = 0.007 for WPCO). The lower coherence in interval III indicates decreased synchronization of neural control between the left and right prefrontal regions in elderly subjects with hypertension. This might suggest weakened brain functional connectivity in elderly subjects with hypertension. (paper)
Bayesian models a statistical primer for ecologists
Hobbs, N Thompson
2015-01-01
Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods, in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probabili
Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano
2015-04-01
A critical point in the analysis of ground displacement time series, such as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data-driven methods that allow one to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies: PCA does not perform well in solving the so-called Blind Source Separation (BSS) problem. Recovering and separating the different sources that generate the observed ground deformation is a fundamental task in order to assign a physical meaning to the possible sources. PCA fails at the BSS problem because it looks for a new Euclidean space where the projected data are uncorrelated, and the uncorrelatedness condition is usually not strong enough; it has been shown that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the displacement time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mix of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources
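A minimal sketch of the BSS point made above: PCA decorrelates mixed signals but does not separate the sources, whereas ICA recovers them. scikit-learn's FastICA, a fixed-point ICA, is used here as a stand-in for the variational Bayesian vbICA of the study; the sources and mixing matrix are invented.

```python
# PCA vs ICA on two linearly mixed synthetic sources.
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(6)
t = np.linspace(0, 10, 2000)
s1 = np.sign(np.sin(3 * t))                  # square-wave source
s2 = np.sin(7 * t)                           # sinusoidal source
S = np.column_stack([s1, s2])
A = np.array([[1.0, 0.6], [0.4, 1.0]])       # mixing matrix (assumed)
X = S @ A.T                                  # observed mixtures

S_pca = PCA(n_components=2).fit_transform(X)
S_ica = FastICA(n_components=2, random_state=0).fit_transform(X)
for est, name in [(S_pca, "PCA"), (S_ica, "ICA")]:
    # best absolute correlation of each estimated component with a true source
    corr = np.abs(np.corrcoef(est.T, S.T)[:2, 2:])
    print(name, np.round(corr.max(axis=1), 2))   # ICA near 1.0, PCA lower
```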
Gong, Maozhen
Selecting an appropriate prior distribution is a fundamental issue in Bayesian statistics. In this dissertation, under the framework provided by Berger and Bernardo, I derive the reference priors for several models, including: Analysis of Variance (ANOVA)/Analysis of Covariance (ANCOVA) models with a categorical variable under common ordering constraints, and the conditionally autoregressive (CAR) and simultaneous autoregressive (SAR) models with a spatial autoregression parameter ρ. The performance of the reference priors for the ANOVA/ANCOVA models is evaluated by simulation studies, with comparisons to Jeffreys' prior and Least Squares Estimation (LSE). The priors are then illustrated in a Bayesian model of the "Risk of Type 2 Diabetes in New Mexico" data, where the relationship between type 2 diabetes risk (through Hemoglobin A1c) and different smoking levels is investigated. In both the simulation studies and the real data modeling, the reference priors that incorporate internal order information show good performance and can be used as default priors. The reference priors for the CAR and SAR models are also illustrated on the "1999 SAT State Average Verbal Scores" data, with a comparison to a uniform prior distribution. Due to the complexity of the reference priors for both the CAR and SAR models, only a portion (12 states in the Midwest) of the original data set is considered. The reference priors can give a marginal posterior distribution different from that under a uniform prior, which provides an alternative for prior specification for areal data in spatial statistics.
Bhadra, Anindya; Mallick, Bani K
2013-06-01
We describe a Bayesian technique to (a) perform a sparse joint selection of significant predictor variables and significant inverse covariance matrix elements of the response variables in a high-dimensional linear Gaussian sparse seemingly unrelated regression (SSUR) setting and (b) perform an association analysis between the high-dimensional sets of predictors and responses in such a setting. To search the high-dimensional model space, where both the number of predictors and the number of possibly correlated responses can be larger than the sample size, we demonstrate that a marginalization-based collapsed Gibbs sampler, in combination with spike and slab type of priors, offers a computationally feasible and efficient solution. As an example, we apply our method to an expression quantitative trait loci (eQTL) analysis on publicly available single nucleotide polymorphism (SNP) and gene expression data for humans where the primary interest lies in finding the significant associations between the sets of SNPs and possibly correlated genetic transcripts. Our method also allows for inference on the sparse interaction network of the transcripts (response variables) after accounting for the effect of the SNPs (predictor variables). We exploit properties of Gaussian graphical models to make statements concerning conditional independence of the responses. Our method compares favorably to existing Bayesian approaches developed for this purpose. PMID:23607608
Bayesian reasoning with ifs and ands and ors
Cruz, Nicole; Baratgin, Jean; Oaksford, Mike; Over, David E.
2015-01-01
The Bayesian approach to the psychology of reasoning generalizes binary logic, extending the binary concept of consistency to that of coherence, and allowing the study of deductive reasoning from uncertain premises. Studies in judgment and decision making have found that people’s probability judgments can fail to be coherent. We investigated people’s coherence further for judgments about conjunctions, disjunctions and conditionals, and asked whether their coherence would increase when they we...
Wang, Lejun; Lu, Aiyun; Zhang, Shengnian; Niu, Wenxin; Zheng, Fanhui; Gong, Mingxin
2015-03-01
The aim of this study was to examine coherence and phase synchronization between antagonistic elbow muscles and thus to explore the coupling and common neural inputs of antagonistic elbow muscles during sustained submaximal isometric fatiguing contraction. Fifteen healthy male subjects sustained an isometric elbow flexion at 20% of maximal level until exhaustion, while surface electromyographic (sEMG) signals were collected from the biceps brachii (BB) and triceps brachii (TB). The sEMG signals were divided into the first half (stage 1, with minimal fatigue) and the second half (stage 2, with severe fatigue) of the contraction. Coherence and phase synchronization analysis was conducted between the sEMG of BB and TB, and the coherence value and phase synchronization index in the alpha (8-12 Hz), beta (15-35 Hz) and gamma (35-60 Hz) frequency bands were obtained. EMG-EMG coherence and the phase synchronization index in the alpha and beta frequency bands between the antagonistic elbow muscles both increased significantly in stage 2 compared with stage 1. Coupling of EMG activity between the antagonistic muscles thus increased as a result of fatigue caused by the sustained isometric elbow flexion at 20% of maximal level, indicating increased interconnection between synchronized cortical neurons and the motoneuron pools of BB and TB, which may be cortical in origin. This increased coupling may help to maintain the coactivation level so as to ensure joint stability while maintaining the joint force output. PMID:25515087
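A minimal sketch of magnitude-squared coherence between two EMG-like signals, band-averaged over the alpha and beta ranges used above. scipy's Welch-type estimator stands in for the study's exact estimator, and the signals are simulated with a shared 10 Hz drive rather than recorded EMG.

```python
# Band-averaged magnitude-squared coherence between two noisy signals
# that share a common oscillatory input.
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(7)
fs, n = 1000, 20000                          # 1 kHz sampling, 20 s of "EMG"
shared = np.sin(2 * np.pi * 10 * np.arange(n) / fs)   # common 10 Hz drive
emg_bb = shared + rng.normal(0, 1, n)
emg_tb = shared + rng.normal(0, 1, n)

f, Cxy = coherence(emg_bb, emg_tb, fs=fs, nperseg=1024)
for lo, hi, name in [(8, 12, "alpha"), (15, 35, "beta")]:
    band = (f >= lo) & (f <= hi)
    print(name, round(Cxy[band].mean(), 3))  # alpha band elevated in this toy
```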
Wavelet coherency analysis to relate saturated hydraulic properties to soil physical properties
Si, Bing Cheng; Zeleke, Takele B.
2005-11-01
A soil property may be related to another, and the relationship may change with scale and location. Understanding these scale- and location-dependent relationships is important for predicting one soil property from another. The objective of this study is to use wavelet coherency analysis to examine whether the relationships between hydraulic properties and soil physical properties are scale- and location-dependent. Undisturbed cores were collected along a transect from the sandy loam soil of a farm field in northern Saskatchewan, Canada. Saturated hydraulic conductivity (Ks), sand content, and organic carbon content (OC) were measured on these cores, and their relationships as a function of scale and location were analyzed using wavelets. Results indicated that the wavelet coherency between Ks and sand content is significantly different from that of red noise only at scales around 48 m. The cross-wavelet spectrum and wavelet coherency are predominantly in phase, suggesting a positive correlation between Ks and sand. For Ks and OC, significant coherency exists at scales from 30 to 48 m and around 80 m. At the scales of 30-48 m and around 80 m the relationships are predominantly out of phase, suggesting negative correlation. Therefore, the relationships between Ks and sand and between Ks and OC are not only scale-dependent but also location-dependent. This scale and location dependence has important implications for understanding the scaling relationships of Ks with sand and OC and for the prediction of Ks from sand and OC.
Topics in Bayesian statistics and maximum entropy
International Nuclear Information System (INIS)
Notions of Bayesian decision theory and maximum entropy methods are reviewed with particular emphasis on probabilistic inference and Bayesian modeling. The axiomatic approach is considered as the best justification of Bayesian analysis and maximum entropy principle applied in natural sciences. Particular emphasis is put on solving the inverse problem in digital image restoration and Bayesian modeling of neural networks. Further topics addressed briefly include language modeling, neutron scattering, multiuser detection and channel equalization in digital communications, genetic information, and Bayesian court decision-making. (author)
Bayesian survival analysis modeling applied to sensory shelf life of foods
Calle, M. Luz; Hough, Guillermo; Curia, Ana; Gómez, Guadalupe
2006-01-01
Data from sensory shelf-life studies can be analyzed using survival statistical methods. The objective of this research was to introduce Bayesian methodology to sensory shelf-life studies and discuss its advantages in relation to classical (frequentist) methods. A specific algorithm which incorporates the interval-censored data from shelf-life studies is presented. Calculations were applied to whole-fat and fat-free yogurt, each tasted by 80 consumers who answered "yes" or "no" t...
Eadie, Gwendolyn; Harris, William
2016-01-01
We present a hierarchical Bayesian method for estimating the total mass and mass profile of the Milky Way Galaxy. The new hierarchical Bayesian approach further improves the framework presented by Eadie, Harris, & Widrow (2015) and Eadie & Harris (2016) and builds upon the preliminary reports by Eadie et al (2015a,c). The method uses a distribution function $f(\\mathcal{E},L)$ to model the galaxy and kinematic data from satellite objects such as globular clusters to trace the Galaxy's gravitational potential. A major advantage of the method is that it not only includes complete and incomplete data simultaneously in the analysis, but also incorporates measurement uncertainties in a coherent and meaningful way. We first test the hierarchical Bayesian framework, which includes measurement uncertainties, using the same data and power-law model assumed in Eadie & Harris (2016), and find the results are similar but more strongly constrained. Next, we take advantage of the new statistical framework and in...
Arnaud, Nicolas; Barsuglia, Matteo; Bizouard, Marie-Anne; Brisson, Violette; Cavalier, Fabien; Davier, Michel; Hello, Patrice; Kreckelberg, Stephane; Porter, Edward K.
2003-01-01
Network data analysis methods are the only way to properly separate real gravitational wave (GW) transient events from detector noise. They can be divided into two generic classes: the coincidence method and the coherent analysis. The former uses lists of selected events provided by each interferometer belonging to the network and tries to correlate them in time to identify a physical signal. Instead of this binary treatment of detector outputs (signal present or absent), the latter method first merges the interferometer data and looks for a common pattern, consistent with an assumed GW waveform and a given source location in the sky. Thresholds are only applied later, to validate or reject the hypothesis made. As coherent algorithms use more complete information than coincidence methods, they are expected to provide better detection performance, but at a higher computational cost. An efficient filter must yield a good compromise between a low false alarm rate (hence triggering on data at...
Bayesian Approaches to Gamma-Ray Burst Data Analysis
Institute of Scientific and Technical Information of China (English)
卜庆翠; 陈黎; 李兆升; 王德华
2013-01-01
Bayesian methods are based on Bayes' theorem, in which one considers a probability distribution function (pdf) for the unknown parameter θ, p(θ), called the "prior" pdf for θ; p(θ) reflects our knowledge before observation. We let p(θ|D) be the "posterior" pdf of θ (given the data D), which reflects our modified beliefs after incorporating the results of the observation. All Bayesian inferences are based on the posterior distribution. The Bayesian approach is a powerful tool for gamma-ray burst (GRB) data analysis, such as analyzing structure in photon counting data, combining different light curves, determining the distributions of various parameters, testing whether a spectral line is present, and so on. During the last decade, more and more astronomers have realized the potential of the Bayesian approach. Our aim here is not to provide a detailed explanation of Bayesian theory, but rather to show what we have learned from GRB data analysis using Bayesian methods. As a comparison, we first give a brief introduction to the frequentist and Bayesian approaches. Then, we present specific examples of studying various characteristics of GRBs with Bayesian methods. We emphasize the following aspects: using the BIC (Bayesian information criterion) to select models, using BB (Bayesian blocks) analysis to find optimal change-points in GRB light curves, using Bayesian fitting methods to estimate the GRB peak energy, and using these methods to extend the Hubble diagram to very high redshift. The Bayesian approach has two main uses in GRB data analysis: model selection and parameter estimation. To some degree, the Bayesian approach requires a great deal of thought about the given situation to apply sensibly; it therefore demands more technique and experience.
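A minimal sketch of the Bayesian-blocks change-point analysis mentioned above, using astropy's implementation on simulated photon arrival times with a burst-like rate increase; the rates and times are invented, not GRB data.

```python
# Bayesian blocks on event data: the returned edges are the optimal
# piecewise-constant-rate change-points.
import numpy as np
from astropy.stats import bayesian_blocks

rng = np.random.default_rng(8)
quiet = rng.uniform(0, 10, 200)              # low-rate interval
burst = rng.uniform(10, 12, 300)             # high-rate (burst-like) interval
tail = rng.uniform(12, 20, 150)
t_events = np.sort(np.concatenate([quiet, burst, tail]))

edges = bayesian_blocks(t_events, fitness="events", p0=0.01)
print(np.round(edges, 2))                    # change-points near t = 10 and t = 12
```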
Analysis of axle and vehicle load properties through Bayesian Networks based on Weigh-in-Motion data
International Nuclear Information System (INIS)
Weigh-in-Motion (WIM) systems are used, among other applications, in pavement and bridge reliability. The system measures quantities such as individual axle load, vehicular load, vehicle speed, vehicle length, and number of axles. Because of the nature of traffic, the measured quantities are naturally regarded as random variables. The dependence structure of data from systems as complex as traffic is also very complex. It is desirable to represent the complex multidimensional distribution with models in which the dependence can be explained in a clear way and different locations where the system operates can be treated simultaneously. Bayesian Networks (BNs) are models that comply with these characteristics. In this paper we discuss BN models and results concerning their ability to adequately represent the data. The paper focuses on the construction and use of the models. We discuss applications of the proposed BNs in reliability analysis. In particular, we show how the proposed BNs may be used for computing design values for individual axle loads, vehicle weights, and maximum bending moments of bridges in certain time intervals. These estimates have been used to advise authorities with respect to bridge reliability. Directions as to how the model may be extended to include locations where the WIM system does not operate are given whenever possible. These ideas benefit from structured expert judgment techniques previously used to quantify Hybrid Bayesian Networks (HBNs) with success
Directory of Open Access Journals (Sweden)
Giulia Carreras
2012-09-01
Background: parameter uncertainty in the Markov model's description of a disease course was addressed. Probabilistic sensitivity analysis (PSA) is now considered the only tool that properly permits the examination of parameter uncertainty. It consists in sampling values from the parameters' probability distributions.
Methods: Markov models fitted with microsimulation were considered, and methods for carrying out a PSA on transition probabilities were studied. Two Bayesian solutions were developed: for each row of the modeled transition matrix, the prior distribution was assumed to be a product of Betas or a Dirichlet. The two solutions differ in the source of information: several different sources for each transition in the Beta approach, and a single source for all transitions from a given health state in the Dirichlet. The two methods were applied to a simple cervical cancer model.
Results: differences between posterior estimates from the two methods were negligible. Results showed that the prior variability strongly influences the posterior distribution.
Conclusions: the novelty of this work is the Bayesian approach that integrates the two prior distributions with a product-of-Binomials likelihood. Such methods could also be applied to cohort data, and their application to more complex models could be useful and unique in the cervical cancer context, as well as in other disease modeling.
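For readers who want the mechanics, here is a minimal sketch of the Dirichlet variant (the transition counts below are made up for illustration): each row of the transition matrix receives a Dirichlet posterior built from a single data source, and every PSA iteration draws a complete matrix.

```python
# PSA draws of a Markov transition matrix with row-wise Dirichlet posteriors.
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical transition counts (rows: from-state healthy / ill / dead).
counts = np.array([[820, 150, 30],
                   [ 60, 400, 90],
                   [  0,   0,  1]])   # in practice the absorbing "dead" row
                                      # would be fixed, not sampled

def sample_transition_matrix(counts, prior=1.0):
    """One PSA draw: Dirichlet(counts + prior) independently for each row."""
    return np.vstack([rng.dirichlet(row + prior) for row in counts])

draws = np.array([sample_transition_matrix(counts) for _ in range(1000)])
print("mean P(healthy -> ill):", draws[:, 0, 1].mean().round(3))
print("95% interval:", np.percentile(draws[:, 0, 1], [2.5, 97.5]).round(3))
```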
Khanin, Alexander
2014-01-01
Cosmic rays (CRs) are protons and atomic nuclei that flow into our Solar system and reach the Earth with energies of up to ~10^21 eV. The sources of ultra-high energy cosmic rays (UHECRs) with E >~ 10^19 eV remain unknown, although there are theoretical reasons to think that at least some come from active galactic nuclei (AGNs). One way to assess the different hypotheses is by analyzing the arrival directions of UHECRs, in particular their self-clustering. We have developed a fully Bayesian approach to analyzing the self-clustering of points on the sphere, which we apply to the UHECR arrival directions. The analysis is based on a multi-step approach that enables the application of Bayesian model comparison to cases with weak prior information. We have applied this approach to the 69 highest energy events recorded by the Pierre Auger Observatory (PAO), which is the largest current UHECR data set. We do not detect self-clustering, but simulations show that this is consistent with the AGN-sourced model for a dat...
Mohan, Nishant; Vakoc, Benjamin
2011-01-01
The intensity signal in optical coherence tomography contains information about the translational velocity of scatterers and can be used to quantify blood flow. We apply principal component analysis to efficiently extract this information. We also study the use of nonuniform temporal sampling of the intensity signal to increase the range of quantifiable flow velocities. We demonstrate this technique in simulation, phantom, and in vivo blood flow measurements, and highlight its potential to enable...
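The PCA step can be mimicked generically (this is an illustration, not the authors' implementation): treat each pixel's intensity time trace as an observation, and let the leading principal component pick out the decorrelation-rate contrast that encodes flow speed.

```python
# PCA on synthetic intensity time traces whose decorrelation rate encodes flow.
import numpy as np

rng = np.random.default_rng(3)
n_pixels, n_times = 500, 64
t = np.arange(n_times)
slow, fast = np.exp(-t / 40.0), np.exp(-t / 5.0)   # slow vs. fast decorrelation
speed = rng.uniform(0, 1, (n_pixels, 1))           # per-pixel "flow speed"
X = (1 - speed) * slow + speed * fast + 0.05 * rng.normal(size=(n_pixels, n_times))

Xc = X - X.mean(axis=0)                            # center before PCA
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                             # projection onto first 2 PCs

print("variance explained:", (S[:2] ** 2 / (S ** 2).sum()).round(3))
# The leading score tracks the decorrelation rate (up to sign), i.e. speed.
print("corr(PC1, speed):", np.corrcoef(scores[:, 0], speed.ravel())[0, 1].round(2))
```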
Coherent Uncertainty Analysis of Aerosol Measurements from Multiple Satellite Sensors
Petrenko, M.; Ichoku, C.
2013-01-01
Aerosol retrievals from multiple spaceborne sensors, including MODIS (on Terra and Aqua), MISR, OMI, POLDER, CALIOP, and SeaWiFS (altogether, a total of 11 different aerosol products), were comparatively analyzed using data collocated with ground-based aerosol observations from Aerosol Robotic Network (AERONET) stations within the Multi-sensor Aerosol Products Sampling System (MAPSS, http://giovanni.gsfc.nasa.gov/mapss/ and http://giovanni.gsfc.nasa.gov/aerostat/). The analysis was performed by comparing quality-screened satellite aerosol optical depth or thickness (AOD or AOT) retrievals during 2006-2010 to available collocated AERONET measurements globally, regionally, and seasonally, and by deriving a number of statistical measures of accuracy. We used a robust statistical approach to detect and remove possible outliers in the collocated data that could bias the results of the analysis. Overall, the proportion of outliers in each of the quality-screened AOD products was within 12%. Squared correlation coefficient (R2) values of the satellite AOD retrievals relative to AERONET exceeded 0.6, with R2 for most of the products exceeding 0.7 over land and 0.8 over ocean. Root mean square error (RMSE) values for most of the AOD products were within 0.15 over land and 0.09 over ocean. We have been able to generate global maps showing regions where the different products present advantages over the others, as well as the relative performance of each product over different land-cover types. It was observed that while MODIS, MISR, and SeaWiFS provide accurate retrievals over most of the land-cover types, multi-angle capabilities make MISR the only sensor to retrieve reliable AOD over barren and snow/ice surfaces. Likewise, active sensing enables CALIOP to retrieve aerosol properties over bright-surface shrublands more accurately than the other sensors, while POLDER, which is the only one of the sensors capable of measuring polarized aerosols, outperforms other sensors in
A Bayesian Analysis for Identifying DNA Copy Number Variations Using a Compound Poisson Process
Directory of Open Access Journals (Sweden)
Yiğiter Ayten
2010-01-01
To study chromosomal aberrations that may lead to cancer formation or genetic diseases, the array-based Comparative Genomic Hybridization (aCGH) technique is often used for detecting DNA copy number variants (CNVs). Various methods have been developed for extracting CNV information from aCGH data. However, most of these methods make use of the log-intensity ratios in aCGH data without taking advantage of other information, such as the DNA probe (e.g., biomarker) positions/distances contained in the data. Motivated by the specific features of aCGH data, we developed a novel method that estimates a change point, or locus of a CNV, in aCGH data together with its associated biomarker position on the chromosome using a compound Poisson process. We used a Bayesian approach to derive the posterior probability for the estimation of the CNV locus. To detect loci of multiple CNVs in the data, a sliding-window process combined with our derived Bayesian posterior probability was proposed. To evaluate the performance of the method in the estimation of the CNV locus, we first performed simulation studies. Finally, we applied our approach to real data from aCGH experiments, demonstrating its applicability.
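The posterior over a change-point locus can be illustrated with a simpler model than the paper's compound Poisson process. In the sketch below (synthetic counts; a plain Poisson segment model with a conjugate Gamma prior on each rate), the segment marginal likelihood is available in closed form and the posterior over the locus follows directly:

```python
# Bayesian posterior over a single Poisson change point.
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(4)
x = np.concatenate([rng.poisson(2.0, 120), rng.poisson(6.0, 80)])
n = x.size

def log_marglik(seg, a=1.0, b=1.0):
    """Log marginal likelihood of a Poisson segment under a Gamma(a, b) prior
    on the rate. The data-only factor prod(1/x_i!) is dropped: it is the same
    for every split, so it cancels in the posterior over the change point."""
    s, m = seg.sum(), seg.size
    return gammaln(a + s) - (a + s) * np.log(b + m) + a * np.log(b) - gammaln(a)

# Uniform prior over change-point locations k = 1, ..., n-1.
logp = np.array([log_marglik(x[:k]) + log_marglik(x[k:]) for k in range(1, n)])
post = np.exp(logp - logp.max())
post /= post.sum()
print("MAP change point:", 1 + post.argmax(), "(true: 120)")
```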
Directory of Open Access Journals (Sweden)
Moslem Moradi
2015-06-01
Herein, an application of a new seismic inversion algorithm in one of Iran's oilfields is described. Stochastic (geostatistical) seismic inversion, as a complementary method to deterministic inversion, combines geostatistics with a seismic inversion algorithm. This method integrates information from different data sources with different scales, as prior information in Bayesian statistics. Data integration leads to a probability density function (the posterior probability) that can yield a model of the subsurface. The Markov Chain Monte Carlo (MCMC) method is used to sample the posterior probability distribution, and the subsurface model characteristics can be extracted by analyzing a set of the samples. In this study, the theory of stochastic seismic inversion in a Bayesian framework was described and applied to infer P-impedance and porosity models. The comparison between the stochastic seismic inversion and the deterministic model-based seismic inversion indicates that the stochastic seismic inversion can provide more detailed information on subsurface character. Since multiple realizations are extracted by this method, an estimate of pore volume and the uncertainty in that estimate were analyzed.
A Bayesian analysis of HAT-P-7b using the EXONEST algorithm
Energy Technology Data Exchange (ETDEWEB)
Placek, Ben [Department of Physics, University at Albany (SUNY), Albany NY (United States); Knuth, Kevin H. [Department of Physics, University at Albany (SUNY), Albany NY, USA and Department of Informatics, University at Albany (SUNY), Albany NY (United States)
2015-01-13
The study of exoplanets (planets orbiting other stars) is revolutionizing the way we view our universe. High-precision photometric data provided by the Kepler Space Telescope (Kepler) enables not only the detection of such planets, but also their characterization. This presents a unique opportunity to apply Bayesian methods to better characterize the multitude of previously confirmed exoplanets. This paper focuses on applying the EXONEST algorithm to characterize the transiting short-period hot Jupiter HAT-P-7b (also referred to as Kepler-2b). EXONEST evaluates a suite of exoplanet photometric models by applying Bayesian Model Selection, which is implemented with the MultiNest algorithm. These models take into account planetary effects, such as reflected light and thermal emissions, as well as the effect of the planetary motion on the host star, such as Doppler beaming, or boosting, of light from the reflex motion of the host star, and photometric variations due to the planet-induced ellipsoidal shape of the host star. By calculating model evidences, one can determine which model best describes the observed data, thus identifying which effects dominate the planetary system. Presented are parameter estimates and model evidences for HAT-P-7b.
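Once a nested-sampling run has produced log-evidences for the competing models, the model selection itself is a one-line comparison. The numbers and model names below are hypothetical, purely to show the arithmetic:

```python
# Bayes factor from two (hypothetical) nested-sampling log-evidences.
import numpy as np

log_Z = {
    "reflected light only": -5412.3,                        # illustrative values
    "reflected + thermal + beaming + ellipsoidal": -5398.7,
}
(m0, m1) = log_Z
log_bf = log_Z[m1] - log_Z[m0]
print(f"log Bayes factor = {log_bf:.1f}")
print(f"odds ~ {np.exp(log_bf):.3g} to 1 in favor of '{m1}'")
```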
The GRB Golenetskii Correlation as a Cosmological Probe via Bayesian Analysis
Burgess, Michael
2016-07-01
Gamma-ray bursts (GRBs) are characterized by a strong correlation between the instantaneous luminosity and the spectral peak energy within a burst. This correlation, which is known as the hardness-intensity correlation or the Golenetskii correlation, not only holds important clues to the physics of GRBs but is thought to have the potential to determine redshifts of bursts. In this paper, I use a hierarchical Bayesian model to study the universality of the rest-frame Golenetskii correlation, and in particular I assess its use as a redshift estimator for GRBs. I find that, using a power-law prescription of the correlation, the power-law indices cluster near a common value, but have a broader variance than previously reported (1-2). Furthermore, I find evidence that there is spread in the intrinsic rest-frame correlation normalizations for the GRBs in our sample (10^{51}-10^{53} erg/s). This points towards variable physical settings of the emission (magnetic field strength, number of emitting electrons, photospheric radius, viewing angle, etc.). Subsequently, these results eliminate the Golenetskii correlation as a useful tool for redshift determination and hence as a cosmological probe. Nevertheless, the Bayesian method introduced in this paper allows for a better determination of the rest-frame properties of the correlation, which in turn allows more stringent limitations to be set for physical models of the emission.
Directory of Open Access Journals (Sweden)
Gerhard Moser
2015-04-01
Gene discovery, estimation of heritability captured by SNP arrays, inference on genetic architecture, and prediction analyses of complex traits are usually performed using different statistical models and methods, leading to inefficiency and loss of power. Here we use a Bayesian mixture model that simultaneously allows variant discovery, estimation of genetic variance explained by all variants, and prediction of unobserved phenotypes in new samples. We apply the method to simulated data of quantitative traits and Wellcome Trust Case Control Consortium (WTCCC) data on disease, and show that it provides accurate estimates of SNP-based heritability, produces unbiased estimators of risk in new samples, and can estimate genetic architecture by partitioning variation across hundreds to thousands of SNPs. We estimated that, depending on the trait, 2,633 to 9,411 SNPs explain all of the SNP-based heritability in the WTCCC diseases. The majority of those SNPs (>96%) had small effects, confirming a substantial polygenic component to common diseases. The proportion of the SNP-based variance explained by large effects (each SNP explaining 1% of the variance) varied markedly between diseases, ranging from almost zero for bipolar disorder to 72% for type 1 diabetes. Prediction analyses demonstrate that for diseases with major loci, such as type 1 diabetes and rheumatoid arthritis, Bayesian methods outperform profile scoring or mixed model approaches.
Sinha, Samiran
2009-08-10
We propose a semiparametric Bayesian method for handling measurement error in nutritional epidemiological data. Our goal is to estimate nonparametrically the form of association between a disease and exposure variable while the true values of the exposure are never observed. Motivated by nutritional epidemiological data, we consider the setting where a surrogate covariate is recorded in the primary data, and a calibration data set contains information on the surrogate variable and repeated measurements of an unbiased instrumental variable of the true exposure. We develop a flexible Bayesian method where not only is the relationship between the disease and exposure variable treated semiparametrically, but also the relationship between the surrogate and the true exposure is modeled semiparametrically. The two nonparametric functions are modeled simultaneously via B-splines. In addition, we model the distribution of the exposure variable as a Dirichlet process mixture of normal distributions, thus making its modeling essentially nonparametric and placing this work into the context of functional measurement error modeling. We apply our method to the NIH-AARP Diet and Health Study and examine its performance in a simulation study.
OBJECTIVE BAYESIAN ANALYSIS OF "ON/OFF" MEASUREMENTS
Energy Technology Data Exchange (ETDEWEB)
Casadei, Diego, E-mail: diego.casadei@fhnw.ch [Visiting Scientist, Department of Physics and Astronomy, UCL, Gower Street, London WC1E 6BT (United Kingdom)
2015-01-01
In high-energy astrophysics, it is common practice to account for the background overlaid with counts from the source of interest with the help of auxiliary measurements carried out by pointing off-source. In this "on/off" measurement, one knows the number of photons detected while pointing toward the source, the number of photons collected while pointing away from the source, and how to estimate the background counts in the source region from the flux observed in the auxiliary measurements. For very faint sources, the number of photons detected is so low that the approximations that hold asymptotically are not valid. On the other hand, an analytical solution exists for the Bayesian statistical inference, which is valid at low and high counts. Here we illustrate the objective Bayesian solution based on the reference posterior and compare the result with the approach very recently proposed by Knoetig, and discuss its most delicate points. In addition, we propose to compute the significance of the excess with respect to the background-only expectation with a method that is able to account for any uncertainty on the background and is valid for any photon count. This method is compared to the widely used significance formula by Li and Ma, which is based on asymptotic properties.
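The structure of this inference is easy to reproduce numerically. The sketch below uses flat priors on a grid (not the reference prior analyzed in the paper) and illustrative low-count values: the on-region counts are Poisson in s + b, the off-region counts are Poisson in alpha * b, and the background b is marginalized out.

```python
# Grid posterior for the source rate s in an on/off counting experiment.
import numpy as np
from scipy.stats import poisson

N_on, N_off, alpha = 12, 30, 5.0        # illustrative counts; alpha = off/on exposure

s = np.linspace(0.0, 20.0, 401)         # source-rate grid
b = np.linspace(0.01, 20.0, 400)        # background-rate grid (on region)
S, B = np.meshgrid(s, b, indexing="ij")

like = poisson.pmf(N_on, S + B) * poisson.pmf(N_off, alpha * B)
post_s = like.sum(axis=1)               # marginalize the background
post_s /= post_s.sum() * (s[1] - s[0])  # normalize the density

cdf = np.cumsum(post_s) * (s[1] - s[0])
print("posterior mean of s:", ((s * post_s).sum() * (s[1] - s[0])).round(2))
print("68% interval:", s[np.searchsorted(cdf, 0.16)], "-", s[np.searchsorted(cdf, 0.84)])
```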
Wagner-Kaiser, R.; Stenning, D. C.; Robinson, E.; von Hippel, T.; Sarajedini, A.; van Dyk, D. A.; Stein, N.; Jefferys, W. H.
2016-07-01
We use Cycle 21 Hubble Space Telescope (HST) observations and HST archival Advanced Camera for Surveys Treasury observations of Galactic Globular Clusters to find and characterize two stellar populations in NGC 5024 (M53), NGC 5272 (M3), and NGC 6352. For these three clusters, both single and double-population analyses are used to determine the best-fit isochrone(s). We employ a sophisticated Bayesian analysis technique to simultaneously fit the cluster parameters (age, distance, absorption, and metallicity) that characterize each cluster. For the two-population analysis, unique population-level helium values are also fit to each distinct population of the cluster and the relative proportions of the populations are determined. We find differences in helium ranging from ∼0.05 to 0.11 for these three clusters. Model grids with solar α-element abundances ([α/Fe] = 0.0) and enhanced α-elements ([α/Fe] = 0.4) are adopted.
Bayesian analysis for two-parameter hybrid EoS with high-mass compact star twins
Alvarez-Castillo, David E; Blaschke, David; Grigorian, Hovik
2015-01-01
We perform a Bayesian analysis on the basis of a recently developed two-parameter class of hybrid equations of state that allow for high-mass compact star twins. While a wide range of radii, from 9-15 km, has recently been inferred for different neutron stars using different techniques, we perform our analysis under the supposition that the radii lie toward the large end (13-15 km). We use this radius constraint together with the undebated, statistically independent constraint for high masses ($\sim 2~M_\odot$) as priors in selecting the most probable hybrid equations of state from a family with two free parameters: the baryon excluded volume in the hadronic phase and the 8-quark vector channel interaction in the quark matter phase.
Directory of Open Access Journals (Sweden)
Tamilselvi Madeswaran
2013-10-01
Microarray studies produce large amounts of data, and the analysis of such large microarray data sets relies on data mining or statistical analysis. Our objective is to classify microarray gene expression data. Usually, before classification, dimensionality reduction is performed on the microarray gene expression dataset. A statistical approach for gene extraction has been proposed. The drawback of the statistical analysis is that it does not identify the important genes. Here, for the classification process, we use k-nearest neighbor, SVM, and Naïve Bayes classifiers. Experimental results show that our proposed classifiers increase efficiency and accuracy.
Lander, Tonya A; Klein, Etienne K; Oddou-Muratorio, Sylvie; Candau, Jean-Noël; Gidoin, Cindy; Chalon, Alain; Roig, Anne; Fallour, Delphine; Auger-Rozenberg, Marie-Anne; Boivin, Thomas
2014-12-01
Understanding how invasive species establish and spread is vital for developing effective management strategies for invaded areas and for identifying new areas where the risk of invasion is highest. We investigated the explanatory power of dispersal histories reconstructed from local-scale wind data and a regional-scale wind-dispersed particle trajectory model for the invasive seed chalcid wasp Megastigmus schimitscheki (Hymenoptera: Torymidae) in France. The explanatory power was tested by: (1) survival analysis of empirical data on M. schimitscheki presence, absence, and year of arrival at 52 stands of the wasp's obligate hosts, Cedrus (true cedar trees); and (2) approximate Bayesian analysis of M. schimitscheki genetic data using a coalescence model. The Bayesian demographic modeling and traditional population genetic analysis suggested that the initial invasion across the range was the result of long-distance dispersal from the longest-established sites. The survival analyses of the windborne expansion patterns derived from a particle dispersal model indicated an informative correlation between the M. schimitscheki presence/absence data from the annual surveys and the scenarios based on regional-scale wind data. These three very different analyses produced highly congruent results supporting our proposal that wind is the most probable vector for passive long-distance dispersal of this invasive seed wasp. This result confirms that long-distance dispersal from introduction areas is a likely driver of secondary expansion of alien invasive species. Based on our results, management programs for this and other windborne invasive species may consider (1) focusing effort at the longest-established sites and (2) continuing to monitor outlying populations, which remain critically important due to their influence on rates of spread. We also suggest that there is a distinct need for new analysis methods that have the capacity to combine empirical spatiotemporal field data
Bayesian methods for measures of agreement
Broemeling, Lyle D
2009-01-01
Using WinBUGS to implement Bayesian inferences of estimation and testing hypotheses, Bayesian Methods for Measures of Agreement presents useful methods for the design and analysis of agreement studies. It focuses on agreement among the various players in the diagnostic process. The author employs a Bayesian approach to provide statistical inferences based on various models of intra- and interrater agreement. He presents many examples that illustrate the Bayesian mode of reasoning and explains elements of a Bayesian application, including prior information, experimental information, the likelihood function, posterior distribution, and predictive distribution. The appendices provide the necessary theoretical foundation to understand Bayesian methods as well as introduce the fundamentals of programming and executing the WinBUGS software. Taking a Bayesian approach to inference, this hands-on book explores numerous measures of agreement, including the Kappa coefficient, the G coefficient, and intraclass correlation...
Koepke, C.; Irving, J.
2015-12-01
Bayesian solutions to inverse problems in near-surface geophysics and hydrology have gained increasing popularity as a means of estimating not only subsurface model parameters, but also their corresponding uncertainties that can be used in probabilistic forecasting and risk analysis. In particular, Markov chain Monte Carlo (MCMC) methods have attracted much recent attention as a means of statistically sampling from the Bayesian posterior distribution. In this regard, two approaches are commonly used to improve the computational tractability of the Bayesian-MCMC approach: (i) forward models involving a simplification of the underlying physics are employed, which offer a significant reduction in the time required to calculate data, but generally at the expense of model accuracy, and (ii) the model parameter space is represented using a limited set of spatially correlated basis functions as opposed to a more intuitive high-dimensional pixel-based parameterization. It has become well understood that model inaccuracies resulting from (i) can lead to posterior parameter distributions that are highly biased and overly confident. Further, when performing model reduction as described in (ii), it is not clear how the prior distribution for the basis weights should be defined, because simple (e.g., Gaussian or uniform) priors that may be suitable for a pixel-based parameterization may result in a strong prior bias when used for the weights. To address the issue of model error resulting from known forward model approximations, we generate a set of error training realizations and analyze them with principal component analysis (PCA) in order to generate a sparse basis. The latter is used in the MCMC inversion to remove the main model-error component from the residuals. To improve issues related to prior bias when performing model reduction, we also use a training realization approach, but this time models are simulated from the prior distribution and analyzed using independent
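A compressed sketch of the model-error strategy (synthetic 1-D responses; the smooth random training errors and the 95% energy cutoff are assumptions of this illustration): PCA of training realizations of the fine-minus-coarse forward response yields a sparse basis, and the dominant model-error component is projected out of the residual inside the MCMC loop.

```python
# PCA error basis from training realizations, then residual filtering.
import numpy as np

rng = np.random.default_rng(5)
n_data, n_train = 200, 300
t = np.linspace(0, 1, n_data)

# Training realizations of the forward-model error (smooth random fields).
train_err = np.array([rng.uniform(0.5, 1.5)
                      * np.sin(2 * np.pi * 3 * rng.uniform() * t + rng.uniform(0, 6))
                      for _ in range(n_train)])

mean_err = train_err.mean(axis=0)
U, S, Vt = np.linalg.svd(train_err - mean_err, full_matrices=False)
k = int(np.searchsorted(np.cumsum(S**2) / (S**2).sum(), 0.95)) + 1
basis = Vt[:k]                                   # sparse orthonormal basis

def remove_model_error(residual):
    """Subtract the part of the residual explained by the error basis."""
    r = residual - mean_err
    return r - basis.T @ (basis @ r)

# In an MCMC step, residual = data - coarse_forward(model) would be filtered:
example = mean_err + 0.8 * basis[0] + 0.05 * rng.normal(size=n_data)
print("residual norm before/after:",
      np.linalg.norm(example).round(2),
      np.linalg.norm(remove_model_error(example)).round(2))
```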
Electrodynamics analysis on coherent perfect absorber and phase-controlled optical switch.
Chen, Tianjie; Duan, Shaoguang; Chen, Y C
2012-05-01
A coherent perfect absorber is essentially a specially designed Fabry-Perot interferometer, which completely extinguishes the incident coherent light. The one- and two-beam coherent perfect absorbers have been analyzed using classical electrodynamics by considering index matching in layered structures to totally suppress reflections. This approach presents a clear and physically intuitive picture for the principle of operation of a perfect absorber. The results show that the incident beam(s) must have correct phases and amplitudes, and the real and imaginary parts of the refractive indices of the media in the interferometer must satisfy a well-defined relation. Our results are in agreement with those obtained using the S-matrix analysis. However, the results were obtained solely based on the superposition of waves from multiple reflections without invoking the concept of time reversal as does the S-matrix approach. Further analysis shows that the two-beam device can be configured to function as a phase-controlled three-state switch.
Classification of fracture and non-fracture groups by analysis of coherent X-ray scatter
Dicken, A. J.; Evans, J. P. O.; Rogers, K. D.; Stone, N.; Greenwood, C.; Godber, S. X.; Clement, J. G.; Lyburn, I. D.; Martin, R. M.; Zioupos, P.
2016-07-01
Osteoporotic fractures present a significant social and economic burden, which is set to rise commensurately with the aging population. Greater understanding of the physicochemical differences between osteoporotic and normal conditions will facilitate the development of diagnostic technologies with increased performance and treatments with increased efficacy. Using coherent X-ray scattering, we have evaluated a population of 108 ex vivo human bone samples comprising non-fracture and fracture groups. Principal-component-fed linear discriminant analysis was used to develop a classification model to discern each condition, resulting in a sensitivity and specificity of 93% and 91%, respectively. Evaluating the coherent X-ray scatter differences between the conditions supports the hypothesis that a causal physicochemical change has occurred in the fracture group. This work is a critical step along the path towards developing an in vivo diagnostic tool for fracture risk prediction.
Bessiere, Pierre; Ahuactzin, Juan Manuel; Mekhnacha, Kamel
2013-01-01
Probability as an Alternative to Boolean Logic: While logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework to deal with incomplete and uncertain data. Decision-Making Tools and Methods for Incomplete and Uncertain Data: Emphasizing probability as an alternative to Boolean
Fuster-Parra, P; García-Mas, A; Ponseti, F J; Leo, F M
2015-04-01
The purpose of this paper was to discover the relationships among 22 relevant psychological features of semi-professional football players in order to study team performance and collective efficacy via a Bayesian network (BN). The paper includes optimization of team performance and collective efficacy using intercausal reasoning, a very common pattern in human reasoning. The BN is used to make inferences about our problem, from which we draw several conclusions; among them: maximizing team performance causes a decrease in collective efficacy, and when team performance achieves its minimum value it causes an increase in moderate/high values of collective efficacy. Similarly, we may instead optimize team collective efficacy. The BN also allows us to determine which features have the strongest influence on performance and which on collective efficacy. From the BN, two different coaching styles were differentiated taking into account the local Markov property: training leadership and autocratic leadership.
Bayesian networks precipitation model based on hidden Markov analysis and its application
Institute of Scientific and Technical Information of China (English)
2010-01-01
Surface precipitation estimation is very important in hydrologic forecasting. To account for the influence of neighbors on the precipitation at an arbitrary grid point in the network, Bayesian networks and Markov random fields were adopted to estimate surface precipitation. Spherical coordinates and the expectation-maximization (EM) algorithm were used for region interpolation and for estimation of the precipitation at an arbitrary point in the region. Surface precipitation estimation for seven precipitation stations in the Qinghai Lake region was performed. Comparison with other surface precipitation methods, such as the Thiessen polygon method, the distance-weighted mean method, and the arithmetic mean method, shows that the proposed method can judge the relationship of precipitation among different points in the area under complicated circumstances and that the simulation results are more accurate and rational.
A Bayesian Analysis of a Random Effects Small Business Loan Credit Scoring Model
Directory of Open Access Journals (Sweden)
Patrick J. Farrell
2011-09-01
One of the most important aspects of credit scoring is constructing a model that has low misclassification rates and is also flexible enough to allow for random variation. It is also well known that, when there are a large number of highly correlated variables, as is typical in studies involving questionnaire data, a method must be found to reduce the number of variables to those that have high predictive power. Here we propose a Bayesian multivariate logistic regression model with both fixed and random effects for small business loan credit scoring and a variable reduction method using Bayes factors. The method is illustrated on an interesting data set based on questionnaires sent to loan officers in Canadian banks and venture capital companies
Default Bayesian analysis for multi-way tables: a data-augmentation approach
Polson, Nicholas G
2011-01-01
This paper proposes a strategy for regularized estimation in multi-way contingency tables, which are common in meta-analyses and multi-center clinical trials. Our approach is based on data augmentation and appeals heavily to a novel class of Polya-Gamma distributions. Our main contributions are to build up the relevant distributional theory and to demonstrate three useful features of this data-augmentation scheme. First, it leads to simple EM and Gibbs-sampling algorithms for posterior inference, circumventing the need for analytic approximations, numerical integration, Metropolis-Hastings, or variational methods. Second, it allows modelers much more flexibility when choosing priors, which have traditionally come from the Dirichlet or logistic-normal family. For example, our approach allows users to incorporate Bayesian analogues of classical penalized-likelihood techniques (e.g., the lasso or bridge) in computing regularized estimates for log-odds ratios. Finally, our data-augmentation scheme naturally sugg...
Analysis and assessment of injury risk in female gymnastics:Bayesian Network approach
Directory of Open Access Journals (Sweden)
Lyudmila Dimitrova
2015-02-01
This paper presents a Bayesian network (BN) model for estimating injury risk in female artistic gymnastics. The model illustrates the connections between underlying injury risk factors through a series of causal dependencies. The quantitative part of the model, the conditional probability tables, is determined using a truncated Normal (TNormal) distribution with parameters derived by experts. The injury rates calculated by the network are in agreement with injury statistics and correctly report the impact of various risk factors on injury rates. The model is designed to assist coaches and supporting teams in planning training activity so that injuries are minimized. This study provides important background for further data collection and research necessary to improve the precision of the quantitative predictions of the model.
Directory of Open Access Journals (Sweden)
D.O. Olayungbo
2016-12-01
This paper examines the dynamic interactions between insurance and economic growth in eight African countries over the period 1970-2013. Insurance demand is measured by insurance penetration, which accounts for income differences across the sample countries. A Bayesian Time-Varying-Parameter Vector Autoregression (TVP-VAR) model with stochastic volatility is used to analyze the short-run and long-run relationships among the variables of interest. Using insurance penetration as a measure of insurance to economic growth, we find a positive relationship for Egypt, while short-run negative and long-run positive effects are found for Kenya, Mauritius, and South Africa. On the contrary, negative effects are found for Algeria, Nigeria, Tunisia, and Zimbabwe. Implementation of sound financial reforms and wide insurance coverage are proposed recommendations for insurance development in the selected African countries.
Directory of Open Access Journals (Sweden)
V. S.S. Yadavalli
2002-09-01
Bayesian estimation is presented for the stationary rate of disappointments, D∞, for two models (with different specifications) of intermittently used systems. The random variables in the system are considered to be independently exponentially distributed. Jeffreys' prior is assumed for the unknown parameters in the system. Inference about D∞ is constrained in both models by the complex and non-linear definition of D∞. Monte Carlo simulation is used to derive the posterior distribution of D∞ and subsequently the highest posterior density (HPD) intervals. A numerical example in which Bayes estimates and the HPD intervals are determined illustrates these results. This illustration is extended to determine the frequentist properties of this Bayes procedure, by calculating coverage proportions for each of these HPD intervals, assuming fixed values for the parameters.
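The Monte Carlo/HPD step is generic and worth spelling out: given posterior draws of the derived quantity, the HPD interval is the shortest interval containing the target probability mass. The Gamma draws below are a stand-in; the real posterior of D∞ would come from the simulation described in the abstract.

```python
# Shortest-interval HPD from posterior samples.
import numpy as np

rng = np.random.default_rng(6)
draws = rng.gamma(3.0, 0.5, 100_000)    # stand-in posterior draws of D_infinity

def hpd(samples, mass=0.95):
    """Shortest interval containing `mass` of the sorted samples."""
    x = np.sort(samples)
    m = int(np.ceil(mass * x.size))
    widths = x[m - 1:] - x[: x.size - m + 1]
    i = widths.argmin()
    return x[i], x[i + m - 1]

lo, hi = hpd(draws)
print(f"95% HPD interval: ({lo:.3f}, {hi:.3f})")
```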
Bayesian Analysis for Stellar Evolution with Nine Parameters (BASE-9): User's Manual
von Hippel, Ted; Jeffery, Elizabeth; Wagner-Kaiser, Rachel; DeGennaro, Steven; Stein, Nathan; Stenning, David; Jefferys, William H; van Dyk, David
2014-01-01
BASE-9 is a Bayesian software suite that recovers star cluster and stellar parameters from photometry. BASE-9 is useful for analyzing single-age, single-metallicity star clusters, binaries, or single stars, and for simulating such systems. BASE-9 uses Markov chain Monte Carlo and brute-force numerical integration techniques to estimate the posterior probability distributions for the age, metallicity, helium abundance, distance modulus, and line-of-sight absorption for a cluster, and the mass, binary mass ratio, and cluster membership probability for every stellar object. BASE-9 is provided as open source code on a version-controlled web server. The executables are also available as Amazon Elastic Compute Cloud images. This manual provides potential users with an overview of BASE-9, including instructions for installation and use.
Improving PWR core simulations by Monte Carlo uncertainty analysis and Bayesian inference
Castro, Emilio; Buss, Oliver; Garcia-Herranz, Nuria; Hoefer, Axel; Porsch, Dieter
2016-01-01
A Monte Carlo-based Bayesian inference model is applied to the prediction of reactor operation parameters of a PWR nuclear power plant. In this non-perturbative framework, high-dimensional covariance information describing the uncertainty of microscopic nuclear data is combined with measured reactor operation data in order to provide statistically sound, well-founded uncertainty estimates of integral parameters, such as the boron letdown curve and the burnup-dependent reactor power distribution. The performance of this methodology is assessed in a blind test approach, where we use measurements of a given reactor cycle to improve the prediction of the subsequent cycle. As it turns out, the resulting improvement of the prediction quality is impressive. In particular, the prediction uncertainty of the boron letdown curve, which is of utmost importance for the planning of the reactor cycle length, can be reduced by one order of magnitude by including the boron concentration measurement information of the previous...
Energy Technology Data Exchange (ETDEWEB)
Du Xiaodong, E-mail: xdu23@wisc.edu [Department of Agricultural and Applied Economics, University of Wisconsin-Madison, WI (United States); Yu, Cindy L., E-mail: cindyyu@iastate.edu [Department of Statistics, Iowa State University, IA (United States); Hayes, Dermot J., E-mail: dhayes@iastate.edu [Department of Economics and Department of Finance, Iowa State University, IA (United States)
2011-05-15
This paper assesses factors that potentially influence the volatility of crude oil prices and the possible linkage between this volatility and agricultural commodity markets. Stochastic volatility models are applied to weekly crude oil, corn, and wheat futures prices from November 1998 to January 2009. Model parameters are estimated using Bayesian Markov Chain Monte Carlo methods. Speculation, scalping, and petroleum inventories are found to be important in explaining the volatility of crude oil prices. Several properties of crude oil price dynamics are established, including mean-reversion, an asymmetry between returns and volatility, volatility clustering, and infrequent compound jumps. We find evidence of volatility spillover among crude oil, corn, and wheat markets after the fall of 2006. This can be largely explained by tightened interdependence between crude oil and these commodity markets induced by ethanol production.
DEFF Research Database (Denmark)
Dashab, Golam Reza; Kadri, Naveen Kumar; Mahdi Shariati, Mohammad;
2012-01-01
1) Mixed model analysis (MMA), 2) Random haplotype model (RHM), 3) Genealogy-based mixed model (GENMIX), and 4) Bayesian variable selection (BVS). The data consisted of phenotypes of 2000 animals from 20 sire families, genotyped with 9990 SNPs on five chromosomes. Results: Out of the eight...
Bayesian analysis of RNA sequencing data by estimating multiple shrinkage priors.
Van De Wiel, Mark A; Leday, Gwenaël G R; Pardo, Luba; Rue, Håvard; Van Der Vaart, Aad W; Van Wieringen, Wessel N
2013-01-01
Next generation sequencing is quickly replacing microarrays as a technique to probe different molecular levels of the cell, such as DNA or RNA. The technology provides higher resolution, while reducing bias. RNA sequencing results in counts of RNA strands. This type of data imposes new statistical challenges. We present a novel, generic approach to model and analyze such data. Our approach aims at large flexibility of the likelihood (count) model and the regression model alike. Hence, a variety of count models is supported, such as the popular NB model, which accounts for overdispersion. In addition, complex, non-balanced designs and random effects are accommodated. Like some other methods, our method provides shrinkage of dispersion-related parameters. However, we extend it by enabling joint shrinkage of parameters, including those for which inference is desired. We argue that this is essential for Bayesian multiplicity correction. Shrinkage is effectuated by empirically estimating priors. We discuss several parametric (mixture) and non-parametric priors and develop procedures to estimate (parameters of) those. Inference is provided by means of local and Bayesian false discovery rates. We illustrate our method on several simulations and two data sets, also to compare it with other methods. Model- and data-based simulations show substantial improvements in the sensitivity at the given specificity. The data motivate the use of the ZI-NB as a powerful alternative to the NB, which results in higher detection rates for low-count data. Finally, compared with other methods, the results on small sample subsets are more reproducible when validated on their large sample complements, illustrating the importance of the type of shrinkage.
Converse, Sarah J.; Royle, J. Andrew; Urbanek, Richard P.
2012-01-01
Inbreeding depression is frequently a concern of managers interested in restoring endangered species. Decisions to reduce the potential for inbreeding depression by balancing genotypic contributions to reintroduced populations may exact a cost on long-term demographic performance of the population if those decisions result in reduced numbers of animals released and/or restriction of particularly successful genotypes (i.e., heritable traits of particular family lines). As part of an effort to restore a migratory flock of Whooping Cranes (Grus americana) to eastern North America using the offspring of captive breeders, we obtained a unique dataset which includes post-release mark-recapture data, as well as the pedigree of each released individual. We developed a Bayesian formulation of a multi-state model to analyze radio-telemetry, band-resight, and dead recovery data on reintroduced individuals, in order to track survival and breeding state transitions. We used studbook-based individual covariates to examine the comparative evidence for and degree of effects of inbreeding, genotype, and genotype quality on post-release survival of reintroduced individuals. We demonstrate implementation of the Bayesian multi-state model, which allows for the integration of imperfect detection, multiple data types, random effects, and individual- and time-dependent covariates. Our results provide only weak evidence for an effect of the quality of an individual's genotype in captivity on post-release survival as well as for an effect of inbreeding on post-release survival. We plan to integrate our results into a decision-analytic modeling framework that can explicitly examine tradeoffs between the effects of inbreeding and the effects of genotype and demographic stochasticity on population establishment.
Rapid sphere sizing using a Bayesian analysis of reciprocal space imaging data.
Ziovas, K; Sederman, A J; Gehin-Delval, C; Gunes, D Z; Hughes, E; Mantle, M D
2016-01-15
Dispersed systems are important in many applications in a wide range of industries such as the petroleum, pharmaceutical, and food industries. Therefore the ability to control and non-invasively measure the physical properties of these systems, such as the dispersed-phase size distribution, is of significant interest, in particular for concentrated systems, where microscopy or scattering techniques may not apply or may deliver very limited output quality. In this paper we show how reciprocal space data acquired using both 1D magnetic resonance imaging (MRI) and 2D X-ray micro-tomography (X-ray μCT) can be analysed, using a Bayesian statistical model, to extract the sphere size distribution (SSD) from model sphere systems and dispersed food foam samples. Glass spheres in xanthan gels were used as model samples with sphere diameters (D) in the range of 45 μm ⩽ D ⩽ 850 μm. The results show that the SSD was successfully estimated from both the NMR and X-ray μCT data with a good degree of accuracy for the entire range of glass spheres, in times as short as two seconds. After validating the technique using model samples, the Bayesian sphere sizing method was successfully applied to air/water foam samples generated using a microfluidics apparatus with 160 μm ⩽ D ⩽ 400 μm. The effect of different experimental parameters, such as the standard deviation of the bubble size distribution and the volume fraction of the dispersed phase, is discussed.
Ordering states with coherence measures
Liu, C. L.; Yu, Xiao-Dong; Xu, G. F.; Tong, D. M.
2016-07-01
The quantification of quantum coherence has attracted growing attention, and several coherence measures have been put forward based on various physical contexts. An interesting question is whether these coherence measures give the same ordering when they are used to quantify the coherence of quantum states. In this paper, we consider two well-known coherence measures, the l_1 norm of coherence and the relative entropy of coherence, and show that there are states for which the two measures give a different ordering. Our analysis can be extended to other coherence measures, and as an illustration of the extension we further consider the coherence of formation, showing that the l_1 norm of coherence and the coherence of formation, as well as the relative entropy of coherence and the coherence of formation, do not give the same ordering either.
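The disagreement between orderings is easy to check numerically. The two qubit states below are chosen purely for illustration (their matrix elements are assumed values, not taken from the paper); the script computes the l_1 norm of coherence and the relative entropy of coherence in the computational basis so the two orderings can be compared directly.

```python
# Two coherence measures on two qubit states, showing a reversed ordering.
import numpy as np
from scipy.linalg import eigvalsh

def c_l1(rho):
    """l1 norm of coherence: sum of absolute off-diagonal entries."""
    return float(np.abs(rho).sum() - np.trace(np.abs(rho)))

def entropy(p):
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

def c_rel_ent(rho):
    """Relative entropy of coherence: S(diag(rho)) - S(rho)."""
    return entropy(np.diag(rho).real) - entropy(eigvalsh(rho))

def qubit(p, c):
    return np.array([[p, c], [np.conj(c), 1 - p]], dtype=complex)

rho_a, rho_b = qubit(0.5, 0.30), qubit(0.9, 0.28)
print("C_l1:    A =", round(c_l1(rho_a), 3), "  B =", round(c_l1(rho_b), 3))
print("C_relE:  A =", round(c_rel_ent(rho_a), 3), "  B =", round(c_rel_ent(rho_b), 3))
# A is the more coherent state by C_l1 but the less coherent one by C_relE.
```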
Directory of Open Access Journals (Sweden)
W David Walter
Bovine tuberculosis is a bacterial disease caused by Mycobacterium bovis in livestock and wildlife, with hosts that include Eurasian badgers (Meles meles), brushtail possum (Trichosurus vulpecula), and white-tailed deer (Odocoileus virginianus). Risk-assessment efforts in Michigan have been initiated on farms to minimize interactions of cattle with wildlife hosts, but research on M. bovis on cattle farms has not investigated the spatial context of disease epidemiology. To incorporate spatially explicit data, initial likelihood-of-infection probabilities for cattle farms tested for M. bovis, prevalence of M. bovis in white-tailed deer, deer density, and environmental variables for each farm were modeled in a Bayesian hierarchical framework. We used geo-referenced locations of 762 cattle farms that have been tested for M. bovis, white-tailed deer prevalence, and several environmental variables that may favor long-term survival and viability of M. bovis on farms and in surrounding habitats (i.e., soil type, habitat type). Bayesian hierarchical analyses identified deer prevalence and the proportion of sandy soil within our sampling grid as the most supported model. Analysis of cattle farms tested for M. bovis showed that every 1% increase in sandy soil was associated with a 4% increase in the odds of infection. Our analysis revealed that the influence of M. bovis prevalence in white-tailed deer remained a concern even after considerable efforts to prevent cattle interactions with white-tailed deer through on-farm mitigation and reduction of the deer population. Cattle farms test positive for M. bovis annually in our study area, suggesting that an environmental source, either on farms or in the surrounding landscape, may be contributing to new infections or re-infections with M. bovis. Our research provides an initial assessment of potential environmental factors that could be incorporated into additional modeling efforts as more knowledge of deer herd
Energy Technology Data Exchange (ETDEWEB)
Sigeti, David E. [Los Alamos National Laboratory; Pelak, Robert A. [Los Alamos National Laboratory
2012-09-11
We present a Bayesian statistical methodology for identifying improvement in predictive simulations, including an analysis of the number of (presumably expensive) simulations that will need to be made in order to establish with a given level of confidence that an improvement has been observed. Our analysis assumes the ability to predict (or postdict) the same experiments with legacy and new simulation codes and uses a simple binomial model for the probability, {theta}, that, in an experiment chosen at random, the new code will provide a better prediction than the old. This model makes it possible to do statistical analysis with an absolute minimum of assumptions about the statistics of the quantities involved, at the price of discarding some potentially important information in the data. In particular, the analysis depends only on whether or not the new code predicts better than the old in any given experiment, and not on the magnitude of the improvement. We show how the posterior distribution for {theta} may be used, in a kind of Bayesian hypothesis testing, both to decide if an improvement has been observed and to quantify our confidence in that decision. We quantify the predictive probability that should be assigned, prior to taking any data, to the possibility of achieving a given level of confidence, as a function of sample size. We show how this predictive probability depends on the true value of {theta} and, in particular, how there will always be a region around {theta} = 1/2 where it is highly improbable that we will be able to identify an improvement in predictive capability, although the width of this region will shrink to zero as the sample size goes to infinity. We show how the posterior standard deviation may be used, as a kind of 'plan B metric' in the case that the analysis shows that {theta} is close to 1/2 and argue that such a plan B should generally be part of hypothesis testing. All the analysis presented in the paper is done with a
Coherence analysis of optical frequency-modulated continuous-wave interference.
Zheng, Jesse
2006-06-01
I analyze the coherence of optical frequency-modulated continuous-wave (FMCW) interference. With a simple model modified from classical coherence theory, I successfully derive the relationships among the frequency bandwidth, coherence length, and coherence time of a practical optical source, and the contrast of the beat signal in optical FMCW interference.
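The quantitative backbone of such relationships is the standard reciprocal link between bandwidth and coherence: coherence time is roughly the inverse of the frequency bandwidth, and coherence length is the coherence time multiplied by the speed of light. A back-of-envelope check with an assumed 10 MHz linewidth:

```python
# Coherence time and length from source bandwidth (assumed 10 MHz linewidth).
c = 2.998e8            # speed of light, m/s
delta_nu = 10e6        # frequency bandwidth, Hz (illustrative value)

tau_c = 1.0 / delta_nu # coherence time, s
L_c = c * tau_c        # coherence length, m
print(f"coherence time   ~ {tau_c:.1e} s")
print(f"coherence length ~ {L_c:.0f} m")   # ~30 m for a 10 MHz linewidth
```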
Hahn, Gitte Holst; Christensen, Karl Bang; Leung, Terence S.; Greisen, Gorm
2010-05-01
Coherence between spontaneous fluctuations in arterial blood pressure (ABP) and the cerebral near-infrared spectroscopy signal can detect cerebral autoregulation. Because reliable measurement depends on signals with a high signal-to-noise ratio, we hypothesized that coherence is more precisely determined when fluctuations in ABP are large rather than small. Therefore, we investigated whether adjusting for variability in ABP (variabilityABP) improves precision. We examined the impact of variabilityABP within the power spectrum in each measurement and between repeated measurements in preterm infants. We also examined, with a simulation study, the total monitoring time required to discriminate among infants. We studied 22 preterm infants (GA < 30) yielding 215 10-min measurements. Surprisingly, adjusting for variabilityABP within the power spectrum did not improve the precision. However, adjusting for variabilityABP among repeated measurements (i.e., weighting measurements with high variabilityABP in favor of those with low) improved the precision. The evidence of drift in individual infants was weak. The minimum monitoring time needed to discriminate among infants was 1.3-3.7 h. Coherence analysis in low frequencies (0.04-0.1 Hz) had higher precision and more statistical power than in very low frequencies (0.003-0.04 Hz). In conclusion, reliable detection of cerebral autoregulation takes hours, and the precision is improved by adjusting for variabilityABP between repeated measurements.
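The core coherence computation can be sketched with a standard Welch estimator (surrogate signals below, not patient data; the shared 0.07 Hz oscillation is an assumption of the toy model). High coherence in the 0.04-0.1 Hz band would be read as pressure-passive, i.e. impaired, autoregulation.

```python
# Magnitude-squared coherence between ABP-like and NIRS-like slow signals.
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(7)
fs = 1.0                                   # 1 Hz sampling
t = np.arange(3600) / fs                   # one hour of data

drive = np.sin(2 * np.pi * 0.07 * t)       # shared slow oscillation
abp = drive + 0.8 * rng.normal(size=t.size)
nirs = 0.5 * drive + 0.8 * rng.normal(size=t.size)   # coupled to ABP

f, Cxy = coherence(abp, nirs, fs=fs, nperseg=512)
band = (f >= 0.04) & (f <= 0.1)
print("mean coherence, 0.04-0.1 Hz:", Cxy[band].mean().round(2))
```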
Liu, Yao; Wang, Xiufeng; Lin, Jing; Zhao, Wei
2016-11-01
Motor current is an emerging and popular signal for detecting machining chatter, with multiple advantages. To achieve accurate and reliable chatter detection using motor current, it is important to clarify the quantitative relationship between motor current and chatter vibration, which has not yet been studied clearly. In this study, complex continuous wavelet coherence, including the cross wavelet transform and wavelet coherence, is applied to the correlation analysis of motor current and chatter vibration in grinding. Experimental results show that complex continuous wavelet coherence performs very well in demonstrating and quantifying the strong correlation between these two signals in frequency, amplitude, and phase. When chatter occurs, clear correlations in frequency and amplitude appear in the chatter frequency band, and the phase difference of the current signal relative to the vibration signal turns from random to stable. The phase lead at the most correlated chatter frequency is the largest. With the further development of chatter, the correlation grows in intensity and expands to higher-order chatter frequency bands. These results confirm that there is a consistent correlation between motor current and vibration signals in the grinding chatter process. However, to achieve accurate and reliable chatter detection using motor current, the frequency response bandwidth of the current loop of the feed drive system must be wide enough to respond to chatter effectively.
Tian, Fenghua; Tarumi, Takashi; Liu, Hanli; Zhang, Rong; Chalak, Lina
2016-01-01
Cerebral autoregulation represents the physiological mechanisms that keep brain perfusion relatively constant in the face of changes in blood pressure and thus plays an essential role in normal brain function. This study assessed cerebral autoregulation in nine newborns with moderate-to-severe hypoxic-ischemic encephalopathy (HIE). These neonates received hypothermic therapy during the first 72 h of life while mean arterial pressure (MAP) and cerebral tissue oxygenation saturation (SctO2) were continuously recorded. Wavelet coherence analysis, which is a time-frequency domain approach, was used to characterize the dynamic relationship between spontaneous oscillations in MAP and SctO2. Wavelet-based metrics of phase, coherence and gain were derived for quantitative evaluation of cerebral autoregulation. We found cerebral autoregulation in neonates with HIE was time-scale-dependent in nature. Specifically, the spontaneous changes in MAP and SctO2 had in-phase coherence at time scales of less than 80 min (…) PMID:26937380
12th Brazilian Meeting on Bayesian Statistics
Louzada, Francisco; Rifo, Laura; Stern, Julio; Lauretto, Marcelo
2015-01-01
Through refereed papers, this volume focuses on the foundations of the Bayesian paradigm; their comparison to objectivistic or frequentist statistics counterparts; and the appropriate application of Bayesian foundations. This research in Bayesian statistics is applicable to data analysis in biostatistics, clinical trials, law, engineering, and the social sciences. EBEB, the Brazilian Meeting on Bayesian Statistics, is held every two years by ISBrA, the Brazilian chapter of the International Society for Bayesian Analysis (ISBA) and one of its most active chapters. The 12th meeting took place March 10-14, 2014 in Atibaia. Interest in the foundations of inductive statistics has grown recently in accordance with the increasing availability of Bayesian methodological alternatives. Scientists need to deal with the ever more difficult choice of the optimal method to apply to their problem. This volume shows how Bayes can be the answer. The examination and discussion of the foundations work towards the goal of proper application of Bayesia...
Fourier optics analysis of phase-mask-based path-length-multiplexed optical coherence tomography.
Yin, Biwei; Dwelle, Jordan; Wang, Bingqing; Wang, Tianyi; Feldman, Marc D; Rylander, Henry G; Milner, Thomas E
2015-11-01
Optical coherence tomography (OCT) is an imaging technique that constructs a depth-resolved image by measuring the optical path-length difference between broadband light backscattered from a sample and a reference surface. For many OCT sample arm optical configurations, sample illumination and backscattered light detection share a common path. When a phase mask is placed in the sample path, features in the detected signal are observed, which suggests that an analysis of a generic common path OCT imaging system is warranted. In this study, we present a Fourier optics analysis using a Fresnel diffraction approximation of an OCT system with a path-length-multiplexing element (PME) inserted in the sample arm optics. The analysis may be generalized for most phase-mask-based OCT systems. A radial-angle-diverse PME is analyzed in detail, and the point spread function, coherent transfer function, sensitivity of backscattering angular diversity detection, and signal formation in terms of sample spatial frequency are simulated and discussed. The analysis reveals important imaging features and application limitations of OCT imaging systems with a phase mask in the sample path optics.
Uncertainty analysis of strain modal parameters by Bayesian method using frequency response function
Institute of Scientific and Technical Information of China (English)
Xu Li; Yi Weijian; Zhihua Yi
2007-01-01
Structural strain modes are able to detect changes in local structural performance, but errors are inevitably intermixed in the measured data. In this paper, strain modal parameters are considered as random variables, and their uncertainty is analyzed by a Bayesian method based on the structural frequency response function (FRF). The estimates of strain modal parameters with maximal posterior probability are determined. Several independent measurements of the FRF of a four-story reinforced concrete frame structural model were performed in the laboratory. The ability to identify the stiffness change in a concrete column using the strain mode was verified. It is shown that the uncertainty of the natural frequency is very small. Compared with the displacement mode shape, the variations of strain mode shapes at each point are quite different. The damping ratios are more affected by the type of test system. Although a high-order strain mode may fail to identify local damage, the first-order strain mode can provide an exact indication of the damage location.
Snyder, Carolyn W.
2016-09-01
Statistical challenges often preclude comparisons among different sea surface temperature (SST) reconstructions over the past million years. Inadequate consideration of uncertainty can result in misinterpretation, overconfidence, and biased conclusions. Here I apply Bayesian hierarchical regressions to analyze local SST responsiveness to climate changes for 54 SST reconstructions from across the globe over the past million years. I develop methods to account for multiple sources of uncertainty, including the quantification of uncertainty introduced from absolute dating into interrecord comparisons. The estimates of local SST responsiveness explain 64% (62% to 77%, 95% interval) of the total variation within each SST reconstruction with a single number. There is remarkable agreement between SST proxy methods, with the exception of Mg/Ca proxy methods estimating muted responses at high latitudes. The Indian Ocean exhibits a muted response in comparison to other oceans. I find a stable estimate of the proposed "universal curve" of change in local SST responsiveness to climate changes as a function of sin2(latitude) over the past 400,000 years: SST change at 45°N/S is larger than the average tropical response by a factor of 1.9 (1.5 to 2.6, 95% interval) and explains 50% (35% to 58%, 95% interval) of the total variation between each SST reconstruction. These uncertainty and statistical methods are well suited for application across paleoclimate and environmental data series intercomparisons.
A unified Bayesian semiparametric approach to assess discrimination ability in survival analysis.
Zhao, Lili; Feng, Dai; Chen, Guoan; Taylor, Jeremy M G
2016-06-01
The discriminatory ability of a marker for censored survival data is routinely assessed by the time-dependent ROC curve and the c-index. The time-dependent ROC curve evaluates the ability of a biomarker to predict whether a patient lives past a particular time t. The c-index measures the global concordance of the marker and the survival time regardless of the time point. We propose a Bayesian semiparametric approach to estimate these two measures. The proposed estimators are based on the conditional distribution of the survival time given the biomarker and the empirical biomarker distribution. The conditional distribution is estimated by a linear-dependent Dirichlet process mixture model. The resulting ROC curve is smooth as it is estimated by a mixture of parametric functions. The proposed c-index estimator is shown to be more efficient than the commonly used Harrell's c-index since it uses all pairs of data rather than only informative pairs. The proposed estimators are evaluated through simulations and illustrated using a lung cancer dataset. PMID:26676324
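Since the paper benchmarks against Harrell's c-index, a plain-numpy version of that baseline statistic (using only the informative pairs, as the abstract notes, in contrast to the proposed estimator) may help fix ideas; the data values are toy numbers and tied survival times are handled only crudely.

```python
import numpy as np

def harrell_c(time, event, marker):
    """Harrell's c-index: fraction of usable pairs in which the subject with
    the shorter survival time has the higher marker value (higher marker =
    higher risk). A pair is usable only if the shorter time is an observed
    event; ties in time are not treated specially in this sketch."""
    n = len(time)
    concordant, usable = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            # order the pair so that subject a has the shorter time
            a, b = (i, j) if time[i] < time[j] else (j, i)
            if not event[a]:          # shorter time censored -> uninformative
                continue
            usable += 1
            if marker[a] > marker[b]:
                concordant += 1.0
            elif marker[a] == marker[b]:
                concordant += 0.5
    return concordant / usable

# toy data: times, event indicator (1 = death observed), biomarker values
t = np.array([5.0, 8.0, 3.0, 12.0, 7.0])
e = np.array([1, 0, 1, 1, 0])
m = np.array([2.1, 0.5, 3.0, 0.2, 1.0])
print(harrell_c(t, e, m))
```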
Stenning, D. C.; Wagner-Kaiser, R.; Robinson, E.; van Dyk, D. A.; von Hippel, T.; Sarajedini, A.; Stein, N.
2016-07-01
We develop a Bayesian model for globular clusters composed of multiple stellar populations, extending earlier statistical models for open clusters composed of simple (single) stellar populations. Specifically, we model globular clusters with two populations that differ in helium abundance. Our model assumes a hierarchical structuring of the parameters in which physical properties—age, metallicity, helium abundance, distance, absorption, and initial mass—are common to (i) the cluster as a whole or to (ii) individual populations within a cluster, or are unique to (iii) individual stars. An adaptive Markov chain Monte Carlo (MCMC) algorithm is devised for model fitting that greatly improves convergence relative to its precursor non-adaptive MCMC algorithm. Our model and computational tools are incorporated into an open-source software suite known as BASE-9. We use numerical studies to demonstrate that our method can recover parameters of two-population clusters, and also show how model misspecification can potentially be identified. As a proof of concept, we analyze the two stellar populations of globular cluster NGC 5272 using our model and methods. (BASE-9 is available from GitHub: https://github.com/argiopetech/base/releases).
A Bayesian analysis of trends in ozone sounding data series from 9 Nordic stations
Christiansen, Bo; Jepsen, Nis; Larsen, Niels; Korsholm, Ulrik S.
2016-04-01
Ozone soundings from 9 Nordic stations have been homogenized and interpolated to standard pressure levels. The different stations have very different data coverage; the longest period with data is from the end of the 1980s to 2013. We apply a model that includes low-frequency variability in the form of a polynomial, an annual cycle with harmonics, possible low-frequency variability in the annual amplitude and phasing, and either white noise or AR(1) noise. The fitting of the parameters is performed with a Bayesian approach, yielding not only the posterior mean values but also credible intervals. We find that all stations agree on a well-defined annual cycle in the free troposphere with a relatively confined maximum in the early summer. Regarding the low-frequency variability, we find that Scoresbysund, Ny Aalesund, and Sodankyla show similar structures with a maximum near 2005 followed by a decrease. However, these results are only weakly significant. A significant change in the amplitude of the annual cycle was found only for Ny Aalesund. Here the peak-to-peak amplitude changes from 0.9 to 0.8 mhPa between 1995-2000 and 2007-2012. The results are shown to be robust to the different settings of the model parameters (order of the polynomial, number of harmonics in the annual cycle, type of noise, etc.). The results are also shown to be characteristic of all pressure levels in the free troposphere.
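As a rough non-Bayesian analogue of the regression part of this model, the sketch below builds a design matrix with a polynomial trend plus annual harmonics and fits it by least squares; the paper instead samples a posterior (including AR(1) noise and a time-varying annual cycle) and reports credible intervals, so this is only the skeleton, with made-up data.

```python
import numpy as np

def design_matrix(t_years, poly_order=2, n_harmonics=2):
    cols = [t_years ** k for k in range(poly_order + 1)]   # polynomial trend
    for h in range(1, n_harmonics + 1):                    # annual cycle
        cols.append(np.sin(2 * np.pi * h * t_years))
        cols.append(np.cos(2 * np.pi * h * t_years))
    return np.column_stack(cols)

rng = np.random.default_rng(1)
t = np.linspace(1990, 2013, 500)
y = 0.02 * (t - 2000) + 0.8 * np.sin(2 * np.pi * t) + rng.normal(0, 0.3, t.size)

X = design_matrix(t - 2000)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("trend and harmonic coefficients:", beta)
```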
A Bayesian threshold-normal mixture model for analysis of a continuous mastitis-related trait.
Ødegård, J; Madsen, P; Gianola, D; Klemetsdal, G; Jensen, J; Heringstad, B; Korsgaard, I R
2005-07-01
Mastitis is associated with elevated somatic cell count in milk, inducing a positive correlation between milk somatic cell score (SCS) and the absence or presence of the disease. In most countries, selection against mastitis has focused on selecting parents with genetic evaluations that have low SCS. Univariate or multivariate mixed linear models have been used for statistical description of SCS. However, an observation of SCS can be regarded as drawn from a 2- (or more) component mixture defined by the (usually) unknown health status of a cow at the test-day on which SCS is recorded. A hierarchical 2-component mixture model was developed, assuming that the health status affecting the recorded test-day SCS is completely specified by an underlying liability variable. Based on the observed SCS, inferences can be drawn about disease status and parameters of both SCS and liability to mastitis. The prior probability of putative mastitis was allowed to vary between subgroups (e.g., herds, families), by specifying fixed and random effects affecting both SCS and liability. Using simulation, it was found that a Bayesian model fitted to the data yielded parameter estimates close to their true values. The model provides selection criteria that are more appealing than selection for lower SCS. The proposed model can be extended to handle a wide range of problems related to genetic analyses of mixture traits.
The Rest-Frame Golenetskii Correlation via a Hierarchical Bayesian Analysis
Burgess, J Michael
2015-01-01
Gamma-ray bursts (GRBs) are characterised by a strong correlation between the instantaneous luminosity and the spectral peak energy within a burst. This correlation, which is known as the hardness-intensity correlation or the Golenetskii correlation, not only holds important clues to the physics of GRBs but is thought to have the potential to determine redshifts of bursts. In this paper, I use a hierarchical Bayesian model to study the universality of the rest-frame Golenetskii correlation and in particular I assess its use as a redshift estimator for GRBs. I find that, using a power-law prescription of the correlation, the power-law indices cluster near a common value, but have a broader variance than previously reported ($\\sim 1-2$). Furthermore, I find evidence that there is spread in intrinsic rest-frame correlation normalizations for the GRBs in our sample ($\\sim 10^{51}-10^{53}$ erg s$^{-1}$). This points towards variable physical settings of the emission (magnetic field strength, number of emitting ele...
Mbakwe, Anthony C; Saka, Anthony A; Choi, Keechoo; Lee, Young-Jae
2016-08-01
Highway traffic accidents all over the world result in more than 1.3 million fatalities annually. An alarming number of these fatalities occur in developing countries. There are many risk factors that are associated with frequent accidents, heavy loss of lives, and property damage in developing countries. Unfortunately, poor record-keeping practices are a very difficult obstacle to overcome in striving to obtain near-accurate casualty and safety data. In light of the fact that there are numerous accident causes, any attempt to curb the escalating death and injury rates in developing countries must include the identification of the primary accident causes. This paper, therefore, seeks to show that the Delphi Technique is a suitable alternative method that can be exploited to generate highway traffic accident data through which the major accident causes can be identified. In order to validate the technique, Korea, a country that underwent similar problems in its early stages of development and that maintains excellent highway safety records in its database, is chosen and utilized for this purpose. Validation of the methodology confirms that the technique is suitable for application in developing countries. Furthermore, the Delphi Technique, in combination with the Bayesian Network Model, is utilized in modeling highway traffic accidents and forecasting accident rates in the countries of research. PMID:27183516
Bayesian analysis of white noise levels in the 5-year WMAP data
Groeneboom, N E; Gorski, K; Huey, G; Jewell, J; Wandelt, B
2009-01-01
We develop a new Bayesian method for estimating white noise levels in CMB sky maps, and apply this algorithm to the 5-year WMAP data. We assume that the amplitude of the noise RMS is scaled by a constant value, alpha, relative to a pre-specified noise level. We then derive the corresponding conditional density, P(alpha | s, C_l, d), which is subsequently integrated into a general CMB Gibbs sampler. We first verify our code by analyzing simulated data sets, and then apply the framework to the WMAP data. For the foreground-reduced 5-year WMAP sky maps, we find that the posterior means typically range between alpha=1.005 +- 0.001 and alpha=1.010 +- 0.001 depending on differencing assembly, indicating that the noise level of these maps is underestimated by 0.5-1.0%. The same problem is not observed for the uncorrected WMAP sky maps. The only difference between these two cases is that the nominal white noise level for the foreground-reduced map is specified to be lower than that of the raw maps. This is likely in...
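The scaling model lends itself to a compact illustration: if the residual map r = d - s has pixel noise N(0, (alpha*sigma0_i)^2), the likelihood of alpha can be evaluated on a grid. The sketch below does this for simulated residuals with a flat prior; it is a cartoon of the conditional P(alpha | s, C_l, d), not the paper's Gibbs implementation, and all numbers are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma0 = np.full(100_000, 1.0)                 # nominal per-pixel noise rms
true_alpha = 1.008
r = rng.normal(0.0, true_alpha * sigma0)       # simulated residual map d - s

chi2 = np.sum((r / sigma0) ** 2)
N = r.size
alpha = np.linspace(0.99, 1.03, 2000)
# log-likelihood of a global rms scaling under Gaussian noise, flat prior
log_post = -N * np.log(alpha) - 0.5 * chi2 / alpha**2
post = np.exp(log_post - log_post.max())
post /= np.trapz(post, alpha)

mean = np.trapz(alpha * post, alpha)
sd = np.sqrt(np.trapz((alpha - mean) ** 2 * post, alpha))
print(f"alpha = {mean:.4f} +- {sd:.4f}")       # recovers ~1.008 +- ~0.002
```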
Bayesian analysis of cosmic-ray propagation: evidence against homogeneous diffusion
Jóhannesson, G; Vincent, A C; Moskalenko, I V; Orlando, E; Porter, T A; Strong, A W; Trotta, R; Feroz, F; Graff, P; Hobson, M P
2016-01-01
We present the results of the most complete ever scan of the parameter space for cosmic ray (CR) injection and propagation. We perform a Bayesian search of the main GALPROP parameters, using the MultiNest nested sampling algorithm, augmented by the BAMBI neural network machine learning package. This is the first such study to separate out low-mass isotopes ($p$, $\\bar p$ and He) from the usual light elements (Be, B, C, N, O). We find that the propagation parameters that best fit $p$, $\\bar p$, He data are significantly different from those that fit light elements, including the B/C and $^{10}$Be/$^9$Be secondary-to-primary ratios normally used to calibrate propagation parameters. This suggests each set of species is probing a very different interstellar medium, and that the standard approach of calibrating propagation parameters using B/C can lead to incorrect results. We present posterior distributions and best fit parameters for propagation of both sets of nuclei, as well as for the injection abundances of ...
A Bayesian analysis of redshifted 21-cm HI signal and foregrounds: Simulations for LOFAR
Ghosh, Abhik; Chapman, Emma; Jelic, Vibor
2015-01-01
Observations of the EoR with the 21-cm hyperfine emission of neutral hydrogen (HI) promise to open an entirely new window onto the formation of the first stars, galaxies and accreting black holes. In order to characterize the weak 21-cm signal, we need to develop imaging techniques which can reconstruct the extended emission very precisely. Here, we present an inversion technique for LOFAR baselines at NCP, based on a Bayesian formalism with optimal spatial regularization, which is used to reconstruct the diffuse foreground map directly from the simulated visibility data. We notice the spatial regularization de-noises the images to a large extent, allowing one to recover the 21-cm power spectrum over a considerable $k_{\perp}$-$k_{\parallel}$ space in the range of $0.03\,{\rm Mpc^{-1}}$ ...
Shaya, Ed
2010-01-01
We develop Bayesian statistical methods for discovering and assigning probabilities to non-random (e.g., physical) stellar proper motion companions. They are either presently bound or from previously bound systems. The probabilities depend on similarities in proper motion parallel and perpendicular to the brighter component's motion, parallax, and the local phase-space density of field stars. Control experiments are conducted to understand the behavior of false positives. The technique is applied to the Hipparcos Catalogue within 100 pc. This is the first all-sky survey to locate escaped companions still drifting along with each other out to several parsecs. In both the 25 - 50 and the 50 - 100 pc distance ranges, about 50 high probability companions with separations between 0.01 - 1 pc are found. Evidence is found for a population of several hundred companions separated by 1 - 10 pc. Previously unnoticed naked-eye companions (both with V < 6) of primaries with V < 4 are found: Fomalhaut (α PsA), Mizar, ...
Directory of Open Access Journals (Sweden)
Chen Yidong
2011-10-01
Background: Transcriptional regulation by transcription factors (TFs) controls the time and abundance of mRNA transcription. Due to the limitations of current proteomics technologies, large-scale measurement of the protein-level activities of TFs is usually infeasible, making computational reconstruction of the transcriptional regulatory network a difficult task. Results: We propose a novel Bayesian non-negative factor model for TF-mediated regulatory networks. In particular, the non-negative TF activities and the sample clustering effect are modeled as factors from a Dirichlet process mixture of rectified Gaussian distributions, and the sparse regulatory coefficients are modeled as loadings from a sparse distribution whose sparsity is constrained using knowledge from databases; meanwhile, a Gibbs sampling solution was developed to infer the underlying network structure and the unknown TF activities simultaneously. The developed approach has been applied to a simulated system and to breast cancer gene expression data. Results show that the proposed method was able to systematically uncover the TF-mediated transcriptional regulatory network structure, the regulatory coefficients, the TF protein-level activities, and the sample clustering effect. The regulation target predictions are highly consistent with prior knowledge, and the sample clustering results show superior performance over previous molecule-based clustering methods. Conclusions: The results demonstrate the validity and effectiveness of the proposed approach in reconstructing transcriptional networks mediated by TFs through simulated systems and real data.
Electroencephalograpic coherence
Directory of Open Access Journals (Sweden)
Simon Brežan
2004-08-01
Different brain areas process various aspects of information in parallel as well as in a segregated way. It is not known how this information is integrated into a unitary percept or action. The binding problem is one of the key problems in understanding brain function. Synchronized oscillatory activity of neurons is one possible mechanism for the functional integration of different communicating brain areas. Binding has been well studied in the visual system, but it could also serve as a mechanism in visuomotor integration or in the functional coupling present in other brain processes and behavioural modes (perception, complex motor behaviour, selective attention, learning, working memory, etc.). Interregional synchronization of the electroencephalographic (EEG) signal can be determined by EEG coherence analysis. In this article we present a research example of coherence changes in a visuomotor task. During this task, coherence between visual and motor brain areas increased. This might reflect functional coupling between those areas, but it could also be influenced by other cognitive processes (e.g., selective attention). Coherence analysis is suitable for studying integrative brain function. Because it measures only one of the possible mechanisms of integration, it offers promise especially when combined with other electrophysiological and functional imaging methods.
State-of-the-art in retinal optical coherence tomography image analysis
Baghaie, Ahmadreza; Yu, Zeyun; D’Souza, Roshan M.
2015-01-01
Optical coherence tomography (OCT) is an emerging imaging modality that has been widely used in the field of biomedical imaging. In the recent past, it has found uses as a diagnostic tool in dermatology, cardiology, and ophthalmology. In this paper we focus on its applications in the field of ophthalmology and retinal imaging. OCT is able to non-invasively produce cross-sectional volumetric images of the tissues which can be used for analysis of tissue structure and properties. Due to the und...
Hockett, P; Lux, C; Baumert, T
2015-01-01
The feasibility of complete photoionization experiments, in which the full set of photoionization matrix elements are determined, using multiphoton ionization schemes with polarization-shaped pulses has recently been demonstrated [Hockett et al., Phys. Rev. Lett. 112, 223001 (2014)]. Here we extend our previous work to discuss further details of the numerics and analysis methodology utilised, and compare the results directly to new tomographic photoelectron measurements, which provide a more sensitive test of the validity of the results. In so doing we discuss in detail the physics of the photoionization process, and suggest various avenues and prospects for this coherent multiplexing methodology.
Field measurement and analysis of turbulence coherence for Typhoon Nuri at Macao Friendship Bridge
Institute of Scientific and Technical Information of China (English)
无
2010-01-01
For the purpose of investigating the turbulent and spatially coherent characteristics of strong wind during a typhoon landing period, two 3-dimensional ultrasonic anemometer stations were set up 30 m apart horizontally on the Macao Friendship Bridge to capture the turbulent wind velocities of Typhoon Nuri. Based on the reliable and representative field-measured data, the mean wind speed and direction, turbulence intensity, turbulence integral scale, turbulence power spectra, spatial correlation coefficient and coherence function were statistically evaluated. The field-measurement analysis yielded the following results: 1) The two anemometer stations provided consistent results. The mean wind speed variation in the time domain presented typical M-shaped curves. The strong wind (10-minute mean wind speed higher than 8th grade on the Beaufort wind scale) direction changed over a range of up to 122°, indicating that the field measurements spanned the typhoon landing period. 2) The ratio of the longitudinal, lateral and vertical turbulence intensities of the strong wind in the typhoon eye-wall region was 1:0.96:0.36. Compared with the code-defined ratio 1:0.88:0.5, the lateral component was larger and the vertical component was smaller. 3) The value of the integral scale increased when the eye wall of Typhoon Nuri passed over the field-measurement site. Before the center of Typhoon Nuri arrived, the integral scale of the strong typhoon wind was about twice that of the non-typhoon wind. 4) The spatial correlation of the turbulent wind, the coherence function curve and the decay factor differed significantly at different times during the typhoon process. In the eye wall of the typhoon, the horizontal spatial correlation was relatively strong and the horizontal spatial correlation spectrum decayed more slowly with increasing frequency. The minimum regressed coefficient C in the coherence function model was 4.67, which is lower than the code-defined lower limit. The maximum decay factor
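The decay coefficient C mentioned in result 4) belongs to the classical exponential (Davenport-type) coherence model coh(f) = exp(-C f d / U). A hedged sketch of fitting C to measured root-coherence follows; the separation, mean wind speed, and synthetic data are illustrative, not the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

d, U = 30.0, 20.0                                   # separation (m), mean speed (m/s)

def davenport(f, C):
    """Davenport exponential coherence model."""
    return np.exp(-C * f * d / U)

f = np.linspace(0.01, 1.0, 50)                      # frequency grid, Hz
rng = np.random.default_rng(2)
coh_meas = davenport(f, 4.67) + rng.normal(0, 0.02, f.size)  # synthetic data

C_hat, cov = curve_fit(davenport, f, coh_meas, p0=[8.0])
print("fitted decay coefficient C =", C_hat[0])     # ~4.67 by construction
```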
Incoherent SSI Analysis of Reactor Building using 2007 Hard-Rock Coherency Model
International Nuclear Information System (INIS)
Many strong-earthquake recordings show the response motions at building foundations to be less intense than the corresponding free-field motions. To account for these phenomena, the concept of spatial variation, or wave incoherence, was introduced. Several approaches for its application to practical analysis and design as part of the soil-structure interaction (SSI) effect have been developed. However, conventional wave incoherency models did not reflect the characteristics of earthquake data from hard-rock sites, and their application to practical nuclear structures on hard-rock sites was not sufficiently justified. This paper focuses on the impact of the hard-rock coherency model proposed in 2007 on the incoherent SSI analysis results for a nuclear power plant (NPP) structure. A typical reactor building of a pressurized water reactor (PWR) type NPP is modeled, considering both surface and embedded foundations. The model is assumed to be located on medium-hard rock and hard-rock sites. The SSI analysis results are obtained and compared for coherent and incoherent input motions. The structural responses considering rocking and torsion effects are also investigated
Bayesian peak bagging analysis of 19 low-mass low-luminosity red giants observed with Kepler
Corsaro, E; García, R A
2015-01-01
The currently available Kepler light curves contain an outstanding amount of information, but a detailed analysis of the individual oscillation modes in the observed power spectra, also known as peak bagging, is computationally demanding and challenging to perform on a large number of targets. Our intent is to perform for the first time a peak bagging analysis on a sample of 19 low-mass low-luminosity red giants observed by Kepler for more than four years. This allows us to provide high-quality asteroseismic measurements that can be exploited for intensive testing of the physics used in stellar structure models, stellar evolution and pulsation codes, as well as for refining existing asteroseismic scaling relations in the red giant branch regime. For this purpose, powerful and sophisticated analysis tools are needed. We exploit the Bayesian code Diamonds, using an efficient nested sampling Monte Carlo algorithm, to perform both a fast fitting of the individual oscillation modes and a peak detection test base...
Directory of Open Access Journals (Sweden)
Matthieu Vignes
Modern technologies, and especially next-generation sequencing facilities, are giving cheaper access to genotype and genomic data measured on the same sample at once. This creates an ideal situation for multifactorial experiments designed to infer gene regulatory networks. The fifth "Dialogue for Reverse Engineering Assessments and Methods" (DREAM5) challenges are aimed at assessing methods and associated algorithms devoted to the inference of biological networks. Challenge 3 on "Systems Genetics" proposed to infer causal gene regulatory networks from different genetical genomics data sets. We investigated a wide panel of methods, ranging from Bayesian networks to penalised linear regressions, to analyse such data, and propose a simple yet very powerful meta-analysis which combines these inference methods. We present results of the Challenge as well as a more in-depth analysis of predicted networks in terms of structure and reliability. The developed meta-analysis was ranked first among the 16 teams participating in Challenge 3A. It paves the way for future extensions of our inference method and more accurate gene network estimates in the context of genetical genomics.
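The combination idea can be illustrated with a toy rank-aggregation step: average each candidate edge's rank across the individual inference methods and re-sort. The actual meta-analysis in the paper is more elaborate; the method names and scores below are hypothetical.

```python
import numpy as np

# Edge-confidence scores from three hypothetical inference methods,
# one value per candidate edge (higher = more confident).
scores = {
    "bayesnet": np.array([0.9, 0.1, 0.4, 0.7]),
    "lasso":    np.array([0.8, 0.3, 0.2, 0.9]),
    "corr":     np.array([0.7, 0.2, 0.5, 0.6]),
}

# rank 0 = highest score within each method; average ranks across methods
ranks = np.mean([np.argsort(np.argsort(-s)) for s in scores.values()], axis=0)
order = np.argsort(ranks)        # edges from most to least supported
print("consensus edge ranking:", order)
```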
Sensitivity issues in the bayesian analysis of failures in repairable systems
Energy Technology Data Exchange (ETDEWEB)
Ruggero, F.
2001-07-01
The paper arises from a consulting project in which failures (gas escapes) are considered in the steel pipelines of an urban gas distribution network. The available data are the 33 failure times from 1978 to 1997 over a network of 275 kilometres. More details on the data can be found in (1). Steel pipelines are subject to ageing, and the global system reliability is barely affected by one failure. Thus, we consider the network as a repairable system, since it keeps the same reliability when minimal repairs immediately follow failures. Failures of such systems are often described by non-homogeneous Poisson processes (NHPP), which take into account the degradation of the components; see, e.g., (2). In the paper we model the failure pattern with an NHPP with logarithmic intensity and present different sensitivity analyses when relaxing the assumption on the parametric model. We operate in a robust Bayesian framework; see (3) for a thorough illustration of the approach. In the paper we do not focus on the commonly addressed issue of sensitivity to the prior, but we are interested in model sensitivity and consider two ways to build classes of models around the NHPP. In the first approach, we consider the NHPP as an element of a class of processes, defined through differential equations whose solutions are mean value functions of NHPPs. In the second, nonparametric approach, we consider processes whose mean value functions are distribution functions of random measures. In both cases, we compare the models with the baseline, logarithmic NHPP. In Section 2 we analyse the logarithmic NHPP and the class of parametric models, whereas the comparison between parametric and nonparametric models is performed in Section 3. Some concluding remarks are presented in the final section. (Author) 12 refs.
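For readers unfamiliar with NHPP machinery, the sketch below simulates failure times by Lewis-Shedler thinning under one possible reading of a "logarithmic intensity" (an increasing lambda(t) = a*ln(1+b*t), an assumed form chosen for the ageing pipelines; the paper's exact parametrization may differ).

```python
import numpy as np

def simulate_nhpp(lam, T, lam_max, rng):
    """Simulate event times of an NHPP on [0, T] by Lewis-Shedler thinning:
    draw candidates from a homogeneous process at rate lam_max and accept
    each with probability lam(t) / lam_max."""
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / lam_max)
        if t > T:
            return np.array(times)
        if rng.uniform() < lam(t) / lam_max:
            times.append(t)

a, b, T = 0.5, 0.8, 20.0
lam = lambda t: a * np.log1p(b * t)    # assumed increasing "logarithmic" intensity
rng = np.random.default_rng(3)
events = simulate_nhpp(lam, T, lam_max=lam(T), rng=rng)   # lam is increasing

grid = np.linspace(0.0, T, 1000)
expected = np.trapz(lam(grid), grid)   # mean value function m(T)
print(f"{events.size} failures simulated; expected about {expected:.1f}")
```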
Directory of Open Access Journals (Sweden)
Zhi-Qiang Cai
The prognosis of hepatocellular carcinoma (HCC) after hepatectomy involves many factors. Previous studies have evaluated the separate influences of single factors; few have considered the combined influence of various factors. This paper combines the Bayesian network (BN) with importance measures to identify key factors that have significant effects on survival time. A dataset of 299 patients with HCC after hepatectomy was studied to establish a BN using a tree-augmented naïve Bayes algorithm that could mine relationships between factors. The composite importance measure was applied to rank the impact of factors on survival time. 124 patients (>10 months) and 77 patients (≤10 months) were correctly classified. The accuracy of the BN model was 67.2%. For patients with long survival time (>10 months), the true-positive rate of the model was 83.22% and the false-positive rate was 48.67%. According to the model, the preoperative alpha-fetoprotein (AFP) level and postoperative performance of transcatheter arterial chemoembolization (TACE) were independent factors for survival of HCC patients. The grade of preoperative liver function reflected the tendency for postoperative complications. Intraoperative blood loss, tumor size, portal vein tumor thrombosis (PVTT), time of clamping the porta hepatis, tumor number, operative method, and metastasis were dependent variables in survival time prediction. PVTT was considered the most significant for the prognosis of survival time. Using the BN and importance measures, PVTT was identified as the most significant predictor of survival time for patients with HCC after hepatectomy.
Lu, Dan; Ye, Ming; Curtis, Gary P.
2015-10-01
While Bayesian model averaging (BMA) has been widely used in groundwater modeling, it is infrequently applied to groundwater reactive transport modeling because of multiple sources of uncertainty in the coupled hydrogeochemical processes and because of the long execution time of each model run. To resolve these problems, this study analyzed different levels of uncertainty in a hierarchical way, and used the maximum likelihood version of BMA, i.e., MLBMA, to improve the computational efficiency. This study demonstrates the applicability of MLBMA to groundwater reactive transport modeling in a synthetic case in which twenty-seven reactive transport models were designed to predict the reactive transport of hexavalent uranium (U(VI)) based on observations at a former uranium mill site near Naturita, CO. These reactive transport models contain three uncertain model components, i.e., parameterization of hydraulic conductivity, configuration of model boundary, and surface complexation reactions that simulate U(VI) adsorption. These uncertain model components were aggregated into the alternative models by integrating a hierarchical structure into MLBMA. The modeling results of the individual models and MLBMA were analyzed to investigate their predictive performance. The predictive logscore results show that MLBMA generally outperforms the best model, suggesting that using MLBMA is a sound strategy to achieve more robust model predictions relative to a single model. MLBMA works best when the alternative models are structurally distinct and have diverse model predictions. When correlation in model structure exists, two strategies were used to improve predictive performance by retaining structurally distinct models or assigning smaller prior model probabilities to correlated models. Since the synthetic models were designed using data from the Naturita site, the results of this study are expected to provide guidance for real-world modeling. Limitations of applying MLBMA to the
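The averaging step itself is compact enough to show. Assuming model scores from an information criterion such as KIC (commonly associated with MLBMA in the literature; treat the specifics here as assumptions), posterior model probabilities and the BMA mean/variance decomposition look like this, with illustrative numbers:

```python
import numpy as np

kic = np.array([210.3, 208.1, 215.7])          # one score per calibrated model
prior = np.array([1/3, 1/3, 1/3])              # prior model probabilities

d = kic - kic.min()
w = prior * np.exp(-0.5 * d)
w /= w.sum()                                   # posterior model probabilities

pred = np.array([2.4, 3.1, 1.9])               # each model's point prediction
var = np.array([0.20, 0.15, 0.30])             # within-model predictive variances

mean_bma = np.sum(w * pred)                    # model-averaged mean
# total variance = within-model part + between-model (structural) part
var_bma = np.sum(w * var) + np.sum(w * (pred - mean_bma) ** 2)
print(mean_bma, var_bma)
```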
Krishnamurthy, Krish
2013-12-01
The intrinsic quantitative nature of NMR is increasingly exploited in areas ranging from complex mixture analysis (as in metabolomics and reaction monitoring) to quality assurance/control. Complex NMR spectra are more common than not, and therefore, extraction of quantitative information generally involves significant prior knowledge and/or operator interaction to characterize resonances of interest. Moreover, in most NMR-based metabolomic experiments, the signals from metabolites are normally present as a mixture of overlapping resonances, making quantification difficult. Time-domain Bayesian approaches have been reported to be better than conventional frequency-domain analysis at identifying subtle changes in signal amplitude. We discuss an approach that exploits Bayesian analysis to achieve a complete reduction to amplitude frequency table (CRAFT) in an automated and time-efficient fashion - thus converting the time-domain FID to a frequency-amplitude table. CRAFT uses a two-step approach to FID analysis. First, the FID is digitally filtered and downsampled to several sub FIDs, and secondly, these sub FIDs are then modeled as sums of decaying sinusoids using the Bayesian approach. CRAFT tables can be used for further data mining of quantitative information using fingerprint chemical shifts of compounds of interest and/or statistical analysis of modulation of chemical quantity in a biological study (metabolomics) or process study (reaction monitoring) or quality assurance/control. The basic principles behind this approach as well as results to evaluate the effectiveness of this approach in mixture analysis are presented.
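The modeling step ("sub-FIDs as sums of decaying sinusoids") can be prototyped with nonlinear least squares in place of the Bayesian machinery; the sketch below fits a single synthetic component, and all signal parameters are made up.

```python
import numpy as np
from scipy.optimize import least_squares

fs = 1000.0
t = np.arange(1024) / fs
rng = np.random.default_rng(4)
# synthetic complex FID: one decaying sinusoid plus complex white noise
fid = (2.0 * np.exp(-t * 8.0) * np.exp(2j * np.pi * 55.0 * t)
       + 0.05 * (rng.standard_normal(t.size) + 1j * rng.standard_normal(t.size)))

def residual(p):
    amp, decay, freq, phase = p
    model = amp * np.exp(-t * decay) * np.exp(1j * (2 * np.pi * freq * t + phase))
    r = fid - model
    return np.concatenate([r.real, r.imag])    # least_squares needs real output

fit = least_squares(residual, x0=[1.0, 5.0, 54.0, 0.0])
print("amplitude, decay, frequency, phase:", fit.x)   # ~[2.0, 8.0, 55.0, 0.0]
```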
Bayesian theory and applications
Dellaportas, Petros; Polson, Nicholas G; Stephens, David A
2013-01-01
The development of hierarchical models and Markov chain Monte Carlo (MCMC) techniques forms one of the most profound advances in Bayesian analysis since the 1970s and provides the basis for advances in virtually all areas of applied and theoretical Bayesian statistics. This volume guides the reader along a statistical journey that begins with the basic structure of Bayesian theory, and then provides details on most of the past and present advances in this field. The book has a unique format. There is an explanatory chapter devoted to each conceptual advance followed by journal-style chapters that provide applications or further advances on the concept. Thus, the volume is both a textbook and a compendium of papers covering a vast range of topics. It is appropriate for a well-informed novice interested in understanding the basic approach, methods and recent applications. Because of its advanced chapters and recent work, it is also appropriate for a more mature reader interested in recent applications and devel...
Liu, Chih-Hao; Qi, Ji; Lu, Jing; Wang, Shang; Wu, Chen; Shih, Wei-Chuan; Larin, Kirill V.
2014-02-01
Optical coherence tomography (OCT) is an optical imaging technique that is capable of performing high-resolution (approaching the histopathology level) and real-time imaging of tissues without the use of contrast agents. Based on these advantages, the pathological features of tumors (at the micro scale) can be identified during resection surgery. However, the accuracy of tumor margin prediction still needs to be enhanced to assist surgeons' judgment. In this regard, we present a two-dimensional computational method for advanced tissue analysis and characterization based on optical coherence tomography (OCT) and Raman spectroscopy (RS). The method combines the slope of the OCT intensity signal and the principal components (PCs) of RS, and relies on the tissue optical attenuation and chemical ingredients for the classification of tissue types. Our pilot experiments were performed on mouse kidney, liver and small intestine. Results demonstrate improved tissue differentiation compared with analysis based on OCT detection alone. This combined OCT/RS method is potentially useful as a novel optical biopsy technique for cancer detection.
Zhang, Junhong; Wang, Jian; Lin, Jiewei; Bi, Fengrong; Guo, Qian; Chen, Kongwu; Ma, Liang
2015-09-01
As the essential foundation of noise reduction, many noise source identification methods have been developed and applied in engineering practice. To identify the noise sources of different engine parts over a broad frequency band at various typical speeds, this paper presents an integrated noise source identification method based on the ensemble empirical mode decomposition (EEMD), coherent power spectrum analysis, and an improved analytic hierarchy process (AHP). The measured noise is decomposed into several IMFs with physical meaning, which ensures that the coherence analysis of the IMFs and the vibration signals is meaningful. An improved AHP is developed by introducing an objective weighting function to replace the traditional subjective evaluation, which makes the results no longer dependent on subjective judgment and provides better consistency at the same time. The proposed noise identification model is applied to identifying diesel engine surface-radiated noise. As a result, the frequency-dependent contributions of different engine parts to different test points at different speeds are obtained, and an overall weight order is obtained as oil pan > left body > valve chamber cover > gear chamber casing > right body > flywheel housing, which provides effectual guidance for noise reduction.
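For context, the classical AHP step that the improved version replaces computes priority weights as the principal eigenvector of a pairwise comparison matrix; here is a minimal sketch with a hypothetical 3x3 judgment matrix (not the paper's objective weighting function).

```python
import numpy as np

# Pairwise comparison matrix for three hypothetical engine parts
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()                       # priority weights

# Saaty's consistency index: CI = (lambda_max - n) / (n - 1)
n = A.shape[0]
CI = (vals[k].real - n) / (n - 1)
print("weights:", w, "CI:", CI)
```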
Feroz, Farhan
2013-01-01
GJ667C is the least massive component of a triple star system which lies at a distance of about 6.8 pc (22.1 light-years) from Earth. GJ667C has received much attention recently due to claims that it hosts up to seven planets, including three super-Earths inside the habitable zone. We present a Bayesian technique for the analysis of radial velocity (RV) data-sets in the presence of a correlated noise component ("red noise") with unknown parameters. We also introduce hyper-parameters in our model in order to deal statistically with under- or over-estimated error bars on measured RVs as well as inconsistencies between different data-sets. Applying this method to the RV data-set of GJ667C, we show that this data-set contains a significant correlated (red) noise component, with a correlation timescale for HARPS data of order 9 days. Our analysis shows that the data only provide strong evidence for the presence of two planets: GJ667Cb and c with periods 7.19 d and 28.13 d respectively, with some hints towards the ...
Walter, William D.; Smith, Rick; Vanderklok, Mike; VerCauterren, Kurt C.
2014-01-01
Bovine tuberculosis is a bacterial disease caused by Mycobacterium bovis in livestock and wildlife, with hosts that include Eurasian badgers (Meles meles), brushtail possum (Trichosurus vulpecula), and white-tailed deer (Odocoileus virginianus). Risk-assessment efforts in Michigan have been initiated on farms to minimize interactions of cattle with wildlife hosts, but research on M. bovis on cattle farms has not investigated the spatial context of disease epidemiology. To incorporate spatially explicit data, initial likelihood of infection probabilities for cattle farms tested for M. bovis, prevalence of M. bovis in white-tailed deer, deer density, and environmental variables for each farm were modeled in a Bayesian hierarchical framework. We used geo-referenced locations of 762 cattle farms that have been tested for M. bovis, white-tailed deer prevalence, and several environmental variables that may lead to long-term survival and viability of M. bovis on farms and surrounding habitats (i.e., soil type, habitat type). Bayesian hierarchical analyses identified deer prevalence and the proportion of sandy soil within our sampling grid as the most supported model. Analysis of cattle farms tested for M. bovis indicated that every 1% increase in sandy soil was associated with a 4% increase in the odds of infection. Our analysis revealed that the influence of M. bovis prevalence in white-tailed deer was still a concern even after considerable efforts to prevent cattle interactions with white-tailed deer through on-farm mitigation and reduction in the deer population. Cattle farms test positive for M. bovis annually in our study area, suggesting that an environmental source, either on farms or in the surrounding landscape, may be contributing to new infections or re-infections with M. bovis. Our research provides an initial assessment of potential environmental factors that could be incorporated into additional modeling efforts as more knowledge of deer herd
Echeverria, Alex; Silva, Jorge F.; Mendez, Rene A.; Orchard, Marcos
2016-10-01
Context. The best precision that can be achieved to estimate the location of a stellar-like object is a topic of permanent interest in the astrometric community. Aims: We analyze bounds for the best position estimation of a stellar-like object on a CCD detector array in a Bayesian setting where the position is unknown, but where we have access to a prior distribution. In contrast to a parametric setting where we estimate a parameter from observations, the Bayesian approach estimates a random object (i.e., the position is a random variable) from observations that are statistically dependent on the position. Methods: We characterize the Bayesian Cramér-Rao (CR) that bounds the minimum mean square error (MMSE) of the best estimator of the position of a point source on a linear CCD-like detector, as a function of the properties of detector, the source, and the background. Results: We quantify and analyze the increase in astrometric performance from the use of a prior distribution of the object position, which is not available in the classical parametric setting. This gain is shown to be significant for various observational regimes, in particular in the case of faint objects or when the observations are taken under poor conditions. Furthermore, we present numerical evidence that the MMSE estimator of this problem tightly achieves the Bayesian CR bound. This is a remarkable result, demonstrating that all the performance gains presented in our analysis can be achieved with the MMSE estimator. Conclusions: The Bayesian CR bound can be used as a benchmark indicator of the expected maximum positional precision of a set of astrometric measurements in which prior information can be incorporated. This bound can be achieved through the conditional mean estimator, in contrast to the parametric case where no unbiased estimator precisely reaches the CR bound.
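A numerical cartoon of the estimator discussed here (a 1-D pixel array, Gaussian PSF, Poisson counts, a Gaussian prior on position, and the conditional mean computed on a grid) is sketched below; every number is illustrative rather than taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(5)
pixels = np.arange(32, dtype=float)            # pixel centers
sigma_psf, flux, bkg = 1.5, 500.0, 10.0        # PSF width, counts, background

def expected_counts(xc):
    """Expected counts per pixel for a source centered at xc."""
    psf = np.exp(-0.5 * ((pixels - xc) / sigma_psf) ** 2)
    return flux * psf / psf.sum() + bkg

prior_mu, prior_sd = 16.0, 2.0
x_true = rng.normal(prior_mu, prior_sd)        # position drawn from the prior
data = rng.poisson(expected_counts(x_true))

# posterior on a grid: log p(x|d) = log prior + Poisson log-likelihood
grid = np.linspace(8.0, 24.0, 800)
logp = -0.5 * ((grid - prior_mu) / prior_sd) ** 2
for i, x in enumerate(grid):
    lam = expected_counts(x)
    logp[i] += np.sum(data * np.log(lam) - lam)

p = np.exp(logp - logp.max())
p /= np.trapz(p, grid)
x_mmse = np.trapz(grid * p, grid)              # conditional-mean (MMSE) estimate
print(f"true {x_true:.3f}  MMSE {x_mmse:.3f}")
```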
Teng, Ming; Johnson, Timothy; Nathoo, Farouk
2016-01-01
We consider Bayesian computation for a Bayesian fMRI time series model with spatial priors. A previously derived variational Bayes (VB) algorithm based on a mean field approximation is currently implemented in the Statistical Parametric Mapping (SPM) software. To examine the accuracy of this VB approximation we derive Hamiltonian Monte Carlo (HMC) for this model and conduct simulation studies to compare its performance with VB in terms of estimation accuracy, posterior variability, the spatia...
Hayama, Kazuhiro; Kotake, Kei; Takiwaki, Tomoya
2015-01-01
Using predictions from three-dimensional (3D) hydrodynamics simulations of core-collapse supernovae (CCSNe), we present a coherent network analysis for the detection, reconstruction, and source localization of gravitational-wave (GW) signals. By combining this with GW spectrogram analysis, we show that several important hydrodynamics features imprinted in the original waveforms persist in the waveforms of the reconstructed signals. The characteristic excess in the GW spectrograms originates not only from rotating core collapse and bounce and the subsequent ring-down of the proto-neutron star (PNS), as previously identified, but also from the formation of magnetohydrodynamics jets and non-axisymmetric instabilities in the vicinity of the PNS. Regarding the GW signals emitted near the rotating core bounce, the horizon distance, which we set by an SNR exceeding 8, extends up to $\sim$ 18 kpc for the most rapidly rotating 3D model among the employed waveform libraries. Following the rotating core bounce, the domi...
Bayesian artificial intelligence
Korb, Kevin B
2003-01-01
As the power of Bayesian techniques has become more fully realized, the field of artificial intelligence has embraced Bayesian methodology and integrated it to the point where an introduction to Bayesian techniques is now a core course in many computer science programs. Unlike other books on the subject, Bayesian Artificial Intelligence keeps mathematical detail to a minimum and covers a broad range of topics. The authors integrate all of Bayesian net technology and learning Bayesian net technology and apply them both to knowledge engineering. They emphasize understanding and intuition but also provide the algorithms and technical background needed for applications. Software, exercises, and solutions are available on the authors' website.
Bayesian artificial intelligence
Korb, Kevin B
2010-01-01
Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundation, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology.New to the Second EditionNew chapter on Bayesian network classifiersNew section on object-oriente
Applying support vector regression analysis on grip force level-related corticomuscular coherence
DEFF Research Database (Denmark)
Rong, Yao; Han, Xixuan; Hao, Dongmei;
2014-01-01
… in an accessory muscle, this study proposed an expanded support vector regression (ESVR) algorithm to quantify the coherence between the electroencephalogram (EEG) from the sensorimotor cortex and the surface electromyogram (EMG) from the brachioradialis in the upper limb. A measure called coherence proportion was introduced to compare the corticomuscular coherence in the alpha (7-15 Hz), beta (15-30 Hz) and gamma (30-45 Hz) bands at 25% maximum grip force (MGF) and 75% MGF. Results show that ESVR could reduce the influence of deflected signals and summarize the overall behavior of multiple coherence curves. Coherence proportion...
Institute of Scientific and Technical Information of China (English)
Wei Lu; Liren Liu; Jianfeng Sun
2006-01-01
A new factor M is proposed to characterize the similarity of the behavior of a partially coherent beam (PCB) to its coherent counterpart when propagating through atmospheric turbulence. It is shown that there exists a boundary in the range of the source coherence length. The rates at which the free-space angular spreading and the turbulence distance decrease with the source coherence length differ before and after the coherence length reaches this boundary value, from which the trend of M can be inferred.
X-ray standing wave analysis of nanostructures using partially coherent radiation
Energy Technology Data Exchange (ETDEWEB)
Tiwari, M. K., E-mail: mktiwari@rrcat.gov.in; Das, Gangadhar [Indus Synchrotrons Utilization Division, Raja Ramanna Centre for Advanced Technology, Indore-452013, Madhya Pradesh (India); Bedzyk, M. J. [Departments of Materials Science & Engineering and Physics & Astronomy, Northwestern University, Evanston, Illinois 60208 (United States)
2015-09-07
The effect of longitudinal (or temporal) coherence on total-reflection-assisted x-ray standing wave (TR-XSW) analysis of nanoscale materials is quantitatively demonstrated by showing how the XSW fringe visibility can be strongly damped by decreasing the spectral resolution of the incident x-ray beam. The correction for nonzero wavelength dispersion (δλ ≠ 0) of the incident x-ray wave field is accounted for in the model computations of TR-XSW-assisted angle-dependent fluorescence yields of nanostructure coatings on x-ray mirror surfaces. Examples given include 90 nm diameter Au nanospheres deposited on a Si(100) surface and a 3 nm thick Zn layer trapped on top of a 100 nm Langmuir-Blodgett film coating on a Au mirror surface. The present method opens up important applications, such as enabling XSW studies of large-dimensioned nanostructures using conventional laboratory-based partially coherent x-ray sources.
Zbylut-Górska, Maria; Kosek, Wiesław; Wnęk, Agnieszka; Młocek, Wojciech; Rutkowska, Agnieszka; Popiński, Waldemar; Niedzielski, Tomasz
2016-04-01
One of the main causes of sea level variations is the steric effect caused by changes of local sea surface temperature (SST). To show how the altimetric Sea Level Anomaly (SLA) data are related to the SST data, correlation coefficients between them were computed as a function of geographic location. The analysis showed a high positive correlation (about 0.7), especially in the northern and south-eastern parts of the Pacific Ocean and a large part of the Atlantic Ocean. There is a negative correlation of about 0.5 in the south-east part of the Indian Ocean, in the Arafura Sea and in the Red Sea. In addition, the time-frequency coherence and semblance functions between the SLA and SST data were calculated using a Fourier transform band-pass filter. Maps of these coherence and semblance functions were computed in frequency bands corresponding to the annual oscillation and its integer multiples. The most important contribution to the correlation coefficient values comes from the annual oscillation in the SST and SLA data.
Baron, Philippe; Ishii, Shoken; Mizutani, Kohei; Itabe, Toshikazu; Yasui, Motoaki
2012-11-01
In the last decade the precision of coherent Doppler differential absorption lidar (DIAL) has been greatly improved in the near and middle infra-red domains for measuring greenhouse gases such as CO2 and CH4, as well as winds. The National Institute of Information and Communications Technology (NICT, Japan) has developed and is operating a CO2- and wind-measuring ground-based coherent DIAL at 2.05 μm (4878 cm-1). The application of this technology from space is now being considered. In this analysis we study the use of the NICT DIAL for profiling tropospheric water vapour from space. We present the methodology used to select the spectral lines and summarize the results for the selected lines between 4000 and 7000 cm-1. The choices of frequency offset, pulse energy and pulse repetition frequency are discussed. Retrieval simulations are shown for the line at 4580 cm-1 (2.18 μm), suitable for the boundary layer, and for the stronger line at 5621 cm-1 (1.78 μm), suitable for sounding the boundary layer and the middle troposphere.
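The retrieval itself rests on the standard two-wavelength DIAL equation, which is easy to state in code; this is the textbook form, not NICT's full processing chain, and all names and numbers are placeholders.

```python
import numpy as np

def dial_number_density(p_on, p_off, r, dsigma):
    """Number density per range gate from on/off-line backscattered powers:
    n = ln[P_on(r1) P_off(r2) / (P_on(r2) P_off(r1))] / (2 dsigma dr),
    with r2 > r1 adjacent gates and dsigma the differential cross-section."""
    dr = np.diff(r)
    ratio = (p_on[:-1] * p_off[1:]) / (p_on[1:] * p_off[:-1])
    return np.log(ratio) / (2.0 * dsigma * dr)

# Example with synthetic profiles (illustrative numbers only):
r = np.linspace(500.0, 3000.0, 26)                 # range gates, m
n_true = 3e23                                      # molecules per m^3
dsigma = 1e-27                                     # m^2
p_off = np.exp(-2 * 1e-5 * r)                      # generic attenuation
p_on = p_off * np.exp(-2 * dsigma * n_true * r)    # extra on-line absorption
print(dial_number_density(p_on, p_off, r, dsigma)) # ~3e23 throughout
```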
Chen, Yu-Pei; Guo, Rui; Liu, Na; Liu, Xu; Mao, Yan-Ping; Tang, Ling-Long; Zhou, Guan-qun; Lin, Ai-Hua; Sun, Ying; Ma, Jun
2015-01-01
Background: Due to the lack of studies, it remains unclear whether adding neoadjuvant chemotherapy (NACT) to concurrent chemoradiotherapy (CCRT) is superior to CCRT alone for locoregionally advanced nasopharyngeal carcinoma (NPC). The main objective of this Bayesian network meta-analysis was to determine the efficacy of NACT+CCRT as compared with CCRT alone. Methods: We comprehensively searched databases and extracted data from randomized controlled trials involving NPC patients who r...
Owens Chantelle J; Owusu-Edusei Kwame
2009-01-01
Background: Chlamydia continues to be the most prevalent disease in the United States. Effective spatial monitoring of chlamydia incidence is important for successful implementation of control and prevention programs. The objective of this study is to apply Bayesian smoothing and exploratory spatial data analysis (ESDA) methods to monitor Texas county-level chlamydia incidence rates by examining spatiotemporal patterns. We used county-level data on chlamydia incidence (for all ages, g...
Variational Bayesian method of estimating variance components.
Arakawa, Aisaku; Taniguchi, Masaaki; Hayashi, Takeshi; Mikawa, Satoshi
2016-07-01
We developed a Bayesian analysis approach that uses a variational inference method, the so-called variational Bayesian method, to determine the posterior distributions of variance components. This variational Bayesian method and an alternative Bayesian method using Gibbs sampling were compared in estimating genetic and residual variance components from both simulated data and publicly available real pig data. In the simulated data set, we observed strong bias toward overestimation of genetic variance for the variational Bayesian method in the case of low heritability and small population size, and less bias was detected with larger population sizes in both methods examined. No differences in the estimates of variance components between the variational Bayesian method and Gibbs sampling were found in the real pig data. However, the posterior distributions of the variance components obtained with the variational Bayesian method had shorter tails than those obtained with Gibbs sampling. Consequently, the posterior standard deviations of the genetic and residual variances from the variational Bayesian method were lower than those from Gibbs sampling. The computing time required was much shorter with the variational Bayesian method than with Gibbs sampling.
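The reported difference in tail behaviour is easy to reproduce on a much simpler model than the authors' genetic one. The sketch below (a normal mean/variance model with a mean-field factorization; priors and data are invented) compares the posterior standard deviation of the variance from Gibbs sampling with that of the variational approximation:

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.normal(2.0, 1.5, size=50)
n, a0, b0, tau0 = len(y), 2.0, 2.0, 1e-2   # InvGamma(a0,b0) on s2, N(0,1/tau0) on mu

# --- Gibbs sampling (conjugate conditional updates) ---
mu, s2, draws = 0.0, 1.0, []
for it in range(6000):
    prec = n / s2 + tau0
    mu = rng.normal((y.sum() / s2) / prec, np.sqrt(1 / prec))
    s2 = 1 / rng.gamma(a0 + n / 2, 1 / (b0 + 0.5 * ((y - mu) ** 2).sum()))
    if it >= 1000:
        draws.append(s2)
draws = np.array(draws)

# --- Variational Bayes (CAVI) with q(mu)q(s2), q(s2) = InvGamma(a, b) ---
m, v, a, b = 0.0, 1.0, a0 + n / 2, 1.0
for _ in range(100):
    e_prec = a / b                          # E_q[1/s2]
    v = 1 / (n * e_prec + tau0)             # q(mu) variance
    m = v * e_prec * y.sum()                # q(mu) mean
    b = b0 + 0.5 * (((y - m) ** 2).sum() + n * v)
sd_vb = b / ((a - 1) * np.sqrt(a - 2))      # sd of InvGamma(a, b)
print(f"posterior sd of s2: Gibbs {draws.std():.3f}  VB {sd_vb:.3f}")
```

Because the mean-field factorization ignores the posterior coupling between mean and variance, the variational posterior is typically too narrow, matching the shorter tails reported above.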
Indian Academy of Sciences (India)
Tao Wei; Xiao Xiao Jin; Tian Jun Xu
2013-08-01
To understand the phylogenetic position of Bostrychus sinensis in Eleotridae and the phylogenetic relationships of the family, we determined the nucleotide sequence of the mitochondrial (mt) genome of Bostrychus sinensis. It is the first complete mitochondrial genome sequence of the genus Bostrychus. The entire mtDNA sequence was 16508 bp in length, with a standard set of 13 protein-coding genes, 22 transfer RNA genes (tRNAs), two ribosomal RNA genes (rRNAs) and a noncoding control region. The mitochondrial genome of B. sinensis shares common features with those of other bony fishes with respect to gene arrangement, base composition, and tRNA structures. Phylogenetic hypotheses within the Eleotridae have been controversial at the genus level. We used the mitochondrial cytochrome b (cytb) gene sequence to examine phylogenetic relationships within the Eleotridae using a partitioned Bayesian method. When specific models and parameter estimates were assumed for the partitions of the total data, the harmonic mean −lnL was improved. The phylogenetic analysis supported the monophyly of Hypseleotris and Gobiomorphs. In addition, Bostrychus was most closely related to Ophiocara, and Philypnodon is sister to Microphlypnus, based on the current datasets. Further, extensive taxonomic sampling and more molecular information are needed to confirm the phylogenetic relationships within the Eleotridae.
How to avoid mismodelling in GLM-based fMRI data analysis: cross-validated Bayesian model selection.
Soch, Joram; Haynes, John-Dylan; Allefeld, Carsten
2016-11-01
Voxel-wise general linear models (GLMs) are a standard approach for analyzing functional magnetic resonance imaging (fMRI) data. An advantage of GLMs is that they are flexible and can be adapted to the requirements of many different data sets. However, the specification of first-level GLMs leaves the researcher with many degrees of freedom, which is problematic given recent efforts to ensure robust and reproducible fMRI data analysis. Formal model comparisons that allow a systematic assessment of GLMs are only rarely performed. On the one hand, too simple models may underfit data and leave real effects undiscovered. On the other hand, too complex models might overfit data and also reduce statistical power. Here we present a systematic approach termed cross-validated Bayesian model selection (cvBMS) that allows one to decide which GLM best describes a given fMRI data set. Importantly, our approach allows for non-nested model comparison, i.e. comparing more than two models that do not just differ by adding one or more regressors. It also allows for spatially heterogeneous modelling, i.e. using different models for different parts of the brain. We validate our method using simulated data and demonstrate potential applications to empirical data. The increased use of model comparison and model selection should increase the reliability of GLM results and the reproducibility of fMRI studies. PMID:27477536
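A simplified stand-in for the idea is sketched below (cvBMS itself scores each model by a voxel-wise variational-Bayes log model evidence computed per run; here plain OLS fits and Gaussian predictive log-likelihoods are used instead, and the data are synthetic):

```python
import numpy as np
from scipy import stats

# Hold out one "run", fit each candidate GLM on the rest, and score the
# held-out run by its predictive log-likelihood; the best-scoring model wins.
rng = np.random.default_rng(2)
runs, T = 4, 100
X_full = rng.standard_normal((runs, T, 2))            # two candidate regressors
y = 1.0 * X_full[..., 0] + rng.standard_normal((runs, T))  # only regressor 0 is real

def cv_score(cols):
    score = 0.0
    for held in range(runs):
        tr = [r for r in range(runs) if r != held]
        Xtr = X_full[tr][..., cols].reshape(-1, len(cols))
        ytr = y[tr].ravel()
        beta, *_ = np.linalg.lstsq(Xtr, ytr, rcond=None)
        sigma = np.std(ytr - Xtr @ beta)
        Xte = X_full[held][:, cols]
        score += stats.norm.logpdf(y[held] - Xte @ beta, scale=sigma).sum()
    return score

print("model 1 (regressor 0):     ", cv_score([0]))
print("model 2 (both regressors): ", cv_score([0, 1]))
```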
International Nuclear Information System (INIS)
During the last three decades, several techniques have been developed for the quantitative study of human reliability. In the 1980s, techniques were developed to model systems by means of binary trees, which did not allow for the representation of the context in which human actions occur. Thus, these techniques cannot represent individuals, their interrelationships, and the dynamics of a system. These issues make the improvement of methods for Human Reliability Analysis (HRA) a pressing need. To eliminate, or at least attenuate, these limitations, some authors have proposed modeling systems using Bayesian Belief Networks (BBNs). The application of these tools is expected to address many of the deficiencies in current approaches to modeling human actions with binary trees. This paper presents a methodology based on BBNs for analyzing human reliability and applies it to the operation of an oil tanker, focusing on the risk of collision accidents. The resulting model was used to determine the most likely sequence of hazardous events and thus isolate critical activities in the operation of the ship, in order to study the Internal Factors (IFs), Skills, and Management and Organizational Factors (MOFs) that should receive more attention for risk reduction.
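A toy version of such a network can be written with the pgmpy library; the structure and all probabilities below are invented for illustration and are not the paper's model:

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Organisational factors and skills drive the probability of a human error,
# which drives the probability of a collision. State 0 = adequate, 1 = poor.
model = BayesianNetwork([("MOF", "Error"), ("Skill", "Error"),
                         ("Error", "Collision")])
model.add_cpds(
    TabularCPD("MOF", 2, [[0.8], [0.2]]),
    TabularCPD("Skill", 2, [[0.9], [0.1]]),
    TabularCPD("Error", 2,                       # P(Error | MOF, Skill)
               [[0.99, 0.90, 0.85, 0.50],        # no error
                [0.01, 0.10, 0.15, 0.50]],       # error
               evidence=["MOF", "Skill"], evidence_card=[2, 2]),
    TabularCPD("Collision", 2, [[0.999, 0.80], [0.001, 0.20]],
               evidence=["Error"], evidence_card=[2]))
assert model.check_model()
posterior = VariableElimination(model).query(["MOF"], evidence={"Collision": 1})
print(posterior)
```

The diagnostic query at the end runs the network "backwards": given that a collision occurred, which organisational conditions are most probable, mirroring how the paper isolates critical factors.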
Directory of Open Access Journals (Sweden)
Alan F. Sasso
2012-01-01
A lipid-based physiologically based toxicokinetic (PBTK) model has been developed for a mixture of six polychlorinated biphenyls (PCBs) in rats. The aim of this study was to apply population Bayesian analysis to a lipid PBTK model, while incorporating an internal exposure-response model linking enzyme induction and metabolic rate. Lipid-based physiologically based toxicokinetic models are a subset of PBTK models that can simulate concentrations of highly lipophilic compounds in tissue lipids without the need for partition coefficients. A hierarchical treatment of population metabolic parameters and a CYP450 induction model were incorporated into the lipid-based PBTK framework, and Markov chain Monte Carlo was applied to in vivo data. A mass balance of CYP1A and CYP2B in the liver was necessary to model PCB metabolism at high doses. The linked PBTK/induction model remained on a lipid basis and was capable of modeling PCB concentrations in multiple tissues for all dose levels and dose profiles.
Lyons, James E.; Kendall, William; Royle, J. Andrew; Converse, Sarah J.; Andres, Brad A.; Buchanan, Joseph B.
2016-01-01
We present a novel formulation of a mark–recapture–resight model that allows estimation of population size, stopover duration, and arrival and departure schedules at migration areas. Estimation is based on encounter histories of uniquely marked individuals and relative counts of marked and unmarked animals. We use a Bayesian analysis of a state–space formulation of the Jolly–Seber mark–recapture model, integrated with a binomial model for counts of unmarked animals, to derive estimates of population size and arrival and departure probabilities. We also provide a novel estimator for stopover duration that is derived from the latent state variable representing the interim between arrival and departure in the state–space model. We conduct a simulation study of field sampling protocols to understand the impact of superpopulation size, proportion marked, and number of animals sampled on bias and precision of estimates. Simulation results indicate that relative bias of estimates of the proportion of the population with marks was low for all sampling scenarios and never exceeded 2%. Our approach does not require enumeration of all unmarked animals detected or direct knowledge of the number of marked animals in the population at the time of the study. This provides flexibility and potential application in a variety of sampling situations (e.g., migratory birds, breeding seabirds, sea turtles, fish, pinnipeds, etc.). Application of the methods is demonstrated with data from a study of migratory sandpipers.
International Nuclear Information System (INIS)
Network data analysis methods are the only way to properly separate real gravitational wave (GW) transient events from detector noise. They can be divided into two generic classes: the coincidence method and the coherent analysis. The former uses lists of selected events provided by each interferometer belonging to the network and tries to correlate them in time to identify a physical signal. Instead of this binary treatment of detector outputs (signal present or absent), the latter method involves first merging the interferometer data and looking for a common pattern, consistent with an assumed GW waveform and a given source location in the sky. Thresholds are only applied later, to validate or reject the hypothesis made. As coherent algorithms use more complete information than coincidence methods, they are expected to provide better detection performance, but at a higher computational cost. An efficient filter must yield a good compromise between a low false alarm rate (hence triggering on data at a manageable rate) and a high detection efficiency. Therefore, the comparison of the two approaches is carried out using so-called receiver operating characteristics (ROC), giving the relationship between the false alarm rate and the detection efficiency for a given method. This paper investigates this question via Monte Carlo simulations, using the network model developed in a previous article. Its main conclusions are the following. First, a three-interferometer network such as Virgo-LIGO is found to be too small to reach good detection efficiencies at low false alarm rates: larger configurations are needed to reach a confidence level high enough to validate a detected event as a true GW. In addition, an efficient network must contain interferometers with comparable sensitivities: studying the three-interferometer LIGO network shows that the 2-km interferometer, with half the sensitivity of the others, leads to a strong reduction of performance as compared to a network of three…
Congdon, Peter
2014-01-01
This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBU
Bayesian Geostatistical Design
DEFF Research Database (Denmark)
Diggle, Peter; Lophaven, Søren Nymand
2006-01-01
This paper describes the use of model-based geostatistics for choosing the set of sampling locations, collectively called the design, to be used in a geostatistical analysis. Two types of design situation are considered. These are retrospective design, which concerns the addition of sampling locations to, or deletion of locations from, an existing design, and prospective design, which consists of choosing positions for a new set of sampling locations. We propose a Bayesian design criterion which focuses on the goal of efficient spatial prediction whilst allowing for the fact that model parameters are unknown.
Burn, Robert W; Underwood, Fiona M; Blanc, Julian
2011-01-01
Elephant poaching and the ivory trade remain high on the agenda at meetings of the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). Well-informed debates require robust estimates of trends, the spatial distribution of poaching, and drivers of poaching. We present an analysis of trends and drivers of an indicator of elephant poaching of all elephant species. The site-based monitoring system known as Monitoring the Illegal Killing of Elephants (MIKE), set up by the 10th Conference of the Parties of CITES in 1997, produces carcass encounter data reported mainly by anti-poaching patrols. Data analyzed were site by year totals of 6,337 carcasses from 66 sites in Africa and Asia from 2002-2009. Analysis of these observational data is a serious challenge to traditional statistical methods because of the opportunistic and non-random nature of patrols, and the heterogeneity across sites. Adopting a Bayesian hierarchical modeling approach, we used the proportion of carcasses that were illegally killed (PIKE) as a poaching index, to estimate the trend and the effects of site- and country-level factors associated with poaching. Important drivers of illegal killing that emerged at country level were poor governance and low levels of human development, and at site level, forest cover and area of the site in regions where human population density is low. After a drop from 2002, PIKE remained fairly constant from 2003 until 2006, after which it increased until 2008. The results for 2009 indicate a decline. Sites with PIKE ranging from the lowest to the highest were identified. The results of the analysis provide a sound information base for scientific evidence-based decision making in the CITES process. PMID:21912670
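The core of such a hierarchical treatment of PIKE can be sketched in a few lines of PyMC (a simplified model with synthetic data; the paper's model additionally includes covariates such as governance and human-development indices, and temporal structure):

```python
import numpy as np
import pymc as pm

# Hierarchical binomial model for PIKE-style data: each site has a latent
# poaching level drawn from a common population distribution on the logit scale.
rng = np.random.default_rng(3)
n_sites = 20
carcasses = rng.integers(10, 120, size=n_sites)        # total carcasses per site
illegal = rng.binomial(carcasses, 0.4)                 # illegally-killed counts

with pm.Model() as pike_model:
    mu = pm.Normal("mu", 0.0, 1.5)                     # population mean (logit)
    sigma = pm.HalfNormal("sigma", 1.0)                # between-site spread
    site = pm.Normal("site", mu, sigma, shape=n_sites)
    p = pm.Deterministic("pike", pm.math.invlogit(site))
    pm.Binomial("y", n=carcasses, p=p, observed=illegal)
    idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)

print(idata.posterior["pike"].mean(dim=("chain", "draw")))
```

The partial pooling across sites is what lets low-carcass sites borrow strength from the rest of the network, which is important given the heterogeneity the abstract mentions.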
A default Bayesian hypothesis test for ANOVA designs
R. Wetzels; R.P.P.P. Grasman; E.J. Wagenmakers
2012-01-01
This article presents a Bayesian hypothesis test for analysis of variance (ANOVA) designs. The test is an application of standard Bayesian methods for variable selection in regression models. We illustrate the effect of various g-priors on the ANOVA hypothesis test. The Bayesian test for ANOVA desig…
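For a fixed g, the Zellner g-prior Bayes factor against the intercept-only null has a closed form, BF10 = (1+g)^((n−p−1)/2) / (1+g(1−R²))^((n−1)/2), which makes the g-dependence easy to explore. A sketch for a one-way ANOVA coded as a dummy-variable regression (the data and the unit-information choice g = n are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
groups, per = 3, 20
y = np.concatenate([rng.normal(m, 1.0, per) for m in (0.0, 0.4, 0.8)])
X = np.zeros((groups * per, groups - 1))              # dummy codes, p = 2
for j in range(groups - 1):
    X[(j + 1) * per:(j + 2) * per, j] = 1.0

n, p = X.shape
Xc = np.column_stack([np.ones(n), X])                 # add the intercept
beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
r2 = 1 - ((y - Xc @ beta) ** 2).sum() / ((y - y.mean()) ** 2).sum()
g = float(n)                                          # unit-information g-prior
bf10 = (1 + g) ** ((n - p - 1) / 2) / (1 + g * (1 - r2)) ** ((n - 1) / 2)
print(f"R^2 = {r2:.3f}, BF10 = {bf10:.2f}")
```

Re-running with different g values shows directly how the prior scale trades off against the evidence, which is the sensitivity the article examines.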
Linac Coherent Light Source data analysis using psana
Energy Technology Data Exchange (ETDEWEB)
Damiani, D.; Dubrovin, M.; Gaponenko, I.; Kroeger, W.; Lane, T. J.; Mitra, A.; O' Grady, C. P.; Salnikov, A.; Sanchez-Gonzalez, A.; Schneider, D.; Yoon, C. H.
2016-03-18
Energy Technology Data Exchange (ETDEWEB)
Juang, Zhen [Los Alamos National Laboratory; Roussel-dupre, Robert [Los Alamos National Laboratory
2008-01-01
An analysis was performed on the mid-latitude scintillation and coherence frequency bandwidth (Fcoh) using transionospheric VHF signal data. The data include 1062 events spanning November 1997 to June 2002. Each event records FORTE satellite reception of VHF signals from LAPP, located at Los Alamos, New Mexico. Fcohs were derived to study scintillation characteristics on diurnal and seasonal variations, as well as changes due to solar and geomagnetic activities. Comparisons to the VHF/UHF coherence frequency bandwidth studies previously reported at equatorial and mid-latitude regions are made using a 4th-power frequency dependence relationship. Furthermore, a wideband ionospheric scintillation model, WBMOD, was used to estimate Fcohs for comparison with our VHF Fcoh values. Our analysis indicates mid-latitude scintillation characteristics that were not previously revealed. At the bottom of the VHF frequency range (30-35 MHz), distinctly smaller Fcohs are found in the time period from sunset to midnight, in the warm season from May to August, and in low solar activity years. The effects of geomagnetic storm activity on Fcoh are characterized by a sudden transition at a Kp index of 5o-6o. Comparisons with median Fcohs estimated from other studies validated our VHF Fcohs for daytime, while an order of magnitude larger Fcohs are found for nighttime, implying a time-dependent issue in applying the 4th-order power relationship. Furthermore, comparisons with WBMOD-estimated Fcohs indicated generally matched median scintillation level estimates, while differences do exist for those events undergoing high geomagnetic storm activity, which may imply underestimates of scintillation level by the WBMOD in the mid-latitude regions.
Bayesian Games with Intentions
Bjorndahl, Adam; Halpern, Joseph Y.; Pass, Rafael
2016-01-01
We show that standard Bayesian games cannot represent the full spectrum of belief-dependent preferences. However, by introducing a fundamental distinction between intended and actual strategies, we remove this limitation. We define Bayesian games with intentions, generalizing both Bayesian games and psychological games, and prove that Nash equilibria in psychological games correspond to a special class of equilibria as defined in our setting.
The analysis of low-level radioactivity using Bayesian statistics
International Nuclear Information System (INIS)
The analysis of low-level or background-dominant radioactivity is complicated by the fact that the sample's net rate of activity above background may be negative because of random fluctuations. (authors)
Feroz, F.; Hobson, M. P.
2014-02-01
GJ667C is the least massive component of a triple star system which lies at a distance of about 6.8 pc (22.1 light-years) from the Earth. GJ667C has received much attention recently due to claims that it hosts up to seven planets, including three super-Earths inside the habitable zone. We present a Bayesian technique for the analysis of radial velocity (RV) data sets in the presence of a correlated noise component ('red noise') with unknown parameters. We also introduce hyper-parameters in our model in order to deal statistically with under- or overestimated error bars on measured RVs, as well as inconsistencies between different data sets. By applying this method to the RV data set of GJ667C, we show that this data set contains a significant correlated (red) noise component, with a correlation time-scale for the HARPS data of the order of 9 d. Our analysis shows that the data only provide strong evidence for the presence of two planets: GJ667Cb and c, with periods 7.19 and 28.13 d respectively, with some hints towards the presence of a third signal with period 91 d. The planetary nature of this third signal is not clear, and additional RV observations are required for its confirmation. Previous claims of the detection of additional planets in this system are due to the erroneous assumption of white noise. Using the standard white noise assumption, our method leads to the detection of up to five signals in this system. We also find that with the red noise model, the measurement uncertainties from HARPS for this system are underestimated at the level of ~50 per cent.
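The key ingredient of such an analysis is a likelihood whose covariance includes a correlated component. A minimal sketch (exponential kernel and synthetic residuals; the jitter term plays the role of absorbing underestimated error bars, in the spirit of the hyper-parameter treatment described above):

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

# Gaussian log-likelihood for RV residuals whose covariance adds an
# exponentially correlated ("red") component of timescale tau to the quoted
# measurement errors plus a white jitter term.
def rv_loglike(resid, t, err, sig_red, tau, jitter):
    dt = np.abs(t[:, None] - t[None, :])
    C = sig_red ** 2 * np.exp(-dt / tau) + np.diag(err ** 2 + jitter ** 2)
    cf = cho_factor(C)
    logdet = 2.0 * np.log(np.diag(cf[0])).sum()
    return -0.5 * (resid @ cho_solve(cf, resid)
                   + logdet + len(resid) * np.log(2 * np.pi))

t = np.sort(np.random.default_rng(5).uniform(0, 100, 40))    # epochs (days)
resid = np.random.default_rng(6).normal(0, 2.0, 40)          # residuals (m/s)
err = np.full(40, 1.0)                                       # quoted errors
print(rv_loglike(resid, t, err, sig_red=1.5, tau=9.0, jitter=0.5))
```

Here tau = 9 d echoes the correlation timescale the authors report for the HARPS data; with white noise one would instead set sig_red = 0, which is exactly the assumption the abstract identifies as the source of the spurious extra planets.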
Analysis of non-ergodic behaviour in spatio-temporal coherence properties of speckle light
Réfrégier, Philippe
Spatio-temporal coherence properties of light scattered by rough surfaces, which lead to speckle fluctuations, are analysed. It is demonstrated that the scattered light is non-ergodic with respect to the disorder due to the scattering process. Although the mutual coherence matrix vanishes for isotropic polarization fluctuations, it is shown that spatio-temporal coherence properties can be characterized through interference experiments between different speckles of the scattered light. For non-singular scattering processes, the maximal value of the modulus of the Wolf degree of coherence is analysed in the space-time domain. This approach is also applied to totally unpolarized incident light with an isotropic and spatially independent scattering process. The mean value and the standard deviation of the Wolf degree of coherence are then determined from the coherence properties of the incident light.
Coherent Performance Analysis of the HJ-1-C Synthetic Aperture Radar
Li Hai-ying; Zhang Shan-shan; Li Shi-qiang; Zhang Hua-chun
2014-01-01
Synthetic Aperture Radar (SAR) is a coherent imaging radar; hence, coherence is critical in SAR imaging. In a coherent system, several sources can degrade performance. Based on the HJ-1-C SAR system implementation and sensor characteristics, this study evaluates the effect of frequency stability and pulse-to-pulse timing jitter on SAR coherent performance. A stable crystal oscillator with a short-term stability of 1.0×10−10 over 5 ms is used to generate the reference frequency by using a direc...
Spatial Bayesian Latent Factor Regression Modeling of Coordinate-based Meta-analysis Data
Montagna, Silvia; Wager, Tor; Feldman-Barrett, Lisa; Timothy D. Johnson; Nichols, Thomas E.
2016-01-01
Now over 20 years old, functional MRI (fMRI) has a large and growing literature that is best synthesised with meta-analytic tools. As most authors do not share image data, only the peak activation coordinates (foci) reported in the paper are available for Coordinate-based Meta-analysis (CBMA). Neuroimaging meta-analysis is used to 1) identify areas of consistent activation; and 2) build a predictive model of task type or cognitive process for new studies (reverse inference). To simultaneously...
Can, Seda; van de Schoot, Rens; Hox, Joop
2015-01-01
Because variables may be correlated in the social and behavioral sciences, multicollinearity might be problematic. This study investigates the effect of collinearity manipulated in within and between levels of a two-level confirmatory factor analysis by Monte Carlo simulation. Furthermore, the influence of the size of the intraclass correlation…
Can, Seda; van de Schoot, Rens; Hox, Joop
2014-01-01
Because variables may be correlated in the social and behavioral sciences, multicollinearity might be problematic. This study investigates the effect of collinearity manipulated in within and between levels of a two-level confirmatory factor analysis by Monte Carlo simulation. Furthermore, the influ…
Bayesian Uncertainty Analysis of PBPK Model Predictions for Permethrin in Rats
Uncertainty analysis of human physiologically-based pharmacokinetic (PBPK) model predictions can pose a significant challenge due to data limitations. As a result of these limitations, human models are often derived from extrapolated animal PBPK models, for which there is usuall...
Integrative analysis of histone ChIP-seq and transcription data using Bayesian mixture models
DEFF Research Database (Denmark)
Klein, Hans-Ulrich; Schäfer, Martin; Porse, Bo T;
2014-01-01
Histone modifications are a key epigenetic mechanism to activate or repress the transcription of genes. Datasets of matched transcription data and histone modification data obtained by ChIP-seq exist, but methods for integrative analysis of both data types are still rare. Here, we present a novel...
Machine learning concepts in coherent optical communication systems
DEFF Research Database (Denmark)
Zibar, Darko; Schäffer, Christian G.
2014-01-01
Powerful statistical signal processing methods, used by the machine learning community, are addressed and linked to current problems in coherent optical communication. Bayesian filtering methods are presented and applied for nonlinear dynamic state tracking. © 2014 OSA.
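As an illustration of Bayesian filtering in this setting, a scalar Kalman filter can track a random-walk laser phase from noisy per-symbol phase estimates (an idealized model: real receivers must recover the phase from modulated symbols; all variances are invented):

```python
import numpy as np

rng = np.random.default_rng(7)
N, q, r = 2000, 1e-4, 1e-2             # symbols, process and measurement variances
true_phase = np.cumsum(rng.normal(0, np.sqrt(q), N))   # random-walk phase noise
z = true_phase + rng.normal(0, np.sqrt(r), N)          # noisy phase observations

x, P, est = 0.0, 1.0, []
for k in range(N):
    P += q                             # predict: x_k = x_{k-1} + w,  w ~ N(0, q)
    K = P / (P + r)                    # Kalman gain
    x += K * (z[k] - x)                # update with the innovation
    P *= (1 - K)
    est.append(x)
print("RMS tracking error:", np.sqrt(np.mean((np.array(est) - true_phase) ** 2)))
```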
Bayesian belief networks in business continuity.
Phillipson, Frank; Matthijssen, Edwin; Attema, Thomas
2014-01-01
Business continuity professionals aim to mitigate the various challenges to the continuity of their company. The goal is a coherent system of measures that encompass detection, prevention and recovery. Choices made in one part of the system affect other parts as well as the continuity risks of the company. In complex organisations, however, these relations are far from obvious. This paper proposes the use of Bayesian belief networks to expose these relations, and presents a modelling framework for this approach. PMID:25193453
Owusu-Edusei, Kwame; Owens, Chantelle J
2009-01-01
Background Chlamydia continues to be the most prevalent disease in the United States. Effective spatial monitoring of chlamydia incidence is important for successful implementation of control and prevention programs. The objective of this study is to apply Bayesian smoothing and exploratory spatial data analysis (ESDA) methods to monitor Texas county-level chlamydia incidence rates by examining spatiotemporal patterns. We used county-level data on chlamydia incidence (for all ages, gender and races) from the National Electronic Telecommunications System for Surveillance (NETSS) for 2004 and 2005. Results Bayesian-smoothed chlamydia incidence rates were spatially dependent both in levels and in relative changes. Erath county had significantly (p < 0.05) higher smoothed rates (> 300 cases per 100,000 residents) than its contiguous neighbors (195 or less) in both years. Gaines county experienced the highest relative increase in smoothed rates (173%; from 139 to 379). The relative change in smoothed chlamydia rates in Newton county was significantly (p < 0.05) higher than that of its contiguous neighbors. Conclusion Bayesian smoothing and ESDA methods can assist programs in using chlamydia surveillance data to identify outliers, as well as relevant changes in chlamydia incidence in specific geographic units. Secondly, it may also indirectly help in assessing existing differences and changes in chlamydia surveillance systems over time. PMID:19245686
Approximation methods for efficient learning of Bayesian networks
Riggelsen, C
2008-01-01
This publication offers and investigates efficient Monte Carlo simulation methods in order to realize a Bayesian approach to approximate learning of Bayesian networks from both complete and incomplete data. For large amounts of incomplete data when Monte Carlo methods are inefficient, approximations are implemented, such that learning remains feasible, albeit non-Bayesian. The topics discussed are: basic concepts about probabilities, graph theory and conditional independence; Bayesian network learning from data; Monte Carlo simulation techniques; and, the concept of incomplete data. In order to provide a coherent treatment of matters, thereby helping the reader to gain a thorough understanding of the whole concept of learning Bayesian networks from (in)complete data, this publication combines in a clarifying way all the issues presented in the papers with previously unpublished work.
Bayesian Frailty for Competing Risks Survival Analysis in the Iranian Metastatic Colorectal Patients
Asghari Jafarabadi, Mohamad; Hajizadeh, Ebrahim; Kazemnejad, Anoshirvan; Fatemi, Seyed Reza
2010-01-01
Objective: In the competing risks problem, wherein only the event due to one cause is observed, the cause-specific hazard rates are usually estimated under the assumption that the competing causes are independent. However, this assumption is too restrictive in practical situations. This paper aimed to extend the results of Finkelstein and Esaulova (2008) in order to relax the independence assumption in competing risks survival analysis with a counting process approach by exert...
Bayesian Statistics and Uncertainty Quantification for Safety Boundary Analysis in Complex Systems
He, Yuning; Davies, Misty Dawn
2014-01-01
The analysis of a safety-critical system often requires detailed knowledge of safe regions and their high-dimensional non-linear boundaries. We present a statistical approach to iteratively detect and characterize the boundaries, which are provided as parameterized shape candidates. Using methods from uncertainty quantification and active learning, we incrementally construct a statistical model from only a few simulation runs and obtain statistically sound estimates of the shape parameters for safety boundaries.
Directory of Open Access Journals (Sweden)
Goodacre Royston
2011-01-01
Background The rapid identification of Bacillus spores and bacterial identification are paramount because of their implications in food poisoning, pathogenesis and their use as potential biowarfare agents. Many automated analytical techniques such as Curie-point pyrolysis mass spectrometry (Py-MS) have been used to identify bacterial spores, giving rise to large amounts of analytical data. This high number of features makes interpretation of the data extremely difficult. We analysed Py-MS data from 36 different strains of aerobic endospore-forming bacteria encompassing seven different species. These bacteria were grown axenically on nutrient agar, and vegetative biomass and spores were analyzed by Curie-point Py-MS. Results We develop a novel genetic algorithm-Bayesian network algorithm that accurately identifies and selects a small subset of key relevant mass spectra (biomarkers) to be further analysed. Once identified, this subset of relevant biomarkers was then used to identify Bacillus spores successfully and to identify Bacillus species via a Bayesian network model specifically built for this reduced set of features. Conclusions This final compact Bayesian network classification model is parsimonious, computationally fast to run, and its graphical visualization allows easy interpretation of the probabilistic relationships among selected biomarkers. In addition, we compare the features selected by the genetic algorithm-Bayesian network approach with the features selected by partial least squares-discriminant analysis (PLS-DA). The classification accuracy results show that the set of features selected by the GA-BN is far superior to PLS-DA.
DEFF Research Database (Denmark)
Jensen, Finn Verner; Nielsen, Thomas Dyhre
2016-01-01
Mathematically, a Bayesian graphical model is a compact representation of the joint probability distribution for a set of variables. The most frequently used type of Bayesian graphical model is the Bayesian network. The structural part of a Bayesian graphical model is a graph consisting of nodes and edges. The nodes represent variables, which may be either discrete or continuous. An edge between two nodes A and B indicates a direct influence between the state of A and the state of B, which in some domains can also be interpreted as a causal relation. The wide-spread use of Bayesian networks is largely due to the availability of efficient inference algorithms for answering probabilistic queries about the states of the variables in the network. Furthermore, to support the construction of Bayesian network models, learning algorithms are also available. We give an overview of the Bayesian network…
International Nuclear Information System (INIS)
Time-variant partial directed coherence (tvPDC) is used for the first time in a multivariate analysis of heart rate variability (HRV), respiratory movements (RMs) and (systolic) arterial blood pressure. It is shown that respiration-related HRV components which also occur at other frequencies besides the RM frequency (= respiratory sinus arrhythmia, RSA) can be identified. These additional components are known to be an effect of the 'half-the-mean-heart-rate-dilemma' ('cardiac aliasing' CA). These CA components may contaminate the entire frequency range of HRV and can lead to misinterpretation of the RSA analysis. TvPDC analysis of simulated and clinical data (full-term neonates and sedated patients) reveals these contamination effects and, in addition, the respiration-related CA components can be separated from the RSA component and the Traube–Hering–Mayer wave. It can be concluded that tvPDC can be beneficially applied to avoid misinterpretations in HRV analyses as well as to quantify partial correlative interaction properties between RM and RSA
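Partial directed coherence is computed from the coefficients of a fitted multivariate autoregressive model; the time-variant version replaces these with time-varying estimates. A sketch of the non-time-variant computation (synthetic three-channel data; the VAR order is an assumption):

```python
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(8)
n, k = 2000, 3
data = rng.standard_normal((n, k))
data[1:, 1] += 0.5 * data[:-1, 0]                 # channel 0 drives channel 1

res = VAR(data).fit(2)                            # order-2 VAR (assumed)
A = res.coefs                                     # shape (lags, k, k)
freqs = np.linspace(0.01, 0.5, 64)                # normalized frequency
pdc = np.empty((len(freqs), k, k))
for fi, f in enumerate(freqs):
    Abar = np.eye(k, dtype=complex)               # Abar(f) = I - sum_r A_r e^{-i2pi f r}
    for r, Ar in enumerate(A, start=1):
        Abar -= Ar * np.exp(-2j * np.pi * f * r)
    pdc[fi] = np.abs(Abar) / np.sqrt((np.abs(Abar) ** 2).sum(axis=0))
print("mean PDC 0 -> 1:", pdc[:, 1, 0].mean())    # pdc[f, i, j] is influence j -> i
```

The column-wise normalization is what makes PDC "partial": it quantifies the outflow from channel j to channel i relative to all outflows from j, which is how the HRV analysis above separates the respiration-related pathways.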
Analysis of simulated data for the KArlsruhe TRItium Neutrino experiment using Bayesian inference
DEFF Research Database (Denmark)
Riis, Anna Sejersen; Hannestad, Steen; Weinheimer, C.
2011-01-01
The KATRIN (Karlsruhe Tritium Neutrino) experiment will analyze the tritium β spectrum to determine the mass of the neutrino with a sensitivity of 0.2 eV (90% C.L.). This approach to a measurement of the absolute value of the neutrino mass relies only on the principle of energy conservation and can … neutrinos. As an alternative to the frequentist minimization methods used in the analysis of the earlier experiments in Mainz and Troitsk, we have been investigating Markov chain Monte Carlo (MCMC) methods, which are very well suited for probing multiparameter spaces. We found that implementing the KATRIN χ2…
Bayesian Penalized Spline Models for the Analysis of Spatio-Temporal Count Data
Bauer, Cici; Wakefield, Jon; Rue, Håvard; Self, Steve; Feng, Zijian; Wang, Yu
2016-01-01
In recent years, the availability of infectious disease counts in time and space has increased, and consequently there has been renewed interest in model formulation for such data. In this paper, we describe a model that was motivated by the need to analyze hand, foot and mouth disease (HFMD) surveillance data in China. The data are aggregated by geographical area and by week, with the aims of the analysis being to gain insight into the space-time dynamics and to make short-term predictions that can guide public health campaigns in those areas with a large predicted disease burden. The model we develop decomposes disease risk into marginal spatial and temporal components and a space-time interaction piece. The latter is the crucial element, and we use a tensor product spline model with a Markov random field prior on the coefficients of the basis functions. The model can be formulated as a Gaussian Markov random field, so fast computation can be carried out using the integrated nested Laplace approximation (INLA) approach. A simulation study shows that the model can pick up complex space-time structure, and our analysis of HFMD data in the central north region of China provides new insights into the dynamics of the disease. PMID:26530705
Directory of Open Access Journals (Sweden)
Joseph P. Yurko
2015-01-01
System codes for simulation of safety performance of nuclear plants may contain parameters whose values are not known very accurately. New information from tests or operating experience is incorporated into safety codes by a process known as calibration, which reduces uncertainty in the output of the code and thereby improves its support for decision-making. The work reported here implements several improvements on classic calibration techniques afforded by modern analysis techniques. The key innovation has come from development of code surrogate model (or code emulator) construction and prediction algorithms. Use of a fast emulator makes the calibration processes used here with Markov Chain Monte Carlo (MCMC) sampling feasible. This work uses Gaussian Process (GP) based emulators, which have been used previously to emulate computer codes in the nuclear field. The present work describes the formulation of an emulator that incorporates GPs into a factor analysis-type or pattern recognition-type model. This "function factorization" Gaussian Process (FFGP) model allows overcoming limitations present in standard GP emulators, thereby improving both accuracy and speed of the emulator-based calibration process. Calibration of a friction-factor example using a Method of Manufactured Solution is performed to illustrate key properties of the FFGP-based process.
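The mechanics of emulator-based calibration can be sketched compactly: train a GP on a few code runs, then sample the calibration parameter with Metropolis using the emulator in place of the expensive code. The sketch below uses a plain RBF GP from scikit-learn as a stand-in for the paper's FFGP emulator (the toy "code", observation and prior are invented):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def code(theta):                      # stand-in for the expensive system code
    return np.sin(3 * theta) + theta

rng = np.random.default_rng(9)
X = np.linspace(0, 2, 8)[:, None]     # 8 training runs of the "code"
gp = GaussianProcessRegressor(kernel=RBF(0.5)).fit(X, code(X[:, 0]))

obs, obs_err = code(np.array([1.3]))[0] + 0.05, 0.1   # one noisy observation

def logpost(theta):
    if not 0 <= theta <= 2:
        return -np.inf                # flat prior on [0, 2]
    m = gp.predict(np.array([[theta]]))[0]
    return -0.5 * ((obs - m) / obs_err) ** 2

theta, lp, chain = 1.0, logpost(1.0), []
for _ in range(5000):                 # random-walk Metropolis on the emulator
    prop = theta + rng.normal(0, 0.1)
    lpp = logpost(prop)
    if np.log(rng.uniform()) < lpp - lp:
        theta, lp = prop, lpp
    chain.append(theta)
print("posterior mean of theta:", np.mean(chain[1000:]))
```

Each MCMC step costs one GP prediction instead of one code run, which is the speed-up that makes the calibration feasible; the FFGP model addresses the accuracy side of the same trade-off.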
Directory of Open Access Journals (Sweden)
Sérgio L. Pereira
2008-01-01
Most Neotropical birds, including the Pteroglossus aracaris, do not have an adequate fossil record to be used as time constraints in molecular dating. Hence, the evolutionary timeframe of the avian biota can only be inferred using alternative time constraints. We applied a Bayesian relaxed-clock approach to propose an alternative interpretation for the historical biogeography of Pteroglossus based on mitochondrial DNA sequences, using different combinations of outgroups and time constraints obtained from outgroup fossils, vicariant barriers and molecular time estimates. The results indicated that outgroup choice has little effect on the Bayesian posterior distribution of divergence times within Pteroglossus, that geological and molecular time constraints seem equally suitable for estimating the Bayesian posterior distribution of divergence times for Pteroglossus, and that the fossil record alone overestimates divergence times within the fossil-lacking ingroup. The Bayesian estimates of divergence times suggest that the radiation of Pteroglossus occurred from the Late Miocene to the Pliocene (three times older than estimated by the "standard" mitochondrial rate of 2% sequence divergence per million years), likely triggered by Andean uplift, multiple episodes of marine transgressions in South America, and the formation of present-day river basins. The time estimates are in agreement with those for other Neotropical taxa with similar geographic distributions.
Institute of Scientific and Technical Information of China (English)
MING Zhimao; TAO Junyong; ZHANG Yunan; YI Xiaoshan; CHEN Xun
2009-01-01
New armament systems must deal with multi-stage reliability-growth statistical problems across diverse populations in order to improve reliability before mass production begins. Because testing during the development of complex systems is expensive and sample sizes are small, specific methods are studied for processing the statistical information of Bayesian reliability growth across diverse populations. First, according to the characteristics of reliability growth during product development, the Bayesian method is used to integrate multi-stage testing information and the order relations of the distribution parameters. A Gamma-Beta prior distribution is then proposed, based on a non-homogeneous Poisson process (NHPP) corresponding to the reliability-growth process. The posterior distribution of the reliability parameters is obtained for the different stages of the product, and the reliability parameters are evaluated from the posterior distribution. Finally, the Bayesian approach proposed in this paper for multi-stage reliability-growth testing is applied to a small-sample test process in the astronautics field. The results of a numerical example show that the presented model can make use of the diverse information synthetically and pave the way for the application of the Bayesian model to multi-stage reliability-growth test evaluation with small sample sizes. The method is useful for evaluating multi-stage system reliability and making reliability-growth plans rationally.
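The information-pooling mechanics of multi-stage Bayesian updating are easy to see in the conjugate special case of a Gamma prior on a stage-wise constant failure intensity with Poisson failure counts (the paper's NHPP/Gamma-Beta model with order constraints between stages is richer; all numbers below are invented):

```python
import numpy as np

# Sequential conjugate updating: Gamma(a, b) prior on a failure intensity
# (rate parameterization, mean a/b), Poisson counts observed at each stage.
a, b = 2.0, 20.0                                # prior: mean 0.1 failures/hour
stages = [(5, 100.0), (3, 150.0), (1, 200.0)]   # (failures, test hours) per stage
for i, (k, T) in enumerate(stages, 1):
    a, b = a + k, b + T                         # posterior after this stage
    mean, sd = a / b, np.sqrt(a) / b
    print(f"after stage {i}: intensity ~ Gamma({a:.0f}, {b:.0f}); "
          f"mean {mean:.4f}/h, sd {sd:.4f}")
```

The shrinking posterior standard deviation across stages is the small-sample benefit the abstract claims: each stage's scarce data is pooled with everything learned before.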
Gaffney, Jim A; Sonnad, Vijay; Libby, Stephen B
2013-01-01
The complex nature of inertial confinement fusion (ICF) experiments results in a very large number of experimental parameters that are only known with limited reliability. These parameters, combined with the myriad physical models that govern target evolution, make the reliable extraction of physics from experimental campaigns very difficult. We develop an inference method that allows all important experimental parameters, and previous knowledge, to be taken into account when investigating underlying microphysics models. The result is framed as a modified χ² analysis which is easy to implement in existing analyses, and quite portable. We present a first application to a recent convergent ablator experiment performed at the NIF, and investigate the effect of variations in all physical dimensions of the target (very difficult to do using other methods). We show that for well-characterised targets in which dimensions vary at the 0.5% level there is little effect, but 3% variations change the results of i...
DEFF Research Database (Denmark)
Burgess, Stephen; Thompson, Simon G; Andrews, G;
2010-01-01
Genetic markers can be used as instrumental variables, in an analogous way to randomization in a clinical trial, to estimate the causal relationship between a phenotype and an outcome variable. Our purpose is to extend the existing methods for such Mendelian randomization studies to the context of multiple studies … an overall estimate of the causal relationship between the phenotype and the outcome, and an assessment of its heterogeneity across studies. As an example, we estimate the causal relationship of blood concentrations of C-reactive protein on fibrinogen levels using data from 11 studies. These methods provide a flexible framework for efficient estimation of causal relationships derived from multiple studies. Issues discussed include weak instrument bias, analysis of binary outcome data such as disease risk, missing genetic data, and the use of haplotypes.
TCA Cycle Turnover And Serum Glucose Sources By Automated Bayesian Analysis Of NMR Spectra
Merritt, Matthew E.; Burgess, Shawn; Jeffrey, F. Mark; Sherry, A. Dean; Malloy, Craig; Bretthorst, G. Larry
2004-04-01
Changes in the sources of serum glucose are indicative of a variety of pathological metabolic states. It is possible to measure the sources of serum glucose by administering deuterated water to a subject and then analyzing, by 2H NMR, the 2H enrichment levels in glucose extracted from the plasma of a single blood draw. Markov Chain Monte Carlo simulations of the posterior probability densities may then be used to evaluate the contributions of glycogenolysis, glycerol, and the Krebs cycle to serum glucose. Experiments with simulated NMR spectra show that, in spectra with an S/N of 20 to 1, the resulting metabolic information may be evaluated with an accuracy of about 4 percent.
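A Monte Carlo version of the uncertainty propagation can be sketched as follows, assuming for illustration that the H5/H2 enrichment ratio estimates the gluconeogenic fraction, a common interpretation in deuterated-water studies (the paper instead samples posteriors of the spectral model itself; all numbers are invented):

```python
import numpy as np

# Propagate measurement uncertainty in two 2H enrichment values into an
# interval for their ratio, here read as the fraction of glucose from
# gluconeogenesis. Means and sds below are purely illustrative.
rng = np.random.default_rng(10)
h2 = rng.normal(1.00, 0.05, 100_000)     # H2 enrichment (assumed mean +/- sd)
h5 = rng.normal(0.55, 0.05, 100_000)     # H5 enrichment (assumed mean +/- sd)
frac_gng = h5 / h2                       # gluconeogenic fraction, per draw
lo, hi = np.percentile(frac_gng, [2.5, 97.5])
print(f"gluconeogenic fraction: {frac_gng.mean():.2f} "
      f"(95% interval {lo:.2f}-{hi:.2f})")
```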