WorldWideScience

Sample records for bayesian coalescent analysis

  1. Impact of Bayesian Priors on the Characterization of Binary Black Hole Coalescences.

    Science.gov (United States)

    Vitale, Salvatore; Gerosa, Davide; Haster, Carl-Johan; Chatziioannou, Katerina; Zimmerman, Aaron

    2017-12-22

    In a regime where data are only mildly informative, prior choices can play a significant role in Bayesian statistical inference, potentially affecting the inferred physics. We show this is indeed the case for some of the parameters inferred from current gravitational-wave measurements of binary black hole coalescences. We reanalyze the first detections performed by the twin LIGO interferometers using alternative (and astrophysically motivated) prior assumptions. We find different prior distributions can introduce deviations in the resulting posteriors that impact the physical interpretation of these systems. For instance, (i) limits on the 90% credible interval on the effective black hole spin χ_{eff} are subject to variations of ∼10% if a prior with black hole spins mostly aligned to the binary's angular momentum is considered instead of the standard choice of isotropic spin directions, and (ii) under priors motivated by the initial stellar mass function, we infer tighter constraints on the black hole masses, and in particular, we find no support for any of the inferred masses within the putative mass gap M≲5  M_{⊙}.
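The prior-sensitivity effect described in this record can be illustrated with a toy one-parameter example (a sketch of the general mechanism only, not the gravitational-wave analysis; the grid, likelihood, and both priors below are invented for illustration):

```python
import numpy as np

# Toy posterior on a bounded spin-like parameter: one mildly informative
# likelihood combined with two different priors. All numbers are illustrative.
grid = np.linspace(-1, 1, 2001)                          # parameter grid
likelihood = np.exp(-0.5 * ((grid - 0.2) / 0.3) ** 2)    # broad likelihood

def credible_interval(prior, level=0.90):
    """Equal-tailed credible interval of the gridded posterior."""
    post = prior * likelihood
    post /= post.sum()
    cdf = np.cumsum(post)
    lo = grid[np.searchsorted(cdf, (1 - level) / 2)]
    hi = grid[np.searchsorted(cdf, 1 - (1 - level) / 2)]
    return lo, hi

flat = np.ones_like(grid)                                # flat reference prior
aligned = np.exp(-0.5 * ((grid - 0.5) / 0.2) ** 2)       # prior favouring aligned spins

print(credible_interval(flat))     # wide interval, includes negative values
print(credible_interval(aligned))  # narrower interval, pulled toward the prior
```

With weakly informative data the two priors yield visibly different 90% intervals; with sharp data the intervals would nearly coincide, which is the regime dependence the abstract emphasizes.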

  2. Bayesian Mediation Analysis

    OpenAIRE

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    This article proposes Bayesian analysis of mediation effects. Compared to conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian mediation analysis, inference is straightforward and exact, which makes it appealing for studies with small samples. Third, the Bayesian approach is conceptually…

  3. Bayesian data analysis for newcomers.

    Science.gov (United States)

    Kruschke, John K; Liddell, Torrin M

    2018-02-01

    This article explains the foundational concepts of Bayesian data analysis using virtually no mathematical notation. Bayesian ideas already match your intuitions from everyday reasoning and from traditional data analysis. Simple examples of Bayesian data analysis are presented that illustrate how the information delivered by a Bayesian analysis can be directly interpreted. Bayesian approaches to null-value assessment are discussed. The article clarifies misconceptions about Bayesian methods that newcomers might have acquired elsewhere. We discuss prior distributions and explain how they are not a liability but an important asset. We discuss the relation of Bayesian data analysis to Bayesian models of mind, and we briefly discuss what methodological problems Bayesian data analysis is not meant to solve. After you have read this article, you should have a clear sense of how Bayesian data analysis works and the sort of information it delivers, and why that information is so intuitive and useful for drawing conclusions from data.
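The kind of directly interpretable result this article has in mind can be shown with the simplest possible Bayesian calculation, a single application of Bayes' rule (the numbers below are invented for illustration, in the style of a diagnostic-test example):

```python
def posterior(prior, sensitivity, false_positive_rate):
    """P(hypothesis | positive result) by Bayes' rule."""
    evidence = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / evidence

# With a 1% prior, a test that is 95% sensitive with a 5% false-positive
# rate still leaves the posterior well below 50% -- the prior matters.
p = posterior(prior=0.01, sensitivity=0.95, false_positive_rate=0.05)
print(round(p, 3))  # -> 0.161
```

The output is a probability that can be read off directly, which is exactly the "directly interpreted" information the article contrasts with frequentist test statistics.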

  4. Bayesian Mediation Analysis

    Science.gov (United States)

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…

  5. BEAST: Bayesian evolutionary analysis by sampling trees

    Directory of Open Access Journals (Sweden)

    Drummond Alexei J

    2007-11-01

    Full Text Available Abstract Background The evolutionary analysis of molecular sequence variation is a statistical enterprise. This is reflected in the increased use of probabilistic models for phylogenetic inference, multiple sequence alignment, and molecular population genetics. Here we present BEAST: a fast, flexible software architecture for Bayesian analysis of molecular sequences related by an evolutionary tree. A large number of popular stochastic models of sequence evolution are provided and tree-based models suitable for both within- and between-species sequence data are implemented. Results BEAST version 1.4.6 consists of 81000 lines of Java source code, 779 classes and 81 packages. It provides models for DNA and protein sequence evolution, highly parametric coalescent analysis, relaxed clock phylogenetics, non-contemporaneous sequence data, statistical alignment and a wide range of options for prior distributions. BEAST source code is object-oriented, modular in design and freely available at http://beast-mcmc.googlecode.com/ under the GNU LGPL license. Conclusion BEAST is a powerful and flexible evolutionary analysis package for molecular sequence variation. It also provides a resource for the further development of new models and statistical methods of evolutionary analysis.
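The coalescent analysis that BEAST provides rests on the Kingman coalescent, in which the waiting time between coalescent events is exponentially distributed. A minimal simulation of those waiting times (a generic sketch of the standard model, not BEAST's implementation; sample size and population size are invented) looks like this:

```python
import numpy as np

# Standard (unstructured) Kingman coalescent: with k lineages and effective
# population size N, the waiting time to the next coalescence is
# exponential with rate k*(k-1)/(2N).
def sample_tmrca(n_lineages, pop_size, rng):
    """Simulate the time to the most recent common ancestor of a sample."""
    t, k = 0.0, n_lineages
    while k > 1:
        rate = k * (k - 1) / (2 * pop_size)
        t += rng.exponential(1 / rate)
        k -= 1
    return t

rng = np.random.default_rng(42)
# E[TMRCA] = 2N * (1 - 1/n); compare with the Monte Carlo mean.
times = [sample_tmrca(10, 1000, rng) for _ in range(5000)]
print(np.mean(times))  # should be near 2 * 1000 * (1 - 1/10) = 1800
```

Tree priors of this kind, combined with a substitution model, are the two ingredients of the Bayesian phylogenetic posterior that BEAST samples by MCMC.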

  6. Gas bubble coalescence

    International Nuclear Information System (INIS)

    Ganeev, G.Z.; Turkebaev, T.Eh.; Kadyrov, Kh.G.

    2001-01-01

    An analysis of the coalescence of gaseous pores in the helium-atom straggling region is given. The coalescence considered is driven both by Brownian motion and by migration within the stress gradient. (author)

  7. Estimating Population Parameters using the Structured Serial Coalescent with Bayesian MCMC Inference when some Demes are Hidden

    Directory of Open Access Journals (Sweden)

    Allen Rodrigo

    2006-01-01

    Full Text Available Using the structured serial coalescent with Bayesian MCMC and serial samples, we estimate population size when some demes are not sampled or are hidden, i.e., 'ghost' demes. It is found that even in the presence of a ghost deme, accurate inference is possible if the parameters are estimated under the true model. With an incorrect model, however, estimates are biased and can be positively misleading. We extend these results to the case where there are sequences from the ghost deme at the last time sample. This case can arise in HIV patients, when some tissue samples and viral sequences only become available after death. When some sequences from the ghost deme are available at the last sampling time, estimation bias is reduced and accurate estimation of parameters associated with the ghost deme is possible despite sampling bias. Migration rate estimates for this case are also shown to be accurate when migration values are low.

  8. Phylogenetic analysis at deep timescales: unreliable gene trees, bypassed hidden support, and the coalescence/concatalescence conundrum.

    Science.gov (United States)

    Gatesy, John; Springer, Mark S

    2014-11-01

    Large datasets are required to solve difficult phylogenetic problems that are deep in the Tree of Life. Currently, two divergent systematic methods are commonly applied to such datasets: the traditional supermatrix approach (= concatenation) and "shortcut" coalescence (= coalescence methods wherein gene trees and the species tree are not co-estimated). When applied to ancient clades, these contrasting frameworks often produce congruent results, but in recent phylogenetic analyses of Placentalia (placental mammals), this is not the case. A recent series of papers has alternately disputed and defended the utility of shortcut coalescence methods at deep phylogenetic scales. Here, we examine this exchange in the context of published phylogenomic data from Mammalia; in particular we explore two critical issues - the delimitation of data partitions ("genes") in coalescence analysis and the hidden support that emerges when such partitions are combined in phylogenetic studies. Hidden support - increased support for a clade in combined analysis of all data partitions relative to the support evident in separate analyses of the various data partitions - is a hallmark of the supermatrix approach and a primary rationale for concatenating all characters into a single matrix. In the most extreme cases of hidden support, relationships that are contradicted by all gene trees are supported when all of the genes are analyzed together. A valid fear is that shortcut coalescence methods might bypass or distort character support that is hidden in individual loci because small gene fragments are analyzed in isolation. Given the extensive systematic database for Mammalia, the assumptions and applicability of shortcut coalescence methods can be assessed with rigor to complement a small but growing body of simulation work that has directly compared these methods to concatenation. We document several remarkable cases of hidden support in both supermatrix and coalescence paradigms and argue…

  9. Coalescent methods for estimating phylogenetic trees.

    Science.gov (United States)

    Liu, Liang; Yu, Lili; Kubatko, Laura; Pearl, Dennis K; Edwards, Scott V

    2009-10-01

    We review recent models to estimate phylogenetic trees under the multispecies coalescent. Although the distinction between gene trees and species trees has come to the fore of phylogenetics, only recently have methods been developed that explicitly estimate species trees. Of the several factors that can cause gene tree heterogeneity and discordance with the species tree, deep coalescence due to random genetic drift in branches of the species tree has been modeled most thoroughly. Bayesian approaches to estimating species trees utilize two likelihood functions, one of which has been widely used in traditional phylogenetics and involves the model of nucleotide substitution, and the second of which is less familiar to phylogeneticists and involves the probability distribution of gene trees given a species tree. Other recent parametric and nonparametric methods for estimating species trees involve parsimony criteria, summary statistics, and supertree and consensus methods. Species tree approaches are an appropriate goal for systematics, appear to work well in some cases where concatenation can be misleading, and suggest that sampling many independent loci will be paramount. Such methods can also be challenging to implement because of the complexity of the models and the computational time. In addition, further elaboration of the simplest coalescent models will be required to incorporate commonly known issues such as deviation from the molecular clock, gene flow, and other genetic forces.
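The second likelihood function mentioned in this record, the probability distribution of gene trees given a species tree, has a well-known closed form in the smallest case: for three species with an internal branch of length T (in coalescent units), the gene tree topology matches the species tree with probability 1 - (2/3)e^(-T). A short sketch:

```python
import math

def gene_tree_probs(T):
    """Topology probabilities for a 3-taxon species tree with internal
    branch length T in coalescent units (multispecies coalescent)."""
    mismatch = math.exp(-T) / 3.0   # each of the two discordant topologies
    return 1.0 - 2.0 * mismatch, mismatch, mismatch

# A short internal branch (T -> 0) makes all three topologies nearly equally
# likely, which is the gene tree discordance described in the abstract.
print(gene_tree_probs(1.0))    # concordant topology still dominates
print(gene_tree_probs(0.01))   # probabilities approach (1/3, 1/3, 1/3)
```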

  10. Hydrodynamic effects on coalescence.

    Energy Technology Data Exchange (ETDEWEB)

    Dimiduk, Thomas G.; Bourdon, Christopher Jay; Grillet, Anne Mary; Baer, Thomas A.; de Boer, Maarten Pieter; Loewenberg, Michael (Yale University, New Haven, CT); Gorby, Allen D.; Brooks, Carlton F.

    2006-10-01

    The goal of this project was to design, build and test novel diagnostics to probe the effect of hydrodynamic forces on coalescence dynamics. Our investigation focused on how a drop coalesces onto a flat surface which is analogous to two drops coalescing, but more amenable to precise experimental measurements. We designed and built a flow cell to create an axisymmetric compression flow which brings a drop onto a flat surface. A computer-controlled system manipulates the flow to steer the drop and maintain a symmetric flow. Particle image velocimetry was performed to confirm that the control system was delivering a well conditioned flow. To examine the dynamics of the coalescence, we implemented an interferometry capability to measure the drainage of the thin film between the drop and the surface during the coalescence process. A semi-automated analysis routine was developed which converts the dynamic interferogram series into drop shape evolution data.

  11. Analysis of Void Growth and Coalescence in Porous Polymer Materials

    Directory of Open Access Journals (Sweden)

    S. A. Reffas

    2013-06-01

    Full Text Available The use of polymeric materials in engineering applications is growing all over the world. This growth demands new methodologies of analysis to assess a material's capability to withstand complex loads. The use of polyacetal in engineering applications has increased rapidly in the last decade. To evaluate the behavior, damage, and coalescence of this type of polymer, we propose a numerical method based on damage that develops in several stages: nucleation of cavities, their growth, and their coalescence at more advanced stages of deformation. Particular attention is given to the stress-strain and volumetric strain evolution under different triaxialities and for three initial void shapes. Application to polyacetal validates this approach for technical polymers. Finally, the method allows us to compare the results of basic calculations at different triaxialities and to discuss the possible influence of the initial size and geometric shape of the porosity on material failure.

  12. GeneRecon Users' Manual — A coalescent based tool for fine-scale association mapping

    DEFF Research Database (Denmark)

    Mailund, T

    2006-01-01

    GeneRecon is a software package for linkage-disequilibrium mapping using coalescent theory. It is based on a Bayesian Markov chain Monte Carlo (MCMC) method for fine-scale linkage-disequilibrium gene mapping using high-density marker maps. GeneRecon explicitly models the genealogy of a sample of th...

  13. The Bacterial Sequential Markov Coalescent.

    Science.gov (United States)

    De Maio, Nicola; Wilson, Daniel J

    2017-05-01

    Bacteria can exchange and acquire new genetic material from other organisms directly and via the environment. This process, known as bacterial recombination, has a strong impact on the evolution of bacteria, for example, leading to the spread of antibiotic resistance across clades and species, and to the avoidance of clonal interference. Recombination hinders phylogenetic and transmission inference because it creates patterns of substitutions (homoplasies) inconsistent with the hypothesis of a single evolutionary tree. Bacterial recombination is typically modeled as statistically akin to gene conversion in eukaryotes, i.e., using the coalescent with gene conversion (CGC). However, this model can be very computationally demanding, as it needs to account for the correlations of the evolutionary histories of even distant loci. So, with the increasing popularity of whole-genome sequencing, the need has emerged for a faster approach to model and simulate bacterial genome evolution. We present a new model that approximates the coalescent with gene conversion: the bacterial sequential Markov coalescent (BSMC). Our approach is based on an idea similar to the sequential Markov coalescent (SMC), an approximation of the coalescent with crossover recombination. However, bacterial recombination poses hurdles to a sequential Markov approximation, as it leads to strong correlations and linkage disequilibrium across very distant sites in the genome. Our BSMC overcomes these difficulties, and shows a considerable reduction in computational demand compared to the exact CGC, and very similar patterns in simulated data. We implemented our BSMC model within the new simulation software FastSimBac. In addition to the decreased computational demand compared to previous bacterial genome evolution simulators, FastSimBac provides more general options for evolutionary scenarios, allowing population structure with migration, speciation, population size changes, and recombination hotspots. FastSimBac is…

  14. The Bayesian New Statistics: Hypothesis testing, estimation, meta-analysis, and power analysis from a Bayesian perspective.

    Science.gov (United States)

    Kruschke, John K; Liddell, Torrin M

    2018-02-01

    In the practice of data analysis, there is a conceptual distinction between hypothesis testing, on the one hand, and estimation with quantified uncertainty on the other. Among frequentists in psychology, a shift of emphasis from hypothesis testing to estimation has been dubbed "the New Statistics" (Cumming 2014). A second conceptual distinction is between frequentist methods and Bayesian methods. Our main goal in this article is to explain how Bayesian methods achieve the goals of the New Statistics better than frequentist methods. The article reviews frequentist and Bayesian approaches to hypothesis testing and to estimation with confidence or credible intervals. The article also describes Bayesian approaches to meta-analysis, randomized controlled trials, and power analysis.
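The estimation-with-quantified-uncertainty workflow argued for here can be made concrete with a conjugate example: summarizing the posterior of a proportion with a credible interval rather than a point-null test (the data values are invented, and a uniform Beta(1, 1) prior is assumed purely for simplicity):

```python
import numpy as np

# With a Beta(1, 1) prior and z successes in N trials, the posterior of the
# proportion is Beta(1 + z, 1 + N - z); approximate its 95% equal-tailed
# interval by Monte Carlo.
rng = np.random.default_rng(1)
z, N = 32, 50                        # illustrative data
draws = rng.beta(1 + z, 1 + N - z, size=200_000)
lo, hi = np.percentile(draws, [2.5, 97.5])
print(lo, hi)                        # the interval directly quantifies uncertainty
```

The interval itself is the inferential product, in the spirit of "the New Statistics": it says which parameter values remain credible, rather than only whether a null value can be rejected.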

  15. Bayesian Data Analysis (lecture 2)

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    …framework, but we will also go into more detail and discuss, for example, the role of the prior. The second part of the lecture will cover further examples and applications that rely heavily on the Bayesian approach, as well as some computational tools needed to perform a Bayesian analysis.

  16. Bayesian Data Analysis (lecture 1)

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    …framework, but we will also go into more detail and discuss, for example, the role of the prior. The second part of the lecture will cover further examples and applications that rely heavily on the Bayesian approach, as well as some computational tools needed to perform a Bayesian analysis.

  17. A maximum pseudo-likelihood approach for estimating species trees under the coalescent model

    Directory of Open Access Journals (Sweden)

    Edwards Scott V

    2010-10-01

    Full Text Available Abstract Background Several phylogenetic approaches have been developed to estimate species trees from collections of gene trees. However, maximum likelihood approaches for estimating species trees under the coalescent model are limited. Although the likelihood of a species tree under the multispecies coalescent model has already been derived by Rannala and Yang, it can be shown that the maximum likelihood estimate (MLE) of the species tree (topology, branch lengths, and population sizes) from gene trees under this formula does not exist. In this paper, we develop a pseudo-likelihood function of the species tree to obtain maximum pseudo-likelihood estimates (MPE) of species trees, with branch lengths of the species tree in coalescent units. Results We show that the MPE of the species tree is statistically consistent as the number M of genes goes to infinity. In addition, the probability that the MPE of the species tree matches the true species tree converges to 1 at a rate of O(M^-1). The simulation results confirm that the maximum pseudo-likelihood approach is statistically consistent even when the species tree is in the anomaly zone. We applied our method, Maximum Pseudo-likelihood for Estimating Species Trees (MP-EST), to a mammal dataset. The four major clades found in the MP-EST tree are consistent with those in the Bayesian concatenation tree. The bootstrap supports for the species tree estimated by the MP-EST method are more reasonable than the posterior probability supports given by the Bayesian concatenation method in reflecting the level of uncertainty in gene trees and controversies over the relationship of four major groups of placental mammals. Conclusions MP-EST can consistently estimate the topology and branch lengths (in coalescent units) of the species tree. Although the pseudo-likelihood is derived from coalescent theory, and assumes no gene flow or horizontal gene transfer (HGT), the MP-EST method is robust to a small amount of HGT in the…

  18. Bayesian analysis of rare events

    Energy Technology Data Exchange (ETDEWEB)

    Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
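The rejection-sampling view of Bayesian updating that BUS builds on can be sketched in a few lines (a toy conjugate problem, not the BUS algorithm or its FORM/IS/SuS machinery; the prior, data value, and noise level below are invented):

```python
import numpy as np

# Rejection-sampling view of Bayesian updating: draw from the prior and
# accept each sample with probability L(theta)/L_max; accepted samples are
# distributed according to the posterior.
rng = np.random.default_rng(7)

def likelihood(theta, data=2.0, sigma=1.0):
    return np.exp(-0.5 * ((data - theta) / sigma) ** 2)  # Gaussian, up to a constant

theta = rng.normal(0.0, 1.0, size=200_000)   # prior: N(0, 1)
u = rng.uniform(size=theta.size)
posterior = theta[u <= likelihood(theta)]    # here L_max = 1

# Conjugate result for comparison: the exact posterior is N(1, 1/2).
print(posterior.mean(), posterior.var())
```

The appeal for rare events, as the abstract notes, is that once updating is phrased as an acceptance condition, established structural reliability estimators can compute the acceptance probability efficiently.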

  19. Robust bayesian analysis of an autoregressive model with ...

    African Journals Online (AJOL)

    In this work, robust Bayesian analysis of the Bayesian estimation of an autoregressive model with exponential innovations is performed. Using a Bayesian robustness methodology, we show that, using a suitable generalized quadratic loss, we obtain optimal Bayesian estimators of the parameters corresponding to the ...

  20. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre; C. R. Martins

    2006-11-01

    Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors from a normative standpoint, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as has been observed.

  21. Implementing the Bayesian paradigm in risk analysis

    International Nuclear Information System (INIS)

    Aven, T.; Kvaloey, J.T.

    2002-01-01

    The Bayesian paradigm comprises a unified and consistent framework for analyzing and expressing risk. Yet we see rather few examples of applications where the full Bayesian setting has been adopted with specifications of priors for unknown parameters. In this paper, we discuss some of the practical challenges of implementing Bayesian thinking and methods in risk analysis, emphasizing the introduction of probability models and parameters and the associated uncertainty assessments. We conclude that there is a need for a pragmatic view in order to 'successfully' apply the Bayesian approach, such that some probability assignments can be made without adopting the somewhat sophisticated procedure of specifying prior distributions of parameters. A simple risk analysis example is presented to illustrate these ideas.

  22. SODIUM ALUMINOSILICATE FOULING AND CLEANING OF DECONTAMINATED SALT SOLUTION COALESCERS

    International Nuclear Information System (INIS)

    Poirier, M.; Peters, T.; Fondeur, F.; Fink, S.

    2008-01-01

    During initial non-radioactive operations at the Modular Caustic Side Solvent Extraction Unit (MCU), the pressure drop across the decontaminated salt solution coalescer reached ∼10 psi while processing ∼1250 gallons of salt solution, indicating possible fouling or plugging of the coalescer. An analysis of the feed solution and the 'plugged coalescer' concluded that the plugging was due to sodium aluminosilicate solids. MCU personnel requested that Savannah River National Laboratory (SRNL) investigate the formation of the sodium aluminosilicate solids (NAS) and the impact of the solids on the decontaminated salt solution coalescer. Researchers performed developmental testing of the cleaning protocols with a bench-scale coalescer containing 1-inch long segments of a new coalescer element fouled using simulant solution. In addition, the authors obtained a 'plugged' Decontaminated Salt Solution coalescer from non-radioactive testing in the MCU and cleaned it according to the proposed cleaning procedure. Conclusions from this testing include the following: (1) Testing with the bench-scale coalescer showed an increase in pressure drop from solid particles, but the increase was not as large as observed at MCU. (2) Cleaning the bench-scale coalescer with nitric acid reduced the pressure drop and removed a large amount of solid particles (11 g of bayerite if all aluminum is present in that form, or 23 g of sodium aluminosilicate if all silicon is present in that form). (3) Based on analysis of the cleaning solutions from the bench-scale test, the 'dirt capacity' of a 40-inch coalescer for the NAS solids tested is calculated as 450-950 grams. (4) Cleaning the full-scale coalescer with nitric acid reduced the pressure drop and removed a large amount of solid particles (60 g of aluminum and 5 g of silicon). (5) Piping holdup in the full-scale coalescer system caused the pH to differ from the target value. Comparable hold-up in the facility could lead to less effective cleaning and…

  23. Bayesian Latent Class Analysis Tutorial.

    Science.gov (United States)

    Li, Yuelin; Lord-Bessen, Jennifer; Shiyko, Mariya; Loeb, Rebecca

    2018-01-01

    This article is a how-to guide on Bayesian computation using Gibbs sampling, demonstrated in the context of Latent Class Analysis (LCA). It is written for students in quantitative psychology or related fields who have a working knowledge of Bayes' Theorem and conditional probability and have experience in writing computer programs in the statistical language R. The overall goals are to provide an accessible and self-contained tutorial, along with a practical computation tool. We begin with how Bayesian computation is typically described in academic articles. Technical difficulties are addressed by a hypothetical, worked-out example. We show how Bayesian computation can be broken down into a series of simpler calculations, which can then be assembled together to complete a computationally more complex model. The details are described much more explicitly than what is typically available in elementary introductions to Bayesian modeling so that readers are not overwhelmed by the mathematics. Moreover, the provided computer program shows how Bayesian LCA can be implemented with relative ease. The computer program is then applied to a large, real-world data set and explained line-by-line. We outline the general steps in how to extend these considerations to other methodological applications. We conclude with suggestions for further readings.
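The "series of simpler calculations" idea behind Gibbs sampling can be previewed with the smallest useful example: alternating two full conditionals for a normal model (this toy model and its priors are my own choices for illustration, not the article's LCA example):

```python
import numpy as np

# Gibbs sampler for a normal model with unknown mean mu and precision tau:
# alternate draws from the two full conditionals. Priors assumed here:
# flat on mu, Gamma(1, 1) on tau (chosen purely for simplicity).
rng = np.random.default_rng(3)
y = rng.normal(5.0, 2.0, size=500)          # simulated data, true mean 5
n, ybar = y.size, y.mean()

mu, tau = 0.0, 1.0
mus = []
for step in range(4000):
    mu = rng.normal(ybar, 1 / np.sqrt(n * tau))        # mu | tau, y
    sse = np.sum((y - mu) ** 2)
    tau = rng.gamma(1 + n / 2, 1 / (1 + sse / 2))      # tau | mu, y
    if step >= 1000:                                   # discard burn-in
        mus.append(mu)

print(np.mean(mus))   # posterior mean of mu, close to the true value 5
```

Each conditional draw is elementary; the Markov chain assembled from them targets the joint posterior, which is the same construction the tutorial scales up to latent class models.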

  24. Bayesian data analysis in population ecology: motivations, methods, and benefits

    Science.gov (United States)

    Dorazio, Robert

    2016-01-01

    During the 20th century ecologists largely relied on the frequentist system of inference for the analysis of their data. However, in the past few decades ecologists have become increasingly interested in the use of Bayesian methods of data analysis. In this article I provide guidance to ecologists who would like to decide whether Bayesian methods can be used to improve their conclusions and predictions. I begin by providing a concise summary of Bayesian methods of analysis, including a comparison of differences between Bayesian and frequentist approaches to inference when using hierarchical models. Next I provide a list of problems where Bayesian methods of analysis may arguably be preferred over frequentist methods. These problems are usually encountered in analyses based on hierarchical models of data. I describe the essentials required for applying modern methods of Bayesian computation, and I use real-world examples to illustrate these methods. I conclude by summarizing what I perceive to be the main strengths and weaknesses of using Bayesian methods to solve ecological inference problems.

  25. Doing Bayesian data analysis: a tutorial with R and BUGS

    CERN Document Server

    Kruschke, John K

    2011-01-01

    There is an explosion of interest in Bayesian statistics, primarily because recently created computational methods have finally made Bayesian analysis accessible to a wide audience. Doing Bayesian Data Analysis: A Tutorial Introduction with R and BUGS provides an accessible approach to Bayesian data analysis, as material is explained clearly with concrete examples. The book begins with the basics, including essential concepts of probability and random sampling, and gradually progresses to advanced hierarchical modeling methods for realistic data. The text delivers comprehensive coverage of all…

  26. Delimiting Coalescence Genes (C-Genes) in Phylogenomic Data Sets.

    Science.gov (United States)

    Springer, Mark S; Gatesy, John

    2018-02-26

    Coalescence methods have emerged as a popular alternative for inferring species trees with large genomic datasets, because these methods explicitly account for incomplete lineage sorting. However, statistical consistency of summary coalescence methods is not guaranteed unless several model assumptions are true, including the critical assumption that recombination occurs freely among but not within coalescence genes (c-genes), which are the fundamental units of analysis for these methods. Each c-gene has a single branching history, and large sets of these independent gene histories should be the input for genome-scale coalescence estimates of phylogeny. By contrast, numerous studies have reported the results of coalescence analyses in which complete protein-coding sequences are treated as c-genes even though exons for these loci can span more than a megabase of DNA. Empirical estimates of recombination breakpoints suggest that c-genes may be much shorter, especially when large clades with many species are the focus of analysis. Although this idea has been challenged recently in the literature, the inverse relationship between c-gene size and increased taxon sampling in a dataset - the 'recombination ratchet' - is a fundamental property of c-genes. For taxonomic groups characterized by genes with long intron sequences, complete protein-coding sequences are likely not valid c-genes and are inappropriate units of analysis for summary coalescence methods unless they occur in recombination deserts that are devoid of incomplete lineage sorting (ILS). Finally, it has been argued that coalescence methods are robust when the no-recombination-within-loci assumption is violated, but recombination must matter at some scale because ILS, a by-product of recombination, is the raison d'être for coalescence methods. That is, extensive recombination is required to yield the large number of independently segregating c-genes used to infer a species tree. If coalescent methods are powerful…

  7. Bayesian logistic regression analysis

    NARCIS (Netherlands)

    Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.

    2012-01-01

    In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuisance parameters, the Jacobian transformation is an

  8. Bayesian Independent Component Analysis

    DEFF Research Database (Denmark)

    Winther, Ole; Petersen, Kaare Brandt

    2007-01-01

    In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...

  9. Bayesian analysis of CCDM models

    Science.gov (United States)

    Jesus, J. F.; Valentim, R.; Andrade-Oliveira, F.

    2017-09-01

    Creation of Cold Dark Matter (CCDM), in the context of the Einstein Field Equations, produces a negative pressure term which can be used to explain the accelerated expansion of the Universe. In this work we tested six different spatially flat models for matter creation using statistical criteria, in light of SNe Ia data: the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC) and the Bayesian Evidence (BE). These criteria allow models to be compared on both goodness of fit and number of free parameters, penalizing excess complexity. We find that the JO model is slightly favoured over the LJO/ΛCDM model; however, neither of these, nor the Γ = 3αH₀ model, can be discarded by the current analysis. Three other scenarios are discarded, either because of poor fitting or because of an excess of free parameters. A method of increasing the Bayesian evidence through reparameterization, in order to reduce parameter degeneracy, is also developed.
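As a rough illustration of how these information criteria trade goodness of fit against parameter count, here is a minimal sketch; the χ² values and sample size are hypothetical, not taken from the paper:

```python
import math

def aic(chi2_min, k):
    # Akaike Information Criterion: chi^2_min + 2k
    return chi2_min + 2 * k

def bic(chi2_min, k, n):
    # Bayesian Information Criterion: chi^2_min + k ln(n)
    return chi2_min + k * math.log(n)

# Hypothetical best-fit chi-square values for two matter-creation models
# fit to n SNe Ia; the extra parameter of the second model must "buy"
# enough fit improvement to offset its penalty.
n = 580
for name, chi2, k in [("2-parameter model", 562.2, 2),
                      ("3-parameter model", 561.5, 3)]:
    print("%s: AIC=%.1f BIC=%.1f" % (name, aic(chi2, k), bic(chi2, k, n)))
```

Because BIC's ln(n) penalty grows with sample size, it punishes the extra parameter more harshly than AIC, which is why the two criteria can disagree on the preferred model.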

  10. Bayesian analysis of CCDM models

    Energy Technology Data Exchange (ETDEWEB)

    Jesus, J.F. [Universidade Estadual Paulista (Unesp), Câmpus Experimental de Itapeva, Rua Geraldo Alckmin 519, Vila N. Sra. de Fátima, Itapeva, SP, 18409-010 Brazil (Brazil); Valentim, R. [Departamento de Física, Instituto de Ciências Ambientais, Químicas e Farmacêuticas—ICAQF, Universidade Federal de São Paulo (UNIFESP), Unidade José Alencar, Rua São Nicolau No. 210, Diadema, SP, 09913-030 Brazil (Brazil); Andrade-Oliveira, F., E-mail: jfjesus@itapeva.unesp.br, E-mail: valentim.rodolfo@unifesp.br, E-mail: felipe.oliveira@port.ac.uk [Institute of Cosmology and Gravitation—University of Portsmouth, Burnaby Road, Portsmouth, PO1 3FX United Kingdom (United Kingdom)

    2017-09-01

    Creation of Cold Dark Matter (CCDM), in the context of the Einstein Field Equations, produces a negative pressure term which can be used to explain the accelerated expansion of the Universe. In this work we tested six different spatially flat models for matter creation using statistical criteria, in light of SNe Ia data: the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC) and the Bayesian Evidence (BE). These criteria allow models to be compared on both goodness of fit and number of free parameters, penalizing excess complexity. We find that the JO model is slightly favoured over the LJO/ΛCDM model; however, neither of these, nor the Γ = 3αH₀ model, can be discarded by the current analysis. Three other scenarios are discarded, either because of poor fitting or because of an excess of free parameters. A method of increasing the Bayesian evidence through reparameterization, in order to reduce parameter degeneracy, is also developed.

  11. A Bayesian approach to multi-messenger astronomy: identification of gravitational-wave host galaxies

    International Nuclear Information System (INIS)

    Fan, XiLong; Messenger, Christopher; Heng, Ik Siong

    2014-01-01

    We present a general framework for incorporating astrophysical information into Bayesian parameter estimation techniques used by gravitational wave data analysis to facilitate multi-messenger astronomy. Since the progenitors of transient gravitational wave events, such as compact binary coalescences, are likely to be associated with a host galaxy, improvements to the source sky location estimates through the use of host galaxy information are explored. To demonstrate how host galaxy properties can be included, we simulate a population of compact binary coalescences and show that for ∼8.5% of simulations within 200 Mpc, the top 10 most likely galaxies account for ∼50% of the total probability of hosting a gravitational wave source. The true gravitational wave source host galaxy is in the top 10 galaxy candidates ∼10% of the time. Furthermore, we show that by including host galaxy information, a better estimate of the inclination angle of a compact binary gravitational wave source can be obtained. We also demonstrate the flexibility of our method by incorporating the use of either the B or K band into our analysis.

  12. Rapid molecular evolution of human bocavirus revealed by Bayesian coalescent inference.

    Science.gov (United States)

    Zehender, Gianguglielmo; De Maddalena, Chiara; Canuti, Marta; Zappa, Alessandra; Amendola, Antonella; Lai, Alessia; Galli, Massimo; Tanzi, Elisabetta

    2010-03-01

    Human bocavirus (HBoV) is a linear single-stranded DNA virus belonging to the Parvoviridae family that has recently been isolated from the upper respiratory tract of children with acute respiratory infection. All of the strains observed so far segregate into two genotypes (1 and 2) with a low level of polymorphism. Given the recent description of the infection and the lack of epidemiological and molecular data, we estimated the virus's rates of molecular evolution and population dynamics. A dataset of forty-nine dated VP2 sequences, including eight new isolates obtained from pharyngeal swabs of Italian patients with acute respiratory tract infections, was submitted to phylogenetic analysis. The model parameters, evolutionary rates and population dynamics were co-estimated using a Bayesian Markov Chain Monte Carlo approach, and site-specific positive and negative selection was also investigated. Recombination was investigated by seven different methods, and one suspected recombinant strain was excluded from further analysis. The estimated mean evolutionary rate of HBoV was 8.6×10⁻⁴ substitutions/site/year, and the rate at the 1st+2nd codon positions was more than 15 times lower than that at the 3rd codon position. Viral population dynamics analysis revealed that the two known genotypes diverged recently (mean tMRCA: 24 years), and that the epidemic due to HBoV genotype 2 grew exponentially at a rate of 1.01 year⁻¹. Selection analysis of the partial VP2 showed that 8.5% of sites were under significant negative pressure, with no evidence of positive selection. Our results show that, like other parvoviruses, HBoV is characterised by rapid evolution. The low level of polymorphism is probably due to a relatively recent divergence between the circulating genotypes and strong purifying selection acting on viral antigens.

  13. Bayesian Analysis for Penalized Spline Regression Using WinBUGS

    Directory of Open Access Journals (Sweden)

    Ciprian M. Crainiceanu

    2005-09-01

    Full Text Available Penalized splines can be viewed as BLUPs in a mixed model framework, which allows the use of mixed model software for smoothing. Thus, software originally developed for Bayesian analysis of mixed models can be used for penalized spline regression. Bayesian inference for nonparametric models enjoys the flexibility of nonparametric models and the exact inference provided by the Bayesian inferential machinery. This paper provides a simple, yet comprehensive, set of programs for the implementation of nonparametric Bayesian analysis in WinBUGS. Good mixing properties of the MCMC chains are obtained by using low-rank thin-plate splines, while simulation times per iteration are reduced by employing WinBUGS-specific computational tricks.

  14. Study on shielded pump system failure analysis method based on Bayesian network

    International Nuclear Information System (INIS)

    Bao Yilan; Huang Gaofeng; Tong Lili; Cao Xuewu

    2012-01-01

    This paper applies Bayesian networks to system failure analysis, with the aim of improving the knowledge representation of uncertainty logic and multi-fault states in such analyses. A Bayesian network for shielded pump failure analysis is presented, with fault parameter learning used to update the network parameters from new samples. Finally, through Bayesian network inference, the vulnerabilities of the system, the most likely failure modes, and the fault probabilities are obtained. The power of Bayesian networks for analyzing system faults is illustrated by examples. (authors)
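A toy version of such diagnostic inference, with made-up probabilities unrelated to the paper's shielded-pump data, reduces to Bayes' rule on a two-node network:

```python
# Two-node Bayesian network: Bearing wear (B) -> abnormal Vibration (V).
# All numbers are illustrative, not from the paper.
P_B = 0.02                            # prior probability of bearing wear
P_V = {True: 0.9, False: 0.1}         # P(V abnormal | B), P(V abnormal | not B)

def posterior_bearing_given_vibration():
    # Bayes' rule: P(B | V) = P(V|B) P(B) / [P(V|B) P(B) + P(V|~B) P(~B)]
    num = P_V[True] * P_B
    den = num + P_V[False] * (1 - P_B)
    return num / den

print(round(posterior_bearing_given_vibration(), 3))
```

Observing abnormal vibration raises the fault probability from 2% to roughly 15%; a full network adds more fault and symptom nodes and performs the same conditioning by general-purpose inference algorithms.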

  15. A fluid-mechanical model of elastocapillary coalescence

    KAUST Repository

    Singh, Kiran

    2014-03-25

    © 2014 Cambridge University Press. We present a fluid-mechanical model of the coalescence of a number of elastic objects due to surface tension. We consider an array of spring-block elements separated by thin liquid films, whose dynamics are modelled using lubrication theory. With this simplified model of elastocapillary coalescence, we present the results of numerical simulations for a large number of elements, N = O(10⁴). A linear stability analysis shows that pairwise coalescence is always the most unstable mode of deformation. However, the numerical simulations show that the cluster sizes actually produced by coalescence from a small white-noise perturbation have a distribution that depends on the relative strength of surface tension and elasticity, as measured by an elastocapillary number K. Both the maximum cluster size and the mean cluster size scale like K^(-1/2) for small K. An analytical solution for the response of the system to a localized perturbation shows that such perturbations generate propagating disturbance fronts, which leave behind 'frozen-in' clusters of a predictable size that also depends on K. A good quantitative comparison between the cluster-size statistics from noisy perturbations and this 'frozen-in' cluster size suggests that propagating fronts may play a crucial role in the dynamics of coalescence.

  16. A Bayesian approach to meta-analysis of plant pathology studies.

    Science.gov (United States)

    Mila, A L; Ngugi, H K

    2011-01-01

    Bayesian statistical methods are used for meta-analysis in many disciplines, including medicine, molecular biology, and engineering, but have not yet been applied for quantitative synthesis of plant pathology studies. In this paper, we illustrate the key concepts of Bayesian statistics and outline the differences between Bayesian and classical (frequentist) methods in the way parameters describing population attributes are considered. We then describe a Bayesian approach to meta-analysis and present a plant pathological example based on studies evaluating the efficacy of plant protection products that induce systemic acquired resistance for the management of fire blight of apple. In a simple random-effects model assuming a normal distribution of effect sizes and no prior information (i.e., a noninformative prior), the results of the Bayesian meta-analysis are similar to those obtained with classical methods. Implementing the same model with a Student's t distribution and a noninformative prior for the effect sizes, instead of a normal distribution, yields similar results for all but acibenzolar-S-methyl (Actigard) which was evaluated only in seven studies in this example. Whereas both the classical (P = 0.28) and the Bayesian analysis with a noninformative prior (95% credibility interval [CRI] for the log response ratio: -0.63 to 0.08) indicate a nonsignificant effect for Actigard, specifying a t distribution resulted in a significant, albeit variable, effect for this product (CRI: -0.73 to -0.10). These results confirm the sensitivity of the analytical outcome (i.e., the posterior distribution) to the choice of prior in Bayesian meta-analyses involving a limited number of studies. We review some pertinent literature on more advanced topics, including modeling of among-study heterogeneity, publication bias, analyses involving a limited number of studies, and methods for dealing with missing data, and show how these issues can be approached in a Bayesian framework.
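The random-effects core of such a meta-analysis can be sketched as follows; the effect sizes, variances, and the fixed τ² are hypothetical (none of these numbers come from the fire-blight example), and a full Bayesian treatment would place a prior on τ² as well:

```python
import math

# Hypothetical log response ratios and within-study variances from five studies
effects = [-0.41, -0.12, -0.30, -0.05, -0.22]
variances = [0.04, 0.06, 0.05, 0.09, 0.07]

# Random-effects model: with a flat (noninformative) prior on the overall
# mean mu and the between-study variance tau^2 held fixed, the posterior of
# mu is normal with inverse-variance weights 1 / (v_i + tau^2).
tau2 = 0.02
w = [1.0 / (v + tau2) for v in variances]
mu_post = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
sd_post = math.sqrt(1.0 / sum(w))
lo, hi = mu_post - 1.96 * sd_post, mu_post + 1.96 * sd_post
print("posterior mean %.3f, 95%% CrI (%.3f, %.3f)" % (mu_post, lo, hi))
```

Replacing the normal likelihood with a Student's t, as in the paper's sensitivity analysis, removes this conjugate form and requires MCMC instead.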

  17. Fast "coalescent" simulation

    Directory of Open Access Journals (Sweden)

    Wall Jeff D

    2006-03-01

    Full Text Available Abstract Background The amount of genome-wide molecular data is increasing rapidly, as is interest in developing methods appropriate for such data. There is a consequent increasing need for methods that are able to efficiently simulate such data. In this paper we implement the sequentially Markovian coalescent algorithm described by McVean and Cardin and present a further modification to that algorithm which slightly improves the closeness of the approximation to the full coalescent model. The algorithm ignores a class of recombination events known to affect the behavior of the genealogy of the sample, but which do not appear to affect the behavior of generated samples to any substantial degree. Results We show that our software is able to simulate large chromosomal regions, such as those appropriate in a consideration of genome-wide data, in a way that is several orders of magnitude faster than existing coalescent algorithms. Conclusion This algorithm provides a useful resource for those needing to simulate large quantities of data for chromosomal-length regions using an approach that is much more efficient than traditional coalescent models.
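For contrast with the sequentially Markovian approximation, the standard single-locus coalescent (no recombination) is easy to simulate directly; this sketch draws the exponential waiting times of the genealogy:

```python
import random

def coalescent_times(n):
    """Waiting times of the standard (Kingman) coalescent for n lineages.
    While k lineages remain, the time to the next coalescence is
    exponential with rate k(k-1)/2, in units of N_e generations."""
    times = []
    k = n
    while k > 1:
        times.append(random.expovariate(k * (k - 1) / 2.0))
        k -= 1
    return times

random.seed(1)
tmrca = sum(coalescent_times(10))
print("one simulated TMRCA for a sample of 10: %.3f" % tmrca)
```

The expected TMRCA is 2(1 - 1/n), i.e. 1.8 for n = 10. Whole-chromosome simulators such as the one described above must additionally track how this genealogy changes along the sequence under recombination, which is where the Markovian approximation earns its speedup.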

  18. Bayesian methods for data analysis

    CERN Document Server

    Carlin, Bradley P.

    2009-01-01

    Approaches for statistical inference: Introduction; Motivating Vignettes; Defining the Approaches; The Bayes-Frequentist Controversy; Some Basic Bayesian Models. The Bayes approach: Introduction; Prior Distributions; Bayesian Inference; Hierarchical Modeling; Model Assessment; Nonparametric Methods. Bayesian computation: Introduction; Asymptotic Methods; Noniterative Monte Carlo Methods; Markov Chain Monte Carlo Methods. Model criticism and selection: Bayesian Modeling; Bayesian Robustness; Model Assessment; Bayes Factors via Marginal Density Estimation; Bayes Factors

  19. A Gentle Introduction to Bayesian Analysis : Applications to Developmental Research

    NARCIS (Netherlands)

    Van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B.; Neyer, Franz J.; van Aken, Marcel A G

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret the results properly. First,

  20. A gentle introduction to Bayesian analysis : Applications to developmental research

    NARCIS (Netherlands)

    van de Schoot, R.; Kaplan, D.; Denissen, J.J.A.; Asendorpf, J.B.; Neyer, F.J.; van Aken, M.A.G.

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret the results properly. First,

  1. Identifying the rooted species tree from the distribution of unrooted gene trees under the coalescent.

    Science.gov (United States)

    Allman, Elizabeth S; Degnan, James H; Rhodes, John A

    2011-06-01

    Gene trees are evolutionary trees representing the ancestry of genes sampled from multiple populations. Species trees represent populations of individuals (each with many genes) splitting into new populations or species. The coalescent process, which models ancestry of gene copies within populations, is often used to model the probability distribution of gene trees given a fixed species tree. This multispecies coalescent model provides a framework for phylogeneticists to infer species trees from gene trees using maximum likelihood or Bayesian approaches. Because the coalescent models a branching process over time, all trees are typically assumed to be rooted in this setting. Often, however, gene trees inferred by traditional phylogenetic methods are unrooted. We investigate probabilities of unrooted gene trees under the multispecies coalescent model. We show that when there are four species with one gene sampled per species, the distribution of unrooted gene tree topologies identifies the unrooted species tree topology and some, but not all, information in the species tree edges (branch lengths). The location of the root on the species tree is not identifiable in this situation. However, for five or more species with one gene sampled per species, we show that the distribution of unrooted gene tree topologies identifies the rooted species tree topology and all its internal branch lengths. The length of any pendant branch leading to a leaf of the species tree is also identifiable for any species from which more than one gene is sampled.
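The simplest instance of the gene-tree distribution discussed here, three species with one gene each, has a well-known closed form; this sketch evaluates it, with the branch lengths chosen arbitrarily:

```python
import math

def gene_tree_topology_probs(T):
    """Rooted gene-tree topology probabilities for three species with one
    gene sampled per species, given an internal species-tree branch of
    length T in coalescent units (standard multispecies-coalescent result):
    each discordant topology has probability exp(-T)/3."""
    discordant = math.exp(-T) / 3.0
    return {"matches species tree": 1.0 - 2.0 * discordant,
            "discordant A": discordant,
            "discordant B": discordant}

for T in (0.1, 0.5, 2.0):
    p = gene_tree_topology_probs(T)
    print("T=%.1f  concordant=%.3f  each discordant=%.3f"
          % (T, p["matches species tree"], p["discordant A"]))
```

Short internal branches (small T) push all three topology probabilities toward 1/3, which is exactly the incomplete-lineage-sorting regime in which identifiability questions like those in this paper matter most.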

  2. Bayesian linkage and segregation analysis: factoring the problem.

    Science.gov (United States)

    Matthysse, S

    2000-01-01

    Complex segregation analysis and linkage methods are mathematical techniques for the genetic dissection of complex diseases. They are used to delineate complex modes of familial transmission and to localize putative disease susceptibility loci to specific chromosomal locations. The computational problem of Bayesian linkage and segregation analysis is one of integration in high-dimensional spaces. In this paper, three available techniques for Bayesian linkage and segregation analysis are discussed: Markov Chain Monte Carlo (MCMC), importance sampling, and exact calculation. The contribution of each to the overall integration will be explicitly discussed.
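Of the three techniques, MCMC is the most widely used for these high-dimensional integrals; as a generic illustration (a standard-normal target rather than a genetics likelihood), a Metropolis sampler needs only a handful of lines:

```python
import math
import random

# Target: posterior proportional to exp(-theta^2 / 2), i.e. a standard
# normal. A real linkage analysis would replace log_post with the log of
# the unnormalized posterior over the genetic parameters.
def log_post(theta):
    return -0.5 * theta * theta

random.seed(7)
theta = 0.0
samples = []
for _ in range(50000):
    proposal = theta + random.gauss(0.0, 1.0)      # symmetric random walk
    if math.log(random.random()) < log_post(proposal) - log_post(theta):
        theta = proposal                           # accept
    samples.append(theta)                          # rejection keeps old value

burned = samples[5000:]
print("posterior mean estimate: %.2f" % (sum(burned) / len(burned)))
```

Posterior expectations (the integrals in question) are then just averages over the retained samples, which is what makes MCMC attractive when exact integration is infeasible.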

  3. Coalescent: an open-source and scalable framework for exact calculations in coalescent theory

    Science.gov (United States)

    2012-01-01

    Background Currently, there is no open-source, cross-platform and scalable framework for coalescent analysis in population genetics. There is no scalable GUI-based user application either. Such a framework and application would not only drive the creation of more complex and realistic models but also make them truly accessible. Results As a first attempt, we built a framework and user application for the domain of exact calculations in coalescent analysis. The framework provides an API with the concepts of model, data, statistic, phylogeny, gene tree and recursion. Infinite-alleles and infinite-sites models are considered. It defines pluggable computations such as counting and listing all the ancestral configurations and genealogies and computing the exact probability of data. It can visualize a gene tree, trace and visualize the internals of the recursion algorithm for further improvement and attach dynamically a number of output processors. The user application defines jobs in a plug-in like manner so that they can be activated, deactivated, installed or uninstalled on demand. Multiple jobs can be run and their inputs edited. Job inputs are persisted across restarts and running jobs can be cancelled where applicable. Conclusions Coalescent theory plays an increasingly important role in analysing molecular population genetic data. Models involved are mathematically difficult and computationally challenging. An open-source, scalable framework that lets users immediately take advantage of the progress made by others will enable exploration of yet more difficult and realistic models. As models become more complex and mathematically less tractable, the need for an integrated computational approach is obvious. Object-oriented designs, though they have upfront costs, are practical now and can provide such an integrated approach. PMID:23033878

  4. Coalescent: an open-source and scalable framework for exact calculations in coalescent theory

    Directory of Open Access Journals (Sweden)

    Tewari Susanta

    2012-10-01

    Full Text Available Abstract Background Currently, there is no open-source, cross-platform and scalable framework for coalescent analysis in population genetics. There is no scalable GUI-based user application either. Such a framework and application would not only drive the creation of more complex and realistic models but also make them truly accessible. Results As a first attempt, we built a framework and user application for the domain of exact calculations in coalescent analysis. The framework provides an API with the concepts of model, data, statistic, phylogeny, gene tree and recursion. Infinite-alleles and infinite-sites models are considered. It defines pluggable computations such as counting and listing all the ancestral configurations and genealogies and computing the exact probability of data. It can visualize a gene tree, trace and visualize the internals of the recursion algorithm for further improvement and attach dynamically a number of output processors. The user application defines jobs in a plug-in like manner so that they can be activated, deactivated, installed or uninstalled on demand. Multiple jobs can be run and their inputs edited. Job inputs are persisted across restarts and running jobs can be cancelled where applicable. Conclusions Coalescent theory plays an increasingly important role in analysing molecular population genetic data. Models involved are mathematically difficult and computationally challenging. An open-source, scalable framework that lets users immediately take advantage of the progress made by others will enable exploration of yet more difficult and realistic models. As models become more complex and mathematically less tractable, the need for an integrated computational approach is obvious. Object-oriented designs, though they have upfront costs, are practical now and can provide such an integrated approach.

  5. Coalescent: an open-science framework for importance sampling in coalescent theory.

    Science.gov (United States)

    Tewari, Susanta; Spouge, John L

    2015-01-01

    Background. In coalescent theory, computer programs often use importance sampling to calculate likelihoods and other statistical quantities. An importance sampling scheme can exploit human intuition to improve statistical efficiency of computations, but unfortunately, in the absence of general computer frameworks on importance sampling, researchers often struggle to translate new sampling schemes computationally or benchmark against different schemes, in a manner that is reliable and maintainable. Moreover, most studies use computer programs lacking a convenient user interface or the flexibility to meet the current demands of open science. In particular, current computer frameworks can only evaluate the efficiency of a single importance sampling scheme or compare the efficiencies of different schemes in an ad hoc manner. Results. We have designed a general framework (http://coalescent.sourceforge.net; language: Java; License: GPLv3) for importance sampling that computes likelihoods under the standard neutral coalescent model of a single, well-mixed population of constant size over time, following the infinite-sites model of mutation. The framework models the necessary core concepts, comes integrated with several data sets of varying size, implements the standard competing proposals, and integrates tightly with our previous framework for calculating exact probabilities. For a given dataset, it computes the likelihood and provides the maximum likelihood estimate of the mutation parameter. Well-known benchmarks in the coalescent literature validate the accuracy of the framework. The framework provides an intuitive user interface with minimal clutter. For performance, the framework switches automatically to modern multicore hardware, if available. It runs on three major platforms (Windows, Mac and Linux). Extensive tests and coverage make the framework reliable and maintainable. Conclusions. In coalescent theory, many studies of computational efficiency consider only

  6. Coalescent: an open-science framework for importance sampling in coalescent theory

    Directory of Open Access Journals (Sweden)

    Susanta Tewari

    2015-08-01

    Full Text Available Background. In coalescent theory, computer programs often use importance sampling to calculate likelihoods and other statistical quantities. An importance sampling scheme can exploit human intuition to improve statistical efficiency of computations, but unfortunately, in the absence of general computer frameworks on importance sampling, researchers often struggle to translate new sampling schemes computationally or benchmark against different schemes, in a manner that is reliable and maintainable. Moreover, most studies use computer programs lacking a convenient user interface or the flexibility to meet the current demands of open science. In particular, current computer frameworks can only evaluate the efficiency of a single importance sampling scheme or compare the efficiencies of different schemes in an ad hoc manner. Results. We have designed a general framework (http://coalescent.sourceforge.net; language: Java; License: GPLv3) for importance sampling that computes likelihoods under the standard neutral coalescent model of a single, well-mixed population of constant size over time, following the infinite-sites model of mutation. The framework models the necessary core concepts, comes integrated with several data sets of varying size, implements the standard competing proposals, and integrates tightly with our previous framework for calculating exact probabilities. For a given dataset, it computes the likelihood and provides the maximum likelihood estimate of the mutation parameter. Well-known benchmarks in the coalescent literature validate the accuracy of the framework. The framework provides an intuitive user interface with minimal clutter. For performance, the framework switches automatically to modern multicore hardware, if available. It runs on three major platforms (Windows, Mac and Linux). Extensive tests and coverage make the framework reliable and maintainable. Conclusions. In coalescent theory, many studies of computational efficiency

  7. Bayesian Correlation Analysis for Sequence Count Data.

    Directory of Open Access Journals (Sweden)

    Daniel Sánchez-Taltavull

    Full Text Available Evaluating the similarity of different measured variables is a fundamental task of statistics, and a key part of many bioinformatics algorithms. Here we propose a Bayesian scheme for estimating the correlation between different entities' measurements based on high-throughput sequencing data. These entities could be different genes or miRNAs whose expression is measured by RNA-seq, different transcription factors or histone marks whose expression is measured by ChIP-seq, or even combinations of different types of entities. Our Bayesian formulation accounts for both measured signal levels and uncertainty in those levels, due to varying sequencing depth in different experiments and to varying absolute levels of individual entities, both of which affect the precision of the measurements. In comparison with a traditional Pearson correlation analysis, we show that our Bayesian correlation analysis retains high correlations when measurement confidence is high, but suppresses correlations when measurement confidence is low, especially for entities with low signal levels. In addition, we consider the influence of priors on the Bayesian correlation estimate. Perhaps surprisingly, we show that naive, uniform priors on entities' signal levels can lead to highly biased correlation estimates, particularly when different experiments have widely varying sequencing depths. However, we propose two alternative priors that provably mitigate this problem. We also prove that, like traditional Pearson correlation, our Bayesian correlation calculation constitutes a kernel in the machine learning sense, and thus can be used as a similarity measure in any kernel-based machine learning algorithm. We demonstrate our approach on two RNA-seq datasets and one miRNA-seq dataset.
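A stripped-down version of this idea, with made-up counts and a simple Gamma-Poisson model standing in for the paper's actual formulation and priors, propagates posterior uncertainty in the rates into the correlation estimate:

```python
import math
import random
import statistics

# Toy counts for two genes across five experiments with different depths
counts_a = [12, 30, 45, 9, 60]
counts_b = [10, 28, 50, 7, 55]
depths = [1e5, 2e5, 3e5, 1e5, 4e5]

def rate_samples(count, depth, n=2000, a0=0.5, b0=1.0):
    # Gamma(a0 + count, rate b0 + depth) posterior for a Poisson rate;
    # random.gammavariate takes (shape, scale), so scale = 1 / rate.
    return [random.gammavariate(a0 + count, 1.0 / (b0 + depth)) for _ in range(n)]

def pearson(x, y):
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

random.seed(0)
sa = [rate_samples(c, d) for c, d in zip(counts_a, depths)]
sb = [rate_samples(c, d) for c, d in zip(counts_b, depths)]

# Average the correlation over posterior draws instead of correlating
# point estimates; low-count entities then contribute more uncertainty,
# which shrinks the estimate toward zero.
draws = [pearson([s[t] for s in sa], [s[t] for s in sb]) for t in range(2000)]
print("posterior mean correlation: %.2f" % statistics.mean(draws))
```

The point-estimate Pearson correlation of these toy rates is high, but averaging over the posterior attenuates it, illustrating the suppression effect the abstract describes for low-confidence measurements.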

  8. Bayesian dynamic mediation analysis.

    Science.gov (United States)

    Huang, Jing; Yuan, Ying

    2017-12-01

    Most existing methods for mediation analysis assume that mediation is a stationary, time-invariant process, which overlooks the inherently dynamic nature of many human psychological processes and behavioral activities. In this article, we consider mediation as a dynamic process that continuously changes over time. We propose Bayesian multilevel time-varying coefficient models to describe and estimate such dynamic mediation effects. By taking the nonparametric penalized spline approach, the proposed method is flexible and able to accommodate any shape of the relationship between time and mediation effects. Simulation studies show that the proposed method works well and faithfully reflects the true nature of the mediation process. By modeling mediation effect nonparametrically as a continuous function of time, our method provides a valuable tool to help researchers obtain a more complete understanding of the dynamic nature of the mediation process underlying psychological and behavioral phenomena. We also briefly discuss an alternative approach of using dynamic autoregressive mediation model to estimate the dynamic mediation effect. The computer code is provided to implement the proposed Bayesian dynamic mediation analysis. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
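In the time-invariant special case that this dynamic model generalizes, the mediated effect is the product of two coefficients, and its Bayesian credibility interval follows from Monte Carlo; the posterior means and SDs below are invented for illustration:

```python
import random

# Hypothetical normal posteriors for the two paths of a simple
# mediation model: a (X -> mediator) and b (mediator -> Y).
random.seed(42)
a_draws = [random.gauss(0.39, 0.10) for _ in range(20000)]
b_draws = [random.gauss(0.35, 0.12) for _ in range(20000)]

# The mediated effect is the product a*b; its posterior follows by
# multiplying the draws, and the credibility interval is read off
# the empirical quantiles.
ab = sorted(a * b for a, b in zip(a_draws, b_draws))
lo, hi = ab[int(0.025 * len(ab))], ab[int(0.975 * len(ab))]
print("95%% CrI for the mediated effect: (%.3f, %.3f)" % (lo, hi))
```

The dynamic approach of this paper replaces the scalar coefficients a and b with smooth functions of time, so the mediated effect itself becomes a time-varying curve rather than a single product.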

  9. Power in Bayesian Mediation Analysis for Small Sample Research

    NARCIS (Netherlands)

    Miočević, M.; MacKinnon, David; Levy, Roy

    2017-01-01

    Bayesian methods have the potential for increasing power in mediation analysis (Koopman, Howe, Hollenbeck, & Sin, 2015; Yuan & MacKinnon, 2009). This article compares the power of Bayesian credibility intervals for the mediated effect to the power of normal theory, distribution of the product,

  10. Bayesian Meta-Analysis of Coefficient Alpha

    Science.gov (United States)

    Brannick, Michael T.; Zhang, Nanhua

    2013-01-01

    The current paper describes and illustrates a Bayesian approach to the meta-analysis of coefficient alpha. Alpha is the most commonly used estimate of the reliability or consistency (freedom from measurement error) for educational and psychological measures. The conventional approach to meta-analysis uses inverse variance weights to combine…

  11. Multiview Bayesian Correlated Component Analysis

    DEFF Research Database (Denmark)

    Kamronn, Simon Due; Poulsen, Andreas Trier; Hansen, Lars Kai

    2015-01-01

    are identical. Here we propose a hierarchical probabilistic model that can infer the level of universality in such multiview data, from completely unrelated representations, corresponding to canonical correlation analysis, to identical representations as in correlated component analysis. This new model, which we denote Bayesian correlated component analysis, evaluates favorably against three relevant algorithms in simulated data. A well-established benchmark EEG data set is used to further validate the new model and infer the variability of spatial representations across multiple subjects.

  12. A parallel algorithm for filtering gravitational waves from coalescing binaries

    International Nuclear Information System (INIS)

    Sathyaprakash, B.S.; Dhurandhar, S.V.

    1992-10-01

    Coalescing binary stars are perhaps the most promising sources for the observation of gravitational waves with laser interferometric gravity wave detectors. The waveform from these sources can be predicted with sufficient accuracy for matched filtering techniques to be applied. In this paper we present a parallel algorithm for detecting signals from coalescing compact binaries by the method of matched filtering. We also report the details of its implementation on a 256-node connection machine consisting of a network of transputers. The results of our analysis indicate that parallel processing is a promising approach to on-line analysis of data from gravitational wave detectors to filter out coalescing binary signals. The algorithm described is quite general in that the kernel of the algorithm is applicable to any set of matched filters. (author). 15 refs, 4 figs
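    The kernel of matched filtering referred to in the abstract can be sketched serially (the paper's contribution is the parallel implementation, which is not reproduced here). The toy chirp template and injection point below are illustrative, not a physical waveform.

```python
import numpy as np

rng = np.random.default_rng(2)

def matched_filter(data, template):
    """Cross-correlate data with a unit-energy template via the FFT,
    returning the correlation (detection statistic) at every lag."""
    tpl = template / np.sqrt(np.sum(template**2))
    n = len(data)
    tpl_padded = np.zeros(n)
    tpl_padded[:len(tpl)] = tpl
    return np.fft.irfft(np.fft.rfft(data) * np.conj(np.fft.rfft(tpl_padded)), n)

# Toy "chirp" template standing in for a coalescing-binary waveform.
tt = np.linspace(0.0, 1.0, 256)
template = np.sin(2 * np.pi * (5 + 15 * tt) * tt) * tt

# Bury the template in white noise at a known location, then recover it.
data = rng.normal(scale=0.5, size=4096)
inject_at = 1000
data[inject_at:inject_at + len(template)] += 2.0 * template

stat = matched_filter(data, template)
recovered = int(np.argmax(np.abs(stat)))   # lag of the loudest match
```

The parallelism described in the paper comes from distributing many such template correlations across processors; the FFT-based kernel itself is unchanged.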

  13. Le phénomène de coalescence. Etude bibliographique Coalescence Phenomena. A Review

    Directory of Open Access Journals (Sweden)

    Palermo T.

    2006-11-01

    We present a review of the experimental and theoretical work on the phenomenon of coalescence, limiting the discussion mainly to the case of an isolated drop coalescing at a flat or deformable interface. Coalescence divides into two distinct stages: drainage and rupture of the interfacial film, the film of continuous-phase liquid separating the drop from the interface. The rupture condition is controlled mainly by electrostatic double-layer forces and by Van der Waals forces. Experimental results reveal a phenomenon of partial coalescence as well as the existence of a statistical distribution of coalescence times. They also show the complex influence of many physical and physico-chemical parameters on the coalescence time. The main theoretical models describing the drainage and rupture of liquid films are reviewed. Among other things, these models lead to mathematical expressions relating the coalescence time to parameters such as the interfacial tension, the densities and viscosities of the fluids, and the drop size. The problem linked to the stability of oil-in-water emulsions (e.g. deoiling of water) and water-in-oil emulsions (e.g. dehydration of crudes) is one of the major problems encountered in the petroleum industry. From the thermodynamic standpoint, an emulsion is always unstable (Fig. I.1). The kinematic stability characterizing the separation rate of the dispersed phase from the continuous phase can nonetheless be controlled by the coalescence of droplets present in the emulsion (Fig. I.2). This article reviews various experimental and theoretical works on the phenomenon of coalescence, but the discussion is limited mainly to the coalescence of a single drop at a flat or deformable interface. The coalescence of a single drop is governed

  14. Bayesian nonparametric data analysis

    CERN Document Server

    Müller, Peter; Jara, Alejandro; Hanson, Tim

    2015-01-01

    This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.

  15. Markov Chain Monte Carlo Methods for Bayesian Data Analysis in Astronomy

    Science.gov (United States)

    Sharma, Sanjib

    2017-08-01

    Markov Chain Monte Carlo based Bayesian data analysis has now become the method of choice for analyzing and interpreting data in almost all disciplines of science. In astronomy, over the last decade, we have also seen a steady increase in the number of papers that employ Monte Carlo based Bayesian analysis. New, efficient Monte Carlo based methods are continuously being developed and explored. In this review, we first explain the basics of Bayesian theory and discuss how to set up data analysis problems within this framework. Next, we provide an overview of various Monte Carlo based methods for performing Bayesian data analysis. Finally, we discuss advanced ideas that enable us to tackle complex problems and thus hold great promise for the future. We also distribute downloadable computer software (available at https://github.com/sanjibs/bmcmc/ ) that implements some of the algorithms and examples discussed here.
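    As a minimal illustration of the random-walk Metropolis algorithm, the simplest member of the MCMC family surveyed in this review (and independent of the bmcmc package distributed by the author), the following sketch samples the posterior of a Gaussian mean under a flat prior; all data are simulated.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated data: 100 noisy measurements with unknown mean mu.
data = rng.normal(loc=2.0, scale=1.0, size=100)

def log_posterior(mu):
    """Flat prior on mu, Gaussian likelihood with known unit variance."""
    return -0.5 * np.sum((data - mu) ** 2)

n_steps, step = 20000, 0.3
chain = np.empty(n_steps)
mu, lp = 0.0, log_posterior(0.0)
for i in range(n_steps):
    prop = mu + step * rng.normal()            # random-walk proposal
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept rule
        mu, lp = prop, lp_prop
    chain[i] = mu

posterior_mean = chain[5000:].mean()           # discard burn-in
```

With a flat prior the posterior mean should agree with the sample mean of the data, which provides a quick sanity check on the chain.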

  16. Bayesian analysis in plant pathology.

    Science.gov (United States)

    Mila, A L; Carriquiry, A L

    2004-09-01

    Bayesian methods are currently much discussed and applied in several disciplines from molecular biology to engineering. Bayesian inference is the process of fitting a probability model to a set of data and summarizing the results via probability distributions on the parameters of the model and on unobserved quantities such as predictions for new observations. In this paper, after a short introduction to Bayesian inference, we present the basic features of the Bayesian methodology using examples from sequencing genomic fragments, analyzing microarray gene-expression levels, reconstructing disease maps, and designing experiments.
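    A minimal example in the spirit of this abstract: fitting a probability model (binomial) to survey data and summarizing the result as a posterior distribution. The survey counts and the uniform Beta(1, 1) prior are hypothetical.

```python
import math

# Hypothetical field survey: 12 of 50 plants show disease symptoms.
alpha0, beta0 = 1.0, 1.0                    # uniform Beta(1, 1) prior
diseased, healthy = 12, 38

a, b = alpha0 + diseased, beta0 + healthy   # Beta posterior parameters
posterior_mean = a / (a + b)

# Central 95% credible interval from the posterior CDF on a fine grid.
log_norm = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
grid = [i / 10000 for i in range(1, 10000)]
pdf = [math.exp(log_norm + (a - 1) * math.log(p) + (b - 1) * math.log(1 - p))
       for p in grid]
cdf = []
acc = 0.0
for v in pdf:
    acc += v / 10000
    cdf.append(acc)
lo = grid[next(i for i, c in enumerate(cdf) if c >= 0.025)]
hi = grid[next(i for i, c in enumerate(cdf) if c >= 0.975)]
```

The entire inference is summarized by a distribution over the incidence rate rather than a point estimate, which is the core feature of the Bayesian approach the abstract describes.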

  17. Crack Coalescence in Molded Gypsum and Carrara Marble

    Science.gov (United States)

    Wong, N.; Einstein, H. H.

    2007-12-01

    This research investigates the fracturing and coalescence behavior in prismatic laboratory-molded gypsum and Carrara marble specimens, which consist of either one or two pre-existing open flaws, under uniaxial compression. The tests are monitored by a high speed video system with a frame rate up to 24,000 frames/second. It allows one to precisely observe the cracking mechanisms, in particular whether shear or tensile fracturing takes place. Seven crack types and nine crack coalescence categories are identified. The flaw inclination angle, the ligament length and the bridging angle between two flaws have different extents of influence on the coalescence patterns. For coplanar flaws, as the flaw inclination angle increases, there is a general trend of variation from shear coalescence to tensile coalescence. For stepped flaws, as the bridging angle changes from negative to small positive, and further up to large positive values, the coalescence generally progresses from categories of no coalescence, indirect coalescence to direct coalescence. For direct coalescence, it generally progresses from shear, mixed shear-tensile to tensile as the bridging angle increases. Some differences in fracturing and coalescence processes are observed in gypsum and marble; in particular, crack initiation in marble is preceded by the development of macroscopic white patches, which are not observed in gypsum. Scanning Electron Microprobe (SEM) study reveals that the white patches consist of zones of microcracks (process zones).

  18. Self-consistent Bayesian analysis of space-time symmetry studies

    International Nuclear Information System (INIS)

    Davis, E.D.

    1996-01-01

    We introduce a Bayesian method for the analysis of epithermal neutron transmission data on space-time symmetries in which unique assignment of the prior is achieved by maximisation of the cross entropy and the imposition of a self-consistency criterion. Unlike the maximum likelihood method used in previous analyses of parity-violation data, our method is freed of an ad hoc cutoff parameter. Monte Carlo studies indicate that our self-consistent Bayesian analysis is superior to the maximum likelihood method when applied to the small data samples typical of symmetry studies. (orig.)

  19. Bubble coalescence in breathing DNA

    DEFF Research Database (Denmark)

    Novotný, Tomas; Pedersen, Jonas Nyvold; Ambjörnsson, Tobias

    2007-01-01

    We investigate the coalescence of two DNA bubbles initially located at weak segments and separated by a more stable barrier region in a designed construct of double-stranded DNA. The characteristic time for bubble coalescence and the corresponding distribution are derived, as well as the distribu… vicious walkers in opposite potentials.

  20. Bubble coalescence in a Newtonian fluid

    Science.gov (United States)

    Garg, Vishrut; Basaran, Osman

    2017-11-01

    Bubble coalescence plays a central role in the hydrodynamics of gas-liquid systems such as bubble column reactors, spargers, and foams. Two bubbles approaching each other at velocity V coalesce when the thin film between them ruptures, which is often the rate-limiting step. Experimental studies of this system are difficult, and recent works provide conflicting results on the effect of V on coalescence times. We simulate the head-on approach of two bubbles of equal radii R in an incompressible Newtonian fluid (density ρ, viscosity μ, and surface tension σ) by solving numerically the free boundary problem comprised of the Navier-Stokes and continuity equations. Simulations are made challenging by the existence of highly disparate length scales, i.e. the film thickness and the bubble radii, which are resolved by using the method of elliptic mesh generation. For a given liquid, the bubbles are shown to coalesce for all velocities below a critical value. The effects of the Ohnesorge number Oh = μ/√(ρσR) on coalescence time and critical velocity are also investigated.
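    For reference, the Ohnesorge number quoted in the abstract is straightforward to evaluate; the fluid properties below are illustrative textbook values, not those used in the paper.

```python
import math

def ohnesorge(mu, rho, sigma, R):
    """Oh = mu / sqrt(rho * sigma * R): ratio of viscous forces to
    inertial and surface-tension forces for a bubble of radius R."""
    return mu / math.sqrt(rho * sigma * R)

# Millimetre-scale air bubble in water (approximate room-temperature values).
oh_water = ohnesorge(mu=1.0e-3, rho=998.0, sigma=0.072, R=1.0e-3)

# The same bubble in a liquid 100x more viscous: Oh scales linearly with mu.
oh_viscous = ohnesorge(mu=0.1, rho=998.0, sigma=0.072, R=1.0e-3)
```

For water this gives Oh of order 10^-3, the low-viscosity regime in which film drainage dynamics are dominated by inertia and surface tension.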

  1. Bayesian cost-effectiveness analysis with the R package BCEA

    CERN Document Server

    Baio, Gianluca; Heath, Anna

    2017-01-01

    The book provides a description of the process of health economic evaluation and modelling for cost-effectiveness analysis, particularly from the perspective of a Bayesian statistical approach. Some relevant theory and introductory concepts are presented using practical examples and two running case studies. The book also describes in detail how to perform health economic evaluations using the R package BCEA (Bayesian Cost-Effectiveness Analysis). BCEA can be used to post-process the results of a Bayesian cost-effectiveness model and perform advanced analyses producing standardised and highly customisable outputs. It presents all the features of the package, including its many functions and their practical application, as well as its user-friendly web interface. The book is a valuable resource for statisticians and practitioners working in the field of health economics wanting to simplify and standardise their workflow, for example in the preparation of dossiers in support of marketing authorisation, or acade...

  2. Coalescence measurements for evolving foams monitored by real-time projection imaging

    International Nuclear Information System (INIS)

    Myagotin, A; Helfen, L; Baumbach, T

    2009-01-01

    Real-time radiographic projection imaging together with novel spatio-temporal image analysis is shown to be a powerful technique for the quantitative analysis of coalescence processes accompanying the generation and temporal evolution of foams and emulsions. Coalescence events can be identified as discontinuities in a spatio-temporal image representing a sequence of projection images. Detection, identification of intensity and localization of the discontinuities exploit a violation criterion of the Fourier shift theorem and are based on recursive spatio-temporal image partitioning. The proposed method is suited for automated measurements of discontinuity rates (i.e., discontinuity intensity per unit time), so that large series of radiographs can be analyzed without user intervention. The application potential is demonstrated by the quantification of coalescence during the formation and decay of metal foams monitored by real-time x-ray radiography.

  3. Operational modal analysis modeling, Bayesian inference, uncertainty laws

    CERN Document Server

    Au, Siu-Kui

    2017-01-01

    This book presents operational modal analysis (OMA), employing a coherent and comprehensive Bayesian framework for modal identification and covering stochastic modeling, theoretical formulations, computational algorithms, and practical applications. Mathematical similarities and philosophical differences between Bayesian and classical statistical approaches to system identification are discussed, allowing their mathematical tools to be shared and their results correctly interpreted. Many chapters can be used as lecture notes for the general topic they cover beyond the OMA context. After an introductory chapter (1), Chapters 2–7 present the general theory of stochastic modeling and analysis of ambient vibrations. Readers are first introduced to the spectral analysis of deterministic time series (2) and structural dynamics (3), which do not require the use of probability concepts. The concepts and techniques in these chapters are subsequently extended to a probabilistic context in Chapter 4 (on stochastic pro...

  4. Mechanism and simulation of droplet coalescence in molten steel

    Science.gov (United States)

    Ni, Bing; Zhang, Tao; Ni, Hai-qi; Luo, Zhi-guo

    2017-11-01

    Droplet coalescence in liquid steel was carefully investigated through observations of the distribution pattern of inclusions in solidified steel samples. The process of droplet coalescence was slow, and the critical Weber number (We) was used to evaluate the coalescence or separation of droplets. The relationship between the collision parameter and the critical We indicated whether slow coalescence or bouncing of droplets occurred. The critical We was 5.5, which means that the droplets gradually coalesce when We ≤ 5.5, whereas they bounce when We > 5.5. For carbonate wire feeding into liquid steel, a mathematical model implementing a combined computational fluid dynamics (CFD)-discrete element method (DEM) approach was developed to simulate the movement and coalescence of variably sized droplets in a bottom-argon-blowing ladle. In the CFD model, the flow field was solved on the premise that the fluid was a continuous medium. Meanwhile, the droplets were dispersed in the DEM model, and a coalescence criterion for the particles was added to simulate the collision-coalescence process of the particles. The numerical simulation results and observations of inclusion coalescence in steel samples are consistent.
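    The reported We ≤ 5.5 coalescence criterion can be sketched as a simple decision rule. The Weber-number definition used here (relative velocity and droplet diameter) and the material properties are assumptions for illustration only, not values taken from the paper.

```python
def weber(rho, v_rel, d, sigma):
    """We = rho * v_rel**2 * d / sigma (one common form for droplet
    collisions; exact definitions vary in the literature)."""
    return rho * v_rel**2 * d / sigma

def collision_outcome(we, we_crit=5.5):
    """Decision rule reported in the abstract: coalesce if We <= 5.5."""
    return "coalesce" if we <= we_crit else "bounce"

# Illustrative (not measured) values for droplets in molten steel.
we_slow = weber(rho=7000.0, v_rel=0.01, d=1.0e-4, sigma=1.2)
we_fast = weber(rho=7000.0, v_rel=2.0, d=1.0e-3, sigma=1.2)
outcome_slow = collision_outcome(we_slow)   # gentle approach
outcome_fast = collision_outcome(we_fast)   # energetic collision
```

In a CFD-DEM simulation this rule would be evaluated at each detected particle contact to decide whether the two discrete elements merge or rebound.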

  5. Book review: Bayesian analysis for population ecology

    Science.gov (United States)

    Link, William A.

    2011-01-01

    Brian Dennis described the field of ecology as “fertile, uncolonized ground for Bayesian ideas.” He continued: “The Bayesian propagule has arrived at the shore. Ecologists need to think long and hard about the consequences of a Bayesian ecology. The Bayesian outlook is a successful competitor, but is it a weed? I think so.” (Dennis 2004)

  6. Prior elicitation and Bayesian analysis of the Steroids for Corneal Ulcers Trial.

    Science.gov (United States)

    See, Craig W; Srinivasan, Muthiah; Saravanan, Somu; Oldenburg, Catherine E; Esterberg, Elizabeth J; Ray, Kathryn J; Glaser, Tanya S; Tu, Elmer Y; Zegans, Michael E; McLeod, Stephen D; Acharya, Nisha R; Lietman, Thomas M

    2012-12-01

    To elicit expert opinion on the use of adjunctive corticosteroid therapy in bacterial corneal ulcers, and to perform a Bayesian analysis of the Steroids for Corneal Ulcers Trial (SCUT) using expert opinion as a prior probability. The SCUT was a placebo-controlled trial assessing visual outcomes in patients receiving topical corticosteroids or placebo as adjunctive therapy for bacterial keratitis. Questionnaires were conducted at scientific meetings in India and North America to gauge expert consensus on the perceived benefit of corticosteroids as adjunct treatment. Bayesian analysis, using the questionnaire data as a prior probability and the primary outcome of SCUT as a likelihood, was performed. For comparison, an additional Bayesian analysis was performed using the results of the SCUT pilot study as a prior distribution. Indian respondents believed there to be a 1.21 Snellen line improvement, and North American respondents believed there to be a 1.24 line improvement with corticosteroid therapy. The SCUT primary outcome found a non-significant 0.09 Snellen line benefit with corticosteroid treatment. The results of the Bayesian analysis estimated a slightly greater benefit than did the SCUT primary analysis (0.19 lines versus 0.09 lines). Indian and North American experts had similar expectations of the effectiveness of corticosteroids in bacterial corneal ulcers: both groups expected corticosteroids to markedly improve visual outcomes. Bayesian analysis produced results very similar to those produced by the SCUT primary analysis. The similarity in results is likely due to the large sample size of SCUT and helps validate the results of SCUT.
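    The prior-plus-likelihood combination described here can be sketched with a conjugate normal-normal update, in which the posterior mean is a precision-weighted average of the elicited prior mean and the trial estimate. The standard deviations below are hypothetical placeholders, not the SCUT values, so the output only illustrates the mechanics, not the published 0.19-line result.

```python
def normal_update(prior_mean, prior_sd, lik_mean, lik_sd):
    """Conjugate normal-normal update: the posterior mean is the
    precision-weighted average of prior and likelihood means."""
    w_prior = 1.0 / prior_sd**2
    w_lik = 1.0 / lik_sd**2
    post_var = 1.0 / (w_prior + w_lik)
    post_mean = post_var * (w_prior * prior_mean + w_lik * lik_mean)
    return post_mean, post_var**0.5

# Expert prior centred near the elicited ~1.2 Snellen-line benefit;
# trial estimate centred on 0.09 lines. Both spreads are hypothetical.
post_mean, post_sd = normal_update(1.2, 1.0, 0.09, 0.35)
```

Because a large trial makes the likelihood far more precise than the elicited prior, the posterior sits close to the trial estimate, which is the qualitative behavior the abstract reports.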

  7. Bayesian Analysis of Bubbles in Asset Prices

    Directory of Open Access Journals (Sweden)

    Andras Fulop

    2017-10-01

    We develop a new model where the dynamic structure of the asset price, after the fundamental value is removed, is subject to two different regimes. One regime reflects the normal period, where the asset price divided by the dividend is assumed to follow a mean-reverting process around a stochastic long-run mean. The second regime reflects the bubble period, with explosive behavior. Stochastic switches between the two regimes and non-constant probabilities of exit from the bubble regime are both allowed. A Bayesian learning approach is employed to jointly estimate the latent states and the model parameters in real time. An important feature of our Bayesian method is that we are able to deal with parameter uncertainty and, at the same time, to learn about the states and the parameters sequentially, allowing for real-time model analysis. This feature is particularly useful for market surveillance. Analysis using simulated data reveals that our method has good power properties for detecting bubbles. Empirical analysis using price-dividend ratios of the S&P 500 highlights the advantages of our method.

  8. Coalescence preference in dense packing of bubbles

    Science.gov (United States)

    Kim, Yeseul; Gim, Bopil; Weon, Byung Mook

    2015-11-01

    Coalescence preference is the tendency of a bubble merged from the contact of two parent bubbles to be positioned closer to the larger parent. Here, we show that coalescence preference can be blocked by dense packing of neighboring bubbles. We use high-speed, high-resolution X-ray microscopy to clearly visualize individual coalescence events, which occur within microseconds, inside dense packings of microbubbles with a local packing fraction of ~40%. Previous theory and experimental evidence predict a power of -5 between the relative coalescence position and the parent size. However, our new observation of coalescence preference in densely packed microbubbles shows a different power of -2. We believe that this result may be important for understanding coalescence dynamics in dense packings of soft matter. This work (NRF-2013R1A22A04008115) was supported by the Mid-career Researcher Program through an NRF grant funded by the MEST, by the Ministry of Science, ICT and Future Planning (2009-0082580), and by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (NRF-2012R1A6A3A04039257).
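    The exponents quoted above (-5 versus -2) are the kind of quantity obtained by a log-log power-law fit of relative coalescence position against parent size ratio. The sketch below recovers an exponent of -2 from synthetic data generated purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic measurements following x_rel ~ ratio**(-2) with log-normal
# scatter, mimicking the exponent reported for densely packed bubbles.
ratio = np.linspace(1.1, 5.0, 40)          # parent bubble size ratio
x_rel = ratio**-2.0 * np.exp(rng.normal(scale=0.05, size=ratio.size))

# The power-law exponent is the slope of the log-log regression line.
slope, intercept = np.polyfit(np.log(ratio), np.log(x_rel), 1)
```

Fitting in log-log space turns the power law into a straight line, so the exponent and its change between dilute (-5) and densely packed (-2) conditions can be read directly off the slope.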

  9. Bayesian phylogeny analysis via stochastic approximation Monte Carlo

    KAUST Repository

    Cheon, Sooyoung; Liang, Faming

    2009-01-01

    in simulating from the posterior distribution of phylogenetic trees, rendering the inference ineffective. In this paper, we apply an advanced Monte Carlo algorithm, the stochastic approximation Monte Carlo algorithm, to Bayesian phylogeny analysis. Our method

  10. Satellite Formation during Coalescence of Unequal Size Drops

    KAUST Repository

    Zhang, F. H.

    2009-03-12

    The coalescence of a drop with a flat liquid surface pinches off a satellite from its top, in the well-known coalescence cascade, whereas the coalescence of two equally sized drops does not appear to leave such a satellite. Herein we perform experiments to identify the critical diameter ratio of two drops, above which a satellite is produced during their coalescence. We find that the critical parent ratio is as small as 1.55, but grows monotonically with the Ohnesorge number. The daughter size is typically about 50% of the mother drop. However, we have identified novel pinch-off dynamics close to the critical size ratio, where the satellite does not fully separate, but rather goes directly into a second stage of the coalescence cascade, thus generating a much smaller satellite droplet.

  11. Satellite Formation during Coalescence of Unequal Size Drops

    KAUST Repository

    Zhang, F. H.; Li, E. Q.; Thoroddsen, Sigurdur T

    2009-01-01

    The coalescence of a drop with a flat liquid surface pinches off a satellite from its top, in the well-known coalescence cascade, whereas the coalescence of two equally sized drops does not appear to leave such a satellite. Herein we perform experiments to identify the critical diameter ratio of two drops, above which a satellite is produced during their coalescence. We find that the critical parent ratio is as small as 1.55, but grows monotonically with the Ohnesorge number. The daughter size is typically about 50% of the mother drop. However, we have identified novel pinch-off dynamics close to the critical size ratio, where the satellite does not fully separate, but rather goes directly into a second stage of the coalescence cascade, thus generating a much smaller satellite droplet.

  12. Role of microtexture in the interaction and coalescence of hydrogen-induced cracks

    Energy Technology Data Exchange (ETDEWEB)

    Venegas, V. [Departamento de Ingenieria Metalurgica, IPN-ESIQIE, UPALM Edif. 7, Zacatenco, Mexico D.F. 07738 (Mexico); Caleyo, F. [Departamento de Ingenieria Metalurgica, IPN-ESIQIE, UPALM Edif. 7, Zacatenco, Mexico D.F. 07738 (Mexico)], E-mail: fcaleyo@gmail.com; Baudin, T. [Laboratoire de Physico-Chimie de l'Etat Solide, ICMMO, UMR CNRS 8182, Batiment 410, Universite de Paris Sud, 91405, Orsay, Cedex (France); Hallen, J.M. [Departamento de Ingenieria Metalurgica, IPN-ESIQIE, UPALM Edif. 7, Zacatenco, Mexico D.F. 07738 (Mexico); Penelle, R. [Laboratoire de Physico-Chimie de l'Etat Solide, ICMMO, UMR CNRS 8182, Batiment 410, Universite de Paris Sud, 91405, Orsay, Cedex (France)

    2009-05-15

    The role of microtexture in hydrogen-induced crack interaction and coalescence is investigated in line pipe steels using electron backscatter diffraction. Experimental evidence shows that, depending on the local grain orientation, crack interaction and coalescence can depart from the conditions predicted by the mixed-mode fracture mechanics of isotropic linear elastic materials. Stress simulation and microtexture analysis are used to explain the experimental observations.

  13. Role of microtexture in the interaction and coalescence of hydrogen-induced cracks

    International Nuclear Information System (INIS)

    Venegas, V.; Caleyo, F.; Baudin, T.; Hallen, J.M.; Penelle, R.

    2009-01-01

    The role of microtexture in hydrogen-induced crack interaction and coalescence is investigated in line pipe steels using electron backscatter diffraction. Experimental evidence shows that, depending on the local grain orientation, crack interaction and coalescence can depart from the conditions predicted by the mixed-mode fracture mechanics of isotropic linear elastic materials. Stress simulation and microtexture analysis are used to explain the experimental observations.

  14. Review of bayesian statistical analysis methods for cytogenetic radiation biodosimetry, with a practical example

    International Nuclear Information System (INIS)

    Ainsbury, Elizabeth A.; Lloyd, David C.; Rothkamm, Kai; Vinnikov, Volodymyr A.; Maznyk, Nataliya A.; Puig, Pedro; Higueras, Manuel

    2014-01-01

    Classical methods of assessing the uncertainty associated with radiation doses estimated using cytogenetic techniques are now extremely well defined. However, several authors have suggested that a Bayesian approach to uncertainty estimation may be more suitable for cytogenetic data, which are inherently stochastic in nature. The Bayesian analysis framework focuses on identification of probability distributions (for yield of aberrations or estimated dose), which also means that uncertainty is an intrinsic part of the analysis, rather than an 'afterthought'. In this paper Bayesian, as well as some more advanced classical, data analysis methods for radiation cytogenetics are reviewed that have been proposed in the literature. A practical overview of Bayesian cytogenetic dose estimation is also presented, with worked examples from the literature. (authors)
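    A worked example of the kind of Bayesian dose estimation reviewed here: a Poisson likelihood for the aberration count with a linear-quadratic yield curve and a flat prior on dose, with the posterior evaluated on a grid. The yield coefficients and the observed counts are hypothetical, not calibration values from the paper.

```python
import math

# Hypothetical linear-quadratic dicentric yield curve (per cell):
# lambda(D) = c + alpha*D + beta*D^2. Coefficients are illustrative only.
c, alpha, beta = 0.001, 0.03, 0.06

def log_poisson(k, lam):
    """Log Poisson probability of observing k events with mean lam."""
    return k * math.log(lam) - lam - math.lgamma(k + 1)

dics, cells = 25, 500          # observed: 25 dicentrics in 500 scored cells

# Flat prior on dose over a 0-5 Gy grid; posterior by direct normalisation.
doses = [i * 0.01 for i in range(1, 501)]
log_post = [log_poisson(dics, cells * (c + alpha * d + beta * d * d))
            for d in doses]
m = max(log_post)
weights = [math.exp(lp - m) for lp in log_post]
z = sum(weights)
post = [w / z for w in weights]

map_dose = doses[post.index(max(post))]
mean_dose = sum(d * p for d, p in zip(doses, post))
```

Because the whole posterior over dose is available, uncertainty is "an intrinsic part of the analysis" in exactly the sense the abstract describes: credible intervals come from the same grid, with no separate error-propagation step.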

  15. Research & development and growth: A Bayesian model averaging analysis

    Czech Academy of Sciences Publication Activity Database

    Horváth, Roman

    2011-01-01

    Roč. 28, č. 6 (2011), s. 2669-2673 ISSN 0264-9993. [Society for Non-linear Dynamics and Econometrics Annual Conference. Washington DC, 16.03.2011-18.03.2011] R&D Projects: GA ČR GA402/09/0965 Institutional research plan: CEZ:AV0Z10750506 Keywords: Research and development * Growth * Bayesian model averaging Subject RIV: AH - Economics Impact factor: 0.701, year: 2011 http://library.utia.cas.cz/separaty/2011/E/horvath-research & development and growth a bayesian model averaging analysis.pdf

  16. Bayesian phylogeny analysis via stochastic approximation Monte Carlo

    KAUST Repository

    Cheon, Sooyoung

    2009-11-01

    Monte Carlo methods have received much attention in the recent literature on phylogeny analysis. However, conventional Markov chain Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, tend to get trapped in a local mode when simulating from the posterior distribution of phylogenetic trees, rendering the inference ineffective. In this paper, we apply an advanced Monte Carlo algorithm, the stochastic approximation Monte Carlo (SAMC) algorithm, to Bayesian phylogeny analysis. Our method is compared with two popular Bayesian phylogeny software packages, BAMBE and MrBayes, on simulated and real datasets. The numerical results indicate that our method outperforms BAMBE and MrBayes. Among the three methods, SAMC produces the consensus trees with the highest similarity to the true trees and the model parameter estimates with the smallest mean square errors, yet costs the least CPU time. © 2009 Elsevier Inc. All rights reserved.

  17. Bayesian data analysis tools for atomic physics

    Science.gov (United States)

    Trassinelli, Martino

    2017-10-01

    We present an introduction to some concepts of Bayesian data analysis in the context of atomic physics. Starting from the basic rules of probability, we present Bayes' theorem and its applications. In particular, we discuss how to calculate simple and joint probability distributions and the Bayesian evidence, a model-dependent quantity that allows one to assign probabilities to different hypotheses from the analysis of the same data set. To give some practical examples, these methods are applied to two concrete cases. In the first example, the presence or absence of a satellite line in an atomic spectrum is investigated. In the second example, we determine the most probable model among a set of possible profiles from the analysis of a statistically poor spectrum. We also show how to calculate the probability distribution of the main spectral component without having to determine the spectrum model uniquely. For these two studies, we implement the program Nested_fit to calculate the different probability distributions and other related quantities. Nested_fit is a Fortran90/Python code developed over recent years for the analysis of atomic spectra. As indicated by its name, it is based on the nested sampling algorithm, which is presented in detail together with the program itself.

  18. An Analysis of Construction Accident Factors Based on Bayesian Network

    OpenAIRE

    Yunsheng Zhao; Jinyong Pei

    2013-01-01

    In this study, we analyze construction accident factors based on a Bayesian network. First, accident cases are analyzed with the fault tree method, which identifies all the factors causing the accidents; the factors are then analyzed qualitatively and quantitatively with the Bayesian network method; finally, a safety management program is determined to guide safety operations. The results of this study show that a bad geological environment has the largest posterio...

  19. CytoBayesJ: software tools for Bayesian analysis of cytogenetic radiation dosimetry data.

    Science.gov (United States)

    Ainsbury, Elizabeth A; Vinnikov, Volodymyr; Puig, Pedro; Maznyk, Nataliya; Rothkamm, Kai; Lloyd, David C

    2013-08-30

    A number of authors have suggested that a Bayesian approach may be most appropriate for analysis of cytogenetic radiation dosimetry data. In the Bayesian framework, probability of an event is described in terms of previous expectations and uncertainty. Previously existing, or prior, information is used in combination with experimental results to infer probabilities or the likelihood that a hypothesis is true. It has been shown that the Bayesian approach increases both the accuracy and quality assurance of radiation dose estimates. New software entitled CytoBayesJ has been developed with the aim of bringing Bayesian analysis to cytogenetic biodosimetry laboratory practice. CytoBayesJ takes a number of Bayesian or 'Bayesian like' methods that have been proposed in the literature and presents them to the user in the form of simple user-friendly tools, including testing for the most appropriate model for distribution of chromosome aberrations and calculations of posterior probability distributions. The individual tools are described in detail and relevant examples of the use of the methods and the corresponding CytoBayesJ software tools are given. In this way, the suitability of the Bayesian approach to biological radiation dosimetry is highlighted and its wider application encouraged by providing a user-friendly software interface and manual in English and Russian. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. A comprehensive probabilistic analysis model of oil pipelines network based on Bayesian network

    Science.gov (United States)

    Zhang, C.; Qin, T. X.; Jiang, B.; Huang, C.

    2018-02-01

    Oil pipeline networks are among the most important facilities for energy transportation, but accidents in them can result in serious disasters. Analysis models for these accidents have been established mainly with three methods: event trees, accident simulation, and Bayesian networks. Among these methods, the Bayesian network is suitable for probabilistic analysis, but not all the important influencing factors have been considered, and no deployment rule for the factors has been established. This paper proposes a probabilistic analysis model of oil pipeline networks based on a Bayesian network. Most of the important influencing factors, including key environmental conditions and emergency response, are considered in this model, and a deployment rule for these factors is introduced. The model can be used in probabilistic analysis and sensitivity analysis of oil pipeline network accidents.
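    For small discrete networks, inference in a Bayesian network of the kind described reduces to enumeration over the joint distribution. The toy network below (corrosion and third-party damage as causes of a leak) uses illustrative probabilities only and is not the model proposed in the paper.

```python
from itertools import product

# Toy pipeline-accident network; all probabilities are illustrative.
# Roots: Corrosion (C), Third-party damage (T). Leak (L) depends on both.
p_c = 0.10
p_t = 0.05
p_leak = {(True, True): 0.90, (True, False): 0.30,
          (False, True): 0.60, (False, False): 0.01}

def joint(c, t, l):
    """Joint probability P(C=c, T=t, L=l) from the network factorisation."""
    pc = p_c if c else 1.0 - p_c
    pt = p_t if t else 1.0 - p_t
    pl = p_leak[(c, t)] if l else 1.0 - p_leak[(c, t)]
    return pc * pt * pl

# Diagnostic query P(C=True | L=True) by enumerating the joint.
num = sum(joint(True, t, True) for t in (True, False))
den = sum(joint(c, t, True) for c, t in product((True, False), repeat=2))
p_corrosion_given_leak = num / den
```

Observing a leak raises the probability of corrosion well above its 10% prior, which is the kind of diagnostic and sensitivity reasoning the proposed model supports at the scale of a full pipeline network.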

  1. APPLICATION OF BAYESIAN MONTE CARLO ANALYSIS TO A LAGRANGIAN PHOTOCHEMICAL AIR QUALITY MODEL. (R824792)

    Science.gov (United States)

    Uncertainties in ozone concentrations predicted with a Lagrangian photochemical air quality model have been estimated using Bayesian Monte Carlo (BMC) analysis. Bayesian Monte Carlo analysis provides a means of combining subjective "prior" uncertainty estimates developed ...

  2. Direct numerical simulation of water droplet coalescence in the oil

    International Nuclear Information System (INIS)

    Mohammadi, Mehdi; Shahhosseini, Shahrokh; Bayat, Mahmoud

    2012-01-01

    Highlights: ► The VOF computational technique has been used to simulate coalescence of two water droplets in oil. ► The model was validated against experimental data for binary droplet coalescence. ► Based on the CFD simulation results, a correlation has been proposed to predict the coalescence time. - Abstract: Coalescence of two water droplets in oil was simulated using Computational Fluid Dynamics (CFD) techniques. The finite volume numerical method was applied to solve the Navier–Stokes equations in conjunction with the Volume of Fluid (VOF) approach for interface tracking. The effects of several parameters, namely the collision velocity, off-center collision parameter, oil viscosity and water–oil interfacial tension, on the coalescence time were investigated. The simulation results were validated against experimental data available in the literature. The results revealed that quicker coalescence is achieved when collisions are head-on or the droplets approach each other at high velocity. In addition, low oil viscosities and large water–oil interfacial tensions lead to shorter coalescence times. Moreover, a correlation was developed to predict coalescence efficiency as a function of these parameters.

  3. Conjunction analysis and propositional logic in fMRI data analysis using Bayesian statistics.

    Science.gov (United States)

    Rudert, Thomas; Lohmann, Gabriele

    2008-12-01

    To evaluate logical expressions over different effects in data analyses using the general linear model (GLM) and to evaluate logical expressions over different posterior probability maps (PPMs). In functional magnetic resonance imaging (fMRI) data analysis, the GLM is applied to estimate unknown regression parameters. Based on the GLM, Bayesian statistics can be used to determine the probability of a conjunction, disjunction, implication, or any other arbitrary logical expression over different effects or contrasts. For second-level inferences, PPMs from individual sessions or subjects are utilized. These PPMs can be combined into a logical expression and its probability can be computed. The methods proposed in this article are applied to data from a STROOP experiment and compared to conjunction analysis approaches for test statistics. The combination of Bayesian statistics with propositional logic provides a new approach for data analysis in fMRI. Two different methods are introduced for propositional logic: the first for analyses using the GLM and the second for common inferences about different probability maps. The methods introduced extend the idea of conjunction analysis to full propositional logic and adapt it from test statistics to Bayesian statistics. The new approaches allow inferences that are not possible with known standard methods in fMRI. (c) 2008 Wiley-Liss, Inc.
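As a toy illustration of the second method (inference over probability maps), posterior probability maps can be combined voxel-wise under a logical expression. The sketch below assumes independence between effects, an assumption made here for simplicity; the paper's approach is more general.

```python
import numpy as np

# Combining posterior probability maps (PPMs) with propositional logic,
# assuming voxel-wise independence between the two effects (an assumption
# made only for this illustration).
ppm_a = np.array([0.9, 0.2, 0.7])   # P(effect A) per voxel (toy values)
ppm_b = np.array([0.8, 0.6, 0.1])   # P(effect B) per voxel (toy values)

conj = ppm_a * ppm_b                   # P(A and B)
disj = ppm_a + ppm_b - ppm_a * ppm_b   # P(A or B)
impl = 1.0 - ppm_a * (1.0 - ppm_b)     # P(A implies B) = 1 - P(A and not B)
```

The resulting maps can then be thresholded like any PPM, e.g. keeping voxels with `conj > 0.95`.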

  4. Explosive coalescence of Magnetic Islands

    International Nuclear Information System (INIS)

    Tajima, T.; Sakai, J.I.

    1985-04-01

    An explosive reconnection process associated with nonlinear evolution of the coalescence instability is found through studies of particle and magnetohydrodynamic simulations. The explosive coalescence is a self-similar process of magnetic collapse, in which the magnetic and electrostatic energies and temperatures explode toward the explosion time t0 as (t0 - t)^(-8/3), (t0 - t)^(-4), and (t0 - t)^(-8/3), respectively. Ensuing amplitude oscillations in these quantities are identified by deriving an equation of motion for the scale factor in the Sagdeev potential

  5. Bayesian benefits with JASP

    NARCIS (Netherlands)

    Marsman, M.; Wagenmakers, E.-J.

    2017-01-01

    We illustrate the Bayesian approach to data analysis using the newly developed statistical software program JASP. With JASP, researchers are able to take advantage of the benefits that the Bayesian framework has to offer in terms of parameter estimation and hypothesis testing. The Bayesian

  6. Toward a DNA taxonomy of Alpine Rhithrogena (Ephemeroptera: Heptageniidae) using a mixed Yule-coalescent analysis of mitochondrial and nuclear DNA.

    Directory of Open Access Journals (Sweden)

    Laurent Vuataz

    Full Text Available Aquatic larvae of many Rhithrogena mayflies (Ephemeroptera inhabit sensitive Alpine environments. A number of species are on the IUCN Red List and many recognized species have restricted distributions and are of conservation interest. Despite their ecological and conservation importance, ambiguous morphological differences among closely related species suggest that the current taxonomy may not accurately reflect the evolutionary diversity of the group. Here we examined the species status of nearly 50% of European Rhithrogena diversity using a widespread sampling scheme of Alpine species that included 22 type localities, general mixed Yule-coalescent (GMYC model analysis of one standard mtDNA marker and one newly developed nDNA marker, and morphological identification where possible. Using sequences from 533 individuals from 144 sampling localities, we observed significant clustering of the mitochondrial (cox1 marker into 31 GMYC species. Twenty-one of these could be identified based on the presence of topotypes (expertly identified specimens from the species' type locality or unambiguous morphology. These results strongly suggest the presence of both cryptic diversity and taxonomic oversplitting in Rhithrogena. Significant clustering was not detected with protein-coding nuclear PEPCK, although nine GMYC species were congruent with well supported terminal clusters of nDNA. Lack of greater congruence in the two data sets may be the result of incomplete sorting of ancestral polymorphism. Bayesian phylogenetic analyses of both gene regions recovered four of the six recognized Rhithrogena species groups in our samples as monophyletic. Future development of more nuclear markers would facilitate multi-locus analysis of unresolved, closely related species pairs. The DNA taxonomy developed here lays the groundwork for a future revision of the important but cryptic Rhithrogena genus in Europe.

  7. Current trends in Bayesian methodology with applications

    CERN Document Server

    Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia

    2015-01-01

    Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics. Each chapter is self-contained and focuses on

  8. On incomplete sampling under birth-death models and connections to the sampling-based coalescent.

    Science.gov (United States)

    Stadler, Tanja

    2009-11-07

    The constant rate birth-death process is used as a stochastic model for many biological systems, for example phylogenies or disease transmission. As the biological data are usually not fully available, it is crucial to understand the effect of incomplete sampling. In this paper, we analyze the constant rate birth-death process with incomplete sampling. We derive the density of the bifurcation events for trees on n leaves which evolved under this birth-death-sampling process. This density is used for calculating prior distributions in Bayesian inference programs and for efficiently simulating trees. We show that the birth-death-sampling process can be interpreted as a birth-death process with reduced rates and complete sampling, which implies that joint inference of birth rate, death rate and sampling probability is not possible. The birth-death-sampling process is compared to the sampling-based population genetics model, the coalescent. It is shown that despite many similarities between these two models, the distribution of bifurcation times remains different even in the case of very large population sizes. We illustrate these findings on a Hepatitis C virus dataset from Egypt. The transmission time estimates are significantly different: the widely used Gamma statistic even changes its sign from negative to positive when switching from the coalescent to the birth-death process.
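The reduced-rate equivalence can be made concrete. One form of the transformation commonly stated in the literature (treated here as my reading of the result, not a quotation from the paper) maps a birth-death-sampling process (λ, μ, ρ) to a complete-sampling process with λ' = ρλ and μ' = μ - λ(1 - ρ); the net diversification rate λ - μ is preserved, which is why the three parameters cannot be jointly inferred.

```python
def equivalent_complete_sampling_rates(birth, death, rho):
    """Map a birth-death-sampling process (birth rate, death rate, sampling
    probability rho) to a birth-death process with complete sampling giving
    the same reconstructed-tree distribution. The exact form of this mapping
    is an assumption here, as commonly stated in the literature."""
    return rho * birth, death - birth * (1.0 - rho)

# Example: birth 2.0, death 1.0, half the tips sampled.
b2, d2 = equivalent_complete_sampling_rates(2.0, 1.0, 0.5)
# Net diversification (birth - death) is invariant under the mapping,
# illustrating the non-identifiability noted in the abstract.
```

Here the half-sampled process collapses to a pure-birth (Yule) process with rate 1.0, the same net diversification as the original.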

  9. Bunch coalescing in the Fermilab Main Ring

    International Nuclear Information System (INIS)

    Wildman, D.; Martin, P.; Meisner, K.; Miller, H.W.

    1987-01-01

    A new RF system has been installed in the Fermilab Main Ring to coalesce up to 13 individual bunches of protons or antiprotons into a single high-intensity bunch. The coalescing process consists of adiabatically reducing the h=1113 Main Ring RF voltage from 1 MV to less than 1 kV, capturing the debunched beam in a linearized h=53 and h=106 bucket, rotating for a quarter of a synchrotron oscillation period, and then recapturing the beam in a single h=1113 bucket. The new system is described and the results of recent coalescing experiments are compared with computer-generated particle tracking simulations

  10. Surfactant effect on drop coalescence and film drainage hydrodynamics

    Science.gov (United States)

    Weheliye, Weheliye; Chinaud, Maxime; Voulgaropoulos, Victor; Angeli, Panagiota

    2015-11-01

    Coalescence of a drop on an aqueous-organic interface is studied in two test geometries: a rectangular acrylic vessel and a Hele-Shaw cell (two parallel plates placed 2 mm apart). Time-resolved Particle Image Velocimetry (PIV) measurements provide information on the hydrodynamics during the bouncing stage of the droplet and on the vortices generated in the bulk fluid after the droplet has coalesced. The velocity field inside the droplet during its coalescence is presented. By localizing the rupture point of the coalescence in the quasi-two-dimensional cell, the film drainage dynamics are discussed by acquiring the flow velocity from PIV measurements with a straddling camera. The effect of surface tension forces on the coalescence of the droplet is investigated by introducing surface-active agents at concentrations extending to both sides of the critical micelle concentration.

  11. Bayesian analysis of magnetic island dynamics

    International Nuclear Information System (INIS)

    Preuss, R.; Maraschek, M.; Zohm, H.; Dose, V.

    2003-01-01

    We examine a first-order differential equation in time used to describe magnetic islands in magnetically confined plasmas. The free parameters of this equation are obtained by employing Bayesian probability theory. Additionally, a typical Bayesian change-point problem is solved in the process of obtaining the data

  12. Coalescence dynamics of mobile and immobile fluid interfaces

    KAUST Repository

    Vakarelski, Ivan Uriev; Manica, Rogerio; Li, Erqiang; Basheva, Elka S; Chan, Derek Y. C.; Thoroddsen, Sigurdur T.

    2018-01-01

    Coalescence dynamics between deformable bubbles and droplets can be dramatically affected by the mobility of the interfaces with fully tangentially mobile bubble-liquid or droplet-liquid interfaces expected to accelerate the coalescence by orders

  13. Introduction to Bayesian statistics

    CERN Document Server

    Bolstad, William M

    2017-01-01

    There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly-added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly-developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for Multivariate Normal mean vector; Bayesian inference for Multiple Linear Regression Model; and Computati...

  14. Bayesian-network-based safety risk analysis in construction projects

    International Nuclear Information System (INIS)

    Zhang, Limao; Wu, Xianguo; Skibniewski, Miroslaw J.; Zhong, Jingbing; Lu, Yujie

    2014-01-01

    This paper presents a systemic decision support approach for safety risk analysis under uncertainty in tunnel construction. A Fuzzy Bayesian Network (FBN) is used to investigate causal relationships between tunnel-induced damage and its influential variables based upon a risk/hazard mechanism analysis. To overcome limitations of current probability estimation, an expert confidence indicator is proposed to ensure the reliability of the surveyed data for fuzzy probability assessment of basic risk factors. A detailed fuzzy-based inference procedure is developed, capable of deductive reasoning, sensitivity analysis and abductive reasoning. The “3σ criterion” is adopted to calculate the characteristic values of a triangular fuzzy number in the probability fuzzification process, and the α-weighted valuation method is adopted for defuzzification. The construction safety analysis process is extended to the entire life cycle of risk-prone events, including pre-accident, during-construction continuous, and post-accident control. A typical hazard concerning tunnel leakage in the construction of the Wuhan Yangtze Metro Tunnel in China is presented as a case study to verify the applicability of the proposed approach. The results demonstrate the feasibility of the proposed approach and its application potential. A comparison of the advantages and disadvantages of FBN and fuzzy fault tree analysis (FFTA) as risk analysis tools is also conducted. The proposed approach can be used to provide guidelines for safety analysis and management in construction projects, and thus increase the likelihood of a successful project in a complex environment. - Highlights: • A systemic Bayesian-network-based approach for safety risk analysis is developed. • An expert confidence indicator for probability fuzzification is proposed. • The safety risk analysis process is extended to the entire life cycle of risk-prone events. • A typical

  15. Naive Bayesian classifiers for multinomial features: a theoretical analysis

    CSIR Research Space (South Africa)

    Van Dyk, E

    2007-11-01

    Full Text Available The authors investigate the use of naive Bayesian classifiers for multinomial feature spaces and derive error estimates for these classifiers. The error analysis is done by developing a mathematical model to estimate the probability density...
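A minimal multinomial naive Bayes classifier of the kind analyzed in this record; this is a generic textbook sketch (with Laplace smoothing), not the authors' error model.

```python
import numpy as np

# Generic multinomial naive Bayes: per-class feature counts define the
# multinomial likelihoods; prediction maximizes log-prior + log-likelihood.
class MultinomialNB:
    def fit(self, X, y, alpha=1.0):
        X = np.asarray(X, dtype=float)
        self.classes = np.unique(y)
        self.log_prior = np.log([np.mean(y == c) for c in self.classes])
        # Laplace-smoothed feature counts per class
        counts = np.array([X[y == c].sum(axis=0) + alpha for c in self.classes])
        self.log_lik = np.log(counts / counts.sum(axis=1, keepdims=True))
        return self

    def predict(self, X):
        scores = np.asarray(X) @ self.log_lik.T + self.log_prior
        return self.classes[np.argmax(scores, axis=1)]

# Toy data: class 0 favors feature 0, class 1 favors feature 1.
X = [[5, 1], [4, 0], [1, 6], [0, 5]]
y = np.array([0, 0, 1, 1])
model = MultinomialNB().fit(X, y)
```

On this toy data `model.predict([[6, 1], [0, 7]])` recovers the expected classes; the theoretical error rates studied in the record describe how often such predictions fail as feature dimensionality and class overlap grow.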

  16. On spatial coalescents with multiple mergers in two dimensions.

    Science.gov (United States)

    Heuer, Benjamin; Sturm, Anja

    2013-08-01

    We consider the genealogy of a sample of individuals taken from a spatially structured population when the variance of the offspring distribution is relatively large. The space is structured into discrete sites of a graph G. If the population size at each site is large, spatial coalescents with multiple mergers, so called spatial Λ-coalescents, for which ancestral lines migrate in space and coalesce according to some Λ-coalescent mechanism, are shown to be appropriate approximations to the genealogy of a sample of individuals. We then consider as the graph G the two dimensional torus with side length 2L+1 and show that as L tends to infinity, and time is rescaled appropriately, the partition structure of spatial Λ-coalescents of individuals sampled far enough apart converges to the partition structure of a non-spatial Kingman coalescent. From a biological point of view this means that in certain circumstances both the spatial structure as well as larger variances of the underlying offspring distribution are harder to detect from the sample. However, supplemental simulations show that for moderately large L the different structure is still evident. Copyright © 2012 Elsevier Inc. All rights reserved.

  17. Bayesian Sensitivity Analysis of Statistical Models with Missing Data.

    Science.gov (United States)

    Zhu, Hongtu; Ibrahim, Joseph G; Tang, Niansheng

    2014-04-01

    Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the non-ignorable missing at random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures.

  18. Medical decision making tools: Bayesian analysis and ROC analysis

    International Nuclear Information System (INIS)

    Lee, Byung Do

    2006-01-01

    During the diagnostic process of the various oral and maxillofacial lesions, we should consider the following: 'When should we order diagnostic tests? What tests should be ordered? How should we interpret the results clinically? And how should we use this frequently imperfect information to make optimal medical decision?' For the clinicians to make proper judgement, several decision making tools are suggested. This article discusses the concept of the diagnostic accuracy (sensitivity and specificity values) with several decision making tools such as decision matrix, ROC analysis and Bayesian analysis. The article also explain the introductory concept of ORAD program
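The Bayesian side of these decision tools reduces to updating a pretest probability (disease prevalence) with a test's sensitivity and specificity. A minimal sketch, with illustrative numbers:

```python
# Post-test probability of disease via Bayes' theorem, from sensitivity,
# specificity and prevalence. Numbers below are illustrative only.
def post_test_probability(sensitivity, specificity, prevalence, positive=True):
    if positive:
        tp = sensitivity * prevalence                   # true positives
        fp = (1.0 - specificity) * (1.0 - prevalence)   # false positives
        return tp / (tp + fp)
    fn = (1.0 - sensitivity) * prevalence               # false negatives
    tn = specificity * (1.0 - prevalence)               # true negatives
    return fn / (fn + tn)

# Example: a test with 90% sensitivity and 95% specificity for a lesion
# seen in 10% of referred patients (hypothetical figures).
p_pos = post_test_probability(0.90, 0.95, 0.10)
```

Even with good sensitivity and specificity, a positive result here yields only about a two-thirds probability of disease, which is the kind of counterintuitive outcome these decision tools are meant to expose.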

  19. An Overview of Bayesian Methods for Neural Spike Train Analysis

    Directory of Open Access Journals (Sweden)

    Zhe Chen

    2013-01-01

    Full Text Available Neural spike train analysis is an important task in computational neuroscience which aims to understand neural mechanisms and gain insights into neural circuits. With the advancement of multielectrode recording and imaging technologies, it has become increasingly demanding to develop statistical tools for analyzing large neuronal ensemble spike activity. Here we present a tutorial overview of Bayesian methods and their representative applications in neural spike train analysis, at both single neuron and population levels. On the theoretical side, we focus on various approximate Bayesian inference techniques as applied to latent state and parameter estimation. On the application side, the topics include spike sorting, tuning curve estimation, neural encoding and decoding, deconvolution of spike trains from calcium imaging signals, and inference of neuronal functional connectivity and synchrony. Some research challenges and opportunities for neural spike train analysis are discussed.

  20. Coalescence of magnetic islands

    International Nuclear Information System (INIS)

    Pellat, R.

    1982-01-01

    The paper gives the analytical theory of the coalescence instability and of a new, one island, instability. These instabilities are expected to be relevant for the disruptions observed in Tokamak experiments and astrophysical plasmas

  1. Explosive coalescence of magnetic islands and explosive particle acceleration

    International Nuclear Information System (INIS)

    Tajima, T.; Sakai, J.I.

    1985-07-01

    An explosive reconnection process associated with the nonlinear evolution of the coalescence instability is found through studies of the electromagnetic particle simulation and the magnetohydrodynamic particle simulation. The explosive coalescence is a process of magnetic collapse, in which we find the magnetic and electrostatic field energies and temperatures (ion temperature in the coalescing direction, in particular) explode toward the explosion time t0 as (t0 - t)^(-8/3), (t0 - t)^(-4), and (t0 - t)^(-8/3), respectively, for a canonical case. Single-peak, double-peak, and triple-peak structures of magnetic energy, temperature, and electrostatic energy, respectively, are observed in the simulations as overshoot amplitude oscillations and are theoretically explained. The heuristic model of Brunel and Tajima is extended to this explosive coalescence in order to extract the basic process. Since the explosive coalescence exhibits self-similarity, a temporal universality, we theoretically search for a self-similar solution to the two-fluid plasma equations

  2. Bayesian inference – a way to combine statistical data and semantic analysis meaningfully

    Directory of Open Access Journals (Sweden)

    Eila Lindfors

    2011-11-01

    Full Text Available This article focuses on presenting the possibilities of Bayesian modelling (Finite Mixture Modelling) in the semantic analysis of statistically modelled data. The probability of a hypothesis in relation to the data available is an important question in inductive reasoning. Bayesian modelling allows the researcher to use many models at a time and provides tools to evaluate the goodness of different models. The researcher should always be aware that there is no such thing as the exact probability of an exact event. This is the reason for using probabilistic models. Each model presents a different perspective on the phenomenon in focus, and the researcher has to choose the most probable model with a view to previous research and the knowledge available. The idea of Bayesian modelling is illustrated here by presenting two different sets of data, one from craft science research (n=167) and the other (n=63) from educational research (Lindfors, 2007, 2002). The principles of how to build models and how to combine different profiles are described in the light of the research mentioned. Bayesian modelling is an analysis based on calculating probabilities in relation to a specific set of quantitative data. It is a tool for handling data and interpreting it semantically. The reliability of the analysis arises from an argumentation of which model can be selected from the model space as the basis for an interpretation, and on which arguments. Keywords: method, sloyd, Bayesian modelling, student teachers. URN:NBN:no-29959
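As an illustration of the finite mixture modelling this record discusses, here is a compact EM fit of a two-component one-dimensional Gaussian mixture; it is a generic sketch on synthetic data, not the author's software or datasets.

```python
import math
import random

# EM for a two-component 1-D Gaussian mixture: alternate soft assignments
# (E-step) with weighted re-estimation of the parameters (M-step).
def em_two_gaussians(xs, iters=200):
    m1, m2 = min(xs), max(xs)            # crude initial means
    s1 = s2 = (m2 - m1) / 4 or 1.0       # crude initial spreads
    w = 0.5                              # mixture weight of component 1
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point
        r = []
        for x in xs:
            p1 = w * math.exp(-0.5 * ((x - m1) / s1) ** 2) / s1
            p2 = (1 - w) * math.exp(-0.5 * ((x - m2) / s2) ** 2) / s2
            r.append(p1 / (p1 + p2))
        # M-step: re-estimate weight, means and standard deviations
        n1 = sum(r)
        n2 = len(xs) - n1
        w = n1 / len(xs)
        m1 = sum(ri * x for ri, x in zip(r, xs)) / n1
        m2 = sum((1 - ri) * x for ri, x in zip(r, xs)) / n2
        s1 = math.sqrt(sum(ri * (x - m1) ** 2 for ri, x in zip(r, xs)) / n1) or 1e-6
        s2 = math.sqrt(sum((1 - ri) * (x - m2) ** 2 for ri, x in zip(r, xs)) / n2) or 1e-6
    return sorted([m1, m2])

random.seed(0)
data = [random.gauss(0, 1) for _ in range(200)] + [random.gauss(6, 1) for _ in range(200)]
means = em_two_gaussians(data)
```

With two well-separated synthetic clusters the fitted means land close to the generating values; comparing fits with different numbers of components is one way to argue for "the most probable model" in the sense the article describes.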

  3. Application of a data-mining method based on Bayesian networks to lesion-deficit analysis

    Science.gov (United States)

    Herskovits, Edward H.; Gerring, Joan P.

    2003-01-01

    Although lesion-deficit analysis (LDA) has provided extensive information about structure-function associations in the human brain, LDA has suffered from the difficulties inherent to the analysis of spatial data, i.e., there are many more variables than subjects, and data may be difficult to model using standard distributions, such as the normal distribution. We herein describe a Bayesian method for LDA; this method is based on data-mining techniques that employ Bayesian networks to represent structure-function associations. These methods are computationally tractable, and can represent complex, nonlinear structure-function associations. When applied to the evaluation of data obtained from a study of the psychiatric sequelae of traumatic brain injury in children, this method generates a Bayesian network that demonstrates complex, nonlinear associations among lesions in the left caudate, right globus pallidus, right side of the corpus callosum, right caudate, and left thalamus, and subsequent development of attention-deficit hyperactivity disorder, confirming and extending our previous statistical analysis of these data. Furthermore, analysis of simulated data indicates that methods based on Bayesian networks may be more sensitive and specific for detecting associations among categorical variables than methods based on chi-square and Fisher exact statistics.

  4. Characterization of solids deposited on the modular caustic-side solvent extraction unit (MCU) coalescer media removed in October 2014

    Energy Technology Data Exchange (ETDEWEB)

    Fondeur, F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-03-01

    In February 2015, Savannah River National Laboratory (SRNL) received a Strip Effluent (SE) coalescer (FLT-304) from MCU. That coalescer was first installed at MCU in July 2014 and removed in October 2014. While it processed approximately 31,400 gallons of strip solution, the pressure drop steadily increased from 1 psi to beyond the administrative limit of 20 psi. Physical and chemical analyses were conducted on this coalescer to determine the mechanism that led to its plugging. Characterization revealed the adsorption of organics containing amines as well as MCU modifier. The amines probably come from the decomposition of the suppressor (TiDG) as well as from bacteria. This adsorption may have changed the surface energetics (characteristics) of the coalescer fibers and therefore their wetting behavior. A very small amount of inorganic solids was found to have deposited on this coalescer (possibly an artifact of cleaning the coalescer with boric acid). However, we believe that inorganic precipitation, as has been seen in the past, did not play a role in the high pressure-drop rise of this coalescer. With regard to the current practice of reducing the radioactive content of the SE coalescer, it is recommended that future SE coalescers be flushed with 10 mM boric acid, which is currently used at MCU. Plugging of the SE coalescer was most likely due to the formation and accumulation of a water-in-oil emulsion that reduced the overall porosity of the coalescer. There is also evidence that a bimodal oil particle distribution may have entered and deposited in the coalescer and caused the initial increase in pressure drop.

  5. Bayesian meta-analysis models for microarray data: a comparative study

    Directory of Open Access Journals (Sweden)

    Song Joon J

    2007-03-01

    Full Text Available Abstract Background With the growing abundance of microarray data, statistical methods are increasingly needed to integrate results across studies. Two common approaches for meta-analysis of microarrays include either combining gene expression measures across studies or combining summaries such as p-values, probabilities or ranks. Here, we compare two Bayesian meta-analysis models that are analogous to these methods. Results Two Bayesian meta-analysis models for microarray data have recently been introduced. The first model combines standardized gene expression measures across studies into an overall mean, accounting for inter-study variability, while the second combines probabilities of differential expression without combining expression values. Both models produce the gene-specific posterior probability of differential expression, which is the basis for inference. Since the standardized expression integration model includes inter-study variability, it may improve accuracy of results versus the probability integration model. However, due to the small number of studies typical in microarray meta-analyses, the variability between studies is challenging to estimate. The probability integration model eliminates the need to model variability between studies, and thus its implementation is more straightforward. We found in simulations of two and five studies that combining probabilities outperformed combining standardized gene expression measures for three comparison values: the percent of true discovered genes in meta-analysis versus individual studies; the percent of true genes omitted in meta-analysis versus separate studies, and the number of true discovered genes for fixed levels of Bayesian false discovery. We identified similar results when pooling two independent studies of Bacillus subtilis. We assumed that each study was produced from the same microarray platform with only two conditions: a treatment and control, and that the data sets

  6. Bayesian Inference on Gravitational Waves

    Directory of Open Access Journals (Sweden)

    Asad Ali

    2015-12-01

    Full Text Available The Bayesian approach is increasingly becoming popular among the astrophysics data analysis communities. However, the Pakistan statistics communities are unaware of this fertile interaction between the two disciplines. Bayesian methods have been in use to address astronomical problems since the very birth of the Bayes probability in the eighteenth century. Today the Bayesian methods for the detection and parameter estimation of gravitational waves have solid theoretical grounds with a strong promise for realistic applications. This article aims to introduce the Pakistan statistics communities to the applications of Bayesian Monte Carlo methods in the analysis of gravitational wave data, with an overview of Bayesian signal detection and estimation methods and a demonstration through a couple of simplified examples.
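In the spirit of the simplified examples this record promises, here is a minimal Bayesian Monte Carlo sketch: a random-walk Metropolis sampler estimating the amplitude of a known-frequency sinusoid buried in Gaussian noise. All signal and tuning parameters are illustrative, and this is far simpler than a real gravitational-wave analysis.

```python
import math
import random

# Synthetic data: sinusoid of known frequency plus Gaussian noise.
random.seed(1)
true_amp, sigma, freq = 2.0, 1.0, 0.5
t = [i * 0.1 for i in range(200)]
data = [true_amp * math.sin(2 * math.pi * freq * ti) + random.gauss(0, sigma)
        for ti in t]

def log_posterior(amp):
    # Flat prior on the amplitude; Gaussian likelihood for the residuals.
    return -0.5 * sum((d - amp * math.sin(2 * math.pi * freq * ti)) ** 2
                      for d, ti in zip(data, t)) / sigma ** 2

# Random-walk Metropolis: propose, then accept with probability
# min(1, posterior ratio).
amp, lp = 0.0, log_posterior(0.0)
samples = []
for _ in range(5000):
    prop = amp + random.gauss(0, 0.2)
    lp_prop = log_posterior(prop)
    if math.log(random.random()) < lp_prop - lp:
        amp, lp = prop, lp_prop
    samples.append(amp)
posterior_mean = sum(samples[1000:]) / len(samples[1000:])
```

After discarding burn-in, the posterior mean of the amplitude settles near the injected value, with the posterior spread quantifying the estimation uncertainty.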

  7. Gaussian process-based Bayesian nonparametric inference of population size trajectories from gene genealogies.

    Science.gov (United States)

    Palacios, Julia A; Minin, Vladimir N

    2013-03-01

    Changes in population size influence genetic diversity of the population and, as a result, leave a signature of these changes in individual genomes in the population. We are interested in the inverse problem of reconstructing past population dynamics from genomic data. We start with a standard framework based on the coalescent, a stochastic process that generates genealogies connecting randomly sampled individuals from the population of interest. These genealogies serve as a glue between the population demographic history and genomic sequences. It turns out that only the times of genealogical lineage coalescences contain information about population size dynamics. Viewing these coalescent times as a point process, estimating population size trajectories is equivalent to estimating a conditional intensity of this point process. Therefore, our inverse problem is similar to estimating an inhomogeneous Poisson process intensity function. We demonstrate how recent advances in Gaussian process-based nonparametric inference for Poisson processes can be extended to Bayesian nonparametric estimation of population size dynamics under the coalescent. We compare our Gaussian process (GP) approach to one of the state-of-the-art Gaussian Markov random field (GMRF) methods for estimating population trajectories. Using simulated data, we demonstrate that our method has better accuracy and precision. Next, we analyze two genealogies reconstructed from real sequences of hepatitis C and human Influenza A viruses. In both cases, we recover more believed aspects of the viral demographic histories than the GMRF approach. We also find that our GP method produces more reasonable uncertainty estimates than the GMRF method. Copyright © 2013, The International Biometric Society.
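The coalescent machinery this record relies on can be illustrated directly: under a constant effective population size N, the waiting time while k lineages remain is exponential with rate k(k-1)/(2N), so N can be recovered from coalescent times by maximum likelihood. This is a toy constant-size illustration, not the paper's Gaussian-process method.

```python
import random

# Simulate coalescent waiting times under constant population size N and
# recover N by maximum likelihood: N_hat = sum_k [k(k-1)/2 * T_k] / m,
# where m is the number of coalescent intervals.
random.seed(2)
N_true, n_samples, n_trees = 1000.0, 20, 500

def simulate_waiting_times(n, N):
    # Waiting time while k lineages remain ~ Exp(rate = k(k-1)/(2N))
    return [random.expovariate(k * (k - 1) / (2.0 * N))
            for k in range(n, 1, -1)]

num = den = 0.0
for _ in range(n_trees):
    k = n_samples
    for T in simulate_waiting_times(n_samples, N_true):
        num += k * (k - 1) / 2.0 * T
        den += 1
        k -= 1
N_hat = num / den
```

Averaged over many simulated genealogies, the estimate lands close to the true size; the methods in the record generalize this to a time-varying N(t), treated as the intensity of the coalescent point process.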

  8. Bayesian Analysis of Individual Level Personality Dynamics

    Directory of Open Access Journals (Sweden)

    Edward Cripps

    2016-07-01

    A Bayesian technique for analysing within-person processes at the level of the individual is presented. The approach is used to examine whether the patterns of within-person responses on a 12-trial simulation task are consistent with the predictions of ITA theory (Dweck, 1999). ITA theory states that the performance of an individual with an entity theory of ability is more likely to spiral down following a failure experience than the performance of an individual with an incremental theory of ability. This is because entity theorists interpret failure experiences as evidence of a lack of ability, which they believe is largely innate and therefore relatively fixed; whilst incremental theorists believe in the malleability of abilities and interpret failure experiences as evidence of more controllable factors such as poor strategy or lack of effort. The results of our analyses support ITA theory at both the within- and between-person levels of analysis and demonstrate the benefits of Bayesian techniques for the analysis of within-person processes. These include more formal specification of the theory and the ability to draw inferences about each individual, which allows for more nuanced interpretations of individuals within a personality category, such as differences in the individual probabilities of spiralling. While Bayesian techniques have many potential advantages for the analysis of within-person processes at the individual level, ease of use is not one of them for psychologists trained in traditional frequentist statistical techniques.

  9. Cryptic genetic diversity is paramount in small-bodied amphibians of the genus Euparkerella (Anura: Craugastoridae) endemic to the Brazilian Atlantic forest.

    Directory of Open Access Journals (Sweden)

    Luciana A Fusinatto

    Morphological similarity associated with restricted distributions and low dispersal abilities makes the direct-developing "Terrarana" frogs of the genus Euparkerella a good model for examining diversification processes. We here infer phylogenetic relationships within the genus Euparkerella, using DNA sequence data from one mitochondrial and four nuclear genes coupled with traditional Bayesian phylogenetic reconstruction approaches and more recent coalescent methods of species tree inference. We also used Bayesian clustering analysis and a recent Bayesian coalescent-based approach specifically to infer species delimitation. The analysis of 39 individuals from the four known Euparkerella species uncovered high levels of genetic diversity, especially within the two previously morphologically defined E. cochranae and E. brasiliensis. Within these species, the gene trees at five independent loci and trees from combined data (the concatenated dataset and the species tree) uncovered six deeply diverged and geographically coherent evolutionary units, which may have diverged between the Miocene and the Pleistocene. These six units were also uncovered in the Bayesian clustering analysis and supported by the Bayesian coalescent-based species delimitation (BPP) and the Genealogical Sorting Index (GSI), thus providing strong evidence for underestimation of the current levels of diversity within Euparkerella. The cryptic diversity now uncovered opens new opportunities to examine the origins and maintenance of microendemism in the context of spatial heterogeneity and/or human-induced fragmentation of the highly threatened Brazilian Atlantic forest hotspot.

  10. Coalescence preference and droplet size inequality during fluid phase segregation

    Science.gov (United States)

    Roy, Sutapa

    2018-02-01

    Using molecular dynamics simulations and scaling arguments, we investigate the coalescence preference dynamics of liquid droplets in a phase-segregating, off-critical, single-component fluid. It is observed that the preferential distance of the product drop from its larger parent during a coalescence event becomes smaller for larger parent size inequality. The relative coalescence position exhibits a power-law dependence on the parent size ratio with an exponent q ≃ 3.1. This value of q stands in strong contrast with the values 2.1 and 5.1 reported earlier in the literature. The dissimilarity is explained by considering the underlying coalescence mechanisms.

  11. Partial coalescence from bubbles to drops

    KAUST Repository

    Zhang, F. H.

    2015-10-07

    The coalescence of drops is a fundamental process in the coarsening of emulsions. However, counter-intuitively, this coalescence process can produce a satellite, approximately half the size of the original drop, which is detrimental to the overall coarsening. This also occurs during the coalescence of bubbles, although the resulting satellite is much smaller, approximately 10%. To understand this difference, we have conducted a set of coalescence experiments using xenon bubbles inside a pressure chamber, where we can continuously raise the pressure from 1 up to 85 atm and thereby vary the density ratio between the inner and outer fluid from 0.005 up to unity. Using high-speed video imaging, we observe a continuous increase in satellite size as the inner density is varied from the bubble to emulsion-droplet conditions, with the most rapid changes occurring as the bubble density grows up to 15% of that of the surrounding liquid. We propose a model that successfully relates the satellite size to the capillary wave mode responsible for its pinch-off and the overall deformations from the drainage. The wavelength of the primary wave changes during its travel to the apex, with the instantaneous speed adjusting to the local wavelength. By estimating the travel time of this wave mode on the bubble surface, we also show that the model is consistent with the experiments. This wavenumber is determined by both the global drainage as well as the interface shapes during the rapid coalescence in the neck connecting the two drops or bubbles. The rate of drainage is shown to scale with the density of the inner fluid. Empirically, we find that the pinch-off occurs when 60% of the bubble fluid has drained from it. Numerical simulations using the volume-of-fluid method with dynamic adaptive grid refinement can reproduce these dynamics, as well as show the associated vortical structure and stirring of the coalescing fluid masses. Enhanced stirring is observed for cases with second

  12. Coalescence and movement of nanobubbles studied with tapping mode AFM and tip-bubble interaction analysis

    International Nuclear Information System (INIS)

    Bhushan, Bharat; Wang Yuliang; Maali, Abdelhamid

    2008-01-01

    Imaging of a polystyrene (PS) coated silicon wafer immersed in deionized (DI) water was conducted using atomic force microscopy (AFM) in the tapping mode (TMAFM). As reported earlier, spherical cap-like domains, referred to as nanobubbles, were observed to be distributed on the PS surface. Experiments reveal that, in addition to the well-known parameter of scan load, scan speed is also an important parameter affecting nanobubble coalescence. The process of nanobubble coalescence was studied. It was found that during coalescence, small nanobubbles were easily moved and merged into bigger ones. Based on the interaction between the AFM cantilever tip and a bubble in the so-called force modulation mode of TMAFM, bubble height and adhesive force information for a given bubble were extracted. A viscoelastic model is used to obtain the interaction stiffness and damping coefficient, which provides a method for obtaining the mechanical properties of nanobubbles. The model was further used to study the effect of the surface tension force on the attractive interaction force, and of contact angle hysteresis on changes in the interaction damping coefficient during tip-bubble interaction.

  13. Implementation of a Bayesian Engine for Uncertainty Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Leng Vang; Curtis Smith; Steven Prescott

    2014-08-01

    In probabilistic risk assessment, it is important to have an environment where analysts have access to shared, secure high-performance computing and a statistical analysis tool package. As part of the advanced small modular reactor probabilistic risk analysis framework implementation, we have identified the need for advanced Bayesian computations. However, in order to make this technology available to non-specialists, there is also a need for a simplified tool that allows users to author models and evaluate them within this framework. As a proof of concept, we have implemented an advanced open-source Bayesian inference tool, OpenBUGS, within the browser-based cloud risk analysis framework that is under development at the Idaho National Laboratory. This development, the "OpenBUGS Scripter", has been implemented as a client-side, visual, web-based integrated development environment for creating OpenBUGS language scripts. It depends on the shared server environment to execute the generated scripts and to transmit results back to the user. The visual models are in the form of linked diagrams, from which we automatically create the applicable OpenBUGS script that matches the diagram. These diagrams can be saved locally or stored on the server environment to be shared with other users.
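    As a rough illustration of the kind of computation such a Bayesian engine performs (written in Python rather than OpenBUGS, with invented failure-count data): a random-walk Metropolis sampler for a Poisson failure rate with a Gamma prior, checked against the closed-form conjugate posterior.

```python
import math
import random

random.seed(0)

# Hypothetical component failure counts over equal exposure periods
counts = [2, 0, 1, 3, 1, 0, 2]
a, b = 1.0, 1.0   # Gamma(a, b) prior on the failure rate lam

def log_post(lam):
    """Log posterior (up to a constant): Gamma prior + Poisson likelihood."""
    if lam <= 0:
        return float("-inf")
    lp = (a - 1) * math.log(lam) - b * lam
    for c in counts:
        lp += c * math.log(lam) - lam
    return lp

# Random-walk Metropolis, the workhorse behind MCMC engines like OpenBUGS
lam, samples = 1.0, []
for i in range(20000):
    prop = lam + random.gauss(0, 0.5)
    if math.log(random.random()) < log_post(prop) - log_post(lam):
        lam = prop
    if i >= 5000:              # discard burn-in
        samples.append(lam)

mcmc_mean = sum(samples) / len(samples)
# Conjugacy gives the exact answer: posterior is Gamma(a + sum(counts), b + n)
exact_mean = (a + sum(counts)) / (b + len(counts))
print(round(mcmc_mean, 2), round(exact_mean, 2))
```

    The conjugate check is the point of the sketch: for models without closed forms, only the sampler half survives, which is exactly the niche tools like OpenBUGS fill.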

  14. Use of SAMC for Bayesian analysis of statistical models with intractable normalizing constants

    KAUST Repository

    Jin, Ick Hoon

    2014-03-01

    Statistical inference for models with intractable normalizing constants has attracted much attention. During the past two decades, various approximation- or simulation-based methods have been proposed for the problem, such as the Monte Carlo maximum likelihood method and the auxiliary variable Markov chain Monte Carlo (MCMC) methods. The Bayesian stochastic approximation Monte Carlo algorithm specifically addresses this problem: it works by sampling from a sequence of approximate distributions whose average converges to the target posterior distribution, where the approximate distributions can be achieved using the stochastic approximation Monte Carlo algorithm. A strong law of large numbers is established for the Bayesian stochastic approximation Monte Carlo estimator under mild conditions. Compared to the Monte Carlo maximum likelihood method, the Bayesian stochastic approximation Monte Carlo algorithm is more robust to the initial guess of model parameters. Compared to the auxiliary variable MCMC methods, it avoids the requirement for perfect samples, and thus can be applied to many models for which perfect sampling is unavailable or very expensive. The Bayesian stochastic approximation Monte Carlo algorithm also provides a general framework for approximate Bayesian analysis. © 2012 Elsevier B.V. All rights reserved.

  15. Bayesian statistics applied to neutron activation data for reactor flux spectrum analysis

    International Nuclear Information System (INIS)

    Chiesa, Davide; Previtali, Ezio; Sisti, Monica

    2014-01-01

    Highlights: • Bayesian statistics to analyze the neutron flux spectrum from activation data. • Rigorous statistical approach for accurate evaluation of the neutron flux groups. • Cross section and activation data uncertainties included in the problem solution. • Flexible methodology applied to analyze different nuclear reactor flux spectra. • The results are in good agreement with the MCNP simulations of neutron fluxes. - Abstract: In this paper, we present a statistical method, based on Bayesian statistics, to analyze the neutron flux spectrum from the activation data of different isotopes. The experimental data were acquired during a neutron activation experiment performed at the TRIGA Mark II reactor of Pavia University (Italy) in four irradiation positions characterized by different neutron spectra. In order to evaluate the neutron flux spectrum, subdivided into energy groups, a system of linear equations containing the group effective cross sections and the activation rate data has to be solved. However, since the system's coefficients are experimental data affected by uncertainties, a rigorous statistical approach is fundamental for an accurate evaluation of the neutron flux groups. For this purpose, we applied Bayesian statistical analysis, which allows one to include the uncertainties of the coefficients and the a priori information about the neutron flux. A program for the analysis of Bayesian hierarchical models, based on Markov chain Monte Carlo (MCMC) simulations, was used to define the statistical model of the problem and solve it. The first analysis involved the determination of the thermal, resonance-intermediate and fast flux components, and the dependence of the results on the choice of prior distribution was investigated to confirm the reliability of the Bayesian analysis. After that, the main resonances of the activation cross sections were analyzed to implement multi-group models with finer energy subdivisions that would allow one to determine the
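    The unfolding problem described above, activation rates as a linear function of the group fluxes with uncertain coefficients, can be sketched in miniature (two flux groups, all numbers invented; the paper's hierarchical MCMC treatment is far more general). With Gaussian noise and a Gaussian prior, the posterior mode has a closed form.

```python
# Toy flux unfolding: activation rates y = A * phi + noise, Gaussian
# prior on the flux groups phi. Posterior mode (MAP):
#   P = A^T A / sigma2 + I / tau2,  phi_map = P^-1 A^T y / sigma2
def solve2(m, v):
    """Solve a 2x2 linear system m * x = v by Cramer's rule."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [(d * v[0] - b * v[1]) / det, (-c * v[0] + a * v[1]) / det]

A = [[5.0, 1.0],          # assumed group effective cross sections
     [1.0, 4.0]]
phi_true = [2.0, 3.0]     # "true" flux groups used to fabricate data
y = [A[0][0] * phi_true[0] + A[0][1] * phi_true[1],
     A[1][0] * phi_true[0] + A[1][1] * phi_true[1]]
sigma2 = 0.1 ** 2         # measurement variance
tau2 = 10.0 ** 2          # broad Gaussian prior variance on each group

At = [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]          # A transposed
P = [[sum(At[i][k] * A[k][j] for k in range(2)) / sigma2
      + (1.0 / tau2 if i == j else 0.0) for j in range(2)] for i in range(2)]
rhs = [sum(At[i][k] * y[k] for k in range(2)) / sigma2 for i in range(2)]
phi_map = solve2(P, rhs)
print([round(p, 3) for p in phi_map])   # close to phi_true
```

    With noiseless data and a broad prior the MAP estimate essentially reproduces the true groups; the Bayesian machinery in the paper earns its keep when the coefficients themselves carry uncertainty.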

  16. MATHEMATICAL RISK ANALYSIS: VIA NICHOLAS RISK MODEL AND BAYESIAN ANALYSIS

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-07-01

    The objective of this second part of a two-phased study was to explore the predictive power of the quantitative risk analysis (QRA) method and process within a Higher Education Institution (HEI). The method and process investigated use impact analysis via the Nicholas risk model and Bayesian analysis, with a sample of one hundred (100) risk analysts in a historically black South African university in the greater Eastern Cape Province. The first finding supported and confirmed previous literature (King III report, 2009; Nicholas and Steyn, 2008; Stoney, 2007; COSA, 2004) that there was a direct relationship between a risk factor, its likelihood and its impact, ceteris paribus. The second finding, in relation to controlling either the likelihood or the impact of occurrence of risk (Nicholas risk model), was that to obtain a better risk reward it was more important to control the likelihood of occurrence of risks than their impact, so as to have a direct effect on the entire university. On the Bayesian analysis, the third finding was that the impact of risk should be predicted along three aspects: the human impact (decisions made), the property impact (student- and infrastructure-based) and the business impact. Lastly, although business cycles vary considerably depending on the industry and/or the institution, the study revealed that most impacts in the HEI (university) fell within the period of one academic year. The recommendation was that the application of quantitative risk analysis should be related to the current legislative framework that affects HEIs.

  17. Combining morphological analysis and Bayesian Networks for strategic decision support

    CSIR Research Space (South Africa)

    De Waal, AJ

    2007-12-01

    Morphological analysis (MA) and Bayesian networks (BN) are two closely related modelling methods, each of which has its advantages and disadvantages for strategic decision support modelling. MA is a method for defining, linking and evaluating...

  18. ULTRAMASSIVE BLACK HOLE COALESCENCE

    International Nuclear Information System (INIS)

    Khan, Fazeel Mahmood; Holley-Bockelmann, Kelly; Berczik, Peter

    2015-01-01

    Although supermassive black holes (SMBHs) correlate well with their host galaxies, there is an emerging view that outliers exist. Henize 2-10, NGC 4889, and NGC 1277 are examples of SMBHs at least an order of magnitude more massive than their host galaxies suggest. The dynamical effects of such ultramassive central black holes are unclear. Here, we perform direct N-body simulations of mergers of galactic nuclei where one black hole is ultramassive to study the evolution of the remnant and the black hole dynamics in this extreme regime. We find that the merger remnant is axisymmetric near the center, while near the large SMBH influence radius, the galaxy is triaxial. The SMBH separation shrinks rapidly due to dynamical friction and quickly forms a binary black hole; if we scale our model to the most massive estimate for the NGC 1277 black hole, for example, the timescale for the SMBH separation to shrink from nearly a kiloparsec to less than a parsec is roughly 10 Myr. By the time the SMBHs form a hard binary, gravitational wave emission dominates, and the black holes coalesce in a mere few Myr. Curiously, these extremely massive binaries appear to nearly bypass the three-body scattering evolutionary phase. Our study suggests that in this extreme case, SMBH coalescence is governed by dynamical friction followed nearly directly by gravitational wave emission, resulting in a rapid and efficient SMBH coalescence timescale. We discuss the implications for gravitational wave event rates and hypervelocity star production.

  19. Low-latency analysis pipeline for compact binary coalescences in the advanced gravitational wave detector era

    International Nuclear Information System (INIS)

    Adams, T; Buskulic, D; Germain, V; Marion, F; Mours, B; Guidi, G M; Montani, M; Piergiovanni, F; Wang, G

    2016-01-01

    The multi-band template analysis (MBTA) pipeline is a low-latency coincident analysis pipeline for the detection of gravitational waves (GWs) from compact binary coalescences. MBTA runs with a low computational cost, and can identify candidate GW events online with a sub-minute latency. The low computational running cost of MBTA also makes it useful for data quality studies. Events detected by MBTA online can be used to alert astronomical partners for electromagnetic follow-up. We outline the current status of MBTA and give details of recent pipeline upgrades and validation tests that were performed in preparation for the first advanced detector observing period. The MBTA pipeline is ready for the outset of the advanced detector era and the exciting prospects it will bring. (paper)
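    The core operation of such a pipeline, correlating template waveforms against detector data, can be caricatured as a time-domain matched filter (MBTA itself uses frequency-domain template banks split across frequency bands; the waveform and all numbers below are invented for illustration):

```python
import math
import random

random.seed(3)

# Toy matched filter: correlate a known chirp-like template against
# noisy data and report the signal-to-noise ratio (SNR).
n = 1024
template = [math.sin(2 * math.pi * (0.01 + 0.00005 * t) * t)
            * math.exp(-((t - 512) / 200.0) ** 2) for t in range(n)]
sigma = 1.0                                   # noise standard deviation
noise = [random.gauss(0, sigma) for _ in range(n)]
data_noise = noise[:]                         # noise only
data_signal = [x + h for x, h in zip(noise, template)]   # signal buried in noise

norm = math.sqrt(sum(h * h for h in template)) * sigma

def snr(data):
    """Matched-filter SNR of the template against the data."""
    return sum(d * h for d, h in zip(data, template)) / norm

print(round(snr(data_noise), 1), round(snr(data_signal), 1))
```

    The noise-only SNR fluctuates around zero while the injected signal stands out clearly; a real pipeline repeats this over a whole bank of templates and thresholds on coincident triggers between detectors.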

  20. Genetic Variability Under the Seedbank Coalescent.

    Science.gov (United States)

    Blath, Jochen; González Casanova, Adrián; Eldon, Bjarki; Kurt, Noemi; Wilke-Berenguer, Maite

    2015-07-01

    We analyze patterns of genetic variability of populations in the presence of a large seedbank with the help of a new coalescent structure called the seedbank coalescent. This ancestral process appears naturally as a scaling limit of the genealogy of large populations that sustain seedbanks, if the seedbank size and individual dormancy times are of the same order as those of the active population. Mutations appear as Poisson processes on the active lineages and potentially at reduced rate also on the dormant lineages. The presence of "dormant" lineages leads to qualitatively altered times to the most recent common ancestor and nonclassical patterns of genetic diversity. To illustrate this we provide a Wright-Fisher model with a seedbank component and mutation, motivated from recent models of microbial dormancy, whose genealogy can be described by the seedbank coalescent. Based on our coalescent model, we derive recursions for the expectation and variance of the time to most recent common ancestor, number of segregating sites, pairwise differences, and singletons. Estimates (obtained by simulations) of the distributions of commonly employed distance statistics, in the presence and absence of a seedbank, are compared. The effect of a seedbank on the expected site-frequency spectrum is also investigated using simulations. Our results indicate that the presence of a large seedbank considerably alters the distribution of some distance statistics, as well as the site-frequency spectrum. Thus, one should be able to detect from genetic data the presence of a large seedbank in natural populations. Copyright © 2015 by the Genetics Society of America.
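    The qualitative effect described above, dormant lineages lengthening the time to the most recent common ancestor, can be reproduced with a minimal Gillespie simulation for a sample of two (a hedged sketch with assumed rates, not the full seedbank coalescent of the paper): active lineages go dormant at rate c, dormant lineages wake at rate cK, and only two simultaneously active lineages can coalesce.

```python
import random

random.seed(7)

def tmrca_pair(c, K):
    """Time to common ancestor for two lineages (time in units of N
    generations) in a simplified seedbank coalescent: active lineages
    become dormant at rate c, dormant ones wake at rate c*K, and the
    pair coalesces at rate 1 only while both are active."""
    t = 0.0
    active = [True, True]
    while True:
        rates = [c if active[i] else c * K for i in (0, 1)]
        coal = 1.0 if all(active) else 0.0
        total = rates[0] + rates[1] + coal
        t += random.expovariate(total)
        u = random.random() * total
        if u < rates[0]:
            active[0] = not active[0]
        elif u < rates[0] + rates[1]:
            active[1] = not active[1]
        else:
            return t          # coalescence event

reps = 3000
no_bank = sum(tmrca_pair(0.0, 1.0) for _ in range(reps)) / reps
with_bank = sum(tmrca_pair(1.0, 0.5) for _ in range(reps)) / reps
print(round(no_bank, 2), round(with_bank, 2))
```

    Without a seedbank the mean pairwise coalescence time is 1 (in coalescent units); with these assumed dormancy rates it is several times longer, the qualitative signature the abstract exploits.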

  1. Bubble coalescence dynamics and supersaturation in electrolytic gas evolution

    Energy Technology Data Exchange (ETDEWEB)

    Stover, R.L. [Univ. of California, Berkeley, CA (United States). Dept. of Chemical Engineering; Lawrence Berkeley National Lab., CA (United States). Energy and Environment Div.]

    1996-08-01

    The apparatus and procedures developed in this research permit the observation of electrolytic bubble coalescence, which heretofore has not been possible. The influence of bubble size, electrolyte viscosity, surface tension, gas type, and pH on bubble coalescence was examined. The Navier-Stokes equations with free surface boundary conditions were solved numerically for the full range of experimental variables that were examined. Based on this study, the following mechanism for bubble coalescence emerges: when two gas bubbles coalesce, the surface energy decreases as the curvature and surface area of the resultant bubble decrease, and the energy is imparted into the surrounding liquid. The initial motion is driven by the surface tension and slowed by the inertia and viscosity of the surrounding fluid. The initial velocity of the interface is approximately proportional to the square root of the surface tension and inversely proportional to the square root of the bubble radius. Fluid inertia sustains the oblate/prolate oscillations of the resultant bubble. The period of the oscillations varies with the bubble radius raised to the 3/2 power and inversely with the square root of the surface tension. Viscous resistance dampens the oscillations at a rate proportional to the viscosity and inversely proportional to the square of the bubble radius. The numerical simulations were consistent with most of the experimental results. The differences between the computed and measured saddle point decelerations and periods suggest that the surface tension in the experiments may have changed during each run. By adjusting the surface tension in the simulation, a good fit was obtained for the 150-µm diameter bubbles. The simulations fit the experiments on larger bubbles with very little adjustment of surface tension. A more focused analysis should be done to elucidate the phenomena that occur in the receding liquid film immediately following rupture.
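    The scaling stated above, oscillation period growing as the 3/2 power of bubble radius and inversely with the square root of surface tension, is consistent with the capillary time τ = sqrt(ρR³/σ), assuming the liquid density sets the inertia. A worked example with assumed water-like values (not taken from the thesis):

```python
import math

# Capillary time tau = sqrt(rho * R^3 / sigma): period scale of the
# oblate/prolate oscillations of a coalesced bubble. Assumed values
# for a 150-micrometre-diameter bubble in water.
rho = 998.0      # kg/m^3, liquid density
sigma = 0.072    # N/m, air-water surface tension
R = 75e-6        # m, bubble radius

tau = math.sqrt(rho * R ** 3 / sigma)
print(f"{tau * 1e6:.0f} microseconds")

# Doubling the radius lengthens the period by exactly 2**1.5 ~ 2.83,
# the R^(3/2) dependence quoted in the abstract.
ratio = math.sqrt(rho * (2 * R) ** 3 / sigma) / tau
```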

  2. Bayesian analysis of heterogeneous treatment effects for patient-centered outcomes research.

    Science.gov (United States)

    Henderson, Nicholas C; Louis, Thomas A; Wang, Chenguang; Varadhan, Ravi

    2016-01-01

    Evaluation of heterogeneity of treatment effect (HTE) is an essential aspect of personalized medicine and patient-centered outcomes research. Our goal in this article is to promote the use of Bayesian methods for subgroup analysis and to lower the barriers to their implementation by describing the ways in which the companion software beanz can facilitate these types of analyses. To advance this goal, we describe several key Bayesian models for investigating HTE and outline the ways in which they are well-suited to address many of the commonly cited challenges in the study of HTE. Topics highlighted include shrinkage estimation, model choice, sensitivity analysis, and posterior predictive checking. A case study is presented in which we demonstrate the use of the methods discussed.
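    The shrinkage estimation highlighted above can be sketched with the simplest normal-normal model (all subgroup numbers are invented, and the hyperparameters mu and tau are fixed here for clarity; beanz and a full Bayesian analysis would estimate them too):

```python
# Partial pooling of subgroup treatment effects y_g (with standard
# errors s_g) toward the overall mean under a normal hierarchical
# model: posterior mean = w * y_g + (1 - w) * mu, w = tau^2/(tau^2+s_g^2).
subgroups = {"young": (1.8, 0.9), "old": (0.2, 0.7),
             "male": (0.9, 0.5), "female": (1.1, 0.6)}
mu, tau = 1.0, 0.5   # assumed overall mean effect and between-subgroup SD

shrunk = {}
for name, (y, s) in subgroups.items():
    w = tau ** 2 / (tau ** 2 + s ** 2)     # weight on the subgroup's own data
    shrunk[name] = w * y + (1 - w) * mu    # shrunken effect estimate

for name, est in shrunk.items():
    print(f"{name}: raw={subgroups[name][0]:.2f} shrunk={est:.2f}")
```

    Noisy, extreme subgroup estimates ("young", "old") are pulled strongly toward the overall mean, while well-measured ones move little; this is the guard against over-interpreting subgroup findings that the abstract emphasizes.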

  3. Coalescence dynamics of mobile and immobile fluid interfaces

    KAUST Repository

    Vakarelski, Ivan Uriev

    2018-01-12

    Coalescence dynamics between deformable bubbles and droplets can be dramatically affected by the mobility of the interfaces, with fully tangentially mobile bubble-liquid or droplet-liquid interfaces expected to accelerate the coalescence by orders of magnitude. However, there is a lack of systematic experimental investigations that quantify this effect. By using high-speed camera imaging we examine the free rise and coalescence of small air bubbles (100 to 1300 μm in diameter) with a liquid interface. A perfluorocarbon liquid, PP11, is used as a model liquid to investigate coalescence dynamics between fully mobile and immobile deformable interfaces. The mobility of the bubble surface was determined by measuring the terminal rise velocity of small bubbles rising at Reynolds numbers Re < 0.1, and the mobility of the free PP11 surface by measuring the deceleration kinetics of a small bubble approaching the interface. Induction or film drainage times of a bubble at the mobile PP11-air surface were found to be more than two orders of magnitude shorter than in the case of a bubble at an immobile PP11-water interface. A theoretical model is used to illustrate the effect of hydrodynamics and interfacial mobility on the induction or film drainage time. The results of this study are expected to stimulate the development of a comprehensive theoretical model for coalescence dynamics between two fully or partially mobile fluid interfaces.
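    The interface-mobility diagnostic described above rests on a classical low-Reynolds-number result: a bubble with a fully mobile (clean) surface rises 1.5 times faster than one whose surface behaves rigidly. A worked example with assumed water-like values (the experiments used PP11, whose density and viscosity differ):

```python
# Terminal rise velocity of a small bubble at Re << 1.
# Immobile (rigid-like) surface: Stokes drag, U = 2*rho*g*R^2/(9*mu).
# Fully mobile surface: Hadamard-Rybczynski, U = rho*g*R^2/(3*mu),
# i.e. exactly 1.5x the Stokes value (gas density neglected).
g = 9.81          # m/s^2
rho = 1000.0      # kg/m^3, liquid density (assumed, water-like)
mu = 0.001        # Pa*s, liquid viscosity (assumed, water-like)
R = 50e-6         # m, bubble radius (within the 100-1300 um diameter range)

u_immobile = 2 * rho * g * R ** 2 / (9 * mu)
u_mobile = rho * g * R ** 2 / (3 * mu)

print(round(u_immobile * 1000, 2), "mm/s vs", round(u_mobile * 1000, 2), "mm/s")
```

    Measuring which of the two velocities a rising bubble follows is precisely how the abstract's authors classify a surface as mobile or immobile.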

  4. Coalescence of liquid drops: Different models versus experiment

    KAUST Repository

    Sprittles, J. E.

    2012-01-01

    The process of coalescence of two identical liquid drops is simulated numerically in the framework of two essentially different mathematical models, and the results are compared with experimental data on the very early stages of the coalescence process reported recently. The first model tested is the "conventional" one, where it is assumed that coalescence as the formation of a single body of fluid occurs by an instant appearance of a liquid bridge smoothly connecting the two drops, and the subsequent process is the evolution of this single body of fluid driven by capillary forces. The second model under investigation considers coalescence as a process where a section of the free surface becomes trapped between the bulk phases as the drops are pressed against each other, and it is the gradual disappearance of this "internal interface" that leads to the formation of a single body of fluid and the conventional model taking over. Using the full numerical solution of the problem in the framework of each of the two models, we show that the recently reported electrical measurements probing the very early stages of the process are better described by the interface formation/disappearance model. New theory-guided experiments are suggested that would help to further elucidate the details of the coalescence phenomenon. As a by-product of our research, the range of validity of different "scaling laws" advanced as approximate solutions to the problem formulated using the conventional model is established. © 2012 American Institute of Physics.

  5. Node insertion in Coalescence Fractal Interpolation Function

    International Nuclear Information System (INIS)

    Prasad, Srijanani Anurag

    2013-01-01

    The Iterated Function System (IFS) used in the construction of a Coalescence Hidden-variable Fractal Interpolation Function (CHFIF) depends on the interpolation data. The insertion of a new point into a given set of interpolation data is called the problem of node insertion. In this paper, the effect of inserting a new point on the related IFS and the Coalescence Fractal Interpolation Function is studied. The smoothness and fractal dimension of a CHFIF obtained with an inserted node are also discussed.

  6. BATMAN: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2017-04-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BATMAN), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical data set containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e., identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real integral-field spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in the presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of 2. Our analysis reveals that the algorithm prioritizes conservation of all the statistically significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, the philosophy of BaTMAn is not to be used as a 'black box' to improve the signal-to-noise ratio, but as a new approach to characterize spatially resolved data prior to its analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.
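    The merging rule, fusing spatial elements while they remain statistically consistent with carrying the same signal, can be caricatured in one dimension (threshold, rule, and data are invented here; BaTMAn's actual criterion is Bayesian and operates on 2-D maps):

```python
# Greedy 1-D segmentation: absorb the next element into the current
# segment while it agrees with the segment mean within k combined sigmas.
def segment(values, errors, k=2.0):
    segs = [[0]]                       # lists of indices per segment
    for i in range(1, len(values)):
        cur = segs[-1]
        mean = sum(values[j] for j in cur) / len(cur)
        err = (sum(errors[j] ** 2 for j in cur) ** 0.5) / len(cur)  # SE of mean
        if abs(values[i] - mean) < k * (err ** 2 + errors[i] ** 2) ** 0.5:
            cur.append(i)              # statistically consistent: merge
        else:
            segs.append([i])           # inconsistent: start a new segment
    return segs

values = [1.0, 1.1, 0.9, 5.2, 5.0, 4.9, 1.2]
errors = [0.2] * len(values)
print(segment(values, errors))   # → [[0, 1, 2], [3, 4, 5], [6]]
```

    The noisy plateau around 1.0 and the one around 5.0 each collapse into a single segment, which is the sense in which such a tessellation "adapts to the underlying spatial structure".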

  7. Bayesian near-boundary analysis in basic macroeconomic time series models

    NARCIS (Netherlands)

    M.D. de Pooter (Michiel); F. Ravazzolo (Francesco); R. Segers (René); H.K. van Dijk (Herman)

    2008-01-01

    textabstractSeveral lessons learnt from a Bayesian analysis of basic macroeconomic time series models are presented for the situation where some model parameters have substantial posterior probability near the boundary of the parameter region. This feature refers to near-instability within dynamic

  8. Coalescent Processes with Skewed Offspring Distributions and Nonequilibrium Demography.

    Science.gov (United States)

    Matuszewski, Sebastian; Hildebrandt, Marcel E; Achaz, Guillaume; Jensen, Jeffrey D

    2018-01-01

    Nonequilibrium demography impacts coalescent genealogies, leaving detectable, well-studied signatures of variation. However, similar genomic footprints are also expected under models of large reproductive skew, posing a serious problem when trying to make inference. Furthermore, current approaches consider only one of the two processes at a time, neglecting any genomic signal that could arise from their simultaneous effects, preventing the possibility of jointly inferring parameters relating to both offspring distribution and population history. Here, we develop an extended Moran model with exponential population growth, and demonstrate that the underlying ancestral process converges to a time-inhomogeneous psi-coalescent. However, by applying a nonlinear change of time scale, analogous to the Kingman coalescent, we find that the ancestral process can be rescaled to its time-homogeneous analog, allowing the process to be simulated quickly and efficiently. Furthermore, we derive analytical expressions for the expected site-frequency spectrum under the time-inhomogeneous psi-coalescent, and develop an approximate-likelihood framework for the joint estimation of the coalescent and growth parameters. By means of extensive simulation, we demonstrate that both can be estimated accurately from whole-genome data. In addition, not accounting for demography can lead to serious biases in the inferred coalescent model, with broad implications for genomic studies ranging from ecology to conservation biology. Finally, we use our method to analyze sequence data from Japanese sardine populations, and find evidence of high variation in individual reproductive success, but few signs of a recent demographic expansion. Copyright © 2018 by the Genetics Society of America.
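    For reference alongside the psi-coalescent expressions derived in the abstract, the neutral Kingman-coalescent baseline for the expected site-frequency spectrum is E[ξ_i] = θ/i, with the expected number of segregating sites θ·Σ_{i<n} 1/i; departures from this 1/i shape are the signatures that skewed offspring distributions and growth both mimic.

```python
# Expected site-frequency spectrum under the standard (Kingman)
# coalescent: E[xi_i] = theta / i for derived-allele classes i = 1..n-1.
theta = 2.0   # population-scaled mutation rate (assumed)
n = 10        # sample size (assumed)

sfs = [theta / i for i in range(1, n)]      # expected counts per class
a_n = sum(1.0 / i for i in range(1, n))     # Watterson's harmonic factor
total = sum(sfs)                            # = theta * a_n = E[# segregating sites]

print([round(x, 2) for x in sfs], round(total, 2))
```

    An excess of singletons relative to this baseline, for instance, is compatible with either growth or reproductive skew, which is exactly the confounding the joint-inference framework above is designed to resolve.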

  9. Variational Bayesian Learning for Wavelet Independent Component Analysis

    Science.gov (United States)

    Roussos, E.; Roberts, S.; Daubechies, I.

    2005-11-01

    In an exploratory approach to data analysis, it is often useful to consider the observations as generated from a set of latent generators or "sources" via a generally unknown mapping. For the noisy overcomplete case, where we have more sources than observations, the problem becomes extremely ill-posed. Solutions to such inverse problems can, in many cases, be achieved by incorporating prior knowledge about the problem, captured in the form of constraints. This setting is a natural candidate for the application of the Bayesian methodology, allowing us to incorporate "soft" constraints in a natural manner. The work described in this paper is mainly driven by problems in functional magnetic resonance imaging of the brain, for the neuroscientific goal of extracting relevant "maps" from the data. This can be stated as a "blind" source separation problem. Recent experiments in the field of neuroscience show that these maps are sparse, in some appropriate sense. The separation problem can be solved by independent component analysis (ICA), viewed as a technique for seeking sparse components, assuming appropriate distributions for the sources. We derive a hybrid wavelet-ICA model, transforming the signals into a domain where the modeling assumption of sparsity of the coefficients with respect to a dictionary is natural. We follow a graphical modeling formalism, viewing ICA as a probabilistic generative model. We use hierarchical source and mixing models and apply Bayesian inference to the problem. This allows us to perform model selection in order to infer the complexity of the representation, as well as automatic denoising. Since exact inference and learning in such a model is intractable, we follow a variational Bayesian mean-field approach in the conjugate-exponential family of distributions, for efficient unsupervised learning in multi-dimensional settings. The performance of the proposed algorithm is demonstrated on some representative experiments.

  10. Prior Sensitivity Analysis in Default Bayesian Structural Equation Modeling.

    Science.gov (United States)

    van Erp, Sara; Mulder, Joris; Oberski, Daniel L

    2017-11-27

    Bayesian structural equation modeling (BSEM) has recently gained popularity because it enables researchers to fit complex models and solve some of the issues often encountered in classical maximum likelihood estimation, such as nonconvergence and inadmissible solutions. An important component of any Bayesian analysis is the prior distribution of the unknown model parameters. Often, researchers rely on default priors, which are constructed in an automatic fashion without requiring substantive prior information. However, the prior can have a serious influence on the estimation of the model parameters, which affects the mean squared error, bias, coverage rates, and quantiles of the estimates. In this article, we investigate the performance of three different default priors: noninformative improper priors, vague proper priors, and empirical Bayes priors, with the latter being novel in the BSEM literature. Based on a simulation study, we find that these three default BSEM methods may perform very differently, especially with small samples. A careful prior sensitivity analysis is therefore needed when performing a default BSEM analysis. For this purpose, we provide a practical step-by-step guide for practitioners on conducting a prior sensitivity analysis in default BSEM. Our recommendations are illustrated using a well-known case study from the structural equation modeling literature, and all code for conducting the prior sensitivity analysis is available in the online supplemental materials. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
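
    The small-sample sensitivity the article warns about is visible even in a toy conjugate model: with five observations, a vague proper prior, an informative default, and an empirical Bayes prior (centered on the data) give noticeably different posterior means. A hedged sketch, not the article's SEM setting; the data values and prior variances below are made up.

```python
import statistics

def posterior_normal_mean(data, prior_mean, prior_var, sigma2=1.0):
    """Conjugate normal-normal posterior for a mean with known data variance."""
    n, xbar = len(data), statistics.fmean(data)
    post_var = 1.0 / (1.0 / prior_var + n / sigma2)
    post_mean = post_var * (prior_mean / prior_var + n * xbar / sigma2)
    return post_mean, post_var

data = [0.8, 1.2, 0.5, 1.9, 1.1]                  # small illustrative sample
vague = posterior_normal_mean(data, 0.0, 1e6)     # vague proper prior
default = posterior_normal_mean(data, 0.0, 1.0)   # informative default prior
emp = posterior_normal_mean(data, statistics.fmean(data), 1.0)  # empirical Bayes
# The informative default shrinks the estimate toward zero, while the
# vague and empirical Bayes priors stay near the sample mean (1.1).
```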

  11. Analysis of lifespan monitoring data using Bayesian logic

    International Nuclear Information System (INIS)

    Pozzi, M; Zonta, D; Glisic, B; Inaudi, D; Lau, J M; Fong, C C

    2011-01-01

    In this contribution, we use a Bayesian approach to analyze the data from a 19-storey building block, which is part of the Punggol EC26 construction project undertaken by the Singapore Housing and Development Board in the early 2000s. The building was instrumented with interferometric fiber optic average strain sensors, embedded in ten of the first-story columns during construction. The philosophy driving the design of the monitoring system was to instrument a representative number of structural elements while maintaining the cost at a reasonable level. The analysis of the data, along with prior experience, allowed the engineer to recognize at an early stage an ongoing differential settlement of one base column. We show how the whole cognitive process followed by the engineer can be reproduced using Bayesian logic. In particular, we discuss to what extent prior knowledge and potential evidence from inspection can alter the perception of the building response based solely on instrumental data.
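
    At its core, the cognitive process the authors reproduce is sequential Bayesian updating: a prior belief about a column, revised first by the anomalous strain data and then by inspection evidence. A minimal sketch of that idea; none of the probabilities below come from the paper.

```python
def bayes_update(prior, likelihoods):
    """Posterior over hypotheses from a prior dict and the likelihood of
    the observed evidence under each hypothesis (Bayes' rule)."""
    unnorm = {h: prior[h] * likelihoods[h] for h in prior}
    z = sum(unnorm.values())
    return {h: p / z for h, p in unnorm.items()}

# Hypothetical numbers: a column is either behaving normally ("ok") or
# undergoing differential settlement ("settle").
prior = {"ok": 0.95, "settle": 0.05}
# Likelihood of the anomalous strain reading under each hypothesis:
strain_lik = {"ok": 0.10, "settle": 0.90}
post = bayes_update(prior, strain_lik)
# A confirming visual inspection updates the belief again:
inspect_lik = {"ok": 0.20, "settle": 0.80}
post2 = bayes_update(post, inspect_lik)
```

    With these made-up numbers, the strain data alone leave settlement below 50% probability, but the added inspection evidence tips the balance, mirroring how prior knowledge and inspection can alter conclusions drawn from instrumental data alone.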

  12. DATMAN: A reliability data analysis program using Bayesian updating

    International Nuclear Information System (INIS)

    Becker, M.; Feltus, M.A.

    1996-01-01

    Preventive maintenance (PM) techniques focus on the prevention of failures, in particular, of system components that are important to plant functions. Reliability-centered maintenance (RCM) improves on the PM techniques by introducing a set of guidelines by which to evaluate the system functions. It also minimizes intrusive maintenance, labor, and equipment downtime without sacrificing system performance when its function is essential for plant safety. Both the PM and RCM approaches require that system reliability data be updated as more component failures and operation time are acquired. System reliability and the likelihood of component failures can be calculated by Bayesian statistical methods, which can update these data. The DATMAN computer code has been developed at Penn State to simplify the Bayesian analysis by performing the tedious calculations needed for RCM reliability analysis. DATMAN reads data for updating, fits the distribution that best matches the data, and calculates component reliability. DATMAN provides a user-friendly menu interface that allows the user to choose from several common prior and posterior distributions, insert new failure data, and visually select the distribution that matches the data most accurately.
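
    The Bayesian updating that DATMAN automates can be sketched with the standard conjugate gamma-Poisson pair for a failure rate. The numbers and the choice of a gamma prior here are illustrative only; the record notes that DATMAN lets the user pick among several prior and posterior families.

```python
def update_gamma(alpha, beta, failures, exposure_hours):
    """Update a Gamma(alpha, beta) prior on a failure rate (per hour) with
    an observed failure count over an exposure time: the conjugate
    Poisson/gamma pair commonly used in reliability updating."""
    return alpha + failures, beta + exposure_hours

alpha0, beta0 = 2.0, 1.0e4   # hypothetical prior: mean rate 2e-4 per hour
alpha1, beta1 = update_gamma(alpha0, beta0, failures=3, exposure_hours=5.0e4)
prior_mean = alpha0 / beta0
post_mean = alpha1 / beta1   # fewer failures than expected lowers the rate
```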

  13. Liquid Marble Coalescence and Triggered Microreaction Driven by Acoustic Levitation.

    Science.gov (United States)

    Chen, Zhen; Zang, Duyang; Zhao, Liang; Qu, Mengfei; Li, Xu; Li, Xiaoguang; Li, Lixin; Geng, Xingguo

    2017-06-27

    Liquid marbles show promising potential for application in the microreactor field. Controlling the coalescence of two or more liquid marbles is critical; however, the successful merging of two isolated marbles is difficult because of their mechanically robust particle shells. In this work, the coalescence of multiple liquid marbles was achieved via acoustic levitation. The dynamic behaviors of the liquid marbles were monitored by a high-speed camera. Driven by the sound field, the liquid marbles moved toward each other, collided, and eventually coalesced into a larger single marble. The underlying mechanisms of this process were probed via sound field simulation and acoustic radiation pressure calculation. The results indicated that the pressure gradient on the liquid marble surface favors the formation of a liquid bridge between the liquid marbles, resulting in their coalescence. A preliminary indicator reaction was induced by the coalescence of dual liquid marbles, which suggests that expected chemical reactions can be successfully triggered with multiple reagents contained in isolated liquid marbles via acoustic levitation.

  14. Cosmology with coalescing massive black holes

    International Nuclear Information System (INIS)

    Hughes, Scott A; Holz, Daniel E

    2003-01-01

    The gravitational waves generated in the coalescence of massive binary black holes will be measurable by LISA to enormous distances. Redshifts z ∼ 10 or larger (depending somewhat on the mass of the binary) can potentially be probed by such measurements, suggesting that binary coalescences can be made into cosmological tools. We discuss two particularly interesting types of probe. First, by combining gravitational-wave measurements with information about the cosmography of the universe, we can study the evolution of black-hole masses and merger rates as a function of redshift, providing information about the growth of structures at high redshift and possibly constraining hierarchical merger scenarios. Second, if it is possible to associate an 'electromagnetic' counterpart with a coalescence, it may be possible to measure both redshift and luminosity distance to an event with less than ∼1% error. Such a measurement would constitute an amazingly precise cosmological standard candle. Unfortunately, gravitational lensing uncertainties will reduce the quality of this candle significantly. Though not as amazing as might have been hoped, such a candle would nonetheless very usefully complement other distance-redshift probes, in particular providing a valuable check on systematic effects in such measurements.

  15. An Exploratory Study Examining the Feasibility of Using Bayesian Networks to Predict Circuit Analysis Understanding

    Science.gov (United States)

    Chung, Gregory K. W. K.; Dionne, Gary B.; Kaiser, William J.

    2006-01-01

    Our research question was whether we could develop a feasible technique, using Bayesian networks, to diagnose gaps in student knowledge. Thirty-four college-age participants completed tasks designed to measure conceptual knowledge, procedural knowledge, and problem-solving skills related to circuit analysis. A Bayesian network was used to model…

  16. The integration of expert-defined importance factors to enrich Bayesian Fault Tree Analysis

    International Nuclear Information System (INIS)

    Darwish, Molham; Almouahed, Shaban; Lamotte, Florent de

    2017-01-01

    This paper proposes an analysis of a hybrid Bayesian-Importance model for system designers to improve the quality of services related to Active Assisted Living Systems. The proposed model is based on two factors: a failure probability measure of the different service components and an expert-defined degree of importance that each component holds for the success of the corresponding service. The proposed approach advocates the integration of expert-defined importance factors to enrich the Bayesian Fault Tree Analysis (FTA) approach. The evaluation of the proposed approach is conducted using the Fault Tree Analysis formalism, where the undesired state of a system is analyzed using Boolean logic mechanisms to combine a series of lower-level events.
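
    The Boolean-gate mechanics of FTA are easy to sketch. How exactly the expert-defined importance factors enter the hybrid model is specific to the paper, so the weighted score below is only a hypothetical illustration of the general idea; all probabilities, weights, and component names are invented.

```python
def and_gate(probs):
    """AND gate: the gate fails only if every input event fails."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """OR gate: the gate fails if any input event fails."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical service: the top event occurs if both redundant sensors
# fail (AND) or the gateway fails (OR).
p_top = or_gate([and_gate([0.01, 0.01]), 0.002])

# One plausible way to fold in expert-defined importance factors:
# weight each component's failure probability by its importance.
comps = {"sensorA": (0.01, 0.9), "sensorB": (0.01, 0.9), "gateway": (0.002, 1.0)}
risk_score = sum(p * w for p, w in comps.values()) / sum(w for _, w in comps.values())
```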

  17. Coalescent-Based Analyses of Genomic Sequence Data Provide a Robust Resolution of Phylogenetic Relationships among Major Groups of Gibbons

    Science.gov (United States)

    Shi, Cheng-Min; Yang, Ziheng

    2018-01-01

    Abstract The phylogenetic relationships among extant gibbon species remain unresolved despite numerous efforts using morphological, behavioral, and genetic data and the sequencing of whole genomes. A major challenge in reconstructing the gibbon phylogeny is the radiative speciation process, which resulted in extremely short internal branches in the species phylogeny and extensive incomplete lineage sorting, with substantial gene-tree heterogeneity across the genome. Here, we analyze two genomic-scale data sets, with ∼10,000 putative noncoding and exonic loci, respectively, to estimate the species tree for the major groups of gibbons. We used the Bayesian full-likelihood method bpp under the multispecies coalescent model, which naturally accommodates incomplete lineage sorting and uncertainties in the gene trees. For comparison, we included three heuristic coalescent-based methods (mp-est, SVDQuartets, and astral) as well as concatenation. From both data sets, we infer the phylogeny for the four extant gibbon genera to be (Hylobates, (Nomascus, (Hoolock, Symphalangus))). We used simulation guided by the real data to evaluate the accuracy of the methods used. Astral, while not as efficient as bpp, performed well in estimating the species tree even in the presence of excessive incomplete lineage sorting. Concatenation, mp-est, and SVDQuartets were unreliable when the species tree contains very short internal branches. A likelihood ratio test of gene flow suggests a small amount of migration from Hylobates moloch to H. pileatus, while cross-genera migration is absent or rare. Our results highlight the utility of coalescent-based methods in addressing challenging species tree problems characterized by short internal branches and rampant gene tree-species tree discordance. PMID:29087487
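
    The incomplete lineage sorting behind the gene-tree heterogeneity has a closed form in the simplest case: for a three-species tree ((A,B),C) with an internal branch of length T coalescent units, each discordant gene-tree topology has probability e^(-T)/3 under the multispecies coalescent. A short sketch of that textbook formula, for intuition only; it is not the four-genus gibbon analysis itself.

```python
import math

def gene_tree_probs(T):
    """Gene-tree topology probabilities for species tree ((A,B),C) with an
    internal branch of length T (coalescent units): discordance arises
    from incomplete lineage sorting along the short branch."""
    discord = math.exp(-T) / 3.0
    return {"((A,B),C)": 1.0 - 2.0 * discord,
            "((A,C),B)": discord,
            "((B,C),A)": discord}

short = gene_tree_probs(0.1)   # radiation-like short internal branch
long_ = gene_tree_probs(2.0)   # long branch: gene trees mostly agree
# With T = 0.1, most gene trees disagree with the species tree, which is
# why concatenation-style methods struggle in this regime.
```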

  18. The coalescence of heterogeneous liquid metal on nano substrate

    Science.gov (United States)

    Wang, Long; Li, Yifan; Zhou, Xuyan; Li, Tao; Li, Hui

    2017-06-01

    Molecular dynamics simulation has been performed to study the asymmetric coalescence of heterogeneous liquid metal on graphene. Simulation results show that the anomalies in drop coalescence are mainly caused by the wettability of the heterogeneous liquid metal. The silver atoms tend to distribute on the outer layer of the gold and copper droplets, revealing that the structure is determined by the interaction between different metal atoms. The coalescence and fusion of heterogeneous liquid metal drops can be predicted by comparing the wettability and the atomic mass of the metallic liquid drops, which has important implications in industrial applications such as ink-jet printing and metallurgy.

  19. Inference algorithms and learning theory for Bayesian sparse factor analysis

    International Nuclear Information System (INIS)

    Rattray, Magnus; Sharp, Kevin; Stegle, Oliver; Winn, John

    2009-01-01

    Bayesian sparse factor analysis has many applications; for example, it has been applied to the problem of inferring a sparse regulatory network from gene expression data. We describe a number of inference algorithms for Bayesian sparse factor analysis using a slab and spike mixture prior. These include well-established Markov chain Monte Carlo (MCMC) and variational Bayes (VB) algorithms as well as a novel hybrid of VB and Expectation Propagation (EP). For the case of a single latent factor we derive a theory for learning performance using the replica method. We compare the MCMC and VB/EP algorithm results with simulated data to the theoretical prediction. The results for MCMC agree closely with the theory as expected. Results for VB/EP are slightly sub-optimal but show that the new algorithm is effective for sparse inference. In large-scale problems MCMC is infeasible due to computational limitations and the VB/EP algorithm then provides a very useful computationally efficient alternative.

  20. Inference algorithms and learning theory for Bayesian sparse factor analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rattray, Magnus; Sharp, Kevin [School of Computer Science, University of Manchester, Manchester M13 9PL (United Kingdom); Stegle, Oliver [Max-Planck-Institute for Biological Cybernetics, Tuebingen (Germany); Winn, John, E-mail: magnus.rattray@manchester.ac.u [Microsoft Research Cambridge, Roger Needham Building, Cambridge, CB3 0FB (United Kingdom)

    2009-12-01

    Bayesian sparse factor analysis has many applications; for example, it has been applied to the problem of inferring a sparse regulatory network from gene expression data. We describe a number of inference algorithms for Bayesian sparse factor analysis using a slab and spike mixture prior. These include well-established Markov chain Monte Carlo (MCMC) and variational Bayes (VB) algorithms as well as a novel hybrid of VB and Expectation Propagation (EP). For the case of a single latent factor we derive a theory for learning performance using the replica method. We compare the MCMC and VB/EP algorithm results with simulated data to the theoretical prediction. The results for MCMC agree closely with the theory as expected. Results for VB/EP are slightly sub-optimal but show that the new algorithm is effective for sparse inference. In large-scale problems MCMC is infeasible due to computational limitations and the VB/EP algorithm then provides a very useful computationally efficient alternative.
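
    The slab and spike mixture prior at the heart of the two records above is simple to sample from, which shows directly why it induces sparsity: most loadings sit exactly at zero. A minimal sketch of prior simulation only, not the MCMC/VB/EP inference the paper develops; the mixing weight pi and slab scale are arbitrary.

```python
import random

def sample_spike_slab(n, pi=0.2, slab_sd=1.0, rng=random):
    """Draw n factor loadings from a spike-and-slab mixture: with
    probability pi a loading comes from the Gaussian 'slab', otherwise it
    is exactly zero (the 'spike')."""
    return [rng.gauss(0.0, slab_sd) if rng.random() < pi else 0.0
            for _ in range(n)]

random.seed(0)
loadings = sample_spike_slab(1000, pi=0.2)
sparsity = sum(1 for w in loadings if w == 0.0) / len(loadings)
# Roughly 80% of the loadings are exactly zero, matching 1 - pi.
```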

  1. A default Bayesian hypothesis test for mediation.

    Science.gov (United States)

    Nuijten, Michèle B; Wetzels, Ruud; Matzke, Dora; Dolan, Conor V; Wagenmakers, Eric-Jan

    2015-03-01

    In order to quantify the relationship between multiple variables, researchers often carry out a mediation analysis. In such an analysis, a mediator (e.g., knowledge of a healthy diet) transmits the effect from an independent variable (e.g., classroom instruction on a healthy diet) to a dependent variable (e.g., consumption of fruits and vegetables). Almost all mediation analyses in psychology use frequentist estimation and hypothesis-testing techniques. A recent exception is Yuan and MacKinnon (Psychological Methods, 14, 301-322, 2009), who outlined a Bayesian parameter estimation procedure for mediation analysis. Here we complete the Bayesian alternative to frequentist mediation analysis by specifying a default Bayesian hypothesis test based on the Jeffreys-Zellner-Siow approach. We further extend this default Bayesian test by allowing a comparison to directional or one-sided alternatives, using Markov chain Monte Carlo techniques implemented in JAGS. All Bayesian tests are implemented in the R package BayesMed (Nuijten, Wetzels, Matzke, Dolan, & Wagenmakers, 2014).
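
    The estimation side of this line of work (following Yuan & MacKinnon, cited above) can be sketched with plain Monte Carlo: draw the two path coefficients from approximate normal posteriors and form the posterior of the product a*b. This is a toy of the estimation step only, not the default JZS Bayes factor test the paper specifies, and the coefficients and standard errors below are invented.

```python
import random

def mediated_effect_samples(a_hat, a_se, b_hat, b_se, n=100_000, rng=random):
    """Posterior draws of the mediated effect a*b, treating the two path
    coefficients as approximately independent normal posteriors."""
    return [rng.gauss(a_hat, a_se) * rng.gauss(b_hat, b_se) for _ in range(n)]

random.seed(42)
ab = sorted(mediated_effect_samples(0.5, 0.1, 0.4, 0.1))
mean_ab = sum(ab) / len(ab)
ci = (ab[int(0.025 * len(ab))], ab[int(0.975 * len(ab))])  # 95% credible interval
# If the credible interval excludes zero, the mediated effect is
# credibly nonzero under this simple model.
```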

  2. Coalescence in dense water/oil dispersions

    Energy Technology Data Exchange (ETDEWEB)

    Thunem, H

    1993-06-01

    This project has been divided into three parts. The first part has been to review a large amount of literature to obtain models describing separate stages of the coalescence of a single drop at an interface. These stages include the drop deformation, the film thinning, the critical film thickness, and the drop breakup. The second part has been to evaluate the different models and select which to use in the development of the OneDrop program. The models describing drop deformation and film thinning were supplied by Charles and Mason; however, the film thinning model has been slightly enhanced in this project. The models and the enhancements made have been compared to experimental data from the literature and from work done by undergraduate students at our department. The third part of the project has been to apply the models to drop-drop coalescence, and to write the SIM program to simulate the coalescence in a system of many drops. We use the same equations as in the OneDrop case, except for the film thinning; by using a derivation similar to that for the OneDrop case, an equation for the SIM case has been developed. We have made the assumption that the physical phenomena regarding drop deformation, film thinning and critical film thickness are the same in the OneDrop and SIM cases, so the experimental validation of OneDrop also applies to SIM. By using the results from the SIM program, we can obtain some information about how different physical parameters will affect the collision frequency and collision efficiency. We may use this information to derive empirical equations describing these parameters' effects on the coalescence probability in a dispersion. 207 refs., 83 figs., 21 tabs.

  3. Gene tree rooting methods give distributions that mimic the coalescent process.

    Science.gov (United States)

    Tian, Yuan; Kubatko, Laura S

    2014-01-01

    Multi-locus phylogenetic inference is commonly carried out via models that incorporate the coalescent process to model the possibility that incomplete lineage sorting leads to incongruence between gene trees and the species tree. An interesting question that arises in this context is whether data "fit" the coalescent model. Previous work (Rosenfeld et al., 2012) has suggested that rooting of gene trees may account for variation in empirical data that has been previously attributed to the coalescent process. We examine this possibility using simulated data. We show that, in the case of four taxa, the distribution of gene trees observed from rooting estimated gene trees with either the molecular clock or with outgroup rooting can be closely matched by the distribution predicted by the coalescent model with specific choices of species tree branch lengths. We apply commonly-used coalescent-based methods of species tree inference to assess their performance in these situations. Copyright © 2013 Elsevier Inc. All rights reserved.

  4. Using acoustics to study and stimulate the coalescence of oil drops surrounded by water

    Energy Technology Data Exchange (ETDEWEB)

    Gardner, E.A.; Apfel, R.E. (Yale Univ., New Haven, CT (United States). Dept. of Mechanical Engineering)

    1993-08-01

    The coalescence of oil drops in water is studied using acoustic levitation and stimulated with acoustic cavitation. Unlike most earlier studies, which investigate the coalescence of a single drop with an initially planar interface, the use of acoustic radiation forces allows two drops to be brought into contact and allowed to coalesce. The acoustic technique has the advantage over other drop-drop coalescence systems in that the drops remain in contact until they coalesce without the use of solid supports to control them. Additionally, acoustic cavitation is observed to deposit sufficient energy in the oil-water interface to trigger the coalescence of a pair of 2-mm-diameter drops. This stimulation mechanism could have application to emulsion breaking. Some of the factors that affect spontaneous and stimulated coalescence are investigated.

  5. Theory of peak coalescence in Fourier transform ion cyclotron resonance mass spectrometry.

    Science.gov (United States)

    Boldin, Ivan A; Nikolaev, Eugene N

    2009-10-01

    Peak coalescence, i.e. the merging of two close peaks in a Fourier transform ion cyclotron resonance (FTICR) mass spectrum at a high number of ions, plays an important role in various FTICR experiments. In order to describe the coalescence phenomenon, we propose a new theory of motion for ion clouds with close mass-to-charge ratios, driven by a uniform magnetic field and Coulomb interactions between the clouds. We describe the motion of the ion clouds in terms of their averaged drift motion in crossed magnetic and electric fields. The ion clouds are considered to be of constant size and their motion is studied in two dimensions. The theory deals with the first-order approximation of the equations of motion in relation to dm/m, where dm is the mass difference and m is the mass of a single ion. The analysis was done for an arbitrary inter-cloud interaction potential, which makes it possible to analyze finite-size ion clouds of any shape. The final analytical expression for the condition of the onset of coalescence is found for the case of uniformly charged spheres. An algorithm for finding this condition for an arbitrary interaction potential is proposed. The critical number of ions for the peak coalescence to take place is shown to depend quadratically on the magnetic field strength and to be proportional to the cyclotron radius and inversely proportional to the ion masses. Copyright (c) 2009 John Wiley & Sons, Ltd.

  6. Theoretical Coalescence: A Method to Develop Qualitative Theory: The Example of Enduring.

    Science.gov (United States)

    Morse, Janice M

    Qualitative research is frequently context bound, lacks generalizability, and is limited in scope. The purpose of this article was to describe a method, theoretical coalescence, that provides a strategy for analyzing complex, high-level concepts and for developing generalizable theory. Theoretical coalescence is a method of theoretical expansion and inductive inquiry for theory development that uses data (rather than themes, categories, and published extracts of data) as the primary source for analysis. Here, using the development of the lay concept of enduring as an example, I explore the scientific development of the concept in multiple settings over many projects and link it within the Praxis Theory of Suffering. As comprehension emerges when conducting theoretical coalescence, it is essential that raw data from various different situations be available for reinterpretation/reanalysis and comparison to identify the essential features of the concept. The concept is then reconstructed with additional inquiry that builds description, and evidence is gathered and conceptualized to create a more expansive concept and theory. By utilizing apparently diverse data sets from different contexts that are linked by certain characteristics, the essential features of the concept emerge. Such inquiry is divergent and less bound by context yet purposeful, logical, and with significant pragmatic implications for practice in nursing and beyond our discipline. Theoretical coalescence is a means by which qualitative inquiry is broadened to make an impact, to accommodate new theoretical shifts and concepts, and to make qualitative research applied and accessible in new ways.

  7. Binary naive Bayesian classifiers for correlated Gaussian features: a theoretical analysis

    CSIR Research Space (South Africa)

    Van Dyk, E

    2008-11-01

    Full Text Available classifier with Gaussian features while using any quadratic decision boundary. Therefore, the analysis is not restricted to Naive Bayesian classifiers alone and can, for instance, be used to calculate the Bayes error performance. We compare the analytical...
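
    The quadratic decision boundaries this analysis covers arise whenever the class covariances (or, in one dimension, variances) differ, and the resulting Bayes error can be checked by Monte Carlo. A one-dimensional sketch for intuition only; the record itself treats correlated multivariate Gaussian features.

```python
import math
import random

def log_gauss(x, mu, var):
    """Log density of a 1-D Gaussian."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)

def bayes_error_mc(mu0, var0, mu1, var1, n=200_000, rng=random):
    """Monte Carlo estimate of the Bayes error for two equiprobable 1-D
    Gaussian classes. Unequal variances already yield a quadratic
    discriminant, the boundary family analyzed in the record."""
    errors = 0
    for _ in range(n):
        y = rng.random() < 0.5
        x = rng.gauss(mu1, math.sqrt(var1)) if y else rng.gauss(mu0, math.sqrt(var0))
        pred = log_gauss(x, mu1, var1) > log_gauss(x, mu0, var0)
        errors += pred != y
    return errors / n

random.seed(7)
err = bayes_error_mc(0.0, 1.0, 2.0, 4.0)   # overlapping classes, err < 0.5
```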

  8. An efficient Bayesian meta-analysis approach for studying cross-phenotype genetic associations.

    Directory of Open Access Journals (Sweden)

    Arunabha Majumdar

    2018-02-01

    Full Text Available Simultaneous analysis of genetic associations with multiple phenotypes may reveal shared genetic susceptibility across traits (pleiotropy). For a locus exhibiting overall pleiotropy, it is important to identify which specific traits underlie this association. We propose a Bayesian meta-analysis approach (termed CPBayes) that uses summary-level data across multiple phenotypes to simultaneously measure the evidence of aggregate-level pleiotropic association and estimate an optimal subset of traits associated with the risk locus. This method uses a unified Bayesian statistical framework based on a spike and slab prior. CPBayes performs a fully Bayesian analysis by employing the Markov Chain Monte Carlo (MCMC) technique Gibbs sampling. It takes into account heterogeneity in the size and direction of the genetic effects across traits. It can be applied to both cohort data and separate studies of multiple traits having overlapping or non-overlapping subjects. Simulations show that CPBayes can produce higher accuracy in the selection of associated traits underlying a pleiotropic signal than the subset-based meta-analysis ASSET. We used CPBayes to undertake a genome-wide pleiotropic association study of 22 traits in the large Kaiser GERA cohort and detected six independent pleiotropic loci associated with at least two phenotypes. This includes a locus at chromosomal region 1q24.2 which exhibits an association simultaneously with the risk of five different diseases: Dermatophytosis, Hemorrhoids, Iron Deficiency, Osteoporosis and Peripheral Vascular Disease. We provide an R-package 'CPBayes' implementing the proposed method.

  9. Coalescence kinetics of oil-in-water emulsions studied with microfluidics

    NARCIS (Netherlands)

    Krebs, T.; Schroen, C.G.P.H.; Boom, R.M.

    2013-01-01

    We report the results of experiments on the coalescence dynamics in flowing oil-in-water emulsions using an integrated microfluidic device. The microfluidic circuit permits direct observation of shear-induced collisions and coalescence events between emulsion droplets. Three mineral oils with a

  10. Electrohydrodynamic coalescence of droplets using an embedded potential flow model

    Science.gov (United States)

    Garzon, M.; Gray, L. J.; Sethian, J. A.

    2018-03-01

    The coalescence, and subsequent satellite formation, of two inviscid droplets is studied numerically. The initial drops are taken to be of equal and different sizes, and simulations have been carried out with and without the presence of an electrical field. The main computational challenge is the tracking of a free surface that changes topology. Coupling level set and boundary integral methods with an embedded potential flow model, we seamlessly compute through these singular events. As a consequence, the various coalescence modes that appear depending upon the relative ratio of the parent droplets can be studied. Computations of first stage pinch-off, second stage pinch-off, and complete engulfment are analyzed and compared to recent numerical studies and laboratory experiments. Specifically, we study the evolution of bridge radii and the related scaling laws, the minimum drop radii evolution from coalescence to satellite pinch-off, satellite sizes, and the upward stretching of the near cylindrical protrusion at the droplet top. Clear evidence of partial coalescence self-similarity is presented for parent droplet ratios between 1.66 and 4. This has been possible due to the fact that computational initial conditions only depend upon the mother droplet size, in contrast with laboratory experiments where the difficulty in establishing the same initial physical configuration is well known. The presence of electric forces changes the coalescence patterns, and it is possible to control the satellite droplet size by tuning the electrical field intensity. All of the numerical results are in very good agreement with recent laboratory experiments for water droplet coalescence.

  11. Environmental Modeling and Bayesian Analysis for Assessing Human Health Impacts from Radioactive Waste Disposal

    Science.gov (United States)

    Stockton, T.; Black, P.; Tauxe, J.; Catlett, K.

    2004-12-01

    Bayesian decision analysis provides a unified framework for coherent decision-making. Two key components of Bayesian decision analysis are probability distributions and utility functions. Calculating posterior distributions and performing decision analysis can be computationally challenging, especially for complex environmental models. In addition, probability distributions and utility functions for environmental models must be specified through expert elicitation, stakeholder consensus, or data collection, all of which have their own set of technical and political challenges. Nevertheless, a grand appeal of the Bayesian approach for environmental decision-making is the explicit treatment of uncertainty, including expert judgment. The impact of expert judgment on the environmental decision process, though integral, goes largely unassessed. Regulations and orders of the Environmental Protection Agency, Department of Energy, and Nuclear Regulatory Commission require assessing the impact on human health of radioactive waste contamination over periods of up to ten thousand years. Towards this end, complex environmental simulation models are used to assess "risk" to human and ecological health from migration of radioactive waste. As the computational burden of environmental modeling is continually reduced, probabilistic process modeling using Monte Carlo simulation is becoming routinely used to propagate uncertainty from model inputs through model predictions. The utility of a Bayesian approach to environmental decision-making is discussed within the context of a buried radioactive waste example. This example highlights the desirability and difficulties of merging the cost of monitoring, the cost of the decision analysis, the cost and viability of clean up, and the probability of human health impacts within a rigorous decision framework.

  12. Coala: an R framework for coalescent simulation.

    Science.gov (United States)

    Staab, Paul R; Metzler, Dirk

    2016-06-15

    Simulation programs based on the coalescent efficiently generate genetic data according to a given model of evolution. We present coala, an R package for calling coalescent simulators with a unified syntax. It can execute simulations with several programs, calculate additional summary statistics and combine multiple simulations to create biologically more realistic data. The package is publicly available on CRAN and on https://github.com/statgenlmu/coala under the conditions of the MIT license. metzler@bio.lmu.de. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  13. msBayes: Pipeline for testing comparative phylogeographic histories using hierarchical approximate Bayesian computation

    Directory of Open Access Journals (Sweden)

    Takebayashi Naoki

    2007-07-01

    Full Text Available Abstract Background Although testing for simultaneous divergence (vicariance) across different population-pairs that span the same barrier to gene flow is of central importance to evolutionary biology, researchers often equate the gene tree and population/species tree, thereby ignoring stochastic coalescent variance in their conclusions of temporal incongruence. In contrast to other available phylogeographic software packages, msBayes is the only one that analyses data from multiple species/population pairs under a hierarchical model. Results msBayes employs approximate Bayesian computation (ABC) under a hierarchical coalescent model to test for simultaneous divergence (TSD) in multiple co-distributed population-pairs. Simultaneous isolation is tested by estimating three hyper-parameters that characterize the degree of variability in divergence times across co-distributed population pairs while allowing for variation in various within population-pair demographic parameters (sub-parameters) that can affect the coalescent. msBayes is a software package consisting of several C and R programs that are run with a Perl "front-end". Conclusion The method reasonably distinguishes simultaneous isolation from temporal incongruence in the divergence of co-distributed population pairs, even with sparse sampling of individuals. Because the estimation step is decoupled from the simulation step, one can rapidly evaluate different ABC acceptance/rejection conditions and the choice of summary statistics. Given the complex and idiosyncratic nature of testing multi-species biogeographic hypotheses, we envision msBayes as a powerful and flexible tool for tackling a wide array of difficult research questions that use population genetic data from multiple co-distributed species. The msBayes pipeline is available for download at http://msbayes.sourceforge.net/ under an open source license (GNU Public License).
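    The core ABC idea msBayes builds on — draw a hyper-parameter from its prior, simulate data under the model, and keep draws whose summary statistic lands near the observed one — can be illustrated with a deliberately simplified model. The hyper-parameter, divergence-time model, and summary statistic below are illustrative stand-ins, not msBayes's actual parameterization:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_pairs = 5

    # Synthetic "observed" divergence-time estimates for 5 population pairs
    # (real input would be summaries computed from DNA sequence data).
    obs_tau = np.array([1.0, 1.1, 0.9, 1.05, 0.95])
    obs_stat = obs_tau.var() / obs_tau.mean()     # dispersion index of divergence times

    def simulate(zeta, rng):
        """Simulate divergence times given zeta, the fraction of pairs that
        diverged simultaneously (an illustrative stand-in hyper-parameter)."""
        n_sync = max(1, round(zeta * n_pairs))
        shared = rng.exponential(1.0)             # common divergence time
        taus = np.concatenate([np.full(n_sync, shared),
                               rng.exponential(1.0, n_pairs - n_sync)])
        return taus.var() / taus.mean()

    # ABC rejection: draw from the hyper-prior, keep draws whose simulated
    # summary statistic falls within a tolerance of the observed one.
    accepted = []
    for _ in range(20_000):
        zeta = rng.uniform(0.0, 1.0)
        if abs(simulate(zeta, rng) - obs_stat) < 0.05:
            accepted.append(zeta)
    accepted = np.array(accepted)
    print(f"approximate posterior mean of zeta: {accepted.mean():.2f} "
          f"({accepted.size} draws kept)")
    ```

    Because the observed divergence times are nearly identical, the accepted draws concentrate at high values of zeta, i.e. the data look consistent with simultaneous divergence.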

  14. RADYBAN: A tool for reliability analysis of dynamic fault trees through conversion into dynamic Bayesian networks

    International Nuclear Information System (INIS)

    Montani, S.; Portinale, L.; Bobbio, A.; Codetta-Raiteri, D.

    2008-01-01

    In this paper, we present RADYBAN (Reliability Analysis with DYnamic BAyesian Networks), a software tool which allows one to analyze a dynamic fault tree by converting it into a dynamic Bayesian network. The tool implements a modular algorithm for automatically translating a dynamic fault tree into the corresponding dynamic Bayesian network, and it exploits classical inference algorithms for dynamic Bayesian networks to compute reliability measures. After describing the basic features of the tool, we show how it operates on a real-world example, and we compare the unreliability results it generates with those returned by other methodologies, in order to verify the correctness and the consistency of the results obtained.
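    As a point of reference for the quantity such a tool computes, the top-event unreliability of a simple static gate can be estimated by plain Monte Carlo and checked against the closed-form answer. This is a generic illustration of the target quantity, not RADYBAN's DBN algorithm, and the failure rates are made up:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n, t = 200_000, 1000.0   # Monte Carlo trials, mission time (hours)

    # Illustrative 2-out-of-3 voting gate with exponential component lifetimes.
    lam = np.array([1e-4, 2e-4, 1.5e-4])            # made-up failure rates, 1/hour
    fail_times = rng.exponential(1.0 / lam, size=(n, 3))
    failed = fail_times < t                          # component failed within mission?

    top_event = failed.sum(axis=1) >= 2              # gate fails if >= 2 components fail
    mc_unrel = top_event.mean()

    # Analytic check: P(at least 2 of 3) from independent component unreliabilities.
    q = 1.0 - np.exp(-lam * t)
    exact = (q[0]*q[1]*(1-q[2]) + q[0]*(1-q[1])*q[2]
             + (1-q[0])*q[1]*q[2] + q[0]*q[1]*q[2])
    print(f"Monte Carlo {mc_unrel:.4f} vs analytic {exact:.4f}")
    ```

    Dynamic gates (spares, functional dependencies) have no such closed form, which is why conversion to a dynamic Bayesian network, as in the paper, is useful.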

  15. The coalescence instability in solar flares

    Science.gov (United States)

    Tajima, T.; Brunel, F.; Sakai, J.-I.; Vlahos, L.; Kundu, M. R.

    1985-01-01

    The nonlinear coalescence instability of current-carrying solar loops can explain many of the characteristics of solar flares, such as their impulsive nature, heating and high-energy particle acceleration, and amplitude oscillations of electromagnetic emission, as well as the characteristics of two-dimensional microwave images obtained during a flare. The plasma compressibility leads to the explosive phase of loop coalescence, and its overshoot results in amplitude oscillations in temperatures by adiabatic compression and decompression. It is noted that the presence of strong electric fields and super-Alfvenic flows during the course of the instability plays an important role in the production of nonthermal particles. A qualitative explanation of the physical processes taking place during the nonlinear stages of the instability is given.

  16. The coalescence instability in solar flares

    International Nuclear Information System (INIS)

    Tajima, T.; Brunel, F.; Sakai, J.I.; Vlahos, L.; Kundu, M.R.

    1984-01-01

    The non-linear coalescence instability of current-carrying solar loops can explain many of the characteristics of solar flares, such as their impulsive nature, heating and high-energy particle acceleration, and amplitude oscillations of electromagnetic emission, as well as the characteristics of 2-D microwave images obtained during a flare. The plasma compressibility leads to the explosive phase of loop coalescence, and its overshoot results in amplitude oscillations in temperatures by adiabatic compression and decompression. We note that the presence of strong electric fields and super-Alfvenic flows during the course of the instability plays an important role in the production of non-thermal particles. A qualitative explanation of the physical processes taking place during the non-linear stages of the instability is given. (author)

  17. Assessment of partial coalescence in whippable oil-in-water food emulsions.

    Science.gov (United States)

    Petrut, Raul Flaviu; Danthine, Sabine; Blecker, Christophe

    2016-03-01

    Partial coalescence influences to a great extent the properties of final food products such as ice cream and whipped toppings. In return, the occurrence and development of partial coalescence are conditioned, in such systems, by the emulsion's intrinsic properties (e.g. solid fat content, fat crystal shape and size), formulation (e.g. protein content, presence of surfactants) and extrinsic factors (e.g. cooling rate, shearing). A set of methods is available for the investigation and quantification of partial coalescence. These methods are critically reviewed in this paper, balancing the weaknesses of the methods in terms of structure alteration (for turbidity, dye dilution, etc.) and assumptions made for mathematical models (for particle size determination) against their advantages (good repeatability, high sensitivity, etc.). With the methods proposed in the literature, partial coalescence investigations can be conducted quantitatively and/or qualitatively. Good correlations were observed between some of the quantitative methods, such as dye dilution, calorimetry and fat particle size, while a poor correlation was found between the solvent extraction method and the other quantitative methods. The fat particle size method appears to be the most suitable way to quantify partial coalescence, giving results with a high degree of confidence when used in combination with a microscopic technique to confirm partial coalescence as the main destabilization mechanism. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Review of the Dynamics of Coalescence and Demulsification by High-Voltage Pulsed Electric Fields

    Directory of Open Access Journals (Sweden)

    Ye Peng

    2016-01-01

    Full Text Available The coalescence of droplets in oil can be induced rapidly by a high-voltage pulsed electric field, which is an effective demulsification and dehydration method. At present, it is widely believed that pulsed electric fields promote droplet coalescence mainly through dipole coalescence and oscillation coalescence, and that optimal coalescence field parameters exist. Around these topics, the dynamics of high-voltage pulsed electric fields promoting the coalescence of emulsified droplets has been studied by researchers both domestically and abroad. This review summarizes that progress so as to give a better understanding of high-voltage pulsed electric field demulsification technology and to help promote its industrial application.

  19. Bayesian Reasoning in Data Analysis A Critical Introduction

    CERN Document Server

    D'Agostini, Giulio

    2003-01-01

    This book provides a multi-level introduction to Bayesian reasoning (as opposed to "conventional statistics") and its applications to data analysis. The basic ideas of this "new" approach to the quantification of uncertainty are presented using examples from research and everyday life. Applications covered include: parametric inference; combination of results; treatment of uncertainty due to systematic errors and background; comparison of hypotheses; unfolding of experimental distributions; upper/lower bounds in frontier-type measurements. Approximate methods for routine use are derived and ar

  20. Topics in Bayesian statistics and maximum entropy

    International Nuclear Information System (INIS)

    Mutihac, R.; Cicuttin, A.; Cerdeira, A.; Stanciulescu, C.

    1998-12-01

    Notions of Bayesian decision theory and maximum entropy methods are reviewed with particular emphasis on probabilistic inference and Bayesian modeling. The axiomatic approach is considered as the best justification of Bayesian analysis and maximum entropy principle applied in natural sciences. Particular emphasis is put on solving the inverse problem in digital image restoration and Bayesian modeling of neural networks. Further topics addressed briefly include language modeling, neutron scattering, multiuser detection and channel equalization in digital communications, genetic information, and Bayesian court decision-making. (author)

  1. Computational approach for a pair of bubble coalescence process

    International Nuclear Information System (INIS)

    Nurul Hasan; Zalinawati binti Zakaria

    2011-01-01

    The coalescence of bubbles has great value in mineral recovery and the oil industry. In this paper, two co-axial bubbles rising in a cylinder are modelled to study the coalescence of bubbles for four computational experimental test cases. The Reynolds number (Re) is chosen between 8.50 and 10, the Bond number Bo ∼ 4.25-50, and the Morton number M ∼ 0.0125-14.7. The viscosity ratio (μ_r) and density ratio (ρ_r) of liquid to bubble are kept constant (100 and 850, respectively). It was found that the Bo number has a significant effect on the coalescence process for constant Re, μ_r and ρ_r. The bubble-bubble distance over time was validated against published experimental data. The results show that the VOF approach can be used to model these phenomena accurately. The surface tension was changed to alter Bo, and the density of the fluids was changed to alter Re and M, keeping μ_r and ρ_r the same. It was found that for lower Bo, bubble coalescence is slower and the pocket at the lower part of the leading bubble is less concave (towards downward), which is supported by the experimental data.
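    The dimensionless groups quoted above have standard definitions, sketched below. The fluid properties are illustrative values (roughly an air bubble rising in water), not the paper's simulation parameters:

    ```python
    # Standard definitions of the dimensionless groups; property values are
    # illustrative (roughly an air bubble in water), not the paper's cases.
    rho_l, rho_g = 1000.0, 1.18      # liquid / gas density, kg/m^3
    mu_l = 1.0e-3                    # liquid dynamic viscosity, Pa*s
    sigma = 0.072                    # surface tension, N/m
    d = 2.0e-3                       # bubble diameter, m
    u = 0.25                         # rise velocity, m/s
    g = 9.81                         # gravitational acceleration, m/s^2
    drho = rho_l - rho_g

    Re = rho_l * u * d / mu_l                        # inertial vs viscous forces
    Bo = drho * g * d**2 / sigma                     # buoyancy vs surface tension
    Mo = g * mu_l**4 * drho / (rho_l**2 * sigma**3)  # fluid-property group

    print(f"Re = {Re:.1f}, Bo = {Bo:.3f}, Mo = {Mo:.2e}")
    ```

    Note how, consistent with the abstract's procedure, Bo can be varied through the surface tension alone while Re and M respond to density changes.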

  2. 12th Brazilian Meeting on Bayesian Statistics

    CERN Document Server

    Louzada, Francisco; Rifo, Laura; Stern, Julio; Lauretto, Marcelo

    2015-01-01

    Through refereed papers, this volume focuses on the foundations of the Bayesian paradigm; their comparison to objectivistic or frequentist Statistics counterparts; and the appropriate application of Bayesian foundations. This research in Bayesian Statistics is applicable to data analysis in biostatistics, clinical trials, law, engineering, and the social sciences. EBEB, the Brazilian Meeting on Bayesian Statistics, is held every two years by the ISBrA, the International Society for Bayesian Analysis, one of the most active chapters of the ISBA. The 12th meeting took place March 10-14, 2014 in Atibaia. Interest in foundations of inductive Statistics has grown recently in accordance with the increasing availability of Bayesian methodological alternatives. Scientists need to deal with the ever more difficult choice of the optimal method to apply to their problem. This volume shows how Bayes can be the answer. The examination and discussion on the foundations work towards the goal of proper application of Bayesia...

  3. Bayesian tomography and integrated data analysis in fusion diagnostics

    Science.gov (United States)

    Li, Dong; Dong, Y. B.; Deng, Wei; Shi, Z. B.; Fu, B. Z.; Gao, J. M.; Wang, T. B.; Zhou, Yan; Liu, Yi; Yang, Q. W.; Duan, X. R.

    2016-11-01

    In this article, a Bayesian tomography method using a non-stationary Gaussian process as a prior has been introduced. The Bayesian formalism allows quantities which bear uncertainty to be expressed in probabilistic form, so that the uncertainty of a final solution can be fully resolved from the confidence interval of a posterior probability. Moreover, a consistency check of that solution can be performed by checking whether the misfits between predicted and measured data are reasonably within an assumed data error. In particular, the accuracy of reconstructions is significantly improved by using the non-stationary Gaussian process, which can adapt to the varying smoothness of the emission distribution. The implementation of this method on a soft X-ray diagnostic on HL-2A has been used to explore relevant physics in equilibrium and MHD instability modes. This project is carried out within a large-scale inference framework, aiming at an integrated analysis of heterogeneous diagnostics.

  4. Prior approval: the growth of Bayesian methods in psychology.

    Science.gov (United States)

    Andrews, Mark; Baguley, Thom

    2013-02-01

    Within the last few years, Bayesian methods of data analysis in psychology have proliferated. In this paper, we briefly review the history of the Bayesian approach to statistics, and consider the implications that Bayesian methods have for the theory and practice of data analysis in psychology.

  5. A Bayesian Analysis of the Flood Frequency Hydrology Concept

    Science.gov (United States)

    2016-02-01

    ERDC/CHL CHETN-X-1 February 2016 Approved for public release; distribution is unlimited. A Bayesian Analysis of the Flood Frequency Hydrology ... flood frequency hydrology concept as a formal probabilistic-based means by which to coherently combine and also evaluate the worth of different types ... and development. INTRODUCTION: Merz and Blöschl (2008a,b) proposed the concept of flood frequency hydrology, which emphasizes the importance of

  6. Sensitivity analysis in Gaussian Bayesian networks using a symbolic-numerical technique

    International Nuclear Information System (INIS)

    Castillo, Enrique; Kjaerulff, Uffe

    2003-01-01

    The paper discusses the problem of sensitivity analysis in Gaussian Bayesian networks. The algebraic structure of the conditional means and variances, as rational functions involving linear and quadratic functions of the parameters, is used to simplify the sensitivity analysis. In particular, the probabilities of conditional variables exceeding given values and related probabilities are analyzed. Two examples of application are used to illustrate all the concepts and methods.
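    The conditional means and variances whose parameter-dependence the paper analyzes come from standard Gaussian conditioning, which is easy to reproduce. The three-variable network below is an illustrative example, not one from the paper:

    ```python
    import numpy as np

    # Conditioning a multivariate Gaussian: the closed-form means and
    # variances whose sensitivity to parameters the paper studies.
    # The numbers here are illustrative.
    mu = np.array([0.0, 1.0, 2.0])
    Sigma = np.array([[1.0, 0.5, 0.2],
                      [0.5, 2.0, 0.3],
                      [0.2, 0.3, 1.5]])

    # Condition variables (0, 1) on evidence x2 = 3.0 for variable 2.
    idx_a, idx_b = [0, 1], [2]
    x_b = np.array([3.0])

    S_aa = Sigma[np.ix_(idx_a, idx_a)]
    S_ab = Sigma[np.ix_(idx_a, idx_b)]
    S_bb = Sigma[np.ix_(idx_b, idx_b)]

    K = S_ab @ np.linalg.inv(S_bb)
    mu_cond = mu[idx_a] + K @ (x_b - mu[idx_b])     # linear in the evidence
    Sigma_cond = S_aa - K @ S_ab.T                  # independent of the evidence
    print(mu_cond, np.diag(Sigma_cond))
    ```

    The conditional mean is linear in the evidence and the conditional covariance does not depend on it at all; the rational dependence on the covariance parameters is what makes the symbolic-numerical sensitivity analysis tractable.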

  7. Dynamics of gas cell coalescence during baking expansion of leavened dough.

    Science.gov (United States)

    Miś, Antoni; Nawrocka, Agnieszka; Lamorski, Krzysztof; Dziki, Dariusz

    2018-01-01

    The investigation of the dynamics of gas cell coalescence, i.e. a phenomenon that deteriorates the homogeneity of the cellular structure of bread crumb, was carried out by simultaneously measuring the dough volume, pressure, and viscosity. It was demonstrated that, during the baking expansion of chemically leavened wheat flour dough, the maximum growth rate of the gas cell radius determined from the ratio of pressure exerted by the expanded dough to its viscosity was on average four-fold lower than that calculated from volume changes in the gas phase of the dough. Such a high discrepancy was interpreted as a result of the course of coalescence, and a formula for determination of its rate was developed. The coalescence rate in the initial baking expansion phase had negative values, indicating nucleation of newly formed gas cells, which increased the number of gas cells by up to 8%. In the next baking expansion phase, the coalescence rate started to exhibit positive values, reflecting dominance of the coalescence phenomenon over nucleation. The maximum coalescence rates indicate that, during the period of the most intensive dough expansion, the number of gas cells decreased by 2-3% within one second. At the end of the formation of bread crumb, the number of gas cells declined by 55-67% in comparison with the initial value. The correctness of the results was positively verified using X-ray micro-computed tomography. The developed method can be a useful tool for more profound exploration of the coalescence phenomenon at various stages of evolution of the cellular structure and its determinants, which may contribute to future development of more effective methods for improving the texture and sensory quality of bread crumb. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Experimental investigation of void coalescence in a dual phase steel using X-ray tomography

    International Nuclear Information System (INIS)

    Landron, C.; Bouaziz, O.; Maire, E.; Adrien, J.

    2013-01-01

    In situ tensile tests were carried out during X-ray microtomography imaging of a smooth and a notched specimen of dual phase steel. The void coalescence was first qualitatively observed and quantitative data concerning this damage step was then acquired. The void coalescence criteria of Brown and Embury and of Thomason were then tested against the experimental data at both the macroscopic and local levels. Although macroscopic implementation of the criteria gave acceptable results, the local approach was probably closest to the real nature of void coalescence, because it takes into account local coalescence events observed experimentally before final fracture. The correlation between actual coalescing couples of cavities and local implementation of the two criteria showed that the Thomason criterion is probably the best adapted to predict the local coalescence events in the case of the material studied

  9. Dating ancient Chinese celadon porcelain by neutron activation analysis and bayesian classification

    International Nuclear Information System (INIS)

    Xie Guoxi; Feng Songlin; Feng Xiangqian; Zhu Jihao; Yan Lingtong; Li Li

    2009-01-01

    Dating ancient Chinese porcelain is one of the most important and difficult problems in the porcelain archaeology field. Eighteen elements in the bodies of ancient celadon porcelains fired from the Southern Song to Yuan period (AD 1127-1368) and the Ming dynasty (AD 1368-1644), including La, Sm, U, Ce, etc., were determined by neutron activation analysis (NAA). After the outliers of the experimental data were excluded and multivariate normality was tested, Bayesian classification was used for dating of 165 ancient celadon porcelain samples. The results show that 98.2% of the ancient celadon porcelain samples are classified correctly. This means that NAA and Bayesian classification are very useful for dating ancient porcelain. (authors)

  10. bspmma: An R Package for Bayesian Semiparametric Models for Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Deborah Burr

    2012-07-01

    Full Text Available We introduce an R package, bspmma, which implements a Dirichlet-based random effects model specific to meta-analysis. In meta-analysis, when combining effect estimates from several heterogeneous studies, it is common to use a random-effects model. The usual frequentist or Bayesian models specify a normal distribution for the true effects. However, in many situations, the effect distribution is not normal, e.g., it can have thick tails, be skewed, or be multi-modal. A Bayesian nonparametric model based on mixtures of Dirichlet process priors has been proposed in the literature, for the purpose of accommodating the non-normality. We review this model and then describe a competitor, a semiparametric version which has the feature that it allows for a well-defined centrality parameter convenient for determining whether the overall effect is significant. This second Bayesian model is based on a different version of the Dirichlet process prior, and we call it the "conditional Dirichlet model". The package contains functions to carry out analyses based on either the ordinary or the conditional Dirichlet model, functions for calculating certain Bayes factors that provide a check on the appropriateness of the conditional Dirichlet model, and functions that enable an empirical Bayes selection of the precision parameter of the Dirichlet process. We illustrate the use of the package on two examples, and give an interpretation of the results in these two different scenarios.

  11. Efficient Bayesian hierarchical functional data analysis with basis function approximations using Gaussian-Wishart processes.

    Science.gov (United States)

    Yang, Jingjing; Cox, Dennis D; Lee, Jong Soo; Ren, Peng; Choi, Taeryon

    2017-12-01

    Functional data are defined as realizations of random functions (mostly smooth functions) varying over a continuum, which are usually collected on discretized grids with measurement errors. In order to accurately smooth noisy functional observations and deal with the issue of high-dimensional observation grids, we propose a novel Bayesian method based on the Bayesian hierarchical model with a Gaussian-Wishart process prior and basis function representations. We first derive an induced model for the basis-function coefficients of the functional data, and then use this model to conduct posterior inference through Markov chain Monte Carlo methods. Compared to the standard Bayesian inference that suffers serious computational burden and instability in analyzing high-dimensional functional data, our method greatly improves the computational scalability and stability, while inheriting the advantage of simultaneously smoothing raw observations and estimating the mean-covariance functions in a nonparametric way. In addition, our method can naturally handle functional data observed on random or uncommon grids. Simulation and real studies demonstrate that our method produces similar results to those obtainable by the standard Bayesian inference with low-dimensional common grids, while efficiently smoothing and estimating functional data with random and high-dimensional observation grids when the standard Bayesian inference fails. In conclusion, our method can efficiently smooth and estimate high-dimensional functional data, providing one way to resolve the curse of dimensionality for Bayesian functional data analysis with Gaussian-Wishart processes. © 2017, The International Biometric Society.
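    The first step this abstract describes — replacing a curve observed on a dense grid by a handful of basis-function coefficients — can be sketched with ordinary least squares. The Fourier-plus-trend basis and the synthetic curve below are our own illustrative choices, not the paper's Gaussian-Wishart machinery:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # A noisy functional observation on a dense grid (the "high-dimensional
    # observation grid" of the abstract); curve and noise level are made up.
    t = np.linspace(0.0, 1.0, 500)
    truth = np.sin(2*np.pi*t) + 0.3*t
    y = truth + rng.normal(0.0, 0.2, t.size)

    # Basis-function representation: 6 coefficients stand in for 500 grid
    # values. A Fourier-plus-trend basis is our illustrative choice.
    B = np.column_stack([np.ones_like(t), t,
                         np.sin(2*np.pi*t), np.cos(2*np.pi*t),
                         np.sin(4*np.pi*t), np.cos(4*np.pi*t)])
    coef, *_ = np.linalg.lstsq(B, y, rcond=None)
    fit = B @ coef

    rmse = np.sqrt(np.mean((fit - truth)**2))
    print(f"{B.shape[1]} coefficients summarize {t.size} grid values; "
          f"RMSE vs truth {rmse:.3f}")
    ```

    The paper's contribution is to do this step inside a hierarchical Bayesian model, so that the coefficients, the mean function, and the covariance are inferred jointly rather than by a single least-squares fit.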

  12. Driving Style Analysis Using Primitive Driving Patterns With Bayesian Nonparametric Approaches

    OpenAIRE

    Wang, Wenshuo; Xi, Junqiang; Zhao, Ding

    2017-01-01

    Analysis and recognition of driving styles are profoundly important to intelligent transportation and vehicle calibration. This paper presents a novel driving style analysis framework using the primitive driving patterns learned from naturalistic driving data. In order to achieve this, first, a Bayesian nonparametric learning method based on a hidden semi-Markov model (HSMM) is introduced to extract primitive driving patterns from time series driving data without prior knowledge of the number...

  13. Coalescence aspects of cobalt nanoparticles during in situ high-temperature annealing

    NARCIS (Netherlands)

    Palasantzas, G; Vystavel, T; Koch, SA; De Hosson, JTM

    2006-01-01

    In this work we investigate the coalescence aspects of Co nanoparticles. It was observed that nanoparticles in contact with the substrate are relatively immobile, whereas those on top of other Co particles can rearrange themselves during high-temperature annealing and further coalesce. Indeed,

  14. A Bayesian analysis of the unit root in real exchange rates

    NARCIS (Netherlands)

    P.C. Schotman (Peter); H.K. van Dijk (Herman)

    1991-01-01

    textabstractWe propose a posterior odds analysis of the hypothesis of a unit root in real exchange rates. From a Bayesian viewpoint the random walk hypothesis for real exchange rates is a posteriori as probable as a stationary AR(1) process for four out of eight time series investigated. The French

  15. An introduction to Bayesian statistics in health psychology.

    Science.gov (United States)

    Depaoli, Sarah; Rus, Holly M; Clifton, James P; van de Schoot, Rens; Tiemensma, Jitske

    2017-09-01

    The aim of the current article is to provide a brief introduction to Bayesian statistics within the field of health psychology. Bayesian methods are increasing in prevalence in applied fields, and they have been shown in simulation research to improve the estimation accuracy of structural equation models, latent growth curve (and mixture) models, and hierarchical linear models. Likewise, Bayesian methods can be used with small sample sizes since they do not rely on large sample theory. In this article, we discuss several important components of Bayesian statistics as they relate to health-based inquiries. We discuss the incorporation and impact of prior knowledge into the estimation process and the different components of the analysis that should be reported in an article. We present an example implementing Bayesian estimation in the context of blood pressure changes after participants experienced an acute stressor. We conclude with final thoughts on the implementation of Bayesian statistics in health psychology, including suggestions for reviewing Bayesian manuscripts and grant proposals. We have also included an extensive amount of online supplementary material to complement the content presented here, including Bayesian examples using many different software programmes and an extensive sensitivity analysis examining the impact of priors.
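    The kind of prior-to-posterior update the article walks through can be shown with the simplest conjugate case: a normal prior on a mean blood-pressure change with known observation noise. All numbers below are illustrative, not data from the cited stressor study:

    ```python
    import numpy as np

    # Conjugate normal-normal update for the mean blood-pressure change
    # after an acute stressor. All numbers are illustrative assumptions.
    prior_mean, prior_sd = 5.0, 10.0   # weakly informative prior, mmHg
    sigma = 8.0                        # assumed known observation SD, mmHg

    data = np.array([12.0, 7.0, 15.0, 9.0, 11.0, 14.0, 8.0, 10.0])
    n, xbar = data.size, data.mean()

    # Posterior precision is the sum of prior and data precisions; the
    # posterior mean is the precision-weighted average of prior and sample.
    post_var = 1.0 / (1.0/prior_sd**2 + n/sigma**2)
    post_mean = post_var * (prior_mean/prior_sd**2 + n*xbar/sigma**2)

    print(f"posterior: mean {post_mean:.2f} mmHg, sd {post_var**0.5:.2f}")
    ```

    Rerunning this with different prior settings is exactly the kind of sensitivity analysis the article recommends reporting.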

  16. Etude expérimentale des phénomènes de coalescence dans les systèmes bulles-gouttes Experimental Study of Coalescence Phenomena in Bubble-Drop Systems

    Directory of Open Access Journals (Sweden)

    Roques H.

    2006-11-01

    Full Text Available In the course of a study of water-hydrocarbon separation by flotation, we were led to study experimentally bubble-drop, drop-drop and bubble-bubble coalescence in the three-phase water-air-kerosene system. The four experimental set-ups described enabled us to study the static aspects (structure of the bubble-drop complex that forms) and the dynamic aspects (mean bubble-drop, drop-drop and bubble-bubble coalescence times), and to examine the influence of compounds transferable from one phase to another on these mean coalescence times. From the static point of view, the stable configuration of the air bubble-kerosene drop complex corresponds to the formation, at the water-air interface, of a hydrocarbon film surrounding the air bubble. By contrast, the attachment of an air bubble to the periphery of a kerosene drop, in the classical arrangement of solids flotation, is observed here only rarely and always transiently. From a kinetic point of view, it is observed that: - bubble-bubble and drop-drop coalescence is always promoted (the mean coalescence times decrease) when a compound transferable into the other liquid phase is introduced into the gas phase or into one of the liquid phases; - bubble-drop coalescence is promoted by the presence in the gas phase of a compound transferable into the continuous aqueous phase, or by the presence in the dispersed liquid phase of a compound transferable into the continuous aqueous phase; - bubble-drop coalescence is hindered by the presence in the continuous aqueous phase of a compound transferable to the drops constituting the dispersed liquid phase.

  17. Multimode multidrop serial coalescence effects during condensation on hierarchical superhydrophobic surfaces.

    Science.gov (United States)

    Rykaczewski, Konrad; Paxson, Adam T; Anand, Sushant; Chen, Xuemei; Wang, Zuankai; Varanasi, Kripa K

    2013-01-22

    The prospect of enhancing the condensation rate by decreasing the maximum drop departure diameter significantly below the capillary length through spontaneous drop motion has generated significant interest in condensation on superhydrophobic surfaces (SHS). The mobile coalescence leading to spontaneous drop motion was initially reported to occur only on hierarchical SHS, consisting of both nanoscale and microscale topological features. However, subsequent studies have shown that mobile coalescence also occurs on solely nanostructured SHS. Thus, recent focus has been on understanding the condensation process on nanostructured surfaces rather than on hierarchical SHS. In this work, we investigate the impact of microscale topography of hierarchical SHS on the droplet coalescence dynamics and wetting states during the condensation process. We show that isolated mobile and immobile coalescence between two drops, almost exclusively focused on in previous studies, are rare. We identify several new droplet shedding modes, which are aided by tangential propulsion of mobile drops. These droplet shedding modes consist of multiple droplets merging during serial coalescence events, which culminate in the formation of a drop that either departs or remains anchored to the surface. We directly relate post-merging drop adhesion to the formation of drops in nanoscale as well as microscale Wenzel and Cassie-Baxter wetting states. We identify the optimal microscale feature spacing of the hierarchical SHS, which promotes departure of the highest number of microdroplets. This optimal surface architecture consists of microscale features spaced close enough to enable transition of larger droplets into the micro-Cassie state yet, at the same time, provides sufficient spacing in-between the features for occurrence of mobile coalescence.

  18. Prediction of crack coalescence of steam generator tubes in nuclear power plants

    International Nuclear Information System (INIS)

    Abou-Hanna, Jeries; McGreevy, Timothy E.; Majumdar, Saurin

    2004-01-01

    Prediction of failure pressures of cracked steam generator tubes of nuclear power plants is an important ingredient in scheduling inspection and repair of tubes. Prediction is usually based on nondestructive evaluation (NDE) of cracks. NDE often reveals two neighboring cracks. If the cracks interact, the tube pressure under which the ligament between the two cracks fails could be much lower than the critical burst pressure of an individual equivalent crack. The ability to accurately predict the ligament failure pressure, called "coalescence pressure", is important. The failure criterion was established by nonlinear finite element model (FEM) analyses of coalescence of two 100% through-wall collinear cracks. The ligament failure is precipitated by local instability of the ligament under plane strain conditions. As a result of this local instability, the ligament thickness in the radial direction decreases abruptly with pressure. Good correlation of FEM analysis results with experimental data obtained at Argonne National Laboratory's Energy Technology Division demonstrated that nonlinear FEM analyses are capable of predicting the coalescence pressure accurately for 100% through-wall cracks. This failure criterion and FEA work have been extended to axial cracks of varying ligament width, crack length, and cases where cracks are offset by axial or circumferential ligaments.

  19. Partial coalescence as a tool to control sensory perception of emulsions

    NARCIS (Netherlands)

    Benjamins, J.; Vingerhoeds, M.H.; Zoet, F.D.; Hoog, de E.H.A.; Aken, van G.A.

    2009-01-01

    This study evaluates the role of partial coalescence of whey protein-stabilized emulsions on sensory perception. The selection of fats was restricted to vegetable fats that are essentially melted at oral temperatures. The sensitivity to partial coalescence was controlled by a variation in the fat

  20. Bayesian switching factor analysis for estimating time-varying functional connectivity in fMRI.

    Science.gov (United States)

    Taghia, Jalil; Ryali, Srikanth; Chen, Tianwen; Supekar, Kaustubh; Cai, Weidong; Menon, Vinod

    2017-07-15

    There is growing interest in understanding the dynamical properties of functional interactions between distributed brain regions. However, robust estimation of temporal dynamics from functional magnetic resonance imaging (fMRI) data remains challenging due to limitations in extant multivariate methods for modeling time-varying functional interactions between multiple brain areas. Here, we develop a Bayesian generative model for fMRI time-series within the framework of hidden Markov models (HMMs). The model is a dynamic variant of the static factor analysis model (Ghahramani and Beal, 2000). We refer to this model as Bayesian switching factor analysis (BSFA) as it integrates factor analysis into a generative HMM in a unified Bayesian framework. In BSFA, brain dynamic functional networks are represented by latent states which are learnt from the data. Crucially, BSFA is a generative model which estimates the temporal evolution of brain states and the transition probabilities between states as a function of time. An attractive feature of BSFA is the automatic determination of the number of latent states via Bayesian model selection arising from penalization of excessively complex models. Key features of BSFA are validated using extensive simulations on carefully designed synthetic data. We further validate BSFA using fingerprint analysis of multisession resting-state fMRI data from the Human Connectome Project (HCP). Our results show that modeling temporal dependencies in the generative model of BSFA results in improved fingerprinting of individual participants. Finally, we apply BSFA to elucidate the dynamic functional organization of the salience, central-executive, and default mode networks, three core neurocognitive systems with a central role in cognitive and affective information processing (Menon, 2011). Across two HCP sessions, we demonstrate a high level of dynamic interactions between these networks and determine that the salience network has the highest temporal

  1. Bayesian Integrated Data Analysis of Fast-Ion Measurements by Velocity-Space Tomography

    DEFF Research Database (Denmark)

    Salewski, M.; Nocente, M.; Jacobsen, A.S.

    2018-01-01

    Bayesian integrated data analysis combines measurements from different diagnostics to jointly measure plasma parameters of interest such as temperatures, densities, and drift velocities. Integrated data analysis of fast-ion measurements has long been hampered by the complexity of the strongly non...... framework. The implementation for different types of diagnostics as well as the uncertainties are discussed, and we highlight the importance of integrated data analysis of all available detectors....

  2. BURD, Bayesian estimation in data analysis of Probabilistic Safety Assessment

    International Nuclear Information System (INIS)

    Jang, Seung-cheol; Park, Jin-Kyun

    2008-01-01

    1 - Description of program or function: BURD (Bayesian Update for Reliability Data) is a simple code that can be used to obtain a Bayesian estimate easily in the data analysis of PSA (Probabilistic Safety Assessment). Following Bayes' theorem, the code facilitates calculation of the posterior distribution given the prior and likelihood (evidence) distributions. The distinctive features of BURD are the following: - The input consists of the prior and likelihood functions, which can be chosen from the built-in statistical distributions. - The available prior distributions are uniform, Jeffreys noninformative, beta, gamma, and log-normal, the ones most frequently used in performing PSA. - For the likelihood function, the user can choose from four statistical distributions: beta, gamma, binomial, and Poisson. - A simultaneous graphic display of the prior and posterior distributions facilitates an intuitive interpretation of the results. - Export facilities for the graphic display screen and text-type outputs are available. - Three options for treating zero-evidence data are provided. - Automatic setup of an integral calculus section for Bayesian updating. 2 - Methods: The posterior distribution is estimated in accordance with Bayes' theorem, given the prior and likelihood (evidence) distributions. 3 - Restrictions on the complexity of the problem: The accuracy of the results depends on the numerical error of the statistical function library in MS Excel.
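    The prior/likelihood pairs BURD offers (e.g., a beta prior with binomial evidence, or a gamma prior with Poisson evidence) are conjugate, so the update has a closed form. A minimal sketch of the beta-binomial case, with invented example numbers (not taken from the record):

    ```python
    # Hedged sketch: the kind of conjugate Bayesian update a tool like BURD
    # automates.  Assumed example values: a Beta(1, 1) (uniform) prior on a
    # failure probability, updated with binomial evidence of 2 failures in
    # 100 demands.
    a_prior, b_prior = 1.0, 1.0      # uniform prior, Beta(1, 1)
    failures, demands = 2, 100       # binomial likelihood (evidence)

    # Beta prior + binomial likelihood -> Beta posterior (Bayes' theorem in
    # closed form thanks to conjugacy; no numerical integration needed)
    a_post = a_prior + failures
    b_post = b_prior + (demands - failures)

    posterior_mean = a_post / (a_post + b_post)
    print(a_post, b_post, round(posterior_mean, 4))  # 3.0 99.0 0.0294
    ```

    For non-conjugate pairs the posterior has no closed form, which is why BURD sets up a numerical integration section automatically.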

  3. A default Bayesian hypothesis test for ANOVA designs

    NARCIS (Netherlands)

    Wetzels, R.; Grasman, R.P.P.P.; Wagenmakers, E.J.

    2012-01-01

    This article presents a Bayesian hypothesis test for analysis of variance (ANOVA) designs. The test is an application of standard Bayesian methods for variable selection in regression models. We illustrate the effect of various g-priors on the ANOVA hypothesis test. The Bayesian test for ANOVA

  4. Analysis of COSIMA spectra: Bayesian approach

    Directory of Open Access Journals (Sweden)

    H. J. Lehto

    2015-06-01

    secondary ion mass spectrometer (TOF-SIMS) spectra. The method is applied to the COmetary Secondary Ion Mass Analyzer (COSIMA) TOF-SIMS mass spectra, where the analysis can be broken into subgroups of lines close to integer mass values. The effects of the instrumental dead time are discussed in a new way. The method finds the joint probability density functions of the measured line parameters (number of lines, and their widths, peak amplitudes, integrated amplitudes, and positions). In the case of two or more lines, these distributions can take complex forms. The derived line parameters can be used to further calibrate the mass scaling of TOF-SIMS and to feed the results into other analysis methods such as multivariate analyses of spectra. We intend to use the method, first, as a comprehensive tool to perform quantitative analysis of spectra and, second, as a fast tool for studying interesting targets for obtaining additional TOF-SIMS measurements of the sample, a property unique to COSIMA. Finally, we point out that the Bayesian method can be thought of as a means to solve inverse problems, but with forward calculations only, with no iterative corrections or other manipulation of the observed data.

  5. On hydrogen-induced plastic flow localization during void growth and coalescence

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, D.C.; Sofronis, P. [Department of Mechanical Science and Engineering, University of Illinois at Urbana-Champaign, 1206 West Green Street, Urbana, IL 61801 (United States); Dodds, R.H. Jr. [Department of Civil and Environmental Engineering, University of Illinois at Urbana-Champaign, 205 North Mathews Avenue, Urbana, IL 61801 (United States)

    2007-11-15

    Hydrogen-enhanced localized plasticity (HELP) is recognized as a viable mechanism of hydrogen embrittlement. A possible way by which the HELP mechanism can bring about macroscopic material failure is through hydrogen-induced accelerated void growth and coalescence. Assuming a periodic array of spherical voids loaded axisymmetrically, we investigate the hydrogen effect on the occurrence of plastic flow localization upon void growth and its dependence on macroscopic stress triaxiality. Under a macroscopic stress triaxiality equal to 1 and prior to void coalescence, the finite element calculation results obtained with material data relevant to A533B steel indicate that a hydrogen-induced localized shear band forms at an angle of about 45° from the axis of symmetry. At triaxiality equal to 3, void coalescence takes place by accelerated hydrogen-induced localization of plasticity mainly in the ligament between the voids. Lastly, we discuss the numerical results within the context of experimental observations on void growth and coalescence in the presence of hydrogen. (author)

  6. Application of parametric equations of motion to study the resonance coalescence in H2(+).

    Science.gov (United States)

    Kalita, Dhruba J; Gupta, Ashish K

    2012-12-07

    Recently, the occurrence of a coalescence point was reported in H(2)(+) undergoing multiphoton dissociation in a strong laser field. We have applied parametric equations of motion and the smooth exterior scaling method to study the coalescence phenomenon of H(2)(+). The advantage of this method is that one can easily trace the different states as they change with the field parameters. It was reported earlier that in the parameter space only two bound states coalesce [R. Lefebvre, O. Atabek, M. Sindelka, and N. Moiseyev, Phys. Rev. Lett. 103, 123003 (2009)]. However, we find that increasing the accuracy of the calculation leads to coalescence between resonance states originating from the bound and the continuum states. We also report many other coalescence points.

  7. A menu-driven software package of Bayesian nonparametric (and parametric) mixed models for regression analysis and density estimation.

    Science.gov (United States)

    Karabatsos, George

    2017-02-01

    Most of applied statistics involves regression analysis of data. In practice, it is important to specify a regression model that has minimal assumptions which are not violated by data, to ensure that statistical inferences from the model are informative and not misleading. This paper presents a stand-alone and menu-driven software package, Bayesian Regression: Nonparametric and Parametric Models, constructed from MATLAB Compiler. Currently, this package gives the user a choice from 83 Bayesian models for data analysis. They include 47 Bayesian nonparametric (BNP) infinite-mixture regression models; 5 BNP infinite-mixture models for density estimation; and 31 normal random effects models (HLMs), including normal linear models. Each of the 78 regression models handles either a continuous, binary, or ordinal dependent variable, and can handle multi-level (grouped) data. All 83 Bayesian models can handle the analysis of weighted observations (e.g., for meta-analysis), and the analysis of left-censored, right-censored, and/or interval-censored data. Each BNP infinite-mixture model has a mixture distribution assigned one of various BNP prior distributions, including priors defined by either the Dirichlet process, Pitman-Yor process (including the normalized stable process), beta (two-parameter) process, normalized inverse-Gaussian process, geometric weights prior, dependent Dirichlet process, or the dependent infinite-probits prior. The software user can mouse-click to select a Bayesian model and perform data analysis via Markov chain Monte Carlo (MCMC) sampling. After the sampling completes, the software automatically opens text output that reports MCMC-based estimates of the model's posterior distribution and model predictive fit to the data. Additional text and/or graphical output can be generated by mouse-clicking other menu options. This includes output of MCMC convergence analyses, and estimates of the model's posterior predictive distribution, for selected

  8. Monte Carlo Algorithms for a Bayesian Analysis of the Cosmic Microwave Background

    Science.gov (United States)

    Jewell, Jeffrey B.; Eriksen, H. K.; O'Dwyer, I. J.; Wandelt, B. D.; Gorski, K.; Knox, L.; Chu, M.

    2006-01-01

    A viewgraph presentation is given reviewing the Bayesian approach to Cosmic Microwave Background (CMB) analysis, its numerical implementation with Gibbs sampling, a summary of the application to WMAP I, and work in progress on generalizations to polarization, foregrounds, asymmetric beams, and 1/f noise.

  9. Flow-induced coalescence in polydisperse systems

    Czech Academy of Sciences Publication Activity Database

    Fortelný, Ivan; Jůza, Josef

    2014-01-01

    Roč. 299, č. 10 (2014), s. 1213-1219 ISSN 1438-7492 R&D Projects: GA ČR GAP106/11/1069 Institutional support: RVO:61389013 Keywords : coalescence * polymer blends * polydisperse droplets Subject RIV: BK - Fluid Dynamics Impact factor: 2.661, year: 2014

  10. Search for gravitational waves from primordial black hole binary coalescences in the galactic halo

    International Nuclear Information System (INIS)

    Abbott, B.; Anderson, S.B.; Araya, M.; Armandula, H.; Asiri, F.; Barish, B.C.; Barnes, M.; Barton, M.A.; Bhawal, B.; Billingsley, G.; Black, E.; Blackburn, K.; Bogue, L.; Bork, R.; Brown, D.A.; Busby, D.; Cardenas, L.; Chandler, A.; Chapsky, J.; Charlton, P.

    2005-01-01

    We use data from the second science run of the LIGO gravitational-wave detectors to search for the gravitational waves from primordial black hole binary coalescence with component masses in the range 0.2-1.0 M⊙. The analysis requires a signal to be found in the data from both LIGO observatories, according to a set of coincidence criteria. No inspiral signals were found. Assuming a spherical halo with core radius 5 kpc extending to 50 kpc containing nonspinning black holes with masses in the range 0.2-1.0 M⊙, we place an observational upper limit on the rate of primordial black hole coalescence of 63 per year per Milky Way halo (MWH) with 90% confidence.

  11. A Bayesian Analysis of a Randomized Clinical Trial Comparing Antimetabolite Therapies for Non-Infectious Uveitis.

    Science.gov (United States)

    Browne, Erica N; Rathinam, Sivakumar R; Kanakath, Anuradha; Thundikandy, Radhika; Babu, Manohar; Lietman, Thomas M; Acharya, Nisha R

    2017-02-01

    To conduct a Bayesian analysis of a randomized clinical trial (RCT) for non-infectious uveitis using expert opinion as a subjective prior belief. A RCT was conducted to determine which antimetabolite, methotrexate or mycophenolate mofetil, is more effective as an initial corticosteroid-sparing agent for the treatment of intermediate, posterior, and pan-uveitis. Before the release of trial results, expert opinion on the relative effectiveness of these two medications was collected via online survey. Members of the American Uveitis Society executive committee were invited to provide an estimate for the relative decrease in efficacy with a 95% credible interval (CrI). A prior probability distribution was created from experts' estimates. A Bayesian analysis was performed using the constructed expert prior probability distribution and the trial's primary outcome. A total of 11 of the 12 invited uveitis specialists provided estimates. Eight of 11 experts (73%) believed mycophenolate mofetil is more effective. The group prior belief was that the odds of treatment success for patients taking mycophenolate mofetil were 1.4-fold the odds of those taking methotrexate (95% CrI 0.03-45.0). The odds of treatment success with mycophenolate mofetil compared to methotrexate was 0.4 from the RCT (95% confidence interval 0.1-1.2) and 0.7 (95% CrI 0.2-1.7) from the Bayesian analysis. A Bayesian analysis combining expert belief with the trial's result did not indicate preference for one drug. However, the wide credible interval leaves open the possibility of a substantial treatment effect. This suggests clinical equipoise necessary to allow a larger, more definitive RCT.
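    The combination described in this record can be approximated in closed form if both the expert prior and the trial result are treated as normal distributions on the log-odds-ratio scale. The sketch below uses the interval endpoints quoted in the abstract, but the symmetric-normal treatment is our simplifying assumption, so it only roughly tracks the published posterior (odds ratio 0.7):

    ```python
    import math

    # Hedged sketch (not the authors' code): combine an expert prior with a
    # trial result on the log-odds-ratio scale, both approximated as normal.
    # Treating the quoted 95% intervals as symmetric normal intervals on the
    # log scale is our assumption, not the paper's method.

    def normal_from_interval(point, lo, hi):
        """Mean and sd of a normal on the log scale, from a point estimate
        and a 95% interval (assumes symmetry on the log scale)."""
        mu = math.log(point)
        sd = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        return mu, sd

    mu0, sd0 = normal_from_interval(1.4, 0.03, 45.0)  # expert prior (abstract)
    mu1, sd1 = normal_from_interval(0.4, 0.1, 1.2)    # trial result (abstract)

    # Precision-weighted combination (normal-normal conjugacy)
    w0, w1 = 1 / sd0**2, 1 / sd1**2
    mu_post = (w0 * mu0 + w1 * mu1) / (w0 + w1)
    sd_post = (w0 + w1) ** -0.5

    print(round(math.exp(mu_post), 2))  # posterior odds ratio, ~0.46 here
    ```

    With such a diffuse expert prior, the posterior is dominated by the trial data, consistent with the study's finding that expert belief shifted the result only modestly.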

  12. Partial coalescence from bubbles to drops

    KAUST Repository

    Zhang, F. H.; Thoraval, Marie-Jean; Thoroddsen, Sigurdur T; Taborek, P.

    2015-01-01

    the travel time of this wave mode on the bubble surface, we also show that the model is consistent with the experiments. This wavenumber is determined by both the global drainage as well as the interface shapes during the rapid coalescence in the neck

  13. Deep Learning Neural Networks and Bayesian Neural Networks in Data Analysis

    Directory of Open Access Journals (Sweden)

    Chernoded Andrey

    2017-01-01

    Full Text Available Most modern analyses in high energy physics use signal-versus-background classification techniques from machine learning, and neural networks in particular. Deep learning neural networks are the most promising modern technique to separate signal from background and nowadays can be widely and successfully implemented as part of a physics analysis. In this article we compare the application of deep learning and Bayesian neural networks as classifiers in an instance of top quark analysis.

  14. CoaSim: A Flexible Environment for Simulating Genetic Data under Coalescent Models

    DEFF Research Database (Denmark)

    Mailund; Schierup, Mikkel Heide; Pedersen, Christian Nørgaard Storm

    2005-01-01

    Background Coalescent simulations are playing a large role in interpreting large-scale intra-specific polymorphism surveys and for planning and evaluating association studies. Coalescent simulations of data sets under different models can be compared to the actual data to test different evolutionary factors and thus get insight into these. Results We have created the CoaSim application as a flexible environment for Monte Carlo simulation of various types of genetic data under equilibrium and non-equilibrium coalescent models for a variety of applications. Interaction with the tool is through the Guile version of the Scheme scripting language. Scheme scripts......

  15. Characterization of Solids Deposited on the Modular Caustic-Side Solvent Extraction Unit (MCU) Strip Effluent (SE) Coalescer Media Removed in April 2015

    Energy Technology Data Exchange (ETDEWEB)

    Fondeur, F. F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-06-13

    In June 2015, Savannah River National Laboratory (SRNL) received a Strip Effluent (SE) coalescer (FLT-304) from MCU. That coalescer was first installed at MCU in late October 2014 and removed in April 2015. While processing approximately 48,700 gallons of strip solution, the pressure drop increased linearly with total filtrate volume (at 2.1E-4 psi/gal of filtrate), from 1 psi to near 16 psi (the administrative limit is 17 psi). The linear behavior is due to the combined effect of a constant deposition of material, which extends from the closed end to the mid-section of the coalescer and reduces the surface area available for fluid passage (linearly with filtrate volume), and the formation of a secondary emulsion (water in NG-CSSX) on the fibers of the coalescer media. Both effects reduced the coalescer porosity by at least 13% (after processing 48,700 gallons). Before the coalescer was removed, it was flushed with a 10 mM boric acid solution to reduce the dose level. To determine the nature of the deposited material, a physical and chemical analysis of the coalescer was conducted. Characterization of this coalescer revealed the adsorption of organics containing amines (secondary amides and primary amines), TiDG, degraded modifier (with no hydroxyl group), MaxCalix, and oxidized hydrocarbon (possibly from Isopar™ L or from lubricant used at MCU) onto the coalescer media. The amides and amines possibly come from the decomposition of the suppressor (TiDG). The modifier and MaxCalix were the largest components of the deposited organic material, as determined from leaching the coalescer with dichloromethane. Both the Fourier-Transformed Infrared (FTIR) and Fourier-Transformed Hydrogen Nuclear Magnetic Resonance (FT-HNMR) results indicated that some of the modifier was degraded (missing its OH groups). The modifier was observed everywhere in the examined coalescer pieces (FTIR), while the TiDG and its decomposition products were observed at the

  16. Bayesian methods for hackers probabilistic programming and Bayesian inference

    CERN Document Server

    Davidson-Pilon, Cameron

    2016-01-01

    Bayesian methods of inference are deeply natural and extremely powerful. However, most discussions of Bayesian inference rely on intensely complex mathematical analyses and artificial examples, making it inaccessible to anyone without a strong mathematical background. Now, though, Cameron Davidson-Pilon introduces Bayesian inference from a computational perspective, bridging theory to practice–freeing you to get results using computing power. Bayesian Methods for Hackers illuminates Bayesian inference through probabilistic programming with the powerful PyMC language and the closely related Python tools NumPy, SciPy, and Matplotlib. Using this approach, you can reach effective solutions in small increments, without extensive mathematical intervention. Davidson-Pilon begins by introducing the concepts underlying Bayesian inference, comparing it with other techniques and guiding you through building and training your first Bayesian model. Next, he introduces PyMC through a series of detailed examples a...

  17. Asymptotic distributions of coalescence times and ancestral lineage numbers for populations with temporally varying size.

    Science.gov (United States)

    Chen, Hua; Chen, Kun

    2013-07-01

    The distributions of coalescence times and ancestral lineage numbers play an essential role in coalescent modeling and ancestral inference. Both exact distributions of coalescence times and ancestral lineage numbers are expressed as the sum of alternating series, and the terms in the series become numerically intractable for large samples. More computationally attractive are their asymptotic distributions, which were derived in Griffiths (1984) for populations with constant size. In this article, we derive the asymptotic distributions of coalescence times and ancestral lineage numbers for populations with temporally varying size. For a sample of size n, denote by Tm the mth coalescent time, when m + 1 lineages coalesce into m lineages, and An(t) the number of ancestral lineages at time t back from the current generation. Similar to the results in Griffiths (1984), the number of ancestral lineages, An(t), and the coalescence times, Tm, are asymptotically normal, with the mean and variance of these distributions depending on the population size function, N(t). At the very early stage of the coalescent, when t → 0, the number of coalesced lineages n - An(t) follows a Poisson distribution, and as m → n, n(n-1)Tm/2N(0) follows a gamma distribution. We demonstrate the accuracy of the asymptotic approximations by comparing to both exact distributions and coalescent simulations. Several applications of the theoretical results are also shown: deriving statistics related to the properties of gene genealogies, such as the time to the most recent common ancestor (TMRCA) and the total branch length (TBL) of the genealogy, and deriving the allele frequency spectrum for large genealogies. With the advent of genomic-level sequencing data for large samples, the asymptotic distributions are expected to have wide applications in theoretical and methodological development for population genetic inference.
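    The limiting gamma result can be checked directly by simulation: for constant population size N, the waiting time while k lineages remain is exponential with rate k(k-1)/2N, and for m close to n the scaled cumulative time n(n-1)Tm/2N is approximately Gamma(n-m, 1), so its mean should be near n-m. A small sketch (our illustration; n and N are arbitrary):

    ```python
    import random

    # Hedged sketch (our illustration, not the paper's code).  Under the
    # standard coalescent with constant size N, the waiting time while k
    # lineages remain is Exponential with rate k*(k-1)/(2N).  For m near n,
    # n*(n-1)*Tm / (2N) should be approximately Gamma(n-m, 1).

    random.seed(1)
    n, N = 200, 10_000     # sample size and (assumed) constant population size
    m = n - 3              # look at Tm for m close to n

    def scaled_Tm(n, m, N):
        t = 0.0
        for k in range(n, m, -1):   # k lineages coalesce down to k-1
            t += random.expovariate(k * (k - 1) / (2 * N))
        return n * (n - 1) * t / (2 * N)

    draws = [scaled_Tm(n, m, N) for _ in range(20_000)]
    mean = sum(draws) / len(draws)
    print(mean)   # close to n - m = 3, the mean of Gamma(3, 1)
    ```

    The exact mean is slightly above 3 because the rates for k < n are a little smaller than the n-lineage rate; the gap shrinks as n grows, which is the sense of the m → n limit.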

  18. Influence of Energy and Temperature in Cluster Coalescence Induced by Deposition

    Directory of Open Access Journals (Sweden)

    J. C. Jiménez-Sáez

    2012-01-01

    Full Text Available Coalescence induced by the deposition of different Cu clusters on an epitaxial Co cluster supported on a Cu(001) substrate is studied by constant-temperature molecular dynamics simulations. The degree of epitaxy of the final system increases with increasing separation between the centres of mass of the projectile and target clusters during the collision. The structure, roughness, and epitaxial order of the supported cluster also influence the degree of epitaxy. The effect of energy and temperature is decisive for the epitaxial condition of the coalesced cluster; in particular, both factors modify the generation, growth, and interaction of grains. A higher temperature favours epitaxial growth for low impact parameters. A higher energy contributes to epitaxial coalescence for any initial separation between the projectile and target clusters. The influence of projectile energy is notably greater than that of temperature, since higher energies allow greater and more instantaneous atomic reorganization, so that fewer grains arise just after the collision. The appearance of grain-boundary dislocations is therefore a decisive factor in the epitaxial growth of the coalesced cluster.

  19. Visualization by X-ray tomography of void growth and coalescence leading to fracture in model materials

    International Nuclear Information System (INIS)

    Weck, A.; Wilkinson, D.S.; Maire, E.; Toda, H.

    2008-01-01

    The literature contains many models for the process of void nucleation, growth and coalescence leading to ductile fracture. However, these models lack in-depth experimental validation, in part because void coalescence is difficult to capture experimentally. In this paper, an embedded array of holes is obtained by diffusion bonding a sheet filled with laser-drilled holes between two intact sheets. The experiments have been performed with both pure copper and Glidcop. Using X-ray computed tomography, we show that void growth and coalescence (or linkage) are well captured in both materials. The Brown and Embury model for void coalescence underestimates coalescence strains due to constraining effects. However, both the Rice and Tracey model for void growth and the Thomason model for void coalescence give good predictions for copper samples when stress triaxiality is considered. The Thomason model, however, fails to predict coalescence for the Glidcop samples; this is primarily due to secondary void nucleation

  20. Imputation of missing genotypes within LD-blocks relying on the basic coalescent and beyond: consideration of population growth and structure.

    Science.gov (United States)

    Kabisch, Maria; Hamann, Ute; Lorenzo Bermejo, Justo

    2017-10-17

    Genotypes not directly measured in genetic studies are often imputed to improve statistical power and to increase mapping resolution. The accuracy of standard imputation techniques strongly depends on the similarity of linkage disequilibrium (LD) patterns in the study and reference populations. Here we develop a novel approach for genotype imputation in low-recombination regions that relies on the coalescent and permits to explicitly account for population demographic factors. To test the new method, study and reference haplotypes were simulated and gene trees were inferred under the basic coalescent and also considering population growth and structure. The reference haplotypes that first coalesced with study haplotypes were used as templates for genotype imputation. Computer simulations were complemented with the analysis of real data. Genotype concordance rates were used to compare the accuracies of coalescent-based and standard (IMPUTE2) imputation. Simulations revealed that, in LD-blocks, imputation accuracy relying on the basic coalescent was higher and less variable than with IMPUTE2. Explicit consideration of population growth and structure, even if present, did not practically improve accuracy. The advantage of coalescent-based over standard imputation increased with the minor allele frequency and it decreased with population stratification. Results based on real data indicated that, even in low-recombination regions, further research is needed to incorporate recombination in coalescence inference, in particular for studies with genetically diverse and admixed individuals. To exploit the full potential of coalescent-based methods for the imputation of missing genotypes in genetic studies, further methodological research is needed to reduce computer time, to take into account recombination, and to implement these methods in user-friendly computer programs. Here we provide reproducible code which takes advantage of publicly available software to facilitate

  1. Bayesian models and meta analysis for multiple tissue gene expression data following corticosteroid administration

    Directory of Open Access Journals (Sweden)

    Kelemen Arpad

    2008-08-01

    Full Text Available Abstract Background This paper addresses key biological problems and statistical issues in the analysis of large gene expression data sets that describe systemic temporal response cascades to therapeutic doses in multiple tissues, such as liver, skeletal muscle, and kidney, from the same animals. Affymetrix U34A time course gene expression data are obtained from three different tissues: kidney, liver, and muscle. Our goal is not only to find the concordance of genes across tissues and to identify the common differentially expressed genes over time, but also to examine the reproducibility of the findings by integrating the results from multiple tissues through meta-analysis, in order to gain a significant increase in the power of detecting differentially expressed genes over time and to find the differential responses of the three tissues to the drug. Results and conclusion A Bayesian categorical model for estimating the proportion of the 'call' is used for pre-screening genes. A hierarchical Bayesian mixture model is further developed for the identification of differentially expressed genes across time and of dynamic clusters. The deviance information criterion is applied to determine the number of components for model comparison and selection. The Bayesian mixture model produces the gene-specific posterior probability of differential/non-differential expression and the 95% credible interval, which is the basis for our further Bayesian meta-inference. Meta-analysis is performed to identify commonly expressed genes from multiple tissues, which may serve as ideal targets for novel treatment strategies, and to integrate the results across separate studies. We have found genes commonly expressed in the three tissues. However, the up/down/no regulation of these common genes differs at different time points. Moreover, the most differentially expressed genes were found in the liver, followed by the kidney and then the muscle.

  2. How to interpret the results of medical time series data analysis: Classical statistical approaches versus dynamic Bayesian network modeling.

    Science.gov (United States)

    Onisko, Agnieszka; Druzdzel, Marek J; Austin, R Marshall

    2016-01-01

    Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well-recognized in the analysis of medical data. The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. This paper offers a comparison of two approaches to analysis of medical time series data: (1) classical statistical approach, such as the Kaplan-Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. The main outcomes of our comparison are cervical cancer risk assessments produced by the three approaches. However, our analysis discusses also several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, results interpretation, and model validation. Our study shows that the Bayesian approach is (1) much more flexible in terms of modeling effort, and (2) it offers an individualized risk assessment, which is more cumbersome for classical statistical approaches.
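    As a concrete point of reference for the classical side of this comparison, the Kaplan-Meier product-limit estimator can be written in a few lines. The (time, event) pairs below are invented for illustration; event=1 marks an observed event and event=0 a censored observation:

    ```python
    # Hedged sketch of the Kaplan-Meier estimator (illustrative data only,
    # not the cervical cancer screening data from the study).
    data = [(2, 1), (3, 0), (5, 1), (5, 1), (8, 0), (11, 1)]

    def kaplan_meier(data):
        """Return [(t, S(t))] at each time with at least one observed event."""
        data = sorted(data)
        at_risk = len(data)
        surv, curve, i = 1.0, [], 0
        while i < len(data):
            t = data[i][0]
            # events and total removals (events + censorings) at time t
            deaths = sum(1 for tt, e in data if tt == t and e == 1)
            removed = sum(1 for tt, e in data if tt == t)
            if deaths:
                surv *= 1 - deaths / at_risk     # product-limit update
                curve.append((t, surv))
            at_risk -= removed
            i += removed
        return curve

    print(kaplan_meier(data))
    ```

    The estimator handles censoring by removing censored subjects from the risk set without dropping the survival curve, which is exactly the behavior a dynamic Bayesian network has to reproduce through its model structure instead.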

  3. A Bayesian multidimensional scaling procedure for the spatial analysis of revealed choice data

    NARCIS (Netherlands)

    DeSarbo, WS; Kim, Y; Fong, D

    1999-01-01

    We present a new Bayesian formulation of a vector multidimensional scaling procedure for the spatial analysis of binary choice data. The Gibbs sampler is gainfully employed to estimate the posterior distribution of the specified scalar-products (bilinear) model parameters. The computational procedure

  4. Daniel Goodman’s empirical approach to Bayesian statistics

    Science.gov (United States)

    Gerrodette, Tim; Ward, Eric; Taylor, Rebecca L.; Schwarz, Lisa K.; Eguchi, Tomoharu; Wade, Paul; Himes Boor, Gina

    2016-01-01

    Bayesian statistics, in contrast to classical statistics, uses probability to represent uncertainty about the state of knowledge. Bayesian statistics has often been associated with the idea that knowledge is subjective and that a probability distribution represents a personal degree of belief. Dr. Daniel Goodman considered this viewpoint problematic for issues of public policy. He sought to ground his Bayesian approach in data, and advocated the construction of a prior as an empirical histogram of “similar” cases. In this way, the posterior distribution that results from a Bayesian analysis combined comparable previous data with case-specific current data, using Bayes’ formula. Goodman championed such a data-based approach, but he acknowledged that it was difficult in practice. If based on a true representation of our knowledge and uncertainty, Goodman argued that risk assessment and decision-making could be an exact science, despite the uncertainties. In his view, Bayesian statistics is a critical component of this science because a Bayesian analysis produces the probabilities of future outcomes. Indeed, Goodman maintained that the Bayesian machinery, following the rules of conditional probability, offered the best legitimate inference from available data. We give an example of an informative prior in a recent study of Steller sea lion spatial use patterns in Alaska.
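
    Goodman's empirical-prior idea can be sketched as follows: past "similar" cases populate a discrete histogram prior over parameter values, which is then updated with the case-specific likelihood. The binning scheme and binomial likelihood here are illustrative assumptions, not Goodman's own procedure.

```python
from collections import Counter
from math import comb

def empirical_prior(past_rates, bins):
    """Histogram prior over discrete rate values built from 'similar' past cases."""
    counts = Counter(min(bins, key=lambda b: abs(b - r)) for r in past_rates)
    total = sum(counts.values())
    return {b: counts.get(b, 0) / total for b in bins}

def posterior(prior, k, n):
    """Update the histogram prior with binomial case data (k successes in n)."""
    unnorm = {p: w * comb(n, k) * p ** k * (1 - p) ** (n - k)
              for p, w in prior.items()}
    z = sum(unnorm.values())
    return {p: u / z for p, u in unnorm.items()}
```

    One visible limitation of the pure histogram prior: parameter values that never occurred in the past cases carry zero prior mass and can never be recovered, which illustrates why the choice of "similar" cases is the difficult part Goodman acknowledged.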

  5. Optimizing Nuclear Reaction Analysis (NRA) using Bayesian Experimental Design

    International Nuclear Information System (INIS)

    Toussaint, Udo von; Schwarz-Selinger, Thomas; Gori, Silvio

    2008-01-01

    Nuclear reaction analysis with 3He holds promise for measuring deuterium depth profiles up to large depths. However, the extraction of the depth profile from the measured data is an ill-posed inversion problem. Here we demonstrate how Bayesian experimental design can be used to optimize the number of measurements as well as the measurement energies to maximize the information gain. Comparison of the inversion properties of the optimized design with standard settings reveals large possible gains. Application of the posterior sampling method allows the experimental settings to be optimized interactively during the measurement process.
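
    For intuition, in a linear-Gaussian toy problem the expected information gain of a candidate measurement has a closed form, and design optimization reduces to maximizing it over the candidate settings; the NRA inversion itself is nonlinear, so this is only a schematic analogue of the approach.

```python
import math

def expected_info_gain(x, sigma_prior=1.0, sigma_noise=0.5):
    """Expected KL information gain (in nats) of measuring y = theta * x + noise,
    with theta ~ N(0, sigma_prior^2) and noise ~ N(0, sigma_noise^2).
    Closed form for the linear-Gaussian case:
    0.5 * log(1 + sigma_prior^2 * x^2 / sigma_noise^2)."""
    return 0.5 * math.log(1.0 + (sigma_prior * x / sigma_noise) ** 2)

# design optimization: pick the measurement setting with the largest expected gain
candidates = [0.1, 0.5, 1.0, 2.0]
best = max(candidates, key=expected_info_gain)
```

    In this toy case the most informative setting is simply the one that most amplifies the unknown parameter relative to the noise; in the real sequential setting the gain is re-evaluated after each measurement against the updated posterior.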

  6. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2010-01-01

    Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundations, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology. New to the Second Edition: a new chapter on Bayesian network classifiers; a new section on object-oriente

  7. Unavailability analysis of a PWR safety system by a Bayesian network

    International Nuclear Information System (INIS)

    Estevao, Lilian B.; Melo, Paulo Fernando F. Frutuoso e; Rivero, Jose J.

    2013-01-01

    Bayesian networks (BNs) are directed acyclic graphs in which nodes represent variables and the directed edges connecting them represent dependencies. This structure makes it possible to model conditional probabilities and calculate them with the help of Bayes' theorem. The objective of this paper is to present the modeling of the failure of a safety system of a typical second-generation light water reactor plant, the Containment Heat Removal System (CHRS), whose function is to cool the water of the containment reservoir being recirculated through the Containment Spray Recirculation System (CSRS). The CSRS is automatically initiated after a loss of coolant accident (LOCA) and, together with the CHRS, cools the reservoir water. This system was chosen because its fault tree analysis is available in Appendix II of the Reactor Safety Study Report (WASH-1400), and therefore all the necessary technical information is also available, such as system diagrams, failure data input, and the fault tree itself that was developed to study system failure. The reason for using a Bayesian network in this context was to assess its ability to reproduce the results of fault tree analyses and also to verify the feasibility of treating dependent events. Comparing the fault tree and the Bayesian network, the results obtained for system failure were very close. (author)
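
    The basic mechanism by which a BN reproduces a fault-tree gate can be sketched with a two-event OR gate: the gate becomes a deterministic conditional probability table, and the system failure probability follows by marginalizing over the parent states. The basic-event probabilities below are arbitrary illustrative numbers, not WASH-1400 data.

```python
from itertools import product

def or_gate_cpt(a, b):
    """Deterministic CPT: the system fails if either basic event occurs."""
    return 1.0 if (a or b) else 0.0

def system_failure_prob(p_a, p_b):
    """Marginal P(system fails), summing over the parent-node states
    exactly as a BN inference engine would (assuming independent parents)."""
    total = 0.0
    for a, b in product([0, 1], repeat=2):
        pa = p_a if a else 1 - p_a
        pb = p_b if b else 1 - p_b
        total += pa * pb * or_gate_cpt(a, b)
    return total
```

    With independent parents this reproduces the fault-tree OR-gate result 1 - (1 - p_a)(1 - p_b); the BN's advantage is that the same marginalization still works when the parents are made dependent via extra edges.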

  8. Bayesian estimation of dynamic matching function for U-V analysis in Japan

    Science.gov (United States)

    Kyo, Koki; Noda, Hideo; Kitagawa, Genshiro

    2012-05-01

    In this paper we propose a Bayesian method for analyzing unemployment dynamics. We derive a Beveridge curve for unemployment and vacancy (U-V) analysis from a Bayesian model based on a labor market matching function. In our framework, the efficiency of matching and the elasticities of new hiring with respect to unemployment and vacancies are regarded as time-varying parameters. To construct a flexible model and obtain reasonable estimates in an underdetermined estimation problem, we treat the time-varying parameters as random variables and introduce smoothness priors. The model is then described in a state space representation, enabling the parameter estimation to be carried out using the Kalman filter and fixed-interval smoothing. In such a representation, dynamic features of the cyclical unemployment rate and the structural-frictional unemployment rate can be accurately captured.
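
    The state-space treatment of a time-varying coefficient under a smoothness (random walk) prior can be sketched for a scalar regression; the matching-function model in the paper is richer, and the variances q and r below are illustrative.

```python
def kalman_filter_rw(y, x, q=0.01, r=1.0):
    """Scalar Kalman filter for y_t = b_t * x_t + e_t, where the time-varying
    coefficient b_t follows a random walk (the smoothness prior):
    b_t = b_{t-1} + w_t, with Var(w) = q and Var(e) = r."""
    b, P = 0.0, 10.0                    # diffuse-ish initial state
    est = []
    for yt, xt in zip(y, x):
        P = P + q                       # predict: random walk adds variance q
        S = xt * P * xt + r             # innovation variance
        K = P * xt / S                  # Kalman gain
        b = b + K * (yt - b * xt)       # update with the new observation
        P = (1 - K * xt) * P
        est.append(b)
    return est
```

    Fixed-interval smoothing would run a backward pass over these filtered estimates; for a constant underlying coefficient the filter alone already converges to the true value.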

  9. A Pareto scale-inflated outlier model and its Bayesian analysis

    OpenAIRE

    Scollnik, David P. M.

    2016-01-01

    This paper develops a Pareto scale-inflated outlier model. This model is intended for use when data from some standard Pareto distribution of interest is suspected to have been contaminated with a relatively small number of outliers from a Pareto distribution with the same shape parameter but with an inflated scale parameter. The Bayesian analysis of this Pareto scale-inflated outlier model is considered and its implementation using the Gibbs sampler is discussed. The paper contains three wor...

  10. cudaBayesreg: Parallel Implementation of a Bayesian Multilevel Model for fMRI Data Analysis

    Directory of Open Access Journals (Sweden)

    Adelino R. Ferreira da Silva

    2011-10-01

    Full Text Available Graphics processing units (GPUs) are rapidly gaining maturity as powerful general parallel computing devices. A key feature in the development of modern GPUs has been the advancement of the programming model and programming tools. Compute Unified Device Architecture (CUDA) is a software platform for massively parallel high-performance computing on Nvidia many-core GPUs. In functional magnetic resonance imaging (fMRI), the volume of the data to be processed, and the type of statistical analysis to perform, call for high-performance computing strategies. In this work, we present the main features of the R-CUDA package cudaBayesreg, which implements in CUDA the core of a Bayesian multilevel model for the analysis of brain fMRI data. The statistical model implements a Gibbs sampler for multilevel/hierarchical linear models with a normal prior. The main contribution to the increased performance comes from the use of separate threads for fitting the linear regression model at each voxel in parallel. The R-CUDA implementation of the Bayesian model proposed here significantly reduces the run time of the Markov chain Monte Carlo (MCMC) simulations used in Bayesian fMRI data analyses. Presently, cudaBayesreg is only configured for Linux systems with Nvidia CUDA support.

  11. Hydrogen-induced crack interaction and coalescence: the role of local crystallographic texture

    Energy Technology Data Exchange (ETDEWEB)

    Caleyo, F.; Hallen, J. M.; Venegas, V. [ESIQIE, Instituto Politecnico Nacional, Mexico, (Mexico); Baudin, T. [Universite de Paris Sud, Orsay, (France)

    2010-07-01

    Hydrogen-induced cracking (HIC) is a major concern for the pipeline industry in sour service. Strategies to improve the HIC resistance of pipeline steels have not been entirely effective. This study investigated the role of grain orientation in the interaction and coalescence of non-coplanar HIC cracks through experimental analysis. HIC samples of pipeline steels (API 5L X46 and ASME-A106) were studied using automated electron backscatter diffraction (EBSD) and orientation imaging microscopy (OIM). The results showed that microtexture can play a significant role in the coalescence of closely spaced non-coplanar HIC cracks. It was also found that the presence of cleavage planes and slip systems favorably oriented with respect to the mixed-mode stresses can activate low-resistance transgranular paths along which cracks can merge. It is demonstrated that crystallographic texture must be considered in developing predictive models for the stepwise propagation of HIC cracks in pipeline steels.

  12. Revisiting the time until fixation of a neutral mutant in a finite population - A coalescent theory approach.

    Science.gov (United States)

    Greenbaum, Gili

    2015-09-07

    Evaluation of the time scale of the fixation of neutral mutations is crucial to the theoretical understanding of the role of neutral mutations in evolution. Diffusion approximations of the Wright-Fisher model are most often used to derive analytic formulations of genetic drift, as well as of the time scales of the fixation of neutral mutations. These approximations require a set of assumptions, most notably that genetic drift is a stochastic process in a continuous allele-frequency space, an assumption appropriate for large populations. Here equivalent approximations are derived using a coalescent theory approach, which relies on a different set of assumptions than the diffusion approach and adopts a discrete allele-frequency space. Solutions for the mean and variance of the time to fixation of a neutral mutation derived from the two approaches converge for large populations but differ slightly for small populations. A Markov chain analysis of the Wright-Fisher model for small populations is used to evaluate the solutions obtained, showing that both the mean and the variance are better approximated by the coalescent approach. The coalescent approximation represents a tighter upper bound for the mean time to fixation than the diffusion approximation, while the diffusion and coalescent approximations form an upper and a lower bound, respectively, for the variance. The converging solutions and the small deviations between the two approaches strongly validate the use of diffusion approximations, but suggest that coalescent theory can provide more accurate approximations for small populations. Copyright © 2015 Elsevier Ltd. All rights reserved.
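
    The comparison can be reproduced numerically with a direct Wright-Fisher simulation: for a haploid population of size n, diffusion theory predicts a conditional mean fixation time of roughly 2n generations for a new mutant, with the small deviations discussed above appearing at small n. The population size and replicate count below are illustrative, not the paper's settings.

```python
import random

def conditional_fixation_time(n=40, reps=20000, seed=1):
    """Mean time to fixation of a new neutral mutant (initial copy number 1)
    in a haploid Wright-Fisher population of size n, conditioned on fixation.
    Diffusion theory predicts roughly 2n generations for large n."""
    rng = random.Random(seed)
    times = []
    for _ in range(reps):
        count, t = 1, 0
        while 0 < count < n:
            p = count / n
            # binomial resampling of the mutant allele each generation
            count = sum(1 for _ in range(n) if rng.random() < p)
            t += 1
        if count == n:                  # keep only the runs that reached fixation
            times.append(t)
    return sum(times) / len(times)
```

    Only a fraction of about 1/n of the replicates fix, so many replicates are needed; the resulting conditional mean sits close to, and slightly below, the diffusion prediction of 2n generations.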

  13. Dynamic measurements and simulations of airborne picolitre-droplet coalescence in holographic optical tweezers

    International Nuclear Information System (INIS)

    Bzdek, Bryan R.; Reid, Jonathan P.; Collard, Liam; Sprittles, James E.; Hudson, Andrew J.

    2016-01-01

    We report studies of the coalescence of pairs of picolitre aerosol droplets manipulated with holographic optical tweezers, probing the shape relaxation dynamics following coalescence by simultaneously monitoring the intensity of elastic backscattered light (EBL) from the trapping laser beam (time resolution on the order of 100 ns) while recording high frame rate camera images (time resolution <10 μs). The goals of this work are to: resolve the dynamics of droplet coalescence in holographic optical traps; assign the origin of key features in the time-dependent EBL intensity; and validate the use of the EBL alone to precisely determine droplet surface tension and viscosity. For low viscosity droplets, two sequential processes are evident: binary coalescence first results from the overlap of the optical traps on the time scale of microseconds followed by the recapture of the composite droplet in an optical trap on the time scale of milliseconds. As droplet viscosity increases, the relaxation in droplet shape eventually occurs on the same time scale as recapture, resulting in a convoluted evolution of the EBL intensity that inhibits quantitative determination of the relaxation time scale. Droplet coalescence was simulated using a computational framework to validate both experimental approaches. The results indicate that time-dependent monitoring of droplet shape from the EBL intensity allows for robust determination of properties such as surface tension and viscosity. Finally, the potential of high frame rate imaging to examine the coalescence of dissimilar viscosity droplets is discussed.

  14. Dynamic measurements and simulations of airborne picolitre-droplet coalescence in holographic optical tweezers

    Energy Technology Data Exchange (ETDEWEB)

    Bzdek, Bryan R.; Reid, Jonathan P., E-mail: j.p.reid@bristol.ac.uk [School of Chemistry, University of Bristol, Bristol BS8 1TS (United Kingdom); Collard, Liam [Department of Mathematics, University of Leicester, Leicester LE1 7RH (United Kingdom); Sprittles, James E. [Mathematics Institute, University of Warwick, Coventry CV4 7AL (United Kingdom); Hudson, Andrew J. [Department of Chemistry, University of Leicester, Leicester LE1 7RH (United Kingdom)

    2016-08-07

    We report studies of the coalescence of pairs of picolitre aerosol droplets manipulated with holographic optical tweezers, probing the shape relaxation dynamics following coalescence by simultaneously monitoring the intensity of elastic backscattered light (EBL) from the trapping laser beam (time resolution on the order of 100 ns) while recording high frame rate camera images (time resolution <10 μs). The goals of this work are to: resolve the dynamics of droplet coalescence in holographic optical traps; assign the origin of key features in the time-dependent EBL intensity; and validate the use of the EBL alone to precisely determine droplet surface tension and viscosity. For low viscosity droplets, two sequential processes are evident: binary coalescence first results from the overlap of the optical traps on the time scale of microseconds followed by the recapture of the composite droplet in an optical trap on the time scale of milliseconds. As droplet viscosity increases, the relaxation in droplet shape eventually occurs on the same time scale as recapture, resulting in a convoluted evolution of the EBL intensity that inhibits quantitative determination of the relaxation time scale. Droplet coalescence was simulated using a computational framework to validate both experimental approaches. The results indicate that time-dependent monitoring of droplet shape from the EBL intensity allows for robust determination of properties such as surface tension and viscosity. Finally, the potential of high frame rate imaging to examine the coalescence of dissimilar viscosity droplets is discussed.

  15. Bayesian Nonparametric Longitudinal Data Analysis.

    Science.gov (United States)

    Quintana, Fernando A; Johnson, Wesley O; Waetjen, Elaine; Gold, Ellen

    2016-01-01

    Practical Bayesian nonparametric methods have been developed across a wide variety of contexts. Here, we develop a novel statistical model that generalizes standard mixed models for longitudinal data to include flexible mean functions as well as combined compound symmetry (CS) and autoregressive (AR) covariance structures. AR structure is often specified through the use of a Gaussian process (GP) with covariance functions that allow longitudinal data to be more correlated if they are observed closer in time than if they are observed farther apart. We allow for AR structure by considering a broader class of models that incorporates a Dirichlet process mixture (DPM) over the covariance parameters of the GP. We are able to take advantage of modern Bayesian statistical methods in making full predictive inferences about characteristics of longitudinal profiles and their differences across covariate combinations. We also take advantage of the generality of our model, which provides for estimation of a variety of covariance structures. We observe that models that fail to incorporate CS or AR structure can result in very poor estimation of a covariance or correlation matrix. In our illustration using hormone data observed on women through the menopausal transition, biology dictates the use of a generalized family of sigmoid functions as a model for time trends across subpopulation categories.
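
    The combined CS-plus-AR covariance the model builds on can be sketched directly; the exponential correlation stands in for the GP's AR-like decay, and the parameter values are illustrative placeholders, not estimates from the hormone data.

```python
import math

def cs_ar_cov(times, sigma2=1.0, rho=0.3, phi=2.0, tau2=0.1):
    """Covariance matrix combining compound symmetry (CS) with an AR-like
    exponential-decay Gaussian-process term plus measurement noise:
    Cov(y_s, y_t) = rho*sigma2 + (1-rho)*sigma2*exp(-|s-t|/phi) + tau2*1{s==t}."""
    m = len(times)
    cov = [[0.0] * m for _ in range(m)]
    for i in range(m):
        for j in range(m):
            cs = rho * sigma2                                       # shared subject effect
            ar = (1 - rho) * sigma2 * math.exp(-abs(times[i] - times[j]) / phi)
            cov[i][j] = cs + ar + (tau2 if i == j else 0.0)
    return cov
```

    The CS term keeps a floor of correlation between arbitrarily distant observations on the same subject, while the AR term lets nearby observations be more correlated; the DPM in the paper places a prior over these covariance parameters rather than fixing them.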

  16. Estimating size and scope economies in the Portuguese water sector using the Bayesian stochastic frontier analysis

    Energy Technology Data Exchange (ETDEWEB)

    Carvalho, Pedro, E-mail: pedrocarv@coc.ufrj.br [Computational Modelling in Engineering and Geophysics Laboratory (LAMEMO), Department of Civil Engineering, COPPE, Federal University of Rio de Janeiro, Av. Pedro Calmon - Ilha do Fundão, 21941-596 Rio de Janeiro (Brazil); Center for Urban and Regional Systems (CESUR), CERIS, Instituto Superior Técnico, Universidade de Lisboa, Av. Rovisco Pais, 1049-001 Lisbon (Portugal); Marques, Rui Cunha, E-mail: pedro.c.carvalho@tecnico.ulisboa.pt [Center for Urban and Regional Systems (CESUR), CERIS, Instituto Superior Técnico, Universidade de Lisboa, Av. Rovisco Pais, 1049-001 Lisbon (Portugal)

    2016-02-15

    This study searches for economies of size and scope in the Portuguese water sector, applying both Bayesian and classical statistics to make inference in stochastic frontier analysis (SFA). The study demonstrates the usefulness and advantages of Bayesian statistics for inference in SFA over traditional SFA, which relies on classical statistics alone. The Bayesian methods overcome some problems that arise in the application of traditional SFA, such as bias in small samples and skewness of residuals. In the present case study of the water sector in Portugal, these Bayesian methods provide more plausible and acceptable results. Based on the results obtained, we find important economies of output density, economies of size, economies of vertical integration, and economies of scope in the Portuguese water sector, pointing to substantial advantages in undertaking mergers by joining the retail and wholesale components and by joining the drinking water and wastewater services. - Highlights: • This study searches for economies of size and scope in the water sector; • The usefulness of the application of Bayesian methods is highlighted; • Important economies of output density, economies of size, economies of vertical integration and economies of scope are found.

  17. Estimating size and scope economies in the Portuguese water sector using the Bayesian stochastic frontier analysis

    International Nuclear Information System (INIS)

    Carvalho, Pedro; Marques, Rui Cunha

    2016-01-01

    This study searches for economies of size and scope in the Portuguese water sector, applying both Bayesian and classical statistics to make inference in stochastic frontier analysis (SFA). The study demonstrates the usefulness and advantages of Bayesian statistics for inference in SFA over traditional SFA, which relies on classical statistics alone. The Bayesian methods overcome some problems that arise in the application of traditional SFA, such as bias in small samples and skewness of residuals. In the present case study of the water sector in Portugal, these Bayesian methods provide more plausible and acceptable results. Based on the results obtained, we find important economies of output density, economies of size, economies of vertical integration, and economies of scope in the Portuguese water sector, pointing to substantial advantages in undertaking mergers by joining the retail and wholesale components and by joining the drinking water and wastewater services. - Highlights: • This study searches for economies of size and scope in the water sector; • The usefulness of the application of Bayesian methods is highlighted; • Important economies of output density, economies of size, economies of vertical integration and economies of scope are found.

  18. CONSTRAINTS ON COSMIC-RAY PROPAGATION MODELS FROM A GLOBAL BAYESIAN ANALYSIS

    International Nuclear Information System (INIS)

    Trotta, R.; Johannesson, G.; Moskalenko, I. V.; Porter, T. A.; Ruiz de Austri, R.; Strong, A. W.

    2011-01-01

    Research in many areas of modern physics, such as indirect searches for dark matter and particle acceleration in supernova remnant shocks, relies heavily on studies of cosmic rays (CRs) and associated diffuse emissions (radio, microwave, X-rays, γ-rays). While very detailed numerical models of CR propagation exist, a quantitative statistical analysis of such models has so far been hampered by the large computational effort that those models require. Although statistical analyses have been carried out before using semi-analytical models (where the computation is much faster), the evaluation of the results obtained from such models is difficult, as they necessarily suffer from many simplifying assumptions. The main objective of this paper is to present a working method for full Bayesian parameter estimation for a numerical CR propagation model. For this study, we use the GALPROP code, the most advanced of its kind, which uses astrophysical information as well as nuclear and particle data as inputs to self-consistently predict CRs, γ-rays, synchrotron radiation, and other observables. We demonstrate that a full Bayesian analysis is possible using nested sampling and Markov Chain Monte Carlo methods (implemented in the SuperBayeS code) despite the heavy computational demands of a numerical propagation code. The best-fit values of parameters found in this analysis are in agreement with previous, significantly simpler, studies also based on GALPROP.

  19. Bayesian Graphical Models

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Nielsen, Thomas Dyhre

    2016-01-01

    Mathematically, a Bayesian graphical model is a compact representation of the joint probability distribution for a set of variables. The most frequently used type of Bayesian graphical model is the Bayesian network. The structural part of a Bayesian graphical model is a graph consisting of nodes

  20. A discrete-time Bayesian network reliability modeling and analysis framework

    International Nuclear Information System (INIS)

    Boudali, H.; Dugan, J.B.

    2005-01-01

    Dependability tools are becoming indispensable for modeling and analyzing (critical) systems. However, the growing complexity of such systems calls for increasing sophistication of these tools. Dependability tools need not only to capture the complex dynamic behavior of the system components, but must also be easy to use, intuitive, and computationally efficient. In general, current tools have a number of shortcomings, including lack of modeling power, inability to efficiently handle general component failure distributions, and ineffectiveness in solving large models that exhibit complex dependencies between their components. We propose a novel reliability modeling and analysis framework based on the Bayesian network (BN) formalism. The overall approach is to investigate timed Bayesian networks and to find a suitable reliability framework for dynamic systems. We have applied our methodology to two example systems, and preliminary results are promising. We have defined a discrete-time BN reliability formalism and demonstrated its capabilities from a modeling and analysis point of view. This research shows that a BN-based reliability formalism is a powerful potential solution for modeling and analyzing various kinds of system component behaviors and interactions. Moreover, being based on the BN formalism, the framework is easy to use and intuitive for non-experts, and provides a basis for more advanced and useful analyses such as system diagnosis
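
    The flavor of a discrete-time BN reliability model can be sketched for a two-component parallel (redundant) system with per-step geometric failures: each component's state at time t is a node, and system failure is a deterministic AND over those states. The failure probabilities and horizon below are illustrative, not taken from the paper's example systems.

```python
def comp_failed_by(t, p):
    """P(a component has failed by discrete time step t), geometric failures
    with per-step failure probability p."""
    return 1.0 - (1.0 - p) ** t

def system_unreliability(t, p1, p2):
    """Parallel (redundant) system: it fails only when both components have
    failed. Computed by summing over the component-state nodes at time t,
    as a discrete-time BN inference engine would."""
    total = 0.0
    for s1 in (0, 1):
        for s2 in (0, 1):
            q1 = comp_failed_by(t, p1) if s1 else 1 - comp_failed_by(t, p1)
            q2 = comp_failed_by(t, p2) if s2 else 1 - comp_failed_by(t, p2)
            total += q1 * q2 * (1.0 if (s1 and s2) else 0.0)
    return total
```

    With independent components this collapses to the product of the two failure CDFs; the BN formalism earns its keep when the time slices are linked so that one component's failure changes another's failure rate.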

  1. Inverse problems in the Bayesian framework

    International Nuclear Information System (INIS)

    Calvetti, Daniela; Somersalo, Erkki; Kaipio, Jari P

    2014-01-01

    The history of Bayesian methods dates back to the original works of Reverend Thomas Bayes and Pierre-Simon Laplace: the former laid down some of the basic principles of inverse probability in his classic article ‘An essay towards solving a problem in the doctrine of chances’, which was read posthumously to the Royal Society in 1763. Laplace, on the other hand, in his ‘Memoirs on inverse probability’ of 1774 developed the idea of updating beliefs and wrote down the celebrated Bayes’ formula in the form we know today. Although not yet identified as a framework for investigating inverse problems, Laplace used the formalism very much in the spirit it is used today in the context of inverse problems, e.g., in his study of the distribution of comets. With the evolution of computational tools, Bayesian methods have become increasingly popular in all fields of human knowledge in which conclusions need to be drawn based on incomplete and noisy data. Needless to say, inverse problems, almost by definition, fall into this category. Systematic work on developing a Bayesian inverse problem framework can arguably be traced back to the 1980s (the original first edition being published by Elsevier in 1987), although articles on Bayesian methodology applied to inverse problems, in particular in geophysics, had appeared much earlier. Today, as testified by the articles in this special issue, the Bayesian methodology as a framework for considering inverse problems has gained a lot of popularity, and it has integrated very successfully with many traditional inverse problems ideas and techniques, providing novel ways to interpret and implement traditional procedures in numerical analysis, computational statistics, signal analysis and data assimilation. The range of applications where the Bayesian framework has been fundamental goes from geophysics, engineering and imaging to astronomy, life sciences and economy, and continues to grow. There is no question that Bayesian

  2. Risks Analysis of Logistics Financial Business Based on Evidential Bayesian Network

    Directory of Open Access Journals (Sweden)

    Ying Yan

    2013-01-01

    Full Text Available Risks in the logistics financial business are identified and classified. Taking the failure of the business as the root node, a Bayesian network is constructed to measure the risk levels in the business. Three importance indexes are calculated to find the most important risks in the business. Furthermore, to account for the epistemic uncertainties in the risks, evidence theory is combined with the Bayesian network to form an evidential network for the risk analysis of logistics finance. To quantify how much uncertainty in the root node is produced by each risk, a new index, epistemic importance, is defined. Numerical examples show that the proposed methods can provide a great deal of useful information. With this information, effective approaches can be found to control and avoid these sensitive risks, thus keeping the logistics financial business running more reliably. The proposed method also gives a quantitative measure of risk levels in the logistics financial business, which provides guidance for the selection of financing solutions.

  3. Spectral analysis of the IntCal98 calibration curve: a Bayesian view

    International Nuclear Information System (INIS)

    Palonen, V.; Tikkanen, P.

    2004-01-01

    Preliminary results from a Bayesian approach to find periodicities in the IntCal98 calibration curve are given. It has been shown in the literature that the discrete Fourier transform (Schuster periodogram) corresponds to the use of an approximate Bayesian model of one harmonic frequency and Gaussian noise. Advantages of the Bayesian approach include the possibility to use models for variable, attenuated and multiple frequencies, the capability to analyze unevenly spaced data and the possibility to assess the significance and uncertainties of spectral estimates. In this work, a new Bayesian model using random walk noise to take care of the trend in the data is developed. Both Bayesian models are described and the first results of the new model are reported and compared with results from straightforward discrete-Fourier-transform and maximum-entropy-method spectral analyses
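
    The correspondence noted above, where the Schuster periodogram arises from a Bayesian single-frequency model with Gaussian noise, can be sketched directly; this generic periodogram also handles unevenly spaced samples, though it omits the random-walk trend model developed in the paper.

```python
import math

def schuster_periodogram(t, y, freqs):
    """Schuster periodogram. Under a one-harmonic-plus-Gaussian-noise model,
    its exponential is (approximately) proportional to the Bayesian posterior
    for the frequency, which is why the DFT falls out of the Bayesian analysis."""
    n = len(y)
    power = []
    for f in freqs:
        c = sum(yi * math.cos(2 * math.pi * f * ti) for ti, yi in zip(t, y))
        s = sum(yi * math.sin(2 * math.pi * f * ti) for ti, yi in zip(t, y))
        power.append((c * c + s * s) / n)
    return power

# demo: a 1.5 Hz sinusoid sampled at 10 Hz for 20 s
t_demo = [0.1 * i for i in range(200)]
y_demo = [math.sin(2 * math.pi * 1.5 * ti) for ti in t_demo]
freqs = [0.5, 1.0, 1.5, 2.0, 2.5]
power = schuster_periodogram(t_demo, y_demo, freqs)
```

    Because the sums are taken over arbitrary sample times, nothing requires the data to be evenly spaced, which is one of the advantages listed above for radiocarbon calibration data.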

  4. Testing general relativity using Bayesian model selection: Applications to observations of gravitational waves from compact binary systems

    International Nuclear Information System (INIS)

    Del Pozzo, Walter; Veitch, John; Vecchio, Alberto

    2011-01-01

    Second-generation interferometric gravitational-wave detectors, such as Advanced LIGO and Advanced Virgo, are expected to begin operation by 2015. Such instruments are planned to reach sensitivities that will offer the unique possibility of testing general relativity in the dynamical, strong-field regime and investigating departures from its predictions, in particular using the signal from coalescing binary systems. We introduce a statistical framework based on Bayesian model selection in which the Bayes factor between two competing hypotheses measures which theory is favored by the data. Probability density functions of the model parameters are then used to quantify the inference on individual parameters. We also develop a method to combine the information coming from multiple independent observations of gravitational waves, and show how much stronger the resulting inference could be. As an introduction and illustration of this framework, and as a practical numerical implementation through the Monte Carlo integration technique of nested sampling, we apply it to gravitational waves from the inspiral phase of coalescing binary systems as predicted by general relativity and by a very simple alternative theory in which the graviton has a nonzero mass. This method can (and should) be extended to more realistic and physically motivated theories.
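
    The Bayes-factor machinery can be illustrated in a case where the evidences are analytic: n unit-variance Gaussian observations, comparing a point null H0: mu = 0 against a broader model H1: mu ~ N(0, tau2). This is a standard toy example, not the gravitational-wave models of the paper, where the evidences must be computed by nested sampling.

```python
import math

def norm_pdf(x, mu, var):
    return math.exp(-0.5 * (x - mu) ** 2 / var) / math.sqrt(2 * math.pi * var)

def bayes_factor_10(ybar, n, tau2=1.0):
    """Bayes factor for H1: mu ~ N(0, tau2) versus H0: mu = 0, given n
    observations ~ N(mu, 1) with sample mean ybar. The sufficient statistic
    satisfies ybar ~ N(mu, 1/n); factors of the data that do not depend on
    the model cancel, so the BF reduces to a ratio of two marginal densities:
    under H1, marginalizing over mu gives ybar ~ N(0, tau2 + 1/n)."""
    return norm_pdf(ybar, 0.0, tau2 + 1.0 / n) / norm_pdf(ybar, 0.0, 1.0 / n)
```

    A sample mean near zero favors the point null (an Occam penalty against the more flexible model), while a mean several standard errors from zero overwhelmingly favors the broader model.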

  5. Bayesian analysis of ion beam diagnostics

    International Nuclear Information System (INIS)

    Toussaint, U. von; Fischer, R.; Dose, V.

    2001-01-01

    Ion beam diagnostics are routinely used for quantitative analysis of the surface composition of mixture materials up to a depth of a few μm. Unfortunately, advantageous properties of the diagnostics, such as high depth resolution in combination with a large penetration depth, no destruction of the surface, and high sensitivity for both large and small atomic numbers, are mutually exclusive. Among other things, this is due to the ill-conditioned inverse problem of reconstructing depth distributions of the composition elements. Robust results for depth distributions are obtained with adaptive methods in the framework of Bayesian probability theory. The method of adaptive kernels allows for distributions that contain only the significant information in the data while noise fitting is avoided. This is achieved by adaptively reducing the degrees of freedom supporting the distribution. As applications to ion beam diagnostics, Rutherford backscattering spectroscopy and particle-induced X-ray emission are shown

  6. Development and validation of models for bubble coalescence and breakup

    Energy Technology Data Exchange (ETDEWEB)

    Liao, Yiaxiang

    2013-10-08

    A generalized model for bubble coalescence and breakup has been developed, based on a comprehensive survey of existing theories and models. One important feature of the model is that all important mechanisms leading to bubble coalescence and breakup in a turbulent gas-liquid flow are considered. The new model is tested extensively in a 1D Test Solver and in the 3D CFD code ANSYS CFX for the case of vertical gas-liquid pipe flow under adiabatic conditions. Two extensions of the standard multi-fluid model, the discrete population model and the inhomogeneous MUSIG (multiple-size group) model, are available in the two solvers, respectively. These extensions, with suitable closure models such as those for coalescence and breakup, are able to predict the evolution of the bubble size distribution in dispersed flows and to overcome the mono-dispersed-flow limitation of the standard multi-fluid model. For the validation of the model, the high-quality database of the TOPFLOW L12 experiments for air-water flow in a vertical pipe was employed. A wide range of test points, covering bubbly flow, turbulent-churn flow and the transition regime, is involved in the simulations. The comparison between simulated results, such as bubble size distribution, gas velocity and volume fraction, and the measured ones indicates generally good agreement for all selected test points. As the superficial gas velocity increases, the bubble size distribution evolves through coalescence-dominant regimes first, then breakup-dominant regimes, and finally turns into a bimodal distribution. The tendency of this evolution is well reproduced by the model. However, the tendency is almost always overestimated, i.e. there is too much coalescence in the coalescence-dominant cases and too much breakup in the breakup-dominant ones. The reason for this problem is discussed by studying the contribution of each coalescence and breakup mechanism at different test points. 
The redistribution of the
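    The class-based bookkeeping behind such population balance models can be illustrated with a minimal discrete sketch: number densities in geometric size classes evolve under coalescence (two bubbles of one class merge into the next class up) and binary breakup (one bubble splits into the class below). The constant kernels below are toy stand-ins for the closures developed in the thesis.

```python
import numpy as np

# Number densities n[i] in geometric size classes with volume v[i] = v0 * 2**i.
v0, K = 1.0, 5
v = v0 * 2.0 ** np.arange(K)
n0 = np.array([100.0, 50.0, 20.0, 5.0, 1.0])    # initial number densities
n = n0.copy()
C, B, dt, steps = 1e-3, 5e-2, 0.1, 200          # toy coalescence/breakup rates

for _ in range(steps):
    dn = np.zeros(K)
    for i in range(K - 1):                      # 2 bubbles of class i -> 1 of class i+1
        rate = 0.5 * C * n[i] * n[i]
        dn[i] -= 2 * rate
        dn[i + 1] += rate
    for i in range(1, K):                       # 1 bubble of class i -> 2 of class i-1
        rate = B * n[i]
        dn[i] -= rate
        dn[i - 1] += 2 * rate
    n += dt * dn                                # explicit Euler step

print("final size distribution:", np.round(n, 2))
print("gas volume conserved:", bool(np.isclose((n * v).sum(), (n0 * v).sum())))
```

Because every coalescence and breakup event conserves gas volume class-by-class, the total volume is conserved exactly by construction, which is the basic consistency check such schemes must pass.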

  7. Development and validation of models for bubble coalescence and breakup

    International Nuclear Information System (INIS)

    Liao, Yiaxiang

    2013-01-01

    A generalized model for bubble coalescence and breakup has been developed, based on a comprehensive survey of existing theories and models. One important feature of the model is that all important mechanisms leading to bubble coalescence and breakup in a turbulent gas-liquid flow are considered. The new model is tested extensively in a 1D Test Solver and in the 3D CFD code ANSYS CFX for the case of vertical gas-liquid pipe flow under adiabatic conditions. Two extensions of the standard multi-fluid model, the discrete population model and the inhomogeneous MUSIG (multiple-size group) model, are available in the two solvers, respectively. These extensions, with suitable closure models such as those for coalescence and breakup, are able to predict the evolution of the bubble size distribution in dispersed flows and to overcome the mono-dispersed-flow limitation of the standard multi-fluid model. For the validation of the model, the high-quality database of the TOPFLOW L12 experiments for air-water flow in a vertical pipe was employed. A wide range of test points, covering bubbly flow, turbulent-churn flow and the transition regime, is involved in the simulations. The comparison between simulated results, such as bubble size distribution, gas velocity and volume fraction, and the measured ones indicates generally good agreement for all selected test points. As the superficial gas velocity increases, the bubble size distribution evolves through coalescence-dominant regimes first, then breakup-dominant regimes, and finally turns into a bimodal distribution. The tendency of this evolution is well reproduced by the model. However, the tendency is almost always overestimated, i.e. there is too much coalescence in the coalescence-dominant cases and too much breakup in the breakup-dominant ones. The reason for this problem is discussed by studying the contribution of each coalescence and breakup mechanism at different test points. 
The redistribution of the

  8. Estimating size and scope economies in the Portuguese water sector using the Bayesian stochastic frontier analysis.

    Science.gov (United States)

    Carvalho, Pedro; Marques, Rui Cunha

    2016-02-15

    This study searches for economies of size and scope in the Portuguese water sector, applying Bayesian and classical statistics to make inference in stochastic frontier analysis (SFA). It demonstrates the usefulness and advantages of Bayesian statistics for inference in SFA over traditional SFA, which relies solely on classical statistics. The Bayesian methods allow overcoming some problems that arise in the application of traditional SFA, such as bias in small samples and skewness of residuals. In the present case study of the water sector in Portugal, these Bayesian methods provide more plausible and acceptable results. Based on the results obtained, we find that there are important economies of output density, economies of size, economies of vertical integration and economies of scope in the Portuguese water sector, pointing to the considerable advantages of undertaking mergers by joining the retail and wholesale components and by joining the drinking water and wastewater services. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Gravitational Waves from Coalescing Binary Black Holes: Theoretical and Experimental Challenges

    CERN Multimedia

    CERN. Geneva

    2010-01-01

    The worldwide network of interferometric gravitational-wave detectors (LIGO/VIRGO/GEO/...) is currently taking data near its planned sensitivity. Coalescing black hole binaries are among the most promising, and most exciting, gravitational wave sources for these detectors. The talk will review the theoretical and experimental challenges that must be met in order to successfully detect gravitational waves from coalescing black hole binaries, and to be able to reliably measure the physical parameters of the source (masses, spins, ...).

  10. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2003-01-01

    As the power of Bayesian techniques has become more fully realized, the field of artificial intelligence has embraced Bayesian methodology and integrated it to the point where an introduction to Bayesian techniques is now a core course in many computer science programs. Unlike other books on the subject, Bayesian Artificial Intelligence keeps mathematical detail to a minimum and covers a broad range of topics. The authors integrate Bayesian network technology and Bayesian network learning, and apply both to knowledge engineering. They emphasize understanding and intuition but also provide the algorithms and technical background needed for applications. Software, exercises, and solutions are available on the authors' website.

  11. Introduction of Bayesian network in risk analysis of maritime accidents in Bangladesh

    Science.gov (United States)

    Rahman, Sohanur

    2017-12-01

    Due to its unique geographic location, complex navigation environment and intense vessel traffic, a considerable number of maritime accidents have occurred in Bangladesh, causing serious loss of life and property as well as environmental contamination. Based on historical data on maritime accidents from 1981 to 2015, collected from the Department of Shipping (DOS) and the Bangladesh Inland Water Transport Authority (BIWTA), this paper conducts a risk analysis of maritime accidents by applying a Bayesian network. A Bayesian network model has been developed from the accident investigation reports of Bangladesh to identify the relations among the parameters that affect accidents and to quantify their probabilities. Furthermore, the number of accidents in different categories is also investigated. Finally, some viable recommendations are proposed in order to ensure greater safety of inland vessels in Bangladesh.
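    The kind of inference such a network supports can be sketched by full enumeration on a toy three-node model. The structure and every probability below are illustrative placeholders, not values estimated from the DOS/BIWTA records.

```python
# Toy Bayesian network for vessel accidents: BadWeather (B) and
# Overloading (O) are parents of Accident (A). All numbers are made up.
P_B = {True: 0.2, False: 0.8}
P_O = {True: 0.3, False: 0.7}
P_A_given = {            # P(Accident=True | B, O)
    (True, True): 0.30,
    (True, False): 0.10,
    (False, True): 0.05,
    (False, False): 0.01,
}

# Marginal accident probability by enumeration over the parents.
p_accident = sum(
    P_B[b] * P_O[o] * P_A_given[(b, o)] for b in (True, False) for o in (True, False)
)

# Diagnostic query via Bayes' rule: how likely is overloading given an accident?
p_joint_o_a = sum(P_B[b] * P_O[True] * P_A_given[(b, True)] for b in (True, False))
p_o_given_a = p_joint_o_a / p_accident

print(f"P(accident)               = {p_accident:.4f}")
print(f"P(overloading | accident) = {p_o_given_a:.4f}")
```

This diagnostic direction (from observed accident back to contributing factors) is what makes Bayesian networks attractive for accident-report analysis.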

  12. Simultaneous versus sequential pharmacokinetic-pharmacodynamic population analysis using an iterative two-stage Bayesian technique

    NARCIS (Netherlands)

    Proost, Johannes H.; Schiere, Sjouke; Eleveld, Douglas J.; Wierda, J. Mark K. H.

    A method for simultaneous pharmacokinetic-pharmacodynamic (PK-PD) population analysis using an Iterative Two-Stage Bayesian (ITSB) algorithm was developed. The method was evaluated using clinical data and Monte Carlo simulations. Data from a clinical study with rocuronium in nine anesthetized

  13. Coalescence of silver unidimensional structures by molecular dynamics simulation

    International Nuclear Information System (INIS)

    Perez A, M.; Gutierrez W, C.E.; Mondragon, G.; Arenas, J.

    2007-01-01

    A study of the coalescence of silver nanoparticles and nanorods by means of molecular dynamics simulation under thermodynamic laws is reported. In this work we focus on the conditions under which one-dimensional growth of silver nanorods can occur through the coalescence of two nanorods, or of a nanorod and a nanoparticle; this allows us to study the structural, dynamic and morphological properties of silver nanorods under different thermodynamic conditions. The simulations are carried out using many-body Sutton-Chen interaction potentials, which yield results consistent with real physical systems. (Author)
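    The simulation loop behind such studies can be sketched with a minimal velocity-Verlet integrator. For brevity the sketch below uses a Lennard-Jones pair potential in reduced units as a stand-in for the many-body Sutton-Chen potential of the paper, relaxing a slightly perturbed atomic chain; every parameter is illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
N, dt, steps = 16, 0.005, 400
# A chain of atoms spaced 1.1 apart in 2D, slightly perturbed.
pos = np.arange(N, dtype=float)[:, None] * np.array([1.1, 0.0]) \
      + 0.01 * rng.standard_normal((N, 2))
vel = np.zeros((N, 2))

def forces(p):
    d = p[:, None, :] - p[None, :, :]          # pairwise displacements
    r2 = (d ** 2).sum(-1) + np.eye(N)          # avoid division by zero on the diagonal
    inv6 = 1.0 / r2 ** 3
    f_mag = 24 * (2 * inv6**2 - inv6) / r2     # -dU/dr / r for the LJ potential
    f_mag[np.eye(N, dtype=bool)] = 0.0
    return (f_mag[..., None] * d).sum(axis=1)

f = forces(pos)
for _ in range(steps):                         # velocity-Verlet time stepping
    vel += 0.5 * dt * f
    pos += dt * vel
    f = forces(pos)
    vel += 0.5 * dt * f
    vel *= 0.999                               # crude velocity damping (quench)

bond = np.linalg.norm(np.diff(pos, axis=0), axis=1)
print(f"mean neighbour distance after quench: {bond.mean():.3f}")
```

The chain relaxes toward the pair-potential minimum near 2^(1/6) ≈ 1.12; a coalescence study replaces the potential with Sutton-Chen and the chain with two nanostructures brought into contact.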

  14. A Bayesian analysis of pentaquark signals from CLAS data

    Energy Technology Data Exchange (ETDEWEB)

    David Ireland; Bryan McKinnon; Dan Protopopescu; Pawel Ambrozewicz; Marco Anghinolfi; G. Asryan; Harutyun Avakian; H. Bagdasaryan; Nathan Baillie; Jacques Ball; Nathan Baltzell; V. Batourine; Marco Battaglieri; Ivan Bedlinski; Ivan Bedlinskiy; Matthew Bellis; Nawal Benmouna; Barry Berman; Angela Biselli; Lukasz Blaszczyk; Sylvain Bouchigny; Sergey Boyarinov; Robert Bradford; Derek Branford; William Briscoe; William Brooks; Volker Burkert; Cornel Butuceanu; John Calarco; Sharon Careccia; Daniel Carman; Liam Casey; Shifeng Chen; Lu Cheng; Philip Cole; Patrick Collins; Philip Coltharp; Donald Crabb; Volker Crede; Natalya Dashyan; Rita De Masi; Raffaella De Vita; Enzo De Sanctis; Pavel Degtiarenko; Alexandre Deur; Richard Dickson; Chaden Djalali; Gail Dodge; Joseph Donnelly; David Doughty; Michael Dugger; Oleksandr Dzyubak; Hovanes Egiyan; Kim Egiyan; Lamiaa Elfassi; Latifa Elouadrhiri; Paul Eugenio; Gleb Fedotov; Gerald Feldman; Ahmed Fradi; Herbert Funsten; Michel Garcon; Gagik Gavalian; Nerses Gevorgyan; Gerard Gilfoyle; Kevin Giovanetti; Francois-Xavier Girod; John Goetz; Wesley Gohn; Atilla Gonenc; Ralf Gothe; Keith Griffioen; Michel Guidal; Nevzat Guler; Lei Guo; Vardan Gyurjyan; Kawtar Hafidi; Hayk Hakobyan; Charles Hanretty; Neil Hassall; F. Hersman; Ishaq Hleiqawi; Maurik Holtrop; Charles Hyde; Yordanka Ilieva; Boris Ishkhanov; Eugeny Isupov; D. Jenkins; Hyon-Suk Jo; John Johnstone; Kyungseon Joo; Henry Juengst; Narbe Kalantarians; James Kellie; Mahbubul Khandaker; Wooyoung Kim; Andreas Klein; Franz Klein; Mikhail Kossov; Zebulun Krahn; Laird Kramer; Valery Kubarovsky; Joachim Kuhn; Sergey Kuleshov; Viacheslav Kuznetsov; Jeff Lachniet; Jean Laget; Jorn Langheinrich; D. 
Lawrence; Kenneth Livingston; Haiyun Lu; Marion MacCormick; Nikolai Markov; Paul Mattione; Bernhard Mecking; Mac Mestayer; Curtis Meyer; Tsutomu Mibe; Konstantin Mikhaylov; Marco Mirazita; Rory Miskimen; Viktor Mokeev; Brahim Moreno; Kei Moriya; Steven Morrow; Maryam Moteabbed; Edwin Munevar Espitia; Gordon Mutchler; Pawel Nadel-Turonski; Rakhsha Nasseripour; Silvia Niccolai; Gabriel Niculescu; Maria-Ioana Niculescu; Bogdan Niczyporuk; Megh Niroula; Rustam Niyazov; Mina Nozar; Mikhail Osipenko; Alexander Ostrovidov; Kijun Park; Evgueni Pasyuk; Craig Paterson; Sergio Pereira; Joshua Pierce; Nikolay Pivnyuk; Oleg Pogorelko; Sergey Pozdnyakov; John Price; Sebastien Procureur; Yelena Prok; Brian Raue; Giovanni Ricco; Marco Ripani; Barry Ritchie; Federico Ronchetti; Guenther Rosner; Patrizia Rossi; Franck Sabatie; Julian Salamanca; Carlos Salgado; Joseph Santoro; Vladimir Sapunenko; Reinhard Schumacher; Vladimir Serov; Youri Sharabian; Dmitri Sharov; Nikolay Shvedunov; Elton Smith; Lee Smith; Daniel Sober; Daria Sokhan; Aleksey Stavinskiy; Samuel Stepanyan; Stepan Stepanyan; Burnham Stokes; Paul Stoler; Steffen Strauch; Mauro Taiuti; David Tedeschi; Ulrike Thoma; Avtandil Tkabladze; Svyatoslav Tkachenko; Clarisse Tur; Maurizio Ungaro; Michael Vineyard; Alexander Vlassov; Daniel Watts; Lawrence Weinstein; Dennis Weygand; M. Williams; Elliott Wolin; M.H. Wood; Amrit Yegneswaran; Lorenzo Zana; Jixie Zhang; Bo Zhao; Zhiwen Zhao

    2008-02-01

    We examine the results of two measurements by the CLAS collaboration, one of which claimed evidence for a $\Theta^{+}$ pentaquark, whilst the other found no such evidence. The unique feature of these two experiments was that they were performed with the same experimental setup. Using a Bayesian analysis we find that the results of the two experiments are in fact compatible with each other, but that the first measurement did not contain sufficient information to determine unambiguously the existence of a $\Theta^{+}$. Further, we suggest a means by which the existence of a new candidate particle can be tested in a rigorous manner.
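    The spirit of such a compatibility check can be sketched with a conjugate toy model: two Poisson counting runs are compared under a "common rate" hypothesis versus an "independent rates" hypothesis, with closed-form Gamma-Poisson evidences. The counts, exposures and prior are invented for illustration, not CLAS data.

```python
from math import lgamma, log

alpha, beta = 1.0, 1.0          # Gamma(shape, rate) prior on the Poisson rate

def log_marginal(k, t, a=alpha, b=beta):
    """Log evidence of k counts in exposure t under the Gamma prior."""
    return (lgamma(a + k) - lgamma(a) + a * log(b)
            - (a + k) * log(b + t) + k * log(t) - lgamma(k + 1))

k1, t1 = 15, 10.0               # hypothetical "evidence" run
k2, t2 = 10, 10.0               # hypothetical "no evidence" run

# H_same: one common rate for both runs (pooled counts and exposure).
log_Z_same = (lgamma(alpha + k1 + k2) - lgamma(alpha) + alpha * log(beta)
              - (alpha + k1 + k2) * log(beta + t1 + t2)
              + k1 * log(t1) + k2 * log(t2) - lgamma(k1 + 1) - lgamma(k2 + 1))

# H_diff: independent rates -> the evidences multiply.
log_Z_diff = log_marginal(k1, t1) + log_marginal(k2, t2)

log_B = log_Z_same - log_Z_diff  # > 0: the two runs are mutually compatible
print(f"ln B(same/diff) = {log_B:.2f}")
```

With these toy counts the Bayes factor weakly favours a common rate, i.e. the two runs are statistically compatible despite the different raw counts, which is the flavour of the conclusion reached in the paper.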

  15. A Bayesian analysis of pentaquark signals from CLAS data

    International Nuclear Information System (INIS)

    David Ireland; Bryan McKinnon; Dan Protopopescu; Pawel Ambrozewicz; Marco Anghinolfi; G. Asryan; Harutyun Avakian; H. Bagdasaryan; Nathan Baillie; Jacques Ball; Nathan Baltzell; V. Batourine; Marco Battaglieri; Ivan Bedlinski; Ivan Bedlinskiy; Matthew Bellis; Nawal Benmouna; Barry Berman; Angela Biselli; Lukasz Blaszczyk; Sylvain Bouchigny; Sergey Boyarinov; Robert Bradford; Derek Branford; William Briscoe; William Brooks; Volker Burkert; Cornel Butuceanu; John Calarco; Sharon Careccia; Daniel Carman; Liam Casey; Shifeng Chen; Lu Cheng; Philip Cole; Patrick Collins; Philip Coltharp; Donald Crabb; Volker Crede; Natalya Dashyan; Rita De Masi; Raffaella De Vita; Enzo De Sanctis; Pavel Degtiarenko; Alexandre Deur; Richard Dickson; Chaden Djalali; Gail Dodge; Joseph Donnelly; David Doughty; Michael Dugger; Oleksandr Dzyubak; Hovanes Egiyan; Kim Egiyan; Lamiaa Elfassi; Latifa Elouadrhiri; Paul Eugenio; Gleb Fedotov; Gerald Feldman; Ahmed Fradi; Herbert Funsten; Michel Garcon; Gagik Gavalian; Nerses Gevorgyan; Gerard Gilfoyle; Kevin Giovanetti; Francois-Xavier Girod; John Goetz; Wesley Gohn; Atilla Gonenc; Ralf Gothe; Keith Griffioen; Michel Guidal; Nevzat Guler; Lei Guo; Vardan Gyurjyan; Kawtar Hafidi; Hayk Hakobyan; Charles Hanretty; Neil Hassall; F. Hersman; Ishaq Hleiqawi; Maurik Holtrop; Charles Hyde; Yordanka Ilieva; Boris Ishkhanov; Eugeny Isupov; D. Jenkins; Hyon-Suk Jo; John Johnstone; Kyungseon Joo; Henry Juengst; Narbe Kalantarians; James Kellie; Mahbubul Khandaker; et al

    2007-01-01

    We examine the results of two measurements by the CLAS collaboration, one of which claimed evidence for a Θ+ pentaquark, whilst the other found no such evidence. The unique feature of these two experiments was that they were performed with the same experimental setup. Using a Bayesian analysis we find that the results of the two experiments are in fact compatible with each other, but that the first measurement did not contain sufficient information to determine unambiguously the existence of a Θ+. Further, we suggest a means by which the existence of a new candidate particle can be tested in a rigorous manner.

  16. Bayesian flood forecasting methods: A review

    Science.gov (United States)

    Han, Shasha; Coulibaly, Paulin

    2017-08-01

    Over the past few decades, floods have been among the most common and widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, their negative impacts could be greatly reduced. It is widely recognized that quantification and reduction of the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied to flood forecasting from 1999 until now. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting; it then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way of flood estimation: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, thus giving more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. the ensemble Bayesian forecasting system and Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and to effectively reduce predictive uncertainty. 
In recent years, various Bayesian flood forecasting approaches have been
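    One of the reviewed ideas, Bayesian averaging of competing forecasts into a single predictive distribution, can be sketched as follows. The hindcasts, today's forecasts and the Gaussian error model are all invented for illustration.

```python
import numpy as np

past_obs = np.array([2.1, 2.8, 3.5, 3.0, 2.6])   # past observed stages (toy)
past_f1  = np.array([2.0, 2.9, 3.2, 3.1, 2.5])   # hypothetical model 1 hindcasts
past_f2  = np.array([2.5, 2.4, 3.9, 2.7, 2.9])   # hypothetical model 2 hindcasts
sigma = 0.3                                      # assumed forecast error s.d.

def log_lik(fc):
    return -0.5 * np.sum((past_obs - fc) ** 2) / sigma**2

ll = np.array([log_lik(past_f1), log_lik(past_f2)])
w = np.exp(ll - ll.max())
w /= w.sum()                                     # posterior model weights

f1_today, f2_today = 3.4, 3.0                    # today's point forecasts
x = np.linspace(1.0, 5.0, 2001)
dx = x[1] - x[0]
pdf = (w[0] * np.exp(-0.5 * ((x - f1_today) / sigma) ** 2)
       + w[1] * np.exp(-0.5 * ((x - f2_today) / sigma) ** 2)) / (sigma * np.sqrt(2 * np.pi))

mean = np.sum(x * pdf) * dx                      # predictive mean
cdf = np.cumsum(pdf) * dx
lo, hi = x[np.searchsorted(cdf, 0.05)], x[np.searchsorted(cdf, 0.95)]
print(f"weights = {np.round(w, 3)}, mean = {mean:.2f}, 90% interval = [{lo:.2f}, {hi:.2f}]")
```

The output is a full predictive distribution rather than a single number, which is exactly the property the review credits to Bayesian flood forecasting.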

  17. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

    Probabilistic networks, also known as Bayesian networks and influence diagrams, have become one of the most promising technologies in the area of applied artificial intelligence, offering intuitive, efficient, and reliable methods for diagnosis, prediction, decision making, classification, troubleshooting, and data mining under uncertainty. Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. Intended...

  18. Experimental visualization of the coalesced interaction of sliding bubbles near the wall in a vertical narrow rectangular channel

    International Nuclear Information System (INIS)

    Xu Jianjun; Chen Bingde; Wang Xiaojun

    2011-01-01

    The characteristics of coalesced sliding bubbles were visually observed through the wide and narrow sides of a narrow rectangular channel using a high-speed digital camera. The results show that coalescence among the sliding bubbles is rapid, and that the newly formed coalesced bubble does not lift off but continues to slide along the heated surface at low heat flux in the isolated-bubble region. The influence region is about twice the projected area of a sliding bubble when the sliding bubbles begin to interact. The sliding bubble velocities increase due to the interaction among the bubbles, which contributes to enhanced heat transfer in this region. Finally, the effect of the coalesced interaction of bubbles growing at the nucleation sites on bubble lift-off is discussed and analysed. (authors)

  19. A bayesian approach to classification criteria for spectacled eiders

    Science.gov (United States)

    Taylor, B.L.; Wade, P.R.; Stehn, R.A.; Cochrane, J.F.

    1996-01-01

    To facilitate decisions to classify species according to risk of extinction, we used Bayesian methods to analyze trend data for the Spectacled Eider, an arctic sea duck. Trend data from three independent surveys of the Yukon-Kuskokwim Delta were analyzed individually and in combination to yield posterior distributions for population growth rates. We used classification criteria developed by the recovery team for Spectacled Eiders that seek to equalize errors of under- or overprotecting the species. We conducted both a Bayesian decision analysis and a frequentist (classical statistical inference) decision analysis. Bayesian decision analyses are computationally easier, yield basically the same results, and yield results that are easier to explain to nonscientists. With the exception of the aerial survey analysis of the 10 most recent years, both Bayesian and frequentist methods indicated that an endangered classification is warranted. The discrepancy between surveys warrants further research. Although the trend data are abundance indices, we used a preliminary estimate of absolute abundance to demonstrate how to calculate extinction distributions using the joint probability distributions for population growth rate and variance in growth rate generated by the Bayesian analysis. Recent apparent increases in abundance highlight the need for models that apply to declining and then recovering species.
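    The decision-analysis step can be sketched in a few lines: a posterior for the log-linear population growth rate is computed from survey indices, then turned into the probability that a listing threshold is exceeded. The counts and threshold below are made up, not the Spectacled Eider data.

```python
import numpy as np
from math import erf, sqrt

# Hypothetical annual survey indices showing a decline.
years  = np.arange(10)
counts = np.array([420, 388, 355, 310, 330, 285, 260, 250, 236, 215])
y = np.log(counts)

# Flat prior + Gaussian errors -> posterior for the slope r is approximately
# Normal around the least-squares estimate with the usual standard error.
X = np.vstack([np.ones_like(years), years]).T
beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
r_hat = beta[1]
s2 = res[0] / (len(y) - 2)
se = sqrt(s2 / np.sum((years - years.mean()) ** 2))

threshold = -0.05                        # e.g. >5% annual decline triggers listing
z = (threshold - r_hat) / se
p_below = 0.5 * (1 + erf(z / sqrt(2)))   # P(r < threshold | data), Normal approx.
print(f"posterior mean r = {r_hat:.3f}, P(r < {threshold}) = {p_below:.2f}")
```

A classification rule then compares `p_below` with a cutoff chosen to balance the costs of under- and overprotection, which is the equalized-error idea described in the abstract.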

  20. Free surface flows: coalescence, spreading and dewetting

    NARCIS (Netherlands)

    Hernandez Sanchez, J.F.

    2015-01-01

    Capillary and wetting phenomena are an essential part of nature. Their presence is noticed in many circumstances where solid and liquid surfaces come into contact. In this thesis different types of capillary free surface flows are studied. The topics discussed are mainly the coalescence of viscous

  1. Competing risk models in reliability systems, a Weibull distribution model with Bayesian analysis approach

    International Nuclear Information System (INIS)

    Iskandar, Ismed; Gondokaryono, Yudi Satria

    2016-01-01

    In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that the systems are described and explained as simply functioning or failed. In many real situations, the failures may arise from many causes, depending on the age and the environment of the system and its components. Another problem in reliability theory is that of estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analysis is more beneficial than the classical approach in such cases. Bayesian estimation allows us to combine past knowledge or experience, in the form of an a priori distribution, with life test data to make inferences about the parameter of interest. In this paper, we investigate the application of Bayesian estimation to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and of investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed using Bayesian and maximum likelihood analyses. The simulation results show that a change in one true parameter value relative to another changes the standard deviation in the opposite direction. Given perfect information on the prior distribution, the Bayesian estimation methods outperform maximum likelihood. The sensitivity analyses show some sensitivity to shifts of the prior locations; they also show the robustness of the Bayesian analysis within the range
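    A grid-based version of the Bayesian-versus-MLE comparison can be sketched for a single failure cause. The flat prior, grid bounds and simulated sample below are assumptions of this sketch, not the paper's competing-risk setup.

```python
import numpy as np

rng = np.random.default_rng(1)
shape_true, scale_true, n = 1.5, 100.0, 30
t = scale_true * rng.weibull(shape_true, n)       # simulated failure times

shapes = np.linspace(0.5, 3.0, 126)
scales = np.linspace(40.0, 200.0, 161)
S, L = np.meshgrid(shapes, scales, indexing="ij")

# Weibull log-likelihood for uncensored data:
#   ln L = n ln k - n k ln lam + (k-1) sum(ln t) - sum((t/lam)^k)
sumlogt = np.log(t).sum()
ll = (n * np.log(S) - n * S * np.log(L) + (S - 1) * sumlogt
      - ((t[None, None, :] / L[..., None]) ** S[..., None]).sum(axis=-1))

post = np.exp(ll - ll.max())                      # flat prior -> grid posterior
post /= post.sum()
k_bayes, lam_bayes = (post * S).sum(), (post * L).sum()
i, j = np.unravel_index(np.argmax(ll), ll.shape)
print(f"MLE:            k = {shapes[i]:.2f}, lam = {scales[j]:.1f}")
print(f"posterior mean: k = {k_bayes:.2f}, lam = {lam_bayes:.1f}")
```

Replacing the flat prior with an informative one (e.g. concentrated near the true shape) is how the prior-sensitivity experiments described in the abstract would be reproduced.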

  2. A variational void coalescence model for ductile metals

    KAUST Repository

    Siddiq, Amir

    2011-08-17

    We present a variational void coalescence model that includes all the essential ingredients of failure in ductile porous metals. The model is an extension of the variational void growth model by Weinberg et al. (Comput Mech 37:142-152, 2006). The extended model contains all the deformation phases in ductile porous materials, i.e. elastic deformation, plastic deformation including deviatoric and volumetric (void growth) plasticity followed by damage initiation and evolution due to void coalescence. Parametric studies have been performed to assess the model's dependence on the different input parameters. The model is then validated against uniaxial loading experiments for different materials. We finally show the model's ability to predict the damage mechanisms and fracture surface profile of a notched round bar under tension as observed in experiments. © Springer-Verlag 2011.
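    The growth-then-coalescence sequence such models capture can be caricatured with a scalar porosity evolution: voids grow with plastic strain until a critical porosity triggers coalescence. The Rice-Tracey growth rate used here is a classical stand-in for the variational law, and every parameter value is illustrative.

```python
import numpy as np

f0, f_c = 0.001, 0.05          # initial and critical (coalescence) porosity
triaxiality = 1.0              # assumed stress triaxiality T of a notched bar
deps = 1e-3                    # plastic strain increment
f, eps = f0, 0.0

while f < f_c and eps < 2.0:
    # Rice-Tracey-type growth: d ln R = 0.283 exp(1.5 T) d eps_p, with f ~ R^3
    f *= np.exp(3 * 0.283 * np.exp(1.5 * triaxiality) * deps)
    eps += deps

print(f"coalescence onset at plastic strain ~ {eps:.3f} (porosity {f:.3f})")
```

In the full model the coalescence criterion also switches the constitutive response into a softening branch; here the critical porosity merely marks the onset strain.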

  3. Bayesian conformational analysis of ring molecules through reversible jump MCMC

    DEFF Research Database (Denmark)

    Nolsøe, Kim; Kessler, Mathieu; Pérez, José

    2005-01-01

    In this paper we address the problem of classifying the conformations of m-membered rings using experimental observations obtained by crystal structure analysis. We formulate a model for the data generation mechanism that consists of a multidimensional mixture model. We perform inference for the proportions and the components in a Bayesian framework, implementing a reversible jump MCMC algorithm to obtain samples of the posterior distributions. The method is illustrated on a simulated data set and on real data corresponding to cyclo-octane structures.

  4. BayesLCA: An R Package for Bayesian Latent Class Analysis

    Directory of Open Access Journals (Sweden)

    Arthur White

    2014-11-01

    The BayesLCA package for R provides tools for performing latent class analysis within a Bayesian setting. Three methods for fitting the model are provided, incorporating an expectation-maximization algorithm, Gibbs sampling and a variational Bayes approximation. The article briefly outlines the methodology behind each of these techniques and discusses some of the technical difficulties associated with them. Methods to remedy these problems are also described. Visualization methods for each of these techniques are included, as well as criteria to aid model selection.
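    The expectation-maximization option can be illustrated with an independent sketch (not the package's code) of a two-class latent class model for binary items; the class sizes and item probabilities below are invented to generate the data.

```python
import numpy as np

rng = np.random.default_rng(2)
G, M, N = 2, 4, 300                             # classes, items, respondents
true_tau = np.array([0.6, 0.4])                 # class sizes
true_theta = np.array([[0.9, 0.8, 0.2, 0.1],    # item probabilities per class
                       [0.2, 0.1, 0.8, 0.9]])
z = rng.choice(G, N, p=true_tau)
X = (rng.random((N, M)) < true_theta[z]).astype(float)

tau = np.array([0.5, 0.5])
theta = rng.uniform(0.3, 0.7, (G, M))           # random starting values
for _ in range(200):
    # E-step: responsibilities r[n, g] ∝ tau_g * prod_m theta^x (1-theta)^(1-x)
    log_r = np.log(tau) + X @ np.log(theta.T) + (1 - X) @ np.log(1 - theta.T)
    r = np.exp(log_r - log_r.max(axis=1, keepdims=True))
    r /= r.sum(axis=1, keepdims=True)
    # M-step: weighted proportions and item means
    tau = r.mean(axis=0)
    theta = (r.T @ X) / r.sum(axis=0)[:, None]

print("estimated class sizes:", np.round(tau, 2))
print("estimated item probabilities:\n", np.round(theta, 2))
```

Note the usual latent-class caveats the article discusses: the class labels can come out swapped, and EM only finds a local maximum, which is one motivation for the Gibbs and variational alternatives.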

  5. Coalescence of GaAs on (001) Si nano-trenches based on three-stage epitaxial lateral overgrowth

    Energy Technology Data Exchange (ETDEWEB)

    He, Yunrui; Wang, Jun, E-mail: wangjun12@bupt.edu.cn; Hu, Haiyang; Wang, Qi; Huang, Yongqing; Ren, Xiaomin [State Key Laboratory of Information Photonics and Optical Communications, Beijing University of Posts and Telecommunications, Beijing 100876 (China)

    2015-05-18

    The coalescence of selective area grown GaAs regions has been performed on a patterned 1.8 μm GaAs buffer layer on Si via metal-organic chemical vapor deposition. We propose a promising method of three-stage epitaxial lateral overgrowth (ELO) to achieve uniform coalescence and a flat surface. The rough surface caused by the coalescence of different growth fronts is smoothened by this method. A low root-mean-square surface roughness of 6.29 nm has been obtained on a 410-nm-thick coalesced ELO GaAs layer. A cross-sectional transmission electron microscope study shows that the coalescence of different growth fronts induces some new dislocations. However, the coalescence-induced dislocations tend to mutually annihilate, and only a small part of them reach the GaAs surface. The high optical quality of the ELO GaAs layer has been confirmed by low-temperature (77 K) photoluminescence measurements. This research promises a very-large-scale-integration platform for the monolithic integration of GaAs-based devices on Si.

  6. A coalescence model for uranium exploration

    International Nuclear Information System (INIS)

    Stuart-Williams, V.; Taylor, C.M.

    1983-01-01

    Uranium mineralization was found in the Pristerognathus-Diictodon Assemblage Zone of the Teekloof Formation, Beaufort Group, west of Beaufort West, Cape Province, South Africa. All the anomalies can be related to a single mineralization model. Mineralization is found at the termination of a silt parting between two coalescing sandstones and lies in the lower sandstone as an inclined zone dipping downflow from the termination of the silt parting. The existence of a primary Eh-pH gradient is indicated by a uranium-molybdenum zonation, the molybdenum lying above the uranium mineralization. The upper sandstone was an oxidizing fluvial channel in an arid environment through which uranyl carbonate was being transported in solution. Carbonaceous material undergoing anaerobic bacterial breakdown generated a weakly reducing fluid in the lower sandstone. Carbonaceous material at the REDOX front, developed between the two mixing fluids at the point of sandstone coalescence, reduced the uranyl carbonates in solution. Once reduced, the uranium minerals remained stable because conditions at the REDOX front were only very weakly oxidizing. As floodplain aggradation continued, the upper sandstone was buried and the entire sandstone couplet became reducing, permanently stabilizing the uranium mineralization.

  7. The confounding effect of population structure on bayesian skyline plot inferences of demographic history

    DEFF Research Database (Denmark)

    Heller, Rasmus; Chikhi, Lounes; Siegismund, Hans

    2013-01-01

    Many coalescent-based methods aiming to infer the demographic history of populations assume a single, isolated and panmictic population (i.e. a Wright-Fisher model). While this assumption may be reasonable under many conditions, several recent studies have shown that the results can be misleading when it is violated. Among the most widely applied demographic inference methods are Bayesian skyline plots (BSPs), which are used across a range of biological fields. Violations of the panmixia assumption are to be expected in many biological systems, but the consequences for skyline plot inferences ... the best scheme for inferring demographic change over a typical time scale. Analyses of data from a structured African buffalo population demonstrate how BSP results can be strengthened by simulations. We recommend that sample selection should be carefully considered in relation to population structure...
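    The core effect can be reproduced with a toy structured-coalescent simulation: pairwise coalescence times under panmixia versus a symmetric two-deme island model. Time is in units of 2N generations per deme, and the migration rate is an illustrative choice.

```python
import random

random.seed(3)

def panmictic_pair():
    # Under panmixia the pairwise coalescence time is Exponential(1).
    return random.expovariate(1.0)

def structured_pair(m=0.1):
    # Pair sampled from different demes; each lineage migrates at rate m.
    t, same_deme = 0.0, False
    while True:
        if same_deme:
            # Competing events: coalescence (rate 1) or either lineage migrates (2m).
            rate = 1.0 + 2 * m
            t += random.expovariate(rate)
            if random.random() < 1.0 / rate:
                return t
            same_deme = False
        else:
            t += random.expovariate(2 * m)   # wait for a migration event
            same_deme = True

n = 20000
t_pan = sum(panmictic_pair() for _ in range(n)) / n
t_str = sum(structured_pair() for _ in range(n)) / n
print(f"mean pairwise coalescence time: panmictic {t_pan:.2f}, structured {t_str:.2f}")
```

Structure inflates deep coalescence times (here roughly sevenfold for m = 0.1), which a skyline plot fitted under the panmixia assumption misreads as a larger ancestral population, i.e. a spurious decline toward the present.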

  8. Framework for network modularization and Bayesian network analysis to investigate the perturbed metabolic network

    Directory of Open Access Journals (Sweden)

    Kim Hyun

    2011-12-01

    Background: Genome-scale metabolic network models have contributed to elucidating biological phenomena and to predicting gene targets to engineer for biotechnological applications. With their increasing importance, their precise network characterization has also been crucial for a better understanding of cellular physiology. Results: We herein introduce a framework for network modularization and Bayesian network analysis (FMB) to investigate an organism's metabolism under perturbation. FMB reveals the direction of influences among metabolic modules, in which reactions with similar or positively correlated flux variation patterns are clustered in response to a specific perturbation, using metabolic flux data. With metabolic flux data calculated by constraints-based flux analysis under both control and perturbation conditions, FMB, in essence, reveals the effects of specific perturbations on the biological system through network modularization and Bayesian network analysis at the metabolic modular level. As a demonstration, this framework was applied to the genetically perturbed Escherichia coli metabolism of an lpdA gene knockout mutant, using its genome-scale metabolic network model. Conclusions: It provides alternative scenarios of metabolic flux distributions in response to the perturbation, which are complementary to the data obtained from conventionally available genome-wide high-throughput techniques or metabolic flux analysis.

  9. Framework for network modularization and Bayesian network analysis to investigate the perturbed metabolic network.

    Science.gov (United States)

    Kim, Hyun Uk; Kim, Tae Yong; Lee, Sang Yup

    2011-01-01

Genome-scale metabolic network models have contributed to elucidating biological phenomena and to predicting gene targets to engineer for biotechnological applications. With their increasing importance, precise characterization of these networks has also become crucial for a better understanding of cellular physiology. We herein introduce a framework for network modularization and Bayesian network analysis (FMB) to investigate an organism's metabolism under perturbation. FMB reveals the direction of influence among metabolic modules, in which reactions with similar or positively correlated flux variation patterns are clustered in response to a specific perturbation, using metabolic flux data. With metabolic flux data calculated by constraints-based flux analysis under both control and perturbation conditions, FMB reveals the effects of specific perturbations on the biological system through network modularization and Bayesian network analysis at the metabolic-module level. As a demonstration, this framework was applied to the genetically perturbed Escherichia coli metabolism of an lpdA gene knockout mutant, using its genome-scale metabolic network model. Ultimately, it provides alternative scenarios of metabolic flux distributions in response to the perturbation, which are complementary to data obtained from conventionally available genome-wide high-throughput techniques or metabolic flux analysis.

  10. Void growth to coalescence in a non-local material

    DEFF Research Database (Denmark)

    Niordson, Christian Frithiof

    2008-01-01

    of different material length parameters in a multi-parameter theory is studied, and it is shown that the important length parameter is the same as under purely hydrostatic loading. It is quantified how micron scale voids grow less rapidly than larger voids, and the implications of this in the overall strength...... of the material is emphasized. The size effect on the onset of coalescence is studied, and results for the void volume fraction and the strain at the onset of coalescence are presented. It is concluded that for cracked specimens not only the void volume fraction, but also the typical void size is of importance...... to the fracture strength of ductile materials....

  11. Coalescence of liquid drops: Different models versus experiment

    KAUST Repository

    Sprittles, J. E.; Shikhmurzaev, Y. D.

    2012-01-01

    help to further elucidate the details of the coalescence phenomenon. As a by-product of our research, the range of validity of different "scaling laws" advanced as approximate solutions to the problem formulated using the conventional model

  12. On localization and void coalescence as a precursor to ductile fracture.

    Science.gov (United States)

    Tekoğlu, C; Hutchinson, J W; Pardoen, T

    2015-03-28

Two modes of plastic flow localization commonly occur in the ductile fracture of structural metals undergoing damage and failure by the mechanism involving void nucleation, growth and coalescence. The first mode consists of a macroscopic localization, usually linked to the softening effect of void nucleation and growth, in either a normal band or a shear band where the thickness of the band is comparable to void spacing. The second mode is coalescence with plastic strain localizing to the ligaments between voids by an internal necking process. The ductility of a material is tied to the strain at macroscopic localization, as this marks the limit of uniform straining at the macroscopic scale. The question addressed is whether macroscopic localization occurs prior to void coalescence or whether the two occur simultaneously. The relation between these two modes of localization is studied quantitatively in this paper using a three-dimensional elastic-plastic computational model representing a doubly periodic array of voids within a band confined between two semi-infinite outer blocks of the same material but without voids. At sufficiently high stress triaxiality, a clear separation exists between the two modes of localization. At lower stress triaxialities, the model predicts that the onset of macroscopic localization and coalescence occur simultaneously.

  13. A PIV Study of Drop-interface Coalescence with Surfactants

    Science.gov (United States)

    Weheliye, Weheliye Hashi; Dong, Teng; Angeli, Panagiota

    2017-11-01

In this work, the coalescence of a drop with an aqueous-organic interface was studied by Particle Image Velocimetry (PIV). The effects of surfactants on the drop surface evolution, the vorticity field and the kinetic energy distribution in the drop during coalescence were investigated. The coalescence took place in an acrylic rectangular box with 79% glycerol solution at the bottom and Exxsol D80 oil above. The glycerol solution drop was generated through a nozzle fixed 2 cm above the aqueous/oil interface and was seeded with Rhodamine particles. The whole process was captured by a high-speed camera. Different mass ratios of the non-ionic surfactant Span80 to oil were studied. Increasing the surfactant concentration promoted deformation of the interface before rupture of the trapped oil film. At the early stages after film rupture, two counter-rotating vortices appeared at the bottom of the drop and then travelled to its upper part. Both the propagation rates and the intensities of the vortices decreased at high surfactant concentrations. At early stages, the kinetic energy was mainly distributed near the bottom of the droplet, while at later stages it was distributed near its upper part.

  14. Performance of an iterative two-stage bayesian technique for population pharmacokinetic analysis of rich data sets

    NARCIS (Netherlands)

    Proost, Johannes H.; Eleveld, Douglas J.

    2006-01-01

    Purpose. To test the suitability of an Iterative Two-Stage Bayesian (ITSB) technique for population pharmacokinetic analysis of rich data sets, and to compare ITSB with Standard Two-Stage (STS) analysis and nonlinear Mixed Effect Modeling (MEM). Materials and Methods. Data from a clinical study with

  15. Bayesian theory and applications

    CERN Document Server

    Dellaportas, Petros; Polson, Nicholas G; Stephens, David A

    2013-01-01

    The development of hierarchical models and Markov chain Monte Carlo (MCMC) techniques forms one of the most profound advances in Bayesian analysis since the 1970s and provides the basis for advances in virtually all areas of applied and theoretical Bayesian statistics. This volume guides the reader along a statistical journey that begins with the basic structure of Bayesian theory, and then provides details on most of the past and present advances in this field. The book has a unique format. There is an explanatory chapter devoted to each conceptual advance followed by journal-style chapters that provide applications or further advances on the concept. Thus, the volume is both a textbook and a compendium of papers covering a vast range of topics. It is appropriate for a well-informed novice interested in understanding the basic approach, methods and recent applications. Because of its advanced chapters and recent work, it is also appropriate for a more mature reader interested in recent applications and devel...

  16. Stochastic coalescence in Lagrangian cloud microphysics

    Directory of Open Access Journals (Sweden)

    P. Dziekan

    2017-11-01

Full Text Available Stochasticity of the collisional growth of cloud droplets is studied using the super-droplet method (SDM) of Shima et al. (2009). Statistics are calculated from ensembles of simulations of collision–coalescence in a single well-mixed cell. The SDM is compared with direct numerical simulations and the master equation. It is argued that SDM simulations in which one computational droplet represents one real droplet are at the same level of precision as the master equation. Such simulations are used to study fluctuations in the autoconversion time, the sol–gel transition and the growth rate of lucky droplets, which is compared with a theoretical prediction. The size of the coalescence cell is found to strongly affect system behavior. In small cells, correlations in droplet sizes and droplet depletion slow down rain formation. In large cells, collisions between raindrops are more frequent and this can also slow down rain formation. The increase in the rate of collision between raindrops may be an artifact caused by assuming an overly large well-mixed volume. The highest ratio of rain water to cloud water is found in cells of intermediate sizes. Next, we use these precise simulations to determine the validity of more approximate methods: the Smoluchowski equation and the SDM with multiplicities greater than 1. In the latter, we determine how many computational droplets are necessary to correctly model the expected number and the standard deviation of the autoconversion time. The maximal size of a volume that is turbulently well mixed with respect to coalescence is estimated at Vmix = 1.5 × 10⁻² cm³. The Smoluchowski equation is not valid in such small volumes. It is argued that larger volumes can be considered approximately well mixed, but such an approximation needs to be supported by a comparison with fine-grid simulations that resolve droplet motion.
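The "one computational droplet = one real droplet" regime can be mimicked with a simple Gillespie-style simulation of stochastic coalescence. The sketch below assumes a constant collision kernel for tractability (the real SDM uses physical kernels and super-droplet multiplicities) and checks the mean time for the droplet count to halve against the analytic expectation; the rate constant and counts are illustrative.

```python
import random

def time_to_halve(n0, k, rng):
    """Gillespie-style stochastic coalescence with a constant kernel:
    with N droplets present, the total coalescence rate is k*N*(N-1)/2,
    and each event merges two droplets (N -> N-1). Returns the random
    time for the count to fall from n0 to n0//2."""
    t, n = 0.0, n0
    while n > n0 // 2:
        rate = k * n * (n - 1) / 2.0
        t += rng.expovariate(rate)
        n -= 1
    return t

rng = random.Random(1)
times = [time_to_halve(200, 1e-3, rng) for _ in range(500)]
mean_t = sum(times) / len(times)
# Analytic expectation: sum_{N=101..200} 2/(k*N*(N-1))
#                     = (2/k) * (1/100 - 1/200) = 10.0 for k = 1e-3.
```

Fluctuations around the mean, which the deterministic Smoluchowski equation cannot capture, are exactly what such droplet-resolved ensembles quantify.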

  17. Complexity analysis of accelerated MCMC methods for Bayesian inversion

    International Nuclear Information System (INIS)

    Hoang, Viet Ha; Schwab, Christoph; Stuart, Andrew M

    2013-01-01

    The Bayesian approach to inverse problems, in which the posterior probability distribution on an unknown field is sampled for the purposes of computing posterior expectations of quantities of interest, is starting to become computationally feasible for partial differential equation (PDE) inverse problems. Balancing the sources of error arising from finite-dimensional approximation of the unknown field, the PDE forward solution map and the sampling of the probability space under the posterior distribution are essential for the design of efficient computational Bayesian methods for PDE inverse problems. We study Bayesian inversion for a model elliptic PDE with an unknown diffusion coefficient. We provide complexity analyses of several Markov chain Monte Carlo (MCMC) methods for the efficient numerical evaluation of expectations under the Bayesian posterior distribution, given data δ. Particular attention is given to bounds on the overall work required to achieve a prescribed error level ε. Specifically, we first bound the computational complexity of ‘plain’ MCMC, based on combining MCMC sampling with linear complexity multi-level solvers for elliptic PDE. Our (new) work versus accuracy bounds show that the complexity of this approach can be quite prohibitive. Two strategies for reducing the computational complexity are then proposed and analyzed: first, a sparse, parametric and deterministic generalized polynomial chaos (gpc) ‘surrogate’ representation of the forward response map of the PDE over the entire parameter space, and, second, a novel multi-level Markov chain Monte Carlo strategy which utilizes sampling from a multi-level discretization of the posterior and the forward PDE. For both of these strategies, we derive asymptotic bounds on work versus accuracy, and hence asymptotic bounds on the computational complexity of the algorithms. In particular, we provide sufficient conditions on the regularity of the unknown coefficients of the PDE and on the
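The "plain" MCMC baseline referred to above is, at its core, a random-walk Metropolis sampler; in the PDE setting each step requires a forward solve, which is what drives the work-versus-accuracy bounds. A self-contained toy sketch, with a cheap Gaussian log-posterior standing in for the PDE-based posterior (all numbers illustrative):

```python
import math, random

def metropolis(logpost, x0, steps, scale, rng):
    """Random-walk Metropolis: propose x + N(0, scale^2), accept with
    probability min(1, exp(logpost(y) - logpost(x)))."""
    x, lp = x0, logpost(x0)
    chain = []
    for _ in range(steps):
        y = x + rng.gauss(0.0, scale)
        lq = logpost(y)
        if math.log(rng.random()) < lq - lp:
            x, lp = y, lq
        chain.append(x)
    return chain

# Toy posterior N(2, 0.5^2), standing in for a posterior over a PDE coefficient.
logpost = lambda x: -0.5 * ((x - 2.0) / 0.5) ** 2
rng = random.Random(0)
chain = metropolis(logpost, 0.0, 20000, 0.8, rng)
mean = sum(chain[2000:]) / len(chain[2000:])   # posterior mean after burn-in
```

Surrogate and multi-level strategies reduce cost precisely by making each `logpost` evaluation (the forward solve) cheaper or by spreading samples across discretization levels.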

  18. Shear-induced Bubble Coalescence in Rhyolitic Melts with Low Vesicularity

    Science.gov (United States)

    Okumura, S.; Nakamura, M.; Tsuchiyama, A.

    2006-12-01

Development of bubble structure during magma ascent controls the dynamics of volcanic eruption, because the bubble structure influences the magma rheology and permeability, and hence magma degassing. In flowing magmas, the bubble structure is expected to be changed by shear, as pointed out by some previous studies based on geological observations. However, the development of bubble structure has previously been studied experimentally only in isostatic magmas. We have experimentally demonstrated, for the first time, the shear-induced development of the number density, size and shape of bubbles in a rhyolitic melt. The deformation experiments were performed using an externally heated, piston-cylinder type apparatus with a rotational piston. At 975°C, natural obsidian (initial water content of 0.5 wt%) of cylindrical shape (ca. 4.7 mm in diameter and 5 mm in length) was vesiculated in a graphite container (ca. 5 and 9 mm in inner and outer diameter, respectively, and 5 mm in length), and the vesiculated samples were twisted at various rotational speeds up to 1 rpm. The number density, size and shape of bubbles in the quenched samples were then measured using X-ray computed tomography. The size distribution of bubbles shows that the number of larger bubbles increases with the rotational speed and in the outer zone of the samples, where the shear rate is high. In the high shear rate zone, the magnitude of bubble deformation is large. The 3D images of large bubbles clearly indicate that they were formed by coalescence. These results indicate that the degree of bubble coalescence is enhanced with the shear rate. The experimental results also demonstrated that coalescence of bubbles occurs even at low vesicularity (ca. 20 vol.%). Because the shear rate induced in this study (on the order of 0.01 s⁻¹) seems to be produced for magmas ascending in a volcanic conduit, we propose the possibility that vesiculated magmas undergo bubble coalescence at a

  19. A lattice Boltzmann simulation of coalescence-induced droplet jumping on superhydrophobic surfaces with randomly distributed structures

    Science.gov (United States)

    Zhang, Li-Zhi; Yuan, Wu-Zhi

    2018-04-01

The motion of coalescence-induced condensate droplets on superhydrophobic surfaces (SHS) has attracted increasing attention in energy-related applications. Previous research focused on regularly structured rough surfaces. Here a new approach, a mesoscale lattice Boltzmann method (LBM), is proposed and used to model the dynamic behavior of coalescence-induced droplet jumping on SHS with randomly distributed rough structures. A Fast Fourier Transformation (FFT) method is used to generate non-Gaussian randomly distributed rough surfaces with the skewness (Sk), kurtosis (K) and root-mean-square roughness (Rq) obtained from real surfaces. Three typical spreading states of coalesced droplets are observed through LBM modeling on various rough surfaces, and these are found to significantly influence the jumping ability of the coalesced droplet. Coalesced droplets spreading in the Cassie state or in a composite state will jump off the rough surfaces, while those spreading in the Wenzel state eventually remain on the rough surfaces. It is demonstrated that rough surfaces with smaller Sk, larger Rq and K near 3.0 are beneficial to coalescence-induced droplet jumping. The new approach gives more detailed insights into the design of SHS.
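Generating a random rough profile with a prescribed roughness can be sketched simply. The code below is a simplified one-dimensional stand-in for the paper's FFT construction: it low-pass filters white noise with a moving average and rescales to a target Rq, but does not control Sk or K (the heights stay Gaussian). The grid size, correlation length and Rq value are illustrative.

```python
import math, random

def rough_profile(n, corr_len, target_rq, rng):
    """1-D random rough profile with prescribed RMS roughness Rq.
    A moving-average filter over white noise stands in for an FFT-based
    spectral filter; only Rq is matched, not skewness or kurtosis."""
    noise = [rng.gauss(0.0, 1.0) for _ in range(n + corr_len)]
    prof = [sum(noise[i:i + corr_len]) / corr_len for i in range(n)]
    mean = sum(prof) / n
    prof = [h - mean for h in prof]                   # zero-mean heights
    rq = math.sqrt(sum(h * h for h in prof) / n)      # current RMS roughness
    return [h * target_rq / rq for h in prof]         # rescale to target Rq

rng = random.Random(7)
surface = rough_profile(4096, 16, 0.5, rng)
```

Matching Sk and K as well, as the paper does, requires a non-linear transformation of the Gaussian heights (e.g. a Johnson translation) on top of the spectral filtering.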

  20. Bayesian models: A statistical primer for ecologists

    Science.gov (United States)

    Hobbs, N. Thompson; Hooten, Mevin B.

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods—in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach.Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability and develops a step-by-step sequence of connected ideas, including basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and inference from single and multiple models. This unique book places less emphasis on computer coding, favoring instead a concise presentation of the mathematical statistics needed to understand how and why Bayesian analysis works. It also explains how to write out properly formulated hierarchical Bayesian models and use them in computing, research papers, and proposals.This primer enables ecologists to understand the statistical principles behind Bayesian modeling and apply them to research, teaching, policy, and management.Presents the mathematical and statistical foundations of Bayesian modeling in language accessible to non-statisticiansCovers basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and moreDeemphasizes computer coding in favor of basic principlesExplains how to write out properly factored statistical expressions representing Bayesian models

  1. Knitmesh And Duplex-Nylon Type Coalescence Aids Use In Phase Disengagement

    Directory of Open Access Journals (Sweden)

    Hamit Topuz

    2017-10-01

Full Text Available This study shows how dispersions of immiscible liquids in agitated vessels, with droplet sizes of 100 microns and above, coalesced and settled back into their parent phases with the aid of coalescence aids commercially known as knit-mesh, made from stainless steel and nylon. These components acted as the higher-surface-energy and lower-surface-energy coalescence aids, respectively. For comparison, a coalescence aid made purely from the material commercially known as duplex-nylon was also used. The experimental set-up was a 1/3-scale, single-stage mixer-settler unit based on an existing unit in use at BNFL Springfield Works. The liquid-liquid system consisted of 20% tri-butyl-phosphate (TBP) in technical-grade odorless kerosene, forming the light organic (solvent) phase, and 5 M nitric acid, forming the heavy aqueous phase. The solvent phase contained 70 grams of uranium per liter and was supplied by the above-mentioned company.

  2. Bayesian emulation for optimization in multi-step portfolio decisions

    OpenAIRE

    Irie, Kaoru; West, Mike

    2016-01-01

    We discuss the Bayesian emulation approach to computational solution of multi-step portfolio studies in financial time series. "Bayesian emulation for decisions" involves mapping the technical structure of a decision analysis problem to that of Bayesian inference in a purely synthetic "emulating" statistical model. This provides access to standard posterior analytic, simulation and optimization methods that yield indirect solutions of the decision problem. We develop this in time series portf...

  3. A variational void coalescence model for ductile metals

    KAUST Repository

    Siddiq, Amir; Arciniega, Roman; El Sayed, Tamer

    2011-01-01

    We present a variational void coalescence model that includes all the essential ingredients of failure in ductile porous metals. The model is an extension of the variational void growth model by Weinberg et al. (Comput Mech 37:142-152, 2006

  4. Bayesian Analysis of two Censored Shifted Gompertz Mixture Distributions using Informative and Noninformative Priors

    Directory of Open Access Journals (Sweden)

    Tabassum Naz Sindhu

    2017-03-01

Full Text Available This study deals with the Bayesian analysis of a shifted Gompertz mixture model under type-I censored samples, assuming both informative and noninformative priors. We discuss the Bayesian estimation of the parameters of the shifted Gompertz mixture model under uniform and gamma priors, assuming three loss functions. Further, some properties of the model, with some graphs of the mixture density, are discussed. These properties include Bayes estimators, posterior risks and the reliability function under a simulation scheme. Bayes estimates are obtained for two cases: (a) when the shape parameter is known and (b) when all parameters are unknown. We analyzed some simulated data sets in order to investigate the effect of prior belief and loss functions, and the performance of the proposed estimators of the mixture model parameters.
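The mechanics of Bayes estimation under a loss function are easiest to see in a conjugate toy model. The sketch below uses exponential lifetimes with a gamma prior as a stand-in (the shifted Gompertz mixture itself is not conjugate and requires numerical posterior computation); under squared-error loss the Bayes estimator is the posterior mean and the posterior risk is the posterior variance. The data and prior hyperparameters are invented.

```python
# Conjugate toy: exponential lifetimes with rate lam, prior lam ~ Gamma(a0, b0).
# Posterior: lam | data ~ Gamma(a0 + n, b0 + sum(data)).
data = [0.8, 1.3, 0.4, 2.1, 0.9]       # illustrative observed lifetimes
a0, b0 = 2.0, 1.0                      # gamma prior (shape, rate)

a_n = a0 + len(data)                   # posterior shape
b_n = b0 + sum(data)                   # posterior rate
post_mean = a_n / b_n                  # Bayes estimate under squared-error loss
post_var = a_n / b_n ** 2              # posterior risk under squared-error loss
```

Other loss functions pick out other posterior summaries (e.g. the posterior median under absolute-error loss), which is why the choice of loss function matters in studies like the one above.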

  5. First- and Second-level Bayesian Inference of Flow Resistivity of Sound Absorber and Room’s Influence

    DEFF Research Database (Denmark)

    Choi, Sang-Hyeon; Lee, Ikjin; Jeong, Cheol-Ho

    2016-01-01

The Sabine absorption coefficient is a widely used quantity deduced from reverberation-time measurements via the Sabine equation. First- and second-level Bayesian analyses are used to estimate the flow resistivity of a sound absorber and the influences of the test chambers from Sabine absorption...... coefficients measured in 13 different reverberation chambers. The first-level Bayesian analysis is more general than the second-level analysis. A sharper posterior distribution can be obtained by the second-level Bayesian analysis than by the first-level analysis because more data...... are used to set a more reliable prior distribution. The rooms' influences estimated by the first- and second-level Bayesian analyses are similar to the results estimated by mean-absolute-error minimization....

  6. A Bayesian Nonparametric Approach to Factor Analysis

    DEFF Research Database (Denmark)

    Piatek, Rémi; Papaspiliopoulos, Omiros

    2018-01-01

    This paper introduces a new approach for the inference of non-Gaussian factor models based on Bayesian nonparametric methods. It relaxes the usual normality assumption on the latent factors, widely used in practice, which is too restrictive in many settings. Our approach, on the contrary, does no...

  7. Bayesian analysis of longitudinal Johne's disease diagnostic data without a gold standard test

    DEFF Research Database (Denmark)

    Wang, C.; Turnbull, B.W.; Nielsen, Søren Saxmose

    2011-01-01

    the posterior estimates of the model parameters that provide the basis for inference concerning the accuracy of the diagnostic procedure. Based on the Bayesian approach, the posterior probability distribution of the change-point onset time can be obtained and used as a criterion for infection diagnosis......-point process with a Weibull survival hazard function was used to model the progression of the hidden disease status. The model adjusted for the fixed effects of covariate variables and random effects of subject on the diagnostic testing procedure. Markov chain Monte Carlo methods were used to compute....... An application is presented to an analysis of ELISA and fecal culture test outcomes in the diagnostic testing of paratuberculosis (Johne's disease) for a Danish longitudinal study from January 2000 to March 2003. The posterior probability criterion based on the Bayesian model with 4 repeated observations has...

  8. Coalescence of rotating black holes on Eguchi-Hanson space

    International Nuclear Information System (INIS)

    Matsuno, Ken; Ishihara, Hideki; Kimura, Masashi; Tomizawa, Shinya

    2007-01-01

We obtain new charged rotating multi-black-hole solutions on the Eguchi-Hanson space in the five-dimensional Einstein-Maxwell system with a Chern-Simons term and a positive cosmological constant. In the two-black-hole case, these solutions describe the coalescence of two rotating black holes with horizon topology S³ into a single rotating black hole with the horizon topology of the lens space L(2;1) = S³/Z₂. We discuss the differences in the horizon areas between our solutions and the two-centered Klemm-Sabra solutions, which describe the coalescence of two rotating black holes with horizon topology S³ into a single rotating black hole with horizon topology S³.

  9. Dislocation mediated alignment during metal nanoparticle coalescence

    International Nuclear Information System (INIS)

    Lange, A.P.; Samanta, A.; Majidi, H.; Mahajan, S.; Ging, J.; Olson, T.Y.; Benthem, K. van; Elhadj, S.

    2016-01-01

Dislocation mediated alignment processes during gold nanoparticle coalescence were studied at low and high temperatures using molecular dynamics simulations and transmission electron microscopy. Particles underwent rigid body rotations immediately following attachment in both low temperature (500 K) simulated coalescence events and low temperature (∼315 K) transmission electron microscopy beam heating experiments. In many low temperature simulations, some degree of misorientation between particles remained after rigid body rotations, which was accommodated by grain boundary dislocation nodes. These dislocations were either sessile and remained at the interface for the duration of the simulation, or dissociated and cross-slipped through the adjacent particles, leading to improved co-alignment. Minimal rigid body rotations were observed during or immediately following attachment in high temperature (1100 K) simulations, which is attributed to enhanced diffusion at the particles' interface. However, rotation was eventually induced by {111} slip on planes parallel to the neck groove. These deformation modes led to the formation of single and multi-fold twins whose structures depended on the initial orientation of the particles. The driving force for {111} slip is attributed to high surface stresses near the intersection of low energy {111} facets in the neck region. This twinning process was examined in detail using simulated trajectories, and the results reveal possible mechanisms for the nucleation and propagation of Shockley partials on consecutive planes. Deformation twinning was also observed in situ using transmission electron microscopy, which resulted in the co-alignment of a set of the particles' {111} planes across their grain boundary and an increase in their dihedral angle. This constitutes the first detailed experimental observation of deformation twinning during nanoparticle coalescence, validating simulation results presented here and

  10. Finding the best resolution for the Kingman-Tajima coalescent: theory and applications.

    Science.gov (United States)

    Sainudiin, Raazesh; Stadler, Tanja; Véber, Amandine

    2015-05-01

    Many summary statistics currently used in population genetics and in phylogenetics depend only on a rather coarse resolution of the underlying tree (the number of extant lineages, for example). Hence, for computational purposes, working directly on these resolutions appears to be much more efficient. However, this approach seems to have been overlooked in the past. In this paper, we describe six different resolutions of the Kingman-Tajima coalescent together with the corresponding Markov chains, which are essential for inference methods. Two of the resolutions are the well-known n-coalescent and the lineage death process due to Kingman. Two other resolutions were mentioned by Kingman and Tajima, but never explicitly formalized. Another two resolutions are novel, and complete the picture of a multi-resolution coalescent. For all of them, we provide the forward and backward transition probabilities, the probability of visiting a given state as well as the probability of a given realization of the full Markov chain. We also provide a description of the state-space that highlights the computational gain obtained by working with lower-resolution objects. Finally, we give several examples of summary statistics that depend on a coarser resolution of Kingman's coalescent, on which simulations are usually based.
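The coarsest of these resolutions, Kingman's lineage-death process, tracks only the number of extant lineages and is therefore cheap to simulate. A minimal sketch, with time measured in coalescent units of 2·Ne generations (sample size and replicate count illustrative):

```python
import random

def lineage_death_times(n, rng):
    """Kingman's lineage-death resolution: only the number of extant
    lineages is tracked. While k lineages remain, the holding time is
    exponential with rate k*(k-1)/2 (coalescent units of 2*Ne generations)."""
    return {k: rng.expovariate(k * (k - 1) / 2.0) for k in range(n, 1, -1)}

def expected_height(n):
    # E[T_MRCA] = sum_{k=2..n} 2/(k*(k-1)) = 2*(1 - 1/n)
    return 2.0 * (1.0 - 1.0 / n)

rng = random.Random(3)
reps = 20000
mean_h = sum(sum(lineage_death_times(10, rng).values()) for _ in range(reps)) / reps
# mean_h should approach expected_height(10) = 1.8
```

Summary statistics that depend only on this resolution (such as the time to the most recent common ancestor) never require simulating the full labeled n-coalescent, which is the computational gain the paper formalizes.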

  11. An automated land-use mapping comparison of the Bayesian maximum likelihood and linear discriminant analysis algorithms

    Science.gov (United States)

    Tom, C. H.; Miller, L. D.

    1984-01-01

The Bayesian maximum likelihood parametric classifier has been tested against the data-based formulation designated 'linear discriminant analysis', using the 'GLIKE' decision and 'CLASSIFY' classification algorithms in the Landsat Mapping System. Identical supervised training sets, USGS land use/land cover classes, and various combinations of Landsat image and ancillary geodata variables were used to compare the algorithms' thematic mapping accuracy on a single-date summer subscene, with a cellularized USGS land use map of the same time frame furnishing the ground truth reference. CLASSIFY, which accepts a priori class probabilities, is found to be more accurate than GLIKE, which assumes equal class occurrences, for all three mapping variable sets and both levels of detail. These results may be generalized to direct accuracy, time, cost, and flexibility advantages of linear discriminant analysis over Bayesian methods.
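The practical difference between assuming equal class occurrences (as GLIKE does) and accepting a priori class probabilities (as CLASSIFY does) can be shown with a two-class toy. The Gaussian class models and prior values below are invented for illustration and are unrelated to the Landsat data.

```python
import math

def gauss_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def classify(x, priors):
    """Pick the class maximizing prior * likelihood for two 1-D Gaussian
    classes (means/sigmas are illustrative, not from the Landsat study)."""
    classes = {"urban": (0.0, 1.0), "forest": (2.0, 1.0)}
    scores = {c: priors[c] * gauss_pdf(x, mu, s) for c, (mu, s) in classes.items()}
    return max(scores, key=scores.get)

x = 1.2                                              # pixel slightly closer to "forest"
equal = classify(x, {"urban": 0.5, "forest": 0.5})   # equal-occurrence assumption
skewed = classify(x, {"urban": 0.9, "forest": 0.1})  # a priori class probabilities
```

With equal priors the likelihoods alone decide; a strong prior can flip the decision for the same pixel, which is why supplying realistic class occurrence rates improves mapping accuracy.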

  12. Analysis of neutron reflectivity data: maximum entropy, Bayesian spectral analysis and speckle holography

    International Nuclear Information System (INIS)

    Sivia, D.S.; Hamilton, W.A.; Smith, G.S.

    1991-01-01

The analysis of neutron reflectivity data to obtain nuclear scattering length density profiles is akin to the notorious phaseless Fourier problem, well known in many fields such as crystallography. Current methods of analysis culminate in the refinement of a few parameters of a functional model, and are often preceded by a long and laborious process of trial and error. We start by discussing the use of maximum entropy for obtaining 'free-form' solutions of the density profile, as an alternative to the trial-and-error phase when a functional model is not available. Next we consider a Bayesian spectral analysis approach, which is appropriate for optimising the parameters of a simple (but adequate) type of model when the number of parameters is not known. Finally, we suggest a novel experimental procedure, the analogue of astronomical speckle holography, designed to alleviate the ambiguity problems inherent in traditional reflectivity measurements.

  13. Hip fracture in the elderly: a re-analysis of the EPIDOS study with causal Bayesian networks.

    Science.gov (United States)

    Caillet, Pascal; Klemm, Sarah; Ducher, Michel; Aussem, Alexandre; Schott, Anne-Marie

    2015-01-01

Hip fractures commonly result in permanent disability, institutionalization or death in the elderly. Existing hip-fracture prediction tools are underused in clinical practice, partly due to their lack of intuitive interpretation. By use of a graphical layer, Bayesian network models could increase the attractiveness of fracture prediction tools. Our aim was to study the potential contribution of a causal Bayesian network in this clinical setting. A logistic regression was performed as a standard control approach to check the robustness of the causal Bayesian network approach. EPIDOS is a multicenter study, conducted in an ambulatory care setting in five French cities between 1992 and 1996 and updated in 2010. The study included 7598 women aged 75 years or older, in whom fractures were assessed quarterly during 4 years. A causal Bayesian network and a logistic regression were performed on EPIDOS data to describe the major variables involved in hip fracture occurrences. Both models had similar association estimations and predictive performances. They detected gait speed and bone mineral density as the variables most involved in the fracture process. The causal Bayesian network showed that gait speed and bone mineral density were directly connected to fracture and seem to mediate the influence of all the other variables included in our model. The logistic regression approach detected multiple interactions involving psychotropic drug use, age and bone mineral density. Both approaches retrieved similar variables as predictors of hip fractures. However, the Bayesian network highlighted the whole web of relations between the variables involved in the analysis, suggesting a possible mechanism leading to hip fracture. According to the latter results, intervention focusing concomitantly on gait speed and bone mineral density may be necessary for an optimal prevention of hip fracture occurrence in elderly people.

  14. Coalescing colony model: Mean-field, scaling, and geometry

    Science.gov (United States)

    Carra, Giulia; Mallick, Kirone; Barthelemy, Marc

    2017-12-01

    We analyze the coalescing model where a 'primary' colony grows and randomly emits secondary colonies that spread and eventually coalesce with it. This model describes population proliferation in theoretical ecology and tumor growth, and is also of great interest for modeling urban sprawl. Assuming the primary colony to be always circular of radius r(t) and the emission rate proportional to r(t)^θ, where θ > 0, we derive the mean-field equations governing the dynamics of the primary colony, calculate the scaling exponents versus θ, and compare our results with numerical simulations. We then critically test the validity of the circular approximation for the colony shape and show that it is sound for a constant emission rate (θ = 0). However, when the emission rate is proportional to the perimeter, the circular approximation breaks down and the roughness of the primary colony cannot be discarded, thus modifying the scaling exponents.

  15. Modelling binary black-hole coalescence

    International Nuclear Information System (INIS)

    Baker, John

    2003-01-01

    The final burst of radiation from the coalescence of two supermassive black holes produces extraordinary gravitational wave luminosity, making these events visible to LISA even out to large redshift. Interpreting such observations will require detailed theoretical models, based on general relativity. The effort to construct these models is just beginning to produce results. I describe the Lazarus approach to modelling these radiation bursts, and present recent results which indicate that the system loses, in the last few wave cycles, about 3% of its mass-energy as strongly polarized gravitational radiation.

  16. Simulations of plasma heating caused by the coalescence of multiple current loops in a proton-boron fusion plasma

    International Nuclear Information System (INIS)

    Haruki, T.; Yousefi, H. R.; Sakai, J.-I.

    2010-01-01

    Two-dimensional particle-in-cell simulations of a dense plasma focus were performed to investigate a plasma heating process caused by the coalescence of multiple current loops in a proton-boron-electron plasma. Recently, it was reported that the electric field produced during the coalescence of two current loops in a proton-boron-electron plasma heats up all plasma species; proton-boron nuclear fusion may therefore be achievable using a dense plasma focus device. Based on this work, the coalescence process for four and eight current loops was investigated. It was found that the return current plays an important role in both the current pinch and the plasma heating. The coalescence of four current loops led to the breakup of the return current from the pinched plasma, resulting in plasma heating. For the coalescence of eight current loops, the plasma was confined by the pinch but the plasma heating was smaller than in the two- and four-loop cases. Therefore the heating associated with current loop coalescence depends on the number of initial current loops. These results are useful for understanding the coalescence of multiple current loops in a proton-boron-electron plasma.

  17. Direct observation of shear–induced nanocrystal attachment and coalescence in CuZr-based metallic glasses: TEM investigation

    International Nuclear Information System (INIS)

    Hajlaoui, K.; Alrasheedi, Nashmi H.; Yavari, A.R.

    2016-01-01

    In-situ tensile straining tests were performed in a transmission electron microscope (TEM) to analyse the deformation processes in CuZr-based metallic glasses and to directly observe the occurrence of phase transformation. We report evidence of shear-induced coalescence of nanocrystals in the vicinity of deformed regions. Nanocrystals grow in shear bands, come into contact, attach, and progressively coalesce under the applied shear stress. - Highlights: • In-situ tensile straining tests in the TEM were performed on a CuZr-based metallic glass. • Strain induces nanocrystallization and subsequent attachment and coalescence of nanocrystals. • The coalescence of nanocrystals compensates strain softening in metallic glasses.

  18. Bayesian models for comparative analysis integrating phylogenetic uncertainty

    Directory of Open Access Journals (Sweden)

    Villemereuil Pierre de

    2012-06-01

    Full Text Available Abstract Background Uncertainty in comparative analyses can come from at least two sources: (a) phylogenetic uncertainty in the tree topology or branch lengths, and (b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible

  19. Bayesian models for comparative analysis integrating phylogenetic uncertainty

    Science.gov (United States)

    2012-01-01

    Background Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible general purpose tool for
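    The core idea, averaging the comparative analysis over a posterior set of trees rather than conditioning on one consensus tree, can be sketched as follows. Each "tree" is stood in for here by a randomly perturbed phylogenetic covariance matrix and the trait data are simulated; a real analysis would draw the matrices from the posterior tree set of a Bayesian phylogenetics program:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_species, n_trees = 20, 50
    x = rng.normal(size=n_species)                 # simulated predictor trait
    y = 2.0 * x + rng.normal(size=n_species)       # simulated response trait

    def gls_slope(x, y, C):
        """Generalised least squares slope of y on x under phylogenetic covariance C."""
        Ci = np.linalg.inv(C)
        X = np.column_stack([np.ones_like(x), x])
        return np.linalg.solve(X.T @ Ci @ X, X.T @ Ci @ y)[1]

    slopes = []
    for _ in range(n_trees):
        A = rng.normal(size=(n_species, n_species)) * 0.1
        C = np.eye(n_species) + A @ A.T            # placeholder "posterior tree"
        slopes.append(gls_slope(x, y, C))

    # Pooling over trees propagates phylogenetic uncertainty into the estimate.
    print(np.mean(slopes), np.std(slopes))
    ```

    In the paper's full MCMC formulation the tree and the regression parameters are sampled jointly; the Monte Carlo average above conveys only why the pooled interval is wider than a single-tree one.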

  20. Probabilistic Safety Analysis of High Speed and Conventional Lines Using Bayesian Networks

    Energy Technology Data Exchange (ETDEWEB)

    Grande Andrade, Z.; Castillo Ron, E.; O'Connor, A.; Nogal, M.

    2016-07-01

    A Bayesian network approach is presented for probabilistic safety analysis (PSA) of railway lines. The idea consists of identifying and reproducing all the elements that the train encounters when circulating along a railway line, such as light and speed limit signals, tunnel or viaduct entries or exits, cuttings and embankments, acoustic sounds received in the cabin, curves, switches, etc. In addition, since human error is very relevant for safety evaluation, the automatic train protection (ATP) systems and the driver behavior and its time evolution are modelled and taken into account to determine the probabilities of human errors. The nodes of the Bayesian network, their links and the associated probability tables are automatically constructed based on the line data, which must be carefully specified. The conditional probability tables are reproduced by closed formulas, which facilitates the modelling and the sensitivity analysis. A sorted list of the most dangerous elements in the line is obtained, which permits making decisions about line safety and programming maintenance operations in order to optimize them and reduce maintenance costs substantially. The proposed methodology is illustrated by its application to several cases that include real lines such as the Palencia-Santander and the Dublin-Belfast lines. (Author)
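    A toy version of such a network, with three binary nodes and purely illustrative probability tables (none of the numbers come from the paper), can be queried by brute-force enumeration:

    ```python
    from itertools import product

    # Nodes: atp (ATP system available), err (driver error), inc (incident).
    P_atp = {True: 0.99, False: 0.01}
    P_err = {True: 0.001, False: 0.05}                 # P(err | atp): ATP loss raises risk
    P_inc = {(False, True): 1e-6, (False, False): 1e-4,
             (True, True): 0.05, (True, False): 0.5}   # P(inc | err, atp)

    def joint(atp, err, inc):
        p = P_atp[atp]
        p *= P_err[atp] if err else 1.0 - P_err[atp]
        q = P_inc[(err, atp)]
        return p * (q if inc else 1.0 - q)

    p_inc = sum(joint(a, e, True) for a, e in product((True, False), repeat=2))
    p_atp_fail = sum(joint(False, e, True) for e in (True, False)) / p_inc
    print(p_inc, p_atp_fail)   # incident probability; P(ATP unavailable | incident)
    ```

    In the paper the probability tables come from closed formulas over line data rather than literal dictionaries; queries like the posterior above are what rank the dangerous elements of a line.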

  1. Joint analysis of input and parametric uncertainties in watershed water quality modeling: A formal Bayesian approach

    Science.gov (United States)

    Han, Feng; Zheng, Yi

    2018-06-01

    Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach for the joint inference of parameters and inputs.
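    The flavour of such a formal likelihood can be illustrated with a simplified stand-in: lag-1 autocorrelated, heteroscedastic *Gaussian* residuals (the skew and kurtosis parameters of the paper's actual SEP error model are omitted here, and all numbers are synthetic):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 500
    sim = 5.0 + np.sin(np.linspace(0.0, 20.0, n))       # hypothetical model output
    a, b, phi = 0.1, 0.2, 0.6                           # error-model parameters
    sigma = a + b * np.abs(sim)                         # heteroscedastic scale
    eta = np.zeros(n)
    for t in range(1, n):                               # lag-1 autocorrelated noise
        eta[t] = phi * eta[t - 1] + rng.normal()
    obs = sim + sigma * eta                             # synthetic "observations"

    def log_likelihood(obs, sim, a, b, phi):
        sigma = a + b * np.abs(sim)
        eta = (obs - sim) / sigma                       # standardised residuals
        innov = eta[1:] - phi * eta[:-1]                # whitened innovations
        return np.sum(-0.5 * np.log(2.0 * np.pi) - 0.5 * innov**2 - np.log(sigma[1:]))

    # The likelihood prefers the autocorrelation actually present in the errors.
    print(log_likelihood(obs, sim, a, b, 0.6), log_likelihood(obs, sim, a, b, 0.0))
    ```

    In the BAIPU this kind of likelihood is evaluated inside the MCMC sampler jointly over inputs and parameters, rather than at fixed error-model settings as here.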

  2. Bayesian analysis of log Gaussian Cox processes for disease mapping

    DEFF Research Database (Denmark)

    Benes, Viktor; Bodlák, Karel; Møller, Jesper

    We consider a data set of locations where people in Central Bohemia have been infected by tick-borne encephalitis, and where population census data and covariates concerning vegetation and altitude are available. The aims are to estimate the risk map of the disease and to study the dependence...... of the risk on the covariates. Instead of using the common area level approaches we consider a Bayesian analysis for a log Gaussian Cox point process with covariates. Posterior characteristics for a discretized version of the log Gaussian Cox process are computed using Markov chain Monte Carlo methods...

  3. A Bayesian Approach to the Overlap Analysis of Epidemiologically Linked Traits.

    Science.gov (United States)

    Asimit, Jennifer L; Panoutsopoulou, Kalliope; Wheeler, Eleanor; Berndt, Sonja I; Cordell, Heather J; Morris, Andrew P; Zeggini, Eleftheria; Barroso, Inês

    2015-12-01

    Diseases often co-occur in individuals more often than expected by chance, and this may be explained by shared underlying genetic etiology. A common approach to genetic overlap analyses is to use summary genome-wide association study data to identify single-nucleotide polymorphisms (SNPs) that are associated with multiple traits at a selected P-value threshold. However, P-values do not account for differences in power, whereas Bayes factors (BFs) do, and may be approximated using summary statistics. We use simulation studies to compare the power of frequentist and Bayesian approaches to overlap analyses, and to decide on appropriate thresholds for comparison between the two methods. It is empirically illustrated that BFs have the advantage over P-values of a decreasing type I error rate as study size increases for single-disease associations. Consequently, the overlap analysis of traits from different-sized studies encounters issues in fair P-value threshold selection, whereas BFs are adjusted automatically. Extensive simulations show that Bayesian overlap analyses tend to have higher power than those that assess association strength with P-values, particularly in low-power scenarios. Calibration tables between BFs and P-values are provided for a range of sample sizes, as well as an approximation approach for sample sizes that are not in the calibration table. Although P-values are sometimes thought more intuitive, these tables assist in removing the opaqueness of Bayesian thresholds and may also be used in the selection of a BF threshold to meet a certain type I error rate. An application of our methods is used to identify variants associated with both obesity and osteoarthritis. © 2015 The Authors. Genetic Epidemiology published by Wiley Periodicals, Inc.
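    One widely used summary-statistic approximation of this kind (in the spirit of Wakefield's approximate Bayes factor; the normal effect-size prior and its scale are modelling choices, not taken from the paper) is only a few lines:

    ```python
    import math

    def abf_assoc(beta_hat, se, prior_sd=0.2):
        """Approximate Bayes factor for association vs. the null, from a summary
        effect estimate and its standard error.  The true effect is given a
        N(0, prior_sd**2) prior; prior_sd is an assumed modelling choice."""
        V, W = se**2, prior_sd**2
        z2 = (beta_hat / se) ** 2
        # BF_10 = N(beta_hat; 0, V + W) / N(beta_hat; 0, V)
        return math.sqrt(V / (V + W)) * math.exp(z2 * W / (2.0 * (V + W)))

    # The same z-score (z = 4) from studies of different sizes yields different
    # evidence -- the sample-size adjustment that a fixed P-value threshold lacks.
    print(abf_assoc(0.04, 0.01))   # large study (small SE)
    print(abf_assoc(0.40, 0.10))   # small study (large SE)
    ```

    This dependence on the standard error, not just the z-score, is why BF thresholds transfer fairly across studies of different sizes.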

  4. Bayesian analysis of factors associated with fibromyalgia syndrome subjects

    Science.gov (United States)

    Jayawardana, Veroni; Mondal, Sumona; Russek, Leslie

    2015-01-01

    Factors contributing to movement-related fear were assessed by Russek et al. (2014) for subjects with fibromyalgia (FM), based on data collected through a national internet survey of community-based individuals. The study focused on the variables Activities-Specific Balance Confidence scale (ABC), Primary Care Post-Traumatic Stress Disorder screen (PC-PTSD), Tampa Scale of Kinesiophobia (TSK), a Joint Hypermobility Syndrome screen (JHS), Vertigo Symptom Scale (VSS-SF), Obsessive-Compulsive Personality Disorder (OCPD), pain, work status and physical activity derived from the Revised Fibromyalgia Impact Questionnaire (FIQR). The study presented in this paper revisits the same data with a Bayesian analysis in which appropriate priors are introduced for the variables selected in Russek's paper.

  5. Evaluation of plugging criteria on steam generator tubes and coalescence model of collinear axial through-wall cracks

    International Nuclear Information System (INIS)

    Lee, Jin Ho; Park, Youn Won; Song, Myung Ho; Kim, Young Jin; Moon, Seong In

    2000-01-01

    In a nuclear power plant, steam generator tubes cover a major portion of the primary pressure-retaining boundary. Thus very conservative approaches have been taken with regard to steam generator tube integrity. According to the present criteria, tubes wall-thinned in excess of 40% should be plugged, whatever the cause. However, many analytical and experimental results have shown that no safety problem exists even with thickness reductions greater than 40%. The present criterion was developed about twenty years ago, when wear and pitting were the dominant causes of steam generator tube degradation. Moreover, it is based on tubes with single cracks, despite the fact that multiple cracks are more common in practice. The objective of this study is to review the conservatism of the present plugging criteria for steam generator tubes and to propose a new coalescence model for two adjacent through-wall cracks in steam generator tubes. Using the existing failure models and experimental results, we reviewed the conservatism of the present plugging criteria. To verify the usefulness of the proposed coalescence model, we performed finite element analysis and some parametric studies. We then developed a coalescence evaluation diagram.

  6. Systemic antibiotics in the treatment of aggressive periodontitis. A systematic review and a Bayesian Network meta-analysis.

    Science.gov (United States)

    Rabelo, Cleverton Correa; Feres, Magda; Gonçalves, Cristiane; Figueiredo, Luciene C; Faveri, Marcelo; Tu, Yu-Kang; Chambrone, Leandro

    2015-07-01

    The aim of this study was to assess the effect of systemic antibiotic therapy on the treatment of aggressive periodontitis (AgP). This study was conducted and reported in accordance with the PRISMA statement. The MEDLINE, EMBASE and CENTRAL databases were searched up to June 2014 for randomized clinical trials comparing the treatment of subjects with AgP with either scaling and root planing (SRP) alone or associated with systemic antibiotics. A Bayesian network meta-analysis was prepared using Bayesian random-effects hierarchical models and the outcomes reported at 6-month post-treatment. Out of 350 papers identified, 14 studies were eligible. Greater gain in clinical attachment (CA) (mean difference [MD]: 1.08 mm) and additional benefits in CA gain and PD reduction were found when SRP was associated with systemic antibiotics. SRP plus systemic antibiotics led to an additional clinical effect compared with SRP alone in the treatment of AgP. Of the antibiotic protocols available for inclusion in the Bayesian network meta-analysis, Mtz and Mtz/Amx provided the most beneficial outcomes. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  7. Testing students' e-learning via Facebook through Bayesian structural equation modeling.

    Science.gov (United States)

    Salarzadeh Jenatabadi, Hashem; Moghavvemi, Sedigheh; Wan Mohamed Radzi, Che Wan Jasimah Bt; Babashamsi, Parastoo; Arashi, Mohammad

    2017-01-01

    Learning is an intentional activity, with several factors affecting students' intention to use new learning technology. Researchers have investigated technology acceptance in different contexts by developing various theories/models and testing them by a number of means. Although most theories/models developed have been examined through regression or structural equation modeling, Bayesian analysis offers more accurate data analysis results. To address this gap, the unified theory of acceptance and use of technology in the context of e-learning via Facebook is re-examined in this study using Bayesian analysis. The data (S1 Data) were collected from 170 students enrolled in a business statistics course at University of Malaya, Malaysia, and tested with the maximum likelihood and Bayesian approaches. The difference between the two methods' results indicates that performance expectancy and hedonic motivation are the strongest factors influencing the intention to use e-learning via Facebook. The Bayesian estimation model exhibited better data fit than the maximum likelihood estimator model. The results of the Bayesian and maximum likelihood estimator approaches are compared and the reasons for the result discrepancy are deliberated.

  8. Testing students' e-learning via Facebook through Bayesian structural equation modeling.

    Directory of Open Access Journals (Sweden)

    Hashem Salarzadeh Jenatabadi

    Full Text Available Learning is an intentional activity, with several factors affecting students' intention to use new learning technology. Researchers have investigated technology acceptance in different contexts by developing various theories/models and testing them by a number of means. Although most theories/models developed have been examined through regression or structural equation modeling, Bayesian analysis offers more accurate data analysis results. To address this gap, the unified theory of acceptance and use of technology in the context of e-learning via Facebook is re-examined in this study using Bayesian analysis. The data (S1 Data) were collected from 170 students enrolled in a business statistics course at University of Malaya, Malaysia, and tested with the maximum likelihood and Bayesian approaches. The difference between the two methods' results indicates that performance expectancy and hedonic motivation are the strongest factors influencing the intention to use e-learning via Facebook. The Bayesian estimation model exhibited better data fit than the maximum likelihood estimator model. The results of the Bayesian and maximum likelihood estimator approaches are compared and the reasons for the result discrepancy are deliberated.

  9. Bayesian biostatistics

    CERN Document Server

    Lesaffre, Emmanuel

    2012-01-01

    The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introd

  10. Bayesian analysis of Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2006-01-01

    Recently Møller, Pettitt, Berthelsen and Reeves introduced a new MCMC methodology for drawing samples from a posterior distribution when the likelihood function is only specified up to a normalising constant. We illustrate the method in the setting of Bayesian inference for Markov point processes...... a partially ordered Markov point process as the auxiliary variable. As the method requires simulation from the "unknown" likelihood, perfect simulation algorithms for spatial point processes become useful....

  11. Second harmonic electromagnetic emission via Langmuir wave coalescence

    International Nuclear Information System (INIS)

    Willes, A.J.; Robinson, P.A.; Melrose, D.B.

    1996-01-01

    The coalescence of Langmuir waves to produce electromagnetic waves at twice the plasma frequency is considered. A simplified expression for the rate of production of second harmonic electromagnetic waves is obtained for a broad class of Langmuir spectra. In addition, two different analytic approximations are considered. The validity of the commonly used head-on approximation is explored, in which the two coalescing Langmuir waves are assumed to approach from opposite directions. This approximation breaks down at low Langmuir wavenumbers, and for narrow Langmuir wave spectra. A second, more general, approximation is introduced, called the narrow-spectrum approximation, which requires narrow spectral widths of the Langmuir spectra. The advantages of this approximation are that it does not break down at low Langmuir wavenumbers, and that it remains valid for relatively broad Langmuir wave spectra. Finally, the applicability of these approximations in treating harmonic radiation in type III solar radio bursts is discussed. copyright 1996 American Institute of Physics

  12. Characteristic dynamics near two coalescing eigenvalues incorporating continuum threshold effects

    Science.gov (United States)

    Garmon, Savannah; Ordonez, Gonzalo

    2017-06-01

    It has been reported in the literature that the survival probability P(t) near an exceptional point where two eigenstates coalesce should generally exhibit an evolution P(t) ~ t^2 e^(-Γt), in which Γ is the decay rate of the coalesced eigenstate; this has been verified in a microwave billiard experiment [B. Dietz et al., Phys. Rev. E 75, 027201 (2007)]. However, the heuristic effective Hamiltonian that is usually employed to obtain this result ignores the possible influence of the continuum threshold on the dynamics. By contrast, in this work we employ an analytical approach starting from the microscopic Hamiltonian representing two simple models in order to show that the continuum threshold has a strong influence on the dynamics near exceptional points in a variety of circumstances. To report our results, we divide the exceptional points in Hermitian open quantum systems into two cases: at an EP2A two virtual bound states coalesce before forming a resonance/anti-resonance pair with complex conjugate eigenvalues, while at an EP2B two resonances coalesce before forming two different resonances. For the EP2B, which is the case studied in the microwave billiard experiment, we verify that the survival probability exhibits the previously reported modified exponential decay on intermediate time scales, but this is replaced with an inverse power law on very long time scales. Meanwhile, for the EP2A the influence from the continuum threshold is so strong that the evolution is non-exponential on all time scales and the heuristic approach fails completely. When the EP2A appears very near the threshold, we obtain the novel evolution P(t) ~ 1 - C₁√t on intermediate time scales, while further away the parabolic decay (Zeno dynamics) on short time scales is enhanced.
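    A minimal effective-Hamiltonian caricature, a 2x2 Jordan block rather than the paper's microscopic models (and exactly the kind of heuristic the authors show can fail near a threshold), reproduces the t^2 e^(-Γt) behaviour:

    ```python
    import numpy as np
    from scipy.linalg import expm

    E0, Gamma = 1.0, 0.4
    # At an exceptional point the effective Hamiltonian is non-diagonalisable:
    # a Jordan block with the single complex eigenvalue E0 - i*Gamma/2.
    H = np.array([[E0 - 0.5j * Gamma, 1.0],
                  [0.0, E0 - 0.5j * Gamma]])

    def transition_prob(t):
        U = expm(-1j * H * t)
        return abs(U[0, 1]) ** 2    # equals t**2 * exp(-Gamma * t) exactly here

    for t in (0.5, 1.0, 2.0):
        print(t, transition_prob(t), t**2 * np.exp(-Gamma * t))
    ```

    The linear-in-t amplitude comes from the nilpotent part of the Jordan block; threshold effects, which this caricature omits by construction, are what replace this decay law in the paper's analysis.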

  13. Hierarchical Bayesian Analysis of Biased Beliefs and Distributional Other-Regarding Preferences

    Directory of Open Access Journals (Sweden)

    Jeroen Weesie

    2013-02-01

    Full Text Available This study investigates the relationship between an actor’s beliefs about others’ other-regarding (social) preferences and her own other-regarding preferences, using an “avant-garde” hierarchical Bayesian method. We estimate two distributional other-regarding preference parameters, α and β, of actors using incentivized choice data in binary Dictator Games. Simultaneously, we estimate the distribution of actors’ beliefs about others’ α and β, conditional on actors’ own α and β, with incentivized belief elicitation. We demonstrate the benefits of the Bayesian method compared to its hierarchical frequentist counterparts. Results show a positive association between an actor’s own (α, β) and her beliefs about the average (α, β) in the population. The association between own preferences and the variance in beliefs about others’ preferences in the population, however, is curvilinear for α and insignificant for β. These results are partially consistent with the cone effect [1,2], which is described in detail below. Because in the Bayesian-Nash equilibrium concept beliefs and own preferences are assumed to be independent, these results cast doubt on the application of the Bayesian-Nash equilibrium concept to experimental data.

  14. Bayesian inference of the number of factors in gene-expression analysis: application to human virus challenge studies

    Directory of Open Access Journals (Sweden)

    Hero Alfred

    2010-11-01

    Full Text Available Abstract Background Nonparametric Bayesian techniques have been developed recently to extend the sophistication of factor models, allowing one to infer the number of appropriate factors from the observed data. We consider such techniques for sparse factor analysis, with application to gene-expression data from three virus challenge studies. Particular attention is placed on employing the Beta Process (BP), the Indian Buffet Process (IBP), and related sparseness-promoting techniques to infer a proper number of factors. The posterior density function on the model parameters is computed using Gibbs sampling and variational Bayesian (VB) analysis. Results Time-evolving gene-expression data are considered for respiratory syncytial virus (RSV), Rhino virus, and influenza, using blood samples from healthy human subjects. These data were acquired in three challenge studies, each executed after receiving institutional review board (IRB) approval from Duke University. Comparisons are made between several alternative means of performing nonparametric factor analysis on these data, with comparisons as well to sparse-PCA and Penalized Matrix Decomposition (PMD), closely related non-Bayesian approaches. Conclusions Applying the Beta Process to the factor scores, or to the singular values of a pseudo-SVD construction, the proposed algorithms infer the number of factors in gene-expression data. For real data the "true" number of factors is unknown; in our simulations we consider a range of noise variances, and the proposed Bayesian models inferred the number of factors accurately relative to other methods in the literature, such as sparse-PCA and PMD. We have also identified a "pan-viral" factor of importance for each of the three viruses considered in this study. We have identified a set of genes associated with this pan-viral factor, of interest for early detection of such viruses based upon the host response, as quantified via gene-expression data.

  15. Bayesian inference of the number of factors in gene-expression analysis: application to human virus challenge studies.

    Science.gov (United States)

    Chen, Bo; Chen, Minhua; Paisley, John; Zaas, Aimee; Woods, Christopher; Ginsburg, Geoffrey S; Hero, Alfred; Lucas, Joseph; Dunson, David; Carin, Lawrence

    2010-11-09

    Nonparametric Bayesian techniques have been developed recently to extend the sophistication of factor models, allowing one to infer the number of appropriate factors from the observed data. We consider such techniques for sparse factor analysis, with application to gene-expression data from three virus challenge studies. Particular attention is placed on employing the Beta Process (BP), the Indian Buffet Process (IBP), and related sparseness-promoting techniques to infer a proper number of factors. The posterior density function on the model parameters is computed using Gibbs sampling and variational Bayesian (VB) analysis. Time-evolving gene-expression data are considered for respiratory syncytial virus (RSV), Rhino virus, and influenza, using blood samples from healthy human subjects. These data were acquired in three challenge studies, each executed after receiving institutional review board (IRB) approval from Duke University. Comparisons are made between several alternative means of performing nonparametric factor analysis on these data, with comparisons as well to sparse-PCA and Penalized Matrix Decomposition (PMD), closely related non-Bayesian approaches. Applying the Beta Process to the factor scores, or to the singular values of a pseudo-SVD construction, the proposed algorithms infer the number of factors in gene-expression data. For real data the "true" number of factors is unknown; in our simulations we consider a range of noise variances, and the proposed Bayesian models inferred the number of factors accurately relative to other methods in the literature, such as sparse-PCA and PMD. We have also identified a "pan-viral" factor of importance for each of the three viruses considered in this study. We have identified a set of genes associated with this pan-viral factor, of interest for early detection of such viruses based upon the host response, as quantified via gene-expression data.

  16. How to practise Bayesian statistics outside the Bayesian church: What philosophy for Bayesian statistical modelling?

    NARCIS (Netherlands)

    Borsboom, D.; Haig, B.D.

    2013-01-01

    Unlike most other statistical frameworks, Bayesian statistical inference is wedded to a particular approach in the philosophy of science (see Howson & Urbach, 2006); this approach is called Bayesianism. Rather than being concerned with model fitting, this position in the philosophy of science

  17. Bayesian Alternation During Tactile Augmentation

    Directory of Open Access Journals (Sweden)

    Caspar Mathias Goeke

    2016-10-01

    Full Text Available A large number of studies suggest that the integration of multisensory signals by humans is well described by Bayesian principles. However, there are very few reports about cue combination between a native and an augmented sense. In particular, we asked whether adult participants are able to integrate an augmented sensory cue with existing native sensory information. For the purpose of this study we built a tactile augmentation device and compared different hypotheses of how untrained adult participants combine information from a native and an augmented sense. In a two-interval forced choice (2IFC) task, while subjects were blindfolded and seated on a rotating platform, our sensory augmentation device translated information on whole-body yaw rotation to tactile stimulation. Three conditions were realized: tactile stimulation only (augmented condition), rotation only (native condition), and both augmented and native information (bimodal condition). Participants had to choose the one of two consecutive rotations with the higher angular rotation. For the analysis, we fitted the participants' responses with a probit model and calculated the just noticeable difference (JND). Then we compared several models for predicting bimodal from unimodal responses. An objective Bayesian alternation model yielded a better prediction (χ²_red = 1.67) than the Bayesian integration model (χ²_red = 4.34). A non-Bayesian winner-takes-all model, which used either only native or only augmented values per subject for prediction, showed slightly higher accuracy (χ²_red = 1.64). However, the performance of the Bayesian alternation model could be substantially improved (χ²_red = 1.09) by utilizing subjective weights obtained by a questionnaire. As a result, the subjective Bayesian alternation model predicted bimodal performance most accurately among all tested models. These results suggest that information from augmented and existing sensory modalities in
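The competing predictions can be sketched numerically. Under Bayesian integration, cue precisions (inverse variances) add, so the bimodal JND must fall below either unimodal JND; under alternation, each trial uses one cue, so the predicted bimodal variance is a reliability-weighted mixture. The JND values below are hypothetical, not the study's data.

```python
import numpy as np

# Hypothetical unimodal JNDs for yaw rotation (degrees); illustrative.
jnd_native = 4.0      # rotation-only (native) condition
jnd_augmented = 6.0   # tactile-only (augmented) condition

# Bayesian integration: precisions (inverse variances) add, so the
# predicted bimodal JND falls below either unimodal JND.
jnd_integration = np.sqrt(1.0 / (1.0 / jnd_native**2 + 1.0 / jnd_augmented**2))

# Bayesian alternation: each trial uses one cue, chosen with probability
# proportional to its reliability, giving a mixture of the variances.
w = (1.0 / jnd_native**2) / (1.0 / jnd_native**2 + 1.0 / jnd_augmented**2)
jnd_alternation = np.sqrt(w * jnd_native**2 + (1.0 - w) * jnd_augmented**2)

print(jnd_integration, jnd_alternation)   # integration predicts the lower JND
```

Comparing which prediction matches the measured bimodal JND is what separates the integration and alternation hypotheses in the study.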

  18. Experiences in applying Bayesian integrative models in interdisciplinary modeling: the computational and human challenges

    DEFF Research Database (Denmark)

    Kuikka, Sakari; Haapasaari, Päivi Elisabet; Helle, Inari

    2011-01-01

    We review the experience obtained in using integrative Bayesian models in interdisciplinary analysis focusing on sustainable use of marine resources and environmental management tasks. We have applied Bayesian models to both fisheries and environmental risk analysis problems. Bayesian belief...... be time consuming and research projects can be difficult to manage due to unpredictable technical problems related to parameter estimation. Biology, sociology and environmental economics have their own scientific traditions. Bayesian models are becoming traditional tools in fisheries biology, where...

  19. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  20. An investigation of bubble coalescence and post-rupture oscillation in non-ionic surfactant solutions using high-speed cinematography.

    Science.gov (United States)

    Bournival, G; Ata, S; Karakashev, S I; Jameson, G J

    2014-01-15

    Most processes involving bubbling in a liquid require small bubbles to maximise mass/energy transfer. A common method to prevent bubbles from coalescing is the addition of surfactants. In order to gain insight into the coalescence process, capillary bubbles were observed using high-speed cinematography. Experiments were performed in solutions of 1-pentanol, 4-methyl-2-pentanol, tri(propylene glycol) methyl ether, and poly(propylene glycol), from which information such as the coalescence time and the deformation of the resultant bubble upon coalescence was extracted. It is shown in this study that the coalescence time increases with surfactant concentration until the appearance of a plateau. The increase in coalescence time with surfactant concentration could not be attributed to surface elasticity alone. The oscillation of the resultant bubble was characterised by the damping of the oscillation. The results suggested that a minimum elasticity is required to achieve increased damping, and that considerable diffusion has a detrimental effect on the dynamic response of the bubble, thereby reducing the damping. Copyright © 2013 Elsevier Inc. All rights reserved.
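Damping of a post-rupture oscillation can be quantified, for example, by the logarithmic decrement between successive peaks of the shape signal. The sketch below uses a synthetic damped cosine in place of digitized high-speed video; the frequency and damping values are illustrative.

```python
import numpy as np

# Synthetic post-coalescence shape oscillation: a damped cosine sampled
# at high speed, standing in for a digitized high-speed video trace.
t = np.linspace(0.0, 0.05, 5000)       # 50 ms record
freq, damping = 400.0, 60.0            # Hz, 1/s (illustrative)
signal = np.exp(-damping * t) * np.cos(2 * np.pi * freq * t)

# Logarithmic decrement between successive positive peaks recovers the
# damping coefficient without fitting the whole waveform.
peaks = [i for i in range(1, t.size - 1)
         if signal[i] > signal[i - 1] and signal[i] > signal[i + 1]]
delta = np.log(signal[peaks[0]] / signal[peaks[1]])
period = t[peaks[1]] - t[peaks[0]]
print(delta / period)   # ≈ 60, the damping coefficient
```

Comparing this damping coefficient across surfactant concentrations is one way to express the "damping of the oscillation" the abstract refers to.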

  1. A New Bayesian Approach for Estimating the Presence of a Suspected Compound in Routine Screening Analysis

    NARCIS (Netherlands)

    Woldegebriel, M.; Vivó-Truyols, G.

    2016-01-01

    A novel method for compound identification in liquid chromatography-high resolution mass spectrometry (LC-HRMS) is proposed. The method, based on Bayesian statistics, accommodates all possible uncertainties involved, from instrumentation up to data analysis into a single model yielding the

  2. The occurrence of in-mouth coalescence of emulsion droplets in relation to perception of fat

    NARCIS (Netherlands)

    Dresselhuis, D.M.; Hoog, de E.H.A.; Cohen Stuart, M.A.; Vingerhoeds, M.H.; Aken, van G.A.

    2008-01-01

    We studied the relation between sensitivity of emulsions for in-mouth coalescence and perception of fat-related attributes, such as creaminess as well as the relation with in vivo perceived and ex vivo measured friction. Emulsions with varying expected sensitivity towards in-mouth coalescence were

  3. Nickel, copper and cobalt coalescence in copper cliff converter slag

    Directory of Open Access Journals (Sweden)

    Wolf A.

    2016-01-01

    Full Text Available The aim of this investigation is to assess the effect of various additives on the coalescence of nickel, copper and cobalt from slags generated during nickel extraction. The analyzed fluxes were silica and lime, while the examined reductants were pig iron, ferrosilicon and a copper-silicon compound. Slag was settled at different holding temperatures for various times in conditions that simulated the industrial environment. The newly formed matte and slag were characterized by their chemical composition and morphology. Silica flux generated higher partition coefficients for nickel and copper than the addition of lime. Additives used as reducing agents gave higher valuable-metal recovery rates and corresponding partition coefficients than fluxes. Microstructural studies showed that slag formed after adding reductants consisted primarily of fayalite, with minute traces of magnetite as the secondary phase. Addition of 5 wt% of pig iron, ferrosilicon or copper-silicon alloys favored the formation of a metallized matte, which increased Cu, Ni and Co recoveries. Addition of copper-silicon alloys with low silicon content was efficient for copper recovery, but coalescence of the other metals was low. Slag treated with ferrosilicon facilitated the highest cobalt recovery, while copper-silicon alloys with silicon content above 10 wt% resulted in high coalescence of nickel and copper, 87% and 72% respectively.

  4. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    Science.gov (United States)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both: A broad perspective on data analysis collection and evaluation issues. A narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining a risk informed decision making environment that is being sought by NASA requirements and procedures such as 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).

  5. Rational hypocrisy: a Bayesian analysis based on informal argumentation and slippery slopes.

    Science.gov (United States)

    Rai, Tage S; Holyoak, Keith J

    2014-01-01

    Moral hypocrisy is typically viewed as an ethical accusation: Someone is applying different moral standards to essentially identical cases, dishonestly claiming that one action is acceptable while otherwise equivalent actions are not. We suggest that in some instances the apparent logical inconsistency stems from different evaluations of a weak argument, rather than dishonesty per se. Extending Corner, Hahn, and Oaksford's (2006) analysis of slippery slope arguments, we develop a Bayesian framework in which accusations of hypocrisy depend on inferences of shared category membership between proposed actions and previous standards, based on prior probabilities that inform the strength of competing hypotheses. Across three experiments, we demonstrate that inferences of hypocrisy increase as perceptions of the likelihood of shared category membership between precedent cases and current cases increase, that these inferences follow established principles of category induction, and that the presence of self-serving motives increases inferences of hypocrisy independent of changes in the actions themselves. Taken together, these results demonstrate that Bayesian analyses of weak arguments may have implications for assessing moral reasoning. © 2014 Cognitive Science Society, Inc.

  6. BATSE gamma-ray burst line search. 2: Bayesian consistency methodology

    Science.gov (United States)

    Band, D. L.; Ford, L. A.; Matteson, J. L.; Briggs, M.; Paciesas, W.; Pendleton, G.; Preece, R.; Palmer, D.; Teegarden, B.; Schaefer, B.

    1994-01-01

    We describe a Bayesian methodology to evaluate the consistency between the reported Ginga and Burst and Transient Source Experiment (BATSE) detections of absorption features in gamma-ray burst spectra. Currently no features have been detected by BATSE, but this methodology will still be applicable if and when such features are discovered. The Bayesian methodology permits the comparison of hypotheses regarding the two detectors' observations and makes explicit the subjective aspects of our analysis (e.g., the quantification of our confidence in detector performance). We also present non-Bayesian consistency statistics. Based on preliminary calculations of line detectability, we find that both the Bayesian and non-Bayesian techniques show that the BATSE and Ginga observations are consistent given our understanding of these detectors.
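The flavor of such a consistency comparison can be sketched with a toy posterior calculation: weigh the hypothesis that absorption features are real (Ginga detects with some efficiency, BATSE misses in n bright bursts) against the hypothesis of a spurious Ginga detection. Every rate below is invented for illustration and is not the paper's detector model.

```python
# Toy consistency weighing: are one Ginga detection and n BATSE
# non-detections of spectral lines compatible with a real feature?
p_feature = 0.5                     # prior that absorption features are real
eff_ginga, eff_batse = 0.7, 0.5     # assumed per-burst detection efficiencies
false_pos_ginga = 0.05              # chance Ginga's line is spurious
n_batse_bursts = 4                  # bright BATSE bursts with no line seen

# Likelihood of the observations under each hypothesis
like_real = eff_ginga * (1 - eff_batse) ** n_batse_bursts
like_spurious = false_pos_ginga

post_real = (like_real * p_feature /
             (like_real * p_feature + like_spurious * (1 - p_feature)))
print(post_real)   # ≈ 0.47: neither hypothesis strongly favored
```

A posterior near 0.5, as here, is the quantitative sense in which two instruments' observations "remain consistent" despite one detection and one non-detection.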

  7. Arrested coalescence of emulsion droplets of arbitrary size

    Science.gov (United States)

    Mbanga, Badel L.; Burke, Christopher; Blair, Donald W.; Atherton, Timothy J.

    2013-03-01

    With applications ranging from food products to cosmetics via targeted drug delivery systems, structured anisotropic colloids provide an efficient way to control the structure, properties, and functions of emulsions. When two fluid emulsion droplets are brought into contact, a reduction of the interfacial tension drives their coalescence into a larger droplet of the same total volume and reduced exposed area. This coalescence can be partially or totally hindered by the presence of nano- or micron-sized particles that coat the interface, as in Pickering emulsions. We investigate numerically the dependence of the mechanical stability of these arrested shapes on the particle size, shape anisotropy, polydispersity, interaction with the solvent, and particle-particle interactions. We discuss structural shape changes that can be induced by tuning the particle interactions after arrest occurs, and provide design parameters for the relevant experiments.

  8. Pregnancy, thrombophilia, and the risk of a first venous thrombosis : Systematic review and bayesian meta-analysis

    NARCIS (Netherlands)

    Croles, F. Nanne; Nasserinejad, Kazem; Duvekot, Johannes J.; Kruip, Marieke J. H. A.; Meijer, Karina; Leebeek, Frank W. G.

    2017-01-01

    Objective: To provide evidence to support updated guidelines for the management of pregnant women with hereditary thrombophilia in order to reduce the risk of a first venous thromboembolism (VTE) in pregnancy. Design: Systematic review and bayesian meta-analysis. Data sources: Embase, Medline, Web

  9. Bayesian ensemble refinement by replica simulations and reweighting

    Science.gov (United States)

    Hummer, Gerhard; Köfinger, Jürgen

    2015-12-01

    We describe different Bayesian ensemble refinement methods, examine their interrelation, and discuss their practical application. With ensemble refinement, the properties of dynamic and partially disordered (bio)molecular structures can be characterized by integrating a wide range of experimental data, including measurements of ensemble-averaged observables. We start from a Bayesian formulation in which the posterior is a functional that ranks different configuration space distributions. By maximizing this posterior, we derive an optimal Bayesian ensemble distribution. For discrete configurations, this optimal distribution is identical to that obtained by the maximum entropy "ensemble refinement of SAXS" (EROS) formulation. Bayesian replica ensemble refinement enhances the sampling of relevant configurations by imposing restraints on averages of observables in coupled replica molecular dynamics simulations. We show that the strength of the restraints should scale linearly with the number of replicas to ensure convergence to the optimal Bayesian result in the limit of infinitely many replicas. In the "Bayesian inference of ensembles" method, we combine the replica and EROS approaches to accelerate the convergence. An adaptive algorithm can be used to sample directly from the optimal ensemble, without replicas. We discuss the incorporation of single-molecule measurements and dynamic observables such as relaxation parameters. The theoretical analysis of different Bayesian ensemble refinement approaches provides a basis for practical applications and a starting point for further investigations.
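The maximum-entropy reweighting at the heart of such refinement can be sketched for a single ensemble-averaged observable: tilt the reference weights by exp(-λ·o_i) and solve for the multiplier λ that reproduces the experimental average. The toy observable values and target below are illustrative, not data from the paper.

```python
import numpy as np
from scipy.optimize import brentq

# Toy ensemble: one observable evaluated on five configurations,
# uniform reference (prior) weights, and a target experimental average.
obs = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
prior = np.full(obs.size, 1.0 / obs.size)
target = 2.5

def reweighted_mean(lam):
    """Ensemble average after tilting the prior by exp(-lam * obs)."""
    w = prior * np.exp(-lam * obs)
    w /= w.sum()
    return w @ obs

# The maximum-entropy (minimum relative-entropy) weights are the tilted
# prior with the Lagrange multiplier chosen to hit the target average.
lam = brentq(lambda l: reweighted_mean(l) - target, -50.0, 50.0)
weights = prior * np.exp(-lam * obs)
weights /= weights.sum()
print(weights @ obs)   # ≈ 2.5, the target
```

The replica formulation in the abstract reaches the same optimal distribution by restraining averages during coupled simulations rather than reweighting after the fact.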

  10. Bayesian data analysis of severe fatal accident risk in the oil chain.

    Science.gov (United States)

    Eckle, Petrissa; Burgherr, Peter

    2013-01-01

    We analyze the risk of severe fatal accidents causing five or more fatalities for nine different activities covering the entire oil chain. Included are exploration and extraction, transport by different modes, refining, and final end use in power plants, heating, or gas stations. The risks are quantified separately for OECD and non-OECD countries and trends are calculated. Risk is analyzed by employing a Bayesian hierarchical model yielding analytical functions for both frequency (Poisson) and severity distributions (Generalized Pareto) as well as frequency trends. This approach addresses a key problem in risk estimation, namely the scarcity of data, which results in high uncertainties, in particular for the risk of extreme events, where the risk is extrapolated beyond the historically most severe accidents. Bayesian data analysis allows the pooling of information from different data sets covering, for example, the different stages of the energy chains or different modes of transportation. In addition, it also inherently delivers a measure of uncertainty. This approach provides a framework which comprehensively covers risk throughout the oil chain, allowing the allocation of risk in sustainability assessments. It also permits the progressive addition of new data to refine the risk estimates. Frequency, severity, and trends show substantial differences between the activities, emphasizing the need for detailed risk analysis. © 2012 Paul Scherrer Institut.
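A minimal version of the frequency part of such a model is the conjugate Gamma-Poisson update for an accident rate, which delivers the uncertainty measure and supports progressive addition of new data by repeated updating. The counts and prior below are illustrative, not the study's data.

```python
import numpy as np
from scipy import stats

# Illustrative yearly counts of severe accidents for one oil-chain stage.
counts = np.array([2, 0, 1, 3, 1, 0, 2])
exposure_years = counts.size

# Weakly informative Gamma(a0, b0) prior on the Poisson rate.
a0, b0 = 0.5, 1.0

# Conjugate update: posterior is Gamma(a0 + total count, b0 + exposure).
a_post = a0 + counts.sum()
b_post = b0 + exposure_years
posterior = stats.gamma(a_post, scale=1.0 / b_post)

# The posterior delivers the uncertainty measure directly.
print(posterior.mean())          # accidents per year
print(posterior.interval(0.90))  # 90% credible interval
```

The hierarchical model in the paper goes further by sharing information across activities and adding a Generalized Pareto severity layer, but the updating logic is the same.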

  11. Decisions under uncertainty using Bayesian analysis

    Directory of Open Access Journals (Sweden)

    Stelian STANCU

    2006-01-01

    Full Text Available The present paper gives a short presentation of the Bayesian decision method, in which extra information brings great support to the decision-making process but also introduces new costs. In this situation, obtaining new information, generally experimentally based, contributes to diminishing the degree of uncertainty that influences the decision-making process. As a conclusion, in a large number of decision problems there is the possibility that decision makers will reconsider decisions already taken because of the benefits offered by obtaining extra information.
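The trade-off between the value and the cost of extra information is often summarized by the expected value of perfect information (EVPI): the gap between deciding with the state revealed and deciding on the prior alone, which caps what any experiment is worth. A small sketch with an illustrative payoff table:

```python
import numpy as np

# Two states of nature with prior probabilities, and a payoff table
# (rows: actions, columns: states). All numbers are illustrative.
prior = np.array([0.6, 0.4])
payoff = np.array([[100.0, -40.0],   # action A
                   [ 20.0,  30.0]])  # action B

# Best expected payoff acting on prior information alone.
ev_prior = (payoff @ prior).max()

# Expected payoff if the true state were revealed before acting.
ev_perfect = payoff.max(axis=0) @ prior

# EVPI bounds what any costly extra information can be worth.
evpi = ev_perfect - ev_prior
print(ev_prior, ev_perfect, evpi)   # 44.0 72.0 28.0
```

Here no experiment costing more than 28 is worth running, however informative it is.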

  12. Genes with minimal phylogenetic information are problematic for coalescent analyses when gene tree estimation is biased.

    Science.gov (United States)

    Xi, Zhenxiang; Liu, Liang; Davis, Charles C

    2015-11-01

    The development and application of coalescent methods are undergoing rapid changes. One little explored area that bears on the application of gene-tree-based coalescent methods to species tree estimation is gene informativeness. Here, we investigate the accuracy of these coalescent methods when genes have minimal phylogenetic information, including the implementation of the multilocus bootstrap approach. Using simulated DNA sequences, we demonstrate that genes with minimal phylogenetic information can produce unreliable gene trees (i.e., high error in gene tree estimation), which may in turn reduce the accuracy of species tree estimation using gene-tree-based coalescent methods. We demonstrate that this problem can be alleviated by sampling more genes, as is commonly done in large-scale phylogenomic analyses. This applies even when these genes are minimally informative. If gene tree estimation is biased, however, gene-tree-based coalescent analyses will produce inconsistent results, which cannot be remedied by increasing the number of genes. In this case, it is not the gene-tree-based coalescent methods that are flawed, but rather the input data (i.e., estimated gene trees). Along these lines, the commonly used program PhyML has a tendency to infer one particular bifurcating topology even when the underlying relationship is best represented as a polytomy. We additionally corroborate these findings by analyzing the 183-locus mammal data set assembled by McCormack et al. (2012) using ultra-conserved elements (UCEs) and flanking DNA. Lastly, we demonstrate that when employing the multilocus bootstrap approach on this 183-locus data set, there is no strong conflict between species trees estimated from concatenation and gene-tree-based coalescent analyses, as has been previously suggested by Gatesy and Springer (2014). Copyright © 2015 Elsevier Inc. All rights reserved.

  13. Asymptotic Properties of the Number of Matching Coalescent Histories for Caterpillar-Like Families of Species Trees.

    Science.gov (United States)

    Disanto, Filippo; Rosenberg, Noah A

    2016-01-01

    Coalescent histories provide lists of species tree branches on which gene tree coalescences can take place, and their enumerative properties assist in understanding the computational complexity of calculations central in the study of gene trees and species trees. Here, we solve an enumerative problem left open by Rosenberg (IEEE/ACM Transactions on Computational Biology and Bioinformatics 10: 1253-1262, 2013) concerning the number of coalescent histories for gene trees and species trees with a matching labeled topology that belongs to a generic caterpillar-like family. By bringing a generating function approach to the study of coalescent histories, we prove that for any caterpillar-like family with seed tree t, the sequence (h_n)_{n≥0} describing the number of matching coalescent histories of the n-th tree of the family grows asymptotically as a constant multiple of the Catalan numbers. Thus, h_n ∼ β_t c_n, where the asymptotic constant β_t > 0 depends on the shape of the seed tree t. The result extends a claim demonstrated only for seed trees with at most eight taxa to arbitrary seed trees, expanding the set of cases for which detailed enumerative properties of coalescent histories can be determined. We introduce a procedure that computes from t the constant β_t as well as the algebraic expression for the generating function of the sequence (h_n)_{n≥0}.
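Since the result ties the growth of h_n to the Catalan numbers c_n = C(2n, n)/(n + 1), the asymptotics are easy to probe numerically; the sketch below checks that consecutive Catalan ratios approach 4, the growth rate h_n inherits.

```python
from math import comb

def catalan(n):
    """n-th Catalan number, C(2n, n) / (n + 1)."""
    return comb(2 * n, n) // (n + 1)

print([catalan(n) for n in range(6)])   # [1, 1, 2, 5, 14, 42]

# h_n ~ beta_t * c_n implies h_n grows at the Catalan rate; successive
# Catalan ratios 2(2n + 1)/(n + 2) approach 4 from below.
ratio = catalan(50) / catalan(49)
print(ratio)   # just under 4
```

So the number of matching coalescent histories roughly quadruples with each added taxon, which is why these enumerations become expensive quickly.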

  14. Bayesian Inference for Functional Dynamics Exploring in fMRI Data

    Directory of Open Access Journals (Sweden)

    Xuan Guo

    2016-01-01

    Full Text Available This paper aims to review state-of-the-art Bayesian-inference-based methods applied to functional magnetic resonance imaging (fMRI) data. Particularly, we focus on one specific long-standing challenge in the computational modeling of fMRI datasets: how to effectively explore typical functional interactions from fMRI time series and the corresponding boundaries of temporal segments. Bayesian inference is a method of statistical inference which has been shown to be a powerful tool to encode dependence relationships among the variables with uncertainty. Here we provide an introduction to a group of Bayesian-inference-based methods for fMRI data analysis, which were designed to detect magnitude or functional connectivity change points and to infer their functional interaction patterns based on corresponding temporal boundaries. We also provide a comparison of three popular Bayesian models, that is, the Bayesian Magnitude Change Point Model (BMCPM), the Bayesian Connectivity Change Point Model (BCCPM), and the Dynamic Bayesian Variable Partition Model (DBVPM), and give a summary of their applications. We envision that more delicate Bayesian inference models will emerge and play increasingly important roles in modeling brain functions in the years to come.

  15. CRAFT (complete reduction to amplitude frequency table)--robust and time-efficient Bayesian approach for quantitative mixture analysis by NMR.

    Science.gov (United States)

    Krishnamurthy, Krish

    2013-12-01

    The intrinsic quantitative nature of NMR is increasingly exploited in areas ranging from complex mixture analysis (as in metabolomics and reaction monitoring) to quality assurance/control. Complex NMR spectra are more common than not, and therefore, extraction of quantitative information generally involves significant prior knowledge and/or operator interaction to characterize resonances of interest. Moreover, in most NMR-based metabolomic experiments, the signals from metabolites are normally present as a mixture of overlapping resonances, making quantification difficult. Time-domain Bayesian approaches have been reported to be better than conventional frequency-domain analysis at identifying subtle changes in signal amplitude. We discuss an approach that exploits Bayesian analysis to achieve a complete reduction to amplitude frequency table (CRAFT) in an automated and time-efficient fashion - thus converting the time-domain FID to a frequency-amplitude table. CRAFT uses a two-step approach to FID analysis. First, the FID is digitally filtered and downsampled to several sub FIDs, and secondly, these sub FIDs are then modeled as sums of decaying sinusoids using the Bayesian approach. CRAFT tables can be used for further data mining of quantitative information using fingerprint chemical shifts of compounds of interest and/or statistical analysis of modulation of chemical quantity in a biological study (metabolomics) or process study (reaction monitoring) or quality assurance/control. The basic principles behind this approach as well as results to evaluate the effectiveness of this approach in mixture analysis are presented. Copyright © 2013 John Wiley & Sons, Ltd.
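The second CRAFT step, modeling a (sub-)FID as a sum of decaying sinusoids, can be sketched with an ordinary least-squares fit of a single synthetic component. CRAFT itself uses a Bayesian treatment of many such terms; this only illustrates the model form, with invented parameter values.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic FID: one decaying sinusoid plus noise (illustrative values).
t = np.linspace(0.0, 1.0, 2000)
rng = np.random.default_rng(0)
fid = (1.0 * np.exp(-4.0 * t) * np.cos(2 * np.pi * 25.0 * t + 0.3)
       + 0.01 * rng.standard_normal(t.size))

def component(t, amp, freq, decay, phase):
    """A single exponentially decaying sinusoid."""
    return amp * np.exp(-decay * t) * np.cos(2 * np.pi * freq * t + phase)

popt, _ = curve_fit(component, t, fid, p0=[0.8, 24.8, 3.0, 0.0])
amp, freq, decay, phase = popt
print(freq, amp)   # recovered frequency and amplitude
```

The fitted (amplitude, frequency) pairs from all components are what populate a CRAFT-style frequency-amplitude table for downstream quantitative mining.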

  16. Impact of 50% Synthesized Iso-Paraffins (SIP) on Middle Distillate Fuel Filtration and Coalescence

    Science.gov (United States)

    2014-10-30

    NF&LCFT REPORT 441/15-003, 30 October 2014. Definitions: Coalescence - the ability to shed water; Conventional Material Source - crude oil, natural gas liquid condensates, heavy oil, shale oil, and oil sands; Effluent - stream leaving a system; Influent - stream entering a system; Turnover - time required to flow the

  17. On Bayesian Inference under Sampling from Scale Mixtures of Normals

    NARCIS (Netherlands)

    Fernández, C.; Steel, M.F.J.

    1996-01-01

    This paper considers a Bayesian analysis of the linear regression model under independent sampling from general scale mixtures of Normals.Using a common reference prior, we investigate the validity of Bayesian inference and the existence of posterior moments of the regression and precision

  18. Pregnancy, thrombophilia, and the risk of a first venous thrombosis: systematic review and bayesian meta-analysis

    NARCIS (Netherlands)

    F.N. Croles (F. Nanne); K. Nasserinejad (Kazem); J.J. Duvekot (Hans); M.J.H.A. Kruip (Marieke); K. Meijer; F.W.G. Leebeek (Frank)

    2017-01-01

    Objective: To provide evidence to support updated guidelines for the management of pregnant women with hereditary thrombophilia in order to reduce the risk of a first venous thromboembolism (VTE) in pregnancy. Design: Systematic review and bayesian meta-analysis. Data sources: Embase,

  19. Multi-Objective data analysis using Bayesian Inference for MagLIF experiments

    Science.gov (United States)

    Knapp, Patrick; Glinksy, Michael; Evans, Matthew; Gom, Matth; Han, Stephanie; Harding, Eric; Slutz, Steve; Hahn, Kelly; Harvey-Thompson, Adam; Geissel, Matthias; Ampleford, David; Jennings, Christopher; Schmit, Paul; Smith, Ian; Schwarz, Jens; Peterson, Kyle; Jones, Brent; Rochau, Gregory; Sinars, Daniel

    2017-10-01

    The MagLIF concept has recently demonstrated Gbar pressures and confinement of charged fusion products at stagnation. We present a new analysis methodology that allows for integration of multiple diagnostics including nuclear, x-ray imaging, and x-ray power to determine the temperature, pressure, liner areal density, and mix fraction. A simplified hot-spot model is used with a Bayesian inference network to determine the most probable model parameters that describe the observations while simultaneously revealing the principal uncertainties in the analysis. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525.

  20. Validation of Bayesian analysis of compartmental kinetic models in medical imaging.

    Science.gov (United States)

    Sitek, Arkadiusz; Li, Quanzheng; El Fakhri, Georges; Alpert, Nathaniel M

    2016-10-01

    Kinetic compartmental analysis is frequently used to compute physiologically relevant quantitative values from time series of images. In this paper, a new approach based on Bayesian analysis to obtain information about these parameters is presented and validated. The closed form of the posterior distribution of kinetic parameters is derived with a hierarchical prior to model the standard deviation of normally distributed noise. Markov chain Monte Carlo methods are used for numerical estimation of the posterior distribution. Computer simulations of the kinetics of F18-fluorodeoxyglucose (FDG) are used to demonstrate drawing statistical inferences about kinetic parameters and to validate the theory and implementation. Additionally, point estimates of kinetic parameters and covariance of those estimates are determined using the classical non-linear least squares approach. Posteriors obtained using the methods proposed in this work are accurate, as no significant deviation from the expected shape of the posterior was found (one-sided P>0.08). It is demonstrated that the results obtained by the standard non-linear least-squares methods fail to provide accurate estimation of uncertainty for the same data set (P<0.0001). The results of this work validate the new methods using computer simulations of FDG kinetics, and show that in situations where the classical approach fails to estimate uncertainty accurately, Bayesian estimation provides accurate information about the uncertainties in the parameters. Although a particular example of FDG kinetics was used in the paper, the methods can be extended to different pharmaceuticals and imaging modalities. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
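The approach can be sketched on a toy kinetic model: a mono-exponential washout sampled with noise, with a random-walk Metropolis sampler over its two parameters. The data, flat prior, and step sizes below are illustrative stand-ins, not the paper's FDG compartment model or hierarchical noise prior.

```python
import numpy as np

# Toy kinetic data: mono-exponential washout C(t) = A * exp(-k * t).
rng = np.random.default_rng(1)
t = np.linspace(0.1, 10.0, 40)
A_true, k_true, sigma = 5.0, 0.4, 0.1
y = A_true * np.exp(-k_true * t) + sigma * rng.standard_normal(t.size)

def log_post(theta):
    """Log posterior: Gaussian likelihood, flat prior on A, k > 0."""
    A, k = theta
    if A <= 0.0 or k <= 0.0:
        return -np.inf
    resid = y - A * np.exp(-k * t)
    return -0.5 * np.sum(resid**2) / sigma**2

# Random-walk Metropolis over (A, k).
theta = np.array([4.0, 0.3])
lp = log_post(theta)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(scale=[0.05, 0.01])
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta.copy())
samples = np.array(samples[5000:])   # drop burn-in

print(samples.mean(axis=0))   # posterior means near (5.0, 0.4)
print(samples.std(axis=0))    # posterior uncertainties, directly
```

The posterior spread here is the uncertainty estimate that, per the abstract, the non-linear least-squares covariance can fail to capture.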

  1. Bayesian modeling using WinBUGS

    CERN Document Server

    Ntzoufras, Ioannis

    2009-01-01

    A hands-on introduction to the principles of Bayesian modeling using WinBUGS Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author provides an accessible treatment of the topic, offering readers a smooth introduction to the principles of Bayesian modeling with detailed guidance on the practical implementation of key principles. The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including: Markov Chain Monte Carlo algorithms in Bayesian inference Generalized linear models Bayesian hierarchical models Predictive distribution and model checking Bayesian model and variable evaluation Computational notes and screen captures illustrate the use of both WinBUGS as well as R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts and all ...

  2. Objective Bayesian Analysis of Skew- t Distributions

    KAUST Repository

    BRANCO, MARCIA D'ELIA

    2012-02-27

    We study the Jeffreys prior and its properties for the shape parameter of univariate skew-t distributions with linear and nonlinear Student's t skewing functions. In both cases, we show that the resulting priors for the shape parameter are symmetric around zero and proper. Moreover, we propose a Student's t approximation of the Jeffreys prior that makes an objective Bayesian analysis easy to perform. We carry out a Monte Carlo simulation study that demonstrates an overall better behaviour of the maximum a posteriori estimator compared with the maximum likelihood estimator. We also compare the frequentist coverage of the credible intervals based on the Jeffreys prior and its approximation and show that they are similar. We further discuss location-scale models under scale mixtures of skew-normal distributions and show some conditions for the existence of the posterior distribution and its moments. Finally, we present three numerical examples to illustrate the implications of our results on inference for skew-t distributions. © 2012 Board of the Foundation of the Scandinavian Journal of Statistics.

  3. Bayesian Analysis of the Cosmic Microwave Background

    Science.gov (United States)

    Jewell, Jeffrey

    2007-01-01

    There is a wealth of cosmological information encoded in the spatial power spectrum of temperature anisotropies of the cosmic microwave background. Experiments designed to map the microwave sky are returning a flood of data (time streams of instrument response as a beam is swept over the sky) at several different frequencies (from 30 to 900 GHz), all with different resolutions and noise properties. The resulting analysis challenge is to estimate, and quantify our uncertainty in, the spatial power spectrum of the cosmic microwave background given the complexities of "missing data", foreground emission, and complicated instrumental noise. A Bayesian formulation of this problem allows consistent treatment of many complexities, including complicated instrumental noise and foregrounds, and can be numerically implemented with Gibbs sampling. Gibbs sampling has now been validated as an efficient, statistically exact, and practically useful method for low-resolution analyses (as demonstrated on WMAP 1- and 3-year temperature and polarization data). Development continues for Planck, where the goal is to exploit the unique capabilities of Gibbs sampling to directly propagate uncertainties in both foreground and instrument models to total uncertainty in cosmological parameters.
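    The Gibbs scheme described above can be sketched in miniature: alternate between drawing the signal given the current power and the power given the current signal. The one-band toy model below uses invented values and a simple white-noise model, not anything from the WMAP or Planck pipelines:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy version of the CMB Gibbs scheme: data d = s + n, with unknown signal
    # variance C and known white-noise variance N. Alternate between sampling
    # s | C, d (a Wiener-filter posterior) and C | s (an inverse-gamma draw).
    n_pix, N = 100, 1.0
    true_C = 4.0
    d = rng.normal(0.0, np.sqrt(true_C + N), n_pix)  # simulated sky data

    C = 1.0                                  # initial guess for the power
    samples = []
    for it in range(2000):
        # 1) signal given power: Gaussian posterior, pixel by pixel
        var = 1.0 / (1.0 / C + 1.0 / N)
        s = rng.normal(var * d / N, np.sqrt(var))
        # 2) power given signal: inverse-gamma draw (Jeffreys prior 1/C)
        C = np.sum(s ** 2) / rng.chisquare(n_pix)
        samples.append(C)

    print(np.mean(samples[500:]))            # roughly true_C
    ```

    Each sweep draws from an exact conditional distribution, which is why the method can be statistically exact while remaining practical at low resolution.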

  4. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

    Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis, Second Edition, provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. This new edition contains six new...

  5. GO-Bayes: Gene Ontology-based overrepresentation analysis using a Bayesian approach.

    Science.gov (United States)

    Zhang, Song; Cao, Jing; Kong, Y Megan; Scheuermann, Richard H

    2010-04-01

    A typical approach for the interpretation of high-throughput experiments, such as gene expression microarrays, is to produce groups of genes based on certain criteria (e.g. genes that are differentially expressed). To gain more mechanistic insights into the underlying biology, overrepresentation analysis (ORA) is often conducted to investigate whether gene sets associated with particular biological functions, for example, as represented by Gene Ontology (GO) annotations, are statistically overrepresented in the identified gene groups. However, the standard ORA, which is based on the hypergeometric test, analyzes each GO term in isolation and does not take into account the dependence structure of the GO-term hierarchy. We have developed a Bayesian approach (GO-Bayes) to measure overrepresentation of GO terms that incorporates the GO dependence structure by taking into account evidence not only from individual GO terms, but also from their related terms (i.e. parents, children, siblings, etc.). The Bayesian framework borrows information across related GO terms to strengthen the detection of overrepresentation signals. As a result, this method tends to identify sets of closely related GO terms rather than individual isolated GO terms. The advantage of the GO-Bayes approach is demonstrated with a simulation study and an application example.
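    For contrast, the standard hypergeometric ORA that analyzes each GO term in isolation fits in a few lines; the gene counts below are invented for illustration:

    ```python
    from math import comb

    def ora_pvalue(k, n, K, N):
        """P(X >= k) for X ~ Hypergeometric(N, K, n): the standard ORA test
        for one GO term in isolation (no borrowing across related terms).

        N: total annotated genes, K: genes annotated with the term,
        n: genes in the study list, k: study-list genes with the term.
        """
        return sum(comb(K, i) * comb(N - K, n - i)
                   for i in range(k, min(n, K) + 1)) / comb(N, n)

    # e.g. 8 of 50 selected genes carry a term annotating 100 of 10000 genes
    p = ora_pvalue(8, 50, 100, 10000)
    print(p)  # small p-value: the term looks overrepresented
    ```

    GO-Bayes replaces this isolated test with a posterior that borrows evidence from parent, child, and sibling terms, which is what lets it recover sets of closely related terms.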

  6. Bayesian Group Bridge for Bi-level Variable Selection.

    Science.gov (United States)

    Mallick, Himel; Yi, Nengjun

    2017-06-01

    A Bayesian bi-level variable selection method (BAGB: Bayesian Analysis of Group Bridge) is developed for regularized regression and classification. This new development is motivated by grouped data, where generic variables can be divided into multiple groups, with variables in the same group being mechanistically related or statistically correlated. As an alternative to frequentist group variable selection methods, BAGB incorporates structural information among predictors through a group-wise shrinkage prior. Posterior computation proceeds via an efficient MCMC algorithm. In addition to the usual ease-of-interpretation of hierarchical linear models, the Bayesian formulation produces valid standard errors, a feature that is notably absent in the frequentist framework. Empirical evidence of the attractiveness of the method is illustrated by extensive Monte Carlo simulations and real data analysis. Finally, several extensions of this new approach are presented, providing a unified framework for bi-level variable selection in general models with flexible penalties.

  7. conting : an R package for Bayesian analysis of complete and incomplete contingency tables

    OpenAIRE

    Overstall, Antony; King, Ruth

    2014-01-01

    The aim of this paper is to demonstrate the R package conting for the Bayesian analysis of complete and incomplete contingency tables using hierarchical log-linear models. This package allows a user to identify interactions between categorical factors (via complete contingency tables) and to estimate closed population sizes using capture-recapture studies (via incomplete contingency tables). The models are fitted using Markov chain Monte Carlo methods. In particular, implementations of the Me...

  8. Thermodynamically consistent Bayesian analysis of closed biochemical reaction systems

    Directory of Open Access Journals (Sweden)

    Goutsias John

    2010-11-01

    Full Text Available Abstract Background Estimating the rate constants of a biochemical reaction system with known stoichiometry from noisy time series measurements of molecular concentrations is an important step for building predictive models of cellular function. Inference techniques currently available in the literature may produce rate constant values that defy necessary constraints imposed by the fundamental laws of thermodynamics. As a result, these techniques may lead to biochemical reaction systems whose concentration dynamics could not possibly occur in nature. Therefore, development of a thermodynamically consistent approach for estimating the rate constants of a biochemical reaction system is highly desirable. Results We introduce a Bayesian analysis approach for computing thermodynamically consistent estimates of the rate constants of a closed biochemical reaction system with known stoichiometry given experimental data. Our method employs an appropriately designed prior probability density function that effectively integrates fundamental biophysical and thermodynamic knowledge into the inference problem. Moreover, it takes into account experimental strategies for collecting informative observations of molecular concentrations through perturbations. The proposed method employs a maximization-expectation-maximization algorithm that provides thermodynamically feasible estimates of the rate constant values and computes appropriate measures of estimation accuracy. We demonstrate various aspects of the proposed method on synthetic data obtained by simulating a subset of a well-known model of the EGF/ERK signaling pathway, and examine its robustness under conditions that violate key assumptions. Software, coded in MATLAB®, which implements all Bayesian analysis techniques discussed in this paper, is available free of charge at http://www.cis.jhu.edu/~goutsias/CSS%20lab/software.html. 
Conclusions Our approach provides an attractive statistical methodology for

  9. Development and comparison of Bayesian modularization method in uncertainty assessment of hydrological models

    Science.gov (United States)

    Li, L.; Xu, C.-Y.; Engeland, K.

    2012-04-01

    With respect to model calibration, parameter estimation and analysis of uncertainty sources, different approaches have been used in hydrological models. The Bayesian method is one of the most widely used for uncertainty assessment of hydrological models; it incorporates different sources of information into a single analysis through Bayes' theorem. However, existing applications do not treat well the uncertainty in the extreme flows of hydrological model simulations. This study proposes a Bayesian modularization approach for uncertainty assessment of conceptual hydrological models that considers the extreme flows. It includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization approach and traditional Bayesian models using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions are used with the traditional Bayesian models: the AR(1) plus Normal, time-period-independent model (Model 1); the AR(1) plus Normal, time-period-dependent model (Model 2); and the AR(1) plus multi-normal model (Model 3). The results reveal that (1) the simulations derived from the Bayesian modularization method are more accurate, with the highest Nash-Sutcliffe efficiency value, and (2) the Bayesian modularization method performs best in uncertainty estimates of the entire flow range and in terms of application and computational efficiency. The study thus introduces a new approach for reducing the effect of extreme flows on the discharge uncertainty assessment of hydrological models within a Bayesian framework. Keywords: extreme flow, uncertainty assessment, Bayesian modularization, hydrological model, WASMOD
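    The Metropolis-Hastings algorithm used here is generic; a minimal random-walk sketch on an invented one-parameter log-posterior (a stand-in for a real calibration target such as a WASMOD parameter posterior):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Random-walk Metropolis-Hastings. The quadratic log-posterior below is
    # an invented stand-in for log p(theta | data) of a hydrological model.
    def log_post(theta):
        return -0.5 * ((theta - 2.0) / 0.5) ** 2

    theta, chain = 0.0, []
    for it in range(5000):
        prop = theta + rng.normal(0.0, 0.3)      # symmetric proposal
        # accept with probability min(1, posterior ratio)
        if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
            theta = prop
        chain.append(theta)

    print(np.mean(chain[1000:]))  # near 2.0, the mean of the toy target
    ```

    In practice the likelihood term inside `log_post` would be one of the three AR(1)-based error models the study compares.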

  10. Bayesian signal processing classical, modern, and particle filtering methods

    CERN Document Server

    Candy, James V

    2016-01-01

    This book aims to give readers a unified Bayesian treatment starting from the basics (Bayes' rule) to the more advanced (Monte Carlo sampling), evolving to the next-generation model-based techniques (sequential Monte Carlo sampling). This next edition incorporates a new chapter on "Sequential Bayesian Detection," a new section on "Ensemble Kalman Filters" as well as an expansion of Case Studies that detail Bayesian solutions for a variety of applications. These studies illustrate Bayesian approaches to real-world problems incorporating detailed particle filter designs, adaptive particle filters and sequential Bayesian detectors. In addition to these major developments, a variety of sections are expanded to "fill in the gaps" of the first edition. Here metrics for particle filter (PF) designs with emphasis on classical "sanity testing" lead to ensemble techniques as a basic requirement for performance analysis. The expansion of information theory metrics and their application to PF designs is fully developed an...

  11. Systematic search of Bayesian statistics in the field of psychotraumatology

    NARCIS (Netherlands)

    van de Schoot, Rens; Schalken, Naomi; Olff, Miranda

    2017-01-01

    In many different disciplines there has been a recent increase in interest in Bayesian analysis. Bayesian methods implement Bayes' theorem, which states that prior beliefs are updated with data, and this process produces updated beliefs about model parameters. The prior is based on how much information we

  12. Study of shock coalescence in laser-irradiated targets

    International Nuclear Information System (INIS)

    Coe, S.E.; Willi, O.; Afshar-Rad, T.; Rose, S.J.

    1988-01-01

    We report on the first direct experimental observation of the coalescence of two shocks induced by a shaped laser pulse. Optical streak photography of the rear surface of aluminum multiple step targets was used to study the breakout of these shocks and observe their behavior. The experimental results are compared with simulations by a one-dimensional Lagrangian hydrodynamic code

  13. An analysis of the Bayesian track labelling problem

    NARCIS (Netherlands)

    Aoki, E.H.; Boers, Y.; Svensson, Lennart; Mandal, Pranab K.; Bagchi, Arunabha

    In multi-target tracking (MTT), the problem of assigning labels to tracks (track labelling) is covered extensively in the literature, but its exact mathematical formulation, in terms of Bayesian statistics, has not yet been examined in detail. Doing so, however, may help us to understand how Bayes-optimal

  14. Classifying emotion in Twitter using Bayesian network

    Science.gov (United States)

    Surya Asriadie, Muhammad; Syahrul Mubarok, Mohamad; Adiwijaya

    2018-03-01

    Language is used to express not only facts but also emotions. Emotions are evident in behavior, including the social media statuses a person writes. Analysis of emotions in text is carried out on a variety of media, such as Twitter. This paper studies the classification of emotions on Twitter using Bayesian networks because of their ability to model uncertainty and the relationships between features. The result is two Bayesian-network-based models: the Full Bayesian Network (FBN) and the Bayesian Network with Mood Indicator (BNM). FBN is a massive Bayesian network in which each word is treated as a node. The study shows that the method used to train FBN is not very effective at producing the best model and performs worse than Naive Bayes: the F1-score for FBN is 53.71%, versus 54.07% for Naive Bayes. BNM is proposed as an alternative method based on an improvement of Multinomial Naive Bayes, with much lower computational complexity than FBN. Although it is not better than FBN, the resulting model successfully improves on the performance of Multinomial Naive Bayes: the F1-score for the Multinomial Naive Bayes model is 51.49%, versus 52.14% for BNM.
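    The Multinomial Naive Bayes baseline that BNM builds on can be sketched in a few lines; the tiny training corpus and emotion labels below are invented for illustration:

    ```python
    from collections import Counter, defaultdict
    from math import log

    # Minimal multinomial Naive Bayes for emotion labels. Real systems would
    # use a large labelled Twitter corpus; this toy corpus is invented.
    train = [("i am so happy today", "joy"),
             ("this is wonderful and happy news", "joy"),
             ("i am angry and furious", "anger"),
             ("this makes me so angry", "anger")]

    word_counts = defaultdict(Counter)
    label_counts = Counter()
    vocab = set()
    for text, label in train:
        label_counts[label] += 1
        for w in text.split():
            word_counts[label][w] += 1
            vocab.add(w)

    def predict(text):
        def score(label):
            total = sum(word_counts[label].values())
            s = log(label_counts[label] / len(train))   # log prior
            for w in text.split():
                # Laplace smoothing over the shared vocabulary
                s += log((word_counts[label][w] + 1) / (total + len(vocab)))
            return s
        return max(label_counts, key=score)

    print(predict("happy happy news"))   # -> joy
    ```

    BNM, as described above, augments this kind of model with a mood-indicator node rather than treating every word as an independent feature.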

  15. A Bayesian analysis of the nucleon QCD sum rules

    International Nuclear Information System (INIS)

    Ohtani, Keisuke; Gubler, Philipp; Oka, Makoto

    2011-01-01

    QCD sum rules of the nucleon channel are reanalyzed, using the maximum-entropy method (MEM). This new approach, based on the Bayesian probability theory, does not restrict the spectral function to the usual "pole + continuum" form, allowing a more flexible investigation of the nucleon spectral function. Making use of this flexibility, we are able to investigate the spectral functions of various interpolating fields, finding that the nucleon ground state mainly couples to an operator containing a scalar diquark. Moreover, we formulate the Gaussian sum rule for the nucleon channel and find that it is more suitable for the MEM analysis to extract the nucleon pole in the region of its experimental value, while the Borel sum rule does not contain enough information to clearly separate the nucleon pole from the continuum. (orig.)

  16. Bayesian optimization for materials science

    CERN Document Server

    Packwood, Daniel

    2017-01-01

    This book provides a short and concise introduction to Bayesian optimization specifically for experimental and computational materials scientists. After explaining the basic idea behind Bayesian optimization and some applications to materials science in Chapter 1, the mathematical theory of Bayesian optimization is outlined in Chapter 2. Finally, Chapter 3 discusses an application of Bayesian optimization to a complicated structure optimization problem in computational surface science. Bayesian optimization is a promising global optimization technique that originates in the field of machine learning and is starting to gain attention in materials science. For the purpose of materials design, Bayesian optimization can be used to predict new materials with novel properties without extensive screening of candidate materials. For the purpose of computational materials science, Bayesian optimization can be incorporated into first-principles calculations to perform efficient, global structure optimizations. While re...

  17. Uncertainty analysis of depth predictions from seismic reflection data using Bayesian statistics

    Science.gov (United States)

    Michelioudakis, Dimitrios G.; Hobbs, Richard W.; Caiado, Camila C. S.

    2018-03-01

    Estimating the depths of target horizons from seismic reflection data is an important task in exploration geophysics. To constrain these depths we need a reliable and accurate velocity model. Here, we build an optimum 2D seismic reflection data processing flow focused on pre-stack deghosting filters and velocity model building, and apply Bayesian methods, including Gaussian process emulation and Bayesian History Matching (BHM), to estimate the uncertainties of the depths of key horizons near the borehole DSDP-258, located in the Mentelle Basin, south west of Australia, and compare the results with the drilled core from that well. Following this strategy, the tie between the modelled and observed depths from the DSDP-258 core was in accordance with the ± 2σ posterior credibility intervals, and predictions of the depths to key horizons were made for two new drill sites adjacent to the existing borehole in the area. The probabilistic analysis allowed us to generate multiple realizations of pre-stack depth-migrated images, which can be used directly to better constrain interpretation and identify potential risk at drill sites. The method will be applied to constrain the drilling targets for the upcoming International Ocean Discovery Program (IODP) Leg 369.

  18. Void migration, coalescence and swelling in fusion materials

    International Nuclear Information System (INIS)

    Cottrell, G.A.

    2003-01-01

    A recent analysis of the migration of voids and bubbles, produced in neutron-irradiated fusion materials, is outlined. The migration, brought about by thermal hopping of atoms on the surface of a void, is normally a random Brownian motion but, in a temperature gradient, can be slightly biased up the gradient. Two effects of such migrations are the transport of voids and trapped transmutation helium atoms to grain boundaries, where embrittlement may result; and the coalescence of migrating voids, which reduces the number of non-dislocation sites available for the capture of knock-on point defects and thereby enables the dislocation bias process to maintain void swelling. A selection of candidate fusion power plant armour and structural metals has been analysed. The metals most resistant to void migration and its effects are tungsten and molybdenum; steel and beryllium are least so, and vanadium is intermediate.

  19. Understanding Computational Bayesian Statistics

    CERN Document Server

    Bolstad, William M

    2011-01-01

    A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic

  20. Bayesian statistics an introduction

    CERN Document Server

    Lee, Peter M

    2012-01-01

    Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel

  1. Bayesian inference for psychology. Part II: Example applications with JASP.

    Science.gov (United States)

    Wagenmakers, Eric-Jan; Love, Jonathon; Marsman, Maarten; Jamil, Tahira; Ly, Alexander; Verhagen, Josine; Selker, Ravi; Gronau, Quentin F; Dropmann, Damian; Boutin, Bruno; Meerhoff, Frans; Knight, Patrick; Raj, Akash; van Kesteren, Erik-Jan; van Doorn, Johnny; Šmíra, Martin; Epskamp, Sacha; Etz, Alexander; Matzke, Dora; de Jong, Tim; van den Bergh, Don; Sarafoglou, Alexandra; Steingroever, Helen; Derks, Koen; Rouder, Jeffrey N; Morey, Richard D

    2018-02-01

    Bayesian hypothesis testing presents an attractive alternative to p value hypothesis testing. Part I of this series outlined several advantages of Bayesian hypothesis testing, including the ability to quantify evidence and the ability to monitor and update this evidence as data come in, without the need to know the intention with which the data were collected. Despite these and other practical advantages, Bayesian hypothesis tests are still reported relatively rarely. An important impediment to the widespread adoption of Bayesian tests is arguably the lack of user-friendly software for the run-of-the-mill statistical problems that confront psychologists for the analysis of almost every experiment: the t-test, ANOVA, correlation, regression, and contingency tables. In Part II of this series we introduce JASP ( http://www.jasp-stats.org ), an open-source, cross-platform, user-friendly graphical software package that allows users to carry out Bayesian hypothesis tests for standard statistical problems. JASP is based in part on the Bayesian analyses implemented in Morey and Rouder's BayesFactor package for R. Armed with JASP, the practical advantages of Bayesian hypothesis testing are only a mouse click away.

  2. Bayesian uncertainty analysis for complex systems biology models: emulation, global parameter searches and evaluation of gene functions.

    Science.gov (United States)

    Vernon, Ian; Liu, Junli; Goldstein, Michael; Rowe, James; Topping, Jen; Lindsey, Keith

    2018-01-02

    Many mathematical models have now been employed across every area of systems biology. These models increasingly involve large numbers of unknown parameters, have complex structure which can result in substantial evaluation time relative to the needs of the analysis, and need to be compared to observed data of various forms. The correct analysis of such models usually requires a global parameter search, over a high dimensional parameter space, that incorporates and respects the most important sources of uncertainty. This can be an extremely difficult task, but it is essential for any meaningful inference or prediction to be made about any biological system. It hence represents a fundamental challenge for the whole of systems biology. Bayesian statistical methodology for the uncertainty analysis of complex models is introduced, designed to address this high dimensional global parameter search problem. Bayesian emulators that mimic the systems biology model but are extremely fast to evaluate are embedded within an iterative history match: an efficient method to search high dimensional spaces within a more formal statistical setting, while incorporating major sources of uncertainty. The approach is demonstrated via application to a model of hormonal crosstalk in Arabidopsis root development, which has 32 rate parameters, for which we identify the sets of rate parameter values that lead to acceptable matches between model output and observed trend data. The multiple insights into the model's structure that this analysis provides are discussed. The methodology is applied to a second related model, and the biological consequences of the resulting comparison, including the evaluation of gene functions, are described. Bayesian uncertainty analysis for complex models using both emulators and history matching is shown to be a powerful technique that can greatly aid the study of a large class of systems biology models.
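    The core of history matching is an implausibility cut: candidate parameter settings whose emulated output lies too many standard deviations from the observation are ruled out, wave by wave. A toy one-parameter sketch, where an invented cheap model stands in for both the simulator and its emulator:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # One wave of history matching on an invented model. The emulator
    # variance and observation error are illustrative placeholders.
    def f(theta):
        return theta ** 2 + theta          # stand-in for a slow simulator

    z, obs_var = 6.0, 0.1                  # observed output and its variance
    emu_var = 0.05                         # emulator (code) uncertainty
    thetas = rng.uniform(-5, 5, 10000)     # global search over the space

    # implausibility: standardized distance between emulated output and data
    impl = np.abs(f(thetas) - z) / np.sqrt(obs_var + emu_var)
    non_implausible = thetas[impl < 3.0]   # the usual 3-sigma cutoff
    print(non_implausible.min(), non_implausible.max())
    ```

    Note that the surviving set is disconnected (regions near the two roots of f(theta) = z), which is exactly the kind of structure a single point estimate would miss; later waves would refit the emulator on the surviving region only.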
It both provides insight into model behaviour

  3. Spatial and spatio-temporal bayesian models with R - INLA

    CERN Document Server

    Blangiardo, Marta

    2015-01-01

    Dedication; Preface; 1 Introduction: 1.1 Why spatial and spatio-temporal statistics?; 1.2 Why do we use Bayesian methods for modelling spatial and spatio-temporal structures?; 1.3 Why INLA?; 1.4 Datasets; 2 Introduction to R: 2.1 The R language; 2.2 R objects; 2.3 Data and session management; 2.4 R packages; 2.5 Programming in R; 2.6 Basic statistical analysis with R; 3 Introduction to Bayesian Methods: 3.1 Bayesian Philosophy; 3.2 Basic Probability Elements; 3.3 Bayes Theorem; 3.4 Prior and Posterior Distributions; 3.5 Working with the Posterior Distribution; 3.6 Choosing the Prior Distr

  4. Risk analysis of emergent water pollution accidents based on a Bayesian Network.

    Science.gov (United States)

    Tang, Caihong; Yi, Yujun; Yang, Zhifeng; Sun, Jie

    2016-01-01

    To guarantee the security of water quality in water transfer channels, especially in open channels, analysis of potential emergent pollution sources in the water transfer process is critical. It is also indispensable for forewarning and protecting against emergent pollution accidents. Bridges above open channels with large amounts of truck traffic are the main locations where emergent accidents could occur. A Bayesian Network model, which consists of six root nodes and three middle-layer nodes, was developed in this paper and employed to identify the possibility of potential pollution risk. Dianbei Bridge is reviewed as a typical bridge on an open channel of the Middle Route of the South-to-North Water Transfer Project where emergent traffic accidents could occur. This study focuses on the risk of water pollution caused by the leakage of pollutants into the water: the risk of potential traffic accidents at the Dianbei Bridge implies a risk of water pollution in the canal. Based on survey data, statistical analysis, and domain specialist knowledge, a Bayesian Network model was established. The human factor in emergent accidents has been considered in this model. Additionally, the model has been employed to describe the probability of accidents and the risk level, and the sensitive causes of pollution accidents have been deduced. The scenario in which the sensitive factors are in the states most likely to lead to accidents has also been simulated. Copyright © 2015 Elsevier Ltd. All rights reserved.
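    Inference in such a network reduces to summing over the states of the parent nodes. A toy sketch in the spirit of the bridge-accident model, with invented probabilities (not the paper's survey-derived values or its actual node structure):

    ```python
    # Toy Bayesian network: Accident depends on Weather and Truck load;
    # Pollution depends on Accident. All numbers are invented.
    P_weather_bad = 0.2
    P_heavy_truck = 0.3
    P_accident = {  # P(accident | weather_bad, heavy_truck)
        (True, True): 0.05, (True, False): 0.02,
        (False, True): 0.01, (False, False): 0.002}
    P_pollution_given_accident = 0.4

    def prob_pollution():
        """Marginal P(pollution) by enumerating the parent states."""
        total = 0.0
        for wb in (True, False):
            for ht in (True, False):
                pw = P_weather_bad if wb else 1 - P_weather_bad
                pt = P_heavy_truck if ht else 1 - P_heavy_truck
                total += pw * pt * P_accident[(wb, ht)] * P_pollution_given_accident
        return total

    print(prob_pollution())  # -> 0.003728
    ```

    A real six-root-node model would use the same enumeration (or a BN library), with conditional probability tables elicited from survey data and expert knowledge as described above.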

  5. Bayesian Inference Methods for Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand

    2013-01-01

    This thesis deals with sparse Bayesian learning (SBL) with application to radio channel estimation. As opposed to the classical approach for sparse signal representation, we focus on the problem of inferring complex signals. Our investigations within SBL constitute the basis for the development...... of Bayesian inference algorithms for sparse channel estimation. Sparse inference methods aim at finding the sparse representation of a signal given in some overcomplete dictionary of basis vectors. Within this context, one of our main contributions to the field of SBL is a hierarchical representation...... analysis of the complex prior representation, where we show that the ability to induce sparse estimates of a given prior heavily depends on the inference method used and, interestingly, whether real or complex variables are inferred. We also show that the Bayesian estimators derived from the proposed...

  6. A heat transfer model for evaporating micro-channel coalescing bubble flow

    International Nuclear Information System (INIS)

    Consolini, L.; Thome, J.R.

    2009-01-01

    The current study presents a one-dimensional model of confined coalescing bubble flow for the prediction of micro-channel convective boiling heat transfer. Coalescing bubble flow has recently been identified as one of the characteristic flow patterns to be found in micro-scale systems, occurring at intermediate vapor qualities between the isolated bubble and the fully annular regimes. As two or more bubbles bond under the action of inertia and surface tension, the passage frequency of the bubble-liquid slug pair declines, with a redistribution of liquid among the remaining flow structures. Assuming heat transfer to occur only by conduction through the thin evaporating liquid film surrounding individual bubbles, the present model includes a simplified description of the dynamics of the thin film evaporation process that takes into account the added mass transfer by breakup of the bridging liquid slugs. The new model has been compared against experimental data taken within the coalescing bubble flow mode, as identified by a diabatic micro-scale flow pattern map. The comparisons for three different fluids (R-134a, R-236fa and R-245fa) gave encouraging results, with 83% of the database predicted within a ± 30% error band. (author)

  7. Bayesian networks with examples in R

    CERN Document Server

    Scutari, Marco

    2014-01-01

    Introduction. The Discrete Case: Multinomial Bayesian Networks. The Continuous Case: Gaussian Bayesian Networks. More Complex Cases. Theory and Algorithms for Bayesian Networks. Real-World Applications of Bayesian Networks. Appendices. Bibliography.

  8. How few countries will do? Comparative survey analysis from a Bayesian perspective

    Directory of Open Access Journals (Sweden)

    Joop J.C.M. Hox

    2012-07-01

    Full Text Available Meuleman and Billiet (2009) carried out a simulation study aimed at the question of how many countries are needed for accurate multilevel SEM estimation in comparative studies. The authors concluded that a sample of 50 to 100 countries is needed for accurate estimation. Recently, Bayesian estimation methods have been introduced in structural equation modeling, which should work well with much lower sample sizes. The current study reanalyzes the simulation of Meuleman and Billiet using Bayesian estimation to find the lowest number of countries needed when conducting multilevel SEM. The main result of our simulations is that a sample of about 20 countries is sufficient for accurate Bayesian estimation, which makes multilevel SEM practicable for the number of countries commonly available in large scale comparative surveys.
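    The stabilizing effect of a prior at small sample sizes is visible even in the simplest conjugate case. The sketch below uses invented numbers (not the simulation's settings) to show how a normal prior shrinks a noisy small-sample mean:

    ```python
    import numpy as np

    # Conjugate normal-normal update: with only a few groups (countries),
    # an informative prior keeps the estimate stable where ML is noisy.
    prior_mean, prior_var = 0.0, 1.0
    data = np.array([0.8, 1.1, 0.4, 1.3, 0.9])   # e.g. 5 country-level effects
    obs_var = 0.25                                # known sampling variance

    n = len(data)
    post_var = 1.0 / (1.0 / prior_var + n / obs_var)
    post_mean = post_var * (prior_mean / prior_var + data.sum() / obs_var)
    print(post_mean, post_var)  # shrunk toward prior_mean, reduced variance
    ```

    Bayesian multilevel SEM applies the same shrinkage logic at the country level, which is why it remains usable with roughly 20 groups where ML-based estimation needs 50 to 100.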

  9. A particle system with cooperative branching and coalescence

    Czech Academy of Sciences Publication Activity Database

    Sturm, A.; Swart, Jan M.

    2015-01-01

    Roč. 25, č. 3 (2015), s. 1616-1649 ISSN 1050-5164 R&D Projects: GA ČR GAP201/10/0752 Institutional support: RVO:67985556 Keywords : interacting particle system * cooperative branching * coalescence * phase transition * upper invariant law * survival * extinction Subject RIV: BA - General Mathematics Impact factor: 1.755, year: 2015 http://library.utia.cas.cz/separaty/2015/SI/swart-0442871.pdf

  10. Project Portfolio Risk Identification and Analysis, Considering Project Risk Interactions and Using Bayesian Networks

    Directory of Open Access Journals (Sweden)

    Foroogh Ghasemi

    2018-05-01

    Full Text Available An organization’s strategic objectives are accomplished through portfolios. However, the materialization of portfolio risks may affect a portfolio’s sustainable success and the achievement of those objectives. Moreover, project interdependencies and cause–effect relationships between risks create complexity for portfolio risk analysis. This paper presents a model using Bayesian network (BN) methodology for modeling and analyzing portfolio risks. To develop this model, first, portfolio-level risks and risks caused by project interdependencies are identified. Then, based on their cause–effect relationships, all portfolio risks are organized in a BN. Conditional probability distributions for this network are specified, and the Bayesian network is used to estimate the probability of portfolio risk. This model was applied to a portfolio of a construction company located in Iran and proved effective in analyzing portfolio risk probability. Furthermore, the model provided valuable information for selecting a portfolio’s projects and making strategic decisions.

  11. Comparing energy sources for surgical ablation of atrial fibrillation: a Bayesian network meta-analysis of randomized, controlled trials.

    Science.gov (United States)

    Phan, Kevin; Xie, Ashleigh; Kumar, Narendra; Wong, Sophia; Medi, Caroline; La Meir, Mark; Yan, Tristan D

    2015-08-01

Simplified maze procedures involving radiofrequency, cryoenergy and microwave energy sources have been increasingly utilized for surgical treatment of atrial fibrillation as an alternative to the traditional cut-and-sew approach. In the absence of direct comparisons, a Bayesian network meta-analysis is an alternative means of assessing the relative effect of different treatments using indirect evidence. A Bayesian meta-analysis of indirect evidence was performed using 16 published randomized trials identified from 6 databases. Rank probability analysis was used to rank each intervention in terms of its probability of having the best outcome. Sinus rhythm prevalence beyond the 12-month follow-up was similar between the cut-and-sew, microwave and radiofrequency approaches, which were all ranked better than cryoablation (respectively, 39, 36, and 25 vs 1%). The cut-and-sew maze was ranked worst in terms of mortality outcomes compared with microwave, radiofrequency and cryoenergy (2 vs 19, 34, and 24%, respectively). The cut-and-sew maze procedure was associated with significantly lower stroke rates compared with microwave ablation [odds ratio <0.01; 95% confidence interval 0.00, 0.82], and ranked the best in terms of pacemaker requirements compared with microwave, radiofrequency and cryoenergy (81 vs 14, 1, and <0.01%, respectively). Bayesian rank probability analysis shows that the cut-and-sew approach is associated with the best outcomes in terms of sinus rhythm prevalence and stroke outcomes, and remains the gold standard approach for AF treatment. Given the limitations of indirect comparison analysis, these results should be viewed with caution and not over-interpreted. © The Author 2014. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
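The rank probability analysis mentioned here can be illustrated with a small sketch. The posterior draws below are synthetic (invented means and spreads, not the paper's posteriors); each joint draw is ranked, and the fraction of draws in which an intervention comes out on top approximates its probability of being the best treatment.

```python
import random

# Synthetic posterior draws of an outcome (e.g. sinus-rhythm rate) per
# intervention; means and standard deviations are made up for illustration.
random.seed(1)
posterior = {
    "cut-and-sew":    [random.gauss(0.80, 0.05) for _ in range(5000)],
    "radiofrequency": [random.gauss(0.78, 0.05) for _ in range(5000)],
    "cryoablation":   [random.gauss(0.65, 0.05) for _ in range(5000)],
}

def prob_best(samples):
    """P(best) per treatment: fraction of joint draws where it ranks first."""
    names = list(samples)
    n = len(next(iter(samples.values())))
    wins = {name: 0 for name in names}
    for i in range(n):
        best = max(names, key=lambda t: samples[t][i])
        wins[best] += 1
    return {name: wins[name] / n for name in names}

p = prob_best(posterior)
```

In a real network meta-analysis the draws would come from an MCMC fit that respects the trial network; ranking the joint draws is the same final step.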

  12. ScreenBEAM: a novel meta-analysis algorithm for functional genomics screens via Bayesian hierarchical modeling.

    Science.gov (United States)

    Yu, Jiyang; Silva, Jose; Califano, Andrea

    2016-01-15

Functional genomics (FG) screens, using RNAi or CRISPR technology, have become a standard tool for systematic, genome-wide loss-of-function studies for therapeutic target discovery. As in many large-scale assays, however, off-target effects, variable reagent potency and experimental noise must be appropriately accounted for to control for false positives. Indeed, rigorous statistical analysis of high-throughput FG screening data remains challenging, particularly when integrative analyses are used to combine multiple sh/sgRNAs targeting the same gene in the library. We use large, publicly available RNAi and CRISPR repositories to evaluate a novel meta-analysis approach for FG screens via Bayesian hierarchical modeling, the Screening Bayesian Evaluation and Analysis Method (ScreenBEAM). Results from our analysis show that the proposed strategy, which seamlessly combines all available data, robustly outperforms classical algorithms developed for microarray data sets as well as recent approaches designed for next generation sequencing technologies. Remarkably, the ScreenBEAM algorithm works well even when the quality of FG screens is relatively low, which accounts for about 80-95% of the public datasets. R package and source code are available at: https://github.com/jyyu/ScreenBEAM. ac2248@columbia.edu, jose.silva@mssm.edu, yujiyang@gmail.com Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  13. On Bayesian reliability analysis with informative priors and censoring

    International Nuclear Information System (INIS)

    Coolen, F.P.A.

    1996-01-01

In the statistical literature many methods have been presented to deal with censored observations, both within Bayesian and non-Bayesian frameworks, and such methods have been successfully applied to, e.g., reliability problems. In reliability theory it is also often emphasized that, through shortage of statistical data and limited possibilities for experiments, one often needs to rely heavily on the judgements of engineers or other experts, which makes Bayesian methods attractive. It is therefore important that such judgements can be elicited easily to provide informative prior distributions that reflect the knowledge of the engineers well. In this paper we focus on this aspect, especially on the situation where the judgements of the consulted engineers are based on experience in environments where censoring has also been present previously. We suggest the use of the attractive interpretation of the hyperparameters of conjugate prior distributions, when these are available for assumed parametric lifetime models, and we show how one may go beyond the standard conjugate priors, using similar interpretations of hyperparameters, to enable easier elicitation when censoring has been present in the past. This may even lead to more flexibility for modelling prior knowledge than standard conjugate priors offer, whereas the disadvantage of the more complicated calculations that may be needed to determine posterior distributions plays a minor role thanks to the advanced mathematical and statistical software that is widely available these days.
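As a concrete (and much simpler) instance of the conjugate-prior interpretation the paper builds on, consider exponential lifetimes with right censoring; the model and all numbers below are illustrative assumptions, not the paper's generalized priors. A Gamma(a, b) prior on the failure rate updates in closed form: add the number of observed failures to a and the total time on test to b.

```python
# Conjugate Bayesian update for exponential lifetimes with right censoring.
# Prior on the failure rate lambda: Gamma(a, b) with rate parameter b, so the
# prior mean is a / b. With d observed failures and total time on test T
# (failure times plus censoring times), the posterior is Gamma(a + d, b + T).

a, b = 2.0, 1000.0                   # prior: mean rate 0.002 failures/hour
failures = [400.0, 950.0]            # observed failure times (hours)
censored = [1200.0, 1200.0, 1200.0]  # units still running at 1200 h

d = len(failures)
T = sum(failures) + sum(censored)
a_post, b_post = a + d, b + T
posterior_mean_rate = a_post / b_post
print(a_post, b_post, round(posterior_mean_rate, 6))  # -> 4.0 5950.0 0.000672
```

The hyperparameters read as "a prior failures in b prior hours of testing", which is exactly the kind of interpretation that makes elicitation from engineers feasible.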

  14. A Bayesian Analysis of the Radioactive Releases of Fukushima

    DEFF Research Database (Denmark)

    Tomioka, Ryota; Mørup, Morten

    2012-01-01

The Fukushima Daiichi disaster of 11 March 2011 is considered the largest nuclear accident since the 1986 Chernobyl disaster and has been rated at level 7 on the International Nuclear Event Scale. As different radioactive materials have different effects on the human body, it is important to know the types of nuclides and their levels of concentration from the recorded mixture of radiations in order to take necessary measures. We presently formulate a Bayesian generative model for the data available on radioactive releases from the Fukushima Daiichi disaster across Japan. From the sparsely sampled … the Fukushima Daiichi plant, we establish that the model is able to account for the data. We further demonstrate how the model extends to include all the available measurements recorded throughout Japan. The model can be considered a first attempt to apply Bayesian learning unsupervised in order to give a more …

  15. A Bayesian deconvolution strategy for immunoprecipitation-based DNA methylome analysis

    Science.gov (United States)

    Down, Thomas A.; Rakyan, Vardhman K.; Turner, Daniel J.; Flicek, Paul; Li, Heng; Kulesha, Eugene; Gräf, Stefan; Johnson, Nathan; Herrero, Javier; Tomazou, Eleni M.; Thorne, Natalie P.; Bäckdahl, Liselotte; Herberth, Marlis; Howe, Kevin L.; Jackson, David K.; Miretti, Marcos M.; Marioni, John C.; Birney, Ewan; Hubbard, Tim J. P.; Durbin, Richard; Tavaré, Simon; Beck, Stephan

    2009-01-01

DNA methylation is an indispensable epigenetic modification of mammalian genomes. Consequently, there is great interest in strategies for genome-wide/whole-genome DNA methylation analysis, and immunoprecipitation-based methods have proven to be a powerful option. Such methods are rapidly shifting the bottleneck from data generation to data analysis, necessitating the development of better analytical tools. Until now, a major analytical difficulty associated with immunoprecipitation-based DNA methylation profiling has been the inability to estimate absolute methylation levels. Here we report the development of a novel cross-platform algorithm, the Bayesian Tool for Methylation Analysis (Batman), for analyzing Methylated DNA Immunoprecipitation (MeDIP) profiles generated using arrays (MeDIP-chip) or next-generation sequencing (MeDIP-seq). The latter is an approach we have developed to elucidate the first high-resolution whole-genome DNA methylation profile (DNA methylome) of any mammalian genome. MeDIP-seq/MeDIP-chip combined with Batman represent robust, quantitative, and cost-effective functional genomic strategies for elucidating the function of DNA methylation. PMID:18612301

  16. Optimizing Prediction Using Bayesian Model Averaging: Examples Using Large-Scale Educational Assessments.

    Science.gov (United States)

    Kaplan, David; Lee, Chansoon

    2018-01-01

    This article provides a review of Bayesian model averaging as a means of optimizing the predictive performance of common statistical models applied to large-scale educational assessments. The Bayesian framework recognizes that in addition to parameter uncertainty, there is uncertainty in the choice of models themselves. A Bayesian approach to addressing the problem of model uncertainty is the method of Bayesian model averaging. Bayesian model averaging searches the space of possible models for a set of submodels that satisfy certain scientific principles and then averages the coefficients across these submodels weighted by each model's posterior model probability (PMP). Using the weighted coefficients for prediction has been shown to yield optimal predictive performance according to certain scoring rules. We demonstrate the utility of Bayesian model averaging for prediction in education research with three examples: Bayesian regression analysis, Bayesian logistic regression, and a recently developed approach for Bayesian structural equation modeling. In each case, the model-averaged estimates are shown to yield better prediction of the outcome of interest than any submodel based on predictive coverage and the log-score rule. Implications for the design of large-scale assessments when the goal is optimal prediction in a policy context are discussed.
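A minimal sketch of the averaging step described above: submodel predictions weighted by posterior model probabilities. The submodels, BIC values, and predictions below are invented, and the weights use the common BIC approximation to PMPs rather than a full marginal-likelihood computation.

```python
import math

# Hypothetical submodels for predicting an assessment score, each with a
# BIC value from some fit and a point prediction for a new student.
models = {
    "M1: ses only":              {"bic": 1012.3, "pred": 510.0},
    "M2: ses + school":          {"bic": 1004.1, "pred": 522.0},
    "M3: ses + school + region": {"bic": 1005.8, "pred": 519.0},
}

def pmp_weights(models):
    """Approximate PMPs via the BIC weights w_k proportional to exp(-BIC_k / 2)."""
    best = min(m["bic"] for m in models.values())
    raw = {k: math.exp(-(m["bic"] - best) / 2) for k, m in models.items()}
    z = sum(raw.values())
    return {k: v / z for k, v in raw.items()}

weights = pmp_weights(models)
# The model-averaged prediction mixes the submodels by their PMPs.
bma_prediction = sum(weights[k] * models[k]["pred"] for k in models)
```

Subtracting the best BIC before exponentiating avoids underflow; the averaged prediction necessarily lies between the most extreme submodel predictions.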

  17. Factors governing partial coalescence in oil-in-water emulsions

    NARCIS (Netherlands)

    Fredrick, E.; Walstra, P.; Dewettinck, K.

    2010-01-01

The consequences of partial coalescence, an instability mechanism in oil-in-water food emulsions, are two-sided. On the one hand, it needs to be avoided in order to achieve an extended shelf life in food products like sauces, creams and several milk products. On the other hand, during the …

  18. Coalescing black hole solution in the De-Sitter universe

    International Nuclear Information System (INIS)

    Ahmed, Mainuddin

    2005-01-01

A new coalescing black hole solution of the Einstein-Maxwell equations in general relativity is given. The new solution is also found to support the Nernst theorem of thermodynamics in the case of a black hole. This solution thus promises to resolve an outstanding problem of thermodynamics and black hole physics. (author)

  19. Micromechanics of transformation-induced plasticity and variant coalescence

    International Nuclear Information System (INIS)

    Marketz, F.; Fischer, F.D.; University for Mining and Metallurgy, Leoben; Tanaka, K.

    1996-01-01

Quantitative micromechanics descriptions of both transformation-induced plasticity (TRIP) associated with the martensitic transformation in an Fe-Ni alloy and of variant coalescence in a Cu-Al-Ni shape memory alloy are presented. The macroscopic deformation behavior of a polycrystalline aggregate resulting from the rearrangements within the crystallites is modelled with the help of a finite element based periodic microfield approach. In the case of TRIP the parent→martensite transformation is described by microscale thermodynamic and kinetic equations taking internal stress states into account. The simulation of a classical experiment on TRIP makes it possible to quantify the Magee effect and the Greenwood-Johnson effect. Furthermore, the development of the martensitic microstructure is studied with respect to the stress-assisted transformation of preferred variants. In the case of variant coalescence the strain energy due to internal stress states has an important influence on the mechanical behavior. Formulating the reorientation process on the size scale of self-accommodating plate groups in terms of the mobility of the boundaries between martensitic variants, the macroscopic behavior in uniaxial tension is predicted by an incremental modelling procedure. Furthermore, the influence of energy dissipation on the overall behavior is quantified. (orig.)

  20. A Bayesian Analysis of Unobserved Component Models Using Ox

    Directory of Open Access Journals (Sweden)

    Charles S. Bos

    2011-05-01

Full Text Available This article details a Bayesian analysis of the Nile river flow data, using a state space model similar to that of other articles in this volume. For this data set, Metropolis-Hastings and Gibbs sampling algorithms are implemented in the programming language Ox. These Markov chain Monte Carlo methods only provide output conditioned upon the full data set. For filtered output, conditioning only on past observations, the particle filter is introduced. The sampling methods are flexible, and this advantage is used to extend the model to incorporate a stochastic volatility process. Volatility changes in both the Nile data and daily S&P 500 return data are investigated. The posterior density of parameters and states is found to provide information on which elements of the model are easily identifiable, and which are estimated with less precision.
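The Metropolis-Hastings part of such an analysis can be reduced to a few lines. The sketch below samples only a constant level mu for Nile-like data with known observation variance and a flat prior, a drastic simplification of the article's state space model; all numbers are invented.

```python
import math
import random

# Synthetic "annual flow" observations around an unknown level mu.
random.seed(7)
data = [random.gauss(900.0, 120.0) for _ in range(100)]
sigma2 = 120.0 ** 2  # observation variance, assumed known

def log_post(mu):
    # Flat prior on mu, so the log posterior is the Gaussian log likelihood
    # up to an additive constant.
    return -sum((y - mu) ** 2 for y in data) / (2 * sigma2)

# Random-walk Metropolis-Hastings: propose, then accept with probability
# min(1, posterior ratio); caching the current log posterior avoids rework.
mu, lp, chain = 0.0, log_post(0.0), []
for _ in range(20000):
    prop = mu + random.gauss(0.0, 25.0)
    lp_prop = log_post(prop)
    if math.log(random.random()) < lp_prop - lp:
        mu, lp = prop, lp_prop
    chain.append(mu)

posterior_mean = sum(chain[5000:]) / len(chain[5000:])  # discard burn-in
```

With a flat prior the posterior mean should land near the sample mean of the data; a Gibbs sampler or particle filter replaces only the proposal/acceptance core, not this overall loop structure.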

  1. Semiparametric Bayesian analysis of accelerated failure time models with cluster structures.

    Science.gov (United States)

    Li, Zhaonan; Xu, Xinyi; Shen, Junshan

    2017-11-10

    In this paper, we develop a Bayesian semiparametric accelerated failure time model for survival data with cluster structures. Our model allows distributional heterogeneity across clusters and accommodates their relationships through a density ratio approach. Moreover, a nonparametric mixture of Dirichlet processes prior is placed on the baseline distribution to yield full distributional flexibility. We illustrate through simulations that our model can greatly improve estimation accuracy by effectively pooling information from multiple clusters, while taking into account the heterogeneity in their random error distributions. We also demonstrate the implementation of our method using analysis of Mayo Clinic Trial in Primary Biliary Cirrhosis. Copyright © 2017 John Wiley & Sons, Ltd.

  2. Applying Bayesian Statistics to Educational Evaluation. Theoretical Paper No. 62.

    Science.gov (United States)

    Brumet, Michael E.

    Bayesian statistical inference is unfamiliar to many educational evaluators. While the classical model is useful in educational research, it is not as useful in evaluation because of the need to identify solutions to practical problems based on a wide spectrum of information. The reason Bayesian analysis is effective for decision making is that it…

  3. An elementary introduction to Bayesian computing using WinBUGS.

    Science.gov (United States)

    Fryback, D G; Stout, N K; Rosenberg, M A

    2001-01-01

    Bayesian statistics provides effective techniques for analyzing data and translating the results to inform decision making. This paper provides an elementary tutorial overview of the WinBUGS software for performing Bayesian statistical analysis. Background information on the computational methods used by the software is provided. Two examples drawn from the field of medical decision making are presented to illustrate the features and functionality of the software.
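The flavor of example such tutorials start from can be reproduced without WinBUGS at all, since the simplest models are conjugate. Below is an assumed Beta-Binomial example (illustrative numbers, not from the paper) computed in closed form.

```python
# Beta-Binomial update for a treatment response rate: a Beta(a, b) prior on
# the rate is conjugate to binomial data, so the posterior is available in
# closed form, no sampler needed.
a, b = 1.0, 1.0          # uniform Beta(1, 1) prior on the response rate
successes, trials = 14, 20

a_post, b_post = a + successes, b + (trials - successes)
posterior_mean = a_post / (a_post + b_post)
print(a_post, b_post, round(posterior_mean, 3))  # -> 15.0 7.0 0.682
```

Tools like WinBUGS earn their keep on models without such closed forms, but checking a sampler against a conjugate case like this is a standard sanity test.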

  4. Testing non-minimally coupled inflation with CMB data: a Bayesian analysis

    International Nuclear Information System (INIS)

    Campista, Marcela; Benetti, Micol; Alcaniz, Jailson

    2017-01-01

We use the most recent cosmic microwave background (CMB) data to perform a Bayesian statistical analysis and discuss the observational viability of inflationary models with a non-minimal coupling, ξ, between the inflaton field and the Ricci scalar. We particularize our analysis to two examples of small and large field inflationary models, namely, the Coleman-Weinberg and the chaotic quartic potentials. We find that (i) the ξ parameter is closely correlated with the primordial amplitude; (ii) although improving the agreement with the CMB data in the r−n_s plane, where r is the tensor-to-scalar ratio and n_s the primordial spectral index, a non-null coupling is strongly disfavoured with respect to the minimally coupled standard ΛCDM model, since the upper bounds of the Bayes factor (odds) for the ξ parameter are greater than 150:1.

  5. Status of the 2D Bayesian analysis of XENON100 data

    Energy Technology Data Exchange (ETDEWEB)

    Schindler, Stefan [JGU, Staudingerweg 7, 55128 Mainz (Germany)

    2015-07-01

The XENON100 experiment is located in the underground laboratory at LNGS in Italy. Since Dark Matter particles will only interact very rarely with normal matter, an environment with ultra low background, shielded from cosmic radiation, is needed. The standard analysis of XENON100 data has made use of the profile likelihood method (a frequentist approach) and still provides one of the most sensitive exclusion limits on WIMP Dark Matter. Here we present work towards a Bayesian approach to the analysis of XENON100 data, where we attempt to include the measured primary (S1) and secondary (S2) scintillation signals in a more complete way. The background and signal models in the S1-S2 space have to be defined, and a corresponding likelihood function describing these models has to be constructed.

  6. Testing non-minimally coupled inflation with CMB data: a Bayesian analysis

    Energy Technology Data Exchange (ETDEWEB)

    Campista, Marcela; Benetti, Micol; Alcaniz, Jailson, E-mail: campista@on.br, E-mail: micolbenetti@on.br, E-mail: alcaniz@on.br [Observatório Nacional, Rua General José Cristino 77, Rio de Janeiro, RJ, 20921-400 Brazil (Brazil)

    2017-09-01

We use the most recent cosmic microwave background (CMB) data to perform a Bayesian statistical analysis and discuss the observational viability of inflationary models with a non-minimal coupling, ξ, between the inflaton field and the Ricci scalar. We particularize our analysis to two examples of small and large field inflationary models, namely, the Coleman-Weinberg and the chaotic quartic potentials. We find that (i) the ξ parameter is closely correlated with the primordial amplitude; (ii) although improving the agreement with the CMB data in the r−n_s plane, where r is the tensor-to-scalar ratio and n_s the primordial spectral index, a non-null coupling is strongly disfavoured with respect to the minimally coupled standard ΛCDM model, since the upper bounds of the Bayes factor (odds) for the ξ parameter are greater than 150:1.

  7. Bayesian nonparametric meta-analysis using Polya tree mixture models.

    Science.gov (United States)

    Branscum, Adam J; Hanson, Timothy E

    2008-09-01

A common goal in meta-analysis is estimation of a single effect measure using data from several studies that are each designed to address the same scientific inquiry. Because studies are typically conducted in geographically dispersed locations, recent developments in the statistical analysis of meta-analytic data involve the use of random effects models that account for study-to-study variability attributable to differences in environments, demographics, genetics, and other sources that lead to heterogeneity in populations. Stemming from asymptotic theory, study-specific summary statistics are modeled according to normal distributions with means representing latent true effect measures. A parametric approach subsequently models these latent measures using a normal distribution, which is strictly a convenient modeling assumption absent of theoretical justification. To eliminate the influence of overly restrictive parametric models on inferences, we consider a broader class of random effects distributions. We develop a novel hierarchical Bayesian nonparametric Polya tree mixture (PTM) model. We present methodology for testing the PTM versus a normal random effects model. These methods provide researchers a straightforward approach for conducting a sensitivity analysis of the normality assumption for random effects. An application involving meta-analysis of epidemiologic studies designed to characterize the association between alcohol consumption and breast cancer is presented, which, together with results from simulated data, highlights the performance of PTMs in the presence of nonnormality of effect measures in the source population.

  8. Bayesian approach and application to operation safety

    International Nuclear Information System (INIS)

    Procaccia, H.; Suhner, M.Ch.

    2003-01-01

The management of industrial risks requires the development of statistical and probabilistic analyses which use all the conveniently available information in order to compensate for the insufficient experience feedback in a domain where accidents and incidents remain too scarce to perform a classical frequency-based statistical analysis. The Bayesian decision approach is well adapted to this problem because it integrates both expertise and experience feedback. The domain of knowledge is widened, forecasting studies become possible, and decisions and remedial actions are strengthened thanks to risk-cost-benefit optimization analyses. This book presents the bases of the Bayesian approach and its concrete applications in various industrial domains. After a mathematical presentation of industrial operation safety concepts and of the principles of the Bayesian approach, the book treats some of the problems that can be solved with this approach: software reliability, controls linked with equipment warranty, dynamical updating of databases, expertise modeling and weighting, and Bayesian optimization in the domains of maintenance, quality control, testing and design of new equipment. A synthesis of the mathematical formulae used in this approach is given in conclusion. (J.S.)

  9. Void growth and coalescence in metals deformed at elevated temperature

    DEFF Research Database (Denmark)

    Klöcker, H.; Tvergaard, Viggo

    2000-01-01

For metals deformed at elevated temperatures the growth of voids to coalescence is studied numerically. The voids are assumed to be present from the beginning of deformation, and the rate of deformation considered is so high that void growth is dominated by power law creep of the material, without any noticeable effect of surface diffusion. Axisymmetric unit cell model computations are used to study void growth in a material containing a periodic array of voids, and the onset of the coalescence process is defined as the stage where plastic flow localizes in the ligaments between neighbouring voids. The focus of the study is on various relatively high stress triaxialities. In order to represent the results in terms of a porous ductile material model, a set of constitutive relations is used, which has been proposed for void growth in a material undergoing power law creep.

  10. Bayesian ideas and data analysis an introduction for scientists and statisticians

    CERN Document Server

    Christensen, Ronald; Branscum, Adam; Hanson, Timothy E.

    2010-01-01

    This book provides a good introduction to Bayesian approaches to applied statistical modelling. … The authors have fulfilled their main aim of introducing Bayesian ideas through examples using a large number of statistical models. An interesting feature of this book is the humour of the authors that make it more fun than typical statistics books. In summary, this is a very interesting introductory book, very well organised and has been written in a style that is extremely pleasant and enjoyable to read. Both the statistical concepts and examples are very well explained. In conclusion, I highly

  11. Bayesian Analysis of Multidimensional Item Response Theory Models: A Discussion and Illustration of Three Response Style Models

    Science.gov (United States)

    Leventhal, Brian C.; Stone, Clement A.

    2018-01-01

    Interest in Bayesian analysis of item response theory (IRT) models has grown tremendously due to the appeal of the paradigm among psychometricians, advantages of these methods when analyzing complex models, and availability of general-purpose software. Possible models include models which reflect multidimensionality due to designed test structure,…

  12. Uncertainty analysis of depth predictions from seismic reflection data using Bayesian statistics

    Science.gov (United States)

    Michelioudakis, Dimitrios G.; Hobbs, Richard W.; Caiado, Camila C. S.

    2018-06-01

Estimating the depths of target horizons from seismic reflection data is an important task in exploration geophysics. To constrain these depths we need a reliable and accurate velocity model. Here, we build an optimum 2-D seismic reflection data processing flow focused on pre-stack deghosting filters and velocity model building, and apply Bayesian methods, including Gaussian process emulation and Bayesian History Matching, to estimate the uncertainties of the depths of key horizons near the Deep Sea Drilling Project (DSDP) borehole 258 (DSDP-258), located in the Mentelle Basin, southwest of Australia, and compare the results with the drilled core from that well. Following this strategy, the tie between the modelled and observed depths from the DSDP-258 core was in accordance with the ±2σ posterior credibility intervals, and predictions for depths to key horizons were made for the two new drill sites adjacent to the existing borehole of the area. The probabilistic analysis allowed us to generate multiple realizations of pre-stack depth-migrated images; these can be used directly to better constrain interpretation and identify potential risk at drill sites. The method will be applied to constrain the drilling targets for the upcoming International Ocean Discovery Program, Leg 369.

  13. Development and comparison in uncertainty assessment based Bayesian modularization method in hydrological modeling

    Science.gov (United States)

    Li, Lu; Xu, Chong-Yu; Engeland, Kolbjørn

    2013-04-01

With respect to model calibration, parameter estimation and analysis of uncertainty sources, various regression and probabilistic approaches are used in hydrological modeling. A family of Bayesian methods, which incorporates different sources of information into a single analysis through Bayes' theorem, is widely used for uncertainty assessment. However, none of these approaches can treat the impact of high flows in hydrological modeling well. This study proposes a Bayesian modularization uncertainty assessment approach in which the highest streamflow observations are treated as suspect information that should not influence the inference of the main bulk of the model parameters. This study includes a comprehensive comparison and evaluation of uncertainty assessments by our new Bayesian modularization method and standard Bayesian methods using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions were used in combination with the standard Bayesian method: the AR(1) plus Normal model independent of time (Model 1), the AR(1) plus Normal model dependent on time (Model 2) and the AR(1) plus Multi-normal model (Model 3). The results reveal that the Bayesian modularization method provides the most accurate streamflow estimates as measured by the Nash-Sutcliffe efficiency, and provides the best uncertainty estimates for low, medium and entire flows compared to standard Bayesian methods. The study thus provides a new approach for reducing the impact of high flows on the discharge uncertainty assessment of hydrological models via Bayesian methods.
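The "AR(1) plus Normal" likelihood named above can be sketched as a conditional Gaussian log likelihood on lag-one residual innovations. The residual series and parameter values below are invented for illustration, and this is only the time-independent variant (in the style of Model 1), not the full WASMOD setup.

```python
import math

# Hypothetical residuals e_t = q_observed - q_simulated from a hydrological
# model; the AR(1) error model assumes e_t = rho * e_{t-1} + eps_t with
# eps_t ~ N(0, sigma^2).
residuals = [0.5, 0.8, 0.6, 0.9, 0.4, 0.2, 0.5]

def ar1_loglik(e, rho, sigma):
    """Gaussian log likelihood of an AR(1) error model, conditioning on e[0]."""
    ll = 0.0
    for t in range(1, len(e)):
        eps = e[t] - rho * e[t - 1]  # the innovation at step t
        ll += -0.5 * math.log(2 * math.pi * sigma ** 2) \
              - eps ** 2 / (2 * sigma ** 2)
    return ll
```

Plugged into an MH sampler over (rho, sigma) and the hydrological parameters, this function plays the role of the likelihood term in Bayes' theorem; positively autocorrelated residuals reward rho > 0.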

  14. Bayesian analysis of heat pipe life test data for reliability demonstration testing

    International Nuclear Information System (INIS)

    Bartholomew, R.J.; Martz, H.F.

    1985-01-01

    The demonstration testing duration requirements to establish a quantitative measure of assurance of expected lifetime for heat pipes was determined. The heat pipes are candidate devices for transporting heat generated in a nuclear reactor core to thermoelectric converters for use as a space-based electric power plant. A Bayesian analysis technique is employed, utilizing a limited Delphi survey, and a geometric mean accelerated test criterion involving heat pipe power (P) and temperature (T). Resulting calculations indicate considerable test savings can be achieved by employing the method, but development testing to determine heat pipe failure mechanisms should not be circumvented

  15. On the use of higher order wave forms in the search for gravitational waves emitted by compact binary coalescences

    Science.gov (United States)

    McKechan, David J. A.

    2010-11-01

This thesis concerns the use, in gravitational wave data analysis, of higher order waveform models of the gravitational radiation emitted by compact binary coalescences. We begin with an introductory chapter that includes an overview of the theory of general relativity, gravitational radiation and ground-based interferometric gravitational wave detectors. We then discuss, in Chapter 2, the gravitational waves emitted by compact binary coalescences, with an explanation of higher order waveforms and how they differ from leading order waveforms; we also introduce the post-Newtonian formalism. In Chapter 3 the method and results of a gravitational wave search for low mass compact binary coalescences using a subset of LIGO's 5th science run data are presented, and in the subsequent chapter we examine how one could use higher order waveforms in such analyses. We follow the development of a new search algorithm that incorporates higher order waveforms, with promising results for detection efficiency and parameter estimation. In Chapter 5, a new method of windowing time-domain waveforms that offers benefits to gravitational wave searches is presented. The final chapter covers the development of a game designed as an outreach project to raise public awareness and understanding of the search for gravitational waves.

  16. Statistical assignment of DNA sequences using Bayesian phylogenetics

    DEFF Research Database (Denmark)

    Terkelsen, Kasper Munch; Boomsma, Wouter Krogh; Huelsenbeck, John P.

    2008-01-01

We provide a new automated statistical method for DNA barcoding based on a Bayesian phylogenetic analysis. The method is based on automated database sequence retrieval, alignment, and phylogenetic analysis using a custom-built program for Bayesian phylogenetic analysis. We show on real data that the method outperforms Blast searches as a measure of confidence and can help eliminate 80% of all false assignments based on the best Blast hit. However, the most important advance of the method is that it provides statistically meaningful measures of confidence. We apply the method to a re-analysis of previously published ancient DNA data and show that, with high statistical confidence, most of the published sequences are in fact of Neanderthal origin. However, there are several cases of chimeric sequences that are composed of a combination of both Neanderthal and modern human DNA.
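The assignment step itself reduces to Bayes' rule over candidate groups. The sketch below turns per-clade log likelihoods of a query sequence (made-up numbers, and a uniform prior over clades, both assumptions) into posterior assignment probabilities with a numerically stable normalization.

```python
import math

# Hypothetical log likelihoods of one query sequence under each candidate
# clade, as a phylogenetic analysis might produce.
loglik = {"Neanderthal": -1204.2, "modern human": -1209.8, "chimpanzee": -1251.5}

def assignment_posterior(loglik):
    """Posterior over clades under a uniform prior, via the log-sum-exp trick."""
    m = max(loglik.values())                         # shift to avoid underflow
    raw = {k: math.exp(v - m) for k, v in loglik.items()}
    z = sum(raw.values())
    return {k: v / z for k, v in raw.items()}

post = assignment_posterior(loglik)
```

Unlike a best-Blast-hit score, these posteriors are directly interpretable probabilities, which is the "statistically meaningful measure of confidence" the record emphasizes.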

  17. Magnetic neutral point stretching and coalescence in tearing-generated magnetohydrodynamic structures

    International Nuclear Information System (INIS)

    Malara, F.; Veltri, P.; Carbone, V.

    1991-01-01

    The time evolution of the instability of a sheet pinch is numerically studied using a sufficiently high ratio of system length to width in order to allow the simultaneous growth of several unstable wavelengths. This numerical simulation provides new insights into the nonlinear development of the tearing instability. Before the instability saturates, the nonlinear interactions among the unstable modes produce local coalescence phenomena that destroy the weaker current pinches and reduce the number of magnetic islands. In contrast with the usual picture, this coalescence is not due to the attraction between the current maxima, but is due to the stretching of the X-neutral points associated with the most intense current pinches. The global perturbation growth rate remains essentially unchanged in time, being of the order of the resistive instability growth rate

  18. Bus Route Design with a Bayesian Network Analysis of Bus Service Revenues

    OpenAIRE

    Liu, Yi; Jia, Yuanhua; Feng, Xuesong; Wu, Jiang

    2018-01-01

    A Bayesian network is used to estimate revenues of bus services in consideration of the effect of bus travel demands, passenger transport distances, and so on. In this research, the area X in Beijing has been selected as the study area because of its relatively high bus travel demand and, on the contrary, unsatisfactory bus services. It is suggested that the proposed Bayesian network approach is able to rationally predict the probabilities of different revenues of various route services, from...

  19. Observation of Coalescence Process of Silver Nanospheres During Shape Transformation to Nanoprisms

    Directory of Open Access Journals (Sweden)

    Yu Pyng

    2011-01-01

    Full Text Available Abstract In this report, we observed the growth mechanism and the shape transformation from spherical nanoparticles (diameter ~6 nm) to triangular nanoprisms (bisector length ~100 nm). We used a simple direct chemical reduction method and provide evidence for the growth of silver nanoprisms via a coalescence process. Unlike previous reports, our method does not rely upon light, heat, or a strong oxidant for the shape transformation. The transformation could be triggered by fine-tuning the pH value of the silver colloidal solution. Based on our extensive examination using transmission electron microscopy, we propose a non-point-initiated growth mechanism, which is a combination of coalescence and dissolution–recrystallization processes during the growth of silver nanoprisms.

  20. Bayesian analysis of deterministic and stochastic prisoner's dilemma games

    Directory of Open Access Journals (Sweden)

    Howard Kunreuther

    2009-08-01

    Full Text Available This paper compares the behavior of individuals playing a classic two-person deterministic prisoner's dilemma (PD game with choice data obtained from repeated interdependent security prisoner's dilemma games with varying probabilities of loss and the ability to learn (or not learn about the actions of one's counterpart, an area of recent interest in experimental economics. This novel data set, from a series of controlled laboratory experiments, is analyzed using Bayesian hierarchical methods, the first application of such methods in this research domain. We find that individuals are much more likely to be cooperative when payoffs are deterministic than when the outcomes are probabilistic. A key factor explaining this difference is that subjects in a stochastic PD game respond not just to what their counterparts did but also to whether or not they suffered a loss. These findings are interpreted in the context of behavioral theories of commitment, altruism and reciprocity. The work provides a linkage between Bayesian statistics, experimental economics, and consumer psychology.

  1. Assessing compositional variability through graphical analysis and Bayesian statistical approaches: case studies on transgenic crops.

    Science.gov (United States)

    Harrigan, George G; Harrison, Jay M

    2012-01-01

    New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to more clearly describe the influence of the numerous sources of compositional variation we present an introduction to two alternative but complementary approaches to data analysis and interpretation. These include i) exploratory data analysis (EDA) with its emphasis on visualization and graphics-based approaches and ii) Bayesian statistical methodology that provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case-studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition.
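The Bayesian alternative described above can be illustrated with a minimal conjugate-normal sketch: a posterior for a GM-versus-conventional mean composition difference, reported as a direct probability statement rather than a p-value. All numbers below are invented for illustration and are not taken from the study.

```python
import math

def normal_cdf(x, mu, sd):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf((x - mu) / (sd * math.sqrt(2.0))))

# Observed mean difference, known sampling sd, sample size (all invented)
n, xbar, sigma = 25, 0.8, 2.0
# Vague normal prior on the true mean difference
prior_mu, prior_sd = 0.0, 10.0

# Conjugate update: normal prior + normal likelihood -> normal posterior
post_var = 1.0 / (1.0 / prior_sd**2 + n / sigma**2)
post_mu = post_var * (prior_mu / prior_sd**2 + n * xbar / sigma**2)
post_sd = math.sqrt(post_var)

# Direct probability that the composition difference is positive,
# in place of a significance test
p_positive = 1.0 - normal_cdf(0.0, post_mu, post_sd)
print(round(p_positive, 3))
```

The posterior probability is interpreted directly ("the difference is positive with probability ~0.98"), with no multiple-comparison corrections to reason about.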

  2. Embedding the results of focussed Bayesian fusion into a global context

    Science.gov (United States)

    Sander, Jennifer; Heizmann, Michael

    2014-05-01

    Bayesian statistics offers a well-founded and powerful fusion methodology also for the fusion of heterogeneous information sources. However, except in special cases, the needed posterior distribution is not analytically derivable. As consequence, Bayesian fusion may cause unacceptably high computational and storage costs in practice. Local Bayesian fusion approaches aim at reducing the complexity of the Bayesian fusion methodology significantly. This is done by concentrating the actual Bayesian fusion on the potentially most task relevant parts of the domain of the Properties of Interest. Our research on these approaches is motivated by an analogy to criminal investigations where criminalists pursue clues also only locally. This publication follows previous publications on a special local Bayesian fusion technique called focussed Bayesian fusion. Here, the actual calculation of the posterior distribution gets completely restricted to a suitably chosen local context. By this, the global posterior distribution is not completely determined. Strategies for using the results of a focussed Bayesian analysis appropriately are needed. In this publication, we primarily contrast different ways of embedding the results of focussed Bayesian fusion explicitly into a global context. To obtain a unique global posterior distribution, we analyze the application of the Maximum Entropy Principle that has been shown to be successfully applicable in metrology and in different other areas. To address the special need for making further decisions subsequently to the actual fusion task, we further analyze criteria for decision making under partial information.

  3. Bayesian network representing system dynamics in risk analysis of nuclear systems

    Science.gov (United States)

    Varuttamaseni, Athi

    2011-12-01

    A dynamic Bayesian network (DBN) model is used in conjunction with the alternating conditional expectation (ACE) regression method to analyze the risk associated with the loss of feedwater accident coupled with a subsequent initiation of the feed and bleed operation in the Zion-1 nuclear power plant. The use of the DBN allows the joint probability distribution to be factorized, enabling the analysis to be done on many simpler network structures rather than on one complicated structure. The construction of the DBN model assumes conditional independence relations among certain key reactor parameters. The choice of parameters to model is based on considerations of the macroscopic balance statements governing the behavior of the reactor under a quasi-static assumption. The DBN is used to relate the peak clad temperature to a set of independent variables that are known to be important in determining the success of the feed and bleed operation. A simple linear relationship is then used to relate the clad temperature to the core damage probability. To obtain a quantitative relationship among different nodes in the DBN, surrogates of the RELAP5 reactor transient analysis code are used. These surrogates are generated by applying the ACE algorithm to output data obtained from about 50 RELAP5 cases covering a wide range of the selected independent variables. These surrogates allow important safety parameters such as the fuel clad temperature to be expressed as a function of key reactor parameters such as the coolant temperature and pressure together with important independent variables such as the scram delay time. The time-dependent core damage probability is calculated by sampling the independent variables from their probability distributions and propagating the information up through the Bayesian network to give the clad temperature. With knowledge of the clad temperature, and the assumption that the core damage probability has a one-to-one relationship to it, we obtain the core damage probability as a function of time.
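The sampling-and-propagation step described above can be sketched in a few lines. The surrogate coefficients, input distributions, and the linear temperature-to-damage map below are all invented stand-ins for the ACE-derived RELAP5 surrogates, not the study's actual models.

```python
import random

random.seed(0)

def clad_temp_surrogate(scram_delay_s, coolant_temp_k):
    # Hypothetical stand-in for an ACE-derived RELAP5 surrogate
    # (coefficients invented for illustration)
    return 600.0 + 8.0 * scram_delay_s + 0.5 * (coolant_temp_k - 550.0)

def core_damage_prob(t_clad_k, t_onset=900.0, t_fail=1500.0):
    # Assumed linear ramp between damage-onset and failure clad temperatures
    return min(1.0, max(0.0, (t_clad_k - t_onset) / (t_fail - t_onset)))

# Monte Carlo: sample the independent variables from their distributions
# and propagate through the network to a time-averaged damage probability
n = 10_000
mean_cdp = sum(
    core_damage_prob(clad_temp_surrogate(random.uniform(0.0, 60.0),
                                         random.gauss(550.0, 10.0)))
    for _ in range(n)
) / n
print(mean_cdp)
```

Each sample plays the role of one pass up the network; in the actual analysis the surrogate is regressed from RELAP5 output rather than written down by hand.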

  4. System Analysis by Mapping a Fault-tree into a Bayesian-network

    Science.gov (United States)

    Sheng, B.; Deng, C.; Wang, Y. H.; Tang, L. H.

    2018-05-01

    In view of the limitations of fault tree analysis in reliability assessment, the Bayesian Network (BN) has been studied as an alternative technique. After a brief introduction to the method for mapping a Fault Tree (FT) into an equivalent BN, equations used to calculate the structure importance degree, the probability importance degree, and the critical importance degree are presented. Furthermore, the correctness of these equations is proved mathematically. Combined with an aircraft landing gear's FT, an equivalent BN is developed and analysed. The results show that the BN method yields richer and more accurate information than the FT, demonstrating that the BN is a superior technique for both reliability assessment and fault diagnosis.
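The FT-to-BN mapping can be illustrated with a two-event OR gate: each basic event becomes a root node carrying its failure probability, and the gate becomes a child node with a deterministic conditional probability table. A minimal sketch by exact enumeration (failure probabilities assumed for illustration), including a diagnostic query that a plain fault tree cannot express:

```python
from itertools import product

# Basic-event failure probabilities (assumed for illustration)
p_basic = {"A": 0.1, "B": 0.2}

def joint(a, b):
    # Joint probability of basic-event states a, b (1 = failed)
    pa = p_basic["A"] if a else 1.0 - p_basic["A"]
    pb = p_basic["B"] if b else 1.0 - p_basic["B"]
    return pa * pb

def top(a, b):
    # OR gate mapped to a deterministic CPT on the BN's top node
    return int(bool(a or b))

# Forward query, P(top = 1): marginalize over the basic events
p_top = sum(joint(a, b) for a, b in product((0, 1), repeat=2) if top(a, b))

# Backward (diagnostic) query, P(A = 1 | top = 1): the kind of inference
# the BN supports but the FT does not
p_a_and_top = sum(joint(a, b) for a, b in product((0, 1), repeat=2)
                  if a and top(a, b))
p_a_given_top = p_a_and_top / p_top

print(p_top)          # 0.28 = 1 - 0.9 * 0.8
print(p_a_given_top)  # ~0.357
```

The importance-degree equations in the paper are likewise probability queries on this structure; the backward query above is what makes the BN useful for fault diagnosis.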

  5. Time Series Analysis of Non-Gaussian Observations Based on State Space Models from Both Classical and Bayesian Perspectives

    NARCIS (Netherlands)

    Durbin, J.; Koopman, S.J.M.

    1998-01-01

    The analysis of non-Gaussian time series using state space models is considered from both classical and Bayesian perspectives. The treatment in both cases is based on simulation using importance sampling and antithetic variables; Monte Carlo Markov chain methods are not employed. Non-Gaussian

  6. Bubble Coalescence: Effect of Bubble Approach Velocity and Liquid Viscosity

    Czech Academy of Sciences Publication Activity Database

    Orvalho, Sandra; Růžička, Marek; Olivieri, G.; Marzocchella, A.

    2015-01-01

    Roč. 134, SEP 29 (2015), s. 205-216 ISSN 0009-2509 R&D Projects: GA MŠk(CZ) LD13018 Institutional support: RVO:67985858 Keywords : bubble coalescence * bubble approach velocity * liquid viscosity Subject RIV: CI - Industrial Chemistry, Chemical Engineering Impact factor: 2.750, year: 2015

  7. Deep Learning and Bayesian Methods

    Directory of Open Access Journals (Sweden)

    Prosper Harrison B.

    2017-01-01

    Full Text Available A revolution is underway in which deep neural networks are routinely used to solve difficult problems such as face recognition and natural language understanding. Particle physicists have taken notice and have started to deploy these methods, achieving results that suggest a potentially significant shift in how data might be analyzed in the not too distant future. We discuss a few recent developments in the application of deep neural networks and then indulge in speculation about how such methods might be used to automate certain aspects of data analysis in particle physics. Next, the connection to Bayesian methods is discussed and the paper ends with thoughts on a significant practical issue, namely, how, from a Bayesian perspective, one might optimize the construction of deep neural networks.

  8. Multiscale Simulations of Magnetic Island Coalescence

    Science.gov (United States)

    Dorelli, John C.

    2010-01-01

    We describe a new interactive parallel Adaptive Mesh Refinement (AMR) framework written in the Python programming language. This new framework, PyAMR, hides the details of parallel AMR data structures and algorithms (e.g., domain decomposition, grid partition, and inter-process communication), allowing the user to focus on the development of algorithms for advancing the solution of a system of partial differential equations on a single uniform mesh. We demonstrate the use of PyAMR by simulating the pairwise coalescence of magnetic islands using the resistive Hall MHD equations. Techniques for coupling different physics models on different levels of the AMR grid hierarchy are discussed.

  9. GALAXY ROTATION AND RAPID SUPERMASSIVE BINARY COALESCENCE

    Energy Technology Data Exchange (ETDEWEB)

    Holley-Bockelmann, Kelly [Vanderbilt University, Nashville, TN (United States); Khan, Fazeel Mahmood, E-mail: k.holley@vanderbilt.edu [Institute of Space Technology (IST), Islamabad (Pakistan)

    2015-09-10

    Galaxy mergers usher the supermassive black hole (SMBH) in each galaxy to the center of the potential, where they form an SMBH binary. The binary orbit shrinks by ejecting stars via three-body scattering, but ample work has shown that in spherical galaxy models, the binary separation stalls after ejecting all the stars in its loss cone—this is the well-known final parsec problem. However, it has been shown that SMBH binaries in non-spherical galactic nuclei harden at a nearly constant rate until reaching the gravitational wave regime. Here we use a suite of direct N-body simulations to follow SMBH binary evolution in both corotating and counterrotating flattened galaxy models. For N > 500 K, we find that the evolution of the SMBH binary is convergent and is independent of the particle number. Rotation in general increases the hardening rate of SMBH binaries even more effectively than galaxy geometry alone. SMBH binary hardening rates are similar for co- and counterrotating galaxies. In the corotating case, the center of mass of the SMBH binary settles into an orbit that is in corotation resonance with the background rotating model, and the coalescence time is roughly a few 100 Myr faster than a non-rotating flattened model. We find that counterrotation drives SMBHs to coalesce on a nearly radial orbit promptly after forming a hard binary. We discuss the implications for gravitational wave astronomy, hypervelocity star production, and the effect on the structure of the host galaxy.

  10. GALAXY ROTATION AND RAPID SUPERMASSIVE BINARY COALESCENCE

    International Nuclear Information System (INIS)

    Holley-Bockelmann, Kelly; Khan, Fazeel Mahmood

    2015-01-01

    Galaxy mergers usher the supermassive black hole (SMBH) in each galaxy to the center of the potential, where they form an SMBH binary. The binary orbit shrinks by ejecting stars via three-body scattering, but ample work has shown that in spherical galaxy models, the binary separation stalls after ejecting all the stars in its loss cone—this is the well-known final parsec problem. However, it has been shown that SMBH binaries in non-spherical galactic nuclei harden at a nearly constant rate until reaching the gravitational wave regime. Here we use a suite of direct N-body simulations to follow SMBH binary evolution in both corotating and counterrotating flattened galaxy models. For N > 500 K, we find that the evolution of the SMBH binary is convergent and is independent of the particle number. Rotation in general increases the hardening rate of SMBH binaries even more effectively than galaxy geometry alone. SMBH binary hardening rates are similar for co- and counterrotating galaxies. In the corotating case, the center of mass of the SMBH binary settles into an orbit that is in corotation resonance with the background rotating model, and the coalescence time is roughly a few 100 Myr faster than a non-rotating flattened model. We find that counterrotation drives SMBHs to coalesce on a nearly radial orbit promptly after forming a hard binary. We discuss the implications for gravitational wave astronomy, hypervelocity star production, and the effect on the structure of the host galaxy.

  11. A Bayesian Network Schema for Lessening Database Inference

    National Research Council Canada - National Science Library

    Chang, LiWu; Moskowitz, Ira S

    2001-01-01

    .... The authors introduce a formal schema for database inference analysis, based upon a Bayesian network structure, which identifies critical parameters involved in the inference problem and represents...

  12. Séparation par coalescence en lit fixe d'une phase aqueuse émulsionnée dans une phase organique Fixed-Bed Coalescence to Separate an Emulsified Aqueous Phase in an Organic Phase

    Directory of Open Access Journals (Sweden)

    Calteau J. P.

    2006-11-01

    Full Text Available This article examines the separation of emulsified water in kerosene by passing it through a granular medium. The physico-chemical properties of the coalescing material (nature and surface state) are considered. Separation efficiency is analyzed as a function of: wettability; grain size; surface state of the material; emulsion pass-through rate; and height of the bed. A qualitative and quantitative analysis of water retention in the bed gives a better understanding of the coalescence mechanisms.

  13. Bayesian phylogeography finds its roots.

    Directory of Open Access Journals (Sweden)

    Philippe Lemey

    2009-09-01

    Full Text Available As a key factor in endemic and epidemic dynamics, the geographical distribution of viruses has been frequently interpreted in the light of their genetic histories. Unfortunately, inference of historical dispersal or migration patterns of viruses has mainly been restricted to model-free heuristic approaches that provide little insight into the temporal setting of the spatial dynamics. The introduction of probabilistic models of evolution, however, offers unique opportunities to engage in this statistical endeavor. Here we introduce a Bayesian framework for inference, visualization and hypothesis testing of phylogeographic history. By implementing character mapping in a Bayesian software that samples time-scaled phylogenies, we enable the reconstruction of timed viral dispersal patterns while accommodating phylogenetic uncertainty. Standard Markov model inference is extended with a stochastic search variable selection procedure that identifies the parsimonious descriptions of the diffusion process. In addition, we propose priors that can incorporate geographical sampling distributions or characterize alternative hypotheses about the spatial dynamics. To visualize the spatial and temporal information, we summarize inferences using virtual globe software. We describe how Bayesian phylogeography compares with previous parsimony analysis in the investigation of the influenza A H5N1 origin and H5N1 epidemiological linkage among sampling localities. Analysis of rabies in West African dog populations reveals how virus diffusion may enable endemic maintenance through continuous epidemic cycles. From these analyses, we conclude that our phylogeographic framework will be an important asset in molecular epidemiology that can be easily generalized to infer biogeography from genetic data for many organisms.

  14. Difference in growth and coalescing patterns of droplets on bi-philic surfaces with varying spatial distribution.

    Science.gov (United States)

    Garimella, Martand Mayukh; Koppu, Sudheer; Kadlaskar, Shantanu Shrikant; Pillutla, Venkata; Abhijeet; Choi, Wonjae

    2017-11-01

    This paper reports the condensation and subsequent motion of water droplets on bi-philic surfaces, surfaces that are patterned with regions of different wettability. Bi-philic surfaces can enhance the water collection efficiency: droplets condensing on hydrophobic regions wick into hydrophilic drain channels when droplets grow to a certain size, renewing the condensation on the dry hydrophobic region. The onset of the drain phenomenon can be triggered by multiple events of distinct nature, ranging from gravity and direct contact between a droplet and a drain channel to mutual coalescence between droplets. This paper focuses on the effect of the length scale of hydrophobic regions on the dynamics of mutual coalescence between droplets and subsequent drainage. The main hypothesis was that, when the drop size is sufficient, the kinetic energy associated with a coalescence of droplets may cause dynamic advancing of a newly formed drop, leading to further coalescence with nearby droplets and ultimately to a chain reaction. We fabricate bi-philic surfaces with hydrophilic and hydrophobic stripes, and the results confirm that coalescing droplets, once their length scale increases beyond 0.2 mm, indeed display dynamic expansion and a chain reaction. Multiple droplets can thus migrate to the hydrophilic drain simultaneously, even when the initial motion of the droplets was not triggered by direct contact between the droplet and the hydrophilic drain. The efficiency of drainage due to mutual coalescence varies with the length scale of the bi-philic patterns, and the drain phenomenon reaches its peak when the width of the hydrophobic stripes is between 800 μm and 1 mm; the Ohnesorge numbers of droplets draining on these surfaces are 0.0042 and 0.0037, respectively. The observed length scale of the bi-philic patterns matches that of the Stenocara beetle's fog-harvesting back surface, which suggests that the surface of the insect is optimized for water collection.
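The Ohnesorge numbers quoted in the abstract can be recovered from standard water properties at the stated stripe widths. A quick check, assuming μ ≈ 1 mPa·s, ρ ≈ 1000 kg/m³, and σ ≈ 72 mN/m for water at room temperature:

```python
import math

def ohnesorge(mu, rho, sigma, length):
    # Oh = mu / sqrt(rho * sigma * L): ratio of viscous forces to
    # inertial and surface-tension forces at length scale L
    return mu / math.sqrt(rho * sigma * length)

# Standard water properties (Pa·s, kg/m^3, N/m)
mu, rho, sigma = 1.0e-3, 1.0e3, 0.072

oh_800um = ohnesorge(mu, rho, sigma, 800e-6)  # ~0.0042
oh_1mm = ohnesorge(mu, rho, sigma, 1e-3)      # ~0.0037
print(oh_800um, oh_1mm)
```

Both values match the abstract's figures, confirming that the quoted range corresponds to water droplets at the hydrophobic-stripe length scale.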

  15. Third generation algae biofuels in Italy by 2030: A scenario analysis using Bayesian networks

    International Nuclear Information System (INIS)

    Gambelli, Danilo; Alberti, Francesca; Solfanelli, Francesco; Vairo, Daniela; Zanoli, Raffaele

    2017-01-01

    We have analysed the potential for biofuels from microalgae in the Italian biofuels context. This scenario analysis considers alternative pathways for the adoption of biofuels from microalgae by the year 2030. The scenarios were developed using a probabilistic approach based on Bayesian networks, through a structured process for elicitation of expert knowledge. We have identified the most and least favourable scenarios in terms of the expected likelihood for the development of the market of biofuels from microalgae, through which we have focussed on the contribution of economic and policy aspects in the development of the sector. A detailed analysis of the contribution of each variable in the context of the scenarios is also provided. These data represent a starting point for the evaluation of different policy options for the future biofuel market in Italy. The best scenario shows a 75% probability that biofuels from microalgae will exceed 20% of the biofuel market by 2030. This is conditional on the improvement and development of the technological changes and environmental policies, and of the markets for bioenergy and novel foods derived from microalgae. - Highlights: • Scenarios for third-generation biofuels are modelled by Bayesian networks. • Best and worst scenarios for the year 2030 are presented. • The role of environmental policy is analysed. • Energy and food-feed markets influence the share of biofuels from micro-algae.
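The conditional structure described above can be sketched as a toy two-node network: a policy node feeding a market-share node, with the unconditional probability of the high-share outcome obtained by marginalizing over policy states. The 0.75 figure is the abstract's best-scenario probability; every other number is invented for illustration.

```python
# Toy two-node Bayesian network: Policy -> Share (illustrative numbers)
p_policy = {"strong": 0.4, "weak": 0.6}          # prior over policy outcomes
p_high_share = {"strong": 0.75, "weak": 0.10}    # P(share > 20% | policy)

# Marginal probability that microalgae biofuels exceed 20% of the market
marginal = sum(p_policy[s] * p_high_share[s] for s in p_policy)
print(marginal)  # 0.36
```

Expert elicitation in the study plays the role of the hand-written tables here: each conditional probability is supplied by the elicitation process rather than invented.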

  16. Constitutively expressed Protocadherin-α regulates the coalescence and elimination of homotypic olfactory axons through its cytoplasmic region

    Directory of Open Access Journals (Sweden)

    Sonoko Hasegawa

    2012-10-01

    Full Text Available Olfactory sensory neuron (OSN) axons coalesce into specific glomeruli in the olfactory bulb (OB) according to their odorant receptor (OR) expression. Several guidance molecules enhance the coalescence of homotypic OSN projections in an OR-specific and neural-activity-dependent manner. However, the mechanism by which homotypic OSN axons are organized into glomeruli remains unresolved. We previously reported that the clustered protocadherin-α (Pcdh-α) family of diverse cadherin-related molecules plays roles in the coalescence and elimination of homotypic OSN axons throughout development. Here we showed that the elimination of small ectopic homotypic glomeruli required the constitutive expression of a Pcdh-α isoform and Pcdh-α's cytoplasmic region, but not OR specificity or neural activity. These results suggest that Pcdh-α proteins provide a cytoplasmic signal to regulate repulsive activity for homotypic OSN axons independently of OR expression and neural activity. The counterbalancing effect of Pcdh-α proteins on the axonal coalescence mechanisms mediated by other olfactory guidance molecules indicates a possible mechanism for the organization of homotypic OSN axons into glomeruli during development.

  17. Coalescence of two equal cylinders: exact results for creeping viscous plane flow driven by capillarity

    International Nuclear Information System (INIS)

    Hopper, R.W.

    1984-01-01

    The coalescence of two equal viscous cylinders under the influence of capillarity is of interest in the theory of sintering. Although the flow in typical cylinder coalescence experiments is not planar, the plane-flow case is of general interest and is a good approximation in the early stage. An essentially exact analytic solution giving the shape as a function of time for slow plane flow is presented in simple closed form. 16 references, 2 figures, 1 table

  18. Seeded Bayesian Networks: Constructing genetic networks from microarray data

    Directory of Open Access Journals (Sweden)

    Quackenbush John

    2008-07-01

    Full Text Available Abstract Background DNA microarrays and other genomics-inspired technologies provide large datasets that often include hidden patterns of correlation between genes reflecting the complex processes that underlie cellular metabolism and physiology. The challenge in analyzing large-scale expression data has been to extract biologically meaningful inferences regarding these processes – often represented as networks – in an environment where the datasets are often imperfect and biological noise can obscure the actual signal. Although many techniques have been developed in an attempt to address these issues, to date their ability to extract meaningful and predictive network relationships has been limited. Here we describe a method that draws on prior information about gene-gene interactions to infer biologically relevant pathways from microarray data. Our approach consists of using preliminary networks derived from the literature and/or protein-protein interaction data as seeds for a Bayesian network analysis of microarray results. Results Through a bootstrap analysis of gene expression data derived from a number of leukemia studies, we demonstrate that seeded Bayesian Networks have the ability to identify high-confidence gene-gene interactions which can then be validated by comparison to other sources of pathway data. Conclusion The use of network seeds greatly improves the ability of Bayesian Network analysis to learn gene interaction networks from gene expression data. We demonstrate that the use of seeds derived from the biomedical literature or high-throughput protein-protein interaction data, or the combination, provides improvement over a standard Bayesian Network analysis, allowing networks involving dynamic processes to be deduced from the static snapshots of biological systems that represent the most common source of microarray data. Software implementing these methods has been included in the widely used TM4 microarray analysis package.

  19. Breakup and coalescence characteristics of a hollow cone swirling spray

    Science.gov (United States)

    Saha, Abhishek; Lee, Joshua D.; Basu, Saptarshi; Kumar, Ranganathan

    2012-12-01

    This paper deals with an experimental study of the breakup characteristics of water emanating from hollow cone hydraulic injector nozzles induced by pressure-swirling. The experiments were conducted using two nozzles with different orifice diameters (0.3 mm and 0.5 mm) and injection pressures (0.3-4 MPa), which correspond to Rep = 7000-26 000. Two types of laser diagnostic techniques were utilized, shadowgraph and phase Doppler particle anemometry, for a complete study of the atomization process. Measurements made in the spray in both the axial and radial directions indicate that both velocity and average droplet diameter profiles are highly dependent on the nozzle characteristics, Weber number and Reynolds number. The spatial variation of diameter and velocity arises principally due to primary breakup of liquid films and subsequent secondary breakup of large droplets due to aerodynamic shear. Downstream of the nozzle, coalescence of droplets due to collision was also found to be significant. Different types of liquid film breakup were considered and found to match well with the theory. Secondary breakup due to shear was also studied theoretically and compared to the experimental data. Coalescence probability at different axial and radial locations was computed to explain the experimental results. The spray is subdivided into three zones: near the nozzle, a zone consisting of film and ligament regimes, where primary breakup and some secondary breakup take place; a second zone where the secondary breakup process continues, but weakens, and the centrifugal dispersion becomes dominant; and a third zone away from the spray where coalescence is dominant. Each regime has been analyzed in detail, characterized by timescale and Weber number, and validated using experimental data.

  20. Insights into the phylogeny of Northern Hemisphere Armillaria: Neighbor-net and Bayesian analyses of translation elongation factor 1-α gene sequences.

    Science.gov (United States)

    Klopfenstein, Ned B; Stewart, Jane E; Ota, Yuko; Hanna, John W; Richardson, Bryce A; Ross-Davis, Amy L; Elías-Román, Rubén D; Korhonen, Kari; Keča, Nenad; Iturritxa, Eugenia; Alvarado-Rosales, Dionicio; Solheim, Halvor; Brazee, Nicholas J; Łakomy, Piotr; Cleary, Michelle R; Hasegawa, Eri; Kikuchi, Taisei; Garza-Ocañas, Fortunato; Tsopelas, Panaghiotis; Rigling, Daniel; Prospero, Simone; Tsykun, Tetyana; Bérubé, Jean A; Stefani, Franck O P; Jafarpour, Saeideh; Antonín, Vladimír; Tomšovský, Michal; McDonald, Geral I; Woodward, Stephen; Kim, Mee-Sook

    2017-01-01

    Armillaria possesses several intriguing characteristics that have inspired wide interest in understanding phylogenetic relationships within and among species of this genus. Nuclear ribosomal DNA sequence-based analyses of Armillaria provide only limited information for phylogenetic studies among widely divergent taxa. More recent studies have shown that translation elongation factor 1-α (tef1) sequences are highly informative for phylogenetic analysis of Armillaria species within diverse global regions. This study used Neighbor-net and coalescence-based Bayesian analyses to examine phylogenetic relationships of newly determined and existing tef1 sequences derived from diverse Armillaria species from across the Northern Hemisphere, with Southern Hemisphere Armillaria species included for reference. Based on the Bayesian analysis of tef1 sequences, Armillaria species from the Northern Hemisphere are generally contained within the following four superclades, which are named according to the specific epithet of the most frequently cited species within the superclade: (i) Socialis/Tabescens (exannulate) superclade including Eurasian A. ectypa, North American A. socialis (A. tabescens), and Eurasian A. socialis (A. tabescens) clades; (ii) Mellea superclade including undescribed annulate North American Armillaria sp. (Mexico) and four separate clades of A. mellea (Europe and Iran, eastern Asia, and two groups from North America); (iii) Gallica superclade including Armillaria Nag E (Japan), multiple clades of A. gallica (Asia and Europe), A. calvescens (eastern North America), A. cepistipes (North America), A. altimontana (western USA), A. nabsnona (North America and Japan), and at least two A. gallica clades (North America); and (iv) Solidipes/Ostoyae superclade including two A. solidipes/ostoyae clades (North America), A. gemina (eastern USA), A. solidipes/ostoyae (Eurasia), A. cepistipes (Europe and Japan), A. sinapina (North America and Japan), and A. borealis

  1. Optimal soil venting design using Bayesian Decision analysis

    OpenAIRE

    Kaluarachchi, J. J.; Wijedasa, A. H.

    1994-01-01

    Remediation of hydrocarbon-contaminated sites can be costly and the design process becomes complex in the presence of parameter uncertainty. Classical decision theory related to remediation design requires the parameter uncertainties to be stipulated in terms of statistical estimates based on site observations. In the absence of detailed data on parameter uncertainty, classical decision theory provides little contribution in designing a risk-based optimal design strategy. Bayesian decision th...

  2. Statistical analysis of modal parameters of a suspension bridge based on Bayesian spectral density approach and SHM data

    Science.gov (United States)

    Li, Zhijun; Feng, Maria Q.; Luo, Longxi; Feng, Dongming; Xu, Xiuli

    2018-01-01

    Considerable uncertainty in modal parameter estimation arises in structural health monitoring (SHM) practice in civil engineering because of environmental influences and modeling errors, and sound methodologies are needed to handle it. Bayesian inference can provide a promising and feasible identification solution for SHM, yet relatively little research has applied Bayesian spectral methods to modal identification with SHM data sets. To extract modal parameters from the large data sets collected by an SHM system, the Bayesian spectral density algorithm was applied to address the uncertainty of mode extraction from the output-only response of a long-span suspension bridge. The most probable values of the modal parameters and their uncertainties were estimated through Bayesian inference. Long-term variation and statistical analyses were performed using the sensor data sets collected from the SHM system of the suspension bridge over a one-year period. The t location-scale distribution was shown to be a better candidate function for the frequencies of the lower modes, whereas the Burr distribution provided the best fit to the higher modes, which are sensitive to temperature. In addition, wind-induced variation of the modal parameters was investigated: both the damping ratios and the modal forces increased during periods of typhoon excitation, and the modal damping ratios exhibited significant correlation with the spectral intensities of the corresponding modal forces.

  3. A Bayesian Approach to Person Fit Analysis in Item Response Theory Models. Research Report.

    Science.gov (United States)

    Glas, Cees A. W.; Meijer, Rob R.

    A Bayesian approach to the evaluation of person fit in item response theory (IRT) models is presented. In a posterior predictive check, the observed value on a discrepancy variable is positioned in its posterior distribution. In a Bayesian framework, a Markov Chain Monte Carlo procedure can be used to generate samples of the posterior distribution…
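    The posterior predictive check described above can be sketched for a toy binomial model; the Beta(1, 1) prior, the count discrepancy, and the data are illustrative assumptions, not the IRT setup of the report:

    ```python
    import random

    def posterior_predictive_pvalue(observed, n_trials, a=1.0, b=1.0, draws=5000):
        """Posterior predictive check for a binomial model with a Beta(a, b)
        prior: position the observed count within the distribution of
        replicated counts drawn from the posterior predictive."""
        random.seed(0)  # fixed seed so the sketch is reproducible
        extreme = 0
        for _ in range(draws):
            # Draw a success probability from the Beta posterior ...
            p = random.betavariate(a + observed, b + n_trials - observed)
            # ... then simulate a replicated data set of the same size.
            replicate = sum(random.random() < p for _ in range(n_trials))
            if replicate >= observed:
                extreme += 1
        return extreme / draws

    # Hypothetical person-fit style check: 18 correct responses out of 20.
    print(posterior_predictive_pvalue(18, 20))
    ```

    A posterior predictive p-value near 0 or 1 would flag the observed count as surprising under the model; values in the middle indicate adequate fit.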

  4. Gag induces the coalescence of clustered lipid rafts and tetraspanin-enriched microdomains at HIV-1 assembly sites on the plasma membrane.

    Science.gov (United States)

    Hogue, Ian B; Grover, Jonathan R; Soheilian, Ferri; Nagashima, Kunio; Ono, Akira

    2011-10-01

    The HIV-1 structural protein Gag associates with two types of plasma membrane microdomains, lipid rafts and tetraspanin-enriched microdomains (TEMs), both of which have been proposed to be platforms for HIV-1 assembly. However, a variety of studies have demonstrated that lipid rafts and TEMs are distinct microdomains in the absence of HIV-1 infection. To measure the impact of Gag on microdomain behaviors, we took advantage of two assays: an antibody-mediated copatching assay and a Förster resonance energy transfer (FRET) assay that measures the clustering of microdomain markers in live cells without antibody-mediated patching. We found that lipid rafts and TEMs copatched and clustered to a greater extent in the presence of membrane-bound Gag in both assays, suggesting that Gag induces the coalescence of lipid rafts and TEMs. Substitutions in membrane binding motifs of Gag revealed that, while Gag membrane binding is necessary to induce coalescence of lipid rafts and TEMs, either acylation of Gag or binding of phosphatidylinositol-(4,5)-bisphosphate is sufficient. Finally, a Gag derivative that is defective in inducing membrane curvature appeared less able to induce lipid raft and TEM coalescence. A higher-resolution analysis of assembly sites by correlative fluorescence and scanning electron microscopy showed that coalescence of clustered lipid rafts and TEMs occurs predominantly at completed cell surface virus-like particles, whereas a transmembrane raft marker protein appeared to associate with punctate Gag fluorescence even in the absence of cell surface particles. Together, these results suggest that different membrane microdomain components are recruited in a stepwise manner during assembly.

  5. Van der Waals Attraction and Coalescence of Aqueous Salt Nanodroplets

    Czech Academy of Sciences Publication Activity Database

    Jungwirth, Pavel; Buch, V.

    2003-01-01

    Roč. 68, č. 12 (2003), s. 2283-2291 ISSN 0010-0765 R&D Projects: GA MŠk LN00A032 Institutional research plan: CEZ:AV0Z4040901 Keywords : van der Waals interactions * aqueous droplets * coalescence Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 1.041, year: 2003

  6. A Bayesian account of quantum histories

    International Nuclear Information System (INIS)

    Marlow, Thomas

    2006-01-01

    We investigate whether quantum history theories can be consistent with Bayesian reasoning and whether such an analysis helps clarify the interpretation of such theories. First, we summarise and extend recent work categorising two different approaches to formalising multi-time measurements in quantum theory. The standard approach consists of describing an ordered series of measurements in terms of history propositions with non-additive 'probabilities.' The non-standard approach consists of defining multi-time measurements to consist of sets of exclusive and exhaustive history propositions and recovering the single-time exclusivity of results when discussing single-time history propositions. We analyse whether such history propositions can be consistent with Bayes' rule. We show that a certain class of histories is given a natural Bayesian interpretation, namely the linearly positive histories originally introduced by Goldstein and Page. Thus, we argue that this gives a certain amount of interpretational clarity to the non-standard approach. We also attempt a justification of our analysis using Cox's axioms of probability theory.

  7. GW151226: Observation of Gravitational Waves from a 22-Solar-Mass Binary Black Hole Coalescence

    OpenAIRE

    Abbott, B. P.; Abbott, R.; Adhikari, R. X.; Anderson, S. B.; Arai, K.; Araya, M. C.; Barayoga, J. C.; Barish, B. C.; Berger, B. K.; Billingsley, G.; Blackburn, J. K.; Bork, R.; Brooks, A. F.; Brunett, S.; Cahillane, C.

    2016-01-01

    We report the observation of a gravitational-wave signal produced by the coalescence of two stellar-mass black holes. The signal, GW151226, was observed by the twin detectors of the Laser Interferometer Gravitational-Wave Observatory (LIGO) on December 26, 2015 at 03:38:53 UTC. The signal was initially identified within 70 s by an online matched-filter search targeting binary coalescences. Subsequent off-line analyses recovered GW151226 with a network signal-to-noise ratio of 13 and a signifi...

  8. Continuous observation of cavity growth and coalescence by creep-fatigue tests in SEM

    International Nuclear Information System (INIS)

    Arai, Masayuki; Ogata, Takashi; Nitta, Akito

    1995-01-01

    Structural components operating at high temperatures in power plants are subjected to the interaction of thermal fatigue and creep, which results in creep-fatigue damage. In evaluating the life of those components, it is important to understand microscopic damage evolution under creep-fatigue conditions. In this study, static creep and creep-fatigue tests with tensile holdtime were conducted on SUS304 stainless steel using a high-temperature fatigue machine combined with a scanning electron microscope (SEM), and cavity growth and coalescence behaviors on surface grain boundaries were observed continuously by the SEM. Quantitative analysis of creep cavity growth based on the observations was made for comparison with theoretical growth models. As a result, it was found that grain boundary cavities nucleate at random and grow preferentially on grain boundaries oriented almost normal to the stress axis. Under the creep condition, the cavities grow monotonically on grain boundaries while retaining an elliptical shape. Under the creep-fatigue condition, on the other hand, the cavities grow under the influence of the local strain distribution around the grain boundary due to cyclic loading, and microcracks of one grain-boundary length are formed by coalescence of the cavities. Also, cavity nucleation and growth rates for creep-fatigue were more rapid than those for static creep, and the constrained cavity growth model agreed well with the experimental data for creep. (author)

  9. Basics of Bayesian methods.

    Science.gov (United States)

    Ghosh, Sujit K

    2010-01-01

    Bayesian methods are rapidly becoming popular tools for making statistical inference in various fields of science including biology, engineering, finance, and genetics. One of the key aspects of Bayesian inferential method is its logical foundation that provides a coherent framework to utilize not only empirical but also scientific information available to a researcher. Prior knowledge arising from scientific background, expert judgment, or previously collected data is used to build a prior distribution which is then combined with current data via the likelihood function to characterize the current state of knowledge using the so-called posterior distribution. Bayesian methods allow the use of models of complex physical phenomena that were previously too difficult to estimate (e.g., using asymptotic approximations). Bayesian methods offer a means of more fully understanding issues that are central to many practical problems by allowing researchers to build integrated models based on hierarchical conditional distributions that can be estimated even with limited amounts of data. Furthermore, advances in numerical integration methods, particularly those based on Monte Carlo methods, have made it possible to compute the optimal Bayes estimators. However, there is a reasonably wide gap between the background of the empirically trained scientists and the full weight of Bayesian statistical inference. Hence, one of the goals of this chapter is to bridge the gap by offering elementary to advanced concepts that emphasize linkages between standard approaches and full probability modeling via Bayesian methods.
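    The prior-times-likelihood mechanics described here can be illustrated with a simple grid approximation for a binomial proportion; the data counts and the Beta(2, 2) prior are hypothetical choices for the sketch:

    ```python
    def grid_posterior(successes, failures, prior, grid_size=1001):
        """Discretize p on [0, 1], multiply the prior density by the
        binomial likelihood at each grid point, and normalize to obtain
        a discrete approximation of the posterior."""
        grid = [i / (grid_size - 1) for i in range(grid_size)]
        unnorm = [prior(p) * (p ** successes) * ((1 - p) ** failures) for p in grid]
        total = sum(unnorm)
        return grid, [w / total for w in unnorm]

    # Hypothetical data: 7 successes and 3 failures, with a Beta(2, 2) prior.
    beta22 = lambda p: p * (1 - p)  # Beta(2, 2) density, up to a constant
    grid, post = grid_posterior(7, 3, beta22)
    posterior_mean = sum(p * w for p, w in zip(grid, post))
    print(f"posterior mean: {posterior_mean:.3f}")  # conjugate answer is 9/14
    ```

    Because the Beta prior is conjugate to the binomial likelihood, the grid result can be checked against the exact Beta(9, 5) posterior.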

  10. Bayesian Reliability Analysis of Non-Stationarity in Multi-agent Systems

    Directory of Open Access Journals (Sweden)

    TONT Gabriela

    2013-05-01

    Full Text Available Bayesian methods provide information about the meaningful parameters in a statistical analysis by combining the prior and sampling distributions to form the posterior distribution of the parameters; the desired inferences are obtained from this joint posterior. An estimation strategy for hierarchical models, where the resulting joint distribution of the associated model parameters cannot be evaluated analytically, is to use sampling algorithms, known as Markov Chain Monte Carlo (MCMC) methods, from which approximate solutions can be obtained. Both serial and parallel configurations of subcomponents are permitted. The capability of the time-dependent method to describe a multi-state system is demonstrated via a case study assessing the operational situation of the studied system, which also establishes the rationality and validity of the presented model. The effect of randomness of the structural parameters is also examined.

  11. Bayesian nonparametric estimation of continuous monotone functions with applications to dose-response analysis.

    Science.gov (United States)

    Bornkamp, Björn; Ickstadt, Katja

    2009-03-01

    In this article, we consider monotone nonparametric regression in a Bayesian framework. The monotone function is modeled as a mixture of shifted and scaled parametric probability distribution functions, and a general random probability measure is assumed as the prior for the mixing distribution. We investigate the choice of the underlying parametric distribution function and find that the two-sided power distribution function is well suited both from a computational and mathematical point of view. The model is motivated by traditional nonlinear models for dose-response analysis, and provides possibilities to elicitate informative prior distributions on different aspects of the curve. The method is compared with other recent approaches to monotone nonparametric regression in a simulation study and is illustrated on a data set from dose-response analysis.
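    The core idea, a monotone curve built as a mixture of shifted and scaled distribution functions, can be sketched with logistic CDFs for simplicity (the article itself favours the two-sided power distribution); all weights and parameters below are invented:

    ```python
    import math

    def monotone_curve(x, weights, locs, scales):
        """Evaluate a monotone dose-response sketch: a convex combination
        of shifted/scaled logistic CDFs. Positive weights guarantee the
        mixture is nondecreasing and bounded in (0, 1)."""
        total = sum(weights)
        return sum(w / (1.0 + math.exp(-(x - m) / s))
                   for w, m, s in zip(weights, locs, scales)) / total

    # Invented two-component mixture: a steep rise near dose 1 and a
    # gentler rise near dose 3.
    ys = [monotone_curve(d / 2.0, [0.3, 0.7], [1.0, 3.0], [0.5, 0.8])
          for d in range(0, 13)]
    print([round(y, 3) for y in ys])
    ```

    In the Bayesian version, the weights and the location/scale parameters would receive a prior (e.g., via a random probability measure on the mixing distribution) rather than being fixed as here.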

  12. Bayesian Analysis for EMP Survival Probability of Solid State Relay

    International Nuclear Information System (INIS)

    Sun Beiyun; Zhou Hui; Cheng Xiangyue; Mao Congguang

    2009-01-01

    The principle of estimating the parameter p of a binomial distribution by the Bayesian method, together with several noninformative priors, is introduced. The survival probability of a DC solid state relay under current injection at a certain amplitude is obtained by this method. (authors)
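    The binomial-parameter update the abstract refers to can be sketched with a conjugate Beta prior; the pass/fail counts below are hypothetical, and Beta(0.5, 0.5) is the Jeffreys choice among the noninformative priors mentioned:

    ```python
    from math import sqrt

    def beta_binomial_posterior(successes, failures, a=0.5, b=0.5):
        """Update a Beta(a, b) prior with binomial data.

        The default a = b = 0.5 is the Jeffreys (noninformative) prior;
        the posterior is Beta(a + successes, b + failures)."""
        a_post = a + successes
        b_post = b + failures
        mean = a_post / (a_post + b_post)
        var = (a_post * b_post) / ((a_post + b_post) ** 2 * (a_post + b_post + 1))
        return mean, sqrt(var)

    # Hypothetical test outcome: 28 relays survive, 2 fail at a given amplitude.
    mean, sd = beta_binomial_posterior(28, 2)
    print(f"posterior survival probability: {mean:.3f} +/- {sd:.3f}")
    ```

    Swapping in a uniform Beta(1, 1) or the Haldane-type Beta(0, 0) prior shows how sensitive the estimate is to the noninformative choice when failures are few.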

  13. Bayesian Analysis for Risk Assessment of Selected Medical Events in Support of the Integrated Medical Model Effort

    Science.gov (United States)

    Gilkey, Kelly M.; Myers, Jerry G.; McRae, Michael P.; Griffin, Elise A.; Kallrui, Aditya S.

    2012-01-01

    The Exploration Medical Capability project is creating a catalog of risk assessments using the Integrated Medical Model (IMM). The IMM is a software-based system intended to assist mission planners in preparing for spaceflight missions by helping them to make informed decisions about medical preparations and supplies needed for combating and treating various medical events using Probabilistic Risk Assessment. The objective is to use statistical analyses to inform the IMM decision tool with estimated probabilities of medical events occurring during an exploration mission. Because data regarding astronaut health are limited, Bayesian statistical analysis is used. Bayesian inference combines prior knowledge, such as data from the general U.S. population, the U.S. Submarine Force, or the analog astronaut population located at the NASA Johnson Space Center, with observed data for the medical condition of interest. The posterior results reflect the best evidence for specific medical events occurring in flight. Bayes theorem provides a formal mechanism for combining available observed data with data from similar studies to support the quantification process. The IMM team performed Bayesian updates on the following medical events: angina, appendicitis, atrial fibrillation, atrial flutter, dental abscess, dental caries, dental periodontal disease, gallstone disease, herpes zoster, renal stones, seizure, and stroke.
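    The kind of Bayesian update described, combining a prior built from an analog population with observed data, can be sketched as a conjugate gamma-Poisson update for an event rate; all counts below are invented for illustration, not IMM data:

    ```python
    def gamma_poisson_update(prior_shape, prior_rate, events, exposure):
        """Gamma(shape, rate) prior for an event rate per person-year,
        updated with `events` observed over `exposure` person-years
        under a Poisson likelihood."""
        shape = prior_shape + events
        rate = prior_rate + exposure
        return shape / rate, shape, rate  # posterior mean and parameters

    # Invented numbers: analog-population prior equivalent to 2 events in
    # 1000 person-years, then 1 event observed in 150 person-years of flight.
    post_mean, shape, rate = gamma_poisson_update(2.0, 1000.0, events=1, exposure=150.0)
    print(f"posterior event rate: {post_mean:.5f} per person-year")
    ```

    The prior acts as pseudo-data (2 events, 1000 person-years), so sparse flight observations shift the estimate only modestly, which is the point of borrowing strength from analog populations.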

  14. Bayesian computation with R

    CERN Document Server

    Albert, Jim

    2009-01-01

    There has been dramatic growth in the development and application of Bayesian inferential methods. Some of this growth is due to the availability of powerful simulation-based algorithms for summarizing posterior distributions. There has also been growing interest in the use of the system R for statistical analyses. R's open source nature, free availability, and large number of contributed packages have made R the software of choice for many statisticians in education and industry. Bayesian Computation with R introduces Bayesian modeling through computation using the R language. The earl

  15. General and Local: Averaged k-Dependence Bayesian Classifiers

    Directory of Open Access Journals (Sweden)

    Limin Wang

    2015-06-01

    Full Text Available The inference of a general Bayesian network has been shown to be an NP-hard problem, even for approximate solutions. Although the k-dependence Bayesian (KDB) classifier can be constructed at arbitrary points (values of k) along the attribute dependence spectrum, it cannot identify changes in interdependencies when attributes take different values. Local KDB, which learns in the framework of KDB, is proposed in this study to describe the local dependencies implicated in each test instance. Based on the analysis of functional dependencies, substitution-elimination resolution, a new type of semi-naive Bayesian operation, is proposed to substitute or eliminate generalization to achieve accurate estimation of the conditional probability distribution while reducing computational complexity. The final classifier, the averaged k-dependence Bayesian (AKDB) classifier, averages the output of KDB and local KDB. Experimental results on the repository of machine learning databases from the University of California Irvine (UCI) showed that AKDB has significant advantages in zero-one loss and bias relative to naive Bayes (NB), tree-augmented naive Bayes (TAN), averaged one-dependence estimators (AODE), and KDB. Moreover, KDB and local KDB show mutually complementary characteristics with respect to variance.

  16. Void coalescence mechanism for combined tension and large amplitude cyclic shearing

    DEFF Research Database (Denmark)

    Nielsen, Kim Lau; Andersen, Rasmus Grau; Tvergaard, Viggo

    2017-01-01

    Void coalescence at severe shear deformation has been studied intensively under monotonic loading conditions, and the sequence of micro-mechanisms that governs failure has been demonstrated to involve collapse, rotation, and elongation of existing voids. Under intense shearing, the voids are flat...

  17. Predicting Catastrophic Phase Inversion on the Basis of Droplet Coalescence Kinetics

    NARCIS (Netherlands)

    Vaessen, G.E.J.; Visschers, M.; Stein, H.N.

    1996-01-01

    A predictive model for catastrophic phase inversion, based on the kinetics of droplet breakup and coalescence, is presented here. Two inversion mechanisms can be distinguished, depending on the direction of the phase inversion process. With the surfactant predominantly present in the dispersed

  18. Scalable Bayesian nonparametric measures for exploring pairwise dependence via Dirichlet Process Mixtures.

    Science.gov (United States)

    Filippi, Sarah; Holmes, Chris C; Nieto-Barajas, Luis E

    2016-11-16

    In this article we propose novel Bayesian nonparametric methods using Dirichlet Process Mixture (DPM) models for detecting pairwise dependence between random variables while accounting for uncertainty in the form of the underlying distributions. A key criterion is that the procedures should scale to large data sets. In this regard we find that the formal calculation of the Bayes factor for a dependent-vs.-independent DPM joint probability measure is not computationally feasible. To address this we present Bayesian diagnostic measures for characterising evidence against a "null model" of pairwise independence. In simulation studies, as well as in a real data analysis, we show that our approach provides a useful tool for the exploratory nonparametric Bayesian analysis of large multivariate data sets.

  19. The Bayesian Score Statistic

    NARCIS (Netherlands)

    Kleibergen, F.R.; Kleijn, R.; Paap, R.

    2000-01-01

    We propose a novel Bayesian test under a (noninformative) Jeffreys' prior specification. We check whether the fixed scalar value of the so-called Bayesian Score Statistic (BSS) under the null hypothesis is a plausible realization from its known and standardized distribution under the alternative. Unlike

  20. Molecular dynamics of coalescence and collisions of silver nanoparticles

    International Nuclear Information System (INIS)

    Guevara-Chapa, Enrique; Mejía-Rosales, Sergio

    2014-01-01

    We study how different relative orientations and impact velocities in the collision of two silver nanoparticles affect the first stages of the formation of a new, larger nanoparticle. To do this, we implemented a set of molecular dynamics simulations in the NVE ensemble on pairs of silver icosahedral nanoparticles at several relative orientations, which allowed us to follow the dynamics of the first nanoseconds of the coalescence process. Using bond-angle analysis, we found that the initial relative orientation of the twin planes plays a critical role in the final stability of the resulting particle, and in the details of the dynamics itself. When the original particles have their closest twins aligned with each other, the formed nanoparticle will likely stabilize into a particle with a defined center and a low surface-to-volume ratio, while nanoparticles with misaligned twins will promote the formation of highly defective particles with high inner energy

  1. Molecular dynamics of coalescence and collisions of silver nanoparticles

    Energy Technology Data Exchange (ETDEWEB)

    Guevara-Chapa, Enrique, E-mail: enrique_guevara@hotmail.com [Universidad Autónoma de Nuevo León, Facultad de Ciencias Físico Matemáticas (Mexico); Mejía-Rosales, Sergio [Universidad Autónoma de Nuevo León, Center for Innovation, Research and Development in Engineering and Technology (CIIDIT), and CICFIM-Facultad de Ciencias Físico Matemáticas (Mexico)

    2014-12-15

    We study how different relative orientations and impact velocities in the collision of two silver nanoparticles affect the first stages of the formation of a new, larger nanoparticle. To do this, we implemented a set of molecular dynamics simulations in the NVE ensemble on pairs of silver icosahedral nanoparticles at several relative orientations, which allowed us to follow the dynamics of the first nanoseconds of the coalescence process. Using bond-angle analysis, we found that the initial relative orientation of the twin planes plays a critical role in the final stability of the resulting particle, and in the details of the dynamics itself. When the original particles have their closest twins aligned with each other, the formed nanoparticle will likely stabilize into a particle with a defined center and a low surface-to-volume ratio, while nanoparticles with misaligned twins will promote the formation of highly defective particles with high inner energy.

  2. Bayesian methods for proteomic biomarker development

    Directory of Open Access Journals (Sweden)

    Belinda Hernández

    2015-12-01

    In this review we provide an introduction to Bayesian inference and demonstrate some of the advantages of using a Bayesian framework. We summarize how Bayesian methods have been used previously in proteomics and other areas of bioinformatics. Finally, we describe some popular and emerging Bayesian models from the statistical literature and provide a worked tutorial including code snippets to show how these methods may be applied for the evaluation of proteomic biomarkers.

  3. Bayesian inference with ecological applications

    CERN Document Server

    Link, William A

    2009-01-01

    This text is written to provide a mathematically sound but accessible and engaging introduction to Bayesian inference specifically for environmental scientists, ecologists, and wildlife biologists. It emphasizes the power and usefulness of Bayesian methods in an ecological context. The advent of fast personal computers and easily available software has simplified the use of Bayesian and hierarchical models. One obstacle remains for ecologists and wildlife biologists, namely the near absence of Bayesian texts written specifically for them. The book includes many relevant examples, is supported by software and examples on a companion website, and will become an essential grounding in this approach for students and research ecologists. Engagingly written text specifically designed to demystify a complex subject. Examples drawn from ecology and wildlife research. An essential grounding for graduate and research ecologists in the increasingly prevalent Bayesian approach to inference. Companion website with analyt...

  4. Development and validation of models for bubble coalescence and breakup. Final report

    International Nuclear Information System (INIS)

    Liao, Y.; Lucas, D.

    2013-02-01

    A new generalized model for bubble coalescence and breakup has been developed. It is based on physical considerations and takes into account the various mechanisms that can lead to bubble coalescence and breakup. First, in a detailed literature review, the available models were compiled and analyzed. It turned out that many of them show contradictory behaviour, and none allows the prediction of the evolution of bubble size distributions along a pipe flow for a wide range of combinations of gas and liquid flow rates. The new model has been extensively studied in a simplified Test-Solver. Although this does not cover all details of a developing flow along the pipe, it allows, in contrast to a CFD code, a large number of variational calculations to investigate the influence of individual quantities and models. Coalescence and breakup cannot be considered separately from other phenomena and the models that reflect them: there are close interactions with the turbulence of the liquid phase and with the momentum exchange between phases. Since the dissipation rate of turbulent kinetic energy is a direct input parameter of the new model, the turbulence modelling has been studied very carefully. To validate the model, a dedicated experimental series for air-water flows was used, conducted at the TOPFLOW facility in an 8-meter-long DN200 pipe. The data are of high quality and were produced within the TOPFLOW-II project; the test series was designed to provide a basis for the work presented here. Prediction of the evolution of the bubble size distribution along the pipe was improved significantly in comparison with the previous standard models for bubble coalescence and breakup implemented in CFX, although some quantitative discrepancies remain. The full model equations, as well as an implementation as "User-FORTRAN" in CFX, are available and can be used for further work on the simulation of poly-disperse bubbly flows.

  5. The application of bayesian statistic in data fit processing

    International Nuclear Information System (INIS)

    Guan Xingyin; Li Zhenfu; Song Zhaohui

    2010-01-01

    The rationale and the shortcomings of the least-squares fitting commonly used in data processing are analyzed, and the theory and common methods for applying Bayesian statistics to data processing are presented in detail. As the analysis shows, the Bayesian approach avoids the restrictive hypotheses that least-squares fitting imposes in data processing, and its results are more scientific and more easily understood; it may therefore replace least-squares fitting in data processing. (authors)
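    The contrast drawn between least-squares fitting and the Bayesian approach can be illustrated with a conjugate normal-normal update of a mean, where prior information that least squares ignores pulls the estimate; the data and prior values below are hypothetical:

    ```python
    def normal_posterior(data, prior_mean, prior_var, noise_var):
        """Conjugate normal-normal update: the posterior mean is a
        precision-weighted blend of the prior mean and the sample mean."""
        n = len(data)
        sample_mean = sum(data) / n
        prec = 1.0 / prior_var + n / noise_var  # posterior precision
        post_mean = (prior_mean / prior_var + n * sample_mean / noise_var) / prec
        return post_mean, 1.0 / prec

    # Hypothetical measurements with known noise variance 1.0.
    data = [2.1, 1.9, 2.4, 2.0]
    post_mean, post_var = normal_posterior(data, prior_mean=0.0, prior_var=4.0,
                                           noise_var=1.0)
    ls_estimate = sum(data) / len(data)  # the least-squares answer ignores the prior
    print(post_mean, post_var, ls_estimate)
    ```

    With more data the likelihood term dominates and the two estimates converge, which is why the difference matters most for small samples.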

  6. Wavelet-Based Bayesian Methods for Image Analysis and Automatic Target Recognition

    National Research Council Canada - National Science Library

    Nowak, Robert

    2001-01-01

    .... We have developed two new techniques. First, we have developed a wavelet-based approach to image restoration and deconvolution problems using Bayesian image models and an alternating-maximization method...

  7. On Bayesian System Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen Ringi, M

    1995-05-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non identical environments. 85 refs.

  8. On Bayesian System Reliability Analysis

    International Nuclear Information System (INIS)

    Soerensen Ringi, M.

    1995-01-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non identical environments. 85 refs

  9. Approximate Bayesian computation.

    Directory of Open Access Journals (Sweden)

    Mikael Sunnåker

    Full Text Available Approximate Bayesian computation (ABC) constitutes a class of computational methods rooted in Bayesian statistics. In all model-based statistical inference, the likelihood function is of central importance, since it expresses the probability of the observed data under a particular statistical model, and thus quantifies the support data lend to particular values of parameters and to choices among different models. For simple models, an analytical formula for the likelihood function can typically be derived. However, for more complex models, an analytical formula might be elusive or the likelihood function might be computationally very costly to evaluate. ABC methods bypass the evaluation of the likelihood function. In this way, ABC methods widen the realm of models for which statistical inference can be considered. ABC methods are mathematically well-founded, but they inevitably make assumptions and approximations whose impact needs to be carefully assessed. Furthermore, the wider application domain of ABC exacerbates the challenges of parameter estimation and model selection. ABC has rapidly gained popularity over the last years, in particular for the analysis of complex problems arising in biological sciences (e.g., in population genetics, ecology, epidemiology, and systems biology).
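    The likelihood-free idea at the core of ABC can be sketched as a rejection sampler; the toy model (a normal with unknown mean), the sample-mean summary statistic, and the tolerance are illustrative choices, not from the article:

    ```python
    import random

    def abc_rejection(observed_stat, n_obs, prior_sample, simulate, tol,
                      n_draws=20000):
        """Rejection ABC: keep parameter draws whose simulated summary
        statistic lands within `tol` of the observed one, bypassing any
        likelihood evaluation."""
        accepted = []
        for _ in range(n_draws):
            theta = prior_sample()
            if abs(simulate(theta, n_obs) - observed_stat) < tol:
                accepted.append(theta)
        return accepted

    random.seed(1)
    # Toy model: data are N(theta, 1); the summary statistic is the sample mean.
    prior = lambda: random.uniform(-5.0, 5.0)
    simulate = lambda theta, n: sum(random.gauss(theta, 1.0) for _ in range(n)) / n
    posterior = abc_rejection(observed_stat=1.3, n_obs=30, prior_sample=prior,
                              simulate=simulate, tol=0.2)
    print(len(posterior), sum(posterior) / len(posterior))
    ```

    Shrinking `tol` sharpens the approximation to the true posterior at the cost of fewer accepted draws, which is the basic accuracy/efficiency trade-off the abstract alludes to.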

  10. How to become a Bayesian in eight easy steps : An annotated reading list

    NARCIS (Netherlands)

    Etz, A.; Gronau, Q.F.; Dablander, F.; Edelsbrunner, P.A.; Baribault, B.

    In this guide, we present a reading list to serve as a concise introduction to Bayesian data analysis. The introduction is geared toward reviewers, editors, and interested researchers who are new to Bayesian statistics. We provide commentary for eight recommended sources, which together cover the

  11. Bayesian disease mapping: hierarchical modeling in spatial epidemiology

    National Research Council Canada - National Science Library

    Lawson, Andrew

    2013-01-01

    Since the publication of the first edition, many new Bayesian tools and methods have been developed for space-time data analysis, the predictive modeling of health outcomes, and other spatial biostatistical areas...

  12. Coalescent-based genome analyses resolve the early branches of the Euarchontoglires.

    Directory of Open Access Journals (Sweden)

    Vikas Kumar

    Full Text Available Despite numerous large-scale phylogenomic studies, certain parts of the mammalian tree are extraordinarily difficult to resolve. We used the coding regions from 19 completely sequenced genomes to study the relationships within the super-clade Euarchontoglires (Primates, Rodentia, Lagomorpha, Dermoptera and Scandentia) because the placement of Scandentia within this clade is controversial. The difficulty in resolving this issue is due to the short time spans between the early divergences of Euarchontoglires, which may cause incongruent gene trees. The conflict in the data can be depicted by network analyses, and the contentious relationships are best reconstructed by coalescent-based analyses. This method is expected to be superior to analyses of concatenated data in reconstructing a species tree from numerous gene trees. The total concatenated dataset used to study the relationships in this group comprises 5,875 protein-coding genes (9,799,170 nucleotides) from all orders except Dermoptera (flying lemurs). Reconstruction of the species tree from 1,006 gene trees using coalescent models placed Scandentia as sister group to the primates, which is in agreement with maximum likelihood analyses of concatenated nucleotide sequence data. Additionally, both analytical approaches favoured the tarsier as sister taxon to Anthropoidea, thus belonging to the Haplorrhine clade. When divergence times are short, such as in radiations over periods of a few million years, even genome-scale analyses struggle to resolve phylogenetic relationships. On these short branches, processes such as incomplete lineage sorting and possibly hybridization occur and make it preferable to base phylogenomic analyses on coalescent methods.
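    The coalescent process underlying such analyses is easy to simulate. The toy sketch below (not the paper's pipeline) draws waiting times under the Kingman coalescent; the short waiting times when many lineages remain illustrate why rapid radiations produce incongruent gene trees:

```python
import random

# Sketch: coalescence waiting times for k lineages under the Kingman
# coalescent, in units of N_e generations. While i lineages remain,
# the waiting time to the next coalescence is Exponential with rate
# i*(i-1)/2, so early events (large i) happen on very short branches.

random.seed(0)

def coalescent_times(k):
    """Return the k-1 waiting times between successive coalescences."""
    times = []
    for i in range(k, 1, -1):
        rate = i * (i - 1) / 2.0
        times.append(random.expovariate(rate))
    return times

waits = coalescent_times(5)
print(len(waits))     # 4 coalescent events reduce 5 lineages to 1
print(sum(waits))     # total (random) depth of this gene tree
```

    Repeating the simulation gives a different gene tree each time; when species divergences fall inside these short intervals, gene trees disagree with the species tree, which is the incomplete lineage sorting the abstract describes.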

  13. Action-derived molecular dynamics simulations for the migration and coalescence of vacancies in graphene and carbon nanotubes.

    Science.gov (United States)

    Lee, Alex Taekyung; Ryu, Byungki; Lee, In-Ho; Chang, K J

    2014-03-19

    We report the results of action-derived molecular dynamics simulations for the migration and coalescence processes of monovacancies in graphene and carbon nanotubes with different chiralities. In carbon nanotubes, the migration pathways and barriers of a monovacancy depend on the tube chirality, while there is no preferential pathway in graphene due to the lattice symmetry and the absence of the curvature effect. The probable pathway changes from the axial to circumferential direction as the chirality varies from armchair to zigzag. The chirality dependence is attributed to the preferential orientation of the reconstructed bond formed around each vacancy site. It is energetically more favourable for two monovacancies to coalesce into a divacancy via alternative movements rather than simultaneous movements. The energy barriers for coalescence are generally determined by the migration barrier for the monovacancy, although there are some variations due to interactions between two diffusing vacancies. In graphene and armchair nanotubes, two monovacancies prefer to migrate along different zigzag atomic chains rather than a single atomic chain connecting these vacancies. On the other hand, in zigzag tubes, the energy barrier for coalescence increases significantly unless monovacancies lie on the same circumference.

  14. Towards a Fuzzy Bayesian Network Based Approach for Safety Risk Analysis of Tunnel-Induced Pipeline Damage.

    Science.gov (United States)

    Zhang, Limao; Wu, Xianguo; Qin, Yawei; Skibniewski, Miroslaw J; Liu, Wenli

    2016-02-01

    Tunneling excavation is bound to produce significant disturbances to surrounding environments, and the tunnel-induced damage to adjacent underground buried pipelines is of considerable importance for geotechnical practice. A fuzzy Bayesian networks (FBNs) based approach for safety risk analysis is developed in this article with detailed step-by-step procedures, consisting of risk mechanism analysis, the FBN model establishment, fuzzification, FBN-based inference, defuzzification, and decision making. In accordance with the failure mechanism analysis, a tunnel-induced pipeline damage model is proposed to reveal the cause-effect relationships between the pipeline damage and its influential variables. In terms of the fuzzification process, an expert confidence indicator is proposed to reveal the reliability of the data when determining the fuzzy probability of occurrence of basic events, with both the judgment ability level and the subjectivity reliability level taken into account. By means of the fuzzy Bayesian inference, the approach proposed in this article is capable of calculating the probability distribution of potential safety risks and identifying the most likely potential causes of accidents under both prior knowledge and given evidence circumstances. A case concerning the safety analysis of underground buried pipelines adjacent to the construction of the Wuhan Yangtze River Tunnel is presented. The results demonstrate the feasibility of the proposed FBN approach and its application potential. The proposed approach can be used as a decision tool to provide support for safety assurance and management in tunnel construction, and thus increase the likelihood of a successful project in a complex project environment. © 2015 Society for Risk Analysis.
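    The fuzzification-then-inference idea can be sketched in miniature. The two-node network, triangular fuzzy prior, and conditional probabilities below are invented for illustration and are not the paper's model:

```python
# Toy sketch of the fuzzy-Bayesian idea: an expert states a triangular
# fuzzy probability for a basic event, we defuzzify it (centroid), then
# run ordinary Bayesian inference in a hypothetical two-node network
# Disturbance -> PipelineDamage. All numbers are assumptions.

def defuzzify_triangular(low, mode, high):
    """Centroid of a triangular fuzzy number."""
    return (low + mode + high) / 3.0

p_disturb = defuzzify_triangular(0.2, 0.3, 0.4)   # fuzzy expert prior
p_damage_given = {True: 0.6, False: 0.05}         # assumed CPT

# Marginal probability of damage, then posterior cause probability
# given the evidence that damage occurred (diagnostic inference).
p_damage = (p_damage_given[True] * p_disturb
            + p_damage_given[False] * (1 - p_disturb))
posterior = p_damage_given[True] * p_disturb / p_damage
print(round(p_damage, 3), round(posterior, 3))    # 0.215 0.837
```

    The full FBN replaces the single cause with a network of influential variables and uses the same two directions of reasoning: predictive (prior to risk) and diagnostic (evidence to most likely cause).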

  15. A Bayesian framework for risk perception

    NARCIS (Netherlands)

    van Erp, H.R.N.

    2017-01-01

    We present here a Bayesian framework of risk perception. This framework encompasses plausibility judgments, decision making, and question asking. Plausibility judgments are modeled by way of Bayesian probability theory, decision making is modeled by way of a Bayesian decision theory, and relevancy

  16. A Bayesian ensemble of sensitivity measures for severe accident modeling

    Energy Technology Data Exchange (ETDEWEB)

    Hoseyni, Seyed Mohsen [Department of Basic Sciences, East Tehran Branch, Islamic Azad University, Tehran (Iran, Islamic Republic of); Di Maio, Francesco, E-mail: francesco.dimaio@polimi.it [Energy Department, Politecnico di Milano, Via La Masa 34, 20156 Milano (Italy); Vagnoli, Matteo [Energy Department, Politecnico di Milano, Via La Masa 34, 20156 Milano (Italy); Zio, Enrico [Energy Department, Politecnico di Milano, Via La Masa 34, 20156 Milano (Italy); Chair on System Science and Energetic Challenge, Fondation EDF – Electricite de France Ecole Centrale, Paris, and Supelec, Paris (France); Pourgol-Mohammad, Mohammad [Department of Mechanical Engineering, Sahand University of Technology, Tabriz (Iran, Islamic Republic of)

    2015-12-15

    Highlights: • We propose a sensitivity analysis (SA) method based on a Bayesian updating scheme. • The Bayesian updating scheme adjusts an ensemble of sensitivity measures. • Bootstrap replicates of a severe accident code output are fed to the Bayesian scheme. • The MELCOR code simulates the fission products release of the LOFT LP-FP-2 experiment. • Results are compared with those of traditional SA methods. - Abstract: In this work, a sensitivity analysis framework is presented to identify the relevant input variables of a severe accident code, based on an incremental Bayesian ensemble updating method. The proposed methodology entails: (i) the propagation of the uncertainty in the input variables through the severe accident code; (ii) the collection of bootstrap replicates of the input and output of a limited number of simulations for building a set of finite mixture models (FMMs) for approximating the probability density function (pdf) of the severe accident code output of the replicates; (iii) for each FMM, the calculation of an ensemble of sensitivity measures (i.e., input saliency, Hellinger distance and Kullback–Leibler divergence) and their updating when a new piece of evidence arrives, by a Bayesian scheme based on the Bradley–Terry model, for ranking the most relevant input model variables. An application is given with respect to a limited number of simulations of a MELCOR severe accident model describing the fission products release in the LP-FP-2 experiment of the loss of fluid test (LOFT) facility, which is a scaled-down facility of a pressurized water reactor (PWR).
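    Two of the three ensemble members, the Hellinger distance and the Kullback–Leibler divergence, are straightforward to compute between discretized output pdfs; the sketch below uses invented three-bin distributions (input saliency, being model-specific, is omitted):

```python
import math

# Sketch: two of the ensemble's sensitivity measures between two
# discretized output pdfs p and q (e.g. code output with an input
# fixed vs. varying). The example distributions are hypothetical.

def hellinger(p, q):
    return math.sqrt(0.5 * sum((math.sqrt(a) - math.sqrt(b)) ** 2
                               for a, b in zip(p, q)))

def kl_divergence(p, q):
    # Conventionally 0*log(0/b) = 0, hence the a > 0 guard.
    return sum(a * math.log(a / b) for a, b in zip(p, q) if a > 0)

p = [0.1, 0.4, 0.5]
q = [0.2, 0.3, 0.5]
print(round(hellinger(p, q), 4))       # 0.1103
print(round(kl_divergence(p, q), 4))   # 0.0458
```

    The Hellinger distance is a bounded, symmetric metric, while the KL divergence is unbounded and asymmetric; combining them (with input saliency) in an ensemble hedges against any single measure's blind spots, which is what the Bayesian updating scheme then ranks.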

  17. Search for gravitational waves from low mass compact binary coalescence in 186 days of LIGO's fifth science run

    International Nuclear Information System (INIS)

    Abbott, B. P.; Abbott, R.; Adhikari, R.; Anderson, S. B.; Araya, M.; Armandula, H.; Aso, Y.; Ballmer, S.; Barton, M. A.; Betzwieser, J.; Billingsley, G.; Black, E.; Blackburn, J. K.; Bork, R.; Boschi, V.; Brooks, A. F.; Cannon, K. C.; Cardenas, L.; Cepeda, C.; Chalermsongsak, T.

    2009-01-01

    We report on a search for gravitational waves from coalescing compact binaries, of total mass between 2 and 35 M_⊙, using LIGO observations between November 14, 2006 and May 18, 2007. No gravitational-wave signals were detected. We report upper limits on the rate of compact binary coalescence as a function of total mass. The LIGO cumulative 90%-confidence rate upper limits on the coalescence of binary neutron stars, black holes and black hole-neutron star systems are 1.4×10^-2, 7.3×10^-4 and 3.6×10^-3 yr^-1 L_10^-1, respectively, where L_10 is 10^10 times the blue solar luminosity.
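    As a simplified illustration of how such rate upper limits arise (the LIGO analysis itself uses the more involved loudest-event method), the classical 90% Poisson upper limit for a search that detects zero events is just a function of the sensitive exposure; the exposure value below is hypothetical:

```python
import math

# Simplified sketch: classical Poisson rate upper limit for zero
# detected events, R = -ln(1 - CL) / <VT>, where <VT> is the sensitive
# (luminosity x time) exposure, here in units of L_10 * yr. The real
# search's limits come from the loudest-event method, not this formula.

def poisson_upper_limit(exposure, confidence=0.9):
    """Rate upper limit (per unit exposure) for zero observed events."""
    return -math.log(1.0 - confidence) / exposure

# Hypothetical exposure of 250 L_10 yr:
print(round(poisson_upper_limit(250.0), 5))   # 0.00921 yr^-1 L_10^-1
```

    The numerator -ln(0.1) ≈ 2.30 is the Poisson mean that would yield zero events only 10% of the time, so larger exposures push the limit down in direct proportion.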

  18. Characterization of solids deposited on the modular caustic-side solvent extraction unit (MCU) coalescer media removed in May and October 2014

    Energy Technology Data Exchange (ETDEWEB)

    Fondeur, F. F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-10-01

    During routine maintenance, the coalescers utilized in the Modular Caustic-Side Solvent Extraction Unit (MCU) processing of Salt Batch 6 and a portion of Salt Batch 7 were sampled and submitted to the Savannah River National Laboratory (SRNL) for characterization, for the purpose of identifying solid phase constituents that may be accumulating in these coalescers. Specifically, two samples were received and characterized: A decontaminated salt solution (DSS) coalescer sample and a strip effluent (SE) coalescer sample. Aliquots of the samples were analyzed by XRD, Fourier Transform Infrared (FTIR) Spectroscopy, SEM, and EDS. Other aliquots of the samples were leached in acid solution, and the leachates were analyzed by ICP-AES. In addition, modeling was performed to provide a basis for comparison of the analytical results.

  19. Gag Induces the Coalescence of Clustered Lipid Rafts and Tetraspanin-Enriched Microdomains at HIV-1 Assembly Sites on the Plasma Membrane ▿

    Science.gov (United States)

    Hogue, Ian B.; Grover, Jonathan R.; Soheilian, Ferri; Nagashima, Kunio; Ono, Akira

    2011-01-01

    The HIV-1 structural protein Gag associates with two types of plasma membrane microdomains, lipid rafts and tetraspanin-enriched microdomains (TEMs), both of which have been proposed to be platforms for HIV-1 assembly. However, a variety of studies have demonstrated that lipid rafts and TEMs are distinct microdomains in the absence of HIV-1 infection. To measure the impact of Gag on microdomain behaviors, we took advantage of two assays: an antibody-mediated copatching assay and a Förster resonance energy transfer (FRET) assay that measures the clustering of microdomain markers in live cells without antibody-mediated patching. We found that lipid rafts and TEMs copatched and clustered to a greater extent in the presence of membrane-bound Gag in both assays, suggesting that Gag induces the coalescence of lipid rafts and TEMs. Substitutions in membrane binding motifs of Gag revealed that, while Gag membrane binding is necessary to induce coalescence of lipid rafts and TEMs, either acylation of Gag or binding of phosphatidylinositol-(4,5)-bisphosphate is sufficient. Finally, a Gag derivative that is defective in inducing membrane curvature appeared less able to induce lipid raft and TEM coalescence. A higher-resolution analysis of assembly sites by correlative fluorescence and scanning electron microscopy showed that coalescence of clustered lipid rafts and TEMs occurs predominately at completed cell surface virus-like particles, whereas a transmembrane raft marker protein appeared to associate with punctate Gag fluorescence even in the absence of cell surface particles. Together, these results suggest that different membrane microdomain components are recruited in a stepwise manner during assembly. PMID:21813604

  20. Risk Analysis on Leakage Failure of Natural Gas Pipelines by Fuzzy Bayesian Network with a Bow-Tie Model

    OpenAIRE

    Shan, Xian; Liu, Kang; Sun, Pei-Liang

    2017-01-01

    Pipeline is the major mode of natural gas transportation. Leakage of natural gas pipelines may cause explosions and fires, resulting in casualties, environmental damage, and material loss. Efficient risk analysis is of great significance for preventing and mitigating such potential accidents. The objective of this study is to present a practical risk assessment method based on Bow-tie model and Bayesian network for risk analysis of natural gas pipeline leakage. Firstly, identify the potential...

  1. Bayesian mixture analysis for metagenomic community profiling.

    Science.gov (United States)

    Morfopoulou, Sofia; Plagnol, Vincent

    2015-09-15

    Deep sequencing of clinical samples is now an established tool for the detection of infectious pathogens, with direct medical applications. The large amount of data generated produces an opportunity to detect species even at very low levels, provided that computational tools can effectively profile the relevant metagenomic communities. Data interpretation is complicated by the fact that short sequencing reads can match multiple organisms and by the lack of completeness of existing databases, in particular for viral pathogens. Here we present metaMix, a Bayesian mixture model framework for resolving complex metagenomic mixtures. We show that the use of parallel Markov chain Monte Carlo sampling for the exploration of the species space enables the identification of the set of species most likely to contribute to the mixture. We demonstrate the greater accuracy of metaMix compared with relevant methods, particularly for profiling complex communities consisting of several related species. We designed metaMix specifically for the analysis of deep transcriptome sequencing datasets, with a focus on viral pathogen detection; however, the principles are generally applicable to all types of metagenomic mixtures. metaMix is implemented as a user-friendly R package, freely available on CRAN: http://cran.r-project.org/web/packages/metaMix sofia.morfopoulou.10@ucl.ac.uk Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
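    A stripped-down version of the mixture idea, ignoring ambiguous read assignment, is the Dirichlet-multinomial posterior over community proportions; the species names and read counts below are invented, and this is not metaMix's actual model:

```python
import random

# Toy sketch of posterior inference on mixture weights: given read
# counts assigned to candidate species, a symmetric Dirichlet prior
# yields a Dirichlet posterior, which we summarize by Monte Carlo.
# Species and counts are hypothetical.

random.seed(2)
counts = {"virus_A": 120, "virus_B": 15, "host": 865}
alpha = 1.0                                   # symmetric Dirichlet prior

def sample_dirichlet(params):
    draws = [random.gammavariate(a, 1.0) for a in params]
    total = sum(draws)
    return [d / total for d in draws]

params = [counts[s] + alpha for s in counts]
samples = [sample_dirichlet(params) for _ in range(2000)]
mean_weights = [sum(s[i] for s in samples) / len(samples)
                for i in range(len(params))]
print([round(w, 3) for w in mean_weights])    # roughly [0.121, 0.016, 0.863]
```

    metaMix goes well beyond this by treating the read-to-species assignments themselves as latent variables and exploring which species belong in the mixture at all, which is where the parallel MCMC chains come in.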

  2. Bayesian analysis for uncertainty estimation of a canopy transpiration model

    Science.gov (United States)

    Samanta, S.; Mackay, D. S.; Clayton, M. K.; Kruger, E. L.; Ewers, B. E.

    2007-04-01

    A Bayesian approach was used to fit a conceptual transpiration model to half-hourly transpiration rates for a sugar maple (Acer saccharum) stand collected over a 5-month period and probabilistically estimate its parameter and prediction uncertainties. The model used the Penman-Monteith equation with the Jarvis model for canopy conductance. This deterministic model was extended by adding a normally distributed error term. This extension enabled using Markov chain Monte Carlo simulations to sample the posterior parameter distributions. The residuals revealed approximate conformance to the assumption of normally distributed errors. However, minor systematic structures in the residuals at fine timescales suggested model changes that would potentially improve the modeling of transpiration. Results also indicated considerable uncertainties in the parameter and transpiration estimates. This simple methodology of uncertainty analysis would facilitate the deductive step during the development cycle of deterministic conceptual models by accounting for these uncertainties while drawing inferences from data.
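    The posterior-sampling machinery can be shown on a toy model in a few lines. The sketch below runs a Metropolis sampler on a one-parameter linear model with a Gaussian error term and an implicit flat prior; it is a generic stand-in, not the Penman-Monteith/Jarvis model of the study, and the data are synthetic:

```python
import math
import random

# Sketch: Metropolis sampling of a single parameter theta for a toy
# model y = theta * x + Normal(0, sigma), mirroring the paper's recipe
# of extending a deterministic model with a normal error term.

random.seed(3)
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]          # synthetic data, true theta near 2
sigma = 0.3

def log_likelihood(theta):
    return sum(-0.5 * ((y - theta * x) / sigma) ** 2
               for x, y in zip(xs, ys))

theta, chain = 1.0, []
for _ in range(5000):
    proposal = theta + random.gauss(0.0, 0.1)       # random-walk proposal
    if math.log(random.random()) < log_likelihood(proposal) - log_likelihood(theta):
        theta = proposal                            # accept
    chain.append(theta)

burned = chain[1000:]                               # discard burn-in
posterior_mean = sum(burned) / len(burned)
print(round(posterior_mean, 2))                     # close to 2.0
```

    The spread of the retained chain is exactly the parameter uncertainty the abstract refers to, and the residuals of posterior predictions play the diagnostic role described there.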

  3. Bayesian outcome-based strategy classification.

    Science.gov (United States)

    Lee, Michael D

    2016-03-01

    Hilbig and Moshagen (Psychonomic Bulletin & Review, 21, 1431-1443, 2014) recently developed a method for making inferences about the decision processes people use in multi-attribute forced choice tasks. Their paper makes a number of worthwhile theoretical and methodological contributions. Theoretically, they provide an insightful psychological motivation for a probabilistic extension of the widely-used "weighted additive" (WADD) model, and show how this model, as well as other important models like "take-the-best" (TTB), can and should be expressed in terms of meaningful priors. Methodologically, they develop an inference approach based on the Minimum Description Length (MDL) principle, which balances both the goodness-of-fit and the complexity of the decision models they consider. This paper aims to preserve these useful contributions, but provide a complementary Bayesian approach with some theoretical and methodological advantages. We develop a simple graphical model, implemented in JAGS, that allows for fully Bayesian inferences about which models people use to make decisions. To demonstrate the Bayesian approach, we apply it to the models and data considered by Hilbig and Moshagen (Psychonomic Bulletin & Review, 21, 1431-1443, 2014), showing how a prior predictive analysis of the models, and posterior inferences about which models people use and the parameter settings at which they use them, can contribute to our understanding of human decision making.
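    The heart of such model classification is comparing marginal likelihoods. A drastically simplified stand-in for the paper's JAGS graphical model: treat each strategy as predicting choices with a fixed execution-error rate and compute a log Bayes factor from binomial likelihoods (the error rates and counts below are invented):

```python
import math

# Sketch: comparing two strategy models by likelihood. A TTB-like model
# with execution-error rate 0.1 predicts the TTB-consistent option with
# p = 0.9; a guessing model says p = 0.5. Data: k TTB-consistent choices
# in n trials. Numbers and error rates are hypothetical.

def log_binomial_likelihood(k, n, p):
    return (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + k * math.log(p) + (n - k) * math.log(1 - p))

k, n = 27, 30
log_bf = (log_binomial_likelihood(k, n, 0.9)      # TTB-like model
          - log_binomial_likelihood(k, n, 0.5))   # guessing model
print(round(log_bf, 2))   # 11.04: strong evidence for the TTB-like model
```

    The full Bayesian treatment in the paper additionally places priors on the error rates and on model membership, so the classification integrates over parameter uncertainty rather than fixing p.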

  4. Joint High-Dimensional Bayesian Variable and Covariance Selection with an Application to eQTL Analysis

    KAUST Repository

    Bhadra, Anindya

    2013-04-22

    We describe a Bayesian technique to (a) perform a sparse joint selection of significant predictor variables and significant inverse covariance matrix elements of the response variables in a high-dimensional linear Gaussian sparse seemingly unrelated regression (SSUR) setting and (b) perform an association analysis between the high-dimensional sets of predictors and responses in such a setting. To search the high-dimensional model space, where both the number of predictors and the number of possibly correlated responses can be larger than the sample size, we demonstrate that a marginalization-based collapsed Gibbs sampler, in combination with spike and slab type of priors, offers a computationally feasible and efficient solution. As an example, we apply our method to an expression quantitative trait loci (eQTL) analysis on publicly available single nucleotide polymorphism (SNP) and gene expression data for humans where the primary interest lies in finding the significant associations between the sets of SNPs and possibly correlated genetic transcripts. Our method also allows for inference on the sparse interaction network of the transcripts (response variables) after accounting for the effect of the SNPs (predictor variables). We exploit properties of Gaussian graphical models to make statements concerning conditional independence of the responses. Our method compares favorably to existing Bayesian approaches developed for this purpose. © 2013, The International Biometric Society.
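    The spike-and-slab prior at the core of this approach can be illustrated in the one-predictor case, where the posterior inclusion probability has a closed form; the sketch below is a toy single-variable version, not the paper's collapsed Gibbs sampler, and all data and hyperparameters are invented:

```python
import math

# Sketch: spike-and-slab variable selection for one predictor.
# Model: y_i = beta * x_i + Normal(0, s2). Prior: beta = 0 with
# probability 1 - w (spike), beta ~ Normal(0, tau2) with probability w
# (slab). Both component marginal likelihoods are Gaussian, so the
# posterior inclusion probability is available in closed form.

def inclusion_probability(xs, ys, s2=1.0, tau2=4.0, w=0.5):
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    v = 1.0 / tau2 + sxx / s2           # posterior precision of beta
    # log marginal likelihood ratio slab/spike (standard Gaussian algebra)
    log_ratio = (-0.5 * math.log(tau2) - 0.5 * math.log(v)
                 + (sxy / s2) ** 2 / (2.0 * v))
    odds = (w / (1.0 - w)) * math.exp(log_ratio)
    return odds / (1.0 + odds)

xs = [0.5, -1.0, 1.5, 2.0, -0.5]
ys = [1.1, -2.3, 2.9, 4.2, -0.8]        # strong signal, beta near 2
print(round(inclusion_probability(xs, ys), 3))   # near 1
```

    With many predictors and correlated responses this closed form is lost, which is why the paper resorts to a marginalization-based collapsed Gibbs sampler over the joint space of regression and inverse-covariance indicators.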

  5. A Dynamic Bayesian Approach to Computational Laban Shape Quality Analysis

    Directory of Open Access Journals (Sweden)

    Dilip Swaminathan

    2009-01-01

    kinesiology. LMA (especially Effort/Shape) emphasizes how internal feelings and intentions govern the patterning of movement throughout the whole body. As we argue, a complex understanding of intention via LMA is necessary for human-computer interaction to become embodied in ways that resemble interaction in the physical world. We thus introduce a novel, flexible Bayesian fusion approach for identifying LMA Shape qualities from raw motion capture data in real time. The method uses a dynamic Bayesian network (DBN) to fuse movement features across the body and across time and, as we discuss, can be readily adapted for low-cost video. It has delivered excellent performance in preliminary studies comprising improvisatory movements. Our approach has been incorporated in Response, a mixed-reality environment where users interact via natural, full-body human movement and enhance their bodily-kinesthetic awareness through immersive sound and light feedback, with applications to kinesiology training, Parkinson's patient rehabilitation, interactive dance, and many other areas.

  6. Gravitational Waves: Search Results, Data Analysis and Parameter Estimation. Amaldi 10 Parallel Session C2

    Science.gov (United States)

    Astone, Pia; Weinstein, Alan; Agathos, Michalis; Bejger, Michal; Christensen, Nelson; Dent, Thomas; Graff, Philip; Klimenko, Sergey; Mazzolo, Giulio; Nishizawa, Atsushi

    2015-01-01

    The Amaldi 10 Parallel Session C2 on gravitational wave(GW) search results, data analysis and parameter estimation included three lively sessions of lectures by 13 presenters, and 34 posters. The talks and posters covered a huge range of material, including results and analysis techniques for ground-based GW detectors, targeting anticipated signals from different astrophysical sources: compact binary inspiral, merger and ringdown; GW bursts from intermediate mass binary black hole mergers, cosmic string cusps, core-collapse supernovae, and other unmodeled sources; continuous waves from spinning neutron stars; and a stochastic GW background. There was considerable emphasis on Bayesian techniques for estimating the parameters of coalescing compact binary systems from the gravitational waveforms extracted from the data from the advanced detector network. This included methods to distinguish deviations of the signals from what is expected in the context of General Relativity.

  7. Gravitational waves: search results, data analysis and parameter estimation: Amaldi 10 Parallel session C2.

    Science.gov (United States)

    Astone, Pia; Weinstein, Alan; Agathos, Michalis; Bejger, Michał; Christensen, Nelson; Dent, Thomas; Graff, Philip; Klimenko, Sergey; Mazzolo, Giulio; Nishizawa, Atsushi; Robinet, Florent; Schmidt, Patricia; Smith, Rory; Veitch, John; Wade, Madeline; Aoudia, Sofiane; Bose, Sukanta; Calderon Bustillo, Juan; Canizares, Priscilla; Capano, Colin; Clark, James; Colla, Alberto; Cuoco, Elena; Da Silva Costa, Carlos; Dal Canton, Tito; Evangelista, Edgar; Goetz, Evan; Gupta, Anuradha; Hannam, Mark; Keitel, David; Lackey, Benjamin; Logue, Joshua; Mohapatra, Satyanarayan; Piergiovanni, Francesco; Privitera, Stephen; Prix, Reinhard; Pürrer, Michael; Re, Virginia; Serafinelli, Roberto; Wade, Leslie; Wen, Linqing; Wette, Karl; Whelan, John; Palomba, C; Prodi, G

    The Amaldi 10 Parallel Session C2 on gravitational wave (GW) search results, data analysis and parameter estimation included three lively sessions of lectures by 13 presenters, and 34 posters. The talks and posters covered a huge range of material, including results and analysis techniques for ground-based GW detectors, targeting anticipated signals from different astrophysical sources: compact binary inspiral, merger and ringdown; GW bursts from intermediate mass binary black hole mergers, cosmic string cusps, core-collapse supernovae, and other unmodeled sources; continuous waves from spinning neutron stars; and a stochastic GW background. There was considerable emphasis on Bayesian techniques for estimating the parameters of coalescing compact binary systems from the gravitational waveforms extracted from the data from the advanced detector network. This included methods to distinguish deviations of the signals from what is expected in the context of General Relativity.

  8. A Bayesian analysis of inflationary primordial spectrum models using Planck data

    Science.gov (United States)

    Santos da Costa, Simony; Benetti, Micol; Alcaniz, Jailson

    2018-03-01

    The current available Cosmic Microwave Background (CMB) data show an anomalously low value of the CMB temperature fluctuations at large angular scales (low multipoles l). This lack of power is not explained by the minimal ΛCDM model, and one of the possible mechanisms explored in the literature to address this problem is the presence of features in the primordial power spectrum (PPS) motivated by early-universe physics. In this paper, we analyse a set of cutoff inflationary PPS models using a Bayesian model comparison approach in light of the latest CMB data from the Planck Collaboration. Our results show that the standard power-law parameterisation is preferred over all models considered in the analysis, which motivates the search for alternative explanations for the observed lack of power in the CMB anisotropy spectrum.

  9. Cross-view gait recognition using joint Bayesian

    Science.gov (United States)

    Li, Chao; Sun, Shouqian; Chen, Xiaoyu; Min, Xin

    2017-07-01

    Human gait, as a soft biometric, helps to recognize people by the way they walk. To further improve recognition performance under the cross-view condition, we propose Joint Bayesian to model the view variance. We evaluated our proposed method on the largest population (OULP) dataset, which makes our results statistically reliable. As a result, we confirmed that our proposed method significantly outperforms state-of-the-art approaches for both identification and verification tasks. Finally, a sensitivity analysis on the number of training subjects was conducted; we found that Joint Bayesian achieves competitive results even with a small subset of training subjects (100 subjects). For further comparison, experimental results, learning models, and test codes are available.

  10. Narrowband interference parameterization for sparse Bayesian recovery

    KAUST Repository

    Ali, Anum

    2015-09-11

    This paper addresses the problem of narrowband interference (NBI) in SC-FDMA systems by using tools from compressed sensing and stochastic geometry. The proposed NBI cancellation scheme exploits the frequency domain sparsity of the unknown signal and adopts a Bayesian sparse recovery procedure. This is done by keeping a few randomly chosen sub-carriers data free to sense the NBI signal at the receiver. As Bayesian recovery requires knowledge of some NBI parameters (i.e., mean, variance and sparsity rate), we use tools from stochastic geometry to obtain analytical expressions for the required parameters. Our simulation results validate the analysis and demonstrate the suitability of the proposed recovery method for NBI mitigation. © 2015 IEEE.

  11. Numerical-experiment investigation of coalescence of gaseous protogalactic fragments in triple systems

    International Nuclear Information System (INIS)

    Kiseleva, L.G.; Orlov, V.V.

    1988-01-01

    The numerical-experiment approach in the framework of the general gravitational three-body problem has been used to investigate the dynamical evolution of triple systems of gaseous protogalactic fragments. The masses of the fragments are equal, and the initial velocities are zero. The initial positions were specified by uniform scanning of the region D of all possible initial configurations. Calculations were continued until the first two-body encounter of the fragments. Different values of the fragment radii at that time were considered, namely r in the interval [0.001, 0.1]d, where d is the mean diameter of the system. It is shown that for such r the pair of gaseous fragments coalesces in the majority of cases (from 50.2% for r = 0.001d to 96.7% for r = 0.1d). The mean specific angular momentum of their relative motion, which becomes the spin angular momentum of the coalescence product, is (0.8 ± 1.0)·10^29 √(μl) cm²/sec for the most probable value r = 10 l kpc (the masses of the fragments are 5·10^10 μ M_⊙; l and μ are scale factors), which agrees in order of magnitude with the specific angular momenta of disk galaxies if l, μ ≈ 1. For each value of r, a continuous zone of initial configurations corresponding to coalescences is identified in the region D

  12. Evolutionary Analysis of Dengue Serotype 2 Viruses Using Phylogenetic and Bayesian Methods from New Delhi, India.

    Directory of Open Access Journals (Sweden)

    Nazia Afreen

    2016-03-01

    Full Text Available Dengue fever is the most important arboviral disease in the tropical and sub-tropical countries of the world. Delhi, the metropolitan capital state of India, has reported many dengue outbreaks, with the last outbreak occurring in 2013. We have recently reported the predominance of dengue virus serotype 2 during 2011-2014 in Delhi. In the present study, we report the molecular characterization and evolutionary analysis of dengue serotype 2 viruses detected in 2011-2014 in Delhi. Envelope genes of 42 DENV-2 strains were sequenced in the study. All DENV-2 strains grouped within the Cosmopolitan genotype and further clustered into three lineages: Lineage I, II and III. Lineage III replaced Lineage I during the dengue fever outbreak of 2013. Further, a novel mutation Thr404Ile was detected in the stem region of the envelope protein of a single DENV-2 strain in 2014. The nucleotide substitution rate and time to the most recent common ancestor were determined by molecular clock analysis using Bayesian methods. A change in the effective population size of Indian DENV-2 viruses was investigated through a Bayesian skyline plot. The study will be a vital road map for investigation of the epidemiology and evolutionary patterns of dengue viruses in India.

  13. Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources

    Energy Technology Data Exchange (ETDEWEB)

    Klumpp, John [Colorado State University, Department of Environmental and Radiological Health Sciences, Molecular and Radiological Biosciences Building, Colorado State University, Fort Collins, Colorado, 80523 (United States)

    2013-07-01

    We propose a radiation detection system which generates its own discrete sampling distribution based on past measurements of background. The advantage to this approach is that it can take into account variations in background with respect to time, location, energy spectra, detector-specific characteristics (i.e. different efficiencies at different count rates and energies), etc. This would therefore be a 'machine learning' approach, in which the algorithm updates and improves its characterization of background over time. The system would have a 'learning mode,' in which it measures and analyzes background count rates, and a 'detection mode,' in which it compares measurements from an unknown source against its unique background distribution. By characterizing and accounting for variations in the background, general purpose radiation detectors can be improved with little or no increase in cost. The statistical and computational techniques to perform this kind of analysis have already been developed. The necessary signal analysis can be accomplished using existing Bayesian algorithms which account for multiple channels, multiple detectors, and multiple time intervals. Furthermore, Bayesian machine-learning techniques have already been developed which, with trivial modifications, can generate appropriate decision thresholds based on the comparison of new measurements against a nonparametric sampling distribution. (authors)
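    The learning-mode/detection-mode loop can be sketched with a conjugate Poisson-Gamma model: learn a Gamma posterior for the background rate from past counts, then flag a new measurement whose posterior predictive (negative binomial) tail probability is small. The prior, counts, and threshold below are invented, and this single-channel sketch ignores the multi-channel, multi-detector machinery of the abstract:

```python
import math

# Sketch: Bayesian background characterization and anomaly detection.
# Learning mode: counts ~ Poisson(lambda), prior lambda ~ Gamma(a0, b0),
# so the posterior is Gamma(a0 + sum(counts), b0 + n_intervals).
# Detection mode: the posterior predictive for a new interval is
# negative binomial with r = a, p = b / (b + 1); flag counts whose
# upper tail probability falls below a chosen threshold.

def log_nb_pmf(k, r, p):
    """Negative binomial pmf: C(k+r-1, k) * p^r * (1-p)^k."""
    return (math.lgamma(k + r) - math.lgamma(r) - math.lgamma(k + 1)
            + r * math.log(p) + k * math.log(1 - p))

def tail_prob(k_obs, r, p, k_max=200):
    """P(X >= k_obs) under the posterior predictive."""
    return sum(math.exp(log_nb_pmf(k, r, p)) for k in range(k_obs, k_max))

background = [12, 9, 11, 10, 13, 8, 12, 11]   # learning mode (hypothetical)
a0, b0 = 1.0, 0.1                             # weak Gamma prior (assumed)
a = a0 + sum(background)
b = b0 + len(background)
p = b / (b + 1.0)

print(round(tail_prob(25, a, p), 4))          # small => flag as anomalous
print(round(tail_prob(11, a, p), 4))          # typical => consistent with background
```

    Because the posterior keeps updating in learning mode, the decision threshold automatically tracks drifts in background, which is the "machine learning" behavior the abstract describes.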

  14. Application of Bayesian configural frequency analysis (BCFA) to determine characteristics of users and non-users of motor X

    Science.gov (United States)

    Mawardi, Muhamad Iqbal; Padmadisastra, Septiadi; Tantular, Bertho

    2018-03-01

    Configural Frequency Analysis (CFA) is a method for cell-wise testing in contingency tables, used to search exploratively for types and antitypes: it detects discrepancies from a model as significant differences between observed and expected cell frequencies. The analysis focuses on interactions among categories from different variables rather than on interactions among the variables themselves. One extension of CFA is Bayesian CFA, an alternative that pursues the same goal as the frequentist version with the advantages that no adjustment of the experiment-wise significance level α is necessary and that it can test whether groups of types and antitypes form composite types or composite antitypes. This paper presents the concept of Bayesian CFA and demonstrates how it works on real data. The data come from a company case study of the decline of Brand Awareness & Image of motor X on the Top Of Mind Unit indicator in Cirebon City (30.8% among users and 9.8% among non-users). The B-CFA identified four deviating configurations; one of them, configuration 2212, needs particular attention from the company when designing promotion strategies to maintain and improve Top Of Mind Unit in Cirebon City.
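The cell-wise Bayesian idea can be illustrated with a toy calculation (not the authors' exact procedure; the counts and margins below are invented): with a uniform Beta prior on each cell probability, the posterior probability that a cell exceeds its expectation under independence marks candidate types, and the probability that it falls below marks candidate antitypes.

```python
import random

def bayesian_cfa_cell(observed, total, expected_prob, draws=20000, seed=0):
    """Posterior P(cell probability > expected) under a Beta(1,1) prior.

    A cell is a candidate 'type' when this probability is near 1 and a
    candidate 'antitype' when it is near 0.
    """
    rng = random.Random(seed)
    a, b = 1 + observed, 1 + total - observed  # conjugate Beta update
    hits = sum(rng.betavariate(a, b) > expected_prob for _ in range(draws))
    return hits / draws

# Hypothetical table: under independence of the margins, one cell has
# expected probability 0.3 of the 200 observations.
total = 200
p_type = bayesian_cfa_cell(observed=90, total=total, expected_prob=0.3)  # over-represented
p_anti = bayesian_cfa_cell(observed=20, total=total, expected_prob=0.3)  # under-represented
print(p_type, p_anti)
```

A probability near 1 (here for the over-represented cell) flags a type; a probability near 0 flags an antitype, with no multiple-testing correction needed.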

  15. Online variational Bayesian filtering-based mobile target tracking in wireless sensor networks.

    Science.gov (United States)

    Zhou, Bingpeng; Chen, Qingchun; Li, Tiffany Jing; Xiao, Pei

    2014-11-11

    The received signal strength (RSS)-based online tracking for a mobile node in wireless sensor networks (WSNs) is investigated in this paper. Firstly, a multi-layer dynamic Bayesian network (MDBN) is introduced to characterize the target mobility with either directional or undirected movement. In particular, the Wishart distribution is employed to approximate the randomness of the time-varying RSS measurement precision due to the target movement. It is shown that the proposed MDBN offers a more general analysis model by incorporating the underlying statistical information of both the target movement and the observations, which can be utilized to improve the online tracking capability by exploiting Bayesian statistics. Secondly, based on the MDBN model, a mean-field variational Bayesian filtering (VBF) algorithm is developed to realize the online tracking of a mobile target in the presence of nonlinear observations and time-varying RSS precision, where the traditional Bayesian filtering scheme cannot be directly employed. Thirdly, a joint optimization between the real-time velocity and its prior expectation is proposed to enable online velocity tracking within the proposed scheme. Finally, the associated Bayesian Cramér-Rao lower bound (BCRLB) analysis and numerical simulations are conducted. Our analysis unveils that, by exploiting the potential state information via the general MDBN model, the proposed VBF algorithm provides a promising solution to the online tracking of a mobile node in WSNs. In addition, it is shown that the final tracking accuracy scales linearly with its expectation when the RSS measurement precision is time-varying.

  16. Risk Assessment for Mobile Systems Through a Multilayered Hierarchical Bayesian Network.

    Science.gov (United States)

    Li, Shancang; Tryfonas, Theo; Russell, Gordon; Andriotis, Panagiotis

    2016-08-01

    Mobile systems are facing a number of application vulnerabilities that can be combined and exploited to penetrate systems with devastating impact. When assessing the overall security of a mobile system, it is important to assess the security risks posed by each mobile application (app), thus gaining a stronger understanding of any vulnerabilities present. This paper develops a three-layer framework that assesses the potential risks which apps introduce into Android mobile systems. A Bayesian risk graphical model is proposed to evaluate risk propagation in a layered risk architecture. By integrating static analysis, dynamic analysis, and behavior analysis in a hierarchical framework, the risks and their propagation through each layer are modeled by the Bayesian risk graph, which can quantitatively analyze the risks faced by both apps and mobile systems. The proposed hierarchical Bayesian risk graph model offers a novel way to investigate security risks in the mobile environment and enables users and administrators to evaluate the potential risks. This strategy makes it possible to strengthen both app security and the security of the entire system.

  17. Maritime Transportation Risk Assessment of Tianjin Port with Bayesian Belief Networks.

    Science.gov (United States)

    Zhang, Jinfen; Teixeira, Ângelo P; Guedes Soares, C; Yan, Xinping; Liu, Kezhong

    2016-06-01

    This article develops a Bayesian belief network model for the prediction of accident consequences in the Tianjin port. The study starts with a statistical analysis of six years of historical accident data, from 2008 to 2013. A Bayesian belief network is then constructed to express the dependencies between the indicator variables and accident consequences. The statistics and expert knowledge are synthesized in the Bayesian belief network model to obtain the probability distribution of the consequences. Through a sensitivity analysis, several indicator variables that influence the consequences are identified, including navigational area, ship type and time of day. The results indicate that the consequences are most sensitive to the position where the accidents occurred, followed by time of day and ship length. The results also reflect that the navigational risk of the Tianjin port is at an acceptable level, although there is still room for improvement. These results can be used by the Maritime Safety Administration to take effective measures to enhance maritime safety in the Tianjin port. © 2016 Society for Risk Analysis.
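The kind of dependence such a model captures can be sketched with a tiny hand-built network: a consequence node conditioned on navigational area and time of day, marginalized by enumeration. The probabilities below are illustrative placeholders, not the fitted Tianjin model.

```python
# Minimal Bayesian-network-style calculation (illustrative numbers only):
# severe-consequence probability depends on area and time of day.
p_area = {"channel": 0.4, "anchorage": 0.6}   # P(area)
p_night = 0.35                                # P(time = night)
# P(severe | area, night?): hypothetical conditional probability table
p_severe = {("channel", True): 0.30, ("channel", False): 0.15,
            ("anchorage", True): 0.10, ("anchorage", False): 0.05}

def prob_severe(area=None):
    """Marginal (or area-conditional) probability of a severe consequence."""
    areas = [area] if area else p_area
    total = 0.0
    for a in areas:
        w = 1.0 if area else p_area[a]
        for night, wt in ((True, p_night), (False, 1 - p_night)):
            total += w * wt * p_severe[(a, night)]
    return total

print(round(prob_severe(), 4))            # marginal over all areas
print(round(prob_severe("channel"), 4))   # sensitivity to navigational area
```

Comparing the conditional value against the marginal is the essence of the sensitivity analysis the article performs across its indicator variables.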

  18. Application of a naive Bayesian classifier in assessing the supplier

    Directory of Open Access Journals (Sweden)

    Mijailović Snežana

    2017-01-01

    Full Text Available The paper considers the class of interactive knowledge-based systems whose main purpose is to make proposals and assist customers in making decisions. The mathematical model provides a set of learning examples about the delivered series from three suppliers, as well as an analysis of an illustrative example of assessing a supplier using a naive Bayesian classifier. The model was developed on the basis of an analysis of subjective probabilities, which are later revised with the help of new empirical information and Bayes' theorem on posterior probability, i.e. by combining subjective and objective conditional probabilities in the choice of a reliable supplier.
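A self-contained sketch of such a classifier follows, with invented delivery records and attributes (the Laplace smoothing here stands in for the paper's revision of subjective priors with empirical data):

```python
import math
from collections import Counter, defaultdict

# Hypothetical training data: past deliveries described by categorical
# attributes, labelled 'reliable' or 'unreliable' (illustrative only).
data = [
    ({"delay": "none", "quality": "high"}, "reliable"),
    ({"delay": "none", "quality": "high"}, "reliable"),
    ({"delay": "short", "quality": "high"}, "reliable"),
    ({"delay": "long", "quality": "low"}, "unreliable"),
    ({"delay": "long", "quality": "high"}, "unreliable"),
    ({"delay": "short", "quality": "low"}, "unreliable"),
]

labels = Counter(label for _, label in data)
counts = defaultdict(Counter)  # (label, attribute) -> value counts
for features, label in data:
    for attr, value in features.items():
        counts[(label, attr)][value] += 1

def classify(features):
    """Most probable label by naive Bayes with Laplace smoothing."""
    scores = {}
    for label, n in labels.items():
        s = math.log(n / len(data))  # prior from label frequencies
        for attr, value in features.items():
            vocab = {v for c in labels for v in counts[(c, attr)]}
            s += math.log((counts[(label, attr)][value] + 1) / (n + len(vocab)))
        scores[label] = s
    return max(scores, key=scores.get)

print(classify({"delay": "none", "quality": "high"}))  # -> 'reliable'
```

Each new delivery updates the counts, so the subjective starting point is progressively revised by empirical evidence, exactly in the spirit of the model described above.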

  19. GW151226: observation of gravitational waves from a 22-solar-mass binary black hole coalescence

    OpenAIRE

    Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Agathos, M.; Agatsuma, K.; Aggarwal, N.

    2016-01-01

    We report the observation of a gravitational-wave signal produced by the coalescence of two stellar-mass black holes. The signal, GW151226, was observed by the twin detectors of the Laser Interferometer Gravitational-Wave Observatory (LIGO) on December 26, 2015 at 03:38:53 UTC. The signal was initially identified within 70 s by an online matched-filter search targeting binary coalescences. Subsequent off-line analyses recovered GW151226 with a network signal-to-noise ratio of 13 and a signifi...

  20. GeneRecon—A coalescent based tool for fine-scale association mapping

    DEFF Research Database (Denmark)

    Mailund, Thomas; Schierup, Mikkel Heide; Pedersen, Christian Nørgaard Storm

    2006-01-01

    GeneRecon is a tool for fine-scale association mapping using a coalescence model. GeneRecon takes as input case-control data from phased or unphased SNP and micro-satellite genotypes. The posterior distribution of disease locus position is obtained by Metropolis Hastings sampling in the state space...

  1. Applying Bayesian statistics to the study of psychological trauma: A suggestion for future research.

    Science.gov (United States)

    Yalch, Matthew M

    2016-03-01

    Several contemporary researchers have noted the virtues of Bayesian methods of data analysis. Although debates continue about whether conventional or Bayesian statistics is the "better" approach for researchers in general, there are reasons why Bayesian methods may be well suited to the study of psychological trauma in particular. This article describes how Bayesian statistics offers practical solutions to the problems of data non-normality, small sample size, and missing data common in research on psychological trauma. After a discussion of these problems and the effects they have on trauma research, this article explains the basic philosophical and statistical foundations of Bayesian statistics and how it provides solutions to these problems using an applied example. Results of the literature review and the accompanying example indicate the utility of Bayesian statistics in addressing problems common in trauma research. Bayesian statistics provides a set of methodological tools and a broader philosophical framework that is useful for trauma researchers. Methodological resources are also provided so that interested readers can learn more. (c) 2016 APA, all rights reserved.

  2. Testing adaptive toolbox models: a Bayesian hierarchical approach.

    Science.gov (United States)

    Scheibehenne, Benjamin; Rieskamp, Jörg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox framework. How can a toolbox model be quantitatively specified? How can the number of toolbox strategies be limited to prevent uncontrolled strategy sprawl? How can a toolbox model be formally tested against alternative theories? The authors show how these challenges can be met by using Bayesian inference techniques. By means of parameter recovery simulations and the analysis of empirical data across a variety of domains (i.e., judgment and decision making, children's cognitive development, function learning, and perceptual categorization), the authors illustrate how Bayesian inference techniques allow toolbox models to be quantitatively specified, strategy sprawl to be contained, and toolbox models to be rigorously tested against competing theories. The authors demonstrate that their approach applies at the individual level but can also be generalized to the group level with hierarchical Bayesian procedures. The suggested Bayesian inference techniques represent a theoretical and methodological advancement for toolbox theories of cognition and behavior.

  3. Evaluation of errors in prior mean and variance in the estimation of integrated circuit failure rates using Bayesian methods

    Science.gov (United States)

    Fletcher, B. C.

    1972-01-01

    The critical point of any Bayesian analysis concerns the choice and quantification of the prior information. The effects of prior data on a Bayesian analysis are studied. Comparisons of the maximum likelihood estimator, the Bayesian estimator, and the known failure rate are presented. The results of the many simulated trials are then analyzed to show the region of criticality for prior information being supplied to the Bayesian estimator. In particular, the effects of prior mean and variance are determined as a function of the amount of test data available.
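The interplay between prior moments, test data, and the MLE can be reproduced with a conjugate gamma-Poisson sketch (the failure counts and prior moments below are invented, not the report's data):

```python
# Conjugate gamma-Poisson update for a failure rate, showing how the prior
# mean and variance pull the Bayesian estimate away from the MLE.
def gamma_from_moments(mean, var):
    """Gamma shape/rate parameters matching a given prior mean and variance."""
    rate = mean / var
    return mean * rate, rate  # shape alpha, rate beta

def posterior_mean(failures, exposure, prior_mean, prior_var):
    alpha, beta = gamma_from_moments(prior_mean, prior_var)
    return (alpha + failures) / (beta + exposure)

failures, exposure = 2, 1000.0  # test data: 2 failures in 1000 hours
mle = failures / exposure       # 0.002/hour, ignores prior information

print(mle)
print(posterior_mean(failures, exposure, prior_mean=0.002, prior_var=1e-6))  # accurate prior
print(posterior_mean(failures, exposure, prior_mean=0.010, prior_var=1e-6))  # biased prior
```

With a strong prior (small variance) and sparse test data, the biased prior dominates the estimate; as exposure grows, the data reclaim control. That is exactly the "region of criticality" the report maps out.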

  4. On the prior probabilities for two-stage Bayesian estimates

    International Nuclear Information System (INIS)

    Kohut, P.

    1992-01-01

    The method of Bayesian inference is reexamined for its applicability and for the required underlying assumptions in obtaining and using prior probability estimates. Two different approaches are suggested to determine the first-stage priors in the two-stage Bayesian analysis which avoid certain assumptions required for other techniques. In the first scheme, the prior is obtained through a true frequency based distribution generated at selected intervals utilizing actual sampling of the failure rate distributions. The population variability distribution is generated as the weighted average of the frequency distributions. The second method is based on a non-parametric Bayesian approach using the Maximum Entropy Principle. Specific features such as integral properties or selected parameters of prior distributions may be obtained with minimal assumptions. It is indicated how various quantiles may also be generated with a least-squares technique

  5. Bayesian inference in probabilistic risk assessment-The current state of the art

    International Nuclear Information System (INIS)

    Kelly, Dana L.; Smith, Curtis L.

    2009-01-01

    Markov chain Monte Carlo (MCMC) approaches to sampling directly from the joint posterior distribution of aleatory model parameters have led to tremendous advances in Bayesian inference capability in a wide variety of fields, including probabilistic risk analysis. The advent of freely available software coupled with inexpensive computing power has catalyzed this advance. This paper examines where the risk assessment community is with respect to implementing modern computational-based Bayesian approaches to inference. Through a series of examples in different topical areas, it introduces salient concepts and illustrates the practical application of Bayesian inference via MCMC sampling to a variety of important problems
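A bare-bones version of the MCMC machinery the paper surveys, applied to a Poisson failure-rate posterior with a gamma prior (all numbers invented; a real analysis would use freely available tools of the kind the paper discusses):

```python
import math
import random

def log_post(lam, failures=3, exposure=500.0, a=1.0, b=100.0):
    """Log posterior: Poisson likelihood x Gamma(a, b) prior, up to a constant."""
    if lam <= 0:
        return -math.inf
    return (failures * math.log(lam) - lam * exposure   # Poisson likelihood
            + (a - 1) * math.log(lam) - b * lam)        # Gamma prior

def metropolis(n=20000, step=0.5, seed=1):
    """Random-walk Metropolis on the log scale, discarding a burn-in."""
    rng = random.Random(seed)
    lam, samples = 0.01, []
    for _ in range(n):
        prop = lam * math.exp(rng.gauss(0.0, step))  # multiplicative proposal
        # include the log-scale proposal's Jacobian in the acceptance ratio
        log_ratio = log_post(prop) - log_post(lam) + math.log(prop) - math.log(lam)
        if math.log(rng.random()) < log_ratio:
            lam = prop
        samples.append(lam)
    return samples[n // 4:]

draws = metropolis()
print(sum(draws) / len(draws))  # close to the exact Gamma(4, 600) mean, 4/600
```

Because this prior is conjugate, the exact posterior Gamma(4, 600) is available for checking; the value of MCMC is that the same sampler works unchanged when the likelihood or prior is not conjugate.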

  6. Bayesian models for astrophysical data using R, JAGS, Python, and Stan

    CERN Document Server

    Hilbe, Joseph M; Ishida, Emille E O

    2017-01-01

    This comprehensive guide to Bayesian methods in astronomy enables hands-on work by supplying complete R, JAGS, Python, and Stan code, to use directly or to adapt. It begins by examining the normal model from both frequentist and Bayesian perspectives and then progresses to a full range of Bayesian generalized linear and mixed or hierarchical models, as well as additional types of models such as ABC and INLA. The book provides code that is largely unavailable elsewhere and includes details on interpreting and evaluating Bayesian models. Initial discussions offer models in synthetic form so that readers can easily adapt them to their own data; later the models are applied to real astronomical data. The consistent focus is on hands-on modeling, analysis of data, and interpretations that address scientific questions. A must-have for astronomers, its concrete approach will also be attractive to researchers in the sciences more generally.

  7. Estimation of expected number of accidents and workforce unavailability through Bayesian population variability analysis and Markov-based model

    International Nuclear Information System (INIS)

    Chagas Moura, Márcio das; Azevedo, Rafael Valença; Droguett, Enrique López; Chaves, Leandro Rego; Lins, Isis Didier

    2016-01-01

    Occupational accidents have several negative consequences for employees, employers, the environment and people surrounding the locale where the accident takes place. Some types of accidents correspond to low frequency-high consequence (long sick leave) events, and classical statistical approaches are ineffective in these cases because the available dataset is generally sparse and contains censored recordings. In this context, we propose a Bayesian population variability method for the estimation of the distributions of the rates of accident and recovery. Given these distributions, a Markov-based model is used to estimate the uncertainty over the expected number of accidents and the work time loss. Thus, the use of Bayesian analysis along with the Markov approach aims at investigating future trends regarding occupational accidents in a workplace as well as enabling better management of the labor force and prevention efforts. One application example is presented in order to validate the proposed approach; this case uses data gathered from a hydropower company in Brazil. - Highlights: • This paper proposes a Bayesian method to estimate rates of accident and recovery. • The model requires simple data likely to be available in the company database. • The results show the proposed model is not too sensitive to the prior estimates.
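The Markov layer of such an approach can be sketched as a two-state (working / sick-leave) chain. In the paper's setting the accident and recovery rates would be drawn from the Bayesian population-variability posteriors; here they are fixed illustrative values:

```python
def steady_state_unavailability(lam, mu):
    """Long-run fraction of time an employee is off work, for accident rate
    lam (working -> sick leave) and recovery rate mu (sick leave -> working)."""
    return lam / (lam + mu)

def expected_accidents(lam, mu, horizon):
    """Expected accident count over the horizon: accidents occur at rate lam
    only while the employee is at work (long-run approximation)."""
    return lam * (1 - steady_state_unavailability(lam, mu)) * horizon

lam, mu = 0.001, 0.05  # per-day accident and recovery rates (hypothetical)
print(round(steady_state_unavailability(lam, mu), 4))
print(round(expected_accidents(lam, mu, 365), 2))
```

To propagate uncertainty as the paper does, one would repeat this calculation for each (lam, mu) pair sampled from the posterior distributions, yielding distributions over unavailability and the expected number of accidents rather than point values.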

  8. Gravitational wave emission from the coalescence of white dwarfs

    International Nuclear Information System (INIS)

    Garcia-Berro, E; Loren-Aguilar, P; Isern, J; Pedemonte, A G; Guerrero, J; Lobo, J A

    2005-01-01

    We have computed the gravitational wave emission arising from the coalescence of several close white dwarf binary systems. In order to do so, we have followed the evolution of such systems using a smoothed particle hydrodynamics code. Here we present some of the results obtained so far, paying special attention to the detectability of the emitted gravitational waves. Within this context, we show what the impact of individual merging episodes could be for LISA.

  9. Bayesian Fundamentalism or Enlightenment? On the explanatory status and theoretical contributions of Bayesian models of cognition.

    Science.gov (United States)

    Jones, Matt; Love, Bradley C

    2011-08-01

    The prominence of Bayesian modeling of cognition has increased recently largely because of mathematical advances in specifying and deriving predictions from complex probabilistic models. Much of this research aims to demonstrate that cognitive behavior can be explained from rational principles alone, without recourse to psychological or neurological processes and representations. We note commonalities between this rational approach and other movements in psychology - namely, Behaviorism and evolutionary psychology - that set aside mechanistic explanations or make use of optimality assumptions. Through these comparisons, we identify a number of challenges that limit the rational program's potential contribution to psychological theory. Specifically, rational Bayesian models are significantly unconstrained, both because they are uninformed by a wide range of process-level data and because their assumptions about the environment are generally not grounded in empirical measurement. The psychological implications of most Bayesian models are also unclear. Bayesian inference itself is conceptually trivial, but strong assumptions are often embedded in the hypothesis sets and the approximation algorithms used to derive model predictions, without a clear delineation between psychological commitments and implementational details. Comparing multiple Bayesian models of the same task is rare, as is the realization that many Bayesian models recapitulate existing (mechanistic level) theories. Despite the expressive power of current Bayesian models, we argue they must be developed in conjunction with mechanistic considerations to offer substantive explanations of cognition. We lay out several means for such an integration, which take into account the representations on which Bayesian inference operates, as well as the algorithms and heuristics that carry it out. We argue this unification will better facilitate lasting contributions to psychological theory, avoiding the pitfalls

  10. Bayesian molecular dating: opening up the black box.

    Science.gov (United States)

    Bromham, Lindell; Duchêne, Sebastián; Hua, Xia; Ritchie, Andrew M; Duchêne, David A; Ho, Simon Y W

    2018-05-01

    Molecular dating analyses allow evolutionary timescales to be estimated from genetic data, offering an unprecedented capacity for investigating the evolutionary past of all species. These methods require us to make assumptions about the relationship between genetic change and evolutionary time, often referred to as a 'molecular clock'. Although initially regarded with scepticism, molecular dating has now been adopted in many areas of biology. This broad uptake has been due partly to the development of Bayesian methods that allow complex aspects of molecular evolution, such as variation in rates of change across lineages, to be taken into account. But in order to do this, Bayesian dating methods rely on a range of assumptions about the evolutionary process, which vary in their degree of biological realism and empirical support. These assumptions can have substantial impacts on the estimates produced by molecular dating analyses. The aim of this review is to open the 'black box' of Bayesian molecular dating and have a look at the machinery inside. We explain the components of these dating methods, the important decisions that researchers must make in their analyses, and the factors that need to be considered when interpreting results. We illustrate the effects that the choices of different models and priors can have on the outcome of the analysis, and suggest ways to explore these impacts. We describe some major research directions that may improve the reliability of Bayesian dating. The goal of our review is to help researchers to make informed choices when using Bayesian phylogenetic methods to estimate evolutionary rates and timescales. © 2017 Cambridge Philosophical Society.

  11. Coalescence of Black Hole-Neutron Star Binaries

    Directory of Open Access Journals (Sweden)

    Masaru Shibata

    2011-08-01

    Full Text Available We review the current status of general relativistic studies of the coalescence of black hole-neutron star (BH-NS) binaries. First, we summarize procedures for obtaining solutions of BH-NS binaries in quasi-equilibrium circular orbits, together with the numerical results of high-precision computations, such as quasi-equilibrium sequences and the mass-shedding limit. Then, the current status of numerical-relativity simulations for the merger of BH-NS binaries is described. We summarize our understanding of the merger and/or tidal disruption processes, the criterion for tidal disruption, the properties of the remnant formed after tidal disruption, the gravitational waveform, and the gravitational-wave spectrum.

  12. Automated Bayesian model development for frequency detection in biological time series

    Directory of Open Access Journals (Sweden)

    Oldroyd Giles ED

    2011-06-01

    Full Text Available Abstract Background A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. Results In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlights the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Conclusions Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and

  13. Automated Bayesian model development for frequency detection in biological time series.

    Science.gov (United States)

    Granqvist, Emma; Oldroyd, Giles E D; Morris, Richard J

    2011-06-24

    A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlights the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and the requirement for uniformly sampled data. Biological time
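The core of Bayesian frequency detection can be sketched with the classic single-frequency model, in which (for a known noise level) the log posterior over frequency is proportional to the Schuster periodogram. This is a strong simplification of the full method described above, and the series below is simulated:

```python
import math
import random

def periodogram(data, times, freq):
    """Schuster periodogram C(f) = (R^2 + I^2) / N at frequency f."""
    r = sum(d * math.cos(2 * math.pi * freq * t) for d, t in zip(data, times))
    i = sum(d * math.sin(2 * math.pi * freq * t) for d, t in zip(data, times))
    return (r * r + i * i) / len(data)

def log_posterior(data, times, freq, sigma=1.0):
    """Single-frequency model with known noise level sigma:
    log p(f | D) = C(f) / sigma^2 + const."""
    return periodogram(data, times, freq) / sigma ** 2

# Short, noisy test series with a 0.1 Hz oscillation (simulated data)
rng = random.Random(42)
times = [0.5 * k for k in range(60)]  # 30 s of data sampled at 2 Hz
data = [math.sin(2 * math.pi * 0.1 * t) + rng.gauss(0, 0.5) for t in times]

freqs = [0.01 * j for j in range(1, 100)]  # candidate frequencies below Nyquist
best = max(freqs, key=lambda f: log_posterior(data, times, f))
print(best)  # posterior peaks near the true 0.1 Hz
```

Unlike a raw Fourier transform, the posterior view generalizes directly: background trends, unknown noise levels, and multiple frequencies become extra model parameters to marginalize over rather than artifacts to pre-process away.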

  14. Spiritual and ceremonial plants in North America: an assessment of Moerman's ethnobotanical database comparing Residual, Binomial, Bayesian and Imprecise Dirichlet Model (IDM) analysis.

    Science.gov (United States)

    Turi, Christina E; Murch, Susan J

    2013-07-09

    Ethnobotanical research and the study of plants used for rituals, ceremonies and to connect with the spirit world have led to the discovery of many novel psychoactive compounds such as nicotine, caffeine, and cocaine. In North America, spiritual and ceremonial uses of plants are well documented and can be accessed online via the University of Michigan's Native American Ethnobotany Database. The objective of the study was to compare Residual, Bayesian, Binomial and Imprecise Dirichlet Model (IDM) analyses of ritual, ceremonial and spiritual plants in Moerman's ethnobotanical database and to identify genera that may be good candidates for the discovery of novel psychoactive compounds. The database was queried with the following format "Family Name AND Ceremonial OR Spiritual" for 263 North American botanical families. Spiritual and ceremonial flora consisted of 86 families with 517 species belonging to 292 genera. Spiritual taxa were then grouped further into ceremonial medicines and items categories. Residual, Bayesian, Binomial and IDM analyses were performed to identify over- and under-utilized families. The 4 statistical approaches were in good agreement when identifying under-utilized families, but large families (>393 species) were underemphasized by the Binomial, Bayesian and IDM approaches for over-utilization. Residual, Binomial, and IDM analysis identified similar families as over-utilized in the medium (92-392 species) and small (<92 species) classes. The families Apiaceae, Asteraceae, Ericaceae, Pinaceae and Salicaceae were identified as significantly over-utilized as ceremonial medicines in medium and large sized families. Analysis of genera within the Apiaceae and Asteraceae suggests that the genera Ligusticum and Artemisia are good candidates for facilitating the discovery of novel psychoactive compounds. The 4 statistical approaches were not consistent in the selection of over-utilization of flora. Residual analysis revealed overall trends that were supported

  15. Bayesian soft x-ray tomography and MHD mode analysis on HL-2A

    Science.gov (United States)

    Li, Dong; Liu, Yi; Svensson, J.; Liu, Y. Q.; Song, X. M.; Yu, L. M.; Mao, Rui; Fu, B. Z.; Deng, Wei; Yuan, B. S.; Ji, X. Q.; Xu, Yuan; Chen, Wei; Zhou, Yan; Yang, Q. W.; Duan, X. R.; Liu, Yong; HL-2A Team

    2016-03-01

    A Bayesian based tomography method using so-called Gaussian processes (GPs) for the emission model has been applied to the soft x-ray (SXR) diagnostics on HL-2A tokamak. To improve the accuracy of reconstructions, the standard GP is extended to a non-stationary version so that different smoothness between the plasma center and the edge can be taken into account in the algorithm. The uncertainty in the reconstruction arising from measurement errors and incapability can be fully analyzed by the usage of Bayesian probability theory. In this work, the SXR reconstructions by this non-stationary Gaussian processes tomography (NSGPT) method have been compared with the equilibrium magnetic flux surfaces, generally achieving a satisfactory agreement in terms of both shape and position. In addition, singular-value-decomposition (SVD) and Fast Fourier Transform (FFT) techniques have been applied for the analysis of SXR and magnetic diagnostics, in order to explore the spatial and temporal features of the saturated long-lived magnetohydrodynamics (MHD) instability induced by energetic particles during neutral beam injection (NBI) on HL-2A. The result shows that this ideal internal kink instability has a dominant m/n  =  1/1 mode structure along with a harmonics m/n  =  2/2, which are coupled near the q  =  1 surface with a rotation frequency of 12 kHz.

  16. Bayesian Sensitivity Analysis of a Nonlinear Dynamic Factor Analysis Model with Nonparametric Prior and Possible Nonignorable Missingness.

    Science.gov (United States)

    Tang, Niansheng; Chow, Sy-Miin; Ibrahim, Joseph G; Zhu, Hongtu

    2017-12-01

    Many psychological concepts are unobserved and usually represented as latent factors apprehended through multiple observed indicators. When multiple-subject multivariate time series data are available, dynamic factor analysis models with random effects offer one way of modeling patterns of within- and between-person variations by combining factor analysis and time series analysis at the factor level. Using the Dirichlet process (DP) as a nonparametric prior for individual-specific time series parameters further allows the distributional forms of these parameters to deviate from commonly imposed (e.g., normal or other symmetric) functional forms, arising as a result of these parameters' restricted ranges. Given the complexity of such models, a thorough sensitivity analysis is critical but computationally prohibitive. We propose a Bayesian local influence method that allows for simultaneous sensitivity analysis of multiple modeling components within a single fitting of the model of choice. Five illustrations and an empirical example are provided to demonstrate the utility of the proposed approach in facilitating the detection of outlying cases and common sources of misspecification in dynamic factor analysis models, as well as identification of modeling components that are sensitive to changes in the DP prior specification.

  17. BAYESIAN APPROACH TO THE PROCESS OF IDENTIFICATION OF THE DETERMINANTS OF INNOVATIVENESS

    Directory of Open Access Journals (Sweden)

    Marta Czyżewska

    2014-08-01

    Full Text Available Bayesian belief networks are applied in determining the most important factors of the innovativeness level of national economies. The paper is divided into two parts. The first presents the basic theory of Bayesian networks, whereas in the second, belief networks have been generated by an in-house developed computer system called BeliefSEEKER, which was implemented to determine the factors influencing the innovativeness level of national economies. Qualitative analysis of the generated belief networks provided a way to define a set of the most important dimensions influencing the innovativeness level of economies, and then the indicators that form these dimensions. It has been shown that Bayesian networks are a very effective method for multidimensional analysis and for forming conclusions and recommendations regarding the strength of each innovative determinant influencing the overall performance of a country’s economy.

  18. Applying Bayesian belief networks in rapid response situations

    Energy Technology Data Exchange (ETDEWEB)

    Gibson, William L. [Los Alamos National Laboratory]; Leishman, Deborah A. [Los Alamos National Laboratory]; Van Eeckhout, Edward [Los Alamos National Laboratory]

    2008-01-01

    The authors have developed an enhanced Bayesian analysis tool called the Integrated Knowledge Engine (IKE) for monitoring and surveillance. The enhancements are suited for Rapid Response Situations where decisions must be made based on uncertain and incomplete evidence from many diverse and heterogeneous sources. The enhancements extend the probabilistic results of the traditional Bayesian analysis by (1) better quantifying uncertainty arising from model parameter uncertainty and uncertain evidence, (2) optimizing the collection of evidence to reach conclusions more quickly, and (3) allowing the analyst to determine the influence of the remaining evidence that cannot be obtained in the time allowed. These extended features give the analyst and decision maker a better comprehension of the adequacy of the acquired evidence and hence the quality of the hurried decisions. They also describe two example systems where the above features are highlighted.

  19. 3rd Bayesian Young Statisticians Meeting

    CERN Document Server

    Lanzarone, Ettore; Villalobos, Isadora; Mattei, Alessandra

    2017-01-01

    This book is a selection of peer-reviewed contributions presented at the third Bayesian Young Statisticians Meeting, BAYSM 2016, Florence, Italy, June 19-21. The meeting provided a unique opportunity for young researchers, M.S. students, Ph.D. students, and postdocs dealing with Bayesian statistics to connect with the Bayesian community at large, to exchange ideas, and to network with others working in the same field. The contributions develop and apply Bayesian methods in a variety of fields, ranging from the traditional (e.g., biostatistics and reliability) to the most innovative ones (e.g., big data and networks).

  20. Bus Route Design with a Bayesian Network Analysis of Bus Service Revenues

    Directory of Open Access Journals (Sweden)

    Yi Liu

    2018-01-01

    Full Text Available A Bayesian network is used to estimate revenues of bus services in consideration of the effect of bus travel demands, passenger transport distances, and so on. In this research, area X in Beijing has been selected as the study area because of its relatively high bus travel demand and yet unsatisfactory bus services. It is suggested that the proposed Bayesian network approach is able to rationally predict the probabilities of different revenues of various route services, from the perspectives of both satisfying passenger demand and decreasing bus operation cost. In this way, the existing bus routes in the studied area can be optimized for their most probable high revenues.

  1. A Bayesian alternative for multi-objective ecohydrological model specification

    Science.gov (United States)

    Tang, Yating; Marshall, Lucy; Sharma, Ashish; Ajami, Hoori

    2018-01-01

    Recent studies have identified the importance of vegetation processes in terrestrial hydrologic systems. Process-based ecohydrological models combine hydrological, physical, biochemical and ecological processes of the catchments, and as such are generally more complex and parametric than conceptual hydrological models. Thus, appropriate calibration objectives and model uncertainty analysis are essential for ecohydrological modeling. In recent years, Bayesian inference has become one of the most popular tools for quantifying the uncertainties in hydrological modeling with the development of Markov chain Monte Carlo (MCMC) techniques. The Bayesian approach offers an appealing alternative to traditional multi-objective hydrologic model calibrations by defining proper prior distributions that can be considered analogous to the ad hoc weighting often prescribed in multi-objective calibration. Our study aims to develop appropriate prior distributions and likelihood functions that minimize the model uncertainties and bias within a Bayesian ecohydrological modeling framework based on a traditional Pareto-based model calibration technique. In our study, a Pareto-based multi-objective optimization and a formal Bayesian framework are implemented in a conceptual ecohydrological model that combines a hydrological model (HYMOD) and a modified Bucket Grassland Model (BGM). Simulations focused on one objective (streamflow/LAI) and multiple objectives (streamflow and LAI) with different emphasis defined via the prior distribution of the model error parameters. Results show more reliable outputs for both predicted streamflow and LAI using Bayesian multi-objective calibration with specified prior distributions for error parameters based on results from the Pareto front in the ecohydrological modeling. The methodology implemented here provides insight into the usefulness of multi-objective Bayesian calibration for ecohydrologic systems and the importance of appropriate prior specification.

  2. Integrated data analysis of fusion diagnostics by means of the Bayesian probability theory

    International Nuclear Information System (INIS)

    Fischer, R.; Dinklage, A.

    2004-01-01

    Integrated data analysis (IDA) of fusion diagnostics is the combination of heterogeneous diagnostics to obtain validated physical results. Benefits from the integrated approach result from a systematic use of interdependencies; in that sense IDA optimizes the extraction of information from sets of different data. For that purpose IDA requires a systematic and formalized error analysis of all (statistical and systematic) uncertainties involved in each diagnostic. Bayesian probability theory allows for a systematic combination of all information entering the diagnostic model by considering all uncertainties of the measured data, the calibration measurements, and the physical model. Prior physics knowledge on model parameters can be included. Handling of systematic errors is provided. A central goal of the integration of redundant or complementary diagnostics is to provide information to resolve inconsistencies by exploiting interdependencies. A comparable analysis of sets of diagnostics (meta-diagnostics) is performed by combining statistical and systematic uncertainties with model parameters and model uncertainties. Diagnostics improvement, experimental optimization, and the design of meta-diagnostics are also discussed.

  3. Spatial Bayesian latent factor regression modeling of coordinate-based meta-analysis data.

    Science.gov (United States)

    Montagna, Silvia; Wager, Tor; Barrett, Lisa Feldman; Johnson, Timothy D; Nichols, Thomas E

    2018-03-01

    Now over 20 years old, functional MRI (fMRI) has a large and growing literature that is best synthesised with meta-analytic tools. As most authors do not share image data, only the peak activation coordinates (foci) reported in the article are available for Coordinate-Based Meta-Analysis (CBMA). Neuroimaging meta-analysis is used to (i) identify areas of consistent activation; and (ii) build a predictive model of task type or cognitive process for new studies (reverse inference). To simultaneously address these aims, we propose a Bayesian point process hierarchical model for CBMA. We model the foci from each study as a doubly stochastic Poisson process, where the study-specific log intensity function is characterized as a linear combination of a high-dimensional basis set. A sparse representation of the intensities is guaranteed through latent factor modeling of the basis coefficients. Within our framework, it is also possible to account for the effect of study-level covariates (meta-regression), significantly expanding the capabilities of the current neuroimaging meta-analysis methods available. We apply our methodology to synthetic data and neuroimaging meta-analysis datasets. © 2017, The International Biometric Society.

  4. Spatial Bayesian Latent Factor Regression Modeling of Coordinate-based Meta-analysis Data

    Science.gov (United States)

    Montagna, Silvia; Wager, Tor; Barrett, Lisa Feldman; Johnson, Timothy D.; Nichols, Thomas E.

    2017-01-01

    Now over 20 years old, functional MRI (fMRI) has a large and growing literature that is best synthesised with meta-analytic tools. As most authors do not share image data, only the peak activation coordinates (foci) reported in the paper are available for Coordinate-Based Meta-Analysis (CBMA). Neuroimaging meta-analysis is used to 1) identify areas of consistent activation; and 2) build a predictive model of task type or cognitive process for new studies (reverse inference). To simultaneously address these aims, we propose a Bayesian point process hierarchical model for CBMA. We model the foci from each study as a doubly stochastic Poisson process, where the study-specific log intensity function is characterised as a linear combination of a high-dimensional basis set. A sparse representation of the intensities is guaranteed through latent factor modeling of the basis coefficients. Within our framework, it is also possible to account for the effect of study-level covariates (meta-regression), significantly expanding the capabilities of the current neuroimaging meta-analysis methods available. We apply our methodology to synthetic data and neuroimaging meta-analysis datasets. PMID:28498564

  5. Risk Analysis on Leakage Failure of Natural Gas Pipelines by Fuzzy Bayesian Network with a Bow-Tie Model

    Directory of Open Access Journals (Sweden)

    Xian Shan

    2017-01-01

    Full Text Available Pipeline is the major mode of natural gas transportation. Leakage of natural gas pipelines may cause explosions and fires, resulting in casualties, environmental damage, and material loss. Efficient risk analysis is of great significance for preventing and mitigating such potential accidents. The objective of this study is to present a practical risk assessment method based on the Bow-tie model and Bayesian networks for risk analysis of natural gas pipeline leakage. The first step is to identify the potential risk factors and consequences of the failure; the Bow-tie model is then constructed, quantitative Bayesian network analysis is used to find the weak links in the system, and the effect of control measures on the accident rate is predicted. To deal with the uncertainty in the probabilities of the basic events, a fuzzy logic method is used. Results of a case study show that the most likely causes of natural gas pipeline leakage are third parties ignoring signage, unclear signage, overload, and design defects of auxiliaries. Once a leak occurs, it is most likely to result in fire and explosion. Corresponding measures taken in time will minimize the severity of such accidents.
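    In its simplest independent-events form, the quantitative side of a bow-tie analysis reduces to OR gates combining causes on the fault-tree side and AND gates combining a top event with enabling conditions on the consequence side. The sketch below uses made-up probabilities, not values from the study:

```python
def p_or(*ps):
    """Probability that at least one of several independent basic events occurs."""
    q = 1.0
    for p in ps:
        q *= 1.0 - p
    return 1.0 - q

def p_and(*ps):
    """Probability that all of several independent events occur."""
    out = 1.0
    for p in ps:
        out *= p
    return out

# Illustrative placeholder probabilities for three causes of a leak
# (e.g. corrosion, overload, design defect) and an ignition condition.
p_leak = p_or(0.02, 0.01, 0.005)
p_fire = p_and(p_leak, 0.1)  # leak AND ignition
print(round(p_leak, 5), round(p_fire, 6))
```

    The fuzzy extension in the paper replaces these point probabilities of basic events with fuzzy numbers; the gate arithmetic stays the same.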

  6. Coalescence and compression in centrifuged emulsions studied with in situ optical microscopy

    NARCIS (Netherlands)

    Krebs, T.; Ershov, D.S.; Schroën, C.G.P.H.; Boom, R.M.

    2013-01-01

    We report an experimental method to investigate droplet dynamics in centrifuged emulsions and its application to study droplet compression and coalescence. The experimental setup permits in situ monitoring of an ensemble of droplets in a centrifuged monolayer of monodisperse emulsion droplets using optical microscopy.

  7. Bayesian selection of misspecified models is overconfident and may cause spurious posterior probabilities for phylogenetic trees.

    Science.gov (United States)

    Yang, Ziheng; Zhu, Tianqi

    2018-02-20

    The Bayesian method is noted to produce spuriously high posterior probabilities for phylogenetic trees in analysis of large datasets, but the precise reasons for this overconfidence are unknown. In general, the performance of Bayesian selection of misspecified models is poorly understood, even though this is of great scientific interest since models are never true in real data analysis. Here we characterize the asymptotic behavior of Bayesian model selection and show that when the competing models are equally wrong, Bayesian model selection exhibits surprising and polarized behaviors in large datasets, supporting one model with full force while rejecting the others. If one model is slightly less wrong than the other, the less wrong model will eventually win when the amount of data increases, but the method may become overconfident before it becomes reliable. We suggest that this extreme behavior may be a major factor for the spuriously high posterior probabilities for evolutionary trees. The philosophical implications of our results to the application of Bayesian model selection to evaluate opposing scientific hypotheses are yet to be explored, as are the behaviors of non-Bayesian methods in similar situations.
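    The polarized behavior described here is easy to reproduce with two fixed-parameter Bernoulli models that are equally wrong about a fair coin; this is a minimal illustration of the phenomenon, not the authors' code.

```python
import math
import random

def model_posterior(data, p1, p2):
    """Posterior probability of model 1 (success prob p1) versus
    model 2 (success prob p2), with equal prior model probabilities,
    for a sequence of Bernoulli outcomes."""
    ll1 = sum(math.log(p1 if x else 1.0 - p1) for x in data)
    ll2 = sum(math.log(p2 if x else 1.0 - p2) for x in data)
    m = max(ll1, ll2)  # normalize in log space for numerical stability
    w1, w2 = math.exp(ll1 - m), math.exp(ll2 - m)
    return w1 / (w1 + w2)

random.seed(0)
# True process: fair coin (p = 0.5); both candidate models are equally wrong.
data = [random.random() < 0.5 for _ in range(2000)]
for n in (10, 100, 2000):
    print(n, round(model_posterior(data[:n], 0.45, 0.55), 4))
```

    As the sample grows, the posterior model probability tends to lurch toward 0 or 1 depending on the sample's chance excess of heads or tails, even though neither model is true.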

  8. Applied Bayesian modelling

    CERN Document Server

    Congdon, Peter

    2014-01-01

    This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBUGS.

  9. Coalescence and 2.7 K black body distortion in baryon symmetric Big Bang Cosmology

    International Nuclear Information System (INIS)

    Ramani, A.; Puget, J.L.

    1976-01-01

    We discuss here the efficiency of coalescence during the late phases of a baryon symmetric Big Bang Cosmology. We show that during the radiative period, coalescence cannot be as efficient as was stated in a previous paper. During the matter dominated period, matter and antimatter might be separated on the scale of clusters of galaxies, but only at the expense of substantial distortions of the 2.7 K black body background radiation. We compute lower limits on these distortions as functions of the density of matter in the universe and show that only in the case of a very dilute universe can these values be reconciled with experimental results. (orig.)

  10. Bayesian analysis for exponential random graph models using the adaptive exchange sampler

    KAUST Repository

    Jin, Ick Hoon

    2013-01-01

    Exponential random graph models have been widely used in social network analysis. However, these models are extremely difficult to handle from a statistical viewpoint, because of the existence of intractable normalizing constants. In this paper, we consider a fully Bayesian analysis for exponential random graph models using the adaptive exchange sampler, which solves the issue of intractable normalizing constants encountered in Markov chain Monte Carlo (MCMC) simulations. The adaptive exchange sampler can be viewed as an MCMC extension of the exchange algorithm, and it generates auxiliary networks via an importance sampling procedure from an auxiliary Markov chain running in parallel. The convergence of this algorithm is established under mild conditions. The adaptive exchange sampler is illustrated using a few social networks, including the Florentine business network, the molecule synthetic network, and the dolphins network. The results indicate that the adaptive exchange algorithm can produce more accurate estimates than approximate exchange algorithms, while maintaining the same computational efficiency.

  11. Coalescence in PLA-PBAT blends under shear flow: Effects of blend preparation and PLA molecular weight

    Energy Technology Data Exchange (ETDEWEB)

    Nofar, M. [Center for High Performance Polymer and Composite Systems (CREPEC), Chemical Engineering Department, Polytechnique Montreal, Montreal, Quebec H3T 1J4, Canada and CREPEC, Department of Chemical Engineering, McGill University, Montreal, Quebec H3A 2B2 (Canada); Heuzey, M. C.; Carreau, P. J., E-mail: pierre.carreau@polymtl.ca [Center for High Performance Polymer and Composite Systems (CREPEC), Chemical Engineering Department, Polytechnique Montreal, Montreal, Quebec H3T 1J4 (Canada); Kamal, M. R. [CREPEC, Department of Chemical Engineering, McGill University, Montreal, Quebec H3A 2B2 (Canada); Randall, J. [NatureWorks LLC, 15305 Minnetonka Boulevard, Minnetonka, Minnesota 55345 (United States)

    2016-07-15

    Blends containing 75 wt. % of an amorphous polylactide (PLA) with two different molecular weights and 25 wt. % of a poly[(butylene adipate)-co-terephthalate] (PBAT) were prepared using either a Brabender batch mixer or a twin-screw extruder. These compounds were selected because blending PLA with PBAT can overcome various drawbacks of PLA such as its brittleness and processability limitations. In this study, we investigated the effects of varying the molecular weight of the PLA matrix and of two different mixing processes on the blend morphology and, further, on droplet coalescence during shearing. The rheological properties of these blends were investigated and the interfacial properties were analyzed using the Palierne emulsion model. Droplet coalescence was investigated by applying shear flows of 0.05 and 0.20 s⁻¹ at a fixed strain of 60. Subsequently, small amplitude oscillatory shear tests were conducted to investigate changes in the viscoelastic properties. The morphology of the blends was also examined using scanning electron microscope (SEM) micrographs. It was observed that the PBAT droplets were much smaller when twin-screw extrusion was used for the blend preparation. Shearing at 0.05 s⁻¹ induced significant droplet coalescence in all blends, but coalescence and changes in the viscoelastic properties were much more pronounced for the PLA-PBAT blend based on a lower molecular weight PLA. The viscoelastic responses were also somewhat affected by the thermal degradation of the PLA matrix during the experiments.

  12. Coalescence in PLA-PBAT blends under shear flow: Effects of blend preparation and PLA molecular weight

    International Nuclear Information System (INIS)

    Nofar, M.; Heuzey, M. C.; Carreau, P. J.; Kamal, M. R.; Randall, J.

    2016-01-01

    Blends containing 75 wt. % of an amorphous polylactide (PLA) with two different molecular weights and 25 wt. % of a poly[(butylene adipate)-co-terephthalate] (PBAT) were prepared using either a Brabender batch mixer or a twin-screw extruder. These compounds were selected because blending PLA with PBAT can overcome various drawbacks of PLA such as its brittleness and processability limitations. In this study, we investigated the effects of varying the molecular weight of the PLA matrix and of two different mixing processes on the blend morphology and, further, on droplet coalescence during shearing. The rheological properties of these blends were investigated and the interfacial properties were analyzed using the Palierne emulsion model. Droplet coalescence was investigated by applying shear flows of 0.05 and 0.20 s⁻¹ at a fixed strain of 60. Subsequently, small amplitude oscillatory shear tests were conducted to investigate changes in the viscoelastic properties. The morphology of the blends was also examined using scanning electron microscope (SEM) micrographs. It was observed that the PBAT droplets were much smaller when twin-screw extrusion was used for the blend preparation. Shearing at 0.05 s⁻¹ induced significant droplet coalescence in all blends, but coalescence and changes in the viscoelastic properties were much more pronounced for the PLA-PBAT blend based on a lower molecular weight PLA. The viscoelastic responses were also somewhat affected by the thermal degradation of the PLA matrix during the experiments.

  13. Plug & Play object oriented Bayesian networks

    DEFF Research Database (Denmark)

    Bangsø, Olav; Flores, J.; Jensen, Finn Verner

    2003-01-01

    been shown to be quite suitable for dynamic domains as well. However, processing object oriented Bayesian networks in practice does not take advantage of their modular structure. Normally the object oriented Bayesian network is transformed into a Bayesian network, and inference is performed ... dynamic domains. The communication needed between instances is achieved by means of a fill-in propagation scheme.

  14. Bayesian conditional-independence modeling of the AIDS epidemic in England and Wales

    Science.gov (United States)

    Gilks, Walter R.; De Angelis, Daniela; Day, Nicholas E.

    We describe the use of conditional-independence modeling, Bayesian inference and Markov chain Monte Carlo, to model and project the HIV-AIDS epidemic in homosexual/bisexual males in England and Wales. Complexity in this analysis arises through selectively missing data, indirectly observed underlying processes, and measurement error. Our emphasis is on presentation and discussion of the concepts, not on the technicalities of this analysis, which can be found elsewhere [D. De Angelis, W.R. Gilks, N.E. Day, Bayesian projection of the acquired immune deficiency syndrome epidemic (with discussion), Applied Statistics, in press].

  15. Micronutrients in HIV: a Bayesian meta-analysis.

    Directory of Open Access Journals (Sweden)

    George M Carter

    Full Text Available Approximately 28.5 million people living with HIV are eligible for treatment (CD4 < 500) but currently have no access to antiretroviral therapy. Reduced serum levels of micronutrients are common in HIV disease. Micronutrient supplementation (MNS) may mitigate disease progression and mortality. We synthesized evidence on the effect of micronutrient supplementation on mortality and the rate of disease progression in HIV disease. We searched the MEDLINE, EMBASE, Cochrane Central, AMED and CINAHL databases through December 2014, without language restriction, for studies of greater than 3 micronutrients versus any or no comparator. We built a hierarchical Bayesian random effects model to synthesize the results. Inferences are based on the posterior distribution of the population effects; posterior distributions were approximated by Markov chain Monte Carlo in OpenBUGS. From 2166 initial references, we selected 49 studies for full review and identified eight reporting on disease progression and/or mortality. Bayesian synthesis of data from 2,249 adults in three studies estimated the relative risk of disease progression in subjects on MNS vs. control as 0.62 (95% credible interval 0.37, 0.96). The median number needed to treat is 8.4 (4.8, 29.9) and the Bayes factor 53.4. Based on data from 4,095 adults reporting mortality in 7 randomized controlled studies, the RR was 0.84 (0.38, 1.85) and the NNT 25 (4.3, ∞). MNS significantly and substantially slows disease progression in HIV+ adults not on ARV, and possibly reduces mortality. Micronutrient supplements are effective in reducing progression, with a posterior probability of 97.9%. Considering the low cost of MNS and its lack of adverse effects, MNS should be standard of care for HIV+ adults not yet on ARV.
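    The synthesis in this study used a hierarchical random-effects model fitted by MCMC in OpenBUGS. As a much simpler stand-in, a fixed-effect Bayesian pooling of study-level log relative risks with known variances has a closed-form normal posterior. The sketch below uses hypothetical study inputs, not the trial data above:

```python
import math

def pooled_log_rr(log_rrs, variances, prior_mean=0.0, prior_var=100.0):
    """Fixed-effect Bayesian synthesis of study-level log relative risks
    under normal likelihoods with known variances and a vague normal prior.
    Returns the posterior mean and variance of the pooled log-RR."""
    prec = 1.0 / prior_var
    mean_times_prec = prior_mean / prior_var
    for lr, v in zip(log_rrs, variances):
        prec += 1.0 / v
        mean_times_prec += lr / v
    post_var = 1.0 / prec
    return mean_times_prec * post_var, post_var

# Hypothetical study estimates (log scale) and sampling variances.
log_rrs = [math.log(0.7), math.log(0.55), math.log(0.8)]
variances = [0.04, 0.09, 0.02]
m, v = pooled_log_rr(log_rrs, variances)
rr = math.exp(m)
lo, hi = math.exp(m - 1.96 * math.sqrt(v)), math.exp(m + 1.96 * math.sqrt(v))
print(f"pooled RR ≈ {rr:.2f} (95% interval {lo:.2f} to {hi:.2f})")
```

    The hierarchical model in the paper additionally places a prior on between-study heterogeneity, which widens the interval relative to this fixed-effect sketch.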

  16. Coalescence collision of liquid drops I: Off-center collisions of equal-size drops

    Directory of Open Access Journals (Sweden)

    Alejandro Acevedo-Malavé

    2011-09-01

    Full Text Available The Smoothed Particle Hydrodynamics (SPH) method is used here to model off-center collisions of equal-size liquid drops in three-dimensional space. In this study the Weber number is calculated for several conditions of the droplet dynamics, and the velocity vector fields formed inside the drops during the collision process are shown. For permanent coalescence, the evolution of the kinetic and internal energy is shown, as well as the approach to equilibrium of the resulting drop. Depending on the Weber number, three possible outcomes of the droplet collision are obtained: permanent coalescence, flocculation and fragmentation. The fragmentation phenomena are modeled and the formation of small satellite drops can be seen. The ligament that is formed follows the “end pinching” mechanism and is transformed into a flat structure.

  17. Flood quantile estimation at ungauged sites by Bayesian networks

    Science.gov (United States)

    Mediero, L.; Santillán, D.; Garrote, L.

    2012-04-01

    Estimating flood quantiles at a site for which no observed measurements are available is essential for water resources planning and management. Ungauged sites have no observations about the magnitude of floods, but some site and basin characteristics are known. The most common technique used is multiple regression analysis, which relates physical and climatic basin characteristics to flood quantiles. Regression equations are fitted from flood frequency data and basin characteristics at gauged sites. Regression equations are a rigid technique that assumes linear relationships between variables and cannot take measurement errors into account. In addition, the prediction intervals are estimated in a very simplistic way from the variance of the residuals of the estimated model. Bayesian networks are a probabilistic computational structure taken from the field of Artificial Intelligence, which has been widely and successfully applied to many scientific fields like medicine and informatics, but application to the field of hydrology is recent. Bayesian networks infer the joint probability distribution of several related variables from observations through nodes, which represent random variables, and links, which represent causal dependencies between them. A Bayesian network is more flexible than regression equations, as it captures non-linear relationships between variables. In addition, the probabilistic nature of Bayesian networks allows the different sources of estimation uncertainty to be taken into account, as they give a probability distribution as result. A homogeneous region in the Tagus Basin was selected as case study. A regression equation was fitted taking the basin area, the annual maximum 24-hour rainfall for a given recurrence interval and the mean height as explanatory variables. Flood quantiles at ungauged sites were estimated by Bayesian networks. Bayesian networks need to be learnt from a huge enough data set. As observational data are reduced, a
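    Inference in a small discrete Bayesian network of the kind described can be done by brute-force enumeration of the joint distribution. The sketch below uses invented probability tables and a deliberately tiny structure (two parents, one child), not values fitted to the Tagus data:

```python
# Tiny discrete Bayesian network: Area -> Flood <- Rainfall.
# All probability tables below are made-up illustrations, not fitted values.
p_area = {"small": 0.6, "large": 0.4}
p_rain = {"low": 0.7, "high": 0.3}
# P(flood quantile class = "high" | area, rainfall)
p_flood_high = {
    ("small", "low"): 0.05,
    ("small", "high"): 0.30,
    ("large", "low"): 0.20,
    ("large", "high"): 0.70,
}

def query_flood_high(evidence=None):
    """P(flood = high | evidence) by enumerating the joint distribution."""
    evidence = evidence or {}
    num = den = 0.0
    for a, pa in p_area.items():
        if evidence.get("area", a) != a:
            continue
        for r, pr in p_rain.items():
            if evidence.get("rain", r) != r:
                continue
            w = pa * pr
            num += w * p_flood_high[(a, r)]
            den += w
    return num / den

print(query_flood_high())                  # prior probability of a high flood quantile
print(query_flood_high({"rain": "high"}))  # updated once high rainfall is observed
```

    Because the answer is a full conditional distribution rather than a point estimate, the estimation uncertainty mentioned in the abstract comes out of the query directly.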

  18. Bayesian Plackett-Luce Mixture Models for Partially Ranked Data.

    Science.gov (United States)

    Mollica, Cristina; Tardella, Luca

    2017-06-01

    The elicitation of an ordinal judgment on multiple alternatives is often required in many psychological and behavioral experiments to investigate preference/choice orientation of a specific population. The Plackett-Luce model is one of the most popular and frequently applied parametric distributions to analyze rankings of a finite set of items. The present work introduces a Bayesian finite mixture of Plackett-Luce models to account for unobserved sample heterogeneity of partially ranked data. We describe an efficient way to incorporate the latent group structure in the data augmentation approach and the derivation of existing maximum likelihood procedures as special instances of the proposed Bayesian method. Inference can be conducted with the combination of the Expectation-Maximization algorithm for maximum a posteriori estimation and the Gibbs sampling iterative procedure. We additionally investigate several Bayesian criteria for selecting the optimal mixture configuration and describe diagnostic tools for assessing the fitness of ranking distributions conditionally and unconditionally on the number of ranked items. The utility of the novel Bayesian parametric Plackett-Luce mixture for characterizing sample heterogeneity is illustrated with several applications to simulated and real preference ranked data. We compare our method with the frequentist approach and a Bayesian nonparametric mixture model, both assuming the Plackett-Luce model as a mixture component. Our analysis on real datasets reveals the importance of an accurate diagnostic check for an appropriate in-depth understanding of the heterogeneous nature of the partial ranking data.
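    The Plackett-Luce likelihood underlying this mixture model assigns each complete ranking a probability built from successive choices among the items not yet ranked. A minimal sketch with illustrative worth parameters (not fitted values):

```python
import math

def plackett_luce_logprob(ranking, worth):
    """Log-probability of a complete ranking under the Plackett-Luce model.
    `ranking` lists items from most to least preferred; `worth` maps each
    item to its positive support (worth) parameter."""
    remaining = list(ranking)
    logp = 0.0
    for item in ranking:
        total = sum(worth[j] for j in remaining)
        logp += math.log(worth[item] / total)  # choice among remaining items
        remaining.remove(item)
    return logp

worth = {"A": 3.0, "B": 2.0, "C": 1.0}  # illustrative worths
print(math.exp(plackett_luce_logprob(["A", "B", "C"], worth)))
```

    A finite mixture replaces the single `worth` table with one table per latent group plus mixing weights, which is what the Gibbs sampler in the paper infers.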

  19. Bayesian linear regression : different conjugate models and their (in)sensitivity to prior-data conflict

    NARCIS (Netherlands)

    Walter, G.M.; Augustin, Th.; Kneib, Thomas; Tutz, Gerhard

    2010-01-01

    The paper is concerned with Bayesian analysis under prior-data conflict, i.e. the situation when observed data are rather unexpected under the prior (and the sample size is not large enough to eliminate the influence of the prior). Two approaches for Bayesian linear regression modeling based on
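    The prior-data conflict discussed here shows up already in the simplest conjugate case, a normal mean with known noise variance: the posterior variance depends only on the sample size and the prior precision, not on how far the data sit from the prior. A small sketch with hypothetical numbers:

```python
def posterior_mean_var(prior_mean, prior_var, data, noise_var):
    """Conjugate normal update for an unknown mean with known noise variance."""
    n = len(data)
    post_prec = 1.0 / prior_var + n / noise_var
    post_mean = (prior_mean / prior_var + sum(data) / noise_var) / post_prec
    return post_mean, 1.0 / post_prec

# Prior-data conflict: the data mean (about 10) is far from the prior mean (0),
# yet the conjugate posterior simply averages the two by precision; the
# reported posterior variance is the same as if prior and data had agreed.
data = [10.1, 9.8, 10.3, 9.9]
m, v = posterior_mean_var(0.0, 1.0, data, 1.0)
print(round(m, 3), round(v, 3))
```

    This insensitivity of the posterior spread to conflict is exactly the behavior the paper contrasts across different conjugate model choices.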

  20. Bayesian versus frequentist statistical inference for investigating a one-off cancer cluster reported to a health department

    Directory of Open Access Journals (Sweden)

    Wills Rachael A

    2009-05-01

    Full Text Available Background The problem of silent multiple comparisons is one of the most difficult statistical problems faced by scientists. It is a particular problem for investigating a one-off cancer cluster reported to a health department because any one of hundreds, or possibly thousands, of neighbourhoods, schools, or workplaces could have reported a cluster, which could have been for any one of several types of cancer or any one of several time periods. Methods This paper contrasts the frequentist approach with a Bayesian approach for dealing with silent multiple comparisons in the context of a one-off cluster reported to a health department. Two published cluster investigations were re-analysed using the Dunn-Sidak method to adjust frequentist p-values and confidence intervals for silent multiple comparisons. Bayesian methods were based on the Gamma distribution. Results Bayesian analysis with non-informative priors produced results similar to the frequentist analysis, and suggested that both clusters represented a statistical excess. In the frequentist framework, the statistical significance of both clusters was extremely sensitive to the number of silent multiple comparisons, which can only ever be a subjective "guesstimate". The Bayesian approach is also subjective: whether there is an apparent statistical excess depends on the specified prior. Conclusion In cluster investigations, the frequentist approach is just as subjective as the Bayesian approach, but the Bayesian approach is less ambitious in that it treats the analysis as a synthesis of data and personal judgements (possibly poor ones), rather than objective reality. Bayesian analysis is (arguably) a useful tool to support complicated decision-making, because it makes the uncertainty associated with silent multiple comparisons explicit.
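    Both sides of the comparison reduce to small calculations: the Dunn-Sidak adjustment of a p-value for k silent comparisons, and the conjugate Gamma update for a Poisson rate ratio. The sketch below uses hypothetical counts and an assumed Gamma(0.5, 0.5) prior, not the published cluster data:

```python
def sidak_adjust(p, k):
    """Dunn-Sidak adjustment of a p-value for k silent multiple comparisons."""
    return 1.0 - (1.0 - p) ** k

def gamma_poisson_update(alpha, beta, observed, expected):
    """Conjugate Gamma(alpha, beta) update for a Poisson rate ratio,
    given observed cases and the expected count from reference rates."""
    return alpha + observed, beta + expected

# A nominally significant cluster p-value dissolves once we allow for the
# many neighbourhoods that could equally well have reported one.
print(round(sidak_adjust(0.01, 200), 3))  # hypothetical numbers

# Posterior for the rate ratio after seeing 12 cases where 5.0 were expected,
# starting from a vague Gamma(0.5, 0.5) prior (an assumed choice).
a, b = gamma_poisson_update(0.5, 0.5, 12, 5.0)
print(a, b, "posterior mean rate ratio:", round(a / b, 2))
```

    The frequentist answer swings with the guesstimated k, while the Bayesian answer swings with the prior; the code makes both dependencies explicit.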