WorldWideScience

Sample records for wmap observations bayesian

  1. Bayesian analysis of anisotropic cosmologies: Bianchi VIIh and WMAP

    Science.gov (United States)

    McEwen, J. D.; Josset, T.; Feeney, S. M.; Peiris, H. V.; Lasenby, A. N.

    2013-12-01

    We perform a definitive analysis of Bianchi VIIh cosmologies with Wilkinson Microwave Anisotropy Probe (WMAP) observations of the cosmic microwave background (CMB) temperature anisotropies. Bayesian analysis techniques are developed to study anisotropic cosmologies using full-sky and partial-sky masked CMB temperature data. We apply these techniques to analyse the full-sky internal linear combination (ILC) map and a partial-sky masked W-band map of WMAP 9 yr observations. In addition to the physically motivated Bianchi VIIh model, we examine phenomenological models considered in previous studies, in which the Bianchi VIIh parameters are decoupled from the standard cosmological parameters. In the two phenomenological models considered, Bayes factors of 1.7 and 1.1 units of log-evidence favouring a Bianchi component are found in full-sky ILC data. The corresponding best-fitting Bianchi maps recovered are similar for both phenomenological models and are very close to those found in previous studies using earlier WMAP data releases. However, no evidence for a phenomenological Bianchi component is found in the partial-sky W-band data. In the physical Bianchi VIIh model, we find no evidence for a Bianchi component: WMAP data thus do not favour Bianchi VIIh cosmologies over the standard Λ cold dark matter (ΛCDM) cosmology. It is not possible to discount Bianchi VIIh cosmologies in favour of ΛCDM completely, but we are able to constrain the vorticity of physical Bianchi VIIh cosmologies at (ω/H)₀ < 8.6 × 10⁻¹⁰ with 95 per cent confidence.
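
    The Bayes factors quoted above are differences in log-evidence between the Bianchi-augmented and ΛCDM-only models. As a minimal illustration of how such a log-evidence difference maps onto betting odds (the evidence values below are invented, not taken from the paper's chains):

```python
import numpy as np

# Hypothetical log-evidences for the two competing models (assumed values for
# illustration only; the abstract quotes only their difference, e.g. ~1.7).
ln_Z_bianchi = -5632.4   # model with a phenomenological Bianchi component
ln_Z_lcdm    = -5634.1   # standard LCDM model

delta_lnZ = ln_Z_bianchi - ln_Z_lcdm   # "units of log-evidence" as in the abstract
bayes_factor = np.exp(delta_lnZ)       # posterior odds for equal prior model probabilities

print(f"Delta ln Z = {delta_lnZ:.1f}  ->  Bayes factor ~ {bayes_factor:.1f} : 1")
```

    A difference of 1.7 units of log-evidence corresponds to odds of only about 5:1, i.e. weak evidence on the usual Jeffreys scale.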

  2. Bayesian Analysis of White Noise Levels in the Five-Year WMAP Data

    Science.gov (United States)

    Groeneboom, N. E.; Eriksen, H. K.; Gorski, K.; Huey, G.; Jewell, J.; Wandelt, B.

    2009-09-01

    We develop a new Bayesian method for estimating white noise levels in CMB sky maps, and apply this algorithm to the five-year Wilkinson Microwave Anisotropy Probe (WMAP) data. We assume that the amplitude of the noise rms is scaled by a constant value, α, relative to a pre-specified noise level. We then derive the corresponding conditional density, P(α | s, C_ℓ, d), which is subsequently integrated into a general CMB Gibbs sampler. We first verify our code by analyzing simulated data sets, and then apply the framework to the WMAP data. For the foreground-reduced five-year WMAP sky maps and the nominal noise levels initially provided in the five-year data release, we find that the posterior means typically range between α = 1.005 ± 0.001 and α = 1.010 ± 0.001 depending on differencing assembly, indicating that the noise levels of these maps are biased low by 0.5%-1.0%. The same problem is not observed for the uncorrected WMAP sky maps. After the preprint version of this letter appeared on astro-ph, the WMAP team corrected the values presented on their web page, noting that the initially provided values were in fact estimates from the three-year data release, not the five-year estimates. However, the correct noise values were used internally in their five-year analysis, and no cosmological results are therefore compromised by this error. Thus, our method has already been demonstrated in practice to be both useful and accurate.
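
    A minimal sketch of the kind of conditional density for α described above, for a toy residual map with known per-pixel rms (all numbers are assumed; in the actual analysis this conditional is one step of a full CMB Gibbs sampler):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (all values assumed): residual map r = d - s and nominal per-pixel
# noise rms sigma0, with the true rms scaled by alpha relative to sigma0.
npix = 10_000
sigma0 = rng.uniform(0.5, 1.5, size=npix)     # pre-specified (nominal) noise rms per pixel
alpha_true = 1.01
r = rng.normal(0.0, alpha_true * sigma0)      # residual after subtracting a signal sample

# Conditional log-density ln P(alpha | s, C_ell, d) up to a constant, for Gaussian
# noise and a flat prior on alpha, evaluated on a grid. A Gibbs sampler would
# draw alpha from this conditional rather than maximise it.
alphas = np.linspace(0.95, 1.05, 2001)
chi2 = np.array([np.sum((r / (a * sigma0)) ** 2) for a in alphas])
logp = -0.5 * chi2 - npix * np.log(alphas)

print(f"conditional mode: alpha = {alphas[np.argmax(logp)]:.4f}")
```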

  3. Nine-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Final Maps and Results

    Science.gov (United States)

    Bennett, C. L.; Larson, D.; Weiland, J. L.; Jarosik, N.; Hinshaw, G.; Odegard, N.; Smith, K. M.; Hill, R. S.; Gold, B.; Halpern, M.; and others

    2013-01-01

    …six-parameter ΛCDM model, based on CMB data alone. For a model including tensors, the allowed seven-parameter volume has been reduced by a factor of 117,000. Other cosmological observations are in accord with the CMB predictions, and the combined data reduce the cosmological parameter volume even further. With no significant anomalies and an adequate goodness of fit, the inflationary flat ΛCDM model and its precise and accurate parameters rooted in WMAP data stand as the standard model of cosmology.

  4. NINE-YEAR WILKINSON MICROWAVE ANISOTROPY PROBE (WMAP) OBSERVATIONS: FINAL MAPS AND RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, C. L.; Larson, D.; Weiland, J. L. [Department of Physics and Astronomy, The Johns Hopkins University, 3400 North Charles Street, Baltimore, MD 21218-2686 (United States); Jarosik, N.; Page, L. [Department of Physics, Jadwin Hall, Princeton University, Princeton, NJ 08544-0708 (United States); Hinshaw, G.; Halpern, M. [Department of Physics and Astronomy, University of British Columbia, Vancouver, BC V6T 1Z1 (Canada); Odegard, N.; Hill, R. S. [ADNET Systems, Inc., 7515 Mission Drive, Suite A100, Lanham, MD 20706 (United States); Smith, K. M. [Perimeter Institute for Theoretical Physics, Waterloo, ON N2L 2Y5 (Canada); Gold, B. [School of Physics and Astronomy, University of Minnesota, 116 Church Street S.E., Minneapolis, MN 55455 (United States); Komatsu, E. [Max-Planck-Institut für Astrophysik, Karl-Schwarzschild Str. 1, D-85741 Garching (Germany); Nolta, M. R. [Canadian Institute for Theoretical Astrophysics, 60 St. George Street, University of Toronto, Toronto, ON M5S 3H8 (Canada); Spergel, D. N. [Department of Astrophysical Sciences, Peyton Hall, Princeton University, Princeton, NJ 08544-1001 (United States); Wollack, E.; Kogut, A. [Code 665, NASA/Goddard Space Flight Center, Greenbelt, MD 20771 (United States); Dunkley, J. [Oxford Astrophysics, Denys Wilkinson Building, Keble Road, Oxford OX1 3RH (United Kingdom); Limon, M. [Columbia Astrophysics Laboratory, 550 West 120th Street, Mail Code 5247, New York, NY 10027-6902 (United States); Meyer, S. S. [Departments of Astrophysics and Physics, KICP and EFI, University of Chicago, Chicago, IL 60637 (United States); Tucker, G. S., E-mail: cbennett@jhu.edu [Department of Physics, Brown University, 182 Hope Street, Providence, RI 02912-1843 (United States); and others

    2013-10-01

    …of 68,000 for the standard six-parameter ΛCDM model, based on CMB data alone. For a model including tensors, the allowed seven-parameter volume has been reduced by a factor of 117,000. Other cosmological observations are in accord with the CMB predictions, and the combined data reduce the cosmological parameter volume even further. With no significant anomalies and an adequate goodness of fit, the inflationary flat ΛCDM model and its precise and accurate parameters rooted in WMAP data stand as the standard model of cosmology.

  5. Five-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Data Processing, Sky Maps, and Basic Results

    Science.gov (United States)

    Weiland, J.L.; Hill, R.S.; Odegard, N.; Larson, D.; Bennett, C.L.; Dunkley, J.; Jarosik, N.; Page, L.; Spergel, D.N.; Halpern, M.; and others

    2008-01-01

    The Wilkinson Microwave Anisotropy Probe (WMAP) is a Medium-Class Explorer (MIDEX) satellite aimed at elucidating cosmology through full-sky observations of the cosmic microwave background (CMB). The WMAP full-sky maps of the temperature and polarization anisotropy in five frequency bands provide our most accurate view to date of conditions in the early universe. The multi-frequency data facilitate the separation of the CMB signal from foreground emission arising both from our Galaxy and from extragalactic sources. The CMB angular power spectrum derived from these maps exhibits a highly coherent acoustic peak structure which makes it possible to extract a wealth of information about the composition and history of the universe, as well as the processes that seeded the fluctuations. WMAP data have played a key role in establishing ΛCDM as the new standard model of cosmology (Bennett et al. 2003; Spergel et al. 2003; Hinshaw et al. 2007; Spergel et al. 2007): a flat universe dominated by dark energy, supplemented by dark matter and atoms with density fluctuations seeded by a Gaussian, adiabatic, nearly scale invariant process. The basic properties of this universe are determined by five numbers: the density of matter, the density of atoms, the age of the universe (or equivalently, the Hubble constant today), the amplitude of the initial fluctuations, and their scale dependence. By accurately measuring the first few peaks in the angular power spectrum, WMAP data have enabled the following accomplishments: Showing that the dark matter must be non-baryonic and interact only weakly with atoms and radiation. The WMAP measurement of the dark matter density puts important constraints on supersymmetric dark matter models and on the properties of other dark matter candidates. With five years of data and a better determination of our beam response, this measurement has been significantly improved. Precise determination of the density of atoms in the universe. The agreement between

  6. Fermi LAT and WMAP observations of the supernova remnant HB 21

    Energy Technology Data Exchange (ETDEWEB)

    Pivato, G. [Dipartimento di Fisica e Astronomia "G. Galilei," Università di Padova, I-35131 Padova (Italy); Hewitt, J. W. [CRESST, University of Maryland, Baltimore County, Baltimore, MD 21250 (United States); Tibaldo, L. [W. W. Hansen Experimental Physics Laboratory, Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics and SLAC National Accelerator Laboratory, Stanford University, Stanford, CA 94305 (United States); Acero, F.; Brandt, T. J. [NASA Goddard Space Flight Center, Greenbelt, MD 20771 (United States); Ballet, J. [Laboratoire AIM, CEA-IRFU/CNRS/Université Paris Diderot, Service d'Astrophysique, CEA Saclay, F-91191 Gif sur Yvette (France); De Palma, F.; Giordano, F. [Dipartimento di Fisica "M. Merlin" dell'Università e del Politecnico di Bari, I-70126 Bari (Italy); Janssen, G. H. [University of Manchester, Manchester, M13 9PL (United Kingdom); Jóhannesson, G. [Science Institute, University of Iceland, IS-107 Reykjavik (Iceland); Smith, D. A., E-mail: giovanna.pivato@pd.infn.it, E-mail: john.w.hewitt@nasa.gov, E-mail: ltibaldo@slac.stanford.edu [Centre d'Études Nucléaires de Bordeaux Gradignan, IN2P3/CNRS, Université Bordeaux 1, BP120, F-33175 Gradignan Cedex (France)

    2013-12-20

    We present the analysis of Fermi Large Area Telescope γ-ray observations of HB 21 (G89.0+4.7). We detect significant γ-ray emission associated with the remnant: the flux >100 MeV is 9.4 ± 0.8 (stat) ± 1.6 (syst) × 10⁻¹¹ erg cm⁻² s⁻¹. HB 21 is well modeled by a uniform disk centered at l = 88.°75 ± 0.°04, b = +4.°65 ± 0.°06 with a radius of 1.°19 ± 0.°06. The γ-ray spectrum shows clear evidence of curvature, suggesting a cutoff or break in the underlying particle population at an energy of a few GeV. We complement γ-ray observations with the analysis of the WMAP 7 yr data from 23 to 93 GHz, achieving the first detection of HB 21 at these frequencies. In combination with archival radio data, the radio spectrum shows a spectral break, which helps to constrain the relativistic electron spectrum, and, in turn, parameters of simple non-thermal radiation models. In one-zone models multiwavelength data favor the origin of γ rays from nucleon-nucleon collisions. A single population of electrons cannot produce both γ rays through bremsstrahlung and radio emission through synchrotron radiation. A predominantly inverse-Compton origin of the γ-ray emission is disfavored because it requires lower interstellar densities than are inferred for HB 21. In the hadronic-dominated scenarios, accelerated nuclei contribute a total energy of ∼3 × 10⁴⁹ erg, while, in a two-zone bremsstrahlung-dominated scenario, the total energy in accelerated particles is ∼1 × 10⁴⁹ erg.

  7. NINE-YEAR WILKINSON MICROWAVE ANISOTROPY PROBE (WMAP) OBSERVATIONS: COSMOLOGICAL PARAMETER RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    Hinshaw, G.; Halpern, M. [Department of Physics and Astronomy, University of British Columbia, Vancouver, BC V6T 1Z1 (Canada); Larson, D.; Bennett, C. L.; Weiland, J. L. [Department of Physics and Astronomy, The Johns Hopkins University, 3400 N. Charles St., Baltimore, MD 21218-2686 (United States); Komatsu, E. [Max-Planck-Institut für Astrophysik, Karl-Schwarzschild Str. 1, D-85741 Garching (Germany); Spergel, D. N. [Kavli Institute for the Physics and Mathematics of the Universe (Kavli IPMU, WPI), Todai Institutes for Advanced Study, University of Tokyo, Kashiwa 277-8583 (Japan); Dunkley, J. [Oxford Astrophysics, Denys Wilkinson Building, Keble Road, Oxford, OX1 3RH (United Kingdom); Nolta, M. R. [Canadian Institute for Theoretical Astrophysics, 60 St. George St., University of Toronto, Toronto, ON M5S 3H8 (Canada); Hill, R. S.; Odegard, N. [ADNET Systems, Inc., 7515 Mission Dr., Suite A100 Lanham, MD 20706 (United States); Page, L.; Jarosik, N. [Department of Physics, Jadwin Hall, Princeton University, Princeton, NJ 08544-0708 (United States); Smith, K. M. [Department of Astrophysical Sciences, Peyton Hall, Princeton University, Princeton, NJ 08544-1001 (United States); Gold, B. [University of Minnesota, School of Physics and Astronomy, 116 Church Street S.E., Minneapolis, MN 55455 (United States); Kogut, A.; Wollack, E. [Code 665, NASA/Goddard Space Flight Center, Greenbelt, MD 20771 (United States); Limon, M. [Columbia Astrophysics Laboratory, 550 W. 120th St., Mail Code 5247, New York, NY 10027-6902 (United States); Meyer, S. S. [Departments of Astrophysics and Physics, KICP and EFI, University of Chicago, Chicago, IL 60637 (United States); Tucker, G. S., E-mail: hinshaw@physics.ubc.ca [Department of Physics, Brown University, 182 Hope St., Providence, RI 02912-1843 (United States); and others

    2013-10-01

    We present cosmological parameter constraints based on the final nine-year Wilkinson Microwave Anisotropy Probe (WMAP) data, in conjunction with a number of additional cosmological data sets. The WMAP data alone, and in combination, continue to be remarkably well fit by a six-parameter ΛCDM model. When WMAP data are combined with measurements of the high-ℓ cosmic microwave background anisotropy, the baryon acoustic oscillation scale, and the Hubble constant, the matter and energy densities, Ω_b h², Ω_c h², and Ω_Λ, are each determined to a precision of ∼1.5%. The amplitude of the primordial spectrum is measured to within 3%, and there is now evidence for a tilt in the primordial spectrum at the 5σ level, confirming the first detection of tilt based on the five-year WMAP data. At the end of the WMAP mission, the nine-year data decrease the allowable volume of the six-dimensional ΛCDM parameter space by a factor of 68,000 relative to pre-WMAP measurements. We investigate a number of data combinations and show that their ΛCDM parameter fits are consistent. New limits on deviations from the six-parameter model are presented, for example: the fractional contribution of tensor modes is limited to r < 0.13 (95% CL); the spatial curvature parameter is limited to Ω_k = −0.0027^{+0.0039}_{−0.0038}; the summed mass of neutrinos is limited to Σm_ν < 0.44 eV (95% CL); and the number of relativistic species is found to lie within N_eff = 3.84 ± 0.40, when the full data are analyzed. The joint constraint on N_eff and the primordial helium abundance, Y_He, agrees with the prediction of standard big bang nucleosynthesis. We compare recent Planck measurements of the Sunyaev-Zel'dovich effect with our seven-year measurements, and show their mutual agreement. Our analysis of the polarization pattern around temperature extrema is updated. This confirms a fundamental prediction of the standard

  8. BONNSAI: correlated stellar observables in Bayesian methods

    Science.gov (United States)

    Schneider, F. R. N.; Castro, N.; Fossati, L.; Langer, N.; de Koter, A.

    2017-02-01

    In an era of large spectroscopic surveys of stars and big data, sophisticated statistical methods become more and more important in order to infer fundamental stellar parameters such as mass and age. Bayesian techniques are powerful methods because they can match all available observables simultaneously to stellar models while taking prior knowledge properly into account. However, in most cases it is assumed that observables are uncorrelated, which is generally not the case. Here, we include correlations in the Bayesian code Bonnsai by incorporating the covariance matrix in the likelihood function. We derive a parametrisation of the covariance matrix that, in addition to classical uncertainties, only requires the specification of a correlation parameter that describes how observables co-vary. Our correlation parameter depends purely on the method with which observables have been determined and can be analytically derived in some cases. This approach therefore has the advantage that correlations can be accounted for even if information on them is not available in specific cases but is known in general. Because the new likelihood model is a better approximation of the data, the reliability and robustness of the inferred parameters are improved. We find that neglecting correlations biases the most likely values of inferred stellar parameters and affects the precision with which these parameters can be determined. The importance of these biases depends on the strength of the correlations and the uncertainties. As an example, we apply our technique to massive OB stars, but emphasise that it is valid for any type of star. For effective temperatures and surface gravities determined from atmosphere modelling, we find that masses can be underestimated on average by 0.5σ and mass uncertainties overestimated by a factor of about 2 when neglecting correlations. At the same time, the age precisions are underestimated over a wide range of stellar parameters. We conclude that
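
    A minimal sketch of the central idea, a Gaussian likelihood whose covariance matrix carries a single correlation parameter between two observables (the function name, numbers, and two-observable restriction are illustrative, not the Bonnsai implementation):

```python
import numpy as np

def corr_loglike(obs, model, sigma, rho):
    """Gaussian log-likelihood for two observables with correlation rho.

    obs, model, sigma are length-2 sequences (e.g. Teff and log g); rho is a
    correlation parameter of the kind described in the abstract. All values
    here are assumed for illustration.
    """
    cov = np.array([[sigma[0] ** 2,            rho * sigma[0] * sigma[1]],
                    [rho * sigma[0] * sigma[1], sigma[1] ** 2]])
    resid = np.asarray(obs, dtype=float) - np.asarray(model, dtype=float)
    chi2 = resid @ np.linalg.solve(cov, resid)       # Mahalanobis distance
    _, logdet = np.linalg.slogdet(2.0 * np.pi * cov) # normalisation term
    return -0.5 * (chi2 + logdet)

# Example: the same stellar model point evaluated with and without correlation.
obs, model, sigma = [20000.0, 3.8], [19500.0, 3.9], [500.0, 0.1]
print(corr_loglike(obs, model, sigma, rho=0.0),
      corr_loglike(obs, model, sigma, rho=0.7))
```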

  9. Bayesian Network Models for Local Dependence among Observable Outcome Variables

    Science.gov (United States)

    Almond, Russell G.; Mulder, Joris; Hemat, Lisa A.; Yan, Duanli

    2009-01-01

    Bayesian network models offer a large degree of flexibility for modeling dependence among observables (item outcome variables) from the same task, which may be dependent. This article explores four design patterns for modeling locally dependent observations: (a) no context--ignores dependence among observables; (b) compensatory context--introduces…

  10. Wilkinson Microwave Anisotropy Probe (WMAP) Attitude Estimation Filter Comparison

    Science.gov (United States)

    Harman, Richard R.

    2005-01-01

    The Wilkinson Microwave Anisotropy Probe (WMAP) spacecraft was launched in June of 2001. The sensor complement of WMAP consists of two Autonomous Star Trackers (ASTs), two Fine Sun Sensors (FSSs), and a gyro package which contains redundancy about one of the WMAP body axes. The onboard attitude estimation filter consists of an extended Kalman filter (EKF) solving for attitude and gyro bias errors which are then resolved into a spacecraft attitude quaternion and gyro bias. A pseudo-linear Kalman filter has been developed which directly estimates the spacecraft attitude quaternion, rate, and gyro bias. In this paper, the performance of the two filters is compared for the two major control modes of WMAP: inertial mode and observation mode.

  11. Large-scale alignments from WMAP and Planck

    CERN Document Server

    Copi, Craig J.; Schwarz, Dominik J.; Starkman, Glenn D.

    2013-01-01

    We revisit the alignments of the largest structures observed in the cosmic microwave background (CMB) using the seven and nine-year WMAP and first-year Planck data releases. The observed alignments -- the quadrupole with the octopole and their joint alignment with the direction of our motion with respect to the CMB (the dipole direction) and the geometry of the Solar System (defined by the Ecliptic plane) -- are generally in good agreement with results from the previous WMAP data releases. However, a closer look at full-sky data on the largest scales reveals discrepancies between the earlier WMAP data releases (three to seven-year) and the final nine-year release. There are also discrepancies between all the WMAP data releases and the first-year Planck release. Nevertheless, both the WMAP and Planck data confirm the alignments of the largest observable CMB modes in the Universe. In particular, the p-values for the mutual alignment between the quadrupole and octopole, and the alignment of the plane defined by ...

  12. WMAP - A Portrait of the Early Universe

    Science.gov (United States)

    Wollack, Edward J.

    2008-01-01

    A host of astrophysical observations suggest that the early Universe was incredibly hot, dense, and homogeneous. A powerful probe of this time is provided by the relic radiation which we refer to today as the Cosmic Microwave Background (CMB). Images produced from this light contain the earliest glimpse of the Universe after the 'Big Bang' and the signature of the evolution of its contents. By exploiting these clues, constraints on the age, mass density, and geometry of the early Universe can be derived. A brief history of the evolution of the microwave radiometer systems and map-making approaches used in advancing these aspects of our understanding of cosmology will be reviewed. In addition, an overview of the results from NASA's Wilkinson Microwave Anisotropy Probe (WMAP) will be presented.

  13. Bayesian inference for Markov jump processes with informative observations.

    Science.gov (United States)

    Golightly, Andrew; Wilkinson, Darren J

    2015-04-01

    In this paper we consider the problem of parameter inference for Markov jump process (MJP) representations of stochastic kinetic models. Since transition probabilities are intractable for most processes of interest yet forward simulation is straightforward, Bayesian inference typically proceeds through computationally intensive methods such as (particle) MCMC. Such methods ostensibly require the ability to simulate trajectories from the conditioned jump process. When observations are highly informative, use of the forward simulator is likely to be inefficient and may even preclude an exact (simulation based) analysis. We therefore propose three methods for improving the efficiency of simulating conditioned jump processes. A conditioned hazard is derived based on an approximation to the jump process, and used to generate end-point conditioned trajectories for use inside an importance sampling algorithm. We also adapt a recently proposed sequential Monte Carlo scheme to our problem. Essentially, trajectories are reweighted at a set of intermediate time points, with more weight assigned to trajectories that are consistent with the next observation. We consider two implementations of this approach, based on two continuous approximations of the MJP. We compare these constructs for a simple tractable jump process before using them to perform inference for a Lotka-Volterra system. The best performing construct is used to infer the parameters governing a simple model of motility regulation in Bacillus subtilis.

  14. Observing the observer (I): meta-Bayesian models of learning and decision-making.

    Directory of Open Access Journals (Sweden)

    Jean Daunizeau

    2010-12-01

    In this paper, we present a generic approach that can be used to infer how subjects make optimal decisions under uncertainty. This approach induces a distinction between a subject's perceptual model, which underlies the representation of a hidden "state of affairs", and a response model, which predicts the ensuing behavioural (or neurophysiological) responses to those inputs. We start with the premise that subjects continuously update a probabilistic representation of the causes of their sensory inputs to optimise their behaviour. In addition, subjects have preferences or goals that guide decisions about actions given the above uncertain representation of these hidden causes or state of affairs. From a Bayesian decision theoretic perspective, uncertain representations are so-called "posterior" beliefs, which are influenced by subjective "prior" beliefs. Preferences and goals are encoded through a "loss" (or "utility") function, which measures the cost incurred by making any admissible decision for any given (hidden) state of affairs. By assuming that subjects make optimal decisions on the basis of updated (posterior) beliefs and utility (loss) functions, one can evaluate the likelihood of observed behaviour. Critically, this enables one to "observe the observer", i.e. identify (context- or subject-dependent) prior beliefs and utility functions using psychophysical or neurophysiological measures. In this paper, we describe the main theoretical components of this meta-Bayesian approach (i.e. a Bayesian treatment of Bayesian decision theoretic predictions). In a companion paper ('Observing the observer (II): deciding when to decide'), we describe a concrete implementation of it and demonstrate its utility by applying it to simulated and real reaction time data from an associative learning task.
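
    A minimal sketch of the decision rule the response model assumes, i.e. choosing the action that minimises posterior expected loss (states, actions, and numbers are invented):

```python
import numpy as np

# Subject's posterior beliefs over a hidden "state of affairs" (assumed values).
post = np.array([0.3, 0.7])             # p(state | sensory inputs)

# Loss incurred for each (action, state) pair (assumed values).
loss = np.array([[0.0, 1.0],
                 [2.0, 0.0]])

expected_loss = loss @ post             # posterior expected loss of each action
optimal_action = int(np.argmin(expected_loss))
print(expected_loss, "-> choose action", optimal_action)
```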

  15. Bayesian data analysis in observational comparative effectiveness research: rationale and examples.

    Science.gov (United States)

    Olson, William H; Crivera, Concetta; Ma, Yi-Wen; Panish, Jessica; Mao, Lian; Lynch, Scott M

    2013-11-01

    Many comparative effectiveness research and patient-centered outcomes research studies will need to be observational for one or both of two reasons: first, randomized trials are expensive and time-consuming; and second, only observational studies can answer some research questions. It is generally recognized that there is a need to increase the scientific validity and efficiency of observational studies. Bayesian methods for the design and analysis of observational studies are scientifically valid and offer many advantages over frequentist methods, including, importantly, the ability to conduct comparative effectiveness research/patient-centered outcomes research more efficiently. Bayesian data analysis is being introduced into outcomes studies that we are conducting. Our purpose here is to describe our view of some of the advantages of Bayesian methods for observational studies and to illustrate both realized and potential advantages by describing studies we are conducting in which various Bayesian methods have been or could be implemented.

  16. Joint Planck and WMAP CMB map reconstruction

    Science.gov (United States)

    Bobin, J.; Sureau, F.; Starck, J.-L.; Rassat, A.; Paykari, P.

    2014-03-01

    We present a novel estimate of the cosmic microwave background (CMB) map by combining the two latest full-sky microwave surveys: WMAP nine-year and Planck PR1. The joint processing benefits from a recently introduced component separation method coined "local-generalized morphological component analysis" (LGMCA) and based on the sparse distribution of the foregrounds in the wavelet domain. The proposed estimation procedure takes advantage of the IRIS 100 μm map as an extra observation on the Galactic center for enhanced dust removal. We show that this new CMB map presents several interesting aspects: i) it is a full-sky map without using any inpainting or interpolating method; ii) foreground contamination is very low; iii) the Galactic center is very clean, with especially low dust contamination as measured by the cross-correlation between the estimated CMB map and the IRIS 100 μm map; and iv) it is free of thermal SZ contamination. Appendix is available in electronic form at http://www.aanda.org

  17. Informative Bayesian Type A uncertainty evaluation, especially applicable to a small number of observations

    Science.gov (United States)

    Cox, M.; Shirono, K.

    2017-10-01

    A criticism levelled at the Guide to the Expression of Uncertainty in Measurement (GUM) is that it is based on a mixture of frequentist and Bayesian thinking. In particular, the GUM's Type A (statistical) uncertainty evaluations are frequentist, whereas the Type B evaluations, using state-of-knowledge distributions, are Bayesian. In contrast, making the GUM fully Bayesian implies, among other things, that a conventional objective Bayesian approach to Type A uncertainty evaluation for a number n of observations leads to the impractical consequence that n must be at least equal to 4, thus presenting a difficulty for many metrologists. This paper presents a Bayesian analysis of Type A uncertainty evaluation that applies for all n ≥ 2, as in the frequentist analysis in the current GUM. The analysis is based on assuming that the observations are drawn from a normal distribution (as in the conventional objective Bayesian analysis), but uses an informative prior based on lower and upper bounds for the standard deviation of the sampling distribution for the quantity under consideration. The main outcome of the analysis is a closed-form mathematical expression for the factor by which the standard deviation of the mean observation should be multiplied to calculate the required standard uncertainty. Metrological examples are used to illustrate the approach, which is straightforward to apply using a formula or look-up table.
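
    For orientation, a short sketch contrasting the frequentist GUM recipe with the conventional objective Bayesian (Jeffreys prior) result whose n ≥ 4 restriction motivates the paper; the informative-prior expression derived in the paper itself is not reproduced here:

```python
import math

def type_a_standard_uncertainty(x, objective_bayes=False):
    """Type A standard uncertainty of the mean of observations x.

    objective_bayes=False : frequentist GUM recipe, s / sqrt(n), valid for n >= 2.
    objective_bayes=True  : standard deviation of the objective-Bayesian (Jeffreys
                            prior) posterior, s/sqrt(n) * sqrt((n-1)/(n-3)), which
                            only exists for n >= 4 -- the restriction discussed
                            in the abstract.
    """
    n = len(x)
    mean = sum(x) / n
    s = math.sqrt(sum((xi - mean) ** 2 for xi in x) / (n - 1))
    if not objective_bayes:
        return s / math.sqrt(n)
    if n < 4:
        raise ValueError("objective Bayesian Type A evaluation needs n >= 4")
    return (s / math.sqrt(n)) * math.sqrt((n - 1) / (n - 3))

obs = [10.1, 10.3, 9.9, 10.2, 10.0]
print(type_a_standard_uncertainty(obs),
      type_a_standard_uncertainty(obs, objective_bayes=True))
```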

  18. Accelerating inference for diffusions observed with measurement error and large sample sizes using approximate Bayesian computation

    DEFF Research Database (Denmark)

    Picchini, Umberto; Forman, Julie Lyng

    2016-01-01

    a nonlinear stochastic differential equation model observed with correlated measurement errors and an application to protein folding modelling. An approximate Bayesian computation (ABC)-MCMC algorithm is suggested to allow inference for model parameters within reasonable time constraints. The ABC algorithm...
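
    A minimal approximate Bayesian computation sketch on a toy model, showing only the accept-if-the-simulation-is-close-to-the-data principle; the paper's actual algorithm is an ABC-MCMC scheme tailored to stochastic differential equations with measurement error:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "model": observations are noisy samples from N(theta, 1); the summary
# statistic is the sample mean. Everything here is invented for illustration.
y_obs = rng.normal(2.0, 1.0, size=50)
s_obs = y_obs.mean()

def simulate(theta, n=50):
    return rng.normal(theta, 1.0, size=n)

# Plain ABC rejection: draw from the prior, simulate, keep draws whose summary
# statistic lands within eps of the observed one.
eps, accepted = 0.1, []
for _ in range(20_000):
    theta = rng.uniform(-5.0, 5.0)          # prior draw
    if abs(simulate(theta).mean() - s_obs) < eps:
        accepted.append(theta)

print(f"{len(accepted)} accepted, approximate posterior mean ~ {np.mean(accepted):.2f}")
```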

  19. Bayesian estimation of discretely observed multi-dimensional diffusion processes using guided proposals

    NARCIS (Netherlands)

    van der Meulen, F.H.; Schauer, M.R.

    2017-01-01

    Estimation of parameters of a diffusion based on discrete time observations poses a difficult problem due to the lack of a closed form expression for the likelihood. From a Bayesian computational perspective it can be cast as a missing data problem where the diffusion bridges in between

  20. Bayesian estimation of discretely observed multi-dimensional diffusion processes using guided proposals

    NARCIS (Netherlands)

    Meulen, van der F.; Schauer, M.

    2017-01-01

    Estimation of parameters of a diffusion based on discrete time observations poses a difficult problem due to the lack of a closed form expression for the likelihood. From a Bayesian computational perspective it can be cast as a missing data problem where the diffusion bridges in between

  1. Predicting Plasma Glucose From Interstitial Glucose Observations Using Bayesian Methods

    DEFF Research Database (Denmark)

    Hansen, Alexander Hildenbrand; Duun-Henriksen, Anne Katrine; Juhl, Rune

    2014-01-01

    One way of constructing a control algorithm for an artificial pancreas is to identify a model capable of predicting plasma glucose (PG) from interstitial glucose (IG) observations. Stochastic differential equations (SDEs) make it possible to account both for the unknown influence of the continuous...... glucose monitor (CGM) and for unknown physiological influences. Combined with prior knowledge about the measurement devices, this approach can be used to obtain a robust predictive model. A stochastic-differential-equation-based gray box (SDE-GB) model is formulated on the basis of an identifiable...

  2. Bayesian Analysis Of HMI Solar Image Observables And Comparison To TSI Variations And MWO Image Observables

    Science.gov (United States)

    Parker, D. G.; Ulrich, R. K.; Beck, J.

    2014-12-01

    We have previously applied the Bayesian automatic classification system AutoClass to solar magnetogram and intensity images from the 150 Foot Solar Tower at Mount Wilson to identify classes of solar surface features associated with variations in total solar irradiance (TSI) and, using those identifications, modeled TSI time series with improved accuracy (r > 0.96) (Ulrich et al. 2010). AutoClass identifies classes by a two-step process in which it: (1) finds, without human supervision, a set of class definitions based on specified attributes of a sample of the image data pixels, such as magnetic field and intensity in the case of MWO images, and (2) applies the class definitions thus found to new data sets to identify automatically in them the classes found in the sample set. HMI high resolution images capture four observables (magnetic field, continuum intensity, line depth, and line width), in contrast to MWO's two observables (magnetic field and intensity). In this study, we apply AutoClass to the HMI observables for images from May, 2010 to June, 2014 to identify solar surface feature classes. We use contemporaneous TSI measurements to determine whether and how variations in the HMI classes are related to TSI variations and compare the characteristic statistics of the HMI classes to those found from MWO images. We also attempt to derive scale factors between the HMI and MWO magnetic and intensity observables. The ability to categorize automatically surface features in the HMI images holds out the promise of consistent, relatively quick and manageable analysis of the large quantity of data available in these images. Given that the classes found in MWO images using AutoClass have been found to improve modeling of TSI, application of AutoClass to the more complex HMI images should enhance understanding of the physical processes at work in solar surface features and their implications for the solar-terrestrial environment. Ulrich, R.K., Parker, D., Bertello, L. and

  3. WMAP - A Glimpse of the Early Universe

    Science.gov (United States)

    Wollack, Edward

    2009-01-01

    The early Universe was incredibly hot, dense, and homogeneous. A powerful probe of this time is provided by the relic radiation which we refer to today as the Cosmic Microwave Background (CMB). Images produced from this light contain the earliest glimpse of the Universe after the "Big Bang" and the signature of the evolution of its contents. By exploiting these clues, precise constraints on the age, mass density, and geometry of the early Universe can be derived. The history of this intriguing cosmological detective story will be reviewed. Recent results from NASA's Wilkinson Microwave Anisotropy Probe (WMAP) will be presented.

  4. Bayesian Network Models for Local Dependence among Observable Outcome Variables. Research Report. ETS RR-06-36

    Science.gov (United States)

    Almond, Russell G.; Mulder, Joris; Hemat, Lisa A.; Yan, Duanli

    2006-01-01

    Bayesian network models offer a large degree of flexibility for modeling dependence among observables (item outcome variables) from the same task that may be dependent. This paper explores four design patterns for modeling locally dependent observations from the same task: (1) No context--Ignore dependence among observables; (2) Compensatory…

  5. Lack of large-angle TT correlations persists in WMAP and Planck

    CERN Document Server

    Copi, Craig J; Schwarz, Dominik J; Starkman, Glenn D

    2015-01-01

    The lack of large-angle correlations in the observed microwave background temperature fluctuations persists in the final-year maps from WMAP and the first cosmological data release from Planck. We find a statistically robust and significant result: p-values for the missing correlations lying below 0.24 per cent (i.e. evidence at more than 3 sigma) for foreground-cleaned maps, in complete agreement with previous analyses based upon earlier WMAP data. A conservative cut-sky analysis of the Planck HFI 100 GHz frequency band, the 'cleanest CMB channel', returns a p-value as small as 0.03 per cent, based on the conservative mask defined by WMAP. These findings are in stark contrast to expectations from the inflationary Lambda cold dark matter model and still lack a convincing explanation. If this lack of large-angle correlations is a true feature of our Universe, and not just a statistical fluke, then the cosmological dipole must be considerably smaller than that predicted in the best-fitting model.
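
    The statistic commonly used to quantify these missing correlations is S_1/2, the integral of the squared two-point angular correlation function over angles greater than 60 degrees. A sketch of its computation from a theoretical power spectrum (the toy spectrum below is assumed; the paper works from the observed maps and compares against simulations):

```python
import numpy as np
from numpy.polynomial import legendre

def s_half(cl):
    """S_1/2 statistic: integral of C(theta)^2 over cos(theta) in [-1, 1/2],
    computed from a power spectrum cl indexed from ell = 0. Small values
    correspond to missing large-angle correlations."""
    ells = np.arange(len(cl))
    coef = (2 * ells + 1) / (4 * np.pi) * cl       # C(theta) = sum_l coef_l P_l(cos theta)
    x = np.linspace(-1.0, 0.5, 20001)
    c2 = legendre.legval(x, coef) ** 2
    return np.sum(0.5 * (c2[1:] + c2[:-1]) * np.diff(x))   # trapezoidal rule

# Toy usage with an assumed Sachs-Wolfe-like spectrum C_l ~ 1/(l(l+1)) for l >= 2.
cl = np.zeros(33)
ells = np.arange(2, 33)
cl[2:] = 1.0 / (ells * (ells + 1.0))
print(f"S_1/2 = {s_half(cl):.4e}")
```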

  6. The Lyman α forest and WMAP year three

    Science.gov (United States)

    Viel, Matteo; Haehnelt, Martin G.; Lewis, Antony

    2006-07-01

    A combined analysis of cosmic microwave background (CMB) and Lyman α forest data can constrain the matter power spectrum from small scales of about 1 h⁻¹ Mpc all the way to the horizon scale. The long lever arm and complementarity provided by such an analysis have previously led to a significant tightening of the constraints on the shape and the amplitude of the power spectrum of primordial density fluctuations. We present here a combined analysis of the WMAP three-year results with Lyman α forest data. The amplitude of the matter power spectrum σ₈ and the spectral index n_s inferred from the joint analysis of high- and low-resolution Lyman α forest data as analysed by Viel & Haehnelt are consistent with the new WMAP results to within 1σ. The joint analysis with the mainly low-resolution data as analysed by McDonald et al. suggests a value of σ₈ that is ~2σ higher than that inferred from the WMAP three-year data alone. The joint analysis of the three-year WMAP and the Lyman α forest data also does not favour a running of the spectral index. The best-fitting values for a combined analysis of the three-year WMAP data, other CMB data, 2-degree Field (2dF) data and the Lyman α forest data are (σ₈, n_s) = (0.78 ± 0.03, 0.96 ± 0.01).

  7. Bayesian inversion method for the 3D reconstruction of settlements from metric SAR observations

    Science.gov (United States)

    Quartulli, Marco; Datcu, Mihai P.

    2002-02-01

    The reconstruction of urban structures from InSAR (Interferometric Synthetic Aperture Radar) observations is a complex task. Until now it has been typically approached using the methods of radargrammetry and SAR interferometry, in a direct extension of what had been done in the past for the reconstruction of natural surfaces from generally much lower resolution data. In this work, we present a new concept aiming at the accurate and detailed reconstruction of observed city scenes from metric SAR observations. We use a model-based approach for the synergetic analysis of the different sources of information in InSAR data. We define a hierarchical model of the InSAR observation that is both deterministic and stochastic. While the deterministic section describes the SAR imaging geometry and its effects and expresses the different scene structures, the stochastic part instead encapsulates prior knowledge about the signal and defines its attributes, while also describing uncertainty over the parameters of the geometrical model. Bayesian inference is used to couple the different levels of the model, and to further define parameter estimation algorithms.

  8. Bayesian framework for modeling diffusion processes with nonlinear drift based on nonlinear and incomplete observations

    Science.gov (United States)

    Wu, Hao; Noé, Frank

    2011-03-01

    Diffusion processes are relevant for a variety of phenomena in the natural sciences, including diffusion of cells or biomolecules within cells, diffusion of molecules on a membrane or surface, and diffusion of a molecular conformation within a complex energy landscape. Many experimental tools exist now to track such diffusive motions in single cells or molecules, including high-resolution light microscopy, optical tweezers, fluorescence quenching, and Förster resonance energy transfer (FRET). Experimental observations are most often indirect and incomplete: (1) They do not directly reveal the potential or diffusion constants that govern the diffusion process, (2) they have limited time and space resolution, and (3) the highest-resolution experiments do not track the motion directly but rather probe it stochastically by recording single events, such as photons, whose properties depend on the state of the system under investigation. Here, we propose a general Bayesian framework to model diffusion processes with nonlinear drift based on incomplete observations as generated by various types of experiments. A maximum penalized likelihood estimator is given as well as a Gibbs sampling method that allows one to estimate the trajectories that have caused the measurement, the nonlinear drift or potential function and the noise or diffusion matrices, as well as uncertainty estimates of these properties. The approach is illustrated on numerical simulations of FRET experiments where it is shown that trajectories, potentials, and diffusion constants can be efficiently and reliably estimated even in cases with little statistics or nonequilibrium measurement conditions.

  9. A Bayesian kriging approach for blending satellite and ground precipitation observations

    Science.gov (United States)

    Verdin, Andrew P.; Rajagopalan, Balaji; Kleiber, William; Funk, Christopher C.

    2015-01-01

    Drought and flood management practices require accurate estimates of precipitation. Gauge observations, however, are often sparse in regions with complicated terrain, clustered in valleys, and of poor quality. Consequently, the spatial extent of wet events is poorly represented. Satellite-derived precipitation data are an attractive alternative, though they tend to underestimate the magnitude of wet events due to their dependency on retrieval algorithms and the indirect relationship between satellite infrared observations and precipitation intensities. Here we offer a Bayesian kriging approach for blending precipitation gauge data and the Climate Hazards Group Infrared Precipitation satellite-derived precipitation estimates for Central America, Colombia, and Venezuela. First, the gauge observations are modeled as a linear function of satellite-derived estimates and any number of other variables—for this research we include elevation. Prior distributions are defined for all model parameters and the posterior distributions are obtained simultaneously via Markov chain Monte Carlo sampling. The posterior distributions of these parameters are required for spatial estimation, and thus are obtained prior to implementing the spatial kriging model. This functional framework is applied to model parameters obtained by sampling from the posterior distributions, and the residuals of the linear model are subject to a spatial kriging model. Consequently, the posterior distributions and uncertainties of the blended precipitation estimates are obtained. We demonstrate this method by applying it to pentadal and monthly total precipitation fields during 2009. The model's performance and its inherent ability to capture wet events are investigated. We show that this blending method significantly improves upon the satellite-derived estimates and is also competitive in its ability to represent wet events. This procedure also provides a means to estimate a full conditional distribution

  10. Bayesian Unidimensional Scaling for visualizing uncertainty in high dimensional datasets with latent ordering of observations.

    Science.gov (United States)

    Nguyen, Lan Huong; Holmes, Susan

    2017-09-13

    Detecting patterns in high-dimensional multivariate datasets is non-trivial. Clustering and dimensionality reduction techniques often help in discerning inherent structures. In biological datasets such as microbial community composition or gene expression data, observations can be generated from a continuous process, often unknown. Estimating data points' 'natural ordering' and their corresponding uncertainties can help researchers draw insights about the mechanisms involved. We introduce a Bayesian Unidimensional Scaling (BUDS) technique which extracts dominant sources of variation in high dimensional datasets and produces their visual data summaries, facilitating the exploration of a hidden continuum. The method maps multivariate data points to latent one-dimensional coordinates along their underlying trajectory, and provides estimated uncertainty bounds. By statistically modeling dissimilarities and applying a DiSTATIS registration method to their posterior samples, we are able to incorporate visualizations of uncertainties in the estimated data trajectory across different regions using confidence contours for individual data points. We also illustrate the estimated overall data density across different areas by including density clouds. One-dimensional coordinates recovered by BUDS help researchers discover sample attributes or covariates that are factors driving the main variability in a dataset. We demonstrated usefulness and accuracy of BUDS on a set of published microbiome 16S and RNA-seq and roll call data. Our method effectively recovers and visualizes natural orderings present in datasets. Automatic visualization tools for data exploration and analysis are available at: https://nlhuong.shinyapps.io/visTrajectory/ .

  11. Obtaining parsimonious hydraulic conductivity fields using head and transport observations: A Bayesian geostatistical parameter estimation approach

    Science.gov (United States)

    Fienen, M.; Hunt, R.; Krabbenhoft, D.; Clemo, T.

    2009-01-01

    Flow path delineation is a valuable tool for interpreting the subsurface hydrogeochemical environment. Different types of data, such as groundwater flow and transport, inform different aspects of hydrogeologic parameter values (hydraulic conductivity in this case) which, in turn, determine flow paths. This work combines flow and transport information to estimate a unified set of hydrogeologic parameters using the Bayesian geostatistical inverse approach. Parameter flexibility is allowed by using a highly parameterized approach with the level of complexity informed by the data. Despite the effort to adhere to the ideal of minimal a priori structure imposed on the problem, extreme contrasts in parameters can result in the need to censor correlation across hydrostratigraphic bounding surfaces. These partitions segregate parameters into facies associations. With an iterative approach in which partitions are based on inspection of initial estimates, flow path interpretation is progressively refined through the inclusion of more types of data. Head observations, stable oxygen isotope ratios (¹⁸O/¹⁶O), and tritium are all used to progressively refine flow path delineation on an isthmus between two lakes in the Trout Lake watershed, northern Wisconsin, United States. Despite allowing significant parameter freedom by estimating many distributed parameter values, a smooth field is obtained. Copyright 2009 by the American Geophysical Union.

  12. Bayesian Analysis of Hmi Images and Comparison to Tsi Variations and MWO Image Observables

    Science.gov (United States)

    Parker, D. G.; Ulrich, R. K.; Beck, J.; Tran, T. V.

    2015-12-01

    We have previously applied the Bayesian automatic classification system AutoClass to solar magnetogram and intensity images from the 150 Foot Solar Tower at Mount Wilson to identify classes of solar surface features associated with variations in total solar irradiance (TSI) and, using those identifications, modeled TSI time series with improved accuracy (r > 0.96) (Ulrich et al. 2010). AutoClass identifies classes by a two-step process in which it: (1) finds, without human supervision, a set of class definitions based on specified attributes of a sample of the image data pixels, such as magnetic field and intensity in the case of MWO images, and (2) applies the class definitions thus found to new data sets to identify automatically in them the classes found in the sample set. HMI high resolution images capture four observables (magnetic field, continuum intensity, line depth, and line width), in contrast to MWO's two observables (magnetic field and intensity). In this study, we apply AutoClass to the HMI observables for images from June, 2010 to December, 2014 to identify solar surface feature classes. We use contemporaneous TSI measurements to determine whether and how variations in the HMI classes are related to TSI variations and compare the characteristic statistics of the HMI classes to those found from MWO images. We also attempt to derive scale factors between the HMI and MWO magnetic and intensity observables. The ability to categorize automatically surface features in the HMI images holds out the promise of consistent, relatively quick and manageable analysis of the large quantity of data available in these images. Given that the classes found in MWO images using AutoClass have been found to improve modeling of TSI, application of AutoClass to the more complex HMI images should enhance understanding of the physical processes at work in solar surface features and their implications for the solar-terrestrial environment. Ulrich, R.K., Parker, D., Bertello, L. and

  13. Quantitative Precipitation Estimation over Ocean Using Bayesian Approach from Microwave Observations during the Typhoon Season

    Directory of Open Access Journals (Sweden)

    Jen-Chi Hu

    2009-01-01

    We have developed a new Bayesian approach to retrieve oceanic rain rate from the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI), with an emphasis on typhoon cases in the West Pacific. Retrieved rain rates are validated with measurements of rain gauges located on Japanese islands. To demonstrate improvement, retrievals are also compared with those from the TRMM Precipitation Radar (PR), the Goddard Profiling Algorithm (GPROF), and a multi-channel linear regression statistical method (MLRS). We have found that, qualitatively, all methods retrieved similar horizontal distributions in terms of locations of eyes and rain bands of typhoons. Quantitatively, our new Bayesian retrievals have the best linearity and the smallest root mean square (RMS) error against rain gauge data for 16 typhoon overpasses in 2004. The correlation coefficient and RMS of our retrievals are 0.95 and ~2 mm hr⁻¹, respectively. In particular, at heavy rain rates, our Bayesian retrievals outperform those retrieved from GPROF and MLRS. Overall, the new Bayesian approach accurately retrieves surface rain rate for typhoon cases. Accurate rain rate estimates from this method can be assimilated in models to improve forecasts and prevent potential damages in Taiwan during typhoon seasons.
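
    A schematic of the Bayesian retrieval idea, estimating rain rate as a database average weighted by the likelihood of the observed brightness temperatures (channels, numbers, and the synthetic database below are all invented; the actual retrieval uses a physically generated a priori database for TMI):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy a priori database of (brightness temperature, rain rate) pairs. Real
# retrievals build this from physical model simulations, not random numbers.
tb_db = rng.uniform(150.0, 300.0, size=(5000, 3))       # 3 hypothetical channels (K)
rain_db = np.clip(0.2 * (280.0 - tb_db[:, 0]) + rng.normal(0, 2, 5000), 0, None)

def bayes_retrieve(tb_obs, sigma=5.0):
    """Bayesian rain-rate estimate: database entries weighted by how well they
    match the observed brightness temperatures (Gaussian observation error)."""
    w = np.exp(-0.5 * np.sum(((tb_db - tb_obs) / sigma) ** 2, axis=1))
    return np.sum(w * rain_db) / np.sum(w)

print(f"retrieved rain rate ~ {bayes_retrieve(np.array([250.0, 220.0, 260.0])):.2f} mm/hr")
```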

  14. Using polarimetric radar observations and probabilistic inference to develop the Bayesian Observationally-constrained Statistical-physical Scheme (BOSS), a novel microphysical parameterization framework

    Science.gov (United States)

    van Lier-Walqui, M.; Morrison, H.; Kumjian, M. R.; Prat, O. P.

    2016-12-01

    Microphysical parameterization schemes have reached an impressive level of sophistication: numerous prognostic hydrometeor categories, and either size-resolved (bin) particle size distributions, or multiple prognostic moments of the size distribution. Yet, uncertainty in model representation of microphysical processes and the effects of microphysics on numerical simulation of weather has not shown an improvement commensurate with the advanced sophistication of these schemes. We posit that this may be caused by unconstrained assumptions of these schemes, such as ad hoc parameter value choices and structural uncertainties (e.g. choice of a particular form for the size distribution). We present work on development and observational constraint of a novel microphysical parameterization approach, the Bayesian Observationally-constrained Statistical-physical Scheme (BOSS), which seeks to address these sources of uncertainty. Our framework avoids unnecessary a priori assumptions, and instead relies on observations to provide probabilistic constraint of the scheme structure and sensitivities to environmental and microphysical conditions. We harness the rich microphysical information content of polarimetric radar observations to develop and constrain BOSS within a Bayesian inference framework using a Markov Chain Monte Carlo sampler (see Kumjian et al., this meeting, for details on development of an associated polarimetric forward operator). Our work shows how knowledge of microphysical processes is provided by polarimetric radar observations of diverse weather conditions, and which processes remain highly uncertain, even after considering observations.

  15. Peak Bagging of red giant stars observed by Kepler: first results with a new method based on Bayesian nested sampling

    Directory of Open Access Journals (Sweden)

    Corsaro Enrico

    2015-01-01

    The peak bagging analysis, namely the fitting and identification of single oscillation modes in stars' power spectra, coupled to the very high-quality light curves of red giant stars observed by Kepler, can play a crucial role for studying stellar oscillations of different flavor with an unprecedented level of detail. A thorough study of stellar oscillations would thus allow for deeper testing of stellar structure models and new insights in stellar evolution theory. However, peak bagging inferences are in general very challenging problems due to the large number of observed oscillation modes, hence of free parameters that can be involved in the fitting models. Efficiency and robustness in performing the analysis is what may be needed to proceed further. For this purpose, we developed a new code implementing the Nested Sampling Monte Carlo (NSMC) algorithm, a powerful statistical method well suited for Bayesian analyses of complex problems. In this talk we show the peak bagging of a sample of high signal-to-noise red giant stars by exploiting recent Kepler datasets and a new criterion for the detection of an oscillation mode based on the computation of the Bayesian evidence. Preliminary results for frequencies and lifetimes for single oscillation modes, together with acoustic glitches, are therefore presented.

  16. A Bayesian method for inferring transmission chains in a partially observed epidemic.

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef M.; Ray, Jaideep

    2008-10-01

    We present a Bayesian approach for estimating transmission chains and rates in the Abakaliki smallpox epidemic of 1967. The epidemic affected 30 individuals in a community of 74; only the dates of appearance of symptoms were recorded. Our model assumes stochastic transmission of the infections over a social network. Distinct binomial random graphs model intra- and inter-compound social connections, while disease transmission over each link is treated as a Poisson process. Link probabilities and rate parameters are objects of inference. Dates of infection and recovery comprise the remaining unknowns. Distributions for smallpox incubation and recovery periods are obtained from historical data. Using Markov chain Monte Carlo, we explore the joint posterior distribution of the scalar parameters and provide an expected connectivity pattern for the social graph and infection pathway.

  17. Monte Carlo Algorithms for a Bayesian Analysis of the Cosmic Microwave Background

    Science.gov (United States)

    Jewell, Jeffrey B.; Eriksen, H. K.; ODwyer, I. J.; Wandelt, B. D.; Gorski, K.; Knox, L.; Chu, M.

    2006-01-01

    A viewgraph presentation is given reviewing the Bayesian approach to Cosmic Microwave Background (CMB) analysis, its numerical implementation with Gibbs sampling, a summary of the application to WMAP I, and work in progress on generalizations to polarization, foregrounds, asymmetric beams, and 1/f noise.
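
    A toy, single-multipole version of the CMB Gibbs sampler referred to above, alternating draws of the signal given the power spectrum and of the power spectrum given the signal (no beam, mask, or anisotropic noise; all numbers are assumed):

```python
import numpy as np

rng = np.random.default_rng(3)

# Single multipole ell with uniform white noise per a_lm; data a_lm = s_lm + n_lm.
ell, C_true, N = 10, 1000.0, 200.0
m = 2 * ell + 1
d = rng.normal(0.0, np.sqrt(C_true + N), size=m)

C, samples = 500.0, []          # initial guess for C_ell
for _ in range(5000):
    # 1) draw the signal from P(s | C, d): Gaussian (Wiener-filter mean + fluctuation)
    var_s = 1.0 / (1.0 / C + 1.0 / N)
    s = rng.normal(var_s * d / N, np.sqrt(var_s))
    # 2) draw the power spectrum from P(C | s): inverse gamma (flat prior on C assumed)
    sigma_l = np.sum(s ** 2)
    C = sigma_l / (2.0 * rng.gamma((m - 2) / 2.0))
    samples.append(C)

print(f"posterior mean C_ell ~ {np.mean(samples[1000:]):.0f} (true value {C_true:.0f})")
```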

  18. Bayesian multimodel estimation of global terrestrial latent heat flux from eddy covariance, meteorological, and satellite observations

    Science.gov (United States)

    Yao, Yunjun; Liang, Shunlin; Li, Xianglan; Hong, Yang; Fisher, Joshua B.; Zhang, Nannan; Chen, Jiquan; Cheng, Jie; Zhao, Shaohua; Zhang, Xiaotong; Jiang, Bo; Sun, Liang; Jia, Kun; Wang, Kaicun; Chen, Yang; Mu, Qiaozhen; Feng, Fei

    2014-04-01

    Accurate estimation of the satellite-based global terrestrial latent heat flux (LE) at high spatial and temporal scales remains a major challenge. In this study, we introduce a Bayesian model averaging (BMA) method to improve satellite-based global terrestrial LE estimation by merging five process-based algorithms. These are the Moderate Resolution Imaging Spectroradiometer (MODIS) LE product algorithm, the revised remote-sensing-based Penman-Monteith LE algorithm, the Priestley-Taylor-based LE algorithm, the modified satellite-based Priestley-Taylor LE algorithm, and the semi-empirical Penman LE algorithm. We validated the BMA method using data for 2000-2009 and by comparison with a simple model averaging (SA) method and five process-based algorithms. Validation data were collected for 240 globally distributed eddy covariance tower sites provided by FLUXNET projects. The validation results demonstrate that the five process-based algorithms used have variable uncertainty and the BMA method enhances the daily LE estimates, with smaller root mean square errors (RMSEs) than the SA method and the individual algorithms driven by tower-specific meteorology and Modern Era Retrospective Analysis for Research and Applications (MERRA) meteorological data provided by the NASA Global Modeling and Assimilation Office (GMAO), respectively. The average RMSE for the BMA method driven by daily tower-specific meteorology decreased by more than 5 W/m² for crop and grass sites, and by more than 6 W/m² for forest, shrub, and savanna sites. The average coefficients of determination (R²) increased by approximately 0.05 for most sites. To test the BMA method for regional mapping, we applied it to MODIS data and GMAO-MERRA meteorology to map annual global terrestrial LE averaged over 2001-2004 at a spatial resolution of 0.05°. The BMA method provides a basis for generating a long-term global terrestrial LE product for characterizing global energy, hydrological, and carbon cycles.
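
    A simplified sketch of the weighted-ensemble idea behind BMA, with synthetic "tower" observations and five made-up member algorithms; standard BMA implementations fit the weights and member variances jointly (e.g. via EM), so the one-shot weighting below only conveys the general idea:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic training data: observed LE and predictions from five hypothetical
# member algorithms with different error levels (all values invented).
n, k = 200, 5
le_obs = rng.normal(100.0, 30.0, size=n)
preds = le_obs[:, None] + rng.normal(0.0, [[8, 12, 15, 20, 25]], size=(n, k))

# Gaussian member likelihoods with per-member error variance; weights are
# proportional to each member's likelihood of the training data.
sigma2 = np.mean((preds - le_obs[:, None]) ** 2, axis=0)
loglik = -0.5 * np.sum((preds - le_obs[:, None]) ** 2 / sigma2
                       + np.log(2 * np.pi * sigma2), axis=0)
w = np.exp(loglik - loglik.max())
w /= w.sum()

rmse_bma = np.sqrt(np.mean((preds @ w - le_obs) ** 2))
print("weights:", np.round(w, 3), " RMSE of BMA mean:", round(float(rmse_bma), 2))
```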

  19. Mapping snow depth within a tundra ecosystem using multiscale observations and Bayesian methods

    Science.gov (United States)

    Wainwright, Haruko M.; Liljedahl, Anna K.; Dafflon, Baptiste; Ulrich, Craig; Peterson, John E.; Gusmeroli, Alessio; Hubbard, Susan S.

    2017-04-01

    This paper compares and integrates different strategies to characterize the variability of end-of-winter snow depth and its relationship to topography in ice-wedge polygon tundra of Arctic Alaska. Snow depth was measured using in situ snow depth probes and estimated using ground-penetrating radar (GPR) surveys and the photogrammetric detection and ranging (phodar) technique with an unmanned aerial system (UAS). We found that GPR data provided high-precision estimates of snow depth (RMSE = 2.9 cm), with a spatial sampling of 10 cm along transects. Phodar-based approaches provided snow depth estimates in a less laborious manner compared to GPR and probing, while yielding a high precision (RMSE = 6.0 cm) and a fine spatial sampling (4 cm × 4 cm). We then investigated the spatial variability of snow depth and its correlation to micro- and macrotopography using the snow-free lidar digital elevation map (DEM) and the wavelet approach. We found that the end-of-winter snow depth was highly variable over short (several meter) distances, and the variability was correlated with microtopography. Microtopographic lows (i.e., troughs and centers of low-centered polygons) were filled in with snow, which resulted in a smooth and even snow surface following macrotopography. We developed and implemented a Bayesian approach to integrate the snow-free lidar DEM and multiscale measurements (probe and GPR) as well as the topographic correlation for estimating snow depth over the landscape. Our approach led to high-precision estimates of snow depth (RMSE = 6.0 cm), at 0.5 m resolution and over the lidar domain (750 m × 700 m).

  20. Searching for hidden mirror symmetries in CMB fluctuations from WMAP 7 year maps

    Energy Technology Data Exchange (ETDEWEB)

    Finelli, Fabio; Gruppuso, Alessandro [INAF/IASF-BO, Istituto di Astrofisica Spaziale e Fisica Cosmica di Bologna, via Gobetti 101, I-40129 Bologna (Italy); Paci, Francesco [Laboratoire Astroparticule and Cosmologie (APC), CNRS, Université Paris Diderot, 10, rue A. Domon et L. Duquet, 75205 Paris Cedex 13 (France); Starobinsky, Alexey A., E-mail: finelli@iasfbo.inaf.it, E-mail: gruppuso@iasfbo.inaf.it, E-mail: fpaci@apc.univ-paris7.fr, E-mail: alstar@landau.ac.ru [Landau Institute for Theoretical Physics RAS, 119334 Moscow (Russian Federation)

    2012-07-01

    We search for hidden mirror symmetries at large angular scales in the WMAP 7 year Internal Linear Combination map of CMB temperature anisotropies using global pixel-based estimators introduced for this aim. Two different axes are found for which the CMB intensity pattern is anomalously symmetric (or anti-symmetric) under reflection with respect to orthogonal planes at the 99.84(99.96)% CL (confidence level), if compared to a result for an arbitrary axis in simulations without the symmetry. We have verified that our results are robust to the introduction of the galactic mask. The directions of these axes are close to the CMB kinematic dipole and nearly orthogonal to the ecliptic plane, respectively. If instead the real data are compared to those in simulations taken with respect to planes for which the maximal mirror symmetry is generated by chance, the confidence level decreases to 92.39(76.65)%. In that case, however, the effect in question translates into the anomalous alignment between the normals to the planes of maximal mirror (anti-)symmetry and the natural axes mentioned above. We also introduce the representation of the above estimators in the harmonic domain, confirming the results obtained in the pixel one. The symmetry anomaly is shown to be almost entirely due to low multipoles, so it may have a cosmological and even primordial origin. By contrast, the anti-symmetry anomaly is mainly due to intermediate multipoles, which probably suggests a non-fundamental origin. We have demonstrated that these anomalies are not connected to the known issue of the low variance in WMAP observations and we have checked that axially symmetric parts of these anomalies are small, so that the axes are not the symmetry ones.

  1. Detecting potential safety issues in large clinical or observational trials by Bayesian screening when event counts arise from poisson distributions.

    Science.gov (United States)

    Gould, A Lawrence

    2013-01-01

    Patients in large clinical trials and in studies employing large observational databases report many different adverse events, most of which will not have been anticipated at the outset. Conventional hypothesis testing of between-group differences for each adverse event can be problematic: lack of significance does not mean lack of risk, the tests usually are not adjusted for multiplicity, and the data determine which hypotheses are tested. This article describes a Bayesian screening approach that does not test hypotheses, is self-adjusting for multiplicity, provides a direct assessment of the likelihood of no material drug-event association, and quantifies the strength of the observed association. The criteria for assessing drug-event associations can be determined by clinical or regulatory considerations. In contrast to conventional approaches, the diagnostic properties of this new approach can be evaluated analytically. Application of the method to findings from a vaccine trial yields results similar to those found by methods using a false discovery rate argument or a hierarchical Bayes approach. [Supplemental materials are available for this article. Go to the publisher's online edition of Journal of Biopharmaceutical Statistics for the following free supplemental resource: Appendix R: Code for calculations.]
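
    The sketch below is not Gould's screening statistic, but a minimal point-mass-mixture version of the same idea for a single adverse event: conditioning on the total event count makes the treatment count binomial, and the posterior probability of "no material drug-event association" follows directly. All counts, exposures, and the prior odds are illustrative assumptions.

        from scipy.stats import binom, betabinom

        # Hypothetical event counts and person-time exposures
        x_t, x_c = 14, 9           # treatment / control events
        E_t, E_c = 1000.0, 1000.0  # exposures
        n = x_t + x_c

        # Conditional on n, x_t ~ Binomial(n, p) with p = rho*E_t / (rho*E_t + E_c)
        p0 = E_t / (E_t + E_c)                   # p under H0 (rate ratio rho = 1)

        pi0 = 0.9                                # prior probability of "no association"
        m0 = binom.pmf(x_t, n, p0)               # marginal likelihood under H0
        m1 = betabinom.pmf(x_t, n, 1, 1)         # H1: uniform Beta(1,1) prior on p

        post_h0 = pi0 * m0 / (pi0 * m0 + (1 - pi0) * m1)
        print("posterior probability of no material association:", round(post_h0, 3))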

  2. New inflation vs. chaotic inflation, higher degree potentials and the reconstruction program in light of WMAP3

    Energy Technology Data Exchange (ETDEWEB)

    Ho, Chiu Man; Boyanovsky, D.; de Vega, H.J.; Ho, C.M.; Sanchez, N.G.

    2007-02-12

    The cosmic microwave background power spectra are studied for different families of single-field new and chaotic inflation models in the effective field theory approach to inflation. We implement a systematic expansion in 1/N_e, where N_e ≈ 50 is the number of e-folds before the end of inflation. We study the dependence of the observables (n_s, r and dn_s/dln k) on the degree of the potential (2n) and confront them with the WMAP3 and large scale structure data: this shows in general that fourth degree potentials (n=2) provide the best fit to the data; the window of consistency with the WMAP3 and LSS data narrows for growing n. New inflation yields a good fit to the r and n_s data in a wide range of field and parameter space. Small field inflation yields r < 0.16 while large field inflation yields r > 0.16 (for N_e = 50). All members of the new inflation family predict a small but negative running, -4(n+1) × 10^-4 ≤ dn_s/dln k ≤ -2 × 10^-4. (The values of r, n_s, dn_s/dln k for arbitrary N_e follow by a simple rescaling from the N_e = 50 values.) A reconstruction program is carried out, suggesting quite generally that for n_s consistent with the WMAP3 and LSS data and r < 0.1 the symmetry breaking scale for new inflation is |φ_0| ≈ 10 M_Pl, while the field scale at Hubble crossing is |φ_c| ≈ M_Pl. The family of chaotic models features r ≥ 0.16 (for N_e = 50) and only a restricted subset of chaotic models is consistent with the combined WMAP3 bounds on r, n_s, dn_s/dln k, with a narrow window in field amplitude around |φ_c| ≈ 15 M_Pl. We conclude that a measurement of r < 0.16 (for N_e = 50) distinctly rules out a large class of chaotic scenarios and favors small field new inflationary models. As a general consequence, new inflation emerges more favored than chaotic inflation.
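
    As a rough cross-check of the quoted chaotic-inflation boundary, the textbook leading-order slow-roll estimates for a monomial potential V ∝ φ^p already give r = 0.16 at N_e = 50 for p = 2. The sketch below uses only the standard formulas n_s ≈ 1 − (p+2)/(2N_e) and r ≈ 4p/N_e, not the paper's 1/N_e effective-field-theory expansion.

        def slow_roll_monomial(p, Ne=50.0):
            """Leading-order slow-roll predictions for V(phi) ~ phi**p."""
            ns = 1.0 - (p + 2.0) / (2.0 * Ne)
            r = 4.0 * p / Ne
            return ns, r

        for p in (2, 4, 6):
            ns, r = slow_roll_monomial(p)
            print(f"p={p}: n_s={ns:.3f}, r={r:.3f}")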

  3. From Neutron Star Observables to the Equation of State. II. Bayesian Inference of Equation of State Pressures

    Science.gov (United States)

    Raithel, Carolyn A.; Özel, Feryal; Psaltis, Dimitrios

    2017-08-01

    One of the key goals of observing neutron stars is to infer the equation of state (EoS) of the cold, ultradense matter in their interiors. Here, we present a Bayesian statistical method of inferring the pressures at five fixed densities, from a sample of mock neutron star masses and radii. We show that while five polytropic segments are needed for maximum flexibility in the absence of any prior knowledge of the EoS, regularizers are also necessary to ensure that simple underlying EoS are not over-parameterized. For ideal data with small measurement uncertainties, we show that the pressure at roughly twice the nuclear saturation density, ρ_sat, can be inferred to within 0.3 dex for many realizations of potential sources of uncertainties. The pressures of more complicated EoS with significant phase transitions can also be inferred to within ~30%. We also find that marginalizing the multi-dimensional parameter space of pressure to infer a mass-radius relation can lead to biases of nearly 1 km in radius, toward larger radii. Using the full, five-dimensional posterior likelihoods avoids this bias.

  4. A Bayesian spatial assimilation scheme for snow coverage observations in a gridded snow model

    Directory of Open Access Journals (Sweden)

    S. Kolberg

    2006-01-01

    Full Text Available A method for assimilating remotely sensed snow covered area (SCA into the snow subroutine of a grid distributed precipitation-runoff model (PRM is presented. The PRM is assumed to simulate the snow state in each grid cell by a snow depletion curve (SDC, which relates that cell's SCA to its snow cover mass balance. The assimilation is based on Bayes' theorem, which requires a joint prior distribution of the SDC variables in all the grid cells. In this paper we propose a spatial model for this prior distribution, and include similarities and dependencies among the grid cells. Used to represent the PRM simulated snow cover state, our joint prior model regards two elevation gradients and a degree-day factor as global variables, rather than describing their effect separately for each cell. This transformation results in smooth normalised surfaces for the two related mass balance variables, supporting a strong inter-cell dependency in their joint prior model. The global features and spatial interdependency in the prior model cause each SCA observation to provide information for many grid cells. The spatial approach similarly facilitates the utilisation of observed discharge. Assimilation of SCA data using the proposed spatial model is evaluated in a 2400 km2 mountainous region in central Norway (61° N, 9° E, based on two Landsat 7 ETM+ images generalized to 1 km2 resolution. An image acquired on 11 May, a week before the peak flood, removes 78% of the variance in the remaining snow storage. Even an image from 4 May, less than a week after the melt onset, reduces this variance by 53%. These results are largely improved compared to a cell-by-cell independent assimilation routine previously reported. Including observed discharge in the updating information improves the 4 May results, but has weak effect on 11 May. Estimated elevation gradients are shown to be sensitive to informational deficits occurring at high altitude, where snowmelt has not started

  5. Bayesian Inference for Multivariate Meta-regression with a Partially Observed Within-Study Sample Covariance Matrix

    Science.gov (United States)

    Yao, Hui; Kim, Sungduk; Chen, Ming-Hui; Ibrahim, Joseph G.; Shah, Arvind K.; Lin, Jianxin

    2015-01-01

    Summary Multivariate meta-regression models are commonly used in settings where the response variable is naturally multi-dimensional. Such settings are common in cardiovascular and diabetes studies where the goal is to study cholesterol levels once a certain medication is given. In this setting, the natural multivariate endpoint is Low Density Lipoprotein Cholesterol (LDL-C), High Density Lipoprotein Cholesterol (HDL-C), and Triglycerides (TG) (LDL-C, HDL-C, TG). In this paper, we examine study level (aggregate) multivariate meta-data from 26 Merck sponsored double-blind, randomized, active or placebo-controlled clinical trials on adult patients with primary hypercholesterolemia. Our goal is to develop a methodology for carrying out Bayesian inference for multivariate meta-regression models with study level data when the within-study sample covariance matrix S for the multivariate response data is partially observed. Specifically, the proposed methodology is based on postulating a multivariate random effects regression model with an unknown within-study covariance matrix Σ in which we treat the within-study sample correlations as missing data, the standard deviations of the within-study sample covariance matrix S are assumed observed, and given Σ, S follows a Wishart distribution. Thus, we treat the off-diagonal elements of S as missing data, and these missing elements are sampled from the appropriate full conditional distribution in a Markov chain Monte Carlo (MCMC) sampling scheme via a novel transformation based on partial correlations. We further propose several structures (models) for Σ, which allow for borrowing strength across different treatment arms and trials. The proposed methodology is assessed using simulated as well as real data, and the results are shown to be quite promising. PMID:26257452

  6. Measuring neutron star tidal deformability with Advanced LIGO: A Bayesian analysis of neutron star-black hole binary observations

    Science.gov (United States)

    Kumar, Prayush; Pürrer, Michael; Pfeiffer, Harald P.

    2017-02-01

    The pioneering discovery of gravitational waves (GWs) by Advanced LIGO has ushered us into an era of observational GW astrophysics. Compact binaries remain the primary target sources for GW observation, of which neutron star-black hole (NSBH) binaries form an important subset. GWs from NSBH sources carry signatures of (a) the tidal distortion of the neutron star by its companion black hole during inspiral, and (b) its potential tidal disruption near merger. In this paper, we present a Bayesian study of the measurability of neutron star tidal deformability Λ_NS ∝ (R/M)_NS^5 using observation(s) of inspiral-merger GW signals from disruptive NSBH coalescences, taking into account the crucial effect of black hole spins. First, we find that if nontidal templates are used to estimate source parameters for an NSBH signal, the bias introduced in the estimation of nontidal physical parameters will only be significant for loud signals with signal-to-noise ratios greater than ≃30. For similarly loud signals, we also find that we can begin to put interesting constraints on Λ_NS (factor of 1-2) with individual observations. Next, we study how a population of realistic NSBH detections will improve our measurement of neutron star tidal deformability. For an astrophysically likely population of disruptive NSBH coalescences, we find that 20-35 events are sufficient to constrain Λ_NS within ±25%-50%, depending on the neutron star equation of state. For these calculations we assume that LIGO will detect black holes with masses within the astrophysical mass gap. In case the mass gap remains preserved in NSBHs detected by LIGO, we estimate that approximately 25% additional detections will furnish comparable Λ_NS measurement accuracy. In both cases, we find that it is the loudest 5-10 events that provide most of the tidal information, and not the combination of tens of low-SNR events, thereby facilitating targeted numerical-GR follow-ups of NSBHs. We find these results

  7. Time Series Analysis of Non-Gaussian Observations Based on State Space Models from Both Classical and Bayesian Perspectives

    NARCIS (Netherlands)

    Durbin, J.; Koopman, S.J.M.

    1998-01-01

    The analysis of non-Gaussian time series using state space models is considered from both classical and Bayesian perspectives. The treatment in both cases is based on simulation using importance sampling and antithetic variables; Monte Carlo Markov chain methods are not employed. Non-Gaussian

  8. Constraining the regular Galactic magnetic field with the 5-year WMAP polarization measurements at 22 GHz

    Science.gov (United States)

    Ruiz-Granados, B.; Rubiño-Martín, J. A.; Battaner, E.

    2010-11-01

    Context. The knowledge of the regular (large scale) component of the Galactic magnetic field gives important information about the structure and dynamics of the Milky Way, and also constitutes a basic tool to determine cosmic ray trajectories. It can also provide clear windows where primordial magnetic fields could be detected. Aims: We aim to obtain the regular (large scale) pattern of the magnetic field distribution of the Milky Way that better fits the polarized synchrotron emission as seen by the WMAP satellite in the 5 years data at 22 GHz. Methods: We have done a systematic study of a number of Galactic magnetic field models: axisymmetric (with and without radial dependence on the field strength), bisymmetric (with and without radial dependence), logarithmic spiral arms, concentric circular rings with reversals and bi-toroidal. We have explored the parameter space defining each of these models using a grid-based approach. In total, more than one million models were computed. The model selection was done using a Bayesian approach. For each model, the posterior distributions were obtained and marginalized over the unwanted parameters to obtain the marginal (one-parameter) probability distribution functions. Results: In general, axisymmetric models provide a better description of the halo component, although with regard to their goodness-of-fit, the other models cannot be rejected. In the case of the disk component, the analysis is not very sensitive for obtaining the disk large-scale structure, because of the effective available area (less than 8% of the whole map and less than 40% of the disk). Nevertheless, within a given family of models, the best-fit parameters are compatible with those found in the literature. Conclusions: The family of models that better describes the polarized synchrotron halo emission is the axisymmetric one, with magnetic spiral arms with a pitch angle of ≈24°, and a strong vertical field of 1 μG at z ≈ 1 kpc. When a radial
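
    A minimal sketch of the grid-based Bayesian machinery described above follows: evaluate a posterior on a parameter grid, then marginalize by summing over the unwanted direction. The toy two-parameter forward model, the "pitch angle" and "field strength" names, and the synthetic data are placeholders, not the Galactic-field models of the paper.

        import numpy as np

        rng = np.random.default_rng(3)

        # Toy forward model: predicted signal depends on a pitch angle and a field strength
        x = np.linspace(0.0, 1.0, 40)
        def forward(pitch_deg, B):
            return B * np.sin(np.deg2rad(pitch_deg)) * x

        sigma = 0.05
        data = forward(24.0, 1.0) + rng.normal(0.0, sigma, x.size)

        pitch_grid = np.linspace(5.0, 45.0, 81)
        B_grid = np.linspace(0.2, 2.0, 91)

        logpost = np.empty((pitch_grid.size, B_grid.size))
        for i, p in enumerate(pitch_grid):
            for j, B in enumerate(B_grid):
                chi2 = np.sum((data - forward(p, B)) ** 2) / sigma**2
                logpost[i, j] = -0.5 * chi2            # flat priors on the grid

        post = np.exp(logpost - logpost.max())
        post /= post.sum()

        pdf_pitch = post.sum(axis=1)                   # marginalize over B
        pdf_B = post.sum(axis=0)                       # marginalize over pitch
        print("posterior-mean pitch angle:", np.sum(pitch_grid * pdf_pitch))
        print("posterior-mean field strength:", np.sum(B_grid * pdf_B))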

  9. A CMB foreground study in WMAP data: Extragalactic point sources and zodiacal light emission

    Science.gov (United States)

    Chen, Xi

    The Cosmic Microwave Background (CMB) radiation is the remnant heat from the Big Bang. It serves as a primary tool to understand the global properties, content and evolution of the universe. Since 2001, NASA's Wilkinson Microwave Anisotropy Probe (WMAP) satellite has been mapping the full-sky anisotropy with unprecedented accuracy, precision and reliability. The CMB angular power spectrum calculated from the WMAP full sky maps not only enables accurate testing of cosmological models, but also places significant constraints on model parameters. The CMB signal in the WMAP sky maps is contaminated by microwave emission from the Milky Way and from extragalactic sources. Therefore, in order to use the maps reliably for cosmological studies, the foreground signals must be well understood and removed from the maps. This thesis focuses on the separation of two foreground contaminants from the WMAP maps: extragalactic point sources and zodiacal light emission. Extragalactic point sources constitute the most important foreground on small angular scales. Various methods have been applied to the WMAP single frequency maps to extract sources. However, due to the limited angular resolution of WMAP, it is possible to confuse positive CMB excursions with point sources or miss sources that are embedded in negative CMB fluctuations. We present a novel CMB-free source-finding technique that utilizes the spectral difference between point sources and the CMB to form internal linear combinations of multifrequency maps to suppress the CMB and better reveal sources. When applied to the WMAP 41, 61 and 94 GHz maps, this technique has not only enabled detection of sources that were previously cataloged by independent methods, but has also revealed new sources. Without the noise contribution from the CMB, the method gains sensitivity rapidly with integration time: the number of detections varies as ∝ t^0.72 in the two-band search and ∝ t^0.70 in the three-band search from one year to five years
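
    The sketch below illustrates the linear-combination idea in its simplest form: choose per-frequency weights that null a component with the CMB's flat (thermodynamic-unit) spectrum, keep unit response to an assumed source spectrum, and minimize the residual variance. The channel covariance, source spectral index, and weight construction are illustrative assumptions rather than the thesis pipeline.

        import numpy as np

        freqs = np.array([41.0, 61.0, 94.0])         # GHz channels (illustrative)
        a_cmb = np.ones(3)                           # CMB is flat in thermodynamic units
        a_src = (freqs / 41.0) ** -2.0               # assumed source spectrum (placeholder index)

        # Empirical channel covariance of the maps (here a placeholder matrix)
        C = np.array([[1.0, 0.6, 0.4],
                      [0.6, 1.2, 0.5],
                      [0.4, 0.5, 1.5]])

        # Minimize w^T C w subject to w.a_cmb = 0 and w.a_src = 1:
        # w = C^-1 A (A^T C^-1 A)^-1 e with A = [a_cmb, a_src], e = (0, 1)
        A = np.column_stack([a_cmb, a_src])
        Cinv = np.linalg.inv(C)
        M = A.T @ Cinv @ A
        w = Cinv @ A @ np.linalg.solve(M, np.array([0.0, 1.0]))

        print("weights:", w)
        print("CMB response (should be ~0):", w @ a_cmb)
        print("source response (should be ~1):", w @ a_src)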

  10. Reproducibility of parameter learning with missing observations in naive Wnt Bayesian network trained on colorectal cancer samples and doxycycline-treated cell lines.

    Science.gov (United States)

    Sinha, Shriprakash

    2015-07-01

    In this manuscript the reproducibility of parameter learning with missing observations in a naive Bayesian network and its effect on the prediction results for Wnt signaling activation in colorectal cancer is tested. The training of the network is carried out separately on doxycycline-treated LS174T cell lines (GSE18560) as well as normal and adenoma samples (GSE8671). A computational framework to test the reproducibility of the parameters is designed in order to check the veracity of the prediction results. Detailed experimental analysis suggests that the prediction results are accurate and reproducible with negligible deviations. Anomalies in the estimated parameters are attributed to representation issues of the Bayesian network model. High prediction accuracies are reported for normal (N) and colon-related adenomas (AD), colorectal cancer (CRC), carcinomas (C), adenocarcinomas (ADC) and replication error colorectal cancer (RER CRC) test samples. Test samples from inflammatory bowel diseases (IBD) do not fare well in the prediction test. Also, an interesting case regarding hypothesis testing came up while assessing the statistical significance of the different design setups of the Bayesian network model. It was found that hypothesis testing may not be the correct way to check the significance between design setups, especially when the structure of the model is the same, given that the model is trained on a single piece of test data. The significance test does have value when the datasets are independent. Finally, in comparison to the biologically inspired models, the naive Bayesian model may give accurate results, but this accuracy comes at the cost of a loss of crucial biological knowledge which might help reveal hidden relations among intra/extracellular factors affecting the Wnt pathway.

  11. Further investigation about inflation and reheating stages based on the Planck and WMAP-9

    Science.gov (United States)

    Ghayour, Basem

    The potential V(ϕ) = λϕ^n is responsible for the inflation of the universe as the scalar field ϕ oscillates quickly around a point where V(ϕ) has a minimum. The end of this stage plays an important role in the further evolution of the universe, because the particles created at that point reheat the universe. The inflation and reheating stages are often described by power-law expansion, S(η) ∝ η^(1+β) and S(η) ∝ η^(1+β_s), respectively. The reheating temperature (T_rh) and β_s give us valuable information about the reheating stage. Recently, the behavior of T_rh has been studied based on slow-roll inflation and the initial condition of quantum normalization. It has been shown that there is some discrepancy in T_rh, due to the value of β_s, under the conditions of slow-roll inflation and quantum normalization [M. Tong, Class. Quantum Grav. 30 (2013) 055013]. The author of that work therefore argues that quantum normalization may not be a good initial condition, but it seems that we can remove this discrepancy by determining the appropriate parameter β_s, so that the temperatures obtained from the calculated β_s are compatible with both of the mentioned conditions. From a given β_s, we can then calculate T_rh, the tensor-to-scalar ratio r, and the parameters β and n based on the Planck and WMAP-9 data. The resulting values of r, β_s, β and n are consistent with their constraints. The results for T_rh also agree with its general range and with the special range based on the DECIGO and BBO detectors.

  12. Bayesian models in cognitive neuroscience: A tutorial

    NARCIS (Netherlands)

    O'Reilly, J.X.; Mars, R.B.

    2015-01-01

    This chapter provides an introduction to Bayesian models and their application in cognitive neuroscience. The central feature of Bayesian models, as opposed to other classes of models, is that Bayesian models represent the beliefs of an observer as probability distributions, allowing them to

  13. Spectral Bayesian Knowledge Tracing

    Science.gov (United States)

    Falakmasir, Mohammad; Yudelson, Michael; Ritter, Steve; Koedinger, Ken

    2015-01-01

    Bayesian Knowledge Tracing (BKT) has been in wide use for modeling student skill acquisition in Intelligent Tutoring Systems (ITS). BKT tracks and updates student's latent mastery of a skill as a probability distribution of a binary variable. BKT does so by accounting for observed student successes in applying the skill correctly, where success is…
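
    A minimal sketch of the standard BKT forward update (not the spectral variant this record describes) is given below: compute the posterior probability of mastery after each observed response, then apply the learning transition. The guess, slip, and learning-rate values are illustrative.

        def bkt_update(p_mastery, correct, guess=0.2, slip=0.1, learn=0.15):
            """One standard Bayesian Knowledge Tracing step."""
            if correct:
                num = p_mastery * (1 - slip)
                den = num + (1 - p_mastery) * guess
            else:
                num = p_mastery * slip
                den = num + (1 - p_mastery) * (1 - guess)
            p_given_obs = num / den
            # learning transition
            return p_given_obs + (1 - p_given_obs) * learn

        p = 0.3                     # prior probability the skill is mastered
        for obs in [1, 0, 1, 1, 1]: # observed responses (1 = correct)
            p = bkt_update(p, obs)
            print(round(p, 3))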

  14. Bayesian programming

    CERN Document Server

    Bessiere, Pierre; Ahuactzin, Juan Manuel; Mekhnacha, Kamel

    2013-01-01

    Probability as an Alternative to Boolean Logic. While logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework to deal with incomplete and uncertain data. Decision-Making Tools and Methods for Incomplete and Uncertain Data. Emphasizing probability as an alternative to Boolean

  15. Nonthermal WIMPs as ``dark radiation'' in light of ATACAMA, SPT, WMAP9, and Planck

    Science.gov (United States)

    Kelso, Chris; Profumo, Stefano; Queiroz, Farinaldo S.

    2013-07-01

    The Planck and WMAP9 satellites, as well as the ATACAMA and South Pole telescopes, have recently presented results on the angular power spectrum of the cosmic microwave background. Data tentatively point to the existence of an extra radiation component in the early Universe. Here, we show that this extra component can be mimicked by ordinary weakly interacting massive particle (WIMP) dark matter, most of which is cold but with a small fraction being nonthermally produced in a relativistic state. We present a few example theories where this scenario is explicitly realized and explore the relevant parameter space consistent with big bang nucleosynthesis, cosmic microwave background, and structure formation bounds.

  16. Uncertainty quantification of GEOS-5 L-band radiative transfer model parameters using Bayesian inference and SMOS observations

    NARCIS (Netherlands)

    De Lannoy, G.J.M.; Reichle, R.H.; Vrugt, J.A.

    2014-01-01

    Uncertainties in L-band (1.4 GHz) microwave radiative transfer modeling (RTM) affect the simulation of brightness temperatures (Tb) over land and the inversion of satellite-observed Tb into soil moisture retrievals. In particular, accurate estimates of the microwave soil roughness, vegetation

  17. Bayesian coronal seismology

    Science.gov (United States)

    Arregui, Iñigo

    2018-01-01

    In contrast to the situation in a laboratory, the study of the solar atmosphere has to be pursued without direct access to the physical conditions of interest. Information is therefore incomplete and uncertain, and inference methods need to be employed to diagnose the physical conditions and processes. One such method, solar atmospheric seismology, makes use of observed and theoretically predicted properties of waves to infer plasma and magnetic field properties. A recent development in solar atmospheric seismology is the use of inversion and model comparison methods based on Bayesian analysis. In this paper, the philosophy and methodology of Bayesian analysis are first explained. Then, we provide an account of what has been achieved so far from the application of these techniques to solar atmospheric seismology and a prospect of possible future extensions.

  18. Bayesian Graphical Models

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Nielsen, Thomas Dyhre

    2016-01-01

    Mathematically, a Bayesian graphical model is a compact representation of the joint probability distribution for a set of variables. The most frequently used type of Bayesian graphical models are Bayesian networks. The structural part of a Bayesian graphical model is a graph consisting of nodes...

  19. FERMI-LAT AND WMAP OBSERVATIONS OF THE PUPPIS A SUPERNOVA REMNANT

    Energy Technology Data Exchange (ETDEWEB)

    Hewitt, J. W. [NASA Goddard Space Flight Center, Greenbelt, MD 20771 (United States); Grondin, M.-H. [Max-Planck-Institut fuer Kernphysik, D-69029 Heidelberg (Germany); Lemoine-Goumard, M.; Reposeur, T. [Centre d' Etudes Nucleaires de Bordeaux-Gradignan, Universite Bordeaux 1, CNRS/IN2p3, F-33175 Gradignan (France); Ballet, J. [Laboratoire AIM, CEA-IRFU/CNRS/Universite Paris Diderot, Service d' Astrophysique, CEA Saclay, F-91191 Gif sur Yvette (France); Tanaka, T., E-mail: john.w.hewitt@nasa.gov, E-mail: marie-helene.grondin@mpi-hd.mpg.de, E-mail: lemoine@cenbg.in2p3.fr [W. W. Hansen Experimental Physics Laboratory, Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics and SLAC National Accelerator Laboratory, Stanford University, Stanford, CA 94305 (United States)

    2012-11-10

    We report the detection of GeV γ-ray emission from the supernova remnant (SNR) Puppis A with the Fermi Gamma-Ray Space Telescope. Puppis A is among the faintest SNRs yet detected at GeV energies, with a luminosity of only 2.7 × 10^34 (D/2.2 kpc)^2 erg s^-1 between 1 and 100 GeV. The γ-ray emission from the remnant is spatially extended, with a morphology matching that of the radio and X-ray emission, and is well described by a simple power law with an index of 2.1. We attempt to model the broadband spectral energy distribution (SED), from radio to γ-rays, using standard nonthermal emission mechanisms. To constrain the relativistic electron population we use 7 years of Wilkinson Microwave Anisotropy Probe data to extend the radio spectrum up to 93 GHz. Both leptonic- and hadronic-dominated models can reproduce the nonthermal SED, requiring a total content of cosmic-ray electrons and protons accelerated in Puppis A of at least W_CR ≈ (1-5) × 10^49 erg.

  20. Assessing the Association between Natural Food Folate Intake and Blood Folate Concentrations: A Systematic Review and Bayesian Meta-Analysis of Trials and Observational Studies

    Directory of Open Access Journals (Sweden)

    Claire M. Marchetta

    2015-04-01

    Full Text Available Folate is found naturally in foods or as synthetic folic acid in dietary supplements and fortified foods. Adequate periconceptional folic acid intake can prevent neural tube defects. Folate intake impacts blood folate concentration; however, the dose-response between natural food folate and blood folate concentrations has not been well described. We estimated this association among healthy females. A systematic literature review identified studies (1/1992–3/2014) with both natural food folate intake alone and blood folate concentration among females aged 12–49 years. Bayesian methods were used to estimate regression model parameters describing the association between natural food folate intake and subsequent blood folate concentration. Seven controlled trials and 29 observational studies met the inclusion criteria. For the six studies using microbiologic assay (MA) included in the meta-analysis, we estimate that a 6% (95% Credible Interval (CrI): 4%, 9%) increase in red blood cell (RBC) folate concentration and a 7% (95% CrI: 1%, 12%) increase in serum/plasma folate concentration can occur for every 10% increase in natural food folate intake. Using modeled results, we estimate that a natural food folate intake of ≥450 μg dietary folate equivalents (DFE)/day could achieve the lower bound of an RBC folate concentration (~1050 nmol/L) associated with the lowest risk of a neural tube defect. Natural food folate intake affects blood folate concentration and adequate intakes could help women achieve an RBC folate concentration associated with a risk of 6 neural tube defects/10,000 live births.
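
    The "x% increase in intake gives y% increase in blood folate" statement corresponds to the slope of a straight-line fit on the log-log scale. The sketch below fits such a line by random-walk Metropolis on synthetic study-level data; the intake and RBC values, priors, and step sizes are all invented, and the hierarchical structure of the actual meta-analysis is not reproduced.

        import numpy as np

        rng = np.random.default_rng(4)

        # Synthetic study-level data: natural food folate intake (ug DFE/day)
        # and RBC folate concentration (nmol/L)
        intake = rng.uniform(150.0, 600.0, 30)
        rbc = 400.0 * (intake / 300.0) ** 0.6 * np.exp(rng.normal(0.0, 0.1, 30))

        x, y = np.log(intake), np.log(rbc)

        def log_post(theta):
            a, b, log_s = theta
            resid = y - (a + b * x)
            n = y.size
            return -n * log_s - 0.5 * np.sum(resid**2) / np.exp(2 * log_s)  # flat priors

        theta = np.array([0.0, 0.5, 0.0])
        lp = log_post(theta)
        slopes = []
        for it in range(20000):
            prop = theta + rng.normal(0.0, [0.05, 0.02, 0.05])
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:
                theta, lp = prop, lp_prop
            if it > 5000:
                slopes.append(theta[1])

        b = np.array(slopes)
        pct = (1.10**b - 1.0) * 100.0   # % change in RBC folate per 10% intake increase
        lo, hi = np.percentile(pct, [2.5, 97.5])
        print(f"median effect of a 10% intake increase: {np.median(pct):.1f}%")
        print(f"95% CrI: {lo:.1f}%, {hi:.1f}%")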

  1. Introduction to Bayesian statistics

    CERN Document Server

    Bolstad, William M

    2017-01-01

    There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly-added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly-developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for Multivariate Normal mean vector; Bayesian inference for Multiple Linear Regression Model; and Computati...

  2. Comparison between Fisherian and Bayesian approach to ...

    African Journals Online (AJOL)

    ... the Bayesian approach assigns an observed unit to the group with the greatest posterior probability. Fisher's linear discriminant analysis, though the most widely used method of classification because of its simplicity and optimality properties, is normally used for two-group cases. However, the Bayesian approach is found to ...
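
    A minimal sketch of the "assign to the group with the greatest posterior probability" rule follows, using Gaussian class-conditional densities and unequal priors; with shared covariances and equal priors this reduces to Fisher's rule. The training data, priors, and query point are synthetic.

        import numpy as np
        from scipy.stats import multivariate_normal

        rng = np.random.default_rng(5)

        # Synthetic two-group training data
        g1 = rng.multivariate_normal([0, 0], [[1, 0.3], [0.3, 1]], 200)
        g2 = rng.multivariate_normal([2, 1], [[1.5, -0.2], [-0.2, 0.8]], 100)
        priors = np.array([2 / 3, 1 / 3])          # unequal prior group probabilities

        params = [(g.mean(axis=0), np.cov(g.T)) for g in (g1, g2)]

        def bayes_classify(x):
            post = np.array([pi * multivariate_normal.pdf(x, mean=m, cov=S)
                             for pi, (m, S) in zip(priors, params)])
            return int(np.argmax(post)), post / post.sum()

        label, posterior = bayes_classify([1.0, 0.5])
        print("assigned group:", label + 1, "posterior:", np.round(posterior, 3))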

  3. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2003-01-01

    As the power of Bayesian techniques has become more fully realized, the field of artificial intelligence has embraced Bayesian methodology and integrated it to the point where an introduction to Bayesian techniques is now a core course in many computer science programs. Unlike other books on the subject, Bayesian Artificial Intelligence keeps mathematical detail to a minimum and covers a broad range of topics. The authors integrate Bayesian network technology and Bayesian network learning, and apply both to knowledge engineering. They emphasize understanding and intuition but also provide the algorithms and technical background needed for applications. Software, exercises, and solutions are available on the authors' website.

  4. Foreground removal from WMAP 5 yr temperature maps using an MLP neural network

    DEFF Research Database (Denmark)

    Nørgaard-Nielsen, Hans Ulrik

    2010-01-01

    CMB signal makes it essential to minimize the systematic errors in the CMB temperature determinations. Methods. The feasibility of using simple neural networks to extract the CMB signal from detailed simulated data has already been demonstrated. Here, simple neural networks are applied to the WMAP 5 yr temperature data without using any auxiliary data. Results. A simple multilayer perceptron neural network with two hidden layers provides temperature estimates over more than 75 per cent of the sky with random errors significantly below those previously extracted from these data. Also, the systematic errors, i.e. errors correlated with the Galactic foregrounds, are very small. Conclusions. With these results the neural network method is well prepared for dealing with the high-quality CMB data from the ESA Planck Surveyor satellite. © ESO, 2010.
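
    The sketch below shows, with scikit-learn, the general idea of training a small two-hidden-layer perceptron to map multi-frequency pixel values to the CMB value. The spectra, noise levels, number of bands, and network size are placeholders, and no real WMAP data or the paper's trained network are used.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(6)

        n_pix = 20000
        cmb = rng.normal(0.0, 70.0, n_pix)                    # uK, synthetic CMB
        fg_template = np.abs(rng.normal(0.0, 40.0, n_pix))    # synthetic foreground
        scaling = np.array([3.0, 1.5, 1.0, 0.7, 0.5])         # foreground scaling per band
        noise = rng.normal(0.0, 10.0, (n_pix, 5))

        X = cmb[:, None] + fg_template[:, None] * scaling + noise  # 5 "frequency maps"
        y = cmb

        net = MLPRegressor(hidden_layer_sizes=(8, 8), max_iter=500, random_state=0)
        net.fit(X[:15000], y[:15000])

        pred = net.predict(X[15000:])
        print("RMS residual (uK):", np.sqrt(np.mean((pred - y[15000:]) ** 2)))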

  5. Collisions with other universes the optimal analysis of the WMAP data

    CERN Document Server

    Osborne, Stephen; Smith, Kendrick

    2013-01-01

    An appealing theory is that our current patch of universe was born as a nucleation bubble from a phase of false vacuum eternal inflation. We search for evidence for this theory by looking for the signal imprinted on the CMB that is generated when another bubble "universe" collides with our own. We create an efficient and optimal estimator for the signal in the WMAP 7-year data. We find no detectable signal, and constrain the amplitude, a, of the initial curvature perturbation that would be generated by a collision: -4.66 × 10^-8 < a (sin θ_bubble)^(4/3) < 4.73 × 10^-8 Mpc^-1 at 95% confidence, where θ_bubble is the angular radius of the bubble signal.
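
    A generic single-template matched filter, far simpler than the full WMAP analysis, already illustrates this kind of amplitude constraint: for data d = a·t + n with noise covariance N, the optimal linear estimator is â = (tᵀN⁻¹d)/(tᵀN⁻¹t) with variance (tᵀN⁻¹t)⁻¹. The toy profile, noise level, and pixelization below are invented.

        import numpy as np

        rng = np.random.default_rng(7)

        npix = 512
        x = np.linspace(-1.0, 1.0, npix)
        template = np.exp(-0.5 * (x / 0.2) ** 2)       # toy "collision" profile

        noise_var = np.full(npix, 0.5)                 # diagonal noise covariance
        a_true = 0.0                                   # no signal injected
        d = a_true * template + rng.normal(0.0, np.sqrt(noise_var))

        Ninv_t = template / noise_var
        a_hat = (Ninv_t @ d) / (Ninv_t @ template)
        sigma_a = 1.0 / np.sqrt(Ninv_t @ template)

        print(f"estimated amplitude: {a_hat:.4f} +/- {sigma_a:.4f}")
        print(f"95% interval: [{a_hat - 1.96*sigma_a:.4f}, {a_hat + 1.96*sigma_a:.4f}]")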

  6. Attention in a bayesian framework

    DEFF Research Database (Denmark)

    Whiteley, Louise Emma; Sahani, Maneesh

    2012-01-01

    The behavioral phenomena of sensory attention are thought to reflect the allocation of a limited processing resource, but there is little consensus on the nature of the resource or why it should be limited. Here we argue that a fundamental bottleneck emerges naturally within Bayesian models of perception, and use this observation to frame a new computational account of the need for, and action of, attention - unifying diverse attentional phenomena in a way that goes beyond previous inferential, probabilistic and Bayesian models. Attentional effects are most evident in cluttered environments, and include both selective phenomena, where attention is invoked by cues that point to particular stimuli, and integrative phenomena, where attention is invoked dynamically by endogenous processing. However, most previous Bayesian accounts of attention have focused on describing relatively simple experimental

  7. Bayesian Inference of FRC plasmas

    Science.gov (United States)

    Romero, Jesus A.; Dettrick, Sean; Onofri, Marco; TAE Team

    2017-10-01

    Bayesian analysis techniques are currently being used at TAE to infer FRC magnetic topology and the radial profile of the electron density. The Bayesian method provides all the solutions compatible with both the prior assumptions and the measurements in the form of a probability distribution termed the posterior, from which the most likely solution and its uncertainty can readily be obtained. Bayesian analysis of field-reversed configurations reveals strong field reversal on axis as well as non-monotonic radial density profiles. The latter feature is only observed in global transport simulations in cases where significant fast ion pressure and current drive are present. Hence the inferred non-monotonic density profiles are indicative of current drive in the experiment.

  8. Approximate Bayesian Computation

    Science.gov (United States)

    Sunnåker, Mikael; Corander, Jukka; Foll, Matthieu; Dessimoz, Christophe

    2013-01-01

    Approximate Bayesian computation (ABC) constitutes a class of computational methods rooted in Bayesian statistics. In all model-based statistical inference, the likelihood function is of central importance, since it expresses the probability of the observed data under a particular statistical model, and thus quantifies the support data lend to particular values of parameters and to choices among different models. For simple models, an analytical formula for the likelihood function can typically be derived. However, for more complex models, an analytical formula might be elusive or the likelihood function might be computationally very costly to evaluate. ABC methods bypass the evaluation of the likelihood function. In this way, ABC methods widen the realm of models for which statistical inference can be considered. ABC methods are mathematically well-founded, but they inevitably make assumptions and approximations whose impact needs to be carefully assessed. Furthermore, the wider application domain of ABC exacerbates the challenges of parameter estimation and model selection. ABC has rapidly gained popularity over the last years and in particular for the analysis of complex problems arising in biological sciences (e.g., in population genetics, ecology, epidemiology, and systems biology). PMID:23341757
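
    A minimal sketch of the basic ABC rejection scheme follows: draw parameters from the prior, simulate data under the model, and keep draws whose summary statistic falls within a tolerance of the observed one. The model (a normal mean with known spread), the summary statistic, and the tolerance are deliberately simple and illustrative.

        import numpy as np

        rng = np.random.default_rng(8)

        # "Observed" data from a normal with unknown mean, known sd = 1
        obs = rng.normal(2.0, 1.0, 50)
        s_obs = obs.mean()                     # summary statistic

        accepted = []
        n_draws = 50000
        for _ in range(n_draws):
            mu = rng.normal(0.0, 5.0)          # prior draw
            sim = rng.normal(mu, 1.0, 50)      # simulate under the model
            if abs(sim.mean() - s_obs) < 0.1:  # tolerance
                accepted.append(mu)

        accepted = np.array(accepted)
        print("ABC posterior mean:", accepted.mean().round(3))
        print("acceptance rate:", len(accepted) / n_draws)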

  9. Neutrino cosmology after WMAP 7-year data and LHC first Z' bounds.

    Science.gov (United States)

    Anchordoqui, Luis Alfredo; Goldberg, Haim

    2012-02-24

    The gauge-extended U(1)_C × SU(2)_L × U(1)_IR × U(1)_L model elevates the global symmetries of the standard model (baryon number B and lepton number L) to local gauge symmetries. The U(1)_L symmetry leads to three superweakly interacting right-handed neutrinos. This also renders a B-L symmetry nonanomalous. The superweak interactions of these Dirac states permit ν_R decoupling just above the QCD phase transition: 175 ≲ T_dec(ν_R)/MeV ≲ 250. In this transitional region, the residual temperature ratio between ν_L and ν_R generates extra relativistic degrees of freedom at BBN and at the CMB epochs. Consistency with both WMAP 7-year data and recent estimates of the primordial 4He mass fraction is achieved for 3

  10. Spectrum of the Anomalous Microwave Emission in the North Celestial Pole with WMAP 7-Year Data

    Directory of Open Access Journals (Sweden)

    Anna Bonaldi

    2012-01-01

    Full Text Available We estimate the frequency spectrum of the diffuse anomalous microwave emission (AME) in the North Celestial Pole (NCP) region of the sky with the Correlated Component Analysis (CCA) component separation method applied to WMAP 7-yr data. The NCP is a suitable region for this analysis because the AME is weakly contaminated by synchrotron and free-free emission. By modeling the AME component as a peaked spectrum we estimate the peak frequency to be 21.7±0.8 GHz, in agreement with previous analyses which favored ν_p < 23 GHz. The ability of our method to correctly recover the position of the peak is verified through simulations. We compare the estimated AME spectrum with theoretical spinning dust models to constrain the hydrogen density n_H. The best results are obtained with densities around 0.2–0.3 cm^-3, typical of warm ionised medium (WIM) to warm neutral medium (WNM) conditions. The degeneracy with the gas temperature prevents an accurate determination of n_H, especially for low hydrogen ionization fractions, where densities of a few cm^-3 are also allowed.
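
    As a toy version of "estimate the peak frequency of a peaked spectrum", the sketch below fits a parabola in log-flux versus log-frequency and reads off its vertex. This phenomenological stand-in is not the CCA pipeline or a spinning-dust model; the frequencies, the assumed peak, and the noise are synthetic.

        import numpy as np

        rng = np.random.default_rng(9)

        freq = np.array([23.0, 33.0, 41.0, 61.0, 94.0])     # GHz
        true_peak = 21.7
        flux = np.exp(-0.8 * (np.log(freq) - np.log(true_peak)) ** 2)
        flux *= np.exp(rng.normal(0.0, 0.03, freq.size))    # multiplicative noise

        # Parabola in log-log space: log S = c0 + c1*logv + c2*logv^2
        logv, logS = np.log(freq), np.log(flux)
        c2, c1, c0 = np.polyfit(logv, logS, 2)

        peak = np.exp(-c1 / (2.0 * c2))                     # vertex of the parabola
        print(f"estimated peak frequency: {peak:.1f} GHz")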

  11. Understanding Computational Bayesian Statistics

    CERN Document Server

    Bolstad, William M

    2011-01-01

    A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic

  12. Bayesian statistics an introduction

    CERN Document Server

    Lee, Peter M

    2012-01-01

    Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel

  13. Bayesian analysis of rare events

    Science.gov (United States)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
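
    The rejection-sampling reinterpretation that BUS builds on can be sketched very simply: draw θ from the prior and u uniformly, and accept whenever u ≤ L(θ)/c with c an upper bound on the likelihood; accepted draws are posterior samples, and the same acceptance event can then be handed to a rare-event solver such as Subset Simulation (not shown here). The prior, observation, and bound below are illustrative, and the conjugate answer is printed only as a check.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(10)

        # Prior: theta ~ N(0, 1); single observation y = 1.4 with obs. noise sd = 0.5
        y, sd = 1.4, 0.5
        def likelihood(theta):
            return norm.pdf(y, loc=theta, scale=sd)

        c = norm.pdf(0.0, scale=sd)           # upper bound on the likelihood

        theta = rng.normal(0.0, 1.0, 100000)
        u = rng.uniform(size=theta.size)
        accepted = theta[u <= likelihood(theta) / c]

        print("posterior mean (rejection):", accepted.mean().round(3))
        # exact conjugate answer for comparison
        print("posterior mean (exact):", (y / sd**2) / (1 / 1.0 + 1 / sd**2))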

  14. Multinomial probit Bayesian additive regression trees.

    Science.gov (United States)

    Kindo, Bereket P; Wang, Hao; Peña, Edsel A

    This article proposes multinomial probit Bayesian additive regression trees (MPBART) as a multinomial probit extension of BART - Bayesian additive regression trees. MPBART is flexible to allow inclusion of predictors that describe the observed units as well as the available choice alternatives. Through two simulation studies and four real data examples, we show that MPBART exhibits very good predictive performance in comparison to other discrete choice and multiclass classification methods. To implement MPBART, the R package mpbart is freely available from CRAN repositories.

  15. Variations on Bayesian Prediction and Inference

    Science.gov (United States)

    2016-05-09

    A Bayesian approach, based on updating prior information in light of new observations via Bayes's formula, has both nice...

  16. Maximum a posteriori probability estimates in infinite-dimensional Bayesian inverse problems

    Science.gov (United States)

    Helin, T.; Burger, M.

    2015-08-01

    A demanding challenge in Bayesian inversion is to efficiently characterize the posterior distribution. This task is problematic especially in high-dimensional non-Gaussian problems, where the structure of the posterior can be very chaotic and difficult to analyse. Current inverse problem literature often approaches the problem by considering suitable point estimators for the task. Typically the choice is made between the maximum a posteriori (MAP) or the conditional mean (CM) estimate. The benefits of either choice are not well-understood from the perspective of infinite-dimensional theory. Most importantly, there exists no general scheme regarding how to connect the topological description of a MAP estimate to a variational problem. The recent results by Dashti and others (Dashti et al 2013 Inverse Problems 29 095017) resolve this issue for nonlinear inverse problems in a Gaussian framework. In this work we improve the current understanding by introducing a novel concept called the weak MAP (wMAP) estimate. We show that any MAP estimate in the sense of Dashti et al (2013 Inverse Problems 29 095017) is a wMAP estimate and, moreover, how the wMAP estimate connects to a variational formulation in general infinite-dimensional non-Gaussian problems. The variational formulation makes it possible to study many properties of the infinite-dimensional MAP estimate that were previously impossible to study. In a recent work by the authors (Burger and Lucka 2014 Maximum a posteriori estimates in linear inverse problems with logconcave priors are proper Bayes estimators preprint) the MAP estimator was studied in the context of the Bayes cost method. Using Bregman distances, proper convex Bayes cost functions were introduced for which the MAP estimator is the Bayes estimator. Here, we generalize these results to the infinite-dimensional setting. Moreover, we discuss the implications of our results for some examples of prior models such as the Besov prior and hierarchical prior.

  17. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  18. Detecting Exoplanets using Bayesian Object Detection

    Science.gov (United States)

    Feroz, Farhan

    2015-08-01

    Detecting objects from noisy data-sets is common practice in astrophysics. Object detection presents a particular challenge in terms of statistical inference, not only because of its multi-modal nature but also because it combines both the parameter estimation (for characterizing objects) and model selection problems (in order to quantify the detection). Bayesian inference provides a mathematically rigorous solution to this problem by calculating marginal posterior probabilities of models with different number of sources, but the use of this method in astrophysics has been hampered by the computational cost of evaluating the Bayesian evidence. Nonetheless, Bayesian model selection has the potential to improve the interpretation of existing observational data. I will discuss several Bayesian approaches to object detection problems, both in terms of their theoretical framework and also the practical details about carrying out the computation. I will also describe some recent applications of these methods in the detection of exoplanets.

  19. Konstruksi Bayesian Network Dengan Algoritma Bayesian Association Rule Mining Network

    OpenAIRE

    Octavian

    2015-01-01

    In recent years, the Bayesian network has become a popular concept used in many areas of life, such as making decisions and determining the probability that an event will occur. Unfortunately, constructing the structure of a Bayesian network is itself not a simple task. This study therefore introduces the Bayesian Association Rule Mining Network algorithm to make it easier to construct a Bayesian network from data ...

  20. BAYESIAN IMAGE RESTORATION, USING CONFIGURATIONS

    Directory of Open Access Journals (Sweden)

    Thordis Linda Thorarinsdottir

    2011-05-01

    Full Text Available In this paper, we develop a Bayesian procedure for removing noise from images that can be viewed as noisy realisations of random sets in the plane. The procedure utilises recent advances in configuration theory for noise-free random sets, where the probabilities of observing the different boundary configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for salt and pepper noise. The inference in the model is discussed in detail for 3 × 3 and 5 × 5 configurations and examples of the performance of the procedure are given.

  1. Bayesian computation with R

    CERN Document Server

    Albert, Jim

    2009-01-01

    There has been a dramatic growth in the development and application of Bayesian inferential methods. Some of this growth is due to the availability of powerful simulation-based algorithms to summarize posterior distributions. There has been also a growing interest in the use of the system R for statistical analyses. R's open source nature, free availability, and large number of contributor packages have made R the software of choice for many statisticians in education and industry. Bayesian Computation with R introduces Bayesian modeling by the use of computation using the R language. The earl

  2. Bayesian data analysis for newcomers.

    Science.gov (United States)

    Kruschke, John K; Liddell, Torrin M

    2017-04-12

    This article explains the foundational concepts of Bayesian data analysis using virtually no mathematical notation. Bayesian ideas already match your intuitions from everyday reasoning and from traditional data analysis. Simple examples of Bayesian data analysis are presented that illustrate how the information delivered by a Bayesian analysis can be directly interpreted. Bayesian approaches to null-value assessment are discussed. The article clarifies misconceptions about Bayesian methods that newcomers might have acquired elsewhere. We discuss prior distributions and explain how they are not a liability but an important asset. We discuss the relation of Bayesian data analysis to Bayesian models of mind, and we briefly discuss what methodological problems Bayesian data analysis is not meant to solve. After you have read this article, you should have a clear sense of how Bayesian data analysis works and the sort of information it delivers, and why that information is so intuitive and useful for drawing conclusions from data.

  3. Bayesian Networks An Introduction

    CERN Document Server

    Koski, Timo

    2009-01-01

    Bayesian Networks: An Introduction provides a self-contained introduction to the theory and applications of Bayesian networks, a topic of interest and importance for statisticians, computer scientists and those involved in modelling complex data sets. The material has been extensively tested in classroom teaching and assumes a basic knowledge of probability, statistics and mathematics. All notions are carefully explained and feature exercises throughout. Features include: An introduction to Dirichlet Distribution, Exponential Families and their applications; A detailed description of learni

  4. Basics of Bayesian methods.

    Science.gov (United States)

    Ghosh, Sujit K

    2010-01-01

    Bayesian methods are rapidly becoming popular tools for making statistical inference in various fields of science including biology, engineering, finance, and genetics. One of the key aspects of Bayesian inferential method is its logical foundation that provides a coherent framework to utilize not only empirical but also scientific information available to a researcher. Prior knowledge arising from scientific background, expert judgment, or previously collected data is used to build a prior distribution which is then combined with current data via the likelihood function to characterize the current state of knowledge using the so-called posterior distribution. Bayesian methods allow the use of models of complex physical phenomena that were previously too difficult to estimate (e.g., using asymptotic approximations). Bayesian methods offer a means of more fully understanding issues that are central to many practical problems by allowing researchers to build integrated models based on hierarchical conditional distributions that can be estimated even with limited amounts of data. Furthermore, advances in numerical integration methods, particularly those based on Monte Carlo methods, have made it possible to compute the optimal Bayes estimators. However, there is a reasonably wide gap between the background of the empirically trained scientists and the full weight of Bayesian statistical inference. Hence, one of the goals of this chapter is to bridge the gap by offering elementary to advanced concepts that emphasize linkages between standard approaches and full probability modeling via Bayesian methods.
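
    A minimal numerical illustration of the prior-to-posterior mechanics described above uses the conjugate beta-binomial model, where the posterior is available in closed form. The prior and data below are made up.

        from scipy.stats import beta

        # Prior knowledge about a success probability, encoded as Beta(2, 8)
        a0, b0 = 2, 8

        # New data: 7 successes in 20 trials
        successes, trials = 7, 20

        # Conjugacy: posterior is Beta(a0 + successes, b0 + failures)
        a_post, b_post = a0 + successes, b0 + (trials - successes)

        print("posterior mean:", a_post / (a_post + b_post))
        print("95% credible interval:", beta.ppf([0.025, 0.975], a_post, b_post))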

  5. Bayesian Compressed Sensing with Unknown Measurement Noise Level

    DEFF Research Database (Denmark)

    Hansen, Thomas Lundgaard; Jørgensen, Peter Bjørn; Pedersen, Niels Lovmand

    2013-01-01

    In sparse Bayesian learning (SBL) approximate Bayesian inference is applied to find sparse estimates from observations corrupted by additive noise. Current literature only vaguely considers the case where the noise level is unknown a priori. We show that for most state-of-the-art reconstruction...

  6. Bayesian Methods for Radiation Detection and Dosimetry

    CERN Document Server

    Groer, Peter G

    2002-01-01

    We performed work in three areas: radiation detection, external and internal radiation dosimetry. In radiation detection we developed Bayesian techniques to estimate the net activity of high and low activity radioactive samples. These techniques have the advantage that the remaining uncertainty about the net activity is described by probability densities. Graphs of the densities show the uncertainty in pictorial form. Figure 1 below demonstrates this point. We applied stochastic processes for a method to obtain Bayesian estimates of 222Rn-daughter products from observed counting rates. In external radiation dosimetry we studied and developed Bayesian methods to estimate radiation doses to an individual with radiation induced chromosome aberrations. We analyzed chromosome aberrations after exposure to gammas and neutrons and developed a method for dose-estimation after criticality accidents. The research in internal radiation dosimetry focused on parameter estimation for compartmental models from observed comp...

  7. Computationally efficient Bayesian inference for inverse problems.

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef M.; Najm, Habib N.; Rahn, Larry A.

    2007-10-01

    Bayesian statistics provides a foundation for inference from noisy and incomplete data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the inferred results. Inverse problems - representing indirect estimation of model parameters, inputs, or structural components - can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high-dimensional model spaces, as when the unknown is a spatiotemporal field. We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that dramatically accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also explore dimensionality reduction for the inference of spatiotemporal fields, using truncated spectral representations of Gaussian process priors. These new approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous material or transport properties. We also present a Bayesian framework for parameter estimation in stochastic models, where intrinsic stochasticity may be intermingled with observational noise. Evaluation of a likelihood function may not be analytically tractable in these cases, and thus several alternative Markov chain Monte Carlo (MCMC) schemes, operating on the product space of the observations and the parameters, are introduced.
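
    The surrogate-accelerated Bayesian approach described above can be caricatured with a toy random-walk Metropolis sampler in which a cheap stand-in forward model plays the role of the surrogate. This is only a sketch of the generic machinery under simplifying assumptions; the stochastic spectral (polynomial chaos) construction and the dimensionality reduction of the cited work are not reproduced, and surrogate_forward is a hypothetical placeholder.

```python
# Illustrative random-walk Metropolis sampler for a toy 1-D inverse problem.
# The "surrogate" here is just a cheap stand-in forward model; the cited work
# uses polynomial-chaos surrogates, which are not reproduced here.
import numpy as np

rng = np.random.default_rng(0)

def surrogate_forward(q):
    # Hypothetical cheap approximation of an expensive forward model.
    return 2.0 * q + 0.5 * q ** 2

# Synthetic observation with Gaussian noise.
q_true, sigma = 1.2, 0.1
d_obs = surrogate_forward(q_true) + rng.normal(0, sigma)

def log_posterior(q):
    log_prior = -0.5 * q ** 2                  # standard normal prior on q
    resid = d_obs - surrogate_forward(q)
    log_like = -0.5 * (resid / sigma) ** 2     # Gaussian likelihood
    return log_prior + log_like

samples, q = [], 0.0
for _ in range(5000):
    prop = q + rng.normal(0, 0.2)              # random-walk proposal
    if np.log(rng.uniform()) < log_posterior(prop) - log_posterior(q):
        q = prop
    samples.append(q)

print("posterior mean of q:", np.mean(samples[1000:]))
```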

  8. VizieR Online Data Catalog: CMB intensity map from WMAP and Planck PR2 data (Bobin+, 2016)

    Science.gov (United States)

    Bobin, J.; Sureau, F.; Starck, J.-L.

    2016-05-01

    This paper presents a novel estimation of the CMB map reconstructed from the Planck 2015 data (PR2) and the WMAP nine-year data (Bennett et al., 2013ApJS..208...20B), which updates the CMB map we published in (Bobin et al., 2014A&A...563A.105B). This new map is based on the sparse component separation method L-GMCA (Bobin et al., 2013A&A...550A..73B). Additionally, the map benefits from the latest advances in this field (Bobin et al., 2015, IEEE Transactions on Signal Processing, 63, 1199), which allows us to accurately discriminate between correlated components. In this update to our previous work, we show that this new map presents significant improvements with respect to the available CMB map estimates. (3 data files).

  9. Bayesian Exploratory Factor Analysis

    DEFF Research Database (Denmark)

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates from a high dimensional set of psychological measurements.

  10. Bayesian Exploratory Factor Analysis

    Science.gov (United States)

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates from a high dimensional set of psychological measurements. PMID:25431517

  11. A Bayesian Concept Learning Approach to Crowdsourcing

    DEFF Research Database (Denmark)

    Viappiani, Paolo Renato; Zilles, Sandra; Hamilton, Howard J.

    2011-01-01

    We develop a Bayesian approach to concept learning for crowdsourcing applications. A probabilistic belief over possible concept definitions is maintained and updated according to (noisy) observations from experts, whose behaviors are modeled using discrete types. We propose recommendation techniques, inference methods, and query selection strategies to assist a user charged with choosing a configuration that satisfies some (partially known) concept. Our model is able to simultaneously learn the concept definition and the types of the experts. We evaluate our model with simulations, showing that our Bayesian strategies are effective even in large concept spaces with many uninformative experts.

  12. Bayesian Model Checking for Multivariate Outcome Data.

    Science.gov (United States)

    Crespi, Catherine M; Boscardin, W John

    2009-09-01

    Bayesian models are increasingly used to analyze complex multivariate outcome data. However, diagnostics for such models have not been well-developed. We present a diagnostic method of evaluating the fit of Bayesian models for multivariate data based on posterior predictive model checking (PPMC), a technique in which observed data are compared to replicated data generated from model predictions. Most previous work on PPMC has focused on the use of test quantities that are scalar summaries of the data and parameters. However, scalar summaries are unlikely to capture the rich features of multivariate data. We introduce the use of dissimilarity measures for checking Bayesian models for multivariate outcome data. This method has the advantage of checking the fit of the model to the complete data vectors or vector summaries with reduced dimension, providing a comprehensive picture of model fit. An application with longitudinal binary data illustrates the methods.
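
    A schematic of posterior predictive checking with a dissimilarity measure is sketched below. The placeholder model, posterior draws, and Euclidean dissimilarity are chosen for brevity and are not the longitudinal binary model of the article.

```python
# Schematic posterior predictive check using a dissimilarity-based discrepancy
# (mean Euclidean distance of data vectors from their centroid). The model and
# posterior draws are placeholders, not the article's longitudinal binary model.
import numpy as np

rng = np.random.default_rng(1)
y_obs = rng.normal(size=(50, 3))                    # observed multivariate data
post_means = rng.normal(scale=0.1, size=(200, 3))   # hypothetical posterior draws

def dissimilarity(data):
    # Mean Euclidean distance of the data vectors from their centroid.
    return np.linalg.norm(data - data.mean(axis=0), axis=1).mean()

d_obs = dissimilarity(y_obs)
d_rep = []
for mu in post_means:
    y_rep = rng.normal(loc=mu, size=y_obs.shape)    # replicated data set
    d_rep.append(dissimilarity(y_rep))

# Posterior predictive p-value: extreme values flag model misfit.
ppp = np.mean(np.array(d_rep) >= d_obs)
print("posterior predictive p-value:", ppp)
```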

  13. Bayesian methods for hackers probabilistic programming and Bayesian inference

    CERN Document Server

    Davidson-Pilon, Cameron

    2016-01-01

    Bayesian methods of inference are deeply natural and extremely powerful. However, most discussions of Bayesian inference rely on intensely complex mathematical analyses and artificial examples, making the subject inaccessible to anyone without a strong mathematical background. Now, though, Cameron Davidson-Pilon introduces Bayesian inference from a computational perspective, bridging theory to practice and freeing you to get results using computing power. Bayesian Methods for Hackers illuminates Bayesian inference through probabilistic programming with the powerful PyMC language and the closely related Python tools NumPy, SciPy, and Matplotlib. Using this approach, you can reach effective solutions in small increments, without extensive mathematical intervention. Davidson-Pilon begins by introducing the concepts underlying Bayesian inference, comparing it with other techniques and guiding you through building and training your first Bayesian model. Next, he introduces PyMC through a series of detailed examples a...
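
    A minimal probabilistic-programming example in the spirit of the book is sketched below, assuming the current PyMC package name and API (pymc, version 4 or later); the book's own examples target earlier PyMC releases and are not reproduced here.

```python
# Minimal probabilistic-programming sketch, assuming the PyMC (v4+) API.
import numpy as np
import pymc as pm

data = np.array([1, 0, 1, 1, 0, 1, 1, 1, 0, 1])    # coin-flip style observations

with pm.Model():
    theta = pm.Beta("theta", alpha=1.0, beta=1.0)   # uniform prior on the rate
    pm.Bernoulli("obs", p=theta, observed=data)     # likelihood
    idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)

print("posterior mean of theta:", idata.posterior["theta"].mean().item())
```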

  14. Subjective Bayesian Beliefs

    DEFF Research Database (Denmark)

    Antoniou, Constantinos; Harrison, Glenn W.; Lau, Morten I.

    2015-01-01

    A large literature suggests that many individuals do not apply Bayes’ Rule when making decisions that depend on them correctly pooling prior information and sample data. We replicate and extend a classic experimental study of Bayesian updating from psychology, employing the methods of experimenta...

  15. Bayesian Dark Knowledge

    NARCIS (Netherlands)

    Korattikara, A.; Rathod, V.; Murphy, K.; Welling, M.

    2015-01-01

    We consider the problem of Bayesian parameter estimation for deep neural networks, which is important in problem settings where we may have little data, and/or where we need accurate posterior predictive densities p(y|x, D), e.g., for applications involving bandits or active learning. One simple

  16. Bayesian grid matching

    DEFF Research Database (Denmark)

    Hartelius, Karsten; Carstensen, Jens Michael

    2003-01-01

    A method for locating distorted grid structures in images is presented. The method is based on the theories of template matching and Bayesian image restoration. The grid is modeled as a deformable template. Prior knowledge of the grid is described through a Markov random field (MRF) model which r...... signals in hybridization filters and (2) localization of knit units in textile samples....

  17. Large-Scale Analysis of Relationships between Mineral Dust, Ice Cloud Properties, and Precipitation from Satellite Observations Using a Bayesian Approach: Theoretical Basis and First Results for the Tropical Atlantic Ocean

    Directory of Open Access Journals (Sweden)

    Lars Klüser

    2017-01-01

    Mineral dust and ice cloud observations from the Infrared Atmospheric Sounding Interferometer (IASI) are used to assess the relationships between desert dust aerosols and ice clouds over the tropical Atlantic Ocean during the 2008 hurricane season. Cloud property histograms are first adjusted for varying cloud top temperature or ice water path distributions with a Bayesian approach to account for meteorological constraints on the cloud variables. Then, histogram differences between dust load classes are used to describe the impact of dust load on cloud property statistics. The analysis of the histogram differences shows that ice crystal sizes are reduced with increasing aerosol load, while ice cloud optical depth and ice water path are increased. The distributions of all three variables broaden and become less skewed in dusty environments. For ice crystal size the significant bimodality is reduced and the order of the peaks is reversed. Moreover, the distributions of the ice cloud variables are not simply shifted linearly; their variance, skewness, and complexity are also significantly affected. This implies that the whole cloud variable distributions have to be considered for indirect aerosol effects in any application for climate modelling.

  18. Bayesian optimization for materials science

    CERN Document Server

    Packwood, Daniel

    2017-01-01

    This book provides a short and concise introduction to Bayesian optimization specifically for experimental and computational materials scientists. After explaining the basic idea behind Bayesian optimization and some applications to materials science in Chapter 1, the mathematical theory of Bayesian optimization is outlined in Chapter 2. Finally, Chapter 3 discusses an application of Bayesian optimization to a complicated structure optimization problem in computational surface science. Bayesian optimization is a promising global optimization technique that originates in the field of machine learning and is starting to gain attention in materials science. For the purpose of materials design, Bayesian optimization can be used to predict new materials with novel properties without extensive screening of candidate materials. For the purpose of computational materials science, Bayesian optimization can be incorporated into first-principles calculations to perform efficient, global structure optimizations. While re...

  19. GPS-ABC: Gaussian Process Surrogate Approximate Bayesian Computation

    NARCIS (Netherlands)

    Meeds, E.; Welling, M.; Zhang, N.; Tian, J.

    2014-01-01

    Scientists often express their understanding of the world through a computationally demanding simulation program. Analyzing the posterior distribution of the parameters given observations (the inverse problem) can be extremely challenging. The Approximate Bayesian Computation (ABC) framework is the

  20. A Fast Iterative Bayesian Inference Algorithm for Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand; Manchón, Carles Navarro; Fleury, Bernard Henri

    2013-01-01

    In this paper, we present a Bayesian channel estimation algorithm for multicarrier receivers based on pilot symbol observations. The inherent sparse nature of wireless multipath channels is exploited by modeling the prior distribution of multipath components' gains with a hierarchical...

  1. Bayesian community detection

    DEFF Research Database (Denmark)

    Mørup, Morten; Schmidt, Mikkel N

    2012-01-01

    Many networks of scientific interest naturally decompose into clusters or communities with comparatively fewer external than internal links; however, current Bayesian models of network communities do not exert this intuitive notion of communities. We formulate a nonparametric Bayesian model for community detection consistent with an intuitive definition of communities and present a Markov chain Monte Carlo procedure for inferring the community structure. A Matlab toolbox with the proposed inference procedure is available for download. On synthetic and real networks, our model detects communities consistent with ground truth, and on real networks, it outperforms existing approaches in predicting missing links. This suggests that community structure is an important structural property of networks that should be explicitly modeled.

  2. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  3. The humble Bayesian: model checking from a fully Bayesian perspective.

    Science.gov (United States)

    Morey, Richard D; Romeijn, Jan-Willem; Rouder, Jeffrey N

    2013-02-01

    Gelman and Shalizi (2012) criticize what they call the 'usual story' in Bayesian statistics: that the distribution over hypotheses or models is the sole means of statistical inference, thus excluding model checking and revision, and that inference is inductivist rather than deductivist. They present an alternative hypothetico-deductive approach to remedy both shortcomings. We agree with Gelman and Shalizi's criticism of the usual story, but disagree on whether Bayesian confirmation theory should be abandoned. We advocate a humble Bayesian approach, in which Bayesian confirmation theory is the central inferential method. A humble Bayesian checks her models and critically assesses whether the Bayesian statistical inferences can reasonably be called upon to support real-world inferences. © 2012 The British Psychological Society.

  4. Bayesian Hypothesis Testing

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, Stephen A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sigeti, David E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-11-15

    These are a set of slides about Bayesian hypothesis testing, where many hypotheses are tested. The conclusions are the following: The value of the Bayes factor obtained when using the median of the posterior marginal is almost the minimum value of the Bayes factor. The value of τ² which minimizes the Bayes factor is a reasonable choice for this parameter. This allows a likelihood ratio to be computed which is the least favorable to H0.
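
    As a generic illustration of the Bayes factor machinery (not the slides' specific τ² calculation), the sketch below computes the Bayes factor for a point null against a Beta(1, 1) alternative in a binomial model, where both marginal likelihoods have closed forms; the data are invented.

```python
# Generic Bayes factor illustration: point null H0: theta = 0.5 versus
# H1: theta ~ Beta(1, 1), for k successes in n binomial trials.
import numpy as np
from scipy.special import comb, betaln

k, n = 14, 20

# Marginal likelihood under H0 (theta fixed at 0.5).
log_m0 = np.log(comb(n, k)) + n * np.log(0.5)

# Marginal likelihood under H1: the binomial likelihood integrated against a
# Beta(1, 1) prior has a closed form via the beta function.
log_m1 = np.log(comb(n, k)) + betaln(k + 1, n - k + 1)

bayes_factor_01 = np.exp(log_m0 - log_m1)
print("BF_01 =", bayes_factor_01)
```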

  5. Introduction to Bayesian statistics

    CERN Document Server

    Koch, Karl-Rudolf

    2007-01-01

    This book presents Bayes' theorem, the estimation of unknown parameters, the determination of confidence regions and the derivation of tests of hypotheses for the unknown parameters. It does so in a simple manner that is easy to comprehend. The book compares traditional and Bayesian methods with the rules of probability presented in a logical way allowing an intuitive understanding of random variables and their probability distributions to be formed.

  6. WMAP7 and future CMB constraints on annihilating dark matter: implications for GeV-scale WIMPs

    Science.gov (United States)

    Hütsi, G.; Chluba, J.; Hektor, A.; Raidal, M.

    2011-11-01

    Aims: We calculate constraints from current and future cosmic microwave background (CMB) measurements on annihilating dark matter (DM) with masses below the electroweak scale: mDM = 5 - 100 GeV. In particular, we assume the S-wave annihilation mode to be dominant, and focus our attention on the lower end of this mass range, as DM particles with masses mDM ~ 10 GeV have recently been claimed to be consistent with the CoGeNT and DAMA/LIBRA results, while also providing viable DM candidates to explain the measurements of Fermi and WMAP haze. We study the model (in)dependence of the CMB power spectra on particle physics DM models, large-scale structure formation and cosmological uncertainties. We attempt to find a simple and practical recipe for estimating current and future CMB bounds on a broad class of DM annihilation models. Methods: We use a model-independent description for DM annihilation into a wide set of Standard Model particles simulated by PYTHIA Monte Carlo. Our Markov chain Monte Carlo calculations used for finding model constraints involve realistic CMB likelihoods and assume a standard 6-parameter ΛCDM background cosmological model, which is extended by two additional DM annihilation parameters: mDM and ⟨ σAυ ⟩ /mDM. Results: We show that in the studied DM mass range the CMB signal of DM annihilations is independent of the details of large-scale structure formation, distribution, and profile of DM halos and other cosmological uncertainties. All particle physics models of DM annihilation can be described with only one parameter, the fraction of energy carried away by neutrinos in DM annihilation. As the main result we provide a simple and rather generic fitting formula for calculating CMB constraints on the annihilation cross section of light WIMPs. We show that thermal relic DM in the CoGeNT, DAMA/LIBRA favored mass range is in a serious conflict with present CMB data for the annihilation channels with few neutrinos, and will definitely be tested

  7. Bayesian theory and applications

    CERN Document Server

    Dellaportas, Petros; Polson, Nicholas G; Stephens, David A

    2013-01-01

    The development of hierarchical models and Markov chain Monte Carlo (MCMC) techniques forms one of the most profound advances in Bayesian analysis since the 1970s and provides the basis for advances in virtually all areas of applied and theoretical Bayesian statistics. This volume guides the reader along a statistical journey that begins with the basic structure of Bayesian theory, and then provides details on most of the past and present advances in this field. The book has a unique format. There is an explanatory chapter devoted to each conceptual advance followed by journal-style chapters that provide applications or further advances on the concept. Thus, the volume is both a textbook and a compendium of papers covering a vast range of topics. It is appropriate for a well-informed novice interested in understanding the basic approach, methods and recent applications. Because of its advanced chapters and recent work, it is also appropriate for a more mature reader interested in recent applications and devel...

  8. Bayesian ARTMAP for regression.

    Science.gov (United States)

    Sasu, L M; Andonie, R

    2013-10-01

    Bayesian ARTMAP (BA) is a recently introduced neural architecture which uses a combination of Fuzzy ARTMAP competitive learning and Bayesian learning. Training is generally performed online, in a single epoch. During training, BA creates input data clusters as Gaussian categories, and also infers the conditional probabilities between input patterns and categories, and between categories and classes. During prediction, BA uses Bayesian posterior probability estimation. So far, BA has been used only for classification. The goal of this paper is to analyze the efficiency of BA for regression problems. Our contributions are: (i) we generalize the BA algorithm using the clustering functionality of both ART modules, and name it BA for Regression (BAR); (ii) we prove that BAR is a universal approximator with the best approximation property. In other words, BAR approximates arbitrarily well any continuous function (universal approximation) and, for every given continuous function, there is one in the set of BAR approximators situated at minimum distance (best approximation); (iii) we experimentally compare the online trained BAR with several neural models, on the following standard regression benchmarks: CPU Computer Hardware, Boston Housing, Wisconsin Breast Cancer, and Communities and Crime. Our results show that BAR is an appropriate tool for regression tasks, both for theoretical and practical reasons. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. A computational analysis of the neural bases of Bayesian inference.

    Science.gov (United States)

    Kolossa, Antonio; Kopp, Bruno; Fingscheidt, Tim

    2015-02-01

    Empirical support for the Bayesian brain hypothesis, although of major theoretical importance for cognitive neuroscience, is surprisingly scarce. This hypothesis posits simply that neural activities code and compute Bayesian probabilities. Here, we introduce an urn-ball paradigm to relate event-related potentials (ERPs) such as the P300 wave to Bayesian inference. Bayesian model comparison is conducted to compare various models in terms of their ability to explain trial-by-trial variation in ERP responses at different points in time and over different regions of the scalp. Specifically, we are interested in dissociating specific ERP responses in terms of Bayesian updating and predictive surprise. Bayesian updating refers to changes in probability distributions given new observations, while predictive surprise equals the surprise about observations under current probability distributions. Components of the late positive complex (P3a, P3b, Slow Wave) provide dissociable measures of Bayesian updating and predictive surprise. Specifically, the updating of beliefs about hidden states yields the best fit for the anteriorly distributed P3a, whereas the updating of predictions of observations accounts best for the posteriorly distributed Slow Wave. In addition, parietally distributed P3b responses are best fit by predictive surprise. These results indicate that the three components of the late positive complex reflect distinct neural computations. As such they are consistent with the Bayesian brain hypothesis, but these neural computations seem to be subject to nonlinear probability weighting. We integrate these findings with the free-energy principle that instantiates the Bayesian brain hypothesis. Copyright © 2014 Elsevier Inc. All rights reserved.
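
    The distinction drawn above between Bayesian updating and predictive surprise can be illustrated numerically for a single draw in a two-urn setting; the probabilities below are invented and are not the paper's stimuli, and Bayesian updating is measured here as the Kullback-Leibler divergence from prior to posterior beliefs.

```python
# Illustrative computation of predictive surprise and belief updating for a
# simple two-urn example (made-up numbers, not the paper's paradigm).
import numpy as np

prior = np.array([0.5, 0.5])     # belief over which urn is in use
like = np.array([0.8, 0.3])      # P(observe a red ball | urn)

# Predictive surprise: -log of the prior predictive probability of the draw.
pred_prob = np.sum(prior * like)
predictive_surprise = -np.log(pred_prob)

# Bayesian updating: KL divergence between posterior and prior beliefs.
posterior = prior * like / pred_prob
bayesian_update = np.sum(posterior * np.log(posterior / prior))

print("predictive surprise:", predictive_surprise)
print("belief update (KL divergence):", bayesian_update)
```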

  10. Non-linear Bayesian update of PCE coefficients

    KAUST Repository

    Litvinenko, Alexander

    2014-01-06

    Given: a physical system modeled by a PDE or ODE with an uncertain coefficient q(ω), and a measurement operator Y(u(q), q), where u(q, ω) is the uncertain solution. Aim: to identify q(ω). The mapping from parameters to observations is usually not invertible, hence this inverse identification problem is generally ill-posed. To identify q(ω) we derived a non-linear Bayesian update from the variational problem associated with conditional expectation. To reduce the cost of the Bayesian update we offer a functional approximation, e.g. a polynomial chaos expansion (PCE). New: we apply the Bayesian update to the PCE coefficients of the random coefficient q(ω) (not to the probability density function of q).

  11. Applied Bayesian modelling

    CERN Document Server

    Congdon, Peter

    2014-01-01

    This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBUGS...

  12. Bayesian nonparametric data analysis

    CERN Document Server

    Müller, Peter; Jara, Alejandro; Hanson, Tim

    2015-01-01

    This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.

  13. Bayesian Regression with Network Prior: Optimal Bayesian Filtering Perspective.

    Science.gov (United States)

    Qian, Xiaoning; Dougherty, Edward R

    2016-12-01

    The recently introduced intrinsically Bayesian robust filter (IBRF) provides fully optimal filtering relative to a prior distribution over an uncertainty class of joint random process models, whereas formerly the theory was limited to model-constrained Bayesian robust filters, for which optimization was limited to the filters that are optimal for models in the uncertainty class. This paper extends the IBRF theory to the situation where there are both a prior on the uncertainty class and sample data. The result is optimal Bayesian filtering (OBF), where optimality is relative to the posterior distribution derived from the prior and the data. The IBRF theories for effective characteristics and canonical expansions extend to the OBF setting. A salient focus of the present work is to demonstrate the advantages of Bayesian regression within the OBF setting over the classical Bayesian approach in the context of linear Gaussian models.

  14. Bayesian Evidence Framework for Decision Tree Learning

    Science.gov (United States)

    Chatpatanasiri, Ratthachat; Kijsirikul, Boonserm

    2005-11-01

    This work is primarily interested in the problem of selecting a single decision (or classification) tree given the observed data. Although a single decision tree has a high risk of being overfitted, the induced tree is easily interpreted. Researchers have invented various methods such as tree pruning or tree averaging for preventing the induced tree from overfitting (and from underfitting) the data. In this paper, instead of using those conventional approaches, we apply the Bayesian evidence framework of Gull, Skilling and MacKay to the process of selecting a decision tree. We derive a formal function to measure 'the fitness' of each decision tree given a set of observed data. Our method, in fact, is analogous to a well-known Bayesian model selection method for interpolating noisy continuous-value data. As in regression problems, given reasonable assumptions, this derived score function automatically quantifies the principle of Ockham's razor, and hence reasonably deals with the issue of the underfitting-overfitting tradeoff.

  15. Local Bayesian inversion: theoretical developments

    Science.gov (United States)

    Moraes, Fernando S.; Scales, John A.

    2000-06-01

    We derive a new Bayesian formulation for the discrete geophysical inverse problem that can significantly reduce the cost of the computations. The Bayesian approach focuses on obtaining a probability distribution (the posterior distribution), assimilating three kinds of information: physical theories (data modelling), observations (data measurements) and prior information on models. Once this goal is achieved, all inferences can be obtained from the posterior by computing statistics relative to individual parameters (e.g. marginal distributions), a daunting computational problem in high dimensions. Our formulation is developed from the working hypothesis that the local (subsurface) prior information on model parameters supersedes any additional information from other parts of the model. Based on this hypothesis, we propose an approximation that permits a reduction of the dimensionality involved in the calculations via marginalization of the probability distributions. The marginalization facilitates the tasks of incorporating diverse prior information and conducting inferences on individual parameters, because the final result is a collection of 1-D posterior distributions. Parameters are considered individually, one at a time. The approximation involves throwing away, at each step, cross-moment information of order higher than two, while preserving all marginal information about the parameter being estimated. The main advantage of the method is allowing for systematic integration of prior information while maintaining practical feasibility. This is achieved by combining (1) probability density estimation methods to derive marginal prior distributions from available local information, and (2) the use of multidimensional Gaussian distributions, which can be marginalized in closed form. Using a six-parameter problem, we illustrate how the proposed methodology works. In the example, the marginal prior distributions are derived from the application of the principle of

  16. Classification using Bayesian neural nets

    NARCIS (Netherlands)

    J.C. Bioch (Cor); O. van der Meer; R. Potharst (Rob)

    1995-01-01

    Recently, Bayesian methods have been proposed for neural networks to solve regression and classification problems. These methods claim to overcome some difficulties encountered in the standard approach such as overfitting. However, an implementation of the full Bayesian approach to

  17. Kernelized Bayesian Matrix Factorization.

    Science.gov (United States)

    Gönen, Mehmet; Kaski, Samuel

    2014-10-01

    We extend kernelized matrix factorization with a full-Bayesian treatment and with an ability to work with multiple side information sources expressed as different kernels. Kernels have been introduced to integrate side information about the rows and columns, which is necessary for making out-of-matrix predictions. We discuss specifically binary output matrices but extensions to real-valued matrices are straightforward. We extend the state of the art in two key aspects: (i) A full-conjugate probabilistic formulation of the kernelized matrix factorization enables an efficient variational approximation, whereas full-Bayesian treatments are not computationally feasible in the earlier approaches. (ii) Multiple side information sources are included, treated as different kernels in multiple kernel learning which additionally reveals which side sources are informative. We then show that the framework can also be used for supervised and semi-supervised multilabel classification and multi-output regression, by considering samples and outputs as the domains where matrix factorization operates. Our method outperforms alternatives in predicting drug-protein interactions on two data sets. On multilabel classification, our algorithm obtains the lowest Hamming losses on 10 out of 14 data sets compared to five state-of-the-art multilabel classification algorithms. We finally show that the proposed approach outperforms alternatives in multi-output regression experiments on a yeast cell cycle data set.

  18. Bayesian inference with ecological applications

    CERN Document Server

    Link, William A

    2009-01-01

    This text is written to provide a mathematically sound but accessible and engaging introduction to Bayesian inference specifically for environmental scientists, ecologists and wildlife biologists. It emphasizes the power and usefulness of Bayesian methods in an ecological context. The advent of fast personal computers and easily available software has simplified the use of Bayesian and hierarchical models. One obstacle remains for ecologists and wildlife biologists, namely the near absence of Bayesian texts written specifically for them. The book includes many relevant examples, is supported by software and examples on a companion website and will become an essential grounding in this approach for students and research ecologists. Engagingly written text specifically designed to demystify a complex subject; examples drawn from ecology and wildlife research; an essential grounding for graduate and research ecologists in the increasingly prevalent Bayesian approach to inference; companion website with analyt...

  19. Bayesian Inference on Gravitational Waves

    Directory of Open Access Journals (Sweden)

    Asad Ali

    2015-12-01

    The Bayesian approach is increasingly becoming popular among the astrophysics data analysis communities. However, the Pakistan statistics communities are unaware of this fertile interaction between the two disciplines. Bayesian methods have been in use to address astronomical problems since the very birth of the Bayes probability in the eighteenth century. Today the Bayesian methods for the detection and parameter estimation of gravitational waves have solid theoretical grounds with a strong promise for realistic applications. This article aims to introduce the Pakistan statistics communities to the applications of Bayesian Monte Carlo methods in the analysis of gravitational wave data, with an overview of the Bayesian signal detection and estimation methods and a demonstration by a couple of simplified examples.

  20. Foreground removal from WMAP 7 yr polarization maps using an MLP neural network

    DEFF Research Database (Denmark)

    Nørgaard-Nielsen, Hans Ulrik

    2012-01-01

    One of the fundamental problems in extracting the cosmic microwave background signal (CMB) from millimeter/submillimeter observations is the pollution by emission from the Milky Way: synchrotron, free-free, and thermal dust emission. To extract the fundamental cosmological parameters from CMB sig...

  1. How to practise Bayesian statistics outside the Bayesian church: What philosophy for Bayesian statistical modelling?

    NARCIS (Netherlands)

    Borsboom, D.; Haig, B.D.

    2013-01-01

    Unlike most other statistical frameworks, Bayesian statistical inference is wedded to a particular approach in the philosophy of science (see Howson & Urbach, 2006); this approach is called Bayesianism. Rather than being concerned with model fitting, this position in the philosophy of science

  2. Bayesian auxiliary particle filters for estimating neural tuning parameters.

    Science.gov (United States)

    Mountney, John; Sobel, Marc; Obeid, Iyad

    2009-01-01

    A common challenge in neural engineering is to track the dynamic parameters of neural tuning functions. This work introduces the application of Bayesian auxiliary particle filters for this purpose. Based on Monte-Carlo filtering, Bayesian auxiliary particle filters use adaptive methods to model the prior densities of the state parameters being tracked. The observations used are the neural firing times, modeled here as a Poisson process, and the biological driving signal. The Bayesian auxiliary particle filter was evaluated by simultaneously tracking the three parameters of a hippocampal place cell and compared to a stochastic state point process filter. It is shown that Bayesian auxiliary particle filters are substantially more accurate and robust than alternative methods of state parameter estimation. The effects of time-averaging on parameter estimation are also evaluated.
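
    The flavour of particle filtering for neural tuning parameters can be conveyed with a minimal bootstrap filter that tracks a single drifting log firing rate from simulated Poisson spike counts. This is a simplified sketch, not the Bayesian auxiliary particle filter of the paper: it omits the auxiliary variables, the adaptive prior modelling, and the multi-parameter place-cell model.

```python
# Minimal bootstrap particle filter (a simplification of the auxiliary particle
# filter discussed above) tracking one drifting log firing rate from Poisson
# spike counts. Illustrative only.
import numpy as np

rng = np.random.default_rng(2)
T, n_particles = 100, 500

# Simulate a slowly drifting log firing rate and Poisson counts per time bin.
true_log_rate = np.cumsum(rng.normal(0, 0.05, T)) + np.log(5.0)
counts = rng.poisson(np.exp(true_log_rate))

particles = np.full(n_particles, np.log(5.0))
estimates = []
for y in counts:
    particles = particles + rng.normal(0, 0.05, n_particles)   # propagate (random walk)
    rates = np.exp(particles)
    log_w = y * np.log(rates) - rates                          # Poisson log-likelihood (up to a constant)
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    idx = rng.choice(n_particles, size=n_particles, p=w)       # resample
    particles = particles[idx]
    estimates.append(particles.mean())

print("final estimated firing rate:", np.exp(estimates[-1]))
```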

  3. Bayesian Vision for Shape Recovery

    Science.gov (United States)

    Jalobeanu, Andre

    2004-01-01

    We present a new Bayesian vision technique that aims at recovering a shape from two or more noisy observations taken under similar lighting conditions. The shape is parametrized by a piecewise linear height field, textured by a piecewise linear irradiance field, and we assume Gaussian Markovian priors for both shape vertices and irradiance variables. The observation process, also known as rendering, is modeled by a non-affine projection (e.g. perspective projection) followed by a convolution with a piecewise linear point spread function, and contamination by additive Gaussian noise. We assume that the observation parameters are calibrated beforehand. The major novelty of the proposed method consists of marginalizing out the irradiances, which are treated as nuisance parameters; this is achieved by Laplace approximations. This reduces the inference to minimizing an energy that only depends on the shape vertices, and therefore allows an efficient Iterated Conditional Mode (ICM) optimization scheme to be implemented. A Gaussian approximation of the posterior shape density is computed, thus providing estimates of both the geometry and its uncertainty. We illustrate the effectiveness of the new method with shape reconstruction results in a 2D case. A 3D version is currently under development and aims at recovering a surface from multiple images, reconstructing the topography by marginalizing out both albedo and shading.

  4. Observation

    Science.gov (United States)

    Kripalani, Lakshmi A.

    2016-01-01

    The adult who is inexperienced in the art of observation may, even with the best intentions, react to a child's behavior in a way that hinders instead of helping the child's development. Kripalani outlines the need for training and practice in observation in order to "understand the needs of the children and...to understand how to remove…

  5. Beliefs and Bayesian reasoning.

    Science.gov (United States)

    Cohen, Andrew L; Sidlowski, Sara; Staub, Adrian

    2017-06-01

    We examine whether judgments of posterior probabilities in Bayesian reasoning problems are affected by reasoners' beliefs about corresponding real-world probabilities. In an internet-based task, participants were asked to determine the probability that a hypothesis is true (posterior probability, e.g., a person has a disease, given a positive medical test) based on relevant probabilities (e.g., that any person has the disease and the true and false positive rates of the test). We varied whether the correct posterior probability was close to, or far from, independent intuitive estimates of the corresponding 'real-world' probability. Responses were substantially closer to the correct posterior when this value was close to the intuitive estimate. A model in which the response is a weighted sum of the intuitive estimate and an additive combination of the probabilities provides an excellent account of the results.
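
    A worked example of the kind of posterior-probability problem described above is given below; the base rate and test characteristics are hypothetical.

```python
# Worked example of the disease/test posterior-probability problem
# (hypothetical numbers, not the study's stimuli).
base_rate = 0.01        # P(disease)
sensitivity = 0.90      # P(positive | disease)
false_pos = 0.05        # P(positive | no disease)

p_positive = sensitivity * base_rate + false_pos * (1 - base_rate)
posterior = sensitivity * base_rate / p_positive
print("P(disease | positive test) =", round(posterior, 3))   # about 0.154
```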

  6. Bayesian Geostatistical Design

    DEFF Research Database (Denmark)

    Diggle, Peter; Lophaven, Søren Nymand

    2006-01-01

    This paper describes the use of model-based geostatistics for choosing the set of sampling locations, collectively called the design, to be used in a geostatistical analysis. Two types of design situation are considered. These are retrospective design, which concerns the addition of sampling locations to, or deletion of locations from, an existing design, and prospective design, which consists of choosing positions for a new set of sampling locations. We propose a Bayesian design criterion which focuses on the goal of efficient spatial prediction whilst allowing for the fact that model parameter values are unknown. The results show that in this situation a wide range of interpoint distances should be included in the design, and the widely used regular design is often not the best choice.

  7. Bayesian Methods for Radiation Detection and Dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Peter G. Groer

    2002-09-29

    We performed work in three areas: radiation detection, external and internal radiation dosimetry. In radiation detection we developed Bayesian techniques to estimate the net activity of high and low activity radioactive samples. These techniques have the advantage that the remaining uncertainty about the net activity is described by probability densities. Graphs of the densities show the uncertainty in pictorial form. Figure 1 below demonstrates this point. We applied stochastic processes for a method to obtain Bayesian estimates of 222Rn-daughter products from observed counting rates. In external radiation dosimetry we studied and developed Bayesian methods to estimate radiation doses to an individual with radiation induced chromosome aberrations. We analyzed chromosome aberrations after exposure to gammas and neutrons and developed a method for dose-estimation after criticality accidents. The research in internal radiation dosimetry focused on parameter estimation for compartmental models from observed compartmental activities. From the estimated probability densities of the model parameters we were able to derive the densities for compartmental activities for a two compartment catenary model at different times. We also calculated the average activities and their standard deviation for a simple two compartment model.

  8. Book review: Bayesian analysis for population ecology

    Science.gov (United States)

    Link, William A.

    2011-01-01

    Brian Dennis described the field of ecology as “fertile, uncolonized ground for Bayesian ideas.” He continued: “The Bayesian propagule has arrived at the shore. Ecologists need to think long and hard about the consequences of a Bayesian ecology. The Bayesian outlook is a successful competitor, but is it a weed? I think so.” (Dennis 2004)

  9. Non Bayesian Conditioning and Deconditioning

    OpenAIRE

    Jean Dezert; Florentin Smarandache

    2010-01-01

    In this paper, we present a Non-Bayesian conditioning rule for belief revision. This rule is truly Non-Bayesian in the sense that it doesn't satisfy the commonly adopted principle that when a prior belief is Bayesian, after conditioning by X, Bel(X|X) must be equal to one. Our new conditioning rule for belief revision is based on the proportional conflict redistribution rule of combination developed in DSmT (Dezert-Smarandache Theory) which abandons Bayes' conditioning...

  10. Bayesian adaptive methods for clinical trials

    CERN Document Server

    Berry, Scott M; Muller, Peter

    2010-01-01

    Already popular in the analysis of medical device trials, adaptive Bayesian designs are increasingly being used in drug development for a wide variety of diseases and conditions, from Alzheimer's disease and multiple sclerosis to obesity, diabetes, hepatitis C, and HIV. Written by leading pioneers of Bayesian clinical trial designs, Bayesian Adaptive Methods for Clinical Trials explores the growing role of Bayesian thinking in the rapidly changing world of clinical trial analysis. The book first summarizes the current state of clinical trial design and analysis and introduces the main ideas and potential benefits of a Bayesian alternative. It then gives an overview of basic Bayesian methodological and computational tools needed for Bayesian clinical trials. With a focus on Bayesian designs that achieve good power and Type I error, the next chapters present Bayesian tools useful in early (Phase I) and middle (Phase II) clinical trials as well as two recent Bayesian adaptive Phase II studies: the BATTLE and ISP...

  11. Current trends in Bayesian methodology with applications

    CERN Document Server

    Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia

    2015-01-01

    Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics. Each chapter is self-contained and focuses on

  12. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

     Probabilistic networks, also known as Bayesian networks and influence diagrams, have become one of the most promising technologies in the area of applied artificial intelligence, offering intuitive, efficient, and reliable methods for diagnosis, prediction, decision making, classification...

  13. UNIFORMLY MOST POWERFUL BAYESIAN TESTS

    Science.gov (United States)

    Johnson, Valen E.

    2014-01-01

    Uniformly most powerful tests are statistical hypothesis tests that provide the greatest power against a fixed null hypothesis among all tests of a given size. In this article, the notion of uniformly most powerful tests is extended to the Bayesian setting by defining uniformly most powerful Bayesian tests to be tests that maximize the probability that the Bayes factor, in favor of the alternative hypothesis, exceeds a specified threshold. Like their classical counterpart, uniformly most powerful Bayesian tests are most easily defined in one-parameter exponential family models, although extensions outside of this class are possible. The connection between uniformly most powerful tests and uniformly most powerful Bayesian tests can be used to provide an approximate calibration between p-values and Bayes factors. Finally, issues regarding the strong dependence of resulting Bayes factors and p-values on sample size are discussed. PMID:24659829

  14. Bayesian Tracking of Visual Objects

    Science.gov (United States)

    Zheng, Nanning; Xue, Jianru

    Tracking objects in image sequences involves performing motion analysis at the object level, which is becoming an increasingly important technology in a wide range of computer video applications, including video teleconferencing, security and surveillance, video segmentation, and editing. In this chapter, we focus on sequential Bayesian estimation techniques for visual tracking. We first introduce the sequential Bayesian estimation framework, which acts as the theoretic basis for visual tracking. Then, we present approaches to constructing representation models for specific objects.

  15. Posterior Predictive Bayesian Phylogenetic Model Selection

    Science.gov (United States)

    Lewis, Paul O.; Xie, Wangang; Chen, Ming-Hui; Fan, Yu; Kuo, Lynn

    2014-01-01

    We present two distinctly different posterior predictive approaches to Bayesian phylogenetic model selection and illustrate these methods using examples from green algal protein-coding cpDNA sequences and flowering plant rDNA sequences. The Gelfand–Ghosh (GG) approach allows dissection of an overall measure of model fit into components due to posterior predictive variance (GGp) and goodness-of-fit (GGg), which distinguishes this method from the posterior predictive P-value approach. The conditional predictive ordinate (CPO) method provides a site-specific measure of model fit useful for exploratory analyses and can be combined over sites yielding the log pseudomarginal likelihood (LPML) which is useful as an overall measure of model fit. CPO provides a useful cross-validation approach that is computationally efficient, requiring only a sample from the posterior distribution (no additional simulation is required). Both GG and CPO add new perspectives to Bayesian phylogenetic model selection based on the predictive abilities of models and complement the perspective provided by the marginal likelihood (including Bayes Factor comparisons) based solely on the fit of competing models to observed data. [Bayesian; conditional predictive ordinate; CPO; L-measure; LPML; model selection; phylogenetics; posterior predictive.] PMID:24193892

  16. Flood quantile estimation at ungauged sites by Bayesian networks

    Science.gov (United States)

    Mediero, L.; Santillán, D.; Garrote, L.

    2012-04-01

    Estimating flood quantiles at a site for which no observed measurements are available is essential for water resources planning and management. Ungauged sites have no observations about the magnitude of floods, but some site and basin characteristics are known. The most common technique used is multiple regression analysis, which relates physical and climatic basin characteristics to flood quantiles. Regression equations are fitted from flood frequency data and basin characteristics at gauged sites. Regression equations are a rigid technique that assumes linear relationships between variables and cannot take the measurement errors into account. In addition, the prediction intervals are estimated in a very simplistic way from the variance of the residuals in the estimated model. Bayesian networks are a probabilistic computational structure taken from the field of Artificial Intelligence, which have been widely and successfully applied to many scientific fields like medicine and informatics, but application to the field of hydrology is recent. Bayesian networks infer the joint probability distribution of several related variables from observations through nodes, which represent random variables, and links, which represent causal dependencies between them. Bayesian networks are more flexible than regression equations, as they capture non-linear relationships between variables. In addition, the probabilistic nature of Bayesian networks allows taking the different sources of estimation uncertainty into account, as they give a probability distribution as the result. A homogeneous region in the Tagus Basin was selected as case study. A regression equation was fitted taking the basin area, the annual maximum 24-hour rainfall for a given recurrence interval and the mean height as explanatory variables. Flood quantiles at ungauged sites were estimated by Bayesian networks. Bayesian networks need to be learnt from a sufficiently large data set. As observational data are reduced, a

  17. Bayesian Adaptive Lasso for Ordinal Regression with Latent Variables

    Science.gov (United States)

    Feng, Xiang-Nan; Wu, Hao-Tian; Song, Xin-Yuan

    2017-01-01

    We consider an ordinal regression model with latent variables to investigate the effects of observable and latent explanatory variables on the ordinal responses of interest. Each latent variable is characterized by correlated observed variables through a confirmatory factor analysis model. We develop a Bayesian adaptive lasso procedure to conduct…

  18. Improved Bayesian multimodeling: Integration of copulas and Bayesian model averaging

    Science.gov (United States)

    Madadgar, Shahrbanou; Moradkhani, Hamid

    2014-12-01

    Bayesian model averaging (BMA) is a popular approach to combine hydrologic forecasts from individual models and characterize the uncertainty induced by model structure. In the original form of BMA, the conditional probability density function (PDF) of each model is assumed to be a particular probability distribution (e.g., Gaussian, gamma, etc.). If the predictions of any hydrologic model do not follow certain distribution, a data transformation procedure is required prior to model averaging. Moreover, it is strongly recommended to apply BMA on unbiased forecasts, whereas it is sometimes difficult to effectively remove bias from the predictions of complex hydrologic models. To overcome these limitations, we develop an approach to integrate a group of multivariate functions, the so-called copula functions, into BMA. Here we introduce a copula-embedded BMA (Cop-BMA) method that relaxes any assumption on the shape of conditional PDFs. Copula functions have a flexible structure and do not restrict the shape of posterior distributions. Furthermore, copulas are effective tools in removing bias from hydrologic forecasts. To compare the performance of BMA with Cop-BMA, they are applied to hydrologic forecasts from different rainfall-runoff and land-surface models. We consider the streamflow observation and simulations for 10 river basins provided by the Model Parameter Estimation Experiment (MOPEX) project. Results demonstrate that the predictive distributions are more accurate and reliable, less biased, and more confident with small uncertainty after Cop-BMA application. It is also shown that the postprocessed forecasts have better correlation with observation after Cop-BMA application.
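
    The plain BMA combination that Cop-BMA generalizes can be sketched as a weighted mixture of per-model conditional PDFs. The forecasts, spreads, and weights below are hypothetical (in practice the weights and spreads are fitted, e.g., by EM on a training period), and the copula embedding itself is not reproduced.

```python
# Sketch of a plain BMA predictive density: a weighted mixture of Gaussian
# conditional PDFs from individual models (hypothetical numbers throughout).
import numpy as np
from scipy import stats

forecasts = np.array([120.0, 150.0])   # two models' streamflow forecasts (hypothetical)
sigmas = np.array([15.0, 25.0])        # their predictive spreads (hypothetical)
weights = np.array([0.6, 0.4])         # BMA weights (normally fitted, e.g., by EM)

def bma_pdf(y):
    # Weighted mixture of the individual models' conditional PDFs.
    return np.sum(weights * stats.norm.pdf(y, loc=forecasts, scale=sigmas))

y_grid = np.linspace(50, 250, 5)
print([round(bma_pdf(y), 5) for y in y_grid])
```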

  19. Dynamic Bayesian Network Modeling of Game Based Diagnostic Assessments. CRESST Report 837

    Science.gov (United States)

    Levy, Roy

    2014-01-01

    Digital games offer an appealing environment for assessing student proficiencies, including skills and misconceptions in a diagnostic setting. This paper proposes a dynamic Bayesian network modeling approach for observations of student performance from an educational video game. A Bayesian approach to model construction, calibration, and use in…

  20. Production uncertainties modelling by Bayesian inference using Gibbs sampling

    Directory of Open Access Journals (Sweden)

    Azizi, Amir

    2015-11-01

    Analysis by modelling production throughput is an efficient way to provide information for production decision-making. Observation and investigation based on a real-life tile production line revealed that the five main uncertain variables are demand rate, breakdown time, scrap rate, setup time, and lead time. The volatile nature of these random variables was observed over a specific period of 104 weeks. The processes were sequential and multi-stage. These five uncertain variables of production were modelled to reflect the performance of overall production by applying Bayesian inference using Gibbs sampling. The application of Bayesian inference for handling production uncertainties showed a robust model with 2.5 per cent mean absolute percentage error. It is recommended to consider the five main uncertain variables that are introduced in this study for production decision-making. The study proposes the use of Bayesian inference for superior accuracy in production decision-making.
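
    The Gibbs-sampling machinery referred to above can be illustrated with a generic two-block sampler for a normal model with conjugate priors. This is not the study's five-variable production model; the data are simulated stand-ins and the priors are deliberately vague.

```python
# Generic two-block Gibbs sampler: normal mean and precision with conjugate
# priors. Illustrative of the machinery only, not the study's production model.
import numpy as np

rng = np.random.default_rng(3)
y = rng.normal(10.0, 2.0, size=104)        # stand-in for 104 weeks of data
n, ybar = len(y), y.mean()

mu0, kappa0 = 0.0, 1e-6                    # vague normal prior on the mean
a0, b0 = 1e-3, 1e-3                        # vague gamma prior on the precision

mu, tau = ybar, 1.0
draws = []
for _ in range(2000):
    # Sample mu | tau, y from its normal full conditional.
    prec = kappa0 + n * tau
    mean = (kappa0 * mu0 + n * tau * ybar) / prec
    mu = rng.normal(mean, 1.0 / np.sqrt(prec))
    # Sample tau | mu, y from its gamma full conditional (scale = 1 / rate).
    tau = rng.gamma(a0 + n / 2.0, 1.0 / (b0 + 0.5 * np.sum((y - mu) ** 2)))
    draws.append((mu, 1.0 / np.sqrt(tau)))

mus, sds = np.array(draws[500:]).T          # discard burn-in
print("posterior mean of mu:", mus.mean(), " posterior mean of sd:", sds.mean())
```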

  1. Bayesian approach to avoiding track seduction

    Science.gov (United States)

    Salmond, David J.; Everett, Nicholas O.

    2002-08-01

    The problem of maintaining track on a primary target in the presence of spurious objects is addressed. Recursive and batch filtering approaches are developed. For the recursive approach, a Bayesian track splitting filter is derived which spawns candidate tracks if there is a possibility of measurement misassociation. The filter evaluates the probability of each candidate track being associated with the primary target. The batch filter is a Markov-chain Monte Carlo (MCMC) algorithm which fits the observed data sequence to models of target dynamics and measurement-track association. Simulation results are presented.

  2. Bayesian analysis of CCDM models

    Science.gov (United States)

    Jesus, J. F.; Valentim, R.; Andrade-Oliveira, F.

    2017-09-01

    Creation of Cold Dark Matter (CCDM), in the context of the Einstein field equations, produces a negative pressure term which can be used to explain the accelerated expansion of the Universe. In this work we tested six different spatially flat models for matter creation using statistical criteria, in light of SNe Ia data: the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC) and the Bayesian Evidence (BE). These criteria allow models to be compared on both goodness of fit and number of free parameters, penalizing excess complexity. We find that the JO model is slightly favoured over the LJO/ΛCDM model; however, neither of these, nor the Γ = 3αH0 model, can be discarded by the current analysis. Three other scenarios are discarded either because of poor fitting or because of an excess of free parameters. A method of increasing the Bayesian evidence through reparameterization, in order to reduce parameter degeneracy, is also developed.
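    For readers unfamiliar with the criteria named above, the hedged sketch below computes AIC and BIC from a maximized log-likelihood; the log-likelihood values, parameter counts and sample size are invented, and the Bayesian evidence (which requires integrating over the parameters) is not shown.

        # Information-criterion comparison sketch; all numbers are invented for illustration.
        import numpy as np

        def aic(log_like_max, k):
            """Akaike Information Criterion: lower is better."""
            return 2.0 * k - 2.0 * log_like_max

        def bic(log_like_max, k, n):
            """Bayesian Information Criterion: penalizes parameters more strongly for large n."""
            return k * np.log(n) - 2.0 * log_like_max

        n = 580   # hypothetical number of SNe Ia data points
        models = {"model_A": (-512.3, 2), "model_B": (-511.8, 3), "model_C": (-511.7, 5)}
        for name, (logL, k) in models.items():
            print(f"{name}: AIC={aic(logL, k):.1f}  BIC={bic(logL, k, n):.1f}")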

  3. Bayesian modeling using WinBUGS

    CERN Document Server

    Ntzoufras, Ioannis

    2009-01-01

    A hands-on introduction to the principles of Bayesian modeling using WinBUGS. Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author provides an accessible treatment of the topic, offering readers a smooth introduction to the principles of Bayesian modeling with detailed guidance on the practical implementation of key principles. The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including: Markov chain Monte Carlo algorithms in Bayesian inference; generalized linear models; Bayesian hierarchical models; predictive distribution and model checking; and Bayesian model and variable evaluation. Computational notes and screen captures illustrate the use of both WinBUGS and R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts and all ...

  4. Inference in hybrid Bayesian networks

    DEFF Research Database (Denmark)

    Lanseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael

    2009-01-01

    Since the 1980s, Bayesian Networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability techniques (like fault trees and reliability block diagrams). However, limitations in the BNs' calculation engine have prevented BNs from becoming equally popular for domains containing mixtures of both discrete and continuous variables (so-called hybrid domains). In this paper we focus on these difficulties, and summarize some of the last decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability.

  5. Perception, illusions and Bayesian inference.

    Science.gov (United States)

    Nour, Matthew M; Nour, Joseph M

    2015-01-01

    Descriptive psychopathology makes a distinction between veridical perception and illusory perception. In both cases a perception is tied to a sensory stimulus, but in illusions the perception is of a false object. This article re-examines this distinction in light of new work in theoretical and computational neurobiology, which views all perception as a form of Bayesian statistical inference that combines sensory signals with prior expectations. Bayesian perceptual inference can solve the 'inverse optics' problem of veridical perception and provides a biologically plausible account of a number of illusory phenomena, suggesting that veridical and illusory perceptions are generated by precisely the same inferential mechanisms. © 2015 S. Karger AG, Basel.
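    A worked numerical sketch of the prior-likelihood combination underlying Bayesian perceptual inference, under the common Gaussian assumptions; all numbers are illustrative and not taken from the article.

        # Precision-weighted combination of a Gaussian prior with a Gaussian sensory likelihood,
        # the textbook form of Bayesian perceptual inference; all numbers are illustrative.
        prior_mean, prior_sd = 0.0, 2.0    # prior expectation about a stimulus attribute
        sense_mean, sense_sd = 1.5, 1.0    # noisy sensory measurement and its noise level

        prior_prec = 1.0 / prior_sd**2
        sense_prec = 1.0 / sense_sd**2

        post_prec = prior_prec + sense_prec
        post_mean = (prior_prec * prior_mean + sense_prec * sense_mean) / post_prec
        post_sd = post_prec ** -0.5

        print(f"Posterior percept: mean={post_mean:.3f}, sd={post_sd:.3f}")
        # With a flatter prior (larger prior_sd) the percept moves toward the sensory evidence.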

  6. 3D Bayesian contextual classifiers

    DEFF Research Database (Denmark)

    Larsen, Rasmus

    2000-01-01

    We extend a series of multivariate Bayesian 2-D contextual classifiers to 3-D by specifying a simultaneous Gaussian distribution for the feature vectors as well as a prior distribution of the class variables of a pixel and its 6 nearest 3-D neighbours.

  7. The humble Bayesian : Model checking from a fully Bayesian perspective

    NARCIS (Netherlands)

    Morey, Richard D.; Romeijn, Jan-Willem; Rouder, Jeffrey N.

    Gelman and Shalizi (2012) criticize what they call the usual story in Bayesian statistics: that the distribution over hypotheses or models is the sole means of statistical inference, thus excluding model checking and revision, and that inference is inductivist rather than deductivist. They present

  8. Bayesian networks and food security - An introduction

    NARCIS (Netherlands)

    Stein, A.

    2004-01-01

    This paper gives an introduction to Bayesian networks. Networks are defined and put into a Bayesian context. Directed acyclical graphs play a crucial role here. Two simple examples from food security are addressed. Possible uses of Bayesian networks for implementation and further use in decision

  9. Impact assessment of extreme storm events using a Bayesian network

    Science.gov (United States)

    den Heijer, C.(Kees); Knipping, Dirk T.J.A.; Plant, Nathaniel G.; van Thiel de Vries, Jaap S. M.; Baart, Fedor; van Gelder, Pieter H. A. J. M.

    2012-01-01

    This paper describes an investigation on the usefulness of Bayesian Networks in the safety assessment of dune coasts. A network has been created that predicts the erosion volume based on hydraulic boundary conditions and a number of cross-shore profile indicators. Field measurement data along a large part of the Dutch coast has been used to train the network. Corresponding storm impact on the dunes was calculated with an empirical dune erosion model named duros+. Comparison between the Bayesian Network predictions and the original duros+ results, here considered as observations, results in a skill up to 0.88, provided that the training data covers the range of predictions. Hence, the predictions from a deterministic model (duros+) can be captured in a probabilistic model (Bayesian Network) such that both the process knowledge and uncertainties can be included in impact and vulnerability assessments.

  10. Bayesian Recovery of the Initial Condition for the Heat Equation

    NARCIS (Netherlands)

    Knapik, B.T.; van der Vaart, A.W.; van Zanten, J.H.

    2013-01-01

    We study a Bayesian approach to recovering the initial condition for the heat equation from noisy observations of the solution at a later time. We consider a class of prior distributions indexed by a parameter quantifying "smoothness" and show that the corresponding posterior distributions contract

  11. On Bayesian Modelling of Fat Tails and Skewness

    NARCIS (Netherlands)

    Fernández, C.; Steel, M.F.J.

    1996-01-01

    We consider a Bayesian analysis of linear regression models that can account for skewed error distributions with fat tails. The latter two features are often observed characteristics of empirical data sets, and we will formally incorporate them in the inferential process. A general procedure for

  12. A new prior for bayesian anomaly detection: application to biosurveillance.

    Science.gov (United States)

    Shen, Y; Cooper, G F

    2010-01-01

    control chart method. The time complexity of the Bayesian algorithm is linear in the number of the observed events being monitored, due to a novel, closed-form derivation that is introduced in the paper. This paper introduces a novel prior probability for Bayesian outbreak detection that is expressive, easy-to-apply, computationally efficient, and performs as well or better than a standard frequentist method.

  13. Polynomial Chaos Surrogates for Bayesian Inference

    KAUST Repository

    Le Maitre, Olivier

    2016-01-06

    Bayesian inference is a popular probabilistic method to solve inverse problems, such as the identification of a field parameter in a PDE model. The inference relies on Bayes' rule to update the prior density of the sought field from observations and derive its posterior distribution. In most cases the posterior distribution has no explicit form and has to be sampled, for instance using a Markov-chain Monte Carlo method. In practice the prior field parameter is decomposed and truncated (e.g., by means of a Karhunen-Loève decomposition) to recast the inference problem into the inference of a finite number of coordinates. Although proved effective in many situations, the Bayesian inference as sketched above faces several difficulties requiring improvements. First, sampling the posterior can be an extremely costly task, as it requires multiple resolutions of the PDE model for different values of the field parameter. Second, when the observations are not very informative, the inferred parameter field can depend strongly on its prior, which can be somewhat arbitrary. These issues have motivated the introduction of reduced modeling or surrogates for the (approximate) determination of the parametrized PDE solution and of hyperparameters in the description of the prior field. Our contribution focuses on recent developments in these two directions: the acceleration of the posterior sampling by means of Polynomial Chaos expansions and the efficient treatment of parametrized covariance functions for the prior field. We also discuss the possibility of making such an approach adaptive to further improve its efficiency.
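    As a hedged, one-dimensional stand-in for the surrogate-accelerated sampling described above, the sketch below replaces an "expensive" forward model with a least-squares polynomial fit over the prior range (a simple substitute for an orthogonal polynomial chaos basis) and runs a random-walk Metropolis sampler on the surrogate posterior; the forward model, prior and noise level are invented.

        # Surrogate-accelerated Bayesian inference sketch for a single scalar parameter.
        import numpy as np

        rng = np.random.default_rng(1)

        def forward_model(q):
            """Stand-in for an expensive PDE solve mapping parameter q to an observable."""
            return np.exp(0.5 * q) + 0.1 * q

        # 1) Fit a polynomial surrogate on a small design over the prior range q ~ U(-2, 2)
        q_design = np.linspace(-2.0, 2.0, 15)
        coeffs = np.polynomial.polynomial.polyfit(q_design, forward_model(q_design), deg=6)
        surrogate = lambda q: np.polynomial.polynomial.polyval(q, coeffs)

        # 2) Synthetic observation with Gaussian noise
        q_true, noise_sd = 0.8, 0.05
        y_obs = forward_model(q_true) + rng.normal(0.0, noise_sd)

        def log_post(q):
            """Log posterior under a uniform prior on [-2, 2], evaluated with the surrogate."""
            if not -2.0 <= q <= 2.0:
                return -np.inf
            return -0.5 * ((y_obs - surrogate(q)) / noise_sd) ** 2

        # 3) Random-walk Metropolis on the cheap surrogate posterior
        q, lp, chain = 0.0, log_post(0.0), []
        for _ in range(20000):
            q_prop = q + rng.normal(0.0, 0.2)
            lp_prop = log_post(q_prop)
            if np.log(rng.uniform()) < lp_prop - lp:
                q, lp = q_prop, lp_prop
            chain.append(q)

        print("surrogate posterior mean of q:", np.mean(chain[5000:]))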

  14. Differentiated Bayesian Conjoint Choice Designs

    NARCIS (Netherlands)

    Z. Sándor (Zsolt); M. Wedel (Michel)

    2003-01-01

    textabstractPrevious conjoint choice design construction procedures have produced a single design that is administered to all subjects. This paper proposes to construct a limited set of different designs. The designs are constructed in a Bayesian fashion, taking into account prior uncertainty about

  15. Bayesian Classification of Image Structures

    DEFF Research Database (Denmark)

    Goswami, Dibyendu; Kalkan, Sinan; Krüger, Norbert

    2009-01-01

    In this paper, we describe work on Bayesian classifiers for distinguishing between homogeneous structures, textures, edges and junctions. We build semi-local classifiers from hand-labeled images to distinguish between these four different kinds of structures based on the concept of intrinsic...

  16. Bayesian image restoration, using configurations

    DEFF Research Database (Denmark)

    Thorarinsdottir, Thordis

    configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for salt and pepper noise. The inference in the model is discussed...

  17. Bayesian image restoration, using configurations

    DEFF Research Database (Denmark)

    Thorarinsdottir, Thordis Linda

    2006-01-01

    configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for the salt and pepper noise. The inference in the model is discussed...

  18. 3-D contextual Bayesian classifiers

    DEFF Research Database (Denmark)

    Larsen, Rasmus

    In this paper we will consider extensions of a series of Bayesian 2-D contextual classification procedures proposed by Owen (1984), Hjort & Mohn (1984), Welch & Salter (1971) and Haslett (1985) to 3 spatial dimensions. It is evident that compared to classical pixelwise classification further...

  19. Approximation for Bayesian Ability Estimation.

    Science.gov (United States)

    1987-02-18


  20. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

    Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis, Second Edition, provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. This new edition contains six new...

  1. Bayesian depth estimation from monocular natural images.

    Science.gov (United States)

    Su, Che-Chun; Cormack, Lawrence K; Bovik, Alan C

    2017-05-01

    Estimating an accurate and naturalistic dense depth map from a single monocular photographic image is a difficult problem. Nevertheless, human observers have little difficulty understanding the depth structure implied by photographs. Two-dimensional (2D) images of the real-world environment contain significant statistical information regarding the three-dimensional (3D) structure of the world that the vision system likely exploits to compute perceived depth, monocularly as well as binocularly. Toward understanding how this might be accomplished, we propose a Bayesian model of monocular depth computation that recovers detailed 3D scene structures by extracting reliable, robust, depth-sensitive statistical features from single natural images. These features are derived using well-accepted univariate natural scene statistics (NSS) models and recent bivariate/correlation NSS models that describe the relationships between 2D photographic images and their associated depth maps. This is accomplished by building a dictionary of canonical local depth patterns from which NSS features are extracted as prior information. The dictionary is used to create a multivariate Gaussian mixture (MGM) likelihood model that associates local image features with depth patterns. A simple Bayesian predictor is then used to form spatial depth estimates. The depth results produced by the model, despite its simplicity, correlate well with ground-truth depths measured by a current-generation terrestrial light detection and ranging (LIDAR) scanner. Such a strong form of statistical depth information could be used by the visual system when creating overall estimated depth maps incorporating stereopsis, accommodation, and other conditions. Indeed, even in isolation, the Bayesian predictor delivers depth estimates that are competitive with state-of-the-art "computer vision" methods that utilize highly engineered image features and sophisticated machine learning algorithms.

  2. Bayesian Alternation During Tactile Augmentation

    Directory of Open Access Journals (Sweden)

    Caspar Mathias Goeke

    2016-10-01

    Full Text Available A large number of studies suggest that the integration of multisensory signals by humans is well described by Bayesian principles. However, there are very few reports about cue combination between a native and an augmented sense. In particular, we asked whether adult participants are able to integrate an augmented sensory cue with existing native sensory information. For the purpose of this study we built a tactile augmentation device. We then compared different hypotheses of how untrained adult participants combine information from a native and an augmented sense. In a two-interval forced choice (2-IFC) task, while subjects were blindfolded and seated on a rotating platform, our sensory augmentation device translated information on whole-body yaw rotation into tactile stimulation. Three conditions were realized: tactile stimulation only (augmented condition), rotation only (native condition), and both augmented and native information (bimodal condition). Participants had to choose the one of two consecutive rotations with the higher angular rotation. For the analysis, we fitted the participants' responses with a probit model and calculated the just noticeable difference (JND). We then compared several models for predicting bimodal from unimodal responses. An objective Bayesian alternation model yielded a better prediction (reduced χ² = 1.67) than the Bayesian integration model (reduced χ² = 4.34). A non-Bayesian winner-takes-all model, which used either only native or only augmented values per subject for prediction, showed slightly higher accuracy (reduced χ² = 1.64). However, the performance of the Bayesian alternation model could be substantially improved (reduced χ² = 1.09) by utilizing subjective weights obtained from a questionnaire. As a result, the subjective Bayesian alternation model predicted bimodal performance most accurately among all tested models. These results suggest that information from augmented and existing sensory modalities in

  3. Bayesian analysis of zero inflated spatiotemporal HIV/TB child mortality data through the INLA and SPDE approaches: Applied to data observed between 1992 and 2010 in rural North East South Africa

    Science.gov (United States)

    Musenge, Eustasius; Chirwa, Tobias Freeman; Kahn, Kathleen; Vounatsou, Penelope

    2013-06-01

    Longitudinal mortality data with few deaths usually have problems of zero-inflation. This paper presents and applies two Bayesian models which cater for zero-inflation and for spatial and temporal random effects. To reduce the computational burden experienced when a large number of geo-locations are treated as a Gaussian field (GF), we transformed the field to a Gaussian Markov Random Field (GMRF) by triangulation. We then modelled the spatial random effects using Stochastic Partial Differential Equations (SPDEs). Inference was done using a computationally efficient alternative to Markov chain Monte Carlo (MCMC), the Integrated Nested Laplace Approximation (INLA), which is suited to GMRFs. The models were applied to data from 71,057 children aged 0 to under 10 years from rural north-east South Africa living in 15,703 households over the years 1992-2010. We found protective effects on HIV/TB mortality due to greater birth weight, older age and more antenatal clinic visits during pregnancy (adjusted RR (95% CI): 0.73 (0.53; 0.99), 0.18 (0.14; 0.22) and 0.96 (0.94; 0.97), respectively). Therefore childhood HIV/TB mortality could be reduced if mothers are better catered for during pregnancy, as this can reduce mother-to-child transmissions and contribute to improved birth weights. The INLA and SPDE approaches are computationally good alternatives for modelling large multilevel spatiotemporal GMRF data structures.
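    Independently of the INLA/SPDE machinery used in the paper, the small sketch below shows what a zero-inflated Poisson log-likelihood looks like in plain NumPy; the counts and parameter values are invented.

        # Zero-inflated Poisson log-likelihood sketch, without spatial/temporal random effects.
        import numpy as np
        from scipy.special import gammaln

        def zip_loglike(counts, lam, pi0):
            """Log-likelihood of a zero-inflated Poisson model.

            counts : non-negative integer counts (e.g., deaths per unit)
            lam    : Poisson rate of the at-risk component (lam > 0)
            pi0    : probability of a structural zero (0 < pi0 < 1)
            """
            counts = np.asarray(counts)
            # P(0) = pi0 + (1 - pi0) * exp(-lam)
            ll_zero = np.log(pi0 + (1.0 - pi0) * np.exp(-lam))
            # P(k) = (1 - pi0) * Poisson(k; lam) for k >= 1
            k = counts[counts > 0]
            ll_pos = np.log(1.0 - pi0) - lam + k * np.log(lam) - gammaln(k + 1)
            return np.sum(counts == 0) * ll_zero + ll_pos.sum()

        counts = np.array([0, 0, 0, 1, 0, 2, 0, 0, 1, 0])   # invented example counts
        print(zip_loglike(counts, lam=0.4, pi0=0.3))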

  4. Bayesian methods for measures of agreement

    CERN Document Server

    Broemeling, Lyle D

    2009-01-01

    Using WinBUGS to implement Bayesian inferences of estimation and testing hypotheses, Bayesian Methods for Measures of Agreement presents useful methods for the design and analysis of agreement studies. It focuses on agreement among the various players in the diagnostic process.The author employs a Bayesian approach to provide statistical inferences based on various models of intra- and interrater agreement. He presents many examples that illustrate the Bayesian mode of reasoning and explains elements of a Bayesian application, including prior information, experimental information, the likelihood function, posterior distribution, and predictive distribution. The appendices provide the necessary theoretical foundation to understand Bayesian methods as well as introduce the fundamentals of programming and executing the WinBUGS software.Taking a Bayesian approach to inference, this hands-on book explores numerous measures of agreement, including the Kappa coefficient, the G coefficient, and intraclass correlation...

  5. Trends in epidemiology in the 21st century: time to adopt Bayesian methods

    Directory of Open Access Journals (Sweden)

    Edson Zangiacomi Martinez

    2014-04-01

    Full Text Available 2013 marked the 250th anniversary of the presentation of Bayes’ theorem by the philosopher Richard Price. Thomas Bayes was a figure little known in his own time, but in the 20th century the theorem that bears his name became widely used in many fields of research. The Bayes theorem is the basis of the so-called Bayesian methods, an approach to statistical inference that allows studies to incorporate prior knowledge about relevant data characteristics into statistical analysis. Nowadays, Bayesian methods are widely used in many different areas such as astronomy, economics, marketing, genetics, bioinformatics and social sciences. This study observed that a number of authors discussed recent advances in techniques and the advantages of Bayesian methods for the analysis of epidemiological data. This article presents an overview of Bayesian methods, their application to epidemiological research and the main areas of epidemiology which should benefit from the use of Bayesian methods in coming years.

  6. Bayesian Cosmic Web Reconstruction: BARCODE for Clusters

    Science.gov (United States)

    Bos, E. G. Patrick; van de Weygaert, Rien; Kitaura, Francisco; Cautun, Marius

    2016-10-01

    We describe the Bayesian \\barcode\\ formalism that has been designed towards the reconstruction of the Cosmic Web in a given volume on the basis of the sampled galaxy cluster distribution. Based on the realization that the massive compact clusters are responsible for the major share of the large scale tidal force field shaping the anisotropic and in particular filamentary features in the Cosmic Web. Given the nonlinearity of the constraints imposed by the cluster configurations, we resort to a state-of-the-art constrained reconstruction technique to find a proper statistically sampled realization of the original initial density and velocity field in the same cosmic region. Ultimately, the subsequent gravitational evolution of these initial conditions towards the implied Cosmic Web configuration can be followed on the basis of a proper analytical model or an N-body computer simulation. The BARCODE formalism includes an implicit treatment for redshift space distortions. This enables a direct reconstruction on the basis of observational data, without the need for a correction of redshift space artifacts. In this contribution we provide a general overview of the the Cosmic Web connection with clusters and a description of the Bayesian BARCODE formalism. We conclude with a presentation of its successful workings with respect to test runs based on a simulated large scale matter distribution, in physical space as well as in redshift space.

  7. Fully Bayesian Experimental Design for Pharmacokinetic Studies

    Directory of Open Access Journals (Sweden)

    Elizabeth G. Ryan

    2015-03-01

    Full Text Available Utility functions in Bayesian experimental design are usually based on the posterior distribution. When the posterior is found by simulation, it must be sampled from for each future dataset drawn from the prior predictive distribution. Many thousands of posterior distributions are often required. A popular technique in the Bayesian experimental design literature, which rapidly obtains samples from the posterior, is importance sampling, using the prior as the importance distribution. However, importance sampling from the prior will tend to break down if there is a reasonable number of experimental observations. In this paper, we explore the use of Laplace approximations in the design setting to overcome this drawback. Furthermore, we consider using the Laplace approximation to form the importance distribution to obtain a more efficient importance distribution than the prior. The methodology is motivated by a pharmacokinetic study, which investigates the effect of extracorporeal membrane oxygenation on the pharmacokinetics of antibiotics in sheep. The design problem is to find 10 near optimal plasma sampling times that produce precise estimates of pharmacokinetic model parameters/measures of interest. We consider several different utility functions of interest in these studies, which involve the posterior distribution of parameter functions.
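    The following hedged sketch illustrates the central idea, using a Laplace approximation of the posterior (instead of the prior) as the importance distribution for a one-parameter decay model; the model, prior and data are invented and are not the sheep pharmacokinetic model of the study.

        # Importance sampling with a Laplace approximation as the importance distribution.
        import numpy as np
        from scipy import optimize, stats

        rng = np.random.default_rng(2)
        t = np.array([0.5, 1.0, 2.0, 4.0, 8.0])                     # sampling times
        y = 10.0 * np.exp(-0.7 * t) + rng.normal(0.0, 0.3, t.size)  # synthetic concentrations
        noise_sd = 0.3

        def neg_log_post(k):
            """Negative log posterior for decay rate k with a N(1, 1) prior."""
            pred = 10.0 * np.exp(-k * t)
            return 0.5 * np.sum(((y - pred) / noise_sd) ** 2) + 0.5 * (k - 1.0) ** 2

        # Laplace approximation: posterior mode and curvature (finite differences)
        k_map = optimize.minimize_scalar(neg_log_post, bounds=(0.01, 5.0), method="bounded").x
        h = 1e-4
        curv = (neg_log_post(k_map + h) - 2 * neg_log_post(k_map) + neg_log_post(k_map - h)) / h**2
        laplace_sd = 1.0 / np.sqrt(curv)

        # Importance sampling from the Laplace approximation rather than the prior
        k_samp = rng.normal(k_map, laplace_sd, size=5000)
        log_w = np.array([-neg_log_post(k) for k in k_samp]) - stats.norm.logpdf(k_samp, k_map, laplace_sd)
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        print("posterior mean of k:", np.sum(w * k_samp))
        print("effective sample size:", 1.0 / np.sum(w ** 2))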

  8. Thermal bioaerosol cloud tracking with Bayesian classification

    Science.gov (United States)

    Smith, Christian W.; Dupuis, Julia R.; Schundler, Elizabeth C.; Marinelli, William J.

    2017-05-01

    The development of a wide area, bioaerosol early warning capability employing existing uncooled thermal imaging systems used for persistent perimeter surveillance is discussed. The capability exploits thermal imagers with other available data streams including meteorological data and employs a recursive Bayesian classifier to detect, track, and classify observed thermal objects with attributes consistent with a bioaerosol plume. Target detection is achieved based on similarity to a phenomenological model which predicts the scene-dependent thermal signature of bioaerosol plumes. Change detection in thermal sensor data is combined with local meteorological data to locate targets with the appropriate thermal characteristics. Target motion is tracked utilizing a Kalman filter and nearly constant velocity motion model for cloud state estimation. Track management is performed using a logic-based upkeep system, and data association is accomplished using a combinatorial optimization technique. Bioaerosol threat classification is determined using a recursive Bayesian classifier to quantify the threat probability of each tracked object. The classifier can accept additional inputs from visible imagers, acoustic sensors, and point biological sensors to improve classification confidence. This capability was successfully demonstrated for bioaerosol simulant releases during field testing at Dugway Proving Grounds. Standoff detection at a range of 700m was achieved for as little as 500g of anthrax simulant. Developmental test results will be reviewed for a range of simulant releases, and future development and transition plans for the bioaerosol early warning platform will be discussed.
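    A minimal sketch of the recursive Bayesian classification step described above, updating the threat probability of a tracked object frame by frame; the prior and per-frame likelihood values are invented placeholders.

        # Recursive Bayesian classification of a tracked object as threat vs. benign.
        p_threat = 0.01   # invented prior probability that a tracked plume is a bio-threat

        # Each tuple: (P(frame features | threat), P(frame features | benign)), invented
        frames = [(0.30, 0.10), (0.40, 0.12), (0.25, 0.20), (0.50, 0.08)]

        for like_threat, like_benign in frames:
            # Bayes' rule applied recursively: the previous posterior becomes the new prior
            num = like_threat * p_threat
            p_threat = num / (num + like_benign * (1.0 - p_threat))
            print(f"updated P(threat) = {p_threat:.3f}")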

  9. Adaptive Naive Bayesian Anti-Spam Engine

    CERN Document Server

    Gajewski, W P

    2006-01-01

    The problem of spam has been seriously troubling the Internet community during the last few years and currently reached an alarming scale. Observations made at CERN (European Organization for Nuclear Research located in Geneva, Switzerland) show that spam mails can constitute up to 75% of daily SMTP traffic. A naïve Bayesian classifier based on a Bag of Words representation of an email is widely used to stop this unwanted flood as it combines good performance with simplicity of the training and classification processes. However, facing the constantly changing patterns of spam, it is necessary to assure online adaptability of the classifier. This work proposes combining such a classifier with another NBC (naïve Bayesian classifier) based on pairs of adjacent words. Only the latter will be retrained with examples of spam reported by users. Tests are performed on considerable sets of mails both from public spam archives and CERN mailboxes. They suggest that this architecture can increase spam recall without af...
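    As a hedged illustration (not CERN's engine), the sketch below implements a minimal bag-of-words naive Bayesian spam classifier with Laplace smoothing on an invented four-message corpus; the adjacent-word (bigram) second classifier described in the record is omitted.

        # Minimal bag-of-words naive Bayesian spam classifier with Laplace smoothing.
        # The tiny training corpus is invented for illustration only.
        import math
        from collections import Counter

        train = [
            ("win money now", "spam"),
            ("cheap meds win big", "spam"),
            ("meeting agenda attached", "ham"),
            ("lunch tomorrow at noon", "ham"),
        ]

        word_counts = {"spam": Counter(), "ham": Counter()}
        class_counts = Counter()
        for text, label in train:
            class_counts[label] += 1
            word_counts[label].update(text.split())

        vocab = set(w for c in word_counts.values() for w in c)

        def log_posterior(text, label):
            """Unnormalized log posterior: log P(label) + sum of log P(word | label)."""
            total = sum(word_counts[label].values())
            lp = math.log(class_counts[label] / sum(class_counts.values()))
            for w in text.split():
                lp += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
            return lp

        def classify(text):
            return max(("spam", "ham"), key=lambda lab: log_posterior(text, lab))

        print(classify("win cheap money"))    # expected: spam
        print(classify("agenda for lunch"))   # expected: ham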

  10. Finite‐fault Bayesian inversion of teleseismic body waves

    Science.gov (United States)

    Clayton, Brandon; Hartzell, Stephen; Moschetti, Morgan P.; Minson, Sarah E.

    2017-01-01

    Inverting geophysical data has provided fundamental information about the behavior of earthquake rupture. However, inferring kinematic source model parameters for finite‐fault ruptures is an intrinsically underdetermined problem (the problem of nonuniqueness), because we are restricted to finite noisy observations. Although many studies use least‐squares techniques to make the finite‐fault problem tractable, these methods generally lack the ability to apply non‐Gaussian error analysis and the imposition of nonlinear constraints. However, the Bayesian approach can be employed to find a Gaussian or non‐Gaussian distribution of all probable model parameters, while utilizing nonlinear constraints. We present case studies to quantify the resolving power and associated uncertainties using only teleseismic body waves in a Bayesian framework to infer the slip history for a synthetic case and two earthquakes: the 2011 Mw 7.1 Van, east Turkey, earthquake and the 2010 Mw 7.2 El Mayor–Cucapah, Baja California, earthquake. In implementing the Bayesian method, we further present two distinct solutions to investigate the uncertainties by performing the inversion with and without velocity structure perturbations. We find that the posterior ensemble becomes broader when including velocity structure variability and introduces a spatial smearing of slip. Using the Bayesian framework solely on teleseismic body waves, we find rake is poorly constrained by the observations and rise time is poorly resolved when slip amplitude is low.

  11. Deep Learning and Bayesian Methods

    Science.gov (United States)

    Prosper, Harrison B.

    2017-03-01

    A revolution is underway in which deep neural networks are routinely used to solve difficult problems such as face recognition and natural language understanding. Particle physicists have taken notice and have started to deploy these methods, achieving results that suggest a potentially significant shift in how data might be analyzed in the not too distant future. We discuss a few recent developments in the application of deep neural networks and then indulge in speculation about how such methods might be used to automate certain aspects of data analysis in particle physics. Next, the connection to Bayesian methods is discussed and the paper ends with thoughts on a significant practical issue, namely, how, from a Bayesian perspective, one might optimize the construction of deep neural networks.

  12. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

    Probabilistic networks, also known as Bayesian networks and influence diagrams, have become one of the most promising technologies in the area of applied artificial intelligence, offering intuitive, efficient, and reliable methods for diagnosis, prediction, decision making, classification, troubleshooting, and data mining under uncertainty. Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. The techniques and methods presented for knowledge elicitation, model construction and verification, modeling techniques and tricks, learning models from data, and analyses of models have all been developed and refined, and exercises are included for the reader to check his/her level of understanding.

  13. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

    Probabilistic networks, also known as Bayesian networks and influence diagrams, have become one of the most promising technologies in the area of applied artificial intelligence, offering intuitive, efficient, and reliable methods for diagnosis, prediction, decision making, classification, troubleshooting, and data mining under uncertainty. Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. The techniques and methods presented for knowledge elicitation, model construction and verification, modeling techniques and tricks, learning models from data, and analyses of models have all been developed and refined, and exercises are included for the reader to check his/her level of understanding.

  14. Bayesian Models of Individual Differences.

    Science.gov (United States)

    Powell, Georgie; Meredith, Zoe; McMillin, Rebecca; Freeman, Tom C A

    2016-12-01

    According to Bayesian models, perception and cognition depend on the optimal combination of noisy incoming evidence with prior knowledge of the world. Individual differences in perception should therefore be jointly determined by a person's sensitivity to incoming evidence and his or her prior expectations. It has been proposed that individuals with autism have flatter prior distributions than do nonautistic individuals, which suggests that prior variance is linked to the degree of autistic traits in the general population. We tested this idea by studying how perceived speed changes during pursuit eye movement and at low contrast. We found that individual differences in these two motion phenomena were predicted by differences in thresholds and autistic traits when combined in a quantitative Bayesian model. Our findings therefore support the flatter-prior hypothesis and suggest that individual differences in prior expectations are more systematic than previously thought. In order to be revealed, however, individual differences in sensitivity must also be taken into account.

  15. Bayesian analyses of cognitive architecture.

    Science.gov (United States)

    Houpt, Joseph W; Heathcote, Andrew; Eidels, Ami

    2017-06-01

    The question of cognitive architecture-how cognitive processes are temporally organized-has arisen in many areas of psychology. This question has proved difficult to answer, with many proposed solutions turning out to be spurious. Systems factorial technology (Townsend & Nozawa, 1995) provided the first rigorous empirical and analytical method of identifying cognitive architecture, using the survivor interaction contrast (SIC) to determine when people are using multiple sources of information in parallel or in series. Although the SIC is based on rigorous nonparametric mathematical modeling of response time distributions, for many years inference about cognitive architecture has relied solely on visual assessment. Houpt and Townsend (2012) recently introduced null hypothesis significance tests, and here we develop both parametric and nonparametric (encompassing prior) Bayesian inference. We show that the Bayesian approaches can have considerable advantages. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  16. Deep Learning and Bayesian Methods

    Directory of Open Access Journals (Sweden)

    Prosper Harrison B.

    2017-01-01

    Full Text Available A revolution is underway in which deep neural networks are routinely used to solve difficult problems such as face recognition and natural language understanding. Particle physicists have taken notice and have started to deploy these methods, achieving results that suggest a potentially significant shift in how data might be analyzed in the not too distant future. We discuss a few recent developments in the application of deep neural networks and then indulge in speculation about how such methods might be used to automate certain aspects of data analysis in particle physics. Next, the connection to Bayesian methods is discussed and the paper ends with thoughts on a significant practical issue, namely, how, from a Bayesian perspective, one might optimize the construction of deep neural networks.

  17. Bayesian inference for agreement measures.

    Science.gov (United States)

    Vidal, Ignacio; de Castro, Mário

    2016-08-25

    The agreement of different measurement methods is an important issue in several disciplines like, for example, Medicine, Metrology, and Engineering. In this article, some agreement measures, common in the literature, were analyzed from a Bayesian point of view. Posterior inferences for such agreement measures were obtained based on well-known Bayesian inference procedures for the bivariate normal distribution. As a consequence, a general, simple, and effective method is presented, which does not require Markov Chain Monte Carlo methods and can be applied considering a great variety of prior distributions. Illustratively, the method was exemplified using five objective priors for the bivariate normal distribution. A tool for assessing the adequacy of the model is discussed. Results from a simulation study and an application to a real dataset are also reported.

  18. Bayesian Repulsive Gaussian Mixture Model

    OpenAIRE

    Xie, Fangzheng; Xu, Yanxun

    2017-01-01

    We develop a general class of Bayesian repulsive Gaussian mixture models that encourage well-separated clusters, aiming at reducing potentially redundant components produced by independent priors for locations (such as the Dirichlet process). The asymptotic results for the posterior distribution of the proposed models are derived, including posterior consistency and posterior contraction rate in the context of nonparametric density estimation. More importantly, we show that compared to the in...

  19. Elements of Bayesian experimental design

    Energy Technology Data Exchange (ETDEWEB)

    Sivia, D.S. [Rutherford Appleton Lab., Oxon (United Kingdom)

    1997-09-01

    We consider some elements of the Bayesian approach that are important for optimal experimental design. While the underlying principles used are very general, and are explained in detail in a recent tutorial text, they are applied here to the specific case of characterising the inferential value of different resolution peakshapes. This particular issue was considered earlier by Silver, Sivia and Pynn (1989, 1990a, 1990b), and the following presentation confirms and extends the conclusions of their analysis.

  20. Space Shuttle RTOS Bayesian Network

    Science.gov (United States)

    Morris, A. Terry; Beling, Peter A.

    2001-01-01

    With shrinking budgets and the requirement to increase the reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to select the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, which is a joint venture between Boeing and Lockheed Martin, the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure and selection of prior probabilities for the network are extracted from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability. In the second stage, multi-criteria trade-off analyses are performed between the scores

  1. 12th Brazilian Meeting on Bayesian Statistics

    CERN Document Server

    Louzada, Francisco; Rifo, Laura; Stern, Julio; Lauretto, Marcelo

    2015-01-01

    Through refereed papers, this volume focuses on the foundations of the Bayesian paradigm; their comparison to objectivistic or frequentist Statistics counterparts; and the appropriate application of Bayesian foundations. This research in Bayesian Statistics is applicable to data analysis in biostatistics, clinical trials, law, engineering, and the social sciences. EBEB, the Brazilian Meeting on Bayesian Statistics, is held every two years by the ISBrA, the International Society for Bayesian Analysis, one of the most active chapters of the ISBA. The 12th meeting took place March 10-14, 2014 in Atibaia. Interest in foundations of inductive Statistics has grown recently in accordance with the increasing availability of Bayesian methodological alternatives. Scientists need to deal with the ever more difficult choice of the optimal method to apply to their problem. This volume shows how Bayes can be the answer. The examination and discussion on the foundations work towards the goal of proper application of Bayesia...

  2. Bayesian analysis on gravitational waves and exoplanets

    Science.gov (United States)

    Deng, Xihao

    Attempts to detect gravitational waves using a pulsar timing array (PTA), i.e., a collection of pulsars in our Galaxy, have become more organized over the last several years. PTAs act to detect gravitational waves generated from very distant sources by observing the small and correlated effect the waves have on pulse arrival times at the Earth. In this thesis, I present advanced Bayesian analysis methods that can be used to search for gravitational waves in pulsar timing data. These methods were also applied to analyze a set of radial velocity (RV) data collected by the Hobby-Eberly Telescope on observing a K0 giant star. They confirmed the presence of two Jupiter mass planets around a K0 giant star and also characterized the stellar p-mode oscillation. The first part of the thesis investigates the effect of wavefront curvature on a pulsar's response to a gravitational wave. In it we show that we can assume the gravitational wave phasefront is planar across the array only if the source luminosity distance ≫ 2πL²/λ, where L is the pulsar distance to the Earth (~ kpc) and λ is the radiation wavelength (~ pc) in the PTA waveband. Correspondingly, for a point gravitational wave source closer than ~ 100 Mpc, we should take into account the effect of wavefront curvature across the pulsar-Earth line of sight, which depends on the luminosity distance to the source, when evaluating the pulsar timing response. As a consequence, if a PTA can detect a gravitational wave from a source closer than ~ 100 Mpc, the effects of wavefront curvature on the response allow us to determine the source luminosity distance. The second and third parts of the thesis propose a new analysis method based on Bayesian nonparametric regression to search for gravitational wave bursts and a gravitational wave background in PTA data. Unlike the conventional Bayesian analysis that introduces a signal model with a fixed number of parameters, Bayesian nonparametric regression sets

  3. Comparison of Bayesian and frequentist approaches

    OpenAIRE

    Ageyeva, Anna

    2010-01-01

    The thesis deals with the Bayesian approach to statistics and its comparison to the frequentist approach. The main aim of the thesis is to compare the frequentist and Bayesian approaches to statistics by analyzing statistical inferences and examining the question of subjectivity and objectivity in statistics. Another goal of the thesis is to draw attention to the importance and necessity of teaching Bayesian statistics at our University in greater depth. The thesis includes three chapters. The first chapter prese...

  4. Bayesian characterization of uncertainty in species interaction strengths.

    Science.gov (United States)

    Wolf, Christopher; Novak, Mark; Gitelman, Alix I

    2017-06-01

    Considerable effort has been devoted to the estimation of species interaction strengths. This effort has focused primarily on statistical significance testing and obtaining point estimates of parameters that contribute to interaction strength magnitudes, leaving the characterization of uncertainty associated with those estimates unconsidered. We consider a means of characterizing the uncertainty of a generalist predator's interaction strengths by formulating an observational method for estimating a predator's prey-specific per capita attack rates as a Bayesian statistical model. This formulation permits the explicit incorporation of multiple sources of uncertainty. A key insight is the informative nature of several so-called non-informative priors that have been used in modeling the sparse data typical of predator feeding surveys. We introduce to ecology a new neutral prior and provide evidence for its superior performance. We use a case study to consider the attack rates in a New Zealand intertidal whelk predator, and we illustrate not only that Bayesian point estimates can be made to correspond with those obtained by frequentist approaches, but also that estimation uncertainty as described by 95% intervals is more useful and biologically realistic using the Bayesian method. In particular, unlike in bootstrap confidence intervals, the lower bounds of the Bayesian posterior intervals for attack rates do not include zero when a predator-prey interaction is in fact observed. We conclude that the Bayesian framework provides a straightforward, probabilistic characterization of interaction strength uncertainty, enabling future considerations of both the deterministic and stochastic drivers of interaction strength and their impact on food webs.

  5. A Bayesian Approach to Identifying New Risk Factors for Dementia

    Science.gov (United States)

    Wen, Yen-Hsia; Wu, Shihn-Sheng; Lin, Chun-Hung Richard; Tsai, Jui-Hsiu; Yang, Pinchen; Chang, Yang-Pei; Tseng, Kuan-Hua

    2016-01-01

    Abstract Dementia is one of the most disabling and burdensome health conditions worldwide. In this study, we identified new potential risk factors for dementia from nationwide longitudinal population-based data by using Bayesian statistics. We first tested the consistency of the results obtained using Bayesian statistics with those obtained using classical frequentist probability for 4 recognized risk factors for dementia, namely severe head injury, depression, diabetes mellitus, and vascular diseases. Then, we used Bayesian statistics to verify 2 new potential risk factors for dementia, namely hearing loss and senile cataract, determined from the Taiwan's National Health Insurance Research Database. We included a total of 6546 (6.0%) patients diagnosed with dementia. We observed older age, female sex, and lower income as independent risk factors for dementia. Moreover, we verified the 4 recognized risk factors for dementia in the older Taiwanese population; their odds ratios (ORs) ranged from 3.469 to 1.207. Furthermore, we observed that hearing loss (OR = 1.577) and senile cataract (OR = 1.549) were associated with an increased risk of dementia. We found that the results obtained using Bayesian statistics for assessing risk factors for dementia, such as head injury, depression, DM, and vascular diseases, were consistent with those obtained using classical frequentist probability. Moreover, hearing loss and senile cataract were found to be potential risk factors for dementia in the older Taiwanese population. Bayesian statistics could help clinicians explore other potential risk factors for dementia and for developing appropriate treatment strategies for these patients. PMID:27227925

  6. Minimum mean square error estimation and approximation of the Bayesian update

    KAUST Repository

    Litvinenko, Alexander

    2015-01-07

    Given: a physical system modeled by a PDE or ODE with uncertain coefficient q(w), and a measurement operator Y(u(q); q), where u(q; w) is the uncertain solution. Aim: to identify q(w). The mapping from parameters to observations is usually not invertible, hence this inverse identification problem is generally ill-posed. To identify q(w) we derived a non-linear Bayesian update from the variational problem associated with conditional expectation. To reduce the cost of the Bayesian update we offer a functional approximation, e.g., a polynomial chaos expansion (PCE). New: we derive linear, quadratic, etc., approximations of the full Bayesian update.
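    A small ensemble sketch of the linear approximation of the Bayesian update mentioned above, in its Kalman-gain form; the observation operator, dimensions and noise level are invented, and the quadratic and higher-order variants are not shown.

        # Linear (Kalman-gain) approximation of the Bayesian update applied to a prior ensemble.
        import numpy as np

        rng = np.random.default_rng(3)
        n_ens, n_q, n_y = 500, 3, 2

        # Prior ensemble for the uncertain parameter q and predicted measurements Y(u(q); q)
        Q_prior = rng.normal(0.0, 1.0, size=(n_ens, n_q))
        H = np.array([[1.0, 0.5, 0.0],
                      [0.0, 1.0, -0.5]])          # invented linear measurement operator
        noise_sd = 0.1
        Y_pred = Q_prior @ H.T + rng.normal(0.0, noise_sd, size=(n_ens, n_y))

        # Sample covariances and the gain of the linear conditional-expectation update
        q_mean, y_mean = Q_prior.mean(axis=0), Y_pred.mean(axis=0)
        C_qy = (Q_prior - q_mean).T @ (Y_pred - y_mean) / (n_ens - 1)
        C_yy = np.cov(Y_pred, rowvar=False)
        K = C_qy @ np.linalg.inv(C_yy)

        y_obs = np.array([0.8, -0.3])                  # invented measurement
        Q_post = Q_prior + (y_obs - Y_pred) @ K.T      # linearly updated ensemble
        print("posterior mean of q:", Q_post.mean(axis=0))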

  7. Approximate Bayesian computation with functional statistics.

    Science.gov (United States)

    Soubeyrand, Samuel; Carpentier, Florence; Guiton, François; Klein, Etienne K

    2013-03-26

    Functional statistics are commonly used to characterize spatial patterns in general and spatial genetic structures in population genetics in particular. Such functional statistics also enable the estimation of parameters of spatially explicit (and genetic) models. Recently, Approximate Bayesian Computation (ABC) has been proposed to estimate model parameters from functional statistics. However, applying ABC with functional statistics may be cumbersome because of the high dimension of the set of statistics and the dependences among them. To tackle this difficulty, we propose an ABC procedure which relies on an optimized weighted distance between observed and simulated functional statistics. We applied this procedure to a simple step model, a spatial point process characterized by its pair correlation function and a pollen dispersal model characterized by genetic differentiation as a function of distance. These applications showed how the optimized weighted distance improved estimation accuracy. In the discussion, we consider the application of the proposed ABC procedure to functional statistics characterizing non-spatial processes.
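    The hedged toy example below runs ABC rejection with a functional summary statistic (an empirical CDF on a fixed grid) and a weighted distance; the simulator, weights and tolerance are invented, and the weights are a heuristic, not the optimization procedure of the paper.

        # ABC rejection sketch with a functional summary statistic and a weighted distance.
        import numpy as np

        rng = np.random.default_rng(4)
        grid = np.linspace(-3.0, 3.0, 25)

        def ecdf_on_grid(x):
            """Functional statistic: the empirical CDF evaluated on a fixed grid."""
            return np.array([np.mean(x <= g) for g in grid])

        # "Observed" data generated from an unknown location parameter theta
        x_obs = rng.normal(0.7, 1.0, size=200)
        s_obs = ecdf_on_grid(x_obs)

        # Heuristic weights emphasising the informative mid-range of the CDF
        w = 0.05 + s_obs * (1.0 - s_obs)
        w /= w.sum()

        accepted = []
        for _ in range(5000):
            theta = rng.uniform(-2.0, 2.0)                      # prior draw
            s_sim = ecdf_on_grid(rng.normal(theta, 1.0, 200))   # simulated functional statistic
            dist = np.sqrt(np.sum(w * (s_sim - s_obs) ** 2))
            if dist < 0.08:                                     # tolerance (arbitrary choice)
                accepted.append(theta)

        print("ABC posterior mean:", np.mean(accepted), "| acceptances:", len(accepted))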

  8. Default Bayesian Estimation of the Fundamental Frequency

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Christensen, Mads Græsbøll; Jensen, Søren Holdt

    2013-01-01

    Joint fundamental frequency and model order estimation is an important problem in several applications. In this paper, a default estimation algorithm based on a minimum of prior information is presented. The algorithm is developed in a Bayesian framework, and it can be applied to both real- and complex-valued discrete-time signals which may have missing samples or may have been sampled at a non-uniform sampling frequency. The observation model and prior distributions corresponding to the prior information are derived in a consistent fashion using maximum entropy and invariance arguments. Moreover, several approximations of the posterior distributions on the fundamental frequency and the model order are derived, and one of the state-of-the-art joint fundamental frequency and model order estimators is demonstrated to be a special case of one of these approximations. The performance...

  9. A Predictive Likelihood Approach to Bayesian Averaging

    Directory of Open Access Journals (Sweden)

    Tomáš Jeřábek

    2015-01-01

    Full Text Available Multivariate time series forecasting is applied in a wide range of economic activities related to regional competitiveness and is the basis of almost all macroeconomic analysis. In this paper we combine multivariate density forecasts of GDP growth, inflation and real interest rates from four models: two types of Bayesian vector autoregression (BVAR) models, a New Keynesian dynamic stochastic general equilibrium (DSGE) model of a small open economy, and a DSGE-VAR model. The performance of the models is evaluated on historical data covering the domestic economy and the foreign economy, which is represented by countries of the Eurozone. Because the forecast accuracy of the individual models differs, weighting schemes based on the predictive likelihood, the trace of the past MSE matrix, and model ranks are used to combine the models. The equal-weight scheme is used as a simple combination benchmark. The results show that optimally combined densities are comparable to the best individual models.
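    As a hedged illustration of one of the weighting schemes mentioned above, the snippet below turns (invented) sums of log predictive likelihoods into normalized combination weights; the model labels are placeholders, not the paper's exact specifications.

        # Combination weights from predictive likelihoods (softmax of log scores).
        # The log predictive likelihoods below are invented for illustration.
        import numpy as np

        log_pred_like = {
            "BVAR_1": -152.4,
            "BVAR_2": -151.1,
            "DSGE": -154.0,
            "DSGE_VAR": -150.6,
        }

        names = list(log_pred_like)
        scores = np.array([log_pred_like[m] for m in names])
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()

        for name, wgt in zip(names, weights):
            print(f"{name}: weight = {wgt:.3f}")

        # Equal-weight benchmark for comparison
        print("equal weight =", 1.0 / len(names))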

  10. PARALLEL ALGORITHM FOR BAYESIAN NETWORK STRUCTURE LEARNING

    Directory of Open Access Journals (Sweden)

    S. A. Arustamov

    2013-03-01

    Full Text Available The article deals with the implementation of a scalable parallel algorithm for structure learning of a Bayesian network. A comparative analysis of the sequential and parallel algorithms is performed.

  11. 3rd Bayesian Young Statisticians Meeting

    CERN Document Server

    Lanzarone, Ettore; Villalobos, Isadora; Mattei, Alessandra

    2017-01-01

    This book is a selection of peer-reviewed contributions presented at the third Bayesian Young Statisticians Meeting, BAYSM 2016, Florence, Italy, June 19-21. The meeting provided a unique opportunity for young researchers, M.S. students, Ph.D. students, and postdocs dealing with Bayesian statistics to connect with the Bayesian community at large, to exchange ideas, and to network with others working in the same field. The contributions develop and apply Bayesian methods in a variety of fields, ranging from the traditional (e.g., biostatistics and reliability) to the most innovative ones (e.g., big data and networks).

  12. A Bayesian Framework for SNP Identification

    Energy Technology Data Exchange (ETDEWEB)

    Webb-Robertson, Bobbie-Jo M.; Havre, Susan L.; Payne, Deborah A.

    2005-07-01

    Current proteomics techniques, such as mass spectrometry, focus on protein identification, usually ignoring most types of modifications beyond post-translational modifications, with the assumption that only a small number of peptides have to be matched to a protein for a positive identification. However, not all proteins are being identified with current techniques and improved methods to locate points of mutation are becoming a necessity. In the case when single-nucleotide polymorphisms (SNPs) are observed, brute force is the most common method to locate them, quickly becoming computationally unattractive as the size of the database associated with the model organism grows. We have developed a Bayesian model for SNPs, BSNP, incorporating evolutionary information at both the nucleotide and amino acid levels. Formulating SNPs as a Bayesian inference problem allows probabilities of interest to be easily obtained, for example the probability of a specific SNP or specific type of mutation over a gene or entire genome. Three SNP databases were observed in the evaluation of the BSNP model; the first SNP database is a disease specific gene in human, hemoglobin, the second is also a disease specific gene in human, p53, and the third is a more general SNP database for multiple genes in mouse. We validate that the BSNP model assigns higher posterior probabilities to the SNPs defined in all three separate databases than can be attributed to chance under specific evolutionary information, for example the amino acid model described by Majewski and Ott in conjunction with either the four-parameter nucleotide model by Bulmer or seven-parameter nucleotide model by Majewski and Ott.

  13. A Bayesian view on acoustic model-based techniques for robust speech recognition

    Science.gov (United States)

    Maas, Roland; Huemmer, Christian; Sehr, Armin; Kellermann, Walter

    2015-12-01

    This article provides a unifying Bayesian view on various approaches for acoustic model adaptation, missing feature, and uncertainty decoding that are well-known in the literature of robust automatic speech recognition. The representatives of these classes can often be deduced from a Bayesian network that extends the conventional hidden Markov models used in speech recognition. These extensions, in turn, can in many cases be motivated from an underlying observation model that relates clean and distorted feature vectors. By identifying and converting the observation models into a Bayesian network representation, we formulate the corresponding compensation rules. We thus summarize the various approaches as approximations or modifications of the same Bayesian decoding rule leading to a unified view on known derivations as well as to new formulations for certain approaches.

  14. Bayesian Methods and Universal Darwinism

    Science.gov (United States)

    Campbell, John

    2009-12-01

    Bayesian methods have, since the time of Laplace, been understood by their practitioners as closely aligned with the scientific method. Indeed, a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: The Logic of Science. Many philosophers of science, including Karl Popper and Donald Campbell, have interpreted the evolution of science as a Darwinian process consisting of a 'copy with selective retention' algorithm abstracted from Darwin's theory of natural selection. Arguments are presented for an isomorphism between Bayesian methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of maximum entropy states that systems will evolve to states of highest entropy subject to the constraints of scientific law. This principle may be inverted to provide illumination as to the nature of scientific law. Our best cosmological theories suggest the universe contained much less complexity during the period shortly after the Big Bang than it does at present. The scientific subject matter of atomic physics, chemistry, biology and the social sciences has been created since that time. An explanation is proposed for the existence of this subject matter as due to the evolution of constraints, in the form of adaptations, imposed on maximum entropy. It is argued that these adaptations were discovered and instantiated through the operation of a succession of Darwinian processes.

  15. Bayesian phylogeography finds its roots.

    Directory of Open Access Journals (Sweden)

    Philippe Lemey

    2009-09-01

    Full Text Available As a key factor in endemic and epidemic dynamics, the geographical distribution of viruses has been frequently interpreted in the light of their genetic histories. Unfortunately, inference of historical dispersal or migration patterns of viruses has mainly been restricted to model-free heuristic approaches that provide little insight into the temporal setting of the spatial dynamics. The introduction of probabilistic models of evolution, however, offers unique opportunities to engage in this statistical endeavor. Here we introduce a Bayesian framework for inference, visualization and hypothesis testing of phylogeographic history. By implementing character mapping in Bayesian software that samples time-scaled phylogenies, we enable the reconstruction of timed viral dispersal patterns while accommodating phylogenetic uncertainty. Standard Markov model inference is extended with a stochastic search variable selection procedure that identifies the most parsimonious descriptions of the diffusion process. In addition, we propose priors that can incorporate geographical sampling distributions or characterize alternative hypotheses about the spatial dynamics. To visualize the spatial and temporal information, we summarize inferences using virtual globe software. We describe how Bayesian phylogeography compares with previous parsimony analysis in the investigation of the influenza A H5N1 origin and H5N1 epidemiological linkage among sampling localities. Analysis of rabies in West African dog populations reveals how virus diffusion may enable endemic maintenance through continuous epidemic cycles. From these analyses, we conclude that our phylogeographic framework will be an important asset in molecular epidemiology that can be easily generalized to infer biogeography from genetic data for many organisms.

  16. Bayesian population receptive field modelling.

    Science.gov (United States)

    Zeidman, Peter; Silson, Edward Harry; Schwarzkopf, Dietrich Samuel; Baker, Chris Ian; Penny, Will

    2017-09-08

    We introduce a probabilistic (Bayesian) framework and associated software toolbox for mapping population receptive fields (pRFs) based on fMRI data. This generic approach is intended to work with stimuli of any dimension and is demonstrated and validated in the context of 2D retinotopic mapping. The framework enables the experimenter to specify generative (encoding) models of fMRI timeseries, in which experimental stimuli enter a pRF model of neural activity, which in turn drives a nonlinear model of neurovascular coupling and Blood Oxygenation Level Dependent (BOLD) response. The neuronal and haemodynamic parameters are estimated together on a voxel-by-voxel or region-of-interest basis using a Bayesian estimation algorithm (variational Laplace). This offers several novel contributions to receptive field modelling. The variance/covariance of the parameters is estimated, enabling receptive fields to be plotted while properly representing uncertainty about pRF size and location. Variability in the haemodynamic response across the brain is accounted for. Furthermore, the framework introduces formal hypothesis testing to pRF analysis, enabling competing models to be evaluated based on their log model evidence (approximated by the variational free energy), which represents the optimal tradeoff between accuracy and complexity. Using simulations and empirical data, we found that parameters typically used to represent pRF size and neuronal scaling are strongly correlated, which is taken into account by the Bayesian methods we describe when making inferences. We used the framework to compare the evidence for six variants of pRF model using 7 T functional MRI data, and we found a circular Difference of Gaussians (DoG) model to be the best explanation for our data overall. We hope this framework will prove useful for mapping stimulus spaces with any number of dimensions onto the anatomy of the brain. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  17. Bayesian regularization of neural networks.

    Science.gov (United States)

    Burden, Frank; Winkler, Dave

    2008-01-01

    Bayesian regularized artificial neural networks (BRANNs) are more robust than standard back-propagation nets and can reduce or eliminate the need for lengthy cross-validation. Bayesian regularization is a mathematical process that converts a nonlinear regression into a "well-posed" statistical problem in the manner of a ridge regression. The advantage of BRANNs is that the models are robust and the validation process, which scales as O(N^2) in normal regression methods, such as back-propagation, is unnecessary. These networks provide solutions to a number of problems that arise in QSAR modeling, such as choice of model, robustness of model, choice of validation set, size of validation effort, and optimization of network architecture. They are difficult to overtrain, since evidence procedures provide an objective Bayesian criterion for stopping training. They are also difficult to overfit, because the BRANN calculates and trains on a number of effective network parameters or weights, effectively turning off those that are not relevant. This effective number is usually considerably smaller than the number of weights in a standard fully connected back-propagation neural net. Automatic relevance determination (ARD) of the input variables can be used with BRANNs, and this allows the network to "estimate" the importance of each input. The ARD method ensures that irrelevant or highly correlated indices used in the modeling are neglected as well as showing which are the most important variables for modeling the activity data. This chapter outlines the equations that define the BRANN method plus a flowchart for producing a BRANN-QSAR model. Some results of the use of BRANNs on a number of data sets are illustrated and compared with other linear and nonlinear models.
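
    The chapter's equations are not reproduced in this record. As an illustrative stand-in, the sketch below applies the same evidence-based machinery (re-estimating the regularization hyperparameters and computing the effective number of parameters) to a model that is linear in its parameters, which keeps the code short; it is not the BRANN implementation.

```python
# Evidence-based regularization for a linear-in-the-parameters model (Bayesian
# ridge): alpha and beta are re-estimated from the data, and gamma is the
# "effective number of parameters" mentioned in the abstract. Illustrative only.
import numpy as np

def bayes_ridge(Phi, t, n_iter=50):
    N, M = Phi.shape
    alpha, beta = 1e-3, 1.0
    for _ in range(n_iter):
        A = alpha * np.eye(M) + beta * Phi.T @ Phi
        m = beta * np.linalg.solve(A, Phi.T @ t)
        lam = np.linalg.eigvalsh(beta * Phi.T @ Phi)
        gamma = np.sum(lam / (lam + alpha))        # effective number of parameters
        alpha = gamma / (m @ m)                    # MacKay-style fixed-point updates
        beta = (N - gamma) / np.sum((t - Phi @ m) ** 2)
    return m, alpha, beta, gamma

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 40)
Phi = np.vander(x, 8, increasing=True)             # deliberately over-complete basis
t = np.sin(np.pi * x) + 0.1 * rng.standard_normal(40)
m, alpha, beta, gamma = bayes_ridge(Phi, t)
print("effective parameters:", round(gamma, 2), "of", Phi.shape[1])
```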

  18. Bayesian flood forecasting methods: A review

    Science.gov (United States)

    Han, Shasha; Coulibaly, Paulin

    2017-08-01

    Over the past few decades, floods have been among the most common and most widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, their negative impacts could be greatly reduced. It is widely recognized that quantifying and reducing the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied in flood forecasting from 1999 to the present. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting; it then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way to perform flood estimation: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, and thus gives more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. the ensemble Bayesian forecasting system and Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and to effectively reduce predictive uncertainty. In recent years, various Bayesian flood forecasting approaches have been

  19. Bayesian Sampling using Condition Indicators

    DEFF Research Database (Denmark)

    Faber, Michael H.; Sørensen, John Dalsgaard

    2002-01-01

    The problem of quality control of components is considered for the special case where the acceptable failure rate is low, the test costs are high, and it may be difficult or impossible to test the condition of interest directly. Based on classical control theory and the concept of condition indicators introduced by Benjamin and Cornell (1970), a Bayesian approach to quality control is formulated. The formulation is then extended to the case where the quality control is based on sampling of indirect information about the condition of the components, i.e. condition indicators...

  20. Bayesian versus frequentist upper limits

    CERN Document Server

    Röver, Christian; Prix, Reinhard

    2011-01-01

    While gravitational waves have not yet been measured directly, data analysis from detection experiments commonly includes an upper limit statement. Such upper limits may be derived via a frequentist or Bayesian approach; the theoretical implications are very different, and on the technical side, one notable difference is that one case requires maximization of the likelihood function over parameter space, while the other requires integration. Using a simple example (detection of a sinusoidal signal in white Gaussian noise), we investigate the differences in performance and interpretation, and the effect of the "trials factor", or "look-elsewhere effect".
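
    To make the maximisation-versus-integration contrast concrete, here is a deliberately simple toy problem, a single non-negative amplitude measured once with known Gaussian noise; the numbers are invented, and the paper's sinusoid-in-noise example involves more parameters but the same distinction.

```python
import numpy as np

x_hat, sigma, cred = 1.2, 1.0, 0.95      # single measurement of a non-negative amplitude

# Frequentist one-sided 95% upper limit (maximum-likelihood estimate plus 1.645 sigma).
freq_ul = x_hat + 1.645 * sigma

# Bayesian 95% upper limit: integrate the posterior under a flat prior on A >= 0.
grid = np.linspace(0.0, x_hat + 8 * sigma, 20000)
post = np.exp(-0.5 * ((grid - x_hat) / sigma) ** 2)   # likelihood x flat prior, unnormalised
cdf = np.cumsum(post)
cdf /= cdf[-1]
bayes_ul = grid[np.searchsorted(cdf, cred)]

print(f"frequentist UL = {freq_ul:.2f}, Bayesian UL = {bayes_ul:.2f}")
```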

  1. Numeracy, frequency, and Bayesian reasoning

    Directory of Open Access Journals (Sweden)

    Gretchen B. Chapman

    2009-02-01

    Full Text Available Previous research has demonstrated that Bayesian reasoning performance is improved if uncertainty information is presented as natural frequencies rather than single-event probabilities. A questionnaire study of 342 college students replicated this effect but also found that the performance-boosting benefits of the natural frequency presentation occurred primarily for participants who scored high in numeracy. This finding suggests that even comprehension and manipulation of natural frequencies requires a certain threshold of numeracy abilities, and that the beneficial effects of natural frequency presentation may not be as general as previously believed.
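
    As a reminder of what the two presentation formats look like, here is a tiny worked example with invented numbers (not an item from the questionnaire study); the natural-frequency framing reduces the Bayesian update to a simple count ratio.

```python
# Single-event-probability format: P(D)=0.01, P(+|D)=0.8, P(+|not D)=0.096.
p_d, sens, fpr = 0.01, 0.8, 0.096
posterior = (sens * p_d) / (sens * p_d + fpr * (1 - p_d))

# Natural-frequency format: of 1000 people, 10 have the condition and 8 of them
# test positive; of the 990 without it, about 95 test positive.
pos_with, pos_without = 8, 95
posterior_freq = pos_with / (pos_with + pos_without)

print(round(posterior, 3), round(posterior_freq, 3))   # ~0.078 in both framings
```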

  2. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    The Hawkes process is a practically and theoretically important class of point processes, but parameter estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional intensity function, while the second approach is based on an underlying clustering and branching structure in the Hawkes process. For practical use, MCMC (Markov chain Monte Carlo) methods are employed. The two approaches are compared numerically using three examples of the Hawkes process.
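
    The record does not state which kernel is used; assuming the common exponential kernel, the conditional-intensity route of the first approach leads to the log-likelihood sketched below, which an MCMC sampler would combine with a log-prior.

```python
# Log-likelihood of a Hawkes process via its conditional intensity, assuming an
# exponential kernel: lambda(t) = mu + sum_{t_i < t} a * exp(-b * (t - t_i)).
import numpy as np

def hawkes_loglik(times, T, mu, a, b):
    times = np.asarray(times)
    loglik = 0.0
    for i, t in enumerate(times):
        lam = mu + np.sum(a * np.exp(-b * (t - times[:i])))
        loglik += np.log(lam)
    # Compensator: the integral of lambda over [0, T] has a closed form here.
    loglik -= mu * T + (a / b) * np.sum(1 - np.exp(-b * (T - times)))
    return loglik

events = [0.5, 0.9, 2.3, 2.4, 4.1]   # invented event times on [0, 5]
print(hawkes_loglik(events, T=5.0, mu=0.5, a=0.8, b=1.5))
```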

  3. Bayesian Calibration of Simultaneity in Audiovisual Temporal Order Judgments

    Science.gov (United States)

    Yamamoto, Shinya; Miyazaki, Makoto; Iwano, Takayuki; Kitazawa, Shigeru

    2012-01-01

    After repeated exposures to two successive audiovisual stimuli presented in one frequent order, participants eventually perceive a pair separated by some lag time in the same order as occurring simultaneously (lag adaptation). In contrast, we previously found that perceptual changes occurred in the opposite direction in response to tactile stimuli, conforming to Bayesian integration theory (Bayesian calibration). We further showed, in theory, that the effect of Bayesian calibration cannot be observed when the lag adaptation was fully operational. This led to the hypothesis that Bayesian calibration affects judgments regarding the order of audiovisual stimuli, but that this effect is concealed behind the lag adaptation mechanism. In the present study, we showed that lag adaptation is pitch-insensitive using two sounds at 1046 and 1480 Hz. This enabled us to cancel lag adaptation by associating one pitch with sound-first stimuli and the other with light-first stimuli. When we presented each type of stimulus (high- or low-tone) in a different block, the point of simultaneity shifted to “sound-first” for the pitch associated with sound-first stimuli, and to “light-first” for the pitch associated with light-first stimuli. These results are consistent with lag adaptation. In contrast, when we delivered each type of stimulus in a randomized order, the point of simultaneity shifted to “light-first” for the pitch associated with sound-first stimuli, and to “sound-first” for the pitch associated with light-first stimuli. The results clearly show that Bayesian calibration is pitch-specific and is at work behind pitch-insensitive lag adaptation during temporal order judgment of audiovisual stimuli. PMID:22792297

  4. Bayesian calibration of simultaneity in audiovisual temporal order judgments.

    Directory of Open Access Journals (Sweden)

    Shinya Yamamoto

    Full Text Available After repeated exposures to two successive audiovisual stimuli presented in one frequent order, participants eventually perceive a pair separated by some lag time in the same order as occurring simultaneously (lag adaptation). In contrast, we previously found that perceptual changes occurred in the opposite direction in response to tactile stimuli, conforming to Bayesian integration theory (Bayesian calibration). We further showed, in theory, that the effect of Bayesian calibration cannot be observed when the lag adaptation was fully operational. This led to the hypothesis that Bayesian calibration affects judgments regarding the order of audiovisual stimuli, but that this effect is concealed behind the lag adaptation mechanism. In the present study, we showed that lag adaptation is pitch-insensitive using two sounds at 1046 and 1480 Hz. This enabled us to cancel lag adaptation by associating one pitch with sound-first stimuli and the other with light-first stimuli. When we presented each type of stimulus (high- or low-tone) in a different block, the point of simultaneity shifted to "sound-first" for the pitch associated with sound-first stimuli, and to "light-first" for the pitch associated with light-first stimuli. These results are consistent with lag adaptation. In contrast, when we delivered each type of stimulus in a randomized order, the point of simultaneity shifted to "light-first" for the pitch associated with sound-first stimuli, and to "sound-first" for the pitch associated with light-first stimuli. The results clearly show that Bayesian calibration is pitch-specific and is at work behind pitch-insensitive lag adaptation during temporal order judgment of audiovisual stimuli.

  5. A fully Bayesian strategy for high-dimensional hierarchical modeling using massively parallel computing

    OpenAIRE

    Landau, Will; Niemi, Jarad

    2016-01-01

    Markov chain Monte Carlo (MCMC) is the predominant tool used in Bayesian parameter estimation for hierarchical models. When the model expands due to an increasing number of hierarchical levels, number of groups at a particular level, or number of observations in each group, a fully Bayesian analysis via MCMC can easily become computationally demanding, even intractable. We illustrate how the steps in an MCMC for hierarchical models are predominantly one of two types: conditionally independent...

  6. Neural substrate of dynamic Bayesian inference in the cerebral cortex.

    Science.gov (United States)

    Funamizu, Akihiro; Kuhn, Bernd; Doya, Kenji

    2016-12-01

    Dynamic Bayesian inference allows a system to infer the environmental state under conditions of limited sensory observation. Using a goal-reaching task, we found that posterior parietal cortex (PPC) and adjacent posteromedial cortex (PM) implemented the two fundamental features of dynamic Bayesian inference: prediction of hidden states using an internal state transition model and updating the prediction with new sensory evidence. We optically imaged the activity of neurons in mouse PPC and PM layers 2, 3 and 5 in an acoustic virtual-reality system. As mice approached a reward site, anticipatory licking increased even when sound cues were intermittently presented; this was disturbed by PPC silencing. Probabilistic population decoding revealed that neurons in PPC and PM represented goal distances during sound omission (prediction), particularly in PPC layers 3 and 5, and prediction improved with the observation of cue sounds (updating). Our results illustrate how cerebral cortex realizes mental simulation using an action-dependent dynamic model.
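
    The two ingredients named in the abstract, prediction with an internal transition model and updating with new sensory evidence, are exactly the steps of a discrete Bayesian filter. The sketch below uses made-up transition and observation matrices over binned goal distances purely to illustrate that cycle; it is not the decoding analysis of the paper.

```python
# Minimal discrete Bayesian filter over binned goal distances (hypothetical models).
import numpy as np

n = 5                                   # distance bins, 0 = at the reward site
T = np.zeros((n, n))                    # transition model: move one bin closer per step
for s in range(n):
    T[max(s - 1, 0), s] = 1.0
O = 0.1 + 0.9 * np.eye(n)               # observation model: noisy cue about the bin
O /= O.sum(axis=0, keepdims=True)       # columns are P(cue | state)

belief = np.full(n, 1.0 / n)
for cue in [3, None, 1, None]:          # None = sound omitted (prediction only)
    belief = T @ belief                 # predict with the internal dynamics model
    if cue is not None:                 # update only when a cue sound is observed
        belief = O[cue] * belief
        belief /= belief.sum()
    print(np.round(belief, 2))
```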

  7. Learning dynamic Bayesian networks with mixed variables

    DEFF Research Database (Denmark)

    Bøttcher, Susanne Gammelgaard

    This paper considers dynamic Bayesian networks for discrete and continuous variables. We only treat the case, where the distribution of the variables is conditional Gaussian. We show how to learn the parameters and structure of a dynamic Bayesian network and also how the Markov order can be learned...

  8. Bayesian Joint Estimation of Binomial Proportions.

    Science.gov (United States)

    Viana, Marlos A. G.

    1991-01-01

    A Bayesian solution is suggested to the problem of jointly estimating "k is greater than 1" binomial parameters in conjunction with the problem of testing, in a Bayesian sense, the hypothesis "H" of parametric homogeneity. Applications of the estimates are illustrated with several types of data, including ophthalmological…
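
    The record gives no formulas, so the following is a generic conjugate sketch of the two tasks it names: jointly estimating k binomial proportions under Beta priors, and weighing the homogeneity hypothesis H with a simple Bayes factor. The priors and counts are hypothetical.

```python
from math import lgamma, exp

def log_beta(a, b):
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def log_marglik(xs, ns, a=1.0, b=1.0, common=False):
    # Binomial coefficients are identical under both hypotheses and cancel in the
    # Bayes factor, so they are omitted.
    if common:
        return log_beta(a + sum(xs), b + sum(ns) - sum(xs)) - log_beta(a, b)
    return sum(log_beta(a + x, b + n - x) - log_beta(a, b) for x, n in zip(xs, ns))

xs, ns = [18, 22, 9], [30, 30, 30]                 # successes / trials in k = 3 groups
post_means = [(1 + x) / (2 + n) for x, n in zip(xs, ns)]   # Beta(1,1) prior per group
bf_homogeneity = exp(log_marglik(xs, ns, common=True) - log_marglik(xs, ns))
print(post_means, round(bf_homogeneity, 3))
```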

  9. Bayesian inference in physics: case studies

    Science.gov (United States)

    Dose, V.

    2003-09-01

    This report describes the Bayesian approach to probability theory with emphasis on the application to the evaluation of experimental data. A brief summary of Bayesian principles is given, with a discussion of concepts, terminology and pitfalls. The step from Bayesian principles to data processing involves major numerical efforts. We address the presently employed procedures of numerical integration, which are mainly based on the Monte Carlo method. The case studies include examples from electron spectroscopies, plasma physics, ion beam analysis and mass spectrometry. Bayesian solutions to the ubiquitous problem of spectrum restoration are presented and advantages and limitations are discussed. Parameter estimation within the Bayesian framework is shown to allow for the incorporation of expert knowledge which in turn allows the treatment of under-determined problems which are inaccessible by the traditional maximum likelihood method. A unique and extremely valuable feature of Bayesian theory is the model comparison option. Bayesian model comparison rests on Ockham's razor which limits the complexity of a model to the amount necessary to explain the data without fitting noise. Finally we deal with the treatment of inconsistent data. They arise frequently in experimental work either from incorrect estimation of the errors associated with a measurement or alternatively from distortions of the measurement signal by some unrecognized spurious source. Bayesian data analysis sometimes meets with spectacular success. However, the approach cannot do wonders, but it does result in optimal robust inferences on the basis of all available and explicitly declared information.
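
    To make the model-comparison option and its built-in Ockham's razor concrete, the sketch below compares a fixed-zero-mean Gaussian model with one whose mean is free under a wide prior; the evidence for the latter is obtained by simple numerical integration (the report's case studies use Monte Carlo integration for higher-dimensional analogues). All numbers are invented for illustration.

```python
import numpy as np

def log_evidence(y, sigma=1.0, tau=10.0, fixed_mean=None):
    # M0: the mean is fixed, so the evidence is just the likelihood at that value.
    if fixed_mean is not None:
        return float(np.sum(-0.5 * ((y - fixed_mean) / sigma) ** 2
                            - 0.5 * np.log(2 * np.pi * sigma ** 2)))
    # M1: integrate the unknown mean out over its wide Gaussian prior (grid rule).
    mu = np.linspace(-5 * tau, 5 * tau, 20001)
    loglik = np.array([np.sum(-0.5 * ((y - m) / sigma) ** 2) for m in mu])
    loglik -= 0.5 * len(y) * np.log(2 * np.pi * sigma ** 2)
    logprior = -0.5 * (mu / tau) ** 2 - 0.5 * np.log(2 * np.pi * tau ** 2)
    integrand = loglik + logprior
    peak = integrand.max()
    return float(peak + np.log(np.sum(np.exp(integrand - peak)) * (mu[1] - mu[0])))

y = np.random.default_rng(1).normal(0.2, 1.0, 20)   # data only slightly away from zero
print("log Bayes factor (M1 vs M0):",
      log_evidence(y) - log_evidence(y, fixed_mean=0.0))
```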

  10. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating and ...

  11. Bayesian learning theory applied to human cognition.

    Science.gov (United States)

    Jacobs, Robert A; Kruschke, John K

    2011-01-01

    Probabilistic models based on Bayes' rule are an increasingly popular approach to understanding human cognition. Bayesian models allow immense representational latitude and complexity. Because they use normative Bayesian mathematics to process those representations, they define optimal performance on a given task. This article focuses on key mechanisms of Bayesian information processing, and provides numerous examples illustrating Bayesian approaches to the study of human cognition. We start by providing an overview of Bayesian modeling and Bayesian networks. We then describe three types of information processing operations-inference, parameter learning, and structure learning-in both Bayesian networks and human cognition. This is followed by a discussion of the important roles of prior knowledge and of active learning. We conclude by outlining some challenges for Bayesian models of human cognition that will need to be addressed by future research. WIREs Cogn Sci 2011 2 8-21 DOI: 10.1002/wcs.80 For further resources related to this article, please visit the WIREs website. Copyright © 2010 John Wiley & Sons, Ltd.

  12. Bayesian Decision Theoretical Framework for Clustering

    Science.gov (United States)

    Chen, Mo

    2011-01-01

    In this thesis, we establish a novel probabilistic framework for the data clustering problem from the perspective of Bayesian decision theory. The Bayesian decision theory view justifies the important questions: what is a cluster and what a clustering algorithm should optimize. We prove that the spectral clustering (to be specific, the…

  13. Using Bayesian Networks to Improve Knowledge Assessment

    Science.gov (United States)

    Millan, Eva; Descalco, Luis; Castillo, Gladys; Oliveira, Paula; Diogo, Sandra

    2013-01-01

    In this paper, we describe the integration and evaluation of an existing generic Bayesian student model (GBSM) into an existing computerized testing system within the Mathematics Education Project (PmatE--Projecto Matematica Ensino) of the University of Aveiro. This generic Bayesian student model had been previously evaluated with simulated…

  14. Bayesian unit root tests and marginal likelihood

    NARCIS (Netherlands)

    de Vos, A.F.; Francke, M.K.

    2008-01-01

    Unit root tests based on classical marginal likelihood are practically uniformly most powerful (Francke and de Vos, 2007). Bayesian unit root tests can be constructed that are very similar; however, in the Bayesian analysis the classical size is determined by prior considerations. A fundamental

  15. Modeling Diagnostic Assessments with Bayesian Networks

    Science.gov (United States)

    Almond, Russell G.; DiBello, Louis V.; Moulder, Brad; Zapata-Rivera, Juan-Diego

    2007-01-01

    This paper defines Bayesian network models and examines their applications to IRT-based cognitive diagnostic modeling. These models are especially suited to building inference engines designed to be synchronous with the finer grained student models that arise in skills diagnostic assessment. Aspects of the theory and use of Bayesian network models…

  16. Particle identification in ALICE: a Bayesian approach

    NARCIS (Netherlands)

    Adam, J.; Adamova, D.; Aggarwal, M. M.; Rinella, G. Aglieri; Agnello, M.; Agrawal, N.; Ahammed, Z.; Ahn, S. U.; Aiola, S.; Akindinov, A.; Alam, S. N.; Albuquerque, D. S. D.; Aleksandrov, D.; Alessandro, B.; Alexandre, D.; Alfaro Molina, R.; Alici, A.; Alkin, A.; Almaraz, J. R. M.; Alme, J.; Alt, T.; Altinpinar, S.; Altsybeev, I.; Alves Garcia Prado, C.; Andrei, C.; Andronic, A.; Anguelov, V.; Anticic, T.; Antinori, F.; Antonioli, P.; Aphecetche, L.; Appelshaeuser, H.; Arcelli, S.; Arnaldi, R.; Arnold, O. W.; Arsene, I. C.; Arslandok, M.; Audurier, B.; Augustinus, A.; Averbeck, R.; Azmi, M. D.; Badala, A.; Baek, Y. W.; Bagnasco, S.; Bailhache, R.; Bala, R.; Balasubramanian, S.; Baldisseri, A.; Baral, R. C.; Barbano, A. M.; Barbera, R.; Barile, F.; Barnafoeldi, G. G.; Barnby, L. S.; Barret, V.; Bartalini, P.; Barth, K.; Bartke, J.; Bartsch, E.; Basile, M.; Bastid, N.; Bathen, B.; Batigne, G.; Camejo, A. Batista; Batyunya, B.; Batzing, P. C.; Bearden, I. G.; Beck, H.; Bedda, C.; Behera, N. K.; Belikov, I.; Bellini, F.; Bello Martinez, H.; Bellwied, R.; Belmont, R.; Belmont-Moreno, E.; Belyaev, V.; Benacek, P.; Bencedi, G.; Beole, S.; Berceanu, I.; Bercuci, A.; Berdnikov, Y.; Berenyi, D.; Bertens, R. A.; Berzano, D.; Betev, L.; Bhasin, A.; Bhat, I. R.; Bhati, A. K.; Bhattacharjee, B.; Bhom, J.; Bianchi, L.; Bianchi, N.; Bianchin, C.; Bielcik, J.; Bielcikova, J.; Bilandzic, A.; Biro, G.; Biswas, R.; Biswas, S.; Bjelogrlic, S.; Blair, J. T.; Blau, D.; Blume, C.; Bock, F.; Bogdanov, A.; Boggild, H.; Boldizsar, L.; Bombara, M.; Book, J.; Borel, H.; Borissov, A.; Borri, M.; Bossu, F.; Botta, E.; Bourjau, C.; Braun-Munzinger, P.; Bregant, M.; Breitner, T.; Broker, T. A.; Browning, T. A.; Broz, M.; Brucken, E. J.; Bruna, E.; Bruno, G. E.; Budnikov, D.; Buesching, H.; Bufalino, S.; Buncic, P.; Busch, O.; Buthelezi, Z.; Butt, J. B.; Buxton, J. T.; Cabala, J.; Caffarri, D.; Cai, X.; Caines, H.; Diaz, L. Calero; Caliva, A.; Calvo Villar, E.; Camerini, P.; Carena, F.; Carena, W.; Carnesecchi, F.; Castellanos, J. Castillo; Castro, A. J.; Casula, E. A. R.; Sanchez, C. Ceballos; Cepila, J.; Cerello, P.; Cerkala, J.; Chang, B.; Chapeland, S.; Chartier, M.; Charvet, J. L.; Chattopadhyay, S.; Chattopadhyay, S.; Chauvin, A.; Chelnokov, V.; Cherney, M.; Cheshkov, C.; Cheynis, B.; Barroso, V. Chibante; Chinellato, D. D.; Cho, S.; Chochula, P.; Choi, K.; Chojnacki, M.; Choudhury, S.; Christakoglou, P.; Christensen, C. H.; Christiansen, P.; Chujo, T.; Cicalo, C.; Cifarelli, L.; Cindolo, F.; Cleymans, J.; Colamaria, F.; Colella, D.; Collu, A.; Colocci, M.; Balbastre, G. Conesa; del Valle, Z. Conesa; Connors, M. E.; Contreras, J. G.; Cormier, T. M.; Morales, Y. Corrales; Cortes Maldonado, I.; Cortese, P.; Cosentino, M. R.; Costa, F.; Crochet, P.; Cruz Albino, R.; Cuautle, E.; Cunqueiro, L.; Dahms, T.; Dainese, A.; Danisch, M. C.; Danu, A.; Das, I.; Das, S.; Dash, A.; Dash, S.; De, S.; De Caro, A.; de Cataldo, G.; de Conti, C.; de Cuveland, J.; De Falco, A.; De Gruttola, D.; De Marco, N.; De Pasquale, S.; Deisting, A.; Deloff, A.; Denes, E.; Deplano, C.; Dhankher, P.; Di Bari, D.; Di Mauro, A.; Di Nezza, P.; Corchero, M. A. Diaz; Dietel, T.; Dillenseger, P.; Divia, R.; Djuvsland, O.; Dobrin, A.; Gimenez, D. Domenicis; Doenigus, B.; Dordic, O.; Drozhzhova, T.; Dubey, A. K.; Dubla, A.; Ducroux, L.; Dupieux, P.; Ehlers, R. 
J.; Elia, D.; Endress, E.; Engel, H.; Epple, E.; Erazmus, B.; Erdemir, I.; Erhardt, F.; Espagnon, B.; Estienne, M.; Esumi, S.; Eum, J.; Evans, D.; Evdokimov, S.; Eyyubova, G.; Fabbietti, L.; Fabris, D.; Faivre, J.; Fantoni, A.; Fasel, M.; Feldkamp, L.; Feliciello, A.; Feofilov, G.; Ferencei, J.; Fernandez Tellez, A.; Ferreiro, E. G.; Ferretti, A.; Festanti, A.; Feuillard, V. J. G.; Figiel, J.; Figueredo, M. A. S.; Filchagin, S.; Finogeev, D.; Fionda, F. M.; Fiore, E. M.; Fleck, M. G.; Floris, M.; Foertsch, S.; Foka, P.; Fokin, S.; Fragiacomo, E.; Francescon, A.; Frankenfeld, U.; Fronze, G. G.; Fuchs, U.; Furget, C.; Furs, A.; Girard, M. Fusco; Gaardhoje, J. J.; Gagliardi, M.; Gago, A. M.; Gallio, M.; Gangadharan, D. R.; Ganoti, P.; Gao, C.; Garabatos, C.; Garcia-Solis, E.; Gargiulo, C.; Gasik, P.; Gauger, E. F.; Germain, M.; Gheata, A.; Gheata, M.; Gianotti, P.; Giubellino, P.; Giubilato, P.; Gladysz-Dziadus, E.; Glaessel, P.; Gomez Coral, D. M.; Ramirez, A. Gomez; Gonzalez, A. S.; Gonzalez, V.; Gonzalez-Zamora, P.; Gorbunov, S.; Goerlich, L.; Gotovac, S.; Grabski, V.; Grachov, O. A.; Graczykowski, L. K.; Graham, K. L.; Grelli, A.; Grigoras, A.; Grigoras, C.; Grigoriev, V.; Grigoryan, A.; Grigoryan, S.; Grinyov, B.; Grion, N.; Gronefeld, J. M.; Grosse-Oetringhaus, J. F.; Grosso, R.; Guber, F.; Guernane, R.; Guerzoni, B.; Gulbrandsen, K.; Gunji, T.; Gupta, A.; Haake, R.; Haaland, O.; Hadjidakis, C.; Haiduc, M.; Hamagaki, H.; Hamar, G.; Hamon, J. C.; Harris, J. W.; Harton, A.; Hatzifotiadou, D.; Hayashi, S.; Heckel, S. T.; Hellbaer, E.; Helstrup, H.; Herghelegiu, A.; Herrera Corral, G.; Hess, B. A.; Hetland, K. F.; Hillemanns, H.; Hippolyte, B.; Horak, D.; Hosokawa, R.; Hristov, P.; Humanic, T. J.; Hussain, N.; Hussain, T.; Hutter, D.; Hwang, D. S.; Ilkaev, R.; Inaba, M.; Incani, E.; Ippolitov, M.; Irfan, M.; Ivanov, M.; Ivanov, V.; Izucheev, V.; Jacazio, N.; Jadhav, M. B.; Jadlovska, S.; Jadlovsky, J.; Jahnke, C.; Jakubowska, M. J.; Jang, H. J.; Janik, M. A.; Jayarathna, P. H. S. Y.; Jena, C.; Jena, S.; Bustamante, R. T. Jimenez; Jones, P. G.; Jusko, A.; Kalinak, P.; Kalweit, A.; Kamin, J.; Kaplin, V.; Kar, S.; Uysal, A. Karasu; Karavichev, O.; Karavicheva, T.; Karayan, L.; Karpechev, E.; Kebschull, U.; Keidel, R.; Keijdener, D. L. D.; Keil, M.; Khan, M. Mohisin; Khan, P.; Khan, S. A.; Khanzadeev, A.; Kharlov, Y.; Kileng, B.; Kim, D. W.; Kim, D. J.; Kim, D.; Kim, J. S.; Kim, M.; Kim, T.; Kirsch, S.; Kisel, I.; Kiselev, S.; Kisiel, A.; Kiss, G.; Klay, J. L.; Klein, C.; Klein-Boesing, C.; Klewin, S.; Kluge, A.; Knichel, M. L.; Knospe, A. G.; Kobdaj, C.; Kofarago, M.; Kollegger, T.; Kolojvari, A.; Kondratiev, V.; Kondratyeva, N.; Kondratyuk, E.; Konevskikh, A.; Kopcik, M.; Kostarakis, P.; Kour, M.; Kouzinopoulos, C.; Kovalenko, O.; Kovalenko, V.; Kowalski, M.; Meethaleveedu, G. Koyithatta; Kralik, I.; Kravcakova, A.; Krivda, M.; Krizek, F.; Kryshen, E.; Krzewicki, M.; Kubera, A. M.; Kucera, V.; Kuijer, P. G.; Kumar, J.; Kumar, L.; Kumar, S.; Kurashvili, P.; Kurepin, A.; Kurepin, A. B.; Kuryakin, A.; Kweon, M. J.; Kwon, Y.; La Pointe, S. L.; La Rocca, P.; Ladron de Guevara, P.; Lagana Fernandes, C.; Lakomov, I.; Langoy, R.; Lara, C.; Lardeux, A.; Lattuca, A.; Laudi, E.; Lea, R.; Leardini, L.; Lee, G. R.; Lee, S.; Lehas, F.; Lemmon, R. C.; Lenti, V.; Leogrande, E.; Monzon, I. Leon; Leon Vargas, H.; Leoncino, M.; Levai, P.; Lien, J.; Lietava, R.; Lindal, S.; Lindenstruth, V.; Lippmann, C.; Lisa, M. A.; Ljunggren, H. M.; Lodato, D. F.; Loenne, P. 
I.; Loginov, V.; Loizides, C.; Lopez, X.; Torres, E. Lopez; Lowe, A.; Luettig, P.; Lunardon, M.; Luparello, G.; Lutz, T. H.; Maevskaya, A.; Mager, M.; Mahajan, S.; Mahmood, S. M.; Maire, A.; Majka, R. D.; Malaev, M.; Maldonado Cervantes, I.; Malinina, L.; Mal'Kevich, D.; Malzacher, P.; Mamonov, A.; Manko, V.; Manso, F.; Manzari, V.; Marchisone, M.; Mares, J.; Margagliotti, G. V.; Margotti, A.; Margutti, J.; Marin, A.; Markert, C.; Marquard, M.; Martin, N. A.; Blanco, J. Martin; Martinengo, P.; Martinez, M. I.; Garcia, G. Martinez; Pedreira, M. Martinez; Mas, A.; Masciocchi, S.; Masera, M.; Masoni, A.; Mastroserio, A.; Matyja, A.; Mayer, C.; Mazer, J.; Mazzoni, M. A.; Mcdonald, D.; Meddi, F.; Melikyan, Y.; Menchaca-Rocha, A.; Meninno, E.; Perez, J. Mercado; Meres, M.; Miake, Y.; Mieskolainen, M. M.; Mikhaylov, K.; Milano, L.; Milosevic, J.; Mischke, A.; Mishra, A. N.; Miskowiec, D.; Mitra, J.; Mitu, C. M.; Mohammadi, N.; Mohanty, B.; Molnar, L.; Montano Zetina, L.; Montes, E.; De Godoy, D. A. Moreira; Moreno, L. A. P.; Moretto, S.; Morreale, A.; Morsch, A.; Muccifora, V.; Mudnic, E.; Muehlheim, D.; Muhuri, S.; Mukherjee, M.; Mulligan, J. D.; Munhoz, M. G.; Munzer, R. H.; Murakami, H.; Murray, S.; Musa, L.; Musinsky, J.; Naik, B.; Nair, R.; Nandi, B. K.; Nania, R.; Nappi, E.; Naru, M. U.; Natal da Luz, H.; Nattrass, C.; Navarro, S. R.; Nayak, K.; Nayak, R.; Nayak, T. K.; Nazarenko, S.; Nedosekin, A.; Nellen, L.; Ng, F.; Nicassio, M.; Niculescu, M.; Niedziela, J.; Nielsen, B. S.; Nikolaev, S.; Nikulin, S.; Nikulin, V.; Noferini, F.; Nomokonov, P.; Nooren, G.; Noris, J. C. C.; Norman, J.; Nyanin, A.; Nystrand, J.; Oeschler, H.; Oh, S.; Oh, S. K.; Ohlson, A.; Okatan, A.; Okubo, T.; Olah, L.; Oleniacz, J.; Oliveira Da Silva, A. C.; Oliver, M. H.; Onderwaater, J.; Oppedisano, C.; Orava, R.; Oravec, M.; Ortiz Velasquez, A.; Oskarsson, A.; Otwinowski, J.; Oyama, K.; Ozdemir, M.; Pachmayer, Y.; Pagano, D.; Pagano, P.; Paic, G.; Pal, S. K.; Pan, J.; Papikyan, V.; Pappalardo, G. S.; Pareek, P.; Park, W. J.; Parmar, S.; Passfeld, A.; Paticchio, V.; Patra, R. N.; Paul, B.; Pei, H.; Peitzmann, T.; Da Costa, H. Pereira; Peresunko, D.; Lara, C. E. Perez; Lezama, E. Perez; Peskov, V.; Pestov, Y.; Petracek, V.; Petrov, V.; Petrovici, M.; Petta, C.; Piano, S.; Pikna, M.; Pillot, P.; Pimentel, L. O. D. L.; Pinazza, O.; Pinsky, L.; Piyarathna, D. B.; Ploskon, M.; Planinic, M.; Pluta, J.; Pochybova, S.; Podesta-Lerma, P. L. M.; Poghosyan, M. G.; Polichtchouk, B.; Poljak, N.; Poonsawat, W.; Pop, A.; Porteboeuf-Houssais, S.; Porter, J.; Pospisil, J.; Prasad, S. K.; Preghenella, R.; Prino, F.; Pruneau, C. A.; Pshenichnov, I.; Puccio, M.; Puddu, G.; Pujahari, P.; Punin, V.; Putschke, J.; Qvigstad, H.; Rachevski, A.; Raha, S.; Rajput, S.; Rak, J.; Rakotozafindrabe, A.; Ramello, L.; Rami, F.; Raniwala, R.; Raniwala, S.; Raesaenen, S. S.; Rascanu, B. T.; Rathee, D.; Read, K. F.; Redlich, K.; Reed, R. J.; Reichelt, P.; Reidt, F.; Ren, X.; Renfordt, R.; Reolon, A. R.; Reshetin, A.; Reygers, K.; Riabov, V.; Ricci, R. A.; Richert, T.; Richter, M.; Riedler, P.; Riegler, W.; Riggi, F.; Ristea, C.; Rocco, E.; Rodriguez Cahuantzi, M.; Manso, A. Rodriguez; Roed, K.; Rogochaya, E.; Rohr, D.; Roehrich, D.; Ronchetti, F.; Ronflette, L.; Rosnet, P.; Rossi, A.; Roukoutakis, F.; Roy, A.; Roy, C.; Roy, P.; Montero, A. J. Rubio; Rui, R.; Russo, R.; Ryabinkin, E.; Ryabov, Y.; Rybicki, A.; Saarinen, S.; Sadhu, S.; Sadovsky, S.; Safarik, K.; Sahlmuller, B.; Sahoo, P.; Sahoo, R.; Sahoo, S.; Sahu, P. K.; Saini, J.; Sakai, S.; Saleh, M. 
A.; Salzwedel, J.; Sambyal, S.; Samsonov, V.; Sandor, L.; Sandoval, A.; Sano, M.; Sarkar, D.; Sarkar, N.; Sarma, P.; Scapparone, E.; Scarlassara, F.; Schiaua, C.; Schicker, R.; Schmidt, C.; Schmidt, H. R.; Schuchmann, S.; Schukraft, J.; Schulc, M.; Schutz, Y.; Schwarz, K.; Schweda, K.; Scioli, G.; Scomparin, E.; Scott, R.; Sefcik, M.; Seger, J. E.; Sekiguchi, Y.; Sekihata, D.; Selyuzhenkov, I.; Senosi, K.; Senyukov, S.; Serradilla, E.; Sevcenco, A.; Shabanov, A.; Shabetai, A.; Shadura, O.; Shahoyan, R.; Shahzad, M. I.; Shangaraev, A.; Sharma, M.; Sharma, M.; Sharma, N.; Sheikh, A. I.; Shigaki, K.; Shou, Q.; Shtejer, K.; Sibiriak, Y.; Siddhanta, S.; Sielewicz, K. M.; Siemiarczuk, T.; Silvermyr, D.; Silvestre, C.; Simatovic, G.; Simonetti, G.; Singaraju, R.; Singh, R.; Singha, S.; Singhal, V.; Sinha, B. C.; Sinha, T.; Sitar, B.; Sitta, M.; Skaali, T. B.; Slupecki, M.; Smirnov, N.; Snellings, R. J. M.; Snellman, T. W.; Song, J.; Song, M.; Song, Z.; Soramel, F.; Sorensen, S.; de Souza, R. D.; Sozzi, F.; Spacek, M.; Spiriti, E.; Sputowska, I.; Spyropoulou-Stassinaki, M.; Stachel, J.; Stan, I.; Stankus, P.; Stenlund, E.; Steyn, G.; Stiller, J. H.; Stocco, D.; Strmen, P.; Suaide, A. A. P.; Sugitate, T.; Suire, C.; Suleymanov, M.; Suljic, M.; Sultanov, R.; Sumbera, M.; Sumowidagdo, S.; Szabo, A.; Szanto de Toledo, A.; Szarka, I.; Szczepankiewicz, A.; Szymanski, M.; Tabassam, U.; Takahashi, J.; Tambave, G. J.; Tanaka, N.; Tarhini, M.; Tariq, M.; Tarzila, M. G.; Tauro, A.; Tejeda Munoz, G.; Telesca, A.; Terasaki, K.; Terrevoli, C.; Teyssier, B.; Thaeder, J.; Thakur, D.; Thomas, D.; Tieulent, R.; Timmins, A. R.; Toia, A.; Trogolo, S.; Trombetta, G.; Trubnikov, V.; Trzaska, W. H.; Tsuji, T.; Tumkin, A.; Turrisi, R.; Tveter, T. S.; Ullaland, K.; Uras, A.; Usai, G. L.; Utrobicic, A.; Vala, M.; Palomo, L. Valencia; Vallero, S.; Van Der Maarel, J.; Van Hoorne, J. W.; van Leeuwen, M.; Vanat, T.; Vyvre, P. Vande; Varga, D.; Vargas, A.; Vargyas, M.; Varma, R.; Vasileiou, M.; Vasiliev, A.; Vauthier, A.; Vechernin, V.; Veen, A. M.; Veldhoen, M.; Velure, A.; Vercellin, E.; Vergara Limon, S.; Vernet, R.; Verweij, M.; Vickovic, L.; Viesti, G.; Viinikainen, J.; Vilakazi, Z.; Baillie, O. Villalobos; Villatoro Tello, A.; Vinogradov, A.; Vinogradov, L.; Vinogradov, Y.; Virgili, T.; Vislavicius, V.; Viyogi, Y. P.; Vodopyanov, A.; Voelkl, M. A.; Voloshin, K.; Voloshin, S. A.; Volpe, G.; von Haller, B.; Vorobyev, I.; Vranic, D.; Vrlakova, J.; Vulpescu, B.; Wagner, B.; Wagner, J.; Wang, H.; Watanabe, D.; Watanabe, Y.; Weiser, D. F.; Westerhoff, U.; Whitehead, A. M.; Wiechula, J.; Wikne, J.; Wilk, G.; Wilkinson, J.; Williams, M. C. S.; Windelband, B.; Winn, M.; Yang, H.; Yano, S.; Yasin, Z.; Yokoyama, H.; Yoo, I. -K.; Yoon, J. H.; Yurchenko, V.; Yushmanov, I.; Zaborowska, A.; Zaccolo, V.; Zaman, A.; Zampolli, C.; Zanoli, H. J. C.; Zaporozhets, S.; Zardoshti, N.; Zarochentsev, A.; Zavada, P.; Zaviyalov, N.; Zbroszczyk, H.; Zgura, I. S.; Zhalov, M.; Zhang, C.; Zhao, C.; Zhigareva, N.; Zhou, Y.; Zhou, Z.; Zhu, H.; Zichichi, A.; Zimmermann, A.; Zimmermann, M. B.; Zinovjev, G.; Zyzak, M.; Collaboration, ALICE

    2016-01-01

    We present a Bayesian approach to particle identification (PID) within the ALICE experiment. The aim is to more effectively combine the particle identification capabilities of its various detectors. After a brief explanation of the adopted methodology and formalism, the performance of the Bayesian
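
    The abstract is truncated in this record. As a purely schematic illustration of the approach it describes, the sketch below multiplies per-detector likelihoods with species priors and normalises them into posterior species probabilities; the detector labels, expected responses, resolutions and priors are invented for illustration and are not ALICE's response functions.

```python
import math

SPECIES = ("pion", "kaon", "proton")
PRIORS = {"pion": 0.80, "kaon": 0.12, "proton": 0.08}          # assumed abundances

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Expected detector signals (in units of the detector resolution) per hypothesis.
EXPECTED = {"TPC_dEdx": {"pion": 0.0, "kaon": 2.5, "proton": 4.0},
            "TOF_beta": {"pion": 0.0, "kaon": 1.8, "proton": 3.5}}

def posterior(measured):
    weights = {s: PRIORS[s] * math.prod(gauss(measured[d], EXPECTED[d][s], 1.0)
                                        for d in measured) for s in SPECIES}
    z = sum(weights.values())
    return {s: w / z for s, w in weights.items()}

print(posterior({"TPC_dEdx": 2.2, "TOF_beta": 1.5}))            # a kaon-like track
```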

  17. Using Bayesian belief networks in adaptive management.

    Science.gov (United States)

    J.B. Nyberg; B.G. Marcot; R. Sulyma

    2006-01-01

    Bayesian belief and decision networks are relatively new modeling methods that are especially well suited to adaptive-management applications, but they appear not to have been widely used in adaptive management to date. Bayesian belief networks (BBNs) can serve many purposes for practitioners of adaptive management, from illustrating system relations conceptually to...

  18. A guide to Bayesian model checking for ecologists

    OpenAIRE

    Conn, Paul; Johnson, Devin; Williams, Perry; Melin, Sharon; Hooten, Mevin

    2017-01-01

    Checking that models adequately represent data is an essential component of applied statistical inference. Ecologists increasingly use hierarchical Bayesian statistical models in their research. The appeal of this modeling paradigm is undeniable, as researchers can build and fit models that embody complex ecological processes while simultaneously controlling observation error. However, ecologists tend to be less focused on checking model assumptions and assessing potential lack-of-fit when ap...
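
    A minimal version of the kind of check the guide is concerned with is the posterior predictive check sketched below: draw parameters from the posterior, simulate replicate data sets, and compare a discrepancy statistic with its observed value. The Poisson count model, conjugate prior and dispersion statistic are hypothetical choices, not one of the authors' case studies.

```python
import numpy as np

rng = np.random.default_rng(42)
y = rng.poisson(4.0, 50)                      # "observed" counts

# Conjugate Gamma(1, 1) prior gives a Gamma posterior for the Poisson rate.
post_shape, post_rate = 1 + y.sum(), 1 + len(y)

obs_stat = y.var()                            # dispersion, a common lack-of-fit target
n_extreme, n_rep = 0, 2000
for _ in range(n_rep):
    lam = rng.gamma(post_shape, 1 / post_rate)
    y_rep = rng.poisson(lam, len(y))
    n_extreme += y_rep.var() >= obs_stat
print("posterior predictive p-value:", n_extreme / n_rep)
```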

  19. Sharp transitions in low-number quantum dots Bayesian magnetometry

    Science.gov (United States)

    Mazurek, Paweł; Horodecki, Michał; Czekaj, Łukasz; Horodecki, Paweł

    2016-09-01

    We consider Bayesian estimation of a static magnetic field, characterized by a prior Gaussian probability distribution, in systems of a few electron quantum dot spins interacting with an infinite-temperature spin environment via the hyperfine interaction. Sudden transitions among optimal states and measurements are observed. The usefulness of measuring occupation levels is shown for all times of the evolution, together with the role of entanglement in the optimal scenario. For low values of the magnetic field, memory effects stemming from the interaction with the environment provide a limited metrological advantage.

  20. BELM: Bayesian extreme learning machine.

    Science.gov (United States)

    Soria-Olivas, Emilio; Gómez-Sanchis, Juan; Martín, José D; Vila-Francés, Joan; Martínez, Marcelino; Magdalena, José R; Serrano, Antonio J

    2011-03-01

    The theory of the extreme learning machine (ELM) has become very popular in the last few years. ELM is a new approach for learning the parameters of the hidden layers of a multilayer neural network (such as the multilayer perceptron or the radial basis function neural network). Its main advantage is the lower computational cost, which is especially relevant when dealing with many patterns defined in a high-dimensional space. This brief proposes a Bayesian approach to ELM, which presents some advantages over other approaches: it allows the introduction of a priori knowledge; it obtains confidence intervals (CIs) without the need for computationally intensive methods such as the bootstrap; and it presents high generalization capabilities. Bayesian ELM is benchmarked against classical ELM on several artificial and real datasets that are widely used for the evaluation of machine learning algorithms. The results show that the proposed approach produces competitive accuracy with some additional advantages, namely automatic production of CIs, reduced probability of model overfitting, and use of a priori knowledge.
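
    A compressed sketch of the idea, with fixed hyperparameters and invented data rather than the authors' benchmarks: an ELM's random, untrained hidden layer followed by Bayesian linear regression on the hidden activations, which yields a predictive mean and a confidence interval.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, (80, 1))
y = np.sin(x[:, 0]) + 0.1 * rng.standard_normal(80)

H = 30                                             # hidden units; input weights stay random
W, b = rng.standard_normal((1, H)), rng.standard_normal(H)
def hidden(u):                                     # the ELM's random feature map
    return np.tanh(u @ W + b)

alpha, beta = 1e-2, 100.0                          # prior precision, noise precision (fixed)
Phi = hidden(x)
A = alpha * np.eye(H) + beta * Phi.T @ Phi
S = np.linalg.inv(A)                               # posterior covariance of output weights
m = beta * S @ Phi.T @ y                           # posterior mean of output weights

phi = hidden(np.array([[0.5]]))                    # predict at a new input
mean = (phi @ m).item()
std = (1 / beta + (phi @ S @ phi.T).item()) ** 0.5
print(mean, mean - 1.96 * std, mean + 1.96 * std)  # prediction with a 95% CI
```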

  1. Bayesian natural language semantics and pragmatics

    CERN Document Server

    Zeevat, Henk

    2015-01-01

    The contributions in this volume focus on the Bayesian interpretation of natural languages, which is widely used in areas of artificial intelligence, cognitive science, and computational linguistics. This is the first volume to take up topics in Bayesian Natural Language Interpretation and make proposals based on information theory, probability theory, and related fields. The methodologies offered here extend to the target semantic and pragmatic analyses of computational natural language interpretation. Bayesian approaches to natural language semantics and pragmatics are based on methods from signal processing and on the causal Bayesian models pioneered especially by Pearl. In signal processing, the Bayesian method finds the most probable interpretation by finding the one that maximizes the product of the prior probability and the likelihood of the interpretation. It thus stresses the importance of a production model for interpretation, as in Grice's contributions to pragmatics or in interpretation by abduction.

  2. Hepatitis disease detection using Bayesian theory

    Science.gov (United States)

    Maseleno, Andino; Hidayati, Rohmah Zahroh

    2017-02-01

    This paper presents hepatitis disease diagnosis using Bayesian theory, with the aim of providing a better understanding of the theory. In this research, we use Bayesian reasoning to detect hepatitis disease and to display the result of the diagnosis process. Bayesian theory was rediscovered and perfected by Laplace; the basic idea is to use known prior probabilities and conditional probability densities, apply Bayes' theorem to calculate the corresponding posterior probability, and then use the obtained posterior probability for inference and decision making. Bayesian methods combine existing knowledge, the prior probabilities, with additional knowledge derived from new data, the likelihood function. The initial symptoms of hepatitis include malaise, fever and headache, and we compute the probability of hepatitis given the presence of malaise, fever and headache. The results reveal that the Bayesian approach successfully identified the existence of hepatitis disease.
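
    A worked version of the paper's example, with hypothetical symptom probabilities and prior (the record does not report the actual numbers), is the small naive Bayes update below.

```python
# Naive Bayes update for hepatitis given malaise, fever and headache (invented numbers).
P_HEP = 0.01                                        # assumed prior
P_SYM_GIVEN_HEP = {"malaise": 0.80, "fever": 0.70, "headache": 0.60}
P_SYM_GIVEN_NOT = {"malaise": 0.20, "fever": 0.10, "headache": 0.30}

def posterior(symptoms):
    like_h, like_n = P_HEP, 1 - P_HEP
    for s in symptoms:                              # symptoms assumed independent given class
        like_h *= P_SYM_GIVEN_HEP[s]
        like_n *= P_SYM_GIVEN_NOT[s]
    return like_h / (like_h + like_n)

print(posterior(["malaise", "fever", "headache"]))  # ~0.36 with these numbers
```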

  3. 2nd Bayesian Young Statisticians Meeting

    CERN Document Server

    Bitto, Angela; Kastner, Gregor; Posekany, Alexandra

    2015-01-01

    The Second Bayesian Young Statisticians Meeting (BAYSM 2014) and the research presented here facilitate connections among researchers using Bayesian Statistics by providing a forum for the development and exchange of ideas. WU Vienna University of Business and Economics hosted BAYSM 2014 from September 18th to 19th. The guidance of renowned plenary lecturers and senior discussants is a critical part of the meeting and this volume, which follows publication of contributions from BAYSM 2013. The meeting's scientific program reflected the variety of fields in which Bayesian methods are currently employed or could be introduced in the future. Three brilliant keynote lectures by Chris Holmes (University of Oxford), Christian Robert (Université Paris-Dauphine), and Mike West (Duke University), were complemented by 24 plenary talks covering the major topics Dynamic Models, Applications, Bayesian Nonparametrics, Biostatistics, Bayesian Methods in Economics, and Models and Methods, as well as a lively poster session ...

  4. Mechanical equivalent of Bayesian inference from monitoring data

    Science.gov (United States)

    Cappello, Carlo; Bolognani, Denise; Zonta, Daniele

    2016-04-01

    Structural health monitoring requires engineers to understand the state of a structure from its observed response. When this information is uncertain, Bayesian probability theory provides a consistent framework for making inference. However, structural engineers are often unenthusiastic about Bayesian logic and prefer to make inference using heuristics. Herein we propose a quantitative method for logical inference based on a formal analogy between linear elastic mechanics and Bayesian inference with Gaussian variables. We start by discussing the estimation of a single parameter under the assumption that all of the uncertain quantities have a Gaussian distribution and that the relationship between the observations and the parameter is linear. With these assumptions, the analogy is stated as follows: the expected value of the considered parameter corresponds to the position of a bar with one degree of freedom, and uncertain observations of the parameter are modelled as linear elastic springs placed in series or in parallel. If we want to extend the analogy to multiple parameters, we simply have to express the potential energy of the mechanical system associated with the inference problem. The expected value of the parameters is then calculated by minimizing that potential energy. We conclude our contribution by presenting the application of the mechanical equivalent to a real-life case study, in which we seek the elongation trend of a cable belonging to the Adige Bridge, a cable-stayed bridge located north of Trento, Italy.
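
    The scalar version of the analogy can be checked numerically: treating each spring's stiffness as a precision (the reciprocal of a variance) and its rest position as an observed value, the bar's equilibrium position equals the precision-weighted posterior mean. The stiffnesses and positions below are illustrative, not the bridge data.

```python
# Springs in parallel add stiffnesses exactly as independent Gaussian observations
# add precisions; the equilibrium position of the bar is the posterior mean.
def combine(estimates):
    """estimates: list of (value, variance) pairs = (rest position, 1/stiffness)."""
    precision = sum(1.0 / v for _, v in estimates)
    mean = sum(x / v for x, v in estimates) / precision
    return mean, 1.0 / precision

prior = (0.0, 4.0)                      # weak spring anchored at 0
obs_1 = (2.0, 1.0)                      # stiffer spring (more precise sensor) at 2.0
obs_2 = (1.5, 0.5)                      # even stiffer spring at 1.5
print(combine([prior, obs_1, obs_2]))   # posterior mean and variance
```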

  5. Bayesian network modeling applied to coastal geomorphology: lessons learned from a decade of experimentation and application

    Science.gov (United States)

    Plant, N. G.; Thieler, E. R.; Gutierrez, B.; Lentz, E. E.; Zeigler, S. L.; Van Dongeren, A.; Fienen, M. N.

    2016-12-01

    We evaluate the strengths and weaknesses of Bayesian networks that have been used to address scientific and decision-support questions related to coastal geomorphology. We will provide an overview of coastal geomorphology research that has used Bayesian networks and describe what this approach can do and when it works (or fails to work). Over the past decade, Bayesian networks have been formulated to analyze the multi-variate structure and evolution of coastal morphology and associated human and ecological impacts. The approach relates observable system variables to each other by estimating discrete correlations. The resulting Bayesian networks make predictions that propagate errors, conduct inference via Bayes' rule, or both. In scientific applications, the model results are useful for hypothesis testing, using confidence estimates to gauge the strength of tests, while applications to coastal resource management are aimed at decision support, where the probabilities of desired ecosystem outcomes are evaluated. The range of Bayesian-network applications to coastal morphology includes emulation of high-resolution wave transformation models to make oceanographic predictions, morphologic response to storms and/or sea-level rise, groundwater response to sea-level rise and morphologic variability, habitat suitability for endangered species, and assessment of monetary or human-life risk associated with storms. All of these examples are based on vast observational data sets, numerical model output, or both. We will discuss the progression of our experiments, which has included testing whether the Bayesian-network approach can be implemented and is appropriate for addressing basic and applied scientific problems, and evaluating the hindcast and forecast skill of these implementations. We will present and discuss calibration/validation tests that are used to assess the robustness of Bayesian-network models, and we will compare these results to tests of other models. This will

  6. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

    Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks, with applications such as troubleshooting and data mining under uncertainty. Intended primarily for practitioners, this book does not require sophisticated mathematical skills or deep understanding of the underlying theory and methods, nor does it discuss alternative technologies for reasoning under uncertainty. The theory and methods presented are illustrated through more than 140 examples, and exercises are included for the reader to check his/her level of understanding. The techniques and methods presented for knowledge elicitation, model construction and verification, modeling techniques and tricks, learning models from data, and analyses of models have all been developed and refined...

  7. Bayesian networks in educational assessment

    CERN Document Server

    Almond, Russell G; Steinberg, Linda S; Yan, Duanli; Williamson, David M

    2015-01-01

    Bayesian inference networks, a synthesis of statistics and expert systems, have advanced reasoning under uncertainty in medicine, business, and social sciences. This innovative volume is the first comprehensive treatment exploring how they can be applied to design and analyze innovative educational assessments. Part I develops Bayes nets’ foundations in assessment, statistics, and graph theory, and works through the real-time updating algorithm. Part II addresses parametric forms for use with assessment, model-checking techniques, and estimation with the EM algorithm and Markov chain Monte Carlo (MCMC). A unique feature is the volume’s grounding in Evidence-Centered Design (ECD) framework for assessment design. This “design forward” approach enables designers to take full advantage of Bayes nets’ modularity and ability to model complex evidentiary relationships that arise from performance in interactive, technology-rich assessments such as simulations. Part III describes ECD, situates Bayes nets as ...

  8. Multiview Bayesian Correlated Component Analysis

    DEFF Research Database (Denmark)

    Kamronn, Simon Due; Poulsen, Andreas Trier; Hansen, Lars Kai

    2015-01-01

    Correlated component analysis as proposed by Dmochowski, Sajda, Dias, and Parra (2012) is a tool for investigating brain process similarity in the responses to multiple views of a given stimulus. Correlated components are identified under the assumption that the involved spatial networks are identical. The new model, which we denote Bayesian correlated component analysis, evaluates favorably against three relevant algorithms in simulated data. A well-established benchmark EEG data set is used to further validate the new model and infer the variability of spatial representations across multiple subjects.

  9. Nonparametric Bayesian inference in biostatistics

    CERN Document Server

    Müller, Peter

    2015-01-01

    As chapters in this book demonstrate, BNP has important uses in clinical sciences and inference for issues like unknown partitions in genomics. Nonparametric Bayesian approaches (BNP) play an ever expanding role in biostatistical inference from use in proteomics to clinical trials. Many research problems involve an abundance of data and require flexible and complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival Analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like arrangement of patients into clinically meaningful subpopulations and segmenting the genome into functionally distinct regions. This book is designed to both review and introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...

  10. Bayesian analysis of botanical epidemics using stochastic compartmental models.

    Science.gov (United States)

    Gibson, G J; Kleczkowski, A; Gilligan, C A

    2004-08-17

    A stochastic model for an epidemic, incorporating susceptible, latent, and infectious states, is developed. The model represents primary and secondary infection rates and a time-varying host susceptibility with applications to a wide range of epidemiological systems. A Markov chain Monte Carlo algorithm is presented that allows the model to be fitted to experimental observations within a Bayesian framework. The approach allows the uncertainty in unobserved aspects of the process to be represented in the parameter posterior densities. The methods are applied to experimental observations of damping-off of radish (Raphanus sativus) caused by the fungal pathogen Rhizoctonia solani, in the presence and absence of the antagonistic fungus Trichoderma viride, a biological control agent that has previously been shown to affect the rate of primary infection by using a maximum-likelihood estimate for a simpler model with no allowance for a latent period. Using the Bayesian analysis, we are able to estimate the latent period from population data, even when there is uncertainty in discriminating infectious from latently infected individuals in data collection. We also show that the inference that T. viride can control primary, but not secondary, infection is robust to inclusion of the latent period in the model, although the absolute values of the parameters change. Some refinements and potential difficulties with the Bayesian approach in this context, when prior information on parameters is lacking, are discussed along with broader applications of the methods to a wide range of epidemiological systems.
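
    To give a flavor of the kind of computation involved, the sketch below runs a random-walk Metropolis sampler for the primary and secondary infection rates of a drastically simplified, fully observed chain-binomial epidemic with invented counts; the paper's model additionally includes a latent class, time-varying host susceptibility and unobserved states, which are handled within its richer MCMC scheme.

```python
import numpy as np
from math import lgamma, log, exp

def binom_logpmf(k, n, p):
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log(1 - p))

S = [100, 95, 87, 74, 60]            # susceptibles at the start of each interval (invented)
I = [1, 5, 12, 22, 30]               # infectious individuals (invented)
C = [5, 8, 13, 14, 12]               # new infections observed in each interval (invented)

def loglik(a, b):                    # a: primary infection rate, b: secondary rate
    ll = 0.0
    for s, i, c in zip(S, I, C):
        p = 1 - exp(-(a + b * i))    # per-interval probability a susceptible is infected
        ll += binom_logpmf(c, s, p)
    return ll

rng = np.random.default_rng(3)
a, b, chain = 0.01, 0.001, []
for _ in range(20000):               # random-walk Metropolis, flat priors on (0, inf)
    a_new = a + 0.005 * rng.standard_normal()
    b_new = b + 0.0005 * rng.standard_normal()
    if a_new > 0 and b_new > 0:
        if rng.random() < exp(min(0.0, loglik(a_new, b_new) - loglik(a, b))):
            a, b = a_new, b_new
    chain.append((a, b))
print(np.mean(chain[5000:], axis=0)) # posterior means after burn-in
```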

  11. Bayesian Inversion for Large Scale Antarctic Ice Sheet Flow

    KAUST Repository

    Ghattas, Omar

    2015-01-07

    The flow of ice from the interior of polar ice sheets is the primary contributor to projected sea level rise. One of the main difficulties faced in modeling ice sheet flow is the uncertain spatially-varying Robin boundary condition that describes the resistance to sliding at the base of the ice. Satellite observations of the surface ice flow velocity, along with a model of ice as a creeping incompressible shear-thinning fluid, can be used to infer this uncertain basal boundary condition. We cast this ill-posed inverse problem in the framework of Bayesian inference, which allows us to infer not only the basal sliding parameters, but also the associated uncertainty. To overcome the prohibitive nature of Bayesian methods for large-scale inverse problems, we exploit the fact that, despite the large size of observational data, they typically provide only sparse information on model parameters. We show results for Bayesian inversion of the basal sliding parameter field for the full Antarctic continent, and demonstrate that the work required to solve the inverse problem, measured in number of forward (and adjoint) ice sheet model solves, is independent of the parameter and data dimensions.

  12. Robust bayesian analysis of an autoregressive model with ...

    African Journals Online (AJOL)

    In this work, a robust Bayesian analysis of an autoregressive model with exponential innovations is performed. Using a Bayesian robustness methodology, we show that, with a suitable generalized quadratic loss, we obtain optimal Bayesian estimators of the parameters corresponding to the ...

  13. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating and differentiating these circuits in time linear in their size. We report on experimental results showing successful compilation and efficient inference on relational Bayesian networks whose PRIMULA-generated propositional instances have thousands of variables, and whose jointrees have clusters...

  14. Bayesian models a statistical primer for ecologists

    CERN Document Server

    Hobbs, N Thompson

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods, in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability ...

  15. Bayesian data assimilation in shape registration

    KAUST Repository

    Cotter, C J

    2013-03-28

    In this paper we apply a Bayesian framework to the problem of geodesic curve matching. Given a template curve, the geodesic equations provide a mapping from initial conditions for the conjugate momentum onto topologically equivalent shapes. Here, we aim to recover the well-defined posterior distribution on the initial momentum which gives rise to observed points on the target curve; this is achieved by explicitly including a reparameterization in the formulation. Appropriate priors are chosen for the functions which together determine this field and the positions of the observation points, the initial momentum p0 and the reparameterization vector field ν, informed by regularity results about the forward model. Having done this, we illustrate how maximum likelihood estimators can be used to find regions of high posterior density, but also how we can apply recently developed Markov chain Monte Carlo methods on function spaces to characterize the whole of the posterior density. These illustrative examples also include scenarios where the posterior distribution is multimodal and irregular, leading us to the conclusion that knowledge of a state of global maximal posterior density does not always give us the whole picture, and full posterior sampling can give better quantification of likely states and the overall uncertainty inherent in the problem. © 2013 IOP Publishing Ltd.

  16. Using Bayesian Networks for Candidate Generation in Consistency-based Diagnosis

    Science.gov (United States)

    Narasimhan, Sriram; Mengshoel, Ole

    2008-01-01

    Consistency-based diagnosis relies heavily on the assumption that discrepancies between model predictions and sensor observations can be detected accurately. When sources of uncertainty such as sensor noise and model abstraction exist, robust schemes have to be designed to make a binary decision on whether predictions are consistent with observations. This risks the occurrence of false alarms and missed alarms when an erroneous decision is made. Moreover, when multiple sensors (with differing sensing properties) are available, the degree of match between predictions and observations can be used to guide the search for fault candidates. In this paper we propose a novel approach to handle this problem using Bayesian networks. In the consistency-based diagnosis formulation, automatically generated Bayesian networks are used to encode a probabilistic measure of fit between predictions and observations. A Bayesian network inference algorithm is used to compute the most probable fault candidates.
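    A toy numerical sketch of that last step, not the paper's automatically generated networks: given assumed per-sensor probabilities that residuals appear consistent under each hypothetical fault candidate, Bayes' rule ranks the candidates. All candidate names and probabilities below are invented for illustration.

```python
import numpy as np

# Hypothetical single-fault candidates plus the no-fault hypothesis.
candidates = ["nominal", "valve_stuck", "sensor_bias", "pump_degraded"]
prior = np.array([0.90, 0.04, 0.03, 0.03])

# P(sensor residual is "consistent" | candidate): one row per candidate,
# one column per sensor.  Values are illustrative only.
p_consistent = np.array([
    [0.95, 0.95, 0.95],   # nominal: all residuals should match
    [0.20, 0.90, 0.95],   # valve_stuck mostly affects sensor 1
    [0.90, 0.15, 0.90],   # sensor_bias mostly affects sensor 2
    [0.60, 0.60, 0.25],   # pump_degraded affects all, mainly sensor 3
])

# Observed binary consistency decisions from the residual tests:
# 1 = prediction matches observation, 0 = discrepancy detected.
obs = np.array([0, 1, 1])

# Likelihood of the observation vector under each candidate
lik = np.prod(np.where(obs == 1, p_consistent, 1.0 - p_consistent), axis=1)

posterior = prior * lik
posterior /= posterior.sum()

for name, p in sorted(zip(candidates, posterior), key=lambda t: -t[1]):
    print(f"{name:15s} {p:.3f}")
```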

  17. A tutorial introduction to Bayesian inference for stochastic epidemic models using Approximate Bayesian Computation.

    Science.gov (United States)

    Kypraios, Theodore; Neal, Peter; Prangle, Dennis

    2017-05-01

    Likelihood-based inference for disease outbreak data can be very challenging due to the inherent dependence of the data and the fact that they are usually incomplete. In this paper we review recent Approximate Bayesian Computation (ABC) methods for the analysis of such data by fitting stochastic epidemic models to them without having to calculate the likelihood of the observed data. We consider both non-temporal and temporal data and illustrate the methods with a number of examples featuring different models and datasets. In addition, we present extensions to existing algorithms which are easy to implement and provide an improvement to the existing methodology. Finally, R code to implement the algorithms presented in the paper is available on https://github.com/kypraios/epiABC. Copyright © 2016 Elsevier Inc. All rights reserved.
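    The paper's own R implementations are at the URL above; the following is an independent, bare-bones ABC rejection sketch in Python showing the basic idea: simulate a chain-binomial (Reed-Frost) outbreak and keep the parameter draws whose final size lands near an assumed observed value. All numbers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def final_size(p, n=100, i0=1):
    """Final epidemic size of a Reed-Frost chain-binomial model with
    per-contact transmission probability p."""
    s, i, total = n - i0, i0, i0
    while i > 0 and s > 0:
        new_i = rng.binomial(s, 1.0 - (1.0 - p) ** i)
        s -= new_i
        total += new_i
        i = new_i
    return total

observed_size = 63        # assumed observed final size
tolerance = 3
accepted = []

for _ in range(50000):
    p = rng.uniform(0.0, 0.05)              # prior on the transmission probability
    if abs(final_size(p) - observed_size) <= tolerance:
        accepted.append(p)

accepted = np.array(accepted)
print(f"accepted {accepted.size} draws; "
      f"posterior mean {accepted.mean():.4f}, "
      f"95% interval ({np.quantile(accepted, 0.025):.4f}, "
      f"{np.quantile(accepted, 0.975):.4f})")
```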

  18. Bayesian adaptive methods for clinical trials

    National Research Council Canada - National Science Library

    Berry, Scott M

    2011-01-01

    ... One is that Bayesian approaches implemented with the majority of their informative content coming from the current data, and not any external prior information, typically have good frequentist properties (e.g...

  19. Bayesian analysis for the social sciences

    CERN Document Server

    Jackman, Simon

    2009-01-01

    Bayesian methods are increasingly being used in the social sciences, as the problems encountered lend themselves so naturally to the subjective qualities of Bayesian methodology. This book provides an accessible introduction to Bayesian methods, tailored specifically for social science students. It contains lots of real examples from political science, psychology, sociology, and economics, exercises in all chapters, and detailed descriptions of all the key concepts, without assuming any background in statistics beyond a first course. It features examples of how to implement the methods using WinBUGS - the most widely used Bayesian analysis software in the world - and R - open-source statistical software. The book is supported by a website featuring WinBUGS and R code, and data sets.

  20. Learning Bayesian networks for discrete data

    KAUST Repository

    Liang, Faming

    2009-02-01

    Bayesian networks have received much attention in the recent literature. In this article, we propose an approach to learn Bayesian networks using the stochastic approximation Monte Carlo (SAMC) algorithm. Our approach has two nice features. Firstly, it possesses a self-adjusting mechanism and thus essentially avoids the local-trap problem suffered by conventional MCMC simulation-based approaches in learning Bayesian networks. Secondly, it falls into the class of dynamic importance sampling algorithms; the network features can be inferred by dynamically weighted averaging of the samples generated in the learning process, and the resulting estimates can have much lower variation than the single model-based estimates. The numerical results indicate that our approach can mix much faster over the space of Bayesian networks than the conventional MCMC simulation-based approaches. © 2008 Elsevier B.V. All rights reserved.

  1. Learning motion: Human vs. optimal Bayesian learner

    National Research Council Canada - National Science Library

    Trenti, Edgardo J; Barraza, José F; Eckstein, Miguel P

    2010-01-01

    We used the optimal perceptual learning paradigm (Eckstein, Abbey, Pham, & Shimozaki, 2004) to investigate the dynamics of human rapid learning processes in motion discrimination tasks and compare it to an optimal Bayesian learner...

  2. MRBAYES: Bayesian inference of phylogenetic trees.

    Science.gov (United States)

    Huelsenbeck, J P; Ronquist, F

    2001-08-01

    The program MRBAYES performs Bayesian inference of phylogeny using a variant of Markov chain Monte Carlo. MRBAYES, including the source code, documentation, sample data files, and an executable, is available at http://brahms.biology.rochester.edu/software.html.

  3. A Bayesian Network Approach to Ontology Mapping

    National Research Council Canada - National Science Library

    Pan, Rong; Ding, Zhongli; Yu, Yang; Peng, Yun

    2005-01-01

    ... In this approach, the source and target ontologies are first translated into Bayesian networks (BNs); the concept mappings between the two ontologies are then treated as evidential reasoning between the two translated BNs...

  4. Learning Bayesian Network Model Structure from Data

    National Research Council Canada - National Science Library

    Margaritis, Dimitris

    2003-01-01

    In this thesis I address the important problem of the determination of the structure of directed statistical models, with the widely used class of Bayesian network models as a concrete vehicle of my ideas...

  5. An overview on Approximate Bayesian computation*

    Directory of Open Access Journals (Sweden)

    Baragatti Meïli

    2014-01-01

    Full Text Available Approximate Bayesian computation techniques, also called likelihood-free methods, are one of the most satisfactory approaches to intractable likelihood problems. This overview presents recent results since their introduction about ten years ago in population genetics.

  6. Orbits for 18 Visual Binaries and Two Double-line Spectroscopic Binaries Observed with HRCAM on the CTIO SOAR 4 m Telescope, Using a New Bayesian Orbit Code Based on Markov Chain Monte Carlo

    Science.gov (United States)

    Mendez, Rene A.; Claveria, Ruben M.; Orchard, Marcos E.; Silva, Jorge F.

    2017-11-01

    We present orbital elements and mass sums for 18 visual binary stars of spectral types B to K (five of which are new orbits) with periods ranging from 20 to more than 500 yr. For two double-line spectroscopic binaries with no previous orbits, the individual component masses, using combined astrometric and radial velocity data, have a formal uncertainty of ~0.1 M⊙. Adopting published photometry and trigonometric parallaxes, plus our own measurements, we place these objects on an H-R diagram and discuss their evolutionary status. These objects are part of a survey to characterize the binary population of stars in the Southern Hemisphere using the SOAR 4 m telescope+HRCAM at CTIO. Orbital elements are computed using a newly developed Markov chain Monte Carlo (MCMC) algorithm that delivers maximum-likelihood estimates of the parameters, as well as posterior probability density functions that allow us to evaluate the uncertainty of our derived parameters in a robust way. For spectroscopic binaries, using our approach, it is possible to derive a self-consistent parallax for the system from the combined astrometric and radial velocity data (“orbital parallax”), which compares well with the trigonometric parallaxes. We also present a mathematical formalism that allows a dimensionality reduction of the feature space from seven to three search parameters (or from 10 to seven dimensions—including parallax—in the case of spectroscopic binaries with astrometric data), which makes it possible to explore a smaller number of parameters in each case, improving the computational efficiency of our MCMC code. Based on observations obtained at the Southern Astrophysical Research (SOAR) telescope, which is a joint project of the Ministério da Ciência, Tecnologia, e Inovação (MCTI) da República Federativa do Brasil, the U.S. National Optical Astronomy Observatory (NOAO), the University of North Carolina at Chapel Hill (UNC), and Michigan State University (MSU).

  7. Covariance Kernels from Bayesian Generative Models

    OpenAIRE

    Seeger, Matthias

    2002-01-01

    We propose the framework of mutual information kernels for learning covariance kernels, as used in Support Vector machines and Gaussian process classifiers, from unlabeled task data using Bayesian techniques. We describe an implementation of this framework which uses variational Bayesian mixtures of factor analyzers in order to attack classification problems in high-dimensional spaces where labeled data is sparse, but unlabeled data is abundant.

  8. Bayesian Inference of High-Dimensional Dynamical Ocean Models

    Science.gov (United States)

    Lin, J.; Lermusiaux, P. F. J.; Lolla, S. V. T.; Gupta, A.; Haley, P. J., Jr.

    2015-12-01

    This presentation addresses a holistic set of challenges in high-dimensional ocean Bayesian nonlinear estimation: (i) predict the probability distribution functions (pdfs) of large nonlinear dynamical systems using stochastic partial differential equations (PDEs); (ii) assimilate data using Bayes' law with these pdfs; (iii) predict the future data that optimally reduce uncertainties; and (iv) rank the known and learn the new model formulations themselves. Overall, we allow the joint inference of the state, equations, geometry, boundary conditions and initial conditions of dynamical models. Examples are provided for time-dependent fluid and ocean flows, including cavity, double-gyre and Strait flows with jets and eddies. The Bayesian model inference, based on limited observations, is illustrated first by the estimation of obstacle shapes and positions in fluid flows. Next, the Bayesian inference of biogeochemical reaction equations and of their states and parameters is presented, illustrating how PDE-based machine learning can rigorously guide the selection and discovery of complex ecosystem models. Finally, the inference of multiscale bottom gravity current dynamics is illustrated, motivated in part by classic overflows and dense water formation sites and their relevance to climate monitoring and dynamics. This is joint work with our MSEAS group at MIT.

  9. Bayesian estimation of a proportion under an asymmetric observation error Estimación bayesiana de una proporción bajo error de estimación asimétrico

    Directory of Open Access Journals (Sweden)

    Juan Carlos Correa Morales

    2012-06-01

    Full Text Available The process of estimating a proportion associated with a sensitive question can yield responses that do not necessarily correspond to reality. To reduce the probability of false responses to this kind of sensitive question, some authors have proposed randomized-response techniques that assume a symmetric observation error. In this paper we present a generalization to the case of an asymmetric error, since the symmetry assumption can be unrealistic in practice. Under the assumption of an asymmetric error the likelihood function is built, so that in practice the final user has an alternative method to reduce the probability of false responses. Assuming informative a priori distributions, an expression for the posterior distribution is found. Since this posterior distribution does not have a closed mathematical expression, the Gibbs sampler is used to carry out the estimation process. The technique is illustrated using real data about drug consumption collected by the Oficina de Bienestar of the Universidad Nacional de Colombia at Medellín.
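    A minimal numerical sketch of the underlying idea (a grid approximation rather than the Gibbs sampler used in the paper): the true proportion is observed through an asymmetric misclassification model, and the posterior follows from a Beta prior and a binomial likelihood on the reported answers. The error rates, counts and prior below are illustrative assumptions.

```python
import numpy as np
from scipy import stats

# Observation model: a true "yes" is reported as "yes" with prob 1 - fn,
# a true "no" is reported as "yes" with prob fp (asymmetric error).
fp, fn = 0.10, 0.25            # assumed asymmetric error rates
y, n = 38, 200                 # observed "yes" answers out of n respondents

pi = np.linspace(0.001, 0.999, 999)           # grid over the true proportion
p_report_yes = pi * (1 - fn) + (1 - pi) * fp  # probability of an observed "yes"

log_post = (stats.beta(2, 2).logpdf(pi)                  # mildly informative prior
            + stats.binom(n, p_report_yes).logpmf(y))    # likelihood
post = np.exp(log_post - log_post.max())
post /= post.sum()

mean = np.sum(pi * post)
cdf = np.cumsum(post)
lo, hi = pi[np.searchsorted(cdf, 0.025)], pi[np.searchsorted(cdf, 0.975)]
print(f"posterior mean {mean:.3f}, 95% credible interval ({lo:.3f}, {hi:.3f})")
```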

  10. A Bayesian Generative Model for Surface Template Estimation

    Directory of Open Access Journals (Sweden)

    Jun Ma

    2010-01-01

    Full Text Available 3D surfaces are important geometric models for many objects of interest in image analysis and Computational Anatomy. In this paper, we describe a Bayesian inference scheme for estimating a template surface from a set of observed surface data. In order to achieve this, we use the geodesic shooting approach to construct a statistical model for the generation and the observations of random surfaces. We develop a mode approximation EM algorithm to infer the maximum a posteriori estimation of initial momentum μ, which determines the template surface. Experimental results of caudate, thalamus, and hippocampus data are presented.

  11. Bayesian LASSO, scale space and decision making in association genetics.

    Science.gov (United States)

    Pasanen, Leena; Holmström, Lasse; Sillanpää, Mikko J

    2015-01-01

    LASSO is a penalized regression method that facilitates model fitting in situations where there are as many, or even more explanatory variables than observations, and only a few variables are relevant in explaining the data. We focus on the Bayesian version of LASSO and consider four problems that need special attention: (i) controlling false positives, (ii) multiple comparisons, (iii) collinearity among explanatory variables, and (iv) the choice of the tuning parameter that controls the amount of shrinkage and the sparsity of the estimates. The particular application considered is association genetics, where LASSO regression can be used to find links between chromosome locations and phenotypic traits in a biological organism. However, the proposed techniques are relevant also in other contexts where LASSO is used for variable selection. We separate the true associations from false positives using the posterior distribution of the effects (regression coefficients) provided by Bayesian LASSO. We propose to solve the multiple comparisons problem by using simultaneous inference based on the joint posterior distribution of the effects. Bayesian LASSO also tends to distribute an effect among collinear variables, making detection of an association difficult. We propose to solve this problem by considering not only individual effects but also their functionals (i.e. sums and differences). Finally, whereas in Bayesian LASSO the tuning parameter is often regarded as a random variable, we adopt a scale space view and consider a whole range of fixed tuning parameters, instead. The effect estimates and the associated inference are considered for all tuning parameters in the selected range and the results are visualized with color maps that provide useful insights into data and the association problem considered. The methods are illustrated using two sets of artificial data and one real data set, all representing typical settings in association genetics.
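    A tiny sketch of the kind of posterior the abstract refers to (not the authors' implementation): a random-walk Metropolis sampler with a double-exponential (Laplace) prior on the effects, with the noise variance and the shrinkage (tuning) parameter held fixed for brevity; markers are flagged when their 95% posterior interval excludes zero. All dimensions and parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated "association" data: many markers, few true effects
n, p = 120, 30
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[[3, 17]] = [1.5, -1.0]
y = X @ beta_true + rng.normal(scale=1.0, size=n)

sigma2, lam = 1.0, 2.0          # fixed noise variance and shrinkage parameter

def log_post(beta):
    resid = y - X @ beta
    # Gaussian likelihood plus Laplace (double-exponential) prior on the effects
    return -0.5 * resid @ resid / sigma2 - lam * np.abs(beta).sum()

beta = np.zeros(p)
lp = log_post(beta)
draws = []
for it in range(30000):
    prop = beta + rng.normal(scale=0.05, size=p)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        beta, lp = prop, lp_prop
    if it >= 10000 and it % 10 == 0:
        draws.append(beta.copy())

draws = np.array(draws)
# Flag "associations" whose 95% posterior interval excludes zero
lo, hi = np.quantile(draws, [0.025, 0.975], axis=0)
print("flagged markers:", np.where((lo > 0) | (hi < 0))[0])
```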

  12. Philosophy and the practice of Bayesian statistics.

    Science.gov (United States)

    Gelman, Andrew; Shalizi, Cosma Rohilla

    2013-02-01

    A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework. © 2012 The British Psychological Society.

  13. Bayesian Analysis of an Anisotropic Universe Model: Systematics and Polarization

    Science.gov (United States)

    Groeneboom, Nicolaas E.; Ackerman, Lotty; Wehus, Ingunn Kathrine; Eriksen, Hans Kristian

    2010-10-01

    We revisit the anisotropic universe model previously developed by Ackerman, Carroll, and Wise (ACW), and generalize both the theoretical and computational framework to include polarization and various forms of systematic effects. We apply our new tools to simulated Wilkinson Microwave Anisotropy Probe (WMAP) data in order to understand the potential impact of asymmetric beams, noise misestimation, and potential zodiacal light emission. We find that none of these has any significant impact on the results. We next show that the previously reported ACW signal is also present in the one-year WMAP temperature sky map presented by Liu & Li, where data cuts are more aggressive. Finally, we re-analyze the five-year WMAP data taking into account a previously neglected (-i)^{ℓ-ℓ'} term in the signal covariance matrix. We still find a strong detection of a preferred direction in the temperature map. Including multipoles up to ℓ = 400, the anisotropy amplitude for the W band is found to be g = 0.29 ± 0.031, nonzero at 9σ. However, the corresponding preferred direction is also shifted very close to the ecliptic poles at (l, b) = (96, 30), in agreement with the analysis of Hanson & Lewis, indicating that the signal is aligned along the plane of the solar system. This strongly suggests that the signal is not of cosmological origin, but most likely is a product of an unknown systematic effect. Determining the nature of the systematic effect is of vital importance, as it might affect other cosmological conclusions from the WMAP experiment. Finally, we provide a forecast for the Planck experiment including polarization.

  14. Sparse Bayesian Inference and the Temperature Structure of the Solar Corona

    Science.gov (United States)

    Warren, Harry P.; Byers, Jeff M.; Crump, Nicholas A.

    2017-02-01

    Measuring the temperature structure of the solar atmosphere is critical to understanding how it is heated to high temperatures. Unfortunately, the temperature of the upper atmosphere cannot be observed directly, but must be inferred from spectrally resolved observations of individual emission lines that span a wide range of temperatures. Such observations are “inverted” to determine the distribution of plasma temperatures along the line of sight. This inversion is ill posed and, in the absence of regularization, tends to produce wildly oscillatory solutions. We introduce the application of sparse Bayesian inference to the problem of inferring the temperature structure of the solar corona. Within a Bayesian framework a preference for solutions that utilize a minimum number of basis functions can be encoded into the prior and many ad hoc assumptions can be avoided. We demonstrate the efficacy of the Bayesian approach by considering a test library of 40 assumed temperature distributions.

  15. Incremental Bayesian Category Learning From Natural Language.

    Science.gov (United States)

    Frermann, Lea; Lapata, Mirella

    2016-08-01

    Models of category learning have been extensively studied in cognitive science and primarily tested on perceptual abstractions or artificial stimuli. In this paper, we focus on categories acquired from natural language stimuli, that is, words (e.g., chair is a member of the furniture category). We present a Bayesian model that, unlike previous work, learns both categories and their features in a single process. We model category induction as two interrelated subproblems: (a) the acquisition of features that discriminate among categories, and (b) the grouping of concepts into categories based on those features. Our model learns categories incrementally using particle filters, a sequential Monte Carlo method commonly used for approximate probabilistic inference that sequentially integrates newly observed data and can be viewed as a plausible mechanism for human learning. Experimental results show that our incremental learner obtains meaningful categories which yield a closer fit to behavioral data compared to related models while at the same time acquiring features which characterize the learned categories. (An earlier version of this work was published in Frermann and Lapata.) Copyright © 2015 Cognitive Science Society, Inc.

  16. Bayesian Hypothesis Testing for Planet Finding

    Science.gov (United States)

    Braems, I.; Kasdin, N. J.

    2003-12-01

    One of the most important performance metrics of any space planet finding system is integration time. The time needed to make a positive detection of an extrasolar planet determines the number of systems we can observe for the life of the mission and the stability requirements of the spacecraft and optical control systems. Most astronomical detection approaches rely on fairly simple signal-to-noise calculations and a threshold determined by the ability of the human eye to extract the planet image from the background (usually a signal-to-noise ratio of five). In this paper we present an alternative approach to detection using Bayesian hypothesis testing. This optimal approach provides a quantitative measure of the probability of detection under various conditions and integration times (such as known or unknown background levels) and under different prior assumptions. We also show how the technique allows for a much higher probability of detection for shorter integration times than the previous photometric approaches. We gratefully acknowledge the support of the Jet Propulsion Laboratory of the National Aeronautics and Space Administration for this work and Institut National de Recherche en Informatique et Automatique (INRIA) for its support of Ms. Braems.
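    A toy sketch of Bayesian detection (not the paper's formulation): photon counts from a fixed sky location are Poisson, and the posterior probability that a planet is present follows from the odds of a background-plus-planet model against a background-only model as the integration time grows. The count rates and prior below are illustrative assumptions.

```python
import numpy as np
from scipy import stats

background = 5.0      # expected background counts per second (assumed known)
planet = 1.0          # expected planet counts per second if present
prior_h1 = 0.5        # prior probability that a planet is present

rng = np.random.default_rng(3)

for t in [10, 50, 200, 1000]:                          # integration times in seconds
    counts = rng.poisson((background + planet) * t)    # simulate a "planet present" sky
    log_l1 = stats.poisson((background + planet) * t).logpmf(counts)
    log_l0 = stats.poisson(background * t).logpmf(counts)
    log_odds = np.log(prior_h1 / (1 - prior_h1)) + log_l1 - log_l0
    p_detect = 1.0 / (1.0 + np.exp(-log_odds))         # posterior P(planet | counts)
    print(f"t = {t:5d} s   P(planet | counts) = {p_detect:.4f}")
```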

  17. Bayesian segmentation of protein secondary structure.

    Science.gov (United States)

    Schmidler, S C; Liu, J S; Brutlag, D L

    2000-01-01

    We present a novel method for predicting the secondary structure of a protein from its amino acid sequence. Most existing methods predict each position in turn based on a local window of residues, sliding this window along the length of the sequence. In contrast, we develop a probabilistic model of protein sequence/structure relationships in terms of structural segments, and formulate secondary structure prediction as a general Bayesian inference problem. A distinctive feature of our approach is the ability to develop explicit probabilistic models for alpha-helices, beta-strands, and other classes of secondary structure, incorporating experimentally and empirically observed aspects of protein structure such as helical capping signals, side chain correlations, and segment length distributions. Our model is Markovian in the segments, permitting efficient exact calculation of the posterior probability distribution over all possible segmentations of the sequence using dynamic programming. The optimal segmentation is computed and compared to a predictor based on marginal posterior modes, and the latter is shown to provide significant improvement in predictive accuracy. The marginalization procedure provides exact secondary structure probabilities at each sequence position, which are shown to be reliable estimates of prediction uncertainty. We apply this model to a database of 452 nonhomologous structures, achieving accuracies as high as the best currently available methods. We conclude by discussing an extension of this framework to model nonlocal interactions in protein structures, providing a possible direction for future improvements in secondary structure prediction accuracy.

  18. Optimal Bayesian Experimental Design for Combustion Kinetics

    KAUST Repository

    Huan, Xun

    2011-01-04

    Experimental diagnostics play an essential role in the development and refinement of chemical kinetic models, whether for the combustion of common complex hydrocarbons or of emerging alternative fuels. Questions of experimental design—e.g., which variables or species to interrogate, at what resolution and under what conditions—are extremely important in this context, particularly when experimental resources are limited. This paper attempts to answer such questions in a rigorous and systematic way. We propose a Bayesian framework for optimal experimental design with nonlinear simulation-based models. While the framework is broadly applicable, we use it to infer rate parameters in a combustion system with detailed kinetics. The framework introduces a utility function that reflects the expected information gain from a particular experiment. Straightforward evaluation (and maximization) of this utility function requires Monte Carlo sampling, which is infeasible with computationally intensive models. Instead, we construct a polynomial surrogate for the dependence of experimental observables on model parameters and design conditions, with the help of dimension-adaptive sparse quadrature. Results demonstrate the efficiency and accuracy of the surrogate, as well as the considerable effectiveness of the experimental design framework in choosing informative experimental conditions.

  19. DIAMONDS: a new Bayesian nested sampling tool*

    Directory of Open Access Journals (Sweden)

    Corsaro Enrico

    2015-01-01

    Full Text Available In the context of high-quality asteroseismic data provided by the NASA Kepler Mission, we developed a new code, termed DIAMONDS (high-DImensional And multi-MOdal NesteD Sampling), for fast Bayesian parameter estimation and model comparison by means of the Nested Sampling Monte Carlo (NSMC) algorithm, an efficient and powerful method very suitable for high-dimensional problems (like the peak bagging analysis of solar-like oscillations) and multi-modal problems (i.e. problems that show multiple solutions). We applied the code to the peak bagging analysis of solar-like oscillations observed in a challenging F-type star. By means of DIAMONDS one is able to detect the different backgrounds in the power spectrum of the star (e.g. stellar granulation and faculae activity) and to understand whether one or two oscillation peaks can be identified or not. In addition, we demonstrate a novel approach to peak bagging based on multi-modality, which is able to reduce significantly the number of free parameters involved in the peak bagging model. This novel approach is therefore of great interest for possible future automatization of the entire analysis technique.

  20. Benchmarking for Bayesian Reinforcement Learning.

    Directory of Open Access Journals (Sweden)

    Michael Castronovo

    Full Text Available In the Bayesian Reinforcement Learning (BRL) setting, agents try to maximise the collected rewards while interacting with their environment, using some prior knowledge that is accessed beforehand. Many BRL algorithms have already been proposed, but the benchmarks used to compare them are only relevant for specific cases. The paper addresses this problem, and provides a new BRL comparison methodology along with the corresponding open source library. In this methodology, a comparison criterion that measures the performance of algorithms on large sets of Markov Decision Processes (MDPs) drawn from some probability distributions is defined. In order to enable the comparison of non-anytime algorithms, our methodology also includes a detailed analysis of the computation time requirement of each algorithm. Our library is released with all source code and documentation: it includes three test problems, each of which has two different prior distributions, and seven state-of-the-art RL algorithms. Finally, our library is illustrated by comparing all the available algorithms and the results are discussed.

  1. Benchmarking for Bayesian Reinforcement Learning.

    Science.gov (United States)

    Castronovo, Michael; Ernst, Damien; Couëtoux, Adrien; Fonteneau, Raphael

    2016-01-01

    In the Bayesian Reinforcement Learning (BRL) setting, agents try to maximise the collected rewards while interacting with their environment, using some prior knowledge that is accessed beforehand. Many BRL algorithms have already been proposed, but the benchmarks used to compare them are only relevant for specific cases. The paper addresses this problem, and provides a new BRL comparison methodology along with the corresponding open source library. In this methodology, a comparison criterion that measures the performance of algorithms on large sets of Markov Decision Processes (MDPs) drawn from some probability distributions is defined. In order to enable the comparison of non-anytime algorithms, our methodology also includes a detailed analysis of the computation time requirement of each algorithm. Our library is released with all source code and documentation: it includes three test problems, each of which has two different prior distributions, and seven state-of-the-art RL algorithms. Finally, our library is illustrated by comparing all the available algorithms and the results are discussed.

  2. Post hoc Bayesian model selection.

    Science.gov (United States)

    Friston, Karl; Penny, Will

    2011-06-15

    This note describes a Bayesian model selection or optimization procedure for post hoc inferences about reduced versions of a full model. The scheme provides the evidence (marginal likelihood) for any reduced model as a function of the posterior density over the parameters of the full model. It rests upon specifying models through priors on their parameters, under the assumption that the likelihood remains the same for all models considered. This provides a quick and efficient scheme for scoring arbitrarily large numbers of models, after inverting a single (full) model. In turn, this enables the selection among discrete models that are distinguished by the presence or absence of free parameters, where free parameters are effectively removed from the model using very precise shrinkage priors. An alternative application of this post hoc model selection considers continuous model spaces, defined in terms of hyperparameters (sufficient statistics) of the prior density over model parameters. In this instance, the prior (model) can be optimized with respect to its evidence. The expressions for model evidence become remarkably simple under the Laplace (Gaussian) approximation to the posterior density. Special cases of this scheme include Savage-Dickey density ratio tests for reduced models and automatic relevance determination in model optimization. We illustrate the approach using general linear models and a more complicated nonlinear state-space model. Copyright © 2011 Elsevier Inc. All rights reserved.
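    The Savage-Dickey special case mentioned above can be illustrated with a minimal conjugate-normal sketch (an independent example, not the note's general linear or state-space models): for a point null on a single parameter with a normal prior, the Bayes factor for the reduced model is the ratio of the posterior to the prior density at zero. All numbers are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

sigma = 1.0                 # known observation noise
tau = 1.0                   # prior std of the extra parameter theta in the full model
y = rng.normal(loc=0.3, scale=sigma, size=25)    # data with a small true effect
n = y.size

# Conjugate normal posterior for theta under the full model
post_var = 1.0 / (n / sigma**2 + 1.0 / tau**2)
post_mean = post_var * y.sum() / sigma**2

# Savage-Dickey: Bayes factor for the reduced model (theta = 0) vs the full model
# is the ratio of posterior to prior density evaluated at theta = 0.
bf01 = stats.norm(post_mean, np.sqrt(post_var)).pdf(0.0) / stats.norm(0.0, tau).pdf(0.0)
print(f"posterior mean {post_mean:.3f}, BF_01 = {bf01:.3f} "
      f"({'favours reduced' if bf01 > 1 else 'favours full'} model)")
```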

  3. EXONEST: The Bayesian Exoplanetary Explorer

    Directory of Open Access Journals (Sweden)

    Kevin H. Knuth

    2017-10-01

    Full Text Available The fields of astronomy and astrophysics are currently engaged in an unprecedented era of discovery as recent missions have revealed thousands of exoplanets orbiting other stars. While the Kepler Space Telescope mission has enabled most of these exoplanets to be detected by identifying transiting events, exoplanets often exhibit additional photometric effects that can be used to improve the characterization of exoplanets. The EXONEST Exoplanetary Explorer is a Bayesian exoplanet inference engine based on nested sampling and originally designed to analyze archived Kepler Space Telescope and CoRoT (Convection Rotation et Transits planétaires) exoplanet mission data. We discuss the EXONEST software package and describe how it accommodates plug-and-play models of exoplanet-associated photometric effects for the purpose of exoplanet detection, characterization and scientific hypothesis testing. The current suite of models allows for both circular and eccentric orbits in conjunction with photometric effects, such as the primary transit and secondary eclipse, reflected light, thermal emissions, ellipsoidal variations, Doppler beaming and superrotation. We discuss our new efforts to expand the capabilities of the software to include more subtle photometric effects involving reflected and refracted light. We discuss the EXONEST inference engine design and introduce our plans to port the current MATLAB-based EXONEST software package over to the next generation Exoplanetary Explorer, which will be a Python-based open source project with the capability to employ third-party plug-and-play models of exoplanet-related photometric effects.

  4. Bayesian demography 250 years after Bayes.

    Science.gov (United States)

    Bijak, Jakub; Bryant, John

    2016-01-01

    Bayesian statistics offers an alternative to classical (frequentist) statistics. It is distinguished by its use of probability distributions to describe uncertain quantities, which leads to elegant solutions to many difficult statistical problems. Although Bayesian demography, like Bayesian statistics more generally, is around 250 years old, only recently has it begun to flourish. The aim of this paper is to review the achievements of Bayesian demography, address some misconceptions, and make the case for wider use of Bayesian methods in population studies. We focus on three applications: demographic forecasts, limited data, and highly structured or complex models. The key advantages of Bayesian methods are the ability to integrate information from multiple sources and to describe uncertainty coherently. Bayesian methods also allow for including additional (prior) information next to the data sample. As such, Bayesian approaches are complementary to many traditional methods, which can be productively re-expressed in Bayesian terms.

  5. Use of Bayesian Estimates to determine the Volatility Parameter Input in the Black-Scholes and Binomial Option Pricing Models

    Directory of Open Access Journals (Sweden)

    Shu Wing Ho

    2011-12-01

    Full Text Available The valuation of options and many other derivative instruments requires an estimation of ex-ante, or forward-looking, volatility. This paper adopts a Bayesian approach to estimate stock price volatility. We find evidence that, overall, Bayesian volatility estimates more closely approximate the implied volatility of stocks derived from traded call and put option prices than do historical volatility estimates sourced from IVolatility.com (“IVolatility”). Our evidence suggests that use of the Bayesian approach to estimate volatility can provide a more accurate measure of ex-ante stock price volatility and will be useful in the pricing of derivative securities where the implied stock price volatility cannot be observed.
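    A minimal sketch of the general idea (not the paper's estimator): with zero-mean normal daily log-returns and a conjugate inverse-gamma prior on the return variance, the posterior mean variance gives an annualised volatility that can be fed into the Black-Scholes call formula. The prior hyperparameters, return series and option inputs below are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Daily log-returns (simulated here; in practice use market data)
true_daily_vol = 0.02
r = rng.normal(loc=0.0, scale=true_daily_vol, size=250)

# Conjugate inverse-gamma posterior for the daily return variance,
# assuming zero-mean normal returns: sigma^2 ~ InvGamma(a0, b0)
a0, b0 = 2.0, 1e-4                      # weakly informative prior (assumed)
a_n = a0 + r.size / 2.0
b_n = b0 + 0.5 * np.sum(r**2)
post_var_mean = b_n / (a_n - 1.0)       # posterior mean of sigma^2
annual_vol = np.sqrt(post_var_mean * 252.0)

def black_scholes_call(S, K, T, rf, sigma):
    """Standard Black-Scholes price of a European call."""
    d1 = (np.log(S / K) + (rf + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * stats.norm.cdf(d1) - K * np.exp(-rf * T) * stats.norm.cdf(d2)

print(f"posterior (annualised) volatility estimate: {annual_vol:.3f}")
print(f"call price: {black_scholes_call(S=100, K=105, T=0.5, rf=0.02, sigma=annual_vol):.3f}")
```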

  6. The Bayesian formulation and well-posedness of fractional elliptic inverse problems

    Science.gov (United States)

    García Trillos, Nicolás; Sanz-Alonso, Daniel

    2017-06-01

    We study the inverse problem of recovering the order and the diffusion coefficient of an elliptic fractional partial differential equation from a finite number of noisy observations of the solution. We work in a Bayesian framework and show conditions under which the posterior distribution is given by a change of measure from the prior. Moreover, we show well-posedness of the inverse problem, in the sense that small perturbations of the observed solution lead to small Hellinger perturbations of the associated posterior measures. We thus provide a mathematical foundation to the Bayesian learning of the order—and other inputs—of fractional models.

  7. Bayesian inversion of refraction seismic traveltime data

    Science.gov (United States)

    Ryberg, T.; Haberland, Ch

    2018-03-01

    We apply a Bayesian Markov chain Monte Carlo (McMC) formalism to the inversion of refraction seismic traveltime data sets to derive 2-D velocity models below linear arrays (i.e. profiles) of sources and seismic receivers. Typical refraction data sets, especially when using the far-offset observations, are known to have experimental geometries which are very poor, highly ill-posed and far from ideal. As a consequence, the structural resolution quickly degrades with depth. Conventional inversion techniques, based on regularization, potentially suffer from the choice of appropriate inversion parameters (i.e. number and distribution of cells, starting velocity models, damping and smoothing constraints, data noise level, etc.) and only local model space exploration. McMC techniques are used for exhaustive sampling of the model space without the need of prior knowledge (or assumptions) of inversion parameters, resulting in a large number of models fitting the observations. Statistical analysis of these models allows us to derive an average (reference) solution and its standard deviation, thus providing uncertainty estimates of the inversion result. The highly non-linear character of the inversion problem, mainly caused by the experiment geometry, does not allow us to derive a reference solution and error map by a simple averaging procedure. We present a modified averaging technique, which excludes parts of the prior distribution in the posterior values due to poor ray coverage, thus providing reliable estimates of inversion model properties even in those parts of the models. The model is discretized by a set of Voronoi polygons (with constant slowness cells) or a triangulated mesh (with interpolation within the triangles). Forward traveltime calculations are performed by a fast, finite-difference-based eikonal solver. The method is applied to a data set from a refraction seismic survey from Northern Namibia and compared to conventional tomography. An inversion test

  8. Collective animal behavior from Bayesian estimation and probability matching.

    Directory of Open Access Journals (Sweden)

    Alfonso Pérez-Escudero

    2011-11-01

    Full Text Available Animals living in groups make movement decisions that depend, among other factors, on social interactions with other group members. Our present understanding of social rules in animal collectives is mainly based on empirical fits to observations, with less emphasis on obtaining first-principles approaches that allow their derivation. Here we show that patterns of collective decisions can be derived from the basic ability of animals to make probabilistic estimations in the presence of uncertainty. We build a decision-making model with two stages: Bayesian estimation and probabilistic matching. In the first stage, each animal makes a Bayesian estimation of which behavior is best to perform taking into account personal information about the environment and social information collected by observing the behaviors of other animals. In the probability matching stage, each animal chooses a behavior with a probability equal to the Bayesian-estimated probability that this behavior is the most appropriate one. This model derives very simple rules of interaction in animal collectives that depend only on two types of reliability parameters, one that each animal assigns to the other animals and another given by the quality of the non-social information. We test our model by obtaining theoretically a rich set of observed collective patterns of decisions in three-spined sticklebacks, Gasterosteus aculeatus, a shoaling fish species. The quantitative link shown between probabilistic estimation and collective rules of behavior allows a better contact with other fields such as foraging, mate selection, neurobiology and psychology, and gives predictions for experiments directly testing the relationship between estimation and collective behavior.

  9. Collective animal behavior from Bayesian estimation and probability matching.

    Science.gov (United States)

    Pérez-Escudero, Alfonso; de Polavieja, Gonzalo G

    2011-11-01

    Animals living in groups make movement decisions that depend, among other factors, on social interactions with other group members. Our present understanding of social rules in animal collectives is mainly based on empirical fits to observations, with less emphasis on obtaining first-principles approaches that allow their derivation. Here we show that patterns of collective decisions can be derived from the basic ability of animals to make probabilistic estimations in the presence of uncertainty. We build a decision-making model with two stages: Bayesian estimation and probabilistic matching. In the first stage, each animal makes a Bayesian estimation of which behavior is best to perform taking into account personal information about the environment and social information collected by observing the behaviors of other animals. In the probability matching stage, each animal chooses a behavior with a probability equal to the Bayesian-estimated probability that this behavior is the most appropriate one. This model derives very simple rules of interaction in animal collectives that depend only on two types of reliability parameters, one that each animal assigns to the other animals and another given by the quality of the non-social information. We test our model by obtaining theoretically a rich set of observed collective patterns of decisions in three-spined sticklebacks, Gasterosteus aculeatus, a shoaling fish species. The quantitative link shown between probabilistic estimation and collective rules of behavior allows a better contact with other fields such as foraging, mate selection, neurobiology and psychology, and gives predictions for experiments directly testing the relationship between estimation and collective behavior.
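    A simplified sketch of the two-stage rule described above, not the published functional form: stage one combines a private log-likelihood ratio with the observed choices of other animals (weighted by an assumed reliability), and stage two chooses an option with probability equal to the estimated probability that it is best. The parameter names and values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

def choice_probability(llr_private, n_x, n_y, s):
    """Stage 1: Bayesian estimate that option X is the better one.

    llr_private : log-likelihood ratio from the animal's own (non-social) information
    n_x, n_y    : numbers of other animals already observed choosing X and Y
    s           : reliability assigned to each observed conspecific (>1 = informative)
    """
    log_odds = llr_private + (n_x - n_y) * np.log(s)
    return 1.0 / (1.0 + np.exp(-log_odds))

def decide(llr_private, n_x, n_y, s):
    """Stage 2: probability matching -- choose X with the estimated probability."""
    return "X" if rng.uniform() < choice_probability(llr_private, n_x, n_y, s) else "Y"

# A small sequential demo: ten animals choose one after another,
# each seeing the choices made before it.
n_x = n_y = 0
for animal in range(10):
    choice = decide(llr_private=0.2, n_x=n_x, n_y=n_y, s=2.5)
    n_x += choice == "X"
    n_y += choice == "Y"
    print(f"animal {animal}: chooses {choice}   (counts so far X={n_x}, Y={n_y})")
```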

  10. Bayesian modeling of flexible cognitive control.

    Science.gov (United States)

    Jiang, Jiefeng; Heller, Katherine; Egner, Tobias

    2014-10-01

    "Cognitive control" describes endogenous guidance of behavior in situations where routine stimulus-response associations are suboptimal for achieving a desired goal. The computational and neural mechanisms underlying this capacity remain poorly understood. We examine recent advances stemming from the application of a Bayesian learner perspective that provides optimal prediction for control processes. In reviewing the application of Bayesian models to cognitive control, we note that an important limitation in current models is a lack of a plausible mechanism for the flexible adjustment of control over conflict levels changing at varying temporal scales. We then show that flexible cognitive control can be achieved by a Bayesian model with a volatility-driven learning mechanism that modulates dynamically the relative dependence on recent and remote experiences in its prediction of future control demand. We conclude that the emergent Bayesian perspective on computational mechanisms of cognitive control holds considerable promise, especially if future studies can identify neural substrates of the variables encoded by these models, and determine the nature (Bayesian or otherwise) of their neural implementation. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Bayesian conformational analysis of ring molecules through reversible jump MCMC

    DEFF Research Database (Denmark)

    Nolsøe, Kim; Kessler, Mathieu; Pérez, José

    2005-01-01

    In this paper we address the problem of classifying the conformations of m-membered rings using experimental observations obtained by crystal structure analysis. We formulate a model for the data generation mechanism that consists of a multidimensional mixture model. We perform inference for the proportions and the components in a Bayesian framework, implementing a reversible jump MCMC algorithm to obtain samples of the posterior distributions. The method is illustrated on a simulated data set and on real data corresponding to cyclo-octane structures.

  12. Bayesian Parameter Estimation via Filtering and Functional Approximations

    KAUST Repository

    Matthies, Hermann G.

    2016-11-25

    The inverse problem of determining parameters in a model by comparing some output of the model with observations is addressed. We describe what has to be done to use the Gauss-Markov-Kalman filter for the Bayesian estimation and updating of parameters in a computational model. This is a filter acting on random variables, and while its Monte Carlo variant --- the Ensemble Kalman Filter (EnKF) --- is fairly straightforward, we subsequently only sketch its implementation with the help of functional representations.
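    A bare-bones sketch of the Monte Carlo (EnKF) variant mentioned above, applied to parameter estimation with a linear toy forward model; the functional-approximation version discussed in the abstract is not shown. The forward map, noise level and ensemble size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

def forward(q):
    """Toy forward model mapping a 2-parameter vector to 3 observables."""
    H = np.array([[1.0, 0.5],
                  [0.3, 1.0],
                  [0.8, 0.8]])
    return H @ q

# Truth, noisy data and observation-error covariance
q_true = np.array([1.0, -0.5])
R = 0.05**2 * np.eye(3)
y = forward(q_true) + rng.multivariate_normal(np.zeros(3), R)

# Prior ensemble of parameters and the corresponding predicted data
N = 500
Q = rng.multivariate_normal(mean=[0.0, 0.0], cov=np.eye(2), size=N)   # (N, 2)
Yp = np.array([forward(q) for q in Q])                                # (N, 3)

# Ensemble covariances and Kalman gain
Qc = Q - Q.mean(axis=0)
Yc = Yp - Yp.mean(axis=0)
C_qy = Qc.T @ Yc / (N - 1)          # parameter/data cross-covariance  (2 x 3)
C_yy = Yc.T @ Yc / (N - 1)          # predicted-data covariance        (3 x 3)
K = C_qy @ np.linalg.inv(C_yy + R)

# Perturbed-observation update of every ensemble member
y_pert = y + rng.multivariate_normal(np.zeros(3), R, size=N)
Q_post = Q + (y_pert - Yp) @ K.T
print("posterior mean:", Q_post.mean(axis=0), " truth:", q_true)
```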

  13. A Bayesian Network View on Nested Effects Models

    Directory of Open Access Journals (Sweden)

    Fröhlich Holger

    2009-01-01

    Full Text Available Nested effects models (NEMs) are a class of probabilistic models that were designed to reconstruct a hidden signalling structure from a large set of observable effects caused by active interventions into the signalling pathway. We give a more flexible formulation of NEMs in the language of Bayesian networks. Our framework constitutes a natural generalization of the original NEM model, since it explicitly states the assumptions that are tacitly underlying the original version. Our approach gives rise to new learning methods for NEMs, which have been implemented in the R/Bioconductor package nem. We validate these methods in a simulation study and apply them to a synthetic lethality dataset in yeast.

  14. Bayesian analysis of MEG visual evoked responses

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, D.M.; George, J.S.; Wood, C.C.

    1999-04-01

    The authors developed a method for analyzing neural electromagnetic data that allows probabilistic inferences to be drawn about regions of activation. The method involves the generation of a large number of possible solutions which both fit the data and prior expectations about the nature of probable solutions, made explicit by a Bayesian formalism. In addition, they have introduced a model for the current distributions that produce MEG (and EEG) data that allows extended regions of activity, and can easily incorporate prior information such as anatomical constraints from MRI. To evaluate the feasibility and utility of the Bayesian approach with actual data, they analyzed MEG data from a visual evoked response experiment. They compared Bayesian analyses of MEG responses to visual stimuli in the left and right visual fields, in order to examine the sensitivity of the method to detect known features of human visual cortex organization. They also examined the changing pattern of cortical activation as a function of time.

  15. Bayesian markets to elicit private information.

    Science.gov (United States)

    Baillon, Aurélien

    2017-07-25

    Financial markets reveal what investors think about the future, and prediction markets are used to forecast election results. Could markets also encourage people to reveal private information, such as subjective judgments (e.g., "Are you satisfied with your life?") or unverifiable facts? This paper shows how to design such markets, called Bayesian markets. People trade an asset whose value represents the proportion of affirmative answers to a question. Their trading position then reveals their own answer to the question. The results of this paper are based on a Bayesian setup in which people use their private information (their "type") as a signal. Hence, beliefs about others' types are correlated with one's own type. Bayesian markets transform this correlation into a mechanism that rewards truth telling. These markets avoid two complications of alternative methods: they need no knowledge of prior information and no elicitation of metabeliefs regarding others' signals.

  16. Junction trees constructions in Bayesian networks

    Science.gov (United States)

    Smail, Linda

    2017-10-01

    Junction trees are used as graphical structures over which propagation will be carried out through a very important property called the running intersection property. This paper examines an alternative method for constructing junction trees, which are essential for the efficient computation of probabilities in Bayesian networks. The newly proposed method converts a sequence of subsets of a Bayesian network into a junction tree, in other words, into a set of cliques that has the running intersection property. The obtained set of cliques and separators coincides with the junction trees obtained by the moralization and triangulation process, but it has the advantage of adapting to any computational task by adding links to the Bayesian network graph.

  17. Assessment of CT image quality using a Bayesian approach

    Science.gov (United States)

    Reginatto, M.; Anton, M.; Elster, C.

    2017-08-01

    One of the most promising approaches for evaluating CT image quality is task-specific quality assessment. This involves a simplified version of a clinical task, e.g. deciding whether an image belongs to the class of images that contain the signature of a lesion or not. Task-specific quality assessment can be done by model observers, which are mathematical procedures that carry out the classification task. The most widely used figure of merit for CT image quality is the area under the ROC curve (AUC), a quantity which characterizes the performance of a given model observer. In order to estimate AUC from a finite sample of images, different approaches from classical statistics have been suggested. The goal of this paper is to introduce task-specific quality assessment of CT images to metrology and to propose a novel Bayesian estimation of AUC for the channelized Hotelling observer (CHO) applied to the task of detecting a lesion at a known image location. It is assumed that signal-present and signal-absent images follow multivariate normal distributions with the same covariance matrix. The Bayesian approach results in a posterior distribution for the AUC of the CHO which provides in addition a complete characterization of the uncertainty of this figure of merit. The approach is illustrated by its application to both simulated and experimental data.
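    A compact numerical sketch of the figure of merit (the classical AUC of a channelized Hotelling observer, not the paper's Bayesian posterior over AUC): multivariate-normal channel outputs with a common covariance stand in for channelized CT images, the Hotelling template is estimated on a training half, and the AUC is computed both empirically and under the binormal model. All dimensions and parameters are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)

# Simulated channel outputs (the channelization of the CT images is skipped here):
# both classes share a covariance and the lesion shifts the mean, as assumed above.
n_ch, n_train, n_test = 10, 400, 400
A = rng.normal(size=(n_ch, n_ch))
cov = A @ A.T / n_ch + np.eye(n_ch)            # common channel covariance
delta = 0.3 * rng.normal(size=n_ch)            # mean shift produced by the lesion

absent = rng.multivariate_normal(np.zeros(n_ch), cov, size=n_train + n_test)
present = rng.multivariate_normal(delta, cov, size=n_train + n_test)

# Hotelling template estimated from the training half
S = 0.5 * (np.cov(absent[:n_train].T) + np.cov(present[:n_train].T))
w = np.linalg.solve(S, present[:n_train].mean(0) - absent[:n_train].mean(0))

# Decision variables on the test half and the empirical AUC (Mann-Whitney statistic)
t_absent = absent[n_train:] @ w
t_present = present[n_train:] @ w
auc_emp = (t_present[:, None] > t_absent[None, :]).mean()

# Analytic AUC under the binormal model with equal variances: AUC = Phi(d'/sqrt(2))
d_prime = (t_present.mean() - t_absent.mean()) / np.sqrt(
    0.5 * (t_present.var() + t_absent.var()))
print(f"empirical AUC {auc_emp:.3f}, binormal AUC {stats.norm.cdf(d_prime / np.sqrt(2)):.3f}")
```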

  18. A Bayesian framework for simultaneously modeling neural and behavioral data.

    Science.gov (United States)

    Turner, Brandon M; Forstmann, Birte U; Wagenmakers, Eric-Jan; Brown, Scott D; Sederberg, Per B; Steyvers, Mark

    2013-05-15

    Scientists who study cognition infer underlying processes either by observing behavior (e.g., response times, percentage correct) or by observing neural activity (e.g., the BOLD response). These two types of observations have traditionally supported two separate lines of study. The first is led by cognitive modelers, who rely on behavior alone to support their computational theories. The second is led by cognitive neuroimagers, who rely on statistical models to link patterns of neural activity to experimental manipulations, often without any attempt to make a direct connection to an explicit computational theory. Here we present a flexible Bayesian framework for combining neural and cognitive models. Joining neuroimaging and computational modeling in a single hierarchical framework allows the neural data to influence the parameters of the cognitive model and allows behavioral data, even in the absence of neural data, to constrain the neural model. Critically, our Bayesian approach can reveal interactions between behavioral and neural parameters, and hence between neural activity and cognitive mechanisms. We demonstrate the utility of our approach with applications to simulated fMRI data with a recognition model and to diffusion-weighted imaging data with a response time model of perceptual choice. Copyright © 2013 Elsevier Inc. All rights reserved.

  19. A Bayesian framework for simultaneously modeling neural and behavioral data✩

    Science.gov (United States)

    Turner, Brandon M.; Forstmann, Birte U.; Wagenmakers, Eric-Jan; Brown, Scott D.; Sederberg, Per B.; Steyvers, Mark

    2013-01-01

    Scientists who study cognition infer underlying processes either by observing behavior (e.g., response times, percentage correct) or by observing neural activity (e.g., the BOLD response). These two types of observations have traditionally supported two separate lines of study. The first is led by cognitive modelers, who rely on behavior alone to support their computational theories. The second is led by cognitive neuroimagers, who rely on statistical models to link patterns of neural activity to experimental manipulations, often without any attempt to make a direct connection to an explicit computational theory. Here we present a flexible Bayesian framework for combining neural and cognitive models. Joining neuroimaging and computational modeling in a single hierarchical framework allows the neural data to influence the parameters of the cognitive model and allows behavioral data, even in the absence of neural data, to constrain the neural model. Critically, our Bayesian approach can reveal interactions between behavioral and neural parameters, and hence between neural activity and cognitive mechanisms. We demonstrate the utility of our approach with applications to simulated fMRI data with a recognition model and to diffusion-weighted imaging data with a response time model of perceptual choice. PMID:23370060

  20. Geometrical destabilization, premature end of inflation and Bayesian model selection

    Science.gov (United States)

    Renaux-Petel, Sébastien; Turzyński, Krzysztof; Vennin, Vincent

    2017-11-01

    By means of Bayesian techniques, we study how a premature ending of inflation, motivated by geometrical destabilization, affects the observational evidences of typical inflationary models. Large-field models are worsened, and inflection point potentials are drastically improved for a specific range of the field-space curvature characterizing the geometrical destabilization. For other models we observe shifts in the preferred values of the model parameters. For quartic hilltop models for instance, contrary to the standard case, we find preference for theoretically natural sub-Planckian hill widths. Eventually, the Bayesian ranking of models becomes substantially reordered with a premature end of inflation. Such a phenomenon also modifies the constraints on the reheating expansion history, which has to be properly accounted for since it determines the position of the observational window with respect to the end of inflation. Our results demonstrate how the interpretation of cosmological data in terms of fundamental physics is considerably modified in the presence of premature end of inflation mechanisms.

  1. Length Scales in Bayesian Automatic Adaptive Quadrature

    Directory of Open Access Journals (Sweden)

    Adam Gh.

    2016-01-01

    Full Text Available Two conceptual developments in the Bayesian automatic adaptive quadrature approach to the numerical solution of one-dimensional Riemann integrals [Gh. Adam, S. Adam, Springer LNCS 7125, 1–16 (2012)] are reported. First, it is shown that the numerical quadrature which avoids the overcomputing and minimizes the hidden floating point loss of precision asks for the consideration of three classes of integration domain lengths endowed with specific quadrature sums: microscopic (trapezoidal rule), mesoscopic (Simpson rule), and macroscopic (quadrature sums of high algebraic degrees of precision). Second, sensitive diagnostic tools for the Bayesian inference on macroscopic ranges, coming from the use of Clenshaw-Curtis quadrature, are derived.

  2. Bayesian estimation and tracking a practical guide

    CERN Document Server

    Haug, Anton J

    2012-01-01

    A practical approach to estimating and tracking dynamic systems in real-world applications. Much of the literature on performing estimation for non-Gaussian systems is short on practical methodology, while Gaussian methods often lack a cohesive derivation. Bayesian Estimation and Tracking addresses the gap in the field on both accounts, providing readers with a comprehensive overview of methods for estimating both linear and nonlinear dynamic systems driven by Gaussian and non-Gaussian noises. Featuring a unified approach to Bayesian estimation and tracking, the book emphasizes the derivation

  3. Length Scales in Bayesian Automatic Adaptive Quadrature

    Science.gov (United States)

    Adam, Gh.; Adam, S.

    2016-02-01

    Two conceptual developments in the Bayesian automatic adaptive quadrature approach to the numerical solution of one-dimensional Riemann integrals [Gh. Adam, S. Adam, Springer LNCS 7125, 1-16 (2012)] are reported. First, it is shown that the numerical quadrature which avoids the overcomputing and minimizes the hidden floating point loss of precision asks for the consideration of three classes of integration domain lengths endowed with specific quadrature sums: microscopic (trapezoidal rule), mesoscopic (Simpson rule), and macroscopic (quadrature sums of high algebraic degrees of precision). Second, sensitive diagnostic tools for the Bayesian inference on macroscopic ranges, coming from the use of Clenshaw-Curtis quadrature, are derived.
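    A minimal sketch of the idea of dispatching quadrature sums by integration-domain length class is given below; the length thresholds and the 16-point Gauss-Legendre rule standing in for the high-degree quadrature sums are illustrative placeholders, not the criteria derived in the paper.

```python
import numpy as np

def integrate_by_length_class(f, a, b, micro=1e-6, meso=1e-2):
    """Illustrative dispatch of quadrature rules by integration-domain length:
    trapezoidal rule on 'microscopic' ranges, Simpson's rule on 'mesoscopic'
    ranges, and a higher-degree Gauss-Legendre sum on 'macroscopic' ranges.
    The thresholds `micro` and `meso` are placeholders, not the paper's values."""
    h = b - a
    if h <= micro:                       # microscopic: trapezoidal rule
        return 0.5 * h * (f(a) + f(b))
    if h <= meso:                        # mesoscopic: Simpson rule
        return h / 6.0 * (f(a) + 4.0 * f(0.5 * (a + b)) + f(b))
    # macroscopic: quadrature sum of high algebraic degree of precision
    nodes, weights = np.polynomial.legendre.leggauss(16)
    x = 0.5 * (b - a) * nodes + 0.5 * (a + b)
    return 0.5 * (b - a) * np.dot(weights, f(x))

print(integrate_by_length_class(np.sin, 0.0, np.pi))  # ~2.0
```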

  4. A strongly quasiconvex PAC-Bayesian bound

    DEFF Research Database (Denmark)

    Thiemann, Niklas; Igel, Christian; Wintenberger, Olivier

    We propose a new PAC-Bayesian bound and a way of constructing a hypothesis space, so that the bound is convex in the posterior distribution and also convex in a trade-off parameter between empirical performance of the posterior distribution and its complexity. The complexity is measured by the Ku...

  5. COBRA: a Bayesian approach to pulsar searching

    Science.gov (United States)

    Lentati, L.; Champion, D. J.; Kramer, M.; Barr, E.; Torne, P.

    2018-02-01

    We introduce COBRA, a GPU-accelerated Bayesian analysis package for performing pulsar searching, that uses candidates from traditional search techniques to set the prior used for the periodicity of the source, and performs a blind search in all remaining parameters. COBRA incorporates models for both isolated and accelerated systems, as well as both Keplerian and relativistic binaries, and exploits pulse phase information to combine search epochs coherently, over time, frequency or across multiple telescopes. We demonstrate the efficacy of our approach in a series of simulations that challenge typical search techniques, including highly aliased signals, and relativistic binary systems. In the most extreme case, we simulate an 8 h observation containing 24 orbits of a pulsar in a binary with a 30 M⊙ companion. Even in this scenario we show that we can build up from an initial low-significance candidate, to fully recovering the signal. We also apply the method to survey data of three pulsars from the globular cluster 47Tuc: PSRs J0024-7204D, J0023-7203J and J0024-7204R. This final pulsar is in a 1.6 h binary, the shortest of any pulsar in 47Tuc, and additionally shows significant scintillation. By allowing the amplitude of the source to vary as a function of time, however, we show that we are able to obtain optimal combinations of such noisy data. We also demonstrate the ability of COBRA to perform high-precision pulsar timing directly on the single pulse survey data, and obtain a 95 per cent upper limit on the eccentricity of PSR J0024-7204R of εb < 0.0007.

  6. Bayesian methods to estimate urban growth potential

    Science.gov (United States)

    Smith, Jordan W.; Smart, Lindsey S.; Dorning, Monica; Dupéy, Lauren Nicole; Méley, Andréanne; Meentemeyer, Ross K.

    2017-01-01

    Urban growth often influences the production of ecosystem services. The impacts of urbanization on landscapes can subsequently affect landowners’ perceptions, values and decisions regarding their land. Within land-use and land-change research, very few models of dynamic landscape-scale processes like urbanization incorporate empirically-grounded landowner decision-making processes. Very little attention has focused on the heterogeneous decision-making processes that aggregate to influence broader-scale patterns of urbanization. We examine the land-use tradeoffs faced by individual landowners in one of the United States’ most rapidly urbanizing regions − the urban area surrounding Charlotte, North Carolina. We focus on the land-use decisions of non-industrial private forest owners located across the region’s development gradient. A discrete choice experiment is used to determine the critical factors influencing individual forest owners’ intent to sell their undeveloped properties across a series of experimentally varied scenarios of urban growth. Data are analyzed using a hierarchical Bayesian approach. The estimates derived from the survey data are used to modify a spatially-explicit trend-based urban development potential model, derived from remotely-sensed imagery and observed changes in the region’s socioeconomic and infrastructural characteristics between 2000 and 2011. This modeling approach combines the theoretical underpinnings of behavioral economics with spatiotemporal data describing a region’s historical development patterns. By integrating empirical social preference data into spatially-explicit urban growth models, we begin to more realistically capture processes as well as patterns that drive the location, magnitude and rates of urban growth.

  7. Comparison of the Bayesian and Frequentist Approach to the Statistics

    OpenAIRE

    Hakala, Michal

    2015-01-01

    The thesis deals with an introduction to Bayesian statistics and compares the Bayesian approach with the frequentist approach to statistics. Bayesian statistics is a modern branch of statistics which provides an alternative, comprehensive theory to the frequentist approach. Bayesian concepts provide solutions for problems not solvable by frequentist theory. The thesis compares definitions, concepts and the quality of statistical inference. The main interest is focused on point estimation, an in...

  8. A Fault Diagnosis Methodology for Gear Pump Based on EEMD and Bayesian Network.

    Science.gov (United States)

    Liu, Zengkai; Liu, Yonghong; Shan, Hongkai; Cai, Baoping; Huang, Qing

    2015-01-01

    This paper proposes a fault diagnosis methodology for a gear pump based on the ensemble empirical mode decomposition (EEMD) method and the Bayesian network. Essentially, the presented scheme is a multi-source information fusion based methodology. Compared with conventional fault diagnosis using EEMD only, the proposed method is able to take advantage of all useful information besides sensor signals. The presented diagnostic Bayesian network consists of a fault layer, a fault feature layer and a multi-source information layer. Vibration signals from sensor measurements are decomposed by the EEMD method and the energies of the intrinsic mode functions (IMFs) are calculated as fault features. These features are added to the fault feature layer in the Bayesian network. The other sources of useful information are added to the information layer. The generalized three-layer Bayesian network can be developed by fully incorporating faults and fault symptoms as well as other useful information such as naked-eye inspection and maintenance records. Therefore, diagnostic accuracy and capacity can be improved. The proposed methodology is applied to the fault diagnosis of a gear pump, and the structure and parameters of the Bayesian network are established. Compared with artificial neural network and support vector machine classification algorithms, the proposed model has the best diagnostic performance when only sensor data are used. A case study has demonstrated that some information from human observation or system repair records is very helpful to the fault diagnosis. It is effective and efficient in diagnosing faults based on uncertain, incomplete information.
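    To illustrate the flavour of the diagnostic inference described above, the sketch below computes a posterior over fault hypotheses from binary evidence under a naive conditional-independence assumption; the fault labels, features and probabilities are invented and far simpler than the paper's three-layer network.

```python
import numpy as np

def fault_posterior(prior, likelihoods, evidence):
    """Toy posterior over fault hypotheses given binary feature evidence,
    assuming conditional independence of the features (naive-Bayes style);
    all numbers are invented for illustration, not from the gear-pump study."""
    post = np.asarray(prior, dtype=float).copy()
    for feature, observed in evidence.items():
        p1 = np.asarray(likelihoods[feature], dtype=float)  # P(feature = 1 | fault)
        post *= p1 if observed else (1.0 - p1)
    return post / post.sum()

faults = ["normal", "gear wear", "bearing fault"]
prior = [0.90, 0.06, 0.04]
likelihoods = {"imf_energy_high":    [0.05, 0.80, 0.60],
               "maintenance_overdue": [0.10, 0.50, 0.70]}
post = fault_posterior(prior, likelihoods,
                       {"imf_energy_high": True, "maintenance_overdue": True})
print(dict(zip(faults, post.round(3))))
```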

  9. Probabilistic inference: Task dependency and individual differences of probability weighting revealed by hierarchical Bayesian modelling

    Directory of Open Access Journals (Sweden)

    Moritz eBoos

    2016-05-01

    Full Text Available Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modelling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behaviour. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model’s success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modelling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modelling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.

  10. Probabilistic Inference: Task Dependency and Individual Differences of Probability Weighting Revealed by Hierarchical Bayesian Modeling.

    Science.gov (United States)

    Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno

    2016-01-01

    Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.
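    As an illustration of an inverted S-shaped probability weighting function of the kind discussed above, the sketch below implements the Prelec family; this is one common parameterization and not necessarily the one fitted in the study.

```python
import numpy as np

def prelec_weight(p, gamma):
    """Prelec probability weighting function w(p) = exp(-(-ln p)**gamma);
    gamma < 1 gives the inverted S-shape (overweighting small probabilities,
    underweighting large ones)."""
    p = np.asarray(p, dtype=float)
    return np.exp(-(-np.log(p)) ** gamma)

probs = np.array([0.05, 0.25, 0.50, 0.75, 0.95])
print(prelec_weight(probs, gamma=0.6))  # small p are weighted up, large p down
```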

  11. A default Bayesian hypothesis test for ANOVA designs

    NARCIS (Netherlands)

    Wetzels, R.; Grasman, R.P.P.P.; Wagenmakers, E.J.

    2012-01-01

    This article presents a Bayesian hypothesis test for analysis of variance (ANOVA) designs. The test is an application of standard Bayesian methods for variable selection in regression models. We illustrate the effect of various g-priors on the ANOVA hypothesis test. The Bayesian test for ANOVA

  12. From arguments to constraints on a Bayesian network

    NARCIS (Netherlands)

    Bex, F.J.; Renooij, S.

    2016-01-01

    In this paper, we propose a way to derive constraints for a Bayesian Network from structured arguments. Argumentation and Bayesian networks can both be considered decision support techniques, but are typically used by experts with different backgrounds. Bayesian network experts have the mathematical

  13. Current Observational Constraints on Cosmic Doomsday

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yun

    2004-09-14

    In a broad class of dark energy models, the universe may collapse within a finite time t_c. Here we study a representative model of dark energy with a linear potential, V(φ) = V_0(1 + αφ). This model is the simplest doomsday model, in which the universe collapses rather quickly after it stops expanding. Observational data from type Ia supernovae (SNe Ia), cosmic microwave background anisotropy (CMB), and large scale structure (LSS) are complementary in constraining dark energy models. Using the new SN Ia data (Riess sample), the CMB data from WMAP, and the LSS data from 2dF, we find that the collapse time of the universe is t_c ≳ 42 (24) gigayears from today at 68% (95%) confidence.

  14. Basic and Advanced Bayesian Structural Equation Modeling With Applications in the Medical and Behavioral Sciences

    CERN Document Server

    Lee, Sik-Yum

    2012-01-01

    This book provides clear instructions to researchers on how to apply Structural Equation Models (SEMs) for analyzing the interrelationships between observed and latent variables. Basic and Advanced Bayesian Structural Equation Modeling introduces basic and advanced SEMs for analyzing various kinds of complex data, such as ordered and unordered categorical data, multilevel data, mixture data, longitudinal data, highly non-normal data, as well as some of their combinations. In addition, Bayesian semiparametric SEMs to capture the true distribution of explanatory latent variables are introduced

  15. A Bayesian network fault diagnostic system for proton exchange membrane fuel cells

    Science.gov (United States)

    Riascos, Luis Alberto M.; Simoes, Marcelo G.; Miyagi, Paulo E.

    This paper considers the effects of different types of faults on a proton exchange membrane fuel cell model (PEMFC). Using databases (which record the fault effects) and probabilistic methods (such as the Bayesian-Score and Markov Chain Monte Carlo), a graphical-probabilistic structure for fault diagnosis is constructed. The graphical model defines the cause-effect relationship among the variables, and the probabilistic method captures the numerical dependence among these variables. Finally, the Bayesian network (i.e. the graphical-probabilistic structure) is used to execute the diagnosis of fault causes in the PEMFC model based on the effects observed.

  16. A Bayesian network fault diagnostic system for proton exchange membrane fuel cells

    Energy Technology Data Exchange (ETDEWEB)

    Riascos, Luis Alberto M. [Federal University of ABC, r. Santa Adelia, 166, CEP 09210-170, Santo Andre, SP (Brazil); Simoes, Marcelo G. [Colorado School of Mines, 1500 Illinois Street, 80401, Golden, CO (United States); Miyagi, Paulo E. [Escola Politecnica, University of Sao Paulo, Av. Prof. Mello Moraes, 2231, CEP 05508-900, Sao Paulo (Brazil)

    2007-02-25

    This paper considers the effects of different types of faults on a proton exchange membrane fuel cell model (PEMFC). Using databases (which record the fault effects) and probabilistic methods (such as the Bayesian-Score and Markov Chain Monte Carlo), a graphical-probabilistic structure for fault diagnosis is constructed. The graphical model defines the cause-effect relationship among the variables, and the probabilistic method captures the numerical dependence among these variables. Finally, the Bayesian network (i.e. the graphical-probabilistic structure) is used to execute the diagnosis of fault causes in the PEMFC model based on the effects observed. (author)

  17. Bayesian truncation errors in chiral effective field theory: model checking and accounting for correlations

    Science.gov (United States)

    Melendez, Jordan; Wesolowski, Sarah; Furnstahl, Dick

    2017-09-01

    Chiral effective field theory (EFT) predictions are necessarily truncated at some order in the EFT expansion, which induces an error that must be quantified for robust statistical comparisons to experiment. A Bayesian model yields posterior probability distribution functions for these errors based on expectations of naturalness encoded in Bayesian priors and the observed order-by-order convergence pattern of the EFT. As a general example of a statistical approach to truncation errors, the model was applied to chiral EFT for neutron-proton scattering using various semi-local potentials of Epelbaum, Krebs, and Meißner (EKM). Here we discuss how our model can learn correlation information from the data and how to perform Bayesian model checking to validate that the EFT is working as advertised. Supported in part by NSF PHY-1614460 and DOE NUCLEI SciDAC DE-SC0008533.

  18. A Bayesian network driven approach to model the transcriptional response to nitric oxide in Saccharomyces cerevisiae.

    Directory of Open Access Journals (Sweden)

    Jingchun Zhu

    Full Text Available The transcriptional response to exogenously supplied nitric oxide in Saccharomyces cerevisiae was modeled using an integrated framework of Bayesian network learning and experimental feedback. A Bayesian network learning algorithm was used to generate network models of transcriptional output, followed by model verification and revision through experimentation. Using this framework, we generated a network model of the yeast transcriptional response to nitric oxide and a panel of other environmental signals. We discovered two environmental triggers, the diauxic shift and glucose repression, that affected the observed transcriptional profile. The computational method predicted the transcriptional control of yeast flavohemoglobin YHB1 by glucose repression, which was subsequently experimentally verified. A freely available software application, ExpressionNet, was developed to derive Bayesian network models from a combination of gene expression profile clusters, genetic information and experimental conditions.

  19. Risk Forecasting of Karachi Stock Exchange: A Comparison of Classical and Bayesian GARCH Models

    Directory of Open Access Journals (Sweden)

    Farhat Iqbal

    2016-09-01

    Full Text Available This paper is concerned with the estimation, forecasting and evaluation of Value-at-Risk (VaR) of the Karachi Stock Exchange before and after the global financial crisis of 2008 using the Bayesian method. The generalized autoregressive conditional heteroscedastic (GARCH) models under the assumption of normal and heavy-tailed errors are used to forecast one-day-ahead risk estimates. Various measures and backtesting methods are employed to evaluate VaR forecasts. The observed number of VaR violations using the Bayesian method is found to be close to the expected number of violations. The losses are also found to be smaller than those from the competing Maximum Likelihood method. The results show that the Bayesian method produces accurate and reliable VaR forecasts and can be preferred over other methods.
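    A minimal sketch of the violation-based backtesting mentioned above is given below; the simulated returns, the constant VaR forecast and the sign convention are illustrative assumptions, not the paper's KSE data or models.

```python
import numpy as np

def var_violations(returns, var_forecasts, alpha=0.01):
    """Count Value-at-Risk violations (days on which the realized loss exceeds
    the forecast VaR) and compare with the expected count n * alpha.
    Assumed sign convention: `var_forecasts` are positive loss thresholds."""
    returns = np.asarray(returns)
    var_forecasts = np.asarray(var_forecasts)
    hits = returns < -var_forecasts          # loss worse than the forecast threshold
    observed = int(hits.sum())
    expected = alpha * len(returns)
    return observed, expected

# Toy usage with simulated heavy-tailed returns (not market data)
rng = np.random.default_rng(0)
r = rng.standard_t(df=5, size=1000) * 0.01
v = np.full(1000, 0.03)                      # constant 3% VaR forecast for illustration
print(var_violations(r, v, alpha=0.01))
```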

  20. Bayesian Maximum Entropy prediction of soil categories using a traditional soil map as soft information.

    NARCIS (Netherlands)

    Brus, D.J.; Bogaert, P.; Heuvelink, G.B.M.

    2008-01-01

    Bayesian Maximum Entropy was used to estimate the probabilities of occurrence of soil categories in the Netherlands, and to simulate realizations from the associated multi-point pdf. Besides the hard observations (H) of the categories at 8369 locations, the soil map of the Netherlands 1:50 000 was

  1. Bayesian Network Meta-Analysis for Unordered Categorical Outcomes with Incomplete Data

    Science.gov (United States)

    Schmid, Christopher H.; Trikalinos, Thomas A.; Olkin, Ingram

    2014-01-01

    We develop a Bayesian multinomial network meta-analysis model for unordered (nominal) categorical outcomes that allows for partially observed data in which exact event counts may not be known for each category. This model properly accounts for correlations of counts in mutually exclusive categories and enables proper comparison and ranking of…

  2. Group Tracking of Space Objects within Bayesian Framework

    Directory of Open Access Journals (Sweden)

    Huang Jian

    2013-03-01

    Full Text Available It is imperative to efficiently track and catalogue the extensive, densely clustered groups of space objects for space surveillance. As the main instrument for Low Earth Orbit (LEO) space surveillance, ground-based radar systems are usually limited by their resolving power when tracking small, densely populated space debris. Consequently, much of the target detection and observation information is missed, which makes traditional tracking methods inefficient. We therefore conceived the concept of group tracking: the overall motion tendency of the group of objects is the particular focus, while each individual object is still effectively tracked. The tracking procedure is based on the Bayesian framework. By exploiting the constraints between the group center and the multi-target observations, the accuracy and robustness of reconstructing the number of targets and estimating individual trajectories can be greatly improved in the presence of a high rate of missed detections. The Markov Chain Monte Carlo Particle (MCMC-Particle) algorithm is utilized for solving the Bayesian integration problem. Finally, a simulation of group space-object tracking is carried out to validate the efficiency of the proposed method.

  3. MAP estimators and their consistency in Bayesian nonparametric inverse problems

    KAUST Repository

    Dashti, M.

    2013-09-01

    We consider the inverse problem of estimating an unknown function u from noisy measurements y of a known, possibly nonlinear, map applied to u. We adopt a Bayesian approach to the problem and work in a setting where the prior measure is specified as a Gaussian random field μ0. We work under a natural set of conditions on the likelihood which implies the existence of a well-posed posterior measure, μy. Under these conditions, we show that the maximum a posteriori (MAP) estimator is well defined as the minimizer of an Onsager-Machlup functional defined on the Cameron-Martin space of the prior; thus, we link a problem in probability with a problem in the calculus of variations. We then consider the case where the observational noise vanishes and establish a form of Bayesian posterior consistency for the MAP estimator. We also prove a similar result for the case where the observation of u can be repeated as many times as desired with independent identically distributed noise. The theory is illustrated with examples from an inverse problem for the Navier-Stokes equation, motivated by problems arising in weather forecasting, and from the theory of conditioned diffusions, motivated by problems arising in molecular dynamics. © 2013 IOP Publishing Ltd.
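    For readers unfamiliar with the variational characterization referred to above, the following sketch states the Onsager-Machlup functional whose minimizer defines the MAP estimator in the Gaussian-prior setting; the notation (prior covariance C_0, likelihood potential Φ) is ours and kept generic.

```latex
% Sketch (generic notation): MAP estimator as minimizer of the Onsager-Machlup
% functional, for a Gaussian prior mu_0 = N(0, C_0) and likelihood potential Phi.
I(u) = \Phi(u;\,y) + \tfrac{1}{2}\,\bigl\lVert C_0^{-1/2} u \bigr\rVert^{2},
\qquad
u_{\mathrm{MAP}} = \arg\min_{u \in E} I(u),
% where E is the Cameron-Martin space of mu_0; for additive Gaussian noise
% y = G(u) + eta, eta ~ N(0, Gamma), the potential is
% Phi(u; y) = (1/2) * || Gamma^{-1/2} ( y - G(u) ) ||^2 .
```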

  4. Posterior Predictive Model Checking in Bayesian Networks

    Science.gov (United States)

    Crawford, Aaron

    2014-01-01

    This simulation study compared the utility of various discrepancy measures within a posterior predictive model checking (PPMC) framework for detecting different types of data-model misfit in multidimensional Bayesian network (BN) models. The investigated conditions were motivated by an applied research program utilizing an operational complex…

  5. Comprehension and computation in Bayesian problem solving

    Directory of Open Access Journals (Sweden)

    Eric D. Johnson

    2015-07-01

    Full Text Available Humans have long been characterized as poor probabilistic reasoners when presented with explicit numerical information. Bayesian word problems provide a well-known example of this, where even highly educated and cognitively skilled individuals fail to adhere to mathematical norms. It is widely agreed that natural frequencies can facilitate Bayesian reasoning relative to normalized formats (e.g. probabilities, percentages), both by clarifying logical set-subset relations and by simplifying numerical calculations. Nevertheless, between-study performance on transparent Bayesian problems varies widely, and generally remains rather unimpressive. We suggest there has been an over-focus on this representational facilitator (i.e. transparent problem structures) at the expense of the specific logical and numerical processing requirements and the corresponding individual abilities and skills necessary for providing Bayesian-like output given specific verbal and numerical input. We further suggest that understanding this task-individual pair could benefit from considerations from the literature on mathematical cognition, which emphasizes text comprehension and problem solving, along with contributions of online executive working memory, metacognitive regulation, and relevant stored knowledge and skills. We conclude by offering avenues for future research aimed at identifying the stages in problem solving at which correct versus incorrect reasoners depart, and how individual differences might influence this time point.
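    A toy worked example (with hypothetical numbers, not taken from the article) of why natural frequencies can make such problems more transparent than the probability format:

```python
def posterior_from_natural_frequencies(base_rate, sensitivity, false_pos_rate, n=1000):
    """Natural-frequency framing of a Bayesian word problem: out of n people,
    count those who truly have the condition and test positive versus those
    who test positive without it. Numbers below are purely illustrative."""
    true_pos = n * base_rate * sensitivity
    false_pos = n * (1 - base_rate) * false_pos_rate
    return true_pos / (true_pos + false_pos)

# Probability format: P(H|D) = P(D|H)P(H) / [P(D|H)P(H) + P(D|~H)P(~H)]
p = (0.8 * 0.01) / (0.8 * 0.01 + 0.096 * 0.99)
# Natural-frequency format: 8 of the 10 affected and ~95 of the 990 unaffected test positive
nf = posterior_from_natural_frequencies(0.01, 0.8, 0.096)
print(round(p, 3), round(nf, 3))  # both ~0.078
```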

  6. Optimal Detection under the Restricted Bayesian Criterion

    Directory of Open Access Journals (Sweden)

    Shujun Liu

    2017-07-01

    Full Text Available This paper aims to find a suitable decision rule for a binary composite hypothesis-testing problem with a partial or coarse prior distribution. To alleviate the negative impact of the information uncertainty, a constraint is considered that the maximum conditional risk cannot be greater than a predefined value. Therefore, the objective of this paper becomes to find the optimal decision rule to minimize the Bayes risk under the constraint. By applying the Lagrange duality, the constrained optimization problem is transformed to an unconstrained optimization problem. In doing so, the restricted Bayesian decision rule is obtained as a classical Bayesian decision rule corresponding to a modified prior distribution. Based on this transformation, the optimal restricted Bayesian decision rule is analyzed and the corresponding algorithm is developed. Furthermore, the relation between the Bayes risk and the predefined value of the constraint is also discussed. The Bayes risk obtained via the restricted Bayesian decision rule is a strictly decreasing and convex function of the constraint on the maximum conditional risk. Finally, the numerical results including a detection example are presented and agree with the theoretical results.
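    In symbols, the restricted Bayesian problem described above can be sketched as follows; the notation is ours, with R(θ, δ) the conditional risk, π the partial prior and β the predefined cap.

```latex
% Sketch (generic notation) of the restricted Bayesian criterion: minimize the
% Bayes risk under the partial prior pi subject to a cap beta on the maximum
% conditional risk. Lagrange duality turns this constrained problem into an
% ordinary Bayes rule under a modified prior, as described in the abstract.
\min_{\delta}\; \int_{\Theta} R(\theta,\delta)\,\pi(\theta)\,d\theta
\quad \text{subject to} \quad
\max_{\theta \in \Theta} R(\theta,\delta) \le \beta .
```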

  7. Dependence discovery in modular Bayesian networks

    NARCIS (Netherlands)

    de Oude, P.; Pavlin, G.

    2009-01-01

    This paper introduces an information theoretic approach to verification of modular causal probabilistic models. We assume systems which are gradually extended by adding new functional modules, each having a limited domain knowledge captured by a local Bayesian network. Different modules originate

  8. Bayesian inference of the metazoan phylogeny

    DEFF Research Database (Denmark)

    Glenner, Henrik; Hansen, Anders J; Sørensen, Martin V

    2004-01-01

    with rigorous statistical approaches less prone to such inconsistencies. We present the first statistically founded analysis of a metazoan data set based on a combination of morphological and molecular data and compare the results with a traditional parsimony analysis. Interestingly, the Bayesian analyses...

  9. Non-Linear Approximation of Bayesian Update

    KAUST Repository

    Litvinenko, Alexander

    2016-06-23

    We develop a non-linear approximation of the expensive Bayesian update formula. This non-linear approximation is applied directly to the Polynomial Chaos Coefficients. In this way, we avoid Monte Carlo sampling and sampling error. We can show that the famous Kalman Update formula is a particular case of this update.
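    As a reminder of the special case mentioned above, the linear (affine) approximation of the conditional-expectation update reduces to the classical Kalman formula; the sketch below uses generic notation, not the paper's.

```latex
% Sketch (generic notation): the affine approximation of the conditional-expectation
% update. With forecast mean x_f, forecast covariance C_f, linear observation
% operator H and noise covariance R, it reduces to the classical Kalman update.
x_a = x_f + K\,(y - H x_f),
\qquad
K = C_{xy}\,C_{yy}^{-1} = C_f H^{\mathsf T}\,(H C_f H^{\mathsf T} + R)^{-1}.
```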

  10. Bayesian Cosmic Web Reconstruction: BARCODE for Clusters

    NARCIS (Netherlands)

    Patrick Bos, E. G.; van de Weygaert, Rien; Kitaura, Francisco; Cautun, Marius

    2016-01-01

    We describe the Bayesian BARCODE formalism that has been designed towards the reconstruction of the Cosmic Web in a given volume on the basis of the sampled galaxy cluster distribution. Based on the realization that the massive compact clusters are responsible for the major share of the large

  11. Bayesian mixture models for partially verified data

    DEFF Research Database (Denmark)

    Kostoulas, Polychronis; Browne, William J.; Nielsen, Søren Saxmose

    2013-01-01

    Bayesian mixture models can be used to discriminate between the distributions of continuous test responses for different infection stages. These models are particularly useful in case of chronic infections with a long latent period, like Mycobacterium avium subsp. paratuberculosis (MAP) infection...

  12. Sequential Bayesian technique: An alternative approach for ...

    Indian Academy of Sciences (India)

    This paper proposes a sequential Bayesian approach similar to the Kalman filter for estimating reliability growth or decay of software. The main advantage of the proposed method is that it shows the variation of the parameter over time, as new failure data become available. The usefulness of the method is demonstrated with ...

  13. Plug & Play object oriented Bayesian networks

    DEFF Research Database (Denmark)

    Bangsø, Olav; Flores, J.; Jensen, Finn Verner

    2003-01-01

    by constructing a junction tree from this network. In this paper we propose a method for translating directly from object oriented Bayesian networks to junction trees, avoiding the intermediate translation. We pursue two main purposes: firstly, to maintain the original structure organized in an instance tree...

  14. Modelling crime linkage with Bayesian networks

    NARCIS (Netherlands)

    de Zoete, J.; Sjerps, M.; Lagnado, D.; Fenton, N.

    2015-01-01

    When two or more crimes show specific similarities, such as a very distinct modus operandi, the probability that they were committed by the same offender becomes of interest. This probability depends on the degree of similarity and distinctiveness. We show how Bayesian networks can be used to model

  15. Bayesian Averaging is Well-Temperated

    DEFF Research Database (Denmark)

    Hansen, Lars Kai

    2000-01-01

    Bayesian predictions are stochastic just like predictions of any other inference scheme that generalize from a finite sample. While a simple variational argument shows that Bayes averaging is generalization optimal given that the prior matches the teacher parameter distribution the situation...

  16. Neural associative memory with optimal Bayesian learning.

    Science.gov (United States)

    Knoblauch, Andreas

    2011-06-01

    Neural associative memories are perceptron-like single-layer networks with fast synaptic learning typically storing discrete associations between pairs of neural activity patterns. Previous work optimized the memory capacity for various models of synaptic learning: linear Hopfield-type rules, the Willshaw model employing binary synapses, or the BCPNN rule of Lansner and Ekeberg, for example. Here I show that all of these previous models are limit cases of a general optimal model where synaptic learning is determined by probabilistic Bayesian considerations. Asymptotically, for large networks and very sparse neuron activity, the Bayesian model becomes identical to an inhibitory implementation of the Willshaw and BCPNN-type models. For less sparse patterns, the Bayesian model becomes identical to Hopfield-type networks employing the covariance rule. For intermediate sparseness or finite networks, the optimal Bayesian learning rule differs from the previous models and can significantly improve memory performance. I also provide a unified analytical framework to determine memory capacity at a given output noise level that links approaches based on mutual information, Hamming distance, and signal-to-noise ratio.

  17. Bayesian networks: a combined tuning heuristic

    NARCIS (Netherlands)

    Bolt, J.H.

    2016-01-01

    One of the issues in tuning an output probability of a Bayesian network by changing multiple parameters is the relative amount of the individual parameter changes. In an existing heuristic parameters are tied such that their changes induce locally a maximal change of the tuned probability. This

  18. Sequential Bayesian technique: An alternative approach for ...

    Indian Academy of Sciences (India)

    This paper proposes a sequential Bayesian approach similar to the Kalman filter for estimating reliability growth or decay of software. The main advantage of the proposed method is that it shows the variation of the parameter over time, as new failure data become ...

  19. Speech Segmentation Using Bayesian Autoregressive Changepoint Detector

    Directory of Open Access Journals (Sweden)

    P. Sovka

    1998-12-01

    Full Text Available This submission is devoted to the study of the Bayesian autoregressive changepoint detector (BCD) and its use for speech segmentation. Results of the detector application to autoregressive signals as well as to real speech are given. BCD basic properties are described and discussed. The novel two-step algorithm consisting of cepstral analysis and BCD for automatic speech segmentation is suggested.

  20. Explanation mode for Bayesian automatic object recognition

    Science.gov (United States)

    Hazlett, Thomas L.; Cofer, Rufus H.; Brown, Harold K.

    1992-09-01

    One of the more useful techniques to emerge from AI is the provision of an explanation modality used by the researcher to understand and subsequently tune the reasoning of an expert system. Such a capability, missing in the arena of statistical object recognition, is not that difficult to provide. Long-standing results show that the paradigm of Bayesian object recognition is truly optimal in a minimum probability of error sense. To a large degree, the Bayesian paradigm achieves optimality through adroit fusion of a wide range of lower-information data sources to give a higher-quality decision, a very 'expert system'-like capability. When various sources of incoming data are represented by C++ classes, it becomes possible to automatically backtrack the Bayesian data fusion process, assigning relative weights to the more significant data items and their combinations. A C++ object-oriented engine is then able to synthesize 'English'-like textual descriptions of the Bayesian reasoning suitable for generalized presentation. Key concepts and examples are provided based on an actual object recognition problem.

  1. Multisnapshot Sparse Bayesian Learning for DOA

    DEFF Research Database (Denmark)

    Gerstoft, Peter; Mecklenbrauker, Christoph F.; Xenaki, Angeliki

    2016-01-01

    The directions of arrival (DOA) of plane waves are estimated from multisnapshot sensor array data using sparse Bayesian learning (SBL). The prior for the source amplitudes is assumed independent zero-mean complex Gaussian distributed with hyperparameters, the unknown variances (i.e., the source p... The approach is discussed and evaluated competitively against LASSO (l(1)-regularization), conventional beamforming, and MUSIC.

  2. Basics of Bayesian Learning - Basically Bayes

    DEFF Research Database (Denmark)

    Larsen, Jan

    Tutorial presented at the IEEE Machine Learning for Signal Processing Workshop 2006, Maynooth, Ireland, September 8, 2006. The tutorial focuses on the basic elements of Bayesian learning and its relation to classical learning paradigms. This includes a critical discussion of the pros and cons. The theory is illustrated by specific models and examples.

  3. Bayesian Meta-Analysis of Coefficient Alpha

    Science.gov (United States)

    Brannick, Michael T.; Zhang, Nanhua

    2013-01-01

    The current paper describes and illustrates a Bayesian approach to the meta-analysis of coefficient alpha. Alpha is the most commonly used estimate of the reliability or consistency (freedom from measurement error) for educational and psychological measures. The conventional approach to meta-analysis uses inverse variance weights to combine…

  4. Face detection by aggregated Bayesian network classifiers

    NARCIS (Netherlands)

    Pham, T.V.; Worring, M.; Smeulders, A.W.M.

    2002-01-01

    A face detection system is presented. A new classification method using forest-structured Bayesian networks is used. The method is used in an aggregated classifier to discriminate face from non-face patterns. The process of generating non-face patterns is integrated with the construction of the

  5. Asymptotically informative prior for Bayesian analysis

    NARCIS (Netherlands)

    Yuan, A.; de Gooijer, J.G.

    2011-01-01

    In classical Bayesian inference the prior is treated as fixed, it is asymptotically negligible, thus any information contained in the prior is ignored from the asymptotic first order result. However, in practice often an informative prior is summarized from previous similar or the same kind of

  6. Inverse Problems in a Bayesian Setting

    KAUST Repository

    Matthies, Hermann G.

    2016-02-13

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ)—the propagation of uncertainty through a computational (forward) model—are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. We give a detailed account of this approach via conditional approximation, various approximations, and the construction of filters. Together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update in form of a filter is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and nonlinear Bayesian update in form of a filter on some examples.

  7. A Bayesian Blackboard for Information Fusion

    Science.gov (United States)

    2004-01-01

    As an architecture for intelligence analysis and data fusion this has many advantages: the blackboard is a shared workspace or "corporate memory" ... The Bayesian blackboard architecture presented here, called AIID, serves both as a prototype system for intelligence analysis and as a laboratory for testing mathematical models of the economics of intelligence analysis.

  8. Diagnosis of Subtraction Bugs Using Bayesian Networks

    Science.gov (United States)

    Lee, Jihyun; Corter, James E.

    2011-01-01

    Diagnosis of misconceptions or "bugs" in procedural skills is difficult because of their unstable nature. This study addresses this problem by proposing and evaluating a probability-based approach to the diagnosis of bugs in children's multicolumn subtraction performance using Bayesian networks. This approach assumes a causal network relating…

  9. Encoding dependence in Bayesian causal networks

    Science.gov (United States)

    Bayesian networks (BNs) represent complex, uncertain spatio-temporal dynamics by propagation of conditional probabilities between identifiable states with a testable causal interaction model. Typically, they assume random variables are discrete in time and space with a static network structure that ...

  10. Combining morphological analysis and Bayesian networks for ...

    African Journals Online (AJOL)

    Morphological analysis (MA) and Bayesian networks (BN) are two closely related modelling methods, each of which has its advantages and disadvantages for strategic decision support modelling. MA is a method for defining, linking and evaluating problem spaces. BNs are graphical models which consist of a qualitative ...

  11. Dynamic Bayesian Networks for Student Modeling

    Science.gov (United States)

    Kaser, Tanja; Klingler, Severin; Schwing, Alexander G.; Gross, Markus

    2017-01-01

    Intelligent tutoring systems adapt the curriculum to the needs of the individual student. Therefore, an accurate representation and prediction of student knowledge is essential. Bayesian Knowledge Tracing (BKT) is a popular approach for student modeling. The structure of BKT models, however, makes it impossible to represent the hierarchy and…

  12. Identification of transmissivity fields using a Bayesian strategy and perturbative approach

    Science.gov (United States)

    Zanini, Andrea; Tanda, Maria Giovanna; Woodbury, Allan D.

    2017-10-01

    The paper deals with the crucial problem of groundwater parameter estimation, which is the basis for efficient modeling and reclamation activities. A hierarchical Bayesian approach is developed: it uses Akaike's Bayesian Information Criterion in order to estimate the hyperparameters (related to the chosen covariance model) and to quantify the unknown noise variance. The transmissivity identification proceeds in two steps: the first, called empirical Bayesian interpolation, uses Y* (Y = lnT) observations to interpolate Y values on a specified grid; the second, called empirical Bayesian update, improves the previous Y estimate through the addition of hydraulic head observations. The relationship between the head and lnT has been linearized through a perturbative solution of the flow equation. In order to test the proposed approach, synthetic aquifers from the literature have been considered. The aquifers in question contain a variety of boundary conditions (both Dirichlet and Neumann type) and scales of heterogeneity (σ²Y = 1.0 and σ²Y = 5.3). The estimated transmissivity fields were compared to the true one. The joint use of Y* and head measurements improves the estimation of Y for both degrees of heterogeneity. Even though the variance of the strongly heterogeneous transmissivity field can be considered high for the application of the perturbative approach, the results show the same order of approximation as the non-linear methods proposed in the literature. The procedure makes it possible to compute the posterior probability distribution of the target quantities and to quantify the uncertainty in the model prediction. Bayesian updating has advantages related to both Monte Carlo (MC) and non-MC approaches: like MC methods, it allows direct computation of the posterior probability distribution of the target quantities, and like non-MC methods it has computational times on the order of seconds.

  13. Bayesian modeling of the mass and density of asteroids

    Science.gov (United States)

    Dotson, Jessie L.; Mathias, Donovan

    2017-10-01

    Mass and density are two of the fundamental properties of any object. In the case of near earth asteroids, knowledge about the mass of an asteroid is essential for estimating the risk due to (potential) impact and planning possible mitigation options. The density of an asteroid can illuminate the structure of the asteroid. A low density can be indicative of a rubble pile structure whereas a higher density can imply a monolith and/or higher metal content. The damage resulting from an impact of an asteroid with Earth depends on its interior structure in addition to its total mass, and as a result, density is a key parameter to understanding the risk of asteroid impact. Unfortunately, measuring the mass and density of asteroids is challenging and often results in measurements with large uncertainties. In the absence of mass / density measurements for a specific object, understanding the range and distribution of likely values can facilitate probabilistic assessments of structure and impact risk. Hierarchical Bayesian models have recently been developed to investigate the mass-radius relationship of exoplanets (Wolfgang, Rogers & Ford 2016) and to probabilistically forecast the mass of bodies large enough to establish hydrostatic equilibrium over a range of 9 orders of magnitude in mass (from planemos to main sequence stars; Chen & Kipping 2017). Here, we extend this approach to investigate the mass and densities of asteroids. Several candidate Bayesian models are presented, and their performance is assessed relative to a synthetic asteroid population. In addition, a preliminary Bayesian model for probabilistically forecasting masses and densities of asteroids is presented. The forecasting model is conditioned on existing asteroid data and includes observational errors, hyper-parameter uncertainties and intrinsic scatter.

  14. Quantum Bayesian networks with application to games displaying Parrondo's paradox

    Science.gov (United States)

    Pejic, Michael

    Bayesian networks and their accompanying graphical models are widely used for prediction and analysis across many disciplines. We will reformulate these in terms of linear maps. This reformulation will suggest a natural extension, which we will show is equivalent to standard textbook quantum mechanics. Therefore, this extension will be termed quantum. However, the term quantum should not be taken to imply this extension is necessarily only of utility in situations traditionally thought of as in the domain of quantum mechanics. In principle, it may be employed in any modelling situation, say forecasting the weather or the stock market; it is up to experiment to determine if this extension is useful in practice. Even restricting to the domain of quantum mechanics, with this new formulation the advantages of Bayesian networks can be maintained for models incorporating quantum and mixed classical-quantum behavior. The use of these will be illustrated by various basic examples. Parrondo's paradox refers to the situation where two, multi-round games with a fixed winning criteria, both with probability greater than one-half for one player to win, are combined. Using a possibly biased coin to determine the rule to employ for each round, paradoxically, the previously losing player now wins the combined game with probability greater than one-half. Using the extended Bayesian networks, we will formulate and analyze classical observed, classical hidden, and quantum versions of a game that displays this paradox, finding bounds for the discrepancy from naive expectations for the occurrence of the paradox. A quantum paradox inspired by Parrondo's paradox will also be analyzed. We will prove a bound for the discrepancy from naive expectations for this paradox as well. Games involving quantum walks that achieve this bound will be presented.

  15. Introduction to Bayesian scientific computing ten lectures on subjective computing

    CERN Document Server

    Calvetti, Daniela

    2007-01-01

    A combination of the concepts subjective – or Bayesian – statistics and scientific computing, the book provides an integrated view across numerical linear algebra and computational statistics. Inverse problems act as the bridge between these two fields where the goal is to estimate an unknown parameter that is not directly observable by using measured data and a mathematical model linking the observed and the unknown. Inverse problems are closely related to statistical inference problems, where the observations are used to infer on an underlying probability distribution. This connection between statistical inference and inverse problems is a central topic of the book. Inverse problems are typically ill-posed: small uncertainties in data may propagate in huge uncertainties in the estimates of the unknowns. To cope with such problems, efficient regularization techniques are developed in the framework of numerical analysis. The counterpart of regularization in the framework of statistical inference is the us...

  16. Fast genomic predictions via Bayesian G-BLUP and multilocus models of threshold traits including censored Gaussian data.

    Science.gov (United States)

    Kärkkäinen, Hanni P; Sillanpää, Mikko J

    2013-09-04

    Because of the increased availability of genome-wide sets of molecular markers along with reduced cost of genotyping large samples of individuals, genomic estimated breeding values have become an essential resource in plant and animal breeding. Bayesian methods for breeding value estimation have proven to be accurate and efficient; however, the ever-increasing data sets are placing heavy demands on the parameter estimation algorithms. Although a commendable number of fast estimation algorithms are available for Bayesian models of continuous Gaussian traits, there is a shortage for corresponding models of discrete or censored phenotypes. In this work, we consider a threshold approach of binary, ordinal, and censored Gaussian observations for Bayesian multilocus association models and Bayesian genomic best linear unbiased prediction and present a high-speed generalized expectation maximization algorithm for parameter estimation under these models. We demonstrate our method with simulated and real data. Our example analyses suggest that the use of the extra information present in an ordered categorical or censored Gaussian data set, instead of dichotomizing the data into case-control observations, increases the accuracy of genomic breeding values predicted by Bayesian multilocus association models or by Bayesian genomic best linear unbiased prediction. Furthermore, the example analyses indicate that the correct threshold model is more accurate than the directly used Gaussian model with a censored Gaussian data, while with a binary or an ordinal data the superiority of the threshold model could not be confirmed.
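    A sketch of the threshold (liability) model referred to above, in generic notation of our own choosing: an unobserved Gaussian liability underlies the binary or ordinal phenotype, and censored Gaussian observations fit the same scheme by observing the liability only within known bounds.

```latex
% Sketch (generic notation) of the threshold / liability model: a latent Gaussian
% liability l_i determines the observed ordinal category y_i through cut-points
% gamma_0 < gamma_1 < ... ; the binary case uses a single threshold.
l_i = \mathbf{x}_i^{\mathsf T}\boldsymbol{\beta} + \varepsilon_i,
\quad \varepsilon_i \sim N(0,\sigma^2),
\qquad
y_i = k \iff \gamma_{k-1} < l_i \le \gamma_k .
```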

  17. Bayesian and Quasi-Bayesian Estimators for Mutual Information from Discrete Data

    Directory of Open Access Journals (Sweden)

    Il Memming Park

    2013-05-01

    Full Text Available Mutual information (MI) quantifies the statistical dependency between a pair of random variables, and plays a central role in the analysis of engineering and biological systems. Estimation of MI is difficult due to its dependence on an entire joint distribution, which is difficult to estimate from samples. Here we discuss several regularized estimators for MI that employ priors based on the Dirichlet distribution. First, we discuss three “quasi-Bayesian” estimators that result from linear combinations of Bayesian estimates for conditional and marginal entropies. We show that these estimators are not in fact Bayesian, do not arise from a well-defined posterior distribution, and may in fact be negative. Second, we show that a fully Bayesian MI estimator proposed by Hutter (2002), which relies on a fixed Dirichlet prior, exhibits strong prior dependence and has large bias for small datasets. Third, we formulate a novel Bayesian estimator using a mixture-of-Dirichlets prior, with mixing weights designed to produce an approximately flat prior over MI. We examine the performance of these estimators with a variety of simulated datasets and show that, surprisingly, quasi-Bayesian estimators generally outperform our Bayesian estimator. We discuss outstanding challenges for MI estimation and suggest promising avenues for future research.
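    As a minimal illustration of Dirichlet-regularized MI estimation, the sketch below computes a smoothed plug-in estimate from a joint count table by adding a symmetric pseudo-count to every cell; this simple estimator is for illustration only and is not one of the specific estimators analysed in the paper.

```python
import numpy as np

def dirichlet_plugin_mi(counts, alpha=1.0):
    """Smoothed plug-in MI estimate: add a symmetric Dirichlet pseudo-count
    `alpha` to every cell of the joint count table, then compute the mutual
    information (in nats) of the resulting posterior-mean joint distribution."""
    counts = np.asarray(counts, dtype=float) + alpha
    p_xy = counts / counts.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    return float(np.sum(p_xy * np.log(p_xy / (p_x * p_y))))

joint_counts = np.array([[12, 3],
                         [4, 11]])
print(dirichlet_plugin_mi(joint_counts))  # MI in nats for a toy 2x2 table
```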

  18. Computational statistics using the Bayesian Inference Engine

    Science.gov (United States)

    Weinberg, Martin D.

    2013-09-01

    This paper introduces the Bayesian Inference Engine (BIE), a general parallel, optimized software package for parameter inference and model selection. This package is motivated by the analysis needs of modern astronomical surveys and the need to organize and reuse expensive derived data. The BIE is the first platform for computational statistics designed explicitly to enable Bayesian update and model comparison for astronomical problems. Bayesian update is based on the representation of high-dimensional posterior distributions using metric-ball-tree based kernel density estimation. Among its algorithmic offerings, the BIE emphasizes hybrid tempered Markov chain Monte Carlo schemes that robustly sample multimodal posterior distributions in high-dimensional parameter spaces. Moreover, the BIE implements a full persistence or serialization system that stores the full byte-level image of the running inference and previously characterized posterior distributions for later use. Two new algorithms to compute the marginal likelihood from the posterior distribution, developed for and implemented in the BIE, enable model comparison for complex models and data sets. Finally, the BIE was designed to be a collaborative platform for applying Bayesian methodology to astronomy. It includes an extensible object-oriented and easily extended framework that implements every aspect of the Bayesian inference. By providing a variety of statistical algorithms for all phases of the inference problem, a scientist may explore a variety of approaches with a single model and data implementation. Additional technical details and download details are available from http://www.astro.umass.edu/bie. The BIE is distributed under the GNU General Public License.

  19. Bayesian Correlation Analysis for Sequence Count Data.

    Directory of Open Access Journals (Sweden)

    Daniel Sánchez-Taltavull

    Full Text Available Evaluating the similarity of different measured variables is a fundamental task of statistics, and a key part of many bioinformatics algorithms. Here we propose a Bayesian scheme for estimating the correlation between different entities' measurements based on high-throughput sequencing data. These entities could be different genes or miRNAs whose expression is measured by RNA-seq, different transcription factors or histone marks whose expression is measured by ChIP-seq, or even combinations of different types of entities. Our Bayesian formulation accounts for both measured signal levels and uncertainty in those levels, due to varying sequencing depth in different experiments and to varying absolute levels of individual entities, both of which affect the precision of the measurements. In comparison with a traditional Pearson correlation analysis, we show that our Bayesian correlation analysis retains high correlations when measurement confidence is high, but suppresses correlations when measurement confidence is low, especially for entities with low signal levels. In addition, we consider the influence of priors on the Bayesian correlation estimate. Perhaps surprisingly, we show that naive, uniform priors on entities' signal levels can lead to highly biased correlation estimates, particularly when different experiments have widely varying sequencing depths. However, we propose two alternative priors that provably mitigate this problem. We also prove that, like traditional Pearson correlation, our Bayesian correlation calculation constitutes a kernel in the machine learning sense, and thus can be used as a similarity measure in any kernel-based machine learning algorithm. We demonstrate our approach on two RNA-seq datasets and one miRNA-seq dataset.
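    The following is an illustrative stand-in for the idea described above, not the paper's exact model or priors: each entity's per-experiment rate gets an independent Gamma posterior (an assumed Gamma prior combined with a Poisson likelihood scaled by sequencing depth), and the reported correlation is the Pearson correlation averaged over posterior draws of the rates, so that noisy, low-count measurements automatically weaken the estimated correlation.

```python
# Posterior-averaged correlation between two entities measured by counts
# across experiments with different sequencing depths.  Each rate gets a
# Gamma posterior (assumed Gamma(a0, b0) prior + Poisson likelihood);
# the correlation is the average Pearson correlation over posterior
# draws.  Illustrative stand-in, not the paper's exact formulation.
import numpy as np

rng = np.random.default_rng(0)

def bayes_corr(counts_a, counts_b, depths, a0=1.0, b0=1.0, ndraws=2000):
    counts_a, counts_b, depths = map(np.asarray, (counts_a, counts_b, depths))
    # Posterior for per-experiment rates: Gamma(a0 + count, b0 + depth)
    draws_a = rng.gamma(a0 + counts_a, 1.0 / (b0 + depths), size=(ndraws, len(depths)))
    draws_b = rng.gamma(a0 + counts_b, 1.0 / (b0 + depths), size=(ndraws, len(depths)))
    corrs = [np.corrcoef(a, b)[0, 1] for a, b in zip(draws_a, draws_b)]
    return float(np.mean(corrs))

depths = [1e6, 5e5, 2e6, 8e5]          # sequencing depths (reads), toy values
gene_a = [120, 50, 260, 90]            # counts for entity A
gene_b = [100, 45, 240, 70]            # counts for entity B
print(bayes_corr(gene_a, gene_b, depths))
```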

  20. Influence of Genotype on Warfarin Maintenance Dose Predictions Produced Using a Bayesian Dose Individualization Tool.

    Science.gov (United States)

    Saffian, Shamin M; Duffull, Stephen B; Roberts, Rebecca L; Tait, Robert C; Black, Leanne; Lund, Kirstin A; Thomson, Alison H; Wright, Daniel F B

    2016-12-01

    A previously established Bayesian dosing tool for warfarin was found to produce biased maintenance dose predictions. In this study, we aimed (1) to determine whether the biased warfarin dose predictions previously observed could be replicated in a new cohort of patients from 2 different clinical settings, (2) to explore the influence of CYP2C9 and VKORC1 genotype on predictive performance of the Bayesian dosing tool, and (3) to determine whether the previous population used to develop the kinetic-pharmacodynamic model underpinning the Bayesian dosing tool was sufficiently different from the test (posterior) population to account for the biased dose predictions. The warfarin maintenance doses for 140 patients were predicted using the dosing tool and compared with the observed maintenance dose. The impact of genotype was assessed by predicting maintenance doses with prior parameter values known to be altered by genetic variability (eg, EC50 for VKORC1 genotype). The prior population was evaluated by fitting the published kinetic-pharmacodynamic model, which underpins the Bayesian tool, to the observed data using NONMEM and comparing the model parameter estimates with published values. The Bayesian tool produced positively biased dose predictions in the new cohort of patients (mean prediction error [95% confidence interval]; 0.32 mg/d [0.14-0.5]). The bias was only observed in patients requiring ≥7 mg/d. The direction and magnitude of the observed bias was not influenced by genotype. The prior model provided a good fit to our data, which suggests that the bias was not caused by different prior and posterior populations. Maintenance doses for patients requiring ≥7 mg/d were overpredicted. The bias was not due to the influence of genotype nor was it related to differences between the prior and posterior populations. There is a need for a more mechanistic model that captures warfarin dose-response relationship at higher warfarin doses.

  1. Traffic Video Image Segmentation Model Based on Bayesian and Spatio-Temporal Markov Random Field

    Science.gov (United States)

    Zhou, Jun; Bao, Xu; Li, Dawei; Yin, Yongwen

    2017-10-01

    A traffic video image is a kind of dynamic image whose background and foreground change continuously, which results in occlusion. In this case, general methods have difficulty producing accurate image segmentation. A segmentation algorithm based on Bayesian inference and a spatio-temporal Markov random field (ST-MRF) is put forward. It builds energy function models of the observation field and the label field for motion sequence images with the Markov property and then, according to Bayes' rule, exploits the interaction between the label field and the observation field, that is, the relationship between the label field's prior probability and the observation field's likelihood, to obtain the maximum a posteriori estimate of the label field. The ICM algorithm is then used to extract the moving object, which completes the segmentation. Finally, segmentation with ST-MRF alone and with the Bayesian method combined with ST-MRF is analyzed. Experimental results show that the segmentation time of the Bayesian method combined with ST-MRF is shorter than that of ST-MRF alone and the computational workload is small; in particular, in heavy-traffic dynamic scenes the method also achieves a better segmentation result.

  2. Universal Darwinism as a process of Bayesian inference

    CERN Document Server

    Campbell, John O

    2016-01-01

    Many of the mathematical frameworks describing natural selection are equivalent to Bayes' theorem, also known as Bayesian updating. By definition, a process of Bayesian inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus natural selection serves as a counterexample to a widely held interpretation that restricts Bayesian inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an "experiment" in the external world environment, and the results of that "experiment" or the "surprise" entailed by predicted and actual outcomes of the "experiment". Minimization of free energy implies that the implicit measure of "surprise" experienced serves to update the generative model in a Bayesian manner. This description clo...
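    A tiny numerical illustration of the equivalence invoked above: discrete Bayesian updating is the same operation as one step of replicator-style selection in which each variant's "fitness" is its likelihood and the mean fitness is the normalizer. The numbers are toy values.

```python
# Discrete Bayesian updating written as replicator-style selection:
# posterior_i ∝ prior_i * likelihood_i, i.e. "fitness" = likelihood.
# Toy numbers for illustration only.
import numpy as np

prior      = np.array([0.5, 0.3, 0.2])    # population frequencies / prior beliefs
likelihood = np.array([0.1, 0.4, 0.8])    # fitness of each variant given the "experiment"

posterior = prior * likelihood
posterior /= posterior.sum()              # mean fitness acts as the normalizer

replicator = prior * (likelihood / np.dot(prior, likelihood))
print(posterior, replicator)              # identical, as the stated equivalence implies
```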

  3. Bayesian model selection: Evidence estimation based on DREAM simulation and bridge sampling

    Science.gov (United States)

    Volpi, Elena; Schoups, Gerrit; Firmani, Giovanni; Vrugt, Jasper A.

    2017-04-01

    Bayesian inference has found widespread application in Earth and Environmental Systems Modeling, providing an effective tool for prediction, data assimilation, parameter estimation, uncertainty analysis and hypothesis testing. Under multiple competing hypotheses, the Bayesian approach also provides an attractive alternative to traditional information criteria (e.g. AIC, BIC) for model selection. The key variable for Bayesian model selection is the evidence (or marginal likelihood), which is the normalizing constant in the denominator of Bayes' theorem; while it is fundamental for model selection, the evidence is not required for Bayesian inference. It is computed for each hypothesis (model) by averaging the likelihood function over the prior parameter distribution, rather than maximizing it as by information criteria; the larger a model's evidence, the more support it receives among a collection of hypotheses, as the simulated values assign relatively high probability density to the observed data. Hence, the evidence naturally acts as an Occam's razor, preferring simpler and more constrained models against the selection of over-fitted ones by information criteria that incorporate only the likelihood maximum. Since it is not particularly easy to estimate the evidence in practice, Bayesian model selection via the marginal likelihood has not yet found mainstream use. We illustrate here the properties of a new estimator of the Bayesian model evidence, which provides robust and unbiased estimates of the marginal likelihood; the method is coined Gaussian Mixture Importance Sampling (GMIS). GMIS uses multidimensional numerical integration of the posterior parameter distribution via bridge sampling (a generalization of importance sampling) of a mixture distribution fitted to samples of the posterior distribution derived from the DREAM algorithm (Vrugt et al., 2008; 2009). Some illustrative examples are presented to show the robustness and superiority of the GMIS estimator with
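    The GMIS/bridge-sampling machinery itself is not reproduced here; the sketch below shows only the underlying idea in its simplest form: fit a tractable proposal to posterior samples (a single Gaussian here, rather than the Gaussian mixture used by GMIS) and estimate the marginal likelihood by importance sampling. The function log_post is an assumed user-supplied callable returning log prior plus log likelihood, and the toy check uses a conjugate normal model with a known analytic evidence.

```python
# Minimal importance-sampling estimate of the marginal likelihood:
# fit a Gaussian proposal to posterior samples (a stand-in for the
# Gaussian-mixture proposal used by GMIS) and average
# prior*likelihood/proposal over draws from the proposal.
import numpy as np
from scipy.stats import multivariate_normal
from scipy.special import logsumexp

def evidence_is(posterior_samples, log_post, ndraws=5000, seed=0):
    mu = posterior_samples.mean(axis=0)
    cov = np.atleast_2d(np.cov(posterior_samples, rowvar=False))
    cov = cov + 1e-9 * np.eye(posterior_samples.shape[1])
    q = multivariate_normal(mu, cov)
    theta = q.rvs(size=ndraws, random_state=seed).reshape(ndraws, -1)
    log_w = np.array([log_post(t) for t in theta]) - q.logpdf(theta)
    return logsumexp(log_w) - np.log(ndraws)        # log of the evidence estimate

# Toy check: 1-D model with N(0,1) prior and N(theta,1) likelihood for y = 0.
y = 0.0
log_post = lambda t: -0.5 * t[0]**2 - 0.5 * (y - t[0])**2 - np.log(2 * np.pi)
samples = np.random.default_rng(1).normal(y / 2, np.sqrt(0.5), size=(4000, 1))
print(evidence_is(samples, log_post))               # analytic value: log N(y | 0, 2)
```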

  4. Tail paradox, partial identifiability, and influential priors in Bayesian branch length inference.

    Science.gov (United States)

    Rannala, Bruce; Zhu, Tianqi; Yang, Ziheng

    2012-01-01

    Recent studies have observed that Bayesian analyses of sequence data sets using the program MrBayes sometimes generate extremely large branch lengths, with posterior credibility intervals for the tree length (sum of branch lengths) excluding the maximum likelihood estimates. Suggested explanations for this phenomenon include the existence of multiple local peaks in the posterior, lack of convergence of the chain in the tail of the posterior, mixing problems, and misspecified priors on branch lengths. Here, we analyze the behavior of Bayesian Markov chain Monte Carlo algorithms when the chain is in the tail of the posterior distribution and note that all these phenomena can occur. In Bayesian phylogenetics, the likelihood function approaches a constant instead of zero when the branch lengths increase to infinity. The flat tail of the likelihood can cause poor mixing and undue influence of the prior. We suggest that the main cause of the extreme branch length estimates produced in many Bayesian analyses is the poor choice of a default prior on branch lengths in current Bayesian phylogenetic programs. The default prior in MrBayes assigns independent and identical distributions to branch lengths, imposing strong (and unreasonable) assumptions about the tree length. The problem is exacerbated by the strong correlation between the branch lengths and parameters in models of variable rates among sites or among site partitions. To resolve the problem, we suggest two multivariate priors for the branch lengths (called compound Dirichlet priors) that are fairly diffuse and demonstrate their utility in the special case of branch length estimation on a star phylogeny. Our analysis highlights the need for careful thought in the specification of high-dimensional priors in Bayesian analyses.

  5. An examination of the monophyly of morning glory taxa using Bayesian phylogenetic inference.

    Science.gov (United States)

    Miller, Richard E; Buckley, Thomas R; Manos, Paul S

    2002-10-01

    The objective of this study was to obtain a quantitative assessment of the monophyly of morning glory taxa, specifically the genus Ipomoea and the tribe Argyreieae. Previous systematic studies of morning glories intimated the paraphyly of Ipomoea by suggesting that the genera within the tribe Argyreieae are derived from within Ipomoea; however, no quantitative estimates of statistical support were developed to address these questions. We applied a Bayesian analysis to provide quantitative estimates of monophyly in an investigation of morning glory relationships using DNA sequence data. We also explored various approaches for examining convergence of the Markov chain Monte Carlo (MCMC) simulation of the Bayesian analysis by running 18 separate analyses varying in length. We found convergence of the important components of the phylogenetic model (the tree with the maximum posterior probability, branch lengths, the parameter values from the DNA substitution model, and the posterior probabilities for clade support) for these data after one million generations of the MCMC simulations. In the process, we identified a run where the parameter values obtained were often outside the range of values obtained from the other runs, suggesting an aberrant result. In addition, we compared the Bayesian method of phylogenetic analysis to maximum likelihood and maximum parsimony. The results from the Bayesian analysis and the maximum likelihood analysis were similar for topology, branch lengths, and parameters of the DNA substitution model. Topologies also were similar in the comparison between the Bayesian analysis and maximum parsimony, although the posterior probabilities and the bootstrap proportions exhibited some striking differences. In a Bayesian analysis of three data sets (ITS sequences, waxy sequences, and ITS + waxy sequences) no support for the monophyly of the genus Ipomoea, or for the tribe Argyreieae, was observed, with the estimate of the probability of the monophyly

  6. Quantum-Like Bayesian Networks for Modeling Decision Making

    Directory of Open Access Journals (Sweden)

    Catarina Moreira

    2016-01-01

    Full Text Available In this work, we explore an alternative quantum structure to perform quantum probabilistic inferences to accommodate the paradoxical findings of the Sure Thing Principle. We propose a Quantum-Like Bayesian Network, which consists in replacing classical probabilities by quantum probability amplitudes. However, since this approach suffers from the problem of exponential growth of quantum parameters, we also propose a similarity heuristic that automatically fits quantum parameters through vector similarities. This makes the proposed model general and predictive in contrast to the current state of the art models, which cannot be generalized for more complex decision scenarios and that only provide an explanatory nature for the observed paradoxes. In the end, the model that we propose consists in a nonparametric method for estimating inference effects from a statistical point of view. It is a statistical model that is simpler than the previous quantum dynamic and quantum-like models proposed in the literature. We tested the proposed network with several empirical data from the literature, mainly from the Prisoner's Dilemma game and the Two Stage Gambling game. The results obtained show that the proposed quantum Bayesian Network is a general method that can accommodate violations of the laws of classical probability theory and make accurate predictions regarding human decision-making in these scenarios.

  7. Bayesian inference of epidemiological parameters from transmission experiments.

    Science.gov (United States)

    Hu, Ben; Gonzales, Jose L; Gubbins, Simon

    2017-12-01

    Epidemiological parameters for livestock diseases are often inferred from transmission experiments. However, there are several limitations inherent to the design of such experiments that limit the precision of parameter estimates. In particular, infection times and latent periods cannot be directly observed and infectious periods may also be censored. We present a Bayesian framework accounting for these features directly and employ Markov chain Monte Carlo techniques to provide robust inferences and quantify the uncertainty in our estimates. We describe the transmission dynamics using a susceptible-exposed-infectious-removed compartmental model, with gamma-distributed transition times. We then fit the model to published data from transmission experiments for foot-and-mouth disease virus (FMDV) and African swine fever virus (ASFV). Where the previous analyses of these data made various assumptions on the unobserved processes in order to draw inferences, our Bayesian approach includes the unobserved infection times and latent periods and quantifies them along with all other model parameters. Drawing inferences about infection times helps identify who infected whom and can also provide insights into transmission mechanisms. Furthermore, we are able to use our models to measure the difference between the latent periods of inoculated and contact-challenged animals and to quantify the effect vaccination has on transmission.
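    A minimal sketch of one ingredient of such an analysis, not the full SEIR transmission model: Metropolis-Hastings inference for the mean of gamma-distributed infectious periods when some periods are right-censored, using the gamma density for fully observed durations and the survival function for censored ones. The data, the fixed shape parameter, and the flat positive prior are all illustrative assumptions.

```python
# Metropolis-Hastings for the mean of gamma-distributed infectious
# periods with right-censoring: observed durations contribute the pdf,
# censored ones the survival function.  Minimal sketch (fixed shape,
# flat positive prior), not the full SEIR transmission model.
import numpy as np
from scipy.stats import gamma

durations = np.array([3.1, 4.5, 2.8, 6.0, 5.2, 4.1])      # days (toy data)
censored  = np.array([0, 0, 0, 1, 0, 1], bool)            # 1 = still infectious at last observation
shape = 4.0                                                # assumed fixed gamma shape

def log_lik(mean):
    if mean <= 0:
        return -np.inf
    scale = mean / shape
    obs = gamma.logpdf(durations[~censored], shape, scale=scale).sum()
    cens = gamma.logsf(durations[censored], shape, scale=scale).sum()
    return obs + cens

rng = np.random.default_rng(0)
mean, chain = 4.0, []
for _ in range(20000):
    prop = mean + rng.normal(0, 0.5)                       # random-walk proposal
    if np.log(rng.uniform()) < log_lik(prop) - log_lik(mean):
        mean = prop
    chain.append(mean)

print(np.mean(chain[5000:]), np.percentile(chain[5000:], [2.5, 97.5]))
```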

  8. Nuclear charge radii: density functional theory meets Bayesian neural networks

    Science.gov (United States)

    Utama, R.; Chen, Wei-Chia; Piekarewicz, J.

    2016-11-01

    The distribution of electric charge in atomic nuclei is fundamental to our understanding of the complex nuclear dynamics and a quintessential observable to validate nuclear structure models. The aim of this study is to explore a novel approach that combines sophisticated models of nuclear structure with Bayesian neural networks (BNN) to generate predictions for the charge radii of thousands of nuclei throughout the nuclear chart. A class of relativistic energy density functionals is used to provide robust predictions for nuclear charge radii. In turn, these predictions are refined through Bayesian learning for a neural network that is trained using residuals between theoretical predictions and the experimental data. Although predictions obtained with density functional theory provide a fairly good description of experiment, our results show significant improvement (better than 40%) after BNN refinement. Moreover, these improved results for nuclear charge radii are supplemented with theoretical error bars. We have successfully demonstrated the ability of the BNN approach to significantly increase the accuracy of nuclear models in the predictions of nuclear charge radii. However, as many before us, we failed to uncover the underlying physics behind the intriguing behavior of charge radii along the calcium isotopic chain.
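    The Bayesian neural network itself is not reproduced here; the sketch below illustrates the general refinement idea with a lightweight stand-in: fit a Bayesian regression to the residuals between theory and experiment as a function of simple features, then add the learned correction to a new theoretical prediction together with an uncertainty. The nuclei, features, and radii are illustrative toy values, and scikit-learn's BayesianRidge replaces the BNN.

```python
# Refine theory predictions by learning the residuals (experiment minus
# theory) as a function of simple features, then adding the learned
# correction back with an uncertainty estimate.  BayesianRidge is a
# lightweight stand-in for the Bayesian neural network of the abstract;
# all data and features below are illustrative.
import numpy as np
from sklearn.linear_model import BayesianRidge

# toy inputs: (Z, N) of a few nuclei, theory radii and "measured" radii (fm)
X          = np.array([[20, 20], [20, 28], [28, 30], [50, 70], [82, 126]], float)
theory_r   = np.array([3.48, 3.47, 3.78, 4.65, 5.50])
measured_r = np.array([3.478, 3.477, 3.776, 4.652, 5.501])

model = BayesianRidge().fit(X, measured_r - theory_r)

X_new = np.array([[28, 36]], float)                # nucleus not in the training set
theory_new = 3.82                                  # its theory prediction (illustrative)
delta, delta_std = model.predict(X_new, return_std=True)
print(theory_new + delta[0], "+/-", delta_std[0])  # refined radius with an error bar
```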

  9. Bayesian Methods for Nonlinear System Identification of Civil Structures

    Directory of Open Access Journals (Sweden)

    Conte Joel P.

    2015-01-01

    Full Text Available This paper presents a new framework for the identification of mechanics-based nonlinear finite element (FE) models of civil structures using Bayesian methods. In this approach, recursive Bayesian estimation methods are utilized to update an advanced nonlinear FE model of the structure using the input-output dynamic data recorded during an earthquake event. Capable of capturing the complex damage mechanisms and failure modes of the structural system, the updated nonlinear FE model can be used to evaluate the state of health of the structure after a damage-inducing event. To update the unknown time-invariant parameters of the FE model, three alternative stochastic filtering methods are used: the extended Kalman filter (EKF), the unscented Kalman filter (UKF), and the iterated extended Kalman filter (IEKF). For those estimation methods that require the computation of structural FE response sensitivities with respect to the unknown modeling parameters (EKF and IEKF), the accurate and computationally efficient direct differentiation method (DDM) is used. A three-dimensional five-story two-by-one bay reinforced concrete (RC) frame is used to illustrate the performance of the framework and compare the performance of the different filters in terms of convergence, accuracy, and robustness. Excellent estimation results are obtained with the UKF, EKF, and IEKF. Because of the analytical linearization used in the EKF and IEKF, abrupt and large jumps in the estimates of the modeling parameters are observed when using these filters. The UKF slightly outperforms the EKF and IEKF.
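    As a minimal illustration of the recursive estimation idea above (and not the nonlinear finite element framework itself), the sketch below runs a scalar extended Kalman filter that treats an unknown time-invariant model parameter as the state and updates it from noisy measurements through a nonlinear measurement function; the exponential-decay "response" model is a made-up stand-in for an FE response.

```python
# Minimal extended Kalman filter for recursive estimation of a single
# time-invariant model parameter from noisy measurements through a
# nonlinear measurement function.  Purely illustrative; the "response"
# model h(k, t) below is a made-up exponential decay.
import numpy as np

def h(k, t):                                   # predicted response: exponential decay with rate k
    return np.exp(-k * t)

def dh_dk(k, t):                               # analytical sensitivity dh/dk
    return -t * np.exp(-k * t)

rng = np.random.default_rng(0)
k_true, R, Q = 0.7, 0.02 ** 2, 1e-6            # true parameter, measurement noise var, small process noise
times = np.arange(0.1, 5.0, 0.1)
measurements = h(k_true, times) + rng.normal(0, np.sqrt(R), len(times))

k_hat, P = 0.3, 1.0                            # initial guess and its variance
for t, y in zip(times, measurements):
    P += Q                                     # predict (parameter assumed constant)
    H = dh_dk(k_hat, t)
    K = P * H / (H * P * H + R)                # Kalman gain
    k_hat += K * (y - h(k_hat, t))             # correct with the measurement residual
    P *= (1 - K * H)

print(k_hat, np.sqrt(P))                       # estimate and posterior std (true value: 0.7)
```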

  10. Application of Bayesian Networks to hindcast barrier island morphodynamics

    Science.gov (United States)

    Wilson, Kathleen E.; Adams, Peter N.; Hapke, Cheryl J.; Lentz, Erika E.; Brenner, Owen T.

    2015-01-01

    Prediction of coastal vulnerability is of increasing concern to policy makers, coastal managers and other stakeholders. Coastal regions and barrier islands along the Atlantic and Gulf coasts are subject to frequent, large storms, whose waves and storm surge can dramatically alter beach morphology, threaten infrastructure, and impact local economies. Given that precise forecasts of regional hazards are challenging, because of the complex interactions between processes on many scales, a range of probable geomorphic change in response to storm conditions is often more helpful than deterministic predictions. Site-specific probabilistic models of coastal change are reliable because they are formulated with observations so that local factors, of potentially high influence, are inherent in the model. The development and use of predictive tools such as Bayesian Networks in response to future storms has the potential to better inform management decisions and hazard preparation in coastal communities. We present several Bayesian Networks designed to hindcast distinct morphologic changes attributable to the Nor'Ida storm of 2009, at Fire Island, New York. Model predictions are informed with historical system behavior, initial morphologic conditions, and a parameterized treatment of wave climate.

  11. Bayesian blind source separation for data with network structure.

    Science.gov (United States)

    Illner, Katrin; Fuchs, Christiane; Theis, Fabian J

    2014-11-01

    In biology, more and more information about the interactions in regulatory systems becomes accessible, and this often leads to prior knowledge for recent data interpretations. In this work we focus on multivariate signaling data, where the structure of the data is induced by a known regulatory network. To extract signals of interest we assume a blind source separation (BSS) model, and we capture the structure of the source signals in terms of a Bayesian network. To keep the parameter space small, we consider stationary signals, and we introduce the new algorithm emGrade, where model parameters and source signals are estimated using expectation maximization. For network data, we find an improved estimation performance compared to other BSS algorithms, and the flexible Bayesian modeling enables us to deal with repeated and missing observation values. The main advantage of our method is the statistically interpretable likelihood, and we can use model selection criteria to determine the (in general unknown) number of source signals or decide between different given networks. In simulations we demonstrate the recovery of the source signals dependent on the graph structure and the dimensionality of the data.

  12. Mining data from hemodynamic simulations via Bayesian emulation

    Directory of Open Access Journals (Sweden)

    Nair Prasanth B

    2007-12-01

    Full Text Available Abstract Background: Arterial geometry variability is inevitable both within and across individuals. To ensure realistic prediction of cardiovascular flows, there is a need for efficient numerical methods that can systematically account for geometric uncertainty. Methods and results: A statistical framework based on Bayesian Gaussian process modeling was proposed for mining data generated from computer simulations. The proposed approach was applied to analyze the influence of geometric parameters on hemodynamics in the human carotid artery bifurcation. A parametric model in conjunction with a design of computer experiments strategy was used for generating a set of observational data that contains the maximum wall shear stress values for a range of probable arterial geometries. The dataset was mined via a Bayesian Gaussian process emulator to estimate: (a) the influence of key parameters on the output via sensitivity analysis, (b) uncertainty in output as a function of uncertainty in input, and (c) which settings of the input parameters result in maximum and minimum values of the output. Finally, potential diagnostic indicators were proposed that can be used to aid the assessment of stroke risk for a given patient's geometry.
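    A minimal emulator sketch in the same spirit: fit a Gaussian process to a small design of simulator runs mapping geometry parameters to maximum wall shear stress, then query the mean and uncertainty at an untried setting. The design points and responses below are synthetic placeholders, not CFD results, and scikit-learn's GP regressor stands in for the Bayesian Gaussian process emulator of the paper.

```python
# Gaussian-process emulation of an expensive simulator: fit a GP to a
# small design of (geometry parameters -> max wall shear stress) runs,
# then predict mean and uncertainty at untried settings.  The "simulator
# outputs" below are synthetic placeholders, not CFD results.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# design points: (bifurcation angle [deg], area ratio) and toy responses [Pa]
X = np.array([[40, 0.9], [50, 1.0], [60, 1.1], [45, 1.2], [55, 0.95]], float)
y = np.array([12.5, 10.8, 9.6, 9.1, 11.4])

kernel = ConstantKernel(1.0) * RBF(length_scale=[10.0, 0.2])
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

X_new = np.array([[48, 1.05]])
mean, std = gp.predict(X_new, return_std=True)
print(f"predicted max WSS: {mean[0]:.2f} +/- {std[0]:.2f} Pa")
```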

  13. Bayesian inference for duplication-mutation with complementarity network models.

    Science.gov (United States)

    Jasra, Ajay; Persing, Adam; Beskos, Alexandros; Heine, Kari; De Iorio, Maria

    2015-11-01

    We observe an undirected graph G without multiple edges and self-loops, which is to represent a protein-protein interaction (PPI) network. We assume that G evolved under the duplication-mutation with complementarity (DMC) model from a seed graph, G0, and we also observe the binary forest Γ that represents the duplication history of G. A posterior density for the DMC model parameters is established, and we outline a sampling strategy by which one can perform Bayesian inference; that sampling strategy employs a particle marginal Metropolis-Hastings (PMMH) algorithm. We test our methodology on numerical examples to demonstrate a high accuracy and precision in the inference of the DMC model's mutation and homodimerization parameters.

  14. Bayesian Inference for Duplication–Mutation with Complementarity Network Models

    Science.gov (United States)

    Persing, Adam; Beskos, Alexandros; Heine, Kari; De Iorio, Maria

    2015-01-01

    Abstract We observe an undirected graph G without multiple edges and self-loops, which is to represent a protein–protein interaction (PPI) network. We assume that G evolved under the duplication–mutation with complementarity (DMC) model from a seed graph, G0, and we also observe the binary forest Γ that represents the duplication history of G. A posterior density for the DMC model parameters is established, and we outline a sampling strategy by which one can perform Bayesian inference; that sampling strategy employs a particle marginal Metropolis–Hastings (PMMH) algorithm. We test our methodology on numerical examples to demonstrate a high accuracy and precision in the inference of the DMC model's mutation and homodimerization parameters. PMID:26355682

  15. Advances in Bayesian Model Based Clustering Using Particle Learning

    Energy Technology Data Exchange (ETDEWEB)

    Merl, D M

    2009-11-19

    Recent work by Carvalho, Johannes, Lopes and Polson and Carvalho, Lopes, Polson and Taddy introduced a sequential Monte Carlo (SMC) alternative to traditional iterative Monte Carlo strategies (e.g. MCMC and EM) for Bayesian inference for a large class of dynamic models. The basis of SMC techniques involves representing the underlying inference problem as one of state space estimation, thus giving way to inference via particle filtering. The key insight of Carvalho et al. was to construct the sequence of filtering distributions so as to make use of the posterior predictive distribution of the observable, a distribution usually only accessible in certain Bayesian settings. Access to this distribution allows a reversal of the usual propagate and resample steps characteristic of many SMC methods, thereby alleviating to a large extent many problems associated with particle degeneration. Furthermore, Carvalho et al. point out that for many conjugate models the posterior distribution of the static variables can be parametrized in terms of [recursively defined] sufficient statistics of the previously observed data. For models where such sufficient statistics exist, particle learning, as it is being called, is especially well suited for the analysis of streaming data due to the relative invariance of its algorithmic complexity with the number of data observations. Through a particle learning approach, a statistical model can be fit to data as the data is arriving, allowing at any instant during the observation process direct quantification of uncertainty surrounding underlying model parameters. Here we describe the use of a particle learning approach for fitting a standard Bayesian semiparametric mixture model as described in Carvalho, Lopes, Polson and Taddy. In Section 2 we briefly review the previously presented particle learning algorithm for the case of a Dirichlet process mixture of multivariate normals. In Section 3 we describe several novel extensions to the original
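    The sketch below is a much-simplified toy version of the resample-propagate pattern described above, applied to a two-component Gaussian mixture with known variance and weights and unknown component means: each particle carries conjugate sufficient statistics per component, is reweighted by the posterior predictive of each incoming observation, resampled, and then updated. It is not the Dirichlet process mixture of the report, and all numerical settings are illustrative.

```python
# Toy particle-learning pass for a two-component Gaussian mixture with
# known variance and weights and unknown component means.  Each particle
# carries conjugate sufficient statistics (count, sum) per component, is
# reweighted by the posterior predictive of each incoming observation,
# resampled, and then updated.  A simplified sketch, not the Dirichlet
# process mixture of the report.
import numpy as np

rng = np.random.default_rng(0)
sigma2, tau2, weights = 1.0, 25.0, np.array([0.5, 0.5])   # known quantities
P, K = 500, 2                                             # particles, components
n, s = np.zeros((P, K)), np.zeros((P, K))                 # sufficient statistics

def posterior_moments(n, s):
    v = 1.0 / (1.0 / tau2 + n / sigma2)                   # posterior variance of each mean
    return v * s / sigma2, v                              # posterior mean, variance

def normal_pdf(y, mean, var):
    return np.exp(-0.5 * (y - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

data = np.concatenate([rng.normal(-2, 1, 60), rng.normal(3, 1, 60)])
rng.shuffle(data)

for y in data:
    m, v = posterior_moments(n, s)
    pred = weights * normal_pdf(y, m, v + sigma2)         # (P, K) predictive terms
    w = pred.sum(axis=1)                                  # particle weights
    idx = rng.choice(P, size=P, p=w / w.sum())            # resample first ...
    n, s, pred = n[idx], s[idx], pred[idx]
    z = (rng.uniform(size=P) > pred[:, 0] / pred.sum(axis=1)).astype(int)  # sample allocation
    n[np.arange(P), z] += 1                               # ... then propagate sufficient statistics
    s[np.arange(P), z] += y

m, v = posterior_moments(n, s)
means = rng.normal(m, np.sqrt(v))                         # one posterior draw per particle
print(np.sort(means, axis=1).mean(axis=0))                # approx. posterior means of the components
```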

  16. Bayesian statistics for infection experiments

    NARCIS (Netherlands)

    Heres, L.; Engel, B.

    2004-01-01

    To break the cycles of food-borne pathogens in poultry, new intervention methods need to be tested for their effectiveness. In this paper a statistical method is described that was applied to quantify the observed differences between test groups and control groups. Treated chickens and their controls

  17. Elicitation by design in ecology: using expert opinion to inform priors for Bayesian statistical models.

    Science.gov (United States)

    Choy, Samantha Low; O'Leary, Rebecca; Mengersen, Kerrie

    2009-01-01

    Bayesian statistical modeling has several benefits within an ecological context. In particular, when observed data are limited in sample size or representativeness, then the Bayesian framework provides a mechanism to combine observed data with other "prior" information. Prior information may be obtained from earlier studies, or in their absence, from expert knowledge. This use of the Bayesian framework reflects the scientific "learning cycle," where prior or initial estimates are updated when new data become available. In this paper we outline a framework for statistical design of expert elicitation processes for quantifying such expert knowledge, in a form suitable for input as prior information into Bayesian models. We identify six key elements: determining the purpose and motivation for using prior information; specifying the relevant expert knowledge available; formulating the statistical model; designing effective and efficient numerical encoding; managing uncertainty; and designing a practical elicitation protocol. We demonstrate this framework applies to a variety of situations, with two examples from the ecological literature and three from our experience. Analysis of these examples reveals several recurring important issues affecting practical design of elicitation in ecological problems.

  18. Bayesian Population Projections for the United Nations.

    Science.gov (United States)

    Raftery, Adrian E; Alkema, Leontine; Gerland, Patrick

    2014-02-01

    The United Nations regularly publishes projections of the populations of all the world's countries broken down by age and sex. These projections are the de facto standard and are widely used by international organizations, governments and researchers. Like almost all other population projections, they are produced using the standard deterministic cohort-component projection method and do not yield statements of uncertainty. We describe a Bayesian method for producing probabilistic population projections for most countries that the United Nations could use. It has at its core Bayesian hierarchical models for the total fertility rate and life expectancy at birth. We illustrate the method and show how it can be extended to address concerns about the UN's current assumptions about the long-term distribution of fertility. The method is implemented in the R packages bayesTFR, bayesLife, bayesPop and bayesDem.

  19. Probabilistic forecasting and Bayesian data assimilation

    CERN Document Server

    Reich, Sebastian

    2015-01-01

    In this book the authors describe the principles and methods behind probabilistic forecasting and Bayesian data assimilation. Instead of focusing on particular application areas, the authors adopt a general dynamical systems approach, with a profusion of low-dimensional, discrete-time numerical examples designed to build intuition about the subject. Part I explains the mathematical framework of ensemble-based probabilistic forecasting and uncertainty quantification. Part II is devoted to Bayesian filtering algorithms, from classical data assimilation algorithms such as the Kalman filter, variational techniques, and sequential Monte Carlo methods, through to more recent developments such as the ensemble Kalman filter and ensemble transform filters. The McKean approach to sequential filtering in combination with coupling of measures serves as a unifying mathematical framework throughout Part II. Assuming only some basic familiarity with probability, this book is an ideal introduction for graduate students in ap...

  20. A Bayesian Probabilistic Framework for Rain Detection

    Directory of Open Access Journals (Sweden)

    Chen Yao

    2014-06-01

    Full Text Available Heavy rain degrades the video quality of outdoor imaging equipment. In order to improve video clearness, image-based and sensor-based methods are adopted for rain detection. In the earlier literature, image-based detection methods fall into spatially based and temporally based categories. In this paper, we propose a new image-based method by exploring joint spatio-temporal constraints in a Bayesian framework. In our framework, rain temporal motion is assumed to be Pathological Motion (PM), which is better suited to the time-varying character of rain streaks. Temporal displaced frame discontinuity and a spatial Gaussian mixture model are utilized in the whole framework. An iterated expectation maximization method is used for Gaussian parameter estimation. Pixel state estimation is performed by an iterated optimization method in a Bayesian probability formulation. The experimental results highlight the advantage of our method in rain detection.

  1. Bayesian Peak Picking for NMR Spectra

    Directory of Open Access Journals (Sweden)

    Yichen Cheng

    2014-02-01

    Full Text Available Protein structure determination is a very important topic in structural genomics, which helps people to understand varieties of biological functions such as protein-protein interactions, protein–DNA interactions and so on. Nowadays, nuclear magnetic resonance (NMR) has often been used to determine the three-dimensional structures of proteins in vivo. This study aims to automate the peak picking step, the most important and tricky step in NMR structure determination. We propose to model the NMR spectrum by a mixture of bivariate Gaussian densities and use the stochastic approximation Monte Carlo algorithm as the computational tool to solve the problem. Under the Bayesian framework, the peak picking problem is cast as a variable selection problem. The proposed method can automatically distinguish true peaks from false ones without preprocessing the data. To the best of our knowledge, this is the first effort in the literature that tackles the peak picking problem for NMR spectrum data using a Bayesian method.
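    As a rough illustration of the mixture-of-bivariate-Gaussians view only (the stochastic approximation Monte Carlo and Bayesian variable selection machinery of the paper is not reproduced), the sketch below fits a plain EM Gaussian mixture to synthetic 2-D peak coordinates and reads off the fitted peak centres; all chemical-shift values are made up.

```python
# Rough illustration of the mixture-of-bivariate-Gaussians view of a 2-D
# spectrum: fit a Gaussian mixture by EM to synthetic peak coordinates
# and read off the fitted peak centres.  Modeling idea only; the paper's
# Bayesian variable selection / SAMC machinery is not reproduced.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
true_centres = np.array([[8.2, 120.0], [7.5, 115.0], [8.9, 125.0]])   # (1H, 15N) ppm, toy values
points = np.vstack([rng.normal(c, [0.05, 0.4], size=(60, 2)) for c in true_centres])

gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0).fit(points)
print(np.round(gmm.means_, 2))          # estimated peak positions
```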

  2. Bayesian system reliability assessment under fuzzy environments

    Energy Technology Data Exchange (ETDEWEB)

    Wu, H.-C

    2004-03-01

    The Bayesian system reliability assessment under fuzzy environments is proposed in this paper. In order to apply the Bayesian approach, the fuzzy parameters are assumed as fuzzy random variables with fuzzy prior distributions. The (conventional) Bayes estimation method will be used to create the fuzzy Bayes point estimator of system reliability by invoking the well-known theorem called 'Resolution Identity' in fuzzy sets theory. On the other hand, we also provide the computational procedures to evaluate the membership degree of any given Bayes point estimate of system reliability. In order to achieve this purpose, we transform the original problem into a nonlinear programming problem. This nonlinear programming problem is then divided into four subproblems for the purpose of simplifying computation. Finally, the subproblems can be solved by using any commercial optimizers, e.g. GAMS or LINGO.

  3. Bayesian Modelling of Functional Whole Brain Connectivity

    DEFF Research Database (Denmark)

    Røge, Rasmus

    This thesis deals with parcellation of whole-brain functional magnetic resonance imaging (fMRI) using Bayesian inference with mixture models tailored to the fMRI data. In the three included papers and manuscripts, we analyze two different approaches to modeling fMRI signal: either we accept the prevalent strategy of standardizing the fMRI time series and model the data using directional statistics, or we model the variability in the signal across the brain and across multiple subjects. In either case, we use Bayesian nonparametric modeling to automatically learn from the fMRI data the number of functional units, i.e. parcels. We benchmark the proposed mixture models against state-of-the-art methods of brain parcellation, both probabilistic and non-probabilistic. The time series of each voxel are most often standardized using z-scoring, which projects the time series data onto a hypersphere

  4. Machine learning a Bayesian and optimization perspective

    CERN Document Server

    Theodoridis, Sergios

    2015-01-01

    This tutorial text gives a unifying perspective on machine learning by covering both probabilistic and deterministic approaches, which rely on optimization techniques, as well as Bayesian inference, which is based on a hierarchy of probabilistic models. The book presents the major machine learning methods as they have been developed in different disciplines, such as statistics, statistical and adaptive signal processing and computer science. Focusing on the physical reasoning behind the mathematics, all the various methods and techniques are explained in depth, supported by examples and problems, giving an invaluable resource to the student and researcher for understanding and applying machine learning concepts. The book builds carefully from the basic classical methods to the most recent trends, with chapters written to be as self-contained as possible, making the text suitable for different courses: pattern recognition, statistical/adaptive signal processing, statistical/Bayesian learning, as well as shor...

  5. Structure Learning in Bayesian Sensorimotor Integration.

    Directory of Open Access Journals (Sweden)

    Tim Genewein

    2015-08-01

    Full Text Available Previous studies have shown that sensorimotor processing can often be described by Bayesian learning, in particular the integration of prior and feedback information depending on its degree of reliability. Here we test the hypothesis that the integration process itself can be tuned to the statistical structure of the environment. We exposed human participants to a reaching task in a three-dimensional virtual reality environment where we could displace the visual feedback of their hand position in a two dimensional plane. When introducing statistical structure between the two dimensions of the displacement, we found that over the course of several days participants adapted their feedback integration process in order to exploit this structure for performance improvement. In control experiments we found that this adaptation process critically depended on performance feedback and could not be induced by verbal instructions. Our results suggest that structural learning is an important meta-learning component of Bayesian sensorimotor integration.

  6. Bayesian Peak Picking for NMR Spectra

    KAUST Repository

    Cheng, Yichen

    2014-02-01

    Protein structure determination is a very important topic in structural genomics, which helps people to understand varieties of biological functions such as protein-protein interactions, protein–DNA interactions and so on. Nowadays, nuclear magnetic resonance (NMR) has often been used to determine the three-dimensional structures of proteins in vivo. This study aims to automate the peak picking step, the most important and tricky step in NMR structure determination. We propose to model the NMR spectrum by a mixture of bivariate Gaussian densities and use the stochastic approximation Monte Carlo algorithm as the computational tool to solve the problem. Under the Bayesian framework, the peak picking problem is cast as a variable selection problem. The proposed method can automatically distinguish true peaks from false ones without preprocessing the data. To the best of our knowledge, this is the first effort in the literature that tackles the peak picking problem for NMR spectrum data using a Bayesian method.

  7. Bayesian image reconstruction: Application to emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Nunez, J.; Llacer, J.

    1989-02-01

    In this paper we propose a Maximum a Posteriori (MAP) method of image reconstruction in the Bayesian framework for the Poisson noise case. We use entropy to define the prior probability and likelihood to define the conditional probability. The method uses sharpness parameters which can be theoretically computed or adjusted, allowing us to obtain MAP reconstructions without the problem of the "grey" reconstructions associated with pre-Bayesian reconstructions. We have developed several ways to solve the reconstruction problem and propose a new iterative algorithm which is stable, maintains positivity and converges to feasible images faster than the Maximum Likelihood Estimate method. We have successfully applied the new method to the case of Emission Tomography, both with simulated and real data. 41 refs., 4 figs., 1 tab.
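    The entropy-prior MAP algorithm itself is not implemented here; the sketch below shows the Poisson maximum-likelihood (EM, Richardson-Lucy type) iteration that serves as the baseline the abstract compares against, on a synthetic 1-D toy problem with an assumed Gaussian blurring system matrix.

```python
# Poisson maximum-likelihood (EM / Richardson-Lucy) iteration for a toy
# 1-D reconstruction problem, i.e. the kind of MLE baseline the abstract
# compares against; the entropy-prior MAP algorithm itself is not
# implemented.  System matrix and data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 64
x_true = np.zeros(n); x_true[20:25] = 50; x_true[40] = 120            # toy emission map

# simple blurring system matrix (Gaussian point-spread function), columns normalized
idx = np.arange(n)
A = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / 2.0) ** 2)
A /= A.sum(axis=0, keepdims=True)

y = rng.poisson(A @ x_true)                                           # noisy projection data

x = np.ones(n)                                                        # positive initial image
sens = A.sum(axis=0)                                                  # A^T 1
for _ in range(200):
    x *= (A.T @ (y / np.maximum(A @ x, 1e-12))) / sens                # multiplicative EM update

print(float(np.abs(x - x_true).mean()))                               # reconstruction error
```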

  8. Distributed Bayesian Networks for User Modeling

    DEFF Research Database (Denmark)

    Tedesco, Roberto; Dolog, Peter; Nejdl, Wolfgang

    2006-01-01

    The World Wide Web is a popular platform for providing eLearning applications to a wide spectrum of users. However – as users differ in their preferences, background, requirements, and goals – applications should provide personalization mechanisms. In the Web context, user models used...... of Web-based eLearning platforms. The scenario we are tackling assumes learners who use several systems over time, which are able to create partial Bayesian Networks for user models based on the local system context. In particular, we focus on how to merge these partial user models. Our merge mechanism...... efficiently combines distributed learner models without the need to exchange internal structure of local Bayesian networks, nor local evidence between the involved platforms....

  9. Bayesianism and inference to the best explanation

    Directory of Open Access Journals (Sweden)

    Valeriano IRANZO

    2008-01-01

    Full Text Available Bayesianism and Inference to the best explanation (IBE) are two different models of inference. Recently there has been some debate about the possibility of “bayesianizing” IBE. Firstly I explore several alternatives to include explanatory considerations in Bayes’s Theorem. Then I distinguish two different interpretations of prior probabilities: “IBE-Bayesianism” (IBE-Bay) and “frequentist-Bayesianism” (Freq-Bay). After detailing the content of the latter, I propose a rule for assessing the priors. I also argue that Freq-Bay: (i) endorses a role for explanatory value in the assessment of scientific hypotheses; (ii) avoids a purely subjectivist reading of prior probabilities; and (iii) fits better than IBE-Bayesianism with two basic facts about science, i.e., the prominent role played by empirical testing and the existence of many scientific theories in the past that failed to fulfil their promises and were subsequently abandoned.

  10. Bayesian Inference Methods for Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand

    2013-01-01

    This thesis deals with sparse Bayesian learning (SBL) with application to radio channel estimation. As opposed to the classical approach for sparse signal representation, we focus on the problem of inferring complex signals. Our investigations within SBL constitute the basis for the development...... of Bayesian inference algorithms for sparse channel estimation. Sparse inference methods aim at finding the sparse representation of a signal given in some overcomplete dictionary of basis vectors. Within this context, one of our main contributions to the field of SBL is a hierarchical representation...... complex prior representation achieve improved sparsity representations in low signal-to-noise ratio as opposed to state-of-the-art sparse estimators. This result is of particular importance for the applicability of the algorithms in the field of channel estimation. We then derive various iterative

  11. Software Health Management with Bayesian Networks

    Science.gov (United States)

    Mengshoel, Ole; Schumann, JOhann

    2011-01-01

    Most modern aircraft, as well as other complex machinery, are equipped with diagnostic systems for their major subsystems. During operation, sensors provide important information about the subsystem (e.g., the engine) and that information is used to detect and diagnose faults. Most of these systems focus on the monitoring of a mechanical, hydraulic, or electromechanical subsystem of the vehicle or machinery. Only recently, health management systems that monitor software have been developed. In this paper, we will discuss our approach of using Bayesian networks for Software Health Management (SWHM). We will discuss SWHM requirements, which make advanced reasoning capabilities for the detection and diagnosis important. Then we will present our approach to using Bayesian networks for the construction of health models that dynamically monitor a software system and are capable of detecting and diagnosing faults.

  12. Quantum-like Representation of Bayesian Updating

    Science.gov (United States)

    Asano, Masanari; Ohya, Masanori; Tanaka, Yoshiharu; Khrennikov, Andrei; Basieva, Irina

    2011-03-01

    Recently, applications of quantum mechanics to cognitive psychology have been discussed, see [1]-[11]. It was known that statistical data obtained in some experiments of cognitive psychology cannot be described by the classical probability model (Kolmogorov's model) [12]-[15]. Quantum probability is one of the most advanced mathematical models for non-classical probability. In the paper of [11], we proposed a quantum-like model describing the decision-making process in a two-player game, where we used the generalized quantum formalism based on lifting of density operators [16]. In this paper, we discuss the quantum-like representation of Bayesian inference, which has been used to calculate probabilities for decision making under uncertainty. The uncertainty is described in the form of quantum superposition, and Bayesian updating is explained as a reduction of state by quantum measurement.

  13. Narrowband interference parameterization for sparse Bayesian recovery

    KAUST Repository

    Ali, Anum

    2015-09-11

    This paper addresses the problem of narrowband interference (NBI) in SC-FDMA systems by using tools from compressed sensing and stochastic geometry. The proposed NBI cancellation scheme exploits the frequency domain sparsity of the unknown signal and adopts a Bayesian sparse recovery procedure. This is done by keeping a few randomly chosen sub-carriers data free to sense the NBI signal at the receiver. As Bayesian recovery requires knowledge of some NBI parameters (i.e., mean, variance and sparsity rate), we use tools from stochastic geometry to obtain analytical expressions for the required parameters. Our simulation results validate the analysis and depict suitability of the proposed recovery method for NBI mitigation. © 2015 IEEE.

  14. Bayesian hypothesis testing: Editorial to the Special Issue on Bayesian data analysis.

    Science.gov (United States)

    Hoijtink, Herbert; Chow, Sy-Miin

    2017-06-01

    In the past 20 years, there has been a steadily increasing attention and demand for Bayesian data analysis across multiple scientific disciplines, including psychology. Bayesian methods and the related Markov chain Monte Carlo sampling techniques offered renewed ways of handling old and challenging new problems that may be difficult or impossible to handle using classical approaches. Yet, such opportunities and potential improvements have not been sufficiently explored and investigated. This is 1 of 2 special issues in Psychological Methods dedicated to the topic of Bayesian data analysis, with an emphasis on Bayesian hypothesis testing, model comparison, and general guidelines for applications in psychology. In this editorial, we provide an overview of the use of Bayesian methods in psychological research and a brief history of the Bayes factor and the posterior predictive p value. Translational abstracts that summarize the articles in this issue in very clear and understandable terms are included in the Appendix. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  15. Statistical Information: A Bayesian Perspective

    Directory of Open Access Journals (Sweden)

    Carlos A. de B. Pereira

    2012-11-01

    Full Text Available We explore the meaning of information about quantities of interest. Our approach is divided in two scenarios: the analysis of observations and the planning of an experiment. First, we review the Sufficiency, Conditionality and Likelihood principles and how they relate to trivial experiments. Next, we review Blackwell Sufficiency and show that sampling without replacement is Blackwell Sufficient for sampling with replacement. Finally, we unify the two scenarios presenting an extension of the relationship between Blackwell Equivalence and the Likelihood Principle.

  16. Variational algorithms for approximate Bayesian inference

    Science.gov (United States)

    Beal, Matthew James

    The Bayesian framework for machine learning allows for the incorporation of prior knowledge in a coherent way, avoids overfitting problems, and provides a principled basis for selecting between alternative models. Unfortunately the computations required are usually intractable. This thesis presents a unified variational Bayesian (VB) framework which approximates these computations in models with latent variables using a lower bound on the marginal likelihood. Chapter 1 presents background material on Bayesian inference, graphical models, and propagation algorithms. Chapter 2 forms the theoretical core of the thesis, generalising the expectation- maximisation (EM) algorithm for learning maximum likelihood parameters to the VB EM algorithm which integrates over model parameters. The algorithm is then specialised to the large family of conjugate-exponential (CE) graphical models, and several theorems are presented to pave the road for automated VB derivation procedures in both directed and undirected graphs (Bayesian and Markov networks, respectively). Chapters 3--5 derive and apply the VB EM algorithm to three commonly-used and important models: mixtures of factor analysers, linear dynamical systems, and hidden Markov models. It is shown how model selection tasks such as determining the dimensionality, cardinality, or number of variables are possible using VB approximations. Also explored are methods for combining sampling procedures with variational approximations, to estimate the tightness of VB bounds and to obtain more effective sampling algorithms. Chapter 6 applies VB learning to a long-standing problem of scoring discrete-variable directed acyclic graphs, and compares the performance to annealed importance sampling amongst other methods. Throughout, the VB approximation is compared to other methods including sampling, Cheeseman-Stutz, and asymptotic approximations such as BIC. The thesis concludes with a discussion of evolving directions for model selection

  17. Bayesian estimation of traffic lane state

    Czech Academy of Sciences Publication Activity Database

    Nagy, Ivan; Kárný, Miroslav; Nedoma, Petr; Voráčová, Š.

    2003-01-01

    Roč. 17, č. 1 (2003), s. 51-65 ISSN 0890-6327 R&D Projects: GA ČR GA102/03/0049; GA AV ČR IBS1075351 Institutional research plan: CEZ:AV0Z1075907 Keywords : mixture models * estimation * Bayesian approach Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.602, year: 2003 http://library.utia.cas.cz/prace/20030021.ps

  18. Multilevel Monte Carlo in Approximate Bayesian Computation

    KAUST Repository

    Jasra, Ajay

    2017-02-13

    In the following article we consider approximate Bayesian computation (ABC) inference. We introduce a method for numerically approximating ABC posteriors using the multilevel Monte Carlo (MLMC). A sequential Monte Carlo version of the approach is developed and it is shown under some assumptions that for a given level of mean square error, this method for ABC has a lower cost than i.i.d. sampling from the most accurate ABC approximation. Several numerical examples are given.
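    The multilevel construction is not reproduced here; as a reference point, the sketch below is the plain ABC rejection sampler that such methods improve on, applied to a toy problem (inferring the mean of a normal with known standard deviation from its sample mean), with an illustrative uniform prior and tolerance.

```python
# Plain ABC rejection sampling, the baseline that multilevel ABC methods
# improve on.  Toy problem: infer the mean of a normal with known sd
# from its sample mean, accepting prior draws whose simulated summary
# lies within a tolerance of the observed one.
import numpy as np

rng = np.random.default_rng(0)
observed = rng.normal(1.5, 1.0, size=100)
obs_summary = observed.mean()
eps, n_obs = 0.1, len(observed)

accepted = []
while len(accepted) < 1000:
    theta = rng.uniform(-5, 5)                     # draw from the (assumed) prior
    sim_summary = rng.normal(theta, 1.0, size=n_obs).mean()
    if abs(sim_summary - obs_summary) < eps:       # keep draws reproducing the data summary
        accepted.append(theta)

print(np.mean(accepted), np.std(accepted))         # approximate posterior mean and sd
```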

  19. Personalized Audio Systems - a Bayesian Approach

    DEFF Research Database (Denmark)

    Nielsen, Jens Brehm; Jensen, Bjørn Sand; Hansen, Toke Jansen

    2013-01-01

    , the present paper presents a general interactive framework for personalization of such audio systems. The framework builds on Bayesian Gaussian process regression in which a model of the user's objective function is updated sequentially. The parameter setting to be evaluated in a given trial is selected...... are optimized using the proposed framework. Twelve test subjects obtain a personalized setting with the framework, and these settings are significantly preferred to those obtained with random experimentation.

  20. Bayesian analysis for categorical survey data

    OpenAIRE

    Grapsa, Erofili

    2010-01-01

    In this thesis, we develop Bayesian methodology for univariate and multivariate categorical survey data. The Multinomial model is used and the following problems are addressed. Limited information about the design variables leads us to model the unknown design variables taking into account the sampling scheme. Random effects are incorporated in the model to deal with the effect of sampling design, that produces the Multinomial GLMM and issues such as model comparison and model averaging are a...

  1. Dynamic Dimensionality Selection for Bayesian Classifier Ensembles

    Science.gov (United States)

    2015-03-19

    Final Report for AOARD Grant AOARD-124030, “Dynamic dimensionality selection for Bayesian classifier ensembles”, March 19, 2015. ...learning workbench. We compared the two levels of ANDE without either extension, with each in isolation, and with both in combination. We also compared

  2. Bayesian belief networks in business continuity.

    Science.gov (United States)

    Phillipson, Frank; Matthijssen, Edwin; Attema, Thomas

    2014-01-01

    Business continuity professionals aim to mitigate the various challenges to the continuity of their company. The goal is a coherent system of measures that encompass detection, prevention and recovery. Choices made in one part of the system affect other parts as well as the continuity risks of the company. In complex organisations, however, these relations are far from obvious. This paper proposes the use of Bayesian belief networks to expose these relations, and presents a modelling framework for this approach.

  3. Learning Bayesian networks from big meteorological spatial datasets. An alternative to complex network analysis

    Science.gov (United States)

    Gutiérrez, Jose Manuel; San Martín, Daniel; Herrera, Sixto; Santiago Cofiño, Antonio

    2016-04-01

    The growing availability of spatial datasets (observations, reanalysis, and regional and global climate models) demands efficient multivariate spatial modeling techniques for many problems of interest (e.g. teleconnection analysis, multi-site downscaling, etc.). Complex networks have been recently applied in this context using graphs built from pairwise correlations between the different stations (or grid boxes) forming the dataset. However, this analysis does not take into account the full dependence structure underlying the data, given by all possible marginal and conditional dependencies among the stations, and does not allow a probabilistic analysis of the dataset. In this talk we introduce Bayesian networks as an alternative data-driven technique for multivariate analysis and modeling, which allows building a joint probability distribution of the stations including all relevant dependencies in the dataset. Bayesian networks are a sound machine learning technique using a graph to 1) encode the main dependencies among the variables and 2) obtain a factorization of the joint probability distribution of the stations given by a reduced number of parameters. For a particular problem, the resulting graph provides a qualitative analysis of the spatial relationships in the dataset (alternative to complex network analysis), and the resulting model allows for a probabilistic analysis of the dataset. Bayesian networks have been widely applied in many fields, but their use in climate problems is hampered by the large number of variables (stations) involved in this field, since the complexity of the existing algorithms to learn the graphical structure from data grows nonlinearly with the number of variables. In this contribution we present a modified local learning algorithm for Bayesian networks adapted to this problem, which allows inferring the graphical structure for thousands of stations (from observations) and/or gridboxes (from model simulations) thus providing new

  4. Particle identification in ALICE: a Bayesian approach

    CERN Document Server

    Adam, Jaroslav; Aggarwal, Madan Mohan; Aglieri Rinella, Gianluca; Agnello, Michelangelo; Agrawal, Neelima; Ahammed, Zubayer; Ahmad, Shakeel; Ahn, Sang Un; Aiola, Salvatore; Akindinov, Alexander; Alam, Sk Noor; Silva De Albuquerque, Danilo; Aleksandrov, Dmitry; Alessandro, Bruno; Alexandre, Didier; Alfaro Molina, Jose Ruben; Alici, Andrea; Alkin, Anton; Millan Almaraz, Jesus Roberto; Alme, Johan; Alt, Torsten; Altinpinar, Sedat; Altsybeev, Igor; Alves Garcia Prado, Caio; Andrei, Cristian; Andronic, Anton; Anguelov, Venelin; Anticic, Tome; Antinori, Federico; Antonioli, Pietro; Aphecetche, Laurent Bernard; Appelshaeuser, Harald; Arcelli, Silvia; Arnaldi, Roberta; Arnold, Oliver Werner; Arsene, Ionut Cristian; Arslandok, Mesut; Audurier, Benjamin; Augustinus, Andre; Averbeck, Ralf Peter; Azmi, Mohd Danish; Badala, Angela; Baek, Yong Wook; Bagnasco, Stefano; Bailhache, Raphaelle Marie; Bala, Renu; Balasubramanian, Supraja; Baldisseri, Alberto; Baral, Rama Chandra; Barbano, Anastasia Maria; Barbera, Roberto; Barile, Francesco; Barnafoldi, Gergely Gabor; Barnby, Lee Stuart; Ramillien Barret, Valerie; Bartalini, Paolo; Barth, Klaus; Bartke, Jerzy Gustaw; Bartsch, Esther; Basile, Maurizio; Bastid, Nicole; Basu, Sumit; Bathen, Bastian; Batigne, Guillaume; Batista Camejo, Arianna; Batyunya, Boris; Batzing, Paul Christoph; Bearden, Ian Gardner; Beck, Hans; Bedda, Cristina; Behera, Nirbhay Kumar; Belikov, Iouri; Bellini, Francesca; Bello Martinez, Hector; Bellwied, Rene; Belmont Iii, Ronald John; Belmont Moreno, Ernesto; Belyaev, Vladimir; Benacek, Pavel; Bencedi, Gyula; Beole, Stefania; Berceanu, Ionela; Bercuci, Alexandru; Berdnikov, Yaroslav; Berenyi, Daniel; Bertens, Redmer Alexander; Berzano, Dario; Betev, Latchezar; Bhasin, Anju; Bhat, Inayat Rasool; Bhati, Ashok Kumar; Bhattacharjee, Buddhadeb; Bhom, Jihyun; Bianchi, Livio; Bianchi, Nicola; Bianchin, Chiara; Bielcik, Jaroslav; Bielcikova, Jana; Bilandzic, Ante; Biro, Gabor; Biswas, Rathijit; Biswas, Saikat; Bjelogrlic, Sandro; Blair, Justin Thomas; Blau, Dmitry; Blume, Christoph; Bock, Friederike; Bogdanov, Alexey; Boggild, Hans; Boldizsar, Laszlo; Bombara, Marek; Book, Julian Heinz; Borel, Herve; Borissov, Alexander; Borri, Marcello; Bossu, Francesco; Botta, Elena; Bourjau, Christian; Braun-Munzinger, Peter; Bregant, Marco; Breitner, Timo Gunther; Broker, Theo Alexander; Browning, Tyler Allen; Broz, Michal; Brucken, Erik Jens; Bruna, Elena; Bruno, Giuseppe Eugenio; Budnikov, Dmitry; Buesching, Henner; Bufalino, Stefania; Buncic, Predrag; Busch, Oliver; Buthelezi, Edith Zinhle; Bashir Butt, Jamila; Buxton, Jesse Thomas; Cabala, Jan; Caffarri, Davide; Cai, Xu; Caines, Helen Louise; Calero Diaz, Liliet; Caliva, Alberto; Calvo Villar, Ernesto; Camerini, Paolo; Carena, Francesco; Carena, Wisla; Carnesecchi, Francesca; Castillo Castellanos, Javier Ernesto; Castro, Andrew John; Casula, Ester Anna Rita; Ceballos Sanchez, Cesar; Cepila, Jan; Cerello, Piergiorgio; Cerkala, Jakub; Chang, Beomsu; Chapeland, Sylvain; Chartier, Marielle; Charvet, Jean-Luc Fernand; Chattopadhyay, Subhasis; Chattopadhyay, Sukalyan; Chauvin, Alex; Chelnokov, Volodymyr; Cherney, Michael Gerard; Cheshkov, Cvetan Valeriev; Cheynis, Brigitte; Chibante Barroso, Vasco Miguel; Dobrigkeit Chinellato, David; Cho, Soyeon; Chochula, Peter; Choi, Kyungeon; Chojnacki, Marek; Choudhury, Subikash; Christakoglou, Panagiotis; Christensen, Christian Holm; Christiansen, Peter; Chujo, Tatsuya; Chung, Suh-Urk; Cicalo, Corrado; Cifarelli, Luisa; Cindolo, Federico; Cleymans, Jean Willy Andre; 
Colamaria, Fabio Filippo; Colella, Domenico; Collu, Alberto; Colocci, Manuel; Conesa Balbastre, Gustavo; Conesa Del Valle, Zaida; Connors, Megan Elizabeth; Contreras Nuno, Jesus Guillermo; Cormier, Thomas Michael; Corrales Morales, Yasser; Cortes Maldonado, Ismael; Cortese, Pietro; Cosentino, Mauro Rogerio; Costa, Filippo; Crochet, Philippe; Cruz Albino, Rigoberto; Cuautle Flores, Eleazar; Cunqueiro Mendez, Leticia; Dahms, Torsten; Dainese, Andrea; Danisch, Meike Charlotte; Danu, Andrea; Das, Debasish; Das, Indranil; Das, Supriya; Dash, Ajay Kumar; Dash, Sadhana; De, Sudipan; De Caro, Annalisa; De Cataldo, Giacinto; De Conti, Camila; De Cuveland, Jan; De Falco, Alessandro; De Gruttola, Daniele; De Marco, Nora; De Pasquale, Salvatore; Deisting, Alexander; Deloff, Andrzej; Denes, Ervin Sandor; Deplano, Caterina; Dhankher, Preeti; Di Bari, Domenico; Di Mauro, Antonio; Di Nezza, Pasquale; Diaz Corchero, Miguel Angel; Dietel, Thomas; Dillenseger, Pascal; Divia, Roberto; Djuvsland, Oeystein; Dobrin, Alexandru Florin; Domenicis Gimenez, Diogenes; Donigus, Benjamin; Dordic, Olja; Drozhzhova, Tatiana; Dubey, Anand Kumar; Dubla, Andrea; Ducroux, Laurent; Dupieux, Pascal; Ehlers Iii, Raymond James; Elia, Domenico; Endress, Eric; Engel, Heiko; Epple, Eliane; Erazmus, Barbara Ewa; Erdemir, Irem; Erhardt, Filip; Espagnon, Bruno; Estienne, Magali Danielle; Esumi, Shinichi; Eum, Jongsik; Evans, David; Evdokimov, Sergey; Eyyubova, Gyulnara; Fabbietti, Laura; Fabris, Daniela; Faivre, Julien; Fantoni, Alessandra; Fasel, Markus; Feldkamp, Linus; Feliciello, Alessandro; Feofilov, Grigorii; Ferencei, Jozef; Fernandez Tellez, Arturo; Gonzalez Ferreiro, Elena; Ferretti, Alessandro; Festanti, Andrea; Feuillard, Victor Jose Gaston; Figiel, Jan; Araujo Silva Figueredo, Marcel; Filchagin, Sergey; Finogeev, Dmitry; Fionda, Fiorella; Fiore, Enrichetta Maria; Fleck, Martin Gabriel; Floris, Michele; Foertsch, Siegfried Valentin; Foka, Panagiota; Fokin, Sergey; Fragiacomo, Enrico; Francescon, Andrea; Frankenfeld, Ulrich Michael; Fronze, Gabriele Gaetano; Fuchs, Ulrich; Furget, Christophe; Furs, Artur; Fusco Girard, Mario; Gaardhoeje, Jens Joergen; Gagliardi, Martino; Gago Medina, Alberto Martin; Gallio, Mauro; Gangadharan, Dhevan Raja; Ganoti, Paraskevi; Gao, Chaosong; Garabatos Cuadrado, Jose; Garcia-Solis, Edmundo Javier; Gargiulo, Corrado; Gasik, Piotr Jan; Gauger, Erin Frances; Germain, Marie; Gheata, Andrei George; Gheata, Mihaela; Ghosh, Premomoy; Ghosh, Sanjay Kumar; Gianotti, Paola; Giubellino, Paolo; Giubilato, Piero; Gladysz-Dziadus, Ewa; Glassel, Peter; Gomez Coral, Diego Mauricio; Gomez Ramirez, Andres; Sanchez Gonzalez, Andres; Gonzalez, Victor; Gonzalez Zamora, Pedro; Gorbunov, Sergey; Gorlich, Lidia Maria; Gotovac, Sven; Grabski, Varlen; Grachov, Oleg Anatolievich; Graczykowski, Lukasz Kamil; Graham, Katie Leanne; Grelli, Alessandro; Grigoras, Alina Gabriela; Grigoras, Costin; Grigoryev, Vladislav; Grigoryan, Ara; Grigoryan, Smbat; Grynyov, Borys; Grion, Nevio; Gronefeld, Julius Maximilian; Grosse-Oetringhaus, Jan Fiete; Grosso, Raffaele; Guber, Fedor; Guernane, Rachid; Guerzoni, Barbara; Gulbrandsen, Kristjan Herlache; Gunji, Taku; Gupta, Anik; Gupta, Ramni; Haake, Rudiger; Haaland, Oystein Senneset; Hadjidakis, Cynthia Marie; Haiduc, Maria; Hamagaki, Hideki; Hamar, Gergoe; Hamon, Julien Charles; Harris, John William; Harton, Austin Vincent; Hatzifotiadou, Despina; Hayashi, Shinichi; Heckel, Stefan Thomas; Hellbar, Ernst; Helstrup, Haavard; Herghelegiu, Andrei Ionut; Herrera Corral, Gerardo Antonio; Hess, 
Benjamin Andreas; Hetland, Kristin Fanebust; Hillemanns, Hartmut; Hippolyte, Boris; Horak, David; Hosokawa, Ritsuya; Hristov, Peter Zahariev; Humanic, Thomas; Hussain, Nur; Hussain, Tahir; Hutter, Dirk; Hwang, Dae Sung; Ilkaev, Radiy; Inaba, Motoi; Incani, Elisa; Ippolitov, Mikhail; Irfan, Muhammad; Ivanov, Marian; Ivanov, Vladimir; Izucheev, Vladimir; Jacazio, Nicolo; Jacobs, Peter Martin; Jadhav, Manoj Bhanudas; Jadlovska, Slavka; Jadlovsky, Jan; Jahnke, Cristiane; Jakubowska, Monika Joanna; Jang, Haeng Jin; Janik, Malgorzata Anna; Pahula Hewage, Sandun; Jena, Chitrasen; Jena, Satyajit; Jimenez Bustamante, Raul Tonatiuh; Jones, Peter Graham; Jusko, Anton; Kalinak, Peter; Kalweit, Alexander Philipp; Kamin, Jason Adrian; Kang, Ju Hwan; Kaplin, Vladimir; Kar, Somnath; Karasu Uysal, Ayben; Karavichev, Oleg; Karavicheva, Tatiana; Karayan, Lilit; Karpechev, Evgeny; Kebschull, Udo Wolfgang; Keidel, Ralf; Keijdener, Darius Laurens; Keil, Markus; Khan, Mohammed Mohisin; Khan, Palash; Khan, Shuaib Ahmad; Khanzadeev, Alexei; Kharlov, Yury; Kileng, Bjarte; Kim, Do Won; Kim, Dong Jo; Kim, Daehyeok; Kim, Hyeonjoong; Kim, Jinsook; Kim, Minwoo; Kim, Se Yong; Kim, Taesoo; Kirsch, Stefan; Kisel, Ivan; Kiselev, Sergey; Kisiel, Adam Ryszard; Kiss, Gabor; Klay, Jennifer Lynn; Klein, Carsten; Klein, Jochen; Klein-Boesing, Christian; Klewin, Sebastian; Kluge, Alexander; Knichel, Michael Linus; Knospe, Anders Garritt; Kobdaj, Chinorat; Kofarago, Monika; Kollegger, Thorsten; Kolozhvari, Anatoly; Kondratev, Valerii; Kondratyeva, Natalia; Kondratyuk, Evgeny; Konevskikh, Artem; Kopcik, Michal; Kostarakis, Panagiotis; Kour, Mandeep; Kouzinopoulos, Charalampos; Kovalenko, Oleksandr; Kovalenko, Vladimir; Kowalski, Marek; Koyithatta Meethaleveedu, Greeshma; Kralik, Ivan; Kravcakova, Adela; Krivda, Marian; Krizek, Filip; Kryshen, Evgeny; Krzewicki, Mikolaj; Kubera, Andrew Michael; Kucera, Vit; Kuhn, Christian Claude; Kuijer, Paulus Gerardus; Kumar, Ajay; Kumar, Jitendra; Kumar, Lokesh; Kumar, Shyam; Kurashvili, Podist; Kurepin, Alexander; Kurepin, Alexey; Kuryakin, Alexey; Kweon, Min Jung; Kwon, Youngil; La Pointe, Sarah Louise; La Rocca, Paola; Ladron De Guevara, Pedro; Lagana Fernandes, Caio; Lakomov, Igor; Langoy, Rune; Lara Martinez, Camilo Ernesto; Lardeux, Antoine Xavier; Lattuca, Alessandra; Laudi, Elisa; Lea, Ramona; Leardini, Lucia; Lee, Graham Richard; Lee, Seongjoo; Lehas, Fatiha; Lemmon, Roy Crawford; Lenti, Vito; Leogrande, Emilia; Leon Monzon, Ildefonso; Leon Vargas, Hermes; Leoncino, Marco; Levai, Peter; Li, Shuang; Li, Xiaomei; Lien, Jorgen Andre; Lietava, Roman; Lindal, Svein; Lindenstruth, Volker; Lippmann, Christian; Lisa, Michael Annan; Ljunggren, Hans Martin; Lodato, Davide Francesco; Lonne, Per-Ivar; Loginov, Vitaly; Loizides, Constantinos; Lopez, Xavier Bernard; Lopez Torres, Ernesto; Lowe, Andrew John; Luettig, Philipp Johannes; Lunardon, Marcello; Luparello, Grazia; Lutz, Tyler Harrison; Maevskaya, Alla; Mager, Magnus; Mahajan, Sanjay; Mahmood, Sohail Musa; Maire, Antonin; Majka, Richard Daniel; Malaev, Mikhail; Maldonado Cervantes, Ivonne Alicia; Malinina, Liudmila; Mal'Kevich, Dmitry; Malzacher, Peter; Mamonov, Alexander; Manko, Vladislav; Manso, Franck; Manzari, Vito; Marchisone, Massimiliano; Mares, Jiri; Margagliotti, Giacomo Vito; Margotti, Anselmo; Margutti, Jacopo; Marin, Ana Maria; Markert, Christina; Marquard, Marco; Martin, Nicole Alice; Martin Blanco, Javier; Martinengo, Paolo; Martinez Hernandez, Mario Ivan; Martinez-Garcia, Gines; Martinez Pedreira, Miguel; Mas, Alexis Jean-Michel; 
Masciocchi, Silvia; Masera, Massimo; Masoni, Alberto; Mastroserio, Annalisa; Matyja, Adam Tomasz; Mayer, Christoph; Mazer, Joel Anthony; Mazzoni, Alessandra Maria; Mcdonald, Daniel; Meddi, Franco; Melikyan, Yuri; Menchaca-Rocha, Arturo Alejandro; Meninno, Elisa; Mercado-Perez, Jorge; Meres, Michal; Miake, Yasuo; Mieskolainen, Matti Mikael; Mikhaylov, Konstantin; Milano, Leonardo; Milosevic, Jovan; Mischke, Andre; Mishra, Aditya Nath; Miskowiec, Dariusz Czeslaw; Mitra, Jubin; Mitu, Ciprian Mihai; Mohammadi, Naghmeh; Mohanty, Bedangadas; Molnar, Levente; Montano Zetina, Luis Manuel; Montes Prado, Esther; Moreira De Godoy, Denise Aparecida; Perez Moreno, Luis Alberto; Moretto, Sandra; Morreale, Astrid; Morsch, Andreas; Muccifora, Valeria; Mudnic, Eugen; Muhlheim, Daniel Michael; Muhuri, Sanjib; Mukherjee, Maitreyee; Mulligan, James Declan; Gameiro Munhoz, Marcelo; Munzer, Robert Helmut; Murakami, Hikari; Murray, Sean; Musa, Luciano; Musinsky, Jan; Naik, Bharati; Nair, Rahul; Nandi, Basanta Kumar; Nania, Rosario; Nappi, Eugenio; Naru, Muhammad Umair; Ferreira Natal Da Luz, Pedro Hugo; Nattrass, Christine; Rosado Navarro, Sebastian; Nayak, Kishora; Nayak, Ranjit; Nayak, Tapan Kumar; Nazarenko, Sergey; Nedosekin, Alexander; Nellen, Lukas; Ng, Fabian; Nicassio, Maria; Niculescu, Mihai; Niedziela, Jeremi; Nielsen, Borge Svane; Nikolaev, Sergey; Nikulin, Sergey; Nikulin, Vladimir; Noferini, Francesco; Nomokonov, Petr; Nooren, Gerardus; Cabanillas Noris, Juan Carlos; Norman, Jaime; Nyanin, Alexander; Nystrand, Joakim Ingemar; Oeschler, Helmut Oskar; Oh, Saehanseul; Oh, Sun Kun; Ohlson, Alice Elisabeth; Okatan, Ali; Okubo, Tsubasa; Olah, Laszlo; Oleniacz, Janusz; Oliveira Da Silva, Antonio Carlos; Oliver, Michael Henry; Onderwaater, Jacobus; Oppedisano, Chiara; Orava, Risto; Oravec, Matej; Ortiz Velasquez, Antonio; Oskarsson, Anders Nils Erik; Otwinowski, Jacek Tomasz; Oyama, Ken; Ozdemir, Mahmut; Pachmayer, Yvonne Chiara; Pagano, Davide; Pagano, Paola; Paic, Guy; Pal, Susanta Kumar; Pan, Jinjin; Pandey, Ashutosh Kumar; Papikyan, Vardanush; Pappalardo, Giuseppe; Pareek, Pooja; Park, Woojin; Parmar, Sonia; Passfeld, Annika; Paticchio, Vincenzo; Patra, Rajendra Nath; Paul, Biswarup; Pei, Hua; Peitzmann, Thomas; Pereira Da Costa, Hugo Denis Antonio; Peresunko, Dmitry Yurevich; Perez Lara, Carlos Eugenio; Perez Lezama, Edgar; Peskov, Vladimir; Pestov, Yury; Petracek, Vojtech; Petrov, Viacheslav; Petrovici, Mihai; Petta, Catia; Piano, Stefano; Pikna, Miroslav; Pillot, Philippe; Ozelin De Lima Pimentel, Lais; Pinazza, Ombretta; Pinsky, Lawrence; Piyarathna, Danthasinghe; Ploskon, Mateusz Andrzej; Planinic, Mirko; Pluta, Jan Marian; Pochybova, Sona; Podesta Lerma, Pedro Luis Manuel; Poghosyan, Martin; Polishchuk, Boris; Poljak, Nikola; Poonsawat, Wanchaloem; Pop, Amalia; Porteboeuf, Sarah Julie; Porter, R Jefferson; Pospisil, Jan; Prasad, Sidharth Kumar; Preghenella, Roberto; Prino, Francesco; Pruneau, Claude Andre; Pshenichnov, Igor; Puccio, Maximiliano; Puddu, Giovanna; Pujahari, Prabhat Ranjan; Punin, Valery; Putschke, Jorn Henning; Qvigstad, Henrik; Rachevski, Alexandre; Raha, Sibaji; Rajput, Sonia; Rak, Jan; Rakotozafindrabe, Andry Malala; Ramello, Luciano; Rami, Fouad; Raniwala, Rashmi; Raniwala, Sudhir; Rasanen, Sami Sakari; Rascanu, Bogdan Theodor; Rathee, Deepika; Read, Kenneth Francis; Redlich, Krzysztof; Reed, Rosi Jan; Rehman, Attiq Ur; Reichelt, Patrick Simon; Reidt, Felix; Ren, Xiaowen; Renfordt, Rainer Arno Ernst; Reolon, Anna Rita; Reshetin, Andrey; Reygers, Klaus Johannes; Riabov, Viktor; 
Ricci, Renato Angelo; Richert, Tuva Ora Herenui; Richter, Matthias Rudolph; Riedler, Petra; Riegler, Werner; Riggi, Francesco; Ristea, Catalin-Lucian; Rocco, Elena; Rodriguez Cahuantzi, Mario; Rodriguez Manso, Alis; Roeed, Ketil; Rogochaya, Elena; Rohr, David Michael; Roehrich, Dieter; Ronchetti, Federico; Ronflette, Lucile; Rosnet, Philippe; Rossi, Andrea; Roukoutakis, Filimon; Roy, Ankhi; Roy, Christelle Sophie; Roy, Pradip Kumar; Rubio Montero, Antonio Juan; Rui, Rinaldo; Russo, Riccardo; Ryabinkin, Evgeny; Ryabov, Yury; Rybicki, Andrzej; Saarinen, Sampo; Sadhu, Samrangy; Sadovskiy, Sergey; Safarik, Karel; Sahlmuller, Baldo; Sahoo, Pragati; Sahoo, Raghunath; Sahoo, Sarita; Sahu, Pradip Kumar; Saini, Jogender; Sakai, Shingo; Saleh, Mohammad Ahmad; Salzwedel, Jai Samuel Nielsen; Sambyal, Sanjeev Singh; Samsonov, Vladimir; Sandor, Ladislav; Sandoval, Andres; Sano, Masato; Sarkar, Debojit; Sarkar, Nachiketa; Sarma, Pranjal; Scapparone, Eugenio; Scarlassara, Fernando; Schiaua, Claudiu Cornel; Schicker, Rainer Martin; Schmidt, Christian Joachim; Schmidt, Hans Rudolf; Schuchmann, Simone; Schukraft, Jurgen; Schulc, Martin; Schutz, Yves Roland; Schwarz, Kilian Eberhard; Schweda, Kai Oliver; Scioli, Gilda; Scomparin, Enrico; Scott, Rebecca Michelle; Sefcik, Michal; Seger, Janet Elizabeth; Sekiguchi, Yuko; Sekihata, Daiki; Selyuzhenkov, Ilya; Senosi, Kgotlaesele; Senyukov, Serhiy; Serradilla Rodriguez, Eulogio; Sevcenco, Adrian; Shabanov, Arseniy; Shabetai, Alexandre; Shadura, Oksana; Shahoyan, Ruben; Shahzad, Muhammed Ikram; Shangaraev, Artem; Sharma, Ankita; Sharma, Mona; Sharma, Monika; Sharma, Natasha; Sheikh, Ashik Ikbal; Shigaki, Kenta; Shou, Qiye; Shtejer Diaz, Katherin; Sibiryak, Yury; Siddhanta, Sabyasachi; Sielewicz, Krzysztof Marek; Siemiarczuk, Teodor; Silvermyr, David Olle Rickard; Silvestre, Catherine Micaela; Simatovic, Goran; Simonetti, Giuseppe; Singaraju, Rama Narayana; Singh, Ranbir; Singha, Subhash; Singhal, Vikas; Sinha, Bikash; Sarkar - Sinha, Tinku; Sitar, Branislav; Sitta, Mario; Skaali, Bernhard; Slupecki, Maciej; Smirnov, Nikolai; Snellings, Raimond; Snellman, Tomas Wilhelm; Song, Jihye; Song, Myunggeun; Song, Zixuan; Soramel, Francesca; Sorensen, Soren Pontoppidan; Derradi De Souza, Rafael; Sozzi, Federica; Spacek, Michal; Spiriti, Eleuterio; Sputowska, Iwona Anna; Spyropoulou-Stassinaki, Martha; Stachel, Johanna; Stan, Ionel; Stankus, Paul; Stenlund, Evert Anders; Steyn, Gideon Francois; Stiller, Johannes Hendrik; Stocco, Diego; Strmen, Peter; Alarcon Do Passo Suaide, Alexandre; Sugitate, Toru; Suire, Christophe Pierre; Suleymanov, Mais Kazim Oglu; Suljic, Miljenko; Sultanov, Rishat; Sumbera, Michal; Sumowidagdo, Suharyo; Szabo, Alexander; Szanto De Toledo, Alejandro; Szarka, Imrich; Szczepankiewicz, Adam; Szymanski, Maciej Pawel; Tabassam, Uzma; Takahashi, Jun; Tambave, Ganesh Jagannath; Tanaka, Naoto; Tarhini, Mohamad; Tariq, Mohammad; Tarzila, Madalina-Gabriela; Tauro, Arturo; Tejeda Munoz, Guillermo; Telesca, Adriana; Terasaki, Kohei; Terrevoli, Cristina; Teyssier, Boris; Thaeder, Jochen Mathias; Thakur, Dhananjaya; Thomas, Deepa; Tieulent, Raphael Noel; Timmins, Anthony Robert; Toia, Alberica; Trogolo, Stefano; Trombetta, Giuseppe; Trubnikov, Victor; Trzaska, Wladyslaw Henryk; Tsuji, Tomoya; Tumkin, Alexandr; Turrisi, Rosario; Tveter, Trine Spedstad; Ullaland, Kjetil; Uras, Antonio; Usai, Gianluca; Utrobicic, Antonija; Vala, Martin; Valencia Palomo, Lizardo; Vallero, Sara; Van Der Maarel, Jasper; Van Hoorne, Jacobus Willem; Van Leeuwen, Marco; Vanat, Tomas; Vande 
Vyvre, Pierre; Varga, Dezso; Diozcora Vargas Trevino, Aurora; Vargyas, Marton; Varma, Raghava; Vasileiou, Maria; Vasiliev, Andrey; Vauthier, Astrid; Vechernin, Vladimir; Veen, Annelies Marianne; Veldhoen, Misha; Velure, Arild; Vercellin, Ermanno; Vergara Limon, Sergio; Vernet, Renaud; Verweij, Marta; Vickovic, Linda; Viesti, Giuseppe; Viinikainen, Jussi Samuli; Vilakazi, Zabulon; Villalobos Baillie, Orlando; Villatoro Tello, Abraham; Vinogradov, Alexander; Vinogradov, Leonid; Vinogradov, Yury; Virgili, Tiziano; Vislavicius, Vytautas; Viyogi, Yogendra; Vodopyanov, Alexander; Volkl, Martin Andreas; Voloshin, Kirill; Voloshin, Sergey; Volpe, Giacomo; Von Haller, Barthelemy; Vorobyev, Ivan; Vranic, Danilo; Vrlakova, Janka; Vulpescu, Bogdan; Wagner, Boris; Wagner, Jan; Wang, Hongkai; Wang, Mengliang; Watanabe, Daisuke; Watanabe, Yosuke; Weber, Michael; Weber, Steffen Georg; Weiser, Dennis Franz; Wessels, Johannes Peter; Westerhoff, Uwe; Whitehead, Andile Mothegi; Wiechula, Jens; Wikne, Jon; Wilk, Grzegorz Andrzej; Wilkinson, Jeremy John; Williams, Crispin; Windelband, Bernd Stefan; Winn, Michael Andreas; Yang, Hongyan; Yang, Ping; Yano, Satoshi; Yasin, Zafar; Yin, Zhongbao; Yokoyama, Hiroki; Yoo, In-Kwon; Yoon, Jin Hee; Yurchenko, Volodymyr; Yushmanov, Igor; Zaborowska, Anna; Zaccolo, Valentina; Zaman, Ali; Zampolli, Chiara; Correia Zanoli, Henrique Jose; Zaporozhets, Sergey; Zardoshti, Nima; Zarochentsev, Andrey; Zavada, Petr; Zavyalov, Nikolay; Zbroszczyk, Hanna Paulina; Zgura, Sorin Ion; Zhalov, Mikhail; Zhang, Haitao; Zhang, Xiaoming; Zhang, Yonghong; Chunhui, Zhang; Zhang, Zuman; Zhao, Chengxin; Zhigareva, Natalia; Zhou, Daicui; Zhou, You; Zhou, Zhuo; Zhu, Hongsheng; Zhu, Jianhui; Zichichi, Antonino; Zimmermann, Alice; Zimmermann, Markus Bernhard; Zinovjev, Gennady; Zyzak, Maksym

    2016-05-25

    We present a Bayesian approach to particle identification (PID) within the ALICE experiment. The aim is to more effectively combine the particle identification capabilities of its various detectors. After a brief explanation of the adopted methodology and formalism, the performance of the Bayesian PID approach for charged pions, kaons and protons in the central barrel of ALICE is studied. PID is performed via measurements of specific energy loss (dE/dx) and time-of-flight. PID efficiencies and misidentification probabilities are extracted and compared with Monte Carlo simulations using high purity samples of identified particles in the decay channels ${\rm K}_{\rm S}^{\rm 0}\rightarrow \pi^+\pi^-$, $\phi\rightarrow {\rm K}^-{\rm K}^+$ and $\Lambda\rightarrow{\rm p}\pi^-$ in p–Pb collisions at $\sqrt{s_{\rm NN}} = 5.02$ TeV. In order to thoroughly assess the validity of the Bayesian approach, this methodology was used to obtain corrected $p_{\rm T}$ spectra of pions, kaons, protons, and D$^0$ mesons in pp coll...
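
    The combination rule underlying such an approach is ordinary Bayes' theorem: species priors multiplied by the product of per-detector likelihoods, then normalised. The sketch below illustrates this with invented Gaussian detector responses and abundances; it is not the ALICE detector response parameterisation.

```python
# Toy sketch of the Bayesian PID combination idea: posterior species
# probabilities from priors and per-detector likelihoods.  All numbers are
# invented for illustration.
import numpy as np
from scipy.stats import norm

species = ["pion", "kaon", "proton"]
priors = np.array([0.70, 0.20, 0.10])           # assumed abundances

# Expected detector signals for a track of a given momentum (arbitrary units).
expected_dedx = np.array([1.00, 1.35, 1.80])
expected_tof = np.array([10.0, 10.6, 11.5])     # ns

measured_dedx, sigma_dedx = 1.30, 0.10
measured_tof, sigma_tof = 10.5, 0.15

like = (norm.pdf(measured_dedx, expected_dedx, sigma_dedx)
        * norm.pdf(measured_tof, expected_tof, sigma_tof))
posterior = priors * like / np.sum(priors * like)

for s, p in zip(species, posterior):
    print(f"P({s} | dE/dx, TOF) = {p:.3f}")
```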

  5. Discriminative Bayesian Dictionary Learning for Classification.

    Science.gov (United States)

    Akhtar, Naveed; Shafait, Faisal; Mian, Ajmal

    2016-12-01

    We propose a Bayesian approach to learn discriminative dictionaries for sparse representation of data. The proposed approach infers probability distributions over the atoms of a discriminative dictionary using a finite approximation of the Beta Process. It also computes sets of Bernoulli distributions that associate class labels to the learned dictionary atoms. This association signifies the selection probabilities of the dictionary atoms in the expansion of class-specific data. Furthermore, the non-parametric character of the proposed approach allows it to infer the correct size of the dictionary. We exploit the aforementioned Bernoulli distributions in separately learning a linear classifier. The classifier uses the same hierarchical Bayesian model as the dictionary, which we present along with the analytical inference solution for Gibbs sampling. For classification, a test instance is first sparsely encoded over the learned dictionary and the codes are fed to the classifier. We performed experiments for face and action recognition, and object and scene-category classification using five public datasets and compared the results with state-of-the-art discriminative sparse representation approaches. Experiments show that the proposed Bayesian approach consistently outperforms the existing approaches.

  6. Bayesian Analysis of Individual Level Personality Dynamics

    Directory of Open Access Journals (Sweden)

    Edward Cripps

    2016-07-01

    Full Text Available A Bayesian technique with analyses of within-person processes at the level of the individual is presented. The approach is used to examine if the patterns of within-person responses on a 12 trial simulation task are consistent with the predictions of ITA theory (Dweck, 1999). ITA theory states that the performance of an individual with an entity theory of ability is more likely to spiral down following a failure experience than the performance of an individual with an incremental theory of ability. This is because entity theorists interpret failure experiences as evidence of a lack of ability, which they believe is largely innate and therefore relatively fixed; whilst incremental theorists believe in the malleability of abilities and interpret failure experiences as evidence of more controllable factors such as poor strategy or lack of effort. The results of our analyses support ITA theory at both the within- and between-person levels of analyses and demonstrate the benefits of Bayesian techniques for the analysis of within-person processes. These include more formal specification of the theory and the ability to draw inferences about each individual, which allows for more nuanced interpretations of individuals within a personality category, such as differences in the individual probabilities of spiralling. While Bayesian techniques have many potential advantages for the analyses of within-person processes at the individual level, ease of use is not one of them for psychologists trained in traditional frequentist statistical techniques.

  7. Bayesian Recurrent Neural Network for Language Modeling.

    Science.gov (United States)

    Chien, Jen-Tzung; Ku, Yuan-Chu

    2016-02-01

    A language model (LM) assigns a probability to a word sequence and provides the solution to word prediction for a variety of information systems. A recurrent neural network (RNN) is powerful for learning the large-span dynamics of a word sequence in the continuous space. However, the training of the RNN-LM is an ill-posed problem because of too many parameters from a large dictionary size and a high-dimensional hidden layer. This paper presents a Bayesian approach to regularize the RNN-LM and applies it to continuous speech recognition. We aim to penalize the overly complex RNN-LM by compensating for the uncertainty of the estimated model parameters, which is represented by a Gaussian prior. The objective function in a Bayesian classification network is formed as the regularized cross-entropy error function. The regularized model is constructed not only by calculating the regularized parameters according to the maximum a posteriori criterion but also by estimating the Gaussian hyperparameter by maximizing the marginal likelihood. A rapid approximation to a Hessian matrix is developed to implement the Bayesian RNN-LM (BRNN-LM) by selecting a small set of salient outer-products. The proposed BRNN-LM achieves a sparser model than the RNN-LM. Experiments on different corpora show the robustness of system performance by applying the rapid BRNN-LM under different conditions.
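
    For a zero-mean Gaussian prior on the weights, the regularized objective described above reduces to cross-entropy plus an L2 penalty whose strength is set by the prior variance. A minimal sketch with placeholder predictions and weights (not an actual RNN language model):

```python
# Minimal sketch of a regularized cross-entropy objective: negative
# log-likelihood of the observed next words plus a Gaussian-prior (L2)
# penalty on the parameters.  All arrays are placeholders.
import numpy as np

def regularized_cross_entropy(pred_probs, targets, weights, prior_var):
    # Negative log-likelihood of the observed next words...
    nll = -np.sum(np.log(pred_probs[np.arange(len(targets)), targets]))
    # ...plus the negative log of a zero-mean Gaussian prior on the weights
    # (up to a constant), i.e. an L2 penalty with strength 1/(2*prior_var).
    penalty = np.sum(weights ** 2) / (2.0 * prior_var)
    return nll + penalty

pred_probs = np.array([[0.7, 0.2, 0.1],
                       [0.1, 0.6, 0.3]])   # model's next-word distributions
targets = np.array([0, 1])                 # observed next words
weights = np.array([0.5, -1.2, 0.3])
print(regularized_cross_entropy(pred_probs, targets, weights, prior_var=10.0))
```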

  8. Bayesian statistical approaches to evaluating cognitive models.

    Science.gov (United States)

    Annis, Jeffrey; Palmeri, Thomas J

    2017-11-28

    Cognitive models aim to explain complex human behavior in terms of hypothesized mechanisms of the mind. These mechanisms can be formalized in terms of mathematical structures containing parameters that are theoretically meaningful. For example, in the case of perceptual decision making, model parameters might correspond to theoretical constructs like response bias, evidence quality, response caution, and the like. Formal cognitive models go beyond verbal models in that cognitive mechanisms are instantiated in terms of mathematics and they go beyond statistical models in that cognitive model parameters are psychologically interpretable. We explore three key elements used to formally evaluate cognitive models: parameter estimation, model prediction, and model selection. We compare and contrast traditional approaches with Bayesian statistical approaches to performing each of these three elements. Traditional approaches rely on an array of seemingly ad hoc techniques, whereas Bayesian statistical approaches rely on a single, principled, internally consistent system. We illustrate the Bayesian statistical approach to evaluating cognitive models using a running example of the Linear Ballistic Accumulator model of decision making (Brown SD, Heathcote A. The simplest complete model of choice response time: linear ballistic accumulation. Cogn Psychol 2008, 57:153-178). This article is categorized under: Neuroscience > Computation Psychology > Reasoning and Decision Making Psychology > Theory and Methods. © 2017 Wiley Periodicals, Inc.

  9. Dynamic Bayesian learning by expectation propagation

    Science.gov (United States)

    Wei, Tao

    2005-12-01

    For modeling time-series data, it is natural to use directed graphical models, since they can capture the time flow. If the arcs of a graphical model are all directed, both within and between time slices, the model is called a dynamic Bayesian network (DBN). Dynamic Bayesian networks are becoming increasingly important for research and applications in the areas of machine learning, artificial intelligence and signal processing. They have several advantages over other data analysis methods, including rule bases, neural networks, and decision trees. In this paper, we explore dynamic Bayesian learning in DBNs using a new deterministic approximate inference method called expectation propagation (EP). EP is an extension of belief propagation and was developed in machine learning. A crucial step of EP is the recycling of likelihoods, which makes further improvement over the extended Kalman smoother possible. This study examines EP solutions to a non-linear state-space model and compares their performance with other inference methods such as the particle filter and the extended Kalman filter.

  10. Bayesian Inference of a Multivariate Regression Model

    Directory of Open Access Journals (Sweden)

    Marick S. Sinay

    2014-01-01

    Full Text Available We explore Bayesian inference of a multivariate linear regression model with use of a flexible prior for the covariance structure. The commonly adopted Bayesian setup involves the conjugate prior, a multivariate normal distribution for the regression coefficients and an inverse Wishart specification for the covariance matrix. Here we depart from this approach and propose a novel Bayesian estimator for the covariance. A multivariate normal prior for the unique elements of the matrix logarithm of the covariance matrix is considered. Such a structure allows for a richer class of prior distributions for the covariance, with respect to strength of beliefs in prior location hyperparameters, as well as the added ability to model potential correlation amongst the covariance structure. The posterior moments of all relevant parameters of interest are calculated based upon numerical results via a Markov chain Monte Carlo procedure. The Metropolis-Hastings-within-Gibbs algorithm is invoked to account for the construction of a proposal density that closely matches the shape of the target posterior distribution. As an application of the proposed technique, we investigate a multiple regression based upon the 1980 High School and Beyond Survey.
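
    The matrix-logarithm parameterisation at the heart of this prior can be sketched as follows: an unconstrained symmetric matrix A is mapped to a valid covariance by the matrix exponential, so a multivariate normal prior on the unique elements of A never produces an invalid covariance. The numbers are illustrative, and the paper's Metropolis-Hastings-within-Gibbs sampler is not reproduced here.

```python
# Sketch of the matrix-logarithm parameterisation: a symmetric matrix A with
# unconstrained real entries maps to a symmetric positive definite covariance
# via the matrix exponential.
import numpy as np
from scipy.linalg import expm, logm

# Unique elements of the symmetric log-covariance matrix (lower triangle).
a_unique = np.array([0.2, -0.3, 0.5])      # [A11, A21, A22]
A = np.array([[a_unique[0], a_unique[1]],
              [a_unique[1], a_unique[2]]])

Sigma = expm(A)                             # always symmetric positive definite
print("covariance:\n", Sigma)
print("eigenvalues:", np.linalg.eigvalsh(Sigma))   # strictly positive
print("recovered log-covariance:\n", logm(Sigma))  # maps back to A
```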

  11. Rating exposure control using Bayesian decision analysis.

    Science.gov (United States)

    Hewett, Paul; Logan, Perry; Mulhausen, John; Ramachandran, Gurumurthy; Banerjee, Sudipto

    2006-10-01

    A model is presented for applying Bayesian statistical techniques to the problem of determining, from the usual limited number of exposure measurements, whether the exposure profile for a similar exposure group can be considered a Category 0, 1, 2, 3, or 4 exposure. The categories were adapted from the AIHA exposure category scheme and range from (0) negligible or trivial exposure (i.e., the true X0.95 at or below 1% of the OEL) up to (4) poorly controlled exposure (the true X0.95 at or above the OEL). Unlike conventional statistical methods applied to exposure data, Bayesian statistical techniques can be adapted to explicitly take into account professional judgment or other sources of information. The analysis output consists of a distribution (i.e., set) of decision probabilities: e.g., 1%, 80%, 12%, 5%, and 2% probability that the exposure profile is a Category 0, 1, 2, 3, or 4 exposure. By inspection of these decision probabilities, rather than the often difficult to interpret point estimates (e.g., the sample 95th percentile exposure) and confidence intervals, a risk manager can be better positioned to arrive at an effective (i.e., correct) and efficient decision. Bayesian decision methods are based on the concepts of prior, likelihood, and posterior distributions of decision probabilities. The prior decision distribution represents what an industrial hygienist knows about this type of operation, using professional judgment; company, industry, or trade organization experience; historical or surrogate exposure data; or exposure modeling predictions. The likelihood decision distribution represents the decision probabilities based on an analysis of only the current data. The posterior decision distribution is derived by mathematically combining the functions underlying the prior and likelihood decision distributions, and represents the final decision probabilities. Advantages of Bayesian decision analysis include: (a) decision probabilities are easier to understand by risk managers and employees; (b) prior data, professional judgment, or
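
    The arithmetic behind the prior, likelihood, and posterior decision distributions described above is elementwise multiplication over the five categories followed by renormalisation, as in the sketch below (all probabilities invented for illustration).

```python
# Sketch of combining a prior decision distribution (judgment/history) with a
# likelihood decision distribution (current measurements) over five exposure
# categories.  Numbers are placeholders.
import numpy as np

categories = [0, 1, 2, 3, 4]
prior = np.array([0.05, 0.45, 0.30, 0.15, 0.05])       # judgment / history
likelihood = np.array([0.01, 0.30, 0.40, 0.20, 0.09])  # from current data

posterior = prior * likelihood
posterior /= posterior.sum()

for c, p in zip(categories, posterior):
    print(f"P(category {c} | data, prior) = {p:.2f}")
```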

  12. Fuzzy Naive Bayesian for constructing regulated network with weights.

    Science.gov (United States)

    Zhou, Xi Y; Tian, Xue W; Lim, Joon S

    2015-01-01

    In the data mining field, classification is a crucial technology, and the Bayesian classifier has been one of the hotspots in classification research. However, the assumptions of Naive Bayesian and Tree Augmented Naive Bayesian (TAN) classifiers treat attribute relations unfairly. Therefore, this paper proposes a new algorithm named Fuzzy Naive Bayesian (FNB), which uses a neural network with weighted membership functions (NEWFM) to extract regulated relations and weights. These regulated relations and weights can then be used to construct a regulated network. Finally, we classify the heart and Haberman datasets with the FNB network and compare the results with Naive Bayesian and TAN. The experimental results show that the FNB achieves a higher classification rate than Naive Bayesian and TAN.

  13. Bayesian generalized linear mixed modeling of Tuberculosis using informative priors

    OpenAIRE

    Ojo, Oluwatobi Blessing; Lougue, Siaka; Woldegerima, Woldegebriel Assefa

    2017-01-01

    TB is rated as one of the world's deadliest diseases, and South Africa ranks 9th among the 22 countries hardest hit by TB. Although many pieces of research have been carried out on this subject, this paper goes a step further by incorporating past knowledge into the model, using a Bayesian approach with informative priors. The Bayesian approach is becoming popular in data analysis, but most applications of Bayesian inference are limited to situations with non-informative priors, where...

  14. Bayesian inference with information content model check for Langevin equations

    Science.gov (United States)

    Krog, Jens; Lomholt, Michael A.

    2017-12-01

    The Bayesian data analysis framework has been proven to be a systematic and effective method of parameter inference and model selection for stochastic processes. In this work, we introduce an information content model check that may serve as a goodness-of-fit test, like the χ2 procedure, to complement conventional Bayesian analysis. We demonstrate this extended Bayesian framework on a system of Langevin equations, where coordinate-dependent mobilities and measurement noise hinder the normal mean-squared displacement approach.

  15. Bayesian extreme quantile regression for hidden Markov models

    OpenAIRE

    Koutsourelis, Antonios

    2012-01-01

    This thesis was submitted for the degree of Doctor of Philosophy and was awarded by Brunel University. The main contribution of this thesis is the introduction of Bayesian quantile regression for hidden Markov models, especially when we have to deal with extreme quantile regression analysis, as there is limited research on inferring conditional quantiles for hidden Markov models under a Bayesian approach. The first objective is to compare Bayesian extreme quantile regression and th...

  16. Bayesian non- and semi-parametric methods and applications

    CERN Document Server

    Rossi, Peter

    2014-01-01

    This book reviews and develops Bayesian non-parametric and semi-parametric methods for applications in microeconometrics and quantitative marketing. Most econometric models used in microeconomics and marketing applications involve arbitrary distributional assumptions. As more data becomes available, a natural desire to provide methods that relax these assumptions arises. Peter Rossi advocates a Bayesian approach in which specific distributional assumptions are replaced with more flexible distributions based on mixtures of normals. The Bayesian approach can use either a large but fixed number

  17. Bayesian area-to-point kriging using expert knowledge as informative priors

    Science.gov (United States)

    Truong, Phuong N.; Heuvelink, Gerard B. M.; Pebesma, Edzer

    2014-08-01

    Area-to-point (ATP) kriging is a common geostatistical framework to address the problem of spatial disaggregation or downscaling from block support observations (BSO) to point support (PoS) predictions for continuous variables. This approach requires that the PoS variogram is known. Without PoS observations, the parameters of the PoS variogram cannot be deterministically estimated from BSO, and as a result, the PoS variogram parameters are uncertain. In this research, we used Bayesian ATP conditional simulation to estimate the PoS variogram parameters from expert knowledge and BSO, and quantify uncertainty of the PoS variogram parameters and disaggregation outcomes. We first clarified that the nugget parameter of the PoS variogram cannot be estimated from only BSO. Next, we used statistical expert elicitation techniques to elicit the PoS variogram parameters from expert knowledge. These were used as informative priors in a Bayesian inference of the PoS variogram from BSO and implemented using a Markov chain Monte Carlo algorithm. ATP conditional simulation was done to obtain stochastic simulations at point support. MODIS (Moderate Resolution Imaging Spectroradiometer) atmospheric temperature profile data were used in an illustrative example. The outcomes from the Bayesian ATP inference for the Matérn variogram model parameters confirmed that the posterior distribution of the nugget parameter was effectively the same as its prior distribution; for the other parameters, the uncertainty was substantially decreased when BSO were introduced to the Bayesian ATP estimator. This confirmed that expert knowledge brought new information to infer the nugget effect at PoS while BSO only brought new information to infer the other parameters. Bayesian ATP conditional simulations provided a satisfactory way to quantify parameters and model uncertainty propagation through spatial disaggregation.

  18. A Bayesian Approach to Identifying New Risk Factors for Dementia: A Nationwide Population-Based Study.

    Science.gov (United States)

    Wen, Yen-Hsia; Wu, Shihn-Sheng; Lin, Chun-Hung Richard; Tsai, Jui-Hsiu; Yang, Pinchen; Chang, Yang-Pei; Tseng, Kuan-Hua

    2016-05-01

    Dementia is one of the most disabling and burdensome health conditions worldwide. In this study, we identified new potential risk factors for dementia from nationwide longitudinal population-based data by using Bayesian statistics. We first tested the consistency of the results obtained using Bayesian statistics with those obtained using classical frequentist probability for 4 recognized risk factors for dementia, namely severe head injury, depression, diabetes mellitus, and vascular diseases. Then, we used Bayesian statistics to verify 2 new potential risk factors for dementia, namely hearing loss and senile cataract, determined from Taiwan's National Health Insurance Research Database. We included a total of 6546 (6.0%) patients diagnosed with dementia. We observed older age, female sex, and lower income as independent risk factors for dementia. Moreover, we verified the 4 recognized risk factors for dementia in the older Taiwanese population; their odds ratios (ORs) ranged from 3.469 to 1.207. Furthermore, we observed that hearing loss (OR = 1.577) and senile cataract (OR = 1.549) were associated with an increased risk of dementia. We found that the results obtained using Bayesian statistics for assessing risk factors for dementia, such as head injury, depression, diabetes mellitus, and vascular diseases, were consistent with those obtained using classical frequentist probability. Moreover, hearing loss and senile cataract were found to be potential risk factors for dementia in the older Taiwanese population. Bayesian statistics could help clinicians explore other potential risk factors for dementia and develop appropriate treatment strategies for these patients.

  19. Bayesian approach to magnetotelluric tensor decomposition

    Directory of Open Access Journals (Sweden)

    Michel Menvielle

    2010-05-01

    Magnetotelluric directional analysis and impedance tensor decomposition are basic tools to validate a local/regional composite electrical model of the underlying structure. Bayesian stochastic methods approach the problem of parameter estimation and uncertainty characterization in a fully probabilistic fashion, through the use of posterior model probabilities. We use the standard Groom-Bailey 3-D local/2-D regional composite model in our Bayesian approach. We assume that the experimental impedance estimates are contaminated with Gaussian noise and define the likelihood of a particular composite model with respect to the observed data. We use non-informative, flat priors over physically reasonable intervals for the standard Groom-Bailey decomposition parameters. We apply two numerical methods, a Markov chain Monte Carlo procedure based on the Gibbs sampler and a single-component adaptive Metropolis algorithm. From the posterior samples, we characterize the estimates and uncertainties of the individual decomposition parameters by using the respective marginal posterior probabilities. We conclude that the stochastic scheme performs reliably for a variety of models, including the multisite and multifrequency case with up to

  20. Spatio-Temporal Series Remote Sensing Image Prediction Based on Multi-Dictionary Bayesian Fusion

    Directory of Open Access Journals (Sweden)

    Chu He

    2017-11-01

    Full Text Available Contradictions in spatial resolution and temporal coverage emerge from earth observation remote sensing images due to limitations in technology and cost. Therefore, how to combine remote sensing images with low spatial yet high temporal resolution and those with high spatial yet low temporal resolution to construct images with both high spatial resolution and high temporal coverage has become an important problem, called the spatio-temporal fusion problem, in both research and practice. A Multi-Dictionary Bayesian Spatio-Temporal Reflectance Fusion Model (MDBFM) has been proposed in this paper. First, multiple dictionaries are trained from regions of different classes. Second, a Bayesian framework is constructed to solve the dictionary selection problem. A pixel-dictionary likelihood function and a dictionary-dictionary prior function are constructed under the Bayesian framework. Third, remote sensing images before and after the middle moment are combined to predict images at the middle moment. Diverse shape and texture information is learned from different landscapes in multi-dictionary learning to help the dictionaries capture the distinctions between regions. The Bayesian framework makes full use of the prior information while the input image is classified. Experiments with one simulated dataset and two satellite datasets validate that the MDBFM is highly effective in both subjective and objective evaluation indexes. The results of the MDBFM show more precise details and a higher similarity with real images when dealing with both type changes and phenology changes.

  1. Clinical outcome prediction in aneurysmal subarachnoid hemorrhage using Bayesian neural networks with fuzzy logic inferences.

    Science.gov (United States)

    Lo, Benjamin W Y; Macdonald, R Loch; Baker, Andrew; Levine, Mitchell A H

    2013-01-01

    The novel clinical prediction approach of Bayesian neural networks with fuzzy logic inferences is created and applied to derive prognostic decision rules in cerebral aneurysmal subarachnoid hemorrhage (aSAH). The approach of Bayesian neural networks with fuzzy logic inferences was applied to data from five trials of Tirilazad for aneurysmal subarachnoid hemorrhage (3551 patients). Bayesian meta-analyses of observational studies on aSAH prognostic factors gave generalizable posterior distributions of population mean log odd ratios (ORs). Similar trends were noted in Bayesian and linear regression ORs. Significant outcome predictors include normal motor response, cerebral infarction, history of myocardial infarction, cerebral edema, history of diabetes mellitus, fever on day 8, prior subarachnoid hemorrhage, admission angiographic vasospasm, neurological grade, intraventricular hemorrhage, ruptured aneurysm size, history of hypertension, vasospasm day, age and mean arterial pressure. Heteroscedasticity was present in the nontransformed dataset. Artificial neural networks found nonlinear relationships with 11 hidden variables in 1 layer, using the multilayer perceptron model. Fuzzy logic decision rules (centroid defuzzification technique) denoted cut-off points for poor prognosis at greater than 2.5 clusters. This aSAH prognostic system makes use of existing knowledge, recognizes unknown areas, incorporates one's clinical reasoning, and compensates for uncertainty in prognostication.

  2. Reduced Neural Recruitment for Bayesian Adjustment of Inhibitory Control in Methamphetamine Dependence.

    Science.gov (United States)

    Harlé, Katia M; Zhang, Shunan; Ma, Ning; Yu, Angela J; Paulus, Martin P

    2016-09-01

    Delineating the processes that contribute to the progression and maintenance of substance dependence is critical to understanding and preventing addiction. Several previous studies have shown inhibitory control deficits in individuals with stimulant use disorder. We used a Bayesian computational approach to examine potential neural deficiencies in the dynamic predictive processing underlying inhibitory function among recently abstinent methamphetamine-dependent individuals (MDIs), a population at high risk of relapse. Sixty-two MDIs were recruited from a 28-day inpatient treatment program at the San Diego Veterans Affairs Medical Center and compared with 34 healthy control subjects. They completed a stop-signal task during functional magnetic resonance imaging. A Bayesian ideal observer model was used to predict individuals' trial-to-trial probabilistic expectations of inhibitory response, P(stop), to identify group differences specific to Bayesian expectation and prediction error computation. Relative to control subjects, MDIs were more likely to make stop errors on difficult trials and had attenuated slowing following stop errors. MDIs further exhibited reduced sensitivity as measured by the neural tracking of a Bayesian measure of surprise (unsigned prediction error), which was evident across all trials in the left posterior caudate and orbitofrontal cortex (Brodmann area 11), and selectively on stop error trials in the right thalamus and inferior parietal lobule. MDIs are less sensitive to surprising task events, both across trials and upon making commission errors, which may help explain why these individuals may not engage in switching strategy when the environment changes, leading to adverse consequences.
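
    A minimal Beta-Bernoulli sketch of a trial-by-trial expectation P(stop) and its unsigned prediction error ("surprise") is given below; the study's ideal observer is more elaborate (it allows the stop probability to change across trials), so this only illustrates the updating arithmetic.

```python
# Sketch of sequential Bayesian updating of P(stop) with a Beta prior and the
# unsigned prediction error on each trial.  Trial sequence is synthetic.
import numpy as np

rng = np.random.default_rng(7)
trials = rng.binomial(1, 0.25, size=40)   # 1 = stop trial, 0 = go trial

a, b = 1.0, 1.0                           # Beta(1, 1) prior on P(stop)
for t, is_stop in enumerate(trials, start=1):
    p_stop = a / (a + b)                  # expectation before the trial
    surprise = abs(is_stop - p_stop)      # unsigned prediction error
    a, b = a + is_stop, b + (1 - is_stop) # Bayesian update after the outcome
    if t <= 5:
        print(f"trial {t}: P(stop)={p_stop:.2f}, outcome={is_stop}, "
              f"surprise={surprise:.2f}")
```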

  3. Bayesian Hierarchical Scale Mixtures of Log-Normal Models for Inference in Reliability with Stochastic Constraint

    Directory of Open Access Journals (Sweden)

    Hea-Jung Kim

    2017-06-01

    Full Text Available This paper develops Bayesian inference for the reliability of a class of scale mixtures of log-normal failure time (SMLNFT) models with a stochastic (or uncertain) constraint on their reliability measures. The class is comprehensive and includes existing failure time (FT) models (such as log-normal, log-Cauchy, and log-logistic FT models) as well as new models that are robust in terms of heavy-tailed FT observations. Since classical frequentist approaches to reliability analysis based on the SMLNFT model with stochastic constraint are intractable, the Bayesian method is pursued utilizing a Markov chain Monte Carlo (MCMC) sampling based approach. This paper introduces a two-stage maximum entropy (MaxEnt) prior, which elicits the a priori uncertain constraint, and develops a Bayesian hierarchical SMLNFT model by using this prior. The paper also proposes an MCMC method for Bayesian inference in the SMLNFT model reliability and calls attention to properties of the MaxEnt prior that are useful for method development. Finally, two data sets are used to illustrate how the proposed methodology works.

  4. Clinical Outcome Prediction in Aneurysmal Subarachnoid Hemorrhage Using Bayesian Neural Networks with Fuzzy Logic Inferences

    Directory of Open Access Journals (Sweden)

    Benjamin W. Y. Lo

    2013-01-01

    Full Text Available Objective. The novel clinical prediction approach of Bayesian neural networks with fuzzy logic inferences is created and applied to derive prognostic decision rules in cerebral aneurysmal subarachnoid hemorrhage (aSAH). Methods. The approach of Bayesian neural networks with fuzzy logic inferences was applied to data from five trials of Tirilazad for aneurysmal subarachnoid hemorrhage (3551 patients). Results. Bayesian meta-analyses of observational studies on aSAH prognostic factors gave generalizable posterior distributions of population mean log odd ratios (ORs). Similar trends were noted in Bayesian and linear regression ORs. Significant outcome predictors include normal motor response, cerebral infarction, history of myocardial infarction, cerebral edema, history of diabetes mellitus, fever on day 8, prior subarachnoid hemorrhage, admission angiographic vasospasm, neurological grade, intraventricular hemorrhage, ruptured aneurysm size, history of hypertension, vasospasm day, age and mean arterial pressure. Heteroscedasticity was present in the nontransformed dataset. Artificial neural networks found nonlinear relationships with 11 hidden variables in 1 layer, using the multilayer perceptron model. Fuzzy logic decision rules (centroid defuzzification technique) denoted cut-off points for poor prognosis at greater than 2.5 clusters. Discussion. This aSAH prognostic system makes use of existing knowledge, recognizes unknown areas, incorporates one's clinical reasoning, and compensates for uncertainty in prognostication.

  5. A surrogate-based sensitivity quantification and Bayesian inversion of a regional groundwater flow model

    Science.gov (United States)

    Chen, Mingjie; Izady, Azizallah; Abdalla, Osman A.; Amerjeed, Mansoor

    2018-02-01

    Bayesian inference using Markov Chain Monte Carlo (MCMC) provides an explicit framework for stochastic calibration of hydrogeologic models accounting for uncertainties; however, MCMC sampling entails a large number of model calls and can easily become computationally unwieldy if the high-fidelity hydrogeologic model simulation is time consuming. This study proposes a surrogate-based Bayesian framework to address this notorious issue, and illustrates the methodology by inverse modeling a regional MODFLOW model. The high-fidelity groundwater model is approximated by a fast statistical model using the Bagging Multivariate Adaptive Regression Spline (BMARS) algorithm, and hence the MCMC sampling can be performed efficiently. In this study, the MODFLOW model is developed to simulate the groundwater flow in an arid region of Oman consisting of mountain-coast aquifers, and used to run representative simulations to generate a training dataset for BMARS model construction. A BMARS-based Sobol' method is also employed to efficiently calculate input parameter sensitivities, which are used to evaluate and rank their importance for the groundwater flow model system. According to the sensitivity analysis, insensitive parameters are screened out of the Bayesian inversion of the MODFLOW model, further saving computing effort. The posterior probability distribution of input parameters is efficiently inferred from the prescribed prior distribution using observed head data, demonstrating that the presented BMARS-based Bayesian framework is an efficient tool to reduce parameter uncertainties of a groundwater system.
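
    The surrogate idea can be sketched on a toy one-parameter problem: a handful of runs of an "expensive" model are used to fit a cheap polynomial emulator (standing in for BMARS), and the Metropolis sampling is then run against the emulator alone. Everything below, including the stand-in model, is an illustrative assumption rather than the study's MODFLOW setup.

```python
# Toy sketch of surrogate-based Bayesian inversion: train a cheap polynomial
# emulator on a few "expensive" runs, then sample the posterior of k with
# random-walk Metropolis using only the emulator.
import numpy as np

def expensive_model(k):                       # stand-in for a costly simulator
    return np.sqrt(k) + 0.1 * k

rng = np.random.default_rng(2)
k_train = np.linspace(1.0, 20.0, 8)           # only 8 expensive runs
coeffs = np.polyfit(k_train, expensive_model(k_train), deg=3)

def surrogate(k):                             # cheap emulator used inside MCMC
    return np.polyval(coeffs, k)

k_true, sigma = 7.0, 0.05
head_obs = expensive_model(k_true) + rng.normal(0.0, sigma, size=10)

def log_post(k):                              # flat prior on [1, 20]
    if not 1.0 <= k <= 20.0:
        return -np.inf
    return -0.5 * np.sum((head_obs - surrogate(k)) ** 2) / sigma**2

k, chain = 10.0, []
for _ in range(20000):
    prop = k + rng.normal(scale=0.5)          # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(k):
        k = prop
    chain.append(k)

print("posterior mean of k:", round(float(np.mean(chain[5000:])), 2))
```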

  6. Bayesian Network Model with Application to Smart Power Semiconductor Lifetime Data.

    Science.gov (United States)

    Plankensteiner, Kathrin; Bluder, Olivia; Pilz, Jürgen

    2015-09-01

    In this article, Bayesian networks are used to model semiconductor lifetime data obtained from a cyclic stress test system. The data of interest are a mixture of log-normal distributions, representing two dominant physical failure mechanisms. Moreover, the data can be censored due to limited test resources. For a better understanding of the complex lifetime behavior, interactions between test settings, geometric designs, material properties, and physical parameters of the semiconductor device are modeled by a Bayesian network. Statistical toolboxes in MATLAB® have been extended and applied to find the best structure of the Bayesian network and to perform parameter learning. Due to censored observations Markov chain Monte Carlo (MCMC) simulations are employed to determine the posterior distributions. For model selection the automatic relevance determination (ARD) algorithm and goodness-of-fit criteria such as marginal likelihoods, Bayes factors, posterior predictive density distributions, and sum of squared errors of prediction (SSEP) are applied and evaluated. The results indicate that the application of Bayesian networks to semiconductor reliability provides useful information about the interactions between the significant covariates and serves as a reliable alternative to currently applied methods. © 2015 Society for Risk Analysis.
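
    The censoring mentioned above (units still running when test resources are exhausted) enters a lifetime likelihood as a survival term: failed units contribute the log-normal density, surviving units contribute P(T > t). A sketch with a single log-normal component is shown below; the article itself models a two-component mixture within a Bayesian network, which is not reproduced here.

```python
# Sketch of a censored log-normal log-likelihood: density terms for failures,
# survival terms for censored units.  Data values are placeholders.
import numpy as np
from scipy.stats import norm

def log_likelihood(mu, sigma, times, censored):
    """times: cycles-to-failure or censoring times;
    censored: boolean array, True where the unit had not failed."""
    z = (np.log(times) - mu) / sigma
    ll_fail = norm.logpdf(z)[~censored] - np.log(sigma * times[~censored])
    ll_cens = norm.logsf(z)[censored]          # log P(T > t) for survivors
    return ll_fail.sum() + ll_cens.sum()

times = np.array([1.2e4, 3.5e4, 8.0e4, 1.0e5, 1.0e5])
censored = np.array([False, False, False, True, True])
print(log_likelihood(mu=np.log(5e4), sigma=0.8, times=times, censored=censored))
```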

  7. Uncertainty, reward, and attention in the Bayesian brain

    DEFF Research Database (Denmark)

    Whiteley, Louise Emma

    2008-01-01

    The ‘Bayesian Coding Hypothesis’ formalises the classic Helmholtzian picture of perception as inverse inference, stating that the brain uses Bayes’ rule to compute posterior belief distributions over states of the world. There is much behavioural evidence that human observers can behave Bayes...... results suggest that value affects a fronto-striatal action selection network rather than directly impacting on sensory processing. Finally, we consider a major theoretical problem – the demonstrations of optimality that dominate the field have been obtained in tasks with a small number of objects...... in the focus of attention. When faced instead with a complex scene, the brain can’t be Bayes-optimal everywhere. We suggest that a general limitation on the representation of complex posteriors causes the brain to make approximations, which are then locally refined by attention. This framework extends ideas...

  8. Approximate Bayesian computation methods for daily spatiotemporal precipitation occurrence simulation

    Science.gov (United States)

    Olson, Branden; Kleiber, William

    2017-04-01

    Stochastic precipitation generators (SPGs) produce synthetic precipitation data and are frequently used to generate inputs for physical models throughout many scientific disciplines. Especially for large data sets, statistical parameter estimation is difficult due to the high dimensionality of the likelihood function. We propose techniques to estimate SPG parameters for spatiotemporal precipitation occurrence based on an emerging set of methods called Approximate Bayesian computation (ABC), which bypass the evaluation of a likelihood function. Our statistical model employs a thresholded Gaussian process that reduces to a probit regression at single sites. We identify appropriate ABC penalization metrics for our model parameters to produce simulations whose statistical characteristics closely resemble those of the observations. Spell length metrics are appropriate for single sites, while a variogram-based metric is proposed for spatial simulations. We present numerical case studies at sites in Colorado and Iowa where the estimated statistical model adequately reproduces local and domain statistics.
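
    A minimal ABC rejection sketch in the same spirit: draw a parameter from its prior, simulate occurrence data, and keep the draw only if summary statistics (here the wet-day fraction and mean wet-spell length) fall within a tolerance of the observed ones. The paper's thresholded-Gaussian-process model and variogram-based metric are not reproduced here; the single-site Bernoulli model and tolerance are illustrative assumptions.

```python
# ABC rejection sketch for a single-site daily occurrence probability.
import numpy as np

rng = np.random.default_rng(3)

def mean_spell_length(occ):
    """Mean length of consecutive wet-day runs in a 0/1 occurrence series."""
    lengths, run = [], 0
    for o in occ:
        if o:
            run += 1
        elif run:
            lengths.append(run)
            run = 0
    if run:
        lengths.append(run)
    return np.mean(lengths) if lengths else 0.0

obs = rng.binomial(1, 0.3, size=365)              # stand-in "observed" record
s_obs = np.array([obs.mean(), mean_spell_length(obs)])

accepted = []
for _ in range(20000):
    p = rng.uniform()                             # draw from a Uniform(0,1) prior
    sim = rng.binomial(1, p, size=obs.size)
    s_sim = np.array([sim.mean(), mean_spell_length(sim)])
    if np.linalg.norm(s_sim - s_obs) < 0.05:      # ABC acceptance tolerance
        accepted.append(p)

print(f"accepted {len(accepted)} draws; posterior mean p = {np.mean(accepted):.3f}")
```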

  9. Excited baryons from Bayesian priors and overlap fermions

    Energy Technology Data Exchange (ETDEWEB)

    F.X. Lee; S.J. Dong; T. Draper; I. Horvath; K.F. Liu; N. Mathur; J.B. Zhang

    2003-05-01

    Using the constrained-fitting method based on Bayesian priors, we extract the masses of the two lowest states of octet and decuplet baryons with both parities. The calculation is done on quenched 16^3 x 28 lattices at a = 0.2 fm using an improved gauge action and overlap fermions, with the pion mass as low as 180 MeV. The Roper state N(1440)+ is clearly observed for the first time as the first excited state of the nucleon from the standard interpolating field. Together with other baryons, our preliminary results indicate that the level-ordering of the low-lying baryon states on the lattice is largely consistent with experiment. The realization is helped by cross-overs between the excited positive- and negative-parity states in the region of m_pi ~ 300 to 400 MeV.

  10. Optimal Experimental Design for Large-Scale Bayesian Inverse Problems

    KAUST Repository

    Ghattas, Omar

    2014-01-06

    We develop a Bayesian framework for the optimal experimental design of the shock tube experiments which are being carried out at the KAUST Clean Combustion Research Center. The unknown parameters are the pre-exponential parameters and the activation energies in the reaction rate expressions. The control parameters are the initial mixture composition and the temperature. The approach is based on first building a polynomial based surrogate model for the observables relevant to the shock tube experiments. Based on these surrogates, a novel MAP based approach is used to estimate the expected information gain in the proposed experiments, and to select the best experimental set-ups yielding the optimal expected information gains. The validity of the approach is tested using synthetic data generated by sampling the PC surrogate. We finally outline a methodology for validation using actual laboratory experiments, and extending experimental design methodology to the cases where the control parameters are noisy.
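
    For intuition about expected information gain, a toy linear-Gaussian case has a closed form: with y = g(d)·θ + noise, a N(0, s_θ²) prior on θ and N(0, s_n²) noise, the gain is 0.5·ln(1 + g(d)²·s_θ²/s_n²), and design selection reduces to maximising it over candidate settings. The sensitivity function and numbers below are invented; the shock-tube problem above instead requires the surrogate- and MAP-based estimates described in the abstract.

```python
# Toy sketch of picking the design with the largest expected information gain
# in a one-parameter linear-Gaussian model (closed-form mutual information).
import numpy as np

def sensitivity(design):                 # stand-in for how strongly the
    return np.sin(design) + 0.2          # observable responds to the parameter

designs = np.linspace(0.0, 3.0, 31)      # candidate experimental settings
prior_var, noise_var = 1.0, 0.05

gain = 0.5 * np.log(1.0 + sensitivity(designs) ** 2 * prior_var / noise_var)
best = designs[np.argmax(gain)]
print(f"best design: {best:.2f}  (expected information gain {gain.max():.2f} nats)")
```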

  11. On Bayesian treatment of systematic uncertainties in confidence interval calculation

    CERN Document Server

    Tegenfeldt, Fredrik

    2005-01-01

    In high energy physics, a widely used method to treat systematic uncertainties in confidence interval calculations is based on combining a frequentist construction of confidence belts with a Bayesian treatment of systematic uncertainties. In this note we present a study of the coverage of this method for the standard Likelihood Ratio (aka Feldman & Cousins) construction for a Poisson process with known background and Gaussian or log-Normal distributed uncertainties in the background or signal efficiency. For uncertainties in the signal efficiency of up to 40% we find over-coverage at the level of 2 to 4% depending on the size of uncertainties and the region in signal space. Uncertainties in the background generally have a smaller effect on the coverage. A considerable smoothing of the coverage curves is observed. A software package is presented which allows fast calculation of the confidence intervals for a variety of assumptions on shape and size of systematic uncertainties for different nuisance paramete...
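
    To make the notion of a coverage study concrete, the sketch below runs pseudo-experiments for a Poisson counting process with known background and a Gaussian uncertainty on the signal efficiency, and checks how often a nominal 90% upper limit covers the true signal. It uses a simple Bayesian credible limit with the efficiency marginalized by Monte Carlo, not the Feldman & Cousins likelihood-ratio construction studied in the note, and all numbers are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    b = 3.0                       # known mean background
    s_true = 2.0                  # true signal mean used to generate pseudo-experiments
    eff_mean, eff_sd = 1.0, 0.2   # Gaussian-distributed signal efficiency (20% uncertainty)

    def bayesian_upper_limit(n_obs, cl=0.90, n_grid=400, n_eff=200):
        """Credible upper limit on the signal s with a flat prior on s >= 0,
        marginalizing the Gaussian efficiency nuisance parameter by Monte Carlo."""
        s_grid = np.linspace(0.0, 25.0, n_grid)
        eff = rng.normal(eff_mean, eff_sd, size=n_eff).clip(min=1e-3)
        mu = eff[None, :] * s_grid[:, None] + b                   # (n_grid, n_eff)
        like = np.mean(np.exp(n_obs * np.log(mu) - mu), axis=1)   # Poisson, n! dropped
        cdf = np.cumsum(like) / like.sum()
        return s_grid[min(np.searchsorted(cdf, cl), n_grid - 1)]

    # Coverage: fraction of pseudo-experiments whose limit lies above the true signal
    n_trials = 1000
    covered = sum(bayesian_upper_limit(rng.poisson(eff_mean * s_true + b)) >= s_true
                  for _ in range(n_trials))
    print("coverage of the nominal 90% limit:", covered / n_trials)
    ```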

  12. A Bayesian Network View on Nested Effects Models

    Directory of Open Access Journals (Sweden)

    Achim Tresch

    2009-01-01

    Full Text Available Nested effects models (NEMs are a class of probabilistic models that were designed to reconstruct a hidden signalling structure from a large set of observable effects caused by active interventions into the signalling pathway. We give a more flexible formulation of NEMs in the language of Bayesian networks. Our framework constitutes a natural generalization of the original NEM model, since it explicitly states the assumptions that are tacitly underlying the original version. Our approach gives rise to new learning methods for NEMs, which have been implemented in the R/Bioconductor package nem. We validate these methods in a simulation study and apply them to a synthetic lethality dataset in yeast.

  13. iSEDfit: Bayesian spectral energy distribution modeling of galaxies

    Science.gov (United States)

    Moustakas, John

    2017-08-01

    iSEDfit uses Bayesian inference to extract the physical properties of galaxies from their observed broadband photometric spectral energy distribution (SED). In its default mode, the inputs to iSEDfit are the measured photometry (fluxes and corresponding inverse variances) and a measurement of the galaxy redshift. Alternatively, iSEDfit can be used to estimate photometric redshifts from the input photometry alone. After the priors have been specified, iSEDfit calculates the marginalized posterior probability distributions for the physical parameters of interest, including the stellar mass, star-formation rate, dust content, star formation history, and stellar metallicity. iSEDfit also optionally computes K-corrections and produces multiple "quality assurance" (QA) plots at each stage of the modeling procedure to aid in the interpretation of the prior parameter choices and subsequent fitting results. The software is distributed as part of the impro IDL suite.

  14. A full bayesian approach for boolean genetic network inference.

    Directory of Open Access Journals (Sweden)

    Shengtong Han

    Full Text Available Boolean networks are a simple but efficient model for describing gene regulatory systems. A number of algorithms have been proposed to infer Boolean networks. However, these methods do not take full consideration of the effects of noise and model uncertainty. In this paper, we propose a full Bayesian approach to infer Boolean genetic networks. Markov chain Monte Carlo algorithms are used to obtain the posterior samples of both the network structure and the related parameters. In addition to regular link addition and removal moves, which can guarantee the irreducibility of the Markov chain for traversing the whole network space, carefully constructed mixture proposals are used to improve the Markov chain Monte Carlo convergence. Both simulations and a real application on cell-cycle data show that our method is more powerful than existing methods for the inference of both the topology and logic relations of the Boolean network from observed data.

  15. The bugs book a practical introduction to Bayesian analysis

    CERN Document Server

    Lunn, David; Best, Nicky; Thomas, Andrew; Spiegelhalter, David

    2012-01-01

    Introduction: Probability and ParametersProbabilityProbability distributionsCalculating properties of probability distributionsMonte Carlo integrationMonte Carlo Simulations Using BUGSIntroduction to BUGSDoodleBUGSUsing BUGS to simulate from distributionsTransformations of random variablesComplex calculations using Monte CarloMultivariate Monte Carlo analysisPredictions with unknown parametersIntroduction to Bayesian InferenceBayesian learningPosterior predictive distributionsConjugate Bayesian inferenceInference about a discrete parameterCombinations of conjugate analysesBayesian and classica

  16. Thermodynamically consistent Bayesian analysis of closed biochemical reaction systems.

    Science.gov (United States)

    Jenkinson, Garrett; Zhong, Xiaogang; Goutsias, John

    2010-11-05

    Estimating the rate constants of a biochemical reaction system with known stoichiometry from noisy time series measurements of molecular concentrations is an important step for building predictive models of cellular function. Inference techniques currently available in the literature may produce rate constant values that defy necessary constraints imposed by the fundamental laws of thermodynamics. As a result, these techniques may lead to biochemical reaction systems whose concentration dynamics could not possibly occur in nature. Therefore, development of a thermodynamically consistent approach for estimating the rate constants of a biochemical reaction system is highly desirable. We introduce a Bayesian analysis approach for computing thermodynamically consistent estimates of the rate constants of a closed biochemical reaction system with known stoichiometry given experimental data. Our method employs an appropriately designed prior probability density function that effectively integrates fundamental biophysical and thermodynamic knowledge into the inference problem. Moreover, it takes into account experimental strategies for collecting informative observations of molecular concentrations through perturbations. The proposed method employs a maximization-expectation-maximization algorithm that provides thermodynamically feasible estimates of the rate constant values and computes appropriate measures of estimation accuracy. We demonstrate various aspects of the proposed method on synthetic data obtained by simulating a subset of a well-known model of the EGF/ERK signaling pathway, and examine its robustness under conditions that violate key assumptions. Software, coded in MATLAB®, which implements all Bayesian analysis techniques discussed in this paper, is available free of charge at http://www.cis.jhu.edu/~goutsias/CSS%20lab/software.html. Our approach provides an attractive statistical methodology for estimating thermodynamically feasible values for the rate

  17. A Bayesian approach to extracting meaning from system behavior

    Energy Technology Data Exchange (ETDEWEB)

    Dress, W.B.

    1998-08-01

    The modeling relation and its reformulation to include the semiotic hierarchy is essential for the understanding, control, and successful re-creation of natural systems. This presentation will argue for a careful application of Rosen's modeling relationship to the problems of intelligence and autonomy in natural and artificial systems. To this end, the authors discuss the essential need for a correct theory of induction, learning, and probability; and suggest that modern Bayesian probability theory, developed by Cox, Jaynes, and others, can adequately meet such demands, especially on the operational level of extracting meaning from observations. The methods of Bayesian and maximum entropy parameter estimation have been applied to measurements of system observables to directly infer the underlying differential equations generating system behavior. This approach by-passes the usual method of parameter estimation based on assuming a functional form for the observable and then estimating the parameters that would lead to the particular observed behavior. The computational savings are great since only location parameters enter into the maximum-entropy calculations; this innovation finesses the need for nonlinear parameters altogether. Such an approach more directly extracts the semantics inherent in a given system by going to the root of system meaning as expressed by abstract form or shape, rather than in syntactic particulars, such as signal amplitude and phase. Examples will be shown of how the form of a system can be followed while ignoring unnecessary details. In this sense, the authors are observing the meaning of the words rather than being concerned with their particular expression or language. For the present discussion, empirical models are embodied by the differential equations underlying, producing, or describing the behavior of a process as measured or tracked by a particular variable set--the observables. The a priori models are probability structures that

  18. Accurate Biomass Estimation via Bayesian Adaptive Sampling

    Science.gov (United States)

    Wheeler, K.; Knuth, K.; Castle, P.

    2005-12-01

    Typical estimates of standing wood derived from remote sensing sources take advantage of aggregate measurements of canopy heights (e.g. LIDAR) and canopy diameters (segmentation of IKONOS imagery) to obtain a wood volume estimate by assuming homogeneous species and a fixed function that returns volume. The validation of such techniques uses manually measured diameter at breast height (DBH) records. Our goal is to improve the accuracy and applicability of biomass estimation methods to heterogeneous forests and transitional areas. We are developing estimates with quantifiable uncertainty using a new form of estimation function, active sampling, and volumetric reconstruction image rendering for species specific mass truth. Initially we are developing a Bayesian adaptive sampling method for BRDF associated with the MISR Rahman model with respect to categorical biomes. This involves characterizing the probability distributions of the 3 free parameters of the Rahman model for the 6 categories of biomes used by MISR. Subsequently, these distributions can be used to determine the optimal sampling methodology to distinguish biomes during acquisition. We have a remotely controlled semi-autonomous helicopter that has stereo imaging, lidar, differential GPS, and spectrometers covering wavelengths from visible to NIR. We intend to automatically vary the way points of the flight path via the Bayesian adaptive sampling method. The second critical part of this work is in automating the validation of biomass estimates via machine vision techniques. This involves taking 2-D pictures of trees of known species, and then via Bayesian techniques, reconstructing 3-D models of the trees to estimate the distribution moments associated with wood volume. Similar techniques have been developed by the medical imaging community. This then provides probability distributions conditional upon species. The final part of this work is in relating the BRDF actively sampled measurements to species

  19. Bayesian inference for the information gain model.

    Science.gov (United States)

    Stringer, Sven; Borsboom, Denny; Wagenmakers, Eric-Jan

    2011-06-01

    One of the most popular paradigms to use for studying human reasoning involves the Wason card selection task. In this task, the participant is presented with four cards and a conditional rule (e.g., "If there is an A on one side of the card, there is always a 2 on the other side"). Participants are asked which cards should be turned to verify whether or not the rule holds. In this simple task, participants consistently provide answers that are incorrect according to formal logic. To account for these errors, several models have been proposed, one of the most prominent being the information gain model (Oaksford & Chater, Psychological Review, 101, 608-631, 1994). This model is based on the assumption that people independently select cards based on the expected information gain of turning a particular card. In this article, we present two estimation methods to fit the information gain model: a maximum likelihood procedure (programmed in R) and a bayesian procedure (programmed in WinBUGS). We compare the two procedures and illustrate the flexibility of the bayesian hierarchical procedure by applying it to data from a meta-analysis of the Wason task (Oaksford & Chater, Psychological Review, 101, 608-631, 1994). We also show that the goodness of fit of the information gain model can be assessed by inspecting the posterior predictives of the model. These bayesian procedures make it easy to apply the information gain model to empirical data. Supplemental materials may be downloaded along with this article from www.springerlink.com.

  20. Structure-based bayesian sparse reconstruction

    KAUST Repository

    Quadeer, Ahmed Abdul

    2012-12-01

    Sparse signal reconstruction algorithms have attracted research attention due to their wide applications in various fields. In this paper, we present a simple Bayesian approach that utilizes the sparsity constraint and a priori statistical information (Gaussian or otherwise) to obtain near optimal estimates. In addition, we make use of the rich structure of the sensing matrix encountered in many signal processing applications to develop a fast sparse recovery algorithm. The computational complexity of the proposed algorithm is very low compared with the widely used convex relaxation methods as well as greedy matching pursuit techniques, especially at high sparsity. © 1991-2012 IEEE.

  1. Bayesian regression of piecewise homogeneous Poisson processes

    Directory of Open Access Journals (Sweden)

    Diego Sevilla

    2015-12-01

    Full Text Available In this paper, a Bayesian method for piecewise regression is adapted to handle counting process data distributed as Poisson. A numerical code in Mathematica is developed and tested analyzing simulated data. The resulting method is valuable for detecting breaking points in the count rate of time series for Poisson processes. Received: 2 November 2015, Accepted: 27 November 2015; Edited by: R. Dickman; Reviewed by: M. Hutter, Australian National University, Canberra, Australia; DOI: http://dx.doi.org/10.4279/PIP.070018 Cite as: D J R Sevilla, Papers in Physics 7, 070018 (2015).
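
    A minimal version of the idea, assuming a single breaking point, independent Gamma priors on the Poisson rates before and after it, and a uniform prior on its position: the rates can be integrated out analytically, leaving a discrete posterior over the break location. This is a Python sketch on simulated counts, not the authors' Mathematica code.

    ```python
    import numpy as np
    from scipy.special import gammaln

    rng = np.random.default_rng(3)

    # Simulated counts with a rate change after day 60 (illustration only)
    y = np.concatenate([rng.poisson(2.0, 60), rng.poisson(5.0, 40)])
    n = len(y)
    a, b = 1.0, 1.0   # Gamma(a, b) prior on each Poisson rate

    def log_marginal(counts):
        """log of the integral of the Poisson likelihood against the Gamma prior,
        dropping the sum of log y! terms (identical for every break position)."""
        s, m = counts.sum(), len(counts)
        return a * np.log(b) - gammaln(a) + gammaln(a + s) - (a + s) * np.log(b + m)

    # Discrete posterior over the breaking point tau, with a uniform prior on 1..n-1
    log_post = np.array([log_marginal(y[:t]) + log_marginal(y[t:]) for t in range(1, n)])
    post = np.exp(log_post - log_post.max())
    post /= post.sum()
    print("posterior mode for the breaking point:", 1 + int(np.argmax(post)))
    ```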

  2. Bayesian stratified sampling to assess corpus utility

    Energy Technology Data Exchange (ETDEWEB)

    Hochberg, J.; Scovel, C.; Thomas, T.; Hall, S.

    1998-12-01

    This paper describes a method for asking statistical questions about a large text corpus. The authors exemplify the method by addressing the question, "What percentage of Federal Register documents are real documents, of possible interest to a text researcher or analyst?" They estimate an answer to this question by evaluating 200 documents selected from a corpus of 45,820 Federal Register documents. Bayesian analysis and stratified sampling are used to reduce the sampling uncertainty of the estimate from over 3,100 documents to fewer than 1,000. A possible application of the method is to establish baseline statistics used to estimate recall rates for information retrieval systems.
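
    A compact illustration of the Beta-Binomial calculation behind such an estimate, with hypothetical strata and counts (not the actual Federal Register strata): each stratum gets a Beta posterior from its sampled judgements, and the stratum posteriors are combined with population weights by Monte Carlo.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Hypothetical strata: (documents in stratum, documents sampled, judged "real")
    strata = [(30000, 120, 105),
              (10000,  50,  20),
              ( 5820,  30,   6)]
    total = sum(N for N, _, _ in strata)

    # Beta(1, 1) prior on the proportion of real documents in each stratum, updated
    # with the sampled judgements, then combined with the stratum weights.
    draws = np.zeros(100000)
    for N, n, k in strata:
        draws += (N / total) * rng.beta(1 + k, 1 + n - k, size=draws.shape)

    lo, hi = np.quantile(draws, [0.025, 0.975])
    print(f"estimated real documents: {total * draws.mean():.0f} "
          f"(95% interval {total * lo:.0f} to {total * hi:.0f})")
    ```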

  3. deal: A Package for Learning Bayesian Networks

    Directory of Open Access Journals (Sweden)

    Susanne G. Boettcher

    2003-12-01

    Full Text Available deal is a software package for use with R. It includes several methods for analysing data using Bayesian networks with variables of discrete and/or continuous types but restricted to conditionally Gaussian networks. Construction of priors for network parameters is supported and their parameters can be learned from data using conjugate updating. The network score is used as a metric to learn the structure of the network and forms the basis of a heuristic search strategy. deal has an interface to Hugin.

  4. Maximum entropy and Bayesian methods. Proceedings.

    Science.gov (United States)

    Fougère, P. F.

    Bayesian probability theory and maximum entropy are the twin foundations of consistent inductive reasoning about the physical world. This volume contains thirty-two papers which are devoted to both foundations and applications and combine tutorial presentations and more research oriented contributions. Together these provide a state of the art account of latest developments in such diverse areas as coherent imaging, regression analysis, tomography, neural networks, plasma theory, quantum mechanics, and others. The methods described will be of great interest to mathematicians, physicists, astronomers, crystallographers, engineers and those involved in all aspects of signal processing.

  5. Nonlinear Bayesian Tracking Loops for Multipath Mitigation

    Directory of Open Access Journals (Sweden)

    Pau Closas

    2012-01-01

    Full Text Available This paper studies Bayesian filtering techniques applied to the design of advanced delay tracking loops in GNSS receivers with multipath mitigation capabilities. The analysis includes tradeoff among realistic propagation channel models and the use of a realistic simulation framework. After establishing the mathematical framework for the design and analysis of tracking loops in the context of GNSS receivers, we propose a filtering technique that implements Rao-Blackwellization of linear states and a particle filter for the nonlinear partition and compare it to traditional delay lock loop/phase lock loop-based schemes.

  6. On local optima in learning bayesian networks

    DEFF Research Database (Denmark)

    Dalgaard, Jens; Kocka, Tomas; Pena, Jose

    2003-01-01

    This paper proposes and evaluates the k-greedy equivalence search algorithm (KES) for learning Bayesian networks (BNs) from complete data. The main characteristic of KES is that it allows a trade-off between greediness and randomness, thus exploring different good local optima. When greediness...... is set at maximum, KES corresponds to the greedy equivalence search algorithm (GES). When greediness is kept at minimum, we prove that under mild assumptions KES asymptotically returns any inclusion optimal BN with nonzero probability. Experimental results for both synthetic and real data are reported...

  7. BXA: Bayesian X-ray Analysis

    Science.gov (United States)

    Buchner, Johannes

    2016-10-01

    BXA connects the nested sampling algorithm MultiNest (ascl:1109.006) to the X-ray spectral analysis environments Xspec/Sherpa for Bayesian parameter estimation and model comparison. It provides parameter estimation in arbitrary dimensions and plotting of spectral model vs. the data for best fit, posterior samples, or each component. BXA allows for model selection; it computes the evidence for the considered model, ready for use in computing Bayes factors and is not limited to nested models. It also visualizes deviations between model and data with Quantile-Quantile (QQ) plots, which do not require binning and are more comprehensive than residuals.

  8. Bayesian approach to inverse statistical mechanics

    Science.gov (United States)

    Habeck, Michael

    2014-05-01

    Inverse statistical mechanics aims to determine particle interactions from ensemble properties. This article looks at this inverse problem from a Bayesian perspective and discusses several statistical estimators to solve it. In addition, a sequential Monte Carlo algorithm is proposed that draws the interaction parameters from their posterior probability distribution. The posterior probability involves an intractable partition function that is estimated along with the interactions. The method is illustrated for inverse problems of varying complexity, including the estimation of a temperature, the inverse Ising problem, maximum entropy fitting, and the reconstruction of molecular interaction potentials.
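
    For the simplest case mentioned, estimating a temperature, the sketch below infers the inverse temperature of a toy discrete system from observed state samples. Unlike the general setting of the article, the system is small enough that the partition function can be evaluated exactly on a grid, so no joint estimation of the partition function is needed; the energies and sample size are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    # Toy system with a handful of discrete states and known energies, so the
    # partition function Z(beta) can be evaluated exactly on a grid.
    energies = np.array([0.0, 0.5, 1.0, 2.0, 3.5])
    beta_true = 1.3
    p_true = np.exp(-beta_true * energies)
    p_true /= p_true.sum()
    states = rng.choice(len(energies), size=200, p=p_true)   # observed ensemble

    beta_grid = np.linspace(0.01, 4.0, 400)
    logZ = np.log(np.exp(-np.outer(beta_grid, energies)).sum(axis=1))
    loglik = -beta_grid * energies[states].sum() - len(states) * logZ
    post = np.exp(loglik - loglik.max())
    post /= post.sum()

    mean_beta = float((beta_grid * post).sum())
    print(f"posterior mean inverse temperature: {mean_beta:.2f} (true value {beta_true})")
    ```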

  9. Remarks on Bayesian Networks and Their Applications

    OpenAIRE

    Kozdrąj, Tomasz

    2005-01-01

    Bayesian networks are graphical structures, namely directed acyclic graphs, that represent dependencies between random variables. They are applied in the field of so-called intelligent software, and especially in expert systems. This article discusses Bayesian networks themselves, their learning, and their applications. An attempt is also made to apply them to economic problems related to the capital market. Bayesian networks are directed acyclic graphs ...

  10. An Approximate Bayesian Fundamental Frequency Estimator

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Christensen, Mads Græsbøll; Jensen, Søren Holdt

    2012-01-01

    Joint fundamental frequency and model order estimation is an important problem in several applications such as speech and music processing. In this paper, we develop an approximate estimation algorithm of these quantities using Bayesian inference. The inference about the fundamental frequency and the model order is based on a probability model which corresponds to a minimum of prior information. From this probability model, we give the exact posterior distributions on the fundamental frequency and the model order, and we also present analytical approximations of these distributions which lower......

  11. Filtering in hybrid dynamic Bayesian networks (left)

    DEFF Research Database (Denmark)

    Andersen, Morten Nonboe; Andersen, Rasmus Ørum; Wheeler, Kevin

    We demonstrate experimentally that inference in a complex hybrid Dynamic Bayesian Network (DBN) is possible using the 2-Time Slice DBN (2T-DBN) from (Koller & Lerner, 2000) to model fault detection in a watertank system. In (Koller & Lerner, 2000) a generic Particle Filter (PF) is used...... that the choice of network structure is very important for the performance of the generic PF and the EKF algorithms, but not for the UKF algorithms. Furthermore, we investigate the influence of data noise in the watertank simulation. Theory and implementation is based on the theory presented in (v.d. Merwe et al...

  12. Filtering in hybrid dynamic Bayesian networks

    DEFF Research Database (Denmark)

    Andersen, Morten Nonboe; Andersen, Rasmus Ørum; Wheeler, Kevin

    2004-01-01

    We demonstrate experimentally that inference in a complex hybrid Dynamic Bayesian Network (DBN) is possible using the 2-Time Slice DBN (2T-DBN) from (Koller & Lerner, 2000) to model fault detection in a watertank system. In (Koller & Lerner, 2000) a generic Particle Filter (PF) is used...... that the choice of network structure is very important for the performance of the generic PF and the EKF algorithms, but not for the UKF algorithms. Furthermore, we investigate the influence of data noise in the watertank simulation. Theory and implementation is based on the theory presented in (v.d. Merwe et al...

  13. Filtering in hybrid dynamic Bayesian networks (center)

    DEFF Research Database (Denmark)

    Andersen, Morten Nonboe; Andersen, Rasmus Ørum; Wheeler, Kevin

    We demonstrate experimentally that inference in a complex hybrid Dynamic Bayesian Network (DBN) is possible using the 2-Time Slice DBN (2T-DBN) from (Koller & Lerner, 2000) to model fault detection in a watertank system. In (Koller & Lerner, 2000) a generic Particle Filter (PF) is used...... that the choice of network structure is very important for the performance of the generic PF and the EKF algorithms, but not for the UKF algorithms. Furthermore, we investigate the influence of data noise in the watertank simulation. Theory and implementation is based on the theory presented in (v.d. Merwe et al...

  14. Optimized Bayesian dynamic advising theory and algorithms

    CERN Document Server

    Karny, Miroslav

    2006-01-01

    Written by one of the world's leading groups in the area of Bayesian identification, control, and decision making, this book provides the theoretical and algorithmic basis of optimized probabilistic advising. Starting from abstract ideas and formulations, and culminating in detailed algorithms, the book comprises a unified treatment of an important problem of the design of advisory systems supporting supervisors of complex processes. It introduces the theoretical and algorithmic basis of developed advising, relying on novel and powerful combination black-box modelling by dynamic mixture models

  15. A Knowledge Management System Using Bayesian Networks

    Science.gov (United States)

    Ribino, Patrizia; Oliveri, Antonio; Re, Giuseppe Lo; Gaglio, Salvatore

    In today's world, decision support and knowledge management processes are strategic and interdependent activities in many organizations. The companies' interest on a correct knowledge management is grown, more than interest on the mere knowledge itself. This paper proposes a Knowledge Management System based on Bayesian networks. The system has been tested collecting and using data coming from projects and processes typical of ICT companies, and provides a Document Management System and a Decision Support system to share documents and to plan how to best use firms' knowledge.

  16. Likelihood-free Bayesian computation for structural model calibration: a feasibility study

    Science.gov (United States)

    Jin, Seung-Seop; Jung, Hyung-Jo

    2016-04-01

    Finite element (FE) model updating is often used to associate FE models with corresponding existing structures for the condition assessment. FE model updating is an inverse problem and prone to be ill-posed and ill-conditioned when there are many errors and uncertainties in both an FE model and its corresponding measurements. In this case, it is important to quantify these uncertainties properly. Bayesian FE model updating is one of the well-known methods to quantify parameter uncertainty by updating our prior belief on the parameters with the available measurements. In Bayesian inference, likelihood plays a central role in summarizing the overall residuals between model predictions and corresponding measurements. Therefore, likelihood should be carefully chosen to reflect the characteristics of the residuals. It is generally known that very little or no information is available regarding the statistical characteristics of the residuals. In most cases, the likelihood is assumed to be the independent identically distributed Gaussian distribution with the zero mean and constant variance. However, this assumption may cause biased and over/underestimated estimates of parameters, so that the uncertainty quantification and prediction are questionable. To alleviate the potential misuse of the inadequate likelihood, this study introduced approximate Bayesian computation (i.e., likelihood-free Bayesian inference), which relaxes the need for an explicit likelihood by analyzing the behavior similarities between model predictions and measurements. We performed FE model updating based on likelihood-free Markov chain Monte Carlo (MCMC) without using the likelihood. Based on the result of the numerical study, we observed that the likelihood-free Bayesian computation can quantify the updating parameters correctly and its predictive capability for the measurements, not used in the calibration, is also secured.
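
    As a concrete illustration of likelihood-free MCMC for model updating, the sketch below calibrates a single stiffness factor of a toy two-mode "FE model" against hypothetical measured natural frequencies, accepting a move only when the simulated response lands within a fixed tolerance of the measurements (a Marjoram-style ABC-MCMC step). The model, tolerance and data are illustrative, not the structural model of the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    freq_measured = np.array([1.10, 2.85])   # hypothetical measured natural frequencies (Hz)

    def model(theta):
        """Toy stand-in for an FE model: frequencies of a two-mode structure
        as a function of a stiffness scaling factor theta, with simulator noise."""
        return np.array([1.0, 2.6]) * np.sqrt(theta) + 0.02 * rng.normal(size=2)

    def distance(sim):
        return np.linalg.norm(sim - freq_measured)

    eps = 0.08                      # tolerance that replaces an explicit likelihood
    prior_lo, prior_hi = 0.5, 2.0   # flat prior on the stiffness factor

    # Start from a prior draw whose simulated response already matches within eps
    theta = rng.uniform(prior_lo, prior_hi)
    while distance(model(theta)) >= eps:
        theta = rng.uniform(prior_lo, prior_hi)

    # Likelihood-free (ABC) MCMC: accept a move only if it stays inside the prior
    # support and its simulated response falls within eps of the measurements.
    chain = []
    for _ in range(20000):
        prop = theta + 0.05 * rng.normal()   # symmetric random-walk proposal
        if prior_lo < prop < prior_hi and distance(model(prop)) < eps:
            theta = prop
        chain.append(theta)

    chain = np.array(chain[5000:])           # discard burn-in
    print(f"stiffness factor: {chain.mean():.3f} +/- {chain.std():.3f}")
    ```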

  17. Bayesian state space models for dynamic genetic network construction across multiple tissues.

    Science.gov (United States)

    Liang, Yulan; Kelemen, Arpad

    2016-08-01

    Construction of gene-gene interaction networks and potential pathways is a challenging and important problem in genomic research for complex diseases while estimating the dynamic changes of the temporal correlations and non-stationarity are the keys in this process. In this paper, we develop dynamic state space models with hierarchical Bayesian settings to tackle this challenge for inferring the dynamic profiles and genetic networks associated with disease treatments. We treat both the stochastic transition matrix and the observation matrix time-variant and include temporal correlation structures in the covariance matrix estimations in the multivariate Bayesian state space models. The unevenly spaced short time courses with unseen time points are treated as hidden state variables. Hierarchical Bayesian approaches with various prior and hyper-prior models with Monte Carlo Markov Chain and Gibbs sampling algorithms are used to estimate the model parameters and the hidden state variables. We apply the proposed Hierarchical Bayesian state space models to multiple tissues (liver, skeletal muscle, and kidney) Affymetrix time course data sets following corticosteroid (CS) drug administration. Both simulation and real data analysis results show that the genomic changes over time and gene-gene interaction in response to CS treatment can be well captured by the proposed models. The proposed dynamic Hierarchical Bayesian state space modeling approaches could be expanded and applied to other large scale genomic data, such as next generation sequence (NGS) combined with real time and time varying electronic health record (EHR) for more comprehensive and robust systematic and network based analysis in order to transform big biomedical data into predictions and diagnostics for precision medicine and personalized healthcare with better decision making and patient outcomes.

  18. A Bayesian Approach to Multistage Fitting of the Variation of the Skeletal Age Features

    Directory of Open Access Journals (Sweden)

    Dong Hua

    2009-01-01

    Full Text Available Accurate assessment of skeletal maturity is important clinically. Skeletal age assessment is usually based on features encoded in ossification centers. Therefore, it is critical to design a mechanism to capture as much as possible characteristics of features. We have observed that given a feature, there exist stages of the skeletal age such that the variation pattern of the feature differs in these stages. Based on this observation, we propose a Bayesian cut fitting to describe features in response to the skeletal age. With our approach, appropriate positions for stage separation are determined automatically by a Bayesian approach, and a model is used to fit the variation of a feature within each stage. Our experimental results show that the proposed method surpasses the traditional fitting using only one line or one curve not only in the efficiency and accuracy of fitting but also in global and local feature characterization.

  19. Mixture priors for Bayesian performance monitoring 2: variable-constituent model

    Energy Technology Data Exchange (ETDEWEB)

    Atwood, Corwin L. [Statwood Consulting, 2905 Covington Road, Silver Spring, MD 20910 (United States)]. E-mail: cory@StatwoodConsulting.com; Youngblood, Robert W. [ISL, Inc., 11140 Rockville Pike, Suite 500, Rockville, MD 20852 (United States)]. E-mail: ryoungblood@islinc.com

    2005-08-01

    This paper uses mixture priors for Bayesian assessment of performance. In any Bayesian performance assessment, a prior distribution for performance parameter(s) is updated based on current performance information. The performance assessment is then based on the posterior distribution for the parameter(s). This paper uses a mixture prior, a mixture of conjugate distributions, which is itself conjugate and which is useful when performance may have changed recently. The present paper illustrates the process using simple models for reliability, involving parameters such as failure rates and demand failure probabilities. When few failures are observed the resulting posterior distributions tend to resemble the priors. However, when more failures are observed, the posteriors tend to change character in a rapid nonlinear way. This behavior is arguably appropriate for many applications. Choosing realistic parameters for the mixture prior is not simple, but even the crude methods given here lead to estimators that show qualitatively good behavior in examples.
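
    The update is easy to reproduce for a demand failure probability: each Beta component is updated conjugately, and the mixture weights are reweighted by the components' Beta-binomial marginal likelihoods, which is what produces the rapid nonlinear change in character when the data contradict the nominal component. The parameters and counts below are illustrative, not those of the paper.

    ```python
    import numpy as np
    from scipy.special import betaln

    # Mixture prior on a demand failure probability p: weight w on a "nominal"
    # component and 1 - w on a "degraded" component (illustrative numbers).
    weights = np.array([0.9, 0.1])
    components = [(0.5, 99.5),   # nominal: prior mean about 0.005
                  (2.0, 18.0)]   # degraded: prior mean 0.1

    k, n = 4, 50                 # hypothetical data: k failures in n demands

    def log_marginal(a, b):
        """log Beta-binomial marginal likelihood of k failures in n demands,
        with the binomial coefficient dropped (common to both components)."""
        return betaln(a + k, b + n - k) - betaln(a, b)

    # Conjugate update: Beta(a, b) -> Beta(a + k, b + n - k) for each component,
    # and the mixture weights are reweighted by the marginal likelihoods.
    log_w = np.log(weights) + np.array([log_marginal(a, b) for a, b in components])
    post_w = np.exp(log_w - log_w.max())
    post_w /= post_w.sum()
    post_components = [(a + k, b + n - k) for a, b in components]

    post_mean = sum(w * a / (a + b) for w, (a, b) in zip(post_w, post_components))
    print("posterior weights:", np.round(post_w, 3), " posterior mean of p:", round(post_mean, 4))
    ```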

  20. Recognition of Action as a Bayesian Parameter Estimation Problem over Time

    DEFF Research Database (Denmark)

    Krüger, Volker

    2007-01-01

    In this paper we will discuss two problems related to action recognition: The first problem is the one of identifying in a surveillance scenario whether a person is walking or running and in what rough direction. The second problem is concerned with the recovery of action primitives from observed...... complex actions. Both problems will be discussed within a statistical framework. Bayesian propagation over time offers a framework to treat likelihood observations at each time step and the dynamics between the time steps in a unified manner. The first problem will be approached as a pattern recognition...... and tracking task by a Bayesian propagation of the likelihoods. The latter problem will be approached by explicitly specifying the dynamics while the likelihood measure will give a measure of how well each dynamical model fits at each time step. Extensive experimental results show the applicability...

  1. A Bayesian Network-Based Probabilistic Framework for Drought Forecasting and Outlook

    Directory of Open Access Journals (Sweden)

    Ji Yae Shin

    2016-01-01

    Full Text Available Reliable drought forecasting is necessary to develop mitigation plans to cope with severe drought. This study developed a probabilistic scheme for drought forecasting and outlook combined with quantification of the prediction uncertainties. The Bayesian network was mainly employed as a statistical scheme for probabilistic forecasting that can represent the cause-effect relationships between the variables. The structure of the Bayesian network-based drought forecasting (BNDF) model was designed using the past, current, and forecasted drought condition. In this study, the drought conditions were represented by the standardized precipitation index (SPI). The accuracy of forecasted SPIs was assessed by comparing the observed SPIs and confidence intervals (CIs), exhibiting the associated uncertainty. Then, this study suggested the drought outlook framework based on probabilistic drought forecasting results. The overall results provided sufficient agreement between the observed and forecasted drought conditions in the outlook framework.

  2. Bayesian phase II adaptive randomization by jointly modeling time-to-event efficacy and binary toxicity.

    Science.gov (United States)

    Lei, Xiudong; Yuan, Ying; Yin, Guosheng

    2011-01-01

    In oncology, toxicity is typically observable shortly after a chemotherapy treatment, whereas efficacy, often characterized by tumor shrinkage, is observable after a relatively long period of time. In a phase II clinical trial design, we propose a Bayesian adaptive randomization procedure that accounts for both efficacy and toxicity outcomes. We model efficacy as a time-to-event endpoint and toxicity as a binary endpoint, sharing common random effects in order to induce dependence between the bivariate outcomes. More generally, we allow the randomization probability to depend on patients' specific covariates, such as prognostic factors. Early stopping boundaries are constructed for toxicity and futility, and a superior treatment arm is recommended at the end of the trial. Following the setup of a recent renal cancer clinical trial at M. D. Anderson Cancer Center, we conduct extensive simulation studies under various scenarios to investigate the performance of the proposed method, and compare it with available Bayesian adaptive randomization procedures.

  3. Bayesian Spectral Analysis of Chorus Sub-Elements

    Science.gov (United States)

    Crabtree, C. E.; Tejero, E. M.; Ganguli, G.; Hospodarsky, G. B.; Kletzing, C.

    2016-12-01

    We develop a Bayesian spectral analysis technique that calculates the probability distribution functions of a superposition of wave-modes each described by a linear growth rate, a frequency and a chirp rate. The Bayesian framework has a number of advantages, including 1) reducing the parameter space by integrating over the amplitude and phase of the wave, 2) incorporating the data from each channel to determine the model parameters such as frequency which leads to high resolution results in frequency and time, 3) the ability to consider the superposition of waves where the wave-parameters are closely spaced, 4) the ability to directly calculate the expectation value of wave parameters without resorting to ensemble averages, 5) the ability to calculate error bars on model parameters. We examine one rising-tone chorus element in detail from a disturbed time on November 14, 2012 using burst mode waveform data of the three components of the electric and magnetic field from the EMFISIS instrument on board NASA's Van Allen Probes. The results of the analysis demonstrate that whistler mode chorus sub-elements are composed of almost linear waves that are nearly parallel propagating with continuously changing wave parameters such as frequency and wave-vector. The change of wave-vector as a function of time is a three-dimensional phenomenon suggesting that 2D simulations may not accurately represent chorus. The initial parts of the sub-elements are in good agreement with the analytical theory of Omura et al. 2008. However, between sub-elements the wave parameters of the dominant mode undergo discrete changes in frequency and wave-vector. Near the boundary of sub-elements multiple waves are observed such that the evolution of the waves is reminiscent of wave-wave processes such as parametric decay or induced scattering by particles. These nonlinear processes are signatures of weak turbulence and may affect the saturation of the whistler-mode chorus instability.

  4. Comparative Study of Inference Methods for Bayesian Nonnegative Matrix Factorisation

    DEFF Research Database (Denmark)

    Brouwer, Thomas; Frellsen, Jes; Liò, Pietro

    2017-01-01

    In this paper, we study the trade-offs of different inference approaches for Bayesian matrix factorisation methods, which are commonly used for predicting missing values, and for finding patterns in the data. In particular, we consider Bayesian nonnegative variants of matrix factorisation and tri...

  5. Illustrating bayesian evaluation of informative hypotheses for regression models.

    NARCIS (Netherlands)

    Kluytmans, A.; Schoot, R. van de; Mulder, J.; Hoijtink, H.

    2012-01-01

    In the present article we illustrate a Bayesian method of evaluating informative hypotheses for regression models. Our main aim is to make this method accessible to psychological researchers without a mathematical or Bayesian background. The use of informative hypotheses is illustrated using two

  6. Bayesian Nonlinear Assimilation of Eulerian and Lagrangian Coastal Flow Data

    Science.gov (United States)

    2015-09-30

    Develop and apply theory, schemes and computational systems for rigorous Bayesian nonlinear assimilation of Eulerian and Lagrangian coastal flow data ... of coastal ocean fields by assimilation of Eulerian and Lagrangian flow data. - Apply our DO and GMM-DO schemes, as well as their theoretical

  7. A Collapsed Variational Bayesian Inference Algorithm for Latent Dirichlet Allocation

    Science.gov (United States)

    2007-09-07

    A Collapsed Variational Bayesian Inference Algorithm for Latent Dirichlet Allocation. Yee Whye Teh, Gatsby Computational Neuroscience Unit, University College London ...

  8. Universal Darwinism as a process of Bayesian inference

    Directory of Open Access Journals (Sweden)

    John Oberon Campbell

    2016-06-01

    Full Text Available Many of the mathematical frameworks describing natural selection are equivalent to Bayes’ Theorem, also known as Bayesian updating. By definition, a process of Bayesian Inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus natural selection serves as a counter example to a widely-held interpretation that restricts Bayesian Inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an ‘experiment’ in the external world environment, and the results of that ‘experiment’ or the ‘surprise’ entailed by predicted and actual outcomes of the ‘experiment’. Minimization of free energy implies that the implicit measure of ‘surprise’ experienced serves to update the generative model in a Bayesian manner. This description closely accords with the mechanisms of generalized Darwinian process proposed both by Dawkins, in terms of replicators and vehicles, and Campbell, in terms of inferential systems. Bayesian inference is an algorithm for the accumulation of evidence-based knowledge. This algorithm is now seen to operate over a wide range of evolutionary processes, including natural selection, the evolution of mental models and cultural evolutionary processes, notably including science itself. The variational principle of free energy minimization may thus serve as a unifying mathematical framework for universal Darwinism, the study of evolutionary processes operating throughout nature.

  9. Universal Darwinism As a Process of Bayesian Inference.

    Science.gov (United States)

    Campbell, John O

    2016-01-01

    Many of the mathematical frameworks describing natural selection are equivalent to Bayes' Theorem, also known as Bayesian updating. By definition, a process of Bayesian Inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus, natural selection serves as a counter example to a widely-held interpretation that restricts Bayesian Inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an "experiment" in the external world environment, and the results of that "experiment" or the "surprise" entailed by predicted and actual outcomes of the "experiment." Minimization of free energy implies that the implicit measure of "surprise" experienced serves to update the generative model in a Bayesian manner. This description closely accords with the mechanisms of generalized Darwinian process proposed both by Dawkins, in terms of replicators and vehicles, and Campbell, in terms of inferential systems. Bayesian inference is an algorithm for the accumulation of evidence-based knowledge. This algorithm is now seen to operate over a wide range of evolutionary processes, including natural selection, the evolution of mental models and cultural evolutionary processes, notably including science itself. The variational principle of free energy minimization may thus serve as a unifying mathematical framework for universal Darwinism, the study of evolutionary processes operating throughout nature.

  10. Bayesian probabilistic network approach for managing earthquake risks of cities

    DEFF Research Database (Denmark)

    Bayraktarli, Yahya; Faber, Michael

    2011-01-01

    This paper considers the application of Bayesian probabilistic networks (BPNs) to large-scale risk based decision making in regard to earthquake risks. A recently developed risk management framework is outlined which utilises Bayesian probabilistic modelling, generic indicator based risk models...

  11. A default Bayesian hypothesis test for correlations and partial correlations

    NARCIS (Netherlands)

    Wetzels, R.; Wagenmakers, E.J.

    2012-01-01

    We propose a default Bayesian hypothesis test for the presence of a correlation or a partial correlation. The test is a direct application of Bayesian techniques for variable selection in regression models. The test is easy to apply and yields practical advantages that the standard frequentist tests
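
    A rough numerical illustration of a Bayes factor for a correlation, using the Fisher-z approximation to the likelihood and a uniform prior on the correlation under the alternative; this is not the default JZS-style test the authors propose, just a sketch of the evidence calculation, with a hypothetical sample correlation and sample size.

    ```python
    import numpy as np
    from scipy.stats import norm
    from scipy.integrate import quad

    def correlation_bf10(r, n):
        """Bayes factor for H1: rho != 0 versus H0: rho = 0, using the Fisher-z
        approximation to the likelihood and a uniform prior on rho under H1."""
        z, se = np.arctanh(r), 1.0 / np.sqrt(n - 3)
        marg_h1, _ = quad(lambda rho: norm.pdf(z, np.arctanh(rho), se) * 0.5, -0.999, 0.999)
        marg_h0 = norm.pdf(z, 0.0, se)
        return marg_h1 / marg_h0

    # Hypothetical observed Pearson correlation of 0.35 from 50 participants
    print("BF10 =", round(correlation_bf10(r=0.35, n=50), 2))
    ```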

  12. Using Alien Coins to Test Whether Simple Inference Is Bayesian

    Science.gov (United States)

    Cassey, Peter; Hawkins, Guy E.; Donkin, Chris; Brown, Scott D.

    2016-01-01

    Reasoning and inference are well-studied aspects of basic cognition that have been explained as statistically optimal Bayesian inference. Using a simplified experimental design, we conducted quantitative comparisons between Bayesian inference and human inference at the level of individuals. In 3 experiments, with more than 13,000 participants, we…

  13. Bayesian model ensembling using meta-trained recurrent neural networks

    NARCIS (Netherlands)

    Ambrogioni, L.; Berezutskaya, Y.; Güçlü, U.; Borne, E.W.P. van den; Güçlütürk, Y.; Gerven, M.A.J. van; Maris, E.G.G.

    2017-01-01

    In this paper we demonstrate that a recurrent neural network meta-trained on an ensemble of arbitrary classification tasks can be used as an approximation of the Bayes optimal classifier. This result is obtained by relying on the framework of ε-free approximate Bayesian inference, where the Bayesian

  14. Heuristically improved Bayesian segmentation of brain MR images ...

    African Journals Online (AJOL)

    Hence involving problem specific heuristics and expert knowledge in designing segmentation algorithms seems to be useful. A two-phase segmentation algorithm based on Bayesian method is proposed in this paper. The Bayesian part uses the gray value in segmenting images and the segmented image is used as the

  15. A Bayesian compositional estimator for microbial taxonomy based on biomarkers

    NARCIS (Netherlands)

    Van den Meersche, K.; Middelburg, J.J.; Soetaert, K.E.R.

    2008-01-01

    Determination of microbial taxonomy based on lipid or pigment spectra requires use of a compositional estimator. We present a new approach based on Bayesian inference and an implementation in the open software platform R. The Bayesian Compositional Estimator (BCE) aims not only to obtain a maximum

  16. An introduction to Bayesian statistics in health psychology.

    Science.gov (United States)

    Depaoli, Sarah; Rus, Holly M; Clifton, James P; van de Schoot, Rens; Tiemensma, Jitske

    2017-09-01

    The aim of the current article is to provide a brief introduction to Bayesian statistics within the field of health psychology. Bayesian methods are increasing in prevalence in applied fields, and they have been shown in simulation research to improve the estimation accuracy of structural equation models, latent growth curve (and mixture) models, and hierarchical linear models. Likewise, Bayesian methods can be used with small sample sizes since they do not rely on large sample theory. In this article, we discuss several important components of Bayesian statistics as they relate to health-based inquiries. We discuss the incorporation and impact of prior knowledge into the estimation process and the different components of the analysis that should be reported in an article. We present an example implementing Bayesian estimation in the context of blood pressure changes after participants experienced an acute stressor. We conclude with final thoughts on the implementation of Bayesian statistics in health psychology, including suggestions for reviewing Bayesian manuscripts and grant proposals. We have also included an extensive amount of online supplementary material to complement the content presented here, including Bayesian examples using many different software programmes and an extensive sensitivity analysis examining the impact of priors.
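
    A small, self-contained illustration of the point about prior impact, using a conjugate normal-normal update for the mean blood-pressure change and comparing a diffuse with an informative prior; the data are simulated and the priors hypothetical, unrelated to the article's actual example or software.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # Simulated change in systolic blood pressure after an acute stressor (mmHg)
    y = rng.normal(8.0, 6.0, size=25)
    sigma2 = y.var(ddof=1)          # treat the sampling variance as known

    def posterior(prior_mean, prior_var):
        """Conjugate normal-normal update for the mean change."""
        n = len(y)
        post_var = 1.0 / (1.0 / prior_var + n / sigma2)
        post_mean = post_var * (prior_mean / prior_var + n * y.mean() / sigma2)
        return post_mean, np.sqrt(post_var)

    for label, (m0, v0) in {"diffuse prior": (0.0, 1000.0),
                            "informative prior": (5.0, 4.0)}.items():
        m, s = posterior(m0, v0)
        print(f"{label}: posterior mean {m:.2f} mmHg (posterior sd {s:.2f})")
    ```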

  17. Applications of Bayesian decision theory to intelligent tutoring systems

    NARCIS (Netherlands)

    Vos, Hendrik J.

    1994-01-01

    Some applications of Bayesian decision theory to intelligent tutoring systems are considered. How the problem of adapting the appropriate amount of instruction to the changing nature of a student's capabilities during the learning process can be situated in the general framework of Bayesian decision

  18. Mechanistic curiosity will not kill the Bayesian cat

    NARCIS (Netherlands)

    Borsboom, Denny; Wagenmakers, Eric-Jan; Romeijn, Jan-Willem

    Jones & Love (J&L) suggest that Bayesian approaches to the explanation of human behavior should be constrained by mechanistic theories. We argue that their proposal misconstrues the relation between process models, such as the Bayesian model, and mechanisms. While mechanistic theories can answer

  19. Survey of Bayesian Models for Modelling of Stochastic Temporal Processes

    Energy Technology Data Exchange (ETDEWEB)

    Ng, B

    2006-10-12

    This survey gives an overview of popular generative models used in the modeling of stochastic temporal systems. In particular, this survey is organized into two parts. The first part discusses the discrete-time representations of dynamic Bayesian networks and dynamic relational probabilistic models, while the second part discusses the continuous-time representation of continuous-time Bayesian networks.

  20. Exploiting Cross-sensitivity by Bayesian Decoding of Mixed Potential Sensor Arrays

    Energy Technology Data Exchange (ETDEWEB)

    Kreller, Cortney [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]

    2017-10-02

    LANL mixed-potential electrochemical sensor (MPES) device arrays were coupled with advanced Bayesian inference treatment of the physical model of relevant sensor-analyte interactions. We demonstrated that our approach could be used to uniquely discriminate the composition of ternary gas mixtures with three discrete MPES sensors with an average error of less than 2%. We also observed that the MPES exhibited excellent stability over a year of operation at elevated temperatures in the presence of test gases.

  1. An On-line Variational Bayesian Model for Multi-Person Tracking from Cluttered Scenes

    OpenAIRE

    Ba, Sileye; Alameda-Pineda, Xavier; Xompero, Alessio; Horaud, Radu

    2016-01-01

    Object tracking is a ubiquitous problem that appears in many applications such as remote sensing, audio processing, computer vision, human-machine interfaces, human-robot interaction, etc. Although thoroughly investigated in computer vision, tracking a time-varying number of persons remains a challenging open problem. In this paper, we propose an on-line variational Bayesian model for multi-person tracking from cluttered visual observations provided by person detectors. The contributions of ...

  2. Bayesian sample size determination for a clinical trial with correlated continuous and binary outcomes.

    Science.gov (United States)

    Stamey, James D; Natanegara, Fanni; Seaman, John W

    2013-01-01

    In clinical trials, multiple outcomes are often collected in order to simultaneously assess effectiveness and safety. We develop a Bayesian procedure for determining the required sample size in a regression model where a continuous efficacy variable and a binary safety variable are observed. The sample size determination procedure is simulation based. The model accounts for correlation between the two variables. Through examples we demonstrate that savings in total sample size are possible when the correlation between these two variables is sufficiently high.
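
    The simulation-based logic can be sketched in a few lines: for each candidate sample size, repeatedly generate a trial with correlated continuous efficacy and binary safety outcomes (here via a latent Gaussian), apply simple conjugate posterior checks for both endpoints, and record how often both criteria are met. This is a simplified stand-in for the regression model of Stamey et al.; the effect sizes, correlation and decision thresholds are hypothetical.

    ```python
    import numpy as np
    from scipy.stats import norm, beta

    rng = np.random.default_rng(7)

    # Hypothetical design values used to generate trial data
    eff_true, sd_eff = 0.5, 1.0    # mean efficacy improvement and its sd
    tox_true, rho = 0.20, 0.5      # toxicity rate and latent correlation with efficacy

    def simulate_trial(n):
        """Correlated continuous efficacy and binary toxicity via a latent Gaussian."""
        z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
        y_eff = eff_true + sd_eff * z[:, 0]
        y_tox = (z[:, 1] < norm.ppf(tox_true)).astype(int)
        return y_eff, y_tox

    def trial_success(y_eff, y_tox):
        n = len(y_eff)
        # Efficacy: normal posterior for the mean (vague prior, known sd)
        p_eff = 1.0 - norm.cdf(0.0, loc=y_eff.mean(), scale=sd_eff / np.sqrt(n))
        # Safety: Beta(1, 1) prior on the toxicity probability
        p_safe = beta.cdf(0.35, 1 + y_tox.sum(), 1 + n - y_tox.sum())
        return (p_eff > 0.95) and (p_safe > 0.90)

    # Probability of meeting both posterior criteria, for a grid of sample sizes
    for n in (20, 40, 60, 80):
        success = np.mean([trial_success(*simulate_trial(n)) for _ in range(2000)])
        print(f"n = {n:3d}: probability both criteria are met = {success:.2f}")
    ```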

  3. Bayesian Forecasting of WWW Traffic on the Time Varying Poisson Model

    OpenAIRE

    Koizumi, Daiki; Matsushima, Toshiyasu; Hirasawa, Shigeichi

    2009-01-01

    Traffic forecasting from past observed traffic data with small calculation complexity is one of the important problems for the planning of servers and networks. Focusing on World Wide Web (WWW) traffic as a fundamental investigation, this paper deals with Bayesian forecasting of network traffic on the time varying Poisson model from the viewpoint of statistical decision theory. Under this model, we show that the estimated forecasting value is obtained by simple arithmetic calculation and exp...
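
    In the conjugate spirit described, a Gamma-Poisson filter with a forgetting factor gives one-step-ahead forecasts by simple arithmetic while letting the rate drift over time. This is a common discounting scheme used as a stand-in here, not necessarily the authors' exact estimator, and the counts are simulated.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    # Hypothetical WWW request counts per minute, with a level shift halfway through
    counts = np.concatenate([rng.poisson(100, 200), rng.poisson(140, 200)])

    # Gamma-Poisson filter with a forgetting factor: before each update the Gamma(a, b)
    # posterior is flattened by delta < 1, so older observations are down-weighted
    # and the predictive mean a / b can track a slowly varying rate.
    a, b, delta = 1.0, 0.01, 0.95
    forecasts = []
    for y in counts:
        forecasts.append(a / b)          # one-step-ahead predictive mean
        a, b = delta * a + y, delta * b + 1.0

    mae = np.mean(np.abs(np.array(forecasts[10:]) - counts[10:]))
    print("mean absolute one-step forecast error:", round(mae, 1))
    ```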

  4. Bayesian Analysis of Bubbles in Asset Prices

    Directory of Open Access Journals (Sweden)

    Andras Fulop

    2017-10-01

    Full Text Available We develop a new model where the dynamic structure of the asset price, after the fundamental value is removed, is subject to two different regimes. One regime reflects the normal period where the asset price divided by the dividend is assumed to follow a mean-reverting process around a stochastic long run mean. The second regime reflects the bubble period with explosive behavior. Stochastic switches between two regimes and non-constant probabilities of exit from the bubble regime are both allowed. A Bayesian learning approach is employed to jointly estimate the latent states and the model parameters in real time. An important feature of our Bayesian method is that we are able to deal with parameter uncertainty and at the same time, to learn about the states and the parameters sequentially, allowing for real time model analysis. This feature is particularly useful for market surveillance. Analysis using simulated data reveals that our method has good power properties for detecting bubbles. Empirical analysis using price-dividend ratios of S&P500 highlights the advantages of our method.

  5. Robustifying Bayesian nonparametric mixtures for count data.

    Science.gov (United States)

    Canale, Antonio; Prünster, Igor

    2017-03-01

    Our motivating application stems from surveys of natural populations and is characterized by large spatial heterogeneity in the counts, which makes parametric approaches to modeling local animal abundance too restrictive. We adopt a Bayesian nonparametric approach based on mixture models and innovate with respect to the popular Dirichlet process mixture of Poisson kernels by increasing the model flexibility at the level both of the kernel and the nonparametric mixing measure. This allows us to derive accurate and robust estimates of the distribution of local animal abundance and of the corresponding clusters. The application and a simulation study for different scenarios also yield some general methodological implications. Adding flexibility solely at the level of the mixing measure does not improve inferences, since its impact is severely limited by the rigidity of the Poisson kernel with considerable consequences in terms of bias. However, once a kernel more flexible than the Poisson is chosen, inferences can be robustified by choosing a prior more general than the Dirichlet process. Therefore, to improve the performance of Bayesian nonparametric mixtures for count data one has to enrich the model simultaneously at both levels, the kernel and the mixing measure. © 2016, The International Biometric Society.

  6. Bayesian automated cortical segmentation for neonatal MRI

    Science.gov (United States)

    Chou, Zane; Paquette, Natacha; Ganesh, Bhavana; Wang, Yalin; Ceschin, Rafael; Nelson, Marvin D.; Macyszyn, Luke; Gaonkar, Bilwaj; Panigrahy, Ashok; Lepore, Natasha

    2017-11-01

    Several attempts have been made in the past few years to develop and implement an automated segmentation of neonatal brain structural MRI. However, accurate automated MRI segmentation remains challenging in this population because of the low signal-to-noise ratio, large partial volume effects and inter-individual anatomical variability of the neonatal brain. In this paper, we propose a learning method for segmenting the whole brain cortical grey matter on neonatal T2-weighted images. We trained our algorithm using a neonatal dataset composed of 3 full-term and 4 preterm infants scanned at term equivalent age. Our segmentation pipeline combines the FAST algorithm from the FSL library software and a Bayesian segmentation approach to create a threshold matrix that minimizes the error of mislabeling brain tissue types. Our method shows promising results with our pilot training set. In both preterm and full-term neonates, automated Bayesian segmentation generates a smoother and more consistent parcellation compared to FAST, while successfully removing the subcortical structure and cleaning the edges of the cortical grey matter. This method shows promising refinement of the FAST segmentation by considerably reducing manual input and editing required from the user, and further improving reliability and processing time of neonatal MR images. Further improvement will include a larger dataset of training images acquired from different manufacturers.

  7. A Bayesian Networks approach to Operational Risk

    Science.gov (United States)

    Aquaro, V.; Bardoscia, M.; Bellotti, R.; Consiglio, A.; De Carlo, F.; Ferri, G.

    2010-04-01

    A system for Operational Risk management based on the computational paradigm of Bayesian Networks is presented. The algorithm allows the construction of a Bayesian Network targeted for each bank and takes into account, in a simple and realistic way, the correlations among different processes of the bank. The internal losses are averaged over a variable time horizon, so that the correlations at different times are removed while the correlations at the same time are kept: the averaged losses are thus suitable for learning the network topology and parameters. Since the main aim is to understand the role of the correlations among the losses, the assessments of domain experts are not used. The algorithm has been validated on synthetic time series. It should be stressed that the proposed algorithm has been designed for practical implementation in a mid-sized or small bank, since it has a small impact on the organizational structure of a bank and requires an investment in human resources which is limited to the computational area.

  8. Bayesian analysis of multiple direct detection experiments

    Science.gov (United States)

    Arina, Chiara

    2014-12-01

    Bayesian methods offer a coherent and efficient framework for implementing uncertainties into induction problems. In this article, we review how this approach applies to the analysis of dark matter direct detection experiments. In particular we discuss the exclusion limit of XENON100 and the debated hints of detection under the hypothesis of a WIMP signal. Within parameter inference, marginalizing consistently over uncertainties to extract robust posterior probability distributions, we find that the claimed tension between XENON100 and the other experiments can be partially alleviated in an isospin-violating scenario, while the elastic scattering model appears to be compatible with the frequentist statistical approach. We then move to model comparison, for which Bayesian methods are particularly well suited. Firstly, we investigate the annual modulation seen in CoGeNT data, finding that there is weak evidence for a modulation. Modulation models due to other physics compare unfavorably with the WIMP models, paying the price for their excessive complexity. Secondly, we confront several coherent scattering models to determine the current best physical scenario compatible with the experimental hints. We find that exothermic and inelastic dark matter are moderately disfavored against the elastic scenario, while the isospin-violating model has similar evidence. Lastly, the Bayes factor gives inconclusive evidence for an incompatibility between the data sets of XENON100 and the hints of detection; the same question assessed with a goodness-of-fit test would indicate a 2σ discrepancy. This suggests that more data are needed to settle this question.
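
    The analysis above involves full astrophysical and detector models; the toy sketch below only illustrates the evidence (marginal likelihood) computation behind a Bayes factor, comparing a background-only counting model with a background-plus-signal model under an assumed uniform prior on the signal rate. All rates and the prior range are invented.

```python
import numpy as np
from scipy.stats import poisson
from scipy.integrate import quad

N_obs = 7         # observed event count (invented)
b = 3.0           # known expected background
s_max = 20.0      # uniform prior on the signal rate s over [0, s_max]

# Evidence of the background-only model (no free parameters)
Z0 = poisson.pmf(N_obs, b)

# Evidence of the signal model: marginalize the Poisson likelihood over the prior on s
Z1, _ = quad(lambda s: poisson.pmf(N_obs, b + s) / s_max, 0.0, s_max)

B10 = Z1 / Z0
print(f"evidences: Z0 = {Z0:.4g}, Z1 = {Z1:.4g}")
print(f"Bayes factor (signal vs background-only) B10 = {B10:.3g}, ln B10 = {np.log(B10):.2f}")
```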

  9. Bayesian Analysis of High Dimensional Classification

    Science.gov (United States)

    Mukhopadhyay, Subhadeep; Liang, Faming

    2009-12-01

    Modern data mining and bioinformatics have presented an important playground for statistical learning techniques, where the number of input variables is possibly much larger than the sample size of the training data. In supervised learning, logistic regression or probit regression can be used to model a binary output and to form perceptron classification rules based on Bayesian inference. In these cases, there is considerable interest in searching for sparse models in the high-dimensional regression/classification setup. We first discuss two common challenges for analyzing high-dimensional data. The first is the curse of dimensionality: the complexity of many existing algorithms scales exponentially with the dimensionality of the space, so that the algorithms soon become computationally intractable and therefore inapplicable in many real applications. The second is multicollinearity among the predictors, which severely slows down the algorithms. In order to make Bayesian analysis operational in high dimensions, we propose a novel Hierarchical Stochastic Approximation Monte Carlo (HSAMC) algorithm, which overcomes the curse of dimensionality and the multicollinearity of predictors in high dimensions, and possesses a self-adjusting mechanism to avoid local minima separated by high energy barriers. Models and methods are illustrated by simulations inspired by the field of genomics. Numerical results indicate that HSAMC can work as a general model selection sampler in a high-dimensional, complex model space.

  10. Predictive Bayesian inference and dynamic treatment regimes.

    Science.gov (United States)

    Saarela, Olli; Arjas, Elja; Stephens, David A; Moodie, Erica E M

    2015-11-01

    While optimal dynamic treatment regimes (DTRs) can be estimated without specification of a predictive model, a model-based approach, combined with dynamic programming and Monte Carlo integration, enables direct probabilistic comparisons between the outcomes under the optimal DTR and alternative (dynamic or static) treatment regimes. The Bayesian predictive approach also circumvents problems related to frequentist estimators under the nonregular estimation problem. However, the model-based approach is susceptible to misspecification, in particular of the "null-paradox" type, which is due to the model parameters not having a direct causal interpretation in the presence of latent individual-level characteristics. Because it is reasonable to insist on correct inferences under the null of no difference between the alternative treatment regimes, we discuss how to achieve this through a "null-robust" reparametrization of the problem in a longitudinal setting. Since we argue that causal inference can be entirely understood as posterior predictive inference in a hypothetical population without covariate imbalances, we also discuss how controlling for confounding through inverse probability of treatment weighting can be justified and incorporated in the Bayesian setting. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Improving randomness characterization through Bayesian model selection.

    Science.gov (United States)

    Díaz Hernández Rojas, Rafael; Solís, Aldo; Angulo Martínez, Alí M; U'Ren, Alfred B; Hirsch, Jorge G; Marsili, Matteo; Pérez Castillo, Isaac

    2017-06-08

    Random number generation plays an essential role in technology, with important applications in areas ranging from cryptography to Monte Carlo methods and other probabilistic algorithms. All such applications require high-quality sources of random numbers, yet effective methods for assessing whether a source produces truly random sequences are still missing. Current methods either do not rely on a formal description of randomness (the NIST test suite), on the one hand, or are inapplicable in principle (the characterization derived from the Algorithmic Theory of Information), on the other, for they require testing all the possible computer programs that could produce the sequence to be analysed. Here we present a rigorous method that overcomes these problems based on Bayesian model selection. We derive analytic expressions for a model's likelihood, which is then used to compute its posterior distribution. Our method proves to be more rigorous than NIST's suite and the Borel normality criterion, and its implementation is straightforward. We applied our method to an experimental device based on the process of spontaneous parametric downconversion to confirm that it behaves as a genuine quantum random number generator. As our approach relies on Bayesian inference, our scheme transcends individual sequence analysis, leading to a characterization of the source itself.
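
    As a toy analogue of Bayesian model selection applied to a bit stream (not the authors' full hierarchy of models), the sketch below compares a fair-source model against a Bernoulli(θ) model with a uniform prior, using the analytic Beta-function marginal likelihood; the simulated bias of 0.55 is invented.

```python
import numpy as np
from scipy.special import betaln

rng = np.random.default_rng(3)

bits = rng.random(2000) < 0.55        # a slightly biased "random number generator"
n, k = len(bits), int(bits.sum())

# M0: fair source, P(bit = 1) = 1/2 exactly
log_Z0 = n * np.log(0.5)

# M1: Bernoulli(theta) source with a uniform Beta(1, 1) prior on theta;
# the marginal likelihood is Beta(k + 1, n - k + 1) / Beta(1, 1)
log_Z1 = betaln(k + 1, n - k + 1) - betaln(1, 1)

print(f"{k}/{n} ones observed; ln Bayes factor (biased vs fair) = {log_Z1 - log_Z0:.2f}")
```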

  12. Bayesian Discovery of Linear Acyclic Causal Models

    CERN Document Server

    Hoyer, Patrik O

    2012-01-01

    Methods for automated discovery of causal relationships from non-interventional data have received much attention recently. A widely used and well-understood model family is given by linear acyclic causal models (recursive structural equation models). For Gaussian data, both constraint-based methods (Spirtes et al., 1993; Pearl, 2000) (which output a single equivalence class) and Bayesian score-based methods (Geiger and Heckerman, 1994) (which assign relative scores to the equivalence classes) are available. In contrast, all current methods able to utilize non-Gaussianity in the data (Shimizu et al., 2006; Hoyer et al., 2008) always return only a single graph or a single equivalence class, and so are fundamentally unable to express the degree of certainty attached to that output. In this paper we develop a Bayesian score-based approach able to take advantage of non-Gaussianity when estimating linear acyclic causal models, and we empirically demonstrate that, at least on very modest size networks, its accur...
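
    In the spirit of likelihood-based use of non-Gaussianity (not the paper's Bayesian score with its specific priors), the sketch below scores the two causal directions of a simulated linear model with Laplace-distributed cause and noise, preferring the direction whose fitted non-Gaussian densities give the higher log-likelihood. Sample size, coefficients and the Laplace choice are all invented.

```python
import numpy as np

rng = np.random.default_rng(4)

# Linear non-Gaussian data in which x causes y (coefficients invented)
n = 3000
x = rng.laplace(size=n)
y = 0.8 * x + 0.5 * rng.laplace(size=n)

def laplace_loglik(u):
    """Log-likelihood of u under a Laplace density with plug-in location/scale."""
    loc = np.median(u)
    scale = np.mean(np.abs(u - loc))
    return np.sum(-np.log(2 * scale) - np.abs(u - loc) / scale)

def direction_score(cause, effect):
    """Score a 'cause -> effect' linear model: log-likelihood of the cause plus
    log-likelihood of the OLS residuals, both under fitted Laplace densities."""
    cm, em = cause.mean(), effect.mean()
    slope = np.sum((cause - cm) * (effect - em)) / np.sum((cause - cm) ** 2)
    resid = (effect - em) - slope * (cause - cm)
    return laplace_loglik(cause) + laplace_loglik(resid)

s_xy = direction_score(x, y)
s_yx = direction_score(y, x)
print(f"score(x -> y) = {s_xy:.1f}, score(y -> x) = {s_yx:.1f}")
print("preferred direction:", "x -> y" if s_xy > s_yx else "y -> x")
```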

  13. Attention as a Bayesian inference process

    Science.gov (United States)

    Chikkerur, Sharat; Serre, Thomas; Tan, Cheston; Poggio, Tomaso

    2011-03-01

    David Marr famously defined vision as "knowing what is where by seeing". In the framework described here, attention is the inference process that solves the visual recognition problem of what is where. The theory proposes a computational role for attention and leads to a model that performs well in recognition tasks and that predicts some of the main properties of attention at the level of psychophysics and physiology. We propose an algorithmic implementation, a Bayesian network that can be mapped into the basic functional anatomy of attention involving the ventral stream and the dorsal stream. This description integrates bottom-up, feature-based as well as spatial (context-based) attentional mechanisms. We show that the Bayesian model predicts human eye fixations (considered as a proxy for shifts of attention) in natural scenes well, and can improve accuracy in object recognition tasks involving cluttered real-world images. In both cases, we found that the proposed model can predict human performance better than existing bottom-up and top-down computational models.

  14. Bayesian equilibrium inference in the Minerva framework

    Science.gov (United States)

    Svensson, Jakob; Ford, Oliver; Kwak, Sehyun; Appel, Lynton; Rahbarnia, Kian; Geiger, Joachim; Schilling, Jonathan

    2017-10-01

    The Minerva framework is a scientific modelling system based on Bayesian forward modelling and is used at a number of experiments. The structure of the framework makes it possible to combine flux-function-based, axisymmetric or full 3D models. A general modularity approach makes it easy to replace underlying physics models, such as the model for force balance and the corresponding current distribution. We will give an overview of the different models within Minerva for inference of the equilibrium field and flux surfaces, for both tokamaks and stellarators. For axisymmetric devices, three methods of increasing complexity, Gaussian process based Current Tomography (CT), an iterative Grad-Shafranov solver, and a full nonlinear Grad-Shafranov based model, will be demonstrated for the JET device. The novel nonlinear Grad-Shafranov model defines a proper posterior distribution for the equilibrium problem, and thus defines the space of possible equilibrium solutions, and has the capacity to include any nonlinear constraints (e.g. from models of profile diagnostics). The Bayesian approach further allows uncertainties on the equilibrium parameters to be calculated. For the W7-X stellarator, two models based on the VMEC 3D solver and a fast function parameterization approximation will be demonstrated.

  15. Bayesian signal processing classical, modern, and particle filtering methods

    CERN Document Server

    Candy, James V

    2016-01-01

    This book aims to give readers a unified Bayesian treatment starting from the basics (Bayes' rule) to the more advanced (Monte Carlo sampling), evolving to the next-generation model-based techniques (sequential Monte Carlo sampling). This next edition incorporates a new chapter on "Sequential Bayesian Detection," a new section on "Ensemble Kalman Filters" as well as an expansion of Case Studies that detail Bayesian solutions for a variety of applications. These studies illustrate Bayesian approaches to real-world problems incorporating detailed particle filter designs, adaptive particle filters and sequential Bayesian detectors. In addition to these major developments, a variety of sections are expanded to "fill in the gaps" of the first edition. Here metrics for particle filter (PF) designs with emphasis on classical "sanity testing" lead to ensemble techniques as a basic requirement for performance analysis. The expansion of information theory metrics and their application to PF designs is fully developed an...
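
    To give a concrete flavour of the sequential Monte Carlo material the book covers (this sketch is not taken from it), here is a minimal bootstrap particle filter for a linear-Gaussian state-space model with invented parameters; on this model the Kalman filter is exact, which makes the particle estimate easy to sanity-check.

```python
import numpy as np

rng = np.random.default_rng(5)

# Linear-Gaussian state-space model: x_t = a x_{t-1} + w_t,  y_t = x_t + v_t
a, q, r, T = 0.95, 0.5, 1.0, 200
x = np.zeros(T); y = np.zeros(T)
for t in range(1, T):
    x[t] = a * x[t - 1] + np.sqrt(q) * rng.standard_normal()
    y[t] = x[t] + np.sqrt(r) * rng.standard_normal()

# Bootstrap particle filter
N = 1000
particles = rng.standard_normal(N)
est = np.zeros(T)
for t in range(1, T):
    particles = a * particles + np.sqrt(q) * rng.standard_normal(N)   # propagate
    logw = -0.5 * (y[t] - particles) ** 2 / r                         # weight by likelihood
    w = np.exp(logw - logw.max()); w /= w.sum()
    est[t] = np.sum(w * particles)                                    # filtered mean
    particles = particles[rng.choice(N, size=N, p=w)]                 # resample

print("RMSE of the particle-filter state estimate:",
      np.sqrt(np.mean((est[1:] - x[1:]) ** 2)).round(3))
```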

  16. Spatially-dependent Bayesian model selection for disease mapping.

    Science.gov (United States)

    Carroll, Rachel; Lawson, Andrew B; Faes, Christel; Kirby, Russell S; Aregay, Mehreteab; Watjou, Kevin

    2018-01-01

    In disease mapping where predictor effects are to be modeled, it is often the case that sets of predictors are fixed, and the aim is to choose between fixed model sets. Model selection methods, both Bayesian model selection (BMS) and Bayesian model averaging (BMA), are approaches within the Bayesian paradigm for achieving this aim. In the spatial context, model selection could have a spatial component in the sense that some models may be more appropriate for certain areas of a study region than others. In this work, we examine the use of spatially referenced Bayesian model averaging and Bayesian model selection via a large-scale simulation study accompanied by a small-scale case study. Our results suggest that BMS performs well when a strong regression signature is found.

  17. Bayesian analysis. II. Signal detection and model selection

    Science.gov (United States)

    Bretthorst, G. Larry

    In the preceding paper, Bayesian analysis was applied to the parameter estimation problem, given quadrature NMR data. Here Bayesian analysis is extended to the problem of selecting the model which is most probable in view of the data and all the prior information. In addition to the analytic calculation, two examples are given. The first example demonstrates how to use Bayesian probability theory to detect small signals in noise. The second example uses Bayesian probability theory to compute the probability of the number of decaying exponentials in simulated T1 data. The Bayesian answer to this question is essentially a microcosm of the scientific method and a quantitative statement of Ockham's razor: theorize about possible models, compare these to experiment, and select the simplest model that "best" fits the data.

  18. Stochastic margin-based structure learning of Bayesian network classifiers.

    Science.gov (United States)

    Pernkopf, Franz; Wohlmayr, Michael

    2013-02-01

    The margin criterion for parameter learning in graphical models has gained significant attention over the last years. We use the maximum margin score for discriminatively optimizing the structure of Bayesian network classifiers. Furthermore, greedy hill-climbing and simulated annealing search heuristics are applied to determine the classifier structures. In the experiments, we demonstrate the advantages of maximum-margin-optimized Bayesian network structures in terms of classification performance compared to traditionally used discriminative structure learning methods. Stochastic simulated annealing requires fewer score evaluations than greedy heuristics. Additionally, we compare generative and discriminative parameter learning on both generatively and discriminatively structured Bayesian network classifiers. Margin-optimized Bayesian network classifiers achieve similar classification performance as support vector machines. Moreover, missing feature values during classification can be handled by discriminatively optimized Bayesian network classifiers, a case where purely discriminative classifiers usually require mechanisms to complete unknown feature values in the data first.

  19. Approximation methods for efficient learning of Bayesian networks

    CERN Document Server

    Riggelsen, C

    2008-01-01

    This publication offers and investigates efficient Monte Carlo simulation methods in order to realize a Bayesian approach to approximate learning of Bayesian networks from both complete and incomplete data. For large amounts of incomplete data when Monte Carlo methods are inefficient, approximations are implemented, such that learning remains feasible, albeit non-Bayesian. The topics discussed are: basic concepts about probabilities, graph theory and conditional independence; Bayesian network learning from data; Monte Carlo simulation techniques; and, the concept of incomplete data. In order to provide a coherent treatment of matters, thereby helping the reader to gain a thorough understanding of the whole concept of learning Bayesian networks from (in)complete data, this publication combines in a clarifying way all the issues presented in the papers with previously unpublished work.

  20. Wavelet-Bayesian inference of cosmic strings embedded in the cosmic microwave background

    Science.gov (United States)

    McEwen, J. D.; Feeney, S. M.; Peiris, H. V.; Wiaux, Y.; Ringeval, C.; Bouchet, F. R.

    2017-12-01

    Cosmic strings are a well-motivated extension to the standard cosmological model and could induce a subdominant component in the anisotropies of the cosmic microwave background (CMB), in addition to the standard inflationary component. The detection of strings, while observationally challenging, would provide a direct probe of physics at very high-energy scales. We develop a framework for cosmic string inference from observations of the CMB made over the celestial sphere, performing a Bayesian analysis in wavelet space where the string-induced CMB component has distinct statistical properties to the standard inflationary component. Our wavelet-Bayesian framework provides a principled approach to compute the posterior distribution of the string tension Gμ and the Bayesian evidence ratio comparing the string model to the standard inflationary model. Furthermore, we present a technique to recover an estimate of any string-induced CMB map embedded in observational data. Using Planck-like simulations, we demonstrate the application of our framework and evaluate its performance. The method is sensitive to Gμ ˜ 5 × 10-7 for Nambu-Goto string simulations that include an integrated Sachs-Wolfe contribution only and do not include any recombination effects, before any parameters of the analysis are optimized. The sensitivity of the method compares favourably with other techniques applied to the same simulations.

  1. Estimating uncertainty and reliability of social network data using Bayesian inference.

    Science.gov (United States)

    Farine, Damien R; Strandburg-Peshkin, Ariana

    2015-09-01

    Social network analysis provides a useful lens through which to view the structure of animal societies, and as a result its use is increasingly widespread. One challenge that many studies of animal social networks face is dealing with limited sample sizes, which introduces the potential for a high level of uncertainty in estimating the rates of association or interaction between individuals. We present a method based on Bayesian inference to incorporate uncertainty into network analyses. We test the reliability of this method at capturing both local and global properties of simulated networks, and compare it to a recently suggested method based on bootstrapping. Our results suggest that Bayesian inference can provide useful information about the underlying certainty in an observed network. When networks are well sampled, observed networks approach the real underlying social structure. However, when sampling is sparse, Bayesian inferred networks can provide realistic uncertainty estimates around edge weights. We also suggest a potential method for estimating the reliability of an observed network given the amount of sampling performed. This paper highlights how relatively simple procedures can be used to estimate uncertainty and reliability in studies using animal social network analysis.
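
    A minimal sketch of the kind of edge-level uncertainty quantification described above: with a Beta prior on a dyad's true association rate, the posterior after n sampling periods with x joint observations is Beta(a + x, b + n - x), and its credible interval widens as sampling gets sparser. Prior and rates are invented; this is not the authors' exact formulation.

```python
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(6)

true_rate = 0.3            # true probability that the dyad is seen associating
a0, b0 = 1.0, 1.0          # uniform Beta prior on the association rate

for n_obs in (5, 20, 200): # increasing sampling effort
    x = rng.binomial(n_obs, true_rate)            # times the pair was seen together
    post = beta(a0 + x, b0 + n_obs - x)
    lo, hi = post.ppf([0.025, 0.975])
    print(f"n={n_obs:4d}: observed rate {x / n_obs:.2f}, "
          f"posterior mean {post.mean():.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```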

  2. Using consensus bayesian network to model the reactive oxygen species regulatory pathway.

    Directory of Open Access Journals (Sweden)

    Liangdong Hu

    Full Text Available The Bayesian network is one of the most successful graph models for representing the reactive oxygen species regulatory pathway. With the increasing number of microarray measurements, it is possible to construct Bayesian networks from microarray data directly. Although a large number of Bayesian network learning algorithms have been developed, when they are applied to learn Bayesian networks from microarray data, the accuracies are low because the databases used for learning contain too few microarray measurements. In this paper, we propose a consensus Bayesian network which is constructed by combining Bayesian networks from the relevant literature and Bayesian networks learned from microarray data. It achieves a higher accuracy than Bayesian networks learned from a single database. In the experiment, we validated the Bayesian network combination algorithm on several classic machine learning databases and used the consensus Bayesian network to model the Escherichia coli ROS pathway.

  3. Using consensus bayesian network to model the reactive oxygen species regulatory pathway.

    Science.gov (United States)

    Hu, Liangdong; Wang, Limin

    2013-01-01

    The Bayesian network is one of the most successful graph models for representing the reactive oxygen species regulatory pathway. With the increasing number of microarray measurements, it is possible to construct Bayesian networks from microarray data directly. Although a large number of Bayesian network learning algorithms have been developed, when they are applied to learn Bayesian networks from microarray data, the accuracies are low because the databases used for learning contain too few microarray measurements. In this paper, we propose a consensus Bayesian network which is constructed by combining Bayesian networks from the relevant literature and Bayesian networks learned from microarray data. It achieves a higher accuracy than Bayesian networks learned from a single database. In the experiment, we validated the Bayesian network combination algorithm on several classic machine learning databases and used the consensus Bayesian network to model the Escherichia coli ROS pathway.

  4. Prediction and assimilation of surf-zone processes using a Bayesian network: Part II: Inverse models

    Science.gov (United States)

    Plant, Nathaniel G.; Holland, K. Todd

    2011-01-01

    A Bayesian network model has been developed to simulate a relatively simple problem of wave propagation in the surf zone (detailed in Part I). Here, we demonstrate that this Bayesian model can provide both inverse modeling and data-assimilation solutions for predicting offshore wave heights and depth estimates given limited wave-height and depth information from an onshore location. The inverse method is extended to allow data assimilation using observational inputs that are not compatible with deterministic solutions of the problem. These inputs include sand bar positions (instead of bathymetry) and estimates of the intensity of wave breaking (instead of wave-height observations). Our results indicate that wave breaking information is essential to reduce prediction errors. In many practical situations, this information could be provided from a shore-based observer or from remote-sensing systems. We show that various combinations of the assimilated inputs significantly reduce the uncertainty in the estimates of water depths and wave heights in the model domain. Application of the Bayesian network model to new field data demonstrated significant predictive skill (R2 = 0.7) for the inverse estimate of a month-long time series of offshore wave heights. The Bayesian inverse results include uncertainty estimates that were shown to be most accurate when given uncertainty in the inputs (e.g., depth and tuning parameters). Furthermore, the inverse modeling was extended to directly estimate tuning parameters associated with the underlying wave-process model. The inverse estimates of the model parameters not only showed an offshore wave height dependence consistent with results of previous studies but the uncertainty estimates of the tuning parameters also explain previously reported variations in the model parameters.

  5. Genome-wide prediction of discrete traits using bayesian regressions and machine learning

    Directory of Open Access Journals (Sweden)

    Forni Selma

    2011-02-01

    Full Text Available Abstract Background Genomic selection has gained much attention and the main goal is to increase the predictive accuracy and the genetic gain in livestock using dense marker information. Most methods dealing with the large p (number of covariates), small n (number of observations) problem have dealt only with continuous traits, but there are many important traits in livestock that are recorded in a discrete fashion (e.g. pregnancy outcome, disease resistance). It is necessary to evaluate alternatives to analyze discrete traits in a genome-wide prediction context. Methods This study shows two threshold versions of Bayesian regressions (Bayes A and Bayesian LASSO) and two machine learning algorithms (boosting and random forest) to analyze discrete traits in a genome-wide prediction context. These methods were evaluated using simulated and field data to predict yet-to-be observed records. Performances were compared based on the models' predictive ability. Results The simulation showed that machine learning had some advantages over Bayesian regressions when a small number of QTL regulated the trait under pure additivity. However, differences were small and disappeared with a large number of QTL. Bayesian threshold LASSO and boosting achieved the highest accuracies, whereas Random Forest presented the highest classification performance. Random Forest was the most consistent method in detecting resistant and susceptible animals, phi correlation was up to 81% greater than Bayesian regressions. Random Forest outperformed other methods in correctly classifying resistant and susceptible animals in the two pure swine lines evaluated. Boosting and Bayes A were more accurate with crossbred data. Conclusions The results of this study suggest that the best method for genome-wide prediction may depend on the genetic basis of the population analyzed. All methods were less accurate at correctly classifying intermediate animals than extreme animals. Among the different

  6. Genome-wide prediction of discrete traits using Bayesian regressions and machine learning.

    Science.gov (United States)

    González-Recio, Oscar; Forni, Selma

    2011-02-17

    Genomic selection has gained much attention and the main goal is to increase the predictive accuracy and the genetic gain in livestock using dense marker information. Most methods dealing with the large p (number of covariates) small n (number of observations) problem have dealt only with continuous traits, but there are many important traits in livestock that are recorded in a discrete fashion (e.g. pregnancy outcome, disease resistance). It is necessary to evaluate alternatives to analyze discrete traits in a genome-wide prediction context. This study shows two threshold versions of Bayesian regressions (Bayes A and Bayesian LASSO) and two machine learning algorithms (boosting and random forest) to analyze discrete traits in a genome-wide prediction context. These methods were evaluated using simulated and field data to predict yet-to-be observed records. Performances were compared based on the models' predictive ability. The simulation showed that machine learning had some advantages over Bayesian regressions when a small number of QTL regulated the trait under pure additivity. However, differences were small and disappeared with a large number of QTL. Bayesian threshold LASSO and boosting achieved the highest accuracies, whereas Random Forest presented the highest classification performance. Random Forest was the most consistent method in detecting resistant and susceptible animals, phi correlation was up to 81% greater than Bayesian regressions. Random Forest outperformed other methods in correctly classifying resistant and susceptible animals in the two pure swine lines evaluated. Boosting and Bayes A were more accurate with crossbred data. The results of this study suggest that the best method for genome-wide prediction may depend on the genetic basis of the population analyzed. All methods were less accurate at correctly classifying intermediate animals than extreme animals. Among the different alternatives proposed to analyze discrete traits, machine

  7. Cosmic bulk flows on 50 h-1 Mpc scales: a Bayesian hyper-parameter method and multishell likelihood analysis

    Science.gov (United States)

    Ma, Yin-Zhe; Scott, Douglas

    2013-01-01

    It has been argued recently that the galaxy peculiar velocity field provides evidence of excessive power on scales of 50 h^-1 Mpc, which seems to be inconsistent with the standard Λ cold dark matter (ΛCDM) cosmological model. We discuss several assumptions and conventions used in studies of the large-scale bulk flow to check whether this claim is robust under a variety of conditions. Rather than using a composite catalogue we select samples from the SN, ENEAR, Spiral Field I-band Survey (SFI++) and First Amendment Supernovae (A1SN) catalogues, and correct for Malmquist bias in each according to the IRAS PSCz density field. We also use slightly different assumptions about the small-scale velocity dispersion and the parametrization of the matter power spectrum when calculating the variance of the bulk flow. By combining the likelihood of individual catalogues using a Bayesian hyper-parameter method, we find that the joint likelihood of the amplitude parameter gives σ8 = 0.65 +0.47/-0.35 (68 per cent confidence region), which is entirely consistent with the ΛCDM model. In addition, the bulk flow magnitude, v ˜ 310 km s^-1, and direction, (l, b) ˜ (280° ± 8°, 5.1° ± 6°), found by each of the catalogues are all consistent with each other, and with the bulk flow results from most previous studies. Furthermore, the bulk flow velocities in different shells of the surveys constrain (σ8, Ωm) to be (1.01 +0.26/-0.20, 0.31 +0.28/-0.14) for SFI++ and (1.04 +0.32/-0.24, 0.28 +0.30/-0.14) for ENEAR, which are consistent with the 7-year Wilkinson Microwave Anisotropy Probe (WMAP7) best-fitting values. We finally discuss the differences between our conclusions and those of the studies claiming the largest bulk flows.

  8. A Bayesian posterior predictive framework for weighting ensemble regional climate models

    Directory of Open Access Journals (Sweden)

    Y. Fan

    2017-06-01

    Full Text Available We present a novel Bayesian statistical approach to computing model weights in climate change projection ensembles in order to create probabilistic projections. The weight of each climate model is obtained by weighting the current day observed data under the posterior distribution admitted under competing climate models. We use a linear model to describe the model output and observations. The approach accounts for uncertainty in model bias, trend and internal variability, including error in the observations used. Our framework is general, requires very little problem-specific input, and works well with default priors. We carry out cross-validation checks that confirm that the method produces the correct coverage.

  9. A novel Bayesian hierarchical model for road safety hotspot prediction.

    Science.gov (United States)

    Fawcett, Lee; Thorpe, Neil; Matthews, Joseph; Kremer, Karsten

    2017-02-01

    In this paper, we propose a Bayesian hierarchical model for predicting accident counts in future years at sites within a pool of potential road safety hotspots. The aim is to inform road safety practitioners of the location of likely future hotspots to enable a proactive, rather than reactive, approach to road safety scheme implementation. A feature of our model is the ability to rank sites according to their potential to exceed, in some future time period, a threshold accident count which may be used as a criterion for scheme implementation. Our model specification enables the classical empirical Bayes formulation - commonly used in before-and-after studies, wherein accident counts from a single before period are used to estimate counterfactual counts in the after period - to be extended to incorporate counts from multiple time periods. This allows site-specific variations in historical accident counts (e.g. locally-observed trends) to offset estimates of safety generated by a global accident prediction model (APM), which itself is used to help account for the effects of global trend and regression-to-mean (RTM). The Bayesian posterior predictive distribution is exploited to formulate predictions and to properly quantify our uncertainty in these predictions. The main contributions of our model include (i) the ability to allow accident counts from multiple time-points to inform predictions, with counts in more recent years lending more weight to predictions than counts from time-points further in the past; (ii) where appropriate, the ability to offset global estimates of trend by variations in accident counts observed locally, at a site-specific level; and (iii) the ability to account for unknown/unobserved site-specific factors which may affect accident counts. We illustrate our model with an application to accident counts at 734 potential hotspots in the German city of Halle; we also propose some simple diagnostics to validate the predictive capability of our
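
    The paper's model is a full Bayesian hierarchical model with trend and regression-to-mean adjustments; the conjugate Gamma-Poisson sketch below only illustrates its key output, the posterior predictive probability that a site's future accident count exceeds a scheme-implementation threshold. Counts, prior and threshold are invented.

```python
import numpy as np
from scipy.stats import nbinom

# Historical accident counts at one candidate hotspot over five years (invented)
counts = np.array([4, 6, 3, 7, 5])
threshold = 6                   # scheme-implementation criterion for next year's count

a0, b0 = 1.0, 0.5               # Gamma(shape, rate) prior on the site's accident rate
a_post = a0 + counts.sum()
b_post = b0 + len(counts)

# The posterior predictive for next year's count is negative binomial
p = b_post / (b_post + 1.0)
prob_exceed = nbinom.sf(threshold - 1, a_post, p)   # P(Y_future >= threshold)
print(f"posterior mean accident rate = {a_post / b_post:.2f} per year")
print(f"P(next-year count >= {threshold}) = {prob_exceed:.3f}")
```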

  10. Bayesian methods in clinical trials: a Bayesian analysis of ECOG trials E1684 and E1690

    Directory of Open Access Journals (Sweden)

    Ibrahim Joseph G

    2012-11-01

    Full Text Available Abstract Background E1684 was the pivotal adjuvant melanoma trial for establishment of high-dose interferon (IFN as effective therapy of high-risk melanoma patients. E1690 was an intriguing effort to corroborate E1684, and the differences between the outcomes of these trials have embroiled the field in controversy over the past several years. The analyses of E1684 and E1690 were carried out separately when the results were published, and there were no further analyses trying to perform a single analysis of the combined trials. Method In this paper, we consider such a joint analysis by carrying out a Bayesian analysis of these two trials, thus providing us with a consistent and coherent methodology for combining the results from these two trials. Results The Bayesian analysis using power priors provided a more coherent flexible and potentially more accurate analysis than a separate analysis of these data or a frequentist analysis of these data. The methodology provides a consistent framework for carrying out a single unified analysis by combining data from two or more studies. Conclusions Such Bayesian analyses can be crucial in situations where the results from two theoretically identical trials yield somewhat conflicting or inconsistent results.
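
    As a toy, conjugate illustration of the power-prior idea (made-up numbers, not the ECOG trial data), the sketch below combines a "current" binomial trial with a "historical" one whose likelihood is raised to a discounting power a0 in [0, 1]; with a Beta prior the posterior remains Beta, so the effect of a0 is easy to see.

```python
import numpy as np
from scipy.stats import beta

# Made-up response counts (not the actual ECOG trial data)
y_hist, n_hist = 45, 140        # "historical" trial (E1684-like)
y_curr, n_curr = 38, 150        # "current" trial (E1690-like)
a_prior, b_prior = 1.0, 1.0     # initial Beta(1, 1) prior on the response rate

for a0 in (0.0, 0.5, 1.0):      # power-prior discounting of the historical data
    a_post = a_prior + y_curr + a0 * y_hist
    b_post = b_prior + (n_curr - y_curr) + a0 * (n_hist - y_hist)
    post = beta(a_post, b_post)
    lo, hi = post.ppf([0.025, 0.975])
    print(f"a0 = {a0:.1f}: posterior mean {post.mean():.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```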

  11. Adaptive surrogate modeling for response surface approximations with application to bayesian inference

    KAUST Repository

    Prudhomme, Serge

    2015-09-17

    Parameter estimation for complex models using Bayesian inference is usually a very costly process as it requires a large number of solves of the forward problem. We show here how the construction of adaptive surrogate models using a posteriori error estimates for quantities of interest can significantly reduce the computational cost in problems of statistical inference. As surrogate models provide only approximations of the true solutions of the forward problem, it is nevertheless necessary to control these errors in order to construct an accurate reduced model with respect to the observables utilized in the identification of the model parameters. Effectiveness of the proposed approach is demonstrated on a numerical example dealing with the Spalart–Allmaras model for the simulation of turbulent channel flows. In particular, we illustrate how Bayesian model selection using the adapted surrogate model in place of solving the coupled nonlinear equations leads to the same quality of results while requiring fewer nonlinear PDE solves.

  12. Bayesian estimation of the Modified Omori Law parameters for the Iranian Plateau

    Science.gov (United States)

    Ommi, S.; Zafarani, H.; Smirnov, V. B.

    2016-07-01

    The forecasting of large aftershocks is a preliminary and critical step in seismic hazard analysis and seismic risk management. From a statistical point of view, it relies entirely on the estimation of the properties of aftershock sequences using a set of laws with well-defined parameters. Since the frequentist and Bayesian approaches are common tools to assess these parameter values, we compare the two approaches for the Modified Omori Law and a selection of mainshock-aftershock sequences in the Iranian Plateau. There is general agreement between the two methods, but the Bayesian approach appears to be more efficient as the number of recorded aftershocks decreases. Taking into account temporal variations of the b-value, the slope of the frequency-size distribution, the probability of occurrence of a strong aftershock, or of a larger main shock, has been calculated in a finite time window using the parameters of the Modified Omori Law observed in the Iranian Plateau.
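
    A compact sketch of Bayesian estimation of the Modified Omori Law parameters (not the authors' catalogue analysis): aftershock times are simulated from the rate λ(t) = K/(t + c)^p, and a grid posterior over (K, c, p) with flat priors is computed from the inhomogeneous-Poisson log-likelihood. True parameter values and grid ranges are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

# --- Simulate aftershock times (days) from the Modified Omori law rate
# --- lambda(t) = K / (t + c)**p on [0, T], using the inverse-transform method.
K_true, c_true, p_true, T = 100.0, 0.1, 1.1, 30.0
true_vals = {"K": K_true, "c": c_true, "p": p_true}

def cum_rate(t, K, c, p):              # Lambda(t), the integrated rate (valid for p != 1)
    return K * (c ** (1 - p) - (t + c) ** (1 - p)) / (p - 1)

total = cum_rate(T, K_true, c_true, p_true)
u = rng.uniform(0.0, total, rng.poisson(total))
times = (c_true ** (1 - p_true) - u * (p_true - 1) / K_true) ** (1 / (1 - p_true)) - c_true

# --- Grid posterior with flat priors, from the inhomogeneous-Poisson log-likelihood
def log_lik(K, c, p):
    return np.sum(np.log(K) - p * np.log(times + c)) - cum_rate(T, K, c, p)

K_grid = np.linspace(60, 140, 30)
c_grid = np.linspace(0.02, 0.3, 30)
p_grid = np.linspace(0.9, 1.4, 30)     # grid chosen so that p = 1 is never hit exactly

logpost = np.array([[[log_lik(K, c, p) for p in p_grid] for c in c_grid] for K in K_grid])
post = np.exp(logpost - logpost.max())
post /= post.sum()

for name, grid, axes in (("K", K_grid, (1, 2)), ("c", c_grid, (0, 2)), ("p", p_grid, (0, 1))):
    marg = post.sum(axis=axes)
    print(f"posterior mean of {name}: {np.sum(grid * marg):.3f}  (true {true_vals[name]})")
```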

  13. Bayesian analysis of longitudinal Johne's disease diagnostic data without a gold standard test

    DEFF Research Database (Denmark)

    Wang, C.; Turnbull, B.W.; Nielsen, Søren Saxmose

    2011-01-01

    A Bayesian methodology was developed based on a latent change-point model to evaluate the performance of milk ELISA and fecal culture tests for longitudinal Johne's disease diagnostic data. The situation of no perfect reference test was considered; that is, no "gold standard." A change... An application is presented to an analysis of ELISA and fecal culture test outcomes in the diagnostic testing of paratuberculosis (Johne's disease) for a Danish longitudinal study from January 2000 to March 2003. The posterior probability criterion based on the Bayesian model with 4 repeated observations has an area under the receiver operating characteristic curve (AUC) of 0.984, and is superior to the raw ELISA (AUC = 0.911) and fecal culture (sensitivity = 0.358, specificity = 0.980) tests for Johne's disease diagnosis.

  14. A Cooperative Bayesian Nonparametric Framework for Primary User Activity Monitoring in Cognitive Radio Network

    CERN Document Server

    Saad, Walid; Poor, H Vincent; Başar, Tamer; Song, Ju Bin

    2012-01-01

    This paper introduces a novel approach that enables a number of cognitive radio devices that are observing the availability pattern of a number of primary users (PUs), to cooperate and use Bayesian nonparametric techniques to estimate the distributions of the PUs' activity pattern, assumed to be completely unknown. In the proposed model, each cognitive node may have its own individual view on each PU's distribution, and, hence, seeks to find partners having a correlated perception. To address this problem, a coalitional game is formulated between the cognitive devices and an algorithm for cooperative coalition formation is proposed. It is shown that the proposed coalition formation algorithm allows the cognitive nodes that are experiencing a similar behavior from some PUs to self-organize into disjoint, independent coalitions. Inside each coalition, the cooperative cognitive nodes use a combination of Bayesian nonparametric models such as the Dirichlet process and statistical goodness of fit techniques ...

  15. An Advanced Dynamic Framed-Slotted ALOHA Algorithm Based on Bayesian Estimation and Probability Response

    Directory of Open Access Journals (Sweden)

    Chaowei Wang

    2013-01-01

    Full Text Available This paper proposes an advanced dynamic framed-slotted ALOHA algorithm based on Bayesian estimation and probability response (BE-PDFSA) to improve the performance of radio frequency identification (RFID) systems. Bayesian estimation is introduced to improve the accuracy of the tag-number estimation, since only a small number of observations is available within one query. The probability response is used to adjust the response probability of the unrecognized tags so as to make the number of responding tags equal to the frame length. In this way, we can address the problem of a high collision rate as the tag number increases and improve the throughput of the whole system. From the simulation results, we can see that the proposed algorithm greatly improves the stability of the RFID system compared with DFSA and other commonly used algorithms.
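
    The sketch below is not BE-PDFSA itself; it only illustrates its Bayesian-estimation ingredient: given the empty/success/collision slot counts observed in one frame, a posterior over the number of contending tags is computed on a grid, treating slots as independent (a standard approximation). Frame length and tag count are invented.

```python
import numpy as np

rng = np.random.default_rng(8)

L = 64                          # frame length in slots (invented)
n_true = 90                     # true number of unrecognized tags (invented)

# One query round: each tag picks a slot uniformly at random
occupancy = np.bincount(rng.integers(0, L, n_true), minlength=L)
E = int(np.sum(occupancy == 0))     # empty slots
S = int(np.sum(occupancy == 1))     # successful (singleton) slots
C = L - E - S                       # collision slots

# Posterior over the tag count n (uniform prior), treating slots as independent,
# which is a standard approximation for this estimation problem.
n_grid = np.arange(1, 400)
p0 = (1 - 1 / L) ** n_grid                                # P(slot empty | n)
p1 = n_grid / L * (1 - 1 / L) ** (n_grid - 1)             # P(slot has exactly one tag | n)
pc = np.clip(1 - p0 - p1, 1e-12, None)                    # P(collision | n)
loglik = E * np.log(p0) + S * np.log(p1) + C * np.log(pc)
post = np.exp(loglik - loglik.max()); post /= post.sum()

print(f"observed (empty, success, collision) = ({E}, {S}, {C})")
print(f"posterior mean tag count = {np.sum(n_grid * post):.1f}  (true {n_true})")
```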

  16. Assessing the relationship between spectral solar irradiance and stratospheric ozone using Bayesian inference

    CERN Document Server

    Ball, William T; Egerton, Jack S; Haigh, Joanna D

    2014-01-01

    We investigate the relationship between spectral solar irradiance (SSI) and ozone in the tropical upper stratosphere. We find that solar cycle (SC) changes in ozone can be well approximated by considering the ozone response to SSI changes in a small number of individual wavelength bands between 176 and 310 nm, operating independently of each other. Additionally, we find that the ozone varies approximately linearly with changes in the SSI. Using these facts, we present a Bayesian formalism for inferring SC SSI changes and uncertainties from measured SC ozone profiles. Bayesian inference is a powerful, mathematically self-consistent method of considering both the uncertainties of the data and additional external information to provide the best estimate of parameters being estimated. Using this method, we show that, given measurement uncertainties in both ozone and SSI datasets, it is not currently possible to distinguish between observed or modelled SSI datasets using available estimates of ozone change profiles, ...

  17. Logistic regression against a divergent Bayesian network

    Directory of Open Access Journals (Sweden)

    Noel Antonio Sánchez Trujillo

    2015-01-01

    Full Text Available This article is a discussion about two statistical tools used for prediction and causality assessment: logistic regression and Bayesian networks. Using data from a simulated example, from a study assessing factors that might predict pulmonary emphysema (where fingertip pigmentation and smoking are considered), we posed the following questions. Is pigmentation a confounding, causal or predictive factor? Is there perhaps another factor, like smoking, that confounds? Is there a synergy between pigmentation and smoking? The results, in terms of prediction, are similar for the two techniques; regarding causation, differences arise. We conclude that, in decision-making, the combination of a statistical tool used with common sense and previous evidence, which may take years or even centuries to develop, is better than the automatic and exclusive use of statistical resources.

  18. A Review of the Bayesian Occupancy Filter

    Directory of Open Access Journals (Sweden)

    Marcelo Saval-Calvo

    2017-02-01

    Full Text Available Autonomous vehicle systems are currently the object of intense research within scientific and industrial communities; however, many problems remain to be solved. One of the most critical aspects addressed in both autonomous driving and robotics is environment perception, since it consists of the ability to understand the surroundings of the vehicle to estimate risks and make decisions on future movements. In recent years, the Bayesian Occupancy Filter (BOF) method has been developed to evaluate occupancy by tessellation of the environment. A review of the BOF and its variants is presented in this paper. Moreover, we propose a detailed taxonomy where the BOF is decomposed into five progressive layers, from the level closest to the sensor to the highest abstract level of risk assessment. In addition, we present a study of implemented use cases to provide a practical understanding of the main uses of the BOF and its taxonomy.
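
    The BOF proper filters occupancy and velocity over a dynamic grid; as the simplest static special case, the sketch below runs a binary Bayes (log-odds) occupancy-grid update that fuses repeated noisy cell measurements with an assumed sensor model. The grid, hit and false-alarm probabilities are invented.

```python
import numpy as np

rng = np.random.default_rng(9)

# A small 1-D strip of grid cells in front of a static sensor (layout invented)
true_occ = np.array([0, 0, 0, 1, 1, 0, 0, 1, 0, 0], dtype=bool)

p_hit = 0.75      # P(sensor reports "occupied" | cell occupied)
p_false = 0.15    # P(sensor reports "occupied" | cell free)

log_odds = np.zeros(true_occ.size)          # prior P(occupied) = 0.5 in every cell
for _ in range(20):                         # fuse 20 noisy scans
    z = np.where(true_occ,
                 rng.random(true_occ.size) < p_hit,
                 rng.random(true_occ.size) < p_false)
    # Binary Bayes filter in log-odds form: add the measurement log-likelihood ratio
    log_odds += np.where(z,
                         np.log(p_hit / p_false),
                         np.log((1 - p_hit) / (1 - p_false)))

p_occ = 1.0 / (1.0 + np.exp(-log_odds))
print("posterior P(occupied) per cell:", np.round(p_occ, 2))
print("true occupancy:                ", true_occ.astype(int))
```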

  19. Modelling crime linkage with Bayesian networks.

    Science.gov (United States)

    de Zoete, Jacob; Sjerps, Marjan; Lagnado, David; Fenton, Norman

    2015-05-01

    When two or more crimes show specific similarities, such as a very distinct modus operandi, the probability that they were committed by the same offender becomes of interest. This probability depends on the degree of similarity and distinctiveness. We show how Bayesian networks can be used to model different evidential structures that can occur when linking crimes, and how they assist in understanding the complex underlying dependencies. That is, how evidence that is obtained in one case can be used in another and vice versa. The flip side of this is that the intuitive decision to "unlink" a case in which exculpatory evidence is obtained leads to serious overestimation of the strength of the remaining cases. Copyright © 2014 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.

  20. Numerical Methods for Bayesian Inverse Problems

    KAUST Repository

    Ernst, Oliver

    2014-01-06

    We present recent results on Bayesian inversion for a groundwater flow problem with an uncertain conductivity field. In particular, we show how direct and indirect measurements can be used to obtain a stochastic model for the unknown. The main tool here is Bayes’ theorem which merges the indirect data with the stochastic prior model for the conductivity field obtained by the direct measurements. Further, we demonstrate how the resulting posterior distribution of the quantity of interest, in this case travel times of radionuclide contaminants, can be obtained by Markov Chain Monte Carlo (MCMC) simulations. Moreover, we investigate new, promising MCMC methods which exploit geometrical features of the posterior and which are suited to infinite dimensions.
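
    As a minimal stand-in for the MCMC step described above (not the groundwater-flow posterior of the talk), the sketch below runs a random-walk Metropolis sampler for a one-parameter toy inverse problem: inferring a log-conductivity-like parameter from noisy observations of a nonlinear forward map, with a Gaussian prior. Forward map, noise level and priors are invented.

```python
import numpy as np

rng = np.random.default_rng(10)

# Toy nonlinear forward map: a "travel time" that decreases with log-conductivity m
def forward(m):
    return 10.0 * np.exp(-m)

m_true, sigma = 0.7, 0.3
data = forward(m_true) + sigma * rng.standard_normal(12)   # 12 noisy indirect measurements

def log_post(m):
    # Gaussian N(0, 1) prior on m plus the Gaussian likelihood of the data
    return -0.5 * m ** 2 - 0.5 * np.sum((data - forward(m)) ** 2) / sigma ** 2

# Random-walk Metropolis
n_iter, step = 20000, 0.2
chain = np.empty(n_iter)
m, lp, accepted = 0.0, log_post(0.0), 0
for i in range(n_iter):
    prop = m + step * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:       # Metropolis accept/reject
        m, lp = prop, lp_prop
        accepted += 1
    chain[i] = m

burn = chain[5000:]
print(f"acceptance rate {accepted / n_iter:.2f}")
print(f"posterior mean {burn.mean():.3f} +/- {burn.std():.3f}   (true {m_true})")
```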