WorldWideScience

Sample records for bayesian hierarchical framework

  1. Modelling the dynamics of an experimental host-pathogen microcosm within a hierarchical Bayesian framework.

    Directory of Open Access Journals (Sweden)

    David Lunn

    Full Text Available The advantages of Bayesian statistical approaches, such as flexibility and the ability to acknowledge uncertainty in all parameters, have made them the prevailing method for analysing the spread of infectious diseases in human or animal populations. We introduce a Bayesian approach to experimental host-pathogen systems that shares these attractive features. Since uncertainty in all parameters is acknowledged, existing information can be accounted for through prior distributions, rather than through fixing some parameter values. The non-linear dynamics, multi-factorial design, multiple measurements of responses over time and sampling error that are typical features of experimental host-pathogen systems can also be naturally incorporated. We analyse the dynamics of the free-living protozoan Paramecium caudatum and its specialist bacterial parasite Holospora undulata. Our analysis provides strong evidence for a saturable infection function, and we were able to reproduce the two waves of infection apparent in the data by separating the initial inoculum from the parasites released after the first cycle of infection. In addition, the parameter estimates from the hierarchical model can be combined to infer variations in the parasite's basic reproductive ratio across experimental groups, enabling us to make predictions about the effect of resources and host genotype on the ability of the parasite to spread. Even though the high level of variability between replicates limited the resolution of the results, this Bayesian framework has strong potential to be used more widely in experimental ecology.
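
    To make the kind of model described above concrete, the sketch below integrates a toy host-pathogen system with a saturable infection term beta*S*P/(k + P). The ODE structure, the parameter values, and the saturating form are illustrative assumptions for exposition, not the authors' fitted model.

      import numpy as np
      from scipy.integrate import solve_ivp

      def host_pathogen(t, y, r, K, beta, k, chi, m):
          """Toy host-pathogen ODEs with a saturable infection function.
          S: susceptible hosts, I: infected hosts, P: free-living parasites."""
          S, I, P = y
          infection = beta * S * P / (k + P)        # saturates as P grows
          dS = r * S * (1 - (S + I) / K) - infection
          dI = infection - m * I
          dP = chi * m * I - infection              # burst release minus uptake
          return [dS, dI, dP]

      # Illustrative parameter values (not estimates from the paper)
      params = dict(r=1.0, K=500.0, beta=2.0, k=50.0, chi=100.0, m=0.2)
      sol = solve_ivp(host_pathogen, (0, 60), [200.0, 0.0, 500.0],
                      args=tuple(params.values()), dense_output=True)
      print(sol.y[:, -1])   # final S, I, P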

  2. Bayesian nonparametric hierarchical modeling.

    Science.gov (United States)

    Dunson, David B

    2009-04-01

    In biomedical research, hierarchical models are very widely used to accommodate dependence in multivariate and longitudinal data and for borrowing of information across data from different sources. A primary concern in hierarchical modeling is sensitivity to parametric assumptions, such as linearity and normality of the random effects. Parametric assumptions on latent variable distributions can be challenging to check and are typically unwarranted, given available prior knowledge. This article reviews some recent developments in Bayesian nonparametric methods motivated by complex, multivariate and functional data collected in biomedical studies. The author provides a brief review of flexible parametric approaches relying on finite mixtures and latent class modeling. Dirichlet process mixture models are motivated by the need to generalize these approaches to avoid assuming a fixed finite number of classes. Focusing on an epidemiology application, the author illustrates the practical utility and potential of nonparametric Bayes methods.

  3. Modeling visual search using three-parameter probability functions in a hierarchical Bayesian framework.

    Science.gov (United States)

    Lin, Yi-Shin; Heinke, Dietmar; Humphreys, Glyn W

    2015-04-01

    In this study, we applied Bayesian-based distributional analyses to examine the shapes of response time (RT) distributions in three visual search paradigms, which varied in task difficulty. In further analyses we investigated two common observations in visual search-the effects of display size and of variations in search efficiency across different task conditions-following a design that had been used in previous studies (Palmer, Horowitz, Torralba, & Wolfe, Journal of Experimental Psychology: Human Perception and Performance, 37, 58-71, 2011; Wolfe, Palmer, & Horowitz, Vision Research, 50, 1304-1311, 2010) in which parameters of the response distributions were measured. Our study showed that the distributional parameters in an experimental condition can be reliably estimated by moderate sample sizes when Monte Carlo simulation techniques are applied. More importantly, by analyzing trial RTs, we were able to extract paradigm-dependent shape changes in the RT distributions that could be accounted for by using the EZ2 diffusion model. The study showed that Bayesian-based RT distribution analyses can provide an important means to investigate the underlying cognitive processes in search, including stimulus grouping and the bottom-up guidance of attention.
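
    As a concrete example of a three-parameter response-time distribution of the kind analysed in such studies, the sketch below evaluates and samples an ex-Gaussian (a normal plus an exponential tail) with scipy. The choice of the ex-Gaussian and the parameter values are assumptions for illustration, not the exact functions fitted in the paper.

      import numpy as np
      from scipy.stats import exponnorm

      # Ex-Gaussian RT distribution: mu, sigma (Gaussian part) and tau (exponential tail)
      mu, sigma, tau = 0.45, 0.05, 0.20          # seconds; arbitrary illustrative values
      K = tau / sigma                            # scipy's shape parameter
      rts = exponnorm.rvs(K, loc=mu, scale=sigma, size=1000, random_state=0)

      print(rts.mean(), mu + tau)                # mean of an ex-Gaussian is mu + tau
      print(exponnorm.pdf(0.6, K, loc=mu, scale=sigma))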

  4. Inferring cetacean population densities from the absolute dynamic topography of the ocean in a hierarchical Bayesian framework.

    Directory of Open Access Journals (Sweden)

    Mario A Pardo

    Full Text Available We inferred the population densities of blue whales (Balaenoptera musculus) and short-beaked common dolphins (Delphinus delphis) in the Northeast Pacific Ocean as functions of the water-column's physical structure by implementing hierarchical models in a Bayesian framework. This approach allowed us to propagate the uncertainty of the field observations into the inference of species-habitat relationships and to generate spatially explicit population density predictions with reduced effects of sampling heterogeneity. Our hypothesis was that the large-scale spatial distributions of these two cetacean species respond primarily to ecological processes resulting from shoaling and outcropping of the pycnocline in regions of wind-forced upwelling and eddy-like circulation. Physically, these processes affect the thermodynamic balance of the water column, decreasing its volume and thus the height of the absolute dynamic topography (ADT). Biologically, they lead to elevated primary productivity and persistent aggregation of low-trophic-level prey. Unlike other remotely sensed variables, ADT provides information about the structure of the entire water column and it is also routinely measured at high spatial-temporal resolution by satellite altimeters with uniform global coverage. Our models provide spatially explicit population density predictions for both species, even in areas where the pycnocline shoals but does not outcrop (e.g. the Costa Rica Dome and the North Equatorial Countercurrent thermocline ridge). Interannual variations in distribution during El Niño anomalies suggest that the population density of both species decreases dramatically in the Equatorial Cold Tongue and the Costa Rica Dome, and that their distributions retract to particular areas that remain productive, such as the more oceanic waters in the central California Current System, the northern Gulf of California, the North Equatorial Countercurrent thermocline ridge, and the more

  5. Hierarchical Bayesian Modeling of Fluid-Induced Seismicity

    Science.gov (United States)

    Broccardo, M.; Mignan, A.; Wiemer, S.; Stojadinovic, B.; Giardini, D.

    2017-11-01

    In this study, we present a Bayesian hierarchical framework to model fluid-induced seismicity. The framework is based on a nonhomogeneous Poisson process with a fluid-induced seismicity rate proportional to the rate of injected fluid. The fluid-induced seismicity rate model depends upon a set of physically meaningful parameters and has been validated for six fluid-induced case studies. In line with the vision of hierarchical Bayesian modeling, the rate parameters are considered as random variables. We develop both the Bayesian inference and updating rules, which are used to develop a probabilistic forecasting model. We tested the Basel 2006 fluid-induced seismic case study to prove that the hierarchical Bayesian model offers a suitable framework to coherently encode both epistemic uncertainty and aleatory variability. Moreover, it provides a robust and consistent short-term seismic forecasting model suitable for online risk quantification and mitigation.
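
    A minimal sketch of the central modelling idea: a nonhomogeneous Poisson process whose seismicity rate is proportional to the injection rate, simulated by thinning. The toy injection profile, the proportionality constant, and the post-shut-in exponential decay are illustrative assumptions, not the calibrated Basel model.

      import numpy as np

      rng = np.random.default_rng(1)

      def injection_rate(t, shut_in=6.0):
          """Toy injection profile: constant flow until shut-in, then zero (litres/s)."""
          return 30.0 if t < shut_in else 0.0

      def seismicity_rate(t, a_fb=0.05, tau=1.0, shut_in=6.0):
          """Induced rate proportional to flow; simple exponential decay after shut-in."""
          if t < shut_in:
              return a_fb * injection_rate(t)
          return a_fb * injection_rate(shut_in - 1e-9) * np.exp(-(t - shut_in) / tau)

      def simulate_nhpp(rate, t_max, rate_max):
          """Simulate event times of a nonhomogeneous Poisson process by thinning."""
          t, events = 0.0, []
          while True:
              t += rng.exponential(1.0 / rate_max)
              if t > t_max:
                  return np.array(events)
              if rng.uniform() < rate(t) / rate_max:   # accept with prob lambda(t)/lambda_max
                  events.append(t)

      times = simulate_nhpp(seismicity_rate, t_max=10.0, rate_max=1.5)
      print(len(times), "simulated induced events")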

  6. Bayesian disease mapping: hierarchical modeling in spatial epidemiology

    National Research Council Canada - National Science Library

    Lawson, Andrew

    2013-01-01

    ... Exploring these new developments, Bayesian Disease Mapping: Hierarchical Modeling in Spatial Epidemiology, Second Edition provides an up-to-date, cohesive account of the full range of Bayesian disease mapping methods and applications...

  7. Bayesian Hierarchical Grouping: perceptual grouping as mixture estimation

    Science.gov (United States)

    Froyen, Vicky; Feldman, Jacob; Singh, Manish

    2015-01-01

    We propose a novel framework for perceptual grouping based on the idea of mixture models, called Bayesian Hierarchical Grouping (BHG). In BHG we assume that the configuration of image elements is generated by a mixture of distinct objects, each of which generates image elements according to some generative assumptions. Grouping, in this framework, means estimating the number and the parameters of the mixture components that generated the image, including estimating which image elements are “owned” by which objects. We present a tractable implementation of the framework, based on the hierarchical clustering approach of Heller and Ghahramani (2005). We illustrate it with examples drawn from a number of classical perceptual grouping problems, including dot clustering, contour integration, and part decomposition. Our approach yields an intuitive hierarchical representation of image elements, giving an explicit decomposition of the image into mixture components, along with estimates of the probability of various candidate decompositions. We show that BHG accounts well for a diverse range of empirical data drawn from the literature. Because BHG provides a principled quantification of the plausibility of grouping interpretations over a wide range of grouping problems, we argue that it provides an appealing unifying account of the elusive Gestalt notion of Prägnanz. PMID:26322548
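
    The "ownership" computation at the heart of a mixture-based grouping account can be illustrated with ordinary Gaussian-mixture responsibilities. The sketch below is a generic numpy/scipy E-step on simulated dots, not the BHG implementation; the two-object configuration and parameter values are assumptions.

      import numpy as np
      from scipy.stats import multivariate_normal

      rng = np.random.default_rng(0)

      # Two assumed "objects" generating dots in the image plane
      means   = [np.array([0.0, 0.0]), np.array([4.0, 1.0])]
      covs    = [np.eye(2) * 0.5,      np.eye(2) * 0.8]
      weights = np.array([0.5, 0.5])

      # Simulated image elements (dots) from the two objects
      dots = np.vstack([rng.multivariate_normal(m, c, 50) for m, c in zip(means, covs)])

      # Responsibility resp[i, k] = posterior probability that dot i is "owned" by object k
      dens = np.column_stack([w * multivariate_normal(m, c).pdf(dots)
                              for w, m, c in zip(weights, means, covs)])
      resp = dens / dens.sum(axis=1, keepdims=True)

      print(resp[:3].round(3))          # ownership probabilities of the first few dots
      print(resp.argmax(axis=1)[:10])   # hard grouping of the first ten dots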

  8. Hierarchical Bayesian spatial models for multispecies conservation planning and monitoring.

    Science.gov (United States)

    Carroll, Carlos; Johnson, Devin S; Dunk, Jeffrey R; Zielinski, William J

    2010-12-01

    Biologists who develop and apply habitat models are often familiar with the statistical challenges posed by their data's spatial structure but are unsure of whether the use of complex spatial models will increase the utility of model results in planning. We compared the relative performance of nonspatial and hierarchical Bayesian spatial models for three vertebrate and invertebrate taxa of conservation concern (Church's sideband snails [Monadenia churchi], red tree voles [Arborimus longicaudus], and Pacific fishers [Martes pennanti pacifica]) that provide examples of a range of distributional extents and dispersal abilities. We used presence-absence data derived from regional monitoring programs to develop models with both landscape and site-level environmental covariates. We used Markov chain Monte Carlo algorithms and a conditional autoregressive or intrinsic conditional autoregressive model framework to fit spatial models. The fit of Bayesian spatial models was between 35 and 55% better than the fit of nonspatial analogue models. Bayesian spatial models outperformed analogous models developed with maximum entropy (Maxent) methods. Although the best spatial and nonspatial models included similar environmental variables, spatial models provided estimates of residual spatial effects that suggested how ecological processes might structure distribution patterns. Spatial models built from presence-absence data improved fit most for localized endemic species with ranges constrained by poorly known biogeographic factors and for widely distributed species suspected to be strongly affected by unmeasured environmental variables or population processes. By treating spatial effects as a variable of interest rather than a nuisance, hierarchical Bayesian spatial models, especially when they are based on a common broad-scale spatial lattice (here the national Forest Inventory and Analysis grid of 24 km² hexagons), can increase the relevance of habitat models to multispecies
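
    A compact sketch of the intrinsic conditional autoregressive (ICAR) prior that underlies such spatial models: its unnormalised log-density is a sum of squared differences between neighbouring sites. The tiny lattice, the precision value, and the pairwise-difference form are illustrative assumptions, not the models fitted in the study.

      import numpy as np

      def icar_logpdf(phi, edges, tau):
          """Unnormalised log-density of an ICAR prior:
          -0.5 * tau * sum over neighbour pairs (i, j) of (phi_i - phi_j)^2."""
          i, j = edges[:, 0], edges[:, 1]
          return -0.5 * tau * np.sum((phi[i] - phi[j]) ** 2)

      # A toy 2 x 3 lattice of sites; each row of `edges` is a pair of adjacent sites
      edges = np.array([[0, 1], [1, 2], [3, 4], [4, 5],   # horizontal neighbours
                        [0, 3], [1, 4], [2, 5]])          # vertical neighbours
      phi = np.array([0.2, 0.1, -0.3, 0.4, 0.0, -0.2])    # spatial random effects
      phi = phi - phi.mean()                              # sum-to-zero constraint

      print(icar_logpdf(phi, edges, tau=2.0))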

  9. Hierarchical Bayesian models of subtask learning.

    Science.gov (United States)

    Anglim, Jeromy; Wynton, Sarah K A

    2015-07-01

    The current study used Bayesian hierarchical methods to challenge and extend previous work on subtask learning consistency. A general model of individual-level subtask learning was proposed focusing on power and exponential functions with constraints to test for inconsistency. To study subtask learning, we developed a novel computer-based booking task, which logged participant actions, enabling measurement of strategy use and subtask performance. Model comparison was performed using deviance information criterion (DIC), posterior predictive checks, plots of model fits, and model recovery simulations. Results showed that although learning tended to be monotonically decreasing and decelerating, and approaching an asymptote for all subtasks, there was substantial inconsistency in learning curves both at the group- and individual-levels. This inconsistency was most apparent when constraining both the rate and the ratio of learning to asymptote to be equal across subtasks, thereby giving learning curves only 1 parameter for scaling. The inclusion of 6 strategy covariates provided improved prediction of subtask performance capturing different subtask learning processes and subtask trade-offs. In addition, strategy use partially explained the inconsistency in subtask learning. Overall, the model provided a more nuanced representation of how complex tasks can be decomposed in terms of simpler learning mechanisms. (c) 2015 APA, all rights reserved.

  10. A Bayesian framework for risk perception

    NARCIS (Netherlands)

    van Erp, H.R.N.

    2017-01-01

    We present here a Bayesian framework of risk perception. This framework encompasses plausibility judgments, decision making, and question asking. Plausibility judgments are modeled by way of Bayesian probability theory, decision making is modeled by way of a Bayesian decision theory, and relevancy

  11. Inferring on the intentions of others by hierarchical Bayesian learning.

    Directory of Open Access Journals (Sweden)

    Andreea O Diaconescu

    2014-09-01

    Full Text Available Inferring on others' (potentially time-varying) intentions is a fundamental problem during many social transactions. To investigate the underlying mechanisms, we applied computational modeling to behavioral data from an economic game in which 16 pairs of volunteers (randomly assigned to "player" or "adviser" roles) interacted. The player performed a probabilistic reinforcement learning task, receiving information about a binary lottery from a visual pie chart. The adviser, who received more predictive information, issued an additional recommendation. Critically, the game was structured such that the adviser's incentives to provide helpful or misleading information varied in time. Using a meta-Bayesian modeling framework, we found that the players' behavior was best explained by the deployment of hierarchical learning: they inferred upon the volatility of the advisers' intentions in order to optimize their predictions about the validity of their advice. Beyond learning, volatility estimates also affected the trial-by-trial variability of decisions: participants were more likely to rely on their estimates of advice accuracy for making choices when they believed that the adviser's intentions were presently stable. Finally, our model of the players' inference predicted the players' interpersonal reactivity index (IRI) scores, explicit ratings of the advisers' helpfulness and the advisers' self-reports on their chosen strategy. Overall, our results suggest that humans (i) employ hierarchical generative models to infer on the changing intentions of others, (ii) use volatility estimates to inform decision-making in social interactions, and (iii) integrate estimates of advice accuracy with non-social sources of information. The Bayesian framework presented here can quantify individual differences in these mechanisms from simple behavioral readouts and may prove useful in future clinical studies of maladaptive social cognition.

  12. A full-capture Hierarchical Bayesian model of Pollock's Closed Robust Design and application to dolphins

    Directory of Open Access Journals (Sweden)

    Robert William Rankin

    2016-03-01

    Full Text Available We present a Hierarchical Bayesian version of Pollock's Closed Robust Design for studying the survival, temporary-migration, and abundance of marked animals. Through simulations and analyses of a bottlenose dolphin photo-identification dataset, we compare several estimation frameworks, including Maximum Likelihood estimation (ML), model-averaging by AICc, as well as Bayesian and Hierarchical Bayesian (HB) procedures. Our results demonstrate a number of advantages of the Bayesian framework over other popular methods. First, for simple fixed-effect models, we show the near-equivalence of Bayesian and ML point-estimates and confidence/credibility intervals. Second, we demonstrate how there is an inherent correlation among temporary-migration and survival parameter estimates in the PCRD, and while this can lead to serious convergence issues and singularities among MLEs, we show that the Bayesian estimates were more reliable. Third, we demonstrate that a Hierarchical Bayesian model with carefully thought-out hyperpriors can lead to similar parameter estimates and conclusions as multi-model inference by AICc model-averaging. This latter point is especially interesting for mark-recapture practitioners, for whom model-uncertainty and multi-model inference have become a major preoccupation. Lastly, we extend the Hierarchical Bayesian PCRD to include full-capture histories (i.e., by modelling a recruitment process) and individual-level heterogeneity in detection probabilities, which can have important consequences for the range of phenomena studied by the PCRD, as well as lead to large differences in abundance estimates. For example, we estimate 8%-24% more bottlenose dolphins in the western gulf of Shark Bay than previously estimated by ML and AICc-based model-averaging. Other important extensions are discussed. Our Bayesian PCRD models are written in the BUGS-like JAGS language for easy dissemination and customization by the community of capture

  13. Fully probabilistic design of hierarchical Bayesian models

    Czech Academy of Sciences Publication Activity Database

    Quinn, A.; Kárný, Miroslav; Guy, Tatiana Valentine

    2016-01-01

    Roč. 369, č. 1 (2016), s. 532-547 ISSN 0020-0255 R&D Projects: GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords: Fully probabilistic design * Ideal distribution * Minimum cross-entropy principle * Bayesian conditioning * Kullback-Leibler divergence * Bayesian nonparametric modelling Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 4.832, year: 2016 http://library.utia.cas.cz/separaty/2016/AS/karny-0463052.pdf

  14. The application of a hierarchical Bayesian spatiotemporal model for ...

    Indian Academy of Sciences (India)

    Sahu S K and Bakar K S 2012 Hierarchical Bayesian autoregressive models for large space-time data with application to ozone concentration modeling; Appl. Stochastic Models Bus. Ind. 28 395–415, doi: 10.1002/asmb.1951.

  15. Advances in Applications of Hierarchical Bayesian Methods with Hydrological Models

    Science.gov (United States)

    Alexander, R. B.; Schwarz, G. E.; Boyer, E. W.

    2017-12-01

    Mechanistic and empirical watershed models are increasingly used to inform water resource decisions. Growing access to historical stream measurements and data from in-situ sensor technologies has increased the need for improved techniques for coupling models with hydrological measurements. Techniques that account for the intrinsic uncertainties of both models and measurements are especially needed. Hierarchical Bayesian methods provide an efficient modeling tool for quantifying model and prediction uncertainties, including those associated with measurements. Hierarchical methods can also be used to explore spatial and temporal variations in model parameters and uncertainties that are informed by hydrological measurements. We used hierarchical Bayesian methods to develop a hybrid (statistical-mechanistic) SPARROW (SPAtially Referenced Regression On Watershed attributes) model of long-term mean annual streamflow across diverse environmental and climatic drainages in 18 U.S. hydrological regions. Our application illustrates the use of a new generation of Bayesian methods that offer more advanced computational efficiencies than the prior generation. Evaluations of the effects of hierarchical (regional) variations in model coefficients and uncertainties on model accuracy indicate improved prediction accuracies (median of 10-50%) but primarily in humid eastern regions, where model uncertainties are one-third of those in arid western regions. Generally moderate regional variability is observed for most hierarchical coefficients. Accounting for measurement and structural uncertainties, using hierarchical state-space techniques, revealed the effects of spatially-heterogeneous, latent hydrological processes in the "localized" drainages between calibration sites; this improved model precision, with only minor changes in regional coefficients. Our study can inform advances in the use of hierarchical methods with hydrological models to improve their integration with stream

  16. Bayesian Decision Theoretical Framework for Clustering

    Science.gov (United States)

    Chen, Mo

    2011-01-01

    In this thesis, we establish a novel probabilistic framework for the data clustering problem from the perspective of Bayesian decision theory. The Bayesian decision theory view justifies the important questions: what is a cluster and what a clustering algorithm should optimize. We prove that the spectral clustering (to be specific, the…

  17. Calibration in a Bayesian modelling framework

    NARCIS (Netherlands)

    Jansen, M.J.W.; Hagenaars, T.H.J.

    2004-01-01

    Bayesian statistics may constitute the core of a consistent and comprehensive framework for the statistical aspects of modelling complex processes that involve many parameters whose values are derived from many sources. Bayesian statistics holds great promises for model calibration, provides the

  18. A Comparison of Hierarchical and Non-Hierarchical Bayesian Approaches for Fitting Allometric Larch (Larix spp.) Biomass Equations

    Directory of Open Access Journals (Sweden)

    Dongsheng Chen

    2016-01-01

    Full Text Available Accurate biomass estimations are important for assessing and monitoring forest carbon storage. Bayesian theory has been widely applied to tree biomass models. Recently, a hierarchical Bayesian approach has received increasing attention for improving biomass models. In this study, tree biomass data were obtained by sampling 310 trees from 209 permanent sample plots from larch plantations in six regions across China. Non-hierarchical and hierarchical Bayesian approaches were used to model allometric biomass equations. We found that the total, root, stem wood, stem bark, branch and foliage biomass model relationships were statistically significant (p-values < 0.001) for both the non-hierarchical and hierarchical Bayesian approaches, but the hierarchical Bayesian approach increased the goodness-of-fit statistics over the non-hierarchical Bayesian approach. The R2 values of the hierarchical approach were higher than those of the non-hierarchical approach by 0.008, 0.018, 0.020, 0.003, 0.088 and 0.116 for the total tree, root, stem wood, stem bark, branch and foliage models, respectively. The hierarchical Bayesian approach significantly improved the accuracy of the biomass model (except for the stem bark) and can reflect regional differences by using random parameters to improve the regional scale model accuracy.
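
    As a sketch of the model form discussed above, an allometric equation B = a * D^b is typically fitted on the log scale, with region-specific coefficients drawn around common hyper-means in the hierarchical version. The simulation and log-likelihood below are an illustrative skeleton with made-up values, not the authors' specification.

      import numpy as np

      rng = np.random.default_rng(42)

      # Simulate log-scale allometric data, ln(B) = a_r + b_r * ln(D) + error,
      # with region-specific coefficients drawn around common (hyper) means.
      n_regions, n_trees = 6, 40
      a_mu, b_mu, a_sd, b_sd, sigma = 2.0, 2.4, 0.15, 0.10, 0.20   # assumed values
      a_r = rng.normal(a_mu, a_sd, n_regions)
      b_r = rng.normal(b_mu, b_sd, n_regions)

      region = np.repeat(np.arange(n_regions), n_trees)
      logD = rng.uniform(np.log(5), np.log(40), region.size)       # diameters, cm
      logB = a_r[region] + b_r[region] * logD + rng.normal(0, sigma, region.size)

      def log_likelihood(a_r, b_r, sigma, region, logD, logB):
          """Gaussian likelihood of log-biomass given region-level coefficients."""
          mu = a_r[region] + b_r[region] * logD
          return np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                        - 0.5 * (logB - mu) ** 2 / sigma**2)

      print(log_likelihood(a_r, b_r, sigma, region, logD, logB))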

  19. Road network safety evaluation using Bayesian hierarchical joint model.

    Science.gov (United States)

    Wang, Jie; Huang, Helai

    2016-05-01

    Safety and efficiency are commonly regarded as two significant performance indicators of transportation systems. In practice, road network planning has focused on road capacity and transport efficiency whereas the safety level of a road network has received little attention in the planning stage. This study develops a Bayesian hierarchical joint model for road network safety evaluation to help planners take traffic safety into account when planning a road network. The proposed model establishes relationships between road network risk and micro-level variables related to road entities and traffic volume, as well as socioeconomic, trip generation and network density variables at macro level which are generally used for long term transportation plans. In addition, network spatial correlation between intersections and their connected road segments is also considered in the model. A road network is elaborately selected in order to compare the proposed hierarchical joint model with a previous joint model and a negative binomial model. According to the results of the model comparison, the hierarchical joint model outperforms the joint model and negative binomial model in terms of the goodness-of-fit and predictive performance, which indicates the reasonableness of considering the hierarchical data structure in crash prediction and analysis. Moreover, both random effects at the TAZ level and the spatial correlation between intersections and their adjacent segments are found to be significant, supporting the employment of the hierarchical joint model as an alternative in road-network-level safety modeling as well. Copyright © 2016 Elsevier Ltd. All rights reserved.
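
    A stripped-down sketch of the kind of count model such road-network studies build on: crash counts follow a Poisson distribution whose log-mean combines covariates (here only log traffic volume) with a zone-level random effect. The covariate choice, coefficient values, and single random-effect level are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(7)

      n_zones, sites_per_zone = 20, 15
      beta0, beta1, sigma_zone = -6.0, 0.7, 0.4          # assumed coefficients

      zone = np.repeat(np.arange(n_zones), sites_per_zone)
      u_zone = rng.normal(0.0, sigma_zone, n_zones)      # TAZ-level random effects
      log_aadt = rng.uniform(np.log(2e3), np.log(4e4), zone.size)   # traffic volume

      # Hierarchical Poisson-lognormal style mean: covariates + zone random effect
      log_mu = beta0 + beta1 * log_aadt + u_zone[zone]
      crashes = rng.poisson(np.exp(log_mu))

      print(crashes[:10], crashes.mean())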

  20. A Bayesian hierarchical model for climate change detection and attribution

    Science.gov (United States)

    Katzfuss, Matthias; Hammerling, Dorit; Smith, Richard L.

    2017-06-01

    Regression-based detection and attribution methods continue to take a central role in the study of climate change and its causes. Here we propose a novel Bayesian hierarchical approach to this problem, which allows us to address several open methodological questions. Specifically, we take into account the uncertainties in the true temperature change due to imperfect measurements, the uncertainty in the true climate signal under different forcing scenarios due to the availability of only a small number of climate model simulations, and the uncertainty associated with estimating the climate variability covariance matrix, including the truncation of the number of empirical orthogonal functions (EOFs) in this covariance matrix. We apply Bayesian model averaging to assign optimal probabilistic weights to different possible truncations and incorporate all uncertainties into the inference on the regression coefficients. We provide an efficient implementation of our method in a software package and illustrate its use with a realistic application.

  1. Attention in a Bayesian framework

    DEFF Research Database (Denmark)

    Whiteley, Louise Emma; Sahani, Maneesh

    2012-01-01

    ... and include both selective phenomena, where attention is invoked by cues that point to particular stimuli, and integrative phenomena, where attention is invoked dynamically by endogenous processing. However, most previous Bayesian accounts of attention have focused on describing relatively simple experimental settings, where cues shape expectations about a small number of upcoming stimuli and thus convey "prior" information about clearly defined objects. While operationally consistent with the experiments it seeks to describe, this view of attention as prior seems to miss many essential elements of both its...

  2. Bayesian Uncertainty Quantification for Subsurface Inversion Using a Multiscale Hierarchical Model

    KAUST Repository

    Mondal, Anirban

    2014-07-03

    We consider a Bayesian approach to nonlinear inverse problems in which the unknown quantity is a random field (spatial or temporal). The Bayesian approach contains a natural mechanism for regularization in the form of prior information, can incorporate information from heterogeneous sources and provide a quantitative assessment of uncertainty in the inverse solution. The Bayesian setting casts the inverse solution as a posterior probability distribution over the model parameters. The Karhunen-Loeve expansion is used for dimension reduction of the random field. Furthermore, we use a hierarchical Bayes model to inject multiscale data in the modeling framework. In this Bayesian framework, we show that this inverse problem is well-posed by proving that the posterior measure is Lipschitz continuous with respect to the data in total variation norm. Computational challenges in this construction arise from the need for repeated evaluations of the forward model (e.g., in the context of MCMC) and are compounded by high dimensionality of the posterior. We develop a two-stage reversible jump MCMC that has the ability to screen the bad proposals in the first inexpensive stage. Numerical results are presented by analyzing simulated as well as real data from a hydrocarbon reservoir. This article has supplementary material available online. © 2014 American Statistical Association and the American Society for Quality.
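
    A small numpy sketch of the Karhunen-Loeve step mentioned above: a Gaussian random field on a 1-D grid is approximated by the leading eigenpairs of its covariance matrix, so that a handful of independent standard-normal coefficients parameterise the field. The squared-exponential kernel and the truncation level are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(3)

      # Discretised covariance of a 1-D Gaussian random field (squared-exponential kernel)
      x = np.linspace(0.0, 1.0, 200)
      ell, var = 0.2, 1.0
      C = var * np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell**2)

      # Karhunen-Loeve expansion: keep the leading eigenpairs
      eigvals, eigvecs = np.linalg.eigh(C)
      order = np.argsort(eigvals)[::-1]
      eigvals, eigvecs = eigvals[order], eigvecs[:, order]
      n_kl = 10                                             # truncation level (assumed)

      theta = rng.standard_normal(n_kl)                     # low-dimensional parameters
      field = eigvecs[:, :n_kl] @ (np.sqrt(eigvals[:n_kl]) * theta)

      print(field.shape, eigvals[:n_kl].round(3))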

  3. Sampling-free Bayesian inversion with adaptive hierarchical tensor representations

    Science.gov (United States)

    Eigel, Martin; Marschall, Manuel; Schneider, Reinhold

    2018-03-01

    A sampling-free approach to Bayesian inversion with an explicit polynomial representation of the parameter densities is developed, based on an affine-parametric representation of a linear forward model. This becomes feasible due to the complete treatment in function spaces, which requires an efficient model reduction technique for numerical computations. The advocated perspective yields the crucial benefit that error bounds can be derived for all occurring approximations, leading to provable convergence subject to the discretization parameters. Moreover, it enables a fully adaptive a posteriori control with automatic problem-dependent adjustments of the employed discretizations. The method is discussed in the context of modern hierarchical tensor representations, which are used for the evaluation of a random PDE (the forward model) and the subsequent high-dimensional quadrature of the log-likelihood, alleviating the ‘curse of dimensionality’. Numerical experiments demonstrate the performance and confirm the theoretical results.

  4. Topics in Computational Bayesian Statistics With Applications to Hierarchical Models in Astronomy and Sociology

    Science.gov (United States)

    Sahai, Swupnil

    This thesis includes three parts. The overarching theme is how to analyze structured hierarchical data, with applications to astronomy and sociology. The first part discusses how expectation propagation can be used to parallelize the computation when fitting big hierarchical Bayesian models. This methodology is then used to fit a novel, nonlinear mixture model to ultraviolet radiation from various regions of the observable universe. The second part discusses how the Stan probabilistic programming language can be used to numerically integrate terms in a hierarchical Bayesian model. This technique is demonstrated on supernovae data to significantly speed up convergence to the posterior distribution compared to a previous study that used a Gibbs-type sampler. The third part builds a formal latent kernel representation for aggregate relational data as a way to more robustly estimate the mixing characteristics of agents in a network. In particular, the framework is applied to sociology surveys to estimate, as a function of ego age, the age and sex composition of the personal networks of individuals in the United States.

  5. Prediction of road accidents: A Bayesian hierarchical approach.

    Science.gov (United States)

    Deublein, Markus; Schubert, Matthias; Adey, Bryan T; Köhler, Jochen; Faber, Michael H

    2013-03-01

    In this paper a novel methodology for the prediction of the occurrence of road accidents is presented. The methodology utilizes a combination of three statistical methods: (1) gamma-updating of the occurrence rates of injury accidents and injured road users, (2) hierarchical multivariate Poisson-lognormal regression analysis taking into account correlations amongst multiple dependent model response variables and effects of discrete accident count data e.g. over-dispersion, and (3) Bayesian inference algorithms, which are applied by means of data mining techniques supported by Bayesian Probabilistic Networks in order to represent non-linearity between risk indicating and model response variables, as well as different types of uncertainties which might be present in the development of the specific models. Prior Bayesian Probabilistic Networks are first established by means of multivariate regression analysis of the observed frequencies of the model response variables, e.g. the occurrence of an accident, and observed values of the risk indicating variables, e.g. degree of road curvature. Subsequently, parameter learning is done using updating algorithms, to determine the posterior predictive probability distributions of the model response variables, conditional on the values of the risk indicating variables. The methodology is illustrated through a case study using data of the Austrian rural motorway network. In the case study, on randomly selected road segments the methodology is used to produce a model to predict the expected number of accidents in which an injury has occurred and the expected number of light, severe and fatally injured road users. Additionally, the methodology is used for geo-referenced identification of road sections with increased occurrence probabilities of injury accident events on a road link between two Austrian cities. It is shown that the proposed methodology can be used to develop models to estimate the occurrence of road accidents for any
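
    The "gamma-updating" in step (1) is the standard conjugate update for a Poisson occurrence rate: with a Gamma(alpha, beta) prior and observed counts over a known exposure, the posterior is Gamma(alpha + sum of counts, beta + exposure). The sketch below shows this update with made-up prior parameters and counts.

      import numpy as np
      from scipy.stats import gamma

      # Conjugate gamma-Poisson update of an accident occurrence rate (per segment-year)
      alpha0, beta0 = 2.0, 4.0          # assumed prior: mean 0.5 accidents/year
      counts = np.array([1, 0, 2, 1])   # observed accidents in four years (made up)
      exposure = len(counts)            # segment-years of observation

      alpha_post = alpha0 + counts.sum()
      beta_post = beta0 + exposure

      posterior = gamma(a=alpha_post, scale=1.0 / beta_post)
      print("posterior mean rate:", posterior.mean())
      print("95% credible interval:", posterior.interval(0.95))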

  6. Transdimensional, hierarchical, Bayesian inversion of ambient seismic noise: Australia

    Science.gov (United States)

    Crowder, E.; Rawlinson, N.; Cornwell, D. G.

    2017-12-01

    We present models of crustal velocity structure in southeastern Australia using a novel transdimensional, hierarchical Bayesian inversion approach. The inversion is applied to long-time ambient noise cross-correlations. The study area of SE Australia is thought to represent the eastern margin of Gondwana. Conflicting tectonic models have been proposed to explain the formation of eastern Gondwana and the enigmatic geological relationships in Bass Strait, which separates Tasmania and the mainland. A geologically complex area of crustal accretion, Bass Strait may contain part of an exotic continental block entrained in colliding crusts. Ambient noise data recorded by an array of 24 seismometers is used to produce a high resolution, 3D shear wave velocity model of Bass Strait. Phase velocity maps in the period range 2-30 s are produced and subsequently inverted for 3D shear wave velocity structure. The transdimensional, hierarchical Bayesian inversion technique is used. This technique proves far superior to linearised inversion. The inversion model is dynamically parameterised during the process, implicitly controlled by the data, and noise is treated as an inversion unknown. The resulting shear wave velocity model shows three sedimentary basins in Bass Strait constrained by slow shear velocities (2.4-2.9 km/s) at 2-10 km depth. These failed rift basins from the breakup of Australia-Antarctica appear to be overlying thinned crust, where typical mantle velocities of 3.8-4.0 km/s occur at depths greater than 20 km. High shear wave velocities (~3.7-3.8 km/s) in our new model also match well with regions of high magnetic and gravity anomalies. Furthermore, we use both Rayleigh and Love wave phase data to construct Vsv and Vsh maps. These are used to estimate crustal radial anisotropy in the Bass Strait. We interpret that structures delineated by our velocity models support the presence and extent of the exotic Precambrian micro-continent (the Selwyn Block) that was

  7. Hierarchical Bayesian inference of the initial mass function in composite stellar populations

    Science.gov (United States)

    Dries, M.; Trager, S. C.; Koopmans, L. V. E.; Popping, G.; Somerville, R. S.

    2018-03-01

    The initial mass function (IMF) is a key ingredient in many studies of galaxy formation and evolution. Although the IMF is often assumed to be universal, there is continuing evidence that it is not. Spectroscopic studies that derive the IMF of the unresolved stellar populations of a galaxy often assume that the observed spectrum can be described by a single stellar population (SSP). To alleviate these limitations, in this paper we have developed a unique hierarchical Bayesian framework for modelling composite stellar populations (CSPs). Within this framework, we use a parametrized IMF prior to regulate a direct inference of the IMF. We use this new framework to determine the number of SSPs that is required to fit a set of realistic CSP mock spectra. The CSP mock spectra that we use are based on semi-analytic models and have an IMF that varies as a function of stellar velocity dispersion of the galaxy. Our results suggest that using a single SSP biases the determination of the IMF slope to a higher value than the true slope, although the trend with stellar velocity dispersion is overall recovered. If we include more SSPs in the fit, the Bayesian evidence increases significantly and the inferred IMF slopes of our mock spectra converge, within the errors, to their true values. Most of the bias is already removed by using two SSPs instead of one. We show that we can reconstruct the variable IMF of our mock spectra for signal-to-noise ratios exceeding ~75.

  8. Fluorocarbon adsorption in hierarchical porous frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Motkuri, RK; Annapureddy, HVR; Vijaykumar, M; Schaef, HT; Martin, PF; McGrail, BP; Dang, LX; Krishna, R; Thallapally, PK

    2014-07-09

    Metal-organic frameworks comprise an important class of solid-state materials and have potential for many emerging applications such as energy storage, separation, catalysis and bio-medical. Here we report the adsorption behaviour of a series of fluorocarbon derivatives on a set of microporous and hierarchical mesoporous frameworks. The microporous frameworks show a saturation uptake capacity for dichlorodifluoromethane of >4 mmol g-1 at a very low relative saturation pressure (P/Po) of 0.02. In contrast, the mesoporous framework shows an exceptionally high uptake capacity reaching >14 mmol g-1 at P/Po of 0.4. Adsorption affinity in terms of mass loading and isosteric heats of adsorption is found to generally correlate with the polarizability and boiling point of the refrigerant, with dichlorodifluoromethane >chlorodifluoromethane >chlorotrifluoromethane >tetrafluoromethane >methane. These results suggest the possibility of exploiting these sorbents for separation of azeotropic mixtures of fluorocarbons and use in eco-friendly fluorocarbon-based adsorption cooling.

  9. Fluorocarbon adsorption in hierarchical porous frameworks

    Science.gov (United States)

    Motkuri, Radha Kishan; Annapureddy, Harsha V. R.; Vijaykumar, M.; Schaef, H. Todd; Martin, Paul F.; McGrail, B. Peter; Dang, Liem X.; Krishna, Rajamani; Thallapally, Praveen K.

    2014-07-01

    Metal-organic frameworks comprise an important class of solid-state materials and have potential for many emerging applications such as energy storage, separation, catalysis and bio-medical. Here we report the adsorption behaviour of a series of fluorocarbon derivatives on a set of microporous and hierarchical mesoporous frameworks. The microporous frameworks show a saturation uptake capacity for dichlorodifluoromethane of >4 mmol g-1 at a very low relative saturation pressure (P/Po) of 0.02. In contrast, the mesoporous framework shows an exceptionally high uptake capacity reaching >14 mmol g-1 at P/Po of 0.4. Adsorption affinity in terms of mass loading and isosteric heats of adsorption is found to generally correlate with the polarizability and boiling point of the refrigerant, with dichlorodifluoromethane >chlorodifluoromethane >chlorotrifluoromethane >tetrafluoromethane >methane. These results suggest the possibility of exploiting these sorbents for separation of azeotropic mixtures of fluorocarbons and use in eco-friendly fluorocarbon-based adsorption cooling.

  10. A novel Bayesian hierarchical model for road safety hotspot prediction.

    Science.gov (United States)

    Fawcett, Lee; Thorpe, Neil; Matthews, Joseph; Kremer, Karsten

    2017-02-01

    In this paper, we propose a Bayesian hierarchical model for predicting accident counts in future years at sites within a pool of potential road safety hotspots. The aim is to inform road safety practitioners of the location of likely future hotspots to enable a proactive, rather than reactive, approach to road safety scheme implementation. A feature of our model is the ability to rank sites according to their potential to exceed, in some future time period, a threshold accident count which may be used as a criterion for scheme implementation. Our model specification enables the classical empirical Bayes formulation - commonly used in before-and-after studies, wherein accident counts from a single before period are used to estimate counterfactual counts in the after period - to be extended to incorporate counts from multiple time periods. This allows site-specific variations in historical accident counts (e.g. locally-observed trends) to offset estimates of safety generated by a global accident prediction model (APM), which itself is used to help account for the effects of global trend and regression-to-mean (RTM). The Bayesian posterior predictive distribution is exploited to formulate predictions and to properly quantify our uncertainty in these predictions. The main contributions of our model include (i) the ability to allow accident counts from multiple time-points to inform predictions, with counts in more recent years lending more weight to predictions than counts from time-points further in the past; (ii) where appropriate, the ability to offset global estimates of trend by variations in accident counts observed locally, at a site-specific level; and (iii) the ability to account for unknown/unobserved site-specific factors which may affect accident counts. We illustrate our model with an application to accident counts at 734 potential hotspots in the German city of Halle; we also propose some simple diagnostics to validate the predictive capability of our
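
    The ranking criterion described above, the probability that a site's future accident count exceeds a threshold, can be approximated directly from posterior samples of the site's rate by averaging Poisson tail probabilities. In the sketch below the lognormal "posterior samples" are a stand-in for MCMC output; the rates and threshold are made up.

      import numpy as np
      from scipy.stats import poisson

      rng = np.random.default_rng(11)

      # Stand-in for MCMC output: posterior samples of each site's annual accident rate
      n_sites, n_draws = 5, 4000
      lam_samples = rng.lognormal(mean=np.log([0.8, 1.5, 2.2, 3.0, 4.5])[:, None],
                                  sigma=0.3, size=(n_sites, n_draws))

      threshold = 3          # accidents in the future year that would trigger treatment

      # Posterior predictive P(future count >= threshold) for each site,
      # averaging the Poisson tail probability over the posterior draws of the rate
      p_exceed = poisson.sf(threshold - 1, lam_samples).mean(axis=1)

      ranking = np.argsort(p_exceed)[::-1]
      print(np.round(p_exceed, 3), "ranked sites:", ranking)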

  11. Efficient hierarchical trans-dimensional Bayesian inversion of magnetotelluric data

    Science.gov (United States)

    Xiang, Enming; Guo, Rongwen; Dosso, Stan E.; Liu, Jianxin; Dong, Hao; Ren, Zhengyong

    2018-02-01

    This paper develops an efficient hierarchical trans-dimensional (trans-D) Bayesian algorithm to invert magnetotelluric (MT) data for subsurface geoelectrical structure, with unknown geophysical model parameterization (the number of conductivity-layer interfaces) and data error models parameterized by an auto-regressive (AR) process to account for potential error correlations. The reversible-jump Markov-chain Monte Carlo algorithm, which adds/removes interfaces and AR parameters in birth/death steps, is applied to sample the trans-D posterior probability density for model parameterization, model parameters, error variance and AR parameters, accounting for the uncertainties of model dimension and data-error statistics in the uncertainty estimates of the conductivity profile. To provide efficient sampling over the multiple subspaces of different dimensions, advanced proposal schemes are applied. Parameter perturbations are carried out in principal-component space, defined by eigen-decomposition of the unit-lag model covariance matrix, to minimize the effect of inter-parameter correlations and provide effective perturbation directions and length scales. Parameters of new layers in birth steps are proposed from the prior, instead of focused distributions centred at existing values, to improve birth acceptance rates. Parallel tempering, based on a series of parallel interacting Markov chains with successively relaxed likelihoods, is applied to improve chain mixing over model dimensions. The trans-D inversion is applied in a simulation study to examine the resolution of model structure according to the data information content. The inversion is also applied to a measured MT data set from south-central Australia.

  12. Likelihood-free inference of population structure and local adaptation in a Bayesian hierarchical model.

    Science.gov (United States)

    Bazin, Eric; Dawson, Kevin J; Beaumont, Mark A

    2010-06-01

    We address the problem of finding evidence of natural selection from genetic data, accounting for the confounding effects of demographic history. In the absence of natural selection, gene genealogies should all be sampled from the same underlying distribution, often approximated by a coalescent model. Selection at a particular locus will lead to a modified genealogy, and this motivates a number of recent approaches for detecting the effects of natural selection in the genome as "outliers" under some models. The demographic history of a population affects the sampling distribution of genealogies, and therefore the observed genotypes and the classification of outliers. Since we cannot see genealogies directly, we have to infer them from the observed data under some model of mutation and demography. Thus the accuracy of an outlier-based approach depends to a greater or a lesser extent on the uncertainty about the demographic and mutational model. A natural modeling framework for this type of problem is provided by Bayesian hierarchical models, in which parameters, such as mutation rates and selection coefficients, are allowed to vary across loci. It has proved quite difficult computationally to implement fully probabilistic genealogical models with complex demographies, and this has motivated the development of approximations such as approximate Bayesian computation (ABC). In ABC the data are compressed into summary statistics, and computation of the likelihood function is replaced by simulation of data under the model. In a hierarchical setting one may be interested both in hyperparameters and parameters, and there may be very many of the latter--for example, in a genetic model, these may be parameters describing each of many loci or populations. This poses a problem for ABC in that one then requires summary statistics for each locus, which, if used naively, leads to a consequent difficulty in conditional density estimation. We develop a general method for applying
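
    A toy ABC rejection sampler showing the mechanics described above: draw parameters from the prior, simulate data under the model, compress to a summary statistic, and keep draws whose summaries land close to the observed one. The Poisson toy model, the summary, and the tolerance are assumptions for illustration and are unrelated to the genetic models of the paper.

      import numpy as np

      rng = np.random.default_rng(5)

      # Observed data and its summary statistic (toy example: Poisson mean)
      observed = rng.poisson(3.0, size=50)
      s_obs = observed.mean()

      def abc_rejection(n_draws=20000, eps=0.1):
          """Basic ABC rejection: prior draw -> simulate -> compare summaries."""
          accepted = []
          for _ in range(n_draws):
              theta = rng.uniform(0.0, 10.0)                 # prior draw
              sim = rng.poisson(theta, size=observed.size)   # simulate under the model
              if abs(sim.mean() - s_obs) < eps:              # distance on summaries
                  accepted.append(theta)
          return np.array(accepted)

      posterior_draws = abc_rejection()
      print(len(posterior_draws), posterior_draws.mean(), posterior_draws.std())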

  13. Technical Note: Probabilistically constraining proxy age–depth models within a Bayesian hierarchical reconstruction model

    Directory of Open Access Journals (Sweden)

    J. P. Werner

    2015-03-01

    Full Text Available Reconstructions of the late-Holocene climate rely heavily upon proxies that are assumed to be accurately dated by layer counting, such as measurements of tree rings, ice cores, and varved lake sediments. Considerable advances could be achieved if time-uncertain proxies were able to be included within these multiproxy reconstructions, and if time uncertainties were recognized and correctly modeled for proxies commonly treated as free of age model errors. Current approaches for accounting for time uncertainty are generally limited to repeating the reconstruction using each one of an ensemble of age models, thereby inflating the final estimated uncertainty – in effect, each possible age model is given equal weighting. Uncertainties can be reduced by exploiting the inferred space–time covariance structure of the climate to re-weight the possible age models. Here, we demonstrate how Bayesian hierarchical climate reconstruction models can be augmented to account for time-uncertain proxies. Critically, although a priori all age models are given equal probability of being correct, the probabilities associated with the age models are formally updated within the Bayesian framework, thereby reducing uncertainties. Numerical experiments show that updating the age model probabilities decreases uncertainty in the resulting reconstructions, as compared with the current de facto standard of sampling over all age models, provided there is sufficient information from other data sources in the spatial region of the time-uncertain proxy. This approach can readily be generalized to non-layer-counted proxies, such as those derived from marine sediments.

  14. A hierarchical Bayesian model to incorporate uncertainty into methods for diversity partitioning.

    Science.gov (United States)

    Marion, Zachary H; Fordyce, James A; Fitzpatrick, Benjamin M

    2018-04-01

    Recently there have been major theoretical advances in the quantification and partitioning of diversity within and among communities, regions, and ecosystems. However, applying those advances to real data remains a challenge. Ecologists often end up describing their samples rather than estimating the diversity components of an underlying study system, and existing approaches do not easily provide statistical frameworks for testing ecological questions. Here we offer one avenue to do all of the above using a hierarchical Bayesian approach. We estimate posterior distributions of the underlying "true" relative abundances of each species within each unit sampled. These posterior estimates of relative abundance can then be used with existing formulae to estimate and partition diversity. The result is a posterior distribution of diversity metrics describing our knowledge (or beliefs) about the study system. This approach intuitively leads to statistical inferences addressing biologically motivated hypotheses via Bayesian model comparison. Using simulations, we demonstrate that our approach does as well or better at approximating the "true" diversity of a community relative to naïve or ad-hoc bias-corrected estimates. Moreover, model comparison correctly distinguishes between alternative hypotheses about the distribution of diversity within and among samples. Finally, we use an empirical ecological dataset to illustrate how the approach can be used to address questions about the makeup and diversities of assemblages at local and regional scales. © 2018 by the Ecological Society of America.
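
    The core idea, placing a posterior on the "true" relative abundances and pushing it through a diversity formula, can be sketched with a Dirichlet-multinomial model: given counts and a Dirichlet(alpha) prior, the posterior of the relative abundances is Dirichlet(alpha + counts), and each posterior draw yields a diversity value (here the Hill number of order 1, the exponential of Shannon entropy). The prior and counts below are made up, and this is a simplification of the hierarchical model in the paper.

      import numpy as np
      from scipy.special import xlogy

      rng = np.random.default_rng(2)

      counts = np.array([30, 12, 5, 2, 1, 0, 0])   # observed counts per species (made up)
      alpha = np.full(counts.size, 0.5)            # assumed symmetric Dirichlet prior

      # Posterior of the "true" relative abundances: Dirichlet(alpha + counts)
      post_p = rng.dirichlet(alpha + counts, size=5000)

      # Hill number of order 1 (exp of Shannon entropy) for each posterior draw
      shannon = -xlogy(post_p, post_p).sum(axis=1)
      hill1 = np.exp(shannon)

      print("posterior mean diversity:", hill1.mean())
      print("95% credible interval:", np.percentile(hill1, [2.5, 97.5]))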

  15. Prediction of road accidents: A Bayesian hierarchical approach

    DEFF Research Database (Denmark)

    Deublein, Markus; Schubert, Matthias; Adey, Bryan T.

    2013-01-01

    ... Poisson-lognormal regression analysis taking into account correlations amongst multiple dependent model response variables and effects of discrete accident count data e.g. over-dispersion, and (3) Bayesian inference algorithms, which are applied by means of data mining techniques supported by Bayesian Probabilistic Networks in order to represent non-linearity between risk indicating and model response variables, as well as different types of uncertainties which might be present in the development of the specific models. Prior Bayesian Probabilistic Networks are first established by means of multivariate regression analysis ...

  16. Bayesian disease mapping: hierarchical modeling in spatial epidemiology

    National Research Council Canada - National Science Library

    Lawson, Andrew

    2013-01-01

    Since the publication of the first edition, many new Bayesian tools and methods have been developed for space-time data analysis, the predictive modeling of health outcomes, and other spatial biostatistical areas...

  17. A Hierarchical Bayesian Model to Predict Self-Thinning Line for Chinese Fir in Southern China.

    Directory of Open Access Journals (Sweden)

    Xiongqing Zhang

    Full Text Available Self-thinning is a dynamic equilibrium between forest growth and mortality at full site occupancy. Parameters of the self-thinning lines are often confounded by differences across various stand and site conditions. To overcome the problem of hierarchical and repeated-measures data, we used a hierarchical Bayesian method to estimate the self-thinning line. The results showed that the self-thinning line for Chinese fir (Cunninghamia lanceolata (Lamb.) Hook.) plantations was not sensitive to the initial planting density. The uncertainty of model predictions was mostly due to within-subject variability. The simulation precision of the hierarchical Bayesian method was better than that of the stochastic frontier function (SFF). The hierarchical Bayesian method provided a reasonable explanation of the impact of other variables (site quality, soil type, aspect, etc.) on the self-thinning line, and yielded the posterior distribution of the parameters of the self-thinning line. Research on the self-thinning relationship could benefit from the use of the hierarchical Bayesian method.

  18. Clustering and Bayesian hierarchical modeling for the definition of informative prior distributions in hydrogeology

    Science.gov (United States)

    Cucchi, K.; Kawa, N.; Hesse, F.; Rubin, Y.

    2017-12-01

    In order to reduce uncertainty in the prediction of subsurface flow and transport processes, practitioners should use all data available. However, classic inverse modeling frameworks typically only make use of information contained in in-situ field measurements to provide estimates of hydrogeological parameters. Such hydrogeological information about an aquifer is difficult and costly to acquire. In this data-scarce context, the transfer of ex-situ information coming from previously investigated sites can be critical for improving predictions by better constraining the estimation procedure. Bayesian inverse modeling provides a coherent framework to represent such ex-situ information by virtue of the prior distribution and combine it with in-situ information from the target site. In this study, we present an innovative data-driven approach for defining such informative priors for hydrogeological parameters at the target site. Our approach consists of two steps, both relying on statistical and machine learning methods. The first step is data selection; it consists in selecting sites similar to the target site. We use clustering methods for selecting similar sites based on observable hydrogeological features. The second step is data assimilation; it consists in assimilating data from the selected similar sites into the informative prior. We use a Bayesian hierarchical model to account for inter-site variability and to allow for the assimilation of multiple types of site-specific data. We present the application and validation of the presented methods on an established database of hydrogeological parameters. Data and methods are implemented in the form of an open-source R-package and therefore facilitate easy use by other practitioners.

  19. Composite behavior analysis for video surveillance using hierarchical dynamic Bayesian networks

    Science.gov (United States)

    Cheng, Huanhuan; Shan, Yong; Wang, Runsheng

    2011-03-01

    Analyzing composite behaviors involving objects from multiple categories in surveillance videos is a challenging task due to the complicated relationships among humans and objects. This paper presents a novel behavior analysis framework using a hierarchical dynamic Bayesian network (DBN) for video surveillance systems. The model is built for extracting objects' behaviors and their relationships by representing behaviors using spatial-temporal characteristics. The recognition of object behaviors is processed by the DBN at multiple levels: features of objects at low level, objects and their relationships at middle level, and event at high level, where event refers to behaviors of a single type object as well as behaviors consisting of several types of objects such as "a person getting in a car." Furthermore, to reduce the complexity, a simple model selection criterion is addressed, by which the appropriate model is picked out from a pool of candidate models. Experiments are shown to demonstrate that the proposed framework could efficiently recognize and semantically describe composite object and human activities in surveillance videos.

  20. Assimilating multi-source uncertainties of a parsimonious conceptual hydrological model using hierarchical Bayesian modeling

    Science.gov (United States)

    Wei Wu; James Clark; James Vose

    2010-01-01

    Hierarchical Bayesian (HB) modeling allows for multiple sources of uncertainty by factoring complex relationships into conditional distributions that can be used to draw inference and make predictions. We applied an HB model to estimate the parameters and state variables of a parsimonious hydrological model – GR4J – by coherently assimilating the uncertainties from the...

  1. Hierarchical Bayesian modeling of the space - time diffusion patterns of cholera epidemic in Kumasi, Ghana

    NARCIS (Netherlands)

    Osei, Frank B.; Osei, F.B.; Duker, Alfred A.; Stein, A.

    2011-01-01

    This study analyses the joint effects of the two transmission routes of cholera on the space-time diffusion dynamics. Statistical models are developed and presented to investigate the transmission network routes of cholera diffusion. A hierarchical Bayesian modelling approach is employed for a joint

  2. A hierarchical Bayesian spatio-temporal model to forecast trapped particle fluxes over the SAA region

    Czech Academy of Sciences Publication Activity Database

    Suparta, W.; Gusrizal, G.; Kudela, Karel; Isa, Z.

    2017-01-01

    Vol. 28, No. 3 (2017), pp. 357-370. ISSN 1017-0839. R&D Projects: GA MŠk EF15_003/0000481. Institutional support: RVO:61389005. Keywords: trapped particle * spatio-temporal * hierarchical Bayesian * forecasting. Subject RIV: DG - Atmosphere Sciences, Meteorology. OECD field: Meteorology and atmospheric sciences. Impact factor: 0.752, year: 2016

  3. Probabilistic Inference: Task Dependency and Individual Differences of Probability Weighting Revealed by Hierarchical Bayesian Modeling.

    Science.gov (United States)

    Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno

    2016-01-01

    Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.
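
    As a concrete illustration of the weighting-function idea, the sketch below uses a Tversky-Kahneman style inverse-S function with individual parameters drawn from an assumed group-level distribution; the parameter values and notation are ours, not the authors'.

```python
# Schematic sketch of an inverse-S probability weighting function with individual-level
# parameters sampled from a group-level distribution (values assumed for illustration).
import numpy as np

def weight(p, gamma):
    """Tversky-Kahneman style weighting: inverse-S shaped for gamma < 1."""
    return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

rng = np.random.default_rng(7)
group_mu, group_sd = 0.7, 0.15                     # assumed group-level hyperparameters
gammas = rng.normal(group_mu, group_sd, size=5)    # individual weighting parameters
p = np.array([0.05, 0.25, 0.50, 0.75, 0.95])
for g in gammas:
    print(np.round(weight(p, g), 3))               # one row of distorted probabilities per individual
```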

  4. Probabilistic inference: Task dependency and individual differences of probability weighting revealed by hierarchical Bayesian modelling

    Directory of Open Access Journals (Sweden)

    Moritz eBoos

    2016-05-01

    Full Text Available Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modelling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behaviour. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model’s success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modelling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modelling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.

  5. Multilevel Bayesian networks for the analysis of hierarchical health care data.

    Science.gov (United States)

    Lappenschaar, Martijn; Hommersom, Arjen; Lucas, Peter J F; Lagro, Joep; Visscher, Stefan

    2013-03-01

    Large health care datasets normally have a hierarchical structure, in terms of levels, as the data have been obtained from different practices, hospitals, or regions. Multilevel regression is the technique commonly used to deal with such multilevel data. However, for the statistical analysis of interactions between entities from a domain, multilevel regression yields little to no insight. While Bayesian networks have proved to be useful for the analysis of interactions, they do not have the capability to deal with hierarchical data. In this paper, we describe a new formalism, which we call multilevel Bayesian networks; its effectiveness for the analysis of hierarchically structured health care data is studied from the perspective of multimorbidity. Multilevel Bayesian networks are formally defined and applied to analyze clinical data from family practices in The Netherlands with the aim of predicting interactions between heart failure and diabetes mellitus. We compare the results obtained with multilevel regression. The results obtained by multilevel Bayesian networks closely resembled those obtained by multilevel regression. For both diseases, the area under the curve of the prediction model improved, and the net reclassification improvements were significantly positive. In addition, the models offered considerably more insight, through their internal structure, into the interactions between the diseases. Multilevel Bayesian networks offer a suitable alternative to multilevel regression when analyzing hierarchical health care data. They provide more insight into the interactions between multiple diseases. Moreover, a multilevel Bayesian network model can be used for the prediction of the occurrence of multiple diseases, even when some of the predictors are unknown, which is typically the case in medicine. Copyright © 2013 Elsevier B.V. All rights reserved.

  6. Application of hierarchical Bayesian unmixing models in river sediment source apportionment

    Science.gov (United States)

    Blake, Will; Smith, Hugh; Navas, Ana; Bodé, Samuel; Goddard, Rupert; Zou Kuzyk, Zou; Lennard, Amy; Lobb, David; Owens, Phil; Palazon, Leticia; Petticrew, Ellen; Gaspar, Leticia; Stock, Brian; Boeckx, Pacsal; Semmens, Brice

    2016-04-01

    Fingerprinting and unmixing concepts are used widely across environmental disciplines for forensic evaluation of pollutant sources. In aquatic and marine systems, this includes tracking the source of organic and inorganic pollutants in water and linking problem sediment to soil erosion and land use sources. It is, however, the particular complexity of ecological systems that has driven creation of the most sophisticated mixing models, primarily to (i) evaluate diet composition in complex ecological food webs, (ii) inform population structure and (iii) explore animal movement. In the context of the new hierarchical Bayesian unmixing model, MixSIAR, developed to characterise intra-population niche variation in ecological systems, we evaluate the linkage between ecological 'prey' and 'consumer' concepts and river basin sediment 'source' and sediment 'mixtures' to exemplify the value of ecological modelling tools to river basin science. Recent studies have outlined advantages presented by Bayesian unmixing approaches in handling complex source and mixture datasets while dealing appropriately with uncertainty in parameter probability distributions. MixSIAR is unique in that it allows individual fixed and random effects associated with mixture hierarchy, i.e. factors that might exert an influence on model outcome for mixture groups, to be explored within the source-receptor framework. This offers new and powerful ways of interpreting river basin apportionment data. In this contribution, key components of the model are evaluated in the context of common experimental designs for sediment fingerprinting studies, namely simple, nested and distributed catchment sampling programmes. Illustrative examples using geochemical and compound-specific stable isotope datasets are presented and used to discuss best practice with specific attention to (1) the tracer selection process, (2) incorporation of fixed effects relating to sample timeframe and sediment type in the modelling
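
    The sketch below illustrates the basic Bayesian unmixing idea with invented tracer values, using PyMC rather than MixSIAR's R/JAGS implementation and omitting the fixed- and random-effect hierarchy discussed above.

```python
# Minimal Bayesian unmixing sketch: Dirichlet prior on source proportions, Gaussian
# likelihood on tracer signatures. Tracer values are invented; this is not MixSIAR.
import numpy as np
import pymc as pm

source_mean = np.array([[10.0, -26.0],    # source 1: two tracers per source
                        [25.0, -28.5],    # source 2
                        [40.0, -31.0]])   # source 3
source_sd = np.full_like(source_mean, 1.5)
mixture = np.array([[22.0, -28.1],        # sediment mixture samples
                    [24.5, -28.8],
                    [21.0, -27.9]])

with pm.Model() as unmix:
    p = pm.Dirichlet("p", a=np.ones(3))                        # source proportions
    mix_mu = pm.math.dot(p, source_mean)                       # expected tracer signature
    mix_sd = pm.math.sqrt(pm.math.dot(p**2, source_sd**2))     # propagated source spread
    pm.Normal("obs", mu=mix_mu, sigma=mix_sd, observed=mixture)
    idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)
```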

  7. A BAYESIAN HIERARCHICAL SPATIAL MODEL FOR DENTAL CARIES ASSESSMENT USING NON-GAUSSIAN MARKOV RANDOM FIELDS.

    Science.gov (United States)

    Jin, Ick Hoon; Yuan, Ying; Bandyopadhyay, Dipankar

    2016-01-01

    Research in dental caries generates data with two levels of hierarchy: that of a tooth overall and that of the different surfaces of the tooth. The outcomes often exhibit spatial referencing among neighboring teeth and surfaces, i.e., the disease status of a tooth or surface might be influenced by the status of a set of proximal teeth/surfaces. Assessments of dental caries (tooth decay) at the tooth level yield binary outcomes indicating the presence/absence of teeth, and trinary outcomes at the surface level indicating healthy, decayed, or filled surfaces. The presence of these mixed discrete responses complicates the data analysis under a unified framework. To mitigate complications, we develop a Bayesian two-level hierarchical model under suitable (spatial) Markov random field assumptions that accommodates the natural hierarchy within the mixed responses. At the first level, we utilize an autologistic model to accommodate the spatial dependence for the tooth-level binary outcomes. For the second level and conditioned on a tooth being non-missing, we utilize a Potts model to accommodate the spatial referencing for the surface-level trinary outcomes. The regression models at both levels are controlled for plausible covariates (risk factors) of caries and remain connected through shared parameters. To tackle the computational challenges in our Bayesian estimation scheme caused by the doubly-intractable normalizing constant, we employ a double Metropolis-Hastings sampler. We compare and contrast our model's performance with the standard non-spatial (naive) model using a small simulation study, and illustrate the approach via an application to a clinical dataset on dental caries.

  8. A hierarchical framework for air traffic control

    Science.gov (United States)

    Roy, Kaushik

    Air travel in recent years has been plagued by record delays, with over $8 billion in direct operating costs being attributed to 100 million flight delay minutes in 2007. Major contributing factors to delay include weather, congestion, and aging infrastructure; the Next Generation Air Transportation System (NextGen) aims to alleviate these delays through an upgrade of the air traffic control system. Changes to large-scale networked systems such as air traffic control are complicated by the need for coordinated solutions over disparate temporal and spatial scales. Individual air traffic controllers must ensure aircraft maintain safe separation locally with a time horizon of seconds to minutes, whereas regional plans are formulated to efficiently route flows of aircraft around weather and congestion on the order of every hour. More efficient control algorithms that provide a coordinated solution are required to safely handle a larger number of aircraft in a fixed amount of airspace. Improved estimation algorithms are also needed to provide accurate aircraft state information and situational awareness for human controllers. A hierarchical framework is developed to simultaneously solve the sometimes conflicting goals of regional efficiency and local safety. Careful attention is given in defining the interactions between the layers of this hierarchy. In this way, solutions to individual air traffic problems can be targeted and implemented as needed. First, the regional traffic flow management problem is posed as an optimization problem and shown to be NP-Hard. Approximation methods based on aggregate flow models are developed to enable real-time implementation of algorithms that reduce the impact of congestion and adverse weather. Second, the local trajectory design problem is solved using a novel slot-based sector model. This model is used to analyze sector capacity under varying traffic patterns, providing a more comprehensive understanding of how increased automation

  9. Determining the Bayesian optimal sampling strategy in a hierarchical system.

    Energy Technology Data Exchange (ETDEWEB)

    Grace, Matthew D.; Ringland, James T.; Boggs, Paul T.; Pebay, Philippe Pierre

    2010-09-01

    Consider a classic hierarchy tree as a basic model of a 'system-of-systems' network, where each node represents a component system (which may itself consist of a set of sub-systems). For this general composite system, we present a technique for computing the optimal testing strategy, which is based on Bayesian decision analysis. In previous work, we developed a Bayesian approach for computing the distribution of the reliability of a system-of-systems structure that uses test data and prior information. This allows for the determination of both an estimate of the reliability and a quantification of confidence in the estimate. Improving the accuracy of the reliability estimate and increasing the corresponding confidence require the collection of additional data. However, testing all possible sub-systems may not be cost-effective, feasible, or even necessary to achieve an improvement in the reliability estimate. To address this sampling issue, we formulate a Bayesian methodology that systematically determines the optimal sampling strategy under specified constraints and costs that will maximally improve the reliability estimate of the composite system, e.g., by reducing the variance of the reliability distribution. This methodology involves calculating the 'Bayes risk of a decision rule' for each available sampling strategy, where risk quantifies the relative effect that each sampling strategy could have on the reliability estimate. A general numerical algorithm is developed and tested using an example multicomponent system. The results show that the procedure scales linearly with the number of components available for testing.

  10. A method of spherical harmonic analysis in the geosciences via hierarchical Bayesian inference

    Science.gov (United States)

    Muir, J. B.; Tkalčić, H.

    2015-11-01

    The problem of decomposing irregular data on the sphere into a set of spherical harmonics is common in many fields of geosciences where it is necessary to build a quantitative understanding of a globally varying field. For example, in global seismology, a compressional or shear wave speed that emerges from tomographic images is used to interpret current state and composition of the mantle, and in geomagnetism, secular variation of magnetic field intensity measured at the surface is studied to better understand the changes in the Earth's core. Optimization methods are widely used for spherical harmonic analysis of irregular data, but they typically do not treat the dependence of the uncertainty estimates on the imposed regularization. This can cause significant difficulties in interpretation, especially when the best-fit model requires more variables as a result of underestimating data noise. Here, with the above limitations in mind, the problem of spherical harmonic expansion of irregular data is treated within the hierarchical Bayesian framework. The hierarchical approach significantly simplifies the problem by removing the need for regularization terms and user-supplied noise estimates. The use of the corrected Akaike Information Criterion for picking the optimal maximum degree of spherical harmonic expansion and the resulting spherical harmonic analyses are first illustrated on a noisy synthetic data set. Subsequently, the method is applied to two global data sets sensitive to the Earth's inner core and lowermost mantle, consisting of PKPab-df and PcP-P differential traveltime residuals relative to a spherically symmetric Earth model. The posterior probability distributions for each spherical harmonic coefficient are calculated via Markov Chain Monte Carlo sampling; the uncertainty obtained for the coefficients thus reflects the noise present in the real data and the imperfections in the spherical harmonic expansion.
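
    A bare-bones numerical sketch of the hierarchical idea, under our own simplifications: the expansion is treated as a generic linear model y = Gm + noise whose noise level is sampled rather than fixed; in the paper, G would hold spherical-harmonic basis functions evaluated at the data locations.

```python
# Gibbs/Metropolis sketch of a linear expansion with an unknown noise level (flat prior on
# its log), the key point being that the noise estimate is inferred rather than supplied.
import numpy as np

rng = np.random.default_rng(3)
n_data, n_coef = 200, 16
G = rng.normal(size=(n_data, n_coef))                      # stand-in for a basis matrix
y = G @ rng.normal(size=n_coef) + rng.normal(0.0, 0.5, n_data)

def draw_coeffs(sigma):
    """Conditional Gaussian draw of the coefficients given sigma (unit normal prior)."""
    precision = G.T @ G / sigma**2 + np.eye(n_coef)
    cov = np.linalg.inv(precision)
    return rng.multivariate_normal(cov @ (G.T @ y) / sigma**2, cov)

def log_lik(log_sigma, m):
    resid = y - G @ m
    return -n_data * log_sigma - 0.5 * np.sum(resid**2) / np.exp(2.0 * log_sigma)

log_sigma, sigma_trace = 0.0, []
for _ in range(2000):
    m = draw_coeffs(np.exp(log_sigma))                     # Gibbs step for the coefficients
    proposal = log_sigma + rng.normal(0.0, 0.1)            # random-walk step for log-noise
    if np.log(rng.uniform()) < log_lik(proposal, m) - log_lik(log_sigma, m):
        log_sigma = proposal
    sigma_trace.append(np.exp(log_sigma))
print("posterior mean noise level ~", round(float(np.mean(sigma_trace[500:])), 3))
```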

  11. MEG Source Localization of Spatially Extended Generators of Epileptic Activity: Comparing Entropic and Hierarchical Bayesian Approaches

    Science.gov (United States)

    Chowdhury, Rasheda Arman; Lina, Jean Marc; Kobayashi, Eliane; Grova, Christophe

    2013-01-01

    Localizing the generators of epileptic activity in the brain using Electro-EncephaloGraphy (EEG) or Magneto-EncephaloGraphy (MEG) signals is of particular interest during the pre-surgical investigation of epilepsy. Epileptic discharges can be detectable from background brain activity, provided they are associated with spatially extended generators. Using realistic simulations of epileptic activity, this study evaluates the ability of distributed source localization methods to accurately estimate the location of the generators and their sensitivity to the spatial extent of such generators when using MEG data. Source localization methods based on two types of realistic models have been investigated: (i) brain activity may be modeled using cortical parcels and (ii) brain activity is assumed to be locally smooth within each parcel. A Data Driven Parcellization (DDP) method was used to segment the cortical surface into non-overlapping parcels and diffusion-based spatial priors were used to model local spatial smoothness within parcels. These models were implemented within the Maximum Entropy on the Mean (MEM) and the Hierarchical Bayesian (HB) source localization frameworks. We proposed new methods in this context and compared them with other standard ones using Monte Carlo simulations of realistic MEG data involving sources of several spatial extents and depths. Detection accuracy of each method was quantified using Receiver Operating Characteristic (ROC) analysis and localization error metrics. Our results showed that methods implemented within the MEM framework were sensitive to all spatial extents of the sources ranging from 3 cm2 to 30 cm2, whatever were the number and size of the parcels defining the model. To reach a similar level of accuracy within the HB framework, a model using parcels larger than the size of the sources should be considered. PMID:23418485

  12. MEG source localization of spatially extended generators of epileptic activity: comparing entropic and hierarchical bayesian approaches.

    Directory of Open Access Journals (Sweden)

    Rasheda Arman Chowdhury

    Full Text Available Localizing the generators of epileptic activity in the brain using Electro-EncephaloGraphy (EEG) or Magneto-EncephaloGraphy (MEG) signals is of particular interest during the pre-surgical investigation of epilepsy. Epileptic discharges can be detectable from background brain activity, provided they are associated with spatially extended generators. Using realistic simulations of epileptic activity, this study evaluates the ability of distributed source localization methods to accurately estimate the location of the generators and their sensitivity to the spatial extent of such generators when using MEG data. Source localization methods based on two types of realistic models have been investigated: (i) brain activity may be modeled using cortical parcels and (ii) brain activity is assumed to be locally smooth within each parcel. A Data Driven Parcellization (DDP) method was used to segment the cortical surface into non-overlapping parcels and diffusion-based spatial priors were used to model local spatial smoothness within parcels. These models were implemented within the Maximum Entropy on the Mean (MEM) and the Hierarchical Bayesian (HB) source localization frameworks. We proposed new methods in this context and compared them with other standard ones using Monte Carlo simulations of realistic MEG data involving sources of several spatial extents and depths. Detection accuracy of each method was quantified using Receiver Operating Characteristic (ROC) analysis and localization error metrics. Our results showed that methods implemented within the MEM framework were sensitive to all spatial extents of the sources ranging from 3 cm2 to 30 cm2, whatever were the number and size of the parcels defining the model. To reach a similar level of accuracy within the HB framework, a model using parcels larger than the size of the sources should be considered.

  13. Modeling when people quit: Bayesian censored geometric models with hierarchical and latent-mixture extensions.

    Science.gov (United States)

    Okada, Kensuke; Vandekerckhove, Joachim; Lee, Michael D

    2018-02-01

    People often interact with environments that can provide only a finite number of items as resources. Eventually a book contains no more chapters, there are no more albums available from a band, and every Pokémon has been caught. When interacting with these sorts of environments, people either actively choose to quit collecting new items, or they are forced to quit when the items are exhausted. Modeling the distribution of how many items people collect before they quit involves untangling these two possibilities. We propose that censored geometric models are a useful basic technique for modeling the quitting distribution, and show how, by implementing these models in a hierarchical and latent-mixture framework through Bayesian methods, they can be extended to capture the additional features of specific situations. We demonstrate this approach by developing and testing a series of models in two case studies involving real-world data. One case study deals with people choosing jokes from a recommender system, and the other deals with people completing items in a personality survey.
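
    A compact sketch of the censored-geometric likelihood at the core of this approach, under assumptions of ours: theta is a per-item quitting probability, counts at the resource limit are right-censored, and a flat prior yields a simple grid posterior; the counts are invented.

```python
# Censored-geometric sketch: quit after each item with probability theta; people who
# collected every available item are right-censored observations.
import numpy as np
from scipy.stats import geom

n_items = 10                                       # finite number of available items (assumed)
counts = np.array([2, 3, 1, 10, 5, 10, 4, 2])      # items collected; 10 means the person ran out
censored = counts == n_items

def log_lik(theta):
    ll = geom.logpmf(counts[~censored], theta).sum()         # quit voluntarily after k items
    ll += censored.sum() * geom.logsf(n_items - 1, theta)    # still collecting when items ran out
    return ll

theta_grid = np.linspace(0.01, 0.99, 99)
log_post = np.array([log_lik(t) for t in theta_grid])        # flat prior over theta
post = np.exp(log_post - log_post.max())
post /= post.sum()
print("posterior mean quitting probability:", round(float((theta_grid * post).sum()), 3))
```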

  14. Estimating the Term Structure With a Semiparametric Bayesian Hierarchical Model: An Application to Corporate Bonds

    Science.gov (United States)

    Cruz-Marcelo, Alejandro; Ensor, Katherine B.; Rosner, Gary L.

    2011-01-01

    The term structure of interest rates is used to price defaultable bonds and credit derivatives, as well as to infer the quality of bonds for risk management purposes. We introduce a model that jointly estimates term structures by means of a Bayesian hierarchical model with a prior probability model based on Dirichlet process mixtures. The modeling methodology borrows strength across term structures for purposes of estimation. The main advantage of our framework is its ability to produce reliable estimators at the company level even when there are only a few bonds per company. After describing the proposed model, we discuss an empirical application in which the term structure of 197 individual companies is estimated. The sample of 197 consists of 143 companies with only one or two bonds. In-sample and out-of-sample tests are used to quantify the improvement in accuracy that results from approximating the term structure of corporate bonds with estimators by company rather than by credit rating, the latter being a popular choice in the financial literature. A complete description of a Markov chain Monte Carlo (MCMC) scheme for the proposed model is available as Supplementary Material. PMID:21765566

  15. Hierarchical Terrain Classification Based on Multilayer Bayesian Network and Conditional Random Field

    Directory of Open Access Journals (Sweden)

    Chu He

    2017-01-01

    Full Text Available This paper presents a hierarchical classification approach for Synthetic Aperture Radar (SAR) images. The Conditional Random Field (CRF) and Bayesian Network (BN) are employed to incorporate prior knowledge into this approach for facilitating SAR image classification. (1) A multilayer region pyramid is constructed based on multiscale oversegmentation, and then CRF is used to model the spatial relationships among the extracted regions within each layer of the region pyramid; boundary prior knowledge is exploited and integrated into the CRF model as a strengthened constraint to improve classification performance near the boundaries. (2) A multilayer BN is applied to establish the causal connections between adjacent layers of the constructed region pyramid, where the classification probabilities of the sub-regions in the lower layer, conditioned on their parents’ regions in the upper layer, are used as adjacent links. More contextual information is taken into account in this framework, which benefits classification performance. Several experiments are conducted on real ESAR and TerraSAR data, and the results show that the proposed method achieves better classification accuracy.

  16. A hierarchical spatial framework for forest landscape planning.

    Science.gov (United States)

    Pete Bettinger; Marie Lennette; K. Norman Johnson; Thomas A. Spies

    2005-01-01

    A hierarchical spatial framework for large-scale, long-term forest landscape planning is presented along with example policy analyses for a 560,000 ha area of the Oregon Coast Range. The modeling framework suggests utilizing the detail provided by satellite imagery to track forest vegetation condition and for representation of fine-scale features, such as riparian...

  17. Bayesian hierarchical models for regional climate reconstructions of the last glacial maximum

    Science.gov (United States)

    Weitzel, Nils; Hense, Andreas; Ohlwein, Christian

    2017-04-01

    Spatio-temporal reconstructions of past climate are important for the understanding of the long term behavior of the climate system and the sensitivity to forcing changes. Unfortunately, they are subject to large uncertainties, have to deal with a complex proxy-climate structure, and a physically reasonable interpolation between the sparse proxy observations is difficult. Bayesian Hierarchical Models (BHMs) are a class of statistical models that is well suited for spatio-temporal reconstructions of past climate because they permit the inclusion of multiple sources of information (e.g. records from different proxy types, uncertain age information, output from climate simulations) and quantify uncertainties in a statistically rigorous way. BHMs in paleoclimatology typically consist of three stages which are modeled individually and are combined using Bayesian inference techniques. The data stage models the proxy-climate relation (often named transfer function), the process stage models the spatio-temporal distribution of the climate variables of interest, and the prior stage consists of prior distributions of the model parameters. For our BHMs, we translate well-known proxy-climate transfer functions for pollen to a Bayesian framework. In addition, we can include Gaussian distributed local climate information from preprocessed proxy records. The process stage combines physically reasonable spatial structures from prior distributions with proxy records which leads to a multivariate posterior probability distribution for the reconstructed climate variables. The prior distributions that constrain the possible spatial structure of the climate variables are calculated from climate simulation output. We present results from pseudoproxy tests as well as new regional reconstructions of temperatures for the last glacial maximum (LGM, ˜ 21,000 years BP). These reconstructions combine proxy data syntheses with information from climate simulations for the LGM that were
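
    The three stages described above can be summarized schematically (our notation, not taken from the abstract), with y the proxy data, C the past climate field, and theta the transfer-function and process parameters:

```latex
\[
  p(C, \theta \mid y) \;\propto\;
  \underbrace{p(y \mid C, \theta)}_{\text{data stage}}\,
  \underbrace{p(C \mid \theta)}_{\text{process stage}}\,
  \underbrace{p(\theta)}_{\text{prior stage}}
\]
```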

  18. Testing adaptive toolbox models: a Bayesian hierarchical approach

    NARCIS (Netherlands)

    Scheibehenne, B.; Rieskamp, J.; Wagenmakers, E.-J.

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often

  19. A Hierarchical Learning Control Framework for an Aerial Manipulation System

    Science.gov (United States)

    Ma, Le; Chi, yanxun; Li, Jiapeng; Li, Zhongsheng; Ding, Yalei; Liu, Lixing

    2017-07-01

    A hierarchical learning control framework for an aerial manipulation system is proposed. Firstly, the mechanical design of the aerial manipulation system is introduced and analyzed, and its kinematics and dynamics are modeled using the Newton-Euler equations. Secondly, the hierarchical learning framework for this system is presented, in which the flight platform and the manipulator are controlled by separate controllers. RBF (Radial Basis Function) neural networks are employed for parameter estimation and control. Simulations and experiments demonstrate that the proposed methods are effective.

  20. Fluorocarbon adsorption in hierarchical porous frameworks

    NARCIS (Netherlands)

    Motkuri, R.K.; Annapureddy, H.V.R.; Vijaykumar, M.; Schaef, H.T.; Martin, P.F.; McGrail, B.P.; Dang, L.X.; Krishna, R.; Thallapally, P.K.

    2014-01-01

    Metal-organic frameworks comprise an important class of solid-state materials and have potential for many emerging applications such as energy storage, separation, catalysis and biomedicine. Here we report the adsorption behaviour of a series of fluorocarbon derivatives on a set of microporous and

  1. A Unified Bayesian Inference Framework for Generalized Linear Models

    Science.gov (United States)

    Meng, Xiangming; Wu, Sheng; Zhu, Jiang

    2018-03-01

    In this letter, we present a unified Bayesian inference framework for generalized linear models (GLM) which iteratively reduces the GLM problem to a sequence of standard linear model (SLM) problems. This framework provides new perspectives on some established GLM algorithms derived from SLM ones and also suggests novel extensions for some other SLM algorithms. Specific instances elucidated under such framework are the GLM versions of approximate message passing (AMP), vector AMP (VAMP), and sparse Bayesian learning (SBL). It is proved that the resultant GLM version of AMP is equivalent to the well-known generalized approximate message passing (GAMP). Numerical results for 1-bit quantized compressed sensing (CS) demonstrate the effectiveness of this unified framework.

  2. A Hierarchical Biology Concept Framework: A Tool for Course Design

    OpenAIRE

    Khodor, Julia; Halme, Dina Gould; Walker, Graham C.

    2004-01-01

    A typical undergraduate biology curriculum covers a very large number of concepts and details. We describe the development of a Biology Concept Framework (BCF) as a possible way to organize this material to enhance teaching and learning. Our BCF is hierarchical, places details in context, nests related concepts, and articulates concepts that are inherently obvious to experts but often difficult ...

  3. Multimethod, multistate Bayesian hierarchical modeling approach for use in regional monitoring of wolves.

    Science.gov (United States)

    Jiménez, José; García, Emilio J; Llaneza, Luis; Palacios, Vicente; González, Luis Mariano; García-Domínguez, Francisco; Múñoz-Igualada, Jaime; López-Bao, José Vicente

    2016-08-01

    In many cases, the first step in large-carnivore management is to obtain objective, reliable, and cost-effective estimates of population parameters through procedures that are reproducible over time. However, monitoring predators over large areas is difficult, and the data have a high level of uncertainty. We devised a practical multimethod and multistate modeling approach based on Bayesian hierarchical site-occupancy models that combined multiple survey methods to estimate different population states for use in monitoring large predators at a regional scale. We used wolves (Canis lupus) as our model species and generated reliable estimates of the number of sites with wolf reproduction (presence of pups). We used 2 wolf data sets from Spain (Western Galicia in 2013 and Asturias in 2004) to test the approach. Based on howling surveys, the naïve estimate (i.e., the estimate based only on observations) of the number of sites with reproduction was 9 and 25 sites in Western Galicia and Asturias, respectively. Our model estimated 33.4 (SD 9.6) and 34.4 (3.9) sites with wolf reproduction, respectively. The estimated occupancy probability for sites with wolf reproduction was 0.67 (SD 0.19) and 0.76 (0.11), respectively. This approach can be used to design more cost-effective monitoring programs (i.e., to define the sampling effort needed per site). Our approach should inspire well-coordinated surveys across multiple administrative borders and populations and lead to improved decision making for management of large carnivores on a landscape level. The use of this Bayesian framework provides a simple way to visualize the degree of uncertainty around population-parameter estimates and thus provides managers and stakeholders an intuitive approach to interpreting monitoring results. Our approach can be widely applied to large spatial scales in wildlife monitoring where detection probabilities differ between population states and where several methods are being used to estimate different population
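
    The sketch below is a stripped-down, single-method, single-season occupancy likelihood with simulated data, showing only how imperfect detection enters the model; the paper's multimethod, multistate hierarchical model is considerably richer.

```python
# Basic occupancy sketch: latent site state z and detection probability p, with the
# marginal likelihood summing over "occupied but never detected" vs "truly absent".
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(5)
n_sites, n_visits = 60, 4
psi_true, p_true = 0.4, 0.6
z = rng.binomial(1, psi_true, n_sites)            # latent state: reproduction present or not
y = rng.binomial(n_visits, p_true * z)            # detections across repeat surveys

def neg_log_lik(params):
    psi, p = expit(params)                        # keep both probabilities in (0, 1)
    # Binomial coefficients are constant in the parameters and omitted.
    lik_occupied = psi * p**y * (1.0 - p)**(n_visits - y)
    lik_empty = (1.0 - psi) * (y == 0)            # never detected because truly absent
    return -np.sum(np.log(lik_occupied + lik_empty))

res = minimize(neg_log_lik, x0=np.zeros(2))
print("estimated psi, p:", np.round(expit(res.x), 2))
```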

  4. Sub-seasonal-to-seasonal Reservoir Inflow Forecast using Bayesian Hierarchical Hidden Markov Model

    Science.gov (United States)

    Mukhopadhyay, S.; Arumugam, S.

    2017-12-01

    Sub-seasonal-to-seasonal (S2S) (15-90 days) streamflow forecasting is an emerging area of research that provides seamless information for reservoir operation from weather time scales to seasonal time scales. From an operational perspective, sub-seasonal inflow forecasts are highly valuable as they enable water managers to decide short-term releases (15-30 days) while holding water for seasonal needs (e.g., irrigation and municipal supply) and meeting end-of-season target storage at a desired level. We propose a Bayesian Hierarchical Hidden Markov Model (BHHMM) to develop S2S inflow forecasts for the Tennessee Valley Area (TVA) reservoir system. Here, the hidden states are predicted by relevant indices that influence the inflows at the S2S time scale. The hidden Markov model also captures both the spatial and temporal hierarchy in predictors that operate at the S2S time scale, with model parameters estimated as posterior distributions in a Bayesian framework. We present our work in two steps, namely a single-site model and a multi-site model. For proof of concept, we consider inflows to Douglas Dam, Tennessee, in the single-site model. For the multi-site model, we consider reservoirs in the upper Tennessee Valley. Streamflow forecasts are issued and updated continuously every day at the S2S time scale. We considered precipitation forecasts obtained from the NOAA Climate Forecast System (CFSv2) GCM as predictors for developing S2S streamflow forecasts, along with relevant indices for predicting hidden states. Spatial dependence of the inflow series of the reservoirs is also preserved in the multi-site model. To circumvent the non-normality of the data, we consider the HMM in a Generalized Linear Model setting. Skill of the proposed approach is tested using split-sample validation against a traditional multi-site canonical correlation model developed using the same set of predictors. From the posterior distribution of the inflow forecasts, we also highlight different system behavior

  5. Bayesian Hierarchical Scale Mixtures of Log-Normal Models for Inference in Reliability with Stochastic Constraint

    Directory of Open Access Journals (Sweden)

    Hea-Jung Kim

    2017-06-01

    Full Text Available This paper develops Bayesian inference in reliability of a class of scale mixtures of log-normal failure time (SMLNFT) models with stochastic (or uncertain) constraint in their reliability measures. The class is comprehensive and includes existing failure time (FT) models (such as log-normal, log-Cauchy, and log-logistic FT models) as well as new models that are robust in terms of heavy-tailed FT observations. Since classical frequentist approaches to reliability analysis based on the SMLNFT model with stochastic constraint are intractable, the Bayesian method is pursued utilizing a Markov chain Monte Carlo (MCMC) sampling-based approach. This paper introduces a two-stage maximum entropy (MaxEnt) prior, which elicits the a priori uncertain constraint, and develops a Bayesian hierarchical SMLNFT model by using the prior. The paper also proposes an MCMC method for Bayesian inference in the SMLNFT model reliability and calls attention to properties of the MaxEnt prior that are useful for method development. Finally, two data sets are used to illustrate how the proposed methodology works.

  6. HDDM: Hierarchical Bayesian estimation of the Drift-Diffusion Model in Python

    Directory of Open Access Journals (Sweden)

    Thomas V Wiecki

    2013-08-01

    Full Text Available The diffusion model is a commonly used tool to infer latent psychological processes underlying decision making, and to link them to neural mechanisms based on reaction times. Although efficient open source software has been made available to quantitatively fit the model to data, current estimation methods require an abundance of reaction time measurements to recover meaningful parameters, and only provide point estimates of each parameter. In contrast, hierarchical Bayesian parameter estimation methods are useful for enhancing statistical power, allowing for simultaneous estimation of individual subject parameters and the group distribution that they are drawn from, while also providing measures of uncertainty in these parameters in the posterior distribution. Here, we present a novel Python-based toolbox called HDDM (hierarchical drift diffusion model), which allows fast and flexible estimation of the drift-diffusion model and the related linear ballistic accumulator model. HDDM requires fewer data per subject/condition than non-hierarchical methods, allows for full Bayesian data analysis, and can handle outliers in the data. Finally, HDDM supports the estimation of how trial-by-trial measurements (e.g., fMRI) influence decision-making parameters. This paper will first describe the theoretical background of the drift-diffusion model and Bayesian inference. We then illustrate usage of the toolbox on a real-world data set from our lab. Finally, parameter recovery studies show that HDDM beats alternative fitting methods like the chi-quantile method as well as maximum likelihood estimation. The software and documentation can be downloaded at: http://ski.clps.brown.edu/hddm_docs
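
    A usage sketch following the toolbox's documented interface; the CSV file and the 'condition' column are hypothetical, and the data are expected to contain the 'rt' and 'response' columns HDDM requires.

```python
# Hedged HDDM usage sketch; file name and condition column are placeholders.
import hddm

data = hddm.load_csv("experiment_data.csv")              # hypothetical data file with rt/response
model = hddm.HDDM(data, depends_on={"v": "condition"})   # let drift rate vary by condition
model.find_starting_values()                             # optional: speed up convergence
model.sample(2000, burn=500)                             # draw posterior samples via MCMC
model.print_stats()                                      # group- and subject-level estimates
```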

  7. Hierarchical Bayesian nonparametric mixture models for clustering with variable relevance determination.

    Science.gov (United States)

    Yau, Christopher; Holmes, Chris

    2011-07-01

    We propose a hierarchical Bayesian nonparametric mixture model for clustering when some of the covariates are assumed to be of varying relevance to the clustering problem. This can be thought of as an issue in variable selection for unsupervised learning. We demonstrate that by defining a hierarchical population based nonparametric prior on the cluster locations scaled by the inverse covariance matrices of the likelihood we arrive at a 'sparsity prior' representation which admits a conditionally conjugate prior. This allows us to perform full Gibbs sampling to obtain posterior distributions over parameters of interest including an explicit measure of each covariate's relevance and a distribution over the number of potential clusters present in the data. This also allows for individual cluster specific variable selection. We demonstrate improved inference on a number of canonical problems.

  8. Bayesian Hierarchical Structure for Quantifying Population Variability to Inform Probabilistic Health Risk Assessments.

    Science.gov (United States)

    Shao, Kan; Allen, Bruce C; Wheeler, Matthew W

    2017-10-01

    Human variability is a very important factor considered in human health risk assessment for protecting sensitive populations from chemical exposure. Traditionally, to account for this variability, an interhuman uncertainty factor is applied to lower the exposure limit. However, using a fixed uncertainty factor rather than probabilistically accounting for human variability can hardly support probabilistic risk assessment advocated by a number of researchers; new methods are needed to probabilistically quantify human population variability. We propose a Bayesian hierarchical model to quantify variability among different populations. This approach jointly characterizes the distribution of risk at background exposure and the sensitivity of response to exposure, which are commonly represented by model parameters. We demonstrate, through both an application to real data and a simulation study, that using the proposed hierarchical structure adequately characterizes variability across different populations. © 2016 Society for Risk Analysis.

  9. Linking bovine tuberculosis on cattle farms to white-tailed deer and environmental variables using Bayesian hierarchical analysis.

    Directory of Open Access Journals (Sweden)

    W David Walter

    Full Text Available Bovine tuberculosis is a bacterial disease caused by Mycobacterium bovis in livestock and wildlife with hosts that include Eurasian badgers (Meles meles), brushtail possum (Trichosurus vulpecula), and white-tailed deer (Odocoileus virginianus). Risk-assessment efforts in Michigan have been initiated on farms to minimize interactions of cattle with wildlife hosts, but research on M. bovis on cattle farms has not investigated the spatial context of disease epidemiology. To incorporate spatially explicit data, initial likelihoods of infection for cattle farms tested for M. bovis, the prevalence of M. bovis in white-tailed deer, deer density, and environmental variables for each farm were modeled in a Bayesian hierarchical framework. We used geo-referenced locations of 762 cattle farms that have been tested for M. bovis, white-tailed deer prevalence, and several environmental variables that may lead to long-term survival and viability of M. bovis on farms and surrounding habitats (i.e., soil type, habitat type). Bayesian hierarchical analyses identified the model including deer prevalence and the proportion of sandy soil within our sampling grid as the most supported. Analysis of cattle farms tested for M. bovis indicated that every 1% increase in sandy soil resulted in a 4% increase in the odds of infection. Our analysis revealed that the influence of prevalence of M. bovis in white-tailed deer was still a concern even after considerable efforts to prevent cattle interactions with white-tailed deer through on-farm mitigation and reduction in the deer population. Cattle farms test positive for M. bovis annually in our study area, suggesting that an environmental source, either on farms or in the surrounding landscape, may be contributing to new infections or re-infections with M. bovis. Our research provides an initial assessment of potential environmental factors that could be incorporated into additional modeling efforts as more knowledge of deer herd

  10. Improving inferences from short-term ecological studies with Bayesian hierarchical modeling: white-headed woodpeckers in managed forests.

    Science.gov (United States)

    Linden, Daniel W; Roloff, Gary J

    2015-08-01

    Pilot studies are often used to design short-term research projects and long-term ecological monitoring programs, but data are sometimes discarded when they do not match the eventual survey design. Bayesian hierarchical modeling provides a convenient framework for integrating multiple data sources while explicitly separating sample variation into observation and ecological state processes. Such an approach can better estimate state uncertainty and improve inferences from short-term studies in dynamic systems. We used a dynamic multistate occupancy model to estimate the probabilities of occurrence and nesting for white-headed woodpeckers Picoides albolarvatus in recent harvest units within managed forests of northern California, USA. Our objectives were to examine how occupancy states and state transitions were related to forest management practices, and how the probabilities changed over time. Using Gibbs variable selection, we made inferences using multiple model structures and generated model-averaged estimates. Probabilities of white-headed woodpecker occurrence and nesting were high in 2009 and 2010, and the probability that nesting persisted at a site was positively related to the snag density in harvest units. Prior-year nesting resulted in higher probabilities of subsequent occurrence and nesting. We demonstrate the benefit of forest management practices that increase the density of retained snags in harvest units for providing white-headed woodpecker nesting habitat. While including an additional year of data from our pilot study did not drastically alter management recommendations, it changed the interpretation of the mechanism behind the observed dynamics. Bayesian hierarchical modeling has the potential to maximize the utility of studies based on small sample sizes while fully accounting for measurement error and both estimation and model uncertainty, thereby improving the ability of observational data to inform conservation and management strategies.

  11. A hierarchical bayesian model to quantify uncertainty of stream water temperature forecasts.

    Directory of Open Access Journals (Sweden)

    Guillaume Bal

    Full Text Available Providing generic and cost-effective modelling approaches to reconstruct and forecast freshwater temperature using predictors such as air temperature and water discharge is a prerequisite to understanding the ecological processes underlying the impact of water temperature and of global warming on continental aquatic ecosystems. Using air temperature as a simple linear predictor of water temperature can lead to significant bias in forecasts as it does not disentangle seasonality and long-term trends in the signal. Here, we develop an alternative approach based on hierarchical Bayesian statistical time series modelling of water temperature, air temperature and water discharge using seasonal sinusoidal periodic signals and time-varying means and amplitudes. Fitting and forecasting performances of this approach are compared with those of simple linear regression between water and air temperatures using (i) an emotive simulated example and (ii) an application to three French coastal streams with contrasting bio-geographical conditions and sizes. The time series modelling approach fits the data better and does not exhibit forecasting bias in long-term trends, contrary to the linear regression. This new model also allows for more accurate forecasts of water temperature than linear regression, together with a fair assessment of the uncertainty around the forecasts. Warming of water temperature forecast by our hierarchical Bayesian model was slower and more uncertain than that expected with the classical regression approach. These new forecasts are in a form that is readily usable in further ecological analyses and will allow weighting of outcomes from different scenarios to manage climate change impacts on freshwater wildlife.
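
    An illustrative least-squares version of the sinusoidal decomposition idea with synthetic daily data; the paper's model is Bayesian with time-varying means and amplitudes, so this only shows the seasonal cycle being separated from the long-term trend.

```python
# Fit an annual sine/cosine pair plus a slow trend to synthetic daily water temperatures,
# rather than regressing directly on air temperature.
import numpy as np

rng = np.random.default_rng(2)
days = np.arange(5 * 365)
water = (12.0 + 0.001 * days                               # slow warming trend
         + 6.0 * np.sin(2 * np.pi * days / 365 - 2.0)      # annual cycle
         + rng.normal(0.0, 0.8, days.size))                # observation noise

X = np.column_stack([np.ones(days.size), days,
                     np.sin(2 * np.pi * days / 365),
                     np.cos(2 * np.pi * days / 365)])
beta, *_ = np.linalg.lstsq(X, water, rcond=None)
amplitude = np.hypot(beta[2], beta[3])
print(f"trend: {beta[1] * 365:.2f} degC/yr, seasonal amplitude: {amplitude:.2f} degC")
```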

  12. Prion Amplification and Hierarchical Bayesian Modeling Refine Detection of Prion Infection

    Science.gov (United States)

    Wyckoff, A. Christy; Galloway, Nathan; Meyerett-Reid, Crystal; Powers, Jenny; Spraker, Terry; Monello, Ryan J.; Pulford, Bruce; Wild, Margaret; Antolin, Michael; Vercauteren, Kurt; Zabel, Mark

    2015-02-01

    Prions are unique infectious agents that replicate without a genome and cause neurodegenerative diseases that include chronic wasting disease (CWD) of cervids. Immunohistochemistry (IHC) is currently considered the gold standard for diagnosis of a prion infection but may be insensitive to early or sub-clinical CWD infections that are important to understanding CWD transmission and ecology. We assessed the potential of serial protein misfolding cyclic amplification (sPMCA) to improve detection of CWD prior to the onset of clinical signs. We analyzed tissue samples from free-ranging Rocky Mountain elk (Cervus elaphus nelsoni) and used hierarchical Bayesian analysis to estimate the specificity and sensitivity of IHC and sPMCA conditional on simultaneously estimated disease states. Sensitivity estimates were higher for sPMCA (99.51%, credible interval (CI) 97.15-100%) than IHC of obex (brain stem, 76.56%, CI 57.00-91.46%) or retropharyngeal lymph node (90.06%, CI 74.13-98.70%) tissues, or both (98.99%, CI 90.01-100%). Our hierarchical Bayesian model predicts the prevalence of prion infection in this elk population to be 18.90% (CI 15.50-32.72%), compared to previous estimates of 12.90%. Our data reveal a previously unidentified sub-clinical prion-positive portion of the elk population that could represent silent carriers capable of significantly impacting CWD ecology.
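
    A schematic PyMC version of the latent-state idea, with invented paired test results and priors of our choosing: true infection status is unobserved, and the two assays' sensitivity and specificity are estimated jointly with prevalence.

```python
# Latent-class sketch: each animal has an unobserved infection state z, and each assay's
# positive probability depends on z through its sensitivity and specificity.
import numpy as np
import pymc as pm

y_ihc = np.array([1, 0, 1, 1, 0, 0, 1, 0, 0, 0, 1, 0])    # hypothetical IHC results
y_spmca = np.array([1, 0, 1, 1, 1, 0, 1, 0, 1, 0, 1, 0])  # hypothetical sPMCA results

with pm.Model() as latent_state:
    prev = pm.Beta("prev", 1.0, 1.0)                       # infection prevalence
    sens = pm.Beta("sens", 2.0, 1.0, shape=2)              # sensitivity of IHC, sPMCA
    spec = pm.Beta("spec", 2.0, 1.0, shape=2)              # specificity of IHC, sPMCA
    z = pm.Bernoulli("z", p=prev, shape=y_ihc.size)        # latent true status per animal
    pm.Bernoulli("ihc", p=z * sens[0] + (1 - z) * (1 - spec[0]), observed=y_ihc)
    pm.Bernoulli("spmca", p=z * sens[1] + (1 - z) * (1 - spec[1]), observed=y_spmca)
    idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)
```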

  13. Top-down feedback in an HMAX-like cortical model of object perception based on hierarchical Bayesian networks and belief propagation.

    Science.gov (United States)

    Dura-Bernal, Salvador; Wennekers, Thomas; Denham, Susan L

    2012-01-01

    Hierarchical generative models, such as Bayesian networks, and belief propagation have been shown to provide a theoretical framework that can account for perceptual processes, including feedforward recognition and feedback modulation. The framework explains both psychophysical and physiological experimental data and maps well onto the hierarchical distributed cortical anatomy. However, the complexity required to model cortical processes makes inference, even using approximate methods, very computationally expensive. Thus, existing object perception models based on this approach are typically limited to tree-structured networks with no loops, use small toy examples or fail to account for certain perceptual aspects such as invariance to transformations or feedback reconstruction. In this study we develop a Bayesian network with an architecture similar to that of HMAX, a biologically-inspired hierarchical model of object recognition, and use loopy belief propagation to approximate the model operations (selectivity and invariance). Crucially, the resulting Bayesian network extends the functionality of HMAX by including top-down recursive feedback. Thus, the proposed model not only achieves successful feedforward recognition invariant to noise, occlusions, and changes in position and size, but is also able to reproduce modulatory effects such as illusory contour completion and attention. Our novel and rigorous methodology covers key aspects such as learning using a layerwise greedy algorithm, combining feedback information from multiple parents and reducing the number of operations required. Overall, this work extends an established model of object recognition to include high-level feedback modulation, based on state-of-the-art probabilistic approaches. The methodology employed, consistent with evidence from the visual cortex, can be potentially generalized to build models of hierarchical perceptual organization that include top-down and bottom-up interactions, for

  14. Top-down feedback in an HMAX-like cortical model of object perception based on hierarchical Bayesian networks and belief propagation.

    Directory of Open Access Journals (Sweden)

    Salvador Dura-Bernal

    Full Text Available Hierarchical generative models, such as Bayesian networks, and belief propagation have been shown to provide a theoretical framework that can account for perceptual processes, including feedforward recognition and feedback modulation. The framework explains both psychophysical and physiological experimental data and maps well onto the hierarchical distributed cortical anatomy. However, the complexity required to model cortical processes makes inference, even using approximate methods, very computationally expensive. Thus, existing object perception models based on this approach are typically limited to tree-structured networks with no loops, use small toy examples or fail to account for certain perceptual aspects such as invariance to transformations or feedback reconstruction. In this study we develop a Bayesian network with an architecture similar to that of HMAX, a biologically-inspired hierarchical model of object recognition, and use loopy belief propagation to approximate the model operations (selectivity and invariance). Crucially, the resulting Bayesian network extends the functionality of HMAX by including top-down recursive feedback. Thus, the proposed model not only achieves successful feedforward recognition invariant to noise, occlusions, and changes in position and size, but is also able to reproduce modulatory effects such as illusory contour completion and attention. Our novel and rigorous methodology covers key aspects such as learning using a layerwise greedy algorithm, combining feedback information from multiple parents and reducing the number of operations required. Overall, this work extends an established model of object recognition to include high-level feedback modulation, based on state-of-the-art probabilistic approaches. The methodology employed, consistent with evidence from the visual cortex, can be potentially generalized to build models of hierarchical perceptual organization that include top-down and bottom

  15. Top-Down Feedback in an HMAX-Like Cortical Model of Object Perception Based on Hierarchical Bayesian Networks and Belief Propagation

    Science.gov (United States)

    Dura-Bernal, Salvador; Wennekers, Thomas; Denham, Susan L.

    2012-01-01

    Hierarchical generative models, such as Bayesian networks, and belief propagation have been shown to provide a theoretical framework that can account for perceptual processes, including feedforward recognition and feedback modulation. The framework explains both psychophysical and physiological experimental data and maps well onto the hierarchical distributed cortical anatomy. However, the complexity required to model cortical processes makes inference, even using approximate methods, very computationally expensive. Thus, existing object perception models based on this approach are typically limited to tree-structured networks with no loops, use small toy examples or fail to account for certain perceptual aspects such as invariance to transformations or feedback reconstruction. In this study we develop a Bayesian network with an architecture similar to that of HMAX, a biologically-inspired hierarchical model of object recognition, and use loopy belief propagation to approximate the model operations (selectivity and invariance). Crucially, the resulting Bayesian network extends the functionality of HMAX by including top-down recursive feedback. Thus, the proposed model not only achieves successful feedforward recognition invariant to noise, occlusions, and changes in position and size, but is also able to reproduce modulatory effects such as illusory contour completion and attention. Our novel and rigorous methodology covers key aspects such as learning using a layerwise greedy algorithm, combining feedback information from multiple parents and reducing the number of operations required. Overall, this work extends an established model of object recognition to include high-level feedback modulation, based on state-of-the-art probabilistic approaches. The methodology employed, consistent with evidence from the visual cortex, can be potentially generalized to build models of hierarchical perceptual organization that include top-down and bottom-up interactions, for

  16. Probabilistic daily ILI syndromic surveillance with a spatio-temporal Bayesian hierarchical model.

    Directory of Open Access Journals (Sweden)

    Ta-Chien Chan

    Full Text Available BACKGROUND: For daily syndromic surveillance to be effective, an efficient and sensible algorithm would be expected to detect aberrations in influenza illness, and alert public health workers prior to any impending epidemic. This detection or alert surely contains uncertainty, and thus should be evaluated with a proper probabilistic measure. However, traditional monitoring mechanisms simply provide a binary alert, failing to adequately address this uncertainty. METHODS AND FINDINGS: Based on the Bayesian posterior probability of influenza-like illness (ILI) visits, the intensity of an outbreak can be directly assessed. The numbers of daily emergency room ILI visits at five community hospitals in Taipei City during 2006-2007 were collected and fitted with a Bayesian hierarchical model containing meteorological factors such as temperature and vapor pressure, spatial interaction with conditional autoregressive structure, weekend and holiday effects, seasonality factors, and previous ILI visits. The proposed algorithm recommends an alert for action if the posterior probability is larger than 70%. External data from January to February of 2008 were retained for validation. The decision rule successfully detects the peak in the validation period. When comparing the posterior probability evaluation with the modified CUSUM method, results show that the proposed method is able to detect the signals 1-2 days prior to the rise of ILI visits. CONCLUSIONS: This Bayesian hierarchical model not only constitutes a dynamic surveillance system but also constructs a stochastic evaluation of the need to call for an alert. The monitoring mechanism provides earlier detection as well as a complementary tool for current surveillance programs.

  17. An approach based on Hierarchical Bayesian Graphical Models for measurement interpretation under uncertainty

    Science.gov (United States)

    Skataric, Maja; Bose, Sandip; Zeroug, Smaine; Tilke, Peter

    2017-02-01

    It is not uncommon in the field of non-destructive evaluation that multiple measurements encompassing a variety of modalities are available for analysis and interpretation for determining the underlying states of nature of the materials or parts being tested. Despite and sometimes due to the richness of data, significant challenges arise in the interpretation, manifested as ambiguities and inconsistencies due to various uncertain factors in the physical properties (inputs), environment, measurement device properties, human errors, and the measurement data (outputs). Most of these uncertainties cannot be described by any rigorous mathematical means, and modeling of all possibilities is usually infeasible for many real-time applications. In this work, we will discuss an approach based on Hierarchical Bayesian Graphical Models (HBGM) for the improved interpretation of complex (multi-dimensional) problems with parametric uncertainties that lack usable physical models. In this setting, the input space of the physical properties is specified through prior distributions based on domain knowledge and expertise, which are represented as Gaussian mixtures to model the various possible scenarios of interest for non-destructive testing applications. Forward models are then used offline to generate the expected distribution of the proposed measurements, which are used to train a hierarchical Bayesian network. In Bayesian analysis, all model parameters are treated as random variables, and inference of the parameters is made on the basis of the posterior distribution given the observed data. Learned parameters of the posterior distribution obtained after the training can therefore be used to build an efficient classifier for differentiating new observed data in real time on the basis of pre-trained models. We will illustrate the implementation of the HBGM approach to ultrasonic measurements used for cement evaluation of cased wells in the oil industry.
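    A minimal sketch of the offline-training / online-classification pattern described in this record, under invented assumptions: two hypothetical states of nature ("good_bond" vs "poor_bond"), Gaussian priors on two physical properties, and a toy forward model standing in for the ultrasonic physics. It illustrates the workflow only, not the authors' implementation.

        import numpy as np
        from scipy.stats import multivariate_normal

        rng = np.random.default_rng(0)

        # 1) Priors over two physical properties for two hypothetical states of
        #    nature, expressed here as single-component Gaussian "mixtures".
        priors = {
            "good_bond": {"mean": [3.0, 0.8], "cov": np.diag([0.2, 0.05]), "p": 0.5},
            "poor_bond": {"mean": [1.5, 0.3], "cov": np.diag([0.3, 0.05]), "p": 0.5},
        }

        def forward_model(props, rng):
            """Toy forward model mapping physical properties to two measurement channels."""
            x1, x2 = props
            return np.array([0.7 * x1 + 0.2 * x2, x2 ** 2]) + rng.normal(0, 0.05, 2)

        # 2) Offline: push prior samples through the forward model and summarize the
        #    predicted measurement distribution of each state with a Gaussian.
        trained = {}
        for state, spec in priors.items():
            props = rng.multivariate_normal(spec["mean"], spec["cov"], size=2000)
            meas = np.array([forward_model(p, rng) for p in props])
            trained[state] = (meas.mean(axis=0), np.cov(meas.T), spec["p"])

        # 3) Online: posterior probability of each state for a new measurement.
        def posterior(measurement):
            logp = {s: np.log(p) + multivariate_normal.logpdf(measurement, m, c)
                    for s, (m, c, p) in trained.items()}
            shift = max(logp.values())
            w = {s: np.exp(v - shift) for s, v in logp.items()}
            total = sum(w.values())
            return {s: v / total for s, v in w.items()}

        print(posterior(np.array([2.3, 0.6])))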

  18. An integrative framework for Bayesian variable selection with informative priors for identifying genes and pathways.

    Science.gov (United States)

    Peng, Bin; Zhu, Dianwen; Ander, Bradley P; Zhang, Xiaoshuai; Xue, Fuzhong; Sharp, Frank R; Yang, Xiaowei

    2013-01-01

    The discovery of genetic or genomic markers plays a central role in the development of personalized medicine. A notable challenge exists when dealing with the high dimensionality of the data sets, as thousands of genes or millions of genetic variants are collected on a relatively small number of subjects. Traditional gene-wise selection methods using univariate analyses have difficulty incorporating correlational, structural, or functional structures amongst the molecular measures. For microarray gene expression data, we first summarize solutions in dealing with 'large p, small n' problems, and then propose an integrative Bayesian variable selection (iBVS) framework for simultaneously identifying causal or marker genes and regulatory pathways. A novel partial least squares (PLS) g-prior for iBVS is developed to allow the incorporation of prior knowledge on gene-gene interactions or functional relationships. From the point of view of systems biology, iBVS enables users to directly target the joint effects of multiple genes and pathways in a hierarchical modeling diagram to predict disease status or phenotype. The estimated posterior selection probabilities offer probabilistic and biological interpretations. Both simulated data and a set of microarray data in predicting stroke status are used in validating the performance of iBVS in a Probit model with binary outcomes. iBVS offers a general framework for effective discovery of various molecular biomarkers by combining data-based statistics and knowledge-based priors. Guidelines on making posterior inferences, determining Bayesian significance levels, and improving computational efficiencies are also discussed.

  19. A Bayesian framework for cell-level protein network analysis for multivariate proteomics image data

    Science.gov (United States)

    Kovacheva, Violet N.; Sirinukunwattana, Korsuk; Rajpoot, Nasir M.

    2014-03-01

    The recent development of multivariate imaging techniques, such as the Toponome Imaging System (TIS), has facilitated the analysis of multiple co-localisation of proteins. This could hold the key to understanding complex phenomena such as protein-protein interaction in cancer. In this paper, we propose a Bayesian framework for cell level network analysis allowing the identification of several protein pairs having significantly higher co-expression levels in cancerous tissue samples when compared to normal colon tissue. It involves segmenting the DAPI-labeled image into cells and determining the cell phenotypes according to their protein-protein dependence profile. The cells are phenotyped using Gaussian Bayesian hierarchical clustering (GBHC) after feature selection is performed. The phenotypes are then analysed using Difference in Sums of Weighted cO-dependence Profiles (DiSWOP), which detects differences in the co-expression patterns of protein pairs. We demonstrate that the pairs highlighted by the proposed framework have high concordance with recent results using a different phenotyping method. This demonstrates that the results are independent of the clustering method used. In addition, the highlighted protein pairs are further analysed via protein interaction pathway databases and by considering the localization of high protein-protein dependence within individual samples. This suggests that the proposed approach could identify potentially functional protein complexes active in cancer progression and cell differentiation.

  20. Estimating effectiveness in HIV prevention trials with a Bayesian hierarchical compound Poisson frailty model

    Science.gov (United States)

    Coley, Rebecca Yates; Brown, Elizabeth R.

    2016-01-01

    Inconsistent results in recent HIV prevention trials of pre-exposure prophylactic interventions may be due to heterogeneity in risk among study participants. Intervention effectiveness is most commonly estimated with the Cox model, which compares event times between populations. When heterogeneity is present, this population-level measure underestimates intervention effectiveness for individuals who are at risk. We propose a likelihood-based Bayesian hierarchical model that estimates the individual-level effectiveness of candidate interventions by accounting for heterogeneity in risk with a compound Poisson-distributed frailty term. This model reflects the mechanisms of HIV risk and allows that some participants are not exposed to HIV and, therefore, have no risk of seroconversion during the study. We assess model performance via simulation and apply the model to data from an HIV prevention trial. PMID:26869051
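    The sketch below simulates trial data under a compound Poisson frailty of the kind described here: a Poisson number of exposure sources per participant, each contributing a Gamma component, so participants with zero sources have no risk of seroconversion. All parameter values, the proportional-hazards form and the follow-up length are illustrative assumptions, not the fitted model from the paper.

        import numpy as np

        rng = np.random.default_rng(1)

        def simulate_trial(n, mu=1.2, shape=0.5, rate=0.5,
                           baseline_hazard=0.02, effectiveness=0.6, followup=52.0):
            """Return (time, event, arm) for n participants; time unit is weeks."""
            arm = rng.integers(0, 2, n)                 # 0 = placebo, 1 = intervention
            k = rng.poisson(mu, n)                      # number of exposure sources
            gam = rng.gamma(np.maximum(shape * k, 1e-12), 1.0 / rate)
            frailty = np.where(k > 0, gam, 0.0)         # exact zero when no sources
            hazard = frailty * baseline_hazard * np.where(arm == 1, 1.0 - effectiveness, 1.0)
            scale = 1.0 / np.maximum(hazard, 1e-12)
            t = np.where(hazard > 0, rng.exponential(scale), np.inf)
            event = t <= followup
            return np.minimum(t, followup), event, arm

        time, event, arm = simulate_trial(5000)
        print("seroconversions, placebo     :", int(event[arm == 0].sum()))
        print("seroconversions, intervention:", int(event[arm == 1].sum()))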

  1. A hierarchical Bayesian approach to ecological count data: a flexible tool for ecologists.

    Directory of Open Access Journals (Sweden)

    James A Fordyce

    Full Text Available Many ecological studies use the analysis of count data to arrive at biologically meaningful inferences. Here, we introduce a hierarchical Bayesian approach to count data. This approach has the advantage over traditional approaches in that it directly estimates the parameters of interest at both the individual-level and population-level, appropriately models uncertainty, and allows for comparisons among models, including those that exceed the complexity of many traditional approaches, such as ANOVA or non-parametric analogs. As an example, we apply this method to oviposition preference data for butterflies in the genus Lycaeides. Using this method, we estimate the parameters that describe preference for each population, compare the preference hierarchies among populations, and explore various models that group populations that share the same preference hierarchy.
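    As a concrete illustration of the individual-level/population-level structure, here is a small Gibbs sampler for a hierarchical Poisson-Gamma count model on made-up data; the butterfly preference model in the paper is richer, so treat this only as the general pattern.

        import numpy as np

        rng = np.random.default_rng(42)

        # toy data: counts for three populations (e.g. eggs laid per individual)
        y = [np.array([2, 3, 1, 4, 2]),
             np.array([7, 9, 6, 8]),
             np.array([0, 1, 0, 2, 1, 0])]

        a, c, d = 1.0, 1.0, 1.0            # fixed shape and hyperprior parameters
        n_iter, n_groups = 4000, len(y)
        lam, b = np.ones(n_groups), 1.0
        samples = np.empty((n_iter, n_groups))

        for it in range(n_iter):
            # lambda_i | y, b ~ Gamma(a + sum(y_i), rate = b + n_i)
            for i, yi in enumerate(y):
                lam[i] = rng.gamma(a + yi.sum(), 1.0 / (b + len(yi)))
            # b | lambda ~ Gamma(c + a * n_groups, rate = d + sum(lambda))
            b = rng.gamma(c + a * n_groups, 1.0 / (d + lam.sum()))
            samples[it] = lam

        burn = n_iter // 2
        print("posterior mean rates per population:", samples[burn:].mean(axis=0).round(2))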

  2. Hierarchical structure of the Sicilian goats revealed by Bayesian analyses of microsatellite information.

    Science.gov (United States)

    Siwek, M; Finocchiaro, R; Curik, I; Portolano, B

    2011-02-01

    Genetic structure and relationship amongst the main goat populations in Sicily (Girgentana, Derivata di Siria, Maltese and Messinese) were analysed using information from 19 microsatellite markers genotyped on 173 individuals. A posterior Bayesian approach implemented in the program STRUCTURE revealed a hierarchical structure with two clusters at the first level (Girgentana vs. Messinese, Derivata di Siria and Maltese), explaining 4.8% of variation (AMOVA Φ(ST) estimate). Seven clusters nested within these first two clusters (further differentiations of Girgentana, Derivata di Siria and Maltese), explaining 8.5% of variation (AMOVA Φ(SC) estimate). The analyses and methods applied in this study indicate their power to detect subtle population structure. © 2010 The Authors, Animal Genetics © 2010 Stichting International Foundation for Animal Genetics.

  3. A Hierarchical Bayesian Setting for an Inverse Problem in Linear Parabolic PDEs with Noisy Boundary Conditions

    KAUST Repository

    Ruggeri, Fabrizio

    2016-05-12

    In this work we develop a Bayesian setting to infer unknown parameters in initial-boundary value problems related to linear parabolic partial differential equations. We realistically assume that the boundary data are noisy, for a given prescribed initial condition. We show how to derive the joint likelihood function for the forward problem, given some measurements of the solution field subject to Gaussian noise. Given Gaussian priors for the time-dependent Dirichlet boundary values, we analytically marginalize the joint likelihood using the linearity of the equation. Our hierarchical Bayesian approach is fully implemented in an example that involves the heat equation. In this example, the thermal diffusivity is the unknown parameter. We assume that the thermal diffusivity parameter can be modeled a priori through a lognormal random variable or by means of a space-dependent stationary lognormal random field. Synthetic data are used to test the inference. We exploit the behavior of the non-normalized log posterior distribution of the thermal diffusivity. Then, we use the Laplace method to obtain an approximated Gaussian posterior and therefore avoid costly Markov Chain Monte Carlo computations. Expected information gains and predictive posterior densities for observable quantities are numerically estimated using Laplace approximation for different experimental setups.
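    The Laplace step can be illustrated on a one-parameter toy problem: approximate the posterior of theta = log(thermal diffusivity) by a Gaussian centred at the MAP, avoiding MCMC. The forward model g below is a simple exponential-decay stand-in for the paper's heat-equation solver, and the prior and noise settings are assumptions.

        import numpy as np
        from scipy.optimize import minimize_scalar

        rng = np.random.default_rng(3)

        def g(theta, t):
            """Stand-in forward model: decay of a temperature mode with diffusivity exp(theta)."""
            return np.exp(-np.exp(theta) * t)

        t_obs = np.linspace(0.1, 2.0, 20)
        kappa_true, sigma = 0.8, 0.02
        y_obs = g(np.log(kappa_true), t_obs) + rng.normal(0, sigma, t_obs.size)

        mu0, s0 = 0.0, 1.0      # lognormal prior on kappa, i.e. Gaussian prior on theta

        def neg_log_post(theta):
            misfit = np.sum((y_obs - g(theta, t_obs)) ** 2) / (2 * sigma ** 2)
            return misfit + (theta - mu0) ** 2 / (2 * s0 ** 2)

        theta_map = minimize_scalar(neg_log_post, bounds=(-5, 5), method="bounded").x

        # numerical second derivative at the MAP -> Gaussian (Laplace) approximation
        h = 1e-4
        hess = (neg_log_post(theta_map + h) - 2 * neg_log_post(theta_map)
                + neg_log_post(theta_map - h)) / h ** 2
        sd = 1.0 / np.sqrt(hess)
        lo, hi = np.exp(theta_map - 1.96 * sd), np.exp(theta_map + 1.96 * sd)
        print(f"kappa MAP ~ {np.exp(theta_map):.3f}, approx. 95% interval [{lo:.3f}, {hi:.3f}]")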

  4. Hierarchical imputation of systematically and sporadically missing data: An approximate Bayesian approach using chained equations.

    Science.gov (United States)

    Jolani, Shahab

    2018-03-01

    In health and medical sciences, multiple imputation (MI) is now becoming popular to obtain valid inferences in the presence of missing data. However, MI of clustered data such as multicenter studies and individual participant data meta-analysis requires advanced imputation routines that preserve the hierarchical structure of data. In clustered data, a specific challenge is the presence of systematically missing data, when a variable is completely missing in some clusters, and sporadically missing data, when it is partly missing in some clusters. Unfortunately, little is known about how to perform MI when both types of missing data occur simultaneously. We develop a new class of hierarchical imputation approach based on chained equations methodology that simultaneously imputes systematically and sporadically missing data while allowing for arbitrary patterns of missingness among them. Here, we use a random effect imputation model and adopt a simplification over fully Bayesian techniques such as Gibbs sampler to directly obtain draws of parameters within each step of the chained equations. We justify through theoretical arguments and extensive simulation studies that the proposed imputation methodology has good statistical properties in terms of bias and coverage rates of parameter estimates. An illustration is given in a case study with eight individual participant datasets. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
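    Stripped of the multilevel machinery, the chained-equations idea is an iteration over incomplete variables in which each is regressed on current values of the others and missing entries are replaced by draws from the fitted predictive distribution. The sketch below is single-level stochastic regression imputation on invented data; the proposed method additionally uses random-effect imputation models and draws the imputation-model parameters, both omitted here.

        import numpy as np

        rng = np.random.default_rng(7)

        # toy data: x2 depends on x1; both are sporadically missing
        n = 300
        x1 = rng.normal(0, 1, n)
        x2 = 0.6 * x1 + rng.normal(0, 0.8, n)
        m1, m2 = rng.random(n) < 0.15, rng.random(n) < 0.25
        x1[m1], x2[m2] = np.nan, np.nan

        def draw_imputations(y, X, miss):
            """Fit y ~ X on currently complete rows, then impute missing y with added noise."""
            obs = ~miss
            A = np.column_stack([np.ones(obs.sum()), X[obs]])
            beta, *_ = np.linalg.lstsq(A, y[obs], rcond=None)
            resid_sd = np.std(y[obs] - A @ beta)
            A_mis = np.column_stack([np.ones(miss.sum()), X[miss]])
            return A_mis @ beta + rng.normal(0, resid_sd, miss.sum())

        # initialize missing entries with observed means, then iterate the chain
        x1[m1], x2[m2] = np.nanmean(x1), np.nanmean(x2)
        for _ in range(10):
            x1[m1] = draw_imputations(x1, x2[:, None], m1)
            x2[m2] = draw_imputations(x2, x1[:, None], m2)

        print("imputed means:", x1[m1].mean().round(3), x2[m2].mean().round(3))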

  5. Hierarchical Bayesian Markov switching models with application to predicting spawning success of shovelnose sturgeon

    Science.gov (United States)

    Holan, S.H.; Davis, G.M.; Wildhaber, M.L.; DeLonay, A.J.; Papoulias, D.M.

    2009-01-01

    The timing of spawning in fish is tightly linked to environmental factors; however, these factors are not very well understood for many species. Specifically, little information is available to guide recruitment efforts for endangered species such as the sturgeon. Therefore, we propose a Bayesian hierarchical model for predicting the success of spawning of the shovelnose sturgeon which uses both biological and behavioural (longitudinal) data. In particular, we use data that were produced from a tracking study that was conducted in the Lower Missouri River. The data that were produced from this study consist of biological variables associated with readiness to spawn along with longitudinal behavioural data collected by using telemetry and archival data storage tags. These high frequency data are complex both biologically and in the underlying behavioural process. To accommodate such complexity we developed a hierarchical linear regression model that uses an eigenvalue predictor, derived from the transition probability matrix of a two-state Markov switching model with generalized auto-regressive conditional heteroscedastic dynamics. Finally, to minimize the computational burden that is associated with estimation of this model, a parallel computing approach is proposed. © Journal compilation 2009 Royal Statistical Society.
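    The "eigenvalue predictor" can be made concrete with a toy two-state transition matrix: its second eigenvalue, p11 + p22 - 1, summarizes how persistent the two behavioural regimes are and is the kind of quantity fed into the regression. The probabilities below are illustrative, not estimates from the sturgeon telemetry data.

        import numpy as np

        P = np.array([[0.92, 0.08],      # P[i, j] = Pr(state j at t+1 | state i at t)
                      [0.15, 0.85]])

        eigvals = np.sort(np.linalg.eigvals(P).real)[::-1]
        second_eig = eigvals[1]          # for a 2x2 stochastic matrix this equals p11 + p22 - 1
        print("regime persistence (second eigenvalue):", round(second_eig, 3))

        # stationary distribution, for reference: left eigenvector for eigenvalue 1
        w, v = np.linalg.eig(P.T)
        pi = v[:, np.argmax(w.real)].real
        pi /= pi.sum()
        print("stationary distribution:", pi.round(3))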

  6. Uncertainty Quantification Bayesian Framework for Porous Media Flows

    Science.gov (United States)

    Demyanov, V.; Christie, M.; Erbas, D.

    2005-12-01

    Uncertainty quantification is an increasingly important aspect of many areas of applied science, where the challenge is to make reliable predictions about the performance of complex physical systems in the absence of complete or reliable data. Predicting flows of fluids through subsurface reservoirs is an example of a complex system where accuracy in prediction is needed (e.g. in the oil industry it is essential for financial reasons). Simulation of fluid flow in oil reservoirs is usually carried out using large commercially written finite difference simulators solving conservation equations describing the multi-phase flow through the porous reservoir rocks, which is a highly computationally expensive task. This work examines a Bayesian Framework for uncertainty quantification in porous media flows that uses a stochastic sampling algorithm to generate models that match observed time series data. The framework is flexible for a wide range of general physical/statistical parametric models, which are used to describe the underlying hydro-geological process in its temporal dynamics. The approach is based on exploration of the parameter space and update of the prior beliefs about what the most likely model definitions are. The optimization problem for a highly parametric physical model usually has multiple solutions, which affect the uncertainty of the resulting predictions. A stochastic search algorithm (e.g. a genetic algorithm) makes it possible to identify multiple "good enough" models in the parameter space. Furthermore, inference over the generated model ensemble via an MCMC-based algorithm evaluates the posterior probability of the generated models and quantifies the uncertainty of the predictions. Machine learning algorithms - artificial neural networks (ANNs) - are used to speed up the identification of regions in parameter space where good matches to observed data can be found. The adaptive nature of ANNs allows different ways of integrating them into the Bayesian framework: as direct time

  7. A Hierarchical Probabilistic Framework for Recognizing Learners’ Interaction Experience Trends and Emotions

    Directory of Open Access Journals (Sweden)

    Imène Jraidi

    2014-01-01

    Full Text Available We seek to model the users’ experience within an interactive learning environment. More precisely, we are interested in assessing the relationship between learners’ emotional reactions and three trends in the interaction experience, namely, flow: the optimal interaction (a perfect immersion within the task), stuck: the nonoptimal interaction (a difficulty maintaining focused attention), and off-task: the noninteraction (a dropout from the task). We propose a hierarchical probabilistic framework using a dynamic Bayesian network to model this relationship and to simultaneously recognize the probability of experiencing each trend as well as the emotional responses occurring subsequently. The framework combines three modality diagnostic variables that sense the learner’s experience including physiology, behavior, and performance, predictive variables that represent the current context and the learner’s profile, and a dynamic structure that tracks the evolution of the learner’s experience. An experimental study, with a specifically designed protocol for eliciting the targeted experiences, was conducted to validate our approach. Results revealed that multiple concurrent emotions can be associated with the experiences of flow, stuck, and off-task and that the same trend can be expressed differently from one individual to another. The evaluation of the framework showed promising results in predicting learners’ experience trends and emotional responses.

  8. Linking bovine tuberculosis on cattle farms to white-tailed deer and environmental variables using Bayesian hierarchical analysis

    Science.gov (United States)

    Walter, William D.; Smith, Rick; Vanderklok, Mike; VerCauteren, Kurt C.

    2014-01-01

    Bovine tuberculosis is a bacterial disease caused by Mycobacterium bovis in livestock and wildlife with hosts that include Eurasian badgers (Meles meles), brushtail possum (Trichosurus vulpecula), and white-tailed deer (Odocoileus virginianus). Risk-assessment efforts in Michigan have been initiated on farms to minimize interactions of cattle with wildlife hosts, but research on M. bovis on cattle farms has not investigated the spatial context of disease epidemiology. To incorporate spatially explicit data, initial likelihood of infection probabilities for cattle farms tested for M. bovis, prevalence of M. bovis in white-tailed deer, deer density, and environmental variables for each farm were modeled in a Bayesian hierarchical framework. We used geo-referenced locations of 762 cattle farms that have been tested for M. bovis, white-tailed deer prevalence, and several environmental variables that may lead to long-term survival and viability of M. bovis on farms and surrounding habitats (i.e., soil type, habitat type). Bayesian hierarchical analyses identified deer prevalence and proportion of sandy soil within our sampling grid as the most supported model. Analysis of cattle farms tested for M. bovis identified that every 1% increase in sandy soil resulted in a 4% increase in the odds of infection. Our analysis revealed that the influence of prevalence of M. bovis in white-tailed deer was still a concern even after considerable efforts to prevent cattle interactions with white-tailed deer through on-farm mitigation and reduction in the deer population. Cattle farms test positive for M. bovis annually in our study area, suggesting that an environmental source either on farms or in the surrounding landscape may be contributing to new infections or re-infections with M. bovis. Our research provides an initial assessment of potential environmental factors that could be incorporated into additional modeling efforts as more knowledge of deer herd

  9. Can the experimental study of religion be advanced using a Bayesian predictive framework?

    NARCIS (Netherlands)

    van Elk, M.; Wagenmakers, E.-J.

    2017-01-01

    We propose a Bayesian framework as an important theoretical and methodological tool to improve the scientific study of religion. At a theoretical level, the Bayesian predictive processing framework has the potential to provide a unifying account of religious beliefs and experience by stressing the

  10. Abrupt strategy change underlies gradual performance change: Bayesian hierarchical models of component and aggregate strategy use.

    Science.gov (United States)

    Wynton, Sarah K A; Anglim, Jeromy

    2017-10-01

    While researchers have often sought to understand the learning curve in terms of multiple component processes, few studies have measured and mathematically modeled these processes on a complex task. In particular, there remains a need to reconcile how abrupt changes in strategy use can co-occur with gradual changes in task completion time. Thus, the current study aimed to assess the degree to which strategy change was abrupt or gradual, and whether strategy aggregation could partially explain gradual performance change. It also aimed to show how Bayesian methods could be used to model the effect of practice on strategy use. To achieve these aims, 162 participants completed 15 blocks of practice on a complex computer-based task-the Wynton-Anglim booking (WAB) task. The task allowed for multiple component strategies (i.e., memory retrieval, information reduction, and insight) that could also be aggregated to a global measure of strategy use. Bayesian hierarchical models were used to compare abrupt and gradual functions of component and aggregate strategy use. Task completion time was well-modeled by a power function, and global strategy use explained substantial variance in performance. Change in component strategy use tended to be abrupt, whereas change in global strategy use was gradual and well-modeled by a power function. Thus, differential timing of component strategy shifts leads to gradual changes in overall strategy efficiency, and this provides one reason for why smooth learning curves can co-occur with abrupt changes in strategy use. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  11. Constraining mass anomalies in the interior of spherical bodies using Trans-dimensional Bayesian Hierarchical inference.

    Science.gov (United States)

    Izquierdo, K.; Lekic, V.; Montesi, L.

    2017-12-01

    Gravity inversions are especially important for planetary applications since measurements of the variations in gravitational acceleration are often the only constraint available to map out lateral density variations in the interiors of planets and other Solar system objects. Currently, global gravity data is available for the terrestrial planets and the Moon. Although several methods for inverting these data have been developed and applied, the non-uniqueness of global density models that fit the data has not yet been fully characterized. We make use of Bayesian inference and a Reversible Jump Markov Chain Monte Carlo (RJMCMC) approach to develop a Trans-dimensional Hierarchical Bayesian (THB) inversion algorithm that yields a large sample of models that fit a gravity field. From this group of models, we can determine the most likely value of parameters of a global density model and a measure of the non-uniqueness of each parameter when the number of anomalies describing the gravity field is not fixed a priori. We explore the use of a parallel tempering algorithm and fast multipole method to reduce the number of iterations and computing time needed. We applied this method to a synthetic gravity field of the Moon and a long wavelength synthetic model of density anomalies in the Earth's lower mantle. We obtained a good match between the given gravity field and the gravity field produced by the most likely model in each inversion. The number of anomalies of the models showed parsimony of the algorithm, the value of the noise variance of the input data was retrieved, and the non-uniqueness of the models was quantified. Our results show that the ability to constrain the latitude and longitude of density anomalies, which is excellent at shallow locations (information about the overall density distribution of celestial bodies even when there is no other geophysical data available.

  12. Parameterization of aquatic ecosystem functioning and its natural variation: Hierarchical Bayesian modelling of plankton food web dynamics

    Science.gov (United States)

    Norros, Veera; Laine, Marko; Lignell, Risto; Thingstad, Frede

    2017-10-01

    Methods for extracting empirically and theoretically sound parameter values are urgently needed in aquatic ecosystem modelling to describe key flows and their variation in the system. Here, we compare three Bayesian formulations for mechanistic model parameterization that differ in their assumptions about the variation in parameter values between various datasets: 1) global analysis - no variation, 2) separate analysis - independent variation and 3) hierarchical analysis - variation arising from a shared distribution defined by hyperparameters. We tested these methods, using computer-generated and empirical data, coupled with simplified and reasonably realistic plankton food web models, respectively. While all methods were adequate, the simulated example demonstrated that a well-designed hierarchical analysis can result in the most accurate and precise parameter estimates and predictions, due to its ability to combine information across datasets. However, our results also highlighted sensitivity to hyperparameter prior distributions as an important caveat of hierarchical analysis. In the more complex empirical example, hierarchical analysis was able to combine precise identification of parameter values with reasonably good predictive performance, although the ranking of the methods was less straightforward. We conclude that hierarchical Bayesian analysis is a promising tool for identifying key ecosystem-functioning parameters and their variation from empirical datasets.
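    A toy numerical contrast between the three formulations, assuming made-up per-dataset estimates with known standard errors and a fixed between-dataset standard deviation tau (in the paper tau is itself learned through hyperpriors): complete pooling ("global"), no pooling ("separate"), and partial pooling ("hierarchical").

        import numpy as np

        est = np.array([0.42, 0.55, 0.38, 0.61])   # per-dataset parameter estimates
        se  = np.array([0.05, 0.10, 0.04, 0.12])   # their standard errors
        tau = 0.06                                 # assumed between-dataset standard deviation

        global_est = np.average(est, weights=1 / se**2)        # complete pooling
        separate_est = est                                     # no pooling

        # partial pooling: precision-weighted compromise between dataset and population
        w = (1 / se**2) / (1 / se**2 + 1 / tau**2)
        hierarchical_est = w * est + (1 - w) * global_est

        print("global      :", round(global_est, 3))
        print("separate    :", separate_est)
        print("hierarchical:", hierarchical_est.round(3))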

  13. Epigenetic change detection and pattern recognition via Bayesian hierarchical hidden Markov models.

    Science.gov (United States)

    Wang, Xinlei; Zang, Miao; Xiao, Guanghua

    2013-06-15

    Epigenetics is the study of changes to the genome that can switch genes on or off and determine which proteins are transcribed without altering the DNA sequence. Recently, epigenetic changes have been linked to the development and progression of diseases such as psychiatric disorders. High-throughput epigenetic experiments have enabled researchers to measure genome-wide epigenetic profiles and yield data consisting of intensity ratios of immunoprecipitation versus reference samples. The intensity ratios can provide a view of genomic regions where protein binding occurs under one experimental condition and further allow us to detect epigenetic alterations through comparison between two different conditions. However, such experiments can be expensive, with only a few replicates available. Moreover, epigenetic data are often spatially correlated with high noise levels. In this paper, we develop a Bayesian hierarchical model, combined with hidden Markov processes with four states for modeling spatial dependence, to detect genomic sites with epigenetic changes from two-sample experiments with paired internal control. One attractive feature of the proposed method is that the four states of the hidden Markov process have well-defined biological meanings and allow us to directly call the change patterns based on the corresponding posterior probabilities. In contrast, none of the existing methods offers this advantage. In addition, the proposed method offers great power in statistical inference by spatial smoothing (via hidden Markov modeling) and information pooling (via hierarchical modeling). Both simulation studies and real data analysis in a cocaine addiction study illustrate the reliability and success of this method. Copyright © 2012 John Wiley & Sons, Ltd.
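    The pattern-calling step rests on posterior state probabilities of a four-state hidden Markov chain. The sketch below runs a generic scaled forward-backward pass with Gaussian emissions; the state labels, emission parameters and transition matrix are placeholders rather than the fitted values from the paper.

        import numpy as np
        from scipy.stats import norm

        states = ["no-binding", "binding-both", "gain", "loss"]
        # the two "no change" states share emissions in this toy, since both imply
        # a near-zero between-condition difference in intensity ratios
        means  = np.array([0.0, 0.0, 1.0, -1.0])
        sds    = np.array([0.3, 0.3, 0.4, 0.4])
        pi0    = np.full(4, 0.25)                      # initial state distribution
        A = np.full((4, 4), 0.04) + np.eye(4) * 0.84   # sticky transitions; rows sum to 1

        def posterior_states(y):
            T, K = len(y), len(states)
            emis = norm.pdf(y[:, None], means[None, :], sds[None, :])
            alpha, beta, c = np.zeros((T, K)), np.zeros((T, K)), np.zeros(T)
            alpha[0] = pi0 * emis[0]; c[0] = alpha[0].sum(); alpha[0] /= c[0]
            for t in range(1, T):                      # scaled forward pass
                alpha[t] = (alpha[t - 1] @ A) * emis[t]
                c[t] = alpha[t].sum(); alpha[t] /= c[t]
            beta[-1] = 1.0
            for t in range(T - 2, -1, -1):             # scaled backward pass
                beta[t] = (A @ (emis[t + 1] * beta[t + 1])) / c[t + 1]
            gamma = alpha * beta
            return gamma / gamma.sum(axis=1, keepdims=True)

        y = np.array([0.1, -0.2, 0.9, 1.1, 1.0, -0.1, -1.2, -0.9])
        for t, row in enumerate(posterior_states(y)):
            print(t, states[int(row.argmax())], row.round(2))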

  14. A sow replacement model using Bayesian updating in a three-level hierarchic Markov process. I. Biological model

    DEFF Research Database (Denmark)

    Kristensen, Anders Ringgaard; Søllested, Thomas Algot

    2004-01-01

    Several replacement models have been presented in literature. In other applicational areas like dairy cow replacement, various methodological improvements like hierarchical Markov processes and Bayesian updating have been implemented, but not in sow models. Furthermore, there are methodological improvements like multi-level hierarchical Markov processes with decisions on multiple time scales, efficient methods for parameter estimations at herd level and standard software that has been hardly implemented at all in any replacement model. The aim of this study is to present a sow replacement model...

  15. A sow replacement model using Bayesian updating in a three-level hierarchic Markov process. II. Optimization model

    DEFF Research Database (Denmark)

    Kristensen, Anders Ringgaard; Søllested, Thomas Algot

    2004-01-01

    Recent methodological improvements in replacement models comprising multi-level hierarchical Markov processes and Bayesian updating have hardly been implemented in any replacement model and the aim of this study is to present a sow replacement model that really uses these methodological improvements. The biological model of the replacement model is described in a previous paper and in this paper the optimization model is described. The model is developed as a prototype for use under practical conditions. The application of the model is demonstrated using data from two commercial Danish sow herds. It is concluded that the Bayesian updating technique and the hierarchical structure decrease the size of the state space dramatically. Since parameter estimates vary considerably among herds it is concluded that decision support concerning sow replacement only makes sense with parameters...

  16. Tools for predicting rainfall from lightning records: events identification and rain prediction using a Bayesian hierarchical model

    OpenAIRE

    Di Giuseppe, Edmondo; Lasinio, Giovanna Jona; Pasqui, Massimiliano; Esposito, Stanislao

    2015-01-01

    We propose a new statistical protocol for the estimation of precipitation using lightning data. We first identify rainy events using scan statistics, then we estimate the Rainfall Lightning Ratio (RLR) to convert the number of lightning strikes into rain volume given the storm intensity. Then we build a hierarchical Bayesian model aiming at the prediction of 15- and 30-minute cumulative precipitation at unobserved locations and times using information on lightning in the same area. More specifically, we build a...

  17. An Approach to Structure Determination and Estimation of Hierarchical Archimedean Copulas and its Application to Bayesian Classification

    Czech Academy of Sciences Publication Activity Database

    Górecki, J.; Hofert, M.; Holeňa, Martin

    2016-01-01

    Roč. 46, č. 1 (2016), s. 21-59 ISSN 0925-9902 R&D Projects: GA ČR GA13-17187S Grant - others:Slezská univerzita v Opavě(CZ) SGS/21/2014 Institutional support: RVO:67985807 Keywords : Copula * Hierarchical archimedean copula * Copula estimation * Structure determination * Kendall’s tau * Bayesian classification Subject RIV: IN - Informatics, Computer Science Impact factor: 1.294, year: 2016

  18. DUST SPECTRAL ENERGY DISTRIBUTIONS IN THE ERA OF HERSCHEL AND PLANCK: A HIERARCHICAL BAYESIAN-FITTING TECHNIQUE

    International Nuclear Information System (INIS)

    Kelly, Brandon C.; Goodman, Alyssa A.; Shetty, Rahul; Stutz, Amelia M.; Launhardt, Ralf; Kauffmann, Jens

    2012-01-01

    We present a hierarchical Bayesian method for fitting infrared spectral energy distributions (SEDs) of dust emission to observed fluxes. Under the standard assumption of optically thin single temperature (T) sources, the dust SED as represented by a power-law-modified blackbody is subject to a strong degeneracy between T and the spectral index β. The traditional non-hierarchical approaches, typically based on χ² minimization, are severely limited by this degeneracy, as it produces an artificial anti-correlation between T and β even with modest levels of observational noise. The hierarchical Bayesian method rigorously and self-consistently treats measurement uncertainties, including calibration and noise, resulting in more precise SED fits. As a result, the Bayesian fits do not produce any spurious anti-correlations between the SED parameters due to measurement uncertainty. We demonstrate that the Bayesian method is substantially more accurate than the χ² fit in recovering the SED parameters, as well as the correlations between them. As an illustration, we apply our method to Herschel and submillimeter ground-based observations of the star-forming Bok globule CB244. This source is a small, nearby molecular cloud containing a single low-mass protostar and a starless core. We find that T and β are weakly positively correlated—in contradiction with the χ² fits, which indicate a T-β anti-correlation from the same data set. Additionally, in comparison to the χ² fits the Bayesian SED parameter estimates exhibit a reduced range in values.
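    The T-β degeneracy that motivates the hierarchical treatment is easy to reproduce: evaluating a modified blackbody on a (T, β) grid against noisy synthetic fluxes yields an elongated χ² valley that couples the two parameters. Band selection, noise level and normalization below are assumptions for illustration only.

        import numpy as np
        from scipy.constants import h, c, k

        def modified_blackbody(nu, T, beta):
            """Optically thin greybody, arbitrary normalization: nu^beta * B_nu(T)."""
            bnu = 2 * h * nu**3 / c**2 / np.expm1(h * nu / (k * T))
            return nu**beta * bnu

        wavelengths = np.array([70e-6, 160e-6, 250e-6, 350e-6, 500e-6, 850e-6])  # metres
        nu = c / wavelengths
        rng = np.random.default_rng(5)
        truth = modified_blackbody(nu, T=15.0, beta=1.8)
        sigma = 0.1 * truth
        flux = truth + rng.normal(0, sigma)

        T_grid = np.linspace(10, 25, 120)
        b_grid = np.linspace(1.0, 2.8, 120)
        chi2 = np.empty((T_grid.size, b_grid.size))
        for i, T in enumerate(T_grid):
            for j, b in enumerate(b_grid):
                model = modified_blackbody(nu, T, b)
                # profile out the (linear) amplitude analytically
                amp = np.sum(model * flux / sigma**2) / np.sum(model**2 / sigma**2)
                chi2[i, j] = np.sum((flux - amp * model)**2 / sigma**2)

        i, j = np.unravel_index(chi2.argmin(), chi2.shape)
        print(f"best fit: T = {T_grid[i]:.1f} K, beta = {b_grid[j]:.2f}, chi2 = {chi2[i, j]:.2f}")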

  19. Subjective value of risky foods for individual domestic chicks: a hierarchical Bayesian model.

    Science.gov (United States)

    Kawamori, Ai; Matsushima, Toshiya

    2010-05-01

    For animals to decide which prey to attack, the gain and delay of the food item must be integrated in a value function. However, the subjective value is not obtained by expected profitability when it is accompanied by risk. To estimate the subjective value, we examined choices in a cross-shaped maze with two colored feeders in domestic chicks. When tested by a reversal in food amount or delay, chicks changed choices similarly in both conditions (experiment 1). We therefore examined risk sensitivity for amount and delay (experiment 2) by supplying one feeder with food of fixed profitability and the alternative feeder with high- or low-profitability food at equal probability. Profitability varied in amount (groups 1 and 2 at high and low variance) or in delay (group 3). To find the equilibrium, the amount (groups 1 and 2) or delay (group 3) of the food in the fixed feeder was adjusted in a total of 18 blocks. The Markov chain Monte Carlo method was applied to a hierarchical Bayesian model to estimate the subjective value. Chicks undervalued the variable feeder in group 1 and were indifferent in group 2 but overvalued the variable feeder in group 3 at a population level. Re-examination without the titration procedure (experiment 3) suggested that the subjective value was not absolute for each option. When the delay was varied, the variable option was often given a paradoxically high value depending on fixed alternative. Therefore, the basic assumption of the uniquely determined value function might be questioned.

  20. Hierarchical Bayesian modeling of ionospheric TEC disturbances as non-stationary processes

    Science.gov (United States)

    Seid, Abdu Mohammed; Berhane, Tesfahun; Roininen, Lassi; Nigussie, Melessew

    2018-03-01

    We model regular and irregular variation of ionospheric total electron content as stationary and non-stationary processes, respectively. We apply the developed method to the SCINDA GPS data set observed at Bahir Dar, Ethiopia (11.6°N, 37.4°E). We use hierarchical Bayesian inversion with Gaussian Markov random process priors, and we model the prior parameters in the hyperprior. We use Matérn priors via stochastic partial differential equations, and use scaled Inv-χ² hyperpriors for the hyperparameters. For drawing posterior estimates, we use Markov Chain Monte Carlo methods: Gibbs sampling and Metropolis-within-Gibbs for parameter and hyperparameter estimations, respectively. This allows us to quantify model parameter estimation uncertainties as well. We demonstrate the applicability of the method proposed using a synthetic test case. Finally, we apply the method to a real GPS data set, which we decompose into regular and irregular variation components. The result shows that the approach can be used as an accurate ionospheric disturbance characterization technique that quantifies the total electron content variability with corresponding error uncertainties.
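    One building block of such a Gibbs sampler is the conjugate update of a noise variance under a scaled Inv-χ² hyperprior. The snippet below draws from that full conditional given a vector of residuals; the prior settings and residuals are placeholders, not values from the TEC analysis.

        import numpy as np

        rng = np.random.default_rng(11)

        def draw_sigma2(residuals, nu0=4.0, s0_sq=0.25):
            """sigma^2 ~ Scaled-Inv-chi^2(nu0 + n, (nu0*s0^2 + sum r^2) / (nu0 + n))."""
            n = residuals.size
            nu_n = nu0 + n
            s_n_sq = (nu0 * s0_sq + np.sum(residuals**2)) / nu_n
            return nu_n * s_n_sq / rng.chisquare(nu_n)

        residuals = rng.normal(0, 0.8, size=200)       # pretend model-minus-data misfit
        draws = np.array([draw_sigma2(residuals) for _ in range(2000)])
        print("posterior mean of sigma:", round(float(np.sqrt(draws).mean()), 3))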

  1. Mapping brucellosis increases relative to elk density using hierarchical Bayesian models.

    Directory of Open Access Journals (Sweden)

    Paul C Cross

    Full Text Available The relationship between host density and parasite transmission is central to the effectiveness of many disease management strategies. Few studies, however, have empirically estimated this relationship particularly in large mammals. We applied hierarchical Bayesian methods to a 19-year dataset of over 6400 brucellosis tests of adult female elk (Cervus elaphus) in northwestern Wyoming. Management captures that occurred from January to March were over two times more likely to be seropositive than hunted elk that were killed in September to December, while accounting for site and year effects. Areas with supplemental feeding grounds for elk had higher seroprevalence in 1991 than other regions, but by 2009 many areas distant from the feeding grounds were of comparable seroprevalence. The increases in brucellosis seroprevalence were correlated with elk densities at the elk management unit, or hunt area, scale (mean 2070 km²; range = [95-10237]). The data, however, could not differentiate among linear and non-linear effects of host density. Therefore, control efforts that focus on reducing elk densities at a broad spatial scale were only weakly supported. Additional research on how a few, large groups within a region may be driving disease dynamics is needed for more targeted and effective management interventions. Brucellosis appears to be expanding its range into new regions and elk populations, which is likely to further complicate the United States brucellosis eradication program. This study is an example of how the dynamics of host populations can affect their ability to serve as disease reservoirs.

  2. Mapping brucellosis increases relative to elk density using hierarchical Bayesian models

    Science.gov (United States)

    Cross, Paul C.; Heisey, Dennis M.; Scurlock, Brandon M.; Edwards, William H.; Brennan, Angela; Ebinger, Michael R.

    2010-01-01

    The relationship between host density and parasite transmission is central to the effectiveness of many disease management strategies. Few studies, however, have empirically estimated this relationship particularly in large mammals. We applied hierarchical Bayesian methods to a 19-year dataset of over 6400 brucellosis tests of adult female elk (Cervus elaphus) in northwestern Wyoming. Management captures that occurred from January to March were over two times more likely to be seropositive than hunted elk that were killed in September to December, while accounting for site and year effects. Areas with supplemental feeding grounds for elk had higher seroprevalence in 1991 than other regions, but by 2009 many areas distant from the feeding grounds were of comparable seroprevalence. The increases in brucellosis seroprevalence were correlated with elk densities at the elk management unit, or hunt area, scale (mean 2070 km2; range = [95–10237]). The data, however, could not differentiate among linear and non-linear effects of host density. Therefore, control efforts that focus on reducing elk densities at a broad spatial scale were only weakly supported. Additional research on how a few, large groups within a region may be driving disease dynamics is needed for more targeted and effective management interventions. Brucellosis appears to be expanding its range into new regions and elk populations, which is likely to further complicate the United States brucellosis eradication program. This study is an example of how the dynamics of host populations can affect their ability to serve as disease reservoirs.

  3. A Bayesian Hierarchical Model for Glacial Dynamics Based on the Shallow Ice Approximation and its Evaluation Using Analytical Solutions

    Science.gov (United States)

    Gopalan, Giri; Hrafnkelsson, Birgir; Aðalgeirsdóttir, Guðfinna; Jarosch, Alexander H.; Pálsson, Finnur

    2018-03-01

    Bayesian hierarchical modeling can assist the study of glacial dynamics and ice flow properties. This approach will allow glaciologists to make fully probabilistic predictions for the thickness of a glacier at unobserved spatio-temporal coordinates, and it will also allow for the derivation of posterior probability distributions for key physical parameters such as ice viscosity and basal sliding. The goal of this paper is to develop a proof of concept for a Bayesian hierarchical model that uses exact analytical solutions for the shallow ice approximation (SIA) introduced by Bueler et al. (2005). A suite of test simulations utilizing these exact solutions suggests that this approach is able to adequately model numerical errors and produce useful physical parameter posterior distributions and predictions. A byproduct of the development of the Bayesian hierarchical model is the derivation of a novel finite difference method for solving the SIA partial differential equation (PDE). An additional novelty of this work is the correction of numerical errors induced through a numerical solution using a statistical model. This error correcting process models numerical errors that accumulate forward in time and spatial variation of numerical errors between the dome, interior, and margin of a glacier.

  4. An integrative framework for Bayesian variable selection with informative priors for identifying genes and pathways.

    Directory of Open Access Journals (Sweden)

    Bin Peng

    Full Text Available The discovery of genetic or genomic markers plays a central role in the development of personalized medicine. A notable challenge exists when dealing with the high dimensionality of the data sets, as thousands of genes or millions of genetic variants are collected on a relatively small number of subjects. Traditional gene-wise selection methods using univariate analyses have difficulty incorporating correlational, structural, or functional structures amongst the molecular measures. For microarray gene expression data, we first summarize solutions in dealing with 'large p, small n' problems, and then propose an integrative Bayesian variable selection (iBVS) framework for simultaneously identifying causal or marker genes and regulatory pathways. A novel partial least squares (PLS) g-prior for iBVS is developed to allow the incorporation of prior knowledge on gene-gene interactions or functional relationships. From the point of view of systems biology, iBVS enables users to directly target the joint effects of multiple genes and pathways in a hierarchical modeling diagram to predict disease status or phenotype. The estimated posterior selection probabilities offer probabilistic and biological interpretations. Both simulated data and a set of microarray data in predicting stroke status are used in validating the performance of iBVS in a Probit model with binary outcomes. iBVS offers a general framework for effective discovery of various molecular biomarkers by combining data-based statistics and knowledge-based priors. Guidelines on making posterior inferences, determining Bayesian significance levels, and improving computational efficiencies are also discussed.

  5. Kinematic and Attribute Fusion Using a Bayesian Belief Network Framework

    National Research Council Canada - National Science Library

    Krieg, Mark L

    2006-01-01

    .... However, attribute information has the potential to not only provide identity and class information, but it may also improve data association and kinematic tracking performance, Bayesian Belief...

  6. Internal cycling, not external loading, decides the nutrient limitation in eutrophic lake: A dynamic model with temporal Bayesian hierarchical inference.

    Science.gov (United States)

    Wu, Zhen; Liu, Yong; Liang, Zhongyao; Wu, Sifeng; Guo, Huaicheng

    2017-06-01

    Lake eutrophication is associated with excessive anthropogenic nutrients (mainly nitrogen (N) and phosphorus (P)) and unobserved internal nutrient cycling. Despite the advances in understanding the role of external loadings, the contribution of internal nutrient cycling is still an open question. A dynamic mass-balance model was developed to simulate and measure the contributions of internal cycling and external loading. It was based on the temporal Bayesian Hierarchical Framework (BHM), where we explored the seasonal patterns in the dynamics of nutrient cycling processes and the limitation of N and P on phytoplankton growth in hyper-eutrophic Lake Dianchi, China. The dynamic patterns of the five state variables (Chla, TP, ammonia, nitrate and organic N) were simulated based on the model. Five parameters (algae growth rate, sediment exchange rate of N and P, nitrification rate and denitrification rate) were estimated based on BHM. The model provided a good fit to observations. Our model results highlighted the role of internal cycling of N and P in Lake Dianchi. The internal cycling processes contributed more than external loading to the N and P changes in the water column. Further insights into the nutrient limitation analysis indicated that the sediment exchange of P determined the P limitation. Allowing for the contribution of denitrification to N removal, N was the more limiting nutrient most of the time; however, P was the more important nutrient for eutrophication management. For Lake Dianchi, it would not be possible to recover solely by reducing the external watershed nutrient load; the mechanisms of internal cycling should also be considered as an approach to inhibit the release of sediments and to enhance denitrification. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. TYPE Ia SUPERNOVA COLORS AND EJECTA VELOCITIES: HIERARCHICAL BAYESIAN REGRESSION WITH NON-GAUSSIAN DISTRIBUTIONS

    International Nuclear Information System (INIS)

    Mandel, Kaisey S.; Kirshner, Robert P.; Foley, Ryan J.

    2014-01-01

    We investigate the statistical dependence of the peak intrinsic colors of Type Ia supernovae (SNe Ia) on their expansion velocities at maximum light, measured from the Si II λ6355 spectral feature. We construct a new hierarchical Bayesian regression model, accounting for the random effects of intrinsic scatter, measurement error, and reddening by host galaxy dust, and implement a Gibbs sampler and deviance information criteria to estimate the correlation. The method is applied to the apparent colors from BVRI light curves and Si II velocity data for 79 nearby SNe Ia. The apparent color distributions of high-velocity (HV) and normal velocity (NV) supernovae exhibit significant discrepancies for B – V and B – R, but not other colors. Hence, they are likely due to intrinsic color differences originating in the B band, rather than dust reddening. The mean intrinsic B – V and B – R color differences between HV and NV groups are 0.06 ± 0.02 and 0.09 ± 0.02 mag, respectively. A linear model finds significant slopes of –0.021 ± 0.006 and –0.030 ± 0.009 mag (10³ km s⁻¹)⁻¹ for intrinsic B – V and B – R colors versus velocity, respectively. Because the ejecta velocity distribution is skewed toward high velocities, these effects imply non-Gaussian intrinsic color distributions with skewness up to +0.3. Accounting for the intrinsic-color-velocity correlation results in corrections to A_V extinction estimates as large as –0.12 mag for HV SNe Ia and +0.06 mag for NV events. Velocity measurements from SN Ia spectra have the potential to diminish systematic errors from the confounding of intrinsic colors and dust reddening affecting supernova distances

  8. ACES-Based Testbed and Bayesian Game-Theoretic Framework for Dynamic Airspace Configuration, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The key innovation in this effort is the development of algorithms and a framework for automated Dynamic Airspace Configuration (DAC) using a cooperative Bayesian...

  9. Hierarchical modeling of systems with similar components: A framework for adaptive monitoring and control

    International Nuclear Information System (INIS)

    Memarzadeh, Milad; Pozzi, Matteo; Kolter, J. Zico

    2016-01-01

    System management includes the selection of maintenance actions depending on the available observations: when a system is made up by components known to be similar, data collected on one is also relevant for the management of others. This is typically the case of wind farms, which are made up by similar turbines. Optimal management of wind farms is an important task due to high cost of turbines' operation and maintenance: in this context, we recently proposed a method for planning and learning at system-level, called PLUS, built upon the Partially Observable Markov Decision Process (POMDP) framework, which treats transition and emission probabilities as random variables, and is therefore suitable for including model uncertainty. PLUS models the components as independent or identical. In this paper, we extend that formulation, allowing for a weaker similarity among components. The proposed approach, called Multiple Uncertain POMDP (MU-POMDP), models the components as POMDPs, and assumes the corresponding parameters as dependent random variables. Through this framework, we can calibrate specific degradation and emission models for each component while, at the same time, process observations at system-level. We compare the performance of the proposed MU-POMDP with PLUS, and discuss its potential and computational complexity. - Highlights: • A computational framework is proposed for adaptive monitoring and control. • It adopts a scheme based on Markov Chain Monte Carlo for inference and learning. • Hierarchical Bayesian modeling is used to allow a system-level flow of information. • Results show potential of significant savings in management of wind farms.
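    The "similar but not identical" idea can be illustrated with a much simpler hierarchy than MU-POMDP: per-component fault probabilities tied together by a Beta population distribution whose hyperparameters receive a grid posterior via the Beta-Binomial marginal likelihood. The counts below are invented, and the actual framework ties full POMDP transition and emission parameters rather than a single rate.

        import numpy as np
        from scipy.special import betaln

        faults = np.array([2, 0, 5, 1, 3])        # observed faults per component
        trials = np.array([40, 35, 50, 30, 45])   # inspections per component

        m_grid = np.linspace(0.01, 0.30, 60)      # population mean fault probability
        k_grid = np.linspace(1.0, 200.0, 60)      # concentration: how similar components are

        logpost = np.zeros((m_grid.size, k_grid.size))
        for i, m in enumerate(m_grid):
            for j, kap in enumerate(k_grid):
                a, b = m * kap, (1 - m) * kap
                # Beta-Binomial marginal likelihood (up to a constant), summed over components
                logpost[i, j] = np.sum(betaln(a + faults, b + trials - faults) - betaln(a, b))

        w = np.exp(logpost - logpost.max())
        w /= w.sum()

        # per-component posterior mean fault probability, averaged over hyperparameters
        post_p = np.zeros(len(faults), dtype=float)
        for i, m in enumerate(m_grid):
            for j, kap in enumerate(k_grid):
                a, b = m * kap, (1 - m) * kap
                post_p += w[i, j] * (a + faults) / (a + b + trials)
        print("partially pooled fault probabilities:", post_p.round(3))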

  10. Airline Sustainability Modeling: A New Framework with Application of Bayesian Structural Equation Modeling

    Directory of Open Access Journals (Sweden)

    Hashem Salarzadeh Jenatabadi

    2016-11-01

    Full Text Available There are many factors which could influence the sustainability of airlines. The main purpose of this study is to introduce a framework for a financial sustainability index and model it based on structural equation modeling (SEM) with maximum likelihood and Bayesian predictors. The introduced framework includes economic performance, operational performance, cost performance, and financial performance. Based on both Bayesian SEM (Bayesian-SEM) and Classical SEM (Classical-SEM), it was found that economic performance with both operational performance and cost performance are significantly related to the financial performance index. The four mathematical indices employed are root mean square error, coefficient of determination, mean absolute error, and mean absolute percentage error to compare the efficiency of Bayesian-SEM and Classical-SEM in predicting the airline financial performance. The outputs confirmed that the framework with Bayesian prediction delivered a good fit with the data, although the framework predicted with a Classical-SEM approach did not prepare a well-fitting model. The reasons for this discrepancy between Classical and Bayesian predictions, as well as the potential advantages and caveats with the application of Bayesian approach in airline sustainability studies, are debated.

  11. Modeling the marketing strategy-performance relationship : towards an hierarchical marketing performance framework

    OpenAIRE

    Huizingh, Eelko K.R.E.; Zengerink, Evelien

    2001-01-01

    Accurate measurement of marketing performance is an important topic for both marketing academics and marketing managers. Many researchers have recognized that marketing performance measurement should go beyond financial measurement. In this paper we propose a conceptual framework that models marketing performance as a sequence of intermediate performance measures ultimately leading to financial performance. This framework, called the Hierarchical Marketing Performance (HMP) framework, starts ...

  12. A sow replacement model using Bayesian updating in a three-level hierarchic Markov process. II. Optimization model

    DEFF Research Database (Denmark)

    Kristensen, Anders Ringgaard; Søllested, Thomas Algot

    2004-01-01

    Recent methodological improvements in replacement models comprising multi-level hierarchical Markov processes and Bayesian updating have hardly been implemented in any replacement model and the aim of this study is to present a sow replacement model that really uses these methodological improvements. The biological model of the replacement model is described in a previous paper and in this paper the optimization model is described. The model is developed as a prototype for use under practical conditions. The application of the model is demonstrated using data from two commercial Danish sow herds...

  13. Spatial Intensity Duration Frequency Relationships Using Hierarchical Bayesian Analysis for Urban Areas

    Science.gov (United States)

    Rupa, Chandra; Mujumdar, Pradeep

    2016-04-01

    In urban areas, quantification of extreme precipitation is important in the design of storm water drains and other infrastructure. Intensity Duration Frequency (IDF) relationships are generally used to obtain the design return level for a given duration and return period. Due to the lack of availability of extreme precipitation data for a sufficiently large number of years, estimating the probability of extreme events is difficult. Typically, data from a single station are used to obtain the design return levels for various durations and return periods, which are used in the design of urban infrastructure for the entire city. In an urban setting, the spatial variation of precipitation can be high; the precipitation amounts and patterns often vary within short distances of less than 5 km. Therefore it is crucial to study the uncertainties in the spatial variation of return levels for various durations. In this work, the extreme precipitation is modeled spatially using Bayesian hierarchical analysis and the spatial variation of return levels is studied. The analysis is carried out with the Block Maxima approach for defining extreme precipitation, using the Generalized Extreme Value (GEV) distribution for Bangalore city, Karnataka state, India. Daily data for nineteen stations in and around Bangalore city are considered in the study. The analysis is carried out for summer maxima (March - May), monsoon maxima (June - September) and the annual maxima rainfall. In the hierarchical analysis, the statistical model is specified in three layers. The data layer models the block maxima, pooling the extreme precipitation from all the stations. In the process layer, the latent spatial process characterized by geographical and climatological covariates (lat-lon, elevation, mean temperature, etc.), which drives the extreme precipitation, is modeled, and in the prior level, the prior distributions that govern the latent process are modeled. Markov Chain Monte Carlo (MCMC) algorithm (Metropolis Hastings
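    The station-level building block of an IDF analysis is a GEV fit to block maxima followed by reading off return levels; the spatial hierarchical layer is not reproduced here and the annual maxima below are synthetic. Note that scipy's shape parameter c corresponds to -ξ in the usual GEV convention.

        from scipy.stats import genextreme

        # synthetic annual maximum daily rainfall (mm) for one station and one duration
        annual_max_mm = genextreme.rvs(c=-0.1, loc=60, scale=15, size=40, random_state=42)

        c_hat, loc_hat, scale_hat = genextreme.fit(annual_max_mm)

        for T in (2, 10, 50, 100):
            # T-year return level = quantile with non-exceedance probability 1 - 1/T
            level = genextreme.ppf(1 - 1 / T, c_hat, loc=loc_hat, scale=scale_hat)
            print(f"{T:>4}-year return level: {level:6.1f} mm")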

  14. A sow replacement model using Bayesian updating in a three-level hierarchic Markov process. I. Biological model

    DEFF Research Database (Denmark)

    Kristensen, Anders Ringgaard; Søllested, Thomas Algot

    2004-01-01

    Several replacement models have been presented in the literature. In other application areas, like dairy cow replacement, various methodological improvements like hierarchical Markov processes and Bayesian updating have been implemented, but not in sow models. Furthermore, there are methodological improvements, like multi-level hierarchical Markov processes with decisions on multiple time scales, efficient methods for parameter estimation at herd level and standard software, that have hardly been implemented at all in any replacement model. The aim of this study is to present a sow replacement model that really uses all these methodological improvements. In this paper, the biological model describing the performance and feed intake of sows is presented. In particular, estimation of herd-specific parameters is emphasized. The optimization model is described in a subsequent paper...

  15. A hierarchical method for Bayesian inference of rate parameters from shock tube data: Application to the study of the reaction of hydroxyl with 2-methylfuran

    KAUST Repository

    Kim, Daesang

    2017-06-22

    We developed a novel two-step hierarchical method for the Bayesian inference of the rate parameters of a target reaction from time-resolved concentration measurements in shock tubes. The method was applied to the calibration of the parameters of the reaction of hydroxyl with 2-methylfuran, which is studied experimentally via absorption measurements of the OH radical\\'s concentration following shock-heating. In the first step of the approach, each shock tube experiment is treated independently to infer the posterior distribution of the rate constant and error hyper-parameter that best explains the OH signal. In the second step, these posterior distributions are sampled to calibrate the parameters appearing in the Arrhenius reaction model for the rate constant. Furthermore, the second step is modified and repeated in order to explore alternative rate constant models and to assess the effect of uncertainties in the reflected shock\\'s temperature. Comparisons of the estimates obtained via the proposed methodology against the common least squares approach are presented. The relative merits of the novel Bayesian framework are highlighted, especially with respect to the opportunity to utilize the posterior distributions of the parameters in future uncertainty quantification studies.
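
    To make the two-step idea concrete, the hedged sketch below uses synthetic posterior samples and arbitrary "true" values (not the KAUST data): per-experiment posterior draws of the rate constant at several temperatures are pooled, draw by draw, into fits of the Arrhenius parameters ln A and Ea, so that the step-one uncertainty propagates into the calibrated rate model.

```python
# Minimal sketch (not the paper's code) of the second, hierarchical step:
# posterior samples of k(T_i) from each shock-tube experiment are combined
# into a distribution over the Arrhenius parameters.
import numpy as np

rng = np.random.default_rng(1)
R = 8.314                                                    # J/(mol K)
# step-1 output (assumed): posterior samples of k for each experiment temperature
temps = np.array([1100.0, 1200.0, 1300.0, 1400.0])           # K
true_lnA, true_Ea = 30.0, 120e3                              # illustrative values
k_samples = [np.exp(rng.normal(true_lnA - true_Ea / (R * T), 0.05, 2000))
             for T in temps]

# step 2: for each posterior draw across experiments, fit ln k = ln A - (Ea/R)(1/T);
# the spread of the fitted parameters carries the step-1 uncertainty forward
fits = []
for j in range(2000):
    y = np.log([ks[j] for ks in k_samples])
    X = np.column_stack([np.ones_like(temps), -1.0 / (R * temps)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)             # [ln A, Ea]
    fits.append(beta)
fits = np.array(fits)
print("ln A: %.2f +/- %.2f" % (fits[:, 0].mean(), fits[:, 0].std()))
print("Ea  : %.0f +/- %.0f J/mol" % (fits[:, 1].mean(), fits[:, 1].std()))
```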

  16. An Automated Bayesian Framework for Integrative Gene Expression Analysis and Predictive Medicine

    OpenAIRE

    Parikh, Neena; Zollanvari, Amin; Alterovitz, Gil

    2012-01-01

    Motivation: This work constructs a closed loop Bayesian Network framework for predictive medicine via integrative analysis of publicly available gene expression findings pertaining to various diseases. Results: An automated pipeline was successfully constructed. Integrative models were made based on gene expression data obtained from GEO experiments relating to four different diseases using Bayesian statistical methods. Many of these models demonstrated a high level of accuracy and predictive...

  17. A classification framework for content-based extraction of biomedical objects from hierarchically decomposed images

    Science.gov (United States)

    Thies, Christian; Schmidt Borreda, Marcel; Seidl, Thomas; Lehmann, Thomas M.

    2006-03-01

    Multiscale analysis provides a complete hierarchical partitioning of images into visually plausible regions. Each of them is formally characterized by a feature vector describing shape, texture and scale properties. Consequently, object extraction becomes a classification of the feature vectors. Classifiers are trained by relevant and irrelevant regions labeled as object and remaining partitions, respectively. A trained classifier is applicable to yet uncategorized partitionings to identify the classes of the corresponding regions. Such an approach enables retrieval of a priori unknown objects within a point-and-click interface. In this work, the classification pipeline consists of a framework for data selection, feature selection, classifier training, classification of testing data, and evaluation. According to the no-free-lunch theorem of supervised learning, the appropriate classification pipeline is determined experimentally. Therefore, each of the steps is varied by state-of-the-art methods and the respective classification quality is measured. Selection of training data from the ground truth is supported by bootstrapping, variance pooling, virtual training data, and cross validation. Feature selection for dimension reduction is performed by linear discriminant analysis, principal component analysis, and greedy selection. Competing classifiers are k-nearest-neighbor, the Bayesian classifier, and the support vector machine. Quality is measured by precision and recall to reflect the retrieval task. A set of 105 hand radiographs from clinical routine serves as ground truth, where the metacarpal bones have been labeled manually. In total, 368 out of 39,017 regions are identified as relevant. Initial experiments with feature selection and the support vector machine obtained recall, precision and F-measure of 0.58, 0.67, and 0.62, respectively.
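
    A minimal sketch of this kind of pipeline, on synthetic data rather than the radiograph features, might compare the three classifiers named above after scaling and PCA-based dimension reduction, scoring each by precision, recall and F-measure on a heavily imbalanced two-class problem (all sizes and parameters below are assumptions):

```python
# Minimal sketch (synthetic data): compare k-NN, a Bayesian (naive Bayes)
# classifier and an SVM on an imbalanced retrieval-style classification task.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.metrics import precision_score, recall_score, f1_score

X, y = make_classification(n_samples=4000, n_features=38, weights=[0.99, 0.01],
                           random_state=0)           # ~1% "relevant" regions
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for name, clf in [("kNN", KNeighborsClassifier(5)),
                  ("NaiveBayes", GaussianNB()),
                  ("SVM", SVC(class_weight="balanced"))]:
    # dimension reduction followed by the classifier under test
    pipe = make_pipeline(StandardScaler(), PCA(n_components=10), clf)
    pipe.fit(X_tr, y_tr)
    y_hat = pipe.predict(X_te)
    print(name,
          "precision", round(precision_score(y_te, y_hat, zero_division=0), 3),
          "recall", round(recall_score(y_te, y_hat), 3),
          "F1", round(f1_score(y_te, y_hat), 3))
```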

  18. Why less can be more: A Bayesian framework for heuristics

    OpenAIRE

    Parpart, Paula

    2017-01-01

    When making decisions under uncertainty, one common view is that people rely on simple heuristics that deliberately ignore information. One of the greatest puzzles in cognitive science concerns why heuristics can sometimes outperform full-information models, such as linear regression, which make full use of the available information. In this thesis, I will contribute the novel idea that heuristics can be thought of as embodying extreme Bayesian priors. Thereby, an explanation for less-is-more...

  19. FUZZY CLUSTERING BASED BAYESIAN FRAMEWORK TO PREDICT MENTAL HEALTH PROBLEMS AMONG CHILDREN

    Directory of Open Access Journals (Sweden)

    M R Sumathi

    2017-04-01

    Full Text Available According to the World Health Organization, 10-20% of children and adolescents all over the world are experiencing mental disorders. Correct diagnosis of mental disorders at an early stage improves the quality of life of children and avoids complicated problems. Various expert systems using artificial intelligence techniques have been developed for diagnosing mental disorders like Schizophrenia, Depression, Dementia, etc. This study focuses on predicting basic mental health problems of children, like Attention problem, Anxiety problem, Developmental delay, Attention Deficit Hyperactivity Disorder (ADHD), Pervasive Developmental Disorder (PDD), etc., using the machine learning techniques Bayesian Networks and Fuzzy clustering. The focus of the article is on learning the Bayesian network structure using a novel Fuzzy Clustering Based Bayesian network structure learning framework. The performance of the proposed framework was compared with the other existing algorithms and the experimental results have shown that the proposed framework performs better than the earlier algorithms.

  20. Hierarchical Scheduling Framework Based on Compositional Analysis Using Uppaal

    DEFF Research Database (Denmark)

    Boudjadar, Jalil; David, Alexandre; Kim, Jin Hyun

    2014-01-01

    , which are some of the inputs for the parameterized timed automata that make up the framework. Components may have different scheduling policies, and each component is analyzed independently using Uppaal. We have applied our framework for the schedulability analysis of an avionics system....

  1. A Hierarchical Multivariate Bayesian Approach to Ensemble Model output Statistics in Atmospheric Prediction

    Science.gov (United States)

    2017-09-01

    … represented by the dispersion of the discrete forecast estimates (black curves). … The computational intractability of Epstein’s complete … that scales well with complicated systems, the posterior densities are often analytically intractable (G13). To this end, MCMC methods provide a … participation in this conflict. Nevertheless, the advantages of intuitive, common-sense Bayesian statistical conclusions detailed by Casella (2008), Gelman …

  2. Bayesian hierarchical modelling of continuous non‐negative longitudinal data with a spike at zero: An application to a study of birds visiting gardens in winter

    Science.gov (United States)

    Buckland, Stephen T.; King, Ruth; Toms, Mike P.

    2015-01-01

    The development of methods for dealing with continuous data with a spike at zero has lagged behind those for overdispersed or zero‐inflated count data. We consider longitudinal ecological data corresponding to an annual average of 26 weekly maximum counts of birds, and are hence effectively continuous, bounded below by zero but also with a discrete mass at zero. We develop a Bayesian hierarchical Tweedie regression model that can directly accommodate the excess number of zeros common to this type of data, whilst accounting for both spatial and temporal correlation. Implementation of the model is conducted in a Markov chain Monte Carlo (MCMC) framework, using reversible jump MCMC to explore uncertainty across both parameter and model spaces. This regression modelling framework is very flexible and removes the need to make strong assumptions about mean‐variance relationships a priori. It can also directly account for the spike at zero, whilst being easily applicable to other types of data and other model formulations. Whilst a correlative study such as this cannot prove causation, our results suggest that an increase in an avian predator may have led to an overall decrease in the number of one of its prey species visiting garden feeding stations in the United Kingdom. This may reflect a change in behaviour of house sparrows to avoid feeding stations frequented by sparrowhawks, or a reduction in house sparrow population size as a result of sparrowhawk increase. PMID:25737026
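
    For readers unfamiliar with the Tweedie family, a minimal, non-Bayesian and non-spatial sketch is given below: synthetic compound Poisson-gamma data with exact zeros are fitted with a Tweedie GLM in statsmodels, assuming a variance power between 1 and 2. The hierarchical model above builds the same mean-variance family into a fully Bayesian spatio-temporal analysis; this sketch only illustrates why the Tweedie response can place positive mass at exactly zero.

```python
# Minimal sketch (assumed synthetic data and var_power): a Tweedie GLM fitted
# by maximum likelihood; with 1 < var_power < 2 the response is continuous,
# non-negative and has a point mass at zero.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 500
x = rng.uniform(-1, 1, n)
mu = np.exp(0.5 + 1.2 * x)
# compound Poisson-gamma draws: the response is exactly zero when the Poisson count is zero
counts = rng.poisson(mu)
y = np.array([rng.gamma(shape=c, scale=1.0) if c > 0 else 0.0 for c in counts])

X = sm.add_constant(x)
model = sm.GLM(y, X, family=sm.families.Tweedie(var_power=1.5))
res = model.fit()
print("coefficients:", res.params)
print("share of exact zeros:", np.mean(y == 0))
```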

  3. Group Tracking of Space Objects within Bayesian Framework

    Directory of Open Access Journals (Sweden)

    Huang Jian

    2013-03-01

    Full Text Available It is imperative to efficiently track and catalogue the extensive, dense groups of space objects for space surveillance. As the main instrument for Low Earth Orbit (LEO) space surveillance, a ground-based radar system is usually limited by its resolving power when tracking small space debris with a high population density, so much of the target detection and observation information is missed, which makes traditional tracking methods inefficient. Therefore, we conceived the concept of group tracking: the overall motion tendency of the group of objects is the particular focus, while each individual object is still tracked in effect. The tracking procedure is based on the Bayesian framework. By exploiting the constraints between the group center and the observations of the multiple targets, the accuracy and robustness of reconstructing the number of targets and estimating the individual trajectories can be greatly improved even in the case of a high missed-detection rate. The Markov Chain Monte Carlo Particle (MCMC-Particle) algorithm is utilized for solving the Bayesian integral problem. Finally, a simulation of group space object tracking is carried out to validate the efficiency of the proposed method.
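
    The Bayesian recursion underlying such trackers can be illustrated with a plain bootstrap particle filter. The sketch below is only illustrative (a single 1-D constant-velocity target with made-up noise levels), not the MCMC-Particle group tracker of the paper:

```python
# Minimal sketch: bootstrap particle filter for a 1-D constant-velocity target
# observed through noisy position measurements.
import numpy as np

rng = np.random.default_rng(2)
T, N = 50, 2000
true_x = np.zeros((T, 2)); true_x[0] = [0.0, 1.0]               # [position, velocity]
F = np.array([[1.0, 1.0], [0.0, 1.0]])
for t in range(1, T):
    true_x[t] = F @ true_x[t - 1] + rng.normal(0, [0.05, 0.02])
z = true_x[:, 0] + rng.normal(0, 0.5, T)                        # noisy position observations

particles = rng.normal([0.0, 1.0], [1.0, 0.5], size=(N, 2))
est = []
for t in range(T):
    particles = particles @ F.T + rng.normal(0, [0.05, 0.02], size=(N, 2))  # predict
    w = np.exp(-0.5 * ((z[t] - particles[:, 0]) / 0.5) ** 2)                # weight by likelihood
    w /= w.sum()
    est.append(w @ particles)                                              # posterior mean estimate
    idx = rng.choice(N, N, p=w)                                            # resample
    particles = particles[idx]
print("final position error:", abs(est[-1][0] - true_x[-1, 0]))
```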

  4. Bayesian Option Pricing Framework with Stochastic Volatility for FX Data

    Directory of Open Access Journals (Sweden)

    Ying Wang

    2016-12-01

    Full Text Available The application of stochastic volatility (SV models in the option pricing literature usually assumes that the market has sufficient option data to calibrate the model’s risk-neutral parameters. When option data are insufficient or unavailable, market practitioners must estimate the model from the historical returns of the underlying asset and then transform the resulting model into its risk-neutral equivalent. However, the likelihood function of an SV model can only be expressed in a high-dimensional integration, which makes the estimation a highly challenging task. The Bayesian approach has been the classical way to estimate SV models under the data-generating (physical probability measure, but the transformation from the estimated physical dynamic into its risk-neutral counterpart has not been addressed. Inspired by the generalized autoregressive conditional heteroskedasticity (GARCH option pricing approach by Duan in 1995, we propose an SV model that enables us to simultaneously and conveniently perform Bayesian inference and transformation into risk-neutral dynamics. Our model relaxes the normality assumption on innovations of both return and volatility processes, and our empirical study shows that the estimated option prices generate realistic implied volatility smile shapes. In addition, the volatility premium is almost flat across strike prices, so adding a few option data to the historical time series of the underlying asset can greatly improve the estimation of option prices.

  5. Hierarchical Bayesian Spatio-Temporal Analysis of Climatic and Socio-Economic Determinants of Rocky Mountain Spotted Fever.

    Directory of Open Access Journals (Sweden)

    Ram K Raghavan

    Full Text Available This study aims to examine the spatio-temporal dynamics of Rocky Mountain spotted fever (RMSF) prevalence in four contiguous states of the Midwestern United States, and to determine the impact of environmental and socio-economic factors associated with this disease. Bayesian hierarchical models were used to quantify space-only and time-only trends and the spatio-temporal interaction effect in the case reports submitted to the state health departments in the region. Various socio-economic, environmental and climatic covariates screened a priori in a bivariate procedure were added to a main-effects Bayesian model in progressive steps to evaluate important drivers of RMSF space-time patterns in the region. Our results show a steady increase in RMSF incidence over the study period and a spread to newer geographic areas, and the posterior probabilities of county-specific trends indicate clustering of high-risk counties in the central and southern parts of the study region. At the spatial scale of a county, the prevalence levels of RMSF are influenced by poverty status, average relative humidity, and average land surface temperature (>35°C) in the region, and the relevance of these factors in the context of climate-change impacts on tick-borne diseases is discussed.

  6. Multilevel Bayesian networks for the analysis of hierarchical health care data

    NARCIS (Netherlands)

    Lappenschaar, M.; Hommersom, A.; Lucas, P.J.; Lagro, J.; Visscher, S.

    2013-01-01

    OBJECTIVE: Large health care datasets normally have a hierarchical structure, in terms of levels, as the data have been obtained from different practices, hospitals, or regions. Multilevel regression is the technique commonly used to deal with such multilevel data. However, for the statistical

  7. Bayesian Integration of Large Scale SNA Data Frameworks with an Application to Guatemala

    NARCIS (Netherlands)

    Van Tongeren, J.W.; Magnus, J.R.

    2011-01-01

    We present a Bayesian estimation method applied to an extended set of national accounts data and estimates of approximately 2500 variables. The method is based on conventional national accounts frameworks as compiled by countries in Central America, in particular Guatemala, and on concepts that are

  8. Hierarchical Bayesian Data Analysis in Radiometric SAR System Calibration: A Case Study on Transponder Calibration with RADARSAT-2 Data

    Directory of Open Access Journals (Sweden)

    Björn J. Döring

    2013-12-01

    Full Text Available A synthetic aperture radar (SAR system requires external absolute calibration so that radiometric measurements can be exploited in numerous scientific and commercial applications. Besides estimating a calibration factor, metrological standards also demand the derivation of a respective calibration uncertainty. This uncertainty is currently not systematically determined. Here for the first time it is proposed to use hierarchical modeling and Bayesian statistics as a consistent method for handling and analyzing the hierarchical data typically acquired during external calibration campaigns. Through the use of Markov chain Monte Carlo simulations, a joint posterior probability can be conveniently derived from measurement data despite the necessary grouping of data samples. The applicability of the method is demonstrated through a case study: The radar reflectivity of DLR’s new C-band Kalibri transponder is derived through a series of RADARSAT-2 acquisitions and a comparison with reference point targets (corner reflectors. The systematic derivation of calibration uncertainties is seen as an important step toward traceable radiometric calibration of synthetic aperture radars.

  9. Application of Bayesian networks in a hierarchical structure for environmental risk assessment: a case study of the Gabric Dam, Iran.

    Science.gov (United States)

    Malekmohammadi, Bahram; Tayebzadeh Moghadam, Negar

    2018-04-13

    Environmental risk assessment (ERA) is a commonly used, effective tool applied to reduce adverse effects of environmental risk factors. In this study, ERA was investigated using the Bayesian network (BN) model based on a hierarchical structure of variables in an influence diagram (ID). The ID facilitated ranking of the different alternatives under uncertainty, which was then used to compare the different risk factors. BN was used to present a new model for ERA applicable to complicated development projects such as dam construction. The methodology was applied to the Gabric Dam, in southern Iran. The main environmental risk factors in the region, presented by the Gabric Dam, were identified based on the Delphi technique and specific features of the study area. These included the following: flood, water pollution, earthquake, changes in land use, erosion and sedimentation, effects on the population, and ecosensitivity. These risk factors were then categorized based on results from the output decision node of the BN, including expected utility values for risk factors in the decision node. ERA was performed for the Gabric Dam using the analytical hierarchy process (AHP) method to compare results of BN modeling with those of conventional methods. Results showed that a BN-based hierarchical structure for ERA yields acceptable and reasonable risk prioritization, proposes suitable solutions to reduce environmental risks, and can be used as a powerful decision support system for evaluating environmental risks.

  10. Probabilistic risk assessment framework for structural systems under multiple hazards using Bayesian statistics

    Energy Technology Data Exchange (ETDEWEB)

    Kwag, Shinyoung [North Carolina State University, Raleigh, NC 27695 (United States); Korea Atomic Energy Research Institute, Daejeon 305-353 (Korea, Republic of); Gupta, Abhinav, E-mail: agupta1@ncsu.edu [North Carolina State University, Raleigh, NC 27695 (United States)

    2017-04-15

    Highlights: • This study presents the development of a Bayesian framework for probabilistic risk assessment (PRA) of structural systems under multiple hazards. • The concepts of Bayesian network and Bayesian inference are combined by mapping the traditionally used fault trees into a Bayesian network. • The proposed mapping allows for consideration of dependencies as well as correlations between events. • Incorporation of Bayesian inference permits a novel way for exploration of a scenario that is likely to result in a system level “vulnerability.” - Abstract: Conventional probabilistic risk assessment (PRA) methodologies (USNRC, 1983; IAEA, 1992; EPRI, 1994; Ellingwood, 2001) conduct risk assessment for different external hazards by considering each hazard separately and independently of the others. The risk metric for a specific hazard is evaluated by a convolution of the fragility and the hazard curves. The fragility curve for a basic event is obtained by using empirical, experimental, and/or numerical simulation data for a particular hazard. Treating each hazard as independent can be inappropriate in some cases, as certain hazards are statistically correlated or dependent. Examples of such correlated events include but are not limited to flooding-induced fire, seismically induced internal or external flooding, or even seismically induced fire. In the current practice, system level risk and consequence sequences are typically calculated using logic trees to express the causative relationship between events. In this paper, we present the results from a study on multi-hazard risk assessment that is conducted using a Bayesian network (BN) with Bayesian inference. The framework can consider statistical dependencies among risks from multiple hazards, allows updating by considering newly available data/information at any level, and provides a novel way to explore alternative failure scenarios that may exist due to vulnerabilities.
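
    The effect of mapping a fault tree into a Bayesian network so that hazard dependencies are respected can be shown with a deliberately tiny example. The numbers, component names and the seismically-induced-flooding coupling below are all hypothetical; the sketch simply enumerates the joint distribution of a two-component OR gate and contrasts it with the conventional independence assumption:

```python
# Minimal sketch (hypothetical probabilities): a two-hazard fault tree expressed
# as a small Bayesian network and evaluated by brute-force enumeration.
import itertools

p_eq = 1e-3                                       # P(earthquake) per year (assumed)
def p_flood(eq):  return 0.05 if eq else 1e-3     # flood probability depends on the quake
def p_failA(eq):  return 0.10 if eq else 1e-4     # component A fragility vs. earthquake
def p_failB(fl):  return 0.20 if fl else 1e-4     # component B fragility vs. flood

p_system = 0.0
for eq, fl, a, b in itertools.product([0, 1], repeat=4):
    p = (p_eq if eq else 1 - p_eq)
    p *= p_flood(eq) if fl else 1 - p_flood(eq)
    p *= p_failA(eq) if a else 1 - p_failA(eq)
    p *= p_failB(fl) if b else 1 - p_failB(fl)
    if a or b:                                    # OR gate: system fails if A or B fails
        p_system += p
print("annual system failure probability (with dependency):", p_system)

# for comparison, ignoring the earthquake-flood dependency (hazards treated separately):
pA = p_eq * 0.10 + (1 - p_eq) * 1e-4
pB = 1e-3 * 0.20 + (1 - 1e-3) * 1e-4              # flood probability without the quake coupling
print("independence assumption:", 1 - (1 - pA) * (1 - pB))
```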

  11. Probabilistic risk assessment framework for structural systems under multiple hazards using Bayesian statistics

    International Nuclear Information System (INIS)

    Kwag, Shinyoung; Gupta, Abhinav

    2017-01-01

    Highlights: • This study presents the development of a Bayesian framework for probabilistic risk assessment (PRA) of structural systems under multiple hazards. • The concepts of Bayesian network and Bayesian inference are combined by mapping the traditionally used fault trees into a Bayesian network. • The proposed mapping allows for consideration of dependencies as well as correlations between events. • Incorporation of Bayesian inference permits a novel way for exploration of a scenario that is likely to result in a system level “vulnerability.” - Abstract: Conventional probabilistic risk assessment (PRA) methodologies (USNRC, 1983; IAEA, 1992; EPRI, 1994; Ellingwood, 2001) conduct risk assessment for different external hazards by considering each hazard separately and independently of the others. The risk metric for a specific hazard is evaluated by a convolution of the fragility and the hazard curves. The fragility curve for a basic event is obtained by using empirical, experimental, and/or numerical simulation data for a particular hazard. Treating each hazard as independent can be inappropriate in some cases, as certain hazards are statistically correlated or dependent. Examples of such correlated events include but are not limited to flooding-induced fire, seismically induced internal or external flooding, or even seismically induced fire. In the current practice, system level risk and consequence sequences are typically calculated using logic trees to express the causative relationship between events. In this paper, we present the results from a study on multi-hazard risk assessment that is conducted using a Bayesian network (BN) with Bayesian inference. The framework can consider statistical dependencies among risks from multiple hazards, allows updating by considering newly available data/information at any level, and provides a novel way to explore alternative failure scenarios that may exist due to vulnerabilities.

  12. Optimizing Battery Life for Electric UAVs using a Bayesian Framework

    Data.gov (United States)

    National Aeronautics and Space Administration — In summary, this paper lays out a simple flight plan optimization strategy based on the particle filtering framework described in [5]. This is meant as a first step in...

  13. Discovery of temporal association rules with hierarchical granular framework

    Directory of Open Access Journals (Sweden)

    Tzung-Pei Hong

    2016-07-01

    Full Text Available Most of the existing studies in temporal data mining consider only the lifespan of items to find general temporal association rules. However, an item that is infrequent over the entire time span may be frequent within part of it. We thus organize time into granules and consider temporal data mining at different levels of granules. Besides, an item may not be available from the very beginning of a store's operation. In this paper, we use the first transaction including an item as the start point for that item; before the start point, the item could not yet be bought. A three-phase mining framework that takes this item lifespan definition into consideration is designed. Finally, experiments were conducted to demonstrate the performance of the proposed framework.

  14. Estimating temporal trend in the presence of spatial complexity: a Bayesian hierarchical model for a wetland plant population undergoing restoration.

    Directory of Open Access Journals (Sweden)

    Thomas J Rodhouse

    Full Text Available Monitoring programs that evaluate restoration and inform adaptive management are important for addressing environmental degradation. These efforts may be well served by spatially explicit hierarchical approaches to modeling because of unavoidable spatial structure inherited from past land use patterns and other factors. We developed Bayesian hierarchical models to estimate trends from annual density counts observed in a spatially structured wetland forb (Camassia quamash [camas]) population following the cessation of grazing and mowing on the study area, and in a separate reference population of camas. The restoration site was bisected by roads and drainage ditches, resulting in distinct subpopulations ("zones") with different land use histories. We modeled this spatial structure by fitting zone-specific intercepts and slopes. We allowed spatial covariance parameters in the model to vary by zone, as in stratified kriging, accommodating anisotropy and improving computation and biological interpretation. Trend estimates provided evidence of a positive effect of passive restoration, and the strength of evidence was influenced by the amount of spatial structure in the model. Allowing trends to vary among zones and accounting for topographic heterogeneity increased precision of trend estimates. Accounting for spatial autocorrelation shifted parameter coefficients in ways that varied among zones depending on strength of statistical shrinkage, autocorrelation and topographic heterogeneity--a phenomenon not widely described. Spatially explicit estimates of trend from hierarchical models will generally be more useful to land managers than pooled regional estimates and provide more realistic assessments of uncertainty. The ability to grapple with historical contingency is an appealing benefit of this approach.

  15. Global, regional, and subregional classification of abortions by safety, 2010-14: estimates from a Bayesian hierarchical model.

    Science.gov (United States)

    Ganatra, Bela; Gerdts, Caitlin; Rossier, Clémentine; Johnson, Brooke Ronald; Tunçalp, Özge; Assifi, Anisa; Sedgh, Gilda; Singh, Susheela; Bankole, Akinrinola; Popinchalk, Anna; Bearak, Jonathan; Kang, Zhenning; Alkema, Leontine

    2017-11-25

    Global estimates of unsafe abortions have been produced for 1995, 2003, and 2008. However, reconceptualisation of the framework and methods for estimating abortion safety is needed owing to the increased availability of simple methods for safe abortion (eg, medical abortion), the increasingly widespread use of misoprostol outside formal health systems in contexts where abortion is legally restricted, and the need to account for the multiple factors that affect abortion safety. We used all available empirical data on abortion methods, providers, and settings, and factors affecting safety as covariates within a Bayesian hierarchical model to estimate the global, regional, and subregional distributions of abortion by safety categories. We used a three-tiered categorisation based on the WHO definition of unsafe abortion and WHO guidelines on safe abortion to categorise abortions as safe or unsafe and to further divide unsafe abortions into two categories of less safe and least safe. Of the 55·7 million abortions that occurred worldwide each year between 2010 and 2014, we estimated that 30·6 million (54·9%, 90% uncertainty interval 49·9-59·4) were safe, 17·1 million (30·7%, 25·5-35·6) were less safe, and 8·0 million (14·4%, 11·5-18·1) were least safe. Thus, 25·1 million (45·1%, 40·6-50·1) abortions each year between 2010 and 2014 were unsafe, with 24·3 million (97%) of these in developing countries. The proportion of unsafe abortions was significantly higher in developing countries than developed countries (49·5% vs 12·5%). When grouped by the legal status of abortion, the proportion of unsafe abortions was significantly higher in countries with highly restrictive abortion laws than in those with less restrictive laws. Increased efforts are needed, especially in developing countries, to ensure access to safe abortion. The paucity of empirical data is a limitation of these findings. Improved in-country data for health services and innovative research to

  16. Integrated Bayesian network framework for modeling complex ecological issues.

    Science.gov (United States)

    Johnson, Sandra; Mengersen, Kerrie

    2012-07-01

    The management of environmental problems is multifaceted, requiring varied and sometimes conflicting objectives and perspectives to be considered. Bayesian network (BN) modeling facilitates the integration of information from diverse sources and is well suited to tackling the management challenges of complex environmental problems. However, combining several perspectives in one model can lead to large, unwieldy BNs that are difficult to maintain and understand. Conversely, an oversimplified model may lead to an unrealistic representation of the environmental problem. Environmental managers require the current research and available knowledge about an environmental problem of interest to be consolidated in a meaningful way, thereby enabling the assessment of potential impacts and different courses of action. Previous investigations of the environmental problem of interest may have already resulted in the construction of several disparate ecological models. On the other hand, the opportunity may exist to initiate this modeling. In the first instance, the challenge is to integrate existing models and to merge the information and perspectives from these models. In the second instance, the challenge is to include different aspects of the environmental problem incorporating both the scientific and management requirements. Although the paths leading to the combined model may differ for these 2 situations, the common objective is to design an integrated model that captures the available information and research, yet is simple to maintain, expand, and refine. BN modeling is typically an iterative process, and we describe a heuristic method, the iterative Bayesian network development cycle (IBNDC), for the development of integrated BN models that are suitable for both situations outlined above. The IBNDC approach facilitates object-oriented BN (OOBN) modeling, arguably viewed as the next logical step in adaptive management modeling, and that embraces iterative development

  17. An economic growth model based on financial credits distribution to the government economy priority sectors of each regency in Indonesia using hierarchical Bayesian method

    Science.gov (United States)

    Yasmirullah, Septia Devi Prihastuti; Iriawan, Nur; Sipayung, Feronika Rosalinda

    2017-11-01

    The success of regional economic development can be measured by economic growth. Since Act No. 32 of 2004 was implemented, the economic imbalance among the regencies in Indonesia has been increasing. This condition is contrary to the government's goal of building societal welfare through economic development in each region. This research aims to examine economic growth through the distribution of bank credits to each of Indonesia's regencies. The data analyzed in this research are hierarchically structured and follow a normal distribution at the first level. Two modeling approaches are employed in this research: a global one-level Bayesian approach and a two-level hierarchical Bayesian approach. The result shows that the hierarchical Bayesian model provides better estimates than the global one-level Bayesian model. It shows that the differing economic growth in each province is significantly influenced by the variation of micro-level characteristics in each province. These variations are significantly affected by city and province characteristics at the second level.
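
    The difference between a global one-level model and a two-level hierarchical model can be seen in a toy partial-pooling computation. The sketch below uses synthetic region-level growth figures (not the Indonesian credit data) and an empirical-Bayes shrinkage estimator as a simple stand-in for the full hierarchical Bayesian fit:

```python
# Minimal sketch: no pooling vs. complete (global) pooling vs. partial pooling
# for region-specific means in a two-level normal model.
import numpy as np

rng = np.random.default_rng(4)
J, n = 20, 15                                      # regions, observations per region
true_means = rng.normal(5.0, 1.0, J)               # region-level growth rates (%)
sigma = 2.0
data = true_means[:, None] + rng.normal(0, sigma, (J, n))

ybar = data.mean(axis=1)                           # no-pooling estimate per region
grand = ybar.mean()                                # complete-pooling (global) estimate
se2 = sigma ** 2 / n
tau2 = max(ybar.var(ddof=1) - se2, 1e-6)           # between-region variance (method of moments)
shrink = tau2 / (tau2 + se2)
hier = grand + shrink * (ybar - grand)             # empirical-Bayes partial pooling

for name, est in [("no pooling", ybar),
                  ("global", np.full(J, grand)),
                  ("hierarchical", hier)]:
    print(name, "RMSE:", np.sqrt(np.mean((est - true_means) ** 2)))
```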

  18. A novel approach to quantifying the sensitivity of current and future cosmological datasets to the neutrino mass ordering through Bayesian hierarchical modeling

    Science.gov (United States)

    Gerbino, Martina; Lattanzi, Massimiliano; Mena, Olga; Freese, Katherine

    2017-12-01

    We present a novel approach to derive constraints on neutrino masses, as well as on other cosmological parameters, from cosmological data, while taking into account our ignorance of the neutrino mass ordering. We derive constraints from a combination of current as well as future cosmological datasets on the total neutrino mass Mν and on the mass fractions fν,i = mi/Mν (where the index i = 1, 2, 3 indicates the three mass eigenstates) carried by each of the mass eigenstates mi, after marginalizing over the (unknown) neutrino mass ordering, either normal ordering (NH) or inverted ordering (IH). The bounds on all the cosmological parameters, including those on the total neutrino mass, take therefore into account the uncertainty related to our ignorance of the mass hierarchy that is actually realized in nature. This novel approach is carried out in the framework of Bayesian analysis of a typical hierarchical problem, where the distribution of the parameters of the model depends on further parameters, the hyperparameters. In this context, the choice of the neutrino mass ordering is modeled via the discrete hyperparameter htype, which we introduce in the usual Markov chain analysis. The preference from cosmological data for either the NH or the IH scenarios is then simply encoded in the posterior distribution of the hyperparameter itself. Current cosmic microwave background (CMB) measurements assign equal odds to the two hierarchies, and are thus unable to distinguish between them. However, after the addition of baryon acoustic oscillation (BAO) measurements, a weak preference for the normal hierarchical scenario appears, with odds of 4:3 from Planck temperature and large-scale polarization in combination with BAO (3:2 if small-scale polarization is also included). Concerning next-generation cosmological experiments, forecasts suggest that the combination of upcoming CMB (COrE) and BAO surveys (DESI) may determine the neutrino mass hierarchy at a high statistical

  19. A Bayesian framework for automated cardiovascular risk scoring on standard lumbar radiographs

    DEFF Research Database (Denmark)

    Petersen, Peter Kersten; Ganz, Melanie; Mysling, Peter

    2012-01-01

    We present a fully automated framework for scoring a patient's risk of cardiovascular disease (CVD) and mortality from a standard lateral radiograph of the lumbar aorta. The framework segments abdominal aortic calcifications for computing a CVD risk score and performs a survival analysis to validate the score. Since the aorta is invisible on X-ray images, its position is reasoned from (1) the shape and location of the lumbar vertebrae and (2) the location, shape, and orientation of potential calcifications. The proposed framework follows the principle of Bayesian inference, which has several advantages...

  20. Hierarchical Pore Development by Plasma Etching of Zr-Based Metal-Organic Frameworks.

    Science.gov (United States)

    DeCoste, Jared B; Rossin, Joseph A; Peterson, Gregory W

    2015-12-07

    The typically stable Zr-based metal-organic frameworks (MOFs) UiO-66 and UiO-66-NH2 were treated with tetrafluoromethane (CF4) and hexafluoroethane (C2F6) plasmas. Through interactions between fluoride radicals from the perfluoroalkane plasma and the zirconium-oxygen bonds of the MOF, the resulting materials showed the development of mesoporosity, creating a hierarchical pore structure. It is anticipated that this strategy can be used as a post-synthetic technique for developing hierarchical networks in a variety of MOFs. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Sparse Estimation Using Bayesian Hierarchical Prior Modeling for Real and Complex Linear Models

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand; Manchón, Carles Navarro; Badiu, Mihai Alin

    2015-01-01

    In sparse Bayesian learning (SBL), Gaussian scale mixtures (GSMs) have been used to model sparsity-inducing priors that realize a class of concave penalty functions for the regression task in real-valued signal models. Motivated by the relative scarcity of formal tools for SBL in complex-valued models, this paper proposes a GSM model - the Bessel K model - that induces concave penalty functions for the estimation of complex sparse signals. The properties of the Bessel K model are analyzed when it is applied to Type I and Type II estimation. This analysis reveals that, by tuning the parameters of the mixing pdf, different penalty functions are invoked depending on the estimation type used, the value of the noise variance, and whether real or complex signals are estimated. Using the Bessel K model, we derive a sparse estimator based on a modification of the expectation-maximization algorithm formulated...

  2. Improving the Calibration of the SN Ia Anchor Datasets with a Bayesian Hierarchical Model

    Science.gov (United States)

    Currie, Miles; Rubin, David

    2018-01-01

    Inter-survey calibration remains one of the largest systematic uncertainties in SN Ia cosmology today. Ideally, each survey would measure their system throughputs and observe well characterized spectrophotometric standard stars, but many important surveys have not done so. For these surveys, we calibrate using tertiary survey stars tied to SDSS and Pan-STARRS. We improve on previous efforts by taking the spatially variable response of each telescope/camera into account, and using improved color transformations in the surveys’ natural instrumental photometric system. We use a global hierarchical model of the data, automatically providing a covariance matrix of magnitude offsets and bandpass shifts which reduces the systematic uncertainty in inter-survey calibration, thereby providing better cosmological constraints.

  3. Creating Hierarchical Pores by Controlled Linker Thermolysis in Multivariate Metal-Organic Frameworks

    KAUST Repository

    Feng, Liang

    2018-01-18

    Sufficient pore size, appropriate stability and hierarchical porosity are three prerequisites for open frameworks designed for drug delivery, enzyme immobilization and catalysis involving large molecules. Herein, we report a powerful and general strategy, linker thermolysis, to construct ultra-stable hierarchically porous metal−organic frameworks (HP-MOFs) with tunable pore size distribution. Linker instability, usually an undesirable trait of MOFs, was exploited to create mesopores by generating crystal defects throughout a microporous MOF crystal via thermolysis. The crystallinity and stability of HP-MOFs remain after thermolabile linkers are selectively removed from multivariate metal-organic frameworks (MTV-MOFs) through a decarboxylation process. A domain-based linker spatial distribution was found to be critical for creating hierarchical pores inside MTV-MOFs. Furthermore, linker thermolysis promotes the formation of ultra-small metal oxide (MO) nanoparticles immobilized in an open framework that exhibits high catalytic activity for Lewis acid catalyzed reactions. Most importantly, this work provides fresh insights into the connection between linker apportionment and vacancy distribution, which may shed light on probing the disordered linker apportionment in multivariate systems, a long-standing challenge in the study of MTV-MOFs.

  4. Bayesian hierarchical modelling of continuous non-negative longitudinal data with a spike at zero: An application to a study of birds visiting gardens in winter.

    Science.gov (United States)

    Swallow, Ben; Buckland, Stephen T; King, Ruth; Toms, Mike P

    2016-03-01

    The development of methods for dealing with continuous data with a spike at zero has lagged behind those for overdispersed or zero-inflated count data. We consider longitudinal ecological data corresponding to an annual average of 26 weekly maximum counts of birds, and are hence effectively continuous, bounded below by zero but also with a discrete mass at zero. We develop a Bayesian hierarchical Tweedie regression model that can directly accommodate the excess number of zeros common to this type of data, whilst accounting for both spatial and temporal correlation. Implementation of the model is conducted in a Markov chain Monte Carlo (MCMC) framework, using reversible jump MCMC to explore uncertainty across both parameter and model spaces. This regression modelling framework is very flexible and removes the need to make strong assumptions about mean-variance relationships a priori. It can also directly account for the spike at zero, whilst being easily applicable to other types of data and other model formulations. Whilst a correlative study such as this cannot prove causation, our results suggest that an increase in an avian predator may have led to an overall decrease in the number of one of its prey species visiting garden feeding stations in the United Kingdom. This may reflect a change in behaviour of house sparrows to avoid feeding stations frequented by sparrowhawks, or a reduction in house sparrow population size as a result of sparrowhawk increase. © 2015 The Author. Biometrical Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Bayesian Decision Support

    Science.gov (United States)

    Berliner, M.

    2017-12-01

    Bayesian statistical decision theory offers a natural framework for decision-policy making in the presence of uncertainty. Key advantages of the approach include efficient incorporation of information and observations. However, in complicated settings it is very difficult, perhaps essentially impossible, to formalize the mathematical inputs needed in the approach. Nevertheless, using the approach as a template is useful for decision support; that is, for organizing and communicating our analyses. Bayesian hierarchical modeling is valuable in quantifying and managing uncertainty in such cases. I review some aspects of the idea, emphasizing statistical model development and use in the context of sea-level rise.

  6. Manual hierarchical clustering of regional geochemical data using a Bayesian finite mixture model

    Science.gov (United States)

    Ellefsen, Karl J.; Smith, David

    2016-01-01

    Interpretation of regional scale, multivariate geochemical data is aided by a statistical technique called “clustering.” We investigate a particular clustering procedure by applying it to geochemical data collected in the State of Colorado, United States of America. The clustering procedure partitions the field samples for the entire survey area into two clusters. The field samples in each cluster are partitioned again to create two subclusters, and so on. This manual procedure generates a hierarchy of clusters, and the different levels of the hierarchy show geochemical and geological processes occurring at different spatial scales. Although there are many different clustering methods, we use Bayesian finite mixture modeling with two probability distributions, which yields two clusters. The model parameters are estimated with Hamiltonian Monte Carlo sampling of the posterior probability density function, which usually has multiple modes. Each mode has its own set of model parameters; each set is checked to ensure that it is consistent both with the data and with independent geologic knowledge. The set of model parameters that is most consistent with the independent geologic knowledge is selected for detailed interpretation and partitioning of the field samples.
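
    A rough sketch of such a top-down, two-way partitioning is given below. It uses synthetic multivariate concentrations and an EM-fitted two-component Gaussian mixture in place of the Bayesian finite mixture with Hamiltonian Monte Carlo described above, so it only illustrates the recursive hierarchy, not the inferential machinery:

```python
# Minimal sketch: recursively split field samples into two clusters with a
# two-component Gaussian mixture, building a small cluster hierarchy.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
# hypothetical multivariate geochemical concentrations (e.g. log-transformed)
X = np.vstack([rng.normal(m, 0.3, size=(200, 4)) for m in (0.0, 1.5, 3.0)])

def split(indices, depth, max_depth=2):
    """Partition the samples at `indices` into two clusters, then recurse."""
    if depth == max_depth or len(indices) < 20:
        return {"samples": indices}
    gm = GaussianMixture(n_components=2, random_state=0).fit(X[indices])
    labels = gm.predict(X[indices])
    return {"left": split(indices[labels == 0], depth + 1),
            "right": split(indices[labels == 1], depth + 1)}

tree = split(np.arange(len(X)), 0)
print(tree.keys())                                 # top level of the cluster hierarchy
```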

  7. Electroencephalography-based real-time cortical monitoring system that uses hierarchical Bayesian estimations for the brain-machine interface.

    Science.gov (United States)

    Choi, Kyuwan

    2014-06-01

    In this study, a real-time cortical activity monitoring system was constructed, which could estimate cortical activities every 125 milliseconds over 2,240 vertexes from 64 channel electroencephalography signals through the Hierarchical Bayesian estimation that uses functional magnetic resonance imaging data as its prior information. Recently, functional magnetic resonance imaging has mostly been used in the neurofeedback field because it allows for high spatial resolution. However, in functional magnetic resonance imaging, the time for the neurofeedback information to reach the patient is delayed several seconds because of its poor temporal resolution. Therefore, a number of problems need to be solved to effectively implement feedback training paradigms in patients. To address this issue, this study used a new cortical activity monitoring system that improved both spatial and temporal resolution by using both functional magnetic resonance imaging data and electroencephalography signals in conjunction with one another. This system is advantageous as it can improve applications in the fields of real-time diagnosis, neurofeedback, and the brain-machine interface.

  8. A multi-level hierarchic Markov process with Bayesian updating for herd optimization and simulation in dairy cattle.

    Science.gov (United States)

    Demeter, R M; Kristensen, A R; Dijkstra, J; Oude Lansink, A G J M; Meuwissen, M P M; van Arendonk, J A M

    2011-12-01

    Herd optimization models that determine economically optimal insemination and replacement decisions are valuable research tools to study various aspects of farming systems. The aim of this study was to develop a herd optimization and simulation model for dairy cattle. The model determines economically optimal insemination and replacement decisions for individual cows and simulates whole-herd results that follow from optimal decisions. The optimization problem was formulated as a multi-level hierarchic Markov process, and a state space model with Bayesian updating was applied to model variation in milk yield. Methodological developments were incorporated in 2 main aspects. First, we introduced an additional level to the model hierarchy to obtain a more tractable and efficient structure. Second, we included a recently developed cattle feed intake model. In addition to methodological developments, new parameters were used in the state space model and other biological functions. Results were generated for Dutch farming conditions, and outcomes were in line with actual herd performance in the Netherlands. Optimal culling decisions were sensitive to variation in milk yield but insensitive to energy requirements for maintenance and feed intake capacity. We anticipate that the model will be applied in research and extension. Copyright © 2011 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  9. Estimation of Mental Disorders Prevalence in High School Students Using Small Area Methods: A Hierarchical Bayesian Approach

    Directory of Open Access Journals (Sweden)

    Ali Reza Soltanian

    2016-08-01

    Full Text Available Background Adolescence is one of the most important periods in the course of human development, and mental disorders are prevalent among adolescents in the different regions of Iran, especially southern Iran. Objectives This study was conducted to determine the prevalence of mental disorders among high school students in Bushehr province, south of Iran. Methods In this cross-sectional study, 286 high school students were recruited by multi-stage random sampling in Bushehr province in 2015. The General Health Questionnaire (GHQ-28) was used to assess mental disorders. The small area method, under the hierarchical Bayesian approach, was used to determine the prevalence of mental disorders and for data analysis. Results Of the 286 questionnaires, only 182 were completely filled in and evaluated (a response rate of 70.5%). Of the students, 58.79% and 41.21% were male and female, respectively. The prevalence of mental disorders in Bushehr, Dayyer, Deylam, Kangan, Dashtestan, Tangestan, Genaveh, and Dashty were 0.48, 0.42, 0.45, 0.52, 0.41, 0.47, 0.42, and 0.43, respectively. Conclusions Based on this study, the prevalence of mental disorders among adolescents is increasing in the counties of Bushehr Province. The lack of a national policy in this area is a serious obstacle to mental health and wellbeing access.

  10. Using hierarchical dynamic Bayesian networks to investigate dynamics of organ failure in patients in the Intensive Care Unit.

    Science.gov (United States)

    Peelen, Linda; de Keizer, Nicolette F; Jonge, Evert de; Bosman, Robert-Jan; Abu-Hanna, Ameen; Peek, Niels

    2010-04-01

    In intensive care medicine close monitoring of organ failure status is important for the prognosis of patients and for choices regarding ICU management. Major challenges in analyzing the multitude of data pertaining to the functioning of the organ systems over time are to extract meaningful clinical patterns and to provide predictions for the future course of diseases. With their explicit states and probabilistic state transitions, Markov models seem to fit this purpose well. In complex domains such as intensive care a choice is often made between a simple model that is estimated from the data, or a more complex model in which the parameters are provided by domain experts. Our primary aim is to combine these approaches and develop a set of complex Markov models based on clinical data. In this paper we describe the design choices underlying the models, which enable them to identify temporal patterns, predict outcomes, and test clinical hypotheses. Our models are characterized by the choice of the dynamic hierarchical Bayesian network structure and the use of logistic regression equations in estimating the transition probabilities. We demonstrate the induction, inference, evaluation, and use of these models in practice in a case-study of patients with severe sepsis admitted to four Dutch ICUs. 2009 Elsevier Inc. All rights reserved.
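
    The core ingredient, transition probabilities obtained from logistic regression equations, can be illustrated with a deliberately small two-state organ-failure chain. All coefficients, covariates and time steps below are made up for illustration; the models in the paper are richer dynamic hierarchical Bayesian networks:

```python
# Minimal sketch: a two-state Markov chain whose transition probabilities come
# from logistic regressions on (assumed) patient covariates.
import numpy as np

rng = np.random.default_rng(7)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# assumed coefficients: intercept, age effect, severity-score effect
coef_to_failure = np.array([-3.0, 0.02, 0.15])     # P(no failure -> failure)
coef_recovery   = np.array([-0.5, -0.01, -0.10])   # P(failure -> no failure)

def simulate(age, score, days=14):
    x = np.array([1.0, age, score])
    state, path = 0, [0]                            # 0 = no organ failure, 1 = failure
    for _ in range(days):
        if state == 0:
            p_fail = sigmoid(x @ coef_to_failure)
            state = 1 if rng.uniform() < p_fail else 0
        else:
            p_recover = sigmoid(x @ coef_recovery)
            state = 0 if rng.uniform() < p_recover else 1
        path.append(state)
    return path

print(simulate(age=70, score=9))                    # one simulated 14-day trajectory
```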

  11. Materials Knowledge Systems in Python - A Data Science Framework for Accelerated Development of Hierarchical Materials.

    Science.gov (United States)

    Brough, David B; Wheeler, Daniel; Kalidindi, Surya R

    2017-03-01

    There is a critical need for customized analytics that take into account the stochastic nature of the internal structure of materials at multiple length scales in order to extract relevant and transferable knowledge. Data-driven Process-Structure-Property (PSP) linkages provide a systemic, modular and hierarchical framework for community-driven curation of materials knowledge, and for its transference to design and manufacturing experts. The Materials Knowledge Systems in Python project (PyMKS) is the first open source materials data science framework that can be used to create high-value PSP linkages for hierarchical materials that can be leveraged by experts in the materials science and engineering, manufacturing, machine learning and data science communities. This paper describes the main functions available from this repository, along with illustrations of how these can be accessed, utilized, and potentially further refined by the broader community of researchers.

  12. A Bayesian framework to estimate diversification rates and their variation through time and space

    Directory of Open Access Journals (Sweden)

    Silvestro Daniele

    2011-10-01

    Full Text Available Abstract. Background: Patterns of species diversity are the result of speciation and extinction processes, and molecular phylogenetic data can provide valuable information to derive their variability through time and across clades. Bayesian Markov chain Monte Carlo methods offer a promising framework to incorporate phylogenetic uncertainty when estimating rates of diversification. Results: We introduce a new approach to estimate diversification rates in a Bayesian framework over a distribution of trees under various constant and variable rate birth-death and pure-birth models, and test it on simulated phylogenies. Furthermore, speciation and extinction rates and their posterior credibility intervals can be estimated while accounting for non-random taxon sampling. The framework is particularly suitable for hypothesis testing using Bayes factors, as we demonstrate analyzing dated phylogenies of Chondrostoma (Cyprinidae) and Lupinus (Fabaceae). In addition, we develop a model that extends the rate estimation to a meta-analysis framework in which different data sets are combined in a single analysis to detect general temporal and spatial trends in diversification. Conclusions: Our approach provides a flexible framework for the estimation of diversification parameters and hypothesis testing while simultaneously accounting for uncertainties in the divergence times and incomplete taxon sampling.

  13. A conceptual modeling framework for discrete event simulation using hierarchical control structures.

    Science.gov (United States)

    Furian, N; O'Sullivan, M; Walker, C; Vössner, S; Neubacher, D

    2015-08-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model-components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a models' system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example.
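
    To make the idea of separating system structure from control concrete, here is a hedged sketch written with the SimPy library (the toy two-queue, one-machine system, the policy, and the timings are all assumptions, not part of the framework above): the dispatching decision is isolated in its own controller function, mirroring the framework's separation of control policies from model components.

```python
# Minimal sketch: a DES in which the control/dispatching policy is a separate,
# swappable function, distinct from the structural components (queues, machine).
import simpy

def controller(queues):
    """Dispatching policy (a control decision): serve the longest queue first."""
    return max(queues, key=lambda q: len(q.items))

def machine(env, queues, log):
    while True:
        q = controller(queues)
        if len(q.items) == 0:
            yield env.timeout(1)                   # idle poll
            continue
        job = yield q.get()
        yield env.timeout(2)                       # processing time (assumed)
        log.append((env.now, job))

def arrivals(env, queue, name, interval):
    i = 0
    while True:
        yield env.timeout(interval)
        i += 1
        yield queue.put(f"{name}-{i}")

env = simpy.Environment()
queues = [simpy.Store(env), simpy.Store(env)]
log = []
env.process(arrivals(env, queues[0], "A", 3))
env.process(arrivals(env, queues[1], "B", 5))
env.process(machine(env, queues, log))
env.run(until=40)
print(log)
```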

  14. Spatial variability of the effect of air pollution on term birth weight: evaluating influential factors using Bayesian hierarchical models.

    Science.gov (United States)

    Li, Lianfa; Laurent, Olivier; Wu, Jun

    2016-02-05

    Epidemiological studies suggest that air pollution is adversely associated with pregnancy outcomes. Such associations may be modified by spatially-varying factors including socio-demographic characteristics, land-use patterns and unaccounted exposures. Yet, few studies have systematically investigated the impact of these factors on spatial variability of the air pollution's effects. This study aimed to examine spatial variability of the effects of air pollution on term birth weight across Census tracts and the influence of tract-level factors on such variability. We obtained over 900,000 birth records from 2001 to 2008 in Los Angeles County, California, USA. Air pollution exposure was modeled at individual level for nitrogen dioxide (NO2) and nitrogen oxides (NOx) using spatiotemporal models. Two-stage Bayesian hierarchical non-linear models were developed to (1) quantify the associations between air pollution exposure and term birth weight within each tract; and (2) examine the socio-demographic, land-use, and exposure-related factors contributing to the between-tract variability of the associations between air pollution and term birth weight. Higher air pollution exposure was associated with lower term birth weight (average posterior effects: -14.7 (95 % CI: -19.8, -9.7) g per 10 ppb increment in NO2 and -6.9 (95 % CI: -12.9, -0.9) g per 10 ppb increment in NOx). The variation of the association across Census tracts was significantly influenced by the tract-level socio-demographic, exposure-related and land-use factors. Our models captured the complex non-linear relationship between these factors and the associations between air pollution and term birth weight: we observed the thresholds from which the influence of the tract-level factors was markedly exacerbated or attenuated. Exacerbating factors might reflect additional exposure to environmental insults or lower socio-economic status with higher vulnerability, whereas attenuating factors might indicate reduced

  15. Bayesian modelling of multiple diagnostics at Wendelstein 7-X using the Minerva framework

    Science.gov (United States)

    Kwak, Sehyun; Svensson, Jakob; Bozhenkov, Sergey; Trimino Mora, Humberto; Hoefel, Udo; Pavone, Andrea; Krychowiak, Maciej; Langenberg, Andreas; Ghim, Young-Chul; W7-X Team

    2017-10-01

    Wendelstein 7-X (W7-X) is a large scale optimised stellarator designed for steady-state operation with fusion reactor relevant conditions. Consistent inference of physics parameters and their associated uncertainties requires the capability to handle the complexity of the entire system, including physics models of multiple diagnostics. A Bayesian model has been developed in the Minerva framework to infer electron temperature and density profiles from multiple diagnostics in a consistent way. Here, the physics models predict the data of multiple diagnostics in a joint Bayesian analysis. The electron temperature and density profiles are modelled by Gaussian processes with hyperparameters. Markov chain Monte Carlo methods explore the full posterior of electron temperature and density profiles as well as possible combinations of hyperparameters and calibration factors. This results in a profile inference with proper uncertainties reflecting both statistical error and the automatic calibration for diagnostics.
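
    The Gaussian-process prior idea can be sketched in a few lines: profiles over a normalised radius are draws from a multivariate normal whose covariance is set by a kernel, and the hyperparameters (here a length scale) control how smooth the admissible profiles are. The kernel choice and grid below are assumptions for illustration, not the Minerva implementation.

```python
# Minimal sketch of a Gaussian-process prior over profiles (hypothetical kernel and grid).
import numpy as np

rng = np.random.default_rng(3)

def sq_exp_kernel(x, length_scale=0.2, variance=1.0):
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

rho = np.linspace(0.0, 1.0, 60)                   # normalised radius
K = sq_exp_kernel(rho) + 1e-8 * np.eye(rho.size)  # jitter for numerical stability
L = np.linalg.cholesky(K)
prior_draws = L @ rng.normal(size=(rho.size, 5))  # five prior profile samples
print(prior_draws.shape)  # (60, 5): each column is one candidate profile shape
```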

  16. Π4U: A high performance computing framework for Bayesian uncertainty quantification of complex models

    Energy Technology Data Exchange (ETDEWEB)

    Hadjidoukas, P.E.; Angelikopoulos, P. [Computational Science and Engineering Laboratory, ETH Zürich, CH-8092 (Switzerland); Papadimitriou, C. [Department of Mechanical Engineering, University of Thessaly, GR-38334 Volos (Greece); Koumoutsakos, P., E-mail: petros@ethz.ch [Computational Science and Engineering Laboratory, ETH Zürich, CH-8092 (Switzerland)

    2015-03-01

    We present Π4U, an extensible framework for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.

  17. Mapping informative clusters in a hierarchical framework of fMRI multivariate analysis.

    Directory of Open Access Journals (Sweden)

    Rui Xu

    Full Text Available Pattern recognition methods, which are powerful in discriminating between multi-voxel patterns of brain activity associated with different mental states, have become increasingly popular in fMRI data analysis. However, when they are used in functional brain mapping, the location of discriminative voxels varies significantly, raising difficulties in interpreting the locus of the effect. Here we propose a hierarchical multivariate framework that maps informative clusters rather than voxels to achieve reliable functional brain mapping without compromising the discriminative power. In particular, we first searched for local homogeneous clusters that consisted of voxels with similar response profiles. Then, a multi-voxel classifier was built for each cluster to extract discriminative information from the multi-voxel patterns. Finally, through multivariate ranking, outputs from the classifiers served as a multi-cluster pattern to identify informative clusters by examining interactions among clusters. Results from both simulated and real fMRI data demonstrated that this hierarchical approach showed better performance in the robustness of functional brain mapping than traditional voxel-based multivariate methods. In addition, the mapped clusters were highly overlapped for two perceptually equivalent object categories, further confirming the validity of our approach. In short, this hierarchical multivariate framework is suitable for both pattern classification and brain mapping in fMRI studies.
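
    A toy version of the cluster-then-classify pipeline, on simulated data rather than fMRI recordings, might look like the following; the clustering, classifier and ranking criterion are stand-ins for the authors' specific choices.

```python
# Rough sketch of the cluster-then-classify idea on simulated data: voxels are grouped by
# response profile, a classifier is fit per cluster, and clusters are ranked by accuracy.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_trials, n_voxels = 120, 300
labels = rng.integers(0, 2, n_trials)                 # two mental states
data = rng.normal(size=(n_trials, n_voxels))
data[:, :30] += labels[:, None] * 0.8                 # 30 informative voxels

# Step 1: cluster voxels by their response profile across trials.
voxel_profiles = data.T                               # shape (n_voxels, n_trials)
cluster_id = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(voxel_profiles)

# Step 2: one multi-voxel classifier per cluster, scored by cross-validation.
scores = []
for c in range(10):
    X_c = data[:, cluster_id == c]
    acc = cross_val_score(LogisticRegression(max_iter=1000), X_c, labels, cv=5).mean()
    scores.append((acc, c))

# Step 3: rank clusters by how informative their multi-voxel pattern is.
for acc, c in sorted(scores, reverse=True)[:3]:
    print(f"cluster {c}: accuracy {acc:.2f}")
```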

  18. The Hydrolytic Stability and Degradation Mechanism of a Hierarchically Porous Metal Alkylphosphonate Framework

    Directory of Open Access Journals (Sweden)

    Kai Lv

    2018-03-01

    Full Text Available To aid the design of a hierarchically porous unconventional metal-phosphonate framework (HP-UMPF) for practical radioanalytical separation, a systematic investigation of the hydrolytic stability of the bulk phase against acidic corrosion has been carried out for an archetypical HP-UMPF. Bulk dissolution results suggest that aqueous acidity has a greater effect on incongruent leaching than temperature, and that kinetic stability reaches equilibrium through the accumulation of partially leached species on the corrosion conduits. Variation of particle morphology, hierarchical porosity and backbone composition upon corrosion reveals that these materials are hydrolytically resilient and do not suffer severe degradation of their porous texture, although large aggregates crack into sporadic fractures and nucleophilic attack on the inorganic layers causes leaching of tin and phosphorus. The remaining selectivity of these HP-UMPFs is dictated by a balance between the elimination of free phosphonate and the exposure of confined phosphonates, thus allowing real-time tailoring of radionuclide sequestration. Moreover, a plausible degradation mechanism is proposed for the progressive dissolution of the three-level hierarchical porous structure to elucidate the resultant reactivity. These HP-UMPFs are compared with benchmark metal-organic frameworks (MOFs) to obtain a rough grading of hydrolytic stability, and two feasible approaches are suggested for enhancing their hydrolytic stability for real-life separation protocols.

  19. Development of a hierarchical Bayesian model to estimate the growth parameters of Listeria monocytogenes in minimally processed fresh leafy salads.

    Science.gov (United States)

    Crépet, Amélie; Stahl, Valérie; Carlin, Frédéric

    2009-05-31

    The optimal growth rate μ_opt of Listeria monocytogenes in minimally processed (MP) fresh leafy salads was estimated with a hierarchical Bayesian model at (mean ± standard deviation) 0.33 ± 0.16 h⁻¹. This μ_opt value was much lower on average than that in nutrient broth, liquid dairy, meat and seafood products (0.7-1.3 h⁻¹), and of the same order of magnitude as in cheese. Cardinal temperatures T_min, T_opt and T_max were determined at -4.5 ± 1.3 °C, 37.1 ± 1.3 °C and 45.4 ± 1.2 °C respectively. These parameters were determined from 206 growth curves of L. monocytogenes in MP fresh leafy salads (lettuce including iceberg lettuce, broad leaf endive, curly leaf endive, lamb's lettuce, and mixtures of them) selected from the scientific literature and from technical reports. The adequacy of the model was evaluated by comparing observed data (bacterial concentrations at each experimental time for the completion of the 206 growth curves, mean log10 increase at selected times and temperatures, L. monocytogenes concentrations in naturally contaminated MP iceberg lettuce) with the distribution of the predicted data generated by the model. The sensitivity of the model to assumptions about the prior values was also tested. The observed values mostly fell into the 95% credible interval of the distribution of predicted values. The μ_opt and its uncertainty determined in this work could be used in quantitative microbial risk assessment for L. monocytogenes in minimally processed fresh leafy salads.
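
    One standard way to turn such cardinal parameters into a predicted growth rate is the cardinal temperature model with inflection (a Rosso-type secondary model); the abstract does not state the exact secondary model used, so the sketch below is illustrative only, plugging in the reported point estimates.

```python
# Hedged sketch: the cardinal temperature model with inflection is a common secondary model
# for mapping (T_min, T_opt, T_max, mu_opt) to a growth rate at temperature T; treat this as
# illustrative rather than the authors' exact model.
import numpy as np

def mu_cardinal(T, mu_opt=0.33, T_min=-4.5, T_opt=37.1, T_max=45.4):
    """Predicted growth rate (h^-1) at temperature T (deg C)."""
    T = np.asarray(T, dtype=float)
    num = (T - T_max) * (T - T_min) ** 2
    den = (T_opt - T_min) * ((T_opt - T_min) * (T - T_opt)
                             - (T_opt - T_max) * (T_opt + T_min - 2 * T))
    gamma = np.where((T > T_min) & (T < T_max), num / den, 0.0)
    return mu_opt * np.clip(gamma, 0.0, None)

print(mu_cardinal([4, 8, 20, 37]))  # predicted rates from chill temperatures up to near T_opt
```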

  20. Global trends and factors associated with the illegal killing of elephants: A hierarchical Bayesian analysis of carcass encounter data.

    Directory of Open Access Journals (Sweden)

    Robert W Burn

    Full Text Available Elephant poaching and the ivory trade remain high on the agenda at meetings of the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). Well-informed debates require robust estimates of trends, the spatial distribution of poaching, and drivers of poaching. We present an analysis of trends and drivers of an indicator of poaching for all elephant species. The site-based monitoring system known as Monitoring the Illegal Killing of Elephants (MIKE), set up by the 10th Conference of the Parties of CITES in 1997, produces carcass encounter data reported mainly by anti-poaching patrols. The data analyzed were site-by-year totals of 6,337 carcasses from 66 sites in Africa and Asia from 2002-2009. Analysis of these observational data is a serious challenge to traditional statistical methods because of the opportunistic and non-random nature of patrols, and the heterogeneity across sites. Adopting a Bayesian hierarchical modeling approach, we used the proportion of carcasses that were illegally killed (PIKE) as a poaching index, to estimate the trend and the effects of site- and country-level factors associated with poaching. Important drivers of illegal killing that emerged at country level were poor governance and low levels of human development, and at site level, forest cover and area of the site in regions where human population density is low. After a drop from 2002, PIKE remained fairly constant from 2003 until 2006, after which it increased until 2008. The results for 2009 indicate a decline. Sites with PIKE ranging from the lowest to the highest were identified. The results of the analysis provide a sound information base for scientific evidence-based decision making in the CITES process.
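
    The core of PIKE-style estimation is a proportion observed with very different sample sizes across sites; a minimal sketch (invented counts, not the MIKE programme's model, which adds covariates and temporal structure) shows how a beta-binomial hierarchy shrinks noisy site-level proportions toward the overall level.

```python
# Minimal sketch: PIKE is the proportion of encountered carcasses judged illegally killed;
# site-level proportions based on few carcasses are noisy, so a beta-binomial hierarchy
# shrinks them toward the overall level. Counts below are invented.
import numpy as np

illegal = np.array([3, 12, 45, 2, 30])     # illegally killed carcasses per site
total   = np.array([10, 20, 60, 15, 35])   # all carcasses encountered per site
raw_pike = illegal / total

# Method-of-moments fit of a Beta(a, b) distribution to the raw proportions.
m, v = raw_pike.mean(), raw_pike.var(ddof=1)
common = m * (1 - m) / v - 1
a, b = m * common, (1 - m) * common

# Posterior mean PIKE per site under the beta-binomial hierarchy.
shrunk_pike = (illegal + a) / (total + a + b)
for r, s, n in zip(raw_pike, shrunk_pike, total):
    print(f"raw {r:.2f} -> shrunk {s:.2f}  (n={n})")
```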

  1. Effects of management intervention on post-disturbance community composition: an experimental analysis using Bayesian hierarchical models.

    Directory of Open Access Journals (Sweden)

    Jack Giovanini

    Full Text Available As human demand for ecosystem products increases, management intervention may become more frequent after environmental disturbances. Evaluations of ecological responses to cumulative effects of management interventions and natural disturbances provide critical decision-support tools for managers who strive to balance environmental conservation and economic development. We conducted an experiment to evaluate the effects of salvage logging on avian community composition in lodgepole pine (Pinus contorta) forests affected by beetle outbreaks in Oregon, USA, 1996-1998. Treatments consisted of the removal of lodgepole pine snags only, and live trees were not harvested. We used a Bayesian hierarchical model to quantify occupancy dynamics for 27 breeding species, while accounting for variation in the detection process. We examined how the magnitude and precision of treatment effects varied when incorporating prior information from a separate intervention study that occurred in a similar ecological system. Regardless of which prior we evaluated, we found no evidence that the harvest treatment had a negative impact on species richness, with an estimated average of 0.2-2.2 more species in harvested stands than unharvested stands. Estimated average similarity between control and treatment stands ranged from 0.82-0.87 (1 indicating complete similarity between a pair of stands) and suggested that treatment stands did not contain novel assemblies of species responding to the harvesting prescription. Estimated treatment effects were positive for twenty-four (90%) of the species, although the credible intervals contained 0 in all cases. These results suggest that, unlike most post-fire salvage logging prescriptions, selective harvesting after beetle outbreaks may meet multiple management objectives, including the maintenance of avian community richness comparable to what is found in unharvested stands. Our results provide managers with prescription alternatives to
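
    The occupancy-modelling idea of separating detection probability from true occupancy can be shown with a single-species, single-season likelihood on simulated data; the study itself fits a richer multi-species Bayesian model, so the following is only a sketch.

```python
# Simplified single-species, single-season occupancy likelihood, maximised to show how
# imperfect detection (p) and occupancy (psi) are separated; not the authors' model.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(11)
n_sites, n_visits, psi_true, p_true = 80, 4, 0.6, 0.4
z = rng.uniform(size=n_sites) < psi_true                      # latent occupancy state
detections = rng.binomial(n_visits, p_true * z)               # detections per site

def neg_loglik(params):
    psi, p = expit(params)                                    # keep probabilities in (0, 1)
    det_prob = psi * p ** detections * (1 - p) ** (n_visits - detections)
    never = psi * (1 - p) ** n_visits + (1 - psi)             # site never detected
    lik = np.where(detections > 0, det_prob, never)
    return -np.sum(np.log(lik))

fit = minimize(neg_loglik, x0=[0.0, 0.0])
print("estimated psi, p:", expit(fit.x))
```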

  2. Estimation of Coast-Wide Population Trends of Marbled Murrelets in Canada Using a Bayesian Hierarchical Model.

    Directory of Open Access Journals (Sweden)

    Douglas F Bertram

    Full Text Available Species at risk with secretive breeding behaviours, low densities, and wide geographic range pose a significant challenge to conservation actions because population trends are difficult to detect. Such is the case with the Marbled Murrelet (Brachyramphus marmoratus), a seabird listed as 'Threatened' by the Species at Risk Act in Canada largely due to the loss of its old growth forest nesting habitat. We report the first estimates of population trend of Marbled Murrelets in Canada derived from a monitoring program that uses marine radar to detect birds as they enter forest watersheds during 923 dawn surveys at 58 radar monitoring stations within the six Marbled Murrelet Conservation Regions on coastal British Columbia, Canada, 1996-2013. Temporal trends in radar counts were analyzed with a hierarchical Bayesian multivariate modeling approach that controlled for variation in tilt of the radar unit and day of year, included year-specific deviations from the overall trend ('year effects'), and allowed for trends to be estimated at three spatial scales. A negative overall trend of -1.6%/yr (95% credibility interval: -3.2%, 0.01%) indicated moderate evidence for a coast-wide decline, although trends varied strongly among the six conservation regions. Negative annual trends were detected in the East Vancouver Island (-9%/yr) and South Mainland Coast (-3%/yr) Conservation Regions. Over a quarter of the year effects were significantly different from zero, and the estimated standard deviation in common-shared year effects between sites within each region was about 50% per year. This large common-shared interannual variation in counts may have been caused by regional movements of birds related to changes in marine conditions that affect the availability of prey.

  3. A hierarchical Bayesian analysis of parasite prevalence and sociocultural outcomes: The role of structural racism and sanitation infrastructure.

    Science.gov (United States)

    Ross, Cody T; Winterhalder, Bruce

    2016-01-01

    We conduct a re-evaluation of the Thornhill and Fincher research project on parasites using finely-resolved geographic data on parasite prevalence, individual-level sociocultural data, and multilevel Bayesian modeling. In contrast to the evolutionary psychological mechanisms linking parasites to human behavior and cultural characteristics proposed by Thornhill and Fincher, we offer an alternative hypothesis that structural racism and differential access to sanitation systems drive both variation in parasite prevalence and differential behaviors and cultural characteristics. We adopt a Bayesian framework to estimate parasite prevalence rates in 51 districts in eight Latin American countries using the disease status of 170,220 individuals tested for infection with the intestinal roundworm Ascaris lumbricoides (Hürlimann et al., []: PLoS Negl Trop Dis 5:e1404). We then use district-level estimates of parasite prevalence and individual-level social data from 5,558 individuals in the same 51 districts (Latinobarómetro, 2008) to assess claims of causal associations between parasite prevalence and sociocultural characteristics. We find, contrary to Thornhill and Fincher, that parasite prevalence is positively associated with preferences for democracy, negatively associated with preferences for collectivism, and not associated with violent crime rates or gender inequality. A positive association between parasite prevalence and religiosity, as in Fincher and Thornhill (: Behav Brain Sci 35:61-79), and a negative association between parasite prevalence and achieved education, as predicted by Eppig et al. (: Proc R Soc B: Biol Sci 277:3801-3808), become negative and unreliable when reasonable controls are included in the model. We find support for all predictions derived from our hypothesis linking structural racism to both parasite prevalence and cultural outcomes. We conclude that best practices in biocultural modeling require examining more than one hypothesis, retaining

  4. Using hierarchical Bayesian binary probit models to analyze crash injury severity on high speed facilities with real-time traffic data.

    Science.gov (United States)

    Yu, Rongjie; Abdel-Aty, Mohamed

    2014-01-01

    Severe crashes cause serious social and economic loss, and because of this, reducing crash injury severity has become one of the key objectives of high speed facility (freeway and expressway) management. Traditional crash injury severity analysis utilized data mainly from crash reports concerning crash occurrence information, drivers' characteristics and roadway geometric related variables. In this study, real-time traffic and weather data were introduced to analyze crash injury severity. The space mean speeds captured by the Automatic Vehicle Identification (AVI) system on the two roadways were used as explanatory variables, and data from a mountainous freeway (I-70 in Colorado) and an urban expressway (State Road 408 in Orlando) were used to assess the consistency of the analysis results. Binary probit (BP) models were estimated to classify the non-severe (property damage only) crashes and severe (injury and fatality) crashes. First, Bayesian BP model results were compared to results from Maximum Likelihood Estimation BP models, and it was concluded that Bayesian inference was superior, with more significant variables. Then different levels of hierarchical Bayesian BP models were developed with random effects accounting for the unobserved heterogeneity at the segment level and the crash individual level, respectively. Modeling results from both studied locations demonstrate that large variations of speed prior to the crash occurrence would increase the likelihood of severe crash occurrence. Moreover, by considering unobserved heterogeneity in the Bayesian BP models, the model goodness-of-fit improved substantially. Finally, possible future applications of the model results and the hierarchical Bayesian probit models were discussed. Copyright © 2013 Elsevier Ltd. All rights reserved.
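
    A compact way to see the probit machinery is the Albert-Chib data-augmentation sampler for a plain Bayesian probit on simulated crash data; the hierarchical versions in the paper add segment- and crash-level random effects on top of this basic scheme. Variable names and priors below are illustrative assumptions.

```python
# Albert-Chib data augmentation for a basic Bayesian probit (severe vs. non-severe crash);
# the hierarchical extension with random effects is not shown. Data are simulated.
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(42)
n = 500
speed_var = rng.normal(size=n)                 # e.g. standardised pre-crash speed variation
X = np.column_stack([np.ones(n), speed_var])
beta_true = np.array([-1.0, 0.8])
y = (X @ beta_true + rng.normal(size=n) > 0).astype(int)   # 1 = severe crash

prior_prec = np.eye(2) / 10.0                  # N(0, 10 I) prior on coefficients
V = np.linalg.inv(X.T @ X + prior_prec)
beta = np.zeros(2)
draws = []
for it in range(3000):
    mu = X @ beta
    lo = np.where(y == 1, -mu, -np.inf)        # latent z > 0 when severe
    hi = np.where(y == 1, np.inf, -mu)         # latent z < 0 when non-severe
    z = truncnorm.rvs(lo, hi, loc=mu, scale=1.0)
    m = V @ (X.T @ z)                          # Gibbs update for beta | z
    beta = rng.multivariate_normal(m, V)
    draws.append(beta)

print(np.mean(draws[1000:], axis=0))           # should be close to beta_true
```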

  5. Construction of hierarchically porous metal-organic frameworks through linker labilization

    Science.gov (United States)

    Yuan, Shuai; Zou, Lanfang; Qin, Jun-Sheng; Li, Jialuo; Huang, Lan; Feng, Liang; Wang, Xuan; Bosch, Mathieu; Alsalme, Ali; Cagin, Tahir; Zhou, Hong-Cai

    2017-05-01

    A major goal of metal-organic framework (MOF) research is the expansion of pore size and volume. Although many approaches have been attempted to increase the pore size of MOF materials, it is still a challenge to construct MOFs with precisely customized pore apertures for specific applications. Herein, we present a new method, namely linker labilization, to increase MOF porosity and pore size, giving rise to hierarchical-pore architectures. Microporous MOFs with robust metal nodes and pro-labile linkers were initially synthesized. The mesopores were subsequently created as crystal defects through the splitting of a pro-labile linker and the removal of the linker fragments by acid treatment. We demonstrate that the linker labilization method can create controllable hierarchical porous structures in stable MOFs, which facilitates the diffusion and adsorption of guest molecules and improves the performance of MOFs in adsorption and catalysis.

  6. Hierarchical mesoporous/microporous carbon with graphitized frameworks for high-performance lithium-ion batteries

    Directory of Open Access Journals (Sweden)

    Yingying Lv

    2014-11-01

    Full Text Available A hierarchical meso-/micro-porous graphitized carbon with uniform mesopores and ordered micropores, graphitized frameworks, and an extra-high surface area of ∼2200 m2/g was successfully synthesized through a simple one-step chemical vapor deposition process. The commercial mesoporous zeolite Y was utilized as a meso-/micro-porous template, and the small-molecule methane was employed as the carbon precursor. The as-prepared hierarchical meso-/micro-porous carbons have homogeneously distributed mesopores that act as a host for the electrolyte and facilitate Li+ ion transport to the large-area micropores, resulting in a high reversible lithium-ion storage capacity of 1000 mA h/g and a high coulombic efficiency of 65% in the first cycle.

  7. Expanding the framework of the varieties of capitalism: Turkey as a hierarchical market economy

    Directory of Open Access Journals (Sweden)

    Jiyan Kıran

    2018-01-01

    Full Text Available This article both extends the debate on the varieties of capitalism theory beyond the existing literature to resolve the ambiguous position of the variety of capitalism found in Turkey and brings a novel approach to studies of the political economy of Turkey by adopting a firm-centred position using the varieties of capitalism framework. Based on a qualitative comparison with dependent market economies (DMEs), mixed market economies (MMEs) and hierarchical market economies (HMEs), this article claims that Turkey is a hierarchical market economy with four characteristic features that are also found in Latin American economies. These core features are the dominance of family-owned diversified business groups, state-regimented and weak industrial relations, low skills and the influence of MNCs.

  8. Titanium-Phosphonate-Based Metal-Organic Frameworks with Hierarchical Porosity for Enhanced Photocatalytic Hydrogen Evolution

    KAUST Repository

    Li, Hui

    2018-02-01

    Photocatalytic hydrogen production is crucial for the solar-to-chemical conversion process, and high-efficiency photocatalysts lie at the heart of this area. Herein, a new photocatalyst based on hierarchically mesoporous titanium-phosphonate-based metal-organic frameworks, featuring well-structured spheres, a periodic mesostructure and large secondary mesoporosity, is rationally designed with a complex of polyelectrolyte and cathodic surfactant serving as the template. The well-structured hierarchical porosity and homogeneously incorporated phosphonate groups favor mass transfer and strong optical absorption during the photocatalytic reactions. Correspondingly, the titanium phosphonates exhibit a significantly improved photocatalytic hydrogen evolution rate along with impressive stability. This work provides more insights into designing advanced photocatalysts for energy conversion and offers a tunable platform for the photoelectrochemical field.

  9. A Framework for a Decision Support System in a Hierarchical Extended Enterprise Decision Context

    Science.gov (United States)

    Boza, Andrés; Ortiz, Angel; Vicens, Eduardo; Poler, Raul

    Decision Support System (DSS) tools provide useful information to decision makers. In an Extended Enterprise, a new goal, changes in the current objectives or small changes in the extended enterprise configuration produce a necessary adjustment in its decision system. A DSS in this context must be flexible and agile, allowing easy and quick adaptation to the new context. This paper proposes to extend the Hierarchical Production Planning (HPP) structure to an Extended Enterprise decision-making context. In this way, a framework for DSS in the Extended Enterprise context is defined using components of HPP. Interoperability details have been reviewed to identify their impact on this framework. The proposed framework allows some interoperability barriers to be overcome, identifies and organizes components for a DSS in the Extended Enterprise context, and contributes to the definition of an architecture to be used in the design process of a flexible DSS in the Extended Enterprise context, one which can reuse components for future Extended Enterprise configurations.

  10. Hierarchical control framework for integrated coordination between distributed energy resources and demand response

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Di; Lian, Jianming; Sun, Yannan; Yang, Tao; Hansen, Jacob

    2017-09-01

    Demand response represents a significant but largely untapped resource that can greatly enhance the flexibility and reliability of power systems. In this paper, a hierarchical control framework is proposed to facilitate the integrated coordination between distributed energy resources and demand response. The proposed framework consists of coordination and device layers. In the coordination layer, various resource aggregations are optimally coordinated in a distributed manner to achieve the system-level objectives. In the device layer, individual resources are controlled in real time to follow the optimal power generation or consumption dispatched from the coordination layer. For the purpose of practical applications, a method is presented to determine the utility functions of controllable loads by taking into account the real-time load dynamics and the preferences of individual customers. The effectiveness of the proposed framework is validated by detailed simulation studies.

  11. A Bayesian Framework of Uncertainties Integration in 3D Geological Model

    Science.gov (United States)

    Liang, D.; Liu, X.

    2017-12-01

    3D geological models can describe complicated geological phenomena in an intuitive way, but their application may be limited by uncertain factors. Great progress has been made over the years, yet many studies decompose the uncertainties of a geological model and analyze them item by item from each source, ignoring the comprehensive impact of multi-source uncertainties. To evaluate this synthetical uncertainty, we choose probability distributions to quantify uncertainty and propose a Bayesian framework for uncertainty integration. With this framework, we integrate data errors, spatial randomness, and cognitive information into a posterior distribution to evaluate the synthetical uncertainty of a geological model. Uncertainties propagate and accumulate in the modeling process, and the gradual integration of multi-source uncertainty is a kind of simulation of this uncertainty propagation. Bayesian inference accomplishes uncertainty updating in the modeling process. The maximum entropy principle is effective for estimating the prior probability distribution, ensuring that the prior distribution satisfies the constraints supplied by the given information with minimum prejudice. In the end, we obtain a posterior distribution to evaluate the synthetical uncertainty of the geological model. This posterior distribution represents the synthetical impact of all the uncertain factors on the spatial structure of the geological model. The framework provides a solution for evaluating the synthetical impact of multi-source uncertainties on a geological model and a way to study the mechanism of uncertainty propagation in geological modeling.

  12. Intensity-based Bayesian framework for image reconstruction from sparse projection data

    International Nuclear Information System (INIS)

    Rashed, E.A.; Kudo, Hiroyuki

    2009-01-01

    This paper presents a Bayesian framework for iterative image reconstruction from projection data measured over a limited number of views. The classical Nyquist sampling rule yields the minimum number of projection views required for accurate reconstruction. However, challenges exist in many medical and industrial imaging applications in which the projection data is undersampled. Classical analytical reconstruction methods such as filtered backprojection (FBP) are not a good choice for use in such cases because the data undersampling in the angular range introduces aliasing and streak artifacts that degrade lesion detectability. In this paper, we propose a Bayesian framework for maximum likelihood-expectation maximization (ML-EM)-based iterative reconstruction methods that incorporates a priori knowledge obtained from expected intensity information. The proposed framework is based on the fact that, in tomographic imaging, it is often possible to expect a set of intensity values of the reconstructed object with relatively high accuracy. The image reconstruction cost function is modified to include the l1-norm distance to the a priori known information. The proposed method has the potential to regularize the solution to reduce artifacts without missing lesions that cannot be expected from the a priori information. Numerical studies showed a significant improvement in image quality and lesion detectability under the condition of highly undersampled projection data. (author)
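
    The baseline iterative scheme being modified is the multiplicative ML-EM update; a bare-bones version for a tiny, deliberately undersampled linear system is sketched below, with the paper's l1 intensity prior omitted and the system matrix invented.

```python
# Bare-bones ML-EM iteration for a tiny linear tomography problem (the paper further adds
# an l1 penalty toward a priori expected intensity levels, which is omitted here).
import numpy as np

rng = np.random.default_rng(5)
n_pix, n_rays = 16, 12                       # undersampled: fewer rays than pixels
A = rng.uniform(0, 1, size=(n_rays, n_pix))  # hypothetical system (projection) matrix
x_true = rng.choice([0.0, 1.0, 4.0], size=n_pix)   # object with a few known intensity levels
y = rng.poisson(A @ x_true)                  # noisy projection data

x = np.ones(n_pix)                           # positive initial estimate
sens = A.T @ np.ones(n_rays)                 # sensitivity image A^T 1
for _ in range(200):
    ratio = y / np.maximum(A @ x, 1e-12)     # measured / predicted projections
    x = x / sens * (A.T @ ratio)             # multiplicative ML-EM update

print(np.round(x, 2))
print(np.round(x_true, 2))
```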

  13. A discrete-time Bayesian network reliability modeling and analysis framework

    International Nuclear Information System (INIS)

    Boudali, H.; Dugan, J.B.

    2005-01-01

    Dependability tools are becoming indispensable for modeling and analyzing (critical) systems. However, the growing complexity of such systems calls for increasing sophistication of these tools. Dependability tools need to not only capture the complex dynamic behavior of the system components, but they must also be easy to use, intuitive, and computationally efficient. In general, current tools have a number of shortcomings including lack of modeling power, incapacity to efficiently handle general component failure distributions, and ineffectiveness in solving large models that exhibit complex dependencies between their components. We propose a novel reliability modeling and analysis framework based on the Bayesian network (BN) formalism. The overall approach is to investigate timed Bayesian networks and to find a suitable reliability framework for dynamic systems. We have applied our methodology to two example systems and preliminary results are promising. We have defined a discrete-time BN reliability formalism and demonstrated its capabilities from a modeling and analysis point of view. This research shows that a BN based reliability formalism is a powerful potential solution to modeling and analyzing various kinds of system component behaviors and interactions. Moreover, being based on the BN formalism, the framework is easy to use and intuitive for non-experts, and provides a basis for more advanced and useful analyses such as system diagnosis.
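
    The discrete-time idea, stripped to its simplest form, is to work with failure probabilities per time bin and combine components through the system logic; the hand-rolled example below (independent exponential lifetimes, AND/OR gates) is only meant to illustrate that notion, not the authors' BN formalism.

```python
# Toy illustration of discrete-time reliability: component failure times are discretised
# into time bins and combined through simple gate logic, assuming independence.
import numpy as np

T = 10                                        # number of discrete time intervals
t = np.arange(1, T + 1)

def discretised_cdf(rate):
    """P(component has failed by the end of interval t) for an exponential lifetime."""
    return 1.0 - np.exp(-rate * t)

F_a = discretised_cdf(0.10)                   # component A, failure rate 0.10 per interval
F_b = discretised_cdf(0.25)                   # component B, failure rate 0.25 per interval

# AND gate (parallel redundancy): system failed by t only if both components have failed.
F_and = F_a * F_b
# OR gate (series system): system failed by t if either component has failed.
F_or = 1.0 - (1.0 - F_a) * (1.0 - F_b)

for ti, fa, fo in zip(t, F_and, F_or):
    print(f"t={ti:2d}  P(parallel failed)={fa:.3f}  P(series failed)={fo:.3f}")
```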

  14. An Adaptive Data Collection Algorithm Based on a Bayesian Compressed Sensing Framework

    Directory of Open Access Journals (Sweden)

    Zhi Liu

    2014-05-01

    Full Text Available For Wireless Sensor Networks, energy efficiency is always a key consideration in system design. Compressed sensing is a new theory which has promising prospects in WSNs. However, how to construct a sparse projection matrix is a problem. In this paper, based on a Bayesian compressed sensing framework, a new adaptive algorithm which can integrate routing and data collection is proposed. By introducing new target node selection metrics, embedding the routing structure and maximizing the differential entropy for each collection round, an adaptive projection vector is constructed. Simulations show that compared to reference algorithms, the proposed algorithm can decrease computation complexity and improve energy efficiency.
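
    The entropy-maximising selection step can be sketched with a linear-Gaussian field model: at each round the node with the largest predictive variance is queried and the Gaussian posterior is updated with a rank-one correction. Sizes, the prior covariance and the noise level below are invented, and the routing constraints used in the paper are ignored.

```python
# Sketch of entropy-driven adaptive collection under a linear-Gaussian model (routing
# structure and the paper's specific projection construction are not reproduced).
import numpy as np

rng = np.random.default_rng(9)
n_nodes, noise_var = 30, 0.1
Psi = rng.normal(size=(n_nodes, n_nodes))        # hypothetical sparsifying basis
Sigma = Psi @ Psi.T + np.eye(n_nodes)            # prior covariance of the field at the nodes
x_true = rng.multivariate_normal(np.zeros(n_nodes), Sigma)

collected, readings = [], []
for round_ in range(8):
    # Query the uncollected node whose reading has the largest predictive variance
    # (equivalently, the largest differential entropy for a Gaussian).
    pred_var = np.diag(Sigma) + noise_var
    candidates = [i for i in range(n_nodes) if i not in collected]
    i_star = max(candidates, key=lambda i: pred_var[i])
    collected.append(i_star)
    readings.append(x_true[i_star] + rng.normal(0, noise_var ** 0.5))

    # Rank-one Gaussian posterior update after observing node i_star.
    k = Sigma[:, i_star]
    Sigma = Sigma - np.outer(k, k) / (Sigma[i_star, i_star] + noise_var)

print("collection order:", collected)
```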

  15. A Bayesian posterior predictive framework for weighting ensemble regional climate models

    Directory of Open Access Journals (Sweden)

    Y. Fan

    2017-06-01

    Full Text Available We present a novel Bayesian statistical approach to computing model weights in climate change projection ensembles in order to create probabilistic projections. The weight of each climate model is obtained by evaluating the current-day observed data under the posterior distribution admitted under that competing climate model. We use a linear model to describe the model output and observations. The approach accounts for uncertainty in model bias, trend and internal variability, including error in the observations used. Our framework is general, requires very little problem-specific input, and works well with default priors. We carry out cross-validation checks that confirm that the method produces the correct coverage.
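
    A crude stand-in for this weighting idea is to fit a simple linear relation between observations and each ensemble member and convert an approximate (BIC-based) marginal likelihood into normalised weights; the paper's full Bayesian treatment of bias, trend and internal variability is not reproduced here.

```python
# BIC-based approximation to likelihood weighting of ensemble members (illustrative only;
# the paper performs full Bayesian posterior-predictive weighting). Data are simulated.
import numpy as np

rng = np.random.default_rng(2)
n_years, n_models = 40, 4
obs = 0.02 * np.arange(n_years) + rng.normal(0, 0.1, n_years)      # observed anomaly
sims = [0.02 * np.arange(n_years) * s + rng.normal(0, 0.1, n_years)
        for s in (1.0, 0.5, 1.5, 0.1)]                              # four model outputs

log_ml = []
for sim in sims:
    X = np.column_stack([np.ones(n_years), sim])                    # obs ~ a + b * sim
    beta, *_ = np.linalg.lstsq(X, obs, rcond=None)
    rss = np.sum((obs - X @ beta) ** 2)
    k = X.shape[1] + 1                                              # coefficients + noise variance
    bic = n_years * np.log(rss / n_years) + k * np.log(n_years)
    log_ml.append(-0.5 * bic)                                       # approx. log marginal likelihood

log_ml = np.array(log_ml)
weights = np.exp(log_ml - log_ml.max())
weights /= weights.sum()
print("model weights:", np.round(weights, 3))
```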

  16. Planetary micro-rover operations on Mars using a Bayesian framework for inference and control

    Science.gov (United States)

    Post, Mark A.; Li, Junquan; Quine, Brendan M.

    2016-03-01

    With the recent progress toward the application of commercially-available hardware to small-scale space missions, it is now becoming feasible for groups of small, efficient robots based on low-power embedded hardware to perform simple tasks on other planets in the place of large-scale, heavy and expensive robots. In this paper, we describe design and programming of the Beaver micro-rover developed for Northern Light, a Canadian initiative to send a small lander and rover to Mars to study the Martian surface and subsurface. For a small, hardware-limited rover to handle an uncertain and mostly unknown environment without constant management by human operators, we use a Bayesian network of discrete random variables as an abstraction of expert knowledge about the rover and its environment, and inference operations for control. A framework for efficient construction and inference into a Bayesian network using only the C language and fixed-point mathematics on embedded hardware has been developed for the Beaver to make intelligent decisions with minimal sensor data. We study the performance of the Beaver as it probabilistically maps a simple outdoor environment with sensor models that include uncertainty. Results indicate that the Beaver and other small and simple robotic platforms can make use of a Bayesian network to make intelligent decisions in uncertain planetary environments.

  17. Climatic Models Ensemble-based Mid-21st Century Runoff Projections: A Bayesian Framework

    Science.gov (United States)

    Achieng, K. O.; Zhu, J.

    2017-12-01

    There are a number of North American Regional Climate Change Assessment Program (NARCCAP) climatic models that have been used to project surface runoff in the mid-21st century. Statistical model selection techniques are often used to select the model that best fits the data. However, model selection techniques often lead to different conclusions. In this study, ten models are averaged in a Bayesian paradigm to project runoff. Bayesian Model Averaging (BMA) is used to project runoff and to identify the effect of model uncertainty on future runoff projections. Baseflow separation - a two-parameter recursive digital filter, also called the Eckhardt filter - is used to separate USGS streamflow (total runoff) into two components: baseflow and surface runoff. We use this surface runoff as the a priori runoff when conducting BMA of runoff simulated from the ten RCM models. The primary objective of this study is to evaluate how well RCM multi-model ensembles simulate surface runoff in a Bayesian framework. Specifically, we investigate and discuss the following questions: How well does an ensemble of ten RCM models jointly simulate surface runoff by averaging over all the models using BMA, given a priori surface runoff? What are the effects of model uncertainty on surface runoff simulation?
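
    The Eckhardt filter mentioned above is a short recursive computation; the sketch below applies it to a synthetic hydrograph with typical default parameter values (alpha = 0.98, BFImax = 0.80), which are assumptions rather than the study's calibrated values.

```python
# Eckhardt two-parameter recursive baseflow filter applied to a synthetic streamflow series
# (parameter values are common defaults, not the study's calibrated values).
import numpy as np

def eckhardt_baseflow(Q, alpha=0.98, bfi_max=0.80):
    """Return baseflow series b with b[t] <= Q[t]; surface runoff is Q - b."""
    b = np.zeros_like(Q, dtype=float)
    b[0] = bfi_max * Q[0]
    for t in range(1, len(Q)):
        b[t] = ((1 - bfi_max) * alpha * b[t - 1] + (1 - alpha) * bfi_max * Q[t]) \
               / (1 - alpha * bfi_max)
        b[t] = min(b[t], Q[t])                      # baseflow cannot exceed total flow
    return b

rng = np.random.default_rng(4)
Q = 5 + np.convolve(rng.exponential(2, 120), np.ones(5) / 5, mode="same")  # toy hydrograph
baseflow = eckhardt_baseflow(Q)
surface_runoff = Q - baseflow                       # the a priori runoff used for the BMA step
print(f"mean BFI = {baseflow.mean() / Q.mean():.2f}")
```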

  18. A Bayesian approach to estimating variance components within a multivariate generalizability theory framework.

    Science.gov (United States)

    Jiang, Zhehan; Skorupski, William

    2017-12-12

    In many behavioral research areas, multivariate generalizability theory (mG theory) has typically been used to investigate the reliability of certain multidimensional assessments. However, traditional mG-theory estimation, namely the use of frequentist approaches, has limits, leading researchers to fail to take full advantage of the information that mG theory can offer regarding the reliability of measurements. Alternatively, Bayesian methods provide more information than frequentist approaches can offer. This article presents instructional guidelines on how to implement mG-theory analyses in a Bayesian framework; in particular, BUGS code is presented to fit commonly seen designs from mG theory, including single-facet designs, two-facet crossed designs, and two-facet nested designs. In addition to concrete examples that are closely related to the selected designs and the corresponding BUGS code, a simulated dataset is provided to demonstrate the utility and advantages of the Bayesian approach. This article is intended to serve as a tutorial reference for applied researchers and methodologists conducting mG-theory studies.

  19. Gating mechanisms of mechanosensitive channels of large conductance, I: a continuum mechanics-based hierarchical framework.

    Science.gov (United States)

    Chen, Xi; Cui, Qiang; Tang, Yuye; Yoo, Jejoong; Yethiraj, Arun

    2008-07-01

    A hierarchical simulation framework that integrates information from molecular dynamics (MD) simulations into a continuum model is established to study the mechanical response of mechanosensitive channel of large-conductance (MscL) using the finite element method (FEM). The proposed MD-decorated FEM (MDeFEM) approach is used to explore the detailed gating mechanisms of the MscL in Escherichia coli embedded in a palmitoyloleoylphosphatidylethanolamine lipid bilayer. In Part I of this study, the framework of MDeFEM is established. The transmembrane and cytoplasmic helices are taken to be elastic rods, the loops are modeled as springs, and the lipid bilayer is approximated by a three-layer sheet. The mechanical properties of the continuum components, as well as their interactions, are derived from molecular simulations based on atomic force fields. In addition, analytical closed-form continuum model and elastic network model are established to complement the MDeFEM approach and to capture the most essential features of gating. In Part II of this study, the detailed gating mechanisms of E. coli-MscL under various types of loading are presented and compared with experiments, structural model, and all-atom simulations, as well as the analytical models established in Part I. It is envisioned that such a hierarchical multiscale framework will find great value in the study of a variety of biological processes involving complex mechanical deformations such as muscle contraction and mechanotransduction.

  20. Marine habitat classification for ecosystem-based management: a proposed hierarchical framework.

    Science.gov (United States)

    Guarinello, Marisa L; Shumchenia, Emily J; King, John W

    2010-04-01

    Creating a habitat classification and mapping system for marine and coastal ecosystems is a daunting challenge due to the complex array of habitats that shift on various spatial and temporal scales. To meet this challenge, several countries have, or are developing, national classification systems and mapping protocols for marine habitats. To be effectively applied by scientists and managers it is essential that classification systems be comprehensive and incorporate pertinent physical, geological, biological, and anthropogenic habitat characteristics. Current systems tend to provide over-simplified conceptual structures that do not capture biological habitat complexity, marginalize anthropogenic features, and remain largely untested at finer scales. We propose a multi-scale hierarchical framework with a particular focus on finer scale habitat classification levels and conceptual schematics to guide habitat studies and management decisions. A case study using published data is included to compare the proposed framework with existing schemes. The example demonstrates how the proposed framework's inclusion of user-defined variables, a combined top-down and bottom-up approach, and multi-scale hierarchical organization can facilitate examination of marine habitats and inform management decisions.

  1. A Bayesian framework for estimating moment magnitude and its uncertainty from macroseismic intensity measures

    Science.gov (United States)

    Kawabata, E.; Main, I. G.; Naylor, M.; Chandler, R. E.

    2016-12-01

    In moderate to low seismicity areas such as the UK, earthquakes represent a small but not negligible risk to sensitive structures such as nuclear power plants. As a part of the safety case in the planning and regulation of such structures, seismic activity must first be monitored and quantified to form a catalogue of past events. In a low or moderate seismicity zone, most of our knowledge of the most significant events comes from macroseismic intensity measures from the pre-instrumental period (before 1900). These historical records must then be combined and calibrated with modern analogue and digitally-recorded instrumental data on a common source magnitude scale, the most useful of which is the moment magnitude. The result is a unified catalogue that can be used for probabilistic seismic hazard analysis. An isoseismal map involves a set of contours that enclose the areas at which the event was felt at particular intensity values or higher, called felt areas. It has been common practice to draw these contours by hand with varying degrees of subjectivity. Here, we demonstrate a Bayesian method for constructing such maps objectively from macroseismic intensity measures and their observed locations. It involves using mathematical expressions to represent concentric ellipses and estimating their optimal parameters and uncertainties in a Bayesian framework. Inferred fault orientations in the UK are predominantly vertical, so the elliptical assumption is reasonable at least to first order or as a null hypothesis. Relevant physical constraints are used as priors where available. The resulting posterior distributions are used to calculate felt area at a given intensity, as well as a probability density function for the inferred epicentre. We then describe another Bayesian approach for deriving moment magnitude from felt areas based on their relationship and known constraints such as the frequency-magnitude distribution. The use of Bayesian inference allows us to quantify

  2. A hierarchical framework approach for voice activity detection and speech enhancement.

    Science.gov (United States)

    Zhang, Yan; Tang, Zhen-min; Li, Yan-ping; Luo, Yang

    2014-01-01

    Accurate and effective voice activity detection (VAD) is a fundamental step for robust speech or speaker recognition. In this study, we proposed a hierarchical framework approach for VAD and speech enhancement. The modified Wiener filter (MWF) approach is utilized for noise reduction in the speech enhancement block. For the feature selection and voting block, several discriminating features were employed in a voting paradigm for the consideration of reliability and discriminative power. Effectiveness of the proposed approach is compared and evaluated to other VAD techniques by using two well-known databases, namely, TIMIT database and NOISEX-92 database. Experimental results show that the proposed method performs well under a variety of noisy conditions.

  3. A Bayesian least squares support vector machines based framework for fault diagnosis and failure prognosis

    Science.gov (United States)

    Khawaja, Taimoor Saleem

    A high-belief low-overhead Prognostics and Health Management (PHM) system is desired for online real-time monitoring of complex non-linear systems operating in a complex (possibly non-Gaussian) noise environment. This thesis presents a Bayesian Least Squares Support Vector Machine (LS-SVM) based framework for fault diagnosis and failure prognosis in nonlinear non-Gaussian systems. The methodology assumes the availability of real-time process measurements, definition of a set of fault indicators and the existence of empirical knowledge (or historical data) to characterize both nominal and abnormal operating conditions. An efficient yet powerful Least Squares Support Vector Machine (LS-SVM) algorithm, set within a Bayesian Inference framework, not only allows for the development of real-time algorithms for diagnosis and prognosis but also provides a solid theoretical framework to address key concepts related to classification for diagnosis and regression modeling for prognosis. SVM machines are founded on the principle of Structural Risk Minimization (SRM) which tends to find a good trade-off between low empirical risk and small capacity. The key features in SVM are the use of non-linear kernels, the absence of local minima, the sparseness of the solution and the capacity control obtained by optimizing the margin. The Bayesian Inference framework linked with LS-SVMs allows a probabilistic interpretation of the results for diagnosis and prognosis. Additional levels of inference provide the much coveted features of adaptability and tunability of the modeling parameters. The two main modules considered in this research are fault diagnosis and failure prognosis. With the goal of designing an efficient and reliable fault diagnosis scheme, a novel Anomaly Detector is suggested based on the LS-SVM machines. The proposed scheme uses only baseline data to construct a 1-class LS-SVM machine which, when presented with online data is able to distinguish between normal behavior
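
    The LS-SVM core reduces to solving one dense linear system for the dual variables; a minimal classifier with an RBF kernel and hand-picked hyperparameters is sketched below on invented two-class data. The Bayesian layers of the thesis (posterior class probabilities, hyperparameter inference) are not shown.

```python
# Minimal LS-SVM classifier trained by solving its dual linear system (RBF kernel,
# hand-picked hyperparameters, simulated health-monitoring data).
import numpy as np

rng = np.random.default_rng(8)

def rbf_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

# Toy data: two features, labels +1 (faulty) / -1 (nominal).
X = np.vstack([rng.normal(0, 1, (60, 2)), rng.normal(2.5, 1, (60, 2))])
y = np.hstack([-np.ones(60), np.ones(60)])

gamma, sigma = 10.0, 1.0
K = rbf_kernel(X, X, sigma)
Omega = (y[:, None] * y[None, :]) * K
n = len(y)
A = np.block([[np.zeros((1, 1)), y[None, :]],
              [y[:, None], Omega + np.eye(n) / gamma]])
rhs = np.hstack([0.0, np.ones(n)])
sol = np.linalg.solve(A, rhs)                  # bordered system gives bias and dual weights
b, alpha = sol[0], sol[1:]

def predict(X_new):
    return np.sign(rbf_kernel(X_new, X, sigma) @ (alpha * y) + b)

print("training accuracy:", np.mean(predict(X) == y))
```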

  4. Framework for network modularization and Bayesian network analysis to investigate the perturbed metabolic network.

    Science.gov (United States)

    Kim, Hyun Uk; Kim, Tae Yong; Lee, Sang Yup

    2011-01-01

    Genome-scale metabolic network models have contributed to elucidating biological phenomena and to predicting gene targets to engineer for biotechnological applications. With their increasing importance, precise characterization of these networks has also become crucial for a better understanding of cellular physiology. We herein introduce a framework for network modularization and Bayesian network analysis (FMB) to investigate an organism's metabolism under perturbation. FMB reveals the direction of influences among metabolic modules, in which reactions with similar or positively correlated flux variation patterns are clustered, in response to a specific perturbation using metabolic flux data. With metabolic flux data calculated by constraints-based flux analysis under both control and perturbation conditions, FMB, in essence, reveals the effects of specific perturbations on the biological system through network modularization and Bayesian network analysis at the metabolic-module level. As a demonstration, this framework was applied to a genetically perturbed Escherichia coli metabolism, namely an lpdA gene knockout mutant, using its genome-scale metabolic network model. Ultimately, it provides alternative scenarios of metabolic flux distributions in response to the perturbation, which are complementary to the data obtained from conventionally available genome-wide high-throughput techniques or metabolic flux analysis.

  5. A second gradient theoretical framework for hierarchical multiscale modeling of materials

    Energy Technology Data Exchange (ETDEWEB)

    Luscher, Darby J [Los Alamos National Laboratory]; Bronkhorst, Curt A [Los Alamos National Laboratory]; Mc Dowell, David L [GEORGIA TECH]

    2009-01-01

    A theoretical framework for the hierarchical multiscale modeling of the inelastic response of heterogeneous materials has been presented. Within this multiscale framework, the second gradient is used as a nonlocal kinematic link between the response of a material point at the coarse scale and the response of a neighborhood of material points at the fine scale. Kinematic consistency between these scales results in specific requirements for constraints on the fluctuation field. The wryness tensor serves as a second-order measure of strain. The nature of the second-order strain induces anti-symmetry in the first-order stress at the coarse scale. The multiscale ISV constitutive theory is couched in the coarse-scale intermediate configuration, from which an important new concept in scale transitions emerges, namely scale invariance of dissipation. Finally, a strategy for developing meaningful kinematic ISVs and the proper free energy functions and evolution kinetics is presented.

  6. Hierarchical Brokering with Feedback Control Framework in Mobile Device-Centric Clouds

    Directory of Open Access Journals (Sweden)

    Chao-Lieh Chen

    2016-01-01

    Full Text Available We propose a hierarchical brokering architecture (HiBA) and Mobile Multicloud Networking (MMCN) feedback control framework for mobile device-centric cloud (MDC2) computing. Exploiting the MMCN framework and RESTful web-based interconnection, each tier broker probes the resource state of its federation for control and management. Real-time and seamless services were developed. Case studies including intrafederation energy-aware balancing based on fuzzy feedback control and higher-tier load balancing further demonstrate how HiBA with MMCN relieves the embedding of algorithms when developing services. A theoretical performance model and real-world experiments both show that an MDC2 based on HiBA features better quality in terms of resource availability and network latency if it federates devices with enough resources distributed in the lower-tier hierarchy. The proposed HiBA realizes a development platform for MDC2 computing which is a feasible solution to User-Centric Networks (UCNs).

  7. Applied Bayesian hierarchical methods

    National Research Council Canada - National Science Library

    Congdon, P

    2010-01-01

    ... 1.2 Posterior Inference from Bayes Formula; 1.3 Markov Chain Monte Carlo Sampling in Relation to Monte Carlo Methods: Obtaining Posterior...

  8. Applied Bayesian hierarchical methods

    National Research Council Canada - National Science Library

    Congdon, P

    2010-01-01

    .... It also incorporates BayesX code, which is particularly useful in nonlinear regression. To demonstrate MCMC sampling from first principles, the author includes worked examples using the R package...

  9. Divisive normalization and neuronal oscillations in a single hierarchical framework of selective visual attention

    Directory of Open Access Journals (Sweden)

    Jorrit Steven Montijn

    2012-05-01

    Full Text Available In divisive normalization models of covert attention, spike rate modulations are commonly used as indicators of the effect of top-down attention. In addition, an increasing number of studies have shown that top-down attention also increases the synchronization of neuronal oscillations, particularly those in gamma-band frequencies (25 to 100 Hz). Although modulations of spike rate and synchronous oscillations are not mutually exclusive as mechanisms of attention, there has thus far been little effort to integrate these concepts into a single framework of attention. Here, we aim to provide such a unified framework by expanding the normalization model of attention with a time dimension, allowing the simulation of a recently reported backward progression of attentional effects along the visual cortical hierarchy. A simple hierarchical cascade of normalization models simulating different cortical areas, however, leads to signal degradation and a loss of discriminability over time. To negate this degradation and ensure stable neuronal stimulus representations, we incorporate oscillatory phase entrainment into our model, a mechanism previously proposed as the communication-through-coherence (CTC) hypothesis. Our analysis shows that divisive normalization and oscillation models can complement each other in a unified account of the neural mechanisms of selective visual attention. The resulting hierarchical normalization and oscillation (HNO) model reproduces several additional spatial and temporal aspects of attentional modulation.

  10. Construction of a Hierarchical Architecture of Covalent Organic Frameworks via a Postsynthetic Approach.

    Science.gov (United States)

    Zhang, Gen; Tsujimoto, Masahiko; Packwood, Daniel; Duong, Nghia Tuan; Nishiyama, Yusuke; Kadota, Kentaro; Kitagawa, Susumu; Horike, Satoshi

    2018-02-21

    Covalent organic frameworks (COFs) represent an emerging class of crystalline porous materials that are constructed by the assembly of organic building blocks linked via covalent bonds. Several strategies have been developed for the construction of new COF structures; however, a facile approach to fabricate hierarchical COF architectures with controlled domain structures remains a significant challenge, and has not yet been achieved. In this study, a dynamic covalent chemistry (DCC)-based postsynthetic approach was employed at the solid-liquid interface to construct such structures. Two-dimensional imine-bonded COFs having different aromatic groups were prepared, and a homogeneously mixed-linker structure and a heterogeneous core-shell hollow structure were fabricated by controlling the reactivity of the postsynthetic reactions. Solid-state nuclear magnetic resonance (NMR) spectroscopy and transmission electron microscopy (TEM) confirmed the structures. COFs prepared by a postsynthetic approach exhibit several functional advantages compared with their parent phases. Their Brunauer-Emmett-Teller (BET) surface areas are 2-fold greater than those of their parent phases because of the higher crystallinity. In addition, the hydrophilicity of the material and the stepwise adsorption isotherms of H2O vapor in the hierarchical frameworks were precisely controlled, which was feasible because the distribution of the domains of the two COFs could be controlled through the postsynthetic reaction. The approach opens new routes for constructing COF architectures with functionalities that are not possible in a single phase.

  11. Resolving the Antarctic contribution to sea-level rise: a hierarchical modelling framework.

    Science.gov (United States)

    Zammit-Mangion, Andrew; Rougier, Jonathan; Bamber, Jonathan; Schön, Nana

    2014-06-01

    Determining the Antarctic contribution to sea-level rise from observational data is a complex problem. The number of physical processes involved (such as ice dynamics and surface climate) exceeds the number of observables, some of which have very poor spatial definition. This has led, in general, to solutions that utilise strong prior assumptions or physically based deterministic models to simplify the problem. Here, we present a new approach for estimating the Antarctic contribution, which only incorporates descriptive aspects of the physically based models in the analysis and in a statistical manner. By combining physical insights with modern spatial statistical modelling techniques, we are able to provide probability distributions on all processes deemed to play a role in both the observed data and the contribution to sea-level rise. Specifically, we use stochastic partial differential equations and their relation to geostatistical fields to capture our physical understanding and employ a Gaussian Markov random field approach for efficient computation. The method, an instantiation of Bayesian hierarchical modelling, naturally incorporates uncertainty in order to reveal credible intervals on all estimated quantities. The estimated sea-level rise contribution using this approach corroborates those found using a statistically independent method. © 2013 The Authors. Environmetrics Published by John Wiley & Sons, Ltd.

  12. Integration of three structurally different stock assessment models in a Bayesian framework

    NARCIS (Netherlands)

    Kraak, S.B.M.; Bogaards, H.; Borges, L.; Machiels, M.A.M.; Keeken, van O.A.

    2007-01-01

    Bayesian statistics provide a method for expressing uncertainty of an unknown parameter value probabilistically (www.bayesian.org). Bayesian methods have been widely used in biological sciences, and recently in fisheries science applied to stock assessment. In our previous studies on Bayesian

  13. An Active Learning Framework for Hyperspectral Image Classification Using Hierarchical Segmentation

    Science.gov (United States)

    Zhang, Zhou; Pasolli, Edoardo; Crawford, Melba M.; Tilton, James C.

    2015-01-01

    Augmenting spectral data with spatial information for image classification has recently gained significant attention, as classification accuracy can often be improved by extracting spatial information from neighboring pixels. In this paper, we propose a new framework in which active learning (AL) and hierarchical segmentation (HSeg) are combined for spectral-spatial classification of hyperspectral images. The spatial information is extracted from a best segmentation obtained by pruning the HSeg tree using a new supervised strategy. The best segmentation is updated at each iteration of the AL process, thus taking advantage of informative labeled samples provided by the user. The proposed strategy incorporates spatial information in two ways: 1) concatenating the extracted spatial features and the original spectral features into a stacked vector and 2) extending the training set using a self-learning-based semi-supervised learning (SSL) approach. Finally, the two strategies are combined within an AL framework. The proposed framework is validated with two benchmark hyperspectral datasets. Higher classification accuracies are obtained by the proposed framework with respect to five other state-of-the-art spectral-spatial classification approaches. Moreover, the effectiveness of the proposed pruning strategy is also demonstrated relative to the approaches based on a fixed segmentation.

  14. Quantifying structural uncertainty on fault networks using a marked point process within a Bayesian framework

    Science.gov (United States)

    Aydin, Orhun; Caers, Jef Karel

    2017-08-01

    Faults are one of the building blocks for subsurface modeling studies. Incomplete observations of subsurface fault networks lead to uncertainty pertaining to location, geometry and existence of faults. In practice, gaps in incomplete fault network observations are filled based on tectonic knowledge and the interpreter's intuition pertaining to fault relationships. Modeling fault network uncertainty with realistic models that represent tectonic knowledge is still a challenge. Although methods that address specific sources of fault network uncertainty and complexities of fault modeling exist, a unifying framework is still lacking. In this paper, we propose a rigorous approach to quantify fault network uncertainty. Fault pattern and intensity information are expressed by means of a marked point process, the marked Strauss point process. Fault network information is constrained to fault surface observations (complete or partial) within a Bayesian framework. A structural prior model is defined to quantitatively express fault patterns, geometries and relationships within the Bayesian framework. Structural relationships between faults, in particular fault abutting relations, are represented with a level-set based approach. A Markov Chain Monte Carlo sampler is used to sample posterior fault network realizations that reflect tectonic knowledge and honor fault observations. We apply the methodology to a field study from Nankai Trough & Kumano Basin. The target for uncertainty quantification is a deep site with attenuated seismic data with only partially visible faults and many faults missing from the survey or interpretation. A structural prior model is built from shallow analog sites that are believed to have undergone similar tectonics compared to the site of study. Fault network uncertainty for the field is quantified with fault network realizations that are conditioned to structural rules, tectonic information and partially observed fault surfaces. We show the proposed

  15. Constructing a clinical decision-making framework for image-guided radiotherapy using a Bayesian Network

    International Nuclear Information System (INIS)

    Hargrave, C; Deegan, T; Gibbs, A; Poulsen, M; Moores, M; Harden, F; Mengersen, K

    2014-01-01

    A decision-making framework for image-guided radiotherapy (IGRT) is being developed using a Bayesian Network (BN) to graphically describe, and probabilistically quantify, the many interacting factors that are involved in this complex clinical process. Outputs of the BN will provide decision-support for radiation therapists to assist them to make correct inferences relating to the likelihood of treatment delivery accuracy for a given image-guided set-up correction. The framework is being developed as a dynamic object-oriented BN, allowing for complex modelling with specific subregions, as well as representation of the sequential decision-making and belief updating associated with IGRT. A prototype graphic structure for the BN was developed by analysing IGRT practices at a local radiotherapy department and incorporating results obtained from a literature review. Clinical stakeholders reviewed the BN to validate its structure. The BN consists of a sub-network for evaluating the accuracy of IGRT practices and technology. The directed acyclic graph (DAG) contains nodes and directional arcs representing the causal relationship between the many interacting factors such as tumour site and its associated critical organs, technology and technique, and inter-user variability. The BN was extended to support on-line and off-line decision-making with respect to treatment plan compliance. Following conceptualisation of the framework, the BN will be quantified. It is anticipated that the finalised decision-making framework will provide a foundation to develop better decision-support strategies and automated correction algorithms for IGRT.
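
    As a schematic of how a Bayesian network combines interacting factors into a probability of accurate delivery, the toy example below performs exact inference by enumeration on a three-node network. The nodes, states and conditional probability tables are invented placeholders, not the clinical IGRT model described above.

```python
# Toy illustration of Bayesian-network inference by enumeration.
# The nodes, states and conditional probability tables below are invented
# placeholders, not the IGRT model described in the abstract.
from itertools import product

# P(image_quality), P(user_variability), P(accurate | image_quality, user_variability)
p_img = {"good": 0.8, "poor": 0.2}
p_usr = {"low": 0.7, "high": 0.3}
p_acc = {("good", "low"): 0.97, ("good", "high"): 0.90,
         ("poor", "low"): 0.80, ("poor", "high"): 0.60}

def posterior_accuracy(observed_img):
    """P(accurate delivery | observed image quality), marginalising user variability."""
    return sum(p_usr[u] * p_acc[(observed_img, u)] for u in p_usr)

def posterior_img_given_inaccurate():
    """P(image quality | delivery inaccurate), by full enumeration and Bayes' rule."""
    joint = {}
    for img, usr in product(p_img, p_usr):
        joint[(img, usr)] = p_img[img] * p_usr[usr] * (1.0 - p_acc[(img, usr)])
    z = sum(joint.values())
    return {img: sum(v for (i, _), v in joint.items() if i == img) / z for img in p_img}

print(posterior_accuracy("poor"))          # belief in accurate delivery given poor imaging
print(posterior_img_given_inaccurate())    # diagnostic reasoning from an observed failure
```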

  16. An efficient Bayesian inference framework for coalescent-based nonparametric phylodynamics.

    Science.gov (United States)

    Lan, Shiwei; Palacios, Julia A; Karcher, Michael; Minin, Vladimir N; Shahbaba, Babak

    2015-10-15

    The field of phylodynamics focuses on the problem of reconstructing population size dynamics over time using current genetic samples taken from the population of interest. This technique has been extensively used in many areas of biology but is particularly useful for studying the spread of quickly evolving infectious disease agents, e.g. influenza virus. Phylodynamic inference uses a coalescent model that defines a probability density for the genealogy of randomly sampled individuals from the population. When we assume that such a genealogy is known, the coalescent model, equipped with a Gaussian process prior on the population size trajectory, allows for nonparametric Bayesian estimation of population size dynamics. Although this approach is quite powerful, large datasets collected during infectious disease surveillance challenge the state-of-the-art of Bayesian phylodynamics and demand inferential methods with relatively low computational cost. To satisfy this demand, we provide a computationally efficient Bayesian inference framework based on Hamiltonian Monte Carlo for coalescent process models. Moreover, we show that by splitting the Hamiltonian function, we can further improve the efficiency of this approach. Using several simulated and real datasets, we show that our method provides accurate estimates of population size dynamics and is substantially faster than alternative methods based on the elliptical slice sampler and the Metropolis-adjusted Langevin algorithm. The R code for all simulation studies and real data analyses conducted in this article is publicly available at http://www.ics.uci.edu/∼slan/lanzi/CODES.html and in the R package phylodyn available at https://github.com/mdkarcher/phylodyn. S.Lan@warwick.ac.uk or babaks@uci.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
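
    For readers unfamiliar with Hamiltonian Monte Carlo, the sketch below shows the basic leapfrog-plus-accept/reject step on a toy 2-D Gaussian target. It is only meant to convey the kind of sampler this framework builds on; the coalescent likelihood, the Gaussian process prior and the split-Hamiltonian trick from the paper are not reproduced, and the step size, trajectory length and target are assumed values.

```python
# Minimal Hamiltonian Monte Carlo sketch (leapfrog + accept/reject) for a toy
# 2-D Gaussian target. Step size, path length and the target itself are
# illustrative choices; the split-Hamiltonian scheme from the paper is not shown.
import numpy as np

rng = np.random.default_rng(1)
Sigma_inv = np.linalg.inv(np.array([[1.0, 0.8], [0.8, 1.0]]))

def neg_log_post(q):            # potential energy U(q)
    return 0.5 * q @ Sigma_inv @ q

def grad_neg_log_post(q):
    return Sigma_inv @ q

def hmc_step(q, eps=0.15, n_leapfrog=20):
    p = rng.normal(size=q.size)                    # sample momentum
    q_new, p_new = q.copy(), p.copy()
    p_new -= 0.5 * eps * grad_neg_log_post(q_new)  # half step for momentum
    for _ in range(n_leapfrog):
        q_new += eps * p_new                       # full step for position
        p_new -= eps * grad_neg_log_post(q_new)    # full step for momentum
    p_new += 0.5 * eps * grad_neg_log_post(q_new)  # undo the extra half step
    h_old = neg_log_post(q) + 0.5 * p @ p
    h_new = neg_log_post(q_new) + 0.5 * p_new @ p_new
    return q_new if np.log(rng.uniform()) < h_old - h_new else q

q = np.zeros(2)
samples = []
for _ in range(2000):
    q = hmc_step(q)
    samples.append(q.copy())
print(np.cov(np.array(samples).T))   # should approach the target covariance
```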

  17. A unified framework for Bayesian denoising for several medical and biological imaging modalities.

    Science.gov (United States)

    Sanches, João M; Nascimento, Jacinto C; Marques, Jorge S

    2007-01-01

    Multiplicative noise is often present in several medical and biological imaging modalities, such as MRI, Ultrasound, PET/SPECT and Fluorescence Microscopy. Removing noise while preserving details is not a trivial task. Bayesian algorithms have been used to tackle this problem. They succeed in this task, but their computational burden grows with image dimensionality. Therefore, a significant effort has been made to address this tradeoff, i.e., to develop fast and reliable algorithms that remove noise without distorting relevant clinical information. This paper provides a new unified framework for Bayesian denoising of images corrupted with additive and multiplicative noise. It deals with additive white Gaussian noise and with multiplicative noise described by Poisson and Rayleigh distributions, respectively. The proposed algorithm is based on the maximum a posteriori (MAP) criterion, and edge-preserving priors are used to avoid distortion of the relevant image details. The denoising task is performed by an iterative scheme based on a Sylvester/Lyapunov equation. This approach allows the use of fast and efficient algorithms from the control theory literature to solve the Sylvester/Lyapunov equation. Experimental results with synthetic and real data demonstrate the performance of the proposed technique, and competitive results are achieved when compared with state-of-the-art methods.
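
    The role of the Sylvester equation can be seen in the simplest special case: additive Gaussian noise with a quadratic smoothness prior, where the MAP normal equations take the form (I + lam*L_r) X + X (lam*L_c) = Y and can be solved directly with scipy. The sketch below covers only that case; the Poisson and Rayleigh multiplicative models and the edge-preserving priors of the paper are not reproduced, and the test image and lam are assumptions.

```python
# Sketch of MAP denoising in the additive-Gaussian case with a quadratic
# smoothness prior, where the normal equations take the Sylvester form
# (I + lam*L_r) X + X (lam*L_c) = Y and can be solved directly.
# The multiplicative (Poisson/Rayleigh) models and the edge-preserving prior
# of the paper are not reproduced here; lam and the test image are assumptions.
import numpy as np
from scipy.linalg import solve_sylvester

rng = np.random.default_rng(2)
clean = np.zeros((64, 64))
clean[16:48, 16:48] = 1.0                      # simple synthetic "phantom"
noisy = clean + 0.3 * rng.normal(size=clean.shape)

def diff_matrix(k):
    """(k-1) x k first-difference operator."""
    D = np.zeros((k - 1, k))
    D[np.arange(k - 1), np.arange(k - 1)] = -1.0
    D[np.arange(k - 1), np.arange(1, k)] = 1.0
    return D

lam = 2.0                                      # prior strength (assumed)
Dr = diff_matrix(noisy.shape[0])
Dc = diff_matrix(noisy.shape[1])
A = np.eye(noisy.shape[0]) + lam * Dr.T @ Dr   # acts on rows
B = lam * Dc.T @ Dc                            # acts on columns
X_map = solve_sylvester(A, B, noisy)           # MAP estimate under this model
print(float(np.mean((X_map - clean) ** 2)), float(np.mean((noisy - clean) ** 2)))
```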

  18. A Hierarchical Bayesian Approach for Combining Pharmacokinetic/Pharmacodynamic Modeling and Phase IIa Trial Design in Orphan Drugs: Treating Adrenoleukodystrophy with Lorenzo’s Oil

    Science.gov (United States)

    Basu, Cynthia; Ahmed, Mariam A.; Kartha, Reena V.; Brundage, Richard C.; Raymond, Gerald V.; Cloyd, James C.; Carlin, Bradley P.

    2017-01-01

    X-linked adrenoleukodystrophy (X-ALD) is a rare, progressive and typically fatal neurodegenerative disease. Lorenzo’s Oil (LO) is one of the few X-ALD treatments available, but little has been done to establish its clinical efficacy or indications for its use. In this paper, we analyze data on 116 male asymptomatic pediatric patients who were administered LO. We offer a hierarchical Bayesian statistical approach to understanding LO pharmacokinetics (PK) and pharmacodynamics (PD) resulting from an accumulation of very long chain fatty acids. We experiment with individual- and observational-level errors, various choices of prior distributions, and deal with the limitation of having just one observation per administration of the drug, as opposed to the more usual multiple observations per administration. We link LO dose to the plasma erucic acid concentrations by PK modeling, and then link this concentration to a biomarker (C26, a very long chain fatty acid) by PD modeling. Next, we design a Bayesian Phase IIa study to estimate precisely what improvements in the biomarker can arise from various LO doses, while simultaneously modeling a binary toxicity endpoint. Our Bayesian adaptive algorithm emerges as reasonably robust and efficient while still retaining good classical (frequentist) operating characteristics. Future work looks toward using the results of this trial to design a Phase III study linking LO dose to actual improvements in health status, as measured by the appearance of brain lesions observed via magnetic resonance imaging. PMID:27547896

  19. Bayesian Framework Approach for Prognostic Studies in Electrolytic Capacitor under Thermal Overstress Conditions

    Science.gov (United States)

    Kulkarni, Chetan S.; Celaya, Jose R.; Goebel, Kai; Biswas, Gautam

    2012-01-01

    Electrolytic capacitors are used in several applications ranging from power supplies for safety critical avionics equipment to power drivers for electro-mechanical actuators. Past experience shows that capacitors tend to degrade and fail faster when subjected to high electrical or thermal stress conditions during operation. This makes them good candidates for prognostics and health management. Model-based prognostics captures system knowledge in the form of physics-based models of components in order to obtain accurate predictions of end of life based on their current state of health and their anticipated future use and operational conditions. The focus of this paper is on deriving first-principles degradation models for thermal stress conditions and implementing a Bayesian framework for making remaining useful life predictions. Data collected from simultaneous experiments are used to validate the models. Our overall goal is to derive accurate models of capacitor degradation and use them to predict remaining useful life in DC-DC converters.

  20. Toward an Adaptive Learning System Framework: Using Bayesian Network to Manage Learner Model

    Directory of Open Access Journals (Sweden)

    Viet Anh Nguyen

    2012-12-01

    Full Text Available This paper presents a new approach to managing learner modeling in an adaptive learning system framework. It considers developing the basic components of an adaptive learning system such as the learner model, the course content model and the adaptation engine. We use the overlay model and a Bayesian network to evaluate learners' knowledge. In addition, we also propose a new content modeling method as well as an adaptation engine to generate an adaptive course based on the learner's knowledge. Based on this approach, we developed an adaptive learning system named ACGS-II, which teaches students how to design an Entity Relationship model in a database system course. Empirical testing results for students who used the application indicate that our proposed model is very helpful as a guideline for developing adaptive learning systems that meet learners' demands.

  1. Bayesian hierarchical model for transcriptional module discovery by jointly modeling gene expression and ChIP-chip data

    Directory of Open Access Journals (Sweden)

    Sivaganesan Siva

    2007-08-01

    Full Text Available Abstract Background Transcriptional modules (TM) consist of groups of co-regulated genes and transcription factors (TF) regulating their expression. Two high-throughput (HT) experimental technologies, gene expression microarrays and Chromatin Immuno-Precipitation on Chip (ChIP-chip), are capable of producing data informative about expression regulatory mechanisms on a genome scale. The optimal approach to joint modeling of data generated by these two complementary biological assays, with the goal of identifying and characterizing TMs, is an important open problem in computational biomedicine. Results We developed and validated a novel probabilistic model and related computational procedure for identifying TMs by jointly modeling gene expression and ChIP-chip binding data. We demonstrate an improved functional coherence of the TMs produced by the new method when compared to either analyzing expression or ChIP-chip data separately or to alternative approaches for joint analysis. We also demonstrate the ability of the new algorithm to identify novel regulatory relationships not revealed by ChIP-chip data alone. The new computational procedure can be used in more or less the same way as one would use simple hierarchical clustering without performing any special transformation of data prior to the analysis. The R and C source code for implementing our algorithm is incorporated within the R package gimmR which is freely available at http://eh3.uc.edu/gimm. Conclusion Our results indicate that, whenever available, ChIP-chip and expression data should be analyzed within the unified probabilistic modeling framework, which will likely result in improved clusters of co-regulated genes and improved ability to detect meaningful regulatory relationships. Given the good statistical properties and the ease of use, the new computational procedure offers a worthy new tool for reconstructing transcriptional regulatory networks.

  2. Multifaceted Modularity: A Key for Stepwise Building of Hierarchical Complexity in Actinide Metal–Organic Frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Dolgopolova, Ekaterina A. [Department; Ejegbavwo, Otega A. [Department; Martin, Corey R. [Department; Smith, Mark D. [Department; Setyawan, Wahyu [Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Karakalos, Stavros G. [College; Henager, Charles H. [Pacific Northwest National Laboratory, Richland, Washington 99352, United States; zur Loye, Hans-Conrad [Department; Shustova, Natalia B. [Department

    2017-11-07

    The growing need for efficient nuclear waste management is a driving force for the development of alternative architectures and for fundamental understanding of the mechanisms involved in actinide integration inside extended structures. In this manuscript, metal-organic frameworks (MOFs) were investigated as a model system for engineering radionuclide-containing materials through utilization of unprecedented MOF modularity, which cannot be replicated in any other type of material. Through the implementation of recent synthetic advances in the MOF field, hierarchical complexity of An-materials was built stepwise, which was only feasible due to preparation of the first examples of actinide-based frameworks with “unsaturated” metal nodes. The first successful attempts of solid-state metathesis and metal node extension in An-MOFs are reported, and the results of the former approach revealed drastic differences in the chemical behavior of extended structures versus molecular species. Successful utilization of MOF modularity also allowed us to structurally characterize the first example of bimetallic An-An nodes. To the best of our knowledge, through the combination of solid-state metathesis, guest incorporation, and capping linker installation, we were able to achieve the highest Th wt% in mono- and bi-actinide frameworks with minimal structural density. Overall, the combination of a multistep synthetic approach with homogeneous actinide distribution and moderate solvothermal conditions could make MOFs an exceptionally powerful tool to address fundamental questions about the chemical behavior of An-based extended structures, and therefore shed light on possible optimization of nuclear waste administration.

  3. A process-based hierarchical framework for monitoring glaciated alpine headwaters

    Science.gov (United States)

    Weekes, Anne A.; Torgersen, Christian E.; Montgomery, David R.; Woodward, Andrea; Bolton, Susan M.

    2012-01-01

    Recent studies have demonstrated the geomorphic complexity and wide range of hydrologic regimes found in alpine headwater channels that provide complex habitats for aquatic taxa. These geohydrologic elements are fundamental to a better understanding of patterns in species assemblages and indicator taxa, and are necessary for aquatic monitoring protocols that aim to track changes in physical conditions. Complex physical variables shape many biological and ecological traits, including life history strategies, but these mechanisms can only be understood if critical physical variables are adequately represented within the sampling framework. To better align sampling design protocols with current geohydrologic knowledge, we present a conceptual framework that incorporates regional-scale conditions, basin-scale longitudinal profiles, valley-scale glacial macroform structure, valley segment-scale (i.e., colluvial, alluvial, and bedrock), and reach-scale channel types. At the valley segment and reach scales, these hierarchical levels are associated with differences in streamflow and sediment regime, water source contribution and water temperature. Examples of linked physical-ecological hypotheses placed in a landscape context and a case study using the proposed framework are presented to demonstrate the usefulness of this approach for monitoring complex temporal and spatial patterns and processes in glaciated basins. This approach is meant to aid comparisons between mountain regions on a global scale and to improve management of potentially endangered alpine species affected by climate change and other stressors.

  4. A Hierarchical Framework Approach for Voice Activity Detection and Speech Enhancement

    Directory of Open Access Journals (Sweden)

    Yan Zhang

    2014-01-01

    Full Text Available Accurate and effective voice activity detection (VAD) is a fundamental step for robust speech or speaker recognition. In this study, we proposed a hierarchical framework approach for VAD and speech enhancement. The modified Wiener filter (MWF) approach is utilized for noise reduction in the speech enhancement block. For the feature selection and voting block, several discriminating features were employed in a voting paradigm, selected for their reliability and discriminative power. The effectiveness of the proposed approach is evaluated and compared with other VAD techniques using two well-known databases, namely the TIMIT and NOISEX-92 databases. Experimental results show that the proposed method performs well under a variety of noisy conditions.

  5. Social Influence on Information Technology Adoption and Sustained Use in Healthcare: A Hierarchical Bayesian Learning Method Analysis

    Science.gov (United States)

    Hao, Haijing

    2013-01-01

    Information technology adoption and diffusion is currently a significant challenge in the healthcare delivery setting. This thesis includes three papers that explore social influence on information technology adoption and sustained use in the healthcare delivery environment using conventional regression models and novel hierarchical Bayesian…

  6. A Hierarchical and Dynamic Seascape Framework for Scaling and Comparing Ocean Biodiversity Observations

    Science.gov (United States)

    Kavanaugh, M.; Muller-Karger, F. E.; Montes, E.; Santora, J. A.; Chavez, F.; Messié, M.; Doney, S. C.

    2016-02-01

    The pelagic ocean is a complex system in which physical, chemical and biological processes interact to shape patterns on multiple spatial and temporal scales and levels of ecological organization. Monitoring and management of marine seascapes must consider a hierarchical and dynamic mosaic, where the boundaries, extent, and location of features change with time. As part of a Marine Biodiversity Observing Network demonstration project, we conducted a multiscale classification of dynamic coastal seascapes in the northeastern Pacific and Gulf of Mexico using multivariate satellite and modeled data. Synoptic patterns were validated using mooring and ship-based observations that spanned multiple trophic levels and were collected as part of several long-term monitoring programs, including the Monterey Bay and Florida Keys National Marine Sanctuaries. Seascape extent and habitat diversity varied as a function of both seasonal and interannual forcing. We discuss the patterns of in situ observations in the context of seascape dynamics and the effect on rarefaction, spatial patchiness, and tracking and comparing ecosystems through time. A seascape framework presents an effective means to translate local biodiversity measurements to broader spatiotemporal scales, scales relevant for modeling the effects of global change and enabling whole-ecosystem management in the dynamic ocean.

  7. A Hierarchical Feature and Sample Selection Framework and Its Application for Alzheimer’s Disease Diagnosis

    Science.gov (United States)

    An, Le; Adeli, Ehsan; Liu, Mingxia; Zhang, Jun; Lee, Seong-Whan; Shen, Dinggang

    2017-03-01

    Classification is one of the most important tasks in machine learning. Due to feature redundancy or outliers in samples, using all available data for training a classifier may be suboptimal. For example, Alzheimer’s disease (AD) is correlated with certain brain regions or single nucleotide polymorphisms (SNPs), and identification of relevant features is critical for computer-aided diagnosis. Many existing methods first select features from structural magnetic resonance imaging (MRI) or SNPs and then use those features to build the classifier. However, in the presence of many redundant features, the most discriminative features are difficult to identify in a single step. Thus, we formulate a hierarchical feature and sample selection framework to gradually select informative features and discard ambiguous samples in multiple steps for improved classifier learning. To positively guide the data manifold preservation process, we utilize both labeled and unlabeled data during training, making our method semi-supervised. For validation, we conduct experiments on AD diagnosis by selecting mutually informative features from both MRI and SNP data, and using the most discriminative samples for training. The superior classification results demonstrate the effectiveness of our approach compared with rival methods.

  8. A framework for Bayesian nonparametric inference for causal effects of mediation.

    Science.gov (United States)

    Kim, Chanmin; Daniels, Michael J; Marcus, Bess H; Roy, Jason A

    2017-06-01

    We propose a Bayesian non-parametric (BNP) framework for estimating the causal effects of mediation, namely the natural direct and indirect effects. The strategy is to do this in two parts. Part 1 is a flexible model (using BNP) for the observed data distribution. Part 2 is a set of uncheckable assumptions with sensitivity parameters that, in conjunction with Part 1, allows identification and estimation of the causal parameters and allows for uncertainty about these assumptions via priors on the sensitivity parameters. For Part 1, we specify a Dirichlet process mixture of multivariate normals as a prior on the joint distribution of the outcome, mediator, and covariates. This approach allows us to obtain a (simple) closed form of each marginal distribution. For Part 2, we consider two sets of assumptions: (a) the standard sequential ignorability (Imai et al., 2010) and (b) a weakened set of the conditional independence type assumptions introduced in Daniels et al. (2012), and we propose sensitivity analyses for both. We use this approach to assess mediation in a physical activity promotion trial. © 2016, The International Biometric Society.
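
    The Part 1 ingredient, a Dirichlet process mixture of normals, can be sketched with a truncated stick-breaking construction. The code below draws from such a mixture in the univariate case only; the concentration parameter, truncation level and base measure are assumed, and the multivariate outcome/mediator model and the causal Part 2 machinery are not shown.

```python
# Minimal truncated stick-breaking construction of a Dirichlet process mixture
# of (univariate) normals, illustrating the flexible Part-1 data model.
# alpha, the truncation level K and the base measure are assumed values;
# the multivariate mediator/outcome model and the causal Part 2 are not shown.
import numpy as np

rng = np.random.default_rng(3)
alpha, K = 1.0, 50                      # DP concentration and truncation level

# Stick-breaking weights: w_k = v_k * prod_{j<k} (1 - v_j), with v_k ~ Beta(1, alpha)
v = rng.beta(1.0, alpha, size=K)
w = v * np.cumprod(np.concatenate(([1.0], 1.0 - v[:-1])))

# Cluster-specific parameters drawn from a simple base measure
mu = rng.normal(0.0, 3.0, size=K)
sigma = 1.0 / np.sqrt(rng.gamma(2.0, 1.0, size=K))

# Draw observations from the induced mixture
z = rng.choice(K, size=500, p=w / w.sum())      # renormalise the truncated weights
y = rng.normal(mu[z], sigma[z])

# Evaluate the (truncated) mixture density on a grid
grid = np.linspace(-8, 8, 200)
dens = sum(wk * np.exp(-0.5 * ((grid - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
           for wk, m, s in zip(w / w.sum(), mu, sigma))
print(y[:5], dens.max())
```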

  9. A Bayesian Framework for Multiple Trait Colocalization from Summary Association Statistics.

    Science.gov (United States)

    Giambartolomei, Claudia; Zhenli Liu, Jimmy; Zhang, Wen; Hauberg, Mads; Shi, Huwenbo; Boocock, James; Pickrell, Joe; Jaffe, Andrew E; Pasaniuc, Bogdan; Roussos, Panos

    2018-03-19

    Most genetic variants implicated in complex diseases by genome-wide association studies (GWAS) are non-coding, making it challenging to understand the causative genes involved in disease. Integrating external information such as quantitative trait locus (QTL) mapping of molecular traits (e.g., expression, methylation) is a powerful approach to identify the subset of GWAS signals explained by regulatory effects. In particular, expression QTLs (eQTLs) help pinpoint the responsible gene among the GWAS regions that harbor many genes, while methylation QTLs (mQTLs) help identify the epigenetic mechanisms that impact gene expression which in turn affect disease risk. In this work we propose multiple-trait-coloc (moloc), a Bayesian statistical framework that integrates GWAS summary data with multiple molecular QTL data to identify regulatory effects at GWAS risk loci. We applied moloc to schizophrenia (SCZ) and eQTL/mQTL data derived from human brain tissue and identified 52 candidate genes that influence SCZ through methylation. Our method can be applied to any GWAS and relevant functional data to help prioritize disease associated genes. moloc is available for download as an R package (https://github.com/clagiamba/moloc). We also developed a web site to visualize the biological findings (icahn.mssm.edu/moloc). The browser allows searches by gene, methylation probe, and scenario of interest. claudia.giambartolomei@gmail.com. Supplementary data are available at Bioinformatics online.

  10. A Bayesian framework for human body pose tracking from depth image sequences.

    Science.gov (United States)

    Zhu, Youding; Fujimura, Kikuo

    2010-01-01

    This paper addresses the problem of accurate and robust tracking of 3D human body pose from depth image sequences. Recovering the large number of degrees of freedom in human body movements from a depth image sequence is challenging due to the need to resolve the depth ambiguity caused by self-occlusions and the difficulty of recovering from tracking failure. Human body poses could be estimated through model fitting using dense correspondences between depth data and an articulated human model (the local optimization method). Although it usually achieves high accuracy due to dense correspondences, it may fail to recover from tracking failure. Alternatively, human pose may be reconstructed by detecting and tracking human body anatomical landmarks (key-points) based on low-level depth image analysis. While this key-point based method is robust and recovers from tracking failure, its pose estimation accuracy depends solely on the image-based localization accuracy of key-points. To address these limitations, we present a flexible Bayesian framework for integrating pose estimation results obtained by the key-point based and local optimization methods. Experimental results are shown and a performance comparison is presented to demonstrate the effectiveness of the proposed approach.
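
    The integration step can be pictured, in a highly simplified Gaussian setting, as precision-weighted fusion of two estimates of the same joint position, one from the key-point detector and one from local model fitting. The noise covariances below are invented; the paper's full tracker, with temporal dynamics and failure recovery, is not reproduced.

```python
# Schematic precision-weighted fusion of two noisy joint-position estimates,
# e.g. one from a key-point detector and one from local model fitting.
# The Gaussian noise levels are invented; the paper's full Bayesian tracker
# (with temporal dynamics and failure recovery) is not reproduced here.
import numpy as np

def fuse(mean_a, cov_a, mean_b, cov_b):
    """Product of two Gaussian estimates of the same quantity."""
    prec_a, prec_b = np.linalg.inv(cov_a), np.linalg.inv(cov_b)
    cov = np.linalg.inv(prec_a + prec_b)
    mean = cov @ (prec_a @ mean_a + prec_b @ mean_b)
    return mean, cov

keypoint_est = np.array([0.42, 1.10, 2.35])             # robust but coarse (metres)
keypoint_cov = np.diag([0.04, 0.04, 0.09])
fit_est = np.array([0.40, 1.15, 2.60])                  # precise but can drift
fit_cov = np.diag([0.01, 0.01, 0.25])                   # large depth uncertainty here

fused_mean, fused_cov = fuse(keypoint_est, keypoint_cov, fit_est, fit_cov)
print(fused_mean)          # leans toward the more certain source per coordinate
print(np.diag(fused_cov))  # fused variance is smaller than either input
```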

  11. A Bayesian Framework That Integrates Heterogeneous Data for Inferring Gene Regulatory Networks

    Energy Technology Data Exchange (ETDEWEB)

    Santra, Tapesh, E-mail: tapesh.santra@ucd.ie [Systems Biology Ireland, University College Dublin, Dublin (Ireland)

    2014-05-20

    Reconstruction of gene regulatory networks (GRNs) from experimental data is a fundamental challenge in systems biology. A number of computational approaches have been developed to infer GRNs from mRNA expression profiles. However, expression profiles alone are proving to be insufficient for inferring GRN topologies with reasonable accuracy. Recently, it has been shown that integration of external data sources (such as gene and protein sequence information, gene ontology data, protein–protein interactions) with mRNA expression profiles may increase the reliability of the inference process. Here, I propose a new approach that incorporates transcription factor binding sites (TFBS) and physical protein interactions (PPI) among transcription factors (TFs) in a Bayesian variable selection (BVS) algorithm which can infer GRNs from mRNA expression profiles subjected to genetic perturbations. Using real experimental data, I show that the integration of TFBS and PPI data with mRNA expression profiles leads to significantly more accurate networks than those inferred from expression profiles alone. Additionally, the performance of the proposed algorithm is compared with a series of least absolute shrinkage and selection operator (LASSO) regression-based network inference methods that can also incorporate prior knowledge in the inference framework. The results of this comparison suggest that BVS can outperform LASSO regression-based methods in some circumstances.
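
    A much-simplified sketch of the idea of combining expression fit with prior knowledge is given below: candidate regulator subsets are scored with a BIC approximation to the marginal likelihood plus a prior built from TFBS/PPI-style inclusion probabilities. This enumeration is a stand-in for the paper's Bayesian variable selection sampler, and the data, prior probabilities and true regulators are all simulated.

```python
# Simplified sketch of regulator-set scoring with prior knowledge.
# Candidate regulator subsets are scored with a BIC approximation to the
# marginal likelihood plus a prior built from (invented) TFBS/PPI-style
# inclusion probabilities. This is a stand-in for the paper's full Bayesian
# variable selection sampler, not a reimplementation of it.
import itertools
import numpy as np

rng = np.random.default_rng(4)
n, tfs = 60, ["TF1", "TF2", "TF3", "TF4"]
X = rng.normal(size=(n, len(tfs)))                                  # TF expression (simulated)
y = 1.5 * X[:, 0] - 1.0 * X[:, 2] + rng.normal(scale=0.5, size=n)   # target gene

prior_incl = {"TF1": 0.7, "TF2": 0.2, "TF3": 0.6, "TF4": 0.2}       # from TFBS/PPI-style evidence

def log_score(subset):
    idx = [tfs.index(t) for t in subset]
    Xs = np.column_stack([np.ones(n)] + [X[:, j] for j in idx])
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    rss = float(np.sum((y - Xs @ beta) ** 2))
    bic = n * np.log(rss / n) + Xs.shape[1] * np.log(n)
    log_prior = sum(np.log(prior_incl[t]) if t in subset else np.log(1 - prior_incl[t])
                    for t in tfs)
    return -0.5 * bic + log_prior                    # approximate log posterior score

subsets = [c for r in range(len(tfs) + 1) for c in itertools.combinations(tfs, r)]
scores = np.array([log_score(s) for s in subsets])
post = np.exp(scores - scores.max())
post /= post.sum()
print(max(zip(post, subsets)))                       # should favour {TF1, TF3}
```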

  12. A Bayesian Framework that integrates heterogeneous data for inferring gene regulatory networks

    Directory of Open Access Journals (Sweden)

    Tapesh eSantra

    2014-05-01

    Full Text Available Reconstruction of gene regulatory networks (GRNs) from experimental data is a fundamental challenge in systems biology. A number of computational approaches have been developed to infer GRNs from mRNA expression profiles. However, expression profiles alone are proving to be insufficient for inferring GRN topologies with reasonable accuracy. Recently, it has been shown that integration of external data sources (such as gene and protein sequence information, gene ontology data, protein-protein interactions) with mRNA expression profiles may increase the reliability of the inference process. Here, I propose a new approach that incorporates transcription factor binding sites (TFBS) and physical protein interactions (PPI) among transcription factors (TFs) in a Bayesian Variable Selection (BVS) algorithm which can infer GRNs from mRNA expression profiles subjected to genetic perturbations. Using real experimental data, I show that the integration of TFBS and PPI data with mRNA expression profiles leads to significantly more accurate networks than those inferred from expression profiles alone. Additionally, the performance of the proposed algorithm is compared with a series of LASSO regression-based network inference methods that can also incorporate prior knowledge in the inference framework. The results of this comparison suggest that BVS can outperform LASSO regression-based methods in some circumstances.

  13. Translating Uncertain Sea Level Projections Into Infrastructure Impacts Using a Bayesian Framework

    Science.gov (United States)

    Moftakhari, Hamed; AghaKouchak, Amir; Sanders, Brett F.; Matthew, Richard A.; Mazdiyasni, Omid

    2017-12-01

    Climate change may affect ocean-driven coastal flooding regimes by both raising the mean sea level (msl) and altering ocean-atmosphere interactions. For reliable projections of coastal flood risk, information provided by different climate models must be considered in addition to associated uncertainties. In this paper, we propose a framework to project future coastal water levels and quantify the resulting flooding hazard to infrastructure. We use Bayesian Model Averaging to generate a weighted ensemble of storm surge predictions from eight climate models for two coastal counties in California. The resulting ensembles combined with msl projections, and predicted astronomical tides are then used to quantify changes in the likelihood of road flooding under representative concentration pathways 4.5 and 8.5 in the near-future (1998-2063) and mid-future (2018-2083). The results show that road flooding rates will be significantly higher in the near-future and mid-future compared to the recent past (1950-2015) if adaptation measures are not implemented.
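
    The Bayesian Model Averaging step can be illustrated with a toy ensemble: each model is weighted by its predictive fit over a calibration window and the weights define the ensemble forecast. The three "models" below are synthetic placeholders for the eight climate-model surge predictions, and the fixed-variance Gaussian likelihood is a simplification of how BMA weights are usually fitted.

```python
# Toy Bayesian Model Averaging: weight each model by its (approximate) predictive
# likelihood over a calibration window, then form a weighted ensemble forecast.
# The "models" here are simple biased/noisy predictors of a synthetic surge
# series, standing in for the eight climate-model surge predictions.
import numpy as np

rng = np.random.default_rng(5)
t = np.arange(200)
obs = 0.4 * np.sin(2 * np.pi * t / 50) + 0.05 * rng.normal(size=t.size)   # "observed" surge

# Three competing model predictions with different biases and noise levels
preds = {
    "model_A": 0.4 * np.sin(2 * np.pi * t / 50) + 0.02 * rng.normal(size=t.size),
    "model_B": 0.5 * np.sin(2 * np.pi * t / 50) + 0.05 * rng.normal(size=t.size),
    "model_C": 0.1 + 0.4 * np.sin(2 * np.pi * t / 50) + 0.08 * rng.normal(size=t.size),
}

def log_lik(pred, sigma=0.06):
    """Gaussian predictive log-likelihood of the observations under one model."""
    r = obs - pred
    return float(-0.5 * np.sum((r / sigma) ** 2 + np.log(2 * np.pi * sigma**2)))

ll = np.array([log_lik(p) for p in preds.values()])
w = np.exp(ll - ll.max())
w /= w.sum()                                         # BMA weights
ensemble = sum(wi * p for wi, p in zip(w, preds.values()))
print(dict(zip(preds, np.round(w, 3))))
print(float(np.mean((ensemble - obs) ** 2)))         # ensemble mean-squared error
```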

  14. A state-space Bayesian framework for estimating biogeochemical transformations using time-lapse geophysical data

    Energy Technology Data Exchange (ETDEWEB)

    Chen, J.; Hubbard, S.; Williams, K.; Pride, S.; Li, L.; Steefel, C.; Slater, L.

    2009-04-15

    We develop a state-space Bayesian framework to combine time-lapse geophysical data with other types of information for quantitative estimation of biogeochemical parameters during bioremediation. We consider characteristics of end-products of biogeochemical transformations as state vectors, which evolve under constraints of local environments through evolution equations, and consider time-lapse geophysical data as available observations, which could be linked to the state vectors through petrophysical models. We estimate the state vectors and their associated unknown parameters over time using Markov chain Monte Carlo sampling methods. To demonstrate the use of the state-space approach, we apply it to complex resistivity data collected during laboratory column biostimulation experiments that were poised to precipitate iron and zinc sulfides during sulfate reduction. We develop a petrophysical model based on sphere-shaped cells to link the sulfide precipitate properties to the time-lapse geophysical attributes and estimate volume fraction of the sulfide precipitates, fraction of the dispersed, sulfide-encrusted cells, mean radius of the aggregated clusters, and permeability over the course of the experiments. Results of the case study suggest that the developed state-space approach permits the use of geophysical datasets for providing quantitative estimates of end-product characteristics and hydrological feedbacks associated with biogeochemical transformations. Although tested here on laboratory column experiment datasets, the developed framework provides the foundation needed for quantitative field-scale estimation of biogeochemical parameters over space and time using direct, but often sparse wellbore data with indirect, but more spatially extensive geophysical datasets.

  15. Hierarchical Segmentation Framework for Identifying Natural Vegetation: A Case Study of the Tehachapi Mountains, California

    Directory of Open Access Journals (Sweden)

    Yan-Ting Liau

    2014-08-01

    Full Text Available Two critical limitations of very high resolution imagery interpretation for time-series analysis are high imagery variance and large data sizes. Although object-based analyses with a multi-scale framework for diverse object sizes are one potential solution, they require more data and large amounts of testing at high cost. In this study, I applied a three-level hierarchical vegetation framework to reduce those costs, and a three-step procedure was used to evaluate its effects on digital orthophoto quadrangles with 1 m spatial resolution. Steps one and two performed image segmentation optimized for delineation of tree density, using global Otsu's method followed by the random walker algorithm. Step three performed detailed species delineation, derived from multiresolution segmentation, in two test areas. Steps one and two delineated tree density segments and labeled species associations robustly compared with previous hierarchical frameworks. However, step three was limited by the lack of image information needed to produce detailed, reasonable image objects with optimal scale parameters for species labeling. This hierarchical vegetation framework has the potential to provide baseline data for evaluating climate change impacts on vegetation at lower cost, using widely available data and a personal laptop.
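
    The two segmentation steps named in the abstract, a global Otsu threshold followed by random-walker refinement, are both available in scikit-image, so the pipeline can be sketched on a synthetic image as below. The beta value and the seed-selection margins are illustrative assumptions, not the study's settings.

```python
# Minimal sketch of the two segmentation steps named in the abstract:
# a global Otsu threshold followed by random-walker refinement, using
# scikit-image on a synthetic image. The beta value and the seed-selection
# margins are illustrative, not the study's settings.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.segmentation import random_walker

rng = np.random.default_rng(6)
img = np.zeros((128, 128))
img[32:96, 32:96] = 1.0                        # bright "canopy" block
img += 0.4 * rng.normal(size=img.shape)        # heavy noise, as in fine-resolution imagery

# Step 1: global Otsu threshold gives a rough foreground/background split
t = threshold_otsu(img)

# Step 2: random walker refines the split; seed only confident pixels
markers = np.zeros(img.shape, dtype=np.int32)
markers[img < t - 0.3] = 1                     # confident background
markers[img > t + 0.3] = 2                     # confident foreground
labels = random_walker(img, markers, beta=130)

print(t, np.mean(labels == 2))                 # threshold and foreground fraction
```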

  16. A Bayesian Framework for False Belief Reasoning in Children: A Rational Integration of Theory-Theory and Simulation Theory.

    Science.gov (United States)

    Asakura, Nobuhiko; Inui, Toshio

    2016-01-01

    Two apparently contrasting theories have been proposed to account for the development of children's theory of mind (ToM): theory-theory and simulation theory. We present a Bayesian framework that rationally integrates both theories for false belief reasoning. This framework exploits two internal models for predicting the belief states of others: one of self and one of others. These internal models are responsible for simulation-based and theory-based reasoning, respectively. The framework further takes into account empirical studies of a developmental ToM scale (e.g., Wellman and Liu, 2004): developmental progressions of various mental state understandings leading up to false belief understanding. By representing the internal models and their interactions as a causal Bayesian network, we formalize the model of children's false belief reasoning as probabilistic computations on the Bayesian network. This model probabilistically weighs and combines the two internal models and predicts children's false belief ability as a multiplicative effect of their early-developed abilities to understand the mental concepts of diverse beliefs and knowledge access. Specifically, the model predicts that children's proportion of correct responses on a false belief task can be closely approximated as the product of their proportions correct on the diverse belief and knowledge access tasks. To validate this prediction, we illustrate that our model provides good fits to a variety of ToM scale data for preschool children. We discuss the implications and extensions of our model for a deeper understanding of developmental progressions of children's ToM abilities.

  17. An Extended Bayesian Framework for Atrial and Ventricular Activity Separation in Atrial Fibrillation.

    Science.gov (United States)

    Roonizi, Ebadollah Kheirati; Sassi, Roberto

    2017-11-01

    An extended nonlinear Bayesian filtering framework is introduced for the analysis of atrial fibrillation (AF), in particular with single-channel electrocardiographic (ECG) recordings. It is suitable for simultaneously tracking the fundamental frequency of atrial fibrillatory waves (f-waves) and separating signals linked to atrial and ventricular activity during AF. In this framework, high-power ECG components, i.e., Q, R, S, and T waves, are modeled using a sum of Gaussian functions. The atrial activity dynamical model is instead based on a trigonometric function, with a fundamental frequency (the inverse of the dominant atrial cycle length) and its harmonics. The state variables of both dynamical models (QRS-T and f-waves) are hidden and then estimated, sample by sample, using a Kalman smoother. Remarkably, the scheme is capable of separating ventricular and atrial activity signals while contemporarily tracking the atrial fundamental frequency in time. The proposed method was evaluated using synthetic signals. In 290 ECGs in sinus rhythm from the PhysioNet PTB Diagnostic ECG Database, the P-waves were replaced with a synthetic f-wave. Broadband noise at different signal-to-noise ratios (SNR) (from 0 to 40 dB) was added to study the performance of the filter under different SNR conditions. The results of the study demonstrated superior performance in atrial and ventricular signal separation when compared with traditional average beat subtraction (ABS), one of the most widely used methods for QRS-T cancellation (normalized mean square error = 0.045 for the extended Kalman smoother (EKS) and 0.18 for ABS; SNR improvement was 21.1 dB for EKS and 12.2 dB for ABS in f-wave extraction). Various advantages of the proposed method have been addressed and demonstrated, including the problem of tracking the fundamental frequency of f-waves (root mean square error (RMSE) Hz for gradually changing frequency at SNR=15 dB) and of estimating robust QT/RR values during AF (RMSE at
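
    As a schematic of the state-space idea, the sketch below runs a plain linear Kalman filter with a harmonic (rotation) state model on a noisy sinusoid standing in for an f-wave. The extended smoother, the Gaussian QRS-T model and the adaptive frequency tracking of the paper are not reproduced, and the sampling rate, frequency and noise levels are assumed.

```python
# Minimal linear Kalman filter with a harmonic (rotation) state model tracking
# a noisy sinusoid, as a schematic of the state-space idea behind f-wave
# tracking. The extended smoother, the Gaussian QRS-T model and the adaptive
# frequency estimation of the paper are not reproduced; noise levels are assumed.
import numpy as np

rng = np.random.default_rng(7)
fs, f0, n = 250.0, 6.0, 1000                      # sampling rate, "f-wave" frequency, samples
t = np.arange(n) / fs
signal = np.sin(2 * np.pi * f0 * t)
y = signal + 0.5 * rng.normal(size=n)             # noisy single-channel observation

w = 2 * np.pi * f0 / fs
F = np.array([[np.cos(w), -np.sin(w)],            # rotates the phasor each sample
              [np.sin(w),  np.cos(w)]])
H = np.array([[1.0, 0.0]])
Q = 1e-4 * np.eye(2)                              # process noise (assumed)
R = np.array([[0.25]])                            # measurement noise (assumed)

x, P = np.zeros(2), np.eye(2)
est = np.empty(n)
for k in range(n):
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ (y[k] - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    est[k] = x[0]

print(float(np.mean((est[200:] - signal[200:]) ** 2)))   # tracking error after warm-up
```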

  18. Comparing definitions of spatial relations for the analysis of geographic disparities in mortality within a Bayesian mixed-effects framework

    Directory of Open Access Journals (Sweden)

    Diego Fernando Rojas-Gualdrón

    Full Text Available ABSTRACT: Objective: To analyze the conceptual and technical differences between three definitions of spatial relations within a Bayesian mixed-effects framework: classical multilevel definition, spatial multiple membership definition and conditional autoregressive definition with an illustration of the estimate of geographic disparities in early neonatal mortality in Colombia, 2011-2014. Methods: A registry based cross-sectional study was conducted. Births and early neonatal deaths were obtained from the Colombian vital statistics registry for 2011-2014. Crude and adjusted Bayesian mixed effects regressions were performed for each definition of spatial relation. Model fit statistics, spatial autocorrelation of residuals and estimated mortality rates, geographic disparity measures, relative ratios and relative differences were compared. Results: The definition of spatial relations between municipalities based on the conditional autoregressive prior showed the best performance according to both fit statistics and residual spatial pattern analyses. Spatial multiple membership definition had a poor performance. Conclusion: Bayesian mixed effects regression with conditional autoregressive prior as an analytical framework may be an important contribution to epidemiological design as an improved alternative to ecological methods in the analyses of geographic disparities of mortality, considering potential ecological bias and spatial model misspecification.

  19. Comparing definitions of spatial relations for the analysis of geographic disparities in mortality within a Bayesian mixed-effects framework.

    Science.gov (United States)

    Rojas-Gualdrón, Diego Fernando

    2017-01-01

    To analyze the conceptual and technical differences between three definitions of spatial relations within a Bayesian mixed-effects framework: classical multilevel definition, spatial multiple membership definition and conditional autoregressive definition with an illustration of the estimate of geographic disparities in early neonatal mortality in Colombia, 2011-2014. A registry based cross-sectional study was conducted. Births and early neonatal deaths were obtained from the Colombian vital statistics registry for 2011-2014. Crude and adjusted Bayesian mixed effects regressions were performed for each definition of spatial relation. Model fit statistics, spatial autocorrelation of residuals and estimated mortality rates, geographic disparity measures, relative ratios and relative differences were compared. The definition of spatial relations between municipalities based on the conditional autoregressive prior showed the best performance according to both fit statistics and residual spatial pattern analyses. Spatial multiple membership definition had a poor performance. Bayesian mixed effects regression with conditional autoregressive prior as an analytical framework may be an important contribution to epidemiological design as an improved alternative to ecological methods in the analyses of geographic disparities of mortality, considering potential ecological bias and spatial model misspecification.

  20. Uncertainty of mass discharge estimates from contaminated sites using a fully Bayesian framework

    DEFF Research Database (Denmark)

    Troldborg, Mads; Nowak, Wolfgang; Binning, Philip John

    2011-01-01

    Mass discharge estimates are increasingly being used in the management of contaminated sites and uncertainties related to such estimates are therefore of great practical importance. We present a rigorous approach for quantifying the uncertainty in the mass discharge across a multilevel control...... plane. The method accounts for: (1) conceptual model uncertainty through Bayesian model averaging, (2) heterogeneity through Bayesian geostatistics with an uncertain geostatistical model, and (3) measurement uncertainty. An ensemble of unconditional steady-state plume realizations is generated through...

  1. Nonlinear calibration transfer based on hierarchical Bayesian models and Lagrange Multipliers: Error bounds of estimates via Monte Carlo - Markov Chain sampling.

    Science.gov (United States)

    Seichter, Felicia; Vogt, Josef; Radermacher, Peter; Mizaikoff, Boris

    2017-01-25

    The calibration of analytical systems is time-consuming and the effort for daily calibration routines should therefore be minimized, while maintaining the analytical accuracy and precision. The 'calibration transfer' approach proposes to combine calibration data already recorded with actual calibration measurements. However, this strategy was developed for the multivariate, linear analysis of spectroscopic data, and thus cannot be applied to sensors with a single response channel and/or a non-linear relationship between signal and desired analytical concentration. To fill this gap for a non-linear calibration equation, we assume that the coefficients for the equation, collected over several calibration runs, are normally distributed. Considering that the coefficients of an actual calibration are a sample from this distribution, only a few standards are needed for a complete calibration data set. The resulting calibration transfer approach is demonstrated for a fluorescence oxygen sensor and implemented as a hierarchical Bayesian model, combined with a Lagrange Multipliers technique and Monte-Carlo Markov-Chain sampling. The latter provides realistic estimates for coefficients and predictions together with accurate error bounds by simulating known measurement errors and system fluctuations. Performance criteria for validation and optimal selection of a reduced set of calibration samples were developed and lead to a setup that maintains the analytical performance of a full calibration. Strategies for rapid determination of problems occurring in a daily calibration routine are proposed, thereby opening the possibility of correcting the problem just in time. Copyright © 2016 Elsevier B.V. All rights reserved.
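
    The core idea, that coefficients from past calibration runs define an informative prior so only a few fresh standards are needed, can be sketched with a plain random-walk Metropolis sampler on a Stern-Volmer-style oxygen response. This is not the paper's hierarchical model with Lagrange multipliers; the curve, prior values, standards and noise level are all assumptions.

```python
# Schematic of calibration transfer for a nonlinear sensor response:
# coefficients from past calibration runs define a normal prior, so only a
# few fresh standards are needed. A plain random-walk Metropolis sampler is
# used here instead of the paper's hierarchical model with Lagrange
# multipliers; the Stern-Volmer-style curve and all numbers are assumptions.
import numpy as np

rng = np.random.default_rng(8)

def response(c, r0, ksv):
    """Stern-Volmer-type intensity response to oxygen concentration c."""
    return r0 / (1.0 + ksv * c)

# Prior from (hypothetical) historical calibrations: mean and std of (r0, ksv)
prior_mu = np.array([100.0, 0.80])
prior_sd = np.array([5.0, 0.05])

# Only three new standards measured today
c_std = np.array([0.0, 5.0, 15.0])
y_std = response(c_std, 97.0, 0.83) + rng.normal(scale=1.0, size=c_std.size)
sigma_y = 1.0                                   # assumed measurement noise

def log_post(theta):
    r0, ksv = theta
    if r0 <= 0 or ksv <= 0:
        return -np.inf
    resid = y_std - response(c_std, r0, ksv)
    log_lik = -0.5 * np.sum((resid / sigma_y) ** 2)
    log_prior = -0.5 * np.sum(((theta - prior_mu) / prior_sd) ** 2)
    return log_lik + log_prior

theta = prior_mu.copy()
step = np.array([1.0, 0.01])
samples = []
for _ in range(20000):
    prop = theta + step * rng.normal(size=2)
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta.copy())
post = np.array(samples[5000:])                 # drop burn-in
print(post.mean(axis=0), post.std(axis=0))      # calibration coefficients and error bounds
```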

  2. Understanding uncertainties in non-linear population trajectories: a Bayesian semi-parametric hierarchical approach to large-scale surveys of coral cover.

    Directory of Open Access Journals (Sweden)

    Julie Vercelloni

    Full Text Available Recently, attempts to improve decision making in species management have focussed on uncertainties associated with modelling temporal fluctuations in populations. Reducing model uncertainty is challenging; while larger samples improve estimation of species trajectories and reduce statistical errors, they typically amplify variability in observed trajectories. In particular, traditional modelling approaches aimed at estimating population trajectories usually do not account well for nonlinearities and uncertainties associated with multi-scale observations characteristic of large spatio-temporal surveys. We present a Bayesian semi-parametric hierarchical model for simultaneously quantifying uncertainties associated with model structure and parameters, and scale-specific variability over time. We estimate uncertainty across a four-tiered spatial hierarchy of coral cover from the Great Barrier Reef. Coral variability is well described; however, our results show that, in the absence of additional model specifications, conclusions regarding coral trajectories become highly uncertain when considering multiple reefs, suggesting that management should focus more at the scale of individual reefs. The approach presented facilitates the description and estimation of population trajectories and associated uncertainties when variability cannot be attributed to specific causes and origins. We argue that our model can unlock value contained in large-scale datasets, provide guidance for understanding sources of uncertainty, and support better informed decision making.

  3. Bayesian methods for data analysis

    CERN Document Server

    Carlin, Bradley P.

    2009-01-01

    Contents: Approaches for statistical inference (Introduction; Motivating Vignettes; Defining the Approaches; The Bayes-Frequentist Controversy; Some Basic Bayesian Models). The Bayes approach (Introduction; Prior Distributions; Bayesian Inference; Hierarchical Modeling; Model Assessment; Nonparametric Methods). Bayesian computation (Introduction; Asymptotic Methods; Noniterative Monte Carlo Methods; Markov Chain Monte Carlo Methods). Model criticism and selection (Bayesian Modeling; Bayesian Robustness; Model Assessment; Bayes Factors via Marginal Density Estimation; Bayes Factors

  4. A Bayesian Belief Network framework to predict SOC stock change: the Veneto region (Italy) case study

    Science.gov (United States)

    Dal Ferro, Nicola; Quinn, Claire Helen; Morari, Francesco

    2017-04-01

    A key challenge for soil scientists is predicting agricultural management scenarios that combine crop production with high standards of environmental quality. In this context, reversing the soil organic carbon (SOC) decline in croplands is required for maintaining soil fertility and contributing to the mitigation of GHG emissions. Bayesian belief networks (BBNs) are probabilistic models able to accommodate uncertainty and variability in predicting the impacts of management and environmental changes. By linking multiple qualitative and quantitative variables in cause-and-effect relationships, BBNs can be used as a decision support system at different spatial scales to find the best management strategies in agroecosystems. In this work we built a BBN to model SOC dynamics (0-30 cm layer) in the low-lying plain of the Veneto region, north-eastern Italy, and to define best practices leading to SOC accumulation and reduction of GHG (CO2-equivalent) emissions. Regional pedo-climatic, land use and management information were combined with experimental and modelled data on soil C dynamics as natural and anthropic key drivers affecting SOC stock change. Moreover, utility nodes were introduced to determine optimal decisions for mitigating GHG emissions from croplands, considering also three different IPCC climate scenarios. The network was finally validated with real field data in terms of SOC stock change. Results showed that the BBN was able to model real SOC stock changes, since validation slightly overestimated SOC reduction (+5%) at the expense of its accumulation. At the regional level, probability distributions showed a 50% probability of SOC loss and only 17% of accumulation. However, the greatest losses (34%) were associated with low reduction rates (100-500 kg C ha-1 y-1), followed by 33% of stabilized conditions (-100 < SOC < 100 kg ha-1 y-1). Land use management (especially tillage operations and soil cover) played a primary role in affecting SOC stock change, while climate conditions

  5. A generalized linear factor model approach to the hierarchical framework for responses and response times.

    Science.gov (United States)

    Molenaar, Dylan; Tuerlinckx, Francis; van der Maas, Han L J

    2015-05-01

    We show how the hierarchical model for responses and response times as developed by van der Linden (2007), Fox, Klein Entink, and van der Linden (2007), Klein Entink, Fox, and van der Linden (2009), and Glas and van der Linden (2010) can be simplified to a generalized linear factor model with only the mild restriction that there is no hierarchical model at the item side. This result is valuable as it enables all well-developed modelling tools and extensions that come with these methods. We show that the restriction we impose on the hierarchical model does not influence parameter recovery under realistic circumstances. In addition, we present two illustrative real data analyses to demonstrate the practical benefits of our approach. © 2014 The British Psychological Society.

  6. A decision‐making framework for flood risk management based on a Bayesian Influence Diagram

    DEFF Research Database (Denmark)

    Åstrøm, Helena Lisa Alexandra; Madsen, Henrik; Friis-Hansen, Peter

    2014-01-01

    We develop a Bayesian Influence Diagram (ID) approach for risk-based decision-making in flood management. We show that it is a flexible decision-making tool to assess flood risk in a non-stationary environment and with an ability to test different adaptation measures in order to agree on the best ... combination of adaptation measures and the best time to invest in flood adaptation. IDs use Bayesian statistics, which apply prior probabilities to produce posterior probabilities and, hence, use Bayesian probabilistic thinking to describe relationships between variables in a system. Hence, we allow ... new values of variables are observed, assuring that the risk assessment is constantly based on the best available knowledge for each variable. Input data to IDs can come from multiple sources, and since each variable is described with a probability density function (pdf), this method provides an effective ...

  7. A Bayesian Framework for False Belief Reasoning in Children: A Rational Integration of Theory-Theory and Simulation Theory

    Science.gov (United States)

    Asakura, Nobuhiko; Inui, Toshio

    2016-01-01

    Two apparently contrasting theories have been proposed to account for the development of children's theory of mind (ToM): theory-theory and simulation theory. We present a Bayesian framework that rationally integrates both theories for false belief reasoning. This framework exploits two internal models for predicting the belief states of others: one of self and one of others. These internal models are responsible for simulation-based and theory-based reasoning, respectively. The framework further takes into account empirical studies of a developmental ToM scale (e.g., Wellman and Liu, 2004): developmental progressions of various mental state understandings leading up to false belief understanding. By representing the internal models and their interactions as a causal Bayesian network, we formalize the model of children's false belief reasoning as probabilistic computations on the Bayesian network. This model probabilistically weighs and combines the two internal models and predicts children's false belief ability as a multiplicative effect of their early-developed abilities to understand the mental concepts of diverse beliefs and knowledge access. Specifically, the model predicts that children's proportion of correct responses on a false belief task can be closely approximated as the product of their proportions correct on the diverse belief and knowledge access tasks. To validate this prediction, we illustrate that our model provides good fits to a variety of ToM scale data for preschool children. We discuss the implications and extensions of our model for a deeper understanding of developmental progressions of children's ToM abilities. PMID:28082941

  8. A two-stage model in a Bayesian framework to estimate a survival endpoint in the presence of confounding by indication.

    Science.gov (United States)

    Bellera, Carine; Proust-Lima, Cécile; Joseph, Lawrence; Richaud, Pierre; Taylor, Jeremy; Sandler, Howard; Hanley, James; Mathoulin-Pélissier, Simone

    2018-04-01

    Background Biomarker series can indicate disease progression and predict clinical endpoints. When a treatment is prescribed depending on the biomarker, confounding by indication might be introduced if the treatment modifies the marker profile and risk of failure. Objective Our aim was to highlight the flexibility of a two-stage model fitted within a Bayesian Markov Chain Monte Carlo framework. For this purpose, we monitored the prostate-specific antigens in prostate cancer patients treated with external beam radiation therapy. In the presence of rising prostate-specific antigens after external beam radiation therapy, salvage hormone therapy can be prescribed to reduce both the prostate-specific antigens concentration and the risk of clinical failure, an illustration of confounding by indication. We focused on the assessment of the prognostic value of hormone therapy and prostate-specific antigens trajectory on the risk of failure. Methods We used a two-stage model within a Bayesian framework to assess the role of the prostate-specific antigens profile on clinical failure while accounting for a secondary treatment prescribed by indication. We modeled prostate-specific antigens using a hierarchical piecewise linear trajectory with a random changepoint. Residual prostate-specific antigens variability was expressed as a function of prostate-specific antigens concentration. Covariates in the survival model included hormone therapy, baseline characteristics, and individual predictions of the prostate-specific antigens nadir and timing and prostate-specific antigens slopes before and after the nadir as provided by the longitudinal process. Results We showed positive associations between an increased prostate-specific antigens nadir, an earlier changepoint and a steeper post-nadir slope with an increased risk of failure. Importantly, we highlighted a significant benefit of hormone therapy, an effect that was not observed when the prostate-specific antigens trajectory was
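
    A minimal sketch of the longitudinal building block described above: a piecewise linear trajectory with a single changepoint, as might be used for log prostate-specific antigen over time. All parameter values are hypothetical; in the full two-stage model the intercept, slopes and changepoint are subject-specific random effects whose individual predictions feed into the survival submodel.

        import numpy as np

        def piecewise_linear(t, intercept, slope_pre, slope_post, changepoint):
            """Piecewise linear trajectory with one changepoint (e.g. log-PSA over time).
            Before the changepoint the trajectory follows slope_pre; afterwards slope_post."""
            t = np.asarray(t, dtype=float)
            return (intercept
                    + slope_pre * np.minimum(t, changepoint)
                    + slope_post * np.maximum(t - changepoint, 0.0))

        # Hypothetical subject-level parameters, for illustration only.
        times = np.linspace(0, 5, 6)   # years since end of radiotherapy
        print(piecewise_linear(times, intercept=1.0, slope_pre=-0.8,
                               slope_post=0.4, changepoint=2.0))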

  9. A generalized linear factor model approach to the hierarchical framework for responses and response times

    NARCIS (Netherlands)

    Molenaar, D.; Tuerlinckx, F.; van der Maas, H.L.J.

    2015-01-01

    We show how the hierarchical model for responses and response times as developed by van der Linden (2007), Fox, Klein Entink, and van der Linden (2007), Klein Entink, Fox, and van der Linden (2009), and Glas and van der Linden (2010) can be simplified to a generalized linear factor model with only the mild restriction that there is no hierarchical model on the item side.

  10. A Framework for the Statistical Analysis of Probability of Mission Success Based on Bayesian Theory

    Science.gov (United States)

    2014-06-01

    ... regression, and two non-probabilistic methods, fuzzy logic and neural networks, are discussed and compared below to determine which gives the best ... accurate measure than possibilistic methods, such as fuzzy logic discussed below [5]. Bayesian inference easily accounts for subjectivity and ...

  11. Hierarchical Bayesian mixture modelling for antigen-specific T-cell subtyping in combinatorially encoded flow cytometry studies

    DEFF Research Database (Denmark)

    Lin, Lin; Chan, Cliburn; Hadrup, Sine R

    2013-01-01

    ... in the ability to characterize variation in immune responses involving larger numbers of functionally differentiated cell subtypes. We describe novel classes of Markov chain Monte Carlo methods for model fitting that exploit distributed GPU (graphics processing unit) implementation. We discuss issues of cellular subtype identification in this novel, general model framework, and provide a detailed example using simulated data. We then describe application to a data set from an experimental study of antigen-specific T-cell subtyping using combinatorially encoded assays in human blood samples. Summary comments ...

  12. Hierarchical Nanostructures of Metal-Organic Frameworks Applied in Gas Separating ZIF-8-on-ZIF-67 Membranes.

    Science.gov (United States)

    Knebel, Alexander; Wulfert-Holzmann, Paul; Friebe, Sebastian; Pavel, Janet; Strauß, Ina; Mundstock, Alexander; Steinbach, Frank; Caro, Jürgen

    2018-02-02

    Membranes from metal-organic frameworks (MOFs) are highly interesting for industrial gas separation applications. Strongly improved performance for carbon capture and H2 purification tasks in MOF membranes is obtained by using highly reproducible and very accurately, hierarchically grown ZIF-8-on-ZIF-67 (ZIF-8@ZIF-67) nanostructures. To forgo the hardly controllable solvothermal synthesis, particles and layers are prepared by self-assembly methods. It was possible for the first time to confirm ZIF-8-on-ZIF-67 membrane growth on rough and porous ceramic supports using layer-by-layer deposition. Additionally, hierarchical particles are made in a fast room-temperature synthesis with high monodispersity. Characterization of the hierarchically and epitaxially grown layers and particles is performed by SEM, TEM, EDXM and gas permeation. The ZIF-8@ZIF-67 system shows a nearly doubled H2/CO2 separation factor, whether as neat membrane or as mixed-matrix membrane, in comparison to other MOF materials. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. A hierarchical Bayesian network approach for linkage disequilibrium modeling and data-dimensionality reduction prior to genome-wide association studies

    Directory of Open Access Journals (Sweden)

    Leray Philippe

    2011-01-01

    Background: Discovering the genetic basis of common genetic diseases in the human genome represents a public health issue. However, the dimensionality of the genetic data (up to 1 million genetic markers) and its complexity make the statistical analysis a challenging task. Results: We present an accurate modeling of dependences between genetic markers, based on a forest of hierarchical latent class models, which is a particular class of probabilistic graphical models. This model offers an adapted framework to deal with the fuzzy nature of linkage disequilibrium blocks. In addition, the data dimensionality can be reduced through the latent variables of the model, which synthesize the information borne by genetic markers. In order to tackle the learning of both forest structure and probability distributions, a generic algorithm has been proposed. A first implementation of our algorithm has been shown to be tractable on benchmarks describing 10^5 variables for 2000 individuals. Conclusions: The forest of hierarchical latent class models offers several advantages for genome-wide association studies: accurate modeling of linkage disequilibrium, flexible data dimensionality reduction and biological meaning borne by latent variables.

  14. A hierarchical Bayesian network approach for linkage disequilibrium modeling and data-dimensionality reduction prior to genome-wide association studies.

    Science.gov (United States)

    Mourad, Raphaël; Sinoquet, Christine; Leray, Philippe

    2011-01-12

    Discovering the genetic basis of common genetic diseases in the human genome represents a public health issue. However, the dimensionality of the genetic data (up to 1 million genetic markers) and its complexity make the statistical analysis a challenging task. We present an accurate modeling of dependences between genetic markers, based on a forest of hierarchical latent class models which is a particular class of probabilistic graphical models. This model offers an adapted framework to deal with the fuzzy nature of linkage disequilibrium blocks. In addition, the data dimensionality can be reduced through the latent variables of the model which synthesize the information borne by genetic markers. In order to tackle the learning of both forest structure and probability distributions, a generic algorithm has been proposed. A first implementation of our algorithm has been shown to be tractable on benchmarks describing 10^5 variables for 2000 individuals. The forest of hierarchical latent class models offers several advantages for genome-wide association studies: accurate modeling of linkage disequilibrium, flexible data dimensionality reduction and biological meaning borne by latent variables.

  15. Ozone and childhood respiratory disease in three US cities: evaluation of effect measure modification by neighborhood socioeconomic status using a Bayesian hierarchical approach.

    Science.gov (United States)

    O'Lenick, Cassandra R; Chang, Howard H; Kramer, Michael R; Winquist, Andrea; Mulholland, James A; Friberg, Mariel D; Sarnat, Stefanie Ebelt

    2017-04-05

    Ground-level ozone is a potent airway irritant and a determinant of respiratory morbidity. Susceptibility to the health effects of ambient ozone may be influenced by both intrinsic and extrinsic factors, such as neighborhood socioeconomic status (SES). Questions remain regarding the manner and extent that factors such as SES influence ozone-related health effects, particularly across different study areas. Using a 2-stage modeling approach we evaluated neighborhood SES as a modifier of ozone-related pediatric respiratory morbidity in Atlanta, Dallas, & St. Louis. We acquired multi-year data on emergency department (ED) visits among 5-18 year olds with a primary diagnosis of respiratory disease in each city. Daily concentrations of 8-h maximum ambient ozone were estimated for all ZIP Code Tabulation Areas (ZCTA) in each city by fusing observed concentration data from available network monitors with simulations from an emissions-based chemical transport model. In the first stage, we used conditional logistic regression to estimate ZCTA-specific odds ratios (OR) between ozone and respiratory ED visits, controlling for temporal trends and meteorology. In the second stage, we combined ZCTA-level estimates in a Bayesian hierarchical model to assess overall associations and effect modification by neighborhood SES considering categorical and continuous SES indicators (e.g., ZCTA-specific levels of poverty). We estimated ORs and 95% posterior intervals (PI) for a 25 ppb increase in ozone. The hierarchical model combined effect estimates from 179 ZCTAs in Atlanta, 205 ZCTAs in Dallas, and 151 ZCTAs in St. Louis. The strongest overall association of ozone and pediatric respiratory disease was in Atlanta (OR = 1.08, 95% PI: 1.06, 1.11), followed by Dallas (OR = 1.04, 95% PI: 1.01, 1.07) and St. Louis (OR = 1.03, 95% PI: 0.99, 1.07). Patterns of association across levels of neighborhood SES in each city suggested stronger ORs in low compared to high SES areas, with
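
    The second-stage combination described above can be thought of as a normal-normal hierarchical model on the area-specific log odds ratios. The snippet below is a simplified sketch that uses a fixed, assumed between-area variance and simple precision weighting rather than the full Bayesian fit; all numbers are hypothetical.

        import numpy as np

        # First-stage results per area (ZCTA): log odds ratio per 25 ppb ozone and
        # its standard error, from the conditional logistic regression (hypothetical).
        log_or = np.array([0.08, 0.03, 0.12, 0.05])
        se     = np.array([0.04, 0.05, 0.06, 0.03])
        tau2   = 0.01   # assumed between-area variance of the true log ORs

        # Overall effect: precision-weighted average under the normal-normal model.
        w = 1.0 / (se**2 + tau2)
        overall = np.sum(w * log_or) / np.sum(w)

        # Area-specific estimates are shrunk toward the overall effect.
        shrunk = (log_or / se**2 + overall / tau2) / (1.0 / se**2 + 1.0 / tau2)

        print("overall OR:", np.exp(overall))
        print("shrunken area ORs:", np.exp(shrunk))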

  16. A Framework for Final Drive Simultaneous Failure Diagnosis Based on Fuzzy Entropy and Sparse Bayesian Extreme Learning Machine

    Directory of Open Access Journals (Sweden)

    Qing Ye

    2015-01-01

    This research proposes a novel framework for final drive simultaneous failure diagnosis comprising feature extraction, training of paired diagnostic models, generation of a decision threshold, and recognition of simultaneous failure modes. In the feature extraction module, the wavelet packet transform and fuzzy entropy are adopted to reduce noise interference and to extract representative features of each failure mode. Single-failure samples are used to construct probability classifiers based on the paired sparse Bayesian extreme learning machine, which is trained only on single failure modes and inherits the high generalization and sparsity of the sparse Bayesian learning approach. To generate the optimal decision threshold that converts the probability output of the classifiers into final simultaneous failure modes, this research uses samples containing both single and simultaneous failure modes together with a grid search method, which is superior to traditional techniques in global optimization. Compared with other frequently used diagnostic approaches based on support vector machines and probabilistic neural networks, experimental results based on the F1-measure verify that the diagnostic accuracy and efficiency of the proposed framework, which are crucial for simultaneous failure diagnosis, are superior to those of the existing approaches.

  17. A Biologically-Inspired Framework for Contour Detection Using Superpixel-Based Candidates and Hierarchical Visual Cues

    Directory of Open Access Journals (Sweden)

    Xiao Sun

    2015-10-01

    Contour detection has been extensively investigated as a fundamental problem in computer vision. In this study, a biologically-inspired candidate weighting framework is proposed for the challenging task of detecting meaningful contours. In contrast to previous models that detect contours from pixels, a modified superpixel generation processing is proposed to generate a contour candidate set and then weigh the candidates by extracting hierarchical visual cues. We extract the low-level visual local cues to weigh the contour intrinsic property and mid-level visual cues on the basis of Gestalt principles for weighting the contour grouping constraint. Experimental results tested on the BSDS benchmark show that the proposed framework exhibits promising performances to capture meaningful contours in complex scenes.

  18. A biologically-inspired framework for contour detection using superpixel-based candidates and hierarchical visual cues.

    Science.gov (United States)

    Sun, Xiao; Shang, Ke; Ming, Delie; Tian, Jinwen; Ma, Jiayi

    2015-10-20

    Contour detection has been extensively investigated as a fundamental problem in computer vision. In this study, a biologically-inspired candidate weighting framework is proposed for the challenging task of detecting meaningful contours. In contrast to previous models that detect contours from pixels, a modified superpixel generation processing is proposed to generate a contour candidate set and then weigh the candidates by extracting hierarchical visual cues. We extract the low-level visual local cues to weigh the contour intrinsic property and mid-level visual cues on the basis of Gestalt principles for weighting the contour grouping constraint. Experimental results tested on the BSDS benchmark show that the proposed framework exhibits promising performances to capture meaningful contours in complex scenes.

  19. Evaluating the consequences of impaired monitoring of learned behavior in attention-deficit/hyperactivity disorder using a Bayesian hierarchical model of choice response time.

    Science.gov (United States)

    Weigard, Alexander; Huang-Pollock, Cynthia; Brown, Scott

    2016-05-01

    Performance monitoring deficits have been proposed as a cognitive marker involved in the development of attention-deficit/hyperactivity disorder (ADHD), but it is unclear whether these deficits cause impairment when established action sequences conflict with environmental demands. The current study applies a novel data-analytic technique to a well-established sequence learning paradigm to investigate reactions to disruption of learned behavior in ADHD. Children (ages 8-12) with and without ADHD completed a serial reaction time task in which they implicitly learned an 8-item sequence of keypresses over 5 training blocks. The training sequence was replaced with a novel sequence in a transfer block, and returned in 2 subsequent recovery blocks. Response time (RT) data were fit by a Bayesian hierarchical version of the linear ballistic accumulator model, which permitted the dissociation of learning processes from performance monitoring effects on RT. Sequence-specific learning on the task was reflected in the systematic reduction of the amount of evidence required to initiate a response, and was unimpaired in ADHD. At the onset of the novel sequence, typically developing children displayed a shift in their attentional state while children with ADHD did not, leading to worse subsequent performance compared to controls. Children with ADHD are not impaired in learning novel action sequences, but display difficulty monitoring their implementation and engaging top-down control when they become inadequate. These results support theories of ADHD that highlight the interactions between monitoring processes and changing cognitive demands as the cause of self-regulation and information-processing problems in the disorder. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  20. Uncertainty estimation of the mass discharge from a contaminated site using a fully Bayesian framework

    DEFF Research Database (Denmark)

    Troldborg, Mads; Nowak, W.; Binning, Philip John

    2010-01-01

    Field estimates of the mass discharge (mass/time) from contaminated sites are increasingly being used in contaminated site management, and a quantification of the uncertainties related to such estimates is therefore of great practical importance. We present here a comprehensive approach for quantifying the uncertainty in the mass discharge across a multilevel control plane. The method is based on geostatistical inverse modelling and accounts for i) conceptual model uncertainty through multiple conceptual models and Bayesian model averaging, ii) heterogeneity through Bayesian geostatistics ...
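
    The Bayesian model averaging step mentioned above can be sketched very simply: each conceptual model contributes an ensemble of mass discharge values, and the combined distribution is a mixture of the ensembles weighted by the posterior model probabilities. The ensembles and weights below are hypothetical.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical mass discharge ensembles (kg/yr) from two conceptual models,
        # e.g. produced by geostatistical inverse modelling under each model.
        ensembles = {
            "model_A": rng.lognormal(mean=0.0, sigma=0.5, size=5000),
            "model_B": rng.lognormal(mean=0.3, sigma=0.7, size=5000),
        }
        # Posterior model probabilities (in practice derived from the model evidence).
        weights = {"model_A": 0.6, "model_B": 0.4}

        # Bayesian model averaging: resample from each ensemble in proportion to its weight.
        n_total = 20000
        counts = rng.multinomial(n_total, [weights[m] for m in ensembles])
        bma = np.concatenate([rng.choice(ensembles[m], size=k, replace=True)
                              for m, k in zip(ensembles, counts)])

        print("BMA mean:", bma.mean())
        print("95% interval:", np.percentile(bma, [2.5, 97.5]))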

  1. BiomeNet: a Bayesian model for inference of metabolic divergence among microbial communities.

    OpenAIRE

    Mahdi Shafiei; Katherine A Dunn; Hugh Chipman; Hong Gu; Joseph P Bielawski

    2014-01-01

    Metagenomics yields enormous numbers of microbial sequences that can be assigned a metabolic function. Using such data to infer community-level metabolic divergence is hindered by the lack of a suitable statistical framework. Here, we describe a novel hierarchical Bayesian model, called BiomeNet (Bayesian inference of metabolic networks), for inferring differential prevalence of metabolic subnetworks among microbial communities. To infer the structure of community-level metabolic interactions...

  2. Modeling the marketing strategy-performance relationship : towards an hierarchical marketing performance framework

    NARCIS (Netherlands)

    Huizingh, Eelko K.R.E.; Zengerink, Evelien

    2001-01-01

    Accurate measurement of marketing performance is an important topic for both marketing academics and marketing managers. Many researchers have recognized that marketing performance measurement should go beyond financial measurement. In this paper we propose a conceptual framework that models

  3. A Bayesian inversion framework for yield and height-of-burst/depth-of-burial for near-surface explosions

    Energy Technology Data Exchange (ETDEWEB)

    Johannesson, Gardar [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bulaevskaya, Vera [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ramirez, Abe [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ford, Sean [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Rodgers, Artie [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-09-07

    A Bayesian inversion framework is presented to estimate the yield of an explosion and height-of-burst/depth-of-burial (HOB/DOB) using seismic and air pressure data. This is accomplished by first calibrating the parameters in the forward models that relate the observations to the yield and HOB/DOB and then using the calibrated model to estimate yield and HOB/DOB associated with a new set of seismic and air pressure observations. The MCMC algorithms required to perform these steps are outlined, and the results with real data are shown. Finally, an extension is proposed for a case when clustering in the seismic displacement occurs as a function of different types of rock and other factors.

  4. LiDAR based prediction of forest biomass using hierarchical models with spatially varying coefficients

    Science.gov (United States)

    Chad Babcock; Andrew O. Finley; John B. Bradford; Randy Kolka; Richard Birdsey; Michael G. Ryan

    2015-01-01

    Many studies and production inventory systems have shown the utility of coupling covariates derived from Light Detection and Ranging (LiDAR) data with forest variables measured on georeferenced inventory plots through regression models. The objective of this study was to propose and assess the use of a Bayesian hierarchical modeling framework that accommodates both...

  5. A Bayesian Framework for Generalized Linear Mixed Modeling Identifies New Candidate Loci for Late-Onset Alzheimer's Disease.

    Science.gov (United States)

    Wang, Xulong; Philip, Vivek M; Ananda, Guruprasad; White, Charles C; Malhotra, Ankit; Michalski, Paul J; Karuturi, Krishna R Murthy; Chintalapudi, Sumana R; Acklin, Casey; Sasner, Michael; Bennett, David A; De Jager, Philip L; Howell, Gareth R; Carter, Gregory W

    2018-03-05

    Recent technical and methodological advances have greatly enhanced genome-wide association studies (GWAS). The advent of low-cost whole-genome sequencing facilitates high-resolution variant identification, and the development of linear mixed models (LMM) allows improved identification of putatively causal variants. While essential for correcting false positive associations due to sample relatedness and population stratification, LMMs have commonly been restricted to quantitative variables. However, phenotypic traits in association studies are often categorical, coded as binary case-control or ordered variables describing disease stages. To address these issues, we have devised a method for genomic association studies that implements a generalized linear mixed model (GLMM) in a Bayesian framework, called Bayes-GLMM. Bayes-GLMM has four major features: (1) support of categorical, binary and quantitative variables; (2) cohesive integration of previous GWAS results for related traits; (3) correction for sample relatedness by mixed modeling; and (4) model estimation by both Markov chain Monte Carlo (MCMC) sampling and maximal likelihood estimation. We applied Bayes-GLMM to the whole-genome sequencing cohort of the Alzheimer's Disease Sequencing Project (ADSP). This study contains 570 individuals from 111 families, each with Alzheimer's disease diagnosed at one of four confidence levels. With Bayes-GLMM we identified four variants in three loci significantly associated with Alzheimer's disease. Two variants, rs140233081 and rs149372995, lie between PRKAR1B and PDGFA. The coded proteins are localized to the glial-vascular unit, and PDGFA transcript levels are associated with AD-related neuropathology. In summary, this work provides implementation of a flexible, generalized mixed model approach in a Bayesian framework for association studies. Copyright © 2018, Genetics.

  6. A hierarchical spatial framework and database for the national river fish habitat condition assessment

    Science.gov (United States)

    Wang, L.; Infante, D.; Esselman, P.; Cooper, A.; Wu, D.; Taylor, W.; Beard, D.; Whelan, G.; Ostroff, A.

    2011-01-01

    Fisheries management programs, such as the National Fish Habitat Action Plan (NFHAP), urgently need a nationwide spatial framework and database for health assessment and policy development to protect and improve riverine systems. To meet this need, we developed a spatial framework and database using the National Hydrography Dataset Plus (1:100,000 scale; http://www.horizon-systems.com/nhdplus). This framework uses interconfluence river reaches and their local and network catchments as fundamental spatial river units and a series of ecological and political spatial descriptors as hierarchy structures to allow users to extract or analyze information at spatial scales that they define. This database consists of variables describing channel characteristics, network position/connectivity, climate, elevation, gradient, and size. It contains a series of natural and human-induced catchment factors that are known to influence river characteristics. Our framework and database assembles all river reaches and their descriptors in one place for the first time for the conterminous United States. This framework and database provides users with the capability of adding data, conducting analyses, developing management scenarios and regulation, and tracking management progress at a variety of spatial scales. This database provides the essential data needed for achieving the objectives of NFHAP and other management programs. The downloadable beta version database is available at http://ec2-184-73-40-15.compute-1.amazonaws.com/nfhap/main/.

  7. Uncertainty evaluation of mass discharge estimates from a contaminated site using a fully Bayesian framework

    DEFF Research Database (Denmark)

    Troldborg, Mads; Nowak, W.; Tuxen, N.

    2010-01-01

    ... it is important to quantify the associated uncertainties. Here a rigorous approach for quantifying the uncertainty in the mass discharge across a multilevel control plane is presented. The method accounts for (1) conceptual model uncertainty using multiple conceptual models and Bayesian model averaging (BMA), (2) ... for each of the conceptual models considered. The probability distribution of mass discharge is obtained by combining all ensembles via BMA. The method was applied to a trichloroethylene-contaminated site located in northern Copenhagen. Four essentially different conceptual models, based on two source zone models and two geological models, were set up for this site, each providing substantially different prior mass discharge distributions. After conditioning to data, the predicted mass discharge distributions from each of the four conceptual models all approach each other. This indicates that the data set ...

  8. Sparse Bayesian framework applied to 3D super-resolution reconstruction in fetal brain MRI

    Science.gov (United States)

    Becerra, Laura C.; Velasco Toledo, Nelson; Romero Castro, Eduardo

    2015-01-01

    Fetal Magnetic Resonance (FMR) is an imaging technique that is becoming increasingly important as it allows assessing brain development and thus making an early diagnosis of congenital abnormalities. Spatial resolution is limited by the short acquisition time and the unpredictable fetal movements; in consequence, the resulting images are characterized by non-parallel projection planes composed of anisotropic voxels. The sparse Bayesian representation is a flexible strategy which is able to model complex relationships. Super-resolution is approached as a regression problem, whose main advantage is the capability to learn data relations from observations. Quantitative performance evaluation was carried out using synthetic images; the proposed method demonstrates better reconstruction quality compared with a standard interpolation approach. The presented method is a promising approach to improve the quality of the information related to the 3-D fetal brain structure. This is important because it allows assessing brain development and thus making an early diagnosis of congenital abnormalities.

  9. Neurocognitive and behavioral predictors of social problems in ADHD: A Bayesian framework.

    Science.gov (United States)

    Kofler, Michael J; Harmon, Sherelle L; Aduen, Paula A; Day, Taylor N; Austin, Kristin E; Spiegel, Jamie A; Irwin, Lauren; Sarver, Dustin E

    2018-03-01

    Social problems are a key area of functional impairment for children with attention deficit hyperactivity disorder (ADHD), and converging evidence points to executive dysfunction as a potential mechanism underlying ADHD-related social dysfunction. The evidence is mixed, however, with regard to which neurocognitive abilities account for these relations. A well-characterized group of 117 children ages 8-13 (M = 10.45, SD = 1.53; 43 girls; 69.5% Caucasian/Non-Hispanic) with ADHD (n = 77) and without ADHD (n = 40) were administered multiple, counterbalanced tests of neurocognitive functioning and assessed for social skills via multi-informant reports. Bayesian linear regressions revealed strong support for working memory and cross-informant interfering behaviors (inattention, hyperactivity/impulsivity) as predictors of parent- and teacher-reported social problems. Working memory was also implicated in social skills acquisition deficits, performance deficits, and strengths based on parent and/or teacher report; inattention and/or hyperactivity showed strong correspondence with cross-informant social problems in all models. There was no evidence for, and in most models strong evidence against, effects of inhibitory control and processing speed. The ADHD group was impaired relative to the non-ADHD group on social skills (d = 0.82-0.88), visuospatial working memory (d = 0.89), and phonological working memory (d = 0.58). In contrast, the Bayesian ANOVAs indicated that the ADHD and non-ADHD groups were equivalent on processing speed, IQ, age, gender, and socioeconomic status (SES). There was no support for or against group differences in inhibition. These findings confirm that ADHD is associated with impaired social performance, and implicate working memory and core ADHD symptoms in the acquisition and performance of socially skilled behavior. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  10. Temporal and spatial variabilities of Antarctic ice mass changes inferred by GRACE in a Bayesian framework

    Science.gov (United States)

    Wang, L.; Davis, J. L.; Tamisiea, M. E.

    2017-12-01

    The Antarctic ice sheet (AIS) holds about 60% of all fresh water on the Earth, an amount equivalent to about 58 m of sea-level rise. Observation of AIS mass change is thus essential in determining and predicting its contribution to sea level. While the ice mass loss estimates for West Antarctica (WA) and the Antarctic Peninsula (AP) are in good agreement, what the mass balance over East Antarctica (EA) is, and whether or not it compensates for the mass loss is under debate. Besides the different error sources and sensitivities of different measurement types, complex spatial and temporal variabilities would be another factor complicating the accurate estimation of the AIS mass balance. Therefore, a model that allows for variabilities in both melting rate and seasonal signals would seem appropriate in the estimation of present-day AIS melting. We present a stochastic filter technique, which enables the Bayesian separation of the systematic stripe noise and mass signal in decade-length GRACE monthly gravity series, and allows the estimation of time-variable seasonal and inter-annual components in the signals. One of the primary advantages of this Bayesian method is that it yields statistically rigorous uncertainty estimates reflecting the inherent spatial resolution of the data. By applying the stochastic filter to the decade-long GRACE observations, we present the temporal variabilities of the AIS mass balance at basin scale, particularly over East Antarctica, and decipher the EA mass variations in the past decade, and their role in affecting overall AIS mass balance and sea level.

  11. Modeling framework for representing long-term effectiveness of best management practices in addressing hydrology and water quality problems: Framework development and demonstration using a Bayesian method

    Science.gov (United States)

    Liu, Yaoze; Engel, Bernard A.; Flanagan, Dennis C.; Gitau, Margaret W.; McMillan, Sara K.; Chaubey, Indrajeet; Singh, Shweta

    2018-05-01

    Best management practices (BMPs) are popular approaches used to improve hydrology and water quality. Uncertainties in BMP effectiveness over time may result in overestimating long-term efficiency in watershed planning strategies. To represent varying long-term BMP effectiveness in hydrologic/water quality models, a high level and forward-looking modeling framework was developed. The components in the framework consist of establishment period efficiency, starting efficiency, efficiency for each storm event, efficiency between maintenance, and efficiency over the life cycle. Combined, they represent long-term efficiency for a specific type of practice and specific environmental concern (runoff/pollutant). An approach for possible implementation of the framework was discussed. The long-term impacts of grass buffer strips (agricultural BMP) and bioretention systems (urban BMP) in reducing total phosphorus were simulated to demonstrate the framework. Data gaps were captured in estimating the long-term performance of the BMPs. A Bayesian method was used to match the simulated distribution of long-term BMP efficiencies with the observed distribution with the assumption that the observed data represented long-term BMP efficiencies. The simulated distribution matched the observed distribution well with only small total predictive uncertainties. With additional data, the same method can be used to further improve the simulation results. The modeling framework and results of this study, which can be adopted in hydrologic/water quality models to better represent long-term BMP effectiveness, can help improve decision support systems for creating long-term stormwater management strategies for watershed management projects.
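
    The idea of matching a simulated distribution of long-term BMP efficiencies to an observed one can be illustrated with a simple approximate Bayesian computation (rejection) sketch. This is not the authors' exact method; the toy simulator, priors and data below are all hypothetical.

        import numpy as np

        rng = np.random.default_rng(1)

        # Observed long-term total phosphorus removal efficiencies (as fractions) for
        # one practice type -- hypothetical values standing in for field observations.
        observed = np.array([0.45, 0.52, 0.38, 0.60, 0.41, 0.55])

        def simulate_efficiencies(start_eff, decay_rate, n=6, years=10):
            """Toy simulator: starting efficiency decaying with practice age."""
            ages = rng.uniform(0, years, size=n)
            return np.clip(start_eff * np.exp(-decay_rate * ages), 0.0, 1.0)

        # Rejection ABC: keep parameter draws whose simulated efficiency distribution
        # roughly matches the observed one (here compared via mean and spread).
        accepted = []
        for _ in range(20000):
            start_eff = rng.uniform(0.4, 0.9)   # prior on starting efficiency
            decay = rng.uniform(0.0, 0.3)       # prior on decay rate (1/yr)
            sim = simulate_efficiencies(start_eff, decay)
            if abs(sim.mean() - observed.mean()) < 0.05 and abs(sim.std() - observed.std()) < 0.05:
                accepted.append((start_eff, decay))

        accepted = np.array(accepted)
        print("accepted draws:", len(accepted))
        print("posterior mean of (starting efficiency, decay rate):", accepted.mean(axis=0))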

  12. A regional classification of unregulated stream flows: spatial resolution and hierarchical frameworks.

    Science.gov (United States)

    Ryan A. McManamay; Donald J. Orth; Charles A. Dolloff; Emmaneul A. Firmpong

    2012-01-01

    River regulation has resulted in substantial losses in habitat connectivity, biodiversity and ecosystem services. River managers are faced with a growing need to protect the key aspects of the natural flow regime. A practical approach to providing environmental flow standards is to create a regional framework by classifying unregulated streams into groups of similar...

  13. Metal organic framework synthesis in the presence of surfactants : Towards hierarchical MOFs?

    NARCIS (Netherlands)

    Seoane, B.; Dikhtiarenko, A.; Mayoral, A.; Tellez, C.; Coronas, J.; Kapteijn, F.; Gascon, J.

    2015-01-01

    The effect of synthesis pH and H2O/EtOH molar ratio on the textural properties of different aluminium trimesate metal organic frameworks (MOFs) prepared in the presence of the well-known cationic surfactant cetyltrimethylammonium bromide (CTAB) at 120 °C was studied with the purpose of obtaining a

  14. A Parametric Empirical Bayesian framework for the EEG/MEG inverse problem: generative models for multisubject and multimodal integration

    Directory of Open Access Journals (Sweden)

    Richard N Henson

    2011-08-01

    We review recent methodological developments within a Parametric Empirical Bayesian (PEB) framework for reconstructing intracranial sources of extracranial electroencephalographic (EEG) and magnetoencephalographic (MEG) data under linear Gaussian assumptions. The PEB framework offers a natural way to integrate multiple constraints (spatial priors) on this inverse problem, such as those derived from different modalities (e.g., from functional magnetic resonance imaging, fMRI) or from multiple replications (e.g., subjects). Using variations of the same basic generative model, we illustrate the application of PEB to three cases: (1) symmetric integration (fusion) of MEG and EEG; (2) asymmetric integration of MEG or EEG with fMRI; and (3) group optimisation of spatial priors across subjects. We evaluate these applications on multimodal data acquired from 18 subjects, focusing on energy induced by face perception within a time-frequency window of 100-220 ms, 8-18 Hz. We show the benefits of multi-modal, multi-subject integration in terms of the model evidence and the reproducibility (over subjects) of cortical responses to faces.
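
    The linear Gaussian source reconstruction step underlying this kind of framework can be sketched in a few lines: data y = L j + e with a Gaussian source prior and Gaussian sensor noise, giving a regularised (posterior mean) inverse. Dimensions and covariances below are illustrative; in the PEB setting the prior covariance is itself a weighted sum of covariance components whose weights (hyperparameters) are optimised, e.g. across subjects or modalities.

        import numpy as np

        rng = np.random.default_rng(3)

        n_sensors, n_sources = 32, 200
        L = rng.standard_normal((n_sensors, n_sources))   # lead field (gain) matrix
        Cj = np.eye(n_sources)                            # source prior covariance (illustrative)
        Ce = 0.1 * np.eye(n_sensors)                      # sensor noise covariance (illustrative)
        y = rng.standard_normal(n_sensors)                # one sample of sensor data

        # Posterior mean of the sources under the linear Gaussian model.
        posterior_mean = Cj @ L.T @ np.linalg.solve(L @ Cj @ L.T + Ce, y)
        print(posterior_mean.shape)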

  15. Assessment of a Bayesian Belief Network-GIS framework as a practical tool to support marine planning.

    Science.gov (United States)

    Stelzenmüller, V; Lee, J; Garnacho, E; Rogers, S I

    2010-10-01

    For the UK continental shelf we developed a Bayesian Belief Network-GIS framework to visualise relationships between cumulative human pressures, sensitive marine landscapes and landscape vulnerability, to assess the consequences of potential marine planning objectives, and to map uncertainty-related changes in management measures. Results revealed that the spatial assessment of footprints and intensities of human activities had more influence on landscape vulnerabilities than the type of landscape sensitivity measure used. We addressed questions regarding consequences of potential planning targets, and necessary management measures with spatially-explicit assessment of their consequences. We conclude that the BN-GIS framework is a practical tool allowing for the visualisation of relationships, the spatial assessment of uncertainty related to spatial management scenarios, the engagement of different stakeholder views, and enables a quick update of new spatial data and relationships. Ultimately, such BN-GIS based tools can support the decision-making process used in adaptive marine management. Copyright © 2010 Elsevier Ltd. All rights reserved.

  16. Allometric Models Based on Bayesian Frameworks Give Better Estimates of Aboveground Biomass in the Miombo Woodlands

    Directory of Open Access Journals (Sweden)

    Shem Kuyah

    2016-02-01

    The miombo woodland is the most extensive dry forest in the world, with the potential to store substantial amounts of biomass carbon. Efforts to obtain accurate estimates of carbon stocks in the miombo woodlands are limited by a general lack of biomass estimation models (BEMs). This study aimed to evaluate the accuracy of the most commonly employed allometric models for estimating aboveground biomass (AGB) in miombo woodlands, and to develop new models that enable more accurate estimation of biomass in the miombo woodlands. A generalizable mixed-species allometric model was developed from 88 trees belonging to 33 species, ranging in diameter at breast height (DBH) from 5 to 105 cm, using Bayesian estimation. A power law model with DBH alone performed better than both a polynomial model with DBH and the square of DBH, and models including height and crown area as additional variables along with DBH. The accuracy of estimates from published models varied across different sites and trees of different diameter classes, and was lower than that of estimates from our model. The model developed in this study can be used to establish conservative carbon stocks required to determine avoided emissions in performance-based payment schemes, for example in afforestation and reforestation activities.
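
    A power law allometric model of the kind reported above, AGB = a * DBH^b, is usually fitted on the log scale. The sketch below uses ordinary least squares on synthetic data rather than the full Bayesian estimation described in the record; the "true" coefficients are made up for illustration (a Bayesian fit would place priors on log a, b and the residual variance).

        import numpy as np

        rng = np.random.default_rng(2)

        # Synthetic destructive-sampling data: DBH in cm, aboveground biomass in kg.
        dbh = rng.uniform(5, 105, size=88)
        agb = 0.12 * dbh**2.4 * rng.lognormal(0.0, 0.25, size=88)   # hypothetical "truth"

        # Power law model AGB = a * DBH^b, fitted on the log scale:
        # log(AGB) = log(a) + b * log(DBH) + error.
        X = np.column_stack([np.ones_like(dbh), np.log(dbh)])
        coef, *_ = np.linalg.lstsq(X, np.log(agb), rcond=None)
        a, b = np.exp(coef[0]), coef[1]
        print(f"fitted model: AGB ~ {a:.3f} * DBH^{b:.2f}")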

  17. Particle Kalman Filtering: A Nonlinear Bayesian Framework for Ensemble Kalman Filters*

    KAUST Repository

    Hoteit, Ibrahim

    2012-02-01

    This paper investigates an approximation scheme of the optimal nonlinear Bayesian filter based on the Gaussian mixture representation of the state probability distribution function. The resulting filter is similar to the particle filter, but is different from it in that the standard weight-type correction in the particle filter is complemented by the Kalman-type correction with the associated covariance matrices in the Gaussian mixture. The authors show that this filter is an algorithm in between the Kalman filter and the particle filter, and therefore is referred to as the particle Kalman filter (PKF). In the PKF, the solution of a nonlinear filtering problem is expressed as the weighted average of an “ensemble of Kalman filters” operating in parallel. Running an ensemble of Kalman filters is, however, computationally prohibitive for realistic atmospheric and oceanic data assimilation problems. For this reason, the authors consider the construction of the PKF through an “ensemble” of ensemble Kalman filters (EnKFs) instead, and call the implementation the particle EnKF (PEnKF). It is shown that different types of the EnKFs can be considered as special cases of the PEnKF. Similar to the situation in the particle filter, the authors also introduce a resampling step to the PEnKF in order to reduce the risk of weights collapse and improve the performance of the filter. Numerical experiments with the strongly nonlinear Lorenz-96 model are presented and discussed.
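
    The Gaussian-mixture idea behind this family of filters can be illustrated in one dimension: the state pdf is a weighted sum of Gaussians, each component is propagated and updated with the Kalman formulas, and the mixture weights are updated with the components' marginal likelihoods. The linear toy model and all numbers below are illustrative, not the paper's configuration.

        import numpy as np

        F, Q = 0.95, 0.1    # state transition coefficient and process noise variance
        H, R = 1.0, 0.5     # observation operator and observation noise variance

        means = np.array([-1.0, 0.5, 2.0])      # mixture component means
        vars_ = np.array([1.0, 1.0, 1.0])       # mixture component variances
        weights = np.array([1/3, 1/3, 1/3])     # mixture weights

        def step(y, means, vars_, weights):
            # Prediction for every mixture component.
            m_pred = F * means
            p_pred = F**2 * vars_ + Q
            # Kalman update per component.
            s = H**2 * p_pred + R                       # innovation variance
            k = p_pred * H / s                          # Kalman gain
            m_post = m_pred + k * (y - H * m_pred)
            p_post = (1 - k * H) * p_pred
            # Weight update with the Gaussian marginal likelihood of the observation.
            lik = np.exp(-0.5 * (y - H * m_pred)**2 / s) / np.sqrt(2 * np.pi * s)
            w = weights * lik
            return m_post, p_post, w / w.sum()

        means, vars_, weights = step(y=0.8, means=means, vars_=vars_, weights=weights)
        print("posterior mixture mean:", np.sum(weights * means))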

  18. A Bayesian framework to identify methylcytosines from high-throughput bisulfite sequencing data.

    Directory of Open Access Journals (Sweden)

    Qing Xie

    2014-09-01

    High-throughput bisulfite sequencing technologies have provided a comprehensive and well-fitted way to investigate DNA methylation at single-base resolution. However, there are substantial bioinformatic challenges to distinguish precisely methylcytosines from unconverted cytosines based on bisulfite sequencing data. The challenges arise, at least in part, from cell heterozygosis caused by multicellular sequencing and the still limited number of statistical methods that are available for methylcytosine calling based on bisulfite sequencing data. Here, we present an algorithm, termed Bycom, a new Bayesian model that can perform methylcytosine calling with high accuracy. Bycom considers cell heterozygosis along with sequencing errors and bisulfite conversion efficiency to improve calling accuracy. Bycom performance was compared with the performance of Lister, the method most widely used to identify methylcytosines from bisulfite sequencing data. The results showed that the performance of Bycom was better than that of Lister for data with high methylation levels. Bycom also showed higher sensitivity and specificity for low methylation level samples (<1%) than Lister. A validation experiment based on reduced representation bisulfite sequencing data suggested that Bycom had a false positive rate of about 4% while maintaining an accuracy of close to 94%. This study demonstrated that Bycom had a low false calling rate at any methylation level and accurate methylcytosine calling at high methylation levels. Bycom will contribute significantly to studies aimed at recalibrating the methylation level of genomic regions based on the presence of methylcytosines.
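
    A stripped-down version of this kind of calling problem is a two-hypothesis Bayes computation on the read counts at one site, with bisulfite conversion efficiency and sequencing error in the likelihoods. This is only a sketch of the general idea, not the Bycom model itself (which additionally handles intermediate methylation levels from cell heterozygosis); all rates are illustrative.

        from math import comb

        def posterior_methylated(n_c, n_t, conv_eff=0.99, seq_err=0.01, prior_meth=0.5):
            """Posterior probability that a cytosine site is methylated, given n_c reads
            supporting methylation (unconverted C) and n_t converted (T) reads.

            Simplified two-hypothesis model: an unmethylated cytosine yields a C read
            only through failed conversion or a sequencing error, while a methylated
            cytosine yields a C read unless hit by a sequencing error."""
            n = n_c + n_t
            p_c_unmeth = (1 - conv_eff) * (1 - seq_err) + conv_eff * seq_err
            p_c_meth = 1 - seq_err
            lik_unmeth = comb(n, n_c) * p_c_unmeth**n_c * (1 - p_c_unmeth)**n_t
            lik_meth = comb(n, n_c) * p_c_meth**n_c * (1 - p_c_meth)**n_t
            return prior_meth * lik_meth / (prior_meth * lik_meth + (1 - prior_meth) * lik_unmeth)

        print(posterior_methylated(n_c=9, n_t=1))   # high posterior: likely methylated
        print(posterior_methylated(n_c=1, n_t=9))   # low posterior: likely unmethylated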

  19. Low frequency full waveform seismic inversion within a tree based Bayesian framework

    Science.gov (United States)

    Ray, Anandaroop; Kaplan, Sam; Washbourne, John; Albertin, Uwe

    2018-01-01

    Limited illumination, insufficient offset, noisy data and poor starting models can pose challenges for seismic full waveform inversion. We present an application of a tree based Bayesian inversion scheme which attempts to mitigate these problems by accounting for data uncertainty while using a mildly informative prior about subsurface structure. We sample the resulting posterior model distribution of compressional velocity using a trans-dimensional (trans-D) or Reversible Jump Markov chain Monte Carlo method in the wavelet transform domain of velocity. This allows us to attain rapid convergence to a stationary distribution of posterior models while requiring a limited number of wavelet coefficients to define a sampled model. Two synthetic, low frequency, noisy data examples are provided. The first example is a simple reflection + transmission inverse problem, and the second uses a scaled version of the Marmousi velocity model, dominated by reflections. Both examples are initially started from a semi-infinite half-space with incorrect background velocity. We find that the trans-D tree based approach together with parallel tempering for navigating rugged likelihood (i.e. misfit) topography provides a promising, easily generalized method for solving large-scale geophysical inverse problems which are difficult to optimize, but where the true model contains a hierarchy of features at multiple scales.

  20. Palaeoenvironmental transfer functions in a bayesian framework with application to holocene climate variability in the near east

    Energy Technology Data Exchange (ETDEWEB)

    Schoelzel, C. [Bonn Univ. (Germany). Meteorologisches Inst.

    2006-07-01

    Ram data requires the derivation of the posterior distribution given the coexistence of the taxa found in the fossil spectrum. For both models, the prior densities are informative, using the recent climate mean and the largest expected Holocene variation as variance. The results are probabilistic reconstructions of the climate of the Holocene Near East in a Bayesian framework. (orig.)

  1. A Hierarchical Framework for Evaluation and Informed Decision Making Regarding Smartphone Apps for Clinical Care.

    Science.gov (United States)

    Torous, John Blake; Chan, Steven Richard; Gipson, Shih Yee-Marie Tan; Kim, Jung Won; Nguyen, Thuc-Quyen; Luo, John; Wang, Philip

    2018-02-15

    With thousands of smartphone apps targeting mental health, it is difficult to ignore the rapidly expanding use of apps in the treatment of psychiatric disorders. Patients with psychiatric conditions are interested in mental health apps and have begun to use them. That does not mean that clinicians must support, endorse, or even adopt the use of apps, but they should be prepared to answer patients' questions about apps and facilitate shared decision making around app use. This column describes an evaluation framework designed by the American Psychiatric Association to guide informed decision making around the use of smartphone apps in clinical care.

  2. Hierarchical Structure and Molecular Dynamics of Metal-Organic Framework as Characterized by Solid State NMR

    Directory of Open Access Journals (Sweden)

    Wei Chen

    2016-01-01

    Metal-organic framework (MOF) stands out as a promising material with great potential in application areas, such as gas separation and catalysis, due to its extraordinary properties. In order to fully characterize the structure of MOFs, especially those without single crystals, Solid State NMR (SSNMR) is an indispensable tool. As a complementary analytical technique to X-ray diffraction, SSNMR could provide detailed atomic-level structure information. Meanwhile, SSNMR can characterize molecular dynamics over a wide dynamic range. In this review, selected applications of SSNMR on various MOFs are summarized and discussed.

  3. Bayesian integration of large SNA data frameworks with an application to Guatemala

    NARCIS (Netherlands)

    Van Tongeren, J.W.; Magnus, J.R.

    2012-01-01

    In searching for an appropriate utility function in the expected utility framework, we formulate four properties that we want the utility function to satisfy. We conduct a search for such a function, and we identify Pareto utility as a function satisfying all four desired properties. Pareto utility

  4. Enabling Sustainability: Hierarchical Need-Based Framework for Promoting Sustainable Data Infrastructure in Developing Countries

    Directory of Open Access Journals (Sweden)

    David O. Yawson

    2009-11-01

    The paper presents thoughts on Sustainable Data Infrastructure (SDI) development, and its user requirements bases. It brings Maslow's motivational theory to the fore, and proposes it as a rationalization mechanism for entities (mostly governmental) that aim at realizing SDI. Maslow's theory, though well-known, is somewhat new in geospatial circles; this is where the novelty of the paper resides. SDI has been shown to enable and aid development in diverse ways. However, stimulating developing countries to appreciate the utility of SDI, implement, and use SDI in achieving sustainable development has proven to be an imposing challenge. One of the key reasons for this could be the absence of a widely accepted psychological theory to drive needs assessment and intervention design for the purpose of SDI development. As a result, it is reasonable to explore Maslow’s theory of human motivation as a psychological theory for promoting SDI in developing countries. In this article, we review and adapt Maslow’s hierarchy of needs as a framework for the assessment of the needs of developing nations. The paper concludes with the implications of this framework for policy with the view to stimulating the implementation of SDI in developing nations.

  5. A Framework for Land Cover Classification Using Discrete Return LiDAR Data: Adopting Pseudo-Waveform and Hierarchical Segmentation

    Science.gov (United States)

    Jung, Jinha; Pasolli, Edoardo; Prasad, Saurabh; Tilton, James C.; Crawford, Melba M.

    2014-01-01

    Acquiring current, accurate land-use information is critical for monitoring and understanding the impact of anthropogenic activities on natural environments. Remote sensing technologies are of increasing importance because of their capability to acquire information for large areas in a timely manner, enabling decision makers to be more effective in complex environments. Although optical imagery has been demonstrated to be successful for land cover classification, active sensors, such as light detection and ranging (LiDAR), have distinct capabilities that can be exploited to improve classification results. However, utilization of LiDAR data for land cover classification has not been fully exploited. Moreover, spatial-spectral classification has recently gained significant attention since classification accuracy can be improved by extracting additional information from the neighboring pixels. Although spatial information has been widely used for spectral data, less attention has been given to LiDAR data. In this work, a new framework for land cover classification using discrete return LiDAR data is proposed. Pseudo-waveforms are generated from the LiDAR data and processed by hierarchical segmentation. Spatial features are extracted in a region-based way using a new unsupervised strategy for multiple pruning of the segmentation hierarchy. The proposed framework is validated experimentally on a real dataset acquired in an urban area. Better classification results are exhibited by the proposed framework compared to the cases in which basic LiDAR products such as digital surface model and intensity image are used. Moreover, the proposed region-based feature extraction strategy results in improved classification accuracies in comparison with a more traditional window-based approach.

  6. A Nonparametric Bayesian Approach to Seismic Hazard Modeling Using the ETAS Framework

    Science.gov (United States)

    Ross, G.

    2015-12-01

    The epidemic-type aftershock sequence (ETAS) model is one of the most popular tools for modeling seismicity and quantifying risk in earthquake-prone regions. Under the ETAS model, the occurrence times of earthquakes are treated as a self-exciting Poisson process where each earthquake briefly increases the probability of subsequent earthquakes occurring soon afterwards, which captures the fact that large mainshocks tend to produce long sequences of aftershocks. A triggering kernel controls the amount by which the probability increases based on the magnitude of each earthquake, and the rate at which it then decays over time. This triggering kernel is usually chosen heuristically, to match the parametric form of the modified Omori law for aftershock decay. However, recent work has questioned whether this is an appropriate choice. Since the choice of kernel has a large impact on the predictions made by the ETAS model, avoiding misspecification is crucially important. We present a novel nonparametric version of ETAS which avoids making parametric assumptions, and instead learns the correct specification from the data itself. Our approach is based on the Dirichlet process, which is a modern class of Bayesian prior distribution which allows for efficient inference over an infinite dimensional space of functions. We show how our nonparametric ETAS model can be fit to data, and present results demonstrating that the fit is greatly improved compared to the standard parametric specification. Additionally, we explain how our model can be used to perform probabilistic declustering of earthquake catalogs, to classify earthquakes as being either aftershocks or mainshocks, and to learn the causal relations between pairs of earthquakes.
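
    For reference, the standard parametric ETAS conditional intensity that the nonparametric version generalises can be sketched directly; the parameter values below are illustrative only, and the nonparametric model described above would replace the fixed Omori-type kernel with one learned from the data.

        import numpy as np

        def etas_intensity(t, history_times, history_mags, mu=0.2, K=0.05,
                           alpha=1.5, c=0.01, p=1.1, m0=3.0):
            """Conditional intensity of the standard parametric ETAS model,

                lambda(t) = mu + sum_i K * exp(alpha * (m_i - m0)) * (t - t_i + c)**(-p),

            summing the Omori-type triggering kernel over past events t_i < t."""
            history_times = np.asarray(history_times, dtype=float)
            history_mags = np.asarray(history_mags, dtype=float)
            past = history_times < t
            dt = t - history_times[past]
            return mu + np.sum(K * np.exp(alpha * (history_mags[past] - m0)) * (dt + c) ** (-p))

        # Example: background rate plus excitation from two past earthquakes.
        print(etas_intensity(t=10.0, history_times=[2.0, 9.5], history_mags=[5.0, 4.2]))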

  7. Globally COnstrained Local Function Approximation via Hierarchical Modelling, a Framework for System Modelling under Partial Information

    DEFF Research Database (Denmark)

    Øjelund, Henrik; Sadegh, Payman

    2000-01-01

    Local function approximations concern fitting low order models to weighted data in neighbourhoods of the points where the approximations are desired. Despite their generality and convenience of use, local models typically suffer, among others, from difficulties arising in physical interpretation ... be obtained. This paper presents a new approach for system modelling under partial (global) information (or the so-called Gray-box modelling) that seeks to preserve the benefits of the global as well as local methodologies within a unified framework. While the proposed technique relies on local approximations ... simultaneously with the (local estimates of) function values. The approach is applied to modelling of a linear time variant dynamic system under prior linear time invariant structure, where local regression fails as a result of high dimensionality.

  8. Bayesian Group Bridge for Bi-level Variable Selection.

    Science.gov (United States)

    Mallick, Himel; Yi, Nengjun

    2017-06-01

    A Bayesian bi-level variable selection method (BAGB: Bayesian Analysis of Group Bridge) is developed for regularized regression and classification. This new development is motivated by grouped data, where generic variables can be divided into multiple groups, with variables in the same group being mechanistically related or statistically correlated. As an alternative to frequentist group variable selection methods, BAGB incorporates structural information among predictors through a group-wise shrinkage prior. Posterior computation proceeds via an efficient MCMC algorithm. In addition to the usual ease-of-interpretation of hierarchical linear models, the Bayesian formulation produces valid standard errors, a feature that is notably absent in the frequentist framework. Empirical evidence of the attractiveness of the method is illustrated by extensive Monte Carlo simulations and real data analysis. Finally, several extensions of this new approach are presented, providing a unified framework for bi-level variable selection in general models with flexible penalties.

  9. Inverting Glacial Isostatic Adjustment with Paleo Sea Level Records using Bayesian Framework and Burgers Rheology

    Science.gov (United States)

    Caron, L.; Metivier, L.; Greff-Lefftz, M.; Fleitout, L.; Rouby, H.

    2015-12-01

    Glacial Isostatic Adjustment models most often assume a mantle with a viscoelastic Maxwell rheology and a given ice history model. Here we use a Bayesian Monte Carlo with Markov Chains formalism to invert the global GIA signal simultaneously for the mechanical properties of the mantle and for the volume of the various ice sheets, using as starting ice models two distinct previously published ice histories. Burgers as well as Maxwell rheologies are considered. The fitted data consist of 5720 paleo sea level records from the last 35 kyr, with a world-wide distribution. Our ambition is to present not only the best fitting model, but also the range of possible solutions (within the explored space of parameters) with their respective probability of explaining the data, and thus reveal the trade-off effects and range of uncertainty affecting the parameters. Our a posteriori probability maps exhibit in all cases two distinct peaks: both are characterized by an upper mantle viscosity around 5×10^20 Pa s, but one of the peaks features a lower mantle viscosity around 3×10^21 Pa s while the other indicates a lower mantle viscosity of more than 1×10^22 Pa s. The global maximum depends upon the starting ice history and the chosen rheology: the first peak (P1) has the highest probability only in the case with a Maxwell rheology and ice history based on ICE-5G, while the second peak (P2) is favored when using ANU-based ice history or Burgers rheology, and is our preferred solution as it is also consistent with long-term geodynamics and gravity gradient anomalies over Laurentide. P2 is associated with larger volumes for the Laurentian and Fennoscandian ice sheets and, as a consequence of total ice volume balance, smaller volumes for the Antarctic ice sheet. This last point interferes with the estimate of present-day ice melting in Antarctica from GRACE data. Finally, we find that P2 with Burgers rheology favors the existence of a tectosphere, i.e. a viscous sublithospheric layer.

  10. Inverting Glacial Isostatic Adjustment signal using Bayesian framework and two linearly relaxing rheologies

    Science.gov (United States)

    Caron, L.; Métivier, L.; Greff-Lefftz, M.; Fleitout, L.; Rouby, H.

    2017-05-01

    Glacial Isostatic Adjustment (GIA) models commonly assume a mantle with a viscoelastic Maxwell rheology and a fixed ice history model. Here, we use a Bayesian Monte Carlo approach with a Markov chain formalism to invert the global GIA signal simultaneously for the mechanical properties of the mantle and the volumes of the ice sheets, using as starting ice models two previously published ice histories. Two stress relaxing rheologies are considered: Burgers and Maxwell linear viscoelasticities. A total of 5720 global palaeo sea level records are used, covering the last 35 kyr. Our goal is not only to seek the model best fitting this data set, but also to determine and display the range of possible solutions with their respective probability of explaining the data. In all cases, our a posteriori probability maps exhibit the classic character of solutions for GIA-determined mantle viscosity with two distinct peaks. What is new in our treatment is the presence of the bi-viscous Burgers rheology and the fact that we invert rheology jointly with ice history, in combination with the greatly expanded palaeo sea level records. The solutions tend to be characterized by an upper-mantle viscosity of around 5 × 10^20 Pa s, with one preferred lower-mantle viscosity at about 3 × 10^21 Pa s and the other at more than 2 × 10^22 Pa s, a rather classical pairing. Best-fitting models depend upon the starting ice history and the stress relaxing law. A first peak (P1) has the highest probability only in the case with a Maxwell rheology and ice history based on ICE-5G, while the second peak (P2) is favoured for ANU-based ice history or Burgers stress relaxation. The latter solution also may satisfy lower-mantle viscosity inferences from long-term geodynamics and gravity gradient anomalies over Laurentia. P2 is also consistent with large Laurentian and Fennoscandian ice-sheet volumes at the Last Glacial Maximum (LGM) and smaller LGM Antarctic ice volume than in either ICE-5G or ANU. Exploration of
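
    To make the idea of an a posteriori probability map concrete, the following sketch evaluates a two-parameter posterior (upper- and lower-mantle viscosity) on a grid against synthetic relative sea level data. The `forward` function is a hypothetical stand-in, not a GIA model, and all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in forward model: predicted relative sea level at one site
# as a smooth function of log10 upper- and lower-mantle viscosities.
def forward(log_eta_um, log_eta_lm, t):
    tau = 2.0 + 0.3 * (log_eta_um - 20.0) + 0.8 * (log_eta_lm - 21.0)  # kyr
    return 40.0 * np.exp(-t / np.clip(tau, 0.5, None))                 # metres

# Synthetic "palaeo sea level" observations (ages in kyr BP) with noise.
t_obs = np.linspace(1.0, 15.0, 12)
rsl_obs = forward(20.7, 21.5, t_obs) + rng.normal(scale=2.0, size=t_obs.size)

# Evaluate the (unnormalised) posterior on a grid, with flat priors over the box.
um = np.linspace(20.0, 21.5, 120)   # log10 Pa s, upper mantle
lm = np.linspace(21.0, 23.0, 160)   # log10 Pa s, lower mantle
logpost = np.empty((um.size, lm.size))
for i, a in enumerate(um):
    for j, b in enumerate(lm):
        resid = rsl_obs - forward(a, b, t_obs)
        logpost[i, j] = -0.5 * np.sum((resid / 2.0) ** 2)

post = np.exp(logpost - logpost.max())
post /= post.sum()

best = np.unravel_index(post.argmax(), post.shape)
print(f"MAP estimate: log10(eta_um) = {um[best[0]]:.2f}, log10(eta_lm) = {lm[best[1]]:.2f}")
print("marginal posterior over the lower-mantle grid (first 5 values):",
      post.sum(axis=0)[:5].round(4))
```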

  11. A Bayesian network based framework for real-time crash prediction on the basic freeway segments of urban expressways.

    Science.gov (United States)

    Hossain, Moinul; Muromachi, Yasunori

    2012-03-01

    The concept of measuring the crash risk for a very short time window in the near future is gaining more practicality due to the recent advancements in the fields of information systems and traffic sensor technology. Although some real-time crash prediction models have already been proposed, they are still primitive in nature and require substantial improvements to be implemented in real life. This manuscript investigates the major shortcomings of the existing models and offers solutions to overcome them with an improved framework and modeling method. It employs a random multinomial logit model to identify the most important predictors as well as the most suitable detector locations to acquire data to build such a model. Afterwards, it applies a Bayesian belief net (BBN) to build the real-time crash prediction model. The model has been constructed using high resolution detector data collected from the Shibuya 3 and Shinjuku 4 expressways under the jurisdiction of Tokyo Metropolitan Expressway Company Limited, Japan. It has been specifically built for basic freeway segments and it predicts the chance of formation of a hazardous traffic condition within the next 4-9 min for a particular 250-meter-long road section. The performance evaluation results reflect that at an average threshold value the model is able to successfully classify 66% of the future crashes with a false alarm rate of less than 20%. Copyright © 2011 Elsevier Ltd. All rights reserved.
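
    Performance figures of the kind quoted above (the share of crashes classified correctly versus the false alarm rate at a chosen threshold) can be computed from predicted crash probabilities and observed outcomes as in this sketch; the scores and labels are simulated stand-ins, not the Tokyo detector data or the BBN output.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical predicted crash probabilities for short prediction windows, plus
# labels (1 = a crash followed within the prediction horizon, 0 = no crash).
n = 5000
labels = (rng.uniform(size=n) < 0.05).astype(int)
scores = np.clip(0.3 * labels + rng.beta(2, 8, size=n), 0, 1)

def classification_rates(scores, labels, threshold):
    """Crash classification rate and false alarm rate at a given alarm threshold."""
    alarm = scores >= threshold
    detection = alarm[labels == 1].mean()      # share of crash windows flagged
    false_alarm = alarm[labels == 0].mean()    # share of non-crash windows flagged
    return detection, false_alarm

for thr in (0.2, 0.3, 0.4, 0.5):
    det, fa = classification_rates(scores, labels, thr)
    print(f"threshold {thr:.1f}: detection {det:.2f}, false alarm {fa:.2f}")
```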

  12. What drives the perceptual change resulting from speech motor adaptation? Evaluation of hypotheses in a Bayesian modeling framework

    Science.gov (United States)

    Perrier, Pascal; Schwartz, Jean-Luc; Diard, Julien

    2018-01-01

    Shifts in perceptual boundaries resulting from speech motor learning induced by perturbations of the auditory feedback were taken as evidence for the involvement of motor functions in auditory speech perception. Beyond this general statement, the precise mechanisms underlying this involvement are not yet fully understood. In this paper we propose a quantitative evaluation of some hypotheses concerning the motor and auditory updates that could result from motor learning, in the context of various assumptions about the roles of the auditory and somatosensory pathways in speech perception. This analysis was made possible by the use of a Bayesian model that implements these hypotheses by expressing the relationships between speech production and speech perception in a joint probability distribution. The evaluation focuses on how the hypotheses can (1) predict the location of perceptual boundary shifts once the perturbation has been removed, (2) account for the magnitude of the compensation in the presence of the perturbation, and (3) describe the correlation between these two behavioral characteristics. Experimental findings about changes in speech perception following adaptation to auditory feedback perturbations serve as reference. Simulations suggest that they are compatible with a framework in which motor adaptation updates both the auditory-motor internal model and the auditory characterization of the perturbed phoneme, and where perception involves both auditory and somatosensory pathways. PMID:29357357

  13. Inversion and uncertainty of highly parameterized models in a Bayesian framework by sampling the maximal conditional posterior distribution of parameters

    Science.gov (United States)

    Mara, Thierry A.; Fajraoui, Noura; Younes, Anis; Delay, Frederick

    2015-02-01

    We introduce the concept of the maximal conditional posterior distribution (MCPD) to assess the uncertainty of model parameters in a Bayesian framework. Although Markov chain Monte Carlo (MCMC) methods are particularly suited to this task, they become challenging with highly parameterized nonlinear models. The MCPD represents the conditional probability distribution function of a given parameter knowing that the other parameters maximize the conditional posterior density function. Unlike MCMC, which accepts or rejects solutions sampled in the parameter space, the MCPD is calculated through several optimization processes. Model inversion using the MCPD algorithm is particularly useful for highly parameterized problems because the calculations are independent. Consequently, they can be evaluated simultaneously on a multi-core computer. In the present work, the MCPD approach is applied to invert a 2D stochastic groundwater flow problem where the log-transmissivity field of the medium is inferred from scarce and noisy data. For this purpose, the stochastic field is expanded onto a set of orthogonal functions using a Karhunen-Loève (KL) transformation. Though the prior guess on the stochastic structure (covariance) of the transmissivity field is erroneous, the MCPD inference of the KL coefficients is able to extract relevant inverse solutions.
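
    A minimal sketch of the MCPD idea, assuming a toy two-parameter Gaussian posterior in place of the groundwater inverse problem: for each value of the parameter of interest on a grid, the posterior is maximized over the remaining parameters with `scipy.optimize.minimize`, and the resulting profile is normalized into a one-dimensional density.

```python
import numpy as np
from scipy.optimize import minimize

# Toy 2-parameter posterior (correlated Gaussian) standing in for the posterior
# of a highly parameterized inverse problem.
cov = np.array([[1.0, 0.8],
                [0.8, 1.0]])
prec = np.linalg.inv(cov)

def log_post(theta):
    return -0.5 * theta @ prec @ theta

def mcpd(param_index, grid, theta_dim=2):
    """Maximal conditional posterior density along one parameter: for each fixed
    value on the grid, maximize the posterior over all the other parameters."""
    others = [k for k in range(theta_dim) if k != param_index]
    densities = []
    for value in grid:
        def neg_log_post_given(value_free):
            theta = np.empty(theta_dim)
            theta[param_index] = value
            theta[others] = value_free
            return -log_post(theta)
        res = minimize(neg_log_post_given, x0=np.zeros(len(others)))
        densities.append(np.exp(-res.fun))
    d = np.array(densities)
    return d / (d.sum() * (grid[1] - grid[0]))   # normalize to a 1D density

grid = np.linspace(-4, 4, 81)
density = mcpd(0, grid)
print("MCPD mode of theta_0:", grid[density.argmax()])
```

    Because each grid point is an independent optimization, the loop can be distributed across cores, which is the computational advantage the abstract highlights.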

  14. Synthesis of hierarchical porous carbon monoliths with incorporated metal-organic frameworks for enhancing volumetric based CO₂ capture capability.

    Science.gov (United States)

    Qian, Dan; Lei, Cheng; Hao, Guang-Ping; Li, Wen-Cui; Lu, An-Hui

    2012-11-01

    This work aims to optimize the structural features of a hierarchical porous carbon monolith (HCM) by incorporating the advantages of the metal-organic framework (MOF) Cu₃(BTC)₂ to maximize the volumetric CO₂ capture capability (CO₂ capacity in cm³ per cm³ of adsorbent), which is required for the practical application of CO₂ capture. The monolithic HCM was used as a matrix, in which Cu₃(BTC)₂ was synthesized in situ, to form HCM-Cu₃(BTC)₂ composites by a step-by-step impregnation and crystallization method. The resulting HCM-Cu₃(BTC)₂ composites, which retain the monolithic shape and exhibit the hybrid structural features of both HCM and Cu₃(BTC)₂, show a high CO₂ uptake of 22.7 cm³ cm⁻³ on a volumetric basis. This value is nearly twice the uptake of the original HCM. The dynamic gas separation measurement of HCM-Cu₃(BTC)₂, using 16% (v/v) CO₂ in N₂ as feedstock, illustrates that CO₂ can be easily separated from N₂ under ambient conditions, achieving a high separation factor for CO₂ over N₂, ranging from 67 to 100 and reflecting strongly competitive CO₂ adsorption by the composite. Facile CO₂ release can be realized by purging an argon flow through the fixed-bed adsorber at 25 °C, indicating good regeneration ability.

  15. Unified framework for triaxial accelerometer-based fall event detection and classification using cumulants and hierarchical decision tree classifier.

    Science.gov (United States)

    Kambhampati, Satya Samyukta; Singh, Vishal; Manikandan, M Sabarimalai; Ramkumar, Barathram

    2015-08-01

    In this Letter, the authors present a unified framework for fall event detection and classification using cumulants extracted from acceleration (ACC) signals acquired with a single waist-mounted triaxial accelerometer. The main objective of this Letter is to find suitable representative cumulants and classifiers for effectively detecting and classifying different types of fall and non-fall events. The first level of the proposed hierarchical decision tree algorithm implements fall detection using fifth-order cumulants and a support vector machine (SVM) classifier. In the second level, the fall event classification algorithm uses the fifth-order cumulants and SVM. Finally, human activity classification is performed using the second-order cumulants and SVM. The detection and classification results are compared with those of decision tree, naive Bayes, multilayer perceptron and SVM classifiers with different types of time-domain features, including the second-, third-, fourth- and fifth-order cumulants and the signal magnitude vector and signal magnitude area. The experimental results demonstrate that the second- and fifth-order cumulant features and the SVM classifier can achieve optimal detection and classification rates of above 95%, as well as a false alarm rate as low as 1.03%.

  16. Time-Varying Identification Model for Crack Monitoring Data from Concrete Dams Based on Support Vector Regression and the Bayesian Framework

    OpenAIRE

    Chen, Bo; Wu, Zhongru; Liang, Jiachen; Dou, Yanhong

    2017-01-01

    The modeling of cracks and identification of dam behavior changes are difficult issues in dam health monitoring research. In this paper, a time-varying identification model for crack monitoring data is built using support vector regression (SVR) and the Bayesian evidence framework (BEF). First, the SVR method is adopted for better modeling of the nonlinear relationship between the crack opening displacement (COD) and its influencing factors. Second, the BEF approach is applied to determine th...

  17. Reverse Flow Routing in a Bayesian Framework Using a GPU-accelerated 2D Shallow Water Model

    Science.gov (United States)

    D'Oria, M.; Ferrari, A.; Mignosa, P.; Tanda, M. G.; Vacondio, R.

    2017-12-01

    Knowledge of discharge hydrographs in specific sites of natural rivers is important for water resource management, flood frequency analysis, design of structures, etc. Often, the flood hydrograph is needed in a river section upstream of a monitoring station; here the flood wave differs from the upstream one because of the effects of resistance, channel storage, lateral inflow, etc. Reverse flow routing is a method that allows obtaining hydrographs at upstream ungauged stations using information available at downstream monitored sites. In this study, we propose an inverse procedure, based on a Bayesian Geostatistical Approach, to solve the reverse problem. The upstream flow values over time (parameters) are considered as random variables, and a-priori information about the parameters and observations (downstream discharge or water level values) are combined together in a Bayesian framework. The methodology needs a forward model of the considered open channel that includes the upstream ungauged station and the downstream gauged one and is able to describe, with sufficient accuracy, the hydraulic routing processes. In many real cases, especially when large floodable areas are involved, a 1D hydraulic model is not able to capture the complex river hydrodynamics and a 2D model must be used. The inverse procedure requires a high number of flow model runs to linearize the forward problem through multiple evaluations of a Jacobian matrix (sensitivity of each observation to each parameter) using a finite difference approach. For this reason, the computational efficiency of the forward model is a crucial element in reducing the overall computational costs. Therefore, in this work we used, in combination with the inverse procedure, a GPU-parallel numerical model for the solution of the 2D Shallow Water equations (implemented in CUDA/C++ code) that allows achieving ratios of physical to computational time of about 500-1000 (depending on the test case features). In addition
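
    The linearization step described above hinges on a finite-difference Jacobian, one forward run per parameter. The sketch below shows that bookkeeping with a hypothetical toy routing model standing in for the GPU-accelerated 2D shallow water solver.

```python
import numpy as np

def finite_difference_jacobian(forward_model, params, eps=1e-4):
    """Sensitivity of each observation to each parameter, J[i, j] = d obs_i / d param_j,
    approximated by one-sided finite differences (one forward run per parameter)."""
    base = np.asarray(forward_model(params), dtype=float)
    J = np.empty((base.size, params.size))
    for j in range(params.size):
        perturbed = params.copy()
        perturbed[j] += eps
        J[:, j] = (np.asarray(forward_model(perturbed)) - base) / eps
    return J

# Hypothetical stand-in for the 2D shallow-water model: downstream discharge as a
# delayed, attenuated linear response to the upstream hydrograph ordinates.
def toy_routing(upstream_hydrograph):
    kernel = np.array([0.1, 0.3, 0.4, 0.2])
    return np.convolve(upstream_hydrograph, kernel)[: upstream_hydrograph.size]

params = np.full(24, 100.0)              # upstream flow ordinates (m3/s)
J = finite_difference_jacobian(toy_routing, params)
print("Jacobian shape:", J.shape)        # (n_observations, n_parameters)
```

    With n upstream hydrograph ordinates, each linearization costs n + 1 forward runs, which is why the abstract stresses the speed of the GPU model.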

  18. A bayesian hierarchical model for spatio-temporal prediction and uncertainty assessment using repeat LiDAR acquisitions for the Kenai Peninsula, AK, USA

    Science.gov (United States)

    Chad Babcock; Hans Andersen; Andrew O. Finley; Bruce D. Cook

    2015-01-01

    Models leveraging repeat LiDAR and field collection campaigns may be one possible mechanism to monitor carbon flux in remote forested regions. Here, we look to the spatio-temporally data-rich Kenai Peninsula in Alaska, USA to examine the potential for Bayesian spatio-temporal mapping of terrestrial forest carbon storage and uncertainty.

  19. A Framework for Probabilistic Multi-Hazard Assessment of Rain-Triggered Lahars Using Bayesian Belief Networks

    Directory of Open Access Journals (Sweden)

    Pablo Tierz

    2017-09-01

    Full Text Available Volcanic water-sediment flows, commonly known as lahars, can often pose a higher threat to population and infrastructure than primary volcanic hazardous processes such as tephra fallout and Pyroclastic Density Currents (PDCs). Lahars are volcaniclastic flows of water, volcanic debris and entrained sediments that can travel long distances from their source, causing severe damage by impact and burial. Lahars are frequently triggered by intense or prolonged rainfall occurring after explosive eruptions, and their occurrence depends on numerous factors including the spatio-temporal rainfall characteristics, the spatial distribution and hydraulic properties of the tephra deposit, and the pre- and post-eruption topography. Modeling (and forecasting) such a complex system requires the quantification of aleatory variability in the lahar triggering and propagation. To fulfill this goal, we develop a novel framework for probabilistic hazard assessment of lahars within a multi-hazard environment, based on coupling a versatile probabilistic model for lahar triggering (a Bayesian Belief Network: Multihaz) with a dynamic physical model for lahar propagation (LaharFlow). Multihaz allows us to estimate the probability of lahars of different volumes occurring by merging varied information about regional rainfall, scientific knowledge on lahar triggering mechanisms and, crucially, probabilistic assessment of available pyroclastic material from tephra fallout and PDCs. LaharFlow propagates the aleatory variability modeled by Multihaz into hazard footprints of lahars. We apply our framework to Somma-Vesuvius (Italy) because: (1) the volcano is strongly lahar-prone based on its previous activity, (2) there are many possible source areas for lahars, and (3) there is a high density of population nearby. Our results indicate that the size of the eruption preceding the lahar occurrence and the spatial distribution of tephra accumulation have a paramount role in the lahar

  20. Flexible Bayesian Human Fecundity Models.

    Science.gov (United States)

    Kim, Sungduk; Sundaram, Rajeshwari; Buck Louis, Germaine M; Pyper, Cecilia

    2012-12-01

    Human fecundity is an issue of considerable interest for both epidemiological and clinical audiences, and is dependent upon a couple's biologic capacity for reproduction coupled with behaviors that place a couple at risk for pregnancy. Bayesian hierarchical models have been proposed to better model the conception probabilities by accounting for the acts of intercourse around the day of ovulation, i.e., during the fertile window. These models can be viewed in the framework of a generalized nonlinear model with an exponential link. However, a fixed choice of link function may not always provide the best fit, leading to potentially biased estimates for probability of conception. Motivated by this, we propose a general class of models for fecundity by relaxing the choice of the link function under the generalized nonlinear model framework. We use a sample from the Oxford Conception Study (OCS) to illustrate the utility and fit of this general class of models for estimating human conception. Our findings reinforce the need for attention to be paid to the choice of link function in modeling conception, as it may bias the estimation of conception probabilities. Various properties of the proposed models are examined and a Markov chain Monte Carlo sampling algorithm was developed for implementing the Bayesian computations. The deviance information criterion measure and logarithm of pseudo marginal likelihood are used for guiding the choice of links. The supplemental material section contains technical details of the proof of the theorem stated in the paper, and contains further simulation results and analysis.
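
    The sensitivity of estimated conception probabilities to the link function can be illustrated with a small sketch: a cycle-level linear predictor built from intercourse indicators and day-specific rates is pushed through two different links. The exponential link matches the form mentioned in the abstract; the logistic alternative is only there to show that the fitted probability depends on the link choice and is not one of the authors' proposed links. All numbers are made up.

```python
import numpy as np

# Intercourse indicators for the days of the fertile window of one cycle
# (hypothetical data) and illustrative day-specific rate parameters.
acts = np.array([0, 1, 1, 0, 1, 0])          # days -4 .. +1 relative to ovulation
lam = np.array([0.05, 0.15, 0.30, 0.25, 0.10, 0.02])

eta = np.sum(lam * acts)                      # linear predictor for the cycle

def conception_prob(eta, link="exponential"):
    """Cycle-level conception probability under different link choices."""
    if link == "exponential":                 # P = 1 - exp(-eta)
        return 1.0 - np.exp(-eta)
    if link == "logistic":                    # an alternative, flatter link
        return 1.0 / (1.0 + np.exp(-(eta - 0.5) * 4.0))
    raise ValueError(link)

for link in ("exponential", "logistic"):
    print(f"{link:12s} link -> P(conception) = {conception_prob(eta, link):.3f}")
```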

  1. Bayesian programming

    CERN Document Server

    Bessiere, Pierre; Ahuactzin, Juan Manuel; Mekhnacha, Kamel

    2013-01-01

    Probability as an Alternative to Boolean Logic. While logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework to deal with incomplete and uncertain data. Decision-Making Tools and Methods for Incomplete and Uncertain Data. Emphasizing probability as an alternative to Boolean

  2. Bayesian framework for prediction of future number of failures from a single group of units in the field

    International Nuclear Information System (INIS)

    Ebrahimi, Nader

    2009-01-01

    This paper considers prediction of the unknown number of failures in a future inspection of a group of in-service units, based on the number of failures observed in an earlier inspection. We develop a flexible Bayesian model and calculate the Bayesian estimator for this unknown number and other quantities of interest. The paper also includes an illustration of our method in an example about a heat exchanger. A main advantage of our approach is its nonparametric nature. By nonparametric here we simply mean that no assumption is required about the failure time distribution of a unit

  3. Programming with Hierarchical Maps

    DEFF Research Database (Denmark)

    Ørbæk, Peter

    This report describes the hierarchical maps used as a central data structure in the Corundum framework. We describe its most prominent features, argue for its usefulness and briefly describe some of the software prototypes implemented using the technology....

  4. Hierarchical Bayesian models for robust estimation and censored data analysis in animal breeding

    Directory of Open Access Journals (Sweden)

    Fernando Flores Cardoso

    2009-07-01

    Full Text Available Extreme data influenced by factors not considered in the statistical model can bias estimates of genetic parameters and breeding values. Moreover, several traits of economic importance do not follow a normal distribution or have censored data. The objective of this study is to describe and illustrate the application of hierarchical Bayesian models for the detection and mitigation of outliers and for the analysis of censored data. First, the traditional specification of the animal model in hierarchical stages is presented under the Bayesian approach for normally distributed uncensored data. Then, this model is generalized by introducing an independent weighting variable, which allows for the specification of thick-tailed residual densities from the Normal/independent family of distributions. Finally, to accommodate the analysis of censored data, the basic model is extended by including a variable with a normal distribution truncated at the lower limit of the observed value of the trait at the time of evaluation, for those animals that had not yet completed their reproductive life at the time of evaluation.

  5. Using polarimetric radar observations and probabilistic inference to develop the Bayesian Observationally-constrained Statistical-physical Scheme (BOSS), a novel microphysical parameterization framework

    Science.gov (United States)

    van Lier-Walqui, M.; Morrison, H.; Kumjian, M. R.; Prat, O. P.

    2016-12-01

    Microphysical parameterization schemes have reached an impressive level of sophistication: numerous prognostic hydrometeor categories, and either size-resolved (bin) particle size distributions or multiple prognostic moments of the size distribution. Yet, uncertainty in the model representation of microphysical processes and of the effects of microphysics on numerical simulation of weather has not shown an improvement commensurate with the advanced sophistication of these schemes. We posit that this may be caused by unconstrained assumptions of these schemes, such as ad hoc parameter value choices and structural uncertainties (e.g. the choice of a particular form for the size distribution). We present work on the development and observational constraint of a novel microphysical parameterization approach, the Bayesian Observationally-constrained Statistical-physical Scheme (BOSS), which seeks to address these sources of uncertainty. Our framework avoids unnecessary a priori assumptions and instead relies on observations to provide probabilistic constraint of the scheme structure and sensitivities to environmental and microphysical conditions. We harness the rich microphysical information content of polarimetric radar observations to develop and constrain BOSS within a Bayesian inference framework using a Markov Chain Monte Carlo sampler (see Kumjian et al., this meeting, for details on the development of an associated polarimetric forward operator). Our work shows how knowledge of microphysical processes is provided by polarimetric radar observations of diverse weather conditions, and which processes remain highly uncertain, even after considering observations.

  6. Coupled Land-Atmosphere Dynamics Govern Long Duration Floods: A Pilot Study in Missouri River Basin Using a Bayesian Hierarchical Model

    Science.gov (United States)

    Najibi, N.; Lu, M.; Devineni, N.

    2017-12-01

    Long duration floods cause substantial damage and prolonged interruptions to water resource facilities and critical infrastructure. We present a novel generalized statistical and physically based model for flood duration, grounded in a deeper understanding of the dynamically coupled nexus of land surface wetness, effective atmospheric circulation and moisture transport/release. We applied the model to large reservoirs in the Missouri River Basin. The results indicate that flood duration is not only a function of the available moisture in the air, but also of the antecedent condition of the blocking system of atmospheric pressure, resulting in enhanced moisture convergence, as well as the effectiveness of the moisture condensation process leading to release. Quantifying these dynamics with a two-layer climate-informed Bayesian multilevel model, we explain more than 80% of the variation in flood duration. The model considers the complex interaction between moisture transport, synoptic-to-large-scale atmospheric circulation patterns, and the antecedent wetness condition in the basin. Our findings suggest that synergy between a large low-pressure blocking system and a higher rate of divergent wind often triggers a long duration flood, and that the moisture supply prerequisite for triggering such an event is moderate and more associated with magnitude than duration. In turn, this condition causes an extremely long duration flood if the surface wetness leading up to the flood event was already elevated.

  7. Hierarchical Bayesian analysis to incorporate age uncertainty in growth curve analysis and estimates of age from length: Florida manatee (Trichechus manatus) carcasses

    Science.gov (United States)

    Schwarz, L.K.; Runge, M.C.

    2009-01-01

    Age estimation of individuals is often an integral part of species management research, and a number of age-estimation techniques are commonly employed. Often, the error in these techniques is not quantified or accounted for in other analyses, particularly in growth curve models used to describe physiological responses to environment and human impacts. Also, noninvasive, quick, and inexpensive methods to estimate age are needed. This research aims to provide two Bayesian methods to (i) incorporate age uncertainty into an age-length Schnute growth model and (ii) produce a method from the growth model to estimate age from length. The methods are then employed for Florida manatee (Trichechus manatus) carcasses. After quantifying the uncertainty in the aging technique (counts of ear bone growth layers), we fit age-length data to the Schnute growth model separately by sex and season. Independent prior information about population age structure and the results of the Schnute model are then combined to estimate age from length. Results describing the age-length relationship agree with our understanding of manatee biology. The new methods allow us to estimate age, with quantified uncertainty, for 98% of collected carcasses: 36% from ear bones, 62% from length.
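
    Estimating age from length amounts to combining a growth-curve likelihood with a prior on the population age structure. The sketch below illustrates that step with a von Bertalanffy curve standing in for the fitted Schnute model; the parameter values, prior, and measurement error are illustrative assumptions, not the manatee estimates.

```python
import numpy as np
from scipy.stats import norm

# Stand-in growth curve (von Bertalanffy) used here instead of the fitted
# Schnute model; parameter values are illustrative only.
L_inf, k, t0, sigma_L = 320.0, 0.25, -1.0, 15.0   # cm, 1/yr, yr, cm

def mean_length(age):
    return L_inf * (1.0 - np.exp(-k * (age - t0)))

ages = np.arange(0, 60)                    # candidate ages in years
prior = np.exp(-ages / 15.0)               # illustrative declining age structure
prior /= prior.sum()

def age_posterior(observed_length):
    """Posterior over age given a carcass length, combining the growth-curve
    likelihood with the prior age structure."""
    like = norm.pdf(observed_length, loc=mean_length(ages), scale=sigma_L)
    post = prior * like
    return post / post.sum()

post = age_posterior(280.0)
mean_age = np.sum(ages * post)
ci = ages[np.searchsorted(np.cumsum(post), [0.025, 0.975])]
print(f"posterior mean age: {mean_age:.1f} yr, 95% interval: {ci[0]}-{ci[1]} yr")
```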

  8. Bayesian Inference: with ecological applications

    Science.gov (United States)

    Link, William A.; Barker, Richard J.

    2010-01-01

    This text provides a mathematically rigorous yet accessible and engaging introduction to Bayesian inference, with relevant examples that will be of interest to biologists working in the fields of ecology, wildlife management and environmental studies, as well as students in advanced undergraduate statistics. This text opens the door to Bayesian inference, taking advantage of modern computational efficiencies and easily accessible software to evaluate complex hierarchical models.

  9. Three-dimensional hierarchical frameworks based on MoS₂ nanosheets self-assembled on graphene oxide for efficient electrocatalytic hydrogen evolution.

    Science.gov (United States)

    Zhou, Weijia; Zhou, Kai; Hou, Dongman; Liu, Xiaojun; Li, Guoqiang; Sang, Yuanhua; Liu, Hong; Li, Ligui; Chen, Shaowei

    2014-12-10

    Advanced materials for electrocatalytic water splitting are central to renewable energy research. In this work, three-dimensional (3D) hierarchical frameworks based on the self-assembly of MoS2 nanosheets on graphene oxide were produced via a simple one-step hydrothermal process. The structures of the resulting 3D frameworks were characterized by using a variety of microscopic and spectroscopic tools, including scanning and transmission electron microscopies, X-ray diffraction, X-ray photoelectron spectroscopy, and Raman scattering. Importantly, the three-dimensional MoS2/graphene frameworks might be used directly as working electrodes which exhibited apparent and stable electrocatalytic activity in hydrogen evolution reaction (HER), as manifested by a large cathodic current density with a small overpotential of -107 mV (-121 mV when loaded on a glassy-carbon electrode) and a Tafel slope of 86.3 mV/dec (46.3 mV/dec when loaded on a glassy-carbon electrode). The remarkable performance might be ascribed to the good mechanical strength and high electrical conductivity of the 3D frameworks for fast charge transport and collection, where graphene oxide provided abundant nucleation sites for MoS2 deposition and oxygen incorporation led to the formation of defect-rich MoS2 nanosheets with active sites for HER.

  10. Bayesian modeling using WinBUGS

    CERN Document Server

    Ntzoufras, Ioannis

    2009-01-01

    A hands-on introduction to the principles of Bayesian modeling using WinBUGS Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author provides an accessible treatment of the topic, offering readers a smooth introduction to the principles of Bayesian modeling with detailed guidance on the practical implementation of key principles. The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including: Markov Chain Monte Carlo algorithms in Bayesian inference Generalized linear models Bayesian hierarchical models Predictive distribution and model checking Bayesian model and variable evaluation Computational notes and screen captures illustrate the use of both WinBUGS as well as R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts and all ...

  11. Bayesian inference with ecological applications

    CERN Document Server

    Link, William A

    2009-01-01

    This text is written to provide a mathematically sound but accessible and engaging introduction to Bayesian inference specifically for environmental scientists, ecologists and wildlife biologists. It emphasizes the power and usefulness of Bayesian methods in an ecological context. The advent of fast personal computers and easily available software has simplified the use of Bayesian and hierarchical models. One obstacle remains for ecologists and wildlife biologists, namely the near absence of Bayesian texts written specifically for them. The book includes many relevant examples, is supported by software and examples on a companion website and will become an essential grounding in this approach for students and research ecologists. Engagingly written text specifically designed to demystify a complex subject; examples drawn from ecology and wildlife research; an essential grounding for graduate and research ecologists in the increasingly prevalent Bayesian approach to inference; companion website with analyt...

  12. Diagnostic accuracy of clinical illness for bovine respiratory disease (BRD) diagnosis in beef cattle placed in feedlots: A systematic literature review and hierarchical Bayesian latent-class meta-analysis.

    Science.gov (United States)

    Timsit, E; Dendukuri, N; Schiller, I; Buczinski, S

    2016-12-01

    Diagnosis of bovine respiratory disease (BRD) in beef cattle placed in feedlots is typically based on clinical illness (CI) detected by pen-checkers. Unfortunately, the accuracy of this diagnostic approach (namely, sensitivity [Se] and specificity [Sp]) remains poorly understood, in part due to the absence of a reference test for ante-mortem diagnosis of BRD. Our objective was to pool available estimates of CI's diagnostic accuracy for BRD diagnosis in feedlot beef cattle while adjusting for the inaccuracy in the reference test. The presence of lung lesions (LU) at slaughter was used as the reference test. A systematic review of the literature was conducted to identify research articles comparing CI detected by pen-checkers during the feeding period to LU at slaughter. A hierarchical Bayesian latent-class meta-analysis was used to model test accuracy. This approach accounted for the imperfections of both tests as well as the within- and between-study variability in the accuracy of CI. Furthermore, it also predicted the Se_CI and Sp_CI for future studies. Conditional independence between CI and LU was assumed, as these two tests are not based on similar biological principles. Seven studies were included in the meta-analysis. Estimated pooled Se_CI and Sp_CI were 0.27 (95% Bayesian credible interval: 0.12-0.65) and 0.92 (0.72-0.98), respectively, whereas estimated pooled Se_LU and Sp_LU were 0.91 (0.82-0.99) and 0.67 (0.64-0.79). Predicted Se_CI and Sp_CI for future studies were 0.27 (0.01-0.96) and 0.92 (0.14-1.00), respectively. The wide credible intervals around the predicted Se_CI and Sp_CI estimates indicated considerable heterogeneity among studies, which suggests that the pooled Se_CI and Sp_CI are not generalizable to individual studies. In conclusion, CI appeared to have poor Se but high Sp for BRD diagnosis in feedlots. Furthermore, the considerable heterogeneity among studies highlights an urgent need to standardize BRD diagnosis in feedlots.
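
    Under the conditional-independence assumption stated in the abstract, the likelihood contribution of one study comes from the four cross-classified cell probabilities of the two imperfect tests. A minimal sketch follows, using the pooled point estimates quoted above as plug-in values and hypothetical study counts.

```python
import numpy as np
from scipy.special import gammaln

def cell_probs(prev, se_ci, sp_ci, se_lu, sp_lu):
    """Probabilities of the four cross-classified outcomes (CI+/-, LU+/-) for one
    study, assuming the two tests are conditionally independent given true BRD status."""
    p = np.empty(4)
    p[0] = prev * se_ci * se_lu             + (1 - prev) * (1 - sp_ci) * (1 - sp_lu)  # CI+ LU+
    p[1] = prev * se_ci * (1 - se_lu)       + (1 - prev) * (1 - sp_ci) * sp_lu        # CI+ LU-
    p[2] = prev * (1 - se_ci) * se_lu       + (1 - prev) * sp_ci * (1 - sp_lu)        # CI- LU+
    p[3] = prev * (1 - se_ci) * (1 - se_lu) + (1 - prev) * sp_ci * sp_lu              # CI- LU-
    return p

def multinomial_loglik(counts, probs):
    counts = np.asarray(counts, dtype=float)
    return (gammaln(counts.sum() + 1) - gammaln(counts + 1).sum()
            + np.sum(counts * np.log(probs)))

# Hypothetical counts for one study: [CI+LU+, CI+LU-, CI-LU+, CI-LU-]
counts = [120, 40, 310, 530]
print(multinomial_loglik(counts, cell_probs(prev=0.35, se_ci=0.27, sp_ci=0.92,
                                            se_lu=0.91, sp_lu=0.67)))
```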

  13. Bayesian phylogeography of influenza A/H3N2 for the 2014-15 season in the United States using three frameworks of ancestral state reconstruction.

    Science.gov (United States)

    Magee, Daniel; Suchard, Marc A; Scotch, Matthew

    2017-02-01

    Ancestral state reconstructions in Bayesian phylogeography of virus pandemics have been improved by utilizing a Bayesian stochastic search variable selection (BSSVS) framework. Recently, this framework has been extended to model the transition rate matrix between discrete states as a generalized linear model (GLM) of genetic, geographic, demographic, and environmental predictors of interest to the virus and incorporating BSSVS to estimate the posterior inclusion probabilities of each predictor. Although the latter appears to enhance the biological validity of ancestral state reconstruction, there has yet to be a comparison of phylogenies created by the two methods. In this paper, we compare these two methods, while also using a primitive method without BSSVS, and highlight the differences in phylogenies created by each. We test six coalescent priors and six random sequence samples of H3N2 influenza during the 2014-15 flu season in the U.S. We show that the GLMs yield significantly greater root state posterior probabilities than the two alternative methods under five of the six priors, and significantly greater Kullback-Leibler divergence values than the two alternative methods under all priors. Furthermore, the GLMs strongly implicate temperature and precipitation as driving forces of this flu season and nearly unanimously identified a single root state, which exhibits the most tropical climate during a typical flu season in the U.S. The GLM, however, appears to be highly susceptible to sampling bias compared with the other methods, which casts doubt on whether its reconstructions should be favored over those created by alternate methods. We report that a BSSVS approach with a Poisson prior demonstrates less bias toward sample size under certain conditions than the GLMs or primitive models, and believe that the connection between reconstruction method and sampling bias warrants further investigation.

  14. Bayesian phylogeography of influenza A/H3N2 for the 2014-15 season in the United States using three frameworks of ancestral state reconstruction.

    Directory of Open Access Journals (Sweden)

    Daniel Magee

    2017-02-01

    Full Text Available Ancestral state reconstructions in Bayesian phylogeography of virus pandemics have been improved by utilizing a Bayesian stochastic search variable selection (BSSVS) framework. Recently, this framework has been extended to model the transition rate matrix between discrete states as a generalized linear model (GLM) of genetic, geographic, demographic, and environmental predictors of interest to the virus and incorporating BSSVS to estimate the posterior inclusion probabilities of each predictor. Although the latter appears to enhance the biological validity of ancestral state reconstruction, there has yet to be a comparison of phylogenies created by the two methods. In this paper, we compare these two methods, while also using a primitive method without BSSVS, and highlight the differences in phylogenies created by each. We test six coalescent priors and six random sequence samples of H3N2 influenza during the 2014-15 flu season in the U.S. We show that the GLMs yield significantly greater root state posterior probabilities than the two alternative methods under five of the six priors, and significantly greater Kullback-Leibler divergence values than the two alternative methods under all priors. Furthermore, the GLMs strongly implicate temperature and precipitation as driving forces of this flu season and nearly unanimously identified a single root state, which exhibits the most tropical climate during a typical flu season in the U.S. The GLM, however, appears to be highly susceptible to sampling bias compared with the other methods, which casts doubt on whether its reconstructions should be favored over those created by alternate methods. We report that a BSSVS approach with a Poisson prior demonstrates less bias toward sample size under certain conditions than the GLMs or primitive models, and believe that the connection between reconstruction method and sampling bias warrants further investigation.

  15. Hierarchical (Ni,Co)Se2/Carbon Hollow Rhombic Dodecahedra Derived from Metal-Organic Frameworks for Efficient Water-Splitting Electrocatalysis

    KAUST Repository

    Ming, Fangwang

    2017-08-12

    In this work, we demonstrate that the electrocatalytic activity of transition metal chalcogenides can be greatly enhanced by simultaneously engineering the active sites, surface area, and conductivity. Using metal-organic framework-derived (Ni,Co)Se2/C hollow rhombic dodecahedra (HRD) as a demonstration, we show that the incorporation of Ni into CoSe2 could generate additional active sites, that the hierarchical hollow structure promotes electrolyte diffusion, and that the in situ hybridization with C improves the conductivity. As a result, the (Ni,Co)Se2/C HRD exhibit superior performance toward overall water-splitting electrocatalysis in 1 M KOH, with a cell voltage as low as 1.58 V at a current density of 10 mA cm−2, making the (Ni,Co)Se2/C HRD a promising alternative to noble metal catalysts for water splitting.

  16. Fatigue Damage Prognosis in FRP Composites by Combining Multi-Scale Degradation Fault Modes in an Uncertainty Bayesian Framework

    Data.gov (United States)

    National Aeronautics and Space Administration — In this work, a framework for the estimation of the fatigue damage propagation in CFRP composites is proposed. Macro-scale phenomena such as stiffness and strength...

  17. Bayesian Data Analysis (lecture 1)

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    framework but we will also go into more detail and discuss for example the role of the prior. The second part of the lecture will cover further examples and applications that heavily rely on the Bayesian approach, as well as some computational tools needed to perform a Bayesian analysis.

  18. Bayesian Data Analysis (lecture 2)

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    framework but we will also go into more detail and discuss for example the role of the prior. The second part of the lecture will cover further examples and applications that heavily rely on the Bayesian approach, as well as some computational tools needed to perform a Bayesian analysis.

  19. On how to avoid input and structural uncertainties corrupt the inference of hydrological parameters using a Bayesian framework

    Science.gov (United States)

    Hernández, Mario R.; Francés, Félix

    2015-04-01

    One phase of the hydrological model implementation process that contributes significantly to the uncertainty of hydrological predictions is the calibration phase, in which values of the unknown model parameters are tuned by optimizing an objective function. An unsuitable error model (e.g. Standard Least Squares, SLS) introduces noise into the estimation of the parameters. The main sources of this noise are input errors and structural deficiencies of the hydrological model. The biased calibrated parameters thus cause the model divergence phenomenon, where the error variance of the (spatially and temporally) forecasted flows far exceeds the error variance in the fitting period, and provoke the loss of part or all of the physical meaning of the modeled processes. In other words, they yield a calibrated hydrological model which works well, but not for the right reasons. Besides, an unsuitable error model yields a non-reliable predictive uncertainty assessment. Hence, with the aim of preventing all these undesirable effects, this research focuses on the Bayesian joint inference (BJI) of both the hydrological and error model parameters, considering a general additive (GA) error model that allows for correlation, non-stationarity (in variance and bias) and non-normality of model residuals. As the hydrological model, we used a conceptual distributed model called TETIS, with a particular split structure of the effective model parameters. Bayesian inference has been performed with the aid of a Markov Chain Monte Carlo (MCMC) algorithm called Dream-ZS. The MCMC algorithm quantifies the uncertainty of the hydrological and error model parameters by obtaining the joint posterior probability distribution, conditioned on the observed flows. The BJI methodology is a very powerful and reliable tool, but it must be used correctly; that is, if non-stationarity in error variance and bias is modeled, the Total Laws must be taken into account. The results of this research show that the
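
    A general additive error model of the kind described can be written down as a log-likelihood with a bias term, flow-dependent variance, and AR(1)-correlated standardized residuals. The sketch below is one simple illustrative form, not the exact error model used with TETIS and DREAM-ZS; the hydrographs and parameter values are synthetic.

```python
import numpy as np

def general_additive_loglik(obs, sim, sigma_a, sigma_b, mu_bias, phi):
    """Log-likelihood of observed flows under a simple general additive error model:
    a constant bias, heteroscedastic standard deviation that grows with the simulated
    flow, and AR(1)-correlated standardized residuals (illustrative form only)."""
    raw = obs - sim - mu_bias                      # biased raw residuals
    sigma = sigma_a + sigma_b * sim                # non-stationary standard deviation
    eta = raw / sigma                              # standardized residuals
    # AR(1) decorrelation: innovations e_t = eta_t - phi * eta_{t-1}
    innov = eta[1:] - phi * eta[:-1]
    innov_sd = np.sqrt(1.0 - phi**2)
    ll = -0.5 * eta[0] ** 2 - np.log(sigma[0])                 # first time step
    ll += np.sum(-0.5 * (innov / innov_sd) ** 2
                 - np.log(sigma[1:] * innov_sd))               # remaining steps
    return ll - 0.5 * eta.size * np.log(2.0 * np.pi)

# Hypothetical simulated and observed hydrographs.
t = np.arange(200)
sim = 5.0 + 4.0 * np.exp(-((t - 80) / 25.0) ** 2)
rng = np.random.default_rng(3)
obs = sim + rng.normal(scale=0.2 + 0.05 * sim)

print(general_additive_loglik(obs, sim, sigma_a=0.2, sigma_b=0.05, mu_bias=0.0, phi=0.4))
```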

  20. Bayesian Model Averaging for Propensity Score Analysis

    Science.gov (United States)

    Kaplan, David; Chen, Jianshen

    2013-01-01

    The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…

  1. Bayesian Exploratory Factor Analysis

    DEFF Research Database (Denmark)

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...

  2. Application of a hierarchical framework for assessing environmental impacts of dam operation: changes in hydrology, channel hydraulics, bed mobility and recruitment of riparian trees in a western North American river

    Science.gov (United States)

    Michael Burke; Klaus Jorde; John M. Buffington

    2009-01-01

    River systems have been altered worldwide by dams and diversions, resulting in a broad array of environmental impacts. The use of a process-based, hierarchical framework for assessing environmental impacts of dams is explored here in terms of a case study of the Kootenai River, western North America. The goal of the case study is to isolate and quantify the relative...

  3. Bayesian models: A statistical primer for ecologists

    Science.gov (United States)

    Hobbs, N. Thompson; Hooten, Mevin B.

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods, in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability and develops a step-by-step sequence of connected ideas, including basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and inference from single and multiple models. This unique book places less emphasis on computer coding, favoring instead a concise presentation of the mathematical statistics needed to understand how and why Bayesian analysis works. It also explains how to write out properly formulated hierarchical Bayesian models and use them in computing, research papers, and proposals. This primer enables ecologists to understand the statistical principles behind Bayesian modeling and apply them to research, teaching, policy, and management. Presents the mathematical and statistical foundations of Bayesian modeling in language accessible to non-statisticians; covers basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and more; deemphasizes computer coding in favor of basic principles; explains how to write out properly factored statistical expressions representing Bayesian models

  4. Evaluating nonindigenous species management in a Bayesian networks derived relative risk framework for Padilla Bay, WA, USA.

    Science.gov (United States)

    Herring, Carlie E; Stinson, Jonah; Landis, Wayne G

    2015-10-01

    Many coastal regions are encountering issues with the spread of nonindigenous species (NIS). In this study, we conducted a regional risk assessment using a Bayesian network relative risk model (BN-RRM) to analyze multiple vectors of NIS introductions to Padilla Bay, Washington, a National Estuarine Research Reserve. We had 3 objectives in this study. The 1st objective was to determine whether the BN-RRM could be used to calculate risk from NIS introductions for Padilla Bay. Our 2nd objective was to determine which regions and endpoints were at greatest risk from NIS introductions. Our 3rd objective was to incorporate a management option into the model and predict endpoint risk if it were to be implemented. Eradication can occur at different stages of NIS invasions, such as the elimination of these species before being introduced to the habitat or removal of the species after settlement. We incorporated the ballast water treatment management scenario into the model, observed the risk to the endpoints, and compared this risk with the initial risk estimates. The model results indicated that the southern portion of the bay was at greatest risk because of NIS. Changes in community composition, Dungeness crab, and eelgrass were the endpoints most at risk from NIS introductions. The currents node, which controls the exposure of NIS to the bay from the surrounding marine environment, was the parameter that had the greatest influence on risk. The ballast water management scenario displayed an approximate 1% reduction in risk in this Padilla Bay case study. The models we developed provide an adaptable template for decision makers interested in managing NIS in other coastal regions and large bodies of water. © 2015 SETAC.

  5. Bayesian Independent Component Analysis

    DEFF Research Database (Denmark)

    Winther, Ole; Petersen, Kaare Brandt

    2007-01-01

    In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to the choice of source prior and the number of sources and sensors. Inside the engine ... in a Matlab toolbox, is demonstrated for non-negative decompositions and compared with non-negative matrix factorization.

  6. Bayesian methods for proteomic biomarker development

    Directory of Open Access Journals (Sweden)

    Belinda Hernández

    2015-12-01

    In this review we provide an introduction to Bayesian inference and demonstrate some of the advantages of using a Bayesian framework. We summarize how Bayesian methods have been used previously in proteomics and other areas of bioinformatics. Finally, we describe some popular and emerging Bayesian models from the statistical literature and provide a worked tutorial including code snippets to show how these methods may be applied for the evaluation of proteomic biomarkers.

  7. Computational framework for risk-based planning of inspections, maintenance, and condition monitoring using discrete Bayesian networks

    DEFF Research Database (Denmark)

    Nielsen, Jannie Sønderkær; Sørensen, John Dalsgaard

    2018-01-01

    This paper presents a computational framework for risk-based planning of inspections and repairs for deteriorating components. Two distinct types of decision rules are used to model decisions: simple decision rules that depend on constants or observed variables (e.g. inspection outcomes) ... expected life-cycle costs. For advanced decision rules, simulations are performed to estimate the expected costs, and dBNs are used within the simulations for decision-making. Information from inspections and condition monitoring is included if available. An example in the paper demonstrates the framework and the implemented strategies and decision rules, including various types of condition-based maintenance. The strategies using advanced decision rules lead to reduced costs compared to the simple decision rules when condition monitoring is applied, and the value of condition monitoring...
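
    The simple-decision-rule branch of such a framework can be approximated by straight Monte Carlo simulation of life-cycle costs, as in the sketch below; the deterioration process, inspection noise, and cost figures are illustrative assumptions, and the dBN-based advanced decision rules from the paper are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_lifecycle_cost(inspection_interval, repair_threshold,
                            years=25, n_sim=5000):
    """Expected life-cycle cost of a component under a simple decision rule:
    inspect every `inspection_interval` years and repair (reset deterioration)
    when the measured damage exceeds `repair_threshold`. Costs, deterioration
    rates and inspection noise are illustrative assumptions."""
    c_insp, c_rep, c_fail, fail_level = 1.0, 10.0, 500.0, 1.0
    total = np.zeros(n_sim)
    damage = np.zeros(n_sim)
    for year in range(1, years + 1):
        damage += rng.gamma(shape=1.5, scale=0.04, size=n_sim)    # random deterioration
        failed = damage >= fail_level
        total += np.where(failed, c_fail, 0.0)
        damage[failed] = 0.0                                      # failed components replaced
        if year % inspection_interval == 0:
            total += c_insp
            measured = damage + rng.normal(scale=0.05, size=n_sim)  # imperfect inspection
            repair = measured > repair_threshold
            total += np.where(repair, c_rep, 0.0)
            damage[repair] = 0.0
    return total.mean()

for interval in (2, 5, 10):
    print(f"inspect every {interval:2d} yr:",
          round(simulate_lifecycle_cost(interval, repair_threshold=0.6), 1))
```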

  8. DISSECTING MAGNETAR VARIABILITY WITH BAYESIAN HIERARCHICAL MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Huppenkothen, Daniela; Elenbaas, Chris; Watts, Anna L.; Horst, Alexander J. van der [Anton Pannekoek Institute for Astronomy, University of Amsterdam, Postbus 94249, 1090 GE Amsterdam (Netherlands); Brewer, Brendon J. [Department of Statistics, The University of Auckland, Private Bag 92019, Auckland 1142 (New Zealand); Hogg, David W. [Center for Data Science, New York University, 726 Broadway, 7th Floor, New York, NY 10003 (United States); Murray, Iain [School of Informatics, University of Edinburgh, Edinburgh EH8 9AB (United Kingdom); Frean, Marcus [School of Engineering and Computer Science, Victoria University of Wellington (New Zealand); Levin, Yuri [Monash Center for Astrophysics and School of Physics, Monash University, Clayton, Victoria 3800 (Australia); Kouveliotou, Chryssa, E-mail: daniela.huppenkothen@nyu.edu [Astrophysics Office, ZP 12, NASA/Marshall Space Flight Center, Huntsville, AL 35812 (United States)

    2015-09-01

    Neutron stars are a prime laboratory for testing physical processes under conditions of strong gravity, high density, and extreme magnetic fields. Among the zoo of neutron star phenomena, magnetars stand out for their bursting behavior, ranging from extremely bright, rare giant flares to numerous, less energetic recurrent bursts. The exact trigger and emission mechanisms for these bursts are not known; favored models involve either a crust fracture and subsequent energy release into the magnetosphere, or explosive reconnection of magnetic field lines. In the absence of a predictive model, understanding the physical processes responsible for magnetar burst variability is difficult. Here, we develop an empirical model that decomposes magnetar bursts into a superposition of small spike-like features with a simple functional form, where the number of model components is itself part of the inference problem. The cascades of spikes that we model might be formed by avalanches of reconnection, or crust rupture aftershocks. Using Markov Chain Monte Carlo sampling augmented with reversible jumps between models with different numbers of parameters, we characterize the posterior distributions of the model parameters and the number of components per burst. We relate these model parameters to physical quantities in the system, and show for the first time that the variability within a burst does not conform to predictions from ideas of self-organized criticality. We also examine how well the properties of the spikes fit the predictions of simplified cascade models for the different trigger mechanisms.
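
    The generative idea, a burst as a superposition of spike-shaped components plus background with the number of components unknown, can be sketched as follows. The spike shape used here (exponential rise and decay) is one plausible choice and may differ from the exact functional form in the paper; the reversible-jump sampling over the number of components is not shown.

```python
import numpy as np

def spike(t, t0, amplitude, rise, decay):
    """Simple spike shape: exponential rise before t0, exponential decay after.
    One plausible choice; the exact functional form in the paper may differ."""
    return amplitude * np.where(t < t0,
                                np.exp((t - t0) / rise),
                                np.exp(-(t - t0) / decay))

def burst_model(t, background, components):
    """Superposition of spike components on a constant background; the number of
    components (the length of the list) is itself unknown in the inference."""
    flux = np.full_like(t, background)
    for t0, amp, rise, decay in components:
        flux += spike(t, t0, amp, rise, decay)
    return flux

t = np.linspace(0.0, 0.5, 2000)                     # seconds
components = [(0.10, 300.0, 0.004, 0.020),          # (t0, amplitude, rise, decay)
              (0.16, 150.0, 0.002, 0.010),
              (0.30, 420.0, 0.006, 0.035)]
rate = burst_model(t, background=20.0, components=components)   # counts per second
dt = t[1] - t[0]
counts = np.random.default_rng(5).poisson(rate * dt)            # toy Poisson data
print("peak model rate:", round(rate.max(), 1), "counts/s;",
      "total simulated counts:", counts.sum())
```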

  9. Bayesian Exploratory Factor Analysis

    Science.gov (United States)

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates from a high dimensional set of psychological measurements. PMID:25431517

  10. Time-Varying Identification Model for Crack Monitoring Data from Concrete Dams Based on Support Vector Regression and the Bayesian Framework

    Directory of Open Access Journals (Sweden)

    Bo Chen

    2017-01-01

    Full Text Available The modeling of cracks and the identification of dam behavior changes are difficult issues in dam health monitoring research. In this paper, a time-varying identification model for crack monitoring data is built using support vector regression (SVR) and the Bayesian evidence framework (BEF). First, the SVR method is adopted for better modeling of the nonlinear relationship between the crack opening displacement (COD) and its influencing factors. Second, the BEF approach is applied to determine the optimal SVR modeling parameters, including the penalty coefficient, the loss coefficient, and the width coefficient of the radial kernel function, under the principle that the prediction errors between the monitored and the model-forecasted values are as small as possible. Then, considering the predicted COD, the historical maximum COD, and the time-dependent component, forewarning criteria are proposed for identifying the time-varying behavior of cracks and the degree of abnormality of dam health. Finally, an example of modeling and forewarning analysis is presented using two monitoring subsequences from a real structural crack in the Chencun concrete arch-gravity dam. The findings indicate that the proposed time-varying model provides predictions that fit the nonlinearity more accurately and is suitable for use in evaluating the behavior of cracks in dams.

  11. Under which conditions, additional monitoring data are worth gathering for improving decision making? Application of the VOI theory in the Bayesian Event Tree eruption forecasting framework

    Science.gov (United States)

    Loschetter, Annick; Rohmer, Jérémy

    2016-04-01

    Standard and new-generation monitoring observations provide, in almost real time, important information about the evolution of a volcanic system. These observations are used to update models, contribute to better hazard assessment, and support decision making concerning potential evacuation. The BET_EF framework (based on a Bayesian Event Tree), developed by INGV, enables the integration of monitoring information with a view to decision making. Using this framework, the objectives of the present work are (i) to propose a method to assess the added value of information from monitoring, within the Value Of Information (VOI) theory, and (ii) to perform sensitivity analysis on the different parameters that influence the VOI from monitoring. VOI consists of assessing the possible increase in expected value provided by gathering information, for instance through monitoring. Basically, the VOI is the difference between the value with information and the value without additional information in a cost-benefit approach. This theory is well suited to situations that can be represented in the form of a decision tree, such as the BET_EF tool. Reference values and ranges of variation (for sensitivity analysis) were defined for input parameters, based on data from the MESIMEX exercise (performed at Vesuvio volcano in 2006). Complementary methods for sensitivity analysis were implemented: local; global, using Sobol' indices; and regional, using Contribution to Sample Mean and Variance plots. The results (specific to the case considered) obtained with the different techniques are in good agreement and enable the following questions to be answered: (i) which characteristics of monitoring are important for early warning (reliability)? (ii) how do experts' opinions influence the hazard assessment and thus the decision? Concerning the characteristics of monitoring, the most influential parameters are the means rather than the variances for the case considered.
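
    The "value with information minus value without" calculation can be made concrete with a toy evacuate-or-stay decision; all probabilities, costs and monitoring reliabilities below are invented for illustration and are unrelated to the MESIMEX values.

        # Toy value-of-information (VOI) calculation for an evacuate / stay decision.
        p_eruption = 0.10                       # prior probability of eruption (illustrative)
        cost = {("stay", True): -1000.0,        # stay and the eruption happens
                ("stay", False): 0.0,
                ("evacuate", True): -50.0,      # evacuation cost, eruption or not
                ("evacuate", False): -50.0}

        def expected_value(action, p):
            return p * cost[(action, True)] + (1 - p) * cost[(action, False)]

        # Value without additional information: best single action under the prior.
        v_without = max(expected_value(a, p_eruption) for a in ("stay", "evacuate"))

        # Imperfect monitoring signal with assumed hit and false-alarm rates.
        hit, false_alarm = 0.9, 0.2
        p_alarm = hit * p_eruption + false_alarm * (1 - p_eruption)
        p_erupt_alarm = hit * p_eruption / p_alarm
        p_erupt_quiet = (1 - hit) * p_eruption / (1 - p_alarm)

        # Value with information: choose the best action after each possible signal.
        v_with = (p_alarm * max(expected_value(a, p_erupt_alarm) for a in ("stay", "evacuate"))
                  + (1 - p_alarm) * max(expected_value(a, p_erupt_quiet) for a in ("stay", "evacuate")))
        print("VOI =", v_with - v_without)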

  12. A Bayesian foundation for individual learning under uncertainty

    Directory of Open Access Journals (Sweden)

    Christoph eMathys

    2011-05-01

    Computational learning models are critical for understanding mechanisms of adaptive behavior. However, the two major current frameworks, reinforcement learning (RL) and Bayesian learning, both have certain limitations. For example, many Bayesian models are agnostic of inter-individual variability and involve complicated integrals, making online learning difficult. Here, we introduce a generic hierarchical Bayesian framework for individual learning under multiple forms of uncertainty (e.g., environmental volatility and perceptual uncertainty). The model assumes Gaussian random walks of states at all but the first level, with the step size determined by the next higher level. The coupling between levels is controlled by parameters that shape the influence of uncertainty on learning in a subject-specific fashion. Using variational Bayes under a mean field approximation and a novel approximation to the posterior energy function, we derive trial-by-trial update equations which (i) are analytical and extremely efficient, enabling real-time learning, (ii) have a natural interpretation in terms of RL, and (iii) contain parameters representing processes which play a key role in current theories of learning, e.g., precision-weighting of prediction error. These parameters allow for the expression of individual differences in learning and may relate to specific neuromodulatory mechanisms in the brain. Our model is very general: it can deal with both discrete and continuous states and equally accounts for deterministic and probabilistic relations between environmental events and perceptual states (i.e., situations with and without perceptual uncertainty). These properties are illustrated by simulations and analyses of empirical time series. Overall, our framework provides a novel foundation for understanding normal and pathological learning that contextualizes RL within a generic Bayesian scheme and thus connects it to principles of optimality from probability theory.

  13. A bayesian foundation for individual learning under uncertainty.

    Science.gov (United States)

    Mathys, Christoph; Daunizeau, Jean; Friston, Karl J; Stephan, Klaas E

    2011-01-01

    Computational learning models are critical for understanding mechanisms of adaptive behavior. However, the two major current frameworks, reinforcement learning (RL) and Bayesian learning, both have certain limitations. For example, many Bayesian models are agnostic of inter-individual variability and involve complicated integrals, making online learning difficult. Here, we introduce a generic hierarchical Bayesian framework for individual learning under multiple forms of uncertainty (e.g., environmental volatility and perceptual uncertainty). The model assumes Gaussian random walks of states at all but the first level, with the step size determined by the next highest level. The coupling between levels is controlled by parameters that shape the influence of uncertainty on learning in a subject-specific fashion. Using variational Bayes under a mean-field approximation and a novel approximation to the posterior energy function, we derive trial-by-trial update equations which (i) are analytical and extremely efficient, enabling real-time learning, (ii) have a natural interpretation in terms of RL, and (iii) contain parameters representing processes which play a key role in current theories of learning, e.g., precision-weighting of prediction error. These parameters allow for the expression of individual differences in learning and may relate to specific neuromodulatory mechanisms in the brain. Our model is very general: it can deal with both discrete and continuous states and equally accounts for deterministic and probabilistic relations between environmental events and perceptual states (i.e., situations with and without perceptual uncertainty). These properties are illustrated by simulations and analyses of empirical time series. Overall, our framework provides a novel foundation for understanding normal and pathological learning that contextualizes RL within a generic Bayesian scheme and thus connects it to principles of optimality from probability theory.
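
    The flavour of such trial-by-trial updates can be conveyed with a one-level caricature (a hand-written sketch, not the multi-level update equations derived in the paper): the belief moves by a precision-weighted prediction error, and the relative precisions set the learning rate.

        # Caricature of a precision-weighted prediction-error update (single level only).
        # mu: current belief, pi: its precision; pi_obs: precision of each observation.
        def update(mu, pi, x, pi_obs):
            prediction_error = x - mu
            pi_new = pi + pi_obs                   # precisions add
            learning_rate = pi_obs / pi_new        # relative precision sets the step size
            return mu + learning_rate * prediction_error, pi_new

        mu, pi = 0.0, 1.0
        for x in [0.8, 1.1, 0.9, 1.3]:             # stream of noisy observations
            mu, pi = update(mu, pi, x, pi_obs=2.0)
        print(mu, pi)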

  14. A Bayesian approach to degradation-based burn-in optimization for display products exhibiting two-phase degradation patterns

    International Nuclear Information System (INIS)

    Yuan, Tao; Bae, Suk Joo; Zhu, Xiaoyan

    2016-01-01

    Motivated by the two-phase degradation phenomena observed in light displays (e.g., plasma display panels (PDPs), organic light emitting diodes (OLEDs)), this study proposes a new degradation-based burn-in testing plan for display products exhibiting two-phase degradation patterns. The primary focus of the burn-in test in this study is to eliminate the initial rapid degradation phase, while the major purpose of traditional burn-in tests is to detect and eliminate early failures from weak units. A hierarchical Bayesian bi-exponential model is used to capture two-phase degradation patterns of the burn-in population. Mission reliability and total cost are introduced as planning criteria. The proposed burn-in approach accounts for unit-to-unit variability within the burn-in population, and for uncertainty concerning the model parameters, within the hierarchical Bayesian framework. Available pre-burn-in data are conveniently incorporated into the burn-in decision-making procedure. A practical example of PDP degradation data is used to illustrate the proposed methodology. The proposed method is compared to other approaches such as the maximum likelihood method or change-point regression. - Highlights: • We propose a degradation-based burn-in test for products with two-phase degradation. • Mission reliability and total cost are used as planning criteria. • The proposed burn-in approach is built within the hierarchical Bayesian framework. • A practical example was used to illustrate the proposed methodology.

  15. Bayesian biostatistics

    CERN Document Server

    Lesaffre, Emmanuel

    2012-01-01

    The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introd

  16. A Bayesian framework to account for complex non-genetic factors in gene expression levels greatly increases power in eQTL studies.

    Directory of Open Access Journals (Sweden)

    Oliver Stegle

    2010-05-01

    Gene expression measurements are influenced by a wide range of factors, such as the state of the cell, experimental conditions and variants in the sequence of regulatory regions. To understand the effect of a variable of interest, such as the genotype of a locus, it is important to account for variation that is due to confounding causes. Here, we present VBQTL, a probabilistic approach for mapping expression quantitative trait loci (eQTLs) that jointly models contributions from genotype as well as known and hidden confounding factors. VBQTL is implemented within an efficient and flexible inference framework, making it fast and tractable on large-scale problems. We compare the performance of VBQTL with alternative methods for dealing with confounding variability on eQTL mapping datasets from simulations, yeast, mouse, and human. Employing Bayesian complexity control and joint modelling is shown to result in more precise estimates of the contribution of different confounding factors, resulting in additional associations to measured transcript levels compared to alternative approaches. We present a threefold larger collection of cis eQTLs than previously found in a whole-genome eQTL scan of an outbred human population. Altogether, 27% of the tested probes show a significant genetic association in cis, and we validate that the additional eQTLs are likely to be real by replicating them in different sets of individuals. Our method is the next step in the analysis of high-dimensional phenotype data, and its application has revealed insights into genetic regulation of gene expression by demonstrating more abundant cis-acting eQTLs in human than previously shown. Our software is freely available online at http://www.sanger.ac.uk/resources/software/peer/.

  17. AlignerBoost: A Generalized Software Toolkit for Boosting Next-Gen Sequencing Mapping Accuracy Using a Bayesian-Based Mapping Quality Framework.

    Directory of Open Access Journals (Sweden)

    Qi Zheng

    2016-10-01

    Accurate mapping of next-generation sequencing (NGS) reads to reference genomes is crucial for almost all NGS applications and downstream analyses. Various repetitive elements in human and other higher eukaryotic genomes contribute in large part to ambiguously (non-uniquely) mapped reads. Most available NGS aligners attempt to address this by either removing all non-uniquely mapping reads, or reporting one random or "best" hit based on simple heuristics. Accurate estimation of the mapping quality of NGS reads is therefore critical albeit completely lacking at present. Here we developed a generalized software toolkit "AlignerBoost", which utilizes a Bayesian-based framework to accurately estimate mapping quality of ambiguously mapped NGS reads. We tested AlignerBoost with both simulated and real DNA-seq and RNA-seq datasets at various thresholds. In most cases, but especially for reads falling within repetitive regions, AlignerBoost dramatically increases the mapping precision of modern NGS aligners without significantly compromising the sensitivity even without mapping quality filters. When using higher mapping quality cutoffs, AlignerBoost achieves a much lower false mapping rate while exhibiting comparable or higher sensitivity compared to the aligner default modes, therefore significantly boosting the detection power of NGS aligners even using extreme thresholds. AlignerBoost is also SNP-aware, and higher quality alignments can be achieved if provided with known SNPs. AlignerBoost's algorithm is computationally efficient, and can process one million alignments within 30 seconds on a typical desktop computer. AlignerBoost is implemented as a uniform Java application and is freely available at https://github.com/Grice-Lab/AlignerBoost.

  18. AlignerBoost: A Generalized Software Toolkit for Boosting Next-Gen Sequencing Mapping Accuracy Using a Bayesian-Based Mapping Quality Framework.

    Science.gov (United States)

    Zheng, Qi; Grice, Elizabeth A

    2016-10-01

    Accurate mapping of next-generation sequencing (NGS) reads to reference genomes is crucial for almost all NGS applications and downstream analyses. Various repetitive elements in human and other higher eukaryotic genomes contribute in large part to ambiguously (non-uniquely) mapped reads. Most available NGS aligners attempt to address this by either removing all non-uniquely mapping reads, or reporting one random or "best" hit based on simple heuristics. Accurate estimation of the mapping quality of NGS reads is therefore critical albeit completely lacking at present. Here we developed a generalized software toolkit "AlignerBoost", which utilizes a Bayesian-based framework to accurately estimate mapping quality of ambiguously mapped NGS reads. We tested AlignerBoost with both simulated and real DNA-seq and RNA-seq datasets at various thresholds. In most cases, but especially for reads falling within repetitive regions, AlignerBoost dramatically increases the mapping precision of modern NGS aligners without significantly compromising the sensitivity even without mapping quality filters. When using higher mapping quality cutoffs, AlignerBoost achieves a much lower false mapping rate while exhibiting comparable or higher sensitivity compared to the aligner default modes, therefore significantly boosting the detection power of NGS aligners even using extreme thresholds. AlignerBoost is also SNP-aware, and higher quality alignments can be achieved if provided with known SNPs. AlignerBoost's algorithm is computationally efficient, and can process one million alignments within 30 seconds on a typical desktop computer. AlignerBoost is implemented as a uniform Java application and is freely available at https://github.com/Grice-Lab/AlignerBoost.
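
    The core idea of a posterior-based mapping quality can be sketched independently of AlignerBoost itself; the formula below is a generic Phred-scaled posterior over candidate hits with a uniform prior, not the toolkit's exact weighting scheme.

        # Generic Bayesian-style mapping quality from per-candidate alignment log-likelihoods.
        import math

        def mapping_quality(log_likelihoods):
            """Phred-scaled posterior probability that the best candidate hit is wrong."""
            m = max(log_likelihoods)
            weights = [math.exp(ll - m) for ll in log_likelihoods]   # stabilised likelihoods
            post_best = max(weights) / sum(weights)                  # uniform prior over candidates
            p_wrong = max(1.0 - post_best, 1e-10)                    # avoid log(0) for unique hits
            return min(60, int(round(-10 * math.log10(p_wrong))))    # cap at 60, as many aligners do

        print(mapping_quality([-5.0]))               # unique hit -> high mapping quality
        print(mapping_quality([-5.0, -5.1, -9.0]))   # near-tied hits -> low mapping quality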

  19. Bayesian theory and applications

    CERN Document Server

    Dellaportas, Petros; Polson, Nicholas G; Stephens, David A

    2013-01-01

    The development of hierarchical models and Markov chain Monte Carlo (MCMC) techniques forms one of the most profound advances in Bayesian analysis since the 1970s and provides the basis for advances in virtually all areas of applied and theoretical Bayesian statistics. This volume guides the reader along a statistical journey that begins with the basic structure of Bayesian theory, and then provides details on most of the past and present advances in this field. The book has a unique format. There is an explanatory chapter devoted to each conceptual advance followed by journal-style chapters that provide applications or further advances on the concept. Thus, the volume is both a textbook and a compendium of papers covering a vast range of topics. It is appropriate for a well-informed novice interested in understanding the basic approach, methods and recent applications. Because of its advanced chapters and recent work, it is also appropriate for a more mature reader interested in recent applications and devel...

  20. Multiview Bayesian Correlated Component Analysis

    DEFF Research Database (Denmark)

    Kamronn, Simon Due; Poulsen, Andreas Trier; Hansen, Lars Kai

    2015-01-01

    …are identical. Here we propose a hierarchical probabilistic model that can infer the level of universality in such multiview data, from completely unrelated representations, corresponding to canonical correlation analysis, to identical representations as in correlated component analysis. This new model, which we denote Bayesian correlated component analysis, evaluates favorably against three relevant algorithms in simulated data. A well-established benchmark EEG data set is used to further validate the new model and infer the variability of spatial representations across multiple subjects.

  1. Bayesian networks in reliability

    Energy Technology Data Exchange (ETDEWEB)

    Langseth, Helge [Department of Mathematical Sciences, Norwegian University of Science and Technology, N-7491 Trondheim (Norway)]. E-mail: helgel@math.ntnu.no; Portinale, Luigi [Department of Computer Science, University of Eastern Piedmont 'Amedeo Avogadro', 15100 Alessandria (Italy)]. E-mail: portinal@di.unipmn.it

    2007-01-15

    Over the last decade, Bayesian networks (BNs) have become a popular tool for modelling many kinds of statistical problems. We have also seen a growing interest for using BNs in the reliability analysis community. In this paper we will discuss the properties of the modelling framework that make BNs particularly well suited for reliability applications, and point to ongoing research that is relevant for practitioners in reliability.
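
    A minimal illustration (not taken from the paper) of why BNs fit reliability problems: a two-component series system whose failure probability follows by summing out the node states of a small discrete network, here enumerated by hand with invented probabilities.

        # Tiny discrete Bayesian network for a two-component series system.
        # Structure: Environment -> ComponentA, Environment -> ComponentB; the system fails if A or B fails.
        from itertools import product

        p_env_harsh = 0.2
        p_fail = {("A", True): 0.10, ("A", False): 0.02,   # P(component fails | harsh environment?)
                  ("B", True): 0.15, ("B", False): 0.03}

        p_system_fail = 0.0
        for harsh, a_fails, b_fails in product([True, False], repeat=3):
            p = p_env_harsh if harsh else 1 - p_env_harsh
            p *= p_fail[("A", harsh)] if a_fails else 1 - p_fail[("A", harsh)]
            p *= p_fail[("B", harsh)] if b_fails else 1 - p_fail[("B", harsh)]
            if a_fails or b_fails:                         # series system: any failure is a system failure
                p_system_fail += p
        print(p_system_fail)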

  2. Hierarchical species distribution models

    Science.gov (United States)

    Hefley, Trevor J.; Hooten, Mevin B.

    2016-01-01

    Determining the distribution pattern of a species is important to increase scientific knowledge, inform management decisions, and conserve biodiversity. To infer spatial and temporal patterns, species distribution models have been developed for use with many sampling designs and types of data. Recently, it has been shown that count, presence-absence, and presence-only data can be conceptualized as arising from a point process distribution. Therefore, it is important to understand properties of the point process distribution. We examine how the hierarchical species distribution modeling framework has been used to incorporate a wide array of regression and theory-based components while accounting for the data collection process and making use of auxiliary information. The hierarchical modeling framework allows us to demonstrate how several commonly used species distribution models can be derived from the point process distribution, highlight areas of potential overlap between different models, and suggest areas where further research is needed.

  3. Estimating mental states of a depressed person with bayesian networks

    NARCIS (Netherlands)

    Klein, Michel C.A.; Modena, Gabriele

    2013-01-01

    In this work-in-progress paper we present an approach based on Bayesian Networks to model the relationship between mental states and empirical observations in a depressed person. We encode relationships and domain expertise as a Hierarchical Bayesian Network. Mental states are represented as latent

  4. A Bayesian framework based on a Gaussian mixture model and radial-basis-function Fisher discriminant analysis (BayGmmKda V1.1) for spatial prediction of floods

    Science.gov (United States)

    Tien Bui, Dieu; Hoang, Nhat-Duc

    2017-09-01

    In this study, a probabilistic model, named BayGmmKda, is proposed for flood susceptibility assessment in a study area in central Vietnam. The new model is a Bayesian framework constructed by a combination of a Gaussian mixture model (GMM), radial-basis-function Fisher discriminant analysis (RBFDA), and a geographic information system (GIS) database. In the Bayesian framework, GMM is used for modeling the data distribution of flood-influencing factors in the GIS database, whereas RBFDA is utilized to construct a latent variable that aims at enhancing the model performance. As a result, the posterior probabilistic output of the BayGmmKda model is used as a flood susceptibility index. Experimental results showed that the proposed hybrid framework is superior to other benchmark models, including the adaptive neuro-fuzzy inference system and the support vector machine. To facilitate the model implementation, a software program of BayGmmKda has been developed in MATLAB. The BayGmmKda program can accurately establish a flood susceptibility map for the study region. Accordingly, local authorities can overlay this susceptibility map onto various land-use maps for the purpose of land-use planning or management.
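
    A rough sketch of the generative part of such a framework (scikit-learn mixtures standing in for the paper's GMM construction, with the RBFDA latent-variable step omitted): fit class-conditional mixtures to flood and non-flood samples and use Bayes' rule as a susceptibility index.

        # Sketch: class-conditional Gaussian mixtures + Bayes' rule as a flood susceptibility index.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(1)
        X_flood = rng.normal(loc=[2.0, 1.0], scale=0.8, size=(300, 2))   # synthetic flood-influencing factors
        X_dry = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(300, 2))

        gmm_flood = GaussianMixture(n_components=2, random_state=0).fit(X_flood)
        gmm_dry = GaussianMixture(n_components=2, random_state=0).fit(X_dry)
        prior_flood = 0.3                                                # assumed prior flood proportion

        def susceptibility(x):
            lf = np.exp(gmm_flood.score_samples(x)) * prior_flood
            ld = np.exp(gmm_dry.score_samples(x)) * (1 - prior_flood)
            return lf / (lf + ld)                                        # posterior P(flood | factors)

        print(susceptibility(np.array([[1.8, 0.9], [-0.5, 0.2]])))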

  5. Tractography segmentation using a hierarchical Dirichlet processes mixture model.

    Science.gov (United States)

    Wang, Xiaogang; Grimson, W Eric L; Westin, Carl-Fredrik

    2011-01-01

    In this paper, we propose a new nonparametric Bayesian framework to cluster white matter fiber tracts into bundles using a hierarchical Dirichlet processes mixture (HDPM) model. The number of clusters is automatically learned driven by data with a Dirichlet process (DP) prior instead of being manually specified. After the models of bundles have been learned from training data without supervision, they can be used as priors to cluster/classify fibers of new subjects for comparison across subjects. When clustering fibers of new subjects, new clusters can be created for structures not observed in the training data. Our approach does not require computing pairwise distances between fibers and can cluster a huge set of fibers across multiple subjects. We present results on several data sets, the largest of which has more than 120,000 fibers. Copyright © 2010 Elsevier Inc. All rights reserved.
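
    The automatic choice of the number of clusters under a DP prior can be illustrated with scikit-learn's truncated Dirichlet-process mixture on toy 2-D points (a generic stand-in, not the fiber-tract HDPM model of the paper).

        # Dirichlet-process mixture on toy data: the effective number of clusters is inferred, not fixed.
        import numpy as np
        from sklearn.mixture import BayesianGaussianMixture

        rng = np.random.default_rng(2)
        X = np.vstack([rng.normal([0, 0], 0.3, size=(200, 2)),
                       rng.normal([3, 3], 0.3, size=(200, 2)),
                       rng.normal([0, 4], 0.3, size=(200, 2))])

        dpgmm = BayesianGaussianMixture(
            n_components=10,                                    # truncation level, not the cluster count
            weight_concentration_prior_type="dirichlet_process",
            random_state=0,
        ).fit(X)
        print("effective clusters:", int(np.sum(dpgmm.weights_ > 0.01)))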

  6. Accounting for uncertainty in ecological analysis: the strengths and limitations of hierarchical statistical modeling.

    Science.gov (United States)

    Cressie, Noel; Calder, Catherine A; Clark, James S; Ver Hoef, Jay M; Wikle, Christopher K

    2009-04-01

    Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.

  7. Bayesian inference of local geomagnetic secular variation curves: application to archaeomagnetism

    Science.gov (United States)

    Lanos, Philippe

    2014-05-01

    The errors that occur at different stages of the archaeomagnetic calibration process are combined using Bayesian hierarchical modelling. The archaeomagnetic data obtained from archaeological structures such as hearths, kilns or sets of bricks and tiles exhibit considerable experimental errors and are generally more or less well dated by archaeological context, history or chronometric methods (14C, TL, dendrochronology, etc.). They can also be associated with stratigraphic observations which provide prior relative chronological information. The modelling we propose allows all these observations and errors to be linked together through appropriate prior probability densities. The model also includes penalized cubic splines for estimating the univariate, spherical or three-dimensional curves for the secular variation of the geomagnetic field (inclination, declination, intensity) over time at a local place. The mean smooth curve we obtain, with its posterior Bayesian envelope, adapts to the effects of variability in the density of reference points over time. Moreover, the hierarchical modelling also provides an efficient way to penalize outliers automatically. With this new posterior estimate of the curve, the Bayesian statistical framework then allows the calendar dates of undated archaeological features (such as kilns) to be estimated from one, two or three geomagnetic parameters (inclination, declination and/or intensity). Date estimates are presented in the same way as those that arise from radiocarbon dating. In order to illustrate the model and the inference method used, we will present results based on recently published French, Bulgarian and Austrian datasets.

  8. Bayesian Network for multiple hypothesis tracking

    NARCIS (Netherlands)

    Zajdel, W.P.; Kröse, B.J.A.; Blockeel, H.; Denecker, M.

    2002-01-01

    For a flexible camera-to-camera tracking of multiple objects we model the objects' behavior with a Bayesian network and combine it with the multiple hypothesis framework that associates observations with objects. Bayesian networks offer a possibility to factor complex, joint distributions into a

  9. Doing bayesian data analysis a tutorial with R and BUGS

    CERN Document Server

    Kruschke, John K

    2011-01-01

    There is an explosion of interest in Bayesian statistics, primarily because recently created computational methods have finally made Bayesian analysis obtainable to a wide audience. Doing Bayesian Data Analysis, A Tutorial Introduction with R and BUGS provides an accessible approach to Bayesian data analysis, as material is explained clearly with concrete examples. The book begins with the basics, including essential concepts of probability and random sampling, and gradually progresses to advanced hierarchical modeling methods for realistic data. The text delivers comprehensive coverage of all

  10. Identifying Mixtures of Mixtures Using Bayesian Estimation

    Science.gov (United States)

    Malsiner-Walli, Gertraud; Frühwirth-Schnatter, Sylvia; Grün, Bettina

    2017-01-01

    The use of a finite mixture of normal distributions in model-based clustering allows us to capture non-Gaussian data clusters. However, identifying the clusters from the normal components is challenging and in general either achieved by imposing constraints on the model or by using post-processing procedures. Within the Bayesian framework, we propose a different approach based on sparse finite mixtures to achieve identifiability. We specify a hierarchical prior, where the hyperparameters are carefully selected such that they are reflective of the cluster structure aimed at. In addition, this prior allows us to estimate the model using standard MCMC sampling methods. In combination with a post-processing approach which resolves the label switching issue and results in an identified model, our approach allows us to simultaneously (1) determine the number of clusters, (2) flexibly approximate the cluster distributions in a semiparametric way using finite mixtures of normals and (3) identify cluster-specific parameters and classify observations. The proposed approach is illustrated in two simulation studies and on benchmark datasets. Supplementary materials for this article are available online. PMID:28626349

  11. Hierarchical modeling and inference in ecology: The analysis of data from populations, metapopulations and communities

    Science.gov (United States)

    Royle, J. Andrew; Dorazio, Robert M.

    2008-01-01

    A guide to data collection, modeling and inference strategies for biological survey data using Bayesian and classical statistical methods. This book describes a general and flexible framework for modeling and inference in ecological systems based on hierarchical models, with a strict focus on the use of probability models and parametric inference. Hierarchical models represent a paradigm shift in the application of statistics to ecological inference problems because they combine explicit models of ecological system structure or dynamics with models of how ecological systems are observed. The principles of hierarchical modeling are developed and applied to problems in population, metapopulation, community, and metacommunity systems. The book provides the first synthetic treatment of many recent methodological advances in ecological modeling and unifies disparate methods and procedures. The authors apply principles of hierarchical modeling to ecological problems, including occurrence or occupancy models for estimating species distribution; abundance models based on many sampling protocols, including distance sampling; capture-recapture models with individual effects; spatial capture-recapture models based on camera trapping and related methods; population and metapopulation dynamic models; and models of biodiversity, community structure and dynamics.
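
    As a hedged illustration of the simplest member of this model family, the likelihood of a basic occupancy model (constant occupancy probability psi and detection probability p, written from the standard textbook formulation rather than copied from the book) can be evaluated in a few lines; the detection histories below are invented.

        # Minimal occupancy-model likelihood: y_i detections out of J visits at each site.
        import numpy as np
        from math import comb

        def occupancy_loglik(psi, p, y, J):
            ll = 0.0
            for yi in y:
                like_occupied = comb(J, yi) * p**yi * (1 - p)**(J - yi)   # detections given occupancy
                like_unoccupied = 1.0 if yi == 0 else 0.0                 # no detections if unoccupied
                ll += np.log(psi * like_occupied + (1 - psi) * like_unoccupied)
            return ll

        y = [0, 2, 0, 1, 3, 0]                        # detection counts at 6 sites, J = 3 visits each
        print(occupancy_loglik(psi=0.6, p=0.4, y=y, J=3))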

  12. Inference in hybrid Bayesian networks

    DEFF Research Database (Denmark)

    Lanseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael

    2009-01-01

    Since the 1980s, Bayesian Networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability techniques (like fault trees and reliability block diagrams). However, limitations in the BNs' calculation engine have prevented BNs from becoming equally popular for domains containing mixtures of both discrete and continuous variables (so-called hybrid domains). In this paper we focus on these difficulties, and summarize some of the last decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability.

  13. Smart microgrid hierarchical frequency control ancillary service provision based on virtual inertia concept: An integrated demand response and droop controlled distributed generation framework

    International Nuclear Information System (INIS)

    Rezaei, Navid; Kalantar, Mohsen

    2015-01-01

    Highlights: • Detailed formulation of the microgrid static and dynamic securities based on droop control and virtual inertia concepts. • Construction of a novel objective function using frequency excursion and rate-of-change-of-frequency profiles. • Ensuring microgrid security subject to the microgrid economic and environmental policies. • Coordinated management of demand response and droop-controlled distributed generation resources. • Precise scheduling of day-ahead hierarchical frequency control ancillary services using scenario-based stochastic programming. - Abstract: Low inertia, high penetration levels of renewable energy sources and a large ratio of power deviations in a small power delivery system put microgrid frequency at risk of instability. Given the close coupling between the microgrid frequency and system security requirements, procuring adequate ancillary services from cost-effective and environmentally friendly resources is a great challenge that requires an efficient energy management system. Motivated by this need, this paper presents a novel energy management system aimed at coordinated management of demand response and distributed generation resources. The proposed approach constructs a hierarchical frequency control structure in which the frequency-dependent control functions of the microgrid components are modeled comprehensively. On the basis of the derived modeling, both the static and dynamic frequency securities of an islanded microgrid are provided at the primary and secondary control levels. In addition, to cope with the low inertia of islanded microgrids, a novel virtual inertia concept is devised based on precise modeling of droop-controlled distributed generation resources. The proposed approach is applied to a typical test microgrid. Energy and hierarchical reserve resources are scheduled precisely using a scenario-based stochastic programming methodology. Moreover, analyzing the

  14. Hierarchical Bayesian Segmentation for Piecewise Constant MA Models Based on the Reversible Jump MCMC Algorithm

    Directory of Open Access Journals (Sweden)

    Suparman Suparman

    2012-03-01

    This paper addresses the problem of signal segmentation within a hierarchical Bayesian framework by using reversible jump MCMC sampling. The signal is modelled by piecewise constant MA processes, where the number of segments, the positions of the abrupt changes, and the order and the coefficients of the MA processes for each segment are unknown. The reversible jump MCMC algorithm is then used to generate samples distributed according to the joint posterior distribution of the unknown parameters. These samples allow interesting features of the posterior distribution to be computed. The main advantage of the reversible jump MCMC algorithm is that it produces joint estimators for the parameters and hyperparameters of the hierarchical Bayesian model. The performance of this methodology is illustrated via several simulation results. Keywords: hierarchical Bayesian model, reversible jump MCMC methods, signal segmentation, piecewise constant moving-average (MA) processes

  15. Bayesian inference with information content model check for Langevin equations

    DEFF Research Database (Denmark)

    Krog, Jens F. C.; Lomholt, Michael Andersen

    2017-01-01

    The Bayesian data analysis framework has been proven to be a systematic and effective method of parameter inference and model selection for stochastic processes. In this work we introduce an information content model check which may serve as a goodness-of-fit measure, like the chi-square procedure, to complement conventional Bayesian analysis. We demonstrate this extended Bayesian framework on a system of Langevin equations, where coordinate-dependent mobilities and measurement noise hinder the normal mean squared displacement approach.

  16. Bayesian analysis of rare events

    Science.gov (United States)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
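
    The rejection-sampling reinterpretation at the heart of BUS can be written in a few lines (a plain sketch with a conjugate toy problem, not the FORM-, importance-sampling- or Subset-Simulation-accelerated versions discussed above): a prior sample is accepted whenever a uniform variate falls below its scaled likelihood.

        # Rejection-sampling view of Bayesian updating (the idea behind BUS), in its plain form.
        # Prior: theta ~ N(0, 1); data: one noisy observation of theta; c bounds the likelihood.
        import numpy as np

        rng = np.random.default_rng(3)
        obs, sigma = 1.5, 0.5

        def likelihood(theta):
            return np.exp(-0.5 * ((obs - theta) / sigma) ** 2)   # unnormalised Gaussian likelihood

        c = 1.0                                                  # any constant >= max of the likelihood
        theta_prior = rng.normal(size=100_000)
        u = rng.uniform(size=100_000)
        posterior_samples = theta_prior[u < likelihood(theta_prior) / c]
        print(posterior_samples.mean(), posterior_samples.std())   # close to 1.2 and 0.45 analytically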

  17. A hierarchical state space approach to affective dynamics.

    Science.gov (United States)

    Lodewyckx, Tom; Tuerlinckx, Francis; Kuppens, Peter; Allen, Nicholas; Sheeber, Lisa

    2011-02-01

    Linear dynamical system theory is a broad theoretical framework that has been applied in various research areas such as engineering, econometrics and recently in psychology. It quantifies the relations between observed inputs and outputs that are connected through a set of latent state variables. State space models are used to investigate the dynamical properties of these latent quantities. These models are especially of interest in the study of emotion dynamics, with the system representing the evolving emotion components of an individual. However, for simultaneous modeling of individual and population differences, a hierarchical extension of the basic state space model is necessary. Therefore, we introduce a Bayesian hierarchical model with random effects for the system parameters. Further, we apply our model to data that were collected using the Oregon adolescent interaction task: 66 normal and 67 depressed adolescents engaged in a conflict interaction with their parents and second-to-second physiological and behavioral measures were obtained. System parameters in normal and depressed adolescents were compared, which led to interesting discussions in the light of findings in recent literature on the links between cardiovascular processes, emotion dynamics and depression. We illustrate that our approach is flexible and general: The model can be applied to any time series for multiple systems (where a system can represent any entity) and moreover, one is free to focus on whatever component of the versatile model.
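
    A stripped-down version of the latent-state idea (one subject, a one-dimensional state, and no hierarchical random effects) is the linear Gaussian state space model, simulated and filtered below with the standard Kalman recursions; all parameter values are illustrative.

        # Minimal linear Gaussian state space model with Kalman filtering (single subject, 1-D state).
        # The hierarchical extension in the paper places random effects on such parameters across subjects.
        import numpy as np

        rng = np.random.default_rng(4)
        a, q, r = 0.9, 0.1, 0.5            # state transition, process noise, observation noise
        T = 200
        x = np.zeros(T); y = np.zeros(T)
        for t in range(1, T):
            x[t] = a * x[t - 1] + rng.normal(scale=np.sqrt(q))
            y[t] = x[t] + rng.normal(scale=np.sqrt(r))

        m, p = 0.0, 1.0                    # filtered mean and variance
        filtered = []
        for t in range(T):
            m_pred, p_pred = a * m, a * a * p + q        # predict
            k = p_pred / (p_pred + r)                    # Kalman gain
            m = m_pred + k * (y[t] - m_pred)             # update with the observation
            p = (1 - k) * p_pred
            filtered.append(m)
        print(np.corrcoef(filtered, x)[0, 1])            # filtered estimate tracks the latent state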

  18. Kernel Bayesian ART and ARTMAP.

    Science.gov (United States)

    Masuyama, Naoki; Loo, Chu Kiong; Dawood, Farhan

    2018-02-01

    Adaptive Resonance Theory (ART) is one of the successful approaches to resolving "the plasticity-stability dilemma" in neural networks, and its supervised learning model, ARTMAP, is a powerful tool for classification. Among several improvements, such as Fuzzy- or Gaussian-based models, the state-of-the-art model is the Bayesian-based one, which addresses the drawbacks of the others. However, the Bayesian approach is known to incur a high computational cost for high-dimensional data and large datasets, and the covariance matrix in the likelihood can become unstable. This paper introduces Kernel Bayesian ART (KBA) and ARTMAP (KBAM) by integrating the Kernel Bayes' Rule (KBR) and the Correntropy Induced Metric (CIM) into Bayesian ART (BA) and ARTMAP (BAM), respectively, while maintaining the properties of BA and BAM. The kernel frameworks in KBA and KBAM avoid the curse of dimensionality. In addition, covariance-free Bayesian computation via KBR provides efficient and stable computation for KBA and KBAM. Furthermore, correntropy-based similarity measurement improves noise reduction even in high-dimensional spaces. Simulation experiments show that KBA exhibits superior self-organizing capability compared to BA, and KBAM provides superior classification ability compared to BAM. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Personalized Audio Systems - a Bayesian Approach

    DEFF Research Database (Denmark)

    Nielsen, Jens Brehm; Jensen, Bjørn Sand; Hansen, Toke Jansen

    2013-01-01

    …the present paper presents a general interactive framework for personalization of such audio systems. The framework builds on Bayesian Gaussian process regression, in which a model of the user's objective function is updated sequentially. The parameter setting to be evaluated in a given trial is selected … are optimized using the proposed framework. Twelve test subjects obtain a personalized setting with the framework, and these settings are significantly preferred to those obtained with random experimentation.
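
    A hedged sketch of the sequential loop such a framework implies: scikit-learn's Gaussian process regression stands in for the preference model, and a simple upper-confidence-bound rule (an assumption, possibly different from the acquisition rule actually used) picks the next setting to present; the simulated "user" is a made-up one-dimensional preference function.

        # Sequential personalization sketch: model user ratings of an audio setting with a GP,
        # then choose the next setting to try by an upper-confidence-bound rule.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def true_preference(s):                          # hidden "user", for simulation only
            return -((s - 0.7) ** 2)

        candidates = np.linspace(0, 1, 101).reshape(-1, 1)     # one audio parameter, e.g. a gain setting
        tried, ratings = [[0.1], [0.9]], [true_preference(0.1), true_preference(0.9)]

        for trial in range(8):
            gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-3)
            gp.fit(np.array(tried), np.array(ratings))
            mean, std = gp.predict(candidates, return_std=True)
            nxt = float(candidates[np.argmax(mean + std)][0])  # upper confidence bound
            tried.append([nxt]); ratings.append(true_preference(nxt))

        print("personalized setting:", tried[int(np.argmax(ratings))][0])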

  20. Modelling dependable systems using hybrid Bayesian networks

    International Nuclear Information System (INIS)

    Neil, Martin; Tailor, Manesh; Marquez, David; Fenton, Norman; Hearty, Peter

    2008-01-01

    A hybrid Bayesian network (BN) is one that incorporates both discrete and continuous nodes. In our extensive applications of BNs for system dependability assessment, the models are invariably hybrid and the need for efficient and accurate computation is paramount. We apply a new iterative algorithm that efficiently combines dynamic discretisation with robust propagation algorithms on junction tree structures to perform inference in hybrid BNs. We illustrate its use in the field of dependability with two example of reliability estimation. Firstly we estimate the reliability of a simple single system and next we implement a hierarchical Bayesian model. In the hierarchical model we compute the reliability of two unknown subsystems from data collected on historically similar subsystems and then input the result into a reliability block model to compute system level reliability. We conclude that dynamic discretisation can be used as an alternative to analytical or Monte Carlo methods with high precision and can be applied to a wide range of dependability problems

  1. Bayesian flood forecasting methods: A review

    Science.gov (United States)

    Han, Shasha; Coulibaly, Paulin

    2017-08-01

    Over the past few decades, floods have been among the most common and most widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, their negative impacts could be greatly reduced. It is widely recognized that quantification and reduction of the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied in flood forecasting from 1999 till now. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting, then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way of flood estimation: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, thus giving more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. ensemble Bayesian forecasting system, Bayesian multi-model combination) were shown to overcome limitations of a single model or fixed model weights and effectively reduce predictive uncertainty. In recent years, various Bayesian flood forecasting approaches have been

  2. CARBayes: An R Package for Bayesian Spatial Modeling with Conditional Autoregressive Priors

    Directory of Open Access Journals (Sweden)

    Duncan Lee

    2013-11-01

    Conditional autoregressive models are commonly used to represent spatial autocorrelation in data relating to a set of non-overlapping areal units, which arise in a wide variety of applications including agriculture, education, epidemiology and image analysis. Such models are typically specified in a hierarchical Bayesian framework, with inference based on Markov chain Monte Carlo (MCMC) simulation. The most widely used software to fit such models is WinBUGS or OpenBUGS, but in this paper we introduce the R package CARBayes. The main advantage of CARBayes compared with the BUGS software is its ease of use, because: (1) the spatial adjacency information is easy to specify as a binary neighbourhood matrix; and (2) given the neighbourhood matrix, the models can be implemented by a single function call in R. This paper outlines the general class of Bayesian hierarchical models that can be implemented in the CARBayes software, describes their implementation via MCMC simulation techniques, and illustrates their use with two worked examples in the fields of house price analysis and disease mapping.
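
    The binary neighbourhood matrix that CARBayes takes as input maps directly onto the precision matrix of a CAR-type prior; a small numpy construction of a Leroux-style precision matrix (a generic sketch, not CARBayes code, which is an R package) is shown below.

        # Precision matrix of a Leroux-style CAR prior from a binary neighbourhood matrix W:
        # Q = tau * (rho * (D - W) + (1 - rho) * I), with D the diagonal matrix of neighbour counts.
        import numpy as np

        W = np.array([[0, 1, 0, 0],       # four areal units on a line: 1-2-3-4
                      [1, 0, 1, 0],
                      [0, 1, 0, 1],
                      [0, 0, 1, 0]], dtype=float)
        D = np.diag(W.sum(axis=1))
        tau, rho = 2.0, 0.9               # illustrative precision and spatial-dependence parameters

        Q = tau * (rho * (D - W) + (1 - rho) * np.eye(4))
        cov = np.linalg.inv(Q)            # implied joint covariance of the spatial random effects
        print(np.round(cov, 3))           # nearby units end up more strongly correlated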

  3. Bayesian Inference Methods for Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand

    2013-01-01

    …of Bayesian inference algorithms for sparse channel estimation. Sparse inference methods aim at finding the sparse representation of a signal given in some overcomplete dictionary of basis vectors. Within this context, one of our main contributions to the field of SBL is a hierarchical representation … and computational complexity. We also analyze the impact of transceiver filters on the sparseness of the channel response, and propose a dictionary design that permits the deployment of sparse inference methods in conditions of low bandwidth.

  4. Introduction to Bayesian statistics

    CERN Document Server

    Bolstad, William M

    2017-01-01

    There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for Multivariate Normal mean vector; Bayesian inference for Multiple Linear Regression Model; and Computati...

  5. Empirical Bayesian inference and model uncertainty

    International Nuclear Information System (INIS)

    Poern, K.

    1994-01-01

    This paper presents a hierarchical or multistage empirical Bayesian approach for the estimation of uncertainty concerning the intensity of a homogeneous Poisson process. A class of contaminated gamma distributions is considered to describe the uncertainty concerning the intensity. These distributions in turn are defined through a set of secondary parameters, the knowledge of which is also described and updated via Bayes' formula. This two-stage Bayesian approach is an example where the modeling uncertainty is treated in a comprehensive way. Each contaminated gamma distribution, represented by a point in the 3D space of secondary parameters, can be considered as a specific model of the uncertainty about the Poisson intensity. Then, by the empirical Bayesian method, each individual model is assigned a posterior probability.
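
    The first stage of such a model is the conjugate gamma-Poisson update, illustrated below with invented counts and exposure; the multistage part of the paper then places a further (contaminated gamma) prior structure over the gamma parameters themselves.

        # Conjugate gamma-Poisson update for a homogeneous Poisson intensity.
        # Prior Gamma(alpha, beta); observing k events over exposure T gives Gamma(alpha + k, beta + T).
        alpha, beta = 1.0, 2.0            # prior shape and rate (illustrative)
        k, T = 3, 10.0                    # observed number of events and exposure time

        alpha_post, beta_post = alpha + k, beta + T
        print("posterior mean intensity:", alpha_post / beta_post)
        print("posterior variance:", alpha_post / beta_post**2)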

  6. Hierarchical quark mass matrices

    International Nuclear Information System (INIS)

    Rasin, A.

    1998-02-01

    I define a set of conditions that the most general hierarchical Yukawa mass matrices have to satisfy so that the leading rotations in the diagonalization matrix are a pair of (2,3) and (1,2) rotations. In addition to Fritzsch structures, examples of such hierarchical structures include also matrices with (1,3) elements of the same order or even much larger than the (1,2) elements. Such matrices can be obtained in the framework of a flavor theory. To leading order, the values of the angle in the (2,3) plane (s_23) and the angle in the (1,2) plane (s_12) do not depend on the order in which they are taken when diagonalizing. We find that any of the Cabibbo-Kobayashi-Maskawa matrix parametrizations that consist of at least one (1,2) and one (2,3) rotation may be suitable. In the particular case when the s_13 diagonalization angles are sufficiently small compared to the product s_12 s_23, two special CKM parametrizations emerge: the R_12 R_23 R_12 parametrization follows with s_23 taken before the s_12 rotation, and vice versa for the R_23 R_12 R_23 parametrization. (author)
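
    The leading-rotation structure can be checked numerically with a toy product of a (1,2) and a (2,3) rotation (angles are illustrative, not fitted values; the (1,3) mixing and the CP phase are ignored).

        # Product of (1,2) and (2,3) rotations as a toy CKM-like mixing matrix.
        import numpy as np

        def rot(i, j, s):
            """3x3 rotation with sine s in the (i, j) plane (1-based indices)."""
            c = np.sqrt(1 - s**2)
            R = np.eye(3)
            R[i - 1, i - 1] = R[j - 1, j - 1] = c
            R[i - 1, j - 1], R[j - 1, i - 1] = s, -s
            return R

        s12, s23 = 0.22, 0.04
        V = rot(1, 2, s12) @ rot(2, 3, s23)    # leading rotations only
        print(np.round(V, 3))                  # |V[0,1]| ~ s12 and |V[1,2]| ~ s23 to leading order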

  7. Disease Mapping and Regression with Count Data in the Presence of Overdispersion and Spatial Autocorrelation: A Bayesian Model Averaging Approach

    Science.gov (United States)

    Mohebbi, Mohammadreza; Wolfe, Rory; Forbes, Andrew

    2014-01-01

    This paper applies the generalised linear model for modelling geographical variation to esophageal cancer incidence data in the Caspian region of Iran. The data have a complex and hierarchical structure that makes them suitable for hierarchical analysis using Bayesian techniques, but with care required to deal with problems arising from counts of events observed in small geographical areas when overdispersion and residual spatial autocorrelation are present. These considerations lead to nine regression models derived from using three probability distributions for count data: Poisson, generalised Poisson and negative binomial, and three different autocorrelation structures. We employ the framework of Bayesian variable selection and a Gibbs sampling based technique to identify significant cancer risk factors. The framework deals with situations where the number of possible models based on different combinations of candidate explanatory variables is large enough such that calculation of posterior probabilities for all models is difficult or infeasible. The evidence from applying the modelling methodology suggests that modelling strategies based on the use of generalised Poisson and negative binomial with spatial autocorrelation work well and provide a robust basis for inference. PMID:24413702

  8. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2003-01-01

    As the power of Bayesian techniques has become more fully realized, the field of artificial intelligence has embraced Bayesian methodology and integrated it to the point where an introduction to Bayesian techniques is now a core course in many computer science programs. Unlike other books on the subject, Bayesian Artificial Intelligence keeps mathematical detail to a minimum and covers a broad range of topics. The authors integrate all of Bayesian net technology and learning Bayesian net technology and apply them both to knowledge engineering. They emphasize understanding and intuition but also provide the algorithms and technical background needed for applications. Software, exercises, and solutions are available on the authors' website.

  9. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2010-01-01

    Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundation, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology.New to the Second EditionNew chapter on Bayesian network classifiersNew section on object-oriente

  10. Computational Neuropsychology and Bayesian Inference.

    Science.gov (United States)

    Parr, Thomas; Rees, Geraint; Friston, Karl J

    2018-01-01

    Computational theories of brain function have become very influential in neuroscience. They have facilitated the growth of formal approaches to disease, particularly in psychiatric research. In this paper, we provide a narrative review of the body of computational research addressing neuropsychological syndromes, and focus on those that employ Bayesian frameworks. Bayesian approaches to understanding brain function formulate perception and action as inferential processes. These inferences combine 'prior' beliefs with a generative (predictive) model to explain the causes of sensations. Under this view, neuropsychological deficits can be thought of as false inferences that arise due to aberrant prior beliefs (that are poor fits to the real world). This draws upon the notion of a Bayes optimal pathology - optimal inference with suboptimal priors - and provides a means for computational phenotyping. In principle, any given neuropsychological disorder could be characterized by the set of prior beliefs that would make a patient's behavior appear Bayes optimal. We start with an overview of some key theoretical constructs and use these to motivate a form of computational neuropsychology that relates anatomical structures in the brain to the computations they perform. Throughout, we draw upon computational accounts of neuropsychological syndromes. These are selected to emphasize the key features of a Bayesian approach, and the possible types of pathological prior that may be present. They range from visual neglect through hallucinations to autism. Through these illustrative examples, we review the use of Bayesian approaches to understand the link between biology and computation that is at the heart of neuropsychology.

  11. Towards Bayesian Inference of the Fast-Ion Distribution Function

    DEFF Research Database (Denmark)

    Stagner, L.; Heidbrink, W.W.; Salewski, Mirko

    2012-01-01

    …However, when theory and experiment disagree (for one or more diagnostics), it is unclear how to proceed. Bayesian statistics provides a framework to infer the DF, quantify errors, and reconcile discrepant diagnostic measurements. Diagnostic errors and "weight functions" that describe the phase space sensitivity of the measurements are incorporated into Bayesian likelihood probabilities, while prior probabilities enforce physical constraints. As an initial step, this poster uses Bayesian statistics to infer the DIII-D electron density profile from multiple diagnostic measurements. Likelihood functions …

  12. Implementing the Bayesian paradigm in risk analysis

    International Nuclear Information System (INIS)

    Aven, T.; Kvaloey, J.T.

    2002-01-01

    The Bayesian paradigm comprises a unified and consistent framework for analyzing and expressing risk. Yet, we see rather few examples of applications where the full Bayesian setting has been adopted with specifications of priors of unknown parameters. In this paper, we discuss some of the practical challenges of implementing Bayesian thinking and methods in risk analysis, emphasizing the introduction of probability models and parameters and associated uncertainty assessments. We conclude that there is a need for a pragmatic view in order to 'successfully' apply the Bayesian approach, such that we can do the assignments of some of the probabilities without adopting the somewhat sophisticated procedure of specifying prior distributions of parameters. A simple risk analysis example is presented to illustrate ideas

  13. A Bayesian Panel Data Approach to Explaining Market Beta Dynamics

    NARCIS (Netherlands)

    R. Bauer (Rob); M.M.J.E. Cosemans (Mathijs); R. Frehen (Rik); P.C. Schotman (Peter)

    2008-01-01

    We characterize the process that drives the market betas of individual stocks by setting up a hierarchical Bayesian panel data model that allows a flexible specification for beta. We show that combining the parametric relationship between betas and conditioning variables specified by

  14. Bayesian model ensembling using meta-trained recurrent neural networks

    NARCIS (Netherlands)

    Ambrogioni, L.; Berezutskaya, Y.; Güçlü, U.; Borne, E.W.P. van den; Güçlütürk, Y.; Gerven, M.A.J. van; Maris, E.G.G.

    2017-01-01

    In this paper we demonstrate that a recurrent neural network meta-trained on an ensemble of arbitrary classification tasks can be used as an approximation of the Bayes optimal classifier. This result is obtained by relying on the framework of e-free approximate Bayesian inference, where the Bayesian

  15. Applications of Bayesian decision theory to intelligent tutoring systems

    NARCIS (Netherlands)

    Vos, Hendrik J.

    1994-01-01

    Some applications of Bayesian decision theory to intelligent tutoring systems are considered. How the problem of adapting the appropriate amount of instruction to the changing nature of a student's capabilities during the learning process can be situated in the general framework of Bayesian decision

  16. Hierarchical spatial capture-recapture models: Modeling population density from stratified populations

    Science.gov (United States)

    Royle, J. Andrew; Converse, Sarah J.

    2014-01-01

    Capture–recapture studies are often conducted on populations that are stratified by space, time or other factors. In this paper, we develop a Bayesian spatial capture–recapture (SCR) modelling framework for stratified populations – when sampling occurs within multiple distinct spatial and temporal strata. We describe a hierarchical model that integrates distinct models for both the spatial encounter history data from capture–recapture sampling, and also for modelling variation in density among strata. We use an implementation of data augmentation to parameterize the model in terms of a latent categorical stratum or group membership variable, which provides a convenient implementation in popular BUGS software packages. We provide an example application to an experimental study involving small-mammal sampling on multiple trapping grids over multiple years, where the main interest is in modelling a treatment effect on population density among the trapping grids. Many capture–recapture studies involve some aspect of spatial or temporal replication that requires some attention to modelling variation among groups or strata. We propose a hierarchical model that allows explicit modelling of group or strata effects. Because the model is formulated for individual encounter histories and is easily implemented in the BUGS language and other free software, it also provides a general framework for modelling individual effects, such as are present in SCR models.

  17. An introduction to Bayesian statistics in health psychology.

    Science.gov (United States)

    Depaoli, Sarah; Rus, Holly M; Clifton, James P; van de Schoot, Rens; Tiemensma, Jitske

    2017-09-01

    The aim of the current article is to provide a brief introduction to Bayesian statistics within the field of health psychology. Bayesian methods are increasing in prevalence in applied fields, and they have been shown in simulation research to improve the estimation accuracy of structural equation models, latent growth curve (and mixture) models, and hierarchical linear models. Likewise, Bayesian methods can be used with small sample sizes since they do not rely on large sample theory. In this article, we discuss several important components of Bayesian statistics as they relate to health-based inquiries. We discuss the incorporation and impact of prior knowledge into the estimation process and the different components of the analysis that should be reported in an article. We present an example implementing Bayesian estimation in the context of blood pressure changes after participants experienced an acute stressor. We conclude with final thoughts on the implementation of Bayesian statistics in health psychology, including suggestions for reviewing Bayesian manuscripts and grant proposals. We have also included an extensive amount of online supplementary material to complement the content presented here, including Bayesian examples using many different software programmes and an extensive sensitivity analysis examining the impact of priors.
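
    As a concrete illustration of the prior-sensitivity theme above, the sketch below runs a conjugate normal-normal update for a mean change in blood pressure under several priors. It is a minimal stand-alone example with made-up numbers, not the article's supplementary code, and it assumes the sampling variance is known for simplicity.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical post-stressor changes in systolic blood pressure (mmHg) for n participants.
y = rng.normal(loc=6.0, scale=8.0, size=40)
sigma2 = 8.0 ** 2          # sampling variance, treated as known for this sketch

def posterior_mean_sd(prior_mean, prior_sd, y, sigma2):
    """Conjugate normal-normal update for the mean change, known data variance."""
    n = len(y)
    post_var = 1.0 / (1.0 / prior_sd**2 + n / sigma2)
    post_mean = post_var * (prior_mean / prior_sd**2 + n * y.mean() / sigma2)
    return post_mean, np.sqrt(post_var)

# Sensitivity analysis: diffuse versus informative priors on the mean change.
for prior_mean, prior_sd in [(0.0, 100.0), (0.0, 5.0), (10.0, 2.0)]:
    m, s = posterior_mean_sd(prior_mean, prior_sd, y, sigma2)
    print(f"prior N({prior_mean}, {prior_sd}^2) -> posterior mean {m:.2f}, sd {s:.2f}")
```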

  18. Bayesian analysis of genetic association across tree-structured routine healthcare data in the UK Biobank.

    Science.gov (United States)

    Cortes, Adrian; Dendrou, Calliope A; Motyer, Allan; Jostins, Luke; Vukcevic, Damjan; Dilthey, Alexander; Donnelly, Peter; Leslie, Stephen; Fugger, Lars; McVean, Gil

    2017-09-01

    Genetic discovery from the multitude of phenotypes extractable from routine healthcare data can transform understanding of the human phenome and accelerate progress toward precision medicine. However, a critical question when analyzing high-dimensional and heterogeneous data is how best to interrogate increasingly specific subphenotypes while retaining statistical power to detect genetic associations. Here we develop and employ a new Bayesian analysis framework that exploits the hierarchical structure of diagnosis classifications to analyze genetic variants against UK Biobank disease phenotypes derived from self-reporting and hospital episode statistics. Our method displays a more than 20% increase in power to detect genetic effects over other approaches and identifies new associations between classical human leukocyte antigen (HLA) alleles and common immune-mediated diseases (IMDs). By applying the approach to genetic risk scores (GRSs), we show the extent of genetic sharing among IMDs and expose differences in disease perception or diagnosis with potential clinical implications.

  19. Object-Oriented Bayesian Networks (OOBN) for Aviation Accident Modeling and Technology Portfolio Impact Assessment

    Science.gov (United States)

    Shih, Ann T.; Ancel, Ersin; Jones, Sharon M.

    2012-01-01

    The concern for reducing aviation safety risk is rising as the National Airspace System in the United States transforms to the Next Generation Air Transportation System (NextGen). The NASA Aviation Safety Program is committed to developing an effective aviation safety technology portfolio to meet the challenges of this transformation and to mitigate relevant safety risks. The paper focuses on the reasoning of selecting Object-Oriented Bayesian Networks (OOBN) as the technique and commercial software for the accident modeling and portfolio assessment. To illustrate the benefits of OOBN in a large and complex aviation accident model, the in-flight Loss-of-Control Accident Framework (LOCAF) constructed as an influence diagram is presented. An OOBN approach not only simplifies construction and maintenance of complex causal networks for the modelers, but also offers a well-organized hierarchical network that makes it easier for decision makers to exploit the model when examining the effectiveness of risk mitigation strategies through technology insertions.

  20. An introduction to using Bayesian linear regression with clinical data.

    Science.gov (United States)

    Baldwin, Scott A; Larson, Michael J

    2017-11-01

    Statistical training in psychology focuses on frequentist methods. Bayesian methods are an alternative to standard frequentist methods. This article provides researchers with an introduction to fundamental ideas in Bayesian modeling. We use data from an electroencephalogram (EEG) and anxiety study to illustrate Bayesian models. Specifically, the models examine the relationship between error-related negativity (ERN), a particular event-related potential, and trait anxiety. Methodological topics covered include: how to set up a regression model in a Bayesian framework, specifying priors, examining convergence of the model, visualizing and interpreting posterior distributions, interval estimates, expected and predicted values, and model comparison tools. We also discuss situations where Bayesian methods can outperform frequentist methods as well as how to specify more complicated regression models. Finally, we conclude with recommendations about reporting guidelines for those using Bayesian methods in their own research. We provide data and R code for replicating our analyses. Copyright © 2017 Elsevier Ltd. All rights reserved.
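
    The sketch below shows the closed-form posterior for a simple Bayesian regression of a hypothetical ERN-style outcome on a single predictor, assuming a known residual variance and a weakly informative Gaussian prior on the coefficients. It is an illustrative stand-in, not the authors' R code or their data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: trait anxiety scores (x) and ERN amplitudes (y); not the article's data.
n = 60
x = rng.normal(0.0, 1.0, n)
y = -0.5 * x + rng.normal(0.0, 1.0, n)

X = np.column_stack([np.ones(n), x])      # design matrix: intercept + slope
sigma2 = 1.0                              # residual variance, treated as known here
prior_mean = np.zeros(2)
prior_cov = np.eye(2) * 10.0              # weakly informative Gaussian priors

# Closed-form posterior for a Gaussian prior + Gaussian likelihood with known sigma^2.
prior_prec = np.linalg.inv(prior_cov)
post_cov = np.linalg.inv(prior_prec + X.T @ X / sigma2)
post_mean = post_cov @ (prior_prec @ prior_mean + X.T @ y / sigma2)

slope_mean = post_mean[1]
slope_sd = np.sqrt(post_cov[1, 1])
print(f"posterior slope: {slope_mean:.2f} +/- {slope_sd:.2f}")
print("approx. 95% credible interval:",
      (slope_mean - 1.96 * slope_sd, slope_mean + 1.96 * slope_sd))
```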

  1. Uncertainty in perception and the Hierarchical Gaussian Filter

    Directory of Open Access Journals (Sweden)

    Christoph Daniel Mathys

    2014-11-01

    Full Text Available In its full sense, perception rests on an agent’s model of how its sensory input comes about and the inferences it draws based on this model. These inferences are necessarily uncertain. Here, we illustrate how the hierarchical Gaussian filter (HGF) offers a principled and generic way to deal with the several forms that uncertainty in perception takes. The HGF is a recent derivation of one-step update equations from Bayesian principles that rests on a hierarchical generative model of the environment and its (in)stability. It is computationally highly efficient, allows for online estimates of hidden states, and has found numerous applications to experimental data from human subjects. In this paper, we generalize previous descriptions of the HGF and its account of perceptual uncertainty. First, we explicitly formulate the extension of the HGF’s hierarchy to any number of levels; second, we discuss how various forms of uncertainty are accommodated by the minimization of variational free energy as encoded in the update equations; third, we combine the HGF with decision models and demonstrate the inversion of this combination; finally, we report a simulation study that compared four optimization methods for inverting the HGF/decision model combination at different noise levels. These four methods (Nelder-Mead simplex algorithm, Gaussian process-based global optimization, variational Bayes and Markov chain Monte Carlo sampling) all performed well even under considerable noise, with variational Bayes offering the best combination of efficiency and informativeness of inference. Our results demonstrate that the HGF provides a principled, flexible, and efficient - but at the same time intuitive - framework for the resolution of perceptual uncertainty in behaving agents.
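
    To convey the flavour of precision-weighted updating, here is a stripped-down, single-level version of the kind of update the HGF performs for binary inputs: a Gaussian belief over the log-odds of the outcome is propagated with a fixed volatility and corrected by a precision-weighted prediction error. It omits the higher levels and the full derivation given in the paper, and all parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Binary inputs whose underlying probability drifts over time.
T = 300
tendency_true = np.cumsum(rng.normal(0.0, 0.05, T))       # hidden log-odds (random walk)
u = rng.binomial(1, sigmoid(tendency_true))                # observed 0/1 outcomes

mu, pi = 0.0, 1.0        # Gaussian belief over the log-odds: mean and precision
omega = np.exp(-4.0)     # fixed volatility: variance added to the belief each trial

for t in range(T):
    pi_hat = 1.0 / (1.0 / pi + omega)    # predicted precision after the random-walk step
    p_hat = sigmoid(mu)                  # predicted outcome probability
    delta = u[t] - p_hat                 # prediction error
    pi = pi_hat + p_hat * (1.0 - p_hat)  # add the Bernoulli observation's information
    mu = mu + delta / pi                 # precision-weighted belief update

print("final estimated probability:", sigmoid(mu))
print("true final probability     :", sigmoid(tendency_true[-1]))
```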

  2. Hierarchical calibration and validation framework of bench-scale computational fluid dynamics simulations for solvent-based carbon capture. Part 2: Chemical absorption across a wetted wall column: Original Research Article: Hierarchical calibration and validation framework of bench-scale computational fluid dynamics simulations

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Chao [Pacific Northwest National Laboratory, Physical and Computational Sciences Directorate, Richland WA; Xu, Zhijie [Pacific Northwest National Laboratory, Physical and Computational Sciences Directorate, Richland WA; Lai, Kevin [Pacific Northwest National Laboratory, Physical and Computational Sciences Directorate, Richland WA; Whyatt, Greg [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland WA; Marcy, Peter W. [Los Alamos National Laboratory, Statistical Sciences Group, Los Alamos NM; Sun, Xin [Oak Ridge National Laboratory, Energy and Transportation Science Division, Oak Ridge TN

    2017-10-24

    The first part of this paper (Part 1) presents a numerical model for non-reactive physical mass transfer across a wetted wall column (WWC). In Part 2, we improved the existing computational fluid dynamics (CFD) model to simulate chemical absorption occurring in a WWC as a bench-scale study of solvent-based carbon dioxide (CO2) capture. To generate data for WWC model validation, CO2 mass transfer across a monoethanolamine (MEA) solvent was first measured on a WWC experimental apparatus. The numerical model developed in this work has the ability to account for both chemical absorption and desorption of CO2 in MEA. In addition, the overall mass transfer coefficient predicted using traditional/empirical correlations is compared with CFD prediction results for both steady and wavy falling films. A Bayesian statistical calibration algorithm is adopted to calibrate the reaction rate constants in chemical absorption/desorption of CO2 across a falling film of MEA. The posterior distributions of the two transport properties, i.e., Henry’s constant and gas diffusivity in the non-reacting nitrous oxide (N2O)/MEA system obtained from Part 1 of this study, serve as priors for the calibration of CO2 reaction rate constants after using the N2O/CO2 analogy method. The calibrated model can be used to predict the CO2 mass transfer in a WWC for a wider range of operating conditions.

  3. Posterior Predictive Model Checking in Bayesian Networks

    Science.gov (United States)

    Crawford, Aaron

    2014-01-01

    This simulation study compared the utility of various discrepancy measures within a posterior predictive model checking (PPMC) framework for detecting different types of data-model misfit in multidimensional Bayesian network (BN) models. The investigated conditions were motivated by an applied research program utilizing an operational complex…
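
    The study's Bayesian network models are not reproduced here, but the posterior predictive checking mechanic itself is easy to sketch: draw parameters from the posterior, simulate replicated data, and compare a discrepancy measure on the replicates with its observed value. The example below does this for a deliberately simple Bernoulli model with made-up data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Observed data: hypothetical correct/incorrect responses to a single test item.
y = rng.binomial(1, 0.35, size=100)

# Posterior for the item's success probability under a Beta(1, 1) prior.
post = stats.beta(1 + y.sum(), 1 + len(y) - y.sum())

def discrepancy(data):
    """Longest run of identical responses; sensitive to local dependence the model ignores."""
    best, cur = 1, 1
    for a, b in zip(data[:-1], data[1:]):
        cur = cur + 1 if a == b else 1
        best = max(best, cur)
    return best

obs_d = discrepancy(y)
rep_d = []
for _ in range(2000):
    theta = post.rvs()                               # draw a parameter from the posterior
    y_rep = rng.binomial(1, theta, size=len(y))      # simulate a replicated data set
    rep_d.append(discrepancy(y_rep))

ppp = np.mean(np.array(rep_d) >= obs_d)              # posterior predictive p-value
print("observed discrepancy:", obs_d, " posterior predictive p-value:", ppp)
```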

  4. Theory change and Bayesian statistical inference

    NARCIS (Netherlands)

    Romeijn, Jan-Willem

    2005-01-01

    This paper addresses the problem that Bayesian statistical inference cannot accommodate theory change, and proposes a framework for dealing with such changes. It first presents a scheme for generating predictions from observations by means of hypotheses. An example shows how the hypotheses represent

  5. Neural network classification - A Bayesian interpretation

    Science.gov (United States)

    Wan, Eric A.

    1990-01-01

    The relationship between minimizing a mean squared error and finding the optimal Bayesian classifier is reviewed. This provides a theoretical interpretation for the process by which neural networks are used in classification. A number of confidence measures are proposed to evaluate the performance of the neural network classifier within a statistical framework.
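
    The core relationship can be demonstrated without a neural network at all: any flexible model fit by minimizing squared error on 0/1 class labels approximates E[y|x], which is the Bayes posterior probability of the class. The sketch below uses a cubic-polynomial stand-in for the network on synthetic two-class Gaussian data; the polynomial basis is an assumption of this example, so the fit only approximates the exact posterior.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Two equally likely classes with Gaussian class-conditional densities.
n = 5000
labels = rng.integers(0, 2, n)
x = np.where(labels == 1, rng.normal(1.0, 1.0, n), rng.normal(-1.0, 1.0, n))

# "Network": a linear model on polynomial features, trained by minimizing squared error.
Phi = np.column_stack([np.ones(n), x, x**2, x**3])
w, *_ = np.linalg.lstsq(Phi, labels.astype(float), rcond=None)

# Compare the fitted output with the exact Bayes posterior P(class 1 | x).
grid = np.linspace(-3, 3, 7)
Phi_g = np.column_stack([np.ones_like(grid), grid, grid**2, grid**3])
fitted = Phi_g @ w
bayes = norm.pdf(grid, 1, 1) / (norm.pdf(grid, 1, 1) + norm.pdf(grid, -1, 1))
for g, f, b in zip(grid, fitted, bayes):
    print(f"x={g:+.1f}  least-squares output={f:5.2f}  Bayes posterior={b:5.2f}")
```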

  6. Theory Change and Bayesian Statistical Inference

    NARCIS (Netherlands)

    Romeyn, Jan-Willem

    2008-01-01

    This paper addresses the problem that Bayesian statistical inference cannot accommodate theory change, and proposes a framework for dealing with such changes. It first presents a scheme for generating predictions from observations by means of hypotheses. An example shows how the hypotheses represent

  7. Default Bayesian Estimation of the Fundamental Frequency

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Christensen, Mads Græsbøll; Jensen, Søren Holdt

    2013-01-01

    Joint fundamental frequency and model order estimation is an important problem in several applications. In this paper, a default estimation algorithm based on a minimum of prior information is presented. The algorithm is developed in a Bayesian framework, and it can be applied to both real...

  8. Adaptive bayesian analysis for binomial proportions

    CSIR Research Space (South Africa)

    Das, Sonali

    2008-10-01

    Full Text Available The authors consider the problem of statistical inference of binomial proportions for non-matched, correlated samples, under the Bayesian framework. Such inference can arise when the same group is observed at a different number of times with the aim...
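
    The abstract is clipped above, but the basic Bayesian treatment of binomial proportions is easy to illustrate. The sketch below places independent Beta priors on the proportion at two observation occasions and reports the posterior probability that the proportion increased; it deliberately ignores the correlation between occasions that the authors' method is designed to handle, and all counts are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Hypothetical counts for the same group observed on two occasions (numbers made up).
successes = [37, 52]
trials = [120, 130]

# Independent Beta(1, 1) priors give Beta posteriors for each occasion's proportion.
post = [stats.beta(1 + s, 1 + n - s) for s, n in zip(successes, trials)]

# Posterior draws allow direct probability statements about the proportions.
draws = np.column_stack([p.rvs(20000, random_state=rng) for p in post])
print("posterior mean, occasion 1:", draws[:, 0].mean())
print("posterior mean, occasion 2:", draws[:, 1].mean())
print("P(proportion increased)   :", np.mean(draws[:, 1] > draws[:, 0]))
```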

  9. Philosophy and the practice of Bayesian statistics.

    Science.gov (United States)

    Gelman, Andrew; Shalizi, Cosma Rohilla

    2013-02-01

    A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework. © 2012 The British Psychological Society.

  10. Philosophy and the practice of Bayesian statistics

    Science.gov (United States)

    Gelman, Andrew; Shalizi, Cosma Rohilla

    2015-01-01

    A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework. PMID:22364575

  11. Mental structures and hierarchical brain processing. Comment on “Toward a computational framework for cognitive biology: Unifying approaches from cognitive neuroscience and comparative cognition” by W. Tecumseh Fitch

    Science.gov (United States)

    Petkov, C. I.

    2014-09-01

    Fitch proposes an appealing hypothesis that humans are dendrophiles, who constantly build mental trees supported by analogous hierarchical brain processes [1]. Moreover, it is argued that, by comparison, nonhuman animals build flat or more compact behaviorally-relevant structures. Should we thus expect less impressive hierarchical brain processes in other animals? Not necessarily.

  12. Bayesian classification and regression trees for predicting incidence of cryptosporidiosis.

    Directory of Open Access Journals (Sweden)

    Wenbiao Hu

    Full Text Available BACKGROUND: Classification and regression tree (CART) models are tree-based exploratory data analysis methods which have been shown to be very useful in identifying and estimating complex hierarchical relationships in ecological and medical contexts. In this paper, a Bayesian CART model is described and applied to the problem of modelling the cryptosporidiosis infection in Queensland, Australia. METHODOLOGY/PRINCIPAL FINDINGS: We compared the results of a Bayesian CART model with those obtained using a Bayesian spatial conditional autoregressive (CAR) model. Overall, the analyses indicated that the nature and magnitude of the effect estimates were similar for the two methods in this study, but the CART model more easily accommodated higher order interaction effects. CONCLUSIONS/SIGNIFICANCE: A Bayesian CART model for identification and estimation of the spatial distribution of disease risk is useful in monitoring and assessment of infectious diseases prevention and control.

  13. Computationally efficient Bayesian inference for inverse problems.

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef M.; Najm, Habib N.; Rahn, Larry A.

    2007-10-01

    Bayesian statistics provides a foundation for inference from noisy and incomplete data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the inferred results. Inverse problems - representing indirect estimation of model parameters, inputs, or structural components - can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high-dimensional model spaces, as when the unknown is a spatiotemporal field. We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that dramatically accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also explore dimensionality reduction for the inference of spatiotemporal fields, using truncated spectral representations of Gaussian process priors. These new approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous material or transport properties. We also present a Bayesian framework for parameter estimation in stochastic models, where intrinsic stochasticity may be intermingled with observational noise. Evaluation of a likelihood function may not be analytically tractable in these cases, and thus several alternative Markov chain Monte Carlo (MCMC) schemes, operating on the product space of the observations and the parameters, are introduced.
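
    The accelerated spectral-surrogate machinery described above is beyond a short example, but the basic Bayesian formulation of an inverse problem can be sketched with a random-walk Metropolis sampler for a single decay-rate parameter of a toy forward model. The forward model, prior, and noise level are all invented for illustration; in the paper's setting the repeated forward solves would be replaced by a fast surrogate.

```python
import numpy as np

rng = np.random.default_rng(4)

# Forward model: exponential decay of a scalar concentration; we infer the decay rate k.
def forward(k, t):
    return np.exp(-k * t)

t_obs = np.linspace(0.2, 3.0, 12)
k_true, noise_sd = 1.3, 0.05
d_obs = forward(k_true, t_obs) + rng.normal(0.0, noise_sd, t_obs.size)

def log_post(k):
    if k <= 0:                                   # prior support restricted to k > 0
        return -np.inf
    resid = d_obs - forward(k, t_obs)
    log_like = -0.5 * np.sum(resid**2) / noise_sd**2
    log_prior = -0.5 * np.log(k) ** 2            # weak log-normal prior on k
    return log_like + log_prior

# Random-walk Metropolis sampler.
k, lp = 1.0, log_post(1.0)
samples = []
for _ in range(20000):
    k_prop = k + rng.normal(0.0, 0.1)
    lp_prop = log_post(k_prop)
    if np.log(rng.uniform()) < lp_prop - lp:     # accept with Metropolis probability
        k, lp = k_prop, lp_prop
    samples.append(k)

samples = np.array(samples[5000:])               # discard burn-in
print("posterior mean of k:", samples.mean(), "+/-", samples.std())
```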

  14. Understanding Computational Bayesian Statistics

    CERN Document Server

    Bolstad, William M

    2011-01-01

    A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic

  15. Bayesian statistics an introduction

    CERN Document Server

    Lee, Peter M

    2012-01-01

    Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel

  16. The application of a hierarchical Bayesian spatiotemporal model for ...

    Indian Academy of Sciences (India)

    ... for protons were used as the model input to forecast the flux values on 31 March 2008. Data were transformed into logarithmic values and gridded in a 5° × 5° longitude and latitude size to fulfill the modelling precondition. A Monte Carlo ...

  17. Loops in hierarchical channel networks

    Science.gov (United States)

    Katifori, Eleni; Magnasco, Marcelo

    2012-02-01

    Nature provides us with many examples of planar distribution and structural networks having dense sets of closed loops. An archetype of this form of network organization is the vasculature of dicotyledonous leaves, which showcases a hierarchically-nested architecture. Although a number of methods have been proposed to measure aspects of the structure of such networks, a robust metric to quantify their hierarchical organization is still lacking. We present an algorithmic framework that allows mapping loopy networks to binary trees, preserving in the connectivity of the trees the architecture of the original graph. We apply this framework to investigate computer generated and natural graphs extracted from digitized images of dicotyledonous leaves and animal vasculature. We calculate various metrics on the corresponding trees and discuss the relationship of these quantities to the architectural organization of the original graphs. This algorithmic framework decouples the geometric information from the metric topology (connectivity and edge weight) and it ultimately allows us to perform a quantitative statistical comparison between predictions of theoretical models and naturally occurring loopy graphs.

  18. Cell Detection Using Extremal Regions in a Semisupervised Learning Framework

    Directory of Open Access Journals (Sweden)

    Nisha Ramesh

    2017-01-01

    Full Text Available This paper discusses an algorithm to build a semisupervised learning framework for detecting cells. The cell candidates are represented as extremal regions drawn from a hierarchical image representation. Training a classifier for cell detection using supervised approaches relies on a large amount of training data, which requires a lot of effort and time. We propose a semisupervised approach to reduce this burden. The set of extremal regions is generated using a maximally stable extremal region (MSER) detector. A subset of nonoverlapping regions with high similarity to the cells of interest is selected. Using the tree built from the MSER detector, we develop a novel differentiable unsupervised loss term that enforces the nonoverlapping constraint with the learned function. Our algorithm requires very few examples of cells with simple dot annotations for training. The supervised and unsupervised losses are embedded in a Bayesian framework for probabilistic learning.

  19. Hierarchical Bayesian models for genotype × environment estimates in post-weaning gain of Hereford bovine via reaction norms Modelos hierárquicos bayesianos para estimativas de interação genótipo × ambiente em ganho pós-desmama de bovinos Hereford via normas de reação

    Directory of Open Access Journals (Sweden)

    Leandro Lunardini Cardoso

    2011-02-01

    Full Text Available Statistical models with different assumptions were evaluated to define the one that best describes the presence of genotype × environment interaction on adjusted post-weaning weight gain (PWG345) of Hereford cattle, through the study of reaction norms to the environment, obtained by random regression using a Bayesian approach. Four reaction norms hierarchical models (RNHM) were used through the INTERGEN program. The RNHM K uses the solutions of contemporary groups previously estimated by the standard animal model (AM) and considers them as the environmental level for predicting the reaction norms, whereas the RNHM S estimates these two sets of unknowns jointly. For both models, two versions were considered, one with a homogeneous (hm) and another with a heterogeneous (ht) residual variance. Based on the deviance information criterion and the Bayes factor, the RNHMs hm showed the best fit to the data, and by the deviance based on the conditional predictive ordinate, the best fit was the RNHM Kht, whereas, by all three criteria used, the worst fit was obtained with the standard animal model. Heritabilities estimated from the RNHM increased along the environmental gradient for PWG345, at -60 kg, 0 and +60 kg. The genetic correlation estimated between the level and slope of the reaction norms was high, from 0.97 to 0.99, characterizing a scale effect on the genotype × environment interaction. The reaction norms hierarchical models are efficient to describe the changes in variance components due to the environment and the presence of genotype × environment interaction on the PWG345 trait of Hereford cattle.

  20. Computational Neuropsychology and Bayesian Inference

    Directory of Open Access Journals (Sweden)

    Thomas Parr

    2018-02-01

    Full Text Available Computational theories of brain function have become very influential in neuroscience. They have facilitated the growth of formal approaches to disease, particularly in psychiatric research. In this paper, we provide a narrative review of the body of computational research addressing neuropsychological syndromes, and focus on those that employ Bayesian frameworks. Bayesian approaches to understanding brain function formulate perception and action as inferential processes. These inferences combine ‘prior’ beliefs with a generative (predictive) model to explain the causes of sensations. Under this view, neuropsychological deficits can be thought of as false inferences that arise due to aberrant prior beliefs (that are poor fits to the real world). This draws upon the notion of a Bayes optimal pathology – optimal inference with suboptimal priors – and provides a means for computational phenotyping. In principle, any given neuropsychological disorder could be characterized by the set of prior beliefs that would make a patient’s behavior appear Bayes optimal. We start with an overview of some key theoretical constructs and use these to motivate a form of computational neuropsychology that relates anatomical structures in the brain to the computations they perform. Throughout, we draw upon computational accounts of neuropsychological syndromes. These are selected to emphasize the key features of a Bayesian approach, and the possible types of pathological prior that may be present. They range from visual neglect through hallucinations to autism. Through these illustrative examples, we review the use of Bayesian approaches to understand the link between biology and computation that is at the heart of neuropsychology.
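
    A toy version of the 'Bayes optimal pathology' idea is easy to write down: the same conjugate Gaussian update, supplied with an aberrantly precise prior centred away from the truth, produces systematically biased 'percepts' even though the inference itself is optimal. The numbers below are purely illustrative and are not taken from any of the reviewed models.

```python
import numpy as np

def posterior_gaussian(prior_mean, prior_var, likelihood_mean, likelihood_var):
    """Conjugate update for a Gaussian prior and Gaussian likelihood (known variances)."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / likelihood_var)
    post_mean = post_var * (prior_mean / prior_var + likelihood_mean / likelihood_var)
    return post_mean, post_var

# Sensory evidence: the true stimulus attribute is 0, measured with unit-variance noise.
sensory_mean, sensory_var = 0.0, 1.0

# A well-calibrated prior versus an aberrantly precise prior centred away from the truth.
healthy = posterior_gaussian(prior_mean=0.0, prior_var=4.0,
                             likelihood_mean=sensory_mean, likelihood_var=sensory_var)
aberrant = posterior_gaussian(prior_mean=2.0, prior_var=0.1,
                              likelihood_mean=sensory_mean, likelihood_var=sensory_var)

print("posterior with calibrated prior:", healthy)   # close to the sensory evidence
print("posterior with aberrant prior  :", aberrant)  # dominated by the (false) prior
```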

  1. What are hierarchical models and how do we analyze them?

    Science.gov (United States)

    Royle, Andy

    2016-01-01

    In this chapter we provide a basic definition of hierarchical models and introduce the two canonical hierarchical models in this book: site occupancy and N-mixture models. The former is a hierarchical extension of logistic regression and the latter is a hierarchical extension of Poisson regression. We introduce basic concepts of probability modeling and statistical inference including likelihood and Bayesian perspectives. We go through the mechanics of maximizing the likelihood and characterizing the posterior distribution by Markov chain Monte Carlo (MCMC) methods. We give a general perspective on topics such as model selection and assessment of model fit, although we demonstrate these topics in practice in later chapters (especially Chapters 5, 6, 7, and 10).

  2. Bayesian Analysis of Hot Jupiter Radii Points to Ohmic Dissipation

    Science.gov (United States)

    Thorngren, Daniel; Fortney, Jonathan J.

    2017-10-01

    The cause of the unexpectedly large radii of hot Jupiters has been the subject of many hypotheses over the past 15 years and is one of the long-standing open issues in exoplanetary physics. In our work, we seek to examine the population of 300 hot Jupiters to identify a model that best explains their radii. Using a hierarchical Bayesian framework, we match structure evolution models to the observed giant planets’ masses, radii, and ages, with a prior for bulk composition based on the mass from Thorngren et al. (2016). We consider various models for the relationship between heating efficiency (the fraction of flux absorbed into the interior) and incident flux. For the first time, we are able to derive this heating efficiency as a function of planetary T_eq. Models in which the heating efficiency decreases at the higher temperatures (above ~1600 K) are strongly and statistically significantly preferred. Of the published models for the radius anomaly, only the Ohmic dissipation model predicts this feature, which it explains as being the result of magnetic drag reducing atmospheric wind speeds. We interpret our results as strong evidence in favor of the Ohmic dissipation model.

  3. Hierarchical Data Structures, Institutional Research, and Multilevel Modeling

    Science.gov (United States)

    O'Connell, Ann A.; Reed, Sandra J.

    2012-01-01

    Multilevel modeling (MLM), also referred to as hierarchical linear modeling (HLM) or mixed models, provides a powerful analytical framework through which to study colleges and universities and their impact on students. Due to the natural hierarchical structure of data obtained from students or faculty in colleges and universities, MLM offers many…

  4. Bayesian Graphical Models

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Nielsen, Thomas Dyhre

    2016-01-01

    is largely due to the availability of efficient inference algorithms for answering probabilistic queries about the states of the variables in the network. Furthermore, to support the construction of Bayesian network models, learning algorithms are also available. We give an overview of the Bayesian network...

  5. The Bayesian Score Statistic

    NARCIS (Netherlands)

    Kleibergen, F.R.; Kleijn, R.; Paap, R.

    2000-01-01

    We propose a novel Bayesian test under a (noninformative) Jeffreys' prior specification. We check whether the fixed scalar value of the so-called Bayesian Score Statistic (BSS) under the null hypothesis is a plausible realization from its known and standardized distribution under the alternative. Unlike

  6. Bayesian Mediation Analysis

    Science.gov (United States)

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…

  7. Bayesian phylogeography finds its roots.

    Directory of Open Access Journals (Sweden)

    Philippe Lemey

    2009-09-01

    Full Text Available As a key factor in endemic and epidemic dynamics, the geographical distribution of viruses has been frequently interpreted in the light of their genetic histories. Unfortunately, inference of historical dispersal or migration patterns of viruses has mainly been restricted to model-free heuristic approaches that provide little insight into the temporal setting of the spatial dynamics. The introduction of probabilistic models of evolution, however, offers unique opportunities to engage in this statistical endeavor. Here we introduce a Bayesian framework for inference, visualization and hypothesis testing of phylogeographic history. By implementing character mapping in a Bayesian software that samples time-scaled phylogenies, we enable the reconstruction of timed viral dispersal patterns while accommodating phylogenetic uncertainty. Standard Markov model inference is extended with a stochastic search variable selection procedure that identifies the parsimonious descriptions of the diffusion process. In addition, we propose priors that can incorporate geographical sampling distributions or characterize alternative hypotheses about the spatial dynamics. To visualize the spatial and temporal information, we summarize inferences using virtual globe software. We describe how Bayesian phylogeography compares with previous parsimony analysis in the investigation of the influenza A H5N1 origin and H5N1 epidemiological linkage among sampling localities. Analysis of rabies in West African dog populations reveals how virus diffusion may enable endemic maintenance through continuous epidemic cycles. From these analyses, we conclude that our phylogeographic framework will be an important asset in molecular epidemiology that can be easily generalized to infer biogeography from genetic data for many organisms.

  8. Inequality constrained hierarchical models

    NARCIS (Netherlands)

    Kato, B.S.

    2005-01-01

    In multilevel research, the data structure in the population is hierarchical, and the sample data are viewed as a multistage sample from this hierarchical population. For instance in educational research, the population consists of schools and pupils within these schools. In this scenario, pupils

  9. A Bayesian Reflection on Surfaces

    Directory of Open Access Journals (Sweden)

    David R. Wolf

    1999-10-01

    Full Text Available The topic of this paper is a novel Bayesian continuous-basis field representation and inference framework. Within this paper several problems are solved: The maximally informative inference of continuous-basis fields, that is, where the basis for the field is itself a continuous object and not representable in a finite manner; the tradeoff between accuracy of representation in terms of information learned, and memory or storage capacity in bits; the approximation of probability distributions so that a maximal amount of information about the object being inferred is preserved; an information theoretic justification for multigrid methodology. The maximally informative field inference framework is described in full generality and denoted the Generalized Kalman Filter. The Generalized Kalman Filter allows the update of field knowledge from previous knowledge at any scale, and new data, to new knowledge at any other scale. An application example instance, the inference of continuous surfaces from measurements (for example, camera image data), is presented.
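
    The Generalized Kalman Filter described above operates on continuous-basis fields across scales; the sketch below shows only the ordinary scalar Kalman filter it generalizes, i.e. the sequential fusion of previous knowledge with new data under a linear-Gaussian state-space model. The model parameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(8)

# Scalar state-space model: x_t = a * x_{t-1} + w_t,  y_t = x_t + v_t.
a, q, r = 0.95, 0.1, 0.5
T = 100
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = a * x[t - 1] + rng.normal(0, np.sqrt(q))
    y[t] = x[t] + rng.normal(0, np.sqrt(r))

# Kalman filter: sequential Bayesian update of a Gaussian belief over the state.
m, P = 0.0, 1.0
means = []
for t in range(T):
    # predict (propagate previous knowledge through the dynamics)
    m_pred, P_pred = a * m, a * a * P + q
    # update (fuse the prediction with the new measurement)
    K = P_pred / (P_pred + r)
    m = m_pred + K * (y[t] - m_pred)
    P = (1 - K) * P_pred
    means.append(m)

print("final filtered estimate:", means[-1], " true state:", x[-1])
```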

  10. Hierarchical quantum communication

    International Nuclear Information System (INIS)

    Shukla, Chitra; Pathak, Anirban

    2013-01-01

    A general approach to study the hierarchical quantum information splitting (HQIS) is proposed and the same is used to systematically investigate the possibility of realizing HQIS using different classes of 4-qubit entangled states that are not connected by stochastic local operations and classical communication (SLOCC). Explicit examples of HQIS using 4-qubit cluster state and 4-qubit |Ω〉 state are provided. Further, the proposed HQIS scheme is generalized to introduce two new aspects of hierarchical quantum communication. To be precise, schemes of probabilistic hierarchical quantum information splitting and hierarchical quantum secret sharing are obtained by modifying the proposed HQIS scheme. A number of practical situations where hierarchical quantum communication would be of use, are also presented.

  11. Hierarchical Estimation as Basis for Hierarchical Forecasting

    NARCIS (Netherlands)

    Strijbosch, L.W.G.; Heuts, R.M.J.; Moors, J.J.A.

    2006-01-01

    In inventory management, hierarchical forecasting (HF) is a hot issue: families of items are formed for which total demand is forecasted; total forecast then is broken up to produce forecasts for the individual items. Since HF is a complicated procedure, analytical results are hard to obtain;

  12. Introduction to applied Bayesian statistics and estimation for social scientists

    CERN Document Server

    Lynch, Scott M

    2007-01-01

    ""Introduction to Applied Bayesian Statistics and Estimation for Social Scientists"" covers the complete process of Bayesian statistical analysis in great detail from the development of a model through the process of making statistical inference. The key feature of this book is that it covers models that are most commonly used in social science research - including the linear regression model, generalized linear models, hierarchical models, and multivariate regression models - and it thoroughly develops each real-data example in painstaking detail.The first part of the book provides a detailed

  13. Synthesizing trait correlations and functional relationships across multiple scales: A Hierarchical Bayes approach

    Science.gov (United States)

    Shiklomanov, A. N.; Cowdery, E.; Dietze, M.

    2016-12-01

    Recent syntheses of global trait databases have revealed that although the functional diversity among plant species is immense, this diversity is constrained by trade-offs between plant strategies. However, the use of among-trait and trait-environment correlations at the global scale for both qualitative ecological inference and land surface modeling has several important caveats. An alternative approach is to preserve the existing PFT-based model structure while using statistical analyses to account for uncertainty and variability in model parameters. In this study, we used a hierarchical Bayesian model of foliar traits in the TRY database to test the following hypotheses: (1) Leveraging the covariance between foliar traits will significantly constrain our uncertainty in their distributions; and (2) Among-trait covariance patterns are significantly different among and within PFTs, reflecting differences in trade-offs associated with biome-level evolution, site-level community assembly, and individual-level ecophysiological acclimation. We found that among-trait covariance significantly constrained estimates of trait means, and the additional information provided by across-PFT covariance led to more constraint still, especially for traits and PFTs with low sample sizes. We also found that among-trait correlations were highly variable among PFTs, and were generally inconsistent with correlations within PFTs. The hierarchical multivariate framework developed in our study can readily be enhanced with additional levels of hierarchy to account for geographic, species, and individual-level variability.
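
    The multivariate, covariance-aware model used in the study is not reproduced here, but the borrowing-of-strength mechanism at its core can be sketched with a Gibbs sampler for a one-level normal hierarchy: group-level trait means are partially pooled toward an overall mean, with the pooling strength learned from the data. The group values and variances below are invented.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical group-level trait means (e.g., one value per PFT), each observed with a known
# sampling variance; numbers are made up and unrelated to the TRY analysis.
y = np.array([2.1, 1.4, 3.0, 2.6, 0.9, 1.8])     # observed group means
s2 = np.array([0.2, 0.5, 0.3, 0.4, 0.6, 0.2])    # known within-group variances
J = len(y)

# Hierarchical model: theta_j ~ N(mu, tau2),  y_j ~ N(theta_j, s2_j).
mu, tau2 = 0.0, 1.0
theta = y.copy()
keep_theta, keep_mu = [], []

for it in range(5000):
    # theta_j | rest: conjugate normal update, pooling y_j toward mu
    post_var = 1.0 / (1.0 / tau2 + 1.0 / s2)
    post_mean = post_var * (mu / tau2 + y / s2)
    theta = rng.normal(post_mean, np.sqrt(post_var))
    # mu | rest: flat prior on the overall mean
    mu = rng.normal(theta.mean(), np.sqrt(tau2 / J))
    # tau2 | rest: inverse-gamma posterior under a weak IG(1, 1) prior
    a = 1.0 + J / 2.0
    b = 1.0 + 0.5 * np.sum((theta - mu) ** 2)
    tau2 = 1.0 / rng.gamma(a, 1.0 / b)
    if it >= 1000:
        keep_theta.append(theta.copy())
        keep_mu.append(mu)

keep_theta = np.array(keep_theta)
print("posterior means of group effects:", keep_theta.mean(axis=0).round(2))
print("posterior mean of overall mean  :", round(float(np.mean(keep_mu)), 2))
```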

  14. Bayesian Nonparametric Estimation of Targeted Agent Effects on Biomarker Change to Predict Clinical Outcome

    Science.gov (United States)

    Graziani, Rebecca; Guindani, Michele; Thall, Peter F.

    2015-01-01

    Summary The effect of a targeted agent on a cancer patient's clinical outcome putatively is mediated through the agent's effect on one or more early biological events. This is motivated by pre-clinical experiments with cells or animals that identify such events, represented by binary or quantitative biomarkers. When evaluating targeted agents in humans, central questions are whether the distribution of a targeted biomarker changes following treatment, the nature and magnitude of this change, and whether it is associated with clinical outcome. Major difficulties in estimating these effects are that a biomarker's distribution may be complex, vary substantially between patients, and have complicated relationships with clinical outcomes. We present a probabilistically coherent framework for modeling and estimation in this setting, including a hierarchical Bayesian nonparametric mixture model for biomarkers that we use to define a functional profile of pre-versus-post treatment biomarker distribution change. The functional is similar to the receiver operating characteristic used in diagnostic testing. The hierarchical model yields clusters of individual patient biomarker profile functionals, and we use the profile as a covariate in a regression model for clinical outcome. The methodology is illustrated by analysis of a dataset from a clinical trial in prostate cancer using imatinib to target platelet-derived growth factor, with the clinical aim to improve progression-free survival time. PMID:25319212
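
    The full hierarchical mixture and the profile functional are specific to the paper, but the Dirichlet process prior at its core can be illustrated with a truncated stick-breaking draw: random weights over a countable set of cluster-specific biomarker means. Everything below (the base measure, concentration, and noise scale) is an assumption of the sketch.

```python
import numpy as np

rng = np.random.default_rng(9)

def stick_breaking(alpha, n_atoms):
    """Truncated stick-breaking construction of Dirichlet process mixture weights."""
    betas = rng.beta(1.0, alpha, size=n_atoms)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    return betas * remaining

# Draw one random mixture from a DP prior with a Gaussian base measure for the atoms.
alpha = 2.0                                     # concentration: larger alpha -> more clusters
weights = stick_breaking(alpha, n_atoms=50)
atoms = rng.normal(0.0, 2.0, size=50)           # cluster-specific biomarker means

# Simulate patient-level biomarker values from the induced mixture.
z = rng.choice(50, size=200, p=weights / weights.sum())
biomarkers = rng.normal(atoms[z], 0.3)

print("number of occupied clusters:", np.unique(z).size)
print("largest three weights      :", np.sort(weights)[-3:].round(3))
```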

  15. Bayesian data analysis for newcomers.

    Science.gov (United States)

    Kruschke, John K; Liddell, Torrin M

    2018-02-01

    This article explains the foundational concepts of Bayesian data analysis using virtually no mathematical notation. Bayesian ideas already match your intuitions from everyday reasoning and from traditional data analysis. Simple examples of Bayesian data analysis are presented that illustrate how the information delivered by a Bayesian analysis can be directly interpreted. Bayesian approaches to null-value assessment are discussed. The article clarifies misconceptions about Bayesian methods that newcomers might have acquired elsewhere. We discuss prior distributions and explain how they are not a liability but an important asset. We discuss the relation of Bayesian data analysis to Bayesian models of mind, and we briefly discuss what methodological problems Bayesian data analysis is not meant to solve. After you have read this article, you should have a clear sense of how Bayesian data analysis works and the sort of information it delivers, and why that information is so intuitive and useful for drawing conclusions from data.

  16. Statistics: a Bayesian perspective

    National Research Council Canada - National Science Library

    Berry, Donald A

    1996-01-01

    ...: it is the only introductory textbook based on Bayesian ideas, it combines concepts and methods, it presents statistics as a means of integrating data into the significant process, it develops ideas...

  17. Noncausal Bayesian Vector Autoregression

    DEFF Research Database (Denmark)

    Lanne, Markku; Luoto, Jani

    We propose a Bayesian inferential procedure for the noncausal vector autoregressive (VAR) model that is capable of capturing nonlinearities and incorporating effects of missing variables. In particular, we devise a fast and reliable posterior simulator that yields the predictive distribution...

  18. Practical Bayesian tomography

    Science.gov (United States)

    Granade, Christopher; Combes, Joshua; Cory, D. G.

    2016-03-01

    In recent years, Bayesian methods have been proposed as a solution to a wide range of issues in quantum state and process tomography. State-of-the-art Bayesian tomography solutions suffer from three problems: numerical intractability, a lack of informative prior distributions, and an inability to track time-dependent processes. Here, we address all three problems. First, we use modern statistical methods, as pioneered by Huszár and Houlsby (2012 Phys. Rev. A 85 052120) and by Ferrie (2014 New J. Phys. 16 093035), to make Bayesian tomography numerically tractable. Our approach allows for practical computation of Bayesian point and region estimators for quantum states and channels. Second, we propose the first priors on quantum states and channels that allow for including useful experimental insight. Finally, we develop a method that allows tracking of time-dependent states and estimates the drift and diffusion processes affecting a state. We provide source code and animated visual examples for our methods.

  19. Variational Bayesian Filtering

    Czech Academy of Sciences Publication Activity Database

    Šmídl, Václav; Quinn, A.

    2008-01-01

    Vol. 56, No. 10 (2008), pp. 5020-5030 ISSN 1053-587X R&D Projects: GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords: Bayesian filtering * particle filtering * Variational Bayes Subject RIV: BC - Control Systems Theory Impact factor: 2.335, year: 2008 http://library.utia.cas.cz/separaty/2008/AS/smidl-variational bayesian filtering.pdf

  20. Bayesian Networks An Introduction

    CERN Document Server

    Koski, Timo

    2009-01-01

    Bayesian Networks: An Introduction provides a self-contained introduction to the theory and applications of Bayesian networks, a topic of interest and importance for statisticians, computer scientists and those involved in modelling complex data sets. The material has been extensively tested in classroom teaching and assumes a basic knowledge of probability, statistics and mathematics. All notions are carefully explained and feature exercises throughout. Features include:.: An introduction to Dirichlet Distribution, Exponential Families and their applications.; A detailed description of learni

  1. Micromechanics of hierarchical materials

    DEFF Research Database (Denmark)

    Mishnaevsky, Leon, Jr.

    2012-01-01

    A short overview of micromechanical models of hierarchical materials (hybrid composites, biomaterials, fractal materials, etc.) is given. Several examples of the modeling of strength and damage in hierarchical materials are summarized, among them, 3D FE model of hybrid composites ... with nanoengineered matrix, fiber bundle model of UD composites with hierarchically clustered fibers and 3D multilevel model of wood considered as a gradient, cellular material with layered composite cell walls. The main areas of research in micromechanics of hierarchical materials are identified, among them ... the investigations of the effects of load redistribution between reinforcing elements at different scale levels, of the possibilities to control different material properties and to ensure synergy of strengthening effects at different scale levels and using the nanoreinforcement effects. The main future directions...

  2. Introduction into Hierarchical Matrices

    KAUST Repository

    Litvinenko, Alexander

    2013-12-05

    Hierarchical matrices allow us to reduce computational storage and cost from cubic to almost linear. This technique can be applied for solving PDEs, integral equations, matrix equations and approximation of large covariance and precision matrices.

  3. Hierarchical Network Design

    DEFF Research Database (Denmark)

    Thomadsen, Tommy

    2005-01-01

    Communication networks are immensely important today, since both companies and individuals use numerous services that rely on them. This thesis considers the design of hierarchical (communication) networks. Hierarchical networks consist of layers of networks and are well-suited for coping ... with changing and increasing demands. Two-layer networks consist of one backbone network, which interconnects cluster networks. The clusters consist of nodes and links, which connect the nodes. One node in each cluster is a hub node, and the backbone interconnects the hub nodes of each cluster and thus ... the clusters. The design of hierarchical networks involves clustering of nodes, hub selection, and network design, i.e. selection of links and routing of flows. Hierarchical networks have been in use for decades, but integrated design of these networks has only been considered for very special types of networks...

  4. Hierarchical Communication Diagrams

    OpenAIRE

    Marcin Szpyrka; Piotr Matyasik; Jerzy Biernacki; Agnieszka Biernacka; Michał Wypych; Leszek Kotulski

    2016-01-01

    Formal modelling languages range from strictly textual ones like process algebra scripts to visual modelling languages based on hierarchical graphs like coloured Petri nets. Approaches equipped with visual modelling capabilities make developing process easier and help users to cope with more complex systems. Alvis is a modelling language that combines possibilities of formal models verification with flexibility and simplicity of practical programming languages. The paper deals with hierarchic...

  5. Evolution of Subjective Hurricane Risk Perceptions: A Bayesian Approach

    OpenAIRE

    David Kelly; David Letson; Forest Nelson; David S. Nolan; Daniel Solis

    2009-01-01

    This paper studies how individuals update subjective risk perceptions in response to hurricane track forecast information, using a unique data set from an event market, the Hurricane Futures Market (HFM). We derive a theoretical Bayesian framework which predicts how traders update their perceptions of the probability of a hurricane making landfall in a certain range of coastline. Our results suggest that traders behave in a way consistent with Bayesian updating but this behavior is based on t...
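
    The event-market model itself is not reproduced here, but the Bayesian updating the authors test reduces to a simple application of Bayes' rule. The sketch below updates a landfall probability after observing that the forecast cone covers a coastline segment; the prior and the forecast hit/false-alarm rates are invented for illustration.

```python
# A minimal Bayes-rule update of a landfall probability after a new forecast, with made-up
# numbers; this is only the updating mechanism, not the HFM trading model.
prior_landfall = 0.20          # trader's prior probability of landfall in a coastline segment
p_fcst_given_landfall = 0.70   # assumed probability the forecast cone covers the segment if it will be hit
p_fcst_given_miss = 0.25       # assumed probability the cone covers the segment if it will not be hit

evidence = (p_fcst_given_landfall * prior_landfall
            + p_fcst_given_miss * (1.0 - prior_landfall))
posterior_landfall = p_fcst_given_landfall * prior_landfall / evidence
print(f"posterior probability of landfall after the forecast: {posterior_landfall:.2f}")
```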

  6. Progress on Bayesian Inference of the Fast Ion Distribution Function

    DEFF Research Database (Denmark)

    Stagner, L.; Heidbrink, W.W,; Chen, X.

    2013-01-01

    However, when theory and experiment disagree (for one or more diagnostics), it is unclear how to proceed. Bayesian statistics provides a framework to infer the DF, quantify errors, and reconcile discrepant diagnostic measurements. Diagnostic errors and weight functions that describe the phase space sensitivity of the measurements are incorporated into Bayesian likelihood probabilities. Prior probabilities describe physical constraints. This poster will show reconstructions of classically described, low-power, MHD-quiescent distribution functions from actual FIDA measurements. A description of the full...

  7. Probabilistic forecasting and Bayesian data assimilation

    CERN Document Server

    Reich, Sebastian

    2015-01-01

    In this book the authors describe the principles and methods behind probabilistic forecasting and Bayesian data assimilation. Instead of focusing on particular application areas, the authors adopt a general dynamical systems approach, with a profusion of low-dimensional, discrete-time numerical examples designed to build intuition about the subject. Part I explains the mathematical framework of ensemble-based probabilistic forecasting and uncertainty quantification. Part II is devoted to Bayesian filtering algorithms, from classical data assimilation algorithms such as the Kalman filter, variational techniques, and sequential Monte Carlo methods, through to more recent developments such as the ensemble Kalman filter and ensemble transform filters. The McKean approach to sequential filtering in combination with coupling of measures serves as a unifying mathematical framework throughout Part II. Assuming only some basic familiarity with probability, this book is an ideal introduction for graduate students in ap...

  8. Bayesian seismic AVO inversion

    Energy Technology Data Exchange (ETDEWEB)

    Buland, Arild

    2002-07-01

    A new linearized AVO inversion technique is developed in a Bayesian framework. The objective is to obtain posterior distributions for P-wave velocity, S-wave velocity and density. Distributions for other elastic parameters can also be assessed, for example acoustic impedance, shear impedance and P-wave to S-wave velocity ratio. The inversion algorithm is based on the convolutional model and a linearized weak contrast approximation of the Zoeppritz equation. The solution is represented by a Gaussian posterior distribution with explicit expressions for the posterior expectation and covariance, hence exact prediction intervals for the inverted parameters can be computed under the specified model. The explicit analytical form of the posterior distribution provides a computationally fast inversion method. Tests on synthetic data show that all inverted parameters were almost perfectly retrieved when the noise approached zero. With realistic noise levels, acoustic impedance was the best determined parameter, while the inversion provided practically no information about the density. The inversion algorithm has also been tested on a real 3-D dataset from the Sleipner Field. The results show good agreement with well logs but the uncertainty is high. The stochastic model includes uncertainties of both the elastic parameters, the wavelet and the seismic and well log data. The posterior distribution is explored by Markov chain Monte Carlo simulation using the Gibbs sampler algorithm. The inversion algorithm has been tested on a seismic line from the Heidrun Field with two wells located on the line. The uncertainty of the estimated wavelet is low. In the Heidrun examples the effect of including uncertainty of the wavelet and the noise level was marginal with respect to the AVO inversion results. We have developed a 3-D linearized AVO inversion method with spatially coupled model parameters where the objective is to obtain posterior distributions for P-wave velocity, S
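
    The actual inversion uses a linearized weak-contrast approximation of the Zoeppritz equation; the sketch below keeps only the structural ingredients named in the abstract: a convolutional linear forward model, a Gaussian prior, and the explicit Gaussian posterior expectation and covariance. The wavelet, reflectivity spikes, and noise level are invented.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy linear forward model d = G m + e: a short wavelet convolved with a reflectivity series m.
m_len = 30
wavelet = np.array([0.2, 0.6, 1.0, 0.6, 0.2])
G = np.zeros((m_len + len(wavelet) - 1, m_len))
for j in range(m_len):
    G[j:j + len(wavelet), j] = wavelet

m_true = np.zeros(m_len)
m_true[[8, 17, 24]] = [1.0, -0.7, 0.4]
noise_sd = 0.1
d = G @ m_true + rng.normal(0.0, noise_sd, G.shape[0])

# A Gaussian prior on m and Gaussian noise give an explicit Gaussian posterior.
prior_cov = 0.5 * np.eye(m_len)
noise_cov_inv = np.eye(G.shape[0]) / noise_sd**2
post_cov = np.linalg.inv(np.linalg.inv(prior_cov) + G.T @ noise_cov_inv @ G)
post_mean = post_cov @ (G.T @ noise_cov_inv @ d)      # prior mean is zero

post_sd = np.sqrt(np.diag(post_cov))
print("largest recovered spikes at indices:", np.sort(np.argsort(np.abs(post_mean))[-3:]))
print("typical posterior standard deviation:", post_sd.mean().round(3))
```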

  9. Bayesian Population Physiologically-Based Pharmacokinetic (PBPK) Approach for a Physiologically Realistic Characterization of Interindividual Variability in Clinically Relevant Populations.

    Directory of Open Access Journals (Sweden)

    Markus Krauss

    Full Text Available Interindividual variability in anatomical and physiological properties results in significant differences in drug pharmacokinetics. The consideration of such pharmacokinetic variability supports optimal drug efficacy and safety for each single individual, e.g. by identification of individual-specific dosings. One clear objective in clinical drug development is therefore a thorough characterization of the physiological sources of interindividual variability. In this work, we present a Bayesian population physiologically-based pharmacokinetic (PBPK) approach for the mechanistically and physiologically realistic identification of interindividual variability. The consideration of a generic and highly detailed mechanistic PBPK model structure enables the integration of large amounts of prior physiological knowledge, which is then updated with new experimental data in a Bayesian framework. A covariate model integrates known relationships of physiological parameters to age, gender and body height. We further provide a framework for estimation of the a posteriori parameter dependency structure at the population level. The approach is demonstrated considering a cohort of healthy individuals and theophylline as an application example. The variability and co-variability of physiological parameters are specified within the population. Significant correlations are identified between population parameters and are applied for individual- and population-specific visual predictive checks of the pharmacokinetic behavior, which leads to improved results compared to present population approaches. In the future, the integration of a generic PBPK model into a hierarchical approach allows for extrapolations to other populations or drugs, while the Bayesian paradigm allows for an iterative application of the approach and thereby a continuous updating of physiological knowledge with new data. This will facilitate decision making e.g. from preclinical to

  10. LiDAR based prediction of forest biomass using hierarchical models with spatially varying coefficients

    Science.gov (United States)

    Babcock, Chad; Finley, Andrew O.; Bradford, John B.; Kolka, Randall K.; Birdsey, Richard A.; Ryan, Michael G.

    2015-01-01

    Many studies and production inventory systems have shown the utility of coupling covariates derived from Light Detection and Ranging (LiDAR) data with forest variables measured on georeferenced inventory plots through regression models. The objective of this study was to propose and assess the use of a Bayesian hierarchical modeling framework that accommodates both residual spatial dependence and non-stationarity of model covariates through the introduction of spatial random effects. We explored this objective using four forest inventory datasets that are part of the North American Carbon Program, each comprising point-referenced measures of above-ground forest biomass and discrete LiDAR. For each dataset, we considered at least five regression model specifications of varying complexity. Models were assessed based on goodness of fit criteria and predictive performance using a 10-fold cross-validation procedure. Results showed that the addition of spatial random effects to the regression model intercept improved fit and predictive performance in the presence of substantial residual spatial dependence. Additionally, in some cases, allowing either some or all regression slope parameters to vary spatially, via the addition of spatial random effects, further improved model fit and predictive performance. In other instances, models showed improved fit but decreased predictive performance—indicating over-fitting and underscoring the need for cross-validation to assess predictive ability. The proposed Bayesian modeling framework provided access to pixel-level posterior predictive distributions that were useful for uncertainty mapping, diagnosing spatial extrapolation issues, revealing missing model covariates, and discovering locally significant parameters.
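
    A minimal sketch of the simplest ingredient of such models, a regression with a spatially correlated random intercept, is given below. It assumes an exponential spatial covariance with known variance, range and nugget so that the random effect can be marginalized and the coefficient posterior computed in closed form; in the study these hyperparameters are themselves estimated within the Bayesian hierarchy, which the sketch does not attempt.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 100, 2

# Plot coordinates and a LiDAR-style covariate (illustrative synthetic data)
coords = rng.uniform(0, 10, size=(n, 2))
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + one covariate
beta_true = np.array([2.0, 1.5])

# Exponential spatial covariance for the random intercept (assumed known here)
sigma2_sp, phi, tau2 = 1.0, 2.0, 0.25
D = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
K = sigma2_sp * np.exp(-D / phi)

w = rng.multivariate_normal(np.zeros(n), K)              # spatial random intercept
y = X @ beta_true + w + np.sqrt(tau2) * rng.normal(size=n)

# Marginal model: y ~ N(X beta, K + tau2 I). With beta ~ N(0, 100 I),
# the coefficient posterior is Gaussian (standard conjugate GLS-style update).
Sigma = K + tau2 * np.eye(n)
Sigma_inv = np.linalg.inv(Sigma)
prior_prec = np.eye(p) / 100.0
post_cov = np.linalg.inv(X.T @ Sigma_inv @ X + prior_prec)
post_mean = post_cov @ (X.T @ Sigma_inv @ y)
print(post_mean, np.sqrt(np.diag(post_cov)))
```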

  11. Towards a hierarchical optimization modeling framework for the evaluation and construction of spatially targeted incentive policies to promote green infrastructure (GI) amidst budgetary, compliance and GI-effectiveness uncertainties

    Science.gov (United States)

    Background:Bilevel optimization has been recognized as a 2-player Stackelberg game where players are represented as leaders and followers and each pursue their own set of objectives. Hierarchical optimization problems, which are a generalization of bilevel, are especially difficu...

  12. Bayesian networks in educational assessment

    CERN Document Server

    Almond, Russell G; Steinberg, Linda S; Yan, Duanli; Williamson, David M

    2015-01-01

    Bayesian inference networks, a synthesis of statistics and expert systems, have advanced reasoning under uncertainty in medicine, business, and social sciences. This innovative volume is the first comprehensive treatment exploring how they can be applied to design and analyze innovative educational assessments. Part I develops Bayes nets’ foundations in assessment, statistics, and graph theory, and works through the real-time updating algorithm. Part II addresses parametric forms for use with assessment, model-checking techniques, and estimation with the EM algorithm and Markov chain Monte Carlo (MCMC). A unique feature is the volume’s grounding in Evidence-Centered Design (ECD) framework for assessment design. This “design forward” approach enables designers to take full advantage of Bayes nets’ modularity and ability to model complex evidentiary relationships that arise from performance in interactive, technology-rich assessments such as simulations. Part III describes ECD, situates Bayes nets as ...

  13. On Bayesian System Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen Ringi, M.

    1995-05-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non identical environments. 85 refs.
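
    As a toy illustration of reliability as a state of knowledge, the sketch below updates a Gamma prior on a constant failure rate with observed failures and exposure time, and reports the posterior predictive probability of surviving a stated mission time. The conjugate exponential-lifetime model and all numbers are illustrative assumptions; the thesis's model of dependence through a shared operational environment is not reproduced.

```python
# Reliability of one component over a mission of length t, with exponential
# lifetimes and a Gamma(a, b) prior on the failure rate lambda (rate parametrisation).
a0, b0 = 1.0, 100.0                   # prior: roughly one failure per 100 hours of evidence
n_failures, total_time = 2, 1500.0    # observed: 2 failures over 1500 component-hours

a_post = a0 + n_failures
b_post = b0 + total_time

def reliability(t, a, b):
    """Posterior predictive P(no failure in [0, t]) = E[exp(-lambda * t)]
    under lambda ~ Gamma(a, rate=b)."""
    return (b / (b + t)) ** a

print(round(reliability(100.0, a_post, b_post), 3))   # updated state of knowledge
print(round(reliability(100.0, a0, b0), 3))           # prior state of knowledge
```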

  14. Bayesian Kernel Mixtures for Counts.

    Science.gov (United States)

    Canale, Antonio; Dunson, David B

    2011-12-01

    Although Bayesian nonparametric mixture models for continuous data are well developed, there is a limited literature on related approaches for count data. A common strategy is to use a mixture of Poissons, which unfortunately is quite restrictive in not accounting for distributions having variance less than the mean. Other approaches include mixing multinomials, which requires finite support, and using a Dirichlet process prior with a Poisson base measure, which does not allow smooth deviations from the Poisson. As a broad class of alternative models, we propose to use nonparametric mixtures of rounded continuous kernels. An efficient Gibbs sampler is developed for posterior computation, and a simulation study is performed to assess performance. Focusing on the rounded Gaussian case, we generalize the modeling framework to account for multivariate count data, joint modeling with continuous and categorical variables, and other complications. The methods are illustrated through applications to a developmental toxicity study and marketing data. This article has supplementary material online.
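
    The core trick of rounding a continuous kernel onto the integers can be written in a few lines. The sketch below evaluates the probability mass function of a rounded Gaussian kernel (and a two-component mixture) under one common threshold convention; the nonparametric prior over mixtures and the Gibbs sampler described in the abstract are not implemented.

```python
import numpy as np
from scipy.stats import norm

def rounded_gaussian_pmf(j, mu, sigma):
    """P(y = j) when y rounds a latent y* ~ N(mu, sigma^2) onto the counts.
    Thresholds (-inf, 1, 2, ...]: all latent mass below 1 maps to count 0
    (one common convention; other threshold choices are possible)."""
    j = np.asarray(j)
    upper = norm.cdf(j + 1, loc=mu, scale=sigma)
    lower = np.where(j == 0, 0.0, norm.cdf(j, loc=mu, scale=sigma))
    return upper - lower

js = np.arange(0, 15)
# A two-component mixture of rounded Gaussians can be under- or over-dispersed
pmf = 0.6 * rounded_gaussian_pmf(js, mu=2.0, sigma=0.4) \
    + 0.4 * rounded_gaussian_pmf(js, mu=8.0, sigma=2.5)
print(pmf.round(3), pmf.sum())   # sums to ~1 (up to truncation at 14)
```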

  15. On Bayesian System Reliability Analysis

    International Nuclear Information System (INIS)

    Soerensen Ringi, M.

    1995-01-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non identical environments. 85 refs

  16. Parallel hierarchical radiosity rendering

    Energy Technology Data Exchange (ETDEWEB)

    Carter, Michael [Iowa State Univ., Ames, IA (United States)

    1993-07-01

    In this dissertation, the step-by-step development of a scalable parallel hierarchical radiosity renderer is documented. First, a new look is taken at the traditional radiosity equation, and a new form is presented in which the matrix of linear system coefficients is transformed into a symmetric matrix, thereby simplifying the problem and enabling a new solution technique to be applied. Next, the state-of-the-art hierarchical radiosity methods are examined for their suitability to parallel implementation, and scalability. Significant enhancements are also discovered which both improve their theoretical foundations and improve the images they generate. The resultant hierarchical radiosity algorithm is then examined for sources of parallelism, and for an architectural mapping. Several architectural mappings are discussed. A few key algorithmic changes are suggested during the process of making the algorithm parallel. Next, the performance, efficiency, and scalability of the algorithm are analyzed. The dissertation closes with a discussion of several ideas which have the potential to further enhance the hierarchical radiosity method, or provide an entirely new forum for the application of hierarchical methods.

  17. Personalized Audio Systems - a Bayesian Approach

    DEFF Research Database (Denmark)

    Nielsen, Jens Brehm; Jensen, Bjørn Sand; Hansen, Toke Jansen

    2013-01-01

    Modern audio systems are typically equipped with several user-adjustable parameters unfamiliar to most users listening to the system. To obtain the best possible setting, the user is forced into multi-parameter optimization with respect to the user's own objective and preference. To address this, the present paper presents a general interactive framework for personalization of such audio systems. The framework builds on Bayesian Gaussian process regression in which a model of the user's objective function is updated sequentially. The parameter setting to be evaluated in a given trial is selected...

  18. Hierarchical spatial models for predicting pygmy rabbit distribution and relative abundance

    Science.gov (United States)

    Wilson, T.L.; Odei, J.B.; Hooten, M.B.; Edwards, T.C.

    2010-01-01

    Conservationists routinely use species distribution models to plan conservation, restoration and development actions, while ecologists use them to infer process from pattern. These models tend to work well for common or easily observable species, but are of limited utility for rare and cryptic species. This may be because honest accounting of known observation bias and spatial autocorrelation are rarely included, thereby limiting statistical inference of resulting distribution maps. We specified and implemented a spatially explicit Bayesian hierarchical model for a cryptic mammal species (pygmy rabbit Brachylagus idahoensis). Our approach used two levels of indirect sign that are naturally hierarchical (burrows and faecal pellets) to build a model that allows for inference on regression coefficients as well as spatially explicit model parameters. We also produced maps of rabbit distribution (occupied burrows) and relative abundance (number of burrows expected to be occupied by pygmy rabbits). The model demonstrated statistically rigorous spatial prediction by including spatial autocorrelation and measurement uncertainty. We demonstrated flexibility of our modelling framework by depicting probabilistic distribution predictions using different assumptions of pygmy rabbit habitat requirements. Spatial representations of the variance of posterior predictive distributions were obtained to evaluate heterogeneity in model fit across the spatial domain. Leave-one-out cross-validation was conducted to evaluate the overall model fit. Synthesis and applications. Our method draws on the strengths of previous work, thereby bridging and extending two active areas of ecological research: species distribution models and multi-state occupancy modelling. Our framework can be extended to encompass both larger extents and other species for which direct estimation of abundance is difficult. © 2010 The Authors. Journal compilation © 2010 British Ecological Society.

  19. Heuristic algorithms for feature selection under Bayesian models with block-diagonal covariance structure.

    Science.gov (United States)

    Foroughi Pour, Ali; Dalton, Lori A

    2018-03-21

    Many bioinformatics studies aim to identify markers, or features, that can be used to discriminate between distinct groups. In problems where strong individual markers are not available, or where interactions between gene products are of primary interest, it may be necessary to consider combinations of features as a marker family. To this end, recent work proposes a hierarchical Bayesian framework for feature selection that places a prior on the set of features we wish to select and on the label-conditioned feature distribution. While an analytical posterior under Gaussian models with block covariance structures is available, the optimal feature selection algorithm for this model remains intractable since it requires evaluating the posterior over the space of all possible covariance block structures and feature-block assignments. To address this computational barrier, in prior work we proposed a simple suboptimal algorithm, 2MNC-Robust, with robust performance across the space of block structures. Here, we present three new heuristic feature selection algorithms. The proposed algorithms outperform 2MNC-Robust and many other popular feature selection algorithms on synthetic data. In addition, enrichment analysis on real breast cancer, colon cancer, and Leukemia data indicates they also output many of the genes and pathways linked to the cancers under study. Bayesian feature selection is a promising framework for small-sample high-dimensional data, in particular biomarker discovery applications. When applied to cancer data these algorithms outputted many genes already shown to be involved in cancer as well as potentially new biomarkers. Furthermore, one of the proposed algorithms, SPM, outputs blocks of heavily correlated genes, particularly useful for studying gene interactions and gene networks.

  20. BAYESIAN ESTIMATION OF THERMONUCLEAR REACTION RATES

    Energy Technology Data Exchange (ETDEWEB)

    Iliadis, C.; Anderson, K. S. [Department of Physics and Astronomy, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599-3255 (United States); Coc, A. [Centre de Sciences Nucléaires et de Sciences de la Matière (CSNSM), CNRS/IN2P3, Univ. Paris-Sud, Université Paris–Saclay, Bâtiment 104, F-91405 Orsay Campus (France); Timmes, F. X.; Starrfield, S., E-mail: iliadis@unc.edu [School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1504 (United States)

    2016-11-01

    The problem of estimating non-resonant astrophysical S-factors and thermonuclear reaction rates, based on measured nuclear cross sections, is of major interest for nuclear energy generation, neutrino physics, and element synthesis. Many different methods have been applied to this problem in the past, almost all of them based on traditional statistics. Bayesian methods, on the other hand, are now in widespread use in the physical sciences. In astronomy, for example, Bayesian statistics is applied to the observation of extrasolar planets, gravitational waves, and Type Ia supernovae. However, nuclear physics, in particular, has been slow to adopt Bayesian methods. We present astrophysical S-factors and reaction rates based on Bayesian statistics. We develop a framework that incorporates robust parameter estimation, systematic effects, and non-Gaussian uncertainties in a consistent manner. The method is applied to the reactions d(p,γ)³He, ³He(³He,2p)⁴He, and ³He(α,γ)⁷Be, important for deuterium burning, solar neutrinos, and Big Bang nucleosynthesis.

  1. Quantifying Registration Uncertainty With Sparse Bayesian Modelling.

    Science.gov (United States)

    Le Folgoc, Loic; Delingette, Herve; Criminisi, Antonio; Ayache, Nicholas

    2017-02-01

    We investigate uncertainty quantification under a sparse Bayesian model of medical image registration. Bayesian modelling has proven powerful to automate the tuning of registration hyperparameters, such as the trade-off between the data and regularization functionals. Sparsity-inducing priors have recently been used to render the parametrization itself adaptive and data-driven. The sparse prior on transformation parameters effectively favors the use of coarse basis functions to capture the global trends in the visible motion while finer, highly localized bases are introduced only in the presence of coherent image information and motion. In earlier work, approximate inference under the sparse Bayesian model was tackled in an efficient Variational Bayes (VB) framework. In this paper we are interested in the theoretical and empirical quality of uncertainty estimates derived under this approximate scheme vs. under the exact model. We implement an (asymptotically) exact inference scheme based on reversible jump Markov Chain Monte Carlo (MCMC) sampling to characterize the posterior distribution of the transformation and compare the predictions of the VB and MCMC based methods. The true posterior distribution under the sparse Bayesian model is found to be meaningful: orders of magnitude for the estimated uncertainty are quantitatively reasonable, the uncertainty is higher in textureless regions and lower in the direction of strong intensity gradients.

  2. Inverse Problems in a Bayesian Setting

    KAUST Repository

    Matthies, Hermann G.

    2016-02-13

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ)—the propagation of uncertainty through a computational (forward) model—are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. We give a detailed account of this approach via conditional approximation, various approximations, and the construction of filters. Together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update in form of a filter is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and nonlinear Bayesian update in form of a filter on some examples.

  3. Hierarchical Fuzzy Sets To Query Possibilistic Databases

    OpenAIRE

    Thomopoulos, Rallou; Buche, Patrice; Haemmerlé, Ollivier

    2008-01-01

    Within the framework of flexible querying of possibilistic databases, based on the fuzzy set theory, this chapter focuses on the case where the vocabulary used both in the querying language and in the data is hierarchically organized, which occurs in systems that use ontologies. We give an overview of previous works concerning two issues: firstly, flexible querying of imprecise data in the relational model; secondly, the introduction of fuzziness in hierarchies. Concerning the latter point, w...

  4. Semantic Image Segmentation with Contextual Hierarchical Models.

    Science.gov (United States)

    Seyedhosseini, Mojtaba; Tasdizen, Tolga

    2016-05-01

    Semantic segmentation is the problem of assigning an object label to each pixel. It unifies the image segmentation and object recognition problems. The importance of using contextual information in semantic segmentation frameworks has been widely realized in the field. We propose a contextual framework, called contextual hierarchical model (CHM), which learns contextual information in a hierarchical framework for semantic segmentation. At each level of the hierarchy, a classifier is trained based on downsampled input images and outputs of previous levels. Our model then incorporates the resulting multi-resolution contextual information into a classifier to segment the input image at original resolution. This training strategy allows for optimization of a joint posterior probability at multiple resolutions through the hierarchy. Contextual hierarchical model is purely based on the input image patches and does not make use of any fragments or shape examples. Hence, it is applicable to a variety of problems such as object segmentation and edge detection. We demonstrate that CHM performs at par with state-of-the-art on Stanford background and Weizmann horse datasets. It also outperforms state-of-the-art edge detection methods on NYU depth dataset and achieves state-of-the-art on Berkeley segmentation dataset (BSDS 500).

  5. Hierarchical Porous Structures

    Energy Technology Data Exchange (ETDEWEB)

    Grote, Christopher John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-06-07

    Materials Design is often at the forefront of technological innovation. While there has always been a push to generate increasingly low density materials, such as aerogels or hydrogels, more recently the idea of bicontinuous structures has come into play. This review will cover some of the methods and applications for generating both porous and hierarchically porous structures.

  6. Catalysis with hierarchical zeolites

    DEFF Research Database (Denmark)

    Holm, Martin Spangsberg; Taarning, Esben; Egeblad, Kresten

    2011-01-01

    zeolites that have been reported hitherto. Prototypical examples from some of the different categories of catalytic reactions that have been studied using hierarchical zeolite catalysts are highlighted. This clearly illustrates the different ways that improved performance can be achieved with this family...

  7. The Hierarchical Perspective

    Directory of Open Access Journals (Sweden)

    Daniel Sofron

    2015-05-01

    Full Text Available This paper is focused on the hierarchical perspective, one of the methods for representing space that was used before the discovery of the Renaissance linear perspective. The hierarchical perspective has a more or less pronounced scientific character and its study offers us a clear image of the way the representatives of the cultures that developed it used to perceive the sensitive reality. This type of perspective is an original method of representing three-dimensional space on a flat surface, which characterises the art of Ancient Egypt and much of the art of the Middle Ages, being identified in the Eastern European Byzantine art, as well as in the Western European Pre-Romanesque and Romanesque art. At the same time, the hierarchical perspective is also present in naive painting and infantile drawing. Reminiscences of this method can be recognised also in the works of some precursors of the Italian Renaissance. The hierarchical perspective can be viewed as a subjective ranking criterion, according to which the elements are visually represented by taking into account their relevance within the image while perception is ignored. This paper aims to show how the main objective of the artists of those times was not to faithfully represent the objective reality, but rather to emphasize the essence of the world and its perennial aspects. This may represent a possible explanation for the refusal of perspective in the Egyptian, Romanesque and Byzantine painting, characterised by a marked two-dimensionality.

  8. Bayesian methods for hackers probabilistic programming and Bayesian inference

    CERN Document Server

    Davidson-Pilon, Cameron

    2016-01-01

    Bayesian methods of inference are deeply natural and extremely powerful. However, most discussions of Bayesian inference rely on intensely complex mathematical analyses and artificial examples, making it inaccessible to anyone without a strong mathematical background. Now, though, Cameron Davidson-Pilon introduces Bayesian inference from a computational perspective, bridging theory to practice–freeing you to get results using computing power. Bayesian Methods for Hackers illuminates Bayesian inference through probabilistic programming with the powerful PyMC language and the closely related Python tools NumPy, SciPy, and Matplotlib. Using this approach, you can reach effective solutions in small increments, without extensive mathematical intervention. Davidson-Pilon begins by introducing the concepts underlying Bayesian inference, comparing it with other techniques and guiding you through building and training your first Bayesian model. Next, he introduces PyMC through a series of detailed examples a...
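
    Since the book is built around probabilistic programming in PyMC, a minimal beta-Bernoulli model in that spirit is sketched below. The data are invented, and the code uses current PyMC (v5-style) syntax, which may differ from the syntax shown in a given edition of the book.

```python
import numpy as np
import pymc as pm   # PyMC v5-style API; the book's examples may use earlier PyMC syntax

# Ten observed coin flips (illustrative data)
data = np.array([1, 0, 1, 1, 0, 1, 1, 1, 0, 1])

with pm.Model() as coin_model:
    p = pm.Beta("p", alpha=1.0, beta=1.0)          # uniform prior on the coin bias
    obs = pm.Bernoulli("obs", p=p, observed=data)  # likelihood
    idata = pm.sample(2000, tune=1000, chains=2, progressbar=False)

# With a Beta(1, 1) prior and 7 heads out of 10, the exact posterior is Beta(8, 4);
# the sampled posterior mean should agree closely with 8 / 12.
print(idata.posterior["p"].mean().item())
```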

  9. Bayesian logistic regression analysis

    NARCIS (Netherlands)

    Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.

    2012-01-01

    In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuisance parameters, the Jacobian transformation is an
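
    For readers who want to see a posterior over logistic-regression parameters computed end to end, the sketch below uses Gaussian priors and a random-walk Metropolis sampler on synthetic data. This is a standard construction chosen for brevity, not the authors' Jacobian-transformation derivation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic data: an intercept plus one covariate
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([-0.5, 1.2])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))

def log_post(beta, X, y, prior_sd=10.0):
    """Log posterior: Bernoulli likelihood + independent N(0, prior_sd^2) priors."""
    eta = X @ beta
    loglik = np.sum(y * eta - np.log1p(np.exp(eta)))
    logprior = -0.5 * np.sum((beta / prior_sd) ** 2)
    return loglik + logprior

# Random-walk Metropolis over the two coefficients
beta = np.zeros(2)
current_lp = log_post(beta, X, y)
samples = []
for it in range(6000):
    prop = beta + 0.15 * rng.normal(size=2)
    prop_lp = log_post(prop, X, y)
    if np.log(rng.uniform()) < prop_lp - current_lp:
        beta, current_lp = prop, prop_lp
    if it >= 1000:                      # discard burn-in
        samples.append(beta.copy())

samples = np.array(samples)
print(samples.mean(axis=0), samples.std(axis=0))   # posterior means and sdevs
```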

  10. Bayesian statistical inference

    Directory of Open Access Journals (Sweden)

    Bruno De Finetti

    2017-04-01

    Full Text Available This work was translated into English and published in the volume: Bruno De Finetti, Induction and Probability, Biblioteca di Statistica, eds. P. Monari, D. Cocchi, Clueb, Bologna, 1993. Bayesian Statistical Inference is one of the last fundamental philosophical papers in which we can find De Finetti's essential approach to statistical inference.

  11. A tutorial introduction to Bayesian models of cognitive development.

    Science.gov (United States)

    Perfors, Amy; Tenenbaum, Joshua B; Griffiths, Thomas L; Xu, Fei

    2011-09-01

    We present an introduction to Bayesian inference as it is used in probabilistic models of cognitive development. Our goal is to provide an intuitive and accessible guide to the what, the how, and the why of the Bayesian approach: what sorts of problems and data the framework is most relevant for, and how and why it may be useful for developmentalists. We emphasize a qualitative understanding of Bayesian inference, but also include information about additional resources for those interested in the cognitive science applications, mathematical foundations, or machine learning details in more depth. In addition, we discuss some important interpretation issues that often arise when evaluating Bayesian models in cognitive science. Copyright © 2010 Elsevier B.V. All rights reserved.

  12. Hierarchical models and functional traits

    NARCIS (Netherlands)

    van Loon, E.E.; Shamoun-Baranes, J.; Sierdsema, H.; Bouten, W.; Cramer, W.; Badeck, F.; Krukenberg, B.; Klotz, S.; Kühn, I.; Schweiger, O.; Böhning-Gaese, K.; Schaefer, H.-C.; Kissling, D.; Brandl, R.; Brändle, M.; Fricke, R.; Leuschner, C.; Buschmann, H.; Köckermann, B.; Rose, L.

    2006-01-01

    Hierarchical models for animal abundance prediction are conceptually elegant. They are generally more parsimonious than non-hierarchical models derived from the same data, give relatively robust predictions and automatically provide consistent output at multiple (spatio-temporal) scales. Another

  13. A FAST BAYESIAN METHOD FOR UPDATING AND FORECASTING HOURLY OZONE LEVELS

    Science.gov (United States)

    A Bayesian hierarchical space-time model is proposed by combining information from real-time ambient AIRNow air monitoring data, and output from a computer simulation model known as the Community Multi-scale Air Quality (Eta-CMAQ) forecast model. A model validation analysis shows...

  14. A Bayesian approach to landscape ecological risk assessment applied to the upper Grande Ronde watershed, Oregon

    Science.gov (United States)

    Kimberley K. Ayre; Wayne G. Landis

    2012-01-01

    We present a Bayesian network model based on the ecological risk assessment framework to evaluate potential impacts to habitats and resources resulting from wildfire, grazing, forest management activities, and insect outbreaks in a forested landscape in northeastern Oregon. The Bayesian network structure consisted of three tiers of nodes: landscape disturbances,...

  15. A Bayesian Approach to Person Fit Analysis in Item Response Theory Models. Research Report.

    Science.gov (United States)

    Glas, Cees A. W.; Meijer, Rob R.

    A Bayesian approach to the evaluation of person fit in item response theory (IRT) models is presented. In a posterior predictive check, the observed value on a discrepancy variable is positioned in its posterior distribution. In a Bayesian framework, a Markov Chain Monte Carlo procedure can be used to generate samples of the posterior distribution…

  16. Some explorations into Bayesian modelling of risks due to pesticide intake from food

    OpenAIRE

    Voet, van der, H.; Paulo, M.J.

    2004-01-01

    This paper presents some common types of data and models in pesticide exposure assessment. The problems of traditional methods are discussed in connection with possibilities to address them in a Bayesian framework. We present simple Bayesian models for consumption of food and for residue monitoring data

  17. Bayesian Peak Picking for NMR Spectra

    KAUST Repository

    Cheng, Yichen

    2014-02-01

    Protein structure determination is a very important topic in structural genomics, which helps people to understand varieties of biological functions such as protein-protein interactions, protein–DNA interactions and so on. Nowadays, nuclear magnetic resonance (NMR) has often been used to determine the three-dimensional structures of protein in vivo. This study aims to automate the peak picking step, the most important and tricky step in NMR structure determination. We propose to model the NMR spectrum by a mixture of bivariate Gaussian densities and use the stochastic approximation Monte Carlo algorithm as the computational tool to solve the problem. Under the Bayesian framework, the peak picking problem is cast as a variable selection problem. The proposed method can automatically distinguish true peaks from false ones without preprocessing the data. To the best of our knowledge, this is the first effort in the literature that tackles the peak picking problem for NMR spectrum data using a Bayesian method.
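
    As a rough, non-Bayesian illustration of the underlying representation, the sketch below fits a mixture of bivariate Gaussians to synthetic 2-D peak locations with EM via scikit-learn. The paper's actual contribution, casting peak picking as Bayesian variable selection solved by stochastic approximation Monte Carlo, is not reproduced here.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)

# Synthetic 2-D "spectrum" points drawn around three true peak centres
centres = np.array([[1.0, 2.0], [4.0, 4.5], [7.0, 1.5]])
points = np.vstack([
    rng.multivariate_normal(c, 0.05 * np.eye(2), size=150) for c in centres
])

# Fit a mixture of bivariate Gaussians; each component plays the role of a peak
gm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
gm.fit(points)

print(np.round(gm.means_, 2))     # estimated peak positions
print(np.round(gm.weights_, 2))   # relative peak weights
```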

  18. Bayesian image reconstruction: Application to emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Nunez, J.; Llacer, J.

    1989-02-01

    In this paper we propose a Maximum a Posteriori (MAP) method of image reconstruction in the Bayesian framework for the Poisson noise case. We use entropy to define the prior probability and likelihood to define the conditional probability. The method uses sharpness parameters which can be theoretically computed or adjusted, allowing us to obtain MAP reconstructions without the problem of the "grey" reconstructions associated with the pre-Bayesian reconstructions. We have developed several ways to solve the reconstruction problem and propose a new iterative algorithm which is stable, maintains positivity and converges to feasible images faster than the Maximum Likelihood Estimate method. We have successfully applied the new method to the case of Emission Tomography, both with simulated and real data. 41 refs., 4 figs., 1 tab.
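
    The Maximum Likelihood Estimate method that the paper uses as its baseline is the familiar ML-EM (Richardson-Lucy-type) iteration for Poisson data, sketched below with an invented system matrix. The entropy-prior MAP update itself, including the sharpness parameters, is not shown.

```python
import numpy as np

def ml_em(A, counts, n_iter=50):
    """ML-EM iteration for Poisson data: counts ~ Poisson(A @ image).

    A      : (n_detectors, n_pixels) non-negative system matrix
    counts : (n_detectors,) measured counts
    """
    image = np.ones(A.shape[1])                 # positive initial image
    sens = A.sum(axis=0)                        # sensitivity (column sums)
    for _ in range(n_iter):
        forward = A @ image
        ratio = counts / np.clip(forward, 1e-12, None)
        image = image * (A.T @ ratio) / np.clip(sens, 1e-12, None)
    return image                                # stays non-negative by construction

# Toy example: a 4-pixel "image" seen through a random non-negative system matrix
rng = np.random.default_rng(5)
A = rng.uniform(0.0, 1.0, size=(20, 4))
true_image = np.array([5.0, 0.5, 3.0, 1.0])
counts = rng.poisson(A @ true_image)
print(np.round(ml_em(A, counts, n_iter=200), 2))
```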

  19. Bayesian Peak Picking for NMR Spectra

    Directory of Open Access Journals (Sweden)

    Yichen Cheng

    2014-02-01

    Full Text Available Protein structure determination is a very important topic in structural genomics, which helps people to understand varieties of biological functions such as protein-protein interactions, protein–DNA interactions and so on. Nowadays, nuclear magnetic resonance (NMR) has often been used to determine the three-dimensional structures of protein in vivo. This study aims to automate the peak picking step, the most important and tricky step in NMR structure determination. We propose to model the NMR spectrum by a mixture of bivariate Gaussian densities and use the stochastic approximation Monte Carlo algorithm as the computational tool to solve the problem. Under the Bayesian framework, the peak picking problem is cast as a variable selection problem. The proposed method can automatically distinguish true peaks from false ones without preprocessing the data. To the best of our knowledge, this is the first effort in the literature that tackles the peak picking problem for NMR spectrum data using a Bayesian method.

  20. Bayesian optimization for materials science

    CERN Document Server

    Packwood, Daniel

    2017-01-01

    This book provides a short and concise introduction to Bayesian optimization specifically for experimental and computational materials scientists. After explaining the basic idea behind Bayesian optimization and some applications to materials science in Chapter 1, the mathematical theory of Bayesian optimization is outlined in Chapter 2. Finally, Chapter 3 discusses an application of Bayesian optimization to a complicated structure optimization problem in computational surface science. Bayesian optimization is a promising global optimization technique that originates in the field of machine learning and is starting to gain attention in materials science. For the purpose of materials design, Bayesian optimization can be used to predict new materials with novel properties without extensive screening of candidate materials. For the purpose of computational materials science, Bayesian optimization can be incorporated into first-principles calculations to perform efficient, global structure optimizations. While re...

  1. A guide to Bayesian model selection for ecologists

    Science.gov (United States)

    Hooten, Mevin B.; Hobbs, N.T.

    2015-01-01

    The steady upward trend in the use of model selection and Bayesian methods in ecological research has made it clear that both approaches to inference are important for modern analysis of models and data. However, in teaching Bayesian methods and in working with our research colleagues, we have noticed a general dissatisfaction with the available literature on Bayesian model selection and multimodel inference. Students and researchers new to Bayesian methods quickly find that the published advice on model selection is often preferential in its treatment of options for analysis, frequently advocating one particular method above others. The recent appearance of many articles and textbooks on Bayesian modeling has provided welcome background on relevant approaches to model selection in the Bayesian framework, but most of these are either very narrowly focused in scope or inaccessible to ecologists. Moreover, the methodological details of Bayesian model selection approaches are spread thinly throughout the literature, appearing in journals from many different fields. Our aim with this guide is to condense the large body of literature on Bayesian approaches to model selection and multimodel inference and present it specifically for quantitative ecologists as neutrally as possible. We also bring to light a few important and fundamental concepts relating directly to model selection that seem to have gone unnoticed in the ecological literature. Throughout, we provide only a minimal discussion of philosophy, preferring instead to examine the breadth of approaches as well as their practical advantages and disadvantages. This guide serves as a reference for ecologists using Bayesian methods, so that they can better understand their options and can make an informed choice that is best aligned with their goals for inference.
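
    One criterion commonly covered in such guides is WAIC, which only needs a matrix of pointwise log-likelihoods evaluated at posterior draws. A minimal sketch of that computation follows; the log-likelihood matrix is filled with synthetic values purely to make the snippet runnable, and the guide itself weighs WAIC against several alternative criteria.

```python
import numpy as np

def waic(log_lik):
    """WAIC from a (n_draws, n_obs) matrix of pointwise log-likelihoods."""
    # log pointwise predictive density, computed stably (log-sum-exp style)
    max_ll = log_lik.max(axis=0)
    lppd = np.sum(max_ll + np.log(np.mean(np.exp(log_lik - max_ll), axis=0)))
    p_waic = np.sum(log_lik.var(axis=0, ddof=1))   # effective number of parameters
    return -2.0 * (lppd - p_waic)

# Illustrative only: fake log-likelihoods for 1000 posterior draws and 50 observations
rng = np.random.default_rng(6)
fake_log_lik = -0.5 * (rng.normal(size=(1000, 50)) ** 2) - 0.9
print(waic(fake_log_lik))
```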

  2. Fast and accurate Bayesian model criticism and conflict diagnostics using R-INLA

    KAUST Repository

    Ferkingstad, Egil

    2017-10-16

    Bayesian hierarchical models are increasingly popular for realistic modelling and analysis of complex data. This trend is accompanied by the need for flexible, general and computationally efficient methods for model criticism and conflict detection. Usually, a Bayesian hierarchical model incorporates a grouping of the individual data points, as, for example, with individuals in repeated measurement data. In such cases, the following question arises: Are any of the groups “outliers,” or in conflict with the remaining groups? Existing general approaches aiming to answer such questions tend to be extremely computationally demanding when model fitting is based on Markov chain Monte Carlo. We show how group-level model criticism and conflict detection can be carried out quickly and accurately through integrated nested Laplace approximations (INLA). The new method is implemented as a part of the open-source R-INLA package for Bayesian computing (http://r-inla.org).

  3. Bayesian modeling of ChIP-chip data using latent variables

    Directory of Open Access Journals (Sweden)

    Tian Yanan

    2009-10-01

    Full Text Available Abstract Background The ChIP-chip technology has been used in a wide range of biomedical studies, such as identification of human transcription factor binding sites, investigation of DNA methylation, and investigation of histone modifications in animals and plants. Various methods have been proposed in the literature for analyzing the ChIP-chip data, such as the sliding window methods, the hidden Markov model-based methods, and Bayesian methods. Although, due to the integrated consideration of uncertainty of the models and model parameters, Bayesian methods can potentially work better than the other two classes of methods, the existing Bayesian methods do not perform satisfactorily. They usually require multiple replicates or some extra experimental information to parametrize the model, and long CPU time due to involving of MCMC simulations. Results In this paper, we propose a Bayesian latent model for the ChIP-chip data. The new model mainly differs from the existing Bayesian models, such as the joint deconvolution model, the hierarchical gamma mixture model, and the Bayesian hierarchical model, in two respects. Firstly, it works on the difference between the averaged treatment and control samples. This enables the use of a simple model for the data, which avoids the probe-specific effect and the sample (control/treatment) effect. As a consequence, this enables an efficient MCMC simulation of the posterior distribution of the model, and also makes the model more robust to the outliers. Secondly, it models the neighboring dependence of probes by introducing a latent indicator vector. A truncated Poisson prior distribution is assumed for the latent indicator variable, with the rationale being justified at length. Conclusion The Bayesian latent method is successfully applied to real and ten simulated datasets, with comparisons with some of the existing Bayesian methods, hidden Markov model methods, and sliding window methods. The numerical results

  4. Bayesian modeling of ChIP-chip data using latent variables.

    KAUST Repository

    Wu, Mingqi

    2009-10-26

    BACKGROUND: The ChIP-chip technology has been used in a wide range of biomedical studies, such as identification of human transcription factor binding sites, investigation of DNA methylation, and investigation of histone modifications in animals and plants. Various methods have been proposed in the literature for analyzing the ChIP-chip data, such as the sliding window methods, the hidden Markov model-based methods, and Bayesian methods. Although, due to the integrated consideration of uncertainty of the models and model parameters, Bayesian methods can potentially work better than the other two classes of methods, the existing Bayesian methods do not perform satisfactorily. They usually require multiple replicates or some extra experimental information to parametrize the model, and long CPU time due to involving of MCMC simulations. RESULTS: In this paper, we propose a Bayesian latent model for the ChIP-chip data. The new model mainly differs from the existing Bayesian models, such as the joint deconvolution model, the hierarchical gamma mixture model, and the Bayesian hierarchical model, in two respects. Firstly, it works on the difference between the averaged treatment and control samples. This enables the use of a simple model for the data, which avoids the probe-specific effect and the sample (control/treatment) effect. As a consequence, this enables an efficient MCMC simulation of the posterior distribution of the model, and also makes the model more robust to the outliers. Secondly, it models the neighboring dependence of probes by introducing a latent indicator vector. A truncated Poisson prior distribution is assumed for the latent indicator variable, with the rationale being justified at length. CONCLUSION: The Bayesian latent method is successfully applied to real and ten simulated datasets, with comparisons with some of the existing Bayesian methods, hidden Markov model methods, and sliding window methods. The numerical results indicate that the

  5. A Bayesian approach shows no correlation between transit-depth and stellar metallicity for confirmed and candidates Kepler gas giants planets

    International Nuclear Information System (INIS)

    Nehmé, C; Sarkis, P

    2017-01-01

    Previous study to investigate the correlation between the transit depth and the stellar metallicity of Kepler’s (Q1-Q12) gas giant planets (radii of 5-20R ⊙ ) has led to a weakly significant negative correlation. We use the cumulative catalog of planets detected by the NASA Kepler mission Q1-Q17 catalog, as of April 2015, to perform a solid statistical analysis of this correlation. In the present work, we revise this correlation, within a Bayesian framework, for two large samples: sample A confirmed planets and sample B (confirmed + candidates). We expand a hierarchical method to account for false positives in the studied samples. Our statistical analysis reveals no correlation between the transit depth and the stellar metallicity. This has implications for planet formation theory and interior structure of giant planets. (paper)

  6. Hierarchically Structured Electrospun Fibers

    Directory of Open Access Journals (Sweden)

    Nicole E. Zander

    2013-01-01

    Full Text Available Traditional electrospun nanofibers have a myriad of applications ranging from scaffolds for tissue engineering to components of biosensors and energy harvesting devices. The generally smooth one-dimensional structure of the fibers has stood as a limitation to several interesting novel applications. Control of fiber diameter, porosity and collector geometry will be briefly discussed, as will more traditional methods for controlling fiber morphology and fiber mat architecture. The remainder of the review will focus on new techniques to prepare hierarchically structured fibers. Fibers with hierarchical primary structures—including helical, buckled, and beads-on-a-string fibers, as well as fibers with secondary structures, such as nanopores, nanopillars, nanorods, and internally structured fibers and their applications—will be discussed. These new materials with helical/buckled morphology are expected to possess unique optical and mechanical properties with possible applications for negative refractive index materials, highly stretchable/high-tensile-strength materials, and components in microelectromechanical devices. Core-shell type fibers enable a much wider variety of materials to be electrospun and are expected to be widely applied in the sensing, drug delivery/controlled release fields, and in the encapsulation of live cells for biological applications. Materials with a hierarchical secondary structure are expected to provide new superhydrophobic and self-cleaning materials.

  7. Bayesian coronal seismology

    Science.gov (United States)

    Arregui, Iñigo

    2018-01-01

    In contrast to the situation in a laboratory, the study of the solar atmosphere has to be pursued without direct access to the physical conditions of interest. Information is therefore incomplete and uncertain and inference methods need to be employed to diagnose the physical conditions and processes. One of such methods, solar atmospheric seismology, makes use of observed and theoretically predicted properties of waves to infer plasma and magnetic field properties. A recent development in solar atmospheric seismology consists in the use of inversion and model comparison methods based on Bayesian analysis. In this paper, the philosophy and methodology of Bayesian analysis are first explained. Then, we provide an account of what has been achieved so far from the application of these techniques to solar atmospheric seismology and a prospect of possible future extensions.

  8. Bayesian community detection.

    Science.gov (United States)

    Mørup, Morten; Schmidt, Mikkel N

    2012-09-01

    Many networks of scientific interest naturally decompose into clusters or communities with comparatively fewer external than internal links; however, current Bayesian models of network communities do not exert this intuitive notion of communities. We formulate a nonparametric Bayesian model for community detection consistent with an intuitive definition of communities and present a Markov chain Monte Carlo procedure for inferring the community structure. A Matlab toolbox with the proposed inference procedure is available for download. On synthetic and real networks, our model detects communities consistent with ground truth, and on real networks, it outperforms existing approaches in predicting missing links. This suggests that community structure is an important structural property of networks that should be explicitly modeled.

  9. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  10. Bayesian Hypothesis Testing

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, Stephen A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sigeti, David E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-11-15

    This is a set of slides about Bayesian hypothesis testing, where many hypotheses are tested. The conclusions are the following: The value of the Bayes factor obtained when using the median of the posterior marginal is almost the minimum value of the Bayes factor. The value of τ² which minimizes the Bayes factor is a reasonable choice for this parameter. This allows a likelihood ratio to be computed which is the least favorable to H0.
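
    A generic version of the quantity discussed in the slides can be sketched for the textbook case of a normal mean with known variance, H0: mu = 0 versus H1: mu ~ N(0, tau^2). The closed-form Bayes factor below is scanned over tau^2 to locate the value least favorable to H0; the slides' exact model and multiplicity treatment may differ.

```python
import numpy as np
from scipy.stats import norm

def bf01(ybar, n, sigma, tau):
    """Bayes factor for H0: mu = 0 vs H1: mu ~ N(0, tau^2),
    given sample mean ybar from n observations with known sdev sigma."""
    se2 = sigma**2 / n
    m0 = norm.pdf(ybar, loc=0.0, scale=np.sqrt(se2))           # marginal under H0
    m1 = norm.pdf(ybar, loc=0.0, scale=np.sqrt(se2 + tau**2))  # marginal under H1
    return m0 / m1

# Scan tau^2: the Bayes factor has a minimum in tau^2, i.e. a value of tau^2
# that is least favourable to H0 (analogous to the quantity the slides discuss).
ybar, n, sigma = 0.4, 25, 1.0
taus2 = np.linspace(0.001, 2.0, 400)
bfs = np.array([bf01(ybar, n, sigma, np.sqrt(t2)) for t2 in taus2])
print("minimum BF01 =", bfs.min().round(3), "at tau^2 =", taus2[bfs.argmin()].round(3))
```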

  11. Subjective Bayesian Beliefs

    DEFF Research Database (Denmark)

    Antoniou, Constantinos; Harrison, Glenn W.; Lau, Morten I.

    2015-01-01

    A large literature suggests that many individuals do not apply Bayes’ Rule when making decisions that depend on them correctly pooling prior information and sample data. We replicate and extend a classic experimental study of Bayesian updating from psychology, employing the methods of experimental economics, with careful controls for the confounding effects of risk aversion. Our results show that risk aversion significantly alters inferences on deviations from Bayes’ Rule....

  12. Approximate Bayesian recursive estimation

    Czech Academy of Sciences Publication Activity Database

    Kárný, Miroslav

    2014-01-01

    Roč. 285, č. 1 (2014), s. 100-111 ISSN 0020-0255 R&D Projects: GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords : Approximate parameter estimation * Bayesian recursive estimation * Kullback–Leibler divergence * Forgetting Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 4.038, year: 2014 http://library.utia.cas.cz/separaty/2014/AS/karny-0425539.pdf
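
    The pairing of recursive Bayesian estimation with forgetting, named in the keywords above, can be illustrated with a conjugate normal model in which the previous posterior is flattened by a factor lambda before each update, so that old data are gradually discounted. The sketch below tracks an abruptly changing mean; it illustrates the general idea only, not the paper's Kullback-Leibler-based approximation.

```python
import numpy as np

rng = np.random.default_rng(7)

s2 = 0.25        # known observation noise variance
lam = 0.95       # forgetting factor: < 1 discounts older observations
m, v = 0.0, 10.0 # prior mean and variance for the tracked parameter

true_theta = 0.0
estimates = []
for t in range(300):
    if t == 150:
        true_theta = 2.0                     # abrupt change the estimator must track
    y = true_theta + np.sqrt(s2) * rng.normal()

    v = v / lam                              # forgetting: flatten the current prior
    post_prec = 1.0 / v + 1.0 / s2           # conjugate normal update
    m = (m / v + y / s2) / post_prec
    v = 1.0 / post_prec
    estimates.append(m)

print(round(estimates[149], 2), round(estimates[-1], 2))  # estimate in both regimes
```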

  13. Urban pattern: Layout design by hierarchical domain splitting

    KAUST Repository

    Yang, Yongliang

    2013-11-06

    We present a framework for generating street networks and parcel layouts. Our goal is the generation of high-quality layouts that can be used for urban planning and virtual environments. We propose a solution based on hierarchical domain splitting using two splitting types: streamline-based splitting, which splits a region along one or multiple streamlines of a cross field, and template-based splitting, which warps pre-designed templates to a region and uses the interior geometry of the template as the splitting lines. We combine these two splitting approaches into a hierarchical framework, providing automatic and interactive tools to explore the design space.

  14. A Bayesian meta-analytic approach for safety signal detection in randomized clinical trials.

    Science.gov (United States)

    Odani, Motoi; Fukimbara, Satoru; Sato, Tosiya

    2017-04-01

    Meta-analyses are frequently performed on adverse event data and are primarily used for improving statistical power to detect safety signals. However, in the evaluation of drug safety for New Drug Applications, simple pooling of adverse event data from multiple clinical trials is still commonly used. We sought to propose a new Bayesian hierarchical meta-analytic approach based on consideration of a hierarchical structure of reported individual adverse event data from multiple randomized clinical trials. To develop our meta-analysis model, we extended an existing three-stage Bayesian hierarchical model by including an additional stage of the clinical trial level in the hierarchical model; this generated a four-stage Bayesian hierarchical model. We applied the proposed Bayesian meta-analysis models to published adverse event data from three premarketing randomized clinical trials of tadalafil and to a simulation study motivated by the case example to evaluate the characteristics of three alternative models. Comparison of the results from the Bayesian meta-analysis model with those from Fisher's exact test after simple pooling showed that 6 out of 10 adverse events were the same within a top 10 ranking of individual adverse events with regard to association with treatment. However, more individual adverse events were detected in the Bayesian meta-analysis model than in Fisher's exact test under the body system "Musculoskeletal and connective tissue disorders." Moreover, comparison of the overall trend of estimates between the Bayesian model and the standard approach (odds ratios after simple pooling methods) revealed that the posterior median odds ratios for the Bayesian model for most adverse events shrank toward values for no association. Based on the simulation results, the Bayesian meta-analysis model could balance the false detection rate and power to a better extent than Fisher's exact test. For example, when the threshold value of the posterior probability for

  15. Bayesian Reasoning Using 3D Relations for Lane Marker Detection

    DEFF Research Database (Denmark)

    Boesman, Bart; Jensen, Lars Baunegaard With; Baseski, Emre

    2009-01-01

    We introduce a lane marker detection algorithm that integrates 3D attributes as well as 3D relations between local edges and semi-global contours in a Bayesian framework. The algorithm is parameter free and does not make use of any heuristic assumptions. The reasoning is based on the complete...

  16. Applications of Bayesian decision theory to sequential mastery testing

    NARCIS (Netherlands)

    Vos, Hendrik J.

    1999-01-01

    The purpose of this paper is to formulate optimal sequential rules for mastery tests. The framework for the approach is derived from Bayesian sequential decision theory. Both a threshold and linear loss structure are considered. The binomial probability distribution is adopted as the psychometric
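
    A stripped-down version of such a rule can be written with a Beta posterior over ability, a threshold loss for the two classification errors, a per-item cost, and a one-step lookahead for the option of administering another item. All constants and the myopic lookahead below are illustrative placeholders rather than the paper's derivation, which uses full backward induction over the remaining items.

```python
from scipy.stats import beta

def classification_risks(a, b, theta0, loss_fm, loss_fn):
    """Expected losses of declaring 'master' or 'non-master' under Beta(a, b)."""
    p_below = beta.cdf(theta0, a, b)          # P(theta < theta0 | data)
    return loss_fm * p_below, loss_fn * (1.0 - p_below)

def mastery_decision(successes, trials, theta0=0.7,
                     loss_fm=1.0, loss_fn=1.0, item_cost=0.02,
                     a0=1.0, b0=1.0):
    """One-step-lookahead threshold-loss decision with a Beta posterior.
    Loss constants and the single-step lookahead are illustrative placeholders."""
    a, b = a0 + successes, b0 + trials - successes
    r_master, r_nonmaster = classification_risks(a, b, theta0, loss_fm, loss_fn)

    # Expected risk of giving one more item: average over the posterior predictive
    # probability of a correct response, then decide optimally afterwards.
    p_correct = a / (a + b)
    r_if_correct = min(classification_risks(a + 1, b, theta0, loss_fm, loss_fn))
    r_if_wrong = min(classification_risks(a, b + 1, theta0, loss_fm, loss_fn))
    r_continue = item_cost + p_correct * r_if_correct + (1 - p_correct) * r_if_wrong

    risks = {"master": r_master, "non-master": r_nonmaster, "continue": r_continue}
    return min(risks, key=risks.get), {k: round(v, 3) for k, v in risks.items()}

print(mastery_decision(successes=9, trials=10))
print(mastery_decision(successes=6, trials=10))
```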

  17. Relating mesocarnivore relative abundance to anthropogenic land-use with a hierarchical spatial count model

    Science.gov (United States)

    Crimmins, Shawn M.; Walleser, Liza R.; Hertel, Dan R.; McKann, Patrick C.; Rohweder, Jason J.; Thogmartin, Wayne E.

    2016-01-01

    There is growing need to develop models of spatial patterns in animal abundance, yet comparatively few examples of such models exist. This is especially true in situations where the abundance of one species may inhibit that of another, such as the intensively-farmed landscape of the Prairie Pothole Region (PPR) of the central United States, where waterfowl production is largely constrained by mesocarnivore nest predation. We used a hierarchical Bayesian approach to relate the distribution of various land-cover types to the relative abundances of four mesocarnivores in the PPR: coyote Canis latrans, raccoon Procyon lotor, red fox Vulpes vulpes, and striped skunk Mephitis mephitis. We developed models for each species at multiple spatial resolutions (41.4 km2, 10.4 km2, and 2.6 km2) to address different ecological and management-related questions. Model results for each species were similar irrespective of resolution. We found that the amount of row-crop agriculture was nearly ubiquitous in our best models, exhibiting a positive relationship with relative abundance for each species. The amount of native grassland land-cover was positively associated with coyote and raccoon relative abundance, but generally absent from models for red fox and skunk. Red fox and skunk were positively associated with each other, suggesting potential niche overlap. We found no evidence that coyote abundance limited that of other mesocarnivore species, as might be expected under a hypothesis of mesopredator release. The relationships between relative abundance and land-cover types were similar across spatial resolutions. Our results indicated that mesocarnivores in the PPR are most likely to occur in portions of the landscape with large amounts of agricultural land-cover. Further, our results indicated that track-survey data can be used in a hierarchical framework to gain inferences regarding spatial patterns in animal relative abundance.

  18. Trans-Dimensional Bayesian Imaging of 3-D Crustal and Upper Mantle Structure in Northeast Asia

    Science.gov (United States)

    Kim, S.; Tkalcic, H.; Rhie, J.; Chen, Y.

    2016-12-01

    Imaging 3-D structures using stepwise inversions of ambient noise and receiver function data is now routine work. Here, we carry out the inversion in the trans-dimensional and hierarchical extension of the Bayesian framework to obtain rigorous estimates of uncertainty and high-resolution images of crustal and upper mantle structures beneath Northeast (NE) Asia. The methods inherently account for data sensitivities by using adaptive parameterizations and treating data noise as free parameters. The resulting parsimonious models therefore balance model complexity against data fit, which allows the data information to be fully exploited, prevents over- or under-fitting, and increases model resolution. In addition, the reliability of results is more rigorously checked through the use of Bayesian uncertainties. Various synthetic recovery tests show that complex and spatially variable features are well resolved in our resulting images of NE Asia. Rayleigh wave phase and group velocity tomograms (8-70 s), a 3-D shear-wave velocity model from depth inversions of the estimated dispersion maps, and regional 3-D models (NE China, the Korean Peninsula, and the Japanese islands) from joint inversions with receiver function data of dense networks are presented. The high-resolution models are characterized by a number of tectonically meaningful features. We focus our interpretation on complex patterns of sub-lithospheric low-velocity structures that extend from back-arc regions to continental margins. We interpret these anomalies in conjunction with the distal and distributed intraplate volcanoes of NE Asia. Further discussion of other imaged features will be presented.

  19. Bayesian models for astrophysical data using R, JAGS, Python, and Stan

    CERN Document Server

    Hilbe, Joseph M; Ishida, Emille E O

    2017-01-01

    This comprehensive guide to Bayesian methods in astronomy enables hands-on work by supplying complete R, JAGS, Python, and Stan code, to use directly or to adapt. It begins by examining the normal model from both frequentist and Bayesian perspectives and then progresses to a full range of Bayesian generalized linear and mixed or hierarchical models, as well as additional types of models such as ABC and INLA. The book provides code that is largely unavailable elsewhere and includes details on interpreting and evaluating Bayesian models. Initial discussions offer models in synthetic form so that readers can easily adapt them to their own data; later the models are applied to real astronomical data. The consistent focus is on hands-on modeling, analysis of data, and interpretations that address scientific questions. A must-have for astronomers, its concrete approach will also be attractive to researchers in the sciences more generally.

  20. Bayesian analysis in plant pathology.

    Science.gov (United States)

    Mila, A L; Carriquiry, A L

    2004-09-01

    ABSTRACT Bayesian methods are currently much discussed and applied in several disciplines from molecular biology to engineering. Bayesian inference is the process of fitting a probability model to a set of data and summarizing the results via probability distributions on the parameters of the model and unobserved quantities such as predictions for new observations. In this paper, after a short introduction of Bayesian inference, we present the basic features of Bayesian methodology using examples from sequencing genomic fragments and analyzing microarray gene-expressing levels, reconstructing disease maps, and designing experiments.

  1. Reconstruction of late Holocene climate based on tree growth and mechanistic hierarchical models

    Science.gov (United States)

    Tipton, John; Hooten, Mevin B.; Pederson, Neil; Tingley, Martin; Bishop, Daniel

    2016-01-01

    Reconstruction of pre-instrumental, late Holocene climate is important for understanding how climate has changed in the past and how climate might change in the future. Statistical prediction of paleoclimate from tree ring widths is challenging because tree ring widths are a one-dimensional summary of annual growth that represents a multi-dimensional set of climatic and biotic influences. We develop a Bayesian hierarchical framework using a nonlinear, biologically motivated tree ring growth model to jointly reconstruct temperature and precipitation in the Hudson Valley, New York. Using a common growth function to describe the response of a tree to climate, we allow for species-specific parameterizations of the growth response. To enable predictive backcasts, we model the climate variables with a vector autoregressive process on an annual timescale coupled with a multivariate conditional autoregressive process that accounts for temporal correlation and cross-correlation between temperature and precipitation on a monthly scale. Our multi-scale temporal model allows for flexibility in the climate response through time at different temporal scales and predicts reasonable climate scenarios given tree ring width data.

  2. A hierarchical model for estimating density in camera-trap studies

    Science.gov (United States)

    Royle, J. Andrew; Nichols, James D.; Karanth, K.Ullas; Gopalaswamy, Arjun M.

    2009-01-01

    Estimating animal density using capture–recapture data from arrays of detection devices such as camera traps has been problematic due to the movement of individuals and heterogeneity in capture probability among them induced by differential exposure to trapping. We develop a spatial capture–recapture model for estimating density from camera-trapping data which contains explicit models for the spatial point process governing the distribution of individuals and their exposure to and detection by traps. We adopt a Bayesian approach to analysis of the hierarchical model using the technique of data augmentation. The model is applied to photographic capture–recapture data on tigers Panthera tigris in Nagarahole reserve, India. Using this model, we estimate the density of tigers to be 14·3 animals per 100 km2 during 2004. Synthesis and applications. Our modelling framework largely overcomes several weaknesses in conventional approaches to the estimation of animal density from trap arrays. It effectively deals with key problems such as individual heterogeneity in capture probabilities, movement of traps, presence of potential ‘holes’ in the array and ad hoc estimation of sample area. The formulation, thus, greatly enhances flexibility in the conduct of field surveys as well as in the analysis of data, from studies that may involve physical, photographic or DNA-based ‘captures’ of individual animals.

  3. Context updates are hierarchical

    Directory of Open Access Journals (Sweden)

    Anton Karl Ingason

    2016-10-01

    Full Text Available This squib studies the order in which elements are added to the shared context of interlocutors in a conversation. It focuses on context updates within one hierarchical structure and argues that structurally higher elements are entered into the context before lower elements, even if the structurally higher elements are pronounced after the lower elements. The crucial data are drawn from a comparison of relative clauses in two head-initial languages, English and Icelandic, and two head-final languages, Korean and Japanese. The findings have consequences for any theory of a dynamic semantics.

  4. How to make more out of community data? A conceptual framework and its implementation as models and software.

    Science.gov (United States)

    Ovaskainen, Otso; Tikhonov, Gleb; Norberg, Anna; Guillaume Blanchet, F; Duan, Leo; Dunson, David; Roslin, Tomas; Abrego, Nerea

    2017-05-01

    Community ecology aims to understand what factors determine the assembly and dynamics of species assemblages at different spatiotemporal scales. To facilitate the integration between conceptual and statistical approaches in community ecology, we propose Hierarchical Modelling of Species Communities (HMSC) as a general, flexible framework for modern analysis of community data. While non-manipulative data allow for only correlative and not causal inference, this framework facilitates the formulation of data-driven hypotheses regarding the processes that structure communities. We model environmental filtering by variation and covariation in the responses of individual species to the characteristics of their environment, with potential contingencies on species traits and phylogenetic relationships. We capture biotic assembly rules by species-to-species association matrices, which may be estimated at multiple spatial or temporal scales. We operationalise the HMSC framework as a hierarchical Bayesian joint species distribution model, and implement it as R- and Matlab-packages which enable computationally efficient analyses of large data sets. Armed with this tool, community ecologists can make sense of many types of data, including spatially explicit data and time-series data. We illustrate the use of this framework through a series of diverse ecological examples. © 2017 The Authors. Ecology Letters published by CNRS and John Wiley & Sons Ltd.

  5. Sparse Bayesian Learning for DOA Estimation with Mutual Coupling

    Directory of Open Access Journals (Sweden)

    Jisheng Dai

    2015-10-01

    Full Text Available Sparse Bayesian learning (SBL has given renewed interest to the problem of direction-of-arrival (DOA estimation. It is generally assumed that the measurement matrix in SBL is precisely known. Unfortunately, this assumption may be invalid in practice due to the imperfect manifold caused by unknown or misspecified mutual coupling. This paper describes a modified SBL method for joint estimation of DOAs and mutual coupling coefficients with uniform linear arrays (ULAs. Unlike the existing method that only uses stationary priors, our new approach utilizes a hierarchical form of the Student t prior to enforce the sparsity of the unknown signal more heavily. We also provide a distinct Bayesian inference for the expectation-maximization (EM algorithm, which can update the mutual coupling coefficients more efficiently. Another difference is that our method uses an additional singular value decomposition (SVD to reduce the computational complexity of the signal reconstruction process and the sensitivity to the measurement noise.

  6. A Study of Tacit Knowledge Transfer Based on Complex Networks Technology in Hierarchical Organizations

    Science.gov (United States)

    Cheng, Tingting; Wang, Hengshan; Wang, Lubang

    In reality, most economic entities are hierarchical organizations, but within them tacit knowledge can be transferred across different hierarchies and even across different departments. Using complex networks technology, this paper models a hierarchical organization’s framework. By quantifying a number of technical measures, we analyze the transfer distance and the optimal tacit knowledge transfer path in hierarchical networks.

  7. Bayesian tomographic reconstruction of microsystems

    International Nuclear Information System (INIS)

    Salem, Sofia Fekih; Vabre, Alexandre; Mohammad-Djafari, Ali

    2007-01-01

    X-ray transmission microtomography plays an increasingly dominant role in the study and understanding of microsystems. Within this framework, an experimental setup for high-resolution X-ray microtomography was developed at CEA-List to quantify the physical parameters related to fluid flow in microsystems. Several difficulties arise from the nature of the experimental data collected on this setup: increased measurement error due to various physical phenomena occurring during image formation (diffusion, beam hardening), and specificities of the setup (limited angle, partial view of the object, weak contrast). To reconstruct the object we must solve an inverse problem, which is known to be ill-posed and therefore needs to be regularized by introducing prior information. The main prior information we account for is that the object is composed of a finite, known number of different materials distributed in compact regions. This a priori information is introduced via a Gauss-Markov field for the contrast distributions with a hidden Potts-Markov field for the material classes in the Bayesian estimation framework. The computations are done using an appropriate Markov Chain Monte Carlo (MCMC) technique. In this paper, we first present the basic steps of the proposed algorithms. We then focus on one of the main steps in any iterative reconstruction method, the computation of the forward and adjoint operators (projection and backprojection). A fast implementation of these two operators is crucial for the real application of the method. We give some details on the fast computation of these steps and show some preliminary simulation results.

  8. Applied Bayesian modelling

    CERN Document Server

    Congdon, Peter

    2014-01-01

    This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBUGS.

  9. Bayesian nonparametric data analysis

    CERN Document Server

    Müller, Peter; Jara, Alejandro; Hanson, Tim

    2015-01-01

    This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.

  10. Nested and Hierarchical Archimax copulas

    KAUST Repository

    Hofert, Marius

    2017-07-03

    The class of Archimax copulas is generalized to nested and hierarchical Archimax copulas in several ways. First, nested extreme-value copulas or nested stable tail dependence functions are introduced to construct nested Archimax copulas based on a single frailty variable. Second, a hierarchical construction of d-norm generators is presented to construct hierarchical stable tail dependence functions and thus hierarchical extreme-value copulas. Moreover, one can, either on its own or in addition, introduce nested frailties to extend Archimax copulas to nested Archimax copulas, in a similar way as nested Archimedean copulas extend Archimedean copulas. Further results include a general formula for the density of Archimax copulas.
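    For orientation, the base object being generalized here can be written compactly: a d-variate Archimax copula couples an Archimedean generator ψ with a stable tail dependence function ℓ. The standard form (recalled here from the general literature, not quoted from the entry above) is

        % Archimax copula built from an Archimedean generator \psi and a
        % stable tail dependence function \ell
        C(u_1,\dots,u_d) \;=\; \psi\!\bigl(\ell\bigl(\psi^{-1}(u_1),\dots,\psi^{-1}(u_d)\bigr)\bigr),
        \qquad (u_1,\dots,u_d) \in [0,1]^d .

    Nesting replaces ψ or ℓ (or the underlying frailty) by hierarchical counterparts, which is what the constructions described in the entry formalize.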

  11. Hierarchical materials: Background and perspectives

    DEFF Research Database (Denmark)

    2016-01-01

    Hierarchical design draws inspiration from the analysis of biological materials and has opened new possibilities for enhancing performance and enabling new functionalities and extraordinary properties. With the development of nanotechnology, the necessary technological requirements for the manufacturing of hierarchical materials are advancing at a fast pace, opening new challenges and opportunities. This article presents an overview of possible applications of and perspectives on hierarchical materials.

  12. Bayesian Spatiotemporal Pattern and Eco-climatological Drivers of Striped Skunk Rabies in the North Central Plains.

    Science.gov (United States)

    Raghavan, Ram K; Hanlon, Cathleen A; Goodin, Douglas G; Davis, Rolan; Moore, Michael; Moore, Susan; Anderson, Gary A

    2016-04-01

    Striped skunks are one of the most important terrestrial reservoirs of rabies virus in North America, and yet the prevalence of rabies among this host is only passively monitored and the disease among this host remains largely unmanaged. Oral vaccination campaigns have not efficiently targeted striped skunks, while periodic spillovers of striped skunk variant viruses to other animals, including some domestic animals, are routinely recorded. In this study we evaluated the spatial and spatio-temporal patterns of infection status among striped skunk cases submitted for rabies testing in the North Central Plains of US in a Bayesian hierarchical framework, and also evaluated potential eco-climatological drivers of such patterns. Two Bayesian hierarchical models were fitted to point-referenced striped skunk rabies cases [n = 656 (negative), and n = 310 (positive)] received at a leading rabies diagnostic facility between the years 2007-2013. The first model included only spatial and temporal terms and a second covariate model included additional covariates representing eco-climatic conditions within a 4 km(2) home-range area for striped skunks. The better performing covariate model indicated the presence of significant spatial and temporal trends in the dataset and identified higher amounts of land covered by low-intensity developed areas [Odds ratio (OR) = 3.41; 95% Bayesian Credible Intervals (CrI) = 2.08, 3.85], higher level of patch fragmentation (OR = 1.70; 95% CrI = 1.25, 2.89), and diurnal temperature range (OR = 0.54; 95% CrI = 0.27, 0.91) to be important drivers of striped skunk rabies incidence in the study area. Model validation statistics indicated satisfactory performance for both models; however, the covariate model fared better. The findings of this study are important in the context of rabies management among striped skunks in North America, and the relevance of physical and climatological factors as risk factors for skunk to human rabies transmission and

  13. Bayesian Spatiotemporal Pattern and Eco-climatological Drivers of Striped Skunk Rabies in the North Central Plains.

    Directory of Open Access Journals (Sweden)

    Ram K Raghavan

    2016-04-01

    Full Text Available Striped skunks are one of the most important terrestrial reservoirs of rabies virus in North America, and yet the prevalence of rabies among this host is only passively monitored and the disease among this host remains largely unmanaged. Oral vaccination campaigns have not efficiently targeted striped skunks, while periodic spillovers of striped skunk variant viruses to other animals, including some domestic animals, are routinely recorded. In this study we evaluated the spatial and spatio-temporal patterns of infection status among striped skunk cases submitted for rabies testing in the North Central Plains of US in a Bayesian hierarchical framework, and also evaluated potential eco-climatological drivers of such patterns. Two Bayesian hierarchical models were fitted to point-referenced striped skunk rabies cases [n = 656 (negative), and n = 310 (positive)] received at a leading rabies diagnostic facility between the years 2007-2013. The first model included only spatial and temporal terms and a second covariate model included additional covariates representing eco-climatic conditions within a 4 km(2) home-range area for striped skunks. The better performing covariate model indicated the presence of significant spatial and temporal trends in the dataset and identified higher amounts of land covered by low-intensity developed areas [Odds ratio (OR) = 3.41; 95% Bayesian Credible Intervals (CrI) = 2.08, 3.85], higher level of patch fragmentation (OR = 1.70; 95% CrI = 1.25, 2.89), and diurnal temperature range (OR = 0.54; 95% CrI = 0.27, 0.91) to be important drivers of striped skunk rabies incidence in the study area. Model validation statistics indicated satisfactory performance for both models; however, the covariate model fared better. The findings of this study are important in the context of rabies management among striped skunks in North America, and the relevance of physical and climatological factors as risk factors for skunk to human rabies

  14. A hierarchical model combining distance sampling and time removal to estimate detection probability during avian point counts

    Science.gov (United States)

    Amundson, Courtney L.; Royle, J. Andrew; Handel, Colleen M.

    2014-01-01

    Imperfect detection during animal surveys biases estimates of abundance and can lead to improper conclusions regarding distribution and population trends. Farnsworth et al. (2005) developed a combined distance-sampling and time-removal model for point-transect surveys that addresses both availability (the probability that an animal is available for detection; e.g., that a bird sings) and perceptibility (the probability that an observer detects an animal, given that it is available for detection). We developed a hierarchical extension of the combined model that provides an integrated analysis framework for a collection of survey points at which both distance from the observer and time of initial detection are recorded. Implemented in a Bayesian framework, this extension facilitates evaluating covariates on abundance and detection probability, incorporating excess zero counts (i.e. zero-inflation), accounting for spatial autocorrelation, and estimating population density. Species-specific characteristics, such as behavioral displays and territorial dispersion, may lead to different patterns of availability and perceptibility, which may, in turn, influence the performance of such hierarchical models. Therefore, we first test our proposed model using simulated data under different scenarios of availability and perceptibility. We then illustrate its performance with empirical point-transect data for a songbird that consistently produces loud, frequent, primarily auditory signals, the Golden-crowned Sparrow (Zonotrichia atricapilla); and for 2 ptarmigan species (Lagopus spp.) that produce more intermittent, subtle, and primarily visual cues. Data were collected by multiple observers along point transects across a broad landscape in southwest Alaska, so we evaluated point-level covariates on perceptibility (observer and habitat), availability (date within season and time of day), and abundance (habitat, elevation, and slope), and included a nested point
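    To make the two detection components above concrete, a toy decomposition of overall detection probability is availability (the bird gives a cue during the count) times perceptibility (the observer detects it given its distance). The functional forms and parameter values below are assumptions for illustration, not quantities fitted in the study:

        # Toy sketch: detection probability = availability x perceptibility.
        import numpy as np

        def availability(cue_rate_per_min, count_minutes):
            """P(at least one cue) under a Poisson cueing process."""
            return 1.0 - np.exp(-cue_rate_per_min * count_minutes)

        def perceptibility(distance_m, sigma_m):
            """Half-normal distance-detection function."""
            return np.exp(-distance_m ** 2 / (2.0 * sigma_m ** 2))

        def detection_prob(distance_m, cue_rate_per_min=0.5, count_minutes=10.0, sigma_m=80.0):
            return availability(cue_rate_per_min, count_minutes) * perceptibility(distance_m, sigma_m)

        print(detection_prob(np.array([25.0, 100.0, 200.0])))

    In the hierarchical version, covariates enter these two components (and abundance) on appropriate link scales, as the record describes.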

  15. Bayesian Analysis of Hot Jupiter Radius Anomalies Points to Ohmic Dissipation

    Science.gov (United States)

    Thorngren, Daniel; Fortney, Jonathan

    2018-01-01

    The cause of the unexpectedly large radii of hot Jupiters has been the subject of many hypotheses over the past 15 years and is one of the long-standing open issues in exoplanetary physics. In our work, we seek to examine the population of 300 hot Jupiters to identify a model that best explains their radii. Using a hierarchical Bayesian framework, we match structure evolution models to the observed giant planets’ masses, radii, and ages, with a prior for bulk composition based on the mass from Thorngren et al. (2016). We consider various models for the relationship between heating efficiency (the fraction of flux absorbed into the interior) and incident flux. For the first time, we are able to derive this heating efficiency as a function of planetary T_eq. Models in which the heating efficiency decreases at the higher temperatures (above ~1600 K) are strongly and statistically significantly preferred. Of the published models for the radius anomaly, only the Ohmic dissipation model predicts this feature, which it explains as being the result of magnetic drag reducing atmospheric wind speeds. We interpret our results as evidence in favor of the Ohmic dissipation model.

  16. Bayesian inference in camera trapping studies for a class of spatial capture-recapture models

    Science.gov (United States)

    Royle, J. Andrew; Karanth, K. Ullas; Gopalaswamy, Arjun M.; Kumar, N. Samba

    2009-01-01

    We develop a class of models for inference about abundance or density using spatial capture-recapture data from studies based on camera trapping and related methods. The model is a hierarchical model composed of two components: a point process model describing the distribution of individuals in space (or their home range centers) and a model describing the observation of individuals in traps. We suppose that trap- and individual-specific capture probabilities are a function of distance between individual home range centers and trap locations. We show that the models can be regarded as generalized linear mixed models, where the individual home range centers are random effects. We adopt a Bayesian framework for inference under these models using a formulation based on data augmentation. We apply the models to camera trapping data on tigers from the Nagarahole Reserve, India, collected over 48 nights in 2006. For this study, 120 camera locations were used, but cameras were only operational at 30 locations during any given sample occasion. Movement of traps is common in many camera-trapping studies and represents an important feature of the observation model that we address explicitly in our application.
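    The observation model summarized above hinges on capture probability decaying with the distance between an individual's latent activity centre and a trap. A minimal sketch of a half-normal encounter model (the baseline rate p0 and scale sigma are invented values, not estimates from the tiger data):

        # Sketch of a spatial capture-recapture encounter model.
        import numpy as np

        def capture_prob(activity_centres, trap_locations, p0=0.2, sigma=1.5):
            """activity_centres: (N, 2); trap_locations: (J, 2); returns (N, J) probabilities."""
            diff = activity_centres[:, None, :] - trap_locations[None, :, :]
            d2 = (diff ** 2).sum(axis=-1)                  # squared distances
            return p0 * np.exp(-d2 / (2.0 * sigma ** 2))   # half-normal decay

        centres = np.array([[0.0, 0.0], [3.0, 1.0]])
        traps = np.array([[0.0, 1.0], [2.0, 2.0], [5.0, 0.0]])
        print(capture_prob(centres, traps))

    Encounter histories would then be modelled as Bernoulli or binomial outcomes with these probabilities, with the activity centres treated as latent random effects and estimated via data augmentation, as the record describes.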

  17. Predicting bison migration out of Yellowstone National Park using bayesian models.

    Directory of Open Access Journals (Sweden)

    Chris Geremia

    Full Text Available Long distance migrations by ungulate species often surpass the boundaries of preservation areas where conflicts with various publics lead to management actions that can threaten populations. We chose the partially migratory bison (Bison bison) population in Yellowstone National Park as an example of integrating science into management policies to better conserve migratory ungulates. Approximately 60% of these bison have been exposed to bovine brucellosis and thousands of migrants exiting the park boundary have been culled during the past two decades to reduce the risk of disease transmission to cattle. Data were assimilated using models representing competing hypotheses of bison migration during 1990-2009 in a hierarchical Bayesian framework. Migration differed at the scale of herds, but a single unifying logistic model was useful for predicting migrations by both herds. Migration beyond the northern park boundary was affected by herd size, accumulated snow water equivalent, and aboveground dried biomass. Migration beyond the western park boundary was less influenced by these predictors, and process model performance suggested that an important control on recent migrations was excluded. Simulations of migrations over the next decade suggest that allowing increased numbers of bison beyond park boundaries during severe climate conditions may be the only means of avoiding episodic, large-scale reductions to the Yellowstone bison population in the foreseeable future. This research is an example of how long distance migration dynamics can be incorporated into improved management policies.
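    Since the unifying model reported above is logistic in herd size, accumulated snow water equivalent, and aboveground dried biomass, a hedged one-function sketch of such a predictor looks as follows (coefficient values are invented for illustration, not the fitted estimates):

        # Sketch of a logistic migration model: P(migrate beyond the boundary).
        # Coefficients are invented for illustration only.
        import numpy as np
        from scipy.special import expit

        def migration_prob(herd_size, swe_cm, biomass_kg_ha,
                           beta=(-2.0, 0.0008, 0.05, -0.001)):
            b0, b_herd, b_swe, b_bio = beta
            return expit(b0 + b_herd * herd_size + b_swe * swe_cm + b_bio * biomass_kg_ha)

        print(migration_prob(herd_size=3500, swe_cm=40.0, biomass_kg_ha=900.0))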

  18. Bayesian inference in camera trapping studies for a class of spatial capture-recapture models.

    Science.gov (United States)

    Royle, J Andrew; Karanth, K Ullas; Gopalaswamy, Arjun M; Kumar, N Samba

    2009-11-01

    We develop a class of models for inference about abundance or density using spatial capture-recapture data from studies based on camera trapping and related methods. The model is a hierarchical model composed of two components: a point process model describing the distribution of individuals in space (or their home range centers) and a model describing the observation of individuals in traps. We suppose that trap- and individual-specific capture probabilities are a function of distance between individual home range centers and trap locations. We show that the models can be regarded as generalized linear mixed models, where the individual home range centers are random effects. We adopt a Bayesian framework for inference under these models using a formulation based on data augmentation. We apply the models to camera trapping data on tigers from the Nagarahole Reserve, India, collected over 48 nights in 2006. For this study, 120 camera locations were used, but cameras were only operational at 30 locations during any given sample occasion. Movement of traps is common in many camera-trapping studies and represents an important feature of the observation model that we address explicitly in our application.

  19. Bayesian Age-Period-Cohort Modeling and Prediction - BAMP

    Directory of Open Access Journals (Sweden)

    Volker J. Schmid

    2007-10-01

    Full Text Available The software package BAMP provides a method of analyzing incidence or mortality data on the Lexis diagram, using a Bayesian version of an age-period-cohort model. A hierarchical model is assumed, with a binomial model in the first stage. As smoothing priors for the age, period, and cohort parameters, random walks of first and second order, with and without an additional unstructured component, are available. Unstructured heterogeneity can also be included in the model. In order to evaluate the model fit, posterior deviance, DIC, and predictive deviances are computed. By projecting the random walk prior into the future, future death rates can be predicted.
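    The model class behind such software can be sketched briefly: binomial counts on an age-by-period grid with an additive age-period-cohort predictor on the logit scale and random-walk smoothing priors on each effect vector. The code below is a simplified illustration in that spirit, not BAMP's implementation:

        # Simplified age-period-cohort structure with an RW1 smoothing prior.
        import numpy as np
        from scipy.stats import norm
        from scipy.special import expit

        def rw1_logprior(effect, tau):
            """First-order random-walk prior: penalise first differences."""
            return norm.logpdf(np.diff(effect), 0.0, tau).sum()

        def apc_probs(mu, age_eff, period_eff, cohort_eff):
            """Event probabilities on an age x period grid (cohort = period - age)."""
            A, P = len(age_eff), len(period_eff)
            a_idx, p_idx = np.meshgrid(np.arange(A), np.arange(P), indexing="ij")
            c_idx = p_idx - a_idx + (A - 1)     # map each cell to its cohort index
            eta = mu + age_eff[a_idx] + period_eff[p_idx] + cohort_eff[c_idx]
            return expit(eta)

        A, P = 5, 4
        probs = apc_probs(-5.0, np.linspace(-1, 1, A), np.zeros(P), np.zeros(A + P - 1))
        print(probs.shape, rw1_logprior(np.linspace(-1, 1, A), tau=0.1))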

  20. Bayesian probabilistic network approach for managing earthquake risks of cities

    DEFF Research Database (Denmark)

    Bayraktarli, Yahya; Faber, Michael

    2011-01-01

    This paper considers the application of Bayesian probabilistic networks (BPNs) to large-scale risk based decision making in regard to earthquake risks. A recently developed risk management framework is outlined which utilises Bayesian probabilistic modelling, generic indicator based risk models...... and a fourth module on the consequences of an earthquake. Each of these modules is integrated into a BPN. Special attention is given to aggregated risk, i.e. the risk contribution from assets at multiple locations in a city subjected to the same earthquake. The application of the methodology is illustrated...

  1. A Hierarchal Risk Assessment Model Using the Evidential Reasoning Rule

    Directory of Open Access Journals (Sweden)

    Xiaoxiao Ji

    2017-02-01

    Full Text Available This paper aims to develop a hierarchical risk assessment model using the newly-developed evidential reasoning (ER) rule, which constitutes a generic conjunctive probabilistic reasoning process. In this paper, we first provide a brief introduction to the basics of the ER rule and emphasize its strengths for representing and aggregating uncertain information from multiple experts and sources. Further, we discuss the key steps of developing the hierarchical risk assessment framework systematically, including (1) formulation of the risk assessment hierarchy; (2) representation of both qualitative and quantitative information; (3) elicitation of attribute weights and information reliabilities; (4) aggregation of assessment information using the ER rule; and (5) quantification and ranking of risks using a utility-based transformation. The proposed hierarchical risk assessment framework can potentially be applied to various complex and uncertain systems. A case study on the fire/explosion risk assessment of marine vessels demonstrates the applicability of the proposed risk assessment model.
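    The last step listed above (utility-based transformation) is easy to state: once aggregated belief degrees over the assessment grades are available, a risk score is the belief-weighted sum of grade utilities, which can then be used for ranking. A hedged sketch of only that step (the grades, utilities, and belief degrees are invented; the ER aggregation itself is not reproduced here):

        # Utility-based transformation of aggregated belief degrees into a risk score.
        import numpy as np

        grades = ["low", "medium", "high", "very high"]
        utilities = np.array([0.0, 0.33, 0.67, 1.0])   # assumed grade utilities

        def risk_score(belief_degrees):
            belief = np.asarray(belief_degrees, dtype=float)
            assert np.isclose(belief.sum(), 1.0), "belief degrees should sum to 1"
            return float(belief @ utilities)

        print(risk_score([0.1, 0.4, 0.4, 0.1]))   # -> 0.5 on a 0-1 risk scale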

  2. Hierarchical Image Segmentation Based on Iterative Contraction and Merging.

    Science.gov (United States)

    Syu, Jia-Hao; Wang, Sheng-Jyh; Wang, Li-Chun

    2017-05-01

    In this paper, we propose a new framework for hierarchical image segmentation based on iterative contraction and merging. In the proposed framework, we treat the hierarchical image segmentation problem as a sequence of optimization problems, with each optimization process realized by a contraction-and-merging process that identifies and merges the most similar data pairs at the current resolution. At the beginning, we perform pixel-based contraction and merging to quickly combine image pixels into initial region-elements with visually indistinguishable intra-region color differences. After that, we iteratively perform region-based contraction and merging to group adjacent regions into larger ones, progressively forming a segmentation dendrogram for hierarchical segmentation. Compared with state-of-the-art techniques, the proposed algorithm not only produces high-quality segmentation results more efficiently but also preserves many boundary details in the segmentation results.

  3. Bayesian Variable Selection on Model Spaces Constrained by Heredity Conditions.

    Science.gov (United States)

    Taylor-Rodriguez, Daniel; Womack, Andrew; Bliznyuk, Nikolay

    2016-01-01

    This paper investigates Bayesian variable selection when there is a hierarchical dependence structure on the inclusion of predictors in the model. In particular, we study the type of dependence found in polynomial response surfaces of orders two and higher, whose model spaces are required to satisfy weak or strong heredity conditions. These conditions restrict the inclusion of higher-order terms depending upon the inclusion of lower-order parent terms. We develop classes of priors on the model space, investigate their theoretical and finite sample properties, and provide a Metropolis-Hastings algorithm for searching the space of models. The tools proposed allow fast and thorough exploration of model spaces that account for hierarchical polynomial structure in the predictors and provide control of the inclusion of false positives in high posterior probability models.
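    To make the heredity conditions concrete: under strong heredity a model may include an interaction or higher-order term only if all of its lower-order parent terms are included, while weak heredity requires at least one parent. The following small check (with my own term encoding, not the paper's) illustrates the constraint:

        # Weak/strong heredity checks for polynomial model spaces.
        # A term is a tuple of predictors: ("x1",) main effect, ("x1", "x2")
        # interaction, ("x1", "x1") quadratic.
        from itertools import combinations

        def parents(term):
            """All lower-order terms obtained by dropping one factor."""
            return {tuple(sorted(c)) for c in combinations(term, len(term) - 1) if c}

        def satisfies_heredity(model_terms, mode="strong"):
            terms = {tuple(sorted(t)) for t in model_terms}
            for t in terms:
                if len(t) < 2:
                    continue                         # main effects have no parents
                present = [p in terms for p in parents(t)]
                if not (all(present) if mode == "strong" else any(present)):
                    return False
            return True

        print(satisfies_heredity([("x1",), ("x2",), ("x1", "x2")], "strong"))   # True
        print(satisfies_heredity([("x1",), ("x1", "x2")], "strong"))            # False
        print(satisfies_heredity([("x1",), ("x1", "x2")], "weak"))              # True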

  4. Bayesian inference of chemical kinetic models from proposed reactions

    KAUST Repository

    Galagali, Nikhil

    2015-02-01

    © 2014 Elsevier Ltd. Bayesian inference provides a natural framework for combining experimental data with prior knowledge to develop chemical kinetic models and quantify the associated uncertainties, not only in parameter values but also in model structure. Most existing applications of Bayesian model selection methods to chemical kinetics have been limited to comparisons among a small set of models, however. The significant computational cost of evaluating posterior model probabilities renders traditional Bayesian methods infeasible when the model space becomes large. We present a new framework for tractable Bayesian model inference and uncertainty quantification using a large number of systematically generated model hypotheses. The approach involves imposing point-mass mixture priors over rate constants and exploring the resulting posterior distribution using an adaptive Markov chain Monte Carlo method. The posterior samples are used to identify plausible models, to quantify rate constant uncertainties, and to extract key diagnostic information about model structure-such as the reactions and operating pathways most strongly supported by the data. We provide numerical demonstrations of the proposed framework by inferring kinetic models for catalytic steam and dry reforming of methane using available experimental data.

  5. Classification using Bayesian neural nets

    NARCIS (Netherlands)

    J.C. Bioch (Cor); O. van der Meer; R. Potharst (Rob)

    1995-01-01

    textabstractRecently, Bayesian methods have been proposed for neural networks to solve regression and classification problems. These methods claim to overcome some difficulties encountered in the standard approach such as overfitting. However, an implementation of the full Bayesian approach to

  6. Bayesian state space models for dynamic genetic network construction across multiple tissues.

    Science.gov (United States)

    Liang, Yulan; Kelemen, Arpad

    2016-08-01

    Construction of gene-gene interaction networks and potential pathways is a challenging and important problem in genomic research for complex diseases while estimating the dynamic changes of the temporal correlations and non-stationarity are the keys in this process. In this paper, we develop dynamic state space models with hierarchical Bayesian settings to tackle this challenge for inferring the dynamic profiles and genetic networks associated with disease treatments. We treat both the stochastic transition matrix and the observation matrix time-variant and include temporal correlation structures in the covariance matrix estimations in the multivariate Bayesian state space models. The unevenly spaced short time courses with unseen time points are treated as hidden state variables. Hierarchical Bayesian approaches with various prior and hyper-prior models with Monte Carlo Markov Chain and Gibbs sampling algorithms are used to estimate the model parameters and the hidden state variables. We apply the proposed Hierarchical Bayesian state space models to multiple tissues (liver, skeletal muscle, and kidney) Affymetrix time course data sets following corticosteroid (CS) drug administration. Both simulation and real data analysis results show that the genomic changes over time and gene-gene interaction in response to CS treatment can be well captured by the proposed models. The proposed dynamic Hierarchical Bayesian state space modeling approaches could be expanded and applied to other large scale genomic data, such as next generation sequence (NGS) combined with real time and time varying electronic health record (EHR) for more comprehensive and robust systematic and network based analysis in order to transform big biomedical data into predictions and diagnostics for precision medicine and personalized healthcare with better decision making and patient outcomes.
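    Although the models described above are considerably richer (time-varying transition and observation matrices, hierarchical priors, MCMC estimation of hidden states), their linear-Gaussian building block is the familiar state space filter. A minimal Kalman filter sketch, offered only as orientation and not as the authors' method:

        # Minimal linear-Gaussian state space (Kalman) filter.
        import numpy as np

        def kalman_filter(y, F, H, Q, R, m0, P0):
            """y: (T, d_obs) observations; returns filtered state means (T, d_state)."""
            m, P = m0, P0
            means = []
            for t in range(len(y)):
                m, P = F @ m, F @ P @ F.T + Q                  # predict
                S = H @ P @ H.T + R
                K = P @ H.T @ np.linalg.inv(S)                 # Kalman gain
                m = m + K @ (y[t] - H @ m)                     # update
                P = (np.eye(len(m)) - K @ H) @ P
                means.append(m)
            return np.array(means)

        # Toy usage: a noisy random walk observed directly
        rng = np.random.default_rng(4)
        x = np.cumsum(rng.normal(size=50))
        y = (x + rng.normal(scale=0.5, size=50)).reshape(-1, 1)
        print(kalman_filter(y, F=np.eye(1), H=np.eye(1), Q=0.1 * np.eye(1),
                            R=0.25 * np.eye(1), m0=np.zeros(1), P0=np.eye(1))[-1])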

  7. The Bayesian Covariance Lasso.

    Science.gov (United States)

    Khondker, Zakaria S; Zhu, Hongtu; Chu, Haitao; Lin, Weili; Ibrahim, Joseph G

    2013-04-01

    Estimation of sparse covariance matrices and their inverse subject to positive definiteness constraints has drawn a lot of attention in recent years. The abundance of high-dimensional data, where the sample size ( n ) is less than the dimension ( d ), requires shrinkage estimation methods since the maximum likelihood estimator is not positive definite in this case. Furthermore, when n is larger than d but not sufficiently larger, shrinkage estimation is more stable than maximum likelihood as it reduces the condition number of the precision matrix. Frequentist methods have utilized penalized likelihood methods, whereas Bayesian approaches rely on matrix decompositions or Wishart priors for shrinkage. In this paper we propose a new method, called the Bayesian Covariance Lasso (BCLASSO), for the shrinkage estimation of a precision (covariance) matrix. We consider a class of priors for the precision matrix that leads to the popular frequentist penalties as special cases, develop a Bayes estimator for the precision matrix, and propose an efficient sampling scheme that does not precalculate boundaries for positive definiteness. The proposed method is permutation invariant and performs shrinkage and estimation simultaneously for non-full rank data. Simulations show that the proposed BCLASSO performs similarly as frequentist methods for non-full rank data.

  8. Approximate Bayesian computation.

    Directory of Open Access Journals (Sweden)

    Mikael Sunnåker

    Full Text Available Approximate Bayesian computation (ABC) constitutes a class of computational methods rooted in Bayesian statistics. In all model-based statistical inference, the likelihood function is of central importance, since it expresses the probability of the observed data under a particular statistical model, and thus quantifies the support data lend to particular values of parameters and to choices among different models. For simple models, an analytical formula for the likelihood function can typically be derived. However, for more complex models, an analytical formula might be elusive or the likelihood function might be computationally very costly to evaluate. ABC methods bypass the evaluation of the likelihood function. In this way, ABC methods widen the realm of models for which statistical inference can be considered. ABC methods are mathematically well-founded, but they inevitably make assumptions and approximations whose impact needs to be carefully assessed. Furthermore, the wider application domain of ABC exacerbates the challenges of parameter estimation and model selection. ABC has rapidly gained popularity over the last years, in particular for the analysis of complex problems arising in the biological sciences (e.g., in population genetics, ecology, epidemiology, and systems biology).
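    The basic ABC rejection scheme is short enough to show directly. The following is a generic toy illustration (inferring a binomial success probability without evaluating its likelihood), not code from any particular ABC package:

        # Minimal ABC rejection sampler.
        import numpy as np

        rng = np.random.default_rng(1)
        n_trials, observed_successes = 100, 37

        def simulate(p):
            return rng.binomial(n_trials, p)

        def abc_rejection(n_draws=100_000, tolerance=2):
            accepted = []
            for _ in range(n_draws):
                p = rng.uniform(0.0, 1.0)                         # draw from the prior
                if abs(simulate(p) - observed_successes) <= tolerance:
                    accepted.append(p)                            # simulated data close to observed
            return np.array(accepted)

        posterior = abc_rejection()
        print(len(posterior), posterior.mean())   # posterior mean is near 0.37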

  9. Bayesian Inference on Gravitational Waves

    Directory of Open Access Journals (Sweden)

    Asad Ali

    2015-12-01

    Full Text Available The Bayesian approach is becoming increasingly popular among the astrophysics data analysis communities. However, the statistics communities in Pakistan are largely unaware of this fertile interaction between the two disciplines. Bayesian methods have been used to address astronomical problems since the very birth of Bayesian probability in the eighteenth century. Today, Bayesian methods for the detection and parameter estimation of gravitational waves have solid theoretical grounds and strong promise for realistic applications. This article aims to introduce the statistics communities in Pakistan to the applications of Bayesian Monte Carlo methods in the analysis of gravitational wave data, with an overview of Bayesian signal detection and estimation methods and a demonstration through a couple of simplified examples.

  10. Synthesis strategies in the search for hierarchical zeolites.

    Science.gov (United States)

    Serrano, D P; Escola, J M; Pizarro, P

    2013-05-07

    Great interest has arisen in the past years in the development of hierarchical zeolites, having at least two levels of porosities. Hierarchical zeolites show an enhanced accessibility, leading to improved catalytic activity in reactions suffering from steric and/or diffusional limitations. Moreover, the secondary porosity offers an ideal space for the deposition of additional active phases and for functionalization with organic moieties. However, the secondary surface represents a discontinuity of the crystalline framework, with a low connectivity and a high concentration of silanols. Consequently, hierarchical zeolites exhibit a less "zeolitic behaviour" than conventional ones in terms of acidity, hydrophobic/hydrophilic character, confinement effects, shape-selectivity and hydrothermal stability. Nevertheless, this secondary surface is far from being amorphous, which provides hierarchical zeolites with a set of novel features. A wide variety of innovative strategies have been developed for generating a secondary porosity in zeolites. In the present review, the different synthetic routes leading to hierarchical zeolites have been classified into five categories: removal of framework atoms, surfactant-assisted procedures, hard-templating, zeolitization of preformed solids and organosilane-based methods. Significant advances have been achieved recently in several of these alternatives. These include desilication, due to its versatility, dual templating with polyquaternary ammonium surfactants and framework reorganization by treatment with surfactant-containing basic solutions. In the last two cases, the materials so prepared show both mesoscopic ordering and zeolitic lattice planes. Likewise, interesting results have been obtained with the incorporation of different types of organosilanes into the zeolite crystallization gels, taking advantage of their high affinity for silicate and aluminosilicate species. Crystallization of organofunctionalized species favours the

  11. Hierarchical structure for risk criteria applicable to nuclear power plants

    International Nuclear Information System (INIS)

    Hall, R.E.; Mitra, S.P.

    1985-01-01

    This paper discusses the development of a hierarchical structure for risk criteria applicable to nuclear power plants. The structure provides a unified framework to systematically analyze the implications of different types of criteria, each focusing on a particular aspect of nuclear power plant risks. The framework allows investigation of the specific coverage of a particular criterion and comparison of different criteria with regard to areas to which they apply. 5 refs., 2 figs

  12. Hierarchical modelling for the environmental sciences statistical methods and applications

    CERN Document Server

    Clark, James S

    2006-01-01

    New statistical tools are changing the way in which scientists analyze and interpret data and models. Hierarchical Bayes and Markov Chain Monte Carlo methods for analysis provide a consistent framework for inference and prediction where information is heterogeneous and uncertain, processes are complicated, and responses depend on scale. Nowhere are these methods more promising than in the environmental sciences.

  13. A Bayesian equivalency test for two independent binomial proportions.

    Science.gov (United States)

    Kawasaki, Yohei; Shimokawa, Asanao; Yamada, Hiroshi; Miyaoka, Etsuo

    2016-01-01

    In clinical trials, it is often necessary to perform an equivalence study. An equivalence study requires actively demonstrating equivalence between two different drugs or treatments. Since equivalence cannot be asserted merely because a superiority test fails to reject the null hypothesis, statistical methods known as equivalency tests have been suggested. Most such methods are based on the frequentist framework; there are few in the Bayesian framework. Hence, this article proposes a new index that suggests the equivalency of binomial proportions, constructed within the Bayesian framework. In this study, we provide two methods for calculating the index and compare the probabilities calculated by these two methods. Moreover, we apply this index to the results of actual clinical trials to demonstrate its utility.
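    The specific index proposed in the record is not reproduced here, but the underlying quantity, the posterior probability that two binomial proportions differ by less than an equivalence margin, is easy to sketch with conjugate Beta posteriors. The margin and priors below are illustrative choices, not the paper's definitions:

        # Monte Carlo estimate of P(|p1 - p2| < margin | data) under Beta priors.
        import numpy as np

        rng = np.random.default_rng(2)

        def prob_equivalent(x1, n1, x2, n2, margin=0.10, a=1.0, b=1.0, draws=200_000):
            p1 = rng.beta(a + x1, b + n1 - x1, size=draws)   # posterior draws, group 1
            p2 = rng.beta(a + x2, b + n2 - x2, size=draws)   # posterior draws, group 2
            return float(np.mean(np.abs(p1 - p2) < margin))

        print(prob_equivalent(x1=45, n1=100, x2=48, n2=100))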

  14. Trees and Hierarchical Structures

    CERN Document Server

    Haeseler, Arndt

    1990-01-01

    The "raison d'etre" of hierarchical dustering theory stems from one basic phe­ nomenon: This is the notorious non-transitivity of similarity relations. In spite of the fact that very often two objects may be quite similar to a third without being that similar to each other, one still wants to dassify objects according to their similarity. This should be achieved by grouping them into a hierarchy of non-overlapping dusters such that any two objects in ~ne duster appear to be more related to each other than they are to objects outside this duster. In everyday life, as well as in essentially every field of scientific investigation, there is an urge to reduce complexity by recognizing and establishing reasonable das­ sification schemes. Unfortunately, this is counterbalanced by the experience of seemingly unavoidable deadlocks caused by the existence of sequences of objects, each comparatively similar to the next, but the last rather different from the first.

  15. Transmutations across hierarchical levels

    International Nuclear Information System (INIS)

    O'Neill, R.V.

    1977-01-01

    The development of large-scale ecological models depends implicitly on a concept known as hierarchy theory which views biological systems in a series of hierarchical levels (i.e., organism, population, trophic level, ecosystem). The theory states that an explanation of a biological phenomenon is provided when it is shown to be the consequence of the activities of the system's components, which are themselves systems in the next lower level of the hierarchy. Thus, the behavior of a population is explained by the behavior of the organisms in the population. The initial step in any modeling project is, therefore, to identify the system components and the interactions between them. A series of examples of transmutations in aquatic and terrestrial ecosystems are presented to show how and why changes occur. The types of changes are summarized and possible implications of transmutation for hierarchy theory, for the modeler, and for the ecological theoretician are discussed

  16. Optimizing Prediction Using Bayesian Model Averaging: Examples Using Large-Scale Educational Assessments.

    Science.gov (United States)

    Kaplan, David; Lee, Chansoon

    2018-01-01

    This article provides a review of Bayesian model averaging as a means of optimizing the predictive performance of common statistical models applied to large-scale educational assessments. The Bayesian framework recognizes that in addition to parameter uncertainty, there is uncertainty in the choice of models themselves. A Bayesian approach to addressing the problem of model uncertainty is the method of Bayesian model averaging. Bayesian model averaging searches the space of possible models for a set of submodels that satisfy certain scientific principles and then averages the coefficients across these submodels weighted by each model's posterior model probability (PMP). Using the weighted coefficients for prediction has been shown to yield optimal predictive performance according to certain scoring rules. We demonstrate the utility of Bayesian model averaging for prediction in education research with three examples: Bayesian regression analysis, Bayesian logistic regression, and a recently developed approach for Bayesian structural equation modeling. In each case, the model-averaged estimates are shown to yield better prediction of the outcome of interest than any submodel based on predictive coverage and the log-score rule. Implications for the design of large-scale assessments when the goal is optimal prediction in a policy context are discussed.
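    One common way to operationalise the posterior model probabilities mentioned above is the BIC approximation, after which model-averaged coefficients are simply PMP-weighted sums (with zeros where a predictor is excluded). The sketch below enumerates all subsets of a few predictors for a linear model; it illustrates the averaging step only and is not the workflow of any particular BMA package:

        # Bayesian model averaging for linear regression via BIC-approximated PMPs.
        import numpy as np
        from itertools import combinations

        def bma_coefficients(X, y):
            n, p = X.shape
            bics, betas = [], []
            for k in range(p + 1):
                for subset in combinations(range(p), k):
                    Xs = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
                    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
                    rss = np.sum((y - Xs @ beta) ** 2)
                    bics.append(n * np.log(rss / n) + Xs.shape[1] * np.log(n))
                    full = np.zeros(p)
                    full[list(subset)] = beta[1:]       # coefficients of included predictors
                    betas.append(full)
            bics = np.array(bics)
            pmp = np.exp(-0.5 * (bics - bics.min()))
            pmp /= pmp.sum()                            # approximate posterior model probabilities
            return pmp @ np.array(betas)

        rng = np.random.default_rng(3)
        X = rng.normal(size=(200, 3))
        y = 2.0 * X[:, 0] - 1.0 * X[:, 2] + rng.normal(size=200)
        print(bma_coefficients(X, y))   # averaged coefficients close to (2, 0, -1)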

  17. Accelerating Bayesian inference for evolutionary biology models.

    Science.gov (United States)

    Meyer, Xavier; Chopard, Bastien; Salamin, Nicolas

    2017-03-01

    Bayesian inference is widely used nowadays and relies largely on Markov chain Monte Carlo (MCMC) methods. Evolutionary biology has greatly benefited from the development of MCMC methods, but the design of more complex and realistic models and the ever-growing availability of novel data are pushing the limits of the current use of these methods. We present a parallel Metropolis-Hastings (M-H) framework built with a novel combination of enhancements aimed at parameter-rich and complex models. On a parameter-rich macroevolutionary model, we show increases in sampling speed of up to 35 times with 32 processors compared to a sequential M-H process. More importantly, our framework achieves up to a twentyfold faster convergence in estimating the posterior probability of phylogenetic trees using 32 processors when compared to the well-known software MrBayes for Bayesian inference of phylogenetic trees. https://bitbucket.org/XavMeyer/hogan. nicolas.salamin@unil.ch. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
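    For readers who want to see the sequential baseline that such a parallel framework accelerates, here is a minimal random-walk Metropolis-Hastings sampler for a user-supplied log-posterior. It is a generic illustration, not code from the framework above:

        # Minimal random-walk Metropolis-Hastings sampler.
        import numpy as np

        def metropolis_hastings(log_post, x0, n_iter=5000, step=0.5, seed=0):
            rng = np.random.default_rng(seed)
            x = np.asarray(x0, dtype=float)
            lp = log_post(x)
            samples = np.empty((n_iter, x.size))
            for i in range(n_iter):
                proposal = x + step * rng.normal(size=x.size)   # symmetric proposal
                lp_prop = log_post(proposal)
                if np.log(rng.uniform()) < lp_prop - lp:        # accept with prob min(1, ratio)
                    x, lp = proposal, lp_prop
                samples[i] = x
            return samples

        # Toy target: standard bivariate normal
        samples = metropolis_hastings(lambda z: -0.5 * np.sum(z ** 2), x0=[3.0, -3.0])
        print(samples.mean(axis=0))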

  18. Detecting Hierarchical Structure in Networks

    DEFF Research Database (Denmark)

    Herlau, Tue; Mørup, Morten; Schmidt, Mikkel Nørgaard

    2012-01-01

    On synthetic and real data we demonstrate that our model can detect hierarchical structure, leading to better link prediction than competing models. Our model can be used to detect whether a network exhibits hierarchical structure, thereby leading to better comprehension and a statistical account of the network.

  19. Static Correctness of Hierarchical Procedures

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff

    1990-01-01

    A system of hierarchical, fully recursive types in a truly imperative language allows program fragments written for small types to be reused for all larger types. To exploit this property to enable type-safe hierarchical procedures, it is necessary to impose a static requirement on procedure calls...

  20. Uncertainty, reward, and attention in the Bayesian brain

    DEFF Research Database (Denmark)

    Whiteley, Louise Emma

    2008-01-01

    of attention as Bayesian prior, and unifies apparently disparate attentional ‘bottlenecks’. We present simulations of three key paradigms, and discuss how such modelling could be extended to more detailed, neurally inspired settings. Broadening the Bayesian picture of perception and strengthening its connection......The ‘Bayesian Coding Hypothesis’ formalises the classic Helmholtzian picture of perception as inverse inference, stating that the brain uses Bayes’ rule to compute posterior belief distributions over states of the world. There is much behavioural evidence that human observers can behave Bayes... in the focus of attention. When faced instead with a complex scene, the brain can’t be Bayes-optimal everywhere. We suggest that a general limitation on the representation of complex posteriors causes the brain to make approximations, which are then locally refined by attention. This framework extends ideas...