WorldWideScience

Sample records for bayesian spatial modeling

  1. Bayesian Variable Selection in Spatial Autoregressive Models

    OpenAIRE

    Jesus Crespo Cuaresma; Philipp Piribauer

    2015-01-01

    This paper compares the performance of Bayesian variable selection approaches for spatial autoregressive models. We present two alternative approaches which can be implemented using Gibbs sampling methods in a straightforward way and allow us to deal with the problem of model uncertainty in spatial autoregressive models in a flexible and computationally efficient way. In a simulation study we show that the variable selection approaches tend to outperform existing Bayesian model averaging tech...

  2. Bayesian Spatial Modelling with R-INLA

    OpenAIRE

    Finn Lindgren; Håvard Rue

    2015-01-01

    The principles behind the interface to continuous domain spatial models in the R-INLA software package for R are described. The integrated nested Laplace approximation (INLA) approach proposed by Rue, Martino, and Chopin (2009) is a computationally effective alternative to MCMC for Bayesian inference. INLA is designed for latent Gaussian models, a very wide and flexible class of models ranging from (generalized) linear mixed to spatial and spatio-temporal models. Combined with the stochastic...

  3. Bayesian Spatial Modelling with R-INLA

    Directory of Open Access Journals (Sweden)

    Finn Lindgren

    2015-02-01

    Full Text Available The principles behind the interface to continuous domain spatial models in the R-INLA software package for R are described. The integrated nested Laplace approximation (INLA) approach proposed by Rue, Martino, and Chopin (2009) is a computationally effective alternative to MCMC for Bayesian inference. INLA is designed for latent Gaussian models, a very wide and flexible class of models ranging from (generalized) linear mixed to spatial and spatio-temporal models. Combined with the stochastic partial differential equation approach (SPDE; Lindgren, Rue, and Lindström 2011), one can accommodate all kinds of geographically referenced data, including areal and geostatistical ones, as well as spatial point process data. The implementation interface covers stationary spatial models, non-stationary spatial models, and also spatio-temporal models, and is applicable in epidemiology, ecology, environmental risk assessment, as well as general geostatistics.

  4. An Inhomogeneous Bayesian Texture Model for Spatially Varying Parameter Estimation

    OpenAIRE

    Dharmagunawardhana, Chathurika; Mahmoodi, Sasan; Bennett, Michael; Niranjan, Mahesan

    2014-01-01

    In statistical model based texture feature extraction, features based on spatially varying parameters achieve higher discriminative performances compared to spatially constant parameters. In this paper we formulate a novel Bayesian framework which achieves texture characterization by spatially varying parameters based on Gaussian Markov random fields. The parameter estimation is carried out by Metropolis-Hastings algorithm. The distributions of estimated spatially varying paramete...
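
    The parameter estimation described above relies on the Metropolis-Hastings algorithm. As a rough, self-contained illustration (not the authors' implementation), the following Python sketch shows a random-walk Metropolis-Hastings update for a single parameter given an arbitrary unnormalized log-posterior; the toy target at the end is purely hypothetical.

      import numpy as np

      def metropolis_hastings(log_post, theta0, n_iter=5000, step=0.1, seed=0):
          """Random-walk Metropolis-Hastings sampler for a scalar parameter.
          log_post: callable returning the (unnormalized) log-posterior."""
          rng = np.random.default_rng(seed)
          theta, lp = theta0, log_post(theta0)
          samples = np.empty(n_iter)
          for i in range(n_iter):
              prop = theta + step * rng.standard_normal()    # symmetric proposal
              lp_prop = log_post(prop)
              if np.log(rng.uniform()) < lp_prop - lp:       # accept with prob min(1, ratio)
                  theta, lp = prop, lp_prop
              samples[i] = theta
          return samples

      # hypothetical target: posterior of a mean under a normal likelihood with a flat prior
      data = np.random.default_rng(1).normal(2.0, 1.0, size=50)
      draws = metropolis_hastings(lambda mu: -0.5 * np.sum((data - mu) ** 2), theta0=0.0)
      print(draws[1000:].mean())   # discard burn-in, then summarize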

  5. Hierarchical Bayesian spatial models for multispecies conservation planning and monitoring.

    Science.gov (United States)

    Carroll, Carlos; Johnson, Devin S; Dunk, Jeffrey R; Zielinski, William J

    2010-12-01

    Biologists who develop and apply habitat models are often familiar with the statistical challenges posed by their data's spatial structure but are unsure of whether the use of complex spatial models will increase the utility of model results in planning. We compared the relative performance of nonspatial and hierarchical Bayesian spatial models for three vertebrate and invertebrate taxa of conservation concern (Church's sideband snails [Monadenia churchi], red tree voles [Arborimus longicaudus], and Pacific fishers [Martes pennanti pacifica]) that provide examples of a range of distributional extents and dispersal abilities. We used presence-absence data derived from regional monitoring programs to develop models with both landscape and site-level environmental covariates. We used Markov chain Monte Carlo algorithms and a conditional autoregressive or intrinsic conditional autoregressive model framework to fit spatial models. The fit of Bayesian spatial models was between 35 and 55% better than the fit of nonspatial analogue models. Bayesian spatial models outperformed analogous models developed with maximum entropy (Maxent) methods. Although the best spatial and nonspatial models included similar environmental variables, spatial models provided estimates of residual spatial effects that suggested how ecological processes might structure distribution patterns. Spatial models built from presence-absence data improved fit most for localized endemic species with ranges constrained by poorly known biogeographic factors and for widely distributed species suspected to be strongly affected by unmeasured environmental variables or population processes. By treating spatial effects as a variable of interest rather than a nuisance, hierarchical Bayesian spatial models, especially when they are based on a common broad-scale spatial lattice (here the national Forest Inventory and Analysis grid of 24 km² hexagons), can increase the relevance of habitat models to multispecies
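
    The conditional autoregressive (CAR) and intrinsic CAR (ICAR) priors mentioned above are defined on a lattice of areal units through an adjacency structure. As a minimal sketch of that ingredient (not the study's code), the Python snippet below builds the ICAR precision matrix Q = D - W from a binary adjacency matrix and evaluates the unnormalized log-density of a vector of spatial random effects; the four-area neighbourhood is hypothetical.

      import numpy as np

      def icar_precision(adjacency):
          """Intrinsic CAR precision matrix Q = D - W for a 0/1 adjacency matrix W."""
          W = np.asarray(adjacency, dtype=float)
          D = np.diag(W.sum(axis=1))              # diagonal matrix of neighbour counts
          return D - W

      def icar_logpdf(phi, Q, tau):
          """Unnormalized log-density of ICAR effects phi with precision tau * Q."""
          phi = phi - phi.mean()                  # ICAR is improper; centre for identifiability
          return -0.5 * tau * phi @ Q @ phi

      # hypothetical example: four areas arranged in a line (1-2, 2-3, 3-4 adjacent)
      W = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]])
      phi = np.array([0.2, 0.1, -0.1, -0.2])
      print(icar_logpdf(phi, icar_precision(W), tau=2.0))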

  6. Bayesian point event modeling in spatial and environmental epidemiology.

    Science.gov (United States)

    Lawson, Andrew B

    2012-10-01

    This paper reviews the current state of point event modeling in spatial epidemiology from a Bayesian perspective. Point event (or case event) data arise when geo-coded addresses of disease events are available. Often, this level of spatial resolution would not be accessible due to medical confidentiality constraints. However, for the examination of small spatial scales, it is important to be capable of examining point process data directly. Models for such data are usually formulated based on point process theory. In addition, special conditioning arguments can lead to simpler Bernoulli likelihoods and logistic spatial models. Goodness-of-fit diagnostics and Bayesian residuals are also considered. Applications within putative health hazard risk assessment, cluster detection, and linkage to environmental risk fields (misalignment) are considered. PMID:23035034

  7. Spatial and spatio-temporal bayesian models with R - INLA

    CERN Document Server

    Blangiardo, Marta

    2015-01-01

    Contents: Dedication; Preface; 1 Introduction (1.1 Why spatial and spatio-temporal statistics?; 1.2 Why do we use Bayesian methods for modelling spatial and spatio-temporal structures?; 1.3 Why INLA?; 1.4 Datasets); 2 Introduction to R (2.1 The R language; 2.2 R objects; 2.3 Data and session management; 2.4 Packages; 2.5 Programming in R; 2.6 Basic statistical analysis with R); 3 Introduction to Bayesian Methods (3.1 Bayesian Philosophy; 3.2 Basic Probability Elements; 3.3 Bayes Theorem; 3.4 Prior and Posterior Distributions; 3.5 Working with the Posterior Distribution; 3.6 Choosing the Prior Distr...

  8. The impact of spatial scales and spatial smoothing on the outcome of bayesian spatial model.

    Directory of Open Access Journals (Sweden)

    Su Yun Kang

    Full Text Available Discretization of a geographical region is quite common in spatial analysis. There have been few studies into the impact of different geographical scales on the outcome of spatial models for different spatial patterns. This study aims to investigate the impact of spatial scales and spatial smoothing on the outcomes of modelling spatial point-based data. Given a spatial point-based dataset (such as occurrence of a disease), we study the geographical variation of residual disease risk using regular grid cells. The individual disease risk is modelled using a logistic model with the inclusion of spatially unstructured and/or spatially structured random effects. Three spatial smoothness priors for the spatially structured component are employed in modelling, namely an intrinsic Gaussian Markov random field, a second-order random walk on a lattice, and a Gaussian field with Matérn correlation function. We investigate how changes in grid cell size affect model outcomes under different spatial structures and different smoothness priors for the spatial component. A realistic example (the Humberside data) is analyzed and a simulation study is described. Bayesian computation is carried out using an integrated nested Laplace approximation. The results suggest that the performance and predictive capacity of the spatial models improve as the grid cell size decreases for certain spatial structures. It also appears that different spatial smoothness priors should be applied for different patterns of point data.
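
    One of the smoothness priors compared above is a Gaussian field with Matérn correlation function. As a minimal, stand-alone sketch (not the paper's implementation), the Python snippet below evaluates the Matérn correlation at a few distances; mapping the range parameter to the scale kappa via sqrt(8*nu)/range follows a common convention and should be read as an assumption, as should the example numbers.

      import numpy as np
      from scipy.special import gamma, kv

      def matern_correlation(d, range_, nu=1.5):
          """Matérn correlation for distances d, given a range parameter and smoothness nu."""
          d = np.atleast_1d(np.asarray(d, dtype=float))
          kappa = np.sqrt(8.0 * nu) / range_       # assumed range-to-kappa convention
          x = kappa * d
          rho = np.ones_like(x)                    # correlation equals 1 at distance 0
          nz = x > 0
          rho[nz] = (2.0 ** (1.0 - nu) / gamma(nu)) * (x[nz] ** nu) * kv(nu, x[nz])
          return rho

      print(matern_correlation([0.0, 0.5, 1.0, 2.0], range_=1.0, nu=1.5))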

  9. Bayesian joint modeling of longitudinal and spatial survival AIDS data.

    Science.gov (United States)

    Martins, Rui; Silva, Giovani L; Andreozzi, Valeska

    2016-08-30

    Joint analysis of longitudinal and survival data has received increasing attention in recent years, especially for analyzing cancer and AIDS data. As both repeated measurements (longitudinal) and time-to-event (survival) outcomes are observed in an individual, a joint model is more appropriate because it takes into account the dependence between the two types of responses, which are often analyzed separately. We propose a Bayesian hierarchical model for jointly modeling longitudinal and survival data considering functional time and spatial frailty effects, respectively. That is, the proposed model deals with non-linear longitudinal effects and spatial survival effects accounting for the unobserved heterogeneity among individuals living in the same region. This joint approach is applied to a cohort study of patients with HIV/AIDS in Brazil during the years 2002-2006. Our Bayesian joint model presents considerable improvements in the estimation of survival times of the Brazilian HIV/AIDS patients when compared with those obtained through a separate survival model and shows that the spatial risk of death is the same across the different Brazilian states. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26990773

  10. Assessing fit in Bayesian models for spatial processes

    KAUST Repository

    Jun, M.

    2014-09-16

    © 2014 John Wiley & Sons, Ltd. Gaussian random fields are frequently used to model spatial and spatial-temporal data, particularly in geostatistical settings. As much of the attention of the statistics community has been focused on defining and estimating the mean and covariance functions of these processes, little effort has been devoted to developing goodness-of-fit tests to allow users to assess the models' adequacy. We describe a general goodness-of-fit test and related graphical diagnostics for assessing the fit of Bayesian Gaussian process models using pivotal discrepancy measures. Our method is applicable for both regularly and irregularly spaced observation locations on planar and spherical domains. The essential idea behind our method is to evaluate pivotal quantities defined for a realization of a Gaussian random field at parameter values drawn from the posterior distribution. Because the nominal distribution of the resulting pivotal discrepancy measures is known, it is possible to quantitatively assess model fit directly from the output of Markov chain Monte Carlo algorithms used to sample from the posterior distribution on the parameter space. We illustrate our method in a simulation study and in two applications.
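
    The essential idea is to evaluate a pivotal quantity, whose distribution under the model is known, at posterior draws of the parameters. As a minimal sketch of that logic (not the paper's exact discrepancy measure), the Python snippet below computes a Mahalanobis-type pivot for a Gaussian-process realization, which is chi-squared with n degrees of freedom under the model, and converts it to a tail probability for each posterior draw; the toy check at the end uses a trivially simple model.

      import numpy as np
      from scipy import linalg, stats

      def pivotal_pvalues(y, mean_fn, cov_fn, posterior_draws):
          """For each draw theta, compute (y - mu)' Sigma^{-1} (y - mu) and its chi^2_n tail area."""
          n = len(y)
          pvals = []
          for theta in posterior_draws:
              mu, Sigma = mean_fn(theta), cov_fn(theta)
              L = linalg.cholesky(Sigma, lower=True)
              z = linalg.solve_triangular(L, y - mu, lower=True)   # whitened residuals
              pvals.append(stats.chi2.sf(z @ z, df=n))
          return np.array(pvals)

      # toy check: data simulated from the assumed model should give non-extreme tail areas
      y = np.random.default_rng(0).standard_normal(20)
      p = pivotal_pvalues(y, lambda th: np.zeros(20), lambda th: np.eye(20), posterior_draws=[None] * 100)
      print(p.min(), p.mean())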

  11. Forecasting unconventional resource productivity - A spatial Bayesian model

    Science.gov (United States)

    Montgomery, J.; O'sullivan, F.

    2015-12-01

    Today's low prices mean that unconventional oil and gas development requires ever greater efficiency and better development decision-making. Inter- and intra-field variability in well productivity, which is a major contemporary driver of uncertainty regarding resource size and its economics, is driven by factors including geological conditions, well and completion design (which companies vary as they seek to optimize their performance), and uncertainty about the nature of fracture propagation. Geological conditions are often not well understood early in development campaigns, but critical assessments and decisions must nevertheless be made regarding the value of drilling an area and the placement of wells. In these situations, location provides a reasonable proxy for geology and the "rock quality." We propose a spatial Bayesian model for forecasting acreage quality, which improves decision-making by leveraging available production data and provides a framework for statistically studying the influence of different parameters on well productivity. Our approach consists of subdividing a field into sections and forming prior distributions for productivity in each section based on knowledge about the overall field. Production data from wells are used to update these estimates in a Bayesian fashion, improving model accuracy far more rapidly and with less sensitivity to outliers than a model that simply establishes an "average" productivity in each section. Additionally, forecasts using this model capture the importance of uncertainty, whether due to a lack of information or to greater geological risk in some areas. We demonstrate the forecasting utility of this method using public data and also provide examples of how information from this model can be combined with knowledge about a field's geology or changes in technology to better quantify development risk. This approach represents an important shift in the way that production data is used to guide
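
    The core mechanism, forming a field-level prior for each section and updating it with incoming well data, can be illustrated with a simple conjugate calculation. The sketch below assumes a normal prior on a section's mean productivity and a normal likelihood for well observations; the numbers, units, and single-parameter setup are hypothetical simplifications of the model described above.

      import numpy as np

      def update_section(prior_mean, prior_var, obs, obs_var):
          """Conjugate normal update of a section's mean productivity given well observations."""
          obs = np.asarray(obs, dtype=float)
          n = obs.size
          post_var = 1.0 / (1.0 / prior_var + n / obs_var)
          post_mean = post_var * (prior_mean / prior_var + obs.sum() / obs_var)
          return post_mean, post_var

      # hypothetical field-level prior shared by all sections, and three wells in one section
      field_mean, field_var = 100.0, 400.0
      wells = [135.0, 150.0, 120.0]
      print(update_section(field_mean, field_var, wells, obs_var=900.0))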

  12. Mapping the Obesity in Iran by Bayesian Spatial Model

    Directory of Open Access Journals (Sweden)

    Maryam Farhadian

    2013-06-01

    Full Text Available Background: One of the methods used in the analysis of data related to diseases and their underlying causes is the drawing of geographical maps. Mapping diseases is a valuable tool for determining regions with high rates of disease that require therapeutic intervention. The objective of this study was to investigate the obesity pattern in Iran by drawing geographical maps based on a Bayesian spatial model, in order to recognize the pattern of the symptom under study more carefully. Methods: The data of this study consisted of the number of obese people in the provinces of Iran by sex, based on the reports of non-contagious disease risks in 30 provinces by the Iran MSRT disease center in 2007. The analysis of data was carried out with the software R and OpenBUGS. In addition, the data required for the adjacency matrix were produced with the GeoBUGS software. Results: The greatest percentage of obese people in all age ranges (15-64) is 17.8 for men in Mazandaran and the lowest is 4.9 in Sistan and Baluchestan. For women the highest and lowest are 29.9 and 11.9 in Mazandaran and Hormozgan, respectively. Mazandaran was the province with the greatest odds ratio of obesity for both men and women. Conclusion: Recognizing the geographical distribution and the regions at high risk of obesity is a prerequisite for decision making in management and planning for the country's health system. The results can be applied in allocating resources correctly between different regions of Iran.

  13. Bayesian prediction of spatial count data using generalized linear mixed models

    DEFF Research Database (Denmark)

    Christensen, Ole Fredslund; Waagepetersen, Rasmus Plenge

    2002-01-01

    Spatial weed count data are modeled and predicted using a generalized linear mixed model combined with a Bayesian approach and Markov chain Monte Carlo. Informative priors for a data set with sparse sampling are elicited using a previously collected data set with extensive sampling. Furthermore, ...

  14. A sequential point process model and Bayesian inference for spatial point patterns with linear structures

    DEFF Research Database (Denmark)

    Møller, Jesper; Rasmussen, Jakob Gulddahl

    We introduce a flexible spatial point process model for spatial point patterns exhibiting linear structures, without incorporating a latent line process. The model is given by an underlying sequential point process model, i.e. each new point is generated given the previous points. Under this model...... previous points is such that the dependent cluster point is likely to occur closely to a previous cluster point. We demonstrate the flexibility of the model for producing point patterns with linear structures, and propose to use the model as the likelihood in a Bayesian setting when analyzing a spatial...

  15. Modelling the presence of disease under spatial misalignment using Bayesian latent Gaussian models.

    Science.gov (United States)

    Barber, Xavier; Conesa, David; Lladosa, Silvia; López-Quílez, Antonio

    2016-01-01

    Modelling patterns of the spatial incidence of diseases using local environmental factors has been a growing problem in the last few years. Geostatistical models have become popular lately because they allow estimating and predicting the underlying disease risk and relating it to possible risk factors. Our approach to these models is based on the fact that the presence/absence of a disease can be expressed with a hierarchical Bayesian spatial model that incorporates the information provided by the geographical and environmental characteristics of the region of interest. Nevertheless, our main interest here is to tackle the misalignment problem that arises when the locations with information about possible covariates are partially (or totally) different from the observed locations and from those at which we want to predict. As a result, we present two different models, depending on whether or not there is uncertainty in the covariates. In both cases, Bayesian inference on the parameters and prediction of presence/absence at new locations are made by considering the model as a latent Gaussian model, which allows the use of the integrated nested Laplace approximation. In particular, the spatial effect is implemented with the stochastic partial differential equation approach. The methodology is evaluated on the presence of Fasciola hepatica in Galicia, a region in north-west Spain. PMID:27087038

  16. Spatially-Explicit Bayesian Information Entropy Metrics for Calibrating Landscape Transformation Models

    Directory of Open Access Journals (Sweden)

    Kostas Alexandridis

    2013-06-01

    Full Text Available Assessing spatial model performance often presents challenges related to the choice and suitability of traditional statistical methods in capturing the true validity and dynamics of the predicted outcomes. The stochastic nature of many of our contemporary spatial models of land use change necessitates the testing and development of new and innovative methodologies in statistical spatial assessment. In many cases, spatial model performance depends critically on the spatially-explicit prior distributions, characteristics, availability and prevalence of the variables and factors under study. This study explores the statistical spatial characteristics of statistical model assessment of modeling land use change dynamics in a seven-county study area in South-Eastern Wisconsin during the historical period of 1963–1990. The artificial neural network-based Land Transformation Model (LTM) predictions are used to compare simulated with historical land use transformations in urban/suburban landscapes. We introduce a range of Bayesian information entropy statistical spatial metrics for assessing the model performance across multiple simulation testing runs. Bayesian entropic estimates of model performance are compared against information-theoretic stochastic entropy estimates and theoretically-derived accuracy assessments. We argue for the critical role of informational uncertainty across different scales of spatial resolution in informing spatial landscape model assessment. Our analysis reveals how incorporation of spatial and landscape information asymmetry estimates can improve our stochastic assessments of spatial model predictions. Finally, our study shows how spatially-explicit entropic classification accuracy estimates can work closely with dynamic modeling methodologies in improving our scientific understanding of landscape change as a complex adaptive system and process.

  17. Evaluation of Image Registration Spatial Accuracy Using a Bayesian Hierarchical Model

    OpenAIRE

    Liu, Suyu; Yuan, Ying; Castillo, Richard; Guerrero, Thomas; Johnson, Valen E.

    2014-01-01

    To evaluate the utility of automated deformable image registration (DIR) algorithms, it is necessary to evaluate both the registration accuracy of the DIR algorithm itself, as well as the registration accuracy of the human readers from whom the "gold standard" is obtained. We propose a Bayesian hierarchical model to evaluate the spatial accuracy of human readers and automatic DIR methods based on multiple image registration data generated by human readers and automatic DIR methods. To fully a...

  18. Bayesian spatial semi-parametric modeling of HIV variation in Kenya.

    Directory of Open Access Journals (Sweden)

    Oscar Ngesa

    Full Text Available Spatial statistics has seen rapid application in many fields, especially epidemiology and public health. Many studies, nonetheless, make limited use of the geographical location information and also usually assume that the covariates, which are related to the response variable, have linear effects. We develop a Bayesian semi-parametric regression model for HIV prevalence data. Model estimation and inference are based on a fully Bayesian approach via Markov chain Monte Carlo (MCMC). The model is applied to HIV prevalence data among men in Kenya, derived from the Kenya AIDS indicator survey, with n = 3,662. Past studies have concluded that HIV infection has a nonlinear association with age. In this study a smooth function based on penalized regression splines is used to estimate this nonlinear effect. Other covariates were assumed to have a linear effect. Spatial references to the counties were modeled as both structured and unstructured spatial effects. We observe that circumcision reduces the risk of HIV infection. The results also indicate that men in urban areas were more likely to be infected by HIV than their rural counterparts. Men with higher education had the lowest risk of HIV infection. A nonlinear relationship between HIV infection and age was established: risk of HIV infection increases with age up to the age of 40 and then declines with increasing age. Men who had an STI in the last 12 months were more likely to be infected with HIV, and men who had ever used a condom were found to have a higher likelihood of being infected by HIV. A significant spatial variation of HIV infection in Kenya was also established. The study shows the practicality and flexibility of the Bayesian semi-parametric regression model in analyzing epidemiological data.

  19. A Bayesian, spatially-varying calibration model for the TEX86 proxy

    Science.gov (United States)

    Tierney, Jessica E.; Tingley, Martin P.

    2014-02-01

    TEX86 is an important proxy for constraining ocean temperatures in the Earth's past. Current calibrations, however, feature structured residuals indicative of a spatially-varying relationship between TEX86 and sea-surface temperatures (SSTs). Here we develop and apply a Bayesian regression approach to the TEX86-SST calibration that explicitly allows for model parameters to smoothly vary as a function of space, and considers uncertainties in the modern SSTs as well as in the TEX86-SST relationship. The spatially-varying model leads to larger uncertainties at locations that are data-poor, while Bayesian inference naturally propagates calibration uncertainty into the uncertainty in the predictions. Applications to both Quaternary and Eocene TEX86 data demonstrate that our approach produces reasonable results, and improves upon previous methods by allowing for probabilistic assessments of past temperatures. The scientific understanding of TEX86 remains imperfect, and the model presented here allows for predictions that implicitly account for the effects of environmental factors other than SSTs that lead to a spatially non-stationary TEX86-SST relationship.
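
    A central point above is that Bayesian inference propagates calibration (parameter) uncertainty into the temperature predictions. The Python sketch below illustrates only that idea: given hypothetical posterior draws of the intercept, slope, and residual scale of a linear TEX86 = a + b * SST calibration at one location, it inverts the relationship draw by draw to obtain a predictive distribution for SST. It is not the authors' spatially-varying model, and all numbers are made up.

      import numpy as np

      def predict_sst(tex86, intercept_draws, slope_draws, sigma_draws, seed=0):
          """Invert TEX86 = a + b*SST per posterior draw, adding residual noise on TEX86."""
          rng = np.random.default_rng(seed)
          a, b, s = map(np.asarray, (intercept_draws, slope_draws, sigma_draws))
          noisy_tex = tex86 + s * rng.standard_normal(a.shape)
          return (noisy_tex - a) / b

      # hypothetical posterior draws for one location
      rng = np.random.default_rng(1)
      a = rng.normal(0.10, 0.01, 2000)
      b = rng.normal(0.02, 0.002, 2000)
      s = np.abs(rng.normal(0.03, 0.005, 2000))
      sst = predict_sst(0.55, a, b, s)
      print(sst.mean(), np.percentile(sst, [2.5, 97.5]))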

  20. Bayesian statistical modeling of spatially correlated error structure in atmospheric tracer inverse analysis

    Directory of Open Access Journals (Sweden)

    C. Mukherjee

    2011-01-01

    Full Text Available Inverse modeling applications in atmospheric chemistry are increasingly addressing the challenging statistical issues of data synthesis by adopting refined statistical analysis methods. This paper advances this line of research by addressing several central questions in inverse modeling, focusing specifically on Bayesian statistical computation. Motivated by problems of refining bottom-up estimates of source/sink fluxes of trace gas and aerosols based on increasingly high-resolution satellite retrievals of atmospheric chemical concentrations, we address head-on the need for integrating formal spatial statistical methods of residual error structure in global scale inversion models. We do this using analytically and computationally tractable spatial statistical models, known as conditional autoregressive spatial models, as components of a global inversion framework. We develop Markov chain Monte Carlo methods to explore and fit these spatial structures in an overall statistical framework that simultaneously estimates source fluxes. Additional aspects of the study extend the statistical framework to utilize priors in a more physically realistic manner, and to formally address and deal with missing data in satellite retrievals. We demonstrate the analysis in the context of inferring carbon monoxide (CO) sources constrained by satellite retrievals of column CO from the Measurement of Pollution in the Troposphere (MOPITT) instrument on the TERRA satellite, paying special attention to evaluating performance of the inverse approach using various statistical diagnostic metrics. This is developed using synthetic data generated to resemble MOPITT data to define a proof-of-concept and model assessment, and then in analysis of real MOPITT data.

  1. Hierarchical Bayesian spatial models for alcohol availability, drug "hot spots" and violent crime

    Directory of Open Access Journals (Sweden)

    Horel Scott

    2006-12-01

    Full Text Available Abstract Background: Ecologic studies have shown a relationship between alcohol outlet densities, illicit drug use and violence. The present study examined this relationship in the City of Houston, Texas, using a sample of 439 census tracts. Neighborhood sociostructural covariates, alcohol outlet density, drug crime density and violent crime data were collected for the year 2000, and analyzed using hierarchical Bayesian models. Model selection was accomplished by applying the Deviance Information Criterion. Results: The counts of violent crime in each census tract were modelled as having a conditional Poisson distribution. Four neighbourhood explanatory variables were identified using principal component analysis. The best-fitting model was the one that included both unstructured and spatially dependent random effects. The results showed that drug-law violation explained a greater amount of variance in violent crime rates than alcohol outlet densities. The relative risk for drug-law violation was 2.49 and that for alcohol outlet density was 1.16. Of the neighbourhood sociostructural covariates, males of age 15 to 24 showed an effect on violence, with a 16% decrease in relative risk for each one-standard-deviation increase in that covariate. Both the unstructured heterogeneity random effect and the spatial dependence random effect need to be included in the model. Conclusion: The analysis presented suggests that activity around illicit drug markets is more strongly associated with violent crime than is alcohol outlet density. Unique among the ecological studies in this field, the present study not only shows the direction and magnitude of the impact of neighbourhood sociostructural covariates as well as alcohol and illicit drug activities in a neighbourhood, it also reveals the importance of applying hierarchical Bayesian models in this research field, as both spatial dependence and heterogeneity random effects need to be considered simultaneously.

  2. Bayesian statistical modeling of spatially correlated error structure in atmospheric tracer inverse analysis

    Directory of Open Access Journals (Sweden)

    C. Mukherjee

    2011-06-01

    Full Text Available We present and discuss the use of Bayesian modeling and computational methods for atmospheric chemistry inverse analyses that incorporate evaluation of spatial structure in model-data residuals. Motivated by problems of refining bottom-up estimates of source/sink fluxes of trace gas and aerosols based on satellite retrievals of atmospheric chemical concentrations, we address the need for formal modeling of spatial residual error structure in global scale inversion models. We do this using analytically and computationally tractable conditional autoregressive (CAR) spatial models as components of a global inversion framework. We develop Markov chain Monte Carlo methods to explore and fit these spatial structures in an overall statistical framework that simultaneously estimates source fluxes. Additional aspects of the study extend the statistical framework to utilize priors on source fluxes in a physically realistic manner, and to formally address and deal with missing data in satellite retrievals. We demonstrate the analysis in the context of inferring carbon monoxide (CO) sources constrained by satellite retrievals of column CO from the Measurement of Pollution in the Troposphere (MOPITT) instrument on the TERRA satellite, paying special attention to evaluating performance of the inverse approach using various statistical diagnostic metrics. This is developed using synthetic data generated to resemble MOPITT data to define a proof-of-concept and model assessment, and then in analysis of real MOPITT data. These studies demonstrate the ability of these simple spatial models to substantially improve over standard non-spatial models in terms of statistical fit, ability to recover sources in synthetic examples, and predictive match with real data.

  3. Estimating Temporal Trend in the Presence of Spatial Complexity: A Bayesian Hierarchical Model for a Wetland Plant Population Undergoing Restoration

    OpenAIRE

    Rodhouse, Thomas J.; Kathryn M Irvine; Vierling, Kerri T.; Lee A Vierling

    2011-01-01

    Monitoring programs that evaluate restoration and inform adaptive management are important for addressing environmental degradation. These efforts may be well served by spatially explicit hierarchical approaches to modeling because of unavoidable spatial structure inherited from past land use patterns and other factors. We developed Bayesian hierarchical models to estimate trends from annual density counts observed in a spatially structured wetland forb (Camassia quamash [camas]) population f...

  4. Spatial Clustering of Curves with Functional Covariates: A Bayesian Partitioning Model with Application to Spectra Radiance in Climate Study

    OpenAIRE

    Zhang, Zhen; Lim, Chae Young; Maiti, Tapabrata; Kato, Seiji

    2016-01-01

    In climate change study, the infrared spectral signatures of climate change have recently been conceptually adopted, and widely applied to identifying and attributing atmospheric composition change. We propose a Bayesian hierarchical model for spatial clustering of the high-dimensional functional data based on the effects of functional covariates and local features. We couple the functional mixed-effects model with a generalized spatial partitioning method for: (1) producing spatially contigu...

  5. BSMac: a MATLAB toolbox implementing a Bayesian spatial model for brain activation and connectivity.

    Science.gov (United States)

    Zhang, Lijun; Agravat, Sanjay; Derado, Gordana; Chen, Shuo; McIntosh, Belinda J; Bowman, F DuBois

    2012-02-15

    We present a statistical and graphical visualization MATLAB toolbox for the analysis of functional magnetic resonance imaging (fMRI) data, called the Bayesian Spatial Model for activation and connectivity (BSMac). BSMac simultaneously performs whole-brain activation analyses at the voxel and region of interest (ROI) levels as well as task-related functional connectivity (FC) analyses using a flexible Bayesian modeling framework (Bowman et al., 2008). BSMac allows for inputting data in either Analyze or Nifti file formats. The user provides information pertaining to subgroup memberships, scanning sessions, and experimental tasks (stimuli), from which the design matrix is constructed. BSMac then performs parameter estimation based on Markov Chain Monte Carlo (MCMC) methods and generates plots for activation and FC, such as interactive 2D maps of voxel and region-level task-related changes in neural activity and animated 3D graphics of the FC results. The toolbox can be downloaded from http://www.sph.emory.edu/bios/CBIS/. We illustrate the BSMac toolbox through an application to an fMRI study of working memory in patients with schizophrenia. PMID:22101143

  6. A Bayesian Hierarchical Model for Estimation of Abundance and Spatial Density of Aedes aegypti.

    Directory of Open Access Journals (Sweden)

    Daniel A M Villela

    Full Text Available Strategies to minimize dengue transmission commonly rely on vector control, which aims to maintain Ae. aegypti density below a theoretical threshold. Mosquito abundance is traditionally estimated from mark-release-recapture (MRR) experiments, which lack proper analysis regarding accurate vector spatial distribution and population density. Recently proposed strategies to control vector-borne diseases involve replacing the susceptible wild population by genetically modified individuals refractory to infection by the pathogen. Accurate measurements of mosquito abundance in time and space are required to optimize the success of such interventions. In this paper, we present a hierarchical probabilistic model for the estimation of population abundance and spatial distribution from typical mosquito MRR experiments, with direct application to the planning of these new control strategies. We perform a Bayesian analysis using the model and data from two MRR experiments performed in a neighborhood of Rio de Janeiro, Brazil, during both low- and high-dengue transmission seasons. The hierarchical model indicates that mosquito spatial distribution is clustered during the winter (0.99 mosquitoes/premise, 95% CI: 0.80-1.23) and more homogeneous during the high-abundance period (5.2 mosquitoes/premise, 95% CI: 4.3-5.9). The hierarchical model also performed better than the commonly used Fisher-Ford's method when using simulated data. The proposed model provides a formal treatment of the sources of uncertainty associated with the estimation of mosquito abundance imposed by the sampling design. Our approach is useful in strategies such as population suppression or the displacement of wild vector populations by refractory Wolbachia-infected mosquitoes, since the invasion dynamics have been shown to follow threshold conditions dictated by mosquito abundance. The presence of spatially distributed abundance hotspots is also formally addressed under this modeling

  7. Bayesian inference in camera trapping studies for a class of spatial capture-recapture models

    Science.gov (United States)

    Royle, J. Andrew; Karanth, K. Ullas; Gopalaswamy, Arjun M.; Kumar, N. Samba

    2009-01-01

    We develop a class of models for inference about abundance or density using spatial capture-recapture data from studies based on camera trapping and related methods. The model is a hierarchical model composed of two components: a point process model describing the distribution of individuals in space (or their home range centers) and a model describing the observation of individuals in traps. We suppose that trap- and individual-specific capture probabilities are a function of distance between individual home range centers and trap locations. We show that the models can be regarded as generalized linear mixed models, where the individual home range centers are random effects. We adopt a Bayesian framework for inference under these models using a formulation based on data augmentation. We apply the models to camera trapping data on tigers from the Nagarahole Reserve, India, collected over 48 nights in 2006. For this study, 120 camera locations were used, but cameras were only operational at 30 locations during any given sample occasion. Movement of traps is common in many camera-trapping studies and represents an important feature of the observation model that we address explicitly in our application.
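
    In this class of spatial capture-recapture models, trap- and individual-specific capture probabilities decline with the distance between an individual's home-range center and the trap location. A minimal sketch of that ingredient, assuming the commonly used half-normal form p0 * exp(-d^2 / (2 * sigma^2)); the functional form, parameter values, and coordinates below are illustrative and not taken from the tiger study.

      import numpy as np

      def capture_prob(traps, centers, p0=0.3, sigma=0.5):
          """Matrix of capture probabilities: rows are individuals, columns are traps."""
          d2 = ((centers[:, None, :] - traps[None, :, :]) ** 2).sum(axis=-1)   # squared distances
          return p0 * np.exp(-d2 / (2.0 * sigma ** 2))

      traps = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])       # hypothetical trap locations
      centers = np.array([[0.2, 0.1], [0.9, 0.8]])                 # hypothetical home-range centers
      print(capture_prob(traps, centers))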

  8. Assessing the Effectiveness of Local Management of Coral Reefs Using Expert Opinion and Spatial Bayesian Modeling

    OpenAIRE

    Stephen S Ban; Pressey, Robert L.; Graham, Nicholas A. J.

    2015-01-01

    Multiple stressors are an increasing concern in the management and conservation of ecosystems, and have been identified as a key gap in research. Coral reefs are one example of an ecosystem where management of local stressors may be a way of mitigating or delaying the effects of climate change. Predicting how multiple stressors interact, particularly in a spatially explicit fashion, is a difficult challenge. Here we use a combination of an expert-elicited Bayesian network (BN) and spatial env...

  9. A bayesian integrative model for genetical genomics with spatially informed variable selection.

    Science.gov (United States)

    Cassese, Alberto; Guindani, Michele; Vannucci, Marina

    2014-01-01

    We consider a Bayesian hierarchical model for the integration of gene expression levels with comparative genomic hybridization (CGH) array measurements collected on the same subjects. The approach defines a measurement error model that relates the gene expression levels to latent copy number states. In turn, the latent states are related to the observed surrogate CGH measurements via a hidden Markov model. The model further incorporates variable selection with a spatial prior based on a probit link that exploits dependencies across adjacent DNA segments. Posterior inference is carried out via Markov chain Monte Carlo stochastic search techniques. We study the performance of the model in simulations and show better results than those achieved with recently proposed alternative priors. We also show an application to data from a genomic study on lung squamous cell carcinoma, where we identify potential candidates of associations between copy number variants and the transcriptional activity of target genes. Gene ontology (GO) analyses of our findings reveal enrichments in genes that code for proteins involved in cancer. Our model also identifies a number of potential candidate biomarkers for further experimental validation. PMID:25288877

  10. Bayesian Graphical Models

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Nielsen, Thomas Dyhre

    2016-01-01

    Mathematically, a Bayesian graphical model is a compact representation of the joint probability distribution for a set of variables. The most frequently used type of Bayesian graphical models are Bayesian networks. The structural part of a Bayesian graphical model is a graph consisting of nodes and...... largely due to the availability of efficient inference algorithms for answering probabilistic queries about the states of the variables in the network. Furthermore, to support the construction of Bayesian network models, learning algorithms are also available. We give an overview of the Bayesian network...

  11. A Bayesian approach for inverse modeling, data assimilation, and conditional simulation of spatial random fields

    Science.gov (United States)

    Rubin, Yoram; Chen, Xingyuan; Murakami, Haruko; Hahn, Melanie

    2010-10-01

    This paper addresses the inverse problem in spatially variable fields such as hydraulic conductivity in groundwater aquifers or rainfall intensity in hydrology. Common to all these problems is the existence of a complex pattern of spatial variability of the target variables and observations, the multiple sources of data available for characterizing the fields, the complex relations between the observed and target variables and the multiple scales and frequencies of the observations. The method of anchored distributions (MAD) that we propose here is a general Bayesian method of inverse modeling of spatial random fields that addresses this complexity. The central elements of MAD are a modular classification of all relevant data and a new concept called "anchors." Data types are classified by the way they relate to the target variable, as either local or nonlocal and as either direct or indirect. Anchors are devices for localization of data: they are used to convert nonlocal, indirect data into local distributions of the target variables. The target of the inversion is the derivation of the joint distribution of the anchors and structural parameters, conditional to all measurements, regardless of scale or frequency of measurement. The structural parameters describe large-scale trends of the target variable fields, whereas the anchors capture local inhomogeneities. Following inversion, the joint distribution of anchors and structural parameters is used for generating random fields of the target variable(s) that are conditioned on the nonlocal, indirect data through their anchor representation. We demonstrate MAD through a detailed case study that assimilates point measurements of the conductivity with head measurements from natural gradient flow. The resulting statistical distributions of the parameters are non-Gaussian. Similarly, the moments of the estimates of the hydraulic head are non-Gaussian. We provide an extended discussion of MAD vis à vis other inversion

  12. Exploring the Specifications of Spatial Adjacencies and Weights in Bayesian Spatial Modeling with Intrinsic Conditional Autoregressive Priors in a Small-area Study of Fall Injuries

    OpenAIRE

    Jane Law

    2016-01-01

    Intrinsic conditional autoregressive modeling in a Bayesian hierarchical framework has been increasingly applied in small-area ecological studies. This study explores the specifications of spatial structure in this Bayesian framework in two aspects: adjacency, i.e., the set of neighbor(s) for each area; and (spatial) weight for each pair of neighbors. Our analysis was based on a small-area study of falling injuries among people age 65 and older in Ontario, Canada, that was aimed to estimate r...

  13. Estimating temporal trend in the presence of spatial complexity: a Bayesian hierarchical model for a wetland plant population undergoing restoration.

    Science.gov (United States)

    Rodhouse, Thomas J; Irvine, Kathryn M; Vierling, Kerri T; Vierling, Lee A

    2011-01-01

    Monitoring programs that evaluate restoration and inform adaptive management are important for addressing environmental degradation. These efforts may be well served by spatially explicit hierarchical approaches to modeling because of unavoidable spatial structure inherited from past land use patterns and other factors. We developed Bayesian hierarchical models to estimate trends from annual density counts observed in a spatially structured wetland forb (Camassia quamash [camas]) population following the cessation of grazing and mowing on the study area, and in a separate reference population of camas. The restoration site was bisected by roads and drainage ditches, resulting in distinct subpopulations ("zones") with different land use histories. We modeled this spatial structure by fitting zone-specific intercepts and slopes. We allowed spatial covariance parameters in the model to vary by zone, as in stratified kriging, accommodating anisotropy and improving computation and biological interpretation. Trend estimates provided evidence of a positive effect of passive restoration, and the strength of evidence was influenced by the amount of spatial structure in the model. Allowing trends to vary among zones and accounting for topographic heterogeneity increased precision of trend estimates. Accounting for spatial autocorrelation shifted parameter coefficients in ways that varied among zones depending on strength of statistical shrinkage, autocorrelation and topographic heterogeneity--a phenomenon not widely described. Spatially explicit estimates of trend from hierarchical models will generally be more useful to land managers than pooled regional estimates and provide more realistic assessments of uncertainty. The ability to grapple with historical contingency is an appealing benefit of this approach. PMID:22163047

  14. Estimating temporal trend in the presence of spatial complexity: A Bayesian hierarchical model for a wetland plant population undergoing restoration

    Science.gov (United States)

    Rodhouse, T.J.; Irvine, K.M.; Vierling, K.T.; Vierling, L.A.

    2011-01-01

    Monitoring programs that evaluate restoration and inform adaptive management are important for addressing environmental degradation. These efforts may be well served by spatially explicit hierarchical approaches to modeling because of unavoidable spatial structure inherited from past land use patterns and other factors. We developed Bayesian hierarchical models to estimate trends from annual density counts observed in a spatially structured wetland forb (Camassia quamash [camas]) population following the cessation of grazing and mowing on the study area, and in a separate reference population of camas. The restoration site was bisected by roads and drainage ditches, resulting in distinct subpopulations ("zones") with different land use histories. We modeled this spatial structure by fitting zone-specific intercepts and slopes. We allowed spatial covariance parameters in the model to vary by zone, as in stratified kriging, accommodating anisotropy and improving computation and biological interpretation. Trend estimates provided evidence of a positive effect of passive restoration, and the strength of evidence was influenced by the amount of spatial structure in the model. Allowing trends to vary among zones and accounting for topographic heterogeneity increased precision of trend estimates. Accounting for spatial autocorrelation shifted parameter coefficients in ways that varied among zones depending on strength of statistical shrinkage, autocorrelation and topographic heterogeneity-a phenomenon not widely described. Spatially explicit estimates of trend from hierarchical models will generally be more useful to land managers than pooled regional estimates and provide more realistic assessments of uncertainty. The ability to grapple with historical contingency is an appealing benefit of this approach.

  15. Order-Constrained Reference Priors with Implications for Bayesian Isotonic Regression, Analysis of Covariance and Spatial Models

    Science.gov (United States)

    Gong, Maozhen

    Selecting an appropriate prior distribution is a fundamental issue in Bayesian Statistics. In this dissertation, under the framework provided by Berger and Bernardo, I derive the reference priors for several models which include: Analysis of Variance (ANOVA)/Analysis of Covariance (ANCOVA) models with a categorical variable under common ordering constraints, the conditionally autoregressive (CAR) models and the simultaneous autoregressive (SAR) models with a spatial autoregression parameter rho considered. The performances of reference priors for ANOVA/ANCOVA models are evaluated by simulation studies with comparisons to Jeffreys' prior and Least Squares Estimation (LSE). The priors are then illustrated in a Bayesian model of the "Risk of Type 2 Diabetes in New Mexico" data, where the relationship between the type 2 diabetes risk (through Hemoglobin A1c) and different smoking levels is investigated. In both simulation studies and real data set modeling, the reference priors that incorporate internal order information show good performances and can be used as default priors. The reference priors for the CAR and SAR models are also illustrated in the "1999 SAT State Average Verbal Scores" data with a comparison to a Uniform prior distribution. Due to the complexity of the reference priors for both CAR and SAR models, only a portion (12 states in the Midwest) of the original data set is considered. The reference priors can give a different marginal posterior distribution compared to a Uniform prior, which provides an alternative for prior specifications for areal data in Spatial statistics.

  16. A sequential point process model and Bayesian inference for spatial point patterns with linear structures

    DEFF Research Database (Denmark)

    Møller, Jesper; Rasmussen, Jakob Gulddahl

    We introduce a flexible spatial point process model for spatial point patterns exhibiting linear structures, without incorporating a latent line process. The model is given by an underlying sequential point process model, i.e. each new point is generated given the previous points. Under this mode...

  17. Spatial guilds in the Serengeti food web revealed by a Bayesian group model.

    Directory of Open Access Journals (Sweden)

    Edward B Baskerville

    2011-12-01

    Full Text Available Food webs, networks of feeding relationships in an ecosystem, provide fundamental insights into mechanisms that determine ecosystem stability and persistence. A standard approach in food-web analysis, and network analysis in general, has been to identify compartments, or modules, defined by many links within compartments and few links between them. This approach can identify large habitat boundaries in the network but may fail to identify other important structures. Empirical analyses of food webs have been further limited by low-resolution data for primary producers. In this paper, we present a Bayesian computational method for identifying group structure using a flexible definition that can describe both functional trophic roles and standard compartments. We apply this method to a newly compiled plant-mammal food web from the Serengeti ecosystem that includes high taxonomic resolution at the plant level, allowing a simultaneous examination of the signature of both habitat and trophic roles in network structure. We find that groups at the plant level reflect habitat structure, coupled at higher trophic levels by groups of herbivores, which are in turn coupled by carnivore groups. Thus the group structure of the Serengeti web represents a mixture of trophic guild structure and spatial pattern, in contrast to the standard compartments typically identified. The network topology supports recent ideas on spatial coupling and energy channels in ecosystems that have been proposed as important for persistence. Furthermore, our Bayesian approach provides a powerful, flexible framework for the study of network structure, and we believe it will prove instrumental in a variety of biological contexts.

  18. Exploring the Specifications of Spatial Adjacencies and Weights in Bayesian Spatial Modeling with Intrinsic Conditional Autoregressive Priors in a Small-area Study of Fall Injuries

    Directory of Open Access Journals (Sweden)

    Jane Law

    2016-03-01

    Full Text Available Intrinsic conditional autoregressive modeling in a Bayesian hierarchical framework has been increasingly applied in small-area ecological studies. This study explores the specifications of spatial structure in this Bayesian framework in two aspects: adjacency, i.e., the set of neighbor(s) for each area; and (spatial) weight for each pair of neighbors. Our analysis was based on a small-area study of falling injuries among people age 65 and older in Ontario, Canada, that was aimed at estimating risks and identifying risk factors of such falls. In the case study, we observed incorrect adjacency information caused by deficiencies in the digital map itself. Further, when equal weights were replaced by weights based on a variable of expected count, the range of estimated risks increased, the number of areas with a probability of estimated risk greater than one at different probability thresholds increased, and model fit improved. More importantly, the significance of a risk factor diminished. Further research is recommended to thoroughly investigate different methods of variable weights, quantify the influence of specifications of spatial weights, and develop strategies for better defining the spatial structure of a map in small-area analysis in Bayesian hierarchical spatial modeling.

  19. A sequential point process model and Bayesian inference for spatial point patterns with linear structures

    DEFF Research Database (Denmark)

    Møller, Jesper; Rasmussen, Jakob Gulddahl

    2012-01-01

    We introduce a flexible spatial point process model for spatial point patterns exhibiting linear structures, without incorporating a latent line process. The model is given by an underlying sequential point process model. Under this model, the points can be of one of three types: a ‘background...... point’ an ‘independent cluster point’ or a ‘dependent cluster point’. The background and independent cluster points are thought to exhibit ‘complete spatial randomness’, whereas the dependent cluster points are likely to occur close to previous cluster points. We demonstrate the flexibility of the model...

  20. Bayesian hierarchical spatial models to improve forest variable prediction and mapping with Light Detection and Ranging data sets

    Science.gov (United States)

    Ball, Jessica Lynne

    Light Detection and Ranging (LiDAR) data has shown great potential to estimate spatially explicit forest variables, including above-ground biomass, stem density, tree height, and more. Due to its ability to garner information about the vertical and horizontal structure of forest canopies effectively and efficiently, LiDAR sensors have played a key role in the development of operational air and space-borne instruments capable of gathering information about forest structure at regional, continental, and global scales. Combining LiDAR datasets with field-based validation measurements to build predictive models is becoming an attractive solution to the problem of quantifying and mapping forest structure for private forest land owners and local, state, and federal government entities alike. As with any statistical model using spatially indexed data, the potential to violate modeling assumptions resulting from spatial correlation is high. This thesis explores several different modeling frameworks that aim to accommodate correlation structures within model residuals. The development is motivated using LiDAR and forest inventory datasets. Special attention is paid to estimation and propagation of parameter and model uncertainty through to prediction units. Inference follows a Bayesian statistical paradigm. Results suggest the proposed frameworks help ensure model assumptions are met and prediction performance can be improved by pursuing spatially enabled models.

  1. Estimating temporal trend in the presence of spatial complexity: a Bayesian hierarchical model for a wetland plant population undergoing restoration.

    Directory of Open Access Journals (Sweden)

    Thomas J Rodhouse

    Full Text Available Monitoring programs that evaluate restoration and inform adaptive management are important for addressing environmental degradation. These efforts may be well served by spatially explicit hierarchical approaches to modeling because of unavoidable spatial structure inherited from past land use patterns and other factors. We developed Bayesian hierarchical models to estimate trends from annual density counts observed in a spatially structured wetland forb (Camassia quamash [camas]) population following the cessation of grazing and mowing on the study area, and in a separate reference population of camas. The restoration site was bisected by roads and drainage ditches, resulting in distinct subpopulations ("zones") with different land use histories. We modeled this spatial structure by fitting zone-specific intercepts and slopes. We allowed spatial covariance parameters in the model to vary by zone, as in stratified kriging, accommodating anisotropy and improving computation and biological interpretation. Trend estimates provided evidence of a positive effect of passive restoration, and the strength of evidence was influenced by the amount of spatial structure in the model. Allowing trends to vary among zones and accounting for topographic heterogeneity increased precision of trend estimates. Accounting for spatial autocorrelation shifted parameter coefficients in ways that varied among zones depending on strength of statistical shrinkage, autocorrelation and topographic heterogeneity--a phenomenon not widely described. Spatially explicit estimates of trend from hierarchical models will generally be more useful to land managers than pooled regional estimates and provide more realistic assessments of uncertainty. The ability to grapple with historical contingency is an appealing benefit of this approach.

  2. Bayesian spatial modeling of cetacean sightings during a seismic acquisition survey.

    Science.gov (United States)

    Vilela, Raul; Pena, Ursula; Esteban, Ruth; Koemans, Robin

    2016-08-15

    Visual monitoring of marine mammals was carried out during a seismic acquisition survey performed in waters south of Portugal, with the aim of assessing the likelihood of encountering Mysticeti species in this region and of determining the impact of the seismic activity on encounters. Sightings and effort data were assembled with a range of environmental variables at different lags, and a Bayesian site-occupancy modeling approach was used to develop prediction maps and to evaluate how species-specific habitat associations changed with the presence or absence of seismic activity. No statistical evidence was found of a decrease in Mysticeti sighting rates when the seismic source was active; rather, Mysticeti distribution during the survey period appeared to be driven solely by environmental variables. Although further research is needed, possible explanations include habituation to anthropogenic noise and a seismic activity zone coincident with a naturally low-density area. PMID:27210556

  3. Baltic sea algae analysis using Bayesian spatial statistics methods

    Directory of Open Access Journals (Sweden)

    Eglė Baltmiškytė

    2013-03-01

    Full Text Available Spatial statistics is the field of statistics dealing with the analysis of spatially distributed data. Recently, Bayesian methods have often been applied to such analyses. This article describes a spatial data model for predicting algae quantity in the Baltic Sea. Black Carrageen is the dependent variable, and depth, sand, pebble, and boulders are the independent variables in the described model. Two models with different covariance functions (Gaussian and exponential) are built to determine which fits best for predicting algae quantity. Unknown model parameters are estimated and the Bayesian kriging predictive posterior distribution is computed in the OpenBUGS modeling environment using Bayesian spatial statistics methods.
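    For orientation, the sketch below shows the two covariance functions the record compares (Gaussian and exponential) together with a bare-bones simple-kriging weight computation; all parameter values and data are illustrative assumptions, not results from the article.

```python
import numpy as np

def exponential_cov(h, sigma2=1.0, phi=1.0):
    """Exponential covariance: C(h) = sigma^2 * exp(-h / phi)."""
    return sigma2 * np.exp(-h / phi)

def gaussian_cov(h, sigma2=1.0, phi=1.0):
    """Gaussian (squared-exponential) covariance: C(h) = sigma^2 * exp(-(h/phi)^2)."""
    return sigma2 * np.exp(-(h / phi) ** 2)

def simple_kriging(obs_locs, obs_vals, new_loc, cov=exponential_cov, nugget=1e-6):
    """Zero-mean simple-kriging prediction at one new location."""
    D = np.linalg.norm(obs_locs[:, None, :] - obs_locs[None, :, :], axis=-1)
    d0 = np.linalg.norm(obs_locs - new_loc, axis=-1)
    K = cov(D) + nugget * np.eye(len(obs_vals))
    weights = np.linalg.solve(K, cov(d0))
    return weights @ obs_vals

# Tiny hypothetical example: values observed at three sites, predicted at a fourth.
locs = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
vals = np.array([2.0, 3.5, 1.0])
print(simple_kriging(locs, vals, np.array([0.5, 0.5])))
print(simple_kriging(locs, vals, np.array([0.5, 0.5]), cov=gaussian_cov))
```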

  4. Maximum Likelihood Bayesian Averaging of Spatial Variability Models in Unsaturated Fractured Tuff

    International Nuclear Information System (INIS)

    Hydrologic analyses typically rely on a single conceptual-mathematical model. Yet hydrologic environments are open and complex, rendering them prone to multiple interpretations and mathematical descriptions. Adopting only one of these may lead to statistical bias and underestimation of uncertainty. Bayesian Model Averaging (BMA) provides an optimal way to combine the predictions of several competing models and to assess their joint predictive uncertainty. However, it tends to be computationally demanding and relies heavily on prior information about model parameters. We apply a maximum likelihood (ML) version of BMA (MLBMA) to seven alternative variogram models of log air permeability data from single-hole pneumatic injection tests in six boreholes at the Apache Leap Research Site (ALRS) in central Arizona. Unbiased ML estimates of variogram and drift parameters are obtained using Adjoint State Maximum Likelihood Cross Validation in conjunction with Universal Kriging and Generalized Least Squares. Standard information criteria provide an ambiguous ranking of the models, which does not justify selecting one of them and discarding all others as is commonly done in practice. Instead, we eliminate some of the models based on their negligibly small posterior probabilities and use the rest to project the measured log permeabilities by kriging onto a rock volume containing the six boreholes. We then average these four projections, and associated kriging variances, using the posterior probability of each model as weight. Finally, we cross-validate the results by eliminating from consideration all data from one borehole at a time, repeating the above process, and comparing the predictive capability of MLBMA with that of each individual model. We find that MLBMA is superior to any individual geostatistical model of log permeability among those we consider at the ALRS
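    The model-averaging step described here can be summarized, in generic form, by the sketch below: posterior model weights derived from an information criterion are used to combine each retained model's kriged mean and variance. The numbers and the simple exp(-ΔIC/2) weighting are illustrative assumptions, not values or formulas taken from the study.

```python
import numpy as np

def bma_weights(ic_values):
    """Posterior model weights from information-criterion values (smaller = better)."""
    ic = np.asarray(ic_values, dtype=float)
    delta = ic - ic.min()
    w = np.exp(-0.5 * delta)
    return w / w.sum()

def bma_combine(means, variances, weights):
    """Model-averaged mean and variance (within-model + between-model spread)."""
    m, v, w = map(np.asarray, (means, variances, weights))
    mean = np.sum(w * m)
    var = np.sum(w * (v + (m - mean) ** 2))
    return mean, var

# Hypothetical example: four retained variogram models at one prediction location.
w = bma_weights([100.2, 101.5, 103.0, 108.7])
print(w)
print(bma_combine(means=[2.1, 2.3, 1.9, 2.6], variances=[0.4, 0.5, 0.3, 0.6], weights=w))
```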

  5. Bayesian default probability models

    OpenAIRE

    Andrlíková, Petra

    2014-01-01

    This paper proposes a methodology for default probability estimation for low default portfolios, where the statistical inference may become troublesome. The author suggests using logistic regression models with the Bayesian estimation of parameters. The piecewise logistic regression model and Box-Cox transformation of credit risk score is used to derive the estimates of probability of default, which extends the work by Neagu et al. (2009). The paper shows that the Bayesian models are more acc...

  6. A multi-resolution, non-parametric, Bayesian framework for identification of spatially-varying model parameters

    CERN Document Server

    Koutsourelakis, P S

    2008-01-01

    This paper proposes a hierarchical, multi-resolution framework for the identification of model parameters and their spatial variability from noisy measurements of the response or output. Such parameters are frequently encountered in PDE-based models and correspond to quantities such as density or pressure fields, elasto-plastic moduli and internal variables in solid mechanics, conductivity fields in heat diffusion problems, permeability fields in fluid flow through porous media, etc. The proposed model has all the advantages of traditional Bayesian formulations, such as the ability to produce measures of confidence for the inferences made and to provide not only predictive estimates but also quantitative measures of the predictive uncertainty. In contrast to existing approaches it utilizes a parsimonious, non-parametric formulation that favors sparse representations and whose complexity can be determined from the data. The proposed framework is non-intrusive and makes use of a sequence of forward solvers opera...

  7. Geographical mapping and Bayesian spatial modeling of malaria incidence in Sistan and Baluchistan province, Iran

    Institute of Scientific and Technical Information of China (English)

    Farid Zayeri; Masoud Salehi; Hasan Pirhosseini

    2011-01-01

    Objective: To present the geographical map of malaria and identify some of the important environmental factors of this disease in Sistan and Baluchistan province, Iran. Methods: We used the registered malaria data to compute the standard incidence rates (SIRs) of malaria in different areas of Sistan and Baluchistan province for a nine-year period (from 2001 to 2009). Statistical analyses consisted of two different parts: geographical mapping of malaria incidence rates, and modeling the environmental factors. The empirical Bayesian estimates of malaria SIRs were utilized for geographical mapping of malaria, and a Poisson random effects model was used for assessing the effect of environmental factors on malaria SIRs. Results: In general, 64 926 new cases of malaria were registered in Sistan and Baluchistan province from 2001 to 2009. Among them, 42 695 patients (65.8%) were male and 22 231 patients (34.2%) were female. Modeling the environmental factors showed that malaria incidence rates had a positive relationship with humidity, elevation, average minimum temperature and average maximum temperature, while rainfall had a negative effect on malaria SIRs in this province. Conclusions: The results of the present study reveal that malaria is still a serious health problem in Sistan and Baluchistan province, Iran. The geographical map and related environmental factors of malaria can help health policy makers to intervene in high-risk areas more efficiently and to allocate resources in a proper manner.
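    The two analysis steps described (area-level SIRs and a Poisson model with the expected counts as an offset) can be sketched roughly as follows. This is a fixed-effects simplification of the record's random-effects model, using statsmodels on entirely hypothetical data and covariate names.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical counts for four areas.
observed = np.array([120, 45, 300, 80])           # registered malaria cases
expected = np.array([100.0, 60.0, 250.0, 90.0])   # expected cases under indirect standardization
sir = observed / expected                          # standardized incidence ratio per area

# Hypothetical environmental covariates (humidity %, elevation m).
X = sm.add_constant(np.column_stack([
    [60.0, 45.0, 70.0, 50.0],
    [1200.0, 800.0, 1500.0, 900.0],
]))

# Poisson regression with log(expected) as offset, so coefficients act on the SIR scale.
fit = sm.GLM(observed, X, family=sm.families.Poisson(), offset=np.log(expected)).fit()
print(sir)
print(fit.params)
```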

  8. Bayesians in Space: Using Bayesian Methods to Inform Choice of Spatial Weights Matrix in Hedonic Property Analyses

    OpenAIRE

    Mueller, Julie M.; Loomis, John B.

    2010-01-01

    The choice of weights is a non-nested problem in most applied spatial econometric models. Despite numerous recent advances in spatial econometrics, the choice of spatial weights remains exogenously determined by the researcher in empirical applications. Bayesian techniques provide statistical evidence regarding the simultaneous choice of model specification and spatial weights matrices by using posterior probabilities. This paper demonstrates the Bayesian estimation approach in a spatial hedo...

  9. A Bayesian heterogeneous coefficients spatial autoregressive panel data model of retail fuel price rivalry

    OpenAIRE

    Lesage, James P.; Vance, Colin; Chih, Yao-Yu

    2016-01-01

    We apply a heterogeneous coefficient spatial autoregressive panel model from Aquaro, Bailey and Pesaran (2015) to explore competition/cooperation by Berlin fueling stations in setting prices for diesel and E5 fuel. Unlike the maximum likelihood estimation method set forth by Aquaro, Bailey and Pesaran (2015), we rely on a Markov Chain Monte Carlo (MCMC) estimation methodology. MCMC estimates as applied here with non-informative priors will produce estimates equal to those from maximum likeliho...

  10. A Bayesian spatial assimilation scheme for snow coverage observations in a gridded snow model

    OpenAIRE

    Kolberg, S.; Rue, H.; Gottschalk, L.

    2006-01-01

    A method for assimilating remotely sensed snow covered area (SCA) into the snow subroutine of a grid distributed precipitation-runoff model (PRM) is presented. The PRM is assumed to simulate the snow state in each grid cell by a snow depletion curve (SDC), which relates that cell's SCA to its snow cover mass balance. The assimilation is based on Bayes' theorem, which requires a joint prior distribution of the SDC variables in all the grid cells. In this paper we propose a spatial model for th...

  11. A Bayesian spatial assimilation scheme for snow coverage observations in a gridded snow model

    Directory of Open Access Journals (Sweden)

    S. Kolberg

    2006-01-01

    Full Text Available A method for assimilating remotely sensed snow covered area (SCA) into the snow subroutine of a grid distributed precipitation-runoff model (PRM) is presented. The PRM is assumed to simulate the snow state in each grid cell by a snow depletion curve (SDC), which relates that cell's SCA to its snow cover mass balance. The assimilation is based on Bayes' theorem, which requires a joint prior distribution of the SDC variables in all the grid cells. In this paper we propose a spatial model for this prior distribution, and include similarities and dependencies among the grid cells. Used to represent the PRM simulated snow cover state, our joint prior model regards two elevation gradients and a degree-day factor as global variables, rather than describing their effect separately for each cell. This transformation results in smooth normalised surfaces for the two related mass balance variables, supporting a strong inter-cell dependency in their joint prior model. The global features and spatial interdependency in the prior model cause each SCA observation to provide information for many grid cells. The spatial approach similarly facilitates the utilisation of observed discharge. Assimilation of SCA data using the proposed spatial model is evaluated in a 2400 km² mountainous region in central Norway (61° N, 9° E), based on two Landsat 7 ETM+ images generalized to 1 km² resolution. An image acquired on 11 May, a week before the peak flood, removes 78% of the variance in the remaining snow storage. Even an image from 4 May, less than a week after the melt onset, reduces this variance by 53%. These results are largely improved compared to a cell-by-cell independent assimilation routine previously reported. Including observed discharge in the updating information improves the 4 May results, but has weak effect on 11 May. Estimated elevation gradients are shown to be sensitive to informational deficits occurring at high altitude, where snowmelt has not started
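    As a generic, stand-alone illustration of the Bayes-update step at the heart of such assimilation (and not the paper's joint spatial prior), the sketch below updates a discretized prior for one cell's snow storage using a single SCA observation passed through an assumed snow depletion curve; all curves and numbers are hypothetical.

```python
import numpy as np

swe_grid = np.linspace(0, 500, 101)                   # candidate snow storage values (mm)
prior = np.exp(-0.5 * ((swe_grid - 250) / 80) ** 2)   # prior from the runoff model (assumed Gaussian)
prior /= prior.sum()

def sca_from_swe(swe, melt=150.0):
    """Hypothetical snow depletion curve: fraction of the cell still snow covered."""
    return np.clip(swe / (swe + melt), 0.0, 1.0)

observed_sca, obs_sd = 0.6, 0.1                       # assumed satellite SCA and its uncertainty
likelihood = np.exp(-0.5 * ((sca_from_swe(swe_grid) - observed_sca) / obs_sd) ** 2)

posterior = prior * likelihood                        # Bayes' theorem on the discretized grid
posterior /= posterior.sum()
print("posterior mean SWE (mm):", np.sum(swe_grid * posterior))
```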

  12. Evaluation of the spatial patterns and risk factors, including backyard pigs, for classical swine fever occurrence in Bulgaria using a Bayesian model

    OpenAIRE

    Beatriz Martínez-López; Tsviatko Alexandrov; Lina Mur; Fernando Sánchez-Vizcaíno; Sánchez-Vizcaíno, José M.

    2014-01-01

    The spatial pattern and epidemiology of backyard pig farming and other low bio-security pig production systems and their role in the occurrence of classical swine fever (CSF) is described and evaluated. A spatial Bayesian model was used to explore the risk factors, including human demographics, socioeconomic and environmental factors. The analyses were performed for Bulgaria, which has a large number of backyard farms (96% of all pig farms in the country are classified as backyard farms), and...

  13. A spatial Bayesian network model to assess the benefits of early warning for urban flood risk to people

    Science.gov (United States)

    Balbi, S.; Villa, F.; Mojtahed, V.; Hegetschweiler, K. T.; Giupponi, C.

    2015-10-01

    This article presents a novel methodology to assess flood risk to people by integrating people's vulnerability and ability to cushion hazards through coping and adapting. The proposed approach extends traditional risk assessments beyond material damages; complements quantitative and semi-quantitative data with subjective and local knowledge, improving the use of commonly available information; produces estimates of model uncertainty by providing probability distributions for all of its outputs. Flood risk to people is modeled using a spatially explicit Bayesian network model calibrated on expert opinion. Risk is assessed in terms of: (1) likelihood of non-fatal physical injury; (2) likelihood of post-traumatic stress disorder; (3) likelihood of death. The study area covers the lower part of the Sihl valley (Switzerland) including the city of Zurich. The model is used to estimate the benefits of improving an existing Early Warning System, taking into account the reliability, lead-time and scope (i.e. coverage of people reached by the warning). Model results indicate that the potential benefits of an improved early warning in terms of avoided human impacts are particularly relevant in case of a major flood event: about 75 % of fatalities, 25 % of injuries and 18 % of post-traumatic stress disorders could be avoided.

  14. A spatial Bayesian network model to assess the benefits of early warning for urban flood risk to people

    Science.gov (United States)

    Balbi, Stefano; Villa, Ferdinando; Mojtahed, Vahid; Hegetschweiler, Karin Tessa; Giupponi, Carlo

    2016-06-01

    This article presents a novel methodology to assess flood risk to people by integrating people's vulnerability and ability to cushion hazards through coping and adapting. The proposed approach extends traditional risk assessments beyond material damages; complements quantitative and semi-quantitative data with subjective and local knowledge, improving the use of commonly available information; and produces estimates of model uncertainty by providing probability distributions for all of its outputs. Flood risk to people is modeled using a spatially explicit Bayesian network model calibrated on expert opinion. Risk is assessed in terms of (1) likelihood of non-fatal physical injury, (2) likelihood of post-traumatic stress disorder and (3) likelihood of death. The study area covers the lower part of the Sihl valley (Switzerland) including the city of Zurich. The model is used to estimate the effect of improving an existing early warning system, taking into account the reliability, lead time and scope (i.e., coverage of people reached by the warning). Model results indicate that the potential benefits of an improved early warning in terms of avoided human impacts are particularly relevant in case of a major flood event.
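    To make the modeling idea concrete, the sketch below builds a toy discrete Bayesian network in which an early warning influences evacuation and, through it, injury risk, and then queries the injury probability with and without a warning. The structure, states, and probabilities are invented for illustration and are not the study's expert-calibrated network; pgmpy is used only as a convenient library.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("Warning", "Evacuated"), ("Evacuated", "Injury")])

cpd_warning = TabularCPD("Warning", 2, [[0.4], [0.6]])            # P(no warning), P(warning)
cpd_evac = TabularCPD("Evacuated", 2,
                      [[0.9, 0.3],    # P(not evacuated | warning = 0, 1)
                       [0.1, 0.7]],   # P(evacuated     | warning = 0, 1)
                      evidence=["Warning"], evidence_card=[2])
cpd_injury = TabularCPD("Injury", 2,
                        [[0.85, 0.98],   # P(no injury | evacuated = 0, 1)
                         [0.15, 0.02]],  # P(injury    | evacuated = 0, 1)
                        evidence=["Evacuated"], evidence_card=[2])

model.add_cpds(cpd_warning, cpd_evac, cpd_injury)
assert model.check_model()

infer = VariableElimination(model)
print(infer.query(["Injury"], evidence={"Warning": 1}))  # injury risk given a warning
print(infer.query(["Injury"], evidence={"Warning": 0}))  # injury risk without a warning
```

    Calibrating a network on expert opinion, as in the study, amounts to eliciting the entries of tables like these (for many more variables) rather than estimating them from data.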

  15. Association between the Density of Physicians and Suicide Rates in Japan: Nationwide Ecological Study Using a Spatial Bayesian Model.

    Directory of Open Access Journals (Sweden)

    Hideaki Kawaguchi

    Full Text Available Regional disparity in suicide rates is a serious problem worldwide. One possible cause is unequal distribution of the health workforce, especially psychiatrists. Research about the association between regional physician numbers and suicide rates is therefore important, but studies are rare. The objective of this study was to evaluate the association between physician numbers and suicide rates in Japan, by municipality. The study included all the municipalities in Japan (n = 1,896). We estimated smoothed standardized mortality ratios of suicide rates for each municipality and evaluated the association between health workforce and suicide rates using a hierarchical Bayesian model accounting for spatially correlated random effects, a conditional autoregressive model. We assumed a Poisson distribution for the observed number of suicides and set the expected number of suicides as the offset variable. The explanatory variables were numbers of physicians, a binary variable for the presence of psychiatrists, and social covariates. After adjustment for socioeconomic factors, suicide rates in municipalities that had at least one psychiatrist were lower than those in the other municipalities. There was, however, a positive and statistically significant association between the number of physicians and suicide rates. Suicide rates in municipalities that had at least one psychiatrist were lower than those in other municipalities, but the number of physicians was positively and significantly related to suicide rates. To reduce the regional disparity in suicide rates, the government should encourage psychiatrists to participate in community-based suicide prevention programs and to settle in municipalities that currently have no psychiatrists. The government and other stakeholders should also construct better networks between psychiatrists and non-psychiatrists to support sharing of information for suicide prevention.

  16. Bayesian inference of the lung alveolar spatial model for the identification of alveolar mechanics associated with acute respiratory distress syndrome

    Science.gov (United States)

    Christley, Scott; Emr, Bryanna; Ghosh, Auyon; Satalin, Josh; Gatto, Louis; Vodovotz, Yoram; Nieman, Gary F.; An, Gary

    2013-06-01

    Acute respiratory distress syndrome (ARDS) is acute lung failure secondary to severe systemic inflammation, resulting in a derangement of alveolar mechanics (i.e. the dynamic change in alveolar size and shape during tidal ventilation), leading to alveolar instability that can cause further damage to the pulmonary parenchyma. Mechanical ventilation is a mainstay in the treatment of ARDS, but may induce mechano-physical stresses on unstable alveoli, which can paradoxically propagate the cellular and molecular processes exacerbating ARDS pathology. This phenomenon is called ventilator induced lung injury (VILI), and plays a significant role in morbidity and mortality associated with ARDS. In order to identify optimal ventilation strategies to limit VILI and treat ARDS, it is necessary to understand the complex interplay between biological and physical mechanisms of VILI, first at the alveolar level, and then in aggregate at the whole-lung level. Since there is no current consensus about the underlying dynamics of alveolar mechanics, as an initial step we investigate the ventilatory dynamics of an alveolar sac (AS) with the lung alveolar spatial model (LASM), a 3D spatial biomechanical representation of the AS and its interaction with airflow pressure and the surface tension effects of pulmonary surfactant. We use the LASM to identify the mechanical ramifications of alveolar dynamics associated with ARDS. Using graphical processing unit parallel algorithms, we perform Bayesian inference on the model parameters using experimental data from rat lung under control and Tween-induced ARDS conditions. Our results provide two plausible models that recapitulate two fundamental hypotheses about volume change at the alveolar level: (1) increase in alveolar size through isotropic volume change, or (2) minimal change in AS radius with primary expansion of the mouth of the AS, with the implication that the majority of change in lung volume during the respiratory cycle occurs in the

  17. Bayesian inference of the lung alveolar spatial model for the identification of alveolar mechanics associated with acute respiratory distress syndrome

    International Nuclear Information System (INIS)

    Acute respiratory distress syndrome (ARDS) is acute lung failure secondary to severe systemic inflammation, resulting in a derangement of alveolar mechanics (i.e. the dynamic change in alveolar size and shape during tidal ventilation), leading to alveolar instability that can cause further damage to the pulmonary parenchyma. Mechanical ventilation is a mainstay in the treatment of ARDS, but may induce mechano-physical stresses on unstable alveoli, which can paradoxically propagate the cellular and molecular processes exacerbating ARDS pathology. This phenomenon is called ventilator induced lung injury (VILI), and plays a significant role in morbidity and mortality associated with ARDS. In order to identify optimal ventilation strategies to limit VILI and treat ARDS, it is necessary to understand the complex interplay between biological and physical mechanisms of VILI, first at the alveolar level, and then in aggregate at the whole-lung level. Since there is no current consensus about the underlying dynamics of alveolar mechanics, as an initial step we investigate the ventilatory dynamics of an alveolar sac (AS) with the lung alveolar spatial model (LASM), a 3D spatial biomechanical representation of the AS and its interaction with airflow pressure and the surface tension effects of pulmonary surfactant. We use the LASM to identify the mechanical ramifications of alveolar dynamics associated with ARDS. Using graphical processing unit parallel algorithms, we perform Bayesian inference on the model parameters using experimental data from rat lung under control and Tween-induced ARDS conditions. Our results provide two plausible models that recapitulate two fundamental hypotheses about volume change at the alveolar level: (1) increase in alveolar size through isotropic volume change, or (2) minimal change in AS radius with primary expansion of the mouth of the AS, with the implication that the majority of change in lung volume during the respiratory cycle occurs in the

  18. Applied Bayesian modelling

    CERN Document Server

    Congdon, Peter

    2014-01-01

    This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBU

  19. Spatial Bayesian Latent Factor Regression Modeling of Coordinate-based Meta-analysis Data

    OpenAIRE

    Montagna, Silvia; Wager, Tor; Feldman-Barrett, Lisa; Timothy D. Johnson; Nichols, Thomas E.

    2016-01-01

    Now over 20 years old, functional MRI (fMRI) has a large and growing literature that is best synthesised with meta-analytic tools. As most authors do not share image data, only the peak activation coordinates (foci) reported in the paper are available for Coordinate-based Meta-analysis (CBMA). Neuroimaging meta-analysis is used to 1) identify areas of consistent activation; and 2) build a predictive model of task type or cognitive process for new studies (reverse inference). To simultaneously...

  20. Hierarchical Bayesian spatial models for predicting multiple forest variables using waveform LiDAR, hyperspectral imagery, and large inventory datasets

    Science.gov (United States)

    Finley, Andrew O.; Banerjee, Sudipto; Cook, Bruce D.; Bradford, John B.

    2013-01-01

    In this paper we detail a multivariate spatial regression model that couples LiDAR, hyperspectral and forest inventory data to predict forest outcome variables at a high spatial resolution. The proposed model is used to analyze forest inventory data collected on the US Forest Service Penobscot Experimental Forest (PEF), ME, USA. In addition to helping meet the regression model's assumptions, results from the PEF analysis suggest that the addition of multivariate spatial random effects improves model fit and predictive ability, compared with two commonly applied modeling approaches. This improvement results from explicitly modeling the covariation among forest outcome variables and spatial dependence among observations through the random effects. Direct application of such multivariate models to even moderately large datasets is often computationally infeasible because of cubic order matrix algorithms involved in estimation. We apply a spatial dimension reduction technique to help overcome this computational hurdle without sacrificing richness in modeling.
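    One common low-rank device for the dimension-reduction step mentioned here is a knot-based ("predictive process"-style) approximation of the spatial covariance. The sketch below shows the basic linear algebra with an assumed exponential kernel and an arbitrary knot count, purely as an illustration of how the cubic-order cost in the number of observations is avoided; it is not the paper's implementation.

```python
import numpy as np

def exp_cov(A, B, sigma2=1.0, phi=0.3):
    """Exponential covariance between two sets of 2-D locations."""
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)
    return sigma2 * np.exp(-d / phi)

rng = np.random.default_rng(0)
locs = rng.random((2000, 2))    # observation locations
knots = rng.random((64, 2))     # much smaller set of knot locations

C_kk = exp_cov(knots, knots) + 1e-8 * np.eye(len(knots))   # 64 x 64 knot covariance
C_nk = exp_cov(locs, knots)                                 # 2000 x 64 cross-covariance

# The low-rank surrogate C_nk C_kk^{-1} C_nk^T stands in for the full 2000 x 2000
# covariance, so the expensive solves involve only the 64 x 64 knot matrix.
low_rank = C_nk @ np.linalg.solve(C_kk, C_nk.T)
print(C_kk.shape, C_nk.shape, low_rank.shape)
```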

  1. Spatial Patterns of Ischemic Heart Disease in Shenzhen, China: A Bayesian Multi-Disease Modelling Approach to Inform Health Planning Policies.

    Science.gov (United States)

    Du, Qingyun; Zhang, Mingxiao; Li, Yayan; Luan, Hui; Liang, Shi; Ren, Fu

    2016-01-01

    Incorporating the information of hypertension, this paper applies Bayesian multi-disease analysis to model the spatial patterns of Ischemic Heart Disease (IHD) risks. Patterns of harmful alcohol intake (HAI) and overweight/obesity are also modelled as they are common risk factors contributing to both IHD and hypertension. The hospitalization data of IHD and hypertension in 2012 were analyzed with three Bayesian multi-disease models at the sub-district level of Shenzhen. Results revealed that the IHD high-risk cluster shifted slightly north-eastward compared with the IHD Standardized Hospitalization Ratio (SHR). Spatial variations of overweight/obesity and HAI were found to contribute most to the IHD patterns. Identified patterns of IHD risk would benefit IHD integrated prevention. Spatial patterns of overweight/obesity and HAI could supplement the current disease surveillance system by providing information about small-area level risk factors, and thus benefit integrated prevention of related chronic diseases. Middle southern Shenzhen, where high risk of IHD, overweight/obesity, and HAI are present, should be prioritized for interventions, including alcohol control, innovative healthy diet toolkit distribution, insurance system revision, and community-based chronic disease intervention. Related health resource planning is also suggested to focus on these areas first. PMID:27104551

  2. Bayesian mixture models for Poisson astronomical images

    OpenAIRE

    Guglielmetti, Fabrizia; Fischer, Rainer; Dose, Volker

    2012-01-01

    Astronomical images in the Poisson regime are typically characterized by a spatially varying cosmic background, large variety of source morphologies and intensities, data incompleteness, steep gradients in the data, and few photon counts per pixel. The Background-Source separation technique is developed with the aim to detect faint and extended sources in astronomical images characterized by Poisson statistics. The technique employs Bayesian mixture models to reliably detect the background as...

  3. Bayesian Model Averaging for Propensity Score Analysis

    Science.gov (United States)

    Kaplan, David; Chen, Jianshen

    2013-01-01

    The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…

  4. Bayesian modeling using WinBUGS

    CERN Document Server

    Ntzoufras, Ioannis

    2009-01-01

    A hands-on introduction to the principles of Bayesian modeling using WinBUGS. Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author provides an accessible treatment of the topic, offering readers a smooth introduction to the principles of Bayesian modeling with detailed guidance on the practical implementation of key principles. The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including: Markov chain Monte Carlo algorithms in Bayesian inference; generalized linear models; Bayesian hierarchical models; predictive distribution and model checking; and Bayesian model and variable evaluation. Computational notes and screen captures illustrate the use of both WinBUGS and R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts and all ...

  5. Spatial Dependence and Heterogeneity in Bayesian Factor Analysis: A Cross-National Investigation of Schwartz Values

    Science.gov (United States)

    Stakhovych, Stanislav; Bijmolt, Tammo H. A.; Wedel, Michel

    2012-01-01

    In this article, we present a Bayesian spatial factor analysis model. We extend previous work on confirmatory factor analysis by including geographically distributed latent variables and accounting for heterogeneity and spatial autocorrelation. The simulation study shows excellent recovery of the model parameters and demonstrates the consequences…

  6. Bayesian kinematic earthquake source models

    Science.gov (United States)

    Minson, S. E.; Simons, M.; Beck, J. L.; Genrich, J. F.; Galetzka, J. E.; Chowdhury, F.; Owen, S. E.; Webb, F.; Comte, D.; Glass, B.; Leiva, C.; Ortega, F. H.

    2009-12-01

    Most coseismic, postseismic, and interseismic slip models are based on highly regularized optimizations which yield one solution which satisfies the data given a particular set of regularizing constraints. This regularization hampers our ability to answer basic questions such as whether seismic and aseismic slip overlap or instead rupture separate portions of the fault zone. We present a Bayesian methodology for generating kinematic earthquake source models with a focus on large subduction zone earthquakes. Unlike classical optimization approaches, Bayesian techniques sample the ensemble of all acceptable models presented as an a posteriori probability density function (PDF), and thus we can explore the entire solution space to determine, for example, which model parameters are well determined and which are not, or what is the likelihood that two slip distributions overlap in space. Bayesian sampling also has the advantage that all a priori knowledge of the source process can be used to mold the a posteriori ensemble of models. Although very powerful, Bayesian methods have up to now been of limited use in geophysical modeling because they are only computationally feasible for problems with a small number of free parameters due to what is called the "curse of dimensionality." However, our methodology can successfully sample solution spaces of many hundreds of parameters, which is sufficient to produce finite fault kinematic earthquake models. Our algorithm is a modification of the tempered Markov chain Monte Carlo (tempered MCMC or TMCMC) method. In our algorithm, we sample a "tempered" a posteriori PDF using many MCMC simulations running in parallel and evolutionary computation in which models which fit the data poorly are preferentially eliminated in favor of models which better predict the data. We present results for both synthetic test problems as well as for the 2007 Mw 7.8 Tocopilla, Chile earthquake, the latter of which is constrained by InSAR, local high

  7. Evaluation of the spatial patterns and risk factors, including backyard pigs, for classical swine fever occurrence in Bulgaria using a Bayesian model

    Directory of Open Access Journals (Sweden)

    Beatriz Martínez-López

    2014-05-01

    Full Text Available The spatial pattern and epidemiology of backyard pig farming and other low bio-security pig production systems and their role in the occurrence of classical swine fever (CSF) is described and evaluated. A spatial Bayesian model was used to explore the risk factors, including human demographics, socioeconomic and environmental factors. The analyses were performed for Bulgaria, which has a large number of backyard farms (96% of all pig farms in the country are classified as backyard farms), and it is one of the countries for which both backyard pig and farm counts were available. Results reveal that the high-risk areas are typically concentrated in areas with small family farms, high numbers of outgoing pig shipments and low levels of personal consumption (i.e. economically deprived areas). Identification of risk factors and high-risk areas for CSF will allow targeting of risk-based surveillance strategies, leading to prevention, control and, ultimately, elimination of the disease in Bulgaria and other countries with similar socio-epidemiological conditions.

  8. A Bayesian Nonparametric IRT Model

    OpenAIRE

    Karabatsos, George

    2015-01-01

    This paper introduces a flexible Bayesian nonparametric Item Response Theory (IRT) model, which applies to dichotomous or polytomous item responses, and which can apply to either unidimensional or multidimensional scaling. This is an infinite-mixture IRT model, with person ability and item difficulty parameters, and with a random intercept parameter that is assigned a mixing distribution, with mixing weights a probit function of other person and item parameters. As a result of its flexibility...

  9. Bayesian Stable Isotope Mixing Models

    OpenAIRE

    Parnell, Andrew C.; Phillips, Donald L.; Bearhop, Stuart; Semmens, Brice X.; Ward, Eric J.; Moore, Jonathan W.; Andrew L Jackson; Inger, Richard

    2012-01-01

    In this paper we review recent advances in Stable Isotope Mixing Models (SIMMs) and place them into an over-arching Bayesian statistical framework which allows for several useful extensions. SIMMs are used to quantify the proportional contributions of various sources to a mixture. The most widely used application is quantifying the diet of organisms based on the food sources they have been observed to consume. At the centre of the multivariate statistical model we propose is a compositional m...

  10. Spatial Dependence and Heterogeneity in Bayesian Factor Analysis : A Cross-National Investigation of Schwartz Values

    NARCIS (Netherlands)

    Stakhovych, Stanislav; Bijmolt, Tammo H. A.; Wedel, Michel

    2012-01-01

    In this article, we present a Bayesian spatial factor analysis model. We extend previous work on confirmatory factor analysis by including geographically distributed latent variables and accounting for heterogeneity and spatial autocorrelation. The simulation study shows excellent recovery of the mo

  11. Relative accuracy of spatial predictive models for lynx Lynx canadensis derived using logistic regression-AIC, multiple criteria evaluation and Bayesian approaches

    Institute of Scientific and Technical Information of China (English)

    Hejun KANG; Shelley M.ALEXANDER

    2009-01-01

    We compared probability surfaces derived using one set of environmental variables in three Geographic Information Systems (GIS)-based approaches: logistic regression and Akaike's Information Criterion (AIC), Multiple Criteria Evaluation (MCE), and Bayesian Analysis (specifically Dempster-Shafer theory). We used lynx Lynx canadensis as our focal species, and developed our environment relationship model using track data collected in Banff National Park, Alberta, Canada, during winters from 1997 to 2000. The accuracy of the three spatial models was compared using a contingency table method. We determined the percentage of cases in which both presence and absence points were correctly classified (overall accuracy), the failure to predict a species where it occurred (omission error) and the prediction of presence where there was absence (commission error). Our overall accuracy showed the logistic regression approach was the most accurate (74.51%). The multiple criteria evaluation was intermediate (39.22%), while the Dempster-Shafer (D-S) theory model was the poorest (29.90%). However, omission and commission error tell us a different story: logistic regression had the lowest commission error, while D-S theory produced the lowest omission error. Our results provide evidence that habitat modellers should evaluate all three error measures when ascribing confidence in their model. We suggest that for our study area at least, the logistic regression model is optimal. However, where sample size is small or the species is very rare, it may also be useful to explore and/or use a more ecologically cautious modelling approach (e.g. Dempster-Shafer) that would over-predict, protect more sites, and thereby minimize the risk of missing critical habitat in conservation plans.
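    For reference, the three accuracy measures used to compare the models can be computed from a standard 2x2 contingency table as in the sketch below. The counts are hypothetical, and commission error is computed here as the fraction of observed absences predicted as presences, which is one of several conventions in use.

```python
def accuracy_measures(tp, fp, fn, tn):
    """Overall accuracy, omission error and commission error from a 2x2 table."""
    overall = (tp + tn) / (tp + fp + fn + tn)   # presences and absences correctly classified
    omission = fn / (tp + fn)                   # observed presences the model missed
    commission = fp / (fp + tn)                 # observed absences predicted as presence
    return overall, omission, commission

# Hypothetical counts: 60 presences predicted present, 20 false presences,
# 25 missed presences, 95 correctly predicted absences.
print(accuracy_measures(tp=60, fp=20, fn=25, tn=95))
```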

  12. Spatial cluster modelling

    CERN Document Server

    Lawson, Andrew B

    2002-01-01

    Research has generated a number of advances in methods for spatial cluster modelling in recent years, particularly in the area of Bayesian cluster modelling. Along with these advances has come an explosion of interest in the potential applications of this work, especially in epidemiology and genome research. In one integrated volume, this book reviews the state-of-the-art in spatial clustering and spatial cluster modelling, bringing together research and applications previously scattered throughout the literature. It begins with an overview of the field, then presents a series of chapters that illuminate the nature and purpose of cluster modelling within different application areas, including astrophysics, epidemiology, ecology, and imaging. The focus then shifts to methods, with discussions on point and object process modelling, perfect sampling of cluster processes, partitioning in space and space-time, spatial and spatio-temporal process modelling, nonparametric methods for clustering, and spatio-temporal ...

  13. Bayesian variable order Markov models: Towards Bayesian predictive state representations

    NARCIS (Netherlands)

    C. Dimitrakakis

    2009-01-01

    We present a Bayesian variable order Markov model that shares many similarities with predictive state representations. The resulting models are compact and much easier to specify and learn than classical predictive state representations. Moreover, we show that they significantly outperform a more st

  14. Nonparametric Bayesian Modeling of Complex Networks

    DEFF Research Database (Denmark)

    Schmidt, Mikkel Nørgaard; Mørup, Morten

    2013-01-01

    Modeling structure in complex networks using Bayesian nonparametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data. This article provides a gentle introduction to nonparametric Bayesian modeling of complex networks: Using ... for complex networks can be derived and point out relevant literature ...

  15. A Bayesian Spatial Model Highlights Distinct Dynamics in Deforestation from Coca and Pastures in an Andean Biodiversity Hotspot

    OpenAIRE

    Maria Alejandra Chadid; Dávalos, Liliana M.; Jorge Molina; Dolors Armenteras

    2015-01-01

    The loss of tropical forests has continued in recent decades despite wide recognition of their importance to maintaining biodiversity. Here, we examine the conversion of forests to pastures and coca crops (illicit activity) on the San Lucas Mountain Range, Colombia for 2002–2007 and 2007–2010. Land use maps and biophysical variables were used as inputs to generate land use and cover change (LUCC) models using the DINAMICA EGO software. These analyses revealed a dramatic acceleration of the pa...

  16. A Bayesian multidimensional scaling procedure for the spatial analysis of revealed choice data

    NARCIS (Netherlands)

    DeSarbo, WS; Kim, Y; Fong, D

    1999-01-01

    We present a new Bayesian formulation of a vector multidimensional scaling procedure for the spatial analysis of binary choice data. The Gibbs sampler is gainfully employed to estimate the posterior distribution of the specified scalar products, bilinear model parameters. The computational procedure

  17. A Bayesian Spatial Model Highlights Distinct Dynamics in Deforestation from Coca and Pastures in an Andean Biodiversity Hotspot

    Directory of Open Access Journals (Sweden)

    Maria Alejandra Chadid

    2015-10-01

    Full Text Available The loss of tropical forests has continued in recent decades despite wide recognition of their importance to maintaining biodiversity. Here, we examine the conversion of forests to pastures and coca crops (illicit activity) on the San Lucas Mountain Range, Colombia, for 2002–2007 and 2007–2010. Land use maps and biophysical variables were used as inputs to generate land use and cover change (LUCC) models using the DINAMICA EGO software. These analyses revealed a dramatic acceleration of the pace of deforestation in the region, with rates of conversion from forest to pasture doubling from the first to the second period. Altitude, distance to other crops, and distance to rivers were the primary drivers of deforestation. The influence of these drivers, however, differed markedly depending on whether coca cultivation or pastures replaced forest. Conversion to coca was more probable farther from other crops and from settlements. In contrast, proximity to other crops and to settlements increased conversion to pasture. These relationships highlight the different roles of coca and pastures in forest loss, with coca tending to open up new forest frontiers, and pastures tending to consolidate agricultural expansion and urban influence. Large differences between LUCC processes for each period suggest highly dynamic changes, likely associated with shifting underlying causes of deforestation. These changes may relate to shifts in demand for illicit crops, land, or mining products; however, the data to test these hypotheses are currently lacking. More frequent and detailed monitoring is required to guide actions to decrease the loss of forest in this highly vulnerable biodiversity hotspot in the Northern Andes.

  18. A Bayesian approach to model uncertainty

    International Nuclear Information System (INIS)

    A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given

  19. Computational methods for Bayesian model choice

    OpenAIRE

    Robert, Christian P.; Wraith, Darren

    2009-01-01

    In this note, we briefly survey some recent approaches to approximating the Bayes factor used in Bayesian hypothesis testing and Bayesian model choice. In particular, we reassess importance sampling, harmonic mean sampling, and nested sampling from a unified perspective.

  20. Learning Bayesian networks from big meteorological spatial datasets. An alternative to complex network analysis

    Science.gov (United States)

    Gutiérrez, Jose Manuel; San Martín, Daniel; Herrera, Sixto; Santiago Cofiño, Antonio

    2016-04-01

    The growing availability of spatial datasets (observations, reanalysis, and regional and global climate models) demands efficient multivariate spatial modeling techniques for many problems of interest (e.g. teleconnection analysis, multi-site downscaling, etc.). Complex networks have recently been applied in this context using graphs built from pairwise correlations between the different stations (or grid boxes) forming the dataset. However, this analysis does not take into account the full dependence structure underlying the data, given by all possible marginal and conditional dependencies among the stations, and does not allow a probabilistic analysis of the dataset. In this talk we introduce Bayesian networks as an alternative multivariate, data-driven analysis and modeling technique which allows building a joint probability distribution of the stations including all relevant dependencies in the dataset. Bayesian networks are a sound machine learning technique that uses a graph to (1) encode the main dependencies among the variables and (2) obtain a factorization of the joint probability distribution of the stations given by a reduced number of parameters. For a particular problem, the resulting graph provides a qualitative analysis of the spatial relationships in the dataset (an alternative to complex network analysis), and the resulting model allows a probabilistic analysis of the dataset. Bayesian networks have been widely applied in many fields, but their use in climate problems is hampered by the large number of variables (stations) involved, since the complexity of the existing algorithms for learning the graphical structure from data grows nonlinearly with the number of variables. In this contribution we present a modified local learning algorithm for Bayesian networks adapted to this problem, which allows inferring the graphical structure for thousands of stations (from observations) and/or gridboxes (from model simulations), thus providing new
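    As a small-scale illustration of score-based structure learning (not the authors' modified local algorithm, which is designed to scale to thousands of stations), the sketch below learns a network over a handful of hypothetical, discretized station series using pgmpy's hill-climbing search with a BIC score.

```python
import numpy as np
import pandas as pd
from pgmpy.estimators import HillClimbSearch, BicScore

rng = np.random.default_rng(0)
n = 500
s1 = rng.integers(0, 2, n)                       # e.g. wet/dry occurrence at station 1
s2 = np.where(rng.random(n) < 0.8, s1, 1 - s1)   # station 2 tends to follow station 1
data = pd.DataFrame({
    "station_1": s1,
    "station_2": s2,
    "station_3": rng.integers(0, 2, n),
    "station_4": rng.integers(0, 2, n),
})

# Hill-climbing search over directed acyclic graphs, scored by BIC.
dag = HillClimbSearch(data).estimate(scoring_method=BicScore(data))
print(sorted(dag.edges()))    # learned qualitative dependency structure among stations
```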

  1. Bayesian Models of Brain and Behaviour

    OpenAIRE

    Penny, William

    2012-01-01

    This paper presents a review of Bayesian models of brain and behaviour. We first review the basic principles of Bayesian inference. This is followed by descriptions of sampling and variational methods for approximate inference, and forward and backward recursions in time for inference in dynamical models. The review of behavioural models covers work in visual processing, sensory integration, sensorimotor integration, and collective decision making. The review of brain models covers a range of...

  2. A Bayesian network approach to knowledge integration and representation of farm irrigation: 3. Spatial application

    Science.gov (United States)

    Robertson, D. E.; Wang, Q. J.; McAllister, A. T.; Abuzar, M.; Malano, H. M.; Etchells, T.

    2009-02-01

    Catchment managers are interested in understanding impacts of the management options they promote at both farm and regional scales. In this third paper of this series, we use Inteca-Farm, a Bayesian network model of farm irrigation in the Shepparton Irrigation Region of northern Victoria, Australia, to assess the current condition of management outcome measures and the impact of historical and future management intervention. To help overcome difficulties in comprehending modeling results that are expressed as probability distributions, to capture uncertainties, we introduce methods to spatially display and compare the output from Bayesian network models and to use these methods to compare model predictions for three management scenarios. Model predictions suggest that management intervention has made a substantial improvement to the condition of management outcome measures and that further improvements are possible. The results highlight that the management impacts are spatially variable, which demonstrates that farm modeling can provide valuable evidence in substantiating the impact of catchment management intervention.

  3. Bayesian mixture models for Poisson astronomical images

    CERN Document Server

    Guglielmetti, Fabrizia; Dose, Volker

    2012-01-01

    Astronomical images in the Poisson regime are typically characterized by a spatially varying cosmic background, large variety of source morphologies and intensities, data incompleteness, steep gradients in the data, and few photon counts per pixel. The Background-Source separation technique is developed with the aim to detect faint and extended sources in astronomical images characterized by Poisson statistics. The technique employs Bayesian mixture models to reliably detect the background as well as the sources with their respective uncertainties. Background estimation and source detection is achieved in a single algorithm. A large variety of source morphologies is revealed. The technique is applied in the X-ray part of the electromagnetic spectrum on ROSAT and Chandra data sets and it is under a feasibility study for the forthcoming eROSITA mission.
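    A stripped-down version of the per-pixel background/source separation idea, with made-up rates and prior and none of the technique's spatial modeling, looks like the following two-component Poisson mixture calculation.

```python
import numpy as np
from scipy.stats import poisson

counts = np.array([2, 3, 15, 1, 22, 4])      # hypothetical photon counts per pixel
bkg_rate, src_rate, p_src = 3.0, 18.0, 0.1   # assumed background/source rates and source prior

like_bkg = poisson.pmf(counts, bkg_rate)
like_src = poisson.pmf(counts, src_rate)
post_src = p_src * like_src / (p_src * like_src + (1 - p_src) * like_bkg)
print(np.round(post_src, 3))                 # posterior probability that each pixel holds a source
```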

  4. Statistical modelling of railway track geometry degradation using hierarchical Bayesian models

    OpenAIRE

    Andrade, António Ramos; Teixeira, P. Fonseca

    2015-01-01

    Railway maintenance planners require a predictive model that can assess the railway track geometry degradation. The present paper uses a hierarchical Bayesian model as a tool to model the main two quality indicators related to railway track geometry degradation: the standard deviation of longitudinal level defects and the standard deviation of horizontal alignment defects. Hierarchical Bayesian Models (HBM) are flexible statistical models that allow specifying different spatially correlated c...

  5. A Bayesian observer model constrained by efficient coding can explain 'anti-Bayesian' percepts.

    Science.gov (United States)

    Wei, Xue-Xin; Stocker, Alan A

    2015-10-01

    Bayesian observer models provide a principled account of the fact that our perception of the world rarely matches physical reality. The standard explanation is that our percepts are biased toward our prior beliefs. However, reported psychophysical data suggest that this view may be simplistic. We propose a new model formulation based on efficient coding that is fully specified for any given natural stimulus distribution. The model makes two new and seemingly anti-Bayesian predictions. First, it predicts that perception is often biased away from an observer's prior beliefs. Second, it predicts that stimulus uncertainty differentially affects perceptual bias depending on whether the uncertainty is induced by internal or external noise. We found that both model predictions match reported perceptual biases in perceived visual orientation and spatial frequency, and were able to explain data that have not been explained before. The model is general and should prove applicable to other perceptual variables and tasks. PMID:26343249

  6. Bayesian models a statistical primer for ecologists

    CERN Document Server

    Hobbs, N Thompson

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods-in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probabili

  7. A genetic and spatial Bayesian analysis of mastitis resistance

    OpenAIRE

    Frigessi Arnoldo; Sæbø Solve

    2004-01-01

    Abstract A nationwide health card recording system for dairy cattle was introduced in Norway in 1975 (the Norwegian Cattle Health Services). The data base holds information on mastitis occurrences on an individual cow basis. A reduction in mastitis frequency across the population is desired, and for this purpose risk factors are investigated. In this paper a Bayesian proportional hazards model is used for modelling the time to first veterinary treatment of clinical mastitis, including both ge...

  8. A genetic and spatial Bayesian analysis of mastitis resistance

    OpenAIRE

    Sæbø, Solve; Frigessi, Arnoldo

    2004-01-01

    A nationwide health card recording system for dairy cattle was introduced in Norway in 1975 (the Norwegian Cattle Health Services). The data base holds information on mastitis occurrences on an individual cow basis. A reduction in mastitis frequency across the population is desired, and for this purpose risk factors are investigated. In this paper a Bayesian proportional hazards model is used for modelling the time to first veterinary treatment of clinical mastitis, including both genetic and...

  9. Bayesian Analysis of Multivariate Probit Models

    OpenAIRE

    Siddhartha Chib; Edward Greenberg

    1996-01-01

    This paper provides a unified simulation-based Bayesian and non-Bayesian analysis of correlated binary data using the multivariate probit model. The posterior distribution is simulated by Markov chain Monte Carlo methods, and maximum likelihood estimates are obtained by a Markov chain Monte Carlo version of the E-M algorithm. Computation of Bayes factors from the simulation output is also considered. The methods are applied to a bivariate data set, to a 534-subject, four-year longitudinal dat...

  10. Bayesian Network Models for Adaptive Testing

    Czech Academy of Sciences Publication Activity Database

    Plajner, Martin; Vomlel, Jiří

    Aachen: Sun SITE Central Europe, 2016 - (Agosta, J.; Carvalho, R.), s. 24-33. (CEUR Workshop Proceedings. Vol 1565). ISSN 1613-0073. [The Twelfth UAI Bayesian Modeling Applications Workshop (BMAW 2015). Amsterdam (NL), 16.07.2015] R&D Projects: GA ČR GA13-20012S Institutional support: RVO:67985556 Keywords : Bayesian networks * Computerized adaptive testing Subject RIV: JD - Computer Applications, Robotics http://library.utia.cas.cz/separaty/2016/MTR/plajner-0458062.pdf

  11. On Bayesian Nonparametric Continuous Time Series Models

    OpenAIRE

    Karabatsos, George; Walker, Stephen G.

    2013-01-01

    This paper is a note on the use of Bayesian nonparametric mixture models for continuous time series. We identify a key requirement for such models, and then establish that there is a single type of model which meets this requirement. As it turns out, the model is well known in multiple change-point problems.

  12. Bayesian semiparametric dynamic Nelson-Siegel model

    NARCIS (Netherlands)

    C. Cakmakli

    2011-01-01

    This paper proposes the Bayesian semiparametric dynamic Nelson-Siegel model where the density of the yield curve factors and thereby the density of the yields are estimated along with other model parameters. This is accomplished by modeling the error distributions of the factors according to a Diric

  13. Bayesian calibration of car-following models

    NARCIS (Netherlands)

    Van Hinsbergen, C.P.IJ.; Van Lint, H.W.C.; Hoogendoorn, S.P.; Van Zuylen, H.J.

    2010-01-01

    Recent research has revealed that there exist large inter-driver differences in car-following behavior such that different car-following models may apply to different drivers. This study applies Bayesian techniques to the calibration of car-following models, where prior distributions on each model p

  14. Bayesian Semiparametric Modeling of Realized Covariance Matrices

    OpenAIRE

    Jin, Xin; John M Maheu

    2014-01-01

    This paper introduces several new Bayesian nonparametric models suitable for capturing the unknown conditional distribution of realized covariance (RCOV) matrices. Existing dynamic Wishart models are extended to countably infinite mixture models of Wishart and inverse-Wishart distributions. In addition to mixture models with constant weights we propose models with time-varying weights to capture time dependence in the unknown distribution. Each of our models can be combined with returns...

  15. Complex Bayesian models: construction, and sampling strategies

    OpenAIRE

    Huston, Carolyn Marie

    2011-01-01

    Bayesian models are useful tools for realistically modeling processes occurring in the real world. In particular, we consider models for spatio-temporal data where the response vector is compositional, i.e. has components that sum to one. A unique multivariate conditional hierarchical model (MVCAR) is proposed. Statistical methods for MVCAR models are well developed and we extend these tools for use with a discrete compositional response. We harness the advantages of an MVCAR model when the re...

  16. Bayesian Approach to Neuro-Rough Models for Modelling HIV

    CERN Document Server

    Marwala, Tshilidzi

    2007-01-01

    This paper proposes a new neuro-rough model for modelling the risk of HIV from demographic data. The model is formulated using a Bayesian framework and trained using the Markov chain Monte Carlo method and the Metropolis criterion. When the model was tested to estimate the risk of HIV infection given the demographic data, it was found to give an accuracy of 62%, as opposed to 58% obtained from a Bayesian-formulated rough set model trained using the Markov chain Monte Carlo method and 62% obtained from a Bayesian-formulated multi-layered perceptron (MLP) model trained using hybrid Monte Carlo. The proposed model is able to combine the accuracy of the Bayesian MLP model and the transparency of the Bayesian rough set model.

  17. Survey of Bayesian Models for Modelling of Stochastic Temporal Processes

    Energy Technology Data Exchange (ETDEWEB)

    Ng, B

    2006-10-12

    This survey gives an overview of popular generative models used in the modeling of stochastic temporal systems. In particular, this survey is organized into two parts. The first part discusses the discrete-time representations of dynamic Bayesian networks and dynamic relational probabilistic models, while the second part discusses the continuous-time representation of continuous-time Bayesian networks.

  18. Spatial correlation in Bayesian logistic regression with misclassification

    DEFF Research Database (Denmark)

    Bihrmann, Kristine; Toft, Nils; Nielsen, Søren Saxmose;

    2014-01-01

    Standard logistic regression assumes that the outcome is measured perfectly. In practice, this is often not the case, which could lead to biased estimates if not accounted for. This study presents Bayesian logistic regression with adjustment for misclassification of the outcome applied to data with....... Parameters were estimated by Markov Chain Monte Carlo methods, using slice sampling to improve convergence. The results demonstrated that adjustment for misclassification must be included to produce unbiased regression estimates. With strong correlation the ICAR model performed best. With weak or moderate...
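
    A minimal sketch of the misclassification adjustment described above, kept separate from the spatial ICAR component and the slice sampler used in the study: the likelihood is evaluated for the observed outcome by mixing the true-outcome probability with an assumed sensitivity and specificity. All names and values below are hypothetical.

```python
import numpy as np

def adjusted_loglik(beta, X, y, se=0.90, sp=0.95):
    """Logistic-regression log-likelihood when the binary outcome y is
    observed with known sensitivity (se) and specificity (sp)."""
    p_true = 1.0 / (1.0 + np.exp(-X @ beta))            # P(true outcome = 1)
    p_obs = se * p_true + (1.0 - sp) * (1.0 - p_true)   # P(observed outcome = 1)
    return np.sum(y * np.log(p_obs) + (1 - y) * np.log(1.0 - p_obs))

# toy evaluation on synthetic data
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = rng.integers(0, 2, size=200)
print(adjusted_loglik(np.array([0.0, 0.5]), X, y))
```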

  19. Spatial-Temporal Epidemiology of Tuberculosis in Mainland China: An Analysis Based on Bayesian Theory

    Directory of Open Access Journals (Sweden)

    Kai Cao

    2016-05-01

    Full Text Available Objective: To explore the spatial-temporal interaction effect within a Bayesian framework and to probe the ecological influential factors for tuberculosis. Methods: Six different statistical models containing parameters of time, space, spatial-temporal interaction and their combination were constructed based on a Bayesian framework. The optimum model was selected according to the deviance information criterion (DIC) value. Coefficients of climate variables were then estimated using the best fitting model. Results: The model containing the spatial-temporal interaction parameter was the best fitting one, with the smallest DIC value (−4,508,660). Ecological analysis results showed the relative risks (RRs) of average temperature, rainfall, wind speed, humidity, and air pressure were 1.00324 (95% CI, 1.00150–1.00550), 1.01010 (95% CI, 1.01007–1.01013), 0.83518 (95% CI, 0.93732–0.96138), 0.97496 (95% CI, 0.97181–1.01386), and 1.01007 (95% CI, 1.01003–1.01011), respectively. Conclusions: The spatial-temporal interaction was statistically meaningful and the prevalence of tuberculosis was influenced by the time and space interaction effect. Average temperature, rainfall, wind speed, and air pressure influenced tuberculosis. Average humidity had no influence on tuberculosis.
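
    The DIC used above to choose among the six candidate models is the posterior mean deviance plus a penalty for the effective number of parameters. A generic sketch of that computation from MCMC output, assuming a user-supplied log-likelihood function and a matrix of posterior draws (the toy normal-mean model below is purely illustrative):

```python
import numpy as np

def dic(loglik, draws, data):
    """Deviance information criterion from MCMC output.
    loglik(theta, data): log-likelihood at parameter vector theta.
    draws: posterior samples, one row per draw."""
    dev = np.array([-2.0 * loglik(th, data) for th in draws])
    d_bar = dev.mean()                               # posterior mean deviance
    d_hat = -2.0 * loglik(draws.mean(axis=0), data)  # deviance at posterior mean
    p_d = d_bar - d_hat                              # effective number of parameters
    return d_bar + p_d                               # smaller DIC = better-supported model

# toy check: y_i ~ N(mu, 1) with approximate posterior draws of mu
rng = np.random.default_rng(0)
y = rng.normal(1.0, 1.0, size=50)
post = rng.normal(y.mean(), 1.0 / np.sqrt(len(y)), size=(2000, 1))
ll = lambda th, d: -0.5 * np.sum((d - th[0]) ** 2) - 0.5 * len(d) * np.log(2 * np.pi)
print(dic(ll, post, y))
```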

  20. Baltic sea algae analysis using Bayesian spatial statistics methods

    OpenAIRE

    Eglė Baltmiškytė; Kęstutis Dučinskas

    2013-01-01

    Spatial statistics is one of the fields in statistics dealing with the analysis of spatially spread data. Recently, Bayesian methods have often been applied to statistical data analysis. A spatial data model for predicting algae quantity in the Baltic Sea is built and described in this article. Black Carrageen is the dependent variable and depth, sand, pebble and boulders are the independent variables in the described model. Two models with different covariance functions (Gaussian and exponential) are built to estima...

  1. Bayesian modeling and classification of neural signals

    OpenAIRE

    Lewicki, Michael S.

    1994-01-01

    Signal processing and classification algorithms often have limited applicability resulting from an inaccurate model of the signal's underlying structure. We present here an efficient, Bayesian algorithm for modeling a signal composed of the superposition of brief, Poisson-distributed functions. This methodology is applied to the specific problem of modeling and classifying extracellular neural waveforms which are composed of a superposition of an unknown number of action potentials (APs). ...

  2. Spatial Distribution of TDS in Drinking Water of Tehsil Jampur using Ordinary and Bayesian Kriging

    Directory of Open Access Journals (Sweden)

    Maqsood Ahmad

    2015-09-01

    Full Text Available In this research article, the level of TDS in groundwater over the spatial domain of Tehsil Jampur, Pakistan is considered as the response variable. Its enhanced level in drinking water produces both human health concerns and aquatic ecological impacts. Its high value causes several diseases such as gallstones, joint stiffness, obstruction of blood vessels and kidney stones. Geostatistical techniques were used to interpolate TDS at unmonitored locations of Tehsil Jampur. Four estimation techniques were comparatively studied for fitting the well known Matérn spatial covariance models. Model based Ordinary Kriging (OK) and Bayesian Kriging (BK) were used for spatial interpolation at unmonitored locations. A cross validation statistic was used to select the best interpolation technique with reduced RMSPE. Prediction maps were generated for visual presentation of the interpolated sites for both techniques. This study revealed that among thirty observed locations, 56% of water samples exceed the maximum permissible limit (1000 mg/l) of TDS as described by the WHO.
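
    Ordinary kriging, one of the two interpolators compared above, predicts TDS at an unmonitored site as a weighted average of the monitored sites, with the weights solved from the spatial covariance model under a sum-to-one constraint. A minimal sketch using an exponential covariance (a special case of the Matérn family used in the study); the coordinates, TDS values and covariance parameters below are invented for illustration.

```python
import numpy as np

def exp_cov(h, sill=1.0, range_=10.0):
    """Exponential covariance: C(h) = sill * exp(-h / range_)."""
    return sill * np.exp(-h / range_)

def ordinary_kriging(coords, values, target, sill=1.0, range_=10.0):
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    # kriging system with a Lagrange multiplier forcing the weights to sum to one
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_cov(d, sill, range_)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = exp_cov(np.linalg.norm(coords - target, axis=1), sill, range_)
    w = np.linalg.solve(A, b)[:n]
    return w @ values

coords = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0], [5.0, 5.0]])  # monitored sites (km)
tds = np.array([900.0, 1100.0, 950.0, 1200.0])                       # observed TDS (mg/l)
print(ordinary_kriging(coords, tds, np.array([2.5, 2.5])))           # prediction at a new site
```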

  3. Distributed Bayesian Networks for User Modeling

    DEFF Research Database (Denmark)

    Tedesco, Roberto; Dolog, Peter; Nejdl, Wolfgang;

    2006-01-01

    The World Wide Web is a popular platform for providing eLearning applications to a wide spectrum of users. However – as users differ in their preferences, background, requirements, and goals – applications should provide personalization mechanisms. In the Web context, user models used by such...... adaptive applications are often partial fragments of an overall user model. The fragments have then to be collected and merged into a global user profile. In this paper we investigate and present algorithms able to cope with distributed, fragmented user models – based on Bayesian Networks – in the context...... mechanism efficiently combines distributed learner models without the need to exchange internal structure of local Bayesian networks, nor local evidence between the involved platforms....

  4. Constrained bayesian inference of project performance models

    OpenAIRE

    Sunmola, Funlade

    2013-01-01

    Project performance models play an important role in the management of project success. When used for monitoring projects, they can offer predictive ability such as indications of possible delivery problems. Approaches for monitoring project performance rely on available project information including restrictions imposed on the project, particularly the constraints of cost, quality, scope and time. We study in this paper a Bayesian inference methodology for project performance modelling in ...

  5. Bayesian Network Based XP Process Modelling

    Directory of Open Access Journals (Sweden)

    Mohamed Abouelela

    2010-07-01

    Full Text Available A Bayesian Network based mathematical model has been used for modelling the Extreme Programming software development process. The model is capable of predicting the expected finish time and the expected defect rate for each XP release. Therefore, it can be used to determine the success/failure of any XP Project. The model takes into account the effect of three XP practices, namely: Pair Programming, Test Driven Development and Onsite Customer practices. The model's predictions were validated against two case studies. Results show the precision of our model especially in predicting the project finish time.

  6. A Bayesian Modelling of Wildfires in Portugal

    OpenAIRE

    Silva, Giovani L.; Soares, Paulo; Marques, Susete; Dias, Inês M.; Oliveira, Manuela M.; Borges, Guilherme J.

    2015-01-01

    In the last decade wildfires became a serious problem in Portugal due to different issues such as climatic characteristics and nature of Portuguese forest. In order to analyse wildfire data, we employ beta regression for modelling the proportion of burned forest area, under a Bayesian perspective. Our main goal is to find out fire risk factors that influence the proportion of area burned and what may make a forest type susceptible or resistant to fire. Then, we analyse wildfire...

  7. Market Segmentation Using Bayesian Model Based Clustering

    OpenAIRE

    Van Hattum, P.

    2009-01-01

    This dissertation deals with two basic problems in marketing, that are market segmentation, which is the grouping of persons who share common aspects, and market targeting, which is focusing your marketing efforts on one or more attractive market segments. For the grouping of persons who share common aspects a Bayesian model based clustering approach is proposed such that it can be applied to data sets that are specifically used for market segmentation. The cluster algorithm can handle very l...

  8. Centralized Bayesian reliability modelling with sensor networks

    Czech Academy of Sciences Publication Activity Database

    Dedecius, Kamil; Sečkárová, Vladimíra

    2013-01-01

    Roč. 19, č. 5 (2013), s. 471-482. ISSN 1387-3954 R&D Projects: GA MŠk 7D12004 Grant ostatní: GA MŠk(CZ) SVV-265315 Keywords : Bayesian modelling * Sensor network * Reliability Subject RIV: BD - Theory of Information Impact factor: 0.984, year: 2013 http://library.utia.cas.cz/separaty/2013/AS/dedecius-0392551.pdf

  9. A Bayesian semiparametric approach with change points for spatial ordinal data.

    Science.gov (United States)

    Cai, Bo; Lawson, Andrew B; McDermott, Suzanne; Aelion, C Marjorie

    2016-04-01

    The change-point model has drawn much attention over the past few decades. It can accommodate the jump process, which allows for changes of the effects before and after the change point. Intellectual disability is a long-term disability that impacts performance in cognitive aspects of life and usually has its onset prior to birth. Among many potential causes, soil chemical exposures are associated with the risk of intellectual disability in children. Motivated by a study for soil metal effects on intellectual disability, we propose a Bayesian hierarchical spatial model with change points for spatial ordinal data to detect the unknown threshold effects. The spatial continuous latent variable underlying the spatial ordinal outcome is modeled by the multivariate Gaussian process, which captures spatial variation and is centered at the nonlinear mean. The mean function is modeled by using the penalized smoothing splines for some covariates with unknown change points and the linear regression for the others. Some identifiability constraints are used to define the latent variable. A simulation example is presented to evaluate the performance of the proposed approach with the competing models. A retrospective cohort study for intellectual disability in South Carolina is used as an illustration. PMID:23070600

  10. Bayesian Inference of a Multivariate Regression Model

    Directory of Open Access Journals (Sweden)

    Marick S. Sinay

    2014-01-01

    Full Text Available We explore Bayesian inference of a multivariate linear regression model with use of a flexible prior for the covariance structure. The commonly adopted Bayesian setup involves the conjugate prior, multivariate normal distribution for the regression coefficients and inverse Wishart specification for the covariance matrix. Here we depart from this approach and propose a novel Bayesian estimator for the covariance. A multivariate normal prior for the unique elements of the matrix logarithm of the covariance matrix is considered. Such a structure allows for a richer class of prior distributions for the covariance, with respect to strength of beliefs in prior location hyperparameters, as well as the added ability to model potential correlation amongst the covariance structure. The posterior moments of all relevant parameters of interest are calculated based upon numerical results via a Markov chain Monte Carlo procedure. The Metropolis-Hastings-within-Gibbs algorithm is invoked to account for the construction of a proposal density that closely matches the shape of the target posterior distribution. As an application of the proposed technique, we investigate a multiple regression based upon the 1980 High School and Beyond Survey.

  11. Bayesian regression model for seasonal forecast of precipitation over Korea

    Science.gov (United States)

    Jo, Seongil; Lim, Yaeji; Lee, Jaeyong; Kang, Hyun-Suk; Oh, Hee-Seok

    2012-08-01

    In this paper, we apply three different Bayesian methods to the seasonal forecasting of the precipitation in a region around Korea (32.5°N-42.5°N, 122.5°E-132.5°E). We focus on the precipitation of summer season (June-July-August; JJA) for the period of 1979-2007 using the precipitation produced by the Global Data Assimilation and Prediction System (GDAPS) as predictors. Through cross-validation, we demonstrate improvement for seasonal forecast of precipitation in terms of root mean squared error (RMSE) and linear error in probability space score (LEPS). The proposed methods yield RMSE of 1.09 and LEPS of 0.31 between the predicted and observed precipitations, while the prediction using GDAPS output only produces RMSE of 1.20 and LEPS of 0.33 for CPC Merged Analyzed Precipitation (CMAP) data. For station-measured precipitation data, the RMSE and LEPS of the proposed Bayesian methods are 0.53 and 0.29, while GDAPS output is 0.66 and 0.33, respectively. The methods seem to capture the spatial pattern of the observed precipitation. The Bayesian paradigm incorporates the model uncertainty as an integral part of modeling in a natural way. We provide a probabilistic forecast integrating model uncertainty.

  12. Bayesian Kinematic Finite Fault Source Models (Invited)

    Science.gov (United States)

    Minson, S. E.; Simons, M.; Beck, J. L.

    2010-12-01

    Finite fault earthquake source models are inherently under-determined: there is no unique solution to the inverse problem of determining the rupture history at depth as a function of time and space when our data are only limited observations at the Earth's surface. Traditional inverse techniques rely on model constraints and regularization to generate one model from the possibly broad space of all possible solutions. However, Bayesian methods allow us to determine the ensemble of all possible source models which are consistent with the data and our a priori assumptions about the physics of the earthquake source. Until now, Bayesian techniques have been of limited utility because they are computationally intractable for problems with as many free parameters as kinematic finite fault models. We have developed a methodology called Cascading Adaptive Tempered Metropolis In Parallel (CATMIP) which allows us to sample very high-dimensional problems in a parallel computing framework. The CATMIP algorithm combines elements of simulated annealing and genetic algorithms with the Metropolis algorithm to dynamically optimize the algorithm's efficiency as it runs. We will present synthetic performance tests of finite fault models made with this methodology as well as a kinematic source model for the 2007 Mw 7.7 Tocopilla, Chile earthquake. This earthquake was well recorded by multiple ascending and descending interferograms and a network of high-rate GPS stations whose records can be used as near-field seismograms.

  13. Bayesian Estimation of a Mixture Model

    OpenAIRE

    Ilhem Merah; Assia Chadli

    2015-01-01

    We present the properties of a bathtub curve reliability model having both a sufficient adaptability and a minimal number of parameters, introduced by Idée and Pierrat (2010). This one is a mixture of a Gamma distribution G(2, (1/θ)) and a new distribution L(θ). We are interested in Bayesian estimation of the parameters and survival function of this model with a squared-error loss function and non-informative prior using the approximations of Lindley (1980) and Tierney and Kadane (1986). Usin...

  14. Bayesian mixture models for partially verified data

    DEFF Research Database (Denmark)

    Kostoulas, Polychronis; Browne, William J.; Nielsen, Søren Saxmose;

    2013-01-01

    Bayesian mixture models can be used to discriminate between the distributions of continuous test responses for different infection stages. These models are particularly useful in case of chronic infections with a long latent period, like Mycobacterium avium subsp. paratuberculosis (MAP) infection, where a perfect reference test does not exist. However, their discriminatory ability diminishes with increasing overlap of the distributions and with increasing number of latent infection stages to be discriminated. We provide a method that uses partially verified data, with known infection status for...

  15. A Nonparametric Bayesian Model for Nested Clustering.

    Science.gov (United States)

    Lee, Juhee; Müller, Peter; Zhu, Yitan; Ji, Yuan

    2016-01-01

    We propose a nonparametric Bayesian model for clustering where clusters of experimental units are determined by a shared pattern of clustering another set of experimental units. The proposed model is motivated by the analysis of protein activation data, where we cluster proteins such that all proteins in one cluster give rise to the same clustering of patients. That is, we define clusters of proteins by the way that patients group with respect to the corresponding protein activations. This is in contrast to (almost) all currently available models that use shared parameters in the sampling model to define clusters. This includes in particular model based clustering, Dirichlet process mixtures, product partition models, and more. We show results for two typical biostatistical inference problems that give rise to clustering. PMID:26519174

  16. Bayesian Discovery of Linear Acyclic Causal Models

    CERN Document Server

    Hoyer, Patrik O

    2012-01-01

    Methods for automated discovery of causal relationships from non-interventional data have received much attention recently. A widely used and well understood model family is given by linear acyclic causal models (recursive structural equation models). For Gaussian data both constraint-based methods (Spirtes et al., 1993; Pearl, 2000) (which output a single equivalence class) and Bayesian score-based methods (Geiger and Heckerman, 1994) (which assign relative scores to the equivalence classes) are available. On the contrary, all current methods able to utilize non-Gaussianity in the data (Shimizu et al., 2006; Hoyer et al., 2008) always return only a single graph or a single equivalence class, and so are fundamentally unable to express the degree of certainty attached to that output. In this paper we develop a Bayesian score-based approach able to take advantage of non-Gaussianity when estimating linear acyclic causal models, and we empirically demonstrate that, at least on very modest size networks, its accur...

  17. MEG source localization of spatially extended generators of epileptic activity: comparing entropic and hierarchical bayesian approaches.

    Directory of Open Access Journals (Sweden)

    Rasheda Arman Chowdhury

    Full Text Available Localizing the generators of epileptic activity in the brain using Electro-EncephaloGraphy (EEG) or Magneto-EncephaloGraphy (MEG) signals is of particular interest during the pre-surgical investigation of epilepsy. Epileptic discharges can be detected from background brain activity, provided they are associated with spatially extended generators. Using realistic simulations of epileptic activity, this study evaluates the ability of distributed source localization methods to accurately estimate the location of the generators and their sensitivity to the spatial extent of such generators when using MEG data. Source localization methods based on two types of realistic models have been investigated: (i) brain activity may be modeled using cortical parcels and (ii) brain activity is assumed to be locally smooth within each parcel. A Data Driven Parcellization (DDP) method was used to segment the cortical surface into non-overlapping parcels and diffusion-based spatial priors were used to model local spatial smoothness within parcels. These models were implemented within the Maximum Entropy on the Mean (MEM) and the Hierarchical Bayesian (HB) source localization frameworks. We proposed new methods in this context and compared them with other standard ones using Monte Carlo simulations of realistic MEG data involving sources of several spatial extents and depths. Detection accuracy of each method was quantified using Receiver Operating Characteristic (ROC) analysis and localization error metrics. Our results showed that methods implemented within the MEM framework were sensitive to all spatial extents of the sources ranging from 3 cm(2) to 30 cm(2), whatever the number and size of the parcels defining the model. To reach a similar level of accuracy within the HB framework, a model using parcels larger than the size of the sources should be considered.

  18. Adversarial life testing: A Bayesian negotiation model

    International Nuclear Information System (INIS)

    Life testing is a procedure intended for facilitating the process of making decisions in the context of industrial reliability. On the other hand, negotiation is a process of making joint decisions that has one of its main foundations in decision theory. A Bayesian sequential model of negotiation in the context of adversarial life testing is proposed. This model considers a general setting for which a manufacturer offers a product batch to a consumer. It is assumed that the reliability of the product is measured in terms of its lifetime. Furthermore, both the manufacturer and the consumer have to use their own information with respect to the quality of the product. Under these assumptions, two situations can be analyzed. For both of them, the main aim is to accept or reject the product batch based on the product reliability. This topic is related to a reliability demonstration problem. The procedure is applied to a class of distributions that belong to the exponential family. Thus, a unified framework addressing the main topics in the considered Bayesian model is presented. An illustrative example shows that the proposed technique can be easily applied in practice

  19. Bayesian Fundamentalism or Enlightenment? On the explanatory status and theoretical contributions of Bayesian models of cognition.

    Science.gov (United States)

    Jones, Matt; Love, Bradley C

    2011-08-01

    The prominence of Bayesian modeling of cognition has increased recently largely because of mathematical advances in specifying and deriving predictions from complex probabilistic models. Much of this research aims to demonstrate that cognitive behavior can be explained from rational principles alone, without recourse to psychological or neurological processes and representations. We note commonalities between this rational approach and other movements in psychology - namely, Behaviorism and evolutionary psychology - that set aside mechanistic explanations or make use of optimality assumptions. Through these comparisons, we identify a number of challenges that limit the rational program's potential contribution to psychological theory. Specifically, rational Bayesian models are significantly unconstrained, both because they are uninformed by a wide range of process-level data and because their assumptions about the environment are generally not grounded in empirical measurement. The psychological implications of most Bayesian models are also unclear. Bayesian inference itself is conceptually trivial, but strong assumptions are often embedded in the hypothesis sets and the approximation algorithms used to derive model predictions, without a clear delineation between psychological commitments and implementational details. Comparing multiple Bayesian models of the same task is rare, as is the realization that many Bayesian models recapitulate existing (mechanistic level) theories. Despite the expressive power of current Bayesian models, we argue they must be developed in conjunction with mechanistic considerations to offer substantive explanations of cognition. We lay out several means for such an integration, which take into account the representations on which Bayesian inference operates, as well as the algorithms and heuristics that carry it out. We argue this unification will better facilitate lasting contributions to psychological theory, avoiding the pitfalls

  20. Making the most of spatial information in health: a tutorial in Bayesian disease mapping for areal data.

    Science.gov (United States)

    Kang, Su Yun; Cramb, Susanna M; White, Nicole M; Ball, Stephen J; Mengersen, Kerrie L

    2016-01-01

    Disease maps are effective tools for explaining and predicting patterns of disease outcomes across geographical space, identifying areas of potentially elevated risk, and formulating and validating aetiological hypotheses for a disease. Bayesian models have become a standard approach to disease mapping in recent decades. This article aims to provide a basic understanding of the key concepts involved in Bayesian disease mapping methods for areal data. It is anticipated that this will help in interpretation of published maps, and provide a useful starting point for anyone interested in running disease mapping methods for areal data. The article provides detailed motivation and descriptions on disease mapping methods by explaining the concepts, defining the technical terms, and illustrating the utility of disease mapping for epidemiological research by demonstrating various ways of visualising model outputs using a case study. The target audience includes spatial scientists in health and other fields, policy or decision makers, health geographers, spatial analysts, public health professionals, and epidemiologists. PMID:27245803
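
    The tutorial above covers fully Bayesian areal models (for example CAR/BYM specifications fitted by MCMC or INLA). As a much simpler illustration of the central idea, shrinking unstable small-area rates towards a global level, the sketch below smooths standardised morbidity ratios with a conjugate Poisson-gamma model; the counts, expected values and prior are hypothetical.

```python
import numpy as np

observed = np.array([2, 15, 7, 0, 30])            # observed case counts per area
expected = np.array([4.0, 12.0, 6.5, 1.2, 28.0])  # expected counts from reference rates

raw_smr = observed / expected                     # raw standardised morbidity ratios

# Poisson-gamma model: theta_i ~ Gamma(a, b), y_i ~ Poisson(theta_i * E_i)
# posterior mean (a + y_i) / (b + E_i) shrinks each raw SMR towards the prior mean a / b
a, b = 2.0, 2.0                                   # illustrative prior with mean 1
smoothed = (observed + a) / (expected + b)

for r, s in zip(raw_smr, smoothed):
    print(f"raw SMR {r:5.2f} -> smoothed {s:5.2f}")
```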

  1. Bayesian Estimation of a Mixture Model

    Directory of Open Access Journals (Sweden)

    Ilhem Merah

    2015-05-01

    Full Text Available We present the properties of a bathtub curve reliability model having both a sufficient adaptability and a minimal number of parameters, introduced by Idée and Pierrat (2010). This one is a mixture of a Gamma distribution G(2, (1/θ)) and a new distribution L(θ). We are interested in Bayesian estimation of the parameters and survival function of this model with a squared-error loss function and non-informative prior using the approximations of Lindley (1980) and Tierney and Kadane (1986). Using a statistical sample of 60 failure data relative to a technical device, we illustrate the results derived. Based on a simulation study, comparisons are made between these two methods and the maximum likelihood method of this two-parameter model.

  2. The Bayesian Modelling Of Inflation Rate In Romania

    OpenAIRE

    Mihaela Simionescu

    2014-01-01

    Bayesian econometrics has seen a considerable increase in popularity in recent years, attracting the interest of various groups of researchers in economic sciences and related fields, such as specialists in econometrics, commerce, industry, marketing, finance, micro-economics, macro-economics and other domains. The purpose of this research is to provide an introduction to the Bayesian approach applied in economics, starting with Bayes' theorem. For the Bayesian linear regression models the methodology of estim...

  3. A tutorial introduction to Bayesian models of cognitive development

    OpenAIRE

    Perfors, Amy; Tenenbaum, Joshua B.; Griffiths, Thomas L.; Xu, Fei

    2010-01-01

    We present an introduction to Bayesian inference as it is used in probabilistic models of cognitive development. Our goal is to provide an intuitive and accessible guide to the what, the how, and the why of the Bayesian approach: what sorts of problems and data the framework is most relevant for, and how and why it may be useful for developmentalists. We emphasize a qualitative understanding of Bayesian inference, but also include information about additional resources for those interested in...

  4. Merging Digital Surface Models Implementing Bayesian Approaches

    Science.gov (United States)

    Sadeq, H.; Drummond, J.; Li, Z.

    2016-06-01

    In this research different DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian Approach. The implemented data have been sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades). A Bayesian Approach is deemed preferable when the data obtained from the sensors are limited and it is difficult or very costly to obtain many measurements; the problem of the lack of data can then be mitigated by introducing a priori estimates. To infer the prior data, it is assumed that the roofs of the buildings are smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimations, GNSS RTK measurements have been collected in the field which are used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West End of Glasgow containing different kinds of buildings, such as flat roofed and hipped roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results have shown that the model successfully improved the quality of the DSMs and some characteristics such as the roof surfaces, which consequently led to better representations. In addition, the developed model has been compared with the well established Maximum Likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied on DSMs that were derived from satellite imagery, it can be applied to any other sourced DSMs.
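
    Under simplifying assumptions, the merging step can be illustrated as a precision-weighted combination of two height estimates with independent Gaussian errors; the paper's full model additionally uses entropy-based roof-smoothness priors and GNSS RTK check points, which are omitted here. The grids and standard deviations below are invented.

```python
import numpy as np

def fuse_dsms(z1, s1, z2, s2):
    """Fuse two DSM height grids assuming independent Gaussian errors.
    z1, z2: height arrays; s1, s2: their standard deviations (scalar or per cell).
    Returns the precision-weighted posterior mean and its standard deviation."""
    w1, w2 = 1.0 / s1 ** 2, 1.0 / s2 ** 2
    z = (w1 * z1 + w2 * z2) / (w1 + w2)
    s = np.sqrt(1.0 / (w1 + w2))
    return z, s

z_a = np.array([[101.2, 103.5], [99.8, 100.4]])   # heights from the first DSM (m)
z_b = np.array([[100.6, 104.1], [100.2, 100.0]])  # heights from the second DSM (m)
merged, sigma = fuse_dsms(z_a, 0.8, z_b, 0.5)     # assumed per-DSM accuracies of 0.8 m and 0.5 m
print(merged, sigma)
```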

  5. Spatial Bayesian surveillance for small area case event data.

    Science.gov (United States)

    Rotejanaprasert, Chawarat; Lawson, Andrew; Bolick-Aldrich, Susan; Hurley, Deborah

    2016-08-01

    There has been little development of surveillance procedures for epidemiological data with fine spatial resolution such as case events at residential address locations. This is often due to difficulties of access when confidentiality of medical records is an issue. However, when such data are available, it is important to be able to effect an appropriate analysis strategy. We propose a model for point events in the context of prospective surveillance based on conditional logistic modeling. A weighted conditional autoregressive model is developed for irregular lattices to account for distance effects, and a Dirichlet tessellation is adopted to define the neighborhood structure. Localized clustering diagnostics are compared including the proposed local Kullback-Leibler information criterion. A simulation study is conducted to examine the surveillance and detection methods, and a data example is provided of non-Hodgkin's lymphoma data in South Carolina. PMID:27566768

  6. Bayesian Models of Graphs, Arrays and Other Exchangeable Random Structures.

    Science.gov (United States)

    Orbanz, Peter; Roy, Daniel M

    2015-02-01

    The natural habitat of most Bayesian methods is data represented by exchangeable sequences of observations, for which de Finetti's theorem provides the theoretical foundation. Dirichlet process clustering, Gaussian process regression, and many other parametric and nonparametric Bayesian models fall within the remit of this framework; many problems arising in modern data analysis do not. This article provides an introduction to Bayesian models of graphs, matrices, and other data that can be modeled by random structures. We describe results in probability theory that generalize de Finetti's theorem to such data and discuss their relevance to nonparametric Bayesian modeling. With the basic ideas in place, we survey example models available in the literature; applications of such models include collaborative filtering, link prediction, and graph and network analysis. We also highlight connections to recent developments in graph theory and probability, and sketch the more general mathematical foundation of Bayesian methods for other types of data beyond sequences and arrays. PMID:26353253

  7. Road network safety evaluation using Bayesian hierarchical joint model.

    Science.gov (United States)

    Wang, Jie; Huang, Helai

    2016-05-01

    Safety and efficiency are commonly regarded as two significant performance indicators of transportation systems. In practice, road network planning has focused on road capacity and transport efficiency whereas the safety level of a road network has received little attention in the planning stage. This study develops a Bayesian hierarchical joint model for road network safety evaluation to help planners take traffic safety into account when planning a road network. The proposed model establishes relationships between road network risk and micro-level variables related to road entities and traffic volume, as well as socioeconomic, trip generation and network density variables at macro level which are generally used for long term transportation plans. In addition, network spatial correlation between intersections and their connected road segments is also considered in the model. A road network is elaborately selected in order to compare the proposed hierarchical joint model with a previous joint model and a negative binomial model. According to the results of the model comparison, the hierarchical joint model outperforms the joint model and negative binomial model in terms of the goodness-of-fit and predictive performance, which indicates the reasonableness of considering the hierarchical data structure in crash prediction and analysis. Moreover, both random effects at the TAZ level and the spatial correlation between intersections and their adjacent segments are found to be significant, supporting the employment of the hierarchical joint model as an alternative in road-network-level safety modeling as well. PMID:26945109

  8. Modeling Social Annotation: a Bayesian Approach

    CERN Document Server

    Plangprasopchok, Anon

    2008-01-01

    Collaborative tagging systems, such as del.icio.us, CiteULike, and others, allow users to annotate objects, e.g., Web pages or scientific papers, with descriptive labels called tags. The social annotations, contributed by thousands of users, can potentially be used to infer categorical knowledge, classify documents or recommend new relevant information. Traditional text inference methods do not make best use of socially-generated data, since they do not take into account variations in individual users' perspectives and vocabulary. In a previous work, we introduced a simple probabilistic model that takes interests of individual annotators into account in order to find hidden topics of annotated objects. Unfortunately, our proposed approach had a number of shortcomings, including overfitting, local maxima and the requirement to specify values for some parameters. In this paper we address these shortcomings in two ways. First, we extend the model to a fully Bayesian framework. Second, we describe an infinite ver...

  9. Improving randomness characterization through Bayesian model selection

    CERN Document Server

    R., Rafael Díaz-H; Martínez, Alí M Angulo; U'Ren, Alfred B; Hirsch, Jorge G; Marsili, Matteo; Castillo, Isaac Pérez

    2016-01-01

    Nowadays random number generation plays an essential role in technology with important applications in areas ranging from cryptography, which lies at the core of current communication protocols, to Monte Carlo methods, and other probabilistic algorithms. In this context, a crucial scientific endeavour is to develop effective methods that allow the characterization of random number generators. However, commonly employed methods either lack formality (e.g. the NIST test suite), or are inapplicable in principle (e.g. the characterization derived from the Algorithmic Theory of Information (ATI)). In this letter we present a novel method based on Bayesian model selection, which is both rigorous and effective, for characterizing randomness in a bit sequence. We derive analytic expressions for a model's likelihood which is then used to compute its posterior probability distribution. Our method proves to be more rigorous than NIST's suite and the Borel-Normality criterion and its implementation is straightforward. We...

  10. Modeling Land-Use Decision Behavior with Bayesian Belief Networks

    Directory of Open Access Journals (Sweden)

    Inge Aalders

    2008-06-01

    Full Text Available The ability to incorporate and manage the different drivers of land-use change in a modeling process is one of the key challenges because they are complex and are both quantitative and qualitative in nature. This paper uses Bayesian belief networks (BBN) to incorporate characteristics of land managers in the modeling process and to enhance our understanding of land-use change based on the limited and disparate sources of information. One of the two models based on spatial data represented land managers in the form of a quantitative variable, the area of individual holdings, whereas the other model included qualitative data from a survey of land managers. Random samples from the spatial data provided evidence of the relationship between the different variables, which I used to develop the BBN structure. The model was tested for four different posterior probability distributions, and results showed that the trained and learned models are better at predicting land use than the uniform and random models. The inference from the model demonstrated the constraints that biophysical characteristics impose on land managers; for older land managers without heirs, there is a higher probability of the land use being arable agriculture. The results show the benefits of incorporating a more complex notion of land managers in land-use models, and of using different empirical data sources in the modeling process. Future research should focus on incorporating more complex social processes into the modeling structure, as well as incorporating spatio-temporal dynamics in a BBN.

  11. Statistical modelling of railway track geometry degradation using Hierarchical Bayesian models

    International Nuclear Information System (INIS)

    Railway maintenance planners require a predictive model that can assess the railway track geometry degradation. The present paper uses a Hierarchical Bayesian model as a tool to model the main two quality indicators related to railway track geometry degradation: the standard deviation of longitudinal level defects and the standard deviation of horizontal alignment defects. Hierarchical Bayesian Models (HBM) are flexible statistical models that allow specifying different spatially correlated components between consecutive track sections, namely for the deterioration rates and the initial qualities parameters. HBM are developed for both quality indicators, conducting an extensive comparison between candidate models and a sensitivity analysis on prior distributions. HBM is applied to provide an overall assessment of the degradation of railway track geometry, for the main Portuguese railway line Lisbon–Oporto. - Highlights: • Rail track geometry degradation is analysed using Hierarchical Bayesian models. • A Gibbs sampling strategy is put forward to estimate the HBM. • Model comparison and sensitivity analysis find the most suitable model. • We applied the most suitable model to all the segments of the main Portuguese line. • Tackling spatial correlations using CAR structures lead to a better model fit

  12. A SEMIPARAMETRIC BAYESIAN MODEL FOR CIRCULAR-LINEAR REGRESSION

    Science.gov (United States)

    We present a Bayesian approach to regress a circular variable on a linear predictor. The regression coefficients are assumed to have a nonparametric distribution with a Dirichlet process prior. The semiparametric Bayesian approach gives added flexibility to the model and is usefu...

  13. Mapping, Bayesian Geostatistical Analysis and Spatial Prediction of Lymphatic Filariasis Prevalence in Africa

    Science.gov (United States)

    Slater, Hannah; Michael, Edwin

    2013-01-01

    There is increasing interest in controlling or eradicating the major neglected tropical diseases. Accurate modelling of the geographic distributions of parasitic infections will be crucial to this endeavour. We used 664 community level infection prevalence data collated from the published literature in conjunction with eight environmental variables, altitude and population density, and a multivariate Bayesian generalized linear spatial model that allows explicit accounting for spatial autocorrelation and incorporation of uncertainty in input data and model parameters, to construct the first spatially-explicit map describing LF prevalence distribution in Africa. We also ran the best-fit model against predictions made by the HADCM3 and CCCMA climate models for 2050 to predict the likely distributions of LF under future climate and population changes. We show that LF prevalence is strongly influenced by spatial autocorrelation between locations but is only weakly associated with environmental covariates. Infection prevalence, however, is found to be related to variations in population density. All associations with key environmental/demographic variables appear to be complex and non-linear. LF prevalence is predicted to be highly heterogeneous across Africa, with high prevalences (>20%) estimated to occur primarily along coastal West and East Africa, and lowest prevalences predicted for the central part of the continent. Error maps, however, indicate a need for further surveys to overcome problems with data scarcity in the latter and other regions. Analysis of future changes in prevalence indicates that population growth rather than climate change per se will represent the dominant factor in the predicted increase/decrease and spread of LF on the continent. We indicate that these results could play an important role in aiding the development of strategies that are best able to achieve the goals of parasite elimination locally and globally in a manner that may also account

  14. A new approach for Bayesian model averaging

    Institute of Scientific and Technical Information of China (English)

    TIAN XiangJun; XIE ZhengHui; WANG AiHui; YANG XiaoChun

    2012-01-01

    Bayesian model averaging (BMA) is a recently proposed statistical method for calibrating forecast ensembles from numerical weather models. However, successful implementation of BMA requires accurate estimates of the weights and variances of the individual competing models in the ensemble. Two methods, namely the Expectation-Maximization (EM) and the Markov Chain Monte Carlo (MCMC) algorithms, are widely used for BMA model training. Both methods have their own respective strengths and weaknesses. In this paper, we first modify the BMA log-likelihood function with the aim of removing the additional limitation that requires that the BMA weights add to one, and then use a limited memory quasi-Newtonian algorithm for solving the nonlinear optimization problem, thereby formulating a new approach for BMA (referred to as BMA-BFGS). Several groups of multi-model soil moisture simulation experiments from three land surface models show that the performance of BMA-BFGS is similar to the MCMC method in terms of simulation accuracy, and that both are superior to the EM algorithm. On the other hand, the computational cost of the BMA-BFGS algorithm is substantially less than for MCMC and is almost equivalent to that for EM.
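
    The sketch below illustrates the general idea of training BMA weights and a common variance with a limited-memory quasi-Newton optimizer. Note that it keeps the weights on the simplex through a softmax reparameterization rather than dropping the sum-to-one restriction as BMA-BFGS does, so it is an illustrative approximation rather than the paper's exact formulation; the data are synthetic.

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, forecasts, obs):
    """Negative BMA log-likelihood with Gaussian member densities.
    forecasts: (n_obs, n_models); params: n_models unconstrained weight
    parameters followed by the log of a common variance."""
    k = forecasts.shape[1]
    w = np.exp(params[:k]); w /= w.sum()   # softmax keeps weights positive and summing to one
    var = np.exp(params[k])
    dens = np.exp(-(obs[:, None] - forecasts) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    return -np.sum(np.log(dens @ w + 1e-300))

rng = np.random.default_rng(1)
obs = rng.normal(size=100)                                                   # synthetic observations
forecasts = obs[:, None] + rng.normal(scale=[0.5, 1.0, 2.0], size=(100, 3))  # three ensemble members
res = minimize(neg_loglik, np.zeros(4), args=(forecasts, obs), method="L-BFGS-B")
w = np.exp(res.x[:3]); w /= w.sum()
print("weights:", w, "variance:", np.exp(res.x[3]))
```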

  15. Assessing global vegetation activity using spatio-temporal Bayesian modelling

    Science.gov (United States)

    Mulder, Vera L.; van Eck, Christel M.; Friedlingstein, Pierre; Regnier, Pierre A. G.

    2016-04-01

    This work demonstrates the potential of modelling vegetation activity using a hierarchical Bayesian spatio-temporal model. This approach allows modelling changes in vegetation and climate simultaneously in space and time. Changes of vegetation activity such as phenology are modelled as a dynamic process depending on climate variability in both space and time. Additionally, differences in observed vegetation status can be attributed to other abiotic ecosystem properties, e.g. soil and terrain properties. Although these properties do not change in time, they do change in space and may provide valuable information in addition to the climate dynamics. The spatio-temporal Bayesian models were calibrated at a regional scale because the local trends in space and time can be better captured by the model. The regional subsets were defined according to the SREX segmentation, as defined by the IPCC. Each region is considered to be relatively homogeneous in terms of large-scale climate and biomes, still capturing small-scale (grid-cell level) variability. Modelling within these regions is hence expected to be less uncertain due to the absence of these large-scale patterns, compared to a global approach. This overall modelling approach allows the comparison of model behavior for the different regions and may provide insights on the main dynamic processes driving the interaction between vegetation and climate within different regions. The data employed in this study encompass the global datasets for soil properties (SoilGrids), terrain properties (Global Relief Model based on SRTM DEM and ETOPO), monthly time series of satellite-derived vegetation indices (GIMMS NDVI3g) and climate variables (Princeton Meteorological Forcing Dataset). The findings demonstrate the potential of a spatio-temporal Bayesian modelling approach for assessing vegetation dynamics at a regional scale. The observed interrelationships of the employed data and the different spatial and temporal trends support

  16. Bayesian Model Selection for LISA Pathfinder

    CERN Document Server

    Karnesis, Nikolaos; Sopuerta, Carlos F; Gibert, Ferran; Armano, Michele; Audley, Heather; Congedo, Giuseppe; Diepholz, Ingo; Ferraioli, Luigi; Hewitson, Martin; Hueller, Mauro; Korsakova, Natalia; Plagnol, Eric; Vitale, Stefano

    2013-01-01

    The main goal of the LISA Pathfinder (LPF) mission is to fully characterize the acceleration noise models and to test key technologies for future space-based gravitational-wave observatories similar to the LISA/eLISA concept. The Data Analysis (DA) team has developed complex three-dimensional models of the LISA Technology Package (LTP) experiment on-board LPF. These models are used for simulations, but more importantly, they will be used for parameter estimation purposes during flight operations. One of the tasks of the DA team is to identify the physical effects that contribute significantly to the properties of the instrument noise. A way of approaching to this problem is to recover the essential parameters of the LTP which describe the data. Thus, we want to define the simplest model that efficiently explains the observations. To do so, adopting a Bayesian framework, one has to estimate the so-called Bayes Factor between two competing models. In our analysis, we use three main different methods to estimate...

  17. Bayesian Model Averaging in the Instrumental Variable Regression Model

    OpenAIRE

    Gary Koop; Robert Leon Gonzalez; Rodney Strachan

    2011-01-01

    This paper considers the instrumental variable regression model when there is uncertainty about the set of instruments, exogeneity restrictions, the validity of identifying restrictions and the set of exogenous regressors. This uncertainty can result in a huge number of models. To avoid statistical problems associated with standard model selection procedures, we develop a reversible jump Markov chain Monte Carlo algorithm that allows us to do Bayesian model averaging. The algorithm is very fl...

  18. EVENT MODEL: A ROBUST BAYESIAN TOOL FOR CHRONOLOGICAL MODELING

    OpenAIRE

    Lanos, Philippe; Philippe, Anne

    2015-01-01

    We propose a new modeling approach for combining dates through the Event model by using hierarchical Bayesian statistics. The Event model aims to estimate the date of a context (unit of stratification) from individual dates assumed to be contemporaneous and which are affected by errors of different types: laboratory and calibration curve errors and also irreducible errors related to contaminations, taphonomic disturbances, etc, hence the possible presence of outliers. The Event model has a hi...

  19. Stochastic model updating utilizing Bayesian approach and Gaussian process model

    Science.gov (United States)

    Wan, Hua-Ping; Ren, Wei-Xin

    2016-03-01

    Stochastic model updating (SMU) has been increasingly applied in quantifying structural parameter uncertainty from response variability. SMU for parameter uncertainty quantification refers to the problem of inverse uncertainty quantification (IUQ), which is a nontrivial task. Inverse problems solved with optimization usually bring about issues of gradient computation, ill-conditioning, and non-uniqueness. Moreover, the uncertainty present in the response makes the inverse problem more complicated. In this study, a Bayesian approach is adopted in SMU for parameter uncertainty quantification. The prominent strength of the Bayesian approach for the IUQ problem is that it solves the problem in a straightforward manner, which enables it to avoid the previous issues. However, when applied to engineering structures that are modeled with a high-resolution finite element model (FEM), the Bayesian approach is still computationally expensive since the commonly used Markov chain Monte Carlo (MCMC) method for Bayesian inference requires a large number of model runs to guarantee convergence. Herein we reduce computational cost in two aspects. On the one hand, the fast-running Gaussian process model (GPM) is utilized to approximate the time-consuming high-resolution FEM. On the other hand, the advanced MCMC method using the delayed rejection adaptive Metropolis (DRAM) algorithm that incorporates a local adaptive strategy with a global adaptive strategy is employed for Bayesian inference. In addition, we propose the use of the powerful variance-based global sensitivity analysis (GSA) in parameter selection to exclude non-influential parameters from calibration parameters, which yields a reduced-order model and thus further alleviates the computational burden. A simulated aluminum plate and a real-world complex cable-stayed pedestrian bridge are presented to illustrate the proposed framework and verify its feasibility.
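
    A minimal sketch of the surrogate idea described above: a Gaussian process emulator is trained on a few runs of a (here deliberately cheap, hypothetical) model and then used inside an MCMC loop. Plain random-walk Metropolis replaces the DRAM algorithm, and the one-parameter toy model stands in for a high-resolution FEM.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def slow_model(theta):
    """Stand-in for an expensive finite element response (hypothetical)."""
    return np.sin(3 * theta) + 0.5 * theta ** 2

# train a fast Gaussian process surrogate on a small design of model runs
design = np.linspace(-2.0, 2.0, 15)[:, None]
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5)).fit(design, slow_model(design[:, 0]))

y_obs = 1.2                                   # 'measured' response
def log_post(t):                              # Gaussian likelihood (sd 0.1) + N(0, 1) prior
    pred = gp.predict(np.array([[t]]))[0]
    return -0.5 * ((y_obs - pred) / 0.1) ** 2 - 0.5 * t ** 2

rng, theta, chain = np.random.default_rng(0), 0.0, []
lp = log_post(theta)
for _ in range(2000):                         # random-walk Metropolis on the surrogate posterior
    prop = theta + 0.3 * rng.normal()
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)
print(np.mean(chain[500:]))                   # posterior mean of the updated parameter
```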

  1. Bayesian estimation of parameters in a regional hydrological model

    OpenAIRE

    Engeland, K.; Gottschalk, L.

    2002-01-01

    This study evaluates the applicability of the distributed, process-oriented Ecomag model for prediction of daily streamflow in ungauged basins. The Ecomag model is applied as a regional model to nine catchments in the NOPEX area, using Bayesian statistics to estimate the posterior distribution of the model parameters conditioned on the observed streamflow. The distribution is calculated by Markov Chain Monte Carlo (MCMC) analysis. The Bayesian method requires formulation of ...

  2. Bayesian Analysis of Dynamic Multivariate Models with Multiple Structural Breaks

    OpenAIRE

    Sugita, Katsuhiro

    2006-01-01

    This paper considers a vector autoregressive model or a vector error correction model with multiple structural breaks in any subset of parameters, using a Bayesian approach with Markov chain Monte Carlo simulation technique. The number of structural breaks is determined as a sort of model selection by the posterior odds. For a cointegrated model, cointegrating rank is also allowed to change with breaks. Bayesian approach by Strachan (Journal of Business and Economic Statistics 21 (2003) 185) ...

  3. Bayesian Test of Significance for Conditional Independence: The Multinomial Model

    Science.gov (United States)

    de Morais Andrade, Pablo; Stern, Julio; de Bragança Pereira, Carlos

    2014-03-01

    Conditional independence tests (CI tests) have received special attention lately in Machine Learning and Computational Intelligence related literature as an important indicator of the relationship among the variables used by their models. In the field of Probabilistic Graphical Models (PGM)--which includes Bayesian Networks (BN) models--CI tests are especially important for the task of learning the PGM structure from data. In this paper, we propose the Full Bayesian Significance Test (FBST) for tests of conditional independence for discrete datasets. FBST is a powerful Bayesian test for precise hypothesis, as an alternative to frequentist's significance tests (characterized by the calculation of the p-value).

  4. Bayesian Nonparametrics in Topic Modeling: A Brief Tutorial

    OpenAIRE

    Spangher, Alexander

    2015-01-01

    Using nonparametric methods has been increasingly explored in Bayesian hierarchical modeling as a way to increase model flexibility. Although the field shows a lot of promise, inference in many models, including Hierarchical Dirichlet Processes (HDP), remains prohibitively slow. One promising path forward is to exploit the submodularity inherent in the Indian Buffet Process (IBP) to derive near-optimal solutions in polynomial time. In this work, I will present a brief tutorial on Bayesian nonparame...

  5. Two-Stage Bayesian Model Averaging in Endogenous Variable Models.

    Science.gov (United States)

    Lenkoski, Alex; Eicher, Theo S; Raftery, Adrian E

    2014-01-01

    Economic modeling in the presence of endogeneity is subject to model uncertainty at both the instrument and covariate level. We propose a Two-Stage Bayesian Model Averaging (2SBMA) methodology that extends the Two-Stage Least Squares (2SLS) estimator. By constructing a Two-Stage Unit Information Prior in the endogenous variable model, we are able to efficiently combine established methods for addressing model uncertainty in regression models with the classic technique of 2SLS. To assess the validity of instruments in the 2SBMA context, we develop Bayesian tests of the identification restriction that are based on model averaged posterior predictive p-values. A simulation study showed that 2SBMA has the ability to recover structure in both the instrument and covariate set, and substantially improves the sharpness of resulting coefficient estimates in comparison to 2SLS using the full specification in an automatic fashion. Due to the increased parsimony of the 2SBMA estimate, the Bayesian Sargan test had a power of 50 percent in detecting a violation of the exogeneity assumption, while the method based on 2SLS using the full specification had negligible power. We apply our approach to the problem of development accounting, and find support not only for institutions, but also for geography and integration as development determinants, once both model uncertainty and endogeneity have been jointly addressed. PMID:24223471

  6. Bayesian Modeling of ChIP-chip Data Through a High-Order Ising Model

    KAUST Repository

    Mo, Qianxing

    2010-01-29

    ChIP-chip experiments are procedures that combine chromatin immunoprecipitation (ChIP) and DNA microarray (chip) technology to study a variety of biological problems, including protein-DNA interaction, histone modification, and DNA methylation. The most important feature of ChIP-chip data is that the intensity measurements of probes are spatially correlated because the DNA fragments are hybridized to neighboring probes in the experiments. We propose a simple, but powerful Bayesian hierarchical approach to ChIP-chip data through an Ising model with high-order interactions. The proposed method naturally takes into account the intrinsic spatial structure of the data and can be used to analyze data from multiple platforms with different genomic resolutions. The model parameters are estimated using the Gibbs sampler. The proposed method is illustrated using two publicly available data sets from Affymetrix and Agilent platforms, and compared with three alternative Bayesian methods, namely, Bayesian hierarchical model, hierarchical gamma mixture model, and Tilemap hidden Markov model. The numerical results indicate that the proposed method performs as well as the other three methods for the data from Affymetrix tiling arrays, but significantly outperforms the other three methods for the data from Agilent promoter arrays. In addition, we find that the proposed method has better operating characteristics in terms of sensitivities and false discovery rates under various scenarios. © 2010, The International Biometric Society.
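
    A simplified sketch of the latent-state update underlying such models: binary enrichment indicators along a 1-D probe sequence receive a first-order (nearest-neighbour) Ising prior and Gaussian emissions, and are updated by Gibbs sampling. The paper's model uses high-order interactions and a fuller hierarchical specification; the coupling, means and data below are hypothetical.

```python
import numpy as np

def gibbs_ising_1d(y, n_iter=500, beta=1.0, mu=(0.0, 2.0), sigma=1.0, seed=0):
    """Gibbs sampler for latent 0/1 enrichment states of probes with intensities y.
    Prior: first-order Ising with coupling beta; likelihood: y_i ~ N(mu[x_i], sigma^2)."""
    rng = np.random.default_rng(seed)
    n = len(y)
    x = (y > np.median(y)).astype(int)          # rough initial states
    post = np.zeros(n)
    for _ in range(n_iter):
        for i in range(n):
            nb = ([x[i - 1]] if i > 0 else []) + ([x[i + 1]] if i < n - 1 else [])
            logp = np.array([beta * sum(int(s == j) for s in nb)
                             - (y[i] - mu[j]) ** 2 / (2 * sigma ** 2) for j in (0, 1)])
            p1 = 1.0 / (1.0 + np.exp(logp[0] - logp[1]))
            x[i] = rng.random() < p1            # sample the state given neighbours and data
        post += x
    return post / n_iter                        # posterior probability of enrichment per probe

rng = np.random.default_rng(7)
y = np.concatenate([rng.normal(0, 1, 40), rng.normal(2, 1, 20), rng.normal(0, 1, 40)])
print(np.round(gibbs_ising_1d(y), 2))
```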

  7. Bayesian model reduction and empirical Bayes for group (DCM) studies.

    Science.gov (United States)

    Friston, Karl J; Litvak, Vladimir; Oswal, Ashwini; Razi, Adeel; Stephan, Klaas E; van Wijk, Bernadette C M; Ziegler, Gabriel; Zeidman, Peter

    2016-03-01

    This technical note describes some Bayesian procedures for the analysis of group studies that use nonlinear models at the first (within-subject) level - e.g., dynamic causal models - and linear models at subsequent (between-subject) levels. Its focus is on using Bayesian model reduction to finesse the inversion of multiple models of a single dataset or a single (hierarchical or empirical Bayes) model of multiple datasets. These applications of Bayesian model reduction allow one to consider parametric random effects and make inferences about group effects very efficiently (in a few seconds). We provide the relatively straightforward theoretical background to these procedures and illustrate their application using a worked example. This example uses a simulated mismatch negativity study of schizophrenia. We illustrate the robustness of Bayesian model reduction to violations of the (commonly used) Laplace assumption in dynamic causal modelling and show how its recursive application can facilitate both classical and Bayesian inference about group differences. Finally, we consider the application of these empirical Bayesian procedures to classification and prediction. PMID:26569570

  8. A space-time multivariate Bayesian model to analyse road traffic accidents by severity

    OpenAIRE

    Boulieri, A; Liverani, S; Hoogh, K. de; Blangiardo, M.

    2016-01-01

    The paper investigates the dependences between levels of severity of road traffic accidents, accounting at the same time for spatial and temporal correlations. The study analyses road traffic accidents data at ward level in England over the period 2005–2013. We include in our model multivariate spatially structured and unstructured effects to capture the dependences between severities, within a Bayesian hierarchical formulation. We also include a temporal component to capture the time effects...

  9. Sampling Techniques in Bayesian Finite Element Model Updating

    CERN Document Server

    Boulkaibet, I; Mthembu, L; Friswell, M I; Adhikari, S

    2011-01-01

    Recent papers in the field of Finite Element Model (FEM) updating have highlighted the benefits of Bayesian techniques. The Bayesian approaches are designed to deal with the uncertainties associated with complex systems, which is the main problem in the development and updating of FEMs. This paper highlights the complexities and challenges of implementing any Bayesian method when the analysis involves a complicated structural dynamic model. In such systems the Bayesian formulation might not be available in analytic form; this leads to the use of numerical methods, i.e. sampling methods. The main challenge then is to determine an efficient sampling of the model parameter space. In this paper, three sampling techniques, the Metropolis-Hastings (MH) algorithm, Slice Sampling and the Hybrid Monte Carlo (HMC) technique, are tested by updating a structural beam model. The efficiency and limitations of each technique are investigated when the FEM updating problem is implemented using the Bayesi...
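
    Of the three samplers compared, random-walk Metropolis-Hastings is the simplest; a minimal generic sketch (the log-posterior and step size are user-supplied assumptions, not the paper's beam model) is:

        import numpy as np

        def random_walk_mh(log_post, theta0, n_iter=5000, step=0.1, seed=0):
            """Random-walk Metropolis-Hastings with an isotropic Gaussian proposal."""
            rng = np.random.default_rng(seed)
            theta = np.atleast_1d(np.asarray(theta0, dtype=float))
            lp = log_post(theta)
            samples = np.empty((n_iter, theta.size))
            for t in range(n_iter):
                proposal = theta + step * rng.standard_normal(theta.size)
                lp_prop = log_post(proposal)
                # Symmetric proposal, so the acceptance ratio is just the posterior ratio.
                if np.log(rng.random()) < lp_prop - lp:
                    theta, lp = proposal, lp_prop
                samples[t] = theta
            return samples

        # Example target: a standard normal log-density in two dimensions.
        draws = random_walk_mh(lambda th: -0.5 * np.sum(th ** 2), theta0=[2.0, -2.0], step=0.5)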

  10. Efficient Nonparametric Bayesian Modelling with Sparse Gaussian Process Approximations

    OpenAIRE

    Seeger, Matthias; Lawrence, Neil; Herbrich, Ralf

    2006-01-01

    Sparse approximations to Bayesian inference for nonparametric Gaussian Process models scale linearly in the number of training points, allowing for the application of powerful kernel-based models to large datasets. We present a general framework based on the informative vector machine (IVM) (Lawrence et.al., 2002) and show how the complete Bayesian task of inference and learning of free hyperparameters can be performed in a practically efficient manner. Our framework allows for arbitrary like...

  11. Modelling biogeochemical cycles in forest ecosystems: a Bayesian approach

    OpenAIRE

    Bagnara, Maurizio

    2015-01-01

    Forest models are tools for explaining and predicting the dynamics of forest ecosystems. They simulate forest behavior by integrating information on the underlying processes in trees, soil and atmosphere. Bayesian calibration is the application of probability theory to parameter estimation. It is a method, applicable to all models, that quantifies output uncertainty and identifies key parameters and variables. This study aims at testing the Bayesian procedure for calibration to different t...

  12. Bayesian inverse modeling at the hydrological surface-subsurface interface

    Science.gov (United States)

    Cucchi, K.; Rubin, Y.

    2014-12-01

    In systems where surface and subsurface hydrological domains are highly connected, modeling surface and subsurface flow jointly is essential to accurately represent the physical processes and come up with reliable predictions of flows in river systems or stream-aquifer exchange. The flow quantification at the interface merging the two hydrosystem components is a function of both surface and subsurface spatially distributed parameters. In the present study, we apply inverse modeling techniques to a synthetic catchment with connected surface and subsurface hydrosystems. The model is physically-based and implemented with the Gridded Surface Subsurface Hydrologic Analysis software. On the basis of hydrograph measurement at the catchment outlet, we estimate parameters such as saturated hydraulic conductivity, overland and channel roughness coefficients. We compare maximum likelihood estimates (ML) with the parameter distributions obtained using the Bayesian statistical framework for spatially random fields provided by the Method of Anchored Distributions (MAD). While ML estimates maximize the probability of observing the data and capture the global trend of the target variables, MAD focuses on obtaining a probability distribution for the random unknown parameters and the anchors are designed to capture local features. We check the consistency between the two approaches and evaluate the additional information provided by MAD on parameter distributions. We also assess the contribution of adding new types of measurements such as water table depth or soil conductivity to the reduction of parameter uncertainty.

  13. A unified Bayesian hierarchical model for MRI tissue classification.

    Science.gov (United States)

    Feng, Dai; Liang, Dong; Tierney, Luke

    2014-04-15

    Various works have used magnetic resonance imaging (MRI) tissue classification extensively to study a number of neurological and psychiatric disorders. Various noise characteristics and other artifacts make this classification a challenging task. Instead of splitting the procedure into different steps, we extend a previous work to develop a unified Bayesian hierarchical model, which addresses both the partial volume effect and intensity non-uniformity, the two major acquisition artifacts, simultaneously. We adopted a normal mixture model with the means and variances depending on the tissue types of voxels to model the observed intensity values. We modeled the relationship among the components of the index vector of tissue types by a hidden Markov model, which captures the spatial similarity of voxels. Furthermore, we addressed the partial volume effect by construction of a higher resolution image in which each voxel is divided into subvoxels. Finally, we achieved the bias field correction by using a Gaussian Markov random field model with a band precision matrix designed in light of image filtering. Sparse matrix methods and parallel computations based on conditional independence are exploited to improve the speed of the Markov chain Monte Carlo simulation. The unified model provides more accurate tissue classification results for both simulated and real data sets. PMID:24738112

  14. Bayesball: A Bayesian hierarchical model for evaluating fielding in major league baseball

    OpenAIRE

    Jensen, Shane T.; Shirley, Kenneth E.; Wyner, Abraham J.

    2008-01-01

    The use of statistical modeling in baseball has received substantial attention recently in both the media and academic community. We focus on a relatively under-explored topic: the use of statistical models for the analysis of fielding based on high-resolution data consisting of on-field location of batted balls. We combine spatial modeling with a hierarchical Bayesian structure in order to evaluate the performance of individual fielders while sharing information between fielders at each posi...

  15. Bayesian analysis of spatial point processes in the neighbourhood of Voronoi networks

    DEFF Research Database (Denmark)

    Skare, Øivind; Møller, Jesper; Vedel Jensen, Eva B.

    A model for an inhomogeneous Poisson process with high intensity near the edges of a Voronoi tessellation in 2D or 3D is proposed. The model is analysed in a Bayesian setting with priors on nuclei of the Voronoi tessellation and other model parameters. An MCMC algorithm is constructed to sample f...

  16. Bayesian analysis of spatial point processes in the neighbourhood of Voronoi networks

    DEFF Research Database (Denmark)

    Skare, Øivind; Møller, Jesper; Jensen, Eva B. Vedel

    2007-01-01

    A model for an inhomogeneous Poisson process with high intensity near the edges of a Voronoi tessellation in 2D or 3D is proposed. The model is analysed in a Bayesian setting with priors on nuclei of the Voronoi tessellation and other model parameters. An MCMC algorithm is constructed to sample f...

  17. Bayesian Inference and Optimal Design in the Sparse Linear Model

    OpenAIRE

    Seeger, Matthias; Steinke, Florian; Tsuda, Koji

    2007-01-01

    The sparse linear model has seen many successful applications in Statistics, Machine Learning, and Computational Biology, such as identification of gene regulatory networks from micro-array expression data. Prior work has either approximated Bayesian inference by expensive Markov chain Monte Carlo, or replaced it by point estimation. We show how to obtain a good approximation to Bayesian analysis efficiently, using the Expectation Propagation method. We also address the problems of optimal de...

  18. Improved calibration of organic SST proxies via community-driven Bayesian, spatially-varying regression

    Science.gov (United States)

    Tierney, J. E.; Tingley, M.

    2013-12-01

    Improving the calibration of SST proxies is fundamental to providing accurate estimates of past changes in sea-surface temperatures. Existing calibrations assume spatially and temporally constant regression terms, but this may not adequately capture the influence of both known and unknown secondary environmental factors on proxy response. As an alternative, we propose a BAYesian, SPAtially-varying Regression model (BAYSPAR) for general application to marine organic geochemical SST proxies. The calibration model treats regression parameters as slowly-varying functions in space and allows for a full propagation of errors in both the proxy and the SST field. Initial application of the technique to the TEX86 proxy demonstrates that it yields better-behaved residuals than previous calibrations and therefore improves SST estimates in certain regions. Two different prediction models allow users to apply the calibration to either Neogene or "deep-time" data, the latter of which uses an analog approach. Traditionally, calibrations for SST proxies are updated incrementally via individual publications over a period of many years, and in some cases the coretop collections that form these calibrations are left unarchived. To facilitate both up-to-date prediction and data archiving, BAYSPAR will be designed to reflect community-based improvements in knowledge and data in real time via a semi-autonomous updating process. Users may enter new coretop data into a portal on the web, and after a screening procedure, the data will be added to the calibration model, which will then be autonomously updated and made available to the users. In this way, calibration of SST proxies becomes a community-driven process.
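
    BAYSPAR lets the regression terms vary slowly in space; the local building block at any single location is an ordinary Bayesian linear regression. A minimal sketch of that local update (known observation variance, Gaussian prior on the coefficients; names and priors are illustrative, not the BAYSPAR implementation) is:

        import numpy as np

        def local_bayes_regression(X, y, sigma2, m0, V0):
            """Posterior mean and covariance of the regression coefficients given a
            Gaussian prior N(m0, V0) and known observation variance sigma2."""
            V0_inv = np.linalg.inv(V0)
            Vn = np.linalg.inv(X.T @ X / sigma2 + V0_inv)   # posterior covariance
            mn = Vn @ (X.T @ y / sigma2 + V0_inv @ m0)      # posterior mean
            return mn, Vn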

  19. Modelling of JET diagnostics using Bayesian Graphical Models

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, J. [IPP Greifswald, Greifswald (Germany); Ford, O. [Imperial College, London (United Kingdom); McDonald, D.; Hole, M.; Nessi, G. von; Meakins, A.; Brix, M.; Thomsen, H.; Werner, A.; Sirinelli, A.

    2011-07-01

    The mapping between physics parameters (such as densities, currents, flows, temperatures etc) defining the plasma 'state' under a given model and the raw observations of each plasma diagnostic will 1) depend on the particular physics model used, and 2) be inherently probabilistic, owing to uncertainties in both the observations and instrumental aspects of the mapping, such as calibrations, instrument functions etc. A flexible and principled way of modelling such interconnected probabilistic systems is through so-called Bayesian graphical models. Being an amalgam between graph theory and probability theory, Bayesian graphical models can simulate the complex interconnections between physics models and diagnostic observations from multiple heterogeneous diagnostic systems, making it relatively easy to optimally combine the observations from multiple diagnostics for joint inference on parameters of the underlying physics model, which in itself can be represented as part of the graph. At JET about 10 diagnostic systems have to date been modelled in this way, which has led to a number of new results, including: the reconstruction of the flux surface topology and q-profiles without any specific equilibrium assumption, using information from a number of different diagnostic systems; profile inversions taking into account the uncertainties in the flux surface positions and a substantial increase in accuracy of JET electron density and temperature profiles, including improved pedestal resolution, through the joint analysis of three diagnostic systems. It is believed that the Bayesian graph approach could potentially be utilised for very large sets of diagnostics, providing a generic data analysis framework for nuclear fusion experiments, that would be able to optimally utilize the information from multiple diagnostics simultaneously, and where the explicit graph representation of the connections to underlying physics models could be used for sophisticated model testing. This

  20. Bayesian model discrimination for glucose-insulin homeostasis

    DEFF Research Database (Denmark)

    Andersen, Kim Emil; Brooks, Stephen P.; Højbjerre, Malene

    In this paper we analyse a set of experimental data on a number of healthy and diabetic patients and discuss a variety of models for describing the physiological processes involved in glucose absorption and insulin secretion within the human body. We adopt a Bayesian approach which facilitates the reformulation of existing deterministic models as stochastic state space models which properly accounts for both measurement and process variability. The analysis is further enhanced by Bayesian model discrimination techniques and model averaged parameter estimation which fully accounts for model as well...

  1. Using consensus bayesian network to model the reactive oxygen species regulatory pathway.

    Directory of Open Access Journals (Sweden)

    Liangdong Hu

    Full Text Available The Bayesian network is one of the most successful graph models for representing the reactive oxygen species regulatory pathway. With the increasing number of microarray measurements, it is possible to construct Bayesian networks from microarray data directly. Although a large number of Bayesian network learning algorithms have been developed, when they are applied to learn Bayesian networks from microarray data, the accuracies are low because the databases used for learning contain too few microarray measurements. In this paper, we propose a consensus Bayesian network, constructed by combining Bayesian networks from the relevant literature with Bayesian networks learned from microarray data. It achieves higher accuracy than a Bayesian network learned from a single database. In the experiment, we validated the Bayesian network combination algorithm on several classic machine learning databases and used the consensus Bayesian network to model the Escherichia coli ROS pathway.

  2. Spatial Bayesian belief networks as a planning decision tool for mapping ecosystem services trade-offs on forested landscapes.

    Science.gov (United States)

    Gonzalez-Redin, Julen; Luque, Sandra; Poggio, Laura; Smith, Ron; Gimona, Alessandro

    2016-01-01

    An integrated methodology, based on linking Bayesian belief networks (BBN) with GIS, is proposed for combining available evidence to help forest managers evaluate implications and trade-offs between forest production and conservation measures to preserve biodiversity in forested habitats. A Bayesian belief network is a probabilistic graphical model that represents variables and their dependencies through specifying probabilistic relationships. In spatially explicit decision problems where it is difficult to choose appropriate combinations of interventions, the proposed integration of a BBN with GIS helped to facilitate shared understanding of the human-landscape relationships, while fostering collective management that can be incorporated into landscape planning processes. Trade-offs become increasingly relevant in these landscape contexts, where the participation of many and varied stakeholder groups is indispensable. With these challenges in mind, our integrated approach incorporates GIS-based data with expert knowledge to consider two different land use interests - biodiversity value for conservation and timber production potential - with the focus on a complex mountain landscape in the French Alps. The spatial models produced offer alternative sets of suitable sites that can be used by policy makers to support conservation priorities while addressing management options. The approach provided a common reasoning language among experts from different backgrounds and helped to identify spatially explicit areas of conflict. PMID:26597639

  3. Hellinger Distance and Bayesian Non-Parametrics: Hierarchical Models for Robust and Efficient Bayesian Inference

    OpenAIRE

    Wu, Yuefeng; Hooker, Giles

    2013-01-01

    This paper introduces a hierarchical framework to incorporate Hellinger distance methods into Bayesian analysis. We propose to modify a prior over non-parametric densities with the exponential of twice the Hellinger distance between a candidate and a parametric density. By incorporating a prior over the parameters of the second density, we arrive at a hierarchical model in which a non-parametric model is placed between parameters and the data. The parameters of the family can then be estimate...

  4. Analysis of Gumbel Model for Software Reliability Using Bayesian Paradigm

    Directory of Open Access Journals (Sweden)

    Raj Kumar

    2012-12-01

    Full Text Available In this paper, we have illustrated the suitability of the Gumbel model for software reliability data. The model parameters are estimated using likelihood-based inferential procedures: classical as well as Bayesian. The quasi Newton-Raphson algorithm is applied to obtain the maximum likelihood estimates and associated probability intervals. The Bayesian estimates of the parameters of the Gumbel model are obtained using the Markov Chain Monte Carlo (MCMC) simulation method in OpenBUGS (established software for Bayesian analysis using Markov Chain Monte Carlo methods). R functions are developed to study the statistical properties of the model, to provide model validation and comparison tools, and to analyse the MCMC samples generated from OpenBUGS. Details of applying MCMC to parameter estimation for the Gumbel model are elaborated and a real software reliability data set is considered to illustrate the methods of inference discussed in this paper.
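
    The maximum likelihood step can be sketched in a few lines; the fragment below is written in Python rather than the authors' R/OpenBUGS code and fits the Gumbel location and scale by quasi-Newton optimisation (an illustrative sketch, not the paper's implementation):

        import numpy as np
        from scipy import optimize, stats

        def gumbel_mle(x):
            """Maximum likelihood estimates of the Gumbel location (mu) and scale (beta)."""
            def neg_loglik(params):
                mu, log_beta = params
                return -np.sum(stats.gumbel_r.logpdf(x, loc=mu, scale=np.exp(log_beta)))
            start = np.array([np.mean(x), np.log(np.std(x))])
            res = optimize.minimize(neg_loglik, start, method="BFGS")  # quasi-Newton optimisation
            return res.x[0], np.exp(res.x[1])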

  5. Lack of Confidence in Approximate Bayesian Computation Model Choice

    OpenAIRE

    Robert, Christian P.; Cornuet, Jean-Marie; Marin, Jean-Michel; Pillai, Natesh S.

    2011-01-01

    Approximate Bayesian computation (ABC) has become an essential tool for the analysis of complex stochastic models. Grelaud et al. [(2009) Bayesian Anal 3:427–442] advocated the use of ABC for model choice in the specific case of Gibbs random fields, relying on an intermodel sufficiency property to show that the approximation was legitimate. We implemented ABC model choice in a wide range of phylogenetic models in the Do It Yourself-ABC (DIY-ABC) software [Cornuet et al. (2008) Bioinformatics...
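
    The core ABC rejection step underlying such software is simple to state; the sketch below is a generic illustration (not the DIY-ABC implementation; the prior sampler, simulator and summary function are user-supplied assumptions):

        import numpy as np

        def abc_rejection(obs_summary, prior_sampler, simulator, summary, eps, n_draws=10000, seed=0):
            """Keep prior draws whose simulated summary statistics fall within eps of the observed ones."""
            rng = np.random.default_rng(seed)
            accepted = []
            for _ in range(n_draws):
                theta = prior_sampler(rng)        # draw a parameter value from the prior
                data = simulator(theta, rng)      # simulate a data set under that value
                if np.linalg.norm(summary(data) - obs_summary) <= eps:
                    accepted.append(theta)
            return np.asarray(accepted)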

  6. Cosmological parameter estimation and Bayesian model comparison using VSA data

    CERN Document Server

    Slosar, A; Cleary, K; Davies, R D; Davis, R J; Dickinson, C; Genova-Santos, R; Grainge, K; Gutíerrez, C M; Hafez, Y A; Hobson, M P; Jones, M E; Kneissl, R; Lancaster, K; Lasenby, A; Leahy, J P; Maisinger, K; Marshall, P J; Pooley, G G; Rebolo, R; Rubiño-Martín, J A; Rusholme, B A; Saunders, R D E; Savage, R; Scott, P F; Molina, P J S; Taylor, A C; Titterington, D; Waldram, E M; Watson, R A; Wilkinson, A; Slosar, Anze; Carreira, Pedro; Cleary, Kieran; Davies, Rod D.; Davis, Richard J.; Dickinson, Clive; Genova-Santos, Ricardo; Grainge, Keith; Gutierrez, Carlos M.; Hafez, Yaser A.; Hobson, Michael P.; Jones, Michael E.; Kneissl, Rudiger; Lancaster, Katy; Lasenby, Anthony; Maisinger, Klaus; Marshall, Phil J.; Pooley, Guy G.; Rebolo, Rafael; Rubino-Martin, Jose Alberto; Rusholme, Ben; Saunders, Richard D. E.; Savage, Richard; Scott, Paul F.; Molina, Pedro J. Sosa; Taylor, Angela C.; Titterington, David; Waldram, Elizabeth; Watson, Robert A.; Wilkinson, Althea

    2003-01-01

    We constrain the basic cosmological parameters using the first observations by the Very Small Array (VSA) in its extended configuration, together with existing cosmic microwave background data and other cosmological observations. We estimate cosmological parameters for four different models of increasing complexity. In each case, careful consideration is given to implied priors and the Bayesian evidence is calculated in order to perform model selection. We find that the data are most convincingly explained by a simple flat Lambda-CDM cosmology without tensor modes. In this case, combining just the VSA and COBE data sets yields the 68 per cent confidence intervals Omega_b h^2=0.034 (+0.007, -0.007), Omega_dm h^2 = 0.18 (+0.06, -0.04), h=0.72 (+0.15,-0.13), n_s=1.07 (+0.06,-0.06) and sigma_8=1.17 (+0.25, -0.20). The most general model considered includes spatial curvature, tensor modes, massive neutrinos and a parameterised equation of state for the dark energy. In this case, by combining all recent cosmological...

  7. On the Bayesian Nonparametric Generalization of IRT-Type Models

    Science.gov (United States)

    San Martin, Ernesto; Jara, Alejandro; Rolin, Jean-Marie; Mouchart, Michel

    2011-01-01

    We study the identification and consistency of Bayesian semiparametric IRT-type models, where the uncertainty on the abilities' distribution is modeled using a prior distribution on the space of probability measures. We show that for the semiparametric Rasch Poisson counts model, simple restrictions ensure the identification of a general…

  8. Bayesian inference model for fatigue life of laminated composites

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimirov; Kiureghian, Armen Der; Berggreen, Christian

    2016-01-01

    A probabilistic model for estimating the fatigue life of laminated composite plates is developed. The model is based on lamina-level input data, making it possible to predict fatigue properties for a wide range of laminate configurations. Model parameters are estimated by Bayesian inference. The...

  9. Modelling LGD for unsecured retail loans using Bayesian methods

    OpenAIRE

    Katarzyna Bijak; Thomas, Lyn C

    2015-01-01

    Loss Given Default (LGD) is the loss borne by the bank when a customer defaults on a loan. LGD for unsecured retail loans is often found difficult to model. In the frequentist (non-Bayesian) two-step approach, two separate regression models are estimated independently, which can be considered potentially problematic when trying to combine them to make predictions about LGD. The result is a point estimate of LGD for each loan. Alternatively, LGD can be modelled using Bayesian methods. In the B...

  10. A Bayesian Matrix Factorization Model for Relational Data

    CERN Document Server

    Singh, Ajit P

    2012-01-01

    Relational learning can be used to augment one data source with other correlated sources of information, to improve predictive accuracy. We frame a large class of relational learning problems as matrix factorization problems, and propose a hierarchical Bayesian model. Training our Bayesian model using random-walk Metropolis-Hastings is impractically slow, and so we develop a block Metropolis-Hastings sampler which uses the gradient and Hessian of the likelihood to dynamically tune the proposal. We demonstrate that a predictive model of brain response to stimuli can be improved by augmenting it with side information about the stimuli.

  11. Bayesian inference of chemical kinetic models from proposed reactions

    KAUST Repository

    Galagali, Nikhil

    2015-02-01

    © 2014 Elsevier Ltd. Bayesian inference provides a natural framework for combining experimental data with prior knowledge to develop chemical kinetic models and quantify the associated uncertainties, not only in parameter values but also in model structure. Most existing applications of Bayesian model selection methods to chemical kinetics have been limited to comparisons among a small set of models, however. The significant computational cost of evaluating posterior model probabilities renders traditional Bayesian methods infeasible when the model space becomes large. We present a new framework for tractable Bayesian model inference and uncertainty quantification using a large number of systematically generated model hypotheses. The approach involves imposing point-mass mixture priors over rate constants and exploring the resulting posterior distribution using an adaptive Markov chain Monte Carlo method. The posterior samples are used to identify plausible models, to quantify rate constant uncertainties, and to extract key diagnostic information about model structure-such as the reactions and operating pathways most strongly supported by the data. We provide numerical demonstrations of the proposed framework by inferring kinetic models for catalytic steam and dry reforming of methane using available experimental data.

  12. The Bayesian Modelling Of Inflation Rate In Romania

    Directory of Open Access Journals (Sweden)

    Mihaela Simionescu (Bratu

    2014-06-01

    Full Text Available Bayesian econometrics has seen a considerable increase in popularity in recent years, attracting the interest of various groups of researchers in the economic sciences, including specialists in econometrics, commerce, industry, marketing, finance, microeconomics, macroeconomics and other domains. The purpose of this research is to provide an introduction to the Bayesian approach applied in economics, starting from Bayes' theorem. The estimation methodology for Bayesian linear regression models is presented, and two empirical studies are carried out on data from the Romanian economy. Thus, an autoregressive model of order 2 and a multiple regression model were built for the index of consumer prices. The Gibbs sampling algorithm was used for estimation in the R software, computing the posterior means and standard deviations. The stability of the parameters proved to be greater than in the case of estimations based on classical econometric methods.
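
    A minimal Gibbs sampler of the kind used for such regressions (normal prior on the coefficients, inverse-gamma prior on the error variance) can be sketched as follows; this is an illustrative Python version with assumed prior settings, not the authors' R code:

        import numpy as np

        def gibbs_linreg(X, y, n_iter=5000, a0=2.0, b0=1.0, tau2=100.0, seed=0):
            """Gibbs sampler for y = X beta + e, e ~ N(0, sigma2), with
            beta ~ N(0, tau2 * I) and sigma2 ~ Inv-Gamma(a0, b0)."""
            rng = np.random.default_rng(seed)
            n, k = X.shape
            beta, sigma2 = np.zeros(k), 1.0
            draws = np.empty((n_iter, k + 1))
            XtX, Xty = X.T @ X, X.T @ y
            for t in range(n_iter):
                # beta | sigma2, y  ~  N(mn, Vn)
                Vn = np.linalg.inv(XtX / sigma2 + np.eye(k) / tau2)
                mn = Vn @ (Xty / sigma2)
                beta = rng.multivariate_normal(mn, Vn)
                # sigma2 | beta, y  ~  Inv-Gamma(a0 + n/2, b0 + 0.5 * SSR)
                resid = y - X @ beta
                sigma2 = 1.0 / rng.gamma(a0 + n / 2.0, 1.0 / (b0 + 0.5 * resid @ resid))
                draws[t] = np.append(beta, sigma2)
            return draws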

  13. Bayesian Subset Modeling for High-Dimensional Generalized Linear Models

    KAUST Repository

    Liang, Faming

    2013-06-01

    This article presents a new prior setting for high-dimensional generalized linear models, which leads to a Bayesian subset regression (BSR) with the maximum a posteriori model approximately equivalent to the minimum extended Bayesian information criterion model. The consistency of the resulting posterior is established under mild conditions. Further, a variable screening procedure is proposed based on the marginal inclusion probability, which shares the same properties of sure screening and consistency with the existing sure independence screening (SIS) and iterative sure independence screening (ISIS) procedures. However, since the proposed procedure makes use of joint information from all predictors, it generally outperforms SIS and ISIS in real applications. This article also makes extensive comparisons of BSR with the popular penalized likelihood methods, including Lasso, elastic net, SIS, and ISIS. The numerical results indicate that BSR can generally outperform the penalized likelihood methods. The models selected by BSR tend to be sparser and, more importantly, of higher prediction ability. In addition, the performance of the penalized likelihood methods tends to deteriorate as the number of predictors increases, while this is not significant for BSR. Supplementary materials for this article are available online. © 2013 American Statistical Association.

  14. Involving Stakeholders in Building Integrated Fisheries Models Using Bayesian Methods

    Science.gov (United States)

    Haapasaari, Päivi; Mäntyniemi, Samu; Kuikka, Sakari

    2013-06-01

    A participatory Bayesian approach was used to investigate how the views of stakeholders could be utilized to develop models to help understand the Central Baltic herring fishery. In task one, we applied the Bayesian belief network methodology to elicit the causal assumptions of six stakeholders on factors that influence natural mortality, growth, and egg survival of the herring stock in probabilistic terms. We also integrated the expressed views into a meta-model using the Bayesian model averaging (BMA) method. In task two, we used influence diagrams to study qualitatively how the stakeholders frame the management problem of the herring fishery and elucidate what kind of causalities the different views involve. The paper combines these two tasks to assess the suitability of the methodological choices to participatory modeling in terms of both a modeling tool and participation mode. The paper also assesses the potential of the study to contribute to the development of participatory modeling practices. It is concluded that the subjective perspective to knowledge, that is fundamental in Bayesian theory, suits participatory modeling better than a positivist paradigm that seeks the objective truth. The methodology provides a flexible tool that can be adapted to different kinds of needs and challenges of participatory modeling. The ability of the approach to deal with small data sets makes it cost-effective in participatory contexts. However, the BMA methodology used in modeling the biological uncertainties is so complex that it needs further development before it can be introduced to wider use in participatory contexts.

  15. Applications of Bayesian Model Selection to Cosmological Parameters

    CERN Document Server

    Trotta, R

    2005-01-01

    Bayesian evidence is a tool for model comparison which can be used to decide whether the introduction of a new parameter is warranted by data. I show that the usual sampling statistic rejection tests for a null hypothesis can be misleading, since they do not take into account the information content of the data. I review the Laplace approximation and the Savage-Dickey density ratio to compute Bayes factors, which avoid the need of carrying out a computationally demanding multi-dimensional integration. I present a new procedure to forecast the Bayes factor of a future observation by computing the Expected Posterior Odds (ExPO). As an illustration, I consider three key parameters for our understanding of the cosmological concordance model: the spectral tilt of scalar perturbations, the spatial curvature of the Universe and a CDM isocurvature component to the initial conditions which is totally (anti)correlated with the adiabatic mode. I find that current data are not informative enough to draw a conclusion on t...
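
    The Savage-Dickey density ratio mentioned above reduces to evaluating the prior and posterior densities of the extra parameter at its null value; a rough sketch using MCMC draws and a kernel density estimate (illustrative, not the paper's code) is:

        import numpy as np
        from scipy import stats

        def savage_dickey_bf01(posterior_draws, prior_pdf, theta0=0.0):
            """Bayes factor in favour of theta = theta0: the posterior density at theta0
            (kernel estimate from MCMC draws) divided by the prior density at theta0."""
            posterior_at_theta0 = stats.gaussian_kde(posterior_draws)(theta0)[0]
            return posterior_at_theta0 / prior_pdf(theta0)

        # Example with assumed inputs: N(0, 1) prior on the extra parameter, synthetic posterior draws.
        draws = np.random.default_rng(0).normal(0.3, 0.2, size=5000)
        bf01 = savage_dickey_bf01(draws, stats.norm(0.0, 1.0).pdf)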

  16. Bayesian modeling and prediction of solar particles flux

    Czech Academy of Sciences Publication Activity Database

    Dedecius, Kamil; Kalová, J.

    Praha: FJFI ČVUT v Praze, 2009 - (Štěpán, V.), s. 77-77 ISBN 978-80-01-04430-8. [XXXI. Dny radiační ochrany. Kouty nad Desnou, Hrubý Jeseník (CZ), 02.11.2009-06.11.2009] R&D Projects: GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : Bayesian model * solar particle * solar wind Subject RIV: IN - Informatics, Computer Science http://library.utia.cas.cz/separaty/2009/AS/dedecius-bayesian modeling and prediction of solar particle s flux.pdf

  17. Research & development and growth: A Bayesian model averaging analysis

    Czech Academy of Sciences Publication Activity Database

    Horváth, Roman

    2011-01-01

    Roč. 28, č. 6 (2011), s. 2669-2673. ISSN 0264-9993. [Society for Non-linear Dynamics and Econometrics Annual Conferencen. Washington DC, 16.03.2011-18.03.2011] R&D Projects: GA ČR GA402/09/0965 Institutional research plan: CEZ:AV0Z10750506 Keywords : Research and development * Growth * Bayesian model averaging Subject RIV: AH - Economics Impact factor: 0.701, year: 2011 http://library.utia.cas.cz/separaty/2011/E/horvath-research & development and growth a bayesian model averaging analysis.pdf

  18. Approximate Bayesian Recursive Estimation of Linear Model with Uniform Noise

    Czech Academy of Sciences Publication Activity Database

    Pavelková, Lenka; Kárný, Miroslav

    Brussels: IFAC, 2012, s. 1803-1807. ISBN 978-3-902823-06-9. [16th IFAC Symposium on System Identification The International Federation of Automatic Control. Brussels (BE), 11.07.2012-13.07.2012] R&D Projects: GA TA ČR TA01030123 Institutional support: RVO:67985556 Keywords : recursive parameter estimation * bounded noise * Bayesian learning * autoregressive models Subject RIV: BC - Control Systems Theory http://library.utia.cas.cz/separaty/2012/AS/pavelkova-approximate bayesian recursive estimation of linear model with uniform noise.pdf

  19. Comparing Bayesian models for multisensory cue combination without mandatory integration

    OpenAIRE

    Beierholm, Ulrik R.; Shams, Ladan; Kording, Konrad P; Ma, Wei Ji

    2009-01-01

    Bayesian models of multisensory perception traditionally address the problem of estimating an underlying variable that is assumed to be the cause of the two sensory signals. The brain, however, has to solve a more general problem: it also has to establish which signals come from the same source and should be integrated, and which ones do not and should be segregated. In the last couple of years, a few models have been proposed to solve this problem in a Bayesian fashion. One of these ha...

  20. Bayesian model mixing for cold rolling mills: Test results

    Czech Academy of Sciences Publication Activity Database

    Ettler, P.; Puchr, I.; Dedecius, Kamil

    Slovensko: Slovak University of Technology, 2013, s. 359-364. ISBN 978-1-4799-0926-1. [19th International Conference on Process Control . Štrbské Pleso (SK), 18.06.2013-21.06.2013] R&D Projects: GA MŠk(CZ) 7D09008; GA MŠk 7D12004 Keywords : Bayesian statistics * model mixing * process control Subject RIV: BC - Control Systems Theory http://library.utia.cas.cz/separaty/2013/AS/dedecius-bayesian model mixing for cold rolling mills test results.pdf

  1. Bayesian Model Comparison With the g-Prior

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Christensen, Mads Græsbøll; Cemgil, Ali Taylan;

    2014-01-01

    Model comparison and selection is an important problem in many model-based signal processing applications. Often, very simple information criteria such as the Akaike information criterion or the Bayesian information criterion are used despite their shortcomings. Compared to these methods, Djuric’...

  2. Bayesian Estimation of the DINA Model with Gibbs Sampling

    Science.gov (United States)

    Culpepper, Steven Andrew

    2015-01-01

    A Bayesian model formulation of the deterministic inputs, noisy "and" gate (DINA) model is presented. Gibbs sampling is employed to simulate from the joint posterior distribution of item guessing and slipping parameters, subject attribute parameters, and latent class probabilities. The procedure extends concepts in Béguin and Glas,…

  3. Socio-ecological factors and hand, foot and mouth disease in dry climate regions: a Bayesian spatial approach in Gansu, China

    Science.gov (United States)

    Gou, Faxiang; Liu, Xinfeng; Ren, Xiaowei; Liu, Dongpeng; Liu, Haixia; Wei, Kongfu; Yang, Xiaoting; Cheng, Yao; Zheng, Yunhe; Jiang, Xiaojuan; Li, Juansheng; Meng, Lei; Hu, Wenbiao

    2016-06-01

    The influence of socio-ecological factors on hand, foot and mouth disease (HFMD) was explored in this study using Bayesian spatial modeling, and spatial patterns were identified in dry regions of Gansu, China. Notified HFMD cases and socio-ecological data were obtained from the China Information System for Disease Control and Prevention, Gansu Yearbook and Gansu Meteorological Bureau. A Bayesian spatial conditional autoregressive model was used to quantify the effects of socio-ecological factors on HFMD and to explore spatial patterns after accounting for these effects. Our non-spatial model suggests temperature (relative risk (RR) 1.15, 95 % CI 1.01-1.31), GDP per capita (RR 1.19, 95 % CI 1.01-1.39) and population density (RR 1.98, 95 % CI 1.19-3.17) to have a significant effect on HFMD transmission. However, after controlling for spatial random effects, only temperature (RR 1.25, 95 % CI 1.04-1.53) showed significant association with HFMD. The spatial model demonstrates temperature to play a major role in the transmission of HFMD in dry regions. Estimated residual variation after taking into account the socio-ecological variables indicated that high incidences of HFMD were mainly clustered in the northwest of Gansu. In addition, the spatial structure showed a distinct distribution after accounting for socio-ecological effects.
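
    The spatially structured random effect in such a conditional autoregressive (CAR) model is defined by a precision matrix built from the areal adjacency structure; a minimal sketch (proper CAR with illustrative parameter values, not the study's fitted model) is:

        import numpy as np

        def car_precision(W, rho=0.9, tau=1.0):
            """Precision matrix of a proper CAR prior, Q = tau * (D - rho * W), where W is a
            symmetric 0/1 adjacency matrix and D is diagonal with the number of neighbours."""
            D = np.diag(W.sum(axis=1))
            return tau * (D - rho * W)

        # Three areas in a chain: 1-2, 2-3.
        W = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
        Q = car_precision(W)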

  4. Technical Note: Probabilistically constraining proxy age–depth models within a Bayesian hierarchical reconstruction model

    Directory of Open Access Journals (Sweden)

    J. P. Werner

    2015-03-01

    Full Text Available Reconstructions of the late-Holocene climate rely heavily upon proxies that are assumed to be accurately dated by layer counting, such as measurements of tree rings, ice cores, and varved lake sediments. Considerable advances could be achieved if time-uncertain proxies were able to be included within these multiproxy reconstructions, and if time uncertainties were recognized and correctly modeled for proxies commonly treated as free of age model errors. Current approaches for accounting for time uncertainty are generally limited to repeating the reconstruction using each one of an ensemble of age models, thereby inflating the final estimated uncertainty – in effect, each possible age model is given equal weighting. Uncertainties can be reduced by exploiting the inferred space–time covariance structure of the climate to re-weight the possible age models. Here, we demonstrate how Bayesian hierarchical climate reconstruction models can be augmented to account for time-uncertain proxies. Critically, although a priori all age models are given equal probability of being correct, the probabilities associated with the age models are formally updated within the Bayesian framework, thereby reducing uncertainties. Numerical experiments show that updating the age model probabilities decreases uncertainty in the resulting reconstructions, as compared with the current de facto standard of sampling over all age models, provided there is sufficient information from other data sources in the spatial region of the time-uncertain proxy. This approach can readily be generalized to non-layer-counted proxies, such as those derived from marine sediments.

  5. Exploring the Influence of Neighborhood Characteristics on Burglary Risks: A Bayesian Random Effects Modeling Approach

    Directory of Open Access Journals (Sweden)

    Hongqiang Liu

    2016-06-01

    Full Text Available A Bayesian random effects modeling approach was used to examine the influence of neighborhood characteristics on burglary risks in Jianghan District, Wuhan, China. This random effects model is essentially spatial; a spatially structured random effects term and an unstructured random effects term are added to the traditional non-spatial Poisson regression model. Based on social disorganization and routine activity theories, five covariates extracted from the available data at the neighborhood level were used in the modeling. Three regression models were fitted and compared by the deviance information criterion to identify which model best fit our data. A comparison of the results from the three models indicates that the Bayesian random effects model is superior to the non-spatial models in fitting the data and estimating regression coefficients. Our results also show that neighborhoods with above average bar density and department store density have higher burglary risks. Neighborhood-specific burglary risks and posterior probabilities of neighborhoods having a burglary risk greater than 1.0 were mapped, indicating the neighborhoods that should warrant more attention and be prioritized for crime intervention and reduction. Implications and limitations of the study are discussed in our concluding section.

  6. Bayesian Uncertainty Quantification for Subsurface Inversion Using a Multiscale Hierarchical Model

    KAUST Repository

    Mondal, Anirban

    2014-07-03

    We consider a Bayesian approach to nonlinear inverse problems in which the unknown quantity is a random field (spatial or temporal). The Bayesian approach contains a natural mechanism for regularization in the form of prior information, can incorporate information from heterogeneous sources and provide a quantitative assessment of uncertainty in the inverse solution. The Bayesian setting casts the inverse solution as a posterior probability distribution over the model parameters. The Karhunen-Loeve expansion is used for dimension reduction of the random field. Furthermore, we use a hierarchical Bayes model to inject multiscale data in the modeling framework. In this Bayesian framework, we show that this inverse problem is well-posed by proving that the posterior measure is Lipschitz continuous with respect to the data in total variation norm. Computational challenges in this construction arise from the need for repeated evaluations of the forward model (e.g., in the context of MCMC) and are compounded by high dimensionality of the posterior. We develop two-stage reversible jump MCMC that has the ability to screen the bad proposals in the first inexpensive stage. Numerical results are presented by analyzing simulated as well as real data from hydrocarbon reservoir. This article has supplementary material available online. © 2014 American Statistical Association and the American Society for Quality.

  7. Bayesian Joint Modelling for Object Localisation in Weakly Labelled Images.

    Science.gov (United States)

    Shi, Zhiyuan; Hospedales, Timothy M; Xiang, Tao

    2015-10-01

    We address the problem of localisation of objects as bounding boxes in images and videos with weak labels. This weakly supervised object localisation problem has been tackled in the past using discriminative models where each object class is localised independently from other classes. In this paper, a novel framework based on Bayesian joint topic modelling is proposed, which differs significantly from the existing ones in that: (1) All foreground object classes are modelled jointly in a single generative model that encodes multiple object co-existence so that "explaining away" inference can resolve ambiguity and lead to better learning and localisation. (2) Image backgrounds are shared across classes to better learn varying surroundings and "push out" objects of interest. (3) Our model can be learned with a mixture of weakly labelled and unlabelled data, allowing the large volume of unlabelled images on the Internet to be exploited for learning. Moreover, the Bayesian formulation enables the exploitation of various types of prior knowledge to compensate for the limited supervision offered by weakly labelled data, as well as Bayesian domain adaptation for transfer learning. Extensive experiments on the PASCAL VOC, ImageNet and YouTube-Object videos datasets demonstrate the effectiveness of our Bayesian joint model for weakly supervised object localisation. PMID:26340253

  8. Modeling error distributions of growth curve models through Bayesian methods.

    Science.gov (United States)

    Zhang, Zhiyong

    2016-06-01

    Growth curve models are widely used in social and behavioral sciences. However, typical growth curve models often assume that the errors are normally distributed although non-normal data may be even more common than normal data. In order to avoid possible statistical inference problems in blindly assuming normality, a general Bayesian framework is proposed to flexibly model normal and non-normal data through the explicit specification of the error distributions. A simulation study shows that when the error distribution is correctly specified, one can avoid the loss of efficiency in standard error estimates. A real example on the analysis of mathematical ability growth data from the Early Childhood Longitudinal Study, Kindergarten Class of 1998-99 is used to show the application of the proposed methods. Instructions and code on how to conduct growth curve analysis with both normal and non-normal error distributions using the MCMC procedure of SAS are provided. PMID:26019004

  9. Asymptotically minimax Bayesian predictive densities for multinomial models

    CERN Document Server

    Komaki, Fumiyasu

    2011-01-01

    One-step ahead prediction for the multinomial model is considered. The performance of a predictive density is evaluated by the average Kullback-Leibler divergence from the true density to the predictive density. Asymptotic approximations of risk functions of Bayesian predictive densities based on Dirichlet priors are obtained. It is shown that a Bayesian predictive density based on a specific Dirichlet prior is asymptotically minimax. The asymptotically minimax prior is different from known objective priors such as the Jeffreys prior or the uniform prior.
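
    For a multinomial model with a Dirichlet(alpha) prior, the Bayesian predictive probabilities discussed here have a simple closed form; a minimal sketch (the uniform prior corresponds to alpha_k = 1 and the Jeffreys prior to alpha_k = 1/2) is:

        import numpy as np

        def dirichlet_predictive(counts, alpha):
            """Predictive probability of each category for the next observation:
            (n_k + alpha_k) / (n + sum(alpha))."""
            counts = np.asarray(counts, dtype=float)
            alpha = np.asarray(alpha, dtype=float)
            return (counts + alpha) / (counts.sum() + alpha.sum())

        # Example: observed counts (3, 0, 1) with a uniform Dirichlet(1, 1, 1) prior.
        print(dirichlet_predictive([3, 0, 1], [1.0, 1.0, 1.0]))   # approx [0.571, 0.143, 0.286]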

  10. Uncertainty Modeling Based on Bayesian Network in Ontology Mapping

    Institute of Scientific and Technical Information of China (English)

    LI Yuhua; LIU Tao; SUN Xiaolin

    2006-01-01

    How to deal with uncertainty is crucial in exact concept mapping between ontologies. This paper presents a new framework for modeling uncertainty in ontologies based on Bayesian networks (BNs). In our approach, the Ontology Web Language (OWL) is extended to add probabilistic markups for attaching probability information; the source and target ontologies (expressed in the extended OWL) are translated into Bayesian networks, and the mapping between the two ontologies can be derived by constructing the conditional probability tables (CPTs) of the BN using an improved algorithm named I-IPFP, based on the iterative proportional fitting procedure (IPFP). The basic idea of this framework and the algorithm are validated by positive results from computer experiments.

  11. Improving Bayesian Reasoning: The Effects of Phrasing, Visualization, and Spatial Ability.

    Science.gov (United States)

    Ottley, Alvitta; Peck, Evan M; Harrison, Lane T; Afergan, Daniel; Ziemkiewicz, Caroline; Taylor, Holly A; Han, Paul K J; Chang, Remco

    2016-01-01

    Decades of research have repeatedly shown that people perform poorly at estimating and understanding conditional probabilities that are inherent in Bayesian reasoning problems. Yet in the medical domain, both physicians and patients make daily, life-critical judgments based on conditional probability. Although there have been a number of attempts to develop more effective ways to facilitate Bayesian reasoning, reports of these findings tend to be inconsistent and sometimes even contradictory. For instance, the reported accuracies for individuals being able to correctly estimate conditional probability range from 6% to 62%. In this work, we show that problem representation can significantly affect accuracies. By controlling the amount of information presented to the user, we demonstrate how text and visualization designs can increase overall accuracies to as high as 77%. Additionally, we found that for users with high spatial ability, our designs can further improve their accuracies to as high as 100%. By and large, our findings provide explanations for the inconsistent reports on accuracy in Bayesian reasoning tasks and show a significant improvement over existing methods. We believe that these findings can have immediate impact on risk communication in health-related fields. PMID:26390491
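
    The conditional-probability problems studied here reduce to a single application of Bayes' rule; a small worked example with hypothetical screening numbers (prevalence, sensitivity and specificity chosen purely for illustration) is:

        def posterior_positive(prevalence, sensitivity, specificity):
            """P(condition | positive test) by Bayes' rule."""
            p_pos_given_cond = sensitivity
            p_pos_given_no_cond = 1.0 - specificity
            p_pos = prevalence * p_pos_given_cond + (1.0 - prevalence) * p_pos_given_no_cond
            return prevalence * p_pos_given_cond / p_pos

        # Hypothetical numbers: 1% prevalence, 80% sensitivity, 90% specificity -> about 0.075.
        print(posterior_positive(0.01, 0.80, 0.90))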

  12. Bayesian Modelling of fMRI Time Series

    DEFF Research Database (Denmark)

    Højen-Sørensen, Pedro; Hansen, Lars Kai; Rasmussen, Carl Edward

    2000-01-01

    We present a Hidden Markov Model (HMM) for inferring the hidden psychological state (or neural activity) during single trial fMRI activation experiments with blocked task paradigms. Inference is based on Bayesian methodology, using a combination of analytical and a variety of Markov Chain Monte...

  13. Bayesian Modelling of fMRI Time Series

    DEFF Research Database (Denmark)

    Højen-Sørensen, Pedro; Hansen, Lars Kai; Rasmussen, Carl Edward

    We present a Hidden Markov Model (HMM) for inferring the hidden psychological state (or neural activity) during single trial fMRI activation experiments with blocked task paradigms. Inference is based on Bayesian methodology, using a combination of analytical and a variety of Markov Chain Monte...

  14. Bayesian nonparametric estimation of hazard rate in monotone Aalen model

    Czech Academy of Sciences Publication Activity Database

    Timková, Jana

    2014-01-01

    Roč. 50, č. 6 (2014), s. 849-868. ISSN 0023-5954 Institutional support: RVO:67985556 Keywords : Aalen model * Bayesian estimation * MCMC Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.541, year: 2014 http://library.utia.cas.cz/separaty/2014/SI/timkova-0438210.pdf

  15. Research on Bayesian Network Based User's Interest Model

    Institute of Scientific and Technical Information of China (English)

    ZHANG Weifeng; XU Baowen; CUI Zifeng; XU Lei

    2007-01-01

    Filtering and selectively retrieving the large amount of information on the Internet has real practical significance for improving the quality of the information that users access. On the basis of analyzing existing user interest models and some basic questions about user interest (its representation, derivation and identification), a Bayesian network based user interest model is given. In this model, a user interest reduction algorithm based on the Markov Blanket model is used to reduce the interest noise, and then documents the user is and is not interested in are used to train the Bayesian network. Compared to the simple model, this model has advantages such as small space requirements, a simple reasoning method and a high recognition rate. The experimental results show that this model can more appropriately reflect the user's interest, and has higher performance and good usability.

  16. Exploring Neighborhood Influences on Small-Area Variations in Intimate Partner Violence Risk: A Bayesian Random-Effects Modeling Approach

    Directory of Open Access Journals (Sweden)

    Enrique Gracia

    2014-01-01

    Full Text Available This paper uses spatial data of cases of intimate partner violence against women (IPVAW) to examine neighborhood-level influences on small-area variations in IPVAW risk in a police district of the city of Valencia (Spain). To analyze area variations in IPVAW risk and its association with neighborhood-level explanatory variables we use a Bayesian spatial random-effects modeling approach, as well as disease mapping methods to represent risk probabilities in each area. Analyses show that IPVAW cases are more likely in areas of high immigrant concentration, high public disorder and crime, and high physical disorder. Results also show a spatial component indicating remaining variability attributable to spatially structured random effects. Bayesian spatial modeling offers a new perspective to identify IPVAW high and low risk areas, and provides a new avenue for the design of better-informed prevention and intervention strategies.

  17. Bayesian estimation of parameters in a regional hydrological model

    Directory of Open Access Journals (Sweden)

    K. Engeland

    2002-01-01

    Full Text Available This study evaluates the applicability of the distributed, process-oriented Ecomag model for prediction of daily streamflow in ungauged basins. The Ecomag model is applied as a regional model to nine catchments in the NOPEX area, using Bayesian statistics to estimate the posterior distribution of the model parameters conditioned on the observed streamflow. The distribution is calculated by Markov Chain Monte Carlo (MCMC) analysis. The Bayesian method requires formulation of a likelihood function for the parameters and three alternative formulations are used. The first is a subjectively chosen objective function that describes the goodness of fit between the simulated and observed streamflow, as defined in the GLUE framework. The second and third formulations are more statistically correct likelihood models that describe the simulation errors. The full statistical likelihood model describes the simulation errors as an AR(1) process, whereas the simple model excludes the auto-regressive part. The statistical parameters depend on the catchments and the hydrological processes and the statistical and the hydrological parameters are estimated simultaneously. The results show that the simple likelihood model gives the most robust parameter estimates. The simulation error may be explained to a large extent by the catchment characteristics and climatic conditions, so it is possible to transfer knowledge about them to ungauged catchments. The statistical models for the simulation errors indicate that structural errors in the model are more important than parameter uncertainties. Keywords: regional hydrological model, model uncertainty, Bayesian analysis, Markov Chain Monte Carlo analysis
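
    The full statistical likelihood described above treats the simulation errors as a Gaussian AR(1) process; a minimal sketch of that error log-likelihood (illustrative, not the authors' implementation) is:

        import numpy as np

        def ar1_loglik(errors, phi, sigma2):
            """Gaussian AR(1) log-likelihood for simulation errors e_t = phi * e_{t-1} + w_t,
            w_t ~ N(0, sigma2); the first error uses the stationary variance sigma2 / (1 - phi**2)."""
            e = np.asarray(errors, dtype=float)
            var0 = sigma2 / (1.0 - phi ** 2)
            ll = -0.5 * (np.log(2 * np.pi * var0) + e[0] ** 2 / var0)
            innov = e[1:] - phi * e[:-1]
            ll += -0.5 * np.sum(np.log(2 * np.pi * sigma2) + innov ** 2 / sigma2)
            return ll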

  18. A Bayesian Markov geostatistical model for estimation of hydrogeological properties

    International Nuclear Information System (INIS)

    A geostatistical methodology based on Markov-chain analysis and Bayesian statistics was developed for probability estimations of hydrogeological and geological properties in the siting process of a nuclear waste repository. The probability estimates have practical use in decision-making on issues such as siting, investigation programs, and construction design. The methodology is nonparametric which makes it possible to handle information that does not exhibit standard statistical distributions, as is often the case for classified information. Data do not need to meet the requirements on additivity and normality as with the geostatistical methods based on regionalized variable theory, e.g., kriging. The methodology also has a formal way for incorporating professional judgments through the use of Bayesian statistics, which allows for updating of prior estimates to posterior probabilities each time new information becomes available. A Bayesian Markov Geostatistical Model (BayMar) software was developed for implementation of the methodology in two and three dimensions. This paper gives (1) a theoretical description of the Bayesian Markov Geostatistical Model; (2) a short description of the BayMar software; and (3) an example of application of the model for estimating the suitability for repository establishment with respect to the three parameters of lithology, hydraulic conductivity, and rock quality designation index (RQD) at 400--500 meters below ground surface in an area around the Aespoe Hard Rock Laboratory in southeastern Sweden

  19. Bayesian and maximin optimal designs for heteroscedastic regression models

    OpenAIRE

    Dette, Holger; Haines, Linda M.; Imhof, Lorens A.

    2003-01-01

    The problem of constructing standardized maximin D-optimal designs for weighted polynomial regression models is addressed. In particular it is shown that, by following the broad approach to the construction of maximin designs introduced recently by Dette, Haines and Imhof (2003), such designs can be obtained as weak limits of the corresponding Bayesian Φq-optimal designs. The approach is illustrated for two specific weighted polynomial models and also for a particular growth model.

  20. Bayesian modeling growth curves for quail assuming skewness in errors

    Directory of Open Access Journals (Sweden)

    Robson Marcelo Rossi

    2014-06-01

    Full Text Available Bayesian modeling growth curves for quail assuming skewness in errors - Assuming normal distributions in data analysis is common in many areas of knowledge. However, we can make use of other distributions that are able to model a skewness parameter in situations that require modeling data with tails heavier than the normal. This article presents alternatives to the assumption of normality in the errors by adding asymmetric distributions. A Bayesian approach is proposed to fit nonlinear models when the errors are not normal; the t, skew-normal and skew-t distributions are adopted. The methodology is applied to different growth curves for quail body weights. It was found that the Gompertz model assuming skew-normal errors and skew-t errors, respectively for males and females, fitted the data best.
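
    The record above fits Gompertz-type growth curves with skewed error distributions in a Bayesian framework. As a rough illustration of the error model only, the sketch below fits a Gompertz curve with skew-normal residuals by maximizing the log-likelihood (a maximum-likelihood stand-in for the Bayesian fit); the data and starting values are synthetic placeholders.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import skewnorm

      def gompertz(t, a, b, c):
          # a: asymptotic weight, b: displacement, c: growth rate
          return a * np.exp(-b * np.exp(-c * t))

      def negloglik(params, t, y):
          a, b, c, scale, shape = params
          if scale <= 0:
              return np.inf
          resid = y - gompertz(t, a, b, c)
          return -np.sum(skewnorm.logpdf(resid, a=shape, loc=0.0, scale=scale))

      # Toy data standing in for quail body weights (age in days, weight in grams).
      t = np.arange(1.0, 43.0, 3.0)
      y = gompertz(t, 280.0, 4.0, 0.1) + skewnorm.rvs(a=3, scale=8.0, size=t.size, random_state=0)

      fit = minimize(negloglik, x0=[250.0, 3.0, 0.1, 5.0, 1.0], args=(t, y), method="Nelder-Mead")
      print(fit.x)   # estimates of (a, b, c, scale, shape)

    A Bayesian version would place priors on the same five parameters and sample them by MCMC instead of optimizing.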

  1. APPLICATION OF BAYESIAN MONTE CARLO ANALYSIS TO A LAGRANGIAN PHOTOCHEMICAL AIR QUALITY MODEL. (R824792)

    Science.gov (United States)

    Uncertainties in ozone concentrations predicted with a Lagrangian photochemical air quality model have been estimated using Bayesian Monte Carlo (BMC) analysis. Bayesian Monte Carlo analysis provides a means of combining subjective "prior" uncertainty estimates developed ...

  2. The Relevance Voxel Machine (RVoxM): A Self-Tuning Bayesian Model for Informative Image-Based Prediction

    DEFF Research Database (Denmark)

    Sabuncu, Mert R.; Van Leemput, Koen

    2012-01-01

    This paper presents the relevance voxel machine (RVoxM), a dedicated Bayesian model for making predictions based on medical imaging data. In contrast to the generic machine learning algorithms that have often been used for this purpose, the method is designed to utilize a small number of spatially...

  3. A Bayesian nonlinear mixed-effects disease progression model

    Science.gov (United States)

    Kim, Seongho; Jang, Hyejeong; Wu, Dongfeng; Abrams, Judith

    2016-01-01

    A nonlinear mixed-effects approach is developed for disease progression models that incorporate variation in age in a Bayesian framework. We further generalize the probability model for sensitivity to depend on age at diagnosis, time spent in the preclinical state and sojourn time. The developed models are then applied to the Johns Hopkins Lung Project data and the Health Insurance Plan for Greater New York data using Bayesian Markov chain Monte Carlo and are compared with the estimation method that does not consider random-effects from age. Using the developed models, we obtain not only age-specific individual-level distributions, but also population-level distributions of sensitivity, sojourn time and transition probability. PMID:26798562

  4. Non-stationarity in GARCH models: A Bayesian analysis

    OpenAIRE

    Kleibergen, Frank; Dijk, Herman

    1993-01-01

    First, the non-stationarity properties of the conditional variances in the GARCH(1,1) model are analysed using the concept of infinite persistence of shocks. Given a time sequence of probabilities for increasing/decreasing conditional variances, a theoretical formula for quasi-strict non-stationarity is defined. The resulting conditions for the GARCH(1,1) model are shown to differ from the weak stationarity conditions mainly used in the literature. Bayesian statistical analysis us...

  5. A New Bayesian Unit Root Test in Stochastic Volatility Models

    OpenAIRE

    Yong Li; Jun Yu

    2010-01-01

    A new posterior odds analysis is proposed to test for a unit root in volatility dynamics in the context of stochastic volatility models. This analysis extends the Bayesian unit root test of So and Li (1999, Journal of Business Economic Statistics) in two important ways. First, a numerically more stable algorithm is introduced to compute the Bayes factor, taking into account the special structure of the competing models. Owing to its numerical stability, the algorithm overcomes the problem of ...

  6. Bayesian Modelling in Machine Learning: A Tutorial Review

    OpenAIRE

    Seeger, Matthias

    2006-01-01

    Many facets of Bayesian Modelling are firmly established in Machine Learning and give rise to state-of-the-art solutions to application problems. The sheer number of techniques, ideas and models which have been proposed, and the terminology, can be bewildering. With this tutorial review, we aim to give a wide high-level overview over this important field, concentrating on central ideas and methods, and on their interconnections. The reader will gain a basic understanding of the topics and the...

  7. Performance and prediction: Bayesian modelling of fallible choice in chess

    OpenAIRE

    Haworth, Guy McCrossan; Regan, Ken; Di Fatta, Giuseppe

    2010-01-01

    Evaluating agents in decision-making applications requires assessing their skill and predicting their behaviour. Both are well developed in Poker-like situations, but less so in more complex game and model domains. This paper addresses both tasks by using Bayesian inference in a benchmark space of reference agents. The concepts are explained and demonstrated using the game of chess but the model applies generically to any domain with quantifiable options and fallible choice. Demonstration ...

  8. Bayesian modeling and prediction of solar particles flux

    International Nuclear Information System (INIS)

    An autoregression model was developed based on the Bayesian approach. Considering the solar wind non-homogeneity, the idea was applied of combining the pure autoregressive properties of the model with expert knowledge based on a similar behaviour of the various phenomena related to the flux properties. Examples of such situations include the hardening of the X-ray spectrum, which is often followed by coronal mass ejection and a significant increase in the particles flux intensity
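
    As a small illustration of combining an autoregressive model with prior (expert) information in a Bayesian way, the sketch below performs the conjugate normal update for an AR(1) coefficient with known noise variance; the prior values and the toy series are assumptions, not the flux model of the record above.

      import numpy as np

      def ar1_posterior(x, prior_mean=0.0, prior_var=1.0, noise_var=1.0):
          # Conjugate normal update for phi in x_t = phi * x_{t-1} + eps_t, eps_t ~ N(0, noise_var),
          # with a N(prior_mean, prior_var) prior on phi encoding expert knowledge.
          xlag, xt = x[:-1], x[1:]
          prec = 1.0 / prior_var + np.dot(xlag, xlag) / noise_var
          mean = (prior_mean / prior_var + np.dot(xlag, xt) / noise_var) / prec
          return mean, 1.0 / prec          # posterior mean and variance of phi

      rng = np.random.default_rng(1)
      x = np.zeros(200)
      for t in range(1, 200):               # toy flux-like series, illustrative only
          x[t] = 0.8 * x[t - 1] + rng.normal()
      print(ar1_posterior(x))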

  9. Bayesian modeling and prediction of solar particles flux

    Czech Academy of Sciences Publication Activity Database

    Dedecius, Kamil; Kalová, J.

    18/56/, 7/8 (2010), s. 228-230. ISSN 1210-7085 R&D Projects: GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : mathematical models * solar activity * solar flares * solar flux * solar particles Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2010/AS/dedecius-bayesian modeling and prediction of solar particles flux.pdf

  10. Hierarchical Bayesian Modeling of Hitting Performance in Baseball

    OpenAIRE

    Jensen, Shane T.; McShane, Blake; Wyner, Abraham J.

    2009-01-01

    We have developed a sophisticated statistical model for predicting the hitting performance of Major League baseball players. The Bayesian paradigm provides a principled method for balancing past performance with crucial covariates, such as player age and position. We share information across time and across players by using mixture distributions to control shrinkage for improved accuracy. We compare the performance of our model to current sabermetric methods on a held-out seaso...

  11. Bayesian estimation of a DSGE model with inventories

    OpenAIRE

    Foerster, Marcel

    2011-01-01

    This paper introduces inventories in an otherwise standard Dynamic Stochastic General Equilibrium Model (DSGE) of the business cycle. Firms accumulate inventories to facilitate sales, but face a cost of doing so in terms of costly storage of intermediate goods. The paper's main contribution is to present a DSGE model with inventories that is estimated using Bayesian methods. Based on US data we show that accounting for inventory dynamics has a significant impact on parameter estimates and imp...

  12. Markov Model of Wind Power Time Series Using Bayesian Inference of Transition Matrix

    OpenAIRE

    Chen, Peiyuan; Berthelsen, Kasper Klitgaard; Bak-Jensen, Birgitte; Chen, Zhe

    2009-01-01

    This paper proposes to use Bayesian inference of transition matrix when developing a discrete Markov model of a wind speed/power time series and 95% credible interval for the model verification. The Dirichlet distribution is used as a conjugate prior for the transition matrix. Three discrete Markov models are compared, i.e. the basic Markov model, the Bayesian Markov model and the birth-and-death Markov model. The proposed Bayesian Markov model shows the best accuracy in modeling the autocorr...
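
    The core of the Bayesian Markov model above is the Dirichlet-multinomial conjugacy: transition counts plus Dirichlet pseudo-counts give the posterior of each row of the transition matrix, from which credible intervals follow directly. A minimal sketch, with the state discretization and data purely illustrative:

      import numpy as np

      def transition_posterior(states, n_states, alpha=1.0):
          # Count observed transitions and add the Dirichlet prior pseudo-counts;
          # each row of the result parameterizes a Dirichlet posterior.
          counts = np.full((n_states, n_states), alpha)
          for i, j in zip(states[:-1], states[1:]):
              counts[i, j] += 1
          return counts

      rng = np.random.default_rng(0)
      states = rng.integers(0, 3, size=500)            # toy discretized wind-power series
      post = transition_posterior(states, n_states=3)

      mean_matrix = post / post.sum(axis=1, keepdims=True)    # posterior mean transition matrix
      samples = rng.dirichlet(post[0], size=10000)[:, 1]      # draws of P(state 0 -> state 1)
      print(mean_matrix)
      print(np.percentile(samples, [2.5, 97.5]))              # 95% credible interval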

  13. Prediction and assimilation of surf-zone processes using a Bayesian network: Part I: Forward models

    Science.gov (United States)

    Plant, Nathaniel G.; Holland, K. Todd

    2011-01-01

    Prediction of coastal processes, including waves, currents, and sediment transport, can be obtained from a variety of detailed geophysical-process models with many simulations showing significant skill. This capability supports a wide range of research and applied efforts that can benefit from accurate numerical predictions. However, the predictions are only as accurate as the data used to drive the models and, given the large temporal and spatial variability of the surf zone, inaccuracies in data are unavoidable such that useful predictions require corresponding estimates of uncertainty. We demonstrate how a Bayesian-network model can be used to provide accurate predictions of wave-height evolution in the surf zone given very sparse and/or inaccurate boundary-condition data. The approach is based on a formal treatment of a data-assimilation problem that takes advantage of significant reduction of the dimensionality of the model system. We demonstrate that predictions of a detailed geophysical model of the wave evolution are reproduced accurately using a Bayesian approach. In this surf-zone application, forward prediction skill was 83%, and uncertainties in the model inputs were accurately transferred to uncertainty in output variables. We also demonstrate that if modeling uncertainties were not conveyed to the Bayesian network (i.e., perfect data or model were assumed), then overly optimistic prediction uncertainties were computed. More consistent predictions and uncertainties were obtained by including model-parameter errors as a source of input uncertainty. Improved predictions (skill of 90%) were achieved because the Bayesian network simultaneously estimated optimal parameters while predicting wave heights.

  14. Spatial Stochastic Point Models for Reservoir Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Syversveen, Anne Randi

    1997-12-31

    The main part of this thesis discusses stochastic modelling of geology in petroleum reservoirs. A marked point model is defined for objects against a background in a two-dimensional vertical cross section of the reservoir. The model handles conditioning on observations from more than one well for each object and contains interaction between objects, and the objects have the correct length distribution when penetrated by wells. The model is developed in a Bayesian setting. The model and the simulation algorithm are demonstrated by means of an example with simulated data. The thesis also deals with object recognition in image analysis, in a Bayesian framework, and with a special type of spatial Cox processes called log-Gaussian Cox processes. In these processes, the logarithm of the intensity function is a Gaussian process. The class of log-Gaussian Cox processes provides flexible models for clustering. The distribution of such a process is completely characterized by the intensity and the pair correlation function of the Cox process. 170 refs., 37 figs., 5 tabs.
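
    For the log-Gaussian Cox processes mentioned above, the defining construction is simple: sample a Gaussian process for the log-intensity, exponentiate it, and generate Poisson counts. A minimal one-dimensional sketch with assumed covariance parameters:

      import numpy as np

      rng = np.random.default_rng(42)

      # Grid and an exponential covariance for the Gaussian process (illustrative values).
      x = np.linspace(0.0, 10.0, 200)
      cell = x[1] - x[0]
      cov = 1.0 * np.exp(-np.abs(x[:, None] - x[None, :]) / 2.0)

      # Log-intensity is a Gaussian process; the intensity is its exponential.
      log_lam = rng.multivariate_normal(np.full(x.size, np.log(5.0)), cov + 1e-8 * np.eye(x.size))
      intensity = np.exp(log_lam)

      # Poisson counts per grid cell approximate one realization of the Cox process.
      counts = rng.poisson(intensity * cell)
      print(counts.sum(), "points simulated")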

  15. Bayesian hierarchical modelling of weak lensing - the golden goal

    CERN Document Server

    Heavens, Alan; Jaffe, Andrew; Hoffmann, Till; Kiessling, Alina; Wandelt, Benjamin

    2016-01-01

    To accomplish correct Bayesian inference from weak lensing shear data requires a complete statistical description of the data. The natural framework to do this is a Bayesian Hierarchical Model, which divides the chain of reasoning into component steps. Starting with a catalogue of shear estimates in tomographic bins, we build a model that allows us to sample simultaneously from the underlying tomographic shear fields and the relevant power spectra (E-mode, B-mode, and E-B, for auto- and cross-power spectra). The procedure deals easily with masked data and intrinsic alignments. Using Gibbs sampling and messenger fields, we show with simulated data that the large (over 67000-dimensional) parameter space can be efficiently sampled and the full joint posterior probability density function for the parameters can feasibly be obtained. The method correctly recovers the underlying shear fields and all of the power spectra, including at levels well below the shot noise.

  16. A localization model to localize multiple sources using Bayesian inference

    Science.gov (United States)

    Dunham, Joshua Rolv

    Accurate localization of a sound source in a room setting is important in both psychoacoustics and architectural acoustics. Binaural models have been proposed to explain how the brain processes and utilizes the interaural time differences (ITDs) and interaural level differences (ILDs) of sound waves arriving at the ears of a listener in determining source location. Recent work shows that applying Bayesian methods to this problem is proving fruitful. In this thesis, pink noise samples are convolved with head-related transfer functions (HRTFs) and compared to combinations of one and two anechoic speech signals convolved with different HRTFs or binaural room impulse responses (BRIRs) to simulate room positions. Through exhaustive calculation of Bayesian posterior probabilities and using a maximum-likelihood approach, model selection will determine the number of sources present, and parameter estimation will yield the azimuthal direction of the source(s).

  17. Bayesian Inference and Forecasting in the Stationary Bilinear Model

    OpenAIRE

    Roberto Leon-Gonzalez; Fuyu Yang

    2014-01-01

    A stationary bilinear (SB) model can be used to describe processes with a time-varying degree of persistence that depends on past shocks. An example of such a process is inflation. This study develops methods for Bayesian inference, model comparison, and forecasting in the SB model. Using monthly U.K. inflation data, we find that the SB model outperforms the random walk and first order autoregressive AR(1) models in terms of root mean squared forecast errors for both the one-step-ahead and th...

  18. Bayesian Age-Period-Cohort Modeling and Prediction - BAMP

    Directory of Open Access Journals (Sweden)

    Volker J. Schmid

    2007-10-01

    Full Text Available The software package BAMP provides a method of analyzing incidence or mortality data on the Lexis diagram, using a Bayesian version of an age-period-cohort model. A hierarchical model is assumed, with a binomial model in the first stage. Random walks of first and second order, with and without an additional unstructured component, are available as smoothing priors for the age, period and cohort parameters. Unstructured heterogeneity can also be included in the model. In order to evaluate the model fit, the posterior deviance, DIC and predictive deviances are computed. By projecting the random walk prior into the future, future death rates can be predicted.
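
    The random-walk smoothing priors and the forward projection mentioned above can be made concrete in a few lines. This is an illustrative sketch of a second-order random walk (RW2) prior, not BAMP code: its structure (precision) matrix is built from second differences, and its conditional mean extrapolates linearly, which is what drives the prediction of future rates.

      import numpy as np

      def rw2_precision(n):
          # RW2 structure matrix K = D'D, with D the second-difference operator;
          # the smoothing prior is x ~ N(0, (tau * K)^-1) (improper).
          D = np.zeros((n - 2, n))
          for i in range(n - 2):
              D[i, i:i + 3] = [1.0, -2.0, 1.0]
          return D.T @ D

      def rw2_forecast(x, horizon):
          # Projecting the RW2 prior forward: E[x_{t+1} | past] = 2 * x_t - x_{t-1}.
          x = list(x)
          for _ in range(horizon):
              x.append(2 * x[-1] - x[-2])
          return np.array(x)

      print(rw2_precision(5))
      print(rw2_forecast([0.10, 0.15, 0.22], horizon=3))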

  19. Introduction to Hierarchical Bayesian Modeling for Ecological Data

    CERN Document Server

    Parent, Eric

    2012-01-01

    Making statistical modeling and inference more accessible to ecologists and related scientists, Introduction to Hierarchical Bayesian Modeling for Ecological Data gives readers a flexible and effective framework to learn about complex ecological processes from various sources of data. It also helps readers get started on building their own statistical models. The text begins with simple models that progressively become more complex and realistic through explanatory covariates and intermediate hidden states variables. When fitting the models to data, the authors gradually present the concepts a

  20. Bayesian calibration of simulation models for supporting management of the elimination of the macroparasitic disease, Lymphatic Filariasis

    OpenAIRE

    Singh, Brajendra K.; Michael, Edwin

    2015-01-01

    Background Mathematical models of parasite transmission can help integrate a large body of information into a consistent framework, which can then be used for gaining mechanistic insights and making predictions. However, uncertainty, spatial variability and complexity, can hamper the use of such models for decision making in parasite management programs. Methods We have adapted a Bayesian melding framework for calibrating simulation models to address the need for robust modelling tools that c...

  1. Bayesian analysis of recursive SVAR models with overidentifying restrictions

    OpenAIRE

    Kociecki, Andrzej; Rubaszek, Michał; Ca' Zorzi, Michele

    2012-01-01

    The paper provides a novel Bayesian methodological framework to estimate structural VAR (SVAR) models with recursive identification schemes that allows for the inclusion of over-identifying restrictions. The proposed framework enables the researcher to (i) elicit the prior on the non-zero contemporaneous relations between economic variables and to (ii) derive an analytical expression for the posterior distribution and marginal data density. We illustrate our methodological framework by estima...

  2. Differential gene co-expression networks via Bayesian biclustering models

    OpenAIRE

    Gao, Chuan; Zhao, Shiwen; McDowell, Ian C.; Brown, Christopher D.; Barbara E Engelhardt

    2014-01-01

    Identifying latent structure in large data matrices is essential for exploring biological processes. Here, we consider recovering gene co-expression networks from gene expression data, where each network encodes relationships between genes that are locally co-regulated by shared biological mechanisms. To do this, we develop a Bayesian statistical model for biclustering to infer subsets of co-regulated genes whose covariation may be observed in only a subset of the samples. Our biclustering me...

  3. Bayesian parsimonious covariance estimation for hierarchical linear mixed models

    OpenAIRE

    Frühwirth-Schnatter, Sylvia; Tüchler, Regina

    2004-01-01

    We considered a non-centered parameterization of the standard random-effects model, which is based on the Cholesky decomposition of the variance-covariance matrix. The regression type structure of the non-centered parameterization allows to choose a simple, conditionally conjugate normal prior on the Cholesky factor. Based on the non-centered parameterization, we search for a parsimonious variance-covariance matrix by identifying the non-zero elements of the Cholesky factors using Bayesian va...

  4. Diffusion Estimation Of State-Space Models: Bayesian Formulation

    Czech Academy of Sciences Publication Activity Database

    Dedecius, Kamil

    Reims: IEEE, 2014. ISBN 978-1-4799-3693-9. [The 24th IEEE International Workshop on Machine Learning for Signal Processing (MLSP2014). Reims (FR), 21.09.2014-24.09.2014] R&D Projects: GA ČR(CZ) GP14-06678P Keywords : distributed estimation * state-space models * Bayesian estimation Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2014/AS/dedecius-0431804.pdf

  5. Bayesian Methods for Neural Networks and Related Models

    OpenAIRE

    Titterington, D.M.

    2004-01-01

    Models such as feed-forward neural networks and certain other structures investigated in the computer science literature are not amenable to closed-form Bayesian analysis. The paper reviews the various approaches taken to overcome this difficulty, involving the use of Gaussian approximations, Markov chain Monte Carlo simulation routines and a class of non-Gaussian but “deterministic” approximations called variational approximations.

  6. Bayesian network models in brain functional connectivity analysis

    OpenAIRE

    Ide, Jaime S.; Zhang, Sheng; Chiang-shan R. Li

    2013-01-01

    Much effort has been made to better understand the complex integration of distinct parts of the human brain using functional magnetic resonance imaging (fMRI). Altered functional connectivity between brain regions is associated with many neurological and mental illnesses, such as Alzheimer and Parkinson diseases, addiction, and depression. In computational science, Bayesian networks (BN) have been used in a broad range of studies to model complex data set in the presence of uncertainty and wh...

  7. Bayesian Models of Learning and Reasoning with Relations

    OpenAIRE

    Chen, Dawn

    2014-01-01

    How do humans acquire relational concepts such as larger, which are essential for analogical inference and other forms of high-level reasoning? Are they necessarily innate, or can they be learned from non-relational inputs? Using comparative relations as a model domain, we show that structured relations can be learned from unstructured inputs of realistic complexity, applying bottom-up Bayesian learning mechanisms that make minimal assumptions about innate representations. First, we introduce...

  8. AIC, BIC, Bayesian evidence against the interacting dark energy model

    Energy Technology Data Exchange (ETDEWEB)

    Szydlowski, Marek [Jagiellonian University, Astronomical Observatory, Krakow (Poland); Jagiellonian University, Mark Kac Complex Systems Research Centre, Krakow (Poland); Krawiec, Adam [Jagiellonian University, Institute of Economics, Finance and Management, Krakow (Poland); Jagiellonian University, Mark Kac Complex Systems Research Centre, Krakow (Poland); Kurek, Aleksandra [Jagiellonian University, Astronomical Observatory, Krakow (Poland); Kamionka, Michal [University of Wroclaw, Astronomical Institute, Wroclaw (Poland)

    2015-01-01

    Recent astronomical observations have indicated that the Universe is in a phase of accelerated expansion. While there are many cosmological models which try to explain this phenomenon, we focus on the interacting ΛCDM model where an interaction between the dark energy and dark matter sectors takes place. This model is compared to its simpler alternative - the ΛCDM model. To choose between these models the likelihood ratio test was applied as well as the model comparison methods (employing Occam's principle): the Akaike information criterion (AIC), the Bayesian information criterion (BIC) and the Bayesian evidence. Using the current astronomical data: type Ia supernova (Union2.1), h(z), baryon acoustic oscillation, the Alcock-Paczynski test, and the cosmic microwave background data, we evaluated both models. The analyses based on the AIC indicated that there is less support for the interacting ΛCDM model when compared to the ΛCDM model, while those based on the BIC indicated that there is strong evidence against it in favor of the ΛCDM model. Given the weak or almost non-existing support for the interacting ΛCDM model and bearing in mind Occam's razor we are inclined to reject this model. (orig.)

  9. AIC, BIC, Bayesian evidence against the interacting dark energy model

    Energy Technology Data Exchange (ETDEWEB)

    Szydłowski, Marek, E-mail: marek.szydlowski@uj.edu.pl [Astronomical Observatory, Jagiellonian University, Orla 171, 30-244, Kraków (Poland); Mark Kac Complex Systems Research Centre, Jagiellonian University, Reymonta 4, 30-059, Kraków (Poland); Krawiec, Adam, E-mail: adam.krawiec@uj.edu.pl [Institute of Economics, Finance and Management, Jagiellonian University, Łojasiewicza 4, 30-348, Kraków (Poland); Mark Kac Complex Systems Research Centre, Jagiellonian University, Reymonta 4, 30-059, Kraków (Poland); Kurek, Aleksandra, E-mail: alex@oa.uj.edu.pl [Astronomical Observatory, Jagiellonian University, Orla 171, 30-244, Kraków (Poland); Kamionka, Michał, E-mail: kamionka@astro.uni.wroc.pl [Astronomical Institute, University of Wrocław, ul. Kopernika 11, 51-622, Wrocław (Poland)

    2015-01-14

    Recent astronomical observations have indicated that the Universe is in a phase of accelerated expansion. While there are many cosmological models which try to explain this phenomenon, we focus on the interacting ΛCDM model where an interaction between the dark energy and dark matter sectors takes place. This model is compared to its simpler alternative—the ΛCDM model. To choose between these models the likelihood ratio test was applied as well as the model comparison methods (employing Occam’s principle): the Akaike information criterion (AIC), the Bayesian information criterion (BIC) and the Bayesian evidence. Using the current astronomical data: type Ia supernova (Union2.1), h(z), baryon acoustic oscillation, the Alcock–Paczynski test, and the cosmic microwave background data, we evaluated both models. The analyses based on the AIC indicated that there is less support for the interacting ΛCDM model when compared to the ΛCDM model, while those based on the BIC indicated that there is strong evidence against it in favor of the ΛCDM model. Given the weak or almost non-existing support for the interacting ΛCDM model and bearing in mind Occam’s razor we are inclined to reject this model.

  10. AIC, BIC, Bayesian evidence against the interacting dark energy model

    International Nuclear Information System (INIS)

    Recent astronomical observations have indicated that the Universe is in a phase of accelerated expansion. While there are many cosmological models which try to explain this phenomenon, we focus on the interacting ΛCDM model where an interaction between the dark energy and dark matter sectors takes place. This model is compared to its simpler alternative—the ΛCDM model. To choose between these models the likelihood ratio test was applied as well as the model comparison methods (employing Occam’s principle): the Akaike information criterion (AIC), the Bayesian information criterion (BIC) and the Bayesian evidence. Using the current astronomical data: type Ia supernova (Union2.1), h(z), baryon acoustic oscillation, the Alcock–Paczynski test, and the cosmic microwave background data, we evaluated both models. The analyses based on the AIC indicated that there is less support for the interacting ΛCDM model when compared to the ΛCDM model, while those based on the BIC indicated that there is strong evidence against it in favor of the ΛCDM model. Given the weak or almost non-existing support for the interacting ΛCDM model and bearing in mind Occam’s razor we are inclined to reject this model

  11. Dissecting Magnetar Variability with Bayesian Hierarchical Models

    Science.gov (United States)

    Huppenkothen, Daniela; Brewer, Brendon J.; Hogg, David W.; Murray, Iain; Frean, Marcus; Elenbaas, Chris; Watts, Anna L.; Levin, Yuri; van der Horst, Alexander J.; Kouveliotou, Chryssa

    2015-09-01

    Neutron stars are a prime laboratory for testing physical processes under conditions of strong gravity, high density, and extreme magnetic fields. Among the zoo of neutron star phenomena, magnetars stand out for their bursting behavior, ranging from extremely bright, rare giant flares to numerous, less energetic recurrent bursts. The exact trigger and emission mechanisms for these bursts are not known; favored models involve either a crust fracture and subsequent energy release into the magnetosphere, or explosive reconnection of magnetic field lines. In the absence of a predictive model, understanding the physical processes responsible for magnetar burst variability is difficult. Here, we develop an empirical model that decomposes magnetar bursts into a superposition of small spike-like features with a simple functional form, where the number of model components is itself part of the inference problem. The cascades of spikes that we model might be formed by avalanches of reconnection, or crust rupture aftershocks. Using Markov Chain Monte Carlo sampling augmented with reversible jumps between models with different numbers of parameters, we characterize the posterior distributions of the model parameters and the number of components per burst. We relate these model parameters to physical quantities in the system, and show for the first time that the variability within a burst does not conform to predictions from ideas of self-organized criticality. We also examine how well the properties of the spikes fit the predictions of simplified cascade models for the different trigger mechanisms.

  12. Dissecting magnetar variability with Bayesian hierarchical models

    CERN Document Server

    Huppenkothen, D; Hogg, D W; Murray, I; Frean, M; Elenbaas, C; Watts, A L; Levin, Y; van der Horst, A J; Kouveliotou, C

    2015-01-01

    Neutron stars are a prime laboratory for testing physical processes under conditions of strong gravity, high density, and extreme magnetic fields. Among the zoo of neutron star phenomena, magnetars stand out for their bursting behaviour, ranging from extremely bright, rare giant flares to numerous, less energetic recurrent bursts. The exact trigger and emission mechanisms for these bursts are not known; favoured models involve either a crust fracture and subsequent energy release into the magnetosphere, or explosive reconnection of magnetic field lines. In the absence of a predictive model, understanding the physical processes responsible for magnetar burst variability is difficult. Here, we develop an empirical model that decomposes magnetar bursts into a superposition of small spike-like features with a simple functional form, where the number of model components is itself part of the inference problem. The cascades of spikes that we model might be formed by avalanches of reconnection, or crust rupture afte...

  13. Spatial Double Generalized Beta Regression Models: Extensions and Application to Study Quality of Education in Colombia

    Science.gov (United States)

    Cepeda-Cuervo, Edilberto; Núñez-Antón, Vicente

    2013-01-01

    In this article, a proposed Bayesian extension of the generalized beta spatial regression models is applied to the analysis of the quality of education in Colombia. We briefly revise the beta distribution and describe the joint modeling approach for the mean and dispersion parameters in the spatial regression models' setting. Finally, we…

  14. Dynamic model based on Bayesian method for energy security assessment

    International Nuclear Information System (INIS)

    Highlights: • Methodology for dynamic indicator model construction and forecasting of indicators. • Application of the dynamic indicator model to energy system development scenarios. • Expert judgement involvement using the Bayesian method. - Abstract: The methodology for constructing a dynamic indicator model and forecasting indicators for the assessment of the energy security level is presented in this article. An indicator is a special index that provides numerical values for factors important to the investigated area. In real life, models of different processes take into account various factors that are time-dependent and dependent on each other, so it is advisable to construct a dynamic model to describe these dependences. The energy security indicators are used as factors in the dynamic model, and their values are usually obtained from statistical data. The developed dynamic model enables forecasting of the indicators' variation while taking into account changes in the system configuration. Energy system development is usually based on the construction of a new object. Since the parameters of the changes introduced by the new system are not exactly known, information about their influence on the indicators cannot be incorporated into the model by deterministic methods. Thus, the dynamic indicator model based on historical data is adjusted by a probabilistic model of the influence of new factors on the indicators, using the Bayesian method.

  15. A Bayesian Network View on Nested Effects Models

    Directory of Open Access Journals (Sweden)

    Fröhlich Holger

    2009-01-01

    Full Text Available Nested effects models (NEMs) are a class of probabilistic models that were designed to reconstruct a hidden signalling structure from a large set of observable effects caused by active interventions into the signalling pathway. We give a more flexible formulation of NEMs in the language of Bayesian networks. Our framework constitutes a natural generalization of the original NEM model, since it explicitly states the assumptions that are tacitly underlying the original version. Our approach gives rise to new learning methods for NEMs, which have been implemented in the R/Bioconductor package nem. We validate these methods in a simulation study and apply them to a synthetic lethality dataset in yeast.

  16. Probe Error Modeling Research Based on Bayesian Network

    Institute of Scientific and Technical Information of China (English)

    Wu Huaiqiang; Xing Zilong; Zhang Jian; Yan Yan

    2015-01-01

    Probe calibration is carried out under specific conditions; most of the error caused by changes in the speed parameter has not been corrected. In order to reduce the influence of measuring error on measurement accuracy, this article analyzes the relationship between the speed parameter and probe error, and uses a Bayesian network to establish a model of the probe error. The model takes account of prior knowledge and sample data; as the data are updated, it can reflect changes in the probe errors and continually revise the modeling results.

  17. An Efficient Bayesian Nearest Neighbor Search Using Marginal Object Weight Ranking Scheme in Spatial Databases

    Directory of Open Access Journals (Sweden)

    K. Duraiswamy

    2012-01-01

    Full Text Available Problem statement: A database that is optimized to store and query data related to objects in space, including points, lines and polygons, is called a spatial database. Identifying the nearest neighboring object is a vital part of spatial database search. Many nearest neighbor (NN) search techniques, such as Authenticated Multi-step NN (AMNN), Superseding Nearest Neighbor (SNN) search and Bayesian Nearest Neighbor (BNN), are available, but they have difficulties when performing NN search in uncertain spatial databases. AMNN does not process queries from distributed servers and accesses queries only from a single server. SNN cannot use high-dimensional data structures in NN search and accesses only low-dimensional data. Approach: Previous work described the NN process using SNN with Marginal Object Weight (MOW) ranking. The downside of that work is its poorer performance compared to another approach that performed NN search using BNN. To improve NN search in spatial databases using BNN, we present a new technique: BNN search using MOW ranking. Based on events occurring in the nearest object, BNN starts its search using MOW. The MOW is obtained by computing the weight of each NN object and ranking each object based on its frequency and the distance of the NN object, for an efficient NN search in spatial databases. Results: MOW is applied to all nearest neighbor objects identified using BNN for any relevant query point, and queries from distributed servers are processed using MOW. Conclusion: The proposed BNN-with-MOW framework is evaluated with real data sets and shows performance improvement over the previous SNN-with-MOW approach in terms of execution time, memory consumption and query result accuracy.

  18. Bayesian inference and model comparison for metallic fatigue data

    KAUST Repository

    Babuška, Ivo

    2016-02-23

    In this work, we present a statistical treatment of stress-life (S-N) data drawn from a collection of records of fatigue experiments that were performed on 75S-T6 aluminum alloys. Our main objective is to predict the fatigue life of materials by providing a systematic approach to model calibration, model selection and model ranking with reference to S-N data. To this purpose, we consider fatigue-limit models and random fatigue-limit models that are specially designed to allow the treatment of the run-outs (right-censored data). We first fit the models to the data by maximum likelihood methods and estimate the quantiles of the life distribution of the alloy specimen. To assess the robustness of the estimation of the quantile functions, we obtain bootstrap confidence bands by stratified resampling with respect to the cycle ratio. We then compare and rank the models by classical measures of fit based on information criteria. We also consider a Bayesian approach that provides, under the prior distribution of the model parameters selected by the user, their simulation-based posterior distributions. We implement and apply Bayesian model comparison methods, such as Bayes factor ranking and predictive information criteria based on cross-validation techniques under various a priori scenarios.

  19. Bayesian inference and model comparison for metallic fatigue data

    Science.gov (United States)

    Babuška, Ivo; Sawlan, Zaid; Scavino, Marco; Szabó, Barna; Tempone, Raúl

    2016-06-01

    In this work, we present a statistical treatment of stress-life (S-N) data drawn from a collection of records of fatigue experiments that were performed on 75S-T6 aluminum alloys. Our main objective is to predict the fatigue life of materials by providing a systematic approach to model calibration, model selection and model ranking with reference to S-N data. To this purpose, we consider fatigue-limit models and random fatigue-limit models that are specially designed to allow the treatment of the run-outs (right-censored data). We first fit the models to the data by maximum likelihood methods and estimate the quantiles of the life distribution of the alloy specimen. To assess the robustness of the estimation of the quantile functions, we obtain bootstrap confidence bands by stratified resampling with respect to the cycle ratio. We then compare and rank the models by classical measures of fit based on information criteria. We also consider a Bayesian approach that provides, under the prior distribution of the model parameters selected by the user, their simulation-based posterior distributions. We implement and apply Bayesian model comparison methods, such as Bayes factor ranking and predictive information criteria based on cross-validation techniques under various a priori scenarios.

  20. A Bayesian Model for Discovering Typological Implications

    CERN Document Server

    Daumé, Hal

    2009-01-01

    A standard form of analysis for linguistic typology is the universal implication. These implications state facts about the range of extant languages, such as "if objects come after verbs, then adjectives come after nouns." Such implications are typically discovered by painstaking hand analysis over a small sample of languages. We propose a computational model for assisting at this process. Our model is able to discover both well-known implications as well as some novel implications that deserve further study. Moreover, through a careful application of hierarchical analysis, we are able to cope with the well-known sampling problem: languages are not independent.

  1. A Latent Variable Bayesian Approach to Spatial Clustering with Background Noise

    NARCIS (Netherlands)

    Kayabol, K.

    2011-01-01

    We propose a finite mixture model for clustering of the spatial data patterns. The model is based on the spatial distances between the data locations in such a way that both the distances of the points to the cluster centers and the distances of a given point to its neighbors within a defined window

  2. DPpackage: Bayesian Semi- and Nonparametric Modeling in R

    Directory of Open Access Journals (Sweden)

    Alejandro Jara

    2011-04-01

    Full Text Available Data analysis sometimes requires the relaxation of parametric assumptions in order to gain modeling flexibility and robustness against mis-specification of the probability model. In the Bayesian context, this is accomplished by placing a prior distribution on a function space, such as the space of all probability distributions or the space of all regression functions. Unfortunately, posterior distributions ranging over function spaces are highly complex and hence sampling methods play a key role. This paper provides an introduction to a simple, yet comprehensive, set of programs for the implementation of some Bayesian nonparametric and semiparametric models in R, DPpackage. Currently, DPpackage includes models for marginal and conditional density estimation, receiver operating characteristic curve analysis, interval-censored data, binary regression data, item response data, longitudinal and clustered data using generalized linear mixed models, and regression data using generalized additive models. The package also contains functions to compute pseudo-Bayes factors for model comparison and for eliciting the precision parameter of the Dirichlet process prior, and a general purpose Metropolis sampling algorithm. To maximize computational efficiency, the actual sampling for each model is carried out using compiled C, C++ or Fortran code.

  3. KNET: Integrating Hypermedia and Bayesian Modeling

    OpenAIRE

    Chavez, R. Martin; Cooper, Gregory F.

    2013-01-01

    KNET is a general-purpose shell for constructing expert systems based on belief networks and decision networks. Such networks serve as graphical representations for decision models, in which the knowledge engineer must define clearly the alternatives, states, preferences, and relationships that constitute a decision basis. KNET contains a knowledge-engineering core written in Object Pascal and an interface that tightly integrates HyperCard, a hypertext authoring tool for the Apple Macintosh c...

  4. Lack of confidence in approximate Bayesian computation model choice.

    Science.gov (United States)

    Robert, Christian P; Cornuet, Jean-Marie; Marin, Jean-Michel; Pillai, Natesh S

    2011-09-13

    Approximate Bayesian computation (ABC) has become an essential tool for the analysis of complex stochastic models. Grelaud et al. [(2009) Bayesian Anal 3:427-442] advocated the use of ABC for model choice in the specific case of Gibbs random fields, relying on an intermodel sufficiency property to show that the approximation was legitimate. We implemented ABC model choice in a wide range of phylogenetic models in the Do It Yourself-ABC (DIY-ABC) software [Cornuet et al. (2008) Bioinformatics 24:2713-2719]. We now present arguments as to why the theoretical arguments for ABC model choice are missing, because the algorithm involves an unknown loss of information induced by the use of insufficient summary statistics. The approximation error of the posterior probabilities of the models under comparison may thus be unrelated to the computational effort spent in running an ABC algorithm. We then conclude that additional empirical verifications of the performances of the ABC procedure, such as those available in DIY-ABC, are necessary to conduct model choice. PMID:21876135
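
    The basic ABC model-choice recipe that the record critiques works as follows: simulate data from each model's prior predictive, keep draws whose summary statistic falls close to the observed one, and read off posterior model probabilities from the acceptance proportions. The sketch below is an illustrative toy (two invented models, a single summary statistic); as the abstract warns, such proportions can be unreliable when the summary statistic is insufficient.

      import numpy as np

      rng = np.random.default_rng(3)
      obs = rng.normal(0.5, 1.0, size=100)
      s_obs = obs.mean()                       # a (possibly insufficient) summary statistic

      def simulate(model):
          # Model 0: N(theta, 1); Model 1: Laplace(theta, 1); theta ~ N(0, 1) under both.
          theta = rng.normal(0.0, 1.0)
          if model == 0:
              return rng.normal(theta, 1.0, size=100)
          return rng.laplace(theta, 1.0, size=100)

      eps, n_sim = 0.05, 20000
      accepted = {0: 0, 1: 0}
      for _ in range(n_sim):
          m = int(rng.integers(0, 2))          # uniform prior over the two models
          if abs(simulate(m).mean() - s_obs) < eps:
              accepted[m] += 1

      total = sum(accepted.values())
      print({m: c / total for m, c in accepted.items()})   # approximate posterior model probabilities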

  5. Bayesian analysis of physiologically based toxicokinetic and toxicodynamic models.

    Science.gov (United States)

    Hack, C Eric

    2006-04-17

    Physiologically based toxicokinetic (PBTK) and toxicodynamic (TD) models of bromate in animals and humans would improve our ability to accurately estimate the toxic doses in humans based on available animal studies. These mathematical models are often highly parameterized and must be calibrated in order for the model predictions of internal dose to adequately fit the experimentally measured doses. Highly parameterized models are difficult to calibrate, and it is difficult to obtain accurate estimates of uncertainty or variability in model parameters with commonly used frequentist calibration methods, such as maximum likelihood estimation (MLE) or least-squares approaches. The Bayesian approach called Markov chain Monte Carlo (MCMC) analysis can be used to successfully calibrate these complex models. Prior knowledge about the biological system and associated model parameters is easily incorporated in this approach in the form of prior parameter distributions, and the distributions are refined or updated using experimental data to generate posterior distributions of parameter estimates. The goal of this paper is to give the non-mathematician a brief description of the Bayesian approach and Markov chain Monte Carlo analysis, how this technique is used in risk assessment, and the issues associated with this approach. PMID:16466842
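
    To make the prior-to-posterior calibration step concrete, the sketch below calibrates a toy one-compartment elimination model (not a bromate PBTK model) by random-walk Metropolis sampling, combining a lognormal prior on the elimination rate with a lognormal error model; all values are assumptions chosen for illustration.

      import numpy as np

      def concentration(t, dose, k):
          # One-compartment first-order elimination: C(t) = dose * exp(-k * t)
          return dose * np.exp(-k * t)

      rng = np.random.default_rng(7)
      t_obs = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
      c_obs = concentration(t_obs, dose=10.0, k=0.35) * np.exp(rng.normal(0, 0.1, t_obs.size))

      def log_posterior(log_k):
          # Lognormal prior on k (prior biological knowledge) plus a lognormal error model.
          log_prior = -0.5 * ((log_k - np.log(0.3)) / 0.5) ** 2
          resid = np.log(c_obs) - np.log(concentration(t_obs, 10.0, np.exp(log_k)))
          return log_prior - 0.5 * np.sum((resid / 0.1) ** 2)

      chain, log_k = [], np.log(0.3)
      lp = log_posterior(log_k)
      for _ in range(5000):
          prop = log_k + 0.1 * rng.standard_normal()
          lp_prop = log_posterior(prop)
          if np.log(rng.uniform()) < lp_prop - lp:
              log_k, lp = prop, lp_prop
          chain.append(np.exp(log_k))
      print(np.percentile(chain[1000:], [2.5, 50, 97.5]))   # posterior summary for the rate k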

  6. A study of finite mixture model: Bayesian approach on financial time series data

    Science.gov (United States)

    Phoong, Seuk-Yen; Ismail, Mohd Tahir

    2014-07-01

    Recently, statisticians have emphasized fitting finite mixture models using Bayesian methods. A finite mixture model is a mixture of distributions for modeling a statistical distribution, while the Bayesian method is a statistical approach used to fit the mixture model. The Bayesian method is widely used because it has asymptotic properties that provide remarkable results. In addition, the Bayesian method also shows a consistency characteristic, which means the parameter estimates are close to the predictive distributions. In the present paper, the number of components for the mixture model is studied using the Bayesian Information Criterion. Identifying the number of components is important because a wrong choice may lead to an invalid result. The Bayesian method is then utilized to fit the k-component mixture model in order to explore the relationship between rubber prices and stock market prices for Malaysia, Thailand, the Philippines and Indonesia. Lastly, the results show that there is a negative relationship between rubber prices and stock market prices for all selected countries.
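
    The component-selection step described above can be illustrated in a few lines: fit mixture models with increasing numbers of components and pick the one with the lowest Bayesian Information Criterion. In the sketch below an EM fit from scikit-learn stands in for the paper's Bayesian fit, and the two-component synthetic data stand in for the rubber-price/stock-price series.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(0)
      # Synthetic data standing in for paired (rubber price, stock index) observations.
      X = np.vstack([rng.normal([0.0, 0.0], 0.5, size=(200, 2)),
                     rng.normal([2.0, -1.0], 0.7, size=(200, 2))])

      bics = {}
      for k in range(1, 6):
          gm = GaussianMixture(n_components=k, random_state=0).fit(X)
          bics[k] = gm.bic(X)                 # lower BIC indicates a better trade-off

      best_k = min(bics, key=bics.get)
      print(bics, "-> selected number of components:", best_k)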

  7. Macroeconomic Forecasts in Models with Bayesian Averaging of Classical Estimates

    Directory of Open Access Journals (Sweden)

    Piotr Białowolski

    2012-03-01

    Full Text Available The aim of this paper is to construct a forecasting model oriented on predicting basic macroeconomic variables, namely: the GDP growth rate, the unemployment rate, and the consumer price inflation. In order to select the set of the best regressors, Bayesian Averaging of Classical Estimators (BACE) is employed. The models are atheoretical (i.e., they do not reflect causal relationships postulated by macroeconomic theory) and the role of regressors is played by business and consumer tendency survey-based indicators. Additionally, survey-based indicators are included with a lag that enables forecasting the variables of interest (GDP, unemployment, and inflation) for the four forthcoming quarters without the need to make any additional assumptions concerning the values of predictor variables in the forecast period. Bayesian Averaging of Classical Estimators is a method allowing for a full and controlled overview of all econometric models which can be obtained out of a particular set of regressors. In this paper the authors describe the method of generating a family of econometric models and the procedure for selecting a final forecasting model. Verification of the procedure is performed by means of out-of-sample forecasts of main economic variables for the quarters of 2011. The accuracy of the forecasts implies that there is still a need to search for new solutions in atheoretical modelling.
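
    The averaging step behind BACE can be sketched briefly: estimate OLS on every regressor subset (up to some size) and weight each model by exp(-BIC/2), a standard approximation to its posterior model probability. The code below is a generic illustration with synthetic data, not the paper's indicator set.

      import numpy as np
      from itertools import combinations

      def bace_weights(X, y, max_size=3):
          # OLS on every regressor subset up to max_size, weighted by exp(-BIC/2).
          n, p = X.shape
          results = []
          for k in range(1, max_size + 1):
              for subset in combinations(range(p), k):
                  Z = np.column_stack([np.ones(n), X[:, subset]])
                  beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
                  rss = np.sum((y - Z @ beta) ** 2)
                  bic = n * np.log(rss / n) + Z.shape[1] * np.log(n)
                  results.append((subset, bic))
          bics = np.array([b for _, b in results])
          w = np.exp(-0.5 * (bics - bics.min()))
          w /= w.sum()
          return [(s, wi) for (s, _), wi in zip(results, w)]

      # Toy survey-indicator regressors predicting a macro variable (all data synthetic).
      rng = np.random.default_rng(2)
      X = rng.standard_normal((80, 5))
      y = 0.8 * X[:, 0] - 0.5 * X[:, 3] + 0.3 * rng.standard_normal(80)
      print(sorted(bace_weights(X, y), key=lambda t: -t[1])[:3])   # highest-weight subsets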

  8. A Bayesian hierarchical approach to model seasonal algal variability along an upstream to downstream river gradient

    Science.gov (United States)

    Cha, YoonKyung; Soon Park, Seok; Won Lee, Hye; Stow, Craig A.

    2016-01-01

    Modeling to accurately predict river phytoplankton distribution and abundance is important in water quality and resource management. Nevertheless, the complex nature of eutrophication processes in highly connected river systems makes the task challenging. To model dynamics of river phytoplankton, represented by chlorophyll a (Chl a) concentration, we propose a Bayesian hierarchical model that explicitly accommodates seasonality and upstream-downstream spatial gradient in the structure. The utility of our model is demonstrated with an application to the Nakdong River (South Korea), which is a eutrophic, intensively regulated river, but functions as an irreplaceable water source for more than 13 million people. Chl a is modeled with two manageable factors, river flow, and total phosphorus (TP) concentration. Our model results highlight the importance of taking seasonal and spatial context into account when describing flow regimes and phosphorus delivery in rivers. A contrasting positive Chl a-flow relationship across stations versus negative Chl a-flow slopes that arose when Chl a was modeled on a station-month basis is an illustration of Simpson's paradox, which necessitates modeling Chl a-flow relationships decomposed into seasonal and spatial components. Similar Chl a-TP slopes among stations and months suggest that, with the flow effect removed, positive TP effects on Chl a are uniform regardless of the season and station in the river. Our model prediction successfully captured the shift in the spatial and monthly patterns of Chl a.
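
    The Simpson's paradox noted above (a positive pooled Chl a-flow slope but negative within-station slopes) is easy to reproduce with synthetic numbers; the values below are invented purely to illustrate the effect, not taken from the Nakdong River data.

      import numpy as np

      rng = np.random.default_rng(5)
      flow, chl = [], []
      # Three "stations" whose baseline flow and chlorophyll both increase downstream.
      for base_flow, base_chl in [(10, 5), (30, 15), (60, 30)]:
          f = base_flow + rng.uniform(0, 10, 50)
          flow.append(f)
          chl.append(base_chl - 0.4 * (f - base_flow) + rng.normal(0, 1, 50))  # negative within-station slope

      pooled_slope = np.polyfit(np.concatenate(flow), np.concatenate(chl), 1)[0]
      station_slopes = [np.polyfit(f, c, 1)[0] for f, c in zip(flow, chl)]
      print(pooled_slope, station_slopes)   # positive pooled slope, negative station-level slopes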

  9. Theoretical aspects of spatial-temporal modeling

    CERN Document Server

    Matsui, Tomoko

    2015-01-01

    This book provides a modern introductory tutorial on specialized theoretical aspects of spatial and temporal modeling. The areas covered involve a range of topics which reflect the diversity of this domain of research across a number of quantitative disciplines. For instance, the first chapter provides up-to-date coverage of particle association measures that underpin the theoretical properties of recently developed random set methods in space and time otherwise known as the class of probability hypothesis density framework (PHD filters). The second chapter gives an overview of recent advances in Monte Carlo methods for Bayesian filtering in high-dimensional spaces. In particular, the chapter explains how one may extend classical sequential Monte Carlo methods for filtering and static inference problems to high dimensions and big-data applications. The third chapter presents an overview of generalized families of processes that extend the class of Gaussian process models to heavy-tailed families known as alph...

  10. Evaluating stream health based environmental justice model performance at different spatial scales

    Science.gov (United States)

    Daneshvar, Fariborz; Nejadhashemi, A. Pouyan; Zhang, Zhen; Herman, Matthew R.; Shortridge, Ashton; Marquart-Pyatt, Sandra

    2016-07-01

    This study evaluated the effects of spatial resolution on environmental justice analysis concerning stream health. The Saginaw River Basin in Michigan was selected since it is an area of concern in the Great Lakes basin. Three Bayesian Conditional Autoregressive (CAR) models (ordinary regression, weighted regression and spatial) were developed for each stream health measure based on 17 socioeconomic and physiographical variables at three census levels. For all stream health measures, spatial models had better performance compared to the two non-spatial ones at the census tract and block group levels. Meanwhile no spatial dependency was found at the county level. Multilevel Bayesian CAR models were also developed to understand the spatial dependency at the three levels. Results showed that considering level interactions improved models' prediction. Residual plots also showed that models developed at the block group and census tract (in contrary to county level models) are able to capture spatial variations.
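
    The Conditional Autoregressive (CAR) prior used in the models above encodes spatial dependence through an adjacency structure. A minimal sketch of a proper CAR precision matrix for a handful of areal units, with the adjacency and parameters chosen purely for illustration:

      import numpy as np

      def car_precision(adjacency, tau=1.0, rho=0.9):
          # Proper CAR prior: spatial effects phi ~ N(0, Q^-1) with Q = tau * (D - rho * W),
          # where W is the 0/1 adjacency matrix and D = diag(row sums of W).
          W = np.asarray(adjacency, dtype=float)
          D = np.diag(W.sum(axis=1))
          return tau * (D - rho * W)

      # Four areal units on a line: 1-2, 2-3, 3-4 (illustrative adjacency).
      W = np.array([[0, 1, 0, 0],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [0, 0, 1, 0]])
      Q = car_precision(W)
      print(Q)
      print(np.linalg.inv(Q))   # implied covariance of the spatial random effects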

  11. Modeling operational risks of the nuclear industry with Bayesian networks

    International Nuclear Information System (INIS)

    Basically, planning a new industrial plant requires information on the industrial management, regulations, site selection, definition of initial and planned capacity, and on the estimation of the potential demand. However, this is far from enough to assure the success of an industrial enterprise. Unexpected and extremely damaging events may occur that deviate from the original plan. The so-called operational risks lie not only in system, equipment, process or human (technical or managerial) failures. They also lie in intentional events such as fraud and sabotage, in extreme events like terrorist attacks or radiological accidents, and even in public reaction to perceived impacts on the environment or on future generations. For the nuclear industry, it is a challenge to identify and to assess the operational risks and their various sources. Early identification of operational risks can help in preparing contingency plans, or in delaying the decision to invest in or approve a project that can, at an extreme, affect the public perception of nuclear energy. A major problem in modeling operational risk losses is the lack of internal data that are essential, for example, to apply the loss distribution approach. As an alternative, methods that consider qualitative and subjective information can be applied, for example, fuzzy logic, neural networks, system dynamics or Bayesian networks. An advantage of applying Bayesian networks to model operational risk is the possibility to include expert opinions and variables of interest, to structure the model via causal dependencies among these variables, and to specify subjective prior and conditional probability distributions at each step or network node. This paper suggests a classification of operational risks in industry and discusses the benefits and obstacles of the Bayesian networks approach to model those risks. (author)

  12. An application of GIS and Bayesian network in studying spatial-causal relations between enterprises and environmental factors

    Science.gov (United States)

    Shen, Tiyan; Li, Xi; Li, Maiqing

    2009-10-01

    The paper intends to employ Geographic Information System (GIS) and Bayesian Network techniques to discover the spatial causality between enterprises and environmental factors in the Beijing Metropolis. The census data of Beijing were first spatialized by means of GIS, and the training data were then produced using a density mapping technique. Based on the training data, the structure of a Bayesian Network was learnt with the help of the Maximum Weight Spanning Tree. Eight direct relations were discussed in the end; the most exciting discovery, the "Enterprise-Run Society" as a symbol of the former planned economy, was highlighted in the spatial relations between heavy industry and schools. Though the final result is not especially novel from an economic perspective, it is significant from a technical viewpoint because all discoveries were drawn from the data, underscoring the importance of GIS and data mining to economic geography research.

  13. Spatial Models for Virtual Networks

    Science.gov (United States)

    Janssen, Jeannette

    This paper discusses the use of spatial graph models for the analysis of networks that do not have a direct spatial reality, such as web graphs, on-line social networks, or citation graphs. In a spatial graph model, nodes are embedded in a metric space, and link formation depends on the relative position of nodes in the space. It is argued that spatial models form a good basis for link mining: assuming a spatial model, the link information can be used to infer the spatial position of the nodes, and this information can then be used for clustering and recognition of node similarity. This paper gives a survey of spatial graph models, and discusses their suitability for link mining.

  14. Bayesian generalized additive models for location, scale and shape for zero-inflated and overdispersed count data

    OpenAIRE

    Klein, Nadja; Kneib, Thomas; Lang, Stefan

    2013-01-01

    Frequent problems in applied research that prevent the application of the classical Poisson log-linear model for analyzing count data include overdispersion, an excess of zeros compared to the Poisson distribution, correlated responses, as well as complex predictor structures comprising nonlinear effects of continuous covariates, interactions or spatial effects. We propose a general class of Bayesian generalized additive models for zero-inflated and overdispersed count data within the framewo...

  15. Two-stage Bayesian models-application to ZEDB project

    Energy Technology Data Exchange (ETDEWEB)

    Bunea, C. [George Washington University, School of Applied Science, 1776 G Street, NW, Suite 108, Washington, DC 20052 (United States)]. E-mail: cornel@gwu.edu; Charitos, T. [Institute of Information and Computing Sciences, Padualaan 14, de Uithof, 3508 TB, Utrecht (Netherlands)]. E-mail: theodore@cs.uu.nl; Cooke, R.M. [Delft University of Technology, EWI Faculty, Mekelweg 4, 2628 CD, Delft (Netherlands)]. E-mail: r.m.cooke@ewi.tudelft.n1; Becker, G. [RISA, Krumme Str., Berlin 10627 (Germany)]. E-mail: guenter.becker@risa.de

    2005-12-01

    A well-known mathematical tool to analyze plant specific reliability data for nuclear power facilities is the two-stage Bayesian model. Such two-stage Bayesian models are standard practice nowadays, for example in the German ZEDB project or in the Swedish T-Book, although they may differ in their mathematical models and software implementation. In this paper, we review the mathematical model, its underlying assumptions and supporting arguments. Reasonable conditional assumptions are made to yield a tractable and mathematically valid form for the failure rate at the plant of interest, given failures and operational times at other plants in the population. The posterior distribution of the failure rate at the plant of interest is sensitive to the choice of hyperprior parameters, since the effect of the hyperprior distribution will never be dominated by the effect of the observations. The methods of Poern and Jeffrey for choosing distributions over hyperparameters are discussed. Furthermore, we perform verification tasks associated with the theoretical model presented in this paper. The present software implementation produces good agreement with ZEDB results for various prior distributions. The differences between our results and those of ZEDB reflect discrepancies that may arise from the numerical implementation, such as the use of different step sizes and truncation bounds.

  16. Two-stage Bayesian models-application to ZEDB project

    International Nuclear Information System (INIS)

    A well-known mathematical tool to analyze plant specific reliability data for nuclear power facilities is the two-stage Bayesian model. Such two-stage Bayesian models are standard practice nowadays, for example in the German ZEDB project or in the Swedish T-Book, although they may differ in their mathematical models and software implementation. In this paper, we review the mathematical model, its underlying assumptions and supporting arguments. Reasonable conditional assumptions are made to yield a tractable and mathematically valid form for the failure rate at the plant of interest, given failures and operational times at other plants in the population. The posterior distribution of the failure rate at the plant of interest is sensitive to the choice of hyperprior parameters, since the effect of the hyperprior distribution will never be dominated by the effect of the observations. The methods of Poern and Jeffrey for choosing distributions over hyperparameters are discussed. Furthermore, we perform verification tasks associated with the theoretical model presented in this paper. The present software implementation produces good agreement with ZEDB results for various prior distributions. The differences between our results and those of ZEDB reflect discrepancies that may arise from the numerical implementation, such as the use of different step sizes and truncation bounds.

  17. Quantum-Like Bayesian Networks for Modeling Decision Making.

    Science.gov (United States)

    Moreira, Catarina; Wichert, Andreas

    2016-01-01

    In this work, we explore an alternative quantum structure to perform quantum probabilistic inferences to accommodate the paradoxical findings of the Sure Thing Principle. We propose a Quantum-Like Bayesian Network, which consists of replacing classical probabilities by quantum probability amplitudes. However, since this approach suffers from the problem of exponential growth of quantum parameters, we also propose a similarity heuristic that automatically fits quantum parameters through vector similarities. This makes the proposed model general and predictive, in contrast to the current state-of-the-art models, which cannot be generalized to more complex decision scenarios and only provide an explanatory account of the observed paradoxes. In the end, the model that we propose consists of a nonparametric method for estimating inference effects from a statistical point of view. It is a statistical model that is simpler than the previous quantum dynamic and quantum-like models proposed in the literature. We tested the proposed network with several empirical data sets from the literature, mainly from the Prisoner's Dilemma game and the Two Stage Gambling game. The results obtained show that the proposed quantum Bayesian Network is a general method that can accommodate violations of the laws of classical probability theory and make accurate predictions regarding human decision-making in these scenarios. PMID:26858669

  18. Development of a cyber security risk model using Bayesian networks

    International Nuclear Information System (INIS)

    Cyber security is an emerging safety issue in the nuclear industry, especially in the instrumentation and control (I and C) field. To address the cyber security issue systematically, a model that can be used for cyber security evaluation is required. In this work, a cyber security risk model based on a Bayesian network is suggested for evaluating cyber security for nuclear facilities in an integrated manner. The suggested model enables the evaluation of both the procedural and technical aspects of cyber security, which are related to compliance with regulatory guides and system architectures, respectively. The activity-quality analysis model was developed to evaluate how well people and/or organizations comply with the regulatory guidance associated with cyber security. The architecture analysis model was created to evaluate vulnerabilities and mitigation measures with respect to their effect on cyber security. The two models are integrated into a single model, which is called the cyber security risk model, so that cyber security can be evaluated from procedural and technical viewpoints at the same time. The model was applied to evaluate the cyber security risk of the reactor protection system (RPS) of a research reactor and to demonstrate its usefulness and feasibility. - Highlights: • We developed a cyber security risk model that can find the weak points of cyber security by integrating two cyber analysis models using a Bayesian network. • One is the activity-quality model, which signifies how well people and/or organizations comply with the cyber security regulatory guide. • The other is the architecture model, which represents the probability of a cyber-attack on the RPS architecture. • The cyber security risk model can provide evidence to determine the key elements of cyber security for the RPS of a research reactor.

  19. Marginal Bayesian nonparametric model for time to disease arrival of threatened amphibian populations.

    Science.gov (United States)

    Zhou, Haiming; Hanson, Timothy; Knapp, Roland

    2015-12-01

    The global emergence of Batrachochytrium dendrobatidis (Bd) has caused the extinction of hundreds of amphibian species worldwide. It has become increasingly important to be able to precisely predict time to Bd arrival in a population. The data analyzed herein present a unique challenge in terms of modeling because there is a strong spatial component to Bd arrival time and the traditional proportional hazards assumption is grossly violated. To address these concerns, we develop a novel marginal Bayesian nonparametric survival model for spatially correlated right-censored data. This class of models assumes that the logarithm of survival times marginally follow a mixture of normal densities with a linear-dependent Dirichlet process prior as the random mixing measure, and their joint distribution is induced by a Gaussian copula model with a spatial correlation structure. To invert high-dimensional spatial correlation matrices, we adopt a full-scale approximation that can capture both large- and small-scale spatial dependence. An efficient Markov chain Monte Carlo algorithm with delayed rejection is proposed for posterior computation, and an R package spBayesSurv is provided to fit the model. This approach is first evaluated through simulations, then applied to threatened frog populations in Sequoia-Kings Canyon National Park. PMID:26148536

  20. Spatial extended hazard model with application to prostate cancer survival.

    Science.gov (United States)

    Li, Li; Hanson, Timothy; Zhang, Jiajia

    2015-06-01

    This article develops a Bayesian semiparametric approach to the extended hazard model, with generalization to high-dimensional spatially grouped data. County-level spatial correlation is accommodated marginally through the normal transformation model of Li and Lin (2006, Journal of the American Statistical Association 101, 591-603), using a correlation structure implied by an intrinsic conditionally autoregressive prior. Efficient Markov chain Monte Carlo algorithms are developed, especially applicable to fitting very large, highly censored areal survival data sets. Per-variable tests for proportional hazards, accelerated failure time, and accelerated hazards are efficiently carried out with and without spatial correlation through Bayes factors. The resulting reduced, interpretable spatial models can fit significantly better than a standard additive Cox model with spatial frailties. PMID:25521422

  1. Immigrant maternal depression and social networks. A multilevel Bayesian spatial logistic regression in South Western Sydney, Australia.

    Science.gov (United States)

    Eastwood, John G; Jalaludin, Bin B; Kemp, Lynn A; Phung, Hai N; Barnett, Bryanne E W

    2013-09-01

    The purpose is to explore the multilevel spatial distribution of depressive symptoms among migrant mothers in South Western Sydney and to identify any group level associations that could inform subsequent theory building and local public health interventions. Migrant mothers (n=7256) delivering in 2002 and 2003 were assessed at 2-3 weeks after delivery for risk factors for depressive symptoms. The binary outcome variables were Edinburgh Postnatal Depression Scale scores (EPDS) of >9 and >12. Individual level variables included were: financial income, self-reported maternal health, social support network, emotional support, practical support, baby trouble sleeping, baby demanding and baby not content. The group level variable reported here is aggregated social support networks. We used Bayesian hierarchical multilevel spatial modelling with conditional autoregression. Migrant mothers were at higher risk of having depressive symptoms if they lived in a community with predominantly Australian-born mothers and strong social capital as measured by aggregated social networks. These findings suggest that migrant mothers are socially isolated and current home visiting services should be strengthened for migrant mothers living in communities where they may have poor social networks. PMID:23973180

  2. BAYESIAN ESTIMATION IN SHARED COMPOUND POISSON FRAILTY MODELS

    Directory of Open Access Journals (Sweden)

    David D. Hanagal

    2015-06-01

    Full Text Available In this paper, we study the compound Poisson distribution as the shared frailty distribution, together with two different baseline distributions, namely the Pareto and linear failure rate distributions, for modeling survival data. We use the Markov Chain Monte Carlo (MCMC) technique to estimate the parameters of the proposed models within a Bayesian estimation procedure. In the present study, a simulation is carried out to compare the true values of the parameters with the estimated values. We try to fit the proposed models to a real-life bivariate survival data set of McGrilchrist and Aisbett (1991) related to kidney infection. Also, we present a comparison study for the same data using model selection criteria, and suggest the better frailty model out of the two proposed frailty models.

  3. Experimental validation of a Bayesian model of visual acuity.

    LENUS (Irish Health Repository)

    Dalimier, Eugénie

    2009-01-01

    Based on standard procedures used in optometry clinics, we compare measurements of visual acuity for 10 subjects (11 eyes tested) in the presence of natural ocular aberrations and different degrees of induced defocus, with the predictions given by a Bayesian model customized with aberrometric data of the eye. The absolute predictions of the model, without any adjustment, show good agreement with the experimental data, in terms of correlation and absolute error. The efficiency of the model is discussed in comparison with image quality metrics and other customized visual process models. An analysis of the importance and customization of each stage of the model is also given; it stresses the potential high predictive power from precise modeling of ocular and neural transfer functions.

  4. Non-parametric Bayesian modeling of cervical mucus symptom

    OpenAIRE

    Bin, Riccardo De; Scarpa, Bruno

    2014-01-01

    The analysis of the cervical mucus symptom is useful to identify the period of maximum fertility of a woman. In this paper we analyze the daily evolution of the cervical mucus symptom during the menstrual cycle, based on the data collected in two retrospective studies, in which the mucus symptom is treated as an ordinal variable. To produce our statistical model, we follow a non-parametric Bayesian approach. In particular, we use the idea of non-parametric mixtures of rounded continuous kerne...

  5. Bayesian statistical methods and their application in probabilistic simulation models

    Directory of Open Access Journals (Sweden)

    Sergio Iannazzo

    2007-03-01

    Full Text Available Bayesian statistical methods are facing a rapidly growing level of interest and acceptance in the field of health economics. The reasons for this success are probably to be found in the theoretical foundations of the discipline, which make these techniques more appealing for decision analysis. To this should be added modern IT progress, which has produced several flexible and powerful statistical software frameworks. Among them, probably one of the most notable is the BUGS language project and its standalone application for MS Windows, WinBUGS. The scope of this paper is to introduce the subject and to show some interesting applications of WinBUGS in developing complex economic models based on Markov chains. The advantages of this approach reside in the elegance of the code produced and in its capability to easily develop probabilistic simulations. Moreover, an example of the integration of Bayesian inference models in a Markov model is shown. This last feature lets the analyst conduct statistical analyses on the available sources of evidence and exploit them directly as inputs to the economic model.
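
    A probabilistic Markov cohort simulation of the kind described here can be sketched without BUGS/WinBUGS. The Python example below samples transition probabilities from Dirichlet distributions standing in for Bayesian posteriors and propagates a cohort through a three-state model; the states, utilities and Dirichlet parameters are purely illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_draws, n_cycles = 1000, 20
qaly = np.array([1.0, 0.6, 0.0])      # utility per cycle in Healthy, Sick, Dead

results = []
for _ in range(n_draws):
    # Transition probabilities sampled from Dirichlet "posteriors"
    # (parameters are illustrative; in practice they come from Bayesian inference).
    p_healthy = rng.dirichlet([85, 10, 5])   # Healthy -> (Healthy, Sick, Dead)
    p_sick = rng.dirichlet([0.01, 70, 30])   # Sick    -> (Healthy, Sick, Dead)
    trans = np.array([p_healthy, p_sick, [0.0, 0.0, 1.0]])

    cohort = np.array([1.0, 0.0, 0.0])       # everyone starts healthy
    total_qaly = 0.0
    for _ in range(n_cycles):
        cohort = cohort @ trans
        total_qaly += cohort @ qaly
    results.append(total_qaly)

print(f"Expected QALYs: {np.mean(results):.2f} "
      f"(95% interval {np.percentile(results, 2.5):.2f}-{np.percentile(results, 97.5):.2f})")
```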

  6. Bayesian calibration of power plant models for accurate performance prediction

    International Nuclear Information System (INIS)

    Highlights: • Bayesian calibration is applied to power plant performance prediction. • Measurements from a plant in operation are used for model calibration. • A gas turbine performance model and steam cycle model are calibrated. • An integrated plant model is derived. • Part load efficiency is accurately predicted as a function of ambient conditions. - Abstract: Gas turbine combined cycles are expected to play an increasingly important role in the balancing of supply and demand in future energy markets. Thermodynamic modeling of these energy systems is frequently applied to assist in decision making processes related to the management of plant operation and maintenance. In most cases, model inputs, parameters and outputs are treated as deterministic quantities and plant operators make decisions with limited or no regard for uncertainties. As the steady integration of wind and solar energy into the energy market induces extra uncertainties, part load operation and reliability are becoming increasingly important. In the current study, methods are proposed to not only quantify various types of uncertainties in measurements and plant model parameters using measured data, but to also assess their effect on various aspects of performance prediction. The authors aim to account for model parameter and measurement uncertainty, and for systematic discrepancy of models with respect to reality. For this purpose, the Bayesian calibration framework of Kennedy and O’Hagan is used, which is especially suitable for high-dimensional industrial problems. The article derives a calibrated model of the plant efficiency as a function of ambient conditions and operational parameters, which is also accurate in part load. The article shows that complete statistical modeling of power plants not only enhances process models, but can also increase confidence in operational decisions.

  7. One-Stage and Bayesian Two-Stage Optimal Designs for Mixture Models

    OpenAIRE

    Lin, Hefang

    1999-01-01

    In this research, Bayesian two-stage D-D optimal designs for mixture experiments with or without process variables under model uncertainty are developed. A Bayesian optimality criterion is used in the first stage to minimize the determinant of the posterior variances of the parameters. The second stage design is then generated according to an optimality procedure that collaborates with the improved model from first stage data. Our results show that the Bayesian two-stage D-D optimal design...

  8. Uncovering Transcriptional Regulatory Networks by Sparse Bayesian Factor Model

    Directory of Open Access Journals (Sweden)

    Qi, Yuan (Alan)

    2010-01-01

    Full Text Available Abstract The problem of uncovering transcriptional regulation by transcription factors (TFs) based on microarray data is considered. A novel Bayesian sparse correlated rectified factor model (BSCRFM) is proposed that models the unknown TF protein-level activity, the correlated regulations between TFs, and the sparse nature of TF-regulated genes. The model admits prior knowledge from existing databases regarding TF-regulated target genes through a sparse prior, and via a developed Gibbs sampling algorithm, a context-specific transcriptional regulatory network specific to the experimental condition of the microarray data can be obtained. The proposed model and the Gibbs sampling algorithm were evaluated on simulated systems, and the results demonstrated the validity and effectiveness of the proposed approach. The proposed model was then applied to breast cancer microarray data from patients with Estrogen Receptor positive status and Estrogen Receptor negative status, respectively.

  9. Efficient multilevel brain tumor segmentation with integrated bayesian model classification.

    Science.gov (United States)

    Corso, J J; Sharon, E; Dube, S; El-Saden, S; Sinha, U; Yuille, A

    2008-05-01

    We present a new method for automatic segmentation of heterogeneous image data that takes a step toward bridging the gap between bottom-up affinity-based segmentation methods and top-down generative model based approaches. The main contribution of the paper is a Bayesian formulation for incorporating soft model assignments into the calculation of affinities, which are conventionally model free. We integrate the resulting model-aware affinities into the multilevel segmentation by weighted aggregation algorithm, and apply the technique to the task of detecting and segmenting brain tumor and edema in multichannel magnetic resonance (MR) volumes. The computationally efficient method runs orders of magnitude faster than current state-of-the-art techniques giving comparable or improved results. Our quantitative results indicate the benefit of incorporating model-aware affinities into the segmentation process for the difficult case of glioblastoma multiforme brain tumor. PMID:18450536

  10. Building dynamic spatial environmental models

    NARCIS (Netherlands)

    Karssenberg, D.J.

    2003-01-01

    An environmental model is a representation or imitation of complex natural phenomena that can be discerned by human cognitive processes. This thesis deals with the type of environmental models referred to as dynamic spatial environmental models. The word ‘spatial’ refers to the geographic domain which they represent, which is the two- or three-dimensional space, while ‘dynamic’ refers to models simulating changes through time using rules of cause and effect.

  11. Emulation: A fast stochastic Bayesian method to eliminate model space

    Science.gov (United States)

    Roberts, Alan; Hobbs, Richard; Goldstein, Michael

    2010-05-01

    Joint inversion of large 3D datasets has been the goal of geophysicists ever since the datasets first started to be produced. There are two broad approaches to this kind of problem: traditional deterministic inversion schemes and more recently developed Bayesian search methods, such as MCMC (Markov Chain Monte Carlo). However, using either of these kinds of schemes has proved prohibitively expensive, both in computing power and time cost, due to the normally very large model space which needs to be searched using forward model simulators which take considerable time to run. At the heart of strategies aimed at accomplishing this kind of inversion is the question of how to reliably and practicably reduce the size of the model space in which the inversion is to be carried out. Here we present a practical Bayesian method, known as emulation, which can address this issue. Emulation is a Bayesian technique used with considerable success in a number of technical fields, such as in astronomy, where the evolution of the universe has been modelled using this technique, and in the petroleum industry, where history matching of hydrocarbon reservoirs is carried out. The method of emulation involves building a fast-to-compute uncertainty-calibrated approximation to a forward model simulator. We do this by modelling the output data from a number of forward simulator runs by a computationally cheap function, and then fitting the coefficients defining this function to the model parameters. By calibrating the error of the emulator output with respect to the full simulator output, we can use this to screen out large areas of model space which contain only implausible models. For example, starting with what may be considered a geologically reasonable prior model space of 10000 models, using the emulator we can quickly show that only models which lie within 10% of that model space actually produce output data which is plausibly similar in character to an observed dataset. We can thus much ...
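
    The screening role of an emulator can be illustrated with a toy example: fit a cheap surrogate to a handful of expensive simulator runs, calibrate its error, and discard candidate models whose emulated output is implausibly far from an observation. The simulator, polynomial surrogate and implausibility threshold below are all illustrative assumptions, not the authors' setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_simulator(m):
    """Stand-in for a slow forward model (e.g. a 3-D geophysical simulation)."""
    return np.sin(3 * m) + 0.5 * m**2

# A small design of simulator runs spanning the prior model space.
design = np.linspace(-2, 2, 15)
runs = expensive_simulator(design)

# Cheap emulator: low-order polynomial fit to the simulator output,
# with its error calibrated against the training runs.
coeffs = np.polyfit(design, runs, deg=5)
emulator = lambda m: np.polyval(coeffs, m)
emulator_sd = np.std(runs - emulator(design))

# Screen a large candidate model space against one observed datum.
observed, obs_sd = expensive_simulator(0.7), 0.05
candidates = rng.uniform(-2, 2, 10000)
implausibility = np.abs(emulator(candidates) - observed) / np.sqrt(emulator_sd**2 + obs_sd**2)
plausible = candidates[implausibility < 3]

print(f"{100 * len(plausible) / len(candidates):.1f}% of model space remains plausible")
```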

  12. Finding the right balance between groundwater model complexity and experimental effort via Bayesian model selection

    Science.gov (United States)

    Schöniger, Anneli; Illman, Walter A.; Wöhling, Thomas; Nowak, Wolfgang

    2015-12-01

    Groundwater modelers face the challenge of how to assign representative parameter values to the studied aquifer. Several approaches are available to parameterize spatial heterogeneity in aquifer parameters. They differ in their conceptualization and complexity, ranging from homogeneous models to heterogeneous random fields. While it is common practice to invest more effort into data collection for models with a finer resolution of heterogeneities, there is a lack of advice on how much data is required to justify a certain level of model complexity. In this study, we propose to use concepts related to Bayesian model selection to identify this balance. We demonstrate our approach on the characterization of a heterogeneous aquifer via hydraulic tomography in a sandbox experiment (Illman et al., 2010). We consider four increasingly complex parameterizations of hydraulic conductivity: (1) Effective homogeneous medium, (2) geology-based zonation, (3) interpolation by pilot points, and (4) geostatistical random fields. First, we investigate the shift in justified complexity with an increasing amount of available data by constructing a model confusion matrix. This matrix indicates the maximum level of complexity that can be justified given a specific experimental setup. Second, we determine which parameterization is most adequate given the observed drawdown data. Third, we test how the different parameterizations perform in a validation setup. The results of our test case indicate that aquifer characterization via hydraulic tomography does not necessarily require (or justify) a geostatistical description. Instead, a zonation-based model might be a more robust choice, but only if the zonation is geologically adequate.
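
    The model-selection step can be summarized by converting (approximate) log model evidences into posterior model probabilities. The sketch below assumes equal prior model probabilities and purely illustrative evidence values; in practice the log evidence might be approximated, for example by -BIC/2 or by numerical integration.

```python
import numpy as np

def posterior_model_probabilities(log_evidences, prior=None):
    """Posterior model weights from (approximate) log model evidences."""
    log_ev = np.asarray(log_evidences, dtype=float)
    if prior is None:
        prior = np.full(len(log_ev), 1.0 / len(log_ev))   # equal prior model probabilities
    log_post = log_ev + np.log(prior)
    log_post -= log_post.max()                            # stabilise the exponentiation
    w = np.exp(log_post)
    return w / w.sum()

# Hypothetical log evidences for four parameterizations
# (homogeneous, zonation, pilot points, geostatistical) -- purely illustrative numbers.
print(posterior_model_probabilities([-120.4, -112.1, -113.0, -118.8]))
```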

  13. Bayesian Dose-Response Modeling in Sparse Data

    Science.gov (United States)

    Kim, Steven B.

    This book discusses Bayesian dose-response modeling in small samples applied to two different settings. The first setting is early phase clinical trials, and the second setting is toxicology studies in cancer risk assessment. In early phase clinical trials, experimental units are humans who are actual patients. Prior to a clinical trial, opinions from multiple subject area experts are generally more informative than the opinion of a single expert, but we may face a dilemma when they have disagreeing prior opinions. In this regard, we consider compromising the disagreement and compare two different approaches for making a decision. In addition to combining multiple opinions, we also address balancing two levels of ethics in early phase clinical trials. The first level is individual-level ethics which reflects the perspective of trial participants. The second level is population-level ethics which reflects the perspective of future patients. We extensively compare two existing statistical methods which focus on each perspective and propose a new method which balances the two conflicting perspectives. In toxicology studies, experimental units are living animals. Here we focus on a potential non-monotonic dose-response relationship which is known as hormesis. Briefly, hormesis is a phenomenon which can be characterized by a beneficial effect at low doses and a harmful effect at high doses. In cancer risk assessments, the estimation of a parameter, which is known as a benchmark dose, can be highly sensitive to a class of assumptions, monotonicity or hormesis. In this regard, we propose a robust approach which considers both monotonicity and hormesis as a possibility. In addition, we discuss statistical hypothesis testing for hormesis and consider various experimental designs for detecting hormesis based on Bayesian decision theory. Past experiments have not been optimally designed for testing for hormesis, and some Bayesian optimal designs may not be optimal under a

  14. Perceptual decision making: Drift-diffusion model is equivalent to a Bayesian model

    Directory of Open Access Journals (Sweden)

    Sebastian Bitzer

    2014-02-01

    Full Text Available Behavioural data obtained with perceptual decision making experiments are typically analysed with the drift-diffusion model. This parsimonious model accumulates noisy pieces of evidence towards a decision bound to explain the accuracy and reaction times of subjects. Recently, Bayesian models have been proposed to explain how the brain extracts information from noisy input as typically presented in perceptual decision making tasks. It has long been known that the drift-diffusion model is tightly linked with such functional Bayesian models but the precise relationship of the two mechanisms was never made explicit. Using a Bayesian model, we derived the equations which relate parameter values between these models. In practice we show that this equivalence is useful when fitting multi-subject data. We further show that the Bayesian model suggests different decision variables which all predict equal responses and discuss how these may be discriminated based on neural correlates of accumulated evidence. In addition, we discuss extensions to the Bayesian model which would be difficult to derive for the drift-diffusion model. We suggest that these and other extensions may be highly useful for deriving new experiments which test novel hypotheses.
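
    The accumulation-to-bound mechanism of the drift-diffusion model is easy to simulate; the following Python sketch (with arbitrary drift, bound and noise values) generates the reaction times and choices that such models are fitted to. It is a generic illustration, not the authors' parameterization.

```python
import numpy as np

rng = np.random.default_rng(3)

def drift_diffusion_trial(drift=0.3, bound=1.0, dt=0.001, noise=1.0):
    """Accumulate noisy evidence until one of two decision bounds is crossed."""
    x, t = 0.0, 0.0
    while abs(x) < bound:
        x += drift * dt + noise * np.sqrt(dt) * rng.normal()
        t += dt
    return t, x > 0          # reaction time, choice (True = upper bound)

trials = [drift_diffusion_trial() for _ in range(2000)]
rts = np.array([t for t, _ in trials])
accuracy = np.mean([c for _, c in trials])
print(f"mean RT = {rts.mean():.3f} s, accuracy = {accuracy:.2f}")
```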

  15. MATHEMATICAL RISK ANALYSIS: VIA NICHOLAS RISK MODEL AND BAYESIAN ANALYSIS

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-07-01

    Full Text Available The objective of this second part of a two-phased study was to explore the predictive power of the quantitative risk analysis (QRA) method and process within a Higher Education Institution (HEI). The method and process investigated the use of impact analysis via the Nicholas risk model and Bayesian analysis, with a sample of one hundred (100) risk analysts in a historically black South African University in the greater Eastern Cape Province. The first finding supported and confirmed previous literature (King III report, 2009; Nicholas and Steyn, 2008; Stoney, 2007; COSA, 2004) that there was a direct relationship between a risk factor, its likelihood and its impact, ceteris paribus. The second finding, in relation to controlling either the likelihood or the impact of occurrence of risk (Nicholas risk model), was that to obtain a better risk reward it was more important to control the likelihood of occurrence of risks than their impact, so as to have a direct effect on the entire University. On the Bayesian analysis, the third finding was that the impact of risk should be predicted along three aspects: the human impact (decisions made), the property impact (students and infrastructure based) and the business impact. Lastly, although business cycles vary considerably depending on the industry and/or the institution, the study revealed that most impacts in an HEI (University) occurred within the period of one academic year. The recommendation was that the application of quantitative risk analysis should be related to the current legislative framework that affects HEIs.

  16. Bayesian predictive modeling for genomic based personalized treatment selection.

    Science.gov (United States)

    Ma, Junsheng; Stingo, Francesco C; Hobbs, Brian P

    2016-06-01

    Efforts to personalize medicine in oncology have been limited by reductive characterizations of the intrinsically complex underlying biological phenomena. Future advances in personalized medicine will rely on molecular signatures that derive from synthesis of multifarious interdependent molecular quantities requiring robust quantitative methods. However, highly parameterized statistical models when applied in these settings often require a prohibitively large database and are sensitive to proper characterizations of the treatment-by-covariate interactions, which in practice are difficult to specify and may be limited by generalized linear models. In this article, we present a Bayesian predictive framework that enables the integration of a high-dimensional set of genomic features with clinical responses and treatment histories of historical patients, providing a probabilistic basis for using the clinical and molecular information to personalize therapy for future patients. Our work represents one of the first attempts to define personalized treatment assignment rules based on large-scale genomic data. We use actual gene expression data acquired from The Cancer Genome Atlas in the settings of leukemia and glioma to explore the statistical properties of our proposed Bayesian approach for personalizing treatment selection. The method is shown to yield considerable improvements in predictive accuracy when compared to penalized regression approaches. PMID:26575856

  17. Bayesian networks and agent-based modeling approach for urban land-use and population density change: a BNAS model

    Science.gov (United States)

    Kocabas, Verda; Dragicevic, Suzana

    2013-10-01

    Land-use change models grounded in complexity theory such as agent-based models (ABMs) are increasingly being used to examine evolving urban systems. The objective of this study is to develop a spatial model that simulates land-use change under the influence of human land-use choice behavior. This is achieved by integrating the key physical and social drivers of land-use change using Bayesian networks (BNs) coupled with agent-based modeling. The BNAS model (integrated Bayesian network-based agent system) presented in this study uses geographic information systems, ABMs, BNs, and influence diagram principles to model population change on an irregular spatial structure. The model is parameterized with historical data and then used to simulate 20 years of future population and land-use change for the City of Surrey, British Columbia, Canada. The simulation results identify feasible new urban areas for development around the main transportation corridors. The obtained new development areas and the projected population trajectories, together with the "what-if" scenario capabilities, can provide insights for urban planners towards better and more informed land-use policy and decision-making processes.

  18. Development of a Bayesian Belief Network Runway Incursion Model

    Science.gov (United States)

    Green, Lawrence L.

    2014-01-01

    In a previous paper, a statistical analysis of runway incursion (RI) events was conducted to ascertain their relevance to the top ten Technical Challenges (TC) of the National Aeronautics and Space Administration (NASA) Aviation Safety Program (AvSP). The study revealed connections to perhaps several of the AvSP top ten TC. That data also identified several primary causes and contributing factors for RI events that served as the basis for developing a system-level Bayesian Belief Network (BBN) model for RI events. The system-level BBN model will allow NASA to generically model the causes of RI events and to assess the effectiveness of technology products being developed under NASA funding. These products are intended to reduce the frequency of RI events in particular, and to improve runway safety in general. The development, structure and assessment of that BBN for RI events by a Subject Matter Expert panel are documented in this paper.

  19. Bayesian reduced-order models for multiscale dynamical systems

    CERN Document Server

    Koutsourelakis, P S

    2010-01-01

    While existing mathematical descriptions can accurately account for phenomena at microscopic scales (e.g. molecular dynamics), these are often high-dimensional, stochastic and their applicability over macroscopic time scales of physical interest is computationally infeasible or impractical. In complex systems, with limited physical insight on the coherent behavior of their constituents, the only available information is data obtained from simulations of the trajectories of huge numbers of degrees of freedom over microscopic time scales. This paper discusses a Bayesian approach to deriving probabilistic coarse-grained models that simultaneously address the problems of identifying appropriate reduced coordinates and the effective dynamics in this lower-dimensional representation. At the core of the models proposed lie simple, low-dimensional dynamical systems which serve as the building blocks of the global model. These approximate the latent, generating sources and parameterize the reduced-order dynamics. We d...

  20. Extended Bayesian Information Criteria for Gaussian Graphical Models

    CERN Document Server

    Foygel, Rina

    2010-01-01

    Gaussian graphical models with sparsity in the inverse covariance matrix are of significant interest in many modern applications. For the problem of recovering the graphical structure, information criteria provide useful optimization objectives for algorithms searching through sets of graphs or for selection of tuning parameters of other methods such as the graphical lasso, which is a likelihood penalization technique. In this paper we establish the consistency of an extended Bayesian information criterion for Gaussian graphical models in a scenario where both the number of variables p and the sample size n grow. Compared to earlier work on the regression case, our treatment allows for growth in the number of non-zero parameters in the true model, which is necessary in order to cover connected graphs. We demonstrate the performance of this criterion on simulated data when used in conjunction with the graphical lasso, and verify that the criterion indeed performs better than either cross-validation or the ordi...
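
    The extended BIC itself is a simple formula: a likelihood term plus penalties in the number of edges, the sample size n and the number of variables p. The Python sketch below implements that formula for a Gaussian graphical model (constants in the log-likelihood are dropped) and compares a dense and a diagonal precision-matrix estimate on toy data; the gamma value and data are illustrative assumptions.

```python
import numpy as np

def gaussian_loglik(S, precision, n):
    """Gaussian log-likelihood (up to constants) of a precision matrix given sample covariance S."""
    _, logdet = np.linalg.slogdet(precision)
    return n / 2.0 * (logdet - np.trace(S @ precision))

def ebic(S, precision, n, gamma=0.5):
    """Extended BIC for a Gaussian graphical model: -2*loglik + |E|*log(n) + 4*|E|*gamma*log(p)."""
    p = S.shape[0]
    # Number of edges = nonzero off-diagonal entries of the precision matrix (upper triangle).
    n_edges = np.count_nonzero(np.triu(precision, k=1))
    return (-2 * gaussian_loglik(S, precision, n)
            + n_edges * np.log(n)
            + 4 * n_edges * gamma * np.log(p))

# Toy use: compare a dense estimate with a diagonal (empty-graph) estimate.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
S = np.cov(X, rowvar=False)
dense = np.linalg.inv(S)
sparse = np.diag(1.0 / np.diag(S))
print(ebic(S, dense, n=200), ebic(S, sparse, n=200))
```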

  1. A Bayesian approach to the modelling of alpha Cen A

    CERN Document Server

    Bazot, M; Christensen-Dalsgaard, J

    2012-01-01

    Determining the physical characteristics of a star is an inverse problem consisting in estimating the parameters of models for the stellar structure and evolution, knowing certain observable quantities. We use a Bayesian approach to solve this problem for alpha Cen A, which allows us to incorporate prior information on the parameters to be estimated, in order to better constrain the problem. Our strategy is based on the use of a Markov Chain Monte Carlo (MCMC) algorithm to estimate the posterior probability densities of the stellar parameters: mass, age, initial chemical composition,... We use the stellar evolutionary code ASTEC to model the star. To constrain this model both seismic and non-seismic observations were considered. Several different strategies were tested to fit these values, either using two or five free parameters in ASTEC. We are thus able to show evidence that MCMC methods become efficient with respect to more classical grid-based strategies when the number of parameters increases. The resul...

  2. Advances in Bayesian Model Based Clustering Using Particle Learning

    Energy Technology Data Exchange (ETDEWEB)

    Merl, D M

    2009-11-19

    Recent work by Carvalho, Johannes, Lopes and Polson and Carvalho, Lopes, Polson and Taddy introduced a sequential Monte Carlo (SMC) alternative to traditional iterative Monte Carlo strategies (e.g. MCMC and EM) for Bayesian inference for a large class of dynamic models. The basis of SMC techniques involves representing the underlying inference problem as one of state space estimation, thus giving way to inference via particle filtering. The key insight of Carvalho et al was to construct the sequence of filtering distributions so as to make use of the posterior predictive distribution of the observable, a distribution usually only accessible in certain Bayesian settings. Access to this distribution allows a reversal of the usual propagate and resample steps characteristic of many SMC methods, thereby alleviating to a large extent many problems associated with particle degeneration. Furthermore, Carvalho et al point out that for many conjugate models the posterior distribution of the static variables can be parametrized in terms of [recursively defined] sufficient statistics of the previously observed data. For models where such sufficient statistics exist, particle learning, as it is being called, is especially well suited for the analysis of streaming data due to the relative invariance of its algorithmic complexity with the number of data observations. Through a particle learning approach, a statistical model can be fit to data as the data is arriving, allowing at any instant during the observation process direct quantification of uncertainty surrounding underlying model parameters. Here we describe the use of a particle learning approach for fitting a standard Bayesian semiparametric mixture model as described in Carvalho, Lopes, Polson and Taddy. In Section 2 we briefly review the previously presented particle learning algorithm for the case of a Dirichlet process mixture of multivariate normals. In Section 3 we describe several novel extensions to the original

  3. Uncertainty Quantification of a Distributed Hydrology Model Using Bayesian Framework Across a Heterogeneous Watershed

    Science.gov (United States)

    Samadi, S.; Atlani, A.; Meadows, M.; Barros, A. P.

    2014-12-01

    Coastal Plain watersheds have high levels of spatial heterogeneity due to rainfall variability in space and time, a dominant shallow water table, nonlinear river hydraulic properties, wide floodplains and dense vegetation. Bayesian algorithms can be used to capture key hydrological dynamics across the heterogeneous watershed system. This study examined two Bayesian frameworks to quantify parameter uncertainty during the 2003-2005 period in the distributed hydrologic model (i.e. the Soil and Water Assessment Tool (SWAT)) using the Sequential Uncertainty Fitting (SUFI-2) and Differential Evolution Adaptive Metropolis (DREAM) algorithms in the Black River watershed, in the southeastern (SE) United States. Both algorithms were calibrated using 19 absolute parameter ranges of the SWAT model, and the streamflow predictive uncertainty showed good agreement with the physical variation and system dynamics across the dry to moderately wet calibration period. In this study, the calibrated p-factor computed using the SUFI-2 and DREAM algorithms indicated that 63% and 69% of the observations, respectively, were bracketed by the predictive uncertainty band, underlining the importance of parameter uncertainty in the watershed under study. In addition, SWAT parameters exhibited significant seasonal variation in dry and wet hydrological conditions, and both Bayesian algorithms demonstrated that groundwater parameters and soil and land use properties contributed the most uncertainty to the model. To reduce the calibration load, a hypothesis was further tested about whether convergence of the DREAM algorithm can be achieved more quickly by incorporating the best parameter ranges of the SUFI-2 model. The results revealed that the convergence criterion was reached sooner when using the SUFI-2 parameter ranges, while the absolute parameter ranges met the convergence criterion only after approximately 77,000 model simulations. In principle the methodology proposed here led to some improvement in parameter identification, diminished the dimensionality of the parameter space and reduced the burn-in period.
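
    The p-factor reported above is simply the fraction of observations bracketed by the 95% predictive uncertainty band of the behavioural simulations. A minimal Python sketch follows, with synthetic data standing in for the SWAT streamflow ensemble.

```python
import numpy as np

def p_factor(observations, ensemble, lower_q=2.5, upper_q=97.5):
    """Fraction of observations bracketed by the 95% predictive uncertainty band.

    `ensemble` has shape (n_simulations, n_timesteps): one row per behavioural
    parameter set retained by an algorithm such as SUFI-2 or DREAM.
    """
    lo = np.percentile(ensemble, lower_q, axis=0)
    hi = np.percentile(ensemble, upper_q, axis=0)
    return np.mean((observations >= lo) & (observations <= hi))

# Toy data: a synthetic hydrograph and a 500-member predictive ensemble.
rng = np.random.default_rng(7)
obs = 10 + np.sin(np.linspace(0, 6, 100)) + rng.normal(0, 0.3, 100)
ens = obs + rng.normal(0, 0.5, size=(500, 100))
print(f"p-factor = {p_factor(obs, ens):.2f}")
```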

  4. Using Bayesian model averaging to estimate terrestrial evapotranspiration in China

    Science.gov (United States)

    Chen, Yang; Yuan, Wenping; Xia, Jiangzhou; Fisher, Joshua B.; Dong, Wenjie; Zhang, Xiaotong; Liang, Shunlin; Ye, Aizhong; Cai, Wenwen; Feng, Jinming

    2015-09-01

    Evapotranspiration (ET) is critical to terrestrial ecosystems as it links the water, carbon, and surface energy exchanges. Numerous ET models have been developed for ET estimation, but there are large model uncertainties. In this study, a Bayesian Model Averaging (BMA) method was used to merge eight satellite-based models, including five empirical and three process-based models, to improve the accuracy of ET estimates. At twenty-three eddy covariance flux towers, we examined the model performance of all possible combinations of the eight models and found that an ensemble of four models (BMA_Best) showed the best performance. The BMA_Best method can outperform the best of the eight models: the Kling-Gupta efficiency (KGE) value increased by 4% compared with the model with the highest KGE, and the RMSE decreased by 4%. Although the correlation coefficient of BMA_Best is less than that of the best single model, the bias of BMA_Best is the smallest among the eight models. Moreover, based on the water balance principle at the river basin scale, the validation indicated that the BMA_Best estimates can explain 86% of the variation. In general, the results showed that BMA estimates will be very useful for future studies to characterize regional water availability over long time series.
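
    At prediction time, Bayesian model averaging reduces to a weighted combination of the member-model estimates. The sketch below uses made-up weights and synthetic "ET" series purely for illustration; in the study the weights would be estimated against eddy covariance observations rather than assumed.

```python
import numpy as np

def bma_combine(predictions, weights):
    """Combine member-model predictions with Bayesian model averaging weights.

    `predictions` has shape (n_models, n_samples); weights are normalised to sum
    to one and would normally be estimated (e.g. by EM) against observations.
    """
    weights = np.asarray(weights, dtype=float) / np.sum(weights)
    return weights @ predictions

# Illustrative example with three member models and made-up weights.
rng = np.random.default_rng(11)
truth = 2.0 + np.sin(np.linspace(0, 12, 365))              # daily ET, mm/day
models = np.stack([truth + rng.normal(0, s, 365) + b
                   for s, b in [(0.3, 0.1), (0.5, -0.2), (0.4, 0.3)]])
weights = [0.5, 0.2, 0.3]
ensemble = bma_combine(models, weights)

rmse = lambda x: np.sqrt(np.mean((x - truth) ** 2))
print([round(rmse(m), 3) for m in models], round(rmse(ensemble), 3))
```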

  5. Semi-parametric Bayesian Partially Identified Models based on Support Function

    OpenAIRE

    Liao, Yuan; De Simoni, Anna

    2012-01-01

    We provide a comprehensive semi-parametric study of Bayesian partially identified econometric models. While the existing literature on Bayesian partial identification has mostly focused on the structural parameter, our primary focus is on Bayesian credible sets (BCS's) of the unknown identified set and the posterior distribution of its support function. We construct a (two-sided) BCS based on the support function of the identified set. We prove the Bernstein-von Mises theorem for the posterio...

  6. A Bayesian analysis of two probability models describing thunderstorm activity at Cape Kennedy, Florida

    Science.gov (United States)

    Williford, W. O.; Hsieh, P.; Carter, M. C.

    1974-01-01

    A Bayesian analysis of the two discrete probability models, the negative binomial and the modified negative binomial distributions, which have been used to describe thunderstorm activity at Cape Kennedy, Florida, is presented. The Bayesian approach with beta prior distributions is compared to the classical approach which uses a moment method of estimation or a maximum-likelihood method. The accuracy and simplicity of the Bayesian method is demonstrated.

  7. Bayesian Geostatistical Design

    DEFF Research Database (Denmark)

    Diggle, Peter; Lophaven, Søren Nymand

    2006-01-01

    locations to, or deletion of locations from, an existing design, and prospective design, which consists of choosing positions for a new set of sampling locations. We propose a Bayesian design criterion which focuses on the goal of efficient spatial prediction whilst allowing for the fact that model...

  8. Local models for spatial analysis

    CERN Document Server

    Lloyd, Christopher D

    2006-01-01

    In both the physical and social sciences, there are now available large spatial data sets with detailed local information. Global models for analyzing these data are not suitable for investigating local variations; consequently, local models are the subject of much recent research. Collecting a variety of models into a single reference, Local Models for Spatial Analysis explains in detail a variety of approaches for analyzing univariate and multivariate spatial data. Different models make use of data in unique ways, and this book offers perspectives on various definitions of what constitutes

  9. Modelling of population dynamics of red king crab using Bayesian approach

    Directory of Open Access Journals (Sweden)

    Bakanev Sergey ...

    2012-10-01

    Modeling population dynamics with a Bayesian approach makes it possible to resolve the above issues successfully. The integration of data from various studies into a unified model, based on the Bayesian parameter estimation method, provides a much more detailed description of the processes occurring in the population.

  10. Dynamic Bayesian Network Modeling of Game Based Diagnostic Assessments. CRESST Report 837

    Science.gov (United States)

    Levy, Roy

    2014-01-01

    Digital games offer an appealing environment for assessing student proficiencies, including skills and misconceptions in a diagnostic setting. This paper proposes a dynamic Bayesian network modeling approach for observations of student performance from an educational video game. A Bayesian approach to model construction, calibration, and use in…

  11. Bayesian Nonstationary Gaussian Process Models via Treed Process Convolutions

    OpenAIRE

    Liang, Waley Wei Jie

    2012-01-01

    Spatial modeling with stationary Gaussian processes (GPs) has been widely used, but the assumption that the correlation structure is independent of spatial location is invalid in many applications. Various nonstationary GP models have been developed to solve this problem, however, many of them become impractical when the sample size is large. To tackle this problem, a more computationally efficient GP model is developed by convolving a smoothing kernel with a latent process. Nonstationarit...

  12. Inversion of hierarchical Bayesian models using Gaussian processes.

    Science.gov (United States)

    Lomakina, Ekaterina I; Paliwal, Saee; Diaconescu, Andreea O; Brodersen, Kay H; Aponte, Eduardo A; Buhmann, Joachim M; Stephan, Klaas E

    2015-09-01

    Over the past decade, computational approaches to neuroimaging have increasingly made use of hierarchical Bayesian models (HBMs), either for inferring on physiological mechanisms underlying fMRI data (e.g., dynamic causal modelling, DCM) or for deriving computational trajectories (from behavioural data) which serve as regressors in general linear models. However, an unresolved problem is that standard methods for inverting the hierarchical Bayesian model are either very slow, e.g. Markov Chain Monte Carlo Methods (MCMC), or are vulnerable to local minima in non-convex optimisation problems, such as variational Bayes (VB). This article considers Gaussian process optimisation (GPO) as an alternative approach for global optimisation of sufficiently smooth and efficiently evaluable objective functions. GPO avoids being trapped in local extrema and can be computationally much more efficient than MCMC. Here, we examine the benefits of GPO for inverting HBMs commonly used in neuroimaging, including DCM for fMRI and the Hierarchical Gaussian Filter (HGF). Importantly, to achieve computational efficiency despite high-dimensional optimisation problems, we introduce a novel combination of GPO and local gradient-based search methods. The utility of this GPO implementation for DCM and HGF is evaluated against MCMC and VB, using both synthetic data from simulations and empirical data. Our results demonstrate that GPO provides parameter estimates with equivalent or better accuracy than the other techniques, but at a fraction of the computational cost required for MCMC. We anticipate that GPO will prove useful for robust and efficient inversion of high-dimensional and nonlinear models of neuroimaging data. PMID:26048619

  13. Building dynamic spatial environmental models

    OpenAIRE

    Karssenberg, D.J.

    2003-01-01

    An environmental model is a representation or imitation of complex natural phenomena that can be discerned by human cognitive processes. This thesis deals with the type of environmental models referred to as dynamic spatial environmental models. The word ‘spatial’ refers to the geographic domain which they represent, which is the two- or three-dimensional space, while ‘dynamic’ refers to models simulating changes through time using rules of cause and effect, represented in mathematical equati...

  14. Bayesian Degree-Corrected Stochastic Block Models for Community Detection

    CERN Document Server

    Peng, Lijun

    2013-01-01

    Community detection in networks has drawn much attention in diverse fields, especially social sciences. Given its significance, there has been a large body of literature among which many are not statistically based. In this paper, we propose a novel stochastic blockmodel based on a logistic regression setup with node correction terms to better address this problem. We follow a Bayesian approach that explicitly captures the community behavior via prior specification. We then adopt a data augmentation strategy with latent Polya-Gamma variables to obtain posterior samples. We conduct inference based on a canonically mapped centroid estimator that formally addresses label non-identifiability. We demonstrate the novel proposed model and estimation on real-world as well as simulated benchmark networks and show that the proposed model and estimator are more flexible, representative, and yield smaller error rates when compared to the MAP estimator from classical degree-corrected stochastic blockmodels.

  15. GPU Computing in Bayesian Inference of Realized Stochastic Volatility Model

    International Nuclear Information System (INIS)

    The realized stochastic volatility (RSV) model that utilizes the realized volatility as additional information has been proposed to infer volatility of financial time series. We consider the Bayesian inference of the RSV model by the Hybrid Monte Carlo (HMC) algorithm. The HMC algorithm can be parallelized and thus performed on the GPU for speedup. The GPU code is developed with CUDA Fortran. We compare the computational time in performing the HMC algorithm on GPU (GTX 760) and CPU (Intel i7-4770 3.4GHz) and find that the GPU can be up to 17 times faster than the CPU. We also code the program with OpenACC and find that appropriate coding can achieve a similar speedup to CUDA Fortran.

  16. a Simplified Bayesian Network Model Applied in Crop or Animal Disease Diagnosis

    Science.gov (United States)

    Yu, Helong; Chen, Guifen; Liu, Dayou

    A Bayesian network is a powerful tool to represent and deal with uncertain knowledge. There is much uncertainty in crop and animal disease. The construction of a Bayesian network needs much data and knowledge, but when data are scarce, other methods must be adopted to construct an effective Bayesian network. This paper introduces a disease diagnosis model based on a Bayesian network that is two-layered and obeys the noisy-OR assumption. Based on the two-layered structure, the relationships between nodes are obtained from domain knowledge. Based on the noisy-OR model, the conditional probability table is elicited by three methods: parameter learning, domain experts and the existing certainty factor model. In order to implement this model, a Bayesian network tool is developed. Finally, an example of cow disease diagnosis was implemented, which showed that the model discussed in this paper is an effective tool for simple disease diagnosis in the crop and animal fields.
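
    Under the noisy-OR assumption, the full conditional probability table of the child node is determined by one "link" probability per parent (plus an optional leak term), which is what makes elicitation from experts or certainty factors feasible. A small Python sketch with hypothetical link probabilities follows.

```python
import numpy as np
from itertools import product

def noisy_or_cpt(link_probs, leak=0.0):
    """Conditional probability that the child (e.g. a disease) is present for every
    combination of parent states, under the noisy-OR assumption.

    link_probs[i] is the probability that parent i alone activates the child.
    """
    cpt = {}
    for states in product([0, 1], repeat=len(link_probs)):
        p_inactive = (1 - leak) * np.prod(
            [1 - p for p, s in zip(link_probs, states) if s == 1])
        cpt[states] = 1 - p_inactive
    return cpt

# Three hypothetical causes, as might be elicited from a domain expert or a certainty-factor model.
cpt = noisy_or_cpt([0.8, 0.6, 0.3], leak=0.01)
print(cpt[(1, 0, 0)])   # only cause 1 present
print(cpt[(1, 1, 0)])   # causes 1 and 2 present -> higher probability
```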

  17. Determinants of Low Birth Weight in Malawi: Bayesian Geo-Additive Modelling.

    Directory of Open Access Journals (Sweden)

    Alfred Ngwira

    Full Text Available Studies on factors of low birth weight in Malawi have neglected the flexible approach of using smooth functions for some covariates in models. Such a flexible approach reveals the detailed relationship of covariates with the response. The study aimed at investigating risk factors of low birth weight in Malawi by assuming a flexible approach for continuous covariates and a geographical random effect. A Bayesian geo-additive model for birth weight in kilograms and size of the child at birth (less than average, or average and higher), with district as a spatial effect, using the 2010 Malawi demographic and health survey data was adopted. A Gaussian model for birth weight in kilograms and a binary logistic model for the binary outcome (size of child at birth) were fitted. Continuous covariates were modelled by penalized (P-)splines and spatial effects were smoothed by a two-dimensional P-spline. The study found that child birth order, mother weight and height are significant predictors of birth weight. Secondary education for the mother, birth order categories 2-3 and 4-5, a wealth index of richer family and mother height were significant predictors of child size at birth. The area associated with low birth weight was Chitipa, and the areas with increased risk of less than average size at birth were Chitipa and Mchinji. The study found support for the flexible modelling of some covariates that clearly have nonlinear influences. Nevertheless, there is no strong support for the inclusion of the geographical spatial analysis. The spatial patterns, though, point to the influence of omitted variables with some spatial structure or possibly epidemiological processes that account for this spatial structure, and the maps generated could be used for targeting development efforts at a glance.

  18. Modelling of Traffic Flow with Bayesian Autoregressive Model with Variable Partial Forgetting

    Czech Academy of Sciences Publication Activity Database

    Dedecius, Kamil; Nagy, Ivan; Hofman, Radek

    Praha: ČVUT v Praze, 2011, pp. 1-11. [CTU Workshop 2011. Praha (CZ), 01.02.2011-01.02.2011] Other grant: ČVUT v Praze (CZ) SGS 10/099/OHK3/1T/16. Institutional research plan: CEZ:AV0Z10750506. Keywords: Bayesian modelling * traffic modelling. Subject RIV: BB - Applied Statistics, Operational Research. http://library.utia.cas.cz/separaty/2011/AS/dedecius-modelling of traffic flow with bayesian autoregressive model with variable partial forgetting.pdf

  19. Comparison of Bayesian and frequentist approaches in modelling risk of preterm birth near the Sydney Tar Ponds, Nova Scotia, Canada

    Directory of Open Access Journals (Sweden)

    Canty Angelo

    2007-09-01

    Full Text Available Abstract Background This study compares the Bayesian and frequentist (non-Bayesian) approaches in the modelling of the association between the risk of preterm birth and maternal proximity to hazardous waste and pollution from the Sydney Tar Pond site in Nova Scotia, Canada. Methods The data include 1604 observed cases of preterm birth out of a total population of 17559 at risk of preterm birth from 144 enumeration districts in the Cape Breton Regional Municipality. Other covariates include the distance from the Tar Pond; the unemployment-to-population ratio; the proportion of persons who are separated, divorced or widowed; the proportion of persons who have no high school diploma; the proportion of persons living alone; the proportion of single parent families; and average income. Bayesian hierarchical Poisson regression, quasi-likelihood Poisson regression and weighted linear regression models were fitted to the data. Results The results of the analyses were compared together with their limitations. Conclusion The results of the weighted linear regression and the quasi-likelihood Poisson regression agree with the result from the Bayesian hierarchical modelling, which incorporates the spatial effects.

  20. Bayesian network models for error detection in radiotherapy plans

    Science.gov (United States)

    Kalet, Alan M.; Gennari, John H.; Ford, Eric C.; Phillips, Mark H.

    2015-04-01

    The purpose of this study is to design and develop a probabilistic network for detecting errors in radiotherapy plans for use at the time of initial plan verification. Our group has initiated a multi-pronged approach to reduce these errors. We report on our development of Bayesian models of radiotherapy plans. Bayesian networks consist of joint probability distributions that define the probability of one event, given some set of other known information. Using the networks, we find the probability of obtaining certain radiotherapy parameters, given a set of initial clinical information. A low probability in a propagated network then corresponds to potential errors to be flagged for investigation. To build our networks we first interviewed medical physicists and other domain experts to identify the relevant radiotherapy concepts and their associated interdependencies and to construct a network topology. Next, to populate the network's conditional probability tables, we used the Hugin Expert software to learn parameter distributions from a subset of de-identified data derived from a radiation oncology based clinical information database system. These data represent 4990 unique prescription cases over a 5 year period. Under test case scenarios with approximately 1.5% introduced error rates, network performance produced areas under the ROC curve of 0.88, 0.98, and 0.89 for the lung, brain and female breast cancer error detection networks, respectively. Comparison of the brain network to human experts' performance (AUC of 0.90 ± 0.01) shows the Bayes network model performs better than domain experts under the same test conditions. Our results demonstrate the feasibility and effectiveness of comprehensive probabilistic models as part of decision support systems for improved detection of errors in initial radiotherapy plan verification procedures.
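
    The flagging rule described in this record (low propagated probability implies a plan should be reviewed) can be illustrated with a toy lookup, sketched below. The sites, doses, conditional probabilities and threshold are hypothetical placeholders, not the Hugin-learned distributions from the study.

```python
# Minimal sketch: score a plan parameter by its conditional probability given clinical
# context, and flag it for review when that probability falls below a threshold.
# All table values and the threshold are hypothetical, illustrative numbers only.
learned_cpt = {
    # P(prescribed dose in Gy | treatment site)
    ("lung", 60.0): 0.55, ("lung", 45.0): 0.30, ("lung", 20.0): 0.02,
    ("brain", 60.0): 0.40, ("brain", 30.0): 0.45,
}

def flag_plan(site, dose, threshold=0.05):
    p = learned_cpt.get((site, dose), 0.0)   # unseen combinations get probability ~0
    return {"probability": p, "flag_for_review": p < threshold}

print(flag_plan("lung", 20.0))   # rare combination -> flagged
print(flag_plan("lung", 60.0))   # common combination -> passes
```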

  1. Assessing and accounting for the effects of model error in Bayesian solutions to hydrogeophysical inverse problems

    Science.gov (United States)

    Koepke, C.; Irving, J.; Roubinet, D.

    2014-12-01

    Geophysical methods have gained much interest in hydrology over the past two decades because of their ability to provide estimates of the spatial distribution of subsurface properties at a scale that is often relevant to key hydrological processes. Because of an increased desire to quantify uncertainty in hydrological predictions, many hydrogeophysical inverse problems have recently been posed within a Bayesian framework, such that estimates of hydrological properties and their corresponding uncertainties can be obtained. With the Bayesian approach, it is often necessary to make significant approximations to the associated hydrological and geophysical forward models such that stochastic sampling from the posterior distribution, for example using Markov-chain-Monte-Carlo (MCMC) methods, is computationally feasible. These approximations lead to model structural errors, which, so far, have not been properly treated in hydrogeophysical inverse problems. Here, we study the inverse problem of estimating unsaturated hydraulic properties, namely the van Genuchten-Mualem (VGM) parameters, in a layered subsurface from time-lapse, zero-offset-profile (ZOP) ground penetrating radar (GPR) data, collected over the course of an infiltration experiment. In particular, we investigate the effects of assumptions made for computational tractability of the stochastic inversion on model prediction errors as a function of depth and time. These assumptions are that (i) infiltration is purely vertical and can be modeled by the 1D Richards equation, and (ii) the petrophysical relationship between water content and relative dielectric permittivity is known. Results indicate that model errors for this problem are far from Gaussian and independent and identically distributed, which has been the common assumption in previous efforts in this domain. In order to develop a more appropriate likelihood formulation, we use (i) a stochastic description of the model error that is obtained through

  2. Bayesian modeling of ChIP-chip data using latent variables.

    KAUST Repository

    Wu, Mingqi

    2009-10-26

    BACKGROUND: The ChIP-chip technology has been used in a wide range of biomedical studies, such as identification of human transcription factor binding sites, investigation of DNA methylation, and investigation of histone modifications in animals and plants. Various methods have been proposed in the literature for analyzing the ChIP-chip data, such as the sliding window methods, the hidden Markov model-based methods, and Bayesian methods. Although Bayesian methods can potentially work better than the other two classes of methods because they integrate uncertainty about both the models and the model parameters, the existing Bayesian methods do not perform satisfactorily. They usually require multiple replicates or some extra experimental information to parametrize the model, and long CPU times due to the MCMC simulations involved. RESULTS: In this paper, we propose a Bayesian latent model for the ChIP-chip data. The new model mainly differs from the existing Bayesian models, such as the joint deconvolution model, the hierarchical gamma mixture model, and the Bayesian hierarchical model, in two respects. Firstly, it works on the difference between the averaged treatment and control samples. This enables the use of a simple model for the data, which avoids the probe-specific effect and the sample (control/treatment) effect. As a consequence, this enables an efficient MCMC simulation of the posterior distribution of the model, and also makes the model more robust to outliers. Secondly, it models the neighboring dependence of probes by introducing a latent indicator vector. A truncated Poisson prior distribution is assumed for the latent indicator variable, with the rationale being justified at length. CONCLUSION: The Bayesian latent method is successfully applied to real and ten simulated datasets, with comparisons with some of the existing Bayesian methods, hidden Markov model methods, and sliding window methods. The numerical results indicate that the

  3. Bayesian modeling of ChIP-chip data using latent variables

    Directory of Open Access Journals (Sweden)

    Tian Yanan

    2009-10-01

    Full Text Available Abstract Background The ChIP-chip technology has been used in a wide range of biomedical studies, such as identification of human transcription factor binding sites, investigation of DNA methylation, and investigation of histone modifications in animals and plants. Various methods have been proposed in the literature for analyzing the ChIP-chip data, such as the sliding window methods, the hidden Markov model-based methods, and Bayesian methods. Although Bayesian methods can potentially work better than the other two classes of methods because they integrate uncertainty about both the models and the model parameters, the existing Bayesian methods do not perform satisfactorily. They usually require multiple replicates or some extra experimental information to parametrize the model, and long CPU times due to the MCMC simulations involved. Results In this paper, we propose a Bayesian latent model for the ChIP-chip data. The new model mainly differs from the existing Bayesian models, such as the joint deconvolution model, the hierarchical gamma mixture model, and the Bayesian hierarchical model, in two respects. Firstly, it works on the difference between the averaged treatment and control samples. This enables the use of a simple model for the data, which avoids the probe-specific effect and the sample (control/treatment) effect. As a consequence, this enables an efficient MCMC simulation of the posterior distribution of the model, and also makes the model more robust to outliers. Secondly, it models the neighboring dependence of probes by introducing a latent indicator vector. A truncated Poisson prior distribution is assumed for the latent indicator variable, with the rationale being justified at length. Conclusion The Bayesian latent method is successfully applied to real and ten simulated datasets, with comparisons with some of the existing Bayesian methods, hidden Markov model methods, and sliding window methods. The numerical results

  4. Bayesian modeling and significant features exploration in wavelet power spectra

    Directory of Open Access Journals (Sweden)

    D. V. Divine

    2007-01-01

    Full Text Available This study proposes and justifies a Bayesian approach to modeling wavelet coefficients and finding statistically significant features in wavelet power spectra. The approach utilizes ideas elaborated in scale-space smoothing methods and wavelet data analysis. We treat each scale of the discrete wavelet decomposition as a sequence of independent random variables and then apply Bayes' rule for constructing the posterior distribution of the smoothed wavelet coefficients. Samples drawn from the posterior are subsequently used for finding the estimate of the true wavelet spectrum at each scale. The method offers two different significance testing procedures for wavelet spectra. A traditional approach assesses the statistical significance against a red noise background. The second procedure tests for homoscedasticity of the wavelet power assessing whether the spectrum derivative significantly differs from zero at each particular point of the spectrum. Case studies with simulated data and climatic time-series prove the method to be a potentially useful tool in data analysis.
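
    A minimal sketch of the per-scale Bayesian smoothing idea described above is given below: wavelet coefficients at one scale are treated as independent noisy observations, and a conjugate normal prior yields shrunken posterior estimates and posterior draws. The prior and noise variances are placeholder assumptions, not the paper's scale-space choices.

```python
import numpy as np

# Conjugate normal shrinkage of wavelet coefficients at a single scale.
# Observed coefficient y ~ N(theta, sigma2), prior theta ~ N(0, tau2):
# posterior mean = tau2/(tau2+sigma2) * y, posterior variance = shrink * sigma2.
rng = np.random.default_rng(0)
observed = rng.standard_normal(64) * 2.0 + 1.0   # stand-in coefficients at one scale
sigma2, tau2 = 1.0, 4.0                          # noise variance, prior variance (illustrative)

shrink = tau2 / (tau2 + sigma2)
post_mean = shrink * observed
post_sd = np.sqrt(shrink * sigma2)
draws = post_mean + post_sd * rng.standard_normal((1000, observed.size))  # posterior samples
print(post_mean[:5].round(2), float(round(post_sd, 3)))
```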

  5. Designing and testing inflationary models with Bayesian networks

    CERN Document Server

    Price, Layne C; Frazer, Jonathan; Easther, Richard

    2015-01-01

    Even simple inflationary scenarios have many free parameters. Beyond the variables appearing in the inflationary action, these include dynamical initial conditions, the number of fields, and couplings to other sectors. These quantities are often ignored but cosmological observables can depend on the unknown parameters. We use Bayesian networks to account for a large set of inflationary parameters, deriving generative models for the primordial spectra that are conditioned on a hierarchical set of prior probabilities describing the initial conditions, reheating physics, and other free parameters. We use $N_f$--quadratic inflation as an illustrative example, finding that the number of $e$-folds $N_*$ between horizon exit for the pivot scale and the end of inflation is typically the most important parameter, even when the number of fields, their masses and initial conditions are unknown, along with possible conditional dependencies between these parameters.

  6. Designing and testing inflationary models with Bayesian networks

    Energy Technology Data Exchange (ETDEWEB)

    Price, Layne C. [Carnegie Mellon Univ., Pittsburgh, PA (United States). Dept. of Physics; Auckland Univ. (New Zealand). Dept. of Physics; Peiris, Hiranya V. [Univ. College London (United Kingdom). Dept. of Physics and Astronomy; Frazer, Jonathan [DESY Hamburg (Germany). Theory Group; Univ. of the Basque Country, Bilbao (Spain). Dept. of Theoretical Physics; Basque Foundation for Science, Bilbao (Spain). IKERBASQUE; Easther, Richard [Auckland Univ. (New Zealand). Dept. of Physics

    2015-11-15

    Even simple inflationary scenarios have many free parameters. Beyond the variables appearing in the inflationary action, these include dynamical initial conditions, the number of fields, and couplings to other sectors. These quantities are often ignored but cosmological observables can depend on the unknown parameters. We use Bayesian networks to account for a large set of inflationary parameters, deriving generative models for the primordial spectra that are conditioned on a hierarchical set of prior probabilities describing the initial conditions, reheating physics, and other free parameters. We use N{sub f}-quadratic inflation as an illustrative example, finding that the number of e-folds N{sub *} between horizon exit for the pivot scale and the end of inflation is typically the most important parameter, even when the number of fields, their masses and initial conditions are unknown, along with possible conditional dependencies between these parameters.

  7. Thermodynamic Model of Spatial Memory

    Science.gov (United States)

    Kaufman, Miron; Allen, P.

    1998-03-01

    We develop and test a thermodynamic model of spatial memory. Our model is an application of statistical thermodynamics to cognitive science. It is related to applications of the statistical mechanics framework in parallel distributed processes research. Our macroscopic model allows us to evaluate an entropy associated with spatial memory tasks. We find that older adults exhibit higher levels of entropy than younger adults. Thurstone's Law of Categorical Judgment, according to which the discriminal processes along the psychological continuum produced by presentations of a single stimulus are normally distributed, is explained by using a Hooke spring model of spatial memory. We have also analyzed a nonlinear modification of the ideal spring model of spatial memory. This work is supported by NIH/NIA grant AG09282-06.

  8. Hunting down the best model of inflation with Bayesian evidence

    International Nuclear Information System (INIS)

    We present the first calculation of the Bayesian evidence for different prototypical single field inflationary scenarios, including representative classes of small field and large field models. This approach allows us to compare inflationary models in a well-defined statistical way and to determine the current 'best model of inflation'. The calculation is performed numerically by interfacing the inflationary code FieldInf with MultiNest. We find that small field models are currently preferred, while large field models having a self-interacting potential of power p>4 are strongly disfavored. The class of small field models as a whole has posterior odds of approximately 3:1 when compared with the large field class. The methodology and results presented in this article are an additional step toward the construction of a full numerical pipeline to constrain the physics of the early Universe with astrophysical observations. More accurate data (such as the Planck data) and the techniques introduced here should allow us to identify conclusively the best inflationary model.
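
    As a hedged illustration of the model comparison step above, the snippet below turns per-model log-evidences (of the kind produced by nested-sampling codes such as MultiNest) into posterior model probabilities and odds, assuming equal prior model probabilities. The log-evidence values are made-up placeholders, not results from the paper.

```python
import numpy as np

# Posterior model probabilities from log-evidences, equal model priors assumed.
log_evidence = {"small_field": -1001.2, "large_field_p2": -1002.3, "large_field_p6": -1007.9}

names = list(log_evidence)
logZ = np.array([log_evidence[n] for n in names])
w = np.exp(logZ - logZ.max())                     # subtract max for numerical stability
post = dict(zip(names, w / w.sum()))              # posterior model probabilities
print(post)
print("odds small_field vs large_field_p2:",
      float(np.exp(log_evidence["small_field"] - log_evidence["large_field_p2"])))
```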

  9. Probabilistic daily ILI syndromic surveillance with a spatio-temporal Bayesian hierarchical model.

    Directory of Open Access Journals (Sweden)

    Ta-Chien Chan

    Full Text Available BACKGROUND: For daily syndromic surveillance to be effective, an efficient and sensible algorithm would be expected to detect aberrations in influenza illness, and alert public health workers prior to any impending epidemic. This detection or alert surely contains uncertainty, and thus should be evaluated with a proper probabilistic measure. However, traditional monitoring mechanisms simply provide a binary alert, failing to adequately address this uncertainty. METHODS AND FINDINGS: Based on the Bayesian posterior probability of influenza-like illness (ILI visits, the intensity of outbreak can be directly assessed. The numbers of daily emergency room ILI visits at five community hospitals in Taipei City during 2006-2007 were collected and fitted with a Bayesian hierarchical model containing meteorological factors such as temperature and vapor pressure, spatial interaction with conditional autoregressive structure, weekend and holiday effects, seasonality factors, and previous ILI visits. The proposed algorithm recommends an alert for action if the posterior probability is larger than 70%. External data from January to February of 2008 were retained for validation. The decision rule detects successfully the peak in the validation period. When comparing the posterior probability evaluation with the modified Cusum method, results show that the proposed method is able to detect the signals 1-2 days prior to the rise of ILI visits. CONCLUSIONS: This Bayesian hierarchical model not only constitutes a dynamic surveillance system but also constructs a stochastic evaluation of the need to call for alert. The monitoring mechanism provides earlier detection as well as a complementary tool for current surveillance programs.
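
    The alert rule summarized above (issue an alert when the posterior probability of an outbreak signal exceeds 70%) can be sketched as follows. The posterior draws, baseline threshold, and alert cut-off here are simulated placeholders standing in for output of the fitted spatio-temporal model, not the Taipei data.

```python
import numpy as np

# Sketch of a posterior-probability alert rule: posterior predictive draws of today's
# ILI visit count, and P(count > baseline) compared with a 70% alert threshold.
rng = np.random.default_rng(0)
posterior_lambda = rng.gamma(shape=80, scale=1.0, size=5000)  # pretend MCMC draws of today's mean visits
baseline = 75                                                  # hypothetical seasonal baseline

pred = rng.poisson(posterior_lambda)          # posterior predictive draws
p_exceed = float(np.mean(pred > baseline))    # posterior probability of exceeding baseline
print(f"P(outbreak signal) = {p_exceed:.2f}, alert = {p_exceed > 0.70}")
```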

  10. Bayesian modeling of animal- and herd-level prevalences.

    Science.gov (United States)

    Branscum, A J; Gardner, I A; Johnson, W O

    2004-12-15

    We reviewed Bayesian approaches for animal-level and herd-level prevalence estimation based on cross-sectional sampling designs and demonstrated fitting of these models using the WinBUGS software. We considered estimation of infection prevalence based on use of a single diagnostic test applied to a single herd with binomial and hypergeometric sampling. We then considered multiple herds under binomial sampling with the primary goal of estimating the prevalence distribution and the proportion of infected herds. A new model is presented that can be used to estimate the herd-level prevalence in a region, including the posterior probability that all herds are non-infected. Using this model, inferences for the distribution of prevalences, mean prevalence in the region, and predicted prevalence of herds in the region (including the predicted probability of zero prevalence) are also available. In the models presented, both animal- and herd-level prevalences are modeled as mixture distributions to allow for zero infection prevalences. (If mixture models for the prevalences were not used, prevalence estimates might be artificially inflated, especially in herds and regions with low or zero prevalence.) Finally, we considered estimation of animal-level prevalence based on pooled samples. PMID:15579338
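
    A minimal sketch of the zero-inflated ("mixture") prevalence idea mentioned above is given below: a herd is non-infected with some prior probability, otherwise its prevalence follows a Beta prior and animals are sampled binomially. A perfect diagnostic test is assumed for brevity (the record also covers imperfect tests), and all numbers are illustrative.

```python
from math import comb, lgamma, exp

# Posterior probability that a herd is truly non-infected, under a point-mass-at-zero /
# Beta(a, b) mixture prior for within-herd prevalence and binomial sampling of n animals.
def log_beta(a, b):
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def beta_binomial_pmf(y, n, a, b):
    return comb(n, y) * exp(log_beta(a + y, b + n - y) - log_beta(a, b))

def posterior_prob_noninfected(y, n, pi=0.3, a=1.0, b=4.0):
    m_inf = pi * beta_binomial_pmf(y, n, a, b)       # marginal likelihood: herd infected
    m_zero = (1 - pi) * (1.0 if y == 0 else 0.0)     # herd non-infected (perfect test assumed)
    return m_zero / (m_zero + m_inf)

print(posterior_prob_noninfected(y=0, n=30))  # no positives: appreciable chance of zero prevalence
print(posterior_prob_noninfected(y=2, n=30))  # any positive rules out zero prevalence here
```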

  11. A Bayesian semiparametric multilevel survival modelling of age at first birth in Nigeria

    Directory of Open Access Journals (Sweden)

    Ezra Gayawan

    2013-06-01

    Full Text Available BACKGROUND The age at which childbearing begins influences the total number of children a woman bears throughout her reproductive period, in the absence of any active fertility control. For countries in sub-Saharan Africa where contraceptive prevalence rate is still low, younger ages at first birth tend to increase the number of children a woman will have thereby hindering the process of fertility decline. Research has also shown that early childbearing can endanger the health of the mother and her offspring, which can in turn lead to high child and maternal mortality. OBJECTIVE In this paper, an attempt was made to explore possible trends, geographical variation and determinants of timing of first birth in Nigeria, using the 1999 - 2008 Nigeria Demographic and Health Survey data sets. METHODS A structured additive survival model for continuous time data, an approach that simultaneously estimates the nonlinear effect of metrical covariates, fixed effects, spatial effects and smoothing parameters within a Bayesian context in one step is employed for all estimations. All analyses were carried out using BayesX - a software package for Bayesian modelling techniques. RESULTS Results from this paper reveal that variation in age at first birth in Nigeria is determined more by individual household than by community, and that substantial geographical variations in timing of first birth also exist. COMMENTS These findings can guide policymakers in identifying states or districts that are associated with significant risk of early childbirth, which can in turn be used in designing effective strategies and in decision making.

  12. Modeling crash spatial heterogeneity: random parameter versus geographically weighting.

    Science.gov (United States)

    Xu, Pengpeng; Huang, Helai

    2015-02-01

    The widely adopted techniques for regional crash modeling include the negative binomial model (NB) and the Bayesian negative binomial model with conditional autoregressive prior (CAR). The outputs from both models consist of a set of fixed global parameter estimates. However, the impacts of predicting variables on crash counts might not be stationary over space. This study intended to quantitatively investigate this spatial heterogeneity in regional safety modeling using two advanced approaches, i.e., the random parameter negative binomial model (RPNB) and the semi-parametric geographically weighted Poisson regression model (S-GWPR). Based on a 3-year data set from the county of Hillsborough, Florida, results revealed that (1) both RPNB and S-GWPR successfully capture the spatially varying relationship, but the two methods yield notably different sets of results; (2) the S-GWPR performs best with the highest value of Rd² as well as the lowest mean absolute deviance and Akaike information criterion measures, whereas the RPNB is comparable to the CAR and, in some cases, provides less accurate predictions; (3) a moderately significant spatial correlation is found in the residuals of RPNB and NB, implying their inadequacy in accounting for the spatial correlation that exists across adjacent zones. As crash data are typically collected with reference to a location dimension, it is desirable to first make use of the geographical component to explicitly explore spatial aspects of the crash data (i.e., the spatial heterogeneity, or the spatially structured varying relationships), and then to address the unobserved heterogeneity with non-spatial or fuzzy techniques. The S-GWPR is proven to be more appropriate for regional crash modeling, as it outperforms the global models in capturing the spatial heterogeneity occurring in the modeled relationships and, compared with the non-spatial model, it is capable of accounting for the spatial correlation in crash data. PMID:25460087

  13. Bayesian calibration of the Community Land Model using surrogates

    Energy Technology Data Exchange (ETDEWEB)

    Ray, Jaideep [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hou, Zhangshuan [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Huang, Maoyi [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-02-01

    We present results from the Bayesian calibration of hydrological parameters of the Community Land Model (CLM), which is often used in climate simulations and Earth system models. A statistical inverse problem is formulated for three hydrological parameters, conditional on observations of latent heat surface fluxes over 48 months. Our calibration method uses polynomial and Gaussian process surrogates of the CLM, and solves the parameter estimation problem using a Markov chain Monte Carlo sampler. Posterior probability densities for the parameters are developed for two sites with different soil and vegetation covers. Our method also allows us to examine the structural error in CLM under two error models. We find that surrogate models can be created for CLM in most cases. The posterior distributions are more predictive than the default parameter values in CLM. Climatologically averaging the observations does not modify the parameters' distributions significantly. The structural error model reveals a correlation time-scale which can be used to identify the physical process that could be contributing to it. While the calibrated CLM has a higher predictive skill, the calibration is under-dispersive.
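
    The surrogate-plus-MCMC workflow described above can be sketched in a few lines: replace the expensive simulator with a cheap polynomial fit, then run a random-walk Metropolis sampler against observations. The "simulator", data, prior range and tuning values below are synthetic placeholders, not CLM or the latent-heat observations used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

def expensive_model(theta):                 # stand-in for the real simulator
    return 2.0 * theta + 0.5 * theta**2

# 1) Build a polynomial surrogate from a small design of "simulator" runs.
design = np.linspace(0.0, 3.0, 8)
runs = expensive_model(design)
coeffs = np.polyfit(design, runs, deg=2)
surrogate = lambda t: np.polyval(coeffs, t)

# 2) Synthetic observation and random-walk Metropolis sampling on the surrogate.
obs, sigma = expensive_model(1.7) + 0.1, 0.2
def log_post(t):
    if not (0.0 <= t <= 3.0):               # uniform prior on [0, 3]
        return -np.inf
    return -0.5 * ((obs - surrogate(t)) / sigma) ** 2

chain, t = [], 1.0
lp = log_post(t)
for _ in range(5000):
    prop = t + 0.1 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
        t, lp = prop, lp_prop
    chain.append(t)
print("posterior mean:", float(np.mean(chain[1000:])))
```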

  14. Ensemble bayesian model averaging using markov chain Monte Carlo sampling

    Energy Technology Data Exchange (ETDEWEB)

    Vrugt, Jasper A [Los Alamos National Laboratory; Diks, Cees G H [NON LANL; Clark, Martyn P [NON LANL

    2008-01-01

    Bayesian model averaging (BMA) has recently been proposed as a statistical method to calibrate forecast ensembles from numerical weather models. Successful implementation of BMA, however, requires accurate estimates of the weights and variances of the individual competing models in the ensemble. In their seminal paper (Mon Weather Rev 133:1155-1174, 2005), Raftery et al. recommended the Expectation-Maximization (EM) algorithm for BMA model training, even though global convergence of this algorithm cannot be guaranteed. In this paper, we compare the performance of the EM algorithm and the recently developed Differential Evolution Adaptive Metropolis (DREAM) Markov Chain Monte Carlo (MCMC) algorithm for estimating the BMA weights and variances. Simulation experiments using 48-hour ensemble data of surface temperature and multi-model stream-flow forecasts show that both methods produce similar results, and that their performance is unaffected by the length of the training data set. However, MCMC simulation with DREAM is capable of efficiently handling a wide variety of BMA predictive distributions, and provides useful information about the uncertainty associated with the estimated BMA weights and variances.
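
    As a hedged sketch of the BMA predictive distribution discussed above, the snippet below evaluates the weighted mixture of Gaussian kernels centred on the member forecasts. The weights and kernel variance would normally be estimated by EM or MCMC (e.g. DREAM) training; the numbers here are illustrative placeholders.

```python
import numpy as np

forecasts = np.array([21.3, 19.8, 22.6])   # ensemble member forecasts (deg C)
weights = np.array([0.5, 0.2, 0.3])        # BMA weights (sum to 1)
sigma = 1.4                                # common kernel standard deviation

def bma_pdf(y):
    """BMA predictive density at y: weighted sum of normal kernels."""
    k = np.exp(-0.5 * ((y - forecasts) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return float(np.sum(weights * k))

print("BMA predictive mean:", float(np.sum(weights * forecasts)))
ys = np.linspace(23.0, 25.0, 41)
print("P(23 <= y <= 25) ~", float(np.trapz([bma_pdf(y) for y in ys], ys)))
```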

  15. A Bayesian model of category-specific emotional brain responses.

    Science.gov (United States)

    Wager, Tor D; Kang, Jian; Johnson, Timothy D; Nichols, Thomas E; Satpute, Ajay B; Barrett, Lisa Feldman

    2015-04-01

    Understanding emotion is critical for a science of healthy and disordered brain function, but the neurophysiological basis of emotional experience is still poorly understood. We analyzed human brain activity patterns from 148 studies of emotion categories (2159 total participants) using a novel hierarchical Bayesian model. The model allowed us to classify which of five categories--fear, anger, disgust, sadness, or happiness--is engaged by a study with 66% accuracy (43-86% across categories). Analyses of the activity patterns encoded in the model revealed that each emotion category is associated with unique, prototypical patterns of activity across multiple brain systems including the cortex, thalamus, amygdala, and other structures. The results indicate that emotion categories are not contained within any one region or system, but are represented as configurations across multiple brain networks. The model provides a precise summary of the prototypical patterns for each emotion category, and demonstrates that a sufficient characterization of emotion categories relies on (a) differential patterns of involvement in neocortical systems that differ between humans and other species, and (b) distinctive patterns of cortical-subcortical interactions. Thus, these findings are incompatible with several contemporary theories of emotion, including those that emphasize emotion-dedicated brain systems and those that propose emotion is localized primarily in subcortical activity. They are consistent with componential and constructionist views, which propose that emotions are differentiated by a combination of perceptual, mnemonic, prospective, and motivational elements. Such brain-based models of emotion provide a foundation for new translational and clinical approaches. PMID:25853490

  16. A High Performance Bayesian Computing Framework for Spatiotemporal Uncertainty Modeling

    Science.gov (United States)

    Cao, G.

    2015-12-01

    All types of spatiotemporal measurements are subject to uncertainty. As spatiotemporal data become increasingly involved in scientific research and decision making, it is important to appropriately model the impact of uncertainty. Quantitatively modeling spatiotemporal uncertainty, however, is a challenging problem considering the complex dependence and data heterogeneities. State-space models provide a unifying and intuitive framework for dynamic systems modeling. In this paper, we aim to extend the conventional state-space models for uncertainty modeling in space-time contexts while accounting for spatiotemporal effects and data heterogeneities. Gaussian Markov Random Field (GMRF) models, also known as conditional autoregressive models, are arguably the most commonly used methods for modeling spatially dependent data. GMRF models basically assume that a geo-referenced variable primarily depends on its neighborhood (Markov property), and the spatial dependence structure is described via a precision matrix. Recent study has shown that GMRFs are efficient approximations to the commonly used Gaussian fields (e.g., Kriging), and compared with Gaussian fields, GMRFs enjoy a series of appealing features, such as fast computation and easy accommodation of heterogeneities in spatial data (e.g., point and areal). This paper represents each spatial dataset as a GMRF and integrates them into a state-space form to statistically model the temporal dynamics. Different types of spatial measurements (e.g., categorical, count or continuous) can be accounted for by appropriate link functions. A fast alternative to the MCMC framework, the so-called Integrated Nested Laplace Approximation (INLA), was adopted for model inference. Preliminary case studies will be conducted to showcase the advantages of the described framework. In the first case, we apply the proposed method for modeling the water table elevation of the Ogallala aquifer over the past decades. In the second case, we analyze the
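
    The GMRF building block mentioned above can be sketched as follows: a proper CAR-style precision matrix on a small regular lattice, followed by a draw from the corresponding Gaussian. Lattice size and parameters are illustrative assumptions, not the aquifer application described in the record.

```python
import numpy as np

# Proper CAR precision matrix Q = tau * (D - rho * W) on an n x n lattice,
# where W is the first-order adjacency matrix and D its row-sum diagonal.
n = 5
idx = lambda i, j: i * n + j
W = np.zeros((n * n, n * n))
for i in range(n):
    for j in range(n):
        for di, dj in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
            if 0 <= i + di < n and 0 <= j + dj < n:
                W[idx(i, j), idx(i + di, j + dj)] = 1.0
D = np.diag(W.sum(axis=1))
tau, rho = 1.0, 0.9                           # precision scale, spatial dependence (|rho| < 1)
Q = tau * (D - rho * W)

# Draw x ~ N(0, Q^{-1}) via the Cholesky factor of Q: Cov(L^{-T} z) = Q^{-1}.
L = np.linalg.cholesky(Q)
z = np.random.default_rng(0).standard_normal(n * n)
x = np.linalg.solve(L.T, z)
print(x.reshape(n, n).round(2))
```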

  17. Bayesian Belief Networks Approach for Modeling Irrigation Behavior

    Science.gov (United States)

    Andriyas, S.; McKee, M.

    2012-12-01

    Canal operators need information to manage water deliveries to irrigators. Short-term irrigation demand forecasts can potentially provide valuable information for a canal operator who must manage an on-demand system. Such forecasts could be generated by using information about the decision-making processes of irrigators. Bayesian models of irrigation behavior can provide insight into the likely criteria which farmers use to make irrigation decisions. This paper develops a Bayesian belief network (BBN) to learn the irrigation decision-making behavior of farmers and utilizes the resulting model to make forecasts of future irrigation decisions based on factor interaction and posterior probabilities. Models for studying irrigation behavior have rarely been explored in the past. The model discussed here was built from a combination of data about biotic, climatic, and edaphic conditions under which observed irrigation decisions were made. The paper includes a case study using data collected from the Canal B region of the Sevier River, near Delta, Utah. Alfalfa, barley and corn are the main crops of the location. The model has been tested with a portion of the data to affirm its predictive capabilities. Irrigation rules were deduced in the process of learning and verified in the testing phase. It was found that most of the farmers used consistent rules throughout all years and across different types of crops. Soil moisture stress, which indicates the level of water available to the plant in the soil profile, was found to be one of the most significant likely driving forces for irrigation. Irrigations appeared to be triggered by a farmer's perception of soil stress, or by a perception of combined factors such as information about a neighbor irrigating or an apparent preference to irrigate on a weekend. Soil stress resulted in irrigation probabilities of 94.4% for alfalfa. With additional factors like weekend and irrigating when a neighbor irrigates, alfalfa irrigation

  18. Confronting different models of community structure to species-abundance data: a Bayesian model comparison

    NARCIS (Netherlands)

    Etienne, R.S.; Olff, H.

    2005-01-01

    Species abundances are undoubtedly the most widely available macroecological data, but can we use them to distinguish among several models of community structure? Here we present a Bayesian analysis of species-abundance data that yields a full joint probability distribution of each model's parameter

  19. Confronting different models of community structure to species-abundance data : a Bayesian model comparison

    NARCIS (Netherlands)

    Etienne, RS; Olff, H

    2005-01-01

    Species abundances are undoubtedly the most widely available macroecological data, but can we use them to distinguish among several models of community structure? Here we present a Bayesian analysis of species-abundance data that yields a full joint probability distribution of each model's parameter

  20. Calibration of Uncertainty Analysis of the SWAT Model Using Genetic Algorithms and Bayesian Model Averaging

    Science.gov (United States)

    In this paper, the Genetic Algorithms (GA) and Bayesian model averaging (BMA) were combined to simultaneously conduct calibration and uncertainty analysis for the Soil and Water Assessment Tool (SWAT). In this hybrid method, several SWAT models with different structures are first selected; next GA i...

  1. Optimizing the Amount of Models Taken into Consideration During Model Selection in Bayesian Networks

    OpenAIRE

    Castelo, J.R.; Siebes, Arno

    1999-01-01

    Graphical model selection from data embodies several difficulties. Among them, the size of the space of models over which one should carry out model selection is especially challenging, even when considering only a modest number of variables. This becomes more severe when one works with those graphical models where some variables may be responses to others. This is the case for Bayesian networks, which are modeled by acyclic digraphs. In this paper we try to reduce the amount of models taken into...

  2. Bayesian Analysis of Marginal Log-Linear Graphical Models for Three Way Contingency Tables

    OpenAIRE

    Ntzoufras, Ioannis; Tarantola, Claudia

    2008-01-01

    This paper deals with the Bayesian analysis of graphical models of marginal independence for three way contingency tables. We use a marginal log-linear parametrization, under which the model is defined through suitable zero-constraints on the interaction parameters calculated within marginal distributions. We undertake a comprehensive Bayesian analysis of these models, involving suitable choices of prior distributions, estimation, model determination, as well as the allied computational issue...

  3. Bayesian Analysis of Graphical Models of Marginal Independence for Three Way Contingency Tables

    OpenAIRE

    Tarantola, Claudia; Ntzoufras, Ioannis

    2012-01-01

    This paper deals with the Bayesian analysis of graphical models of marginal independence for three way contingency tables. Each marginal independence model corresponds to a particular factorization of the cell probabilities and a conjugate analysis based on Dirichlet prior can be performed. We illustrate a comprehensive Bayesian analysis of such models, involving suitable choices of prior parameters, estimation, model determination, as well as the allied computational issues. The posterior di...

  4. Beam-R1. A Bayesian European archeomagnetic model for the past 2000 years

    International Nuclear Information System (INIS)

    Complete text of publication follows. Ongoing efforts are currently being made to enrich the available archeomagnetic database and to represent the data in space and time by means of Spherical Harmonic models. However, it is well known that the sparse spatial and temporal data distribution at the worldwide scale hampers the modeling to high resolutions. The data density is comparatively higher in Europe than in any other part of the world for the past 2000 yr. This situation offers a good opportunity to model the archeomagnetic field at spatial scales higher than permitted by spherical harmonics. In principle, such regional modelling should give us the possibility of better exploring the reliability of the rapid spatial and time variations of the past magnetic field. We complement the recent regional modeling advances made by Pavon-Carrasco et al. (2009) with a new approach based on a Revised Spherical Cap Harmonic Analysis (R-SCHA2D; Thebault, 2008) and a Bayesian inversion scheme in order to process the most up-to-date archeomagnetic database for Europe. This allows us to solve non-linear inverse problems and to avoid the use of strong a priori information. We obtain the marginal probabilities of each regional parameter with a Markov chain Monte Carlo sampler at all epochs between 50BC and 1850AD. This helps us assess the reliability of some time variations. Our method provides the maximum likelihood of the magnetic field components, their probability distribution, but also misfit histograms for all individual data. This exhaustive statistical information allows us to conclude that directional data are self-consistent in Europe but that important issues are faced when considering intensity data. By a comparison with other regional and global models, we conclude that our new Bayesian model represents the data to higher spatial resolutions than what is obtained by other techniques. However, given the data error and uncertainties, this does not necessarily lead to

  5. A Bayesian spatio-temporal geostatistical model with an auxiliary lattice for large datasets

    KAUST Repository

    Xu, Ganggang

    2015-01-01

    When spatio-temporal datasets are large, the computational burden can lead to failures in the implementation of traditional geostatistical tools. In this paper, we propose a computationally efficient Bayesian hierarchical spatio-temporal model in which the spatial dependence is approximated by a Gaussian Markov random field (GMRF) while the temporal correlation is described using a vector autoregressive model. By introducing an auxiliary lattice on the spatial region of interest, the proposed method is not only able to handle irregularly spaced observations in the spatial domain, but it is also able to bypass the missing data problem in a spatio-temporal process. Because the computational complexity of the proposed Markov chain Monte Carlo algorithm is of the order O(n) with n the total number of observations in space and time, our method can be used to handle very large spatio-temporal datasets with reasonable CPU times. The performance of the proposed model is illustrated using simulation studies and a dataset of precipitation data from the coterminous United States.

  6. A Bayesian analysis of sensible heat flux estimation: Quantifying uncertainty in meteorological forcing to improve model prediction

    KAUST Repository

    Ershadi, Ali

    2013-05-01

    The influence of uncertainty in land surface temperature, air temperature, and wind speed on the estimation of sensible heat flux is analyzed using a Bayesian inference technique applied to the Surface Energy Balance System (SEBS) model. The Bayesian approach allows for an explicit quantification of the uncertainties in input variables: a source of error generally ignored in surface heat flux estimation. An application using field measurements from the Soil Moisture Experiment 2002 is presented. The spatial variability of selected input meteorological variables in a multitower site is used to formulate the prior estimates for the sampling uncertainties, and the likelihood function is formulated assuming Gaussian errors in the SEBS model. Land surface temperature, air temperature, and wind speed were estimated by sampling their posterior distribution using a Markov chain Monte Carlo algorithm. Results verify that Bayesian-inferred air temperature and wind speed were generally consistent with those observed at the towers, suggesting that local observations of these variables were spatially representative. Uncertainties in the land surface temperature appear to have the strongest effect on the estimated sensible heat flux, with Bayesian-inferred values differing by up to ±5°C from the observed data. These differences suggest that the footprint of the in situ measured land surface temperature is not representative of the larger-scale variability. As such, these measurements should be used with caution in the calculation of surface heat fluxes and highlight the importance of capturing the spatial variability in the land surface temperature: particularly, for remote sensing retrieval algorithms that use this variable for flux estimation.

  7. An evaluation of the Bayesian approach to fitting the N-mixture model for use with pseudo-replicated count data

    Science.gov (United States)

    Toribo, S.G.; Gray, B.R.; Liang, S.

    2011-01-01

    The N-mixture model proposed by Royle in 2004 may be used to approximate the abundance and detection probability of animal species in a given region. In 2006, Royle and Dorazio discussed the advantages of using a Bayesian approach in modelling animal abundance and occurrence using a hierarchical N-mixture model. N-mixture models assume replication on sampling sites, an assumption that may be violated when the site is not closed to changes in abundance during the survey period or when nominal replicates are defined spatially. In this paper, we studied the robustness of a Bayesian approach to fitting the N-mixture model for pseudo-replicated count data. Our simulation results showed that the Bayesian estimates for abundance and detection probability are slightly biased when the actual detection probability is small and are sensitive to the presence of extra variability within local sites.
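
    The N-mixture likelihood referred to above can be sketched directly: replicated counts at a site are Binomial(N, p) given a latent abundance N ~ Poisson(lambda), and the latent N is summed out up to a truncation point. The counts and parameter values below are illustrative, and this is the likelihood only; a Bayesian fit of the kind evaluated in the record would place priors on lambda and p.

```python
import numpy as np
from scipy.stats import poisson, binom

def site_likelihood(counts, lam, p, n_max=200):
    """Marginal likelihood of replicated counts at one site under the N-mixture model."""
    counts = np.asarray(counts)
    Ns = np.arange(counts.max(), n_max + 1)                       # possible latent abundances
    prior_N = poisson.pmf(Ns, lam)                                # P(N | lambda)
    obs = np.prod(binom.pmf(counts[:, None], Ns[None, :], p), axis=0)  # P(all replicates | N, p)
    return float(np.sum(prior_N * obs))                           # sum out the latent N

print(site_likelihood([3, 5, 4], lam=6.0, p=0.7))
```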

  8. Bayesian network model of crowd emotion and negative behavior

    Science.gov (United States)

    Ramli, Nurulhuda; Ghani, Noraida Abdul; Hatta, Zulkarnain Ahmad; Hashim, Intan Hashimah Mohd; Sulong, Jasni; Mahudin, Nor Diana Mohd; Rahman, Shukran Abd; Saad, Zarina Mat

    2014-12-01

    The effects of overcrowding have become a major concern for event organizers. One aspect of this concern has been the idea that overcrowding can enhance the occurrence of serious incidents during events. As one of the largest Muslim religious gatherings, attended by pilgrims from all over the world, Hajj has become extremely overcrowded, with many incidents being reported. The purpose of this study is to analyze the nature of human emotion and negative behavior resulting from overcrowding during Hajj events, using data gathered in the Malaysian Hajj Experience Survey in 2013. The sample comprised 147 Malaysian pilgrims (70 males and 77 females). Utilizing a probabilistic model called a Bayesian network, this paper models the dependence structure between different emotions and negative behaviors of pilgrims in the crowd. The model included the following emotion variables: negative, negative comfortable, positive, positive comfortable and positive spiritual; and the following negative behavior variables: aggressive and hazardous acts. The study demonstrated that the negative, negative comfortable, positive spiritual and positive emotions have a direct influence on aggressive behavior, whereas the negative comfortable, positive spiritual and positive emotions have a direct influence on hazardous acts. The sensitivity analysis showed that a low level of negative and negative comfortable emotions leads to a lower level of aggressive and hazardous behavior. Findings of the study can be further improved to identify the exact causes and risk factors of crowd-related incidents in preventing crowd disasters during mass gathering events.

  9. A Bayesian Semiparametric Model for Radiation Dose-Response Estimation.

    Science.gov (United States)

    Furukawa, Kyoji; Misumi, Munechika; Cologne, John B; Cullings, Harry M

    2016-06-01

    In evaluating the risk of exposure to health hazards, characterizing the dose-response relationship and estimating acceptable exposure levels are the primary goals. In analyses of health risks associated with exposure to ionizing radiation, while there is a clear agreement that moderate to high radiation doses cause harmful effects in humans, little has been known about the possible biological effects at low doses, for example, below 0.1 Gy, which is the dose range relevant to most radiation exposures of concern today. A conventional approach to radiation dose-response estimation based on simple parametric forms, such as the linear nonthreshold model, can be misleading in evaluating the risk and, in particular, its uncertainty at low doses. As an alternative approach, we consider a Bayesian semiparametric model that has a connected piece-wise-linear dose-response function with prior distributions having an autoregressive structure among the random slope coefficients defined over closely spaced dose categories. With a simulation study and application to analysis of cancer incidence data among Japanese atomic bomb survivors, we show that this approach can produce smooth and flexible dose-response estimation while reasonably handling the risk uncertainty at low doses and elsewhere. With relatively few assumptions and modeling options to be made by the analyst, the method can be particularly useful in assessing risks associated with low-dose radiation exposures. PMID:26581473

  10. Bayesian parametrization of coarse-grain dissipative dynamics models

    Science.gov (United States)

    Dequidt, Alain; Solano Canchaya, Jose G.

    2015-08-01

    We introduce a new bottom-up method for the optimization of dissipative coarse-grain models. The method is based on Bayesian optimization of the likelihood to reproduce a coarse-grained reference trajectory obtained from analysis of a higher resolution molecular dynamics trajectory. This new method is related to force matching techniques, but uses the total force on each grain averaged over a coarse time step instead of instantaneous forces. It has the advantage of not being limited to pairwise short-range interactions in the coarse-grain model and also yields an estimation of the friction parameter controlling the dynamics. The theory supporting the method is presented from a practical perspective, with an analytical solution for the optimal set of parameters. The method was first validated by using it on a system with a known optimum. The new method was then tested on a simple system: n-pentane. The local molecular structure of the optimized model is in excellent agreement with the reference system. An extension of the method also yields excellent agreement for the equilibrium density. The dynamic properties are also very satisfactory, but more sensitive to the choice of the coarse-grain representation. The quality of the final force field depends on the definition of the coarse-grain degrees of freedom and interactions. We consider this method a serious alternative to other methods such as iterative Boltzmann inversion, force matching, and Green-Kubo formulae.

  11. Bayesian Diagnostic Network: A Powerful Model for Representation and Reasoning of Engineering Diagnostic Knowledge

    Institute of Scientific and Technical Information of China (English)

    HU Zhao-yong

    2005-01-01

    Engineering diagnosis is essential to the operation of industrial equipment. The key to successful diagnosis is correct knowledge representation and reasoning, and the Bayesian network is a powerful tool for both. This paper utilizes a Bayesian network to represent and reason about diagnostic knowledge, termed a Bayesian diagnostic network. It provides a three-layer topologic structure based on operating conditions, possible faults and corresponding symptoms. The paper also discusses an approximate stochastic sampling algorithm. A practical Bayesian network for gas turbine diagnosis is then constructed on a platform developed under a Visual C++ environment. The results show that the Bayesian network is a powerful model for representation and reasoning of diagnostic knowledge, and that the three-layer structure and the approximate algorithm are also effective.

  12. Bayesian Influence Measures for Joint Models for Longitudinal and Survival Data

    OpenAIRE

    Zhu, Hongtu; Ibrahim, Joseph G.; Chi, Yueh-Yun; Tang, Niansheng

    2012-01-01

    This article develops a variety of influence measures for carrying out perturbation (or sensitivity) analysis to joint models of longitudinal and survival data (JMLS) in Bayesian analysis. A perturbation model is introduced to characterize individual and global perturbations to the three components of a Bayesian model, including the data points, the prior distribution, and the sampling distribution. Local influence measures are proposed to quantify the degree of these perturbations to the JML...

  13. Bayesian inference for partially identified models exploring the limits of limited data

    CERN Document Server

    Gustafson, Paul

    2015-01-01

    Contents: Introduction; Identification; What Is against Us?; What Is for Us?; Some Simple Examples of Partially Identified Models; The Road Ahead; The Structure of Inference in Partially Identified Models; Bayesian Inference; The Structure of Posterior Distributions in PIMs; Computational Strategies; Strength of Bayesian Updating, Revisited; Posterior Moments; Credible Intervals; Evaluating the Worth of Inference; Partial Identification versus Model Misspecification; The Siren Call of Identification; Comp

  14. Macroscopic Models of Clique Tree Growth for Bayesian Networks

    Data.gov (United States)

    National Aeronautics and Space Administration — In clique tree clustering, inference consists of propagation in a clique tree compiled from a Bayesian network. In this paper, we develop an analytical approach to...

  15. Nitrate source apportionment in a subtropical watershed using Bayesian model

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Liping; Han, Jiangpei; Xue, Jianlong; Zeng, Lingzao [College of Environmental and Natural Resource Sciences, Zhejiang Provincial Key Laboratory of Subtropical Soil and Plant Nutrition, Zhejiang University, Hangzhou, 310058 (China); Shi, Jiachun, E-mail: jcshi@zju.edu.cn [College of Environmental and Natural Resource Sciences, Zhejiang Provincial Key Laboratory of Subtropical Soil and Plant Nutrition, Zhejiang University, Hangzhou, 310058 (China); Wu, Laosheng, E-mail: laowu@zju.edu.cn [College of Environmental and Natural Resource Sciences, Zhejiang Provincial Key Laboratory of Subtropical Soil and Plant Nutrition, Zhejiang University, Hangzhou, 310058 (China); Jiang, Yonghai [State Key Laboratory of Environmental Criteria and Risk Assessment, Chinese Research Academy of Environmental Sciences, Beijing, 100012 (China)

    2013-10-01

    Nitrate (NO3−) pollution in aquatic system is a worldwide problem. The temporal distribution pattern and sources of nitrate are of great concern for water quality. The nitrogen (N) cycling processes in a subtropical watershed located in Changxing County, Zhejiang Province, China were greatly influenced by the temporal variations of precipitation and temperature during the study period (September 2011 to July 2012). The highest NO3− concentration in water was in May (wet season, mean ± SD = 17.45 ± 9.50 mg L−1) and the lowest concentration occurred in December (dry season, mean ± SD = 10.54 ± 6.28 mg L−1). Nevertheless, no water sample in the study area exceeds the WHO drinking water limit of 50 mg L−1 NO3−. Four sources of NO3− (atmospheric deposition, AD; soil N, SN; synthetic fertilizer, SF; manure and sewage, M and S) were identified using both hydrochemical characteristics [Cl−, NO3−, HCO3−, SO42−, Ca2+, K+, Mg2+, Na+, dissolved oxygen (DO)] and dual isotope approach (δ15N–NO3− and δ18O–NO3−). Both chemical and isotopic characteristics indicated that denitrification was not the main N cycling process in the study area. Using a Bayesian model (stable isotope analysis in R, SIAR), the contribution of each source was apportioned. Source apportionment results showed that source contributions differed significantly between the dry and wet season, AD and M and S contributed more in December than in May. In contrast, SN and SF contributed more NO3− to water in May than that in December. M and S and SF were the major contributors in December and May, respectively. Moreover, the shortcomings and uncertainties of SIAR were discussed to provide implications for future works. With the assessment of temporal variation and sources of NO3−, better

  16. Nitrate source apportionment in a subtropical watershed using Bayesian model

    International Nuclear Information System (INIS)

    Nitrate (NO3−) pollution in aquatic system is a worldwide problem. The temporal distribution pattern and sources of nitrate are of great concern for water quality. The nitrogen (N) cycling processes in a subtropical watershed located in Changxing County, Zhejiang Province, China were greatly influenced by the temporal variations of precipitation and temperature during the study period (September 2011 to July 2012). The highest NO3− concentration in water was in May (wet season, mean ± SD = 17.45 ± 9.50 mg L−1) and the lowest concentration occurred in December (dry season, mean ± SD = 10.54 ± 6.28 mg L−1). Nevertheless, no water sample in the study area exceeds the WHO drinking water limit of 50 mg L−1 NO3−. Four sources of NO3− (atmospheric deposition, AD; soil N, SN; synthetic fertilizer, SF; manure and sewage, M and S) were identified using both hydrochemical characteristics [Cl−, NO3−, HCO3−, SO42−, Ca2+, K+, Mg2+, Na+, dissolved oxygen (DO)] and dual isotope approach (δ15N–NO3− and δ18O–NO3−). Both chemical and isotopic characteristics indicated that denitrification was not the main N cycling process in the study area. Using a Bayesian model (stable isotope analysis in R, SIAR), the contribution of each source was apportioned. Source apportionment results showed that source contributions differed significantly between the dry and wet season, AD and M and S contributed more in December than in May. In contrast, SN and SF contributed more NO3− to water in May than that in December. M and S and SF were the major contributors in December and May, respectively. Moreover, the shortcomings and uncertainties of SIAR were discussed to provide implications for future works. With the assessment of temporal variation and sources of NO3−, better agricultural management practices and sewage disposal programs can be implemented to sustain water quality in subtropical watersheds. - Highlights: • Nitrate concentration in water displayed
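
    The stable-isotope mixing idea behind SIAR, referred to in both records above, can be sketched as a crude accept/reject (ABC-style) approximation rather than the full Bayesian mixing model: draw source proportions from a Dirichlet prior and keep draws whose mixed signature falls close to the observed sample. The source signatures, observation, and tolerance below are illustrative placeholders, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)
sources = {                      # hypothetical (d15N, d18O) source signatures
    "atmospheric deposition": (2.0, 60.0),
    "soil N": (5.0, 5.0),
    "synthetic fertilizer": (0.0, 22.0),
    "manure & sewage": (12.0, 8.0),
}
mu = np.array(list(sources.values()))            # (4, 2) source means
obs = np.array([6.5, 12.0])                      # observed mixture signature
tol = 1.5                                        # acceptance tolerance

props = rng.dirichlet(np.ones(len(sources)), size=200000)   # prior draws of proportions
mixed = props @ mu                                            # implied mixture signatures
keep = props[np.linalg.norm(mixed - obs, axis=1) < tol]       # approximate posterior draws
print(dict(zip(sources, keep.mean(axis=0).round(2))), "accepted:", len(keep))
```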

  17. Bayesian auxiliary variable models for binary and multinomial regression

    OpenAIRE

    Holmes, C C; HELD, L.

    2006-01-01

    In this paper we discuss auxiliary variable approaches to Bayesian binary and multinomial regression. These approaches are ideally suited to automated Markov chain Monte Carlo simulation. In the first part we describe a simple technique using joint updating that improves the performance of the conventional probit regression algorithm. In the second part we discuss auxiliary variable methods for inference in Bayesian logistic regression, including covariate set uncertainty. Fina...

  18. A flexible bayesian model for testing for transmission ratio distortion.

    Science.gov (United States)

    Casellas, Joaquim; Manunza, Arianna; Mercader, Anna; Quintanilla, Raquel; Amills, Marcel

    2014-12-01

    Current statistical approaches to investigate the nature and magnitude of transmission ratio distortion (TRD) are scarce and restricted to the most common experimental designs such as F2 populations and backcrosses. In this article, we describe a new Bayesian approach to check TRD within a given biallelic genetic marker in a diploid species, providing a highly flexible framework that can accommodate any kind of population structure. This model relies on the genotype of each offspring and thus integrates all available information from either the parents' genotypes or population-specific allele frequencies and yields TRD estimates that can be corroborated by the calculation of a Bayes factor (BF). This approach has been evaluated on simulated data sets with appealing statistical performance. As a proof of concept, we have also tested TRD in a porcine population with five half-sib families and 352 offspring. All boars and piglets were genotyped with the Porcine SNP60 BeadChip, whereas genotypes from the sows were not available. The SNP-by-SNP screening of the pig genome revealed 84 SNPs with decisive evidences of TRD (BF > 100) after accounting for multiple testing. Many of these regions contained genes related to biological processes (e.g., nucleosome assembly and co-organization, DNA conformation and packaging, and DNA complex assembly) that are critically associated with embryonic viability. The implementation of this method, which overcomes many of the limitations of previous approaches, should contribute to fostering research on TRD in both model and nonmodel organisms. PMID:25271302
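    As a minimal illustration of how a Bayes factor for transmission ratio distortion at a single biallelic marker might be computed, the sketch below contrasts Mendelian segregation (transmission probability 0.5) with a free transmission probability under a uniform Beta(1,1) prior. The offspring counts are hypothetical, and the published model is considerably more flexible, accommodating arbitrary population structures and missing parental genotypes.

```python
from math import lgamma, exp, log

def log_beta(a, b):
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def trd_bayes_factor(k, n):
    """Bayes factor for a free transmission probability (Beta(1,1) prior) versus
    Mendelian segregation p = 0.5, given k of n transmitted copies of one allele.
    The binomial coefficient cancels in the ratio and is omitted."""
    log_m1 = log_beta(k + 1, n - k + 1) - log_beta(1, 1)   # marginal likelihood, free p
    log_m0 = n * log(0.5)                                  # likelihood at p = 0.5
    return exp(log_m1 - log_m0)

# Hypothetical marker: 230 of 352 offspring inherited the same parental allele
bf = trd_bayes_factor(230, 352)
print(f"BF = {bf:.1f}", "-> decisive evidence of TRD" if bf > 100 else "")
```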

  19. A Bayesian model of context-sensitive value attribution

    Science.gov (United States)

    Rigoli, Francesco; Friston, Karl J; Martinelli, Cristina; Selaković, Mirjana; Shergill, Sukhwinder S; Dolan, Raymond J

    2016-01-01

    Substantial evidence indicates that incentive value depends on an anticipation of rewards within a given context. However, the computations underlying this context sensitivity remain unknown. To address this question, we introduce a normative (Bayesian) account of how rewards map to incentive values. This assumes that the brain inverts a model of how rewards are generated. Key features of our account include (i) an influence of prior beliefs about the context in which rewards are delivered (weighted by their reliability in a Bayes-optimal fashion), (ii) the notion that incentive values correspond to precision-weighted prediction errors, and (iii) contextual information unfolding at different hierarchical levels. This formulation implies that incentive value is intrinsically context-dependent. We provide empirical support for this model by showing that incentive value is influenced by context variability and by hierarchically nested contexts. The perspective we introduce generates new empirical predictions that might help explain psychopathologies, such as addiction. DOI: http://dx.doi.org/10.7554/eLife.16127.001 PMID:27328323

  20. Providing a Connection between a Bayesian Inverse Modeling Tool and a Coupled Hydrogeological Processes Modeling Software

    Science.gov (United States)

    Frystacky, H.; Osorio-Murillo, C. A.; Over, M. W.; Kalbacher, T.; Gunnell, D.; Kolditz, O.; Ames, D.; Rubin, Y.

    2013-12-01

    The Method of Anchored Distributions (MAD) is a Bayesian technique for characterizing the uncertainty in geostatistical model parameters. Open-source software has been developed in a modular framework such that this technique can be applied to any forward model software via a driver. This presentation is about the driver that has been developed for OpenGeoSys (OGS), open-source software that can simulate many hydrogeological processes, including coupled processes. MAD allows the use of multiple data types for conditioning the spatial random fields and assessing model parameter likelihood. For example, if simulating flow and mass transport, the inversion target variable could be hydraulic conductivity and the inversion data types could be head, concentration, or both. The driver detects from the OGS files which processes and variables are being used in a given project and allows MAD to prompt the user to choose those that are to be modeled or to be treated deterministically. In this way, any combination of processes allowed by OGS can have MAD applied. As for the software, there are two versions, each with its own OGS driver. A Windows desktop version is available as a graphical user interface and is ideal for the learning and teaching environment. High-throughput computing can even be achieved with this version via HTCondor if large projects are to be pursued in a computer lab. In addition to this desktop application, a Linux version is available equipped with MPI such that it can be run in parallel on a computer cluster. All releases can be downloaded from the MAD Codeplex site given below.

  1. Errata: A survey of Bayesian predictive methods for model assessment, selection and comparison

    Directory of Open Access Journals (Sweden)

    Aki Vehtari

    2014-03-01

    Full Text Available Errata for “A survey of Bayesian predictive methods for model assessment, selection and comparison” by A. Vehtari and J. Ojanen, Statistics Surveys, 6 (2012, 142–228. doi:10.1214/12-SS102.

  2. Bayesian Safety Risk Modeling of Human-Flightdeck Automation Interaction

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2015-01-01

    Usage of automatic systems in airliners has increased fuel efficiency, added extra capabilities, enhanced safety and reliability, as well as provided improved passenger comfort since their introduction in the late 80s. However, original automation benefits, including reduced flight crew workload, human errors or training requirements, were not achieved as originally expected. Instead, automation introduced new failure modes, redistributed and sometimes increased workload, brought in new cognitive and attention demands, and increased training requirements. Modern airliners have numerous flight modes, providing more flexibility (and inherently more complexity) to the flight crew. However, the price to pay for the increased flexibility is the need for increased mode awareness, as well as the need to supervise, understand, and predict automated system behavior. Also, over-reliance on automation is linked to manual flight skill degradation and complacency in commercial pilots. As a result, recent accidents involving human errors are often caused by the interactions between humans and the automated systems (e.g., the breakdown in man-machine coordination), deteriorated manual flying skills, and/or loss of situational awareness due to heavy dependence on automated systems. This paper describes the development of the increased complexity and reliance on automation baseline model, named FLAP for FLightdeck Automation Problems. The model development process starts with a comprehensive literature review followed by the construction of a framework comprised of high-level causal factors leading to an automation-related flight anomaly. The framework was then converted into a Bayesian Belief Network (BBN) using the Hugin Software v7.8. The effects of automation on flight crew are incorporated into the model, including flight skill degradation, increased cognitive demand and training requirements along with their interactions. Besides flight crew deficiencies, automation system

  3. The Lumiere Project: Bayesian User Modeling for Inferring the Goals and Needs of Software Users

    OpenAIRE

    Horvitz, Eric J.; Breese, John S.; Heckerman, David; Hovel, David; Rommelse, Koos

    2013-01-01

    The Lumiere Project centers on harnessing probability and utility to provide assistance to computer software users. We review work on Bayesian user models that can be employed to infer a user's needs by considering a user's background, actions, and queries. Several problems were tackled in Lumiere research, including (1) the construction of Bayesian models for reasoning about the time-varying goals of computer users from their observed actions and queries, (2) gaining access to a stream of eve...

  4. Bayesian modelling of clusters of galaxies from multi-frequency pointed Sunyaev--Zel'dovich observations

    OpenAIRE

    Feroz, F.; Hobson, M. P.; Zwart, J T L; Saunders, R. D. E.; Grainge, K. J. B.

    2008-01-01

    We present a Bayesian approach to modelling galaxy clusters using multi-frequency pointed observations from telescopes that exploit the Sunyaev--Zel'dovich effect. We use the recently developed MultiNest technique (Feroz, Hobson & Bridges, 2008) to explore the high-dimensional parameter spaces and also to calculate the Bayesian evidence. This permits robust parameter estimation as well as model comparison. Tests on simulated Arcminute Microkelvin Imager observations of a cluster, in the prese...

  5. Food Reconstruction Using Isotopic Transferred Signals (FRUITS): A Bayesian Model for Diet Reconstruction

    OpenAIRE

    Fernandes, Ricardo; Millard, Andrew R.; Brabec, Marek; Nadeau, Marie-Josée; Grootes, Pieter

    2014-01-01

    Human and animal diet reconstruction studies that rely on tissue chemical signatures aim at providing estimates on the relative intake of potential food groups. However, several sources of uncertainty need to be considered when handling data. Bayesian mixing models provide a natural platform to handle diverse sources of uncertainty while allowing the user to contribute with prior expert information. The Bayesian mixing model FRUITS (Food Reconstruction Using Isotopic Transferred Signals) was ...

  6. Operational risk modelling and organizational learning in structured finance operations: a Bayesian network approach

    OpenAIRE

    Andrew Sanford; Imad Moosa

    2015-01-01

    This paper describes the development of a tool, based on a Bayesian network model, that provides posteriori predictions of operational risk events, aggregate operational loss distributions, and Operational Value-at-Risk, for a structured finance operations unit located within one of Australia's major banks. The Bayesian network, based on a previously developed causal framework, has been designed to model the smaller and more frequent, attritional operational loss events. Given the limited ava...

  7. Bayesian meta-analysis models for microarray data: a comparative study

    OpenAIRE

    Song Joon J; Conlon Erin M; Liu Anna

    2007-01-01

    Abstract Background With the growing abundance of microarray data, statistical methods are increasingly needed to integrate results across studies. Two common approaches for meta-analysis of microarrays include either combining gene expression measures across studies or combining summaries such as p-values, probabilities or ranks. Here, we compare two Bayesian meta-analysis models that are analogous to these methods. Results Two Bayesian meta-analysis models for microarray data have recently ...

  8. A Bayesian hierarchical model for reconstructing relative sea level: from raw data to rates of change

    OpenAIRE

    Cahill, N.; Kemp, A. C.; Horton, B. P.; Parnell, A.C.

    2015-01-01

    We present a holistic Bayesian hierarchical model for reconstructing the continuous and dynamic evolution of relative sea-level (RSL) change with fully quantified uncertainty. The reconstruction is produced from biological (foraminifera) and geochemical (δ13C) sea-level indicators preserved in dated cores of salt-marsh sediment. Our model is comprised of three modules: (1) A Bayesian transfer function for the calibration of foraminifera into tidal elevation,...

  9. A Bayesian hierarchical model for reconstructing relative sea level: from raw data to rates of change

    OpenAIRE

    Cahill, Niamh; Kemp, Andrew C.; Horton, Benjamin P.; Andrew C Parnell

    2016-01-01

    We present a Bayesian hierarchical model for reconstructing the continuous and dynamic evolution of relative sea-level (RSL) change with quantified uncertainty. The reconstruction is produced from biological (foraminifera) and geochemical (δ13C) sea-level indicators preserved in dated cores of salt-marsh sediment. Our model is comprised of three modules: (1) a new Bayesian transfer (B-TF) function for the calibration of biological indicators into tidal elevation, which is fl...

  10. Modeling Spatial and Temporal Dependencies between Earthquakes

    OpenAIRE

    2000-01-01

    Two new different stochastic models for earthquake occurrence are discussed. Both models are focusing on the spatio-temporal interactions between earthquakes. The parameters of the models are estimated from a Bayesian updating of priors, using empirical data to derive posterior distributions. The first model is a marked point process model in which each earthquake is represented by its magnitude and coordinates in space and time. This model incorporates the occurrence of aftershocks as well a...

  11. Bayesian network as a modelling tool for risk management in agriculture

    DEFF Research Database (Denmark)

    Rasmussen, Svend; Madsen, Anders L.; Lund, Mogens

    The importance of risk management increases as farmers become more exposed to risk. But risk management is a difficult topic because income risk is the result of the complex interaction of multiple risk factors combined with the effect of an increasing array of possible risk management tools. In this paper we use Bayesian networks as an integrated modelling approach for representing uncertainty and analysing risk management in agriculture. It is shown how historical farm account data may be efficiently used to estimate conditional probabilities, which are the core elements in Bayesian network models. We further show how the Bayesian network model RiBay is used for stochastic simulation of farm income, and we demonstrate how RiBay can be used to simulate risk management at the farm level. It is concluded that the key strength of a Bayesian network is the transparency of assumptions, and that...

  12. Multimethod, multistate Bayesian hierarchical modeling approach for use in regional monitoring of wolves.

    Science.gov (United States)

    Jiménez, José; García, Emilio J; Llaneza, Luis; Palacios, Vicente; González, Luis Mariano; García-Domínguez, Francisco; Múñoz-Igualada, Jaime; López-Bao, José Vicente

    2016-08-01

    In many cases, the first step in large-carnivore management is to obtain objective, reliable, and cost-effective estimates of population parameters through procedures that are reproducible over time. However, monitoring predators over large areas is difficult, and the data have a high level of uncertainty. We devised a practical multimethod and multistate modeling approach based on Bayesian hierarchical-site-occupancy models that combined multiple survey methods to estimate different population states for use in monitoring large predators at a regional scale. We used wolves (Canis lupus) as our model species and generated reliable estimates of the number of sites with wolf reproduction (presence of pups). We used 2 wolf data sets from Spain (Western Galicia in 2013 and Asturias in 2004) to test the approach. Based on howling surveys, the naïve estimation (i.e., estimate based only on observations) of the number of sites with reproduction was 9 and 25 sites in Western Galicia and Asturias, respectively. Our model showed 33.4 (SD 9.6) and 34.4 (3.9) sites with wolf reproduction, respectively. The number of occupied sites with wolf reproduction was 0.67 (SD 0.19) and 0.76 (0.11), respectively. This approach can be used to design more cost-effective monitoring programs (i.e., to define the sampling effort needed per site). Our approach should inspire well-coordinated surveys across multiple administrative borders and populations and lead to improved decision making for management of large carnivores on a landscape level. The use of this Bayesian framework provides a simple way to visualize the degree of uncertainty around population-parameter estimates and thus provides managers and stakeholders an intuitive approach to interpreting monitoring results. Our approach can be widely applied to large spatial scales in wildlife monitoring where detection probabilities differ between population states and where several methods are being used to estimate different population

  13. ESTIMATE OF THE HYPSOMETRIC RELATIONSHIP WITH NONLINEAR MODELS FITTED BY EMPIRICAL BAYESIAN METHODS

    Directory of Open Access Journals (Sweden)

    Monica Fabiana Bento Moreira

    2015-09-01

    Full Text Available In this paper we propose a Bayesian approach to solve the inference problem with restrictions on parameters, regarding nonlinear models used to represent the hypsometric relationship in clones of Eucalyptus sp. The Bayesian estimates are calculated using the Markov Chain Monte Carlo (MCMC) method. The proposed method was applied to different groups of actual data from which two were selected to show the results. These results were compared to the results achieved by the least squares method, highlighting the superiority of the Bayesian approach, since this approach always generates biologically consistent results for the hypsometric relationship.

  14. Improving in situ data acquisition using training images and a Bayesian mixture model

    Science.gov (United States)

    Abdollahifard, Mohammad Javad; Mariethoz, Gregoire; Pourfard, Mohammadreza

    2016-06-01

    Estimating the spatial distribution of physical processes using a minimum number of samples is of vital importance in earth science applications where sampling is costly. In recent years, training image-based methods have received a lot of attention for interpolation and simulation. However, training images have never been employed to optimize the spatial sampling process. In this paper, a sequential compressive sampling method is presented which decides the location of new samples based on a training image. First, a Bayesian mixture model is developed based on the training patterns. Then, using this model, unknown values are estimated based on a limited number of random samples. Since the model is probabilistic, it allows estimating local uncertainty conditional on the available samples. Based on this, new samples are sequentially extracted from the locations with maximum uncertainty. Experiments show that compared to a random sampling strategy, the proposed supervised sampling method significantly reduces the number of samples needed to achieve the same level of accuracy, even when the training image is not optimally chosen. The method has the potential to reduce the number of observations necessary for the characterization of environmental processes.

  15. A hierarchical model for spatial capture-recapture data

    Science.gov (United States)

    Royle, J. Andrew; Young, K.V.

    2008-01-01

    Estimating density is a fundamental objective of many animal population studies. Application of methods for estimating population size from ostensibly closed populations is widespread, but ineffective for estimating absolute density because most populations are subject to short-term movements or so-called temporary emigration. This phenomenon invalidates the resulting estimates because the effective sample area is unknown. A number of methods involving the adjustment of estimates based on heuristic considerations are in widespread use. In this paper, a hierarchical model of spatially indexed capture recapture data is proposed for sampling based on area searches of spatial sample units subject to uniform sampling intensity. The hierarchical model contains explicit models for the distribution of individuals and their movements, in addition to an observation model that is conditional on the location of individuals during sampling. Bayesian analysis of the hierarchical model is achieved by the use of data augmentation, which allows for a straightforward implementation in the freely available software WinBUGS. We present results of a simulation study that was carried out to evaluate the operating characteristics of the Bayesian estimator under variable densities and movement patterns of individuals. An application of the model is presented for survey data on the flat-tailed horned lizard (Phrynosoma mcallii) in Arizona, USA.

  16. Predicting water main failures using Bayesian model averaging and survival modelling approach

    International Nuclear Information System (INIS)

    To develop an effective preventive or proactive repair and replacement action plan, water utilities often rely on water main failure prediction models. However, in predicting the failure of water mains, uncertainty is inherent regardless of the quality and quantity of data used in the model. To improve the understanding of water main failure, a Bayesian framework is developed for predicting the failure of water mains considering uncertainties. In this study, the Bayesian model averaging (BMA) method is presented to identify the influential pipe-dependent and time-dependent covariates considering model uncertainties, whereas a Bayesian Weibull Proportional Hazard Model (BWPHM) is applied to develop the survival curves and to predict the failure rates of water mains. To accredit the proposed framework, it is implemented to predict the failure of cast iron (CI) and ductile iron (DI) pipes of the water distribution network of the City of Calgary, Alberta, Canada. Results indicate that the predicted 95% uncertainty bounds of the proposed BWPHMs effectively capture the observed breaks for both CI and DI water mains. Moreover, the performance of the proposed BWPHMs is better compared to the Cox Proportional Hazard Model (Cox-PHM), since they consider a Weibull distribution for the baseline hazard function as well as model uncertainties. - Highlights: • Prioritize rehabilitation and replacements (R/R) strategies of water mains. • Consider the uncertainties for the failure prediction. • Improve the prediction capability of the water mains failure models. • Identify the influential and appropriate covariates for different models. • Determine the effects of the covariates on failure

  17. Modern methodology and applications in spatial-temporal modeling

    CERN Document Server

    Matsui, Tomoko

    2015-01-01

    This book provides a modern introductory tutorial on specialized methodological and applied aspects of spatial and temporal modeling. The areas covered involve a range of topics which reflect the diversity of this domain of research across a number of quantitative disciplines. For instance, the first chapter deals with non-parametric Bayesian inference via a recently developed framework known as kernel mean embedding which has had a significant influence in machine learning disciplines. The second chapter takes up non-parametric statistical methods for spatial field reconstruction and exceedance probability estimation based on Gaussian process-based models in the context of wireless sensor network data. The third chapter presents signal-processing methods applied to acoustic mood analysis based on music signal analysis. The fourth chapter covers models that are applicable to time series modeling in the domain of speech and language processing. This includes aspects of factor analysis, independent component an...

  18. Bayesian Model Averaging of Artificial Intelligence Models for Hydraulic Conductivity Estimation

    Science.gov (United States)

    Nadiri, A.; Chitsazan, N.; Tsai, F. T.; Asghari Moghaddam, A.

    2012-12-01

    This research presents a Bayesian artificial intelligence model averaging (BAIMA) method that incorporates multiple artificial intelligence (AI) models to estimate hydraulic conductivity and evaluate estimation uncertainties. Uncertainty in the AI model outputs stems from error in model input as well as non-uniqueness in selecting different AI methods. Using one single AI model tends to bias the estimation and underestimate uncertainty. BAIMA employs Bayesian model averaging (BMA) technique to address the issue of using one single AI model for estimation. BAIMA estimates hydraulic conductivity by averaging the outputs of AI models according to their model weights. In this study, the model weights were determined using the Bayesian information criterion (BIC) that follows the parsimony principle. BAIMA calculates the within-model variances to account for uncertainty propagation from input data to AI model output. Between-model variances are evaluated to account for uncertainty due to model non-uniqueness. We employed Takagi-Sugeno fuzzy logic (TS-FL), artificial neural network (ANN) and neurofuzzy (NF) to estimate hydraulic conductivity for the Tasuj plain aquifer, Iran. BAIMA combined three AI models and produced better fitting than individual models. While NF was expected to be the best AI model owing to its utilization of both TS-FL and ANN models, the NF model is nearly discarded by the parsimony principle. The TS-FL model and the ANN model showed equal importance although their hydraulic conductivity estimates were quite different. This resulted in significant between-model variances that are normally ignored by using one AI model.
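    A hedged sketch of the averaging step described above: model weights derived from BIC values, a weighted point estimate, and a total variance split into within-model and between-model components. The numbers are invented stand-ins for the TS-FL, ANN and NF outputs, not results from the Tasuj plain study.

```python
import numpy as np

# Hypothetical log-K estimates and within-model variances from three AI models
# (stand-ins for TS-FL, ANN and NF), plus BIC values from their calibration.
pred = np.array([1.8, 2.4, 2.1])        # point estimates at one location
var_within = np.array([0.20, 0.25, 0.15])
bic = np.array([112.0, 113.5, 120.0])   # smaller BIC = more parsimonious fit

# BMA weights from BIC differences (standard approximation)
delta = bic - bic.min()
w = np.exp(-0.5 * delta)
w /= w.sum()

bma_mean = np.sum(w * pred)
var_between = np.sum(w * (pred - bma_mean)**2)   # uncertainty from model non-uniqueness
bma_var = np.sum(w * var_within) + var_between   # total BMA variance

print(f"weights: {w.round(3)}")
print(f"BMA estimate: {bma_mean:.2f}, total variance: {bma_var:.3f} "
      f"(within {np.sum(w * var_within):.3f} + between {var_between:.3f})")
```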

  19. Bayesian Proteoform Modeling Improves Protein Quantification of Global Proteomic Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Datta, Susmita; Payne, Samuel H.; Kang, Jiyun; Bramer, Lisa M.; Nicora, Carrie D.; Shukla, Anil K.; Metz, Thomas O.; Rodland, Karin D.; Smith, Richard D.; Tardiff, Mark F.; McDermott, Jason E.; Pounds, Joel G.; Waters, Katrina M.

    2014-12-01

    As the capability of mass spectrometry-based proteomics has matured, tens of thousands of peptides can be measured simultaneously, which has the benefit of offering a systems view of protein expression. However, a major challenge is that with an increase in throughput, protein quantification estimation from the native measured peptides has become a computational task. A limitation to existing computationally-driven protein quantification methods is that most ignore protein variation, such as alternate splicing of the RNA transcript and post-translational modifications or other possible proteoforms, which will affect a significant fraction of the proteome. The consequence of this assumption is that statistical inference at the protein level, and consequently downstream analyses, such as network and pathway modeling, have only limited power for biomarker discovery. Here, we describe a Bayesian model (BP-Quant) that uses statistically derived peptide signatures to identify peptides that are outside the dominant pattern, or the existence of multiple over-expressed patterns, to improve relative protein abundance estimates. It is a research-driven approach that utilizes the objectives of the experiment, defined in the context of a standard statistical hypothesis, to identify a set of peptides exhibiting similar statistical behavior relating to a protein. This approach infers that changes in relative protein abundance can be used as a surrogate for changes in function, without necessarily taking into account the effect of differential post-translational modifications, processing, or splicing in altering protein function. We verify the approach using a dilution study from mouse plasma samples and demonstrate that BP-Quant achieves similar accuracy as the current state-of-the-art methods at proteoform identification with significantly better specificity. BP-Quant is available as MATLAB and R packages at https://github.com/PNNL-Comp-Mass-Spec/BP-Quant.

  20. Bayesian data analysis

    CERN Document Server

    Gelman, Andrew; Stern, Hal S; Dunson, David B; Vehtari, Aki; Rubin, Donald B

    2013-01-01

    FUNDAMENTALS OF BAYESIAN INFERENCE: Probability and Inference; Single-Parameter Models; Introduction to Multiparameter Models; Asymptotics and Connections to Non-Bayesian Approaches; Hierarchical Models. FUNDAMENTALS OF BAYESIAN DATA ANALYSIS: Model Checking; Evaluating, Comparing, and Expanding Models; Modeling Accounting for Data Collection; Decision Analysis. ADVANCED COMPUTATION: Introduction to Bayesian Computation; Basics of Markov Chain Simulation; Computationally Efficient Markov Chain Simulation; Modal and Distributional Approximations. REGRESSION MODELS: Introduction to Regression Models; Hierarchical Linear

  1. Bayesian Network Based Fault Prognosis via Bond Graph Modeling of High-Speed Railway Traction Device

    Directory of Open Access Journals (Sweden)

    Yunkai Wu

    2015-01-01

    To predict component-level faults accurately for a high-speed railway traction system, a fault prognosis approach via Bayesian network and bond graph modeling techniques is proposed. The inherent structure of a railway traction system is represented by a bond graph model, based on which a multilayer Bayesian network is developed for fault propagation analysis and fault prediction. For complete and incomplete data sets, two different parameter learning algorithms, Bayesian estimation and the expectation maximization (EM) algorithm, are adopted to determine the conditional probability table of the Bayesian network. The proposed prognosis approach using Pearl’s polytree propagation algorithm for joint probability reasoning can predict the failure probabilities of leaf nodes based on the current status of root nodes. Verification results in a high-speed railway traction simulation system can demonstrate the effectiveness of the proposed approach.

  2. Bayesian log-periodic model for financial crashes

    DEFF Research Database (Denmark)

    Rodríguez-Caballero, Carlos Vladimir; Knapik, Oskar

    2014-01-01

    This paper introduces a Bayesian approach in econophysics literature about financial bubbles in order to estimate the most probable time for a financial crash to occur. To this end, we propose using noninformative prior distributions to obtain posterior distributions. Since these distributions...... part of the study, we analyze a well-known example of financial bubble – the S&P 500 1987 crash – to show the usefulness of the three methods under consideration and crashes of Merval-94, Bovespa-97, IPCMX-94, Hang Seng-97 using the simplest method. The novelty of this research is that the Bayesian...

  3. Applications of Bayesian approach in modelling risk of malaria-related hospital mortality

    Directory of Open Access Journals (Sweden)

    Simbeye Jupiter S

    2008-02-01

    Full Text Available Abstract Background Malaria is a major public health problem in Malawi, however, quantifying its burden in a population is a challenge. Routine hospital data provide a proxy for measuring the incidence of severe malaria and for crudely estimating morbidity rates. Using such data, this paper proposes a method to describe trends, patterns and factors associated with in-hospital mortality attributed to the disease. Methods We develop semiparametric regression models which allow joint analysis of nonlinear effects of calendar time and continuous covariates, spatially structured variation, unstructured heterogeneity, and other fixed covariates. Modelling and inference use the fully Bayesian approach via Markov Chain Monte Carlo (MCMC simulation techniques. The methodology is applied to analyse data arising from paediatric wards in Zomba district, Malawi, between 2002 and 2003. Results and Conclusion We observe that the risk of dying in hospital is lower in the dry season, and for children who travel a distance of less than 5 kms to the hospital, but increases for those who are referred to the hospital. The results also indicate significant differences in both structured and unstructured spatial effects, and the health facility effects reveal considerable differences by type of facility or practice. More importantly, our approach shows non-linearities in the effect of metrical covariates on the probability of dying in hospital. The study emphasizes that the methodological framework used provides a useful tool for analysing the data at hand and of similar structure.

  4. Bayesian inference in partially identified models: Is the shape of the posterior distribution useful?

    OpenAIRE

    Gustafson, Paul

    2014-01-01

    Partially identified models are characterized by the distribution of observables being compatible with a set of values for the target parameter, rather than a single value. This set is often referred to as an identification region. From a non-Bayesian point of view, the identification region is the object revealed to the investigator in the limit of increasing sample size. Conversely, a Bayesian analysis provides the identification region plus the limiting posterior distribution over this reg...

  5. Bayesian inference of BWR model parameters by Markov chain Monte Carlo

    International Nuclear Information System (INIS)

    In this paper, the Markov chain Monte Carlo approach to Bayesian inference is applied for estimating the parameters of a reduced-order model of the dynamics of a boiling water reactor system. A Bayesian updating strategy is devised to progressively refine the estimates, as newly measured data become available. Finally, the technique is used for detecting parameter changes during the system lifetime, e.g. due to component degradation
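    As a rough sketch of the progressive-refinement idea (not the reactor model itself), the following example re-runs a random-walk Metropolis sampler on a toy first-order decay model each time a new batch of measurements becomes available, seeding each run with the previous posterior mean. All dynamics, noise levels and tuning constants are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(theta, n):
    """Toy stand-in for a reduced-order dynamic response: first-order decay."""
    return np.exp(-theta * np.arange(n))

def log_post(theta, data, sigma=0.05):
    if theta <= 0:                       # flat prior on theta > 0
        return -np.inf
    resid = data - simulate(theta, len(data))
    return -0.5 * np.sum(resid**2) / sigma**2

def metropolis(data, theta0, n_iter=5000, step=0.02):
    theta, lp = theta0, log_post(theta0, data)
    chain = []
    for _ in range(n_iter):
        prop = theta + step * rng.standard_normal()
        lp_prop = log_post(prop, data)
        if np.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta)
    return np.array(chain)

true_theta = 0.30
full_data = simulate(true_theta, 40) + 0.05 * rng.standard_normal(40)

# Progressive refinement: rerun the sampler as each new batch of data arrives,
# starting from the previous posterior mean. A drift of the posterior mean over
# time would flag a possible parameter change (e.g. component degradation).
start = 0.1
for n_obs in (10, 20, 40):
    chain = metropolis(full_data[:n_obs], start)
    start = chain[2000:].mean()
    print(f"{n_obs:2d} observations -> posterior mean {start:.3f}")
```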

  6. Model Data Fusion: developing Bayesian inversion to constrain equilibrium and mode structure

    OpenAIRE

    Hole, M. J.; von Nessi, G.; Bertram, J; J. Svensson; Appel, L. C.; Blackwell, B. D.; Dewar, R L; Howard, J

    2010-01-01

    Recently, a new probabilistic "data fusion" framework based on Bayesian principles has been developed on JET and W7-AS. The Bayesian analysis framework folds in uncertainties and inter-dependencies in the diagnostic data and signal forward-models, together with prior knowledge of the state of the plasma, to yield predictions of internal magnetic structure. A feature of the framework, known as MINERVA (J. Svensson, A. Werner, Plasma Physics and Controlled Fusion 50, 085022, 2008), is the infer...

  7. Bayesian forecasting and scalable multivariate volatility analysis using simultaneous graphical dynamic models

    OpenAIRE

    Gruber, Lutz F.; West, Mike

    2016-01-01

    The recently introduced class of simultaneous graphical dynamic linear models (SGDLMs) defines an ability to scale on-line Bayesian analysis and forecasting to higher-dimensional time series. This paper advances the methodology of SGDLMs, developing and embedding a novel, adaptive method of simultaneous predictor selection in forward filtering for on-line learning and forecasting. The advances include developments in Bayesian computation for scalability, and a case study in exploring the resu...

  8. Modeling Spatial Sustainability: Spatial Welfare Economics versus Ecological Footprint

    OpenAIRE

    Grazi, Fabio; Van Den Bergh, Jeroen C.J.M.; Rietveld, Piet

    2006-01-01

    A spatial welfare framework for the analysis of the spatial dimensions of sustainability is developed. It incorporates agglomeration effects, interregional trade, negative environmental externalities and various land use categories. The model is used to compare rankings of spatial configurations according to evaluations based on social welfare and ecological footprint indicators. Five spatial configurations are considered for this purpose. The exercise is operationalized with the help of a tw...

  9. Spatial extremes of wildfire sizes: Bayesian hierarchical models for extremes

    OpenAIRE

    Mendes, Jorge M; Bermudez, Patricia Cortés de Zea; J. M. C. Pereira; Turkman, K.F.; Vasconcelos, M.J.P.

    2010-01-01

    In Portugal, due to the combination of climatological and ecological factors, large wildfires are a constant threat and, due to their economic impact, a big policy issue. In order to organize efficient fire fighting capacity and resource management, correct quantification of the risk of large wildfires is needed. In this paper, we quantify the regional risk of large wildfire sizes by fitting a Generalized Pareto distribution to excesses over a suitably chosen high threshold. S...

  10. BAYESIAN FORECASTS COMBINATION TO IMPROVE THE ROMANIAN INFLATION PREDICTIONS BASED ON ECONOMETRIC MODELS

    Directory of Open Access Journals (Sweden)

    Mihaela Simionescu

    2014-12-01

    Full Text Available There are many types of econometric models used in predicting the inflation rate, but in this study we used a Bayesian shrinkage combination approach. This methodology is used in order to improve prediction accuracy by including information that is not captured by the econometric models. Therefore, experts’ forecasts are utilized as prior information; for Romania these predictions are provided by the Institute for Economic Forecasting (Dobrescu macromodel), the National Commission for Prognosis and the European Commission. The empirical results for Romanian inflation show the superiority of a fixed effects model compared to other types of econometric models like VAR, Bayesian VAR, simultaneous equations model, dynamic model, log-linear model. The Bayesian combinations that used experts’ predictions as priors, when the shrinkage parameter tends to infinity, improved the accuracy of all forecasts based on individual models, also outperforming zero-weight and equal-weight predictions and naïve forecasts.
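    A minimal sketch of a shrinkage-style forecast combination, assuming a normal-normal precision weighting in which a shrinkage parameter scales the weight placed on the expert prior; as the parameter grows, the combined forecast collapses onto the expert prediction, consistent with the limiting behaviour described above. The forecasts and variances below are hypothetical, not the Romanian figures.

```python
def shrinkage_combination(model_forecast, model_var, expert_prior, expert_var, shrink):
    """Precision-weighted (normal-normal) combination; 'shrink' scales the weight
    given to the expert prior. shrink -> 0 returns the model forecast,
    shrink -> infinity returns the expert prior."""
    w_expert = shrink / expert_var
    w_model = 1.0 / model_var
    return (w_model * model_forecast + w_expert * expert_prior) / (w_model + w_expert)

# Hypothetical annual inflation forecasts (%): econometric model vs expert institution
model_f, expert_f = 4.8, 3.9
for lam in (0.0, 1.0, 10.0, 1e6):
    combo = shrinkage_combination(model_f, model_var=0.5, expert_prior=expert_f,
                                  expert_var=0.4, shrink=lam)
    print(f"shrinkage {lam:>9.1f} -> combined forecast {combo:.2f}%")
```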

  11. Bayesian estimation of regularization parameters for deformable surface models

    Energy Technology Data Exchange (ETDEWEB)

    Cunningham, G.S.; Lehovich, A.; Hanson, K.M.

    1999-02-20

    In this article the authors build on their past attempts to reconstruct a 3D, time-varying bolus of radiotracer from first-pass data obtained by the dynamic SPECT imager, FASTSPECT, built by the University of Arizona. The object imaged is a CardioWest total artificial heart. The bolus is entirely contained in one ventricle and its associated inlet and outlet tubes. The model for the radiotracer distribution at a given time is a closed surface parameterized by 482 vertices that are connected to make 960 triangles, with nonuniform intensity variations of radiotracer allowed inside the surface on a voxel-to-voxel basis. The total curvature of the surface is minimized through the use of a weighted prior in the Bayesian framework, as is the weighted norm of the gradient of the voxellated grid. MAP estimates for the vertices, interior intensity voxels and background count level are produced. The strengths of the priors, or hyperparameters, are determined by maximizing the probability of the data given the hyperparameters, called the evidence. The evidence is calculated by first assuming that the posterior is approximately normal in the values of the vertices and voxels, and then by evaluating the integral of the multi-dimensional normal distribution. This integral (which requires evaluating the determinant of a covariance matrix) is computed by applying a recent algorithm from Bai et al. that calculates the needed determinant efficiently. They demonstrate that the radiotracer is highly inhomogeneous in early time frames, as suspected in earlier reconstruction attempts that assumed a uniform intensity of radiotracer within the closed surface, and that the optimal choice of hyperparameters is substantially different for different time frames.

  12. Bayesian estimation of regularization parameters for deformable surface models

    International Nuclear Information System (INIS)

    In this article the authors build on their past attempts to reconstruct a 3D, time-varying bolus of radiotracer from first-pass data obtained by the dynamic SPECT imager, FASTSPECT, built by the University of Arizona. The object imaged is a CardioWest total artificial heart. The bolus is entirely contained in one ventricle and its associated inlet and outlet tubes. The model for the radiotracer distribution at a given time is a closed surface parameterized by 482 vertices that are connected to make 960 triangles, with nonuniform intensity variations of radiotracer allowed inside the surface on a voxel-to-voxel basis. The total curvature of the surface is minimized through the use of a weighted prior in the Bayesian framework, as is the weighted norm of the gradient of the voxellated grid. MAP estimates for the vertices, interior intensity voxels and background count level are produced. The strengths of the priors, or hyperparameters, are determined by maximizing the probability of the data given the hyperparameters, called the evidence. The evidence is calculated by first assuming that the posterior is approximately normal in the values of the vertices and voxels, and then by evaluating the integral of the multi-dimensional normal distribution. This integral (which requires evaluating the determinant of a covariance matrix) is computed by applying a recent algorithm from Bai et al. that calculates the needed determinant efficiently. They demonstrate that the radiotracer is highly inhomogeneous in early time frames, as suspected in earlier reconstruction attempts that assumed a uniform intensity of radiotracer within the closed surface, and that the optimal choice of hyperparameters is substantially different for different time frames.

  13. Use of SAMC for Bayesian analysis of statistical models with intractable normalizing constants

    KAUST Repository

    Jin, Ick Hoon

    2014-03-01

    Statistical inference for the models with intractable normalizing constants has attracted much attention. During the past two decades, various approximation- or simulation-based methods have been proposed for the problem, such as the Monte Carlo maximum likelihood method and the auxiliary variable Markov chain Monte Carlo methods. The Bayesian stochastic approximation Monte Carlo algorithm specifically addresses this problem: It works by sampling from a sequence of approximate distributions with their average converging to the target posterior distribution, where the approximate distributions can be achieved using the stochastic approximation Monte Carlo algorithm. A strong law of large numbers is established for the Bayesian stochastic approximation Monte Carlo estimator under mild conditions. Compared to the Monte Carlo maximum likelihood method, the Bayesian stochastic approximation Monte Carlo algorithm is more robust to the initial guess of model parameters. Compared to the auxiliary variable MCMC methods, the Bayesian stochastic approximation Monte Carlo algorithm avoids the requirement for perfect samples, and thus can be applied to many models for which perfect sampling is not available or very expensive. The Bayesian stochastic approximation Monte Carlo algorithm also provides a general framework for approximate Bayesian analysis. © 2012 Elsevier B.V. All rights reserved.

  14. A general Bayesian framework for calibrating and evaluating stochastic models of annual multi-site hydrological data

    Science.gov (United States)

    Frost, Andrew J.; Thyer, Mark A.; Srikanthan, R.; Kuczera, George

    2007-07-01

    Summary: Multi-site simulation of hydrological data is required for drought risk assessment of large multi-reservoir water supply systems. In this paper, a general Bayesian framework is presented for the calibration and evaluation of multi-site hydrological data at annual timescales. Models included within this framework are the hidden Markov model (HMM) and the widely used lag-1 autoregressive (AR(1)) model. These models are extended by the inclusion of a Box-Cox transformation and a spatial correlation function in a multi-site setting. Parameter uncertainty is evaluated using Markov chain Monte Carlo techniques. Models are evaluated by their ability to reproduce a range of important extreme statistics and compared using Bayesian model selection techniques which evaluate model probabilities. The case study, using multi-site annual rainfall data situated within catchments which contribute to Sydney's main water supply, provided the following results: Firstly, in terms of model probabilities and diagnostics, the inclusion of the Box-Cox transformation was preferred. Secondly, the AR(1) and HMM performed similarly, while some other proposed AR(1)/HMM models with regionally pooled parameters had greater posterior probability than these two models. The practical significance of parameter and model uncertainty was illustrated using a case study involving drought security analysis for urban water supply. It was shown that ignoring parameter uncertainty resulted in a significant overestimate of reservoir yield and an underestimation of system vulnerability to severe drought.

  15. A Bayesian fusion model for space-time reconstruction of finely resolved velocities in turbulent flows from low resolution measurements

    CERN Document Server

    Van Nguyen, Linh; Chainais, Pierre

    2015-01-01

    The study of turbulent flows calls for measurements with high resolution both in space and in time. We propose a new approach to reconstruct High-Temporal-High-Spatial resolution velocity fields by combining two sources of information that are well-resolved either in space or in time, the Low-Temporal-High-Spatial (LTHS) and the High-Temporal-Low-Spatial (HTLS) resolution measurements. In the framework of co-conception between sensing and data post-processing, this work extensively investigates a Bayesian reconstruction approach using a simulated database. A Bayesian fusion model is developed to solve the inverse problem of data reconstruction. The model uses a Maximum A Posteriori estimate, which yields the most probable field knowing the measurements. The DNS of a wall-bounded turbulent flow at moderate Reynolds number is used to validate and assess the performances of the present approach. Low resolution measurements are subsampled in time and space from the fully resolved data. Reconstructed velocities ar...

  16. Bayesian biostatistics

    CERN Document Server

    Lesaffre, Emmanuel

    2012-01-01

    The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introd

  17. Bayesian parameter uncertainty modeling in a macroscale hydrologic model and its impact on Indian river basin hydrology under climate change

    Science.gov (United States)

    Raje, D.; Krishnan, R.

    2012-08-01

    Macroscale hydrologic models (MHMs) were developed to study changes in land surface hydrology due to changing climate over large domains, such as continents or large river basins. However, there are many sources of uncertainty introduced in MHM hydrological simulation, such as model structure error, ineffective model parameters, and low-accuracy model input or validation data. It is hence important to model the uncertainty arising in projection results from an MHM. The objective of this study is to present a Bayesian statistical inference framework for parameter uncertainty modeling of a macroscale hydrologic model. The Bayesian approach implemented using Markov Chain Monte Carlo (MCMC) methods is used in this study to model uncertainty arising from calibration parameters of the Variable Infiltration Capacity (VIC) MHM. The study examines large-scale hydrologic impacts for Indian river basins and changes in discharges for three major river basins with distinct climatic and geographic characteristics, under climate change. Observed/reanalysis meteorological variables such as precipitation, temperature and wind speed are used to drive the VIC macroscale hydrologic model. An objective function describing the fit between observed and simulated discharges at four stations is used to compute the likelihood of the parameters. An MCMC approach using the Metropolis-Hastings algorithm is used to update probability distributions of the parameters. For future hydrologic simulations, bias-corrected GCM projections of climatic variables are used. The posterior distributions of VIC parameters are used for projection of 5th and 95th percentile discharge statistics at four stations, namely, Farakka, Jamtara, Garudeshwar, and Vijayawada for an ensemble of three GCMs and three scenarios, for two time slices. Spatial differences in uncertainty projections of runoff and evapotranspiration for years 2056-2065 for the a1b scenario at the 5th and 95th percentile levels are also projected

  18. A population-based Bayesian approach to the minimal model of glucose and insulin homeostasis

    DEFF Research Database (Denmark)

    Andersen, Kim Emil; Højbjerre, Malene

    2005-01-01

    for a whole population. Traditionally it has been analysed in a deterministic set-up with only error terms on the measurements. In this work we adopt a Bayesian graphical model to describe the coupled minimal model that accounts for both measurement and process variability, and the model is extended...... to a population-based model. The estimation of the parameters is efficiently implemented in a Bayesian approach where posterior inference is made through the use of Markov chain Monte Carlo techniques. Hereby we obtain a powerful and flexible modelling framework for regularizing the ill-posed estimation problem...

  19. Spatial model of autocatalytic reactions

    OpenAIRE

    De Anna, Pietro; Di Patti, Francesca; Fanelli, Duccio; McKane, Alan J.; Dauxois, Thierry

    2010-01-01

    Biological cells with all of their surface structure and complex interior stripped away are essentially vesicles - membranes composed of lipid bilayers which form closed sacs. Vesicles are thought to be relevant as models of primitive protocells, and they could have provided the ideal environment for pre-biotic reactions to occur. In this paper, we investigate the stochastic dynamics of a set of autocatalytic reactions, within a spatially bounded domain, so as to mimic a primordial cell. The ...

  20. B2Z: R Package for Bayesian Two-Zone Models

    Directory of Open Access Journals (Sweden)

    João Vitor Dias Monteiro

    2011-08-01

    Full Text Available A primary issue in industrial hygiene is the estimation of a worker's exposure to chemical, physical and biological agents. Mathematical modeling is increasingly being used as a method for assessing occupational exposures. However, predicting exposure in real settings is constrained by lack of quantitative knowledge of exposure determinants. Recently, Zhang, Banerjee, Yang, Lungu, and Ramachandran (2009) proposed Bayesian hierarchical models for estimating parameters and exposure concentrations for the two-zone differential equation models and for predicting concentrations in a zone near and far away from the source of contamination. Bayesian estimation, however, can often require substantial amounts of user-defined code and tuning. In this paper, we introduce a statistical software package, B2Z, built upon the R statistical computing platform that implements a Bayesian model for estimating model parameters and exposure concentrations in two-zone models. We discuss the algorithms behind our package and illustrate its use with simulated and real data examples.
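    The two-zone (near-field/far-field) mass-balance model underlying B2Z can be written as a pair of coupled ODEs; Bayesian estimation then places priors on the emission rate, interzonal airflow and ventilation rate and fits these equations to measured concentrations. The sketch below shows only the forward model, with hypothetical parameter values, using SciPy's ODE integrator rather than the B2Z package itself.

```python
import numpy as np
from scipy.integrate import odeint

def two_zone(c, t, G, beta, Q, V_N, V_F):
    """Near-field / far-field mass balance of the two-zone model."""
    c_N, c_F = c
    dc_N = (G + beta * c_F - beta * c_N) / V_N
    dc_F = (beta * c_N - (beta + Q) * c_F) / V_F
    return [dc_N, dc_F]

# Hypothetical parameters: emission rate G (mg/min), interzonal airflow beta (m3/min),
# room ventilation Q (m3/min), zone volumes (m3).
G, beta, Q, V_N, V_F = 100.0, 5.0, 15.0, 2.0, 30.0
t = np.linspace(0.0, 60.0, 121)                    # minutes
conc = odeint(two_zone, [0.0, 0.0], t, args=(G, beta, Q, V_N, V_F))

print(f"near-field concentration after 60 min: {conc[-1, 0]:.1f} mg/m3")
print(f"far-field  concentration after 60 min: {conc[-1, 1]:.1f} mg/m3")
```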

  1. EXONEST: Bayesian model selection applied to the detection and characterization of exoplanets via photometric variations

    Energy Technology Data Exchange (ETDEWEB)

    Placek, Ben; Knuth, Kevin H. [Physics Department, University at Albany (SUNY), Albany, NY 12222 (United States); Angerhausen, Daniel, E-mail: bplacek@albany.edu, E-mail: kknuth@albany.edu, E-mail: daniel.angerhausen@gmail.com [Department of Physics, Applied Physics, and Astronomy, Rensselear Polytechnic Institute, Troy, NY 12180 (United States)

    2014-11-10

    EXONEST is an algorithm dedicated to detecting and characterizing the photometric signatures of exoplanets, which include reflection and thermal emission, Doppler boosting, and ellipsoidal variations. Using Bayesian inference, we can test between competing models that describe the data as well as estimate model parameters. We demonstrate this approach by testing circular versus eccentric planetary orbital models, as well as testing for the presence or absence of four photometric effects. In addition to using Bayesian model selection, a unique aspect of EXONEST is the potential capability to distinguish between reflective and thermal contributions to the light curve. A case study is presented using Kepler data recorded from the transiting planet KOI-13b. By considering only the nontransiting portions of the light curve, we demonstrate that it is possible to estimate the photometrically relevant model parameters of KOI-13b. Furthermore, Bayesian model testing confirms that the orbit of KOI-13b has a detectable eccentricity.

  2. Parameterizing Bayesian network Representations of Social-Behavioral Models by Expert Elicitation

    Energy Technology Data Exchange (ETDEWEB)

    Walsh, Stephen J.; Dalton, Angela C.; Whitney, Paul D.; White, Amanda M.

    2010-05-23

    Bayesian networks provide a general framework with which to model many natural phenomena. The mathematical nature of Bayesian networks enables a plethora of model validation and calibration techniques: e.g parameter estimation, goodness of fit tests, and diagnostic checking of the model assumptions. However, they are not free of shortcomings. Parameter estimation from relevant extant data is a common approach to calibrating the model parameters. In practice it is not uncommon to find oneself lacking adequate data to reliably estimate all model parameters. In this paper we present the early development of a novel application of conjoint analysis as a method for eliciting and modeling expert opinions and using the results in a methodology for calibrating the parameters of a Bayesian network.

  3. Bayesian network modeling method based on case reasoning for emergency decision-making

    Directory of Open Access Journals (Sweden)

    XU Lei

    2013-06-01

    Full Text Available Bayesian network has the abilities of probability expression, uncertainty management and multi-information fusion. It can support emergency decision-making, which can improve the efficiency of decision-making. Emergency decision-making is highly time sensitive, which requires shortening the Bayesian network modeling time as far as possible. Traditional Bayesian network modeling methods are clearly unable to meet that requirement. Thus, a Bayesian network modeling method based on case reasoning for emergency decision-making is proposed. The method can obtain optional cases through case matching by the functions of similarity degree and deviation degree. Then, a new Bayesian network can be built through case adjustment by case merging and pruning. An example is presented to illustrate and test the proposed method. The result shows that the method does not have a huge search space or need sample data. The only requirement is the collection of expert knowledge and historical case models. Compared with traditional methods, the proposed method can reuse historical case models, which can reduce the modeling time and improve the efficiency.

  4. Bayesian data fusion for spatial prediction of categorical variables in environmental sciences

    Energy Technology Data Exchange (ETDEWEB)

    Gengler, Sarah, E-mail: sarahgengler@gmail.com; Bogaert, Patrick, E-mail: sarahgengler@gmail.com [Earth and Life Institute, Environmental Sciences. Université catholique de Louvain, Croix du Sud 2/L7.05.16, B-1348 Louvain-la-Neuve (Belgium)

    2014-12-05

    First developed to predict continuous variables, Bayesian Maximum Entropy (BME) has become a complete framework in the context of space-time prediction since it has been extended to predict categorical variables and mixed random fields. This method proposes solutions to combine several sources of data whatever the nature of the information. However, the various attempts that were made for adapting the BME methodology to categorical variables and mixed random fields faced some limitations, such as a high computational burden. The main objective of this paper is to overcome this limitation by generalizing the Bayesian Data Fusion (BDF) theoretical framework to categorical variables, which is somehow a simplification of the BME method through the convenient conditional independence hypothesis. The BDF methodology for categorical variables is first described and then applied to a practical case study: the estimation of soil drainage classes using a soil map and point observations in the sandy area of Flanders around the city of Mechelen (Belgium). The BDF approach is compared to BME along with more classical approaches, such as Indicator CoKriging (ICK) and logistic regression. Estimators are compared using various indicators, namely the Percentage of Correctly Classified locations (PCC) and the Average Highest Probability (AHP). Although the BDF methodology for categorical variables is somehow a simplification of the BME approach, both methods lead to similar results and have strong advantages compared to ICK and logistic regression.
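    Under the conditional-independence simplification mentioned above, fusing two data sources for a categorical variable reduces to multiplying the prior class probabilities by a per-source likelihood for each source and renormalizing. The sketch below illustrates this with invented drainage-class probabilities; it is not the BDF implementation used in the study.

```python
import numpy as np

classes = ["well", "moderate", "poor"]          # hypothetical drainage classes
prior = np.array([0.5, 0.3, 0.2])               # prior class proportions

# Per-source likelihoods P(observed evidence | class), assumed conditionally
# independent given the true class (the convenient BDF simplification).
lik_soil_map = np.array([0.6, 0.3, 0.1])        # evidence from the soil map
lik_point_obs = np.array([0.2, 0.5, 0.3])       # evidence from nearby point data

posterior = prior * lik_soil_map * lik_point_obs
posterior /= posterior.sum()

for c, p in zip(classes, posterior):
    print(f"P({c} drainage | soil map, point obs) = {p:.3f}")
```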

  5. A Bayesian Surrogate Model for Rapid Time Series Analysis and Application to Exoplanet Observations

    CERN Document Server

    Ford, Eric B; Veras, Dimitri

    2011-01-01

    We present a Bayesian surrogate model for the analysis of periodic or quasi-periodic time series data. We describe a computationally efficient implementation that enables Bayesian model comparison. We apply this model to simulated and real exoplanet observations. We discuss the results and demonstrate some of the challenges for applying our surrogate model to realistic exoplanet data sets. In particular, we find that analyses of real world data should pay careful attention to the effects of uneven spacing of observations and the choice of prior for the "jitter" parameter.

  6. Model selection on solid ground: Rigorous comparison of nine ways to evaluate Bayesian model evidence

    Science.gov (United States)

    Schöniger, Anneli; Wöhling, Thomas; Samaniego, Luis; Nowak, Wolfgang

    2014-12-01

    Bayesian model selection or averaging objectively ranks a number of plausible, competing conceptual models based on Bayes' theorem. It implicitly performs an optimal trade-off between performance in fitting available data and minimum model complexity. The procedure requires determining Bayesian model evidence (BME), which is the likelihood of the observed data integrated over each model's parameter space. The computation of this integral is highly challenging because it is as high-dimensional as the number of model parameters. Three classes of techniques to compute BME are available, each with its own challenges and limitations: (1) Exact and fast analytical solutions are limited by strong assumptions. (2) Numerical evaluation quickly becomes unfeasible for expensive models. (3) Approximations known as information criteria (ICs) such as the AIC, BIC, or KIC (Akaike, Bayesian, or Kashyap information criterion, respectively) yield contradicting results with regard to model ranking. Our study features a theory-based intercomparison of these techniques. We further assess their accuracy in a simplistic synthetic example where for some scenarios an exact analytical solution exists. In more challenging scenarios, we use a brute-force Monte Carlo integration method as reference. We continue this analysis with a real-world application of hydrological model selection. This is a first-time benchmarking of the various methods for BME evaluation against true solutions. Results show that BME values from ICs are often heavily biased and that the choice of approximation method substantially influences the accuracy of model ranking. For reliable model selection, bias-free numerical methods should be preferred over ICs whenever computationally feasible.
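
    As a minimal illustration of the brute-force reference approach described above, the sketch below estimates Bayesian model evidence by drawing parameters from the prior and averaging the likelihood of the data. The one-parameter Gaussian model, the prior, and the synthetic data are assumptions made for the example, not the hydrological models of the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic observations of a constant signal with known noise (illustrative only).
sigma = 0.5
y = rng.normal(loc=1.5, scale=sigma, size=20)

def log_likelihood(theta, y, sigma):
    """Gaussian log-likelihood for a constant-mean model."""
    return -0.5 * np.sum(((y - theta) / sigma) ** 2 + np.log(2.0 * np.pi * sigma ** 2))

def log_evidence(y, sigma, prior_mean=0.0, prior_sd=2.0, n_samples=50_000):
    """Brute-force Monte Carlo BME: average the likelihood over prior draws."""
    theta = rng.normal(prior_mean, prior_sd, size=n_samples)
    log_like = np.array([log_likelihood(t, y, sigma) for t in theta])
    m = log_like.max()  # log-mean-exp for numerical stability
    return m + np.log(np.mean(np.exp(log_like - m)))

print("log BME (constant-mean model):", log_evidence(y, sigma))
```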

  7. Probabilistic inference: Task dependency and individual differences of probability weighting revealed by hierarchical Bayesian modelling

    Directory of Open Access Journals (Sweden)

    Moritz eBoos

    2016-05-01

    Full Text Available Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modelling techniques. A classic urn-ball paradigm served as the experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behaviour. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modelling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modelling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.
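
    As a concrete illustration of the kind of distortion discussed above, the sketch below evaluates a common one-parameter inverted-S-shaped probability weighting function, w(p) = p^g / (p^g + (1 - p)^g)^(1/g). The specific functional form and the parameter value are assumptions chosen for illustration, not the weighting function fitted in the study.

```python
import numpy as np

def prob_weight(p, gamma=0.6):
    """Inverted-S-shaped weighting: overweights small p, underweights large p."""
    p = np.asarray(p, dtype=float)
    return p ** gamma / (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

objective = np.array([0.05, 0.25, 0.50, 0.75, 0.95])
print(np.round(prob_weight(objective), 3))
```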

  8. Introduction to Models in Spatial Analysis

    OpenAIRE

    Sanders, Lena

    2007-01-01

    The book provides a broad overview of the different types of models used in advanced spatial analysis. The models concern spatial organization, location factors and spatial interaction patterns from both static and dynamic perspectives. This introductory chapter proposes a discussion on the different meanings which are given to models in the field of spatial analysis depending on the formalization framework (statistics, GIS, computational approach). Core concepts such as spatial interaction and le...

  9. Multi-scale inference of interaction rules in animal groups using Bayesian model selection.

    Directory of Open Access Journals (Sweden)

    Richard P Mann

    Full Text Available Inference of interaction rules of animals moving in groups usually relies on an analysis of large scale system behaviour. Models are tuned through repeated simulation until they match the observed behaviour. More recent work has used the fine scale motions of animals to validate and fit the rules of interaction of animals in groups. Here, we use a Bayesian methodology to compare a variety of models to the collective motion of glass prawns (Paratya australiensis. We show that these exhibit a stereotypical 'phase transition', whereby an increase in density leads to the onset of collective motion in one direction. We fit models to this data, which range from: a mean-field model where all prawns interact globally; to a spatial Markovian model where prawns are self-propelled particles influenced only by the current positions and directions of their neighbours; up to non-Markovian models where prawns have 'memory' of previous interactions, integrating their experiences over time when deciding to change behaviour. We show that the mean-field model fits the large scale behaviour of the system, but does not capture the observed locality of interactions. Traditional self-propelled particle models fail to capture the fine scale dynamics of the system. The most sophisticated model, the non-Markovian model, provides a good match to the data at both the fine scale and in terms of reproducing global dynamics, while maintaining a biologically plausible perceptual range. We conclude that prawns' movements are influenced by not just the current direction of nearby conspecifics, but also those encountered in the recent past. Given the simplicity of prawns as a study system our research suggests that self-propelled particle models of collective motion should, if they are to be realistic at multiple biological scales, include memory of previous interactions and other non-Markovian effects.

  10. Multi-scale inference of interaction rules in animal groups using Bayesian model selection.

    Directory of Open Access Journals (Sweden)

    Richard P Mann

    2012-01-01

    Full Text Available Inference of interaction rules of animals moving in groups usually relies on an analysis of large scale system behaviour. Models are tuned through repeated simulation until they match the observed behaviour. More recent work has used the fine scale motions of animals to validate and fit the rules of interaction of animals in groups. Here, we use a Bayesian methodology to compare a variety of models to the collective motion of glass prawns (Paratya australiensis. We show that these exhibit a stereotypical 'phase transition', whereby an increase in density leads to the onset of collective motion in one direction. We fit models to this data, which range from: a mean-field model where all prawns interact globally; to a spatial Markovian model where prawns are self-propelled particles influenced only by the current positions and directions of their neighbours; up to non-Markovian models where prawns have 'memory' of previous interactions, integrating their experiences over time when deciding to change behaviour. We show that the mean-field model fits the large scale behaviour of the system, but does not capture fine scale rules of interaction, which are primarily mediated by physical contact. Conversely, the Markovian self-propelled particle model captures the fine scale rules of interaction but fails to reproduce global dynamics. The most sophisticated model, the non-Markovian model, provides a good match to the data at both the fine scale and in terms of reproducing global dynamics. We conclude that prawns' movements are influenced by not just the current direction of nearby conspecifics, but also those encountered in the recent past. Given the simplicity of prawns as a study system our research suggests that self-propelled particle models of collective motion should, if they are to be realistic at multiple biological scales, include memory of previous interactions and other non-Markovian effects.

  11. Bayesian data assimilation for stochastic multiscale models of transport in porous media.

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef M. (Massachusetts Institute of Technology, Cambridge, MA); van Bloemen Waanders, Bart Gustaaf (Sandia National Laboratories, Albuquerque NM); Parno, Matthew (Massachusetts Institute of Technology, Cambridge, MA); Ray, Jaideep; Lefantzi, Sophia; Salazar, Luke (Sandia National Laboratories, Albuquerque NM); McKenna, Sean Andrew (Sandia National Laboratories, Albuquerque NM); Klise, Katherine A. (Sandia National Laboratories, Albuquerque NM)

    2011-10-01

    We investigate Bayesian techniques that can be used to reconstruct field variables from partial observations. In particular, we target fields that exhibit spatial structures with a large spectrum of lengthscales. Contemporary methods typically describe the field on a grid and estimate structures which can be resolved by it. In contrast, we address the reconstruction of grid-resolved structures as well as estimation of statistical summaries of subgrid structures, which are smaller than the grid resolution. We perform this in two different ways (a) via a physical (phenomenological), parameterized subgrid model that summarizes the impact of the unresolved scales at the coarse level and (b) via multiscale finite elements, where specially designed prolongation and restriction operators establish the interscale link between the same problem defined on a coarse and fine mesh. The estimation problem is posed as a Bayesian inverse problem. Dimensionality reduction is performed by projecting the field to be inferred on a suitable orthogonal basis set, viz. the Karhunen-Loeve expansion of a multiGaussian. We first demonstrate our techniques on the reconstruction of a binary medium consisting of a matrix with embedded inclusions, which are too small to be grid-resolved. The reconstruction is performed using an adaptive Markov chain Monte Carlo method. We find that the posterior distributions of the inferred parameters are approximately Gaussian. We exploit this finding to reconstruct a permeability field with long, but narrow embedded fractures (which are too fine to be grid-resolved) using scalable ensemble Kalman filters; this also allows us to address larger grids. Ensemble Kalman filtering is then used to estimate the values of hydraulic conductivity and specific yield in a model of the High Plains Aquifer in Kansas. Strong conditioning of the spatial structure of the parameters and the non-linear aspects of the water table aquifer create difficulty for the ensemble Kalman

  12. Diffusion and prediction of Leishmaniasis in a large metropolitan area in Brazil with a Bayesian space-time model.

    Science.gov (United States)

    Assunção, R M; Reis, I A; Oliveira, C D

    2001-08-15

    We present results from an analysis of human visceral Leishmaniasis cases based on public health records of Belo Horizonte, Brazil, from 1994 to 1997. The main emphasis in this study is on the development of a spatial statistical model to map and project the rates of visceral Leishmaniasis in Belo Horizonte. The model allows for space-time interaction and it is based on a hierarchical Bayesian approach. We assume that the underlying rates evolve in time according to a polynomial trend specific to each small area in the region. The parameters of these polynomials receive a spatial distribution in the form of an autonormal distribution. While the raw rates are extremely noisy and inadequate to support decisions, the resulting smoothed rates estimates are considerably less affected by small area issues and provide very clear directions to implement public health actions. PMID:11468766
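
    In the same spirit as the smoothing described above, the sketch below shows how an intrinsic CAR (autonormal) prior shares information between neighbouring areas. For brevity it uses a Gaussian likelihood on five toy areas rather than the Poisson counts and area-specific polynomial time trends of the actual model; the adjacency structure and precision values are illustrative assumptions.

```python
import numpy as np

# Five areas arranged on a line: adjacency matrix W and graph Laplacian L = D - W.
W = np.array([[0, 1, 0, 0, 0],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
L = np.diag(W.sum(axis=1)) - W

raw_rates = np.array([2.1, 0.4, 1.8, 1.6, 3.0])  # noisy observed small-area rates
obs_prec = 1.0   # precision of the Gaussian likelihood (illustrative)
car_prec = 4.0   # precision of the intrinsic CAR smoothing prior (illustrative)

# Posterior mode under Gaussian likelihood + ICAR prior:
# minimize obs_prec * ||x - y||^2 + car_prec * x' L x.
smoothed = np.linalg.solve(obs_prec * np.eye(5) + car_prec * L, obs_prec * raw_rates)
print("raw     :", raw_rates)
print("smoothed:", np.round(smoothed, 2))
```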

  13. Bayesian Analysis of Geostatistical Models With an Auxiliary Lattice

    KAUST Repository

    Park, Jincheol

    2012-04-01

    The Gaussian geostatistical model has been widely used for modeling spatial data. However, this model suffers from a severe difficulty in computation: it requires users to invert a large covariance matrix. This is infeasible when the number of observations is large. In this article, we propose an auxiliary lattice-based approach for tackling this difficulty. By introducing an auxiliary lattice to the space of observations and defining a Gaussian Markov random field on the auxiliary lattice, our model completely avoids the requirement of matrix inversion. It is remarkable that the computational complexity of our method is only O(n), where n is the number of observations. Hence, our method can be applied to very large datasets with reasonable computational (CPU) times. The numerical results indicate that our model can approximate Gaussian random fields very well in terms of predictions, even for those with long correlation lengths. For real data examples, our model can generally outperform conventional Gaussian random field models in both prediction errors and CPU times. Supplemental materials for the article are available online. © 2012 American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America.

  14. Bayesian modelling of the emission spectrum of the JET Li-BES system

    CERN Document Server

    Kwak, Sehyun; Brix, M; Ghim, Y -c; Contributors, JET

    2015-01-01

    A Bayesian model of the emission spectrum of the JET lithium beam has been developed to infer the intensity of the Li I (2p-2s) line radiation and associated uncertainties. The detected spectrum for each channel of the lithium beam emission spectroscopy (Li-BES) system is here modelled by a single Li line modified by an instrumental function, Bremsstrahlung background, instrumental offset, and interference filter curve. Both the instrumental function and the interference filter curve are modelled with non-parametric Gaussian processes. All free parameters of the model, the intensities of the Li line, Bremsstrahlung background, and instrumental offset, are inferred using Bayesian probability theory with a Gaussian likelihood for photon statistics and electronic background noise. The prior distributions of the free parameters are chosen as Gaussians. Given these assumptions, the intensity of the Li line and corresponding uncertainties are analytically available using a Bayesian linear inversion technique. The p...

  15. Regionalization of Parameters of the Continuous Rainfall-Runoff model Based on Bayesian Generalized Linear Model

    Science.gov (United States)

    Kim, Tae-Jeong; Kim, Ki-Young; Shin, Dong-Hoon; Kwon, Hyun-Han

    2015-04-01

    It has been widely acknowledged that the appropriate simulation of natural streamflow at ungauged sites is one of the fundamental challenges for the hydrology community. In particular, the key to reliable runoff simulation in ungauged basins is a reliable rainfall-runoff model and its parameter estimation. In general, parameter estimation in rainfall-runoff models is a complex issue due to insufficient hydrologic data. This study aims to regionalize the parameters of a continuous rainfall-runoff model in conjunction with Bayesian statistical techniques to facilitate uncertainty analysis. First, this study uses a Bayesian Markov Chain Monte Carlo scheme for the Sacramento rainfall-runoff model, which has been widely used around the world. The Sacramento model is calibrated against daily runoff observations; thirteen parameters of the model are optimized, and posterior distributions for each parameter are derived. Second, we applied a Bayesian generalized linear regression model to the set of parameters and basin characteristics (e.g. area and slope) to obtain a functional relationship between pairs of variables. The proposed model was validated in two gauged watersheds using efficiency criteria such as the Nash-Sutcliffe efficiency, coefficient of efficiency, index of agreement and coefficient of correlation. Future work will focus on uncertainty analysis to fully incorporate propagation of the uncertainty into the regionalization framework. KEYWORDS: Ungauged, Parameter, Sacramento, Generalized linear model, Regionalization. Acknowledgement: This research was supported by a Grant (13SCIPA01) from the Smart Civil Infrastructure Research Program funded by the Ministry of Land, Infrastructure and Transport (MOLIT) of the Korea government and the Korea Agency for Infrastructure Technology Advancement (KAIA).

  16. Linking Bayesian and Agent-Based Models to Simulate Complex Social-Ecological Systems in the Sonoran Desert

    Science.gov (United States)

    Pope, A.; Gimblett, R.

    2013-12-01

    Interdependencies of ecologic, hydrologic, and social systems challenge traditional approaches to natural resource management in semi-arid regions. As a complex social-ecological system, water demands in the Sonoran Desert from agricultural and urban users often conflict with water needs for its ecologically-significant riparian corridors. To explore this system, we developed an agent-based model to simulate complex feedbacks between human decisions and environmental conditions. Cognitive mapping in conjunction with stakeholder participation produced a Bayesian model of conditional probabilities of local human decision-making processes resulting in changes in water demand. Probabilities created in the Bayesian model were incorporated into the agent-based model, so that each agent had a unique probability to make a positive decision based on its perceived environment at each point in time and space. By using a Bayesian approach, uncertainty in the human decision-making process could be incorporated. The spatially-explicit agent-based model simulated changes in depth-to-groundwater by well pumping based on an agent's water demand. Depth-to-groundwater was then used as an indicator of unique vegetation guilds within the riparian corridor. Each vegetation guild provides varying levels of ecosystem services, the changes of which, along with changes in depth-to-groundwater, feed back to influence agent behavior. Using this modeling approach allowed us to examine resilience of semi-arid riparian corridors and agent behavior under various scenarios. The insight provided by the model contributes to understanding how specific interventions may alter the complex social-ecological system in the future.

  17. Linking Bayesian and agent-based models to simulate complex social-ecological systems in semi-arid regions

    Directory of Open Access Journals (Sweden)

    Aloah J Pope

    2015-08-01

    Full Text Available Interdependencies of ecologic, hydrologic, and social systems challenge traditional approaches to natural resource management in semi-arid regions. As a complex social-ecological system, water demands in the Sonoran Desert from agricultural and urban users often conflict with water needs for its ecologically-significant riparian corridors. To explore this system, we developed an agent-based model to simulate complex feedbacks between human decisions and environmental conditions in the Rio Sonora Watershed. Cognitive mapping in conjunction with stakeholder participation produced a Bayesian model of conditional probabilities of local human decision-making processes resulting in changes in water demand. Probabilities created in the Bayesian model were incorporated into the agent-based model, so that each agent had a unique probability to make a positive decision based on its perceived environment at each point in time and space. By using a Bayesian approach, uncertainty in the human decision-making process could be incorporated. The spatially-explicit agent-based model simulated changes in depth-to-groundwater by well pumping based on an agent's water demand. Changes in depth-to-groundwater feed back to influence agent behavior, as well as determine unique vegetation classes within the riparian corridor. Each vegetation class then provides varying stakeholder-defined quality values of ecosystem services. Using this modeling approach allowed us to examine effects on both the ecological and social system of semi-arid riparian corridors under various scenarios. The insight provided by the model contributes to understanding how specific interventions may alter the complex social-ecological system in the future.

  18. Coexistence in stochastic spatial models

    CERN Document Server

    Durrett, Rick

    2009-01-01

    In this paper I will review twenty years of work on the question: When is there coexistence in stochastic spatial models? The answer, announced in Durrett and Levin [Theor. Pop. Biol. 46 (1994) 363--394], and that we explain in this paper is that this can be determined by examining the mean-field ODE. There are a number of rigorous results in support of this picture, but we will state nine challenging and important open problems, most of which date from the 1990's.

  19. Semiparametric Bayesian inference on skew-normal joint modeling of multivariate longitudinal and survival data.

    Science.gov (United States)

    Tang, An-Min; Tang, Nian-Sheng

    2015-02-28

    We propose a semiparametric multivariate skew-normal joint model for multivariate longitudinal and multivariate survival data. One main feature of the posited model is that we relax the commonly used normality assumption for random effects and within-subject error by using a centered Dirichlet process prior to specify the random effects distribution and using a multivariate skew-normal distribution to specify the within-subject error distribution and model trajectory functions of longitudinal responses semiparametrically. A Bayesian approach is proposed to simultaneously obtain Bayesian estimates of unknown parameters, random effects and nonparametric functions by combining the Gibbs sampler and the Metropolis-Hastings algorithm. Particularly, a Bayesian local influence approach is developed to assess the effect of minor perturbations to within-subject measurement error and random effects. Several simulation studies and an example are presented to illustrate the proposed methodologies. PMID:25404574

  20. Bayesian model selection applied to artificial neural networks used for water resources modeling

    Science.gov (United States)

    Kingston, Greer B.; Maier, Holger R.; Lambert, Martin F.

    2008-04-01

    Artificial neural networks (ANNs) have proven to be extremely valuable tools in the field of water resources engineering. However, one of the most difficult tasks in developing an ANN is determining the optimum level of complexity required to model a given problem, as there is no formal systematic model selection method. This paper presents a Bayesian model selection (BMS) method for ANNs that provides an objective approach for comparing models of varying complexity in order to select the most appropriate ANN structure. The approach uses Markov Chain Monte Carlo posterior simulations to estimate the evidence in favor of competing models and, in this study, three known methods for doing this are compared in terms of their suitability for being incorporated into the proposed BMS framework for ANNs. However, it is acknowledged that it can be particularly difficult to accurately estimate the evidence of ANN models. Therefore, the proposed BMS approach for ANNs incorporates a further check of the evidence results by inspecting the marginal posterior distributions of the hidden-to-output layer weights, which unambiguously indicate any redundancies in the hidden layer nodes. The fact that this check is available is one of the greatest advantages of the proposed approach over conventional model selection methods, which do not provide such a test and instead rely on the modeler's subjective choice of selection criterion. The advantages of a total Bayesian approach to ANN development, including training and model selection, are demonstrated on two synthetic and one real world water resources case study.

  1. Multi-site identification of a distributed hydrological nitrogen model using Bayesian uncertainty analysis

    Science.gov (United States)

    Jiang, Sanyuan; Jomaa, Seifeddine; Büttner, Olaf; Meon, Günter; Rode, Michael

    2015-10-01

    For capturing spatial variations of runoff and nutrient fluxes attributed to catchment heterogeneity, multi-site hydrological water quality monitoring strategies are increasingly put into practice. This study aimed to investigate the impacts of spatially distributed streamflow and streamwater Inorganic Nitrogen (IN) concentration observations on the identification of a continuous time, spatially semi-distributed and process-based hydrological water quality model HYPE (HYdrological Predictions for the Environment). A Bayesian inference based approach DREAM(ZS) (DiffeRential Evolution Adaptive Metrololis algorithm) was combined with HYPE to implement model optimisation and uncertainty analysis on streamflow and streamwater IN concentration simulations at a nested meso scale catchment in central Germany. To this end, a 10-year period (1994-1999 for calibration and 1999-2004 for validation) was utilised. We compared the parameters' posterior distributions, modelling performance using the best estimated parameter set and 95% prediction confidence intervals at catchment outlet for the calibration period that were derived from single-site calibration (SSC) and multi-site calibration (MSC) modes. For SSC, streamflow and streamwater IN concentration observations at only the catchment outlet were used. While, for MSC, streamflow and streamwater IN concentration observations from both catchment outlet and two internal sites were considered. Results showed that the uncertainty intervals of hydrological water quality parameters' posterior distributions estimated from MSC, were narrower than those obtained from SSC. In addition, it was found that the MSC outperformed SSC on streamwater IN concentration simulations at internal sites for both calibration and validation periods, while the influence on streamflow modelling performance was small. This can be explained by the "nested" nature of the catchment and high correlation between discharge observations from different sites

  2. Bayesian Models for Multimodal Perception of 3D Structure and Motion

    OpenAIRE

    Ferreira, J.F.; Bessière, Pierre; Mekhnacha, Kamel; Lobo, J.; J. Dias; Laugier, Christian

    2008-01-01

    In this text we will formalise a novel solution, the Bayesian Volumetric Map (BVM), as a framework for a metric, short-term, egocentric spatial memory for multimodal perception of 3D structure and motion. This solution will enable the implementation of top-down mechanisms of attention guidance of perception towards areas of high entropy/uncertainty, so as to promote active exploration of the environment by the robotic perceptual system. In the process, we will try to address the inherent chal...

  3. Predictive risk mapping of schistosomiasis in Brazil using Bayesian geostatistical models.

    Science.gov (United States)

    Scholte, Ronaldo G C; Gosoniu, Laura; Malone, John B; Chammartin, Frédérique; Utzinger, Jürg; Vounatsou, Penelope

    2014-04-01

    Schistosomiasis is one of the most common parasitic diseases in tropical and subtropical areas, including Brazil. A national control programme was initiated in Brazil in the mid-1970s and proved successful in terms of morbidity control, as the number of cases with hepato-splenic involvement was reduced significantly. To consolidate control and move towards elimination, there is a need for reliable maps of the spatial distribution of schistosomiasis, so that interventions can target communities at highest risk. The purpose of this study was to map the distribution of Schistosoma mansoni in Brazil. We utilized readily available prevalence data from the national schistosomiasis control programme for the years 2005-2009, derived remotely sensed climatic and environmental data and obtained socioeconomic data from various sources. Data were collated into a geographical information system and Bayesian geostatistical models were developed. Model-based maps identified important risk factors related to the transmission of S. mansoni and confirmed that environmental variables are closely associated with indices of poverty. Our smoothed predictive risk map, including uncertainty, highlights priority areas for intervention, namely the northern parts of the North and Southeast regions and the eastern part of the Northeast region. Our predictive risk map provides a useful tool to strengthen existing surveillance-response mechanisms. PMID:24361640

  4. Disentangling the formation of contrasting tree-line physiognomies combining model selection and Bayesian parameterization for simulation models.

    Science.gov (United States)

    Martínez, Isabel; Wiegand, Thorsten; Camarero, J Julio; Batllori, Enric; Gutiérrez, Emilia

    2011-05-01

    Alpine tree-line ecotones are characterized by marked changes at small spatial scales that may result in a variety of physiognomies. A set of alternative individual-based models was tested with data from four contrasting Pinus uncinata ecotones in the central Spanish Pyrenees to reveal the minimal subset of processes required for tree-line formation. A Bayesian approach combined with Markov chain Monte Carlo methods was employed to obtain the posterior distribution of model parameters, allowing the use of model selection procedures. The main features of real tree lines emerged only in models considering nonlinear responses in individual rates of growth or mortality with respect to the altitudinal gradient. Variation in tree-line physiognomy reflected mainly changes in the relative importance of these nonlinear responses, while other processes, such as dispersal limitation and facilitation, played a secondary role. Different nonlinear responses also determined the presence or absence of krummholz, in agreement with recent findings highlighting a different response of diffuse and abrupt or krummholz tree lines to climate change. The method presented here can be widely applied in individual-based simulation models and will turn model selection and evaluation in this type of models into a more transparent, effective, and efficient exercise. PMID:21508601

  5. Basic and Advanced Bayesian Structural Equation Modeling With Applications in the Medical and Behavioral Sciences

    CERN Document Server

    Lee, Sik-Yum

    2012-01-01

    This book provides clear instructions to researchers on how to apply Structural Equation Models (SEMs) for analyzing the interrelationships between observed and latent variables. Basic and Advanced Bayesian Structural Equation Modeling introduces basic and advanced SEMs for analyzing various kinds of complex data, such as ordered and unordered categorical data, multilevel data, mixture data, longitudinal data, highly non-normal data, as well as some of their combinations. In addition, Bayesian semiparametric SEMs to capture the true distribution of explanatory latent variables are introduced.

  6. Featuring Multiple Local Optima to Assist the User in the Interpretation of Induced Bayesian Network Models

    DEFF Research Database (Denmark)

    Dalgaard, Jens; Pena, Jose; Kocka, Tomas

    2004-01-01

    We propose a method to assist the user in the interpretation of the best Bayesian network model induced from data. The method consists in extracting relevant features from the model (e.g. edges, directed paths and Markov blankets) and, then, assessing the confidence in them by studying multiple...

  7. Bayesian interpolation in a dynamic sinusoidal model with application to packet-loss concealment

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Christensen, Mads Græsbøll; Cemgil, Ali Taylan;

    2010-01-01

    a Bayesian inference scheme for the missing observations, hidden states and model parameters of the dynamic model. The inference scheme is based on a Markov chain Monte Carlo method known as the Gibbs sampler. We illustrate the performance of the inference scheme in an application to packet-loss concealment...

  8. Probabilistic Modelling of Fatigue Life of Composite Laminates Using Bayesian Inference

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimirov; Kiureghian, Armen Der

    2014-01-01

    . Model parameters are estimated by Bayesian inference. The reference data used consists of constant-amplitude fatigue test results for a multi-directional laminate subjected to seven different load ratios. The paper describes the modelling techniques and the parameter estimation procedure, supported by...

  9. An Explanation of the Effectiveness of Latent Semantic Indexing by Means of a Bayesian Regression Model.

    Science.gov (United States)

    Story, Roger E.

    1996-01-01

    Discussion of the use of Latent Semantic Indexing to determine relevancy in information retrieval focuses on statistical regression and Bayesian methods. Topics include keyword searching; a multiple regression model; how the regression model can aid search methods; and limitations of this approach, including complexity, linearity, and…

  10. A Bayesian Multi-Level Factor Analytic Model of Consumer Price Sensitivities across Categories

    Science.gov (United States)

    Duvvuri, Sri Devi; Gruca, Thomas S.

    2010-01-01

    Identifying price sensitive consumers is an important problem in marketing. We develop a Bayesian multi-level factor analytic model of the covariation among household-level price sensitivities across product categories that are substitutes. Based on a multivariate probit model of category incidence, this framework also allows the researcher to…

  11. Markov Model of Wind Power Time Series Using Bayesian Inference of Transition Matrix

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Berthelsen, Kasper Klitgaard; Bak-Jensen, Birgitte;

    2009-01-01

    This paper proposes to use Bayesian inference of transition matrix when developing a discrete Markov model of a wind speed/power time series and 95% credible interval for the model verification. The Dirichlet distribution is used as a conjugate prior for the transition matrix. Three discrete Markov...
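
    The Dirichlet prior mentioned above is conjugate to the multinomial transition counts, so each row of the posterior transition matrix is again Dirichlet, with the prior pseudo-counts added to the observed counts. The sketch below illustrates that update and a 95% credible interval from posterior draws; the three wind-power states, the counts, and the prior pseudo-count are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative transition counts between three discretized wind-power states.
counts = np.array([[30, 8, 2],
                   [10, 25, 5],
                   [3, 9, 28]])
alpha_prior = 1.0  # symmetric Dirichlet pseudo-count

post_alpha = counts + alpha_prior  # row-wise Dirichlet posterior parameters
post_mean = post_alpha / post_alpha.sum(axis=1, keepdims=True)

# 95% credible intervals for every transition probability via posterior sampling.
samples = np.stack([rng.dirichlet(a, size=5000) for a in post_alpha], axis=1)
ci_low, ci_high = np.percentile(samples, [2.5, 97.5], axis=0)

print("posterior mean transition matrix:\n", np.round(post_mean, 3))
print(f"95% CI for P(state 1 -> state 1): [{ci_low[0, 0]:.3f}, {ci_high[0, 0]:.3f}]")
```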

  12. The bivariate combined model for spatial data analysis.

    Science.gov (United States)

    Neyens, Thomas; Lawson, Andrew B; Kirby, Russell S; Faes, Christel

    2016-08-15

    To describe the spatial distribution of diseases, a number of methods have been proposed to model relative risks within areas. Most models use Bayesian hierarchical methods, in which one models both spatially structured and unstructured extra-Poisson variance present in the data. For modelling a single disease, the conditional autoregressive (CAR) convolution model has been very popular. More recently, a combined model was proposed that 'combines' ideas from the CAR convolution model and the well-known Poisson-gamma model. The combined model was shown to be a good alternative to the CAR convolution model when there was a large amount of uncorrelated extra-variance in the data. Fewer solutions exist for modelling two diseases simultaneously or modelling a disease in two sub-populations simultaneously. Furthermore, existing models are typically based on the CAR convolution model. In this paper, a bivariate version of the combined model is proposed in which the unstructured heterogeneity term is split up into terms that are shared and terms that are specific to the disease or subpopulation, while spatial dependency is introduced via a univariate or multivariate Markov random field. The proposed method is illustrated by analysis of disease data in Georgia (USA) and Limburg (Belgium) and in a simulation study. We conclude that the bivariate combined model constitutes an interesting model when two diseases are possibly correlated. As the choice of the preferred model differs between data sets, we suggest using the new and existing modelling approaches together and choosing the best model via goodness-of-fit statistics. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26928309

  13. A Bayesian Framework for Active Artificial Perception

    OpenAIRE

    Ferreira, Joao; Lobo, Jorge; Bessiere, Pierre; Castelo-Branco, M; Dias, Jorge

    2012-01-01

    In this text, we present a Bayesian framework for active multimodal perception of 3D structure and motion. The design of this framework finds its inspiration in the role of the dorsal perceptual pathway of the human brain. Its composing models build upon a common egocentric spatial configuration that is naturally fitting for the integration of readings from multiple sensors using a Bayesian approach. In the process, we will contribute with efficient and robust probabilistic solutions for cycl...

  14. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2010-01-01

    Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundation, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology. New to the Second Edition: a new chapter on Bayesian network classifiers and a new section on object-oriente...

  15. Generating Hourly Rainfall Model using Bayesian Time Series Model (A Case Study at Sentral Station, Bondowoso)

    Directory of Open Access Journals (Sweden)

    Entin Hidayah

    2011-02-01

    Full Text Available Disaggregation of hourly rainfall data is very important in providing input to continuous rainfall-runoff models when the availability of automatic rainfall records is limited. Continuous rainfall-runoff modelling requires rainfall data in the form of hourly series; such data can be obtained by temporal disaggregation at a single site. The paper attempts to generate a single-site rainfall model based upon a time series (AR1) model by adjusting and establishing a dummy procedure. Estimated with Bayesian Markov Chain Monte Carlo (MCMC), the objective variable is hourly rainfall depth. Model performance has been evaluated by comparing historical data with model predictions. The results show that the model performs well for dry interval periods, as indicated by a small MAE of 0.21.
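
    To make the modelling idea concrete, the sketch below fits the autoregressive coefficient of an AR(1) series with a random-walk Metropolis sampler, a basic form of the MCMC estimation mentioned above. The synthetic series, the flat prior on (-1, 1), and the fixed noise scale are assumptions for illustration and do not reproduce the paper's dummy-procedure rainfall setup.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate an illustrative AR(1) series standing in for a transformed rainfall signal.
phi_true, sigma, n = 0.6, 1.0, 500
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + rng.normal(0.0, sigma)

def log_post(phi, x, sigma=1.0):
    """Log posterior of the AR(1) coefficient with a flat prior on (-1, 1)."""
    if not -1.0 < phi < 1.0:
        return -np.inf
    resid = x[1:] - phi * x[:-1]
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

# Random-walk Metropolis sampling of phi.
phi, lp, chain = 0.0, log_post(0.0, x), []
for _ in range(5000):
    prop = phi + rng.normal(0.0, 0.05)
    lp_prop = log_post(prop, x)
    if np.log(rng.uniform()) < lp_prop - lp:
        phi, lp = prop, lp_prop
    chain.append(phi)

print("posterior mean of AR(1) coefficient:", np.round(np.mean(chain[1000:]), 3))
```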

  16. Improving Local and Regional Flood Quantile Estimates Using a Hierarchical Bayesian GEV Model

    Science.gov (United States)

    Ribeiro Lima, C. H.; Lall, U.; Devineni, N.; Troy, T.

    2013-12-01

    Flood risk management usually relies on local and regional flood frequency analysis, which tends to suffer from lack of data and parameter uncertainties. Here we estimate local and regional Generalized Extreme Value (GEV) distribution parameters in a hierarchical Bayesian framework, which helps reduce uncertainties by pooling more information in the estimation process and provides a simple topology to propagate model and parameter uncertainties to flood quantile estimates. As prior information for the Bayesian model, it is assumed for each site that the GEV location and scale parameters come from independent log-normal distributions, whose mean parameter follows the well-known log-log scaling law with the drainage area. The shape parameter for each site is shrunk towards a common mean. Non-informative prior distributions are assumed for the hyperparameters and the MCMC method is used to sample from the posterior distributions. The model is tested using annual maximum series from 20 streamflow gauges located in an 83,000 km2 basin in southeastern Brazil. The results show a significant improvement of flood quantile estimates over the traditional GEV model, particularly for sites with few data. For return periods within the range of the data (around 50 years), the Bayesian credible intervals for the flood quantiles are narrower than the classical confidence limits based on the delta method. As the return period increases beyond the range of the data, the confidence limits from the delta method become unreliable and the Bayesian credible intervals provide a way to estimate satisfactory confidence bands for the flood quantiles considering the parameter uncertainties. In order to evaluate the applicability of the proposed hierarchical Bayesian model for regional flood frequency analysis, we estimate flood quantiles for three randomly chosen out-of-sample sites and compare them with classical estimates using the index flood method. The posterior distributions of the scaling
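
    The sketch below shows two ingredients mentioned above in their simplest form: a log-log scaling of the GEV location parameter with drainage area, and a T-year flood quantile computed from GEV parameters (note that SciPy's shape parameter c equals minus the hydrological shape xi). The scaling coefficients and parameter values are illustrative assumptions rather than the study's posterior estimates, and the full hierarchical Bayesian treatment would place priors on all of these quantities.

```python
import numpy as np
from scipy.stats import genextreme

def gev_location(area_km2, a=0.8, b=0.55):
    """Log-log scaling law for the GEV location parameter: log(mu) = a + b*log(area)."""
    return np.exp(a + b * np.log(area_km2))

def flood_quantile(return_period, mu, sigma, xi):
    """T-year flood quantile; SciPy's genextreme uses c = -xi."""
    p = 1.0 - 1.0 / return_period
    return genextreme.ppf(p, c=-xi, loc=mu, scale=sigma)

area = 1200.0  # drainage area in km^2 (illustrative)
mu = gev_location(area)
print("50-year flood quantile:", np.round(flood_quantile(50, mu, sigma=0.3 * mu, xi=0.1), 1))
```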

  17. A Bayesian beta distribution model for estimating rainfall IDF curves in a changing climate

    Science.gov (United States)

    Lima, Carlos H. R.; Kwon, Hyun-Han; Kim, Jin-Young

    2016-09-01

    The estimation of intensity-duration-frequency (IDF) curves for rainfall data comprises a classical task in hydrology studies to support a variety of water resources projects, including urban drainage and the design of flood control structures. In a changing climate, however, traditional approaches based on historical records of rainfall and on the stationarity assumption can be inadequate and lead to poor estimates of rainfall intensity quantiles. Climate change scenarios built on General Circulation Models offer a way to access and estimate future changes in spatial and temporal rainfall patterns at the daily scale at most, which is not as fine a temporal resolution as required (e.g. hours) to directly estimate IDF curves. In this paper we propose a novel methodology based on a four-parameter beta distribution to estimate IDF curves conditioned on the observed (or simulated) daily rainfall, which becomes the time-varying upper bound of the updated nonstationary beta distribution. The inference is conducted in a Bayesian framework that provides a better way to take into account the uncertainty in the model parameters when building the IDF curves. The proposed model is tested using rainfall data from four stations located in South Korea and projected climate change Representative Concentration Pathways (RCPs) scenarios 6 and 8.5 from the Met Office Hadley Centre HadGEM3-RA model. The results show that the developed model fits the historical data as well as the traditional Generalized Extreme Value (GEV) distribution but is able to produce future IDF curves that significantly differ from the historically based IDF curves. For the stations and RCP scenarios analysed in this work, the proposed model predicts an increase in the intensity of extreme rainfalls of short duration with long return periods.
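
    A minimal sketch of the central idea above: treat sub-daily rainfall as following a beta distribution whose upper bound is the (observed or simulated) daily total, so quantiles of short-duration depth can be read off conditionally on daily rainfall. The shape parameters below are illustrative assumptions rather than the paper's fitted four-parameter model, which infers bounds and shapes jointly in a Bayesian framework.

```python
import numpy as np
from scipy.stats import beta

def subdaily_quantile(q, daily_total_mm, a=2.0, b=5.0, lower=0.0):
    """Quantile of a sub-daily depth from a beta distribution bounded above by the
    daily total (the time-varying upper bound) and below by `lower`."""
    return beta.ppf(q, a, b, loc=lower, scale=daily_total_mm - lower)

# Illustrative: 99th-percentile short-duration depth given an 80 mm daily total.
print(np.round(subdaily_quantile(0.99, daily_total_mm=80.0), 1))
```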

  18. Introduction of a methodology for visualization and graphical interpretation of Bayesian classification models.

    Science.gov (United States)

    Balfer, Jenny; Bajorath, Jürgen

    2014-09-22

    Supervised machine learning models are widely used in chemoinformatics, especially for the prediction of new active compounds or targets of known actives. Bayesian classification methods are among the most popular machine learning approaches for the prediction of activity from chemical structure. Much work has focused on predicting structure-activity relationships (SARs) on the basis of experimental training data. By contrast, only a few efforts have thus far been made to rationalize the performance of Bayesian or other supervised machine learning models and better understand why they might succeed or fail. In this study, we introduce an intuitive approach for the visualization and graphical interpretation of naïve Bayesian classification models. Parameters derived during supervised learning are visualized and interactively analyzed to gain insights into model performance and identify features that determine predictions. The methodology is introduced in detail and applied to assess Bayesian modeling efforts and predictions on compound data sets of varying structural complexity. Different classification models and features determining their performance are characterized in detail. A prototypic implementation of the approach is provided. PMID:25137527

  19. Plackett-Luce regression: A new Bayesian model for polychotomous data

    OpenAIRE

    Archambeau, Cedric; Caron, Francois

    2012-01-01

    Multinomial logistic regression is one of the most popular models for modelling the effect of explanatory variables on a subject's choice among a set of specified options. This model has found numerous applications in machine learning, psychology, and economics. Bayesian inference in this model is nontrivial and requires either a Metropolis-Hastings algorithm or rejection sampling within a Gibbs sampler. In this paper, we propose an alternative model to multinomial logistic regress...

  20. Scale dependence in the effects of leaf ecophysiological traits on photosynthesis: Bayesian parameterization of photosynthesis models.

    Science.gov (United States)

    Feng, Xiaohui; Dietze, Michael

    2013-12-01

    Relationships between leaf traits and carbon assimilation rates are commonly used to predict primary productivity at scales from the leaf to the globe. We addressed how the shape and magnitude of these relationships vary across temporal, spatial and taxonomic scales to improve estimates of carbon dynamics. Photosynthetic CO2 and light response curves, leaf nitrogen (N), chlorophyll (Chl) concentration and specific leaf area (SLA) of 25 grassland species were measured. In addition, C3 and C4 photosynthesis models were parameterized using a novel hierarchical Bayesian approach to quantify the effects of leaf traits on photosynthetic capacity and parameters at different scales. The effects of plant physiological traits on photosynthetic capacity and parameters varied among species, plant functional types and taxonomic scales. Relationships in the grassland biome were significantly different from the global average. Within-species variability in photosynthetic parameters through the growing season could be attributed to the seasonal changes of leaf traits, especially leaf N and Chl, but these responses followed qualitatively different relationships from the across-species relationship. The results suggest that one broad-scale relationship is not sufficient to characterize ecosystem condition and change at multiple scales. Applying trait relationships without articulating the scales may cause substantial carbon flux estimation errors. PMID:23952643

  1. Non-parametric Bayesian graph models reveal community structure in resting state fMRI

    DEFF Research Database (Denmark)

    Andersen, Kasper Winther; Madsen, Kristoffer H.; Siebner, Hartwig Roman;

    2014-01-01

    Modeling of resting state functional magnetic resonance imaging (rs-fMRI) data using network models is of increasing interest. It is often desirable to group nodes into clusters to interpret the communication patterns between nodes. In this study we consider three different nonparametric Bayesian models for node clustering in complex networks. In particular, we test their ability to predict unseen data and their ability to reproduce clustering across datasets. The three generative models considered are the Infinite Relational Model (IRM), Bayesian Community Detection (BCD), and the Infinite Diagonal Model (IDM). The models define probabilities of generating links within and between clusters and the difference between the models lies in the restrictions they impose upon the between-cluster link probabilities. IRM is the most flexible model with no restrictions on the probabilities of links...

  2. Bayesian approach for thermal diffusivity mapping from infrared images with spatially random heat pulse heating

    Energy Technology Data Exchange (ETDEWEB)

    Fudym, O [RAPSODEE UMR 2392 CNRS, Mines Albi, Campus Jarlard, Albi (France); Orlande, H R B [PEM/COPPE, Federal University of Rio de Janeiro (Brazil); Bamford, M; Batsale, J C [TREFLE-ENSAM, UMR 8508 CNRS, Talence (France)], E-mail: fudym@enstimac.fr

    2008-11-01

    This paper aims at the identification of spatially-dependent thermophysical properties and is focused on the mapping of thermal diffusivity. A one-dimensional heat conduction problem is used for the comparison of ordinary least-squares, maximum a posteriori and Markov Chain Monte Carlo methods. Simulated temperature measurements are used in the inverse analysis, which is based on a nodal strategy that results in a linear estimation problem. The nodal strategy relies on the availability of temperature measurements with fine spatial resolution and high frequency, typical of modern infrared cameras.
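
    Because the nodal strategy yields a linear estimation problem, the contrast between ordinary least squares and a maximum a posteriori estimate can be sketched directly: the MAP solution adds a Gaussian prior term that regularizes the normal equations. The design matrix, noise level, and prior scale below are illustrative assumptions, not the temperature-field operators of the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative linear estimation problem y = X @ theta + noise.
n, p = 200, 5
X = rng.normal(size=(n, p))
theta_true = rng.normal(size=p)
sigma = 0.3
y = X @ theta_true + rng.normal(scale=sigma, size=n)

# Ordinary least squares.
theta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Maximum a posteriori with an independent Gaussian prior theta ~ N(0, tau^2 I).
tau = 1.0
A = X.T @ X / sigma ** 2 + np.eye(p) / tau ** 2
theta_map = np.linalg.solve(A, X.T @ y / sigma ** 2)

print("true:", np.round(theta_true, 3))
print("OLS :", np.round(theta_ols, 3))
print("MAP :", np.round(theta_map, 3))
```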

  3. Spatial model of autocatalytic reactions

    Science.gov (United States)

    de Anna, Pietro; di Patti, Francesca; Fanelli, Duccio; McKane, Alan J.; Dauxois, Thierry

    2010-05-01

    Biological cells with all of their surface structure and complex interior stripped away are essentially vesicles—membranes composed of lipid bilayers which form closed sacs. Vesicles are thought to be relevant as models of primitive protocells, and they could have provided the ideal environment for prebiotic reactions to occur. In this paper, we investigate the stochastic dynamics of a set of autocatalytic reactions, within a spatially bounded domain, so as to mimic a primordial cell. The discreteness of the constituents of the autocatalytic reactions gives rise to large sustained oscillations even when the number of constituents is quite large. These oscillations are spatiotemporal in nature, unlike those found in previous studies, which consisted only of temporal oscillations. We speculate that these oscillations may have a role in seeding membrane instabilities which lead to vesicle division. In this way synchronization could be achieved between protocell growth and the reproduction rate of the constituents (the protogenetic material) in simple protocells.

  4. Bayesian-MCMC-based parameter estimation of stealth aircraft RCS models

    Science.gov (United States)

    Xia, Wei; Dai, Xiao-Xia; Feng, Yuan

    2015-12-01

    When modeling a stealth aircraft with low RCS (Radar Cross Section), conventional parameter estimation methods may cause a deviation from the actual distribution, owing to the fact that the characteristic parameters are estimated via directly calculating the statistics of RCS. The Bayesian-Markov Chain Monte Carlo (Bayesian-MCMC) method is introduced herein to estimate the parameters so as to improve the fitting accuracies of fluctuation models. The parameter estimations of the lognormal and the Legendre polynomial models are reformulated in the Bayesian framework. The MCMC algorithm is then adopted to calculate the parameter estimates. Numerical results show that the distribution curves obtained by the proposed method exhibit improved consistence with the actual ones, compared with those fitted by the conventional method. The fitting accuracy could be improved by no less than 25% for both fluctuation models, which implies that the Bayesian-MCMC method might be a good candidate among the optimal parameter estimation methods for stealth aircraft RCS models. Project supported by the National Natural Science Foundation of China (Grant No. 61101173), the National Basic Research Program of China (Grant No. 613206), the National High Technology Research and Development Program of China (Grant No. 2012AA01A308), the State Scholarship Fund by the China Scholarship Council (CSC), and the Oversea Academic Training Funds, and University of Electronic Science and Technology of China (UESTC).
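
    For the lognormal fluctuation model mentioned above, a useful reference point is that the log of the RCS samples is Gaussian, so part of the estimation has a closed form: with a known log-scale and a Gaussian prior, the posterior of the log-mean is available analytically. The sketch below shows only that conjugate update as a minimal illustration; the paper's full Bayesian-MCMC treatment samples all parameters jointly, and the data and prior values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative RCS samples (m^2) drawn from a lognormal fluctuation model.
rcs = rng.lognormal(mean=-1.0, sigma=0.8, size=100)
z = np.log(rcs)

# Conjugate update for the log-mean, assuming a known log-scale and N(m0, s0^2) prior.
log_sigma = 0.8
m0, s0 = 0.0, 10.0
n = z.size
post_var = 1.0 / (1.0 / s0 ** 2 + n / log_sigma ** 2)
post_mean = post_var * (m0 / s0 ** 2 + z.sum() / log_sigma ** 2)

print(f"posterior of log-RCS mean: {post_mean:.3f} +/- {np.sqrt(post_var):.3f}")
```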

  5. Hybrid nested sampling algorithm for Bayesian model selection applied to inverse subsurface flow problems

    KAUST Repository

    Elsheikh, Ahmed H.

    2014-02-01

    A Hybrid Nested Sampling (HNS) algorithm is proposed for efficient Bayesian model calibration and prior model selection. The proposed algorithm combines the Nested Sampling (NS) algorithm, Hybrid Monte Carlo (HMC) sampling, and gradient estimation using the Stochastic Ensemble Method (SEM). NS is an efficient sampling algorithm that can be used for Bayesian calibration and for estimating the Bayesian evidence for prior model selection, and it has the advantage of computational feasibility. Within the nested sampling algorithm, a constrained sampling step is performed. For this step, we utilize HMC to reduce the correlation between successive sampled states. HMC relies on the gradient of the logarithm of the posterior distribution, which we estimate using a stochastic ensemble method based on an ensemble of directional derivatives. SEM only requires forward model runs; the simulator is used as a black box and no adjoint code is needed. The developed HNS algorithm is successfully applied for Bayesian calibration and prior model selection of several nonlinear subsurface flow problems. © 2013 Elsevier Inc.

  6. A spatial bivariate probit model for correlated binary data with application to adverse birth outcomes.

    Science.gov (United States)

    Neelon, Brian; Anthopolos, Rebecca; Miranda, Marie Lynn

    2014-04-01

    Motivated by a study examining geographic variation in birth outcomes, we develop a spatial bivariate probit model for the joint analysis of preterm birth and low birth weight. The model uses a hierarchical structure to incorporate individual and areal-level information, as well as spatially dependent random effects for each spatial unit. Because rates of preterm birth and low birth weight are likely to be correlated within geographic regions, we model the spatial random effects via a bivariate conditionally autoregressive prior, which induces regional dependence between the outcomes and provides spatial smoothing and sharing of information across neighboring areas. Under this general framework, one can obtain region-specific joint, conditional, and marginal inferences of interest. We adopt a Bayesian modeling approach and develop a practical Markov chain Monte Carlo computational algorithm that relies primarily on easily sampled Gibbs steps. We illustrate the model using data from the 2007-2008 North Carolina Detailed Birth Record. PMID:22599322

  7. A spatial model of bird abundance as adjusted for detection probability

    Science.gov (United States)

    Gorresen, P.M.; Mcmillan, G.P.; Camp, R.J.; Pratt, T.K.

    2009-01-01

    Modeling the spatial distribution of animals can be complicated by spatial and temporal effects (i.e. spatial autocorrelation and trends in abundance over time) and other factors such as imperfect detection probabilities and observation-related nuisance variables. Recent advances in modeling have demonstrated various approaches that handle most of these factors but which require a degree of sampling effort (e.g. replication) not available to many field studies. We present a two-step approach that addresses these challenges to spatially model species abundance. Habitat, spatial and temporal variables were handled with a Bayesian approach which facilitated modeling hierarchically structured data. Predicted abundance was subsequently adjusted to account for imperfect detection and the area effectively sampled for each species. We provide examples of our modeling approach for two endemic Hawaiian nectarivorous honeycreepers: 'i'iwi Vestiaria coccinea and 'apapane Himatione sanguinea. © 2009 Ecography.

  8. Adaptive surrogate modeling for response surface approximations with application to bayesian inference

    KAUST Repository

    Prudhomme, Serge

    2015-09-17

    Parameter estimation for complex models using Bayesian inference is usually a very costly process as it requires a large number of solves of the forward problem. We show here how the construction of adaptive surrogate models using a posteriori error estimates for quantities of interest can significantly reduce the computational cost in problems of statistical inference. As surrogate models provide only approximations of the true solutions of the forward problem, it is nevertheless necessary to control these errors in order to construct an accurate reduced model with respect to the observables utilized in the identification of the model parameters. Effectiveness of the proposed approach is demonstrated on a numerical example dealing with the Spalart–Allmaras model for the simulation of turbulent channel flows. In particular, we illustrate how Bayesian model selection using the adapted surrogate model in place of solving the coupled nonlinear equations leads to the same quality of results while requiring fewer nonlinear PDE solves.
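
    To make the surrogate idea concrete, the following sketch replaces an "expensive" forward model (here a trivial stand-in) with a cheap polynomial surrogate built from a handful of forward solves, and then runs random-walk Metropolis on the surrogate-based posterior. It is a minimal illustration under invented settings, not the adaptive, error-controlled surrogate construction or the Spalart–Allmaras application of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for an expensive forward model (in practice, e.g., a turbulent channel-flow solve)
def forward(k):
    return np.exp(-k)

k_true, sigma = 1.3, 0.02
y_obs = forward(k_true) + rng.normal(0.0, sigma)           # one synthetic observation

# 1) Build a cheap surrogate from a handful of forward solves over the prior range [0, 3]
k_design = np.linspace(0.0, 3.0, 12)
surrogate = np.poly1d(np.polyfit(k_design, forward(k_design), deg=4))

def log_post(k, model):
    if not 0.0 <= k <= 3.0:                                 # uniform prior on [0, 3]
        return -np.inf
    return -0.5 * ((y_obs - model(k)) / sigma) ** 2

# 2) Random-walk Metropolis driven by the surrogate instead of the full forward model
k = 1.0
lp = log_post(k, surrogate)
samples = []
for _ in range(20000):
    k_prop = k + rng.normal(0.0, 0.1)
    lp_prop = log_post(k_prop, surrogate)
    if np.log(rng.uniform()) < lp_prop - lp:
        k, lp = k_prop, lp_prop
    samples.append(k)

print("surrogate-based posterior mean of k:", round(float(np.mean(samples[5000:])), 3))
```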

  9. A Software Risk Analysis Model Using Bayesian Belief Network

    Institute of Scientific and Technical Information of China (English)

    Yong Hu; Juhua Chen; Mei Liu; Yang Yun; Junbiao Tang

    2006-01-01

    The uncertainty during the period of software project development often brings huge risks to contractors and clients. If we can find an effective method to predict the cost and quality of software projects based on facts such as the project character and the two sides' cooperating capability at the beginning of the project, we can reduce the risk. Bayesian Belief Networks (BBN) are a good tool for analyzing uncertain consequences, but it is difficult to produce a precise network structure and conditional probability tables. In this paper, we build up the network structure by the Delphi method and then continuously update the conditional probability tables and node confidence levels from application cases, which gives the evaluation network the ability to learn and to evaluate an organization's software development risk more accurately. This paper also introduces the EM algorithm, which enhances the ability to handle hidden nodes arising from differing software projects.

  10. Using Bayesian Model Selection to Characterize Neonatal Eeg Recordings

    Science.gov (United States)

    Mitchell, Timothy J.

    2009-12-01

    The brains of premature infants must undergo significant maturation outside of the womb and are thus particularly susceptible to injury. Electroencephalographic (EEG) recordings are an important diagnostic tool in determining if a newborn's brain is functioning normally or if injury has occurred. However, interpreting the recordings is difficult and requires the skills of a trained electroencephalographer. Because these EEG specialists are rare, an automated interpretation of newborn EEG recordings would increase access to an important diagnostic tool for physicians. To automate this procedure, we employ Bayesian probability theory to compute the posterior probability for the EEG features of interest and use the results in a program designed to mimic EEG specialists. Specifically, we will be identifying waveforms of varying frequency and amplitude, as well as periods of flat recordings where brain activity is minimal.

  11. Bayesian state space models for dynamic genetic network construction across multiple tissues.

    Science.gov (United States)

    Liang, Yulan; Kelemen, Arpad

    2016-08-01

    Construction of gene-gene interaction networks and potential pathways is a challenging and important problem in genomic research for complex diseases, and estimating the dynamic changes of the temporal correlations and the non-stationarity is key in this process. In this paper, we develop dynamic state space models with hierarchical Bayesian settings to tackle this challenge for inferring the dynamic profiles and genetic networks associated with disease treatments. We treat both the stochastic transition matrix and the observation matrix as time-varying and include temporal correlation structures in the covariance matrix estimation in the multivariate Bayesian state space models. The unevenly spaced short time courses with unseen time points are treated as hidden state variables. Hierarchical Bayesian approaches with various prior and hyper-prior models, using Markov chain Monte Carlo and Gibbs sampling algorithms, are used to estimate the model parameters and the hidden state variables. We apply the proposed hierarchical Bayesian state space models to multiple-tissue (liver, skeletal muscle, and kidney) Affymetrix time course data sets following corticosteroid (CS) drug administration. Both simulation and real data analysis results show that the genomic changes over time and the gene-gene interactions in response to CS treatment can be well captured by the proposed models. The proposed dynamic hierarchical Bayesian state space modeling approach could be expanded and applied to other large-scale genomic data, such as next-generation sequencing (NGS) data combined with real-time and time-varying electronic health records (EHR), for more comprehensive and robust systematic and network-based analysis in order to transform big biomedical data into predictions and diagnostics for precision medicine and personalized healthcare with better decision making and patient outcomes. PMID:27343475
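
    The hierarchical, time-varying model described above is built on top of linear-Gaussian state-space machinery. As a minimal, self-contained reference point (not the authors' multivariate formulation with time-varying matrices), the sketch below simulates a one-dimensional latent trajectory observed with noise and recovers it with a Kalman filter; all parameter values are assumed.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate a 1-D latent trajectory observed with noise
T, a, q, r = 50, 0.95, 0.05, 0.2          # length, transition coeff., state and obs. noise variances
x = np.zeros(T)
for t in range(1, T):
    x[t] = a * x[t - 1] + rng.normal(0, np.sqrt(q))
y = x + rng.normal(0, np.sqrt(r), size=T)

# Kalman filter: posterior mean m[t] and variance P[t] of the hidden state given y[0..t]
m, P = np.zeros(T), np.ones(T)
for t in range(1, T):
    m_pred = a * m[t - 1]                 # predict
    P_pred = a ** 2 * P[t - 1] + q
    K = P_pred / (P_pred + r)             # Kalman gain
    m[t] = m_pred + K * (y[t] - m_pred)   # update with observation y[t]
    P[t] = (1 - K) * P_pred

print("last filtered mean and variance:", round(m[-1], 3), round(P[-1], 3))
```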

  12. Climate information based streamflow and rainfall forecasts for Huai River basin using hierarchical Bayesian modeling

    OpenAIRE

    Chen, X.; Hao, Z; N. Devineni; U. Lall

    2014-01-01

    A Hierarchical Bayesian model is presented for one season-ahead forecasts of summer rainfall and streamflow using exogenous climate variables for east central China. The model provides estimates of the posterior forecasted probability distribution for 12 rainfall and 2 streamflow stations considering parameter uncertainty, and cross-site correlation. The model has a multi-level structure with regression coefficients modeled from a common multi-variate normal distribution resul...

  13. Climate information based streamflow and rainfall forecasts for Huai River Basin using Hierarchical Bayesian Modeling

    OpenAIRE

    Chen, X.; Hao, Z; N. Devineni; U. Lall

    2013-01-01

    A Hierarchical Bayesian model for forecasting regional summer rainfall and streamflow season-ahead using exogenous climate variables for East Central China is presented. The model provides estimates of the posterior forecasted probability distribution for 12 rainfall and 2 streamflow stations considering parameter uncertainty, and cross-site correlation. The model has a multilevel structure with regression coefficients modeled from a common multivariate normal distribution, resulting in partial-po...

  14. Bayesian inference of models and hyper-parameters for robust optic-flow estimation

    OpenAIRE

    Héas, Patrick; Herzet, Cédric; Memin, Etienne

    2012-01-01

    Selecting optimal models and hyper-parameters is crucial for accurate optic-flow estimation. This paper provides a solution to the problem in a generic Bayesian framework. The method is based on a conditional model linking the image intensity function, the unknown velocity field, hyper-parameters, and the prior and likelihood motion models. Inference is performed on each of the three levels of this hierarchical model by maximization of marginalized a...

  15. Bayesian Forecasting of US Growth using Basic Time Varying Parameter Models and Expectations Data

    OpenAIRE

    Basturk, Nalan; Ceyhan, Pinar; Dijk, Herman

    2014-01-01

    Time-varying patterns in US growth are analyzed using various univariate model structures, starting from a naive structure in which all features change every period and moving to a model in which the slow variation in the conditional mean and the changes in the conditional variance are specified together with their interaction; survey data on expected growth are included in order to strengthen the information in the model. Use is made of a simulation-based Bayesian inferential meth...

  16. Modeling and Analysis of Call Center Arrival Data: A Bayesian Approach

    OpenAIRE

    Refik Soyer; M. Murat Tarimcilar

    2008-01-01

    In this paper, we present a modulated Poisson process model to describe and analyze arrival data to a call center. The attractive feature of this model is that it takes into account both covariate and time effects on the call volume intensity, and in so doing, enables us to assess the effectiveness of different advertising strategies along with predicting the arrival patterns. A Bayesian analysis of the model is developed and an extension of the model is presented to describe potential hetero...

  17. A General and Flexible Approach to Estimating the Social Relations Model Using Bayesian Methods

    Science.gov (United States)

    Ludtke, Oliver; Robitzsch, Alexander; Kenny, David A.; Trautwein, Ulrich

    2013-01-01

    The social relations model (SRM) is a conceptual, methodological, and analytical approach that is widely used to examine dyadic behaviors and interpersonal perception within groups. This article introduces a general and flexible approach to estimating the parameters of the SRM that is based on Bayesian methods using Markov chain Monte Carlo…

  18. Non-parametric Bayesian models of response function in dynamic image sequences

    Czech Academy of Sciences Publication Activity Database

    Tichý, Ondřej; Šmídl, Václav

    -, - (2016). ISSN 1077-3142 R&D Projects: GA ČR GA13-29225S Institutional support: RVO:67985556 Keywords : Response function * Blind source separation * Dynamic medical imaging * Probabilistic models * Bayesian methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.540, year: 2014 http://library.utia.cas.cz/separaty/2016/AS/tichy-0456983.pdf

  19. Food Reconstruction Using Isotopic Transferred Signals (FRUITS): A Bayesian Model for Diet Reconstruction

    Czech Academy of Sciences Publication Activity Database

    Fernandes, R.; Millard, A.R.; Brabec, Marek; Nadeau, M.J.; Grootes, P.

    2014-01-01

    Roč. 9, č. 2 (2014), Art. no. e87436. E-ISSN 1932-6203 Institutional support: RVO:67985807 Keywords : ancient diet reconstruction * stable isotope measurements * mixture model * Bayesian estimation * Dirichlet prior Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 3.234, year: 2014

  20. A Bayesian model for predicting face recognition performance using image quality

    NARCIS (Netherlands)

    Dutta, Abhishek; Veldhuis, Raymond; Spreeuwers, Luuk

    2014-01-01

    Quality of a pair of facial images is a strong indicator of the uncertainty in decision about identity based on that image pair. In this paper, we describe a Bayesian approach to model the relation between image quality (like pose, illumination, noise, sharpness, etc) and corresponding face recognit

  1. Discriminative variable subsets in Bayesian classification with mixture models, with application in flow cytometry studies.

    Science.gov (United States)

    Lin, Lin; Chan, Cliburn; West, Mike

    2016-01-01

    We discuss the evaluation of subsets of variables for the discriminative evidence they provide in multivariate mixture modeling for classification. The novel development of Bayesian classification analysis presented is partly motivated by problems of design and selection of variables in biomolecular studies, particularly involving widely used assays of large-scale single-cell data generated using flow cytometry technology. For such studies and for mixture modeling generally, we define discriminative analysis that overlays fitted mixture models using a natural measure of concordance between mixture component densities, and define an effective and computationally feasible method for assessing and prioritizing subsets of variables according to their roles in discrimination of one or more mixture components. We relate the new discriminative information measures to Bayesian classification probabilities and error rates, and exemplify their use in Bayesian analysis of Dirichlet process mixture models fitted via Markov chain Monte Carlo methods as well as using a novel Bayesian expectation-maximization algorithm. We present a series of theoretical and simulated data examples to fix concepts and exhibit the utility of the approach, and compare with prior approaches. We demonstrate application in the context of automatic classification and discriminative variable selection in high-throughput systems biology using large flow cytometry datasets. PMID:26040910

  2. RevBayes: Bayesian Phylogenetic Inference Using Graphical Models and an Interactive Model-Specification Language.

    Science.gov (United States)

    Höhna, Sebastian; Landis, Michael J; Heath, Tracy A; Boussau, Bastien; Lartillot, Nicolas; Moore, Brian R; Huelsenbeck, John P; Ronquist, Fredrik

    2016-07-01

    Programs for Bayesian inference of phylogeny currently implement a unique and fixed suite of models. Consequently, users of these software packages are simultaneously forced to use a number of programs for a given study, while also lacking the freedom to explore models that have not been implemented by the developers of those programs. We developed a new open-source software package, RevBayes, to address these problems. RevBayes is entirely based on probabilistic graphical models, a powerful generic framework for specifying and analyzing statistical models. Phylogenetic-graphical models can be specified interactively in RevBayes, piece by piece, using a new succinct and intuitive language called Rev. Rev is similar to the R language and the BUGS model-specification language, and should be easy to learn for most users. The strength of RevBayes is the simplicity with which one can design, specify, and implement new and complex models. Fortunately, this tremendous flexibility does not come at the cost of slower computation; as we demonstrate, RevBayes outperforms competing software for several standard analyses. Compared with other programs, RevBayes has fewer black-box elements. Users need to explicitly specify each part of the model and analysis. Although this explicitness may initially be unfamiliar, we are convinced that this transparency will improve understanding of phylogenetic models in our field. Moreover, it will motivate the search for improvements to existing methods by brazenly exposing the model choices that we make to critical scrutiny. RevBayes is freely available at http://www.RevBayes.com [Bayesian inference; Graphical models; MCMC; statistical phylogenetics.]. PMID:27235697

  3. Using Bayesian statistics for modeling PTSD through Latent Growth Mixture Modeling: implementation and discussion

    Directory of Open Access Journals (Sweden)

    Sarah Depaoli

    2015-03-01

    Full Text Available Background: After traumatic events, such as disaster, war trauma, and injuries including burns (which is the focus here), the risk to develop posttraumatic stress disorder (PTSD) is approximately 10% (Breslau & Davis, 1992). Latent Growth Mixture Modeling can be used to classify individuals into distinct groups exhibiting different patterns of PTSD (Galatzer-Levy, 2015). Currently, empirical evidence points to four distinct trajectories of PTSD patterns in those who have experienced burn trauma. These trajectories are labeled as: resilient, recovery, chronic, and delayed onset trajectories (e.g., Bonanno, 2004; Bonanno, Brewin, Kaniasty, & Greca, 2010; Maercker, Gäbler, O'Neil, Schützwohl, & Müller, 2013; Pietrzak et al., 2013). The delayed onset trajectory affects only a small group of individuals, that is, about 4–5% (O'Donnell, Elliott, Lau, & Creamer, 2007). In addition to its low frequency, the later onset of this trajectory may contribute to the fact that these individuals can be easily overlooked by professionals. In this special symposium on Estimating PTSD trajectories (Van de Schoot, 2015a), we illustrate how to properly identify this small group of individuals through the Bayesian estimation framework using previous knowledge through priors (see, e.g., Depaoli & Boyajian, 2014; Van de Schoot, Broere, Perryck, Zondervan-Zwijnenburg, & Van Loey, 2015). Method: We used latent growth mixture modeling (LGMM) (Van de Schoot, 2015b) to estimate PTSD trajectories across 4 years that followed a traumatic burn. We demonstrate and compare results from traditional (maximum likelihood) and Bayesian estimation using priors (see, Depaoli, 2012, 2013). Further, we discuss where priors come from and how to define them in the estimation process. Results: We demonstrate that only the Bayesian approach results in the desired theory-driven solution of PTSD trajectories. Since the priors are chosen subjectively, we also present a sensitivity analysis of the

  4. Climate information based streamflow and rainfall forecasts for Huai River Basin using Hierarchical Bayesian Modeling

    Directory of Open Access Journals (Sweden)

    X. Chen

    2013-09-01

    Full Text Available A Hierarchical Bayesian model for forecasting regional summer rainfall and streamflow season-ahead using exogenous climate variables for East Central China is presented. The model provides estimates of the posterior forecasted probability distribution for 12 rainfall and 2 streamflow stations considering parameter uncertainty, and cross-site correlation. The model has a multilevel structure in which regression coefficients are modeled from a common multivariate normal distribution, which results in partial pooling of information across multiple stations and better representation of parameter and posterior distribution uncertainty. Covariance structure of the residuals across stations is explicitly modeled. Model performance is tested under leave-10-out cross-validation. Frequentist and Bayesian performance metrics used include Receiver Operating Characteristic, Reduction of Error, Coefficient of Efficiency, Rank Probability Skill Scores, and coverage by posterior credible intervals. The ability of the model to reliably forecast regional summer rainfall and streamflow season-ahead offers potential for developing adaptive water risk management strategies.
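
    The partial pooling produced by drawing station-level coefficients from a common distribution can be illustrated with a small Gibbs sampler for a normal hierarchical model. The sketch below is a deliberately simplified analogue under invented data (one effect per station, known observation variance), not the paper's multilevel regression with explicit cross-site residual covariance.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy data: J stations, each with n seasons of a standardised response
J, n, sigma2 = 12, 20, 1.0
theta_true = rng.normal(0.5, 0.3, size=J)
y = theta_true[:, None] + rng.normal(0, np.sqrt(sigma2), size=(J, n))
ybar = y.mean(axis=1)

# Gibbs sampler for  y_ij ~ N(theta_j, sigma2),  theta_j ~ N(mu, tau2),
# flat prior on mu, inverse-gamma(1, 1) prior on tau2
mu, tau2 = 0.0, 1.0
theta_draws = []
for it in range(5000):
    prec = n / sigma2 + 1.0 / tau2
    mean = (n * ybar / sigma2 + mu / tau2) / prec
    theta = rng.normal(mean, np.sqrt(1.0 / prec))        # station effects
    mu = rng.normal(theta.mean(), np.sqrt(tau2 / J))     # common mean
    a, b = 1.0 + J / 2, 1.0 + 0.5 * np.sum((theta - mu) ** 2)
    tau2 = 1.0 / rng.gamma(a, 1.0 / b)                   # between-station variance
    theta_draws.append(theta)

post_mean = np.mean(theta_draws[1000:], axis=0)
# posterior station effects are pulled from the raw station means toward the common mean
print("shrinkage (posterior mean minus raw mean):", np.round(post_mean - ybar, 3))
```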

  5. An Application of Bayesian Approach in Modeling Risk of Death in an Intensive Care Unit

    Science.gov (United States)

    Wong, Rowena Syn Yin; Ismail, Noor Azina

    2016-01-01

    Background and Objectives There are not many studies that attempt to model intensive care unit (ICU) risk of death in developing countries, especially in South East Asia. The aim of this study was to propose and describe application of a Bayesian approach in modeling in-ICU deaths in a Malaysian ICU. Methods This was a prospective study in a mixed medical-surgical ICU in a multidisciplinary tertiary referral hospital in Malaysia. Data collection included variables that were defined in the Acute Physiology and Chronic Health Evaluation IV (APACHE IV) model. A Bayesian Markov chain Monte Carlo (MCMC) simulation approach was applied in the development of four multivariate logistic regression predictive models for the ICU, where the main outcome measure was in-ICU mortality risk. The performance of the models was assessed through overall model fit, discrimination and calibration measures. Results from the Bayesian models were also compared against results obtained using the frequentist maximum likelihood method. Results The study involved 1,286 consecutive ICU admissions between January 1, 2009 and June 30, 2010, of which 1,111 met the inclusion criteria. Patients who were admitted to the ICU were generally younger, predominantly male, with low co-morbidity load and mostly under mechanical ventilation. The overall in-ICU mortality rate was 18.5% and the overall mean Acute Physiology Score (APS) was 68.5. All four models exhibited good discrimination, with area under receiver operating characteristic curve (AUC) values approximately 0.8. Calibration was acceptable (Hosmer-Lemeshow p-values > 0.05) for all models, except for model M3. Model M1 was identified as the model with the best overall performance in this study. Conclusion Four prediction models were proposed, where the best model was chosen based on its overall performance in this study. This study has also demonstrated the promising potential of the Bayesian MCMC approach as an alternative in the analysis and modeling of
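
    A minimal version of the Bayesian MCMC route to an in-ICU mortality model is a logistic regression sampled with random-walk Metropolis. The sketch below uses synthetic data with two invented covariates (age and an APS-like score), assumed priors and step sizes; it is not the APACHE IV-based models M1–M4 of the study.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic ICU-style data: intercept, age and an acute-physiology-type score
n = 1000
X = np.column_stack([np.ones(n), rng.normal(60, 15, n), rng.normal(68, 20, n)])
beta_true = np.array([-6.0, 0.03, 0.04])
y = rng.binomial(1, 1 / (1 + np.exp(-X @ beta_true)))

def log_post(beta):
    eta = X @ beta
    loglik = np.sum(y * eta - np.logaddexp(0.0, eta))     # Bernoulli log-likelihood
    logprior = -0.5 * np.sum(beta ** 2) / 100.0           # independent N(0, 10^2) priors
    return loglik + logprior

# Random-walk Metropolis (a simple stand-in for the MCMC machinery used in the study)
beta = np.zeros(3)
lp = log_post(beta)
draws = []
for _ in range(20000):
    prop = beta + rng.normal(0, [0.3, 0.005, 0.005])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        beta, lp = prop, lp_prop
    draws.append(beta.copy())

draws = np.array(draws[5000:])
print("posterior means of the coefficients:", np.round(draws.mean(axis=0), 3))
```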

  6. A Hierarchical Bayesian Model for Estimating Remediation-induced Biogeochemical Transformations Using Spectral Induced Polarization Data: Development and Application to the Contaminated DOE Rifle (CO) Site

    Science.gov (United States)

    Chen, J.; Hubbard, S. S.; Williams, K. H.; Tuglus, C.; Flores-Orozco, A.; Kemna, A.

    2010-12-01

    Although in-situ bioremediation is often considered as a key approach for subsurface environmental remediation, monitoring induced biogeochemical processes, needed to evaluate the efficacy of the treatments, is challenging over field relevant scales. In this study, we develop a hierarchical Bayesian model that builds on our previous framework for estimating biogeochemical transformations using geochemical and geophysical data obtained from laboratory column experiments. The new Bayesian model treats the induced biogeochemical transformations as both spatial and temporal (rather than just temporal) processes and combines time-lapse borehole ‘point’ geochemical measurements with inverted surface- or crosshole-based spectral induced polarization (SIP) data. This model consists of three levels of statistical sub-models: (1) data model (or likelihood function), which provides links between the biogeochemical end-products and geophysical attributes, (2) process model, which describes the spatial and temporal variability of biogeochemical properties in the disturbed subsurface systems, and (3) parameter model, which describes the prior distributions of various parameters and initial conditions. The joint posterior probability distribution is explored using Markov Chain Monte Carlo sampling methods to obtain the spatial and temporal distribution of the hidden parameters. We apply the developed Bayesian model to the datasets collected from the uranium-contaminated DOE Rifle site for estimating the spatial and temporal distribution of remediation-induced end products. The datasets consist of time-lapse wellbore aqueous geochemical parameters (including Fe(II), sulfate, sulfide, acetate, uranium, chloride, and bromide concentrations) and surface SIP data collected over 13 frequencies (ranging from 0.065Hz to 256Hz). We first perform statistical analysis on the multivariate data to identify possible patterns (or ‘diagnostic signatures’) of bioremediation, and then we

  7. Equifinality of formal (DREAM) and informal (GLUE) bayesian approaches in hydrologic modeling?

    Energy Technology Data Exchange (ETDEWEB)

    Vrugt, Jasper A [Los Alamos National Laboratory; Robinson, Bruce A [Los Alamos National Laboratory; Ter Braak, Cajo J F [NON LANL; Gupta, Hoshin V [NON LANL

    2008-01-01

    In recent years, a strong debate has emerged in the hydrologic literature regarding what constitutes an appropriate framework for uncertainty estimation. Particularly, there is strong disagreement whether an uncertainty framework should have its roots within a proper statistical (Bayesian) context, or whether such a framework should be based on a different philosophy and implement informal measures and weaker inference to summarize parameter and predictive distributions. In this paper, we compare a formal Bayesian approach using Markov Chain Monte Carlo (MCMC) with generalized likelihood uncertainty estimation (GLUE) for assessing uncertainty in conceptual watershed modeling. Our formal Bayesian approach is implemented using the recently developed differential evolution adaptive metropolis (DREAM) MCMC scheme with a likelihood function that explicitly considers model structural, input and parameter uncertainty. Our results demonstrate that DREAM and GLUE can generate very similar estimates of total streamflow uncertainty. This suggests that formal and informal Bayesian approaches have more common ground than the hydrologic literature and ongoing debate might suggest. The main advantage of formal approaches is, however, that they attempt to disentangle the effect of forcing, parameter and model structural error on total predictive uncertainty. This is key to improving hydrologic theory and to better understand and predict the flow of water through catchments.
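
    The contrast between a formal likelihood-based sampler and GLUE can be reproduced on a toy one-parameter "watershed". The sketch below uses random-walk Metropolis with a Gaussian error model for the formal route (DREAM itself is a multi-chain adaptive scheme, not reproduced here) and a Nash–Sutcliffe behavioural threshold for GLUE; the recession model, noise level and threshold are all assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy "watershed": exponential recession with decay rate k, observed with Gaussian noise
t = np.arange(30.0)
def model(k): return 10.0 * np.exp(-k * t)
k_true, sigma = 0.15, 0.3
q_obs = model(k_true) + rng.normal(0, sigma, t.size)

# Formal route: random-walk Metropolis with an explicit Gaussian error model
def log_post(k):
    if not 0.01 <= k <= 0.5:
        return -np.inf
    return -0.5 * np.sum((q_obs - model(k)) ** 2) / sigma ** 2

k, chain = 0.1, []
lp = log_post(k)
for _ in range(10000):
    kn = k + rng.normal(0, 0.01)
    lpn = log_post(kn)
    if np.log(rng.uniform()) < lpn - lp:
        k, lp = kn, lpn
    chain.append(k)
formal_ci = np.percentile(chain[2000:], [2.5, 97.5])

# GLUE route: uniform sampling, informal likelihood, behavioural cutoff
k_samp = rng.uniform(0.01, 0.5, 20000)
sse = np.array([np.sum((q_obs - model(ks)) ** 2) for ks in k_samp])
ns = 1.0 - sse / np.sum((q_obs - q_obs.mean()) ** 2)      # Nash-Sutcliffe efficiency
keep = ns > 0.9                                           # behavioural threshold
w = ns[keep] - 0.9                                        # one common informal weighting
order = np.argsort(k_samp[keep])
cdf = np.cumsum(w[order]) / w.sum()
glue_ci = np.interp([0.025, 0.975], cdf, k_samp[keep][order])

print("formal 95% interval:", np.round(formal_ci, 3), " GLUE 95% interval:", np.round(glue_ci, 3))
```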

  8. Bayesian model selection validates a biokinetic model for zirconium processing in humans

    Directory of Open Access Journals (Sweden)

    Schmidl Daniel

    2012-08-01

    Full Text Available Abstract Background In radiation protection, biokinetic models for zirconium processing are of crucial importance in dose estimation and further risk analysis for humans exposed to this radioactive substance. They provide limiting values of detrimental effects and build the basis for applications in internal dosimetry, the prediction for radioactive zirconium retention in various organs as well as retrospective dosimetry. Multi-compartmental models are the tool of choice for simulating the processing of zirconium. Although easily interpretable, determining the exact compartment structure and interaction mechanisms is generally daunting. In the context of observing the dynamics of multiple compartments, Bayesian methods provide efficient tools for model inference and selection. Results We are the first to apply a Markov chain Monte Carlo approach to compute Bayes factors for the evaluation of two competing models for zirconium processing in the human body after ingestion. Based on in vivo measurements of human plasma and urine levels we were able to show that a recently published model is superior to the standard model of the International Commission on Radiological Protection. The Bayes factors were estimated by means of the numerically stable thermodynamic integration in combination with a recently developed copula-based Metropolis-Hastings sampler. Conclusions In contrast to the standard model the novel model predicts lower accretion of zirconium in bones. This results in lower levels of noxious doses for exposed individuals. Moreover, the Bayesian approach allows for retrospective dose assessment, including credible intervals for the initially ingested zirconium, in a significantly more reliable fashion than previously possible. All methods presented here are readily applicable to many modeling tasks in systems biology.

  9. A spatio-temporal nonparametric Bayesian variable selection model of fMRI data for clustering correlated time courses.

    Science.gov (United States)

    Zhang, Linlin; Guindani, Michele; Versace, Francesco; Vannucci, Marina

    2014-07-15

    In this paper we present a novel wavelet-based Bayesian nonparametric regression model for the analysis of functional magnetic resonance imaging (fMRI) data. Our goal is to provide a joint analytical framework that allows to detect regions of the brain which exhibit neuronal activity in response to a stimulus and, simultaneously, infer the association, or clustering, of spatially remote voxels that exhibit fMRI time series with similar characteristics. We start by modeling the data with a hemodynamic response function (HRF) with a voxel-dependent shape parameter. We detect regions of the brain activated in response to a given stimulus by using mixture priors with a spike at zero on the coefficients of the regression model. We account for the complex spatial correlation structure of the brain by using a Markov random field (MRF) prior on the parameters guiding the selection of the activated voxels, therefore capturing correlation among nearby voxels. In order to infer association of the voxel time courses, we assume correlated errors, in particular long memory, and exploit the whitening properties of discrete wavelet transforms. Furthermore, we achieve clustering of the voxels by imposing a Dirichlet process (DP) prior on the parameters of the long memory process. For inference, we use Markov Chain Monte Carlo (MCMC) sampling techniques that combine Metropolis-Hastings schemes employed in Bayesian variable selection with sampling algorithms for nonparametric DP models. We explore the performance of the proposed model on simulated data, with both block- and event-related design, and on real fMRI data. PMID:24650600

  10. A Bayesian network approach to knowledge integration and representation of farm irrigation: 2. Model validation

    Science.gov (United States)

    Robertson, D. E.; Wang, Q. J.; Malano, H.; Etchells, T.

    2009-02-01

    For models to be useful, they need to adequately describe the systems they represent. The probabilistic nature of Bayesian network models has traditionally meant that model validation is difficult. In this paper we present a process to validate Inteca-Farm, a Bayesian network model of farm irrigation that we described in the first paper of this series. We assessed three aspects of the quality of model predictions, namely, bias, accuracy, and skill, for the two variables for which validation data are available directly or indirectly. We also examined model predictions for any systematic errors. The validation results show that the bias and accuracy of the two validated variables are within acceptable tolerances and that systematic errors are minimal. This suggests that Inteca-Farm is a plausible representation of farm irrigation system in the Shepparton Irrigation Region of northern Victoria, Australia.

  11. Bayesian spatio-temporal modeling of particulate matter concentrations in Peninsular Malaysia

    Science.gov (United States)

    Manga, Edna; Awang, Norhashidah

    2016-06-01

    This article presents an application of a Bayesian spatio-temporal Gaussian process (GP) model on particulate matter concentrations from Peninsular Malaysia. We analyze daily PM10 concentration levels from 35 monitoring sites in June and July 2011. The spatiotemporal model, set in a Bayesian hierarchical framework, allows for inclusion of informative covariates, meteorological variables and spatiotemporal interactions. Posterior density estimates of the model parameters are obtained by Markov chain Monte Carlo methods. Preliminary data analysis indicates that information on PM10 levels at sites classified as industrial locations could explain part of the space-time variations. We include the site-type indicator in our modeling efforts. Results of the parameter estimates for the fitted GP model show significant spatio-temporal structure and a positive effect of the location-type explanatory variable. We also compute some validation criteria for the out-of-sample sites that show the adequacy of the model for predicting PM10 at unmonitored sites.
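
    The spatial backbone of such a model is a Gaussian process over site coordinates. The sketch below shows simple kriging-style prediction at unmonitored sites under an exponential covariance with assumed sill, range and nugget values; it omits the covariates, temporal structure and MCMC estimation used in the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy monitoring network: 35 sites with coordinates (in degrees) and a mean-zero response
n = 35
coords = rng.uniform(0, 4, size=(n, 2))

def dist(a, b):
    return np.sqrt(((a[:, None, :] - b[None, :, :]) ** 2).sum(-1))

# Exponential covariance with partial sill s2, range phi and nugget tau2 (values assumed)
s2, phi, tau2 = 1.0, 1.5, 0.1
K = s2 * np.exp(-dist(coords, coords) / phi)
z = rng.multivariate_normal(np.zeros(n), K + tau2 * np.eye(n))   # synthetic observations

# Posterior predictive mean and variance at unmonitored sites (simple kriging)
new = rng.uniform(0, 4, size=(5, 2))
K_star = s2 * np.exp(-dist(new, coords) / phi)
Kinv_z = np.linalg.solve(K + tau2 * np.eye(n), z)
pred_mean = K_star @ Kinv_z
Kinv_Ks = np.linalg.solve(K + tau2 * np.eye(n), K_star.T)
K_new = s2 * np.exp(-dist(new, new) / phi) + tau2 * np.eye(5)
pred_var = np.diag(K_new - K_star @ Kinv_Ks)
print("predicted values:", np.round(pred_mean, 2))
print("prediction variances:", np.round(pred_var, 2))
```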

  12. Nonparametric Spatial Models for Extremes: Application to Extreme Temperature Data.

    Science.gov (United States)

    Fuentes, Montserrat; Henry, John; Reich, Brian

    2013-03-01

    Estimating the probability of extreme temperature events is difficult because of limited records across time and the need to extrapolate the distributions of these events, as opposed to just the mean, to locations where observations are not available. Another related issue is the need to characterize the uncertainty in the estimated probability of extreme events at different locations. Although the tools for statistical modeling of univariate extremes are well-developed, extending these tools to model spatial extreme data is an active area of research. In this paper, in order to make inference about spatial extreme events, we introduce a new nonparametric model for extremes. We present a Dirichlet-based copula model that is a flexible alternative to parametric copula models such as the normal and t-copula. The proposed modelling approach is fitted using a Bayesian framework that allows us to take into account different sources of uncertainty in the data and models. We apply our methods to annual maximum temperature values in the east-south-central United States. PMID:24058280

  13. Bayesian chronological modeling of SunWatch, a fort ancient village in Dayton, Ohio

    OpenAIRE

    Krus, A.M.; Cook, R.; Hamilton, W.D.

    2015-01-01

    Radiocarbon results from houses, pits, and burials at the SunWatch site, Dayton, Ohio, are presented within an interpretative Bayesian statistical framework. The primary model incorporates dates from archaeological features in an unordered phase and uses charcoal outlier modeling (Bronk Ramsey 2009b) to account for issues of wood charcoal 14C dates predating their context. The results of the primary model estimate occupation lasted for 1–245 yr (95% probability), starting in cal AD 1175–1385 ...

  14. A new model test in high energy physics in frequentist and Bayesian statistical formalisms

    OpenAIRE

    Kamenshchikov, Andrey

    2016-01-01

    Testing a new physical model against observed experimental data is a typical problem for modern high energy physics (HEP) experiments. A solution to the problem may be provided by two alternative statistical formalisms, namely frequentist and Bayesian, which are widespread in contemporary HEP searches. A characteristic experimental situation is modeled from general considerations and both approaches are utilized in order to test a new model. The results are juxtaposed, what de...

  15. Integrated spatial sampling modeling of geospatial data

    Institute of Scientific and Technical Information of China (English)

    LI; Lianfa; WANG; Jinfeng

    2004-01-01

    Spatial sampling is a necessary and important method for extracting geospatial data, and its methodology directly affects the geo-analysis results. To counter the deficiency of separate spatial sampling models, this article analyzes three crucial elements of spatial sampling (frame, correlation and decision diagram) and derives a general integrated model from them. The program Spatial Sampling Integration (SSI) has been developed with the Component Object Model (COM) to realize the general integrated model. In two practical applications, i.e. design of a monitoring network for natural disasters and a sampling survey of the areas of non-cultivated land, SSI has produced accurate results at less cost, better achieving the cost-effective goal of sampling geo-objects with spatial correlation. The two cases exemplify the expanded application and convenient implementation of the general integrated model with embedded components in an integrated environment, which can also be extended to other spatial analysis modeling.

  16. Lessons Learned from a Past Series of Bayesian Model Averaging studies for Soil/Plant Models

    Science.gov (United States)

    Nowak, Wolfgang; Wöhling, Thomas; Schöniger, Anneli

    2015-04-01

    In this study we evaluate the lessons learned about modelling soil/plant systems from analyzing evapotranspiration data, soil moisture and leaf area index. The data were analyzed with advanced tools from the area of Bayesian Model Averaging, model ranking and Bayesian Model Selection. We have generated a large variety of model conceptualizations by sampling random parameter sets from the vegetation components of the CERES, SUCROS, GECROS, and SPASS models and a common model for soil water movement via Monte-Carlo simulations. We used data from one vegetation period of winter wheat at a field site in Nellingen, Germany. The data set includes soil moisture, actual evapotranspiration (ETa) from an eddy covariance tower, and leaf-area index (LAI). The focus of data analysis was on how one can do model ranking and model selection. Further analysis steps included the predictive reliability of different soil/plant models calibrated on different subsets of the available data. Our main conclusion is that model selection between different competing soil-plant models remains a large challenge, because 1. different data types and their combinations favor different models, because competing models are more or less good in simulating the coupling processes between the various compartments and their states, 2. singular events (such as the evolution of LAI during plant senescence) can dominate an entire time series, and long time series can be represented well by the few data values where the models disagree most, 3. the different data types differ in their discriminating power for model selection, 4. the level of noise present in ETa and LAI data, and the level of systematic model bias through simplifications of the complex system (e.g., assuming a few internally homogeneous soil layers) substantially reduce the confidence in model ranking and model selection, 5. none of the models withstands a hypothesis test against the available data, 6. even the assumed level of measurement

  17. bspmma: An R Package for Bayesian Semiparametric Models for Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Deborah Burr

    2012-07-01

    Full Text Available We introduce an R package, bspmma, which implements a Dirichlet-based random effects model specific to meta-analysis. In meta-analysis, when combining effect estimates from several heterogeneous studies, it is common to use a random-effects model. The usual frequentist or Bayesian models specify a normal distribution for the true effects. However, in many situations, the effect distribution is not normal, e.g., it can have thick tails, be skewed, or be multi-modal. A Bayesian nonparametric model based on mixtures of Dirichlet process priors has been proposed in the literature, for the purpose of accommodating the non-normality. We review this model and then describe a competitor, a semiparametric version which has the feature that it allows for a well-defined centrality parameter convenient for determining whether the overall effect is significant. This second Bayesian model is based on a different version of the Dirichlet process prior, and we call it the "conditional Dirichlet model". The package contains functions to carry out analyses based on either the ordinary or the conditional Dirichlet model, functions for calculating certain Bayes factors that provide a check on the appropriateness of the conditional Dirichlet model, and functions that enable an empirical Bayes selection of the precision parameter of the Dirichlet process. We illustrate the use of the package on two examples, and give an interpretation of the results in these two different scenarios.

  18. Disaggregation of spatial rainfall fields for hydrological modelling

    Directory of Open Access Journals (Sweden)

    N. G. Mackay

    2001-01-01

    Full Text Available Meteorological models generate fields of precipitation and other climatological variables as spatial averages at the scale of the grid used for numerical solution. The grid-scale can be large, particularly for GCMs, and disaggregation is required, for example to generate appropriate spatial-temporal properties of rainfall for coupling with surface-boundary conditions or more general hydrological applications. A method is presented here which considers the generation of the wet areas and the simulation of rainfall intensities separately. For the first task, a nearest-neighbour Markov scheme, based upon a Bayesian technique used in image processing, is implemented so as to preserve the structural features of the observed rainfall. Essentially, the large-scale field and the previously disaggregated field are used as evidence in an iterative procedure which aims at selecting a realisation according to the joint posterior probability distribution. In the second task the morphological characteristics of the field of rainfall intensities are reproduced through a random sampling of intensities according to a beta distribution and their allocation to pixels chosen so that the higher intensities are more likely to be further from the dry areas. The components of the scheme are assessed for Arkansas-Red River basin radar rainfall (hourly averages) by disaggregating from 40 km x 40 km to 8 km x 8 km. The wet/dry scheme provides a good reproduction both of the number of correctly classified pixels and the coverage, while the intensity scheme generates fields with an adequate variance within the grid-squares, so that this scheme provides the hydrologist with a useful tool for the downscaling of meteorological model outputs. Keywords: Rainfall, disaggregation, General Circulation Model, Bayesian analysis

  19. Continuous Spatial Process Models for Spatial Extreme Values

    KAUST Repository

    Sang, Huiyan

    2010-01-28

    We propose a hierarchical modeling approach for explaining a collection of point-referenced extreme values. In particular, annual maxima over space and time are assumed to follow generalized extreme value (GEV) distributions, with parameters μ, σ, and ξ specified in the latent stage to reflect underlying spatio-temporal structure. The novelty here is that we relax the conditional independence assumption in the first stage of the hierarchical model, an assumption which has been adopted in previous work. This assumption implies that realizations of the surface of spatial maxima will be everywhere discontinuous. For many phenomena, including, e.g., temperature and precipitation, this behavior is inappropriate. Instead, we offer a spatial process model for extreme values that provides mean square continuous realizations, where the behavior of the surface is driven by the spatial dependence which is unexplained under the latent spatio-temporal specification for the GEV parameters. In this sense, the first stage smoothing is viewed as fine scale or short range smoothing while the larger scale smoothing will be captured in the second stage of the modeling. In addition, as would be desired, we are able to implement spatial interpolation for extreme values based on this model. A simulation study and a study on actual annual maximum rainfall for a region in South Africa are used to illustrate the performance of the model. © 2009 International Biometric Society.
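
    The marginal building block of such a model is a site-level GEV fit, which can be reproduced with scipy. The sketch below simulates annual maxima at one site, fits the GEV by maximum likelihood and reports the implied 50-year return level; note that scipy's shape parameter c corresponds to -ξ in the usual GEV notation. The parameter values are invented, and the latent spatial process that ties the GEV parameters together across sites is not shown.

```python
import numpy as np
from scipy import stats

# Synthetic annual-maximum rainfall at one site (all parameter values assumed)
true_c, loc, scale = -0.1, 60.0, 12.0     # scipy shape c = -xi in the usual GEV notation
annual_max = stats.genextreme.rvs(true_c, loc=loc, scale=scale, size=60, random_state=7)

# Single-site maximum-likelihood GEV fit (the first-stage marginal model only)
c_hat, loc_hat, scale_hat = stats.genextreme.fit(annual_max)

# 50-year return level implied by the fitted distribution
rl_50 = stats.genextreme.ppf(1 - 1 / 50, c_hat, loc=loc_hat, scale=scale_hat)
print("fitted (c, loc, scale):", np.round([c_hat, loc_hat, scale_hat], 3))
print("estimated 50-year return level:", round(float(rl_50), 1))
```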

  20. Spatial data modelling for 3D GIS

    CERN Document Server

    Abdul-Rahman, Alias

    2007-01-01

    This book covers fundamental aspects of spatial data modelling, specifically the aspect of three-dimensional (3D) modelling and structuring. Realisation of a "true" 3D GIS spatial system needs a lot of effort, and the process is taking place in various research centres and universities in some countries. The development of spatial data modelling for 3D objects is the focus of this book.

  1. LEVERAGING SPATIAL MODEL TO IMPROVE INDOOR TRACKING

    OpenAIRE

    Liu, L; Xu, W.; Penard, W.; S. Zlatanova

    2015-01-01

    In this paper, we leverage a spatial model to process indoor localization results and thereby improve the track consisting of measured locations. We elaborate on different parts of the spatial model, such as geometry, topology and semantics, and then present how they contribute to the processing of indoor tracks. The initial results of our experiment reveal that the spatial model can help us overcome problems such as tracks intersecting obstacles and unstable shifts between two location measurement...

  2. Training image analysis for model error assessment and dimension reduction in Bayesian-MCMC solutions to inverse problems

    Science.gov (United States)

    Koepke, C.; Irving, J.

    2015-12-01

    Bayesian solutions to inverse problems in near-surface geophysics and hydrology have gained increasing popularity as a means of estimating not only subsurface model parameters, but also their corresponding uncertainties that can be used in probabilistic forecasting and risk analysis. In particular, Markov-chain-Monte-Carlo (MCMC) methods have attracted much recent attention as a means of statistically sampling from the Bayesian posterior distribution. In this regard, two approaches are commonly used to improve the computational tractability of the Bayesian-MCMC approach: (i) Forward models involving a simplification of the underlying physics are employed, which offer a significant reduction in the time required to calculate data, but generally at the expense of model accuracy, and (ii) the model parameter space is represented using a limited set of spatially correlated basis functions as opposed to a more intuitive high-dimensional pixel-based parameterization. It has become well understood that model inaccuracies resulting from (i) can lead to posterior parameter distributions that are highly biased and overly confident. Further, when performing model reduction as described in (ii), it is not clear how the prior distribution for the basis weights should be defined because simple (e.g., Gaussian or uniform) priors that may be suitable for a pixel-based parameterization may result in a strong prior bias when used for the weights. To address the issue of model error resulting from known forward model approximations, we generate a set of error training realizations and analyze them with principal component analysis (PCA) in order to generate a sparse basis. The latter is used in the MCMC inversion to remove the main model-error component from the residuals. To improve issues related to prior bias when performing model reduction, we also use a training realization approach, but this time models are simulated from the prior distribution and analyzed using independent

  3. Bayesian classification and regression trees for predicting incidence of cryptosporidiosis.

    Directory of Open Access Journals (Sweden)

    Wenbiao Hu

    Full Text Available BACKGROUND: Classification and regression tree (CART) models are tree-based exploratory data analysis methods which have been shown to be very useful in identifying and estimating complex hierarchical relationships in ecological and medical contexts. In this paper, a Bayesian CART model is described and applied to the problem of modelling the cryptosporidiosis infection in Queensland, Australia. METHODOLOGY/PRINCIPAL FINDINGS: We compared the results of a Bayesian CART model with those obtained using a Bayesian spatial conditional autoregressive (CAR) model. Overall, the analyses indicated that the nature and magnitude of the effect estimates were similar for the two methods in this study, but the CART model more easily accommodated higher order interaction effects. CONCLUSIONS/SIGNIFICANCE: A Bayesian CART model for identification and estimation of the spatial distribution of disease risk is useful in monitoring and assessment of infectious diseases prevention and control.

  4. Maximum likelihood Bayesian model averaging and its predictive analysis for groundwater reactive transport models

    Science.gov (United States)

    Lu, Dan; Ye, Ming; Curtis, Gary P.

    2015-10-01

    While Bayesian model averaging (BMA) has been widely used in groundwater modeling, it is infrequently applied to groundwater reactive transport modeling because of multiple sources of uncertainty in the coupled hydrogeochemical processes and because of the long execution time of each model run. To resolve these problems, this study analyzed different levels of uncertainty in a hierarchical way, and used the maximum likelihood version of BMA, i.e., MLBMA, to improve the computational efficiency. This study demonstrates the applicability of MLBMA to groundwater reactive transport modeling in a synthetic case in which twenty-seven reactive transport models were designed to predict the reactive transport of hexavalent uranium (U(VI)) based on observations at a former uranium mill site near Naturita, CO. These reactive transport models contain three uncertain model components, i.e., parameterization of hydraulic conductivity, configuration of model boundary, and surface complexation reactions that simulate U(VI) adsorption. These uncertain model components were aggregated into the alternative models by integrating a hierarchical structure into MLBMA. The modeling results of the individual models and MLBMA were analyzed to investigate their predictive performance. The predictive logscore results show that MLBMA generally outperforms the best model, suggesting that using MLBMA is a sound strategy to achieve more robust model predictions relative to a single model. MLBMA works best when the alternative models are structurally distinct and have diverse model predictions. When correlation in model structure exists, two strategies were used to improve predictive performance by retaining structurally distinct models or assigning smaller prior model probabilities to correlated models. Since the synthetic models were designed using data from the Naturita site, the results of this study are expected to provide guidance for real-world modeling. Limitations of applying MLBMA to the
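
    The averaging step itself reduces to weighting each calibrated model by an approximation of its marginal likelihood. The sketch below uses a BIC-style approximation on a toy polynomial-regression problem with invented data; MLBMA as usually described relies on KIC, which adds a Fisher-information term, and the candidate models here are not groundwater models.

```python
import numpy as np

rng = np.random.default_rng(8)

# Synthetic calibration data from a quadratic "true" process
x = np.linspace(0, 1, 40)
y = 2.0 + 1.5 * x - 3.0 * x ** 2 + rng.normal(0, 0.1, x.size)

# Three alternative model structures (polynomial degrees 1-3), calibrated by least squares
degrees, bic, preds = [1, 2, 3], [], []
x_new = 1.2                                     # prediction point outside the calibration range
for deg in degrees:
    coeff = np.polyfit(x, y, deg)
    resid = y - np.polyval(coeff, x)
    sigma2 = np.mean(resid ** 2)
    k = deg + 2                                 # polynomial coefficients plus error variance
    loglik = -0.5 * x.size * (np.log(2 * np.pi * sigma2) + 1)
    bic.append(-2 * loglik + k * np.log(x.size))
    preds.append(np.polyval(coeff, x_new))

bic, preds = np.array(bic), np.array(preds)
# Posterior model probabilities from the (equal-prior) BIC approximation
delta = bic - bic.min()
w = np.exp(-0.5 * delta)
w /= w.sum()
print("model weights:", np.round(w, 3))
print("model-averaged prediction at x = 1.2:", round(float(np.sum(w * preds)), 3))
```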

  5. An imprecise Dirichlet model for Bayesian analysis of failure data including right-censored observations

    International Nuclear Information System (INIS)

    This paper is intended to make researchers in reliability theory aware of a recently introduced Bayesian model with imprecise prior distributions for statistical inference on failure data, that can also be considered as a robust Bayesian model. The model consists of a multinomial distribution with Dirichlet priors, making the approach basically nonparametric. New results for the model are presented, related to right-censored observations, where estimation based on this model is closely related to the product-limit estimator, which is an important statistical method to deal with reliability or survival data including right-censored observations. As for the product-limit estimator, the model considered in this paper aims at not using any information other than that provided by observed data, but our model fits into the robust Bayesian context which has the advantage that all inferences can be based on probabilities or expectations, or bounds for probabilities or expectations. The model uses a finite partition of the time-axis, and as such it is also related to life-tables
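
    For reference, the product-limit (Kaplan–Meier) estimator mentioned above can be computed in a few lines. The sketch below uses a small invented set of failure times with right censoring; it gives the classical point estimate only, not the imprecise-probability bounds of the model discussed in the paper.

```python
import numpy as np

# Toy failure data in hours: failure times and censoring indicators (1 = failure, 0 = right-censored)
times    = np.array([ 50, 120, 120, 200, 260, 300, 350, 420, 500, 500])
observed = np.array([  1,   1,   0,   1,   1,   0,   1,   0,   1,   0])

# Product-limit (Kaplan-Meier) estimate of the survival function;
# at tied times, failures are listed before censorings, as in the usual convention
order = np.argsort(times, kind="stable")
t_sorted, d_sorted = times[order], observed[order]
at_risk, surv = len(times), 1.0
curve = []
for t, d in zip(t_sorted, d_sorted):
    if d == 1:
        surv *= 1.0 - 1.0 / at_risk     # step down only at observed failures
    curve.append((t, surv))
    at_risk -= 1                        # both failures and censorings leave the risk set

for t, s in curve:
    print(f"t = {t:4d} h   S(t) = {s:.3f}")
```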

  6. Optimal speech motor control and token-to-token variability: a Bayesian modeling approach.

    Science.gov (United States)

    Patri, Jean-François; Diard, Julien; Perrier, Pascal

    2015-12-01

    The remarkable capacity of the speech motor system to adapt to various speech conditions is due to an excess of degrees of freedom, which enables producing similar acoustical properties with different sets of control strategies. To explain how the central nervous system selects one of the possible strategies, a common approach, in line with optimal motor control theories, is to model speech motor planning as the solution of an optimality problem based on cost functions. Despite the success of this approach, one of its drawbacks is the intrinsic contradiction between the concept of optimality and the observed experimental intra-speaker token-to-token variability. The present paper proposes an alternative approach by formulating feedforward optimal control in a probabilistic Bayesian modeling framework. This is illustrated by controlling a biomechanical model of the vocal tract for speech production and by comparing it with an existing optimal control model (GEPPETO). The essential elements of this optimal control model are presented first. From them the Bayesian model is constructed in a progressive way. Performance of the Bayesian model is evaluated based on computer simulations and compared to the optimal control model. This approach is shown to be appropriate for solving the speech planning problem while accounting for variability in a principled way. PMID:26497359

  7. Monitoring county-level chlamydia incidence in Texas, 2004 – 2005: application of empirical Bayesian smoothing and Exploratory Spatial Data Analysis (ESDA methods

    Directory of Open Access Journals (Sweden)

    Owens Chantelle J

    2009-02-01

    Full Text Available Abstract Background Chlamydia continues to be the most prevalent disease in the United States. Effective spatial monitoring of chlamydia incidence is important for successful implementation of control and prevention programs. The objective of this study is to apply Bayesian smoothing and exploratory spatial data analysis (ESDA) methods to monitor Texas county-level chlamydia incidence rates by examining spatiotemporal patterns. We used county-level data on chlamydia incidence (for all ages, genders and races) from the National Electronic Telecommunications System for Surveillance (NETSS) for 2004 and 2005. Results Bayesian-smoothed chlamydia incidence rates were spatially dependent both in levels and in relative changes. Erath county had significantly higher rates (more than 300 cases per 100,000 residents) than its contiguous neighbors (195 or less) in both years. Gaines county experienced the highest relative increase in smoothed rates (173%, from 139 to 379). The relative change in smoothed chlamydia rates in Newton county was also significant. Conclusion Bayesian smoothing and ESDA methods can assist programs in using chlamydia surveillance data to identify outliers, as well as relevant changes in chlamydia incidence in specific geographic units. Secondly, they may also indirectly help in assessing existing differences and changes in chlamydia surveillance systems over time.
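
    A minimal, non-spatial version of empirical Bayes rate smoothing is the global Poisson-gamma (Marshall-type) shrinkage sketched below on invented county data: raw rates in small counties are pulled strongly toward the statewide rate, while large counties are barely adjusted. The spatial ESDA components of the study (e.g., local autocorrelation measures) are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(9)

# Invented county-level data: populations and observed case counts
n_counties = 50
pop = rng.integers(2_000, 300_000, size=n_counties).astype(float)
true_rate = rng.gamma(shape=8.0, scale=300.0 / 8.0, size=n_counties) / 1e5   # cases per person
cases = rng.poisson(true_rate * pop)

# Global empirical Bayes (Marshall-type) smoothing toward the statewide rate
r = cases / pop                                       # raw county rates
m = cases.sum() / pop.sum()                           # statewide rate
s2 = np.sum(pop * (r - m) ** 2) / pop.sum()           # weighted variance of raw rates
A = max(s2 - m / pop.mean(), 0.0)                     # estimated between-county variance
shrink = A / (A + m / pop)                            # weight on the county's own rate (small for small counties)
eb_rate = m + shrink * (r - m)

i = int(np.argmin(pop))                               # the smallest county moves the most
print("smallest county: raw", round(r[i] * 1e5, 1), "-> smoothed", round(eb_rate[i] * 1e5, 1),
      "per 100,000 (statewide", round(m * 1e5, 1), ")")
```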

  8. Local models for spatial analysis

    CERN Document Server

    Lloyd, Christopher D

    2010-01-01

    Focusing on solutions, this second edition provides guidance to a wide variety of real-world problems. The text presents a complete introduction to key concepts and a clear mapping of the methods discussed. It also explores connections between methods. New chapters address spatial patterning in single variables and spatial relations. In addition, every chapter now includes links to key related studies. The author clearly distinguishes between local and global methods and provides more detailed coverage of geographical weighting, image texture measures, local spatial autocorrelation, and multic

  9. Prediction and assimilation of surf-zone processes using a Bayesian network: Part II: Inverse models

    Science.gov (United States)

    Plant, Nathaniel G.; Holland, K. Todd

    2011-01-01

    A Bayesian network model has been developed to simulate a relatively simple problem of wave propagation in the surf zone (detailed in Part I). Here, we demonstrate that this Bayesian model can provide both inverse modeling and data-assimilation solutions for predicting offshore wave heights and depth estimates given limited wave-height and depth information from an onshore location. The inverse method is extended to allow data assimilation using observational inputs that are not compatible with deterministic solutions of the problem. These inputs include sand bar positions (instead of bathymetry) and estimates of the intensity of wave breaking (instead of wave-height observations). Our results indicate that wave breaking information is essential to reduce prediction errors. In many practical situations, this information could be provided from a shore-based observer or from remote-sensing systems. We show that various combinations of the assimilated inputs significantly reduce the uncertainty in the estimates of water depths and wave heights in the model domain. Application of the Bayesian network model to new field data demonstrated significant predictive skill (R2 = 0.7) for the inverse estimate of a month-long time series of offshore wave heights. The Bayesian inverse results include uncertainty estimates that were shown to be most accurate when given uncertainty in the inputs (e.g., depth and tuning parameters). Furthermore, the inverse modeling was extended to directly estimate tuning parameters associated with the underlying wave-process model. The inverse estimates of the model parameters not only showed an offshore wave height dependence consistent with results of previous studies but the uncertainty estimates of the tuning parameters also explain previously reported variations in the model parameters.

  10. Mapping and prediction of schistosomiasis in Nigeria using compiled survey data and Bayesian geospatial modelling

    Directory of Open Access Journals (Sweden)

    Uwem F. Ekpo

    2013-05-01

    Full Text Available Schistosomiasis prevalence data for Nigeria were extracted from peer-reviewed journals and reports, geo-referenced and collated in a nationwide geographical information system database for the generation of point prevalence maps. This exercise revealed that the disease is endemic in 35 of the country’s 36 states, including the federal capital territory of Abuja, and found in 462 unique locations out of 833 different survey locations. Schistosoma haematobium, the predominant species in Nigeria, was found in 368 locations (79.8%) covering 31 states, S. mansoni in 78 (16.7%) locations in 22 states and S. intercalatum in 17 (3.7%) locations in two states. S. haematobium and S. mansoni were found to be co-endemic in 22 states, while co-occurrence of all three species was only seen in one state (Rivers). The average prevalence for each species at each survey location varied between 0.5% and 100% for S. haematobium, 0.2% to 87% for S. mansoni and 1% to 10% for S. intercalatum. The estimated prevalence of S. haematobium, based on Bayesian geospatial predictive modelling with a set of bioclimatic variables, ranged from 0.2% to 75% with a mean prevalence of 23% for the country as a whole (95% confidence interval (CI): 22.8-23.1%). The model suggests that the mean temperature, annual precipitation and soil acidity significantly influence the spatial distribution. Prevalence estimates, adjusted for school-aged children in 2010, showed that the prevalence is <10% in most states with a few reaching as high as 50%. It was estimated that 11.3 million children require praziquantel annually (95% CI: 10.3-12.2 million).

  11. Bayesian inference for a wavefront model of the Neolithisation of Europe

    CERN Document Server

    Baggaley, Andrew W; Shukurov, Anvar; Boys, Richard J; Golightly, Andrew

    2012-01-01

    We consider a wavefront model for the spread of Neolithic culture across Europe, and use Bayesian inference techniques to provide estimates for the parameters within this model, as constrained by radiocarbon data from Southern and Western Europe. Our wavefront model allows for both an isotropic background spread (incorporating the effects of local geography), and a localized anisotropic spread associated with major waterways. We introduce an innovative numerical scheme to track the wavefront, allowing us to simulate the times of the first arrival at any site orders of magnitude more efficiently than traditional PDE approaches. We adopt a Bayesian approach to inference and use Gaussian process emulators to facilitate further increases in efficiency in the inference scheme, thereby making Markov chain Monte Carlo methods practical. We allow for uncertainty in the fit of our model, and also infer a parameter specifying the magnitude of this uncertainty. We obtain a magnitude for the background spread of order 1 ...

  12. A Bayesian Combined Model for Time-Dependent Turning Movement Proportions Estimation at Intersections

    Directory of Open Access Journals (Sweden)

    Pengpeng Jiao

    2014-01-01

    Time-dependent turning movement flows are important input data for intelligent transportation systems but cannot be detected directly by current traffic surveillance systems. Existing estimation models have proved insufficiently accurate and reliable across all intervals. An improved way to address this problem is to develop a combined model framework that can integrate multiple submodels running simultaneously. This paper first presents a back-propagation neural network model to estimate dynamic turning movements, solved with a self-adaptive learning rate and gradient descent with momentum. Second, this paper develops an efficient Kalman filtering model and designs a revised sequential Kalman filtering algorithm. Based on the Bayesian method, using both historical data and currently estimated results for error calibration, this paper further integrates the above two submodels into a Bayesian combined model framework and proposes a corresponding algorithm. A field survey was implemented at an intersection in Beijing to collect both time series of link counts and actual time-dependent turning movement flows, including historical and present data. The reported estimation results show that the Bayesian combined model is much more accurate and stable than the other models.
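
    The Bayesian combination step can be sketched as a precision-weighted fusion of the two submodel outputs, with the weights calibrated from historical errors. The example below is a simplified stand-in for the paper's algorithm; all numbers are invented.

      import numpy as np

      # Historical errors of each submodel against observed turning proportions
      err_nn = np.array([0.04, -0.03, 0.05, -0.02, 0.03])   # neural-network submodel
      err_kf = np.array([0.02, 0.01, -0.02, 0.03, -0.01])   # Kalman-filter submodel
      var_nn, var_kf = err_nn.var(), err_kf.var()

      # Current-interval estimates of one turning-movement proportion
      est_nn, est_kf = 0.31, 0.27

      # Precision-weighted (Bayesian) combination under Gaussian error models
      w_nn = (1.0 / var_nn) / (1.0 / var_nn + 1.0 / var_kf)
      combined = w_nn * est_nn + (1.0 - w_nn) * est_kf
      combined_sd = (1.0 / (1.0 / var_nn + 1.0 / var_kf)) ** 0.5
      print(f"combined estimate = {combined:.3f} (sd = {combined_sd:.3f})")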

  13. Integration of Geophysical Data into Structural Geological Modelling through Bayesian Networks

    Science.gov (United States)

    de la Varga, Miguel; Wellmann, Florian; Murdie, Ruth

    2016-04-01

    Structural geological models are widely used to represent the spatial distribution of relevant geological features. Several techniques exist to construct these models on the basis of different assumptions and different types of geological observations (e.g. Jessell et al., 2014). However, two problems are prevalent when constructing models: (i) observations and assumptions, and therefore also the constructed model, are subject to uncertainties, and (ii) additional information, such as geophysical data, is often available but cannot be considered directly in the geological modelling step. In our work, we propose the integration of all available data into a Bayesian network, including the generation of the implicit geological model by means of interpolation functions (Mallet, 1992; Lajaunie et al., 1997; Mallet, 2004; Carr et al., 2001; Hillier et al., 2014). As a result, we are able to increase the certainty of the resultant models as well as potentially learn features of our regional geology through data mining and information theory techniques. MCMC methods are used to optimize computational time and ensure the validity of the results. Here, we apply the aforementioned concepts to a 3-D model of the Sandstone Greenstone Belt in the Archean Yilgarn Craton in Western Australia. The example given defines the uncertainty in the thickness of the greenstone as constrained by the Bouguer anomaly, and the internal structure of the greenstone as constrained by the magnetic signature of a banded iron formation. The incorporation of the additional data, and especially the gravity data, provides an important reduction of the possible outcomes and therefore of the overall uncertainty. References: Carr, J. C., R. K. Beatson, J. B. Cherrie, T. J. Mitchell, W. R. Fright, B. C. McCallum, and T. R. Evans, 2001, Reconstruction and representation of 3D objects with radial basis functions: Proceedings of the 28th annual conference on Computer graphics and interactive techniques, 67-76. Jessell, M

  14. A Hierarchical Bayesian Model to Predict Self-Thinning Line for Chinese Fir in Southern China.

    Directory of Open Access Journals (Sweden)

    Xiongqing Zhang

    Self-thinning is a dynamic equilibrium between forest growth and mortality at full site occupancy. Parameters of the self-thinning lines are often confounded by differences across various stand and site conditions. To overcome the problem of hierarchical and repeated-measures data, we used a hierarchical Bayesian method to estimate the self-thinning line. The results showed that the self-thinning line for Chinese fir (Cunninghamia lanceolata (Lamb.) Hook.) plantations was not sensitive to the initial planting density. The uncertainty of model predictions was mostly due to within-subject variability. The simulation precision of the hierarchical Bayesian method was better than that of the stochastic frontier function (SFF). The hierarchical Bayesian method provided a reasonable explanation of the impact of other variables (site quality, soil type, aspect, etc.) on the self-thinning line, and gave us the posterior distribution of the parameters of the self-thinning line. Research on the self-thinning relationship could benefit from the use of hierarchical Bayesian methods.
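
    A minimal way to see the hierarchical idea is empirical-Bayes shrinkage of plot-level self-thinning slopes (log N against log D) toward a common mean, weighting each plot by its within-plot uncertainty. The sketch below is not the paper's model (no covariates, no full MCMC) and uses simulated plots only.

      import numpy as np

      rng = np.random.default_rng(5)
      n_plots, n_obs = 12, 8
      b_plot = rng.normal(-1.6, 0.1, n_plots)      # simulated plot-level slopes

      slopes, ses = [], []
      for j in range(n_plots):
          logD = np.linspace(2.0, 3.2, n_obs)      # log quadratic mean diameter
          logN = 11.0 + b_plot[j] * logD + rng.normal(0, 0.08, n_obs)
          X = np.column_stack([np.ones(n_obs), logD])
          coef, *_ = np.linalg.lstsq(X, logN, rcond=None)
          resid = logN - X @ coef
          s2 = resid @ resid / (n_obs - 2)
          cov = s2 * np.linalg.inv(X.T @ X)
          slopes.append(coef[1])
          ses.append(np.sqrt(cov[1, 1]))

      slopes, ses = np.array(slopes), np.array(ses)
      tau2 = max(slopes.var() - np.mean(ses ** 2), 1e-6)   # between-plot variance
      mu = np.average(slopes, weights=1.0 / (ses ** 2 + tau2))
      shrunk = (slopes / ses ** 2 + mu / tau2) / (1.0 / ses ** 2 + 1.0 / tau2)
      print("pooled self-thinning slope:", round(mu, 3))
      print("shrunken plot-level slopes:", np.round(shrunk, 3))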

  15. Bayesian model for strategic level risk assessment in continuing airworthiness of air transport

    OpenAIRE

    Jayakody-Arachchige, Dhanapala

    2010-01-01

    Continuing airworthiness (CAW) of aircraft is an essential pre-requisite for the safe operation of air transport. Human errors that occur in CAW organizations and processes could undermine the airworthiness and constitute a risk to flight safety. This thesis reports on a generic Bayesian model that has been designed to assess and quantify this risk. The model removes the vagueness inherent in the subjective methods of assessment of risk and its qualitative expression. Instead, relying on a...

  16. A Bayesian Based Functional Mixed-Effects Model for Analysis of LC-MS Data

    OpenAIRE

    Befekadu, Getachew K.; Tadesse, Mahlet G.; Ressom, Habtom W

    2009-01-01

    A Bayesian multilevel functional mixed-effects model with group specific random-effects is presented for analysis of liquid chromatography-mass spectrometry (LC-MS) data. The proposed framework allows alignment of LC-MS spectra with respect to both retention time (RT) and mass-to-charge ratio (m/z). Affine transformations are incorporated within the model to account for any variability along the RT and m/z dimensions. Simultaneous posterior inference of all unknown parameters is accomplished ...

  17. A Bayesian model for predicting face recognition performance using image quality

    OpenAIRE

    Dutta, Abhishek; Veldhuis, Raymond; Spreeuwers, Luuk

    2014-01-01

    Quality of a pair of facial images is a strong indicator of the uncertainty in a decision about identity based on that image pair. In this paper, we describe a Bayesian approach to model the relation between image quality (e.g., pose, illumination, noise, sharpness) and the corresponding face recognition performance. Experimental results based on the MultiPIE data set show that our model can accurately aggregate verification samples into groups for which the verification performance varies fairly...

  18. Determinants of Low Birth Weight in Malawi: Bayesian Geo-Additive Modelling

    OpenAIRE

    Ngwira, Alfred; Stanley, Christopher C.

    2015-01-01

    Studies on factors of low birth weight in Malawi have neglected the flexible approach of using smooth functions for some covariates in models. Such a flexible approach reveals the detailed relationship of covariates with the response. This study aimed to investigate risk factors of low birth weight in Malawi by adopting a flexible approach for continuous covariates and a geographical random effect. A Bayesian geo-additive model for birth weight in kilograms and size of the child at birth (less than ...

  19. Flexible Bayesian Survival Modeling with Semiparametric Time-Dependent and Shape-Restricted Covariate Effects

    OpenAIRE

    Murray, Thomas A.; Hobbs, Brian P.; Sargent, Daniel J; Carlin, Bradley P.

    2016-01-01

    Presently, there are few options with available software to perform a fully Bayesian analysis of time-to-event data wherein the hazard is estimated semi- or non-parametrically. One option is the piecewise exponential model, which requires an often unrealistic assumption that the hazard is piecewise constant over time. The primary aim of this paper is to construct a tractable semiparametric alternative to the piecewise exponential model that assumes the hazard is continuous, and to provide mod...

  20. Bayesian uncertainty analysis for complex physical systems modelled by computer simulators with applications to tipping points

    Science.gov (United States)

    Caiado, C. C. S.; Goldstein, M.

    2015-09-01

    In this paper we present and illustrate basic Bayesian techniques for the uncertainty analysis of complex physical systems modelled by computer simulators. We focus on emulation and history matching and also discuss the treatment of observational errors and structural discrepancies in time series. We exemplify such methods using a four-box model for the thermohaline circulation. We show how these methods may be applied to systems containing tipping points and how to treat possible discontinuities using multiple emulators.
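
    History matching, as referred to above, rules out simulator input settings whose predictions are implausibly far from observations once observational error, structural discrepancy and emulator uncertainty are all taken into account. The sketch below illustrates the implausibility calculation on a toy one-parameter simulator; the cutoff of 3 is the conventional choice, and every number is illustrative.

      import numpy as np

      def simulator(x):
          # Toy stand-in for an expensive simulator output
          return 15.0 * np.exp(-x) + 2.0 * x

      obs = 10.0            # observed system value
      var_obs = 0.5 ** 2    # observational error variance
      var_disc = 1.0 ** 2   # structural (model) discrepancy variance
      var_emul = 0.3 ** 2   # emulator variance (here the simulator is cheap enough to run)

      candidates = np.linspace(0.0, 3.0, 301)
      pred = simulator(candidates)
      implaus = np.abs(pred - obs) / np.sqrt(var_obs + var_disc + var_emul)

      keep = candidates[implaus < 3.0]          # conventional implausibility cutoff
      print(f"non-implausible input range: [{keep.min():.2f}, {keep.max():.2f}]")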

  1. deBInfer: Bayesian inference for dynamical models of biological systems in R

    OpenAIRE

    Boersch-Supan, Philipp H.; Johnson, Leah R

    2016-01-01

    1. Differential equations (DEs) are commonly used to model the temporal evolution of biological systems, but statistical methods for comparing DE models to data and for parameter inference are relatively poorly developed. This is especially problematic in the context of biological systems where observations are often noisy and only a small number of time points may be available. 2. Bayesian approaches offer a coherent framework for parameter inference that can account for multiple sources of ...

  2. A Note on Bayesian Estimation for the Negative-Binomial Model

    OpenAIRE

    L. Lio, Y.

    2009-01-01

    2000 Mathematics Subject Classification: 62F15. The Negative Binomial model, which is generated by a simple mixture model, has been widely applied in social, health and economic market prediction. The most commonly used methods were the maximum likelihood estimate (MLE) and the method of moments estimate (MME). Bradlow et al. (2002) proposed a Bayesian inference with beta-prime and Pearson Type VI as priors for the negative binomial distribution. It is due to the complicated posterior dens...

  3. A Bayesian Estimation of Real Business-Cycle Models for the Turkish Economy

    OpenAIRE

    Hüseyin Taştan; Bekir Aşık

    2014-01-01

    We estimate a canonical small open economy real business-cycle model and its several extensions using a Bayesian approach to explore the effects of different structural shocks on macroeconomic fluctuations in Turkey. Alternative models include several theoretical exogenous shocks, such as those to temporary and permanent productivity, world interest rates, preferences, and domestic spending, as driving forces together with financial frictions. Results indicate that output is mostly driven by ...

  4. KING GEORGE ISLAND SPATIAL DATA MODEL

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Distribution, interoperability, interactivity and components are four main features of distributed GIS. Based on the principles of hypermap, hypermedia and distributed databases, the paper proposes a distributed spatial data model that is in accordance with these features of distributed GIS. The model takes catalog service as the outline of spatial information globalization and defines the data structure of hypermap nodes at different levels. Based on the model, it is feasible to manage and process distributed spatial information and to integrate multi-source, heterogeneous spatial data into one framework. Traditionally, spatial data could be retrieved and accessed via the Internet only by theme or map name. With the concepts of the model, it is possible to retrieve, load and link spatial data through vector-based graphics on the Internet.

  5. A Tutorial on Bayesian Optimization of Expensive Cost Functions, with Application to Active User Modeling and Hierarchical Reinforcement Learning

    CERN Document Server

    Brochu, Eric; de Freitas, Nando

    2010-01-01

    We present a tutorial on Bayesian optimization, a method of finding the maximum of expensive cost functions. Bayesian optimization employs the Bayesian technique of setting a prior over the objective function and combining it with evidence to get a posterior function. This permits a utility-based selection of the next observation to make on the objective function, which must take into account both exploration (sampling from areas of high uncertainty) and exploitation (sampling areas likely to offer improvement over the current best observation). We also present two detailed extensions of Bayesian optimization, with experiments---active user modelling with preferences, and hierarchical reinforcement learning---and a discussion of the pros and cons of Bayesian optimization based on our experiences.
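
    The loop the tutorial describes can be condensed into a short sketch: fit a Gaussian-process posterior to the evaluations made so far, score candidate points with expected improvement, and evaluate the best-scoring candidate. The objective function, kernel length-scale and candidate grid below are arbitrary illustrations, not the tutorial's examples.

      import numpy as np
      from math import erf, sqrt, pi

      def kernel(a, b, ls=0.2, var=1.0):
          # Squared-exponential covariance between two 1-D point sets
          d = a.reshape(-1, 1) - b.reshape(1, -1)
          return var * np.exp(-0.5 * (d / ls) ** 2)

      def gp_posterior(x_train, y_train, x_test, noise=1e-6):
          # Gaussian-process posterior mean and standard deviation at x_test
          K = kernel(x_train, x_train) + noise * np.eye(len(x_train))
          Ks = kernel(x_train, x_test)
          mu = Ks.T @ np.linalg.solve(K, y_train)
          cov = kernel(x_test, x_test) - Ks.T @ np.linalg.solve(K, Ks)
          return mu, np.sqrt(np.clip(np.diag(cov), 1e-12, None))

      def expected_improvement(mu, sd, best):
          # EI for maximization under a Gaussian posterior
          z = (mu - best) / sd
          Phi = 0.5 * (1 + np.vectorize(erf)(z / sqrt(2)))
          phi = np.exp(-0.5 * z ** 2) / sqrt(2 * pi)
          return (mu - best) * Phi + sd * phi

      objective = lambda x: np.sin(3 * x) + x          # toy "expensive" black box
      x_train = np.array([0.1, 0.9])
      y_train = objective(x_train)
      grid = np.linspace(0, 2, 400)

      for _ in range(8):                               # sequential design loop
          mu, sd = gp_posterior(x_train, y_train, grid)
          x_next = grid[np.argmax(expected_improvement(mu, sd, y_train.max()))]
          x_train = np.append(x_train, x_next)
          y_train = np.append(y_train, objective(x_next))

      print(f"best x = {x_train[y_train.argmax()]:.3f}, best value = {y_train.max():.3f}")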

  6. Leveraging Spatial Model to Improve Indoor Tracking

    Science.gov (United States)

    Liu, L.; Xu, W.; Penard, W.; Zlatanova, S.

    2015-05-01

    In this paper, we leverage a spatial model to process indoor localization results and thereby improve the tracks assembled from measured locations. We elaborate on different parts of the spatial model, such as geometry, topology and semantics, and then present how they contribute to the processing of indoor tracks. The initial results of our experiment reveal that the spatial model can help us overcome problems such as tracks intersecting with obstacles and unstable shifts between two location measurements. In the future, we will investigate more exceptions in indoor tracking results and then develop additional spatial methods to reduce errors in indoor tracks.

  7. cudaBayesreg: Parallel Implementation of a Bayesian Multilevel Model for fMRI Data Analysis

    Directory of Open Access Journals (Sweden)

    Adelino R. Ferreira da Silva

    2011-10-01

    Graphic processing units (GPUs) are rapidly gaining maturity as powerful general parallel computing devices. A key feature in the development of modern GPUs has been the advancement of the programming model and programming tools. Compute Unified Device Architecture (CUDA) is a software platform for massively parallel high-performance computing on Nvidia many-core GPUs. In functional magnetic resonance imaging (fMRI), the volume of the data to be processed, and the type of statistical analysis to perform, call for high-performance computing strategies. In this work, we present the main features of the R-CUDA package cudaBayesreg which implements in CUDA the core of a Bayesian multilevel model for the analysis of brain fMRI data. The statistical model implements a Gibbs sampler for multilevel/hierarchical linear models with a normal prior. The main contribution to the increased performance comes from the use of separate threads for fitting the linear regression model at each voxel in parallel. The R-CUDA implementation of the Bayesian model proposed here has been able to reduce significantly the run-time processing of Markov chain Monte Carlo (MCMC) simulations used in Bayesian fMRI data analyses. Presently, cudaBayesreg is only configured for Linux systems with Nvidia CUDA support.
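
    A single-voxel, CPU-only sketch of the kind of Gibbs sampler named above is given below (normal linear model, normal prior on the coefficients, inverse-gamma prior on the noise variance). It is not cudaBayesreg code, and the priors and data are invented; the package's contribution is running this kind of sampler for every voxel in parallel on the GPU.

      import numpy as np

      rng = np.random.default_rng(1)
      n, p = 120, 3
      X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])   # design matrix
      beta_true = np.array([1.0, 0.5, -0.8])
      y = X @ beta_true + rng.normal(0, 0.7, n)                    # simulated voxel signal

      # Prior: beta ~ N(0, tau2 * I), sigma2 ~ Inv-Gamma(a0, b0)
      tau2, a0, b0 = 100.0, 2.0, 1.0
      XtX, Xty = X.T @ X, X.T @ y

      sigma2, draws = 1.0, []
      for it in range(4000):
          # beta | sigma2, y  ~  N(m, V)
          V = np.linalg.inv(XtX / sigma2 + np.eye(p) / tau2)
          m = V @ (Xty / sigma2)
          beta = rng.multivariate_normal(m, V)
          # sigma2 | beta, y  ~  Inv-Gamma(a0 + n/2, b0 + SSR/2)
          resid = y - X @ beta
          sigma2 = 1.0 / rng.gamma(a0 + n / 2, 1.0 / (b0 + 0.5 * resid @ resid))
          if it >= 1000:                       # discard burn-in
              draws.append(beta)

      draws = np.array(draws)
      print("posterior means:", draws.mean(axis=0), " true:", beta_true)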

  8. THE PAndAS VIEW OF THE ANDROMEDA SATELLITE SYSTEM. I. A BAYESIAN SEARCH FOR DWARF GALAXIES USING SPATIAL AND COLOR-MAGNITUDE INFORMATION

    Energy Technology Data Exchange (ETDEWEB)

    Martin, Nicolas F.; Ibata, Rodrigo A. [Observatoire Astronomique de Strasbourg, Université de Strasbourg, CNRS, UMR 7550, 11 rue de l'Université, F-67000 Strasbourg (France); McConnachie, Alan W. [NRC Herzberg Institute of Astrophysics, 5071 West Saanich Road, Victoria, BC V9E 2E7 (Canada); Mackey, A. Dougal [Research School of Astronomy and Astrophysics, The Australian National University, Mount Stromlo Observatory, via Cotter Road, Weston, ACT 2611 (Australia); Ferguson, Annette M. N. [Institute for Astronomy, University of Edinburgh, Blackford Hill, Edinburgh EH9 3HJ (United Kingdom); Irwin, Michael J. [Institute of Astronomy, University of Cambridge, Madingley Road, Cambridge CB3 0HA (United Kingdom); Lewis, Geraint F. [Institute of Astronomy, School of Physics A28, University of Sydney, NSW 2006 (Australia); Fardal, Mark A., E-mail: nicolas.martin@astro.unistra.fr [Department of Astronomy, University of Massachusetts, Amherst, MA 01003 (United States)

    2013-10-20

    We present a generic algorithm to search for dwarf galaxies in photometric catalogs and apply it to the Pan-Andromeda Archaeological Survey (PAndAS). The algorithm is developed in a Bayesian framework and, contrary to most dwarf galaxy search codes, makes use of both the spatial and color-magnitude information of sources in a probabilistic approach. Accounting for the significant contamination from the Milky Way foreground and from the structured stellar halo of the Andromeda galaxy, we recover all known dwarf galaxies in the PAndAS footprint with high significance, even for the least luminous ones. Some Andromeda globular clusters are also recovered and, in one case, discovered. We publish a list of the 143 most significant detections yielded by the algorithm. The combined properties of the 39 most significant isolated detections show hints that at least some of these trace genuine dwarf galaxies, too faint to be individually detected. Follow-up observations by the community are mandatory to establish which are real members of the Andromeda satellite system. The search technique presented here will be used in an upcoming contribution to determine the PAndAS completeness limits for dwarf galaxies. Although here tuned to the search of dwarf galaxies in the PAndAS data, the algorithm can easily be adapted to the search for any localized overdensity whose properties can be modeled reliably in the parameter space of any catalog.

  9. APPLICATION OF BAYESIAN AND GEOSTATISTICAL MODELING TO THE ENVIRONMENTAL MONITORING OF CS-137 AT THE IDAHO NATIONAL LABORATORY

    Energy Technology Data Exchange (ETDEWEB)

    Kara G. Eby

    2010-08-01

    At the Idaho National Laboratory (INL), Cs-137 concentrations above the U.S. Environmental Protection Agency risk-based threshold of 0.23 pCi/g may increase the risk of human mortality due to cancer. As a leader in nuclear research, the INL has been conducting nuclear activities for decades. Elevated anthropogenic radionuclide levels, including Cs-137, are a result of atmospheric weapons testing, the Chernobyl accident, and nuclear activities occurring at the INL site. Therefore, environmental monitoring and long-term surveillance of Cs-137 are required to evaluate risk. However, due to the large land area involved, frequent and comprehensive monitoring is limited. Developing a spatial model that predicts Cs-137 concentrations at unsampled locations will enhance the spatial characterization of Cs-137 in surface soils, provide guidance for an efficient monitoring program, and pinpoint areas requiring mitigation strategies. The predictive model presented herein is based on applied geostatistics using a Bayesian analysis of environmental characteristics across the INL site, which provides kriging spatial maps of both Cs-137 estimates and prediction errors. Comparisons are presented of two different kriging methods, showing that the use of secondary information (i.e., environmental characteristics) can provide improved prediction performance in some areas of the INL site.
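
    The basic kriging prediction underlying such maps can be sketched in a few lines. The example below performs ordinary kriging at one unsampled location with an exponential semivariogram; the sample values, coordinates and variogram parameters are invented, and the cokriging use of secondary environmental information described above is not shown.

      import numpy as np

      def variogram(h, nugget=0.01, sill=0.25, range_=5.0):
          # Exponential semivariogram; gamma(0) = 0 by convention
          return np.where(h > 0.0,
                          nugget + (sill - nugget) * (1.0 - np.exp(-h / range_)),
                          0.0)

      # Sampled Cs-137 concentrations (pCi/g) at known easting/northing (km)
      xy = np.array([[0.0, 0.0], [1.0, 3.0], [4.0, 1.0], [3.0, 4.0], [6.0, 2.0]])
      z = np.array([0.18, 0.31, 0.12, 0.27, 0.08])
      x0 = np.array([2.5, 2.5])                     # unsampled prediction location

      n = len(z)
      d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
      d0 = np.linalg.norm(xy - x0, axis=-1)

      # Ordinary kriging system with a Lagrange multiplier for unbiasedness
      A = np.ones((n + 1, n + 1))
      A[:n, :n] = variogram(d)
      A[n, n] = 0.0
      b = np.append(variogram(d0), 1.0)
      w = np.linalg.solve(A, b)

      pred = w[:n] @ z
      krig_var = w @ b                              # kriging (prediction) variance
      print(f"predicted Cs-137 = {pred:.3f} pCi/g, kriging sd = {krig_var ** 0.5:.3f}")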

  10. Spatial Pattern Dynamics in Aquatic Ecosystem Modelling

    NARCIS (Netherlands)

    Hong Li

    2009-01-01

    In this thesis, several modelling approaches are explored to represent spatial pattern dynamics of aquatic populations in aquatic ecosystems by the combination of models, knowledge and data in different scales. It is shown that including spatially distributed inputs retrieved from Remote Sensing i

  11. Dynamic spatial panels : models, methods, and inferences

    NARCIS (Netherlands)

    Elhorst, J. Paul

    2012-01-01

    This paper provides a survey of the existing literature on the specification and estimation of dynamic spatial panel data models, a collection of models for spatial panels extended to include one or more of the following variables and/or error terms: a dependent variable lagged in time, a dependent

  12. Intelligent spatial ecosystem modeling using parallel processors

    International Nuclear Information System (INIS)

    Spatial modeling of ecosystems is essential if one's modeling goals include developing a relatively realistic description of past behavior and predictions of the impacts of alternative management policies on future ecosystem behavior. Development of these models has been limited in the past by the large amount of input data required and the difficulty of even large mainframe serial computers in dealing with large spatial arrays. These two limitations have begun to erode with the increasing availability of remote sensing data and GIS systems to manipulate it, and the development of parallel computer systems which allow computation of large, complex, spatial arrays. Although many forms of dynamic spatial modeling are highly amenable to parallel processing, the primary focus in this project is on process-based landscape models. These models simulate spatial structure by first compartmentalizing the landscape into some geometric design and then describing flows within compartments and spatial processes between compartments according to location-specific algorithms. The authors are currently building and running parallel spatial models at the regional scale for the Patuxent River region in Maryland, the Everglades in Florida, and Barataria Basin in Louisiana. The authors are also planning a project to construct a series of spatially explicit linked ecological and economic simulation models aimed at assessing the long-term potential impacts of global climate change

  13. Bayesian rules and stochastic models for high accuracy prediction of solar radiation

    International Nuclear Information System (INIS)

    Highlights:
    • Global radiation prediction and PV energy integration.
    • Artificial intelligence and stochastic modelling in order to use the time series formalism.
    • Using Bayesian rules to select models.
    • MLP and ARMA forecasters are equivalent (nRMSE close to 40.5% for both).
    • The hybridization of the three predictors (ARMA, MLP and persistence) yields very good results (nRMSE = 36.6%).
    Abstract: It is essential to find solar prediction methods in order to integrate renewable energy sources into the electrical distribution grid on a large scale. The goal of this study is to find the best methodology for predicting the hourly global radiation with high accuracy. Knowledge of this quantity is essential for the grid manager or the private PV producer in order to anticipate fluctuations related to cloud occurrences and to stabilize the injected PV power. In this paper, we test both methodologies: single and hybrid predictors. In the first class, we include the multi-layer perceptron (MLP), auto-regressive and moving average (ARMA), and persistence models. In the second class, we mix these predictors with Bayesian rules to obtain ad hoc model selections, and Bayesian averages of outputs related to the single models. While MLP and ARMA are equivalent (nRMSE close to 40.5% for both), this hybridization allows an nRMSE gain of more than 14 percentage points compared to the persistence estimation (nRMSE = 37% versus 51%)
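
    The Bayesian mixing of single predictors mentioned above can be illustrated with a toy weighting scheme: each forecaster receives a posterior weight proportional to the Gaussian likelihood of its historical errors, and the combined forecast is the weighted average. The error series, standard deviations and next-hour forecasts below are invented and much simpler than the paper's hourly models.

      import numpy as np

      rng = np.random.default_rng(2)
      obs = rng.uniform(100, 800, 30)                 # past hourly irradiance (W/m2)
      forecasts = {                                   # matching past forecasts
          "persistence": obs + rng.normal(0, 115, 30),
          "arma":        obs + rng.normal(0, 90, 30),
          "mlp":         obs + rng.normal(0, 95, 30),
      }

      def log_lik(err):
          # Gaussian likelihood of the historical forecast errors
          s = err.std()
          return np.sum(-0.5 * np.log(2 * np.pi * s ** 2) - 0.5 * (err / s) ** 2)

      # Posterior model probabilities under equal prior weights
      logL = np.array([log_lik(forecasts[m] - obs) for m in forecasts])
      w = np.exp(logL - logL.max())
      w /= w.sum()

      # Bayesian average of the next-hour forecasts
      next_hour = {"persistence": 420.0, "arma": 455.0, "mlp": 462.0}
      bma = sum(wi * next_hour[m] for wi, m in zip(w, forecasts))
      print(dict(zip(forecasts, np.round(w, 3))), f"-> combined = {bma:.0f} W/m2")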

  14. Bayesian model selection framework for identifying growth patterns in filamentous fungi.

    Science.gov (United States)

    Lin, Xiao; Terejanu, Gabriel; Shrestha, Sajan; Banerjee, Sourav; Chanda, Anindya

    2016-06-01

    This paper describes a rigorous methodology for quantification of model errors in fungal growth models. This is essential to choose the model that best describes the data and guide modeling efforts. Mathematical modeling of growth of filamentous fungi is necessary in fungal biology for gaining systems level understanding on hyphal and colony behaviors in different environments. A critical challenge in the development of these mathematical models arises from the indeterminate nature of their colony architecture, which is a result of processing diverse intracellular signals induced in response to a heterogeneous set of physical and nutritional factors. There exists a practical gap in connecting fungal growth models with measurement data. Here, we address this gap by introducing the first unified computational framework based on Bayesian inference that can quantify individual model errors and rank the statistical models based on their descriptive power against data. We show that this Bayesian model comparison is just a natural formalization of Occam's razor. The application of this framework is discussed in comparing three models in the context of synthetic data generated from a known true fungal growth model. This framework of model comparison achieves a trade-off between data fitness and model complexity and the quantified model error not only helps in calibrating and comparing the models, but also in making better predictions and guiding model refinements. PMID:27000772
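
    The evidence-based comparison the paper formalizes can be seen in a tiny numerical example: integrating each model's likelihood over its prior volume automatically penalizes the extra parameter of the more flexible model (Occam's razor). The two growth models, priors and synthetic data below are purely illustrative and unrelated to the paper's fungal models.

      import numpy as np

      rng = np.random.default_rng(3)
      t = np.linspace(0, 10, 15)                              # time (h)
      radius = 0.5 + 0.12 * t + rng.normal(0, 0.2, t.size)    # colony radius (mm)
      sigma = 0.2                                             # assumed known noise sd

      def lik(pred):
          # Gaussian likelihood of the data given model predictions
          return np.exp(np.sum(-0.5 * np.log(2 * np.pi * sigma ** 2)
                               - 0.5 * ((radius - pred) / sigma) ** 2))

      # Model 1: constant radius r = a, prior a ~ U(0, 3)
      a_grid = np.linspace(0, 3, 301)
      da = a_grid[1] - a_grid[0]
      evidence1 = sum(lik(np.full_like(t, a)) for a in a_grid) * da / 3.0

      # Model 2: linear growth r = a + b t, priors a ~ U(0, 3), b ~ U(0, 1)
      b_grid = np.linspace(0, 1, 101)
      db = b_grid[1] - b_grid[0]
      evidence2 = sum(lik(a + b * t) for a in a_grid for b in b_grid) * da * db / 3.0

      print(f"evidence ratio (linear vs constant): {evidence2 / evidence1:.3g}")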

  15. Bayesian spatio-temporal modelling of tobacco-related cancer mortality in Switzerland

    Directory of Open Access Journals (Sweden)

    Verena Jürgens

    2013-05-01

    Tobacco smoking is a main cause of disease in Switzerland, with lung cancer being the most common cause of cancer mortality in men and the second most common in women. Although disease-specific mortality is decreasing in men, it is steadily increasing in women. The four language regions in this country might play a role in this context as they are influenced in different ways by the cultural and social behaviour of neighbouring countries. Bayesian hierarchical spatio-temporal, negative binomial models were fitted on subgroup-specific death rates indirectly standardized by national references to explore age- and gender-specific spatio-temporal patterns of mortality due to lung cancer and other tobacco-related cancers in Switzerland for the time period 1969-2002. Differences influenced by linguistic region and life in rural or urban areas were also accounted for. Male lung cancer mortality was found to be rather homogeneous in space, whereas women were confirmed to be more affected in urban regions. Compared to the German-speaking part, female mortality was higher in the French-speaking part of the country, a result contradicting other reports of similar comparisons between France and Germany. The spatio-temporal patterns of mortality were similar for lung cancer and other tobacco-related cancers. The estimated mortality maps can support planning in health care services and the evaluation of a national tobacco control programme. Better understanding of the spatial and temporal variation of cancer of the lung and other tobacco-related cancers may help in allocating resources for more effective screening, diagnosis and therapy. The methodology can be applied to similar studies in other settings.

  16. Bayesian modelling of the emission spectrum of the Joint European Torus Lithium Beam Emission Spectroscopy system.

    Science.gov (United States)

    Kwak, Sehyun; Svensson, J; Brix, M; Ghim, Y-C

    2016-02-01

    A Bayesian model of the emission spectrum of the JET lithium beam has been developed to infer the intensity of the Li I (2p-2s) line radiation and associated uncertainties. The detected spectrum for each channel of the lithium beam emission spectroscopy system is here modelled by a single Li line modified by an instrumental function, Bremsstrahlung background, instrumental offset, and interference filter curve. Both the instrumental function and the interference filter curve are modelled with non-parametric Gaussian processes. All free parameters of the model, the intensities of the Li line, Bremsstrahlung background, and instrumental offset, are inferred using Bayesian probability theory with a Gaussian likelihood for photon statistics and electronic background noise. The prior distributions of the free parameters are chosen as Gaussians. Given these assumptions, the intensity of the Li line and corresponding uncertainties are analytically available using a Bayesian linear inversion technique. The proposed approach makes it possible to extract the intensity of Li line without doing a separate background subtraction through modulation of the Li beam. PMID:26931843
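
    The analytic step referred to above, a Gaussian prior combined with a Gaussian likelihood under a linear forward model, has a closed-form posterior. The sketch below applies that generic formula to an invented three-component spectral model (line shape, background slope, constant offset); it is not the JET analysis, and none of the numbers come from the paper.

      import numpy as np

      rng = np.random.default_rng(4)
      n_pix, n_comp = 50, 3                        # detector pixels, spectral components
      pix = np.arange(n_pix)
      A = np.column_stack([                        # columns: line shape, background, offset
          np.exp(-0.5 * ((pix - 25) / 3.0) ** 2),
          np.linspace(1.0, 1.2, n_pix),
          np.ones(n_pix),
      ])
      x_true = np.array([120.0, 15.0, 5.0])
      sigma = 4.0                                  # noise standard deviation
      d = A @ x_true + rng.normal(0, sigma, n_pix) # simulated detected spectrum

      # Gaussian prior N(mu0, S0) and Gaussian likelihood N(A x, sigma^2 I)
      mu0 = np.zeros(n_comp)
      S0 = np.diag([200.0, 50.0, 50.0]) ** 2
      S0_inv = np.linalg.inv(S0)

      # Closed-form posterior mean and covariance for the linear-Gaussian model
      post_cov = np.linalg.inv(S0_inv + A.T @ A / sigma ** 2)
      post_mean = post_cov @ (S0_inv @ mu0 + A.T @ d / sigma ** 2)

      for name, m, s in zip(["line", "background", "offset"],
                            post_mean, np.sqrt(np.diag(post_cov))):
          print(f"{name:>10s}: {m:7.2f} +/- {s:.2f}")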

  17. A Bayesian Hierarchical Model for Reconstructing Sea Levels: From Raw Data to Rates of Change

    CERN Document Server

    Cahill, Niamh; Horton, Benjamin P; Parnell, Andrew C

    2015-01-01

    We present a holistic Bayesian hierarchical model for reconstructing the continuous and dynamic evolution of relative sea-level (RSL) change with fully quantified uncertainty. The reconstruction is produced from biological (foraminifera) and geochemical (δ13C) sea-level indicators preserved in dated cores of salt-marsh sediment. Our model is comprised of three modules: (1) A Bayesian transfer function for the calibration of foraminifera into tidal elevation, which is flexible enough to formally accommodate additional proxies (in this case bulk-sediment δ13C values); (2) A chronology developed from an existing Bchron age-depth model, and (3) An existing errors-in-variables integrated Gaussian process (EIV-IGP) model for estimating rates of sea-level change. We illustrate our approach using a case study of Common Era sea-level variability from New Jersey, U.S.A. We develop a new Bayesian transfer function (B-TF), with and without the δ13C proxy and compare our results to those from a widely...

  18. Bayesian approach to color-difference models based on threshold and constant-stimuli methods.

    Science.gov (United States)

    Brusola, Fernando; Tortajada, Ignacio; Lengua, Ismael; Jordá, Begoña; Peris, Guillermo

    2015-06-15

    An alternative approach based on statistical Bayesian inference is presented to deal with the development of color-difference models and the precision of parameter estimation. The approach was applied to simulated data and real data, the latter published by selected authors involved with the development of color-difference formulae using traditional methods. Our results show very good agreement between the Bayesian and classical approaches. Among other benefits, our proposed methodology allows one to determine the marginal posterior distribution of each random individual parameter of the color-difference model. In this manner, it is possible to analyze the effect of individual parameters on the statistical significance calculation of a color-difference equation. PMID:26193510

  19. From least squares to multilevel modeling: A graphical introduction to Bayesian inference

    Science.gov (United States)

    Loredo, Thomas J.

    2016-01-01

    This tutorial presentation will introduce some of the key ideas and techniques involved in applying Bayesian methods to problems in astrostatistics. The focus will be on the big picture: understanding the foundations (interpreting probability, Bayes's theorem, the law of total probability and marginalization), making connections to traditional methods (propagation of errors, least squares, chi-squared, maximum likelihood, Monte Carlo simulation), and highlighting problems where a Bayesian approach can be particularly powerful (Poisson processes, density estimation and curve fitting with measurement error). The "graphical" component of the title reflects an emphasis on pictorial representations of some of the math, but also on the use of graphical models (multilevel or hierarchical models) for analyzing complex data. Code for some examples from the talk will be available to participants, in Python and in the Stan probabilistic programming language.

  20. PARALLEL ADAPTIVE MULTILEVEL SAMPLING ALGORITHMS FOR THE BAYESIAN ANALYSIS OF MATHEMATICAL MODELS

    KAUST Repository

    Prudencio, Ernesto

    2012-01-01

    In recent years, Bayesian model updating techniques based on measured data have been applied to many engineering and applied science problems. At the same time, parallel computational platforms are becoming increasingly more powerful and are being used more frequently by the engineering and scientific communities. Bayesian techniques usually require the evaluation of multi-dimensional integrals related to the posterior probability density function (PDF) of uncertain model parameters. The fact that such integrals cannot be computed analytically motivates research into stochastic simulation methods for sampling posterior PDFs. One such algorithm is the adaptive multilevel stochastic simulation algorithm (AMSSA). In this paper we discuss the parallelization of AMSSA, formulating the necessary load balancing step as a binary integer programming problem. We present a variety of results showing the effectiveness of load balancing on the overall performance of AMSSA in a parallel computational environment.